COMP 3211 - Spring 2019
COMP 3211 Introduction to Artificial Intelligence [3-0-1:3]
Lecture 1, TuTh 15:00-16:20, Rm 2465 (Lifts 25-26)
Prof. Dekai WU, Rm 3556,
2358-6989, dekai@cs.ust.hk
Lab 1A We 09:30-10:20, Rm 1104 Academic Concourse
You are welcome to knock on the instructor's door at any time. The TAs' office hours are posted at http://course.cs.ust.hk/comp3211/ta/.
ANNOUNCEMENTS
Welcome to COMP3211! (This course was formerly called COMP221.) Tutorials will begin after Week 2.
Always check the Discussion Forum for up-to-the-minute announcements.
The discussion forum is at http://comp151.cse.ust.hk/~dekai/content/?q=forum/3. Always read it before asking, posting, or emailing your question.
Course home page is at http://www.cs.ust.hk/~dekai/3211/.
Tutorial info is at http://course.cs.ust.hk/comp3211/ta/.
ORIENTATION
Course Description
COMP 3211. Foundations underlying design of intelligent systems. Relations between logical, statistical, cognitive, biological paradigms; basic techniques for heuristic search, theorem proving, knowledge representation, adaptation; applications in vision, language, planning, expert systems.
Course Outcomes
- Identify the fundamental concepts and techniques of AI: autonomous agents, search, knowledge representation, and machine learning.
- Understand and apply techniques for searching state spaces, including breadth-first, depth-first, best-first, A* search, minimax game tree search, minimax with alpha-beta pruning, and hill-climbing search.
- Appreciate some cutting edge research in AI such as multiagent systems, game theory, ontology, semantic web, big data, deep learning, and others.
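To give a taste of the state-space search techniques listed in the outcomes above, here is a minimal, hypothetical sketch of A* (best-first) search on a toy grid. Course assignments themselves use Scheme; this Python sketch and all names in it (`a_star`, `neighbors`, `manhattan`) are illustrative only, not course-provided code.

```python
import heapq

def a_star(start, goal, neighbors, h):
    """A* search: expand states in order of f = g + h.

    neighbors(state) yields (next_state, step_cost) pairs;
    h(state) is an admissible heuristic (never overestimates)."""
    # Frontier entries: (f, g, state, path-so-far), kept in a min-heap.
    frontier = [(h(start), 0, start, [start])]
    best_g = {start: 0}  # cheapest known cost to reach each state
    while frontier:
        f, g, state, path = heapq.heappop(frontier)
        if state == goal:
            return path, g
        for nxt, cost in neighbors(state):
            g2 = g + cost
            if g2 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g2
                heapq.heappush(frontier, (g2 + h(nxt), g2, nxt, path + [nxt]))
    return None, float("inf")

# Toy problem: move right or down on a 3x3 grid from (0,0) to (2,2), unit cost.
def neighbors(p):
    x, y = p
    for q in ((x + 1, y), (x, y + 1)):
        if q[0] <= 2 and q[1] <= 2:
            yield (q, 1)

def manhattan(p):
    # Admissible here: exact remaining distance under unit-cost moves.
    return (2 - p[0]) + (2 - p[1])

path, cost = a_star((0, 0), (2, 2), neighbors, manhattan)
```

With an admissible heuristic, A* is guaranteed to return a minimum-cost path; with `h = 0` it degenerates to Dijkstra's algorithm, which also appears in the syllabus below.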
TEXTBOOKS
- Introduction to Text Alignment: Statistical Machine Translation Models from Bitexts to Bigrammars (forthcoming), by Dekai WU. Springer, 2016.
- Artificial Intelligence: A Modern Approach (2nd Edition), by Stuart RUSSELL and Peter NORVIG. Prentice-Hall, 2003. ISBN-13: 978-0137903955.
- Structure and Interpretation of Computer Programs (2nd edition),
by Harold ABELSON and Gerald Jay SUSSMAN,
with Julie SUSSMAN. MIT Press, 1984. ISBN-10: 0-262-01077-1.
The full text and code of Structure and Interpretation of Computer Programs are available online at no cost at http://mitpress.mit.edu/sicp/.
HONOR POLICY
To receive a passing grade, you are required to sign an honor statement acknowledging that you understand and will uphold all policies on plagiarism and collaboration.
Plagiarism
All materials submitted for grading must be your own work. You are advised against being involved in any form of copying (either copying other people's work or allowing others to copy yours). If you are found to be involved in an incident of plagiarism, you will receive a failing grade for the course and the incident will be reported for appropriate disciplinary actions.
University policy requires that students who cheat more than once be expelled. Please review the cheating topic from your UST Student Orientation.
Warning: sophisticated plagiarism detection systems are in operation!
Collaboration
You are encouraged to collaborate in study groups. However, you must write up solutions on your own. You must also acknowledge your collaborators in the write-up for each problem, whether or not they are classmates. Unacknowledged collaboration will be treated as plagiarism.
GRADING
The course will be graded on a curve, but no matter what the curve is, I guarantee you the following.
If you achieve | you will receive at least |
85% | A |
75% | B |
65% | C |
55% | D |
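Read as a lookup, the guarantee above maps an overall percentage to a minimum letter grade (below 55%, no grade is guaranteed). A hypothetical sketch, purely to illustrate the cutoffs:

```python
def guaranteed_grade(score):
    """Minimum letter grade guaranteed for an overall percentage,
    per the cutoff table above; returns None below the 55% cutoff."""
    for cutoff, grade in ((85, "A"), (75, "B"), (65, "C"), (55, "D")):
        if score >= cutoff:
            return grade
    return None
```

Note this is a floor, not a ceiling: because the course is curved, the actual grade may be higher than the guaranteed minimum.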
Your grade will be determined by a combination of factors:
Midterm exam | ~20% |
Final exam | ~25% |
Participation | ~5% |
Assignments | ~50% |
Examinations
No reading material is allowed during the examinations. No make-ups will be given unless prior approval is granted by the instructor, or you have a medical condition on the day of the examination supported by a physician's documentation. In addition, under university regulations, absence from the final examination results in automatic failure of the course unless prior approval is obtained from the department head. There will be one midterm worth approximately 20% and one final exam worth approximately 25%.
Participation
Science and engineering (including software engineering!) are about communication between people. Good participation in class and/or the online forum will count for approximately 5%.
Assignments
All assignments must be submitted by 23:00 on the due date. Scheme programming assignments must run under Chicken Scheme on Linux. Assignments will be collected electronically using the automated CASS assignment collection system. Late assignments cannot be accepted. Sorry, in the interest of fairness, exceptions cannot be made.
Programming assignments will account for a total of approximately 50%.
Tutorials
All information for tutorials is at http://course.cs.ust.hk/comp3211/ta/.
SYLLABUS
Date | Wk | Event | Description | AI Methods | AI Applications |
2019.01.31 | 1 | Lecture | Welcome; Introduction; Survey | foundations underlying design of intelligent systems; relations between logical, statistical, cognitive, biological paradigms | vision, natural language, planning, expert systems |
2019.02.05 | 1 | Holiday | Lunar New Year | ||
2019.02.07 | 1 | Holiday | Lunar New Year | ||
2019.02.12 | 2 | Lecture | Impact of AI on ethics and society | AI ethics | natural language: emotional and empathetic language |
2019.02.13 | 2 | Lecture | Does God play dice? Assumptions: scientific method, hypotheses, models, learning, probability; linguistic relativism and the Sapir-Whorf hypothesis; inductive bias, language bias, search bias; the great cycle of intelligence [at tutorial] | foundations underlying design of intelligent systems; relations between logical, statistical, cognitive, biological paradigms | vision, natural language, planning, expert systems |
2019.02.14 | 2 | Lecture | Languages of the world; Administrivia (honor statement, HKUST classroom conduct) | foundations underlying design of intelligent systems; relations between logical, statistical, cognitive paradigms | natural language |
2019.02.19 | 3 | Lecture | Engineering (application), social (impact), and scientific (cognition) foundations and motivations for AI; learning to translate | foundations underlying design of intelligent systems; relations between logical, statistical, cognitive, biological paradigms | vision, natural language, planning, expert systems |
2019.02.20 | 3 | Lecture | Language structures thought (Sapir-Whorf hypothesis); "It's all Chinese to me": linguistic complexity; challenges in modeling translation [at tutorial] | foundations underlying design of intelligent systems; relations between logical, statistical, cognitive, biological paradigms | natural language |
2019.02.21 | 3 | Lecture | Turing test; translation based test of AI; interactive simulation exercise; error analysis; what's still missing in AI | foundations underlying design of intelligent systems; relations between logical, statistical, cognitive paradigms | natural language |
2019.02.26 | 4 | Lecture | Introduction to search | basic techniques for heuristic search | |
2019.02.27 | 4 | Lecture | State spaces; the anagram problem [at tutorial] | basic techniques for heuristic search | natural language: letter distributions |
2019.02.28 | 4 | Lecture | Conditional decomposition; Markovian approximations; memoryless property; Markov models | relations between logical, statistical, cognitive paradigms; machine learning/adaptation: probabilistic approximation | natural language: morphology and character distributions |
2019.03.05 | 5 | Lecture | Character n-gram models; goal states and objective functions | basic techniques for heuristic search; machine learning/adaptation: probabilistic approximation; relations between logical, statistical, cognitive paradigms | natural language: morphology and letter distributions |
2019.03.06 | 5 | Lecture | Uninformed search; BFS, DFS, depth-bounded search, iterative deepening, bidirectional search [at tutorial] | basic techniques for heuristic search | natural language: letter distributions |
2019.03.07 | 5 | Lecture | Informed search; greedy best-first search; agenda-driven search; search fringe; Dijkstra's shortest path algorithm | basic techniques for heuristic search; relations between logical, statistical paradigms | natural language: letter distributions |
2019.03.12 | 6 | Lecture | Implementing agenda-driven search for finding best anagrams (Assignment 1 due 2019.03.18 23:59); physical demo of agenda-driven search for Dijkstra's algorithm; anagrams with replacement | basic techniques for heuristic search; relations between logical, statistical paradigms | natural language: letter distributions |
2019.03.13 | 6 | Lecture | Basic probability theory; conditional probabilities; Bayes' theorem [at tutorial] | machine learning/adaptation: fundamental statistics and probability | |
2019.03.14 | 6 | Lecture | Chinese anagrams; word vs character n-grams; first- and second-order n-grams; Shannon | machine learning/adaptation: basic information theory | natural language: word distributions; n-gram language models |
2019.03.19 | 7 | Exam | Midterm (closed book; handwritten notebook only) | ||
2019.03.21 | 7 | Lecture | hidden Markov models, finite-state models; weighted and stochastic FSAs; parts of speech; generation vs recognition/parsing | knowledge representation: generative vs recognition models; finite state automata; machine learning/adaptation: HMM; relations between logical, statistical, cognitive paradigms | natural language: regular language models, part-of-speech tagging |
2019.03.26 | 8 | Lecture | Converting state-based to transition based FSAs for both logical and stochastic FSAs; segmental HMM/SFA/WFSAs; WFST: finite-state translation models | knowledge representation: transduction models; machine learning/adaptation: HMM; relations between logical, statistical paradigms | natural language: regular language models, part-of-speech tagging |
2019.03.28 | 8 | Lecture | HMM/SFSA/WFSA decoding, evaluation, learning: unrolling in time; ways to search the lattice; formalization for Viterbi decoding and evaluation [slides] | basic techniques for heuristic search: dynamic programming; machine learning/adaptation: training HMMs | natural language: regular language models, part-of-speech tagging |
2019.04.02 | 9 | Lecture | Introduction to neural networks and their language biases | knowledge representation: vector representations of knowledge; logical expressiveness of linear and nonlinear neural network layers; relations between logical, statistical, cognitive paradigms | |
2019.04.04 | 9 | Lecture | Implementing feedforward neural network n-gram models; model design following scientific method for machine learning/adaptation in practice (Assignment 2 due 2019.04.15 23:59); AI and society | knowledge representation: multilayer vector representations of knowledge; machine learning/adaptation: feedforward neural networks, back propagation training; relations between logical, statistical, cognitive paradigms | natural language: neural network n-gram language models |
2019.04.09 | 10 | Lecture | Assignment 2 discussion | knowledge representation: multilayer vector representations of knowledge; machine learning/adaptation: feedforward neural networks, back propagation training; relations between logical, statistical, cognitive paradigms | natural language: neural network n-gram language models |
2019.04.10 | 10 | Lecture | HMM review; search bias [at tutorial] | basic techniques for heuristic search: dynamic programming; machine learning/adaptation: training HMMs; relations between logical, statistical, cognitive paradigms | natural language: regular language models, part-of-speech tagging |
2019.04.11 | 10 | Lecture | Search bias (cont'd); forward algorithm for HMM/SFSA/WFSAs | basic techniques for heuristic search: dynamic programming; machine learning/adaptation: training HMMs; relations between logical, statistical, cognitive paradigms | natural language: regular language models, part-of-speech tagging |
2019.04.16 | 11 | Lecture | Training HMM/SFSA/WFSAs: backward algorithm; completing missing data; expectations; EM (expectation maximization); Baum-Welch parameter estimation | basic techniques for heuristic search: dynamic programming; machine learning/adaptation: training HMMs; relations between logical, statistical, cognitive paradigms | natural language: regular language models, part-of-speech tagging |
2019.04.18 | 11 | Holiday | Spring break | ||
2019.04.23 | 11 | Holiday | Spring break | ||
2019.04.25 | 11 | Lecture | RNN: recurrent neural networks | knowledge representation: recursive vector representations of knowledge; machine learning/adaptation: RNN; relations between logical, statistical, cognitive paradigms | natural language: recurrent neural network language models, part-of-speech tagging |
2019.04.30 | 12 | Lecture | Implementing RNNs; RNN language models; recurrent model design (Assignment 3 due 2019.05.10 23:59); memoization/caching; HMM alignment models; memory-bounded search; heuristics; admissibility; A* search; memory-bounded heuristic search; AI ethics in China | knowledge representation: recursive vector representations of knowledge; machine learning/adaptation: RNN; relations between logical, statistical, cognitive paradigms; AI ethics | natural language: recurrent neural network language models, part-of-speech tagging |
2019.05.02 | 12 | Lecture | Revisiting knowledge representation and language bias; propositional logic; conjunctive normal form; AND/OR graphs; AND/OR hypergraphs; forward chaining | knowledge representation | natural language: context-free language models |
2019.05.07 | 13 | Lecture | Definite clauses; definite clause grammars; Knuth's algorithm; context-free grammars (CFGs); weighted and stochastic CFGs | knowledge representation; logical and stochastic AND/OR graphs; basic techniques for heuristic search: theorem proving by agenda-driven search for CNF and AND/OR graphs; relations between logical, statistical, cognitive paradigms | natural language: context-free language models, stochastic context-free language models |
2019.05.08 | 13 | Lecture | Probabilistic dynamic programming based chart parsing; probabilistic Cocke-Kasami-Younger (CKY) parsing; inside-outside algorithm; parsing by theorem proving; productions as inference rules; backward chaining; abductive (diagnostic or explanatory) inference vs deductive inference | knowledge representation: theorem proving; basic techniques for heuristic search: dynamic programming; relations between logical, statistical, cognitive paradigms | natural language: context-free parsing, stochastic context-free parsing |
2019.05.09 | 13 | Lecture | Overview of inversion transduction grammars (ITG); recursive autoencoders (RAE) and recursive auto-associative memory (RAAM); TRAAM (transduction RAAM); recursive neural network realizations of ITGs; a self-learning rap battle bot | knowledge representation: AND/OR graph based transduction models, logical and stochastic transduction models; basic techniques for heuristic search: dynamic programming; relations between logical, statistical, cognitive paradigms | natural language: bilingual parsing, stochastic bilingual parsing; statistical machine translation (SMT); neural machine translation (NMT); learning the language of rap battles |
2019.05.24 | | Exam | Final (closed book; handwritten notebook only) LG1 Table Tennis Room (LG1031) 16:30-19:30 | | |
dekai@cs.ust.hk
Last updated: 2019.05.10