This page collects all relevant email correspondence with students about the AI course, so that everyone has the same information about the course material.
The newest messages are posted at the top. Check this page regularly to see what is going on between the other students and the instructor.
Sensitive material such as your name and email address will be removed, and any part of your code that is not relevant to the discussion will not be posted here.
Date: 4/18 | Title: Miniproject teams/topics |
Date: 4/08 | Title: Miniproject info |
Date: 4/08 | Title: Quiz 3 topics |
Date: 3/05 | Title: Quiz 2 topics |
Date: 2/09 | Title: Quiz 1 topics |
Date: 1/12 | Title: Welcome |
Date: OLD 4/07 | Title: Mini project details |
Date: OLD 4/04 | Title: HW3 corrections and clarifications |
Date: OLD 2/15 | Title: HW1 plotting the error curve |
Date: OLD 2/10 | Title: Dealing with missing attribute values |
Date: OLD 2/09 | Title: HW1 clarification |
Date: 4/18 | Title: Miniproject teams/topics |
Teams and topics
Date: 4/08 | Title: Miniproject info |
Date: 4/08 | Title: Quiz 3 topics |
You may bring 2 sheets of notes to the quiz. Quiz 3 topics:

1. Genetic algorithms
   - Basics, plus selection strategies
   - Importance of representation (e.g., GABIL)
   - Analysis using schemas
   - Baldwin effect

2. Evaluating hypotheses
   - Sampling distribution of the mean
   - Confidence interval (see the sketch after this list)
   - Difference in error of two hypotheses
   - Comparing learning algorithms

3. Bayesian learning
   - Bayes theorem
   - Application of Bayes rule in the analysis of learning algorithms
   - MAP hypothesis
   - Least squares and maximum likelihood
   - MDL and its relation to MAP
   - Bayes optimal classifier (how it differs from the MAP hypothesis)
   - Gibbs sampling
   - Naive Bayes
   - Bayesian belief networks
     - Basics
     - Monte Carlo inference
     - Gradient ascent for ML learning of conditional probabilities
   - EM algorithm

4. Computational learning theory
   - Sample complexity
     - epsilon-exhausting theorem
   - PAC learning
     - bound on sample complexity
   - VC dimension and sample complexity
   - Mistake bound
     - Find-S
     - Halving algorithm
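For the confidence-interval item above, here is a minimal sketch of the N% interval for the true error given the sample error over n test examples, using the standard two-sided z values; the function and variable names are illustrative, not part of the course code:

import math

# Two-sided z values for common confidence levels.
Z = {0.90: 1.64, 0.95: 1.96, 0.98: 2.33, 0.99: 2.58}

def error_confidence_interval(num_errors, n, level=0.95):
    # Sample error of the hypothesis over n i.i.d. test examples.
    e = num_errors / n
    # N% interval: e +- z_N * sqrt(e * (1 - e) / n)
    half = Z[level] * math.sqrt(e * (1 - e) / n)
    return e - half, e + half

# Example: 12 misclassifications on 40 test examples.
lo, hi = error_confidence_interval(12, 40)
print("95%% CI: [%.3f, %.3f]" % (lo, hi))   # about [0.158, 0.442]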
Date: 3/05 | Title: Quiz 2 topics |
Quiz 2 topics:

1. Decision tree learning
   - relationship between entropy, uncertainty, and surprise
   - relationship between information gain and the transition from one state to another
   - decision tree learning algorithm
   - choosing the best attribute
   - inductive bias

2. Reinforcement learning
   - value function V(s)
   - optimal policy pi
   - optimal value function V*(s)
   - delta(s,a) and r(s,a)
   - difficulty of the V(s)-based approach (no knowledge of delta and r)
   - how does Q-learning overcome that difficulty?
   - Q-learning (see the sketch after this list)
     - deterministic
     - nondeterministic
   - [NEW] how to select actions given the current estimate of Q?
   - TD(lambda)
     - role of lambda
     - equivalences (when lambda = 0 or lambda = 1)
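For the Q-learning items above, a minimal sketch of the two tabular update rules; Q and visits are plain dictionaries, and all names are illustrative:

def q_update_deterministic(Q, s, a, r, s2, actions, gamma=0.9):
    # Deterministic world: Q(s,a) <- r + gamma * max_a' Q(s',a')
    Q[(s, a)] = r + gamma * max(Q.get((s2, b), 0.0) for b in actions)

def q_update_nondeterministic(Q, visits, s, a, r, s2, actions, gamma=0.9):
    # Nondeterministic world: decaying learning rate
    #   alpha_n = 1 / (1 + visits_n(s,a))
    # Q(s,a) <- (1 - alpha) Q(s,a) + alpha * (r + gamma * max_a' Q(s',a'))
    visits[(s, a)] = visits.get((s, a), 0) + 1
    alpha = 1.0 / (1.0 + visits[(s, a)])
    target = r + gamma * max(Q.get((s2, b), 0.0) for b in actions)
    Q[(s, a)] = (1.0 - alpha) * Q.get((s, a), 0.0) + alpha * target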
Date: 2/09 | Title: Quiz 1 topics |
Quiz 1 topics
Date: 1/12 | Title: Welcome |
Welcome to CPSC 633 Machine Learning. The articles below are from the last time I taught 633 (Spring 2006); they should generally still be helpful, so I have kept them here. New articles will be appended at the top.
Date: OLD 4/07 | Title: Mini project details |
Submit a hardcopy of your mini-project proposal by Monday 4/10,
in class. The proposal should contain:
Date: OLD 4/04 | Title: HW3 corrections and clarifications |
1. For problem 2, use the same grid and reward as problem 1.
2. For problem 2 in general, checking for convergence in the way it is described may give mixed results. Instead, you may
   a. set a fixed maximum number of iterations and run all three policies, and
   b. compare the resulting Q(s,a) with the analytical results.
3. For the probabilistic policy in problem 2 (item 3), the two k values should be 2 and 5, not 1 and 5. (A sketch of this policy follows below.)
4. For problem 3, the subscript "t" represents the t-th visit to (s,a).
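Regarding item 3, a minimal sketch of k-based probabilistic action selection, assuming the exploration policy P(a|s) proportional to k^Q(s,a) from the textbook; the function and variable names are illustrative, not part of the assignment code:

import random

def choose_action(Q, s, actions, k=2.0):
    # Probabilistic policy: P(a_i | s) = k^Q(s,a_i) / sum_j k^Q(s,a_j)
    # (larger k favors exploiting high-Q actions; per the correction
    # above, run with k = 2 and k = 5).
    weights = [k ** Q.get((s, a), 0.0) for a in actions]
    r = random.random() * sum(weights)
    cum = 0.0
    for a, w in zip(actions, weights):
        cum += w
        if r <= cum:
            return a
    return actions[-1]   # guard against floating-point round-off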
Date: OLD 2/15 | Title: HW1 plotting the error curve |
Producing the error plot seems to be an issue. You may omit the intermediate steps, and just report the final error on the training set and the test set.
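If it helps, reporting those two numbers only takes a small helper. A minimal sketch, assuming your trained tree exposes a classify(x) function; all names are illustrative:

def classification_error(classify, examples):
    # Fraction of (attributes, label) pairs the hypothesis gets wrong.
    wrong = sum(1 for x, y in examples if classify(x) != y)
    return wrong / len(examples)

# print("training error:", classification_error(tree.classify, train_set))
# print("test error:    ", classification_error(tree.classify, test_set))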
Date: OLD 2/10 | Title: Dealing with missing attribute values |
> I am having a little problem in finishing up the homework. It's working
> fine with the 'PlayTennis' data, but not with the 'Breast Cancer' data.
> I found out this is because of the case when Attributes or Examples_vi
> is empty (lines 8 and 16 of the ID3 pseudocode in the textbook, page 56).
> In those cases, it says to add a node with label = 'most common value of
> Target_attribute in Examples'. This is not + or -, but one of the values
> of the attributes, then how can it make a decision in this case? And what
> is the 'most common' value of the attributes?

There can be several strategies. You can simply replace the missing attribute value with the most commonly occurring value for that particular attribute. This can be done in some kind of preprocessing stage of the data.
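A minimal sketch of that preprocessing step, assuming each example is a list of attribute values and missing entries are marked '?' (as in the breast cancer data); the function and variable names are illustrative:

from collections import Counter

def fill_missing(examples, missing='?'):
    # For each attribute, find its most common non-missing value...
    modes = []
    for i in range(len(examples[0])):
        counts = Counter(row[i] for row in examples if row[i] != missing)
        modes.append(counts.most_common(1)[0][0])
    # ...then substitute it wherever that attribute is missing.
    return [[modes[i] if v == missing else v for i, v in enumerate(row)]
            for row in examples]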
Date: OLD 2/09 | Title: HW1 clarification |
Hi everyone,

For problem 1, item 2, you don't need to provide a formal proof. Just state your intuition.

As for the breast cancer database, you may experience problems if you copy and paste the URL from the PDF file. Use this URL in that case:

http://www.ics.uci.edu/~mlearn/databases/breast-cancer-wisconsin/

Yoonsuck