2-INF-150: Strojové učenie / Machine Learning, Fall 2019: Handouts
This page shows the preliminary schedule of classes and serves as a repository of materials relevant to the class. The schedule will be updated regularly after lectures.
Recommended literature:
In the schedule, we list the chapters most relevant to the material covered in class. The presentation of the material in lectures usually differs from the books; the book chapters should serve mainly as additional material for self-study.
Additional materials:
Schedule:
Week 23.-27.9.2019
Administrivia. Introduction. Supervised learning / regression. Linear regression. Math review (partial derivatives, gradient, matrices). Normal equations.
Literature: GBC:2.1-2.4 or B:C; GBC:5.1 or B:3.1 or HTF:3.1-3.2
Slides and notes / supporting materials:
- Administrivia [PDF, 62 KB]
- Math review [PDF, 41 KB]
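A minimal numpy sketch of fitting linear regression via the normal equations, on made-up data (all numbers illustrative):

```python
import numpy as np

# Toy data (made up for illustration): y = 1 + 2x plus small noise.
rng = np.random.default_rng(0)
X = np.c_[np.ones(50), rng.uniform(-1, 1, 50)]  # design matrix with a bias column
y = X @ np.array([1.0, 2.0]) + 0.01 * rng.normal(size=50)

# Normal equations: solve (X^T X) w = X^T y rather than inverting explicitly.
w_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(w_hat)  # close to [1, 2]
```

Solving the linear system is preferred over forming the explicit inverse, which is slower and less numerically stable.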
Week 30.9.-4.10.2019
Alternative error functions (L1, Linf). Optimization methods (gradient descent, stochastic gradient descent, linear / quadratic programming). Generalized linear regression. Locally weighted approximation. Tutorial: Jupyter notebooks, numpy.
Literature: GBC:4.3,5.9; B:3.1
Slides and notes / supporting materials:
- Tutorial 1: numpy [link]
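A sketch of batch gradient descent on the squared-error objective, the simplest of the optimization methods above (step size, iteration count, and data are made up):

```python
import numpy as np

# Synthetic noiseless data so the minimizer is exactly [0.5, -1.5].
rng = np.random.default_rng(1)
X = np.c_[np.ones(100), rng.uniform(-1, 1, 100)]
y = X @ np.array([0.5, -1.5])

w = np.zeros(2)
lr = 0.1  # step size (illustrative)
for _ in range(2000):
    grad = X.T @ (X @ w - y) / len(y)  # gradient of (1/2m)||Xw - y||^2
    w -= lr * grad
print(w)  # approaches [0.5, -1.5]
```

Stochastic gradient descent replaces the full-batch gradient with the gradient on a single example (or a mini-batch) per step.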
Week 7.-11.10.2019
Machine learning theory (part 1). Stochastic model of machine learning. Bias-variance tradeoff. Learning curves. Regularization (ridge regression, lasso). Holdout testing. Supervised learning / classification. How to turn regression into classification (sigmoid function, likelihood-based error function).
Literature: GBC:5.2-5.2.2 or B:3.2,3.1.4
Slides and notes / supporting materials:
- Machine learning theory - notes [PDF, 362 KB]
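Ridge regression has a closed form much like the normal equations, with a regularization term added. A sketch on made-up data (the lambda values are arbitrary), showing that larger lambda shrinks the weights:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(30, 5))
y = X @ np.array([1.0, 0.0, -2.0, 0.0, 0.5]) + 0.1 * rng.normal(size=30)

def ridge(X, y, lam):
    # Closed-form ridge solution: w = (X^T X + lam * I)^{-1} X^T y
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

w0 = ridge(X, y, 0.0)    # ordinary least squares
w10 = ridge(X, y, 10.0)  # heavily regularized
print(np.linalg.norm(w0), np.linalg.norm(w10))  # the second norm is smaller
```

The lasso (L1 penalty) has no such closed form and is typically solved iteratively.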
Week 14.-18.10.2019
Logistic regression. Simple neural networks. Backpropagation as gradient descent. Tutorial 2: regression.
Literature: GBC:5.3 or HTF:4.4; GBC:6.1-6.2 or B:5.1,5.3 or HTF:11-11.4
Slides and notes / supporting materials:
- Tutorial 2: regression [link]
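A sketch of logistic regression trained by gradient descent on the negative log-likelihood; the data is synthetic and the learning rate is an arbitrary choice:

```python
import numpy as np

# Synthetic labels drawn from a logistic model with true weights [0, 3].
rng = np.random.default_rng(3)
X = np.c_[np.ones(200), rng.normal(size=200)]
w_true = np.array([0.0, 3.0])
y = (rng.uniform(size=200) < 1.0 / (1.0 + np.exp(-(X @ w_true)))).astype(float)

w = np.zeros(2)
for _ in range(500):
    p_hat = 1.0 / (1.0 + np.exp(-(X @ w)))  # sigmoid of the linear score
    w -= 0.5 * X.T @ (p_hat - y) / len(y)   # gradient step on the mean NLL
train_acc = np.mean((1.0 / (1.0 + np.exp(-(X @ w))) > 0.5) == (y == 1.0))
print(train_acc)
```

The gradient has the same form as in linear regression, X^T(prediction - target)/m, with the sigmoid applied to the score; this is the one-neuron special case of backpropagation.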
Week 21.-25.10.2019
Support vector machines: maximum margin classification. Formulation as a quadratic program. Lagrange multipliers. Dual formulation. Support vectors.
Literature: B:7.1
Slides and notes / supporting materials:
- Lagrange duality of convex programs [link]
- Support Vector Machines and Kernels for Computational Biology (tutorial) [link]
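A rough numerical sketch of margin maximization: instead of the dual QP discussed in class, this minimizes the primal hinge-loss objective lam/2 * ||w||^2 + mean(max(0, 1 - y(Xw + b))) by sub-gradient descent on made-up separable data (regularization strength and step size are arbitrary):

```python
import numpy as np

# Two well-separated Gaussian blobs, labels +1 / -1.
rng = np.random.default_rng(4)
X = np.r_[rng.normal(size=(20, 2)) + 3.0, rng.normal(size=(20, 2)) - 3.0]
y = np.r_[np.ones(20), -np.ones(20)]

w, b, lam, lr = np.zeros(2), 0.0, 0.01, 0.1
for _ in range(1000):
    viol = y * (X @ w + b) < 1  # points on the wrong side of the margin
    gw = lam * w - (y[viol, None] * X[viol]).sum(axis=0) / len(y)
    gb = -y[viol].sum() / len(y)
    w, b = w - lr * gw, b - lr * gb
train_acc = np.mean(np.sign(X @ w + b) == y)
print(train_acc)
```

The points with margin at most 1 after convergence play the role of support vectors: only they contribute to the sub-gradient.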
Week 28.10.-1.11.2019
Kernel trick. Kernel function design (proper kernel functions, Mercer's theorem, closure properties, arbitrary similarity functions as kernels). Soft margin classification. Decision trees. Algorithm ID3. Improving the bias-variance tradeoff with ensemble methods (bagging, boosting).
Literature: GBC:5.7.2 or B:6.1-6.2; HTF:9.2; HTF:8.7; B:14.2-14.3
Slides and notes / supporting materials:
- Random forests [link]
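By Mercer's theorem, a proper kernel must produce a positive semidefinite Gram matrix on every finite point set. A quick numerical check of this for the RBF kernel on random points (the bandwidth gamma is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(5)
X = rng.normal(size=(15, 3))

def rbf_gram(X, gamma=0.5):
    # Pairwise squared distances, then k(x, x') = exp(-gamma * ||x - x'||^2).
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

K = rbf_gram(X)
min_eig = np.linalg.eigvalsh(K).min()
print(min_eig >= -1e-9)  # True: PSD up to floating-point error
```

An arbitrary similarity function used as a "kernel" can fail this check, which is why the closure properties matter for kernel design.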
Week 4.-8.11.2019
Machine learning theory (part 2). PAC learning. PAC bound for a finite hypothesis space. Estimate of the training error for a finite hypothesis space. Tutorial 3: neural networks.
Slides and notes / supporting materials:
- Tutorial slides: Convolutional neural networks [PDF, 11237 KB]
- Tutorial 3: neural networks [link]
- PAC - finite hypothesis space / Yishai Mansour, U Tel Aviv [PDF, 243 KB]
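The finite-class PAC bound in the realizable case says that m >= (1/eps) * (ln|H| + ln(1/delta)) examples suffice for any consistent hypothesis to have true error at most eps with probability at least 1 - delta. A small calculation with illustrative numbers:

```python
import math

def pac_sample_size(H_size, eps, delta):
    # m >= (1/eps) * (ln|H| + ln(1/delta)), rounded up.
    return math.ceil((math.log(H_size) + math.log(1.0 / delta)) / eps)

# Illustrative numbers: |H| = 2^20 (e.g. boolean functions encoded in 20 bits),
# eps = 0.1, delta = 0.05.
print(pac_sample_size(2 ** 20, 0.1, 0.05))  # 169
```

Note the logarithmic dependence on |H| and 1/delta: squaring the class size only doubles the first term.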
Week 11.-15.11.2019
Infinite hypothesis spaces. Rectangle game (example). Vapnik-Chervonenkis (VC) dimension. PAC bound and training error with the VC dimension. Bounds for SVMs.
Slides and notes / supporting materials:
- Rectangle game / Andres Munoz, NYU [PDF, 97 KB]
- VC dimension - definition and examples [PDF, 109 KB]
- VC dimension - PAC bounds / Yishai Mansour, U Tel Aviv [PDF, 173 KB]
- PAC bounds for SVMs [PDF, 295 KB]
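A brute-force illustration of the VC dimension for a one-dimensional cousin of the rectangle game: intervals [a, b] on the line (label +1 inside, -1 outside) have VC dimension 2, since two points can be shattered but a +,-,+ labeling of three points cannot be realized:

```python
import itertools

def shattered_by_intervals(points):
    # Check whether every +/-1 labeling of `points` is realized by some
    # interval [a, b] (+1 inside, -1 outside).
    points = sorted(points)
    for labels in itertools.product([-1, 1], repeat=len(points)):
        pos = [p for p, l in zip(points, labels) if l == 1]
        if pos:
            a, b = min(pos), max(pos)
            ok = all((l == 1) == (a <= p <= b) for p, l in zip(points, labels))
        else:
            ok = True  # the empty interval labels every point -1
        if not ok:
            return False
    return True

print(shattered_by_intervals([0.0, 1.0]))       # True: 2 points are shattered
print(shattered_by_intervals([0.0, 1.0, 2.0]))  # False: +,-,+ is impossible
```

Axis-aligned rectangles in the plane admit the same kind of brute-force check, with VC dimension 4.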
Week 18.-22.11.2019
Unsupervised learning: k-means clustering, k-medians clustering, hierarchical clustering. Tutorial 4: support vector machines, decision trees, random forests.
Literature: HTF:14.3 or B:9.1
Slides and notes / supporting materials:
- k-means clustering [PDF, 154 KB]
- Tutorial 4: SVM, random forests [link]
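A sketch of Lloyd's algorithm for k-means on two synthetic blobs; for a reproducible run the centers are seeded with one data point from each blob rather than initialized at random:

```python
import numpy as np

# Two well-separated blobs around (+5, +5) and (-5, -5).
rng = np.random.default_rng(6)
X = np.r_[rng.normal(size=(50, 2)) + 5.0, rng.normal(size=(50, 2)) - 5.0]

k = 2
centers = X[[0, 50]]  # one seed point per blob (deterministic for the sketch)
for _ in range(20):
    # Assignment step: each point goes to its nearest center.
    d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    assign = d.argmin(axis=1)
    # Update step: each center moves to the mean of its points.
    centers = np.array([X[assign == j].mean(axis=0) for j in range(k)])
print(np.sort(centers[:, 0]))  # roughly [-5, 5]
```

k-medians replaces the mean update with a coordinate-wise median and squared distances with L1 distances, making it more robust to outliers.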
Week 25.-29.11.2019
Dimensionality reduction (principal component analysis / PCA; non-linear methods: kernel PCA, t-SNE).
Literature: HTF:14.5.1 or B:12.1; HTF:14.5.4 or B:12.3; HTF:14.8
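A sketch of PCA via the SVD of the centered data matrix, on made-up 2-D data that varies mostly along the direction (1, 1):

```python
import numpy as np

rng = np.random.default_rng(7)
t = rng.normal(size=200)
X = np.c_[t, t] + 0.05 * rng.normal(size=(200, 2))  # strong (1,1) component plus noise

Xc = X - X.mean(axis=0)                    # center each feature
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S ** 2 / (S ** 2).sum()        # fraction of variance per component
print(explained[0])  # near 1: one direction dominates
print(Vt[0])         # first principal axis, close to +/-(1, 1)/sqrt(2)
```

Kernel PCA applies the same eigendecomposition idea to a (centered) kernel Gram matrix instead of the covariance, allowing non-linear directions of variation.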
Week 2.-6.12.2019
Best practices in machine learning. Tutorial 5: PCA.
Slides and notes / supporting materials:
- Tutorial 5: PCA [link]
- Video (Andrew Ng, Stanford) [link]
Week 9.-13.12.2019
On-line learning (halving algorithm, weighted majority, upper bound on the number of errors). Reinforcement learning. Markov decision processes. Value iteration.
Slides and notes / supporting materials:
- Reinforcement learning (notes from Andrew Ng) [link]
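A sketch of value iteration on a tiny made-up 2-state MDP: states {0, 1}, actions {stay, go}, reward R(s) for being in state s, and the Bellman update V <- max_a [R + gamma * P_a V]:

```python
import numpy as np

P = {  # P[a][s, s'] = probability of moving from s to s' under action a
    "stay": np.array([[1.0, 0.0], [0.0, 1.0]]),
    "go":   np.array([[0.1, 0.9], [0.9, 0.1]]),
}
R = np.array([0.0, 1.0])  # reward 1 for being in state 1
gamma = 0.9

V = np.zeros(2)
for _ in range(200):
    # Bellman optimality update, vectorized over states.
    V = R + gamma * np.max([P[a] @ V for a in P], axis=0)
print(V)  # optimal policy: go from state 0, stay in state 1
```

The update is a gamma-contraction, so V converges geometrically to the unique optimal value function (here V(1) = 1/(1 - gamma) = 10).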
Week 16.-20.12.2019
Reinforcement learning (continued): continuous states, fitted value iteration. Summary of the course. No class on Wednesday.
Slides and notes / supporting materials:
- Class summary [PDF, 812 KB]