2-INF-150: Strojové učenie / Machine Learning
Fall 2022


This page shows the preliminary schedule of classes and serves as a repository of materials relevant to the class. The schedule will be updated regularly after lectures.

Recommended literature:

In the schedule, we list the chapters most relevant to the material covered in class. The presentation of the material in lectures usually differs from the books; the book chapters should serve mainly as additional material for self-study.

Additional materials:

Week 19.-23.9.2022
Administration. Introduction.
Supervised learning / regression. Linear regression and its variants.
Literature: GBC:2.1-2.4 or B:C; GBC:5.1 or B:3.1 or HTF:3.1-3.2
Slides, notes, and supporting materials:
video intro: [ link ]
video regression: [ link ]
slides 1: [ PDF, 275 Kb ]
slides 2: [ PDF, 342 Kb ]
stanford (chapters 1, 2, 4): [ link ]
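The closed-form least-squares solution behind linear regression can be sketched in a few lines of numpy (a minimal illustration on synthetic data, not part of the official course materials):

```python
import numpy as np

# Synthetic regression data: y = X @ true_w + small noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.01 * rng.normal(size=100)

# Closed-form least squares via the normal equations:
# w = (X^T X)^{-1} X^T y, solved without forming an explicit inverse.
w = np.linalg.solve(X.T @ X, X.T @ y)
print(np.round(w, 2))
```

With this little noise, the recovered weights match `true_w` closely.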

Week 26.9.-30.9.2022
Tutorials 1: Colab/Jupyter notebooks, numpy
Theory of learning, overfitting, underfitting, bias-variance tradeoff.
Literature: GBC:4.3,5.9; B:3.1;
Slides, notes, and supporting materials:
Tutorials 1: numpy: [ link ]
Learning theory notes: [ PDF, 362 Kb ]
video lecture: [ link ]
stanford (chapter 4): [ link ]
stanford (chapter 1): [ link ]
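The underfitting/overfitting behaviour discussed this week can be demonstrated with polynomial fits of increasing degree (a minimal synthetic sketch; the degrees and noise level are arbitrary choices, not taken from the lecture):

```python
import numpy as np

# Noisy samples of a smooth target function.
rng = np.random.default_rng(1)
x_train = np.linspace(0, 1, 15)
x_test = np.linspace(0, 1, 50)
f = lambda x: np.sin(2 * np.pi * x)
y_train = f(x_train) + 0.2 * rng.normal(size=x_train.size)
y_test = f(x_test)

def poly_mse(deg):
    # Fit a degree-`deg` polynomial on the training set,
    # return (train MSE, test MSE).
    coeffs = np.polyfit(x_train, y_train, deg)
    err = lambda x, y: np.mean((np.polyval(coeffs, x) - y) ** 2)
    return err(x_train, y_train), err(x_test, y_test)

for deg in (1, 3, 14):
    tr, te = poly_mse(deg)
    print(f"degree {deg:2d}: train {tr:.3f}, test {te:.3f}")
```

Degree 1 underfits (high error everywhere), degree 14 drives the training error toward zero while the test error grows: the classic bias-variance picture.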

Week 3.-7.10.2022
Theory of learning (continued). Regularization. Classification. Logistic regression, softmax (maximum entropy) classifier. Neural networks.
Slides, notes, and supporting materials:
lecture: [ link ]
lecture 2 (until minute 35, stop at neural networks): [ link ]
stanford (chapters 3, 5, and 9.3): [ link ]
stanford (chapter 1): [ link ]
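Logistic regression as covered here reduces to gradient descent on the cross-entropy loss; a minimal sketch on two synthetic Gaussian blobs (the data, step size, and iteration count are illustrative assumptions):

```python
import numpy as np

# Two Gaussian blobs with labels 0/1.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
y = np.array([0] * 50 + [1] * 50)

sigmoid = lambda z: 1 / (1 + np.exp(-z))

# Batch gradient descent on the logistic (cross-entropy) loss.
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = sigmoid(X @ w + b)
    grad_w = X.T @ (p - y) / len(y)
    grad_b = np.mean(p - y)
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

acc = np.mean((sigmoid(X @ w + b) > 0.5) == y)
print(f"training accuracy: {acc:.2f}")
```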

Week 10.-14.10.2022
Tutorials 2: regression
Support vector machines.
Slides, notes, and supporting materials:
SVM lecture: [ link ]
Tutorials 2: Regression: [ link ]
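The soft-margin SVM objective can be minimized directly with subgradient descent on the hinge loss (a toy sketch with made-up data and hyperparameters, not the course's implementation):

```python
import numpy as np

# Two separated blobs with labels in {-1, +1}.
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(-2, 0.5, (40, 2)), rng.normal(2, 0.5, (40, 2))])
y = np.array([-1] * 40 + [1] * 40)

# Subgradient descent on the regularized hinge loss
#   L(w) = (lam/2) ||w||^2 + mean_i max(0, 1 - y_i w.x_i)
lam, w = 0.01, np.zeros(2)
for _ in range(200):
    mask = y * (X @ w) < 1                      # margin violators
    grad = lam * w - (y[mask] @ X[mask]) / len(y)
    w -= 0.1 * grad

acc = np.mean(np.sign(X @ w) == y)
print(f"training accuracy: {acc:.2f}")
```

Only margin violators contribute to the subgradient, which is exactly the "support vector" intuition.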

Week 17.-21.10.2022
Tutorials 3: Neural networks
Support vector machines (continued). Kernel trick.
Slides, notes, and supporting materials:
Tutorials 3: neural networks: [ link ]
lecture 1: [ link ]
lecture 2: [ link ]
SVM Stanford: [ link ]
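The kernel trick introduced this week replaces an explicit feature map with a kernel evaluation; for the quadratic kernel the equivalence can be checked numerically (feature map and test points chosen purely for illustration):

```python
import numpy as np

# Polynomial kernel k(x, z) = (x.z)^2 equals an inner product in the
# explicit feature space phi(x) = (x1^2, sqrt(2) x1 x2, x2^2).
def phi(x):
    return np.array([x[0] ** 2, np.sqrt(2) * x[0] * x[1], x[1] ** 2])

x = np.array([1.0, 2.0])
z = np.array([3.0, -1.0])

k_implicit = (x @ z) ** 2          # kernel trick: no feature map needed
k_explicit = phi(x) @ phi(z)       # same value via explicit features
print(k_implicit, k_explicit)
```

The kernel evaluation costs O(d) while the explicit map grows quadratically in dimension, which is the whole point of the trick.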

Week 24.-28.10.2022
Decision trees. Bagging and boosting. Probabilistic interpretation of regression and regularization.
Slides, notes, and supporting materials:
lecture: [ link ]
Decision Trees: [ link ]
Decision Trees 2: [ link ]
Gradient boosting: [ link ]
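A decision tree is grown by repeatedly choosing the split that most reduces impurity; here is a minimal one-dimensional stump search using the Gini criterion (toy data, a sketch of one split rather than a full tree):

```python
import numpy as np

def gini(y):
    # Gini impurity of a binary label array.
    if len(y) == 0:
        return 0.0
    p = np.mean(y)
    return 2 * p * (1 - p)

def best_stump(x, y):
    # Exhaustive search for the 1-D threshold minimizing the
    # size-weighted Gini impurity of the two children.
    best_t, best_score = None, np.inf
    for t in np.unique(x):
        left, right = y[x <= t], y[x > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

x = np.array([1, 2, 3, 10, 11, 12])
y = np.array([0, 0, 0, 1, 1, 1])
t, score = best_stump(x, y)
print(t, score)   # the split x <= 3 separates the classes perfectly
```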

Week 31.10-4.11.2022
Tutorials 4: trees and SVMs
Theory of ML (PAC learning). Estimating the needed number of training samples and the expected test error for a finite hypothesis set. Estimation for an infinite set of hypotheses. VC dimension. PAC learning and SVM.
Slides, notes, and supporting materials:
Tutorials 4: trees and SVMs: [ link ]
PAC learning rectangle game: [ link ]
PAC learning, VC dimension (until minute 50): [ link ]
PAC - finite hypotheses: [ PDF, 243 Kb ]
axis-aligned rectangles / Andres Munoz, NY Univ: [ PDF, 97 Kb ]
VC dimension - definition and examples / Yishai Mansour, U Tel Aviv: [ PDF, 109 Kb ]
VC dimension - PAC estimates / Yishai Mansour, U Tel Aviv: [ PDF, 173 Kb ]
PAC estimates for SVM: [ PDF, 295 Kb ]
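For a finite hypothesis class in the realizable PAC setting, the standard sample-complexity bound m >= (1/eps)(ln |H| + ln(1/delta)) can be evaluated directly (the numbers plugged in below are arbitrary examples, not from the notes):

```python
import numpy as np

def pac_sample_bound(h_size, eps, delta):
    # Realizable-case PAC bound for a finite hypothesis class:
    #   m >= (1/eps) * (ln |H| + ln(1/delta))
    # guarantees error <= eps with probability >= 1 - delta.
    return int(np.ceil((np.log(h_size) + np.log(1 / delta)) / eps))

# e.g. |H| = 2^20 hypotheses, target error 0.05, confidence 0.95
m = pac_sample_bound(2 ** 20, eps=0.05, delta=0.05)
print(m)
```

Note the bound grows only logarithmically in |H|, so even a huge finite class needs modestly many samples.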

Week 7.11-11.11.2022
PAC learning continued
Principal component analysis (PCA)
Slides, notes, and supporting materials:
Video lecture PCA: [ link ]
PCA: [ PDF, 567 Kb ]
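PCA boils down to eigendecomposing the covariance matrix of centered data; a minimal numpy sketch on synthetic correlated data (the dimensions and noise level are illustrative choices):

```python
import numpy as np

# Correlated 2-D data: almost all variance lies along one direction.
rng = np.random.default_rng(4)
z = rng.normal(size=200)
X = np.column_stack([z, 2 * z + 0.1 * rng.normal(size=200)])

# PCA: center the data, eigendecompose the covariance matrix.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (len(Xc) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)      # eigenvalues in ascending order
explained = eigvals[-1] / eigvals.sum()     # variance share of the top PC
print(f"variance explained by PC1: {explained:.3f}")
```

Projecting onto the top eigenvector (`Xc @ eigvecs[:, -1]`) gives the 1-D representation that keeps almost all the variance.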

Week 14.-18.11.2022
Tutorials 5: PCA
Slides, notes, and supporting materials:
Tutorials 5: PCA: [ link ]

Week 21.-25.11.2022
Clustering (k-means, k-medoids, hierarchical clustering)
Reinforcement learning
Slides, notes, and supporting materials:
Clustering slides: [ PDF, 185 Kb ]
Clustering lecture (from minute 50): [ link ]
Reinforcement learning notes (contains more links at the end): [ link ]
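The k-means (Lloyd's) algorithm from this week alternates point assignment and centroid updates; a minimal sketch on two synthetic blobs (the initialization scheme and iteration count are arbitrary choices):

```python
import numpy as np

# Two well-separated 2-D blobs.
rng = np.random.default_rng(5)
X = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])

# Lloyd's algorithm: assign each point to its nearest center,
# then move each center to the mean of its assigned points.
centers = X[rng.choice(len(X), 2, replace=False)]
for _ in range(20):
    d = np.linalg.norm(X[:, None] - centers[None], axis=2)  # (n, k) distances
    labels = d.argmin(axis=1)
    centers = np.array([X[labels == k].mean(axis=0) if np.any(labels == k)
                        else centers[k] for k in range(2)])

print(np.round(np.sort(centers[:, 0]), 1))
```

On data this well separated, the centers converge to the two blob means; in general k-means only finds a local optimum, so multiple restarts are common.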

Maintained by 2-INF-150 personnel