CS-526 Learning theory

Media

Lecture 13: Power method and applications

23.05.2022, 13:27

Lecture 12: Multilinear rank and Tucker decomposition

16.05.2022, 10:56

Lecture 11: Matricizations and Alternating Least Squares

10.05.2022, 14:09

Lecture 10: Tensor decomposition and Jennrich’s theorem

02.05.2022, 10:49

Lecture 9: Tensors (motivations and introduction)

26.04.2022, 14:43

Due to some technical issues, the first ~15 minutes were not recorded. Handwritten notes are available on the course web page as a supplement. We apologize for the inconvenience.

Lecture 8.2: Neural networks under SGD

11.04.2022, 12:25

Lecture 8.1: Neural networks under SGD

11.04.2022, 12:24

Lecture 7 - 2022 edition - SGD and Mean Field Analysis of two layer NNs (start)

04.04.2022, 19:52

Lecture 7 - 2021 edition - Stochastic gradient descent

04.04.2022, 19:48

Lecture 6: Gradient descent

28.03.2022, 14:27

Lecture 5: Nonuniform learnability and structural risk minimization

23.03.2022, 11:51

Lecture 4: VC dimension

17.03.2022, 07:35

Lecture 3: Growth rate and uniform convergence

09.03.2022, 22:53

Lecture 2: Uniform convergence and No-Free-Lunch theorem

03.03.2022, 07:30

Lecture 1: PAC learning framework

23.02.2022, 00:08


Course summary

This master class on learning theory covers the classical PAC framework for learning, stochastic gradient descent, and tensor methods. We also touch upon topics from the recent literature on mean-field methods for neural networks and the double descent phenomenon.

Teacher: Nicolas Macris (nicolas.macris@epfl.ch), with some lectures by Rodrigo Veiga (rodrigo.veiga@epfl.ch).

Teaching Assistant: Anastasia Remizova (anastasia.remizova@epfl.ch).

Lectures: Mondays 8h15-10h, in person, Room INM202; Exercises: Tuesdays 17h15-19h, in person, Room INR219.

We will use this Moodle page to distribute homework, solutions, and lecture material each week, as well as to host the discussion and questions forum. Don't hesitate to use this forum actively.

Lectures are in person. If you miss a lecture, an older recorded version is accessible at https://mediaspace.epfl.ch/channel/CS-526+Learning+theory/29761; however, the material, instructors, and order of lectures may differ slightly this year.

GRADED HOMEWORK: there will be 3 graded homework assignments (roughly one per topic). Dates and deadlines will be announced as we go; you will usually have two weeks to hand them in. These count for 20% of the final grade.

EXAM: it is open book. You may bring your notes, printed material, and the UML book. If you do not want to print, you can load your material onto your laptop beforehand and keep Wi-Fi switched off. The final exam counts for 80% of the final grade.

Textbooks and notes:



17 - 18 February

PAC learning framework. Finite classes. Uniform convergence.

Lecture this week is by Dr. Rodrigo Veiga

See chapters 3 and 4 in UML

Homework 1: exercises 1, 3, 7, 8 of Chapter 3.


24 - 25 February

No free lunch theorem.

See chapter 5 in UML

Homework 2: exercises 1 and 2 of Chapter 4, plus an extra exercise on the proof of Hoeffding's inequality.


3 - 4 March

Learning infinite classes I

Chapter 6 in UML

Homework 3 is graded. Deadline for handing it in: 18 March.


10 - 11 March

Learning infinite classes II (VC dimension)

Chapter 6 continued

Homework 3 continued


17 - 18 March

Bias-variance tradeoff and the double descent phenomenon

We will study the double descent of the generalization error, based on the paper "Two models of double descent for weak features" by Belkin, Hsu, and Xu.

Lecture by Dr Rodrigo Veiga

Deadline for handing in homework 3: 18 March


24 - 25 March

Double descent phenomenon: continuation and derivation for the weak features model

Lecture by Dr Rodrigo Veiga


31 March - 1st April

Gradient descent (convexity, Lipschitzness, approach to the optimal solution)

Stochastic gradient descent, application to learning

Second graded homework this week: deadline 15 April midnight.

Chapter 14 in UML
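
For those who want to experiment, here is a minimal illustrative sketch of gradient descent and SGD on a least-squares objective (Python with numpy; the function names, step sizes, and toy data are our own choices, not taken from the lecture notes):

    import numpy as np

    def gradient_descent(grad, x0, eta, n_steps):
        """Plain gradient descent with constant step size eta; grad returns the gradient of f."""
        x = x0
        for _ in range(n_steps):
            x = x - eta * grad(x)
        return x

    def sgd(X, y, eta, n_steps, rng):
        """SGD for least squares: each step uses one random sample, an unbiased estimate of the full gradient."""
        w = np.zeros(X.shape[1])
        for _ in range(n_steps):
            i = rng.integers(len(y))
            w = w - eta * (X[i] @ w - y[i]) * X[i]
        return w

    # toy example: f(w) = ||Xw - y||^2 / (2m), a smooth convex objective
    rng = np.random.default_rng(0)
    X, y = rng.standard_normal((100, 5)), rng.standard_normal(100)
    w_gd = gradient_descent(lambda w: X.T @ (X @ w - y) / len(y), np.zeros(5), 0.1, 500)
    w_sgd = sgd(X, y, 0.01, 5000, rng)

For a β-smooth convex objective, a constant step size η ≤ 1/β guarantees convergence of GD; for SGD, a decaying step size is needed to converge to the minimizer rather than a neighborhood of it.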


7 - 8 April

Mean-field approach for two-layer neural networks

Based on the paper "One lecture on two layer neural networks" by A. Montanari


14 - 15 April

We finish discussing the main ideas of the mean-field analysis of two-layer neural networks (notably Part II of the notes).

Homework: extra homework 7, on convexity, GD, and SGD, is below.


21 April - 22 April

EASTER WEEK BREAK

28 - 29 April

Tensors 1. Motivations and examples, multi-dimensional arrays, tensor product, tensor rank.

Tensors 2. Tensor rank and decompositions, Jennrich's theorem (proof of the theorem next time)

Graded Homework 8: deadline Tuesday 13 May, midnight. EXTENDED to Monday 19 May, midnight.


5 - 6 May

CLASS CANCELLED THIS WEEK. SESSION DEVOTED TO THE EXERCISES (graded homework 8).

Tensors 2bis. Tensor decomposition and Jennrich's algorithm


12 - 13 May

Tensors 2bis. Tensor decomposition and Jennrich's algorithm
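
For the curious, a minimal numerical sketch of Jennrich's algorithm via simultaneous diagonalization of two random slices (Python with numpy; it assumes an n x n x n tensor of rank n with generic factors, and the function name is ours):

    import numpy as np

    def jennrich(T):
        """Sketch: recover A, B, C with T = sum_r a_r ⊗ b_r ⊗ c_r,
        assuming T is n x n x n of rank n with generic (invertible) factor matrices."""
        n = T.shape[0]
        x, y = np.random.randn(n), np.random.randn(n)
        Tx = np.einsum('ijk,k->ij', T, x)   # Tx = A diag(C^T x) B^T
        Ty = np.einsum('ijk,k->ij', T, y)   # Ty = A diag(C^T y) B^T
        # Tx Ty^{-1} = A D A^{-1} and (Ty^{-1} Tx)^T = B D B^{-1},
        # with the same, generically distinct, D = diag((C^T x) / (C^T y))
        eva, A = np.linalg.eig(Tx @ np.linalg.inv(Ty))
        evb, B = np.linalg.eig((np.linalg.inv(Ty) @ Tx).T)
        # pair the columns of A and B by matching eigenvalues
        order = [int(np.argmin(np.abs(evb - lam))) for lam in eva]
        A, B = A.real, B[:, order].real
        # recover C (carrying the weights) by least squares on the mode-3 unfolding
        KR = np.einsum('ir,jr->ijr', A, B).reshape(n * n, n)   # Khatri-Rao product
        C = np.linalg.lstsq(KR, T.reshape(n * n, n), rcond=None)[0].T
        return A, B, C

On a random low-rank test tensor, the recovered columns should match the true factors up to permutation and scaling.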


19 - 20 May

Tensors 3. Alternating least squares algorithm

Tensors 4. Multilinear rank, Tucker decomposition, and the higher-order singular value decomposition
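
As a companion to the lecture, a minimal sketch of alternating least squares for a rank-r CP decomposition (Python with numpy; the function names and the stopping rule, a fixed iteration count, are our own simplifications):

    import numpy as np

    def als_cp(T, r, n_iter=50, rng=None):
        """Sketch: rank-r CP decomposition T ≈ sum_s a_s ⊗ b_s ⊗ c_s by ALS."""
        if rng is None:
            rng = np.random.default_rng()
        I, J, K = T.shape
        A = rng.standard_normal((I, r))
        B = rng.standard_normal((J, r))
        C = rng.standard_normal((K, r))
        for _ in range(n_iter):
            # each update is a linear least-squares problem in one factor,
            # the other two being held fixed (Khatri-Rao design matrix)
            A = solve_factor(T.reshape(I, J * K), B, C, r)
            B = solve_factor(T.transpose(1, 0, 2).reshape(J, I * K), A, C, r)
            C = solve_factor(T.transpose(2, 0, 1).reshape(K, I * J), A, B, r)
        return A, B, C

    def solve_factor(unfolding, U, V, r):
        """Solve unfolding ≈ X @ khatri_rao(U, V)^T for X in the least-squares sense."""
        KR = np.einsum('ir,jr->ijr', U, V).reshape(-1, r)   # Khatri-Rao product
        return np.linalg.lstsq(KR, unfolding.T, rcond=None)[0].T

Note that the objective is nonconvex in (A, B, C) jointly, but each ALS step solves a convex least-squares problem exactly.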


26 - 27 May

Tensors 5. Power method and applications: Gaussian mixture models, topic models for documents
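
For those who want to try it numerically, a minimal sketch of the tensor power method with deflation, for a symmetric 3-tensor with an orthogonal decomposition T = sum_i λ_i v_i ⊗ v_i ⊗ v_i (Python with numpy; names and iteration counts are illustrative):

    import numpy as np

    def power_iteration(T, n_iter=100, rng=None):
        """One robust eigenpair of a symmetric 3-tensor via v <- T(I, v, v) / ||T(I, v, v)||."""
        if rng is None:
            rng = np.random.default_rng()
        v = rng.standard_normal(T.shape[0])
        v /= np.linalg.norm(v)
        for _ in range(n_iter):
            w = np.einsum('ijk,j,k->i', T, v, v)    # contract v into modes 2 and 3
            v = w / np.linalg.norm(w)
        lam = np.einsum('ijk,i,j,k->', T, v, v, v)  # eigenvalue lambda = T(v, v, v)
        return lam, v

    def decompose(T, r):
        """Recover r components of an orthogonally decomposable tensor by deflation."""
        pairs = []
        for _ in range(r):
            lam, v = power_iteration(T)
            pairs.append((lam, v))
            T = T - lam * np.einsum('i,j,k->ijk', v, v, v)  # subtract the found component
        return pairs

In the applications above, such moment tensors are built from data (e.g., third-order moments of a Gaussian mixture after whitening), and the recovered pairs (λ_i, v_i) yield the model parameters.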


Old exams

Here we will post old exams with solutions so that you can practice. Note that some of the problems (but not all) are covered by the current year's material.


Further lecture notes