
CMPUT 654: Theoretical Foundations of Machine Learning F2023

Following Tong Zhang’s book, students will be introduced to the basic tools required to understand foundational results in learning theory and will be prepared to read and understand a large portion of learning theory papers. The focus is on the statistical approach to learning: the contents of the first 12 chapters of Tong Zhang’s book.

Pre-requisites

Students are expected to be able to follow (and even enjoy) mathematical proofs, to work comfortably with expressions involving probabilities, and to have a working knowledge of calculus and linear algebra.

Basic probability, linear algebra, and convex optimization are covered in Chapters 2, 3, 5, 7, 26, and 38 of the Bandit Algorithms book. Another very nice book, which covers more but is still highly recommended, is A Second Course in Probability Theory. It is available both online and in print; Chapters 1, 3, 4, and 5 are the most useful.

Instructor

Lecture Time

Mondays and Wednesdays from 3:30 PM to 4:50 PM (MST) in T 1-100.

Office Hours

We use Slack for discussing short questions. If more time is needed, arrange a meeting with the instructor via Slack.

Slack Channel

We will use Slack for everything. We have a separate workspace for discussing all topics related to this course; if you would like to join, please message the instructor. All announcements will be made there. We strongly encourage all students to ask questions about course content on Slack.

Lecture Notes

The lecture notes for this year’s class are under the heading LECTURE NOTES.

Keywords: Theory, Machine learning, Statistical learning, Concentration inequalities, Uniform deviation bounds