Topics in information-theoretic cryptography
COM-622
Welcome to Topics in Information-Theoretic Cryptography!
Time and Location: Thursdays 8:15-10am, BC03. Zoom link for Lecture: https://epfl.zoom.us/j/67269795690
Instructor: Yanina Shkel
Office Hours: By Appointment, INR 131. Email: yanina.shkel@epfl.ch
Course Webpage: Moodle
Special Course Notes: The first lecture will take place on September 30, and the last lecture is expected to take place on December 16. We will cover a selection of papers from the provided reading list. You are welcome to audit the course even if you already took it last semester, since the papers covered each semester overlap only minimally.
Overview: This semester we will survey a large body of work on statistical measures of information. Our particular focus will be on understanding the motivation behind some of the more commonly used notions for capturing and measuring information. For example, many classical information-theoretic measures (e.g. entropy, mutual information) arise as answers to specific engineering questions (how well could we compress this data? how much data could we send over this noisy communication channel?). This provides a strong justification for their use and imbues them with meaning. By contrast, many measures of information leakage for privacy and secrecy applications (e.g. differential privacy, maximal leakage) are postulated as the ‘correct’ measure of leakage first, and only then applied as a performance gauge in the design of actual algorithms and systems.
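For concreteness, the standard textbook definitions behind that remark (included here only as a refresher, not as part of the reading list): for a discrete random variable X and a pair (X, Y) with joint distribution P_{XY},

  H(X) = -\sum_x P_X(x) \log P_X(x), \qquad I(X;Y) = \sum_{x,y} P_{XY}(x,y) \log \frac{P_{XY}(x,y)}{P_X(x)\,P_Y(y)},

and the source coding theorem gives H(X) its operational meaning: no lossless prefix-free code for X has expected length below H(X) bits per symbol, and this rate is achievable in the limit of long blocks. Likewise, the answer to the noisy-channel question is the capacity C = \max_{P_X} I(X;Y).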
Some of the topics may include: traditional information-theoretic notions like entropy, mutual information, and relative entropy; their extensions to Rényi entropy, Rényi divergence, and alpha-mutual information; various notions of common information; ‘operationally’ defined measures of information leakage like differential privacy and maximal leakage; and the closely related question of statistical measures of fairness. We will discuss the motivation behind these measures, their mathematical properties, as well as their strengths and weaknesses.
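As a preview of the less classical quantities on this list (standard definitions from the literature; the lectures will develop them properly), the Rényi entropy of order \alpha and the maximal leakage from X to Y over finite alphabets are

  H_\alpha(X) = \frac{1}{1-\alpha} \log \sum_x P_X(x)^\alpha, \qquad \alpha \in (0,1) \cup (1,\infty),

  \mathcal{L}(X \to Y) = \log \sum_y \max_{x:\,P_X(x)>0} P_{Y|X}(y \mid x),

with H_\alpha(X) \to H(X) as \alpha \to 1. A randomized mechanism M is \varepsilon-differentially private if \Pr[M(D) \in S] \le e^{\varepsilon} \Pr[M(D') \in S] for every event S and every pair of neighboring datasets D, D'.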
Course Format: The format is based on reading papers and presenting them during lecture. The instructor will present the papers, but active participation from the class is encouraged. We expect to have 10-12 lectures total, with 1-2 reading assignments per lecture. Lectures will be in person as much as possible given the current epidemiological situation; however, an option to follow them online through Zoom will be available. Lectures will not be recorded.
Grading: The final grade will be based 20% on course participation and 80% on the final project. Course participation includes completing reading assignments, attending lectures, and asking questions in lecture. The final project will be a scientific assignment based on an article of the student's choosing. Possible project assignments include extending existing results, an implementation task, a critical summary of a paper, etc. Students may pick a paper from the provided reading list or suggest their own.
Lecture 1 - September 30
Introduction
- Reading Assignment (File)
- Notes for Lecture 2 (File)
- Further Reading (File)
- Lecture Slides - Intro (File)
- Lecture Slides for Lecture 1 (File)
Lecture 2 - October 7
Rényi Entropy and Axiomatic Definitions of Entropy
Lecture 3 - October 14
Introducing Maximal Leakage
October 21 - No Lecture
Lecture 4 - October 28
Motivating Maximal Leakage
Lecture 5 - November 4
Generalizing Maximal Leakage
Lecture 6 - November 11
Generalizing Maximal Leakage / Introducing Differential Privacy
- Video: Protecting Privacy with MATH (URL)
- Video: Protecting Privacy with MATH footnote (URL)
- Notes for Lecture 6 and 7 (File)
Lecture 7 - November 18 - Project Proposal Due
Introducing Differential Privacy
Project Proposal Due
- Reading Assignment (File)
- Further Reading (File)
- Further Reading: Privacy in Statistics and Machine Learning (URL)