Lecture series on scientific machine learning

PHYS-754

Course summary

Welcome to PHYS-754!

Machine learning is a data-analysis and computational tool that, over the last two decades, has brought groundbreaking progress to many modern technologies. What is more, machine learning is becoming an indispensable tool in many scientific disciplines where knowledge is deduced from data.

This course presents some recent work in this direction. The first part of the course surveys work from several EPFL laboratories that use machine learning to address scientific questions in physics, chemistry, materials science, and biology.

Professors involved include Lenka Zdeborova, Giuseppe Carleo, Michele Ceriotti, Philippe Schwaller, Alexander Mathis, Anne-Florence Bitbol, and David Harvey. Examples of problems covered include neural-network-enhanced solutions of the Schrödinger equation in a variety of contexts, machine learning for the prediction and rationalization of chemical and physical properties of materials, analysis of proteins from their sequence and structure, automated data analysis and modeling of brain function in neuroscience, and applications in astrophysics. In the second part of the course, students will read, present, and discuss selected recent articles on these subjects.

More information on the class can be found here.


Sep 12 (Bitbol)

Lecturer: Prof. Anne-Florence Bitbol

Making sense of protein sequence data using machine learning

Advances in DNA sequencing are providing ever more biological sequence data, which is a great opportunity for data-driven methods, including machine learning, to learn about biology. In this lecture, we will discuss some of these advances. We will start from the fundamentals of sequence data structure and statistical dependence in these data, and move towards recent advances involving protein language models. The lecture will be structured as follows.

Part 1: Inference from protein sequences – traditional methods

Conservation and coevolution in protein sequence data; Pairwise maximum entropy models (Potts models)
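
In the Potts models mentioned above, the probability of an aligned sequence a = (a_1, ..., a_L) over the amino-acid alphabet takes the form

P(a_1, \dots, a_L) = \frac{1}{Z} \exp\Big( \sum_{i=1}^{L} h_i(a_i) + \sum_{i<j} J_{ij}(a_i, a_j) \Big),

where the fields h_i capture single-site conservation, the couplings J_{ij} capture pairwise coevolution, and Z normalizes the distribution.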

Part 2: Inference from protein sequences – protein language models

Transformers and protein language models; link to AlphaFold and structure prediction; applications of protein language models
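
At the heart of these transformer models is scaled dot-product attention, softmax(Q K^T / sqrt(d)) V. A minimal numpy sketch (array shapes and names are illustrative, not the lecture's notation):

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Q, K, V: (seq_len, d) arrays of queries, keys, and values.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                      # query-key similarities
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)          # row-wise softmax
    return weights @ V                                 # weighted mix of values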

Note: the four papers we will discuss on October 3 are highlighted in the slides with "PAPER" in red.





Sep 19 (Zdeborova)

Lecturer: Prof. Lenka Zdeborova, Statistical Physics of Computation Laboratory. 

Statistical physics approach to the theory of deep learning

Machine learning finds applications in the sciences, but the opposite direction is, of course, also fruitful, and this lecture will illustrate it. We will present how statistical physics can be used to study machine learning and the underlying algorithms. We will discuss phase transitions that appear in a system's ability to learn, and their algorithmic consequences. This line of work helps to clarify properties of deep learning, such as double descent, the lack of overfitting, and the role of overparametrization, and gives a hint about the nature of emergent abilities in AI systems.
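
As a minimal numpy illustration of double descent (all sizes, the nonlinearity, and the noise level are arbitrary choices for this sketch): fitting minimum-norm least squares on p random features of n = 100 noisy training points, the test error typically spikes near the interpolation threshold p ≈ n and decreases again as the model becomes more overparametrized.

import numpy as np

rng = np.random.default_rng(0)
n, d, n_test = 100, 20, 1000
w_true = rng.normal(size=d)
X, X_test = rng.normal(size=(n, d)), rng.normal(size=(n_test, d))
y = X @ w_true + 0.5 * rng.normal(size=n)             # noisy training labels
y_test = X_test @ w_true

for p in [10, 50, 90, 100, 110, 200, 1000]:
    W = rng.normal(size=(d, p)) / np.sqrt(d)          # fixed random projection
    F, F_test = np.tanh(X @ W), np.tanh(X_test @ W)   # random nonlinear features
    a = np.linalg.pinv(F) @ y                         # minimum-norm least squares
    err = np.mean((F_test @ a - y_test) ** 2)
    print(f"p = {p:5d}   test MSE = {err:.3f}")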

Related papers will be presented by the students on October 10. 




Sep 26 (Mathis)

Lecturer: Prof. Alexander Mathis

Neuroscience and machine learning have a long and intertwined history. To mention one crucial example, artificial neural networks were clearly motivated by their biological counterparts. First, I will illustrate how complex the brain is and how machine learning approaches are helping us carry out better measurements of its structure and function. Second, I will highlight how machine learning approaches are also essential for understanding the brain.

Later, students will present and discuss a selection from the following papers: 

[1] Hassabis, D., et al. (2017): Neuroscience-Inspired Artificial Intelligence, Neuron

[2] Vargas, A. M., Bisi, A., Chiappa, A. S., Versteeg, C., Miller, L. E., & Mathis, A. (2024). Task-driven neural network models predict neural dynamics of proprioception. Cell

[3] Zador, A. M. (2019): A critique of pure learning and what artificial neural networks can learn from animal brains, Nature Communications

[4] Chiappa, A. S., Tano, P., Patel, N., Ingster, A., Pouget, A., & Mathis, A. (2024). Acquiring musculoskeletal skills with curriculum-based reinforcement learning. Neuron


Oct 3 (Bitbol)

Lecturer: Prof. Anne-Florence Bitbol

Students will present and discuss the following papers:


Oct 10 (Zdeborova)

Lecturer: Prof. Lenka Zdeborova

Students will present and discuss (a selection from) the following papers: 

For those who want to hear more about how the Boltzmann machine is related to attention, I shared one lecture from the ML for physicists course here: https://youtu.be/dwMhw2X8_TU (part 1) and https://www.youtube.com/watch?v=od_XiaCJzV0 (part 2)
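
For reference, a (restricted) Boltzmann machine over visible units v_i and hidden units h_j is the energy-based model

E(v, h) = -\sum_i a_i v_i - \sum_j b_j h_j - \sum_{i,j} v_i W_{ij} h_j, \qquad P(v, h) = \frac{e^{-E(v, h)}}{Z},

and the linked lectures discuss how this bilinear coupling structure relates to attention.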




Oct 17 (Schwaller)

Lecturer: Prof. Philippe Schwaller

(Large) Language models for accelerated chemical discovery and synthesis 

AI-accelerated synthesis is an emerging field that uses machine learning algorithms to improve the efficiency and productivity of chemical and materials synthesis. Modern machine learning models, such as (large) language models, can capture the knowledge hidden in large chemical databases to rapidly design and discover new compounds, predict the outcome of reactions, and help optimize chemical reactions. One of the key advantages of AI-accelerated synthesis is its ability to make vast chemical data accessible and to predict promising candidate synthesis paths, potentially leading to breakthrough discoveries. Overall, AI is poised to revolutionize the field of organic synthesis, enabling faster and more efficient drug development, catalysis, and other applications.
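
Chemical language models typically operate on SMILES strings split into tokens. A minimal sketch of a regex-based SMILES tokenizer (the exact pattern varies between implementations; this one is illustrative):

import re

# Regex-based SMILES tokenizer: bracket atoms, two-letter elements, bonds,
# ring-closure digits, and the ">>" reaction arrow each become one token.
SMILES_TOKEN = re.compile(
    r"(\[[^\]]+\]|Br?|Cl?|N|O|S|P|F|I|b|c|n|o|s|p"
    r"|\(|\)|\.|=|#|-|\+|\\|/|:|~|@|\?|>>?|\*|\$|%[0-9]{2}|[0-9])"
)

def tokenize(smiles):
    return SMILES_TOKEN.findall(smiles)

# Example: an esterification reaction written as reactants >> product.
print(tokenize("CC(=O)O.OCC>>CC(=O)OCC"))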

Extraction of organic chemistry grammar from unsupervised learning of chemical reactions
P Schwaller, B Hoover, JL Reymond, H Strobelt, T Laino
Science Advances 7 (15), eabe4166

Augmenting large language models with chemistry tools
AM Bran, S Cox, O Schilter, C Baldassari, AD White, P Schwaller
Nature Machine Intelligence, 1-11

Saturn: Sample-efficient Generative Molecular Design using Memory Manipulation
J Guo, P Schwaller
arXiv preprint arXiv:2405.17066



Oct 31 (Mathis)


Nov 7 (Schwaller)

Lecturer: Prof. Philippe Schwaller

Students will present and discuss a selection from the following papers: 

Extraction of organic chemistry grammar from unsupervised learning of chemical reactions
P Schwaller, B Hoover, JL Reymond, H Strobelt, T Laino
Science Advances 7 (15), eabe4166

Augmenting large language models with chemistry tools
AM Bran, S Cox, O Schilter, C Baldassari, AD White, P Schwaller
Nature Machine Intelligence, 1-11

Saturn: Sample-efficient Generative Molecular Design using Memory Manipulation
J Guo, P Schwaller
arXiv preprint arXiv:2405.17066



Nov 14 (Ceriotti)

Lecturer: Prof. Michele Ceriotti

Atomic-scale Machine Learning

Approximate solutions of the electronic-structure problem have made it possible to predict, from first principles, the stability and properties of molecules and materials for the most diverse fundamental and technological applications. This field has been transformed by machine-learning techniques, which make it possible to model structure-property relations in a data-driven manner, and thereby to predict the behavior of atomic-scale systems at much lower cost than quantum calculations, based on training on relatively small numbers of reference structures.
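
As a caricature of this data-driven approach, here is a minimal kernel-ridge-regression sketch mapping a structural descriptor to a reference energy (the descriptors, kernel, and data are illustrative random stand-ins; real models use physically motivated representations of the atomic environments):

import numpy as np

# Kernel ridge regression from structure descriptors to energies.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(50, 8))   # descriptor vectors of 50 structures
E_train = rng.normal(size=50)        # reference energies (e.g. from DFT)

def gaussian_kernel(A, B, sigma=1.0):
    # Squared distances between all descriptor pairs, then a Gaussian.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

K = gaussian_kernel(X_train, X_train)
alpha = np.linalg.solve(K + 1e-6 * np.eye(len(K)), E_train)  # ridge fit

def predict(X_new):
    return gaussian_kernel(X_new, X_train) @ alpha

print(predict(X_train[:3]))          # approximately reproduces training energies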

I will give a broad introduction to the applications of ML to atomistic simulations, presenting a historical account of the early developments [1], a demonstration of the breadth of possible applications [2], and a discussion of some recent developments that increase the range of properties that can be predicted by ML models [3]. Throughout, I will emphasize the interplay between mathematical concepts and physical principles that underpins the most effective frameworks for atomistic machine learning [4].

See further below for references and reading list.







Nov 21 (Carleo)

Lecturer: Prof. Giuseppe Carleo

Machine Learning for Many-Body Quantum Physics 

Machine-learning-based approaches, routinely adopted in cutting-edge industrial applications, are increasingly being used to study fundamental problems in science. Many-body physics is very much at the forefront of these exciting developments, given its intrinsic "big-data" nature.

In this lecture I will present selected applications to the quantum realm. First, I will discuss how a systematic and controlled machine learning of the many-body wave function can be realized. This goal is achieved by a variational representation of quantum states based on artificial neural networks. I will then discuss applications in diverse domains, including prototypical open problems in condensed matter physics (fermions and frustrated spins), as well as applications to characterize and improve quantum hardware and software.
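
In the simplest such construction, a restricted Boltzmann machine parametrizes the wave-function amplitudes over spin configurations \sigma = (\sigma_1, \dots, \sigma_N); tracing out the hidden units h_i = \pm 1 gives a closed form:

\Psi(\sigma) = \sum_{\{h_i\}} \exp\Big( \sum_j a_j \sigma_j + \sum_i b_i h_i + \sum_{ij} W_{ij} h_i \sigma_j \Big) = e^{\sum_j a_j \sigma_j} \prod_i 2 \cosh\Big( b_i + \sum_j W_{ij} \sigma_j \Big),

with the parameters (a, b, W) optimized variationally to minimize the energy.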



Nov 28 (Ceriotti)

Lecturer: Prof. Michele Ceriotti

These three articles are to be read and presented to the class. Focus on the core ideas; there is no need to memorize derivations.

[1] J. Behler and M. Parrinello, "Generalized Neural-Network Representation of High-Dimensional Potential-Energy Surfaces," Physical Review Letters 98(14), 146401 (2007).

[2] A. P. Bartók, S. De, C. Poelking, N. Bernstein, J. R. Kermode, G. Csányi, and M. Ceriotti, "Machine learning unifies the modeling of materials and molecules," Science Advances 3(12), e1701816 (2017).

[3] A. Grisafi, D. M. Wilkins, G. Csányi, and M. Ceriotti, "Symmetry-Adapted Machine Learning for Tensorial Properties of Atomistic Systems," Physical Review Letters 120(3), 036002 (2018).

These two articles are more technical and do not have to be read in full. The first is a recent work that tries to unify several frameworks for atomic-scale ML under the same formalism, and the second is a longer review that provides more context and additional explanation of some concepts. Skim through them, and read more if and when needed.

[4] J. Nigam, S. Pozdnyakov, G. Fraux, and M. Ceriotti, "Unified theory of atom-centered representations and message-passing machine-learning schemes," J. Chem. Phys. 156(20), 204115 (2022).

[5] F. Musil, A. Grisafi, A. P. Bartók, C. Ortner, G. Csányi, and M. Ceriotti, "Physics-Inspired Structural Representations for Molecules and Materials," Chem. Rev. 121(16), 9759–9815 (2021).







Dec 5 (Harvey)

The use of machine learning in astrophysics has become increasingly common in the last few years. From the automatic classification of galaxies to the inference of the total matter content of the Universe, its uses are both diverse and ubiquitous. In this part of the course, students will learn about the variety of machine learning models that help astronomers; specifically, I will cover:
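
As an illustration of the galaxy-classification example mentioned above, here is a minimal convolutional-classifier sketch in PyTorch (the single-band 64x64 input size, channel counts, and the three morphology classes are illustrative assumptions, not the models covered in the lecture):

import torch
from torch import nn

# Tiny convolutional network mapping a galaxy cut-out to class logits.
model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 3),  # logits for e.g. elliptical / spiral / irregular
)

images = torch.randn(8, 1, 64, 64)   # a batch of fake single-band cut-outs
logits = model(images)
print(logits.shape)                  # -> torch.Size([8, 3])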


Dec 12 (Carleo)

Lecturer: Prof. Giuseppe Carleo

Students will present and discuss the following papers: 


[1] Neural-network quantum state tomography, G. Torlai, G. Mazzola, J. Carrasquilla, M. Troyer, R. Melko, and G. Carleo, Nature Physics 14, 447–450 (2018)

[2] Quantum many-body dynamics in two dimensions with artificial neural networks, M. Schmitt and M. Heyl, Phys. Rev. Lett. 125, 100503 (2020)

[3] Deep-neural-network solution of the electronic Schrödinger equation, J. Hermann, Z. Schätzle, and F. Noé, Nature Chemistry 12, 891-897 (2020)

Notice that most of the details are often contained in the supplementary material! In most cases, you should focus on the high-level concepts, not on the detailed derivations.



Dec 19 (Harvey)