Schedule for: 20ss230 - Online Open Probability School

Beginning on Sunday, May 17 and ending on Thursday, August 13, 2020

All times are in Banff, Alberta time (MDT, UTC-6).

Monday, May 18
10:00 - 11:10 Jean Christophe Mourrat: Rank-one matrix estimation and Hamilton-Jacobi equations - 1
We consider the problem of estimating a large rank-one matrix from noisy observations. This inference problem is known to have a phase transition, in the sense that partial recovery of the original matrix is possible only if the signal-to-noise ratio exceeds a certain (non-zero) threshold. We will present a new proof of this fact based on the study of a Hamilton-Jacobi equation. This alternative argument yields better rates of convergence and also seems more amenable to extensions to other models such as spin glasses.
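
For orientation, here is a minimal LaTeX sketch of the standard rank-one ("spiked Wigner") setting; the notation and normalisation are my own and not necessarily those used in the lectures.

    % Observation model: recover the rank-one matrix x x^T from Y,
    % with W a Wigner matrix and t the signal-to-noise ratio.
    \[
      Y \;=\; \sqrt{\tfrac{t}{N}}\, x x^{\mathsf T} + W, \qquad x \in \mathbb{R}^N .
    \]
    % In the Hamilton-Jacobi approach, the free energy of the posterior,
    % enriched with an auxiliary parameter h, is shown to converge to the
    % solution f(t,h) of an equation of the schematic form
    \[
      \partial_t f \;=\; c\,(\partial_h f)^2 ,
    \]
    % where c depends on the chosen normalisation; the phase transition
    % corresponds to a point where f fails to be smooth.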
(Online)
Tuesday, May 19
10:00 - 11:10 Jean Christophe Mourrat: Rank-one matrix estimation and Hamilton-Jacobi equations - 2
We consider the problem of estimating a large rank-one matrix from noisy observations. This inference problem is known to have a phase transition, in the sense that partial recovery of the original matrix is possible only if the signal-to-noise ratio exceeds a certain (non-zero) threshold. We will present a new proof of this fact based on the study of a Hamilton-Jacobi equation. This alternative argument yields better rates of convergence and also seems more amenable to extensions to other models such as spin glasses.
(Online)
Thursday, May 21
10:00 - 11:10 Jean Christophe Mourrat: Rank-one matrix estimation and Hamilton-Jacobi equations - 3
We consider the problem of estimating a large rank-one matrix from noisy observations. This inference problem is known to have a phase transition, in the sense that partial recovery of the original matrix is possible only if the signal-to-noise ratio exceeds a certain (non-zero) threshold. We will present a new proof of this fact based on the study of a Hamilton-Jacobi equation. This alternative argument yields better rates of convergence and also seems more amenable to extensions to other models such as spin glasses.
(Online)
Monday, May 25
10:00 - 11:10 Gady Kozma: Critical and Near-Critical Percolation - 1
Critical and near-critical percolation is well understood in dimension 2 and in high dimensions. The behaviour in intermediate dimensions (in particular 3) is still largely not understood, but in recent years there has been some progress in this field, with contributions by van den Berg, Cerf, Duminil-Copin, Tassion and others. We will survey this recent progress (and a few older but not sufficiently well-known results).

Prerequisites: The Fortuin–Kasteleyn–Ginibre (FKG) and van den Berg–Kesten (BK) inequalities.
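
For reference, here are standard statements of the two prerequisite inequalities, for increasing events A and B under a product measure such as Bernoulli percolation (my phrasing, not taken from the course):

    % FKG (positive association):
    \[
      \mathbb{P}(A \cap B) \;\ge\; \mathbb{P}(A)\,\mathbb{P}(B).
    \]
    % BK, with A \circ B denoting the disjoint occurrence of A and B:
    \[
      \mathbb{P}(A \circ B) \;\le\; \mathbb{P}(A)\,\mathbb{P}(B).
    \]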
(Online)
Wednesday, May 27
10:00 - 11:10 Gady Kozma: Critical and Near-Critical Percolation - 2
Critical and near-critical percolation is well understood in dimension 2 and in high dimensions. The behaviour in intermediate dimensions (in particular 3) is still largely not understood, but in recent years there has been some progress in this field, with contributions by van den Berg, Cerf, Duminil-Copin, Tassion and others. We will survey this recent progress (and a few older but not sufficiently well-known results).

Prerequisites: The Fortuin–Kasteleyn–Ginibre (FKG) and van den Berg–Kesten (BK) inequalities.
(Online)
Thursday, May 28
10:00 - 11:10 Gady Kozma: Critical and Near-Critical Percolation - 3
Critical and near-critical percolation is well understood in dimension 2 and in high dimensions. The behaviour in intermediate dimensions (in particular 3) is still largely not understood, but in recent years there has been some progress in this field, with contributions by van den Berg, Cerf, Duminil-Copin, Tassion and others. We will survey this recent progress (and a few older but not sufficiently well-known results).

Prerequisites: The Fortuin–Kasteleyn–Ginibre (FKG) and van den Berg–Kesten (BK) inequalities.
(Online)
Monday, June 1
10:00 - 11:10 Nina Gantert: Branching Random Walks: some recent results and open questions - 1
We give an introduction to branching random walks and their continuous counterpart, branching Brownian motion. We explain some recent results on the maximum of a branching random walk and its relation to point processes, as well as a connection with fragmentations. The focus will be on open questions.
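
As orientation for the results on the maximum, recall the classical asymptotics for the maximal position M_n of a supercritical branching random walk with light-tailed displacements (a standard statement included here for orientation, not taken from the course materials):

    \[
      M_n \;=\; n\,v \;-\; \frac{3}{2\theta^{*}}\,\log n \;+\; O_{\mathbb{P}}(1),
    \]
    % where v is the linear speed of the front and \theta^* is the optimal
    % tilting parameter; after this centring the maximum converges in law,
    % and the extremal particles form a decorated Poisson point process,
    % which is the point-process connection mentioned in the abstract.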

Preparatory reading: Lyons and Peres, Probability on Trees and Networks, Sections 5.1 (Galton-Watson branching processes) and 13.8 (Tree-indexed random walks)

Further reading:

Zhan Shi, Branching Random Walks
Julien Berestycki, Topics on Branching Brownian motion
Ofer Zeitouni, Branching Random Walks and Gaussian Fields
(Online)
Tuesday, June 2
10:00 - 11:10 Nina Gantert: Branching Random Walks: some recent results and open questions - 2
We give an introduction to branching random walks and their continuous counterpart, branching Brownian motion. We explain some recent results on the maximum of a branching random walk and its relation to point processes, as well as a connection with fragmentations. The focus will be on open questions.

Preparatory reading: Lyons and Peres, Probability on Trees and Networks, Sections 5.1 (Galton-Watson branching processes) and 13.8 (Tree-indexed random walks)

Further reading:

Zhan Shi, Branching Random Walks
Julien Berestycki, Topics on Branching Brownian motion
Ofer Zeitouni, Branching Random Walks and Gaussian Fields
(Online)
Thursday, June 4
10:00 - 11:10 Nina Gantert: Branching Random Walks: some recent results and open questions - 3
We give an introduction to branching random walks and their continuous counterpart, branching Brownian motion. We explain some recent results on the maximum of a branching random walk and its relation to point processes, as well as a connection with fragmentations. The focus will be on open questions.

Preparatory reading: Lyons and Peres, Probability on Trees and Networks, Sections 5.1 (Galton-Watson branching processes) and 13.8 (Tree-indexed random walks)

Further reading:

Zhan Shi, Branching Random Walks
Julien Berestycki, Topics on Branching Brownian motion
Ofer Zeitouni, Branching Random Walks and Gaussian Fields
(Online)
Friday, June 5
10:00 - 10:30 Piotr Dyszewski: Branching random walks and stretched exponential tails (Online)
10:30 - 11:00 Samuel Johnston: The extremal particles of branching Brownian motion (Online)
Monday, June 8
10:00 - 11:30 Ivan Corwin: Gibbsian Line Ensembles in Integrable Probability - 1
Many important models in integrable probability (e.g. the KPZ equation, solvable directed polymers, ASEP, stochastic six vertex model) can be embedded into Gibbsian line ensembles. This hidden probabilistic structure provides new tools to control the behavior and asymptotics of these systems. In my first talk, I will discuss the Airy line ensemble and its origins and properties. In my second talk, I will discuss the KPZ line ensemble and explain how this structure is used to probe the temporal correlation structure of the KPZ equation. In my final talk, I will zoom out and discuss the origins of this hidden structure.
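
For readers new to the area, the KPZ equation referred to above is the stochastic PDE below, in one common normalisation (constants vary between references), and "Gibbsian" refers to a Brownian resampling property of the associated line ensembles.

    % KPZ equation for a height function H(t,x); xi is space-time white noise:
    \[
      \partial_t H \;=\; \tfrac{1}{2}\,\partial_x^2 H \;+\; \tfrac{1}{2}\,(\partial_x H)^2 \;+\; \xi .
    \]
    % Brownian Gibbs property (informal): conditionally on everything outside a
    % compact window, a curve of the ensemble inside the window is a Brownian
    % bridge conditioned to stay below its upper neighbour and above its lower one.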

In my lectures I will try to work from first principles as much as possible. The TA sessions and problem sets will serve to reinforce and fill in details from the lectures. Additionally, there will be thematically relevant short talks delivered immediately after my lectures.

Note: This course will consist of 90-minute lectures and will be accompanied by short presentations on related topics. A detailed schedule of those is here.
(Zoom)
Tuesday, June 9
10:00 - 11:30 Ivan Corwin: Gibbsian Line Ensembles in Integrable Probability - 2
Many important models in integrable probability (e.g. the KPZ equation, solvable directed polymers, ASEP, stochastic six vertex model) can be embedded into Gibbsian line ensembles. This hidden probabilistic structure provides new tools to control the behavior and asymptotics of these systems. In my first talk, I will discuss the Airy line ensemble and its origins and properties. In my second talk, I will discuss the KPZ line ensemble and explain how this structure is used to probe the temporal correlation structure of the KPZ equation. In my final talk, I will zoom out and discuss the origins of this hidden structure.

In my lectures I will try to work from first principles as much as possible. The TA sessions and problem sets will serve to reinforce and fill in details from the lectures. Additionally, there will be thematically relevant short talks delivered immediately after my lectures.

Note: This course will consist of 90-minute lectures and will be accompanied by short presentations on related topics. A detailed schedule of those is here.
(Zoom)
Wednesday, June 10
10:00 - 11:10 Ivan Corwin: Gibbsian Line Ensembles in Integrable Probability - 3
Many important models in integrable probability (e.g. the KPZ equation, solvable directed polymers, ASEP, stochastic six vertex model) can be embedded into Gibbsian line ensembles. This hidden probabilistic structure provides new tools to control the behavior and asymptotics of these systems. In my first talk, I will discuss the Airy line ensemble and its origins and properties. In my second talk, I will discuss the KPZ line ensemble and explain how this structure is used to probe the temporal correlation structure of the KPZ equation. In my final talk, I will zoom out and discuss the origins of this hidden structure.

In my lectures I will try to work from first principles as much as possible. The TA sessions and problem sets will serve to reinforce and fill in details from the lectures. Additionally, there will be thematically relevant short talks delivered immediately after my lectures.

Note: This course will consist of 90-minute lectures and will be accompanied by short presentations on related topics. A detailed schedule of those is here.
(Online)
Thursday, June 11
10:00 - 10:40 Duncan Dauvergne: The Airy Sheet
The Airy sheet is a random two-parameter function that arises as the scaling limit of last passage percolation when both the start and end point are allowed to vary spatially. It is also the fundamental building block of the richer scaling limit, the directed landscape. In this talk, I will describe how the Airy sheet is built from asymptotic last passage values along parabolas in the Airy line ensemble via a curious property of the RSK bijection. Based on joint work with Janosch Ortmann and Balint Virag.
(Zoom)
10:50 - 11:10 Jacob Calvert: The quantitatively Brownian nature of the Airy line ensemble - 1
In these two talks we will discuss a quantitative form of Brownianity enjoyed by the Airy line ensemble, proved using techniques which develop the basic Brownian Gibbs property. The first talk will state the main result, provide some context, and illustrate its applicability with a simple example. The second talk will discuss in broad strokes the framework of the proof and the role of the Brownian Gibbs property, ideas which may be of use in other problems.

(Zoom)
11:10 - 11:30 Milind Hegde: The quantitatively Brownian nature of the Airy line ensemble - 2
In these two talks we will discuss a quantitative form of Brownianity enjoyed by the Airy line ensemble, proved using techniques which develop the basic Brownian Gibbs property. The first talk will state the main result, provide some context, and illustrate its applicability with a simple example. The second talk will discuss in broad strokes the framework of the proof and the role of the Brownian Gibbs property, ideas which may be of use in other problems.

(Zoom)
11:30 - 11:50 Erik Bates: Endpoints of disjoint Geodesics in the directed Landscape
Brownian last passage percolation (LPP) has proved a fruitful arena for understanding the geometry of geodesics, or "polymers", arising in models within the Kardar-Parisi-Zhang universality class. The recent construction by Dauvergne, Ortmann, and Virág of a scaling limit for Brownian LPP, the directed landscape, offers a setting in which this understanding can be phrased in terms of fractal geometry. Moreover, the measure-theoretic "content" of the directed landscape is carried by exceptional space-time points admitting disjoint geodesics. This talk will quantify, in terms of Hausdorff dimension, the size of some of these exceptional sets, thus shedding light on the landscape's rich fractal structure. (Joint work with Shirshendu Ganguly and Alan Hammond.)
(Zoom)
12:00 - 12:40 Shirshendu Ganguly: Geodesic Watermelons in Last Passage Percolation (Zoom)
Friday, June 12
10:00 - 10:40 David Croydon: Invariant measures for KdV and Toda-type discrete integrable systems
This talk is based on joint work with Makiko Sasada (University of Tokyo) and Satoshi Tsujimoto (Kyoto University). I will give a brief introduction to four discrete integrable systems, which are derived from the KdV and Toda lattice equations, and discuss some arguments that are useful in identifying invariant measures for them. As a first key input, I will describe how it is possible to construct global solutions for each of the systems of interest using variants of Pitman's transformation. Secondly, I will present a "detailed balance" criterion for identifying i.i.d.-type invariant measures, and will relate this to approaches used to study various stochastic integrable systems, such as last passage percolation, random polymers, and higher spin vertex models. In many of the examples I discuss, solutions to the detailed balance criterion are given by well-known characterizations of certain standard distributions, including the exponential, geometric, gamma and generalized inverse Gaussian distributions. Our work leads to a number of natural conjectures about the characterization of some other standard distributions.
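
For context, the transformation referred to here is, in one common convention (my choice of normalisation), the map sending a path X to twice its running maximum minus the path; the discrete systems in the talk are solved by suitable variants of this map.

    \[
      (\mathcal{T}X)_t \;=\; 2\sup_{0 \le s \le t} X_s \;-\; X_t .
    \]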
(Zoom)
10:50 - 11:10 Xuan Wu: Tightness of the KPZ line ensemble
A long-standing conjecture in the KPZ universality class is the convergence of the solution H_t of the KPZ equation to the Airy_2 process. In recent work, Corwin and Hammond proposed a scheme to attack this conjecture through Gibbsian line ensembles, in which H_t is embedded as the top curve. In this talk, we will discuss the tightness of the KPZ line ensemble as t varies.
(Zoom)
11:10 - 11:30 Sourav Sarkar: Brownian absolute continuity of KPZ fixed point with arbitrary initial condition (Zoom)
11:30 - 11:50 Lingfu Zhang: Empirical distribution along geodesics in exponential last passage percolation (Zoom)
12:00 - 12:40 Sylvie Corteel: Multispecies ASEP and Macdonald polynomials (Zoom)
Monday, June 15
10:00 - 11:10 Perla Sousi: Mixing and hitting times for Markov chains - 1
Mixing and hitting times are fundamental parameters of a Markov chain. In this mini-course I will discuss connections between them for reversible Markov chains.
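
As a small numerical companion (entirely my own illustration, not course material), the Python sketch below computes, for the lazy random walk on a cycle, the total-variation mixing time and the expected hitting time of one representative "large" set, the two quantities whose equivalence (up to universal constants, for reversible lazy chains) this course will relate.

    # Lazy simple random walk on a cycle of n sites: compare the total-variation
    # mixing time with the expected hitting time of a set of stationary measure
    # >= 1/4, in the spirit of the "mixing times are hitting times of large sets"
    # equivalence for reversible lazy chains.
    import numpy as np

    n = 32
    P = np.zeros((n, n))
    for i in range(n):
        P[i, i] = 0.5                       # laziness
        P[i, (i - 1) % n] += 0.25
        P[i, (i + 1) % n] += 0.25
    pi = np.full(n, 1.0 / n)                # uniform stationary distribution

    def tv_mixing_time(P, pi, eps=0.25):
        """Smallest t such that max_x ||P^t(x, .) - pi||_TV <= eps."""
        Pt = np.eye(len(pi))
        t = 0
        while 0.5 * np.abs(Pt - pi).sum(axis=1).max() > eps:
            Pt = Pt @ P
            t += 1
        return t

    def expected_hitting_times(P, A):
        """E_x[T_A] for every state x, solving (I - P restricted to A^c) h = 1."""
        outside = np.ones(P.shape[0], dtype=bool)
        outside[list(A)] = False
        Q = P[np.ix_(outside, outside)]
        h = np.zeros(P.shape[0])
        h[outside] = np.linalg.solve(np.eye(outside.sum()) - Q,
                                     np.ones(outside.sum()))
        return h

    A = set(range(n // 4))                  # an arc of stationary measure exactly 1/4
    print("t_mix(1/4):", tv_mixing_time(P, pi))
    print("max_x E_x[T_A]:", expected_hitting_times(P, A).max())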
(Zoom)
Tuesday, June 16
10:00 - 11:10 Perla Sousi: Mixing and hitting times for Markov chains - 2
Mixing and hitting times are fundamental parameters of a Markov chain. In this mini-course I will discuss connections between them for reversible Markov chains.
(Zoom)
Thursday, June 18
10:00 - 11:10 Perla Sousi: Mixing and hitting times for Markov chains - 3
Mixing and hitting times are fundamental parameters of a Markov chain. In this mini-course I will discuss connections between them for reversible Markov chains.
(Zoom)
Monday, June 22
10:00 - 11:20 Frank den Hollander (with Elena Pulvirenti): Metastability for Interacting Particle Systems - 1
Metastability is a widespread phenomenon in the dynamics of non-linear systems subject to noise. In the narrower perspective of statistical physics, metastable behaviour can be seen as the dynamical manifestation of a first-order phase transition.

A fruitful approach to metastability is via potential theory. The key point is the realisation that most questions of interest can be reduced to the computation of capacities, and that these capacities in turn can be estimated by exploiting variational principles. In this way, the metastable dynamics of the system can essentially be understood via an analysis of its statics. This constitutes a major simplification, and acts as a guiding principle. The setting of potential theory relevant for interacting particle systems is that of reversible Markov processes.
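
To make "capacities and variational principles" concrete, the two identities driving the approach are, schematically and roughly in the notation of the monograph cited below (reversible dynamics, stationary measure mu, Dirichlet form E):

    % Mean crossover time from a metastable set A to a target set B:
    \[
      \mathbb{E}_{\nu_{A,B}}[\tau_B]
      \;=\; \frac{1}{\mathrm{cap}(A,B)} \sum_{x} \mu(x)\, h_{A,B}(x),
    \]
    % with h_{A,B} the equilibrium potential (= 1 on A, = 0 on B, harmonic
    % elsewhere) and nu_{A,B} the equilibrium measure on A.
    % Dirichlet principle, which turns capacities into a variational problem:
    \[
      \mathrm{cap}(A,B) \;=\; \inf\bigl\{\, \mathcal{E}(f,f) : f|_A = 1,\ f|_B = 0 \,\bigr\}.
    \]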

Within this limitation, there is a wide range of models that are adequate to describe a variety of different systems. Our aim is to unveil the common universal features of these systems with respect to their metastable behaviour.

The first lecture will be an introduction to metastability. In the other three lectures, we will focus on three examples in detail:

Kawasaki dynamics on lattices.
Glauber dynamics on random graphs.
Widom-Rowlinson dynamics on the continuum.

Reference: Anton Bovier and Frank den Hollander, Metastability: A Potential-Theoretic Approach, Grundlehren der mathematischen Wissenschaften 351, Springer, Berlin, 2015.
(Online)
Tuesday, June 23
10:00 - 11:20 Frank den Hollander (with Elena Pulvirenti): Metastability for Interacting Particle Systems - 2
Metastability is a widespread phenomenon in the dynamics of non-linear systems subject to noise. In the narrower perspective of statistical physics, metastable behaviour can be seen as the dynamical manifestation of a first-order phase transition.

A fruitful approach to metastability is via potential theory. The key point is the realisation that most questions of interest can be reduced to the computation of capacities, and that these capacities in turn can be estimated by exploiting variational principles. In this way, the metastable dynamics of the system can essentially be understood via an analysis of its statics. This constitutes a major simplification, and acts as a guiding principle. The setting of potential theory relevant for interacting particle systems is that of reversible Markov processes.

Within this limitation, there is a wide range of models that are adequate to describe a variety of different systems. Our aim is to unveil the common universal features of these systems with respect to their metastable behaviour.

The first lecture will be an introduction to metastability. In the other three lectures, we will focus on three examples in detail:

Kawasaki dynamics on lattices.
Glauber dynamics on random graphs.
Widom-Rowlinson dynamics on the continuum.

Reference: Anton Bovier and Frank den Hollander, Metastability: A Potential-Theoretic Approach, Grundlehren der mathematischen Wissenschaften 351, Springer, Berlin, 2015.
(Online)
Thursday, June 25
10:00 - 11:20 Frank den Hollander (with Elena Pulvirenti): Metastability for Interacting Particle Systems - 3
Metastability is a widespread phenomenon in the dynamics of non-linear systems subject to noise. In the narrower perspective of statistical physics, metastable behaviour can be seen as the dynamical manifestation of a first-order phase transition.

A fruitful approach to metastability is via potential theory. The key point is the realisation that most questions of interest can be reduced to the computation of capacities, and that these capacities in turn can be estimated by exploiting variational principles. In this way, the metastable dynamics of the system can essentially be understood via an analysis of its statics. This constitutes a major simplification, and acts as a guiding principle. The setting of potential theory relevant for interacting particle systems is that of reversible Markov processes.

Within this limitation, there is a wide range of models that are adequate to describe a variety of different systems. Our aim is to unveil the common universal features of these systems with respect to their metastable behaviour.

The first lecture will be an introduction to metastability. In the other three lectures, we will focus on three examples in detail:

Kawasaki dynamics on lattices.
Glauber dynamics on random graphs.
Widom-Rowlinson dynamics on the continuum.

Reference: Anton Bovier and Frank den Hollander, Metastability: A Potential-Theoretic Approach, Grundlehren der mathematischen Wissenschaften 351, Springer, Berlin, 2015.
(Online)
Friday, June 26
10:00 - 11:20 Frank den Hollander (with Elena Pulvirenti): Metastability for Interacting Particle Systems - 4
Metastability is a widespread phenomenon in the dynamics of non-linear systems subject to noise. In the narrower perspective of statistical physics, metastable behaviour can be seen as the dynamical manifestation of a first-order phase transition.

A fruitful approach to metastability is via potential theory. The key point is the realisation that most questions of interest can be reduced to the computation of capacities, and that these capacities in turn can be estimated by exploiting variational principles. In this way, the metastable dynamics of the system can essentially be understood via an analysis of its statics. This constitutes a major simplification, and acts as a guiding principle. The setting of potential theory relevant for interacting particle systems is that of reversible Markov processes.

Within this limitation, there is a wide range of models that are adequate to describe a variety of different systems. Our aim is to unveil the common universal features of these systems with respect to their metastable behaviour.

The first lecture will be an introduction to metastability. In the other three lectures, we will focus on three examples in detail:

Kawasaki dynamics on lattices.
Glauber dynamics on random graphs.
Widom-Rowlinson dynamics on the continuum.

Reference: Anton Bovier and Frank den Hollander, Metastability: A Potential-Theoretic Approach, Grundlehren der mathematischen Wissenschaften 351, Springer, Berlin, 2015.
(Online)
Monday, June 29
10:00 - 11:10 Aukosh Jagannath: A Brief Introduction to Mean Field Spin Glass Models - 1
Historically, mean-field spin glass models come from statistical physics, where they have served as prototypical examples of complex energy landscapes. To tackle the questions they raise, statistical physicists developed a new class of tools, such as the cavity method and replica symmetry breaking. Since their introduction, these methods have been applied to a wide variety of problems, from statistical physics to combinatorics to data science. This course will serve as a high-level introduction to the basics of mean-field spin glasses and is intended to introduce students to the basic notions that will arise in other courses during the school. On the first day, we plan to cover the random energy model, the ultrametric decomposition of Gibbs measures in p-spin glass models, and the connection to Poisson-Dirichlet statistics. On the second day, if there is time, we will also introduce the notions of free energy barriers and overlap gaps and their connection to spectral gap inequalities and algorithmic hardness results.

Suggested prerequisites: Measure-theoretic probability; point processes and their definition as random probability measures; basic notions from Gaussian analysis (concentration of measure, Slepian's interpolation inequality).
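
For concreteness, the basic objects of the course can be fixed as follows (standard definitions, written in my own normalisation):

    % Pure p-spin Hamiltonian on the hypercube, with i.i.d. standard Gaussians g:
    \[
      H_N(\sigma) \;=\; \frac{1}{N^{(p-1)/2}} \sum_{i_1,\dots,i_p=1}^{N}
        g_{i_1 \dots i_p}\, \sigma_{i_1} \cdots \sigma_{i_p},
      \qquad \sigma \in \{-1,+1\}^N,
    \]
    % a centred Gaussian process whose covariance is determined by the overlap,
    \[
      \mathbb{E}\bigl[H_N(\sigma) H_N(\tau)\bigr] \;=\; N\, R(\sigma,\tau)^{p},
      \qquad R(\sigma,\tau) \;=\; \frac{1}{N}\sum_{i=1}^{N} \sigma_i \tau_i ,
    \]
    % while the random energy model (REM) is the degenerate case in which the
    % energies of distinct configurations are taken to be independent.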
(Online)
11:30 - 12:40 Aukosh Jagannath: A Brief Introduction to Mean Field Spin Glass Models - 2
Historically, mean-field spin glass models come from statistical physics, where they have served as prototypical examples of complex energy landscapes. To tackle the questions they raise, statistical physicists developed a new class of tools, such as the cavity method and replica symmetry breaking. Since their introduction, these methods have been applied to a wide variety of problems, from statistical physics to combinatorics to data science. This course will serve as a high-level introduction to the basics of mean-field spin glasses and is intended to introduce students to the basic notions that will arise in other courses during the school. On the first day, we plan to cover the random energy model, the ultrametric decomposition of Gibbs measures in p-spin glass models, and the connection to Poisson-Dirichlet statistics. On the second day, if there is time, we will also introduce the notions of free energy barriers and overlap gaps and their connection to spectral gap inequalities and algorithmic hardness results.

Suggested prerequisites: Measure-theoretic probability; point processes and their definition as random probability measures; basic notions from Gaussian analysis (concentration of measure, Slepian's interpolation inequality).
(Online)
Tuesday, June 30
10:00 - 11:10 Aukosh Jagannath: A Brief Introduction to Mean Field Spin Glass Models - 3
Historically, mean-field spin glass models come from statistical physics, where they have served as prototypical examples of complex energy landscapes. To tackle the questions they raise, statistical physicists developed a new class of tools, such as the cavity method and replica symmetry breaking. Since their introduction, these methods have been applied to a wide variety of problems, from statistical physics to combinatorics to data science. This course will serve as a high-level introduction to the basics of mean-field spin glasses and is intended to introduce students to the basic notions that will arise in other courses during the school. On the first day, we plan to cover the random energy model, the ultrametric decomposition of Gibbs measures in p-spin glass models, and the connection to Poisson-Dirichlet statistics. On the second day, if there is time, we will also introduce the notions of free energy barriers and overlap gaps and their connection to spectral gap inequalities and algorithmic hardness results.

Suggested prerequisites: Measure-theoretic probability; point processes and their definition as random probability measures; basic notions from Gaussian analysis (concentration of measure, Slepian's interpolation inequality).
(Online)
11:30 - 12:40 Aukosh Jagannath: A Brief Introduction to Mean Field Spin Glass Models - 4
Historically, mean-field spin glass models come from statistical physics, where they have served as prototypical examples of complex energy landscapes. To tackle the questions they raise, statistical physicists developed a new class of tools, such as the cavity method and replica symmetry breaking. Since their introduction, these methods have been applied to a wide variety of problems, from statistical physics to combinatorics to data science. This course will serve as a high-level introduction to the basics of mean-field spin glasses and is intended to introduce students to the basic notions that will arise in other courses during the school. On the first day, we plan to cover the random energy model, the ultrametric decomposition of Gibbs measures in p-spin glass models, and the connection to Poisson-Dirichlet statistics. On the second day, if there is time, we will also introduce the notions of free energy barriers and overlap gaps and their connection to spectral gap inequalities and algorithmic hardness results.

Suggested prerequisites: Measure-theoretic probability; point processes and their definition as random probability measures; basic notions from Gaussian analysis (concentration of measure, Slepian's interpolation inequality).
(Online)
Wednesday, July 1
10:00 - 11:10 Amin Coja-Oghlan: Disordered systems and random graphs - 1
Besides being a classical research topic at the junction of combinatorics and probability, with applications in several other disciplines, random graphs and their phase transitions have been attracting the interest of the statistical physics community. From a statistical physics viewpoint, random graphs can be viewed as disordered systems, real-world examples of which include glasses and spin glasses. Physicists have thus brought to bear techniques centered around the notion of "replica symmetry breaking", thereby putting forward a multitude of predictions. In this course we will learn about the present state of the art with respect to making these predictions rigorous, and about the new mathematical tools developed over recent years. Additionally, we will look at applications, particularly in the area of Bayesian inference.

Prerequisites:
(Online)
11:30 - 12:40 Aukosh Jagannath: A Brief Introduction to Mean Field Spin Glass Models - 5
Historically, mean-field spin glass models come from statistical physics, where they have served as prototypical examples of complex energy landscapes. To tackle the questions they raise, statistical physicists developed a new class of tools, such as the cavity method and replica symmetry breaking. Since their introduction, these methods have been applied to a wide variety of problems, from statistical physics to combinatorics to data science. This course will serve as a high-level introduction to the basics of mean-field spin glasses and is intended to introduce students to the basic notions that will arise in other courses during the school. On the first day, we plan to cover the random energy model, the ultrametric decomposition of Gibbs measures in p-spin glass models, and the connection to Poisson-Dirichlet statistics. On the second day, if there is time, we will also introduce the notions of free energy barriers and overlap gaps and their connection to spectral gap inequalities and algorithmic hardness results.

Suggested prerequisites: Measure-theoretic probability; point processes and their definition as random probability measures; basic notions from Gaussian analysis (concentration of measure, Slepian's interpolation inequality).
(Online)
Thursday, July 2
10:00 - 11:10 Eliran Subag: TAP approach and optimization of full-RSB spherical spin glasses
I will describe a proof of the celebrated Thouless-Anderson-Palmer representation for the free energy which follows from first principles and concentration results, and also extends it to all overlaps in the support of the Parisi measure. I will then explain how certain consequences of the representation concerning the location of maxima can be used to design an algorithm to find an approximate global maximizer in polynomial time, in the full-RSB case.
(Online)
11:30 - 12:40 Amin Coja-Oghlan: Disordered systems and random graphs - 2
Besides being a classical research topic at the junction of combinatorics and probability, with applications in several other disciplines, random graphs and their phase transitions have been attracting the interest of the statistical physics community. From a statistical physics viewpoint, random graphs can be viewed as disordered systems, real-world examples of which include glasses and spin glasses. Physicists have thus brought to bear techniques centered around the notion of "replica symmetry breaking", thereby putting forward a multitude of predictions. In this course we will learn about the present state of the art with respect to making these predictions rigorous, and about the new mathematical tools developed over recent years. Additionally, we will look at applications, particularly in the area of Bayesian inference.

Prerequisites:
(Online)
Friday, July 3
10:00 - 11:10 Amin Coja-Oghlan: Disordered systems and random graphs - 3
Besides being a classical research topic at the junction of combinatorics and probability, with applications in several other disciplines, random graphs and their phase transitions have been attracting the interest of the statistical physics community. From a statistical physics viewpoint, random graphs can be viewed as disordered systems, real-world examples of which include glasses and spin glasses. Physicists have thus brought to bear techniques centered around the notion of "replica symmetry breaking", thereby putting forward a multitude of predictions. In this course we will learn about the present state of the art with respect to making these predictions rigorous, and about the new mathematical tools developed over recent years. Additionally, we will look at applications, particularly in the area of Bayesian inference.

Prerequisites:
(Online)
Monday, July 6
10:00 - 11:10 Andrea Montanari: Mean field methods in high-dimensional statistics and nonconvex optimization - 1
Starting in the seventies, physicists introduced a class of random energy functions, and corresponding random probability distributions (Gibbs measures), known as mean-field spin glasses. Over the years, it has become increasingly clear that a broad array of canonical models in random combinatorics and (more recently) high-dimensional statistics are in fact examples of mean-field spin glasses, and can be studied using tools developed in that area. Crucially, these new application domains have brought up a number of interesting new questions that were not central from the viewpoint of statistical physics. These lectures will focus on these new questions: (i) Statistical questions: what is the accuracy or uncertainty associated with a given statistical method? (ii) Computational questions: can we efficiently compute marginals of a Gibbs measure? Can we generate low-energy configurations?

The following is a rough outline of the lectures:

1) High-dimensional statistics: general setting and key questions; the role of sharp asymptotics; examples and general phenomena.
2) Message passing algorithms, and approximate message passing (AMP); sharp analysis of AMP.
3) Optimal AMP algorithms; connection with Bayes error; connection with convex optimization.
4) Replica symmetry breaking; the Parisi formula; computational implications.
5) Optimization algorithms for mean-field spin glasses.

This course will be accompanied by exercise sessions.
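
As a pointer to item 2) of the outline, one common form of the approximate message passing (AMP) iteration for a symmetric n x n data matrix A is the following (my notation; variants differ in how the denoisers f_t and the correction term are chosen):

    % AMP iteration with Onsager correction; f_t is applied coordinate-wise:
    \[
      x^{t+1} \;=\; A\, f_t(x^{t}) \;-\; \mathsf b_t\, f_{t-1}(x^{t-1}),
      \qquad
      \mathsf b_t \;=\; \frac{1}{n} \sum_{i=1}^{n} f_t'(x^{t}_i).
    \]
    % "Sharp analysis" refers to state evolution: in the high-dimensional limit,
    % each iterate behaves like a rescaled signal plus Gaussian noise, with the
    % scalar parameters tracked by an explicit recursion.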
(Online)
Tuesday, July 7
10:00 - 11:10 Andrea Montanari: Mean field methods in high-dimensional statistics and nonconvex optimization - 2
Starting in the seventies, physicists introduced a class of random energy functions, and corresponding random probability distributions (Gibbs measures), known as mean-field spin glasses. Over the years, it has become increasingly clear that a broad array of canonical models in random combinatorics and (more recently) high-dimensional statistics are in fact examples of mean-field spin glasses, and can be studied using tools developed in that area. Crucially, these new application domains have brought up a number of interesting new questions that were not central from the viewpoint of statistical physics. These lectures will focus on these new questions: (i) Statistical questions: what is the accuracy or uncertainty associated with a given statistical method? (ii) Computational questions: can we efficiently compute marginals of a Gibbs measure? Can we generate low-energy configurations?

The following is a rough outline of the lectures:

1) High-dimensional statistics: general setting and key questions; the role of sharp asymptotics; examples and general phenomena.
2) Message passing algorithms, and approximate message passing (AMP); sharp analysis of AMP.
3) Optimal AMP algorithms; connection with Bayes error; connection with convex optimization.
4) Replica symmetry breaking; the Parisi formula; computational implications.
5) Optimization algorithms for mean-field spin glasses.

This course will be accompanied by exercise sessions.
(Online)
Wednesday, July 8
10:00 - 11:10 Andrea Montanari: Mean field methods in high-dimensional statistics and nonconvex optimization - 3
Starting in the seventies, physicists introduced a class of random energy functions, and corresponding random probability distributions (Gibbs measures), known as mean-field spin glasses. Over the years, it has become increasingly clear that a broad array of canonical models in random combinatorics and (more recently) high-dimensional statistics are in fact examples of mean-field spin glasses, and can be studied using tools developed in that area. Crucially, these new application domains have brought up a number of interesting new questions that were not central from the viewpoint of statistical physics. These lectures will focus on these new questions: (i) Statistical questions: what is the accuracy or uncertainty associated with a given statistical method? (ii) Computational questions: can we efficiently compute marginals of a Gibbs measure? Can we generate low-energy configurations?

The following is a rough outline of the lectures:

1) High-dimensional statistics: general setting and key questions; the role of sharp asymptotics; examples and general phenomena.
2) Message passing algorithms, and approximate message passing (AMP); sharp analysis of AMP.
3) Optimal AMP algorithms; connection with Bayes error; connection with convex optimization.
4) Replica symmetry breaking; the Parisi formula; computational implications.
5) Optimization algorithms for mean-field spin glasses.

This course will be accompanied by exercise sessions.
(Online)
Thursday, July 9
10:00 - 11:10 Andrea Montanari: Mean field methods in high-dimensional statistics and nonconvex optimization - 4
Starting in the seventies, physicists introduced a class of random energy functions, and corresponding random probability distributions (Gibbs measures), known as mean-field spin glasses. Over the years, it has become increasingly clear that a broad array of canonical models in random combinatorics and (more recently) high-dimensional statistics are in fact examples of mean-field spin glasses, and can be studied using tools developed in that area. Crucially, these new application domains have brought up a number of interesting new questions that were not central from the viewpoint of statistical physics. These lectures will focus on these new questions: (i) Statistical questions: what is the accuracy or uncertainty associated with a given statistical method? (ii) Computational questions: can we efficiently compute marginals of a Gibbs measure? Can we generate low-energy configurations?

The following is a rough outline of the lectures:

1) High-dimensional statistics: general setting and key questions; the role of sharp asymptotics; examples and general phenomena.
2) Message passing algorithms, and approximate message passing (AMP); sharp analysis of AMP.
3) Optimal AMP algorithms; connection with Bayes error; connection with convex optimization.
4) Replica symmetry breaking; the Parisi formula; computational implications.
5) Optimization algorithms for mean-field spin glasses.

This course will be accompanied by exercise sessions.
(Online)
11:30 - 12:40 Léo Miolane: Information-theoretic limits of Bayesian inference in Gaussian noise
We will discuss briefly the statistical estimation of a signal (vector, matrix, tensor...) corrupted by Gaussian noise. We will restrict ourselves to information-theoretic considerations and draw connections with statistical physics (random energy model, p-spin model).
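
A key identity behind such information-theoretic arguments in Gaussian noise is the I-MMSE relation of Guo, Shamai and Verdú, which in one standard scalar normalisation reads:

    \[
      \frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}}\,
      I\bigl(X;\, \sqrt{\mathrm{snr}}\,X + Z\bigr)
      \;=\; \tfrac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
      \qquad Z \sim \mathcal{N}(0,1) \text{ independent of } X,
    \]
    % linking the mutual information between the signal and its noisy observation
    % to the minimum mean-square error of estimating X from that observation.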
(Online)
Friday, July 10
10:00 - 11:10 Andrea Montanari: Mean field methods in high-dimensional statistics and nonconvex optimization - 5
Starting in the seventies, physicists introduced a class of random energy functions, and corresponding random probability distributions (Gibbs measures), known as mean-field spin glasses. Over the years, it has become increasingly clear that a broad array of canonical models in random combinatorics and (more recently) high-dimensional statistics are in fact examples of mean-field spin glasses, and can be studied using tools developed in that area. Crucially, these new application domains have brought up a number of interesting new questions that were not central from the viewpoint of statistical physics. These lectures will focus on these new questions: (i) Statistical questions: what is the accuracy or uncertainty associated with a given statistical method? (ii) Computational questions: can we efficiently compute marginals of a Gibbs measure? Can we generate low-energy configurations?

The following is a rough outline of the lectures:

1) High-dimensional statistics: general setting and key questions; the role of sharp asymptotics; examples and general phenomena.
2) Message passing algorithms, and approximate message passing (AMP); sharp analysis of AMP.
3) Optimal AMP algorithms; connection with Bayes error; connection with convex optimization.
4) Replica symmetry breaking; the Parisi formula; computational implications.
5) Optimization algorithms for mean-field spin glasses.

This course will be accompanied by exercise sessions.
(Online)
Wednesday, July 15
10:00 - 11:10 Elchanan Mossel: Simplicity and Complexity in Belief Propagation - 3
TBA
(Online)
11:30 - 12:40 Shirshendu Ganguly: Large deviations for random networks and applications - 1
While large deviations theory for sums and other linear functions of independent random variables is well developed and classical, the set of tools to analyze non-linear functions, such as polynomials, is limited. Canonical examples of such non-linear functions include subgraph counts and spectral observables in random networks. In this series of lectures we will review the exciting recent developments around building a suitable nonlinear large deviations theory to treat such random variables and to understand geometric properties of large random networks conditioned on associated rare events. We will start with a discussion of dense graphs and see how the theory of graphons provides a natural framework to study large deviations in this setting. We will then primarily focus on sparse graphs and the new technology needed to treat them. Finally, we will see how the above and new ideas can be used to study spectral properties in this context. If time permits, we will also discuss exponential random graphs, a well-known family of Gibbs measures on graphs, and the bearing this theory has on them. The lectures will aim to offer a glimpse of the different ideas and tools that come into play, including tools from extremal graph theory, arithmetic combinatorics and spectral graph theory. Several open problems will also be discussed throughout the course.
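
To fix ideas, the prototypical question here is the upper tail for the triangle count T of the Erdős-Rényi graph G(n,p); schematically, and suppressing the precise range of p and the constants (which are part of the recent work surveyed in these lectures):

    % Dense case (p fixed): speed n^2, with a rate given by a graphon
    % variational problem,
    \[
      -\log \mathbb{P}\bigl(T \ge (1+\delta)\,\mathbb{E}T\bigr) \;\asymp\; n^{2} ;
    \]
    % sparse case (p = p(n) -> 0 in a suitable range):
    \[
      -\log \mathbb{P}\bigl(T \ge (1+\delta)\,\mathbb{E}T\bigr)
      \;\asymp\; n^{2} p^{2} \log(1/p).
    \]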
(Online)
Thursday, July 16
10:00 - 11:10 Shirshendu Ganguly: Large deviations for random networks and applications - 2
While large deviations theory for sums and other linear functions of independent random variables is well developed and classical, the set of tools to analyze non-linear functions, such as polynomials, is limited. Canonical examples of such non-linear functions include subgraph counts and spectral observables in random networks. In this series of lectures we will review the exciting recent developments around building a suitable nonlinear large deviations theory to treat such random variables and to understand geometric properties of large random networks conditioned on associated rare events. We will start with a discussion of dense graphs and see how the theory of graphons provides a natural framework to study large deviations in this setting. We will then primarily focus on sparse graphs and the new technology needed to treat them. Finally, we will see how the above and new ideas can be used to study spectral properties in this context. If time permits, we will also discuss exponential random graphs, a well-known family of Gibbs measures on graphs, and the bearing this theory has on them. The lectures will aim to offer a glimpse of the different ideas and tools that come into play, including tools from extremal graph theory, arithmetic combinatorics and spectral graph theory. Several open problems will also be discussed throughout the course.
(Online)
Friday, July 17
10:00 - 11:10 Shirshendu Ganguly: Large deviations for random networks and applications - 3
While large deviations theory for sums and other linear functions of independent random variables is well developed and classical, the set of tools to analyze non-linear functions, such as polynomials, is limited. Canonical examples of such non-linear functions include subgraph counts and spectral observables in random networks. In this series of lectures we will review the exciting recent developments around building a suitable nonlinear large deviations theory to treat such random variables and to understand geometric properties of large random networks conditioned on associated rare events. We will start with a discussion of dense graphs and see how the theory of graphons provides a natural framework to study large deviations in this setting. We will then primarily focus on sparse graphs and the new technology needed to treat them. Finally, we will see how the above and new ideas can be used to study spectral properties in this context. If time permits, we will also discuss exponential random graphs, a well-known family of Gibbs measures on graphs, and the bearing this theory has on them. The lectures will aim to offer a glimpse of the different ideas and tools that come into play, including tools from extremal graph theory, arithmetic combinatorics and spectral graph theory. Several open problems will also be discussed throughout the course.
(Online)
Monday, August 3
10:00 - 11:10 Nina Holden: Schramm-Loewner evolution and imaginary geometry - 1

The Schramm-Loewner evolution (SLE) is a random fractal curve in the plane that describes the scaling limit of interfaces in several statistical physics models. It is uniquely characterized by two properties known as conformal invariance and the domain Markov property. The first two lectures of the course will be an introduction to SLE and its basic properties via classical Loewner chain theory. The third lecture will be about imaginary geometry, which gives a very useful alternative perspective on SLE.
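
For orientation, the chordal Loewner evolution that the first two lectures build on can be written as follows (standard normalisation):

    % Chordal Loewner equation in the upper half-plane, with driving function W:
    \[
      \partial_t g_t(z) \;=\; \frac{2}{g_t(z) - W_t}, \qquad g_0(z) = z ,
    \]
    % and SLE_kappa is obtained by taking W_t = sqrt(kappa) B_t for a standard
    % Brownian motion B.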

Prerequisites: The course will require no prior knowledge beyond standard graduate probability courses. There will be a small amount of stochastic calculus on a few occasions (e.g. I will refer to Itô's formula).

(Online)
Tuesday, August 4
10:00 - 11:10 Nina Holden: Schramm-Loewner evolution and imaginary geometry - 2

The Schramm-Loewner evolution (SLE) is a random fractal curve in the plane that describes the scaling limit of interfaces in several statistical physics models. It is uniquely characterized by two properties known as conformal invariance and the domain Markov property. The first two lectures of the course will be an introduction to SLE and its basic properties via classical Loewner chain theory. The third lecture will be about imaginary geometry, which gives a very useful alternative perspective on SLE.

Prerequisites: The course will require no prior knowledge beyond standard graduate probability courses. There will be a small amount of stochastic calculus on a few occasions (e.g. I will refer to Itô's formula).

(Online)
Thursday, August 6
10:00 - 11:10 Nina Holden: Schramm-Loewner evolution and imaginary geometry - 3

The Schramm-Loewner evolution (SLE) is a random fractal curve in the plane that describes the scaling limit of interfaces in several statistical physics models. It is uniquely characterized by two properties known as conformal invariance and the domain Markov property. The first two lectures of the course will be an introduction to SLE and its basic properties via classical Loewner chain theory. The third lecture will be about imaginary geometry, which gives a very useful alternative perspective on SLE.

Prerequisites: The course will require no prior knowledge beyond standard graduate probability courses. There will be a small amount of stochastic calculus on a few occasions (e.g. I will refer to Itô's formula).

(Online)
Monday, August 10
10:00 - 11:10 Tom Hutchcroft: Uniform spanning trees in high dimension - 1

The uniform spanning tree has played an important role in modern probability theory as a non-trivial statistical mechanics model that is much more tractable than other (more physically relevant) models such as percolation and the Ising model. It also enjoys many connections with other topics in probability and beyond, including electrical networks, loop-erased random walk, dimers, sandpiles, l^2 Betti numbers, and so on. In this course, I will introduce the model and explain how we can understand its large-scale behaviour at and above the upper critical dimension d=4.

In Lecture 1 I will discuss the main sampling algorithms for the UST and the connectivity/disconnectivity transition in dimension 4. Chapter 4 of my lecture notes contains complementary material that will flesh out many of the details from this lecture. Lectures 2 and 3 will discuss scaling exponents in dimensions d>=4 and will be based primarily on the papers arXiv:1804.04120 and arXiv:1512.08509 and forthcoming work with Perla Sousi.
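
As a hands-on companion to the "main sampling algorithms" of Lecture 1, here is a short Python sketch of Wilson's algorithm (loop-erased random walks) on a small grid; the graph, grid size and root are my own illustrative choices.

    # Wilson's algorithm: grow the tree by running random walks and adding their
    # loop-erasures, which samples a uniform spanning tree of the grid graph.
    import random

    def grid_neighbours(v, n):
        """Neighbours of v = (x, y) in the n x n grid graph."""
        x, y = v
        return [(x + dx, y + dy) for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]
                if 0 <= x + dx < n and 0 <= y + dy < n]

    def wilson_ust(n, root=(0, 0), seed=0):
        """Sample a uniform spanning tree of the n x n grid, rooted at `root`."""
        rng = random.Random(seed)
        in_tree = {root}
        parent = {}
        for start in [(x, y) for x in range(n) for y in range(n)]:
            if start in in_tree:
                continue
            # Random walk from `start` until it hits the tree; recording only the
            # *last* exit from each vertex erases loops automatically.
            succ, v = {}, start
            while v not in in_tree:
                succ[v] = rng.choice(grid_neighbours(v, n))
                v = succ[v]
            # Add the loop-erased path from `start` to the tree.
            v = start
            while v not in in_tree:
                parent[v] = succ[v]
                in_tree.add(v)
                v = succ[v]
        return parent  # edges {child: parent} of the sampled spanning tree

    tree = wilson_ust(8)
    print(len(tree), "edges in the sampled spanning tree of the 8 x 8 grid")  # 63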

(Online)
Tuesday, August 11
10:00 - 11:10 Tom Hutchcroft: Uniform spanning trees in high dimension - 2

The uniform spanning tree has played an important role in modern probability theory as a non-trivial statistical mechanics model that is much more tractable than other (more physically relevant) models such as percolation and the Ising model. It also enjoys many connections with other topics in probability and beyond, including electrical networks, loop-erased random walk, dimers, sandpiles, l^2 Betti numbers, and so on. In this course, I will introduce the model and explain how we can understand its large-scale behaviour at and above the upper critical dimension d=4.

In Lecture 1 I will discuss the main sampling algorithms for the UST and the connectivity/disconnectivity transition in dimension 4. Chapter 4 of my lecture notes contains complementary material that will flesh out many of the details from this lecture. Lectures 2 and 3 will discuss scaling exponents in dimensions d>=4 and will be based primarily on the papers arXiv:1804.04120 and arXiv:1512.08509 and forthcoming work with Perla Sousi.

(Online)
Thursday, August 13
10:00 - 11:10 Tom Hutchcroft: Uniform spanning trees in high dimension - 3

The uniform spanning tree has played an important role in modern probability theory as a non-trivial statistical mechanics model that is much more tractable than other (more physically relevant) models such as percolation and the Ising model. It also enjoys many connections with other topics in probability and beyond, including electrical networks, loop-erased random walk, dimers, sandpiles, l^2 Betti numbers, and so on. In this course, I will introduce the model and explain how we can understand its large-scale behaviour at and above the upper critical dimension d=4.

In Lecture 1 I will discuss the main sampling algorithms for the UST and the connectivity/disconnectivity transition in dimension 4. Chapter 4 of my lecture notes contains complementary material that will flesh out many of the details from this lecture. Lectures 2 and 3 will discuss scaling exponents in dimensions d>=4 and will be based primarily on the papers arXiv:1804.04120 and arXiv:1512.08509 and forthcoming work with Perla Sousi.

(Online)