Combinatorics and Probability Seminar
Shuangping Li
Princeton
On the binary perceptron
Abstract: We consider the binary perceptron model, a simple model of neural
networks that has attracted significant attention in the statistical
physics, information theory, and probability theory communities. For
the symmetric binary perceptron model, we establish that the partition
function, normalized by its expected value, converges to a lognormal
distribution. As a consequence, we establish several conjectures for
this model, including the contiguity, sharp threshold, and strong
freezing conjectures.
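For concreteness, in one standard formulation of the symmetric binary perceptron (the normalization below is a common convention and may differ slightly from the one used in the talk), the partition function counts the sign vectors satisfying a two-sided threshold constraint for each of M independent Gaussian patterns:

Z_N \;=\; \sum_{x \in \{-1,+1\}^N} \prod_{a=1}^{M} \mathbf{1}\bigl\{ |\langle g_a, x \rangle| \le \kappa \sqrt{N} \bigr\},
\qquad g_1, \dots, g_M \ \text{i.i.d.} \sim \mathcal{N}(0, I_N), \quad M = \lfloor \alpha N \rfloor,

and the lognormal result above asserts that Z_N / \mathbb{E}[Z_N] converges in distribution to a lognormal random variable as N goes to infinity.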
The strong freezing property further points to an interesting
phenomenon that is not present in sparse constraint satisfaction
problems: on the one hand, the perceptron model is conjectured to be
typically hard; on the other hand, efficient algorithms have been
observed empirically to find solutions, suggesting that such
algorithms find atypical solutions. We formally establish this
phenomenon for both the symmetric and asymmetric binary perceptrons.
We show that at low constraint density, there exists a subdominant
connected cluster of solutions with almost maximal diameter, and that
an efficient multiscale majority algorithm can find solutions in such
a cluster with high probability.
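As a purely illustrative aside (this is not the multiscale majority algorithm from the talk), the following small Python sketch enumerates the solutions of a toy symmetric binary perceptron instance and groups them into clusters under single-coordinate flips; the parameters N, alpha, kappa and the single-flip adjacency notion are assumptions chosen only to make the notions of "solution" and "cluster" concrete.

import itertools
from collections import Counter
import numpy as np

rng = np.random.default_rng(0)

# Toy parameters (illustrative assumptions, not taken from the talk).
N = 12                    # number of +/-1 coordinates; tiny so 2^N enumeration is feasible
alpha = 0.3               # constraint density M/N
kappa = 0.8               # symmetric threshold
M = int(alpha * N)

G = rng.standard_normal((M, N))   # i.i.d. Gaussian constraint vectors g_a

def is_solution(x):
    # x in {-1,+1}^N is a solution if |<g_a, x>| <= kappa * sqrt(N) for every constraint a.
    return np.all(np.abs(G @ x) <= kappa * np.sqrt(N))

solutions = []
for bits in itertools.product((-1, 1), repeat=N):
    x = np.array(bits)
    if is_solution(x):
        solutions.append(x)
print(f"{len(solutions)} solutions among {2 ** N} sign vectors")

# Cluster the solutions: call two solutions adjacent if they differ in exactly
# one coordinate, and take connected components via a simple union-find.
index = {tuple(s): i for i, s in enumerate(solutions)}
parent = list(range(len(solutions)))

def find(i):
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def union(i, j):
    parent[find(i)] = find(j)

for i, s in enumerate(solutions):
    for k in range(N):
        t = s.copy()
        t[k] = -t[k]                  # flip one coordinate
        j = index.get(tuple(t))
        if j is not None:
            union(i, j)

cluster_sizes = Counter(find(i) for i in range(len(solutions)))
print("cluster sizes:", sorted(cluster_sizes.values(), reverse=True))

At such a small N the output does not reflect the large-N freezing behavior described above; the sketch only illustrates the objects (solutions and clusters) that the talk is about.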
This is based on joint works with Emmanuel Abbe and Allan Sly.
Monday March 7, 2022 at 3:00 PM in Zoom