BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//ARNI - ECPv6.15.20//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-WR-CALNAME:ARNI
X-ORIGINAL-URL:https://arni-institute.org
X-WR-CALDESC:Events for ARNI
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:UTC
BEGIN:STANDARD
TZOFFSETFROM:+0000
TZOFFSETTO:+0000
TZNAME:UTC
DTSTART:20230101T000000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=UTC:20240429T113000
DTEND;TZID=UTC:20240429T130000
DTSTAMP:20260515T095104Z
CREATED:20240416T222403Z
LAST-MODIFIED:20240423T191956Z
UID:806-1714390200-1714395600@arni-institute.org
SUMMARY:Lenka Zdeborova (Seminar Speaker)
DESCRIPTION:Title: Phase transition in learning with neural networks\nAbstract: Statistical physics has studied exactly solvable models of neural networks for more than four decades. In this talk\, we will put this line of work in perspective of recent questions stemming from deep learning. We will describe several types of phase transition that appear in the high-dimensional limit as a function of the amount of data. Discontinuous phase transitions are linked to adjacent algorithmic hardness. This so-called hard phase influences the behaviour of gradient-descent-like algorithms. We show a case where the hardness is mitigated by overparametrization\, proposing that the benefits of overparametrization may be linked to the usage of a specific type of algorithm. We will also discuss recent progress in identifying phase transitions and their consequences in networks with attention layers and in sampling with generative diffusion-based networks.
URL:https://arni-institute.org/event/lenka-zdeborova-seminar-speaker/
LOCATION:Zuckerman Institute – L5-084\, 3227 Broadway\, New York\, NY\, United States
END:VEVENT
END:VCALENDAR