BEGIN:VCALENDAR
VERSION:2.0
PRODID:icalendar-ruby
CALSCALE:GREGORIAN
METHOD:PUBLISH
BEGIN:VTIMEZONE
TZID:Europe/Vienna
BEGIN:DAYLIGHT
DTSTART:20200329T030000
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=3
TZNAME:CEST
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:20191027T020000
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=10
TZNAME:CET
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20200606T080138Z
UID:1584370800@ist.ac.at
DTSTART;TZID=Europe/Vienna:20200316T160000
DTEND;TZID=Europe/Vienna:20200316T170000
DESCRIPTION:Speaker: Helmut Bölcskei\nhosted by Christoph Lampert\nAbst
 ract: We develop a theory that allows us to characterize the fundament
 al limits of learning in deep neural networks. Concretely\, we conside
 r Kolmogorov-optimal approximation through deep neural networks\, wit
 h the guiding theme being a relation between the epsilon-entropy of th
 e hypothesis class to be learned and the complexity of the approximati
 ng network in terms of connectivity and memory requirements for storin
 g the network topology and the quantized weights and biases. The theor
 y we develop educes remarkable universality properties of deep network
 s. Specifically\, deep networks can Kolmogorov-optimally learn essenti
 ally any hypothesis class. In addition\, we find that deep networks pr
 ovide exponential approximation accuracy\, i.e.\, the approximation er
 ror decays exponentially in the number of non-zero weights in the netw
 ork\, for widely different functions\, including the multiplication op
 eration\, polynomials\, sinusoidal functions\, general smooth function
 s\, and even one-dimensional oscillatory textures and fractal function
 s such as the Weierstrass function\, neither of which has any known me
 thod achieving exponential approximation accuracy. We also show that i
 n the approximation of sufficiently smooth functions\, finite-width de
 ep networks require strictly smaller connectivity than finite-depth wi
 de networks. We conclude with an outlook on the further role our theor
 y could play.
LOCATION:Raiffeisen Lecture Hall\, Central Building\, IST Austria
ORGANIZER:mailto:arinya.eller@ist.ac.at
SUMMARY:(CANCELED) Fundamental limits of learning in deep neural networks
URL:https://talks-calendar.app.ist.ac.at/events/1155
END:VEVENT
END:VCALENDAR