Theoretical Perspectives Enhancing Deep Learning Models

Dima Krotov, Jonathan Cohen, Eric Vanden-Eijnden
Date
Oct 13, 2023, 9:30 am - 4:00 pm
Location
Skylight Room at the CUNY Graduate Center (365 Fifth Ave, Manhattan)

Event Description

PROGRAM

9:30 AM
Coffee and bagels
__

10:00 AM - 11:30 AM
Dense associative memory for novel transformer architectures
Dima Krotov
MIT-IBM Watson AI Lab and IBM Research
__

11:30 AM - 12:00 PM
Break
__

12:00 PM - 1:30 PM
Toward a brain-inspired model of the flexibility and efficiency of human cognition
Jonathan Cohen
Princeton Neuroscience Institute

Modern AI is generally built around either symbolic architectures or neural networks. Symbolic systems are flexible, but can be difficult to configure and inefficient to execute for complex problems. Neural networks can be trained to execute complex functions efficiently, but require massive amounts of data to do so, and cannot be re-applied broadly to new domains that share underlying structure without considerable retraining. In contrast, the human brain achieves both flexibility and efficiency within a single computational architecture. It can carry out symbolic processing, identifying fundamental regularities and flexibly generalizing abstract structure across domains of processing, and at the same time learn complex representations and functions in specific domains and compute these with remarkable efficiency. In this talk, I will highlight ongoing work at the intersection of cognitive neuroscience and machine learning that is revealing fundamental principles about how subsystems in neural architectures, both in the brain and in artificial systems, can interact to achieve the unique combination of flexibility and efficiency characteristic of human cognition.
__

1:30 PM - 2:30 PM
Lunch
__

2:30 PM - 3:30 PM
Stochastic Interpolants: A unified framework for generative modeling with flows and diffusions
Eric Vanden-Eijnden
Courant Institute, New York University