Language, learning, and networks

Fri, Dec 4, 2020, 10:00 am
Location: Zoom
Sponsor(s):
CPBF, an NSF Physics Frontiers Center
Initiative for the Theoretical Sciences (ITS)
CUNY doctoral programs in Physics and Biology

We are in the midst of a revolution driven by models for learning in neural networks. In this symposium we will explore these models, their application to the deep networks that represent the state of the art in language processing by machines, and the relation of these ideas to the extraordinary ability of human infants to learn language.

To receive a Zoom invitation and a password, please register at https://forms.gle/g376xGejcLt1A4oG9

10:00 - 11:30  Jenny Saffran, University of Wisconsin
12:00 - 1:30   Surya Ganguli, Stanford University and Google
2:00 - 3:30    Jared Kaplan, Johns Hopkins University and OpenAI

Learning to understand: Statistical learning and infant language development
Jenny Saffran, University of Wisconsin

A mathematical theory of learning in deep neural networks
Surya Ganguli, Stanford University and Google

Neural scaling laws and GPT-3
Jared Kaplan, Johns Hopkins University and OpenAI

Stephanie Palmer: How behavioral and evolutionary constraints sculpt early visual processing

Mon, Feb 1, 2021, 12:15 pm
Location: Zoom

An animal eye is only as efficient as the organism’s behavioral constraints demand it to be.