Thesis defense

Low-rank network models of neural computations

Speaker(s)
Adrian Valente (LNC2)
Practical information
19 October 2022
3-6pm
Place

ENS, room Favard, 46 rue d'Ulm, 75005 Paris

Abstract

At every instant, myriads of neurons in a nervous system act in concert, generating collective patterns of activity that form the biological substrate of perception, cognition, and behavior. Recent in vivo recordings of hundreds or thousands of neurons in awake animals suggest that these activity patterns occupy specific low-dimensional geometries, from which mental representations can be extracted. Hence, an influential contemporary paradigm in systems neuroscience posits that neural computations emerge from the collective dynamics of networks of neurons, visible at the population level as low-dimensional activity patterns.

Understanding how the structure of a network shapes its dynamics, and ultimately its function, remains an important open question for both artificial and biological neural systems. A promising way to illuminate this question is to study recurrent networks whose connectivity is constrained to be low-rank. This mathematical property has previously been shown to directly induce low-dimensional dynamics in a network, akin to those observed in artificial and biological systems. Can low-rank networks thus help us “open the black box” of recurrent computations? Can they reveal the links between network structure and function? Can they generate hypotheses and insights from neural recordings? This thesis aims to answer these questions by developing new methods to train low-rank RNNs, and a new theory linking a statistical description of network structure to its dynamics and function.
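The claim that low-rank connectivity induces low-dimensional dynamics can be illustrated with a minimal numerical sketch (not the speaker's code; all vectors and parameters below are hypothetical). With a rank-one connectivity matrix J = m nᵀ/N, the recurrent input always points along the vector m, so any component of the state orthogonal to m simply decays, and the activity collapses onto a one-dimensional subspace:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500                       # network size (illustrative)
m = rng.standard_normal(N)    # "output" connectivity vector
# Give n an overlap with m so the network sustains nonzero activity
n = 2.0 * m + 0.5 * rng.standard_normal(N)
J = np.outer(m, n) / N        # rank-one connectivity matrix

# Euler-integrate the standard rate equation dx/dt = -x + J tanh(x)
x = rng.standard_normal(N)    # random high-dimensional initial state
dt = 0.1
for _ in range(2000):
    x = x + dt * (-x + J @ np.tanh(x))

# The recurrent drive J tanh(x) lies in span{m}, so the orthogonal
# component of x decays exponentially under the leak term -x.
x_par = (x @ m) / (m @ m) * m            # projection of x onto m
resid = np.linalg.norm(x - x_par) / np.linalg.norm(x)
print(resid)                             # fraction of activity outside span{m}
```

After the transient, `resid` is essentially zero: the state has converged to a fixed point proportional to m, even though it started at a generic point in the N-dimensional state space. A rank-R matrix generalizes this to an R-dimensional subspace.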

We will start by presenting our theory and methods, showing how low-rank connectivity matrices link structure and computation in a recurrent network. We will then dissect low-rank networks trained on a range of cognitive tasks, extracting mechanistic insights from their trained connectivity. Next, we will relate the low-rank RNN model class to a family of methods widely used to interpret neural recordings: latent dynamical system models. Finally, we will demonstrate how low-rank RNNs can probe the computational mechanisms of unconstrained, full-rank RNNs and help interpret in vivo cortical recordings.