GoToMeeting - https://www.gotomeet.me/PhD_ManuelBeiran
Jury:
Jaime de la Rocha (Rapporteur - IDIBAPS, Spain)
Julijana Gjorgjieva (Rapporteuse - Max Planck Institute for Brain Research, Germany)
Mehrdad Jazayeri (Examinateur - MIT, USA)
Virginie Van Wassenhove (Examinatrice - NeuroSpin, France)
Vincent Hakim (Examinateur - ENS, France)
Srdjan Ostojic (Directeur de thèse - ENS, France)
Abstract:
Neural activity in awake animals exhibits a vast range of timescales,
giving rise to behavior that can adapt to a constantly evolving
environment. How are such complex temporal patterns generated in the
brain, given that individual neurons function with membrane time
constants in the range of tens of milliseconds? How can neural
computations rely on such activity patterns to produce flexible temporal
behavior?
One hypothesis posits that long timescales at the level of neural
network dynamics can be inherited from long timescales of underlying
biophysical processes at the single neuron level, such as adaptive ionic
currents and synaptic transmission. We analyzed large networks of
randomly connected neurons that incorporate these slow cellular
processes, and characterized the temporal statistics of the emerging
neural activity. Our overarching result is that the timescales of
different biophysical processes do not necessarily induce a wide range
of timescales in the collective activity of large recurrent networks.
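The kind of model analyzed here can be sketched as follows. This is an illustrative toy, not the dissertation's exact equations: a random rate network in which each unit carries a slow adaptation variable, with all parameter values (coupling strength, timescales, adaptation gain) chosen as assumptions for demonstration.

```python
import numpy as np

# Random rate network with a slow adaptation current (illustrative sketch).
rng = np.random.default_rng(0)
N = 200
g = 1.5                      # coupling strength (fluctuating regime for g > 1)
tau_x, tau_a = 0.02, 0.5     # fast membrane (20 ms) vs slow adaptation (500 ms)
beta = 0.5                   # adaptation strength (assumed value)
dt, T = 0.001, 5.0
J = g * rng.standard_normal((N, N)) / np.sqrt(N)  # random connectivity

x = 0.1 * rng.standard_normal(N)   # unit activations
a = np.zeros(N)                    # slow adaptation variables
trace = []
for _ in range(int(T / dt)):
    r = np.tanh(x)
    x += dt / tau_x * (-x - beta * a + J @ r)
    a += dt / tau_a * (-a + r)
    trace.append(x[0])

# The autocorrelation of a single unit's activity exposes the dominant
# timescale of the collective dynamics.
trace = np.asarray(trace) - np.mean(trace)
ac = np.correlate(trace, trace, mode="full")[len(trace) - 1:]
ac /= ac[0]
```

Comparing the decay of `ac` for different `tau_a` values is one way to probe whether slow single-neuron processes are inherited by the network activity.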
In contrast, complex temporal patterns can be generated by structure in
synaptic connectivity. In the second chapter of the dissertation, we
considered a novel class of models, Gaussian-mixture low-rank recurrent
networks, in which connectivity structure is characterized by two
independent properties, the rank of the connectivity matrix and the
number of statistically defined populations. We show that such networks
act as universal approximators of arbitrary low-dimensional dynamical
systems, and therefore can generate temporally complex activity.
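A minimal construction of such a connectivity matrix can be sketched as follows, assuming illustrative means and unit-variance fluctuations for the loadings (the specific parameter choices are not from the thesis). The two structural properties appear as independent knobs: the rank `R` of the matrix and the number `P` of Gaussian populations.

```python
import numpy as np

# Gaussian-mixture low-rank connectivity: J = m @ n.T / N, where each
# neuron's loading vectors (rows of m and n) are drawn from one of P
# Gaussian populations. Rank R and population count P are independent.
rng = np.random.default_rng(1)
N, R, P = 300, 2, 3
pop = rng.integers(0, P, size=N)        # population label of each neuron
mu_m = rng.standard_normal((P, R))      # per-population loading means
mu_n = rng.standard_normal((P, R))

# Loadings: population-specific mean plus unit-variance Gaussian noise.
m = mu_m[pop] + rng.standard_normal((N, R))
n = mu_n[pop] + rng.standard_normal((N, R))
J = m @ n.T / N                         # connectivity of rank at most R
```

Because `J` factors through the N-by-R matrices `m` and `n`, the recurrent dynamics are confined to the R-dimensional subspace spanned by the columns of `m`, which is what makes the low-dimensional analysis tractable.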
Finally, we investigated how dynamical mechanisms at the network level
implement flexible sensorimotor timing tasks. We first show that
low-rank networks trained on such tasks generate low-dimensional
invariant manifolds, where dynamics evolve slowly and can be flexibly
modulated. We then identify the core dynamical components and test
them in simplified network models that carry out the same flexible
timing tasks. Overall, we uncovered novel dynamical mechanisms for
temporal flexibility that rely on minimal connectivity structure and can
implement a vast range of computations.
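The temporal flexibility described above can be caricatured in one dimension. The toy below is an assumption for illustration, not the thesis's trained networks: interval production as slow drift along a one-dimensional manifold whose speed is set by a contextual gain, so that rescaling the gain rescales the produced interval.

```python
# Toy model of flexible timing: slow drift along a 1-D manifold, with
# drift speed set by a contextual gain (all values are illustrative).
def produce_interval(gain, dt=0.001, threshold=1.0):
    """Integrate a drift scaled by `gain`; return elapsed time at threshold."""
    x, t = 0.0, 0.0
    while x < threshold:
        x += dt * gain     # slow movement along the manifold
        t += dt
    return t

# Halving the contextual gain doubles the produced interval.
t_fast = produce_interval(gain=2.0)
t_slow = produce_interval(gain=1.0)
```

This speed-modulation picture is the simplest instance of the mechanism: the trajectory's shape (the manifold) is fixed, while an external input flexibly controls how fast the state moves along it.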