dc.description.abstract | The term statistical modelling refers to a family of abstract models designed to reproduce and help understand the statistical properties of the activity of neuronal
networks at the population level. Large-scale recordings by multielectrode arrays
(MEAs) have now made it possible to scale their use to larger groups of neurons.
The first part of this work focused on improving the data analysis pipeline that leads from the experimental protocol used in dense MEA recordings to a clean dataset of sorted spike times, suitable for model training. In collaboration with experimentalists, I contributed to developing a fast and scalable spike-sorting algorithm based on action potential shapes and on the estimated spatial location of each spike. Using the resulting datasets, I investigated the use of restricted
Boltzmann machines (RBMs) in the analysis of neural data, finding that they can serve as a tool for detecting neural ensembles or low-dimensional activity
subspaces. I further studied the physical properties of RBMs fitted to neural activity,
finding that they exhibit signatures of criticality, as previously observed in similar
models. I discussed possible connections between this phenomenon and the "dynamical"
criticality often observed in neuronal networks that exhibit emergent
behaviour. Finally, I applied these insights into the structure of the parameter space of statistical models to derive a learning rule that supports the long-term storage of previously learned memories in Hopfield networks during sequential learning tasks. Overall, this work aimed to contribute to the computational tools
used for analysing and modelling large neuronal populations at several levels, starting from raw experimental recordings and gradually proceeding towards theoretical aspects. | en