Statistical Learning in the Nervous System
(in Hungarian, Statisztikai tanulás az idegrendszerben)
Spring semester, 2023/24
Time: Mondays, 4:15 pm – 5:45 pm
First lecture: 12 February 2024
Location: BME E404 (Egry József utca)
Language of the course: Hungarian
The course is listed at the following universities:
- Eötvös University (ELTE), Neptun code: mv2n9044, CCNM17-214
- Technical University (BME), in master's programmes with code BMETE47MC39; in PhD programmes with code BMETE47D119
- Pázmány University (PPKE): students can earn credit for the ELTE course by having it accredited after completion
- Everyone else is welcome; personalized administrative arrangements can be made if needed
Lecturers: Gergő Orbán, Anna Székely
The course covers selected topics in the functional description of the nervous system, with a special focus on statistical methods. Efficient methods for learning about visual data are described, along with the ways the nervous system implements these computations. Course materials from previous years can be accessed here. Background reading for all lectures is listed here.
List of exam topics (updated on 23 May 2023)
Introduction. Computational approach, perception as inference, representation, coding, why probabilities?
Knowledge representation. Formal systems, logic, probability theory
Probabilistic models. Probability calculus, graphical models, Bayesian inference, approximate inference
- Slides (updated on 11 March 2024)
- Supporting materials
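As a flavour of the Bayesian machinery these lectures introduce, inference can be carried out exactly on a discretized parameter grid. The coin-bias example below is a hypothetical illustration, not part of the course materials:

```python
import numpy as np

# Hypothetical example: infer a coin's bias from observed flips.
# Discretize the bias on a grid and apply Bayes' rule directly.
theta = np.linspace(0.01, 0.99, 99)        # candidate bias values
prior = np.ones_like(theta) / len(theta)   # uniform prior

heads, tails = 7, 3
likelihood = theta**heads * (1 - theta)**tails
posterior = prior * likelihood
posterior /= posterior.sum()               # normalize over the grid

map_estimate = theta[np.argmax(posterior)] # → 0.7
```

Because the grid is one-dimensional, no approximation beyond discretization is needed; the later lectures turn to cases where such exhaustive enumeration is infeasible.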
Bayesian behaviour
- Slides (updated on 18 April 2023)
- Handwritten notes for ‘Optimal predictions’
- 3D plot for ‘Optimal predictions’
- Handwritten notes for Kording&Wolpert
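The core computation behind the Körding & Wolpert notes, combining a Gaussian prior with a Gaussian observation, can be sketched in a few lines (the numbers are made up for illustration, this is not the lecture's code):

```python
def posterior_gaussian(mu_prior, var_prior, obs, var_obs):
    """Posterior of a Gaussian prior combined with one Gaussian observation."""
    # precision-weighted average: the noisier the observation, the more the prior counts
    w = var_obs / (var_prior + var_obs)
    mean = w * mu_prior + (1.0 - w) * obs
    var = 1.0 / (1.0 / var_prior + 1.0 / var_obs)
    return mean, var

# prior belief at 0, noisy observation at 2, equal variances: posterior lands midway
mean, var = posterior_gaussian(0.0, 1.0, 2.0, 1.0)  # → (1.0, 0.5)
```

The posterior variance is always smaller than either input variance, which is the signature of optimal cue combination discussed in class.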
Approximate inference, Sampling. MCMC
- Slides (updated on 11 March 2024)
- MCMC notebook (updated on 23 March 2021)
- Assignment (updated on 25 March 2021)
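As a taste of what the MCMC notebook covers, a minimal random-walk Metropolis–Hastings sampler for a one-dimensional target might look like this (a sketch with an assumed Gaussian target, not the assignment code):

```python
import numpy as np

def metropolis_hastings(log_p, x0, n_samples, step=1.0, seed=0):
    """Random-walk Metropolis sampler for an unnormalized log-density."""
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.standard_normal()
        # accept with probability min(1, p(proposal) / p(x))
        if np.log(rng.random()) < log_p(proposal) - log_p(x):
            x = proposal
        samples[i] = x           # on rejection, the current state is repeated
    return samples

# target: standard normal, log-density known only up to a constant
samples = metropolis_hastings(lambda x: -0.5 * x**2, x0=0.0, n_samples=20_000)
```

Note that the sampler only ever evaluates log-density differences, so the normalizing constant of the target is never needed, which is exactly why MCMC is useful for posterior inference.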
Sampling in cognition
- Slides (updated on 18 April 2023)
- Assignment (updated on 19 April 2021)
Reading:
- Weiss, Y., Simoncelli, E.P., Adelson, E.H., 2002. Motion illusions as optimal percepts. Nat Neurosci 5, 598–604. doi:10.1038/nn858
- Sanborn, A., Griffiths, T., 2008. Markov chain Monte Carlo with people. Advances in neural information processing systems
- Houlsby, N.M.T., Huszar, F., Ghassemi, M.M., Orbán, G., Wolpert, D.M., Lengyel, M., 2013. Cognitive Tomography Reveals Complex, Task-Independent Mental Representations. Curr Biol 1–7. doi:10.1016/j.cub.2013.09.012
Measuring priors
- Slides (updated in May 2024)
Bayesian modelling of vision I. PCA, the Olshausen & Field model, modelling correlations of filters, GSM
- Slides (updated on 19 April 2021)
Reading:
- Dayan & Abbott, Theoretical neuroscience, chapter 10.3 & 10.4
- Olshausen, B.A., Field, D.J., 1996. Emergence of simple-cell receptive field properties by learning a sparse code for natural images. Nature 381, 607–609.
- Karklin, Y., Lewicki, M.S., 2009. Emergence of complex cell properties by learning to generalize in natural scenes. Nature 457, 83–86. doi:10.1038/nature07481
- Schwartz, O., Simoncelli, E.P., 2001. Natural signal statistics and sensory gain control. Nat Neurosci 4, 819–825. doi:10.1038/90526
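Of the models listed for this lecture, PCA is simple enough to sketch directly. A minimal SVD-based version on toy data (illustrative, not the course implementation):

```python
import numpy as np

def pca(X, k):
    """Top-k principal components (as rows) and projections of the data onto them."""
    Xc = X - X.mean(axis=0)                         # center the data
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k], Xc @ Vt[:k].T

# toy data: two strongly correlated features; the first component
# should point along the diagonal
rng = np.random.default_rng(0)
x = rng.standard_normal(500)
X = np.column_stack([x, x + 0.1 * rng.standard_normal(500)])
components, proj = pca(X, 1)
```

PCA captures only second-order statistics; the Olshausen & Field and GSM models discussed in the lecture go beyond this by imposing sparsity and modelling dependencies between filter responses.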
Bayesian modelling of vision II. Complex models of natural images, hierarchical models
Reading:
- Freeman, J., Simoncelli, E.P., 2011. Metamers of the ventral stream. Nat Neurosci 14, 1195–1201. doi:10.1038/nn.2889
- Freeman, J., Ziemba, C.M., Heeger, D.J., Simoncelli, E.P., Movshon, J.A., 2013. A functional and perceptual signature of the second visual area in primates. Nat Neurosci 16, 974–981. doi:10.1038/nn.3402
- DiCarlo, J.J., Cox, D.D., 2007. Untangling invariant object recognition. Trends Cogn Sci 11, 333–341. doi:10.1016/j.tics.2007.06.010
- Ziemba, C.M., Freeman, J., Movshon, J.A., Simoncelli, E.P., 2016. Selectivity and tolerance for visual texture in macaque V2. Proceedings of the National Academy of Sciences 201510847. doi:10.1073/pnas.1510847113
Neural representation of probabilities. PPC, sampling hypothesis
- Slides (updated on 12 May 2021)
Reading:
- Fiser, J., Berkes, P., Orbán, G., Lengyel, M., 2010. Statistically optimal perception and learning: from behavior to neural representations. Trends Cogn Sci 14, 119–130. doi:10.1016/j.tics.2010.01.003
- Bányai, M., Lazar, A., Klein, L., Klon-Lipok, J., Stippinger, M., Singer, W., Orbán, G., 2019. Stimulus complexity shapes response correlations in primary visual cortex. Proc. Natl. Acad. Sci. U. S. A. 20, 201816766–10. doi:10.1073/pnas.1816766116
- Ma, W.J., Beck, J.M., Latham, P.E., Pouget, A., 2006. Bayesian inference with probabilistic population codes. Nat Neurosci 9, 1432–1438. doi:10.1038/nn1790
- Orbán, G., Berkes, P., Fiser, J., Lengyel, M., 2016. Neural Variability and Sampling-Based Probabilistic Representations in the Visual Cortex. Neuron 92, 530–543. doi:10.1016/j.neuron.2016.09.038
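A toy illustration of the PPC idea from Ma et al. (2006): with independent Poisson neurons, the log posterior over the stimulus is linear in the spike counts. All tuning-curve parameters below are made up for illustration:

```python
import numpy as np

s_grid = np.linspace(-5, 5, 201)                 # stimulus values to evaluate
prefs = np.linspace(-4, 4, 20)                   # assumed preferred stimuli
# Gaussian tuning curves, peak rate 10 (arbitrary units)
tuning = 10 * np.exp(-0.5 * (s_grid[:, None] - prefs[None, :]) ** 2)

# simulate one trial: Poisson spike counts for a true stimulus of 1.0
rng = np.random.default_rng(1)
true_s = 1.0
rates = 10 * np.exp(-0.5 * (true_s - prefs) ** 2)
counts = rng.poisson(rates)

# Poisson log likelihood (flat prior, count-independent terms dropped):
# log p(s | counts) ∝ Σ_i counts_i · log f_i(s) − Σ_i f_i(s)
log_post = counts @ np.log(tuning).T - tuning.sum(axis=1)
s_hat = s_grid[np.argmax(log_post)]              # posterior mode, near 1.0
```

The point of the PPC hypothesis is that this linearity lets downstream populations combine evidence by simply summing inputs.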
Learning. Information theory and learning theory. Maximum likelihood learning, minimum description length principle
- Slides (updated in May 2024)
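As a reminder of what maximum-likelihood learning amounts to in the simplest case, the ML estimates of a Gaussian are the sample mean and the biased sample variance (a toy sketch, not course code):

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=0.5, size=10_000)

# closed-form maximum-likelihood estimates for a Gaussian
mu_hat = data.mean()
var_hat = ((data - mu_hat) ** 2).mean()   # note: divides by N, not N-1
```

The same principle, maximizing the probability of the data under a parametrized model, carries over to the far richer models discussed in the lecture, where no closed form exists and gradient-based optimization takes over.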
Structure learning. Automatic Occam’s razor, visual chunk learning
- Slides (updated in May 2024)
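The automatic Occam's razor can be illustrated with two coin models: a flexible model spreads its probability mass over many possible datasets, so for unremarkable data the simpler model attains higher evidence. The numbers are illustrative:

```python
from math import lgamma, log

def log_evidence_flexible(h, t):
    """Marginal likelihood of h heads, t tails under a uniform prior on the bias."""
    # ∫ θ^h (1-θ)^t dθ = B(h+1, t+1), computed via log-gamma for stability
    return lgamma(h + 1) + lgamma(t + 1) - lgamma(h + t + 2)

def log_evidence_fair(h, t):
    """Marginal likelihood under the fixed fair-coin model."""
    return (h + t) * log(0.5)

# near-fair data favours the simpler model...
simple_wins = log_evidence_fair(5, 5) > log_evidence_flexible(5, 5)   # True
# ...while clearly biased data favours the flexible one
flexible_wins = log_evidence_flexible(9, 1) > log_evidence_fair(9, 1) # True
```

No explicit complexity penalty is added anywhere; averaging over the flexible model's parameter is what penalizes it, which is the sense in which the razor is "automatic".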
Computer lab: implementation of Bayesian inference problems (tentative; we will see whether this takes place)
- Slides
- Typo model illustrating algorithmic likelihoods
- Mental rotation model using Stan for posterior sampling
- Learning model parameters with tensorflow
- Olshausen-Field model with natural images
- Olshausen-Field model implemented as a variational autoencoder
Coding exercises
The coding can be done in a Python notebook. The notebooks contain detailed instructions; their goal is not to develop (or assess) coding skills, but to offer, through coding, a glimpse into the use of the tools discussed in class.
The Python notebooks can also be run on your own machine if Python is available (it is open-source software), but it is even simpler to use the Google Colab service.
The two exercises:
Computational Neuroscience
(in Hungarian, Idegrendszeri modellezés)
Neptun: kv2n9o46
Fall semester, yearly.
Lecturers: Gergő Orbán, Balázs Ujfalussy and Zoltán Somogyvári.
Course material can be found at http://cneuro.rmki.kfki.hu/education/neuromodel
The course focuses on basic principles of computational neuroscience: the biophysics of neurons; action potential generation, transduction, and transmission; simple networks of neurons, and their modifications by learning; and the ways the nervous system encodes and decodes information about the environment and about the body.
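A leaky integrate-and-fire neuron, the simplest of the single-neuron models the course starts from, can be sketched as follows (parameters are illustrative, not taken from the course material):

```python
import numpy as np

def lif(I, dt=0.1, tau=10.0, v_rest=-65.0, v_thresh=-50.0, v_reset=-65.0, R=10.0):
    """Euler simulation of a leaky integrate-and-fire neuron.

    I: input current trace (nA), one value per time step of length dt (ms).
    Returns the list of spike times in ms.
    """
    v = v_rest
    spikes = []
    for i, current in enumerate(I):
        # leaky integration of the membrane potential
        v += dt / tau * (v_rest - v + R * current)
        if v >= v_thresh:
            spikes.append(i * dt)   # record spike, then reset
            v = v_reset
    return spikes

# constant 2 nA input for 100 ms drives the neuron to fire regularly
spikes = lif(np.full(1000, 2.0))
```

With these (made-up) parameters the steady-state voltage under the input exceeds threshold, so the neuron fires at a regular rate; biophysically detailed models such as Hodgkin–Huxley, covered in the course, replace the hard threshold with explicit ion-channel dynamics.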