SS 2004

Computational Neuroscience: Oberseminar (31187)

Dr. Laurenz Wiskott, Prof. Andreas V. M. Herz, and Dr. Richard Kempter


Oberseminar: Mondays, 11 a.m. to 1 p.m., in the ITB seminar room (Invalidenstraße 43)

In this seminar, current research topics in computational neuroscience are presented. Attendees should have basic knowledge of neuroscience and computational neuroscience, e.g. from the courses "Computational Neuroscience I-IV".


Talks

05.04.2004 Raphael Ritz

19.04.2004 Kay Thurley

26.04.2004 Dr. Isao Tokuda
"Fitting ODE to neural data."

03.05.2004 Wiebke Krambeck
"Effects of Channel Noise on Subthreshold Fluctuations and on the Reliability of Action Potentials"
Abstract: Neural noise produced by probabilistic gating of voltage-dependent ion channels limits the precision and reliability of neuronal responses. In theoretical approaches, including conductance-based models, the effect of noise is often simulated by the addition of noise to the current input while the gating of channels remains deterministic. The consequences of this simplification for spike timing reliability are investigated by direct comparison of stochastic (Markov-type) models and deterministic models with additive current noise.
Literature:
Carson C. Chow and John A. White (1996): "Spontaneous Action Potentials due to Channel Fluctuations" Biophysical Journal, Vol. 71, pages 3013-3021
Elad Schneidman, Barry Freedman, and Idan Segev (1998): "Ion Channel Stochasticity May Be Critical in Determining the Reliability and Precision of Spike Timing" Neural Computation, Vol. 10, pages 1679-1703
J.A. White, J.T. Rubinstein, and A.R. Kay (2000): "Channel noise in neurons" Trends in Neurosciences, Vol. 23, pages 131-137
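To make the contrast in the abstract concrete, here is a minimal sketch, not taken from the talk: a population of two-state ion channels simulated with stochastic (Markov) gating, compared with a deterministic gating variable plus additive Gaussian noise of matched variance. The rates, channel count, and time step are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000                      # number of channels (illustrative)
alpha, beta = 0.5, 1.5        # opening/closing rates (1/ms), illustrative
dt, steps = 0.01, 5000        # time step (ms), number of steps
p_inf = alpha / (alpha + beta)            # steady-state open probability

n_open = int(N * p_inf)       # Markov model state: number of open channels
x = p_inf                     # deterministic gating variable
frac_markov = np.empty(steps)
frac_additive = np.empty(steps)

for t in range(steps):
    # Markov (stochastic) gating: each closed channel opens with
    # probability alpha*dt, each open channel closes with beta*dt.
    n_open += rng.binomial(N - n_open, alpha * dt) - rng.binomial(n_open, beta * dt)
    frac_markov[t] = n_open / N

    # Deterministic gating plus additive Gaussian noise whose variance
    # matches the steady-state binomial variance p(1-p)/N.
    x += dt * (alpha * (1 - x) - beta * x)
    frac_additive[t] = x + rng.normal(0.0, np.sqrt(p_inf * (1 - p_inf) / N))

print("Markov   mean/std:", frac_markov.mean(), frac_markov.std())
print("additive mean/std:", frac_additive.mean(), frac_additive.std())
```

Even with matched single-time variance, the two fluctuation processes differ in their temporal correlations, which is one reason the two noise models can yield different spike timing reliability.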

10.05.2004 Tim Oppermann
"The effect of a phase-reset on a simple neural system"
Abstract: Two different mechanisms have been proposed to explain evoked activity in the brain: linear summation and phase resetting. In my talk I will discuss a particular phase-resetting paradigm and present simulation results obtained from a simple neural system.
Literature:
A. Arieli et al. (1996): "Dynamics of ongoing activity: explanation of the large variability in evoked cortical responses." Science, Vol. 273, page 1868
S. Makeig et al. (2002): "Dynamic brain sources of visual evoked responses." Science, Vol. 295, page 690
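As an illustration of the two mechanisms (illustrative parameters, not the talk's model), the following sketch simulates trial-averaged responses under a linear-summation hypothesis and a phase-reset hypothesis for an ongoing 10 Hz oscillation:

```python
import numpy as np

rng = np.random.default_rng(1)
fs, f = 1000, 10.0                       # sampling rate (Hz), oscillation (Hz)
t = np.arange(-0.5, 1.0, 1.0 / fs)       # time axis (s), stimulus at t = 0
trials = 200
transient = np.exp(-t / 0.05) * (t >= 0) # fixed evoked transient

avg_sum = np.zeros_like(t)
avg_reset = np.zeros_like(t)
for _ in range(trials):
    phi = rng.uniform(0.0, 2.0 * np.pi)  # random phase of ongoing activity
    ongoing = np.sin(2 * np.pi * f * t + phi)
    # Hypothesis 1: evoked transient adds linearly to ongoing activity.
    avg_sum += ongoing + transient
    # Hypothesis 2: the stimulus resets the phase of the ongoing activity.
    avg_reset += np.where(t < 0, ongoing, np.sin(2 * np.pi * f * t))
avg_sum /= trials
avg_reset /= trials

# The random-phase oscillation averages out: summation leaves only the
# transient, while a reset leaves a phase-locked oscillation after t = 0.
print("late oscillation, reset:    ", avg_reset[t > 0.2].std())
print("late oscillation, summation:", avg_sum[t > 0.2].std())
```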

17.05.2004 Dr. Matthias Bethge (Redwood Neuroscience Institute, Menlo Park, CA)
"Sparseness, Coarseness, Slowness: Learning Population Codes for Natural Image Patches"
Abstract: Energy consumption and limitation of space as well as signal processing complexity impose strong evolutionary pressure towards efficient codes of sensory inputs. Efficient coding relies on two complementary aspects: how well the encoding matches the statistics of the input signal (i.e. source coding) and how well it matches the biophysical constraints of neural signal transmission (i.e. channel coding). To date these two aspects have been studied rather independently, in the contexts of representational learning (ICA, sparse coding) and population coding (coarse coding, binary coding). Here, I present a new algorithm -- orthogonal Anti-Matching Pursuit (AMP) -- that makes it possible to bridge the gap between these two aspects. Like (orthogonal) matching pursuit, the algorithm is based on overcomplete dictionaries, but it leads to much sparser coefficient distributions. In fact, for two-times overcomplete dictionaries the deviations of all but one coefficient from zero are negligible. This means that image quality remains very acceptable if for each 8x8 image patch only the best-matching component is used. This is possible because most of the entropy in the analog coefficient values is translated into the many degrees of freedom for selecting a basis from the overcomplete dictionary. Finally, I explain why redundancy reduction in the case of binary Markov channels is related to minimizing the average Hamming distance over time, and I give an example of a binary representation obtained from AMP that exhibits significantly increased temporal stability for a learned dictionary.
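For readers unfamiliar with the baseline, here is a minimal sketch of standard matching pursuit on a 2x overcomplete random dictionary; the Anti-Matching Pursuit variant presented in the talk is not reproduced here, and all sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
dim, n_atoms = 64, 128                    # 8x8 patches, 2x overcomplete
D = rng.normal(size=(dim, n_atoms))
D /= np.linalg.norm(D, axis=0)            # unit-norm dictionary atoms

def matching_pursuit(x, D, n_iter):
    """Greedily approximate x by repeatedly picking the best-matching atom."""
    residual = x.copy()
    coeffs = np.zeros(D.shape[1])
    for _ in range(n_iter):
        proj = D.T @ residual             # correlations with all atoms
        k = np.argmax(np.abs(proj))       # best-matching atom
        coeffs[k] += proj[k]
        residual -= proj[k] * D[:, k]
    return coeffs, residual

x = rng.normal(size=dim)                  # stand-in for an 8x8 image patch
for n_iter in (1, 5, 20):
    coeffs, residual = matching_pursuit(x, D, n_iter)
    err = np.sum(residual**2) / np.sum(x**2)
    print(f"{n_iter:2d} iterations: relative residual energy {err:.3f}")
```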

24.05.2004 Felix Creutzig
"What Can Burst Statistics Tell Us? "
Abstract: Grasshoppers process acoustic information in the metathoracic ganglion. The ascending neuron AN12 responds to the onset of song syllables with bursts. I will present preliminary results which suggest that burst statistics such as the number of spikes per burst are distinctive for each song. Furthermore, I will discuss the role of the temporal fine structure (interspike intervals) of bursts for coding.
Literature:
Stumpner, A. (1988): "Auditorische thorakale Interneurone von Chorthippus biguttulus L.: Morphologische und physiologische Charakterisierung und Darstellung ihrer Filtereigenschaften für verhaltensrelevante Lautattrappen" PhD thesis, pp. 148-159
Krahe, R. and Gabbiani, F. (2004): "Burst Firing in Sensory Systems" Nature Reviews Neuroscience, Vol. 5
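A minimal sketch of one way to compute such burst statistics, using a simple interspike-interval criterion on a toy spike train (the criterion value and the data are hypothetical, not AN12 recordings):

```python
import numpy as np

def burst_stats(spike_times, isi_max=0.008):
    """Group spikes whose interspike intervals are <= isi_max (s) into bursts."""
    bursts, current = [], [spike_times[0]]
    for prev, nxt in zip(spike_times[:-1], spike_times[1:]):
        if nxt - prev <= isi_max:
            current.append(nxt)
        else:
            bursts.append(current)
            current = [nxt]
    bursts.append(current)
    return [len(b) for b in bursts], bursts

# Toy spike train (seconds): a 3-spike burst, a 4-spike burst, one lone spike.
spikes = np.array([0.010, 0.013, 0.016, 0.120, 0.122, 0.125, 0.129, 0.300])
spikes_per_burst, bursts = burst_stats(spikes)
print("spikes per burst:", spikes_per_burst)        # [3, 4, 1]
print("ISIs within burst 2:", np.diff(bursts[1]))   # temporal fine structure
```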

31.05.2004 No seminar (Whit Monday / Pentecost)

07.06.2004 Samuel Glauser
"Comparing electrical and acoustic stimulation and their effects on auditory interneurons in the locust"
Abstract: In this talk, I will compare electrical stimulation of the auditory nerve of the locust with acoustic stimulation of its ear. In addition, I will present the current state and perspectives of my own work.
Literature:
Römer, H. and Rheinländer, J. (1983): "Electrical stimulation of the tympanal nerve as a tool for analysing the responses of auditory interneurons in the locust" Journal of Comparative Physiology, Vol. 152, pages 289-296

14.06.2004 SFB-Workshop "Regulation: Historical and Current Themes in Theoretical Biology"

21.06.2004 Christian Leibold
"Scaling of Sequence Memory in a Recurrent Network"
Abstract: A recurrent neural network can act as a memory for sequences of activity patterns; the hippocampal CA3 region, for instance, is generally believed to serve this function. I present a theoretical framework for calculating the sequence memory capacity of a recurrent network of binary neurons. Assuming discrete dynamics, we calculate the optimal size of a pattern and discuss its relation to sequence memory capacity, as well as its dependence on sequence length and on the resources available for synaptic plasticity. We find that the scaling of sequence memory depends heavily on system constraints such as a constant number of synapses per neuron (surface constraint) or a constant total number of synapses in the network. As a consequence of the surface constraint we can derive the existence of an optimal network size.
Literature:
Hertz J, Krogh A, Palmer R (1991) Introduction to the Theory of Neural Computation. Addison-Wesley, Boston, MA, USA; p. 2
Nowotny T, Huerta R (2003) Explaining synchrony in feed-forward networks: Are McCulloch-Pitts neurons good enough? Biol. Cybern. 89:237-41.
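As an illustration of the setting (not the talk's analytical framework), the following sketch stores a sequence of sparse binary patterns in asymmetric Hebbian weights and recalls it by iterating a binary threshold dynamics; all sizes are arbitrary demonstration values:

```python
import numpy as np

rng = np.random.default_rng(3)
N, M, L = 200, 10, 8          # neurons, active units per pattern, sequence length

def random_pattern():
    x = np.zeros(N)
    x[rng.choice(N, size=M, replace=False)] = 1.0
    return x

seq = [random_pattern() for _ in range(L)]

# Asymmetric Hebbian weights mapping each pattern onto its successor.
W = sum(np.outer(seq[t + 1], seq[t]) for t in range(L - 1))

x = seq[0]
theta = M / 2.0               # firing threshold
for t in range(1, L):
    x = (W @ x > theta).astype(float)     # parallel binary update
    overlap = (x @ seq[t]) / M            # 1.0 means the pattern was recalled
    print(f"step {t}: overlap with stored pattern = {overlap:.2f}")
```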

28.06.2004 Pietro Berkes
"Computational models of complex cells."
Abstract: In this talk I am going to review computational models of complex cells, discussing similarities to and differences from the results presented in Berkes and Wiskott (2003).
Literature:
Hyvärinen, Aapo and Hoyer, Patrik (2000) Emergence of phase- and shift-invariant features by decomposition of natural images into independent feature subspaces. Neural Computation 12 (7), 1705-1720.
Hyvärinen, Aapo and Hoyer, Patrik (2001) A two-layer sparse coding model learns simple and complex cell receptive fields and topography from natural images. Vision Research 41 (18), 2413-2423.
Zetzsche, Christoph and Röhrbein, Florian (2001) Nonlinear and extra-classical receptive field properties and the statistics of natural scenes. Network: Computation in Neural Systems 12, 331-350.
Hashimoto, Wakako (2003) Quadratic forms in natural images. Network: Computation in Neural Systems 14 (4), 765-788.
Berkes, Pietro and Wiskott, Laurenz (2003) Slow feature analysis yields a rich repertoire of complex-cell properties. Cognitive Sciences EPrint Archive (CogPrint) 2804, http://cogprints.ecs.soton.ac.uk/archive/00002804/ .
Körding, K., Kayser, C., Einhäuser, W. and König, P. (2004) How are complex cell properties adapted to the statistics of natural scenes? Journal of Neurophysiology 91(1), 206-212.
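One of the classical models under review, the energy model, can be sketched in a few lines: the squared outputs of two quadrature Gabor filters are summed, giving a response that is selective for orientation and frequency but invariant to the spatial phase of the stimulus. All parameters below are illustrative:

```python
import numpy as np

def gabor(size, freq, theta, phase, sigma):
    """Gabor patch: a cosine grating under a Gaussian envelope."""
    r = np.arange(size) - size // 2
    X, Y = np.meshgrid(r, r)
    Xr = X * np.cos(theta) + Y * np.sin(theta)
    envelope = np.exp(-(X**2 + Y**2) / (2.0 * sigma**2))
    return envelope * np.cos(2 * np.pi * freq * Xr + phase)

size, freq, theta, sigma = 32, 0.1, 0.0, 6.0
g_even = gabor(size, freq, theta, 0.0, sigma)        # quadrature filter pair
g_odd = gabor(size, freq, theta, np.pi / 2, sigma)

def complex_cell(image):
    """Energy model: sum of squared quadrature filter responses."""
    return np.sum(g_even * image) ** 2 + np.sum(g_odd * image) ** 2

# The response is nearly invariant to the spatial phase of a grating:
for grating_phase in (0.0, np.pi / 4, np.pi / 2):
    grating = gabor(size, freq, theta, grating_phase, 1e6)  # ~ pure grating
    print(f"grating phase {grating_phase:.2f}: response {complex_cell(grating):.1f}")
```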

05.07.2004 Thomas Voegtlin
"Representing structured data with neural nets"
Abstract: In order to represent structured data, Hinton proposed using different representations for the same object, depending on whether the object is seen as simple or complex. A first implementation of this idea was proposed by Pollack in a network called the Recursive Auto-Associative Memory (RAAM), which can encode binary trees. I will present a variant of this network that uses linear neurons and is trained with PCA/ICA. Given these unsupervised learning methods, I expect the network to learn from the statistics of its training set. I will present different methods to analyze the internal representations used by the network, in terms of "receptive fields" and hierarchical clustering of activity patterns.
Literature:
Pollack, J.B. (1990), "Recursive Distributed Representations", Artificial Intelligence, 46, pp. 77-105. http://citeseer.ist.psu.edu/pollack90recursive.html
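A minimal sketch of the linear RAAM idea (illustrative only; the training procedure described in the talk is more involved): two child vectors are concatenated and compressed back to the original dimension by a linear map fitted with PCA, and trees are encoded by applying this compressor recursively:

```python
import numpy as np

rng = np.random.default_rng(4)
d = 16                                   # dimension of a node representation

# Fit the linear compressor on a sample of concatenated child pairs
# (here unstructured random data, purely for illustration).
pairs = rng.normal(size=(500, 2 * d))
_, _, Vt = np.linalg.svd(pairs - pairs.mean(axis=0), full_matrices=False)
E = Vt[:d]                               # encoder matrix: R^{2d} -> R^d

def encode(left, right):
    """Compress two child vectors into one parent vector."""
    return E @ np.concatenate([left, right])

def decode(parent):
    """Approximately recover the two children (back-projection)."""
    left, right = np.split(E.T @ parent, 2)
    return left, right

# Encode the binary tree ((a b) c) into a single d-dimensional vector.
a, b, c = (rng.normal(size=d) for _ in range(3))
root = encode(encode(a, b), c)
ab_hat, c_hat = decode(root)
print("reconstruction error of leaf c:", np.linalg.norm(c_hat - c))
```

With unstructured random data this compression is necessarily lossy; a RAAM-style network depends on training data whose statistics make child pairs compressible, which is exactly where the unsupervised learning enters.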

12.07.2004 Gregor Wenning (Neural Information Processing group, Technical University Berlin)
"Noisy Neural Information Processing"
Abstract: I will introduce the content of my PhD thesis and give a brief survey of the following topics:
1) stochastic resonance in a neural context
2) adaptation to the optimal noise level in single neurons
3) energy efficient coding
4) stochastic resonance in populations of neurons
5) detection of transient inputs and the influence of colored noise
6) analytic approximations for the response stimulus correlation in a conductance-based LIF neuron
I intend to focus on topics 2 and 5.
Literature: wenningetal2003.pdf, wenningetal2004.ps (no longer available).
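As background for the stochastic-resonance topics above, here is a textbook-style sketch (not from the thesis): a leaky integrate-and-fire neuron driven by a subthreshold sinusoid plus white noise, where phase locking of the spikes to the signal is typically best at an intermediate noise level. All parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
dt, T = 1e-4, 10.0                 # time step, duration (s)
tau, v_th, v_reset = 0.02, 1.0, 0.0
f_sig = 5.0                        # signal frequency (Hz)
t = np.arange(0.0, T, dt)
signal = 0.8 * np.sin(2 * np.pi * f_sig * t)   # subthreshold drive

def vector_strength(spike_times):
    """Phase locking of spikes to the signal (1 = perfect, 0 = none)."""
    if len(spike_times) == 0:
        return 0.0
    phases = 2 * np.pi * f_sig * np.asarray(spike_times)
    return float(np.abs(np.mean(np.exp(1j * phases))))

for sigma in (0.5, 2.0, 10.0):     # weak / intermediate / strong noise
    v, spikes = 0.0, []
    xi = rng.normal(size=t.size)
    for i, ti in enumerate(t):
        v += dt / tau * (signal[i] - v) + sigma * np.sqrt(dt) * xi[i]
        if v >= v_th:              # threshold crossing: spike and reset
            spikes.append(ti)
            v = v_reset
    print(f"sigma={sigma:5.1f}: {len(spikes):5d} spikes, "
          f"vector strength {vector_strength(spikes):.2f}")
```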

19.07.2004 Mathias Franzius

26.07.2004 Tiziano Zito


Created by Laurenz Wiskott, http://itb.biologie.hu-berlin.de/~wiskott/