Invited talk: Jonathan D. VICTOR (NY)

2010-05-27 at 14:00

Université Pierre et Marie Curie (Paris VI), Building B, 5th floor, Room 501


Information-theoretic analysis of neural data: why do it, why it is challenging, and what can be learned

Entropy and information are of interest to neuroscientists because of their mathematical properties and because they place limits on the performance of a neural system. Estimating these quantities from neural spike trains is much more challenging than estimating other statistics, such as the mean and variance. The central difficulty in estimating information is tightly linked to the very properties of information that make it a desirable quantity to estimate. To surmount this fundamental difficulty, most approaches to estimating information rely (perhaps implicitly) on a model for how spike trains are related. These model assumptions vary widely in nature. As a result, information estimates are useful not only in situations in which several approaches provide mutually consistent results, but also in situations in which they differ. These ideas are illustrated with examples from the visual and gustatory systems.
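The estimation difficulty the abstract alludes to can be seen in a minimal sketch (not from the talk itself): the naive "plug-in" entropy estimator, applied to a small sample of spike words, is biased downward because many possible words are never observed. The Miller–Madow correction shown here is one standard first-order bias correction; the data are hypothetical coin-flip "spike words", not neural recordings.

```python
import math
import random

def plugin_entropy(counts):
    """Naive (plug-in) entropy estimate in bits from a histogram of counts."""
    n = sum(counts)
    return -sum((c / n) * math.log2(c / n) for c in counts if c > 0)

def miller_madow_entropy(counts):
    """Plug-in estimate plus the first-order Miller-Madow bias correction,
    (K - 1) / (2 N ln 2) bits, where K is the number of observed symbols."""
    n = sum(counts)
    k = sum(1 for c in counts if c > 0)
    return plugin_entropy(counts) + (k - 1) / (2 * n * math.log(2))

# Toy data: length-8 binary "spike words" from independent fair coin flips,
# so the true entropy is exactly 8 bits. With only 200 samples over 256
# possible words, the plug-in estimate falls well short of the true value.
random.seed(0)
words = [tuple(random.randint(0, 1) for _ in range(8)) for _ in range(200)]
hist = {}
for w in words:
    hist[w] = hist.get(w, 0) + 1
counts = list(hist.values())

print(f"true entropy          : 8.000 bits")
print(f"plug-in estimate      : {plugin_entropy(counts):.3f} bits")
print(f"Miller-Madow corrected: {miller_madow_entropy(counts):.3f} bits")
```

Both estimates remain below the true 8 bits here, illustrating why more data (or stronger model assumptions about how spike trains are related) are needed as the space of possible responses grows.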