Neuroscientists want to quantify how well neurons, individually and collectively, process information and encode the result in their outputs. But classic information theory only demarcates boundaries on optimal performance; it does not provide results useful for analyzing an existing system about which little is known (such as the brain). Non-Poisson processes, which are required to describe neural signals, are shown to individually have a capacity strictly smaller than the Poisson ideal. I describe recent capacity results for Poisson neural populations, showing that connections among neurons can increase capacity. Going beyond classic theory, I present an alternative theory more amenable to data analysis and to situations wherein interconnected systems actively extract and represent information. Using this theory, we show that the ability of a neural population to jointly represent information depends on the nature of its input signal, not on the encoded information.

Host: Ron Pistone, pistone@lanl.gov