Information Theory in Neuroscience

Piasini, Eugenio

Information Theory in Neuroscience. MDPI (Multidisciplinary Digital Publishing Institute), 2019. 1 electronic resource (280 p.)

Open Access

As the ultimate information processing device, the brain naturally lends itself to study through information theory. Applying information theory to neuroscience has spurred the development of principled theories of brain function, advanced the study of consciousness, and produced analytical techniques for cracking the neural code, that is, for unveiling the language neurons use to encode and process information. In particular, experimental techniques that allow precise, large-scale recording and manipulation of neural activity now make it possible, for the first time, to formulate and quantitatively test hypotheses about how the brain encodes and transmits, across areas, the information used for specific functions. This Special Issue presents twelve original contributions on novel information-theoretic approaches in neuroscience, and on new information-theoretic results inspired by problems in neuroscience.
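The central quantity behind the neural-code analyses described above is the mutual information between stimulus and response. As a minimal illustration (not taken from any chapter of the volume; the function name and toy distributions are my own), here is a plug-in estimate of I(S;R) in bits from a joint probability table:

```python
import numpy as np

def mutual_information(joint):
    """Plug-in estimate of I(S;R) in bits from a joint probability table.

    Rows index stimuli S, columns index responses R (e.g. spike counts).
    """
    joint = np.asarray(joint, dtype=float)
    joint = joint / joint.sum()             # normalize to a distribution
    ps = joint.sum(axis=1, keepdims=True)   # marginal over stimuli
    pr = joint.sum(axis=0, keepdims=True)   # marginal over responses
    nz = joint > 0                          # skip zero cells: 0*log(0) := 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (ps @ pr)[nz])))

# Toy joint distribution over 2 stimuli x 2 response levels.
# A perfectly informative (diagonal) code carries 1 bit:
perfect = [[0.5, 0.0],
           [0.0, 0.5]]
print(mutual_information(perfect))      # 1.0

# Responses independent of the stimulus carry 0 bits:
independent = [[0.25, 0.25],
               [0.25, 0.25]]
print(mutual_information(independent))  # 0.0
```

In practice, such naive plug-in estimates are biased for limited numbers of trials, which is one motivation for the bias-correction and maximum-entropy methods featured in several of the contributions.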


Creative Commons


English

ISBN 978-3-03897-665-3; 978-3-03897-664-6

DOI: 10.3390/books978-3-03897-665-3

synergy; Gibbs measures; categorical perception; entorhinal cortex; neural network; perceived similarity; graph theoretical analysis; orderness; navigation network; eigen-entropy; Ising model; higher-order correlations; discrimination; information theory; recursion; goodness; consciousness; neuroscience; feedforward networks; spike train statistics; decoding; eigenvector centrality; discrete Markov chains; submodularity; free-energy principle; infomax principle; neural information propagation; integrated information; mismatched decoding; maximum entropy principle; perceptual magnet; graph theory; internal model hypothesis; channel capacity; complex networks; representation; latching; noise correlations; independent component analysis; mutual information decomposition; connectome; redundancy; mutual information; information; entropy production; unconscious inference; hippocampus; neural population coding; spike-time precision; neural coding; maximum entropy; neural code; Potts model; pulse-gating; functional connectome; integrated information theory; minimum information partition; brain network; Queyranne's algorithm; principal component analysis
