J.J. Torres, M.A. Muñoz, J.M. Cortes, J. Mejias. Special Issue on Emergent Effects in Stochastic Neural Networks with Application to Learning and Information Processing. Neurocomputing. In press, 2021.
This special issue comprises 12 articles, a selection of extended contributions presented at the 15th Granada Seminar on Computational and Statistical Physics, held in Granada from September 17 to 20, 2019, and organized by the Institute Carlos I for Theoretical and Computational Physics at the University of Granada.
The brain is a paradigmatic example of a highly complex system, in which cognitive functions emerge from the collective behavior of a large number of microscopic elemental components, such as neurons, synapses, and glial cells. These components in turn interact with multiple elements at higher spatial scales, forming microcircuits and anatomical structures with well-defined cellular, functional, and organizational differentiation.
It is precisely for this reason that the tools and ideas of statistical mechanics and the modern field of complex networks provide rigorous and adequate frameworks to shed light on the collective properties of brain networks, opening a window of opportunity to investigate the theoretical basis of the associated cognitive functions. Theoretical advances in this regard are now crucially complemented by the remarkable availability of both structural and functional data, across a wide range of spatial and temporal scales, provided by recent developments in neuroimaging and neurophysiology.
Despite the large number of scientific publications on the structure and dynamics of brain networks and, in particular, on their interplay with learning and information processing, many challenges remain unsolved, including aspects related to the acquisition and consolidation of memory, the emergence of high-level cognition, and the plasticity and reconfiguration of these networks to compensate for cerebral damage, to name just a few. Computer simulation has proven to be an extremely powerful tool for modeling neural activity and synaptic transmission, as well as for designing biologically inspired circuits.
We are therefore in a position to achieve a much deeper understanding of how the brain works and how its large repertoire of high-level functionalities emerges. This combination of emergent neural properties and complex brain networks, understood from a computational point of view and with applications to artificial intelligence and computer science, was the focus of the 15th edition of the Granada Seminar on Computational and Statistical Physics. The conference constituted a meeting point where the latest advances in neuroscience, computational modeling, and neural network research were presented in a highly interdisciplinary and stimulating environment. This special issue gathers high-quality original contributions presented at the seminar, covering research in experimental neuroscience, computational neuroscience, artificial intelligence, and brain networks.