Statistical mechanics of complex neural systems and high dimensional data

Advani, Madhu ; Lahiri, Subhaneil ; Ganguli, Surya

Journal of statistical mechanics, 2013-03, Vol.2013 (3), p.P03014-66 [Peer-reviewed journal]

IOP Publishing and SISSA

  • Title: Statistical mechanics of complex neural systems and high dimensional data
  • Author: Advani, Madhu ; Lahiri, Subhaneil ; Ganguli, Surya
  • Subjects: Computation ; Computer simulation ; Dynamical systems ; Dynamics ; Mathematical analysis ; Mathematical models ; Neural networks ; Statistical mechanics
  • Is part of: Journal of statistical mechanics, 2013-03, Vol.2013 (3), p.P03014-66
  • Description: Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks. (A minimal sketch of the message-passing idea mentioned here follows this record.)
  • Publisher: IOP Publishing and SISSA
  • Language: English
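
The abstract surveys two families of techniques. The replica method rests on the identity <ln Z> = lim_{n->0} (<Z^n> - 1)/n, which trades the average of a logarithm for moments of the partition function. Its algorithmic counterpart, message passing in graphical models, can be illustrated concretely. The sketch below is a minimal sum-product (belief propagation) implementation on a toy pairwise Ising tree; the graph, fields and couplings are invented for illustration and are not taken from the paper itself.

```python
# Minimal sum-product message passing (belief propagation) on a small
# pairwise Ising-type model defined on a tree.  Illustrative only: the
# graph, fields and couplings below are toy values, not from the paper.
import itertools
import numpy as np

# Tree over 4 binary spins x_i in {-1, +1}: edges (0,1), (1,2), (1,3).
edges = [(0, 1), (1, 2), (1, 3)]
n = 4
h = np.array([0.3, -0.2, 0.1, 0.4])          # local fields (toy values)
J = {e: 0.8 for e in edges}                   # couplings (toy values)
states = np.array([-1.0, 1.0])

def psi_node(i):
    """Single-spin factor exp(h_i * x_i) evaluated on both states."""
    return np.exp(h[i] * states)

def psi_edge(e):
    """Pairwise factor exp(J_ij * x_i * x_j) as a 2x2 table."""
    return np.exp(J[e] * np.outer(states, states))

neighbors = {i: [] for i in range(n)}
for (i, j) in edges:
    neighbors[i].append(j)
    neighbors[j].append(i)

# m[(i, j)] is the message from spin i to neighbouring spin j,
# a length-2 vector over the states of x_j.  Start from uniform messages.
m = {(i, j): np.ones(2) for (i, j) in itertools.permutations(range(n), 2)
     if j in neighbors[i]}

for _ in range(10):                           # > tree diameter, so exact here
    new = {}
    for (i, j) in m:
        e = (i, j) if (i, j) in J else (j, i)
        incoming = psi_node(i).copy()
        for k in neighbors[i]:
            if k != j:
                incoming *= m[(k, i)]
        table = psi_edge(e) if e == (i, j) else psi_edge(e).T
        msg = incoming @ table                # sum over x_i
        new[(i, j)] = msg / msg.sum()         # normalize for stability
    m = new

# Beliefs: local factor times all incoming messages, then normalize.
bp_marginals = []
for i in range(n):
    b = psi_node(i).copy()
    for k in neighbors[i]:
        b *= m[(k, i)]
    bp_marginals.append(b / b.sum())

# Brute-force check over all 2^n configurations.
exact = np.zeros((n, 2))
for config in itertools.product(range(2), repeat=n):
    x = states[list(config)]
    w = np.exp(h @ x + sum(J[e] * x[e[0]] * x[e[1]] for e in edges))
    for i, s in enumerate(config):
        exact[i, s] += w
exact /= exact.sum(axis=1, keepdims=True)

print("BP marginals   :", np.round(bp_marginals, 4))
print("Exact marginals:", np.round(exact, 4))
```

On a tree the messages converge to the exact single-spin marginals, which the brute-force check confirms; on graphs with loops the same update rule becomes an approximation closely related to the cavity method discussed in the review.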
