
Nonequilibrium Markov Processes Conditioned on Large Deviations

Chetrite, Raphaël; Touchette, Hugo

Annales Henri Poincaré, 2015-09, Vol. 16 (9), p. 2005-2057 [Peer-reviewed journal]

Basel: Springer Basel

  • Subjects: Classical and Quantum Gravitation; Dynamical Systems and Ergodic Theory; Elementary Particles; Mathematical and Computational Physics; Mathematical Methods in Physics; Physics; Physics and Astronomy; Quantum Field Theory; Quantum Physics; Relativity Theory; Theoretical
  • Description: We consider the problem of conditioning a Markov process on a rare event and of representing this conditioned process by a conditioning-free process, called the effective or driven process. The basic assumption is that the rare event used in the conditioning is a large deviation-type event, characterized by a convex rate function. Under this assumption, we construct the driven process via a generalization of Doob’s h-transform, used in the context of bridge processes, and show that this process is equivalent to the conditioned process in the long-time limit. The notion of equivalence that we consider is based on the logarithmic equivalence of path measures and implies that the two processes have the same typical states. In constructing the driven process, we also prove equivalence with the so-called exponential tilting of the Markov process, often used with importance sampling to simulate rare events and giving rise, from the point of view of statistical mechanics, to a nonequilibrium version of the canonical ensemble. Other links between our results and the topics of bridge processes, quasi-stationary distributions, stochastic control, and conditional limit theorems are mentioned. (A minimal numerical illustration of this construction, in the simplest discrete-time setting, is sketched after this record.)
  • Language: English
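
The sketch below illustrates, under simplifying assumptions, the construction summarized in the abstract for the easiest case: a discrete-time, finite-state Markov chain and an additive observable of the state. The transition matrix is exponentially tilted, the Perron–Frobenius eigenvalue gives the scaled cumulant generating function (SCGF), and a generalized Doob h-transform of the tilted matrix yields the driven (effective) process. The two-state chain, the observable f, and the parameter k are illustrative choices, not taken from the paper, which treats continuous-time generators and general additive functionals.

```python
import numpy as np

# Toy two-state Markov chain (rows sum to 1); the chain, the observable f,
# and the tilting parameter k are illustrative, not from the paper.
P = np.array([[0.9, 0.1],
              [0.3, 0.7]])
f = np.array([0.0, 1.0])   # additive observable: occupation of state 1
k = 1.5                    # conditioning (tilting) parameter

# Exponential tilting: P_k(x, y) = P(x, y) * exp(k * f(y))
P_tilt = P * np.exp(k * f)[None, :]

# Perron-Frobenius eigenvalue (exp of the SCGF) and positive right eigenvector r_k
eigvals, eigvecs = np.linalg.eig(P_tilt)
i = np.argmax(eigvals.real)
lam = eigvals[i].real
r = np.abs(eigvecs[:, i].real)

# Generalized Doob h-transform: driven (effective) transition matrix
# P_driven(x, y) = r_k(y) * P_k(x, y) / (lam * r_k(x))
P_driven = P_tilt * r[None, :] / (lam * r[:, None])

print("SCGF Lambda(k) =", np.log(lam))
print("Driven transition matrix:\n", P_driven)
print("Row sums (should all be 1):", P_driven.sum(axis=1))
```

Because r is the right Perron eigenvector of the tilted matrix, the Doob transform restores a proper stochastic matrix (row sums equal to 1); the resulting chain is the conditioning-free process that reproduces the statistics of the original chain conditioned on the large deviation event, in the long-time sense discussed in the abstract.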
