Abstract:
Given an arbitrarily long but finite sequence of observations from a finite set, we construct a simple process that approximates the sequence, in the sense that with high probability the empirical frequency, as well as the empirical one-step transitions along a realization from the approximating process, are close to those of the given sequence. We generalize the result to the case where the one-step transitions are required to lie in given polyhedra.
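The empirical quantities being matched here are easy to compute directly; a minimal Python sketch (the function name and example sequence are ours, for illustration only):

```python
from collections import Counter

def empirical_stats(seq):
    """Empirical state frequencies and one-step transition frequencies
    of a finite sequence over a finite alphabet."""
    n = len(seq)
    freq = {s: c / n for s, c in Counter(seq).items()}
    pairs = Counter(zip(seq, seq[1:]))  # consecutive pairs
    m = n - 1
    trans = {p: c / m for p, c in pairs.items()}
    return freq, trans

freq, trans = empirical_stats("aabab")
# 'a' occupies 3 of 5 positions; ('a', 'b') is 2 of the 4 transitions
```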
Abstract:
Satellite on-board switching offers the possibility of covering a wide area with increased total capacity. In order to demonstrate a mechanism for reducing a high-dimensional, complex Markov chain, in this paper we use a terrestrial packet-grouping approach to minimize the on-board switching operations. By carefully redirecting some transitions to different states, we can evaluate exact statistics of the original complex Markov chain from the analysis of a much simpler reduced Markov chain. Although we use a communication example close to our research interest, the method has a much wider range of application.
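State-space reduction of this kind is easiest to see in the exactly lumpable case, where a block of states can be collapsed into one without changing the chain's block-level statistics. A small numpy sketch (the example matrix and the `lump` helper are our illustration, not the paper's construction):

```python
import numpy as np

def lump(P, blocks):
    """Aggregate a transition matrix over a partition of the states,
    assuming exact lumpability: every state in a block must have the
    same total transition probability into each block."""
    Q = np.zeros((len(blocks), len(blocks)))
    for a, A in enumerate(blocks):
        rep = A[0]  # any state of the block works, by lumpability
        for b, B in enumerate(blocks):
            Q[a, b] = P[rep, B].sum()
    return Q

# 3-state chain, lumpable with respect to the partition {0}, {1, 2}:
# rows 1 and 2 both send 0.2 to state 0 and 0.8 into {1, 2}.
P = np.array([[0.5, 0.25, 0.25],
              [0.2, 0.50, 0.30],
              [0.2, 0.30, 0.50]])
Q = lump(P, [[0], [1, 2]])
```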
Abstract:
We develop a practical approach to establish the stability, that is, the recurrence in a given set, of a large class of controlled Markov chains. These processes arise in various areas of applied science and encompass important numerical methods. We show in particular how individual Lyapunov functions and associated drift conditions for the parametrized family of Markov transition probabilities and the parameter update can be combined to form Lyapunov functions for the joint process, leading to the proof of the desired stability property. Of particular interest is the fact that the approach applies even in situations where the two components of the process present a time-scale separation, which is a crucial feature of practical situations. We then move on to show how such a recurrence property can be used in the context of stochastic approximation in order to prove the convergence of the parameter sequence, including in the situation where the so-called stepsize is adaptively tuned. We finally show that the results apply to various algorithms of interest in computational statistics and cognate areas.
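As a toy illustration of the stochastic-approximation setting with a decreasing stepsize (this i.i.d. example is ours; the paper's setting has Markovian noise, parameter-dependent transition kernels, and adaptively tuned stepsizes):

```python
import random

def stochastic_approximation(sample, theta0=0.0, n_iter=5000):
    """Robbins-Monro sketch: drive theta toward the root of
    E[sample() - theta] = 0, i.e. the mean of the sampling
    distribution. The stepsizes gamma_n = 1/n satisfy the classical
    conditions sum gamma_n = inf, sum gamma_n^2 < inf."""
    theta = theta0
    for n in range(1, n_iter + 1):
        gamma = 1.0 / n
        theta += gamma * (sample() - theta)
    return theta

random.seed(0)
est = stochastic_approximation(lambda: random.gauss(2.0, 1.0))
```

With the 1/n stepsize this recursion is exactly the running sample mean, so `est` converges to 2 at the usual Monte Carlo rate.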
Abstract:
A proof of a general theorem for the calculation of conditional mean duration of a finite absorbing discrete time Markov chain is presented. In the simplest case, this result is equivalent to one suggested in the book of Kemeny and Snell (1976). In addition, we prove that the mean duration and mean conditional duration of a finite absorbing continuous time Markov chain can be calculated via the fundamental matrix of the embedded discrete time chain. These results are also extended to certain non-absorbing Markov chains. Applications are presented to illustrate the utility of these results. (C) 2019 Elsevier B.V. All rights reserved.
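The fundamental-matrix computation from Kemeny and Snell is short in code; a numpy sketch with a made-up 3-state absorbing chain (states 0 and 1 transient, state 2 absorbing):

```python
import numpy as np

# Transition matrix partitioned as [[Q, R], [0, I]],
# with Q the transient-to-transient block.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.4, 0.4],
              [0.0, 0.0, 1.0]])
Q = P[:2, :2]

# Fundamental matrix N = (I - Q)^{-1}; N[i, j] is the expected
# number of visits to transient state j starting from i.
N = np.linalg.inv(np.eye(2) - Q)

# Mean time to absorption from each transient state: t = N @ 1.
t = N @ np.ones(2)
```

The vector `t` satisfies the first-step identity t = 1 + Q t, which is a quick sanity check on the computation.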
Abstract:
Markov transition models are frequently used to model disease progression. The authors show how the solution to Kolmogorov's forward equations can be exploited to map between transition rates and probabilities from probability data in multistate models. They provide a uniform, Bayesian treatment of estimation and propagation of uncertainty of transition rates and probabilities when 1) observations are available on all transitions and exact time at risk in each state (fully observed data) and 2) observations are on initial state and final state after a fixed interval of time but not on the sequence of transitions (partially observed data). The authors show how underlying transition rates can be recovered from partially observed data using Markov chain Monte Carlo methods in WinBUGS, and they suggest diagnostics to investigate inconsistencies between evidence from different starting states. An illustrative example for a 3-state model is given, which shows how the methods extend to more complex Markov models using the software WBDiff to compute solutions. Finally, the authors illustrate how to statistically combine data from multiple sources, including partially observed data at several follow-up times and also how to calibrate a Markov model to be consistent with data from one specific study.
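For a time-homogeneous chain, the map from transition rates to transition probabilities solving Kolmogorov's forward equations dP/dt = P(t)Q is P(t) = exp(Qt). A self-contained numpy sketch using a truncated Taylor series (the generator values are invented; in practice `scipy.linalg.expm` is the robust tool):

```python
import numpy as np

def transition_probabilities(Q, t, terms=60):
    """P(t) = expm(Q t), the solution of Kolmogorov's forward
    equations dP/dt = P(t) Q, computed via a truncated Taylor
    series (adequate for small, well-scaled generators)."""
    A = Q * t
    P = np.eye(Q.shape[0])
    term = np.eye(Q.shape[0])
    for k in range(1, terms):
        term = term @ A / k
        P = P + term
    return P

# 3-state progression-style generator (rows sum to zero).
Q = np.array([[-0.3,  0.2,  0.1],
              [ 0.0, -0.4,  0.4],
              [ 0.0,  0.0,  0.0]])
P1 = transition_probabilities(Q, 1.0)
```

Rows of `P1` sum to one, and the semigroup property P(2) = P(1)P(1) provides an easy consistency check.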
Abstract:
Markov regression models describe the way in which a categorical response variable changes over time for subjects with different explanatory variables. Frequently it is difficult to measure the response variable on equally spaced discrete time intervals. Here we propose a Pearson-type goodness-of-fit test for stationary Markov regression models fitted to panel data. A parametric bootstrap algorithm is used to study the distribution of the test statistic. The proposed technique is applied to examine the fit of a Markov regression model used to identify markers for disease progression in psoriatic arthritis.
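A parametric bootstrap of this flavor can be sketched for a plain two-state chain (our simplified setup: no covariates, and the null hypothesis is a single homogeneous transition matrix shared by all groups of subjects):

```python
import random

def fit_P(chains):
    """Maximum-likelihood 2-state transition matrix from the pooled
    transition counts of a list of observed chains."""
    counts = [[0, 0], [0, 0]]
    for c in chains:
        for a, b in zip(c, c[1:]):
            counts[a][b] += 1
    return [[counts[i][j] / max(sum(counts[i]), 1) for j in range(2)]
            for i in range(2)]

def pearson_by_group(groups, P):
    """Pearson X^2 comparing each group's observed transition counts
    with the counts expected under the pooled fit P."""
    stat = 0.0
    for chains in groups:
        counts = [[0, 0], [0, 0]]
        for c in chains:
            for a, b in zip(c, c[1:]):
                counts[a][b] += 1
        for i in range(2):
            n_i = sum(counts[i])
            for j in range(2):
                e = n_i * P[i][j]
                if e > 0:
                    stat += (counts[i][j] - e) ** 2 / e
    return stat

def simulate(P, start, length, rng):
    """Simulate a chain of the given length from transition matrix P."""
    chain = [start]
    for _ in range(length - 1):
        chain.append(0 if rng.random() < P[chain[-1]][0] else 1)
    return chain

def bootstrap_pvalue(groups, n_boot=200, seed=0):
    """Parametric bootstrap: simulate groups from the pooled fit,
    refit, and recompute the statistic to approximate its null law."""
    rng = random.Random(seed)
    P = fit_P([c for g in groups for c in g])
    observed = pearson_by_group(groups, P)
    exceed = 0
    for _ in range(n_boot):
        boot = [[simulate(P, c[0], len(c), rng) for c in g] for g in groups]
        if pearson_by_group(boot, fit_P([c for g in boot for c in g])) >= observed:
            exceed += 1
    return observed, exceed / n_boot

rng = random.Random(1)
P_true = [[0.7, 0.3], [0.4, 0.6]]
groups = [[simulate(P_true, 0, 40, rng) for _ in range(5)] for _ in range(2)]
stat, p = bootstrap_pvalue(groups, n_boot=50)
```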
Abstract:
We propose a model of random walks on weighted graphs where the weights are interval valued, and connect it to reversible imprecise Markov chains. While the theory of imprecise Markov chains is now well established, this is a first attempt to model reversible chains. In contrast with the existing theory, the probability models that have to be considered are now non-convex. This presents a computational difficulty, since convexity is critical for the existence of the efficient optimization algorithms used in the existing models. The second part of the paper therefore addresses the computational issues of the model. The goal is to find sets of weights which maximize or minimize expectations corresponding to multiple-step transition probabilities. In particular, we present a local optimization algorithm and numerically test its efficiency. We show that its application allows finding close approximations of the globally best solutions in reasonable time. (C) 2016 Elsevier Inc. All rights reserved.
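To make the optimization problem concrete: each symmetric weight w_ij lies in an interval, and the objective is an m-step transition probability of the induced random walk. The toy sketch below scans only the interval endpoints, which is exponential in the number of edges and need not contain the true optimum in general; it merely illustrates the search space that the paper's local algorithm explores (graph, intervals, and function names are our own):

```python
import itertools
import numpy as np

def walk_matrix(W):
    """Transition matrix of the random walk on a weighted graph:
    P[i, j] = w_ij / sum_k w_ik."""
    return W / W.sum(axis=1, keepdims=True)

def mstep_range(lo, hi, s, t, m):
    """Min and max of the m-step probability s -> t over all endpoint
    choices of the symmetric weight intervals [lo, hi]. Illustrative
    brute force only; it is not a substitute for optimizing over the
    full boxes."""
    n = lo.shape[0]
    edges = [(i, j) for i in range(n) for j in range(i, n) if hi[i, j] > 0]
    vals = []
    for choice in itertools.product((0, 1), repeat=len(edges)):
        W = lo.copy()
        for bit, (i, j) in zip(choice, edges):
            if bit:
                W[i, j] = W[j, i] = hi[i, j]
        vals.append(np.linalg.matrix_power(walk_matrix(W), m)[s, t])
    return min(vals), max(vals)

# Triangle graph; the edge {0, 1} has weight in [1, 2], the rest are fixed.
lo = np.array([[0., 1., 1.], [1., 0., 1.], [1., 1., 0.]])
hi = np.array([[0., 2., 1.], [2., 0., 1.], [1., 1., 0.]])
p_min, p_max = mstep_range(lo, hi, 0, 2, 2)
```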
Abstract:
In this article we introduce a new missing-data model, based on a standard parametric Hidden Markov Model (HMM), in which information on the latent Markov chain is available from the time it reaches a fixed state (and until it leaves that state). We study, under mild conditions, the consistency and asymptotic normality of the maximum likelihood estimator. We also point out that the underlying Markov chain does not need to be ergodic, and that identifiability of the model cannot be established in a simple way (unlike for standard HMMs), but can be studied using various technical arguments.
Abstract:
We derive a sufficient condition for a kth-order homogeneous Markov chain Z with finite alphabet Z to have a unique invariant distribution on Z^k. Specifically, let X be a first-order, stationary Markov chain with finite alphabet X and a single recurrent class, let g: X → Z be non-injective, and define the (possibly non-Markovian) process Y := g(X) (where g is applied coordinate-wise). If Z is the kth-order Markov approximation of Y, its invariant distribution is unique. We generalize this to non-Markovian processes.
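The kth-order Markov approximation of Y is just its empirical conditional law given the previous k symbols. A small Python sketch (our own illustration, using a periodic example sequence in place of a genuine hidden chain):

```python
from collections import Counter, defaultdict

def kth_order_approximation(y, k):
    """Empirical k-th order Markov approximation of a sequence y:
    for each context of k symbols, the conditional distribution of
    the next symbol."""
    ctx = defaultdict(Counter)
    for i in range(k, len(y)):
        ctx[tuple(y[i - k:i])][y[i]] += 1
    return {c: {s: n / sum(cnt.values()) for s, n in cnt.items()}
            for c, cnt in ctx.items()}

# A periodic observable process: every length-2 context determines
# the next symbol, so the 2nd-order approximation is deterministic.
model = kth_order_approximation(list("aabb" * 10), 2)
```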
Abstract:
We extend the traditional operator-theoretic approach to the study of dynamical systems in order to handle the problem of non-geometric convergence. We show that the probabilistic treatment, developed and popularized under Richard Tweedie's impetus, can be placed into an operator framework in the spirit of the Yosida-Kakutani approach. General theorems as well as specific results for Markov chains are given. Examples of applications to general classes of Markov chains and dynamical systems are presented.