Markov Chain Process (Theory and Cases)

Markov Chain Process

Author(s): Carlos Polanco

Pp: 28-36 (9)

DOI: 10.2174/9789815080476123010009


Abstract

In this chapter, building on the historical introduction given in the previous chapters, we introduce and exemplify all the components of a Markov Chain Process: the initial state vector, the Markov property (also known as the memoryless property), the matrix of transition probabilities, and the steady-state vector. A Markov Chain Process is formally defined and, by way of categorization, divided into two types, the Discrete-Time Markov Chain Process and the Continuous-Time Markov Chain Process, according to whether the time between states in a random walk is discrete or continuous. Each of the components is exemplified, and all the examples are solved analytically.
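The components named in the abstract can be sketched with a minimal Discrete-Time Markov Chain. This is an illustrative example, not taken from the chapter: the two states and the transition probabilities below are invented for demonstration, and the steady-state vector is approximated by repeatedly applying the transition matrix.

```python
def step(v, P):
    """Advance the state distribution one step: v' = vP.

    v : current state vector (a probability distribution over states)
    P : matrix of transition probabilities, P[i][j] = Pr(next=j | current=i)
    """
    n = len(P)
    return [sum(v[i] * P[i][j] for i in range(n)) for j in range(n)]


# Hypothetical two-state transition matrix (rows sum to 1).
P = [[0.9, 0.1],
     [0.5, 0.5]]

# Initial state vector: start in state 0 with certainty.
v = [1.0, 0.0]

# By the Markov property, the next distribution depends only on the
# current one; iterating vP converges here to the steady-state vector
# v* satisfying v* = v*P (analytically [5/6, 1/6] for this matrix).
for _ in range(100):
    v = step(v, P)

print(v)
```

For this matrix the steady state can also be found analytically from v* = v\*P together with the normalization v\*[0] + v\*[1] = 1, which gives [5/6, 1/6], so the iteration above is only a numerical check of the closed-form answer.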

© 2025 Bentham Science Publishers