Title: Predictions Using <i>n</i>-State Markov Chains
Abstract: The states of a Markov chain reflect the number of types of events present in a sequence of observations. This chapter describes some examples with more than two states. The first two examples include a three-state Markov chain and a four-state Markov chain, after which a gradual generalization is made for an arbitrary number of states (n states). The supporting theory is accompanied by an algorithm implementation for each example, which allows a prediction based on a sequence of observations. The two-state Markov chain has been the main focus for understanding stochastic processes. When a complex analysis is needed and the state space becomes wider, multiplying a probability vector by a transition matrix becomes difficult to follow in a Markov chain. So far, Markov chains with two, three, and four states have been observed. Also, different letters have been used to represent the vector components.
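The abstract's core operation, multiplying a probability vector by a transition matrix to predict the next state distribution, can be sketched as follows. This is an illustrative example with made-up transition probabilities for a three-state chain, not the chapter's own implementation:

```python
import numpy as np

# Hypothetical 3x3 transition matrix: P[i][j] is the probability of
# moving from state i to state j. Each row must sum to 1.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.3, 0.3, 0.4],
])

# Current distribution over the three states (a row probability vector);
# here the chain is assumed to start in state 0 with certainty.
v = np.array([1.0, 0.0, 0.0])

def predict(v, P, k):
    """Distribution over states after k steps: repeated v @ P (i.e. v @ P^k)."""
    for _ in range(k):
        v = v @ P
    return v

one_step = predict(v, P, 1)    # equals the first row of P here
three_steps = predict(v, P, 3)
print(one_step)
print(three_steps)
```

For an n-state chain the same code applies unchanged with an n-by-n matrix and a length-n vector, which is the generalization the chapter builds toward.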
Publication Year: 2017
Publication Date: 2017-06-22
Language: en
Type: other
Indexed In: ['crossref']
Access and Citation
Cited By Count: 1