A Markov chain is a sequence of random variables that satisfies P(X_{t+1} | X_t, X_{t-1}, ..., X_1) = P(X_{t+1} | X_t). Simply put, it is a sequence in which X_{t+1} depends only on X_t and is independent of all earlier states X_{t-1}, ..., X_1.
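The Markov property above can be sketched in code: the next state is sampled using only the current state, never the earlier history. This is a minimal illustrative example (the two-state weather chain and its probabilities are invented for demonstration, not taken from the text).

```python
import random

# Hypothetical two-state chain: P[i][j] = P(X_{t+1} = j | X_t = i).
# Each row sums to 1, as required of a transition matrix.
P = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng=random):
    """Sample X_{t+1} given only X_t -- the Markov property:
    the draw ignores all states before the current one."""
    r = rng.random()
    cumulative = 0.0
    for nxt, prob in P[state].items():
        cumulative += prob
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding at the boundary

def simulate(start, steps):
    """Generate X_0, X_1, ..., X_steps, each step using only its predecessor."""
    chain = [start]
    for _ in range(steps):
        chain.append(step(chain[-1]))
    return chain

print(simulate("sunny", 5))
```

Running `simulate` prints a random 6-state trajectory; note that `step` receives only the latest state, so the full history in `chain` plays no role in the transition.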
Studies probability axioms, counting formulas, conditional probability, independence, random variables, continuous and discrete distributions, expectation, moment generating functions, the law of large numbers, ...