CS70 Guide

# Markov Chains

Markov chains are a type of stochastic process (a collection of random variables that evolves over time) that satisfies the Markov property: the next state $X_{n+1}$ depends only on the current state $X_n$, and not on any of the past states.
Markov chains are often used to model transitions between discrete states.
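As a concrete sketch, the Markov property means that sampling the next state requires only the current state and a transition matrix $P$, where $P[i][j]$ is the probability of moving from state $i$ to state $j$. The two-state "weather" chain below is a hypothetical example, not taken from the guide:

```python
import random

# Hypothetical two-state chain; the states and probabilities are
# illustrative only. P[i][j] = probability of moving from state i to j.
states = ["sunny", "rainy"]
P = [
    [0.8, 0.2],  # from sunny: stay sunny 0.8, turn rainy 0.2
    [0.4, 0.6],  # from rainy: turn sunny 0.4, stay rainy 0.6
]

def step(i):
    """Sample the next state given ONLY the current state i (Markov property)."""
    return 0 if random.random() < P[i][0] else 1

def simulate(start, n):
    """Run the chain for n steps, returning the sequence of state indices."""
    path = [start]
    for _ in range(n):
        path.append(step(path[-1]))
    return path

random.seed(0)
walk = simulate(0, 10)
print([states[i] for i in walk])
```

Note that `step` never looks at the history of the walk, only at its last entry; that is exactly the Markov property in code. Over a long run, the fraction of time spent in each state approaches the chain's stationary distribution.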