252. Markov Chain

A stochastic process that undergoes transitions from one state to another on a state space, where the probability of moving to the next state depends only on the current state, not on the sequence of states that preceded it (the Markov property). Formally, P(X_{n+1} = x | X_0, …, X_n) = P(X_{n+1} = x | X_n).
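
A minimal sketch of simulating such a chain, assuming a hypothetical two-state "sunny"/"rainy" weather model (the states and transition probabilities below are illustrative, not part of the definition):

```python
import random

# Hypothetical transition probabilities: each row gives the distribution
# over next states, conditioned only on the current state.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    # Sample the next state using only the current state's row --
    # the Markov property: no earlier history is consulted.
    next_states = list(transitions[state])
    weights = [transitions[state][s] for s in next_states]
    return random.choices(next_states, weights=weights)[0]

state = "sunny"
for _ in range(10):
    state = step(state)
    print(state)
```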
