
Comment by NotAnOtter

5 days ago

Your overall point might be correct but your example does not prove your point.

A Markov chain is just the path taken through the course of a Markov process. The terms 'chain' and 'process' are sometimes conflated in this context, but this is the most common distinction I've seen. As such, you can run a Markov process for some number of steps, repeat that N times, and then ask how many of the generated chains contain the property you are interested in. The process is memoryless, but the chain is the result of the process and therefore contains memory.
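
To make the distinction concrete, here is a minimal sketch (the two-state transition probabilities and the "ever visits state 1" property are invented for illustration). The stepping rule is memoryless, but each recorded chain is a whole path, so you can ask history-dependent questions about it:

    import random

    # Hypothetical transition probabilities for a two-state process.
    P = {0: [(0, 0.9), (1, 0.1)],   # from state 0: stay w.p. 0.9
         1: [(0, 0.5), (1, 0.5)]}   # from state 1: leave w.p. 0.5

    def step(state):
        # Memoryless: the next state depends only on the current state.
        r, acc = random.random(), 0.0
        for nxt, p in P[state]:
            acc += p
            if r < acc:
                return nxt
        return nxt  # guard against float rounding

    def run_chain(start=0, steps=20):
        # The chain is the recorded path; it retains the full history.
        chain = [start]
        for _ in range(steps):
            chain.append(step(chain[-1]))
        return chain

    N = 10_000
    chains = [run_chain() for _ in range(N)]
    # A question about whole chains, not about single states:
    hits = sum(1 in c for c in chains)
    print(f"{hits}/{N} chains visited state 1 at least once")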

I agree 'random walk' is a superset of 'Markov process', but IMO when someone says 'random walk' they normally make assumptions that qualify it as a Markov chain. Therefore it's useful, as a teaching device, to just call it a random walk.

Interesting. I'd not heard the term Markov chain used to describe a path, so I checked my copy of "All of Statistics" by Larry Wasserman, and he (slightly) disagrees with both of us and says

    "A Markov chain is a stochastic process for which the distribution of X_t depends only on X_{t-1}."

So he doesn't require it to be a discrete process, but he also doesn't treat it as a path. I guess the terminology is not 100% standardized. But anyhow, thanks for making me think about this. Always interesting.
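
Spelled out in the notation of that quote, the condition is the usual (first-order) Markov property: everything in the history before the previous state drops out of the conditional distribution,

    P(X_t ∈ A | X_0, X_1, ..., X_{t-1}) = P(X_t ∈ A | X_{t-1})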

  • But a random walk is precisely a stochastic process for which the _next state_ depends only on the _current state_. In terms of graphs (where _random walk_ comes from), the next node is decided by randomly selecting a neighbor of the current node.

    • I don't think that's true for random walks in general. A random walk is a process derived from taking random steps in some mathematical space. It can include jumps, and it can include memory.

      Take a self-avoiding ("path-avoiding") random walk. At time t, the distribution of the next step depends on whether I have already visited any of the neighbors of the current node earlier in the path. That's not the current state, that's memory (see the sketch after this sub-thread).

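      To illustrate the contrast, here is a sketch on a small made-up graph (the adjacency list and node labels are arbitrary). The plain walk picks a uniformly random neighbor and is Markov in the current node; the self-avoiding walk consults the set of nodes already visited, so its step distribution depends on history:

          import random

          # Hypothetical toy graph as an adjacency list (undirected).
          G = {0: [1, 3], 1: [0, 2, 3], 2: [1, 3], 3: [0, 1, 2]}

          def simple_walk(graph, start, steps):
              # Memoryless: the next node depends only on the current node.
              path = [start]
              for _ in range(steps):
                  path.append(random.choice(graph[path[-1]]))
              return path

          def self_avoiding_walk(graph, start, max_steps):
              # Memoryful: the step distribution at time t depends on the
              # set of nodes already visited, not just the current node.
              path, visited = [start], {start}
              for _ in range(max_steps):
                  options = [n for n in graph[path[-1]] if n not in visited]
                  if not options:      # stuck: every neighbor already used
                      break
                  path.append(random.choice(options))
                  visited.add(path[-1])
              return path

          print(simple_walk(G, 0, 8))
          print(self_avoiding_walk(G, 0, 8))

      (One way to reconcile the two views: the self-avoiding walk becomes Markov again if you enlarge the state to the pair (current node, set of visited nodes), at the cost of a much larger state space.)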

  • A Markov chain is commonly understood to be a time-discrete Markov process. Intuitively, it's a "chain" because its states can be singled out at discrete points in time rather than over intervals. That's also Markov's original definition. Instead of Wasserman, one can look up the notion on Wikipedia. A path is a notion relevant to the state space of a process: it's a realization of the states at every single point in time over the time span of interest.