lectures/markov_chains_II.md (+26 −16 lines)
@@ -4,7 +4,7 @@ jupytext:
     extension: .md
     format_name: myst
     format_version: 0.13
-    jupytext_version: 1.14.5
+    jupytext_version: 1.14.4
 kernelspec:
   display_name: Python 3 (ipykernel)
   language: python
@@ -37,19 +37,24 @@ to be installed on your computer. Installation instructions for graphviz can be
 [here](https://www.graphviz.org/download/)
 ```
+
 ## Overview
 
-This lecture continues our journey in Markov chains.
+This lecture continues on from our {doc}`earlier lecture on Markov chains
+<markov_chains_I>`.
 
-Specifically, we will introduce irreducibility and ergodicity, and how they connect to stationarity.
+Specifically, we will introduce the concepts of irreducibility and ergodicity, and see how they connect to stationarity.
 
-Irreducibility is a concept that describes the ability of a Markov chain to move between any two states in the system.
+Irreducibility describes the ability of a Markov chain to move between any two states in the system.
 
 Ergodicity is a sample path property that describes the behavior of the system over long periods of time.
 
-The concepts of irreducibility and ergodicity are closely related to the idea of stationarity.
+As we will see,
 
-An irreducible Markov chain guarantees the existence of a unique stationary distribution, while an ergodic Markov chain ensures that the system eventually reaches its stationary distribution, regardless of its initial state.
+* an irreducible Markov chain guarantees the existence of a unique stationary distribution, while
+* an ergodic Markov chain generates time series that satisfy a version of the
+  law of large numbers.
 
 Together, these concepts provide a foundation for understanding the long-term behavior of Markov chains.
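The two bullet points added in this hunk can be illustrated with a short NumPy sketch. This is not part of the lecture diff; the 3×3 stochastic matrix below is a hypothetical example chosen only because it is irreducible.

```python
import numpy as np

# Hypothetical irreducible stochastic matrix (not from the lecture)
P = np.array([[0.9, 0.1, 0.0],
              [0.4, 0.4, 0.2],
              [0.1, 0.1, 0.8]])

# Irreducibility bullet: the unique stationary distribution is the left
# eigenvector of P for eigenvalue 1, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(P.T)
psi_star = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
psi_star = psi_star / psi_star.sum()

# Ergodicity bullet: the fraction of time a long sample path spends in
# each state obeys a law of large numbers with limit psi_star.
rng = np.random.default_rng(0)
T = 100_000
x = 0
counts = np.zeros(3)
for _ in range(T):
    counts[x] += 1
    x = rng.choice(3, p=P[x])

print(psi_star, counts / T)  # the two vectors should be close
```

The eigenvector computation and the long simulation are two independent routes to the same distribution, which is exactly the connection the revised overview draws between irreducibility, ergodicity, and stationarity.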
@@ -263,17 +268,19 @@ In view of our latest (ergodicity) result, it is also the fraction of time that
 Thus, in the long run, cross-sectional averages for a population and time-series averages for a given person coincide.
 
 This is one aspect of the concept of ergodicity.
 
 (ergo)=
 ### Example 2
 
-Another example is Hamilton {cite}`Hamilton2005`dynamics {ref}`discussed before <mc_eg2>`.
+Another example is the Hamilton dynamics we {ref}`discussed before <mc_eg2>`.
 
-The diagram of the Markov chain shows that it is **irreducible**.
+The {ref}`graph <mc_eg2>` of the Markov chain shows it is irreducible.
 
-Therefore, we can see the sample path averages for each state (the fraction of time spent in each state) converges to the stationary distribution regardless of the starting state
+Therefore, the sample path averages for each state (the fraction of
+time spent in each state) converge to the stationary distribution regardless of
+the starting state.
 
 Let's denote the fraction of time spent in state $x$ at time $t$ in our sample path as $\hat p_t(x)$ and compare it with the stationary distribution $\psi^* (x)$
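The sample-path frequencies $\hat p_t(x)$ described above can be sketched as follows. This is not part of the diff; the matrix entries are recalled from the earlier lecture (`markov_chains_I`) and should be checked against it before being relied on.

```python
import numpy as np

# Hamilton's estimated matrix as recalled from the earlier lecture;
# treat the exact entries as an assumption to verify.
P = np.array([[0.971, 0.029, 0.000],
              [0.145, 0.778, 0.077],
              [0.000, 0.508, 0.492]])

rng = np.random.default_rng(1)
T = 100_000
x = 2                     # start in the worst state to stress the claim
visits = np.zeros(3)
p_hat = np.zeros((T, 3))  # p_hat[t] = occupancy frequencies up to time t

for t in range(T):
    visits[x] += 1
    p_hat[t] = visits / (t + 1)
    x = rng.choice(3, p=P[x])

# Stationary distribution for comparison (left unit eigenvector)
eigvals, eigvecs = np.linalg.eig(P.T)
psi_star = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
psi_star = psi_star / psi_star.sum()

print(p_hat[-1], psi_star)  # final frequencies vs. stationary distribution
```

Rerunning with a different starting state `x` changes the early rows of `p_hat` but not the limit, which is the irreducibility point the hunk makes.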
@@ -304,7 +311,7 @@ for i in range(n):
 plt.show()
 ```
 
-Note that the convergence to the stationary distribution regardless of the starting point $x_0$.
+Note the convergence to the stationary distribution regardless of the starting point $x_0$.
 
 ### Example 3
@@ -324,9 +331,10 @@ P :=
 $$
 
-The graph for the chain shows states are densely connected indicating that it is **irreducible**.
+The {ref}`graph <mc_eg3>` for the chain shows all states are reachable,
+indicating that this chain is irreducible.
 
-Then we visualize the difference between $\hat p_t(x)$ and the stationary distribution $\psi^* (x)$
+Here we visualize the difference between $\hat p_t(x)$ and the stationary distribution $\psi^* (x)$ for each state $x$
 
 ```{code-cell} ipython3
 P = [[0.86, 0.11, 0.03, 0.00, 0.00, 0.00],
@@ -357,7 +365,8 @@ ax.legend()
 plt.show()
 ```
 
-Similar to previous examples, the sample path averages for each state converge to the stationary distribution as the trend converge towards 0.
+Similar to previous examples, the sample path averages for each state converge
+to the stationary distribution, as the plotted differences trend towards 0.
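The convergence claim in this final hunk can be checked numerically without a plot. The helper below is a sketch, not the lecture's code; since the lecture's 6×6 matrix is truncated in this diff, a small stand-in matrix is used for the usage example.

```python
import numpy as np

def occupancy_gap(P, T=100_000, x0=0, seed=0):
    """Return |p_hat_T - psi_star| per state: the gap between the empirical
    occupancy frequencies of a length-T sample path started at x0 and the
    stationary distribution of the stochastic matrix P."""
    P = np.asarray(P)
    n = len(P)
    # Stationary distribution: left unit eigenvector, normalized
    eigvals, eigvecs = np.linalg.eig(P.T)
    psi_star = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1))])
    psi_star = psi_star / psi_star.sum()
    # Simulate the chain and count visits to each state
    rng = np.random.default_rng(seed)
    visits = np.zeros(n)
    x = x0
    for _ in range(T):
        visits[x] += 1
        x = rng.choice(n, p=P[x])
    return np.abs(visits / T - psi_star)

# Stand-in irreducible matrix (the lecture's 6x6 example is not reproduced here)
P = [[0.5, 0.3, 0.2],
     [0.2, 0.6, 0.2],
     [0.1, 0.4, 0.5]]
print(occupancy_gap(P))  # each entry should be small and shrink as T grows
```

Calling `occupancy_gap` with increasing `T` shows the per-state differences trending towards 0, which is the behavior the figure in the lecture displays.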