lectures/markov_chains_II.md
42 additions & 51 deletions
@@ -4,7 +4,7 @@ jupytext:
     extension: .md
     format_name: myst
     format_version: 0.13
-    jupytext_version: 1.14.4
+    jupytext_version: 1.14.5
 kernelspec:
   display_name: Python 3 (ipykernel)
   language: python
@@ -37,24 +37,19 @@ to be installed on your computer. Installation instructions for graphviz can be
 [here](https://www.graphviz.org/download/)
 ```

-
 ## Overview

-This lecture continues on from our {doc}`earlier lecture on Markov chains
-<markov_chains_I>`.
-
+This lecture continues our journey in Markov chains.

-Specifically, we will introduce the concepts of irreducibility and ergodicity, and see how they connect to stationarity.
+Specifically, we will introduce irreducibility and ergodicity, and how they connect to stationarity.

-Irreducibility describes the ability of a Markov chain to move between any two states in the system.
+Irreducibility is a concept that describes the ability of a Markov chain to move between any two states in the system.

 Ergodicity is a sample path property that describes the behavior of the system over long periods of time.

-As we will see,
+The concepts of irreducibility and ergodicity are closely related to the idea of stationarity.

-* an irreducible Markov chain guarantees the existence of a unique stationary distribution, while
-* an ergodic Markov chain generates time series that satisfy a version of the
-  law of large numbers.
+An irreducible Markov chain guarantees the existence of a unique stationary distribution, while an ergodic Markov chain ensures that the system eventually reaches its stationary distribution, regardless of its initial state.

 Together, these concepts provide a foundation for understanding the long-term behavior of Markov chains.

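Reviewer note (illustrative, not part of this diff): the claim that an irreducible chain has a unique stationary distribution can be checked with the same `quantecon` API the lecture already uses. The transition matrix below is a made-up stand-in, not the lecture's `poor`/`middle`/`rich` matrix.

```python
# Sketch: an irreducible chain has exactly one stationary distribution.
# P is an illustrative 3x3 stochastic matrix (rows sum to 1), not from the lecture.
import numpy as np
import quantecon as qe

P = np.array([[0.9, 0.1, 0.0],
              [0.4, 0.4, 0.2],
              [0.1, 0.1, 0.8]])

mc = qe.MarkovChain(P, state_values=('poor', 'middle', 'rich'))

print(mc.is_irreducible)              # True: every state can be reached from every other
print(mc.stationary_distributions)    # exactly one row when the chain is irreducible
```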
@@ -179,8 +174,6 @@ mc = qe.MarkovChain(P, ('poor', 'middle', 'rich'))
 mc.is_irreducible
 ```

-+++ {"user_expressions": []}
-
 It might be clear to you already that irreducibility is going to be important
 in terms of long-run outcomes.

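Reviewer note (illustrative, not part of this diff): for contrast, a reducible chain fails `is_irreducible` and its stationary distribution is no longer unique, which is why long-run outcomes then depend on the starting state. The two-state absorbing chain below is a minimal made-up example.

```python
# Sketch: a reducible chain with two absorbing states that never communicate.
import numpy as np
import quantecon as qe

P = np.array([[1.0, 0.0],
              [0.0, 1.0]])

mc = qe.MarkovChain(P)
print(mc.is_irreducible)             # False
print(mc.stationary_distributions)   # two stationary distributions, one per absorbing state
```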
@@ -270,19 +263,19 @@ In view of our latest (ergodicity) result, it is also the fraction of time that

 Thus, in the long run, cross-sectional averages for a population and time-series averages for a given person coincide.

-This is one aspect of the concept of ergodicity.
+This is one aspect of the concept of ergodicity.


 (ergo)=
 ### Example 2

-Another example is the Hamilton dynamics we {ref}`discussed before <mc_eg2>`.
+Another example is Hamilton {cite}`Hamilton2005` dynamics {ref}`discussed before <mc_eg2>`.

-The {ref}`graph <mc_eg2>`of the Markov chain shows it is irreducible
+The diagram of the Markov chain shows that it is **irreducible**.

-Therefore, we can see the sample path averages for each state (the fraction of
-time spent in each state) converges to the stationary distribution regardless of
-the starting state
+Therefore, we can see the sample path averages for each state (the fraction of time spent in each state) converges to the stationary distribution regardless of the starting state
+
+Let's denote the fraction of time spent in state $x$ at time $t$ in our sample path as $\hat p_t(x)$ and compare it with the stationary distribution $\psi^* (x)$

 ```{code-cell} ipython3
 P = np.array([[0.971, 0.029, 0.000],
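Reviewer note (illustrative, not part of this diff): the comparison between the time average $\hat p_t(x)$ and $\psi^*(x)$ described above can be sketched as follows; the transition matrix is a placeholder rather than the Hamilton matrix used in the lecture's actual code cell.

```python
# Sketch: the fraction of time a long sample path spends in each state
# should be close to the stationary distribution when the chain is irreducible.
import numpy as np
import quantecon as qe

P = np.array([[0.9, 0.1, 0.0],      # illustrative stand-in matrix
              [0.4, 0.4, 0.2],
              [0.1, 0.1, 0.8]])

mc = qe.MarkovChain(P)
psi_star = mc.stationary_distributions[0]

X = mc.simulate(ts_length=100_000, init=0)              # one long sample path
p_hat = np.array([np.mean(X == x) for x in range(len(P))])

print("empirical fractions:", p_hat)
print("stationary dist:    ", psi_star)                 # the two should be close
```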
@@ -291,27 +284,28 @@ P = np.array([[0.971, 0.029, 0.000],