
Commit 92d6ba8

Tom's Dec 10 edits of mix_model.md lecture
1 parent: fc2139e


lectures/mix_model.md

Lines changed: 12 additions & 6 deletions
@@ -228,7 +228,7 @@ Our second method uses a uniform distribution and the following fact that we als
 
 In other words, if $X \sim F(x)$ we can generate a random sample from $F$ by drawing a random sample from
 a uniform distribution on $[0,1]$ and computing $F^{-1}(U)$.
-$
+
 
 We'll use this fact
 in conjunction with the `numpy.searchsorted` command to sample from $H$ directly.
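
As an aside on the sampling technique this hunk refers to, here is a minimal, hypothetical sketch (not the lecture's code) of inverse-transform sampling from a two-component mixture with `numpy.searchsorted`; the Beta components and the value of the mixing probability below are illustrative assumptions.

```python
# Hypothetical sketch of the inverse-CDF idea: discretize the CDF of the
# mixture h = α f + (1 - α) g, draw uniforms, and invert with searchsorted.
# The Beta components and α are assumptions for illustration only.
import numpy as np
from scipy.stats import beta

α = 0.8                                    # mixing probability (assumed)
f, g = beta(1, 1), beta(3, 1.2)            # component distributions (assumed)

grid = np.linspace(0, 1, 10_001)
h_pdf = α * f.pdf(grid) + (1 - α) * g.pdf(grid)
h_cdf = np.cumsum(h_pdf)
h_cdf /= h_cdf[-1]                         # crude normalization so the CDF ends at 1

rng = np.random.default_rng(1234)
U = rng.uniform(size=100_000)              # U ~ Uniform[0, 1]
draws = grid[np.searchsorted(h_cdf, U)]    # F^{-1}(U) via searchsorted
```
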
@@ -570,6 +570,7 @@ fig, ax = plt.subplots(1, figsize=[10, 6])
 ax.plot(α_arr, KL_g_arr, label='KL(g, h)')
 ax.plot(α_arr, KL_f_arr, label='KL(f, h)')
 ax.set_ylabel('K-L divergence')
+ax.set_xlabel(r'$\alpha$')
 
 ax.legend(loc='upper right')
 plt.show()
@@ -617,6 +618,7 @@ fig, ax = plt.subplots(1, figsize=[10, 6])
 ax.plot(α_arr, KL_g_arr, label='KL(g, h)')
 ax.plot(α_arr, KL_f_arr, label='KL(f, h)')
 ax.set_ylabel('K-L divergence')
+ax.set_xlabel(r'$\alpha$')
 
 # plot KL
 ax2 = ax.twinx()
@@ -697,7 +699,7 @@ def MCMC_run(ws):
     return sample['α']
 ```
 
-The following code displays Bayesian posteriors for $\alpha$ at various history lengths.
+The following code generates the graph below that displays Bayesian posteriors for $\alpha$ at various history lengths.
 
 ```{code-cell} ipython3
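
The diff shows only the tail of `MCMC_run`. As a sketch of how such a routine could be implemented, here is a hypothetical version using numpyro (an assumption; the sampler actually used in the lecture is not visible in this diff), with a uniform prior on $\alpha$ and assumed Beta components.

```python
# Hypothetical sketch, not the lecture's code. Assumptions: numpyro/JAX,
# a Uniform(0, 1) prior on α, and Beta(1, 1), Beta(3, 1.2) components.
import jax.numpy as jnp
from jax import random
import numpyro
import numpyro.distributions as dist
from numpyro.infer import MCMC, NUTS

def model(w):
    α = numpyro.sample('α', dist.Uniform(0.0, 1.0))
    # mixture likelihood h(w) = α f(w) + (1 - α) g(w)
    lik = α * jnp.exp(dist.Beta(1.0, 1.0).log_prob(w)) \
        + (1 - α) * jnp.exp(dist.Beta(3.0, 1.2).log_prob(w))
    numpyro.factor('loglik', jnp.log(lik).sum())

def MCMC_run(ws):
    mcmc = MCMC(NUTS(model), num_warmup=1000, num_samples=2000, progress_bar=False)
    mcmc.run(random.PRNGKey(0), w=jnp.asarray(ws))
    sample = mcmc.get_samples()
    return sample['α']
```
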
@@ -715,7 +717,7 @@ ax.set_xlabel('$\\alpha$')
 plt.show()
 ```
 
-It shows how the Bayesian posterior narrows in on the true value $\alpha = .8$ of the mixing parameter as the length of a history of observations grows.
+Evidently, the Bayesian posterior narrows in on the true value $\alpha = .8$ of the mixing parameter as the length of a history of observations grows.
 
 ## Concluding Remarks
 
@@ -728,13 +730,17 @@ That is wrong because nature is actually mixing each period with mixing probabil
 
 Our type 1 agent eventually believes that either $f$ or $g$ generated the $w$ sequence, the outcome being determined by the model, either $f$ or $g$, whose KL divergence relative to $h$ is smaller.
 
-Our type 2 agent has a statistical model that lets him learn more.
+Our type 2 agent has a different statistical model, one that is correctly specified.
+
+He knows the parametric form of the statistical model but not the mixing parameter $\alpha$.
+
+He knows that he does not know it.
 
-Using Bayes law he eventually figures out $\alpha$.
+But by using Bayes' law in conjunction with his statistical model and a history of data, he eventually acquires a more and more accurate inference about $\alpha$.
 
 This little laboratory exhibits some important general principles that govern outcomes of Bayesian learning of misspecified models.
 
-The following situation prevails quite generally in empirical work.
+Thus, the following situation prevails quite generally in empirical work.
 
 A scientist approaches the data with a manifold $S$ of statistical models $s(X | \theta)$, where $s$ is a probability distribution over a random vector $X$, $\theta \in \Theta$
 is a vector of parameters, and $\Theta$ indexes the manifold of models.
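
For reference, the Bayes' law updating described in this hunk has a simple form under the lecture's setup in which nature draws each $w_t$ i.i.d. from the mixture $h(w) = \alpha f(w) + (1 - \alpha) g(w)$. Writing $\pi_0(\alpha)$ for the type 2 agent's prior (notation assumed here, not taken from the diff), the posterior after observing $w_1, \ldots, w_t$ is

$$
\pi_t(\alpha) \;\propto\; \pi_0(\alpha) \prod_{s=1}^{t} \bigl[\alpha f(w_s) + (1 - \alpha) g(w_s)\bigr],
$$

which concentrates around the true mixing parameter as $t$ grows, consistent with the narrowing posteriors that the edited text describes.
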
