Commit b855fc1

Tom's Dec 4 edits of recalcitrant lecture likelihood_bayes.md
1 parent 86f176b commit b855fc1

File tree

1 file changed

lectures/likelihood_bayes.md

Lines changed: 10 additions & 17 deletions
@@ -713,7 +713,6 @@ plt.show()
 
 
 
-**Drilling down a little bit**
 
 For the preceding ensemble that assumed $\pi_0 = .5$, the following graph shows two paths of
 $w_t$'s and the $\pi_t$ sequences that gave rise to them.
@@ -762,20 +761,20 @@ table
 The fraction of simulations for which $\pi_{t}$ had converged to $1$ is indeed always close to $\pi_{-1}$, as anticipated.
 
 
+## Drilling Down a Little Bit
 
+To understand how the local dynamics of $\pi_t$ behaves, it is enlightening to consult the variance of $\pi_{t}$ conditional on $\pi_{t-1}$.
 
-
-### Conditional Variance of Subjective Distribution
-
-We can use a Monte Carlo simulation to approximate the conditional variance of $\pi_{t+1}$ under the
-subjective distribution:
-
+Under the subjective distribution this conditional variance is defined as
+
 $$
 \sigma^2(\pi_t | \pi_{t-1}) = \int \Bigl[ { \pi_{t-1} f(w) \over \pi_{t-1} f(w) + (1-\pi_{t-1})g(w) } - \pi_{t-1} \Bigr]^2
 \Bigl[ \pi_{t-1} f(w) + (1-\pi_{t-1})g(w) \Bigr] d w
 $$
 
-We approximate this for a grid of points $\pi_{t-1} \in [0,1]$.
+We can use a Monte Carlo simulation to approximate this conditional variance.
+
+We approximate it for a grid of points $\pi_{t-1} \in [0,1]$.
 
 Then we'll plot it.
 
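For concreteness, here is a minimal sketch of the Monte Carlo approximation described in the hunk above. The Beta parameters chosen for $F$ and $G$ are illustrative assumptions and need not match the lecture's choices; the idea is to draw $w$ from the subjective mixture $\pi_{t-1} f + (1-\pi_{t-1}) g$ and average the squared deviation of the Bayes update from $\pi_{t-1}$.

```python
import numpy as np
from scipy.stats import beta

# Illustrative Beta parameters for F and G -- assumptions for this sketch,
# not necessarily the values used in the lecture.
F_a, F_b = 1, 1
G_a, G_b = 3, 1.2

f = lambda w: beta.pdf(w, F_a, F_b)
g = lambda w: beta.pdf(w, G_a, G_b)

def update(pi, w):
    """Bayes' law: posterior probability that nature drew w from F."""
    return pi * f(w) / (pi * f(w) + (1 - pi) * g(w))

def conditional_var(pi, mc_size=100_000, seed=1234):
    """Monte Carlo estimate of sigma^2(pi_t | pi_{t-1} = pi).

    Draw w from the subjective mixture pi * F + (1 - pi) * G and average
    the squared deviation of the updated belief from pi.
    """
    rng = np.random.default_rng(seed)
    from_F = rng.random(mc_size) < pi          # True -> draw w from F
    w = np.where(from_F,
                 rng.beta(F_a, F_b, size=mc_size),
                 rng.beta(G_a, G_b, size=mc_size))
    return np.mean((update(pi, w) - pi) ** 2)

# Approximate the conditional variance on a grid of pi_{t-1} values.
pi_grid = np.linspace(0.01, 0.99, 50)
cond_var = np.array([conditional_var(p) for p in pi_grid])
```

Plotting `cond_var` against `pi_grid` should show the variance shrinking toward $0$ as $\pi_{t-1}$ approaches either $0$ or $1$, consistent with the discussion in the next hunk.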
@@ -807,17 +806,11 @@ ax.set_ylabel('$\sigma^{2}(\pi_{t}\\vert \pi_{t-1})$')
 plt.show()
 ```
 
-Notice how the conditional variance approaches $0$ for $\pi_{t-1}$ near either $0$ or $1$, where
-the agent is almost sure that $w_t$ is drawn from $F$ or from $G$.
-
-```{code-cell} ipython3
-
-```
-
-
-
+The shape of the conditional variance as a function of $\pi_{t-1}$ is informative about the behavior of sample paths of $\{\pi_t\}$.
 
+Notice how the conditional variance approaches $0$ for $\pi_{t-1}$ near either $0$ or $1$.
 
+In each of these regions, the agent is almost sure that $w_t$ is drawn from $F$ or from $G$.
 
 ## Sequels
 
