@@ -413,7 +413,7 @@ Bayes' law is simply an application of laws of
 After our worker puts a subjective probability $\pi_{-1}$ on nature having selected distribution $F$, we have in effect assumed from the start that the decision maker **knows** the joint distribution for the process $\{w_t\}_{t=0}$.
 
-We assume that the workers also knows the laws of probability theory.
+We assume that the worker also knows the laws of probability theory.
 
 A respectable view is that Bayes' law is less a theory of learning than a statement about the consequences of information inflows for a decision maker who thinks he knows the truth (i.e., a joint probability distribution) from the beginning.
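The Bayes' law update of the subjective probability $\pi$ that nature selected $F$ can be sketched in a few lines. This is a minimal illustration, not the lecture's code: the two candidate densities below (a uniform $F$ and a $G$ with density $2w$ on $[0,1]$) are hypothetical stand-ins for whatever $F$ and $G$ the lecture actually uses.

```python
# Hypothetical candidate densities on [0, 1], standing in for F and G:
f = lambda w: 1.0        # F: uniform density
g = lambda w: 2.0 * w    # G: density 2w

def bayes_update(pi, w):
    """One application of Bayes' law: map the prior pi = Prob(F)
    and an observation w into the posterior probability of F."""
    num = pi * f(w)
    return num / (num + (1 - pi) * g(w))

pi_next = bayes_update(0.5, 0.7)   # w = 0.7 is more likely under G, so pi falls
```

Note that the update leaves the endpoints fixed: a worker who starts at $\pi = 0$ or $\pi = 1$ never moves, consistent with the view that Bayes' law processes information for someone who already "knows" the joint distribution.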
@@ -651,30 +651,59 @@ ax.set_ylabel('$\pi_t$')
 plt.show()
 ```
 
-Now let's plot two paths of pairs of $\{\pi_t, w_t\}$ sequences, one in which $\pi_t \rightarrow 1$,
-another in which $\pi_t \rightarrow 0$.
+The above graph indicates that
+
+* each of the paths converges
+
+* some of the paths converge to $1$
+
+* some of the paths converge to $0$
+
+* none of the paths converges to a limit point not equal to $0$ or $1$
+
+Convergence actually occurs pretty fast, as the following graph of the cross-ensemble distribution of $\pi_t$ for various small $t$'s indicates.
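The convergence pattern listed in the bullets can be reproduced with a small ensemble simulation of the Bayes' law recursion. The densities and sampling mechanism below are illustrative assumptions (uniform $F$, $G$ with density $2w$ drawn by inverse CDF), not the lecture's own setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical candidate densities on [0, 1], standing in for F and G:
f = lambda w: np.ones_like(w)    # F: uniform density
g = lambda w: 2.0 * w            # G: density 2w; its CDF is w**2

def simulate_paths(n_paths=200, T=50, pi0=0.5, from_F=True):
    """Simulate an ensemble of posterior paths pi_t under repeated
    Bayes' law updates, with nature drawing from F or from G."""
    pi = np.full(n_paths, pi0)
    path = np.empty((T + 1, n_paths))
    path[0] = pi
    for t in range(1, T + 1):
        u = rng.uniform(size=n_paths)
        w = u if from_F else np.sqrt(u)   # inverse-CDF draws from F or G
        num = pi * f(w)
        pi = num / (num + (1 - pi) * g(w))
        path[t] = pi
    return path

paths = simulate_paths(from_F=True)
```

When nature draws from $F$, most paths head toward $1$; rerunning with `from_F=False` sends them toward $0$. Plotting `paths` against $t$, or histogramming `paths[t]` across the ensemble for a few small $t$'s, reproduces the kind of figures described above.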