lectures/exchangeable.md (+11 −11 lines changed)
@@ -116,7 +116,7 @@ If a sequence of random variables is IID, past information provides no information…
 Therefore, there is **nothing to learn** from the past about the future.

 To understand these statements, let the joint distribution of a sequence of random variables $\{W_t\}_{t=0}^T$
-that is not necessarily IID, be
+that is not necessarily IID be

 $$
 p(W_T, W_{T-1}, \ldots, W_1, W_0)
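To make the "nothing to learn" claim concrete, the chain rule of probability factors this joint density into one-step-ahead conditionals; under IID each conditional collapses to the common marginal, written $f$ below (a symbol assumed here for illustration, not fixed by this hunk):

$$
p(W_T, W_{T-1}, \ldots, W_0)
= p(W_T \mid W_{T-1}, \ldots, W_0) \, p(W_{T-1} \mid W_{T-2}, \ldots, W_0) \cdots p(W_0)
\overset{\text{IID}}{=} \prod_{t=0}^T f(W_t),
$$

so conditioning on any partial history leaves every future density unchanged.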
@@ -149,9 +149,9 @@ and partial history $W_{t-1}, \ldots, W_0$ contains no information about the pro…

 So in the IID case, there is **nothing to learn** about the densities of future random variables from past random variables.

-In the general case, there is something to learn from observations of past random variables.
+But when the sequence is not IID, there is something to learn about the future from observations of past random variables.

-We turn next to an instance of this general case.
+We turn next to an instance of the general case in which the sequence is not IID.

 Please watch for what can be learned from the past and when.
@@ -174,18 +174,18 @@ distribution.
 So the data are permanently generated as independently and identically distributed (IID) draws from **either** $F$ **or**
 $G$.

-We could say that *objectively* the probability that the data are generated as draws from $F$ is either $0$
+We could say that *objectively*, meaning *after* nature has chosen either $F$ or $G$, the probability that the data are generated as draws from $F$ is either $0$
 or $1$.

 We now drop into this setting a partially informed decision maker who knows

-- both $F$ and $G$, and
+- both $F$ and $G$, but

--but not the $F$ or $G$ that nature drew once-and-for-all at $t = -1$
+- not the $F$ or $G$ that nature drew once-and-for-all at $t = -1$

 So our decision maker does not know which of the two distributions nature selected.

-The decision maker summarizes his ignorance with a **subjective probability**
+The decision maker describes his ignorance with a **subjective probability**
 $\tilde \pi$ and reasons as if nature had selected $F$ with probability
 $\tilde \pi \in (0,1)$ and
 $G$ with probability $1 - \tilde \pi$.
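A minimal simulation sketch of this environment may help; the specific Beta distributions standing in for $F$ and $G$, the prior value, and the helper name `draw_history` are illustrative assumptions, not choices made in the lecture text:

```python
import numpy as np

rng = np.random.default_rng(0)

def draw_history(pi_tilde=0.5, T=50):
    """Nature chooses F or G once-and-for-all, then draws an IID sample."""
    nature_chose_F = rng.uniform() < pi_tilde   # the single draw at t = -1
    if nature_chose_F:
        return nature_chose_F, rng.beta(1, 1, size=T)    # stand-in for F
    return nature_chose_F, rng.beta(3, 1.2, size=T)      # stand-in for G

chose_F, w = draw_history()
# Objectively, once the coin has landed, the sample comes from F with
# probability 0 or 1; the decision maker sees only `w`, never `chose_F`.
```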
@@ -194,7 +194,7 @@ Thus, we assume that the decision maker

 - **knows** both $F$ and $G$
 - **doesn't know** which of these two distributions that nature has drawn
-- expresses his ignorance by acting as if or **thinking** that nature chose distribution $F$ with probability $\tilde \pi \in (0,1)$ and distribution
+- expresses his ignorance by **acting as if** or **thinking that** nature chose distribution $F$ with probability $\tilde \pi \in (0,1)$ and distribution
 $G$ with probability $1 - \tilde \pi$
 - at date $t \geq 0$ knows the partial history $w_t, w_{t-1}, \ldots, w_0$
@@ -258,7 +258,7 @@ $$

 This means that random variable $W_0$ contains information about random variable $W_1$.

-So there is something to learn.
+So there is something to learn from the past about the future.

 But what and how?
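A quick numerical check of this claim, reusing the illustrative Beta stand-ins for $F$ and $G$ from the sketch above (the mixture weight and evaluation points are likewise assumptions), shows that the conditional density of $W_1$ given $W_0$ differs from its marginal:

```python
from scipy.stats import beta

pi_tilde = 0.5
f = beta(1, 1).pdf      # illustrative stand-in for F's density
g = beta(3, 1.2).pdf    # illustrative stand-in for G's density

def joint(w0, w1):
    # mixture of two IID joint densities, as in the surrounding text
    return pi_tilde * f(w0) * f(w1) + (1 - pi_tilde) * g(w0) * g(w1)

def marginal(w):
    return pi_tilde * f(w) + (1 - pi_tilde) * g(w)

w0, w1 = 0.9, 0.6
print(joint(w0, w1) / marginal(w0))   # p(w1 | w0)
print(marginal(w1))                   # p(w1): a different number
```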
@@ -282,7 +282,7 @@ Equation {eq}`eq_definetti` represents our instance of an exchangeable joint den…
 variables as a **mixture** of two IID joint densities over a sequence of random variables.

 For a Bayesian statistician, the mixing parameter $\tilde \pi \in (0,1)$ has a special interpretation
-as a **prior probability** that nature selected probability distribution $F$.
+as a subjective **prior probability** that nature selected probability distribution $F$.

 DeFinetti {cite}`definetti` established a related representation of an exchangeable process created by mixing
 sequences of IID Bernoulli random variables with parameter $\theta \in (0,1)$ and mixing probability density $\pi(\theta)$
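For reference, a standard statement of that representation, with $x_t \in \{0,1\}$ denoting realized values of the Bernoulli random variables, is

$$
\textrm{Prob}(X_1 = x_1, \ldots, X_T = x_T)
= \int_0^1 \theta^{\sum_t x_t} (1 - \theta)^{T - \sum_t x_t} \, \pi(\theta) \, d\theta ,
$$

an average of IID Bernoulli joint probabilities weighted by the mixing density $\pi(\theta)$.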
@@ -306,7 +306,7 @@ Another way to say *use Bayes' Law* is to say *from a (subjective) joint distrib…

 Let's dive into Bayes' Law in this context.

-Let $q$ represent the distribution that nature actually draws $w$ from
+Let $q$ represent the distribution that nature actually draws $w$
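A minimal sketch of the Bayes' Law calculation this passage sets up, again with the illustrative Beta stand-ins for the densities of $F$ and $G$; the recursion below is the standard posterior update for a two-point prior, not code quoted from the lecture:

```python
from scipy.stats import beta

f = beta(1, 1).pdf      # illustrative stand-in for F's density
g = beta(3, 1.2).pdf    # illustrative stand-in for G's density

def update(pi_t, w):
    """Posterior probability that nature chose F, after observing w."""
    weight_F = pi_t * f(w)            # prior weight on F times likelihood under F
    weight_G = (1 - pi_t) * g(w)      # prior weight on G times likelihood under G
    return weight_F / (weight_F + weight_G)

# Conditioning on a partial history w_0, ..., w_t is just the one-step update
# applied sequentially, starting from the prior \tilde{\pi}:
pi = 0.5                              # illustrative prior
for w_obs in [0.9, 0.6, 0.95]:        # hypothetical observations
    pi = update(pi, w_obs)
print(pi)                             # posterior Prob(F) given the history
```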