
Commit 3f8ac5e

Tom's March 10 edits of two lectures
1 parent e45e958 commit 3f8ac5e

2 files changed: 12 additions, 12 deletions

lectures/exchangeable.md

Lines changed: 11 additions & 11 deletions
@@ -116,7 +116,7 @@ If a sequence of random variables is IID, past information provides no information
 Therefore, there is **nothing to learn** from the past about the future.

 To understand these statements, let the joint distribution of a sequence of random variables $\{W_t\}_{t=0}^T$
-that is not necessarily IID, be
+that is not necessarily IID be

 $$
 p(W_T, W_{T-1}, \ldots, W_1, W_0)
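
As an aside (not part of the commit), the "nothing to learn" claim rests on a standard identity: in the IID case the joint density in the hunk above factors into a product of identical marginals, so every conditional density collapses to the marginal:

$$
p(W_T, W_{T-1}, \ldots, W_1, W_0) = \prod_{t=0}^{T} p(W_t),
\qquad
p(W_t \mid W_{t-1}, \ldots, W_0) = p(W_t).
$$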
@@ -149,9 +149,9 @@ and partial history $W_{t-1}, \ldots, W_0$ contains no information about the pro

 So in the IID case, there is **nothing to learn** about the densities of future random variables from past random variables.

-In the general case, there is something to learn from observations of past random variables.
+But when the sequence is not IID, there is something to learn about the future from observations of past random variables.

-We turn next to an instance of this general case.
+We turn next to an instance of the general case in which the sequence is not IID.

 Please watch for what can be learned from the past and when.

@@ -174,18 +174,18 @@ distribution.
 So the data are permanently generated as independently and identically distributed (IID) draws from **either** $F$ **or**
 $G$.

-We could say that *objectively* the probability that the data are generated as draws from $F$ is either $0$
+We could say that *objectively*, meaning *after* nature has chosen either $F$ or $G$, the probability that the data are generated as draws from $F$ is either $0$
 or $1$.

 We now drop into this setting a partially informed decision maker who knows

-- both $F$ and $G$, and
+- both $F$ and $G$, but

-- but not the $F$ or $G$ that nature drew once-and-for-all at $t = -1$
+- not the $F$ or $G$ that nature drew once-and-for-all at $t = -1$

 So our decision maker does not know which of the two distributions nature selected.

-The decision maker summarizes his ignorance with a **subjective probability**
+The decision maker describes his ignorance with a **subjective probability**
 $\tilde \pi$ and reasons as if nature had selected $F$ with probability
 $\tilde \pi \in (0,1)$ and
 $G$ with probability $1 - \tilde \pi$.
@@ -194,7 +194,7 @@ Thus, we assume that the decision maker

 - **knows** both $F$ and $G$
 - **doesn't know** which of these two distributions that nature has drawn
-- expresses his ignorance by acting as if or **thinking** that nature chose distribution $F$ with probability $\tilde \pi \in (0,1)$ and distribution
+- expresses his ignorance by **acting as if** or **thinking that** nature chose distribution $F$ with probability $\tilde \pi \in (0,1)$ and distribution
 $G$ with probability $1 - \tilde \pi$
 - at date $t \geq 0$ knows the partial history $w_t, w_{t-1}, \ldots, w_0$

@@ -258,7 +258,7 @@

 This means that random variable $W_0$ contains information about random variable $W_1$.

-So there is something to learn.
+So there is something to learn from the past about the future.

 But what and how?

@@ -282,7 +282,7 @@ Equation {eq}`eq_definetti` represents our instance of an exchangeable joint den
 variables as a **mixture** of two IID joint densities over a sequence of random variables.

 For a Bayesian statistician, the mixing parameter $\tilde \pi \in (0,1)$ has a special interpretation
-as a **prior probability** that nature selected probability distribution $F$.
+as a subjective **prior probability** that nature selected probability distribution $F$.

 DeFinetti {cite}`definetti` established a related representation of an exchangeable process created by mixing
 sequences of IID Bernoulli random variables with parameter $\theta \in (0,1)$ and mixing probability density $\pi(\theta)$
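
The mixture construction in this hunk can be illustrated with a short sketch (illustrative only, not part of the commit): nature draws $F$ or $G$ once with probability $\tilde \pi$, and the whole sequence is then IID from that single distribution. The particular choices of $F$ and $G$ below are hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def draw_sequence(pi_tilde, T):
    """Nature picks F with probability pi_tilde once-and-for-all,
    then emits T IID draws from the chosen distribution."""
    if rng.random() < pi_tilde:
        return rng.normal(0.0, 1.0, size=T)   # F: N(0, 1)  (illustrative)
    return rng.normal(1.0, 1.0, size=T)       # G: N(1, 1)  (illustrative)

sample = draw_sequence(pi_tilde=0.5, T=5)
print(len(sample))  # 5
```

Note that each sequence is exchangeable but not IID: conditional on nature's draw it is IID, yet an observer who does not know the draw can learn about it from the data.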
@@ -306,7 +306,7 @@ Another way to say *use Bayes' Law* is to say *from a (subjective) joint distrib

 Let's dive into Bayes' Law in this context.

-Let $q$ represent the distribution that nature actually draws $w$ from
+Let $q$ represent the distribution that nature actually draws $w$
 from and let

 $$
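
For concreteness, Bayes' Law in this two-distribution setting updates $\tilde \pi$ by the likelihood ratio of $F$ to $G$. A minimal sketch, in which the specific densities for $F$ and $G$ are assumptions for illustration rather than the lecture's:

```python
import numpy as np

def update(pi, w, f, g):
    """Posterior probability that nature chose F, after observing w."""
    num = pi * f(w)
    return num / (num + (1 - pi) * g(w))

def normal_pdf(x, mu, sigma=1.0):
    # normal density, used only as an illustrative choice of F and G
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

f = lambda w: normal_pdf(w, 0.0)   # F = N(0, 1)  (assumed)
g = lambda w: normal_pdf(w, 1.0)   # G = N(1, 1)  (assumed)

pi = 0.5                           # prior subjective probability on F
for w in [-0.3, 0.1, -0.8]:        # observations that look more like F
    pi = update(pi, w, f, g)
print(round(pi, 3))                # 0.924
```

Applying the same one-step update recursively to the partial history $w_t, w_{t-1}, \ldots, w_0$ produces the decision maker's sequence of posterior probabilities.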

lectures/wald_friedman.md

Lines changed: 1 addition & 1 deletion
@@ -42,7 +42,7 @@ In addition to what's in Anaconda, this lecture will need the following librarie

 ## Overview

-This lecture describes a statistical decision problem encountered by Milton
+This lecture describes a statistical decision problem presented to Milton
 Friedman and W. Allen Wallis during World War II when they were analysts at
 the U.S. Government's Statistical Research Group at Columbia University.
