lectures/svd_intro.md (2 additions, 2 deletions)
@@ -1589,7 +1589,7 @@ is an $m \times n$ matrix of least squares projections of $X$ on $\Phi$.
-By virtue of least-squares projection theory discussed in this quantecon lecture e <https://python-advanced.quantecon.org/orth_proj.html>, we can represent $X$ as the sum of the projection $\check X$ of $X$ on $\Phi$ plus a matrix of errors.
+By virtue of least-squares projection theory discussed in this quantecon lecture <https://python-advanced.quantecon.org/orth_proj.html>, we can represent $X$ as the sum of the projection $\check X$ of $X$ on $\Phi$ plus a matrix of errors.
To verify this, note that the least squares projection $\check X$ is related to $X$ by
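The projection claim in the hunk above can be checked numerically. The following is a minimal sketch, not code from the lecture: `X` and `Phi` are hypothetical random matrices following the lecture's notation, and the projection formula $\check X = \Phi (\Phi'\Phi)^{-1} \Phi' X$ is standard least-squares projection theory.

```python
import numpy as np

# Hypothetical small example (names X, Phi follow the lecture's notation):
# project the columns of X onto the column space of Phi.
rng = np.random.default_rng(0)
m, n, p = 6, 4, 2
Phi = rng.standard_normal((m, p))
X = rng.standard_normal((m, n))

# Least-squares projection: X_check = Phi (Phi' Phi)^{-1} Phi' X
X_check = Phi @ np.linalg.solve(Phi.T @ Phi, Phi.T @ X)
errors = X - X_check

# X decomposes as projection plus errors, and the errors are
# orthogonal to the columns of Phi (up to floating-point noise).
print(np.allclose(X, X_check + errors))   # True
print(np.allclose(Phi.T @ errors, 0))     # True
```

The orthogonality of the errors to $\Phi$ is exactly what "least-squares projection" means here.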
@@ -1726,7 +1726,7 @@ We can then use $\check X_{t+j}$ or $\hat X_{t+j}$ to forecast $X_{t+j}$.
### Using Fewer Modes
-In applications, we'll actually want to just a few modes, often three or less.
+In applications, we'll actually want to use just a few modes, often three or less.
Some of the preceding formulas assume that we have retained all $p$ modes associated with the positive
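Retaining only a few of the $p$ modes amounts to truncating a decomposition at low rank. As a hedged illustration (the matrix `X` and the cutoff `r` here are hypothetical, not taken from the lecture), truncating an SVD to its leading $r$ modes looks like this:

```python
import numpy as np

# Sketch: keep only the leading r modes of a matrix via a truncated SVD.
rng = np.random.default_rng(1)
X = rng.standard_normal((8, 5))
U, s, Vt = np.linalg.svd(X, full_matrices=False)

r = 3  # in applications, often three modes or fewer are retained
X_r = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

# By the Eckart-Young theorem, X_r is the best rank-r approximation
# of X in the Frobenius norm; the discarded singular values bound the error.
print(np.linalg.norm(X - X_r))
```

The approximation error equals the Euclidean norm of the discarded singular values, which is why keeping a few dominant modes often suffices.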