lectures/svd_intro.md: 71 additions & 29 deletions
@@ -1001,7 +1001,7 @@ In effect,

$$
\Phi_s = UW
$$ (eq:Phisfull)

and represented equation {eq}`eq:DSSEbookrepr` as
@@ -1015,48 +1015,101 @@ DMD **projected nodes**.

We turn next to an alternative representation suggested by Tu et al. {cite}`tu_Rowley`, one that is more appropriate when, as is typically the case in practice, we use a reduced SVD.

## Representation 3

Departing from the procedures used to construct Representations 1 and 2, each of which deployed a **full** SVD, we now use a **reduced** SVD.

Again, we let $p \leq \textrm{min}(m,n)$ be the rank of $X$.

Construct a **reduced** SVD

$$
X = \tilde U \tilde \Sigma \tilde V^T,
$$

where now $\tilde U$ is $m \times p$, $\tilde \Sigma$ is $p \times p$, and $\tilde V^T$ is $p \times n$.

Our minimum-norm least-squares approximator of $A$ now has the representation

$$
\hat A = X' \tilde V \tilde \Sigma^{-1} \tilde U^T
$$
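
To fix ideas, here is a minimal NumPy sketch of these two steps. The data are random placeholders and the variable names (`X`, `X_prime`, `U_tilde`, and so on) are ours, not taken from the lecture's code; in an application the columns of `X` and `X_prime` would be successive snapshots.

```python
import numpy as np

# Hypothetical snapshot matrices, random here just to exercise the formulas
m, n = 8, 5
rng = np.random.default_rng(0)
X = rng.standard_normal((m, n))        # columns X_1, ..., X_n
X_prime = rng.standard_normal((m, n))  # columns X_2, ..., X_{n+1} in an application

p = np.linalg.matrix_rank(X)           # rank of X; here p = min(m, n)

# Reduced SVD: keep only the first p singular values and vectors
UU, ss, VVt = np.linalg.svd(X, full_matrices=False)
U_tilde = UU[:, :p]                    # m x p
Sigma_tilde = np.diag(ss[:p])          # p x p
V_tilde = VVt[:p, :].T                 # n x p

# Minimum-norm least-squares approximator  A_hat = X' V~ Sigma~^{-1} U~'
A_hat = X_prime @ V_tilde @ np.linalg.inv(Sigma_tilde) @ U_tilde.T
print(A_hat.shape)                     # (m, m)
```
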
Paralleling a step in Representation 1, define a transition matrix for a rotated $p \times 1$ state $\tilde b_t$ by

$$
\tilde A = \tilde U^T \hat A \tilde U
$$ (eq:Atildered)

Because we are now working with a reduced SVD, $\tilde U \tilde U^T \neq I$, so $\hat A \neq \tilde U \tilde A \tilde U^T$ and we cannot recover $\hat A$ from $\tilde A$ and $\tilde U$.
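
A small numerical illustration of this point, continuing the sketch above:

```python
# Continuing the sketch: U~'U~ = I_p, but U~U~' is only a projection, not I_m
print(np.allclose(U_tilde.T @ U_tilde, np.eye(p)))        # True
print(np.allclose(U_tilde @ U_tilde.T, np.eye(m)))        # False when p < m

A_tilde = U_tilde.T @ A_hat @ U_tilde                     # p x p rotated transition matrix
print(np.allclose(U_tilde @ A_tilde @ U_tilde.T, A_hat))  # False: A_hat is not recovered
```
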

Nevertheless, hoping for the best, we trudge on and construct an eigendecomposition of what is now a $p \times p$ matrix $\tilde A$:

$$
\tilde A = W \Lambda W^{-1}
$$ (eq:tildeAeigenred)

Mimicking our procedure in Representation 2, we cross our fingers and compute the $m \times p$ matrix

$$
\tilde \Phi_s = \tilde U W
$$ (eq:Phisred)

that corresponds to {eq}`eq:Phisfull` for a full SVD.
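
In code, continuing the sketch above (the eigenvalues and eigenvectors may be complex, which NumPy handles automatically):

```python
# Continuing the sketch: eigendecomposition A~ = W Lambda W^{-1}, then Phi~_s = U~ W
eigvals, W = np.linalg.eig(A_tilde)
Lambda = np.diag(eigvals)          # p x p diagonal matrix of eigenvalues
Phi_s_tilde = U_tilde @ W          # m x p
print(Phi_s_tilde.shape)
```
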

At this point, it is interesting to compute $\hat A \tilde \Phi_s$:

$$
\begin{aligned}
\hat A \tilde \Phi_s & = (X' \tilde V \tilde \Sigma^{-1} \tilde U^T) (\tilde U W) \\
& = X' \tilde V \tilde \Sigma^{-1} W \\
& \neq (\tilde U W) \Lambda \\
& = \tilde \Phi_s \Lambda
\end{aligned}
$$

That $\hat A \tilde \Phi_s \neq \tilde \Phi_s \Lambda$ means that, unlike the corresponding situation in Representation 2, the columns of $\tilde \Phi_s = \tilde U W$ are **not** eigenvectors of $\hat A$ corresponding to the eigenvalues in $\Lambda$.
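
A quick numerical confirmation of this point, continuing the sketch above:

```python
# Continuing the sketch: with a reduced SVD, A_hat @ Phi~_s generally differs
# from Phi~_s @ Lambda, so the columns of Phi~_s are not eigenvectors of A_hat.
print(np.allclose(A_hat @ Phi_s_tilde, Phi_s_tilde @ Lambda))   # False in general
```
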

But in the quest for eigenvectors of $\hat A$ that we can compute with a reduced SVD, let's define

$$
\Phi \equiv \hat A \tilde \Phi_s = X' \tilde V \tilde \Sigma^{-1} W
$$

It turns out that the columns of $\Phi$ **are** eigenvectors of $\hat A$, a consequence of a result established by Tu et al. {cite}`tu_Rowley`.

To present their result, for convenience we'll drop the tildes on $U, V,$ and $\Sigma$ and adopt the understanding that each of them is computed with a reduced SVD.

Thus, we now use the notation that the $m \times p$ matrix $\Phi$ is defined as

$$
\Phi = X' V \Sigma^{-1} W
$$ (eq:Phiformula)

**Proposition** The $p$ columns of $\Phi$ are eigenvectors of $\hat A$.

**Proof:** From formula {eq}`eq:Phiformula` we have

$$
\begin{aligned}
\hat A \Phi & = (X' V \Sigma^{-1} U^T) (X' V \Sigma^{-1} W) \cr
& = X' V \Sigma^{-1} \tilde A W \cr
& = X' V \Sigma^{-1} W \Lambda \cr
& = \Phi \Lambda
\end{aligned}
$$
@@ -1066,34 +1119,23 @@ $$

Thus, we have deduced that

$$
\hat A \Phi = \Phi \Lambda
$$ (eq:APhiLambda)

Let $\phi_i$ be the $i$th column of $\Phi$ and $\lambda_i$ be the corresponding $i$th eigenvalue of $\tilde A$ from decomposition {eq}`eq:tildeAeigenred`.

Writing out the $m \times 1$ vectors on both sides of equation {eq}`eq:APhiLambda` and equating them gives

$$
\hat A \phi_i = \lambda_i \phi_i .
$$

Thus, $\phi_i$ is an eigenvector of $\hat A$ that corresponds to eigenvalue $\lambda_i$ of $\tilde A$.

This concludes the proof.
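
To make the proposition concrete, here is a quick numerical check continuing the sketch above (reusing `W`, `Lambda`, and `Phi_s_tilde` computed earlier) that the two formulas for $\Phi$ agree and that $\hat A \Phi = \Phi \Lambda$ holds up to floating-point error:

```python
# Continuing the sketch: Phi = X' V~ Sigma~^{-1} W, equivalently Phi = A_hat @ Phi~_s
Phi = X_prime @ V_tilde @ np.linalg.inv(Sigma_tilde) @ W
print(np.allclose(Phi, A_hat @ Phi_s_tilde))   # the two formulas agree
print(np.allclose(A_hat @ Phi, Phi @ Lambda))  # columns of Phi are eigenvectors of A_hat
```
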
Also see {cite}`DDSE_book` (p. 238)
@@ -1123,7 +1165,7 @@ X_t & = \Phi \check b_t

$$

But there is a better way to compute the $p \times 1$ vector $\check b_t$.
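
As a point of comparison, here is a minimal sketch, continuing the snippets above, of the straightforward pseudoinverse computation that the argument below improves upon:

```python
# Continuing the sketch: the straightforward way to recover b_check_t
# is to solve X_t = Phi @ b_check_t by least squares via the pseudoinverse.
X_t = X_prime[:, 0]                           # an example snapshot
b_check_t = np.linalg.pinv(Phi) @ X_t         # p x 1 least-squares solution
print(np.linalg.norm(X_t - Phi @ b_check_t))  # residual: distance of X_t from col(Phi)
```
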
In particular, the following argument from {cite}`DDSE_book` (page 240) provides a computationally efficient way