## Reduced-order VAR

DMD is a natural tool for estimating a **reduced-order vector autoregression**,
an object that we define in terms of the population regression equation

$$
X_{t+1} = \check A X_t + C \epsilon_{t+1}
$$ (eq:VARred)

where

* $X_t$ is an $m \times 1$ vector
* $\check A$ is an $m \times m$ matrix of rank $r$ whose eigenvalues are all less than $1$ in modulus
* $\epsilon_{t+1} \sim {\mathcal N}(0, I)$ is an $m \times 1$ vector of i.i.d. shocks
* $E \epsilon_{t+1} X_t^T = 0$, so that the shocks are orthogonal to the regressors

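Here is a minimal sketch of how one might simulate model {eq}`eq:VARred` in Python; the dimensions `m`, `r`, `n` and the particular way the rank-$r$, stable matrix $\check A$ and the volatility matrix $C$ are built are assumptions made only for this illustration.

```python
import numpy as np

np.random.seed(0)
m, r, n = 10, 3, 200    # state dimension, rank of A_check, sample size (assumed values)

# Build a rank-r matrix A_check whose eigenvalues all lie inside the unit circle
B = np.random.randn(m, r)
D = np.random.randn(r, m)
A_check = B @ D
A_check *= 0.9 / np.max(np.abs(np.linalg.eigvals(A_check)))   # rescale for stability

C = np.random.randn(m, m)            # volatility matrix, also arbitrary here

# Simulate X_{t+1} = A_check X_t + C eps_{t+1}
X_path = np.empty((m, n))
X_path[:, 0] = np.random.randn(m)
for t in range(n - 1):
    X_path[:, t + 1] = A_check @ X_path[:, t] + C @ np.random.randn(m)

X, X_prime = X_path[:, :-1], X_path[:, 1:]    # data matrices X and X'
```
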
Form the data matrices $X = \begin{bmatrix} X_1 \mid X_2 \mid \cdots \mid X_{n-1} \end{bmatrix}$ and $X' = \begin{bmatrix} X_2 \mid X_3 \mid \cdots \mid X_n \end{bmatrix}$ as before, so that according to model {eq}`eq:VARred`

$$
X' = \begin{bmatrix} \check A X_1 + C \epsilon_2 \mid \check A X_2 + C \epsilon_3 \mid \cdots \mid \check A X_{n-1} + C \epsilon_n \end{bmatrix}
$$

To illustrate some useful calculations, assume that $n = 3$ and form

$$
X' X^T = \begin{bmatrix} \check A X_1 + C \epsilon_2 & \check A X_2 + C \epsilon_3 \end{bmatrix}
\begin{bmatrix} X_1^T \cr X_2^T \end{bmatrix}
$$

or

$$
X' X^T = \check A (X_1 X_1^T + X_2 X_2^T) + C (\epsilon_2 X_1^T + \epsilon_3 X_2^T)
$$

but because

$$
E (\epsilon_2 X_1^T + \epsilon_3 X_2^T) = 0
$$

we have

$$
E X' X^T = \check A E (X_1 X_1^T + X_2 X_2^T)
$$

and therefore

$$
\check A = E X' X^T \left( E X X^T \right)^{-1}
$$

A sample counterpart of this formula is the least-squares and DMD estimator $\hat A = X' X^{+}$ described earlier in this lecture.

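In code, with the data matrices `X` and `X_prime` from the simulation sketch above, the estimator is one line; the rank-$r$ truncated version shown alongside it is an assumption of this illustration, in the spirit of the reduced-rank DMD formulas earlier in the lecture.

```python
import numpy as np

# Full least-squares / DMD estimate:  A_hat = X' X^+
A_hat_full = X_prime @ np.linalg.pinv(X)

# Reduced-rank version: project on the leading r singular vectors of X
U, S, Vt = np.linalg.svd(X, full_matrices=False)
U_r, S_r, Vt_r = U[:, :r], S[:r], Vt[:r, :]
A_hat = X_prime @ Vt_r.T @ np.diag(1 / S_r) @ U_r.T    # rank-r estimate of A_check
```
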
Given $\check A$, or an estimate of it, we can recover the shocks from

$$
C \epsilon_{t+1} = X_{t+1} - \check A X_t , \quad t = 1, \ldots, n-1
$$

and check whether they are serially uncorrelated, as assumed in {eq}`eq:VARred`.

For example, we can compute spectra and cross-spectra of components of $C \epsilon_{t+1}$ and use them to check, in the usual ways, whether the shocks are indeed serially uncorrelated.

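A minimal time-domain check, continuing the assumed variable names from the sketches above (an estimated spectrum would serve the same purpose):

```python
# Fitted shocks  C eps_{t+1} = X_{t+1} - A_hat X_t  for t = 1, ..., n-1
resid = X_prime - A_hat @ X              # shape (m, n-1)

def autocorr(series, lag):
    "Sample autocorrelation of a 1-D array at a given lag."
    s = series - series.mean()
    return (s[lag:] @ s[:-lag]) / (s @ s)

# Autocorrelations of the first shock component at lags 1, 2, 3
print([autocorr(resid[0], lag) for lag in (1, 2, 3)])
```
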

We can also estimate the covariance matrix of $C \epsilon_{t+1}$ from

$$
\frac{1}{n-1} \sum_{t=1}^{n-1} (C \epsilon_{t+1})(C \epsilon_{t+1})^T
$$

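With `resid` holding the fitted shocks as in the sketch above, this estimator is a single line:

```python
# Covariance matrix of C eps_{t+1}, averaging the n-1 outer products
Sigma_hat = resid @ resid.T / resid.shape[1]
```
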
It can be enlightening to diagonalize our reduced-order VAR {eq}`eq:VARred` by noting that it can be written

$$
X_{t+1} = \Phi \Lambda \Phi^{+} X_t + C \epsilon_{t+1}
$$

and then as

$$
\Phi^+ X_{t+1} = \Lambda \Phi^{+} X_t + \Phi^+ C \epsilon_{t+1}
$$

or

$$
\tilde X_{t+1} = \Lambda \tilde X_t + \tilde \epsilon_{t+1}
$$ (eq:VARmodes)

where $\tilde X_t \equiv \Phi^+ X_t$ is an $r \times 1$ vector of **modes** and $\tilde \epsilon_{t+1} \equiv \Phi^+ C \epsilon_{t+1}$ is an $r \times 1$ vector of shocks.

The $r$ modes $\tilde X_t$ obey the first-order VAR {eq}`eq:VARmodes`, in which $\Lambda$ is an $r \times r$ diagonal matrix.

Note that while $\Lambda$ is diagonal, the contemporaneous covariance matrix of $\tilde \epsilon_{t+1}$ need not be.

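Here is a sketch of how the modes might be computed from the rank-$r$ estimate `A_hat` constructed above; taking the eigenvectors of `A_hat` associated with its $r$ largest eigenvalues as $\Phi$ is an assumption of this illustration.

```python
import numpy as np

# Eigendecomposition of the rank-r estimate of A_check
eigvals, eigvecs = np.linalg.eig(A_hat)

# Keep the r eigenvalues of largest modulus and their eigenvectors
order = np.argsort(-np.abs(eigvals))[:r]
Lam = np.diag(eigvals[order])            # r x r diagonal matrix Lambda
Phi = eigvecs[:, order]                  # m x r matrix of eigenvectors

# Modes: X_tilde_t = Phi^+ X_t for every period in the sample
X_tilde = np.linalg.pinv(Phi) @ X_path   # r x n array of modes
```
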

**Remark:** It is permissible for $X_t$ to contain lagged values of observables.

For example, we might have a setting in which

$$
X_t = \begin{bmatrix}
y_{1, t} \cr
y_{2, t} \cr
\vdots \cr
y_{m, t} \cr
y_{1, t-1} \cr
y_{2, t-1} \cr
\vdots
\end{bmatrix}
$$

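A sketch of how such a stacked state could be assembled from an $m \times n$ array of observations `y` (the array name and the use of a single lag are assumptions of this example):

```python
import numpy as np

def stack_with_lag(y):
    """Stack current observations on top of their one-period lags.

    y is an (m, n) array; the result is a (2m, n-1) array whose
    columns stack each observation vector on its predecessor.
    """
    return np.vstack([y[:, 1:], y[:, :-1]])

# Example with random data
y = np.random.randn(4, 50)
X_lagged = stack_with_lag(y)    # each column plays the role of X_t above
```
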

+++

## Source for Some Python Code