
Commit 6155d97

Merge pull request #82 from QuantEcon/lineareqns
integrated comments
2 parents 8a0be8f + 6b31dc6 commit 6155d97

1 file changed
lectures/linear_equations.md

Lines changed: 21 additions & 46 deletions
@@ -41,8 +41,6 @@ from matplotlib import cm
 from mpl_toolkits.mplot3d import Axes3D
 ```

-+++
-
 ## A Two Good Example

 We discuss a simple two good example and solve it by
@@ -128,8 +126,6 @@ Traditionally, vectors are represented visually as arrows from the origin to the

 The following figure represents three vectors in this manner.

-+++
-
 ```{code-cell} ipython3
 fig, ax = plt.subplots(figsize=(10, 8))
 # Set the axes through the origin
@@ -139,7 +135,7 @@ for spine in ['right', 'top']:
     ax.spines[spine].set_color('none')

 ax.set(xlim=(-5, 5), ylim=(-5, 5))
-ax.grid()
+
 vecs = ((2, 4), (-3, 3), (-4, -3.5))
 for v in vecs:
     ax.annotate('', xy=v, xytext=(0, 0),
@@ -378,8 +374,6 @@ np.sqrt(np.sum(x**2)) # Norm of x, take one
 np.linalg.norm(x) # Norm of x, take two
 ```

-+++
-
 ## Matrix Operations

 ```{index} single: Matrix; Operations
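Editorial note: for context on the norm code shown at the top of this hunk, a minimal check that the two computations agree (the example vector `x` is assumed here for illustration):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])               # example vector, assumed for illustration

norm_manual = np.sqrt(np.sum(x**2))         # Euclidean norm computed "by hand"
norm_numpy = np.linalg.norm(x)              # same norm via NumPy's routine

print(np.isclose(norm_manual, norm_numpy))  # True
```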
@@ -590,8 +584,6 @@ NumPy arrays are also used as matrices, and have fast, efficient functions and m

 You can create them manually from tuples of tuples (or lists of lists) as follows

-+++
-
 ```{code-cell} ipython3
 A = ((1, 2),
      (3, 4))
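Editorial note: the lecture presumably converts this nested tuple into an array as its next step; a minimal sketch of that step, assuming NumPy is imported as `np`:

```python
import numpy as np

A = ((1, 2),
     (3, 4))
A = np.array(A)    # nested tuple -> 2 x 2 NumPy array
print(A.shape)     # (2, 2)
```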
@@ -630,8 +622,6 @@ B = np.ones((3, 3)) # 3 x 3 matrix of ones
 A + B
 ```

-+++
-
 To multiply matrices we use the `@` symbol.


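Editorial note: a brief illustration of the `@` operator mentioned above; the example matrices are assumed here and are not the ones used in the lecture:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.ones((2, 2))

print(A @ B)   # matrix product (rows of A times columns of B)
print(A * B)   # elementwise product -- not matrix multiplication
```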
@@ -855,8 +845,6 @@ It can be verified manually that this system has no possible solution.

 To illustrate why this situation arises let's plot the two lines.

-+++
-
 ```{code-cell} ipython3
 fig, ax = plt.subplots(figsize=(5, 4))
 x = np.linspace(-10,10)
@@ -866,7 +854,7 @@ plt.legend()
 plt.show()
 ```

-+++
++++ {"tags": []}

 Clearly, these are parallel lines and hence we will never find a point $x \in \mathbb{R}^2$
 such that these lines intersect.
@@ -894,12 +882,11 @@ We can rewrite this system in matrix form as

 It can be noted that the $2^{nd}$ row of matrix $A = (2, 6)$ is just a scalar multiple of the $1^{st}$ row of matrix $A = (1, 3)$.

-Matrix $A$ in this case is called **linearly dependent.**
+The rows of matrix $A$ in this case are called **linearly dependent**.

-Linear dependence arises when one row of a matrix can be expressed as a [linear combination](https://en.wikipedia.org/wiki/Linear_combination)
-of the other rows.
+A collection of vectors $A$ is called linearly dependent whenever a vector $v \in A$ can be expressed as a [linear combination](https://en.wikipedia.org/wiki/Linear_combination) of the other vectors in $A$.

-A matrix that is **not** linearly dependent is called **linearly independent**.
+A collection of vectors that is **not** linearly dependent is called **linearly independent**.

 We will keep our discussion of linear dependence and independence limited but a more detailed and generalized
 explanation can be found [here](https://python.quantecon.org/linear_algebra.html#linear-independence).
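Editorial note: a quick numerical way to see the dependence described above (not part of the lecture text; `numpy.linalg.matrix_rank` is used purely for illustration):

```python
import numpy as np

A = np.array([[1, 3],
              [2, 6]])    # second row = 2 * first row, as noted above

# Linearly dependent rows => rank below the number of rows
print(np.linalg.matrix_rank(A))   # 1
```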
@@ -919,7 +906,7 @@ Any vector $v = (x,y)$ such that $x = 2y - 4$ will solve the above system.

 Since we can find infinite such vectors this system has infinitely many solutions.

-Check whether the matrix
+Check whether the rows of the matrix

 ```{math}
 :label: many_solns
@@ -930,7 +917,7 @@ Check whether the matrix
 \end{bmatrix}
 ```

-is linearly dependent or independent.
+are linearly dependent or independent.

 We can now impose conditions on $A$ in {eq}`la_se2` that rule out these problems.

@@ -954,9 +941,9 @@ $$
 If the determinant of $A$ is not zero, then we say that $A$ is
 *nonsingular*.

-A square matrix $A$ is nonsingular if and only if $A$ is linearly independent.
+A square matrix $A$ is nonsingular if and only if the rows and columns of $A$ are linearly independent.

-You can check yourself that the linearly dependent matrices in {eq}`no_soln` and {eq}`many_solns` are singular matrices
+You can check yourself that the matrices in {eq}`no_soln` and {eq}`many_solns` with linearly dependent rows are singular matrices
 as well.

 This gives us a useful one-number summary of whether or not a square matrix can be
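Editorial note: to tie the determinant summary above back to the dependent-rows example discussed earlier, a small illustrative check (not part of the lecture):

```python
import numpy as np

A = np.array([[1, 3],
              [2, 6]])    # rows are linearly dependent

print(np.linalg.det(A))   # 0.0 (up to floating point), so A is singular
```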
@@ -994,8 +981,6 @@ We can now solve for equilibrium prices with NumPy's `linalg` submodule.

 All of these routines are Python front ends to time-tested and highly optimized FORTRAN code.

-+++
-
 ```{code-cell} ipython3
 C = ((10, 5), #matrix C
      (5, 10))
@@ -1054,29 +1039,17 @@ q = C @ p # equilibrium quantities
 q
 ```

-+++
-
 Observe how we can solve for $x = A^{-1} y$ by either via `inv(A) @ y`, or using `solve(A, y)`.

 The latter method uses a different algorithm that is numerically more stable and hence should be the default option.

-NOTE Add more examples. Perhaps Tom has suggestions.
-
-NOTE Perhaps discuss LU decompositions in a very simple way?
-
-
-
-

 ### Further Reading

 The documentation of the `numpy.linalg` submodule can be found [here](https://numpy.org/devdocs/reference/routines.linalg.html).

 More advanced topics in linear algebra can be found [here](https://python.quantecon.org/linear_algebra.html#id5).

-NOTE Add more references.
-
-NOTE Add exercises.

 ## Exercises

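Editorial note: regarding the `inv(A) @ y` versus `solve(A, y)` comparison in the hunk above, a minimal sketch (the matrix and right-hand side below are assumed for illustration, not taken from the lecture):

```python
import numpy as np
from numpy.linalg import inv, solve

# Example system A x = y (values assumed for illustration only)
A = np.array([[10.0, 5.0],
              [5.0, 10.0]])
y = np.array([1.0, 2.0])

x_inv = inv(A) @ y       # explicit inverse
x_solve = solve(A, y)    # preferred: more stable, avoids forming the inverse

print(np.allclose(x_inv, x_solve))  # True
```

In practice `solve` also avoids the extra cost and round-off error of forming the inverse explicitly.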
@@ -1156,8 +1129,6 @@ b =
 \end{bmatrix}
 $$

-+++
-
 ```{code-cell} ipython3
 import numpy as np
 from numpy.linalg import det
@@ -1221,11 +1192,11 @@ vectors $x \in \mathbb{R}^n$

 $$
 \begin{aligned}
-distance(A\hat{x} - b) & \leq distance(Ax - b) \\
-\|A\hat{x} - b\| & \leq \|Ax - b\| \\
-\|A\hat{x} - b\|^2 & \leq \|Ax - b\|^2 \\
-(A\hat{x}_1 - b_1)^2 + (A\hat{x}_2 - b_2)^2 + \cdots + (A\hat{x}_m - b_m)^2 & \leq
-(Ax_1 - b_1)^2 + (Ax_2 - b_2)^2 + \cdots + (Ax_m - b_m)^2
+distance(A\hat{x} - b) & \leq distance(Ax - b) \\
+\iff \|A\hat{x} - b\| & \leq \|Ax - b\| \\
+\iff \|A\hat{x} - b\|^2 & \leq \|Ax - b\|^2 \\
+\iff (A\hat{x}_1 - b_1)^2 + (A\hat{x}_2 - b_2)^2 + \cdots + (A\hat{x}_m - b_m)^2 & \leq
+(Ax_1 - b_1)^2 + (Ax_2 - b_2)^2 + \cdots + (Ax_m - b_m)^2
 \end{aligned}
 $$

@@ -1237,8 +1208,8 @@ $\hat{x}$ is given by:
 ```{math}
 \begin{aligned}
 {A^T} A \hat{x} & = {A^T} b \\
-\hat{x} & = (A^T A)^{-1} A^T b
-\end{aligned}
+\hat{x} & = (A^T A)^{-1} A^T b
+\end{aligned}
 ```

 Consider the general equation of a linear demand curve of a good given by:
@@ -1329,4 +1300,8 @@ plt.show()
 ```

 ```{solution-end}
-```
+```
+
+```{code-cell} ipython3
+
+```
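Editorial note: the least-squares exercise in this diff derives the normal equations $\hat{x} = (A^T A)^{-1} A^T b$. A rough sketch of that result in code (the data values below are assumed purely for illustration), checking the explicit formula against NumPy's `lstsq` routine:

```python
import numpy as np

# Small overdetermined system A x = b (values assumed for illustration only)
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.0])

# Normal equations: x_hat = (A^T A)^{-1} A^T b
x_hat_normal = np.linalg.inv(A.T @ A) @ A.T @ b

# Same solution via NumPy's least-squares routine (numerically preferable)
x_hat_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

print(np.allclose(x_hat_normal, x_hat_lstsq))  # True
```

The `lstsq` route is generally preferred because it avoids forming $A^T A$, which can be badly conditioned.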
