Commit 9a93699

misc

1 parent 093e12d commit 9a93699

1 file changed: +29 −25 lines changed

lectures/newton_method.md
@@ -41,11 +41,11 @@ In other words, an equilibrium is a root of the excess demand function.
 There are various computational techniques for solving for fixed points and
 roots.

-In this lecture we study a very important one called [Newton's
+In this lecture we study an important gradient-based technique called [Newton's
 method](https://en.wikipedia.org/wiki/Newton%27s_method).

 Newton's method does not always work but, in situations where it does,
-convergence is often very fast when compared to other methods.
+convergence is often fast when compared to other methods.

 The lecture will apply Newton's method in one-dimensional and
 multi-dimensional settings to solve fixed-point and root-finding problems.
@@ -74,7 +74,7 @@ plt.rcParams["figure.figsize"] = (10, 5.7)

 ## Fixed Point Computation Using Newton's Method

-In this section, we will solve the fixed point of the law of motion for capital in the setting of the [Solow growth model](https://en.wikipedia.org/wiki/Solow%E2%80%93Swan_model).
+In this section we solve for the fixed point of the law of motion for capital in the setting of the [Solow growth model](https://en.wikipedia.org/wiki/Solow%E2%80%93Swan_model).

 We will inspect the fixed point visually, solve it by successive approximation, and then apply Newton's method to achieve faster convergence.
@@ -181,8 +181,8 @@ plt.show()

 First, let's compute the fixed point using successive approximation.

-This elementary method simply involves repeatedly updating capital using the
-law of motion until it converges.
+In this case, successive approximation involves repeatedly updating capital
+using the law of motion until it converges.

 Here's a time series from a particular choice of $k_0$.
@@ -242,7 +242,7 @@ $$
 x_1=\frac{g\left(x_0\right)-g^{\prime}\left(x_0\right) x_0}{1-g^{\prime}\left(x_0\right)}
 $$

-Generalising the process above, Newton's method iterates on
+Generalising the process above, Newton's fixed-point method iterates on

 ```{math}
 :label: newtons_method
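A minimal sketch of this fixed-point iteration, with `g` and `g_prime` as illustrative stand-ins for the map and its derivative:

```python
# Newton's fixed-point iteration:
# x_{t+1} = (g(x_t) - g'(x_t) x_t) / (1 - g'(x_t))
def newton_fixed_point(g, g_prime, x_0, tol=1e-7, max_iter=100):
    x = x_0
    for _ in range(max_iter):
        x_new = (g(x) - g_prime(x) * x) / (1 - g_prime(x))
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("no convergence within max_iter iterations")
```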
@@ -315,18 +315,18 @@ params = create_solow_params()
 plot_trajectories(params)
 ```

-We can see that Newton's Method reaches convergence faster than the successive approximation.
+We can see that Newton's method converges faster than successive approximation.

-The above fixed-point calculation can be seen as a root-finding problem since the computation of a fixed point can be seen as approximating $x^*$ iteratively such that $g(x^*) - x^* = 0$.
+The above fixed-point calculation is connected to root-finding because the computation of a fixed point of $g$ is equivalent to finding a root of $f(x) = g(x)-x$.

-We the formula [](motivation) can be rewritten in terms of $f(x)$
+The formula [](motivation) can be rewritten in terms of $f(x)$

 $$
 \hat{f}(x) \approx f\left(x_0\right)+f^{\prime}\left(x_0\right)\left(x-x_0\right)
 $$

-Assuming $f(x) = g(x) - x$, set $\hat{f}(x_1) = 0$ and solve for $x_1$ to get
+With $f(x) = g(x) - x$, set $\hat{f}(x_1) = 0$ and solve for $x_1$ to get

 $$
 x_1 = x_0 - \frac{ f(x_0) }{ f'(x_0) },
@@ -342,8 +342,6 @@ x_{t+1} = x_t - \frac{ f(x_t) }{ f'(x_t) },
 \quad x_0 \text{ given}
 ```

-Root-finding formula is also a more frequently used iteration.
-
 The following code implements the iteration [](oneD-newton)

 (first_newton_attempt)=
@@ -383,23 +381,26 @@ k_star_approx_newton

 The result confirms the descent we saw in the graphs above: a very accurate result is reached with only 5 iterations.

-The multi-dimensional variant will be left as an [exercise](newton_ex1).
-
-By observing the formula of Newton's method, it is easy to see the possibility to implement Newton's method using Jacobian when we move up the ladder to higher dimensions.

-This naturally leads us to use Newton's method to solve multi-dimensional problems for which we will use the powerful auto-differentiation functionality in JAX to do intricate calculations.

 ## Multivariate Newton’s Method

-In this section, we will first introduce a two-good problem, present a visualization of the problem, and solve the equilibrium of the two-good market using both a root finder in `SciPy` and Newton's method.
+In this section, we introduce a two-good problem, present a
+visualization of the problem, and solve for the equilibrium of the two-good market
+using both a root finder in `SciPy` and Newton's method.
+
+We then expand the idea to a larger market with 5000 goods and compare the
+two methods again, showing a significant performance improvement from
+Newton's method.

-We will then expand the idea to a larger market with 5000 goods and compare the performance of the two methods again to show a significant improvement in performance using Netwon's method.

 ### A Two Goods Market Equilibrium

-Before moving to higher dimensional settings, let's compute the market equilibrium of a two-good problem.
+Before moving to higher-dimensional settings, let's compute the market
+equilibrium of a two-good problem.

-We first consider a market for two related products, good 0 and good 1, with price vector $p = (p_0, p_1)$
+We first consider a market for two related products, good 0 and good 1, with
+price vector $p = (p_0, p_1)$

 Supply of good $i$ at price $p$,
@@ -556,7 +557,6 @@ We see the black contour line of zero, which tells us when $e_i(p)=0$.

 For a price vector $p$ such that $e_i(p)=0$ we know that good $i$ is in equilibrium (demand equals supply).

-
 If these two contour lines cross at some price vector $p^*$, then $p^*$ is an equilibrium price vector.
@@ -645,17 +645,21 @@ np.max(np.abs(e(p, A, b, c)))

 #### Using Newton's Method

-Now let's use Newton's method to compute the equilibrium price using the multivariate version of Newton's method:
+Now let's compute the equilibrium price using the multivariate version of Newton's method

 ```{math}
 :label: multi-newton

 p_{n+1} = p_n - J_e(p_n)^{-1} e(p_n)
 ```

-starting from some initial guess of the price vector $p_0$. (Here $J_e(p_n)$ is the Jacobian of $e$ evaluated at $p_n$.)
+This is a multivariate version of [](oneD-newton).
+
+(Here $J_e(p_n)$ is the Jacobian of $e$ evaluated at $p_n$.)
+
+The iteration starts from some initial guess of the price vector $p_0$.

-Instead of coding Jacobian by hand, We use the `jax.jacobian()` function to auto-differentiate and calculate Jacobian.
+Here, instead of coding the Jacobian by hand, we use the `jax.jacobian()` function to auto-differentiate and calculate the Jacobian.

 With only slight modification, we can generalize [our previous attempt](first_newton_attempt) to multi-dimensional problems
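One way this generalization can be sketched (the lecture's actual implementation may differ; the function name and tolerances below are illustrative):

```python
import jax
import jax.numpy as jnp

def newton_multivariate(f, x_0, tol=1e-5, max_iter=50):
    # Multivariate Newton: x_{n+1} = x_n - J_f(x_n)^{-1} f(x_n)
    f_jac = jax.jacobian(f)      # auto-differentiated Jacobian of f
    x = x_0
    for _ in range(max_iter):
        # Solve the linear system J_f(x) d = f(x) rather than inverting J_f
        x_new = x - jnp.linalg.solve(f_jac(x), f(x))
        if jnp.allclose(x_new, x, atol=tol):
            return x_new
        x = x_new
    raise RuntimeError("no convergence within max_iter iterations")
```

Once the excess demand function `e` and its parameters are defined, a call such as `newton_multivariate(lambda p: e(p, A, b, c), init_p)` would iterate [](multi-newton) from the initial guess `init_p`.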
@@ -684,7 +688,7 @@ def e(p, A, b, c):
     return jnp.exp(- jnp.dot(A, p)) + c - b * jnp.sqrt(p)
 ```

-We find the convergence is reached in 4 steps
+We find that the algorithm terminates in 4 steps

 ```{code-cell} python3
 %%time