In this section we solve for the fixed point of the law of motion for capital in the setting of the [Solow growth model](https://en.wikipedia.org/wiki/Solow%E2%80%93Swan_model).

We will inspect the fixed point visually, solve it by successive approximation, and then apply Newton's method to achieve faster convergence.

First, let's compute the fixed point using successive approximation.

In this case, successive approximation involves repeatedly updating capital
using the law of motion until it converges.
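
Here is a minimal sketch of such a loop for a generic update map `g`; the function name `compute_fixed_point`, the tolerance default, and the parameter values in the commented example are illustrative assumptions, not the lecture's exact code.

```python
def compute_fixed_point(g, k_0, tol=1e-6, max_iter=10_000):
    # Successive approximation: repeatedly apply the law of motion k <- g(k)
    # until two successive iterates are within `tol` of each other.
    k = k_0
    for _ in range(max_iter):
        k_new = g(k)
        if abs(k_new - k) < tol:
            break
        k = k_new
    return k_new

# Example usage with a Solow-style law of motion (parameter values assumed):
# g = lambda k: 0.3 * 2.0 * k**0.3 + (1 - 0.4) * k   # s*A*k**alpha + (1-delta)*k
# k_star_approx = compute_fixed_point(g, k_0=1.0)
```
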
Here's a time series from a particular choice of $k_0$.

We can see that Newton's method converges faster than successive approximation.

The above fixed-point calculation is connected to root-finding because the computation of a fixed point of $g$ is equivalent to finding a root of $f(x) = g(x) - x$.

The formula [](motivation) can be rewritten in terms of $f(x)$.

This root-finding form of the iteration is also the one used more often in practice.
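
Concretely, the Newton update for a root of $f$ is

```{math}
:label: oneD-newton

x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}
```
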
The following code implements the iteration [](oneD-newton)

(first_newton_attempt)=
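
Below is a minimal sketch of one way to implement this iteration, using `jax.grad` to compute $f'$; the routine name `newton`, the defaults, and the Solow parameter values are illustrative assumptions rather than the lecture's exact code.

```python
import jax

def newton(f, x_0, tol=1e-7, max_iter=1_000):
    # Newton's method: iterate x <- x - f(x) / f'(x) until convergence,
    # with the derivative f' computed by JAX auto-differentiation.
    x = x_0
    f_prime = jax.grad(f)
    for _ in range(max_iter):
        x_new = x - f(x) / f_prime(x)
        if abs(x_new - x) < tol:
            break
        x = x_new
    return x_new

# Applying it to the Solow fixed-point problem f(k) = g(k) - k
# (parameter values here are assumptions for illustration):
s, A_prod, alpha, delta = 0.3, 2.0, 0.3, 0.4
f = lambda k: s * A_prod * k**alpha + (1 - delta) * k - k
k_star_approx_newton = newton(f, x_0=0.8)
k_star_approx_newton
```
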
The result confirms the convergence we saw in the graphs above: a very accurate result is reached with only 5 iterations.

## Multivariate Newton’s Method

In this section, we introduce a two-good problem, present a
visualization of the problem, and solve for the equilibrium of the two-good market
using both a root finder in `SciPy` and Newton's method.

We then expand the idea to a larger market with 5000 goods and compare the
performance of the two methods again, showing a significant performance gain
from using Newton's method.

### A Two Goods Market Equilibrium

Before moving to higher dimensional settings, let's compute the market
equilibrium of a two-good problem.

We first consider a market for two related products, good 0 and good 1, with
price vector $p = (p_0, p_1)$

Supply of good $i$ at price $p$,

$$
q^s_i(p) = b_i \sqrt{p_i}
$$
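
Demand for good $i$ at price $p$ takes the form

$$
q^d_i(p) = \exp(-(Ap)_i) + c_i
$$

where $A$ and $c$ are parameters, and the excess demand function is

$$
e_i(p) = q^d_i(p) - q^s_i(p)
$$

(These expressions match the excess demand function `e(p, A, b, c)` implemented in code later in this section.)
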

We see the black contour line of zero, which tells us when $e_i(p)=0$.

For a price vector $p$ such that $e_i(p)=0$ we know that good $i$ is in equilibrium (demand equals supply).

If these two contour lines cross at some price vector $p^*$, then $p^*$ is an equilibrium price vector.
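
One way to compute $p^*$ is with a root finder from `SciPy`; below is a minimal sketch, in which the values of $A$, $b$ and $c$ are illustrative assumptions and `e` is the excess demand function defined above.

```python
import numpy as np
from scipy.optimize import root

# Illustrative parameter values (assumed, not the lecture's data)
A = np.array([[0.5, 0.4],
              [0.8, 0.2]])
b = np.ones(2)
c = np.ones(2)

init_p = np.ones(2)                       # initial guess for the price vector
solution = root(lambda p: e(p, A, b, c), init_p, method='hybr')
p = solution.x

# Verify that excess demand is (close to) zero at the computed prices
np.max(np.abs(e(p, A, b, c)))
```
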

#### Using Newton's Method

Now let's compute the equilibrium price using the multivariate version of Newton's method

```{math}
:label: multi-newton

p_{n+1} = p_n - J_e(p_n)^{-1} e(p_n)
```

This is a multivariate version of [](oneD-newton).

(Here $J_e(p_n)$ is the Jacobian of $e$ evaluated at $p_n$.)

The iteration starts from some initial guess of the price vector $p_0$.

Here, instead of coding the Jacobian by hand, we use the `jax.jacobian()` function to auto-differentiate and calculate the Jacobian.

With only slight modification, we can generalize [our previous attempt](first_newton_attempt) to multi-dimensional problems

The excess demand function `e` is implemented with JAX operations so that it can be auto-differentiated:

```python
def e(p, A, b, c):
    # Excess demand: demand exp(-Ap) + c minus supply b * sqrt(p)
    return jnp.exp(- jnp.dot(A, p)) + c - b * jnp.sqrt(p)
```
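
As a sketch of how [our previous attempt](first_newton_attempt) generalizes, the following implements the multivariate iteration [](multi-newton) with `jax.jacobian`; the routine name `newton_multi`, the convergence settings, and the parameter values are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

def newton_multi(f, x_0, tol=1e-5, max_iter=50):
    # Multivariate Newton: iterate x <- x - J(x)^{-1} f(x), where the
    # Jacobian J is obtained by JAX auto-differentiation.
    x = x_0
    jac = jax.jacobian(f)
    for _ in range(max_iter):
        # Solve J(x) d = f(x) instead of forming the inverse explicitly
        x_new = x - jnp.linalg.solve(jac(x), f(x))
        if jnp.max(jnp.abs(x_new - x)) < tol:
            break
        x = x_new
    return x_new

# Example usage on the two-good market (parameter values assumed):
A = jnp.array([[0.5, 0.4],
               [0.8, 0.2]])
b = jnp.ones(2)
c = jnp.ones(2)
init_p = jnp.ones(2)
p_star = newton_multi(lambda p: e(p, A, b, c), init_p)
```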