@@ -412,19 +412,19 @@ $\beta_0$ (the OLS parameter estimates might be a reasonable
guess), then

1. Use the updating rule to iterate the algorithm
-
+
$$
\boldsymbol{\beta}_{(k+1)} = \boldsymbol{\beta}_{(k)} - H^{-1}(\boldsymbol{\beta}_{(k)})G(\boldsymbol{\beta}_{(k)})
$$
where:
-
+
$$
\begin{aligned}
G(\boldsymbol{\beta}_{(k)}) &= \frac{d \log \mathcal{L}(\boldsymbol{\beta}_{(k)})}{d \boldsymbol{\beta}_{(k)}} \\
H(\boldsymbol{\beta}_{(k)}) &= \frac{d^2 \log \mathcal{L}(\boldsymbol{\beta}_{(k)})}{d \boldsymbol{\beta}_{(k)} d \boldsymbol{\beta}'_{(k)}}
\end{aligned}
$$
-
+
1. Check whether $|\boldsymbol{\beta}_{(k+1)} - \boldsymbol{\beta}_{(k)}| < tol$
   - If true, then stop iterating and set
     $\hat{\boldsymbol{\beta}} = \boldsymbol{\beta}_{(k+1)}$
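The two steps above can be sketched in a few lines of NumPy. This is a minimal, hypothetical illustration on a one-parameter quadratic log-likelihood (not the lecture's `PoissonRegression` class); the function names `G` and `H` and the example likelihood are assumptions for the sketch:

```python
import numpy as np

def newton_step(β, G, H):
    # One Newton-Raphson update: β_new = β - H(β)^{-1} G(β)
    # (np.linalg.solve avoids forming the inverse explicitly)
    return β - np.linalg.solve(H(β), G(β))

# Hypothetical log-likelihood: log L(β) = -0.5 * (β - 2)^2,
# with gradient G(β) = -(β - 2) and Hessian H(β) = -1
G = lambda β: np.array([-(β[0] - 2.0)])
H = lambda β: np.array([[-1.0]])

β = np.array([0.0])                        # initial guess (e.g. OLS estimates)
for _ in range(1000):                      # cap iterations as a safeguard
    β_new = newton_step(β, G, H)
    if np.all(np.abs(β_new - β) < 1e-3):   # stopping rule from step 2
        β = β_new
        break
    β = β_new

print(β)  # → [2.], the maximizer of the quadratic log-likelihood
```

Because the example log-likelihood is exactly quadratic, a single Newton step lands on the maximizer and the second iteration confirms convergence.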
@@ -506,7 +506,7 @@ def newton_raphson(model, tol=1e-3, max_iter=1000, display=True):
     while np.any(error > tol) and i < max_iter:
         H, G = model.H(), model.G()
         β_new = model.β - (np.linalg.inv(H) @ G)
-        error = β_new - model.β
+        error = np.abs(β_new - model.β)
         model.β = β_new
 
         # Print iterations
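The switch to `np.abs` in this commit matters for the `np.any(error > tol)` test: a signed difference that happens to be negative compares as less than `tol` even when the step is large, so the loop could stop prematurely. A small standalone check (the values here are hypothetical, not from the lecture):

```python
import numpy as np

tol = 1e-3
step = np.array([-0.5, -0.2])        # a large update whose components are both negative

# Signed comparison: no component exceeds tol, so the loop would stop too early
print(np.any(step > tol))            # → False

# Absolute comparison: the step clearly exceeds tol, so iteration continues
print(np.any(np.abs(step) > tol))    # → True
```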
@@ -547,7 +547,7 @@ poi = PoissonRegression(y, X, β=init_β)
```

As this was a simple model with few observations, the algorithm achieved
-convergence in only 6 iterations.
+convergence in only 7 iterations.

You can see that with each iteration, the log-likelihood value increased.
@@ -973,4 +973,4 @@ print(Probit(y, X).fit().summary())
```

```{solution-end}
-```
+```