Newton's Method/Sequence of Approximations Converges Quadratically




Theorem

Let $\map f x$ be a real function.

Let $\alpha$ be a root of $\map f x$.

Let $\epsilon > 0$ be a positive real number, and $I = \closedint {\alpha - \epsilon} {\alpha + \epsilon}$.

Let $f$ have a continuous second derivative on $I$.

Let $\ds M = \map \sup {\size {\frac {\map {f''} s} {\map {f'} t} } }$ over all $s, t \in I$.

For this to be well-defined, it is also necessary that $\map {f'} x$ is non-vanishing on $I$.

Suppose $\epsilon < \dfrac 2 M$.

Then the sequence generated by Newton's Method, starting with any initial guess $x_0 \in I$, converges to $\alpha$ with order $2$.
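
As a purely illustrative aside (not part of the statement), the following minimal Python sketch checks the hypotheses on an assumed concrete example $f(x) = x^2 - 2$, whose root is $\alpha = \sqrt 2$, with the assumed choice $\epsilon = 0.5$, and runs a few Newton steps:

```python
import math

# Assumed example (not from the statement above): f(x) = x^2 - 2, root alpha = sqrt(2).
f = lambda x: x * x - 2
fp = lambda x: 2 * x             # f'
fpp = lambda x: 2.0              # f'' (constant)

alpha = math.sqrt(2)
eps = 0.5                        # half-width of I = [alpha - eps, alpha + eps]

# M = sup |f''(s) / f'(t)| over s, t in I.  Here f'' is constant and |f'(t)| = 2t
# is smallest at the left endpoint of I, so the supremum is attained there.
M = fpp(alpha) / abs(fp(alpha - eps))
assert eps < 2 / M               # the hypothesis of the theorem

x = alpha - 0.4                  # any initial guess x_0 in I
for n in range(6):
    print(n, x, abs(alpha - x))  # the error roughly squares at each step
    x = x - f(x) / fp(x)         # Newton's Method: x_{n+1} = x_n - f(x_n)/f'(x_n)
```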


Proof

Suppose that the sequence has been generated up to $x_n$.

Suppose also that $x_n \in I$.

\(\ds \map f {x_n} + \paren {\alpha - x_n} \map {f'} {x_n} + \frac {\map {f''} {\zeta_n} } 2 \paren {\alpha - x_n}^2\) \(=\) \(\ds \map f \alpha\) Taylor's Theorem, for some $\zeta_n$ strictly between $x_n$ and $\alpha$
\(\ds \) \(=\) \(\ds 0\) Definition of Root of Function
\(\ds \frac {\map f {x_n} } {\map {f'} {x_n} } + \paren {\alpha - x_n} + \frac {\map {f''} {\zeta_n} } {2 \map {f'} {x_n} } \paren {\alpha - x_n}^2\) \(=\) \(\ds 0\) dividing by $\map {f'} {x_n} \ne 0$
\(\ds \alpha - \paren {x_n - \frac {\map f {x_n} } {\map {f'} {x_n} } }\) \(=\) \(\ds -\frac {\map {f''} {\zeta_n} } {2 \map {f'} {x_n} } \paren {\alpha - x_n}^2\)
\(\ds \alpha - x_{n + 1}\) \(=\) \(\ds -\frac {\map {f''} {\zeta_n} } {2 \map {f'} {x_n} } \paren {\alpha - x_n}^2\) Newton's Method
\(\ds \size {\alpha - x_{n + 1} }\) \(=\) \(\ds \size {\frac {\map {f''} {\zeta_n} } {2\map {f'} {x_n} } \paren {\alpha - x_n}^2}\)
\(\ds \) \(\le\) \(\ds \frac M 2 \size {\paren {\alpha - x_n} }^2\) Definition of $M$
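
As an illustrative check only (again assuming the example $f(x) = x^2 - 2$ and $\epsilon = 0.5$), the bound just derived can be observed numerically; the sketch below confirms $\size {\alpha - x_{n + 1} } \le \dfrac M 2 \size {\alpha - x_n}^2$ along a Newton trajectory:

```python
import math

# Assumed example, as above: f(x) = x^2 - 2, alpha = sqrt(2), eps = 0.5.
f = lambda x: x * x - 2
fp = lambda x: 2 * x
alpha, eps = math.sqrt(2), 0.5
M = 2.0 / abs(2 * (alpha - eps))      # sup |f''(s)/f'(t)| over s, t in I

x = alpha - 0.4
for n in range(4):                    # only a few steps: beyond that the error is at machine precision
    x_next = x - f(x) / fp(x)         # Newton step
    lhs = abs(alpha - x_next)
    rhs = (M / 2) * abs(alpha - x) ** 2
    print(n, lhs, rhs, lhs <= rhs)    # the bound holds at every step
    x = x_next
```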

First:

\(\ds \size {\alpha - x_{n + 1} }\) \(\le\) \(\ds \frac {M \epsilon} 2 \size {\alpha - x_n}\) as $x_n \in I$ implies $\size {\alpha - x_n} \le \epsilon$
\(\ds \) \(<\) \(\ds \size {\alpha - x_n}\) as $\epsilon < \dfrac 2 M$ by hypothesis

Therefore $x_{n + 1} \in I$.

But as $x_0 \in I$, by Principle of Mathematical Induction, every $x_i \in I$.
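
The induction can also be watched numerically; the following sketch (same assumed example $f(x) = x^2 - 2$, $\epsilon = 0.5$) starts near the right endpoint of $I$ and checks that no iterate leaves the interval:

```python
import math

# Assumed example, as above: f(x) = x^2 - 2, alpha = sqrt(2), eps = 0.5.
f = lambda x: x * x - 2
fp = lambda x: 2 * x
alpha, eps = math.sqrt(2), 0.5

x = alpha + 0.45                      # an initial guess near the right end of I
for n in range(8):
    assert alpha - eps <= x <= alpha + eps, "an iterate left I"
    x = x - f(x) / fp(x)              # Newton step
print("all iterates stayed in I")
```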


Next, apply the Principle of Recursive Definition to define the sequence:

\(\ds \epsilon_0\) \(=\) \(\ds \size {\alpha - x_0}\)
\(\ds \epsilon_{n + 1}\) \(=\) \(\ds \frac M 2 \epsilon_n^2\)

Trivially:

$\size {\alpha - x_0} \le \epsilon_0$

Assuming $\size {\alpha - x_n} \le \epsilon_n$, it follows that:

\(\ds \size {\alpha - x_{n + 1} }\) \(\le\) \(\ds \frac M 2 \size {\alpha - x_n}^2\)
\(\ds \) \(\le\) \(\ds \frac M 2 \epsilon_n^2\)
\(\ds \) \(=\) \(\ds \epsilon_{n + 1}\)

By Principle of Mathematical Induction, every $\size {\alpha - x_n} \le \epsilon_n$.
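
As an illustration (same assumed example), the majorants $\epsilon_n$ can be computed alongside the actual errors, confirming $\size {\alpha - x_n} \le \epsilon_n$ for the first few $n$:

```python
import math

# Assumed example, as above: f(x) = x^2 - 2, alpha = sqrt(2), eps = 0.5.
f = lambda x: x * x - 2
fp = lambda x: 2 * x
alpha, eps = math.sqrt(2), 0.5
M = 2.0 / abs(2 * (alpha - eps))       # as in the theorem statement

x = alpha - 0.4
e = abs(alpha - x)                     # epsilon_0 = |alpha - x_0|
for n in range(5):
    print(n, abs(alpha - x), e, abs(alpha - x) <= e)
    x = x - f(x) / fp(x)               # next Newton iterate
    e = (M / 2) * e * e                # epsilon_{n+1} = (M/2) * epsilon_n^2
```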

Since $\epsilon_0 = \size {\alpha - x_0} \le \epsilon < \dfrac 2 M$, it follows that $\dfrac {M \epsilon_0} 2 < 1$, so $\epsilon_n \to 0$ and hence $x_n \to \alpha$.

Furthermore:

\(\ds \lim_{n \mathop \to \infty} \frac {\epsilon_{n + 1} } {\epsilon_n^2}\) \(=\) \(\ds \lim_{n \mathop \to \infty} \frac M 2\) Definition of $\sequence {\epsilon_n}$
\(\ds \) \(=\) \(\ds \frac M 2\)

Therefore, the sequence $\sequence {x_n}$ converges to $\alpha$ with order $2$, by definition.

$\blacksquare$
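
As a final illustrative aside (same assumed example, not part of the proof), order-$2$ convergence is visible numerically: the ratio $\dfrac {\size {\alpha - x_{n + 1} } } {\size {\alpha - x_n}^2}$ settles near a positive constant, here $\size {\dfrac {\map {f''} \alpha} {2 \map {f'} \alpha} } = \dfrac 1 {2 \sqrt 2}$:

```python
import math

# Assumed example, as above: f(x) = x^2 - 2, alpha = sqrt(2).
f = lambda x: x * x - 2
fp = lambda x: 2 * x
alpha = math.sqrt(2)

x = alpha - 0.4
for n in range(4):                     # a few steps, before double precision runs out
    x_next = x - f(x) / fp(x)
    ratio = abs(alpha - x_next) / abs(alpha - x) ** 2
    print(n, ratio)                    # tends to 1/(2*sqrt(2)) ~ 0.3536
    x = x_next
print("limit:", 1 / (2 * math.sqrt(2)))
```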

