# Locating Roots using Iteration (including Newton-Raphson)

## Locating Roots

Recall that locating roots of a curve means to find the $x$-coordinates of the points at which the curve crosses the $x$-axis. To do this analytically means to do it by solving equations using algebra. We know how to find roots analytically for some functions, like quadratics, for example. That is, by solving $y=ax^2+bx+c=0$ for $x$. We can also do it for some cubics and some trigonometric equations etc. However, there are many equations where we cannot find roots analytically and we must do it numerically. This is where we make ‘guesses’ for the roots and improve them successively via some calculation. This is sometimes known as iteration.

### Change of Sign

We can make a first guess at a root of the equation $y=f(x)$ (where $f(x)=0$ cannot be solved analytically) if we see a change in sign of $f(x)$ between two given $x$-values. For example, consider the graph of $f(x)=\sin(e^x)$ – we cannot find its roots analytically. We can see from the graph that $f(1)=0.4108$ and $f(1.5)=-0.9735$ (using radians). It follows that since $f(1)>0$ and $f(1.5)<0$, the curve has a root between $x=1$ and $x=1.5$. Similarly, since $f(1.5)<0$ and $f(2)>0$, the curve has a root between $x=1.5$ and $x=2$. See Example 1 for successive interval narrowing to get an approximation to the root.
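The sign check above can be carried out quickly with a few lines of Python (a sketch only; the function name `f` is our own choice):

```python
import math

def f(x):
    # f(x) = sin(e^x), evaluated in radians
    return math.sin(math.exp(x))

# A change of sign between two x-values indicates a root in between
# (for a continuous function).
print(f(1.0))   # ≈ 0.4108, positive
print(f(1.5))   # ≈ -0.9735, negative → root between 1 and 1.5
print(f(2.0))   # positive            → root between 1.5 and 2
```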

This curve also illustrates some situations to be aware of. Note that $f(0.5)>0$ and $f(1)>0$, and there is no root between $x=0.5$ and $x=1$. However, $f(1)>0$ and $f(2)>0$, and there are two roots in between. Hence, if the curve has the same sign at two points, there could be no roots or an even number of roots between them. Likewise, $f(1.5)<0$ and $f(2.9)<0$, yet there are 4 roots between $x=1.5$ and $x=2.9$. Similarly, $f(2.9)<0$ and $f(3)>0$, and there is one root between $x=2.9$ and $x=3$; whereas $f(2.5)<0$ and $f(3)>0$, and there are three roots between $x=2.5$ and $x=3$. Hence, if the curve has different signs at two points, there could be any odd number of roots in between.

These examples illustrate the importance of choosing intervals sufficiently small to indicate the presence of a single root. The final situation to be aware of is the case where a curve has different signs at two points but has no roots in between. This can happen if the curve is not continuous: for example, $f(x)=\frac{1}{x}$ satisfies $f(-1)<0$ and $f(1)>0$ but has no root between $x=-1$ and $x=1$. In such cases, a change of sign may be due to an asymptote rather than a root.

To summarise: if the function $f(x)$ is continuous on a sufficiently small interval $[a,b]$, and $f(a)$ and $f(b)$ have opposite signs, then $f(x)$ has a root on the interval $[a,b]$.
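This summary leads directly to successive interval narrowing by bisection: repeatedly halve the interval, keeping whichever half still shows a sign change. A minimal sketch in Python (the function name `bisect` and the tolerance `tol` are illustrative choices):

```python
import math

def bisect(f, a, b, tol=1e-4):
    """Narrow [a, b] by halving until its width is below tol.
    Requires f continuous with f(a) and f(b) of opposite signs."""
    if f(a) * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    while b - a > tol:
        m = (a + b) / 2
        if f(a) * f(m) <= 0:    # sign change in the left half
            b = m
        else:                    # sign change in the right half
            a = m
    return (a + b) / 2

# Root of sin(e^x) between x = 1 and x = 1.5
root = bisect(lambda x: math.sin(math.exp(x)), 1, 1.5)
```

Here the exact root happens to be $x=\ln(\pi)\approx 1.1447$, since $\sin(e^x)=0$ when $e^x=\pi$.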

## Iteration - Recursive Formula

An alternative way to find roots is to rewrite the equation $f(x)=0$ in the form $x=g(x)$. Finding the root $x$ that satisfies $f(x)=0$ is then the same as finding the $x$ that satisfies the equation $x=g(x)$. On a graph, this is the point where the curve $y=g(x)$ crosses the straight line $y=x$. By setting up the recursive formula $x_{n+1}=g(x_n)$, the solution can be found iteratively. We choose a starting value $x_0$; setting $n=0$ in the formula gives $x_1=g(x_0)$, our next guess for the root. Setting $n=1$ gives the following guess, $x_2=g(x_1)$. Recursively, $x_3=g(x_2)$ and $x_4=g(x_3)$ are our next guesses for the root, and so on. With a well-chosen $x_0$, this formula may converge to a root in one of two ways.
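As a sketch, here is the recursive formula in Python, using the rearrangement $g(x)=\ln(x)+3$ of $e^{x-3}-x=0$ (the same function as Example 2); the starting value and iteration count are illustrative choices:

```python
import math

def g(x):
    return math.log(x) + 3

x = 2.0             # x_0, the initial guess
for n in range(20):
    x = g(x)        # x_{n+1} = g(x_n)
# x now approximates a solution of x = g(x),
# which is a root of e^(x-3) - x = 0
```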

This is an example of a staircase diagram. We put the first choice $x_0$ into $g(x)$ to find $x_1$. This becomes our next input to $g(x)$, so we move horizontally to the line $y=x$ to pick up this value, then vertically to the curve to find $g(x_1)$, and so on, converging iteratively to the solution of $x=g(x)$, that is, the original root.

This is an example of a cobweb diagram. As before, we move vertically to find $g(x)$ values and horizontally to find the new corresponding $x$-values. Due to the shape of the curve, the convergence occurs by going around the solution creating a cobweb-like effect as opposed to the staircase effect seen in the other diagram.

In these diagrams, the choice of $x_0$ does not affect the convergence: all choices of $x_0$ result in staircase/cobweb convergence towards the root. This depends, however, on the shape of the curve relative to the straight line. The diagrams below show how the curve and the choice of $x_0$ can instead result in staircase/cobweb divergence.

Note that there may be several choices for rewriting $f(x)=0$ in the form $x=g(x)$ when finding the recursive formula $x_{n+1}=g(x_n)$. The choice of $g(x_n)$, as well as the choice of $x_0$, may affect the convergence of the iteration. See Example 2. Even when the sequence converges, the location of the root needs to be justified – see Example 2 for this too.
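A quick sketch of this sensitivity, using the function from Example 2: the same equation $e^{x-3}-x=0$ rearranged in two ways gives iterations that settle on different roots (the starting value $x_0=2$ follows the example):

```python
import math

x = 2.0
for n in range(50):
    x = math.log(x) + 3       # g(x) = ln(x) + 3
# settles on the larger root, near x ≈ 4.505

y = 2.0
for n in range(50):
    y = math.exp(y - 3)       # g(x) = e^(x-3)
# settles on the smaller root, near x ≈ 0.052
```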

## Locating Roots using the Newton-Raphson Method

The Newton-Raphson method is also an iterative procedure for locating roots. To solve $f(x)=0$, Newton-Raphson uses a specific recursive formula:

$x_{n+1}=x_n-\frac{f(x_n)}{f'(x_n)}$

Notice the difference between this formula, which uses the derivative $f'(x)$, and the iterations above, which use some rearrangement $g(x)$. We revert to looking for the root itself rather than the intersection of two curves. To do this, the Newton-Raphson method works using tangents – root guesses are updated by finding where successive tangents cross the $x$-axis.

The Newton-Raphson formula is essentially a rearrangement of the formula for a gradient. Consider the tangent at $x_0$, which crosses the $x$-axis at $x_1$. Its gradient (the derivative of the curve at $x_0$) is rise over run: $f'(x_0)=\frac{f(x_0)-0}{x_0-x_1}$. Rearranging gives $x_1=x_0-\frac{f(x_0)}{f'(x_0)}$. This says that we improve the guess for the root by stepping back from $x_0$ by the run: since the gradient is rise over run, the run is rise over gradient, namely $\frac{f(x_0)}{f'(x_0)}$. See Example 3 for the use of Newton-Raphson in a modelling example where we locate a stationary point rather than a root.
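A minimal Newton-Raphson sketch in Python, applied to $f(x)=\sin(e^x)$ from the change-of-sign section; the derivative $f'(x)=e^x\cos(e^x)$ follows from the chain rule, and the starting value and iteration count are illustrative choices:

```python
import math

def f(x):
    return math.sin(math.exp(x))

def f_prime(x):
    # chain rule: d/dx sin(e^x) = e^x * cos(e^x)
    return math.exp(x) * math.cos(math.exp(x))

x = 1.0                         # x_0, chosen away from stationary points
for n in range(6):
    x = x - f(x) / f_prime(x)   # x_{n+1} = x_n - f(x_n)/f'(x_n)
# x approximates the root ln(pi) ≈ 1.1447
```

Compared with interval bisection, far fewer iterations are needed here: each Newton-Raphson step roughly doubles the number of correct digits when the starting value is good.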

Since the formula divides by the derivative, the Newton-Raphson method fails when the derivative is 0, that is, when locating roots near a stationary point. This makes sense: a tangent at a stationary point is flat and does not cross the $x$-axis, so we cannot use it to update the root guess. It follows that choosing $x_0$ at a stationary point will result in the method failing. Where $f'(x)$ is merely small, division by a small number creates a large jump in the root guess, and the iteration can move far from the root or converge slowly. Hence, the choice of $x_0$ should be made away from any stationary points.
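A small illustration of this failure mode (the function $f(x)=x^2-1$, the helper name `newton_step`, and the guard are illustrative choices, not from the text): starting at the stationary point $x_0=0$, where $f'(0)=0$, the update cannot be computed.

```python
def newton_step(x, f, f_prime):
    slope = f_prime(x)
    if slope == 0:
        # the tangent is horizontal, so it never crosses the x-axis
        # and Newton-Raphson cannot update the guess
        raise ZeroDivisionError("f'(x) = 0: Newton-Raphson fails here")
    return x - f(x) / slope

f = lambda x: x**2 - 1     # roots at x = 1 and x = -1
f_prime = lambda x: 2 * x  # stationary point at x = 0

# newton_step(0.0, f, f_prime) raises an error, while
# newton_step(2.0, f, f_prime) returns 1.25, a better guess for the root at 1
```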

## Examples of Locating Roots using Iteration

### Example 1

Use successive interval narrowing to find the root of $f(x)=\sin(e^x)$ that is between $x=1$ and $x=1.5$ to 2 decimal places. (Make sure you are using radians.)

### Example 2

Consider the function $f(x)=e^{x-3}-x$.

1. Show that $f(x)=0$ can be written as $x=\ln(x)+3$.
2. Apply the formula $x_{n+1}=\ln(x_n)+3$ with $x_0=2$. Does the sequence diverge/converge to a root? If the sequence converges, is it staircase or cobweb convergent and what is the root, with justification, to 3 decimal places? Experiment with other $x_0$.
3. Now apply the formula $x_{n+1}=e^{x_n-3}$ with $x_0=2$. Does the sequence diverge/converge to a root? If the sequence converges, is it staircase or cobweb convergent and what is the root, with justification, to 2 decimal places? Experiment with other $x_0$.

### Example 3

The shape of a hill can be modelled by the function $y=5\sin(0.1x)-10\cos(0.2x)$ for $7\leq x\leq 24$.

1. Show that $y$ has a stationary point on the interval $[10,20]$.
2. Perform 3 iterations of the Newton-Raphson method, starting with $x_0=15$, to find the stationary point.
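Since a stationary point is sought, Newton-Raphson is applied to the derivative $y'$ rather than to $y$ itself. A hedged sketch of the iteration (the derivatives below follow from the chain rule, and the iteration count matches the question):

```python
import math

def y_prime(x):
    # y' = 0.5*cos(0.1x) + 2*sin(0.2x)
    return 0.5 * math.cos(0.1 * x) + 2 * math.sin(0.2 * x)

def y_double_prime(x):
    # y'' = -0.05*sin(0.1x) + 0.4*cos(0.2x)
    return -0.05 * math.sin(0.1 * x) + 0.4 * math.cos(0.2 * x)

x = 15.0    # x_0 from the question
for n in range(3):
    # Newton-Raphson on y': x_{n+1} = x_n - y'(x_n)/y''(x_n)
    x = x - y_prime(x) / y_double_prime(x)
# x approximates the stationary point near x ≈ 15.708
```

In fact $y'(5\pi)=0.5\cos(\pi/2)+2\sin(\pi)=0$, so the stationary point here is exactly $x=5\pi\approx 15.708$.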