A popular technique for approximating the square root of a number is to use the tangent line approximation. Say we wish to find $a_s = \sqrt{a}$, and we know that the square root of a neighbouring number $b$ is $b_s = \sqrt{b}$; then we would use the approximation:

$$\sqrt{a} \approx \sqrt{b} + \frac{a-b}{2\sqrt{b}}.$$

This approximation is called the tangent line approximation because it is derived by using the tangent line of the square root function $\sqrt{x}$ to approximate its values. It can also be derived from the Taylor series of the square root function.
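In Julia this is a one-liner; here is a quick sketch (the name tangent_sqrt is my own, chosen for illustration):

# Tangent line approximation to sqrt(a), using a nearby number b whose
# square root is known exactly
tangent_sqrt(a, b) = sqrt(b) + (a - b)/(2*sqrt(b))

tangent_sqrt(2, 1)  # returns 1.5, as in the worked example below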
In this case, you can use Newton's method to refine the approximation. Newton's method is a technique in which we essentially use the tangent line to approximate the zeros of a function. If one of that function's roots happens to be the square root you're searching for, applying Newton's method will give you ever-improving approximations to that square root. One function whose root is $a_s$ and that is easy to compute exactly is $f(x) = x^2 - a$. Newton's method then gives us this scheme for approximating $a_s$:

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)} = x_n - \frac{x_n^2 - a}{2x_n} = \frac{1}{2}\left(x_n + \frac{a}{x_n}\right).$$
A classic example where the tangent line approximation is pretty poor is when we are trying to find $\sqrt{2}$. The tangent line approximation gives:
$$\sqrt{2} \approx 1 + \frac{2-1}{2\sqrt{1}} = 1 + \frac{1}{2} = \frac{3}{2} = 1.5.$$
Of course, to five decimal places $\sqrt{2}$ is 1.41421, so the above approximation is only accurate to one significant figure (or not even that, if you round it to the nearest integer). We can improve on this approximation using Newton's method. Our first iteration is:

$$x_1 = \frac{1}{2}\left(x_0 + \frac{2}{x_0}\right) = \frac{1}{2}\left(\frac{3}{2} + \frac{4}{3}\right) = \frac{17}{12} \approx 1.41667,$$

which is already accurate to three significant figures.
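Running a few iterations of this scheme in Julia shows how quickly the estimate improves; a minimal sketch (newton_sqrt and its default iteration count are my own choices):

# Newton's method for sqrt(a): iterate x -> (x + a/x)/2
function newton_sqrt(a, x0; iterations=4)
    x = x0
    for _ in 1:iterations
        x = (x + a/x)/2
    end
    return x
end

newton_sqrt(2.0, 1.5)  # ≈ 1.4142135623730951, sqrt(2) to double precision

Four iterations from $x_0 = 1.5$ already reach the limits of double precision, as the number of correct digits roughly doubles with each step.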
Similarly, the cube root of a number can be approximated by a tangent line approximation and refined using Newton's method. If we let $f(x) = \sqrt[3]{x}$, then $f'(x) = \frac{1}{3(\sqrt[3]{x})^2}$ and:

$$\sqrt[3]{a} \approx \sqrt[3]{b} + \frac{a-b}{3(\sqrt[3]{b})^2}.$$

For example, taking $a = 63$ and $b = 64$:

$$\sqrt[3]{63} \approx 4 + \frac{63-64}{3 \cdot 4^2} = 4 - \frac{1}{48} \approx 3.97917.$$
The correct value of $\sqrt[3]{63}$ is approximately 3.9790572078963917, so this estimate is not accurate to five decimal places. To improve it we apply Newton's method to $g(x) = x^3 - 63$:

$$x_1 = x_0 - \frac{x_0^3 - 63}{3x_0^2} = \frac{191}{48} - \frac{575}{5253264} \approx 3.9790572,$$

which agrees with the true value to seven decimal places.
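The same refinement is easy to script, along the lines of the square root example (newton_cbrt is my own name for it):

# Newton's method for cbrt(a): f(x) = x^3 - a gives x -> x - (x^3 - a)/(3x^2)
function newton_cbrt(a, x0; iterations=4)
    x = x0
    for _ in 1:iterations
        x = x - (x^3 - a)/(3x^2)
    end
    return x
end

newton_cbrt(63.0, 4 - 1/48)  # ≈ 3.9790572078963917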
Newton's method can be used for finding the roots of any differentiable real-valued function, or even a system of such functions, although it has some shortcomings. One is that it requires an initial guess for the root, and its results can depend heavily on this initial guess. Additionally, it can converge very slowly or fail to converge altogether when the root occurs at a stationary point of the function (a maximum or minimum), as there the derivative is zero. For example, it struggles with the polynomial $x^4 + 3x^3 + 4x^2 + 3x + 1 = (x+1)^2(x^2+x+1)$, whose only real root is at $x = -1$, where the function has a minimum. For polynomial equations that have real roots this problem is uncommon, provided the initial guess is reasonably close to the true value of the root.
Newton's method belongs to a class of methods known as Householder's methods. Newton's method is the first-order Householder's method, Halley's method is the second-order one, and it is possible to derive the $d$th-order Householder's method using the formula:
$$x_{n+1} = x_n + d\,\frac{(1/f)^{(d-1)}(x_n)}{(1/f)^{(d)}(x_n)}$$
where the bracketed superscripts denote derivatives of the indicated order; for example, $(1/f)^{(d)}$ denotes the $d$th derivative of $1/f$. The reason I mention this is that sometimes, especially when the lower-order derivatives are equal to zero at the root, higher-order Householder's methods may be superior.
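As an illustration, applying the second-order (Halley) formula to $f(x) = x^2 - a$ simplifies to the update $x \to x(x^2 + 3a)/(3x^2 + a)$; here is a minimal sketch (halley_sqrt is my own name):

# Halley's method for sqrt(a); the closed form below comes from applying
# Halley's update x -> x - 2ff'/(2f'^2 - ff'') to f(x) = x^2 - a
function halley_sqrt(a, x0; iterations=3)
    x = x0
    for _ in 1:iterations
        x = x*(x^2 + 3a)/(3x^2 + a)
    end
    return x
end

halley_sqrt(2.0, 1.5)  # ≈ 1.4142135623730951

Halley's method converges cubically, so it needs even fewer iterations than Newton's method, at the cost of a second derivative evaluation.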
Getting the initial guesses for the roots, especially of polynomials that have multiple real roots, can be a challenge. One technique is to use the bisection method. In this technique one takes an interval over which the sign of the function changes from positive to negative or vice versa (which, for a continuous function, implies that at least one root must lie within the interval) and repeatedly subdivides this interval, re-evaluating the function at the endpoints, until the root is found. This method is very slow, so usually one would only apply it a few times; once the interval is acceptably small, we would apply Newton's method to get the root. One can also evaluate the function at a long list of points within an interval and find where in the interval the sign changes, as that is where a root will be. If the interval is large enough and the function has multiple real roots, there may be multiple sign-change points and hence multiple roots we can converge to using Newton's method. Like Newton's method, however, bisection fails when the root is at a minimum or maximum, as the function does not change sign there.
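For reference, here is a minimal sketch of the interval-halving form of bisection just described (bisect is my own name; the full script below instead scans a grid for sign changes, which copes with multiple roots):

# Classic bisection: repeatedly halve an interval [a, b] over which f
# changes sign, keeping the half that still brackets the root
function bisect(f, a, b; iterations=20)
    fa = f(a)
    for _ in 1:iterations
        m = (a + b)/2
        if sign(f(m)) == sign(fa)
            a, fa = m, f(m)  # root lies in [m, b]
        else
            b = m            # root lies in [a, m]
        end
    end
    return (a + b)/2
end

bisect(x -> x^2 - 2, 1.0, 2.0)  # ≈ 1.414214; the interval halves each step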
These methods can be done by hand, but for many problems that is quite tedious; as such, I wrote this Julia script that implements the bisection method and Newton's method to find the roots of a given function:
#!/usr/bin/env julia

"""
    bisection(f::Function, N::Integer, a::Number, b::Number)

Uses the bisection method with N subdivisions of the specified domain [a, b]
to find the roots of the function f within said domain. Used to find the
initial guess that Newton's method then uses to get a more precise estimate of
the roots.
"""
function bisection(f, N, a, b)
    x = LinRange(a, b, N+1)
    fval = f.(x)
    xv = x[2:end]
    # Keep the grid points immediately after each sign change of f; each one
    # approximately locates a root
    initGuess = xv[diff(sign.(fval)) .!= 0]
    return initGuess
end

"""
    newtons(f::Function, h::Float64, tol::Float64, itMax::Integer,
            initGuess::Vector{Float64})
Uses Newton's method (and if needed, up to third order Householder's methods)
to refine initGuess, our initial guess of the root(s) of f, until either itMax
iterations have been performed or the relative magnitude of the update Newton's
method provides to the estimate is below tol. h is the step size used to
approximate the derivatives.
"""function newtons(f, h, tol, itMax, initGuess)
# Derivativesfunction fd(x, h)
return (f(x+h)-f(x-h))/(2*h)
endfunction f2d(x, h)
return (f(x+h)-2*f(x)+f(x-h))/(h^2)
endfunction f3d(x, h)
return (f(x+2*h)-2*f(x+h)+f(x-h)-f(x-2*h))/(2*h^3)
end# Check to see if (relative) difference in update is below tolfunction tolCheck(diff, sol, tol)
if ! (sol ≈ 0)
check = abs(diff/sol) > tol
else
check = abs(diff) > tol
endreturn check
end# Initialize sol
sol = initGuess
count = Int64.(zeros(size(initGuess)))
for j=1:length(initGuess)
diff = 1while (tolCheck(diff, sol[j], tol) && count[j] < itMax)
            fn = f(sol[j])
            der = fd(sol[j], h)
            der2 = f2d(sol[j], h)
            der3 = f3d(sol[j], h)
            # Note: x ≈ 0 is only true when x is exactly zero, as isapprox's
            # default absolute tolerance is 0
            if (der ≈ 0) && (der2 ≈ 0)
                # Third order Householder's method
                diff = (6*fn*der^2 - 3*fn^2*der2)/(6*der^3 - 6*fn*der*der2 + fn^2*der3)
            elseif (der ≈ 0)
                # Second order, Halley's method
                diff = 2*fn*der/(-fn*der2 + 2*der^2)
            else
                # First order, Newton's method
                diff = fn/der
            end
            sol[j] -= diff
            count[j] += 1
        end
        if (count[j] == itMax)
            print("Maximum iterations exceeded and the amount by which ")
            println("Newton's last updated the solution was: ", diff)
        end
    end

    # Delete NaN entries from sol and the corresponding count entries.
    # A while loop is used because sol's length can change over this loop.
    j = 1
    while j <= length(sol)
        if isnan(sol[j])
            deleteat!(sol, j)
            deleteat!(count, j)
        else
            # Only advance when nothing was deleted, so the element that
            # slides into position j is still checked
            j += 1
        end
    end
    return sol, count
end

"""
    findRoot(f::Function, h::Float64, tol::Float64, itMax::Integer, a::Number,
             b::Number, N::Integer)
Uses the bisection method to get an initial guess of the root(s) of f on the
domain [a, b] with N subdivisions, then uses Newton's method with a maximum of
itMax iterations and a relative error tolerance of tol. h is the step size
used to approximate the derivative.
"""function findRoot(f, h, tol, itMax, a, b, N)
initGuess = bisection(f, N, a, b)
sol, count = newtons(f, h, tol, itMax, initGuess);
return sol, count
end"""
    printSoln(sol::Vector{Float64}, count::Vector{Int64})
Print the solution and the number of iterations used.
"""function printSoln(sol, count)
if (length(sol) == 1)
println("Root = ", sol[1], ", count = ", count[1])
elsefor i=1:length(sol)
println("The $(i)th root = ", sol[i], ", count = ", count[i])
endendend# This is where you specify the function you want to# find the root offunction f(x)
return x^4 + x^3 - 10x^2 - 4x + 16end# Initialize required variables
h = 1e-10
tol = 1e-15
itMax = 1000
a = -100
b = 100
N = 100000
sol, count = findRoot(f, h, tol, itMax, a, b, N)
printSoln(sol, count)
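For the quartic $x^4 + x^3 - 10x^2 - 4x + 16$ specified here, the sign-change scan should bracket four real roots (roughly $x \approx -3.2$, $-1.6$, $1.2$ and $2.6$), which Newton's method then refines to near machine precision in a handful of iterations each.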