Given a function f(x) of a real variable x and an initial guess for a root, find a root of the function. Here f(x) represents an algebraic or transcendental equation.

For simplicity, we have assumed that the derivative of the function is also provided as input.

Example:

Input: A function of x (for example x^{3} – x^{2} + 2), the derivative of that function (3x^{2} – 2x for the above example) and an initial guess x0 = -20

Output: The value of root is : -1.00 OR any other value close to the root.

We have discussed the following methods for finding a root in Set 1 and Set 2:

Set 1: The Bisection Method

Set 2: The Method Of False Position

**Comparison with the above two methods:**

- In the previous methods, we were given an interval. Here we require an initial guess for the root.
- The previous two methods are guaranteed to converge; Newton-Raphson may not converge in some cases.
- The Newton-Raphson method requires the derivative. Some functions may be difficult or impossible to differentiate.
- For many problems, the Newton-Raphson method converges faster than the above two methods.
- Also, it can identify repeated roots, since it does not explicitly look for changes in the sign of f(x).
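The non-convergence caveat above can be seen concretely. For f(x) = x^{1/3} (root at 0), the Newton step x - f(x)/f'(x) simplifies to -2x, so each iterate doubles in magnitude and moves away from the root. This is a minimal sketch; the function names are illustrative and not from the article:

```python
def cbrt(x):
    # Real cube root, defined for negative x as well
    return abs(x) ** (1.0 / 3.0) * (1 if x >= 0 else -1)

def cbrt_deriv(x):
    # d/dx x^(1/3) = (1/3) * x^(-2/3)
    return (1.0 / 3.0) * abs(x) ** (-2.0 / 3.0)

def newton_iterates(x, steps):
    # Collect successive Newton-Raphson iterates
    out = [x]
    for _ in range(steps):
        x = x - cbrt(x) / cbrt_deriv(x)   # each step gives x -> -2x
        out.append(x)
    return out

# Iterates grow in magnitude: approximately 1, -2, 4, -8, 16
print(newton_iterates(1.0, 4))
```

Instead of converging, the sequence oscillates and diverges, which is why the bisection and false-position methods are sometimes preferred despite being slower.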

**The formula:**

Starting from an initial guess x_{1}, the Newton-Raphson method uses the formula below to find the next value of x, i.e., x_{n+1} from the previous value x_{n}:

x_{n+1} = x_{n} - f(x_{n}) / f'(x_{n})

Algorithm:

Input: initial x, func(x), derivFunc(x)

Output: Root of Func()

- Compute values of func(x) and derivFunc(x) for the given initial x
- Compute h: h = func(x) / derivFunc(x)
- While |h| is greater than the allowed error ε:
  - h = func(x) / derivFunc(x)
  - x = x - h

Below is the implementation of the above algorithm.

## C++

```cpp
// C++ program for implementation of Newton-Raphson Method
// for solving equations
#include <bits/stdc++.h>
#define EPSILON 0.001
using namespace std;

// An example function whose root is determined using
// the Newton-Raphson method. The function is x^3 - x^2 + 2
double func(double x)
{
    return x*x*x - x*x + 2;
}

// Derivative of the above function, which is 3*x*x - 2*x
double derivFunc(double x)
{
    return 3*x*x - 2*x;
}

// Function to find the root
void newtonRaphson(double x)
{
    double h = func(x) / derivFunc(x);
    while (abs(h) >= EPSILON)
    {
        h = func(x) / derivFunc(x);

        // x(i+1) = x(i) - f(x) / f'(x)
        x = x - h;
    }

    cout << "The value of the root is : " << x;
}

// Driver program to test above
int main()
{
    double x0 = -20; // Initial value assumed
    newtonRaphson(x0);
    return 0;
}
```

## Python3

```python
# Python3 code for implementation of Newton-Raphson
# Method for solving equations

# An example function whose root is determined using
# the Newton-Raphson method. The function is x^3 - x^2 + 2
def func(x):
    return x * x * x - x * x + 2

# Derivative of the above function, which is 3*x*x - 2*x
def derivFunc(x):
    return 3 * x * x - 2 * x

# Function to find the root
def newtonRaphson(x):
    h = func(x) / derivFunc(x)
    while abs(h) >= 0.0001:
        h = func(x) / derivFunc(x)

        # x(i+1) = x(i) - f(x) / f'(x)
        x = x - h

    print("The value of the root is : ", "%.4f" % x)

# Driver program to test above
x0 = -20  # Initial value assumed
newtonRaphson(x0)

# This code is contributed by "Sharad_Bhardwaj"
```

Output:

The value of root is : -1.00

**How does this work?**

The idea is to draw a line tangent to f(x) at the point x_{1}. The point where the tangent line crosses the x-axis should be a better estimate of the root than x_{1}. Call this point x_{2}. Calculate f(x_{2}), and draw a line tangent at x_{2}.

We know that the slope of the line from (x_{1}, f(x_{1})) to (x_{2}, 0) is f'(x_{1}), where f' represents the derivative of f.

f'(x_{1}) = (0 - f(x_{1})) / (x_{2} - x_{1})

f'(x_{1}) * (x_{2} - x_{1}) = -f(x_{1})

x_{2} = x_{1} - f(x_{1}) / f'(x_{1})

By finding this point x_{2}, we move closer to the root. We keep repeating the above step until we get close enough to the root or find it exactly. In general,

x_{n+1} = x_{n} - f(x_{n}) / f'(x_{n})
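The tangent-line iteration can be traced numerically for the article's own example f(x) = x^{3} - x^{2} + 2 with x_{0} = -20. This is a minimal sketch, not the article's implementation:

```python
def func(x):
    return x**3 - x**2 + 2

def deriv_func(x):
    return 3 * x**2 - 2 * x

def newton_raphson(x, eps=1e-6, max_iter=100):
    # Repeatedly slide down the tangent line until the
    # step size h falls below the tolerance eps
    for _ in range(max_iter):
        h = func(x) / deriv_func(x)   # x_{n+1} = x_n - f(x_n)/f'(x_n)
        x = x - h
        if abs(h) < eps:
            break
    return x

print("%.4f" % newton_raphson(-20.0))   # -1.0000
```

Each iterate is the x-intercept of the tangent drawn at the previous iterate, so the sequence marches from -20 toward the exact root at x = -1.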

**Alternate Explanation using Taylor’s Series:**

Let x_{n} be the current guess. We can write x_{n+1} as below:

x_{n+1} = x_{n} + h ------- (1)

Here h is a small value that can be positive or negative. According to Taylor's series, a function f(x) that is infinitely differentiable can be written as below:

f(x_{n+1}) = f(x_{n} + h) = f(x_{n}) + h*f'(x_{n}) + ((h*h)/2!)*f''(x_{n}) + ...

Since we are looking for a root of the function, f(x_{n+1}) = 0:

f(x_{n}) + h*f'(x_{n}) + ((h*h)/2!)*f''(x_{n}) + ... = 0

Since h is small, h*h is very small. So if we ignore the higher-order terms, we get:

f(x_{n}) + h*f'(x_{n}) = 0

Substituting h = x_{n+1} - x_{n} from equation (1), we get:

f(x_{n}) + (x_{n+1} - x_{n})*f'(x_{n}) = 0

x_{n+1} = x_{n} - f(x_{n}) / f'(x_{n})
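One consequence of dropping the O(h*h) terms is quadratic convergence: near the root, the error roughly squares on each iteration. A small numerical check on f(x) = x^{2} - 2, whose root is √2 (names here are illustrative):

```python
import math

def g(x):
    return x * x - 2

def dg(x):
    return 2 * x

x = 1.5
errors = []
for _ in range(4):
    # Record the distance from the true root before each step
    errors.append(abs(x - math.sqrt(2)))
    x = x - g(x) / dg(x)   # Newton step

# Each error is roughly the square of the previous one
print(errors)
```

The error sequence drops from about 10^{-1} to 10^{-3} to 10^{-6} to 10^{-12}, which is why Newton-Raphson typically needs far fewer iterations than bisection or false position once the guess is near the root.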

**Notes:**

- We generally use this method to improve the result obtained by either the bisection method or the method of false position.
- The Babylonian method for square roots is derived from the Newton-Raphson method.
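The connection to the Babylonian method can be made explicit: applying the Newton step to f(x) = x^{2} - a gives x - (x^{2} - a)/(2x) = (x + a/x)/2, which is exactly the Babylonian update. A minimal sketch (babylonian_sqrt is an illustrative name, not from the article):

```python
def babylonian_sqrt(a, x0=1.0, eps=1e-12):
    # Newton-Raphson on f(x) = x^2 - a, rewritten as
    # the Babylonian averaging step x -> (x + a/x) / 2
    x = x0
    while abs(x * x - a) >= eps:
        x = (x + a / x) / 2
    return x

print(babylonian_sqrt(2))   # close to 1.41421356...
```

Averaging x with a/x converges to √a for any positive starting guess, inheriting Newton-Raphson's quadratic convergence.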

**References:**

Introductory Methods of Numerical Analysis by S.S. Sastry

https://en.wikipedia.org/wiki/Newton%27s_method

http://www.cae.tntech.edu/Members/renfro/me2000/lectures/2004-09-07_handouts.pdf/at_download/file

This article is contributed by **Abhiraj Smit**. Please write comments if you find anything incorrect, or if you want to share more information about the topic discussed above.