Solution of Algebraic and Transcendental Equations

An expression of the form,

$$f(x) = a_{0}x^{n}+a_{1}x^{n-1}+...+a_{n-1}x+a_{n}$$

where the $a$'s are constants ($a_{0} \neq 0$) and $n$ is a positive integer, is called a polynomial in $x$ of degree $n$. The equation $f(x) = 0$ is then called an algebraic equation of degree $n$. If $f(x)$ also contains other functions, such as trigonometric, logarithmic, or exponential functions, then $f(x)= 0$ is called a transcendental equation.

Root of $f(x)$

A value of $x$ for which $f(x)=0$ is called a root of the function. Graphically, a root is a value of $x$ at which the graph of $f(x)$ crosses the $x$-axis $(y=0)$. The process of finding the roots of an equation is known as solving the equation.

Methods to Find Root

There are two ways of finding a root: plot the function and read off where it crosses the $x$-axis, or compute the root numerically. This tutorial is about numerical computation, so we will take the mathematical route.

In numerical methods, root finding is an iterative procedure that requires a starting point, i.e., an estimate of the root. It is therefore highly advisable to bracket the root, i.e., determine lower and upper bounds on it, before passing it to a root-finding algorithm. The most commonly used methods are:

  • Bisection(or Bolzano) method
  • Regula-Falsi method
  • Secant method
  • Newton-Raphson method

Bisection Method

This method is based on the repeated application of the intermediate value property. Given a function $f(x)$ continuous on an interval $[a,b]$ such that $f(a)\cdot f(b)< 0$, the first approximation to the root is the midpoint of the extremes, $$x_{1}= \frac{a+b}{2}$$


If $f(x_1) = 0$, then $x_1$ is a root of $f(x) = 0$. Otherwise, the root lies in $(a,x_1)$ or $(x_1,b)$, whichever subinterval satisfies the sign condition above. We iterate the process until the root is found to the desired accuracy.
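The iteration above can be sketched in Python. This is a minimal illustration, not the implementation from the linked repository; the test function, interval, and tolerance are chosen only for the example.

```python
def bisection(f, a, b, tol=1e-10, max_iter=100):
    """Find a root of f in [a, b] by repeated interval halving.

    Requires f(a) and f(b) to have opposite signs.
    """
    if f(a) * f(b) >= 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        x1 = (a + b) / 2              # midpoint of the extremes
        if f(x1) == 0 or (b - a) / 2 < tol:
            return x1
        if f(a) * f(x1) < 0:          # root lies in (a, x1)
            b = x1
        else:                         # root lies in (x1, b)
            a = x1
    return (a + b) / 2

# Example: the real root of x^3 - x - 2 lies in [1, 2]
root = bisection(lambda x: x**3 - x - 2, 1, 2)
```

Each iteration halves the bracketing interval, so the error shrinks by a factor of two per step; convergence is slow but guaranteed as long as the initial interval brackets a root.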

Regula-Falsi Method

This method closely resembles the bisection method. Given a function $f(x)$ on an interval $(x_0,x_1)$ such that $f(x_0)\cdot f(x_1) < 0$, the method consists in replacing the curve between the two points by the chord joining them. The equation of the chord joining $A[x_0,f(x_0)]$ and $B[x_1,f(x_1)]$ is, $$y - f(x_0) = \frac{f(x_1)-f(x_0)}{x_1-x_0} \cdot (x-x_0)$$


The point of intersection of the chord with the $x$-axis is an approximation to the root. Setting $y = 0$ in the chord equation, the abscissa of the point where the chord cuts the $x$-axis is, $$x_{2} = x_0 - \frac{x_1-x_0}{f(x_1)-f(x_0)}\cdot f(x_0)$$

We iterate the process, keeping the bracketing condition above satisfied, until the root is found to the desired accuracy.
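A minimal Python sketch of the method, again with an illustrative function and names of my own choosing rather than the repository's:

```python
def regula_falsi(f, x0, x1, tol=1e-10, max_iter=200):
    """Find a root of f bracketed by (x0, x1) using the false-position rule."""
    if f(x0) * f(x1) >= 0:
        raise ValueError("f(x0) and f(x1) must have opposite signs")
    x2 = x0
    for _ in range(max_iter):
        # abscissa where the chord through (x0, f(x0)) and (x1, f(x1)) cuts y = 0
        x2 = x0 - (x1 - x0) / (f(x1) - f(x0)) * f(x0)
        if abs(f(x2)) < tol:
            return x2
        if f(x0) * f(x2) < 0:   # root now bracketed by (x0, x2)
            x1 = x2
        else:                   # root now bracketed by (x2, x1)
            x0 = x2
    return x2

root = regula_falsi(lambda x: x**3 - x - 2, 1, 2)
```

Unlike bisection, the new point is weighted toward the endpoint with the smaller function value, which usually converges faster while still keeping the root bracketed.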

Secant Method

This method is an improvement over the Regula-Falsi method. The graph of the function is approximated by a secant line through two points, and here it is not necessary that the interval between them contain the root. Taking points $A[x_0,f(x_0)]$ and $B[x_1,f(x_1)]$ as the initial pair, we write the equation of the chord joining them as $$y - f(x_1) = \frac{f(x_1)-f(x_0)}{x_1 - x_0} \cdot (x - x_1)$$

Then the abscissa of the point where it crosses the $x$-axis is, $$x_2 = x_1 - \frac{x_1 - x_0}{f(x_1) - f(x_0)} \cdot f(x_1)$$

The general formula for successive approximation is, $$x_{n+1} = x_n - \frac{x_n - x_{n-1}}{f(x_n) - f(x_{n-1})} \cdot f(x_n)$$
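The successive-approximation formula translates directly into a short Python sketch; this is an illustration under my own naming, not the repository's code:

```python
def secant(f, x0, x1, tol=1e-10, max_iter=100):
    """Find a root of f from two starting guesses (no bracketing required)."""
    for _ in range(max_iter):
        f0, f1 = f(x0), f(x1)
        if f1 == f0:
            break               # flat secant line; cannot divide
        # x_{n+1} = x_n - (x_n - x_{n-1}) / (f(x_n) - f(x_{n-1})) * f(x_n)
        x2 = x1 - (x1 - x0) / (f1 - f0) * f1
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2         # always keep the two most recent points
    return x1

# Example: root of x^2 - 2, i.e. sqrt(2)
root = secant(lambda x: x**2 - 2, 1.0, 2.0)
```

Note the difference from Regula-Falsi: the two most recent points are always carried forward regardless of sign, so convergence is faster near the root but no longer guaranteed.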

Newton-Raphson Method

It is one of the most widely used methods for finding roots. It works on the principle that a continuous and differentiable function can be approximated locally by its tangent line. Suppose we need to find a root of a function $f(x)$, and we know that the root we are looking for is near the point $x = x_0$. Then a better approximation for the root is given by, $$x_1 = x_0 - \frac{f(x_0)}{f'(x_0)}$$

We iterate the process to get the desired accuracy. The general formula is given by, $$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$
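The general formula can be sketched in a few lines of Python. As with the earlier sketches, the function, derivative, and starting point below are illustrative choices, not the repository's:

```python
def newton_raphson(f, df, x0, tol=1e-10, max_iter=50):
    """Find a root of f starting from x0, using the derivative df."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        x = x - fx / df(x)      # x_{n+1} = x_n - f(x_n) / f'(x_n)
    return x

# Example: root of x^2 - 2 (i.e. sqrt(2)), with derivative 2x
root = newton_raphson(lambda x: x**2 - 2, lambda x: 2 * x, 1.5)
```

When the starting point is close to the root and $f'(x)$ does not vanish there, the iteration converges quadratically, which is why the method is so widely used; a poor starting point, however, can cause it to diverge.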

Source Code

The algorithms have been written in Python; the source code can be found at my GitHub link 'Solution_of_Algebraic_Transcendental_Equation'.