Experience Mathematics #28 -- How fast do functions grow?

In the last column, we talked about the explosive growth of the exponential function. The number of computers infected by the SQL Slammer worm grew exponentially, bringing the Internet crashing down in a couple of hours.

Computer scientists measure the speed of computer algorithms by comparing them to functions.
Some of the functions they use are the logarithm: $\log(n)$; the linear function: $n$; the power functions: $n^2, n^3, n^4, \dots$; and the exponential functions: $2^n, 10^n,$ etc.

Usually $n$ is the size of the input. Computer scientists make statements such as: the “order” of an algorithm is $n^2$. For example, a simple algorithm for sorting $n$ numbers is of order $n^2$: the computer has to make approximately $n^2$ calculations. To get an idea of which algorithms are faster, consider what happens when $n = 1000$. Then $\log(n)$ (taking logarithms to base $10$) is just $3$. The linear function $n$ is also manageable, at $1000$. However, $n^2$ is $1{,}000{,}000$ (one million) and $n^3$ is one billion. And $10^n$ is a huge number: $1$ followed by a thousand zeros. That is more calculations than Deep Junior had to perform in its entire chess match against Kasparov.
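To see these numbers concretely, here is a short Python sketch (my illustration; the column itself contains no code) that tabulates each function at a few values of $n$. Logarithms are taken to base $10$, matching $\log(1000) = 3$ above; for $2^n$ and $10^n$ only the number of digits is printed, since the values themselves quickly become too large to display.

```python
import math

# Compare how fast these functions grow as n increases.
for n in [10, 100, 1000]:
    print(f"n = {n}:")
    print(f"  log10(n) = {math.log10(n):g}")
    print(f"  n        = {n}")
    print(f"  n^2      = {n**2}")
    print(f"  n^3      = {n**3}")
    # The exponential functions outgrow everything else:
    # we only report how many digits they have.
    print(f"  2^n  has {len(str(2**n))} digits")
    print(f"  10^n has {n + 1} digits")
```

Already at $n = 1000$, the output shows $2^n$ with over $300$ digits and $10^n$ with $1001$ digits, while $n^3$ is still a mere ten-digit number.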

All these functions go to infinity as $n$ goes to infinity. That is to say, they become bigger and bigger as $n$ becomes bigger. What matters (to computer scientists) is how fast or slow this increase is. The slower the increase, the faster the algorithm.

Logarithmic-, linear-, and polynomial-time algorithms are the only algorithms that are fast enough to work in practical situations.

Computer scientists are continuously finding faster and faster algorithms. Recently, Manindra Agrawal, Neeraj Kayal, and Nitin Saxena, of IIT Kanpur, found a deterministic polynomial-time algorithm to determine whether a number is prime.

This settled a question that mathematicians had been working on for centuries.
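The Agrawal-Kayal-Saxena (AKS) test itself is too intricate to reproduce here. For contrast, the sketch below shows the most obvious primality test, trial division. Its running time grows roughly like $\sqrt{n}$, which is exponential in the number of *digits* of $n$; that is precisely the slowness that a polynomial-time test (polynomial in the number of digits) avoids.

```python
import math

def is_prime(n: int) -> bool:
    """Trial division: try every possible divisor up to sqrt(n).

    This is NOT the AKS algorithm. It takes about sqrt(n) steps,
    which is exponential in the number of digits of n, so it is
    hopeless for the hundred-digit numbers used in cryptography.
    """
    if n < 2:
        return False
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return False
    return True

print(is_prime(97))      # True
print(is_prime(100))     # False
print(is_prime(104729))  # True: the 10,000th prime
```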