## Friday, February 28, 2003

### Experience Mathematics #28 -- How fast do functions grow?

In the last column, we talked about the explosive growth of the exponential function. The number of computers infected by the SQL Slammer worm increased dramatically, bringing the Internet crashing down in a couple of hours.

Computer scientists measure the speed of computer algorithms by comparing them to functions.
Some of the functions they use are the logarithmic function $\log(n)$; the linear function $n$; the power functions $n^2, n^3, n^4,\dots$; and the exponential functions $2^n, 10^n,$ etc.

Usually $n$ is the size of the input. Computer scientists make statements such as: the “order” of an algorithm is $n^2$. For example, a simple algorithm for sorting $n$ numbers (such as bubble sort) is of order $n^2$. This means that the computer has to make approximately $n^2$ calculations. To get an idea of which algorithms are faster, consider what happens when $n=1000$. Taking logarithms to base $10$, $\log(n)$ is just $3$. The linear function $n$ is also manageable, at $1000$. However, $n^2$ is $1,000,000$ (one million), and $n^3$ is one billion. And $10^n$ is a huge number: $1$ followed by a thousand zeros. This is more than the number of calculations Deep Junior had to perform in its chess match against Kasparov.
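The comparison above is easy to check directly. Here is a minimal sketch (in Python, which the column itself does not use) that tabulates each "order" of growth at $n=1000$:

```python
import math

# How many steps each "order" of algorithm needs when n = 1000.
n = 1000

print(math.log10(n))  # logarithmic: 3.0
print(n)              # linear: 1000
print(n ** 2)         # quadratic: 1,000,000 (one million)
print(n ** 3)         # cubic: 1,000,000,000 (one billion)

# 10**n is 1 followed by a thousand zeros -- far too large to ever
# compute step by step, though Python can still write the number down:
print(len(str(10 ** n)))  # it has 1001 digits
```

Notice the jump in scale at each step: the first four numbers fit comfortably on a screen, while the last one has more digits than the others have value.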

All these functions go to infinity as $n$ goes to infinity. That is to say, they become bigger and bigger as $n$ becomes bigger. What matters (to computer scientists) is how fast or slow this increase is. The slower the increase, the faster the algorithm.

Logarithmic, linear, and polynomial-time algorithms are the only algorithms that are fast enough to work in practical situations.

Computer scientists are continuously finding faster and faster algorithms. Recently, Manindra Agrawal, Neeraj Kayal, and Nitin Saxena, of IIT Kanpur, found a deterministic polynomial-time algorithm to determine whether a number is prime.

This solved a problem that mathematicians have been trying to solve for centuries.
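The Agrawal–Kayal–Saxena algorithm itself is too involved to reproduce here, but the problem it solves is easy to state in code. Below is a sketch of the naive approach, trial division: correct, but its running time grows exponentially in the *number of digits* of $n$, which is exactly what the polynomial-time result improves upon.

```python
def is_prime(n: int) -> bool:
    """Trial division: test every potential divisor up to sqrt(n)."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False  # found a divisor, so n is composite
        d += 1
    return True

# The primes below 30:
print([p for p in range(2, 30) if is_prime(p)])
```

For a number with a few hundred digits, the loop above would run longer than the age of the universe; the AKS algorithm's running time, by contrast, is bounded by a polynomial in the number of digits.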

## Friday, February 07, 2003

### Experience Mathematics # 27 -- The exponential function

“It only took 10 minutes for the SQL Slammer worm to race across the globe and wreak havoc on the Internet two weeks ago” is what newspapers reported on February 7, 2003. The number of infected computers doubled every 8.5 seconds in the first minute of the worm’s existence. So how many computers got infected in the first minute?

The number of computers infected in $t$ seconds can be modelled by the function $N(t)$, where
$$N(t)=2^{t/8.5}.$$

This is a reasonable model. In the beginning (when $t=0$) we assume that the worm has infected only $1$ computer, and indeed, $N(0)=1$. In $8.5$ seconds, the number becomes $N(8.5)=2$, so it doubles. In another $8.5$ seconds, $N(17)=4$. This doesn’t sound like very fast growth, but at the end of one minute ($t=60$) the number is already more than $133$. In two minutes, more than $17,000$ computers are infected, and in $5$ minutes the number of computers infected is in the tens of billions. This means that the rate of growth must in fact have slowed down, because there aren’t that many computers in the world! I hope this helps you understand why the worm caused the slowdown of Internet traffic in Korea.
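The model is simple enough to evaluate directly. Here is a short sketch (in Python, purely for illustration) of $N(t)=2^{t/8.5}$ at the times mentioned above:

```python
# Doubling model for the number of infected computers: N(t) = 2^(t/8.5)
def infected(t: float) -> float:
    return 2 ** (t / 8.5)

for t in (0, 8.5, 17, 60, 120, 300):
    print(f"t = {t:>5} s: about {round(infected(t)):,} computers")
```

Running this shows $1$, $2$, and $4$ computers at the first three times, then the explosion: over a hundred at one minute, tens of thousands at two minutes, and tens of billions at five.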

$N(t)$ is an example of an exponential function, which increases very fast. The prototypical example is THE exponential function, $f(x)=e^x$. Here the number $e$ is an irrational number (just like the famous $\pi$), and its value is approximately $2.7182818\dots$. The graph of the function (made using desmos.com) is as follows.

The exponential function is not symmetric across the $y$-axis, nor across the origin. That is to say, it is not an even function or an odd function. However, consider the functions:
$$E(x)=(e^x+e^{-x})/2,$$ and $$O(x)=(e^x-e^{-x})/2.$$ $E$ is an even function, and $O$ is an odd function, and the exponential function is the sum of these functions. Draw the function $E(x)$ from $x=-5$ to $x=5$ using MS-Excel (update: try www.desmos.com), and see what the graph looks like. Does it look like a clothesline secured at its two ends?
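You can also verify these symmetry properties numerically. The sketch below (in Python, as an alternative to the spreadsheet suggested above) checks that $E$ is even, that $O$ is odd, and that their sum recovers $e^x$:

```python
import math

# Even and odd parts of the exponential function, as defined above.
def E(x):
    return (math.exp(x) + math.exp(-x)) / 2  # the even part

def O(x):
    return (math.exp(x) - math.exp(-x)) / 2  # the odd part

x = 1.5
print(math.isclose(E(x), E(-x)))               # even: E(-x) = E(x)
print(math.isclose(O(x), -O(-x)))              # odd:  O(-x) = -O(x)
print(math.isclose(E(x) + O(x), math.exp(x)))  # E + O is e^x
```

In fact $E(x)$ is the hyperbolic cosine, $\cosh(x)$, and its graph is the catenary: the shape of a chain (or clothesline) hanging under its own weight, which is why the graph you drew should look familiar.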