Algorithms

Simple as the definition of the notion of algorithm is, the concept it attempts to convey is a matter of debate and scientific research. In most textbooks (see, e.g., Review of Discrete Algorithmic Mathematics by S. B. Maurer and A. Ralston) algorithms are required to possess several properties, notably Finiteness and Definiteness. In prose:
 An algorithm is a sequence of unambiguous instructions for solving a problem, i.e., for obtaining a required output for any legitimate input in a finite amount of time [Levitin, p. 3].
If the problem is one of computation, being "unambiguous" usually means that a human of average intelligence must be able (if only in principle) to follow the instructions with pencil and paper. In theory, a discerning robot must be able to perform the job as well. And this inescapably links the idea of algorithm to programming. Indeed, algorithms are commonly expressed either in a programming language like Pascal or C++, or in pseudocode. Some go the extra mile by inventing a custom programming language, like AL by Maurer and Ralston and the notorious assembly language for the hypothetical computer MIX by D. Knuth. This is all for the purpose of making the description of the algorithms as "unambiguous" as possible.
However, quite often, the concept of algorithm is thought to be distinct from that of program. The distinction is similar to the one made in mathematics between the Platonic notion of number and a magnitude describing the size of a naturally available set (as in "three cows," say).
Often, too, a program, unlike an algorithm, is not required to terminate in finite time. Indeed, the Halting Problem, that is, the question of whether there exists an algorithm that can predict whether a given program will terminate (in finite time, of course) on a given input, is a cornerstone of computability theory.
In my view, the finiteness requirement is a red herring. Consider the example of printing out consecutive prime numbers. For any integer, there is an unambiguous (and finite) way to determine whether it is prime or not. We'll do this an integer at a time, starting with, say, 2. We may set up a terminating condition (say, check the first million integers or print out the first million primes) or not. Compare
N = 2
while (N < 1000000)
{
  if (N is prime)
    print N
  N = N + 1
}

N = 2
while (true)
{
  if (N is prime)
    print N
  N = N + 1
}
It's hard for me to imagine why one of these would be honored with the term algorithm while the other would be denied that honor.
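For concreteness, here is a minimal C++ sketch of the bounded variant. The trial-division test standing in for "N is prime" is merely one finite, unambiguous way of deciding primality, not part of the pseudocode above; changing the loop bound, or dropping it altogether, turns it into the unbounded variant.

#include <iostream>

// Trial division: a finite, unambiguous primality test for any integer n.
bool isPrime(long n)
{
  if (n < 2) return false;
  for (long d = 2; d * d <= n; ++d)
    if (n % d == 0) return false;
  return true;
}

int main()
{
  // Bounded variant; replace the loop condition with "true" for the unbounded one.
  for (long n = 2; n < 1000000; ++n)
    if (isPrime(n))
      std::cout << n << '\n';
  return 0;
}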
