Practical Constraints on Reason
Background
- From the Church-Turing thesis, we now have a clear
statement of what an effective procedure is.
- Many questions follow:
- Are there things that we might like to know
but which a Turing machine cannot do?
- Are there any measures on effective procedures
which we may want or need to add to our understanding
of computation?
- The answer to both of these questions is yes.
- We have already seen one problem that we would like to solve
but that no Turing machine can solve: there is no effective way
to decide whether an arbitrary effective procedure will give an
answer (i.e., whether a machine will halt).
- These kinds of limitations (Gödel's theorems, the halting
problem) are in-principle limitations on reason.
- There are also practical limits on reason. These include:
- How long it takes to do something (time complexity).
- How much memory it takes to do something (space
complexity).
- Sensitivity to initial conditions (chaos).
1. Time Complexity
- Call an effective procedure an "algorithm."
- A Turing machine takes discrete steps. Assuming that
for any real implementation, these take time, a measure of
the number of steps is a measure of time complexity.
- Time complexity is measured as a function of the size of the
input. An input of n items may require n, 2n, n^2, or even
n! steps. The more steps required, the more time.
- Some tasks require so much time for any effective procedure
that they are practically impossible to solve (although useful
approximations may be possible).
- Complexity is measured with "O-notation." To say a task is
O(f(n)) is to say that the number of steps it requires is
bounded by a constant multiple of f(n). To be O(n), for
example, means that for n items there will be as many or
fewer than K*n steps for some constant K (at least once n is
large enough). (A small sketch comparing growth rates appears
at the end of this list.)
- An example of a very time-consuming but useful problem
is the travelling salesman problem. This problem is "NP-
complete," meaning that it is in a special category of
problems that are all equally hard in a precise sense and
apparently cannot be solved in polynomial time (for an input
of size n, it appears to require more than n^k steps for
every constant k). (A brute-force sketch appears at the end
of this list.)
- A fun puzzle is the Tower of Hanoi: disks of various
sizes are stacked in order of size on one of three rods in a
line; you must move them one at a time from the left rod
where they are stacked to the rightmost one. BUT, you cannot
place a disk on top of a smaller disk. This problem is of
the order of 2^n steps for n disks. In class, I displayed a
variant where you could also only move a disk one post at a
time (to an adjacent rod); Justin noted that the number of
steps for this variant can be described recursively: f(0) = 0,
and f(n) = 3*f(n-1) + 2. (A sketch of both versions appears at
the end of this list.)
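- As a rough illustration of how these growth rates compare, here
is a small Python sketch that tabulates the number of steps for a
few input sizes; the exponential and factorial columns explode
long before the polynomial ones become a problem:

    import math

    # Steps required under several growth rates, for a few input sizes n.
    print(f"{'n':>3} {'2n':>6} {'n^2':>8} {'2^n':>12} {'n!':>22}")
    for n in (1, 5, 10, 15, 20):
        print(f"{n:>3} {2*n:>6} {n**2:>8} {2**n:>12} {math.factorial(n):>22}")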
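- The brute-force approach to the travelling salesman problem is
to check every possible ordering of the cities, and with n cities
there are n! orderings. Here is a small Python sketch of that
approach, using a made-up distance table for four cities (the
city names and distances are hypothetical):

    from itertools import permutations

    # Hypothetical symmetric distances between four cities.
    dist = {
        ("A", "B"): 3, ("A", "C"): 7, ("A", "D"): 2,
        ("B", "C"): 4, ("B", "D"): 6, ("C", "D"): 5,
    }

    def d(a, b):
        return dist.get((a, b)) or dist.get((b, a))

    def tour_length(order):
        # Length of a round trip visiting the cities in this order.
        return sum(d(order[i], order[(i + 1) % len(order)])
                   for i in range(len(order)))

    cities = ["A", "B", "C", "D"]
    # Try all n! orderings and keep the shortest; only feasible for tiny n.
    best = min(permutations(cities), key=tour_length)
    print(best, tour_length(best))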
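- And a small sketch of the Tower of Hanoi counts: the standard
puzzle follows g(0) = 0, g(n) = 2*g(n-1) + 1 (which works out to
2^n - 1 moves), while the adjacent-move variant follows Justin's
recurrence f(0) = 0, f(n) = 3*f(n-1) + 2 (which works out to
3^n - 1 moves):

    def hanoi_moves(n):
        # Standard puzzle: move n-1 disks aside, move the largest,
        # then move the n-1 disks back on top: g(n) = 2*g(n-1) + 1.
        return 0 if n == 0 else 2 * hanoi_moves(n - 1) + 1

    def adjacent_moves(n):
        # Variant where a disk may only move to a neighbouring rod:
        # f(n) = 3*f(n-1) + 2.
        return 0 if n == 0 else 3 * adjacent_moves(n - 1) + 2

    for n in range(1, 8):
        print(n, hanoi_moves(n), adjacent_moves(n))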
2. Space Complexity
- For an input of n items, an algorithm will also require
working space (for our Turing machines, this is tape).
- We measure space complexity as the amount of memory
(tape squares) that the algorithm will take, as a function of
the size n of the input.
- As with time complexity, some problems require impractical
amounts of memory, and therefore cannot really be solved in
practice. (A small sketch of the time/space trade-off follows
this list.)
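- As a small sketch of how time and memory can trade off,
consider two ways of checking a list for duplicates in Python:
one uses extra memory proportional to n (a set of items seen so
far), the other uses only constant extra memory but takes on the
order of n^2 steps:

    def has_duplicates_fast(items):
        # O(n) time, O(n) extra space: remember everything seen so far.
        seen = set()
        for x in items:
            if x in seen:
                return True
            seen.add(x)
        return False

    def has_duplicates_small(items):
        # O(n^2) time, O(1) extra space: compare every pair directly.
        for i in range(len(items)):
            for j in range(i + 1, len(items)):
                if items[i] == items[j]:
                    return True
        return False

    print(has_duplicates_fast([3, 1, 4, 1, 5]))   # True
    print(has_duplicates_small([2, 7, 1, 8]))     # False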
3. Chaos
- By "chaos," we mean systems that are very sensitive to initial
conditions. Such a system might be one where a very slight change
now results in a very big change later.
- The problem with such systems is that if they are very sensitive,
then errors in measurement can in time result in predictions being
very inaccurate. Suppose for example that some aspect of the weather
is chaotic. You measure the temperature, wind velocities, and so
on, and feed this into a predictive model. But if the model included
chaotic equations, then even if it were accurate, some tiny error in
measurement might result in the prediction being very far off from
what actually results.
- A simple example is the logistic equation, an iterative or
recursive equation: x_n = K * x_(n-1) * (1 - x_(n-1)). Here n
refers to the step in the iteration, and x should be between
0 and 1. When K is around 3.6 or higher, this equation is
sensitive to initial conditions over time. (A small sketch
follows.)
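- As a small sketch, iterating the logistic equation from two
nearly identical starting points, with K = 3.9 (a value in the
chaotic range), shows the two trajectories drifting far apart
within a few dozen steps:

    K = 3.9  # a value of K in the chaotic range

    def logistic(x, steps):
        # Iterate x_n = K * x_(n-1) * (1 - x_(n-1)) the given number of times.
        for _ in range(steps):
            x = K * x * (1 - x)
        return x

    x1, x2 = 0.200000, 0.200001   # two nearly identical initial conditions
    for steps in (1, 10, 20, 30, 40, 50):
        a, b = logistic(x1, steps), logistic(x2, steps)
        print(f"after {steps:2} steps: {a:.6f} vs {b:.6f}"
              f"  (difference {abs(a - b):.6f})")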