
Searching for Ω

With these straightforward additions, Chaitin's LISP can be used in an empirical quest for Ω. One approach consists of a LISP expression implementing a universal Turing machine (UTM). This expression, along with an expression representing the program for the UTM, is fed to the LISP interpreter and tested to see whether it halts within a given time limit. Since the UTM's program is simply a bit string of some fixed length, one can search through sets of possible "programs" (i.e., bit strings) and determine whether they will halt within a given amount of time, according to the implementation of the UTM (again, in the form of a self-delimited bit string).
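
For reference, the target of this quest has a one-line definition (the standard one, not something produced by the runs below): for a self-delimiting universal machine U, Chaitin's halting probability is

\[
\Omega \;=\; \sum_{p \,:\, U(p)\ \mathrm{halts}} 2^{-|p|}, \qquad 0 < \Omega < 1,
\]

where |p| is the length of the program p in bits. Because the programs are self-delimiting, no halting program is a prefix of another, and the Kraft inequality keeps the sum strictly below 1.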

LISP Interpreter Run

[[ An implementation of the UTM ]]
...
define (U p) cadr try no-time-limit 'eval read-exp p
...
define U
value (lambda (p) (car (cdr (try no-time-limit (' (eval (read-exp))) p))))
...

[[ A run of the UTM on a bit string representing a LISP M-Expression ]]
...
(U bits 'cons x cons y cons z nil)

expression (U (bits (' (cons x (cons y (cons z nil))))))
value (x y z)
...

End of LISP Run

From here it is a simple extension to try all possible strings of a given length to see if any of them will halt. Of course, this is going to be a bit of an extended exercise should we want to test all possible strings of all possible lengths. Yet we trundle on nonetheless by computing the nth lower bound Ω_n of Ω. To get this lower bound, we take all programs of length n, run them for some given time, and look at which ones halt. As n → ∞, we converge on Ω, albeit very slowly indeed. But let's take a look at a few anyway.
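
Concretely, in standard notation, the quantity computed by the omega function in the run below is

\[
\Omega_n \;=\; \frac{\#\{\, p \in \{0,1\}^n \,:\, U(p)\ \mathrm{halts\ within\ time}\ n \,\}}{2^{n}},
\]

where the single argument t of (omega t) plays the role of n, serving both as the length of the candidate bit strings and as the time limit handed to try. Assuming, as in Chaitin's setup, that a program may leave trailing bits unread (it simply must not read past its end), every extension of a halting program also halts, so these rationals can only grow as n increases.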

LISP Interpreter Run

[[[[ Omega in the limit from below! ]]]]

define (all-bit-strings-of-size k)
    if = 0 k '(())
    (extend-by-one-bit (all-bit-strings-of-size - k 1))
...
define (count-halt p)
    if atom p 0
    +
    if = success car try t 'eval read-exp car p
       1 0
    (count-halt cdr p)
...
define (omega t) cons (count-halt (all-bit-strings-of-size t))
                 cons /
                 cons ^ 2 t
                      nil

[[ Now, run it for a few values of t ]]

(omega 0)
value (0 / 1)

(omega 1)
value (0 / 2)

(omega 2)
value (0 / 4)

(omega 3)
value (0 / 8)

(omega 8)
value (1 / 256)

End of LISP Run

As it turns out, the various Ω_n form a monotonically increasing sequence of rational numbers as n increases. The fun thing here is that you can actually try it: you can compute a lower bound for Ω, at least for some given program size. To give some of the punch line away, the bits that make up the actual value of Ω turn out to be unpredictable; they're random! Therefore they are incompressible: you can't write a program that generates them that is any smaller than the bits themselves (plus you need a few bits to do the generation itself). The most elegant representation of Ω turns out to be the most complex as well.
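
To make "you can actually try it" concrete outside Chaitin's interpreter, here is a small Python sketch of the same bookkeeping. It is only an analogy, not the article's code: the halting test is abstracted into a caller-supplied predicate halts_within(bits, time_limit), a hypothetical stand-in for running U under try with a time limit, since the genuine test needs Chaitin's LISP.

from fractions import Fraction
from itertools import product

def nth_lower_bound(n, time_limit, halts_within):
    # Mirror (omega t): count the n-bit "programs" that halt within the
    # time limit and divide by 2**n to get a rational lower bound.
    halted = sum(1 for bits in product((0, 1), repeat=n)    # all 2**n bit strings
                 if halts_within(bits, time_limit))         # stand-in halting test
    return Fraction(halted, 2 ** n)

def toy_halts(bits, time_limit):
    # Toy rule standing in for the real test: pretend a "program"
    # halts exactly when its first two bits are 1 1.
    return len(bits) >= 2 and bits[0] == bits[1] == 1

print(nth_lower_bound(8, 8, toy_halts))    # prints 1/4 under the toy rule

Swapping toy_halts for a routine that actually feeds the bits to U (eval of read-exp under try, as in the runs above) would reproduce the (omega t) values shown.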


