Sunday, August 02, 2009

Chaitin's Constant

(with Nicoletta Sabadini).

I have been reading about Chaitin's work, beginning with this document.

Various comments as to the significance of the work seem to me rather exaggerated.
Consider, for example, the following regarding Chaitin's constant:
"$\Omega$ = concentrated creativity? Each bit of $\Omega$ = one bit of creativity? Can human intellectual progress be measured by the number of bits of $\Omega$ that we know, or are currently capable of knowing, as a function of time?"

My main objection is that the various concepts discussed depend on the computing language involved, but that reference to this dependence is quickly omitted.

For example, the definition of a random finite sequence (due also to Kolmogorov, and perhaps Solomonoff) does not allow one to decide whether any particular sequence is random. Dropping the language dependence from the definition is imprecise.
("Definition of Randomness R1: A random finite binary string is one that cannot be compressed into a program smaller than itself, that is, that is not the unique output of a program without any input, a program whose size in bits is smaller than the size in bits of its output.")
Given a language in which the shortest legal program has one billion bits, any sequence of fewer than one billion bits is incompressible, and hence random.
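
To see how strongly the notion depends on the language, here is a toy illustration of my own (not Chaitin's formalism): a description language in which every legal program is a fixed 100-bit header followed by a literal payload, and the interpreter simply outputs the payload. All the names below are invented for the sketch, and the 100-bit header stands in for the "one billion bits" above.

    # Toy description language: every program is HEADER + payload, and the
    # interpreter outputs the payload.  Purely illustrative.
    HEADER = "0" * 100   # hypothetical mandatory header (stand-in for 10^9 bits)

    def interpret(program):
        """Output the payload if the program is well formed, else 'diverge'."""
        if program.startswith(HEADER):
            return program[len(HEADER):]
        return None   # stands in for non-termination

    def shortest_program_length(target):
        # In this language the only program printing `target` is HEADER + target.
        return len(HEADER) + len(target)

    # Every string of fewer than 100 bits is incompressible, hence "random",
    # relative to this language:
    for s in ["0", "0101", "1" * 50]:
        assert shortest_program_length(s) > len(s)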

Now let's look at Chaitin's constant $\Omega$.
The omission of the language dependence from $\Omega$ is imprecise.

What is Chaitin's constant? The definition is a little complicated, which I think adds to its fascination. A very clear description is given in his 1975 paper.

First, Chaitin's constant is not a constant but depends on a particular interpreter M of a particular Turing-complete computing language L in which programs and data are bit strings: the interpreter acts on bit strings consisting of a program and data concatenated, and if it terminates it outputs a data bit string.

There is now an important assumption: Chaitin considers the class of programs P which compute just the partial recursive functions on bit strings whose domains are prefix-free sets of bit strings. He shows that there is a universal such machine M: it is defined on a set PD of program-data bit strings pd which is prefix-free, and calculates the value of the program p on the data d. Then Chaitin's "constant" for this language and interpreter is the number
\[\Omega_{L,M}=\sum_{pd\in PD}2^{-length(pd)}.\] The prefix-freeness tells you that this series converges to a real number $\Omega_{L,M} < 1$, Chaitin's constant.
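
In principle $\Omega_{L,M}$ can be approximated from below by running M on longer and longer program-data strings with larger and larger step budgets. The Python sketch below shows the shape of such a lower-bound computation; `M_halts_within` is a placeholder, since no concrete prefix-free interpreter is specified here.

    from itertools import product

    def M_halts_within(bits, steps):
        """Placeholder: True if the (unspecified) interpreter M halts on the
        program-data string `bits` within `steps` steps."""
        raise NotImplementedError

    def omega_lower_bound(max_len, steps):
        """Sum 2^(-length) over strings observed to halt: a lower bound for Omega_{L,M}."""
        total = 0.0
        halted = set()
        for n in range(1, max_len + 1):
            for tup in product("01", repeat=n):
                s = "".join(tup)
                # For a prefix-free M no extension of a halting string halts,
                # so such extensions can be skipped outright.
                if any(s.startswith(p) for p in halted):
                    continue
                if M_halts_within(s, steps):
                    halted.add(s)
                    total += 2.0 ** (-n)
        return total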

Chaitin shows that there exist such interpreters M of Turing-complete languages L with the prefix-free property, in which the program bit strings are of the form $1^k0$.

For the above kind of interpreter, and a language for which all programs with at most one million and one bits diverge on any input, Chaitin's constant has its first million bits equal to $0$. If the first million programs all halt on empty input, then the first million bits of Chaitin's constant are $1$.
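
To spell out the arithmetic behind the second claim (reading it as: the programs $1^k0$ for $k=0,\ldots,10^6-1$, run on empty data, all halt, so that $length(pd)=k+1$), the contribution of these strings alone is
\[\sum_{k=0}^{10^6-1} 2^{-(k+1)} = 1 - 2^{-10^6},\]
which in binary is $0.11\ldots1$ with a million ones; since $\Omega_{L,M}$ is at least this (and less than $1$), its first million bits are all $1$.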

Let's now look at a simpler analogue $K$ of Chaitin's constant.
The languages L I have in mind are those Turing-complete languages with programs encoded (via Godel numbering, for example) as natural numbers written in unary and terminated by $0$; the data is also written in unary and terminated by $0$. An interpreter M first checks that the string is of the form just described, and cycles infinitely if not. Then the interpreter decodes the program number as a program, using the zero to detect the end of the program, and applies it to the data number. If the program number does not correspond to a legitimate program the interpreter cycles infinitely. The set of all program-data strings is prefix-free, and hence so is the set of those on which the interpreter halts.
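
A Python sketch of this interpreter may make the checking and decoding steps concrete. Here `decode_program` and `run_program` are placeholders for a concrete Godel numbering and a concrete Turing-complete language, neither of which is fixed above.

    import re

    def decode_program(s):
        """Placeholder: map the natural number s to a program, or None if s is
        not the Godel number of a syntactically correct program."""
        raise NotImplementedError

    def run_program(program, t):
        """Placeholder: apply the program to the natural number t; may not halt."""
        raise NotImplementedError

    def M(bits):
        m = re.fullmatch(r"(1*)0(1*)0", bits)
        if m is None:
            while True:          # not of the form 1^s 0 1^t 0: cycle infinitely
                pass
        s, t = len(m.group(1)), len(m.group(2))
        program = decode_program(s)
        if program is None:
            while True:          # not a legitimate program number: cycle infinitely
                pass
        return run_program(program, t)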

The strings on which M halts are all of the form
\[(1^s)0(1^t)0=11111\ldots101111\ldots10;\] that is, $s$ ones followed by a zero, and then $t$ ones followed by a zero. We can immediately see in this case that the constant (defined in the same way as $\Omega$ but for this interpreter M) satisfies $K < 1$: there are $k-1$ sequences of this form of length $k$, so that $K \le \sum_{k=2}^{\infty}(k-1)2^{-k} = 1$, and the inequality is strict because the interpreter diverges on some of these strings.
Let's look a bit more closely at the numbers $K$ which occur in our class of examples. The contribution to $K_{L,M}$ of program-data strings of length $>n$ is easy to estimate: there are $k-1$ program-data strings of each length $k$, and
\[\sum_{k=n+1}^{\infty} (k-1)\,2^{-k} = (n+1)/2^{n}.\]
So, for example, the program-data strings of length $>1000$ contribute at most $1001/2^{1000} < 1/2^{990}$ to the constant $K$. This means that the first 990 digits of the constant are determined by the halting or not of the interpreter on program-data strings of length at most 1000. But the coding of programs may be chosen so that all programs of length at most 1000 halt on any input, or alternatively so that all diverge on any input. This means that with an appropriate choice of L, M the number $K_{L,M}$ can have its first 990 digits equal to 1; for a different choice it will have its first 990 digits equal to 0; in fact any sequence of 990 bits can occur. There is nothing special about 990: there are constants $K_{L,M}$ in our class which have their first n digits 0 (or, if you wish, first n digits 1) for any fixed n.
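
The tail identity is easy to verify numerically; here is a quick check (assuming, as above, that there are $k-1$ program-data strings of each length $k$):

    from fractions import Fraction

    def tail(n, terms=200):
        """Partial sum of (k-1)/2^k for k = n+1, ..., n+terms."""
        return sum(Fraction(k - 1, 2 ** k) for k in range(n + 1, n + 1 + terms))

    for n in (5, 10, 20):
        exact = Fraction(n + 1, 2 ** n)
        print(n, float(tail(n)), float(exact))   # the two values agree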

A much simpler constant. Consider now a language L and interpreter M, not prefix-free, for which the program and data are encoded as a single number using Godel numbering plus the Cantor coding of pairs of numbers as single numbers. So the strings we are considering now are just strings of 1's. The interpreter first decodes the string into two numbers using the Cantor coding, then decodes the first number into a program using the Godel numbering (checking that the program is syntactically correct, and cycling infinitely otherwise). Finally the interpreter applies the program to the second number, sometimes halting and producing a string, sometimes diverging.
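
The first decoding step is the standard Cantor pairing function and its inverse; a short sketch:

    from math import isqrt

    def cantor_pair(a, b):
        # Bijection between pairs of natural numbers and natural numbers.
        return (a + b) * (a + b + 1) // 2 + b

    def cantor_unpair(n):
        w = (isqrt(8 * n + 1) - 1) // 2    # largest w with w(w+1)/2 <= n
        b = n - w * (w + 1) // 2
        return w - b, b

    assert cantor_unpair(cantor_pair(12, 34)) == (12, 34)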

Define the Turing constant $T_{L,M}$ to be the sum of $2^{-n}$ over all natural numbers $n \ge 1$ for which the interpreter halts on the string of $n$ ones; this is clearly a real number $<1$. We could also have allowed binary program-data strings instead of unary; the problem that many strings then have the same length could be avoided by having the interpreter cycle infinitely on all but one string of each length.
Knowing the Turing constant would solve many problems of mathematics: it contains the information of which theorems are provable in a formal system (program the search for deductions); it resolves Goldbach's conjecture (program a search for counterexamples: if the program terminates Goldbach is false, if not Goldbach is true). Etc., etc. These are some of the advertised properties of Chaitin's constant.
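
For instance, the Goldbach step amounts to writing a program that halts exactly when a counterexample exists; here is a sketch of the search itself (not of its encoding as a program number):

    def is_prime(m):
        if m < 2:
            return False
        d = 2
        while d * d <= m:
            if m % d == 0:
                return False
            d += 1
        return True

    def goldbach_counterexample_search():
        n = 4
        while True:
            if not any(is_prime(p) and is_prime(n - p) for p in range(2, n - 1)):
                return n      # halts: Goldbach is false
            n += 2            # never reached if Goldbach is true

Deciding whether this search halts, that is, whether the corresponding program-data string contributes its $2^{-n}$ to the constant, is exactly what knowing enough bits of $T_{L,M}$ would settle.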


1 Comments:

Anonymous madaco said...

on wikipedia it says that Chaitin's constant is normal, do you know why it says that?

-confused

12:42 PM  

