To define Ω formally, we first need to fix a model of computation, for instance Turing machines or LISP or Pascal programs. We then need to specify an unambiguous encoding of programs (or machines) as bit strings. This encoding must have the property that if w encodes a syntactically correct program, then no proper prefix of w encodes a syntactically correct program. This can always be achieved by using a special end symbol. We only consider programs that don't require any input.
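The prefix-free requirement above is easy to check mechanically. The following sketch (the function and the toy codewords are illustrative, not part of any standard library) tests whether a set of program encodings has the property that no codeword is a proper prefix of another:

```python
def is_prefix_free(codewords):
    """Return True if no codeword is a proper prefix of another.

    This is the property required of the program encoding:
    if w encodes a valid program, no proper prefix of w does.
    """
    for w in codewords:
        for v in codewords:
            if v != w and w.startswith(v):
                return False
    return True

# A hypothetical toy encoding: bit 0 -> "0", bit 1 -> "10",
# end symbol -> "11".  Every codeword ends in "11", which makes
# the set prefix-free.
valid = {"11", "011", "1011", "01011"}
print(is_prefix_free(valid))        # True
print(is_prefix_free({"0", "01"}))  # False: "0" is a prefix of "01"
```

Encoding the end symbol as a reserved bit pattern, as in the toy scheme above, is one concrete way to realize the "special end symbol" mentioned in the text.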
One can prove that there is no algorithm which produces the digits of Ω: Ω is definable but not computable. Furthermore, if you fix, in addition to the computation model and encoding mentioned above, a specific consistent axiomatic system for the natural numbers, say [Peano's axioms]?, then there exists a constant N such that no digit of Ω after the N-th can be proven to be one or zero within that system. (The constant N heavily depends on the encoding choices and does not reflect the complexity of the axiomatic system in any way.)
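Although Ω is not computable, it can be approximated from below: dovetail over all programs and time bounds, and whenever a program is observed to halt, add its weight 2^-|p| to a running sum. The sketch below assumes the standard definition of Ω as the sum of 2^-|p| over halting programs p; `halts_within` is a hypothetical step-bounded interpreter, and the toy halting table in the usage example is invented for illustration:

```python
from itertools import count
from fractions import Fraction

def omega_lower_bounds(programs, halts_within):
    """Yield increasing lower bounds on Ω by dovetailing.

    `programs` is a finite list of prefix-free code strings;
    `halts_within(p, t)` reports whether program p halts within t steps.
    Each time a program is seen to halt, its weight 2^-len(p) is added.
    The bounds converge to Ω from below, but no algorithm can tell how
    close a given bound is -- which is why Ω remains uncomputable.
    """
    total = Fraction(0)
    halted = set()
    for t in count(1):  # dovetail: allow one more step each round
        for p in programs:
            if p not in halted and halts_within(p, t):
                halted.add(p)
                total += Fraction(1, 2 ** len(p))
        yield total

# Toy usage: "11" halts at step 1, "1011" at step 3, "011" never.
steps = {"11": 1, "1011": 3}
bounds = omega_lower_bounds(["11", "011", "1011"],
                            lambda p, t: steps.get(p, 10**9) <= t)
print([next(bounds) for _ in range(3)])  # [1/4, 1/4, 5/16]
```

The sum here only ever increases, so each yielded value is a valid lower bound; the impossibility result in the text says precisely that no computable function bounds the error of these approximations.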