The Question of Hypercomputation

"There is no doubt that the study of hypercomputation will lead to many important theoretical results in computer science, philosophy, mathematics, and physics" (Toby Ord).



Computing and the Church-Turing Thesis

Since it has not been possible to capture the concept of computation directly, computation is defined as anything that can be computed by a Turing machine (TM): computability is identified with being a process executable on a TM. The so-called Church-Turing thesis (CTT) states: "Every effective algorithm or procedure is computable by a Turing machine (it is Turing-computable)".

An effective procedure or effective computation is one that can be carried out by a human with pencil and paper. It is an informal notion; the TM is precisely its formalization. In this way, Turing formalized a concept that until then had been ambiguous.
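
The notion can be made concrete with a minimal Turing machine simulator (a sketch only; the sample machine and all names are illustrative, not from the source):

    # A minimal sketch of a Turing machine simulator. The sample
    # machine below reads a binary string, flips every bit, and halts.

    def run_tm(transitions, tape, state="q0", halt="halt"):
        tape = dict(enumerate(tape))   # sparse tape: position -> symbol
        head = 0
        while state != halt:
            symbol = tape.get(head, "_")             # "_" = blank cell
            state, write, move = transitions[(state, symbol)]
            tape[head] = write
            head += 1 if move == "R" else -1
        return "".join(tape[i] for i in sorted(tape)).strip("_")

    # (state, read symbol) -> (next state, write symbol, head move)
    flip = {
        ("q0", "0"): ("q0", "1", "R"),
        ("q0", "1"): ("q0", "0", "R"),
        ("q0", "_"): ("halt", "_", "R"),
    }
    print(run_tm(flip, "10110"))   # -> 01001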

There are many examples of effective computations: the elementary arithmetic operations, Euclid's algorithm for obtaining the greatest common divisor of two numbers, the algorithm for obtaining the least common multiple, etc.
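
For instance, Euclid's pencil-and-paper procedure translates directly into a program (a minimal sketch):

    # Euclid's algorithm, a canonical effective procedure: a human
    # could carry out exactly these steps with pencil and paper.

    def gcd(a: int, b: int) -> int:
        """Greatest common divisor by repeated division with remainder."""
        while b != 0:
            a, b = b, a % b
        return a

    def lcm(a: int, b: int) -> int:
        """Least common multiple, obtained via the gcd."""
        return a * b // gcd(a, b)

    print(gcd(252, 198))   # 18
    print(lcm(252, 198))   # 2772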

"Effective" is synonymous with "mechanical". The CTT, like any thesis, is not provable, but it is falsifiable in Karl Popper's sense: it would suffice to find an effective algorithm or procedure that is not computable by a TM.

The CTT is usually justified by appealing to 3 reasons:
  1. The various attempts to formalize the notion of computation have produced formalisms that are computationally equivalent to a TM.

  2. The TM connects abstract computability and physical computation, so the TM captures well the intuitive notion of computation.

  3. There are no known counterexamples. That is, no effectively calculable function has been found that cannot be computed by a TM, and no theoretical development has succeeded in disproving the CTT. The thesis is therefore considered the most fundamental law of computation, a law that sets the limits of what is computable and what is not.

    An example of a non-computable function is the "halting problem": determining whether or not a TM, given an input, will ever reach a final result, i.e., whether the machine will halt (see the sketch below).

For these reasons, most logicians and mathematicians accept the CTT as a reasonable and adequate thesis.
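
The classic argument for the halting problem's non-computability can be sketched in a few lines of code (a sketch only; halts() is a hypothetical decider, and the point is precisely that it cannot exist):

    # Sketch of the diagonalization argument. Assume a total function
    # halts(program, data) that decides whether program(data) halts.

    def halts(program, data):
        """Hypothetical halting decider (cannot actually exist)."""
        raise NotImplementedError

    def diagonal(program):
        """Do the opposite of what halts() predicts for program(program)."""
        if halts(program, program):
            while True:        # loop forever if predicted to halt
                pass
        return "halted"        # halt if predicted to loop

    # diagonal(diagonal) is contradictory either way: if it halts,
    # halts() predicted it loops; if it loops, halts() predicted it
    # halts. Hence no such halts() can exist.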


The fallacy of the CTT, according to Copeland

According to Jack Copeland [1998, 2000], the Church-Turing thesis has been widely misunderstood. This is what he calls "the Church-Turing fallacy": the thesis concerns only effective (pencil-and-paper) procedures, so it does not follow from it that every physical or mental process can be simulated by a Turing machine.


Hypercomputation

"Hypercomputation" −a term coined by Jack Copeland in 1999− is a hypothetical computational model that goes beyond a universal Turing machine (MTU). An MTU is an MT in which the program of a specific MT, along with its specific input data, is stored on memory tape and is capable of execution. An MTU can emulate any particular MT. In this sense, an MTU can be considered an abstract hypermachine or supermachine. Today, digital computers are, in essence, MTU's.

A task (process or computation) that cannot be performed by a UTM is termed "incomputable". A hypothetical hypercomputation machine could (paradoxically) compute the incomputable.

Hypercomputation is an interdisciplinary area of research that touches on a wide variety of fields, mainly computer science, mathematics, artificial intelligence, physics, psychology, and philosophy. The TM is usually considered to represent the computational limit at the physical level, but the issue is not sufficiently clear, so whether a machine exceeding this limit can ever be built remains an open question.

Hypercomputation is an emerging, relatively new, and very active field. Its researchers are trying to break the so-called "Turing barrier", both at the theoretical (mental) level and at the practical (physically constructible) level. Whether or not a hypercomputation machine is ever built, the topic is of enormous interest because the theoretical models provide many insights into the foundations of mathematics and computer science, connecting with deep philosophical and psychological questions.

According to Toby Ord [2002], the current theory of computation is in a situation similar to that of geometry before the emergence of non-Euclidean geometries. Euclidean geometry was considered the only true geometry because it was the geometry of the physical world. Non-Euclidean geometries were initially labeled "imaginary" to distinguish them from the "true" geometry; they belonged to a mental or theoretical realm, whether or not they had a physical reality.

An analogy can thus be drawn: non-Euclidean geometries led to a new and generalized view of geometry, and thanks to them Einstein was able to postulate in 1915, with his theory of general relativity, that the geometry of the universe is not Euclidean. In the same way, the new generalized computational models of hypercomputation should lead to new ideas and results.


The connection between Gödel's theorem and the Church-Turing thesis

In the 1930s two major "negative" results appeared regarding the limitations of mathematics and computer science.
  1. Gödel's incompleteness theorem.
    It is a metamathematical theorem stating that in any formal axiomatic system that includes the arithmetic of the natural numbers there are undecidable sentences, that is, sentences not derivable from the axioms, so their truth or falsity cannot be proved within the system.

    This theorem has been erroneously interpreted as expressing limitations of the human mind, as if there were sentences unknowable to it. It is really only a limitation of formal axiomatic systems, since the mind is not a formal axiomatic system but a semantic system based on degrees of freedom.

  2. The Church-Turing thesis.
    This thesis placed limits on what is computable by stating that only what can be performed by a TM or by Church's lambda calculus is computable, since both systems, although conceptually distinct, are computationally equivalent and represent the maximum possible computational power.

    This thesis has been applied to the laws of the universe, with three possibilities:

    1. The universe is a computer governed by the limitations of a TM, so it is impossible to build a physical device with greater computational power than a TM, because to do so would go against the laws of the universe.

    2. The universe is not a computer (the laws of the universe are not computable), so the TM is not applicable to the universe.

    3. The universe is a hypercomputer, so it would be possible to build a machine more powerful than a TM.

Both theoretical results are connected, since there is an analogy between functions not computable by a TM and sentences undecidable in a formal axiomatic system.


Proposed Hypercomputation Models

Alternative models have been proposed to try to overcome the limitations of the UTM, including the following.


TM with oracle

It can be said that hypercomputation was introduced by Turing himself: two years after presenting his theoretical machine, in his doctoral thesis "Systems of Logic Based on Ordinals" (supervised by Alonzo Church) he proposed a computational model that went beyond his original machine: the "O-machine", a TM with an oracle [Turing, 1938]. An O-machine is a TM with an additional degree of freedom: the machine is assisted by an oracle that can be asked questions and whose answers become part of the computational process. Turing did not specify the nature of that oracle. Today, Turing's work on O-machines is largely forgotten.
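
The idea can be pictured with a small sketch (not from the source; the names are illustrative): an O-machine performs ordinary computable steps plus queries to a black box whose inner workings are left unspecified.

    # Sketch of an O-machine: an ordinary computation that may consult
    # an oracle. For a halting oracle, the black box is not buildable.

    from typing import Callable

    Oracle = Callable[[str], bool]

    def o_machine(program_code: str, oracle: Oracle) -> str:
        # ordinary (Turing-computable) steps, plus one oracle query:
        if oracle(program_code):
            return "the program halts"
        return "the program loops"

    def halting_oracle(program_code: str) -> bool:
        """Stand-in for the uncomputable step of the O-machine."""
        raise NotImplementedError("a halting oracle is not computable")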


Accelerated TM

One hypercomputation model that attempts to overcome the limits of a TM is the accelerated TM (ATM). In classical models of computation, each operation is assumed to take one unit of time. However, it is conceivable to imagine a model of computation that executes its first operation in a unit u of time, the second operation in u/2, the third in u/4, the fourth in u/8, and so on. Such a machine would execute infinitely many operations in finite time (the geometric series sums to 2u), which would allow it to decide problems that are not Turing-computable, such as the halting problem.

But implementing an ATM at the physical level is impossible, because we stumble over the Planck time (≈ 5.4·10^-44 s), the time it takes light to travel the Planck length (≈ 1.6·10^-35 m). If we assume that u is one second, then by step 145 an ATM would need 2^-144 s ≈ 4.5·10^-44 s per operation, which is below the Planck time. Therefore, the implementation of an ATM is impossible at the physical (external) level, but possible at the mental (internal) level, where time does not exist. The situation is analogous to the paradox of Achilles and the tortoise: at the physical level Achilles catches up with the tortoise, but at the mental (logical or theoretical) level he never does.
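
The arithmetic behind both claims can be checked directly (a small sketch; the Planck-time value used is the standard ≈ 5.39·10^-44 s):

    # Step n of an accelerated TM takes u / 2**(n-1) seconds. Find the
    # first step shorter than the Planck time, and check that the total
    # running time converges to 2u.

    PLANCK_TIME = 5.39e-44   # seconds
    u = 1.0                  # duration of the first step, in seconds

    step, duration = 1, u
    while duration >= PLANCK_TIME:
        step += 1
        duration /= 2
    print(step)              # 145: first step below the Planck time

    total = sum(u / 2**(n - 1) for n in range(1, 200))
    print(total)             # the geometric series sums to 2u = 2.0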


Interactivity

Some authors contrast algorithms with interactivity [Wegner, 1997], claiming that interactive computation is more powerful than a TM.

Interactivity does not seek a final result; it is only a reaction to, or interaction with, the environment in which a system is situated. The possible inputs are not all specified in advance and may depend on intermediate results or external sources.


Super-recursivity

The term "super-recursivity" was coined by mathematician Mark Burgin [1999, 2004] to refer to complex algorithms that cannot be implemented in an MT. For Burgin, recursive algorithms can be implemented in MTs, but super-recursive algorithms cannot. Recursive algorithms are associated with computation, and super-recursive algorithms with hypercomputation, algorithms that are more efficient than traditional recursive algorithms.

According to Burgin, super-recursive algorithms provide a model closer to the real world; he gives the example of the Ackermann function. In his book "Super-Recursive Algorithms" [2004], Burgin develops a theory presenting several super-recursive mathematical models. One of them is the inductive TM: a TM that computes indefinitely, without halting, and produces a result from time to time. The reason for not halting is that in some cases it is impossible to know whether the final result has already been obtained.

The output can be of many types. If the successive outputs stop changing, the stabilized output is considered the result of the computation. If the output oscillates repetitively between two or more states, this is considered (or is equivalent to) a halt.

Inductive TMs should not be confused with TMs that compute indefinitely and produce no output.
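
A minimal sketch (not Burgin's formal model) of the inductive idea: the machine never halts by itself, but emits an output at every stage, and once the outputs stop changing the stable value counts as the result.

    # Inductive-style computation of sqrt(2) by Newton iteration: each
    # stage emits an approximation; the outputs eventually stabilize
    # (to the printed precision) even though the loop never terminates.

    from fractions import Fraction
    from itertools import islice

    def inductive_sqrt2():
        x = Fraction(1)
        while True:                  # never halts
            x = (x + 2 / x) / 2      # next approximation
            yield f"{float(x):.8f}"  # this stage's output

    for output in islice(inductive_sqrt2(), 6):
        print(output)
    # 1.50000000, 1.41666667, 1.41421569, 1.41421356, 1.41421356, ...
    # The output has stabilized: the result is 1.41421356.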

According to Burgin, there are three possible approaches toward hypercomputation:
  1. The hardware approach, with new, more efficient hardware elements or with more elements operating in parallel.

  2. The software approach, with new computational methods.

  3. The "infware" approach, based on the creation of new information structures that enhance information processing.


Quantum computing

Quantum computing is a theoretical computing model (or computational paradigm) inspired by the ability of quantum entities to exist in superpositions of states. While a classical computer is equivalent to a TM, a quantum computer is equivalent to a "quantum TM", a concept created by David Deutsch [1985], one of the main founders of quantum computing.

Quantum computing is based on the use of qubits (short for "quantum bits"). A bit is in exactly one of two possible states (0 or 1). A qubit simultaneously holds both states, each to a different degree, in coherent superposition. With n bits there are 2^n possible states (mutually exclusive). With n qubits there are 2^n simultaneous states, each present to some degree, which would allow 2^n operations to be performed in parallel.

The two basis states of a qubit are represented as |0⟩ and |1⟩. A qubit is a linear combination |x⟩ = c0|0⟩ + c1|1⟩ of these two states, where the coefficients c0 and c1 are complex numbers (called amplitudes) satisfying the condition |c0|² + |c1|² = 1, where |ci| is the modulus of the complex number ci. The modulus of a complex number a+bi is √(a²+b²), so its square is (a+bi)(a−bi) = a²+b². The values |c0|² and |c1|² are interpreted as the probabilities that, upon measurement, the system jumps to the state |0⟩ or |1⟩, respectively.

An example of a qubit is (1/√2)|0⟩ + (1/√2)|1⟩, which satisfies (1/√2)² + (1/√2)² = 1.

Mathematically, a qubit is an element of a 2-dimensional Hilbert space. A Hilbert space of n dimensions is a complex vector space (with an inner product) whose vectors are combinations of n basis vectors with complex coefficients; a Hilbert space can also be of infinite dimension. A quantum system of n qubits forms a Hilbert space of 2^n dimensions. The so-called "Bloch sphere" (also known as the Poincaré sphere) represents all possible values of a qubit as the points on the surface of a sphere, each a combination of the basis states |0⟩ and |1⟩; these basis states correspond to the poles of the sphere.

An example of a qubit is the spin of a particle. Spin is associated with rotational motion around an axis of the particle; the rotation can be in one direction or in the opposite one, and a qubit is a superposition of both states.

When we have 2 qubits, there are 4 basis states: |00⟩, |01⟩, |10⟩, |11⟩. The state of the system is a linear combination (superposition) of these 4 states: c0|00⟩ + c1|01⟩ + c2|10⟩ + c3|11⟩. When we have n qubits, there are 2^n basis states.
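
Numerically (a small illustration, not from the source), an n-qubit state is just a vector of 2^n amplitudes, built with the tensor (Kronecker) product:

    import numpy as np

    # A 2-qubit state as a vector of 4 amplitudes over the basis
    # |00>, |01>, |10>, |11>.

    plus = np.array([1, 1], dtype=complex) / np.sqrt(2)  # (|0>+|1>)/√2
    ket0 = np.array([1, 0], dtype=complex)               # |0>

    state = np.kron(plus, ket0)     # amplitudes of |00>,|01>,|10>,|11>
    print(state)                    # [0.707, 0, 0.707, 0]
    print(np.abs(state) ** 2)       # probabilities, summing to 1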

Quantum computing is based on quantum logic gates (or, simply, quantum gates). A quantum gate is a function that takes one or more qubits as its argument. Quantum gates differ from classical logic gates in 3 ways: 1) their measured results are probabilistic (classical gates are deterministic); 2) they are reversible (most classical gates are not); 3) the number of output qubits always equals the number of input qubits (a classical gate outputs a single bit). Quantum gates are represented by matrices: a gate acting on n qubits corresponds to a 2^n × 2^n matrix.

Examples of simple (1-qubit) quantum gates are the Hadamard gate, the Pauli gates (X, Y, Z), and the phase gates. A universal gate set is a set of gates such that any other gate can be defined as a finite combination of them. In general, a quantum process is a sequence of m logic gates acting on n input qubits. All quantum algorithms are probabilistic because they act on qubits, whose measured values are probabilistic.
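
As a concrete illustration (not from the source), the following sketch applies a Hadamard gate to |0⟩ and reads off the measurement probabilities:

    import numpy as np

    # Apply a Hadamard gate to |0> and compute the measurement
    # probabilities |c0|^2 and |c1|^2 from the resulting amplitudes.

    ket0 = np.array([1, 0], dtype=complex)  # |0>
    H = np.array([[1,  1],
                  [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

    state = H @ ket0            # (1/√2)|0> + (1/√2)|1>
    print(state)                # [0.707..., 0.707...]
    print(np.abs(state) ** 2)   # [0.5, 0.5]: the "fair coin" qubit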

The problem of finding suitable hardware for quantum computing has not yet been solved, although different systems have been proposed. The reason is that no one has yet been able to create a sufficiently reliable version of the qubit, the basic building block of a quantum computer. For a superposition (of one or several qubits) to be stable, the system must be isolated from the outside and cooled, because superpositions are fragile and collapse (the system enters a state of decoherence).

Quantum computing received a major boost with Peter Shor's algorithm [1994]: a quantum algorithm for decomposing a natural number n into prime factors in time of the order of (log n)³ and space of the order of log n. Shor showed that his algorithm could exponentially speed up the classical computation of factoring: it requires polynomial-time resources, whereas the known classical algorithms are exponential. For example, for a 300-digit number, the best classical algorithm requires about 10^24 steps, while Shor's algorithm requires "only" about 10^10 steps. Factoring a 1000-digit number would take billions of years on a classical computer; a quantum computer would do it in a few minutes, thus being able to break cryptosystems based on the difficulty of factoring the product of two large primes (such as the RSA public-key system).

Shor's algorithm was followed by other quantum algorithms aimed at solving computationally complex algebraic and combinatorial problems. Lov Grover's [1996] algorithm for searching an unordered sequence of data stands out. Searching a database of n records classically requires an average of n/2 steps; a quantum computer can do it in about √n steps. For one million records, a classical computer needs on average 500,000 steps, while a quantum computer does it in about 1,000 steps.
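
The figures can be checked with a tiny sketch (illustrative only):

    import math

    # Average classical search cost n/2 versus roughly sqrt(n) oracle
    # queries for Grover's algorithm.

    n = 1_000_000
    print(n // 2)           # 500000 classical steps on average
    print(math.isqrt(n))    # ~1000 quantum queries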

Tien D. Kieu [2005] claims to have a scheme according to which, in principle, quantum systems could be used to solve computationally undecidable problems in finite time.

Quantum information theory analyzes the possibilities offered by the laws of quantum physics for the storage, processing, and transmission of information.

Much research is being done to create the first quantum computer: a computer that makes use of quantum phenomena (such as superposition and entanglement) to perform computations. The research aims at an architecture compatible with current ones, i.e., one in which classical computing is a particular case of quantum computing. So far, the most progress has been made at the theoretical level.

The so-called "Di Vinzenzo's list" establishes 5 conditions that a quantum computing system must fulfill:
  1. The system must be able to be initialized, i.e. brought to a known initial state.

  2. The qubits must be manipulable by means of a universal set of logic gates.

  3. The system must maintain its coherence throughout the entire computation.

  4. After the computation, it must be possible to read the final state corresponding to the result.

  5. The system must be scalable in the number of qubits, so as to increase its computational power.

Current problems with quantum computing are: 1) a qubit cannot be cloned, copied, or transmitted; 2) reading the result of a computation changes the superposition state of the qubits.

Quantum algorithms have great advantages over classical algorithms. Processes can be executed in parallel without the need to add processors to the machine: an operation performed on a system of qubits is performed simultaneously on all of its basis states. This means that some processes that require exponential processing time can be reduced to linear time.

According to Seth Lloyd [2013], the universe is a quantum computer. This would explain why the universe is a mixture of order and randomness, and why it gives rise to both simple and complex systems.


MENTAL and Hypercomputation

First, we must distinguish between theoretical (mental, internal) models of computation and practical (physical, external) models, i.e., models implementable by mechanical, electronic, optical, biological, quantum, etc., devices.

David Deutsch claims that it makes no sense to talk about computability without a physical referent. However, Turing never spoke of implementing his theoretical machine or his O-machine: the TM is a theoretical (or mental) device, and the Church-Turing thesis is a theoretical thesis. Yet current computers are partial implementations of a UTM, and every program that runs on them is an implementation of a particular TM.

Therefore, it does make sense to posit theoretical models of computation, because one must always follow the Principle of Descending Causality: from the superior (mental) to the inferior (physical). This is precisely the case of MENTAL, a theoretical computational model of universal type, where the computational limit is set by universal semantic primitives. It is also a theoretical thesis, one that generalizes the Church-Turing thesis by contemplating not only computations but also descriptions. And it is a theory at a higher level of abstraction, conceptually superior to the universal TM, which deals in pure computational detail: programming an algorithm on a TM is a complex task because it requires too much detail, with very low-level instructions.

At the algorithmic computational level (regardless of implementation), MENTAL is superior to the UTM.

Addenda

The Quantum Mind

The theory of the quantum mind comes in two versions:
  1. Strong version. It postulates that the mind functions according to the principles of quantum physics.

  2. Weak version. Quantum phenomena (such as superposition and entanglement) play an important role in the functioning of the mind and may even be the basis for an explanation of consciousness.

The theory of the quantum mind rests on two points: 1) classical physics is not capable of explaining mind and consciousness; 2) quantum theory is the most fundamental theory of matter, and at that deep level there must be some correspondence or relationship with the mind.

The idea that quantum theory is related to the workings of the mind goes back to Eugene Wigner, who postulated that the wave function of a quantum entity collapses due to interaction with consciousness.

For researchers in favor of the quantum theory of mind, the process by which a quantum system starts in coherence and ends in collapse is similar to what happens in the mind: multiple ideas flit below the threshold of consciousness (in the subconscious), and then one of them emerges and becomes conscious.

According to David Bohm, in the universe there is a deep level (the implicate order) from which the physical world and the mental world (the explicate order) emerge. Mind and matter are manifestations or projections of the implicate order in the explicate order. Consciousness emerges from the implicate order. The understanding of movement in consciousness (like listening to music) is also grounded in the implicate order.

For Roger Penrose, in his work "The Emperor's New Mind" [1989], the brain may be functioning as a quantum computer. Based on Gödel's incompleteness theorem, he claims that consciousness is not computable: the mind has the capacity to think beyond the algorithmic mode.


Bibliography