In computability theory, the Church–Turing thesis (also known as the Turing–Church thesis,^{[1]} the Church–Turing conjecture, Church's thesis, Church's conjecture, and Turing's thesis) is a combined hypothesis ("thesis") about the nature of functions whose values are effectively calculable; or, in more modern terms, functions whose values are algorithmically computable. In simple terms, the Church–Turing thesis states that a function is algorithmically computable if and only if it is computable by a Turing machine.
Several independent attempts were made in the first half of the 20th century to formalize the notion of computability:
 American mathematician Alonzo Church created a method for defining functions called the λ-calculus,
 British mathematician Alan Turing created a theoretical model for machines, now called Turing machines, that could carry out calculations from inputs,
 Kurt Gödel, with Jacques Herbrand, created a formal definition of a class of functions whose values could be calculated by recursion.
All three computational processes (recursion, the λ-calculus, and the Turing machine) were shown to be equivalent—all three approaches define the same class of functions.^{[2]}^{[3]} This has led mathematicians and computer scientists to believe that the concept of computability is accurately characterized by these three equivalent processes.
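As a small, concrete taste of why such equivalences are plausible, the following sketch (Python is used here purely as a neutral notation; all names are illustrative) encodes natural numbers as Church numerals, defines addition as a λ-term, and checks that it agrees with addition defined by recursion:

```python
# Church numerals: the number n is the functional that applies f n times.
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))

def to_int(church_n):
    """Decode a Church numeral by counting applications of f."""
    return church_n(lambda k: k + 1)(0)

def rec_add(m, n):
    """The same addition, defined instead by recursion on m."""
    return n if m == 0 else rec_add(m - 1, n) + 1

two = succ(succ(zero))
three = succ(two)
assert to_int(add(two)(three)) == rec_add(2, 3) == 5
```

The proofs of equivalence show that every function definable in one of the three formalisms, as in this toy case, is definable in the other two.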
Informally, the Church–Turing thesis states that if some method (algorithm) exists to carry out a calculation, then the same calculation can also be carried out by a Turing machine (as well as by a recursively definable function, and by a λ-function).
Even though the three processes mentioned above proved to be equivalent, the fundamental premise behind the thesis — the notion of what it means for a function to be effectively calculable — is "a somewhat vague intuitive one".^{[4]} Thus, the thesis, although it has near-universal acceptance, cannot be formally proven.
Formal statement
J. B. Rosser (1939) addresses the notion of "effective computability" as follows: "Clearly the existence of CC and RC (Church's and Rosser's proofs) presupposes a precise definition of 'effective'. 'Effective method' is here used in the rather special sense of a method each step of which is precisely predetermined and which is certain to produce the answer in a finite number of steps".^{[5]} Thus the adverb-adjective "effective" is used in a sense of "1a: producing a decided, decisive, or desired effect", and "capable of producing a result".^{[6]}
In the following, the words "effectively calculable" will mean "produced by any intuitively 'effective' means whatsoever" and "effectively computable" will mean "produced by a Turing machine or equivalent mechanical device". Turing's "definitions" given in a footnote in his 1939 Ph.D. thesis Systems of Logic Based on Ordinals, supervised by Church, are virtually the same:
 "^{†} We shall use the expression 'computable function' to mean a function calculable by a machine, and let 'effectively calculable' refer to the intuitive idea without particular identification with any one of these definitions."^{[7]}
The thesis can be stated as follows:
 Every effectively calculable function is a computable function.^{[8]}
Turing stated it this way:
 "It was stated ... that 'a function is effectively calculable if its values can be found by some purely mechanical process.' We may take this literally, understanding that by a purely mechanical process one which could be carried out by a machine. The development ... leads to ... an identification of computability^{†} with effective calculability." († is the footnote above, ibid.)
History
One of the important problems for logicians in the 1930s was David Hilbert's Entscheidungsproblem, which asked whether there was a mechanical procedure for separating mathematical truths from mathematical falsehoods. This quest required that the notion of "algorithm" or "effective calculability" be pinned down, at least well enough for the quest to begin.^{[9]} But from the very outset Alonzo Church's attempts began with a debate that continues to this day.^{[10]} Was the notion of "effective calculability" to be (i) an "axiom or axioms" in an axiomatic system, (ii) merely a definition that "identified" two or more propositions, (iii) an empirical hypothesis to be verified by observation of natural events, or (iv) just a proposal for the sake of argument (i.e. a "thesis")?
Circa 1930–1952
In the course of studying the problem, Church and his student Stephen Kleene introduced the notion of λ-definable functions, and they were able to prove that several large classes of functions frequently encountered in number theory were λ-definable.^{[11]} The debate began when Church proposed to Gödel that one should define the "effectively computable" functions as the λ-definable functions. Gödel, however, was not convinced and called the proposal "thoroughly unsatisfactory".^{[12]} Rather, in correspondence with Church (ca 1934–5), Gödel proposed axiomatizing the notion of "effective calculability"; indeed, in a 1935 letter to Kleene, Church reported that:
 "His [Gödel's] only idea at the time was that it might be possible, in terms of effective calculability as an undefined notion, to state a set of axioms which would embody the generally accepted properties of this notion, and to do something on that basis".^{[13]}
But Gödel offered no further guidance. Eventually, he would suggest his (primitive) recursion, modified by Herbrand's suggestion, which Gödel had detailed in his 1934 lectures in Princeton, NJ (Kleene and another student, Rosser, transcribed the notes). But he "did not think that the two ideas could be satisfactorily identified 'except heuristically'".^{[14]}
Next, it was necessary to identify and prove the equivalence of two notions of effective calculability. Equipped with the λ-calculus and "general" recursion, Stephen Kleene, with the help of Church and J. B. Rosser, produced proofs (1933, 1935) showing that the two calculi are equivalent. Church subsequently modified his methods to include use of Herbrand–Gödel recursion and then proved (1936) that the Entscheidungsproblem is unsolvable: there is no generalized "effective calculation" (method, algorithm) that can determine whether or not a formula in either the recursive or the λ-calculus is "valid" (more precisely: no method to show that a well-formed formula has a "normal form").^{[15]}
Many years later in a letter to Davis (ca 1965), Gödel would confess that "he was, at the time of these [1934] lectures, not at all convinced that his concept of recursion comprised all possible recursions".^{[16]} By 1963–4 Gödel would disavow Herbrand–Gödel recursion and the λ-calculus in favor of the Turing machine as the definition of "algorithm" or "mechanical procedure" or "formal system".^{[17]}
A hypothesis leading to a natural law?: In late 1936 Alan Turing's paper (also proving that the Entscheidungsproblem is unsolvable) was delivered orally, but had not yet appeared in print.^{[18]} On the other hand, Emil Post's 1936 paper had appeared and was certified independent of Turing's work.^{[19]} Post strongly disagreed with Church's "identification" of effective computability with the λ-calculus and recursion, stating:
 "Actually the work already done by Church and others carries this identification considerably beyond the working hypothesis stage. But to mask this identification under a definition . . . blinds us to the need of its continual verification."^{[20]}
Rather, he regarded the notion of "effective calculability" as merely a "working hypothesis" that might lead by inductive reasoning to a "natural law" rather than by "a definition or an axiom".^{[21]} This idea was "sharply" criticized by Church.^{[22]}
Thus Post in his 1936 paper was also discounting Kurt Gödel's suggestion to Church in 1934–5 that the thesis might be expressed as an axiom or set of axioms.^{[13]}
Turing adds another definition, Rosser equates all three: Within just a short time, Turing's 1936–37 paper "On Computable Numbers, with an Application to the Entscheidungsproblem"^{[18]} appeared. In it he stated another notion of "effective computability" with the introduction of his a-machines (now known as the Turing machine abstract computational model). And in a proof sketch added as an "Appendix" to his 1936–37 paper, Turing showed that the classes of functions defined by the λ-calculus and Turing machines coincided.^{[23]}
In a few years (1939) Turing would propose, like Church and Kleene before him, that his formal definition of a mechanical computing agent was the correct one.^{[24]} Thus, by 1939, both Church (1934) and Turing (1939), neither having knowledge of the other's efforts, had individually proposed that their "formal systems" should be definitions of "effective calculability";^{[25]} neither framed their statements as theses.
Rosser (1939) formally identified the three notions-as-definitions:
 "All three definitions are equivalent, so it does not matter which one is used."^{[26]}
Kleene proposes Church's Thesis: This left the overt expression of a "thesis" to Kleene. In his 1943 paper Recursive Predicates and Quantifiers Kleene proposed his "THESIS I":
 "This heuristic fact [general recursive functions are effectively calculable]...led Church to state the following thesis(^{22}). The same thesis is implicit in Turing's description of computing machines(^{23}).
 "THESIS I. Every effectively calculable function (effectively decidable predicate) is general^{[27]} recursive [Kleene's italics]
 "Since a precise mathematical definition of the term effectively calculable (effectively decidable) has been wanting, we can take this thesis ... as a definition of it..."^{[28]}
 "(^{22}) references Church 1936
 "(^{23}) references Turing 1936–7
Kleene goes on to note that:
 "...the thesis has the character of an hypothesis—a point emphasized by Post and by Church(^{24}). If we consider the thesis and its converse as definition, then the hypothesis is an hypothesis about the application of the mathematical theory developed from the definition. For the acceptance of the hypothesis, there are, as we have suggested, quite compelling grounds."^{[28]}
 "(24) references Post 1936 of Post and Church's Formal definitions in the theory of ordinal numbers, Fund. Math. vol 28 (1936) pp.11–21 (see ref. #2, Davis 1965:286).
Kleene's Church–Turing Thesis: A few years later (1952) Kleene would overtly name, defend, and express the two "theses" and then "identify" them (show equivalence) by use of his Theorem XXX:
 "Heuristic evidence and other considerations led Church 1936 to propose the following thesis.
 Thesis I. Every effectively calculable function (effectively decidable predicate) is general recursive.^{[29]}
 Theorem XXX: "The following classes of partial functions are coextensive, i.e. have the same members: (a) the partial recursive functions, (b) the computable functions. . . ".^{[30]}
 Turing's thesis: "Turing's thesis that every function which would naturally be regarded as computable is computable under his definition, i.e. by one of his machines, is equivalent to Church's thesis by Theorem XXX."^{[31]}
Later developments
An attempt to understand the notion of "effective computability" better led Robin Gandy (Turing's student and friend) in 1980 to analyze machine computation (as opposed to human computation acted out by a Turing machine). Gandy's curiosity about, and analysis of, "cellular automata", "Conway's game of life", "parallelism" and "crystalline automata" led him to propose four "principles (or constraints) ... which it is argued, any machine must satisfy."^{[32]} His most important fourth, "the principle of causality", is based on the "finite velocity of propagation of effects and signals; contemporary physics rejects the possibility of instantaneous action at a distance."^{[33]} From these principles and some additional constraints—(1a) a lower bound on the linear dimensions of any of the parts, (1b) an upper bound on speed of propagation (the velocity of light), (2) discrete progress of the machine, and (3) deterministic behavior—he produces a theorem that "What can be calculated by a device satisfying principles I–IV is computable."^{[34]}
In the late 1990s Wilfried Sieg analyzed Turing's and Gandy's notions of "effective calculability" with the intent of "sharpening the informal notion, formulating its general features axiomatically, and investigating the axiomatic framework".^{[35]} In his 1997 and 2002 papers, Sieg presents a series of constraints on the behavior of a computor—"a human computing agent who proceeds mechanically"; these constraints reduce to:
 "(B.1) (Boundedness) There is a fixed bound on the number of symbolic configurations a computor can immediately recognize.
 "(B.2) (Boundedness) There is a fixed bound on the number of internal states a computor can be in.
 "(L.1) (Locality) A computor can change only elements of an observed symbolic configuration.
 "(L.2) (Locality) A computor can shift attention from one symbolic configuration to another one, but the new observed configurations must be within a bounded distance of the immediately previously observed configuration.
 "(D) (Determinacy) The immediately recognizable (sub)configuration determines uniquely the next computation step (and id [instantaneous description] )"; stated another way: "A computor's internal state together with the observed configuration fixes uniquely the next computation step and the next internal state."^{[36]}
The matter remains in active discussion within the academic community.^{[37]}
The thesis as a definition
The thesis can be viewed as nothing but an ordinary mathematical definition. Comments by Gödel on the subject suggest this view, e.g. "the correct definition of mechanical computability was established beyond any doubt by Turing".^{[38]} The case for viewing the thesis as nothing more than a definition is made explicitly by Robert I. Soare,^{[39]} where it is also argued that Turing's definition of computability is no less likely to be correct than the epsilon-delta definition of a continuous function.
Success of the thesis
Other formalisms (besides recursion, the λ-calculus, and the Turing machine) have been proposed for describing effective calculability/computability. Stephen Kleene (1952) adds to the list the functions "reckonable in the system S_{1}" of Kurt Gödel 1936, and Emil Post's (1943, 1946) "canonical [also called normal] systems".^{[40]} In the 1950s Hao Wang and Martin Davis greatly simplified the one-tape Turing-machine model (see Post–Turing machine). Marvin Minsky expanded the model to two or more tapes and greatly simplified the tapes into "up-down counters", which Melzak and Lambek further evolved into what is now known as the counter machine model. In the late 1960s and early 1970s researchers expanded the counter machine model into the register machine, a close cousin to the modern notion of the computer. Other models include combinatory logic and Markov algorithms. Gurevich adds the pointer machine model of Kolmogorov and Uspensky (1953, 1958): "...they just wanted to ... convince themselves that there is no way to extend the notion of computable function."^{[41]}
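As an illustration of how spare some of these models are, here is a minimal counter-machine interpreter (the instruction encoding is hypothetical, chosen only for readability); its two instruction types, increment and decrement-or-jump-on-zero, suffice for Turing completeness once enough counters are available:

```python
def run_counter_machine(program, counters):
    """Run a Minsky-style counter machine.

    program: a list of instructions, each either
        ('inc', r)       -- add 1 to counter r
        ('jzdec', r, z)  -- if counter r is 0 jump to z, else subtract 1
    Halts when the program counter runs off the end of the program.
    """
    pc = 0
    while pc < len(program):
        op = program[pc]
        if op[0] == 'inc':
            counters[op[1]] += 1
            pc += 1
        else:  # 'jzdec'
            _, r, z = op
            if counters[r] == 0:
                pc = z
            else:
                counters[r] -= 1
                pc += 1
    return counters

# Example: drain counter 0 into counter 1 (counter 2 stays 0, so the
# final 'jzdec' acts as an unconditional jump back to the top).
prog = [('jzdec', 0, 3), ('inc', 1), ('jzdec', 2, 0)]
assert run_counter_machine(prog, [3, 0, 0]) == [0, 3, 0]
```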
All these contributions involve proofs that the models are computationally equivalent to the Turing machine; such models are said to be Turing complete. Because all these different attempts at formalizing the concept of "effective calculability/computability" have yielded equivalent results, it is now generally assumed that the Church–Turing thesis is correct. In fact, Gödel (1936) proposed something stronger than this; he observed that there was something "absolute" about the concept of "reckonable in S_{1}":
 "It may also be shown that a function which is computable ['reckonable'] in one of the systems S_{i}, or even in a system of transfinite type, is already computable [reckonable] in S_{1}. Thus the concept 'computable' ['reckonable'] is in a certain definite sense 'absolute', while practically all other familiar metamathematical concepts (e.g. provable, definable, etc.) depend quite essentially on the system to which they are defined"^{[42]}
Informal usage in proofs
Proofs in computability theory often invoke^{[43]} the Church–Turing thesis in an informal way to establish the computability of functions while avoiding the (often very long) details which would be involved in a rigorous, formal proof. To establish that a function is computable by Turing machine, it is usually considered sufficient to give an informal English description of how the function can be effectively computed, and then conclude "By the Church–Turing thesis" that the function is Turing computable (equivalently partial recursive).
Dirk van Dalen (in Gabbay 2001:284^{[44]}) gives the following example for the sake of illustrating this informal use of the Church–Turing thesis:
 EXAMPLE: Each infinite RE set contains an infinite recursive set.
 Proof: Let A be infinite RE. We list the elements of A effectively, n_{0}, n_{1}, n_{2}, n_{3}, ...
 From this list we extract an increasing sublist: put m_{0}=n_{0}; after finitely many steps we find an n_{k} such that n_{k} > m_{0}, put m_{1}=n_{k}. We repeat this procedure to find m_{2} > m_{1}, etc. This yields an effective listing of the subset B={m_{0},m_{1},m_{2},...} of A, with the property m_{i} < m_{i+1}.
 Claim. B is decidable. For, in order to test k in B we must check if k=m_{i} for some i. Since the sequence of m_{i}'s is increasing we have to produce at most k+1 elements of the list and compare them with k. If none of them is equal to k, then k not in B. Since this test is effective, B is decidable and, by Church's thesis, recursive.
(Emphasis added). In order to make the above example completely rigorous, one would have to carefully construct a Turing machine, or λ-function, or carefully invoke recursion axioms, or at best, cleverly invoke various theorems of computability theory. But because the computability theorist believes that Turing computability correctly captures what can be computed effectively, and because an effective procedure is spelled out in English for deciding the set B, the computability theorist accepts this as proof that the set is indeed recursive.
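Van Dalen's procedure translates almost line by line into a program. In the sketch below (illustrative names; `enumerate_A` stands for any effective enumeration of an infinite r.e. set A), `increasing_sublist` produces B and `in_B` decides membership by producing at most k + 1 of its elements:

```python
import itertools

def increasing_sublist(enumerate_A):
    """Yield m_0 < m_1 < m_2 < ..., following van Dalen's proof.

    enumerate_A() must yield the members of an infinite r.e. set A in
    some effective (not necessarily increasing) order: n_0, n_1, n_2, ...
    """
    elements = enumerate_A()
    m = next(elements)            # m_0 = n_0
    yield m
    for n in elements:            # after finitely many steps some n_k > m_i
        if n > m:
            m = n                 # m_{i+1} = n_k
            yield m

def in_B(k, enumerate_A):
    """Decide whether k is in B.  The m_i are strictly increasing, so at
    most k + 1 of them are <= k, and the loop always terminates."""
    for m in increasing_sublist(enumerate_A):
        if m == k:
            return True
        if m > k:
            return False

# Illustrative A: the squares, enumerated in a scrambled but effective order.
def enumerate_A():
    for i in itertools.count():
        yield (i ^ 1) ** 2        # 1, 0, 9, 4, 25, 16, ...

assert in_B(9, enumerate_A) and not in_B(8, enumerate_A)
```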
As a rule of thumb, the Church–Turing thesis should only be invoked to simplify proofs in cases where the writer would be capable of, and expects the readers also to be capable of, easily (but not necessarily without tedium) producing a rigorous proof if one were demanded.
Variations
The success of the Church–Turing thesis prompted variations of the thesis to be proposed. For example, the Physical Church–Turing thesis (PCTT) states:
 "According to Physical CTT, all physically computable functions are Turingcomputable"^{[45]}
The Church–Turing thesis says nothing about the efficiency with which one model of computation can simulate another. It has been proved for instance that a (multitape) universal Turing machine only suffers a logarithmic slowdown factor in simulating any Turing machine.^{[46]} No such result has been proved in general for an arbitrary but reasonable model of computation. A variation of the Church–Turing thesis that addresses this issue is the Feasibility Thesis^{[47]} or (Classical) Complexity-Theoretic Church–Turing Thesis (SCTT), which is not due to Church or Turing, but rather was realized gradually in the development of complexity theory. It states:^{[48]}
 "A probabilistic Turing machine can efficiently simulate any realistic model of computation."
The word 'efficiently' here means up to polynomial-time reductions. This thesis was originally called Computational Complexity-Theoretic Church–Turing Thesis by Ethan Bernstein and Umesh Vazirani (1997). The Complexity-Theoretic Church–Turing Thesis, then, posits that all 'reasonable' models of computation yield the same class of problems that can be computed in polynomial time. Assuming the conjecture that probabilistic polynomial time (BPP) equals deterministic polynomial time (P), the word 'probabilistic' is optional in the Complexity-Theoretic Church–Turing Thesis. A similar thesis, called the Invariant Thesis, was introduced by Cees F. Slot and Peter van Emde Boas. It states: "Reasonable" machines can simulate each other within a polynomially bounded overhead in time and a constant-factor overhead in space.^{[49]} The thesis originally appeared in a paper at STOC'84, which was the first paper to show that polynomial-time overhead and constant-space overhead could be simultaneously achieved for a simulation of a Random Access Machine on a Turing machine.^{[50]}
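Stated symbolically (a paraphrase for orientation, not Slot and van Emde Boas's exact formulation), the Invariant Thesis requires that for any two "reasonable" machine models M and M′ there exist a polynomial p and a constant c > 0 such that, for all inputs x,

```latex
\mathrm{time}_{M'}(x) \;\le\; p\bigl(\mathrm{time}_{M}(x)\bigr)
\qquad\text{and}\qquad
\mathrm{space}_{M'}(x) \;\le\; c \cdot \mathrm{space}_{M}(x).
```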
If BQP is shown to be a strict superset of BPP, it would invalidate the Complexity-Theoretic Church–Turing Thesis. In other words, there would be efficient quantum algorithms that perform tasks that do not have efficient probabilistic algorithms. This would not however invalidate the original Church–Turing thesis, since a quantum computer can always be simulated by a Turing machine, but it would invalidate the classical Complexity-Theoretic Church–Turing thesis for efficiency reasons. Consequently, the Quantum Complexity-Theoretic Church–Turing thesis states:^{[48]}
 "A quantum Turing machine can efficiently simulate any realistic model of computation."
Eugene Eberbach and Peter Wegner^{[51]} claim that the Church–Turing thesis is sometimes interpreted too broadly, stating that "the broader assertion that algorithms precisely capture what can be computed is invalid". They claim that forms of computation not captured by the thesis are relevant today, which they call super-Turing computation.
Philosophical implications
Philosophers have interpreted the Church–Turing thesis as having implications for the philosophy of mind; however, many of the philosophical interpretations of the thesis involve basic misunderstandings of the thesis statement.^{[52]} B. Jack Copeland states that it is an open empirical question whether there are actual deterministic physical processes that, in the long run, elude simulation by a Turing machine; furthermore, he states that it is an open empirical question whether any such processes are involved in the working of the human brain.^{[53]} There are also some important open questions which cover the relationship between the Church–Turing thesis and physics, and the possibility of hypercomputation. When applied to physics, the thesis has several possible meanings:
 The universe is equivalent to a Turing machine; thus, computing non-recursive functions is physically impossible. This has been termed the Strong Church–Turing thesis and is a foundation of digital physics.
 The universe is not equivalent to a Turing machine (i.e., the laws of physics are not Turing-computable), but incomputable physical events are not "harnessable" for the construction of a hypercomputer. For example, a universe in which physics involves real numbers, as opposed to computable reals, might fall into this category. The assumption that incomputable physical events are not "harnessable" has been challenged, however,^{[54]} by a proposed computational process that uses quantum randomness together with a computational machine to hide the computational steps of a Universal Turing Machine with Turing-incomputable firing patterns.
 The universe is a hypercomputer, and it is possible to build physical devices to harness this property and calculate non-recursive functions. For example, it is an open question whether all quantum mechanical events are Turing-computable, although it is known that rigorous models such as quantum Turing machines are equivalent to deterministic Turing machines. (They are not necessarily efficiently equivalent; see above.) John Lucas and Roger Penrose^{[55]} have suggested that the human mind might be the result of some kind of quantum-mechanically enhanced, "non-algorithmic" computation, although there is no scientific evidence for this proposal.
There are many other technical possibilities which fall outside or between these three categories, but these serve to illustrate the range of the concept.
Non-computable functions
One can formally define functions that are not computable. A well-known example of such a function is the Busy Beaver function. This function takes an input n and returns the largest number of symbols that a Turing machine with n states can print before halting, when run with no input. Finding an upper bound on the busy beaver function is equivalent to solving the halting problem, a problem known to be unsolvable by Turing machines. Since the busy beaver function cannot be computed by Turing machines, the Church–Turing thesis states that this function cannot be effectively computed by any method.
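The obstruction shows up immediately if one tries to compute the function. The minimal sketch below (illustrative; the machine encoding is an assumption chosen for brevity) enumerates every n-state, 2-symbol machine and simulates each one with a step cutoff. Because any fixed cutoff may misclassify a slow halting machine as non-halting, the result is only a lower bound on BB(n), and removing the cutoff would amount to solving the halting problem:

```python
import itertools

def simulate(delta, max_steps):
    """Run a machine from state 0 on a blank tape; return the number of
    1s on the tape if it halts ('H') within max_steps, else None."""
    tape, head, state = {}, 0, 0
    for _ in range(max_steps):
        if state == 'H':
            return sum(tape.values())
        write, move, state = delta[(state, tape.get(head, 0))]
        tape[head] = write
        head += move
    return None

def all_machines(n):
    """Enumerate every n-state, 2-symbol machine as a transition table."""
    keys = [(q, s) for q in range(n) for s in (0, 1)]
    actions = list(itertools.product((0, 1), (-1, 1), list(range(n)) + ['H']))
    for choice in itertools.product(actions, repeat=len(keys)):
        yield dict(zip(keys, choice))

def busy_beaver_lower_bound(n, cutoff):
    """Only a LOWER bound on BB(n): a fixed cutoff may dismiss a slow
    halting machine as non-halting, and no computable rule can pick a
    cutoff guaranteed to be large enough for every n."""
    return max(
        (ones for m in all_machines(n)
         if (ones := simulate(m, cutoff)) is not None),
        default=0)

assert busy_beaver_lower_bound(1, cutoff=10) == 1   # BB(1) = 1
```

For n = 1 the bound happens to be exact, but the thesis implies that no effective method can certify exactness in general.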
Several computational models allow for the computation of (Church–Turing) non-computable functions. These are known as hypercomputers.
Mark Burgin^{[56]} argues that super-recursive algorithms such as inductive Turing machines disprove the Church–Turing thesis. His argument relies on a definition of algorithm broader than the ordinary one, so that non-computable functions obtained from some inductive Turing machines are called computable. This interpretation of the Church–Turing thesis differs from the interpretation commonly accepted in computability theory, discussed above. The argument that super-recursive algorithms are indeed algorithms in the sense of the Church–Turing thesis has not found broad acceptance within the computability research community.
References
 Includes original papers by Gödel, Church, Turing, Rosser, Kleene, and Post mentioned in this section.
 Cited by Kleene (1952) as "Über die Länge von Beweisen", in Ergebnisse eines math. Koll., etc.
 Reprinted in The Undecidable, p. 255ff. Kleene refined his definition of "general recursion" and proceeded in his chapter "12. Algorithmic theories" to posit "Thesis I" (p. 274); he would later repeat this thesis (in Kleene 1952:300) and name it "Church's Thesis" (Kleene 1952:317) (i.e., the Church thesis).
External links
 Stanford Encyclopedia of Philosophy: a comprehensive philosophical treatment of relevant issues.