January 15, 2008

* Turing Machine
* Variants of Turing Machines
* Church-Turing Thesis

TURING MACHINE

A Turing Machine (TM) is a 7-tuple (Q, Sigma, Gamma, d, q_0, q_a, q_r), where

1. Q is a finite set of states,
2. Sigma is the input alphabet, not containing the blank symbol u,
3. Gamma is the tape alphabet, where u in Gamma and Sigma subset of Gamma,
4. d: Q x Gamma -> Q x Gamma x {L,R} is the transition function,
5. q_0 in Q is the start state,
6. q_a in Q is the accept state,
7. q_r in Q is the reject state, where q_r != q_a.

Configuration -- captures the state, the contents of the tape, and the current location of the head. Written as uqv, where q is the current state, uv is the current contents of the tape, and the head is at the first symbol of v. Following the last symbol of v, the tape contains only blanks.

ua q_i bv yields u q_j acv  iff  d(q_i, b) = (q_j, c, L)
ua q_i bv yields uac q_j v  iff  d(q_i, b) = (q_j, c, R)

The left and right ends need special handling: at the left end, a left move leaves the head in place, and at the right end the configuration u q_i is treated the same as u q_i u (the head reads a blank).

The start configuration of M on input w is the configuration q_0 w. An accepting (resp., rejecting) configuration is one whose state is q_a (resp., q_r); in such a configuration, the machine accepts (resp., rejects) the input string and halts.

A TM M accepts w if a sequence of configurations C_1, C_2, ..., C_k exists such that:

1. C_1 is the start configuration of M on w,
2. C_i yields C_{i+1} for each i,
3. C_k is an accepting configuration.

The language recognized by a TM M is the set of all strings accepted by M. A language L is Turing-recognizable (recursively enumerable) if it is the language recognized by some TM. A TM is a decider if it halts on all inputs. A language L is Turing-decidable (recursive) if it is the language recognized by a decider TM.

Example: {a^i b^j c^k : 0 <= i <= j <= k}

On input w:

0. Start at the left end.
1. If no a is found, go to step 4.
2. Write the marker # over the first a, then keep moving right until no more a's or *'s are found. If the current symbol is not b, reject.
3. Otherwise, replace the b by *, then keep moving left until # is found. Go to step 1.
4. Keep moving right until no more *'s are found, replacing each * by b along the way.
5. Keep moving left until the first b. Repeat steps 1, 2, 3 with a replaced by b and b replaced by c; instead of step 4, go to step 6.
6. Keep moving right until no more *'s or c's are found. If the current symbol is u (blank), accept; otherwise reject.

===================

VARIANTS OF TURING MACHINES

One of the hallmarks of a good computational model is its robustness. How do "minor" changes to the parameters affect the class of languages recognized by the machines?

-- Replace {L,R} by {L,R,S}
-- Multiple tapes instead of one
-- Nondeterministic transitions instead of deterministic transitions
-- Two-way infinite tape
-- Storage in the state

Turing machines are amazingly robust: all of the above variants recognize the same class of languages.

Multitape machines

A multitape TM is like an ordinary TM with several tapes, each with its own head for reading and writing. Initially the input is on tape 1, and the other tapes are blank. The transition function is

d: Q x Gamma^k -> Q x Gamma^k x {L,R}^k

Theorem: Every multitape TM has an equivalent single-tape TM.

Proof: Given a multitape TM M with k tapes, we simulate it using a single-tape TM S. The tape of S contains the contents of the first tape of M followed by #, then the second tape of M followed by #, then the third tape, and so on. To indicate the current locations of the heads, we replace the tape alphabet Gamma with Gamma union Gamma', which contains a "primed" version of each symbol of Gamma; the primed symbol marks the cell each virtual head is on. The initial tape contents are #w_1w_2...w_n#u#u#...# . To simulate a single move of M, S scans the tape from the first # to the last #, reading the k symbols that the heads of M are pointing to and memorizing them in its state; it then applies the transition function of M, and finally makes a second sweep implementing the move of M.
If, at any point, S moves one of the virtual heads to the right onto a #, the corresponding tape of M has grown: S extends it by shifting the tape contents from that # onward one cell to the right and writing a blank symbol in the freed cell. End Proof

Nondeterministic machines

A nondeterministic TM (NDTM) is identical to a TM except that the transition function has the form

d: Q x Gamma -> P(Q x Gamma x {L,R})

Theorem: Every nondeterministic TM has an equivalent deterministic TM.

Proof: The simulating deterministic TM D has three tapes. Tape 1 just contains the input string. Tape 2 is the simulation tape, on which D simulates one branch of the computation tree of the NDTM N. Tape 3 records the current simulation branch.

Consider the computation tree of N. The root of the tree is the start configuration. The children of a configuration C are all the configurations that C can possibly yield in one step of N. The number of children of any configuration is finite; let b be the maximum over all configurations, so the max-degree of the computation tree is b. We number the child configurations of each configuration arbitrarily from 1 to at most b. Any (partial) computation can then be specified by a string over the alphabet {1, 2, ..., b}; some strings over this alphabet may not correspond to any node in the computation tree.

D proceeds as follows.

1. Initially tape 1 contains the input w, and tapes 2 and 3 are empty.
2. Copy tape 1 to tape 2.
3. Use tape 2 to simulate N with input w on one branch of its computation tree. Before each step of N, consult the next symbol on tape 3 to determine which of the nondeterministic choices to make. If no more symbols remain on tape 3, or this choice is invalid, or N enters the reject state, go to step 4. If N enters the accept state, accept and halt.
4. Replace the string on tape 3 by its lexicographic successor. Go to step 2.

End Proof

Corollary: A language is Turing-recognizable iff some NDTM recognizes it.

Theorem: A language is decidable iff some NDTM decides it. An NDTM is a decider if it halts on all branches of its computation.
Is this equivalent to saying that an NDTM is a decider if either there exists a branch that accepts and halts, or all branches halt? Yes: suppose we have such an NDTM T. Then we can obtain a DTM that always halts as follows. We simulate T one branch at a time, in a manner similar to the above proof. If T accepts at any point, then we accept. Otherwise, all branches of T halt and reject, in which case we eventually run out of branches to explore, and we reject.

History: Turing defined TMs in 1936. The work of Gödel preceded this in 1931. Church, Kleene, and Post also had computational models in 1936. Multitape Turing machines and their complexity were studied by Hartmanis and Stearns (1965).

===================

CHURCH-TURING THESIS

Church used a notational system called the lambda calculus to define algorithms. Turing defined algorithms using machines. These two formulations were shown to be equivalent. Also relevant here are predicate calculus and the partial recursive functions.

The Church hypothesis (or the Church-Turing thesis) says that the intuitive notion of algorithm equals Turing machine algorithms.

Hilbert's 10th problem: Devise an algorithm that tests whether a polynomial has an integral root.

We will describe Turing machine algorithms using the more informal pseudocode for algorithms that you are already familiar with. For the most part, we will be discussing decision problems, so the TM will (if it halts) end in an accept or reject state. The input to the TM is a string, which encodes the problem input; that input may have some underlying structure. Rather than explicitly listing the tape contents, we will often describe how the algorithm manipulates the objects of the input and creates other objects, all of which can of course be written down as strings under an appropriate encoding.
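The encoding convention can be made concrete with a Hilbert's-10th-flavored example (a sketch; the comma-separated coefficient encoding and the function names are my own choices, not a standard convention):

```python
# A single-variable polynomial c_0 + c_1 x + ... + c_n x^n is represented
# by its coefficient list and encoded as a string, so it can serve as a
# TM input; the algorithm then works with the decoded object rather than
# with raw tape cells.

def encode(coeffs):
    return ",".join(str(c) for c in coeffs)       # [-4, 0, 1] -> "-4,0,1"

def decode(s):
    return [int(t) for t in s.split(",")]

def has_root_among(s, candidates):
    """Accept iff the encoded polynomial has a root in `candidates`.
    Enumerating 0, 1, -1, 2, -2, ... forever would give a recognizer for
    "has an integral root": it halts exactly when a root exists."""
    coeffs = decode(s)
    return any(sum(c * x**k for k, c in enumerate(coeffs)) == 0
               for x in candidates)
```

For example, has_root_among(encode([-4, 0, 1]), range(-10, 11)) accepts, since x^2 - 4 has the integral roots 2 and -2.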