- Mathematical reasoning may be regarded rather schematically as the exercise of a combination of two faculties, which we may call intuition and ingenuity. The activity of the intuition consists in making spontaneous judgements which are not the result of conscious trains of reasoning... The exercise of ingenuity in mathematics consists in aiding the intuition through suitable arrangements of propositions, and perhaps geometrical figures or drawings.
- "Systems of Logic Based on Ordinals," section 11: The purpose of ordinal logics (1938), published in Proceedings of the London Mathematical Society, series 2, vol. 45 (1939)
- In a footnote to the first sentence, Turing added: "We are leaving out of account that most important faculty which distinguishes topics of interest from others; in fact, we are regarding the function of the mathematician as simply to determine the truth or falsity of propositions."
- Instruction tables will have to be made up by mathematicians with computing experience and perhaps a certain puzzle-solving ability. There need be no real danger of it ever becoming a drudge, for any processes that are quite mechanical may be turned over to the machine itself.
- "Proposed Electronic Calculator" (1946), a report for National Physical Laboratory, Teddington; published in A. M. Turing's ACE Report of 1946 and Other Papers (1986), edited by B. E. Carpenter and R. W. Doran, and in The Collected Works of A. M. Turing (1992), edited by D. C. Ince, Vol. 3.
- A man provided with paper, pencil, and rubber, and subject to strict discipline, is in effect a universal machine.
- "Intelligent Machinery: A Report by A. M. Turing," (Summer 1948), submitted to the National Physical Laboratory (1948) and published in Key Papers: Cybernetics, ed. C. R. Evans and A. D. J. Robertson (1968) and, in variant form, in Machine Intelligence 5, ed. B. Meltzer and D. Michie (1969).
- Science is a differential equation. Religion is a boundary condition.
- Epigram to Robin Gandy (1954); reprinted in Andrew Hodges, Alan Turing: The Enigma (Vintage edition, 1992), p. 513.
- The Exclusion Principle is laid down purely for the benefit of the electrons themselves, who might be corrupted (and become dragons or demons) if allowed to associate too freely.
- Epigram to Robin Gandy (1954).
- A computer would deserve to be called intelligent if it could deceive a human into believing that it was human.
- Computing Machinery and Intelligence (1950) 
Computing Machinery and Intelligence (1950)
- Published in Mind – A Quarterly Review of Psychology and Philosophy, vol. 59, #236 (1950). This paper describes what has come to be known as the Turing Test. At the time it was written, the term "computer" was a job title describing an individual who processed figures by hand.
- "Can machines think?"... The new form of the problem can be described in terms of a game which we call the 'imitation game'. It is played with three people, a man (A), a woman (B), and an interrogator (C) who may be of either sex. The interrogator stays in a room apart from the other two. The object of the game for the interrogator is to determine which of the other two is the man and which is the woman. He knows them by labels X and Y, and at the end of the game he says either "X is A and Y is B" or "X is B and Y is A." The interrogator is allowed to put questions to A and B... We now ask the question, "What will happen when a machine takes the part of A in this game?" Will the interrogator decide wrongly as often when the game is played like this as he does when the game is played between a man and a woman? These questions replace our original, "Can machines think?"
- We do not wish to penalise the machine for its inability to shine in beauty competitions, nor to penalise a man for losing in a race against an aeroplane. The conditions of our game make these disabilities irrelevant.
- May not machines carry out something which ought to be described as thinking but which is very different from what a man does?
- We are not asking whether all digital computers would do well in the game nor whether the computers at present available would do well, but whether there are imaginable computers which would do well.
- The idea behind digital computers may be explained by saying that these machines are intended to carry out any operations which could be done by a human computer.
- p. 436.
- A digital computer can usually be regarded as consisting of three parts: (i) Store. (ii) Executive unit. (iii) Control. ...The executive unit is the part which carries out the various individual operations involved in a calculation. ...It is the duty of the control to see that...[the table of] instructions are obeyed correctly and in the right order. ...A typical instruction might say—"Add the number stored in position 6809 to that in 4302 and put the result back into the latter storage position." Needless to say it would not occur in the machine expressed in English. It would more likely be coded in a form such as 6809430217. Here 17 says which of various possible operations [add] is to be performed on the two numbers. ...It will be noticed that the instruction takes up 10 digits and so forms one packet of information...
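The ten-digit decimal instruction format Turing describes above can be sketched in a few lines of code. This is purely an illustrative toy, not any real machine's instruction set: the function names and the dictionary-based store are assumptions made for the example, and only the single operation code 17 (add) from Turing's text is modelled.

```python
# Toy sketch of Turing's example instruction 6809430217:
# "Add the number stored in position 6809 to that in 4302 and
# put the result back into the latter storage position."
# Digits 1-4 and 5-8 are storage positions; digits 9-10 are the
# operation code (17 = add, per the quoted passage).

def decode(instruction: str):
    """Split a ten-digit instruction into two addresses and an opcode."""
    assert len(instruction) == 10
    return int(instruction[0:4]), int(instruction[4:8]), int(instruction[8:10])

def execute(instruction: str, store: dict) -> None:
    """Carry out one instruction against a store of numbered positions."""
    src, dst, op = decode(instruction)
    if op == 17:  # 17 = add, as in Turing's example
        store[dst] = store[src] + store[dst]
    else:
        raise ValueError(f"unknown operation code {op}")

store = {6809: 5, 4302: 7}
execute("6809430217", store)
print(store[4302])  # → 12
```

The point of the packet layout, as the passage notes, is that addresses and operation are recovered purely by digit position, so the whole instruction "takes up 10 digits and so forms one packet of information."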
- Suppose Mother wants Tommy to call at the cobbler's every morning on his way to school to see if her shoes are done, she can ask him afresh every morning. Alternatively she can stick up a notice once and for all in the hall which he will see when he leaves for school and which tells him to call for the shoes, and also to destroy the notice when he comes back if he has the shoes with him.
- If one wants to make a machine mimic the behaviour of the human computer in some complex operation one has to ask him how it is done, and then translate the answer into the form of an instruction table. Constructing instruction tables is usually described as "programming."
- I believe that at the end of the century the use of words and general educated opinion will have altered so much that one will be able to speak of machines thinking without expecting to be contradicted.
- p. 442.
- I am not very impressed with theological arguments whatever they may be used to support. Such arguments have often been found unsatisfactory in the past. In the time of Galileo it was argued that the texts, "And the sun stood still... and hasted not to go down about a whole day" (Joshua x. 13) and "He laid the foundations of the earth, that it should not move at any time" (Psalm cv. 5) were an adequate refutation of the Copernican theory.
- pp. 443-444.
- Machines take me by surprise with great frequency.
- p. 450.
- The view that machines cannot give rise to surprises is due, I believe, to a fallacy to which philosophers and mathematicians are particularly subject. This is the assumption that as soon as a fact is presented to a mind all consequences of that fact spring into the mind simultaneously with it. It is a very useful assumption under many circumstances, but one too easily forgets that it is false. A natural consequence of doing so is that one then assumes that there is no virtue in the mere working out of consequences from data and general principles.
- p. 451.
- Another simile would be an atomic pile of less than critical size: an injected idea is to correspond to a neutron entering the pile from without. Each such neutron will cause a certain disturbance which eventually dies away. If, however, the size of the pile is sufficiently increased, the disturbance caused by such an incoming neutron will very likely go on and on increasing until the whole pile is destroyed. Is there a corresponding phenomenon for minds, and is there one for machines? There does seem to be one for the human mind. The majority of them seem to be "sub-critical," i.e., to correspond in this analogy to piles of sub-critical size. An idea presented to such a mind will on average give rise to less than one idea in reply. A smallish proportion are super-critical. An idea presented to such a mind may give rise to a whole "theory" consisting of secondary, tertiary and more remote ideas. Animal minds seem to be very definitely sub-critical. Adhering to this analogy we ask, "Can a machine be made to be super-critical?"
- p. 454.
- Presumably the child-brain is something like a note-book as one buys it from the stationer's. Rather little mechanism, and lots of blank sheets.
- p. 456.
- We can only see a short distance ahead, but we can see plenty there that needs to be done.
- p. 460.
Quotes about Turing
- Sorted alphabetically by author or source
- Alan Turing was the first to make a careful analysis of the potential capabilities of machines, inventing his famous "Turing machines" for the purpose. He argued that if any machine could perform a computation, then some Turing machine could perform it. The argument focuses on the assertion that any machine's operations could be simulated, one step at a time, by certain simple operations, and that Turing machines were capable of those simple operations. Turing's first fame resulted from applying this analysis to a problem posed earlier by Hilbert, which concerned the possibility of mechanizing mathematics. Turing showed that in a certain sense, it is impossible to mechanize mathematics: We shall never be able to build an "oracle" machine that can correctly answer all mathematical questions presented to it with a "yes" or "no" answer. In another famous paper Turing went on to consider the somewhat different question, "Can machines think?" It is a different question, because perhaps machines can think, but they might not be any better at mathematics than humans are; or perhaps they might be better at mathematics than humans are, but not by thinking, just by brute-force calculation power. These two papers of Turing lie near the roots of the subjects today known as automated deduction and artificial intelligence.
- Michael J. Beeson, "The Mechanization of Mathematics," in Alan Turing: Life and Legacy of a Great Thinker (2004).
- Beyond any doubt, the most important thing that has happened in cognitive science was Turing’s invention of the notion of mechanical rationality.
- From a very young age, I knew about the legend of Alan Turing – among awkward, nerdy teenagers, he is a patron saint. He never fit in, but accomplished these wonderful things, as part of a secret queer history of computer science.
And so I always dreamt of writing something about him, and I thought that there had never been a proper narrative treatment of his life, that he deserved. I by chance met the producers of the film at a party, and one of them told me they had optioned a biography. When I asked who it was, they said, ‘it’s a mathematician that you’ve never heard of.’ When they told me it was Alan Turing, I almost tackled them, and I told them I’d do anything to write this film, I’d write it for free. It was all about luck and passion.
- Here’s the thing. Alan Turing never got to stand on a stage like this and look out at all of these disconcertingly attractive faces. I do. And that’s the most unfair thing I’ve ever heard. So in this brief time here, what I wanted to do was say this: When I was 16-years-old, I tried to kill myself because I felt weird and I felt different, and I felt like I did not belong. And now I’m standing here — and so I would like this moment to be for this kid out there who feels like she’s weird or she’s different or she doesn’t fit in anywhere: Yes, you do. I promise you do. Stay weird, stay different — and then, when it’s your turn, and you are standing on this stage, please pass the same message to the next person who comes along.
- Graham Moore, in his acceptance speech for best adapted screenplay at the 87th Academy Awards presentations, quoted in "Oscars 2015: Graham Moore Tells Kids to 'Stay Weird, Stay Different'" in ABC News (22 February 2015).
- I’m not gay, but I don’t think you have to be gay to have a gay hero. Growing up, Alan Turing was certainly mine. … I’m also not the greatest mathematician of my generation. We have lots of biographical differences, but nonetheless I always identified with him so much.
- His high-pitched voice already stood out above the general murmur of well-behaved junior executives grooming themselves for promotion within the Bell corporation. Then he was suddenly heard to say: "No, I'm not interested in developing a powerful brain. All I'm after is just a mediocre brain, something like the President of the American Telephone and Telegraph Company."
- Andrew Hodges, describing an incident which occurred in the New York AT & T lab cafeteria in 1943, in Alan Turing: The Enigma of Intelligence (1983), p. 251.
- He wondered what one might ask in a structured conversation to decide if one's interlocutor was a human being or a computer. ...but the ultimate Turing test might be to pose the question "How would a guilt-stricken homosexual commit suicide?" Would a computer ever conceive of eating an apple laced with cyanide?
- Theodore Roszak, The Gendered Atom (1999).
- Although a mathematician, Turing took quite an interest in the engineering side of computer design. There was some discussion in 1947 as to whether a cheaper substance than mercury could not be found for use as an ultrasonic delay medium. Turing's contribution to this discussion was to advocate the use of gin, which he said contained alcohol and water in just the right proportions to give a zero temperature coefficient of propagation velocity at room temperature.
- Maurice V. Wilkes, "Computers Then and Now", Journal of the ACM 15 (1), (January 1968), pp. 1-7.
- Turing had a strong predilection for working things out from first principles, usually in the first instance without consulting any previous work on the subject, and no doubt it was this habit which gave his work that characteristically original flavor. I was reminded of a remark which Beethoven is reputed to have made when he was asked if he had heard a certain work of Mozart which was attracting much attention. He replied that he had not, and added "neither shall I do so, lest I forfeit some of my own originality."
- He was particularly fond of little programming tricks (some people would say that he was too fond of them to be a "good" programmer) and would chuckle with boyish good humor at any little tricks I may have used.