Is Alan Turing given too much credit when it comes to computers?

In summary: Turing's 1936 paper, "On Computable Numbers, with an Application to the Entscheidungsproblem", did not show that a computer can be built which can solve any problem. It showed that a single universal machine can carry out any computation that can be carried out at all, and, at the same time, that no algorithm can solve the Entscheidungsproblem. Before that question could even be posed precisely, the notion of "algorithm" had to be formally defined, which is what Turing's paper (alongside Church's work) did.
  • #1
PainterGuy
Hi,

Is Alan Turing given too much credit when it comes to computers? He is presented as someone who played a pivotal role in the realization of the computer. What Turing did in answer to Hilbert's problem was a great achievement from a mathematical point of view.

Alan Turing was a mathematician, and it all started with his 1936 paper, "On Computable Numbers, with an Application to the Entscheidungsproblem", https://www.cs.virginia.edu/~robins/Turing_Paper_1936.pdf. In this paper he envisioned an entirely hypothetical machine.

Computer technology was already progressing and maturing, as the two excerpts below show. It's not that Turing envisioned a machine which didn't exist before, or that nobody had thought of before. The idea was already there, and so were physical computing devices; Charles Babbage had thought about it carefully in the 19th century. I cannot see how Turing's 1936 paper played such an important role that his name is associated so strongly with the modern computer.

Could you please comment on it? Thank you for your help, in advance.

Babbage's machines were among the first mechanical computers. That they were not actually completed was largely because of funding problems and clashes of personality, most notably with George Biddell Airy, the Astronomer Royal.

While Babbage's machines were mechanical and unwieldy, their basic architecture was similar to a modern computer. The data and program memory were separated, operation was instruction-based, the control unit could make conditional jumps, and the machine had a separate I/O unit.

The major innovation was that the Analytical Engine was to be programmed using punched cards: the Engine was intended to use loops of Jacquard's punched cards to control a mechanical calculator, which could use as input the results of preceding computations. The machine was also intended to employ several features subsequently used in modern computers, including sequential control, branching and looping. It would have been the first mechanical device to be, in principle, Turing-complete. The Engine was not a single physical machine, but rather a succession of designs that Babbage tinkered with until his death in 1871.
Source: https://en.wikipedia.org/wiki/Charles_Babbage#Computing_pioneer
In 1932, the Bureau of Ordnance (BuOrd) initiated development of the TDC with Arma Corporation and Ford Instruments. This culminated in the "very complicated" Mark 1 in 1938. This was retrofitted into older boats, beginning with Dolphin and up through the newest Salmons.

The first submarine designed to use the TDC was Tambor, launched in 1940 with the Mark III, located in the conning tower.[20] (This differed from earlier outfits.) It proved to be the best torpedo fire control system of World War II.

In 1943, the Torpedo Data Computer Mark IV was developed to support the Mark 18 torpedo.
Source: https://en.wikipedia.org/wiki/Torpedo_Data_Computer#History
You can choose to ignore all the material given below, but it attempts to support my opinion that Turing didn't play such a crucial role.

Background to Turing's paper:
Turing's 1936 paper was written as an answer to one of the three questions posed by the mathematician Hilbert in 1928. The first two of those questions were answered by Gödel with his incompleteness theorems (announced in 1930).

Entscheidungsproblem
A decision problem of finding a way to decide whether a formula is true or provable within a given system.
Source: https://en.wiktionary.org/wiki/Entscheidungsproblem

Entscheidungsproblem is a challenge posed by David Hilbert and Wilhelm Ackermann in 1928. The problem asks for an algorithm that considers, as input, a statement and answers "Yes" or "No" according to whether the statement is universally valid, i.e., valid in every structure satisfying the axioms.
...
By the completeness theorem of first-order logic, a statement is universally valid if and only if it can be deduced from the axioms, so the Entscheidungsproblem can also be viewed as asking for an algorithm to decide whether a given statement is provable from the axioms using the rules of logic.
...
In 1936, Alonzo Church and Alan Turing published independent papers showing that a general solution to the Entscheidungsproblem is impossible, assuming that the intuitive notion of "effectively calculable" is captured by the functions computable by a Turing machine (or equivalently, by those expressible in the lambda calculus). This assumption is now known as the Church–Turing thesis.
...
Before the question could be answered, the notion of "algorithm" had to be formally defined. This was done by Alonzo Church in 1935 with the concept of "effective calculability" based on his λ-calculus, and by Alan Turing the next year with his concept of Turing machines.
Source: https://en.wikipedia.org/wiki/Entscheidungsproblem

Alonzo Church was Turing's PhD advisor.

In 1930–31 Gödel published his incompleteness theorems. They show that in mathematical logic there are statements which, though true, can never be proved within the system.

His incompleteness theorems meant there can be no mathematical theory of everything, no unification of what’s provable and what’s true. What mathematicians can prove depends on their starting assumptions, not on any fundamental ground truth from which all answers spring.
https://www.quantamagazine.org/how-godels-incompleteness-theorems-work-20200714/

If 'algorithm' is understood as meaning a method that can be represented as a Turing machine, and with the answer to the latter question negative (in general), the question about the existence of an algorithm for the Entscheidungsproblem also must be negative (in general).
https://en.wikipedia.org/wiki/Entscheidungsproblem#Negative_answer
My own understanding:
Hilbert posed the decision problem in 1928; it was concerned with finding a general algorithm to decide whether a mathematical formula is true or provable.

Church and Turing, around 1936, came up with an answer using different and independent approaches: they showed that no such general algorithm exists. One can find algorithms that decide truth or provability for restricted classes of formulas, but no single algorithm works for every formula. (Note that this is a claim about a uniform procedure covering all formulas; it does not by itself mean that a particular statement such as Goldbach's Conjecture can never be settled by a proof or disproof.)

In Gödel’s paradigm, statements still are either true or false, but true statements can either be provable or unprovable within a given set of axioms.
Source: /watch?v=I4pQbo5MQOs (add www.youtube.com in front)

Gödel's incompleteness theorems left the Entscheidungsproblem as unfinished business. Although he had shown that any consistent axiomatic system of arithmetic would leave some arithmetical truths unprovable, this did not in itself rule out the existence of some "effectively computable" decision procedure which would infallibly, and in a finite time, reveal whether or not any given proposition P was, or was not, provable.
Source: https://www.philocomp.net/computing/hilbert.htm
General important related info:
The Bombe was an electromechanical machine used to find the daily settings of the German Enigma machine in World War 2. Turing played a central role in the Bombe's design.

The Lorenz cipher machine was used by the German army for encryption during World War 2. The British used the Colossus computer(s) to help decrypt messages enciphered with Lorenz.
For more info check: https://en.wikipedia.org/wiki/Lorenz_cipher

Colossus was a set of computers developed by British codebreakers in the years 1943–1945[1] to help in the cryptanalysis of the Lorenz cipher. Colossus used thermionic valves (vacuum tubes) to perform Boolean and counting operations. Colossus is thus regarded[2] as the world's first programmable, electronic, digital computer, although it was programmed by switches and plugs and not by a stored program.[3]

Colossus was designed by General Post Office (GPO) research telephone engineer Tommy Flowers[1] to solve a problem posed by mathematician Max Newman at the Government Code and Cypher School (GC&CS) at Bletchley Park. Alan Turing's use of probability in cryptanalysis (see Banburismus) contributed to its design.
Source: https://en.wikipedia.org/wiki/Colossus_computer

ENIAC was the first programmable, electronic, general-purpose digital computer, completed in 1945. There were other computers that had these features, but the ENIAC had all of them in one package. It was Turing-complete and able to solve "a large class of numerical problems" through reprogramming.
Source: https://en.wikipedia.org/wiki/ENIAC

What is a Turing machine?
A Turing machine is a hypothetical machine thought of by the mathematician Alan Turing in 1936. Despite its simplicity, the machine can simulate ANY computer algorithm, no matter how complicated it is!
Source: https://www.cl.cam.ac.uk/projects/raspberrypi/tutorials/turing-machine/one.html

A Turing Complete system means a system in which a program can be written that will find an answer (although with no guarantees regarding runtime or memory).

So, if somebody says "my new thing is Turing Complete" that means in principle (although often not in practice) it could be used to solve any computation problem.
Source: https://stackoverflow.com/questions/7284/what-is-turing-complete
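Since several of the quotes above lean on what a Turing machine actually is, here is a minimal sketch in Python (my own toy machine, not an example from Turing's paper): a finite transition table driving a read/write head over an unbounded tape. This particular table increments a binary number.

```python
# Minimal Turing machine simulator (illustrative sketch only).
# A machine is a table mapping (state, symbol) -> (new state, symbol to
# write, head move) -- exactly the kind of finite description Turing
# used to argue that one "universal" machine can simulate any other.

def run_turing_machine(tape, transitions, state="start", blank="_", max_steps=10_000):
    """Run until the machine halts or no transition applies."""
    cells = dict(enumerate(tape))   # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if state == "halt" or (state, symbol) not in transitions:
            break
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# A toy machine: binary increment. Walk right to the end of the number,
# then carry 1s to 0s leftwards until a 0 (or the left edge) absorbs the carry.
INCREMENT = {
    ("start", "0"): ("start", "0", "R"),
    ("start", "1"): ("start", "1", "R"),
    ("start", "_"): ("carry", "_", "L"),
    ("carry", "1"): ("carry", "0", "L"),
    ("carry", "0"): ("halt",  "1", "L"),
    ("carry", "_"): ("halt",  "1", "L"),
}

print(run_turing_machine("1011", INCREMENT))  # -> 1100  (11 + 1 = 12)
```

The point of the exercise: a table like this is just data, so one machine can read another machine's table and simulate it, which is the conceptual seed of the stored-program computer.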

"Entscheidungsproblem" is German for "decision problem".

Turing conceived of the universal machine as a means of answering the last of the three questions about mathematics posed by David Hilbert in 1928: (1) is mathematics complete; (2) is mathematics consistent; and (3) is mathematics decidable.
...
The Austrian logician Kurt Gödel had already shown that arithmetic (and by extension mathematics) was incomplete and could not prove its own consistency. Turing showed, by means of his universal machine, that mathematics was also undecidable.
Source: https://historyofinformation.com/detail.php?entryid=735

ENIAC was enormous. It occupied the 50-by-30-foot (15-by-9-metre) basement of the Moore School, where its 40 panels were arranged, U-shaped, along three walls. Each panel was about 2 feet wide by 2 feet deep by 8 feet high (0.6 metre by 0.6 metre by 2.4 metres). With more than 17,000 vacuum tubes, 70,000 resistors, 10,000 capacitors, 6,000 switches, and 1,500 relays, it was easily the most complex electronic system theretofore built. ENIAC ran continuously (in part to extend tube life), generating 174 kilowatts of heat and thus requiring its own air conditioning system. It could execute up to 5,000 additions per second, several orders of magnitude faster than its electromechanical predecessors. It and subsequent computers employing vacuum tubes are known as first-generation computers. (With 1,500 mechanical relays, ENIAC was still transitional to later, fully electronic computers.)
Source: https://www.britannica.com/technology/ENIAC

The Church–Turing thesis conjectures that any function whose values can be computed by an algorithm can be computed by a Turing machine.
Source: https://en.wikipedia.org/wiki/Turing_completeness

Turing completeness is significant in that every real-world design for a computing device can be simulated by a universal Turing machine. The Church–Turing thesis states that this is a law of mathematics – that a universal Turing machine can, in principle, perform any calculation that any other programmable computer can. This says nothing about the effort needed to write the program, or the time it may take for the machine to perform the calculation, or any abilities the machine may possess that have nothing to do with computation.
Source: https://en.wikipedia.org/wiki/Turing_completeness#History

Charles Babbage's analytical engine (1830s) would have been the first Turing-complete machine if it had been built at the time it was designed. Babbage appreciated that the machine was capable of great feats of calculation, including primitive logical reasoning, but he did not appreciate that no other machine could do better. From the 1830s until the 1940s, mechanical calculating machines such as adders and multipliers were built and improved, but they could not perform a conditional branch and therefore were not Turing-complete.
Source: https://en.wikipedia.org/wiki/Turing_completeness#History

In 1941 Konrad Zuse completed the Z3 computer. Zuse was not familiar with Turing's work on computability at the time. In particular, the Z3 lacked dedicated facilities for a conditional jump, thereby precluding it from being Turing complete. However, in 1998, it was shown by Rojas that the Z3 is capable of simulating conditional jumps, and therefore Turing complete in theory. To do this, its tape program would have to be long enough to execute every possible path through both sides of every branch.

The first computer capable of conditional branching in practice, and therefore Turing complete in practice, was the ENIAC in 1946. Zuse's Z4 computer was operational in 1945, but it did not support conditional branching until 1950.
Source: https://en.wikipedia.org/wiki/Turing_completeness#History
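Rojas's point above, that a machine with no conditional jump can still realize branches by running through both sides of every branch, can be illustrated in miniature (a hypothetical Python sketch, nothing to do with the Z3's actual instruction set):

```python
# Sketch of branch-free conditional selection: compute BOTH candidate
# results, then pick one arithmetically. A program built only of such
# straight-line steps never needs a conditional jump -- the spirit of
# Rojas's argument that the Z3 is Turing complete in theory.

def branchless_abs(x):
    neg = int(x < 0)                 # 1 if x is negative, else 0
    return neg * (-x) + (1 - neg) * x

def branchless_max(a, b):
    a_bigger = int(a > b)
    return a_bigger * a + (1 - a_bigger) * b

print(branchless_abs(-7), branchless_max(2, 9))  # -> 7 9
```

The cost, as the quote notes, is that the program must be long enough to pass through every possible path, which is why this is Turing completeness "in theory" rather than a practical design.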

Turing machines proved the existence of fundamental limitations on the power of mechanical computation. While they can express arbitrary computations, their minimalist design makes them unsuitable for computation in practice: real-world computers are based on different designs that, unlike Turing machines, use random-access memory.
https://en.wikipedia.org/wiki/Turing_machine

Early computers were built to perform a series of single tasks, like a calculator. Basic operating system features were developed in the 1950s, such as resident monitor functions that could automatically run different programs in succession to speed up processing. Operating systems did not exist in their modern and more complex forms until the early 1960s. Hardware features were added, that enabled use of runtime libraries, interrupts, and parallel processing. When personal computers became popular in the 1980s, operating systems were made for them similar in concept to those used on larger computers.

In the 1940s, the earliest electronic digital systems had no operating systems. Electronic systems of this time were programmed on rows of mechanical switches or by jumper wires on plugboards. These were special-purpose systems that, for example, generated ballistics tables for the military or controlled the printing of payroll checks from data on punched paper cards. After programmable general-purpose computers were invented, machine languages (consisting of strings of the binary digits 0 and 1 on punched paper tape) were introduced that sped up the programming process (Stern, 1981).

In the early 1950s, a computer could execute only one program at a time. Each user had sole use of the computer for a limited period and would arrive at a scheduled time with their program and data on punched paper cards or punched tape. The program would be loaded into the machine, and the machine would be set to work until the program completed or crashed.
Source: https://en.wikipedia.org/wiki/Operating_system#History

Helpful links:
1: https://en.wikipedia.org/wiki/Turing's_proof
2: https://en.wikipedia.org/wiki/Enigma_machine
3: https://en.wikipedia.org/wiki/Bombe
4: https://en.wikipedia.org/wiki/Bletchley_Park
5: https://en.wikipedia.org/wiki/Torpedo_Data_Computer
6: https://en.wikipedia.org/wiki/Resident_monitor
7: https://en.wikipedia.org/wiki/Alan_Turing
8: https://en.wikipedia.org/wiki/Gödel's_incompleteness_theorems
9: /watch?v=t37GQgUPa6k (add www.youtube.com in front)
10: /watch?v=macM_MtS_w4 (add www.youtube.com in front)
11: https://en.wikipedia.org/wiki/Goldbach's_conjecture
12: https://www.scientificamerican.com/article/what-is-goumldels-proof
13: /watch?v=I4pQbo5MQOs (add www.youtube.com in front)
14: https://www.newscientist.com/articl...ing-found-machine-thinking-in-the-human-mind/
15: https://cs.stackexchange.com/questi...of-turings-answer-to-the-entscheidungsproblem
 
  • #2
Do you understand the meaning of the phrase "straw man fallacy"?

Here is your straw man:
PainterGuy said:
I cannot see how Turing's 1936 paper played such an important role that his name is associated so strongly with the modern computer.

However Turing did much more than write "On Computable Numbers...": https://en.wikipedia.org/wiki/Alan_Turing#Career_and_research.
 
  • #3
It is estimated by military historians that Turing saved 2 million lives in WWII. You don't do that just by writing a paper. His work during the war was legendary.
Of course his subsequent treatment is a stain on history.
 
  • #4
PainterGuy said:
Is Alan Turing given too much credit when it comes to computers?
By my understanding, computers have two relevant aspects: the technical side, and the mathematical background.

Regarding the technical side, Turing made important but limited contributions, especially where modern computers are concerned, as you extend your query later on.

The real deal is the mathematical side. The 'Turing machine' (with everything added to it later) might seem clumsy, and there are several equivalent formulations, but it was exactly the right tool and definition at the right moment, and the kick it gave is still in effect today.
 
  • #5
A few years ago the BBC ran a competition to determine the greatest person of the 20th Century. The winner was Alan Turing. That's stretching a point, IMO, even if it was biased towards the British contenders.
 
  • #6
When the world is ripe for an idea, many folks come up with great solutions until one or two win out in the end.

Turing's contributions to computing were to some extent limited by their classified nature, so others got the credit early on, before his work was fully disclosed.

Similarly for Konrad Zuse and other forgotten pioneers.

I noticed you cited ENIAC but not Atanasoff

https://en.wikipedia.org/wiki/Atanasoff–Berry_computer

Ideas from his computer were adapted into ENIAC.
 
  • #7
pbuk said:
Do you understand the meaning of the phrase "straw man fallacy"?

I think you are indirectly saying that I didn't address the question directly. I agree. I didn't feel qualified enough to do that, and my writing skills aren't that good, so I included some relevant material to make the point.
hutchphd said:
It is estimated by military historians that Turing saved 2 million lives in WWII. You don't do that just by writing a paper. His work during the war was legendary.

Gordon Welchman did equally important work and, in my humble opinion, Welchman is also responsible for saving those 2 million lives by breaking the Enigma code.

Polish cryptanalysts had developed the bomba, an electromechanical device for finding the Enigma settings used by German operators; Turing improved the Polish design. Welchman invented the "Diagonal Board", an addition which made the British Bombe immensely more powerful.
...
Gordon Welchman was the subject of a BBC documentary in 2015. The programme was entitled Bletchley Park: Code-breaking's Forgotten Genius and as The Codebreaker Who Hacked Hitler when broadcast on the Smithsonian Channel in the US. The documentary notes that traffic analysis is now known as "network analysis" and "metadata" analysis and gives as an example the location of Osama bin Laden by the use of network analysis.
Source: https://en.wikipedia.org/wiki/Gordon_Welchman

Given below are some reviews of the documentary about Welchman, Bletchley Park: Code-breaking's Forgotten Genius. The reviewers have said it better than I could, so I thought it better to borrow their words.

After all the revelations about Bletchley Park, and in particular, Alan Turing, in recent years; slowly and surely, a great and deserving 'catch-up' is emerging about those equally great heroes, who were shamefully forgotten, and ignored. As stated in this brilliant DVD, the efforts and subsequent break-through's by our hero Gordon Welchman, were at least, the equal of Turing's work. With far more long lasting effects, on today's international digital gathering, of military, industrial, and personal data, be it allies or foes. Even today, the creations of our hero are still secret in many respects. He was treated very badly by people at the top, which eventually destroyed him.
By P. G. Croft
Source: https://www.amazon.com/dp/B01KXEGUHO/?tag=pfamazon01-20

Brilliant documentary, such a shame this brilliantly intelligent man didn’t get the recognition that Alan Turin got, he is the brainchild of modern code breaking and his life was turned to misery by the CIA and GCHQ, after publishing a book about hut 6 where he worked during the war, 40 years after the war, unfortunately some of the secrets revealed in his book are still being used by US and UK government agencies. Really worth watching.
By Glog
Source: https://www.amazon.com/dp/B01KXEGUHO/?tag=pfamazon01-20

Then there was the Colossus computer, used to decrypt messages produced by the Lorenz cipher machine used by the Germans in World War 2. Lorenz was a more sophisticated machine than Enigma. Tommy Flowers was one of the main people responsible for building Colossus. It has been said that Colossus helped to end World War 2 more quickly.

It was a much more complex system than Enigma; the decoding procedure involved trying so many possibilities that it was impractical to do by hand. Flowers and Frank Morrell (also at Dollis Hill) designed the Heath Robinson, in an attempt to automate the cryptanalysis of the Lorenz SZ-40/42 cipher machine.
...
After the war, Flowers received little recognition for his contribution to cryptanalysis. Flowers was left in debt after the war after using his own personal funds to build Colossus. The government granted him £1,000 payment which did not cover Flowers' personal investment in the equipment; he shared much of the money amongst the staff who had helped him build and test Colossus. Flowers applied for a loan from the Bank of England to build another machine like Colossus but was denied the loan because the bank did not believe that such a machine could work. He could not argue that he had already designed and built many of these machines because his work on Colossus was covered by the Official Secrets Act.
Source: https://en.wikipedia.org/wiki/Tommy_Flowers

Colossus used thermionic valves (vacuum tubes) to perform Boolean and counting operations. Colossus is thus regarded as the world's first programmable, electronic, digital computer, although it was programmed by switches and plugs and not by a stored program.
Source: https://en.wikipedia.org/wiki/Colossus_computer

Although the Colossus was the first of the electronic digital machines with programmability, albeit limited by modern standards, it was not a general-purpose machine, being designed for a range of cryptanalytic tasks, most involving counting the results of evaluating Boolean algorithms.

A Colossus computer was thus not a fully Turing complete machine. However, University of San Francisco professor Benjamin Wells has shown that if all ten Colossus machines made were rearranged in a specific cluster, then the entire set of computers could have simulated a universal Turing machine, and thus be Turing complete. The notion of a computer as a general-purpose machine – that is, as more than a calculator devoted to solving difficult but specific problems – did not become prominent until after World War II.
Source: https://en.wikipedia.org/wiki/Colossus_computer
jedishrfu said:
I noticed you cited ENIAC but not Atanasoff

https://en.wikipedia.org/wiki/Atanasoff–Berry_computer

Ideas from his computer were adapted into ENIAC.

Thank you for mentioning it. I did check it while writing my earlier post but decided to exclude it, since the ABC is considered more of an arithmetic logic unit than a computer. Now I think I should have included it, since it played an important role in digital computer history.

The Atanasoff–Berry computer (ABC) was the first automatic electronic digital computer. Limited by the technology of the day, and execution, the device has remained somewhat obscure. The ABC's priority is debated among historians of computer technology, because it was neither programmable, nor Turing-complete. Conventionally, the ABC would be considered the first electronic ALU (arithmetic logic unit) – which is integrated into every modern processor's design.

Its unique contribution was to make computing faster by being the first to use vacuum tubes to do the arithmetic calculations. Prior to this, slower electro-mechanical methods were used by the Harvard Mark I and Konrad Zuse's machines. The first electronic, programmable, digital machine, the Colossus computer from 1943 to 1945, used similar tube-based technology as ABC.
...
The ABC was designed for a specific purpose – the solution of systems of simultaneous linear equations. It could handle systems with up to 29 equations, a difficult problem for the time. Problems of this scale were becoming common in physics, the department in which John Atanasoff worked.
...
On June 26, 1947, J. Presper Eckert and John Mauchly were the first to file for patent on a digital computing device (ENIAC), much to the surprise of Atanasoff. The ABC had been examined by John Mauchly in June 1941, and Isaac Auerbach, a former student of Mauchly's, alleged that it influenced his later work on ENIAC, although Mauchly denied this. The ENIAC patent did not issue until 1964, and by 1967 Honeywell sued Sperry Rand in an attempt to break the ENIAC patents, arguing that the ABC constituted prior art. The United States District Court for the District of Minnesota released its judgement on October 19, 1973, finding in Honeywell v. Sperry Rand that the ENIAC patent was a derivative of John Atanasoff's invention.
Source: https://en.wikipedia.org/wiki/Atanasoff–Berry_computer
 
  • #8
Significant inventions are most often not linear, not conceived independent of the times or the thoughts of others. That makes the true story very messy. Numerous examples are found in the history of Thomas Edison: every one of his major inventions was challenged numerous times in several countries, and Edison had to spend much of his wealth defending his patents. See "Edison, His Life and Inventions" by Frank Dyer and Thomas Martin.

But speaking of too much or too little credit in connection with Alan Turing, I think of Elizebeth Smith Friedman. Working for the US Coast Guard, she and her team broke Enigma traffic independently of Bletchley Park, independently of Turing, and without the use of a computer. Her story is documented in the book "The Woman Who Smashed Codes" by Jason Fagone. But you are not likely to find mention of her name in most histories.
 
  • #10
anorlunda said:
Elizebeth Smith Friedman
Or Mavis Lever (later Mavis Batey). She was at Bletchley, and noticed that a particular Enigma intercept did not contain a single letter L in it (I may be misremembering the exact letter). Given that Enigma never encrypts a letter to itself, she inferred the message was nothing but repeated L's: essentially the key for the day.
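The inference described above can be sketched in a few lines (the intercept below is invented for illustration, not the historical message): since an Enigma machine never enciphers a letter to itself, any letter completely absent from a long intercept is a candidate for a plaintext consisting of that letter repeated.

```python
# Sketch of the "missing letter" inference. Enigma's reflector design
# guarantees no letter ever enciphers to itself, so a letter that never
# appears in a long ciphertext may be the sole letter of the plaintext.
# (For a ~100-letter random ciphertext, a given letter is absent by
# chance only about 2% of the time, so a long gap is strong evidence.)

ALPHABET = set("ABCDEFGHIJKLMNOPQRSTUVWXYZ")

def absent_letters(ciphertext):
    """Letters that never occur -- candidates for an all-one-letter plaintext."""
    return sorted(ALPHABET - set(ciphertext))

# Invented intercept that happens to contain every letter except L.
intercept = "QWERTYUIOPASDFGHJKZXCVBNM" * 4
print(absent_letters(intercept))  # -> ['L']
```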
 
  • #12
anorlunda said:
Significant inventions are most often not linear, not conceived independent of the times or the thoughts of others. That makes the true story very messy.

Thank you. I agree with you. I'd still say that sometimes it's good to dig the story a little deeper to see all the main characters who made important contributions rather than just one or two pampered children of history.

anorlunda said:
Numerous examples are found in the history of Thomas Edison. Every one of his major inventions was challenged numerous times in several countries. Edison had to spend much of his wealth defending his patents.

Agreed, again. But in the given case Gordon Welchman and Tommy Flowers are the pivotal characters from Bletchley Park where Turing worked along with them, especially Welchman.

Thanks to everyone for your opinions and information about other interesting individuals.
 
  • #13
My sense of this question is that Turing gained his fame partly through the concept of the "Turing machine" and partly by his use of the term "decidability."

The "Turing machine" concept is straightforward enough that it is still used as a description of a (very) basic computer. It provides an intuitive mental model of how a universal computer can work. Church's work, based on the lambda calculus, is mathematically equivalent but metaphorically inferior as an aid to intuition.

The concept of "decidability" sets the limits of what computers can theoretically achieve. In this sense, Turing completed Gödel's agenda. Modern algorithm design, programming language design, and many other areas of computer science are crucially guided by the concept of decidability and the need to remain within decidable fragments of logic.

So in my opinion, it's the combination of these two key ideas -- the universal computer, and the limits on what it can achieve -- that cemented Turing's fame. These ideas are mathematically fundamental and always will be, and he was the first to describe them clearly.

 
  • #14
Turing was savvy enough with hardware to design a portable speech scrambler/descrambler (the "Delilah") to secure telephone conversations.
 
  • #15
PainterGuy said:
Is Alan Turing given too much credit when it comes to computers?
It is arguable (witness this thread) as to whether or not Turing has been overrated (personally, I think not) but if you want an historical figure who most definitely has been overrated, how about Tesla?
 
  • #16
"Let's not talk about Mr Tesla," said the man with the gravelly voice, dressed in a dark suit of the finest materials and carrying a copy of Mario Puzo's The Godfather.
 
  • #17
Turing's first venture into practical computer engineering was his work on the design of the "Bombe", the electromechanical machine used for breaking the Enigma cipher. The Bombe was not a general-purpose computer. Soon after, the "Colossus" computer developed by Tommy Flowers was applied to another (more challenging) code-breaking project, and Turing's statistical methods contributed to its design.

After the war, he went on to develop the "Automatic Computing Engine". Here is an excerpt from the wiki article describing Turing's contribution:
Turing's report on the ACE was written in late 1945 and included detailed logical circuit diagrams and a cost estimate of £11,200. He felt that speed and size of memory were crucial and he proposed a high-speed memory of what would today be called 25 kilobytes, accessed at a speed of 1 MHz; he remarked that for the purposes required "the memory needs to be very large indeed by comparison with standards which prevail in most valve and relay work, and [so] it is necessary to look for some more economical form of storage", and that memory "appears to be the main limitation in the design of a calculator, i.e. if the storage problem can be solved all the rest is comparatively straightforward". The ACE implemented subroutine calls, whereas the EDVAC did not, and what also set the ACE apart from the EDVAC was the use of Abbreviated Computer Instructions, an early form of programming language. Initially, it was planned that Tommy Flowers, the engineer at the Post Office Research Station at Dollis Hill in north London, who had been responsible for building the Colossus computers should build the ACE, but because of the secrecy around his wartime achievements and the pressure of post-war work, this was not possible.

So there is no doubt that Turing advanced the development of modern computers. The concepts and practices that support modern computer architecture can be traced from Flowers to Turing to von Neumann.

Certainly the mathematical treatment of computing provided by Turing is very interesting. But practical computer science is very much an engineering exercise. I'm tempted to paraphrase Richard Feynman's "If all of mathematics disappeared, physics would be set back by exactly one week" - except I don't think that practical computer science is even that dependent on mathematical treatment.
 
  • Like
Likes PainterGuy
  • #18
I think Turing's theoretical papers on computing helped to put a positive attitude on the topic in the minds of nonspecialists and the public.

Every day, we hear claims about new developments that will revolutionize the world. Obviously, almost all of them prove false. Such was the case in the 40s and 50s. People like Turing helped by indirectly saying, "this one is real. Pay attention and be supportive."
 
  • Like
Likes PainterGuy
  • #19
You have asked one of those questions, like 'do Irish monks and the Norse get enough credit for the European discovery of the Americas?' or 'How and by whom were the Americas populated?' Much of the evidence for your question disappeared into the murk of wartime secrecy and the Cold War. Someone above quoted something saying Turing saved 2 million lives. It could equally be said that the Nazi officer stationed in the Qattara Depression who, each day, sent 'Nothing to report. Heil Hitler.' saved millions of lives too, since his message was invaluable to the Allied code-breakers. Turing is like Columbus. He built on the work of a crew, financed by government, and successfully made his knowledge widely known.

I guess the question really is 'if you built a time machine and eliminated Alan Turing in grade school, how different would our world be today?' And you could garner all sorts of opinions. Alan Turing was a genius--but if he had never lived, would no one else have thought the thoughts he had? I find that unlikely. YMMV
 
  • Like
Likes PainterGuy
  • #20
jedishrfu said:
“Let's not talk about Mr Tesla,” said the man with a gravelly voice, dressed in a dark suit of the finest materials, carrying a copy of Mario Puzo's The Godfather.
I think I didn't get the joke. :(
 
  • #21
LCSphysicist said:
I think I didn't get the joke. :(

Just imagine standing in an elevator, mentioning Tesla to a friend, and having some stranger behind you say, "Let's not talk about Mr Tesla."

This actually happened to my two friends in NYC years ago. They were in an elevator and the people in front were talking softly about Gotti, leader of the Gambino crime family, and his recent racketeering conviction. As a joke, my friend, who is a swarthy-looking Italian (imagine Gimli the dwarf of LotR dressed in a business suit), said quietly, "Let's not talk about Mr Gotti."

Startled, the people in front looked back, went silent and got off at the next floor thinking they just escaped getting whacked by a mafioso.

https://en.wikipedia.org/wiki/John_Gotti

It occurred to me that the joke might work here as we are focused on Turing and not Tesla. Oh well.
 
  • Like
  • Haha
Likes PainterGuy, LCSphysicist and BillTre
  • #22
.Scott said:
Turing's first venture into practical computer engineering was his development of the "Bombe" machine that he used for breaking the Enigma cipher. The Bombe was not a general-purpose computer. Soon after, he used the "Colossus" computer developed by Tommy Flowers for another (more challenging) code-breaking project.

Thank you! Yes, Turing contributed toward the development of the Bombe machine, as was stated earlier; Gordon Welchman also did great work, and also wrote the letter to Churchill explaining the importance of their work and asking for the funding to complete it.

The Colossus computer was Tommy Flowers' baby, and I'm sure there were other unsung heroes who contributed to making the Colossus.

Turing did good work, but IMHO when it comes to the Bombe machine and Colossus he is celebrated as a lone hero, which is unfair. Welchman made very important contributions to the field of cryptography. I'm a layman so I might be wrong, but my intention hasn't been to say that Turing didn't contribute significantly.

anorlunda said:
I think Turing's theoretical papers on computing helped to put a positive attitude on the topic in the minds of nonspecialists and the public.

IMHO Turing's papers were very mathematical (pure mathematics at that), so they wouldn't matter much to non-specialists and the general public. Anyway, the papers were important to the fields of pure mathematics and computer science.
 
  • #24
PainterGuy said:
Since some women contributors were mentioned, I thought to share this here.
From Lady Lovelace on, women have been programming computers.
But before that (from etymonline):

computer (n.)​

1640s, "one who calculates, a reckoner, one whose occupation is to make arithmetical calculations," agent noun from compute (v.).

But before that, "computer" was an occupation - and the jobs were filled primarily by women.

Early "core" memory was also woven by women. Those jobs were initially open to men, but apparently, we couldn't cut it (or rather "thread it").

From Science News:
Far from the Shiprock desert, outside of Boston, women employees at Raytheon assembled the Apollo Guidance Computer’s core memory with a process that in this case directly mimicked weaving. Again, the moon missions demanded a stable and compact way of storing Apollo’s computing instructions. Core memory used metal wires threaded through tiny doughnut-shaped ferrite rings, or “cores,” to represent 1s and 0s. All of this core memory was woven by hand, with women sitting on opposite sides of a panel passing a wire-threaded needle back and forth to create a particular pattern. (In some cases, a woman worked alone, passing the needle through the panel to herself.)

In this case, "outside Boston" was as often as not my home town of Lowell, MA. And the weaving was not just for Raytheon. It was also done by Lockheed who, in turn, sold these units to all US computer manufacturers at that time (the 1960s).

From my own experience, for the last half century, software engineering has consistently been as receptive to women as to men. I suppose you have to be logical - and by what logic would you exclude women?
 
  • Like
Likes PainterGuy
  • #25
While this thread is very interesting and well worth reading, I think it started off (the wording of the OP) on the wrong foot. Why does it matter at all whether Alan Turing was over- or under-appreciated, or who did the most important work at the time?

One thing is definite and that is he was treated abominably for his homosexuality; what a disgusting culture. And, to cap it all, he has been PARDONED. Pardoned for what? If the system is 'decent enough' to acknowledge that he was not in the wrong, then logic should mean that he should have been acquitted and absolved (or whatever word indicates that he did no wrong). His contribution to the war effort was massive, but that makes him no more or less deserving of proper posthumous treatment - along with everyone else who was treated so badly. The system is obsessed with avoiding restitution, but that doesn't have to come into it.
 
Last edited:
  • Like
Likes Buzz Bloom, Rive, PainterGuy and 1 other person
  • #26
sophiecentaur said:
While this thread is very interesting and well worth reading, I think it started off (the wording of the OP) on the wrong foot. Why does it matter at all whether Alan Turing was over- or under-appreciated, or who did the most important work at the time?

One thing is definite and that is he was treated abominably for his homosexuality; what a disgusting culture. And, to cap it all, he has been PARDONED. Pardoned for what? If the system is 'decent enough' to acknowledge that he was not in the wrong, then logic should mean that he should have been acquitted and absolved (or whatever word indicates that he did no wrong). His contribution to the war effort was massive, but that makes him no more or less deserving of proper posthumous treatment - along with everyone else who was treated so badly. The system is obsessed with avoiding restitution, but that doesn't have to come into it.
Lawyers are going to do law--and law has rules.
Turing was charged and convicted.
Evidence was presented and a verdict rendered.
The only lawyerly way to make that go away is to be re-tried and acquitted.
And the rules are that you CAN'T do that for a) dead people and b) repealed laws.
So the only lawyerly way to ameliorate an injustice is a pardon, which is the law's way of saying 'ignore that conviction; it no longer serves the purposes of justice as viewed from hereon forward.'

It is a struggle law always has. Canada has legalized cannabis. You can't undo tens of thousands of possession convictions that were obtained under the previous law. You can issue pardons so that those convictions don't lead to future consequences. Merriam-Webster gives 'pardon' the meaning that it has in regard to unjust convictions: 'to relieve of a penalty improperly assessed'.
 
  • Like
  • Informative
Likes .Scott, Buzz Bloom, PainterGuy and 1 other person
  • #27
N1206 said:
The only lawyerly way to make that go away is to be re-tried and acquitted.
I guess you are right, but laws can change; it just requires the will of Parliament and enough people to demand it. But turning things around can have unknown consequences. I mentioned restitution, and that could be a bottomless pit in this age of monetarianism.
 
  • Like
Likes PainterGuy
  • #28
All we can do is apologize; it will not bring him back, but we can try not to make the same mistakes going forward.

https://en.wikipedia.org/wiki/Alan_Turing_law

On a lighter note you can actually see all this stuff.

The complex, the huts, the machines, his room, and even his mug chained to the radiator! Some of it is as it was in the 1940s.

There was an old fellow there who explained how the Bombe worked; I got lost after a few minutes.

One of the things that moved me the most was reading the original letter the British government sent to his family, in the 1970s I think.

Turing was not allowed to discuss his contributions so presumably even his family did not know the role he played.

Whatever his rating was, reading that letter, the government certainly seemed to think he was important.

Anyway, if you are in the UK it is an interesting day out, and a few of the sets for the film they made about him are there as well.

https://bletchleypark.org.uk/plan-a-visit/
 
  • Like
Likes Buzz Bloom, PainterGuy and sophiecentaur
  • #29
My mother was one of the first programmers. They thought it was great when the assembler was invented and they didn't have to use machine code any more.
 
  • Like
Likes sophiecentaur, pinball1970, PainterGuy and 2 others
  • #30
Hornbein said:
My mother was one of the first programmers. They thought it was great when the assembler was invented and they didn't have to use machine code any more.
She still needed machine code.
When the first assemblers came out, you still needed to debug your programs through the "programmer's panel".
H200c.jpg

I had withdrawal symptoms when programmer's panels started disappearing.
 
  • Haha
  • Like
Likes PainterGuy, jedishrfu and anorlunda

Related to Is Alan Turing given too much credit when it comes to computers?

1. What is Alan Turing's contribution to computers?

Alan Turing is known as the father of theoretical computer science and artificial intelligence. He made significant contributions to the development of early computers, including the concept of the Turing machine and the Turing test for measuring a machine's intelligence.
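The "Turing machine" mentioned above is simple enough to sketch in a few lines of code. Below is a minimal, illustrative simulator; the state names and the little bit-inverting program are invented for this example, not drawn from Turing's 1936 paper, but the scheme (a finite control table driving a read/write head over an unbounded tape) is the concept the paper introduced.

```python
# A minimal Turing machine simulator (illustrative sketch only).
# A program is a dict mapping (state, symbol) -> (write, move, new_state).

def run_turing_machine(program, tape, state="start", blank="_"):
    """Run the machine until it reaches the 'halt' state, then
    return the tape contents with surrounding blanks stripped."""
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    head = 0
    while state != "halt":
        symbol = cells.get(head, blank)
        write, move, state = program[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# A three-rule program that inverts every bit, halting at the first blank.
invert = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(invert, "10110"))  # -> 01001
```

Despite its simplicity, this table-driven model is enough to express any computation a modern computer can perform, which is precisely why the 1936 paper matters to computer science.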

2. Why is Alan Turing given so much credit for his work with computers?

Alan Turing's contributions to computer science were groundbreaking and laid the foundation for modern computing. His work not only advanced the field of computer science, but also had a major impact on cryptography and code-breaking during World War II.

3. Are there any other individuals who deserve credit for the development of computers?

While Alan Turing is often credited as the father of computer science, there were many other individuals who also made significant contributions to the development of computers. Some notable names include Charles Babbage, Ada Lovelace, and John von Neumann.

4. How did Alan Turing's work impact the field of computer science?

Alan Turing's work had a profound impact on the field of computer science. His concept of the Turing machine laid the foundation for the development of modern computers, and his ideas on artificial intelligence continue to influence research and development in the field.

5. Is Alan Turing's contribution to computers still relevant today?

Absolutely. Alan Turing's work continues to be relevant and influential in the field of computer science. His ideas and theories are still studied and applied in various areas, including artificial intelligence, cryptography, and computer programming.
