Entscheidungsproblem

by Sherwin Jaleel

The problem that kicked off the computer era

Were it not for the Entscheidungsproblem, many of us might not have a vocation in IT. History tells us that the Entscheidungsproblem triggered the theoretical underpinnings of the modern computer. But what is the Entscheidungsproblem? The name is German for ‘decision problem’, posed by David Hilbert in 1928. Simplified, it asks a daunting question: is there an algorithm to determine whether a statement is true in all models of a theory? Despite what we IT geeks believe of our programming abilities, stunningly, the answer turns out to be “no”.


The man on the £50 note – Alan Turing

Alan Turing, the young man you will find on the British £50 banknote, is credited with working out this intriguing answer and, in doing so, inadvertently inventing the modern-day computer. Look closely at the £50 banknote and you’ll find the mathematical table and formulae from Turing’s 1936 paper “On Computable Numbers, with an Application to the Entscheidungsproblem”. David Hilbert was a venerated University of Göttingen professor who believed that the basic axioms of mathematics are logically consistent.

Hilbert sought an algorithm that could determine whether a given mathematical statement was provable from the axioms of first-order logic alone. Turing’s seminal paper decisively showed that no such algorithm exists, and in the process deftly launched the computer age. Back then the concept of a computer algorithm had not yet taken shape, so Turing first established a working definition of the term ‘algorithm’. His approach was to break down seemingly complex human cogitation into simple arithmetical procedures, which he demonstrated using his now famous Turing machine.

Turing machine

Despite its simplicity, the Turing machine can simulate any computer algorithm, no matter how complex. Turing demonstrated that anything that can reasonably be said to be computable by a human following a standard procedure can be computed by a Turing machine. He used a human as the basis for his model, abstracting the actions of a human ‘computer’ working with paper and pencil into a hypothesised machine that manipulates symbols on an infinite paper tape. Turing then demonstrated that some problems cannot be decided by any Turing machine: there are machines that run indefinitely without ever delivering an answer, and no general procedure can tell in advance which ones will. This proved Hilbert’s conjecture wrong. The Turing machine was hypothetical and theoretical, but it became the conceptual model of modern computers: John von Neumann’s pioneering 1945 design for electronic computers was based on the Turing machine.
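To make the idea concrete, here is a minimal sketch of a Turing machine simulator in Python. It is illustrative only, not taken from Turing’s paper: the machine is a finite-state head that reads and writes symbols on a tape that grows on demand, standing in for the infinite paper tape. The program format (a table mapping a state and the symbol under the head to a new state, a symbol to write, and a move) and the step cap are assumptions of this sketch; the cap is needed precisely because, in general, we cannot know in advance whether a machine will halt.

```python
# Minimal Turing machine simulator (illustrative sketch).
from collections import defaultdict

def run_turing_machine(program, tape, state="start", max_steps=1000):
    """Run `program`, a dict mapping (state, symbol) to
    (new_state, write_symbol, move), until the state 'halt' is reached
    or `max_steps` expires (we cannot decide halting in general)."""
    cells = defaultdict(lambda: "_", enumerate(tape))  # '_' marks a blank cell
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        # Look up the rule for the current state and symbol, apply it.
        state, cells[head], move = program[(state, cells[head])]
        head += 1 if move == "R" else -1
    out = "".join(cells[i] for i in range(min(cells), max(cells) + 1))
    return state, out.strip("_")

# Example program: flip every bit on the tape, halt at the first blank.
flip = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}

print(run_turing_machine(flip, "1011"))  # -> ('halt', '0100')
```

Every rule is purely local, a single read, write, and move, yet tables like this suffice, in principle, to express any algorithm a modern computer can run.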

Read more about the Turing Machine here

Homebrew Computer Club

It did not take long for businesses to see the benefits of machines that could be programmed to compute algorithms. The idea of machines that could be programmed to do anything was incredibly promising and novel. By 1975 we had the Homebrew Computer Club, a group of hobbyists harnessing silicon chips to build their own computers. The club played a pivotal role in the microcomputer revolution. Perhaps the likes of Steve Jobs and Bill Gates have the Entscheidungsproblem to thank for making them household names.

The Legacy continues

Turing’s legacy did not stop with the Turing machine. Back in the 1950s, he envisaged computers so powerful that they could think. He dared to envisage the reality of Artificial Intelligence, and believed in the idea so firmly that he defined the Turing Test: a test of a machine’s ability to exhibit intelligent behaviour. In 1945 Turing predicted that computers would one day play very good chess. Five decades later, in 1997, IBM’s Deep Blue beat the reigning world chess champion. Alan Turing was clearly a man ahead of his time, and his inventions started with the Entscheidungsproblem. Today many (like me) owe their vocations and livelihoods to the field of computer science, and in a way indirectly to the Entscheidungsproblem. The relentless march of computers continues into the decades ahead, promising innovations like quantum computers and biocomputers as the world waits in anticipation rather than scepticism.