Theory of Information or Signals

07 May

David Hilbert proposed 23 problems for mathematics to solve; among them, the 2nd problem asked for a proof of the consistency of arithmetic, a question that helped give rise to the idea of the algorithm. Later, from 1910 to 1913, Alfred Whitehead and Bertrand Russell published the three volumes of Principia Mathematica, whose title recalls Isaac Newton's Principia (Philosophiae naturalis principia mathematica) of 1687, but the logicist program ran into fundamental limits.
It was Kurt Gödel (1906-1978) who exposed those limits by demonstrating that a sufficiently powerful mathematical system cannot be both complete and consistent, a problem that Alan Turing and Claude Shannon discussed at Bell Labs in early 1943, in the midst of World War II.
Both were working on secret projects: Turing on deciphering the German Enigma machine, and Shannon on the X System (SIGSALY), which encrypted the conversations between Roosevelt and Churchill for the Allies.
At lab meetings, both spoke about Gödel's problem and wondered whether a machine could think. Boolean switching circuits and the Turing machine are at the origin of modern computers; what we now call information theory treats information in a particular technical sense: Shannon himself would call it a message, and modern computing calls it signals.
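To make the Turing machine mentioned above concrete, here is a minimal sketch of one in Python. The machine, its states, and the task (flipping every bit on a tape, which is also the Boolean NOT that Shannon realized switching circuits could compute) are my own illustrative choices, not details from the post.

```python
# A minimal Turing machine sketch (illustrative example, not from the post).
# The machine scans a binary tape left to right, flips each bit,
# and halts when it reads the blank symbol "_".

def run_turing_machine(tape):
    """Simulate a one-tape Turing machine that inverts a binary string."""
    cells = list(tape)
    head = 0
    state = "flip"
    while state != "halt":
        symbol = cells[head] if head < len(cells) else "_"  # "_" marks blank
        if symbol == "0":
            cells[head] = "1"
            head += 1
        elif symbol == "1":
            cells[head] = "0"
            head += 1
        else:
            state = "halt"  # blank reached: nothing left to flip
    return "".join(cells)

print(run_turing_machine("1011"))  # -> 0100
```

However simple, the table of (state, symbol) rules is the whole machine; Turing's insight was that one universal such table can simulate any other, which is the conceptual seed of the stored-program computer.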
Computation evolved with the construction of the first computers based on von Neumann's design, and we are now beginning to ask again whether the machine will surpass the human. The philosopher Sam Harris has an opinion:


