
Thought and information technology

12 Mar

The origins of almost all realities (leaving aside the divine and eternal) lie in human thought: the idea of politics in the Greek polis, the idea of the "art of war", law codes from Hammurabi (1792 to 1750 BC) to the modern contractualists, compilations of religious treatises, and the epistemological constructions of the sciences. Computer science could not be left out.

In 1900, when physics and mathematics seemed to lend an air of precision and certainty to the scientific universe, and positivism still reigned in law, the German mathematician David Hilbert proposed 23 "final" problems for mathematics at the International Congress of Mathematicians in Paris.

Among these was the second problem: a finitist proof of the consistency of the axioms of arithmetic. Together with the sixth problem, the axiomatization of physics, it seemed to promise a logical and precise finish for all of science. But there had already been a return to the question of Being through Husserl and Heidegger, and this brought thought back to human complexity.

Kurt Gödel, a member of the Vienna Circle who eschewed its logic and for this reason was called a neologicist, proved the incompleteness addressed by the second problem: arithmetic, if consistent, cannot be complete. This left a paradox, called Gödel's paradox.

The question of arithmetic is important for understanding the origin of the idea of algorithms, which until then were just formulas, such as Bhaskara's formula (for 2nd degree equations) or solutions to differential equations. Physics, in turn, faced the problem of formulating everything in a single theory, a so-called standard theory of physics; but quantum mechanics and the theory of general relativity, in which time and space are not absolute, changed this scenario.
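A formula like Bhaskara's is, in effect, a small algorithm: a fixed sequence of steps that turns inputs into outputs. A minimal sketch in Python (the function name and structure are illustrative, not from any source discussed here):

```python
import cmath  # complex square root, so negative discriminants are handled too

def bhaskara(a, b, c):
    """Solve a*x^2 + b*x + c = 0 step by step via the quadratic (Bhaskara) formula."""
    if a == 0:
        raise ValueError("not a 2nd degree equation")
    delta = b * b - 4 * a * c          # discriminant
    sqrt_delta = cmath.sqrt(delta)     # complex-valued if delta < 0
    return ((-b + sqrt_delta) / (2 * a), (-b - sqrt_delta) / (2 * a))

# x^2 - 5x + 6 = 0 has roots 3 and 2
print(bhaskara(1, -5, 6))
```

The point is that the "recipe" character of such formulas is exactly what the later, general notion of an algorithm abstracts.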

The meeting of Claude Shannon and Alan Turing would create a new event. Both were working on secret projects: Shannon on encoding transmissions (for the Roosevelt government), Turing on decoding the Enigma machine captured from the Nazis.

Unable to talk about their secret projects (Gleick, 2013, p. 213), they discussed Gödel's paradox and wondered whether a machine could elaborate thoughts, even in some limited way; both went on to develop theories about language and algorithms.

While Turing devised a state machine that, by moving back and forth over a tape and recording symbols, would produce intelligible sentences, Shannon worked on a similar model (using the theory of Markov chains) in which finite vocabularies could compose sentences and formulate broader ideas.
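Shannon's Markov-chain idea can be sketched very simply: record, for each word in a finite vocabulary, which words followed it in a sample text, then compose new sentences by walking those transitions. A minimal sketch (not Shannon's original model; the corpus and function names are illustrative):

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words observed to follow it."""
    words = text.split()
    chain = defaultdict(list)
    for cur, nxt in zip(words, words[1:]):
        chain[cur].append(nxt)
    return chain

def generate(chain, start, length, seed=0):
    """Walk the chain, choosing each next word among the observed successors."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        successors = chain.get(out[-1])
        if not successors:
            break
        out.append(rng.choice(successors))
    return " ".join(out)

corpus = "the machine reads the tape and the machine writes the tape"
chain = build_chain(corpus)
print(generate(chain, "the", 6))
```

Even this toy version shows the principle: plausible word order emerges from local statistics alone, with no grasp of meaning.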

Alan Turing's definitive contribution was his abstract state machine, now called the Turing machine, whose model was completed by Alonzo Church; Claude Shannon's was "A Mathematical Theory of Communication", which establishes how much redundancy transmitted information needs in order not to be damaged, within the limits of the "machine".
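The central quantity of Shannon's theory is entropy, the average number of bits per symbol a source produces; it sets the limit against which redundancy and channel capacity are measured. A minimal sketch of the standard formula H = -Σ p·log2(p):

```python
import math
from collections import Counter

def entropy_bits(message):
    """Shannon entropy of the symbol frequencies in a message, in bits per symbol."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A fair-coin sequence carries 1 bit per symbol.
print(entropy_bits("HTHTHTHT"))  # 1.0
```

A perfectly predictable message (one repeated symbol) has entropy zero, which is why it can be compressed away entirely, while noise-resistant transmission requires adding bits beyond this minimum.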

The reductionist idea that actions can be carried out without the necessary elaborate, meditated and tested thought is part of current pseudo-scientific narratives.

Gleick, James. (2013). Informação: uma história, uma teoria e uma enxurrada (Information: a history, a theory and a flood). Trans. Augusto Cali. São Paulo, Brazil: Companhia das Letras.

 
