Archive for the “Framework” Category
The electronic narrative
The rapid evolution of Artificial Intelligence, after a serious crisis toward the end of the millennium, brings a mystifying aspect to the scenario of scientific dissemination, and sometimes even to scientific research itself, which portrays it either beyond its real possibilities or below what it is actually capable of.
That is why we pointed out in the previous post the real evolution and sophistication of Machine Learning algorithms and the growth of Deep Learning technology; this is the current rapid evolution. The evolution of electronic assistants (several are already on the market, such as Siri and Alexa) is still limited, and we commented in a post on the LaMDA machine, which supposedly had “sentient” capability.
Sentience is different from consciousness: it is the ability of beings to perceive sensations and feelings through the senses. In the case of machines, this would mean having something “subjective” (we have already discussed the limitation of the term and its difference from the soul), although they are capable of narratives.
This narrative, however complex it may be, is an electronic, algorithmic narrative. Through the interaction of man and machine via “deep learning”, it may confuse and even surprise human beings with narratives and elaborate speeches, but it will always depend on the human narratives from which it is fed in order to create an electronic narrative.
I cite the example of ChatGPT, which excites the mystifying discourse, raises alarm in the technophobic discourse, and creates speculation even about the transhuman limits of the machine.
A list of films considered extraordinary exemplifies the limits of electronic storytelling, since its source is human. The list gave the following films: “Citizen Kane” (1941), “The Godfather” (1972), “Back to the Future” (1985), “Casablanca” (1942), “2001: A Space Odyssey” (1968), “The Lord of the Rings: The Fellowship of the Ring” (2001), “The Shawshank Redemption” (1994), “Psycho” (1960), “Star Wars: Episode V – The Empire Strikes Back” (1980) and “Pulp Fiction” (1994).
There is no mention of the Japanese Akira Kurosawa, the German Werner Herzog or the Italian Federico Fellini, just to name a few; as for science fiction, it should not have left out Blade Runner (the android hunter), well connected to the technologies of “open AI”, or the historic Metropolis (from 1927, by the Austrian Fritz Lang).
The electronic narrative has the limitation of what feeds it, which is the human narrative; even if that narrative comes from the wisest human, it will have contextual and historical limitations.
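As a toy sketch of this dependence, a bigram (Markov chain) generator can only ever recombine words it has already seen; the tiny corpus below is an invented stand-in for the “human narratives” that feed such a system:

```python
import random

def build_bigrams(corpus: str) -> dict:
    """Map each word to the list of words that follow it in the corpus."""
    words = corpus.split()
    chain = {}
    for current, following in zip(words, words[1:]):
        chain.setdefault(current, []).append(following)
    return chain

def generate(chain: dict, start: str, length: int, seed: int = 0) -> str:
    """Walk the chain: every emitted word was seen in the human-written corpus."""
    rng = random.Random(seed)
    word, output = start, [start]
    for _ in range(length - 1):
        candidates = chain.get(word)
        if not candidates:  # dead end: no human text left to continue from
            break
        word = rng.choice(candidates)
        output.append(word)
    return " ".join(output)

# Illustrative human-written corpus; the machine can only recombine it.
corpus = "the narrative feeds the machine and the machine returns the narrative"
chain = build_bigrams(corpus)
print(generate(chain, "the", 6))
```

Whatever sequence comes out, every word in it already existed in the corpus: the electronic narrative never leaves the boundaries of the human one.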
Twitter, Cyberculture and Perpetual Peace
The purchase of Twitter for $44 billion by Elon Musk, the SpaceX and Tesla billionaire, ties social media ever more closely to the political field and shakes the empire of traditional media.
One of Musk’s basic ideas is to make the network less controlled (the moderator function) and to allow longer texts: at launch in 2006 the limit was 140 characters, in 2017 it was expanded to 280, and larger texts will probably follow; Twitter also owns the newsletter tool Revue.
Netflix lost 200,000 subscribers (few against more than 100 million subscribers, but a trend), and CNN, facing serious problems with its editorial discourse (rejected by 7 out of 10 viewers), is trying to change focus; it is the mainstream media suffering from the advance of the new media. Everything indicates that war, in the military and ideological field, will move to the cybernetic field: drones practically retire the idea of using tanks, and drone airplanes with flight autonomy make war more unequal in strength while making war material more equal.
But the loosening of moderation on Twitter is worrying; the open-source algorithm proposal is interesting, but the big question is: what are Musk’s ideas about war?
Of course, all this is reprehensible because of the number of civilian victims wars cause and the human tragedies that unfold there, also among soldiers on a battlefield where many would not want to be.
Kant’s Perpetual Peace proposed a precept of reason over power, but a strange saying appears in the middle of the text (we will analyze it in the next post): be prudent as serpents and gentle as doves. In the Biblical text (Mt 10:16) it is also possible to translate “simple as doves”, but the Kantian interpretation is divergent:
“…Be wise as serpents”; morality adds (as a limiting condition): “and without falsehood like doves” (Kant, 2008, p. 34). Kant himself points out that “the two things cannot coexist in a precept”, an evident contradiction; Perpetual Peace is more complex, of course.
This is the problem with the new media: it is often necessary, as in politics, to resort to a certain “falsehood”, or dubiousness, whether to mislead opponents or to deceive the people.
There is no way to establish peace without respect for conflicting cultures and values, within reasonable humanitarian limits of course: the first limit is life itself, put at stake in war, and the second is enabling the survival and self-determination of peoples to decide their own destiny.
Kant, I. (2008) A Paz Perpétua (Perpetual Peace). Trans. Arthur Mourão. Covilhã, Portugal: University of Beira Interior.
Digital transformation beyond the buzzword
Throughout the 10 years of this blog we alerted to and problematized the transformation being led by digital change in social, educational, industrial and even behavioral aspects; most skeptics reacted, mocked or dismissed a real change that was happening.
The pandemic has shown that these more-than-necessary tools can build bridges, establish new relationships, energize companies and avoid wasting time and money and, especially in these times, endangering health.
Now everyone lives in the digital reality: companies have survived through online services; families, social groups, public services and meetings of various kinds depend on digital tools; shows depend on live streams, meetings or posts on social media tools.
A buzzword emerged very strongly: “digital transformation”. But the danger of opportunism is great, with companies and sites exploiting and mystifying these services and charging dearly for them, so some concepts are necessary. First, what distinguishes Generation Z from the previous generation, the so-called millennials, those born before the turn of the millennium, therefore before the year 2000, and now roughly 22 to 37 years old.
The millennials followed the evolution of the Web (pages, websites and blogs); they were born into a reality in which computers were an appliance, used only at home and optionally at school. Generation Z, through cell phones, took the digital world everywhere; they create chat groups, treat the credibility of websites, blogs and media networks differently, and create their own relationships and idols, in general different from everything previously known.
Although more closed and tending toward fewer social relationships, they are more critical than the more anxious millennials, as well as more efficient and more demanding.
Thus, their relations with the market are very different: they have returned to preferring shopping in physical stores, select carefully what they buy, are less impulsive, and have grown up with excellent technological support; although very connected, they already know the limits of technology.
Major business magazines like Forbes and Fortune have analyzed Generation Z to understand the necessary market transformation. Forbes says it represents 25% of the current world population, and that digital is a natural part of their lives, like TV and radio for past generations; Fortune claims that 32% of Generation Z strive for the job of their dreams and rule out taking just any job, although they may accept one temporarily to build their future.
Thus the old CRMs (Customer Relationship Management) no longer work, and many criticisms and analyses made for the millennial generation are outdated.
According to Kasey Panetta, a researcher at Gartner, five new concepts are emerging: composite architectures, which are agile and responsive; algorithmic trust, meaning reliable products, links, websites and transactions; beyond silicon, the limits of Moore’s law of computer evolution, with smaller and more agile technologies now being sought; formative Artificial Intelligence (AI), adaptation to the client, with customization of services, times and location; and the digital me concept, a kind of passport to the digital world, with tools and websites that already know the client and their needs, forms of behavior and preferences.
So the entire digital universe that seemed stable is also going to shake, and much of what is called “digital transformation” is just digital mystification.
Panetta, Kasey. 5 Trends Drive the Gartner Hype Cycle for Emerging Technologies, 2020. Available at: https://www.gartner.com/smarterwithgartner/5-trends-drive-the-gartner-hype-cycle-for-emerging-technologies-2020. Accessed: September 15, 2020.
2020: IT predictions
Famous and historic are the predictions made in the 70s by the presidents of Digital Equipment and IBM that personal computers would never come true; by the early 80s they had. The renowned Wired magazine said at that time that they would happen, but would first be adopted in companies and then in families; the reverse happened. The magazine’s prediction for simultaneous translation was 2015; it happened in 2017, though there are still complaints about its effectiveness. The bet on hydrogen cars was for 2010; what is becoming reality are electric cars, slowly because of the market, it is true, but battery technology and the range of the cars are still evolving. Five technologies may meanwhile change the market in 2020: 5G may definitely enter the market, changing the business of smartphone operators; multiclouds will be an evolution of current cloud storage; AI, in particular Machine Learning, will enter companies and businesses, giving impetus to current IT; and, finally, many possibilities of mobility can change with the evolution of IoT.
Even waves can go to AI
The OpenAI project, although it is said to be “non-profit”, really is open: just enter the project’s blog to check the progress and the possibilities. In fact, experts pointed this out when it published a system that writes news and texts, theoretically fiction, but which can be classified as fake or, as they are being called, faketexts. The code is also open and available to developers on GitHub.
Natural language processing systems can perform tasks such as question answering, machine translation, reading comprehension and text summarization, tasks typically addressed with supervised learning on task-specific datasets; but the text base searched by GPT-2 is far wider in quantity.
GPT-2, the successor to GPT, which was just a producer of texts from basic prompts, can now read up to 40 GB of existing text from the Web, and what it produces is a little frightening for its clarity, its depth and, worst of all, its pure fiction, or more clearly: fakes.
Among its syntactic characteristics, it is superior to others of its kind, writing passages that make sense with the preceding text and maintaining style without getting lost in long sentences. The problem is that the fake news it generates can now be longer, becoming faketexts; a report from The Guardian shows the novelty and the problems.
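The summarization task mentioned above does not require a neural model to be illustrated; a classical word-frequency extractive summarizer, sketched here in plain Python (the sample text is invented), simply keeps the sentences whose words are most frequent in the document:

```python
import re
from collections import Counter

def summarize(text: str, n_sentences: int = 1) -> str:
    """Score sentences by the average frequency of their words; keep the top n."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = re.findall(r"[a-z']+", text.lower())
    freq = Counter(words)
    scored = []
    for i, sentence in enumerate(sentences):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        if tokens:
            score = sum(freq[t] for t in tokens) / len(tokens)
            scored.append((score, i, sentence))
    # Keep the n best sentences, then restore their original order.
    top = sorted(sorted(scored, reverse=True)[:n_sentences], key=lambda s: s[1])
    return " ".join(s for _, _, s in top)

text = ("Language models generate text. Language models can also summarize text. "
        "Bananas are yellow.")
print(summarize(text, 1))  # -> Language models generate text.
```

GPT-2 differs precisely in that it is not wired to one task like this: the same model, trained on web-scale text, performs summarization, translation and question answering without task-specific code.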
Web Summit in Lisbon
One of the biggest Web events took place this week; following it from afar, I could only watch videos and news, but undoubtedly the biggest star was the founder of the Web, Tim Berners-Lee, who already has a great new project, although he spoke of it between the lines.
He gave an interview, in which in fact he spoke at will without many questions, telling of the beginnings of the Web and how its growth surprised even him. He gave technical details: “I wrote the code of the first server and the code of the first browser; it was called WorldWideWeb.app” and it was on info.cern.ch.
He then said that his concern is the same as everyone’s: after 25 years we should be dealing with cyberbullying, misinformation, hate speech and privacy issues, and he asked what many are asking: “What the hell could go wrong?” To the public: “In the first 15 years … great things happened. We had Wikipedia, the Khan Academy, blogs, we had cats,” he joked, adding: “Connected humanity should be more constructive, more peaceful, than humanity disconnected”, but it just is not.
“Because we are almost at the point where half of the world will be online,” explained the British engineer, referring to the ’50/50′ moment, that is, half of humanity connected, a point expected to be reached in May 2019.
After arguing for the responsibilities of governments and companies (I believe these can come about, but slowly), he spoke indirectly of his Solid (Social Linked Data) project, stating that “as individuals we have to hold corporations and governments accountable for what is happening on the internet” and that “the idea is, from now on, everyone is responsible for making the Web a better place”, also encouraging start-ups to join this process.
He is thinking about the development of interfaces where users meet people from different cultures, but above all about ensuring the universality of the Web. According to Berners-Lee, the main point (again speaking indirectly of Solid) should be popular intervention at the global level, which made the Web “just a platform, without attitude, that should be independent, can be used for any kind of information, any culture, any language, any hardware, software”; linked data may help with this.
Tim Berners-Lee presented the #ForTheWeb movement on the same day that his World Wide Web Foundation released the report “The Case for the Web”. The event had a huge audience, more than 30 thousand people; there are several videos, but the Opening Ceremony is one of the most outstanding and also features Tim Berners-Lee; see the video: https://www.youtube.com/watch?v=lkzNZKCxM
Wikipedia and Artificial Intelligence
Having almost surpassed the point of singularity (see our post), the point at which the machine would surpass human intelligence, the question now turns to consciousness.
In this sense, the main criticism is the perpetuation of prejudices, which would prevent what I call hermeneutics; but this is an incorrect view of the evolution of digital technology, given, for example, the use of digital ontologies and the ability to seek scientific studies outside of Wikipedia.
This is what an article in The Verge recently reported; the most serious omission, found after researching scientists left out of Wikipedia, was that 82% of its written biographies are about men.
In a blog post, according to The Verge, John Bohannon, director of science at Primer, explains the development of the Quicksilver tool, which reads 500 million original documents, sifts through the most cited, and then writes a basic article about the work of scientists not mentioned in Wikipedia.
Two examples of illustrious women found and written up by the AI are Teresa Woodruff, a scientist who engineered mouse ovaries using 3D printers, cited by Time magazine in 2013 as one of the most influential people in the world of science, and Jessica Wade, a physicist at Imperial College London, who wrote the new entry for Pineau.
Wade was one of the scientists who said “Wikipedia is incredibly tantalizing, and the underrepresentation of women in science is particularly bad,” and she praised Quicksilver, stating that with it you can find large amounts of information very quickly.
Wikipedia will have to evolve with Machine Learning tools, and this may happen in the coming years. The fact that there are specific tools for this does not invalidate Wikipedia; it shows weaknesses that should be corrected.
Paul Allen died
Co-founder of Microsoft with Bill Gates (photo), he made his fortune and was in fact Microsoft’s great developer; before Microsoft, Bill Gates had worked only on a version of the Basic language. It was Allen who suggested the purchase of QDOS, a system developed by Tim Paterson while he worked at Seattle Computer Products, from which MS-DOS came, and whose sale to IBM is the origin of Microsoft’s millionaire project.
Paul Allen was familiar with Xerox’s Palo Alto system, which was an inspiration for the early versions of Windows, and later invested in Internet Explorer in heavy competition with Netscape, which triggered the so-called browser war.
Paul Gardner Allen created a foundation bearing his name in 1988 to run philanthropic projects; between 1990 and 2014 he donated more than $500 million to more than 1,500 nonprofit organizations, most for technology, arts and culture projects, but also a significant slice (about $100 million) for social development.
He died at the age of 65, a victim of cancer, in his hometown of Seattle, whose Seahawks football team he owned, along with basketball’s Portland Trail Blazers.
Is this solid or liquid?
It sounds like a joke, but it is not: the question appears in the design of the Solid website, where in fact it reads “What is Solid?”, the new Internet project by Tim Berners-Lee and MIT.
After Web 2.0, which included everyone but lacked validation of data, authorship and thought, Web 3.0 emerged from Linked Data in 2009, and this is in the composition of the name Solid: Social Linked Data. Although this reading of the acronym has been rejected, it circulated through the networks and makes complete sense. The main idea is to decentralize the Web and give it greater security, explains an article by Klint Finley in the prestigious magazine Wired.
The main idea is to give individual users full control over the use of their data, but with validation, authorship and data processing through the concept of linked data.
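A minimal sketch of this idea, assuming an invented Pod class rather than Solid's actual API: data lives as subject, predicate, object triples in a personal pod, and the owner decides who may read each predicate:

```python
# A toy model of user-controlled linked data: RDF-style triples stored in a
# personal "pod", with the owner granting read access per predicate.
# All names and the permission scheme are illustrative, not Solid's real API.

class Pod:
    def __init__(self, owner: str):
        self.owner = owner
        self.triples = []   # (subject, predicate, object)
        self.readers = {}   # predicate -> set of agents allowed to read

    def add(self, subject: str, predicate: str, obj: str, readers=()):
        self.triples.append((subject, predicate, obj))
        self.readers.setdefault(predicate, set()).update(readers)

    def query(self, agent: str, predicate: str):
        """Return matching triples only if the owner granted this agent access."""
        if agent != self.owner and agent not in self.readers.get(predicate, set()):
            return []
        return [t for t in self.triples if t[1] == predicate]

pod = Pod(owner="alice")
pod.add("alice", "knows", "bob", readers={"bob"})
pod.add("alice", "email", "alice@example.org")  # private by default

print(pod.query("bob", "knows"))  # granted: the triple is returned
print(pod.query("bob", "email"))  # denied: empty list
```

The point of the sketch is the inversion of control: the data and its access rules live with the individual, not with the platform's algorithms.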
The main startup of this project is Inrupt; according to Wired: “If everything goes as planned, Inrupt will be for Solid what Netscape was for network beginners (the Web): an easy way in.” The magazine was invited to learn about the project at Berners-Lee’s office, where he revealed several concerns.
Despite all the good we have achieved, the Web has fallen into a cycle of inequality and division, captured by “powerful forces that use it for their own interests,” Berners-Lee said, adding: “I have always believed that the network is for everyone. That is why I and others fought hard to protect it,” and now a decisive step has been taken.
The Inrupt screen will bring together functions like WhatsApp, Google Drive and Spotify; it all looks the same, but the difference is that control will be personal: the individual will define their priorities and strategies, not the algorithms of social networks.
It is also an emerging need: just look at the screen of your cell phone or computer. Personally, I install few things, yet we all see a multitude of applications we never use, like a wardrobe full of old clothes waiting for an occasion that never comes.
The Solid project is here to stay; even if it is new and much of it is still a promise, it is easy to see its viability, necessity and potential through MIT’s seal.
Deep Mind Advanced Project
Projects that attempted to simulate brain synapses, the communication between neurons, were formerly called neural networks, and they saw great development and many applications.
Gradually these projects moved toward studies of the mind, and the code was directed to Machine Learning, which, now using deep neural networks, came to be called Deep Learning; an advanced project is Google Brain.
Basically, it is a system for the creation and training of neural networks that detect and decipher patterns and correlations in the systems where they are applied; although analogous, they only imitate the way humans learn and reason about certain patterns.
Deep Learning is a branch of Machine Learning that operates a set of algorithms to model data in a deep graph (complex networks) with several processing layers which, unlike earlier neural network training, operate with both linear and non-linear patterns.
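A minimal example of why layers and non-linearity matter: a single linear unit cannot compute XOR, but a hand-wired two-layer network with step activations can (the weights below are chosen by hand for illustration, not learned):

```python
def step(x: float) -> int:
    """Non-linear threshold activation."""
    return 1 if x > 0 else 0

def xor_net(x1: int, x2: int) -> int:
    # Hidden layer: h1 fires if at least one input is on, h2 only if both are.
    h1 = step(x1 + x2 - 0.5)
    h2 = step(x1 + x2 - 1.5)
    # Output layer combines the hidden features: "at least one, but not both".
    return step(h1 - h2 - 0.5)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor_net(a, b))
```

Without the non-linear step function, stacking layers collapses into a single linear map, and XOR stays out of reach; this is exactly the "linear and non-linear patterns" point above.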
One platform that works with this concept is TensorFlow, which originated from an earlier project called DistBelief and is now an open-source system, released under the Apache 2.0 license in November 2015; Google Brain uses this platform.
In May 2016, Google announced for this system the TPU (Tensor Processing Unit), a programmable artificial-intelligence accelerator with high throughput for low-precision (8-bit) arithmetic, which runs models rather than training them as neural networks did; a deep compute engine stage begins.
In the second step of this process, in Google Compute Engine, the second generation of TPUs achieves up to 180 teraflops (10^12 floating-point operations per second), and mounted in clusters of 64 TPUs they work at up to 11.5 petaflops.
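The cluster figure follows directly from the per-chip number, as a quick arithmetic check shows: 64 TPUs at 180 teraflops each give 11.52 petaflops, matching the roughly 11.5 quoted.

```python
# Sanity check on the second-generation TPU cluster figure cited above.
TERA, PETA = 1e12, 1e15

per_tpu_flops = 180 * TERA          # 180 teraflops per second-generation TPU
cluster_flops = 64 * per_tpu_flops  # 64 TPUs mounted in one cluster

print(cluster_flops / PETA, "petaflops")  # 11.52
```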