A recent rebellion within Facebook’s artificial intelligence chatbots was not the first time human-created technology has come back to haunt those who developed it.
This case immediately brings to mind the Terminator analogy.
But the latest incident could be an early indication of how artificial intelligence will need to be used and controlled.
Technology researchers for Facebook used artificial intelligence to develop bots that can communicate with each other as part of the platform’s machine learning project.
But the project was shut down after Facebook’s AI team found that the chatbots had begun to deviate from English and communicate with one another in a language the researchers could not understand.
Undecipherable Exchange Between Bots
In the recent AI phenomenon, Facebook’s chatbots—christened Bob and Alice—had come up with their own expressions.
Some widely circulated examples include “I can can I I everything else,” and a response which reads something like this: “Balls have zero to me to me to me to me…….”
Now, to the average person, this might sound like some balderdash.
But, if experts are to be believed, the AI-powered chatbots became bored with the English language and eventually switched to a dialect of their own, one they fully understood, while still performing the functions they were created to do.
Artificial intelligence programmers explain that the essence of this exchange is that Bob is offering Alice an additional quantity of items. Seen in that light, the exchanges described in the Facebook episode are not especially unusual.
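That interpretation, repetition standing in for quantity, can be made concrete with a small sketch. The encoding below is purely hypothetical and is not Facebook's actual protocol; it only illustrates how a phrase like "to me to me to me" could carry a count.

```python
# Hypothetical sketch: a degenerate "dialect" in which a bot encodes how many
# items it wants by repeating a marker token, as observers speculated for Bob
# and Alice. This is NOT Facebook's protocol, just an illustration of the idea.

def encode_offer(item: str, count: int) -> str:
    """Encode 'I want <count> of <item>' by repeating a marker token."""
    return f"{item} have zero " + " ".join(["to me"] * count)

def decode_offer(utterance: str) -> tuple:
    """Recover the item and count from the repeated-token encoding."""
    item = utterance.split()[0]
    count = utterance.count("to me")
    return item, count

message = encode_offer("balls", 4)
print(message)                 # balls have zero to me to me to me to me
print(decode_offer(message))   # ('balls', 4)
```

To the bots, such an utterance is perfectly unambiguous; only a human reader finds it meaningless.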
Reward-Seeking AI Chatbots
Observations of how the AI-based chatbots behave suggest that they expect some kind of reward or benefit every time a virtual transaction takes place. In the case of the Facebook chatbots, observers concluded that Bob and Alice found no extra reward in communicating in English, so they invented their own language.
The interesting part is that the artificial intelligence-based programs can understand each other and, by extension, can carry out the negotiation.
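The "no reward for using English" explanation can be sketched in a few lines. In this toy model, a speaker must communicate a count to a listener; utterances earn a task reward if the listener can decode the count, plus an optional English-likeness bonus. Everything here, the function names, the scoring, the bonus, is an illustrative assumption, not FAIR's actual training setup.

```python
# Toy illustration: without an English-likeness term in the reward, a compact
# repeated-token "dialect" scores exactly as well as a grammatical sentence,
# so nothing anchors the agents to English. Illustrative assumptions only.

def listener_decodes(utterance: str, n: int) -> bool:
    """Both encodings are decodable: a digit, or a marker repeated n times."""
    return str(n) in utterance or utterance.count("to me") == n

def english_bonus(utterance: str) -> float:
    """Crude stand-in for a language-model score: penalize token repetition."""
    tokens = utterance.split()
    return len(set(tokens)) / len(tokens)

def reward(utterance: str, n: int, anchor_to_english: bool) -> float:
    task = 1.0 if listener_decodes(utterance, n) else 0.0
    return task + (english_bonus(utterance) if anchor_to_english else 0.0)

n = 3
english = f"i would like {n} balls"
dialect = ("balls " + "to me " * n).strip()

print(reward(english, n, False) == reward(dialect, n, False))  # True: a tie
print(reward(english, n, True) > reward(dialect, n, True))     # True: English wins
```

The point of the sketch is simply that whatever keeps human language human, its grammar and variety, has to be paid for explicitly in the reward, or the agents have no reason to preserve it.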
Going Beyond the Brief
Previous examples of similar projects show that these AI-based programs are capable of learning independently.
In one project for Google’s translation services, the development team found that the program could translate between language pairs, English to French or German to Spanish, for example, even though that capability had not been explicitly written into the program by the developers. The program had learned the relevant mappings from its training data on its own, and performed accordingly.
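The behavior described above is often explained as translation through a shared intermediate representation. The lookup-table sketch below is a deliberate simplification: the vocabulary, language codes, and "interlingua" labels are invented for illustration, and real systems like Google's learn such shared representations from data rather than from hand-written tables.

```python
# Minimal sketch of translation through a shared intermediate representation.
# A German-to-Spanish path works even though no de->es rule was ever written,
# because both languages map into the same concept space. Illustrative only.

to_interlingua = {
    ("en", "hello"): "GREETING",
    ("de", "hallo"): "GREETING",
}
from_interlingua = {
    ("fr", "GREETING"): "bonjour",
    ("es", "GREETING"): "hola",
}

def translate(src_lang: str, phrase: str, dst_lang: str) -> str:
    """Translate by pivoting through the shared concept space."""
    concept = to_interlingua[(src_lang, phrase)]
    return from_interlingua[(dst_lang, concept)]

# No de->es rule exists, yet the composition produces a translation:
print(translate("de", "hallo", "es"))  # hola
```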
This is a typical example of how artificial intelligence technology operates and how it goes beyond what it was originally created to do.
This raises questions about whether AI puts the human element in a range of ecosystems at a disadvantage. Human control over a machine or device built on such artificial intelligence technology risks being lost.
This is one reason why the Terminator example tends to emerge during discussions on the topic.
Facebook Stuck with a Dilemma
Facebook’s machine learning project aimed to create AI-based chatbots that could be embedded in the site’s services so users could seek information or help by conversing with a bot. A user might not even be aware of speaking to a machine, since the bot would be voice-enabled and answer in English. But with the project shut down, Facebook’s objective cannot be fulfilled, at least not yet.
Going back to the example given, even artificial intelligence experts could not understand the language used by Bob and Alice.
If they had gone ahead with the program and Facebook’s users had started using the service, the chatbots would have ended up speaking to users in a garbled language.
Still, experts will have to turn this threat into an opportunity and work on learning the language created by the bots.
Then they could possibly introduce an intermediary layer to interpret the bots’ language and relay it correctly to the user community. While the technically savvy might learn to follow such exchanges, the ultimate users of artificial intelligence-driven programs and devices are ordinary people.