I’m sure everyone is aware of “The Terminator”. The franchise has a huge, dedicated fan base thanks to its unique plot and captivating execution, but have you ever wondered what would happen if something like that occurred in reality? What if such evil AI robots actually started to exist?

Many tech experts and scientists, including some of the biggest personalities of the tech world like Bill Gates, Elon Musk, and Stephen Hawking, have warned that a time may come when AI systems become smart enough to wipe out the human race, just like Skynet did in the Terminator film series.

In 2017, Facebook became a trending topic after its artificial intelligence bots invented their own language and the experiment was eventually shut down for reasons that were not disclosed at the time. Reporters and news channels had a field day covering the incident, and most outlets blamed the mysterious AI bot language for the end of the experiment. Let’s explore it a little to find out what exactly happened.



In the Facebook bot experiment, the bots were made to conduct a negotiation. Mundane objects like a bat, a ball, etc. were given point values so that the negotiation would feel as close to real as possible. The bots were also instructed to improvise during the exchange, to make the process even more advanced.

One thing to notice is that the bots were not instructed to use the English language exclusively during the negotiation. During the exchange, they suddenly started talking in an incomprehensible shorthand English that the observers were unable to understand. Bob and Alice are the names of the bots involved in the experiment, and the following image shows the interaction between them.

As the picture makes clear, the language used by Alice and Bob is English, yet still incomprehensible. AI bots have learning algorithms that help them learn from users, their behavior, patterns, and so on; therefore, what happened in the Facebook bot experiment is not very unusual.

AI bots are usually designed in a way that allows them to find ways to optimize their results. So if the bots started using shorthand English to increase the efficiency of the negotiation, it’s not that big of a deal.
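The setup described above can be sketched in a few lines: each bot assigns point values to the items on the table and tries to maximize the value of its own share. The item names and point values below are hypothetical, chosen only to illustrate the idea, not taken from Facebook’s actual experiment.

```python
# Toy sketch of a valued-item negotiation like the one described above.
# Item names and point values are hypothetical, for illustration only.
ITEM_VALUES = {"bat": 3, "ball": 1, "hat": 2}  # value this agent assigns to each item

def score(allocation):
    """Total value an agent gets from the items allocated to it."""
    return sum(ITEM_VALUES[item] * count for item, count in allocation.items())

# If Bob walks away with one bat and two balls, his outcome is worth:
bob_share = {"bat": 1, "ball": 2}
print(score(bob_share))  # -> 5
```

An agent trained only to maximize a score like this has no built-in incentive to keep its messages in readable English, which is one plausible reason the shorthand drift appeared.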


So the question arises: is this the first incident of its nature, or has it happened before?

The answer to this question is “Yes”. Back in 2016, Microsoft’s chatbot “Tay” was made available on Twitter and other social platforms. It was designed as a machine learning project for human engagement. After a while of exposure to social media, Tay started posting racist comments and expletives, and Microsoft had to shut it down. Microsoft later explained that as the chatbot was exposed to social media, it picked up the behavior patterns being shown on Twitter; as a result, the bot started making racist comments in order to engage its audience. This happened mainly because the company had not provided specific filters for the system.


Above we have discussed what happened with Facebook’s bots and whether anything like it had happened before. Now, to understand the issue better, let’s discuss what machine learning is.

In machine learning, artificial intelligence uses different techniques to help a machine learn without additional hand-written software. The algorithm helps in predicting the outcome of every response. Machine learning has been applied in a number of areas, such as:

  1. The Netflix challenge
  2. Visual object detection
  3. Natural language translation
  4. Open-domain continuous speech recognition
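The core idea behind all of these applications can be illustrated with a minimal, self-contained sketch: instead of being hand-coded with a rule, the program adjusts a parameter from example data. The data, model, and learning rate here are illustrative, not from any real system.

```python
# Minimal machine learning sketch: fit y = w * x by gradient descent.
# The program "learns" w from examples instead of being told the rule.
data = [(1, 2), (2, 4), (3, 6)]  # (input, target) pairs where target = 2 * input

w = 0.0                        # parameter to be learned
for _ in range(200):           # training passes over the data
    for x, y in data:
        error = w * x - y      # how far off the current prediction is
        w -= 0.01 * error * x  # nudge w in the direction that reduces the error

print(round(w, 2))  # converges to 2.0, the rule hidden in the data
```

Real systems use far larger models and datasets, but the loop is the same: predict, measure the error, adjust.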

There is also room for error when dealing with AI software and machine learning because, at the end of the day, it’s just a program, and every program is prone to error, even the high and mighty AI. Here is a list of possible errors:

  1. Sparse text data
  2. Societal bias
  3. Going out of context
  4. Errors in interpreting the syntax and semantics of a language


Artificial intelligence experts have spoken out against the media for overdramatizing the news of Facebook’s experiment through misleading article headlines. The whole media charade revolved around one theme: “Facebook shut down experiment because AI bots invented their own language”.

After finishing the experiment, Facebook published a research paper in June that explained the experiment in detail and deemed it a completely normal research experiment. Facebook further explained that the resulting gibberish came from the bots improvising on the English language in order to negotiate better. This improvement and improvisation of language was a result of trial and error.

The main objective of the paper was not to highlight the invention of a supposedly new language, as the media dubbed it, but to show that the negotiation between the bots was successful. In short, the experiment succeeded and was finished after the desired results were achieved, not shut down abruptly as the media portrayed it; rather, it is now on hold, as Facebook has lost interest in the project.


To sum it up, the experiment conducted by Facebook certainly had its quirks; with the sudden invention of a “supposed” new language and the shutting down of the experiment, it was natural to speculate. However, the media also went all out with headlines and clickbait. After plenty of reading and research, it is clear that the bots’ language was nothing more than a shorthand, the same kind humans use with each other. And it is certainly not yet time to fear robots as if they were incarnations of the Terminator.

Author’s Bio: Adam Sheen holds an MSc in Marketing and Business from the University of Edinburgh. He is an expert in finance, management, and government taxation. He also has vast experience working as a digital marketer and loves to write on innovative topics.
