News Express Planet

AI Once Deemed Too Dangerous to Release Has Now Been Released

November 19, 2019, 11:41

An artificial intelligence model that its creators previously deemed too dangerous to release has now been made fully available to the public.

The model is known as “GPT-2”. Its researchers feared the AI was so powerful that it could be maliciously misused by almost anyone, from politicians to scammers.

GPT-2 was built for a simple purpose: fed a piece of text, it predicts the words most likely to come next. By repeating this step, it can produce long passages of writing that are difficult to distinguish from text written by human beings.
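GPT-2 itself is a large transformer network, but the core task it is trained on — predicting a likely next word from the text so far, then repeating — can be illustrated with a toy bigram model. The corpus, function names, and sampling scheme below are illustrative assumptions, not OpenAI's actual method:

```python
import random
from collections import defaultdict

def train_bigrams(corpus):
    """Map each word to the list of words that follow it in the corpus."""
    words = corpus.split()
    model = defaultdict(list)
    for prev, nxt in zip(words, words[1:]):
        model[prev].append(nxt)
    return model

def generate(model, prompt, length=5, seed=0):
    """Extend the prompt by repeatedly sampling a plausible next word."""
    rng = random.Random(seed)
    words = prompt.split()
    for _ in range(length):
        choices = model.get(words[-1])
        if not choices:  # no known continuation; stop early
            break
        words.append(rng.choice(choices))
    return " ".join(words)

corpus = "the model reads the text and the model predicts the next word"
model = train_bigrams(corpus)
print(generate(model, "the model"))
```

Where this sketch samples from counts of word pairs, GPT-2 scores every possible next token with a neural network conditioned on the entire preceding context, which is what lets it stay coherent over long passages.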

GPT-2 turned out to be extraordinarily good at this job. It produced text so convincing that it could easily be used to scam people, and its existence may lead us to distrust the text we read every day.

Moreover, this model could prove dangerous in the hands of extremist groups, who could use it to generate synthetic propaganda: automatically produced long texts promoting white supremacy or jihadist Islamism, depending on their objectives.

OpenAI wrote in a February blog post that it was not releasing the trained model because of concerns that miscreants could put the technology to malicious use. Instead, it announced that it was releasing a much smaller model, along with a technical paper, as an experiment in responsible disclosure for researchers to work with.

At that time, the organization released only a very limited version of the tool, with 124 million parameters. Now the full version, with 1.5 billion parameters, has been made available.
