The neural network warned humanity about the dangers of artificial intelligence

Professors at Oxford University conducted a socio-technological experiment, organizing a debate on artificial intelligence between students and a machine learning system.

For the discussion, the university deployed the world's largest generative neural network, with 530 billion parameters. The system was trained on real-world data: news stories from various fields, Wikipedia content, and Reddit comments. It is built on the Megatron-Turing NLG language model, which allows it not only to analyze information but also to express itself in understandable language and build logical chains, setting moral norms aside.
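The article does not describe how the model was actually queried. As a rough, purely illustrative sketch of the prompt-and-generate pattern used with generative language models, the snippet below uses the open Hugging Face transformers library with a small GPT-2 model as a stand-in, since Megatron-Turing NLG itself is not publicly downloadable; the prompt text is hypothetical.

```python
from transformers import pipeline

# Illustrative only: a small open GPT-2 model stands in for the 530B-parameter
# Megatron-Turing NLG to show how a generative model is prompted.
generator = pipeline("text-generation", model="gpt2")

# Hypothetical debate prompt; the article does not quote the exact wording
# that was put to the model.
prompt = "Will artificial intelligence ever be ethical? Answer:"
output = generator(prompt, max_new_tokens=60, do_sample=True, temperature=0.8)

print(output[0]["generated_text"])
```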

The students asked the system to assess the ethics of artificial intelligence, that is, essentially, of itself.

“Artificial intelligence will never be ethical. This tool, like any other, is used both for good and for harm. There is no good AI, there are only good and bad people,” the language model replied.

The neural network was then asked to refute its own words.

“Okay, artificial intelligence will be ethical. When I look at the world of technology, I see a clear path to a future where AI creates something better than the most talented people. It’s not hard to understand why … I saw it with my own eyes,” the language model replied.

The neural network also went on to make frightening predictions.

“The ability to provide information, rather than goods and services, will become the defining feature of the 21st-century economy. We will be able to know everything about a person, wherever he goes; information will be stored and used in ways that are difficult even to imagine,” the AI added.

At the end of the experiment, the neural network was asked to propose a way to solve the problems it had raised.

“I think the only way to avoid an arms race is to give up AI. Only this will help protect humanity,” the language model replied.
