
Why the ‘Godfather of AI’ decided he had to ‘raise the alarm’ on the tech he helped create

(Image credit: University of Toronto)

Geoffrey Hinton, commonly referred to as the "Godfather of AI," said he became concerned about how intelligent the technology was growing and decided he had to "blow the whistle" on it.

“I’m just a scientist who suddenly realized that these things are getting smarter than us,” Hinton told CNN’s Jake Tapper in an interview on Tuesday. “I want to sort of blow the whistle and say we should worry seriously about how we stop these things getting control over us.”

Hinton's groundbreaking work on neural networks powers the artificial intelligence systems behind many of today's products. He made headlines on Monday after resigning from Google, where he had worked for ten years, in order to speak publicly about his growing concerns regarding the technology.

In an interview with The New York Times on Monday, which broke the news of his departure, Hinton expressed his worries about AI's ability to destroy jobs and usher in a day when many people "will not be able to know what is true anymore." He also emphasized how quickly the technology had advanced, far faster than he and others had predicted.

(Image credit: The New York Times)

“If it gets to be much smarter than us, it will be very good at manipulation because it will have learned that from us, and there are very few examples of a more intelligent thing being controlled by a less intelligent thing,” Hinton told Tapper on Tuesday. “It knows how to program so it’ll figure out ways of getting around restrictions we put on it. It’ll figure out ways of manipulating people to do what it wants.”

His warnings follow an open letter calling for a pause on advanced AI development, released by the Elon Musk-funded Future of Life Institute just two weeks after OpenAI unveiled GPT-4, an improved version of the technology that drives the popular chatbot ChatGPT.
In early tests and a company demo, GPT-4 was used to prepare litigation, ace standardized tests, and create a functional website from a hand-drawn sketch.

Apple co-founder Steve Wozniak, one of the letter's signatories, spoke on "CNN This Morning" on Tuesday about the technology's potential to spread false information. “Tricking is going to be a lot easier for those who want to trick you,” Wozniak told CNN. “We’re not really making any changes in that regard – we’re just assuming that the laws we have will take care of it.”

(Image credit: www.governing.com)

Wozniak also said “some type” of regulation is probably needed. Hinton, for his part, told CNN he did not sign the petition. “I don’t think we can stop the progress,” he said. “I didn’t sign the petition saying we should stop working on AI because if people in America stop, people in China wouldn’t.” But he admitted he does not have a clear answer for what to do instead.

“It’s not clear to me that we can solve this problem,” Hinton told Tapper. “I believe we should put a big effort into thinking about ways to solve the problem. I don’t have a solution at present.”

By Awanish Kumar

I keep abreast of the latest technological developments to bring you unfiltered information about gadgets.
