Image credit: Techopedia
If the technology’s rapid progress goes unchecked, a future “God-like AI” could result in the “obsolescence or destruction of the human race,” a prominent AI investor warned in a Financial Times column.
Although it hasn’t arrived yet, artificial general intelligence, or AGI, is regarded as the central goal of the rapidly expanding industry. AGI refers to the point at which a machine can understand or learn anything a human can. The stakes are tremendous.
A former Meta executive predicted that AI would be worth trillions of dollars by the 2030s, but others are concerned that it may lead to a “nuclear-level catastrophe.”
“A three-letter acronym doesn’t capture the enormity of what AGI would represent, so I will refer to it as what it is: God-like AI,” Ian Hogarth wrote in the FT. Hogarth said he used that term because such technology could develop on its own and transform the world without supervision.
“God-like AI could be a force beyond our control or understanding, and one that could usher in the obsolescence or destruction of the human race,” he added.
“Until now, humans have remained a necessary part of the learning process that characterizes progress in AI. At some point, someone will figure out how to cut us out of the loop, creating a God-like AI capable of infinite self-improvement,” Hogarth added. “By then, it may be too late.”
Before cofounding Songkick, a concert discovery website that was eventually acquired by Warner Music Group, Hogarth studied engineering at Cambridge University, including artificial intelligence. According to his own website, he has since invested in over 50 machine-learning firms, including Anthropic, which was founded by former OpenAI staff. He publishes an annual report titled “The State of AI.”
In a recent earnings call, Jensen Huang, CEO of Nvidia, the chip manufacturer whose GPUs are frequently used to power AI, stated that over the past ten years, AI has become one million times more powerful. According to PC Gamer, Huang anticipates that OpenAI’s ChatGPT will make a similar leap in the coming ten years.
Measured by how much they can compute per second, Hogarth wrote in his FT essay, the largest AI models have gained 100 million times more processing power over the same period.
He also cautioned that a lack of regulation, combined with intense competition among those at the vanguard of the technology, such as OpenAI and Alphabet-owned DeepMind, could produce an unstable “God-like AI.”
“They are running towards a finish line without an understanding of what lies on the other side,” he wrote.
In a 2019 interview with the New York Times, OpenAI CEO Sam Altman compared his ambitions to the Manhattan Project, which developed the first nuclear weapons. He echoed its mastermind, J. Robert Oppenheimer, saying, “Technology happens because it is possible,” and pointed out that the two share the same birthday.
AGI will have a significant impact, but according to Hogarth, whether that impact is beneficial or devastating will depend on the pace of development and how long regulation takes to catch up.