
Nvidia May Be Getting A Major Boost With ChatGPT's Popularity

The OpenAI ChatGPT news may not be ending any time soon. It is safe to say that ChatGPT will not lose its sheen in the near future, as Microsoft is now using the same technology for Bing Chat. This is good news for Microsoft and OpenAI, but they are not the only ones to gain. Nvidia's data center graphics card sales are at an all-time high. The escalating growth of OpenAI's ChatGPT may require around 10,000 GPUs to support it, and Nvidia is the most likely supplier.

TrendForce, a research firm, shared some interesting estimates about where ChatGPT is headed. According to the research, the GPT model powering ChatGPT will need more hardware as development continues. TrendForce stated in its report, "The number of training parameters used in the development of this autoregressive language model rose from around 120 million in 2018 to almost 180 billion in 2020." It did not provide predictions for 2023, but it is safe to presume that these figures will keep rising as far as budget and technology permit. According to the reports, the model needed about 20,000 graphics cards for data training in 2020, and as training continues, that number is expected to rise above 30,000.

For Nvidia, this is great news. The estimate is built on the assumption that OpenAI will use Nvidia's A100 GPUs to improve the language model. These extremely powerful graphics cards cost between $10,000 and $15,000 on average. However, they are no longer Nvidia's best data center cards, so it is feasible that OpenAI would choose the more recent H100 cards instead, which are predicted to perform up to three times as well as the A100. A single H100 can cost up to $30,000, a substantial price rise for these GPUs.

Nvidia is not the only data center GPU vendor in the market; Intel and AMD also sell AI accelerators, but Nvidia is widely seen as the most reliable option for AI-related workloads. There is a good chance Nvidia will benefit whenever AI demand escalates.

The question is, do gamers need to worry about Nvidia supplying 10,000 GPUs to power ChatGPT? OpenAI's data center graphics cards have no relation to Nvidia's gaming GPUs, so gamers can relax. But if Nvidia shifted production toward data center GPUs, the supply of consumer graphics cards could be affected. In reality, the impact should be limited: even with the predicted 10,000-GPU requirement, Nvidia would not need to deliver all of them at once.
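For a rough sense of the sums involved, here is a quick back-of-envelope calculation based on the figures cited above. The GPU counts and per-card prices are the estimates reported in this article, not confirmed OpenAI or Nvidia numbers, so the totals are purely illustrative:

```python
# Back-of-envelope hardware cost estimate using the figures cited above.
# These are rough public estimates, not confirmed purchasing data.

def cost_range(gpu_count, price_low, price_high):
    """Return the (low, high) total cost in USD for a given GPU count."""
    return gpu_count * price_low, gpu_count * price_high

# ~10,000 A100 cards at an assumed $10,000-$15,000 per card
a100_low, a100_high = cost_range(10_000, 10_000, 15_000)
print(f"A100 estimate: ${a100_low:,} - ${a100_high:,}")   # $100,000,000 - $150,000,000

# ~10,000 H100 cards at an assumed ~$30,000 per card
h100_low, h100_high = cost_range(10_000, 30_000, 30_000)
print(f"H100 estimate: ${h100_low:,} - ${h100_high:,}")   # $300,000,000 - $300,000,000
```

Even at the low end, that is a nine-figure hardware outlay, which is why a single large AI customer can move the needle for Nvidia's data center business.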

By Prelo Con

