Billionaire Elon Musk's xAI is using 100,000 NVIDIA H100 GPUs to train its Grok 3 AI (artificial intelligence) model, while Grok 2 is slated to launch in August.
According to WCCF Tech, billionaire Elon Musk has revealed that xAI's next-generation Grok 3 large language model (LLM) is being trained on 100,000 NVIDIA H100 GPUs. This is a huge number, far exceeding the roughly 40,000 A100 GPUs reportedly used to train OpenAI's GPT-4.
Billionaire Elon Musk's xAI is joining the race to develop AI chatbots with its Grok LLM family. Grok 2, an improved version of Grok and Grok 1.5, is expected to launch in August. However, Mr. Musk has already begun promoting Grok 3, promising it will be a superior product.
The use of 100,000 NVIDIA H100 GPUs, one of the most powerful GPU lines available today, shows xAI's ambition to create an AI model capable of surpassing its current competitors. However, it also raises questions about the enormous cost of training the model, estimated at up to 3 billion USD.
Previously, Mr. Musk also announced plans to acquire NVIDIA's Blackwell B200 AI accelerators for xAI, at a cost of up to 9 billion USD. This shows that xAI is investing heavily in hardware and infrastructure to compete in the increasingly heated AI field.
With these moves, xAI and its Grok products are attracting significant attention from the technology community. Whether Grok 3 truly becomes 'something special', as Mr. Musk claims, remains to be seen.