Elon Musk told investors this month that his startup xAI is planning to build a supercomputer by the fall of 2025 that would power a future, smarter iteration of its Grok chatbot, The Information reports. This supercomputer, which Musk reportedly referred to as a “gigafactory of compute,” would rely on tens of thousands of NVIDIA H100 GPUs and cost billions of dollars to build. Musk has previously said the third version of Grok will require at least 100,000 of the chips — a fivefold increase over the 20,000 GPUs said to be in use for training Grok 2.0.
According to The Information, Musk also told investors in the presentation that the planned GPU cluster would be at least four times the size of anything used today by xAI's competitors. Grok is currently at version 1.5, released in April, which is touted as able to process visual information such as photographs and diagrams in addition to text. Earlier this month, X began rolling out AI-generated news summaries powered by Grok for premium users.