Josh Edelson | AFP | Getty Images
Meta is spending billions of dollars on Nvidia's popular computer chips, which are at the heart of artificial intelligence research and projects.
In an Instagram Reels post on Thursday, Zuckerberg said the company's "future roadmap" for AI requires it to build a "massive compute infrastructure." By the end of 2024, Zuckerberg said, that infrastructure will include 350,000 H100 graphics cards from Nvidia.
Zuckerberg did not say how many of the graphics processing units (GPUs) the company has already purchased, but the H100 did not hit the market until late 2022, and then only in limited supply. Analysts at Raymond James estimate Nvidia is selling the H100 for $25,000 to $30,000, and on eBay the cards can cost over $40,000. If Meta were paying at the low end of that price range, the purchases would amount to close to $9 billion.
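For a rough sense of scale, that figure can be reproduced directly. The sketch below assumes Meta pays the quoted per-card prices for all 350,000 units, which real volume pricing would almost certainly change; the actual terms are not public.

```python
# Back-of-the-envelope check of the ~$9 billion estimate above.
# Assumes Meta pays the quoted per-card prices for all 350,000 H100s;
# actual terms at this volume are not disclosed.
h100_count = 350_000
price_low, price_high = 25_000, 30_000  # Raymond James estimate, USD per card

spend_low = h100_count * price_low    # 8.75 billion USD
spend_high = h100_count * price_high  # 10.5 billion USD
print(f"${spend_low / 1e9:.2f}B to ${spend_high / 1e9:.2f}B")  # $8.75B to $10.50B
```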
Additionally, Zuckerberg said Meta's compute infrastructure will include "almost 600k H100 equivalents of compute if you include other GPUs." In December, tech companies including Meta, OpenAI and Microsoft said they would use AMD's new Instinct MI300X AI chips.
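To illustrate the "H100 equivalents" framing: each accelerator type is weighted by an assumed performance ratio relative to an H100, and the weighted counts are summed. The mix and the weights in the sketch below are hypothetical placeholders, not figures Meta has disclosed.

```python
# Hypothetical sketch of H100-equivalent accounting: weight each GPU
# type by an assumed H100-relative performance ratio, then sum.
# Counts other than the 350,000 H100s, and all ratios, are placeholders.
fleet = {
    "H100": (350_000, 1.0),        # stated count, reference weight
    "other_gpus": (300_000, 0.83), # assumed count and H100-relative weight
}
h100_equivalents = sum(count * weight for count, weight in fleet.values())
print(f"{h100_equivalents:,.0f} H100 equivalents")  # ~599,000 under these assumptions
```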
Meta needs these heavy-duty chips as it pursues research in artificial general intelligence (AGI), which Zuckerberg called a "long-term vision" for the company. OpenAI and Google's DeepMind unit are also researching AGI, a futuristic form of AI comparable to human-level intelligence.
Meta's chief scientist, Yann LeCun, stressed the importance of GPUs during a media event in San Francisco last month.
"[If] you think AGI is in, the more GPUs you have to buy," LeCun said at the time. Of Nvidia CEO Jensen Huang, LeCun said, "There is an AI war, and he's supplying the weapons."
In Meta's third-quarter earnings report, the company said total expenses for 2024 will be in the range of $94 billion to $99 billion, driven in part by its computing expansion.
"In terms of investment priorities, AI will be our biggest investment area in 2024, both in engineering and computer resources," Zuckerberg said on the call with analysts.
Zuckerberg said on Thursday that Meta plans to "open source responsibly" its yet-to-be-developed "general intelligence," an approach the company is also taking with its Llama family of large language models.
Meta is currently training Llama 3, and it is also bringing its Fundamental AI Research team (FAIR) and its GenAI research team into closer collaboration, Zuckerberg said.
Shortly after Zuckerberg's post, LeCun said in a post on X that "To accelerate progress, FAIR is now a sister organization of GenAI, the AI product division."
— CNBC’s Kif Leswing contributed to this report