Meta is spending billions of dollars on Nvidia’s popular computer chips, which are the foundation of artificial intelligence research and projects.
In an Instagram Reels post on Thursday, Zuckerberg said the company has to build a “massive compute infrastructure” in order to fulfil its AI “future roadmap.” According to Zuckerberg, that infrastructure will include 350,000 Nvidia H100 graphics cards by the end of 2024.
It is not known how many graphics processing units (GPUs) the company has already acquired, but Zuckerberg did note that the H100 only became available, in limited supply, in late 2022. According to Raymond James analysts, Nvidia is charging between $25,000 and $30,000 for the H100, and the cards can fetch over $40,000 on eBay. If Meta paid at the low end of that range, the bill would come to around $9 billion.
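For a rough sense of scale, assuming the low end of the analysts’ estimate and no volume discount: 350,000 cards × $25,000 per card ≈ $8.75 billion, which is where the roughly $9 billion figure comes from.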
Zuckerberg also said that Meta’s compute infrastructure will include “almost 600k H100 equivalents of compute, if you include other GPUs.” Tech giants including Microsoft, Meta, and OpenAI announced in December that they would be utilising AMD’s new Instinct MI300X AI chips.
Meta needs these powerful chips to begin its research into artificial general intelligence (AGI), which Zuckerberg described as a “long term vision” for the company. AGI, a hypothetical form of AI with intelligence equivalent to a human’s, is also being pursued by OpenAI and Google’s DeepMind division.
During a media event in San Francisco last month, Yann LeCun, the chief scientist of Meta, emphasised the significance of GPUs.
“[If] you believe AGI is in, the more GPUs you need to purchase,” LeCun stated at the time. “There is an AI war, and he’s supplying the weapons,” LeCun remarked in reference to Nvidia CEO Jensen Huang.
According to Meta’s third-quarter financial report, the company’s total expenses for 2024 are expected to be between $94 billion and $99 billion, driven in part by the expansion of its computing infrastructure.
During the analyst call, Zuckerberg stated, “In terms of investment priorities, AI will be our biggest investment area in 2024, both in engineering and compute resources.”
On Thursday, Zuckerberg also announced that Meta plans to “open source responsibly” its yet-to-be-developed “general intelligence,” as it has done with its Llama family of large language models.
According to Zuckerberg, Meta is training Llama 3 and fostering closer collaboration between its Fundamental AI Research (FAIR) team and its GenAI research team.
Shortly after Zuckerberg’s post, LeCun stated on X that “To accelerate progress, FAIR is now a sister organization of GenAI, the AI product division.”