FORBES. Zuckerberg on AI: Meta Building AGI For Everyone And Open Sourcing It
John Koetsier
18 January 2024
Meta CEO Mark Zuckerberg just announced on Threads that he’s focusing Meta on building full general intelligence, or artificial general intelligence, and then releasing it as open source software for everyone.
“It’s become clearer that the next generation of services requires building full general intelligence,” he said in a personal video. “Building the best AI assistants, AIs for creators, AIs for businesses, and more … that needs advances in every area of AI, from reasoning to planning to coding, to memory, and other cognitive abilities.”
To support this effort, Zuckerberg said that Meta would have a massive array of compute power in its cloud facilities by the end of 2024: 350,000 Nvidia H100s, or around 600,000 H100 equivalents if you include other GPUs.
Only Microsoft is ordering enough H100s to build equivalent capacity, and with orders this large, H100 delivery times are stretching to as long as a year.
Each Nvidia H100, announced in 2022, contains 80 billion transistors, is up to six times faster than previous models, and has memory bandwidth of up to three terabytes per second. Nvidia’s Eos supercomputer is built from just 4,600 H100s; Meta’s planned capacity will be roughly 130 times larger.
Do the math, and Zuckerberg’s massive AI compute capacity will have 4.8e+16 transistors. That’s 48,000,000,000,000,000, or 48 quadrillion.
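For readers who want to check the back-of-the-envelope math, here is a quick sketch using the figures quoted above (the ~600,000 H100-equivalent count, 80 billion transistors per H100, and Eos’s roughly 4,600 H100s); these are rough estimates, not exact counts:

```python
# Back-of-the-envelope check of the figures quoted in the article.
h100_equivalents = 600_000      # Meta's planned H100-equivalent GPU count by end of 2024
transistors_per_h100 = 80e9     # ~80 billion transistors per H100
eos_h100s = 4_600               # H100s in Nvidia's Eos supercomputer, per the article

total_transistors = h100_equivalents * transistors_per_h100
print(f"Total GPU transistors: {total_transistors:.1e}")  # ~4.8e+16, i.e. 48 quadrillion

eos_multiple = h100_equivalents / eos_h100s
print(f"Multiple of Eos: ~{eos_multiple:.0f}x")            # ~130x
```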
With that massive amount of compute power, Zuckerberg says Meta will continue training Llama 3, as well as an “exciting roadmap of future models we’re going to be training responsibly and safely too.”
Llama 3 is a generative AI text model that some say could challenge or even surpass OpenAI’s GPT-4, currently the gold standard for generative AI models. Meta made Llama 2, its predecessor, openly available, and it sounds like something similar will happen with Llama 3.
Zuckerberg thinks AI and the metaverse are intimately connected, and he says that smart glasses will be the way most people experience AI and the metaverse together.
“A lot of us are going to talk to AI throughout the day,” he said. “I think a lot of us are going to do that using glasses, because glasses are the ideal form factor for letting an AI see what you see and hear what you hear.”
He also referenced Meta’s Ray-Ban Meta glasses, which he said are “off to a very strong start.”
Learn more:
- Mark Zuckerberg’s new goal is creating artificial general intelligence. And he wants Meta to open source it. Eventually. Maybe – The Verge
- Zuckerberg’s Meta Is Spending Billions To Buy 350,000 Nvidia H100 GPUs – PC Magazine
- Mark Zuckerberg (“Zuck”) on Instagram: “Some updates on our AI efforts. Our long term vision is to build general intelligence, open source it responsibly, and make it widely available so everyone can benefit. We’re bringing our two major AI research efforts (FAIR and GenAI) closer together to support this. We’re currently training our next-gen model Llama 3, and we’re building massive compute infrastructure to support our future roadmap, including 350k H100s by the end of this year — and overall almost 600k H100s equivalents of compute if you include other GPUs. Also really excited about our progress building new AI-centric computing devices like Ray Ban Meta smart glasses. Lots more to come soon.”
- Meta and Microsoft say they will buy AMD’s new AI chip as an alternative to Nvidia’s – CNBC