Artificial general intelligence (AGI) could be around the corner if Meta CEO Mark Zuckerberg has any say in it. The Facebook founder announced on Instagram that he is pouring more than $10 billion into the computing infrastructure needed to develop AGI — AI that can match or surpass humans across a range of cognitively demanding tasks.
" Today I ’m bringing Meta ’s two AI research efforts closer together to support our long - terminal figure goals of build ecumenical intelligence , open - source it responsibly , and draw it available and useful to everyone in all of our day-to-day lives , " Zuckerberg tell Jan. 18 in a recorded message . " It ’s unmortgaged that the next generation of religious service want build full general intelligence , build the best AI assistants , AIs for Almighty , AIs for businesses and more that demand service in every arena of AI . "
Unlike artificial intelligence (AI) systems today, which are highly specialized and can't grasp nuance and context as well as humans, an AGI system would be able to solve problems in a wide range of environments, according to a 2019 essay published in the journal EMBO Reports. It would therefore mimic the key features of human intelligence, in particular learning and flexibility.
Related: 3 scary breakthroughs AI will make in 2024
Achieving AGI may also feel like a point of no return for the human race — with Google CEO Sundar Pichai saying as far back as 2018 that the field of AI research is "more profound than electricity or fire." Last year, dozens of experts and prominent figures — including OpenAI CEO Sam Altman and Microsoft founder Bill Gates — signed a statement stressing the collective need for humanity to mitigate "the risk of extinction from AI" alongside other societal-scale risks such as pandemics and nuclear war. That said, many scientists think humans can never build AGI.
— ChatGPT will lie, cheat and use insider trading when under pressure to make money, research shows
— Last year AI entered our lives — is 2024 the year it'll change them?
— Scientists used AI to build a low-lithium battery from a new material that took just hours to discover
But Zuckerberg announced in an Instagram reel that the company is buying 350,000 Nvidia H100 graphics processing units (GPUs) — some of the most powerful graphics cards in the world — which are key to training today's most powerful AI models. This will more than double Meta's total computing power for AI training, with Meta aiming to wield computing power equivalent to 600,000 H100 GPUs in total.
Nvidia's H100 is the newer version of the A100 graphics card, which OpenAI used to train ChatGPT. Our best available knowledge, based on unverified leaks, suggests OpenAI used around 25,000 Nvidia A100 GPUs for the chatbot's training — although other estimates suggest this number is lower.
Zuckerberg said this "absolutely massive amount of infrastructure" will be in place by the end of the year. His company is currently training Meta's answer to ChatGPT and Google's Gemini, dubbed "Llama 3" — and he teased a roadmap that includes a future AGI system.