Meta founder Mark Zuckerberg’s dream metaverse project will need a 1,000-fold increase in computational efficiency to be successful, chipmaker Intel said.
Intel, in its first public statement on the metaverse, said it remained a believer in what some think could be the future of how the physical and digital worlds coexist, but warned that there were major roadblocks ahead.
Raja Koduri, Senior Vice President and General Manager of Intel’s Accelerated Computing Systems and Graphics Group, said in a blog post: “Indeed, the metaverse may be the next major platform in computing after the world wide web and mobile.”
“Consider what is required to put two individuals in a social setting in an entirely virtual environment… Now, imagine solving this problem at scale – for hundreds of millions of users simultaneously – and you will quickly realise that our computing, storage and networking infrastructure today is simply not enough to enable this vision.”
For the uninitiated, the metaverse moves the digital world beyond a fixed device, creating virtual spaces enabled by augmented reality, virtual reality, and mixed reality. It blends the physical and the digital, so that digital objects appear in the real world and the digital world mimics the physical one.
However, according to Koduri: “We need several orders of magnitude more powerful computing capability, accessible at much lower latencies across a multitude of device form factors. To enable these capabilities at scale, the entire plumbing of the internet will need major upgrades.”
Vested interests also seemingly drive Intel’s statement. Intel makes CPUs for consumer devices and data centres, and if the metaverse requires a 1,000-fold increase in computing capacity, that would be good for business. For its part, Intel is already planning for the metaverse and claims some of its products scheduled for release in 2022 would bring it a step closer.
“Beyond these 2022 products, we have a multigenerational roadmap of high-performance XPUs from client through edge to cloud that move us toward zettascale computing in the next five years,” Koduri said.