Explainer: Why Willow’s low-error pitch is remarkable

(Image: Google Quantum AI)

With 105 qubits and advanced error correction, Google’s new quantum computer chip Willow promises to change the world of quantum computing. Banasree Purkayastha looks at what makes Willow so fascinating and scary at the same time

What makes Willow unique?

Willow is Google’s new quantum computer chip that has demonstrated how to exponentially reduce errors in quantum computers while using more qubits to scale up the technology — something that researchers have been trying to solve for the last 30 years. It was also able to perform a standard benchmark computation in under five minutes that would take Frontier, today’s fastest supercomputer, 10 septillion (that is, 10²⁵) years — a number that exceeds the age of the universe. Willow, which has 105 qubits, “now has best-in-class performance,” according to Hartmut Neven, founder and lead of Google Quantum AI.

In 2019, Google announced that its Sycamore quantum processor had completed a computation in about three minutes that would have taken a supercomputer 10,000 years. So Google has certainly come a long way since then.

Google fabricated its previous chips in a shared facility at the University of California, Santa Barbara, but built its own dedicated fabrication facility to produce its Willow chips.

Qubits and quantum computing

Quantum computing relies on the laws of quantum mechanics but handles data in a different way. Instead of using bits, as classical computers do, it uses quantum bits, or qubits. A qubit can exist in multiple states at the same time: 1, 0, and anything in between. Physically, qubits are not switches or transistors but elements that exhibit quantum mechanical behaviour. The problem with qubits is that they tend to rapidly exchange information with their environment, making it difficult to protect the information needed to complete a computation.
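
To make that idea concrete, here is a minimal sketch in Python with NumPy (an illustration written for this explainer, not code from Google) of a single qubit as a two-number state vector, where the squared sizes of the two entries give the probabilities of reading out 0 or 1:

```python
import numpy as np

# A qubit's state is a length-2 complex vector a|0> + b|1>, with |a|^2 + |b|^2 = 1.
zero = np.array([1, 0], dtype=complex)  # definitely 0
one = np.array([0, 1], dtype=complex)   # definitely 1

# An equal superposition: "1, 0, and anything in between" until it is measured.
plus = (zero + one) / np.sqrt(2)

probs = np.abs(plus) ** 2               # probabilities of measuring 0 or 1
print(probs)                            # [0.5 0.5]

# Measurement collapses the superposition to a single classical outcome.
print("measured:", np.random.choice([0, 1], p=probs))
```

Real hardware also suffers the decoherence described above: stray interactions with the environment scramble those amplitudes, which is why error correction matters so much.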

So, the more qubits you use, the more errors occur. As Neven explained in his blog post, errors are one of the greatest challenges in quantum computing. “We tested ever-larger arrays of physical qubits, scaling up from a grid of 3×3 encoded qubits, to a grid of 5×5, to a grid of 7×7 — and each time, using our latest advances in quantum error correction, we were able to cut the error rate in half. In other words, we achieved an exponential reduction in the error rate,” he said.
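
As a rough arithmetic sketch of what “exponential reduction” means here (the starting error rate below is made up for illustration, not a measured Willow figure): if each step up in grid size halves the logical error rate, the error shrinks geometrically as the code scales.

```python
# Illustrative numbers only: assume the logical error rate halves each time the
# encoded grid grows, as Neven describes for 3x3 -> 5x5 -> 7x7.
error_rate = 3e-3  # hypothetical starting error rate for the 3x3 grid
for grid in ("3x3", "5x5", "7x7", "9x9"):
    print(f"{grid} grid: logical error rate ~ {error_rate:.1e}")
    error_rate /= 2  # one halving per scaling step = exponential suppression
```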

What this means for AI & encryption

Quantum technology has the potential to exponentially increase computational power, enabling more accurate predictions and insights that could transform communication networks and optimise the flow of goods, resources and money. Industries as diverse as telecommunications, pharmaceuticals, banking and mining could all be transformed by quantum computing, and developers could train machine learning and AI models with fewer data points. Neven says the technology will be indispensable for collecting AI training data, eventually helping in “discovering new medicines, designing more efficient batteries for electric cars, and accelerating progress in fusion and new energy alternatives.”

But quantum computers could also solve the data “puzzles” that are at the heart of encryption protection, leaving all systems and data immediately vulnerable, points out The Guardian. That could deal a body blow to cryptocurrency and even cybersecurity.

Others in the quantum computing race

IBM, Microsoft and Amazon are all working on quantum computing systems of their own. IBM first made quantum computers available on the cloud in 2016; IBM Quantum Heron, its fastest-performing quantum processor, can now leverage Qiskit, which IBM describes as the world’s most performant quantum software. Recently, Microsoft achieved a new quantum computing milestone in partnership with California-based Atom Computing: the duo entangled the largest number of logical qubits on record, a step towards creating scalable, fault-tolerant quantum systems. They intend to start shipping these machines next year.

Researchers from Amazon’s AWS Center for Quantum Networking have been working with Harvard University on creating a “quantum network” that can transmit entangled photons from one quantum computer to another over fibre-optic cables.

Can we expect real-world application anytime soon?

That is a long way off. Though Google says the chip represents a major breakthrough and could soon pave the way to “a useful, large-scale quantum computer”, Willow remains a largely experimental device. Quantum computing itself is still an experimental field. Willow is thus the latest development in quantum computing, a field that is “attempting to use the principles of particle physics to create a new type of mind-bogglingly powerful computer”, as the BBC says.

The more impressive breakthrough is that Willow is “below the threshold”: it can drive errors down while scaling up the number of qubits. As Engadget says, that could show the way to a future where quantum computers solve problems that have tangible effects on people’s lives. For now, we are perhaps one step closer to running commercially relevant algorithms that can’t be replicated on conventional computers.

This article was first uploaded on December 12, 2024, at 2:30 am.