One of the biggest challenges for quantum computers is the incredibly short time that qubits can retain information. But a new qubit from Princeton University lasts 15 times longer than industry-standard versions, a significant step towards large-scale, fault-tolerant quantum systems.
A significant bottleneck for quantum computing is decoherence, the speed at which qubits lose stored quantum information to their environment. The faster this happens, the less time the computer has to perform operations and the more errors creep into its calculations.
While companies and researchers are developing error-correction schemes to mitigate this problem, more stable qubits could be a more robust solution. Trapped-ion and neutral-atom qubits can have coherence times on the order of seconds, but the superconducting qubits used by companies like Google and IBM remain below the 100-microsecond threshold.
These so-called “transmon” qubits have other benefits, such as faster operation speeds, but their short shelf life remains a significant drawback. Now a team from Princeton has designed novel transmon qubits with coherence times of up to 1.6 milliseconds, 15 times longer than those used in industry and three times longer than the best previous lab result.
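To put those figures in perspective, here is a minimal back-of-the-envelope sketch (not from the paper) of how a longer coherence window translates into a larger budget of sequential gate operations. The 25-nanosecond gate time is an assumed, illustrative figure for a typical transmon gate, not a number reported by the Princeton team.

```python
# Rough illustration: how many gates fit inside a qubit's coherence window.
# The gate duration below is an assumed, typical transmon value for illustration only.
GATE_TIME_NS = 25  # hypothetical single-qubit gate time, in nanoseconds

def gate_budget(coherence_us: float) -> int:
    """Approximate number of sequential gates that fit in the coherence time."""
    return int(coherence_us * 1_000 / GATE_TIME_NS)

industry_transmon_us = 100     # ~100 microseconds, typical industry transmon
princeton_transmon_us = 1_600  # 1.6 milliseconds reported for the new qubit

print(gate_budget(industry_transmon_us))   # ~4,000 gates
print(gate_budget(princeton_transmon_us))  # ~64,000 gates, a roughly 16x larger budget
```

Under these assumptions, the longer-lived qubit leaves room for tens of thousands of operations before decoherence sets in, rather than a few thousand.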
“This advance brings quantum computing out of the realm of merely possible and into the realm of practical,” Princeton’s Andrew Houck, who co-led the research, said in a press release. “Now we can begin to make progress much more quickly.”
The team’s new approach, detailed in a paper in Nature, tackles a long-standing problem in the design of transmon qubits. Tiny surface defects in the metal used to make them, typically aluminium, can absorb energy as it travels through the circuit, leading to errors in the underlying computations.
The new qubit instead uses the metal tantalum, which has far fewer of these defects. The researchers had already experimented with this material as far back as 2021, but earlier versions were built on top of a layer of sapphire. The researchers realized the sapphire was also causing significant energy loss, so they replaced it with a layer of silicon, which is commercially available at extremely high purity.
Creating a clean enough interface between the two materials to maintain superconductivity is difficult, but the team solved the problem with a new fabrication process. And since silicon is the computing industry’s material of choice, the new qubits should be easier to mass-produce than earlier versions.
To prove out the new process, the researchers built a fully functioning quantum chip with six of the new qubits. Crucially, the new design is similar enough to the qubits used by companies like Google and IBM that it could easily slot into existing processors to boost performance, the researchers say.
This could chip away at the major barrier stopping existing quantum computers from solving larger problems: the fact that short coherence times mean qubits are overwhelmed by errors before they can do any useful calculations.
The process of getting the design from the lab bench to the chip foundry is likely to be long and complicated though, so it’s unclear whether companies will switch to this new qubit architecture any time soon. Still, the research marks dramatic progress on one of the biggest challenges holding back superconducting quantum computers.

