Researchers at the Niels Bohr Institute have significantly increased how quickly changes in delicate quantum states can be detected inside a qubit. By combining commercially available hardware with recent adaptive measurement techniques, the team can now observe rapid shifts in qubit behavior that were previously impossible to see.
Qubits are the fundamental units of quantum computers, which scientists hope will someday outperform today’s most powerful machines. But qubits are extremely sensitive. The materials used to construct them often contain tiny defects that scientists still don’t fully understand. These microscopic imperfections can shift position hundreds of times per second. As they move, they alter how quickly a qubit loses energy, and with it precious quantum information.
Until recently, standard testing methods took as much as a minute to measure qubit performance. That was far too slow to capture these rapid fluctuations. Instead, researchers could only determine an average energy loss rate, masking the true and often unstable behavior of the qubit.
It’s somewhat like asking a strong workhorse to pull a plow while obstacles continually appear in its path faster than anyone can react. The animal may be capable, but unpredictable disruptions make the job much harder.
FPGA-Powered Real-Time Qubit Control
A research team from the Niels Bohr Institute’s Center for Quantum Devices and the Novo Nordisk Foundation Quantum Computing Programme, led by postdoctoral researcher Dr. Fabrizio Berritta, developed a real-time adaptive measurement system that tracks changes in the qubit’s energy loss (relaxation) rate as they occur. The project involved collaboration with scientists from the Norwegian University of Science and Technology, Leiden University, and Chalmers University.
The new approach relies on a fast classical controller that updates its estimate of a qubit’s relaxation rate within milliseconds. This matches the natural speed of the fluctuations themselves, rather than lagging seconds or minutes behind as older methods did.
To achieve this, the team used a Field Programmable Gate Array (FPGA), a kind of classical processor designed for extremely rapid operations. By running the experiment directly on the FPGA, they could quickly generate a “best guess” of how fast the qubit was losing energy using only a few measurements. This eliminated the need for slower data transfers to a standard computer.
Programming FPGAs for such specialized tasks can be difficult. Even so, the researchers succeeded in updating the controller’s internal Bayesian model after each qubit measurement. That allowed the system to continuously refine its understanding of the qubit’s condition in real time.
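The idea of refining a relaxation-rate estimate shot by shot can be illustrated in software. The minimal sketch below is not the authors’ FPGA implementation; it assumes a standard decay model in which a qubit prepared in the excited state still reads “1” after a delay t with probability exp(−Γt), and it updates a simple grid posterior over the rate Γ after each single-shot outcome:

```python
import numpy as np

def estimate_gamma(outcomes, delays, gamma_grid):
    """Grid-based Bayesian estimate of a relaxation rate Gamma.

    Model assumption: qubit prepared in |1>, measured after delay t;
    the probability of still reading |1> is exp(-Gamma * t).
    `outcomes` holds single-shot results (1 = excited, 0 = decayed).
    """
    log_post = np.zeros_like(gamma_grid)  # flat prior, in log space
    for outcome, t in zip(outcomes, delays):
        p1 = np.exp(-gamma_grid * t)      # survival probability per grid point
        likelihood = p1 if outcome == 1 else 1.0 - p1
        # One Bayesian update per measurement, as in the adaptive scheme
        log_post += np.log(np.clip(likelihood, 1e-12, None))
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    return float(np.sum(gamma_grid * post))  # posterior mean of Gamma

# Simulated example: true rate 0.05 per microsecond, 10 us delay per shot
rng = np.random.default_rng(0)
true_gamma, delay, n_shots = 0.05, 10.0, 500
shots = (rng.random(n_shots) < np.exp(-true_gamma * delay)).astype(int)
grid = np.linspace(0.001, 0.2, 400)
est = estimate_gamma(shots, [delay] * n_shots, grid)
```

Because each shot contributes a single multiplicative likelihood term, the same update can run incrementally after every measurement, which is what makes the approach suitable for tight real-time loops on fast classical hardware.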
As a result, the controller now keeps pace with the qubit’s changing environment. Measurements and adjustments occur on nearly the same timescale as the fluctuations themselves, making the system roughly 100 times faster than previously demonstrated.
The work also revealed something new. Scientists didn’t previously know just how quickly fluctuations occur in superconducting qubits. These experiments have now provided that insight.
Commercial Quantum Hardware Meets Advanced Control
FPGAs have long been used in other scientific and engineering fields. In this case, the researchers used a commercially available FPGA-based controller from Quantum Machines called the OPX1000. The system can be programmed in a language similar to Python, which many physicists already use, making it more accessible to research groups worldwide.
The integration of this controller with advanced quantum hardware was made possible through close collaboration between the Niels Bohr Institute research group led by Associate Professor Morten Kjærgaard and Chalmers University, where the quantum processing unit was designed and fabricated. “The controller enables very tight integration between logic, measurements and feedforward: these components made our experiment possible,” says Morten Kjærgaard.
Why Real-Time Calibration Matters for Quantum Computers
Quantum technologies promise powerful new capabilities, though practical large-scale quantum computers are still under development. Progress often comes incrementally, but occasionally major steps forward occur.
By uncovering these previously hidden dynamics, the findings reshape how scientists think about testing and calibrating superconducting quantum processors. With current materials and manufacturing methods, moving toward real-time monitoring and adjustment appears essential for improving reliability. The results also highlight the importance of partnerships between academic research and industry, along with creative uses of available technology.
“Nowadays, in quantum processing units in general, the overall performance is not determined by the best qubits, but by the worst ones: those are the ones we need to focus on. The surprise from our work is that a ‘good’ qubit can turn into a ‘bad’ one in fractions of a second, rather than minutes or hours.
“With our algorithm, the fast control hardware can pinpoint which qubit is ‘good’ or ‘bad’ basically in real time. We can also gather useful statistics on the ‘bad’ qubits in seconds instead of hours or days.
“We still cannot explain a large fraction of the fluctuations we observe. Understanding and controlling the physics behind such fluctuations in qubit properties will be crucial for scaling quantum processors to a useful size,” says Fabrizio Berritta.

