The race to scale quantum computing

New Capabilities
By Newzino Staff

From tabletop modulators to microchips: solving the control problem that stands between 100 qubits and a million

December 26th, 2025: University of Colorado Unveils Microchip-Scale Quantum Computing Control Device

Overview

Quantum computers can already outperform classical supercomputers on specific tasks. Google's Willow chip solved a problem in five minutes that would take today's fastest machines 10 septillion years, and in October 2025 it demonstrated the first verifiable quantum advantage with its Quantum Echoes algorithm, which ran 13,000 times faster than supercomputers. But scaling from today's roughly 100-qubit systems to the million-qubit machines needed for real-world applications requires control hardware that doesn't exist yet. Current laser control systems are tabletop-sized, power-hungry, and impossible to replicate thousands of times over.

That bottleneck is now being attacked through multiple parallel advances. In December 2025, University of Colorado Boulder researchers published a breakthrough optical phase modulator, nearly 100 times smaller than a human hair, that uses 80 times less power than commercial alternatives. The same month, the team behind China's Zuchongzhi 3.2 became the second in the world to achieve below-threshold quantum error correction, using an innovative microwave-based approach that may offer a simpler path to scaling. Meanwhile, Stanford researchers demonstrated room-temperature quantum communication devices, eliminating the need for expensive cooling systems. These hardware advances are converging just as IBM prepares to deploy its 120-qubit Nighthawk processor and Google proves quantum algorithms can deliver verifiable advantages over classical computing.
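Both the Willow and Zuchongzhi 3.2 results hinge on "below-threshold" error correction. As a rough guide, and assuming the standard surface-code scaling model (the article itself gives no formula), operating below threshold means the logical error rate shrinks exponentially as the code distance d grows:

% Minimal sketch of surface-code error suppression; an assumed model, not taken from this article.
% p    : physical error rate per operation
% p_th : threshold error rate (around 1% for the surface code)
% d    : code distance; a distance-d patch uses roughly 2d^2 - 1 physical qubits
\[
  \epsilon_L \;\approx\; A \left(\frac{p}{p_{\mathrm{th}}}\right)^{(d+1)/2}
\]
% Each increase of d by 2 divides the logical error rate by a suppression factor
% Lambda ~ p_th / p (Google reported Lambda of roughly 2.14 for Willow). Pushing d,
% and therefore physical qubit counts, upward is what demands the compact control
% hardware this article describes.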

Key Indicators

100x: Size reduction vs. a human hair's diameter. CU Boulder's CMOS-fabricated optical modulator enables compact qubit control.
80x: Power reduction vs. commercial modulators. Lower power means less heat, enabling denser integration of control channels.
$2B: Global quantum investment in 2024. Investment grew 50% year-over-year from $1.3B in 2023 (see the quick check after this list).
1,386: IBM Kookaburra qubits (2025 roadmap). The multi-chip processor demonstrates scalable architecture approaches.
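A quick check of the investment growth figure, using only the $1.3B (2023) and $2B (2024) values quoted above:

% Year-over-year growth implied by the two figures; the indicator rounds it to 50%.
\[
  \frac{2.0 - 1.3}{1.3} \;=\; \frac{0.7}{1.3} \;\approx\; 0.54 \quad \text{(about 54\%)}
\]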

People Involved

Matt Eichenfield
Senior Author, University of Colorado Boulder (Leading acousto-optic quantum control research)
Jacob M. Freedman
Lead Author, University of Colorado Boulder (Quantum photonics researcher)
Sundar Pichai
CEO, Google and Alphabet (Overseeing Google Quantum AI initiatives)
Jennifer Dionne
Professor of Materials Science and Engineering, Stanford University (Leading room-temperature quantum device research)
Pan Jianwei
Lead Researcher, University of Science and Technology of China (Leading China's quantum computing efforts)

Organizations Involved

Google Quantum AI
Corporate Research Lab
Status: Leading quantum error correction development

Google's quantum computing research division, operating from Santa Barbara, California.

IBM Quantum
Corporate Research Division
Status: Scaling superconducting quantum systems

IBM's quantum computing initiative focusing on superconducting qubits and quantum-centric supercomputers.

Quantum Systems Accelerator
Federal Research Center
Status: Coordinating national quantum hardware development

Department of Energy center advancing quantum computing for scientific applications.

Northwestern University
Research University
Status: Pioneering commercial CMOS quantum photonics

Leading institution in electronic-photonic quantum chip integration.

IonQ
Public Company (NYSE: IONQ)
Status: Commercializing trapped-ion quantum systems

Publicly traded quantum computing company using trapped ytterbium and barium ion qubits.

Timeline

  1. University of Colorado Unveils Microchip-Scale Quantum Computing Control Device

    Control Systems

    Researchers at the University of Colorado Boulder published a breakthrough optical phase modulator that enables efficient laser control for thousands or even millions of qubits using standard microchip manufacturing.

  2. China Achieves Below-Threshold Error Correction with Zuchongzhi 3.2

    Hardware Milestone

    China's University of Science and Technology team demonstrated fault-tolerant quantum error correction below threshold on the 107-qubit Zuchongzhi 3.2 using microwave-based control, becoming the second team globally, after Google, to achieve this milestone. The result was published as a Physical Review Letters cover paper.

  3. CU Boulder Publishes CMOS Optical Modulator Breakthrough

    Control Systems

    University of Colorado researchers published in Nature Communications a CMOS-fabricated acousto-optic phase modulator 100x smaller than a human hair's diameter that uses 80x less power than commercial systems, enabling scalable laser control for millions of qubits.

  4. Stanford Develops Room-Temperature Quantum Communication Device

    Control Systems

    Stanford researchers created a nanoscale device using molybdenum diselenide on silicon nanostructures that entangles photons and electrons at room temperature, eliminating the need for cooling to near absolute zero temperatures.

  5. IBM Unveils Nighthawk and Loon Quantum Processors

    Hardware Milestone

    IBM announced Nighthawk, a 120-qubit processor with 218 next-generation tunable couplers that enables circuits 30% more complex than Heron's, and Loon, which demonstrates key components for fault-tolerant quantum computing. Nighthawk targets quantum advantage by 2026.

  6. DOE Renews Quantum Systems Accelerator Funding

    Funding

    The Department of Energy awarded $125 million over five years to the QSA for continued development of neutral atom, trapped ion, and superconducting quantum platforms.

  7. Google Demonstrates First Verifiable Quantum Advantage

    Algorithm Breakthrough

    Google's Quantum Echoes algorithm ran 13,000 times faster on Willow than on the world's fastest supercomputers, making it the first quantum algorithm to show verifiable quantum advantage, in which two quantum processors can independently confirm each other's results.

  8. First Commercial-Foundry Quantum Photonic Chip Demonstrated

    Manufacturing Breakthrough

    Northwestern, Boston University, and UC Berkeley researchers fabricated the first electronic-photonic quantum system-on-chip in a commercial 45nm CMOS foundry, proving quantum components can use existing semiconductor infrastructure.

  9. China Unveils Zuchongzhi 3.0 Superconducting Processor

    Hardware Milestone

    Chinese scientists introduced the 105-qubit Zuchongzhi 3.0, claiming quantum random circuit sampling speeds a quadrillion times faster than leading supercomputers and one million times faster than Google's published results.

  10. Google Achieves Below-Threshold Error Correction with Willow

    Hardware Milestone

    Google's 105-qubit Willow processor demonstrated exponential error reduction as qubit arrays scaled, achieving the 30-year goal of below-threshold quantum error correction. Performance on random circuit sampling: 5 minutes versus 10 septillion years for classical supercomputers.

  11. China Introduces 504-Qubit Tianyan Chip

    Hardware Milestone

    China unveiled Xiaohong, a 504-qubit superconducting chip, setting domestic records and rivaling IBM in key performance metrics through the Tianyan commercial platform.

  12. IBM Crosses 1,000-Qubit Threshold with Condor

    Hardware Milestone

    IBM introduced the 1,121-qubit Condor processor, the first superconducting system to surpass 1,000 qubits, alongside the higher-performance 133-qubit Heron architecture.

  13. China's Jiuzhang 3.0 Reaches 255-Photon Detection

    Hardware Milestone

    China scaled photonic quantum computing to 255 detected photons, solving sampling problems in microseconds that would take supercomputers over 20 billion years.

  14. IBM Unveils 127-Qubit Eagle Processor

    Hardware Milestone

    IBM announced Eagle, the first quantum processor to exceed 100 qubits, setting new performance benchmarks for superconducting quantum systems.

  15. China Claims Quantum Advantage with Jiuzhang Photonic System

    Hardware Milestone

    Chinese researchers demonstrated quantum computational advantage using 76-photon Gaussian boson sampling, completing calculations in minutes that would take classical supercomputers billions of years.

Scenarios

1

CMOS Quantum Components Enable Million-Qubit Systems by 2030

Discussed by: McKinsey Technology Monitor, Department of Energy researchers, industry roadmaps from IBM and Google

CMOS-compatible quantum control systems trigger rapid scaling. The CU Boulder modulator and Northwestern's commercial-foundry photonics prove that existing semiconductor fabs can mass-produce quantum components. IBM's Kookaburra multi-chip architecture and Google's error-corrected logical qubits combine with these compact control systems to reach 100,000+ qubit systems by 2028-2030. Applications in drug discovery, materials science, and optimization begin delivering commercial value. McKinsey's $72 billion quantum computing market projection by 2035 proves conservative as pharmaceutical and chemical companies adopt quantum simulation. The breakthrough mirrors how silicon photonics enabled modern datacenters: unglamorous infrastructure enabling exponential capability growth.

2

Control Systems Scale, But Error Correction Remains Intractable

Discussed by: Gartner analysis, quantum computing skeptics, classical computing researchers

Compact control hardware solves one bottleneck while error correction remains stubbornly difficult. Google's Willow breakthrough proves fragile: different qubit architectures fail to replicate exponential error suppression. Physical error rates plateau above thresholds needed for useful logical qubits. The field scales to thousands of physical qubits by 2028, but applications remain limited to specialized sampling problems and quantum simulation tasks that tolerate high error rates. Investment continues but commercial returns disappoint. The technology becomes a valuable tool for specific scientific applications rather than the general-purpose revolution promised. Gartner's cautious stance (quantum computing "not expected to be very actionable in the immediate years after 2024") looks prescient.

3

China Achieves Quantum Dominance Through Photonic Architecture

Discussed by: Chinese Academy of Sciences announcements, geopolitical technology analysts, defense policy researchers

China's photonic quantum computing approach, demonstrated through Jiuzhang's progression from 76 to 255 photons, proves superior for specific applications. While U.S. and European efforts focus on superconducting and trapped-ion systems requiring complex control infrastructure, Chinese teams scale photonic systems that need less elaborate cooling and control. Jiuzhang 4.0 integrates 2,000+ photons by 2027. Combined with China's $7.4 billion government quantum commitment and the 37 million visits to the Tianyan commercial platform, Chinese quantum systems achieve practical advantages in optimization and machine learning before Western trapped-ion systems scale. U.S. quantum computing efforts, despite hardware breakthroughs like the CU Boulder modulator, fall behind due to fragmented commercialization and coordination challenges between national labs, universities, and startups.

4

Hybrid Classical-Quantum Systems Dominate Through 2030s

Discussed by: IBM Quantum, McKinsey analysis, enterprise quantum computing users

Pure quantum advantage proves elusive for most applications. Instead, compact control systems enable proliferation of 1,000-10,000 qubit machines that excel as specialized coprocessors for classical systems. Drug discovery, financial optimization, and materials science use quantum systems for specific bottleneck calculations while classical computers handle everything else. IonQ's datacenter-mounted systems and IBM's quantum-centric supercomputer vision become reality, but quantum remains a powerful accelerator, not a replacement. The $1.25 billion in Q1 2025 investment flows to companies building practical hybrid tools rather than pursuing million-qubit systems. This mirrors GPU adoption: transformative impact through specialized acceleration rather than general-purpose replacement of existing computing paradigms.

Historical Context

The Transistor Invention and Semiconductor Scaling (1947-1970s)

1947-1975

What Happened

Bell Labs invented the transistor in 1947, but practical applications took decades to develop. Early transistors were expensive, unreliable, and limited in capability. The semiconductor industry invested billions in manufacturing infrastructure before integrated circuits enabled the computer revolution. The industry's challenge was scaling from individual components to integrated systems with thousands, then millions, of transistors on a single chip.

Outcome

Short Term

Initial applications in hearing aids and radios proved the concept, but computers continued using vacuum tubes through the 1950s due to reliability and cost concerns.

Long Term

Decades of incremental manufacturing improvements and materials science advances enabled Moore's Law scaling, transforming society through personal computers, smartphones, and the internet. The trillions of dollars invested in semiconductor infrastructure over more than 75 years created the foundation for modern computing.

Why It's Relevant Today

Quantum computing faces identical scaling challenges. Like early transistors, today's qubits work but can't be manufactured at scale with acceptable cost and reliability. The CU Boulder and Northwestern breakthroughs prove CMOS manufacturing can produce quantum components, potentially leveraging existing semiconductor infrastructure rather than building entirely new fabs. The question is whether quantum computing's "transistor moment" leads to similar exponential scaling or plateaus due to fundamental physical limits.

Photonics Integration in Telecommunications (1970s-2000s)

1970-2010

What Happened

Fiber optic communication required precisely controlled laser systems and photonic components. Early systems used bulky, expensive equipment that consumed substantial power. The industry's breakthrough came when CMOS-compatible silicon photonics enabled integration of optical components with electronic control circuits on single chips. This transformation occurred gradually through the 2000s as manufacturing techniques matured.

Outcome

Short Term

Initial deployments in telecommunications networks were expensive and limited to high-value applications like long-distance trunk lines.

Long Term

Silicon photonics enabled modern datacenter interconnects, making cloud computing possible at scale. Companies like Intel and Cisco now manufacture photonic components using modified semiconductor fabs, producing millions of optical transceivers annually at commodity prices.

Why It's Relevant Today

The acousto-optic modulator breakthrough follows the silicon photonics playbook: take components previously requiring specialized manufacturing and redesign them for CMOS fabrication. If quantum photonics follows telecommunications' trajectory, today's tabletop laser control systems could become mass-produced chips within a decade, removing the scaling bottleneck just as silicon photonics enabled hyperscale datacenters.

GPU Emergence as Specialized Accelerators (1999-2015)

1999-2015

What Happened

Graphics processing units evolved from specialized gaming hardware to general-purpose accelerators for parallel computation. NVIDIA's CUDA platform (2006) enabled programmers to use GPUs for scientific computing, machine learning, and data analytics. Initially dismissed as niche technology, GPUs became essential infrastructure for AI training and high-performance computing without replacing general-purpose CPUs.

Outcome

Short Term

Early adoption concentrated in gaming and scientific visualization markets. Programmers needed to learn new parallel programming paradigms, limiting mainstream adoption through the mid-2000s.

Long Term

GPUs became essential to the AI revolution starting around 2012 with deep learning breakthroughs. NVIDIA's market cap grew from $10 billion (2012) to over $1 trillion (2023) as GPUs proved indispensable for training large language models and other AI systems. The hybrid CPU-GPU architecture now dominates high-performance computing.

Why It's Relevant Today

Quantum computers may follow the GPU model rather than replacing classical computers entirely. Compact control systems enable quantum coprocessors integrated into datacenters for specialized tasks: optimization, simulation, cryptography. IonQ's rack-mounted systems and IBM's quantum-centric supercomputers mirror the hybrid CPU-GPU architecture that proved more practical than pure GPU computing. This scenario, quantum as a powerful accelerator rather than a general-purpose replacement, aligns with realistic near-term applications while avoiding overhyped predictions.
