What are the main challenges in scaling quantum computers?

Quantum computing promises transformative advances in cryptography, materials science, and optimization, but moving from laboratory prototypes to large-scale, fault-tolerant machines involves multiple intertwined challenges. Scholars and practitioners emphasize that scaling is not a single technical problem but a system-level undertaking requiring advances in hardware, control, error correction, supply chains, and workforce development. John Preskill at the California Institute of Technology has framed the current era as one of noisy intermediate-scale quantum (NISQ) devices, highlighting the gulf between today's machines and the fault-tolerant quantum computers needed for many impactful applications.

Physical and engineering obstacles

At the hardware level, maintaining coherent quantum states across many qubits remains the most immediate barrier. David DiVincenzo, then at IBM, articulated foundational physical requirements for viable quantum processors, now known as the DiVincenzo criteria, that still guide engineering choices today. Qubits are fragile and susceptible to decoherence from electromagnetic noise, material defects, and uncontrolled interactions with their environment. Building architectures that suppress these effects while enabling precise control and fast gate operations requires novel materials, ultra-low-temperature cryogenics for many qubit technologies, and high-fidelity microwave and optical control systems. These requirements drive complex fabrication processes and stringent clean-room capabilities, raising costs and limiting manufacturing scalability.
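As a rough illustration of why decoherence limits circuit depth, dephasing-limited fidelity decays approximately as exp(-t/T2). A minimal sketch, where the coherence time and gate duration are illustrative assumptions chosen only for the arithmetic, not measurements of any specific device:

```python
import math

# Back-of-the-envelope coherence budget. Both numbers below are
# illustrative assumptions, not measurements of a real processor.
T2 = 100e-6      # assumed coherence time: 100 microseconds
t_gate = 50e-9   # assumed two-qubit gate duration: 50 nanoseconds

def survival(n_gates: int) -> float:
    """Rough dephasing-limited probability that a qubit stays coherent
    through n_gates sequential gates: exp(-n * t_gate / T2)."""
    return math.exp(-n_gates * t_gate / T2)

# Depth at which decoherence alone degrades the state by a factor 1/e:
print(round(T2 / t_gate))        # prints: 2000
print(round(survival(100), 3))   # prints: 0.951
```

Under these assumed numbers, a few thousand sequential gates exhaust the coherence budget, which is why longer computations require active error correction rather than better qubits alone.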

Error correction and software complexity

Practical quantum error correction requires many physical qubits, often estimated at hundreds to thousands, to encode a single logical qubit, increasing the engineering scale by orders of magnitude. Researchers such as Austin Fowler and teams working on surface-code implementations describe how error-correcting codes demand dense interconnectivity together with continuous measurement and feedback, creating intensive classical control overhead. Software stacks must coordinate massive real-time classical processing with quantum hardware, and compilers must optimize noisy operations across heterogeneous qubit types. The resource overhead of error correction and control places strong demands on both hardware yield and system integration.
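The scale of this overhead can be sketched with commonly cited surface-code heuristics: roughly 2d² − 1 physical qubits per logical qubit at code distance d, and a logical error rate that falls roughly as A(p/p_th)^((d+1)/2). A minimal sketch, where the constants p_th, A, the physical error rate, and the target rate are all assumptions chosen only for illustration:

```python
def physical_qubits(d: int) -> int:
    """Physical qubits per logical qubit at code distance d in a
    surface-code layout: d*d data qubits plus d*d - 1 measure qubits."""
    return 2 * d * d - 1

def logical_error_rate(p: float, d: int, p_th: float = 1e-2, A: float = 0.1) -> float:
    """Heuristic logical error rate per round, A * (p / p_th) ** ((d + 1) // 2).
    p is the physical error rate; p_th and A are illustrative constants."""
    return A * (p / p_th) ** ((d + 1) // 2)

# Smallest odd distance whose logical error rate beats a target,
# assuming an illustrative physical error rate of 2e-3:
p, target = 2e-3, 1e-12
d = 3
while logical_error_rate(p, d) > target:
    d += 2  # surface-code distances are conventionally odd
print(d, physical_qubits(d))  # prints: 31 1921
```

Under these assumed numbers a single logical qubit costs nearly two thousand physical qubits, which is the orders-of-magnitude multiplication described above; the actual constants vary with hardware quality and decoder performance.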

Supply chains, energy, and geopolitical dimensions

Scaling quantum systems has environmental and geopolitical implications. Many leading laboratories are concentrated in the United States, Europe, China, and Australia, creating geopolitical competition for talent, investment, and specialized fabrication facilities. Michelle Simmons at the University of New South Wales leads work on silicon spin qubits that aims to leverage existing semiconductor infrastructure, illustrating one pathway to more distributed manufacturing. Cryogenic systems and continuous refrigeration impose energy demands and materials needs that raise environmental considerations as systems scale from dozens to thousands of qubits.

Human capital, standards, and trust

Beyond physical systems, scaling requires a workforce with interdisciplinary skills spanning condensed matter physics, electrical engineering, computer science, and cryogenics. The National Academies of Sciences, Engineering, and Medicine have highlighted gaps between prototype achievements and the sustained industrial capacity needed for scaling. Standardization of interfaces, benchmarking metrics, and open tools will be essential to build interoperable ecosystems and to ensure trustworthy performance claims.

Consequences of failing to meet these challenges include slow commercialization, concentration of advantages among a few institutions or nations, and misaligned investments. Success will depend on coordinated advances in materials, error correction, control electronics, manufacturing, and policy to manage environmental and societal impacts while broadening participation in the emerging quantum economy.