Quantum computers may process AI datasets with 60 logical qubits

Researchers propose feeding large AI datasets into quantum machines in small batches and estimate about 60 logical qubits could give advantages on some data-processing tasks.

A team of researchers at Caltech, Google Quantum AI, Oratomic and MIT proposes a method to process large AI datasets on quantum computers by feeding data in small batches rather than loading complete datasets into quantum memory.

The study identifies converting large datasets (often measured in terabytes or petabytes) into quantum states as a key bottleneck. Preparing a quantum state for a full dataset conventionally requires large amounts of quantum memory and time. The proposed method instead assembles the required quantum states during processing, reducing the need for upfront quantum storage.
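The core idea, stripped of quantum hardware details, is familiar from classical machine learning: stream data in small batches and prepare each batch's representation on demand rather than materializing the whole dataset up front. The sketch below is a purely classical, illustrative analogy (it is not the authors' algorithm): `amplitude_encode` stands in for mapping a batch to a normalized quantum state, and `stream_batches` stands in for feeding data in pieces. All function names here are hypothetical.

```python
import numpy as np

def amplitude_encode(batch):
    """Map a real-valued batch to a unit-norm vector, the classical
    analogue of amplitude-encoding data into a quantum state.
    (Illustrative only; the paper's actual encoding is not given here.)"""
    v = np.asarray(batch, dtype=float).ravel()
    norm = np.linalg.norm(v)
    if norm == 0:
        raise ValueError("cannot encode an all-zero batch")
    return v / norm

def stream_batches(dataset, batch_size):
    """Yield the dataset in small batches instead of building one
    representation of the entire dataset up front."""
    for i in range(0, len(dataset), batch_size):
        yield dataset[i:i + batch_size]

# Toy usage: encode a 16-sample dataset 4 samples at a time.
data = np.arange(1, 17, dtype=float)
states = [amplitude_encode(b) for b in stream_batches(data, 4)]
# Each per-batch "state" is prepared only when needed, so peak
# memory scales with the batch size, not the dataset size.
```

The design point the sketch captures is the memory trade-off: the cost of holding data at any one time is set by the batch, not by the terabyte-scale dataset, which mirrors the paper's motivation for avoiding full-dataset quantum state preparation.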

The researchers estimate that machines with roughly 300 logical qubits could outperform classical systems on some data-processing problems and that preliminary advantages might appear with about 60 logical qubits. Logical qubits are error-corrected units intended to perform reliable quantum operations while masking errors in the underlying physical qubits.

The team notes that the threshold figures refer to future, error-corrected machines, not current hardware. Reaching tens or hundreds of logical qubits will require improvements in error correction and the quality of physical qubits. Dolev Bluvstein, Oratomic co-founder and CEO, pointed to faster progress in hardware and methods over the past decade as a factor shaping expectations.

The researchers also describe a two-way relationship between AI and quantum research. AI tools are being used to model and analyze complex quantum systems, which helps engineers design and test quantum hardware and algorithms more quickly. Adrián Pérez-Salinas, a computational physics professor at ETH Zurich, noted that the study focuses on feeding data to quantum processors step by step rather than loading entire datasets at once.

Beyond the data-processing estimates, the authors say the work centers on algorithmic and architectural changes rather than a single hardware design. They report that further research is needed to map specific AI workloads to the proposed schemes, measure real-world speedups, and identify the hardware improvements required to reach the logical qubit counts cited. The paper also notes that making some quantum computations more efficient could have implications for fields such as cryptography and blockchain.

