NVIDIA will combine the power of classical and quantum computers with the cuQuantum toolkit

In 2023, IBM intends to release the first prototype of a system that combines a quantum-centric supercomputer with CPUs and GPUs in a unified computing architecture.

Now NVIDIA has announced that it wants to bridge GPUs and quantum computing with cuQuantum, a quantum-simulation toolkit that runs quantum-circuit simulations on GPU tensor cores. To do this, a low-latency interface will be created to link computing accelerators with quantum processors (QPUs). This will let quantum computers draw on the massive parallel-computing potential of GPUs for the classical parts of the workload, in particular circuit optimization, calibration, and error correction. GPUs can shorten the execution time of these tasks and reduce the latency between the classical and quantum machines, which is the main bottleneck in today's hybrid quantum systems, Tom’s Hardware reports.
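The kind of workload cuQuantum accelerates can be illustrated with a plain NumPy sketch of state-vector simulation: applying a gate to an n-qubit register is a tensor contraction over a vector of 2^n amplitudes, exactly the dense linear algebra that maps well onto GPU tensor cores. This is an illustrative sketch, not cuQuantum's actual API.

```python
import numpy as np

def apply_single_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 gate to the `target` qubit of an n-qubit state vector."""
    # Reshape the 2^n amplitude vector into a rank-n tensor and contract
    # the gate against the target axis -- the tensor arithmetic that a
    # GPU-accelerated simulator offloads to tensor cores.
    psi = state.reshape([2] * n_qubits)
    psi = np.tensordot(gate, psi, axes=([1], [target]))
    psi = np.moveaxis(psi, 0, target)  # restore the original axis order
    return psi.reshape(-1)

n = 3
state = np.zeros(2**n, dtype=complex)
state[0] = 1.0                                   # start in |000>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)     # Hadamard gate
state = apply_single_qubit_gate(state, H, 0, n)  # -> (|000> + |100>)/sqrt(2)
```

The cost of each contraction doubles with every added qubit, which is why simulating even medium-sized circuits quickly becomes a job for massively parallel hardware.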

NVIDIA is also offering a generic software layer, not unlike CUDA. The new programming model should greatly simplify code-level interaction with QPUs and quantum simulations, which today are still programmed in low-level assembly. To that end, NVIDIA plans to build out a unified quantum programming model and compiler toolchain that steer the various QPUs toward more effective use of their quantum capabilities.
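To make the idea of such a programming model concrete, here is a toy sketch of a "quantum kernel" abstraction: a decorator traces gate calls into a small intermediate representation that a compiler back end could then lower either to a simulator or to real hardware. Every name here is hypothetical; this is not NVIDIA's actual API.

```python
class Circuit:
    """A minimal gate-level intermediate representation (IR)."""
    def __init__(self):
        self.ops = []

    def h(self, q):
        self.ops.append(("H", q))          # Hadamard on qubit q

    def cx(self, ctrl, tgt):
        self.ops.append(("CX", ctrl, tgt))  # controlled-NOT

def quantum_kernel(fn):
    """Decorator: run `fn` once to trace its gates into a Circuit IR,
    which a compiler toolchain could target for a simulator or a QPU."""
    def build(*args):
        circ = Circuit()
        fn(circ, *args)
        return circ
    return build

@quantum_kernel
def bell_pair(circ):
    circ.h(0)
    circ.cx(0, 1)

ir = bell_pair()   # ir.ops now holds [("H", 0), ("CX", 0, 1)]
```

The point of such a layer is that the same high-level kernel, once captured as IR, can be compiled for whichever back end is available, instead of being hand-written in device-specific assembly.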

NVIDIA expects to ease the transition from classical to hybrid quantum-classical workloads by letting users port parts of their HPC (high-performance computing) applications first to a simulated QPU and then to the real processor.
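That migration path can be sketched as a hybrid variational loop written against a pluggable back end: a classical optimizer drives a simulated QPU first, and the hardware device is swapped in later without touching the loop. `SimulatedQPU` and its single-parameter ansatz are illustrative stand-ins, not part of any real SDK.

```python
import numpy as np

class SimulatedQPU:
    """Classical stand-in for a QPU: prepares Ry(theta)|0> and reads <Z>."""
    def expectation(self, theta):
        return np.cos(theta)   # <Z> = cos(theta) for this one-qubit ansatz

def minimize_energy(qpu, theta=0.1, lr=0.1, steps=200):
    """Classical gradient descent in the loop with a (simulated) QPU."""
    for _ in range(steps):
        grad = -np.sin(theta)  # analytic derivative of cos(theta)
        theta -= lr * grad     # classical update between device calls
    return theta

qpu = SimulatedQPU()                 # later: swap in a hardware back end
theta_opt = minimize_energy(qpu)
energy = qpu.expectation(theta_opt)  # converges toward the minimum, -1
```

Each optimizer step here requires a round trip to the device, which is why the low-latency classical-quantum link described above matters: on real hardware that round trip, not the arithmetic, dominates the runtime.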

According to the company, dozens of organizations already use the cuQuantum toolkit. Amazon Web Services offers cuQuantum integration through its Braket service and has demonstrated a 900× speedup on a quantum machine-learning workload. cuQuantum is also used by Google’s qsim, IBM’s Qiskit Aer, Xanadu’s PennyLane, and Classiq.
