# Quantum Machine Learning with Circuit Cutting

Quantum Machine Learning (QML) techniques, including variational Quantum Tensor Networks (QTN), pose a major implementation challenge in terms of qubit requirements. One approach to circumventing this issue is circuit cutting, which segments a large quantum circuit into multiple smaller sub-circuits that can be trained more easily on a quantum device [1]. This project lays out the workflow for training a variational QTN circuit that uses circuit cutting, specifically gate cutting, to perform data classification. The workflow is built with the Qiskit SDK and relies on the Circuit Knitting Toolbox [2] for the circuit cutting procedures. In addition, significant modifications are made to algorithms such as SamplerQNN, part of the Qiskit Machine Learning package, to accommodate training multiple sub-circuits with Qiskit Aer's Sampler primitive. The dataset used for classification is the diabetes dataset from the National Institute of Diabetes and Digestive and Kidney Diseases, publicly available on Kaggle.
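
As an illustration of the gate cutting step, the sketch below partitions a small parameterized circuit into two sub-circuits with the Circuit Knitting Toolbox. It is a minimal example, not the project's actual QTN ansatz: the `EfficientSU2` circuit, the single all-`Z` observable, and the partition labels are placeholder choices, and the `circuit_knitting.cutting` calls assume toolbox version 0.6.

```python
# Minimal gate-cutting sketch with the Circuit Knitting Toolbox (~v0.6).
# The ansatz and observable below are placeholders, not the project's QTN circuit.
from qiskit.circuit.library import EfficientSU2
from qiskit.quantum_info import SparsePauliOp
from circuit_knitting.cutting import partition_problem

# An 8-qubit parameterized circuit standing in for the variational QTN ansatz.
# decompose() flattens the library blocks into individual one- and two-qubit gates.
circuit = EfficientSU2(8, entanglement="linear", reps=2).decompose()
observable = SparsePauliOp(["ZZZZZZZZ"])

# Qubits 0-3 form partition "A" and qubits 4-7 form partition "B"; every
# two-qubit gate crossing that boundary is cut (gate cutting).
partitioned = partition_problem(
    circuit=circuit,
    partition_labels="AAAABBBB",
    observables=observable.paulis,
)
subcircuits = partitioned.subcircuits        # two 4-qubit sub-circuits
subobservables = partitioned.subobservables  # observable terms per partition
```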

## Workflow

The training of QML models integrated with circuit cutting can be performed using two possible workflows.

In Workflow A, the sub-circuits generated by circuit cutting are trained on subsets of the input data features, and each sub-circuit is evaluated against the original training labels. The circuits then undergo a tuning stage in which the variable parameters are updated based on the loss computed in the evaluation stage. Once the optimizer has produced the optimal parameter values, the quasi-probability distributions obtained from the sub-circuits are combined to reconstruct the expectation value of the original circuit.

This workflow facilitates independent, parallel evaluation of the sub-circuits over multiple iterations. The concurrent training can be performed with the help of the existing batching techniques in the Qiskit Runtime primitives [2].

*Figure: Training workflow for the QML model with circuit cutting (Workflow A).*
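
The following sketch shows the shape of Workflow A using off-the-shelf components. The project modifies SamplerQNN internally; here the stock `SamplerQNN` and `NeuralNetworkClassifier` classes are used as stand-ins, the 4-qubit feature-map/ansatz pair is a placeholder for a sub-circuit produced by cutting, and the data is random placeholder data rather than the diabetes dataset.

```python
# Workflow A sketch: each sub-circuit is wrapped in its own QNN and trained
# independently against the original labels. Stock qiskit-machine-learning
# classes stand in for the project's modified SamplerQNN; circuits and data
# are placeholders.
import numpy as np
from qiskit.circuit.library import ZZFeatureMap, RealAmplitudes
from qiskit_aer.primitives import Sampler
from qiskit_machine_learning.neural_networks import SamplerQNN
from qiskit_machine_learning.algorithms.classifiers import NeuralNetworkClassifier
from qiskit_algorithms.optimizers import COBYLA  # qiskit.algorithms in older releases


def make_subcircuit_classifier(num_qubits: int = 4) -> NeuralNetworkClassifier:
    """Build a classifier around one (placeholder) 4-qubit sub-circuit."""
    feature_map = ZZFeatureMap(num_qubits)       # encodes one subset of the features
    ansatz = RealAmplitudes(num_qubits, reps=2)  # trainable part of the sub-circuit
    qnn = SamplerQNN(
        circuit=feature_map.compose(ansatz),
        input_params=feature_map.parameters,
        weight_params=ansatz.parameters,
        interpret=lambda x: x % 2,   # class = least-significant measured bit
        output_shape=2,
        sampler=Sampler(),           # Qiskit Aer Sampler primitive
    )
    return NeuralNetworkClassifier(qnn, optimizer=COBYLA(maxiter=50))


# Placeholder data: 8 features split across two 4-qubit sub-circuits (A and B),
# both evaluated against the same labels. The two fits are independent, so they
# can be batched or run in parallel.
X = np.random.rand(32, 8)
y = np.random.randint(2, size=32)

clf_a = make_subcircuit_classifier()
clf_a.fit(X[:, :4], y)   # sub-circuit A sees features 0-3
clf_b = make_subcircuit_classifier()
clf_b.fit(X[:, 4:], y)   # sub-circuit B sees features 4-7
```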

Workflow B, also proposed in [3], involves training the sub-circuits and reconstructing the original expectation value after each training iteration, unlike Workflow A, where reconstruction occurs only after multiple iterations have been performed on each sub-circuit. An optimization step is then performed on the reconstructed expectation value to tune the variable parameters within the sub-circuits, and the updated parameters are used to complete the training process. The final reconstructed expectation value is also used for validation and testing.

This workflow facilitates concurrent training within a single training iteration at a time, unlike Workflow A, which allows parallel execution of the sub-circuits over multiple iterations. After each training iteration, the sub-circuits undergo classical post-processing to reconstruct the original expectation value.

*Figure: Training workflow for the QML model with circuit cutting (Workflow B).*
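
For contrast, the sketch below outlines the Workflow B loop: after every optimizer evaluation, the sub-circuit results are recombined into the original expectation value, and the loss is computed on that reconstructed value. It reuses `circuit` and `observable` from the gate-cutting sketch above; the loss and target are placeholders, and the `generate_cutting_experiments` / `reconstruct_expectation_values` calls again assume Circuit Knitting Toolbox 0.6.

```python
# Workflow B sketch: reconstruct the original expectation value after every
# optimizer evaluation and compute the loss on it. Reuses `circuit` and
# `observable` from the gate-cutting sketch; the loss below is a placeholder.
import numpy as np
from qiskit_aer.primitives import Sampler
from qiskit_algorithms.optimizers import COBYLA
from circuit_knitting.cutting import (
    generate_cutting_experiments,
    partition_problem,
    reconstruct_expectation_values,
)

sampler = Sampler(run_options={"shots": 4096})


def reconstructed_expval(weights):
    # Re-cut the circuit with the current parameter values bound (simple but
    # wasteful for a sketch; caching the cut sub-circuits would be preferable).
    bound = circuit.assign_parameters(weights)
    partitioned = partition_problem(
        circuit=bound, partition_labels="AAAABBBB", observables=observable.paulis
    )
    subexperiments, coefficients = generate_cutting_experiments(
        circuits=partitioned.subcircuits,
        observables=partitioned.subobservables,
        num_samples=np.inf,
    )
    # Run each partition's sub-experiments; these jobs can be batched.
    results = {
        label: sampler.run(experiments).result()
        for label, experiments in subexperiments.items()
    }
    # Classical post-processing recombines the quasi-distributions into the
    # expectation value of the original (uncut) circuit.
    expvals = reconstruct_expectation_values(
        results, coefficients, partitioned.subobservables
    )
    return float(np.real(np.dot(expvals, observable.coeffs)))


def loss(weights):
    # Placeholder loss: push the reconstructed expectation value toward +1.
    return (reconstructed_expval(weights) - 1.0) ** 2


result = COBYLA(maxiter=20).minimize(fun=loss, x0=np.zeros(circuit.num_parameters))
print("optimized parameters:", result.x)
```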

For this project, we adopted Workflow A. This design decision reflects the constraints of the current Qiskit stack, which lacks support for training sub-circuits with the existing SamplerQNN primitive, as well as the time limitations of the project.

## Evaluation and Results

*Figure: Time taken to run one forward pass on different backends.*

*Figure: Time taken to run one backward pass on different backends.*

*Figure: Results from training an 8-qubit QML model with one circuit cut on CPU and GPU. Panels: training loss of the 4-qubit sub-circuits (A) and (B) on a CPU (50 iterations), and of the 4-qubit sub-circuits (A) and (B) on a GPU (50 iterations).*

*Figure: Time to train the QML model with circuit cutting on CPU and GPU.*

## Footnotes

1. D. Guala, S. Zhang, E. Cruz, C. A. Riofrío, J. Klepsch, and J. M. Arrazola, "Practical overview of image classification with tensor-network quantum circuits," Scientific Reports, vol. 13, no. 1, p. 4427, Mar. 2023, doi: https://doi.org/10.1038/s41598-023-30258-y.

2. J. Garrison, "Qiskit-Extensions/circuit-knitting-toolbox: Circuit Knitting Toolbox 0.6.0." Zenodo, Feb. 12, 2024, doi: https://doi.org/10.5281/zenodo.10651875.

3. M. Beisel, J. Barzen, M. Bechtold, F. Leymann, F. Truger, and B. Weder, "QuantME4VQA: Modeling and Executing Variational Quantum Algorithms Using Workflows," Proceedings of the 13th International Conference on Cloud Computing and Services Science, 2023, doi: https://doi.org/10.5220/0011997500003488.