Question: Why is `_backward` not used in `SamplerQNN`?
Hello,
I am currently exploring Deep Learning and Quantum Computing for my research, and I am using qiskit_machine_learning to implement a Variational Quantum Classifier (VQC) for binary classification. Below is my model:
Here is the sampler:
Here is the optimizer using COBYLA:
I need some clarification regarding the backpropagation process in Quantum Neural Networks (QNNs), particularly in `SamplerQNN`.
I noticed that the measurement of my quantum circuit is performed in the `_forward` method of `SamplerQNN`, which is what VQC uses by default.
However, I expected `_backward` to be used at some point to compute gradients for the optimization.
I inserted a print statement in `_backward`, but it was never executed, whereas `_forward` was called 101 times.
Questions:
1. Why is `_backward` never called in `SamplerQNN`?
2. I assumed that `_backward` is responsible for computing gradients. If not, how does optimization work in a QNN like `SamplerQNN`?
3. Since COBYLA is a classical optimizer, does that mean the optimization is performed in a classical way?
4. What does backpropagation correspond to in quantum neural networks?
5. Does the weight update happen every time `_forward` is called, or is there another mechanism?
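(On question 3 in particular, the behavior I'm seeing can be reproduced with plain SciPy, since COBYLA is a gradient-free method: it only ever evaluates the objective function and never requests a gradient — which would explain why `_forward` runs many times while `_backward` never does. A minimal sketch:)

```python
# Sketch of why a gradient-free optimizer never triggers a backward pass:
# SciPy's COBYLA only calls the objective; a supplied gradient is ignored
# (SciPy emits a warning that `jac` is unused for this method).
from scipy.optimize import minimize

calls = {"objective": 0, "gradient": 0}

def objective(w):
    calls["objective"] += 1
    return (w[0] - 1.0) ** 2 + (w[1] + 2.0) ** 2

def gradient(w):
    calls["gradient"] += 1
    return [2.0 * (w[0] - 1.0), 2.0 * (w[1] + 2.0)]

# Optimization proceeds purely from objective evaluations — analogous to
# VQC's repeated _forward calls with COBYLA.
result = minimize(objective, x0=[0.0, 0.0], method="COBYLA", jac=gradient)
print(calls["objective"] > 0, calls["gradient"])  # True 0
```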
I am struggling to fully grasp the underlying principle and would greatly appreciate any insights!
Thank you in advance!