Callback function is not being called for TrainableModels when using optimizers that don't inherit from SciPyOptimizer #893
Labels: type: bug 🐞 (Something isn't working)
What is happening?
Callback function is not being called for TrainableModels when using optimizers that don't inherit from SciPyOptimizer
Around two months ago, a commit titled "Fix callback compatibility for trainable_model" introduced an if clause around the callback invocation in the `_get_objective` method: the callback is executed only if the optimizer is a `SciPyOptimizer` instance.
However, several optimizers in the repository inherit directly from `Optimizer` rather than `SciPyOptimizer`, so the if clause evaluates to false and the callback is never executed. ADAM, AQGD, GradientDescent, GSLS, SPSA, and UMDA are not `SciPyOptimizer` subclasses, yet the callback should be executed at each iteration for any of them.
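The effect of the guard can be shown with a minimal, self-contained sketch. The class names below mirror the real optimizer hierarchy, but the classes and the `get_objective` helpers are hypothetical stand-ins, not the actual qiskit-machine-learning code:

```python
class Optimizer:                  # stand-in for the optimizer base class
    pass

class SciPyOptimizer(Optimizer):  # stand-in for the SciPy wrapper subclass
    pass

class SPSA(Optimizer):            # inherits from Optimizer, not SciPyOptimizer
    pass

def get_objective(optimizer, callback):
    """Buggy version: the callback fires only for SciPyOptimizer instances."""
    def objective(weights):
        loss = sum(w * w for w in weights)         # dummy objective value
        if isinstance(optimizer, SciPyOptimizer):  # too narrow a check
            callback(weights, loss)
        return loss
    return objective

def get_objective_fixed(optimizer, callback):
    """Fixed version: fire the callback for every optimizer type."""
    def objective(weights):
        loss = sum(w * w for w in weights)
        if callback is not None:                   # only guard against None
            callback(weights, loss)
        return loss
    return objective

calls = []
objective = get_objective(SPSA(), lambda w, l: calls.append(l))
objective([0.1, 0.2])
print(len(calls))          # 0 -- callback silently skipped for SPSA

calls_fixed = []
objective = get_objective_fixed(SPSA(), lambda w, l: calls_fixed.append(l))
objective([0.1, 0.2])
print(len(calls_fixed))    # 1 -- callback fires as expected
```

Replacing the `isinstance` check with a simple `None` check, as in the fixed variant, would let every optimizer type report its progress.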
How can we reproduce the issue?
- with any callback function
- with any of the following optimizers: ADAM, AQGD, GradientDescent, GSLS, SPSA, or UMDA
- with any data, feature_map, and ansatz
What should happen?
The callback function should be executed at each iteration, regardless of which optimizer is used.
Any suggestions?
No response