Optimizers (qiskit_machine_learning.optimizers)¶
Contains a variety of classical optimizers designed for use by Qiskit's quantum variational algorithms. Logically, these optimizers can be divided into two categories:
- Local Optimizers
Given an optimization problem, a local optimizer is a function that attempts to find an optimal value within the neighboring set of a candidate solution.
- Global Optimizers
Given an optimization problem, a global optimizer is a function that attempts to find an optimal value among all possible solutions.
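The practical difference is easiest to see on a multimodal objective: a local optimizer converges to whichever minimum is nearest its starting point, while a global one searches the whole domain. A minimal sketch in plain Python (the function, gradient-descent loop, and grid search here are illustrative examples, not part of the library):

```python
# Hypothetical 1-D double-well objective: a local minimum near x = +0.96
# and a deeper, global minimum near x = -1.03.

def f(x):
    return (x * x - 1.0) ** 2 + 0.3 * x

def grad_f(x):
    return 4.0 * x * (x * x - 1.0) + 0.3

def local_minimize(x0, lr=0.05, iters=500):
    """Plain gradient descent: converges to the minimum nearest x0."""
    x = x0
    for _ in range(iters):
        x -= lr * grad_f(x)
    return x

def global_minimize(lo=-2.0, hi=2.0, samples=4001):
    """Brute-force grid scan of the whole domain."""
    grid = (lo + (hi - lo) * i / (samples - 1) for i in range(samples))
    return min(grid, key=f)

x_local = local_minimize(0.9)   # trapped in the right-hand well
x_global = global_minimize()    # finds the deeper left-hand well
```

Started at x = 0.9, the local routine settles in the shallower well, while the global scan locates the deeper one.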
Optimizer base classes¶
- OptimizerResult: The result of an optimization routine.
- Optimizer: Base class for optimization algorithm.
- Minimizer: Callable Protocol for minimizer.
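The Minimizer protocol means any callable with a scipy-minimize-like signature (objective, initial point, optional jacobian and bounds) can stand in for a full optimizer object. A toy conforming callable, purely illustrative (the Result class and toy_minimizer function are hypothetical names, not the library's API):

```python
from dataclasses import dataclass

@dataclass
class Result:
    """Minimal stand-in for an optimization result (hypothetical)."""
    x: list
    fun: float
    nfev: int

def toy_minimizer(fun, x0, jac=None, bounds=None):
    """A callable matching a Minimizer-style signature
    (fun, x0, jac, bounds) -> result, implemented as
    finite-difference coordinate descent; purely illustrative."""
    x = list(x0)
    nfev = 0
    eps, lr = 1e-6, 0.1
    for _ in range(200):
        for i in range(len(x)):
            if jac is not None:
                g = jac(x)[i]
            else:
                xp = x[:]
                xp[i] += eps
                g = (fun(xp) - fun(x)) / eps  # forward difference
                nfev += 2
            x[i] -= lr * g
    return Result(x=x, fun=fun(x), nfev=nfev)

res = toy_minimizer(lambda v: (v[0] - 1) ** 2 + v[1] ** 2, [0.0, 0.5])
```

Because only the call signature matters, a lambda wrapping any third-party routine can be dropped in the same way.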
Steppable optimization¶
- optimizer_utils: Supplementary tools for optimizers.
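Steppable optimization decouples "which points to evaluate next" from "how to update the iterate", letting the caller drive the loop one step at a time (useful when evaluations run on remote hardware). An ask/tell sketch of the pattern (the class and method names are illustrative, not the library's exact API):

```python
class SteppableGD:
    """Toy steppable gradient descent via forward differences.
    ask() returns the points to evaluate; tell() consumes the
    resulting values and updates the current iterate."""

    def __init__(self, x0, lr=0.1, eps=1e-6):
        self.x = list(x0)
        self.lr = lr
        self.eps = eps

    def ask(self):
        # Base point plus one perturbed point per coordinate.
        points = [self.x[:]]
        for i in range(len(self.x)):
            p = self.x[:]
            p[i] += self.eps
            points.append(p)
        return points

    def tell(self, values):
        f0, rest = values[0], values[1:]
        for i, fi in enumerate(rest):
            self.x[i] -= self.lr * (fi - f0) / self.eps

def objective(v):
    return (v[0] + 1) ** 2 + (v[1] - 2) ** 2

opt = SteppableGD([0.0, 0.0])
for _ in range(100):
    pts = opt.ask()
    opt.tell([objective(p) for p in pts])  # caller controls evaluation
```

The caller owns the evaluation step, so batching, caching, or dispatching to a backend all happen outside the optimizer.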
Local optimizers¶
- ADAM: Adam and AMSGRAD optimizers.
- AQGD: Analytic Quantum Gradient Descent (AQGD) with Epochs optimizer.
- CG: Conjugate Gradient optimizer.
- COBYLA: Constrained Optimization By Linear Approximation optimizer.
- L_BFGS_B: Limited-memory BFGS Bound optimizer.
- GSLS: Gaussian-smoothed Line Search.
- GradientDescent: The gradient descent minimization routine.
- GradientDescentState: State of GradientDescent.
- NELDER_MEAD: Nelder-Mead optimizer.
- NFT: Nakanishi-Fujii-Todo algorithm.
- P_BFGS: Parallelized Limited-memory BFGS optimizer.
- POWELL: Powell optimizer.
- SLSQP: Sequential Least SQuares Programming optimizer.
- SPSA: Simultaneous Perturbation Stochastic Approximation (SPSA) optimizer.
- QNSPSA: The Quantum Natural SPSA (QN-SPSA) optimizer.
- TNC: Truncated Newton (TNC) optimizer.
- SciPyOptimizer: A general Qiskit Optimizer wrapping scipy.optimize.minimize.
- UMDA: Continuous Univariate Marginal Distribution Algorithm (UMDA).
The optimizers from scikit-quant are not included in the Qiskit Machine Learning library. To continue using them, please import them from Qiskit Algorithms. Be aware that deprecation of the snobfit, imfil and bobyqa methods has been considered: https://github.com/qiskit-community/qiskit-algorithms/issues/84.
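Among the local methods above, SPSA stands out for variational loops because it needs only two objective evaluations per iteration regardless of the number of parameters. A bare-bones sketch of the core update (the function name, step-size schedules, and constants here are illustrative, not the library's implementation):

```python
import random

def spsa_minimize(fun, x0, maxiter=300, a=0.1, c=0.1, seed=42):
    """Minimal SPSA: estimate the gradient from just two evaluations
    along a random +/-1 simultaneous-perturbation direction."""
    rng = random.Random(seed)
    x = list(x0)
    n = len(x)
    for k in range(1, maxiter + 1):
        ak = a / k ** 0.602   # decaying step size
        ck = c / k ** 0.101   # decaying perturbation size
        delta = [rng.choice((-1.0, 1.0)) for _ in range(n)]
        xp = [xi + ck * di for xi, di in zip(x, delta)]
        xm = [xi - ck * di for xi, di in zip(x, delta)]
        diff = (fun(xp) - fun(xm)) / (2.0 * ck)
        # For +/-1 entries, 1/delta_i == delta_i.
        x = [xi - ak * diff * di for xi, di in zip(x, delta)]
    return x

x = spsa_minimize(lambda v: v[0] ** 2 + v[1] ** 2, [1.0, 1.0])
```

The two-evaluation gradient estimate is noisy but unbiased in expectation, which is what makes SPSA tolerant of shot noise in quantum objective functions.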
Global optimizers¶
The global optimizers here all use NLOpt for their core function and can only be used if the optional NLOpt package is installed. To install it, run pip install nlopt.
- CRS: Controlled Random Search (CRS) with local mutation optimizer.
- DIRECT_L: DIviding RECTangles Locally-biased optimizer.
- DIRECT_L_RAND: DIviding RECTangles Locally-biased Randomized optimizer.
- ESCH: ESCH evolutionary optimizer.
- ISRES: Improved Stochastic Ranking Evolution Strategy optimizer.
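What these methods share is broad, stochastic exploration of the whole search box rather than descent from a single starting point. The flavor can be sketched with plain uniform random search (a deliberately crude, hypothetical example; the NLOpt algorithms above are far more sophisticated):

```python
import math
import random

def random_search(fun, bounds, samples=2000, seed=7):
    """Uniform random global search: sample the whole box and keep
    the best point seen. Crude, but immune to local minima."""
    rng = random.Random(seed)
    best_x, best_f = None, math.inf
    for _ in range(samples):
        x = [rng.uniform(lo, hi) for lo, hi in bounds]
        fx = fun(x)
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

# Multimodal 1-D objective: several local minima on [-3, 3],
# with the global minimum near x = -0.314.
fun = lambda v: math.sin(5.0 * v[0]) + 0.1 * v[0] ** 2
best_x, best_f = random_search(fun, bounds=[(-3.0, 3.0)])
```

Methods such as CRS or ESCH keep a population of such samples and mutate or recombine the best ones instead of drawing every point independently.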