Optimizers (qiskit_algorithms.optimizers)#
Classical Optimizers.
This package contains a variety of classical optimizers designed for use by qiskit_algorithms' quantum variational algorithms, such as VQE.
Logically, these optimizers can be divided into two categories:
- Local Optimizers: given an optimization problem, a local optimizer is a function that attempts to find an optimal value within the neighboring set of a candidate solution.
- Global Optimizers: given an optimization problem, a global optimizer is a function that attempts to find an optimal value among all possible solutions.
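All of these optimizers share a common `minimize` interface inherited from the `Optimizer` base class, so a variational algorithm can swap one for another without other code changes. The following is a minimal sketch of calling an optimizer directly on a classical objective function; the choice of COBYLA and its settings here are illustrative only.

```python
import numpy as np
from qiskit_algorithms.optimizers import COBYLA

# A simple classical objective: a shifted paraboloid with minimum at (1, -2).
def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

optimizer = COBYLA(maxiter=200)

# minimize() returns an OptimizerResult holding the optimal point, the
# objective value there, and the number of function evaluations used.
result = optimizer.minimize(fun=objective, x0=np.array([0.0, 0.0]))

print(result.x)     # optimal parameters found
print(result.fun)   # objective value at the optimum
print(result.nfev)  # number of objective evaluations
```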
Optimizer Base Classes#
| Class | Description |
|---|---|
| OptimizerResult | The result of an optimization routine. |
| Optimizer | Base class for optimization algorithm. |
| Minimizer | Callable Protocol for minimizer. |
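Because Minimizer is a plain callable protocol, anything with a compatible signature can stand in for a built-in optimizer. As a hedged sketch, assuming the standard SciPy interface: scipy.optimize.minimize can be adapted with functools.partial, since SciPy's result object exposes the x, fun and nfev fields the protocol expects.

```python
from functools import partial
from scipy.optimize import minimize

# Any callable of the form fn(fun, x0, jac, bounds) -> result-with-.x/.fun/.nfev
# satisfies the Minimizer protocol; scipy.optimize.minimize fits once the
# method is fixed with partial().
l_bfgs_b_minimizer = partial(minimize, method="L-BFGS-B")

# l_bfgs_b_minimizer can now be passed wherever a Minimizer is accepted,
# e.g. as the optimizer argument of a variational algorithm.
```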
Steppable Optimization#
| Module | Description |
|---|---|
| optimizer_utils | Utils for optimizers. |
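Steppable optimizers such as GradientDescent break the optimization loop into explicit steps, which lets the user drive the iteration and inject custom logic between steps. Below is a minimal, hedged sketch of that step-wise loop, assuming the start/step/create_result interface of SteppableOptimizer; see the class documentation for the finer-grained ask/tell variant.

```python
import numpy as np
from qiskit_algorithms.optimizers import GradientDescent

# Objective with minimum at the origin.
def objective(x):
    return np.sum(x ** 2)

optimizer = GradientDescent(maxiter=50, learning_rate=0.1)

# Instead of calling minimize() in one shot, drive the loop manually.
optimizer.start(fun=objective, x0=np.array([1.0, -1.0]))
while optimizer.continue_condition():
    optimizer.step()  # one ask/evaluate/tell cycle
result = optimizer.create_result()

print(result.x)  # optimal parameters found
```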
Local Optimizers#
| Class | Description |
|---|---|
| ADAM | Adam and AMSGRAD optimizers. |
| AQGD | Analytic Quantum Gradient Descent (AQGD) with Epochs optimizer. |
| CG | Conjugate Gradient optimizer. |
| COBYLA | Constrained Optimization By Linear Approximation optimizer. |
| L_BFGS_B | Limited-memory BFGS Bound optimizer. |
| GSLS | Gaussian-smoothed Line Search. |
| GradientDescent | The gradient descent minimization routine. |
| GradientDescentState | State of GradientDescent. |
| NELDER_MEAD | Nelder-Mead optimizer. |
| NFT | Nakanishi-Fujii-Todo algorithm. |
| P_BFGS | Parallelized Limited-memory BFGS optimizer. |
| POWELL | Powell optimizer. |
| SLSQP | Sequential Least SQuares Programming optimizer. |
| SPSA | Simultaneous Perturbation Stochastic Approximation (SPSA) optimizer. |
| QNSPSA | The Quantum Natural SPSA (QN-SPSA) optimizer. |
| TNC | Truncated Newton (TNC) optimizer. |
| SciPyOptimizer | A general Qiskit Optimizer wrapping scipy.optimize.minimize. |
| UMDA | Continuous Univariate Marginal Distribution Algorithm (UMDA). |
Qiskit also provides the following optimizers, which are built using the optimizers from scikit-quant. The scikit-quant package is not installed by default; it must be explicitly installed, if desired, by the user. The optimizers therein are provided under various licenses, hence it has been made an optional install. To install the scikit-quant dependent package you can use `pip install scikit-quant`.

| Class | Description |
|---|---|
| BOBYQA | Bound Optimization BY Quadratic Approximation algorithm. |
| IMFIL | IMplicit FILtering algorithm. |
| SNOBFIT | Stable Noisy Optimization by Branch and FIT algorithm. |
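Once scikit-quant is installed, these optimizers are used through the same minimize interface as the built-in ones. A minimal, hedged sketch follows, assuming the wrapper's standard maxiter constructor argument and that this family of optimizers expects bounds on every variable.

```python
import numpy as np
from qiskit_algorithms.optimizers import BOBYQA

def objective(x):
    return (x[0] - 0.5) ** 2 + (x[1] + 0.25) ** 2

optimizer = BOBYQA(maxiter=100)

# Bounds for every parameter are supplied, since the scikit-quant based
# optimizers work on bounded problems.
result = optimizer.minimize(
    fun=objective,
    x0=np.array([0.0, 0.0]),
    bounds=[(-1.0, 1.0), (-1.0, 1.0)],
)
print(result.x, result.fun)
```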
Global Optimizers#
The global optimizers here all use NLopt for their core function and can only be used if the optional dependent NLopt package is installed. To install the NLopt dependent package you can use `pip install nlopt`.
| Class | Description |
|---|---|
| CRS | Controlled Random Search (CRS) with local mutation optimizer. |
| DIRECT_L | DIviding RECTangles Locally-biased optimizer. |
| DIRECT_L_RAND | DIviding RECTangles Locally-biased Randomized optimizer. |
| ESCH | ESCH evolutionary optimizer. |
| ISRES | Improved Stochastic Ranking Evolution Strategy optimizer. |
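The NLopt-based global optimizers listed above expose the same minimize interface as the local ones. The following is a hedged sketch, assuming the max_evals constructor argument and that these optimizers search a bounded region supplied by the caller.

```python
import numpy as np
from qiskit_algorithms.optimizers import CRS

# A multimodal test function: a global optimizer searches the whole
# bounded region rather than only the neighborhood of x0.
def objective(x):
    return np.sin(3 * x[0]) + (x[0] - 0.4) ** 2

optimizer = CRS(max_evals=1000)

result = optimizer.minimize(
    fun=objective,
    x0=np.array([0.0]),
    bounds=[(-2.0, 2.0)],
)
print(result.x, result.fun)
```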