Optimizers (qiskit_machine_learning.optimizers)

Contains a variety of classical optimizers designed for Qiskit Algorithms' quantum variational algorithms. Logically, these optimizers can be divided into two categories:

Local Optimizers

Given an optimization problem, a local optimizer is a function that attempts to find an optimal value within the neighboring set of a candidate solution.

Global Optimizers

Given an optimization problem, a global optimizer is a function that attempts to find an optimal value among all possible solutions.

Optimizer base classes

OptimizerResult

The result of an optimization routine.

Optimizer

Base class for optimization algorithms.

Minimizer

Callable protocol for a minimizer.
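Any callable taking a cost function, an initial point, and optional gradient and bounds can stand in for a full optimizer class under this protocol. A minimal sketch, using a toy fixed-step finite-difference descent (the `fd_minimizer` helper and its dict result are illustrative assumptions; only the call shape follows the protocol):

```python
import numpy as np

def fd_minimizer(fun, x0, jac=None, bounds=None):
    """Toy minimizer following the Minimizer call shape:
    (fun, x0, jac, bounds) -> result with x / fun / nfev fields."""
    x = np.asarray(x0, dtype=float)
    nfev = 0
    eps, lr = 1e-6, 0.1
    for _ in range(200):
        if jac is not None:
            g = np.asarray(jac(x))
        else:
            # central finite differences when no gradient is supplied
            g = np.array([
                (fun(x + eps * e) - fun(x - eps * e)) / (2 * eps)
                for e in np.eye(x.size)
            ])
            nfev += 2 * x.size
        x = x - lr * g
    return {"x": x, "fun": fun(x), "nfev": nfev + 1}

# minimize f(x) = (x0 - 1)^2 + x1^2
res = fd_minimizer(lambda v: (v[0] - 1) ** 2 + v[1] ** 2, [0.0, 0.0])
print(res["x"])  # close to [1, 0]
```

The real protocol returns an OptimizerResult rather than a dict; the point is that a plain function with this signature can be passed wherever a minimizer is expected.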

Steppable optimization

optimizer_utils

Supplementary tools for optimizers.

SteppableOptimizer

Base class for a steppable optimizer.

AskData

Base class for return type of ask().

TellData

Base class for argument type of tell().

OptimizerState

Base class representing the state of the optimizer.
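A steppable optimizer splits each iteration into an ask phase (which points to evaluate next) and a tell phase (feeding the results back), so the caller keeps control of function evaluation. A minimal pure-Python sketch of that loop shape, using a toy gradient-descent class rather than the real `SteppableOptimizer` API:

```python
import numpy as np

class ToyGradientDescent:
    """Toy ask/tell optimizer: ask() requests a gradient at the current
    point, tell() applies a fixed-step update. This mirrors the loop
    shape only, not the actual SteppableOptimizer interface."""

    def __init__(self, x0, learning_rate=0.1):
        self.x = np.asarray(x0, dtype=float)
        self.learning_rate = learning_rate

    def ask(self):
        # stand-in for AskData: where the caller should evaluate
        return {"x_jac": self.x.copy()}

    def tell(self, ask_data, tell_data):
        # stand-in for TellData: the gradient the caller computed
        self.x = self.x - self.learning_rate * tell_data["eval_jac"]

def gradient(x):
    return 2 * x  # analytic gradient of f(x) = ||x||^2

opt = ToyGradientDescent(x0=[1.0, -1.0])
for _ in range(100):  # caller-driven optimization loop
    ask_data = opt.ask()
    grad = gradient(ask_data["x_jac"])
    opt.tell(ask_data, {"eval_jac": grad})
print(opt.x)  # near [0, 0]
```

Keeping evaluation on the caller's side is what makes this pattern useful for variational workloads, where each cost or gradient evaluation may involve scheduling circuit executions.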

Local optimizers

ADAM

Adam and AMSGRAD optimizers.
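Adam derives per-parameter step sizes from exponentially decayed estimates of the gradient's first and second moments; AMSGRAD additionally keeps a running maximum of the second-moment estimate. A minimal numpy sketch of the standard Adam update rule (a textbook illustration, not this class's implementation or defaults):

```python
import numpy as np

def adam_minimize(grad, x0, lr=0.1, beta1=0.9, beta2=0.999,
                  eps=1e-8, maxiter=1000):
    """Standard Adam update rule driven by an analytic gradient."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # first-moment (mean) estimate
    v = np.zeros_like(x)  # second-moment (uncentered variance) estimate
    for t in range(1, maxiter + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)  # bias correction for zero init
        v_hat = v / (1 - beta2**t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# minimize f(x) = (x0 - 2)^2 + 3 * x1^2
x_min = adam_minimize(lambda x: np.array([2 * (x[0] - 2), 6 * x[1]]),
                      x0=[0.0, 1.0])
print(x_min)  # approaches [2, 0]
```

The division by the square root of the second moment is what makes the effective step size per parameter roughly scale-invariant, which is why Adam needs little learning-rate tuning across problems.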

AQGD

Analytic Quantum Gradient Descent (AQGD) with Epochs optimizer.

CG

Conjugate Gradient optimizer.

COBYLA

Constrained Optimization By Linear Approximation optimizer.

L_BFGS_B

Limited-memory BFGS Bound optimizer.

GSLS

Gaussian-smoothed Line Search.

GradientDescent

The gradient descent minimization routine.

GradientDescentState

State of GradientDescent.

NELDER_MEAD

Nelder-Mead optimizer.

NFT

Nakanishi-Fujii-Todo algorithm.

P_BFGS

Parallelized Limited-memory BFGS optimizer.

POWELL

Powell optimizer.

SLSQP

Sequential Least SQuares Programming optimizer.

SPSA

Simultaneous Perturbation Stochastic Approximation (SPSA) optimizer.
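SPSA estimates the full gradient from only two function evaluations per iteration, independent of the parameter count, by perturbing all parameters simultaneously along a random plus-or-minus-one direction; this makes it well suited to noisy cost functions. A minimal numpy sketch of the core iteration, with assumed textbook gain schedules rather than this class's calibrated defaults:

```python
import numpy as np

def spsa_minimize(fun, x0, maxiter=300, a=0.2, c=0.1, seed=42):
    """Core SPSA iteration: two evaluations per step give a
    simultaneous-perturbation estimate of the whole gradient."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for k in range(maxiter):
        ak = a / (k + 1) ** 0.602  # step-size gain (textbook exponents)
        ck = c / (k + 1) ** 0.101  # perturbation gain
        delta = rng.choice([-1.0, 1.0], size=x.size)  # Rademacher direction
        # elementwise 1/delta_i equals delta_i for +-1 entries
        g_hat = (fun(x + ck * delta) - fun(x - ck * delta)) / (2 * ck) * delta
        x = x - ak * g_hat
    return x

# minimize f(x) = ||x||^2 in three dimensions
x_min = spsa_minimize(lambda v: np.dot(v, v), x0=[1.0, -0.5, 0.8])
print(x_min)  # near the origin
```

Compare this with central finite differences, which would need two evaluations per parameter per iteration; SPSA's two-evaluation estimate is noisier per step but unbiased in expectation, and the decaying gains average that noise out.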

QNSPSA

The Quantum Natural SPSA (QN-SPSA) optimizer.

TNC

Truncated Newton (TNC) optimizer.

SciPyOptimizer

A general Qiskit Optimizer wrapping scipy.optimize.minimize.

UMDA

Continuous Univariate Marginal Distribution Algorithm (UMDA).

The optimizers from scikit-quant are not included in the Qiskit Machine Learning library. To continue using them, please import them from Qiskit Algorithms. Be aware that deprecation of the snobfit, imfil and bobyqa methods was considered: https://github.com/qiskit-community/qiskit-algorithms/issues/84.

Global optimizers

The global optimizers here all use NLOpt for their core function and can only be used if the optional NLOpt package is installed. To install it, run pip install nlopt.

CRS

Controlled Random Search (CRS) with local mutation optimizer.

DIRECT_L

DIviding RECTangles Locally-biased optimizer.

DIRECT_L_RAND

DIviding RECTangles Locally-biased Randomized optimizer.

ESCH

ESCH evolutionary optimizer.

ISRES

Improved Stochastic Ranking Evolution Strategy optimizer.