SVCLoss

class SVCLoss(**kwargs)[source]

Bases: KernelLoss

This class provides a kernel loss function for classification tasks by fitting an SVC model from scikit-learn. Given training samples x_i with binary labels y_i and a kernel K_θ parameterized by values θ, the loss is defined as:

\mathrm{SVCLoss} = \sum_{i} a_i - 0.5 \sum_{i,j} a_i a_j y_i y_j K_\theta(x_i, x_j)

where the a_i are the optimal Lagrange multipliers found by solving the standard SVM quadratic program. Note that the hyperparameter C for the soft-margin penalty can be specified through the keyword arguments.

Minimizing this loss over the kernel parameters θ is equivalent to maximizing a weighted kernel alignment, which in turn yields the tightest upper bound on the SVM generalization error for the given parameterization.

See https://arxiv.org/abs/2105.03406 for further details.
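
As a rough illustration of how this objective can be evaluated classically, the sketch below fits sklearn.svm.SVC on a precomputed kernel matrix and reads the loss off the fitted dual coefficients. The data, labels, and Gaussian stand-in for K_θ are hypothetical placeholders; only the final two lines mirror the formula above.

    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(seed=0)
    X = rng.normal(size=(20, 2))                   # hypothetical training samples x_i
    y = np.where(X[:, 0] > 0, 1, -1)               # hypothetical binary labels y_i
    # Gaussian stand-in for the parameterized kernel matrix K_theta(x_i, x_j)
    K = np.exp(-np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1))

    svc = SVC(kernel="precomputed", C=1.0)         # C is the soft-margin penalty
    svc.fit(K, y)

    # dual_coef_ holds a_i * y_i for the support vectors; support_ gives their indices;
    # a_i = 0 for all other samples, so the sums can be restricted to the support vectors.
    dual = svc.dual_coef_[0]
    sv = svc.support_
    loss = np.sum(np.abs(dual)) - 0.5 * dual @ K[np.ix_(sv, sv)] @ dual
    print(loss)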

Parameters:

**kwargs – Arbitrary keyword arguments to pass to the SVC constructor during SVCLoss evaluation, as in the sketch below.
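
For instance, a non-default soft-margin penalty or class weighting is forwarded verbatim to sklearn.svm.SVC at every loss evaluation (a minimal sketch; the import path assumes a recent qiskit-machine-learning release):

    from qiskit_machine_learning.utils.loss_functions import SVCLoss

    # Both keyword arguments are passed straight through to the SVC constructor
    loss = SVCLoss(C=0.1, class_weight="balanced")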

Methods

evaluate(parameter_values, quantum_kernel, data, labels)[source]

Evaluates the SVC loss of a trainable quantum kernel on a labeled dataset.

Parameters:
  • parameter_values (Sequence[float]) – An array of values to assign to the kernel's trainable (user) parameters

  • quantum_kernel (TrainableKernel) – A trainable quantum kernel object to evaluate

  • data (ndarray) – An (N, M) matrix of training data, where N is the number of samples and M is the feature dimension

  • labels (ndarray) – A length-N array containing the truth labels

Returns:

A loss value

Return type:

float
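
The sketch below shows a typical call, assuming a TrainableFidelityQuantumKernel whose feature map carries a single trainable parameter; the circuit construction, data, and labels are illustrative only, and the import paths assume a recent qiskit-machine-learning release.

    import numpy as np
    from qiskit import QuantumCircuit
    from qiskit.circuit import ParameterVector
    from qiskit.circuit.library import ZZFeatureMap
    from qiskit_machine_learning.kernels import TrainableFidelityQuantumKernel
    from qiskit_machine_learning.utils.loss_functions import SVCLoss

    # One trainable rotation layer composed with a standard data-encoding feature map
    user_params = ParameterVector("θ", 1)
    rot = QuantumCircuit(2)
    rot.ry(user_params[0], 0)
    rot.ry(user_params[0], 1)
    feature_map = rot.compose(ZZFeatureMap(feature_dimension=2))

    kernel = TrainableFidelityQuantumKernel(
        feature_map=feature_map, training_parameters=user_params
    )

    data = np.array([[0.1, 0.2], [0.3, 0.1], [0.9, 0.8], [0.8, 0.7]])  # illustrative samples
    labels = np.array([1, 1, -1, -1])                                  # illustrative labels

    svc_loss = SVCLoss(C=1.0)
    value = svc_loss.evaluate(
        parameter_values=np.array([0.5]),
        quantum_kernel=kernel,
        data=data,
        labels=labels,
    )
    print(value)  # a single float

In practice this evaluation is usually driven by a kernel trainer such as QuantumKernelTrainer rather than called by hand.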