optimizer_utils

Supplementary tools for optimizers.

class LearningRate(learning_rate)

Represents a learning rate. It is set as an attribute of GradientDescentState. Note that GradientDescent also has a learning rate; that learning rate can be a float, a list, an array, or a function returning a generator, and it is used to create the generator consumed during the optimization process. This class wraps Generator so that the last yielded value can also be accessed.

Parameters:

learning_rate (float | list[float] | np.ndarray | Callable[[], Generator[float, None, None]]) – Used to create a generator to iterate over.
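As a rough illustration of the behavior described above (a minimal sketch, not the library's actual implementation), the following hypothetical class shows how a constant, a sequence, or a generator factory can all be normalized into one generator while the last yielded value is remembered; the class name is an assumption for this example.

```python
from collections.abc import Iterable
from itertools import repeat


class LearningRateSketch:
    """Hypothetical sketch: wrap a learning-rate spec in a generator
    and remember the most recently yielded value."""

    def __init__(self, learning_rate):
        if callable(learning_rate):
            # A zero-argument function returning a generator.
            self._gen = learning_rate()
        elif isinstance(learning_rate, Iterable):
            # A list or array of per-step rates.
            self._gen = iter(learning_rate)
        else:
            # A plain float: yield the same rate forever.
            self._gen = repeat(float(learning_rate))
        self._current = None

    def __iter__(self):
        return self

    def __next__(self):
        # Advance the underlying generator and cache the value.
        self._current = next(self._gen)
        return self._current

    @property
    def current(self):
        """Last value yielded, without advancing the generator."""
        return self._current
```

For instance, `LearningRateSketch([0.1, 0.05])` yields 0.1 then 0.05, and `current` keeps returning the last yielded value until the next step.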

close()

Raise GeneratorExit inside the generator.

property current

Returns the current value of the learning rate.

send(value)

Send a value into the generator. Return the next yielded value or raise StopIteration.

throw(typ, val=None, tb=None)

Raise an exception in the generator. Return the next yielded value or raise StopIteration.
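The send, throw, and close methods above mirror Python's standard generator protocol. The standalone example below demonstrates that protocol on a plain decaying-rate generator (the function name and decay schedule are illustrative, not from the library):

```python
def decaying_rates(initial: float = 0.1, factor: float = 0.5):
    """Yield a geometrically decaying learning rate."""
    rate = initial
    while True:
        yield rate
        rate *= factor


gen = decaying_rates()
print(next(gen))   # 0.1
print(next(gen))   # 0.05
gen.close()        # raises GeneratorExit inside the generator
# After close(), further next(gen) calls raise StopIteration.
```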