optimizer_utils
Supplementary tools for optimizers.
- class LearningRate(learning_rate)
Represents a Learning Rate. Will be an attribute of GradientDescentState. Note that GradientDescent also has a learning rate; that learning rate can be a float, a list, an array, or a callable returning a generator, and it is used to create a generator that is iterated over during the optimization process. This class wraps Generator so that we can also access the last yielded value.
- Parameters:
learning_rate (float | list[float] | np.ndarray | Callable[[], Generator[float, None, None]]) – Used to create a generator to iterate on.
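As a usage sketch, the constructor accepts any of the listed input types. The import path below is an assumption and may differ between package versions.

```python
# Hypothetical usage sketch; the import path is an assumption.
from qiskit_algorithms.optimizers.optimizer_utils import LearningRate

# A constant learning rate from a float.
constant_lr = LearningRate(learning_rate=0.01)

# A schedule from a list of values, consumed one per optimization step.
scheduled_lr = LearningRate(learning_rate=[0.1, 0.05, 0.01])

# A zero-argument callable returning a generator, e.g. an exponential decay.
def decaying_rate():
    eta = 0.1
    while True:
        yield eta
        eta *= 0.9

callable_lr = LearningRate(learning_rate=decaying_rate)
```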
- close()
Raise GeneratorExit inside generator.
- property current
Returns the current value of the learning rate.
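For intuition only, the sketch below shows one way such a wrapper could keep track of the last yielded value. SimpleLearningRate and _as_generator are hypothetical names; this is not the library's implementation.

```python
# Minimal illustrative sketch, not the library's implementation.
from typing import Generator

import numpy as np


def _as_generator(learning_rate) -> Generator[float, None, None]:
    """Turn a float, a sequence, or a generator factory into a generator."""
    if callable(learning_rate):
        return learning_rate()
    if isinstance(learning_rate, (int, float)):
        def constant():
            while True:
                yield float(learning_rate)
        return constant()
    def from_sequence():
        yield from np.atleast_1d(learning_rate).tolist()
    return from_sequence()


class SimpleLearningRate:
    """Wraps a generator and remembers the last yielded learning rate."""

    def __init__(self, learning_rate):
        self._generator = _as_generator(learning_rate)
        self._current = None

    def __iter__(self):
        return self

    def __next__(self) -> float:
        # Advance the wrapped generator and remember the yielded value.
        self._current = next(self._generator)
        return self._current

    def close(self):
        # Raise GeneratorExit inside the wrapped generator.
        self._generator.close()

    @property
    def current(self):
        # Last value yielded so far (None before the first iteration).
        return self._current


lr = SimpleLearningRate([0.1, 0.05, 0.01])
print(next(lr), lr.current)  # 0.1 0.1
print(next(lr), lr.current)  # 0.05 0.05
```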