# neupy.algorithms.step_update.leak_step module

The Leak Learning Rate Adaptation algorithm is a step (learning rate) adaptation procedure for backpropagation-based algorithms.

Parameters

leak_size : float
    Proportion used for the leaky gradient average, so it is always between 0 and 1. Typically this value is small. Defaults to ``0.01``.
alpha : float
    Controls the ratio for the total step update. Typically this value is small. Defaults to ``0.001``.
beta : float
    Similar to ``alpha``, but controls the ratio only for the update matrix norms. Typically this value is bigger than 1. Defaults to ``20``.

This procedure works only with algorithms based on backpropagation.
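As a rough illustration of how these parameters interact (this is a sketch based on Murata's adaptive on-line learning scheme, not NeuPy's actual implementation; the function name, update rule, and variable names here are assumptions), the algorithm keeps a leaky moving average of recent gradients and grows or shrinks the step depending on that average's norm:

```python
import numpy as np

def leak_step_update(step, leak_average, gradient,
                     leak_size=0.01, alpha=0.001, beta=20):
    """One illustrative step of leak learning-rate adaptation.

    ``leak_average`` is a leaky (exponential) moving average of recent
    gradients. Far from a minimum, consecutive gradients point in
    similar directions, the average's norm stays large, and the step
    grows; near a minimum, gradients cancel out and the step shrinks.
    """
    # Leaky average of the gradient; leak_size (0..1) is the leak proportion.
    leak_average = (1 - leak_size) * leak_average + leak_size * gradient
    # Pull the step towards beta * ||leak_average||, at overall rate alpha.
    step = step + alpha * step * (beta * np.linalg.norm(leak_average) - step)
    return step, leak_average
```

With the default parameters, a single update changes the step only slightly, which matches the note above that ``alpha`` and ``leak_size`` are typically small.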

References

[1] Noboru M., "Adaptive on-line learning in changing environments", 1997

[2] LeCun, "Efficient BackProp", 1998

Examples

>>> from neupy import algorithms
>>> bpnet = algorithms.GradientDescent(
...     (2, 4, 1),
...     addons=[algorithms.LeakStepAdaptation]
... )