neupy.algorithms.step_update.leak_step module

class neupy.algorithms.step_update.leak_step.LeakStepAdaptation[source]

Leak Learning Rate Adaptation is a step (learning rate) adaptation procedure for algorithms based on backpropagation.

Parameters


leak_size : float

Defaults to 0.01. This value defines a proportion, so it is always between 0 and 1. Typically it is small.

alpha : float

Controls the total step update ratio. Defaults to 0.001. Typically this value is small.

beta : float

Similar to alpha, but it controls the ratio only for the update matrix norms. Defaults to 20. Typically this value is greater than 1.
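The update rule itself is not spelled out on this page. The sketch below is one common reading of Murata's leaky-average scheme [1]: a leaky (exponential) average of the gradient is maintained, and the step grows when that average has a large norm (consistent descent direction) and shrinks when it is small (noisy gradients cancel out). Parameter names follow the options above; this is an illustrative NumPy sketch, not the library's exact implementation.

```python
import numpy as np

def leak_step_update(step, leak_average, gradient,
                     leak_size=0.01, alpha=0.001, beta=20):
    """One step-adaptation update (assumed form of the leaky scheme).

    leak_size -- proportion of the new gradient mixed into the
                 running average (between 0 and 1).
    alpha     -- scales the total step update.
    beta      -- scales the influence of the averaged gradient norm.
    """
    # Leaky average of the gradient.
    leak_average = (1 - leak_size) * leak_average + leak_size * gradient

    # Step grows while beta * ||leak_average|| exceeds the current
    # step, and shrinks otherwise.
    step = step + alpha * step * (beta * np.linalg.norm(leak_average) - step)
    return step, leak_average
```

With a stream of consistent gradients the leaky average builds up and the step increases; with gradients that keep flipping sign the average stays near zero and the step decays toward zero.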


Notes

It works only with algorithms based on backpropagation.


References

[1] Murata, N. “Adaptive on-line learning in changing environments”, 1997

[2] LeCun, Y. “Efficient BackProp”, 1998

Examples


>>> from neupy import algorithms
>>> bpnet = algorithms.GradientDescent(
...     (2, 4, 1),
...     addons=[algorithms.LeakStepAdaptation]
... )
alpha = None[source]
beta = None[source]
leak_size = None[source]
options = {'leak_size': Option(class_name='LeakStepAdaptation', value=ProperFractionProperty(name="leak_size")), 'alpha': Option(class_name='LeakStepAdaptation', value=BoundedProperty(name="alpha")), 'beta': Option(class_name='LeakStepAdaptation', value=BoundedProperty(name="beta"))}[source]