neupy.algorithms.step_update.step_decay module

class neupy.algorithms.step_update.step_decay.StepDecay

Algorithm decreases the learning step monotonically after each iteration.

\[\alpha_{t + 1} = \frac{\alpha_{0}}{1 + \frac{t}{m}}\]

where \(\alpha\) is the learning step, \(\alpha_{0}\) is its initial value, \(t\) is the epoch number, and \(m\) is the reduction_freq parameter.

Parameters:

reduction_freq : int

This parameter controls the step reduction frequency. The larger the value, the more slowly the step decreases.

For instance, if reduction_freq=100 and step=0.12, then after 100 epochs the step will be equal to 0.06 (which is 0.12 / 2), after 200 epochs it will be equal to 0.04 (which is 0.12 / 3), and so on; the sketch after this parameter list reproduces these numbers.

Defaults to 100 epochs.
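
The schedule above is easy to check by hand. Below is a minimal standalone sketch of the formula in plain Python 3, independent of NeuPy internals; the step_decay helper name is hypothetical and used only for illustration:

>>> def step_decay(initial_step, epoch, reduction_freq):
...     # alpha_t = alpha_0 / (1 + t / m)
...     return initial_step / (1 + epoch / reduction_freq)
...
>>> step_decay(0.12, 100, 100)  # after 100 epochs: 0.12 / 2
0.06
>>> step_decay(0.12, 200, 100)  # after 200 epochs: 0.12 / 3
0.04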

Warns:

It works only with algorithms based on backpropagation.

Notes

The step will be reduced faster when you train with smaller batches; see the sketch below.
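
One hedged way to see why, under the assumption (not confirmed by this page) that the decay counter \(t\) advances once per mini-batch update rather than once per epoch: smaller batches produce more updates per epoch, so \(t\) grows faster and the step shrinks faster.

>>> n_samples, initial_step, reduction_freq = 1000, 0.12, 100
>>> for batch_size in (500, 100, 10):
...     t = n_samples // batch_size  # counter after one full epoch
...     print(batch_size, round(initial_step / (1 + t / reduction_freq), 4))
...
500 0.1176
100 0.1091
10 0.06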

Examples

>>> from neupy import algorithms
>>>
>>> bpnet = algorithms.GradientDescent(
...     (2, 4, 1),
...     step=0.1,
...     reduction_freq=100,
...     addons=[algorithms.StepDecay]
... )
>>>
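Training then proceeds as with any other backpropagation-based network. A minimal continuation of the example above; x_train, y_train, and the epoch count are placeholder values, and the train call is assumed to follow the usual NeuPy network API:

>>> import numpy as np
>>>
>>> x_train = np.random.random((100, 2))
>>> y_train = np.random.random((100, 1))
>>>
>>> bpnet.train(x_train, y_train, epochs=300)
>>>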
init_train_updates()
options = {'reduction_freq': Option(class_name='StepDecay', value=IntProperty(name="reduction_freq"))}
reduction_freq = None