neupy.algorithms.step_update.search_then_converge module

class neupy.algorithms.step_update.search_then_converge.SearchThenConverge[source]

Algorithm that decreases the learning step after each epoch.

Parameters:

reduction_freq : int

Controls how frequently the step is reduced, measured in epochs. Defaults to 100 epochs. Can't be less than 1. Smaller values make the step decrease faster.

rate_coefitient : float

Second important parameter that controls the rate of step reduction. Defaults to 0.2.

Warns:

It works only with algorithms based on backpropagation.

See also

StepDecay

Examples

>>> from neupy import algorithms
>>>
>>> bpnet = algorithms.GradientDescent(
...     (2, 4, 1),
...     step=0.1,
...     verbose=False,
...     addons=[algorithms.SearchThenConverge]
... )
>>>
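The schedule behind this addon follows the classic "search then converge" idea: the step stays close to its initial value for early epochs (search phase) and then decays roughly as 1/epoch (converge phase). The sketch below implements the standard Darken and Moody form of that schedule using this class's parameter names; it is an illustration of the technique, and the exact formula inside neupy's implementation may differ.

```python
def search_then_converge_step(initial_step, epoch,
                              reduction_freq=100, rate_coefitient=0.2):
    """Sketch of a 'search then converge' learning-step schedule.

    Roughly constant for epoch << reduction_freq (search phase),
    then decays like 1/epoch (converge phase). Parameter names match
    the SearchThenConverge addon; the formula is the standard
    Darken-Moody form, not necessarily neupy's exact implementation.
    """
    # Normalized time: how many reduction periods have elapsed.
    t = epoch / reduction_freq
    # Ratio between the rate coefficient and the initial step.
    ratio = rate_coefitient / initial_step
    return (initial_step * (1 + ratio * t)
            / (1 + ratio * t + reduction_freq * t ** 2))
```

For `epoch = 0` this returns the initial step unchanged; as `epoch` grows past `reduction_freq`, the quadratic term in the denominator dominates and the step shrinks approximately in proportion to `1 / epoch`.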
init_train_updates()[source]
options = {'reduction_freq': Option(class_name='SearchThenConverge', value=IntProperty(name="reduction_freq")), 'rate_coefitient': Option(class_name='SearchThenConverge', value=NumberProperty(name="rate_coefitient"))}[source]
rate_coefitient = None[source]
reduction_freq = None[source]