# neupy.algorithms.step_update.search_then_converge module

class neupy.algorithms.step_update.search_then_converge.SearchThenConverge[source]

The algorithm decreases the learning step after each epoch.

Parameters:

- `reduction_freq` : int
  Controls how quickly the step is reduced with respect to the number of epochs. Defaults to 100 epochs. Can't be less than 1; smaller values make the step decrease faster.
- `rate_coefitient` : float
  Second parameter that controls the rate of error reduction. Defaults to 0.2.

It works only with algorithms based on backpropagation.

See also: StepDecay

Examples

>>> from neupy import algorithms
>>>
>>> bpnet = algorithms.GradientDescent(
...     (2, 4, 1),
...     step=0.1,
...     verbose=False,
...     addons=[algorithms.SearchThenConverge]
... )
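
The decay behaviour this add-on implements can be illustrated with a minimal sketch of the classic "search then converge" schedule. This is an assumption for illustration: the function name `search_then_converge_step` is hypothetical, the formula below is the simplest 1/epoch variant of the schedule, and NeuPy's exact update (including the role of `rate_coefitient`) may differ.

```python
def search_then_converge_step(step0, epoch, reduction_freq=100):
    """Sketch of a search-then-converge decay (assumed form, not
    NeuPy's exact formula).

    The step stays close to ``step0`` while ``epoch`` is small
    relative to ``reduction_freq`` (the "search" phase) and then
    falls off roughly as 1/epoch (the "converge" phase).
    """
    return step0 / (1 + epoch / reduction_freq)


# With reduction_freq=100 the step is halved once epoch reaches 100,
# which matches the documented default frequency above.
print(search_then_converge_step(0.1, 0))    # initial step: 0.1
print(search_then_converge_step(0.1, 100))  # halved: 0.05
```

A smaller `reduction_freq` shrinks the denominator's growth threshold, so the step decays sooner, consistent with the parameter description above.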