neupy.algorithms.regularization.weight_decay module

class neupy.algorithms.regularization.weight_decay.WeightDecay

The weight decay algorithm penalizes large weights; it is also known as L2 regularization.

Parameters:

decay_rate : float

Controls the strength of the penalty applied during parameter updates. The larger the value, the stronger the regularization effect during training. Defaults to 0.1.

Notes

This add-on works with any algorithm based on backpropagation.
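
Conceptually, the add-on shrinks every weight toward zero on each update. Below is a minimal sketch of the decayed update rule for a single parameter (the function name and signature are illustrative, not part of the NeuPy API; it assumes the penalty gradient is decay_rate times the weight):

    def decayed_update(weight, gradient, step, decay_rate):
        # One gradient-descent step plus a shrinkage term; the extra
        # decay_rate * weight is the derivative of the L2 penalty
        # (decay_rate / 2) * weight ** 2.
        return weight - step * (gradient + decay_rate * weight)

For example, with weight=0.5, gradient=0.1, step=0.1 and decay_rate=0.1, a plain gradient step gives 0.49, while the decayed update gives 0.485.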

Examples

>>> from neupy import algorithms
>>> # Attach WeightDecay as an add-on to gradient descent;
>>> # decay_rate is the option contributed by the add-on.
>>> bpnet = algorithms.GradientDescent(
...     (2, 4, 1),
...     step=0.1,
...     decay_rate=0.1,
...     addons=[algorithms.WeightDecay]
... )
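
Note that decay_rate is passed directly to the GradientDescent constructor rather than to WeightDecay itself: each class listed in addons mixes its own options into the constructed algorithm, which is also why decay_rate appears in the options attribute below.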
decay_rate = None

init_train_updates()

options = {'decay_rate': Option(class_name='WeightDecay', value=BoundedProperty(name="decay_rate"))}
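
To make the members above concrete, here is a hedged, self-contained sketch of what init_train_updates conceptually does. NeuPy's actual method takes no arguments and works on symbolic update expressions, which are omitted here; the class, signature, and argument names below are purely illustrative:

    class WeightDecaySketch:
        # Illustrative stand-in for the add-on, not the real class.
        def __init__(self, decay_rate=0.1):
            self.decay_rate = decay_rate

        def init_train_updates(self, base_updates, step):
            # base_updates: (parameter, updated_value) pairs produced by
            # the wrapped algorithm. Each update gets an extra term that
            # pulls the parameter toward zero.
            return [
                (param, updated - step * self.decay_rate * param)
                for param, updated in base_updates
            ]

With a single weight 0.5 and a plain gradient-descent update of 0.49 (step 0.1, gradient 0.1), the decayed update becomes 0.49 - 0.1 * 0.1 * 0.5 = 0.485, matching the rule sketched earlier.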