neupy.algorithms.l2

neupy.algorithms.l2(weight, decay_rate=0.01)

Applies l2 regularization to the trainable parameters in the network.

The regularization cost per weight parameter in a layer can be computed as follows (pseudocode):

cost = decay_rate * sum(weight ** 2)
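
As a concrete illustration of the pseudocode above (not NeuPy's internal code), the snippet below computes the penalty contributed by a single, made-up weight matrix using plain NumPy; the total regularization cost sums this term over every trainable parameter that is not excluded.

import numpy as np

decay_rate = 0.01
weight = np.random.randn(5, 10)  # made-up weights for a layer with 5 inputs and 10 outputs

# penalty contributed by this single parameter
cost = decay_rate * np.sum(weight ** 2)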

Parameters:
decay_rate : float

Controls the strength of the penalty applied during parameter updates. The larger the value, the stronger the regularization effect during training. Defaults to 0.01.

exclude : list

List of parameter names that have to be excluded from the regularization. Defaults to ['bias'].
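
The following sketch is not NeuPy's implementation; it only illustrates, under the assumption that the trainable parameters are available as a name-to-array mapping, how an exclude list such as the default ['bias'] keeps bias vectors out of the penalty.

import numpy as np

def total_l2_cost(parameters, decay_rate=0.01, exclude=('bias',)):
    # Sum decay_rate * sum(weight ** 2) over every parameter
    # whose name is not listed in exclude.
    return sum(
        decay_rate * np.sum(value ** 2)
        for name, value in parameters.items()
        if name not in exclude
    )

parameters = {'weight': np.random.randn(5, 10), 'bias': np.zeros(10)}
total_l2_cost(parameters)               # bias skipped, matching the default
total_l2_cost(parameters, exclude=())   # bias penalized as well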

Examples

>>> from neupy import algorithms
>>> from neupy.layers import *
>>>
>>> optimizer = algorithms.Momentum(
...     Input(5) >> Relu(10) >> Sigmoid(1),
...     regularizer=algorithms.l2(decay_rate=0.01)
... )

With regularization also applied to the bias parameters:

>>> from neupy import algorithms
>>> from neupy.layers import *
>>>
>>> optimizer = algorithms.Momentum(
...     Input(5) >> Relu(10) >> Sigmoid(1),
...     regularizer=algorithms.l2(decay_rate=0.01, exclude=[])
... )