neupy.algorithms.l1

neupy.algorithms.l1(weight, decay_rate=0.01)

Applies l1 regularization to the trainable parameters in the network.

The regularization cost per weight parameter in the layer can be computed in the following way (pseudocode):

cost = decay_rate * sum(abs(weight))
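
For illustration, here is a minimal NumPy sketch of the same computation (the sample weight matrix and the use of NumPy are assumptions made for demonstration only; they are not part of the neupy API):

>>> import numpy as np
>>>
>>> weight = np.array([[0.5, -1.0], [2.0, -0.25]])  # hypothetical weight matrix
>>> decay_rate = 0.01
>>>
>>> # cost = decay_rate * sum(abs(weight))
>>> cost = decay_rate * np.sum(np.abs(weight))
>>> float(cost)
0.0375
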
Parameters:
decay_rate : float

Controls the strength of the penalty applied during parameter updates. The larger the value, the stronger the effect regularization has during training. Defaults to 0.01.

exclude : list

List of parameter names that have to be excluded from the regularization. Defaults to ['bias'].

Examples

>>> from neupy import algorithms
>>> from neupy.layers import *
>>>
>>> optimizer = algorithms.Momentum(
...     Input(5) >> Relu(10) >> Sigmoid(1),
...     regularizer=algorithms.l1(decay_rate=0.01)
... )

With regularization also applied to the bias parameters

>>> from neupy import algorithms
>>> from neupy.layers import *
>>>
>>> optimizer = algorithms.Momentum(
...     Input(5) >> Relu(10) >> Sigmoid(1),
...     regularizer=algorithms.l1(decay_rate=0.01, exclude=[])
... )