neupy.algorithms.maxnorm
- neupy.algorithms.maxnorm(weight, decay_rate=0.01)[source]
Applies max-norm regularization to the trainable parameters in the network. Also known as l-inf regularization.
Regularization cost per weight parameter in the layer can be computed in the following way (pseudocode).
cost = decay_rate * max(abs(weight))
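As a minimal illustration of the pseudocode above (a standalone sketch using NumPy, independent of the neupy API; the weight values are made up for the example):

```python
import numpy as np

# Hypothetical weight matrix; the max-norm (l-inf) cost is the decay
# rate multiplied by the largest absolute value among the weights.
weight = np.array([[0.5, -2.0],
                   [1.5,  0.25]])
decay_rate = 0.01

cost = decay_rate * np.max(np.abs(weight))
print(cost)  # 0.02
```

Note that, unlike l1 or l2 regularization, only the single largest-magnitude weight contributes to the penalty.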
Parameters: - decay_rate : float
Controls the strength of the regularization penalty applied during parameter updates. The larger the value, the stronger the regularization effect during training. Defaults to 0.01.
- exclude : list
List of parameter names that have to be excluded from the regularization. Defaults to ['bias'].
Examples
>>> from neupy import algorithms
>>> from neupy.layers import *
>>>
>>> optimizer = algorithms.Momentum(
...     Input(5) >> Relu(10) >> Sigmoid(1),
...     regularizer=algorithms.maxnorm(decay_rate=0.01)
... )
The same example with the bias parameters included in the regularization:
>>> from neupy import algorithms
>>> from neupy.layers import *
>>>
>>> optimizer = algorithms.Momentum(
...     Input(5) >> Relu(10) >> Sigmoid(1),
...     regularizer=algorithms.maxnorm(decay_rate=0.01, exclude=[])
... )