neupy.algorithms.regularization.max_norm module

class neupy.algorithms.regularization.max_norm.MaxNormRegularization

Max-norm regularization clips the norm of a parameter whenever it exceeds the maximum limit:

if norm(weight) > max_norm:
    weight = max_norm * weight / norm(weight)
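
The same rule in runnable NumPy, as a minimal sketch (the clip_norm helper and the sample vector are illustrative, not part of NeuPy's API):

import numpy as np

def clip_norm(weight, max_norm):
    # Rescale the parameter so its L2 norm never exceeds max_norm.
    norm = np.linalg.norm(weight)
    if norm > max_norm:
        weight = max_norm * weight / norm
    return weight

weight = np.array([3.0, 4.0])           # norm(weight) == 5
print(clip_norm(weight, max_norm=4.0))  # [2.4 3.2], norm == 4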

Parameters:

max_norm : int, float

Any parameter whose norm is greater than this value will be clipped back to this norm. Defaults to 10.

Warns:

It works with any algorithm based on backpropagation.

References

[1] N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, R. Salakhutdinov. Dropout: A Simple Way to Prevent Neural Networks from Overfitting. JMLR 15, 2014. http://jmlr.org/papers/volume15/srivastava14a/srivastava14a.pdf

Examples

>>> from neupy import algorithms
>>> bpnet = algorithms.GradientDescent(
...     (2, 4, 1),
...     step=0.1,
...     max_norm=4,
...     addons=[algorithms.MaxNormRegularization]
... )
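
During training, every parameter whose norm exceeds max_norm is rescaled back to that norm after each update. A hedged usage sketch, assuming x_train and y_train are NumPy arrays shaped for a 2-input, 1-output network (illustrative names, not from the original docs):

>>> bpnet.train(x_train, y_train, epochs=100)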
init_train_updates()

max_norm = None

options = {'max_norm': Option(class_name='MaxNormRegularization', value=NumberProperty(name="max_norm"))}