# neupy.algorithms.MaxNormRegularization

class neupy.algorithms.MaxNormRegularization

The max-norm regularization algorithm clips the norm of a parameter whenever it exceeds the maximum limit:

```python
if norm(weight) > max_norm:
    weight = max_norm * weight / norm(weight)
```
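The rule above can be sketched in plain NumPy. This is a minimal illustration of the clipping step, not NeuPy's internal implementation; the `clip_norm` helper name is invented here:

```python
import numpy as np

def clip_norm(weight, max_norm=10):
    """Rescale ``weight`` so that its L2 norm never exceeds ``max_norm``."""
    norm = np.linalg.norm(weight)
    if norm > max_norm:
        # Rescaling by max_norm / norm keeps the direction of the
        # parameter and makes its new norm exactly max_norm.
        weight = max_norm * weight / norm
    return weight

w = np.array([30.0, 40.0])      # norm = 50, exceeds the default limit of 10
clipped = clip_norm(w)          # -> array([6., 8.]), norm = 10
```

Parameters whose norm is already within the limit pass through unchanged, so the constraint only activates when weights grow too large.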


Parameters:

max_norm : int or float
    Any parameter whose norm exceeds this value will be clipped. Defaults to ``10``.

Works with any algorithm based on backpropagation.

References

[1] N. Srivastava, G. Hinton, A. Krizhevsky, I. Sutskever, R. Salakhutdinov.
    Dropout: A Simple Way to Prevent Neural Networks from Overfitting.
    http://jmlr.org/papers/volume15/srivastava14a/srivastava14a.pdf

Examples

>>> from neupy import algorithms