neupy.algorithms.regularization.weight_elimination module
- class neupy.algorithms.regularization.weight_elimination.WeightElimination[source]
The Weight Elimination algorithm penalizes large weights and limits the network's freedom, which helps address one possible cause of network overfitting.
Parameters: decay_rate : float
Controls the strength of the penalty applied to the weight updates. Defaults to 0.1.
zero_weight : float
Second important parameter for weight penalization. Defaults to 1. A small value can push all weights close to zero, while a large value makes the penalty's contribution to the weight updates less significant. In other words, with a bigger zero_weight the network tolerates larger weight values.
Warns: It works with any algorithm based on backpropagation.
See also
- WeightDecay
- Weight Decay penalty.
Notes
Before adding this regularization, choose the decay_rate and zero_weight parameters carefully for your problem. Poorly chosen values can pull the weights toward the origin (all values become close to zero).
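As a rough sketch of the mechanics, the penalty described in the Weigend et al. reference below has the form decay_rate * sum((w/w0)^2 / (1 + (w/w0)^2)), where w0 is the zero_weight parameter. The helper functions below are illustrative only and are not part of neupy's API; they show why a small zero_weight pulls small weights strongly toward zero while a large zero_weight leaves them mostly untouched:

```python
import numpy as np

def weight_elimination_penalty(weights, decay_rate=0.1, zero_weight=1.0):
    # Penalty term added to the loss (Weigend et al., 1991):
    # decay_rate * sum((w/w0)^2 / (1 + (w/w0)^2))
    ratio = (weights / zero_weight) ** 2
    return decay_rate * np.sum(ratio / (1.0 + ratio))

def weight_elimination_gradient(weights, decay_rate=0.1, zero_weight=1.0):
    # Derivative of the penalty with respect to each weight:
    # decay_rate * 2*w / w0^2 / (1 + (w/w0)^2)^2
    ratio = (weights / zero_weight) ** 2
    return decay_rate * 2 * weights / zero_weight ** 2 / (1.0 + ratio) ** 2

weights = np.array([0.05, 0.5, 5.0])
# With a small zero_weight, small weights feel a much stronger pull
# toward zero than with the default zero_weight of 1.
print(weight_elimination_gradient(weights, zero_weight=0.1))
print(weight_elimination_gradient(weights, zero_weight=1.0))
```

Note that the gradient vanishes for weights much larger than zero_weight, so unlike plain weight decay this penalty mostly eliminates small weights rather than shrinking large ones.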
References
- [1] Weigend, A. S.; Rumelhart, D. E. & Huberman, B. A. (1991), Generalization by Weight-Elimination with Application to Forecasting, in Richard P. Lippmann; John E. Moody & David S. Touretzky, ed., Advances in Neural Information Processing Systems, San Francisco, CA: Morgan Kaufmann, pp. 875–882.
Examples
>>> from neupy import algorithms
>>> bpnet = algorithms.GradientDescent(
...     (2, 4, 1),
...     step=0.1,
...     decay_rate=0.1,
...     addons=[algorithms.WeightElimination]
... )
- decay_rate = None[source]
- init_train_updates()[source]
- options = {'decay_rate': Option(class_name='WeightElimination', value=BoundedProperty(name="decay_rate")), 'zero_weight': Option(class_name='WeightElimination', value=BoundedProperty(name="zero_weight"))}[source]
- zero_weight = None[source]