neupy.algorithms.exponential_decay
- neupy.algorithms.exponential_decay(initial_value, reduction_freq, reduction_rate, staircase=False, start_iter=0, name='step')
Applies exponential decay to the learning rate. This function is a wrapper around TensorFlow's exponential_decay function.
\[\alpha_{t} = \alpha_{0} \cdot r^{\frac{t}{d}}\]where \(\alpha_{t}\) is the step at iteration \(t\), \(\alpha_{0}\) is its initial value, \(d\) is the reduction_freq and \(r\) is the reduction_rate.
step = initial_value * reduction_rate ^ (current_iteration / reduction_freq)
When staircase=True the \(\frac{t}{d}\) value is rounded down to the nearest integer:
step = initial_value * reduction_rate ^ floor(current_iteration / reduction_freq)
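The effect of the staircase flag can be illustrated with a small standalone sketch of the formula above (decayed_step is a hypothetical helper written for illustration, not part of NeuPy's API):
import math

def decayed_step(initial_value, reduction_freq, reduction_rate,
                 iteration, staircase=False):
    # Exponent of the decay term: current_iteration / reduction_freq,
    # rounded down when staircase mode is enabled.
    exponent = iteration / reduction_freq
    if staircase:
        exponent = math.floor(exponent)
    return initial_value * reduction_rate ** exponent

decayed_step(0.1, 1000, 0.95, 500)                   # ~0.0975, decays smoothly
decayed_step(0.1, 1000, 0.95, 500, staircase=True)   # 0.1, constant within the first 1000 iterations
decayed_step(0.1, 1000, 0.95, 1500, staircase=True)  # 0.095, one discrete drop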
Parameters: - initial_value : float
Initial value for the learning rate.
- reduction_freq : int
Controls how often the step is reduced. The larger the value, the more slowly the step decreases.
- reduction_rate : float
Controls the step reduction rate. The closer the value is to 1, the more slowly the step decreases.
- staircase : bool
If True decay the learning rate at discrete intervals. Defaults to False.
- start_iter : int
Start iteration. It has to be equal to 0 when the network starts training from scratch. Defaults to 0.
- name : str
Learning rate's variable name. Defaults to 'step'.
Notes
The step will be reduced faster when training batches are smaller, since smaller batches mean more iterations per epoch.
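As a rough illustration with hypothetical numbers, reusing the decayed_step sketch above: 10,000 training samples give 100 iterations per epoch at batch size 100, but 1,000 iterations per epoch at batch size 10, so after a single epoch the decay has progressed ten times further.
decayed_step(0.1, 1000, 0.95, 100)   # one epoch with batch size 100: ~0.0995
decayed_step(0.1, 1000, 0.95, 1000)  # one epoch with batch size 10: 0.095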
Examples
>>> from neupy import algorithms
>>> from neupy.layers import *
>>>
>>> optimizer = algorithms.Momentum(
...     Input(5) >> Relu(10) >> Sigmoid(1),
...     step=algorithms.exponential_decay(
...         initial_value=0.1,
...         reduction_freq=1000,
...         reduction_rate=0.95,
...     )
... )
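When training resumes rather than starts from scratch, the start_iter parameter can offset the iteration counter so the decay continues where it stopped; a sketch of how it might be set when resuming (the value 5000 is hypothetical):
>>> optimizer = algorithms.Momentum(
...     Input(5) >> Relu(10) >> Sigmoid(1),
...     step=algorithms.exponential_decay(
...         initial_value=0.1,
...         reduction_freq=1000,
...         reduction_rate=0.95,
...         start_iter=5000,  # continue decay as if 5000 iterations had passed
...     )
... )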