# neupy.layers.normalization module

class neupy.layers.normalization.BatchNorm[source]

Batch normalization layer.

Parameters:

axes : tuple of ints or None
    Axes along which normalization will be applied. The None value means that normalization will be applied over all axes except the last one. For a 4D tensor this is equal to (0, 1, 2). Defaults to None.
epsilon : float
    Positive constant added to the standard deviation to prevent division by zero. Defaults to 1e-5.
alpha : float
    Coefficient for the exponential moving average of batch-wise means and standard deviations computed during training; the closer to one, the more the average depends on the most recent batches. Value needs to be between 0 and 1. Defaults to 0.1.
gamma : array-like, Tensorflow variable, scalar or Initializer
    Scale. Defaults to Constant(value=1).
beta : array-like, Tensorflow variable, scalar or Initializer
    Offset. Defaults to Constant(value=0).
running_mean : array-like, Tensorflow variable, scalar or Initializer
    Running mean of the batch statistics. Defaults to Constant(value=0).
running_inv_std : array-like, Tensorflow variable, scalar or Initializer
    Running inverse standard deviation of the batch statistics. Defaults to Constant(value=1).
name : str or None
    Layer's name; can be used as a reference to a specific layer. The name can be specified as:
    - String: the given name is used as a direct reference to the layer. For example, name="fc".
    - Format string: the name pattern is defined as a format string whose field is replaced with an index. For example, name="fc{}" produces fc1, fc2 and so on. More complex formatting is also acceptable; for example, name="fc-{:<03d}" produces fc-001, fc-002, fc-003 and so on.
    - None: the name is generated from the class name.
    Defaults to None.
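The training-time behavior described above can be sketched with NumPy: the layer normalizes each batch with its own statistics and keeps exponential moving averages of them, weighted by alpha. This is an illustration of the math only, not neupy's implementation; the function name is hypothetical.

```python
import numpy as np

def batch_norm_train(x, gamma, beta, running_mean, running_inv_std,
                     alpha=0.1, epsilon=1e-5, axes=(0,)):
    """One training-time batch normalization step on a (batch, features) array."""
    mean = x.mean(axis=axes)
    var = x.var(axis=axes)
    inv_std = 1.0 / np.sqrt(var + epsilon)
    # exponential moving averages: alpha closer to 1 favors the most recent batches
    running_mean = (1 - alpha) * running_mean + alpha * mean
    running_inv_std = (1 - alpha) * running_inv_std + alpha * inv_std
    output = gamma * (x - mean) * inv_std + beta
    return output, running_mean, running_inv_std

x = np.random.RandomState(0).randn(32, 5)
out, new_mean, new_inv_std = batch_norm_train(
    x, gamma=1.0, beta=0.0,
    running_mean=np.zeros(5), running_inv_std=np.ones(5))
```

After this step each feature column of `out` has approximately zero mean and unit variance; at inference time the stored running statistics are used in place of batch statistics.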

References

 [1] Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift, http://arxiv.org/pdf/1502.03167v3.pdf

Examples

Feedforward Neural Network (FNN) with batch normalization applied after the activation function.

>>> from neupy.layers import *
>>> network = join(
...     Input(10),
...     Relu(5) >> BatchNorm(),
...     Relu(5) >> BatchNorm(),
...     Sigmoid(1),
... )


Feedforward Neural Network (FNN) with batch normalization applied before the activation function.

>>> from neupy.layers import *
>>> network = join(
...     Input(10),
...     Linear(5) >> BatchNorm() >> Relu(),
...     Linear(5) >> BatchNorm() >> Relu(),
...     Sigmoid(1),
... )


Convolutional Neural Networks (CNN)

>>> from neupy.layers import *
>>> network = join(
...     Input((28, 28, 1)),
...     Convolution((3, 3, 16)) >> BatchNorm() >> Relu(),
...     Convolution((3, 3, 16)) >> BatchNorm() >> Relu(),
...     Reshape(),
...     Softmax(10),
... )

Attributes:

variables : dict
    Variable names and their values. The dictionary can be empty if the variables haven't been created yet.

Methods

variable(value, name, shape=None, trainable=True)
    Initializes a variable with the specified values.
get_output_shape(input_shape)
    Computes the expected output shape of the layer based on the specified input shape.
output(*inputs, **kwargs)
    Propagates input through the layer. The kwargs variable might contain additional information that propagates through the network.
alpha = None[source]
axes = None[source]
beta = None[source]
create_variables(input_shape)[source]
epsilon = None[source]
gamma = None[source]
options = {'alpha': Option(class_name='BatchNorm', value=ProperFractionProperty(name="alpha")), 'axes': Option(class_name='BatchNorm', value=TypedListProperty(name="axes")), 'beta': Option(class_name='BatchNorm', value=ParameterProperty(name="beta")), 'epsilon': Option(class_name='BatchNorm', value=NumberProperty(name="epsilon")), 'gamma': Option(class_name='BatchNorm', value=ParameterProperty(name="gamma")), 'name': Option(class_name='BaseLayer', value=Property(name="name")), 'running_inv_std': Option(class_name='BatchNorm', value=ParameterProperty(name="running_inv_std")), 'running_mean': Option(class_name='BatchNorm', value=ParameterProperty(name="running_mean"))}[source]
output(input, training=False)[source]
running_inv_std = None[source]
running_mean = None[source]
class neupy.layers.normalization.LocalResponseNorm[source]

Local Response Normalization Layer.

Aggregation is purely across channels, not within channels, and is performed "pixelwise".

If the value of the $i$-th channel is $x_i$, the output is

$$y_i = \frac{x_i}{\left(k + \alpha \sum_j x_j^2\right)^{\beta}}$$

where the summation is performed over the $n$ neighboring channels at the same spatial position.
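The formula can be sketched with NumPy, summing squares over a window of neighboring channels centered on each channel. This illustrates the math only, not neupy's implementation; the function name is hypothetical, and it assumes depth_radius gives the full (odd) window width.

```python
import numpy as np

def local_response_norm(x, depth_radius=5, k=2.0, alpha=1e-4, beta=0.75):
    """Apply LRN over the channel (last) axis of a (height, width, channels) array."""
    half = depth_radius // 2
    channels = x.shape[-1]
    out = np.empty_like(x)
    for i in range(channels):
        # window of up to depth_radius channels centered on channel i,
        # clipped at the channel boundaries
        lo, hi = max(0, i - half), min(channels, i + half + 1)
        denom = (k + alpha * np.sum(x[..., lo:hi] ** 2, axis=-1)) ** beta
        out[..., i] = x[..., i] / denom
    return out

x = np.ones((4, 4, 8))
y = local_response_norm(x)
```

For an all-ones input, an interior channel is divided by (k + alpha * depth_radius)^beta, while edge channels sum over fewer neighbors.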

Parameters:

alpha : float
    Coefficient $\alpha$ in the equation above. Defaults to 1e-4.
beta : float
    Exponent $\beta$ in the equation above. Defaults to 0.75.
k : float
    Additive constant $k$ in the equation above. Defaults to 2.
depth_radius : int
    Number of adjacent channels to normalize over; must be odd. Defaults to 5.
name : str or None
    Layer's name; can be used as a reference to a specific layer. The name can be specified as:
    - String: the given name is used as a direct reference to the layer. For example, name="fc".
    - Format string: the name pattern is defined as a format string whose field is replaced with an index. For example, name="fc{}" produces fc1, fc2 and so on. More complex formatting is also acceptable; for example, name="fc-{:<03d}" produces fc-001, fc-002, fc-003 and so on.
    - None: the name is generated from the class name.
    Defaults to None.

Examples

>>> from neupy.layers import *
>>> network = Input((10, 10, 12)) >> LocalResponseNorm()

Attributes:

variables : dict
    Variable names and their values. The dictionary can be empty if the variables haven't been created yet.

Methods

variable(value, name, shape=None, trainable=True)
    Initializes a variable with the specified values.
get_output_shape(input_shape)
    Computes the expected output shape of the layer based on the specified input shape.
output(*inputs, **kwargs)
    Propagates input through the layer. The kwargs variable might contain additional information that propagates through the network.
alpha = None[source]
beta = None[source]
get_output_shape(input_shape)[source]
k = None[source]
options = {'alpha': Option(class_name='LocalResponseNorm', value=NumberProperty(name="alpha")), 'beta': Option(class_name='LocalResponseNorm', value=NumberProperty(name="beta")), 'depth_radius': Option(class_name='LocalResponseNorm', value=IntProperty(name="depth_radius")), 'k': Option(class_name='LocalResponseNorm', value=NumberProperty(name="k")), 'name': Option(class_name='BaseLayer', value=Property(name="name"))}[source]
output(input, **kwargs)[source]
class neupy.layers.normalization.GroupNorm[source]

Group Normalization layer. This layer is a simple alternative to the Batch Normalization layer for cases when the batch size is small.

Parameters:

n_groups : int
    During normalization, all channels are broken down into separate groups, and the mean and variance are estimated per group. This parameter controls the number of groups.
gamma : array-like, Tensorflow variable, scalar or Initializer
    Scale. Defaults to Constant(value=1).
beta : array-like, Tensorflow variable, scalar or Initializer
    Offset. Defaults to Constant(value=0).
epsilon : float
    Epsilon ensures that rescaling the input with the estimated variance never causes division by zero. Defaults to 1e-5.
name : str or None
    Layer's name; can be used as a reference to a specific layer. The name can be specified as:
    - String: the given name is used as a direct reference to the layer. For example, name="fc".
    - Format string: the name pattern is defined as a format string whose field is replaced with an index. For example, name="fc{}" produces fc1, fc2 and so on. More complex formatting is also acceptable; for example, name="fc-{:<03d}" produces fc-001, fc-002, fc-003 and so on.
    - None: the name is generated from the class name.
    Defaults to None.
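The per-group estimation described above can be sketched with NumPy: channels are reshaped into groups, and each group is normalized with its own mean and variance across all spatial positions. This is an illustration of the math only, not neupy's implementation; the function name is hypothetical.

```python
import numpy as np

def group_norm(x, n_groups, gamma=1.0, beta=0.0, epsilon=1e-5):
    """Group-normalize a (height, width, channels) array over channel groups."""
    h, w, c = x.shape
    assert c % n_groups == 0, "channels must be divisible by n_groups"
    # split channels into n_groups groups of c // n_groups channels each
    grouped = x.reshape(h, w, n_groups, c // n_groups)
    # statistics are computed per group, over all spatial positions
    # and all channels within the group (independent of batch size)
    mean = grouped.mean(axis=(0, 1, 3), keepdims=True)
    var = grouped.var(axis=(0, 1, 3), keepdims=True)
    normalized = (grouped - mean) / np.sqrt(var + epsilon)
    return gamma * normalized.reshape(h, w, c) + beta

x = np.random.RandomState(1).randn(8, 8, 16)
y = group_norm(x, n_groups=4)
```

Because the statistics never involve the batch dimension, the result is independent of batch size, which is why group normalization remains stable where batch normalization degrades.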

References

 [1] Group Normalization, Yuxin Wu, Kaiming He, https://arxiv.org/pdf/1803.08494.pdf

Examples

Convolutional Neural Networks (CNN)

>>> from neupy.layers import *
>>> network = join(
...     Input((28, 28, 1)),
...     Convolution((3, 3, 16)) >> GroupNorm(4) >> Relu(),
...     Convolution((3, 3, 16)) >> GroupNorm(4) >> Relu(),
...     Reshape(),
...     Softmax(10),
... )

Attributes:

variables : dict
    Variable names and their values. The dictionary can be empty if the variables haven't been created yet.

Methods

variable(value, name, shape=None, trainable=True)
    Initializes a variable with the specified values.
get_output_shape(input_shape)
    Computes the expected output shape of the layer based on the specified input shape.
output(*inputs, **kwargs)
    Propagates input through the layer. The kwargs variable might contain additional information that propagates through the network.
beta = None[source]
create_variables(input_shape)[source]
epsilon = None[source]
gamma = None[source]
get_output_shape(input_shape)[source]
n_groups = None[source]
options = {'beta': Option(class_name='GroupNorm', value=ParameterProperty(name="beta")), 'epsilon': Option(class_name='GroupNorm', value=NumberProperty(name="epsilon")), 'gamma': Option(class_name='GroupNorm', value=ParameterProperty(name="gamma")), 'n_groups': Option(class_name='GroupNorm', value=IntProperty(name="n_groups")), 'name': Option(class_name='BaseLayer', value=Property(name="name"))}[source]
output(input)[source]