neupy.layers.PRelu

class neupy.layers.PRelu[source]

Layer with the parametrized ReLU as an activation function. The layer learns an additional alpha parameter during training.

It applies a linear transformation when the n_units parameter is specified, followed by the parametrized relu function. When n_units is not specified, only the parametrized relu function is applied to the input.
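For reference, the activation itself is easy to sketch in plain NumPy (the `prelu` helper below is illustrative and not part of neupy; in the real layer alpha is a trainable Tensorflow variable):

```python
import numpy as np

def prelu(x, alpha):
    # Parametrized ReLU: f(x) = x when x >= 0, alpha * x otherwise.
    # alpha broadcasts against x, so it can be a scalar or a
    # per-feature vector.
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, x, alpha * x)

print(prelu([-2.0, -0.5, 0.0, 3.0], alpha=0.25))
```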

Parameters:
alpha_axes : int or tuple

Axes along which a separate alpha parameter will be learned; alpha is shared across all other axes. A single integer value is equivalent to a tuple with one value. Defaults to -1.

alpha : array-like, Tensorflow variable, scalar or Initializer

Alpha parameter for the relu, with one value per element along each axis listed in alpha_axes. A scalar value means that each element in the tensor will be set to the specified value. You can find default initialization methods here. Defaults to Constant(value=0.25).

n_units : int or None

Number of units in the layer. It also corresponds to the number of output features produced per sample after passing it through this layer. The None value means that the layer will have no weight and bias parameters and will only apply the activation function to the input, without a linear transformation. Defaults to None.

weight : array-like, Tensorflow variable, scalar or Initializer

Defines the layer's weights. You can find default initialization methods here. Defaults to HeNormal().

bias : 1D array-like, Tensorflow variable, scalar, Initializer or None

Defines the layer's bias. You can find default initialization methods here. Defaults to Constant(0). The None value excludes bias from the calculations and does not add it to the parameters list.

name : str or None

Layer's name. Can be used as a reference to a specific layer. When the value is None, the name will be generated from the class name. Defaults to None.
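The broadcasting behaviour implied by alpha and alpha_axes can be sketched with NumPy (shapes here are illustrative; the layer creates and broadcasts the alpha variable itself):

```python
import numpy as np

# A batch of images shaped (batch, height, width, channels).
x = np.random.randn(2, 4, 4, 3)

# With the default alpha_axes=-1, a separate alpha is learned
# per channel, so alpha has shape (3,) and is shared across the
# batch and spatial axes via broadcasting.
alpha = np.full(3, 0.25)

out = np.where(x >= 0, x, alpha * x)
print(out.shape)  # (2, 4, 4, 3)
```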

References

[1] Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification. https://arxiv.org/pdf/1502.01852v1.pdf

Examples

Feedforward Neural Networks (FNN)

>>> from neupy.layers import *
>>> network = Input(10) >> PRelu(20) >> PRelu(1)

Convolutional Neural Networks (CNN)

>>> from neupy.layers import *
>>> network = join(
...     Input((32, 32, 3)),
...     Convolution((3, 3, 16)) >> PRelu(),
...     Convolution((3, 3, 32)) >> PRelu(),
...     Reshape(),
...     Softmax(10),
... )
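A single PRelu(20) layer from the FNN example above, applied to an input with 10 features, is roughly equivalent to the following NumPy sketch (the parameter values are illustrative stand-ins for the HeNormal(), Constant(0) and Constant(0.25) defaults):

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_units = 10, 20
x = rng.normal(size=(5, n_in))         # batch of 5 samples

weight = rng.normal(size=(n_in, n_units))
bias = np.zeros(n_units)
alpha = np.full(n_units, 0.25)         # one alpha per output unit

z = x @ weight + bias                  # linear transformation
out = np.where(z >= 0, z, alpha * z)   # parametrized relu
print(out.shape)  # (5, 20)
```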

Attributes:
variables : dict

Variable names and their values. The dictionary can be empty if the variables haven't been created yet.

Methods

variable(value, name, shape=None, trainable=True) Initializes variable with specified values.
get_output_shape(input_shape) Computes expected output shape from the layer based on the specified input shape.
output(*inputs, **kwargs) Propagates input through the layer. The kwargs variable might contain additional information that propagates through the network.
activation_function(input) Applies activation function to the input.
activation_function(input)[source]
alpha = None[source]
alpha_axes = None[source]
create_variables(input_shape)[source]
get_output_shape(input_shape)[source]
options = {'alpha': Option(class_name='PRelu', value=ParameterProperty(name="alpha")), 'alpha_axes': Option(class_name='PRelu', value=TypedListProperty(name="alpha_axes")), 'bias': Option(class_name='Linear', value=ParameterProperty(name="bias")), 'n_units': Option(class_name='Linear', value=IntProperty(name="n_units")), 'name': Option(class_name='BaseLayer', value=Property(name="name")), 'weight': Option(class_name='Linear', value=ParameterProperty(name="weight"))}[source]