neupy.layers.PRelu

class neupy.layers.PRelu[source]

The layer with the parametrized ReLu activation function.

Parameters:
alpha_axes : int or tuple

Axes along which the layer keeps a unique alpha parameter. A single integer value means the same as a tuple with one value. Defaults to -1.

alpha : array-like, Tensorflow variable, scalar or Initializer

Alpha parameter for each non-shared axis of the ReLu. A scalar value means that every element in the tensor will be equal to the specified value. You can find the default initialization methods here. Defaults to Constant(value=0.25).

size : int or None

Layer input size. The None value means that the layer will not create parameters and will only return the activation function output for the specified input value.

weight : array-like, Tensorflow variable, scalar or Initializer

Defines the layer’s weights. You can find the default initialization methods here. Defaults to HeNormal().

bias : 1D array-like, Tensorflow variable, scalar, Initializer or None

Defines the layer’s bias. You can find the default initialization methods here. Defaults to Constant(0). The None value excludes the bias from the calculations and does not add it to the parameters list.

name : str or None

Layer’s identifier. If name is equal to None, then the name will be generated automatically. Defaults to None.
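The interaction between alpha and alpha_axes can be illustrated with a small NumPy sketch of the PReLU computation, f(x) = max(0, x) + alpha * min(0, x). This mirrors the activation's math, not NeuPy's internal implementation; alpha is unique along the axes listed in alpha_axes and broadcast over the rest.

```python
import numpy as np

def prelu(x, alpha):
    # PReLU: identity for positive values, alpha-scaled slope for negatives
    x = np.asarray(x, dtype=float)
    return np.maximum(x, 0) + alpha * np.minimum(x, 0)

# Scalar alpha, matching the default Constant(value=0.25)
out = prelu(np.array([-2.0, 3.0]), alpha=0.25)  # -> [-0.5, 3.0]

# With alpha_axes=-1 the layer keeps one alpha per unit on the last axis,
# so for an input of shape (batch, 20) alpha has shape (20,) and
# broadcasts over the batch dimension.
x = np.random.randn(8, 20)
alpha = np.full(20, 0.25)
assert prelu(x, alpha).shape == (8, 20)
```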

References

[1]https://arxiv.org/pdf/1502.01852v1.pdf

Examples

Feedforward Neural Networks (FNN)

>>> from neupy.layers import *
>>> network = Input(10) > PRelu(20) > PRelu(1)

Convolutional Neural Networks (CNN)

>>> from neupy.layers import *
>>> network = join(
...     Input((32, 32, 3)),
...     Convolution((3, 3, 16)) > PRelu(),
...     Convolution((3, 3, 32)) > PRelu(),
...     Reshape(),
...     Softmax(10),
... )
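As a rough sanity check on the shapes in the CNN example above, assuming the Convolution layer uses a stride of 1 and no padding, each 3x3 convolution shrinks every spatial dimension by 2:

```python
def conv_output_hw(height, width, kernel=3):
    # 'valid' convolution, stride 1: each spatial dim shrinks by kernel - 1
    return height - kernel + 1, width - kernel + 1

# Input((32, 32, 3)) -> Convolution((3, 3, 16)) gives (30, 30, 16)
print(conv_output_hw(32, 32))  # (30, 30)
# -> Convolution((3, 3, 32)) gives (28, 28, 32); Reshape then flattens
# it into a vector of 28 * 28 * 32 = 25088 values before Softmax(10).
print(conv_output_hw(30, 30))  # (28, 28)
```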

Attributes:
input_shape : tuple

Returns layer’s input shape in the form of a tuple. Shape will not include batch size dimension.

output_shape : tuple

Returns layer’s output shape in the form of a tuple. Shape will not include batch size dimension.

training_state : bool

Defines whether the layer is in the training state or not. The training state enables some operations inside the layers that won’t work otherwise.

parameters : dict

Parameters that the network uses during propagation. It might include trainable and non-trainable parameters.

graph : LayerGraph instance

Graph that stores all relations between layers.

Methods

disable_training_state()    Context manager that switches off the training state.
initialize()                Set up important configurations related to the layer.
activation_function(input_value)[source]
alpha = None[source]
alpha_axes = None[source]
initialize()[source]

Initialize the connection.

options = {'alpha': Option(class_name='PRelu', value=ParameterProperty(name="alpha")), 'alpha_axes': Option(class_name='PRelu', value=AxesProperty(name="alpha_axes")), 'bias': Option(class_name='ParameterBasedLayer', value=ParameterProperty(name="bias")), 'name': Option(class_name='BaseLayer', value=Property(name="name")), 'size': Option(class_name='ActivationLayer', value=IntProperty(name="size")), 'weight': Option(class_name='ParameterBasedLayer', value=ParameterProperty(name="weight"))}[source]
validate(input_shape)[source]

Validate the input shape value before assigning it.

Parameters:
input_shape : tuple with int