neupy.layers.merge module

class neupy.layers.merge.Elementwise[source]

Merges multiple inputs with an elementwise function and generates a single output. Every input to this layer must have exactly the same shape; otherwise the elementwise operation cannot be applied.

Parameters:
merge_function : callable or {add, multiply}

Callable object that accepts two inputs and combines their values with an elementwise operation.

  • add - Sums all the inputs. Alias for tf.add.
  • multiply - Multiplies all the inputs. Alias for tf.multiply.
  • A custom function must accept two input arguments, for example:
def subtraction(x, y):
    return x - y

Defaults to add.

name : str or None

Layer’s name. Can be used as a reference to a specific layer. When the value is None, the name will be generated from the class name. Defaults to None.

Examples

>>> from neupy.layers import *
>>> network = (Input(10) | Input(10)) >> Elementwise('add')
[(?, 10), (?, 10)] -> [... 3 layers ...] -> (?, 10)
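The elementwise merge semantics can be sketched with plain NumPy (a sketch only; NeuPy itself runs these operations through TensorFlow):

```python
import numpy as np

# Two inputs with exactly the same shape, as Elementwise requires.
x = np.array([1.0, 2.0, 3.0])
y = np.array([10.0, 20.0, 30.0])

# merge_function='add' sums the inputs element by element.
added = x + y  # [11., 22., 33.]

# A custom merge_function just needs to accept two arguments.
def subtraction(a, b):
    return a - b

diff = subtraction(x, y)  # [-9., -18., -27.]
```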
Attributes:
variables : dict

Variable names and their values. Dictionary can be empty in case if variables hasn’t been created yet.

Methods

variable(value, name, shape=None, trainable=True) Initializes variable with specified values.
get_output_shape(input_shape) Computes expected output shape from the layer based on the specified input shape.
output(*inputs, **kwargs) Propagates input through the layer. The kwargs variable might contain additional information that propagates through the network.
get_output_shape(*input_shapes)[source]
merge_function = None[source]
options = {'merge_function': Option(class_name='Elementwise', value=FunctionWithOptionsProperty(name="merge_function")), 'name': Option(class_name='BaseLayer', value=Property(name="name"))}[source]
output(*inputs, **kwargs)[source]
class neupy.layers.merge.Concatenate[source]

Concatenates multiple inputs into one. Inputs will be concatenated over the specified axis (controlled with the axis parameter).

Parameters:
axis : int

The axis along which the inputs will be concatenated. Defaults to -1.

name : str or None

Layer’s name. Can be used as a reference to a specific layer. When the value is None, the name will be generated from the class name. Defaults to None.

Examples

>>> from neupy.layers import *
>>> network = (Input(10) | Input(20)) >> Concatenate()
[(?, 10), (?, 20)] -> [... 3 layers ...] -> (?, 30)
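The shape arithmetic above can be sketched with NumPy's concatenate (a sketch; the layer itself delegates to TensorFlow): all dimensions except the chosen axis must agree, and the sizes along that axis are summed.

```python
import numpy as np

# A batch of 4 samples with 10 and 20 features respectively.
a = np.zeros((4, 10))
b = np.zeros((4, 20))

# Concatenate over the last axis (axis=-1), matching the layer default.
merged = np.concatenate([a, b], axis=-1)

# Feature dimensions add up: 10 + 20 == 30.
merged.shape  # (4, 30)
```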
Attributes:
variables : dict

Variable names and their values. Dictionary can be empty in case if variables hasn’t been created yet.

Methods

variable(value, name, shape=None, trainable=True) Initializes variable with specified values.
get_output_shape(input_shape) Computes expected output shape from the layer based on the specified input shape.
output(*inputs, **kwargs) Propagates input through the layer. The kwargs variable might contain additional information that propagates through the network.
axis = None[source]
get_output_shape(*input_shapes)[source]
options = {'axis': Option(class_name='Concatenate', value=IntProperty(name="axis")), 'name': Option(class_name='BaseLayer', value=Property(name="name"))}[source]
output(*inputs, **kwargs)[source]
class neupy.layers.merge.GatedAverage[source]

Applies weighted elementwise addition to multiple network outputs. The weights are controlled by a separate input known as the gate. The number of outputs from the gate has to be equal to the number of networks, since each gate value serves as the weight for one network.

The layer expects the gate as the first input, but this can be changed with the gate_index parameter.

Parameters:
gate_index : int

Input layers are passed as a list, and this parameter specifies the index of the gating network in that list. Defaults to 0, which means the gating layer is expected in the first position.

name : str or None

Layer’s name. Can be used as a reference to a specific layer. When the value is None, the name will be generated from the class name. Defaults to None.

Examples

>>> from neupy.layers import *
>>>
>>> gate = Input(10) >> Softmax(2)
>>> net1 = Input(20) >> Relu(10)
>>> net2 = Input(20) >> Relu(20) >> Relu(10)
>>>
>>> network = (gate | net1 | net2) >> GatedAverage()
>>> network
[(10,), (20,), (20,)] -> [... 8 layers ...] -> 10
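The gated average itself can be sketched with NumPy (a sketch of the weighted elementwise addition the layer performs; actual computation happens in TensorFlow):

```python
import numpy as np

# The gate produces one weight per network; a softmax output sums to 1.
gate = np.array([0.25, 0.75])

# Two network outputs with identical shapes.
out1 = np.array([[1.0, 2.0], [3.0, 4.0]])
out2 = np.array([[5.0, 6.0], [7.0, 8.0]])

# Each network's output is scaled by its gate value, then summed.
merged = gate[0] * out1 + gate[1] * out2  # [[4., 5.], [6., 7.]]
```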
Attributes:
variables : dict

Variable names and their values. Dictionary can be empty in case if variables hasn’t been created yet.

Methods

variable(value, name, shape=None, trainable=True) Initializes variable with specified values.
get_output_shape(input_shape) Computes expected output shape from the layer based on the specified input shape.
output(*inputs, **kwargs) Propagates input through the layer. The kwargs variable might contain additional information that propagates through the network.
fail_if_shape_invalid(input_shapes)[source]
gate_index = None[source]
get_output_shape(*input_shapes)[source]
options = {'gate_index': Option(class_name='GatedAverage', value=IntProperty(name="gate_index")), 'name': Option(class_name='BaseLayer', value=Property(name="name"))}[source]
output(*inputs, **kwargs)[source]