neupy.architectures.mixture_of_experts
- neupy.architectures.mixture_of_experts(networks, gating_layer=None)[source]
Generates a mixture of experts architecture from a set of networks that share the same input and output shapes.
A mixture of experts learns how to mix results from different networks in order to achieve better performance. It adds a gating layer that, based on the input data, tries to figure out which of the networks will make the best contribution to the final result. The final result is a weighted mix of all networks' outputs: the higher a network's weight, the larger its contribution.
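The following is a minimal NumPy sketch of the mixing step described above, not the NeuPy implementation; the expert outputs and gating weights are random placeholders.

import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical outputs from 3 expert networks for one sample, each of shape (5,)
expert_outputs = [np.random.rand(5) for _ in range(3)]

# Gating weights: one per expert, normalized with a softmax so they sum to 1
gating_weights = softmax(np.random.rand(3))

# The final prediction is the weighted sum of the expert outputs
final_output = sum(w * out for w, out in zip(gating_weights, expert_outputs))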
Parameters: - networks : list of networks/layers
- gating_layer : None or layer
If the value is None, the following layer will be created:
gating_layer = layers.Softmax(len(networks))
The output from the gating layer should be 1D, with size equal to the number of networks. A custom gating layer can also be passed explicitly; see the sketch after this parameter list.
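For illustration, the call below passes the gating layer explicitly; with three expert networks this is equivalent to the default, and the expert definitions themselves are only placeholders.

from neupy import architectures, layers
from neupy.layers import Input, Relu, join

network = architectures.mixture_of_experts(
    networks=[
        join(Input(10), Relu(5)),
        join(Input(10), Relu(33), Relu(5)),
        join(Input(10), Relu(12), Relu(25), Relu(5)),
    ],
    # One gating output per expert network, same as the default above
    gating_layer=layers.Softmax(3),
)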
Returns: - network
Mixture of experts network that combines all networks into a single one and adds a gating layer to it.
Raises: - ValueError
If there is a problem with the input networks or with the custom gating layer, for example when the networks' shapes are incompatible (see the sketch below).
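As an illustration of the failure mode above, the networks in this sketch have different output shapes, which should trigger the ValueError; the exact error message is not guaranteed.

from neupy import architectures
from neupy.layers import Input, Relu, join

try:
    architectures.mixture_of_experts([
        join(Input(10), Relu(5)),
        join(Input(10), Relu(3)),  # output shape differs from the first expert
    ])
except ValueError as error:
    print(error)  # explains what is wrong with the provided networks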
Examples
>>> from neupy import algorithms, architectures
>>> from neupy.layers import *
>>>
>>> network = architectures.mixture_of_experts([
...     join(
...         Input(10),
...         Relu(5),
...     ),
...     join(
...         Input(10),
...         Relu(33),
...         Relu(5),
...     ),
...     join(
...         Input(10),
...         Relu(12),
...         Relu(25),
...         Relu(5),
...     ),
... ])
>>> network
(?, 10) -> [... 12 layers ...] -> (?, 5)
>>>
>>> optimizer = algorithms.Momentum(network, step=0.1)
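After construction, training works as for any other NeuPy network. The sketch below continues the example with random placeholder data shaped to match the (?, 10) input and (?, 5) output; the epoch count is arbitrary.

>>> import numpy as np
>>>
>>> # Placeholder data matching the (?, 10) input and (?, 5) output shapes
>>> x_train = np.random.random((100, 10))
>>> y_train = np.random.random((100, 5))
>>>
>>> optimizer.train(x_train, y_train, epochs=100)
>>> y_predicted = optimizer.predict(x_train)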