neupy.algorithms.rbm module

class neupy.algorithms.rbm.RBM[source]

Boolean/Bernoulli Restricted Boltzmann Machine (RBM). The algorithm assumes that inputs are either binary values or values between 0 and 1.

Parameters:

n_visible : int

Number of visible units.

n_hidden : int

Number of hidden units.

batch_size : int or {None, -1, ‘all’, ‘*’, ‘full’}

Mini-batch size. If the value is one of the special options listed above (for instance, 'full'), a single batch equal to the number of samples is used. Defaults to 128.

weight : array-like, Theano variable, Initializer or scalar

Default initialization methods can be found here. Defaults to XavierNormal.

hidden_bias : array-like, Theano variable, Initializer or scalar

Default initialization methods can be found here. Defaults to Constant(value=0).

visible_bias : array-like, Theano variable, Initializer or scalar

Default initialization methods can be found here. Defaults to Constant(value=0).

step : float

Learning rate, defaults to 0.1.

show_epoch : int or str

This property controls how often the network will display information about training. There are two main syntaxes for this property.

  • You can define it as a positive integer. It controls how often you would like to see summary output in the terminal. For instance, the number 100 means that the network shows a summary at the 100th, 200th, 300th ... epochs.
  • A string defines the number of times you want to see output in the terminal. For instance, the value '2 times' means that the network will show output twice, with an approximately equal number of epochs in between, plus one additional output after the final epoch.

Defaults to 1.

shuffle_data : bool

If True, all training data is shuffled before the network is trained. Defaults to True.

epoch_end_signal : function

This function is called when a training epoch finishes (see the constructor sketch after this parameter list).

train_end_signal : function

This function is called when the training process finishes.

verbose : bool

Controls verbose output in the terminal. True enables informative output in the terminal and False disables it. Defaults to False.
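
The following is a minimal constructor sketch that combines several of the options above; the values and the no-op callback are illustrative only, not recommended settings.

>>> from neupy import algorithms
>>>
>>> def on_epoch_end(network):
...     pass  # place custom per-epoch monitoring here
...
>>> rbm = algorithms.RBM(
...     n_visible=4,
...     n_hidden=2,
...     batch_size='full',       # a single batch with all samples
...     step=0.1,                # learning rate
...     show_epoch='2 times',    # two summaries plus one after the final epoch
...     shuffle_data=True,
...     epoch_end_signal=on_epoch_end,
...     verbose=False,
... )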

References

[1] G. Hinton, A Practical Guide to Training Restricted Boltzmann Machines, 2010. http://www.cs.toronto.edu/~hinton/absps/guideTR.pdf

Examples

>>> import numpy as np
>>> from neupy import algorithms
>>>
>>> data = np.array([
...     [1, 0, 1, 0],
...     [1, 0, 1, 0],
...     [1, 0, 0, 0],  # incomplete sample
...     [1, 0, 1, 0],
...
...     [0, 1, 0, 1],
...     [0, 0, 0, 1],  # incomplete sample
...     [0, 1, 0, 1],
...     [0, 1, 0, 1],
...     [0, 1, 0, 1],
...     [0, 1, 0, 1],
... ])
>>>
>>> rbm = algorithms.RBM(n_visible=4, n_hidden=1)
>>> rbm.train(data, epochs=100)
>>>
>>> hidden_states = rbm.visible_to_hidden(data)
>>> hidden_states.round(2)
array([[ 0.99],
       [ 0.99],
       [ 0.95],
       [ 0.99],
       [ 0.  ],
       [ 0.01],
       [ 0.  ],
       [ 0.  ],
       [ 0.  ],
       [ 0.  ]])
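
Continuing the example, the hidden representation can be mapped back to the visible layer and the training error can be inspected; concrete values are omitted here because training is stochastic.

>>> # Project hidden states back onto the visible units
>>> reconstruction = rbm.hidden_to_visible(hidden_states)
>>> reconstruction.shape
(10, 4)
>>>
>>> # Pseudo-likelihood based training error for the same data
>>> error = rbm.prediction_error(data)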

Methods

train(input_train, epochs=100) Trains the network.
fit(*args, **kwargs) Alias to the train method.
visible_to_hidden(visible_input) Propagates data through the network and returns output from the hidden layer.
hidden_to_visible(hidden_input) Propagates output from the hidden layer backward to the visible layer.
gibbs_sampling(visible_input, n_iter=1) Performs Gibbs sampling n times using the visible input.
gibbs_sampling(visible_input, n_iter=1)[source]

Performs Gibbs sampling n times starting from the visible input.

Parameters:

visible_input : 1d or 2d array

n_iter : int

Number of Gibbs sampling iterations. Defaults to 1.

Returns:

array-like

Output from the visible units after performing n Gibbs sampling steps. The array contains only binary values (0 and 1).
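
A possible call pattern, assuming the rbm and data objects from the Examples section above; the sampled values change from run to run, but the shape matches the visible layer.

>>> samples = rbm.gibbs_sampling(data, n_iter=10)
>>> samples.shape
(10, 4)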

hidden_bias = None[source]
hidden_to_visible(hidden_input)[source]

Propagates output from the hidden layer backward to the visible layer.

Parameters: hidden_input : array-like (n_samples, n_hidden_features)
Returns: array-like
init_input_output_variables()[source]

Initialize input and output Theano variables.

init_methods()[source]

Initialize Theano functions.

init_variables()[source]

Initialize Theano variables.

n_hidden = None[source]
n_visible = None[source]
options = {'verbose': Option(class_name='Verbose', value=VerboseProperty(name="verbose")), 'weight': Option(class_name='RBM', value=ParameterProperty(name="weight")), 'shuffle_data': Option(class_name='BaseNetwork', value=Property(name="shuffle_data")), 'batch_size': Option(class_name='MinibatchTrainingMixin', value=BatchSizeProperty(name="batch_size")), 'train_end_signal': Option(class_name='BaseNetwork', value=Property(name="train_end_signal")), 'step': Option(class_name='BaseNetwork', value=NumberProperty(name="step")), 'n_visible': Option(class_name='RBM', value=IntProperty(name="n_visible")), 'hidden_bias': Option(class_name='RBM', value=ParameterProperty(name="hidden_bias")), 'n_hidden': Option(class_name='RBM', value=IntProperty(name="n_hidden")), 'epoch_end_signal': Option(class_name='BaseNetwork', value=Property(name="epoch_end_signal")), 'show_epoch': Option(class_name='BaseNetwork', value=ShowEpochProperty(name="show_epoch")), 'visible_bias': Option(class_name='RBM', value=ParameterProperty(name="visible_bias"))}[source]
prediction_error(input_data, target_data=None)[source]

Compute the pseudo-likelihood of input samples.

Parameters:

input_data : array-like

Values of the visible layer.

Returns:

float

Value of the pseudo-likelihood.
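
For illustration, a sketch of a typical call, assuming the rbm and data objects from the Examples section above:

>>> # Scalar pseudo-likelihood value for the visible data
>>> error = rbm.prediction_error(data)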

train(input_train, input_test=None, epochs=100, summary='table')[source]

Train RBM.

Parameters:

input_train : 1D or 2D array-like

input_test : 1D or 2D array-like or None

Defaults to None.

epochs : int

Number of training epochs. Defaults to 100.

summary : {‘table’, ‘inline’}

Training summary type. Defaults to 'table'.
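
A sketch of a typical call, assuming the rbm and data objects from the Examples section above plus a small hypothetical held-out array test_data with the same number of columns:

>>> test_data = np.array([
...     [1, 0, 1, 0],
...     [0, 1, 0, 1],
... ])
>>> rbm.train(data, input_test=test_data, epochs=50, summary='inline')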

train_epoch(input_train, target_train=None)[source]

Train one epoch.

Parameters: input_train : array-like (n_samples, n_features)
Returns: float
visible_bias = None[source]
visible_to_hidden(visible_input)[source]

Propagates data through the network and returns output from the hidden layer.

Parameters: visible_input : array-like (n_samples, n_visible_features)
Returns: array-like
weight = None[source]