neupy.algorithms.rbm module

class neupy.algorithms.rbm.RBM[source]

Boolean/Bernoulli Restricted Boltzmann Machine (RBM). The algorithm assumes that inputs are either binary values or values between 0 and 1.

Parameters:

n_visible : int

Number of visible units. Number of features (columns) in the input data.

n_hidden : int

Number of hidden units. The larger the number, the more information the network can capture from the data, but it also means that the network is more likely to overfit.

batch_size : int

Size of the mini-batch. Defaults to 10.

weight : array-like, Tensorflow variable, Initializer or scalar

Default initialization methods can be found here. Defaults to Normal.

hidden_bias : array-like, Tensorflow variable, Initializer or scalar

Default initialization methods can be found here. Defaults to Constant(value=0).

visible_bias : array-like, Tensorflow variable, Initializer or scalar

Default initialization methods can be found here. Defaults to Constant(value=0).

step : float

Learning rate, defaults to 0.1.

show_epoch : int or str

This property controls how often the network displays information about training. There are two main syntaxes for this property.

  • You can define it as a positive integer. It defines how often you would like to see summary output in the terminal. For instance, the number 100 means that the network shows a summary at the 100th, 200th, 300th ... epochs.
  • A string defines the number of times you want to see output in the terminal. For instance, the value '2 times' means that the network will show output twice with approximately equal periods of epochs, plus one additional output after the final epoch.

Defaults to 1.

shuffle_data : bool

If it's True, the class shuffles all your training data before training the network. Defaults to True.

epoch_end_signal : function

Calls this function when train epoch finishes.

train_end_signal : function

Calls this function when train process finishes.

verbose : bool

This property controls verbose output in the terminal. True enables informative output in the terminal and False disables it. Defaults to False.

References

[1] G. Hinton, A Practical Guide to Training Restricted
Boltzmann Machines, 2010. http://www.cs.toronto.edu/~hinton/absps/guideTR.pdf

Examples

>>> import numpy as np
>>> from neupy import algorithms
>>>
>>> data = np.array([
...     [1, 0, 1, 0],
...     [1, 0, 1, 0],
...     [1, 0, 0, 0],  # incomplete sample
...     [1, 0, 1, 0],
...
...     [0, 1, 0, 1],
...     [0, 0, 0, 1],  # incomplete sample
...     [0, 1, 0, 1],
...     [0, 1, 0, 1],
...     [0, 1, 0, 1],
...     [0, 1, 0, 1],
... ])
>>>
>>> rbm = algorithms.RBM(n_visible=4, n_hidden=1)
>>> rbm.train(data, epochs=100)
>>>
>>> hidden_states = rbm.visible_to_hidden(data)
>>> hidden_states.round(2)
array([[ 0.99],
       [ 0.99],
       [ 0.95],
       [ 0.99],
       [ 0.  ],
       [ 0.01],
       [ 0.  ],
       [ 0.  ],
       [ 0.  ],
       [ 0.  ]])
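For intuition, the forward pass that visible_to_hidden performs can be sketched in plain NumPy. This is an illustrative sketch, not NeuPy's actual implementation: the weight matrix W and hidden bias b_h below are hypothetical stand-ins for the trained parameters, and hidden probabilities are computed as sigmoid(v @ W + b_h).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.RandomState(0)
n_visible, n_hidden = 4, 1

# Hypothetical trained parameters (stand-ins for rbm.weight
# and rbm.hidden_bias after training).
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))
b_h = np.zeros(n_hidden)

v = np.array([[1, 0, 1, 0],
              [0, 1, 0, 1]], dtype=float)

# Hidden unit activation probabilities, one row per input sample.
hidden_prob = sigmoid(v @ W + b_h)
```

With trained parameters, these probabilities are what the example above rounds to values such as 0.99 and 0.01.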

Methods

train(input_train, epochs=100) Trains network.
fit(*args, **kwargs) Alias to the train method.
visible_to_hidden(visible_input) Propagates data through the network and returns output from the hidden layer.
hidden_to_visible(hidden_input) Propagates output from the hidden layer backward to the visible layer.
gibbs_sampling(visible_input, n_iter=1) Performs Gibbs sampling n_iter times using the visible input.
batch_size = None[source]
gibbs_sampling(visible_input, n_iter=1)[source]

Performs Gibbs sampling n_iter times using the visible input.

Parameters:

visible_input : 1d or 2d array

n_iter : int

Number of Gibbs sampling iterations. Defaults to 1.

Returns:

array-like

Output from the visible units after performing n_iter Gibbs sampling iterations. The array will contain only binary units (0 and 1).
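One Gibbs sampling iteration for a Bernoulli RBM alternates between sampling the hidden units given the visible units and sampling the visible units given the hidden units. The sketch below illustrates this with hypothetical parameters W, b_h, and b_v; it is a conceptual sketch, not NeuPy's internal code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.RandomState(42)
n_visible, n_hidden = 4, 1

# Hypothetical model parameters.
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))
b_h = np.zeros(n_hidden)
b_v = np.zeros(n_visible)

def gibbs_step(v, rng):
    # Sample binary hidden states given the visible units.
    h_prob = sigmoid(v @ W + b_h)
    h = (rng.uniform(size=h_prob.shape) < h_prob).astype(float)
    # Sample binary visible states given the hidden units.
    v_prob = sigmoid(h @ W.T + b_v)
    return (rng.uniform(size=v_prob.shape) < v_prob).astype(float)

v0 = np.array([[1., 0., 1., 0.]])
v1 = gibbs_step(v0, rng)  # one Gibbs iteration; binary output
```

Running the step n_iter times in a loop corresponds to gibbs_sampling(visible_input, n_iter).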

hidden_bias = None[source]
hidden_to_visible(hidden_input)[source]

Propagates output from the hidden layer backward to the visible layer.

Parameters:

hidden_input : array-like (n_samples, n_hidden_features)

Returns:

array-like
init_input_output_variables()[source]

Initialize input and output Tensorflow variables.

init_methods()[source]

Initialize Tensorflow functions.

init_variables()[source]

Initialize Tensorflow variables.

n_hidden = None[source]
n_visible = None[source]
options = {'batch_size': Option(class_name='RBM', value=IntProperty(name="batch_size")), 'verbose': Option(class_name='Verbose', value=VerboseProperty(name="verbose")), 'step': Option(class_name='BaseNetwork', value=NumberProperty(name="step")), 'show_epoch': Option(class_name='BaseNetwork', value=ShowEpochProperty(name="show_epoch")), 'shuffle_data': Option(class_name='BaseNetwork', value=Property(name="shuffle_data")), 'epoch_end_signal': Option(class_name='BaseNetwork', value=Property(name="epoch_end_signal")), 'train_end_signal': Option(class_name='BaseNetwork', value=Property(name="train_end_signal")), 'n_visible': Option(class_name='RBM', value=IntProperty(name="n_visible")), 'n_hidden': Option(class_name='RBM', value=IntProperty(name="n_hidden")), 'weight': Option(class_name='RBM', value=ParameterProperty(name="weight")), 'hidden_bias': Option(class_name='RBM', value=ParameterProperty(name="hidden_bias")), 'visible_bias': Option(class_name='RBM', value=ParameterProperty(name="visible_bias"))}[source]
prediction_error(input_data, target_data=None)[source]

Compute the pseudo-likelihood of input samples.

Parameters:

input_data : array-like

Values of the visible layer.

Returns:

float

Value of the pseudo-likelihood.
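A common way to approximate the pseudo-likelihood for a Bernoulli RBM is to flip one randomly chosen visible unit per sample and compare free energies before and after the flip. The sketch below illustrates that proxy with hypothetical parameters; it is not a reproduction of NeuPy's prediction_error implementation.

```python
import numpy as np

rng = np.random.RandomState(0)
n_visible, n_hidden = 4, 1

# Hypothetical model parameters.
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))
b_h = np.zeros(n_hidden)
b_v = np.zeros(n_visible)

def free_energy(v):
    # F(v) = -v.b_v - sum_j log(1 + exp(v @ W + b_h)_j)
    return -(v @ b_v) - np.log1p(np.exp(v @ W + b_h)).sum(axis=1)

def pseudo_likelihood(v, rng):
    # Flip one randomly chosen visible unit per sample.
    rows = np.arange(v.shape[0])
    idx = rng.randint(v.shape[1], size=v.shape[0])
    v_flip = v.copy()
    v_flip[rows, idx] = 1 - v_flip[rows, idx]
    fe, fe_flip = free_energy(v), free_energy(v_flip)
    # n_visible * log(sigmoid(F(v_flip) - F(v)))
    return v.shape[1] * np.log(1.0 / (1.0 + np.exp(-(fe_flip - fe))))

v = np.array([[1., 0., 1., 0.]])
score = pseudo_likelihood(v, rng)  # one score per sample
```

Higher (less negative) scores indicate that the data has lower free energy than its one-bit perturbation, i.e. the model assigns it more probability.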

train(input_train, input_test=None, epochs=100, summary='table')[source]

Train RBM.

Parameters:

input_train : 1D or 2D array-like

input_test : 1D or 2D array-like or None

Defaults to None.

epochs : int

Number of training epochs. Defaults to 100.

summary : {‘table’, ‘inline’}

Training summary type. Defaults to 'table'.

train_epoch(input_train, target_train=None)[source]

Train one epoch.

Parameters:

input_train : array-like (n_samples, n_features)

Returns:

float
visible_bias = None[source]
visible_to_hidden(visible_input)[source]

Propagates data through the network and returns output from the hidden layer.

Parameters:

visible_input : array-like (n_samples, n_visible_features)

Returns:

array-like
weight = None[source]