neupy.algorithms.RBM
- class neupy.algorithms.RBM[source]
Boolean/Bernoulli Restricted Boltzmann Machine (RBM). The algorithm assumes that inputs are either binary values or values between 0 and 1.
Parameters: - n_visible : int
Number of visible units. Number of features (columns) in the input data.
- n_hidden : int
Number of hidden units. The larger the number, the more information the network can capture from the data, but it also means that the network is more likely to overfit.
- batch_size : int
Size of the mini-batch. Defaults to 10.
- weight : array-like, Tensorflow variable, Initializer or scalar
Default initialization methods can be found here. Defaults to Normal. A construction sketch using these parameters follows this parameter list.
- hidden_bias : array-like, Tensorflow variable, Initializer or scalar
Default initialization methods can be found here. Defaults to Constant(value=0).
- visible_bias : array-like, Tensorflow variable, Initializer or scalar
Default initialization methods can be found here. Defaults to Constant(value=0).
- step : float
Learning rate, defaults to 0.1.
- show_epoch : int
This property controls how often the network displays information about training. It has to be defined as a positive integer. For instance, the number 100 means that the network shows a summary at the 1st, 100th, 200th, 300th, … and last epochs.
Defaults to 1.
- shuffle_data : bool
If it’s True, then the training data will be shuffled before the training. Defaults to True.
- signals : dict, list or function
Function that will be triggered after certain events during the training.
- verbose : bool
This property controls verbose output in the terminal. The True value enables informative output and False disables it. Defaults to False.
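The weight, hidden_bias and visible_bias parameters accept initializer objects in addition to arrays, Tensorflow variables and scalars. Below is a minimal construction sketch, assuming the Normal and Constant initializers named in the defaults above are importable from neupy.init; the parameter values are illustrative, not recommendations.
>>> from neupy import algorithms, init
>>>
>>> rbm = algorithms.RBM(
...     n_visible=4,                         # number of input features
...     n_hidden=2,                          # capacity vs. overfitting trade-off
...     batch_size=10,
...     step=0.1,
...     weight=init.Normal(),                # default initializer, shown explicitly
...     hidden_bias=init.Constant(value=0),
...     visible_bias=init.Constant(value=0),
...     shuffle_data=True,
...     verbose=True,
... )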
References
- [1] G. Hinton, A Practical Guide to Training Restricted Boltzmann Machines, 2010. http://www.cs.toronto.edu/~hinton/absps/guideTR.pdf
Examples
>>> import numpy as np
>>> from neupy import algorithms
>>>
>>> data = np.array([
...     [1, 0, 1, 0],
...     [1, 0, 1, 0],
...     [1, 0, 0, 0],  # incomplete sample
...     [1, 0, 1, 0],
...
...     [0, 1, 0, 1],
...     [0, 0, 0, 1],  # incomplete sample
...     [0, 1, 0, 1],
...     [0, 1, 0, 1],
...     [0, 1, 0, 1],
...     [0, 1, 0, 1],
... ])
>>>
>>> rbm = algorithms.RBM(n_visible=4, n_hidden=1)
>>> rbm.train(data, epochs=100)
>>>
>>> hidden_states = rbm.visible_to_hidden(data)
>>> hidden_states.round(2)
array([[ 0.99],
       [ 0.99],
       [ 0.95],
       [ 0.99],
       [ 0.  ],
       [ 0.01],
       [ 0.  ],
       [ 0.  ],
       [ 0.  ],
       [ 0.  ]])
Methods
- train(X_train, epochs=100)
Trains the network.
- predict(X)
Alias to the visible_to_hidden method.
- visible_to_hidden(visible_input)
Propagates data through the network and returns output from the hidden layer.
- hidden_to_visible(hidden_input)
Propagates output from the hidden layer backward to the visible layer.
- gibbs_sampling(visible_input, n_iter=1)
Performs Gibbs sampling n times using the visible input.
- fit(*args, **kwargs)
Alias to the train method.
- batch_size = None[source]
- gibbs_sampling(visible_input, n_iter=1)[source]
Performs Gibbs sampling n times using the visible input.
Parameters: - visible_input : 1d or 2d array
- n_iter : int
Number of Gibbs sampling iterations. Defaults to 1.
Returns: - array-like
Output from the visible units after performing n Gibbs samples. The array will contain only binary units (0 and 1).
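A short usage sketch, reusing the trained rbm and the data array from the Examples section above; output arrays are omitted because the sampled values are stochastic.
>>> # One round of Gibbs sampling starting from the observed visible units;
>>> # the result has the same shape as the input and contains only 0s and 1s.
>>> sampled = rbm.gibbs_sampling(data, n_iter=1)
>>>
>>> # More iterations let the chain drift further from the original samples.
>>> sampled_long = rbm.gibbs_sampling(data, n_iter=10)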
- hidden_to_visible(hidden_input)[source]
Propagates output from the hidden layer backward to the visible layer.
Parameters: - hidden_input : array-like (n_samples, n_hidden_features)
Returns: - array-like
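For example, a reconstruction sketch that chains visible_to_hidden and hidden_to_visible, reusing the rbm and data from the Examples section above (output omitted, since it depends on the learned weights):
>>> hidden = rbm.visible_to_hidden(data)           # shape: (n_samples, n_hidden)
>>> reconstructed = rbm.hidden_to_visible(hidden)  # shape: (n_samples, n_visible)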
- init_functions()[source]
- init_methods()[source]
- n_visible = None[source]
- one_training_update(X_train, y_train=None)[source]
Train one epoch.
Parameters: - X_train : array-like (n_samples, n_features)
Returns: - float
- options = {
    'batch_size': Option(class_name='RBM', value=IntProperty(name="batch_size")),
    'hidden_bias': Option(class_name='RBM', value=ParameterProperty(name="hidden_bias")),
    'n_hidden': Option(class_name='RBM', value=IntProperty(name="n_hidden")),
    'n_visible': Option(class_name='RBM', value=IntProperty(name="n_visible")),
    'show_epoch': Option(class_name='BaseNetwork', value=IntProperty(name="show_epoch")),
    'shuffle_data': Option(class_name='BaseNetwork', value=Property(name="shuffle_data")),
    'signals': Option(class_name='BaseNetwork', value=Property(name="signals")),
    'step': Option(class_name='BaseNetwork', value=NumberProperty(name="step")),
    'verbose': Option(class_name='Verbose', value=VerboseProperty(name="verbose")),
    'visible_bias': Option(class_name='RBM', value=ParameterProperty(name="visible_bias")),
    'weight': Option(class_name='RBM', value=ParameterProperty(name="weight"))
  }[source]
- predict(X)[source]
Alias to the visible_to_hidden method.
- score(X, y=None)[source]
Compute the pseudo-likelihood of input samples.
Parameters: - X : array-like
Values of the visible layer.
Returns: - float
Value of the pseudo-likelihood.
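For instance, the pseudo-likelihood can serve as a rough check of training progress; the sketch below reuses rbm and data from the Examples section, and printed values are omitted because they depend on the run.
>>> score_before = rbm.score(data)
>>> rbm.train(data, epochs=50)
>>> score_after = rbm.score(data)
>>> # Higher pseudo-likelihood generally indicates a better fit to the data.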
- train(X_train, X_test=None, epochs=100)[source]
Train RBM.
Parameters: - X_train : 1D or 2D array-like
- X_test : 1D or 2D array-like or None
Defaults to None.
- epochs : int
Number of training epochs. Defaults to 100.
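A brief sketch of training with a held-out set passed as X_test, reusing the data array from the Examples section; the 8/2 split below is purely illustrative.
>>> X_train, X_test = data[:8], data[8:]
>>> rbm.train(X_train, X_test, epochs=100)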
- visible_bias = None[source]
- visible_to_hidden(visible_input)[source]
Propagates data through the network and returns output from the hidden layer.
Parameters: - visible_input : array-like (n_samples, n_visible_features)
Returns: - array-like
- weight = None[source]