neupy.algorithms.associative.hebb module

class neupy.algorithms.associative.hebb.HebbRule[source]

Hebbian Learning Neural Network implements an unsupervised learning technique. The network can learn associations from the data.

Parameters:

decay_rate : float

Decay rate controls the size of the network’s weights. It helps the network ‘forget’ information and keeps the weights from growing without bound; without this term the weights would increase quickly. Defaults to 0.2. The weight_delta sketch at the end of this page shows where this term enters the update.

n_inputs : int

Number of input units.

n_outputs : int

Number of output units.

n_unconditioned : int

Number of unconditioned units in the neural network. Weights for these units are not updated during the training procedure. Unconditioned features should come first in the dataset.

weight : array-like, Initializer

Neural network weights. A manually defined value should have shape (n_inputs, n_outputs). Defaults to Normal().

bias : array-like, Initializer

Neural network bias units. Defaults to Constant(-0.5).

step : float

Learning rate, defaults to 0.1.

show_epoch : int or str

This property controls how often the network will display information about training. There are two main syntaxes for this property.

  • You can define it as a positive integer. It sets how often the summary output appears in the terminal. For instance, the value 100 means that the network shows the summary at the 100th, 200th, 300th ... epochs.
  • A string defines the total number of times you want to see the output in the terminal. For instance, the value '2 times' means that the network will show the output twice, at approximately equal intervals of epochs, plus one additional output after the final epoch.

Defaults to 1. See the configuration sketch after this parameter list for an example of the string form.

shuffle_data : bool

If it’s True, the class shuffles all your training data before training the network. Defaults to True.

epoch_end_signal : function

Calls this function when a training epoch finishes.

train_end_signal : function

Calls this function when the training process finishes.

verbose : bool

Property controls verbose output in the terminal. True enables informative output and False disables it. Defaults to False.
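
The configuration sketch below (a minimal, unverified example) illustrates several of the parameters above: the string form of show_epoch, explicit weight and bias initializers, and an epoch-end callback. It assumes that the Normal and Constant initializers can be imported from neupy.init and that the signal handlers receive the network instance as their only argument.

>>> from neupy import algorithms, init
>>>
>>> def on_epoch_end(network):
...     # Assumed signature: signal handlers receive the network instance.
...     print("epoch finished")
...
>>> hebbnet = algorithms.HebbRule(
...     n_inputs=2,
...     n_outputs=1,
...     n_unconditioned=1,
...     step=0.1,
...     decay_rate=0.2,
...     weight=init.Normal(),
...     bias=init.Constant(-0.5),
...     show_epoch='2 times',
...     shuffle_data=True,
...     epoch_end_signal=on_epoch_end,
...     verbose=True,
... )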

Notes

  • The network always generates weights that contain a 0 weight for the conditioned stimulus and 1 for the other. Such initialization helps to control the default state for feature learning.
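  • For the Pavlov example below (food is the unconditioned feature listed first; the bell is the conditioned one), this note means the initial weight for the food input is 1 and the initial weight for the bell input is 0.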

Examples

>>> import numpy as np
>>> from neupy import algorithms
>>>
>>> pavlov_dog_data = np.array([
...     [1, 0],  # food, no bell
...     [1, 1],  # food, bell
... ])
>>> dog_test_cases = np.array([
...     [0, 0],  # no food, no bell
...     [0, 1],  # no food, bell
...     [1, 0],  # food, no bell
...     [1, 1],  # food, bell
... ])
>>>
>>> hebbnet = algorithms.HebbRule(
...     n_inputs=2,
...     n_outputs=1,
...     n_unconditioned=1,
...     step=0.1,
...     decay_rate=0.8,
...     verbose=False
... )
>>> hebbnet.train(pavlov_dog_data, epochs=2)
>>> hebbnet.predict(dog_test_cases)
array([[-1],
       [ 1],
       [ 1],
       [ 1]])
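
In this example the network has learned the association between the bell and the food: after training, the bell alone ([0, 1]) produces the same positive response as the food itself, and only the no-food, no-bell case remains negative.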

Methods

predict(input_data) Predicts output for the specified input.
train(input_train, summary='table', epochs=100) Trains the neural network.
fit(*args, **kwargs) Alias to the train method.
decay_rate = None[source]
options = {'verbose': Option(class_name='Verbose', value=VerboseProperty(name="verbose")), 'weight': Option(class_name='BaseStepAssociative', value=ArrayProperty(name="weight")), 'shuffle_data': Option(class_name='BaseNetwork', value=Property(name="shuffle_data")), 'train_end_signal': Option(class_name='BaseNetwork', value=Property(name="train_end_signal")), 'step': Option(class_name='BaseNetwork', value=NumberProperty(name="step")), 'bias': Option(class_name='BaseStepAssociative', value=ParameterProperty(name="bias")), 'n_inputs': Option(class_name='BaseStepAssociative', value=IntProperty(name="n_inputs")), 'epoch_end_signal': Option(class_name='BaseNetwork', value=Property(name="epoch_end_signal")), 'show_epoch': Option(class_name='BaseNetwork', value=ShowEpochProperty(name="show_epoch")), 'decay_rate': Option(class_name='HebbRule', value=BoundedProperty(name="decay_rate")), 'n_outputs': Option(class_name='BaseAssociative', value=IntProperty(name="n_outputs")), 'n_unconditioned': Option(class_name='BaseStepAssociative', value=IntProperty(name="n_unconditioned"))}[source]
weight_delta(input_row, layer_output)[source]
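
The sketch below is a hypothetical, standalone illustration of the classic Hebbian update with a decay term, the formula this method is based on: co-active input and output units strengthen their connection, while the decay term shrinks every weight in proportion to its current size. It is not necessarily identical to the actual implementation, which, per the n_unconditioned description above, leaves the unconditioned weights untouched.

>>> import numpy as np
>>>
>>> def hebb_weight_delta(input_row, layer_output, weight, step=0.1, decay_rate=0.2):
...     # step * x * y strengthens weights between co-active units;
...     # decay_rate * w keeps the weights from growing without bound.
...     return step * np.outer(input_row, layer_output) - decay_rate * weight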