class neupy.algorithms.HebbRule

Neural network with Hebbian learning. It is an unsupervised algorithm: the network learns associations from the data.

decay_rate : float

The decay rate controls the size of the network's weights and helps the network to 'forget' information. Without this parameter the weights would grow quickly without bound. Defaults to 0.2.
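The interaction between the Hebbian term and the decay term can be sketched in plain NumPy. This is an illustrative update rule, not NeuPy's internal implementation; `hebb_update` is a hypothetical helper:

```python
import numpy as np

def hebb_update(weight, x, y, step=0.1, decay_rate=0.2):
    # The Hebbian term strengthens connections between co-active units;
    # the decay term shrinks every weight by a fixed fraction, so the
    # weights cannot grow without bound.
    return weight + step * np.outer(x, y) - decay_rate * weight

w = np.zeros((2, 1))
x = np.array([1.0, 1.0])  # both inputs active
y = np.array([1.0])       # output unit fired
w = hebb_update(w, x, y)  # each weight grows by step * x_i * y
```

With zero initial weights the decay term contributes nothing, so every co-active connection grows by `step`; on later updates the decay pulls non-zero weights back toward zero.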

n_inputs : int

Number of features (columns) in the input data.

n_outputs : int

Number of outputs in the network.

n_unconditioned : int

Number of unconditioned units in the network. These units are not updated during training. The unconditioned units must correspond to the first features (columns) in the dataset.

weight : array-like

Neural network weights. A manually defined value should have shape (n_inputs, n_outputs). Defaults to None, which means that all unconditioned weights are set to 1 and all other weights to 0.
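The default initialization described above can be reproduced with a short NumPy sketch (`init_default_weights` is a hypothetical helper, not part of NeuPy's API):

```python
import numpy as np

def init_default_weights(n_inputs, n_outputs, n_unconditioned):
    # Documented default: the unconditioned features (the first rows)
    # get weight 1, all conditioned features start at 0.
    weight = np.zeros((n_inputs, n_outputs))
    weight[:n_unconditioned, :] = 1.0
    return weight

w = init_default_weights(n_inputs=2, n_outputs=1, n_unconditioned=1)
# w == [[1.], [0.]]
```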

bias : array-like, Initializer

Neural network bias units. Defaults to Constant(-0.5).

step : float

Learning rate, defaults to 0.1.

show_epoch : int

This property controls how often the network displays information about training. It must be a positive integer. For instance, the value 100 means that the network shows a summary at the 1st, 100th, 200th, 300th, ... and last epochs.

Defaults to 1.
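The schedule can be illustrated with a tiny helper (`summary_epochs` is hypothetical, for illustration only; it is not part of NeuPy):

```python
def summary_epochs(show_epoch, n_epochs):
    # Epoch 1, every multiple of show_epoch, and the final epoch.
    epochs = {1, n_epochs}
    epochs.update(range(show_epoch, n_epochs + 1, show_epoch))
    return sorted(epochs)

summary_epochs(100, 350)  # [1, 100, 200, 300, 350]
```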

shuffle_data : bool

If True, the training data will be shuffled before training. Defaults to True.

signals : dict, list or function

One or more functions that are triggered after certain events during training.

verbose : bool

Controls verbose output in the terminal. True enables informative output and False disables it. Defaults to False.


  • The network always initializes the weights with 0 for the conditioned stimulus and 1 for the unconditioned one. This initialization gives a predictable default state before feature learning.


>>> import numpy as np
>>> from neupy import algorithms
>>> pavlov_dog_data = np.array([
...     [1, 0],  # food, no bell
...     [1, 1],  # food, bell
... ])
>>> dog_test_cases = np.array([
...     [0, 0],  # no food, no bell
...     [0, 1],  # no food, bell
...     [1, 0],  # food, no bell
...     [1, 1],  # food, bell
... ])
>>> hebbnet = algorithms.HebbRule(
...     n_inputs=2,
...     n_outputs=1,
...     n_unconditioned=1,
...     step=0.1,
...     decay_rate=0.8,
...     verbose=False
... )
>>> hebbnet.train(pavlov_dog_data, epochs=2)
>>> hebbnet.predict(dog_test_cases)
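The associative effect in this example can also be reproduced with a plain NumPy sketch. This is an illustration of the idea, not NeuPy's exact update rule; weight decay is omitted for clarity:

```python
import numpy as np

step = 0.1
weight = np.array([[1.0], [0.0]])  # fixed food weight, trainable bell weight
bias = -0.5
data = np.array([[1.0, 0.0], [1.0, 1.0]])  # food only; food + bell

for _ in range(10):
    for x in data:
        y = float((x @ weight + bias) > 0)  # binary step activation
        # Hebbian update of the conditioned (bell) weight only; the
        # unconditioned (food) weight stays fixed at 1.
        weight[1, 0] += step * x[1] * y

tests = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
predictions = [(x @ weight + bias > 0).item() for x in tests]
# After training, the bell alone triggers the response:
# [False, True, True, True]
```

Because food and bell are repeatedly active together while the output fires, the bell weight grows until the bell alone is enough to cross the activation threshold, mirroring Pavlov's conditioning.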


predict(X)
    Predicts the output for the specified input.
train(X_train, epochs=100)
    Trains the neural network.
fit(*args, **kwargs)
    Alias for the train method.
decay_rate = None
options = {
    'bias': Option(class_name='BaseStepAssociative', value=ParameterProperty(name="bias")),
    'decay_rate': Option(class_name='HebbRule', value=BoundedProperty(name="decay_rate")),
    'n_inputs': Option(class_name='BaseStepAssociative', value=IntProperty(name="n_inputs")),
    'n_outputs': Option(class_name='BaseAssociative', value=IntProperty(name="n_outputs")),
    'n_unconditioned': Option(class_name='BaseStepAssociative', value=IntProperty(name="n_unconditioned")),
    'show_epoch': Option(class_name='BaseNetwork', value=IntProperty(name="show_epoch")),
    'shuffle_data': Option(class_name='BaseNetwork', value=Property(name="shuffle_data")),
    'signals': Option(class_name='BaseNetwork', value=Property(name="signals")),
    'step': Option(class_name='BaseNetwork', value=NumberProperty(name="step")),
    'verbose': Option(class_name='Verbose', value=VerboseProperty(name="verbose")),
    'weight': Option(class_name='BaseStepAssociative', value=ArrayProperty(name="weight")),
}
weight_delta(input_row, layer_output)