Artificial Neural Network Hopfield Network

Hopfield Network

  • A Hopfield network is a special kind of neural network whose response differs from that of other neural networks.
  • Its output is computed by a converging iterative process. It has only one layer of neurons, whose number matches the size of the input and output, which must be equal.
  • In 1982, John Hopfield introduced an artificial neural network to store and retrieve memory like the human brain. Here, a neuron is either on or off.
  • The state of a neuron (on: +1, off: -1) is updated depending on the input it receives from the other neurons. A Hopfield network is initially prepared to store various patterns or memories.
  • A Hopfield network is a single-layered, recurrent network in which the neurons are fully connected, i.e., each neuron is connected to every other neuron.
  • If there are two neurons i and j, there is a connection weight wij between them, which is symmetric: wij = wji.
  • There is no self-connectivity: wii = 0. The figure below shows three neurons i = 1, 2, 3 with values xi = ±1 and connection weights wij.
[Figure: a fully connected Hopfield network of three neurons]

Updating rule

    Consider N neurons i = 1, …, N with values xi ∈ {+1, -1}.

    The update rule applied to node i is given by:

    If hi ≥ 0 then xi → 1 otherwise xi → -1

    where hi = ∑j≠i wij xj + bi
is called the field at i, with bi ∈ R a bias.

    Thus, xi → sgn(hi), where sgn(r) = 1 if r ≥ 0, and sgn(r) = -1 if r < 0.

    We set bi = 0, as it makes no difference when training the network with random patterns.

    We therefore consider hi = ∑j≠i wij xj.

Hopfield Network Approaches

We have two different approaches to updating the nodes:

  • Synchronous updating: all nodes are updated simultaneously at each time step.

  • Asynchronous updating: at each point in time, one node is updated, chosen randomly or according to some rule. Asynchronous updating is more biologically realistic.
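The two update schemes can be sketched in NumPy as follows. This is a minimal illustration, not part of the original text: the names `sync_update` and `async_update` are ours, the bias is taken as zero, and sgn(0) is taken as +1 to match the rule above.

```python
import numpy as np

def sync_update(W, x):
    # Synchronous: every node is updated at once from the current state.
    # Field h_i = sum_j w_ij x_j (zero bias); sgn(0) is taken as +1.
    h = W @ x
    return np.where(h >= 0, 1, -1)

def async_update(W, x, i):
    # Asynchronous: only node i is updated; all other nodes keep their values.
    x = x.copy()
    x[i] = 1 if W[i] @ x >= 0 else -1
    return x
```

Note that the asynchronous version returns a new state differing from the old one in at most the single coordinate i.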

Hopfield Network as a Dynamical system

    Consider X = {-1, 1}^N, so that each state x ∈ X is given by xi ∈ {-1, 1} for 1 ≤ i ≤ N.

    This gives 2^N possible states or configurations of the network.

    We can define a metric on X using the Hamming distance between any two states:

    ρ(x, y) = #{i : xi ≠ yi}

    Here, ρ is a metric with 0 ≤ ρ(x, y) ≤ N. It is clearly symmetric and reflexive.
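The Hamming distance can be written directly from the definition; a small sketch, where the function name `hamming` is ours:

```python
def hamming(x, y):
    # rho(x, y) = number of coordinates where the two states differ.
    return sum(1 for xi, yi in zip(x, y) if xi != yi)
```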

    With any of the asynchronous or synchronous updating rules, we get a discrete-time dynamical system.

    The updating rule describes a map Up: X → X, and this map is trivially continuous.


    Suppose we have only two neurons: N = 2

    There are two non-trivial choices of connectivity:

    w12 = w21 = 1

    w12 = w21 = -1

Asynchronous updating:

  • In the first case (w12 = w21 = 1), there are two attracting fixed points, [1, 1] and [-1, -1], and all orbits converge to one of them. In the second case (w12 = w21 = -1), the fixed points are [-1, 1] and [1, -1], and every orbit converges to one of these. For any fixed point, swapping all the signs gives another fixed point.

Synchronous updating:

  • In both the first and second cases, although there are fixed points, none of them attract nearby points, i.e., they are not attracting fixed points. Some orbits oscillate forever.
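The oscillation is easy to reproduce for the two-neuron network with w12 = w21 = 1. A sketch, with a synchronous update defined as before (sgn(0) taken as +1):

```python
import numpy as np

W = np.array([[0, 1],
              [1, 0]])  # w12 = w21 = 1, zero self-connectivity

def sync_update(W, x):
    # Synchronous update of all nodes at once.
    return np.where(W @ x >= 0, 1, -1)

x = np.array([1, -1])
x1 = sync_update(W, x)   # each neuron copies the other's old value
x2 = sync_update(W, x1)  # back to the starting state: a period-2 orbit
```

The state [1, -1] is mapped to [-1, 1] and back again, so this orbit oscillates forever and never reaches a fixed point.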

Energy Function Evaluation

  • Hopfield networks have an energy function that diminishes or is unchanged with asynchronous updating.
  • For a given state x ∈ {-1, 1}^N of the network and for any set of connection weights wij with wij = wji and wii = 0, let

    E = -1/2 ∑i,j wij xi xj

    Here, we update xm to x'm, denote the new energy by E', and show that E' - E ≤ 0.

    E' - E = (xm - x'm) ∑i≠m wmi xi

    Using the above equation, if xm = x'm, then E' = E.

    If xm = -1 and x'm = 1, then xm - x'm = -2 and hm = ∑i≠m wmi xi ≥ 0.

    Thus, E' - E ≤ 0.

    Similarly, if xm = 1 and x'm = -1, then xm - x'm = 2 and hm = ∑i≠m wmi xi < 0.

    Thus, E' - E < 0.
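The derivation can be checked numerically: for any symmetric weights with zero diagonal, a single asynchronous update never increases E. A sketch with illustrative names, using random weights and states:

```python
import numpy as np

def energy(W, x):
    # E(x) = -1/2 * sum_{i,j} w_ij x_i x_j  (w_ii = 0, zero bias)
    return -0.5 * x @ W @ x

def async_update(W, x, i):
    # Update node i only: x_i -> sgn(h_i), with sgn(0) = +1.
    x = x.copy()
    x[i] = 1 if W[i] @ x >= 0 else -1
    return x

rng = np.random.default_rng(0)
N = 6
A = rng.normal(size=(N, N))
W = (A + A.T) / 2          # symmetric weights, w_ij = w_ji
np.fill_diagonal(W, 0)     # zero self-connectivity

x = rng.choice([-1, 1], size=N)
for _ in range(50):
    i = int(rng.integers(N))
    x_new = async_update(W, x, i)
    assert energy(W, x_new) <= energy(W, x) + 1e-12  # E' - E <= 0
    x = x_new
```

Because the state space is finite and E never increases, repeated asynchronous updates must eventually settle into a fixed point.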

Neurons pull in or push away from each other

    Suppose the connection weight wij = wji between two neurons i and j.

    If Wij > 0, the updating rule implies:

    If xj = 1, the contribution of j to the weighted sum, i.e., wij xj, is positive. Thus the value of xi is pulled by j towards its value xj = 1.

    If xj = -1, then wij xj is negative, and xi is again pulled by j towards its value xj = -1.

    Thus, if Wij > 0 , then the value of i is pulled by the value of j. By symmetry, the value of j is also pulled by the value of i.

    If Wij < 0, then the value of i is pushed away by the value of j.

    It follows that for a particular set of values xi ∈ {-1, 1} for 1 ≤ i ≤ N, the choice of weights wij = xi xj for 1 ≤ i, j ≤ N corresponds to the Hebbian rule.

Training the network: One pattern (bi = 0)

    Suppose the vector x→ = (x1, …, xi, …, xN) ∈ {-1, 1}^N is a pattern that we would like to store in the Hopfield network.

    To build a Hopfield network that recognizes x→, we need to select the connection weights wij accordingly.

    If we select wij = η xi xj for 1 ≤ i, j ≤ N (with i ≠ j), where η > 0 is the learning rate, then the value of xi will not change under the updating rule, as we show below.

    We have

    hi = ∑j≠i wij xj = η ∑j≠i xi xj xj = η (N - 1) xi

    since xj xj = 1, and so sgn(hi) = sgn(xi) = xi.

    This implies that the value of xi, whether 1 or -1, will not change, so that x→ is a fixed point.

    Note that -x→ also becomes a fixed point when we train the network with x→, confirming that Hopfield networks are sign blind.
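Putting this together, we can train on a single pattern with wij = η xi xj and check that both x→ and -x→ are fixed points. A minimal sketch, where `train_one_pattern` is an illustrative name and η = 1:

```python
import numpy as np

def train_one_pattern(x, eta=1.0):
    # Hebbian weights w_ij = eta * x_i * x_j, with zero self-connectivity.
    W = eta * np.outer(x, x).astype(float)
    np.fill_diagonal(W, 0)
    return W

def sync_update(W, x):
    # One synchronous update step; sgn(0) taken as +1.
    return np.where(W @ x >= 0, 1, -1)

x = np.array([1, -1, -1, 1, 1])
W = train_one_pattern(x)
# The stored pattern x is a fixed point, and so is its sign-flip -x:
# the field at each node is eta * (N - 1) * x_i, which has the sign of x_i.
```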
