# Artificial Neural Network: Hopfield Network

## Hopfield Network

A **Hopfield network** is a special kind of neural network whose response differs from that of other neural networks: it is computed by a convergent iterative process. It has only one layer of neurons, whose size matches that of both the input and the output, which must be equal.

- In 1982, John Hopfield introduced an artificial neural network to store and retrieve memory like the human brain. Here, a neuron is either on or off.
- The state of a neuron **(on: +1, off: -1)** is updated depending on the input it receives from the other neurons. A Hopfield network is initially prepared to store various patterns or memories.
- A Hopfield network is a single-layered, recurrent network in which the neurons are fully connected, i.e., each neuron is connected to every other neuron.
- If there are two neurons i and j, then there is a connection weight w_{ij} between them which is symmetric: w_{ij} = w_{ji}.
- Self-connectivity is zero: w_{ii} = 0. For example, three neurons i = 1, 2, 3 with values x_{i} = ±1 have connection weights w_{ij}.


## Updating rule

Consider **N neurons i = 1, …, N with values x_{i} = ±1.**

The update rule applied to node i is:

If h_{i} ≥ 0 then x_{i} → 1, otherwise x_{i} → -1,

where h_{i} = ∑_{j} w_{ij} x_{j} + b_{i} is called the field at i, with b_{i} ∈ ℝ a bias.

Thus, x_{i} → sgn(h_{i}), where **sgn(r) = 1 if r ≥ 0**, and **sgn(r) = -1 if r < 0.**

We set b_{i} = 0 so that it makes no difference when training the network with random patterns.

We therefore consider h_{i} = ∑_{j} w_{ij} x_{j}.
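The update rule above can be sketched in a few lines of NumPy. This is a minimal illustration, not code from the source; the function name `update_node` and the example weight matrix are assumptions chosen to match the three-neuron setup described earlier.

```python
import numpy as np

def update_node(x, W, i, b=0.0):
    """Apply the Hopfield update rule to node i: x_i -> sgn(h_i)."""
    h = W[i] @ x + b  # field at node i: h_i = sum_j w_ij x_j + b_i
    return 1 if h >= 0 else -1

# Three fully connected neurons: symmetric weights, zero self-connectivity
W = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
x = np.array([1, -1, 1])
print(update_node(x, W, 1))  # field h_1 = 1 + 1 = 2 >= 0, so x_1 -> 1
```

Note that the zero diagonal of `W` means a node's own current value never contributes to its field.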


We have two different approaches to update the nodes:

## Synchronously

- All nodes are updated simultaneously at each time step.

## Asynchronously

- At each point in time, one node is updated, chosen randomly or according to some rule. Asynchronous updating is more biologically realistic.
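The two updating modes can be contrasted in code. This is a sketch under the conventions above (b_i = 0); the function names `synchronous_step` and `asynchronous_step` are illustrative, not from the source.

```python
import numpy as np

def sgn(h):
    """Sign convention of the text: sgn(r) = 1 for r >= 0, else -1."""
    return np.where(h >= 0, 1, -1)

def synchronous_step(x, W):
    """Update all nodes at once from the current state."""
    return sgn(W @ x)

def asynchronous_step(x, W, rng):
    """Update a single randomly chosen node; the others keep their values."""
    i = rng.integers(len(x))
    x = x.copy()
    x[i] = 1 if W[i] @ x >= 0 else -1
    return x

rng = np.random.default_rng(0)
W = np.array([[0., 1.], [1., 0.]])
x = np.array([1, -1])
print(synchronous_step(x, W))  # both nodes flip at once: [-1  1]
```

With these weights the synchronous dynamics swap the two values forever, a small preview of the oscillating orbits discussed later.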

## Hopfield Network as a Dynamical system

Consider X = {-1, 1}^{N}, so that each state x ∈ X is given by **x_{i} ∈ {-1, 1} for 1 ≤ i ≤ N.**

We get 2^{N} possible states or configurations of the network.

We can define a metric on X by using the Hamming distance between any two states:

**ρ(x, y) = #{i : x_{i} ≠ y_{i}}**

Here, ρ is a metric with **0 ≤ ρ(x, y) ≤ N**. It is clearly symmetric and satisfies ρ(x, x) = 0.
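The Hamming metric is simple enough to compute directly; this small sketch (the function name `hamming` is an assumption) counts the coordinates where two states differ.

```python
def hamming(x, y):
    """Hamming distance: number of coordinates where the two states differ."""
    return sum(xi != yi for xi, yi in zip(x, y))

x = (1, -1, 1, 1)
y = (1, 1, 1, -1)
print(hamming(x, y))            # 2: the states differ at positions 1 and 3
assert 0 <= hamming(x, y) <= len(x)
```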

With either the asynchronous or the synchronous updating rule, we get a discrete-time dynamical system.

The updating rule defines a map Up: X → X.

Since X is finite, Up: X → X is trivially continuous.

## Example

Suppose we have only two neurons: N = 2

There are two non-trivial choices for connectivities:

w_{12} = w_{21} = 1

w_{12} = w_{21} = -1

## Asynchronous updating:

- In the first case (w_{12} = w_{21} = 1), there are two attracting fixed points, **[1, 1] and [-1, -1]**, and all orbits converge to one of them. In the second case (w_{12} = w_{21} = -1), the fixed points are [-1, 1] and [1, -1], and every orbit converges to one of these. For any fixed point, swapping all the signs gives another fixed point.

## Synchronous updating:

- In both the first and second cases, although there are fixed points, none of them attracts nearby points, i.e., they are not attracting fixed points. Some orbits oscillate forever.
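The two-neuron fixed points can be verified by brute force over all four states. This is an illustrative check, assuming b = 0 and zero self-connectivity; the helper name `is_fixed_point` is not from the source.

```python
import itertools

def is_fixed_point(x, w12):
    """A state is fixed if neither neuron changes under the update rule."""
    h0, h1 = w12 * x[1], w12 * x[0]  # fields, with w_11 = w_22 = 0 and b = 0
    s = lambda h: 1 if h >= 0 else -1
    return (s(h0), s(h1)) == x

for w12 in (1, -1):
    fps = [x for x in itertools.product((-1, 1), repeat=2)
           if is_fixed_point(x, w12)]
    print(w12, fps)
# w12 = +1 gives the fixed points (-1, -1) and (1, 1);
# w12 = -1 gives (-1, 1) and (1, -1).
```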

## Energy Function Evaluation

- Hopfield networks have an energy function that decreases or stays unchanged with asynchronous updating.
- For a given state x ∈ {-1, 1}^{N} of the network and for any set of connection weights w_{ij} with w_{ij} = w_{ji} and w_{ii} = 0, let

**E = -1/2 ∑_{i≠j} w_{ij} x_{i} x_{j}.**

Here, we update x_{m} to x'_{m}, denote the new energy by E', and show that

**E' - E = (x_{m} - x'_{m}) ∑_{i≠m} w_{mi} x_{i}.**

Using the above equation, if x'_{m} = x_{m}, then we have E' = E.

If x_{m} = -1 and x'_{m} = 1, then **x_{m} - x'_{m} = -2 and h_{m} = ∑_{i≠m} w_{mi} x_{i} ≥ 0.**

Thus, E' - E ≤ 0.

Similarly, if **x_{m} = 1 and x'_{m} = -1, then x_{m} - x'_{m} = 2 and h_{m} = ∑_{i≠m} w_{mi} x_{i} < 0.**

Thus, E' - E < 0.

In every case E' ≤ E, so asynchronous updating never increases the energy.
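The monotonicity of the energy can be checked numerically: run random asynchronous updates and confirm E never increases. This is a sketch (random symmetric weights, function name `energy` assumed), not code from the source.

```python
import numpy as np

def energy(x, W):
    """E = -1/2 * sum_{i != j} w_ij x_i x_j (the diagonal of W is zero)."""
    return -0.5 * x @ W @ x

rng = np.random.default_rng(1)
N = 8
W = rng.standard_normal((N, N))
W = (W + W.T) / 2          # enforce symmetry w_ij = w_ji
np.fill_diagonal(W, 0.0)   # zero self-connectivity w_ii = 0
x = rng.choice([-1, 1], size=N)

for _ in range(100):
    e_before = energy(x, W)
    i = rng.integers(N)
    x[i] = 1 if W[i] @ x >= 0 else -1      # asynchronous update of node i
    assert energy(x, W) <= e_before + 1e-12  # energy never increases
print("energy is non-increasing under asynchronous updates")
```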

## Neurons pull in or push away from each other

Suppose the connection weight w_{ij} = w_{ji} between two neurons i and j.

If w_{ij} > 0, the updating rule implies:

- If **x_{j} = 1**, the contribution of j to the weighted sum, i.e., w_{ij} x_{j}, is positive. Thus the value of x_{i} is pulled by j towards its value x_{j} = 1.
- If **x_{j} = -1**, then w_{ij} x_{j} is negative, and x_{i} is again pulled by j towards its value x_{j} = -1.

Thus, if w_{ij} > 0, the value of i is pulled by the value of j. By symmetry, the value of j is also pulled by the value of i.

If w_{ij} < 0, then the value of i is pushed away from the value of j.

It follows that for a particular set of values x_{i} ∈ {-1, 1} for 1 ≤ i ≤ N, the choice of weights **w_{ij} = x_{i} x_{j}** for 1 ≤ i, j ≤ N corresponds to the Hebbian rule.

## Training the network: One pattern (b_{i} = 0)

Suppose the vector **x→ = (x_{1}, …, x_{i}, …, x_{N}) ∈ {-1, 1}^{N}** is a pattern that we would like to store in the Hopfield network.

To build a Hopfield network that recognizes x→, we need to select the connection weights w_{ij} accordingly.

If we select **w_{ij} = ɳ x_{i} x_{j} for 1 ≤ i, j ≤ N** (here, i ≠ j), where ɳ > 0 is the learning rate, then the value of x_{i} will not change under updating, as we illustrate below.

We have

**h_{i} = ∑_{j≠i} w_{ij} x_{j} = ɳ ∑_{j≠i} x_{i} x_{j} x_{j} = ɳ (N - 1) x_{i},**

since x_{j}^{2} = 1. Because ɳ (N - 1) > 0, it follows that sgn(h_{i}) = x_{i}. This implies that the value of x_{i}, whether 1 or -1, will not change, so that x→ is a fixed point.

Note that -x→ also becomes a fixed point when we train the network with x→, showing that Hopfield networks are sign blind.
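The one-pattern Hebbian construction and the sign-blindness property can be verified directly. This is a minimal sketch; the function names `train_one_pattern` and `recall_step` are illustrative, and the stored pattern is an arbitrary example.

```python
import numpy as np

def train_one_pattern(x, eta=1.0):
    """Hebbian weights w_ij = eta * x_i * x_j, with zero self-connectivity."""
    W = eta * np.outer(x, x).astype(float)
    np.fill_diagonal(W, 0.0)
    return W

def recall_step(x, W):
    """One synchronous update x_i -> sgn(sum_j w_ij x_j)."""
    return np.where(W @ x >= 0, 1, -1)

pattern = np.array([1, -1, -1, 1, 1])
W = train_one_pattern(pattern)

# The stored pattern is a fixed point: h_i = eta * (N - 1) * x_i ...
assert np.array_equal(recall_step(pattern, W), pattern)
# ... and so is its negation, since Hopfield networks are sign blind.
assert np.array_equal(recall_step(-pattern, W), -pattern)
print("pattern and its negation are both fixed points")
```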
