How are memories stored in neural networks? — Hopfield Networks

Adina Socaci
8 min read · Nov 10, 2023


The human brain, with its complex neural networks, serves as a remarkable model for understanding memory storage. In the field of artificial intelligence and neuroscience, the Hopfield network stands out as an important theoretical construct that attempts to simulate the storage and retrieval of memories in a computational framework.

Memory, the essence of consciousness, wields immense power in shaping the human experience. By studying the parallels between biological brains and artificial neural networks, researchers have gained insight into how memories might be stored. The Hopfield network, introduced by John Hopfield in 1982, offers a distinctive and valuable perspective on associative memory and has become a cornerstone of neural network models.

Neural Network Fundamentals

Neural networks consist of computational units known as neurons, each of which combines its inputs and passes the result through an activation function. Mathematically, the output o of a neuron with input vector X and weight vector W is given by:

o = ϕ(∑_{i=1}^{n} W_i X_i + b)

Here, ϕ represents the activation function, b is the bias term, and n denotes the number of input connections. This foundational unit serves as the computational building block from which larger architectures, including the Hopfield network, are assembled.
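To make the formula concrete, here is a minimal sketch in Python (with NumPy) of a single neuron's forward computation. The function name neuron_output is illustrative rather than taken from any library, and the sign activation is an assumption chosen because Hopfield networks typically use bipolar (+1/−1) threshold units; any other activation could be substituted for ϕ.

```python
import numpy as np

def neuron_output(x, w, b, activation=np.sign):
    """Compute o = phi(sum_i W_i * X_i + b) for a single neuron.

    x: input vector X, w: weight vector W, b: bias term,
    activation: the function phi (here np.sign, a bipolar threshold).
    """
    return activation(np.dot(w, x) + b)

# Example: a neuron with three inputs
x = np.array([1.0, -1.0, 1.0])   # input vector X
w = np.array([0.5, 0.3, -0.2])   # weight vector W
b = 0.1                          # bias term

# Weighted sum: 0.5 - 0.3 - 0.2 + 0.1 = 0.1, so sign(0.1) = 1.0
print(neuron_output(x, w, b))
```

The same routine, applied to every neuron in a network and iterated, is essentially what a Hopfield network does when it updates its states during memory retrieval.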
