atyy
Science Advisor
madness said: I wasn't referring to the energy. The Hopfield network before the memory is embedded has random connections, and after the memory is embedded it has a specific pattern of connections which generate attractor dynamics within the network. This is what constitutes a decrease in entropy.
madness said: If you still don't believe me, read this paper titled "Self-organization and entropy decreasing in neural networks" http://ptp.oxfordjournals.org/content/92/5/927.full.pdf and this paper titled "Pattern recognition minimizes entropy production in a neural network of electrical oscillators" http://www.sciencedirect.com/science/article/pii/S0375960113007305.
Here you seem to be referring to memory formation. But is there any memory formation in these articles? It seems the weights are fixed, and the articles are discussing recall.
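To make that concrete, here is a minimal sketch of recall in a Hopfield net with the weights held fixed, which is the regime the linked papers appear to be working in. The bipolar (+/-1) units, the sweep schedule, and the function name are my own illustrative choices, not taken from the papers:

```python
import numpy as np

def recall(W, s, n_sweeps=10):
    """Asynchronous Hopfield recall: the weight matrix W stays fixed;
    only the state s evolves, relaxing toward a stored attractor."""
    s = s.copy()
    for _ in range(n_sweeps):
        for i in range(len(s)):                # update each unit in turn
            s[i] = 1 if W[i] @ s >= 0 else -1  # align unit with its local field
    return s
```

Nothing in this loop changes W, which is why I would call it recall rather than memory formation.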
Also, you are arguing that the initial state of the Hopfield net is a high-entropy state (which I don't think the articles you link to show). Even if that turns out to be true, it does not mean that the Hopfield net is remembering a high-entropy past, since after training the memories stored in the network are not memories of the network's initial state. Rather, they are memories of the examples presented during training.

The training period requires a "teacher" who turns on Hebbian plasticity, presents a limited selection of examples, and then turns off the plasticity. That period looks much more like a low-entropy period, so it would seem the network is remembering a low-entropy past.
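For contrast, here is a sketch of that training period: plasticity is "on" only inside the storage function, which plays the teacher's role, and everything afterwards runs with the weights frozen. The outer-product rule is the standard Hebbian prescription for Hopfield nets, but the sizes and names in the demo are just illustrative assumptions (it reuses the recall() sketch from above):

```python
import numpy as np

def train_hebbian(patterns):
    """Hebbian storage: W = (1/N) * sum over patterns of x x^T, zero diagonal.
    Plasticity is 'on' only while this function runs -- the teacher's window."""
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)   # no self-connections
    return W

# Demo (uses recall() from the sketch above): store one example, then
# retrieve it from a corrupted cue with the weights frozen.
rng = np.random.default_rng(1)
x = rng.choice([-1, 1], size=64)      # the "example presented during training"
W = train_hebbian(x[None, :])         # training period: plasticity on, then off
cue = x.copy()
cue[:10] *= -1                        # corrupt 10 of the 64 units
print(np.array_equal(recall(W, cue), x))  # True: the cue falls into x's basin
```

The point of the demo is that what the network later retrieves is x, the pattern presented during the teacher's window, not whatever random state the network started in.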