Memory Module¶
The bayes_hdc.memory module provides content-addressable memory structures.
SparseDistributedMemory¶
- class bayes_hdc.memory.SparseDistributedMemory(locations, contents, dimensions, radius)[source]¶
Bases: object

Sparse Distributed Memory (SDM) for content-addressable storage.
- locations: Array¶
- contents: Array¶
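To illustrate the mechanism, the following is a minimal NumPy sketch of Kanerva-style SDM, not the class's actual API: the `write`/`read` helpers and the variable names are illustrative, and only the constructor parameters (`locations`, `contents`, `dimensions`, `radius`) mirror the signature above. Writing accumulates a bipolar copy of the data into every hard location within `radius` Hamming distance of the address; reading sums the contents of the locations near the query address and thresholds.

```python
import numpy as np

rng = np.random.default_rng(0)
dimensions, num_locations = 256, 500
radius = 120  # Hamming radius for activating hard locations

# Fixed random binary hard locations and their content counters
locations = rng.integers(0, 2, size=(num_locations, dimensions))
contents = np.zeros((num_locations, dimensions))

def write(address, data):
    # Activate every hard location within `radius` Hamming distance
    active = np.count_nonzero(locations != address, axis=1) <= radius
    contents[active] += 2 * data - 1  # accumulate in bipolar {-1, +1} form

def read(address):
    active = np.count_nonzero(locations != address, axis=1) <= radius
    # Sum the contents of the active locations and threshold back to binary
    return (contents[active].sum(axis=0) > 0).astype(int)

pattern = rng.integers(0, 2, size=dimensions)
write(pattern, pattern)  # autoassociative store

noisy = pattern.copy()
flip = rng.choice(dimensions, size=20, replace=False)
noisy[flip] = 1 - noisy[flip]  # corrupt 20 of 256 bits
recovered = read(noisy)        # typically recovers the stored pattern
```

Because many hard locations are shared between the clean and the corrupted address, the thresholded sum usually reproduces the stored pattern exactly; the choice of `radius` trades activation sparsity against noise tolerance.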
HopfieldMemory¶
- class bayes_hdc.memory.HopfieldMemory(patterns, dimensions, beta=1.0)[source]¶
Bases: object

Modern continuous Hopfield network (Ramsauer et al. 2020).
One-step softmax-attention retrieval over stored patterns: a single feed-forward softmax over cosine similarities to the stored patterns, with no recurrent settling. Distinct from the classical sign-thresholded recurrent Hopfield network (Hopfield 1982), which settles via repeated application of a sign update, and from the spiking-neuron cleanup memories of Stewart, Tang & Eliasmith (2010), which run on populations of leaky integrate-and-fire neurons via the Neural Engineering Framework.
References:

Ramsauer, H. et al. (2020). Hopfield Networks is All You Need. arXiv:2008.02217.
Hopfield, J. J. (1982). Neural networks and physical systems with emergent collective computational abilities. PNAS 79(8): 2554-2558.
Stewart, T. C., Tang, Y., Eliasmith, C. (2010). A Biologically Realistic Cleanup Memory: Autoassociation in Spiking Neurons. Cognitive Systems Research 12: 84-92.
- patterns: Array¶
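A minimal NumPy sketch of the one-step retrieval described above; the `hopfield_retrieve` function is illustrative, not the class's actual method, and only `patterns` and `beta` mirror the constructor signature. The query is compared to every stored pattern by cosine similarity, a single softmax (sharpened by the inverse temperature `beta`) converts the similarities to weights, and the result is the weighted mixture of stored patterns, with no iteration.

```python
import numpy as np

def hopfield_retrieve(query, patterns, beta=1.0):
    # Cosine similarity between the query and each stored pattern
    q = query / np.linalg.norm(query)
    P = patterns / np.linalg.norm(patterns, axis=1, keepdims=True)
    sims = P @ q
    # Single feed-forward softmax step -- no recurrent settling
    w = np.exp(beta * sims - np.max(beta * sims))
    w /= w.sum()
    return w @ patterns  # convex combination of stored patterns

rng = np.random.default_rng(1)
patterns = rng.standard_normal((10, 64))
target = patterns[3]
noisy = target + 0.1 * rng.standard_normal(64)  # lightly corrupted cue
out = hopfield_retrieve(noisy, patterns, beta=20.0)  # snaps to the target
```

Larger `beta` makes the softmax sharper, so retrieval approaches nearest-neighbour lookup; small `beta` blends several stored patterns.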
AttentionMemory¶
- class bayes_hdc.memory.AttentionMemory(keys, values, dimensions, temperature=1.0, num_heads=1)[source]¶
Bases: object

Attention-based retrieval with key-value storage and multi-head support.
- keys: Array¶
- values: Array¶
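A single-head NumPy sketch of key-value attention retrieval (the `num_heads=1` case); the `attention_retrieve` helper is illustrative rather than the class's actual method, and only `keys`, `values`, and `temperature` mirror the constructor parameters. Unlike the autoassociative memories above, the retrieved vector lives in the value space, which may differ in dimension from the key space.

```python
import numpy as np

def attention_retrieve(query, keys, values, temperature=1.0):
    # Dot-product scores between the query and each stored key,
    # scaled by the softmax temperature
    scores = keys @ query / temperature
    w = np.exp(scores - scores.max())
    w /= w.sum()
    return w @ values  # convex combination of stored values

rng = np.random.default_rng(2)
keys = rng.standard_normal((8, 32))    # 8 keys of dimension 32
values = rng.standard_normal((8, 16))  # associated values of dimension 16

# Querying with a stored key retrieves (nearly) its associated value
out = attention_retrieve(keys[5], keys, values, temperature=0.1)
```

Multi-head retrieval would split the key and query dimensions into per-head subspaces, run this same softmax independently per head, and concatenate the per-head outputs; the single-head case above conveys the core mechanism.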