Hopfield Decision Graph - The Classics Revival
Graph Neural Networks with Hopfield-Style Associative Memory
Experimental Research Code - Functional but unoptimized, expect rough edges
What Is This?
Hopfield Decision Graph combines Graph Neural Networks with Hopfield-style associative memory: both nodes and edges maintain their own memory systems. Each edge can branch through learnable decision gates that mix multiple relation hypotheses, producing a dynamic graph structure.
Core Innovation: Every edge becomes a decision point with multiple relation hypotheses, while Hopfield memories enable associative retrieval across the entire graph structure.
Architecture Highlights
- Dual Memory Systems: Separate Hopfield memories for nodes and edges
- Branching Edge Relations: Each edge routes through K alternative hypotheses
- Differentiable Decision Gates: Soft routing during training, hard routing in evaluation (see the sketch after this list)
- Associative Graph Dynamics: Message passing guided by memory retrieval
- Dynamic Graph Topology: Edge weights adapt based on decision gate outputs
- Protocol-Based Design: Modular architecture for custom merge strategies
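
The soft/hard routing behavior can be pictured with a short sketch. This is illustrative only, assuming a straight-through estimator for the hard path; the function name and signature are not the repo's actual API:

import torch
import torch.nn.functional as F

def route(gate_logits, hard=False, tau=1.0):
    # gate_logits: (..., K) unnormalized scores over K branch hypotheses
    soft = F.softmax(gate_logits / tau, dim=-1)
    if not hard:
        return soft  # soft routing: mix all branches during training
    # Hard routing: one-hot forward pass, soft gradients via straight-through
    index = soft.argmax(dim=-1, keepdim=True)
    one_hot = torch.zeros_like(soft).scatter_(-1, index, 1.0)
    return one_hot + soft - soft.detach()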
Quick Start
import torch
from hopfield_decision_graph import HopfieldDecisionGNN, HopfieldDecisionGNNConfig
# Configure the model
config = HopfieldDecisionGNNConfig(
    dim=64,
    layers=3,
    mem_slots_nodes=128,
    mem_slots_edges=64,
    branches=4
)
# Create model
model = HopfieldDecisionGNN(config)
# Forward pass
batch_size, num_nodes = 2, 16
x = torch.randn(batch_size, num_nodes, config.dim)  # Node features
A = torch.randint(0, 2, (batch_size, num_nodes, num_nodes)).float()  # Binary adjacency
output, aux = model(x, A)
Current Status
- Working: Dual memory architecture, decision branching, hard/soft routing, gradient flow
- Rough Edges: Requires training for meaningful branch specialization and memory patterns
- Still Missing: Pre-trained examples, advanced memory consolidation, multi-scale graphs
- Performance: The architecture runs end-to-end; task-specific training is needed to demonstrate its capabilities
- Memory Usage: High due to the dual memory systems; compression strategies are still needed
- Speed: Moderate, bottlenecked by the associative memory operations
Mathematical Foundation
The Hopfield memory retrieval uses content-based addressing:
attention_ij = softmax(q_i · k_j / √d)
retrieved_i = Σ_j attention_ij · v_j
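
In code, this is ordinary scaled dot-product attention against learned memory slots. A minimal sketch, assuming memory keys and values stored as (slots, d) tensors (names and shapes are illustrative, not the repo's API):

import torch
import torch.nn.functional as F

def hopfield_retrieve(q, mem_k, mem_v):
    # q: (batch, nodes, d) queries from current node states
    # mem_k, mem_v: (slots, d) learned memory keys and values
    d = q.size(-1)
    attn = F.softmax(q @ mem_k.T / d ** 0.5, dim=-1)  # (batch, nodes, slots)
    return attn @ mem_v                               # (batch, nodes, d)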
Decision gates compute a distribution over the K branch hypotheses for each edge, with temperature τ:
branch_weights_ij = softmax(MLP([x_i; x_j]) / τ)
The final adjacency emerges from branch mixing:
A'_ij = Σ_k branch_weights_ijk · hypothesis_k_ij
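
Putting the gate and the mixing together, here is a sketch of how an edge-wise module might compute the mixed adjacency. The class name, MLP shape, and hypothesis tensor layout are assumptions for illustration, not the repo's implementation:

import torch
import torch.nn as nn
import torch.nn.functional as F

class EdgeDecisionGate(nn.Module):
    def __init__(self, dim, branches, tau=1.0):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, branches))
        self.tau = tau
    def forward(self, x, hypotheses):
        # x: (B, N, d) node states; hypotheses: (B, N, N, K) candidate edge strengths
        B, N, d = x.shape
        pair = torch.cat([x.unsqueeze(2).expand(B, N, N, d),
                          x.unsqueeze(1).expand(B, N, N, d)], dim=-1)
        weights = F.softmax(self.mlp(pair) / self.tau, dim=-1)  # (B, N, N, K)
        return (weights * hypotheses).sum(dim=-1)               # mixed adjacency A'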
Message passing combines standard GNN updates with associative retrieval.
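
For instance, one layer's update could aggregate neighbor messages over the mixed adjacency and add the memory read-out. This reuses hopfield_retrieve from above and is a sketch under assumed shapes, not the repo's exact update rule:

def layer_update(x, A_mixed, mem_k, mem_v, W_msg):
    # x: (B, N, d); A_mixed: (B, N, N); W_msg: (d, d) message projection
    messages = A_mixed @ (x @ W_msg)                 # standard GNN aggregation
    retrieved = hopfield_retrieve(x, mem_k, mem_v)   # associative memory read-out
    return x + messages + retrieved                  # residual combination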
Research Applications
- Knowledge graph reasoning with memory
- Social network analysis with associative patterns
- Molecular property prediction with learned bonds
- Recommender systems with graph memory
- Scene graph understanding with object associations
Installation
pip install torch numpy
# Download hopfield_decision_graph.py from this repo
The Classics Revival Collection
Hopfield Decision Graph is part of a larger exploration of foundational algorithms enhanced with modern neural techniques:
- Evolutionary Turing Machine
- Hebbian Bloom Filter
- Hopfield Decision Graph ← You are here
- Liquid Bayes Chain
- Liquid State Space Model
- Möbius Markov Chain
- Memory Forest
Citation
@misc{hopfielddecision2025,
  title={Hopfield Decision Graph: Associative Memory for Graph Neural Networks},
  author={Jae Parker 𓅸 1990two},
  year={2025},
  note={Part of The Classics Revival Collection}
}