Dynamic programming algorithms for combinatorial optimization compute value functions in the max-plus semiring, and the softmax-normalized dot-product attention used in existing Neural Algorithmic Reasoning models can blur these values.
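To make the max-plus structure concrete, here is a minimal sketch (an illustration chosen for this summary, not an example from the paper): the classic 0/1 knapsack recurrence combines item values by addition and choices by max, i.e., it is a max-plus value-function computation.

```python
def knapsack(values, weights, capacity):
    # dp[w] = best total value achievable within weight budget w
    dp = [0] * (capacity + 1)
    for v, wt in zip(values, weights):
        # iterate budgets downward so each item is used at most once
        for w in range(capacity, wt - 1, -1):
            # max-plus update: combine by +, choose by max
            dp[w] = max(dp[w], dp[w - wt] + v)
    return dp[capacity]

print(knapsack([60, 100, 120], [10, 20, 30], 50))  # -> 220
```

The sharp, piecewise-linear nature of such value functions is exactly what a smooth softmax average struggles to represent.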
A novel Tropical attention function is introduced that operates natively in the max-plus semiring of tropical geometry, approximating the tropical circuits that underlie DP-type combinatorial algorithms.
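A hedged sketch of the idea (assumed form, not the paper's exact formulation): replace the sum-of-products in attention with tropical matrix products, so query-key scores and value aggregation both become max-of-sums.

```python
import numpy as np

def tropical_attention(Q, K, V):
    """Illustrative max-plus attention: ordinary matrix products are
    replaced by tropical ones (multiplication -> +, addition -> max).
    The paper's actual operator may differ in normalization and details."""
    # Tropical query-key scores: S[i, j] = max_d (Q[i, d] + K[j, d])
    S = np.max(Q[:, None, :] + K[None, :, :], axis=-1)
    # Tropical value aggregation: O[i, k] = max_j (S[i, j] + V[j, k])
    O = np.max(S[:, :, None] + V[None, :, :], axis=1)
    return O

Q = np.array([[0.0, 0.0]])
K = np.array([[1.0, 2.0], [3.0, 0.0]])
V = np.array([[10.0], [20.0]])
print(tropical_attention(Q, K, V))  # -> [[23.]]
```

Because max and + commute with adding a constant, such an operator selects a single dominant key rather than smoothly averaging, which is one way to read the "sharp and scale-invariant" claim below.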
Tropical transformers improve empirical out-of-distribution (OOD) performance on algorithmic reasoning tasks, strengthening both length and value generalization while remaining stable under adversarial attacks.
Tropical attention thus restores sharp, scale-invariant reasoning in Neural Algorithmic Reasoning, outperforming softmax baselines in OOD settings and establishing adversarial-attack generalization as a new benchmarking axis.