Revisiting the Platonic Representation Hypothesis: An Aristotelian View
Abstract
Network representations converge toward shared local neighborhood relationships rather than global statistical models, as revealed by calibrated similarity metrics.
The Platonic Representation Hypothesis suggests that representations from neural networks are converging to a common statistical model of reality. We show that the existing metrics used to measure representational similarity are confounded by network scale: increasing model depth or width can systematically inflate representational similarity scores. To correct these effects, we introduce a permutation-based null-calibration framework that transforms any representational similarity metric into a calibrated score with statistical guarantees. We revisit the Platonic Representation Hypothesis with our calibration framework, which reveals a nuanced picture: the apparent convergence reported by global spectral measures largely disappears after calibration, while local neighborhood similarity, but not local distances, retains significant agreement across different modalities. Based on these findings, we propose the Aristotelian Representation Hypothesis: representations in neural networks are converging to shared local neighborhood relationships.
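The paper's calibration procedure is not spelled out here, so the following is only a minimal sketch of one plausible permutation-based null calibration, assuming the similarity metric takes two row-aligned representation matrices and that the calibrated score is reported relative to a shuffled-correspondence null. The function name and the p-value/z-score summaries are illustrative choices, not the paper's exact statistic.

```python
import numpy as np

def permutation_calibrated_score(X, Y, metric, n_perm=1000, seed=0):
    """Calibrate a representational similarity metric against a permutation null.

    X, Y   : (n_samples, d_x) and (n_samples, d_y) representation matrices,
             rows aligned to the same inputs.
    metric : callable (X, Y) -> float, higher = more similar.
    Returns the observed score, a permutation p-value, and a null-standardized
    (z-scored) calibrated score.
    """
    rng = np.random.default_rng(seed)
    observed = metric(X, Y)

    # Null distribution: break the row correspondence between X and Y by
    # permuting the sample order of Y, then recompute the metric.
    null = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(len(Y))
        null[i] = metric(X, Y[perm])

    # Permutation p-value (with the +1 correction for a valid test).
    p_value = (1 + np.sum(null >= observed)) / (1 + n_perm)
    # Calibrated score: how far the observed similarity sits above the null.
    z_score = (observed - null.mean()) / (null.std() + 1e-12)
    return observed, p_value, z_score
```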
Community
Are neural nets across modalities really converging to the same representation as they scale, as the Platonic Representation Hypothesis suggests?
We show that common representational similarity metrics are confounded by network width & depth. We propose a permutation-based null calibration that fixes this.
Results:
• Global convergence largely disappears.
• Local neighborhoods persist.
We propose the alternative Aristotelian Representation Hypothesis: neural networks, trained with different objectives on different data and modalities, are converging to shared local neighborhood relationships (see the sketch below).
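One common instantiation of local neighborhood similarity is mutual k-nearest-neighbor overlap between two models' representations of the same inputs; the sketch below is an illustrative version under that assumption (the helper names are hypothetical), and it can be passed as the `metric` argument to the calibration sketch above.

```python
import numpy as np

def knn_indices(Z, k):
    """Indices of the k nearest neighbors of each row of Z (self excluded)."""
    dists = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)  # a point is not its own neighbor
    return np.argsort(dists, axis=1)[:, :k]

def mutual_knn_overlap(X, Y, k=10):
    """Mean fraction of shared k-NN sets across row-aligned representations X, Y."""
    nx, ny = knn_indices(X, k), knn_indices(Y, k)
    return float(np.mean([len(set(a) & set(b)) / k for a, b in zip(nx, ny)]))

# Example (using the calibration sketch above):
# score, p, z = permutation_calibrated_score(X, Y, lambda a, b: mutual_knn_overlap(a, b, k=10))
```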