Out-of-distribution recognition is a crucial problem in deep learning: identifying samples that do not belong to the original training distribution.
This study argues that effective hierarchical hyperbolic embeddings are key to distinguishing in-distribution from out-of-distribution samples.
To this end, Balanced Hyperbolic Learning is introduced, which optimizes class embeddings by balancing hierarchical distortion against the distribution of subhierarchies.
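As a concrete illustration, below is a minimal PyTorch sketch of a generic hierarchical distortion objective over class embeddings in the Poincaré ball. The exact balancing term for the subhierarchy distribution used by Balanced Hyperbolic Learning is not specified in this summary, so `distortion_loss`, `tree_dist`, and `poincare_distance` are illustrative assumptions rather than the paper's actual objective.

```python
import torch

def poincare_distance(x, y, eps=1e-6):
    # Geodesic distance in the Poincare ball with curvature -1:
    # d(x, y) = arccosh(1 + 2 ||x - y||^2 / ((1 - ||x||^2)(1 - ||y||^2)))
    sq_x = x.pow(2).sum(-1).clamp(max=1 - eps)   # keep points strictly inside the ball
    sq_y = y.pow(2).sum(-1).clamp(max=1 - eps)
    sq_diff = (x - y).pow(2).sum(-1)
    arg = 1 + 2 * sq_diff / ((1 - sq_x) * (1 - sq_y))
    return torch.acosh(arg.clamp(min=1 + eps))   # clamp for numerical safety

def distortion_loss(class_embeddings, tree_dist, eps=1e-6):
    # class_embeddings: (C, D) class embeddings in the Poincare ball
    # tree_dist:        (C, C) graph distances between classes in the hierarchy
    d = poincare_distance(class_embeddings.unsqueeze(1),
                          class_embeddings.unsqueeze(0))       # (C, C) pairwise
    off_diag = ~torch.eye(len(class_embeddings), dtype=torch.bool)
    ratio = d[off_diag] / tree_dist[off_diag].clamp(min=eps)
    # Penalize deviation of embedded distances from hierarchy distances.
    return (ratio - 1).abs().mean()
```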
Hyperbolic prototypes derived from these embeddings are then used to classify in-distribution data.
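Classification thereby reduces to nearest-prototype assignment under the hyperbolic metric. A minimal sketch, reusing the `poincare_distance` helper above and assuming a `prototypes` tensor with one fixed embedding per in-distribution class:

```python
def classify(features, prototypes):
    # features:   (B, D) sample embeddings in the Poincare ball
    # prototypes: (C, D) one hyperbolic prototype per in-distribution class
    d = poincare_distance(features.unsqueeze(1), prototypes.unsqueeze(0))  # (B, C)
    return d.argmin(dim=-1)  # index of the nearest prototype
```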
Existing out-of-distribution scoring functions are furthermore adapted to operate with these hyperbolic prototypes.
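As one example of such an adaptation, the sketch below treats negative hyperbolic distances to the prototypes as logits and plugs them into two common scoring functions, maximum softmax probability and the energy score; how the study maps each of its 13 scoring functions is not detailed here, so this particular mapping is an assumption of the sketch.

```python
def ood_scores(features, prototypes, T=1.0):
    # Treat negative distances to the prototypes as logits (sketch assumption).
    logits = -poincare_distance(features.unsqueeze(1), prototypes.unsqueeze(0))
    msp = logits.div(T).softmax(dim=-1).max(dim=-1).values  # max softmax probability
    energy = T * torch.logsumexp(logits / T, dim=-1)        # negated energy score
    return msp, energy  # higher values indicate in-distribution samples
```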
Empirical evaluations across 13 datasets and 13 scoring functions demonstrate that the hyperbolic embeddings outperform existing out-of-distribution methods when trained with the same backbone and training data.
Comparisons with other hyperbolic models and contrastive approaches further confirm the effectiveness of the proposed hyperbolic embeddings.
As a native benefit, the hyperbolic embeddings additionally enable hierarchical out-of-distribution generalization.