Out-of-distribution (OOD) detection is a critical requirement for the deployment of deep neural networks. This paper introduces the HEAT model, a new post-hoc OOD detection method that estimates the density of in-distribution (ID) samples using hybrid energy-based models (EBMs) in the feature space of a pre-trained backbone. HEAT complements prior density estimators of the ID density, e.g. parametric models like the Gaussian Mixture Model (GMM), to provide an accurate yet robust density estimation. A second contribution is to leverage the EBM framework to provide a unified density estimation and to compose several energy terms. Extensive experiments demonstrate the significance of the two contributions. HEAT sets new state-of-the-art OOD detection results on the CIFAR-10 / CIFAR-100 benchmarks as well as on the large-scale ImageNet benchmark. The code is available at: github.com/MarcLafon/heatood.
@article{Lafon2023,author={Lafon, Marc and Ramzi, Elias and Rambour, Clément and Thome, Nicolas},journal={Proceedings of the 40th International Conference on Machine Learning, Honolulu, Hawaii, USA. PMLR 202, 2023},title={{Hybrid Energy Based Model in the Feature Space for Out-of-Distribution Detection}},year={2023},}
Beyond First-Order Uncertainty Estimation with Evidential Models for Open-World Recognition
Charles Corbière, Marc Lafon, Nicolas Thome, Matthieu Cord, and Patrick Pérez
ICML Workshop on Uncertainty and Robustness in Deep Learning, 2021
In this paper, we tackle the challenge of jointly quantifying in-distribution and out-of-distribution (OOD) uncertainties. We introduce KLoS, a KL-divergence measure defined on the class-probability simplex. By leveraging the second-order uncertainty representation provided by evidential models, KLoS captures more than existing first-order uncertainty measures such as predictive entropy. We design an auxiliary neural network, KLoSNet, to learn a refined measure directly aligned with the evidential training objective. Experiments show that KLoSNet acts as a class-wise density estimator and outperforms current uncertainty measures in the realistic context where no OOD data is available during training. We also report comparisons in the presence of OOD training samples, which shed new light on the impact of the vicinity of this data to OOD test data.
@article{Corbiere2021,author={Corbière, Charles and Lafon, Marc and Thome, Nicolas and Cord, Matthieu and Pérez, Patrick},journal={ICML Workshop on Uncertainty and Robustness in Deep Learning},title={{Beyond First-Order Uncertainty Estimation with Evidential Models for Open-World Recognition}},year={2021},}