Do you understand epistemic uncertainty? Think again! Rigorous frequentist epistemic uncertainty estimation in regression

Created by MG96

Tags: External, Public, stat.ML, cs.LG

Statistics

Citations: 0
References: 0
Authors

Enrico Foglia, Benjamin Bobbia, Nikita Durasov, Michael Bauerheim, Pascal Fua, Stephane Moreau, Thierry Jardin
Project Resources

Name         Type    Source
ArXiv Paper  Paper   arXiv
Abstract

Quantifying model uncertainty is critical for understanding prediction reliability, yet distinguishing between aleatoric and epistemic uncertainty remains challenging. We extend recent work from classification to regression, providing a novel frequentist approach to epistemic and aleatoric uncertainty estimation. We train models to generate conditional predictions by feeding their initial output back as an additional input. This allows for a rigorous measurement of model uncertainty: we observe how the model's predictions change when conditioned on its own previous answer. We provide a complete theoretical framework for analyzing epistemic uncertainty in regression in a frequentist way, and explain how it can be exploited in practice to gauge a model's uncertainty with minimal changes to the original architecture.
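
The core mechanism described in the abstract, feeding the model's initial output back as an extra input and comparing the two passes, can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch example, not the paper's exact formulation: the class and function names, the zero placeholder used on the first pass, and the absolute-difference uncertainty proxy are all assumptions made for illustration.

```python
import torch
import torch.nn as nn


class ConditionalRegressor(nn.Module):
    """Regressor that accepts its own previous prediction as an extra input.

    Hypothetical sketch: the network maps (x, y_prev) -> y_hat. On the first
    pass, y_prev is a fixed placeholder (zeros here); on the second pass, the
    first prediction is fed back. The gap between the two passes is read as a
    proxy for epistemic uncertainty.
    """

    def __init__(self, in_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim + 1, hidden),  # extra input slot for the fed-back prediction
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor, y_prev: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([x, y_prev], dim=-1))


@torch.no_grad()
def two_pass_prediction(model: ConditionalRegressor, x: torch.Tensor):
    """First pass with a placeholder, second pass conditioned on the first output."""
    placeholder = torch.zeros(x.shape[0], 1, device=x.device)
    y0 = model(x, placeholder)         # prediction with no conditioning information
    y1 = model(x, y0)                  # prediction conditioned on the model's own answer
    epistemic_proxy = (y1 - y0).abs()  # discrepancy between passes as an uncertainty signal
    return y0, y1, epistemic_proxy


if __name__ == "__main__":
    model = ConditionalRegressor(in_dim=3)
    x = torch.randn(8, 3)
    y0, y1, u = two_pass_prediction(model, x)
    print(u.squeeze())
```

In this sketch, a well-fitted model should change its answer little when shown its own prediction, so a large discrepancy between the two passes flags inputs where the model is epistemically uncertain; the paper develops the frequentist framework that makes this intuition rigorous.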
