CalibrationAnalysis.jl

Multi-language suite for analyzing calibration of probabilistic predictive models.

Author: devmotion
Started: January 2022
Last updated: 2 years ago


This is a suite, written in Julia, for analyzing the calibration of probabilistic predictive models.

It is available for use in Julia, Python, and R.

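For context on what "calibration" means here: a probabilistic classifier g is calibrated if its predicted class probabilities match the true conditional class frequencies. In the framework of the papers cited below, this can be stated as:

```latex
% A model g is calibrated if, almost surely, for every class y,
\mathbb{P}\bigl(Y = y \mid g(X)\bigr) = g(X)_y
```

where g(X)_y denotes the probability the model assigns to class y.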

Talk at JuliaCon 2021

Calibration analysis of probabilistic models in Julia

The slides of the talk are available as a Pluto notebook.
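As a concrete illustration of the kind of quantity such a suite estimates, here is a minimal NumPy sketch of the expected calibration error (ECE) with uniform confidence binning. The function name, binning scheme, and example data are illustrative assumptions only and do not reproduce this package's API:

```python
import numpy as np

def expected_calibration_error(probs, labels, n_bins=10):
    """Confidence ECE (hypothetical helper, not this package's API):
    bin predictions by their maximum probability and average the
    absolute gap between accuracy and confidence, weighted by bin size."""
    probs = np.asarray(probs, dtype=float)
    labels = np.asarray(labels)
    conf = probs.max(axis=1)                                   # predicted confidence
    correct = (probs.argmax(axis=1) == labels).astype(float)   # 1 if prediction correct
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    ece = 0.0
    for i in range(n_bins):
        lo, hi = edges[i], edges[i + 1]
        if i == n_bins - 1:
            in_bin = (conf >= lo) & (conf <= hi)  # last bin includes conf == 1.0
        else:
            in_bin = (conf >= lo) & (conf < hi)
        if in_bin.any():
            ece += in_bin.mean() * abs(correct[in_bin].mean() - conf[in_bin].mean())
    return ece

# Perfectly sharp, always-correct predictions are perfectly calibrated:
probs = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 0.0]])
labels = np.array([0, 1, 0])
print(expected_calibration_error(probs, labels))  # → 0.0
```

With uniform binning this is the standard confidence ECE; the NeurIPS 2019 paper cited below discusses the bias of such binned estimators and develops alternative, kernel-based calibration errors and tests.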

Citing

If you use CalibrationAnalysis.jl as part of your research, teaching, or other activities, please consider citing the following publications:

Widmann, D., Lindsten, F., & Zachariah, D. (2019). Calibration tests in multi-class classification: A unifying framework. In Advances in Neural Information Processing Systems 32 (NeurIPS 2019) (pp. 12257–12267).

Widmann, D., Lindsten, F., & Zachariah, D. (2021). Calibration tests beyond classification. International Conference on Learning Representations (ICLR 2021).

Acknowledgements

This work was financially supported by the Swedish Research Council via the projects Learning of Large-Scale Probabilistic Dynamical Models (contract number: 2016-04278), Counterfactual Prediction Methods for Heterogeneous Populations (contract number: 2018-05040), and Handling Uncertainty in Machine Learning Systems (contract number: 2020-04122), by the Swedish Foundation for Strategic Research via the project Probabilistic Modeling and Inference for Machine Learning (contract number: ICA16-0015), by the Wallenberg AI, Autonomous Systems and Software Program (WASP) funded by the Knut and Alice Wallenberg Foundation, and by ELLIIT.
