Document Type

Conference Proceeding

Publication Date

9-2025

Abstract

It is common in machine learning to estimate a response y given covariate information x. However, such point predictions alone do not quantify the uncertainty associated with them. One way to overcome this deficiency is with conformal inference methods, which construct a set containing the unobserved response with a prescribed probability. Unfortunately, even with a one-dimensional response, conformal inference is computationally expensive despite recent encouraging advances. In this paper, we explore multi-output regression, delivering exact derivations of conformal inference p-values when the predictive model can be described as a linear function of y. Additionally, we introduce a multivariate extension of rootCP as well as unionCP as efficient ways of approximating the conformal prediction region for a wide array of multi-output predictors, both linear and nonlinear, while preserving computational advantages. We also provide theoretical and empirical evidence of the effectiveness of our methods using both real-world and simulated data.
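
For context, the sketch below illustrates the brute-force full-conformal p-value for a single candidate multi-output response z; it is not the paper's exact derivations or the rootCP/unionCP approximations, and the function name, nonconformity score, and the use of scikit-learn's Ridge regressor are illustrative assumptions.

```python
# A minimal sketch of the full-conformal p-value for one candidate response z.
# Assumptions: Ridge as the multi-output predictor and the Euclidean residual
# norm as the nonconformity score; both are illustrative choices.
import numpy as np
from sklearn.linear_model import Ridge

def conformal_p_value(X, Y, x_new, z, alpha_reg=1.0):
    """p-value for the candidate response z at covariate x_new.

    X : (n, d) covariates, Y : (n, k) multi-output responses,
    x_new : (d,) new covariate, z : (k,) candidate response.
    """
    # Augment the training set with the candidate pair (x_new, z).
    X_aug = np.vstack([X, x_new])
    Y_aug = np.vstack([Y, z])

    # Refit the predictor on the augmented data; repeating this refit for
    # every candidate z is what makes full conformal inference expensive.
    model = Ridge(alpha=alpha_reg).fit(X_aug, Y_aug)

    # Nonconformity score: Euclidean norm of the multi-output residual.
    residuals = np.linalg.norm(Y_aug - model.predict(X_aug), axis=1)

    # p-value: fraction of scores at least as large as the candidate's.
    return np.mean(residuals >= residuals[-1])

# z lies in the level-alpha conformal prediction region iff
# conformal_p_value(X, Y, x_new, z) > alpha.
```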

Comments

© 2025 C. Johnstone & E. Ndiaye.

Source Publication

Proceedings of Machine Learning Research, volume 266
