TY - JOUR
T1 - Conformal predictions for information fusion
T2 - A comparative study of p-value combination methods
AU - Balasubramanian, Vineeth N.
AU - Chakraborty, Shayok
AU - Panchanathan, Sethuraman
N1 - Publisher Copyright:
© 2014, Springer Science+Business Media Dordrecht.
PY - 2015/6/8
Y1 - 2015/6/8
N2 - The increased availability of a wide range of sensing technologies over the last few decades has created a correspondingly increased need for reliable information fusion methods in machine learning applications. While existing theories such as Dempster-Shafer theory and possibility theory have been used for several years, they do not provide guarantees of error calibration in information fusion settings. The Conformal Predictions (CP) framework is a new game-theoretic approach to reliable machine learning, which provides a methodology to obtain error calibration under classification and regression settings. In this work, we present a methodology to extend the Conformal Predictions framework to both classification- and regression-based information fusion settings. This methodology is based on applying the CP framework to each data source as an independent hypothesis test, and subsequently using p-value combination methods to form a test statistic for the combined hypothesis after fusion. The proposed methodology was studied in classification and regression settings within two real-world application contexts: person recognition using multiple modalities (classification), and head pose estimation using multiple image features (regression). Our experimental results showed that quantile methods of combining p-values (such as the Standard Normal Function and the Non-conformity Aggregation methods) provided the most statistically valid calibration results, and can be considered suitable for extending the CP framework to information fusion settings.
AB - The increased availability of a wide range of sensing technologies over the last few decades has created a correspondingly increased need for reliable information fusion methods in machine learning applications. While existing theories such as Dempster-Shafer theory and possibility theory have been used for several years, they do not provide guarantees of error calibration in information fusion settings. The Conformal Predictions (CP) framework is a new game-theoretic approach to reliable machine learning, which provides a methodology to obtain error calibration under classification and regression settings. In this work, we present a methodology to extend the Conformal Predictions framework to both classification- and regression-based information fusion settings. This methodology is based on applying the CP framework to each data source as an independent hypothesis test, and subsequently using p-value combination methods to form a test statistic for the combined hypothesis after fusion. The proposed methodology was studied in classification and regression settings within two real-world application contexts: person recognition using multiple modalities (classification), and head pose estimation using multiple image features (regression). Our experimental results showed that quantile methods of combining p-values (such as the Standard Normal Function and the Non-conformity Aggregation methods) provided the most statistically valid calibration results, and can be considered suitable for extending the CP framework to information fusion settings.
KW - Conformal predictors
KW - Face processing applications
KW - Information fusion
KW - Multiple hypothesis testing
UR - http://www.scopus.com/inward/record.url?scp=84930417842&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84930417842&partnerID=8YFLogxK
U2 - 10.1007/s10472-013-9392-4
DO - 10.1007/s10472-013-9392-4
M3 - Article
AN - SCOPUS:84930417842
SN - 1012-2443
VL - 74
SP - 45
EP - 65
JO - Annals of Mathematics and Artificial Intelligence
JF - Annals of Mathematics and Artificial Intelligence
IS - 1-2
ER -