The previous chapters demonstrated how the Conformal Predictions (CP) framework can be adapted to traditional machine learning problems, including active learning, feature selection, anomaly detection, change detection, model selection, and quality estimation. In this chapter, we describe three other adaptations of the CP framework, each of which is non-traditional in its own way.

The task of obtaining a reliability value for the classification of a data instance has been the focus of a number of studies. In Sections 9.2 and 9.3, we describe two methods that use the idea of a metaclassifier to associate reliability values with the output predictions of a base classifier. In Section 9.2, we describe Metaconformal Predictors, in which a base classifier is combined with a metaclassifier: the metaclassifier is trained on metadata generated from the data instances and the base classifier's classification results, and it associates a reliability value with the classification of each data instance. In Section 9.3, we describe Single-Stacking Conformal Predictors, in which an ensemble classifier consisting of the base classifier and the metaclassifier is constructed to compute reliability values for the classification outputs. The two approaches differ in how the metadata are constructed and in how the reliability values are estimated.
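The metaclassifier idea described above can be illustrated with a minimal sketch. This is not the chapter's exact construction: the nearest-centroid base classifier, the neighbourhood-based metaclassifier, and all names here are illustrative assumptions; the chapter's methods build metadata and estimate reliability differently.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data: class 0 around (-1, -1), class 1 around (+1, +1).
X0 = rng.normal(-1.0, 1.0, size=(100, 2))
X1 = rng.normal(+1.0, 1.0, size=(100, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

# Base classifier (assumed for illustration): nearest class centroid.
centroids = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def base_predict(x):
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

# Metadata: pair each instance with whether the base prediction was correct.
base_labels = np.array([base_predict(x) for x in X])
correct = (base_labels == y).astype(float)

# Metaclassifier (assumed form): reliability of a new prediction is the
# fraction of the instance's k nearest neighbours on which the base
# classifier was correct.
def reliability(x, k=15):
    d = np.linalg.norm(X - x, axis=1)
    nearest = np.argsort(d)[:k]
    return correct[nearest].mean()

x_new = np.array([0.9, 1.1])
label = base_predict(x_new)   # base classifier's output
rel = reliability(x_new)      # metaclassifier's reliability value in [0, 1]
```

The sketch separates the two roles cleanly: the base classifier produces the label, and the metaclassifier, trained only on correctness metadata, attaches a reliability value to it.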

Original language: English (US)
Title of host publication: Conformal Prediction for Reliable Machine Learning
Subtitle of host publication: Theory, Adaptations and Applications
Publisher: Elsevier Inc.
Number of pages: 19
ISBN (Print): 9780123985378
State: Published - Apr 2014


Keywords

  • Ensemble Classifiers
  • Metaclassification
  • Metaconformal Predictors
  • ROC Curves
  • Single-Stacking Conformal Predictors
  • Time Series Analysis

ASJC Scopus subject areas

  • Computer Science (all)


