Information divergence functions play a critical role in statistics and information theory. In this paper we show that a nonparametric divergence measure can be used to provide improved bounds on the minimum binary classification probability of error, both when the training and test data are drawn from the same distribution and when there is a mismatch between the training and test distributions. We confirm these theoretical results by designing feature selection algorithms using the criteria from these bounds and by evaluating the algorithms on a series of pathological speech classification tasks.
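The abstract does not specify the estimator, but a commonly used nonparametric divergence of this kind is the Henze-Penrose (Dp) divergence, which can be estimated directly from two samples via the Friedman-Rafsky minimum-spanning-tree statistic. The sketch below is a minimal illustration under that assumption; the function name `dp_divergence` and the plug-in formula 1 - R(n+m)/(2nm) are illustrative choices, not necessarily the exact construction used in the paper.

```python
import numpy as np
from scipy.spatial.distance import cdist
from scipy.sparse.csgraph import minimum_spanning_tree

def dp_divergence(X, Y):
    """MST-based (Friedman-Rafsky) estimate of the Henze-Penrose (Dp)
    divergence between samples X (n x d) and Y (m x d).
    Assumes distinct sample points: zero distances are treated as
    absent edges by scipy's csgraph routines."""
    n, m = len(X), len(Y)
    Z = np.vstack([X, Y])                    # pooled sample
    labels = np.r_[np.zeros(n), np.ones(m)]  # 0 = from X, 1 = from Y
    dists = cdist(Z, Z)                      # pairwise Euclidean distances
    mst = minimum_spanning_tree(dists)       # sparse MST of the pooled sample
    rows, cols = mst.nonzero()
    # R = number of MST edges joining a point from X to a point from Y
    R = np.sum(labels[rows] != labels[cols])
    # Plug-in Dp estimate, clipped to [0, 1]
    return float(np.clip(1.0 - R * (n + m) / (2.0 * n * m), 0.0, 1.0))

# Example: two overlapping Gaussian samples
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))
Y = rng.normal(1.0, 1.0, size=(200, 2))
print(dp_divergence(X, Y))
```

In a feature selection setting of the kind the abstract describes, an estimate like this could serve as the selection criterion, e.g. by ranking candidate feature subsets according to the estimated divergence between the class-conditional samples; how the bounds are turned into a concrete criterion is detailed in the paper itself, not here.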
Keywords

- Bayes error rate
- Divergence measures
- Domain adaptation
- Nonparametric divergence estimator
ASJC Scopus subject areas
- Signal Processing
- Electrical and Electronic Engineering