### Abstract

A number of fundamental quantities in statistical signal processing and information theory can be expressed as integral functions of two probability density functions. Such quantities are called density functionals because they map density functions onto the real line. For example, information divergence functions measure the dissimilarity between two probability density functions and are useful in a number of applications. Typically, estimating these quantities requires complete knowledge of the underlying distributions followed by multi-dimensional integration. Existing methods either make parametric assumptions about the data distribution or use non-parametric density estimation followed by high-dimensional integration. In this paper, we propose a new alternative. We introduce the concept of “data-driven basis functions”: functions of distributions whose values can be estimated given only samples from the underlying distributions, without requiring distribution fitting or direct integration. We derive a new data-driven complete basis that is similar to the deterministic Bernstein polynomial basis and develop two methods for performing basis expansions of functionals of two distributions. We also show that the new basis set allows us to approximate functionals of distributions as closely as desired. Finally, we evaluate the methodology by developing data-driven estimators for the Kullback-Leibler divergence and the Hellinger distance, and by constructing empirical estimates of tight bounds on the Bayes error rate.
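The quantities the abstract names can be made concrete with a small numerical sketch. The snippet below is illustrative only and is *not* the paper's estimator: it evaluates two density functionals, the Kullback-Leibler divergence and the squared Hellinger distance, by the conventional route the paper seeks to avoid, namely writing the densities down explicitly and integrating. The two Gaussian densities and all parameter values are hypothetical.

```python
import numpy as np

def gauss(x, mu, sigma):
    """Hypothetical 1-D Gaussian density, used only to illustrate the integrals."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Dense grid over a range where both densities have negligible tail mass.
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
f = gauss(x, 0.0, 1.0)   # density f
g = gauss(x, 1.0, 1.5)   # density g

# Density functionals expressed as integrals of the two densities,
# approximated here by a Riemann sum over the grid:
kl = np.sum(f * np.log(f / g)) * dx            # Kullback-Leibler divergence D(f || g)
hellinger_sq = 1.0 - np.sum(np.sqrt(f * g)) * dx   # squared Hellinger distance H^2(f, g)
```

Note what this plug-in approach requires: the closed-form densities and a numerical integration, both of which become impractical in high dimensions. The paper's contribution is estimating these same functionals directly from samples of `f` and `g`, with no density formulas or integration.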

Original language | English (US)
---|---
Journal | IEEE Transactions on Signal Processing
DOIs | https://doi.org/10.1109/TSP.2017.2775587
State | Accepted/In press - Nov 25 2017

### Keywords

- Bernstein polynomial
- direct estimation
- Divergence estimation
- nearest neighbor graphs

### ASJC Scopus subject areas

- Signal Processing
- Electrical and Electronic Engineering

### Cite this

Wisler, Alan; Berisha, Visar; Spanias, Andreas; Hero, Alfred O. **Direct estimation of density functionals using a polynomial basis.** *IEEE Transactions on Signal Processing*. https://doi.org/10.1109/TSP.2017.2775587

Research output: Contribution to journal › Article

TY - JOUR

T1 - Direct estimation of density functionals using a polynomial basis

AU - Wisler, Alan

AU - Berisha, Visar

AU - Spanias, Andreas

AU - Hero, Alfred O.

PY - 2017/11/25

Y1 - 2017/11/25

KW - Bernstein polynomial

KW - direct estimation

KW - Divergence estimation

KW - nearest neighbor graphs

UR - http://www.scopus.com/inward/record.url?scp=85035817032&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85035817032&partnerID=8YFLogxK

U2 - 10.1109/TSP.2017.2775587

DO - 10.1109/TSP.2017.2775587

M3 - Article

AN - SCOPUS:85035817032

JO - IEEE Transactions on Signal Processing

JF - IEEE Transactions on Signal Processing

SN - 1053-587X

ER -