We derive bounds on mutual information for arbitrary estimation problems in additive noise, where the noise is modeled using Gaussian mixtures. Previous work, which exploited the mutual information-minimum mean-squared error (I-MMSE) formula to build a bridge between bounds on the MMSE for Gaussian mixture estimation problems and bounds on the mutual information, is generalized to allow arbitrary noise modeling. A novel upper bound on the estimation information is also developed for the general estimation case. In addition, limits are analyzed to develop bounds on arbitrary entropies, to characterize the asymptotic behavior of all bounds, and to quantify bound errors, with some results carried back to the MMSE domain.
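The I-MMSE formula underlying the abstract states that, for an observation Y = sqrt(snr)·X + N with standard Gaussian noise N, the derivative of the mutual information I(X; Y) with respect to snr equals half the MMSE of estimating X from Y. A minimal numerical sketch of this relation, using the simple Gaussian-input case (not the Gaussian mixture setting of the paper) where both sides have closed forms, mmse(snr) = 1/(1 + snr) and I(snr) = (1/2) log(1 + snr):

```python
import numpy as np

# I-MMSE relation (Guo-Shamai-Verdu): for Y = sqrt(snr)*X + N, N ~ N(0,1),
#   dI/dsnr = 0.5 * mmse(snr),  so  I(snr) = 0.5 * integral_0^snr mmse(g) dg.
# Illustrative Gaussian-input case X ~ N(0,1), where mmse(snr) = 1/(1+snr).

def mmse_gaussian(snr):
    """Closed-form MMSE for a standard Gaussian input at the given SNR."""
    return 1.0 / (1.0 + snr)

snr_max = 10.0
grid = np.linspace(0.0, snr_max, 100001)

# Recover mutual information by numerically integrating half the MMSE.
I_from_mmse = 0.5 * np.trapz(mmse_gaussian(grid), grid)

# Compare against the closed form I(snr) = 0.5 * log(1 + snr).
I_closed_form = 0.5 * np.log1p(snr_max)
print(abs(I_from_mmse - I_closed_form) < 1e-6)  # → True
```

For Gaussian mixture inputs or noise, the MMSE no longer has a simple closed form, which is what motivates the bounding approach described above; the same integral relation then converts MMSE bounds into mutual information bounds.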
- Gaussian mixture models
- estimation information
ASJC Scopus subject areas
- Signal Processing
- Electrical and Electronic Engineering