TY - GEN
T1 - On Misspecified Parameter Bounds with Application to Sparse Bayesian Learning
AU - Richmond, Christ D.
AU - Alhowaish, Abdulhakim
N1 - Publisher Copyright:
© 2020 IEEE.
PY - 2020/11/1
Y1 - 2020/11/1
N2 - The sparse vector recovery problem can lead to a combinatorial search of prohibitive computational cost. Hence, reformulations amenable to convex optimization strategies have been considered. Alternatively, Bayesian inference approaches such as variational Bayesian methods (VBM) can curtail computations. VBM, however, intentionally introduces a misspecified model to reduce computational requirements. This talk will review the theory of misspecified parameter bounds and extensions to the Bayesian framework. Additionally, it will be shown that misspecified bounds can provide tight predictions of the performance of sparse Bayesian learning approaches, and thus can be used to tune the hyperparameters of VBM for improved performance. The VBM gains in computational efficiency, however, come at the cost of increased mean squared error (MSE) compared to the perfectly specified model case. Examples will be shown that quantify this MSE increase and illustrate the apparent tradespace.
AB - The sparse vector recovery problem can lead to a combinatorial search of prohibitive computational cost. Hence, reformulations amenable to convex optimization strategies have been considered. Alternatively, Bayesian inference approaches such as variational Bayesian methods (VBM) can curtail computations. VBM, however, intentionally introduces a misspecified model to reduce computational requirements. This talk will review the theory of misspecified parameter bounds and extensions to the Bayesian framework. Additionally, it will be shown that misspecified bounds can provide tight predictions of the performance of sparse Bayesian learning approaches, and thus can be used to tune the hyperparameters of VBM for improved performance. The VBM gains in computational efficiency, however, come at the cost of increased mean squared error (MSE) compared to the perfectly specified model case. Examples will be shown that quantify this MSE increase and illustrate the apparent tradespace.
KW - Bayesian
KW - Cramér-Rao bound
KW - learning
KW - misspecified
KW - parameter bound
KW - sparse
UR - http://www.scopus.com/inward/record.url?scp=85107802627&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85107802627&partnerID=8YFLogxK
U2 - 10.1109/IEEECONF51394.2020.9443550
DO - 10.1109/IEEECONF51394.2020.9443550
M3 - Conference contribution
AN - SCOPUS:85107802627
T3 - Conference Record - Asilomar Conference on Signals, Systems and Computers
SP - 1472
EP - 1476
BT - Conference Record of the 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020
A2 - Matthews, Michael B.
PB - IEEE Computer Society
T2 - 54th Asilomar Conference on Signals, Systems and Computers, ACSSC 2020
Y2 - 1 November 2020 through 5 November 2020
ER -