TY - GEN
T1 - Combining SOS with Branch and Bound to Isolate Global Solutions of Polynomial Optimization Problems
AU - Colbert, Brendon
AU - Mohammadi, Hesameddin
AU - Peet, Matthew
N1 - Funding Information:
The authors gratefully acknowledge funding from NSF award CMMI-1538374 for the present work.
PY - 2018/8/9
Y1 - 2018/8/9
N2 - In this paper, we combine a branch and bound algorithm with SOS programming in order to obtain arbitrarily accurate solutions to Global Polynomial Optimization (GPO) problems with bounded feasible sets. These arbitrarily accurate solutions are then fed into local gradient descent algorithms to obtain the true global optimizer. The algorithm successively bisects the feasible set and uses SOS to compute a Greatest Lower Bound (GLB) over each feasible subset. For any desired accuracy, ϵ, we prove that the algorithm will return a point x such that |x - y| ≤ ϵ for some point y with objective value |f(y) - f(x∗)| ≤ ϵ, where x∗ is the global optimizer. To obtain this point x, the algorithm sequentially solves O(log(1/ϵ)) GLB problems, each of identical polynomial-time complexity. The point x can then be used as an accurate initial value for gradient descent algorithms. We illustrate this approach using a numerical example with several local optima and demonstrate that the proposed algorithm dramatically increases the effectiveness of standard global optimization solvers.
AB - In this paper, we combine a branch and bound algorithm with SOS programming in order to obtain arbitrarily accurate solutions to Global Polynomial Optimization (GPO) problems with bounded feasible sets. These arbitrarily accurate solutions are then fed into local gradient descent algorithms to obtain the true global optimizer. The algorithm successively bisects the feasible set and uses SOS to compute a Greatest Lower Bound (GLB) over each feasible subset. For any desired accuracy, ϵ, we prove that the algorithm will return a point x such that |x - y| ≤ ϵ for some point y with objective value |f(y) - f(x∗)| ≤ ϵ, where x∗ is the global optimizer. To obtain this point x, the algorithm sequentially solves O(log(1/ϵ)) GLB problems, each of identical polynomial-time complexity. The point x can then be used as an accurate initial value for gradient descent algorithms. We illustrate this approach using a numerical example with several local optima and demonstrate that the proposed algorithm dramatically increases the effectiveness of standard global optimization solvers.
UR - http://www.scopus.com/inward/record.url?scp=85052575576&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85052575576&partnerID=8YFLogxK
U2 - 10.23919/ACC.2018.8431203
DO - 10.23919/ACC.2018.8431203
M3 - Conference contribution
AN - SCOPUS:85052575576
SN - 9781538654286
T3 - Proceedings of the American Control Conference
SP - 2190
EP - 2197
BT - 2018 Annual American Control Conference, ACC 2018
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2018 Annual American Control Conference, ACC 2018
Y2 - 27 June 2018 through 29 June 2018
ER -