TY - GEN
T1 - Subspace Communication Driven Search for High Dimensional Optimization
AU - Mathesen, Logan
AU - Chandrasekar, Kaushik Keezhnagar
AU - Li, Xinsheng
AU - Pedrielli, Giulia
AU - Candan, K. Selcuk
N1 - Funding Information:
This material is based upon work supported by the National Science Foundation Graduate Research Fellowship under Grant No. 026257-001. This research is also partially supported by NSF#1827757 “Data-Driven Services for High Performance and Sustainable Buildings”, NSF#1610282 “DataStorm: A Data Enabled System for End-to-End Disaster Planning and Response”, NSF#1633381 “Discovering Context-Sensitive Impact in Complex Systems”, NSF#1909555 “pCAR: Discovering and Leveraging Plausibly Causal (p-causal) Relationships to Understand Complex Dynamic Systems”, and “FourCmodeling”: EU H2020 Marie Sklodowska-Curie grant agreement No. 690817.
Publisher Copyright:
© 2019 IEEE.
PY - 2019/12
Y1 - 2019/12
N2 - Global optimization techniques often suffer from the curse of dimensionality. To face this challenge, high dimensional search techniques try to identify and exploit the effective, lower dimensionality of the problem, either in the original or in a transformed space. As a result, such algorithms search for and exploit a projection, or create a random embedding. Our approach avoids both the modeling of high dimensional spaces and the assumption of low effective dimensionality. We argue that effectively high dimensional functions can be recursively optimized over sets of complementary lower dimensional subspaces. In this light, we propose the novel Subspace COmmunication for OPtimization (SCOOP) algorithm, which enables intelligent information sharing among subspaces such that each subspace guides the others towards improved locations. Experiments show that the accuracy of SCOOP rivals state-of-the-art global optimization techniques, while being several orders of magnitude faster and scaling better with problem dimensionality.
AB - Global optimization techniques often suffer from the curse of dimensionality. To face this challenge, high dimensional search techniques try to identify and exploit the effective, lower dimensionality of the problem, either in the original or in a transformed space. As a result, such algorithms search for and exploit a projection, or create a random embedding. Our approach avoids both the modeling of high dimensional spaces and the assumption of low effective dimensionality. We argue that effectively high dimensional functions can be recursively optimized over sets of complementary lower dimensional subspaces. In this light, we propose the novel Subspace COmmunication for OPtimization (SCOOP) algorithm, which enables intelligent information sharing among subspaces such that each subspace guides the others towards improved locations. Experiments show that the accuracy of SCOOP rivals state-of-the-art global optimization techniques, while being several orders of magnitude faster and scaling better with problem dimensionality.
UR - http://www.scopus.com/inward/record.url?scp=85081130080&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85081130080&partnerID=8YFLogxK
U2 - 10.1109/WSC40007.2019.9004851
DO - 10.1109/WSC40007.2019.9004851
M3 - Conference contribution
AN - SCOPUS:85081130080
T3 - Proceedings - Winter Simulation Conference
SP - 3528
EP - 3539
BT - 2019 Winter Simulation Conference, WSC 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2019 Winter Simulation Conference, WSC 2019
Y2 - 8 December 2019 through 11 December 2019
ER -