Global optimization techniques often suffer from the curse of dimensionality. To address this challenge, high-dimensional search techniques attempt to identify and exploit the lower effective dimensionality of the problem, either in the original or in a transformed space. Accordingly, such algorithms search for and exploit a projection, or construct a random embedding. Our approach avoids both modeling high-dimensional spaces and assuming low effective dimensionality. We argue that effectively high-dimensional functions can be optimized recursively over sets of complementary lower-dimensional subspaces. In this light, we propose the novel Subspace COmmunication for OPtimization (SCOOP) algorithm, which enables intelligent information sharing among subspaces so that each subspace guides the others toward improved locations. Experiments show that the accuracy of SCOOP rivals that of state-of-the-art global optimization techniques, while SCOOP is several orders of magnitude faster and scales better with problem dimensionality.
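To illustrate the general idea of optimizing a high-dimensional function over complementary lower-dimensional subspaces that share information through a common incumbent, consider the minimal sketch below. The objective (`sphere`), the random axis-aligned split of coordinates, the local perturbation search, and the simple round-robin schedule are all illustrative assumptions; they are not SCOOP's actual communication scheme.

```python
import numpy as np

def sphere(x):
    # Simple test objective (assumed here purely for illustration).
    return float(np.sum(x ** 2))

def optimize_over_subspace(f, x_best, dims, budget=200, rng=None):
    """Perturb only the coordinates in `dims`, keeping the rest fixed at the
    incumbent, and keep any improvement (a placeholder local search)."""
    if rng is None:
        rng = np.random.default_rng()
    best = x_best.copy()
    for _ in range(budget):
        cand = best.copy()
        cand[dims] += rng.normal(scale=0.1, size=len(dims))
        if f(cand) < f(best):
            best = cand
    return best

def subspace_search(f, dim, n_subspaces=4, rounds=10, seed=0):
    """Sketch: split the coordinates into complementary subspaces and optimize
    them in turn, each search starting from the shared incumbent best point."""
    rng = np.random.default_rng(seed)
    x_best = rng.uniform(-5.0, 5.0, size=dim)
    subspaces = np.array_split(rng.permutation(dim), n_subspaces)
    for _ in range(rounds):
        for dims in subspaces:
            # The improvement found in one subspace guides the next search.
            x_best = optimize_over_subspace(f, x_best, dims, rng=rng)
    return x_best, f(x_best)

if __name__ == "__main__":
    x, fx = subspace_search(sphere, dim=50)
    print(f"best value after subspace search: {fx:.4f}")
```

In this toy version the only information passed between subspaces is the incumbent best point; the paper's contribution lies in a richer, bidirectional exchange in which each subspace guides the others toward improved locations.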