Vision-based robotic applications involving aggressive maneuvers suffer from the low sensing speed of standard cameras, which sample frames at constant time intervals. Neuromorphic vision sensors, by contrast, are promising candidates for the required high-frequency sensing, but a new class of algorithms must be developed to handle their uncommon output: each pixel, independently of the others, fires an asynchronous stream of 'retinal events' whenever it detects a change in the light field. In this paper, we investigate the problem of stabilizing a stochastic continuous-time linear time-invariant system using noisy measurements from a neuromorphic vision sensor. We propose an H∞ controller that addresses this problem, derive the critical event-generation threshold for these sensors, and characterize the statistical properties of the resulting states. The efficacy of our approach is illustrated on an unstable system.
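To make the event-generation mechanism concrete, the following is a minimal sketch of a single event-camera pixel under a common simplifying assumption (an event fires when the log-intensity has changed by a fixed threshold since the last event); the function name, threshold value, and signal are illustrative and not the paper's exact sensor model:

```python
import numpy as np

def generate_events(intensity, times, threshold=0.2):
    """Simulate one neuromorphic pixel (illustrative model): emit an event
    whenever the log-intensity has drifted by more than `threshold` since
    the last event, then reset the reference level.

    intensity : array of positive light-field samples at this pixel
    times     : matching array of sample timestamps
    Returns a list of (timestamp, polarity) events, polarity in {+1, -1}.
    """
    log_i = np.log(intensity)
    ref = log_i[0]                      # reference level at the last event
    events = []
    for t, li in zip(times[1:], log_i[1:]):
        delta = li - ref
        if abs(delta) >= threshold:
            polarity = 1 if delta > 0 else -1
            events.append((t, polarity))
            ref = li                    # reset reference after firing
    return events

# Example: a pixel watching an exponentially brightening light field.
# Events arrive asynchronously, paced by the signal, not by a frame clock.
t = np.linspace(0.0, 1.0, 101)
events = generate_events(np.exp(2.0 * t), t, threshold=0.2)
```

Because event timing is driven by the signal rather than a frame clock, a fast-changing scene produces a dense event stream, which is the property the controller in the paper exploits; the `threshold` parameter here plays the role of the event-generation threshold analyzed in the paper.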