Project Details

Description

VR SCENT: VR Smell Composition Engine to Assess Neurological Trauma

A robust way of evaluating a patient's sense of smell would make a critical difference in identifying and intervening in cases of brain damage and disease. Current treatments for brain injuries and related conditions such as PTSD and concussion have yet to integrate complex olfactory cues, despite strong correlative and mechanistic evidence linking those conditions to smell. Similarly, clinically common scratch-and-sniff tests have been used to help diagnose Alzheimer's or Parkinson's disease, but they lack the complexity of real olfactory experience, which must be harnessed to disambiguate true early disease onset from routine smell loss. As a result, a physician's ability to diagnose and to intervene on behalf of the injured brain remains limited.

We propose to evaluate whether a virtual reality system with olfactory integration can improve the ability of researchers and clinicians to intervene in brain damage and disease. We are currently developing a smell engine that links mathematically generated olfactory cues with virtual visual, auditory, and haptic stimuli, with the goal of studying the impact of olfaction on the diagnosis and treatment of brain disorders. Integrated with existing VR software, and enhanced by transcutaneous vestibular stimulation that provides motion perception matched to the visual stimulus without causing motion sickness, the smell engine maps the user's exploration of a virtual scene to corresponding scene-appropriate smells through programmed actuation of the output channels carrying odorants. This is a novel development: current commercial VR systems do not integrate smell, a critical component of sensory experience, into any VR environment, despite the potential for VR to assist in the diagnosis and treatment of brain injury or damage.

The combined skill set of this ASU-Mayo collaboration is critical to evaluating and refining our platform's suitability for clinical diagnostic and treatment uses. Team members are currently engaged in developing individual components and integrating them into a device that simulates multisensory worlds. At ASU, LiKamWa, Gerkin, and Smith have developed a preliminary smell engine: a virtual interface that activates hardware valves to control gas-phase concentrations of odorants dynamically in response to VR context. Our existing work draws on algorithms developed by Smith and Gerkin that determine the minimal, safe molecular recipe needed to construct nearly any target smell from a manageable set of primary odorants. Stepanek and Cevette at Mayo have developed and patented a system using noninvasive electrical stimulation of the inner ear that allows VR users to perceive motion consistent with the visual stimulus while avoiding motion sickness during complex simulations. Spackman and LiKamWa bring expertise in user-study design for immersive technologies.

With support from the Mayo-ASU seed grant, we will conduct a preliminary evaluation of the potential of multimodal sensory stimulation (VR / olfaction / neurovestibular) to address cases of brain injury or disease. We will refine the prototype of our smell engine through pilot experimental studies that demonstrate the efficacy of our approach. These will position us to seek future funding from military programs to investigate PTSD and from the NIH to investigate brain trauma and neurodegenerative diseases. We will also explore a commercial venture through SBIR mechanisms.
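To make the scene-to-smell mapping described above concrete, here is a minimal sketch of how a smell engine might translate the user's position in a virtual scene into per-valve actuation levels. The falloff model, channel assignments, and scene contents are all hypothetical; this is an illustrative sketch, not the project's actual implementation.

```python
# Illustrative sketch (assumed design, not the VR SCENT smell-engine code):
# map the user's position in a virtual scene to per-channel odorant
# intensities that can drive hardware valves proportionally.
import math
from dataclasses import dataclass

@dataclass
class OdorSource:
    position: tuple   # (x, y, z) source location in scene coordinates
    channel: int      # index of the hardware valve carrying this odorant
    strength: float   # peak intensity at the source

def channel_intensities(user_pos, sources, falloff=1.5):
    """Return a per-channel actuation level in [0, 1], using a simple
    inverse-power falloff with distance (an assumed perceptual model)."""
    duty = {}
    for s in sources:
        d = math.dist(user_pos, s.position)
        level = s.strength / (1.0 + d) ** falloff
        duty[s.channel] = min(1.0, duty.get(s.channel, 0.0) + level)
    return duty  # e.g. {0: 0.54, 3: 0.03}; feed to valve flow control

# Hypothetical scene: a coffee cup near the user, pine trees farther away.
sources = [OdorSource((0.5, 1.0, 0.0), channel=0, strength=1.0),
           OdorSource((8.0, 0.0, 2.0), channel=3, strength=0.9)]
print(channel_intensities((0.0, 1.0, 0.0), sources))
```

Recomputing these levels each frame as the headset pose updates is what keeps the delivered odor consistent with the visual scene.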
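The minimal-recipe computation can likewise be viewed as a constrained approximation problem. The sketch below assumes a linear odor-space model with hypothetical primaries and uses bounded least squares with post-hoc pruning; the published Smith-Gerkin algorithm may differ substantially, and the concentration cap here is only a stand-in for a real safety limit.

```python
# Illustrative sketch (not the Smith/Gerkin algorithm): approximate a target
# point in a perceptual odor space as a sparse, non-negative, bounded mixture
# of primary odorants, via bounded least squares plus pruning.
import numpy as np
from scipy.optimize import lsq_linear

def mix_recipe(primaries, target, max_conc=1.0, tol=1e-3):
    """primaries: (d, n) matrix whose column j gives the odor-space
    coordinates of primary odorant j; target: (d,) target coordinates.
    Returns concentrations in [0, max_conc] (max_conc stands in for a
    safety limit), with trace components zeroed to keep the recipe minimal."""
    res = lsq_linear(primaries, target, bounds=(0.0, max_conc))
    conc = res.x
    conc[conc < tol] = 0.0   # prune negligible primaries -> smaller recipe
    return conc

# Hypothetical 4-D odor space spanned by 6 primary odorants.
rng = np.random.default_rng(0)
P = rng.random((4, 6))
target = P @ np.array([0.5, 0.0, 0.3, 0.0, 0.0, 0.2])  # a reachable smell
print(mix_recipe(P, target))
```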
Status: Finished
Effective start/end date: 1/1/20 to 6/30/21

Funding

  • ASU: Mayo Seed Grant: $50,000.00
