Multisensory integration is the process by which the nervous system combines information from different sensory modalities. Understanding this process is important not only from a basic-science perspective but also for translational reasons, e.g., for the development of closed-loop neural prosthetic systems. Here we describe a versatile virtual reality platform that can be used to study the neural mechanisms of multisensory integration for the upper limb and could potentially be incorporated into systems for training robust neural prosthetic control. The platform coordinates multiple computers and programs and allows selection among different avatar arms as well as modification of a selected arm's visual properties. The system was tested with two non-human primates (NHPs) trained to reach to multiple targets on a tabletop. The reliability of visual feedback of the arm was manipulated by applying different levels of blur to the avatar arm; in addition, tactile feedback was manipulated by adding or removing physical targets from the environment. We observed differences in movement endpoint distributions that varied between animals and visual feedback conditions, as well as across targets. These results indicate that the system can be used to study multisensory integration in a well-controlled manner.
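The abstract does not specify how the blur levels were implemented; a common way to degrade visual feedback reliability in such a setup is to convolve the rendered arm image with a Gaussian kernel whose width (sigma) sets the blur level. The following is a minimal sketch of that idea; the function names and the separable-Gaussian implementation are assumptions for illustration, not the platform's actual rendering code.

```python
import numpy as np

def gaussian_kernel(sigma, radius=None):
    """1-D Gaussian kernel, normalized so its weights sum to 1."""
    if radius is None:
        radius = int(3 * sigma)  # cover +/- 3 standard deviations
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def blur_frame(frame, sigma):
    """Blur a 2-D image with a separable Gaussian: filter rows, then columns.

    A larger sigma produces a blurrier arm image, i.e. less reliable
    visual feedback; sigma near zero leaves the frame essentially intact.
    """
    k = gaussian_kernel(sigma)
    blurred = np.apply_along_axis(
        lambda row: np.convolve(row, k, mode="same"), 1, frame)
    blurred = np.apply_along_axis(
        lambda col: np.convolve(col, k, mode="same"), 0, blurred)
    return blurred
```

Because the kernel is normalized, overall image intensity is preserved while spatial detail is progressively smoothed as sigma grows, which makes the blur level a single, easily parameterized experimental variable.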