In haptic environments, it is especially demanding to design realistic interaction paradigms and to provide global navigational cues. In this paper, we present a methodology, inspired by the psychological basis of haptics, that can replace and/or augment realistic haptic environments. First, the system employs haptic cueing to convey information about the shape, size, texture, and material of an object through user-determined cues. The key idea guiding this approach is that humans retain a haptic memory of objects, so sparse data about object features, presented through cues, can invoke spatial concepts that reveal the object's identity. Second, the system represents an object's surface as a 2D raised-surface map in the virtual environment. Because the haptic modality is specialized for perceiving surface properties, this surface rendering mimics the real environment and presents veridical sensations of the surface. We compare this methodology to conventional haptic rendering, in which the object is sensed in its entirety through a haptic interface. The paper further studies the usability of realistic haptic rendering 1) in egocentric and exocentric reference frames, and 2) with tactile and/or force feedback. Initial results indicate that tactile cueing, combined with realistic rendering of surfaces with force feedback in an egocentric reference frame, leads to the most efficient perception.