The University of British Columbia
Sensorimotor Systems Lab
September 8, 2006

HAVEN: the Haptic Auditory and Visual ENvironment

AR/VR, Graphics, Haptics, Sound

The HAVEN was a facility for multisensory modeling and simulation, developed at Rutgers University by Prof. Pai to support multisensory human interaction in an immersive virtual environment.

The HAVEN was a densely sensed environment. As humans interacted in the HAVEN, their position, motion, applied forces, and appearance (as well as the readings from hand-held tools) could be measured simultaneously by a multitude of sensors located on the walls, the ceiling, and the floor, and even attached to the subject’s hand. Sensors included a Vicon motion capture system, a ceiling-mounted stereo-vision system, a specially modified pressure sensor with 10,000 capacitive tactels for measuring the position and pressure of a person’s footsteps, a microphone array, and several force sensors.
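Combining readings from many sensors running at different rates requires aligning their streams on a common timeline. The page does not describe the HAVEN's actual synchronization scheme; the following is only an illustrative sketch, with hypothetical stream names and rates, of nearest-timestamp alignment across multi-rate streams:

```python
import bisect

def nearest_sample(timestamps, values, t):
    """Return the sample whose timestamp is closest to t.

    Assumes timestamps is sorted ascending, with one value per timestamp.
    """
    i = bisect.bisect_left(timestamps, t)
    if i == 0:
        return values[0]
    if i == len(timestamps):
        return values[-1]
    before, after = timestamps[i - 1], timestamps[i]
    return values[i] if after - t < t - before else values[i - 1]

def align_streams(streams, t):
    """Given {name: (timestamps, values)}, pick each stream's sample nearest t."""
    return {name: nearest_sample(ts, vs, t) for name, (ts, vs) in streams.items()}

# Hypothetical example: a 120 Hz "mocap" stream and a 1 kHz "force" stream.
# For illustration, each sample's value is simply its own timestamp.
mocap_t = [i / 120.0 for i in range(120)]
force_t = [i / 1000.0 for i in range(1000)]
streams = {
    "mocap": (mocap_t, mocap_t),
    "force": (force_t, force_t),
}
frame = align_streams(streams, 0.5)  # one time-aligned snapshot at t = 0.5 s
```

Each call to `align_streams` yields one cross-sensor snapshot; real systems would also need clock synchronization across devices, which this sketch ignores.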

As well as being densely sensed, the HAVEN was designed to be a rich multisensory display environment. For visual display, the chamber contained a rear-projection screen large enough to display human-sized avatars. Projectors located above the ceiling treated the table-top (or the floor) as a front-projection display. Polarized filters on the projectors, matched by filters worn by users, allowed interactive stereo visualization of 3-D objects and environments. Haptic interaction included both a PHANToM haptic device and passive haptics.
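Passive polarized stereo of this kind renders the scene twice, from two eye positions offset from the tracked head along the viewer's "right" axis. The page does not give the HAVEN's rendering code; this is a minimal sketch of that eye-offset computation, assuming a typical interocular distance of 6.4 cm:

```python
import math

def stereo_eyes(head, forward, up, ipd=0.064):
    """Offset the head position by half the interocular distance along the
    right vector (forward x up) to get left/right eye positions."""
    # right = forward x up (cross product), then normalize
    rx = forward[1] * up[2] - forward[2] * up[1]
    ry = forward[2] * up[0] - forward[0] * up[2]
    rz = forward[0] * up[1] - forward[1] * up[0]
    n = math.sqrt(rx * rx + ry * ry + rz * rz)
    rx, ry, rz = rx / n, ry / n, rz / n
    h = ipd / 2.0
    left = (head[0] - h * rx, head[1] - h * ry, head[2] - h * rz)
    right = (head[0] + h * rx, head[1] + h * ry, head[2] + h * rz)
    return left, right

# Viewer at 1.7 m height, 2 m from the screen, looking along -z.
left_eye, right_eye = stereo_eyes((0.0, 1.7, 2.0), (0.0, 0.0, -1.0), (0.0, 1.0, 0.0))
```

A full system would pair each eye position with an off-axis projection matrix matched to the fixed screen, so that the two polarized images fuse correctly for a moving viewer.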

The chamber was acoustically insulated and equipped with an array of loudspeakers for spatialized auditory display.

References
  • D. K. Pai, “Multisensory Interaction: Real and Virtual,” in proceedings of the International Symposium on Robotics Research, Siena, Italy, October 19-22, 2003 (Invited Paper). Also appears in Robotics Research: the Eleventh International Symposium, P. Dario and R. Chatila (Eds.), Springer Tracts on Advanced Robotics 15, 2005. pp. 489-498. [Paper page]
  • K. Yin and D. K. Pai, “FootSee: an Interactive Animation System,” in Proceedings of the Eurographics/SIGGRAPH Symposium on Computer Animation, San Diego, July 26-27, 2003, pp. 329-338. [Paper page]
  • T. Edmunds and D. K. Pai, “An Event Architecture for Distributed Interactive Multisensory Rendering,” 5th IEEE and ACM International Symposium on Mixed and Augmented Reality, Santa Barbara, CA, October 22-25, 2006. pp. 197-202. [DOI]
  • P. G. Kry and D. K. Pai, “Interaction Capture and Synthesis,” in ACM Transactions on Graphics (Proc. SIGGRAPH), 25(3), July 2006. pp. 872-880. [DOI]
  • T. Edmunds and D. K. Pai, “Perceptual Rendering for Learning Haptic Skills,” in Proceedings of the Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, Reno, Nevada, USA, March 13-14, 2008. pp. 225-230. [DOI]
Related sites
  • The Rutgers HAVEN
Prof. Dinesh K. Pai
Department of Computer Science
2366 Main Mall
Vancouver, BC Canada V6T 1Z4
Tel 604 822 8197