PhysioHMD: a conformable, modular toolkit for collecting physiological data from head-mounted displays
PubDate: October 2018
Teams: MIT Media Lab, Xi’an Jiaotong University
Writers: Guillermo Bernal;Tao Yang;Abhinandan Jain;Pattie Maes
Abstract
Virtual and augmented reality headsets are unique in that they have access to our facial area: an area that presents an excellent opportunity for always-available input and insight into the user’s state. Their position on the face makes it possible to capture bio-signals as well as facial expressions. This paper introduces PhysioHMD, a modular software and hardware interface built for collecting affect and physiological data from users wearing a head-mounted display. The PhysioHMD platform is a flexible architecture that enables researchers and developers to aggregate and interpret signals in real time, and to use them to develop novel, personalized interactions and to evaluate virtual experiences. It offers an interface that is not only easy to extend but is also complemented by a suite of tools for testing and analysis. We hope that PhysioHMD can become a universal, publicly available testbed for VR and AR researchers.
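To make the idea of modular, real-time signal aggregation concrete, the following is a minimal, hypothetical Python sketch of how pluggable sensor modules might feed a common aggregator. The class names (SensorModule, EMGModule, EDAModule, SignalAggregator), channel names, and sampling details are assumptions for illustration only and do not reflect the actual PhysioHMD codebase.

```python
# Hypothetical sketch of a modular, real-time signal-aggregation loop
# in the spirit of the architecture described in the abstract.
# All names here are illustrative assumptions, not the PhysioHMD API.

import time
from abc import ABC, abstractmethod


class SensorModule(ABC):
    """A pluggable sensor module (e.g., facial EMG or EDA electrodes on the face pad)."""

    @abstractmethod
    def read(self) -> dict:
        """Return the latest sample as {channel_name: value}."""


class EMGModule(SensorModule):
    def read(self) -> dict:
        # Placeholder: a real module would read from the headset's facial electrodes.
        return {"emg_zygomaticus": 0.0, "emg_corrugator": 0.0}


class EDAModule(SensorModule):
    def read(self) -> dict:
        return {"eda": 0.0}


class SignalAggregator:
    """Collects samples from all registered modules at a fixed rate."""

    def __init__(self, modules: list, rate_hz: float = 100.0):
        self.modules = modules
        self.period = 1.0 / rate_hz

    def stream(self):
        """Yield timestamped, merged samples indefinitely."""
        while True:
            sample = {"t": time.time()}
            for module in self.modules:
                sample.update(module.read())
            yield sample
            time.sleep(self.period)


if __name__ == "__main__":
    aggregator = SignalAggregator([EMGModule(), EDAModule()], rate_hz=10.0)
    for i, sample in enumerate(aggregator.stream()):
        print(sample)
        if i >= 4:  # print a few samples and stop
            break
```

In this sketch, each sensor is wrapped in its own module class so new signal sources can be added without changing the aggregation loop, which is one plausible reading of the "modular" and "easy to extend" claims in the abstract.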