Multisensory Immersive Analytics
PubDate: October 2018
Teams: Monash University; Bangor University; University of Edinburgh; Federal University of Rio Grande do Sul; Ochanomizu University; Ecole Nationale de l'Aviation Civile
Writers: Jon McCormack; Jonathan C. Roberts; Benjamin Bach; Carla Dal Sasso Freitas; Takayuki Itoh; Christophe Hurter; Kim Marriott
While visual analytics has traditionally relied on visual cues, multimodal interaction technologies offer many new possibilities. This chapter explores the opportunities and challenges for developers and users in representing and exploring data through non-visual sensory channels. Users can experience data in new ways: variables from complex datasets can be conveyed through different senses; presentations become more accessible to people with vision impairment and can be personalized to specific user needs; and interactions can engage multiple senses to provide natural and transparent methods. Together, these techniques enable users to gain a better understanding of the underlying information. While the emphasis of this chapter is on non-visual immersive analytics, we also discuss how visual presentations can be integrated with other modalities, and the opportunities in mixing several sensory signals, including the visual domain.