Behind the Curtain of the “Ultimate Empathy Machine”: On the Composition of Virtual Reality Nonfiction Experiences (https://paper.nweon.com/1967)

PubDate: May 2019

Teams: University of Bristol, University of the West of England, University of Bath

Writers: Chris Bevan; David Philip Green; Harry Farmer; Mandy Rose; Kirsten Cater; Danaë Stanton Fraser; Helen Brown

PDF: Behind the Curtain of the “Ultimate Empathy Machine”: On the Composition of Virtual Reality Nonfiction Experiences

Abstract

Virtual Reality nonfiction (VRNF) is an emerging form of immersive media experience created for consumption using panoramic “Virtual Reality” headsets. VRNF promises nonfiction content producers the potential to create new ways for audiences to experience “the real”, allowing viewers to transition from passive spectators to active participants. Our current project is exploring VRNF through a series of ethnographic and experimental studies. To document the content available, we embarked on an analysis of VR documentaries produced to date. In this paper, we present an analysis of a representative sample of 150 VRNF titles released between 2012 and 2018. We identify and quantify 64 characteristics of the medium over this period, discuss how producers are exploiting the affordances of VR, and shed light on new audience roles. Our findings provide insight into the current state of the art in VRNF and offer a digital resource for other researchers in this area.

VRsneaky: Increasing Presence in VR Through Gait-Aware Auditory Feedback (https://paper.nweon.com/1965)

PubDate: May 2019

Teams: Ludwig Maximilian University of Munich, Utrecht University

Writers: Matthias Hoppe; Jakob Karolus; Felix Dietz; Paweł W. Woźniak; Albrecht Schmidt; Tonja-Katrin Machulla

PDF: VRsneaky: Increasing Presence in VR Through Gait-Aware Auditory Feedback

Abstract

While Virtual Reality continues to increase in fidelity, it remains an open question how to effectively reflect the user’s movements and provide congruent feedback in virtual environments. We present VRsneaky, a system for producing auditory movement feedback that helps participants orient themselves in a virtual environment by providing footstep sounds. The system reacts to the user’s specific gait features and adjusts the audio accordingly. In a user study with 28 participants, we found that VRsneaky increases users’ sense of presence as well as awareness of their own posture and gait. Additionally, we found that increasing auditory realism significantly influenced certain characteristics of participants’ gait. Our work shows that gait-aware audio feedback is a means to increase presence in virtual environments. We discuss opportunities and design requirements for future scenarios in which users walk through immersive virtual worlds.
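
As a rough illustration of how gait-aware footstep audio can work (an assumed scheme, not the authors' implementation), the sketch below triggers a footstep sound on each heel strike detected from tracked foot height and scales its volume with step speed; the threshold, the volume mapping, and the FootstepSynth/play_sound names are hypothetical.

# Minimal sketch of gait-aware footstep audio; thresholds and mappings are assumed.
FLOOR_EPS = 0.02  # metres above the floor still counted as ground contact (assumed)

class FootstepSynth:
    def __init__(self, play_sound):
        self.play_sound = play_sound  # callback taking a volume in [0, 1]
        self.in_contact = False

    def update(self, foot_height_m, foot_speed_ms):
        """Call once per tracking frame; fires a step sound on heel strike."""
        contact = foot_height_m < FLOOR_EPS
        if contact and not self.in_contact:
            # Heel strike detected: faster gait -> louder step (assumed mapping).
            self.play_sound(min(1.0, 0.3 + 0.7 * foot_speed_ms))
        self.in_contact = contact

A real system would additionally vary the footstep sample with ground material and gait style, which is the kind of gait-specific adjustment the abstract describes.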

Investigating Implicit Gender Bias and Embodiment of White Males in Virtual Reality with Full Body Visuomotor Synchrony (https://paper.nweon.com/1963)

PubDate: May 2019

Teams: University of San Francisco

Writers: Sarah Lopez; Yi Yang; Kevin Beltran; Soo Jung Kim; Jennifer Cruz Hernandez; Chelsy Simran; Bingkun Yang; Beste F. Yuksel

PDF: Investigating Implicit Gender Bias and Embodiment of White Males in Virtual Reality with Full Body Visuomotor Synchrony

Abstract

Previous research has shown that when White people embody a Black avatar in virtual reality (VR) with full-body visuomotor synchrony, this can reduce their implicit racial bias. In this paper, we put men in female and male avatars in VR with full visuomotor synchrony using wearable trackers and investigated implicit gender bias and embodiment. We found that participants embodied in female avatars displayed significantly higher levels of implicit gender bias than those embodied in male avatars; the implicit gender bias actually increased after exposure to female embodiment, in contrast to male embodiment. Results also showed that participants felt embodied in their avatars regardless of gender matching, demonstrating that wearable trackers can be used to create a realistic sense of avatar embodiment in VR. We discuss the future implications of these findings for both VR scenarios and embodiment technologies.
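
The abstract does not name its bias measure, but implicit bias is commonly quantified with an Implicit Association Test (IAT). Purely as a hedged illustration, a simplified IAT D-score can be computed as below; the full scoring algorithm also handles error trials and per-block standard deviations.

# Hypothetical, simplified IAT D-score; illustrative only, not the paper's measure.
from statistics import mean, stdev

def iat_d_score(congruent_rts, incongruent_rts):
    """Latency difference between incongruent and congruent blocks, divided
    by the pooled standard deviation of all trials. More positive values
    indicate a stronger implicit association (bias)."""
    pooled_sd = stdev(list(congruent_rts) + list(incongruent_rts))
    return (mean(incongruent_rts) - mean(congruent_rts)) / pooled_sd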

Crossing-Based Selection with Virtual Reality Head-Mounted Displays (https://paper.nweon.com/1961)

PubDate: May 2019

Teams: Nanjing University of Aeronautics and Astronautics, Kochi University of Technology, Chinese Academy of Sciences

Writers: Huawei Tu; Susu Huang; Jiabin Yuan; Xiangshi Ren; Feng Tian

PDF: Crossing-Based Selection with Virtual Reality Head-Mounted Displays

Abstract

This paper presents the first investigation into using the goal-crossing paradigm for object selection with virtual reality (VR) head-mounted displays. Two experiments were carried out to evaluate ray-casting crossing tasks, with target discs in 3D space and goal lines on a 2D plane respectively, in comparison to ray-casting pointing tasks. Five factors were considered in both experiments: task difficulty, the direction of the movement constraint (collinear vs. orthogonal), the nature of the task (discrete vs. continuous), the field of view of the VR device, and target depth. Our findings are: (1) crossing generally took no longer than pointing, and often less time, with higher or similar accuracy, indicating that crossing can complement or substitute for pointing; (2) crossing tasks can be modelled well with Fitts’ Law; (3) crossing performance depended on target depth; (4) crossing target discs in 3D space differed from crossing goal lines on a 2D plane in many respects, such as time and error performance, the effects of target depth, and the parameters of the Fitts’ Law models. Based on these findings, we formulate a number of design recommendations for crossing-based interaction in VR.
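
For reference, such Fitts’ Law models typically take the Shannon form, predicting movement time MT from movement amplitude D and target (or goal-line) width W; the exact variant and fitted constants used in the paper are not reproduced here:

    MT = a + b \log_2\left(\frac{D}{W} + 1\right)

where a and b are empirically fitted constants and \log_2(D/W + 1) is the index of difficulty in bits.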

A Design Space for Gaze Interaction on Head-mounted Displays (https://paper.nweon.com/1959)

PubDate: May 2019

Teams: Ulm University, University of Stuttgart

Writers: Teresa Hirzle; Jan Gugenheimer; Florian Geiselhart; Andreas Bulling; Enrico Rukzio

PDF: A Design Space for Gaze Interaction on Head-mounted Displays

Abstract

Augmented and virtual reality (AR/VR) has entered the mass market, and with it, eye tracking will soon arrive as a core technology for next-generation head-mounted displays (HMDs). In contrast to existing gaze interfaces, the 3D nature of AR and VR requires estimating a user’s gaze in 3D. While first applications, such as foveated rendering, hint at the compelling potential of combining HMDs and gaze, a systematic analysis is missing. To fill this gap, we present the first design space for gaze interaction on HMDs. Our design space covers human depth perception and technical requirements in two dimensions, aiming to identify challenges and opportunities for interaction design. As such, it provides a comprehensive overview and serves as an important guideline for researchers and practitioners working on gaze interaction on HMDs. We further demonstrate how our design space is used in practice by presenting two interactive applications: EyeHealth and XRay-Vision.
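
As a hedged sketch of what estimating gaze “in 3D” can mean in practice (an assumed vergence-based method, not necessarily the paper’s), one common approach intersects the two eyes’ gaze rays and takes the midpoint of their closest approach:

# Illustrative vergence-based 3D gaze estimate; inputs are numpy 3-vectors.
import numpy as np

def gaze_point_3d(p_left, d_left, p_right, d_right):
    """Estimate a 3D gaze point from per-eye origins p_* and unit gaze
    directions d_* via the closest approach of the two gaze rays."""
    w0 = p_left - p_right
    a, b, c = d_left @ d_left, d_left @ d_right, d_right @ d_right
    d, e = d_left @ w0, d_right @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-9:  # near-parallel rays: vergence depth is ill-defined
        return None
    t = (b * e - c * d) / denom  # parameter along the left-eye ray
    s = (a * e - b * d) / denom  # parameter along the right-eye ray
    return ((p_left + t * d_left) + (p_right + s * d_right)) / 2.0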

The Heat is On: Exploring User Behaviour in a Multisensory Virtual Environment for Fire Evacuation (https://paper.nweon.com/1957)

PubDate: May 2019

Teams: University of Nottingham

Writers: Emily Shaw; Tessa Roper; Tommy Nilsson; Glyn Lawson; Sue V.G. Cobb; Daniel Miller

PDF: The Heat is On: Exploring User Behaviour in a Multisensory Virtual Environment for Fire Evacuation

Abstract

Understanding the validity of user behaviour in Virtual Environments (VEs) is critical, as VEs are increasingly being used for serious Health and Safety applications such as predicting human behaviour and training for hazardous situations. This paper presents a comparative study exploring user behaviour in VE-based fire evacuation and investigates whether this behaviour is affected by the addition of thermal and olfactory simulation. Participants (N=43) were exposed to a virtual fire in an office building. Quantitative and qualitative analyses of participant attitudes and behaviours found deviations from those we would expect in real life (e.g. pre-evacuation actions), but also valid behaviours such as fire avoidance. Potentially important differences were found between the multisensory and audiovisual-only conditions (e.g. perceived urgency). We conclude that VEs have significant potential in safety-related applications, and that multimodality may afford additional uses in this context, but the identified limitations of behavioural validity must be carefully considered to avoid misapplication of the technology.

Measuring and Understanding Photo Sharing Experiences in Social Virtual Reality (https://paper.nweon.com/1955)

PubDate: May 2019

Teams: Centrum Wiskunde & Informatica, University of Oldenburg, Delft University of Technology

Writers: Jie Li; Yiping Kong; Thomas Röggla; Francesca De Simone; Swamy Ananthanarayan; Huib de Ridder; Abdallah El Ali; Pablo Cesar

PDF: Measuring and Understanding Photo Sharing Experiences in Social Virtual Reality

Abstract

Millions of photos are shared online daily, but the richness of interaction found in face-to-face (F2F) sharing is still missing. While this may change with social Virtual Reality (socialVR), we still lack tools to measure such immersive and interactive experiences. In this paper, we investigate photo sharing experiences in immersive environments, focusing on socialVR. Through context mapping (N=10), an expert creative session (N=6), and an online experience clustering questionnaire (N=20), we developed and statistically evaluated a questionnaire to measure photo sharing experiences. We then ran a controlled, within-subject study (N=26 pairs) comparing photo sharing in F2F, Skype, and Facebook Spaces conditions. Using interviews, audio analysis, and our questionnaire, we found that socialVR can closely approximate F2F sharing. We contribute empirical findings on the differences in immersiveness between digital communication media, and propose a socialVR questionnaire that may in the future generalize beyond photo sharing.
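
As one hedged example of what statistically evaluating a questionnaire can involve (the paper’s exact analysis may differ), internal consistency is routinely checked with Cronbach’s alpha:

# Cronbach's alpha over a respondents-by-items rating matrix.
import numpy as np

def cronbach_alpha(scores):
    """scores: (respondents x items) matrix of ratings. Values around 0.7-0.8
    or above are conventionally read as acceptable internal consistency."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # per-item variance over respondents
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)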

Does It Feel Real?: Using Tangibles with Different Fidelities to Build and Explore Scenes in Virtual Reality (https://paper.nweon.com/1953)

PubDate: May 2019

Teams: University of Bremen

Writers: Thomas Muender; Anke V. Reinschluessel; Sean Drewes; Dirk Wenig; Tanja Döring; Rainer Malaka

PDF: Does It Feel Real?: Using Tangibles with Different Fidelities to Build and Explore Scenes in Virtual Reality

Abstract

Professionals in domains such as film, theater, and architecture often rely on physical models to visualize spaces. With virtual reality (VR), new tools are available that provide immersive experiences with correct perception of depth and scale. However, these lack the tangibility of physical models. Using tangible objects in VR can close this gap, but doing so creates the challenges of producing suitable objects and of interacting with them while only the virtual objects are visible. This work addresses these challenges by evaluating tangibles of three haptic fidelities: identical disc-shaped tangibles for all virtual objects, Lego-built tangibles, and 3D-printed tangibles resembling the virtual shapes. We present results from a comparative study on immersion, performance, and intuitive interaction, along with interviews with domain experts. The results show that the 3D-printed objects perform best, but Lego offers a good trade-off between fast creation of tangibles and sufficient fidelity. The experts rate our approach as useful and would use all three versions.

Adding Proprioceptive Feedback to Virtual Reality Experiences Using Galvanic Vestibular Stimulation (https://paper.nweon.com/1951)

PubDate: May 2019

Teams: Massachusetts Institute of Technology

Writers: Misha Sra; Abhinandan Jain; Pattie Maes

PDF: Adding Proprioceptive Feedback to Virtual Reality Experiences Using Galvanic Vestibular Stimulation

Abstract

We present a small and lightweight wearable device that enhances virtual reality experiences and reduces cybersickness by means of galvanic vestibular stimulation (GVS). GVS is a specific way to elicit vestibular reflexes that has been used for over a century to study the function of the vestibular system. In addition to GVS, we support physiological sensing by connecting heart rate, electrodermal activity, and other sensors to our wearable device using a plug-and-play mechanism. An accompanying Android app communicates with the device over Bluetooth Low Energy (BLE) to transmit the GVS stimulus to the user through electrodes attached behind the ears. Our system supports multiple categories of virtual reality applications with different types of virtual motion, such as driving, navigating by flying, teleporting, or riding. We present a user study in which participants (N=20) experienced significantly less cybersickness when using our device and rated experiences with GVS-induced haptic feedback as significantly more immersive than a no-GVS baseline.
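
Purely as a hypothetical sketch of the control idea (the paper’s actual stimulus mapping, current limits, and safety handling are not specified here), a bipolar GVS level can be derived from virtual motion and clamped to a ceiling:

# Hypothetical mapping from virtual motion to a GVS stimulus level.
MAX_CURRENT_MA = 1.5  # assumed ceiling for illustration only, not a safety recommendation

def gvs_current_ma(virtual_roll_accel, gain=0.5):
    """Bipolar binaural GVS: the sign selects the ear toward which tilt is
    felt, the magnitude scales with the virtual angular acceleration
    (assumed linear mapping), clamped to the assumed ceiling."""
    return max(-MAX_CURRENT_MA, min(MAX_CURRENT_MA, gain * virtual_roll_accel))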

VibEye: Vibration-Mediated Object Recognition for Tangible Interactive Applications (https://paper.nweon.com/1949)

PubDate: May 2019

Teams: Pohang University of Science and Technology

Writers: Seungjae Oh; Gyeore Yun; Chaeyong Park; Jinsoo Kim; Seungmoon Choi

PDF: VibEye: Vibration-Mediated Object Recognition for Tangible Interactive Applications

Abstract

We present VibEye, a vibration-mediated object recognition system for tangible interaction. A user holds an object between two fingers wearing VibEye. VibEye triggers a vibration from one finger, and the vibration that has propagated through the object is sensed at the other finger. This vibration carries information about the object’s identity, which we represent using a spectrogram. Collecting the spectrograms of many objects, we formulate object recognition as a classical image classification problem. When tested with 20 users, this simple method shows 92.5% accuracy for 16 objects of the same shape made of various materials. This material-based classifier also extends to the recognition of everyday objects. Lastly, we demonstrate several tangible applications in which VibEye provides the needed functionality while enhancing user experiences. VibEye is particularly effective for recognizing objects made of different materials, which are difficult to distinguish by other means such as light and sound.
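
A minimal sketch of the described pipeline, with assumed details (the sample rate, window length, and an SVM standing in for whatever classifier the authors used): each recording becomes a log-magnitude spectrogram, flattened into a feature vector for a standard image-style classifier.

# Sketch: vibration recording -> spectrogram -> feature vector -> classifier.
import numpy as np
from scipy.signal import spectrogram
from sklearn.svm import SVC

FS = 8000  # assumed vibration sampling rate in Hz

def to_feature(vibration):
    """Turn one 1-D vibration recording into a flat feature vector.
    Recordings must be equal length so feature vectors share a shape."""
    _, _, sxx = spectrogram(vibration, fs=FS, nperseg=256)
    return np.log1p(sxx).ravel()

def train_classifier(signals, labels):
    """signals: list of equal-length recordings; labels: object identities."""
    features = np.stack([to_feature(s) for s in signals])
    return SVC(kernel="rbf").fit(features, labels)

An unseen object would then be identified with clf.predict(to_feature(new_signal).reshape(1, -1)).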
