Nweon Paper https://paper.nweon.com
Nweon: an influential information and data platform for the virtual reality (VR) and augmented reality (AR) industries

Enabling Human-Robot-Interaction via Virtual and Augmented Reality in Distributed Control Systems https://paper.nweon.com/6813

PubDate: Jan 2018

Teams: Technische Universität Berlin;Fraunhofer Institute for Production Systems and Design Technology

Writers: Jan Guhl;Johannes Hügle;Jörg Krüger

PDF: Enabling Human-Robot-Interaction via Virtual and Augmented Reality in Distributed Control Systems

Abstract

Production and assembly lines are nowadays transforming into flexible and interconnected cells due to rapidly changing production demands. Changes include, for example, varying locations and poses of the processed workpieces and tools, as well as of the involved machinery such as industrial robots. Even a variation in the combination or sequence of different production steps is possible. For older machines, the task of reconfiguration and calibration can be time-consuming. Together with the expected upcoming shortage of highly skilled workers, this may lead to future challenges, especially for small and medium-sized enterprises.

One possibility to address these challenges is to use distributed or cloud-based control for the participating machines. These approaches allow the use of more sophisticated and therefore in most cases computationally heavier algorithms than offered by classic monolithic controls. Those algorithms range from simple visual servoing applications to more complex scenarios, like sampling-based path planning in a previously 3D-reconstructed robot cell. Moving the computation of the machine’s reactions physically and logically away from the machine control complicates the supervision and verification of the computed robot paths and trajectories. This poses a potential threat to the machine’s operator since he/she is hardly capable of predicting or controlling the robot’s movements.

To overcome this drawback, this paper presents a system which allows the user to interact with industrial robots and other cyber-physical systems via augmented and virtual reality. Topics covered in this paper include the architecture and concept of the distributed system, first implementation details, and promising results obtained using a Microsoft HoloLens and other visualization devices.
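
As a hedged illustration of the distributed-control idea described above, the sketch below shows one way a remotely computed robot trajectory could be pushed to an AR/VR visualization client for operator preview before execution. The message fields, port and length-prefixed JSON framing are illustrative assumptions, not the paper's actual architecture or protocol.

```python
# Hypothetical sketch: publish a planned robot trajectory to an AR/VR client so
# the operator can inspect it before the robot moves. Fields and framing are
# assumptions for illustration only.
import json
import socket
from dataclasses import dataclass, asdict
from typing import List, Tuple

@dataclass
class TrajectoryPreview:
    robot_id: str
    waypoints: List[Tuple[float, ...]]  # joint-space waypoints in radians
    planner: str = "sampling_based"     # e.g. a sampling-based path planner
    requires_approval: bool = True

def send_preview(preview: TrajectoryPreview,
                 host: str = "127.0.0.1", port: int = 9000) -> None:
    """Serialize the planned trajectory and send it to the visualization client
    (e.g. a HoloLens app) as a length-prefixed JSON frame."""
    payload = json.dumps(asdict(preview)).encode("utf-8")
    with socket.create_connection((host, port)) as sock:
        sock.sendall(len(payload).to_bytes(4, "big") + payload)
```

A client on the headset would then read the 4-byte length, parse the JSON, and render the waypoints as a holographic path awaiting operator approval.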

Haptic feedback for virtual reality https://paper.nweon.com/6811

PubDate: April 2019

Teams: Beihang University

Writers: Dangxiao Wang

PDF: Haptic feedback for virtual reality

Abstract

Touch is an important channel for human beings to communicate with the external world and plays a crucial role in human interaction with nature. Information such as softness, friction, texture and warmth, or more complicated human emotional communication, can only be perceived by the act of touching. Despite its importance, it is striking that haptic feedback in human-machine interaction is still in its infancy. For example, a human user can enjoy realistic visual or auditory experiences facilitated by a computer through devices such as a head-mounted display or a stereo headset. However, the haptic experiences that the user can obtain during such interactions are extremely limited.

Avatar Implementation in Virtual Reality Environment using Situated Learning for “Tawaf” https://paper.nweon.com/6809

PubDate: Dec 2019

Teams: Universiti Teknologi MARA

Writers: Anita Mohd Yasin;Zeti Darleena;Mohd Ali Mohd Isa

PDF: Avatar Implementation in Virtual Reality Environment using Situated Learning for “Tawaf”

Abstract

The growing need for learner-centered education in recent years motivated this study, which applies a virtual reality learning environment as an approach built around engaging problem-solving activities. The study emphasizes an immersive, simulation-based 3D virtual reality environment guided by a human avatar. The avatar guides learners through the virtual reality environment as they participate in role-based problem solving, which holds high potential for situated learning. Simulation allows learners to immerse themselves in a problematic situation by providing dynamic navigation. Situated learning enables learners to take on an active role in the learning context of the system being studied, namely the Muslim tawaf practice, one of the pillars of hajj. The avatar then provides learners with a preliminary understanding by letting them practise tawaf virtually before they undergo actual training, so that learners can immerse themselves in the tawaf process virtually. Finally, a virtual reality prototype focusing on tawaf practice will be developed, which can be used by diverse learners ranging from school students to adult learners.

Tile-based 360-degree video streaming for mobile virtual reality in cyber physical system https://paper.nweon.com/6807

PubDate: Nov 2018

Teams: Gachon University

Writers: Jangwoo Son;Eun-Seok Ryu

PDF: Tile-based 360-degree video streaming for mobile virtual reality in cyber physical system

Abstract

Today, the demand for and interest in virtual reality (VR) are increasing, since we can now easily experience VR in many applications. However, the computational ability of mobile VR is limited compared to that of tethered VR. Since VR represents a 360-degree area, providing high quality only for the area viewed by the user saves considerable bandwidth. Therefore, we propose a new tile-based streaming method that transforms 360-degree videos into mobile VR using high efficiency video coding (HEVC) and the scalability extension of HEVC (SHVC). While the SHVC base layer (BL) represents the entire picture, the enhancement layer (EL) can transmit only the desired tiles by applying the proposed method. By transmitting the BL and EL using region of interest (ROI) tiles, the proposed method helps reduce not only the computational complexity on the decoder side but also the network bandwidth.
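
To make the tile-based idea concrete, here is a minimal sketch (my own simplification, not the authors' encoder or streaming code) of selecting which equirectangular tiles overlap the current viewport, so that only those tiles need to be requested from the enhancement layer while the base layer covers the full frame. The 6x4 tile grid and 90-degree field of view are illustrative assumptions.

```python
# Sketch of ROI tile selection for tiled 360-degree streaming (assumptions:
# equirectangular projection, 6x4 tile grid, 90-degree field of view).
from typing import List, Tuple

def viewport_tiles(yaw_deg: float, pitch_deg: float,
                   fov_deg: float = 90.0,
                   grid: Tuple[int, int] = (6, 4)) -> List[Tuple[int, int]]:
    """Return (column, row) indices of tiles intersecting the viewport."""
    cols, rows = grid
    tile_w, tile_h = 360.0 / cols, 180.0 / rows
    half = fov_deg / 2.0
    roi = []
    for c in range(cols):
        for r in range(rows):
            t_yaw = -180.0 + (c + 0.5) * tile_w      # tile centre, degrees
            t_pitch = -90.0 + (r + 0.5) * tile_h
            d_yaw = abs((t_yaw - yaw_deg + 180.0) % 360.0 - 180.0)  # wrap-around
            d_pitch = abs(t_pitch - pitch_deg)
            if d_yaw <= half + tile_w / 2 and d_pitch <= half + tile_h / 2:
                roi.append((c, r))
    return roi

# Example: only these tiles would be fetched in high quality from the enhancement layer.
print(viewport_tiles(yaw_deg=30.0, pitch_deg=0.0))
```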

Joint foveation-depth just-noticeable-difference model for virtual reality environment https://paper.nweon.com/6805

PubDate: Oct 2018

Teams: Wuhan University

Writers: Di Liu;Yingbin Wang;Zhenzhong Chen

PDF: Joint foveation-depth just-noticeable-difference model for virtual reality environment

Abstract

In this paper, we develop a joint foveation-depth just-noticeable-difference (FD-JND) model to quantify the perceptual redundancy of images in the VR display environment. The proposed FD-JND model is developed with consideration of the effects of both foveation and depth. More specifically, experiments for the VR environment on synthesized stimuli are conducted based on luminance masking and contrast masking, and the FD-JND model is developed accordingly. Subjective quality discrimination experiments between the noise-contaminated images and the original ones validate the favorable performance of the proposed FD-JND model.

Hand pose estimation in object-interaction based on deep learning for virtual reality applications https://paper.nweon.com/6803

PubDate: July 2020

Teams: National Taiwan University

Writers: Min-Yu Wu;Pai-Wen Ting;Ya-Hui Tang;En-Te Chou;Li-Chen Fu

PDF: Hand pose estimation in object-interaction based on deep learning for virtual reality applications

Abstract

Hand pose estimation aims to predict the positions of the joints of a hand from an image, and it has become popular because of the emergence of VR/AR/MR technology. Nevertheless, an issue arises when trying to achieve this goal, since a hand easily causes self-occlusion or external occlusion as it interacts with external objects. As a result, many projects have been dedicated to this field in search of a better solution to this problem. This paper develops a system that accurately estimates a hand pose in 3D space using depth images for VR applications. We propose a data-driven approach of training a deep learning model for hand pose estimation with object interaction. In the convolutional neural network (CNN) training procedure, we design a skeleton-difference loss function, which can effectively learn the physical constraints of a hand. We also propose an object-manipulating loss function, which considers knowledge of the hand-object interaction, to enhance performance.

In the experiments we conducted on hand pose estimation under different conditions, the results validate the robustness and performance of our system and show that our method is able to predict the joints more accurately in challenging environmental settings. Such appealing results may be attributed to the consideration of the physical joint relationship as well as object information, which in turn can be applied to future VR/AR/MR systems for a more natural experience.
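
The skeleton-difference idea can be illustrated with a small numeric sketch. This is my own reading of a bone-vector constraint term, not the authors' exact loss, and the bone index pairs below are hypothetical.

```python
# Sketch of a "skeleton-difference"-style loss: penalize differences between
# predicted and ground-truth bone vectors so a network respects the hand's
# physical structure. Bone pairs and data are illustrative assumptions.
import numpy as np

# Hypothetical kinematic chain: (parent_joint, child_joint) index pairs.
BONES = [(0, 1), (1, 2), (2, 3), (0, 4), (4, 5), (5, 6)]

def skeleton_difference_loss(pred: np.ndarray, gt: np.ndarray) -> float:
    """pred, gt: (num_joints, 3) arrays of 3D joint positions."""
    loss = 0.0
    for parent, child in BONES:
        pred_bone = pred[child] - pred[parent]   # predicted bone vector
        gt_bone = gt[child] - gt[parent]         # ground-truth bone vector
        loss += float(np.sum((pred_bone - gt_bone) ** 2))
    return loss / len(BONES)

# Example with random joints; in training, a term like this would be added to
# the usual per-joint position loss.
rng = np.random.default_rng(0)
pred, gt = rng.normal(size=(7, 3)), rng.normal(size=(7, 3))
print(skeleton_difference_loss(pred, gt))
```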

Venturing into the uncanny valley of mind—The influence of mind attribution on the acceptance of human-like characters in a virtual reality setting https://paper.nweon.com/6801

PubDate: March 2017

Teams: Technische Universität Chemnitz

Writers: Jan-Philipp Stein;Peter Ohler

PDF: Venturing into the uncanny valley of mind—The influence of mind attribution on the acceptance of human-like characters in a virtual reality setting

Abstract

For more than 40 years, the uncanny valley model has captivated researchers from various fields of expertise. Still, explanations as to why slightly imperfect human-like characters can evoke feelings of eeriness remain the subject of controversy. Many experiments exploring the phenomenon have emphasized specific visual factors in connection with evolutionary psychological theories or an underlying categorization conflict. More recently, studies have also shifted focus away from the appearance of human-like entities, instead exploring their mental capabilities as a basis for observers’ discomfort. In order to advance this perspective, we introduced 92 participants to a virtual reality (VR) chat program and presented them with two digital characters engaged in an emotional and empathic dialogue. Using the same pre-recorded 3D scene, we manipulated the perceived control type of the depicted characters (human-controlled avatars vs. computer-controlled agents), as well as their alleged level of autonomy (scripted vs. self-directed actions). Statistical analyses revealed that participants experienced significantly stronger eeriness if they perceived the empathic characters to be autonomous artificial intelligences. As human likeness and attractiveness ratings did not result in significant group differences, we present our results as evidence for an “uncanny valley of mind” that relies on the attribution of emotions and social cognition to non-human entities. A possible relationship to the philosophy of anthropocentrism and its “threat to human distinctiveness” concept is discussed.

Expansion of Peripheral Visual Field with Novel Virtual Reality Digital Spectacles https://paper.nweon.com/6799

PubDate: Feb 2020

Teams: University of Miami

Writers: Ahmed M. Sayed;Mostafa Abdel-Mottaleb;Rashed Kashem;Vatookarn Roongpoovapatr;Amr Elsawy;Mohamed Abdel-Mottaleb;Richard K. Parrish II;Mohamed Abou Shousha

PDF: Expansion of Peripheral Visual Field with Novel Virtual Reality Digital Spectacles

Abstract

Purpose
To examine an image remapping method for peripheral visual field (VF) expansion with novel virtual reality digital spectacles (DSpecs) to improve visual awareness in glaucoma patients.

Design
Prospective case series.

Methods
Monocular peripheral VF defects were measured and defined with a head-mounted display diagnostic algorithm. The monocular VF was used to calculate remapping parameters with a customized algorithm to relocate and resize unseen peripheral targets within the remaining VF. The sequence of monocular VF was tested and customized image remapping was carried out in 23 patients with typical glaucomatous defects. Test images demonstrating roads and cars were used to determine increased awareness of peripheral hazards while wearing the DSpecs. Patients’ scores in identifying and counting peripheral objects with the remapped images were the main outcome measurements.

Results
The diagnostic monocular VF testing algorithm was comparable to standard automated perimetric determination of threshold sensitivity based on point-by-point assessment. Eighteen of 23 patients (78%) could identify safety hazards with the DSpecs that they could not previously. The ability to identify peripheral objects improved with the use of the DSpecs (P = 0.024, chi-square test). Quantification of the number of peripheral objects improved with the DSpecs (P = 0.0026, Wilcoxon rank sum test).

Conclusions
These novel spectacles may enhance awareness of peripheral objects by enlarging the functional field of view in glaucoma patients.

Gaze direction affects walking speed when using a self-paced treadmill with a virtual reality environment https://paper.nweon.com/6797

PubDate: Oct 2019

Teams: University Medical Center Groningen;University Medical Center Utrecht

Writers: A.M. Jeschke;L.E. de Groot;L.H.V. van der Woude;I.L.B. Oude Lansink;L. van Kouwenhove;J.M. Hijmans

PDF: Gaze direction affects walking speed when using a self-paced treadmill with a virtual reality environment

Abstract

Background
In a previous study it was observed that participants increase their walking speed during a dual task while walking on a self-paced treadmill in a virtual reality (VR) environment (Gait Real-time Analysis Interactive Lab (GRAIL)). This observation is in contrast with the limited-resources hypothesis, which suggests that the walking speed of healthy persons decreases when performing a cognitive dual task.

Aim
The aim of the present study was therefore to determine whether the cognitive demand of the task, an aroused feeling, discrepancy in optic flow or a change in gaze direction caused participants to walk faster in this computer assisted rehabilitation environment.

Materials
The GRAIL included a self-paced treadmill, a motion-capture system and synchronized VR environments.

Methods
Thirteen healthy young adults (mean age 21.6 ± 2.5) were included in this study. Participants walked on the self-paced treadmill while seven different intervention conditions (IC) were offered. Prior to each IC, a control condition (CC) was used to determine the natural self-selected walking speed. Walking speed during the last 30 s of each IC was compared with the walking speed during the last 30 s of the preceding CC.

Results
Results show that the height at which a visual task was presented in the VR environment influenced walking speed. Participants walked faster when gaze was directed above the focus of expansion.

Significance
These findings contribute to a further understanding of the differences between walking in a real-life environment and in a computer-assisted rehabilitation environment. When analyzing gait on a self-paced treadmill in the future, one must be attentive to where the visual stimulus is placed in the VR environment.

Effects of interventions on normalizing step width during self-paced dual-belt treadmill walking with virtual reality, a randomised controlled trial https://paper.nweon.com/6795

PubDate: Oct 2017

Teams: University Medical Center Groningen

Writers: I.L.B. Oude Lansink;L. van Kouwenhove;P.U. Dijkstra;K. Postema;J.M. Hijmans

PDF: Effects of interventions on normalizing step width during self-paced dual-belt treadmill walking with virtual reality, a randomised controlled trial

Abstract

Background
Step width is increased during dual-belt treadmill walking, in self-paced mode with virtual reality. Generally a familiarization period is thought to be necessary to normalize step width.

Aim
The aim of this randomised study was to analyze the effects of two interventions on step width, to reduce the familiarization period.

Methods
We used the GRAIL (Gait Real-time Analysis Interactive Lab), a dual-belt treadmill with virtual reality in the self-paced mode. Thirty healthy young adults were randomly allocated to three groups and asked to walk at their preferred speed for 5 min. In the first session, the control-group received no intervention, the ‘walk-on-the-line’-group was instructed to walk on a line, projected on the between-belt gap of the treadmill and the feedback-group received feedback about their current step width and were asked to reduce it. Interventions started after 1 min and lasted 1 min. During the second session, 7–10 days later, no interventions were given.

Findings
Linear mixed modeling showed that interventions did not have an effect on step width after the intervention period in session 1. Initial step width (second 30 s) of session 1 was larger than initial step width of session 2. Step width normalized after 2 min and variation in step width stabilized after 1 min.

Interpretation
Interventions do not reduce step width after intervention period. A 2-min familiarization period is sufficient to normalize and stabilize step width, in healthy young adults, regardless of interventions. A standardized intervention to normalize step width is not necessary.

Development of a novel virtual reality gait intervention https://paper.nweon.com/6793

PubDate: Feb 2019

Teams: Washington University School of Medicine in St. Louis

Writers: Anna E. Boone;Matthew H. Foreman;Jack R. Engsberg

PDF: Development of a novel virtual reality gait intervention

Abstract

Introduction
Improving gait speed and kinematics can be a time-consuming and tiresome process. We hypothesize that incorporating virtual reality videogame play into variable improvement goals will improve levels of enjoyment and motivation and lead to improved gait performance.

Purpose
To develop a feasible, engaging, VR gait intervention for improving gait variables.

Methods
Completing this investigation involved four steps: 1) identify gait variables that could be manipulated to improve gait speed and kinematics using the Microsoft Kinect and free software, 2) identify free internet videogames that could successfully manipulate the chosen gait variables, 3) experimentally evaluate the ability of the videogames and software to manipulate the gait variables, and 4) evaluate the enjoyment and motivation from a small sample of persons without disability.

Results
The Kinect sensor was able to detect stride length, cadence, and joint angles. FAAST software was able to identify predetermined gait variable thresholds and use the thresholds to play free online videogames. Videogames that involved continuous pressing of a keyboard key were found to be most appropriate for manipulating the gait variables. Five participants without disability evaluated the effectiveness for modifying the gait variables and enjoyment and motivation during play. Participants were able to modify gait variables to permit successful videogame play. Motivation and enjoyment were high.

Summary
A clinically feasible and engaging virtual intervention for improving gait speed and kinematics has been developed and initially tested. It may provide an engaging avenue for achieving thousands of repetitions necessary for neural plastic changes and improved gait.
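
As a rough illustration of the threshold mechanism described in the Results above (my own generic sketch, not the study's FAAST configuration), a gait variable such as stride length can be compared against a therapist-chosen threshold and mapped to a "key held down" signal that drives the videogame. The stride values and the 0.60 m threshold below are illustrative assumptions.

```python
# Sketch: map a streamed gait variable (stride length) to a key-press signal
# whenever it exceeds a chosen threshold, so better strides drive the game.
from typing import Iterable, Iterator, Tuple

def stride_to_key_events(stride_lengths_m: Iterable[float],
                         threshold_m: float = 0.60) -> Iterator[Tuple[float, bool]]:
    """Yield (stride_length, key_down) pairs; key_down is True while the
    measured stride length meets or exceeds the target threshold."""
    for stride in stride_lengths_m:
        yield stride, stride >= threshold_m

# Example stream of stride lengths (metres) as a skeleton tracker might report them.
for stride, key_down in stride_to_key_events([0.52, 0.58, 0.63, 0.66, 0.57]):
    action = "press" if key_down else "release"
    print(f"stride={stride:.2f} m -> {action} game key")
```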

Emotional activity in early immersive design: Sketches and moodboards in virtual reality https://paper.nweon.com/6791

PubDate: Jan 2017

Teams: Arts & Métiers ParisTech

Writers: Vincent Rieuf;Carole Bouchard;Vincent Meyrueis;Jean-François Omhover

PDF: Emotional activity in early immersive design: Sketches and moodboards in virtual reality

Abstract

At the intersection of industrial design, virtual reality and experience psychology, we aim to offer engaging and efficient immersive tools to augment the design workflow. To evaluate the efficiency of our innovative design tools and their impact on the traditional industrial design process, we use experimental protocols inspired by design science, user experience evaluation and psycho-physiology. This research framework gives access to complex micro-processes appearing between the designer and his tools, regarding the emotional engagement of a designer in his activity. The goal of this paper is to present how specific early design needs can be fulfilled by immersive technologies. The presented findings show the level to which an immersive experience is valid for early design tasks.

Effect of virtual reality and whole-body heating on motion sickness severity: A combined and individual stressors approach https://paper.nweon.com/6789

PubDate: Dec 2019

Teams: Loughborough University

Writers: Josh T. Arnold;Kate O’Keeffe;Chloe McDaniel;Simon Hodder;Alex Lloyd

PDF: Effect of virtual reality and whole-body heating on motion sickness severity: A combined and individual stressors approach

Abstract

Background
Virtual reality (VR) use is limited by the potential side effects of prolonged exposure to vection, leading to motion sickness. Air temperature (Ta) may exacerbate the severity of such side effects through a synergistic interaction. This study assessed the individual and combined impact of a hot Ta and VR on motion sickness severity.

Method
Thirteen healthy volunteers were exposed to a 20 min visual stimulus, across four experimental conditions: N_CS: 22 °C Ta with computer screen; N_VR: 22 °C Ta with VR; H_CS: 35 °C Ta with computer screen; H_VR: 35 °C Ta with VR. Motion sickness was assessed via fast motion sickness scale (FMS) and simulator sickness questionnaire (SSQ). Physiological indices of motion sickness including, sweat rate, rectal temperature, cutaneous vascular conductance (CVC), skin temperature, blood pressure and heart rate were also examined.

Results
FMS and SSQ ratings indicate a significant main effect for VR, increasing sickness severity (p < 0.001). A significant main effect of Ta was observed for SSQ, but not FMS ratings (FMS, p = 0.07; SSQ, p < 0.04). Despite trends towards synergism, no interaction (Ta × VR) was observed for FMS (p = 0.2) or SSQ scores (p = 0.07), indicating an additive response. Synergistic trends were also observed for sweat rate and CVC.

Conclusion
Synergism between VR and heat on motion sickness remains unclear, possibly as a result of considerable inter-individual variation in the reported subjective responses. Understanding the questions raised by this study will help inform safe working guidelines for the use of VR in commercial and occupational settings.

Using Emotion Analysis to Define Human Factors of Virtual Reality Wearables https://paper.nweon.com/6787

PubDate: Dec 2019

Teams: King Abdulaziz University;Effat University

Writers: Ibtihal Makki;Wadee Alhalabi;Rania Samir Adham

PDF: Using Emotion Analysis to Define Human Factors of Virtual Reality Wearables

Abstract

Virtual reality (VR) is a rapidly evolving technology that is widely used in many areas of our lives. VR wearable device manufacturing has grown significantly in recent years, and performance and quality have improved to an impressive level. Developing technologies for human use, such as VR wearable devices, must take into account not only the technical specifications but also the Human Factors (HF), which involve both people and technology, in order to improve the level of user acceptance. There are many approaches to defining HF for a product based on user impressions of that product, such as questionnaires, surveys or interviews, but these methods can be superficial and unreliable because of biased questions and answers, doubts about respondents' truthfulness, and the difficulty of capturing users' actual emotions. Thus, emotion can be considered an affective component of the user's opinion about a product and, in turn, of its human factors.

This paper proposes an adaptive multi-label classification model (HF_EMA) based on supervised learning to predict five human factors from users' tweets (wearability, usability, safety, satisfaction and aesthetics) and to analyze and classify users' emotions regarding those factors into four categories (Happy, Sad, Anger and Love). The experimental results demonstrate the validity of the proposed model in both prediction and classification: it gave competitive results in predicting human factors from users' tweets and then measuring users' emotions regarding those factors, with an average ROC of 1, and the naïve Bayes classifiers outperformed the other classifiers. The results indicate that the usability factor is the most affected factor for VR wearables, followed by wearability.
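
For readers unfamiliar with multi-label text classification, the sketch below shows the general shape of such a pipeline using TF-IDF features and one-vs-rest naïve Bayes. The tweets, labels and model choices are illustrative assumptions, not the paper's HF_EMA implementation.

```python
# Generic multi-label pipeline: predict human-factor labels from short texts.
# Example data and model choices are assumptions for illustration only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.multiclass import OneVsRestClassifier
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MultiLabelBinarizer

FACTORS = ["wearability", "usability", "safety", "satisfaction", "aesthetics"]

tweets = [
    "The headset is too heavy after an hour",               # wearability
    "Menus are confusing and hard to navigate",              # usability
    "Strap feels loose, worried it might fall and break",    # wearability, safety
    "Love how sleek the new visor looks",                    # aesthetics, satisfaction
]
labels = [["wearability"], ["usability"], ["wearability", "safety"],
          ["aesthetics", "satisfaction"]]

binarizer = MultiLabelBinarizer(classes=FACTORS)
y = binarizer.fit_transform(labels)          # binary indicator matrix

model = make_pipeline(TfidfVectorizer(),
                      OneVsRestClassifier(MultinomialNB()))
model.fit(tweets, y)

pred = model.predict(["The controller grip looks gorgeous but the straps hurt"])
print(binarizer.inverse_transform(pred))
```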

Is your virtual self as sensational as your real? Virtual Reality: The effect of body consciousness on the experience of exercise sensations https://paper.nweon.com/6785

PubDate: March 2019

Teams: University of Kent;University of Cyprus

Writers: Maria Matsangidou;Chee Siang Ang;Alexis R. Mauger;Jittrapol Intarasirisawat;Boris Otkhmezuri;Marios N. Avraamides

PDF: Is your virtual self as sensational as your real? Virtual Reality: The effect of body consciousness on the experience of exercise sensations

Abstract

Objectives
Past research has shown that Virtual Reality (VR) is an effective method for reducing the perception of pain and effort associated with exercise. As pain and effort are subjective feelings, they are influenced by a variety of psychological factors, including one’s awareness of internal body sensations, known as Private Body Consciousness (PBC). The goal of the present study was to investigate whether the effectiveness of VR in reducing the feeling of exercise pain and effort is moderated by PBC.

Design and methods
Eighty participants were recruited to this study and were randomly assigned to a VR or a non-VR control group. All participants were required to maintain a 20% 1RM isometric bicep curl, whilst reporting ratings of pain intensity and perception of effort. Participants in the VR group completed the isometric bicep curl task whilst wearing a VR device which simulated an exercising environment. Participants in the non-VR group completed a conventional isometric bicep curl exercise without VR. Participants’ heart rate was continuously monitored along with time to exhaustion. A questionnaire was used to assess PBC.

Results
Participants in the VR group reported significantly lower pain and effort and exhibited longer time to exhaustion compared to the non-VR group. Notably, PBC had no effect on these measures and did not interact with the VR manipulation.

Conclusions
Results verified that VR during exercise could reduce negative sensations associated with exercise regardless of the levels of PBC.

Building environment information and human perceptual feedback collected through a combined virtual reality (VR) and electroencephalogram (EEG) method https://paper.nweon.com/6783

PubDate: Oct 2020

Teams: Beijing Jiaotong University;The University of Sydney

Writers: Junjie Li;Yichun Jin;Shuai Lu;Wei Wu;Pengfei Wang

PDF: Building environment information and human perceptual feedback collected through a combined virtual reality (VR) and electroencephalogram (EEG) method

Abstract

In order to accurately and quantitatively describe the influence of a building’s spatial environment on subjective human perception, this research establishes a relationship framework between the spatial environment and subjective feelings, with the goal of improving occupant satisfaction and work efficiency. In pursuit of this goal, this study presents actual scenes through virtual space and introduces a new research concept for analyzing human brain data. The researchers adopted virtual reality technology in a controlled laboratory environment to compose building simulation spaces and created immersive space perceptions in response to different scenarios. Neural signal electroencephalogram (EEG) data were obtained in the simulation space from participants wearing EEG signal acquisition caps. The experiment process was divided into two phases: scene cognition and task performance. Changes in human perception, as measured on physiological, psychological, and work efficiency indexes, were examined in three environments: open natural, semi-open library, and closed basement spaces. Based on a 32-point analysis of the EEGs of 30 subjects, researchers determined four points and one region with the most significant EEG changes after scene switching. Also, by examining the EEG rhythms in the scene cognition experiment phase, the authors identified a coupling relationship between β rhythms and total time to task completion, proving the mechanistic relationship between β rhythms and work efficiency. Finally, this research revealed a correlation between subjective perception and physiological signals by analyzing the relevance of the connection between subjective questionnaire responses and the β rhythms demonstrated in the EEG experiment, and then deducing the mechanism affecting work efficiency as influenced by different environments. The results obtained show that in the context of changes to elements of a building’s spatial environment, human work efficiency is most related to the β rhythms at several test points and the right temporal lobe region of the brain. Moreover, β rhythms are closely related to satisfaction with human spatial perception. Therefore, this research provides a more accurate set of reference information for building space design based on occupant satisfaction and physical and mental health. The methods and conclusions demonstrated here can be adopted as feedback for ways of obtaining more realistic information from human brain signals and using those data to optimize architectural space design.

3iVClass: a new classification method for Virtual, Augmented and Mixed Realities https://paper.nweon.com/6781

PubDate: Oct 2018

Teams: Université du Québec à Rimouski

Writers: Marc Parveau;Mehdi Adda

PDF: 3iVClass: a new classification method for Virtual, Augmented and Mixed Realities

Abstract

For several years, augmented and virtual reality technologies have attracted increasing interest in all areas. In the midst of this universe, the already well-known concept of mixed reality has established itself as a distinct paradigm. However, contrary to augmented and virtual reality, there is no single, clear definition of what it is exactly and why it differs from the other concepts. In this article, we attempt to provide a new classification method to standardize the definition of virtual, augmented and mixed realities. First, a quick overview of existing taxonomies is given; then we present our classification, which is based on three criteria and which we call 3iVClass (Immersion, Interaction, Information). Finally, in order to verify its reliability, we use this classification to propose a definition of mixed reality.

Gesture-based target acquisition in virtual and augmented reality https://paper.nweon.com/6779

PubDate: June 2019

Teams: Tsinghua University

Writers: Yukang YAN;Xin YI;Chun YU;Yuanchun SHI

PDF: Gesture-based target acquisition in virtual and augmented reality

Abstract

Gesture is a basic interaction channel that is frequently used by humans to communicate in daily life. In this paper, we explore gesture-based approaches for target acquisition in virtual and augmented reality. A typical process of gesture-based target acquisition is as follows: when a user intends to acquire a target, she performs a gesture with her hands, head or other parts of the body; the computer senses and recognizes the gesture and infers the most probable target.

Methods
We build a mental model and a behavior model of the user to study two key parts of the interaction process. The mental model describes how the user thinks up a gesture for acquiring a target and can be seen as the intuitive mapping between gestures and targets. The behavior model describes how the user moves the body parts to perform the gestures, and the relationship between the gesture that the user intends to perform and the signals that the computer senses.

Results
We present and discuss three pieces of research that focus on the mental model and the behavior model of gesture-based target acquisition in VR and AR.

Conclusions
We show that, by leveraging these two models, interaction experience and performance can be improved in VR and AR environments.

The effect of gaming on accommodative and vergence facilities after exposure to virtual reality head-mounted display https://paper.nweon.com/6777

PubDate: Feb 2020

Teams: University of KwaZulu-Natal

Writers: Alvin J. Munsamy;Husna Paruk;Bronwyn Gopichunder;Anela Luggya;Thembekile Majola;Sneliswa Khulu

PDF: The effect of gaming on accommodative and vergence facilities after exposure to virtual reality head-mounted display

Abstract

Background
To investigate the change in accommodative and vergence facilities before and after exposure to gaming with a virtual reality (VR) device amongst participants with normal binocular visual function.

Methods
62 participants between the ages of 18 and 30 years with normal binocular visual function and inter-pupillary distances between 51 and 70 mm were selected for the study. Spectacle and contact lens users were excluded. The experimental group (n = 42) was exposed to gaming using a Samsung Gear VR (SM-R323), whilst the control group (n = 20) watched a television film projected on a two-dimensional screen at 1 m. Pre-test and post-test binocular amplitude-scaled facilities and vergence facilities were obtained for both groups after exposures of 25 min.

Results
Binocular accommodative facilities for the experimental group had a mean pre-test and post-test facility of 11.14 ± 3.67 cpm and 13.38 ± 3.63 cpm, respectively, after gaming using a VR device. The vergence facilities for the experimental group had a mean pre-test and post-test facility of 11.41 ± 3.86 cpm and 15.28 ± 4.93 cpm, respectively, after gaming using a VR device. Binocular accommodative facilities for the control group had a mean pre-test and post-test facility of 11.70 ± 3.2 cpm and 11.95 ± 3.4 cpm, respectively. Vergence facilities for the control group had a mean pre-test and post-test facility of 11.55 ± 6.4 cpm and 11.70 ± 4.9 cpm, respectively. The mean change for binocular accommodative facilities was 2.24 ± 3.43 cpm and 0.25 ± 1.25 cpm for the experimental and control group, respectively. The mean change for vergence facilities was 3.81 ± 3.09 cpm and 0.15 ± 2.72 cpm for the experimental and control group, respectively. An independent t-test showed that both binocular accommodative facility and vergence facility increased significantly more in the experimental group than in the control group after gaming with a VR device (p < 0.05).

Conclusion
The results showed that binocular accommodative facilities and vergence facilities increased after 25 min of VR gaming in emmetropic participants under 30 years of age with inter-pupillary distances between 51 mm and 70 mm.

A Dual-cable Hand Exoskeleton System for Virtual Reality https://paper.nweon.com/6775

PubDate: Feb 2018

Teams: UNIST

Writers: Yeongyu Park;Inseong Jo;Jeongsoo Lee;Joonbum Bae

PDF: A Dual-cable Hand Exoskeleton System for Virtual Reality

Abstract

In this paper, a hand exoskeleton system for virtual reality is proposed. As a virtual reality interface for the hand, a wearable system should be able to measure the finger joint angles and apply force feedback to the fingers at the same time with a simple and light structure. In the proposed system, two different cable mechanisms are applied to achieve these requirements: three finger joint angles in the direction of flexion/extension (F/E) motion are measured by a tendon-inspired cable mechanism, and another cable is used for force feedback to the finger, giving one degree of freedom (DOF) of actuation per finger. As two different types of cables are used, the system is termed a dual-cable hand exoskeleton system. Using the measured finger joint angles and motor current, the cable-driven actuation system applies the desired force to the fingers. That is, when the desired force is zero, the motor position is controlled to follow the finger posture while maintaining the appropriate cable slack; when the desired force needs to be applied, the motor current is controlled to generate the desired force. To achieve a smooth transition between the two control strategies, the control inputs are linearly combined and the desired motor position is generated to prevent a sudden motor rotation. A prototype of the proposed system was manufactured with a weight of 320 g, a volume of 13 × 23 × 8 cm³, and a maximum force of up to 5 N. The proposed control algorithms were verified by experiments with virtual reality applications.
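
The control strategy described above (position tracking when no force is commanded, current control when force feedback is needed, and a smooth blend between them) can be sketched as follows. The gains, signals and blending rule are my own illustrative assumptions, not the authors' controller.

```python
# Sketch of blending a finger-tracking command with a force-rendering command
# into a single motor-current output. Gains and the blend rule are assumptions.
def blended_motor_current(desired_force_n: float,
                          finger_angle_rad: float,
                          motor_angle_rad: float,
                          kp: float = 2.0,
                          force_to_current_gain: float = 0.2,
                          max_force_n: float = 5.0) -> float:
    """Return one motor-current command mixing position tracking and force rendering."""
    # alpha = 0 -> pure finger tracking (slack management), 1 -> pure force control.
    alpha = min(max(desired_force_n / max_force_n, 0.0), 1.0)
    tracking_current = kp * (finger_angle_rad - motor_angle_rad)  # simple P term
    force_current = force_to_current_gain * desired_force_n
    return (1.0 - alpha) * tracking_current + alpha * force_current

# Example: a small 1 N virtual contact keeps the command dominated by finger tracking.
print(blended_motor_current(desired_force_n=1.0, finger_angle_rad=0.8, motor_angle_rad=0.7))
```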

Combining virtual reality and mobile eye tracking to provide a naturalistic experimental environment for shopper research https://paper.nweon.com/6773

PubDate: July 2019

Teams: University of Southern Denmark;Karlsruhe Institute of Technology;Bielefeld University;Monash University

Writers: Martin Meißner;Jella Pfeiffer;Thies Pfeiffer;Harmen Oppewal

PDF: Combining virtual reality and mobile eye tracking to provide a naturalistic experimental environment for shopper research

Abstract

Technological advances in eye tracking methodology have made it possible to unobtrusively measure consumer visual attention during the shopping process. Mobile eye tracking in field settings however has several limitations, including a highly cumbersome data coding process. In addition, field settings allow only limited control of important interfering variables. The present paper argues that virtual reality can provide an alternative setting that combines the benefits of mobile eye tracking with the flexibility and control provided by lab experiments. The paper first reviews key advantages of different eye tracking technologies as available for desktop, natural and virtual environments. It then explains how combining virtual reality settings with eye tracking provides a unique opportunity for shopper research in particular regarding the use of augmented reality to provide shopper assistance.

Towards OPM-MEG in a virtual reality environment https://paper.nweon.com/6771

PubDate: Oct 2019

Teams: University of Nottingham, University Park;Aston University;QuSpin Inc;Institute of Neurology

Writers: Gillian Roberts;Niall Holmes;Nicholas Alexander;Elena Boto;James Leggett;Ryan M. Hill;Vishal Shah;Molly Rea;Richard Vaughan;Eleanor A. Maguire;Klaus Kessler;Shaun Beebe;Mark Fromhold;Gareth R. Barnes;Richard Bowtell;Matthew J. Brookes

PDF: Towards OPM-MEG in a virtual reality environment

Abstract

Virtual reality (VR) provides an immersive environment in which a participant can experience a feeling of presence in a virtual world. Such environments generate strong emotional and physical responses and have been used for wide-ranging applications. The ability to collect functional neuroimaging data whilst a participant is immersed in VR would represent a step change for experimental paradigms; unfortunately, traditional brain imaging requires participants to remain still, limiting the scope of naturalistic interaction within VR. Recently however, a new type of magnetoencephalography (MEG) device has been developed, that employs scalp-mounted optically-pumped magnetometers (OPMs) to measure brain electrophysiology. Lightweight OPMs, coupled with precise control of the background magnetic field, enables participant movement during data acquisition. Here, we exploit this technology to acquire MEG data whilst a participant uses a virtual reality head-mounted display (VRHMD). We show that, despite increased magnetic interference from the VRHMD, we were able to measure modulation of alpha-band oscillations, and the visual evoked field. Moreover, in a VR experiment in which a participant had to move their head to look around a virtual wall and view a visual stimulus, we showed that the measured MEG signals map spatially in accordance with the known organisation of primary visual cortex. This technique could transform the type of neuroscientific experiment that can be undertaken using functional neuroimaging.
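
As a small, generic illustration of the kind of measure reported above (not the authors' analysis pipeline), alpha-band power for a single channel can be estimated from the Welch power spectral density; the sampling rate and synthetic signal below are assumptions for the example.

```python
# Estimate alpha-band (8-13 Hz) power for one MEG/EEG channel from its Welch PSD.
import numpy as np
from scipy.signal import welch

def alpha_band_power(signal: np.ndarray, fs: float,
                     band: tuple = (8.0, 13.0)) -> float:
    """Integrate the power spectral density over the alpha band."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(2 * fs))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(np.trapz(psd[mask], freqs[mask]))

# Synthetic example: a 10 Hz oscillation plus noise, sampled at 600 Hz.
fs = 600.0
t = np.arange(0, 10, 1 / fs)
sig = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
print(alpha_band_power(sig, fs))
```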

Influence of multi-modality on moving target selection in virtual reality https://paper.nweon.com/6769

PubDate: June 2019

Teams: Chinese Academy of Sciences

Writers: Yang LI;Dong WU;Jin HUANG;Feng TIAN;Hong’an WANG;Guozhong DAI

PDF: Influence of multi-modality on moving target selection in virtual reality

Abstract

Background
Owing to recent advances in virtual reality (VR) technologies, effective user interaction with dynamic content in 3D scenes has become a research hotspot. Moving target selection is a basic interactive task, and research on user performance in such tasks is significant for user interface design in VR. Unlike existing studies on static target selection, moving target selection in VR is affected by changes in target speed, angle and size, and some key factors remain under-researched.

Methods
This study designs an experimental scenario in which users play badminton in VR. By adding seven kinds of modality cues (vision, audio, haptics, and their combinations), five moving speeds and four serving angles, we study the effect of these factors on performance and subjective feelings during moving target selection in VR.

Results
The results show that the moving speed of the shuttlecock has a significant impact on user performance. The serving angle has a significant impact on the hitting rate, but no significant impact on the hitting distance. Under combined modalities, acquisition of the moving target is mainly driven by vision; adding additional modalities can improve user performance. Although the hitting distance increases in the trimodal condition, the hitting rate decreases.

Conclusion
This study analyses the results on user performance and subjective perception, and then provides suggestions on combining modality cues in different scenarios.

The effects of sub-threshold vibratory noise on visuomotor entrainment during human walking and standing in a virtual reality environment https://paper.nweon.com/6767

PubDate: Aug 2019

Teams: University of Wisconsin–Madison

Writers: Samuel A. Acuña;John D. Zunker;Darryl G. Thelen

PDF: The effects of sub-threshold vibratory noise on visuomotor entrainment during human walking and standing in a virtual reality environment

Abstract

Humans will naturally synchronize their posture to the motion of a visual surround, but it is unclear if this visuomotor entrainment can be attenuated with an increased sensitivity to somatosensory information. Sub-threshold vibratory noise applied to the Achilles tendons has proven to enhance ankle proprioception through the phenomenon of stochastic resonance. Our purpose was to compare visuomotor entrainment during walking and standing, and to understand how this entrainment might be attenuated by applying sub-threshold vibratory noise over the Achilles tendons. We induced visuomotor entrainment during standing and treadmill walking for ten subjects (24.5 ± 2.9 years) using a speed-matched virtual hallway with continuous mediolateral perturbations at three different frequencies. Vibrotactile motors over the Achilles tendons provided noise (0–400 Hz) with an amplitude set to 90% of each participant’s sensory threshold. Mediolateral sacrum, C7, and head motion was greatly amplified (4–8× on average) at the perturbation frequencies during walking, but was much less pronounced during standing. During walking, individuals with greater mediolateral head motion at the fastest perturbation frequency saw the greatest attenuation of that motion with applied noise. Similarly, during standing, individuals who exhibited greater postural sway (as measured by the center of pressure) also saw the greatest reductions in sway with sub-threshold noise applied in three of our summary metrics. Our results suggest that, at least for healthy young adults, sub-threshold vibratory noise over the Achilles tendons can slightly improve postural control during disruptive mediolateral visual perturbations, but the applied noise does not substantially attenuate visuomotor entrainment during walking or standing.
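
The stimulation idea (band-limited noise delivered below the perception threshold) can be sketched generically as follows; the sampling rate, filter order and normalization are illustrative assumptions rather than the study's hardware settings.

```python
# Generate band-limited (0-400 Hz) noise scaled to 90% of a participant's
# sensory threshold, i.e. sub-threshold vibrotactile noise. Parameters are
# illustrative assumptions only.
import numpy as np
from scipy.signal import butter, filtfilt

def subthreshold_noise(duration_s: float, fs: float,
                       cutoff_hz: float = 400.0,
                       threshold_amplitude: float = 1.0,
                       fraction: float = 0.9,
                       seed: int = 0) -> np.ndarray:
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(int(duration_s * fs))
    # Low-pass filter to keep only the 0-400 Hz band.
    b, a = butter(4, cutoff_hz / (fs / 2), btype="low")
    band_limited = filtfilt(b, a, white)
    # Normalise, then scale to 90% of the measured sensory threshold.
    band_limited /= np.max(np.abs(band_limited))
    return fraction * threshold_amplitude * band_limited

noise = subthreshold_noise(duration_s=2.0, fs=2000.0)
print(noise.shape, float(np.max(np.abs(noise))))  # peak stays below threshold
```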

Effect of spatial enhancement technology on input through the keyboard in virtual reality environment https://paper.nweon.com/6765

PubDate: July 2019

Teams: Zhejiang Sci-Tech University;Zhejiang Gongshang University

Writers: Zhen Yang;Cheng Chen;Yuqing Lin;Duming Wang;Hongting Li;Weidan Xu

PDF: Effect of spatial enhancement technology on input through the keyboard in virtual reality environment

Abstract

Scientific developments have enabled the application of virtual reality (VR) technology in various fields. However, text input in VR environments suffers from low recognition accuracy, bias, lack of precision, and user fatigue. To address these problems, this study proposed a spatial enhancement technique. The study investigated the effectiveness of spatially enhanced keys on a virtual keyboard from various angles and explored the impact of the enhanced response time and the enhanced protrusion distance on the spatial enhancement technology. The following conclusions were obtained: (1) the average text input performance of the keyboard using the spatial enhancement technique is significantly better than that of an ordinary virtual keyboard without it; (2) the recommended enhanced response time and protrusion distance are 0–100 ms and 1.85 diopters, respectively. The keyboard angle does not significantly affect keyboard input performance.

New interactive strategies for virtual reality streaming in degraded context of use https://paper.nweon.com/6763

PubDate: Feb 2020

Teams: Université Côte d’Azur

Writers: Lucile Sassatelli;Marco Winckler;Thomas Fisichella;Antoine Dezarnaud;Julien Lemaire;Ramon Aparicio-Pardo;Daniela Trevisan

PDF: New interactive strategies for virtual reality streaming in degraded context of use

Abstract

Virtual reality videos are an important element in the range of immersive contents as they open new perspectives for story-telling, journalism or education. Accessing these immersive contents through Internet streaming is, however, much more difficult owing to required data rates much higher than for regular videos. While current streaming strategies rely on video compression, in this paper we investigate a radically new stance: we posit that degrading the visual quality is not the only choice to reduce the required data rate, and not necessarily the best. Instead, we propose two new impairments, Virtual Walls (VWs) and Slow Downs (SDs), that change the way the user can interact with the 360° video in an environment with insufficient available bandwidth. User experiments with a double-stimulus approach show that, when triggered in proper time periods, these impairments are better perceived than visual quality degradation from video compression. We confirm with network simulations the usefulness of these new types of impairments: incorporated into a FoV-based adaptation, they can enable a reduction in stalls and startup delay and increase quality in the FoV, even in the presence of substantial playback buffers.
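
To illustrate how such impairments might slot into a FoV-based adaptation loop, here is a deliberately simplified decision sketch; the thresholds, names and rule ordering are my own assumptions, and the paper does not prescribe this logic.

```python
# Toy decision rule: when estimated bandwidth cannot sustain the current quality,
# prefer an interaction impairment (Virtual Wall or Slow Down) over a bitrate drop.
from enum import Enum

class Action(Enum):
    KEEP_QUALITY = "keep quality"
    VIRTUAL_WALL = "restrict head rotation (Virtual Wall)"
    SLOW_DOWN = "reduce playback speed (Slow Down)"
    LOWER_QUALITY = "drop bitrate"

def choose_action(bandwidth_mbps: float, required_mbps: float,
                  buffer_s: float, scene_allows_impairment: bool) -> Action:
    if bandwidth_mbps >= required_mbps:
        return Action.KEEP_QUALITY
    if scene_allows_impairment and buffer_s > 2.0:
        # Limiting where the user can look shrinks the set of tiles to fetch.
        return Action.VIRTUAL_WALL
    if scene_allows_impairment:
        # Slowing playback lets the buffer refill without a visible stall.
        return Action.SLOW_DOWN
    return Action.LOWER_QUALITY

print(choose_action(bandwidth_mbps=8.0, required_mbps=15.0,
                    buffer_s=3.5, scene_allows_impairment=True))
```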

User color temperature preferences in immersive virtual realities https://paper.nweon.com/6761

PubDate: June 2019

Teams: Karlsruhe University of Applied Sciences

Writers: Andreas Siess;Matthias Wölfel

PDF: User color temperature preferences in immersive virtual realities

Abstract

Virtual reality is currently experiencing a renaissance, with more and more fields of application emerging and more and more users spending an increasing amount of time using head-mounted displays (HMDs). While technical parameters such as field of view or pixel density are heavily researched, some fundamental aspects of perception, and in particular the effects on human cognition and physical constitution, are less investigated. One of these aspects, addressed in this study, is the perceived color temperature—a stimulus to which every color-seeing person reacts both consciously and subconsciously. A total of 86 test persons were asked to adjust the color temperature according to their personal taste in five photorealistic scenarios; once using a PC screen and once using an HMD. Significant differences in the personal preferences of the test persons were found between these two devices. This study opens a broad discussion about the effects of color temperature as an omnipresent stimulus and how external factors like daytime or season may affect these preferences in immersive environments.

Postural stability predicts the likelihood of cybersickness in active HMD-based virtual reality https://paper.nweon.com/6759

PubDate: July 2019

Teams: University of Wollongong;University of New England

Writers: Benjamin Arcioni;Stephen Palmisano;Deborah Apthorp;Juno Kim

PDF: Postural stability predicts the likelihood of cybersickness in active HMD-based virtual reality

Abstract

Cybersickness is common during virtual reality experiences with head-mounted displays (HMDs). Previously it has been shown that individual differences in postural activity can predict which people are more likely to experience visually-induced motion sickness. This study examined whether such predictions also generalise to the cybersickness experienced during active HMD-based virtual reality. Multisensory stimulation was generated by having participants continuously turn their heads from left to right while viewing the self-motion simulations. Real-time head tracking was then used to create ecological (‘compensated’) and non-ecological (‘inversely compensated’) head-and-display motion conditions. Ten (out of 20) participants reported feeling sick after being exposed to these self-motion simulations. Cybersickness did not differ significantly between the two compensation conditions. However, individual differences in spontaneous postural instability when standing quietly were found to predict the likelihood of subsequently experiencing cybersickness. These findings support recent proposals that postural measures can help diagnose who will benefit the most/least from HMD-based virtual reality.

With or without you? Interaction and immersion in a virtual reality experience https://paper.nweon.com/6757

PubDate: July 2019

Teams: Rennes School of Business;IRT b-com – Institut de Recherche Technologique b-com

Writers: Sarah Hudson;Sheila Matson-Barkat;Nico Pallamin;Guillaume Jegou

PDF: With or without you? Interaction and immersion in a virtual reality experience

Abstract

This collaborative research between a team of digital technology developers and academic researchers investigates how social interaction affects visitors’ experience during a virtual reality (VR) underwater seascape exploration. Prior research in immersive VR focused more on individual perceptions of immersion, interactive features and enjoyment. Analysis of focus-group discussions revealed three categories of immersion, interaction with the virtual environment (VE) and social interaction salient to satisfaction with the experience. Moderated mediation analysis of survey results from a full-scale trial (N = 234) show that the three variables had a significant role in experience satisfaction and loyalty intentions. Specifically, immersion mediates person-VE interaction effects on satisfaction and loyalty. The results contrast with previous findings from online gaming contexts, showing that social interactions decrease the impact of immersion on satisfaction and loyalty. We call for caution in the positioning and communication of VR experiences and for further research in other settings.

Construction of virtual reality system for radiation working environment reproduced by gamma-ray imagers combined with SLAM technologies https://paper.nweon.com/6755

PubDate: Oct 2020

Teams: Japan Atomic Energy Agency;Visible Information Center, Inc

Writers: Yuki Sato;Kojiro Minemoto;Makoto Nemoto;Tatsuo Torii

PDF: Construction of virtual reality system for radiation working environment reproduced by gamma-ray imagers combined with SLAM technologies

Abstract

The Fukushima Daiichi Nuclear Power Station (FDNPS), operated by Tokyo Electric Power Company Holdings, Inc., experienced a meltdown as a result of the large tsunami caused by the Great East Japan Earthquake on March 11, 2011. At that time, it was necessary to understand aspects of the decommissioning working environment inside the FDNPS, such as how radioactive substances were distributed across the site, so that work could be done efficiently without exposure to large amounts of radiation. Virtual reality (VR) therefore emerged as a solution. There have been previous reports on a technique for visualizing the distribution of radioactive substances in three dimensions utilizing a freely moving gamma-ray imager combined with simultaneous localization and mapping (SLAM) technology. In this paper, we introduce imaging technologies for the acquisition of image data from radioactive substances and three-dimensional (3D) structural models of the working environment, using a freely moving gamma-ray imager combined with SLAM technology. We also constructed a VR system and displayed the 3D data in a VR space, which enables users to experience the actual working environment without radiation exposure. With this VR system, any user can implement the method by donning an inexpensive head-mounted display and using free or low-cost application software.
