IEEE Transactions on Visualization and Computer Graphics

Biao Xie, Yongqi Zhang, Haikun Huang, Elisa Ogawa, Tongjian You, Lap-Fai Yu
Games and experiences designed for virtual or augmented reality usually require the player to move physically to play. This poses a substantial challenge for level designers, because the player's physical experience in a level must be considered; otherwise the level may turn out to be too exhausting or not challenging enough. This paper presents a novel approach to optimizing level designs by considering the physical challenge imposed on the player in completing a level of a motion-based game. A game level is represented as an assembly of chunks characterized by the exercise intensity levels they impose on players...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Vincent Sitzmann, Ana Serrano, Amy Pavel, Maneesh Agrawala, Diego Gutierrez, Belen Masia, Gordon Wetzstein
Understanding how people explore immersive virtual environments is crucial for many applications, such as designing virtual reality (VR) content, developing new compression algorithms, or learning computational models of saliency or visual attention. Whereas a body of recent work has focused on modeling saliency in desktop viewing conditions, VR is very different from these conditions in that viewing behavior is governed by stereoscopic vision and by the complex interaction of head orientation, gaze, and other kinematic constraints...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Nitish Padmanaban, Timon Ruban, Vincent Sitzmann, Anthony M Norcia, Gordon Wetzstein
Virtual reality systems are widely believed to be the next major computing platform. There are, however, some barriers to adoption that must be addressed, such as that of motion sickness, which can lead to undesirable symptoms including postural instability, headaches, and nausea. Motion sickness in virtual reality occurs as a result of moving visual stimuli that cause users to perceive self-motion while they remain stationary in the real world. There are several contributing factors to both this perception of motion and the subsequent onset of sickness, including field of view, motion velocity, and stimulus depth...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Katja Zibrek, Elena Kokkinara, Rachel Mcdonnell
Virtual characters that appear almost photo-realistic have been shown to induce negative responses from viewers in traditional media, such as film and video games. This effect, described as the uncanny valley, is the reason why realism is often avoided when the aim is to create an appealing virtual character. In Virtual Reality, there have been few attempts to investigate this phenomenon and the implications of rendering virtual characters with high levels of realism on user enjoyment. In this paper, we conducted a large-scale experiment with over one thousand members of the public to gather information on how virtual characters are perceived in interactive virtual reality games...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Jingxin Zhang, Eike Langbehn, Dennis Krupke, Nicholas Katzakis, Frank Steinicke
Telepresence systems have the potential to overcome limits and distance constraints of the real world by enabling people to remotely visit and interact with each other. However, current telepresence systems usually lack natural ways of supporting interaction and exploration of remote environments (REs). In particular, single webcams for capturing the RE provide only a limited illusion of spatial presence, and movement control of mobile platforms in today's telepresence systems is often restricted to simple interaction devices...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Robert Xiao, Julia Schwarz, Nick Throm, Andrew D Wilson, Hrvoje Benko
We present MRTouch, a novel multitouch input solution for head-mounted mixed reality systems. Our system enables users to reach out and directly manipulate virtual interfaces affixed to surfaces in their environment, as though they were touchscreens. Touch input offers precise, tactile and comfortable user input, and naturally complements existing popular modalities, such as voice and hand gesture. Our research prototype combines both depth and infrared camera streams together with real-time detection and tracking of surface planes to enable robust finger-tracking even when both the hand and head are in motion...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Thomas Waltemate, Dominik Gall, Daniel Roth, Mario Botsch, Marc Erich Latoschik
This article reports the impact of the degree of personalization and individualization of users' avatars, as well as the impact of the degree of immersion, on typical psychophysical factors in embodied Virtual Environments. We investigated whether and how virtual body ownership (including agency), presence, and emotional response are influenced by the specific look of users' avatars, which varied between (1) a generic hand-modeled version, (2) a generic scanned version, and (3) an individualized scanned version...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Patric Schmitz, Julian Hildebrandt, Andre Calero Valdez, Leif Kobbelt, Martina Ziefle
In virtual environments, the space that can be explored by real walking is limited by the size of the tracked area. To enable unimpeded walking through large virtual spaces in small real-world surroundings, redirection techniques are used. These unnoticeably manipulate the user's virtual walking trajectory. It is important to know how strongly such techniques can be applied without the user noticing the manipulation, or getting cybersick. Previously, this was estimated by measuring a detection threshold (DT) in highly controlled psychophysical studies, which experimentally isolate the effect but do not aim for perceived immersion in the context of VR applications...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Atul Rungta, Carl Schissler, Nicholas Rewkowski, Ravish Mehra, Dinesh Manocha
We present a novel method to generate plausible diffraction effects for interactive sound propagation in dynamic scenes. Our approach precomputes a diffraction kernel for each dynamic object in the scene and combines them with interactive ray tracing algorithms at runtime. A diffraction kernel encapsulates the sound interaction behavior of individual objects in the free field and we present a new source placement algorithm to significantly accelerate the precomputation. Our overall propagation algorithm can handle highly-tessellated or smooth objects undergoing rigid motion...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Tabitha C Peck, My Doan, Kimberly A Bourne, Jessica J Good
The underrepresentation of women in technical and STEM fields is a well-known problem, and stereotype-threatening situations have been linked to difficulties in recruiting and retaining women in these fields. Virtual reality enables the unique ability to perform body-swap illusions, and research has shown that these illusions can change participant behavior. Characteristically, people take on the traits of the avatar they are embodying. We hypothesized that female participants embodying male avatars when a stereotype threat was made salient would demonstrate stereotype lift...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Ryohei Nagao, Keigo Matsumoto, Takuji Narumi, Tomohiro Tanikawa, Michitaka Hirose
This paper presents a novel interactive system that provides users with virtual reality (VR) experiences, wherein users feel as if they are ascending/descending stairs through passive haptic feedback. The passive haptic stimuli are provided by small bumps under the feet of users; these stimuli are provided to represent the edges of the stairs in the virtual environment. The visual stimuli of the stairs and shoes, provided by head-mounted displays, evoke a visuo-haptic interaction that modifies a user's perception of the floor shape...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Maria Murcia-Lopez, Anthony Steed
As we explore the use of consumer virtual reality technology for training applications, there is a need to evaluate its validity compared to more traditional training formats. In this paper, we present a study that compares the effectiveness of virtual training and physical training for teaching a bimanual assembly task. In a between-subjects experiment, 60 participants were trained to solve three 3D burr puzzles in one of six conditions comprising virtual and physical training elements. In the four physical conditions, training was delivered via paper- and video-based instructions, with or without the physical puzzles to practice with...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Andrew MacQuarrie, Anthony Steed
360° images and video have become extremely popular formats for immersive displays, due in large part to the technical ease of content production. While many experiences use a single camera viewpoint, an increasing number of experiences use multiple camera locations. In such multi-view 360° media (MV360M) systems, a visual effect is required when the user transitions from one camera location to another. This effect can take several forms, such as a cut or an image-based warp, and the choice of effect may impact many aspects of the experience, including issues related to enjoyment and scene understanding...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Geng Lyu, Xukun Shen, Taku Komura, Kartic Subr, Lijun Teng
Displays that can portray environments that are perceivable from multiple views are known as multiscopic displays. Some multiscopic displays enable realistic perception of 3D environments without the need for cumbersome mounts or fragile head-tracking algorithms. These automultiscopic displays carefully control the distribution of emitted light over space, direction (angle) and time so that even a static displayed image can encode parallax across viewing directions (lightfield). This allows simultaneous observation by multiple viewers, each perceiving 3D from their own (correct) perspective...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Bicheng Luo, Feng Xu, Christian Richardt, Jun-Hai Yong
We propose a novel 360° scene representation for converting real scenes into stereoscopic 3D virtual reality content with head-motion parallax. Our image-based scene representation enables efficient synthesis of novel views with six degrees-of-freedom (6-DoF) by fusing motion fields at two scales: (1) disparity motion fields carry implicit depth information and are robustly estimated from multiple laterally displaced auxiliary viewpoints, and (2) pairwise motion fields enable real-time flow-based blending, which improves the visual fidelity of results by minimizing ghosting and view transition artifacts...
April 2018: IEEE Transactions on Visualization and Computer Graphics
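The pairwise flow-based blending described above can be illustrated with a minimal sketch (hypothetical code, not the authors' implementation): each source image is warped a fraction of the way along the estimated motion field toward the intermediate viewpoint, and the two warps are cross-faded, which suppresses ghosting compared with a plain cross-fade.

```python
import numpy as np

def backward_warp(img, dx, dy):
    """Sample img at (x + dx, y + dy) with nearest-neighbor lookup,
    clamping sample coordinates at the image border."""
    h, w = img.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    sx = np.clip(np.round(xs + dx).astype(int), 0, w - 1)
    sy = np.clip(np.round(ys + dy).astype(int), 0, h - 1)
    return img[sy, sx]

def blend_views(img_a, img_b, flow_u, flow_v, alpha):
    """Synthesize an intermediate view at fractional position
    alpha in [0, 1] between two camera images. (flow_u, flow_v)
    is the per-pixel motion field from img_a to img_b; each image
    is warped partway along the flow, then the warps are blended."""
    warped_a = backward_warp(img_a, alpha * flow_u, alpha * flow_v)
    warped_b = backward_warp(img_b, -(1.0 - alpha) * flow_u,
                             -(1.0 - alpha) * flow_v)
    return (1.0 - alpha) * warped_a + alpha * warped_b
```

With alpha = 0 the result is exactly the first view, with alpha = 1 the second; intermediate values trace the transition along the motion field rather than fading through misaligned pixels.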
Pietro Lungaro, Rickard Sjoberg, Alfredo Jose Fanghella Valero, Ashutosh Mittal, Konrad Tollmar
This paper presents a novel approach to content delivery for video streaming services. It exploits information from connected eye-trackers embedded in the next generation of VR Head Mounted Displays (HMDs). The proposed solution aims to deliver high visual quality, in real time, around the user's fixation points while lowering the quality everywhere else. The goal of the proposed approach is to substantially reduce the overall bandwidth requirements for supporting VR video experiences while delivering high levels of user-perceived quality...
April 2018: IEEE Transactions on Visualization and Computer Graphics
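The quality-allocation idea behind such gaze-contingent streaming can be sketched as follows (an illustrative simplification; the tile layout, quality levels, and foveal radius are assumptions, not the paper's parameters): each video tile is assigned a quality level that falls off with its angular distance from the tracked fixation point.

```python
import math

def tile_quality(tile_center_deg, fixation_deg,
                 fovea_radius_deg=5.0, levels=(3, 2, 1)):
    """Pick a quality level for a video tile based on its angular
    distance from the user's fixation point (degrees of visual
    angle). Tiles within one foveal radius get the highest level;
    quality drops one level per additional foveal-radius ring."""
    dx = tile_center_deg[0] - fixation_deg[0]
    dy = tile_center_deg[1] - fixation_deg[1]
    dist = math.hypot(dx, dy)
    ring = min(int(dist // fovea_radius_deg), len(levels) - 1)
    return levels[ring]
```

A streaming client would re-evaluate this map every time a fresh gaze sample arrives and request tile bitrates accordingly, so bandwidth concentrates where the viewer is actually looking.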
Myungho Lee, Gerd Bruder, Tobias Hollerer, Greg Welch
In this paper, we investigate factors and issues related to human locomotion behavior and proxemics in the presence of a real or virtual human in augmented reality (AR). First, we discuss a unique issue with current-state optical see-through head-mounted displays, namely the mismatch between a small augmented visual field and a large unaugmented periphery, and its potential impact on locomotion behavior in close proximity of virtual content. We discuss a potential simple solution based on restricting the field of view to the central region, and we present the results of a controlled human-subject study...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Hyungil Kim, Joseph L Gabbard, Alexandre Miranda Anon, Teruhisa Misu
This article investigates the effects of visual warning presentation methods on human performance in augmented reality (AR) driving. An experimental user study was conducted in a parking lot where participants drove a test vehicle while braking for any cross traffic with assistance from AR visual warnings presented on a monoscopic and volumetric head-up display (HUD). Results showed that monoscopic displays can be as effective as volumetric displays for human performance in AR braking tasks. The experiment also demonstrated the benefits of conformal graphics, which are tightly integrated into the real world, such as their ability to guide drivers' attention and their positive consequences on driver behavior and performance...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Ginga Kato, Yoshihiro Kuroda, Kiyoshi Kiyokawa, Haruo Takemura
Most existing locomotion devices that represent the sensation of walking target a user who is actually performing a walking motion. Here, we attempted to represent the walking sensation, especially a kinesthetic sensation and advancing feeling (the sense of moving forward), while the user remains seated. To represent the walking sensation using a relatively simple device, we focused on rendering and evaluating the longitudinal friction force applied to the sole during walking. Based on measurements of the friction force applied to the sole during actual walking, we developed a novel friction force display that can present the friction force without the influence of body weight...
April 2018: IEEE Transactions on Visualization and Computer Graphics
Kasun Karunanayaka, Nurafiqah Johari, Surina Hariri, Hanis Camelia, Kevin Stanley Bielawski, Adrian David Cheok
Today's virtual reality (VR) applications, such as gaming, multisensory entertainment, remote dining, and online shopping, are mainly based on audio, visual, and touch interactions between humans and virtual worlds. Integrating the sense of taste into VR is difficult since humans are dependent on chemical-based taste delivery systems. This paper presents the 'Thermal Taste Machine', a new digital taste actuation technology that can effectively produce and modify thermal taste sensations on the tongue. It modifies the temperature of the surface of the tongue within a short period of time (from 25°C to 40°C while heating, and from 25°C to 10°C while cooling)...
April 2018: IEEE Transactions on Visualization and Computer Graphics
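The temperature trajectories quoted above (25°C to 40°C heating, 25°C to 10°C cooling) suggest a simple setpoint ramp for the thermal actuator. The sketch below is hypothetical; the paper's actual drive electronics and ramp rates are not described in this excerpt.

```python
def ramp_profile(start_c, target_c, rate_c_per_s, dt=0.1):
    """Generate a sequence of temperature setpoints (in degrees
    Celsius) moving from start_c toward target_c at a fixed rate,
    sampled every dt seconds. Works for heating and cooling; the
    final setpoint is clamped exactly at target_c."""
    step = rate_c_per_s * dt if target_c >= start_c else -rate_c_per_s * dt
    setpoints = []
    t = start_c
    while (t - target_c) * (1 if step > 0 else -1) < 0:
        t += step
        # Clamp the last step so we never overshoot the target.
        setpoints.append(min(t, target_c) if step > 0 else max(t, target_c))
    return setpoints
```

For example, ramping from 25°C to 40°C at 15°C/s with dt = 0.1 yields ten setpoints spanning one second, ending exactly at 40°C; the cooling ramp to 10°C is symmetric.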