
Perceptual Challenges in Augmented Reality


EXAMINING PERCEPTUAL UNDERPINNINGS OF AR COLOR, VISUAL ACUITY, TEXT LEGIBILITY AND DEPTH PERCEPTION TO INFORM EFFECTIVE USER INTERFACE DESIGN.

Sponsored by the National Science Foundation and Boeing

As public interest in mobile, wearable AR experiences rises, technology companies are racing to develop wearable augmented reality (AR) display systems with embedded lightweight computing that aim to make mobile life effective, convenient, and pleasant.

While some successful first-generation, commercially available lightweight wearable AR systems are already on the market, the full potential of mobile AR head-worn displays remains untapped, and their visual perceptual challenges are not widely understood. As we race to field head-worn outdoor AR applications for mobile computing, we must first understand, and then design for, these visuoperceptual and cognitive issues in order to inform safe and effective AR user interface designs.

THE GOAL OF THIS WORK IS BETTER UNDERSTANDING EFFECTS OF AR INTERFACE DESIGN ON:

    Depth perception & color perception issues in AR
    Visual clutter, visual search & information processing with AR user interfaces
    Context-switching and distance-switching in AR
    Impact of AR interface design on selective, focused & divided attention
    Just noticeable differences, signal thresholds and individual differences with AR-based visual stimuli

SELECT PUBLICATIONS

Rick Skarbez, Joseph L. Gabbard, Doug A. Bowman, Todd Ogle & Thomas Tucker. (2021). "Virtual Replicas of Real Places: Experimental investigations," IEEE Transactions on Visualization and Computer Graphics, doi: 10.1109/TVCG.2021.3096494.
Michele Gattullo, Alessandro Evangelista, Vito M. Manghisi, Antonio E. Uva, Michele Fiorentino, Antonio Boccaccio, Michele Ruta & Joseph L. Gabbard. (2020). "Towards Next Generation Technical Documentation in Augmented Reality Using a Context-Aware Information Manager." Applied Sciences 10, no. 3: 780.
Joseph L. Gabbard, Missie Smith, Coleman Merenda‡, Gary Burnett & David R. Large. (2020). "A Perceptual Color-Matching Method for Examining Color Blending in Augmented Reality Head-Up Display Graphics." IEEE Transactions on Visualization and Computer Graphics. (in press, DOI: 10.1109/TVCG.2020.3044715).
Jonathan Flittner‡, John Luksas† & Joseph L. Gabbard. "Predicting User Performance in Augmented Reality User Interfaces with Image Analysis Algorithms." In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, vol. 64, no. 1, pp. 2108-2112. Sage CA: Los Angeles, CA: SAGE Publications, 2020.
Jonathan Flittner‡, John Luksas† & Joseph L. Gabbard. "The Viability of Image Analysis Measures of Visual Clutter in the AR UI Space as a Predictive Measure of User Performance." In 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), pp. 230-233. IEEE, 2020.
Mohammed Safayet Arefin, Nate Phillips, Alexander Plopski, Joseph L. Gabbard & J. Edward Swan II. "Impact of AR Display Context Switching and Focal Distance Switching on Human Performance: Replication on an AR Haploscope." In 2020 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), pp. 571-572. IEEE, 2020.