
Information Visualization
Guided by task analysis, our research into information visualization explores new ways to present information and data to users so they can make decisions faster and more easily.
Multimodal Interfaces
We are investigating new ways to interact with digital information by integrating multiple natural modalities, including eye gaze, speech commands, and hand gesture recognition.
VR/AR/MR Interfaces
We are developing virtual information dashboards and heads-up displays for virtual/augmented/mixed reality (VR/AR/MR) technology to improve situation awareness and decision making.
Information Visualization
Information Dashboards
Presenting information for supervisory control tasks in ways that maintain situation awareness and support decision making.
Zoomable User Interfaces (ZUIs)
A graphical user interface that supports zooming in and out, which naturally embodies Shneiderman’s Visual Information Seeking Mantra: overview first, zoom and filter, then details on demand.
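To make the core interaction concrete, here is a minimal Python sketch of the world-to-screen transform behind a ZUI, zooming about the cursor so the point under it stays fixed. The `Viewport` class and all numbers are illustrative assumptions, not code from our systems.

```python
from dataclasses import dataclass

@dataclass
class Viewport:
    """Maps 2D world coordinates to screen pixels for a zoomable canvas."""
    offset_x: float = 0.0   # world coordinate shown at the screen origin
    offset_y: float = 0.0
    scale: float = 1.0      # pixels per world unit; larger = zoomed in

    def world_to_screen(self, wx: float, wy: float) -> tuple[float, float]:
        return ((wx - self.offset_x) * self.scale,
                (wy - self.offset_y) * self.scale)

    def zoom_at(self, sx: float, sy: float, factor: float) -> None:
        """Zoom by `factor`, keeping the world point under screen
        position (sx, sy) stationary -- the core ZUI gesture."""
        # World point currently under the cursor.
        wx = self.offset_x + sx / self.scale
        wy = self.offset_y + sy / self.scale
        self.scale *= factor
        # Re-anchor the offset so (wx, wy) still maps to (sx, sy).
        self.offset_x = wx - sx / self.scale
        self.offset_y = wy - sy / self.scale

# Overview first: fit everything at low scale; zoom and filter: zoom_at(...);
# details on demand: reveal labels once scale passes a threshold.
view = Viewport()
view.zoom_at(400, 300, 2.0)             # zoom in 2x about screen point (400, 300)
print(view.world_to_screen(400, 300))   # still (400.0, 300.0)
```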
Cognitively Tailored Interfaces (CTIs)
An interface that adapts how it organizes and presents information based on the stable cognitive attributes and styles of an individual user, so that content is delivered more effectively to that specific user.
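As a simplified illustration of the idea, the sketch below maps a stored cognitive profile to presentation parameters with a few rules. The profile attributes and layout parameters are hypothetical placeholders, not the lab’s actual model.

```python
from dataclasses import dataclass

@dataclass
class CognitiveProfile:
    # Hypothetical stable attributes measured per user (illustrative only).
    visual_verbal: float   # 0.0 = strongly verbal ... 1.0 = strongly visual
    working_memory: float  # normalized working-memory capacity, 0.0-1.0

def tailor_layout(profile: CognitiveProfile) -> dict:
    """Map a user's cognitive profile to presentation parameters.
    A rule-based stand-in for however a real CTI adapts its layout."""
    return {
        # Visual thinkers get chart-first views; verbal thinkers get tables/text.
        "primary_view": "charts" if profile.visual_verbal >= 0.5 else "tables",
        # Lower working-memory capacity -> fewer simultaneous panels.
        "max_panels": 2 if profile.working_memory < 0.4 else 4,
        "show_summaries": profile.working_memory < 0.4,
    }

print(tailor_layout(CognitiveProfile(visual_verbal=0.8, working_memory=0.3)))
# {'primary_view': 'charts', 'max_panels': 2, 'show_summaries': True}
```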
Selected Publications:
Hansberger, J.T., & Meacham, S.C. (2018). Designing data & dashboard visualizations for future UAV systems. 2018 Human Factors & Ergonomics Society Conference, 1-5 October 2018, Philadelphia, PA.
Hansberger, J.T., McArthur, J., Hansen, J., Blakely, V.R., & Kaufman, C.M. (under review). An interface for navigating and relating who, what, where, & when across large image collections.
Multimodal Interfaces
Voice Input
Use of speech recognition systems that let users issue spoken commands to the interface. Our primary research focus is on keyword commands.
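At its simplest, keyword commanding reduces to matching recognized tokens against a command vocabulary. The recognizer-agnostic Python sketch below assumes a transcript string arrives from some speech recognition engine; the command names are illustrative.

```python
from typing import Callable

# Hypothetical command vocabulary: keyword -> handler.
COMMANDS: dict[str, Callable[[], None]] = {
    "select": lambda: print("selecting highlighted element"),
    "zoom": lambda: print("zooming in"),
    "dismiss": lambda: print("closing panel"),
}

def dispatch(transcript: str) -> bool:
    """Scan a recognizer transcript for the first known keyword and run
    its handler. Returns True if a command fired."""
    for token in transcript.lower().split():
        handler = COMMANDS.get(token)
        if handler:
            handler()
            return True
    return False

# The transcript would normally come from a speech recognition engine.
dispatch("please zoom into that region")   # -> "zooming in"
```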
Eye Gaze Input
Use of eye-tracking systems so the user’s eye gaze can serve as the selection mechanism in the interface.
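A common way to turn a raw gaze stream into selections is dwell-time thresholding: a target is selected once gaze stays inside it long enough. A minimal sketch follows, with the 500 ms threshold and target geometry as assumed values.

```python
from typing import Optional

class DwellSelector:
    """Selects a rectangular target once gaze dwells on it long enough."""

    def __init__(self, bounds: tuple[float, float, float, float],
                 dwell_ms: float = 500.0):   # 500 ms is an assumed threshold
        self.bounds = bounds                 # (x, y, width, height)
        self.dwell_ms = dwell_ms
        self.entered_at: Optional[float] = None

    def update(self, gaze_x: float, gaze_y: float, t_ms: float) -> bool:
        """Feed one gaze sample; returns True once the dwell time is met."""
        x, y, w, h = self.bounds
        inside = x <= gaze_x <= x + w and y <= gaze_y <= y + h
        if not inside:
            self.entered_at = None           # gaze left: reset the dwell timer
            return False
        if self.entered_at is None:
            self.entered_at = t_ms           # gaze just entered the target
        return t_ms - self.entered_at >= self.dwell_ms

button = DwellSelector(bounds=(100, 100, 80, 40))
for t in range(0, 700, 100):                 # simulated 10 Hz gaze samples
    if button.update(120, 115, t):
        print(f"selected at t={t} ms")       # fires at t=500
        break
```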
Hand Gesture Input
Use of everyday hand gestures as an input modality paired with voice and eye gaze input. We have trained a highly accurate gesture recognition system that runs in real time.
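For a flavor of what a light, real-time gesture classifier can look like, here is a small PyTorch sketch. It is loosely inspired by, not a reproduction of, the network in Diliberti et al. (2019); the input window of 3D joint coordinates and all layer sizes are assumptions.

```python
import torch
import torch.nn as nn

class LightGestureCNN(nn.Module):
    """Small 1D CNN over a window of hand-joint coordinates.
    Input: (batch, channels, frames) where channels = joints * 3 (x, y, z)."""

    def __init__(self, n_joints: int = 21, n_frames: int = 32,
                 n_gestures: int = 10):       # all sizes are assumptions
        super().__init__()
        in_ch = n_joints * 3
        self.features = nn.Sequential(
            nn.Conv1d(in_ch, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool1d(2),                  # frames: 32 -> 16
            nn.Conv1d(64, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),          # collapse the time axis
        )
        self.classifier = nn.Linear(64, n_gestures)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).squeeze(-1))

model = LightGestureCNN().eval()
window = torch.randn(1, 21 * 3, 32)           # one 32-frame gesture window
with torch.no_grad():
    probs = model(window).softmax(dim=-1)     # per-gesture probabilities
print(probs.shape)                            # torch.Size([1, 10])
```

Keeping the network this small is what makes per-window inference cheap enough to run in real time alongside rendering.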
Selected Publications:
Hansberger, J.T., Peng, C., Blakely, V., Meacham, S.C., Cao, L., & Diliberti, N. (2019). A multimodal interface for virtual information environments. 21st International Conference on Human-Computer Interaction, 26-31 July 2019, Orlando, FL.
Diliberti, N., Peng, C., Cao, L., & Hansberger, J.T. (2019). Real-time gesture recognition using 3D sensory data and a light convolutional neural network. ACM Multimedia 2019 Conference, 21-25 October 2019, Nice, France.
Hansberger, J.T., Peng, C., Mathis, S.L., Shanthakumar, V.A., Meacham, S.C., Cao, L., & Blakely, V. (2017). Dispelling the gorilla arm syndrome: The viability of prolonged gesture interactions. 19th International Conference on Human-Computer Interaction, 9-14 July 2017, Vancouver, Canada.
VR/AR/MR Interfaces
Virtual Information Environment (VIE)
A virtual environment whose primary purpose is to facilitate information foraging and processing activities. A VIE allows the user to 1) view information, 2) control how it is organized, and 3) interact with the desired information elements.
Role Specific Dashboards & HUDs
Tailoring the UI design to each team member’s role and information requirements for improved awareness and decision making.
Human Performance with Virtual Interfaces
Experiments that quantify the advantages and disadvantages of using interfaces in VR/AR/MR information environments.
Selected Publications:
Hansberger, J.T., Blakely, V., McArthur, J., & Smith, J. (2020). Target detection performance for head mounted indirect vision displays. Virtual, Augmented, and Mixed Reality (XR) Technology for Multi-Domain Operations, June 2020.
Cao, L., Peng, C., & Hansberger, J.T. (2019). A large curved display system in virtual reality for immersive data interaction. 2019 IEEE Games, Entertainment, Media Conference (GEM), New Haven, CT.
Hansberger, J.T., Peng, C., Cao, L., Diliberti, N., Shanthakumar, V.A., & Hansen, J. (2018). The design of a virtual information environment. 2018 Human Factors & Ergonomics Society Conference, 1-5 October 2018, Philadelphia, PA.
Cao, L., Peng, C., & Hansberger, J.T. (2019). Usability and engagement study for a serious virtual reality game of lunar exploration missions. Informatics, 6(4), Article 44.
Collaborations
Army Research Laboratory
The U.S. Army’s corporate research laboratory that conducts basic and applied research. Dr. Jeff Hansberger is part of ARL’s Human Research and Engineering Directorate (HRED).
Rochester Institute of Technology (RIT)
The School of Interactive Games and Media (IGM) comprises faculty from a variety of academic backgrounds with a shared interest in computing as it relates to interactive media, games, simulations, VR/AR, experimental interfaces, and media-centric systems of all varieties.
University of Alabama in Huntsville (UAH)
The university’s 505-acre campus, which includes 17 high-tech research centers and labs responsible for $99 million in annual research expenditures, serves as the anchor tenant for the second-largest research park in the nation.
Industry
The lab often collaborates with industry partners in Huntsville, AL, as well as other locations in the US. We have recently partnered with SRI Technologies on machine learning techniques for automatic target recognition research.
VUE Lab
The eValuation and User Experience (VUE) Lab is a multidisciplinary research facility at UAH that supports learning, collaboration, design and evaluation efforts in the area of user experience and usability testing.
Jayse Hansen
Jayse Hansen is known for his future-forward design and animation of computer interfaces, holograms, HUDs, and medical simulations for blockbuster film franchises such as Marvel’s The Avengers, Iron Man, Spider-Man, Guardians of the Galaxy, Star Wars, and The Hunger Games.