2021
-
Mobile Recognition and Tracking of Objects in the Environment through Augmented Reality and 3D Audio Cues for People with Visual Impairments
Oliver Beren Kaul, Kersten Behrens and Michael Rohs
Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems - CHI EA '21
People with visual impairments face challenges in scene and object recognition, especially in unknown environments. We combined the mobile scene detection framework Apple ARKit with MobileNet-v2 and 3D spatial audio to provide an auditory scene description to people with visual impairments. The combination of ARKit and MobileNet-v2 allows recognized objects to persist in the scene even when the user turns away from them. An object can thus serve as an auditory landmark. With a search function, the system can even guide the user to a particular item. The system also provides spatial audio warnings for nearby objects and walls to avoid collisions. We evaluated the implemented app in a preliminary user study. The results show that users can find items without visual feedback using the proposed application. The study also reveals that the range of local object detection through MobileNet-v2 was insufficient, which we aim to overcome in future work using more accurate object detection frameworks such as YOLOv5x.
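As a rough illustration of the auditory-landmark mechanism described above, the sketch below computes the azimuth of a persisted object anchor relative to the current head pose, which is the quantity a spatial-audio engine needs for panning. This is a minimal Python reconstruction assuming ARKit-style coordinates (x right, y up, -z forward); the function name and signature are hypothetical, not the app's actual code.

    import numpy as np

    def object_azimuth(head_pos, head_forward, obj_pos):
        """Signed azimuth (degrees) of a tracked object relative to the
        user's view direction; negative = left, positive = right.  A
        spatial-audio engine can use this to pan an auditory landmark."""
        to_obj = np.asarray(obj_pos, float) - np.asarray(head_pos, float)
        to_obj /= np.linalg.norm(to_obj)
        fwd = np.asarray(head_forward, float) / np.linalg.norm(head_forward)
        # Angles in the horizontal x/z plane (ARKit: -z is "forward").
        az = np.degrees(np.arctan2(to_obj[0], -to_obj[2])
                        - np.arctan2(fwd[0], -fwd[2]))
        return (az + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)

-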
Around-the-Head Tactile System for Supporting Micro Navigation of People with Visual Impairments
Oliver Beren Kaul, Michael Rohs, Marc Mogalle and Benjamin Simon
ACM Transactions on Computer-Human Interaction, Volume 28, Issue 4 - TOCHI '21
Tactile patterns are a means to convey navigation instructions to pedestrians and are especially helpful for people with visual impairments. This article presents a concept to provide precise micro-navigation instructions through a tactile around-the-head display. Our system presents four tactile patterns for fundamental navigation instructions in conjunction with continuous directional guidance. We followed an iterative, user-centric approach to design the patterns for the fundamental navigation instructions, combined them with a continuous directional guidance stimulus, and tested our system with 13 sighted (blindfolded) and 2 blind participants in an obstacle course, including stairs. We optimized the patterns and validated the final prototype with another five blind participants in a follow-up study. The system steered our participants successfully, with a 5.7 cm average absolute deviation from the optimal path. Our guidance is only slightly less precise than the natural sway of the shoulders during normal walking and an order of magnitude more precise than previous tactile navigation systems. Our system enables various new micro-navigation use cases for people with visual impairments, e.g., preventing collisions on a sidewalk or serving as an anti-veering tool. It also has applications in other areas, such as for personnel working in low-vision environments (e.g., firefighters).
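The continuous directional guidance described above can be approximated by activating the head-ring actuator closest to the correction direction and scaling its intensity with the lateral deviation from the optimal path. The following is a hedged sketch with an assumed layout of 12 evenly spaced actuators; the paper's actual stimulus design is more refined.

    import numpy as np

    # Hypothetical layout: 12 actuators on a ring around the head,
    # angle 0 deg = straight ahead, increasing clockwise.
    ACTUATOR_ANGLES = np.arange(0, 360, 30)

    def guidance_stimulus(correction_deg, deviation_m, max_dev_m=0.5):
        """Select the actuator pointing toward the correction direction
        and scale its intensity (0..1) with the lateral path deviation."""
        diffs = (ACTUATOR_ANGLES - correction_deg + 180) % 360 - 180
        idx = int(np.argmin(np.abs(diffs)))
        intensity = min(abs(deviation_m) / max_dev_m, 1.0)
        return idx, intensity

-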
VRTactileDraw: A Virtual Reality Tactile Pattern Designer for Complex Spatial Arrangements of Actuators
Oliver Beren Kaul, Andreas Domin, Michael Rohs, Benjamin Simon and Maximilian Schrapel
Proceedings of the 18th IFIP TC 13 International Conference on Human-Computer Interaction, Part V - INTERACT '21
Creating tactile patterns on the body via a spatial arrangement of many tactile actuators offers many opportunities and presents a challenge, as the design space is enormous. This paper presents a VR interface that enables designers to rapidly prototype complex tactile interfaces. It allows for painting strokes on a modeled body part and translates these strokes into continuous tactile patterns using an interpolation algorithm. The presented VR approach avoids several problems of traditional 2D editors. It realizes spatial 3D input using VR controllers with natural mapping and intuitive spatial movements. To evaluate this approach in detail, we conducted a user study and iteratively improved the system. The study participants gave predominantly positive feedback on the presented VR interface (SUS score 79.7, AttrakDiff “desirable”). The final system is released alongside this paper as an open-source Unity project for various tactile hardware.
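The interpolation step mentioned above (translating painted strokes into continuous patterns) could plausibly be realized as inverse-distance weighting over the nearest actuators, sketched below in Python; the released Unity project may implement it differently.

    import numpy as np

    def stroke_point_to_intensities(point, actuator_positions, k=3, falloff=2.0):
        """Map one point of a painted stroke to per-actuator intensities
        (0..1) via inverse-distance weighting over the k nearest actuators."""
        pos = np.asarray(actuator_positions, float)
        d = np.linalg.norm(pos - np.asarray(point, float), axis=1)
        nearest = np.argsort(d)[:k]
        w = 1.0 / np.maximum(d[nearest], 1e-6) ** falloff
        w /= w.sum()
        intensities = np.zeros(len(pos))
        intensities[nearest] = w
        return intensities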
2020
-
Vibrotactile Funneling Illusion and Localization Performance on the Head
Oliver Beren Kaul, Michael Rohs, Benjamin Simon, Kerem Can Demir and Kamillo Ferry
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI '20
The vibrotactile funneling illusion is the sensation of a single (non-existing) stimulus somewhere in between the actual stimulus locations. Its occurrence depends upon body location, distance between the actuators, signal synchronization, and intensity. Related work has shown that the funneling illusion may occur on the forehead. We were able to reproduce these findings and explored five further regions to get a more complete picture of the occurrence of the funneling illusion on the head. The results of our study (24 participants) show that the actuator distance, for which the funneling illusion occurs, strongly depends upon the head region. Moreover, we evaluated the centralizing bias (smaller perceived than actual actuator distances) for different head regions, which also showed widely varying characteristics. We computed a detailed heat map of vibrotactile localization accuracies on the head. The results inform the design of future tactile head-mounted displays that aim to support the funneling illusion.
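For background, a widely used way to render a phantom stimulus between two actuators is the energy model of Israr and Poupyrev (Tactile Brush, 2011), which splits a desired virtual intensity between two synchronized actuators. This is literature context, not the study's own apparatus code.

    import math

    def phantom_intensities(beta, a_virtual):
        """Energy model of the funneling illusion: beta in [0, 1] is the
        desired phantom position between actuator 1 and actuator 2, and
        a_virtual the desired perceived intensity."""
        a1 = math.sqrt(1.0 - beta) * a_virtual
        a2 = math.sqrt(beta) * a_virtual
        return a1, a2

-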
Design and Evaluation of On-the-Head Spatial Tactile Patterns
Oliver Beren Kaul, Michael Rohs and Marc Mogalle
Proceedings of the 19th International Conference on Mobile and Ubiquitous Multimedia - MUM '20
We propose around-the-head spatial vibrotactile patterns for representing different kinds of notifications. The patterns are defined in terms of stimulus location, intensity profile, rhythm, and roughness modulation. A first study evaluates recall and distinguishability of 30 patterns, as well as agreement on meaning without a predetermined context: Agreement is low, yet the recognition rate is surprisingly high. We identify which kinds of patterns users recognize well and which ones they prefer. Static stimulus location patterns have a higher recognition rate than dynamic patterns, which move across the head as they play. Participants preferred dynamic patterns for comfort. A second study shows that participants are able to distinguish substantially more around-the-head spatial patterns than smartphone-based patterns. Spatial location has the highest positive impact on accuracy among the examined features, so this parameter allows for a large number of levels.
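The four pattern dimensions named above (stimulus location, intensity profile, rhythm, and roughness modulation) span the design space; one way to capture them in code is a small record type, where the field names are illustrative rather than the paper's exact encoding.

    from dataclasses import dataclass

    @dataclass
    class TactilePattern:
        """One around-the-head notification pattern (illustrative fields)."""
        locations: list       # actuator indices, in playback order
        intensities: list     # per-step intensity envelope, 0..1
        rhythm_ms: list       # per-step on/off durations in milliseconds
        roughness_hz: float   # amplitude-modulation frequency, 0 = smooth
        dynamic: bool         # True if the stimulus moves across the head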
2019
-
3DTactileDraw: A Tactile Pattern Design Interface for Complex Arrangements of Actuators
Oliver Beren Kaul, Leonard Hansing and Michael Rohs
Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems - CHI EA '19
Creating tactile patterns for a grid or a 3D arrangement of a large number of actuators presents a challenge, as the design space is huge. This paper explores two different possibilities of implementing an easy-to-use interface for tactile pattern design on a large number of actuators around the head. Two user studies were conducted in order to iteratively improve the prototype to fit user needs.
-
Concept for Navigating the Visually Impaired using a Tactile Interface around the Head
Oliver Beren Kaul and Michael Rohs
Hacking Blind Navigation Workshop at CHI '19 - CHI Workshop '19
2018
-
Requirements of Navigation Support Systems for People with Visual Impairments
Oliver Beren Kaul and Michael Rohs
Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers - UbiComp '18
Tactile patterns are a means to convey general direction information to pedestrians (for example, when turning right) and specific navigation instructions (for example, when approaching stairs). Tactile patterns are especially helpful for people with visual impairments in navigation scenarios and can also be used to deliver general notifications. This workshop paper is intended to spark a discussion about correctly identifying the requirements and other needs of people with visual impairments in order to create a useful guidance tool that could eventually replace the white cane as a primary navigation aid.
2017
-
Increasing Presence in Virtual Reality with a Vibrotactile Grid Around the Head
Oliver Beren Kaul, Kevin Meier and Michael Rohs
Proceedings of the 16th IFIP TC 13 International Conference on Human-Computer Interaction, Part IV - INTERACT '17
A high level of presence is an important aspect of immersive virtual reality applications. However, presence is difficult to achieve, as it depends on the individual user, the immersion capabilities of the system (visual, auditory, and tactile), and the concrete application. We use a vibrotactile grid around the head in order to further increase the level of presence users feel in virtual reality scenes. In a between-groups comparison study, the vibrotactile group scored significantly higher on a standardized presence questionnaire than the baseline group without tactile feedback. This suggests the proposed prototype as an additional tool to increase the level of presence users feel in virtual reality scenes.
-
HapticHead: A Spherical Vibrotactile Grid around the Head for 3D Guidance in Virtual and Augmented Reality
Oliver Beren Kaul and Michael Rohs
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems - CHI '17
Current virtual and augmented reality head-mounted displays usually include no or only a single vibration motor for haptic feedback and do not use it for guidance. We present HapticHead, a system utilizing multiple vibrotactile actuators distributed in three concentric ellipses around the head for intuitive haptic guidance through moving tactile cues. We conducted three experiments, which indicate that HapticHead vibrotactile feedback is both faster (2.6 s vs. 6.9 s) and more precise (96.4% vs. 54.2% success rate) than spatial audio (generic head-related transfer function) for finding visible virtual objects in 3D space around the user. The visual-feedback baseline is, as expected, more precise (99.7% success rate) and faster (1.3 s) in comparison, but there are many applications in which visual feedback is not desirable or available due to lighting conditions, visual overload, or visual impairments. Mean final precision with HapticHead feedback on invisible targets is 2.3° compared to 0.8° with visual feedback. We successfully navigated blindfolded users to real household items at different heights using HapticHead vibrotactile feedback independently of a head-mounted display.
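A minimal sketch of the guidance principle, assuming each HapticHead actuator has a known outward direction on the head: activate the actuator best aligned with the direction toward the target and derive stimulus strength from the remaining angular error. This is a hypothetical reconstruction, not the study software.

    import numpy as np

    def nearest_actuator(target_dir, actuator_dirs):
        """Return the index of the actuator whose outward direction best
        matches the target direction (max cosine similarity), plus the
        angular error (degrees) that could drive pulse intensity."""
        t = np.asarray(target_dir, float)
        t /= np.linalg.norm(t)
        dirs = np.asarray(actuator_dirs, float)
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
        cos = dirs @ t
        idx = int(np.argmax(cos))
        err = float(np.degrees(np.arccos(np.clip(cos[idx], -1.0, 1.0))))
        return idx, err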
2016
-
Wearable Head-mounted 3D Tactile Display Application Scenarios
Oliver Beren Kaul and Michael Rohs
Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct - MobileHCI '16
Current generation virtual reality (VR) and augmented reality (AR) head-mounted displays (HMDs) usually include no or only a single vibration motor for haptic feedback and do not use it for guidance. In previous work, we presented HapticHead, a potentially mobile system utilizing vibration motors distributed in three concentric ellipses around the head to give intuitive haptic guidance hints and to increase immersion for VR and AR applications. The purpose of this paper is to explore potential application scenarios and aesthetic possibilities of the proposed concept in order to create an active discussion among workshop participants.
-
HapticHead: 3D Guidance and Target Acquisition Through a Vibrotactile Grid
Oliver Beren Kaul and Michael Rohs
Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems - CHI EA '16
Current generation virtual reality (VR) and augmented reality (AR) head-mounted displays (HMDs) usually include no or only a single vibration motor for haptic feedback and do not use it for guidance. We present HapticHead, a system utilizing 20 vibration motors distributed in three concentric ellipses around the head to give intuitive haptic guidance hints and to increase immersion for VR and AR applications. Our user study indicates that HapticHead is both faster (mean=3.7s, SD=2.3s vs. mean=7.8s, SD=5.0s) and more precise (92.7% vs. 44.9% hit rate) than auditory feedback for the purpose of finding virtual objects in 3D space around the user. The visual-feedback baseline is, as expected, more precise (99.9% hit rate) and faster (mean=1.5s, SD=0.6s) in comparison, but there are many applications in which visual feedback is not desirable or available due to lighting conditions, visual overload, or visual impairments.
-
Follow the Force: Steering the Index Finger towards Targets using EMS
Oliver Beren Kaul, Max Pfeiffer and Michael Rohs
Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems - CHI EA '16
In mobile contexts, guidance towards objects is usually done through the visual channel. Sometimes this channel is overloaded or not appropriate, and providing a practicable form of haptic feedback is challenging. Electrical muscle stimulation (EMS) can generate mobile force feedback but has a number of drawbacks: for complex movements, several muscles need to be actuated in concert, and a feedback loop is necessary to control the movements. We present an approach that only requires the actuation of six muscles with four pairs of electrodes to guide the index finger to a 2D point and let the user perform mid-air disambiguation gestures. In our user study, participants found invisible, static target positions on top of a physical box with a mean 2D deviation of 1.44 cm from the intended target.
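The feedback loop mentioned above can be sketched as a simple proportional controller that maps the 2D error between finger and target onto four antagonistic electrode pairs. Real EMS control additionally needs per-user calibration and safety limits, which are omitted here; the pair names are illustrative, not the paper's electrode placement.

    import numpy as np

    def ems_correction(finger_xy, target_xy, gain=1.0, max_level=1.0):
        """One control step: convert the 2D finger-to-target error into
        activation levels (0..max_level) for four opposing electrode pairs."""
        x, y = gain * (np.asarray(target_xy, float) - np.asarray(finger_xy, float))
        return {
            "right":   min(max(x, 0.0), max_level),
            "left":    min(max(-x, 0.0), max_level),
            "forward": min(max(y, 0.0), max_level),
            "back":    min(max(-y, 0.0), max_level),
        }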