2019
-
3DTactileDraw: A Tactile Pattern Design Interface for Complex Arrangements of Actuators
Oliver Beren Kaul, Leonard Hansing and Michael Rohs
Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems
-
Concept for Navigating the Visually Impaired using a Tactile Interface around the Head
Oliver Beren Kaul and Michael Rohs
Hacking Blind Navigation Workshop at CHI '19
2018
2017
-
Increasing Presence in Virtual Reality with a Vibrotactile Grid Around the Head
Oliver Beren Kaul, Kevin Meier and Michael Rohs
Human-Computer Interaction -- INTERACT 2017: 16th IFIP TC 13 International Conference, Mumbai, India, September 25-29, 2017, Proceedings, Part IV
A high level of presence is an important aspect of immersive virtual reality applications. However, presence is difficult to achieve, as it depends on the individual user, the immersion capabilities of the system (visual, auditory, and tactile), and the concrete application. We use a vibrotactile grid around the head to further increase the level of presence users feel in virtual reality scenes. In a between-groups comparison study, the vibrotactile group scored significantly higher on a standardized presence questionnaire than the baseline group without tactile feedback. This suggests that the proposed prototype can serve as an additional tool for increasing the level of presence users feel in virtual reality scenes.
-
HapticHead: A Spherical Vibrotactile Grid around the Head for 3D Guidance in Virtual and Augmented Reality
Oliver Beren Kaul and Michael Rohs
Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems - CHI '17
2016
-
Wearable Head-mounted 3D Tactile Display Application Scenarios
Oliver Beren Kaul and Michael Rohs
Proceedings of the 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct
-
HapticHead: 3D Guidance and Target Acquisition Through a Vibrotactile Grid
Oliver Beren Kaul and Michael Rohs
Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems
-
Follow the Force: Steering the Index Finger towards Targets using EMS
Oliver Beren Kaul, Max Pfeiffer and Michael Rohs
Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems
In mobile contexts, guidance towards objects is usually provided through the visual channel. Sometimes this channel is overloaded or not appropriate, yet providing a practicable form of haptic feedback instead is challenging. Electrical muscle stimulation (EMS) can generate mobile force feedback but has a number of drawbacks: for complex movements, several muscles need to be actuated in concert, and a feedback loop is necessary to control the movements. We present an approach that requires the actuation of only six muscles with four pairs of electrodes to guide the index finger to a 2D point and lets the user perform mid-air disambiguation gestures. In our user study, participants found invisible, static target positions on top of a physical box with a mean 2D deviation of 1.44 cm from the intended target.