OmniVib: Towards Cross-body Spatiotemporal Vibrotactile Notifications for Mobile Phones (CHI '15)
Jessalyn Alvina, Shengdong Zhao, Simon T. Perrault, Maryam Azh, Thijs Roumen, and Morten Fjeld
Previous work has shown that one's palm can reliably recognize 10 or more spatiotemporal vibrotactile patterns. However, it is unknown whether the same patterns can be recognized on other body parts. In this paper, we investigate how users perceive spatiotemporal vibrotactile patterns on the arm, palm, thigh, and waist. Results of the first two experiments indicate that precise recognition of either position or orientation is difficult across multiple body parts. Nonetheless, users were able to distinguish whether two vibration pulses came from the same location when played in quick succession. Based on this finding, we designed eight spatiotemporal vibrotactile patterns and evaluated them in two additional experiments. The results demonstrate that these patterns can be reliably recognized (>80% accuracy) across the four tested body parts, both in the lab and in a more realistic context.
This work was published at CHI 2015:
link to paper
link to talk recording
Jessalyn Alvina, Shengdong Zhao, Simon T. Perrault, Maryam Azh, Thijs Roumen, and Morten Fjeld. 2015. OmniVib: Towards Cross-body Spatiotemporal Vibrotactile Notifications for Mobile Phones. In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI '15). ACM, New York, NY, USA, 2487-2496. DOI: http://dx.doi.org/10.1145/2702123.2702341