ACM SIGCHI Conference on Human Factors in Computing Systems (ACM CHI 2015)
Cyclops: Wearable and Single-Piece Full-Body Gesture Input Devices
Liwei Chan¹,² · Chi-Hao Hsieh¹ · Yi-Ling Chen¹,² · Shuo Yang¹ · Da-Yuan Huang¹ · Rong-Hao Liang¹,² · Bing-Yu Chen¹,²
¹National Taiwan University
²Intel-NTU Connected Context Computing Center
Cyclops is a wearable, single-piece bodily gesture input device that captures an ego-centric view of its user. With Cyclops worn at the center of the body, the user can play a racing game on a mobile phone using hand and foot interactions. For example, the user may push forward an imaginary gear stick to his right to trigger a nitro turbo.
Abstract
This paper presents Cyclops, a single-piece wearable device that recognizes its user’s whole-body postures from an ego-centric view captured by a fisheye lens worn at the center of the user’s body. From this viewpoint, the device sees only the user’s limbs, which allows it to interpret body postures effectively. Unlike currently available body gesture input systems that depend on external cameras or motion sensors distributed across the user’s body, Cyclops is a single-piece wearable device that is worn as a pendant or a badge. The main idea proposed in this paper is the observation of the limbs from a central location on the body. Owing to the ego-centric view, Cyclops turns posture recognition into a highly controllable computer vision problem. This paper demonstrates a proof-of-concept device and an algorithm for recognizing static and moving bodily gestures based on motion history images (MHI) and a random decision forest (RDF). Four example applications are presented: an interactive bodily workout, a mobile racing game that involves hands and feet, a full-body virtual reality system, and interaction with a tangible toy. The experiment on the bodily workout demonstrates that, on a database of 20 body workout gestures collected from 20 participants, Cyclops achieved a recognition rate of 79% using MHI and simple template matching, which increased to 92% with the more advanced machine learning approach of RDF.
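To make the recognition pipeline described above concrete, the sketch below illustrates the two stages named in the abstract, motion history images followed by a random-forest classifier, in generic form. It is not the authors' implementation: frame acquisition from the fisheye camera, limb segmentation, the decay horizon TAU, the difference threshold, and the use of scikit-learn's RandomForestClassifier as the RDF are all assumptions made for illustration.

# Minimal MHI + random-forest sketch (assumed pipeline, not the paper's code).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

TAU = 30  # MHI decay horizon in frames (assumed value)

def update_mhi(mhi, motion_mask):
    """Decay the previous MHI by one step and stamp current motion at TAU."""
    mhi = np.maximum(mhi - 1, 0)
    mhi[motion_mask] = TAU
    return mhi

def mhi_from_frames(frames, diff_thresh=25):
    """Build a normalized MHI from a sequence of grayscale frames (H x W uint8)."""
    mhi = np.zeros_like(frames[0], dtype=np.int32)
    for prev, cur in zip(frames, frames[1:]):
        # Simple frame differencing stands in for limb/motion segmentation.
        motion = np.abs(cur.astype(np.int16) - prev.astype(np.int16)) > diff_thresh
        mhi = update_mhi(mhi, motion)
    return mhi.astype(np.float32) / TAU  # values in [0, 1], recent motion brightest

def train_rdf(mhis, labels, n_trees=100):
    """Train a random forest on flattened MHIs (one MHI per gesture sample)."""
    X = np.stack([m.ravel() for m in mhis])
    clf = RandomForestClassifier(n_estimators=n_trees)
    clf.fit(X, labels)
    return clf

# Example usage with synthetic frames standing in for the gesture database.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_frames = [rng.integers(0, 255, (60, 80), dtype=np.uint8) for _ in range(10)]
    sample_mhi = mhi_from_frames(fake_frames)
    clf = train_rdf([sample_mhi] * 4, labels=[0, 0, 1, 1])
    print(clf.predict(sample_mhi.ravel()[None, :]))

In this sketch the template-matching baseline mentioned in the abstract would simply compare a query MHI against stored per-gesture MHI templates (e.g., by nearest neighbor), while the RDF variant classifies the flattened MHI directly.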
Paper
ACM Digital Library
BibTeX
@inproceedings{Chan:2015:CWS,
  author    = {Chan, Liwei and Hsieh, Chi-Hao and Chen, Yi-Ling and Yang, Shuo and Huang, Da-Yuan and Liang, Rong-Hao and Chen, Bing-Yu},
  title     = {Cyclops: Wearable and Single-Piece Full-Body Gesture Input Devices},
  booktitle = {Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems},
  series    = {CHI '15},
  year      = {2015},
  isbn      = {978-1-4503-3145-6},
  location  = {Seoul, Republic of Korea},
  pages     = {3001--3009},
  publisher = {ACM},
  address   = {New York, NY, USA},
  keywords  = {ego-centric view, full-body gesture input, posture recognition, single-point wearable devices},
}