EVALUATING THE PLACEMENT OF ARM-WORN DEVICES FOR RECOGNIZING VARIATIONS OF DYNAMIC HAND GESTURES

Authors

  • KATHRIN KEFER Department of Mobile Computing, University of Applied Sciences Upper Austria Softwarepark 11, Hagenberg, 4232, Austria
  • CLEMENS HOLZMANN Department of Mobile Computing, University of Applied Sciences Upper Austria Softwarepark 11, Hagenberg, 4232, Austria
  • RAINHARD DIETER FINDLING Department of Mobile Computing, University of Applied Sciences Upper Austria Softwarepark 11, Hagenberg, 4232, Austria

Keywords:

Gesture recognition, Hand gestures, Accelerometer, Sensor placement, Arm-worn devices

Abstract

Dynamic hand gestures have become increasingly popular as a touch-free input modality for interactive systems. There exists a variety of arm-worn devices for the recognition of hand gestures, which differ not only in their capabilities, but also in their positioning on users' arms. These differences in positioning might influence how well gestures are recognized, leading to different gesture recognition accuracies. In this paper, we investigate the effect of device placement on dynamic hand gesture recognition accuracy. We consider devices strapped to the forearm at two positions: the wrist and below the elbow. These positions represent smartwatches worn on the wrist and devices with EMG sensors for the additional detection of static hand gestures (e.g. spreading the fingers) worn right below the elbow. Our hypothesis is that wrist-worn devices achieve better recognition accuracy, caused by the higher acceleration values resulting from the bigger action radius of dynamic hand gestures at the wrist. We conducted a comparative study using an LG G Watch and Thalmic Labs' Myo armband, for which we recorded a total of 12,960 gesture samples of eight simple dynamic gestures in three different variants with eight participants. We evaluated a potential difference in gesture recognition accuracies using different feature sets and classifiers. Although recognition accuracies for wrist-worn devices seem higher, the difference is not statistically significant due to substantial variations in accuracy across participants. We thus cannot conclude that different positions of gesture recording devices on the lower arm have a significant influence on correctly recognizing hand gestures.

 


References

Lingchen Chen and Feng Wang and Hui Deng and Kaifan Ji (2013), A Survey on Hand Gesture Recognition, Proc. of the 2013 International Conference on Computer Sciences and Applications, CSA '13, pp. 313-316

J. J. LaViola (1999), A Survey of Hand Posture and Gesture Recognition Techniques and Technology, Brown University

S. Mitra and T. Acharya (2007), Gesture Recognition: A Survey, IEEE T SYST MAN CY C, Vol. 37, pp. 311-324

J. Kela and P. Korpipää and J. Mäntyjärvi and S. Kallio and G. Savino and L. Jozzo and D. Marca (2006), Accelerometer-based Gesture Control for a Design Environment, Pers. Ubiquit. Comput., Vol. 10, pp. 285-299

N. Helmi and M. Helmi (2009), Applying a neuro-fuzzy classifier for gesture-based control using a single wrist-mounted accelerometer, Proc. of the 2009 IEEE International Symposium on Computational Intelligence in Robotics and Automation, pp. 216-221

J. Liu and L. Zhong and J. Wickramasuriya and V. Vasudevan (2009), uWave: Accelerometer-based personalized gesture recognition and its applications, Pervasive Mob Comput, Vol. 5, pp. 657-675

L. Porzi and S. Messelodi and C. M. Modena and E. Ricci (2013), A Smart Watch-based Gesture Recognition System for Assisting People with Visual Impairments, Proc. of the 3rd ACM International Workshop on Interactive Multimedia on Mobile and Portable Devices, pp. 19-24

K. Kefer and C. Holzmann and R. D. Findling (2016), Comparing the Placement of Two Arm-Worn Devices for Recognizing Dynamic Hand Gestures, Proc. of the 14th International Conference on Advances in Mobile Computing and Multi Media, pp. 99-104

J. Wu and G. Pan and D. Zhang and G. Qi and S. Li (2009), Gesture Recognition with a 3-D Accelerometer, Ubiquitous Intelligence and Computing, Vol. 5585, pp. 25-38

T. Marasović and V. Papić (2011), Accelerometer-based gesture classification using principal component analysis, Proc. of the 19th International Conference on Software, Telecommunications and Computer Networks (SoftCOM), pp. 1-5

J. Rekimoto (2001), GestureWrist and GesturePad: Unobtrusive Wearable Interaction Devices, Proc. of the 5th International Symposium on Wearable Computers

X. Chen and T. Grossman and D. J. Wigdor and G. Fitzmaurice (2014), Duet: Exploring Joint Interactions on a Smart Phone and a Smart Watch, Proc. of the SIGCHI Conference on Human Factors in Computing Systems, pp. 159-168

I. Cleland and B. Kikhia and C. Nugent and A. Boytsov and J. Hallberg and K. Synnes and S. McClean and D. Finlay (2013), Optimal placement of accelerometers for the detection of everyday activities, Sensors, Vol. 13, pp. 9183-9200

D. O. Olguin and A. S. Pentland (2006), Human activity recognition: Accuracy across common locations for wearable sensors, Proc. of the 10th IEEE International Symposium on Wearable Computers, pp. 11-13

H. Gjoreski and M. Lustrek and M. Gams (2011), Accelerometer Placement for Posture Recognition and Fall Detection, Proc. of the 7th International Conference on Intelligent Environments, pp. 47-

Yinghui Zhou and Lei Jing and Junbo Wang and Zixue Cheng (2012), Analysis and Selection of Features for Gesture Recognition Based on a Micro Wearable Device, IJACSA, Vol. 3


Published

2017-05-31

Section

Articles