Abstract

In mobile applications it is crucial to provide intuitive means for 2D and 3D interaction. A large number of techniques exist to support a natural user interface (NUI) by detecting the user's hand posture in RGB+D (depth) data. Depending on the given interaction scenario, each technique has its advantages and disadvantages. To evaluate the performance of the various techniques on a mobile device, we conducted a systematic study comparing the accuracy of five common posture recognition approaches under varying illumination and backgrounds. To enable this study, we developed a powerful hardware and software framework capable of processing and fusing RGB and depth data directly on a handheld device. Overall, the results reveal the best posture detection rates for combined RGB+D data, at the expense of update rate. Finally, to support users in choosing the appropriate technique for their specific mobile interaction task, we derive guidelines based on our study.

Reference

Fritz, D., Mossel, A., & Kaufmann, H. (2014). Evaluating RGB+D Hand Posture Detection Methods for Mobile 3D Interaction. In Proceedings of the 16th International Conference and Exhibition on Virtual Technologies (VRIC '14), Laval, France. ACM. http://hdl.handle.net/20.500.12708/55083