Abstract

The visualisation of limbs in Virtual Reality (VR) improves immersion in the virtual world and increases the user's confidence in movement. Unfortunately, many VR applications omit the visualisation of limbs. One reason lies in the technical difficulties of large-scale and multi-user VR environments, where outside-in tracking methods cannot be relied upon because of the size of the tracked space and possible occlusion, both of which degrade tracking accuracy. Another reason is that developers do not want to exclude parts of their already small user base by demanding special foot-tracking hardware that costs as much as the hand controllers but is usable in only a small number of applications. This thesis tackles these problems by developing a lightweight tracking system that relies only on correct tracking of the head position, so that either inside-out or outside-in tracking can be used with it. To achieve this, an RGB depth camera is mounted on the VR headset. A combination of fiducial marker tracking, depth tracking and inertial measurement units (IMUs) is used to track the user's feet. These individual tracking signals are then fused into one signal that combines the advantages of the individual tracking systems. This tracking information can then be used to animate the feet of a virtual avatar with an Inverse Kinematics (IK) algorithm.
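
To illustrate the fusion step described above, the following Python sketch combines a camera-derived foot position (from fiducial marker or depth tracking) with an IMU-based prediction using a simple complementary filter. The class name, weights, update rate and gravity-compensated input are illustrative assumptions for this sketch, not the implementation used in the thesis.

# Minimal sketch of per-foot sensor fusion, assuming a fixed update rate and a
# simple complementary filter; names, weights and units are illustrative only.
import numpy as np

class FootFusion:
    def __init__(self, camera_weight=0.98, dt=1.0 / 90.0):
        self.camera_weight = camera_weight  # trust in marker/depth position when the foot is visible
        self.dt = dt                        # update interval in seconds (assumed 90 Hz)
        self.position = np.zeros(3)         # fused foot position in headset space (m)
        self.velocity = np.zeros(3)         # velocity estimate integrated from IMU data (m/s)

    def update(self, imu_accel, camera_position=None):
        # imu_accel: gravity-compensated acceleration of the foot (m/s^2)
        # camera_position: optional absolute position from marker/depth tracking (m)

        # Dead-reckon with the IMU so the estimate keeps moving while the marker is occluded.
        self.velocity += np.asarray(imu_accel) * self.dt
        predicted = self.position + self.velocity * self.dt

        if camera_position is None:
            # Marker not visible: rely on the IMU prediction alone.
            self.position = predicted
        else:
            # Marker visible: pull the drifting IMU estimate toward the absolute camera fix.
            w = self.camera_weight
            self.position = w * np.asarray(camera_position) + (1.0 - w) * predicted
            # Damp accumulated velocity drift whenever an absolute fix is available.
            self.velocity *= 0.5
        return self.position

The fused position returned by such a filter would then serve as the foot target for the IK algorithm that animates the avatar's legs.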

Reference

Bayer, A. (2021). Foot tracking in virtual reality [Diploma Thesis, Technische Universität Wien]. reposiTUm. https://doi.org/10.34726/hss.2021.77646