Master's Thesis

Integration of Mixed Reality and Touchscreen Interfaces for Humanoid Robot Embodiment in a Virtual Clinical Setting

Abstract:

This thesis develops a comprehensive Robody control platform that integrates a smartphone and an augmented reality (AR) head-mounted display (HMD) to explore new modes of achieving remote embodiment through Robody. We created a multifunctional smartphone application that allows users to remotely monitor Robody, assign autonomous tasks, and use the smartphone as an alternative to AR controllers. In addition, we proposed two smartphone-based methods and one hand-tracking-based method for controlling Robody’s hands. We integrated these control methods into the physical Robody, validating the feasibility of our approach, and evaluated the usability, embodiment, and performance of the control methods in a user study. The results indicate that the system’s usability is above average and that it induces a certain degree of embodiment. In the experiment, the hand-tracking control mode performed best, while the smartphone pointer control mode was the least satisfactory. Our work demonstrates the potential of combining a smartphone and an AR HMD, as well as hand-tracking-based control, for Robody control, and provides a foundation for achieving stronger embodiment and improved control in future work.

Smartphone pointer method:
Smartphone motion-tracking method:
Hand-tracking method:
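The descriptions of the three control modes are not included here. As a purely illustrative sketch of what a smartphone motion-tracking mode could involve (the thesis's actual implementation is not shown, and all function names below are my own assumptions), one common approach is to calibrate a reference pose and then stream the phone's relative IMU orientation to the robot hand:

```python
import numpy as np

def quat_multiply(q1, q2):
    """Hamilton product of two quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def quat_conjugate(q):
    """Conjugate of a quaternion; equals its inverse for unit quaternions."""
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def relative_orientation(q_ref, q_now):
    """Rotation that takes the calibration pose q_ref to the current
    phone pose q_now; this relative rotation would be sent as the
    robot hand's orientation target."""
    return quat_multiply(q_now, quat_conjugate(q_ref))
```

With this kind of mapping, holding the phone in the calibration pose commands the identity rotation, so the robot hand stays put until the user actually moves the phone; rotating the phone then rotates the hand by the same relative amount.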