Telepresence technology can offer an alternative working style for seniors who want to keep contributing to society after retirement by passing on their experience and know-how to a younger generation. Conventional telepresence methods, however, are limited to face-to-face communication through a static flat-panel display or a physically constrained humanoid robot, which lack the presence of the remote operator, non-verbal communication cues such as body language, and freedom of mobility. In this project, we explored a mobile embodied 3D avatar system that brings the rich expressiveness of a full-body avatar out of the virtual world and into daily life by integrating high maneuverability into a human-scale semi-transparent holographic display. We investigated a novel architecture for controlling the embodied multimodal avatar by deploying a multimodal behavior generation framework, SAIBA (Situation, Agent, Intention, Behavior, Animation). The synergy of physical movement through a movable robotic display and smooth computer-graphics animation visualized on the 3D display demonstrated a dynamic visual media expression for Spatial Augmented Reality, which can improve the presence and non-verbal communication cues of the remote operator.
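The papers do not include the control code, but the SAIBA framework's standard split into intent planning, behavior planning, and behavior realization suggests how such an architecture could route planned behaviors between the two output channels: locomotion to the robotic display base and gestures or animation to the 3D avatar. The sketch below illustrates that idea; all names (`Behavior`, `plan_behaviors`, `AvatarRealizer`, `robot_base.execute`, `avatar_display.animate`) are hypothetical and not the authors' API.

```python
# Minimal sketch of a SAIBA-style pipeline for an embodied mobile avatar.
# Hypothetical names; this is an illustration of the architecture, not the
# project's actual implementation. SAIBA separates intent planning, behavior
# planning, and behavior realization; here the realizer dispatches locomotion
# behaviors to the robotic base and all other behaviors to the avatar renderer
# on the holographic display.

from dataclasses import dataclass
from typing import List


@dataclass
class Behavior:
    """One planned behavior, loosely modeled on a BML-style block."""
    modality: str      # e.g. "locomotion", "gesture", "gaze"
    action: str        # e.g. "approach_listener", "wave_right_hand"
    start: float       # start time in seconds, relative to the utterance
    duration: float    # duration in seconds


def plan_behaviors(intent: str) -> List[Behavior]:
    """Behavior planning: map a communicative intent to timed behaviors."""
    if intent == "greet_audience":
        return [
            Behavior("locomotion", "approach_listener", start=0.0, duration=2.0),
            Behavior("gesture", "wave_right_hand", start=1.5, duration=1.0),
            Behavior("gaze", "look_at_listener", start=0.0, duration=3.0),
        ]
    return []


class AvatarRealizer:
    """Behavior realization: split behaviors between the two output channels."""

    def __init__(self, robot_base, avatar_display):
        self.robot_base = robot_base          # mobile robotic display (physical motion)
        self.avatar_display = avatar_display  # CG avatar on the semi-transparent 3D display

    def realize(self, behaviors: List[Behavior]) -> None:
        # Dispatch behaviors in start-time order so physical motion and
        # on-screen animation stay synchronized.
        for b in sorted(behaviors, key=lambda b: b.start):
            if b.modality == "locomotion":
                self.robot_base.execute(b.action, b.duration)
            else:
                self.avatar_display.animate(b.action, b.start, b.duration)
```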
Remote Lecture from Home
Dynamic 3D Depth Animation
Outdoor Test
Paper Yutaka Tokuda, Atsushi Hiyama, Takahiro Miura, Tomohiro Tanikawa, Michitaka Hirose. 2013. Towards Mobile Embodied 3D Avatar as Telepresence Vehicle. In Proceedings of the 7th International Conference on Universal Access in Human-Computer Interaction (UAHCI'13), pp.671--680. Springer. https://doi.org/10.1007/978-3-642-39194-1_77
Paper Masahiko Izumi, Tomoya Kikuno, Yutaka Tokuda, Atsushi Hiyama, Takahiro Miura, Michitaka Hirose. 2014. Practical Use of a Remote Movable Avatar Robot with an Immersive Interface for Seniors. In Proceedings of the 8th International Conference on Universal Access in Human-Computer Interaction (UAHCI'14), pp.648--659. Springer. https://doi.org/10.1007/978-3-319-07446-7_62