Proceedings Article

A Mixed Reality Head-Mounted Text Translation System Using Eye Gaze Input

Takumi Toyama; Daniel Sonntag; Takahiro Matsuda; Andreas Dengel; Masakazu Iwamura; Koichi Kise
In: Proceedings of the 2014 international conference on Intelligent user interfaces. International Conference on Intelligent User Interfaces (IUI-14), Haifa, Israel, Pages 329-334, ACM, 2014.

Abstract

Efficient text recognition has recently been a challenge for augmented reality systems. In this paper, we propose a system that provides translations to the user in real-time. We use eye gaze as a more intuitive and efficient input for ubiquitous text reading and translation in head-mounted displays (HMDs). The eyes can be used to indicate regions of interest in text documents and to activate optical character recognition (OCR) and translation functions. Visual feedback and navigation help in the interaction process, and text snippets with translations from Japanese to English are presented in a see-through HMD. We focus on travelers who go to Japan and need to read signs, and we propose two different gaze gestures for activating the OCR text reading and translation function. We evaluate which type of gesture suits our OCR scenario best. We also show that our gaze-based OCR method, applied to the extracted gaze regions, provides faster access to information than traditional OCR approaches. Other benefits include that visual feedback of the extracted text region can be given in real-time, that the Japanese-to-English translation can be presented in real-time, and that the augmentations of the synchronized and calibrated HMD in this mixed reality application appear at exact locations in the augmented user view, allowing dynamic text translation management in head-up display systems.
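The core interaction described above, using gaze to indicate a region of interest and trigger OCR, can be illustrated with a minimal dwell-detection sketch. This is an illustrative assumption, not the paper's implementation: gaze samples are assumed to arrive as `(timestamp, x, y)` tuples in screen coordinates, and a "dwell" gesture is taken to mean the gaze staying within a small radius for a minimum duration, after which the system would crop that region and hand it to an OCR and translation pipeline (represented here only by the returned center point).

```python
import math

def detect_dwell(samples, radius=30.0, min_duration=0.8):
    """Return the (x, y) center of a dwell fixation, or None.

    samples: chronological list of (t, x, y) gaze points
             (t in seconds, x/y in pixels).
    A dwell is detected when consecutive samples stay within
    `radius` pixels of the first sample of the run for at least
    `min_duration` seconds. The averaged gaze position would then
    serve as the region-of-interest center passed to OCR.
    """
    start = 0  # index of the first sample in the current fixation run
    for i in range(len(samples)):
        t0, x0, y0 = samples[start]
        t, x, y = samples[i]
        if math.hypot(x - x0, y - y0) > radius:
            start = i  # fixation broken; restart the run here
            continue
        if t - t0 >= min_duration:
            # average position of the run as the ROI center
            xs = [p[1] for p in samples[start:i + 1]]
            ys = [p[2] for p in samples[start:i + 1]]
            return (sum(xs) / len(xs), sum(ys) / len(ys))
    return None
```

A steady fixation near one point for about a second would yield a center, while saccadic (jumping) gaze returns `None`; thresholds such as the 30-pixel radius and 0.8-second duration are placeholders that a real system would calibrate per user and display.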
