
Paper / Book Information


Title: Corneal-Imaging Calibration for Optical See-Through Head-Mounted Displays
Authors: Alexander Plopski, Yuta Itoh, Christian Nitschke, Kiyoshi Kiyokawa, Gudrun Klinker, Haruo Takemura
Language: English
Journal / Book: IEEE Transactions on Visualization and Computer Graphics
Volume, Number, Pages: vol. 21, no. 4, pp. 481-490
Publication date: January 16, 2015
Publisher: IEEE
Conference: IEEE VR 2015
Venue: Arles
DOI: https://doi.org/10.1109/TVCG.2015.2391857
Abstract: In recent years, optical see-through head-mounted displays (OST-HMDs) have moved from conceptual research to a market of mass-produced devices, with new models and applications being released continuously. It remains challenging, however, to deploy augmented reality (AR) applications that require consistent spatial visualization, such as maintenance, training, and medical tasks, because the view of the attached scene camera is shifted from the user's view. A calibration step can compute the relationship between the HMD screen and the user's eye to align the digital content. However, this alignment is only viable as long as the display does not move, an assumption that rarely holds for an extended period of time. As a consequence, continuous recalibration is necessary. Manual calibration methods are tedious and rarely support practical applications. Existing automated methods do not account for user-specific parameters and are error-prone. We propose combining a pre-calibrated display with a per-frame estimation of the user's cornea position to estimate the individual eye center and continuously recalibrate the system. With this, we also obtain the gaze direction, which allows for instantaneous uncalibrated eye gaze tracking without the need for additional hardware and complex illumination. Contrary to existing methods, we use simple image processing and do not rely on iris tracking, which is typically noisy and can be ambiguous. Evaluation with simulated and real data shows that our approach achieves a more accurate and stable eye pose estimation, which results in an improved and practical calibration with a largely improved distribution of the projection error.
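The abstract describes combining a pre-calibrated display model with a per-frame estimate of the eye position so that rendered content stays aligned with the real scene. A minimal sketch of that idea is given below; it assumes a planar display model in the HMD frame, hypothetical geometry, and a population-average offset from the corneal-sphere center to the eye center, and it is not the authors' implementation. It derives an eye-center estimate from a tracked cornea position and projects a 3D scene point onto the display so that the rendered pixel overlays the real point from the user's viewpoint.

    # Illustrative sketch only: all parameter values and the planar display
    # parameterization are assumptions, not the calibration from the paper.
    import numpy as np

    # Pre-calibrated display model in the HMD/scene-camera frame
    # (hypothetical values): plane origin, two in-plane axes, pixel pitch.
    screen_origin = np.array([-0.03, 0.02, 0.04])   # top-left corner [m]
    screen_x_axis = np.array([1.0, 0.0, 0.0])       # unit vector along pixel columns
    screen_y_axis = np.array([0.0, -1.0, 0.0])      # unit vector along pixel rows
    pixel_pitch   = 30e-6                           # metres per pixel

    def eye_center_from_cornea(cornea_center, optical_axis, offset=0.0057):
        """Estimate the eye center by stepping back from the corneal-sphere
        center along the optical axis; 5.7 mm is a population-average offset
        used here only for illustration."""
        return cornea_center - offset * optical_axis / np.linalg.norm(optical_axis)

    def project_to_screen(point_3d, eye_center):
        """Intersect the ray from the eye center through a 3D scene point
        with the display plane and return the pixel coordinates (u, v)."""
        normal = np.cross(screen_x_axis, screen_y_axis)
        ray = point_3d - eye_center
        denom = ray @ normal
        if abs(denom) < 1e-9:
            raise ValueError("Ray is parallel to the display plane")
        t = ((screen_origin - eye_center) @ normal) / denom
        hit = eye_center + t * ray                  # intersection on the plane
        local = hit - screen_origin
        u = (local @ screen_x_axis) / pixel_pitch
        v = (local @ screen_y_axis) / pixel_pitch
        return u, v

    # Per-frame usage with a placeholder cornea position (hypothetical values)
    cornea = np.array([0.0, 0.0, -0.02])
    eye = eye_center_from_cornea(cornea, optical_axis=np.array([0.0, 0.0, 1.0]))
    print(project_to_screen(np.array([0.1, 0.05, 1.0]), eye))

In the paper's setting the cornea position would come from corneal imaging with an eye camera each frame; here it is simply a hard-coded placeholder to show how a moving eye estimate feeds the reprojection.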
