2017 Fiscal Year Research-status Report
Gaze understanding in Optical-see-through Head-mounted Displays
Project/Area Number | 17K12726
Research Institution | Nara Institute of Science and Technology
Principal Investigator | Plopski Alex, Nara Institute of Science and Technology, Graduate School of Information Science, Assistant Professor (70780071)
Project Period (FY) | 2017-04-01 – 2019-03-31
Keywords | Wearable devices
Outline of Annual Research Achievements
According to our plans for 2017, we developed a system for estimating the user's focus depth. The system records the user's gaze at different distances and regresses the gaze depth with a multi-layer perceptron neural network. We tested the system with 12 participants and found that we could accurately determine when the user is focusing on the calibrated planes; however, estimation errors at unknown depths remain significant.

We also developed a method for realistic rendering of virtual content on an optical see-through head-mounted display (OST-HMD) based on the user's focus depth. Our results show that in some cases we can fool participants into believing that the virtual object is real, although this does not succeed in others.

Furthermore, we studied how users perceive virtual content while focusing on objects located at a different distance. We expected that sharpening the presented content would help users recognize it better. Our results indicate that while sharpening indeed improved readability, the preferred amount of sharpening varied between participants.

Finally, we evaluated whether it is possible to increase the perceived brightness of virtual objects without noticeably dimming the real world. We gradually adjusted the transmission of shutter glasses over a period of time, thus reducing the amount of incoming light. Our results show that gradual dimming indeed increases the perceived brightness of the virtual content without significantly affecting the perceived brightness of the real world.
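The gaze-depth regression described above can be sketched as follows. This is a minimal illustration, not the project's actual pipeline: it assumes the gaze feature is the binocular vergence angle, uses scikit-learn's `MLPRegressor`, and trains on synthetic data; the interpupillary distance, network size, and noise level are all illustrative assumptions.

```python
# Hedged sketch: regress focus depth from a gaze feature with an MLP.
# All parameters below (IPD, layer sizes, noise) are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic training data: the vergence angle between the two eyes
# shrinks as the focus depth grows, so depth can be regressed from it.
depths = rng.uniform(0.25, 5.0, size=500)        # focus depth in metres
ipd = 0.063                                      # assumed interpupillary distance (m)
vergence = 2.0 * np.arctan(ipd / (2.0 * depths)) # ideal vergence per depth
X = (vergence + rng.normal(0.0, 1e-3, 500)).reshape(-1, 1)  # noisy gaze feature

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0),
)
model.fit(X, depths)

# Query the regressor with the ideal vergence for a 1 m focus depth.
pred = model.predict([[2.0 * np.arctan(ipd / 2.0)]])
print(round(float(pred[0]), 2))
```

In practice the feature vector would come from the HMD's eye tracker (e.g. per-eye pupil positions) rather than an idealized vergence angle, and the remaining error at uncalibrated depths reported above suggests the mapping is harder than this toy setup implies.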
Current Status of Research Progress
2: Research has progressed on the whole more than it was originally planned.
Reason
According to our plan, we have developed an HMD capable of eye-gaze tracking. We also investigated how to estimate the depth the user is focusing on. We are currently still investigating different calibration algorithms, but have already applied the depth estimation in algorithms that modify the virtual content according to the estimated focus depth. We have published 1 journal paper, 3 conference papers, and 1 poster paper with the results of this project.
Strategy for Future Research Activity
In 2018, we will continue mostly according to our previous plans. We will focus on evaluating eye-gaze calibration accuracy in augmented reality scenarios. We will compare the calibration results when real and virtual targets are used for the calibration process and explore how this process can assist in estimating the user's focus depth. We will combine the eye tracker with the autorefractometer available at our laboratory to provide an independent benchmark that can be used to compare different calibration and gaze-depth estimation methods. Finally, we will focus on the development of novel interaction metaphors and interfaces that take advantage of the available information. Furthermore, we want to investigate how eye-gaze tracking can be used for automatic calibration of an optical see-through head-mounted display.
Causes of Carryover
I was conducting research in the USA for 3 months. During this time, I could make use of the local facilities and did not need to hire an RA. I plan to purchase additional equipment and use the remaining budget for experiments during 2018. As our research progressed according to the original plan, carrying part of the budget over into the second year poses no issue.
Research Products
(6 results)