
Solving the Midas Touch problem in object selection with eye tracking

Research Project

Project/Area Number 20K20866
Research Category

Grant-in-Aid for Challenging Research (Exploratory)

Allocation Type Multi-year Fund
Review Section Medium-sized Section 10: Psychology and related fields
Research Institution Kyushu University

Principal Investigator

Remijn Gerard  Kyushu University, Faculty of Design, Associate Professor (40467098)

Co-Investigator (Kenkyū-buntansha) Tomimatsu Erika  Kyushu University, Faculty of Design, Research Fellow (20584668)
Project Period (FY) 2020-07-30 – 2023-03-31
Project Status Completed (Fiscal Year 2022)
Budget Amount
¥6,240,000 (Direct Cost: ¥4,800,000, Indirect Cost: ¥1,440,000)
Fiscal Year 2022: ¥1,040,000 (Direct Cost: ¥800,000, Indirect Cost: ¥240,000)
Fiscal Year 2021: ¥1,560,000 (Direct Cost: ¥1,200,000, Indirect Cost: ¥360,000)
Fiscal Year 2020: ¥3,640,000 (Direct Cost: ¥2,800,000, Indirect Cost: ¥840,000)
Keywords eye gaze / visual object / dwell time / eye tracking / Midas Touch problem / visual password / object selection / age differences / pupil size / psychophysics
Outline of Research at the Start

For object selection with eye tracking, users need to rest their eyes on an object for a certain amount of time (the dwell time). Because user capabilities differ, for example with age, dwell times need to be set according to individual needs. Our goal is to investigate individual dwell times and to enable their calibration.
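
To make the dwell-time mechanism concrete, here is a minimal sketch in Python of gaze-based object selection (the GazeSample/Target interface and the 0.6 s default threshold are illustrative assumptions, not code or parameters from this project): a timer starts when the gaze enters a target's hit area, resets when it leaves, and triggers a selection once the dwell-time threshold is reached. Per-user calibration then amounts to tuning that single threshold.

    from dataclasses import dataclass

    @dataclass
    class GazeSample:
        x: float          # gaze position on screen, in pixels (hypothetical units)
        y: float
        timestamp: float  # sample time in seconds

    @dataclass
    class Target:
        name: str
        x: float          # target centre
        y: float
        radius: float     # hit area around the centre, in pixels

    def run_dwell_selection(gaze_stream, targets, dwell_time_s=0.6):
        """Return the first target the gaze rests on for at least dwell_time_s.

        A threshold that is too short selects objects the user merely glances at
        (the Midas Touch problem); one that is too long makes input slow.
        """
        current = None      # target the gaze is currently on, if any
        dwell_start = None  # timestamp when the gaze entered that target
        for sample in gaze_stream:
            hit = next(
                (t for t in targets
                 if (sample.x - t.x) ** 2 + (sample.y - t.y) ** 2 <= t.radius ** 2),
                None,
            )
            if hit is not current:
                # Gaze moved onto a different target (or off all targets): restart the timer.
                current, dwell_start = hit, sample.timestamp
            elif hit is not None and sample.timestamp - dwell_start >= dwell_time_s:
                return hit  # dwell threshold reached: the object is selected
        return None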

Outline of Final Research Achievements

With eye-tracking technology, we can select objects on a screen using our eye gaze. For example, we can perform “eye typing”, selecting letters one by one to enter a password. To select an object, we need to keep our eyes on it for a certain amount of time, called the dwell time. When the dwell time is too short, objects are selected even when we do not intend to select them; this is known as the Midas Touch problem. When the dwell time is too long, however, eye typing becomes too slow. In our experiments, we found that for users under 35 years of age the ideal dwell time is about 600 ms, and for users over 55 years of age it is about 800 ms. Importantly, these two dwell-time settings were obtained with relatively inexpensive eye-tracking devices and for a wide variety of objects, such as letters and numbers, dot patterns, and visual icons. Simplifying and standardizing dwell times to these 600 ms and 800 ms settings will greatly assist the use of eye-gaze-based information input in daily life.
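
Applied directly, the two reported settings reduce configuration to a single age check, as in the short sketch below (how to treat the intermediate 35 to 55 age range is an assumption added here for illustration; the summary above only reports the two groups tested):

    def recommended_dwell_time_ms(age_years: int) -> int:
        """Map a user's age to one of the two dwell-time settings reported above.

        About 600 ms was found ideal for users under 35 and about 800 ms for users
        over 55; the 35-55 range is not specified, so the longer, safer setting is
        used here as a fallback.
        """
        return 600 if age_years < 35 else 800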

Academic Significance and Societal Importance of the Research Achievements

When using eye gaze to select objects on a screen, the time the eyes need to rest on an object (the dwell time) can be reduced to just two settings, 600 ms or 800 ms, depending on the user’s age. This greatly simplifies the engineering of walk-up, touchless screen interfaces that use eye gaze.

Report (4 results)
  • 2022 Annual Research Report / Final Research Report (PDF)
  • 2021 Research-status Report
  • 2020 Research-status Report
Research Products (10 results)

Journal Article (2 results; of which Int'l Joint Research: 1, Peer Reviewed: 2, Open Access: 1), Presentation (3 results; of which Int'l Joint Research: 2, Invited: 1), Book (1 result), Remarks (4 results)

  • [Journal Article] Usability of various dwell times for eye-gaze-based object selection with eye tracking (2021)

    • Author(s)
      Paulus, Y.T., Remijn, G.B.
    • Journal Title

      Displays

      Volume: 67 Pages: 101997-101997

    • DOI

      10.1016/j.displa.2021.101997

    • Related Report
      2021 Research-status Report 2020 Research-status Report
    • Peer Reviewed
  • [Journal Article] Perceived Congruency in Audiovisual Stimuli Consisting of Gabor Patches and AM and FM Tones (2021)

    • Author(s)
      Postnova, N., Remijn, G.B.
    • Journal Title

      Multisensory Research

      Volume: 34 Issue: 5 Pages: 455-475

    • DOI

      10.1163/22134808-bja10041

    • Related Report
      2020 Research-status Report
    • Peer Reviewed / Open Access / Int'l Joint Research
  • [Presentation] Gaze-based object selection: Dwell time preferences for different object types and age groups (2022)

    • Author(s)
      Remijn, G.B., Paulus, Y.T.
    • Organizer
      44th European Conference on Visual Perception
    • Related Report
      2022 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Effect of implied motion in pictograms on perceived presentation duration (2022)

    • Author(s)
      Tomimatsu, E., Remijn, G.B., Ito, H.
    • Organizer
      44th European Conference on Visual Perception
    • Related Report
      2022 Annual Research Report
    • Int'l Joint Research
  • [Presentation] fNIRS and eye tracking: “Barrier-free” methods to observe human information processing (2021)

    • Author(s)
      Remijn, G.B.
    • Organizer
      Science and Design Symposium Vol. 4, Kyushu University
    • Related Report
      2021 Research-status Report 2020 Research-status Report
    • Invited
  • [Book] The Vision Society of Japan (Ed.), Illustrated Encyclopedia of Vision (図説視覚の事典), Part I: Basic Properties of Vision, 1.5 Light Sense (2022)

    • Author(s)
      Tomimatsu Erika (contributing author)
    • Total Pages
      360
    • Publisher
      朝倉書店 (Asakura Publishing)
    • Related Report
      2022 Annual Research Report
  • [Remarks]

    • URL

      https://hyoka.ofc.kyushu-u.ac.jp/search/details/K003633/english.html

    • Related Report
      2021 Research-status Report
  • [Remarks]

    • URL

      https://hyoka.ofc.kyushu-u.ac.jp/search/details/K003633/index.html

    • Related Report
      2021 Research-status Report
  • [Remarks]

    • URL

      https://hyoka.ofc.kyushu-u.ac.jp/search/details/K003633/english.html

    • Related Report
      2020 Research-status Report
  • [Remarks]

    • URL

      https://hyoka.ofc.kyushu-u.ac.jp/search/details/K003633/index.html

    • Related Report
      2020 Research-status Report

Published: 2020-08-03   Modified: 2024-01-30  
