
Gaze region estimation algorithm without calibration available in a tablet

Research Project

Project/Area Number 17K04966
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Multi-year Fund
Section General
Research Field Special needs education
Research Institution National Institute of Technology, Kumamoto College

Principal Investigator

HAKATA Tetsuya  National Institute of Technology, Kumamoto College, Faculty of Electronics and Information Systems Engineering, CI Group, Professor (60237899)

Co-Investigator(Kenkyū-buntansha) SHIBASATO Koki  National Institute of Technology, Kumamoto College, Faculty of Electronics and Information Systems Engineering, CI Group, Professor (60259968)
KATO Tatsuya  National Institute of Technology, Kumamoto College, Faculty of Electronics and Information Systems Engineering, CI Group, Assistant Professor (10707970)
Project Period (FY) 2017-04-01 – 2020-03-31
Project Status Completed (Fiscal Year 2019)
Budget Amount
¥4,550,000 (Direct Cost: ¥3,500,000, Indirect Cost: ¥1,050,000)
Fiscal Year 2019: ¥650,000 (Direct Cost: ¥500,000, Indirect Cost: ¥150,000)
Fiscal Year 2018: ¥520,000 (Direct Cost: ¥400,000, Indirect Cost: ¥120,000)
Fiscal Year 2017: ¥3,380,000 (Direct Cost: ¥2,600,000, Indirect Cost: ¥780,000)
Keywords severe and multiple disabilities / human welfare engineering / social welfare / assistive technology / special needs education
Outline of Final Research Achievements

This study proposes a gaze region estimation method that requires neither an expensive device nor the initial setup procedure known as calibration. The objective is not only to implement the application but also to encourage its use by teachers at special needs schools who are not comfortable with IT or computers; the guiding concept is therefore "simple, user-friendly, yet effective." For challenged people to express their intentions, high-precision gaze detection is considered unnecessary; detecting the gaze region coarsely is sufficient. A typical webcam is therefore adopted to implement the algorithm. Since smartphones and tablet computers generally have a built-in camera, the proposed method can run on these devices. To assess the applicability of the proposed algorithm, trial experiments were conducted and its accuracy and usability were evaluated.
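The outline above favours coarse gaze *region* estimation over precise gaze points, which is what removes the need for per-user calibration. The following is a minimal illustrative sketch, not the project's actual code: it assumes a hypothetical classifier that outputs one logit per screen region for a webcam face crop, and shows how such logits could be mapped to a region, with an abstain option when confidence is low. The region names and threshold are assumptions for illustration.

```python
import math

# Hypothetical screen regions for a coarse (calibration-free) gaze estimator.
REGIONS = ["top-left", "top-right", "bottom-left", "bottom-right"]

def softmax(logits):
    """Convert raw per-region scores into probabilities."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def estimate_region(logits, threshold=0.5):
    """Return the most likely region, or None when the estimate is too uncertain."""
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return REGIONS[best] if probs[best] >= threshold else None

# A confident estimate: the second logit dominates.
print(estimate_region([0.2, 3.1, 0.1, -0.5]))  # -> top-right
# A uniform score pattern yields no decision (all probabilities 0.25 < 0.5).
print(estimate_region([0.0, 0.0, 0.0, 0.0]))  # -> None
```

Abstaining on low confidence matters for this use case: a wrong region triggers an unintended action, while a missed frame simply delays the response.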

Academic Significance and Societal Importance of the Research Achievements

By applying machine learning to gaze region detection, we built a mechanism that requires no calibration. The system is useful because it enables students to express their intentions using a tablet device, without expensive, specialized sensor equipment. For example, when used at mealtime greetings, if the phrase "itadakimasu" is played back when a student expresses intent by gazing at the screen, the surrounding students and teachers naturally perceive that the student performed the action independently, and everyone can share the moment. Because the people around them come to understand students' spontaneous expressions of intent in daily life, and the students gain a sense of accomplishment and self-affirmation, the system is expected to lead to an expansion of their independent activities.
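The mealtime scenario above — playing a greeting sound once a student has fixed their gaze on the screen — amounts to a dwell trigger on top of per-frame region estimates. The sketch below is an assumed illustration (class name, frame count, and API are hypothetical, not from the project): it fires only after the same region has been held for a set number of consecutive frames, which filters out brief, unintentional glances.

```python
class DwellTrigger:
    """Fire an action when the same gaze region is held for `dwell_frames` frames."""

    def __init__(self, dwell_frames=15):
        self.dwell_frames = dwell_frames  # e.g. ~0.5 s at 30 fps
        self.current = None               # region being dwelt on
        self.count = 0                    # consecutive frames in that region

    def update(self, region):
        """Feed one per-frame region estimate; return the region when dwell completes."""
        if region is None or region != self.current:
            # Gaze moved (or estimator abstained): restart the dwell timer.
            self.current = region
            self.count = 0
            return None
        self.count += 1
        if self.count >= self.dwell_frames:
            self.count = 0  # reset so the action does not repeat every frame
            return region
        return None

# Holding region "A" for enough frames fires once; earlier frames return None.
trigger = DwellTrigger(dwell_frames=3)
print([trigger.update("A") for _ in range(4)])  # -> [None, None, None, 'A']
```

Resetting the counter after firing keeps a sustained gaze from replaying the sound every frame; in the classroom example, the "itadakimasu" playback would hang off the non-None return value.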

Report

(4 results)
  • 2019 Annual Research Report   Final Research Report (PDF)
  • 2018 Research-status Report
  • 2017 Research-status Report
  • Research Products

    (5 results)

Presentation (5 results) (of which Int'l Joint Research: 2 results)

  • [Presentation] Gaze Region Estimation Algorithm without Calibration Using Convolutional Neural Network (2019)

    • Author(s)
      Tatsuya Kato, Keigo Jo, Koki Shibasato, Tetsuya Hakata
    • Organizer
      7th International Conference on Applied Computing and Information Technology
    • Related Report
      2019 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Construction and Evaluation of a Gaze Region Estimation System Using Deep Learning (2019)

    • Author(s)
      Keigo Jo, Koki Shibasato, Hirofumi Ohtsuka
    • Organizer
      IEEJ Kyushu Branch, FY2018 (9th) KOSEN Research Conference
    • Related Report
      2018 Research-status Report
  • [Presentation] Construction of a Gaze Region Estimation Algorithm That Does Not Require Calibration (2018)

    • Author(s)
      Keigo Jo, Ryohei Oura, Koki Shibasato, Tatsuya Kato, Tetsuya Hakata
    • Organizer
      Japan Society of Welfare Engineering, 22nd Annual Conference
    • Related Report
      2018 Research-status Report
  • [Presentation] A study on communication aid of face and eye motion (2018)

    • Author(s)
      Masaki Kuwata, Koki Shibasato, Hirofumi Ohtsuka, Yasuyuki Shimada
    • Organizer
      Proceedings of International Symposium on Innovative Engineering 2018
    • Related Report
      2017 Research-status Report
    • Int'l Joint Research
  • [Presentation] A Fuzzy Inference Method for Gaze Direction Estimation Using a Monocular Camera (2017)

    • Author(s)
      Masaki Kuwata, Tetsuya Hakata, Tatsuya Kato, Hirofumi Ohtsuka, Koki Shibasato
    • Organizer
      Japan Society of Welfare Engineering, 21st Annual Conference
    • Related Report
      2017 Research-status Report

Published: 2017-04-28   Modified: 2021-12-27  
