
2019 Fiscal Year Final Research Report

Generating artificial eye movements on a document for an objective readability measurement

Research Project

Project/Area Number 17K00276
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Multi-year Fund
Section General
Research Field Human interface and interaction
Research Institution Osaka Prefecture University

Principal Investigator

Dengel Andreas  Osaka Prefecture University, Research Promotion Organization, Visiting Professor (00773574)

Co-Investigator (Kenkyū-buntansha) Ishimaru Shoya (石丸 翔也)  Osaka Prefecture University, Research Promotion Organization, Visiting Researcher (10788730)
Kise Koichi (黄瀬 浩一)  Osaka Prefecture University, Graduate School of Engineering, Professor (80224939)
Project Period (FY) 2017-04-01 – 2020-03-31
Keywords Eye tracking / Visibility / Readability / Electrodermal activity / Reading behavior analysis / Document analysis / Data augmentation / Artificial intelligence
Outline of Final Research Achievements

In this research, we aim to evaluate the readability of a document on the basis of eye movements. Since it is not realistic to ask participants to read documents with eye-tracking devices every time, we developed a system that synthesizes artificial eye movements on a document and uses them for readability measurement. In the first year, as basic research for readability measurement, we investigated the relationship between cognitive load while reading and sensor signals, including eye movements and electrodermal activity. In the second year, we proposed a method to estimate the fixation duration on each word of a document without asking people to read the document with an eye tracker. In the final year, we improved the performance of the gaze synthesis through a well-controlled experiment and applied it to readability measurement.
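As an illustration of the per-word fixation-duration estimation mentioned above, the sketch below is a minimal, hypothetical example rather than the project's actual model: it assumes word length and log corpus frequency as stand-in features (the report does not specify the real feature set or learning method) and fits a least-squares regression to fixation durations recorded with an eye tracker, then predicts durations for words of an unseen document.

# Hypothetical sketch: predicting per-word fixation durations without an eye tracker.
# Word length and log frequency are assumed stand-in features, not the project's actual ones.
import numpy as np

def word_features(words, freq):
    """Feature matrix per word: [bias, word length, log corpus frequency]."""
    return np.array([[1.0, len(w), np.log(freq.get(w, 1))] for w in words])

# Toy training data: words with fixation durations (ms) recorded by an eye tracker.
train_words = ["the", "readability", "of", "documents", "matters"]
durations_ms = np.array([120.0, 310.0, 100.0, 280.0, 240.0])
freq = {"the": 5000, "of": 4500, "documents": 120, "readability": 15, "matters": 200}

X = word_features(train_words, freq)
# Least-squares fit: duration ≈ w0 + w1*length + w2*log(frequency)
coef, *_ = np.linalg.lstsq(X, durations_ms, rcond=None)

# Predict fixation durations for words of a new document, with no eye tracking.
new_words = ["artificial", "eye", "movements"]
predicted = word_features(new_words, freq) @ coef
for w, d in zip(new_words, predicted):
    print(f"{w}: {d:.0f} ms")

Synthesized fixations of this kind can then feed the readability measurement, and, as the significance section notes, serve as augmented training data for gaze-based recognition models.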

Free Research Field

Human-Document Interaction

Academic Significance and Societal Importance of the Research Achievements

From an academic perspective, the generated eye movements can be used as an artificial dataset for deep learning. Since the amount of available data is a common bottleneck in deep learning, such dataset augmentation accelerates research on activity recognition. In terms of societal impact, contributions to publishing, advertising, and education are expected. Textbooks are one of the most important sources of knowledge; based on the results of this project, students' cognitive load can be reduced by providing appropriate layouts and content.

Published: 2021-02-19  
