
2023 Fiscal Year Final Research Report

A comprehensive optical and neuronal model of vision considering individual differences in complex environments based on the real space

Research Project

Project/Area Number: 20H04271
Research Category

Grant-in-Aid for Scientific Research (B)

Allocation Type: Single-year Grants
Section: General
Review Section: Basic Section 61060: Kansei informatics-related
Research Institution: Teikyo University

Principal Investigator

Mihashi Toshifumi, Teikyo University, Faculty of Medical Technology, Professor (20506266)

Co-Investigators (Kenkyū-buntansha)
Tanahashi Shigehito (棚橋 重仁), Niigata University, Institute of Science and Technology, Assistant Professor (00547292)
Hirota Masakazu (広田 雅和), Teikyo University, Faculty of Medical Technology, Lecturer (40835435)
Saida Shinya (斎田 真也), Kanagawa University, Affiliated Research Institute, Researcher (90357054)
Project Period (FY): 2020-04-01 – 2024-03-31
Keywords: ocular optics / psychophysics / AI / peripheral vision / eye movement / VR space / reading / eye movement examination
Outline of Final Research Achievements

This study took a multifaceted approach to testing visual function in complex environments. For eye movement measurement, an automatic analysis system for smooth eye movements in clinical settings was developed, evaluated, and improved in accuracy. A 3D model of an underground shopping mall was reconstructed and reproduced in a VR space, demonstrating that a complex environment can be presented using a head-mounted display (HMD). Visual function was also evaluated through a reading experiment with a restricted visual field and a visual acuity test in the peripheral visual field. Further results included the construction of an eye model specific to the Japanese population and the prediction of visual function from fundus images using AI.

Free Research Field

Ocular optics

Academic Significance and Societal Importance of the Research Achievements

In this study, methods for evaluating behavior and visual function in VR space were developed, and a highly immersive VR reproduction of an underground shopping mall was constructed. An experimental system combining eye movement measurement with visual stimulus presentation was also developed, along with an automatic analysis system for clinically applicable eye movement examinations and an eye model specific to the Japanese population. These achievements are expected to contribute to advances in VR technology and in vision-related medical technology, thereby enriching people's lives.

Published: 2025-01-30  


Powered by NII kakenhi