
2018 Fiscal Year Final Research Report

Research on guidance method for visually impaired person using a three-dimensional range sensor

Research Project

Project/Area Number 16K00349
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Multi-year Fund
Section General
Research Field Intelligent robotics
Research Institution University of Tsukuba

Principal Investigator

Ohya Akihisa  University of Tsukuba, Faculty of Engineering, Information and Systems, Professor (30241798)

Research Collaborator SUWABE kiyoshi  
ENDO yohei  
NISHINO yuki  
WATANABE takashi  
Project Period (FY) 2016-04-01 – 2019-03-31
Keywords Human guidance / Visually impaired / 3D range sensor
Outline of Final Research Achievements

In this research, as a novel form of walking support for the visually impaired, we propose a method of guiding a person to a destination along a route specified on a map created in advance. The goal was to prototype such a system and evaluate its usefulness. The current position and posture of the person being guided are calculated from surrounding environment information measured with a recently developed three-dimensional laser range sensor. Multiple vibration motors fixed on a waist belt indicate the direction to proceed next. Experiments with the constructed system showed that guidance including stair climbing and obstacle avoidance can be realized in indoor environments. In addition, the applicable area was expanded by solving problems that arose when using the system outdoors.
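The belt-mounted interface described above can be illustrated with a minimal sketch: given the walker's current heading and the bearing of the next waypoint, pick which of several evenly spaced vibration motors to activate. This is an assumption-laden illustration, not the report's actual implementation; the number of motors (`n_motors`), the angle convention, and the function names are hypothetical.

```python
def select_motor(target_bearing_deg: float, heading_deg: float, n_motors: int = 8) -> int:
    """Map the angular error between the walker's heading and the direction
    to proceed onto one of n_motors vibration motors spaced evenly around a
    waist belt. Motor 0 is straight ahead; indices increase clockwise."""
    # Signed heading error in (-180, 180]; positive means "turn right".
    error = (target_bearing_deg - heading_deg + 180.0) % 360.0 - 180.0
    # Angular width of the sector each motor covers.
    sector = 360.0 / n_motors
    # Snap the error to the nearest motor index, wrapping around the belt.
    return round(error / sector) % n_motors

# Walker faces 90 deg; the route says proceed at 135 deg, i.e. 45 deg to the
# right, so the motor one sector clockwise from "ahead" vibrates.
print(select_motor(135.0, 90.0))  # -> 1
```

A real system would also smooth the heading estimate from the 3D range sensor before cueing a motor, since raw pose estimates jitter; this sketch only shows the direction-to-motor mapping.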

Free Research Field

Intelligent robotics

Academic Significance and Societal Importance of the Research Achievements

The original contribution of this research is that, by applying autonomous navigation technology for mobile robots to the guidance of humans, a person can be made to walk along a specified route with an accuracy that could not previously be achieved. Just as a robot does, the guided person can also walk while avoiding obstacles. Because the method does not rely on the user's visual function, it can be used directly by visually impaired people. If a map is created and the route to walk is specified in advance, a person can be guided accurately even in a place they are visiting for the first time. Since the three-dimensional sensor reveals the condition of the road surface ahead, it also becomes possible to prevent accidents such as falling from a station platform. If put into practical use, this method could become a very useful means of helping visually impaired people go out.

Published: 2020-03-30  
