Human-robot interface based on instructions by neck movement
Project/Area Number | 23500653
Research Category | Grant-in-Aid for Scientific Research (C)
Allocation Type | Multi-year Fund
Section | General
Research Field | Rehabilitation science/Welfare engineering
Research Institution | Ehime University
Principal Investigator |
Project Period (FY) | 2011 – 2013
Project Status | Completed (Fiscal Year 2013)
Budget Amount | ¥4,550,000 (Direct Cost: ¥3,500,000, Indirect Cost: ¥1,050,000)
Fiscal Year 2013: ¥910,000 (Direct Cost: ¥700,000, Indirect Cost: ¥210,000)
Fiscal Year 2012: ¥910,000 (Direct Cost: ¥700,000, Indirect Cost: ¥210,000)
Fiscal Year 2011: ¥2,730,000 (Direct Cost: ¥2,100,000, Indirect Cost: ¥630,000)
Keywords | Human-robot interface / Welfare robot / Kansei engineering / Welfare engineering / Neck-movement instruction / Omni-directional mobile robot / Laser pointer / Kansei effect / Neck-movement operation / Kansei controller
Research Abstract | A human-robot system is considered in which a mobile robot moves according to the instruction of a laser spot projected by a laser pointer attached to the human's head. The human instructs the robot's desired movement by rotating his or her head. Three modes are introduced: a stopping mode, a following mode, and a mode in which the robot moves to the desired position autonomously. In the following mode, the robot realizes the intended movement by following the motion of the laser spot on the floor. A Kansei controller is inserted between the instructed movement of the laser spot and the following motion of the robot so that the robot's motion is psychologically acceptable. The effectiveness of the proposed system is examined experimentally using an omni-directional robot.
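The mode structure and spot-following behavior described in the abstract can be sketched in a few lines. This is only an illustrative outline, not the project's actual controller: the class and parameter names (`SpotFollower`, `gain`, `alpha`) are hypothetical, and a simple first-order low-pass filter stands in for the Kansei controller as one plausible way to make the following motion feel gentle rather than abrupt.

```python
from enum import Enum

class Mode(Enum):
    STOP = 0        # robot holds its position
    FOLLOW = 1      # robot keeps tracking the moving laser spot on the floor
    AUTONOMOUS = 2  # robot drives to a goal latched when the mode is entered

class SpotFollower:
    """Hypothetical sketch of the three-mode laser-spot interface."""

    def __init__(self, gain=1.0, alpha=0.2):
        self.gain = gain    # proportional gain toward the target
        self.alpha = alpha  # smoothing factor standing in for the Kansei controller
        self.vx = self.vy = 0.0
        self.goal = None    # target latched on entering AUTONOMOUS mode

    def step(self, mode, robot_xy, spot_xy):
        """Return a smoothed (vx, vy) command for an omni-directional base."""
        if mode is Mode.STOP:
            self.vx = self.vy = 0.0
            self.goal = None
            return (0.0, 0.0)
        if mode is Mode.AUTONOMOUS:
            if self.goal is None:
                self.goal = spot_xy  # latch the spot position as the goal
            target = self.goal
        else:
            target = spot_xy         # FOLLOW: chase the spot continuously
        ex = target[0] - robot_xy[0]
        ey = target[1] - robot_xy[1]
        # First-order smoothing so the robot eases toward the spot
        # instead of jumping with every small head movement.
        self.vx += self.alpha * (self.gain * ex - self.vx)
        self.vy += self.alpha * (self.gain * ey - self.vy)
        return (self.vx, self.vy)
```

Filtering the raw proportional command is one common way to trade responsiveness for motion that observers rate as more comfortable; the project's Kansei controller presumably tunes this trade-off from subjective evaluations.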
Report | (4 results)
Research Products | (6 results)