Interface for Mobile Feeding Assistive Robotic Arm
Project/Area Number | 25350671
Research Category | Grant-in-Aid for Scientific Research (C)
Allocation Type | Multi-year Fund
Section | General
Research Field | Rehabilitation science/Welfare engineering
Research Institution | University of the Ryukyus
Principal Investigator | Higa Hiroki (University of the Ryukyus, Faculty of Engineering, Professor) (60295300)
Project Period (FY) | 2013-04-01 – 2016-03-31
Project Status | Completed (Fiscal Year 2015)
Budget Amount | ¥5,200,000 (Direct Cost: ¥4,000,000, Indirect Cost: ¥1,200,000)
Fiscal Year 2015: ¥390,000 (Direct Cost: ¥300,000, Indirect Cost: ¥90,000)
Fiscal Year 2014: ¥520,000 (Direct Cost: ¥400,000, Indirect Cost: ¥120,000)
Fiscal Year 2013: ¥4,290,000 (Direct Cost: ¥3,300,000, Indirect Cost: ¥990,000)
Keywords | Robotic arm / Feeding assistance / Neurological disease / Interface / EEG / Eye movement / EEG measurement
Outline of Final Research Achievements | In this study, we developed a user interface for a mobile robotic arm. We created a visual stimulus program in which one of four triangles shown on a computer monitor was randomly turned white. Six able-bodied volunteers participated in the experiments. EEG signals were recorded from the scalp, amplified, sampled, and bandpass-filtered, and each subject was asked to silently count the white-triangle flashes. Event-related potential features (P300 and N100) were extracted from the averaged waveforms, and an overall classification accuracy of 95.8% across subjects was obtained with an artificial neural network. A vision-based user interface was also developed: it consists of a single web camera that captures the user's eye movements and a computer running a program that detects the center of the iris and pupil in the captured images. An able-bodied subject was able to operate the proposed interface without difficulty. The interface helps the user select one of several foods on a table.
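The visual stimulus program is only summarized in the outline. A minimal sketch of such a four-triangle display is given below in Python/tkinter; the flash and inter-stimulus durations (200 ms on, 800 ms gap), the triangle layout, and the colors are illustrative assumptions, not values reported by the study.

```python
# Sketch of a four-triangle oddball-style stimulus: one randomly chosen
# triangle is briefly turned white. Timing and layout are assumed.
import random
import tkinter as tk

ON_MS, OFF_MS = 200, 800   # assumed flash and inter-stimulus durations

root = tk.Tk()
canvas = tk.Canvas(root, width=400, height=400, bg="black")
canvas.pack()

# Four triangles at fixed positions (top, bottom, left, right), initially grey.
coords = [(200, 40, 160, 110, 240, 110),
          (200, 360, 160, 290, 240, 290),
          (40, 200, 110, 160, 110, 240),
          (360, 200, 290, 160, 290, 240)]
tris = [canvas.create_polygon(*c, fill="grey") for c in coords]

def flash():
    """Turn one randomly chosen triangle white, then grey again."""
    t = random.choice(tris)
    canvas.itemconfig(t, fill="white")
    root.after(ON_MS, lambda: canvas.itemconfig(t, fill="grey"))
    root.after(ON_MS + OFF_MS, flash)   # schedule the next flash

root.after(OFF_MS, flash)
root.mainloop()
```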
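The EEG chain described above (bandpass filtering, epoch averaging, N100/P300 feature extraction, neural-network classification) could look roughly like the following sketch. The sampling rate, passband, epoch window, latency bands, and network size are all assumptions, since the outline does not report them, and the synthetic data merely stand in for the recorded signals.

```python
# Sketch of the P300/N100 pipeline under assumed parameters (256 Hz sampling,
# 0.5-30 Hz passband, 600 ms epochs); all values are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.neural_network import MLPClassifier

FS = 256                 # assumed sampling rate (Hz)
EPOCH = int(0.6 * FS)    # 600 ms post-stimulus window

def bandpass(x, lo=0.5, hi=30.0, fs=FS, order=4):
    """Zero-phase Butterworth bandpass, standing in for the unspecified filter."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def average_epochs(eeg, flash_onsets):
    """Average the post-stimulus epochs for one triangle to raise the SNR."""
    epochs = [eeg[t:t + EPOCH] for t in flash_onsets if t + EPOCH <= len(eeg)]
    return np.mean(epochs, axis=0)

def erp_features(avg):
    """Crude N100/P300 features: signed extrema in their typical latency bands."""
    n100 = avg[int(0.08 * FS):int(0.15 * FS)].min()   # ~80-150 ms negativity
    p300 = avg[int(0.25 * FS):int(0.50 * FS)].max()   # ~250-500 ms positivity
    return np.array([n100, p300])

# Toy usage with synthetic data (placeholders for the recorded EEG).
rng = np.random.default_rng(0)
X, y = [], []
for label in (0, 1):      # 1 = attended (counted) triangle, 0 = non-target
    for _ in range(40):
        eeg = rng.normal(0, 1, 4 * FS)
        onsets = [FS, 2 * FS]
        if label:         # inject a P300-like bump on target trials only
            for t in onsets:
                eeg[t + int(0.3 * FS):t + int(0.4 * FS)] += 2.0
        avg = average_epochs(bandpass(eeg), onsets)
        X.append(erp_features(avg))
        y.append(label)

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```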
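For the vision-based interface, the outline states only that the center of the iris and pupil is detected from webcam images. One plausible approach, assumed here rather than taken from the study, is dark-blob thresholding with OpenCV:

```python
# Sketch of pupil-center detection from a single frame via dark-blob
# thresholding (OpenCV 4). The study's actual algorithm is not described.
import cv2
import numpy as np

def pupil_center(frame_bgr, thresh=40):
    """Return the (x, y) centroid of the darkest blob, a proxy for the pupil."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # The pupil is usually the darkest region; keep pixels below the threshold.
    _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

# Toy usage: a synthetic frame with a dark disk standing in for the pupil.
frame = np.full((240, 320, 3), 200, dtype=np.uint8)
cv2.circle(frame, (160, 120), 15, (10, 10, 10), -1)
print(pupil_center(frame))   # approximately (160.0, 120.0)
```

The detected center can then be mapped to screen or table coordinates so that gaze direction selects one of the foods, as the outline describes.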
Report (4 results)
Research Products (9 results)