
1998 Fiscal Year Final Research Report Summary

Experimental analysis of auditory and gestural interface

Research Project

Project/Area Number 09610071
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Single-year Grants
Section General
Research Field Experimental Psychology
Research Institution Musashino Women's University (1998)
Tokyo Institute of Technology (1997)

Principal Investigator

HAMANO Takashi  Musashino Women's University, Faculty of Contemporary Society, Lecturer (00262288)

Co-Investigator (Kenkyū-buntansha) KUSUMI Takashi  Tokyo Institute of Technology, Graduate School of Decision Science and Technology, Associate Professor (70195444)
Project Period (FY) 1996 – 1998
Keywords interface / gesture / auditory
Research Abstract

The purpose of this research is to examine the usability of a multimodal interface that uses systematic musical tones and gestures. In experiment 1, three participants matched sound stimuli on three elements of musical tones: pitch, rhythm, and the direction of the sound. The results showed that matching accuracy was highest for pitch, second for rhythm, and lowest for direction, while the order in which the elements were matched was (1) rhythm, (2) pitch, and (3) direction. Combinations of the elements are also discussed. With these results, we can design auditory interfaces that draw users' attention to the most urgent information and allow it to be received with the necessary accuracy. Experiments 2 and 3 examined an auditory condition in which synchronic constant tones continuously presented multi-dimensional information. Two experiments were conducted under this condition: a maze-navigation task (experiment 2) and a mobile PC simulator (experiment 3). Under this condition, performance time was shorter than in the no-sound condition (experiment 2), and operating errors were less frequent when participants used the system a second time (experiment 3). These results indicate that this auditory condition promotes participants' learning of the systems. In experiments 4 and 5, 24 participants operated a database simulator. Under the gestural command condition, performance time was shorter than under the pointing and shortcut-key conditions. Finally, the implications of these results for auditory and gestural interfaces are discussed.
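As an illustration of the kind of mapping the abstract describes, the following minimal Python sketch assigns three hypothetical message dimensions to pitch, rhythm, and stereo direction, giving pitch the dimension that must be received most accurately, in line with the accuracy ordering reported for experiment 1. The names (AuditoryCue, encode), value tables, and mappings are assumptions for illustration only, not the system used in the experiments.

# Hypothetical sketch: encode a three-dimensional status message into a single
# auditory cue. The most critical dimension is mapped to pitch (the element
# matched most accurately), the next to rhythm, and the least critical to
# stereo direction. All names and values are illustrative assumptions.

from dataclasses import dataclass

PITCHES_HZ = [262, 330, 392, 523]          # C4, E4, G4, C5
RHYTHMS = ["single", "double", "triple"]   # number of beeps
DIRECTIONS = ["left", "center", "right"]   # stereo panning

@dataclass
class AuditoryCue:
    pitch_hz: int      # carries the most urgent information
    rhythm: str        # carries the second dimension
    direction: str     # carries the least critical dimension

def encode(urgency: int, category: int, source: int) -> AuditoryCue:
    """Map three integer-coded message dimensions onto tone elements."""
    return AuditoryCue(
        pitch_hz=PITCHES_HZ[urgency % len(PITCHES_HZ)],
        rhythm=RHYTHMS[category % len(RHYTHMS)],
        direction=DIRECTIONS[source % len(DIRECTIONS)],
    )

if __name__ == "__main__":
    # e.g. a high-urgency warning from a device on the user's right
    print(encode(urgency=3, category=1, source=2))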


Published: 1999-12-08  
