A study of the perception process of three-dimensional auditory space under multi-modal information
Project/Area Number |
12650359
|
Research Category |
Grant-in-Aid for Scientific Research (C)
|
Allocation Type | Single-year Grants |
Section | General |
Research Field |
Information and Communication Engineering
|
Research Institution | TOHOKU UNIVERSITY |
Principal Investigator |
SUZUKI Yoichi, Tohoku University, Research Institute of Electrical Communication, Professor (20143034)
|
Co-Investigator (Kenkyū-buntansha) |
SAKAMOTO Shuichi, Tohoku University, Research Institute of Electrical Communication, Research Associate (60332524)
NISHIMURA Ryouichi, Tohoku University, Research Institute of Electrical Communication, Research Associate (30323116)
GYOBA Jiro, Tohoku University, Graduate School/Faculty of Arts and Letters, Associate Professor (50142899)
|
Project Period (FY) |
2000 – 2001
|
Project Status |
Completed (Fiscal Year 2001)
|
Budget Amount |
¥3,200,000 (Direct Cost: ¥3,200,000)
Fiscal Year 2001: ¥1,500,000 (Direct Cost: ¥1,500,000)
Fiscal Year 2000: ¥1,700,000 (Direct Cost: ¥1,700,000)
|
Keywords | spatial hearing / virtual sound space / multi-modality / sound localization / vection / self-motion perception / somatic information / auditory display / moving sound source / head movement / head-related transfer function / front-back confusion |
Research Abstract |
This study aims to clarify how we perceive three-dimensional auditory space under the somatic and visual information generated by the listener's own movement. In the first year, we experimentally examined the effect of head movement on sound localization using both real and virtual sound sources, and we also investigated the pattern of head movement during sound localization. The results showed that, irrespective of whether the sound source was real or virtual, both front-back confusions and localization errors decreased when listeners moved their heads. The amount of head movement was smaller when listening to a virtual sound source than when listening to a real one. This suggests that head movement is highly effective for sound localization, especially when listening to virtual sound sources. In the second year, we examined whether auditory information containing spatial cues can induce self-motion perception. The results showed that self-motion consistent with the direction of sound movement was induced, and that the amount of induced self-motion depended on the direction of sound movement. This tendency differed from that observed in visual vection, suggesting that a high-performance three-dimensional spatial hearing system could be built through the effective combined use of visual and auditory information. Based on these results, we first constructed a precise three-dimensional spatial hearing system, then used it to examine the effect of head movement and to analyze somatic perception quantitatively. These findings should contribute to a fuller understanding of the perception of three-dimensional auditory space in multi-modal environments.
|
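Although the report gives no implementation details, the following minimal sketch (Python, with synthetic placeholder HRIRs and hypothetical function names not taken from the study) illustrates the general idea behind a head-tracked virtual sound rendering system of the kind described: the source direction is recomputed relative to the listener's current head orientation before selecting the head-related impulse response pair, which is the dynamic cue that helps resolve front-back confusion when the head moves.

```python
# Minimal sketch (assumed, not from the report): head-tracked binaural rendering.
# The HRIRs below are random placeholders; a real system would use measured
# head-related impulse responses on a fine directional grid.
import numpy as np

fs = 48000
angles = np.arange(0, 360, 30)            # coarse azimuth grid (degrees)
rng = np.random.default_rng(0)
# hypothetical HRIR database: azimuth -> (left_ir, right_ir)
hrir_db = {a: (rng.standard_normal(128) * 0.01,
               rng.standard_normal(128) * 0.01) for a in angles}

def render_binaural(mono, source_az, head_yaw):
    """Convolve a mono signal with the HRIR pair nearest to the
    source direction expressed relative to the current head orientation."""
    rel_az = (source_az - head_yaw) % 360
    nearest = min(hrir_db, key=lambda a: min(abs(a - rel_az), 360 - abs(a - rel_az)))
    left_ir, right_ir = hrir_db[nearest]
    left = np.convolve(mono, left_ir)
    right = np.convolve(mono, right_ir)
    return np.stack([left, right], axis=0)

# Example: a 0.1 s noise burst at 30 degrees azimuth; as the head turns, the
# relative angle (and hence the selected HRIR pair) changes, providing the
# dynamic cue associated with reduced front-back confusion.
burst = rng.standard_normal(int(0.1 * fs))
for yaw in (0.0, 15.0, 30.0):
    out = render_binaural(burst, source_az=30.0, head_yaw=yaw)
    print(f"head yaw {yaw:5.1f} deg -> output shape {out.shape}")
```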
Report (3 results)
Research Products (8 results)