
2017 Fiscal Year Final Research Report

Sound design methodology based on an Acoustic Affordance theory

Research Project

Project/Area Number 15H02882
Research Category

Grant-in-Aid for Scientific Research (B)

Allocation Type Single-year Grants
Section General
Research Field Design science
Research Institution Hachinohe Institute of Technology (2017)
Ryukoku University (2015-2016)

Principal Investigator

Miura Masanobu  Hachinohe Institute of Technology, Faculty of Engineering, Associate Professor (80368034)

Co-Investigator (Kenkyū-buntansha) Yasui Nozomiko  National Institute of Technology, Matsue College, Other departments, Lecturer (80607896)
Research Collaborator MIYAWAKI Satoshi  
FUKUMOTO Ippei  
HIROOKA Yuya  
OKADA Sota  
NATSUHARA Kou  
KURODA Toshiki  
YAMAGUCHI Shoya  
SUGURO Akio  
NISHIMORI Yumi  
OKEMOTO Madoka  
SHIMIZU Makiko  
FUKUMOTO Sota  
IWADATE Yuki  
FURUYA Kaihei-Brett  
Project Period (FY) 2015-04-01 – 2018-03-31
Keywords Affordance / EEG / MMN / sign sounds / design
Outline of Final Research Achievements

Recent artificial objects have been designed with Gibson's "affordance" in mind. Sign sounds are designed to convey particular messages to people, yet sound designers often produce them by relying on intuition and experience; a logical methodology for creating sign sounds has not yet been established. This project examined whether a "sound affordance" exists by conducting listening experiments, with musical chords as stimuli, under event-related potential (ERP) measurement. Regarding ERP latency, the ERP appeared later when participants listened to sounds carrying a potential meaning than to sounds without one. Moreover, positive stimuli elicited stronger ERP energy than the other condition. These results indicate that the presence of a potential meaning in a sound increases the complexity of brain processing. Therefore, the existence of sound affordance is supported.
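The comparison described above, ERP latency and energy between two stimulus conditions, can be sketched as follows. This is a minimal illustration on synthetic data, not the project's actual analysis; the function names, sampling rate, and simulated waveforms are all assumptions made for the example.

```python
# Hypothetical sketch: average epochs per condition, then compare the
# peak latency and RMS "energy" of the resulting ERPs. Synthetic data only.
import numpy as np

def erp_metrics(epochs, fs):
    """Return (peak_latency_s, rms_energy) of the grand-average ERP.

    epochs: array of shape (n_trials, n_samples), baseline-corrected.
    fs: sampling rate in Hz.
    """
    erp = epochs.mean(axis=0)              # grand average across trials
    peak_idx = np.argmax(np.abs(erp))      # largest deflection
    latency = peak_idx / fs                # seconds after stimulus onset
    energy = np.sqrt(np.mean(erp ** 2))    # RMS amplitude as "energy"
    return latency, energy

fs = 250                                   # assumed sampling rate (Hz)
t = np.arange(0, 0.6, 1 / fs)              # 600 ms epoch
rng = np.random.default_rng(0)

def make_epochs(peak_s, amp, n_trials=40):
    """Simulate trials: a Gaussian deflection at peak_s plus noise."""
    wave = amp * np.exp(-((t - peak_s) ** 2) / (2 * 0.02 ** 2))
    return wave + 0.1 * rng.standard_normal((n_trials, t.size))

# Simulated conditions: a later, larger deflection for "meaningful" sounds.
meaningful = make_epochs(peak_s=0.30, amp=1.2)
neutral = make_epochs(peak_s=0.20, amp=0.8)

lat_m, en_m = erp_metrics(meaningful, fs)
lat_n, en_n = erp_metrics(neutral, fs)
print(f"meaningful: latency={lat_m:.3f}s energy={en_m:.3f}")
print(f"neutral:    latency={lat_n:.3f}s energy={en_n:.3f}")
```

On this synthetic data the "meaningful" condition shows a later peak and stronger RMS energy, mirroring the direction of the reported findings.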

Free Research Field

Acoustic information processing

URL: 

Published: 2019-03-29  
