
Direct-method foreign language learning through the acting of actual 3-D robots on a 2-D projection

Research Project

Project/Area Number 16K12788
Research Category

Grant-in-Aid for Challenging Exploratory Research

Allocation Type Multi-year Fund
Research Field Educational technology
Research Institution Mie University

Principal Investigator

MATSUI Hirokazu  Mie University, Graduate School of Engineering, Assistant Professor (10303752)

Project Period (FY) 2016-04-01 – 2020-03-31
Project Status Completed (Fiscal Year 2019)
Budget Amount
¥3,250,000 (Direct Cost: ¥2,500,000, Indirect Cost: ¥750,000)
Fiscal Year 2018: ¥1,040,000 (Direct Cost: ¥800,000, Indirect Cost: ¥240,000)
Fiscal Year 2017: ¥650,000 (Direct Cost: ¥500,000, Indirect Cost: ¥150,000)
Fiscal Year 2016: ¥1,560,000 (Direct Cost: ¥1,200,000, Indirect Cost: ¥360,000)
Keywords Foreign language conversation learning / Communication robots / Direct method / E-learning / Language learning / Fusion of 3-D and 2-D / Fusion of the virtual and the physical / Foreign language learning
Outline of Final Research Achievements

There are two main outcomes. First, I built a foreign language conversation learning system based on the direct method. It consists of two small mobile robots acting on a stage formed by a large LCD monitor, and it uses the speech recognition system of a smartphone. The two robots perform a play for the subject using gestures and the not-yet-learned language; the play is constructed so that the subject can follow it. The subject joins the play by uttering some words of the not-yet-learned language, and the system judges whether each utterance is suitable. Second, I presented examples showing that subjects can learn foreign language conversation with this system without relying on any language they already know.
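The last step described above, judging whether a learner's utterance is suitable, can be illustrated with a small sketch. The Python snippet below is not the project's actual implementation: the scene script and the similarity threshold are hypothetical, and it only assumes that the smartphone speech recognizer returns a plain-text transcript of the learner's utterance.

    from difflib import SequenceMatcher

    # Hypothetical scene script: utterances the system would accept at each step of the play.
    SCENE_EXPECTED = {
        "greeting": ["hello", "good morning"],
        "offer": ["yes please", "no thank you"],
    }

    def is_suitable(recognized: str, scene: str, threshold: float = 0.8) -> bool:
        """Return True if the recognized utterance is close enough to any
        expected utterance for the current scene (simple string similarity)."""
        recognized = recognized.lower().strip()
        return any(
            SequenceMatcher(None, recognized, expected).ratio() >= threshold
            for expected in SCENE_EXPECTED[scene]
        )

    # Example: the learner answers during the hypothetical "offer" scene.
    print(is_suitable("yes, please", "offer"))  # True: near match to "yes please"
    print(is_suitable("banana", "offer"))       # False: matches no expected utterance

A real system could additionally weigh recognition confidence or response timing, but the core of the judgment is a comparison of the recognized utterance against the utterances the current scene of the play expects.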

Academic Significance and Societal Importance of the Research Achievements

The academic significance of this research lies in its study of how people interpret a robot's nonverbal communication. This work explores communication that does not rely on language and leads toward more natural communication with robots.
Its societal significance is that it demonstrated several examples in which a conversation learning method that enables foreign language conversation without going through the learner's native language can be carried out using robots. This shows the potential to contribute to improving Japanese learners' foreign language conversation skills.

Report (5 results)

  • 2019 Annual Research Report / Final Research Report (PDF)
  • 2018 Research-status Report
  • 2017 Research-status Report
  • 2016 Research-status Report

Research Products (5 results)


Journal Article (2 results, both peer reviewed) / Presentation (3 results)

  • [Journal Article] Reinforcement Learning with Multiplex Learning Space in Reward-delay Environments (2019)

    • Author(s)
      西澤 智恵子, 松井 博和, 野村 由司彦
    • Journal Title

      IEEJ Transactions on Electronics, Information and Systems

      Volume: 139 Issue: 7 Pages: 847-848

    • DOI

      10.1541/ieejeiss.139.847

    • NAID

      130007672075

    • ISSN
      0385-4221, 1348-8155
    • Year and Date
      2019-07-01
    • Related Report
      2019 Annual Research Report
    • Peer Reviewed
  • [Journal Article] Reinforcement Learning by Using Dual Q-table for the Whole Space and a Partial Space (2019)

    • Author(s)
      松井 博和, 西澤 智恵子, 野村 由司彦
    • Journal Title

      Journal of the Robotics Society of Japan

      Volume: 37 Issue: 7 Pages: 620-631

    • DOI

      10.7210/jrsj.37.620

    • NAID

      130007708639

    • ISSN
      0289-1824, 1884-7145
    • Related Report
      2019 Annual Research Report
    • Peer Reviewed
  • [Presentation] Direct-method foreign language conversation learning with a robot teacher: An investigation of the relationship between pause ("ma") duration and estimation of answer content (2019)

    • Author(s)
      柴田雄也,松井博和
    • Organizer
      2019 (Reiwa 1) Tokai-Section Joint Conference on Electrical, Electronics, Information, and Related Engineering
    • Related Report
      2019 Annual Research Report
  • [Presentation] Direct-method foreign language conversation learning with a robot teacher: Automation of the speech recognition system (2018)

    • Author(s)
      松崎大起,松井博和,加藤典彦
    • Organizer
      Robotics and Mechatronics Conference 2018 (ROBOMECH 2018)
    • Related Report
      2018 Research-status Report
  • [Presentation] Direct-method foreign language conversation learning with a robot teacher: Application to first-year junior high school English education (2016)

    • Author(s)
      島中美羽,松井博和
    • Organizer
      2016 (Heisei 28) Tokai-Section Joint Conference on Electrical, Electronics, Information, and Related Engineering
    • Place of Presentation
      National Institute of Technology, Toyota College (Toyota, Aichi)
    • Year and Date
      2016-09-12
    • Related Report
      2016 Research-status Report


Published: 2016-04-21   Modified: 2021-02-19  
