Development of Cognitive Symbiosis in Virtual Agents to Improve Remote Classroom Learning Outcomes

Research Project

Project/Area Number 23K21688
Project/Area Number (Other) 21H03482 (2021-2023)
Research Category

Grant-in-Aid for Scientific Research (B)

Allocation Type Multi-year Fund (2024)
Single-year Grants (2021-2023)
Section General
Review Section Basic Section 61020: Human interface and interaction-related
Research Institution Osaka University

Principal Investigator

ORLOSKY JASON  Osaka University, Cybermedia Center, Specially Appointed Associate Professor (full-time) (10815111)

Co-Investigator (Kenkyū-buntansha) SHIRAI Shizuka  Osaka University, Cybermedia Center, Lecturer (30757430)
KIYOKAWA Kiyoshi  Nara Institute of Science and Technology, Graduate School of Science and Technology, Professor (60358869)
Project Period (FY) 2021-04-01 – 2025-03-31
Project Status Granted (Fiscal Year 2024)
Budget Amount
¥17,160,000 (Direct Cost: ¥13,200,000, Indirect Cost: ¥3,960,000)
Fiscal Year 2024: ¥1,430,000 (Direct Cost: ¥1,100,000, Indirect Cost: ¥330,000)
Fiscal Year 2023: ¥3,120,000 (Direct Cost: ¥2,400,000, Indirect Cost: ¥720,000)
Fiscal Year 2022: ¥5,720,000 (Direct Cost: ¥4,400,000, Indirect Cost: ¥1,320,000)
Fiscal Year 2021: ¥6,890,000 (Direct Cost: ¥5,300,000, Indirect Cost: ¥1,590,000)
Keywords Eye Tracking / Artificial Intelligence / Learning / Agents / augmented reality / virtual reality / eye tracking / State Detection / Remote Environments / agents / cognition / learning / simulation / remote interaction / education
Outline of Research at the Start

This research involves integrating voice-to-text with AR for interactive learning, customizing LLMs for conversational education styles, and refining these technologies through user feedback. Evaluations will be conducted to assess improvements in both language learning and content learning.
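
As a concrete illustration of this planned pipeline, the Python sketch below shows one tutoring cycle: a learner utterance is transcribed, passed to a conversationally customized LLM, and the reply is rendered in the AR view. This is an assumption-laden sketch, not the project's implementation; the function names (speech_to_text, query_tutor_llm, render_in_ar) are placeholders standing in for whatever ASR engine, LLM service, and AR display SDK a real system would use.

    # Minimal sketch of the voice-to-AR tutoring loop described above.
    # The three backends are stand-ins; a real system would call an ASR
    # engine, an LLM service, and the AR display SDK in their place.
    from dataclasses import dataclass

    @dataclass
    class TutorTurn:
        learner_utterance: str
        agent_reply: str

    def speech_to_text(audio_frame: bytes) -> str:
        """Placeholder ASR: a real system would stream audio to a recognizer."""
        return "How do I say 'library' in Japanese?"

    def query_tutor_llm(utterance: str, style: str = "conversational") -> str:
        """Placeholder for an LLM customized to an educational dialogue style."""
        return f"({style} reply to: {utterance})"

    def render_in_ar(text: str) -> None:
        """Placeholder for drawing a text panel in the learner's AR view."""
        print(f"[AR panel] {text}")

    def tutoring_step(audio_frame: bytes) -> TutorTurn:
        utterance = speech_to_text(audio_frame)
        reply = query_tutor_llm(utterance)
        render_in_ar(reply)
        return TutorTurn(utterance, reply)

    if __name__ == "__main__":
        tutoring_step(b"")  # one interaction cycle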

Outline of Annual Research Achievements

In this period, we explored applications of virtual reality (VR) and augmented reality (AR) in education and healthcare. First, we conducted a study on educational comics that used eye tracking in VR to identify key gaze features for estimating the difficulty levels perceived by readers, suggesting a way to dynamically adjust educational content. We also developed AMSwipe, a gaze-based text input method for virtual environments that enables efficient, hands-free typing without physical controllers. Additionally, EyeShadows, a tool we developed and tested in both AR and VR, improves the selection and manipulation of virtual elements using peripheral copies of items, enabling faster, more accurate interactions. Furthermore, we leveraged VR to enhance medical training, particularly by simulating the experiences of Parkinson's disease patients to foster empathy and insight among healthcare students. These technologies demonstrate significant potential for enhancing remote education, providing immersive, interactive learning experiences that can be tailored to individual needs and capabilities. We also explored interactive re-training of neural networks for applications in language learning.
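
Illustrative example: a minimal sketch (not the published pipeline) of how gaze features such as total fixation time and saccade count might be extracted from eye-tracking samples and fed to a classifier to estimate the perceived difficulty of a comic panel. The velocity threshold, the specific features, and the use of a random forest are assumptions for illustration only, and the data here are synthetic stand-ins.

    # Sketch of gaze-feature extraction and difficulty estimation (Python).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    def gaze_features(t, x, y, vel_thresh=0.05):
        """Simple features from gaze samples (t in seconds, x/y normalized)."""
        dt = np.diff(t)
        speed = np.hypot(np.diff(x), np.diff(y)) / np.maximum(dt, 1e-6)
        fixating = speed < vel_thresh                  # low velocity -> fixation
        fixation_time = np.sum(dt[fixating])           # total dwell time
        saccades = np.sum(np.diff(fixating.astype(int)) == -1)  # fixation ends
        return [fixation_time, saccades, speed.mean()]

    # Train on labelled reading sessions; features and labels are synthetic stand-ins.
    rng = np.random.default_rng(0)
    X = rng.random((40, 3))                # per-panel gaze features
    y = (X[:, 0] > 0.5).astype(int)        # 0 = easy, 1 = difficult
    clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

    # Estimate difficulty for a new (synthetic) gaze trace.
    t = np.linspace(0, 5, 300)
    gx, gy = rng.random(300), rng.random(300)
    print("estimated difficulty:", clf.predict([gaze_features(t, gx, gy)])[0])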

Current Status of Research Progress

2: Research has progressed on the whole more than it was originally planned.

Reason

This research is generally progressing on schedule. We have several publications in different areas, and we have set up a remote environment to conduct the last phase of the research over the next year.

Strategy for Future Research Activity

The last phase of the research will proceed as planned, though we have made some updates due to advances in AI technology. In addition, we are working on an in-situ object labeling approach, which can assist with a more specific learning task: language learning. Our system will be extended to incorporate Large Language Models (LLMs), which will be customized to power virtual educational agents. This should result in more interactive, context-based learning, leading to longer retention of concepts.
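
As a rough sketch of this planned in-situ labeling flow (assumptions only, not the project's implementation), the Python snippet below turns a recognized object in the learner's surroundings into a prompt for an LLM-backed educational agent, which returns a target-language word and a short practice sentence to overlay near the object. The object detector and the LLM call (detect_object, ask_agent) are hypothetical placeholders, and the prompt format is illustrative.

    # Sketch of LLM-assisted in-situ object labeling for language learning.
    from dataclasses import dataclass

    @dataclass
    class Annotation:
        object_name: str
        target_word: str
        example_sentence: str

    def detect_object(frame_id: int) -> str:
        """Placeholder for an object recognizer running on the headset camera."""
        return "cup"

    def ask_agent(prompt: str) -> str:
        """Placeholder for the customized LLM powering the educational agent."""
        return "カップ | これは私のカップです。"  # "cup | This is my cup."

    def label_scene(frame_id: int, target_language: str = "Japanese") -> Annotation:
        obj = detect_object(frame_id)
        reply = ask_agent(
            f"Give the {target_language} word for '{obj}' and one short example sentence."
        )
        word, sentence = [part.strip() for part in reply.split("|", 1)]
        return Annotation(obj, word, sentence)

    print(label_scene(0))  # -> Annotation(object_name='cup', target_word='カップ', ...)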

Report

(3 results)
  • 2023 Annual Research Report
  • 2022 Annual Research Report
  • 2021 Annual Research Report
  • Research Products

    (17 results)


Int'l Joint Research (4 results) / Journal Article (7 results; of which Int'l Joint Research: 7, Peer Reviewed: 7, Open Access: 1) / Presentation (6 results; of which Int'l Joint Research: 5)

  • [Int'l Joint Research] Augusta University (USA)

    • Related Report
      2023 Annual Research Report
  • [Int'l Joint Research] Augusta University (USA)

    • Related Report
      2022 Annual Research Report
  • [Int'l Joint Research] Augusta University (USA)

    • Related Report
      2021 Annual Research Report
  • [Int'l Joint Research] University of California Santa Barbara (USA)

    • Related Report
      2021 Annual Research Report
  • [Journal Article] EyeShadows: Peripheral Virtual Copies for Rapid Gaze Selection and Interaction 2024

    • Author(s)
      Orlosky Jason、Liu Chang、Sakamoto Kenya、Sidenmark Ludwig、Mansour Adam
    • Journal Title

      IEEE Virtual Reality

      Pages: 681-689

    • DOI

      10.1109/vr58804.2024.00088

    • Related Report
      2023 Annual Research Report
    • Peer Reviewed / Int'l Joint Research
  • [Journal Article] Immersing healthcare students in a virtual reality Parkinson's disease experience 2024

    • Author(s)
      Li Yi (Joy)、Orlosky Jason、Jirau-Rosaly Wanda、Brown Shilpa、Rockich-Winston Nicole
    • Journal Title

      International Journal of Medical Education

      Pages: 34-36

    • DOI

      10.5116/ijme.65f5.725c

    • Related Report
      2023 Annual Research Report
    • Peer Reviewed / Int'l Joint Research
  • [Journal Article] Subjective Difficulty Estimation of Educational Comics Using Gaze Features 2023

    • Author(s)
      SAKAMOTO Kenya、SHIRAI Shizuka、TAKEMURA Noriko、ORLOSKY Jason、NAGATAKI Hiroyuki、UEDA Mayumi、URANISHI Yuki、TAKEMURA Haruo
    • Journal Title

      IEICE Transactions on Information and Systems

      Volume: E106.D Issue: 5 Pages: 1038-1048

    • DOI

      10.1587/transinf.2022EDP7100

    • ISSN
      0916-8532, 1745-1361
    • Year and Date
      2023-05-01
    • Related Report
      2023 Annual Research Report
    • Peer Reviewed / Open Access / Int'l Joint Research
  • [Journal Article] Approximated Match Swiping: Exploring more Ergonomic Gaze-based Text Input for XR 2023

    • Author(s)
      Mansour Adam、Orlosky Jason
    • Journal Title

      International Symposium on Mixed and Augmented Reality

      Pages: 141-145

    • DOI

      10.1109/ismar-adjunct60411.2023.00037

    • Related Report
      2023 Annual Research Report
    • Peer Reviewed / Int'l Joint Research
  • [Journal Article] Mitigation of VR Sickness During Locomotion With a Motion-Based Dynamic Vision Modulator 2023

    • Author(s)
      Zhao Guanghan、Orlosky Jason、Feiner Steven、Ratsamee Photchara、Uranishi Yuki
    • Journal Title

      IEEE Transactions on Visualization and Computer Graphics

      Volume: 29 Issue: 10 Pages: 4089-4103

    • DOI

      10.1109/tvcg.2022.3181262

    • Related Report
      2022 Annual Research Report
    • Peer Reviewed / Int'l Joint Research
  • [Journal Article] Layerable Apps: Comparing Concurrent and Exclusive Display of Augmented Reality Applications 2022

    • Author(s)
      Huynh Brandon、Wysopal Abby、Ross Vivian、Orlosky Jason、Hollerer Tobias
    • Journal Title

      IEEE International Symposium on Mixed and Augmented Reality

      Pages: 857-863

    • DOI

      10.1109/ismar55827.2022.00104

    • Related Report
      2022 Annual Research Report
    • Peer Reviewed / Int'l Joint Research
  • [Journal Article] Evaluation of User Interfaces for Three-Dimensional Locomotion in Virtual Reality 2022

    • Author(s)
      Lim Donghae、Shirai Shizuka、Orlosky Jason、Ratsamee Photchara、Uranishi Yuki、Takemura Haruo
    • Journal Title

      ACM Symposium on Spatial User Interaction

      Pages: 1-9

    • DOI

      10.1145/3565970.3567693

    • Related Report
      2022 Annual Research Report
    • Peer Reviewed / Int'l Joint Research
  • [Presentation] EyeShadows: Peripheral Virtual Copies for Rapid Gaze Selection and Interaction 2024

    • Author(s)
      Jason Orlosky
    • Organizer
      IEEE Virtual Reality
    • Related Report
      2023 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Subjective Difficulty Estimation While Reading Manga Learning Materials in a VR Environment Using Gaze, Facial Expressions, Heart Rate, and EEG 2024

    • Author(s)
      石塚 裕之
    • Organizer
      IEICE Technical Committee on Media Experience and Virtual Environment (MVE)
    • Related Report
      2023 Annual Research Report
  • [Presentation] Approximated Match Swiping: Exploring more Ergonomic Gaze-based Text Input for XR 2023

    • Author(s)
      Jason Orlosky
    • Organizer
      International Symposium on Mixed and Augmented Reality
    • Related Report
      2023 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Mitigation of VR Sickness During Locomotion With a Motion-Based Dynamic Vision Modulator 2023

    • Author(s)
      Guanghan Zhao
    • Organizer
      IEEE Virtual Reality
    • Related Report
      2022 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Layerable Apps: Comparing Concurrent and Exclusive Display of Augmented Reality Applications 2022

    • Author(s)
      Brandon Huynh
    • Organizer
      IEEE International Symposium on Mixed and Augmented Reality
    • Related Report
      2022 Annual Research Report
    • Int'l Joint Research
  • [Presentation] Evaluation of User Interfaces for Three-Dimensional Locomotion in Virtual Reality 2022

    • Author(s)
      Donghae Lim
    • Organizer
      ACM Symposium on Spatial User Interaction
    • Related Report
      2022 Annual Research Report
    • Int'l Joint Research


Published: 2021-04-28   Modified: 2024-12-25  
