2019 Fiscal Year Research-status Report
Development of artificial agents for the classroom and study of their efficiency
Project/Area Number | 19K12167
Research Institution | Tokyo University of Agriculture and Technology |
Principal Investigator | Venture Gentiane, Tokyo University of Agriculture and Technology, Graduate School of Engineering, Distinguished Professor (30538278)
Project Period (FY) | 2019-04-01 – 2022-03-31
Keywords | remote teaching / intelligent robot / physical avatar |
Outline of Annual Research Achievements
In the first year of the project we achieved our goals of setting up an infrastructure for data collection and of carrying out the data collection itself. Using motion-capture technology (a single camera and OpenPose) and eye-tracking technology, we collected behaviour data from both the lecturer and the students. The lecturer's data were used to define teaching styles and to identify important teaching behaviours and gestures that support the lecture content and material and improve the audience's understanding and memorization. The students' data allowed us to differentiate several behaviours during a lecture. Questionnaires were used to verify the students' impressions during the experiment, and quizzes were used to assess the students' learning.
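OpenPose can write its per-frame detections as JSON files, with each person's 2D keypoints stored as a flat list of (x, y, confidence) values. As a minimal sketch of how such a frame could be parsed into per-person keypoint triples for later analysis (the helper name and the toy frame below are illustrative, not part of the project's actual pipeline):

```python
import json

def parse_pose_keypoints(frame):
    """Extract (x, y, confidence) triples for each detected person
    from one OpenPose output frame (JSON already decoded to a dict)."""
    people = []
    for person in frame.get("people", []):
        # OpenPose stores keypoints as a flat [x1, y1, c1, x2, y2, c2, ...] list
        flat = person.get("pose_keypoints_2d", [])
        triples = [tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)]
        people.append(triples)
    return people

# Toy frame with one person and two keypoints
frame = json.loads(
    '{"people": [{"pose_keypoints_2d": [10.0, 20.0, 0.9, 30.0, 40.0, 0.8]}]}'
)
keypoints = parse_pose_keypoints(frame)
```

The number of keypoints per person depends on the OpenPose body model used, so downstream code should not hard-code it.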
Current Status of Research Progress
2: Research has progressed on the whole more than it was originally planned.
Reason
The plan was well established and I did not schedule anything we would not be doing, so everything is going very smoothly and we are on track with our objectives.
Strategy for Future Research Activity
We are now entering phases B and C of the project.
B. Big data / machine learning: all the data collected in A are processed and analysed with machine learning algorithms. We will use a recurrent neural network (RNN) architecture, known to be very powerful at highlighting features in complex data sets and enabling behaviour recognition. We will use the Python programming language and the open-source machine learning library TensorFlow to develop our software. This will result in offline behaviour-analysis software for both the lecturer's and the students' behaviours, enabling us to choose the relevant features for step C. The relevant features will be compared with educational know-how to see whether they agree or whether other features need to be considered.
C. Real-time behaviour recognition system: the relevant features found in B will be used to develop a real-time algorithm that identifies changes in students' behaviour. Again, this is based on a real-time implementation of a machine learning algorithm for classification. We propose to use RNNs or auto-encoders, depending on the nature of the features found in B.
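The plan names TensorFlow for the actual implementation; as a language-agnostic sketch of the computation an RNN behaviour classifier performs, the following NumPy forward pass consumes a sequence of per-frame feature vectors (e.g. pose and gaze features) and returns a probability distribution over behaviour classes. All dimensions, weight names, and the class count are hypothetical, chosen only for illustration:

```python
import numpy as np

def rnn_classify(sequence, Wxh, Whh, Why, bh, by):
    """Minimal Elman-RNN forward pass: fold a sequence of feature
    vectors into a hidden state, then map it to class probabilities."""
    h = np.zeros(Whh.shape[0])
    for x in sequence:
        # hidden state mixes the current frame's features with the previous state
        h = np.tanh(Wxh @ x + Whh @ h + bh)
    logits = Why @ h + by
    # softmax over behaviour classes (numerically stabilised)
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Illustrative run: 5 frames of 4-d features, 8 hidden units, 3 classes
rng = np.random.default_rng(0)
D, H, C = 4, 8, 3
seq = [rng.normal(size=D) for _ in range(5)]
probs = rnn_classify(
    seq,
    rng.normal(size=(H, D)), rng.normal(size=(H, H)), rng.normal(size=(C, H)),
    rng.normal(size=H), rng.normal(size=C),
)
```

In the planned TensorFlow software the weights would of course be learned from the collected data rather than drawn at random; this sketch only shows the shape of the classification step.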
Research Products
(12 results)
[Presentation] On the Role of Trust in Child-Robot Interaction (2019)
Author(s): Zguda Paulina, Sniezynski Bartlomiej, Indurkhya Bipin, Kolota Anna, Jarosz Mateusz, Sondej Filip, Izui Takamune, Dziok Maria, Belowska Anna, Jedras Wojciech, Venture Gentiane
Organizer: 28th IEEE Int. Conf. on Robot & Human Interactive Communication, New Delhi, India, October 14-18, 2019
Int'l Joint Research