Research Project/Area Number | 21K14117
Research Category | Grant-in-Aid for Early-Career Scientists
Allocation Type | Multi-year Fund
Review Section | Basic Section 20020: Robotics and intelligent systems
Research Institution | Kitami Institute of Technology
Principal Investigator | Ravankar Abhijeet, Kitami Institute of Technology, Faculty of Engineering, Associate Professor (70802594)
Project Period (FY) | 2021-04-01 – 2024-03-31
Project Status | Granted (FY2022)
Budget Amount *Note | 3,250 thousand yen (Direct Cost: 2,500 thousand yen, Indirect Cost: 750 thousand yen)
FY2023: 520 thousand yen (Direct Cost: 400 thousand yen, Indirect Cost: 120 thousand yen)
FY2022: 650 thousand yen (Direct Cost: 500 thousand yen, Indirect Cost: 150 thousand yen)
FY2021: 2,080 thousand yen (Direct Cost: 1,600 thousand yen, Indirect Cost: 480 thousand yen)
Keywords | Autonomous Robots / Artificial Intelligence / SLAM / Agriculture Robots / Robotics / Smart Agriculture / Computer Vision / Autonomous Robotics
Outline of Research at the Start |
In Hokkaido, where large-scale primary-sector agriculture is practiced, vineyard work is characterized by much heavy labor, so there is a very strong need for smart agriculture that uses robots and AI to reduce the workload of farmers. This research therefore aims to reduce the labor burden in vineyards by automating the following three heavy tasks with autonomous mobile robots: (1) development of a mapping and self-localization system that uses inexpensive sensors without GPS; (2) development of a system in which an autonomous robot removes weeds, including those directly under the vines; (3) development of an autonomous harvesting and pruning system. The systems developed in this research are planned to be field-tested in vineyards in Hokkaido.
|
Outline of Annual Research Achievements |
This research aims to develop a smart viticulture system using autonomous robots and AI. Smart viticulture reduces the burden on farmers by automating tasks such as vineyard monitoring, inspection, yield estimation, and grape harvesting. In FY2022, the accuracy of the monitoring system developed in the previous fiscal year was improved through deep-learning-based landmark detection. The earlier system detected landmarks (pillars) by color, which caused inaccuracies under changing illumination and rain. A convolutional neural network (CNN) model was trained and evaluated to be robust against dynamic changes in the environment, and it produced better landmark estimates. This improved the overall SLAM (Simultaneous Localization and Mapping) module, which was developed without GPS and relies only on locally sensed features to reduce cost. In addition, a deep-learning CNN model was developed to detect grapes and branches and to classify their condition (ripe, almost ripe, raw), and a weed detection model was developed. The harvesting module has been developed in a simulator; real-world fabrication and testing are planned for FY2023. Algorithms for robot navigation in structured vineyard environments, obstacle avoidance, an alarm system, and multi-robot cooperation in vineyards have also been developed, and the results have been published in conferences.
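The report does not detail the SLAM formulation. As a minimal sketch of the landmark-based correction step such a GPS-free module typically relies on, the functions below predict the range and bearing of a detected pillar from the robot pose and form the residual an EKF-style update would consume; the function names and the range-bearing model are illustrative assumptions, not the project's actual implementation.

```python
import numpy as np

def expected_measurement(pose, landmark):
    """Predict range and bearing of a known pillar from robot pose (x, y, theta)."""
    x, y, theta = pose
    dx, dy = landmark[0] - x, landmark[1] - y
    rng = np.hypot(dx, dy)
    bearing = np.arctan2(dy, dx) - theta
    # Wrap the bearing into [-pi, pi) so residuals stay small near the boundary.
    bearing = (bearing + np.pi) % (2 * np.pi) - np.pi
    return np.array([rng, bearing])

def innovation(observed, pose, landmark):
    """Measurement residual used by an EKF-style pose correction."""
    z = np.asarray(observed, dtype=float) - expected_measurement(pose, landmark)
    z[1] = (z[1] + np.pi) % (2 * np.pi) - np.pi  # wrap the bearing residual too
    return z
```

With a more robust CNN detector, the observed pillar positions feeding this residual become less noisy, which is consistent with the reported SLAM improvement.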
|
Current Status of Research Progress |
2: Research is progressing generally smoothly.
Reason
The Smart Viticulture project is realized through a software component and a hardware component. The software component comprises robust landmark detection, pose estimation, precise SLAM, navigation algorithms for the narrow lanes of a vineyard, deep-learning algorithms for grape and weed detection, and related modules. It is being developed according to plan, and several modules have been completed and tested. Important datasets for harvesting, cutting, and navigation have been collected, and the datasets recorded in previous years have improved the detection accuracy for grapes, branches, and weeds. The hardware component comprises the robot arm for the harvester; in particular, a dual-mode cutter-grasper module is required to automate harvesting. Although development of the mechanical component is slightly delayed, it is important to first test the model in simulation environments before actual fabrication. A 6-DoF robot arm is currently being trained using reinforcement learning, while the dual-mode harvester module is being developed in simulation software. Fabrication of an actual model is planned for FY2023.
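The simulation workflow is not described in the report. As one illustrative example of the kind of cheap check that motivates simulating before fabricating (all names, the planar simplification, and the link lengths are hypothetical), forward kinematics can verify that candidate grape positions lie inside the arm's workspace:

```python
import numpy as np

def fk_planar(joint_angles, link_lengths):
    """End-effector (x, y) of a planar serial arm via forward kinematics."""
    theta = np.cumsum(joint_angles)            # absolute angle of each link
    x = float(np.sum(link_lengths * np.cos(theta)))
    y = float(np.sum(link_lengths * np.sin(theta)))
    return x, y

def reachable(target, link_lengths):
    """A planar target is reachable iff its distance from the base lies in the
    arm's annular workspace [r_min, r_max] (full revolute joint ranges assumed)."""
    r = float(np.hypot(*target))
    r_max = float(np.sum(link_lengths))
    r_min = max(0.0, 2.0 * float(np.max(link_lengths)) - r_max)
    return r_min <= r <= r_max
```

A full 6-DoF workspace analysis is more involved, but the same principle applies: rejecting an infeasible design in simulation costs seconds, whereas discovering it after fabrication costs a redesign.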
|
Strategy for Future Research Activity |
The hardware component of the Smart Viticulture project includes developing the robot arm for harvesting and cutting; in particular, a dual-mode cutter-grasper module is required to automate harvesting. A 6-DoF robot arm is currently being trained using reinforcement learning, while the dual-mode harvester module is being developed in simulation software. In FY2023, the algorithm will be extended to account for obstacles around the robot arm in order to prevent potential accidents. Once the model is finalized, the mechanism will be fabricated and tested in real environments. A spectral camera needs to be mounted on the robot for efficient classification of nutrition and infection in the field. The arm module will be mounted on top of the mobile ground-robot platform, which will change the overall weight and dynamics of the system; hence, the previously developed navigation and SLAM algorithms will be re-tested. Tests will be conducted in an actual vineyard environment, and additional datasets will be recorded throughout the fiscal year. The integrated system will be tested under different weather conditions, and the results will be published in conferences.
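The planned obstacle-aware extension of the arm-training algorithm is not specified. One common way to incorporate obstacles into reinforcement learning, sketched below under assumed names and an assumed safety distance, is to add a proximity penalty to the reaching reward so the learned policy trades off approach speed against clearance:

```python
import numpy as np

def harvest_reward(ee_pos, grape_pos, obstacles, safe_dist=0.10):
    """Hypothetical shaped reward for the harvesting policy: negative distance
    to the target grape, minus a penalty that grows as the end-effector comes
    within safe_dist (meters) of any obstacle."""
    ee = np.asarray(ee_pos, dtype=float)
    reach = -float(np.linalg.norm(ee - np.asarray(grape_pos, dtype=float)))
    penalty = 0.0
    for obs in obstacles:
        d = float(np.linalg.norm(ee - np.asarray(obs, dtype=float)))
        if d < safe_dist:
            # Penalty in (-1, 0] per obstacle, worst at contact (d = 0).
            penalty -= (safe_dist - d) / safe_dist
    return reach + penalty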
|