Project/Area Number | 23K13298 |
Research Category | Grant-in-Aid for Early-Career Scientists |
Allocation Type | Multi-year Fund |
Review Section | Basic Section 20020: Robotics and intelligent system-related |
Research Institution | Ritsumeikan University |
Principal Investigator | PAUL Hannibal, Ritsumeikan University, Research Organization of Science and Technology, Researcher (90952888) |
Project Period (FY) | 2023-04-01 – 2025-03-31 |
Project Status | Granted (Fiscal Year 2023) |
Budget Amount |
Total: ¥3,380,000 (Direct Cost: ¥2,600,000, Indirect Cost: ¥780,000)
Fiscal Year 2024: ¥1,040,000 (Direct Cost: ¥800,000, Indirect Cost: ¥240,000)
Fiscal Year 2023: ¥2,340,000 (Direct Cost: ¥1,800,000, Indirect Cost: ¥540,000)
Keywords | UAV / aerial manipulation / adaptive landing / docking / aerial vision |
Outline of Research at the Start |
A lightweight three-arm aerial manipulator system is considered. Autonomous control of the UAV and the manipulator system will be implemented using an on-board processor and a depth vision system. The manipulator design will be used to implement landing and stopping strategies for the UAV in various locations.
Outline of Annual Research Achievements |
The research aimed to develop a hardware system for an aerial robot (UAV) capable of performing multiple tasks. A hardware setup comprising three robot arms attached to the UAV was prepared, and a real test with the developed hardware confirmed its functionality. A depth camera was mounted on the UAV for sensing its surroundings; to increase the field of view, a mechanism was added to move the camera so that it can be aimed at a specific region for further 3D data processing. Simulation using Gazebo with ROS2 was employed to test the drone's performance. The simulation tools proved invaluable for developing the drone system, allowing free flight inside the simulator to test the cameras and actuators. Development of the image processing system is in progress.
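As a concrete illustration of how depth frames from the simulated or real camera can be consumed for 3D processing, the following is a minimal ROS2 (rclpy) sketch; the topic name, node name, and use of cv_bridge are assumptions for illustration, not a description of the project's actual code.

```python
# Minimal ROS2 (rclpy) sketch of a depth-image subscriber, assuming the
# simulated or real depth camera publishes on /camera/depth/image_raw
# (topic and node names are illustrative, not the project's actual setup).
import numpy as np
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge


class DepthListener(Node):
    def __init__(self):
        super().__init__('depth_listener')
        self.bridge = CvBridge()
        # Queue depth of 10 is a common default for sensor streams.
        self.sub = self.create_subscription(
            Image, '/camera/depth/image_raw', self.on_depth, 10)

    def on_depth(self, msg: Image):
        # Convert the ROS Image message to a NumPy depth array (units depend
        # on the camera encoding, e.g. meters for 32FC1 or millimeters for 16UC1).
        depth = self.bridge.imgmsg_to_cv2(msg, desired_encoding='passthrough')
        self.get_logger().info(
            f'depth frame {depth.shape}, min={np.nanmin(depth):.2f}, max={np.nanmax(depth):.2f}')


def main():
    rclpy.init()
    node = DepthListener()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```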
Current Status of Research Progress |
1: Research has progressed more than it was originally planned.
Reason
The UAV hardware system, including the attached manipulator system, has been completed, and initial manual flight tests have been conducted to verify system operation. Additionally, the manipulator motion has been tested with the control program running in ROS (see the sketch at the end of this section).
A simulation of the drone has been set up to perform initial tests, including autonomous flight tests in different modes. This simulation was successfully tested with ROS2, which enhances the system's potential for long-term use given ROS2's wide adoption in robotics development.
A camera system, incorporating a depth camera along with motion functionality to increase the field of view, has been installed on the UAV. Work is underway on the image processing system.
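For reference, the manipulator motion tests mentioned above could be driven from a ROS2 node along the following lines; this is a minimal sketch assuming a trajectory_msgs/JointTrajectory interface with placeholder topic and joint names, not the project's actual controller code.

```python
# Illustrative ROS2 sketch for commanding one of the UAV's arms via a
# trajectory_msgs/JointTrajectory message. The topic and joint names are
# placeholders; the project's actual controller interface may differ.
import rclpy
from rclpy.node import Node
from trajectory_msgs.msg import JointTrajectory, JointTrajectoryPoint
from builtin_interfaces.msg import Duration


class ArmCommander(Node):
    def __init__(self):
        super().__init__('arm_commander')
        self.pub = self.create_publisher(
            JointTrajectory, '/arm_controller/joint_trajectory', 10)

    def send_pose(self, positions, seconds=2):
        # Build a single-point trajectory reaching the target joint angles
        # (in radians) after the given duration.
        traj = JointTrajectory()
        traj.joint_names = ['arm1_joint1', 'arm1_joint2']  # placeholder names
        point = JointTrajectoryPoint()
        point.positions = [float(p) for p in positions]
        point.time_from_start = Duration(sec=seconds)
        traj.points.append(point)
        self.pub.publish(traj)


def main():
    rclpy.init()
    node = ArmCommander()
    node.send_pose([0.5, -0.3])  # move both joints to an example pose
    rclpy.spin_once(node, timeout_sec=1.0)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```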
Strategy for Future Research Activity |
Moving forward, the immediate focus will be on completing the image processing system to enhance the capabilities of the UAV. Advanced algorithms for surface detection will be integrated, in particular for detecting cylindrical objects for docking and for precise identification of landing surfaces.
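As a rough illustration of the planned landing-surface identification, the sketch below fits the dominant plane in a depth-camera point cloud with RANSAC; Open3D and the numeric thresholds are assumptions for illustration, and cylinder detection for docking would follow a similar RANSAC model-fitting approach (e.g., via PCL).

```python
# Sketch of flat-landing-surface detection from a depth-camera point cloud
# using RANSAC plane segmentation in Open3D. The library choice and the
# thresholds are illustrative assumptions, not the project's actual method.
import numpy as np
import open3d as o3d


def find_landing_plane(points_xyz: np.ndarray):
    """points_xyz: (N, 3) array of 3D points from the depth camera."""
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(points_xyz)

    # Fit the dominant plane; distance_threshold is the inlier tolerance in meters.
    plane_model, inlier_idx = pcd.segment_plane(
        distance_threshold=0.02, ransac_n=3, num_iterations=1000)
    a, b, c, d = plane_model  # plane equation: ax + by + cz + d = 0

    inliers = pcd.select_by_index(inlier_idx)
    centroid = np.asarray(inliers.points).mean(axis=0)
    normal = np.array([a, b, c])
    normal /= np.linalg.norm(normal)

    # A near-vertical normal (assuming a z-up world frame) suggests a
    # horizontal, landable surface; 15 degrees is an illustrative tolerance.
    is_horizontal = abs(normal[2]) > np.cos(np.deg2rad(15))
    return centroid, normal, is_horizontal


if __name__ == '__main__':
    # Synthetic example: a noisy horizontal plane at z = 1.0 m.
    pts = np.random.rand(2000, 3)
    pts[:, 2] = 1.0 + 0.005 * np.random.randn(2000)
    print(find_landing_plane(pts))
```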
After further simulation tests, autonomous real-flight tests using image-processing feedback will be conducted. These tests will serve as critical validation of the algorithms in real-world scenarios and will provide valuable insight into the system's performance under various conditions.