Project/Area Number | 21K17710 |
Research Category | Grant-in-Aid for Early-Career Scientists |
Allocation Type | Multi-year Fund |
Review Section | Basic Section 60020: Mathematical informatics-related |
Research Institution | Kyushu University |
Principal Investigator | Themelis Andreas, Kyushu University, Faculty of Information Science and Electrical Engineering, Associate Professor (50898749) |
Project Period (FY) | 2021-04-01 – 2025-03-31 |
Project Status | Granted (Fiscal Year 2023) |
Budget Amount | ¥3,120,000 (Direct Cost: ¥2,400,000, Indirect Cost: ¥720,000)
Fiscal Year 2023: ¥1,170,000 (Direct Cost: ¥900,000, Indirect Cost: ¥270,000)
Fiscal Year 2022: ¥1,170,000 (Direct Cost: ¥900,000, Indirect Cost: ¥270,000)
Fiscal Year 2021: ¥780,000 (Direct Cost: ¥600,000, Indirect Cost: ¥180,000)
|
Keywords | Optimization algorithms / Nonconvex optimization / Adaptive algorithms / Autonomous driving / Decentralized control / Open-source software |
Outline of Research at the Start |
The project is concerned with optimization algorithms for engineering applications, addressing the challenges those applications impose: efficiency requirements, the limited computational power of microprocessors, and the nonconvexity of the problems. The final goal is to provide efficient, open-source, multi-purpose software with theoretical guarantees.
|
Outline of Annual Research Achievements |
The candidate obtained a one-year extension to transition smoothly into the next project on "adaptive" algorithms. Through the supervision of a PhD student, a merging point between the two was established: the integration of quasi-Newton stepsizes (this project) into adaptive schemes (next project) [1]. The finalization of this project is being pursued through a newly opened line of work on interior-point (IP) methods [2], as part of an international collaboration with UBW Munich (DE). Similar developments are expected from the visit of a student from UBC (CA), scheduled from June to August 2024. (An illustrative sketch of the Barzilai-Borwein stepsize underlying [1] is given after the references below.)
[1] H. Ou and A. Themelis. Safeguarding adaptive methods: global convergence of Barzilai-Borwein and other stepsize choices. arXiv:2404.09617, 2024.
[2] A. De Marchi and A. Themelis. An interior proximal gradient method for nonconvex optimization. arXiv:2208.00799, 2023.
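For context on [1], the sketch below shows a plain gradient method with the classical (long) Barzilai-Borwein stepsize. It only illustrates the stepsize rule that the paper safeguards, not the method proposed there; the quadratic test problem, the fallback stepsize gamma0, and all parameter values are arbitrary placeholders.

```python
import numpy as np

def bb_gradient_method(grad, x0, gamma0=1e-2, tol=1e-8, max_iter=1000):
    """Gradient descent with the classical (long) Barzilai-Borwein stepsize.

    grad   : callable returning the gradient of the smooth objective
    x0     : starting point (NumPy array)
    gamma0 : stepsize used at the first iteration and as a fallback
    """
    x, g = x0.astype(float), grad(x0)
    gamma = gamma0
    for _ in range(max_iter):
        x_new = x - gamma * g                 # plain gradient step
        g_new = grad(x_new)
        if np.linalg.norm(g_new) <= tol:
            return x_new
        s, y = x_new - x, g_new - g           # displacement and gradient change
        sy = s @ y
        # BB1 stepsize <s, s> / <s, y>; fall back when the curvature estimate is nonpositive
        gamma = (s @ s) / sy if sy > 0 else gamma0
        x, g = x_new, g_new
    return x

# Placeholder usage: minimize the strongly convex quadratic 0.5 * x'Ax - b'x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = bb_gradient_method(lambda x: A @ x - b, x0=np.zeros(2))
print(x_star, np.linalg.solve(A, b))  # the two should agree
```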
|
Current Status of Research Progress |
2: Research has progressed on the whole more than it was originally planned.
Reason
The final deliverables of this project are being attained through an integration, not originally planned, of Newton-type proximal methods within a classical interior-point framework [3]. The manuscript, which builds on the preliminary work [2], will be submitted to a top-tier international journal within the next month. A new avenue beyond local Lipschitz differentiability is also being investigated in the convex case [1, 4]. Several preprints have been published in the meantime. (For context, a textbook proximal gradient iteration is sketched after the references below.)
[3] A. De Marchi and A. Themelis. A penalty barrier framework for nonconvex constrained optimization. Work in progress, soon to be submitted.
[4] K. Oikonomidis et al. Adaptive proximal gradient methods are universal without approximation. arXiv:2402.06271, 2024.
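For reference only, the proximal gradient (forward-backward) iteration x⁺ = prox_{γg}(x − γ∇f(x)) is the basic scheme on which proximal methods such as those in [2] and [4] build. The snippet below is a textbook instance of it for an ℓ1-regularized least-squares problem; it is neither the interior proximal gradient method of [2] nor the penalty barrier framework of [3], and all data and parameters are placeholders.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, gamma, n_iter=500):
    """Textbook proximal gradient iterations for
       minimize 0.5 * ||A x - b||^2 + lam * ||x||_1
    with a constant stepsize gamma (gamma <= 1 / ||A||^2 guarantees descent)."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                            # gradient of the smooth part
        x = soft_threshold(x - gamma * grad, gamma * lam)   # forward-backward step
    return x

# Placeholder usage with random data
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
gamma = 1.0 / np.linalg.norm(A, 2) ** 2   # safe stepsize: 1 / L with L = ||A||_2^2
print(proximal_gradient(A, b, lam=0.1, gamma=gamma))
```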
|
Strategy for Future Research Activity |
Apart from finalizing the ongoing works mentioned above, the candidate intends to publish an extended monograph in Foundations and Trends in Optimization on his main line of research. With the other goals accomplished, this pending work can now resume without impediment.
|