Research Project/Area Number | 21K17710
Research Category | Grant-in-Aid for Early-Career Scientists
Allocation Type | Multi-year Fund
Review Section | Basic Section 60020: Mathematical informatics-related
Research Institution | Kyushu University
Principal Investigator | Themelis Andreas, Kyushu University, Faculty of Information Science and Electrical Engineering, Associate Professor (50898749)
Project Period (FY) | 2021-04-01 – 2025-03-31
Project Status | Granted (FY2023)
Budget Amount *Note | Total: 3,120 thousand yen (Direct Cost: 2,400 thousand yen; Indirect Cost: 720 thousand yen)
FY2023: 1,170 thousand yen (Direct Cost: 900 thousand yen; Indirect Cost: 270 thousand yen)
FY2022: 1,170 thousand yen (Direct Cost: 900 thousand yen; Indirect Cost: 270 thousand yen)
FY2021: 780 thousand yen (Direct Cost: 600 thousand yen; Indirect Cost: 180 thousand yen)
Keywords | Optimization algorithms / Nonconvex optimization / Adaptive algorithms / Autonomous driving / Decentralized control / Open-source software
Outline of Research at the Start |
The project concerns optimization algorithms for engineering, addressing the challenges posed by applications: efficiency, the limited power of microprocessors, and the nonconvexity of the underlying problems. The final goal is to provide efficient, open-source, multi-purpose software with theoretical guarantees.
Outline of Annual Research Achievements |
The candidate obtained a one-year extension to transition smoothly into the next project on "adaptive" algorithms. Through the supervision of a PhD student, a merging point between the two was established: the integration of quasi-Newton stepsizes (this project) into adaptive schemes (next project) [1]; an illustrative sketch of such a stepsize choice follows the references below. The finalization of this project is being pursued through a new line of work on interior-point (IP) methods [2], as part of an international collaboration with UBW Munich (DE). Similar developments are expected from the visit of a student from UBC (CA), scheduled from June to August 2024.
[1] H. Ou and A. Themelis. Safeguarding adaptive methods: global convergence of Barzilai-Borwein and other stepsize choices. arXiv:2404.09617, 2024.
[2] A. De Marchi and A. Themelis. An interior proximal gradient method for nonconvex optimization. arXiv:2208.00799, 2023.
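As background, the following is a minimal sketch of a Barzilai-Borwein (BB1) stepsize inside plain gradient descent. It is illustrative only: the globalization mechanism of [1] is not reproduced here, and the clipping bounds, function names, and quadratic test problem are assumptions of this sketch, not taken from the paper.

    import numpy as np

    def bb_stepsize(x_prev, x_curr, g_prev, g_curr, gamma_min=1e-8, gamma_max=1e8):
        # Barzilai-Borwein (BB1) stepsize <s,s>/<s,y> with a crude clipping
        # safeguard (hypothetical bounds; [1] studies principled safeguards).
        s = x_curr - x_prev        # iterate difference
        y = g_curr - g_prev        # gradient difference
        sy = s @ y
        if sy <= 0:                # curvature information unusable; fall back
            return gamma_min
        return float(np.clip((s @ s) / sy, gamma_min, gamma_max))

    # Toy quadratic f(x) = 0.5 * x' Q x, minimized at the origin.
    Q = np.diag([1.0, 10.0])
    grad = lambda x: Q @ x
    x_prev, x = np.array([5.0, 3.0]), np.array([4.0, 2.0])
    for _ in range(30):
        g_prev, g = grad(x_prev), grad(x)
        x_prev, x = x, x - bb_stepsize(x_prev, x, g_prev, g) * g
    print(x)  # approaches the origin

The BB1 stepsize estimates curvature from consecutive iterate and gradient differences, which is what makes it a natural bridge between quasi-Newton stepsizes and adaptive schemes.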
Current Status of Research Progress |
2: Research is progressing generally smoothly
Reason |
The final deliverables of this project are being attained through an integration, not originally planned, of Newton-type proximal methods within a classical interior-point framework [3]. The manuscript, which builds on the preliminary work [2], will be submitted to a top-tier international journal within the next month. A new avenue beyond local Lipschitz differentiability is also being investigated in the convex case [1, 4]; a sketch of the basic proximal gradient iteration underlying these works follows the references below. Several preprints have been published in the meantime.
[3] A. De Marchi and A. Themelis. A penalty barrier framework for nonconvex constrained optimization. Work in progress, to be submitted.
[4] K. Oikonomidis et al. Adaptive proximal gradient methods are universal without approximation. arXiv:2402.06271, 2024.
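For context, here is a minimal sketch of the standard proximal gradient (forward-backward) iteration x+ = prox_{gamma*g}(x - gamma*grad_f(x)), on which [2]-[4] build. The interior-point and penalty-barrier machinery of [2] and [3] is not reproduced; the lasso-type example and the constant stepsize gamma = 1/L are assumptions of this sketch.

    import numpy as np

    def prox_grad(x0, grad_f, prox_g, gamma, iters=200):
        # Forward-backward iteration: x+ = prox_{gamma*g}(x - gamma*grad_f(x)).
        x = x0
        for _ in range(iters):
            x = prox_g(x - gamma * grad_f(x), gamma)
        return x

    # Example: minimize 0.5*||A x - b||^2 + lam*||x||_1.
    A = np.array([[3.0, 0.5], [0.5, 1.0]])
    b = np.array([1.0, -1.0])
    lam = 0.1
    grad_f = lambda x: A.T @ (A @ x - b)
    # prox of gamma*lam*||.||_1 is soft-thresholding at level lam*gamma.
    prox_g = lambda v, gamma: np.sign(v) * np.maximum(np.abs(v) - lam * gamma, 0.0)
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of grad_f
    print(prox_grad(np.zeros(2), grad_f, prox_g, gamma=1.0 / L))

The fixed stepsize 1/L presumes global Lipschitz continuity of grad_f; relaxing such requirements, through adaptively estimated stepsizes or non-Lipschitz settings, is precisely the direction pursued in [1] and [4].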
Strategy for Future Research Activity |
Apart from finalizing the ongoing works mentioned above, the candidate intends to publish an extended monograph on his main research in Foundations and Trends in Optimization. With the other goals accomplished, this pending work can now resume without impediments.