Study on iteration complexities of proximal coordinate descent methods for convex optimization
Project/Area Number | 25330025
Research Category | Grant-in-Aid for Scientific Research (C)
Allocation Type | Multi-year Fund
Section | General
Research Field | Mathematical informatics
Research Institution | Kyoto University
Principal Investigator |
Research Collaborator | DAI Yu-Hong (Chinese Academy of Sciences, Professor)
Project Period (FY) | 2013-04-01 – 2017-03-31
Project Status | Completed (Fiscal Year 2016)
Budget Amount | ¥4,940,000 (Direct Cost: ¥3,800,000, Indirect Cost: ¥1,140,000)
Fiscal Year 2016: ¥1,170,000 (Direct Cost: ¥900,000, Indirect Cost: ¥270,000)
Fiscal Year 2015: ¥1,430,000 (Direct Cost: ¥1,100,000, Indirect Cost: ¥330,000)
Fiscal Year 2014: ¥1,040,000 (Direct Cost: ¥800,000, Indirect Cost: ¥240,000)
Fiscal Year 2013: ¥1,300,000 (Direct Cost: ¥1,000,000, Indirect Cost: ¥300,000)
Keywords | convex optimization / proximal gradient method / coordinate descent method / mathematical optimization / coordinate gradient method / convex programming problem / computational complexity / online optimization / maximum likelihood estimation
Outline of Final Research Achievements |
We have proposed the proximal coordinate gradient method for large-scale convex optimization problems arising in statistics, signal processing, machine learning, and other fields. The proposed method generalizes a variety of optimization methods, such as the proximal gradient method, Newton's method, and the coordinate descent method. We have investigated its convergence properties; in particular, we have given sufficient conditions under which the proposed method converges globally and linearly, and we have presented its worst-case iteration complexity. We have applied the method to applications such as L1-L2 optimization and the portfolio selection problem, and found that it can find a reasonable solution efficiently.
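As an illustrative sketch only (not the authors' actual algorithm, whose details are not given in this summary), the coordinate-descent idea applied to the L1-L2 problem mentioned above can be written as follows: minimize 0.5·||Ax − b||² + λ·||x||₁ by exactly minimizing over one coordinate at a time via the soft-thresholding proximal operator. All names below (`proximal_coordinate_descent`, `soft_threshold`, the cyclic update order, the step count) are assumptions made for this example.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * |.| (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def proximal_coordinate_descent(A, b, lam, n_sweeps=200):
    """Cyclic proximal coordinate descent for
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    Each inner step minimizes the objective exactly in one
    coordinate; the residual r = b - A x is kept up to date
    incrementally so each update costs O(m)."""
    m, n = A.shape
    x = np.zeros(n)
    r = b.copy()                   # residual b - A x for x = 0
    col_sq = (A ** 2).sum(axis=0)  # per-coordinate curvature ||A_j||^2
    for _ in range(n_sweeps):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            # gradient of the smooth part w.r.t. x_j, shifted so the
            # 1-D subproblem becomes a soft-thresholding step
            rho = A[:, j] @ r + col_sq[j] * x[j]
            x_new = soft_threshold(rho, lam) / col_sq[j]
            r += A[:, j] * (x[j] - x_new)
            x[j] = x_new
    return x

# Usage: recover a sparse vector from noiseless measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
x_true = np.zeros(10)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
x_hat = proximal_coordinate_descent(A, b, lam=0.1)
```

With a small regularization weight, `x_hat` lands close to the sparse `x_true`; the linear convergence behavior studied in the project concerns exactly this kind of strongly structured problem.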
Report (5 results)
Research Products (11 results)