2016 Fiscal Year Final Research Report
Study on iteration complexities of proximal coordinate descent methods for convex optimization
Project/Area Number | 25330025 |
Research Category | Grant-in-Aid for Scientific Research (C) |
Allocation Type | Multi-year Fund |
Section | General |
Research Field | Mathematical informatics |
Research Institution | Kyoto University |
Principal Investigator | |
Research Collaborator | DAI Yu-Hong (Chinese Academy of Sciences, Professor) |
Project Period (FY) | 2013-04-01 – 2017-03-31 |
Keywords | convex optimization / proximal gradient method / coordinate descent method |
Outline of Final Research Achievements |
We have proposed the proximal coordinate gradient method for large-scale convex optimization problems arising in statistics, signal processing, machine learning, and other fields. The proposed method generalizes a variety of optimization methods, including the proximal gradient method, the Newton method, and the coordinate descent method. We have investigated its convergence properties; in particular, we have given a sufficient condition under which the method converges globally at a linear rate. Moreover, we have established its worst-case iteration complexity. We have applied the method to applications such as L1-L2 optimization and the portfolio selection problem, and found that it can find a reasonable solution efficiently.
|
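To make the L1-L2 application concrete, below is a minimal sketch of a proximal coordinate descent iteration for the L1-regularized least-squares problem minimize 0.5‖Ax − b‖² + λ‖x‖₁. This is an illustrative instance of the general technique only: the cyclic coordinate order, the function names, and the exact update rule are assumptions for the example, not the report's actual algorithm.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * |.| (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_coordinate_descent(A, b, lam, n_iter=200):
    """Cyclic proximal coordinate descent for
    minimize 0.5 * ||A x - b||^2 + lam * ||x||_1.

    Each coordinate step is an exact one-dimensional proximal
    (soft-thresholding) update; the residual r = b - A x is
    maintained incrementally so each step costs O(m).
    """
    m, n = A.shape
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)  # per-coordinate curvature ||A_j||^2
    r = b - A @ x                  # residual, kept up to date
    for _ in range(n_iter):
        for j in range(n):
            if col_sq[j] == 0.0:
                continue
            # gradient step along coordinate j, then the L1 prox
            z = x[j] + A[:, j] @ r / col_sq[j]
            x_new = soft_threshold(z, lam / col_sq[j])
            r += A[:, j] * (x[j] - x_new)  # incremental residual update
            x[j] = x_new
    return x
```

On a noiseless synthetic problem with a sparse ground truth and small λ, the iterates recover the true coefficients up to the small bias introduced by the L1 penalty.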
Free Research Field | Mathematical optimization |