Project/Area Number | 26330024 |
Research Category | Grant-in-Aid for Scientific Research (C) |
Allocation Type | Multi-year Fund |
Section | General |
Research Field | Mathematical informatics |
Research Institution | Tokyo Institute of Technology |
Principal Investigator | Fukuda Mituhiro, Tokyo Institute of Technology, School of Computing, Associate Professor (80334548) |
Co-Investigator (Renkei-kenkyūsha) | YAMASHITA Makoto, Tokyo Institute of Technology, School of Computing, Associate Professor (20386824) |
Research Collaborator | ITO Masaru, Nihon University, College of Science and Technology, Assistant |
Project Period (FY) | 2014-04-01 – 2018-03-31 |
Project Status | Completed (Fiscal Year 2017) |
Budget Amount | ¥2,860,000 (Direct Cost: ¥2,200,000, Indirect Cost: ¥660,000) |
Fiscal Year 2016: ¥650,000 (Direct Cost: ¥500,000, Indirect Cost: ¥150,000)
Fiscal Year 2015: ¥1,040,000 (Direct Cost: ¥800,000, Indirect Cost: ¥240,000)
Fiscal Year 2014: ¥1,170,000 (Direct Cost: ¥900,000, Indirect Cost: ¥270,000)
|
Keywords | accelerated (sub)gradient methods / first-order methods / convex optimization problems / projected gradient method / steepest descent method / accelerated gradient method / variance-covariance matrix estimation / accelerated subgradient method / subgradient method / strictly convex quadratic function / variance-covariance matrix estimation problem |
Outline of Final Research Achievements | In today's information society, where large amounts of data can be easily obtained and stored, there is an urgent need to solve large-scale convex optimization problems that extract only the valuable information from those data. Recently, the so-called accelerated (sub)gradient methods have attracted attention because they are easy to implement yet very hard to understand theoretically. In this project, we analyzed properties that well-known (sub)gradient methods satisfy in order to identify the essential properties that guarantee their fast convergence, and, based on these properties, we proposed a new family of (sub)gradient methods. As a secondary theme, we proposed customized methods that work only with function and gradient values for convex optimization problems with special structures. We also conducted numerical experiments to confirm their performance. |
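For orientation only, the following is a minimal sketch of a Nesterov-style accelerated gradient iteration for a smooth convex objective, i.e., the general class of methods referred to above; it is not the new family of methods developed in this project, and the function name accelerated_gradient, the least-squares example, and all parameter values are illustrative assumptions.

import numpy as np

def accelerated_gradient(grad, x0, L, n_iter=200):
    # Nesterov-style accelerated gradient method for a smooth convex
    # function whose gradient is L-Lipschitz; grad(x) returns the gradient at x.
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    t = 1.0
    for _ in range(n_iter):
        x_next = y - grad(y) / L                            # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)    # momentum (extrapolation) step
        x, t = x_next, t_next
    return x

# Illustrative use on a least-squares problem: min_x 0.5 * ||A x - b||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the gradient (largest singular value squared)
x_hat = accelerated_gradient(lambda x: A.T @ (A @ x - b), np.zeros(20), L)

Compared with plain gradient descent, the only additional work per iteration is the extrapolation step, which is what yields the faster worst-case convergence rate for smooth convex problems.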