
2017 Fiscal Year Final Research Report

Accelerated (sub)gradient methods for large-scale convex optimization problems - with emphasis on the theoretical aspects of the implementation and its applications -

Research Project

Project/Area Number 26330024
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Multi-year Fund
Section General
Research Field Mathematical informatics
Research Institution Tokyo Institute of Technology

Principal Investigator

Fukuda Mituhiro  Tokyo Institute of Technology, School of Computing, Associate Professor (80334548)

Co-Investigator (Renkei-kenkyūsha) YAMASHITA Makoto  Tokyo Institute of Technology, School of Computing, Associate Professor (20386824)
Research Collaborator ITO Masaru  Nihon University, College of Science and Technology, Research Assistant
Project Period (FY) 2014-04-01 – 2018-03-31
Keywords Accelerated (sub)gradient methods / First-order methods / Convex optimization problems / Projected gradient method / Steepest descent method
Outline of Final Research Achievements

In the current information society, where large amounts of data can be easily obtained and stored, there is an urgent need to solve large-scale convex optimization problems that extract only the valuable information from those data. Recently, the so-called accelerated (sub)gradient methods have attracted attention because they are easy to implement but very hard to understand theoretically. In this project, we analyzed properties that well-known (sub)gradient methods should satisfy, in order to identify the essential properties that guarantee the fast convergence of these methods. Based on these properties, we then proposed a new family of (sub)gradient methods.
As a secondary theme, we proposed customized methods that work only with function and gradient values for convex optimization problems with special structures. We also conducted numerical experiments to confirm their performance.
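To illustrate the class of methods the project studies, the following is a minimal sketch of a classical accelerated gradient scheme (Nesterov-type extrapolation) that, like the customized methods above, uses only gradient values. The test function f, its smoothness constant L, and the starting point are illustrative assumptions, not the methods actually developed in this project.

```python
# Hypothetical example: accelerated gradient method on the smooth convex
# quadratic f(x) = 0.5*(x1^2 + 10*x2^2), whose gradient is L-Lipschitz
# with L = 10. This is NOT the project's proposed family of methods,
# only the standard scheme it builds on.

def grad(x):
    # gradient of f(x) = 0.5*(x[0]**2 + 10*x[1]**2)
    return (x[0], 10.0 * x[1])

def accelerated_gradient(x0, L, iters):
    x, y, t = x0, x0, 1.0
    for _ in range(iters):
        g = grad(y)
        # gradient step of length 1/L from the extrapolated point y
        x_next = tuple(yi - gi / L for yi, gi in zip(y, g))
        # update of the momentum parameter t
        t_next = (1.0 + (1.0 + 4.0 * t * t) ** 0.5) / 2.0
        # extrapolation (momentum) step
        y = tuple(xn + ((t - 1.0) / t_next) * (xn - xo)
                  for xn, xo in zip(x_next, x))
        x, t = x_next, t_next
    return x

x = accelerated_gradient((5.0, 5.0), 10.0, 1000)
```

This scheme attains the O(1/k^2) convergence rate in function value, versus O(1/k) for plain gradient descent, while each iteration still evaluates only one gradient.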

Free Research Field

Mathematical optimization


Published: 2019-03-29  
