
Accelerated (sub)gradient methods for large-scale convex optimization problems - with emphasis on the theoretical aspects of their implementation and applications -

Research Project

Project/Area Number 26330024
Research Category

Grant-in-Aid for Scientific Research (C)

Allocation Type Multi-year Fund
Section General
Research Field Mathematical informatics
Research Institution Tokyo Institute of Technology

Principal Investigator

Fukuda Mituhiro  Tokyo Institute of Technology, School of Computing, Associate Professor (80334548)

Co-Investigator(Renkei-kenkyūsha) YAMASHITA Makoto  Tokyo Institute of Technology, School of Computing, Associate Professor (20386824)
Research Collaborator ITO Masaru  Nihon University, College of Science and Technology, Research Assistant
Project Period (FY) 2014-04-01 – 2018-03-31
Project Status Completed (Fiscal Year 2017)
Budget Amount
¥2,860,000 (Direct Cost: ¥2,200,000, Indirect Cost: ¥660,000)
Fiscal Year 2016: ¥650,000 (Direct Cost: ¥500,000, Indirect Cost: ¥150,000)
Fiscal Year 2015: ¥1,040,000 (Direct Cost: ¥800,000, Indirect Cost: ¥240,000)
Fiscal Year 2014: ¥1,170,000 (Direct Cost: ¥900,000, Indirect Cost: ¥270,000)
Keywords accelerated (sub)gradient methods / first-order methods / convex optimization problems / projected gradient method / steepest descent method / accelerated gradient method / covariance matrix estimation / accelerated subgradient method / subgradient method / strictly convex quadratic functions / covariance matrix estimation problem
Outline of Final Research Achievements

In the current information society, where large amounts of data can be easily obtained and stored, there is an urgent need to solve large-scale convex optimization problems that extract only the valuable information from that data. Recently, the so-called accelerated (sub)gradient methods have attracted attention because they are easy to implement, yet their fast convergence is hard to understand theoretically. In this project, we analyzed properties that well-known (sub)gradient methods should satisfy, in order to identify the essential properties that guarantee the fast convergence of these methods. Based on these properties, we then proposed a new family of (sub)gradient methods.
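The acceleration idea mentioned above can be illustrated with a minimal sketch. This is not the project's proposed family of methods, but the standard Nesterov accelerated gradient method compared against plain gradient descent on a toy ill-conditioned quadratic; all names, step sizes, and problem data below are illustrative choices.

```python
import numpy as np

def gradient_descent(grad, x0, step, iters):
    """Plain gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = x0.copy()
    for _ in range(iters):
        x = x - step * grad(x)
    return x

def nesterov_agm(grad, x0, step, iters):
    """Nesterov's accelerated gradient method (convex case):
    a gradient step at an extrapolated point y_k, with the
    standard momentum weights t_k."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_new = y - step * grad(y)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Toy quadratic f(x) = 0.5 x^T A x - b^T x with condition number 100;
# the minimizer is x* = A^{-1} b.
A = np.diag([1.0, 100.0])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
x_star = np.linalg.solve(A, b)
step = 1.0 / 100.0  # 1/L, with L the largest eigenvalue of A

x_gd = gradient_descent(grad, np.zeros(2), step, 200)
x_ag = nesterov_agm(grad, np.zeros(2), step, 200)
print(np.linalg.norm(x_gd - x_star), np.linalg.norm(x_ag - x_star))
```

On such ill-conditioned problems the accelerated iterate is markedly closer to the minimizer after the same number of gradient evaluations, which is what motivates the theoretical study of these methods.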
As a secondary theme, we proposed customized methods that work only with function and gradient values for convex optimization problems with special structure, and we conducted numerical experiments to confirm their performance.
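As a generic illustration of a method that uses only gradient evaluations on a structured feasible set, a textbook projected gradient sketch follows; it is not one of the project's customized algorithms, and the box-constrained problem below is made up for the example.

```python
import numpy as np

def projected_gradient(grad, project, x0, step, iters):
    """Projected gradient method: each iteration takes a gradient step
    and projects the result back onto the feasible set."""
    x = project(x0.copy())
    for _ in range(iters):
        x = project(x - step * grad(x))
    return x

# Illustrative problem: minimize f(x) = 0.5 * ||x - c||^2 over the box [0, 1]^3.
c = np.array([1.5, -0.5, 0.3])
grad = lambda x: x - c
project = lambda x: np.clip(x, 0.0, 1.0)  # Euclidean projection onto the box

x_opt = projected_gradient(grad, project, np.zeros(3), step=1.0, iters=50)
# The solution is the projection of c onto the box: [1.0, 0.0, 0.3].
```

The method touches the objective only through `grad` and the structure only through `project`, which is the sense in which such schemes "work only with function and gradient values."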

Report

(5 results)
  • 2017 Annual Research Report / Final Research Report (PDF)
  • 2016 Research-status Report
  • 2015 Research-status Report
  • 2014 Research-status Report
  • Research Products

    (14 results)


Journal Article (2 results) (of which Peer Reviewed: 2 results, Acknowledgement Compliant: 2 results); Presentation (12 results) (of which Int'l Joint Research: 3 results, Invited: 2 results)

  • [Journal Article] New results on subgradient methods for strongly convex optimization problems with a unified analysis (2016)

    • Author(s)
      Masaru Ito
    • Journal Title

      Computational Optimization and Applications

      Volume: 65 Issue: 1 Pages: 127-172

    • DOI

      10.1007/s10589-016-9841-1

    • Related Report
      2016 Research-status Report
    • Peer Reviewed / Acknowledgement Compliant
  • [Journal Article] A family of subgradient-based methods for convex optimization problems in a unifying framework (2016)

    • Author(s)
      Masaru Ito and Mituhiro Fukuda
    • Journal Title

      Optimization Methods and Software

      Volume: 31 Issue: 5 Pages: 952-982

    • DOI

      10.1080/10556788.2016.1182165

    • Related Report
      2016 Research-status Report
    • Peer Reviewed / Acknowledgement Compliant
  • [Presentation] Restart schemes for first-order methods in convex optimization and adaptation to unknown parameters (2018)

    • Author(s)
Masaru Ito and Mituhiro Fukuda
    • Organizer
The 2018 Spring National Conference of the Operations Research Society of Japan
    • Related Report
      2017 Annual Research Report
  • [Presentation] An efficient nonmonotone spectral projected gradient method for semidefinite program with log-determinant and l_1-norm function (2017)

    • Author(s)
      Mituhiro Fukuda, Takashi Nakagaki, and Makoto Yamashita
    • Organizer
      The 10th Anniversary Conference on Nonlinear Analysis and Convex Analysis
    • Related Report
      2017 Annual Research Report
    • Int'l Joint Research
  • [Presentation] A comparative study of steepest descent methods for strongly convex quadratic functions (2016)

    • Author(s)
      Mituhiro Fukuda
    • Organizer
      Workshop on Advances in Optimization
    • Place of Presentation
      Shinagawa, Tokyo, Japan
    • Year and Date
      2016-08-12
    • Related Report
      2016 Research-status Report
  • [Presentation] A comparative study of steepest descent methods for strongly convex quadratic minimization (2016)

    • Author(s)
      Mituhiro Fukuda and Kensuke Gotoh
    • Organizer
      XI Brazilian Workshop on Continuous Optimization
    • Place of Presentation
      Manaus, Amazonas, Brazil
    • Year and Date
      2016-05-22
    • Related Report
      2016 Research-status Report
  • [Presentation] A numerical comparative study of steepest descent methods for strongly convex quadratic minimization (2015)

    • Author(s)
      Mituhiro Fukuda and Kensuke Gotoh
    • Organizer
      22nd International Symposium on Mathematical Programming
    • Place of Presentation
      Wyndham Grand Pittsburgh Downtown, Pittsburgh, PA, USA
    • Year and Date
      2015-07-12
    • Related Report
      2015 Research-status Report
    • Int'l Joint Research
  • [Presentation] New results on subgradient methods for weakly smooth and strongly convex problems (2015)

    • Author(s)
      Masaru Ito
    • Organizer
      22nd International Symposium on Mathematical Programming
    • Place of Presentation
      Wyndham Grand Pittsburgh Downtown, Pittsburgh, PA, USA
    • Year and Date
      2015-07-12
    • Related Report
      2015 Research-status Report
    • Int'l Joint Research
  • [Presentation] Proposal of optimal subgradient algorithms under the Hölder condition for convex optimization problems (2015)

    • Author(s)
Masaru Ito
    • Organizer
WOO@Tsukuba 2015: A Gathering of Young Researchers Who Will Lead the Future
    • Place of Presentation
Kasuga Hall, Kasuga Area, Tsukuba Campus, University of Tsukuba
    • Year and Date
      2015-05-30
    • Related Report
      2015 Research-status Report
  • [Presentation] Proposal of optimal subgradient algorithms under the Hölder condition for convex optimization problems (2015)

    • Author(s)
Masaru Ito
    • Organizer
The 2015 Spring National Conference of the Operations Research Society of Japan
    • Place of Presentation
Tokyo University of Science, Tokyo
    • Year and Date
      2015-03-26 – 2015-03-27
    • Related Report
      2014 Research-status Report
  • [Presentation] A new nonmonotone spectral projected gradient method for semidefinite program with log-determinant and l1-norm function (2015)

    • Author(s)
      Mituhiro Fukuda, Takashi Nakagaki, and Makoto Yamashita
    • Organizer
      1107th American Mathematical Society Meeting, Spring Eastern Sectional Meeting
    • Place of Presentation
      Georgetown University, Washington, DC, USA
    • Year and Date
      2015-03-07 – 2015-03-08
    • Related Report
      2014 Research-status Report
    • Invited
  • [Presentation] Some insights from the stable compressive principal component pursuit (2014)

    • Author(s)
      Mituhiro Fukuda and Junki Kobayashi
    • Organizer
      SIAM Conference on Optimization 2014
    • Place of Presentation
      Town and Country Resort & Convention Center, San Diego, CA, USA
    • Year and Date
      2014-05-19 – 2014-05-22
    • Related Report
      2014 Research-status Report
  • [Presentation] A unified framework for subgradient algorithms minimizing strongly convex functions (2014)

    • Author(s)
      Masaru Ito and Mituhiro Fukuda
    • Organizer
      SIAM Conference on Optimization 2014
    • Place of Presentation
      Town and Country Resort & Convention Center, San Diego, CA, USA
    • Year and Date
      2014-05-19 – 2014-05-22
    • Related Report
      2014 Research-status Report
  • [Presentation] Dual approach based on spectral projected gradient method for log-det SDP with L1 norm (2014)

    • Author(s)
      Makoto Yamashita, Mituhiro Fukuda, and Takashi Nakagaki
    • Organizer
      SIAM Conference on Optimization 2014
    • Place of Presentation
      Town and Country Resort & Convention Center, San Diego, CA, USA
    • Year and Date
      2014-05-19 – 2014-05-22
    • Related Report
      2014 Research-status Report
    • Invited

Published: 2014-04-04   Modified: 2019-03-29  
