Budget Amount |
¥1,700,000 (Direct Cost: ¥1,700,000)
Fiscal Year 2015: ¥800,000 (Direct Cost: ¥800,000)
Fiscal Year 2014: ¥900,000 (Direct Cost: ¥900,000)
|
Outline of Annual Research Achievements |
In the final year, we proposed a novel dictionary learning algorithm that combines the log regularizer with a coherence penalty, based on proximal operators. The proposed algorithm employs a decomposition scheme with alternating optimization, which transforms the overall problem into a set of single-vector subproblems, each involving either one dictionary atom or one coefficient vector. Although these subproblems remain nonsmooth and even nonconvex, they can be solved by proximal operators, and closed-form solutions for the dictionary atoms and coefficient vectors are obtained directly and explicitly. To the best of our knowledge, no previous study of dictionary learning has applied proximal operators both to sparse coding with the log regularizer and to dictionary updating with a coherence penalty. According to our analysis and simulation study, the main advantages of the proposed algorithm are its greater learning efficiency and faster convergence compared with state-of-the-art algorithms. In addition, for real-world signals, the proposed algorithm achieves lower approximation errors than an algorithm that uses the L1-norm for sparsity.
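As an illustration of the closed-form proximal step used in the sparse-coding stage, the following is a minimal sketch of an element-wise proximal operator for a log regularizer of the form lam * log(1 + |x|/theta). The function name, the parameterization of the regularizer, and the candidate-comparison strategy are assumptions for illustration; the report's algorithm may use a different formulation.

```python
import numpy as np

def prox_log(v, lam, theta):
    """Element-wise proximal operator of lam * log(1 + |x|/theta).

    A hedged sketch, not the report's exact method. Solves
        argmin_x 0.5*(x - v)**2 + lam*log(1 + |x|/theta)
    For x sharing the sign of v, stationarity gives a quadratic in |x|;
    we take its larger root and, because the problem is nonconvex,
    compare the objective there against the candidate x = 0.
    """
    v = np.asarray(v, dtype=float)
    a = np.abs(v)
    disc = (a + theta) ** 2 - 4.0 * lam   # discriminant of the quadratic
    out = np.zeros_like(v)
    root = np.zeros_like(v)
    ok = disc >= 0
    root[ok] = 0.5 * ((a[ok] - theta) + np.sqrt(disc[ok]))
    root = np.maximum(root, 0.0)
    # objective at the nonzero candidate vs. at zero
    f_root = 0.5 * (root - a) ** 2 + lam * np.log1p(root / theta)
    f_zero = 0.5 * a ** 2
    keep = ok & (f_root < f_zero)
    out[keep] = np.sign(v[keep]) * root[keep]
    return out
```

Small inputs are thresholded to exactly zero (promoting sparsity), while large inputs are only mildly shrunk, which is the qualitative behavior that distinguishes the log regularizer from the L1-norm's uniform soft-thresholding.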
|