Memory System Optimization for Energy-Efficient Big Data Processing
Project/Area Number | 16H06677 |
Research Category | Grant-in-Aid for Research Activity Start-up |
Allocation Type | Single-year Grants |
Research Field | Computer system |
Research Institution | The University of Tokyo |
Principal Investigator | Arima Eishi  The University of Tokyo, Information Technology Center, Project Assistant Professor (50780699) |
Research Collaborator | Schulz Martin  Lawrence Livermore National Laboratory, Computer Scientist |
Project Period (FY) | 2016-08-26 – 2018-03-31 |
Project Status | Completed (Fiscal Year 2017) |
Budget Amount | ¥2,990,000 (Direct Cost: ¥2,300,000, Indirect Cost: ¥690,000) |
Fiscal Year 2017: ¥1,430,000 (Direct Cost: ¥1,100,000, Indirect Cost: ¥330,000) |
Fiscal Year 2016: ¥1,560,000 (Direct Cost: ¥1,200,000, Indirect Cost: ¥360,000) |
Keywords | memory system / big data / power efficiency improvement / cache / storage class memory / computer system / power control |
Outline of Final Research Achievements | To improve the energy efficiency of big data processing, this work optimized memory systems, which are major performance and power bottlenecks when executing big data applications, through hardware- and software-side approaches. In particular, the work is based on two novel approaches: (1) address-translation-aware cache management and (2) storage-class-memory-aware power management. The evaluation quantified that applying these methods can improve energy efficiency by a few tens of percent. |
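The record does not describe the mechanisms behind these approaches. As a purely illustrative sketch of what address-translation-aware cache management can mean in general terms, the Python model below biases LRU replacement in a set-associative cache toward retaining lines fetched by page-table walks; the class name, parameters, and policy are hypothetical and are not taken from this project's actual design.

```python
# Illustrative sketch only (not the project's documented mechanism):
# a set-associative cache whose replacement policy prefers to evict
# ordinary demand lines before lines brought in by page-table walks.

from collections import OrderedDict

class TranslationAwareCache:
    def __init__(self, num_sets=64, ways=8, line_bytes=64):
        self.num_sets = num_sets
        self.ways = ways
        self.line_bytes = line_bytes
        # Each set maps tag -> is_ptw flag, ordered from LRU (front) to MRU (back).
        self.sets = [OrderedDict() for _ in range(num_sets)]

    def access(self, addr, is_ptw=False):
        """Return True on a hit; is_ptw marks a page-table-walk access."""
        line = addr // self.line_bytes
        idx, tag = line % self.num_sets, line // self.num_sets
        s = self.sets[idx]
        if tag in s:
            s.move_to_end(tag)            # promote to MRU
            s[tag] = s[tag] or is_ptw     # keep the PTW flag sticky
            return True
        if len(s) >= self.ways:
            # Prefer the least recently used non-PTW line as the victim;
            # fall back to plain LRU if every resident line is a PTW line.
            victim = next((t for t, ptw in s.items() if not ptw), next(iter(s)))
            del s[victim]
        s[tag] = is_ptw                   # insert new line at MRU position
        return False
```

A short usage example under the same assumptions: `cache.access(0x1000, is_ptw=True)` inserts a page-table line that later misses will tend not to evict, while `cache.access(0x2000)` behaves like an ordinary LRU-managed demand access.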
Report (3 results)
Research Products (4 results)