Feedback Learning Based Dead Write Termination for Energy Efficient STT-RAM Caches
Author affiliations: Computer School, Wuhan University; State Key Laboratory of Software Engineering, Wuhan University; School of Computer and Information, Hefei University of Technology
Published in: Chinese Journal of Electronics
Year/Volume/Issue: 2017, Vol. 26, No. 3
Pages: 460-467
Subject classification: 08 [Engineering]; 081201 [Engineering - Computer Architecture]; 0812 [Engineering - Computer Science and Technology (degrees conferrable in Engineering or Science)]
Funding: supported by the National Natural Science Foundation of China (Nos. 91118003, 61170022, 61373039, 61402145, 61502346); the Natural Science Foundation of Hubei Province (No. 2015CFB338); the Natural Science Foundation of Anhui Province (No. 1508085QF138); and the Science and Technology Project of the Jiangxi Province Education Department (No. GJJ150605)
Keywords: Dead blocks; Energy efficiency; Spin-transfer torque RAM (STT-RAM); Last-level cache (LLC)
Abstract: Spin-transfer torque RAM (STT-RAM) is a promising candidate to replace SRAM for larger last-level caches (LLCs). However, its long write latency and high write energy diminish the benefit of adopting STT-RAM caches. A common observation for the LLC is that a large number of cache blocks are never referenced again before they are evicted. The write operations for these blocks, which we call dead writes, can be eliminated without incurring subsequent cache misses. To address this issue, a quantitative scheme called Feedback learning based dead write termination (FLDWT) is proposed to improve the energy efficiency and performance of STT-RAM based LLCs. FLDWT dynamically learns block access behavior using data reuse distance and data access frequency, and then classifies blocks into dead blocks and live blocks. FLDWT terminates dead write requests and improves its estimation accuracy via feedback information. Compared with the STT-RAM baseline in the last-level cache, experimental results show that the scheme reduces energy by 44.6% and improves performance by 12% on average with negligible overhead.
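The mechanism the abstract describes (classify blocks as dead or live from reuse distance and access frequency, terminate writes predicted dead, and correct the predictor via feedback) can be sketched roughly as follows. This is a minimal illustration, not the paper's actual FLDWT algorithm: all class names, thresholds, and the feedback rule are assumptions made for the sketch.

```python
# Hypothetical sketch of a dead-write predictor in the spirit of FLDWT.
# Thresholds, names, and the feedback rule are illustrative assumptions;
# the paper's actual algorithm may differ.

class DeadWritePredictor:
    def __init__(self, reuse_threshold=64, freq_threshold=2):
        self.reuse_threshold = reuse_threshold  # max reuse distance for "live"
        self.freq_threshold = freq_threshold    # min access count for "live"
        self.last_access = {}  # block -> logical time of last access
        self.freq = {}         # block -> access count
        self.time = 0          # logical clock advanced on every access

    def access(self, block):
        """Record an access to a block, updating frequency and recency."""
        self.time += 1
        self.freq[block] = self.freq.get(block, 0) + 1
        self.last_access[block] = self.time

    def predict_dead(self, block):
        """Predict dead if the block is rarely accessed or long unreferenced."""
        if block not in self.last_access:
            return False  # no history: conservatively treat as live
        reuse_distance = self.time - self.last_access[block]
        return (self.freq.get(block, 0) < self.freq_threshold
                or reuse_distance > self.reuse_threshold)

    def feedback(self, block, was_reused):
        """Correct the predictor: if a write we terminated was later reused,
        the prediction was wrong, so loosen the threshold; otherwise tighten."""
        if was_reused:
            self.reuse_threshold += 8   # mispredicted dead: be more conservative
        elif self.reuse_threshold > 8:
            self.reuse_threshold -= 1   # correct prediction: be more aggressive


def handle_llc_write(pred, block):
    """Skip the STT-RAM array write when the block is predicted dead."""
    if pred.predict_dead(block):
        return "bypass"  # dead write terminated: write energy/latency saved
    pred.access(block)
    return "write"
```

The point of the feedback step is that a purely static threshold would misclassify workloads whose reuse distances drift over time; feeding mispredictions back lets the classifier adapt, which is how the abstract explains the improved estimation accuracy.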