Data prefetching, i.e., the act of predicting an application's future memory accesses and fetching those that are not in the on-chip caches, is a well-known and widely used approach …
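To make the definition concrete (an illustrative sketch only, not the mechanism of any paper listed here): a minimal stride prefetcher tracks each load's address stride and fetches the next block along a confirmed stride. The table organization and the issue_prefetch() hook are assumptions.

#include <cstdint>
#include <cstdio>
#include <unordered_map>

// Stub for the hardware hook that would fetch 'addr' into the cache hierarchy.
static void issue_prefetch(uint64_t addr) {
    std::printf("prefetch 0x%llx\n", (unsigned long long)addr);
}

// Minimal per-PC stride prefetcher: once a load repeats a constant stride,
// the next address along that stride is fetched ahead of the demand access.
struct StridePrefetcher {
    struct Entry { uint64_t last_addr = 0; int64_t stride = 0; bool valid = false; };
    std::unordered_map<uint64_t, Entry> table;  // keyed by load PC

    void on_access(uint64_t pc, uint64_t addr) {
        Entry &e = table[pc];
        int64_t s = (int64_t)addr - (int64_t)e.last_addr;
        if (e.valid && s == e.stride && s != 0)
            issue_prefetch(addr + s);            // pattern confirmed: run ahead
        e.stride = s;
        e.last_addr = addr;
        e.valid = true;
    }
};

int main() {
    StridePrefetcher pf;
    for (uint64_t a = 0x1000; a < 0x1400; a += 0x40)   // a strided access stream
        pf.on_access(/*pc=*/0x400123, a);
}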
M Shakerinava… - The Third Data …, 2019 - dpc3.compas.cs.stonybrook.edu
Offset prefetching has recently been proposed as a low-overhead yet high-performance approach to eliminate data cache misses or reduce their negative effect. In offset …
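For context on what an offset prefetcher does (a sketch of the general technique, not the specific proposal in this entry): on each demand miss to cache line X, line X + D is prefetched for some offset D, and the offset itself is learned by scoring how often X - D was seen shortly before X. The candidate set, history size, and issue_prefetch() hook below are assumptions.

#include <cstdint>
#include <cstdio>
#include <deque>
#include <array>
#include <algorithm>

static void issue_prefetch(uint64_t line) {
    std::printf("prefetch line %llu\n", (unsigned long long)line);
}

// Toy offset prefetcher: keeps a small history of recent miss lines, scores a
// few candidate offsets by checking whether (line - D) appears in that history,
// and prefetches with the current best-scoring offset.
struct OffsetPrefetcher {
    static constexpr int kCandidates[4] = {1, 2, 3, 4};
    std::array<int, 4> score{};            // one score per candidate offset
    std::deque<uint64_t> recent;           // recent miss lines (small window)

    void on_miss(uint64_t line) {
        // Train: which offsets would have predicted this miss?
        for (size_t i = 0; i < 4; ++i)
            if (std::find(recent.begin(), recent.end(),
                          line - kCandidates[i]) != recent.end())
                ++score[i];
        int best = kCandidates[std::max_element(score.begin(), score.end())
                               - score.begin()];
        issue_prefetch(line + best);       // act with the best offset so far

        recent.push_back(line);
        if (recent.size() > 64) recent.pop_front();
    }
};

int main() {
    OffsetPrefetcher pf;
    for (uint64_t l = 100; l < 130; l += 3) pf.on_miss(l);  // stream with stride 3 lines
}

Real offset prefetchers additionally gate training on prefetch timeliness and reset scores periodically; the fixed 64-entry history here is only for illustration.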
Non-Volatile Memory (NVM) technology is a promising solution to fulfill the ever-growing need for higher capacity in the main memory of modern systems. Despite having many great …
Y Li, B Tian, M Gao - Proceedings of the 2024 International Conference …, 2024 - dl.acm.org
Hybrid main memory systems combine the performance and capacity advantages of heterogeneous memory technologies. With larger capacities, higher associativities, and finer …
This paper proposes Blenda, a dynamically-partitioned memory-cache blend architecture for giga-scale die-stacked DRAMs. Blenda architects the stacked DRAM partly as memory and …
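As a very rough illustration of blending memory and cache roles within one stacked DRAM (this sketches the general concept only, not Blenda's actual mechanism; the frame-based partition and its fields are assumptions):

#include <cstdint>

// Illustrative split of a stacked-DRAM frame pool: frames below the partition
// point are exposed to the OS as ordinary memory, the rest back a hardware-
// managed cache for off-chip DRAM. A real design would also handle data
// remapping when the partition point moves at run time.
enum class Role { OsVisibleMemory, HardwareCache };

struct StackedDramPartition {
    uint64_t total_frames;      // capacity of the stacked DRAM, in frames
    uint64_t partition_frames;  // frames currently given to the memory role

    Role role_of(uint64_t frame) const {
        return frame < partition_frames ? Role::OsVisibleMemory
                                        : Role::HardwareCache;
    }
};

int main() {
    StackedDramPartition p{/*total_frames=*/1u << 20, /*partition_frames=*/1u << 19};
    return p.role_of(42) == Role::OsVisibleMemory ? 0 : 1;
}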
Data prefetching, i.e., the act of predicting an application's future memory accesses and fetching those that are not in the on-chip caches, is a well-known and widely-used approach to hide …
L1 instruction (L1-I) cache misses are a significant performance bottleneck. Sequential prefetchers are simple solutions to mitigate this problem; however, prior work has shown that …
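For reference, the sequential prefetcher baseline mentioned here is typically a next-N-line scheme: on an instruction fetch to block B, the next N sequential blocks are requested as well. A minimal sketch follows; the block size, degree, and issue_prefetch() hook are assumptions.

#include <cstdint>
#include <cstdio>

static constexpr uint64_t kBlockBytes = 64;   // assumed L1-I line size

static void issue_prefetch(uint64_t addr) {
    std::printf("prefetch 0x%llx\n", (unsigned long long)addr);
}

// Next-N-line instruction prefetcher: on a fetch to block B, also request the
// next 'degree' sequential blocks, exploiting the largely sequential layout of code.
struct NextLinePrefetcher {
    unsigned degree;
    explicit NextLinePrefetcher(unsigned d) : degree(d) {}

    void on_fetch(uint64_t pc) {
        uint64_t block = pc / kBlockBytes;
        for (unsigned i = 1; i <= degree; ++i)
            issue_prefetch((block + i) * kBlockBytes);
    }
};

int main() {
    NextLinePrefetcher pf(/*degree=*/2);
    pf.on_fetch(0x401000);   // requests the blocks at 0x401040 and 0x401080
}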
Q Zhang, X Sui, R Hou, L Zhang - Sustainable Computing: Informatics and …, 2021 - Elsevier
Die-stacked DRAM has emerged as an effective approach to address the memory bandwidth wall, as it offers much higher bandwidth than off-chip DRAM. It is typically used …
Server workloads like Media Streaming and Web Search serve millions of users and are considered an important class of applications. Such workloads run on large-scale data …