Cloud service providers improve resource utilization by co-locating latency-critical (LC) workloads with best-effort batch (BE) jobs in datacenters. However, they usually treat multi …
Deep learning frameworks are powerful tools to support model training. They dispatch operators by mapping them into a series of kernel functions and launching these kernel …
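The dispatch mechanism this snippet describes (high-level operators lowered to a sequence of kernel launches) can be illustrated with a minimal, assumed Python sketch; the registry, dispatch function, and toy kernels below are hypothetical stand-ins, not the API of any real framework:

from typing import Callable, Dict, List

# Hypothetical registry mapping each operator name to the ordered list of
# kernel functions that implement it (an assumption for illustration).
KERNEL_REGISTRY: Dict[str, List[Callable]] = {}

def register_op(name: str, kernels: List[Callable]) -> None:
    """Associate an operator with the kernel sequence that implements it."""
    KERNEL_REGISTRY[name] = kernels

def dispatch(name: str, x: float) -> float:
    """Map an operator to its kernels and launch them one after another."""
    for kernel in KERNEL_REGISTRY[name]:
        x = kernel(x)
    return x

# Example: a fused "linear + relu" operator lowered to two toy kernels.
register_op("linear_relu", [
    lambda x: 2.0 * x + 1.0,   # stand-in for a matmul/bias kernel
    lambda x: max(x, 0.0),     # stand-in for a ReLU kernel
])

print(dispatch("linear_relu", -3.0))  # -> 0.0 (ReLU clamps the negative value)

Real frameworks perform this lowering per device and data type and launch the resulting kernels asynchronously on an accelerator; the single-threaded loop above only mirrors the operator-to-kernel mapping step.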
Different placement or collaboration policies for handling datasets and workloads across the cloud, edge, and user end may substantially affect a cloud-edge computing environment's …
Data center networks are the central infrastructure of the modern information society. However, benchmarking them is very challenging, as real-world network traffic is difficult …
To better design AI processors, it is critical to characterize artificial intelligence (AI) workloads and contrast them with conventional personal computer (PC) workloads. In this work, we …
GPUs play a central and indispensable role as accelerators in modern high-performance computing (HPC) platforms, enabling a wide range of tasks to be performed efficiently …
Reproducibility and replicability (R&R) are important for research. Many communities are launching efforts to reward and highlight projects as an incentive to adopt R&R …
The global community faces many pressing and uncertain challenges like pandemics and global climate change. Information technology (IT) infrastructure has become the enabler to …
Benchmarking and evaluating deep learning models and systems requires a meticulous approach to ensure comprehensive assessment. In practical applications, it is paramount to …