Accuracy Boosters: Epoch-Driven Mixed-Mantissa Block Floating-Point for DNN Training

S Burcu Harma, A Chakraborty, B Falsafi, M Jaggi… - arXiv preprint arXiv …, 2022 - arxiv.org
The unprecedented growth in DNN model complexity, size, and amount of training data has
led to a commensurate increase in demand for computing and a search for minimal
encoding. Recent research advocates Hybrid Block Floating Point (HBFP) to minimize
silicon provisioning in accelerators by converting the majority of arithmetic operations in
training to 8-bit fixed point. In this paper, we perform a full-scale exploration of the HBFP
design space using mathematical tools to study the interplay among various parameters and …
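
For context on the encoding: block floating-point (BFP) groups values into fixed-size blocks, stores one shared power-of-two exponent per block, and keeps only a short fixed-point mantissa per value, so the dot products that dominate training reduce to fixed-point multiply-accumulates. The sketch below illustrates plain shared-exponent BFP quantization in NumPy; the function name bfp_quantize, its block size, and its rounding/clipping choices are illustrative assumptions, not the paper's exact HBFP format (the "hybrid" in HBFP refers to keeping the remaining operations in floating point).

import numpy as np

def bfp_quantize(x, mantissa_bits=4, block_size=64):
    """Quantize a 1-D array onto a block floating-point grid.

    Each block of block_size values shares a single power-of-two
    exponent; each value is kept as a signed mantissa_bits-bit
    fixed-point mantissa under that exponent. Illustrative sketch,
    not the paper's exact HBFP format.
    """
    x = np.asarray(x, dtype=np.float64)
    pad = (-x.size) % block_size
    blocks = np.pad(x, (0, pad)).reshape(-1, block_size)

    # Shared exponent: smallest power of two covering the block's max |value|.
    max_abs = np.abs(blocks).max(axis=1, keepdims=True)
    exp = np.ceil(np.log2(np.where(max_abs > 0, max_abs, 1.0)))

    # One sign bit plus (mantissa_bits - 1) magnitude bits per value.
    levels = 2 ** (mantissa_bits - 1)
    step = 2.0 ** exp / levels

    # Round onto the fixed-point grid; the top code clips by one step.
    q = np.clip(np.round(blocks / step), -levels, levels - 1)
    return (q * step).reshape(-1)[: x.size]

# Example: quantize simulated gradients with 4-bit mantissas.
g = np.random.randn(1000)
g_q = bfp_quantize(g, mantissa_bits=4, block_size=64)
print("max abs error:", np.max(np.abs(g - g_q)))

Because every value in a block shares the exponent, a dot product over a block multiplies only small integer mantissas and applies the exponents once per block, which is what shrinks silicon provisioning relative to full floating-point units.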
