Causal-DFQ: Causality Guided Data-Free Network Quantization

Y Shang, B Xu, G Liu… - Proceedings of the IEEE/CVF International Conference on …, 2023 - openaccess.thecvf.com
Abstract
Model quantization, which aims to compress deep neural networks and accelerate inference, has greatly facilitated the deployment of cumbersome models on mobile and edge devices. Prior quantization methods commonly assume that training data are available. In practice, however, this assumption cannot always be fulfilled due to privacy and security concerns, rendering these methods inapplicable in real-life situations. Thus, data-free network quantization has recently received significant attention in neural network compression. Causal reasoning provides an intuitive way to model causal relationships and eliminate data-driven correlations, making causality an essential component of analyzing data-free problems. However, causal formulations of data-free quantization are inadequate in the literature. To bridge this gap, we construct a causal graph to model the data generation and the discrepancy reduction between the pre-trained and quantized models. Inspired by this causal understanding, we propose the Causality-guided Data-free Network Quantization method, Causal-DFQ, which eliminates the reliance on data by approaching an equilibrium of causality-driven intervened distributions. Specifically, we design a content-style-decoupled generator that synthesizes images conditioned on relevant and irrelevant factors; we then propose a discrepancy reduction loss to align the intervened distributions of the pre-trained and quantized models. It is worth noting that our work is the first attempt to introduce causality to the data-free quantization problem. Extensive experiments demonstrate the efficacy of Causal-DFQ.
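The abstract only sketches the two ingredients at a high level. Below is a minimal, hypothetical PyTorch sketch of how a content-style-decoupled generator and a discrepancy reduction loss between a pre-trained (teacher) and quantized (student) model could be wired together. All class, function, and variable names are illustrative assumptions, and the KL-divergence alignment under style interventions is a stand-in for the paper's intervened-distribution alignment, not the authors' actual implementation.

```python
# Hypothetical sketch: (1) a generator whose input splits into a "content" code
# (relevant factor) and a "style" code (irrelevant factor), and (2) a loss that
# pushes the quantized model's output distribution toward the pre-trained
# model's on the synthesized images. Names and architecture are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ContentStyleGenerator(nn.Module):
    """Synthesizes images from separate content and style codes."""

    def __init__(self, content_dim=64, style_dim=64, img_shape=(3, 32, 32)):
        super().__init__()
        self.img_shape = img_shape
        out_dim = img_shape[0] * img_shape[1] * img_shape[2]
        self.net = nn.Sequential(
            nn.Linear(content_dim + style_dim, 512),
            nn.ReLU(inplace=True),
            nn.Linear(512, out_dim),
            nn.Tanh(),
        )

    def forward(self, z_content, z_style):
        x = self.net(torch.cat([z_content, z_style], dim=1))
        return x.view(-1, *self.img_shape)


def discrepancy_reduction_loss(teacher, student, images):
    """KL divergence between the pre-trained (teacher) and quantized (student)
    output distributions on the same synthetic batch."""
    with torch.no_grad():
        p_teacher = F.softmax(teacher(images), dim=1)
    log_p_student = F.log_softmax(student(images), dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean")


if __name__ == "__main__":
    # Tiny linear classifiers stand in for the pre-trained and quantized models.
    teacher = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
    student = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 10))
    gen = ContentStyleGenerator()

    # Hold the relevant (content) factor fixed and intervene on the irrelevant
    # (style) factor, aligning the two models' responses across interventions.
    z_content = torch.randn(8, 64)
    loss = 0.0
    for _ in range(2):
        z_style = torch.randn(8, 64)
        images = gen(z_content, z_style)
        loss = loss + discrepancy_reduction_loss(teacher, student, images)
    loss.backward()  # gradients flow to the generator and the student model
```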