Variance-Invariance-Covariance Regularization (VICReg) is a self-supervised learning (SSL) method that has shown promising results on a variety of tasks. However, the …
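For orientation, VICReg's objective combines three terms: an invariance (similarity) term between two embedding views, a variance hinge that keeps each embedding dimension spread out, and a covariance penalty that decorrelates dimensions. Below is a minimal PyTorch sketch of the standard formulation from Bardes et al. (2022); the default coefficients (25, 25, 1) are the ones reported there, and this is illustrative background rather than the variant studied in this particular paper.

```python
import torch
import torch.nn.functional as F

def vicreg_loss(z_a, z_b, sim_w=25.0, var_w=25.0, cov_w=1.0, eps=1e-4):
    # Invariance: mean-squared error between the two views' embeddings.
    inv = F.mse_loss(z_a, z_b)
    # Variance: hinge loss keeping each dimension's std above 1.
    std_a = torch.sqrt(z_a.var(dim=0) + eps)
    std_b = torch.sqrt(z_b.var(dim=0) + eps)
    var = torch.mean(F.relu(1.0 - std_a)) + torch.mean(F.relu(1.0 - std_b))
    # Covariance: push off-diagonal covariance entries toward zero.
    n, d = z_a.shape
    z_a = z_a - z_a.mean(dim=0)
    z_b = z_b - z_b.mean(dim=0)
    cov_a = (z_a.T @ z_a) / (n - 1)
    cov_b = (z_b.T @ z_b) / (n - 1)
    off_diag = lambda c: c - torch.diag(torch.diag(c))
    cov = (off_diag(cov_a) ** 2).sum() / d + (off_diag(cov_b) ** 2).sum() / d
    return sim_w * inv + var_w * var + cov_w * cov
```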
Q Li, Z Liu, Q Li, K Xu - Proceedings of the 2023 ACM SIGSAC …, 2023 - dl.acm.org
The development of machine learning models requires a large amount of training data. A data marketplace is a critical platform for trading high-quality, private-domain data that is not …
While post-training quantization is popular largely because it avoids accessing the original, complete training dataset, its poor performance also stems from scarce images …
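As background on why PTQ can skip the training set: in the simplest schemes the quantization scale is calibrated from the tensor itself, with no fine-tuning. Here is a minimal NumPy sketch of symmetric per-tensor INT8 min-max quantization; this is a generic baseline, not the method proposed in this paper.

```python
import numpy as np

def ptq_int8(w: np.ndarray):
    # Per-tensor symmetric INT8 PTQ: the scale comes from the tensor's
    # max magnitude, so no training data or fine-tuning is required.
    scale = np.abs(w).max() / 127.0
    q = np.clip(np.round(w / scale), -128, 127).astype(np.int8)
    return q, scale

w = np.random.randn(64, 64).astype(np.float32)
q, s = ptq_int8(w)
w_hat = q.astype(np.float32) * s  # dequantize for (simulated) inference
print("max abs error:", np.abs(w - w_hat).max())
```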
J Deng, S Li, Z Wang, H Gu, K Xu, K Huang - arXiv preprint arXiv …, 2024 - arxiv.org
Diffusion Transformer models (DiTs) have transitioned the network architecture from traditional UNets to transformers, demonstrating exceptional capabilities in image …
K Zhen, M Radfar, H Nguyen, GP Strimel… - 2022 IEEE Spoken …, 2023 - ieeexplore.ieee.org
For on-device automatic speech recognition (ASR), quantization-aware training (QAT) is ubiquitous for achieving the trade-off between model predictive performance and efficiency …
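For reference, QAT typically inserts "fake quantization" into the forward pass and uses a straight-through estimator (STE) so gradients flow through the non-differentiable rounding. A minimal PyTorch sketch of this standard trick follows; it is illustrative of QAT in general, not the specific sub-8-bit scheme of this paper.

```python
import torch

def fake_quant(x: torch.Tensor, scale: float, qmin: int = -128, qmax: int = 127) -> torch.Tensor:
    # Forward: quantize-dequantize onto the integer grid defined by `scale`.
    x_q = torch.clamp(torch.round(x / scale), qmin, qmax) * scale
    # Straight-through estimator: backward treats round/clamp as identity,
    # so gradients reach x unchanged while the forward sees quantized values.
    return x + (x_q - x).detach()
```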
Internet of Things (IoT) systems provide large amounts of data on all aspects of human behavior. Machine learning techniques, especially deep neural networks (DNNs), have …
Deep Learning (DL) has shown impressive performance in many mobile applications. Most existing works have focused on reducing the computational and resource overheads of …
J Wang, H Chen, D Wang, K Mei… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Stochastic computing (SC) has become a promising approximate computing solution owing to its negligible resource occupancy and ultra-low energy consumption. As a potential …
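As a quick illustration of why SC is so cheap: in unipolar SC a value in [0, 1] is encoded as the probability of a 1 in a random bitstream, and a single AND gate multiplies two independent streams. A small NumPy sketch of this generic principle (not this paper's particular design):

```python
import numpy as np

rng = np.random.default_rng(0)

def to_bitstream(p: float, n: int) -> np.ndarray:
    # Unipolar encoding: a value p in [0, 1] becomes a Bernoulli(p) bitstream.
    return rng.random(n) < p

a, b = 0.6, 0.5
stream = np.logical_and(to_bitstream(a, 4096), to_bitstream(b, 4096))
# An AND of two independent streams multiplies the values in expectation.
print(stream.mean())  # ~0.30, approximating a * b
```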
In this work we show that the size versus accuracy trade-off of neural network quantization can be significantly improved by increasing the quantization dimensionality. We propose the …
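For context, "increasing the quantization dimensionality" means quantizing groups of weights jointly against a shared codebook (vector quantization) rather than rounding each scalar independently. A minimal k-means codebook sketch in NumPy follows; it is a generic illustration, not a reproduction of the paper's algorithm.

```python
import numpy as np

def vq_codebook(w: np.ndarray, dim: int = 2, k: int = 256, iters: int = 20):
    # Reshape the weight tensor into `dim`-dimensional sub-vectors and fit a
    # shared codebook with plain k-means; each sub-vector stores one index.
    assert w.size % dim == 0
    rng = np.random.default_rng(0)
    vecs = w.reshape(-1, dim)
    codebook = vecs[rng.choice(len(vecs), size=k, replace=False)].copy()
    for _ in range(iters):
        dists = ((vecs[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)
        assign = dists.argmin(axis=1)
        for j in range(k):
            members = vecs[assign == j]
            if len(members) > 0:
                codebook[j] = members.mean(axis=0)
    return codebook, assign
```

With dim=2 and k=256, each 8-bit index covers two weights (4 bits per weight) plus a small shared codebook, which is the kind of size-versus-accuracy lever the abstract alludes to.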