X Yang, J Ye, X Wang - European Conference on Computer Vision, 2022 - Springer
In this paper, we explore a novel and ambitious knowledge-transfer task, termed Knowledge Factorization (KF). The core idea of KF lies in the modularization and assemblability of …
In recent years, deep neural networks have been successful in both industry and academia, especially for computer vision tasks. The great success of deep learning is mainly due to its …
Knowledge distillation (KD) has proven to be a highly effective approach for enhancing model performance through a teacher-student training scheme. However, most …
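The teacher-student scheme mentioned above is commonly instantiated as the classic logit-matching loss of Hinton et al.: the student is trained to match the teacher's temperature-softened output distribution. A minimal NumPy sketch (function names and the example logits are illustrative, not from the papers listed here; the cited works may use different KD variants):

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax: higher T yields softer distributions.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradients stay comparable across temperatures
    # (as in Hinton et al.'s original formulation).
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))

# Hypothetical example: a student that roughly follows the teacher's ranking.
teacher = [6.0, 2.0, -1.0]
student = [4.0, 3.0, 0.0]
loss = distillation_loss(student, teacher)
```

In practice this soft-target term is combined with the ordinary cross-entropy on ground-truth labels, weighted by a mixing coefficient.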
Y Yang, J Qiu, M Song, D Tao… - Proceedings of the …, 2020 - openaccess.thecvf.com
Existing knowledge distillation methods focus on convolutional neural networks (CNNs), where the input samples like images lie in a grid domain, and have largely overlooked …
K Zhang, J Zhang, PD Xu, T Gao… - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Artificial intelligence (AI) technology has become an important trend to support the analysis and control of complex and time-varying power systems. Although deep reinforcement …
K Zhang, P Xu, J Zhang - 2020 IEEE 4th conference on energy …, 2020 - ieeexplore.ieee.org
Applications of artificial intelligence (AI) systems are increasingly widespread, and explainable AI (XAI) technology is used to explain why machine learning (ML) models make certain …
Deep convolutional networks have been widely deployed in modern cyber-physical systems performing different visual classification tasks. As the fog and edge devices have …
Having the right inductive biases can be crucial in many tasks or scenarios where data or computing resources are a limiting factor, or where training data is not perfectly …
A massive number of well-trained deep networks have been released by developers online. These networks may focus on different tasks and in many cases are optimized for different …