Model compression for deep neural networks: A survey

Z Li, H Li, L Meng - Computers, 2023 - mdpi.com
Currently, with the rapid development of deep learning, deep neural networks (DNNs) have
been widely applied in various computer vision tasks. However, in the pursuit of …

Recent developments of content-based image retrieval (CBIR)

X Li, J Yang, J Ma - Neurocomputing, 2021 - Elsevier
With the development of Internet technology and the popularity of digital devices, Content-
Based Image Retrieval (CBIR) has been quickly developed and applied in various fields …

Efficiently identifying task groupings for multi-task learning

C Fifty, E Amid, Z Zhao, T Yu… - Advances in Neural …, 2021 - proceedings.neurips.cc
Multi-task learning can leverage information learned by one task to benefit the training of
other tasks. Despite this capacity, naively training all tasks together in one model often …

Multi-task learning with deep neural networks: A survey

M Crawshaw - arXiv preprint arXiv:2009.09796, 2020 - arxiv.org
Multi-task learning (MTL) is a subfield of machine learning in which multiple tasks are
simultaneously learned by a shared model. Such approaches offer advantages like …

Multi-task learning for dense prediction tasks: A survey

S Vandenhende, S Georgoulis… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
With the advent of deep learning, many dense prediction tasks, i.e., tasks that produce pixel-level predictions, have seen significant performance improvements. The typical approach is …

KD-PAR: A knowledge distillation-based pedestrian attribute recognition model with multi-label mixed feature learning network

P Wu, Z Wang, H Li, N Zeng - Expert Systems with Applications, 2024 - Elsevier
In this paper, a novel knowledge distillation (KD)-based pedestrian attribute recognition
(PAR) model is developed, where a multi-label mixed feature learning network (MMFL-Net) …

Modular deep learning

J Pfeiffer, S Ruder, I Vulić, EM Ponti - arXiv preprint arXiv:2302.11529, 2023 - arxiv.org
Transfer learning has recently become the dominant paradigm of machine learning. Pre-
trained models fine-tuned for downstream tasks achieve better performance with fewer …

Which tasks should be learned together in multi-task learning?

T Standley, A Zamir, D Chen, L Guibas… - International …, 2020 - proceedings.mlr.press
Many computer vision applications require solving multiple tasks in real-time. A neural
network can be trained to solve multiple tasks simultaneously using multi-task learning. This …

GradNorm: Gradient normalization for adaptive loss balancing in deep multitask networks

Z Chen, V Badrinarayanan, CY Lee… - … on machine learning, 2018 - proceedings.mlr.press
Deep multitask networks, in which one neural network produces multiple predictive outputs,
can offer better speed and performance than their single-task counterparts but are …

SpotTune: Transfer learning through adaptive fine-tuning

Y Guo, H Shi, A Kumar, K Grauman… - Proceedings of the …, 2019 - openaccess.thecvf.com
Transfer learning, which allows a source task to affect the inductive bias of the target task, is
widely used in computer vision. The typical way of conducting transfer learning with deep …