Multi-objective deep learning: Taxonomy and survey of the state of the art

S Peitz, SS Hotegni - arXiv preprint arXiv:2412.01566, 2024 - arxiv.org
Simultaneously considering multiple objectives in machine learning has been a popular
approach for several decades, with various benefits for multi-task learning, the consideration …

Exploring lottery ticket hypothesis in few-shot learning

Y Xie, Q Sun, Y Fu - Neurocomputing, 2023 - Elsevier
The Lottery Ticket Hypothesis (LTH) [14] has gathered great attention since being proposed.
Researchers have since succeeded in finding alternative ways to identify the "winning ticket" and …

A multiobjective continuation method to compute the regularization path of deep neural networks

AC Amakor, K Sonntag, S Peitz - arXiv preprint arXiv:2308.12044, 2023 - arxiv.org
Sparsity is a highly desired feature in deep neural networks (DNNs) since it ensures
numerical efficiency, improves the interpretability of models (due to the smaller number of …

HRBP: Hardware-friendly Regrouping towards Block-based Pruning for Sparse CNN Training

H Ma, C Zhang, X Ma, G Yuan… - … on Parsimony and …, 2024 - proceedings.mlr.press
Pruning at initialization and training a sparse network from scratch (sparse training) have become
increasingly popular. However, most sparse training literature addresses only the …

Adaptive Pruning of Pretrained Transformer via Differential Inclusions

Y Ding, K Fan, Y Wang, X Sun, Y Fu - arXiv preprint arXiv:2501.03289, 2025 - arxiv.org
Large transformers have demonstrated remarkable success, making it necessary to
compress these models to reduce inference costs while preserving their performance …

Exploring structural sparsity of deep networks via inverse scale spaces

Y Fu, C Liu, D Li, Z Zhong, X Sun… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
The great success of deep neural networks is built upon their over-parameterization, which
smooths the optimization landscape without degrading the generalization ability. Despite the …

Sparse Learning in AI: A Differential Inclusion Perspective

X Sun - Proceedings of the ACM Turing Award Celebration …, 2023 - dl.acm.org
We propose a novel sparsity learning framework from the perspective of differential
inclusion, which enjoys better statistical properties and computational efficiency. The utility of this …

Adaptive End-to-End Budgeted Network Learning via Inverse Scale Space

Z Zhong, C Liu, Y Fu - BMVC, 2021 - bmvc2021-virtualconference.com
This paper studies the task of budgeted network learning [35] that aims at discovering good
convolutional network structures under parameters/FLOPs constraints. Particularly, we …

A New Screening Method for COVID-19 based on Ocular Feature Recognition by Machine Learning Tools

Y Fu, F Li, W Wang, H Tang, X Qian, M Gu… - arXiv preprint arXiv …, 2020 - arxiv.org
The Coronavirus disease 2019 (COVID-19) has affected several million people. Since the
outbreak of the epidemic, many researchers have been devoting themselves to the COVID-19 …

Visual Semantic Learning via Early Stopping in Inverse Scale Space

Z Zhou, Z Liu, C Xu, Y Fu, X Sun - openreview.net
Different levels of visual information are generally coupled in image data, thus making it
hard to reverse the trend of deep learning models that learn texture bias from images …