S Peitz, SS Hotegni - arXiv preprint arXiv:2412.01566, 2024 - arxiv.org
Simultaneously considering multiple objectives in machine learning has been a popular approach for several decades, with various benefits for multi-task learning, the consideration …
Y Chen, W He, XL Zhao, TZ Huang… - IEEE Transactions on …, 2022 - ieeexplore.ieee.org
Hyperspectral image (HSI) denoising has been regarded as an effective and economical preprocessing step for subsequent data applications. Recent nonlocal low-rank …
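Since the snippet only gestures at the nonlocal low-rank family, the following is a rough, generic illustration of its common building block rather than this paper's method: stack nonlocally similar patches into a matrix and shrink it toward low rank, here by hard-truncating the SVD. The function name lowrank_denoise_group, the rank-1 toy data, and the noise level are assumptions made for the sketch.

```python
import numpy as np

def lowrank_denoise_group(patch_group, rank):
    """Denoise a group of similar patches by truncated SVD.

    patch_group: (num_patches, patch_dim) matrix whose rows are
    vectorized, nonlocally similar patches; keeping only the top
    singular components suppresses roughly i.i.d. noise.
    """
    U, s, Vt = np.linalg.svd(patch_group, full_matrices=False)
    s[rank:] = 0.0                      # discard small singular values
    return (U * s) @ Vt                 # low-rank reconstruction

# toy example: a rank-1 patch group corrupted by Gaussian noise
rng = np.random.default_rng(0)
clean = np.outer(rng.random(32), rng.random(49))   # 32 patches of dim 49
noisy = clean + 0.05 * rng.standard_normal(clean.shape)
denoised = lowrank_denoise_group(noisy, rank=1)
print(np.linalg.norm(denoised - clean) < np.linalg.norm(noisy - clean))
```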
What predicts cross-country differences in the recovery of socioeconomic activity from the COVID-19 pandemic? To answer this question, we examined how quickly countries' …
We propose a learning framework based on stochastic Bregman iterations, also known as mirror descent, to train sparse neural networks with an inverse scale space approach. We …
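For orientation, the core step behind such Bregman/mirror-descent schemes can be written in the generic (sub)gradient form below; the training loss L, step size τ, and the convex sparsity-promoting functional J are left abstract, since the paper's exact stochastic variant and choice of J are not spelled out in the snippet.

```latex
% Generic Bregman / mirror-descent step with convex regularizer J,
% training loss L, step size \tau, and subgradient p_k \in \partial J(\theta_k):
\[
  \theta_{k+1} \;=\; \arg\min_{\theta}\;
    \tau\,\langle \nabla L(\theta_k),\, \theta\rangle
    \;+\; D_J^{p_k}(\theta,\theta_k),
  \qquad
  D_J^{p_k}(\theta,\theta_k) \;=\; J(\theta) - J(\theta_k)
    - \langle p_k,\,\theta-\theta_k\rangle .
\]
```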
Q Zhao, T Ji, S Liang, W Yu - Multimedia Tools and Applications, 2024 - Springer
PCB defect detection is a necessary part of the PCB manufacturing process and needs to be repeated several times to ensure the quality of the board. However …
Z Yang, Q Xu, X Cao, Q Huang - IEEE transactions on pattern …, 2020 - ieeexplore.ieee.org
As an effective learning paradigm against insufficient training samples, multi-task learning (MTL) encourages knowledge sharing across multiple related tasks so as to improve the …
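Because the snippet stops mid-sentence, here is only a generic illustration of the knowledge-sharing idea it alludes to: a hard-parameter-sharing model with one shared encoder and per-task heads. The class name HardSharingMTL, the layer sizes, the two-task (classification plus regression) setup, and the loss weights are assumptions for this sketch, not details from the cited paper.

```python
import torch
import torch.nn as nn

class HardSharingMTL(nn.Module):
    """Hard parameter sharing: one shared encoder, one small head per task."""
    def __init__(self, in_dim=64, hidden=128, task_out_dims=(10, 1)):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.heads = nn.ModuleList(nn.Linear(hidden, d) for d in task_out_dims)

    def forward(self, x):
        h = self.shared(x)                     # representation shared across tasks
        return [head(h) for head in self.heads]

model = HardSharingMTL()
x = torch.randn(8, 64)
cls_logits, reg_pred = model(x)
# total loss = weighted sum of per-task losses (the weight 0.5 is a design choice)
loss = nn.functional.cross_entropy(cls_logits, torch.randint(0, 10, (8,))) \
       + 0.5 * nn.functional.mse_loss(reg_pred.squeeze(-1), torch.randn(8))
loss.backward()
```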
This study focuses on optimizing federated learning in heterogeneous data environments. We implement FedProx and a baseline algorithm (i.e., FedAvg) with advanced …
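As context for the comparison, the standard FedProx modification of a FedAvg local step adds a proximal term that penalizes drift from the current global model; a minimal PyTorch-style sketch follows. The function name fedprox_local_step, the default mu=0.01, and the argument layout are illustrative assumptions, not the study's code.

```python
import torch

def fedprox_local_step(model, global_params, batch, loss_fn, opt, mu=0.01):
    """One local FedProx update: local loss + (mu/2) * ||w - w_global||^2.
    With mu = 0 this reduces to a plain FedAvg local SGD step."""
    x, y = batch
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    # proximal term keeps client weights close to the current global model
    prox = sum(((p - g.detach()) ** 2).sum()
               for p, g in zip(model.parameters(), global_params))
    (loss + 0.5 * mu * prox).backward()
    opt.step()
```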
We propose a novel strategy for Neural Architecture Search (NAS) based on Bregman iterations. Starting from a sparse neural network our gradient-based one-shot algorithm …
K Bui, F Xue, F Park, Y Qi, J Xin - International Conference on Machine …, 2023 - Springer
As a popular channel pruning method for convolutional neural networks (CNNs), network slimming (NS) has a three-stage process: (1) it trains a CNN with ℓ1 regularization applied to …
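A minimal sketch of the first two of those stages, assuming the usual network-slimming setup in which the ℓ1 penalty acts on the BatchNorm scaling factors that serve as per-channel importance scores; the helper names, the value of lam, and the pruning threshold are illustrative choices, not values from the paper.

```python
import torch
import torch.nn as nn

def bn_l1_penalty(model, lam=1e-4):
    """Stage (1): l1 penalty on BatchNorm scaling factors (gamma),
    added to the task loss during training."""
    return lam * sum(m.weight.abs().sum()
                     for m in model.modules() if isinstance(m, nn.BatchNorm2d))

def channel_mask(model, threshold=1e-2):
    """Stage (2): mark channels whose |gamma| falls below a threshold for pruning."""
    return {name: (m.weight.abs() > threshold)
            for name, m in model.named_modules() if isinstance(m, nn.BatchNorm2d)}

# inside the training loop:  total_loss = task_loss + bn_l1_penalty(model)
# stage (3) then fine-tunes the pruned, slimmer network
```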