Z Leng, M Tan, C Liu, ED Cubuk, X Shi… - arXiv preprint arXiv …, 2022 - arxiv.org
Cross-entropy loss and focal loss are the most common choices when training deep neural networks for classification problems. Generally speaking, however, a good loss function can …
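The two losses the snippet names are standard: binary cross-entropy, and focal loss (Lin et al., 2017), which rescales cross-entropy by $(1-p_t)^\gamma$ to down-weight easy examples. A minimal NumPy sketch of both (illustrative only, not the paper's learned loss):

```python
import numpy as np

def cross_entropy(p, y):
    # binary cross-entropy: y in {0,1}, p is the predicted P(class = 1)
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

def focal_loss(p, y, gamma=2.0):
    # focal loss (Lin et al., 2017): scales CE by (1 - p_t)^gamma,
    # so confident, well-classified examples contribute little
    p_t = np.where(y == 1, p, 1 - p)   # probability of the true class
    return -((1 - p_t) ** gamma) * np.log(p_t)

p = np.array([0.9, 0.6, 0.1])  # predicted P(class = 1)
y = np.array([1, 1, 1])        # true labels
print(cross_entropy(p, y))
print(focal_loss(p, y))        # easy example (p = 0.9) is sharply down-weighted
```

With `gamma = 0` the focal loss reduces exactly to cross-entropy, which is why it is usually framed as a generalization of it.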
Model-agnostic meta-learning (MAML) is currently one of the dominant approaches to few-shot meta-learning. Despite its effectiveness, the optimization of MAML …
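MAML's structure is an inner adaptation step per task followed by an outer meta-update. A toy first-order sketch (an illustration of the general scheme, not this paper's method): tasks are 1-D regressions y = a·x with task-specific slope a, and the model is a single scalar weight so gradients are analytic.

```python
import numpy as np

rng = np.random.default_rng(0)

def grad(w, x, y):
    # gradient of the MSE 0.5 * mean((w*x - y)^2) with respect to w
    return np.mean((w * x - y) * x)

def sample_task():
    # a task fixes a slope a; calling draw() yields a batch from that task
    a = rng.uniform(0.5, 1.5)
    def draw(n=20):
        x = rng.normal(size=n)
        return x, a * x
    return draw

w = 0.0                   # meta-parameter
alpha, beta = 0.1, 0.05   # inner / outer learning rates
for _ in range(1000):
    draw = sample_task()
    x_s, y_s = draw()                        # support set: adapt to the task
    w_task = w - alpha * grad(w, x_s, y_s)   # inner-loop step
    x_q, y_q = draw()                        # query set: evaluate adaptation
    # first-order MAML: ignore second derivatives through w_task
    w -= beta * grad(w_task, x_q, y_q)

print(w)  # drifts toward the mean task slope (1.0 here)
```

Full MAML differentiates through the inner step (a second-order term); the first-order variant above drops it, which is one of the optimization issues the snippet alludes to.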
Although deep learning has made great progress in recent years, the exploding economic and environmental costs of training neural networks are becoming unsustainable. To …
We propose a meta-learning technique for offline discovery of physics-informed neural network (PINN) loss functions. We extend earlier works on meta-learning, and develop a …
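A PINN loss combines a physics residual over collocation points with boundary terms. As a hand-built stand-in (not the meta-learned loss the paper discovers), consider solving u′(x) = u(x), u(0) = 1 on [0, 1] with a polynomial ansatz u(x) = Σ c_k x^k; because residual and boundary terms are linear in c, minimizing the usual PINN loss reduces to least squares:

```python
import numpy as np

K = 6                               # polynomial coefficients c_0..c_5
xs = np.linspace(0.0, 1.0, 32)      # collocation points

A = np.zeros((len(xs), K))          # rows encode the residual u'(x) - u(x)
for k in range(K):
    deriv = k * xs**(k - 1) if k > 0 else np.zeros_like(xs)
    A[:, k] = deriv - xs**k

bc = np.zeros((1, K)); bc[0, 0] = 1.0   # boundary row: u(0) = c_0
w_bc = 10.0                              # boundary-term weight
rows = np.vstack([A, w_bc * bc])
rhs = np.concatenate([np.zeros(len(xs)), [w_bc * 1.0]])

# minimize mean(residual^2) + w_bc^2 * (u(0) - 1)^2 in closed form
c, *_ = np.linalg.lstsq(rows, rhs, rcond=None)
u1 = np.polyval(c[::-1], 1.0)       # approximate solution at x = 1
print(u1)                           # exact solution is exp(1) ≈ 2.718
```

A real PINN replaces the polynomial with a neural network and the closed-form solve with gradient descent via automatic differentiation; the composite loss has the same residual-plus-boundary shape.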
X Zhou, AK Qin, M Gong, KC Tan - IEEE Transactions on …, 2021 - ieeexplore.ieee.org
Automated construction of deep neural networks (DNNs) has become a research hotspot because a DNN's performance is heavily influenced by its architecture and …
B Gao, H Gouk, Y Yang… - … Conference on Machine …, 2022 - proceedings.mlr.press
Generalising robustly to distribution shift is a major challenge that is pervasive across most real-world applications of machine learning. A recent study highlighted that many advanced …
Few-shot object detection, the problem of modelling novel object detection categories with few training instances, is an emerging topic in the area of few-shot learning and object …
The choice of activation function can have a large effect on the performance of a neural network. While there have been some attempts to hand-engineer novel activation functions …
Recent studies have shown that the choice of activation function can significantly affect the performance of deep learning networks. However, the benefits of novel activation functions …
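One concrete example of a search-discovered activation is Swish, x·σ(βx) (Ramachandran et al., 2017), a smooth, non-monotonic alternative to ReLU. A minimal comparison (illustrative; not tied to any one of the papers above):

```python
import numpy as np

def relu(x):
    # standard rectified linear unit
    return np.maximum(0.0, x)

def swish(x, beta=1.0):
    # Swish (Ramachandran et al., 2017): x * sigmoid(beta * x);
    # with beta = 1 this is also known as SiLU
    return x / (1.0 + np.exp(-beta * x))

x = np.linspace(-3.0, 3.0, 7)
print(relu(x))
print(swish(x))   # smooth near zero, slightly negative for small negative x
```

Unlike ReLU, Swish is differentiable everywhere and lets small negative activations pass through, which is often cited as one reason for its empirical gains.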