With the urgent demand for generalized deep models, many large pre-trained models have been proposed, such as bidirectional encoder representations from transformers (BERT) and the vision transformer (ViT) …
L Rampášek, M Galkin, VP Dwivedi… - Advances in …, 2022 - proceedings.neurips.cc
We propose a recipe for building a general, powerful, scalable (GPS) graph Transformer with linear complexity and state-of-the-art results on a diverse set of benchmarks. Graph …
Network architecture plays a key role in deep learning-based computer vision systems. The widely used convolutional neural networks and transformers treat the image as a grid or …
Transformers have achieved superior performance in many natural language processing and computer vision tasks, which has also triggered great interest in the time series …
P Xu, X Zhu, DA Clifton - IEEE Transactions on Pattern Analysis …, 2023 - ieeexplore.ieee.org
The Transformer is a promising neural network learner and has achieved great success in various machine learning tasks. Thanks to the recent prevalence of multimodal applications …
X Xu, R Wang, CW Fu, J Jia - Proceedings of the IEEE/CVF …, 2022 - openaccess.thecvf.com
This paper presents a new solution for low-light image enhancement by collectively exploiting Signal-to-Noise-Ratio-aware transformers and convolutional models to …
Y Zhang, J Yan - The eleventh international conference on learning …, 2022 - openreview.net
Recently, many deep models have been proposed for multivariate time series (MTS) forecasting. In particular, Transformer-based models have shown great potential because …
F Yuan, Z Zhang, Z Fang - Pattern Recognition, 2023 - Elsevier
The Transformer network was originally proposed for natural language processing. Due to its powerful ability to represent long-range dependencies, it has been extended for …
Transformer, an attention-based encoder–decoder model, has already revolutionized the field of natural language processing (NLP). Inspired by such significant achievements, some …
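The snippets above all center on the Transformer's attention-based encoder–decoder design. As a minimal sketch of the core operation they share, here is scaled dot-product attention in plain NumPy; the function name and toy shapes are illustrative and not taken from any of the cited papers:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the core Transformer operation."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # scaled pairwise similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # attention-weighted sum of values

rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))  # 4 query tokens, dimension 8
K = rng.standard_normal((4, 8))  # 4 key tokens
V = rng.standard_normal((4, 8))  # 4 value tokens
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8): one output vector per query token
```

Because each output row is a convex combination of the value rows, the long-range dependency modeling mentioned in several snippets follows directly: every query token can attend to every other token in a single step, regardless of distance.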