Recently, state space models (SSMs) with efficient hardware-aware designs, i.e., the Mamba deep learning model, have shown great potential for long sequence modeling …
Mamba, a recent selective structured state space model, excels at long sequence modeling tasks. Mamba mitigates the modeling constraints of convolutional …
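For context across the Mamba-based entries below: the core of these models is a selective SSM whose step size and input/output matrices are functions of the input. A minimal sequential sketch of that recurrence, assuming illustrative weight names and shapes (the actual models fuse this loop into a hardware-aware parallel scan):

```python
import numpy as np

def softplus(z):
    return np.log1p(np.exp(z))

def selective_ssm(x, A, W_B, W_C, W_dt):
    """Sequential reference of a selective SSM scan (Mamba-style).

    x:    (T, D)  input sequence (T steps, D channels)
    A:    (D, N)  state-transition parameters (negative in practice)
    W_B:  (D, N)  projection making B input-dependent
    W_C:  (D, N)  projection making C input-dependent
    W_dt: (D, D)  projection for the per-channel step size
    """
    T, D = x.shape
    N = A.shape[1]
    h = np.zeros((D, N))                     # hidden state per channel
    y = np.empty((T, D))
    for t in range(T):
        xt = x[t]                            # (D,)
        dt = softplus(xt @ W_dt)             # (D,) input-dependent step size
        B = xt @ W_B                         # (N,) input-dependent input matrix
        C = xt @ W_C                         # (N,) input-dependent output matrix
        A_bar = np.exp(dt[:, None] * A)      # (D, N) ZOH discretization of A
        B_bar = dt[:, None] * B[None, :]     # (D, N) discretization of B
        h = A_bar * h + B_bar * xt[:, None]  # selective state update
        y[t] = h @ C                         # (D,) readout
    return y
```

The input-dependent dt, B, and C are what make the scan "selective"; with fixed parameters this reduces to a plain linear time-invariant SSM.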
In the realm of medical image segmentation, both CNN-based and Transformer-based models have been extensively explored. However, CNNs exhibit limitations in long-range …
A Behrouz, F Hashemi - Proceedings of the 30th ACM SIGKDD …, 2024 - dl.acm.org
Graph Neural Networks (GNNs) have shown promising potential in graph representation learning. The majority of GNNs define a local message-passing mechanism, propagating …
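As a reference point for the local mechanism this snippet contrasts against, here is a minimal message-passing layer with mean aggregation (a generic sketch; the function and weight names are assumptions, not this paper's model):

```python
import numpy as np

def message_passing_layer(H, edges, W_self, W_nbr):
    """One local message-passing step with mean aggregation.

    H:      (V, F) node features
    edges:  list of (src, dst) directed edges
    W_self, W_nbr: (F, F_out) weight matrices
    """
    V, F = H.shape
    agg = np.zeros_like(H)
    deg = np.zeros(V)
    for s, d in edges:
        agg[d] += H[s]                       # collect messages from in-neighbors
        deg[d] += 1
    agg /= np.maximum(deg, 1)[:, None]       # mean over incoming messages
    return np.maximum(H @ W_self + agg @ W_nbr, 0)  # ReLU node update
```

Each layer only mixes information across one hop, which is why stacking many such layers is needed to propagate signals between distant nodes.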
Transformers with linear attention allow for efficient parallel training but can simultaneously be formulated as an RNN with 2D (matrix-valued) hidden states, thus enjoying linear (with …
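The equivalence stated in this snippet fits in a few lines: causal (unnormalized) linear attention admits a recurrent form whose hidden state is a d_k x d_v matrix, so each step costs O(d_k * d_v) and the whole sequence is linear in its length. A minimal sketch under those assumptions:

```python
import numpy as np

def linear_attention_rnn(Q, K, V):
    """Causal linear attention computed as an RNN with a matrix-valued state.

    Q, K: (T, d_k)  (feature-mapped) queries and keys
    V:    (T, d_v)  values
    """
    T, d_k = Q.shape
    d_v = V.shape[1]
    S = np.zeros((d_k, d_v))           # 2D (matrix-valued) hidden state
    out = np.empty((T, d_v))
    for t in range(T):
        S = S + np.outer(K[t], V[t])   # rank-1 state update
        out[t] = Q[t] @ S              # readout: sum over s<=t of (q_t . k_s) v_s
    return out
```

The same output can be computed in parallel for training as (Q K^T masked causally) V, which is the form that allows efficient parallel training; the loop above is the RNN view used at inference time.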
Y Yang, Z Xing, L Zhu - arXiv preprint arXiv:2401.14168, 2024 - arxiv.org
Traditional convolutional neural networks have a limited receptive field, while transformer-based networks are mediocre at constructing long-term dependency from the perspective of …
Transformers have become one of the foundational architectures in point cloud analysis tasks due to their excellent global modeling ability. However, the attention mechanism has …
J Liu, H Yang, HY Zhou, Y Xi, L Yu, Y Yu… - arXiv preprint arXiv …, 2024 - arxiv.org
Accurate medical image segmentation demands the integration of multi-scale information, spanning from local features to global dependencies. However, it is challenging for existing …
Recent years have witnessed great progress in image restoration thanks to advancements in modern deep neural networks, e.g., Convolutional Neural Networks and …