Multi-head self-attention gated-dilated convolutional neural network for word sense disambiguation

CX Zhang, YL Zhang, XY Gao - IEEE Access, 2023 - ieeexplore.ieee.org
Word sense disambiguation (WSD) aims to determine the correct sense of an ambiguous word based
on its context. WSD is widely used in text classification, machine translation, and information …
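The abstract names two components, multi-head self-attention and gated dilated convolutions. As a rough illustration only (the layer sizes, sigmoid gating, and residual fusion below are assumptions, not the paper's architecture), a minimal PyTorch sketch of such a block might look like:

```python
# Hedged sketch: one way to combine multi-head self-attention with a gated
# dilated convolution over token embeddings. Illustrative only; not the
# architecture from the cited paper.
import torch
import torch.nn as nn

class GatedDilatedSelfAttention(nn.Module):
    def __init__(self, dim=256, heads=8, dilation=2, kernel=3):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        pad = dilation * (kernel - 1) // 2
        # Two parallel dilated convolutions: one carries content, one produces a gate.
        self.conv = nn.Conv1d(dim, dim, kernel, padding=pad, dilation=dilation)
        self.gate = nn.Conv1d(dim, dim, kernel, padding=pad, dilation=dilation)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):                       # x: (batch, seq_len, dim)
        a, _ = self.attn(x, x, x)               # global context via self-attention
        c = x.transpose(1, 2)                   # (batch, dim, seq_len) for Conv1d
        g = torch.sigmoid(self.gate(c))         # gate values in [0, 1]
        c = (self.conv(c) * g).transpose(1, 2)  # gated dilated local features
        return self.norm(x + a + c)             # residual fusion of both branches

# Usage: feed contextual token embeddings; a sense classifier would then pool
# the target word position from the output.
tokens = torch.randn(4, 32, 256)                # dummy batch of embeddings
out = GatedDilatedSelfAttention()(tokens)       # (4, 32, 256)
```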

Single-shot structured light projection profilometry with SwinConvUNet

W Lei, L Dunqiang, T Jiaqing… - Optical Engineering, 2022 - spiedigitallibrary.org
Structured light profilometry (SLP) is now widely utilized in noncontact three-dimensional
(3D) reconstruction due to its convenience in dynamic measurements. Compared with …

SADLN: Self-attention based deep learning network of integrating multi-omics data for cancer subtype recognition

Q Sun, L Cheng, A Meng, S Ge, J Chen, L Zhang… - Frontiers in …, 2023 - frontiersin.org
Integrating multi-omics data for cancer subtype recognition is an important task in
bioinformatics. Recently, deep learning has been applied to recognize the subtype of …

Extracting long‐term spatiotemporal characteristics of traffic flow using attention‐based convolutional transformer

AR Sattarzadeh, PN Pathirana… - IET Intelligent …, 2023 - Wiley Online Library
Predicting traffic flow is vital for optimizing transportation efficiency, reducing fuel
consumption, and minimizing commute times. While artificial intelligence tools have been …

COOL, a context outlooker, and its application to question answering and other natural language processing tasks

F Zhu, SK Ng, S Bressan - arXiv preprint arXiv:2204.09593, 2022 - arxiv.org
Vision outlooker improves the performance of vision transformers, which implement a self-
attention mechanism, by adding outlook attention, a form of local attention. In natural …
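As a rough illustration of the "local attention" idea the abstract refers to, the sketch below restricts each token's attention to a fixed window of neighbours. It is a deliberate simplification, not the VOLO or COOL implementation; the window size and scaling are assumptions.

```python
# Hedged sketch: minimal windowed (local) attention over a token sequence.
import torch
import torch.nn.functional as F

def local_attention(x, window=3):
    """x: (batch, seq_len, dim); each position attends only to a local window."""
    b, n, d = x.shape
    pad = window // 2
    xp = F.pad(x, (0, 0, pad, pad))                       # pad along the sequence axis
    # Gather a window of neighbours for every position: (batch, seq_len, window, dim)
    idx = torch.arange(n).unsqueeze(1) + torch.arange(window).unsqueeze(0)
    neigh = xp[:, idx]
    # Scaled dot-product between each token and its own neighbourhood.
    scores = torch.einsum('bnd,bnwd->bnw', x, neigh) / d ** 0.5
    weights = scores.softmax(dim=-1)
    return torch.einsum('bnw,bnwd->bnd', weights, neigh)

out = local_attention(torch.randn(2, 16, 64), window=5)  # (2, 16, 64)
```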

Crossing Linguistic Barriers: A Hybrid Attention Framework for Chinese-Arabic Machine Translation

C Zhao, A Hamdulla - 2024 International Conference on …, 2024 - ieeexplore.ieee.org
This study proposes an innovative method for Chinese-Arabic machine translation (zh-
ar_MT), integrating adaptive local attention mechanisms (ALAM) with dynamic global …

Unsupervised parallel tacotron non-autoregressive and controllable text-to-speech

I Elias, B Chun, J Shen, Y Jia, Y Zhang… - US Patent 11,823,656, 2023 - Google Patents
A method for training a non-autoregressive TTS model includes obtaining a sequence
representation of an encoded text sequence concatenated with a variational embedding …
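A minimal sketch of the concatenation step described in this snippet, assuming an utterance-level variational embedding that is broadcast along the time axis; all names and dimensions below are illustrative, not taken from the patent.

```python
# Hedged sketch: concatenating an encoded text sequence with a per-utterance
# variational embedding, broadcast across time steps.
import torch

encoded_text = torch.randn(8, 120, 512)    # (batch, text_length, encoder_dim)
variational_z = torch.randn(8, 32)          # one latent vector per utterance

z_per_step = variational_z.unsqueeze(1).expand(-1, encoded_text.size(1), -1)
sequence_representation = torch.cat([encoded_text, z_per_step], dim=-1)
print(sequence_representation.shape)        # torch.Size([8, 120, 544])
```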