Transformers in time-series analysis: A tutorial

S Ahmed, IE Nielsen, A Tripathi, S Siddiqui… - Circuits, Systems, and …, 2023 - Springer
Transformer architectures have widespread applications, particularly in Natural Language
Processing and Computer Vision. Recently, Transformers have been employed in various …

Position information in transformers: An overview

P Dufter, M Schmitt, H Schütze - Computational Linguistics, 2022 - direct.mit.edu
Transformers are arguably the main workhorse in recent natural language processing
research. By definition, a Transformer is invariant with respect to reordering of the input …
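
The invariance claim in this snippet is easy to verify directly: with no position embeddings added, scaled dot-product self-attention is permutation-equivariant, so reordering the input tokens merely reorders the outputs the same way. A minimal NumPy sketch (illustrative code, not from the paper; all names are hypothetical):

```python
import numpy as np

def self_attention(x, wq, wk, wv):
    """Single-head scaled dot-product self-attention over the rows of x."""
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 8))                    # 5 tokens, model dim 8
wq, wk, wv = (rng.normal(size=(8, 8)) for _ in range(3))
perm = rng.permutation(5)

# Without position information, permuting the inputs just permutes the outputs:
assert np.allclose(self_attention(x, wq, wk, wv)[perm],
                   self_attention(x[perm], wq, wk, wv))
```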

Findings of the IWSLT 2022 Evaluation Campaign

A Anastasopoulos, L Barrault, L Bentivogli… - Proceedings of the 19th …, 2022 - cris.fbk.eu
The evaluation campaign of the 19th International Conference on Spoken Language
Translation featured eight shared tasks: (i) Simultaneous speech translation, (ii) Offline …

Uni6D: A unified CNN framework without projection breakdown for 6D pose estimation

X Jiang, D Li, H Chen, Y Zheng… - Proceedings of the …, 2022 - openaccess.thecvf.com
As RGB-D sensors become more affordable, using RGB-D images to obtain high-accuracy
6D pose estimation results becomes a better option. State-of-the-art approaches typically …

Text style transfer: A review and experimental evaluation

Z Hu, RKW Lee, CC Aggarwal, A Zhang - ACM SIGKDD Explorations …, 2022 - dl.acm.org
The stylistic properties of text have intrigued computational linguistics researchers in recent
years. Specifically, researchers have investigated the text style transfer task (TST), which …

Benchmarking large language models on controllable generation under diversified instructions

Y Chen, B Xu, Q Wang, Y Liu, Z Mao - Proceedings of the AAAI …, 2024 - ojs.aaai.org
While large language models (LLMs) have exhibited impressive instruction-following
capabilities, it is still unclear whether and to what extent they can respond to explicit …

Controllable abstractive dialogue summarization with sketch supervision

CS Wu, L Liu, W Liu, P Stenetorp, C Xiong - arXiv preprint arXiv …, 2021 - arxiv.org
In this paper, we aim to improve abstractive dialogue summarization quality and, at the same
time, enable granularity control. Our model has two primary components and stages: 1) a …

Length control in abstractive summarization by pretraining information selection

Y Liu, Q Jia, K Zhu - Proceedings of the 60th Annual Meeting of …, 2022 - aclanthology.org
Previous length-controllable summarization models mostly control lengths at the decoding
stage, whereas the encoding or the selection of information from the source document is not …
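
The contrast drawn here is between controlling length while decoding versus selecting information at encoding time. The simplest decoding-stage control is a hard token budget, as in this generic sketch (hypothetical `step_fn` interface; not the paper's method, which instead pretrains the information-selection step):

```python
def greedy_decode(step_fn, bos_id, eos_id, length_budget):
    """step_fn(prefix_token_ids) -> next token id; illustrative signature."""
    tokens = [bos_id]
    while len(tokens) - 1 < length_budget:
        nxt = step_fn(tokens)
        if nxt == eos_id:   # model may stop early on its own
            break
        tokens.append(nxt)
    return tokens[1:]       # at most `length_budget` summary tokens
```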

SHAPE: Shifted absolute position embedding for transformers

S Kiyono, S Kobayashi, J Suzuki, K Inui - arXiv preprint arXiv:2109.05644, 2021 - arxiv.org
Position representation is crucial for building position-aware representations in
Transformers. Existing position representations suffer from a lack of generalization to test …
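
The title names the paper's core idea: shift every absolute position index by a single random offset during training, so the model cannot memorize absolute positions and generalizes to unseen lengths. A hedged sketch of that idea with standard sinusoidal embeddings (illustrative names and parameters, not the authors' code):

```python
import numpy as np

def sinusoidal_embedding(positions, d_model):
    """Standard sinusoidal absolute position embeddings."""
    inv_freq = 1.0 / (10000 ** (np.arange(0, d_model, 2) / d_model))
    angles = positions[:, None] * inv_freq[None, :]
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)

def shifted_positions(seq_len, max_shift, rng, training=True):
    """One random global shift per sequence in training; zero at test time."""
    shift = rng.integers(0, max_shift + 1) if training else 0
    return np.arange(seq_len) + shift

rng = np.random.default_rng(0)
pos = shifted_positions(seq_len=16, max_shift=100, rng=rng)
emb = sinusoidal_embedding(pos, d_model=64)  # shape (16, 64), re-shifted each batch
```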

An abstractive text summarization technique using transformer model with self-attention mechanism

S Kumar, A Solanki - Neural Computing and Applications, 2023 - Springer
Creating a summarized version of a text document that still conveys precise meaning is an
incredibly complex endeavor in natural language processing (NLP). Abstract text …
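
For readers who want to experiment with abstractive summarization using a pretrained Transformer, the Hugging Face pipeline API is a quick entry point (illustrative usage only, unrelated to the specific model proposed in this paper):

```python
from transformers import pipeline

# Loads a default pretrained summarization model on first use.
summarizer = pipeline("summarization")

article = (
    "Replace this placeholder with the long input document "
    "you would like to condense into a short abstractive summary."
)
print(summarizer(article, max_length=60, min_length=20, do_sample=False))
```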