Transformers are arguably the main workhorse in recent natural language processing research. By definition, a Transformer is invariant with respect to reordering of the input …
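To make the reordering claim concrete, the minimal numpy sketch below (toy sizes and random weights are illustrative assumptions, not drawn from the snippet's paper) checks that single-head self-attention without positional information is permutation-equivariant: permuting the input rows permutes the output rows in exactly the same way, so the layer by itself cannot distinguish input orderings.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Single-head scaled dot-product self-attention, no positional encoding."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V

rng = np.random.default_rng(0)
n, d = 5, 8                                # toy sequence length and model dim
X = rng.normal(size=(n, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

P = np.eye(n)[rng.permutation(n)]          # random permutation matrix
out = self_attention(X, Wq, Wk, Wv)
out_perm = self_attention(P @ X, Wq, Wk, Wv)

# Permuting the input permutes the output identically: attn(PX) == P attn(X),
# which is why positional information must be injected separately.
assert np.allclose(out_perm, P @ out)
```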
The evaluation campaign of the 19th International Conference on Spoken Language Translation featured eight shared tasks: (i) Simultaneous speech translation, (ii) Offline …
As RGB-D sensors become more affordable, using RGB-D images for high-accuracy 6D pose estimation becomes an increasingly attractive option. State-of-the-art approaches typically …
The stylistic properties of text have intrigued computational linguistics researchers in recent years. Specifically, researchers have investigated the text style transfer (TST) task, which …
While large language models (LLMs) have exhibited impressive instruction-following capabilities, it is still unclear whether and to what extent they can respond to explicit …
In this paper, we aim to improve abstractive dialogue summarization quality and, at the same time, enable granularity control. Our model has two primary components and stages: 1) a …
Y Liu, Q Jia, K Zhu - Proceedings of the 60th Annual Meeting of …, 2022 - aclanthology.org
Previous length-controllable summarization models mostly control lengths at the decoding stage, whereas the encoding or the selection of information from the source document is not …
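For contrast with the snippet's point, decoding-stage length control can be as simple as masking the end-of-sequence token until a minimum length is reached and truncating at a maximum length; nothing about how the encoder selects information from the source changes. The sketch below is a generic illustration of that idea, not the cited model: the toy logit function is an assumed stand-in for a real decoder.

```python
import numpy as np

EOS, VOCAB = 0, 50   # assumed toy vocabulary; token 0 ends the sequence

def toy_next_token_logits(prefix, rng):
    """Stand-in for a trained decoder: random logits that drift toward EOS."""
    logits = rng.normal(size=VOCAB)
    logits[EOS] += 0.2 * len(prefix)       # longer prefix -> EOS more likely
    return logits

def length_controlled_greedy(min_len, max_len, seed=0):
    rng, tokens = np.random.default_rng(seed), []
    while len(tokens) < max_len:           # hard upper bound on length
        logits = toy_next_token_logits(tokens, rng)
        if len(tokens) < min_len:
            logits[EOS] = -np.inf          # too short: forbid stopping
        token = int(np.argmax(logits))
        if token == EOS:
            break
        tokens.append(token)
    return tokens                          # length lands in [min_len, max_len]

print(len(length_controlled_greedy(min_len=10, max_len=20)))
```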
Position representation is crucial for making Transformers position-aware. Existing position representations suffer from a lack of generalization to test …
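As one concrete reference point for the generalization issue raised here, the fixed sinusoidal encoding of the original Transformer is a closed-form function of the position index, so it can be evaluated at positions never seen in training, whereas a learned absolute embedding table simply has no entries beyond its training length. A minimal sketch with illustrative dimensions (assumes an even model dimension):

```python
import numpy as np

def sinusoidal_encoding(num_positions, d_model):
    """Fixed sinusoidal position encoding (Vaswani et al., 2017)."""
    positions = np.arange(num_positions)[:, None]        # (n, 1)
    dims = np.arange(0, d_model, 2)[None, :]             # (1, d_model/2)
    angles = positions / (10000.0 ** (dims / d_model))
    enc = np.zeros((num_positions, d_model))
    enc[:, 0::2] = np.sin(angles)                        # even dims: sine
    enc[:, 1::2] = np.cos(angles)                        # odd dims: cosine
    return enc

# Test-time sequences can be longer than anything seen in training, and the
# encoding of the shared prefix positions is identical.
train_enc = sinusoidal_encoding(128, 64)
test_enc = sinusoidal_encoding(4096, 64)
assert np.allclose(train_enc, test_enc[:128])
```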
S Kumar, A Solanki - Neural Computing and Applications, 2023 - Springer
Creating a summarized version of a text document that still conveys precise meaning is an incredibly complex endeavor in natural language processing (NLP). Abstract text …