The MS MARCO ranking dataset has been widely used for training deep learning models for IR tasks, achieving considerable effectiveness in diverse zero-shot scenarios. However, this …
Recent advances in language representation using neural networks have made it viable to transfer the learned internal states of large pretrained language models (LMs) to …
In natural language processing (NLP), there is a need for more resources in Portuguese, since much of the data used in the state-of-the-art research is in other languages. In this …
An effective method for cross-lingual transfer is to fine-tune a bilingual or multilingual model on a supervised dataset in one language and evaluate it on another language in a zero …
Two sentences can be related in many different ways. Distinct tasks in natural language processing aim to identify different semantic relations between sentences. We developed …
H Gonçalo Oliveira - Progress in Artificial Intelligence: 20th EPIA …, 2021 - Springer
Despite their different applications, transformer-based language models, like BERT and GPT, learn about language by predicting missing parts of text. BERT is pretrained with Masked …
In this article we present the AIA-BDE corpus, whose main goal is the evaluation of systems that seek to match information needs expressed in …
This thesis is concerned with the identification of semantic equivalence between pairs of natural language sentences, by studying and computing models to address Natural …
In natural language generation, arbitrary textual style transfer models aim to rewrite a text using any new set of …