Textual entailment for event argument extraction: Zero-and few-shot with multi-source learning

O Sainz, I Gonzalez-Dios, OL de Lacalle, B Min… - arXiv preprint arXiv …, 2022 - arxiv.org
Recent work has shown that NLP tasks such as Relation Extraction (RE) can be recast as
Textual Entailment tasks using verbalizations, with strong performance in zero-shot and few …

WannaDB: Ad-hoc SQL Queries over Text Collections

B Hättasch, JM Bodensohn, L Vogel, M Urban… - BTW 2023, 2023 - dl.gi.de
Abstract In this paper, we propose a new system called WannaDB that allows
users to interactively perform structured explorations of text collections in an ad-hoc manner …

Event extraction in basque: Typologically motivated cross-lingual transfer-learning analysis

M Zubillaga, O Sainz, A Estarrona… - arXiv preprint arXiv …, 2024 - arxiv.org
Cross-lingual transfer-learning is widely used in Event Extraction for low-resource
languages and involves a Multilingual Language Model that is trained on a source language …

Improving Recall of Large Language Models: A Model Collaboration Approach for Relational Triple Extraction

Z Ding, W Huang, J Liang, D Yang, Y Xiao - arXiv preprint arXiv …, 2024 - arxiv.org
Relation triple extraction, which outputs a set of triples from long sentences, plays a vital role
in knowledge acquisition. Large language models can accurately extract triples from simple …

What do language models know about word senses? Zero-shot WSD with language models and domain inventories

O Sainz, OL de Lacalle, E Agirre, G Rigau - arXiv preprint arXiv …, 2023 - arxiv.org
Language Models are at the core of almost any Natural Language Processing system
nowadays. One of their particularities is their contextualized representations, a game …

Few-shot information extraction is here: Pre-train, prompt and entail

E Agirre - Proceedings of the 45th International ACM SIGIR …, 2022 - dl.acm.org
Deep Learning has made tremendous progress in Natural Language Processing (NLP),
where large pre-trained language models (PLM) fine-tuned on the target task have become …

Improving and simplifying template-based named entity recognition

M Kondragunta, O Perez-de-Viñaspre… - Proceedings of the 17th …, 2023 - aclanthology.org
With the rise of larger language models, researchers started exploiting them by reformulating
downstream tasks as language modeling tasks using prompts. In this work, we convert the …

Entailment-based Task Transfer for Catalan Text Classification in Small Data Regimes

IB de la Peña, BC Figueras, M Villegas… - … del Lenguaje Natural, 2023 - journal.sepln.org
This study investigates the application of a state-of-the-art zero-shot and few-shot natural
language processing (NLP) technique for text classification tasks in Catalan, a moderately …

Cross-Lingual Temporal and Modal Dependency Parsing

J Yao - 2022 - search.proquest.com
Abstract The goal of Natural Language Understanding (NLU) is to develop systems that
have the capability to understand the meaning of human language. Whether an event …

Democratizing Information Access through Low Overhead Systems

B Hättasch - 2024 - tuprints.ulb.tu-darmstadt.de
Despite its importance, accessing information in storage systems or raw data is challenging
or impossible for most people due to the sheer amount and heterogeneity of data as well as …