We find that the performance of state-of-the-art models on Natural Language Inference (NLI) and Reading Comprehension (RC) analysis/stress sets can be highly unstable. This raises …
In this paper, we introduce SciGen, a new challenge dataset for the task of reasoning-aware data-to-text generation consisting of tables from scientific articles and their corresponding …
NS Moosavi, M de Boer, PA Utama… - arXiv preprint arXiv …, 2020 - arxiv.org
Existing NLP datasets contain various biases, and models tend to quickly learn those biases, which in turn limits their robustness. Existing approaches to improve robustness …
Recent advances in neural network architectures and large-scale language model pretraining have enabled Natural Language Understanding (NLU) systems to surpass …
The amount of information published on the Internet is growing steadily. Accessing this vast knowledge more effectively is a fundamental goal of many tasks in natural language …
The de facto paradigm of developing NLP models requires collecting a dataset, training the model on the training set assuming all the examples are iid (independent and identically …