BERT models for Arabic text classification: a systematic review

AS Alammary - Applied Sciences, 2022 - mdpi.com
Bidirectional Encoder Representations from Transformers (BERT) has gained increasing
attention from researchers and practitioners as it has proven to be an invaluable technique …

Jais and jais-chat: Arabic-centric foundation and instruction-tuned open generative large language models

N Sengupta, SK Sahu, B Jia, S Katipomu, H Li… - arXiv preprint arXiv …, 2023 - arxiv.org
We introduce Jais and Jais-chat, new state-of-the-art Arabic-centric foundation and
instruction-tuned open generative large language models (LLMs). The models are based on …

AraT5: Text-to-text transformers for Arabic language generation

EMB Nagoudi, AR Elmadany… - arXiv preprint arXiv …, 2021 - arxiv.org
Transfer learning with a unified Transformer framework (T5) that converts all language
problems into a text-to-text format was recently proposed as a simple and effective transfer …

BERT for Arabic topic modeling: An experimental study on BERTopic technique

A Abuzayed, H Al-Khalifa - Procedia computer science, 2021 - Elsevier
Topic modeling is an unsupervised machine learning technique for finding abstract topics in
a large collection of documents. It helps in organizing, understanding and summarizing …

ArAIEval shared task: Persuasion techniques and disinformation detection in Arabic text

M Hasanain, F Alam, H Mubarak, S Abdaljalil… - arXiv preprint arXiv …, 2023 - arxiv.org
We present an overview of the ArAIEval shared task, organized as part of the first ArabicNLP
2023 conference co-located with EMNLP 2023. ArAIEval offers two tasks over Arabic text: (i) …

Emojis as anchors to detect Arabic offensive language and hate speech

H Mubarak, S Hassan, SA Chowdhury - Natural Language …, 2023 - cambridge.org
We introduce a generic, language-independent method to collect a large percentage of
offensive and hate tweets regardless of their topics or genres. We harness the extralinguistic …

A comprehensive review on transformers models for text classification

R Kora, A Mohammed - 2023 International Mobile, Intelligent …, 2023 - ieeexplore.ieee.org
The rapid progress in deep learning has propelled transformer-based models to the
forefront, establishing them as leading solutions for multiple NLP tasks. These tasks span …

WojoodNER 2023: The First Arabic Named Entity Recognition Shared Task

M Jarrar, M Abdul-Mageed, M Khalilia… - arXiv preprint arXiv …, 2023 - arxiv.org
We present WojoodNER-2023, the first Arabic Named Entity Recognition (NER) Shared
Task. The primary focus of WojoodNER-2023 is on Arabic NER, offering novel NER datasets …

DziriBERT: a pre-trained language model for the Algerian dialect

A Abdaoui, M Berrimi, M Oussalah… - arXiv preprint arXiv …, 2021 - arxiv.org
Pre-trained transformers are now the de facto models in Natural Language Processing given
their state-of-the-art results in many tasks and languages. However, most of the current …

EnhancedBERT: A feature-rich ensemble model for Arabic word sense disambiguation with statistical analysis and optimized data collection

S Kaddoura, R Nassar - Journal of King Saud University-Computer and …, 2024 - Elsevier
Accurate assignment of meaning to a word based on its context, known as Word Sense
Disambiguation (WSD), remains challenging across languages. Extensive research aims to …