Contextual semantic embeddings based on fine-tuned AraBERT model for Arabic text multi-class categorization

F El-Alami, SO El Alaoui, NE Nahnahi - Journal of King Saud University …, 2022 - Elsevier
Although pre-trained word embedding models have advanced a wide range of natural
language processing applications, they ignore contextual information and meaning …

Leveraging the meta-embedding for text classification in a resource-constrained language

MR Hossain, MM Hoque, N Siddique - Engineering Applications of Artificial …, 2023 - Elsevier
This paper proposes an intelligent text classification framework for a resource-constrained
language such as Bengali, a task considered challenging due to the lack of standard …

Arabic text classification based on word and document embeddings

A El Mahdaouy, E Gaussier, SO El Alaoui - Proceedings of the …, 2017 - Springer
Recently, word embeddings have been introduced as a major breakthrough in
Natural Language Processing (NLP) to learn viable representations of linguistic items based …

Bengali text document categorization based on very deep convolution neural network

MR Hossain, MM Hoque, N Siddique… - Expert Systems with …, 2021 - Elsevier
In recent years, the amount of digital text content and documents in the Bengali language has
increased enormously on online platforms due to effortless access to the Internet via …

On the class separability of contextual embeddings representations – or “the classifier does not matter when the (text) representation is so good!”

CMV de Andrade, FM Belem, W Cunha… - Information Processing …, 2023 - Elsevier
The literature has not fully explained why contextual (e.g., BERT-based)
representations are so successful at improving the effectiveness of some Natural Language …

How to fine-tune BERT for text classification?

C Sun, X Qiu, Y Xu, X Huang - … : 18th China National Conference, CCL 2019 …, 2019 - Springer
Language model pre-training has proven useful for learning universal
language representations. As a state-of-the-art pre-trained language model, BERT …

An enhanced neural word embedding model for transfer learning

M Kowsher, MSI Sobuj, MF Shahriar, NJ Prottasha… - Applied Sciences, 2022 - mdpi.com
Due to the expansion of data generation, more and more natural language processing
(NLP) tasks need to be solved. For this, word representation plays a vital role …

A benchmark for evaluating Arabic contextualized word embedding models

A Elnagar, S Yagi, Y Mansour, L Lulu… - Information Processing & …, 2023 - Elsevier
Word embeddings, which represent words as numerical vectors in a high-dimensional
space, are contextualized by generating a unique vector representation for each sense of a …

A comparative study on word embeddings in deep learning for text classification

C Wang, P Nulty, D Lillis - … of the 4th International Conference on Natural …, 2020 - dl.acm.org
Word embeddings act as an important component of deep models for providing input
features in downstream language tasks, such as sequence labelling and text classification …

Text classification based on convolutional neural networks and word embedding for low-resource languages: Tigrinya

A Fesseha, S Xiong, ED Emiru, M Diallo, A Dahou - Information, 2021 - mdpi.com
This article studies convolutional neural networks for Tigrinya (also referred to as Tigrigna),
a Semitic language spoken in Eritrea and northern Ethiopia. Tigrinya is a …