A survey of named entity recognition research

L Liu, D Wang - Journal of the China Society for Scientific and Technical Information, 2018 - qbxb.istic.ac.cn
Abstract: Named entity recognition has long been an important research task in information extraction,
natural language processing, and related fields. With new advances in machine learning and the rise of digital humanities research, event knowledge and entity knowledge have become increasingly important …

A survey on Arabic named entity recognition: Past, recent advances, and future trends

X Qu, Y Gu, Q Xia, Z Li, Z Wang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
As more and more Arabic texts emerge on the Internet, extracting important information
from these Arabic texts is especially useful. As a fundamental technology, named entity …

ARBERT & MARBERT: Deep bidirectional transformers for Arabic

M Abdul-Mageed, AR Elmadany… - arXiv preprint arXiv …, 2020 - arxiv.org
Pre-trained language models (LMs) are currently integral to many natural language
processing systems. Although multilingual LMs were also introduced to serve many …

AraVec: A set of Arabic word embedding models for use in Arabic NLP

AB Soliman, K Eissa, SR El-Beltagy - Procedia Computer Science, 2017 - Elsevier
Advancements in neural networks have led to developments in fields like computer vision,
speech recognition and natural language processing (NLP). One of the most influential …

CAMeL tools: An open source python toolkit for Arabic natural language processing

O Obeid, N Zalmout, S Khalifa, D Taji… - Proceedings of the …, 2020 - aclanthology.org
Abstract We present CAMeL Tools, a collection of open-source tools for Arabic natural
language processing in Python. CAMeL Tools currently provides utilities for pre-processing …

A survey of automatic text summarization: Progress, process and challenges

MF Mridha, AA Lima, K Nur, SC Das, M Hasan… - IEEE …, 2021 - ieeexplore.ieee.org
With the evolution of the Internet and multimedia technology, the amount of text data has
increased exponentially. This text volume is a precious source of information and knowledge …

Pre-training BERT on Arabic tweets: Practical considerations

A Abdelali, S Hassan, H Mubarak, K Darwish… - arXiv preprint arXiv …, 2021 - arxiv.org
Pretraining Bidirectional Encoder Representations from Transformers (BERT) for
downstream NLP tasks is a non-trivial task. We pretrained 5 BERT models that differ in the …

Aligning cross-lingual entities with multi-aspect information

HW Yang, Y Zou, P Shi, W Lu, J Lin, X Sun - arXiv preprint arXiv …, 2019 - arxiv.org
Multilingual knowledge graphs (KGs), such as YAGO and DBpedia, represent entities in
different languages. The task of cross-lingual entity alignment is to match entities in a source …

Multi-task cross-lingual sequence tagging from scratch

Z Yang, R Salakhutdinov, W Cohen - arXiv preprint arXiv:1603.06270, 2016 - arxiv.org
We present a deep hierarchical recurrent neural network for sequence tagging. Given a
sequence of words, our model employs deep gated recurrent units on both character and …

CharNER: Character-level named entity recognition

O Kuru, OA Can, D Yuret - Proceedings of COLING 2016, the 26th …, 2016 - aclanthology.org
We describe and evaluate a character-level tagger for language-independent Named Entity
Recognition (NER). Instead of words, a sentence is represented as a sequence of …