ViT5: Pretrained text-to-text transformer for Vietnamese language generation

L Phan, H Tran, H Nguyen, TH Trinh - arXiv preprint arXiv:2205.06457, 2022 - arxiv.org
We present ViT5, a pretrained Transformer-based encoder-decoder model for the
Vietnamese language. With T5-style self-supervised pretraining, ViT5 is trained on a large …

BARTpho: Pre-trained sequence-to-sequence models for Vietnamese

NL Tran, DM Le, DQ Nguyen - arXiv preprint arXiv:2109.09701, 2021 - arxiv.org
We present BARTpho with two versions, BARTpho-syllable and BARTpho-word, which are
the first public large-scale monolingual sequence-to-sequence models pre-trained for …

Artificial Intelligence and Infectious Disease Imaging

WT Chu, SMS Reza, JT Anibal, A Landa… - The Journal of …, 2023 - academic.oup.com
The mass production of the graphics processing unit and the coronavirus disease 2019
(COVID-19) pandemic have provided the means and the motivation, respectively, for rapid …

Vietnamese text summarization based on neural network models

KN Lam, TT Do, NHT Pham, J Kalita - … on Artificial Intelligence and Big Data …, 2021 - Springer
Text summarization produces a shortened or condensed version of input text highlighting its
central ideas. Generating text summarization manually takes time and effort. This paper …

Indonesian News Text Summarization Using MBART Algorithm

RH Astuti, M Muljono, S Sutriawan - Scientific Journal of …, 2024 - journal.unnes.ac.id
Purpose: Technology advancements have led to the production of a large amount of textual
data. There are numerous locations where one can find textual information sources …