Adversarial Machine Learning in the Context of Network Security: Challenges and Solutions

M Khan, L Ghafoor - Journal of Computational Intelligence …, 2024 - thesciencebrigade.com
With the increasing sophistication of cyber threats, the integration of machine learning (ML)
techniques in network security has become imperative for detecting and mitigating evolving …

A contrastive cross-channel data augmentation framework for aspect-based sentiment analysis

B Wang, L Ding, Q Zhong, X Li, D Tao - arXiv preprint arXiv:2204.07832, 2022 - arxiv.org
Aspect-based sentiment analysis (ABSA) is a fine-grained sentiment analysis task that
focuses on detecting the sentiment polarity towards a given aspect in a sentence. However, it is …
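The entry only names the ABSA task, so here is a minimal illustration of the kind of sentence/aspect/polarity triples it operates on. The data below is hypothetical and is not taken from the cited paper.

```python
# Toy illustration of ABSA inputs and outputs (hypothetical data, not from the paper):
# each instance pairs one sentence with one aspect term and its sentiment polarity.
examples = [
    {"sentence": "The battery life is great but the screen scratches easily.",
     "aspect": "battery life", "polarity": "positive"},
    {"sentence": "The battery life is great but the screen scratches easily.",
     "aspect": "screen", "polarity": "negative"},
]

for ex in examples:
    print(f"{ex['aspect']!r:>16} -> {ex['polarity']}")
```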

Quantum Computing and AI in the Cloud

H Padmanaban - Journal of Computational Intelligence and …, 2024 - thesciencebrigade.com
The intersection of quantum computing and artificial intelligence (AI) within the cloud
environment represents a paradigm shift in the capabilities of computational technologies …

Prompt-learning for cross-lingual relation extraction

C Hsu, C Zan, L Ding, L Wang, X Wang… - … Joint Conference on …, 2023 - ieeexplore.ieee.org
Relation Extraction (RE) is a crucial task in Information Extraction, which entails predicting
relationships between entities within a given sentence. However, extending pre-trained RE …
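The snippet describes relation extraction only at a high level; the sketch below is a hedged illustration of a cloze-style prompt for RE. The template, entities, and candidate label set are hypothetical and are not taken from the cited paper.

```python
# Illustrative cloze-style prompt for relation extraction (hypothetical template and labels).
sentence = "Marie Curie was born in Warsaw."
head, tail = "Marie Curie", "Warsaw"
candidate_labels = ["place_of_birth", "employer", "no_relation"]

# A masked-language-model prompt asks for the relation between the two entities;
# a verbalizer would map the predicted token back to one of the candidate labels.
prompt = f"{sentence} The relation between {head} and {tail} is [MASK]."
print(prompt)
print("candidate labels:", candidate_labels)
```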

On the complementarity between pre-training and back-translation for neural machine translation

X Liu, L Wang, DF Wong, L Ding, LS Chao… - arXiv preprint arXiv …, 2021 - arxiv.org
Pre-training (PT) and back-translation (BT) are two simple and powerful methods to utilize
monolingual data for improving the model performance of neural machine translation (NMT) …
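As a rough sketch of how back-translation turns monolingual data into training pairs (not the cited paper's implementation), the stand-in function translate_tgt_to_src below is hypothetical; in practice it would be a trained target-to-source translation model.

```python
# Minimal back-translation sketch: monolingual target sentences are paired with
# machine-generated sources to form synthetic parallel data for src->tgt training.

def translate_tgt_to_src(tgt_sentence: str) -> str:
    # Placeholder for a trained target->source NMT model (hypothetical here).
    return "<synthetic source for: " + tgt_sentence + ">"

monolingual_tgt = [
    "Das Wetter ist heute schön.",
    "Ich lese gern Bücher.",
]

synthetic_bitext = [(translate_tgt_to_src(t), t) for t in monolingual_tgt]
for src, tgt in synthetic_bitext:
    print(src, "|||", tgt)
```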

Explainability-based mix-up approach for text data augmentation

S Kwon, Y Lee - ACM transactions on knowledge discovery from data, 2023 - dl.acm.org
Text augmentation is a strategy for increasing the diversity of training examples without
explicitly collecting new data. Owing to the efficiency and effectiveness of text augmentation …
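The sketch below shows plain mix-up applied to sentence embeddings and one-hot labels; it is a generic illustration and deliberately omits the explainability-based pair selection that the cited paper contributes.

```python
# Generic mix-up for text: interpolate sentence embeddings and labels with a
# Beta-sampled coefficient (illustrative only; no explainability weighting here).
import numpy as np

rng = np.random.default_rng(0)

def mixup(emb_a, emb_b, label_a, label_b, alpha=0.2):
    lam = rng.beta(alpha, alpha)                       # interpolation coefficient
    mixed_emb = lam * emb_a + (1 - lam) * emb_b        # mixed sentence embedding
    mixed_label = lam * label_a + (1 - lam) * label_b  # mixed (soft) label
    return mixed_emb, mixed_label

emb_a, emb_b = rng.normal(size=8), rng.normal(size=8)
label_a, label_b = np.array([1.0, 0.0]), np.array([0.0, 1.0])
mixed_emb, mixed_label = mixup(emb_a, emb_b, label_a, label_b)
print(mixed_label)
```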

Improving neural machine translation by denoising training

L Ding, K Peng, D Tao - arXiv preprint arXiv:2201.07365, 2022 - arxiv.org
We present a simple and effective pretraining strategy, Denoising Training (DoT), for
neural machine translation. Specifically, we update the model parameters with source- and …
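The snippet is truncated, so the sketch below only illustrates generic sequence denoising (random token masking, with the original sequence as the reconstruction target); it should not be read as the paper's exact DoT recipe.

```python
# Generic denoising illustration: corrupt a token sequence and keep the original
# as the reconstruction target (not the cited paper's exact procedure).
import random

random.seed(0)

def corrupt(tokens, mask_token="<mask>", mask_prob=0.3):
    return [mask_token if random.random() < mask_prob else tok for tok in tokens]

original = "we present a simple and effective pretraining strategy".split()
noisy = corrupt(original)
print("input :", " ".join(noisy))
print("target:", " ".join(original))
```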

Can Linguistic Knowledge Improve Multimodal Alignment in Vision-Language Pretraining?

F Wang, L Ding, J Rao, Y Liu, L Shen… - arXiv preprint arXiv …, 2023 - arxiv.org
The multimedia community has shown a significant interest in perceiving and representing
the physical world with multimodal pretrained neural network models, and among them, the …

Bi-simcut: A simple strategy for boosting neural machine translation

P Gao, Z He, H Wu, H Wang - arXiv preprint arXiv:2206.02368, 2022 - arxiv.org
We introduce Bi-SimCut: a simple but effective training strategy to boost neural machine
translation (NMT) performance. It consists of two procedures: bidirectional pretraining and …
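The snippet cuts off after naming the first procedure, so the sketch below only shows one common way to build bidirectional pretraining data by swapping source and target and tagging the target language; the <2de>/<2en> tags are hypothetical, and this is not the full Bi-SimCut procedure.

```python
# Building bidirectional training pairs from one parallel corpus (illustrative;
# the direction tags are hypothetical and this is not the full Bi-SimCut recipe).
bitext = [
    ("the cat sits on the mat", "die Katze sitzt auf der Matte"),
]

bidirectional = []
for src, tgt in bitext:
    bidirectional.append(("<2de> " + src, tgt))  # forward: English -> German
    bidirectional.append(("<2en> " + tgt, src))  # reverse: German -> English

for s, t in bidirectional:
    print(s, "=>", t)
```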

Synthetized Multilanguage OCR Using CRNN and SVTR Models for Realtime Collaborative Tools

A Biró, AI Cuesta-Vargas, J Martín-Martín, L Szilágyi… - Applied Sciences, 2023 - mdpi.com
Background: Remote diagnosis using collaborative tools has led to multilingual joint
working sessions in various domains, including comprehensive health care, resulting in …