Y Li, Q Wang, J Zhang, L Hu, W Ouyang - Neurocomputing, 2021 - Elsevier
Generative adversarial networks (GANs) have received great attention and made great progress since their emergence in 2014. In this paper, we focus on the theoretical …
Nowadays, massive numbers of people are active on various social media platforms. This fact enables organizations and institutions to more easily reach their audiences across the …
H Cui, T Peng, R Han, J Han, L Liu - Knowledge-Based Systems, 2023 - Elsevier
Multi-hop knowledge graph question answering aims to pinpoint answer entities by inferring across multiple triples in knowledge graphs. To enhance model interpretability …
H Gao, B Dai, H Miao, X Yang, RJD Barroso… - ACM Transactions on …, 2023 - dl.acm.org
Formal methods have been widely used to support software testing to guarantee correctness and reliability. For example, model checking technology attempts to ensure that the …
G Stanton, AA Irissappane - arXiv preprint arXiv:1903.08289, 2019 - arxiv.org
Online reviews have become a vital source of information when purchasing a service or product. Opinion spammers manipulate reviews, affecting the overall perception of the service. A key …
Q Zhu, W Zhang, T Liu, WY Wang - Proceedings of the 2020 …, 2020 - aclanthology.org
Open-domain dialogue generation suffers from the data insufficiency problem due to the vast size of potential responses. In this paper, we propose to explore potential responses by …
J Do Yoo, H Kim, HK Kim - Computers & Security, 2024 - Elsevier
With the development of information technology, many devices are connected and automated by networks. Unmanned Aerial Vehicles (UAVs), commonly known as drones, are …
Machine learning algorithms represent the intelligence that controls many information systems and applications around us. As such, they are targeted by attackers to impact their …
Conditional sequence generation aims to instruct the generation procedure by conditioning the model with additional context information, which is an interesting research issue in AI …