A survey on knowledge graphs: Representation, acquisition, and applications

S Ji, S Pan, E Cambria, P Marttinen… - IEEE transactions on …, 2021 - ieeexplore.ieee.org
Human knowledge provides a formal understanding of the world. Knowledge graphs that
represent structural relations between entities have become an increasingly popular …

Foundations and Trends in Multimodal Machine Learning: Principles, Challenges, and Open Questions

PP Liang, A Zadeh, LP Morency - arXiv preprint arXiv:2209.03430, 2022 - arxiv.org
Multimodal machine learning is a vibrant multi-disciplinary research field that aims to design
computer agents with intelligent capabilities such as understanding, reasoning, and learning …

Unifying large language models and knowledge graphs: A roadmap

S Pan, L Luo, Y Wang, C Chen… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Large language models (LLMs), such as ChatGPT and GPT-4, are making new waves in the
field of natural language processing and artificial intelligence, due to their emergent ability …

Deep bidirectional language-knowledge graph pretraining

M Yasunaga, A Bosselut, H Ren… - Advances in …, 2022 - proceedings.neurips.cc
Pretraining a language model (LM) on text has been shown to help various downstream
NLP tasks. Recent works show that a knowledge graph (KG) can complement text data …

Making language models better reasoners with step-aware verifier

Y Li, Z Lin, S Zhang, Q Fu, B Chen… - Proceedings of the …, 2023 - aclanthology.org
Few-shot learning is a challenging task that requires language models to generalize from
limited examples. Large language models like GPT-3 and PaLM have made impressive …

QA-GNN: Reasoning with language models and knowledge graphs for question answering

M Yasunaga, H Ren, A Bosselut, P Liang… - arXiv preprint arXiv …, 2021 - arxiv.org
The problem of answering questions using knowledge from pre-trained language models
(LMs) and knowledge graphs (KGs) presents two challenges: given a QA context (question …

Graph neural networks for natural language processing: A survey

L Wu, Y Chen, K Shen, X Guo, H Gao… - … and Trends® in …, 2023 - nowpublishers.com
Deep learning has become the dominant approach in addressing various tasks in Natural
Language Processing (NLP). Although text inputs are typically represented as a sequence …

Generated knowledge prompting for commonsense reasoning

J Liu, A Liu, X Lu, S Welleck, P West, RL Bras… - arXiv preprint arXiv …, 2021 - arxiv.org
It remains an open question whether incorporating external knowledge benefits
commonsense reasoning while maintaining the flexibility of pretrained sequence models. To …

GreaseLM: Graph reasoning enhanced language models for question answering

X Zhang, A Bosselut, M Yasunaga, H Ren… - arXiv preprint arXiv …, 2022 - arxiv.org
Answering complex questions about textual narratives requires reasoning over both stated
context and the world knowledge that underlies it. However, pretrained language models …

A survey of knowledge enhanced pre-trained language models

L Hu, Z Liu, Z Zhao, L Hou, L Nie… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Pre-trained Language Models (PLMs), which are trained on large text corpora via self-supervised learning methods, have yielded promising performance on various tasks in …