Multi-label text classification using attention-based graph neural network

A Pal, M Selvakumar, M Sankarasubbu - arXiv preprint arXiv:2003.11644, 2020 - arxiv.org
In Multi-Label Text Classification (MLTC), one sample can belong to more than one class. It is observed that in most MLTC tasks there are dependencies or correlations among labels, yet existing methods tend to ignore these relationships. In this paper, a graph attention network-based model is proposed to capture the attentive dependency structure among the labels. The graph attention network uses a feature matrix and a correlation matrix to capture and explore the crucial dependencies between the labels and to generate classifiers for the task. The generated classifiers are applied to sentence feature vectors obtained from the text feature extraction network (BiLSTM) to enable end-to-end training. Attention allows the system to assign different weights to neighbor nodes per label, thus allowing it to learn the dependencies among labels implicitly. The proposed model is validated on five real-world MLTC datasets and achieves similar or better performance compared to the previous state-of-the-art models.
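The sketch below is not the authors' code; it is a minimal illustration of the architecture the abstract describes, under assumed names and dimensions: a BiLSTM encodes the text, a single-head graph attention layer over label nodes (initialized from a label feature matrix and masked by a label correlation matrix) produces one classifier vector per label, and label scores are dot products between the sentence vector and those classifiers.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LabelGraphAttention(nn.Module):
    """Single-head graph attention over the label graph (illustrative)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, label_feats, corr_mask):
        # label_feats: (L, in_dim) label feature matrix
        # corr_mask:   (L, L) correlation matrix, nonzero where labels are related
        #              (should include self-loops on the diagonal)
        h = self.proj(label_feats)                        # (L, out_dim)
        L = h.size(0)
        pairs = torch.cat(
            [h.unsqueeze(1).expand(L, L, -1), h.unsqueeze(0).expand(L, L, -1)],
            dim=-1,
        )                                                 # (L, L, 2*out_dim)
        e = F.leaky_relu(self.attn(pairs).squeeze(-1))    # (L, L) attention scores
        e = e.masked_fill(corr_mask == 0, float("-inf"))  # attend only to correlated labels
        alpha = torch.softmax(e, dim=-1)                  # per-label attention weights
        return alpha @ h                                  # (L, out_dim) label classifiers


class GATMultiLabelClassifier(nn.Module):
    def __init__(self, vocab_size, emb_dim, hidden_dim, label_feat_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.gat = LabelGraphAttention(label_feat_dim, 2 * hidden_dim)

    def forward(self, tokens, label_feats, corr_mask):
        # tokens: (B, T) token ids
        out, _ = self.bilstm(self.embed(tokens))          # (B, T, 2*hidden_dim)
        sent = out.mean(dim=1)                            # (B, 2*hidden_dim) sentence vector
        classifiers = self.gat(label_feats, corr_mask)    # (L, 2*hidden_dim)
        return sent @ classifiers.t()                     # (B, L) label logits
```

In a multi-label setting the resulting logits would typically be trained end-to-end with a per-label sigmoid loss such as `nn.BCEWithLogitsLoss`; the paper's exact pooling, attention heads, and correlation-matrix construction may differ from this sketch.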