Code-Mixed Language Understanding Using BiLSTM-BERT Multi-attention Fusion Mechanism

M Wankhade, N Jain, ACS Rao - International Conference on Machine …, 2023 - Springer
Code-mixed language, characterized by the seamless blending of multiple languages,
presents a formidable challenge for natural language understanding systems. In our work …

A parameter-adaptive convolution neural network for capturing the context-specific information in natural language understanding

R Duan, X Yang, Q Wang… - 2021 2nd International …, 2021 - ieeexplore.ieee.org
Natural Language Understanding (NLU) aims to make sense of language by enabling
computers to comprehend text at the semantic level, which is a fundamental but challenging …

Language-agnostic and language-aware multilingual natural language understanding for large-scale intelligent voice assistant application

DY Zhang, J Hueser, Y Li… - 2021 IEEE International …, 2021 - ieeexplore.ieee.org
Natural language understanding (NLU) is one of the most critical components in goal-
oriented dialog systems and enables innovative Big Data applications such as intelligent …

Multi-Intent Natural Language Understanding Framework for Automotive Applications: A Heterogeneous Parallel Approach

X Li, L Zhang, L Fang, P Cao - Applied Sciences, 2023 - mdpi.com
Natural language understanding (NLU) is an important aspect of achieving human–machine
interactions in the automotive application field, consisting of two core subtasks, multiple …

Framework for deep learning-based language models using multi-task learning in natural language understanding: A systematic literature review and future directions

RM Samant, MR Bachute, S Gite, K Kotecha - IEEE Access, 2022 - ieeexplore.ieee.org
Learning human languages is a difficult task for a computer. However, Deep Learning (DL)
techniques have significantly enhanced performance for almost all natural language …

Disan: Directional self-attention network for rnn/cnn-free language understanding

T Shen, T Zhou, G Long, J Jiang, S Pan… - Proceedings of the AAAI …, 2018 - ojs.aaai.org
Recurrent neural nets (RNNs) and convolutional neural nets (CNNs) are widely used in NLP
tasks to capture long-term and local dependencies, respectively. Attention mechanisms …

Multi-domain attention fusion network for language recognition

M Ju, Y Xu, D Ke, K Su - SN Computer Science, 2022 - Springer
Attention-based convolutional neural network models are increasingly adopted for language
recognition tasks. In this paper, based on the self-attention mechanism, we address the study of …

Context-aware dual-attention network for natural language inference

K Zhang, G Lv, E Chen, L Wu, Q Liu… - Advances in Knowledge …, 2019 - Springer
Natural Language Inference (NLI) is a fundamental task in natural language
understanding. In spite of the importance of existing research on NLI, the problem of how to …

Coarse-to-Fine: Hierarchical Multi-task Learning for Natural Language Understanding

Z Fei, Y Tian, Y Wu, X Zhang, Y Zhu, Z Liu, J Wu… - arXiv preprint arXiv …, 2022 - arxiv.org
Generalized text representations are the foundation of many natural language
understanding tasks. To fully utilize different corpora, it is inevitable that models need to …

Feature Fusion Transformer Network for Natural Language Inference

L Sun, H Yan - 2022 IEEE International Conference on …, 2022 - ieeexplore.ieee.org
Natural Language Inference (NLI) is a branch of Natural Language Processing (NLP) whose
main task is to determine the relationship between a sentence pair (two sentences). Early …