A structural topic modeling-based bibliometric study of sentiment analysis literature

X Chen, H Xie - Cognitive Computation, 2020 - Springer
Sentiment analysis is a rapidly evolving field of research in computer science. With
the considerable number of studies on innovative sentiment analysis available, it is worth …

Large language models for time series: A survey

X Zhang, RR Chowdhury, RK Gupta… - arXiv preprint arXiv …, 2024 - arxiv.org
Large Language Models (LLMs) have seen significant use in domains such as natural
language processing and computer vision. Going beyond text, images, and graphics, LLMs …

DeWave: Discrete encoding of EEG waves for EEG-to-text translation

Y Duan, C Chau, Z Wang… - Advances in Neural …, 2024 - proceedings.neurips.cc
The translation of brain dynamics into natural language is pivotal for brain-computer
interfaces (BCIs), a field that has seen substantial growth in recent years. With the swift …

Open vocabulary electroencephalography-to-text decoding and zero-shot sentiment classification

Z Wang, H Ji - Proceedings of the AAAI Conference on Artificial …, 2022 - ojs.aaai.org
State-of-the-art brain-to-text systems have achieved great success in decoding language
directly from brain signals using neural networks. However, current approaches are limited …

Sequence classification with human attention

M Barrett, J Bingel, N Hollenstein, M Rei… - Proceedings of the …, 2018 - aclanthology.org
Learning attention functions requires large volumes of data, but many NLP tasks simulate
human behavior, and in this paper, we show that human attention really does provide a …

CogniVal: A framework for cognitive word embedding evaluation

N Hollenstein, A de la Torre, N Langer… - arXiv preprint arXiv …, 2019 - arxiv.org
An interesting method of evaluating word representations is by how much they reflect the
semantic representations in the human brain. However, most, if not all, previous works only …

Do transformer models show similar attention patterns to task-specific human gaze?

O Eberle, S Brandl, J Pilot… - Proceedings of the 60th …, 2022 - aclanthology.org
Learned self-attention functions in state-of-the-art NLP models often correlate with human
attention. We investigate whether self-attention in large-scale pre-trained language models …

UniCoRN: Unified Cognitive Signal ReconstructioN bridging cognitive signals and human language

N Xi, S Zhao, H Wang, C Liu, B Qin, T Liu - arXiv preprint arXiv …, 2023 - arxiv.org
Decoding text stimuli from cognitive signals (e.g., fMRI) enhances our understanding of the
human language system, paving the way for building versatile Brain-Computer Interface …

ZuCo 2.0: A dataset of physiological recordings during natural reading and annotation

N Hollenstein, M Troendle, C Zhang… - arXiv preprint arXiv …, 2019 - arxiv.org
We recorded and preprocessed ZuCo 2.0, a new dataset of simultaneous eye-tracking and
electroencephalography during natural reading and during annotation. This corpus contains …

DeWave: Discrete EEG waves encoding for brain dynamics to text translation

Y Duan, J Zhou, Z Wang, YK Wang, CT Lin - arXiv preprint arXiv …, 2023 - arxiv.org
The translation of brain dynamics into natural language is pivotal for brain-computer
interfaces (BCIs), a field that has seen substantial growth in recent years. With the swift …