Building on recent advances in natural language modeling and text generation, we propose a novel data augmentation method for text classification tasks. We …
Dropout methods are a family of stochastic techniques used in neural network training or inference that have generated significant research interest and are widely used in practice …
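The snippet describes dropout only generically; as a concrete illustration, here is a minimal sketch of inverted dropout (the variant commonly used in practice — an assumption, since the snippet does not name a specific method): each unit is zeroed with probability p during training and the survivors are rescaled by 1/(1-p), so the expected activation is unchanged and no rescaling is needed at inference.

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p during training,
    rescale survivors by 1/(1-p); identity at inference time."""
    if not training or p == 0.0:
        return x
    rng = rng or np.random.default_rng(0)
    mask = rng.random(x.shape) >= p          # keep each unit with prob 1-p
    return x * mask / (1.0 - p)              # rescale so E[output] == x
```

At inference (`training=False`) the function is the identity, which is what makes the train-time rescaling trick convenient.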
In this paper, we present Smart Compose, a novel system for generating interactive, real-time suggestions in Gmail that assists users in writing emails by reducing repetitive typing. In …
We consider a simple and overarching representation for permutation-invariant functions of sequences (or multiset functions). Our approach, which we call Janossy pooling, expresses …
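The core idea the snippet names can be sketched exactly: Janossy pooling builds a permutation-invariant function by averaging a permutation-*sensitive* function f over all orderings of the input. The particular f below is a hypothetical order-sensitive example chosen for illustration, not one from the paper.

```python
import itertools

def janossy_pool(seq, f):
    """Exact Janossy pooling: average an order-sensitive function f over
    all |seq|! permutations, yielding a permutation-invariant result.
    (Exact enumeration is only feasible for small sets; the paper's
    tractable variants are not reproduced here.)"""
    perms = list(itertools.permutations(seq))
    return sum(f(p) for p in perms) / len(perms)
```

Because every ordering contributes equally, reordering the input set cannot change the pooled value, even though f itself depends on order.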
Deep neural networks, trained with large amounts of labeled data, can fail to generalize well when tested on examples from a target domain whose distribution differs from the training …
X Cui, W Zhang, Z Tüske… - Advances in neural …, 2018 - proceedings.neurips.cc
We propose a population-based Evolutionary Stochastic Gradient Descent (ESGD) framework for optimizing deep neural networks. ESGD combines SGD and gradient-free …
R Hou, B Ma, H Chang, X Gu, S Shan… - IEEE Transactions on …, 2020 - ieeexplore.ieee.org
Person reidentification (reID) by convolutional neural network (CNN)-based networks has achieved favorable performance in recent years. However, most existing CNN-based …
R Wang, S Xu, Y Tian, X Ji, X Sun, S Jiang - Computers & Security, 2024 - Elsevier
Detecting vulnerabilities in source code is crucial for protecting software systems from cyberattacks. Pre-trained language models such as CodeBERT and GraphCodeBERT have …
F Yang, H Zhang, S Tao - Applied Intelligence, 2022 - Springer
Graph convolutional networks (GCNs) and their variants are excellent deep learning methods for graph-structured data. Moreover, multilayer GCNs can perform feature …
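The GCN layer the snippet refers to can be written in a few lines. This sketch assumes the standard Kipf–Welling propagation rule H' = ReLU(D̂^{-1/2}(A+I)D̂^{-1/2} H W), since the snippet does not say which GCN variant the paper builds on.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer with symmetric normalization:
    H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W).
    A: (n, n) adjacency matrix, H: (n, d) node features, W: (d, k) weights."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d = A_hat.sum(axis=1)                     # degree vector of A_hat
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    H_agg = D_inv_sqrt @ A_hat @ D_inv_sqrt @ H  # normalized neighborhood mean
    return np.maximum(H_agg @ W, 0.0)         # linear transform + ReLU
```

Stacking several such layers lets features propagate over multi-hop neighborhoods, which is the multilayer behavior the snippet alludes to.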