Neural machine translation: A review

F Stahlberg - Journal of Artificial Intelligence Research, 2020 - jair.org
The field of machine translation (MT), the automatic translation of written text from one
natural language into another, has experienced a major paradigm shift in recent years …

Conventional and contemporary approaches used in text to speech synthesis: A review

N Kaur, P Singh - Artificial Intelligence Review, 2023 - Springer
Nowadays speech synthesis, or text to speech (TTS), the ability of a system to produce a human-like, natural-sounding voice from written text, is gaining popularity in the field of speech …

Fastspeech: Fast, robust and controllable text to speech

Y Ren, Y Ruan, X Tan, T Qin, S Zhao… - Advances in neural …, 2019 - proceedings.neurips.cc
Neural network based end-to-end text to speech (TTS) has significantly improved the quality
of synthesized speech. Prominent methods (e.g., Tacotron 2) usually first generate mel …

Pay less attention with lightweight and dynamic convolutions

F Wu, A Fan, A Baevski, YN Dauphin, M Auli - arXiv preprint arXiv …, 2019 - arxiv.org
Self-attention is a useful mechanism to build generative models for language and images. It
determines the importance of context elements by comparing each element to the current …

A survey on non-autoregressive generation for neural machine translation and beyond

Y Xiao, L Wu, J Guo, J Li, M Zhang… - IEEE Transactions on …, 2023 - ieeexplore.ieee.org
Non-autoregressive (NAR) generation, first proposed in neural machine translation
(NMT) to speed up inference, has attracted much attention in both machine learning and …

Towards efficient generative large language model serving: A survey from algorithms to systems

X Miao, G Oliaro, Z Zhang, X Cheng, H Jin… - arXiv preprint arXiv …, 2023 - arxiv.org
In the rapidly evolving landscape of artificial intelligence (AI), generative large language
models (LLMs) stand at the forefront, revolutionizing how we interact with our data. However …

Glancing transformer for non-autoregressive neural machine translation

L Qian, H Zhou, Y Bao, M Wang, L Qiu… - arXiv preprint arXiv …, 2020 - arxiv.org
Recent work on non-autoregressive neural machine translation (NAT) aims at improving
efficiency by parallel decoding without sacrificing quality. However, existing NAT …

Hand-transformer: Non-autoregressive structured modeling for 3d hand pose estimation

L Huang, J Tan, J Liu, J Yuan - … Conference, Glasgow, UK, August 23–28 …, 2020 - Springer
3D hand pose estimation is still far from a well-solved problem, mainly due to the
highly nonlinear dynamics of hand pose and the difficulties of modeling its inherent …

Understanding and improving lexical choice in non-autoregressive translation

L Ding, L Wang, X Liu, DF Wong, D Tao… - arXiv preprint arXiv …, 2020 - arxiv.org
Knowledge distillation (KD) is essential for training non-autoregressive translation (NAT)
models by reducing the complexity of the raw data with an autoregressive teacher model. In …

Directed acyclic transformer for non-autoregressive machine translation

F Huang, H Zhou, Y Liu, H Li… - … Conference on Machine …, 2022 - proceedings.mlr.press
Non-autoregressive Transformers (NATs) significantly reduce the decoding latency
by generating all tokens in parallel. However, such independent predictions prevent NATs …