Deepnag: Deep non-adversarial gesture generation

M Maghoumi, EM Taranta, J LaViola - Proceedings of the 26th International Conference on Intelligent User Interfaces, 2021 - dl.acm.org
Synthetic data generation to improve classification performance (data augmentation) is a well-studied problem. Recently, generative adversarial networks (GANs) have shown superior image data augmentation performance, but their suitability for gesture synthesis has received inadequate attention. Further, GAN training is prohibitively expensive, requiring a generator and a discriminator network to be trained simultaneously. We tackle both issues in this work. We first discuss a novel, device-agnostic GAN model for gesture synthesis called DeepGAN. Thereafter, we formulate DeepNAG by introducing a new differentiable loss function based on dynamic time warping and the average Hausdorff distance, which allows us to train DeepGAN’s generator without requiring a discriminator. In our evaluation, we compare the utility of DeepGAN and DeepNAG against two alternative techniques for training five recognizers using data augmentation over six datasets. We further investigate the perceived quality of synthesized samples via an Amazon Mechanical Turk user study based on the HYPE∞ benchmark. We find that DeepNAG outperforms DeepGAN in accuracy, training time (up to 17× faster), and realism, thereby opening the door to a new line of research in generator network design and training for gesture synthesis. Our source code is available at https://www.deepnag.com.
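
The loss the abstract describes has two ingredients: a differentiable relaxation of dynamic time warping (soft-DTW) and the average Hausdorff distance between point sets. The PyTorch sketch below illustrates how such a loss can be assembled for a single pair of gesture trajectories. The function names (soft_dtw, avg_hausdorff, generator_loss) and the additive way the two terms are combined are assumptions made for illustration only; they are not the paper's actual formulation, which is available at the link above.

    import torch

    def soft_min(costs, gamma):
        # Differentiable soft minimum via log-sum-exp (the soft-DTW relaxation).
        return -gamma * torch.logsumexp(-costs / gamma, dim=0)

    def soft_dtw(x, y, gamma=0.1):
        # Soft dynamic time warping between sequences x (n, d) and y (m, d).
        # Classic O(n*m) dynamic program with the hard min replaced by
        # soft_min, which makes the recurrence differentiable in x and y.
        n, m = x.size(0), y.size(0)
        cost = torch.cdist(x, y) ** 2          # pairwise squared Euclidean costs
        r = x.new_full((n + 1, m + 1), float("inf"))
        r[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                prev = torch.stack([r[i - 1, j], r[i, j - 1], r[i - 1, j - 1]])
                r[i, j] = cost[i - 1, j - 1] + soft_min(prev, gamma)
        return r[n, m]

    def avg_hausdorff(x, y):
        # Average Hausdorff distance between the point sets x and y: each
        # point contributes its distance to the nearest point in the other set.
        d = torch.cdist(x, y)
        return d.min(dim=1).values.mean() + d.min(dim=0).values.mean()

    def generator_loss(fake, real, gamma=0.1):
        # Hypothetical combination of the two terms for one fake/real pair;
        # the paper aggregates over batches and gesture classes.
        return soft_dtw(fake, real, gamma) + avg_hausdorff(fake, real)

    # Usage: two 2-D gesture trajectories of different lengths.
    fake = torch.randn(32, 2, requires_grad=True)   # generated gesture, 32 points
    real = torch.randn(40, 2)                       # real gesture, 40 points
    loss = generator_loss(fake, real)
    loss.backward()                                 # gradients reach the generator

The explicit Python loop keeps the soft-DTW recurrence readable; practical implementations vectorize over anti-diagonals or use a custom CUDA kernel, since this loop dominates training time for long gesture sequences.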