A bag of useful tricks for practical neural machine translation: Embedding layer initialization and large batch size. M Neishi, J Sakuma, S Tohda, S Ishiwatari, N Yoshinaga, M Toyoda. Proceedings of the 4th Workshop on Asian Translation (WAT2017), 99-109, 2017. Cited by 71.
On the relation between position information and sentence length in neural machine translation. M Neishi, N Yoshinaga. Proceedings of the 23rd Conference on Computational Natural Language …, 2019. Cited by 44.
You may like this hotel because...: Identifying evidence for explainable recommendations. S Kanouchi, M Neishi, Y Hayashibe, H Ouchi, N Okazaki. Proceedings of the 1st Conference of the Asia-Pacific Chapter of the …, 2020. Cited by 5.
Spiking neural network simulation on FPGAs with automatic and intensive pipelining. T Kawao, M Neishi, T Okamoto, AM Gharehbaghi, T Kohno, M Fujita. 2016 International Symposium on Nonlinear Theory and Its Applications …, 2016. Cited by 3.
Unsupervised pre-training of embedding layers in neural machine translation (in Japanese). M Neishi, J Sakuma, S Tohda, S Ishiwatari, N Yoshinaga, M Toyoda. IPSJ SIG Technical Report on Natural Language Processing (NL) 2017 (1), 1-8, 2017. Cited by 1.
Revisiting Pre-training of Embedding Layers in Transformer-based Neural Machine Translation. M Neishi, N Yoshinaga. Journal of Natural Language Processing 31 (2), 534-567, 2024.
A comparison of seq2seq and Transformer in English-to-Japanese translation using swap models (in Japanese). M Neishi. 2019.
Unsupervised pre-training of embedding layers in neural machine translation (in Japanese). M Neishi, H Sakuma, A Enda, Y Ishiwatari, N Yoshinaga, ... IPSJ SIG Technical Report (Web) 2017 (NL-233), 2017.
Non-literal Neural Machine Translation by Exploiting Non-literal Bitext. L Yu, N Yoshinaga, M Neishi, Y Tsuta.