TaCube: Pre-computing Data Cubes for Answering Numerical-Reasoning Questions over Tabular Data

F Zhou, M Hu, H Dong, Z Cheng, S Han… - arXiv preprint arXiv …, 2022 - arxiv.org
Existing auto-regressive pre-trained language models (PLMs) like T5 and BART have been
well applied to table question answering by UNIFIEDSKG and TAPEX, respectively, and …

TaCube: Pre-computing Data Cubes for Answering Numerical-Reasoning Questions over Tabular Data

F Zhou, M Hu, H Dong, Z Cheng, F Cheng… - Proceedings of the …, 2022 - aclanthology.org
Existing auto-regressive pre-trained language models (PLMs) like T5 and BART have been
well applied to table question answering by UNIFIEDSKG and TAPEX, respectively, and …