Rethinking Data Distillation: Do Not Overlook Calibration

D Zhu, Y Fang, B Lei, Y Xie, D Xu, J Zhang, R Zhang - Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2023 - openaccess.thecvf.com
Neural networks trained on distilled data often produce over-confident output and require
correction by calibration methods. Existing calibration methods such as temperature scaling …
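
The snippet refers to temperature scaling as a standard post-hoc calibration baseline. The following is a minimal sketch of that baseline, not the calibration method proposed in the cited paper; the function name `fit_temperature` and the validation-set workflow are illustrative assumptions, written in PyTorch.

```python
# Temperature scaling sketch: fit a single scalar T > 0 on held-out
# validation logits by minimizing negative log-likelihood, then rescale
# test-time logits by T before the softmax.
import torch
import torch.nn.functional as F

def fit_temperature(logits: torch.Tensor, labels: torch.Tensor) -> float:
    """Return the temperature T minimizing NLL of softmax(logits / T)."""
    log_t = torch.zeros(1, requires_grad=True)  # optimize log T to keep T positive
    optimizer = torch.optim.LBFGS([log_t], lr=0.1, max_iter=50)

    def closure():
        optimizer.zero_grad()
        loss = F.cross_entropy(logits / log_t.exp(), labels)
        loss.backward()
        return loss

    optimizer.step(closure)
    return log_t.exp().item()

# Usage (hypothetical tensors): given validation logits from a network
# trained on distilled data, fit T once, then calibrate test predictions.
# T = fit_temperature(val_logits, val_labels)
# calibrated_probs = F.softmax(test_logits / T, dim=1)
```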
