Generalized Logit Adjustment: Calibrating Fine-tuned Models by Removing Label Bias in Foundation Models

B Zhu, K Tang, Q Sun, H Zhang - Advances in Neural …, 2024 - proceedings.neurips.cc
Foundation models like CLIP allow zero-shot transfer on various tasks without additional
training data. Yet, the zero-shot performance is less competitive than a fully supervised one …
