Scaling Language-Image Pre-training via Masking

Y Li, H Fan, R Hu, C Feichtenhofer, K He - Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023 - openaccess.thecvf.com
We present Fast Language-Image Pre-training (FLIP), a simple and more efficient method for training CLIP. Our method randomly masks out and removes a large portion of image …

Scaling Language-Image Pre-training via Masking

Y Li, H Fan, R Hu, C Feichtenhofer, K He - arXiv preprint arXiv:2212.00794, 2022 - arxiv.org
We present Fast Language-Image Pre-training (FLIP), a simple and more efficient method
for training CLIP. Our method randomly masks out and removes a large portion of image …
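
The snippets above only hint at the mechanism: FLIP randomly drops a large fraction of image patches before the vision encoder, so each training step processes far fewer tokens. Below is a minimal sketch of MAE-style per-sample random masking, the kind of operation the abstract describes; the function name, tensor layout, and 50% mask ratio are illustrative assumptions, not the authors' code.

```python
import torch

def random_mask_patches(x: torch.Tensor, mask_ratio: float = 0.5):
    """Randomly drop a fraction of patch tokens per sample.

    x: [B, L, D] patch embeddings; mask_ratio: fraction of patches removed.
    Returns the kept patches [B, L_keep, D] and their indices.
    Sketch only: the exact masking details in FLIP may differ.
    """
    B, L, D = x.shape
    len_keep = int(L * (1 - mask_ratio))
    noise = torch.rand(B, L, device=x.device)   # one random score per patch
    ids_shuffle = torch.argsort(noise, dim=1)   # random permutation per sample
    ids_keep = ids_shuffle[:, :len_keep]        # patches that survive masking
    x_kept = torch.gather(x, 1, ids_keep.unsqueeze(-1).expand(-1, -1, D))
    return x_kept, ids_keep
```

Dropping half the patches roughly halves the image-encoder compute per iteration, which is the trade the title's "scaling via masking" points at: more image-text pairs seen within the same training budget.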
