P Chen, S Liu, J Jia - … of the IEEE/CVF conference on …, 2021 - openaccess.thecvf.com
Unsupervised representation learning with contrastive learning has achieved great success recently. However, these methods have to duplicate each training batch to construct …
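The batch duplication this snippet refers to is the standard two-view setup: each image is augmented twice, and the two embeddings of the same image form the positive pair in an InfoNCE-style loss. A minimal NumPy sketch, assuming L2-normalized embeddings and considering only cross-view negatives (real implementations such as SimCLR also use same-view negatives):

```python
import numpy as np

def info_nce(z1, z2, tau=0.1):
    """Simplified InfoNCE over two views of the same batch.

    z1, z2: (N, D) embeddings of view 1 and view 2; row i of z1 and
    row i of z2 come from the same image (the positive pair).
    This is an illustrative sketch, not any paper's exact loss.
    """
    # Normalize so the dot product is cosine similarity.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                      # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Positives sit on the diagonal; minimize their negative log-likelihood.
    return -np.mean(np.diag(log_prob))
```

With matched views the loss approaches zero; mismatched views yield a larger loss, which is what drives the representations of the two augmentations of an image together.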
Joint clustering and feature learning methods have shown remarkable performance in unsupervised representation learning. However, the training schedule alternating between …
Recently advanced unsupervised learning approaches use a siamese-like framework to compare two "views" from the same image for learning representations. Making the two …
By leveraging contrastive learning, clustering, and other pretext tasks, unsupervised methods for learning image representations have reached impressive results on standard …
The downstream accuracy of self-supervised methods is tightly linked to the proxy task solved during training and the quality of the gradients extracted from it. Richer and more …
H Fan, P Liu, M Xu, Y Yang - IEEE Transactions on Cybernetics, 2021 - ieeexplore.ieee.org
The superiority of deeply learned representations relies on large-scale labeled datasets. However, annotating data is usually expensive or even infeasible in some scenarios. To …
Z Xie, Y Lin, Z Zhang, Y Cao… - Proceedings of the …, 2021 - openaccess.thecvf.com
Contrastive learning methods for unsupervised visual representation learning have reached remarkable levels of transfer performance. We argue that the power of contrastive learning …
W Feng, Y Wang, L Ma, Y Yuan… - Proceedings of the …, 2021 - openaccess.thecvf.com
The instance discrimination paradigm has become dominant in unsupervised learning. It always adopts a teacher-student framework, in which the teacher provides embedded …
Mixup is a well-known data-dependent augmentation technique for DNNs, consisting of two sub-tasks: mixup generation and classification. However, the recent dominant online training …
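The mixup-generation sub-task mentioned here is the standard convex combination of two samples and their labels, with the mixing ratio drawn from a Beta distribution. A minimal sketch, assuming one-hot labels (function and parameter names are illustrative):

```python
import numpy as np

def mixup(x1, y1, x2, y2, alpha=1.0, rng=None):
    """Mixup generation: convexly combine two samples and their labels.

    alpha parameterizes Beta(alpha, alpha), from which the mixing
    ratio lam is drawn; labels y1, y2 are assumed one-hot.
    """
    rng = rng if rng is not None else np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    x = lam * x1 + (1.0 - lam) * x2   # mixed input
    y = lam * y1 + (1.0 - lam) * y2   # mixed (soft) label
    return x, y, lam
```

The classification sub-task then trains on the mixed pair (x, y) with an ordinary cross-entropy loss, which is where the online-training concern raised in the snippet comes in.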