A contrastive objective for learning disentangled representations

J Kahana, Y Hoshen - European Conference on Computer Vision, 2022 - Springer
Learning representations of images that are invariant to sensitive or unwanted attributes is
important for many tasks, including bias removal and cross-domain retrieval. Here, our …
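
The loss itself is not spelled out in this snippet; purely as a generic illustration (an assumed, standard formulation, not the objective proposed by Kahana and Hoshen), an InfoNCE-style contrastive term over a batch of paired embeddings can be written as:

```python
import torch
import torch.nn.functional as F

def info_nce(z_a, z_b, temperature=0.1):
    # z_a, z_b: (batch, dim) embeddings; row i of z_a and row i of z_b form a positive pair.
    z_a = F.normalize(z_a, dim=-1)
    z_b = F.normalize(z_b, dim=-1)
    logits = z_a @ z_b.t() / temperature       # (batch, batch) cosine similarities
    targets = torch.arange(z_a.size(0), device=z_a.device)
    return F.cross_entropy(logits, targets)    # positives sit on the diagonal
```

Minimizing this pulls each matched pair together and pushes it apart from every other pair in the batch.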

Learning disentangled representations via independent subspaces

M Awiszus, H Ackermann… - Proceedings of the …, 2019 - openaccess.thecvf.com
Image-generating neural networks are mostly viewed as black boxes, where any change to the input can produce a number of global changes in the output. In this work, we …
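
"Independent subspaces" generally means the latent code is partitioned into blocks that can be varied in isolation. The sketch below shows only that generic partition with placeholder sizes; it is not the architecture from Awiszus et al.:

```python
import torch

def resample_subspace(z, k, subspace_dim):
    # z: (batch, n_subspaces * subspace_dim) latent codes split into equal blocks.
    # Replace only block k with fresh noise and leave the other blocks untouched,
    # so the effect of that subspace on the generated output can be inspected.
    z = z.clone()
    z[:, k * subspace_dim:(k + 1) * subspace_dim] = torch.randn(
        z.size(0), subspace_dim, device=z.device)
    return z
```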

Where and what? Examining interpretable disentangled representations

X Zhu, C Xu, D Tao - … of the IEEE/CVF Conference on …, 2021 - openaccess.thecvf.com
Capturing interpretable variations has long been one of the goals in disentanglement
learning. However, unlike the independence assumption, interpretability has rarely been …

Disentangled representation learning

X Wang, H Chen, S Tang, Z Wu, W Zhu - arXiv preprint arXiv:2211.11695, 2022 - arxiv.org
Disentangled Representation Learning (DRL) aims to learn a model capable of identifying
and disentangling the underlying factors hidden in the observable data in representation …

Tripod: Three Complementary Inductive Biases for Disentangled Representation Learning

K Hsu, JI Hamid, K Burns, C Finn, J Wu - arXiv preprint arXiv:2404.10282, 2024 - arxiv.org
Inductive biases are crucial in disentangled representation learning for narrowing down an
underspecified solution set. In this work, we consider endowing a neural network …

SW-VAE: Weakly supervised learn disentangled representation via latent factor swapping

J Zhu, H Xie, W Abd-Almageed - European Conference on Computer …, 2022 - Springer
Representation disentanglement is an important goal of representation learning that benefits various downstream tasks. To achieve this goal, many unsupervised learning …
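
At a high level, "latent factor swapping" means exchanging one block of latent dimensions between two encoded inputs and inspecting the decoded results. A minimal, hypothetical sketch of the swap itself (not the SW-VAE training procedure):

```python
import torch

def swap_factor(z1, z2, start, end):
    # Exchange latent dimensions [start, end) between two latent codes, so a decoder
    # can reveal which generative factor those dimensions control.
    z1_new, z2_new = z1.clone(), z2.clone()
    z1_new[:, start:end] = z2[:, start:end]
    z2_new[:, start:end] = z1[:, start:end]
    return z1_new, z2_new
```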

Learning disentangled representations via mutual information estimation

EH Sanchez, M Serrurier, M Ortner - … , Glasgow, UK, August 23–28, 2020 …, 2020 - Springer
In this paper, we investigate the problem of learning disentangled representations. Given a
pair of images sharing some attributes, we aim to create a low-dimensional representation …
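
Mutual information between paired representations is commonly estimated with a critic-based lower bound. The sketch below uses a bilinear critic and the InfoNCE bound as one common, assumed choice; it is not necessarily the estimator used by Sanchez et al.:

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class BilinearCritic(nn.Module):
    # Scores every pair (x_i, y_j) in a batch; matched pairs lie on the diagonal.
    def __init__(self, dim):
        super().__init__()
        self.W = nn.Parameter(0.01 * torch.randn(dim, dim))

    def forward(self, x, y):
        return x @ self.W @ y.t()               # (batch, batch) critic scores

def mi_lower_bound(scores):
    # InfoNCE bound: I(X; Y) >= log(batch_size) - cross-entropy of diagonal matches.
    targets = torch.arange(scores.size(0), device=scores.device)
    return math.log(scores.size(0)) - F.cross_entropy(scores, targets)
```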

Product of orthogonal spheres parameterization for disentangled representation learning

A Shukla, S Bhagat, S Uppal, S Anand… - arXiv preprint arXiv …, 2019 - arxiv.org
Learning representations that can disentangle explanatory attributes underlying the data improves interpretability and provides control over data generation. Various learning …
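
Generically, a product-of-spheres parameterization splits the latent vector into chunks and constrains each chunk to unit norm. The function below illustrates only that generic idea with an assumed equal-size chunking, not the paper's exact construction:

```python
import torch
import torch.nn.functional as F

def product_of_spheres(z, n_chunks):
    # Split z into n_chunks equal slices and project each slice onto the unit sphere,
    # so the full latent lives on a product of hyperspheres.
    chunks = z.chunk(n_chunks, dim=-1)
    return torch.cat([F.normalize(c, dim=-1) for c in chunks], dim=-1)
```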

DisUnknown: Distilling unknown factors for disentanglement learning

S Xiang, Y Gu, P Xiang, M Chai, H Li… - Proceedings of the …, 2021 - openaccess.thecvf.com
Disentangling data into interpretable and independent factors is critical for controllable
generation tasks. With the availability of labeled data, supervision can help enforce the …

Learning disentangled representation by exploiting pretrained generative models: A contrastive learning view

X Ren, T Yang, Y Wang, W Zeng - arXiv preprint arXiv:2102.10543, 2021 - arxiv.org
From the intuitive notion of disentanglement, the image variations corresponding to different
factors should be distinct from each other, and the disentangled representation should …
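
One way to picture the stated intuition is to perturb a pretrained generator along a single latent direction and treat the resulting image pair as the "variation" for that factor. In the placeholder below, the generator G and the direction matrix are assumptions for illustration, not the authors' code:

```python
import torch

def variation_pair(G, z, directions, k, step=1.0):
    # G: a pretrained generator mapping latents to images (placeholder callable).
    # directions: (n_directions, latent_dim) candidate latent directions.
    # Returns two images that differ only along direction k; a contrastive loss
    # could then be used to tell factor k's variation apart from the others.
    x = G(z)
    x_shifted = G(z + step * directions[k])
    return x, x_shifted
```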