Structured Disentangled Representations

Apr 4, 2019
Babak Esmaeili, Hao Wu, Sarthak Jain, Alican Bozkurt, N. Siddharth, Brooks Paige, Jennifer Dy, Dana H. Brooks, Jan-Willem van de Meent
Abstract
Deep latent-variable models learn representations of high-dimensional data in an unsupervised manner. A number of recent efforts have focused on learning representations that disentangle statistically independent axes of variation by introducing modifications to the standard objective function. These approaches generally assume a simple diagonal Gaussian prior and as a result are not able to reliably disentangle discrete factors of variation. We propose a two-level hierarchical objective to control the relative degree of statistical independence between blocks of variables and individual variables within blocks. We derive this objective as a generalization of the evidence lower bound, which allows us to explicitly represent the trade-offs among the mutual information between data and representation, the KL divergence between representation and prior, and coverage of the support of the empirical data distribution. Experiments on a variety of datasets demonstrate that our objective can not only disentangle discrete variables, but that doing so also improves disentanglement of other variables and, importantly, generalization even to unseen combinations of factors.
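For context, the objective described above generalizes the standard evidence lower bound (ELBO). A brief sketch of the decomposition it builds on, using standard notation from the variational autoencoder literature (not taken verbatim from the paper), is:

```latex
% Standard ELBO for a single datapoint x, with encoder q_\phi and decoder p_\theta:
\mathcal{L}(x)
  = \mathbb{E}_{q_\phi(z \mid x)}\big[\log p_\theta(x \mid z)\big]
  - \mathrm{KL}\big(q_\phi(z \mid x)\,\|\,p(z)\big)

% Averaging over the empirical data distribution q(x), the KL term splits into
% a mutual-information term and a divergence between the aggregate posterior
% q_\phi(z) = \mathbb{E}_{q(x)}[q_\phi(z \mid x)] and the prior:
\mathbb{E}_{q(x)}\big[\mathrm{KL}\big(q_\phi(z \mid x)\,\|\,p(z)\big)\big]
  = I_q(x; z) + \mathrm{KL}\big(q_\phi(z)\,\|\,p(z)\big)
```

Weighting these terms separately is what makes the trade-offs mentioned in the abstract (mutual information between data and representation, divergence between representation and prior) explicit; the paper's two-level hierarchical objective applies this idea both between blocks of variables and within blocks.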
Type
Publication
In Proceedings of the 22nd International Conference on Artificial Intelligence and Statistics 2019, Naha, Okinawa, Japan. PMLR: Volume 89
Authors
Alican Bozkurt
AI Scientist
I am an AI Scientist at Paige AI. I did my Ph.D. with Jennifer Dy, Dana Brooks, and Jan-Willem van de Meent at Northeastern University. My main research interests are machine learning, with an emphasis on probabilistic programming, deep neural networks, and their applications in biomedical image processing. I am one of the developers of Probabilistic Torch, a library for deep generative models that extends PyTorch. I am also one of the maintainers of the PyTorch distributions module.