Submission by: Andre Gilbert (3465546) and Jan Henrik Bertrand (8556462). This repository serves as the demo accompanying the presentation on Disentangled Variational Autoencoders; together they count as one submission.
A demonstration of the capabilities of beta-VAE to disentangle data-generative factors. It provides a pre-trained disentangled VAE model and an interface that lets a user intuitively alter the latent variables to generate handwritten digits based on human-interpretable data-generative factors.
The demo is based on JointVAE, a Variational Autoencoder that has both continuous and discrete latent variables (cf. Dupont, 2018).
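To make the idea of joint continuous and discrete latents concrete, below is a minimal PyTorch sketch of such a latent layer. It is illustrative only and does not reproduce this repository's actual model; the class name, layer names, and dimensions are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class JointLatentSketch(nn.Module):
    """Illustrative JointVAE-style latent layer: continuous latents use the
    Gaussian reparameterization trick, discrete latents use a Gumbel-Softmax
    relaxation of a categorical distribution (cf. Dupont, 2018)."""

    def __init__(self, hidden_dim=256, cont_dim=10, disc_dim=10, temperature=0.67):
        super().__init__()
        self.temperature = temperature
        # Parameters of the continuous (Gaussian) latent distribution
        self.fc_mu = nn.Linear(hidden_dim, cont_dim)
        self.fc_logvar = nn.Linear(hidden_dim, cont_dim)
        # Unnormalized log-probabilities of the discrete (categorical) latent
        self.fc_logits = nn.Linear(hidden_dim, disc_dim)

    def forward(self, h):
        # Continuous part: z_cont = mu + sigma * eps
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        z_cont = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        # Discrete part: differentiable sample from a relaxed categorical
        logits = self.fc_logits(h)
        z_disc = F.gumbel_softmax(logits, tau=self.temperature, hard=False)
        # The decoder consumes the concatenation of both latent codes
        return torch.cat([z_cont, z_disc], dim=-1), (mu, logvar, logits)
```

In JointVAE, the KL terms of the continuous and discrete codes are additionally constrained toward separate, gradually increased capacities during training (Dupont, 2018).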
First, install the necessary requirements using the following command:

```
pip install -r requirements.txt
```
The package was developed using Python 3.9.6.
There are two ways to use the demo:

- Jupyter Notebook: Run all cells in the `demo.ipynb` notebook and start by setting the first slider.
- Streamlit App: Run the Streamlit app using the following command:

  ```
  streamlit run demo.py
  ```

  The web app should open automatically. Live version: https://beta-vae-demo.streamlit.app.
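For orientation, here is a hedged sketch of how such a slider-driven Streamlit interface over a pre-trained decoder could look. The checkpoint path, latent sizes, and loading step are placeholders and do not reflect the actual code in `demo.py`.

```python
import streamlit as st
import torch

# Hypothetical: the checkpoint path and latent sizes are placeholders,
# not the actual artifacts shipped with this repository.
decoder = torch.load("trained_decoder.pt", map_location="cpu")
decoder.eval()

CONT_DIM, DISC_DIM = 10, 10  # assumed latent dimensions

# One slider per continuous latent variable (e.g. stroke thickness, slant)
z_cont = torch.tensor(
    [st.slider(f"Continuous latent {i}", -3.0, 3.0, 0.0) for i in range(CONT_DIM)]
)

# A selectbox picks the discrete latent (e.g. the digit class) as a one-hot code
digit = st.selectbox("Discrete latent (digit class)", list(range(DISC_DIM)))
z_disc = torch.zeros(DISC_DIM)
z_disc[digit] = 1.0

# Decode the concatenated latent code into an image and display it
with torch.no_grad():
    img = decoder(torch.cat([z_cont, z_disc]).unsqueeze(0))
st.image(img.squeeze().numpy(), width=280, clamp=True)
```

Changing a single slider while keeping the others fixed varies one generative factor at a time, which is what makes the disentangled representation interpretable.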
- Dupont, Emilien. "Learning disentangled joint continuous and discrete representations." Advances in neural information processing systems 31 (2018).
- JointVAE implementation: https://github.com/Schlumberger/joint-vae/tree/master, last retrieved 2 Feb 2024.
- Higgins, Irina, et al. "Early visual concept learning with unsupervised deep learning." arXiv preprint arXiv:1606.05579 (2016).