Variational Autoencoder and Conditional Variational Autoencoder on MNIST in PyTorch


VAE paper: Auto-Encoding Variational Bayes (Kingma & Welling, arXiv:1312.6114)

CVAE paper: Semi-supervised Learning with Deep Generative Models (Kingma et al., arXiv:1406.5298)
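Both papers boil down to an encoder q(z|x) (or q(z|x,c) for the conditional model), a decoder p(x|z) (or p(x|z,c)), the reparameterization trick, and the evidence lower bound as training objective. The following is a minimal PyTorch sketch of that setup, not the code in this repository; the class name, layer sizes, and the way the label is concatenated are illustrative assumptions.

```python
# Minimal sketch, NOT the code in this repository; sizes and names are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class CVAE(nn.Module):
    """VAE when conditional=False, CVAE when conditional=True."""

    def __init__(self, x_dim=784, h_dim=256, z_dim=2, n_labels=10, conditional=False):
        super().__init__()
        self.conditional = conditional
        c_dim = n_labels if conditional else 0
        # encoder q(z|x) or q(z|x,c)
        self.enc = nn.Sequential(nn.Linear(x_dim + c_dim, h_dim), nn.ReLU())
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        # decoder p(x|z) or p(x|z,c)
        self.dec = nn.Sequential(nn.Linear(z_dim + c_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim), nn.Sigmoid())

    def forward(self, x, c=None):
        h = self.enc(torch.cat([x, c], dim=-1) if self.conditional else x)
        mu, logvar = self.mu(h), self.logvar(h)
        # reparameterization trick: z = mu + sigma * eps, eps ~ N(0, I)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        recon = self.dec(torch.cat([z, c], dim=-1) if self.conditional else z)
        return recon, mu, logvar


def loss_fn(recon_x, x, mu, logvar):
    # negative ELBO: Bernoulli reconstruction term + KL(q(z|x) || N(0, I))
    bce = F.binary_cross_entropy(recon_x, x, reduction="sum")
    kld = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return bce + kld
```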


To run the conditional variational autoencoder, add --conditional to the command. Check out the other command-line options in the code for hyperparameter settings (learning rate, batch size, encoder/decoder layer depth and size); a sketch of the kind of options meant is shown below.
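The exact option names and defaults are defined in the repository's training script; the snippet below only illustrates the kind of argparse flags the paragraph above refers to. Apart from --conditional, which the text confirms, every flag name and default here is an assumption.

```python
# Illustrative only: the real option names and defaults live in the repository's
# training script. Only --conditional is confirmed by the text above.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--epochs", type=int, default=10)
parser.add_argument("--batch_size", type=int, default=64)
parser.add_argument("--learning_rate", type=float, default=1e-3)
parser.add_argument("--encoder_layer_sizes", type=int, nargs="+", default=[784, 256])
parser.add_argument("--decoder_layer_sizes", type=int, nargs="+", default=[256, 784])
parser.add_argument("--latent_size", type=int, default=2)
parser.add_argument("--conditional", action="store_true")
args = parser.parse_args()
```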


Results

All plots were obtained after 10 epochs of training. Hyperparameters follow the default settings in the code and were not tuned.

z ~ q(z|x) and q(z|x,c)

The modeled latent distribution after 10 epochs and 100 samples per digit.

(Figures: VAE latent space | CVAE latent space)
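A plot like this can be produced by pushing test digits through the encoder and scattering the posterior means, assuming a 2-dimensional latent space and a trained model such as the CVAE sketch above (held in a variable `model`); the details below are illustrative, not the repository's plotting code.

```python
# Illustrative sketch: encode test digits and scatter the posterior means.
# Assumes the CVAE sketch above, trained and stored in `model`, with z_dim=2.
import torch
import torch.nn.functional as F
import matplotlib.pyplot as plt
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

test_set = datasets.MNIST("data", train=False, download=True,
                          transform=transforms.ToTensor())
loader = DataLoader(test_set, batch_size=1000, shuffle=True)

model.eval()
with torch.no_grad():
    x, y = next(iter(loader))                       # roughly 100 samples per digit
    x = x.view(x.size(0), -1)
    c = F.one_hot(y, 10).float() if model.conditional else None
    _, mu, _ = model(x, c)                          # posterior mean as latent code

plt.scatter(mu[:, 0], mu[:, 1], c=y, cmap="tab10", s=5)
plt.colorbar(label="digit")
plt.xlabel("z[0]"); plt.ylabel("z[1]")
plt.show()
```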

p(x|z) and p(x|z,c)

Randomly sampled z and their decoded outputs. For the CVAE, each condition c has been given as input once.

(Figures: VAE samples | CVAE samples)
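These samples correspond to drawing z from the standard normal prior and decoding it, for the CVAE concatenated with a one-hot condition c so that each digit label is used once. The sketch below assumes the trained `model` from the earlier CVAE sketch and is not the repository's own sampling code.

```python
# Illustrative sketch: sample z ~ N(0, I) and decode. For the CVAE, each
# condition c (digit 0-9) is given as input once. Assumes the trained `model`
# from the CVAE sketch above.
import torch
import torch.nn.functional as F
import matplotlib.pyplot as plt

model.eval()
with torch.no_grad():
    z = torch.randn(10, model.mu.out_features)          # 10 random latent codes
    if model.conditional:
        c = F.one_hot(torch.arange(10), 10).float()     # each digit once
        samples = model.dec(torch.cat([z, c], dim=-1))
    else:
        samples = model.dec(z)

fig, axes = plt.subplots(1, 10, figsize=(15, 2))
for i, ax in enumerate(axes):
    ax.imshow(samples[i].view(28, 28), cmap="gray")
    ax.axis("off")
plt.show()
```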
