
ResNet18 run

Created on July 14 | Last edited on July 14

Run with skip connections, encoded dimension = 640, and kl_coefficient = 0.001
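For reference, a minimal sketch of the β-weighted VAE objective this run uses, with `kl_coefficient` scaling the KL term as in the config above. The function names and the use of NumPy are my own assumptions, not the run's actual training code:

```python
import numpy as np

def kl_divergence(mu, logvar):
    # KL(N(mu, sigma^2) || N(0, 1)), summed over latent dimensions
    # and averaged over the batch.
    return np.mean(0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar, axis=1))

def vae_loss(recon_error, mu, logvar, kl_coefficient=0.001):
    # Total objective: reconstruction term plus KL term scaled by kl_coefficient.
    return recon_error + kl_coefficient * kl_divergence(mu, logvar)

# Example: batch of 4 points, 640 latent dimensions (as in this run).
mu = np.zeros((4, 640))
logvar = np.zeros((4, 640))  # sigma = 1 everywhere -> KL is exactly 0
print(vae_loss(recon_error=1.25, mu=mu, logvar=logvar))  # -> 1.25
```

With kl_coefficient = 0 this reduces to a plain autoencoder objective, which is consistent with the posterior variances collapsing in those runs.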

  • I report the average (over all encoded points) mean and variance of the latent variables. The variance distribution may give some insight: here the latent variables appear well regularized (variance close to 1), but a few of them have variance = 0, which suggests that some variables are used for reconstruction only (I observe that in runs with kl_coefficient = 0, all variances are pushed towards 0).
  • Reconstructions are not perfect, but most details seem to be preserved
  • Generations are still not satisfactory, but they are more promising than previous results
  • When visualizing the activations, I notice that some filters are pushed to 0 (picture below). This might be due to the 'dying ReLU' problem. I plan to try different activation functions such as leaky ReLU or Swish.
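The variance diagnostic in the first bullet can be sketched as a simple count of collapsed dimensions. This is an illustrative helper, not the report's actual logging code; the `threshold` value is an assumption:

```python
import numpy as np

def count_collapsed_dims(variances, threshold=0.1):
    # A latent dimension whose average posterior variance (over all encoded
    # points) collapses towards 0 is nearly deterministic: the KL term failed
    # to regularize it, suggesting it is used for reconstruction only.
    avg_var = variances.mean(axis=0)  # average over encoded points
    return int(np.sum(avg_var < threshold))

# Example: 100 encoded points, 640 latent dims (as in this run);
# 5 dims collapsed to variance ~0, the rest near the prior variance of 1.
variances = np.ones((100, 640))
variances[:, :5] = 0.01
print(count_collapsed_dims(variances))  # -> 5
```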
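On the dying-ReLU point: leaky ReLU keeps a small gradient for negative inputs, so units cannot get permanently stuck at zero. A minimal sketch of the function (the `negative_slope` default mirrors the common PyTorch choice; this is not the run's model code):

```python
import numpy as np

def leaky_relu(x, negative_slope=0.01):
    # Unlike ReLU, negative inputs are scaled rather than zeroed, so the
    # gradient for x < 0 is negative_slope instead of 0.
    return np.where(x >= 0, x, negative_slope * x)

x = np.array([-2.0, 0.0, 3.0])
print(leaky_relu(x))  # negative input becomes -0.02 instead of 0
```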


[Charts: three training curves plotted against Step (0–80); metric names not preserved in the export]