
Image Colorizer

This report contains the most interesting training logs from my neural network for the W&B colorizer challenge.

Section 1

The goal is to colorize black-and-white pictures of flowers in a realistic way.


The latest version of the code is available on the GitHub page.

The most interesting runs are represented below:

  • baseline made of 5 layers (a rough sketch of the baseline follows this list)

  • baseline in which the up-sampling layers are replaced with up-convolutions

  • 6-layer architecture

  • 6-layer architecture with weight decay
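
Since the report itself doesn't include the code, here is a minimal sketch of what the 5-layer baseline could look like, assuming PyTorch and a Lab-style setup (1 grayscale channel in, 2 colour channels out). The channel widths and colour space are illustrative assumptions; the actual architecture is in the repository linked above.

```python
# Minimal sketch of the 5-layer baseline, assuming PyTorch and a Lab-style
# setup (1 grayscale channel in, 2 colour channels out). Channel widths and
# the colour space are illustrative assumptions, not the repository's values.
import torch.nn as nn

def conv_block(in_ch, out_ch, stride=1):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, stride=stride, padding=1),
        nn.ReLU(inplace=True),
    )

baseline = nn.Sequential(
    conv_block(1, 32, stride=2),                 # encoder: downsample
    conv_block(32, 64, stride=2),                # encoder: downsample
    conv_block(64, 64),                          # bottleneck
    nn.Upsample(scale_factor=2),                 # parameter-free up-sampling
    conv_block(64, 32),                          # decoder
    nn.Upsample(scale_factor=2),
    nn.Conv2d(32, 2, kernel_size=3, padding=1),  # predict the 2 colour channels
)
```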




[Panel: Run set 1, 4 runs]


Section 2

Surprisingly, replacing the simple up-sampling layers with up-convolutions did not affect the accuracy while increasing the model size by about 35%!

This suggests that the up-convolution filters do not learn anything more interesting than a rudimentary interpolation.
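
For reference, here is an illustrative comparison (assuming PyTorch) of a parameter-free up-sampling layer and a learned up-convolution for one decoder step; the 64-channel width is an assumption, not necessarily the value used in the repository.

```python
# Illustrative parameter count for a single decoder step, assuming PyTorch;
# the 64-channel width is an assumption, not the repository's value.
import torch.nn as nn

upsample = nn.Upsample(scale_factor=2, mode="nearest")        # no learned weights
upconv = nn.ConvTranspose2d(64, 64, kernel_size=2, stride=2)  # learned up-convolution

print(sum(p.numel() for p in upsample.parameters()))  # 0
print(sum(p.numel() for p in upconv.parameters()))    # 64*64*2*2 + 64 = 16448
```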




[Panel: Run set 1, 2 runs]


Section 3

I tried to go as deep as possible (hoping it would lead to better final accuracy).

The 7-layer architecture quickly shows some over-fitting, so I decided to settle on 6 layers.

Note: the 7-layer architecture starts at a lower loss because the training actually started a few epochs before this log.
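
For context, here is a minimal sketch of how the training and validation losses could be logged to W&B each epoch, so that the over-fitting gap shows up as diverging curves in the panel below. The project name, metric names, and the surrounding training loop are assumptions, not the report's actual code.

```python
# Minimal logging sketch, assuming the wandb Python client; the project
# name and metric names are assumptions, not the report's actual setup.
import wandb

wandb.init(project="colorizer", config={"layers": 7})

def log_losses(epoch, train_loss, val_loss):
    # Logging both losses in one call overlays them on a single chart,
    # where a widening train/validation gap signals over-fitting.
    wandb.log({"epoch": epoch, "train_loss": train_loss, "val_loss": val_loss})

# called once per epoch from the (hypothetical) training loop:
#   log_losses(epoch, train_loss, val_loss)
```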




[Panel: Run set 1, 3 runs]


Section 4

Using weight decay made the training much slower.

After running tests with different weight-decay factors, I decided I didn't have enough time or resources to use it.
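
For completeness, here is a sketch (assuming PyTorch and the Adam optimizer) of how a weight-decay factor would be swept through the optimizer; the stand-in model, learning rate, and factors are illustrative only, not the values tested in the report.

```python
# Sketch of sweeping the weight-decay factor, assuming PyTorch and Adam;
# the stand-in model, learning rate, and factors are illustrative only.
import torch.nn as nn
import torch.optim as optim

model = nn.Sequential(nn.Conv2d(1, 2, kernel_size=3, padding=1))  # stand-in for the colorizer

for weight_decay in (1e-5, 1e-4, 1e-3):
    optimizer = optim.Adam(model.parameters(), lr=1e-3, weight_decay=weight_decay)
    # one training run per factor; the resulting loss curves are compared below
```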




[Panel: Run set 1, 4 runs]