Colorizing Black and White Images

How can we add realistic color to black & white images? Explore the effect of up-convolutions, weight decay, and deeper architectures.
Nicholas Bardy

Featured Report

This report is a saved snapshot of Boris' research. He's published this example so you can see how to use W&B to visualize training and keep track of your work. Feel free to add a visualization, click on graphs and data, and play with features. Your edits won't overwrite his work.

Project Description

The goal is to colorize black and white pictures of flowers in a realistic way.

Join the collaborative development of models in our Colorizer Benchmark →


In the graphs above, I've visualized a few of the most interesting runs.
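For readers following along, here is a minimal sketch of how runs like these get logged so they show up as W&B graphs. The project name, config keys, and toy loss values are my assumptions, not numbers from these runs.

```python
import math
import wandb

# Hypothetical run setup -- project name, config, and the synthetic
# loss curve below are assumptions, not values from the report.
wandb.init(project="colorizer", config={"layers": 6, "weight_decay": 0.0})

for epoch in range(10):
    loss = math.exp(-0.3 * epoch)  # stand-in for a real training loss
    wandb.log({"epoch": epoch, "loss": loss})

wandb.finish()
```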

Surprisingly, replacing the simple up-sampling layers with up-convolutions did not affect accuracy, while increasing the model size by about 35%!

This suggests that the up-convolution filters learn nothing more interesting than rudimentary interpolation.
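To make the swap concrete, here is a minimal tf.keras sketch of the two decoder blocks being compared; the framework, feature-map shape, and filter count are assumptions for illustration, not the report's exact model.

```python
import tensorflow as tf
from tensorflow.keras import layers

# A hypothetical decoder feature map; the shape is an assumption.
x = tf.keras.Input(shape=(32, 32, 128))

# Variant A: fixed nearest-neighbor up-sampling -- zero trainable weights.
up_a = layers.UpSampling2D(size=2)(x)

# Variant B: a learned up-convolution (transposed convolution) -- its
# 3x3 kernel adds ~150k parameters here, yet in these runs it learned
# nothing beyond rudimentary interpolation.
up_b = layers.Conv2DTranspose(128, 3, strides=2, padding="same")(x)

print(tf.keras.Model(x, up_a).count_params())  # 0
print(tf.keras.Model(x, up_b).count_params())  # 147584
```

Both variants produce the same output shape; the up-sampling block simply contributes no trainable weights, so swapping in transposed convolutions is where the extra parameters come from.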


I tried to go as deep as possible, hoping it would lead to better final accuracy.

The 7-layer architecture quickly shows signs of overfitting, so I decided to settle on 6 layers.

Note: the 7-layer run starts at a lower loss because its training actually began a few epochs before this log.
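As a hypothetical sketch of what "going deeper" means here, depth can be exposed as a single hyperparameter in the model builder. The encoder-decoder layout, filter counts, and input size below are all assumptions, not the report's exact architecture.

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_colorizer(n_layers=6, base_filters=16, size=256):
    """Hypothetical encoder-decoder: grayscale in, RGB out."""
    inputs = tf.keras.Input(shape=(size, size, 1))
    x = inputs
    # Encoder: each layer halves the resolution and doubles the filters.
    for i in range(n_layers):
        x = layers.Conv2D(base_filters * 2 ** i, 3, strides=2,
                          padding="same", activation="relu")(x)
    # Decoder: mirror with plain up-sampling (see the comparison above).
    for i in reversed(range(n_layers)):
        x = layers.UpSampling2D()(x)
        x = layers.Conv2D(base_filters * 2 ** i, 3,
                          padding="same", activation="relu")(x)
    outputs = layers.Conv2D(3, 1, activation="sigmoid")(x)
    return tf.keras.Model(inputs, outputs)

# 7 layers overfit quickly in these runs, so settle on 6.
model = build_colorizer(n_layers=6)
```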


Using weight decay made the training much slower.

After running tests with different weight-decay contribution factors, I decided I didn't have enough time or resources to use it.
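For reference, one common way to apply weight decay in tf.keras is an L2 kernel regularizer on each convolution, with the contribution factor as its coefficient. This is a sketch under that assumption; the value below is illustrative, not one of the factors actually tested.

```python
import tensorflow as tf
from tensorflow.keras import layers, regularizers

# Assumed coefficient -- one of several "contribution factors" to sweep.
WEIGHT_DECAY = 1e-4

def conv_block(filters):
    """3x3 conv whose kernel gets an L2 penalty: WEIGHT_DECAY * ||W||^2
    is added to the training loss, which typically slows its decrease."""
    return layers.Conv2D(filters, 3, padding="same", activation="relu",
                         kernel_regularizer=regularizers.l2(WEIGHT_DECAY))
```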
