Introduction

Lighting plays an important role in digital and matte painting. Unlike physical illumination in the real world or in rendered scenes, the lighting effects in digital paintings are created by artists using heterogeneous strokes. In the paper "Generating Digital Painting Lighting Effects via RGB-space Geometry," the authors propose an image processing algorithm to generate digital painting lighting effects from a single image. The algorithm is based on a key observation: artists use many overlapping strokes to paint lighting effects, i.e., pixels with a dense stroke history tend to receive more illumination. Based on this observation, they devised an algorithm that can generate convincing lighting effects from a single image without any recorded stroke history.

The Paper →

Here's a video released by the authors of the paper that provides an overview of the proposed algorithm.

video_link

Overview of The Paper

The proposed algorithm mimics, step by step, artists' lighting-effect composition workflow. In current artistic workflows, artists paint these lighting effects manually and modify them tediously to find the best composition. To create usable lighting effects, artists usually first draw some global or coarse illumination layers, and then retouch the details of these layers to fit the original image content naturally. The drawback of this process is that it is slow and hard to iterate on: every change to the composition means repainting and re-retouching the layers by hand.

The algorithm rests on the key assumption that artists' newly painted strokes are related to their previous stroke history. However, most digital paintings do not come with this stroke history, and recording it is impractical for high-resolution paintings.

Proposed Method

Key Features

The key idea of the paper is to estimate a per-pixel "stroke density" directly from the geometry of the image's colors in RGB space, hence the title, so that no recorded stroke history is needed. This estimated density then controls where the generated lighting effects accumulate, and the whole pipeline works from a single input image.
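To make the observation concrete, here is a toy sketch of the idea, not the paper's actual algorithm: a crude stroke-density proxy is computed from each pixel's distance to the image's mean color in RGB space, and a simple directional lighting gradient is scaled by that proxy. The function names, the distance-to-mean heuristic, and the default parameters are all my own simplifications for illustration.

```python
import cv2
import numpy as np

def estimate_stroke_density(image):
    # Toy proxy: colors far from the image's mean RGB color are treated as
    # "densely painted" regions. The paper derives density from RGB-space
    # geometry with a proper construction; this heuristic is only illustrative.
    colors = image.reshape(-1, 3).astype(np.float32)
    mean_color = colors.mean(axis=0)
    dist = np.linalg.norm(colors - mean_color, axis=1)
    density = (dist - dist.min()) / (dist.max() - dist.min() + 1e-6)
    return density.reshape(image.shape[:2])

def relight(image, light_x=0.8, light_y=0.2, strength=0.6):
    # Simple directional lighting gradient, scaled by the stroke-density proxy
    # so "densely painted" pixels catch more of the added light.
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    gradient = (xs / w) * light_x + (ys / h) * light_y
    gain = 1.0 + strength * gradient * estimate_stroke_density(image)
    lit = np.clip(image.astype(np.float32) * gain[..., None], 0, 255)
    return lit.astype(np.uint8)

if __name__ == "__main__":
    img = cv2.imread("example001.jpeg")  # assumes the example image from the repo
    cv2.imwrite("relit.png", relight(img))
```

The real method replaces this heuristic with the geometric construction described in the paper; the sketch is only meant to show how a density map can steer where painted light ends up.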

The Code

Let's try to generate some interesting lighting effects ourselves.

The Code →

conda create --name paintlight python=3.6.10
conda activate paintlight
git clone https://github.com/ayulockin/PaintingLight
cd PaintingLight/code
# install the dependencies listed in the repo's README, then run:
python default.py example001.jpeg

**Note:** In my fork of the original repo you need to sign up at wandb.com so that you can visualize the results more interactively in the Weights & Biases dashboard.

Even if you are using the original repo, you can follow the steps shown here and in the README. That will open an OpenCV window with the image; hover your mouse over the image to see the lighting effect. To save these effects, however, you'll need to log the outputs to Weights & Biases.
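If you want to keep the rendered frames rather than only viewing them in the OpenCV window, logging them to Weights & Biases takes only a few lines. The snippet below is a minimal sketch of what that logging might look like; `relit_image` is a placeholder for whatever frame the script renders, and the project name is just an example, not the fork's actual setup.

```python
import numpy as np
import wandb

# Start a run; the project name here is only an example.
wandb.init(project="paintinglight", name="example001")

# Placeholder frame: replace with the image actually rendered by the script.
relit_image = np.zeros((256, 256, 3), dtype=np.uint8)

# Log the frame so it shows up in the W&B dashboard.
wandb.log({"lighting_effect": wandb.Image(relit_image)})

wandb.finish()
```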

The Results

Selected Results
Tweaking the Parameters

To try out these lighting effects on your own image, run:

python default.py path_to_image

You can also play around with the parameter values to produce the desired lighting effect. A list of all the parameters with their recommended values can be found in the README of the repo. I experimented with a few parameters by sweeping each of them from its minimum to its maximum value while keeping the x and y values constant.

I encourage you to play with the values to generate visually appealing images.
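A convenient way to explore a single parameter is to sweep it from its minimum to its maximum value and log every render so the results can be compared side by side in the dashboard. The sketch below assumes a hypothetical `render_lighting_effect(image, light_intensity=...)` helper standing in for the repo's rendering routine; the real parameter names and entry points are listed in the README, so treat this purely as a pattern.

```python
import cv2
import numpy as np
import wandb

def render_lighting_effect(image, light_intensity):
    # Hypothetical stand-in for the repo's rendering routine; it simply
    # brightens the image so the sweep below has something to show.
    lit = image.astype(np.float32) * (1.0 + light_intensity)
    return np.clip(lit, 0, 255).astype(np.uint8)

wandb.init(project="paintinglight", name="intensity-sweep")
img = cv2.imread("example001.jpeg")

# Sweep the (hypothetical) light_intensity parameter from its minimum to its
# maximum value and log each result for side-by-side comparison.
for intensity in np.linspace(0.0, 1.0, num=6):
    result = render_lighting_effect(img, light_intensity=float(intensity))
    wandb.log({"sweep": wandb.Image(result, caption=f"light_intensity={intensity:.2f}")})

wandb.finish()
```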

The Code →

Conclusion and One Last Trick

Masking a region

The paper also shows how to mask your image so that the lighting effect is applied only to the desired regions. To do so, you generate a binary mask and pass it to the algorithm. We've visualized the example provided by the authors in the figure below. Observe how only the giraffe receives the horizontal lighting effect.
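Building such a binary mask is straightforward with OpenCV. The sketch below draws a rough polygon around the region you want to relight, using placeholder coordinates, and saves it as a black-and-white image; how the mask is then handed to the algorithm depends on the repo, so check the README for the exact option.

```python
import cv2
import numpy as np

# Build a binary mask: white (255) where the lighting effect should apply,
# black (0) everywhere else. The polygon coordinates are placeholders; trace
# them around your own subject (e.g. the giraffe).
image = cv2.imread("example001.jpeg")
mask = np.zeros(image.shape[:2], dtype=np.uint8)

region = np.array([[120, 80], [300, 60], [340, 400], [100, 420]], dtype=np.int32)
cv2.fillPoly(mask, [region], 255)

cv2.imwrite("mask.png", mask)
```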

Conclusion

This was an exciting project to work on. The project is still in a research phase, and the authors are working on a Photoshop plug-in to enable widespread adoption. I hope you liked the visuals and that they inspire you to give the repo a try yourself. It's super exciting that even in today's era of deep learning, classical image processing algorithms can at times generate really stunning results. For any feedback, feel free to reach out to me on Twitter @ayushthakur0.
