Lighting plays an important role in digital and matte painting. Unlike physical illumination in the real world or in rendered scenes, the lighting effects in digital paintings are created by artists using heterogeneous strokes. In the paper "Generating Digital Painting Lighting Effects via RGB-space Geometry", the authors propose an image processing algorithm that generates digital painting lighting effects from a single image. The algorithm is based on a key observation: artists use many overlapping strokes to paint lighting effects, i.e., pixels with a dense stroke history tend to gather more illumination strokes. Based on this observation, they devised an algorithm that can:
Here's a video released by the authors of the paper that provides an overview of the proposed algorithm.
The proposed algorithm mimics, step by step, artists' lighting-effect composition workflow. In current artistic workflows, artists manually paint these lighting effects and tediously modify them to find the best composition. To create usable lighting effects, artists usually first draw some global or coarse illumination layers, and then retouch the details of these layers so they naturally fit the original image content. The drawbacks of this process are:
The algorithm is based on the key assumption that artists' newly painted strokes are related to their previous stroke history. But most digital paintings do not come with this stroke history, and recording it is impractical because of the large amount of high-resolution data involved.
We directly estimate the density of the stroke history. We do this by first extracting a virtual palette from the input image, and then using the relationship between pixel colors and palette colors to estimate the stroke density. All of this can be done with only one finished illustration.
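To make the idea concrete, here is a toy sketch of those two steps: extract a small palette with a naive k-means, then score each pixel by how far it sits from its nearest palette color, treating blended, in-between colors as evidence of many overlapping strokes. This is an illustrative simplification, not the paper's exact RGB-space geometry, and all function names are my own.

```python
import numpy as np

def extract_palette(pixels, k=6, iters=10, seed=0):
    """Naive k-means in RGB space: a toy stand-in for palette extraction.
    pixels: (N, 3) float array of RGB values in [0, 1]."""
    rng = np.random.default_rng(seed)
    centers = pixels[rng.choice(len(pixels), size=k, replace=False)].copy()
    for _ in range(iters):
        # Assign every pixel to its nearest palette color.
        dists = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each palette color to the mean of its assigned pixels.
        for j in range(k):
            members = pixels[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return centers

def stroke_density(pixels, palette):
    """Toy proxy for stroke density: distance to the nearest palette color.
    Heavily blended pixels sit between palette colors, so they score higher."""
    dists = np.linalg.norm(pixels[:, None, :] - palette[None, :, :], axis=2)
    d = dists.min(axis=1)
    return d / (d.max() + 1e-8)
```

In practice you would reshape an (H, W, 3) image into an (H*W, 3) array of pixels before calling these helpers.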
Then we apply the lighting effect by mimicking artists' coarse-to-fine workflow.
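As a rough illustration of the shading step, one can treat the estimated density field as a height map and apply simple Phong-style diffuse shading from a movable light source. This is a simplified sketch, not the authors' implementation; the parameter names mirror the ones discussed later (light_source_height, light_intensity, ambient_intensity).

```python
import numpy as np

def relight(density, light_xy, light_source_height=1.0,
            light_intensity=0.85, ambient_intensity=0.45):
    """Shade a stroke-density field as if it were a height map (toy sketch).
    density: (H, W) array in [0, 1]; light_xy: (x, y) in pixel coordinates."""
    h, w = density.shape
    # Surface normals from the gradient of the density field.
    gy, gx = np.gradient(density)
    normals = np.dstack([-gx, -gy, np.ones_like(density)])
    normals /= np.linalg.norm(normals, axis=2, keepdims=True)
    # Per-pixel direction towards the light source.
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    scale = max(h, w)
    light = np.dstack([(light_xy[0] - xs) / scale,
                       (light_xy[1] - ys) / scale,
                       np.full_like(density, light_source_height)])
    light /= np.linalg.norm(light, axis=2, keepdims=True)
    # Lambertian diffuse term plus a constant ambient term.
    diffuse = np.clip((normals * light).sum(axis=2), 0.0, 1.0)
    return np.clip(ambient_intensity + light_intensity * diffuse, 0.0, 1.0)
```

Multiplying the original image by this shading map gives a crude relit result; the paper's method refines this considerably.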
The key features of this paper include:
Let's try to generate some interesting lighting effects ourselves.
Open the Anaconda command prompt, create a new conda environment, and clone the repository:

conda create --name paintlight
git clone https://github.com/ayulockin/PaintingLight
Install the dependencies listed in the README. Make sure you have a Python 3.6.x version in the created environment. If you don't, run:
conda install python=3.6.10
Then cd into the code directory. Now you are ready to play with any example, or you can use your own image to see the effects:

cd code
python default.py example001.jpeg
Even if you are using the original repo, you can follow the steps shown here and in the README. Running the script will open a cv2 window with the image. You can hover your mouse over the image to see the lighting effect. But to save these effects, you'll need to log the outputs to Weights & Biases.
To try out the stunning lighting effects on your own image, use:
python default.py path_to_image
You can also play around with the parameter values to produce the desired lighting effect. A list of all the parameters with their recommended values can be found in the README of the repo. I experimented with a few parameters by sweeping each from a minimum to a maximum value while keeping the light source's x and y positions constant.
light_source_height: This is the distance of the light source from the image. The recommended value is 1. To see the effect of this parameter I varied it in the range (0.1, 2) with a step size of 0.1. The result can be seen in the figure below. Observe that the greater the height of the light source, the smaller the dark region.
light_intensity: This is the intensity of the light. The recommended value is 0.85. To see the effect of this parameter I varied it in the range (0, 1) with a step size of 0.1. The result can be seen in the figure below.
ambient_intensity: This is the intensity of the ambient light. The recommended value is 0.45. To see the effect of this parameter I varied it in the range (0, 0.7) with a step size of 0.1. The result can be seen in the figure below.
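If you want to reproduce sweeps like these programmatically, a small helper can render one frame per parameter value. Here, render is a placeholder for whatever callable you use to apply the effect (for example, a wrapper around the repo's script), so treat this as a sketch rather than the repo's API.

```python
import numpy as np

def sweep(render, image, param, lo, hi, step):
    """Render one frame per value of `param` swept from lo to hi (inclusive).
    `render` is any callable invoked as render(image, **{param: value})."""
    frames = []
    # hi + step/2 makes the endpoint inclusive despite float rounding.
    for value in np.arange(lo, hi + step / 2, step):
        frames.append(render(image, **{param: round(float(value), 3)}))
    return frames
```

The resulting frames can then be stitched into a GIF for side-by-side comparison.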
I encourage you to play with the values to generate visually appealing images.
The paper also shows how to mask your image so the lighting effect is applied only in the desired regions. To do so, you can generate a binary mask and pass it to the algorithm. We've visualized the example provided by the authors in the figure below. Observe how only a horizontal effect is applied on the giraffe.
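A minimal way to apply such a mask, assuming you already have the relit image, is a simple per-pixel blend. This is a generic sketch, not the repo's exact interface.

```python
import numpy as np

def apply_masked_effect(original, relit, mask):
    """Blend the relit image into the original only where mask == 1.
    original, relit: (H, W, 3) float arrays; mask: (H, W) binary array."""
    m = mask.astype(float)[..., None]  # broadcast over the color channels
    return original * (1.0 - m) + relit * m
```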
This was an exciting project to work on. The project is still in a research phase, and the authors are working on a Photoshop plug-in to enable widespread adoption. I hope you liked the visuals and that they inspired you to give the repo a try yourself. It's super exciting that even in today's era of deep learning, classical image processing algorithms can at times generate really stunning results. For any feedback, feel free to reach out to me on Twitter @ayushthakur0.