Image-Based Lighting

Scene rendered with Image-Based Lighting

What is it?

What you see above is The Shaderball lit by an image, not by standard 3D light sources. I’m going to talk about the idea (and implementation) of using images of environments to light 3D scenes. Two other common types of lights that I’ll eventually go over are punctual lights and area lights. Image-based lights use a 360° image of an environment to cast light onto 3D objects. The image should have a high dynamic range (an HDRI, or High Dynamic Range Image) so that the brightest parts of the environment are preserved for accurate lighting. The image is essentially ‘wrapped’ around the scene as a giant sphere, and rays are traced from objects out to that sphere to determine how the environment lights each point.
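To make the ‘wrapped sphere’ idea concrete, here’s a rough C++ sketch of an environment lookup. This isn’t my actual shader code: it assumes the HDRI is stored in the common equirectangular (latitude-longitude) layout, and the Image struct is just a stand-in for whatever image type a renderer uses.

```cpp
#include <algorithm>
#include <cmath>

struct Vec3  { float x, y, z; };
struct Color { float r, g, b; };

// Stand-in for a loaded HDR image: row-major float RGB pixels.
struct Image {
    int width, height;
    const float* pixels; // 3 floats per pixel
};

// Look up the environment color along a normalized world-space direction,
// assuming the HDRI uses the common equirectangular (latitude-longitude)
// layout: the horizontal angle maps to u, the vertical angle maps to v.
Color sampleEnvironment(const Image& env, const Vec3& dir)
{
    const float pi = 3.14159265358979f;
    float u = 0.5f + std::atan2(dir.x, -dir.z) / (2.0f * pi);
    float v = std::acos(std::clamp(dir.y, -1.0f, 1.0f)) / pi;

    // Nearest-neighbor fetch; a real renderer would filter the lookup.
    int x = std::min(int(u * env.width),  env.width  - 1);
    int y = std::min(int(v * env.height), env.height - 1);
    const float* p = env.pixels + 3 * (y * env.width + x);
    return { p[0], p[1], p[2] };
}
```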

Lighting from the environment

How Does it Work?

For every point on an object we shoot out a large number of rays to probe the environment. If a ray hits another object, that sample is black (the environment is blocked in that direction); otherwise the sample takes its color from the image as if it were mapped to a giant sphere surrounding the scene. All of these samples are then averaged to find the color at the point being shaded. By sending a large number of rays (possibly a thousand for every pixel) we can sample the environment accurately enough to produce correct, noise-free lighting. Using fewer samples decreases render time but results in a noisier image.
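Here’s what that naive estimator might look like in code, building on the sketch above. The traceRayOccluded function is a stand-in for the renderer’s visibility query, and this version does a plain average, exactly as described; a physically based version would also weight each sample by the surface’s BRDF and the sampling pdf (more on that below).

```cpp
#include <algorithm>
#include <cmath>
#include <random>

// Reuses Vec3, Color, Image, and sampleEnvironment from the sketch above.

// Stand-in for the renderer's visibility query: does a ray leaving
// 'origin' along 'dir' hit any geometry in the scene?
bool traceRayOccluded(const Vec3& origin, const Vec3& dir);

// Uniformly sample a direction on the hemisphere around the normal 'n':
// pick a uniform direction on the full sphere, then flip it into the
// upper hemisphere if needed.
Vec3 uniformHemisphereDirection(const Vec3& n, std::mt19937& rng)
{
    std::uniform_real_distribution<float> uni(0.0f, 1.0f);
    float z   = 1.0f - 2.0f * uni(rng);
    float r   = std::sqrt(std::max(0.0f, 1.0f - z * z));
    float phi = 2.0f * 3.14159265f * uni(rng);
    Vec3 d = { r * std::cos(phi), r * std::sin(phi), z };
    if (d.x * n.x + d.y * n.y + d.z * n.z < 0.0f)
        d = { -d.x, -d.y, -d.z };
    return d;
}

// Estimate the environment light arriving at a shading point: occluded
// rays contribute black, unoccluded rays contribute the environment color
// in their direction, and everything is averaged.
Color environmentLight(const Image& env, const Vec3& point, const Vec3& normal,
                       int sampleCount, std::mt19937& rng)
{
    Color sum = { 0.0f, 0.0f, 0.0f };
    for (int i = 0; i < sampleCount; ++i) {
        Vec3 dir = uniformHemisphereDirection(normal, rng);
        if (!traceRayOccluded(point, dir)) {
            Color c = sampleEnvironment(env, dir);
            sum.r += c.r; sum.g += c.g; sum.b += c.b;
        }
    }
    float inv = 1.0f / float(sampleCount);
    return { sum.r * inv, sum.g * inv, sum.b * inv };
}
```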

Scene rendered with IBL

Why?

Image-based lighting offers a convenient and quick (setup-wise) solution for incorporating 3D scenes into real environments. The alternative would be to place punctual light sources manually and try to match the real lighting. Lighting with images is also extremely flexible: simply swapping the image creates an entirely new lighting setup. In some cases you can even use rendered images of the 3D scene itself to efficiently fake global-illumination effects such as light bouncing off other objects.


Lighting from the environment

My Implementation

Computing image-based lighting is a bit more complex than I described above. For the result to be correct, it has to take surface parameters such as roughness into account. The rays that probe the environment are importance sampled according to the diffuse BRDF rather than sent out uniformly over the hemisphere, which lets us use much lower sample counts for the same image quality.
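For the curious, here is a sketch of the standard way to importance sample a Lambertian diffuse BRDF: cosine-weighted hemisphere sampling via Malley’s method. My actual sampler may differ in details, so treat this as an illustration (it reuses the Vec3 type from the first sketch).

```cpp
#include <algorithm>
#include <cmath>
#include <random>

// Reuses the Vec3 type from the first sketch.

Vec3 cross(const Vec3& a, const Vec3& b)
{
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

Vec3 normalize(const Vec3& v)
{
    float len = std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
    return { v.x / len, v.y / len, v.z / len };
}

// Cosine-weighted hemisphere sampling (Malley's method): sample a point
// on the unit disk, then lift it onto the hemisphere. The resulting
// directions have pdf = cos(theta) / pi, which matches the cosine falloff
// of a Lambertian diffuse surface, so rays concentrate where they matter.
Vec3 cosineSampleHemisphere(const Vec3& n, std::mt19937& rng)
{
    std::uniform_real_distribution<float> uni(0.0f, 1.0f);
    float r   = std::sqrt(uni(rng));
    float phi = 2.0f * 3.14159265f * uni(rng);
    float lx = r * std::cos(phi);
    float ly = r * std::sin(phi);
    float lz = std::sqrt(std::max(0.0f, 1.0f - lx * lx - ly * ly));

    // Build an orthonormal basis (t, b, n) around the normal to take the
    // local-space sample into world space.
    Vec3 helper = std::fabs(n.x) > 0.9f ? Vec3{ 0, 1, 0 } : Vec3{ 1, 0, 0 };
    Vec3 t = normalize(cross(n, helper));
    Vec3 b = cross(n, t);

    return { lx * t.x + ly * b.x + lz * n.x,
             lx * t.y + ly * b.y + lz * n.y,
             lx * t.z + ly * b.z + lz * n.z };
}
```

A nice consequence: since the pdf is cos(θ)/π, it cancels exactly against the cosine term and the 1/π of the Lambertian BRDF, so the estimator reduces to a straight average of the albedo times the environment color over the sampled directions.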

Lighting from the environment falls into the diffuse category, so all of the same rules that apply to diffuse reflections apply to the environmental lighting. I will expand on that in a later blog post about diffuse reflections.

My current implementation is controlled and computed by a light shader. In the future, the computation and most controls will be in the surface shader, and the light shader will exist only to turn on the effect. This makes more sense at a higher level, and is closer to the interaction between punctual and area light sources and surfaces.

Examples

Here are some images showing this light shader in action. There are two images for each example: the final render and the environmental contribution.