Accurate and realistic lighting is a key requirement for any 3D graphics application that aspires to photorealistic rendering quality, whether a game, an architectural visualization tool or an offline rendering engine. Indeed, a rigorous physically based approach to light transport is what separates graphics systems that can produce correct, unbiased images from those that cannot.
However, a perfectly faithful simulation of every quantum emitted by even a single light bulb, restricted to the narrow spectrum of visible light (in the neighbourhood of 10^20 photons per second or more), is known to be prohibitively expensive. For more than three decades, global light transport, and indirect illumination in particular, has therefore been considered a hard problem, one that has inspired many creative approximations to cut down rendering time.
Physically based techniques have nevertheless remained the undisputed best way to attack the problem. The algorithm that best embodies this approach is path tracing, a highly parallel Monte Carlo method that constructs physically plausible paths connecting the camera to the light sources in the virtual scene.
The chief obstacle to making path tracing practical in interactive applications is the inevitable presence of random noise in the output. This is usually dealt with by brute force: each surface point visible from the camera is sampled thousands of times and the results are numerically integrated until the image converges, although more sophisticated noise reduction techniques have also been developed. A different way to fight noise is to exploit spatial coherence, both in the scene geometry and among the nearly parallel rays of the perspective camera.
This thesis proposes a natural and intuitive extension to path tracing engines that exploits spatial coherence to remove noise, speed up rendering, or both. A basic demonstrative implementation is provided in C++, alongside a reference path tracer that shares part of its code. The two solutions are compared in terms of code complexity, generality and practical performance on simple test cases.