3D graphics eBook - Course Materials Repository
High dynamic range rendering
In 3D computer graphics, high dynamic range rendering (HDRR or HDR rendering), also known as high dynamic range lighting, is the rendering of computer graphics scenes using lighting calculations performed in a larger dynamic range. This preserves details that would otherwise be lost to limited contrast ratios. Video games, computer-generated films, and special effects benefit from this, as it produces more realistic scenes than simpler lighting models.
Graphics processor company Nvidia summarizes the motivation for HDRR in three points: [1] 1) bright things can be really bright, 2) dark things can be really dark, and 3) details can be seen in both.

[Figure: A comparison of standard fixed-aperture rendering (left) with HDR rendering (right) in the video game Half-Life 2: Lost Coast]
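The core idea above, computing light in a wide floating-point range and only compressing to the display's range at the end, can be illustrated with a small sketch. This is a minimal illustration using the well-known Reinhard operator L / (1 + L) as the compression step; it is one common choice of tone-mapping curve, not the method of any particular engine named here.

```python
def reinhard_tonemap(luminance):
    """Compress an HDR luminance value (0..infinity) into [0, 1) for display.

    Lighting is computed in an unbounded floating-point range; this final
    step maps it to the limited range a conventional display can show,
    while keeping relative detail in both shadows and highlights.
    """
    return luminance / (1.0 + luminance)

# HDR scene luminances: deep shadow, mid-tone, bright sky, direct sun.
hdr_samples = [0.01, 0.5, 4.0, 100.0]
ldr_samples = [reinhard_tonemap(v) for v in hdr_samples]

for hdr, ldr in zip(hdr_samples, ldr_samples):
    print(f"HDR {hdr:7.2f} -> display {ldr:.3f}")
```

Note that very bright values (here 100.0) approach but never reach 1.0, so extreme highlights stay distinguishable instead of clipping to pure white, which is exactly the detail preservation the paragraph above describes.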
History<br />
The use of high dynamic range imaging (HDRI) in computer graphics was introduced by Greg Ward in 1985 with his open-source Radiance rendering and lighting simulation software, which created the first file format to retain a high-dynamic-range image. HDRI then languished for more than a decade, held back by limited computing power, storage, and capture methods; only relatively recently has the technology to put it into practical use been developed. [2] [3]
In 1990, Nakame et al. presented a lighting model for driving simulators that highlighted the need for high-dynamic-range processing in realistic simulations. [4]
In 1995, Greg Spencer presented Physically-based glare effects for digital images at SIGGRAPH, providing a quantitative model for flare and blooming in the human eye. [5]
In 1997, Paul Debevec presented Recovering high dynamic range radiance maps from photographs [6] at SIGGRAPH, and the following year presented Rendering synthetic objects into real scenes. [7] These two papers laid the framework for creating HDR light probes of a location and then using such a probe to light a rendered scene. HDRI and HDRL (high-dynamic-range image-based lighting) have since been used in many 3D scenes in which inserting a 3D object into a real environment requires light-probe data to provide realistic lighting.
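The light-probe workflow from Debevec's papers can be sketched at toy scale: captured HDR radiance from the environment is gathered over the hemisphere above a surface to light an inserted object. This is a hypothetical illustration only; the probe here is a short list of (direction, radiance) samples standing in for a real HDR environment map, and the cosine-weighted average is a simplification of the actual rendering integral.

```python
import math

def normalize(v):
    """Scale a 3-vector to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def diffuse_from_probe(normal, probe_samples):
    """Cosine-weighted average of probe radiance over the hemisphere.

    Only directions above the surface (positive cosine with the normal)
    contribute, mimicking diffuse image-based lighting from a light probe.
    """
    normal = normalize(normal)
    total, weight = 0.0, 0.0
    for direction, radiance in probe_samples:
        d = normalize(direction)
        cos_t = sum(a * b for a, b in zip(normal, d))
        if cos_t > 0.0:  # ignore light arriving from below the surface
            total += radiance * cos_t
            weight += cos_t
    return total / weight if weight else 0.0

# Toy HDR probe: bright "sky" overhead, dimmer side light, dark "ground".
probe = [((0, 1, 0), 5.0), ((1, 1, 0), 3.0), ((0, -1, 0), 0.2)]

up_lit = diffuse_from_probe((0, 1, 0), probe)    # surface facing the sky
down_lit = diffuse_from_probe((0, -1, 0), probe)  # surface facing the ground
```

Because the probe stores unclipped HDR radiance, an upward-facing surface ends up much brighter than a downward-facing one, which is what makes an inserted object's lighting match its real surroundings.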
In gaming applications, Riven: The Sequel to Myst used an HDRI post-processing shader directly based on Spencer's paper in 1997. [8] After E3 2003, Valve Software released a demo movie of its Source engine rendering a cityscape in high dynamic range. [9] The term did not come into common use again until E3 2004, when it gained much more attention after Valve Software announced Half-Life 2: Lost Coast and Epic Games showcased Unreal Engine 3, alongside open-source engines such as OGRE 3D and open-source games like Nexuiz.