The RenderMan Interface - Paul Bourke
This procedure sets the times at which the shutter opens and closes. min should be
less than max. If min == max, no motion blur is done.

RIB BINDING
Shutter min max

EXAMPLE
RiShutter (0.1, 0.9);

SEE ALSO
RiMotionBegin
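The effect of the shutter interval on motion blur can be sketched numerically. The following is an illustrative Python model, not part of the RenderMan API: each pixel sample is given a time within [min, max], and when min == max every sample sees the scene at the same instant, so no blur results. The function name and the uniform time distribution are assumptions for illustration (a real renderer would stratify these times).

```python
import random

def sample_shutter_times(t_min, t_max, n_samples, rng=None):
    """Assign each of n_samples a time within the shutter interval.

    Illustrative model only: when t_min == t_max the shutter is
    effectively instantaneous, every sample gets the same time,
    and no motion blur occurs.
    """
    rng = rng or random.Random(0)
    if t_min == t_max:
        return [t_min] * n_samples
    return [rng.uniform(t_min, t_max) for _ in range(n_samples)]

# RiShutter (0.1, 0.9): sample times spread across the open interval
times = sample_shutter_times(0.1, 0.9, 4)

# RiShutter (0.5, 0.5): all samples at t = 0.5, hence no blur
frozen = sample_shutter_times(0.5, 0.5, 4)
```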
4.1.2 Displays
The graphics state contains a set of parameters that control the properties of the display
process. The complete set of display options is given in Table 4.3, Display Options.
Rendering programs must be able to produce color, coverage (alpha), and depth images,
and may optionally be able to produce “images” of arbitrary geometric or shader-computed
data. Display parameters control how the values in these images are converted into a
displayable form. It is often possible to use none of the procedures described in this
section; in that case the rendering process and the images it produces are described in a
completely device-independent way. If a rendering program is designed for a specific
display, it has appropriate defaults for all display parameters. The defaults given in
Table 4.3, Display Options characterize a file to be displayed on a hypothetical video
framebuffer.
The output process is different for color, alpha, and depth information (see Figure 4.2,
Imaging Pipeline). The hidden-surface algorithm produces a representation of the light
incident on the image plane. This color image is either continuous or sampled at a rate
that may be higher than the resolution of the final image. The minimum sampling rate can
be controlled directly, or can be indicated by the estimated variance of the pixel values.
These color values are filtered with a user-selectable filter and filterwidth, and sampled at
the pixel centers. The resulting color values are then multiplied by the gain and passed
through an inverse gamma function to simulate the exposure process. The resulting colors
are then passed to a quantizer, which scales the values and optionally dithers them before
converting them to fixed-point integers. It is also possible to interpose a programmable
imager (written in the Shading Language) between the exposure process and the quantizer.
This imager can be used to perform special-effects processing, to compensate for
nonlinearities in the display media, and to convert to device-dependent color spaces (such
as CMYK or pseudocolor).
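The exposure and quantization steps described above can be sketched numerically. The following is a minimal Python model, not the renderer's implementation: the exposure step multiplies by the gain and applies the inverse gamma, and the quantizer scales, adds an optional random dither offset, rounds, and clamps. The parameter names and the symmetric dither term are assumptions chosen for illustration.

```python
import random

def expose(value, gain=1.0, gamma=1.0):
    """Exposure step: multiply by gain, then apply an inverse gamma
    function (value ** (1/gamma)) to simulate the exposure process."""
    return (value * gain) ** (1.0 / gamma)

def quantize(value, one=255, q_min=0, q_max=255, dither=0.0, rng=None):
    """Quantizer sketch: scale so 1.0 maps to `one`, add an optional
    random dither offset, round to an integer, and clamp to
    [q_min, q_max] before fixed-point output."""
    rng = rng or random.Random(0)
    q = round(value * one + dither * (2.0 * rng.random() - 1.0))
    return max(q_min, min(q_max, q))

# A mid-grey value exposed for a hypothetical gamma-2.2 display,
# then quantized to 8 bits:
v = expose(0.5, gain=1.0, gamma=2.2)
pixel = quantize(v, one=255)          # → 186
```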
Final output alpha is computed by multiplying the coverage of the pixel (i.e., the subpixel
area actually covered by a geometric primitive) by the average of the color opacity
components. If an alpha image is being output, the color values will be multiplied by this
alpha before being passed to the quantizer. Color and alpha use the same quantizer.
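The alpha computation just described can be sketched directly. This is an illustrative Python model of the two sentences above (the function names are assumptions): alpha is coverage times the average opacity component, and the color channels are multiplied by that alpha (premultiplied alpha) before quantization.

```python
def output_alpha(coverage, opacity):
    """Final output alpha: pixel coverage (subpixel area covered by a
    geometric primitive) times the average of the opacity components."""
    return coverage * sum(opacity) / len(opacity)

def premultiply(color, alpha):
    """When an alpha image is output, each color value is multiplied
    by alpha before being passed to the quantizer."""
    return [c * alpha for c in color]

# A primitive covering half a pixel with uniform opacity 0.8:
a = output_alpha(0.5, [0.8, 0.8, 0.8])   # ≈ 0.4
rgb = premultiply([1.0, 0.5, 0.25], a)
```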