Display of 3D Holoscopic content on ... - Brunel University


ICT Project 3D VIVANT – Deliverable 6.4, Contract no. 248420
Display of 3D Holoscopic Content on Auto-stereoscopic Display

Figure 8: Micro-lens array in integral ray tracing.

The structure of the lenses and the camera model in 3D holoscopic computer graphics affect the way primary rays are spawned, as well as the spatial coherence among them.

The camera model used for each micro-lens is the pinhole approximation, where each micro-lens acts as a separate camera. The result is a set of multiple cameras, each of which records a micro-image of the virtual scene from a different angle (see Figure 9). Primary rays pass through the centre of the micro-lens and the image plane. The scene image straddles the micro-lens array; therefore there are two recording directions, in front of and behind the micro-lens array.

Figure 9: Camera model of 3D holoscopic imaging for computer graphics.

The specific characteristics of 3D holoscopic imaging allow us to deal with each cylindrical lens separately from the others, and to measure the number of pixels behind each lens, the focal length, and the image width. All these parameters, including the number of lenslets in the virtual cylindrical array, are selected on the basis of the characteristics of the display device.

The pixel intensity values of the micro-image for each lenslet are read, saved, and then mapped to
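The per-lenslet pinhole model described above can be sketched in code. The snippet below is a minimal illustration, not the deliverable's implementation: it spawns one primary ray per pixel behind each cylindrical lenslet, with each ray passing through that lenslet's optical centre. All names, units, and the coordinate layout (array centred on x = 0, image plane one focal length behind the lenslets) are illustrative assumptions.

```python
import numpy as np

def primary_rays(n_lenslets, pixels_per_lenslet, lens_pitch, focal_length):
    """Spawn primary rays for a virtual cylindrical (lenticular) lenslet array.

    Each lenslet is treated as an independent pinhole camera: every pixel
    behind a lenslet sends one ray through that lenslet's optical centre.
    Illustrative sketch only; parameter names are assumptions.
    """
    origins, directions = [], []
    for i in range(n_lenslets):
        # Optical centre of lenslet i, with the array centred on x = 0.
        cx = (i - (n_lenslets - 1) / 2.0) * lens_pitch
        for p in range(pixels_per_lenslet):
            # Pixel position on the image plane, one focal length behind the lenslet.
            px = cx + (p - (pixels_per_lenslet - 1) / 2.0) * (lens_pitch / pixels_per_lenslet)
            origin = np.array([px, 0.0, -focal_length])
            centre = np.array([cx, 0.0, 0.0])
            d = centre - origin
            origins.append(origin)
            directions.append(d / np.linalg.norm(d))  # unit-length ray direction
    return np.array(origins), np.array(directions)
```

Because each lenslet's rays converge on its own centre, neighbouring lenslets record the scene from slightly different angles, which is what produces the set of micro-images.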

pixel locations on the screen, so that all the vertical slots are displayed at the same time, forming the 3D holoscopic image. The location of the vertical elemental image on the computer screen is identical to the location of the corresponding lenslet in the virtual lens array.

Figure 10 shows the 3D scene “cutlery” rendered in OpenGL. The scene was first built in 3ds Max, then exported to OpenGL via Blender and saved as MD2 files. Each object in the scene was exported, saved, and used as a separate MD2 file. Figure 11 shows the 3D holoscopic image resulting from the projection of the scene through the virtual lenticular lens array.

Figure 10: Cutlery scene in OpenGL.

Figure 11: Cutlery scene in OpenGL after projection through a virtual lenticular lens array.

4. DISPLAY OF 3D HOLOSCOPIC CONTENT ON MULTIVIEW STEREO DISPLAYS

Original experimentation was carried out on the Philips multiview auto-stereoscopic display, studied as an example of commercially available 3D displays. A 20-inch 3D
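The lenslet-to-screen mapping described above, where each vertical elemental image is written to the screen columns matching its lenslet's position in the virtual array, might be sketched as follows. The array shapes and layout here are illustrative assumptions, not the deliverable's actual data format.

```python
import numpy as np

def interleave_elemental_images(micro_images):
    """Map per-lenslet micro-images to screen pixel columns.

    `micro_images` is assumed to have shape (n_lenslets, pixels_per_lenslet,
    height). The vertical elemental image of lenslet i is copied to the block
    of screen columns at the same position as lenslet i in the virtual array,
    so all vertical slots appear simultaneously, forming the holoscopic image.
    """
    n_lenslets, ppl, height = micro_images.shape
    screen = np.zeros((height, n_lenslets * ppl))
    for i in range(n_lenslets):
        # Columns i*ppl .. (i+1)*ppl-1 correspond to lenslet i's position.
        screen[:, i * ppl:(i + 1) * ppl] = micro_images[i].T
    return screen
```

Viewed through a physical lenticular sheet whose pitch matches `pixels_per_lenslet` screen columns, each column block is refracted towards a different viewing angle.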

