Deliverable 2.3 - the School of Engineering and Design - Brunel ...
ICT Project 3D VIVANT – Deliverable 2.3
Contract no.: 248420
User Acceptance Validation Plan

Project Number: 248420
Project Title: 3D VIVANT
Deliverable Type: Report
CEC Deliverable Number: IST-248420/Brunel/WP2/PU/R/Del-2-3/ver 2
Resubmission Delivery Date: 15th September 2011
Actual Delivery Date: 1st September 2011
Title of the Deliverable: User Acceptance Validation Plan - Update
Workpackage: WP2
Nature of the Deliverable: Report
Organisations:
1 Brunel University
2 Centre for Research and Technology Hellas – Informatics and Telematics Institute
3 Institut für Rundfunktechnik GmbH
4 Holografika
5 RAI research centre
6 Rundfunk Berlin-Brandenburg
7 Instituto de Telecomunicações
8 European Broadcasting Union
9 Arnold & Richter Cine Technik
Authors: Nicolas de Abreu Pereira, Annette Duffy, Oliver Pidancet, Iordanis Biperis, Amar Aggoun, Emmanuel Tsekleves, John Cosmas, Johannes Steurer, Mario Muratori, Michael Meier, Ralf Neudel, Yvonne Thomas, Michael Weitnauer
Circulation List: Partners
Keywords: user validation, usability, test
01.09.11 1
CONTENTS

CONTENTS .......................................................................................................................................................... 2
TABLE CAPTIONS ............................................................................................................................................. 3
EXECUTIVE SUMMARY .................................................................................................................................. 4
DOCUMENT STRUCTURE ............................................................................................................................... 5
1 INTRODUCTION ....................................................................................................................................... 6
1.1 TARGET USERS ...................................................................................................................................... 6
1.2 OVERVIEW OF FUNCTIONS AND FEATURES TO BE TESTED ..................................................................... 7
2 METHODOLOGY ...................................................................................................................................... 8
2.1 QUALITATIVE VS. QUANTITATIVE ASSESSMENT METHODS ................................................................... 8
2.2 QUALITY ASSESSMENT ON 3D VIDEO .................................................................................................... 9
2.2.1 Stereoscopic 3D .............................................................................................................................. 10
2.2.2 Multiview Auto-stereoscopic 3D ..................................................................................................... 10
2.2.3 Data Collection Methods ................................................................................................................ 11
2.3 QUALITY ASSESSMENT ON 3D AUDIO ................................................................................................. 12
2.3.1 Methods for the Assessment of Sound Colour ................................................................................. 12
2.3.2 Methods for the Assessment of Localisation Quality ...................................................................... 13
2.4 QUALITY ASSESSMENT ON INTERACTIVE SOFTWARE .......................................................................... 13
2.4.1 User Acceptance and Usability ....................................................................................................... 14
2.4.2 Data Collection Methods ................................................................................................................ 14
3 PRODUCTION-SIDE ACCEPTANCE FACTORS ............................................................................... 16
3.1 HOLOSCOPIC CAMERA SETS ................................................................................................................. 16
3.2 PRODUCTION PROCESSES ..................................................................................................................... 17
4 END-USER ACCEPTANCE .................................................................................................................... 18
4.1 BROADCAST EXPERIENCE .................................................................................................................... 18
4.1.1 Viewing ........................................................................................................................................... 18
4.1.2 Hearing ........................................................................................................................................... 19
4.2 INTERACTIVE EXPERIENCE ................................................................................................................... 20
4.2.1 Video Hyperlinking Environment ................................................................................................... 20
4.2.2 Search and Retrieval Framework ................................................................................................... 21
5 ORGANISATION OF TESTS .................................................................................................................. 22
5.1 OVERVIEW ........................................................................................................................................... 22
5.2 TIME PLAN ........................................................................................................................................... 24
5.3 RECRUITING TESTERS .......................................................................................................................... 25
6 CONCLUSION .......................................................................................................................................... 26
7 REFERENCES .......................................................................................................................................... 27
ANNEX 1 ............................................................................................................................................................. 29
ANNEX 2 ............................................................................................................................................................. 31
TABLE CAPTIONS

Table 1: Professional Users Testplan
Table 2: End-Users Testplan
Table 3: Parameters Relevant to the Shooting Phase
Table 4: Parameters Relevant to the Recording Phase
Table 5: Parameters Relevant to the Editing Phase
Table 6: Parameters Relevant to the Compositing Phase
EXECUTIVE SUMMARY

The purpose of this document is to provide, at an early stage of the project, a plan for user acceptance validation in 3D VIVANT. Acceptance tests and studies will mainly be carried out in Task 7.3 – “User Perception and Usability Testing”, which starts in project month 29 and continues until month 34.

The project will create new technology for the production and consumption of 3D holoscopic video. It will therefore be important to test the acceptance of the new production technology and its implications for existing production workflows and processes. Consumers will be offered a new way to view, hear and interact with 3D content. All of these aspects need to be tested. In order to highlight the advantages of the technology being developed and to validate it, it is important to consider the shortcomings of existing 3D approaches and solutions. The user tests, both professional and end-user tests, will consider these in addition to other relevant acceptance factors.

Following numerous internal tests of objective measures conducted by the developing partners, there will be extensive validation of 3D VIVANT’s outcomes involving professional testers from outside the project teams and, of course, potential future end-users. As the core objective of these tests is to validate user acceptance, the majority of the tests will be based on qualitative assessment methods, as opposed to the project-internal tests, which will primarily involve objective testing methods.

As the developments in the project will be many and complex, a number of tests will need to be carried out in various locations, each with a specific purpose and involving the main partners responsible for the development in question.
DOCUMENT STRUCTURE

The introduction (Section 1) briefly summarises the objectives and the adopted approach for conducting user validation tests for the developments in 3D VIVANT, explaining the starting position of the validation tests in question, such as the differences between professional and end-user validation.

Section 2 provides an overview of the assessment methods and explains the choice of methods, both in general and with respect to the assessment of 3D video (Section 2.2), 3D audio (Section 2.3) and usability testing (Section 2.4) in particular.

The following sections give details on the assessment of acceptance factors on the side of professional content production (Section 3) and on that of the end-users (Section 4). The latter includes audiovisual perception (Section 4.1.1 – Viewing, and Section 4.1.2 – Hearing) and the interactive experience (Section 4.2). For each of these, the selected data collection methods are described in the respective sub-sections.

Section 5 provides an overview of the organisation of tests, including a rough time plan and details on the recruitment of testers.

Finally, Section 6 concludes the document and Section 7 provides a list of the references cited in this document, while Annexes 1 and 2 give some extra background information on 3D video production.
1 INTRODUCTION

The main target of 3D VIVANT is to create a new way of producing and enjoying 3D video and the associated audio content. 3D VIVANT will develop the tools which are necessary for capturing, processing and viewing 3D holoscopic content, and targets both the professional and consumer environments. This means that the 3D VIVANT system will be used by various groups of people, involved in different stages of the 3D holoscopic content exploitation chain and having different expectations of it.

3D VIVANT has to validate that it meets the expectations of all of these users. However, since expectations vary among different user groups, an effective way of validating their acceptance requires a customised approach for each group. Therefore, for a thorough validation of user acceptance, two major groups of users are distinguished: 1) the professional users, and 2) the end-users.

Furthermore, there is an important difference between testing perceived audiovisual quality and testing interactive features, including usability, usefulness, etc. This variety of test participants and of aspects to be tested requires a variety of methods for testing and validation. While some features and functions will require primarily qualitative methods, others may involve some quantitative measuring as part of the evaluation setup. The chosen methods are described in Section 2.

In WP7 of the project, assessment sessions are envisaged for evaluating the user acceptance of the various functionalities and characteristics – most of them new – of the 3D holoscopic system. The results of the evaluations, from the points of view of both end-users and professional users, will mainly be used by the developers to improve the capabilities of the system. They will also inform any future development outside and after the end of the project.

As some of 3D VIVANT’s features will create very new experiences, the test results will provide important feedback for further improving and refining the technology and for optimal marketing results. The aim of this document is to define the criteria and metrics for the aforementioned user acceptance tests and to suggest a testing plan. These metrics will be the basis for Task 7.3 – “User Perception and Usability Testing”, due to start near the end of the project, in month 29 (July 2012).

The information gathered from the tests needs to concentrate on the functionalities and experiences that are exclusive to the project developments, i.e., 3D holoscopic technology and the hyperlinking environment. The approach adopted aims at capitalising on the knowledge and expertise of the project partners with regard to 3D production and consumption as well as to interactive software, and also at reviewing existing research on the topic. From this, it was possible to list the shortcomings and acceptability issues relating to 3D. The tests should help to establish whether 3D VIVANT deals with these shortcomings and provides acceptable solutions.

Concerning the choice of methods, all tests will take into account varying requirements with regard to target users as well as the features and facilities to be tested. For detailed descriptions see Sections 2-4.

1.1 TARGET USERS

The development of 3D holoscopic video requires the development of various production facilities, as these are not available as of today. Hence, 3D VIVANT’s results will require tests on shooting, editing, and, of course, viewing newly created content. This means that the tests will involve professional users as well as end-users.

Tests involving professional users will differ from end-user tests in various ways. While professional users will provide much more detailed input and even concrete, constructive ideas, e.g., concerning
the organisation of production processes, end-user tests are more focused on subjective acceptance factors. While professional users are expected to be much more critical, for instance concerning the quality of 3D video in terms of exact colours, etc., as well as with respect to production-side factors, end-users are expected to speak their minds concerning the subjective appearance and the perceived usefulness or advantages in comparison to what they have known so far.

These and other differences in background, perception and expectations will influence the choice of methods applied to evaluate their opinions and feedback.

1.2 OVERVIEW OF FUNCTIONS AND FEATURES TO BE TESTED

The validation of user acceptance of 3D VIVANT’s project results will cover all components and prototypes developed in the course of the project. While some may be more in the focus of the user acceptance tests, all of them will be involved and play an important role in the results of the assessment activities.

The assessment of the production process and its required functions and features will include preview, transport issues, storage, depth of field, and many other aspects of TV and video production.

Tests on post-production issues will involve the performance of the codec, the usability of editing software and metadata editing tools, and numerous other features.

Object search and retrieval will play a prominent role in both professional and end-user tests, as it is a feature of the hyperlinking environment in production as well as in consumption by end-users.

Validating the users’ acceptance of 3D holoscopic video will be important with respect to end-users as well as to professional users, although their interests, and thus their feedback, may be very different. The tests on the visual perception and the acceptance of 3D holoscopic video will be the most important outcome of 3D VIVANT’s User Acceptance Validation tests.

There will also be dedicated tests on 3D audio, which will likewise involve both professional users and end-users, although professional users will be involved in validating the production methods rather than the auditory perception of 3D audio.
2 METHODOLOGY

There are major differences between testing functions and features involving interactivity and assessing the acceptance of perceived quality (using vs. viewing). While, for instance, usability testing examines whether users understand what they are expected to do and how easily they can find out and actually do it, validating the acceptance of a novel way of perceiving 3D video and audio requires very different research questions. These differences will influence the choice of methods to a large extent.

The following section describes the objectives and the adopted approach for conducting user validation tests for the developments in 3D VIVANT.

2.1 QUALITATIVE VS. QUANTITATIVE ASSESSMENT METHODS

As the main objective of the user validation activities in 3D VIVANT is to investigate whether the results of 3D VIVANT, from holoscopic 3D to novel opportunities to view and interact with video content, will be accepted by potential users as an added value to their previous experience, many subjective aspects will have to be taken into account. In order to explore the limitations of 3D VIVANT’s features and functions, the chosen evaluation methods will have to allow maximum openness and flexibility.

For both interactivity and viewing characteristics there is broad agreement that they cannot be sufficiently validated with objective measures alone. While performance, colour display and various other aspects may be measured according to objective criteria, subjective qualities like acceptance are best evaluated with qualitative methods.
All functions and features to be validated by project externals will be pilot tested by the partners involved in their development before they are subjected to tests involving external users. These internal tests will include several objective tests, which will be dictated by the specifications determined in the respective work packages. User acceptance, on the other hand, will be validated primarily through qualitative methods.

While quantitative evaluation methods require large samples of testers in order to create statistics, derive objectivity from subjective statements, and confirm or falsify the researchers’ expectations, qualitative methods are a source of recommendations and inspiration. Focus group discussions, open interviews and the thinking-aloud method, for instance, may produce answers that were not planned, let alone prepared, by the researchers leading the evaluation. This means, at the same time, that test results are less predictable, as testers can come up with answers to questions that were not asked or not even thought of.

This relative openness, however, does not contradict the will and the possibility to achieve generalisation (Seale 1999) and transferability (Lincoln and Guba 1985) of the test results. Approved quality criteria ensure that both the choice of instrument and the evaluation of the test results will lead to an understanding of positive and negative potentials and give valuable input as to how to avoid negative and achieve positive results (De Abreu et al. 2006).
In terms of the data to be obtained from users, these typically fall into two categories:

• Process data – Observations about what the users were doing and thinking as they proceeded through the tests.
• Summary data – ‘Bottom-line’ information like the time taken to finish a task, error rate, etc.

The relevant data collection methods are:

• Questionnaires – Comprising both closed and open-ended questions. Questionnaires in this
context are useful for collecting summary data, such as user data (e.g., age, background), user satisfaction, ease of use of specific tasks, performance of the system, etc. This can be collected fairly quickly and processed and analysed at a later stage. Questionnaires will be given to the users to complete during tests or after they have used the tools in question.
• Interviews – Semi-structured interviews (prepared and spontaneous questions) involving a selected user sample and one or two interviewers. The main aim here is to gain a deeper insight into the users’ perception of the system. Interviews are very useful in this context, as they facilitate confidentiality and allow users to express their views on the system. This is very useful from the interviewer’s perspective too, as they can further probe topics of interest with the user and obtain valuable data that is very difficult to gain otherwise (e.g., via questionnaires). Interviews are also useful for capturing so-called user stories of the testers’ experiences, and they provide evaluators with process data.
• Focus groups – Composed of a small number of users and a moderator. These are particularly useful for assessing user perception and form one of the main ways of getting feedback from professional users. They help to establish a feeling of group membership and provide a unique opportunity for users to express their views in an informal but informative manner. It is often the case in user groups that one response from a user triggers another question or response from other users or the moderator, creating an iterative type of group interview or discussion. Focus groups provide evaluators with information on what users thought during tests and where the system specifically did not match their expectations (process data). Inspiring and constructive remarks, especially from professional users, on how to improve the situation under assessment can be another very important outcome of a focus group discussion.
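To make the distinction between the two data categories concrete, the ‘bottom-line’ summary data mentioned above could be tabulated and aggregated with a few lines of scripting. The following sketch is purely illustrative: the field names and values are invented for this example and are not part of the project’s tooling.

```python
from statistics import mean

# Hypothetical summary data recorded per tester in a usability session:
# task completion time in seconds and number of errors made.
sessions = [
    {"tester": "P1", "task_time_s": 42.0, "errors": 1},
    {"tester": "P2", "task_time_s": 55.5, "errors": 3},
    {"tester": "P3", "task_time_s": 38.0, "errors": 0},
]

# Aggregate the 'bottom-line' figures across testers.
avg_time = mean(s["task_time_s"] for s in sessions)
error_rate = mean(s["errors"] for s in sessions)
print(f"mean completion time: {avg_time:.1f} s, mean errors: {error_rate:.2f}")
```

Process data, by contrast, would be free-form observations and quotes, which resist this kind of tabulation and are instead analysed qualitatively.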
The following sections explain in detail which methods will be used for which tests and why.
2.2 QUALITY ASSESSMENT ON 3D VIDEO
While there are standardised objective methods for quality assessment of 2D video coding, display, etc., such as Perceptual Evaluation of Video Quality (PEVQ) and Peak Signal-to-Noise Ratio (PSNR), these metrics cannot be easily transferred to 3D (Veit 2011). Although ITU-R has initiated discussions on ‘Digital three-dimensional (3D) TV broadcasting’ (ITU-R Question 128/6 (2008)), there are as yet no agreed standards for testing the perceived quality of 3D video. Therefore, 3D VIVANT's user validation tests will be inspired by standards on traditional validation scenarios for the Quality of Experience of 2D video and/or TV as well as by tests performed in the course of recent research activities and currently proposed standards.
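As an illustration of the kind of objective 2D metric mentioned above, PSNR compares a distorted frame against a reference frame. The following is a minimal sketch; the toy 4×4 frames and the 8-bit peak value of 255 are assumptions chosen purely for illustration:

```python
import numpy as np

def psnr(reference: np.ndarray, test: np.ndarray, peak: float = 255.0) -> float:
    """Peak Signal-to-Noise Ratio in dB between two same-sized frames."""
    mse = np.mean((reference.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical frames: infinite PSNR
    return 10.0 * np.log10(peak ** 2 / mse)

# Toy 8-bit frames: a flat grey frame and a copy with one pixel off by 10.
ref = np.full((4, 4), 128, dtype=np.uint8)
dist = ref.copy()
dist[0, 0] = 138
print(round(psnr(ref, dist), 2))  # → 40.17
```

Such a pixel-difference metric is exactly what does not generalise to 3D: it says nothing about depth perception, crosstalk or viewing comfort.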
ITU-R Recommendation BT.500, a standard which describes various methods of showing test footage and recording testers' opinions, will be a good starting point. The follow-up ITU-R BT.1438 took a first step towards standardising subjective tests of 3D video but leaves many questions unanswered. The viewing conditions during the envisaged tests will be largely based on these recommendations and on Chen's catalogue of subjective video quality assessment methodologies for 3DTV (Chen 2010), but may have to be adapted to 3D VIVANT-specific aspects in some details. In addition, methods of quality assessment of 3D displays, 3D content, and 3D devices based on human factors are being discussed in IEEE P3333, the draft Standard for the Quality Assessment of Three-Dimensional Displays, 3D Contents and 3D Devices Based on Human Factors 1, and may be considered in order to optimise the validation settings, if available at the time.
Validating user acceptance will also have to involve comparison tests with other available technologies.
1 For more details check http://standards.ieee.org/develop/project/3333.html
These will set certain standards in terms of what the user expects with respect to e.g., colour, resolution, brightness, contrast, etc., which the project results will have to meet. Although this may sound like a challenge, comparison with other technologies may also show advantages, such as independence from the need to wear special glasses, etc. The following two sub-sections will outline features of 3D technologies suitable for comparison tests.
Nevertheless, since 3D holoscopic technology is in its infancy, the tests envisaged in the project should be considered as useful guidelines for the future development of this technology rather than as competitive comparisons aimed at persuading users to give up any other existing 3D visual technology.
Consequently, the methods and approaches reported in the aforementioned publications will be applied as required in order to obtain valid results, but it is assumed that they may have to be relaxed when deemed necessary. In this case the corresponding WP7 deliverables will mention the actual modifications with respect to the standardised rules.
User Acceptance Tests on 3D holoscopic video content will primarily be conducted with questionnaires comprised of closed questions to explore limits of quality acceptance and open questions to allow for individual input. For a detailed description of the tests, see Section 4.1.
2.2.1 Stereoscopic 3D
Most of the currently available 3D technologies are based on stereoscopic images. In this approach, one image is produced for each eye and delivered to the appropriate eye during playback. The separation of the images is usually done using special 3D glasses; home users typically use active-shutter or polarised glasses.
For the production of live-action stereoscopic content, stereo setups of two cameras in mirror or side-by-side rigs are used. This requires a high degree of accuracy when setting up the cameras, since they have to be accurately aligned and run in exact frame synchronism. In addition, to achieve good results, the properties of the lenses also have to match perfectly.
One of the main disadvantages of the stereoscopic method is the need to wear some kind of special 3D glasses. Moreover, in stereoscopic viewing, the eyes' convergence and accommodation do not work in unison, in contrast to normal human viewing. This leads to eye fatigue and in some cases even to headaches. Another shortcoming of stereoscopic 3D is the dependency on the viewing angle. Since only two different perspectives are present, a sideways movement of the observer results in a sideways shear of the stereoscopic image.
2.2.2 Multiview Auto-stereoscopic 3D
Auto-stereoscopic approaches are generally based on the same principles as stereoscopic 3D, but in contrast to classical stereoscopic 3D, the multiview auto-stereoscopic case uses a larger number of perspectives (views). To view the content, mostly LCD-based displays with lenticular lenses are used. They distribute the multiple views horizontally across the entire field of view of the display. Depending on the position, every viewer sees exactly one pair of these views, a stereoscopic image pair. The main advantages of auto-stereoscopic displays are that the viewer does not need glasses and that a larger area of parallax can be shown. With a larger number of different views, the effect of shear distortion can also be reduced. The main disadvantage, on the other hand, is that the more views a display offers, the more the resolution is reduced in comparison with 2D or stereoscopic 3D systems.
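The resolution trade-off just described can be illustrated with a back-of-the-envelope calculation. The Full-HD panel size and the simple equal-split assumption below are illustrative only; real lenticular optics divide resolution between horizontal and vertical directions in more involved ways:

```python
# Illustrative only: a hypothetical 1920x1080 panel whose pixels are
# shared evenly among N views (real lenticular designs differ in detail).
def pixels_per_view(panel_w: int, panel_h: int, n_views: int) -> int:
    """Pixels available to each view under a naive equal split."""
    return (panel_w * panel_h) // n_views

for n in (2, 5, 9):
    print(n, "views:", pixels_per_view(1920, 1080, n), "pixels/view")
```

With nine views, each view retains roughly a ninth of the panel's two million pixels, which is why multiview displays look noticeably softer than the same panel driven in 2D.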
Unless intermediate view interpolation or extrapolation is used, the production of content for auto-stereoscopic displays in general requires as many cameras as there are views offered, normally at least five to nine. This significantly increases the effort and expenditure,
especially for live events, because, as is the case for stereoscopic 3D, here the cameras must also be exactly aligned and synchronised.
Even though auto-stereoscopic displays do not offer a large parallax area, the viewer appears to be able to look around objects. However, the number of viewing positions and the resulting stereo pairs is limited. In this case, the shear distortion effect is reduced but still detectable in each stereo pair. Nevertheless, if the viewer moves around in front of the display, jumps in the picture can be detected when the viewing angle changes. This phenomenon is called flipping. A further disadvantage of the auto-stereoscopic process is the missing vertical parallax. As the views can only be separated horizontally, no vertical parallaxes can be displayed. The result is that a viewer can appear to look left and right around an object but not over or under it.
More recently, a combination of conventional 2D video capture with depth map generation has been used for the capture and processing of multiview auto-stereoscopic 3D content. However, the display of multiview auto-stereoscopic 3D content relies upon the brain fusing the two disparate images to create the 3D sensation. A particularly contentious aspect for entertainment applications is the human factors issue. For example, in stereoscopy, the viewer needs to focus on the screen plane while simultaneously converging the eyes to different locations in space, producing unnatural viewing (Yamazaki et al., Lambooij et al. 1989). This can cause eye-strain and headaches in some people. Consequently, content producers limit the depth of the scene to be viewed to minimise this problem. With recent advances in digital technology, some human factors which result in eye fatigue, such as limits in head movement in the case of circular/linear polarised glasses systems, etc., have been eliminated. However, some intrinsic eye fatigue factors, like the mismatch between convergence and focus, will always exist in stereoscopic 3D technology (Onural et al., Benton, Honda 2006). Furthermore, due to the lack of perspective continuity in 2D view systems, objects in the scene often lack solidity (cardboarding) and give rise to an ‘unreal’ experience.
For the 3D video quality assessment of stereoscopic videos, most researchers employ subjective testing (De Silva et al., Hewage et al., Leon et al. 2010), focusing mainly on the depth perceived by the users on auto-stereoscopic displays and the sensitivity of the observers to changes in depth in a 3D video scene (De Silva 2010). Most of the 3D video user perception studies relate to the design and evaluation of 3D stereoscopic and multiview video systems based on different coding parameters. In the majority of these studies, subjective testing using a qualitative methodology with no more than 15 users has been employed (Kalva et al. 2006; Saygili et al. 2009; Knorr et al. 2008; Olsson and Sjostrom 2010). In some of these research studies, different types of video systems were compared by the users in terms of various system parameters and user perception of stereoscopic versus multiview video (Knorr et al. 2008; Olsson and Sjostrom 2010). Most of the results suggest that, for stereoscopic and multiview video, the bit rate and the content of the original 3D image are the factors that most significantly affect the perceived 3D image quality (Kalva et al. 2006; Knorr et al. 2008; Olsson and Sjostrom 2010; Reis et al. 2007). In terms of multiview 3D video, it is also noted that users prefer less apparent depth and motion parallax when exposed to compressed 3D images on an auto-stereoscopic multiview display (Olsson and Sjostrom 2010). Furthermore, it was found that motion and the complexity of the depth image have a strong influence on the acceptable depth quality in 3D videos (Leon et al. 2008).
2.2.3 Data Collection Methods
The following data collection methodologies will be employed to gather information on the users' perceived quality of the project's video content:
• Questionnaires – Comprising both closed and open-ended questions.
• Interviews – Semi-structured (prepared and spontaneous questions) interviews conducted with a selected user sample by one or two interviewers.
2.3 QUALITY ASSESSMENT ON 3D AUDIO
For studying the perceived audio quality of reproduction systems, processing algorithms and content generation, listening tests are the current state of the art.
For conducting listening tests, there are several standardised and established methods. Most of them are used for validating the quality level of coding systems. There are no up-to-date standardised methods for 3D audio assessment, but the known and established principles can easily be transferred to assess certain attributes of 3D audio.
All methods offer the test subjects one or more test signals, so-called stimuli, which they have to rate on the basis of predefined attributes. Multiple stimuli, which have to be rated simultaneously, form a so-called trial. The different trials of a listening test should be presented back-to-back, and the order of the trials has to be random and unknown to the subjects (ITU-R BS.1284-1).
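The requirement that trial order be random and unknown to the subject can be sketched as follows; the trial labels and the per-subject seed are invented for illustration:

```python
import random

def randomised_run(trials, seed=None):
    """Return a presentation order for the listening-test trials.

    A fresh (or per-subject seeded) Random instance shuffles a copy,
    so the original trial list and other subjects' orders are untouched.
    """
    order = list(trials)
    random.Random(seed).shuffle(order)
    return order

trials = ["trial_A", "trial_B", "trial_C", "trial_D"]
print(randomised_run(trials, seed=1))
```

Seeding per subject makes each run reproducible for later analysis while still keeping the order unpredictable to the listener.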
Participating test persons are grouped into expert listeners and normal listeners. The assigned category depends on the subject's experience with listening tests and the methods used. If the differences between the stimuli under test are small and difficult to detect, the listening test should be conducted with expert listeners. Tests with normal listeners may be used for the evaluation of a new audio reproduction system's overall perceived quality. For the evaluation it is necessary to separate the ratings of expert and normal listeners, so that conclusions drawn from the results can be related to the listening experience of the test persons. In principle the number of test persons should be as large as possible, but in practice 12 to 30 test subjects are considered sufficient.
Listening tests can be conducted in three different environments:
• Anechoic chamber: Advantageous for tests without any influence of the room.
• Headphone listening: Avoids most room influences and allows the detection of subtle differences.
• Room with predefined conditions according to ITU-R BS.1116-1: Volume, reverberation time, etc. are defined to prevent possible unwanted room influences. These parameters also guarantee the reproducibility of the listening tests.
In the preparation of a listening test it is necessary to define the attributes which are to be rated. These can be, for example, the sound colour or the spatial impression. A list of possible attributes can be found in ITU-R BS.1284-1, but using one's own attributes is also allowed as long as they are explicitly and clearly defined.
2.3.1 Methods for the Assessment of Sound Colour
Every method for conducting listening tests has advantages and disadvantages and has to be selected on the basis of the criteria to be evaluated, the number of stimuli and the expected perceptibility of differences. Typical methods are:
• “ABX”: Two known stimuli “A” and “B” and a further stimulus “X”, which is identical to either “A” or “B”, are offered to the test person. The test subject has to allocate “X” to the corresponding stimulus. The different stimuli can be listened to as often as wanted. This method is well suited to detecting subtle differences. The number of correct and wrong allocations is a measure of the perceptibility of the differences, but does not provide any quantitative information.
• “Double-blind triple-stimulus with hidden reference” according to ITU-R BS.1116-1: Three stimuli “A”, “B” and “C” are presented to the test subject. “A” is always the known
reference. “B” and “C” are the stimulus under test and a further hidden reference, in random order. The test subject has to rate the stimuli “B” and “C” in comparison to the reference “A”. The hidden reference should, of course, be identified. The stimuli can be listened to as often as wanted. The issued ratings and the recognition rate of the hidden reference give information about small differences or degradations of the tested signal.
• “Multiple stimulus test with hidden reference and anchor” (MUSHRA) according to ITU-R BS.1534-1: The test person is offered several stimuli and one known reference. Among the stimuli are a hidden reference and a “low anchor”. The test subject has to rate the individual stimuli in comparison to the reference with regard to a defined attribute using the “quality scale” (ITU-R BS.1284-1). The hidden reference has to be identified and rated accordingly positively, and the “low anchor”, which is an artificially degraded signal, should also be identified and rated negatively. Sometimes MUSHRA is used without the “low anchor” and/or the explicit reference, as it may be hard to define such signals under certain circumstances.
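The ABX outcome described above is a count of correct allocations, which is commonly checked against chance (guessing has p = 0.5 per trial). A sketch of the exact one-sided binomial test follows; the example numbers (14 correct out of 16 trials) are invented:

```python
from math import comb

def abx_p_value(correct: int, trials: int) -> float:
    """One-sided exact binomial p-value: probability of getting at least
    `correct` right answers out of `trials` if the subject were guessing
    (success probability 0.5 per trial)."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# Invented example: 14 correct allocations out of 16 ABX trials.
print(round(abx_p_value(14, 16), 4))  # → 0.0021
```

A small p-value indicates the listener reliably heard a difference, matching the text's point that ABX establishes perceptibility without quantifying its magnitude.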
2.3.2 Methods for the Assessment of Localisation Quality
Especially for spatial audio, the localisation accuracy of an audio reproduction or generation system is very important. In practice, two assessment methods are established (Farag 2003):
• Pointing method: The test person is offered a stimulus and has to indicate the perceived position with a laser on a scale or with a pencil on a map. This method has the advantage that it is easy to conduct, but the disadvantage of possible further inaccuracies introduced by the test person. This applies especially to positions behind or above the test subject, i.e. positions outside the test person's visual field.
• Acoustic pointer method: The test subject controls a sound source (e.g. a loudspeaker array with only one active speaker) and has to “point” at the perceived position by positioning the sound source. The test person can switch between the offered stimulus and the positioned sound source as often as wanted.
For both variants, the correlation between the real and the perceived position is evaluated. Instead of positions, often only the perceived direction, i.e. the angle of incidence, is evaluated.
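Comparing real and perceived directions requires care at the 0°/360° boundary. A minimal sketch follows; the azimuth readings are invented, and angles are wrapped so that, e.g., 350° and 5° count as 15° apart:

```python
def angular_error(real_deg: float, perceived_deg: float) -> float:
    """Signed difference perceived - real, wrapped into (-180, 180] degrees."""
    diff = (perceived_deg - real_deg) % 360.0
    return diff - 360.0 if diff > 180.0 else diff

# Invented example readings: (real source azimuth, pointed azimuth).
readings = [(0.0, 4.0), (90.0, 83.0), (350.0, 5.0)]
errors = [angular_error(r, p) for r, p in readings]
mean_abs = sum(abs(e) for e in errors) / len(errors)
print(errors, round(mean_abs, 1))  # → [4.0, -7.0, 15.0] 8.7
```

Signed errors additionally reveal systematic bias (e.g. a consistent pull towards the median plane), which a plain mean absolute error would hide.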
2.4 QUALITY ASSESSMENT ON INTERACTIVE SOFTWARE
This section provides an overview of the methods that will be employed for testing the user acceptance of interactive features.
There are numerous methods to evaluate the usability and usefulness of Information Technology. As 3D VIVANT's development outcomes are primarily combinatory innovations, models which validate usability and usefulness in comparison with existing technologies are applicable only to a limited extent. While the Motivational Model (MM) focuses on predicting the users' interest in using the features in question, and Innovation Diffusion Theory (IDT, after Rogers 1985) or the Model of PC Utilization (MPCU, after Thompson et al. 1994) focus on evaluating usability in terms of improvements of previous experience and (especially) working situations, 3D VIVANT's tests will focus on the Technology Acceptance Model (TAM, after Davis 1989), as it addresses two aspects of major interest in 3D VIVANT: 1) Perceived Usefulness (PU), and 2) Perceived Ease-of-Use (PEOU). Whereas Perceived Usefulness is the degree to which a person believes that using a particular system would enhance or improve his or her situation, Perceived Ease of Use measures “the degree to which a person believes that using a particular system would be free of effort” (Davis 1989).
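In practice, TAM questionnaires commonly score PU and PEOU as mean values over several Likert-scale items. A minimal sketch follows; the item counts, wordings and responses are invented for illustration and are not taken from the project's actual questionnaires:

```python
# Hypothetical 7-point Likert responses (1 = strongly disagree,
# 7 = strongly agree) from one respondent; items are invented examples.
responses = {
    "PU":   [6, 5, 7, 6],   # e.g. "Using the system would improve my work"
    "PEOU": [4, 5, 3, 4],   # e.g. "I would find the system easy to use"
}

def construct_score(items):
    """Mean item score for one TAM construct."""
    return sum(items) / len(items)

scores = {name: construct_score(items) for name, items in responses.items()}
print(scores)  # → {'PU': 6.0, 'PEOU': 4.0}
```

Averaging per construct keeps PU and PEOU separable, so the analysis can report, for instance, a system perceived as useful but not yet easy to use.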
The most comprehensive source on how to actually conduct usability tests, which describes in detail the way the partners have organised usability tests in recent years, seems to be Rubin and Chisnell's “Handbook of Usability Testing” (2008). The envisaged validation tests will investigate differences in perceived usefulness and usability with respect to gender, age and prior experience with IT, in order to cater for varying expectations in a large (potential) audience such as that of the broadcasting partners in the project.
2.4.1 User Acceptance and Usability
The key aim of user acceptance or usability testing is to evaluate the efficiency and effectiveness of the system. As the user interacts with a system via some form of user interface (typically graphical), usability testing is closely associated with the study of interaction between users and computers.
The user acceptance of the proposed use cases (the Broadcast TV as well as the Online Hyperlinking use cases) will be tested according to the following four criteria:
• Usability – Whether the system can be operated with ease and efficiency.
• Usefulness – Whether the system allows the user to complete the relevant tasks successfully.
• Intuitiveness – Whether the system and its interface components (navigation, etc.) can be easily recognised.
• Learnability – Whether the system can be mastered quickly, minimising the user's learning curve.
User acceptance testing will involve a mixed qualitative methodology.
User acceptance testing refers to the testing and evaluation of the developed use cases, namely the Broadcast TV and Online Hyperlinking use cases, as well as the 3D graphical user interface (3D GUI) that will be developed as part of these use cases. The testing will involve both professional users and end-users.
2.4.2 Data Collection Methods
The following data collection methods will be utilised to gather user feedback. The individual sections on the technologies to be tested (Section 3 and Section 4) will name applicable methods more precisely.
• Questionnaires – Comprising both closed and open-ended questions.
• Interviews – Semi-structured interviews conducted with a selected user sample by one or two interviewers.
• Task-centred testing – Users fulfil prepared tasks without help from experts. Their methods and their success in fulfilling these tasks are observed and documented.
• Thinking aloud method – Users tell the tester what they are thinking while fulfilling tasks, to give designers and developers an idea of what users would have expected.
The user acceptance methodology will be employed for the testing of both components of the interactive experience, as discussed and detailed in Section 4.2.1 for the hyperlinking and in Section 4.2.2 for the search and retrieval framework (in both cases particularly questionnaires, task-centred testing, the thinking aloud method and observations).
Focus groups and interviews will be employed primarily for the production-side testing as described
in Section 3 of this document.
The testing of the broadcasting experience as discussed in Section 4.1 will focus on viewing and listening tests. The results will be evaluated both through questionnaires and through records of testers' spontaneous impressions (thinking aloud method).
3 PRODUCTION-SIDE ACCEPTANCE FACTORS
The 3D VIVANT concept can have far-reaching implications for the production of content. The new camera sets will have to interface with existing technology and may require further technical developments. Production processes and workflows will also be affected by the technical developments and the opportunities they offer.
• The editorial department may have to (re)write scripts to take full advantage <strong>of</strong> <strong>the</strong> holoscopic<br />
imagery.<br />
• The camera department will have to find out how to work creatively with <strong>the</strong> holoscopic image;<br />
especially <strong>the</strong> technical aspects <strong>of</strong> <strong>the</strong> camera’s setup will be important for <strong>the</strong>m. They have to<br />
work with <strong>the</strong> holoscopic camera <strong>and</strong>, <strong>the</strong>refore, <strong>the</strong>y will have precise ideas <strong>of</strong> how a basic setup<br />
<strong>of</strong> <strong>the</strong> camera has to work. There should be <strong>the</strong> possibility to monitor <strong>the</strong> 3D holoscopic images in<br />
terms <strong>of</strong> creative decisions <strong>and</strong> in terms <strong>of</strong> technical ones. For <strong>the</strong> creative part, <strong>the</strong> requirement is<br />
to see a 3D image like it would be recorded, to set <strong>the</strong> depth <strong>and</strong> to position all objects/actors in<br />
front <strong>of</strong> <strong>the</strong> camera <strong>and</strong> to judge lighting, etc. In terms <strong>of</strong> technical decisions, <strong>the</strong> camera<br />
department would need some measurement techniques to judge <strong>the</strong> image with respect to overall<br />
picture quality usually controlled via waveform monitors, vectorscopes, etc. It has to be<br />
investigated if <strong>the</strong>se 2D measurement instruments are still sufficient to judge holoscopic images.<br />
As <strong>the</strong> 3D holoscopic image will be captured in a different form <strong>the</strong> conventional ways, it is<br />
possible that <strong>the</strong>se techniques are no longer adequate <strong>and</strong> <strong>the</strong> measurement techniques have to be<br />
modified or new ones developed.<br />
• The director needs 3D playback to also judge <strong>the</strong> final composed image <strong>and</strong> <strong>the</strong> acting while<br />
recording.<br />
• During postproduction, both creative <strong>and</strong> technical monitoring (holoscopic display <strong>and</strong><br />
waveform/vectorscope) is needed.<br />
Production side validation activities will concentrate on <strong>the</strong> holoscopic camera sets to be developed in<br />
<strong>the</strong> project <strong>and</strong> also on <strong>the</strong> production processes for <strong>the</strong> service scenarios envisaged in <strong>the</strong> use cases.<br />
3.1 HOLOSCOPIC CAMERA SETS
The project aims to ultimately produce a single-tier holoscopic camera. Initially, a two-tier camera will be produced. However, to make 3D content generation accessible to partners, it is also planned to develop add-on optical hardware that can be used with an HDTV camera equipped with the new imaging sensor. As the camera solutions developed will be prototypes rather than finished marketable products, the proposed approach is to conduct a technical comparison of the camera sets produced by the project with other (commercially) available 3D solutions. It will be necessary to consider items related to the calibration of the system, as well as to list what existing technology in the production process is affected by the camera itself and by its output, i.e., the 3D holoscopic images it produces.
Annex 1 contains tables listing the main parameters involved in a TV production, with an indication of the monitoring tool and/or measuring instrument usually adopted in a TV studio or a post-processing environment. Some of the parameters and monitoring or measuring tools should be equally relevant for 3D holoscopic productions; others may have to be modified to suit the needs of 3D holoscopic shooting.
As in 2D shooting, basic parameters such as focus, aperture and focal length have to be adjustable and controlled via adequate monitoring tools. For stereoscopic shooting, additional depth-related parameters, such as screen position in depth, transversal and longitudinal magnification and
horizontal disparity range, have to be adjusted to follow the artistic intention and to technically result in good stereoscopic images without distortions. These adjustments are typically controlled via a monitor capable of showing stereoscopic content. For 3D holoscopic image production, it has to be investigated whether similar or other depth-related parameters are relevant and how these parameters can be controlled and monitored. A 3D holoscopic display for image control will be necessary. For signal parameters such as white level, black pedestal, gamma and component gain, measurement instruments such as waveform monitors and vectorscopes are used in 2D and stereoscopic productions. These instruments will also have to be adapted to suit holoscopic content.
3.2 PRODUCTION PROCESSES
The production of the service scenarios in the use cases, including 3D holoscopic content and hyperlinked video, will have implications for the production process and will to some extent require new or altered tasks, skills and workflows for the whole production team. It is, therefore, planned to conduct interviews and create focus groups with key members of the production teams at RBB and RAI to gather guidelines and recommendations. Annex 2 lists the differences between 2D and 3D stereoscopic production and contains production guidelines for 3D stereoscopic production. This document, provided by ARRI, is based on in-house experience and on an interview with the director of photography (DOP) Christian Rein, who is experienced in shooting 3D stereo. These guidelines could provide the basis for discussions with the production team. The main issues will be:
• Storyboard, film production grammar – How did the depth cues in 3D holoscopy affect the storyboard? Specifically: How much depth should be contained in each scene? What is the interaction between the depth cues? Was there a Lilliputian effect or anything similar? Was window violation an issue? Will an object in front of the screen, which is close to the lateral frame of the picture, be cut off by the picture frame in an unnatural manner?
• Pre-production – Did staging and decorations have to be planned and built more accurately for 3D holoscopic productions?
• Shooting – What additional equipment was needed? Was additional personnel required (e.g., in stereoscopic 3D a stereographer cooperates with the DOP and the director to design the depth perception of the picture)? Did a slower panning speed produce less irritating results? Was the pace of work slower than for 2D production?
• Quality control – How did 3D holoscopic production affect quality control?
• Editing – Did the 3D production require a slower/different pace of cutting as well as other editing effects?
• Green screen, visual effects – How did 3D holoscopic production affect chroma keying and visual effects? Typically, back plates and composites of real and virtual layers are more demanding in 3D. Virtual elements have to be placed at the right position in depth in the scene. Moreover, flat back plates might not be sufficient, e.g., when simulating a deep perspective.
• Additional tasks in post-production – What additional tasks were necessary in post-production?
4 END-USER ACCEPTANCE
The aim of the user validation with end-users is to determine whether they can use and enjoy the 3D holoscopic service scenarios envisaged in the project. These service scenarios, as described in the Use Cases Document (see Deliverable D2.1), have two main components for the end-user: 1) Broadcast TV, and 2) Online Hyperlinking. The validation of the Broadcast TV component will concentrate on the audiovisual aspects, while the validation of the Online Hyperlinking component will additionally address the interaction and scalability aspects of the service scenarios. The validation will address the general acceptability and perception of the service scenarios developed and will also cover specific issues derived from the end-user requirements in Deliverable D2.1.
The focus of the tests is not to validate 3D television (TV) per se, but to concentrate on the benefits and characteristics of 3D holoscopic TV and object-based hyperlinked videos.
4.1 BROADCAST EXPERIENCE
4.1.1 Viewing
Specific test criteria for validation by the end-user are based primarily on the end-user requirements. This will involve real 3D holoscopic images captured with the holoscopic camera, computer-generated 3D holoscopic content, or a combination of both.
To gain broad user acceptance, some basic requirements concerning the picture quality can be defined as follows:
• Spatial resolution – As broadcasters tend to distribute their programmes in HDTV format and in 2D, and the first stereoscopic 3D movies are available on Blu-ray discs using full HD resolution (1920×1080 pixels), users will expect the holoscopic image to deliver similar resolutions. It has to be investigated whether the new viewing experience of the holoscopic image can justify resolutions lower than HD (at least 1280×720).
• Colour – Colour is certainly a very important parameter for reproducing a realistic image of the world, and users will expect pictures that show colours in the most natural way. In order to investigate to what extent colour representation is a camera or a display issue, computer-generated content could be used for comparison.
To gain a better 3D viewing experience than with existing 3D technologies, the user will expect the shortcomings of stereoscopic and auto-stereoscopic technologies to be solved. Basic acceptance factors will be:
• Free viewing – As with 2D displays, it should be possible for multiple persons to view the 3D holoscopic content without the need to wear special glasses and independently of the individual position in front of the screen.
• Continuous parallax without distortions – Since current 3D technologies can only provide horizontal parallax, 3D VIVANT's holoscopic 3D should deliver full horizontal and vertical parallax throughout the viewing zone without flipping or shear distortion.
• Distinguishing between 2D and 3D content – Because some use cases require a combination of 2D and 3D holoscopic content, users should be able to distinguish between 2D and 3D content without optical or mental irritation and be able to watch 2D content on a holoscopic 3D display without distortion.
During the user perception testing of the Broadcast TV viewing part, users will be asked to view the same computer-generated 3D holoscopic content displayed on different display mediums. The
first one will be stereoscopic content generated from the 3D holoscopic content (through the 3D holoscopic software plug-in tool that will be developed in M18 under the scope of Deliverable D3.9), which will be displayed on a commercial 2D monitor with users wearing special 3D glasses. The second one will be 3D holoscopic content displayed on three different types of displays.
User acceptance tests will largely depend on the viewing conditions during the test. In ITU-R Recommendation BT.500, clause 2.1.2, viewing conditions are considered "slightly more critical than the optimal home viewing conditions", and optimal conditions are described. This lighting environment is suggested for evaluating the "quality at the consumer side of the TV chain" and could be adopted as a basis for the tests envisaged in the project, just like the Preferred Viewing Distance (PVD) recommended in ITU-R BT.500.
It should be taken into account that the main purpose of ITU-R BT.500 is to define a standard viewing environment in order to obtain comparable results across different assessment sessions. However, the definition of optimal, or preferred, viewing conditions for a 3D display is still an open issue. Consequently, for the tests to be carried out in the frame of the project, the viewing environment suggested in ITU-R BT.500 will be implemented, whereas the parameters related to the display will be chosen so that the resulting viewing conditions are felt to be optimal by the viewers.
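The PVD is conventionally expressed as a multiple of the picture height H. As a minimal sketch of how a test setup might derive the viewing distance from a display's diagonal, assuming the 3H figure commonly cited for HDTV material (the exact multiple for each display must be taken from the ITU-R BT.500 tables, and all names below are hypothetical):

```python
import math

def preferred_viewing_distance(diagonal_m: float, aspect=(16, 9),
                               heights: float = 3.0) -> float:
    """Viewing distance in metres as `heights` times the picture height H.

    3H is an assumption here for HD-resolution material; consult the
    ITU-R BT.500 PVD tables for the values actually used in the tests.
    """
    w, h = aspect
    height_m = diagonal_m * h / math.hypot(w, h)  # picture height H
    return heights * height_m

# e.g. a 42-inch (1.067 m) 16:9 display
d = preferred_viewing_distance(1.067)
```

For a 42-inch 16:9 panel this places the viewer roughly 1.6 m from the screen; the 3D displays in the project may well call for a different distance, which is precisely the open issue noted above.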
4.1.2 Hearing
Spatial audio generation and playback will also be an integral part of the 3D VIVANT system. The main goal is to provide a consistent spatial audio experience accompanying the holoscopic video content. Thus, at least the same degrees of freedom as are possible with the holoscopic display and content should be provided to one or more end-users. The key objectives are to allow plausible localisation of objects with respect to the visual experience, perception of spatiality and depth, and an overall coherent, immersive listening experience.
For audio generation, a new spherical surface microphone has been designed. Accordingly, an interpolation algorithm has been developed to simulate microphone positions on the spherical surface between the six existing microphones. This algorithm (and the spherical surface microphone itself) have to be assessed through listening tests to determine the quality level of the sound according to the following criteria:
• Sound colouration – A correct interpolation of microphone positions, with as little sound colouration as possible in comparison to a real microphone at the same position, is very significant. This criterion will be tested with the MUSHRA method (#2.4.1), without a low anchor. Three variations of the developed algorithm and two known ones will be compared.
• Quality of localisation – The spherical surface microphone can also be used for binaural hearing. Therefore, the perceived localisation of a sound source has to match the real position as closely as possible. To test the quality of localisation with the spherical surface microphone, a listening test with pointing (#2.4.2) on a scale to indicate the perceived sound sources will be conducted. This test will also evaluate the quality of localisation using microphones interpolated with different algorithms. A Neumann KU-100 dummy head will also be rated for comparison with the spherical surface microphone and the interpolation algorithms.
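The MUSHRA ratings gathered in such a test are typically summarised per stimulus as a mean score with a confidence interval. As a small illustrative sketch (the function name is hypothetical, and a real evaluation per ITU-R BS.1534 would normally use the Student's t distribution rather than the normal approximation used here):

```python
import math
from statistics import mean, stdev

def mushra_summary(scores):
    """Mean and approximate 95 % confidence interval for one stimulus'
    MUSHRA scores (0-100 scale), using the 1.96*s/sqrt(n) normal
    approximation as a simplification."""
    n = len(scores)
    m = mean(scores)
    half = 1.96 * stdev(scores) / math.sqrt(n) if n > 1 else 0.0
    return m, (m - half, m + half)

# e.g. three listeners rating one interpolation variant
m, ci = mushra_summary([80, 90, 100])
```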
Both tests will be conducted with headphone listening and a head-tracker for dynamic tracking of the (binaural) signal. The reference is an artificially created (computer-simulated) spherical surface microphone with 360 microphones on the equatorial plane. Only expert listeners will be consulted because of the small differences between the stimuli. A training session preceding the listening test will familiarise the test persons with the rating procedure (especially for the pointing method) and with the possible artefacts or differences.
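The interpolation algorithm itself is described above only in terms of its goal. A purely illustrative sketch of the idea, assuming a virtual capsule is approximated by crossfading the two nearest real capsules on one ring (all names are hypothetical, and the algorithm actually developed in the project may differ substantially):

```python
import math

def interpolate_capsule(signals, capsule_az, target_az):
    """Approximate the signal of a virtual capsule at azimuth `target_az`
    (radians) by linearly crossfading the two nearest real capsules.

    signals    : list of per-capsule sample lists
    capsule_az : capsule azimuths in radians, all on one ring
    Illustration only -- not the project's actual interpolation.
    """
    def wrap(a):  # wrap an angle difference into [-pi, pi]
        return math.atan2(math.sin(a), math.cos(a))

    dists = [abs(wrap(az - target_az)) for az in capsule_az]
    order = sorted(range(len(dists)), key=dists.__getitem__)
    a, b = order[0], order[1]                # two nearest capsules
    da, db = dists[a], dists[b]
    if da + db == 0.0:                       # degenerate: coincident capsules
        return list(signals[a])
    w = db / (da + db)                       # closer capsule weighted more
    return [w * sa + (1 - w) * sb for sa, sb in zip(signals[a], signals[b])]

# six capsules spaced 60 degrees apart; a virtual capsule halfway between two
az = [math.radians(k) for k in range(0, 360, 60)]
sig = [[1.0 if i == j else 0.0 for j in range(6)] for i in range(6)]
virtual = interpolate_capsule(sig, az, math.radians(30.0))
```

A crossfade of this kind is only the crudest possible baseline; it is exactly the sort of candidate that the colouration and localisation tests above are designed to compare against more sophisticated variants.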
The spatial audio playback system will also be tested using the following criteria:
• Sound colouration – Possible degradations of the perceived audio quality will be tested using a MUSHRA-like test. The stimuli will consist of a virtual sound source created by the playback system, a real sound source located at the same position and another virtual sound source created using stereophonic playback. Speech, noise and other audio signals known to be prone to sound colouration artefacts will be used. As the overall experience of the three stimuli will differ considerably because different playback principles are used, the test subjects will be instructed to consider only sound colouration or other notable artefacts. Therefore, a training session will be needed before the actual listening test to familiarise the test subjects with the scope of the test.
• Quality of localisation – Test subjects will be asked to localise virtual sound sources generated by the playback system. Again, the pointing method (#2.4.2) will be used.
Both criteria will be tested from multiple listening positions to determine the size of the possible listening area of the new playback system.
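The pointing responses from the localisation tests above lend themselves to a simple quantitative evaluation: the angular deviation between the pointed and the true source direction, averaged over trials. A minimal sketch (the deliverable does not define the error measure, so this is only the customary circular-wrapped absolute azimuth error, with hypothetical names):

```python
import math

def angular_error_deg(pointed_deg, true_deg):
    """Absolute azimuth error in degrees, wrapped so that pointing at
    350 deg for a source at 10 deg counts as 20 deg, not 340 deg."""
    d = math.radians(pointed_deg - true_deg)
    return abs(math.degrees(math.atan2(math.sin(d), math.cos(d))))

# mean localisation error over the trials of one test condition
trials = [(350.0, 10.0), (10.0, 10.0), (95.0, 90.0)]  # (pointed, true)
mean_err = sum(angular_error_deg(p, t) for p, t in trials) / len(trials)
```

Comparing such mean errors across listening positions would directly support the stated goal of mapping out the usable listening area.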
4.2 INTERACTIVE EXPERIENCE
There are two interactive features developed in 3D VIVANT, namely a Video Hyperlinking Environment and an Object Search and Retrieval Framework. While the first is to be used primarily by end-users, the latter will only be used by professionals as part of the editing process of the video hyperlinks.
4.2.1 Video Hyperlinking Environment
The object-based online video hyperlinking developed in 3D VIVANT will allow online users to interact with 3D objects in a video by clicking on them. By clicking the hyperlinked objects, users will be able to access additional content. Basic user acceptance factors can be defined as follows:
• Navigation and interaction – Users will expect the navigation through the user interface to be intuitive and usable. Furthermore, the chosen method of navigation and interaction has to be suitable and effective for the display device and the content displayed. A computer mouse, for example, which users are accustomed to on a PC, would probably not be suitable for a 3D environment because it is essentially a 2D input device. In addition to understanding how to interact with the service, users need to understand the navigation architecture of the service, i.e., they need to understand how to access the content they require and how to return to previous content.
• Recognition of clickable objects – To be able to follow hyperlinks, users will have to click on linked objects. Thus, the user interface should provide some kind of highlighting of linked objects, which allows users to recognise linked objects and distinguish them from other objects. At the same time, the highlights should not distract too much from the video itself. The object and its highlighting will also need to be displayed for a sufficient duration to allow users enough time to interact with it.
• Responsiveness – The performance of the hyperlinking system is a critical point. Users typically have high expectations with regard to responsiveness to their commands. They expect an immediate response of some sort and become irritated if this does not happen. It is, therefore, essential that the system reacts quickly enough and provides the user with some sort of instant feedback indicating what the system is doing and that their request is being processed or has been completed.
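The factors above imply that a click must be resolved against objects that are only linked for part of the video's timeline. As a purely illustrative sketch, not the project's design, hit-testing a click against time-dependent object regions could look like this (axis-aligned bounding boxes and all names are assumptions for the example):

```python
from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class LinkedObject:
    """A hyperlinked object, clickable during [t_start, t_end) inside an
    axis-aligned bounding box in screen coordinates. Illustrative only."""
    name: str
    url: str
    t_start: float
    t_end: float
    x: float
    y: float
    w: float
    h: float

    def hit(self, t: float, px: float, py: float) -> bool:
        return (self.t_start <= t < self.t_end
                and self.x <= px < self.x + self.w
                and self.y <= py < self.y + self.h)

def resolve_click(objects: Iterable[LinkedObject], t: float,
                  px: float, py: float) -> Optional[LinkedObject]:
    """Return the first object hit by a click at playback time t, or None.
    The instant feedback discussed above would be triggered right here."""
    for obj in objects:
        if obj.hit(t, px, py):
            return obj
    return None

# a ball linked for the first five seconds of the clip (hypothetical data)
ball = LinkedObject("ball", "http://example.invalid/ball", 0.0, 5.0,
                    10.0, 10.0, 100.0, 50.0)
hit = resolve_click([ball], 1.0, 50.0, 30.0)
```

In the actual 3D holoscopic case the regions would of course be depth-dependent rather than flat rectangles, which is exactly why the suitability of 2D input devices is questioned above.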
• Service acceptance – As the hyperlinking of single moving objects in a video stream is a new video service scenario, the general acceptance of such a service by users has to be validated. The acceptance issues to be considered will include: did the end-users understand the concept of the service, did it fulfil their expectations, could they imagine using it, would they recommend it to their friends, what are the perceived benefits, what did they like/dislike about it, etc.
4.2.2 Search and Retrieval Framework
Another functionality that the 3D VIVANT system will support is the retrieval of objects similar to samples given by users. This functionality is expected to be used by professionals rather than by end-users. Professional users will want to exploit this functionality especially during 3D scene composition, when they want to change the scene and replace certain objects with similar ones. It might be interesting for end-users to use this functionality to browse or retrieve content from 3D holoscopic content databases. However, in the course of the project we can consider this technology, as well as the availability of 3D holoscopic content, as exclusive to professional users.
In 3D VIVANT, validation tests will be conducted only with professional users to consider acceptance and usability issues with respect to functionality, performance and user-friendliness aspects. More precisely, these are as follows:
• Functionality – In this case, testing aims to validate user acceptance with respect to whether the architecture meets user expectations in terms of functionality. It relates to the completeness of the defined and implemented functions offered by the system. It focuses on aspects such as the supported content types (e.g., video, picture, 3D, 2D), search methods and metadata storage procedures, without emphasis on whether these features are fast, reliable, user-friendly, etc.
• Performance – Validation testing will also consider the performance of the different functional modules in terms of retrieval accuracy, processing power requirements, storage capacity requirements and the corresponding response latency. Retrieval accuracy is the core indicator of the performance of any search engine, as it can provide strong evidence about its actual usefulness and effectiveness. Retrieval accuracy reveals the capability of a search engine to discriminate between relevant and irrelevant objects with respect to the query and to present only the most relevant objects. However, it is important that a search engine is not only effective, but efficient as well. Resources are usually limited and, therefore, processing power and storage requirements are critical parameters if a search and retrieval framework is to be applied in practice and exploited by end-users. Finally, a positive decision to accept a search and retrieval framework also depends on the time required to perform a retrieval request. When the user searches small multimedia databases, time is not a critical parameter. However, as multimedia databases tend to grow exponentially, fast processing and system response become more and more significant.
• User-friendliness – Apart from the aforementioned characteristics, which refer mainly to the technical aspects of the search and retrieval framework, the validation tests will also investigate the user-friendliness and usability of the user interface allowing the searching of holoscopic content. These tests will be performed in the context of the general acceptance tests regarding the user interface of the 3D VIVANT system.
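Retrieval accuracy as discussed above is customarily quantified per query with precision and recall. The deliverable does not fix the exact measure, so the following is only the standard definition, sketched with hypothetical object IDs:

```python
def precision_recall(retrieved, relevant):
    """Precision and recall for one query, given sets of object IDs.

    precision = fraction of retrieved objects that are relevant
    recall    = fraction of relevant objects that were retrieved
    """
    retrieved, relevant = set(retrieved), set(relevant)
    hits = len(retrieved & relevant)
    precision = hits / len(retrieved) if retrieved else 0.0
    recall = hits / len(relevant) if relevant else 0.0
    return precision, recall

# a query for chair-like 3D objects (hypothetical ground truth)
p, r = precision_recall({"chair", "table", "lamp"}, {"chair", "sofa"})
# p = 1/3, r = 1/2
```

Averaging these values over a set of test queries, together with measured response latency, would cover the performance criteria listed above.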
5 ORGANISATION OF TESTS<br />
5.1 OVERVIEW<br />
The planned approach for the user tests in 3D VIVANT on the end-user side is to begin with a series of internal tests focusing on particular aspects of the planned developments. These tests will generally be conducted by the partner primarily responsible for the development, but may also involve other partners. Finally, the complete implemented use cases will be tested.
For the professional user tests, RAI will create a broadcast test-bed in Task 5.3 – “3D Holoscopic Broadcasting Use Case”. This will form the basis for the holoscopic camera set tests. In parallel, RBB will involve professional production teams in the production of RBB Kids’ and Youth programmes to demonstrate 3D VIVANT’s features and functions. These production activities (accompanied by questionnaires and focus groups) will form a major input to the validation process. The validation of the production process will furthermore involve the use of the hyperlinking environment and, closely connected, the search and retrieval mechanisms, both of which will be tested by professionals. The evaluation of these tests will involve questionnaires, thinking aloud and focus groups.
Eventually, once the feedback from project-internal testing and from professional user tests has been evaluated and the prototypes have been optimised, end-users will be invited to test their acceptance of the project results as potential new products and services.

The following tables provide an overview of the planned tests, including timing, methodology, location, testers and the content that will be used for the tests.
Table 1: Professional Users Testplan

| Tests | Month | Methodology | Location - Partners | Testers | Test Content |
|---|---|---|---|---|---|
| Professional Users: Production | | | | | |
| Holoscopic camera set | M19 – M25 (Sep 2011 – Mar 2012) | Active use and technical comparison | Potsdam, RBB | RBB and key members of RBB production teams | Holoscopic camera sets |
| Holoscopic camera set | M29 – M31 (July 2012 – September 2012) | Technical comparison | Torino, RAI | RAI | Holoscopic camera sets |
| Production process | M19 – M25 (Sep 2011 – Mar 2012) | Focus groups, interviews | Potsdam, RBB | Key members of RBB production teams | Production of implemented showcase |
| Production process | M29 – M32 (July 2012 – October 2012) | Focus groups, interviews | Potsdam, RBB and Torino, RAI | Key members of production teams | Production of implemented showcase |
Table 2: End-Users Testplan

| Tests | Month | Methodology | Location - Partners | Testers | Test Content |
|---|---|---|---|---|---|
| End-Users: Viewing Tests | | | | | |
| 3D holoscopic content on 2D display with 3D glasses | M25 – M27 (March – May 2012) | Questionnaires, focus groups | London, Brunel University | Sample of end-users | Stereoscopic content extracted from computer-generated 3D holoscopic content |
| 3D holoscopic content on auto-stereoscopic display overlaid with lenticular sheet | M25 – M27 (March – May 2012) | Questionnaires, focus groups | London, Brunel University | Sample of end-users | Computer-generated 3D holoscopic content |
| 3D holoscopic content on holoscopic display overlaid with microlens array | M25 – M27 (March – May 2012) | Questionnaires, focus groups | London, Brunel University | Sample of end-users | Real 3D holoscopic images captured with the holoscopic camera and computer-generated 3D holoscopic content |
| 3D holoscopic content on HoloVizio display | M29 – M31 (July – September 2012) | Questionnaires, focus groups | Budapest, Holografika | Sample of end-users | Real 3D holoscopic images captured with the holoscopic camera and computer-generated 3D holoscopic content |
| End-Users: Audio Tests | | | | | |
| Audio user testing | M29 – M31 (July – September 2012) | Listening tests | Munich, IRT | Representative sample of end-users | 3D audio content |
| End-Users: Interaction Tests | | | | | |
| Hyperlinking | M29 – M31 (July – September 2012) | Questionnaires, task-centred testing, the thinking aloud method and observations | Potsdam, RBB | Sample of end-users | 3D hyperlinked video |
| Search and retrieval | M31 – M33 (September – November 2012) | Questionnaires, task-centred testing, the thinking aloud method and observations | Thessaloniki, CERTH/ITI | Sample of end-users | Search and retrieval interface and database |
Table 2, cont.: End-Users Testplan

| Tests | Month | Methodology | Location - Partners | Testers | Test Content |
|---|---|---|---|---|---|
| End-Users: Complete Implemented Showcase Test | | | | | |
| Broadcast | M31 – M34 (September – November 2012) | Questionnaires, focus groups | Potsdam, RBB | Representative sample of showcase service end-users | Implemented showcase including 3D holoscopic video and 3D audio |
| Hyperlinked 3D video | M31 – M34 (September – November 2012) | Questionnaires, task-centred testing, the thinking aloud method and observations | Potsdam, RBB | Representative sample of showcase service end-users | Implemented showcase including 3D hyperlinked video |
Two possible test scenarios for end-user visual tests, which should be reflected in the above table, are:

i. How does the 3D holoscopic content compare to stereoscopic 3D content generated using conventional stereoscopic technology?

ii. How does the stereoscopic content extracted from the 3D holoscopic content compare to stereo content generated using conventional stereoscopic technology?

The importance of the second point is that, if the stereoscopic 3D content extracted from 3D holoscopic content can be shown to be comparable to stereoscopic content generated using conventional stereoscopic imaging techniques, then the production of 3D stereo becomes easier (i.e., no need for multiple cameras and no calibration problems).
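The extraction behind scenario (ii) can be illustrated on synthetic data. In a unidirectional holoscopic (integral) image, each microlens covers a fixed block of pixels; sampling the same position under every lens yields one viewpoint image, and two symmetric positions yield a stereo pair. The sketch below is a simplified illustration with an arbitrary lens pitch, not the actual algorithm of the Brunel plug-in tools:

```python
import numpy as np

def extract_view(integral_img, pitch, view_index):
    """Pick pixel `view_index` under every microlens along the x axis.

    integral_img: 2D array (H, W) with W a multiple of the lens pitch.
    Returns an (H, W // pitch) viewpoint image.
    """
    h, w = integral_img.shape
    assert w % pitch == 0 and 0 <= view_index < pitch
    return integral_img[:, view_index::pitch]

def extract_stereo_pair(integral_img, pitch):
    """Left/right views from two symmetric positions under each lens."""
    left = extract_view(integral_img, pitch, pitch // 4)
    right = extract_view(integral_img, pitch, 3 * pitch // 4)
    return left, right

# Toy integral image: 4 pixels per lens, 8 lenses, 2 rows.
img = np.arange(64).reshape(2, 32)
left, right = extract_stereo_pair(img, pitch=4)
print(left.shape)   # -> (2, 8): one pixel per microlens in each view
```

If such extracted pairs prove visually comparable to conventionally shot stereo, the single-camera production advantage described above follows directly.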
Using computer-generated content will be the easier option, because the same models generated in 3ds Max® can be rendered using existing plug-in tools for stereoscopic content and the plug-in tools developed by Brunel University for holoscopic content. Hence, the same content is generated and displayed by the two technologies.
For real data, a concept is needed that allows the same content to be captured using the holoscopic camera and a pair of 2D cameras. One way to achieve this is to use a motion control system and program a camera movement when capturing a static scene to simulate motion: the first take is made with the 3D holoscopic camera and the second take with a pair of stereo cameras.
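The repeatability requirement of the motion-control approach can be sketched as follows: the programmed move is a deterministic function of its keyframes, so replaying it yields identical camera positions for both takes. This is a hypothetical illustration, not the interface of any particular motion-control rig:

```python
def interpolate_path(keyframes, steps):
    """Linearly interpolate camera positions between (x, y, z) keyframes.

    Returns the same deterministic list of positions on every call, which
    is what makes the two takes (holoscopic, then stereo pair) comparable.
    """
    path = []
    for a, b in zip(keyframes, keyframes[1:]):
        for i in range(steps):
            t = i / steps
            path.append(tuple(pa + t * (pb - pa) for pa, pb in zip(a, b)))
    path.append(keyframes[-1])
    return path

# Illustrative move past a static scene (positions in metres).
keys = [(0.0, 1.5, 0.0), (2.0, 1.5, 0.5), (2.0, 1.5, 2.0)]
take1 = interpolate_path(keys, steps=4)   # holoscopic camera pass
take2 = interpolate_path(keys, steps=4)   # stereo-pair pass, identical move
assert take1 == take2                     # repeatability is the whole point
```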
5.2 TIME PLAN

The time plan for the tests corresponds to the timing of Task 7.3 – “User Perception and Usability Testing”, which starts in project month 29 and continues until month 34. The exceptions are the viewing tests to be carried out by Brunel University and a small number of earlier tests at RBB focusing on optimising the processes before involving costly professional teams. These tests are scheduled earlier to coincide with the availability of enough test candidates and personnel, and to provide useful input at an earlier stage.

The professional user tests are planned at the start of the task so that the time gap between producing the showcases and developing and using the test-bed is not too great, and the experience is still fresh in the candidates’ minds.

The end-user tests with the individual elements of the system are timed at the beginning of the task to allow feedback and, if necessary, refinement of the testing concept for the final end-user tests with the
implemented system.
5.3 RECRUITING TESTERS

Each partner will recruit a sample of end-users for the tests they are to execute. The final tests, with the implemented showcases, will require a representative sample of end-users, i.e., end-users representative of the target audience of the developed showcase. Factors such as age, sex and previous exposure to 3D content will have to be taken into account when recruiting users. The responsible partners have a catalogue of interested testers from previous evaluation tests, which may be extended and/or adapted according to the needs of the tests.

For the professional user tests, the test candidates will include key production personnel involved in the production of the showcases and in the development of the broadcast test-bed. These will mainly be professionals from RAI and RBB.
6 CONCLUSION

This deliverable has presented the methodology, the user acceptance metrics and the time plan to be employed for the user acceptance validation in 3D VIVANT. Both professional users and end-users (prospective consumers) will be recruited to test and evaluate the various components of the 3D VIVANT system. Qualitative assessment methods will mainly be employed for the user validation and testing of the new audio-visual experience created by the 3D holoscopic video and 3D audio, as well as of the functionality, usability and user experience of the Broadcast TV and Online Hyperlinking use cases.

The outcomes of these tests will be reported in D7.3 “Results of User Perception and Usability Testing”, due in M34.
7 REFERENCES

Benton, S. A.: “Elements of Holographic Video Imaging.” In: Proceedings of SPIE 1600, Lake Forest, Ill., USA, 15-19 July (1991), pp. 82-86.

Chen, W., et al.: “New Requirements of Subjective Video Quality Assessment Methodologies for 3DTV.” In: Proceedings of the Fifth International Workshop on Video Processing and Quality Metrics for Consumer Electronics - VPQM 2010, Scottsdale, Arizona, USA, September 26-29 (2010), pp. 4025-4028.

Creswell, J. W.: “Research Design: Qualitative, Quantitative, and Mixed Methods Approaches.” London (2008).

Davis, F. D.: “Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology.” MIS Quarterly (13:3), (1989), pp. 319-339.

De Abreu, N., Blanckenburg, C., et al.: “Qualitatives Wissensmanagement. Neue wissensbasierte Dienstleistungen im Wissenscoaching und in der Wissensstrukturierung.” Berlin (2006).

De Silva, D.V.S.X., Fernando, W.A.C., Nur, G., Ekmekcioglu, E., Worrall, S.T.: “3D video assessment with Just Noticeable Difference in Depth evaluation.” In: 17th IEEE International Conference on Image Processing (ICIP), 26-29 Sept. 2010, pp. 4013-4016.

Farag, H., et al.: “Psychoacoustic Investigations on Sound-Source Occlusion.” Journal of the Audio Engineering Society, Vol. 51, No. 7/8 (July 2003), pp. 635-646.

Hewage, C.T.E.R., Worrall, S.T., Dogan, S., Villette, S., Kondoz, A.M.: “Quality Evaluation of Color Plus Depth Map-Based Stereoscopic Video.” IEEE Journal of Selected Topics in Signal Processing, Vol. 3, No. 2 (April 2009), pp. 304-318.

Honda, T.: “Holographic display for movie and video.” In: Three Dimensional Image Technology, ITE Japan (1991), pp. 619-622.

Leon, G., Kalva, H., Furht, B.: “3D Video Quality Evaluation with Depth Quality Variations.” In: 3DTV Conference 2008: The True Vision - Capture, Transmission and Display of 3D Video, 28-30 May 2008, pp. 301-304.

Kalva, Christodoulou, et al.: “Design and evaluation of a 3D video system based on H.264 view coding.” In: Proceedings of the 2006 International Workshop on Network and Operating Systems Support for Digital Audio and Video (NOSSDAV ’06), Newport, Rhode Island, November 22-23, 2006. ACM, New York, NY (2006), pp. 1-6.

Knorr, S., Kunter, M., Sikora, T.: “Stereoscopic 3D from 2D video with super-resolution capability.” In: Signal Processing: Image Communication, Volume 23, Issue 9 (October 2008), pp. 665-676.

Lambooij, M.T.M., IJsselsteijn, W. A., Heynderickx, I.: “Visual Discomfort in Stereoscopic Displays: A Review.” In: Proceedings of SPIE-IS&T Electronic Imaging, SPIE Vol. 6490 (2007).

Lincoln, Y. S., Guba, E. G.: “Naturalistic Inquiry.” Beverly Hills, Sage (1985).

Norman, D.: “User Centered System Design: New Perspectives on Human-Computer Interaction.” New York, CRC Press (1986).

Norman, D.: “The Design of Everyday Things.” New York, Doubleday Business (1990).
Olsson, R., Sjostrom, M.: “Multiview image coding scheme transformations: artifact characteristics and effects on perceived 3D quality.” SPIE New Display Technology for Military Applications Journal, Vol. 7524, 75240Z (2010).

Onural, L., et al.: “An Assessment of 3DTV Technologies.” In: Proceedings of the NAB Broadcast Engineering Conference (2006), pp. 456-467.

Reis, G. A., Havig, P. R., et al.: “Color and shape perception on the Perspecta 3D volumetric display.” SPIE New Display Technology for Military Applications Journal, Vol. 6558, 65580I (2007).

Rogers, E.: “Diffusion of Innovations.” New York (1995).

Rubin, J. and Chisnell, D.: “Handbook of Usability Testing: How to Plan, Design and Conduct Effective Testing.” London (2008).

Saygili, G., Gurler, C.G., Tekalp, A. M.: “3D display dependent quality evaluation and rate allocation using scalable video coding.” In: Proceedings of the 16th IEEE International Conference on Image Processing (ICIP) (2009), pp. 717-720.

Seale, C.: “The Quality of Qualitative Research.” London, Sage (1999).

Thompson, R. L., Higgins, C. A., Howell, J. M.: “Influence of Experience on Personal Computer Utilization: Testing a Conceptual Model.” Journal of Management Information Systems, Vol. 11, No. 1 (Summer 1994), pp. 167-187.

Veit, Q.: “Vergleichende Untersuchung und Bewertung von Codierverfahren für dreidimensionales Fernsehen.” Bachelor’s Thesis, University of Applied Sciences Amberg-Weiden (2011).

Wolf, S. and Pinson, M.: “Video Quality Measurement Techniques.” NTIA Report 02-392 (June 2002), available online at www.its.bldrdoc.gov/n3/video/documents.htm.

Yamazaki, T., Kamijo, K., Fukuzumi, S.: “Quantitative evaluation of visual fatigue.” In: Proceedings of Japan Display (1989), pp. 606-609.

IEEE Standards

• P3333: Standard for the Quality Assessment of Three Dimensional (3D) Displays, 3D Contents and 3D Devices based on Human Factors (2010), information online at: http://standards.ieee.org/develop/project/3333.html

ITU-R Recommendations

• ITU-R BT.500: Methodology for the subjective assessment of the quality of television pictures. Latest version 12 (September 2009), available online at: http://www.itu.int/rec/R-REC-BT.500-12-200909-I/en
• ITU-R BS.1116-1: Methods for the subjective assessment of small impairments in audio systems including multichannel sound systems (1994-1997)
• ITU-R BS.1284-1: General methods for the subjective assessment of sound quality (1997-2003)
• ITU-R BT.1438: Subjective assessment of stereoscopic television pictures (2000)
• ITU-R BS.1534-1: Method for the subjective assessment of intermediate quality level of coding systems (2001-2003)
ANNEX 1

A.1.1 INTRODUCTION

This annex addresses the need for calibration and monitoring during the production process of a typical TV product.

A.1.2 CALIBRATIONS AND MONITORING

This section lists the main parameters involved in a TV production, indicating also the monitoring tool and/or measuring instrument usually adopted in a TV studio or a post-processing environment.
Table 3: Shot Parameters

| Parameter | Related to… | Suggested monitor or measuring instrument | Normal use |
|---|---|---|---|
| Parameters relevant to the shooting phase | | | |
| Focus | Focus regulation | 2D monitor, pixel-to-pixel | Setting before shot / continuous control (1) |
| Depth of focus | Focus regulation, iris diameter | 2D monitor, pixel-to-pixel | Setting before shot |
| Viewing angle | Focal length (zoom) | 2D monitor, pixel-to-pixel | Setting before shot / continuous control (1) |
| Screen position in depth | In stereo: convergence distance. In holoscopic: depends on the optical system | 3D monitor | Setting before shot / continuous control (1) (scene definition) |
| Transversal and longitudinal magnification (i.e., volumetric distortion) | In stereo: focal length. In holoscopic: if applicable | 3D monitor | Setting before shot (1) (good representation of the real world) |
| Horizontal disparity range | In stereo: base, convergence angle, focal length. In holoscopic: if applicable | Still an open problem | Setting before shot (1) (to avoid eye fatigue) |
| Signal parameters | | | |
| White level | Iris and overall gain | Waveform monitor and calibration panel (white) | Setting before shot (signal range) |
| Black pedestal | Pedestal level gain | Waveform monitor and calibration panel (black) | Setting before shot (signal range) |
| Gamma | Gamma regulation | Waveform monitor and calibration panel (grey scale) | Setting before shot (colorimetry) |
| Component gain | Component gain | Waveform monitor, vectorscope and calibration panel (colour bars) | Setting before shot (colorimetry) |

Notes:
(1) This note applies when a film-type shooting is adopted. In television-type shooting, the scene is very often dynamic, unforeseeable and uncontrollable. How to quickly calibrate the 3D shooting system is still an open issue (CRIT has filed a patent application concerning “screen position” and “disparity range”).
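The white-level and black-pedestal checks in Table 3 amount to verifying that a calibration frame stays within the legal signal range. A minimal software analogue of reading a waveform monitor, assuming 8-bit video with the nominal 16-235 luma range, might look like:

```python
import numpy as np

# Nominal 8-bit video levels (narrow/broadcast range).
BLACK_LEVEL, WHITE_LEVEL = 16, 235

def check_signal_range(frame):
    """Report min/max luma of a frame and whether it stays in range,
    roughly what an operator reads off a waveform monitor."""
    lo, hi = int(frame.min()), int(frame.max())
    return {
        "min": lo,
        "max": hi,
        "black_ok": lo >= BLACK_LEVEL,
        "white_ok": hi <= WHITE_LEVEL,
    }

# Calibration frame: grey ramp that undershoots black and overshoots white.
frame = np.linspace(10, 240, 256).astype(np.uint8).reshape(16, 16)
report = check_signal_range(frame)
print(report)   # black_ok and white_ok are both False for this ramp
```

Such a check would be run on the calibration-panel shots (white, black, grey scale) before each take, mirroring the “setting before shot” entries in the table.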
Table 4: Parameters Relevant to the Recording Phase

| Parameter | Related to… | Suggested monitor or measuring instrument | Normal use |
|---|---|---|---|
| Recorded image | Recording mechanism | 2D or 3D monitor | Presence monitor to verify the recording quality |
Table 5: Parameters Relevant to the Editing Phase

| Parameter | Related to… | Suggested monitor or measuring instrument | Normal use |
|---|---|---|---|
| White level | Overall gain | Waveform monitor | Colour grading (signal range) |
| Black pedestal | Pedestal level | Waveform monitor | Colour grading (signal range) |
| Gamma | Gamma regulation | Waveform monitor | Colour grading (colorimetry) |
| Component gain | Component gain | Waveform monitor, vectorscope | Colour grading (colorimetry) |
| Sequences under process | Video material | 3D monitor | To select the correct transition points |
| Processed sequences | Video material | 3D monitor | To evaluate the obtained result |
Table 6: Parameters Relevant to the Compositing Phase

| Parameter | Related to… | Suggested monitor or measuring instrument | Normal use |
|---|---|---|---|
| Sequences under process | Video material | 3D monitor | To watch the video material to be composed |
| Processed sequences | Video material | 3D monitor | To evaluate the obtained result |
ANNEX 2

What is different in Stereo-3D production compared to 2D production?

A.2.1 STORYBOARD, FILM PRODUCTION GRAMMAR

Binocular vision enables different depth cues than monocular vision. In monocular vision the following depth cues prevail:
• occlusion
• accommodation
• perspective
• motion parallax
• relative size and familiar size
• texture gradients
• aerial perspective
The most obvious binocular depth cues are convergence and binocular disparity. The two eyes see the world from slightly different perspectives. This results in horizontally shifted pictures depending on the distance of the observed objects. An object in front of a distant background will appear at different locations in the two eyes and, at the same time, occlude different parts of the background in each eye. The human brain fuses these two different pictures into a cyclopean picture augmented by the perception of depth.
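These horizontally shifted pictures can be quantified. For two parallel cameras (a reasonable stand-in for the two eyes), a point at distance Z seen with baseline b and focal length f projects with horizontal disparity d = f * b / Z, so nearer objects shift more between the views. A small illustration with purely illustrative numbers:

```python
def disparity_px(baseline_m, focal_px, depth_m):
    """Horizontal disparity (pixels) for parallel cameras: d = f * b / Z."""
    return focal_px * baseline_m / depth_m

# Human-like baseline of 65 mm and an illustrative focal length of 1000 px.
b, f = 0.065, 1000.0
for z in (1.0, 2.0, 10.0):
    print(f"object at {z:4.1f} m -> disparity {disparity_px(b, f, z):5.1f} px")

# Nearer objects shift more between the two views, while the disparity of a
# distant background tends towards zero -- the cue the brain fuses into depth.
assert disparity_px(b, f, 1.0) > disparity_px(b, f, 10.0)
```

The same relation explains why widening the baseline (as in the Lilliputian effect discussed below in this annex's own terms) exaggerates perceived depth.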
A related binocular phenomenon is binocular rivalry, where the pictures in the left and right eye cannot be fused by the human brain into a cyclopean picture.

It is obvious that 3D productions will utilise the additional binocular depth cues. Whereas slight horizontal shifts and differently occluded patches of the background are vital for binocular 3D perception, binocular rivalry is mostly unwanted and should be avoided.
The storyboard for a stereo-3D production should contain a depth script in order to deal with the following questions:

• How much stereoscopic depth should be contained in each scene? The binocular depth of a scene will tell the audience its own story and not just support other depth cues, especially as binocular depth cues can be utilised independently of monocular depth cues. This can reflect emotions, as in the following examples: if a scene contains no or little binocular depth, the audience might consider the scene flat in the figurative sense; a scene with great depth, on the other hand, might cause a different sensation of the scene and of the perceived position of the observer within it. An object behind the screen might be considered distant in an emotional sense, whereas objects in front of the screen might enter the personal sphere of the audience.

• What is the interaction of binocular and monocular depth cues? How will it be perceived if, for example, shallow depth of focus is combined with binocular depth?
• Stereoscopic experience has revealed some strange phenomena. If the inter-ocular distance is chosen too wide, the scene will be perceived like a dollhouse; this is the so-called Lilliputian effect. Is this in accordance with the director’s intention?

• Another artefact is the so-called window violation: an object in front of the screen that is close to the lateral frame of the picture will be cut off by the picture frame in an unnatural manner. In natural perception, an object in front of a window cannot be occluded by the window frame located at a greater distance.

• The size of stereo rigs might be a problem for actors. It is harder for the actor to maintain a viewing line, as the mirror of a stereo rig can be bulky and obstruct the actor’s view when she/he performs very close to the camera.
The stereographer can change the binocular perception by modifying the stereoscopic parameters
"inter-ocular distance" and "convergence". Up to now, film makers have not agreed on a
common set of rules for how binocular depth cues should be utilised in stereo-3D productions. It is
now understood that 3D production obeys different rules than 2D production. Consequently, it can
also be expected that holoscopic productions will require yet another rule set, differing both from
the 2D grammar and from the stereo-3D grammar.
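The effect of these two parameters can be illustrated with the standard thin-lens parallax model. The sketch below is our own illustration, not part of the deliverable: a parallel rig converged (via sensor shift) on a plane at distance `z_conv_mm` produces a horizontal sensor disparity that the stereographer tunes through the interaxial (inter-ocular) distance; the sign convention is ours.

```python
def sensor_disparity(focal_mm, interaxial_mm, z_conv_mm, z_mm):
    """Horizontal disparity on the sensor, in mm, for an object at depth z_mm.

    With this sign convention, objects behind the convergence plane
    (z > z_conv) get positive disparity and appear behind the screen;
    objects in front of it get negative disparity and appear in front.
    """
    return focal_mm * interaxial_mm * (1.0 / z_conv_mm - 1.0 / z_mm)

# Illustrative values: 35 mm lens, 65 mm interaxial, convergence plane at 3 m.
print(sensor_disparity(35.0, 65.0, 3000.0, 3000.0))   # object on the plane: 0.0
print(sensor_disparity(35.0, 65.0, 3000.0, 10000.0))  # background: positive
print(sensor_disparity(35.0, 65.0, 3000.0, 1500.0))   # foreground: negative
```

Widening the interaxial distance scales all disparities linearly, which is how the dollhouse (Lilliputian) effect mentioned above arises when it is chosen too large.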
A.2.2 PRE-PRODUCTION
Staging and decorations will have to be planned and built more accurately. Flat wallpaper in the
background, for example, will not properly stimulate the sensation of a deep perspective. As the
cutting frequency will be lower in stereo 3D, the props have to be worked out in greater detail,
since the human eye will notice sloppiness more easily. For the same reason, rehearsal and acting
will have to be more precise.
A.2.3 SHOOTING
3D productions require more complex production tools. The 3D camera comprises two normal
cameras, mostly digital cameras such as HDTV cameras or digital film cameras. In some rare cases,
integrated 3D cameras are available, which contain two 2D cameras mounted in a single
housing. In addition to the cameras, further equipment is needed, such as a stereo rig, rig control,
means for synchronising the camera outputs, synchronised lens control, synchronised recording and
playback, and 3D monitoring.
3D productions also require additional personnel. The most important new person in the camera crew
is the stereographer. The stereographer cooperates with the DOP and the director to design the depth
perception of the picture, determining the inter-ocular distance and the convergence angle of the
two cameras. In addition to the stereographer, one or more stereo technicians will be required to
operate the cameras.
In 3D productions, lenses with different focal lengths are used than in 2D production. DOPs
tend to use shorter focal lengths. In many cases DOPs also tend to use a smaller aperture than in
2D productions, as a shallow depth of field is often considered contradictory to binocular depth
perception. As a rule of thumb, panning speed should be slower than in 2D production. The human
eye also seems to be more sensitive to crushed shadows in 3D perception than in 2D.
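The aperture choice can be made concrete with the standard approximate depth-of-field formula. This is a sketch under the usual thin-lens assumption that the subject distance is well below the hyperfocal distance; the numbers are illustrative, not from the deliverable.

```python
def depth_of_field_mm(focal_mm, f_number, coc_mm, subject_mm):
    """Approximate total depth of field in mm for subject distances well
    below the hyperfocal distance:  DoF ~ 2 * N * c * u^2 / f^2,
    where N is the f-number, c the circle of confusion and u the distance."""
    return 2.0 * f_number * coc_mm * subject_mm ** 2 / focal_mm ** 2

# Stopping down from f/2.8 to f/5.6 doubles the depth of field, which is
# why DOPs aiming for deep focus in 3D tend towards smaller apertures.
wide_open = depth_of_field_mm(35.0, 2.8, 0.03, 3000.0)
stopped = depth_of_field_mm(35.0, 5.6, 0.03, 3000.0)
print(stopped / wide_open)  # prints 2.0
```

Since depth of field scales linearly with the f-number in this regime, each stop the DOP closes the aperture trades light for the deeper focus that sits more comfortably alongside binocular depth cues.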
Special care is needed to avoid unwanted differences between the left and right image. The human
eye expects only horizontally shifted pictures. Vertical disparities can easily occur if the cameras
are not calibrated correctly, and it is the task of the stereographer to avoid such errors. Typically,
this requires several hours of setup and calibration time for the stereoscopic camera. Differences in
brightness or colour are also crucial errors that have to be avoided or compensated. It is especially
disturbing if brightness differences occur in isolated parts of the image, such as different highlight
clipping in the two cameras or different reproduction of shiny surfaces (due to polarisation effects
caused by the semi-transparent mirror). Changing lenses is also more time-consuming with stereo 3D
cameras. The camera crew might therefore have to work at a slower pace than in 2D productions.
Some changes in setup and operation are due to technical limitations of the camera equipment
(bulky, heavy, time-consuming to calibrate), whereas other changes, such as the selection of lenses,
the frequency of scene cuts, decoration and panning speed, arise from psychophysical aspects of 3D
perception.
A.2.4 QUALITY CONTROL
3D quality control is an additional task after acquisition. All potential errors have to be checked
for and flagged, such as a swapped left and right image, asynchrony, vertical disparities, colour
and brightness differences, and differences in flare, highlights and shiny surfaces. The quality-control
personnel have to decide whether the errors are within tolerance limits or the take has to be reshot.
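One of these checks, the vertical-disparity check, lends itself to simple automation. The following is a hypothetical sketch (not 3D VIVANT code): given matched feature points from the left and right cameras, it flags pairs whose vertical offset exceeds a tolerance, since the eye tolerates only horizontal shifts. The function name and tolerance are our own choices.

```python
def vertical_disparity_errors(left_pts, right_pts, tol_px=1.0):
    """Return indices of point pairs whose vertical offset exceeds tol_px.

    left_pts / right_pts: lists of (x, y) pixel coordinates of the same
    scene features as seen by the left and the right camera.
    """
    return [i for i, ((_, yl), (_, yr)) in enumerate(zip(left_pts, right_pts))
            if abs(yl - yr) > tol_px]

# A calibrated rig shifts features only horizontally; the third pair here
# is 5 px off vertically and is reported for quality control.
left = [(100, 50), (200, 80), (300, 120)]
right = [(95, 50), (196, 80), (297, 125)]
print(vertical_disparity_errors(left, right))  # prints [2]
```

In practice the tolerance would depend on screen size and viewing distance; the quality-control personnel would still decide whether flagged takes are acceptable or must be reshot.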
A.2.5 EDITING
3D productions require a slower pace of cutting, as fast cuts irritate human perception. Most
3D productions will also be distributed in 2D. It is therefore necessary either to create a single
editing version suitable for both 2D and 3D viewing, or to create separate editing versions for 2D
and for 3D.
A.2.6 GREEN SCREEN, VISUAL EFFECTS
Back plates and composites of real and virtual layers are more demanding in 3D. Virtual elements
have to be placed at the correct depth in the scene. Moreover, flat back plates might not be
sufficient, e.g. when simulating a deep perspective.

Stunt scenes are also more demanding. When viewed in stereoscopic 3D, it might become obvious
that a fight scene is not real: the audience might notice that a punch intentionally misses the
victim's face.
A.2.7 ADDITIONAL TASKS IN POST-PRODUCTION
There are additional tasks in 3D post-production, such as depth grading, stereo fixes and stereo
handovers. Depth grading establishes the progression of the depth sensation throughout the
production: it defines at which points in time the scene recedes further from, or comes closer to,
the audience. The term stereo fixes describes the task of removing unwanted stereoscopic flaws
such as colour differences or areas of binocular rivalry. The term handover describes the task of
adjusting the transition from one cut to the next, in order to ease the perception of depth jumps.

Finally, various versions might have to be created for the various distribution channels, such as
2D, stereo 3D, cinema, TV and mobile displays.