
learning - Academic Conferences Limited



Anne Jelfs and Chetz Colwell

We work together, alternating between the roles of observer and facilitator depending on the participant: when the person is disabled, Chetz takes the role of facilitator; when they are non-disabled, Anne is the facilitator. This method closely replicates the approach suggested by Craven and Booth (2005):

'Ideally the session should include a facilitator and an observer. The facilitator sits next to the participant and directs the testing while the observer watches the process remotely either through a one-way mirror, or by using a video to record the session which the observer watches in another room. This method enables the observer to concentrate fully on what the participant is doing and saying, rather than on the actual running of the experiment.' (pp 187-188)

While participants complete the tasks, they are observed and prompted as appropriate. We use probing questions to uncover the reasons for particular actions and the participant's perceptions of what they expect to find. Tamler (2001) suggests that a good example of a question is 'when you were selecting that file, what were you expecting?' rather than 'why did you select that file?', which makes the user feel that they need to justify their behaviour. We also give some feedback as the student progresses, because they are working as co-evaluators and not purely in isolation, so we might say something like 'this is just the sort of thing we need to hear' or 'your information will be very useful to the developers'.

Notes are taken by the observer concerning critical incidents, participants' comments and relevant actions. Participants are not given a time limit to complete each task. Finally, we use a pre-prepared interview schedule at the end of the session to find out about participants' opinions and preferences concerning the websites, and to allow them to give any other views they might have. The interview can also be seen as a debrief, where we might explain further about the work we are conducting.

4. Analysis and report

We carry out task-based analysis: how users tackled the tasks and where the major difficulties arose. The observer's notes are invaluable in guiding the analysis, as one hour of video tape has been reported to take at least five hours of analysis, or even a day or more (Preece et al 1994, p620). From both the observer's notes and the videos we are then able to write a joint report for the developers.

5. Conclusion

This method of working together has proved invaluable to the developers, as it gives them a report where they can apply our recommendations knowing that we have discussed our findings and are recommending the optimum solution for all users.

If points remain which are inaccessible, these are highlighted so that statements can be made on the module website. It is imperative that disabled students are made aware of any areas that are potentially inaccessible for particular types of student or supporting software.

References

Ardito, C., Costabile, M.F., De Marsico, M., Lanzilotti, R., Levialdi, S., Roselli, T. and Rossano, V. (2006) An Approach to Usability Evaluation of e-Learning Applications. http://www.di.uniba.it/~ivu/papers/2006/UAIS2006_Arditoetal.pdf (accessed 09.06.11)

BS 8878: Web accessibility. Code of Practice. http://shop.bsigroup.com/en/ProductDetail/?pid=000000000030180388 (accessed 09.06.11)

Byerley, S. and Chambers, M.B. (2002) Accessibility and usability of web-based library databases for non-visual users. Library Hi Tech, Vol 20, No. 2, pp 169-178

Colwell, C., Jelfs, A. and Mallett, E. (2005) Initial requirements of deaf students for video: lessons learned from an evaluation of a digital video application. Learning, Media and Technology, Vol 30, No. 2, July, pp 201-217

Cooper, M., Colwell, C. and Jelfs, A. (2007) 'Embedding accessibility and usability: considerations for e-learning research and development projects', ALT-J, Vol 15, No. 3, pp 231-245

Crowther, M.S., Keller, C.C. and Waddoups, G.L. (2004) Improving the quality and effectiveness of computer-mediated instruction through usability evaluations. British Journal of Educational Technology, Vol 35, No. 3, pp 289-303

Hertzum, M., Hansen, K.D. and Andersen, H. (2009) Scrutinising usability evaluation: does thinking aloud affect behaviour and mental workload? Behaviour & Information Technology, Vol 28, No. 2, pp 165-181

HEFCE (2010) Student perspectives on technology – demand, perceptions and training needs. Report to HEFCE by NUS

Hix, D. and Hartson, H.R. (1993) Developing user interfaces: Ensuring usability through product and process. Wiley & Sons, New York

