IELTS Research Reports


An empirical investigation of the process of writing Academic Reading test items for the International English Language Testing System

2.2 Item writing

Item writing has long been seen as a creative art (Ebel 1951, Wesman 1971), requiring mentoring and the flexible interpretation of guidelines. This has been a source of frustration to psychometricians, who would prefer to exert tighter control and to achieve a clearer relationship between item design characteristics and measurement properties. Bormuth (1970) called for scientifically grounded, algorithmic laws of item writing to counter traditional guidelines that allowed for variation in interpretation. Attempts at standardisation have continued with empirical research into the validity of item-writing rules (Haladyna and Downing 1989a, 1989b); the development of item shells, that is, generic items with elements that can be substituted with new facts, concepts or principles to create large numbers of additional items (Haladyna 1999); and efforts to automate item generation (Irvine and Kyllonen 2002). Numerous studies have addressed the effects of item format on difficulty and discrimination (see Haladyna and Downing 1989a; Haladyna, Downing and Rodriguez 2002), and guidelines have been developed to steer test design and to help item writers and editors identify common pitfalls (Haladyna and Downing 1989a, Haladyna 1999). For all this, Haladyna, Downing and Rodriguez (2002) conclude that item writing remains essentially creative, as many of the guidelines they describe remain tentative, partial or both.

Yet stakeholder expectations of evidence-based, transparently shared validation for high-stakes language exams are increasingly the order of the day (see Bachman 2005; Chalhoub-Deville, Chapelle and Duff (eds) 2006), often specified through codes of practice (e.g. ALTE 1994). Rigour is increasingly expected of item-writer guidelines in the communicative language skills testing sector.
The new Pearson Test of English (PTE), due in 2009, aims, like IELTS, to provide language proficiency scores, including reading measures, for colleges, universities, and professional and government bodies requiring academic-level English. de Jong (2008) proposes an analysis, for PTE item writer training purposes, of item types (14 potentially applicable to the testing of reading) and a schema for item writer training structured around a general guide, item-specific instructions, reference materials, codes of practice, an item writer literature review and the Common European Framework of Reference (CEFR). Cambridge ESOL's own framework for the training and development of item writers is referenced in some detail below.

A number of handbooks include guidance on item design and quality assurance issues in language tests (e.g. Valette 1967, Carroll and Hall 1985, Heaton 1990, Weir 1993, Norris et al 1998, Davidson and Lynch 2002, Hughes 2003). These provide advice on the strengths and weaknesses of various item formats and stress the need for item review and piloting. It is generally taken as axiomatic that trained test item writers are superior to the untrained (Downing and Haladyna 1997).

While the focus of research has been on the characteristics of items, very little attention has been given to the processes that item writers go through in creating test items and the contributions that these processes may make to the quality of test material.
In a rare piece of research focusing on this area, Salisbury (2005) uses verbal protocol methodology and a framework drawn from the study of expertise to explore how text-based tests of listening comprehension are produced by item writers. Salisbury (2005, p 75) describes three phases in the work of the item writer:

■ Exploratory Phase: ‘searching through possible texts, or, possibly, contexts’
■ Concerted Phase: ‘working in an intensive and concentrated way to prepare text and items for first submission’
■ Refining Phase: ‘after either self-, peer- or editor-review, polishing/improving the test paper in an effort to make it conform more closely to domain requirements’

IELTS Research Reports, Volume 11
