
IELTS Research Reports



Construct validity in the IELTS Academic Reading test

IELTS task survey

A corpus of IELTS reading test samples was compiled for the study. These were drawn from two sources: i) the official IELTS Practice Test (IELTS, 2007); and ii) practice test material published by Cambridge University Press (see Appendix 1 for a list of corpus materials). It is understood that the CUP materials are made up partly of retired official materials, and so were thought to reflect the actual nature of the official test better than many other commercial materials. No live reading test materials were available to the study. A total of 13 complete tests was investigated, each made up of a variety of task types.

Reading tasks were analysed by the researchers according to the two dimensions of the study's analytical framework, i.e. the 'level' and 'type' of engagement. Whilst a degree of interpretation invariably enters into any analysis of this kind, some objectivity was achieved in the study by having each researcher analyse tasks independently, with a consensual analysis then arrived at through processes of moderation.

Academic task analysis

To compile data for the university component of the study, lecturers from the twelve selected disciplines were contacted and invited to participate in the study. Participation involved initially passing on course reading and assessment materials, and then later being interviewed about these materials. A provisional analysis was made of the assessment tasks, drawing on the same analytical framework used in the IELTS analysis. This analysis was also subject to processes of moderation.

Academic staff survey

As a follow-up to the task analysis, interviews were conducted with the twelve participating staff. Prior to the interviews, a schedule of questions was sent to interviewees (see Appendix 2), along with a sample of IELTS reading test materials.
The IELTS materials were selected so as to cover a representative sample of test tasks (see Appendix 2a).

The interviews were divided into three main phases, covering:

■ general reading requirements on courses
■ reading requirements on specific assessment tasks
■ perceptions regarding the degree of correspondence between the academic reading requirements and those on the IELTS reading test.

The interviews were semi-structured and followed the procedure known as the 'discourse-based interview' (Odell, Goswami & Herrington, 1983). Such a procedure involves discussion with interviewees about specific text samples – in this case, the course materials provided by the lecturers and the sample IELTS reading test items. The interviews ran for an average of one hour. All interviews were audio-recorded and transcribed. The main themes and ideas to emerge from our informants' commentaries are presented in Section 4.2.

The interview extracts presented throughout the report are in the main verbatim transcriptions of the interviews. In some instances, there has been some minor cleaning up of the text for the purpose of removing extraneous features – false starts, hesitations, fillers and the like. As in Swales' (1998) study, the intention here was to make some small improvement to the readability of the spoken discourse of informants (p 26) while at the same time seeking to be faithful to the substance of their talk.

IELTS Research Reports Volume 11
