A multiple case study of the relationship between the indicators of students' English language competence on entry and students' academic progress at an international postgraduate university

9.1 Data collection methods

9.1.1 Documentary sources

For this part of the study, data sources included documents and questionnaires. The 2007 Summer Programme students underwent a number of linguistic assessments in their first few months at university, which were accessible through the academic English staff. These included scores for students who had taken English tests and, following completion of the Summer Programme, reports for each student including grades for separate language skills, comments and recommendations for future study. In addition, all Summer Programme students participated in a pre-test IELTS exercise in the penultimate week of the Summer Programme, towards the end of September 2007, which covered academic writing, listening and reading skills but not speaking. A pre-test is primarily designed to trial new items for inclusion in future IELTS papers, so marks are not strictly comparable with those from a true IELTS test. In fact, listening and reading papers are marked as raw scores, not band scores, although the maximum score for the combined papers is provided. Despite these distinctions, the advice to candidates states that pre-test IELTS results give an indication of ability, so the results supplied an approximate measure for those students without entry scores, for whom no baseline comparison measure otherwise existed. It is possible, too, that relative differences, or similarities, between individual students' pre-test scores are robust, albeit based on a single examination sitting.

9.1.2 Examination scripts

Apart from a student's need for ongoing language tuition, there were no specific measures of students' language abilities once they had commenced the MSc course. The earlier diversity study (Lloyd-Jones 2007) and the Course Directors' interviews had established that written skills were those giving rise to concern and, on this basis, a decision was made to focus on students' writing abilities. Course assessments were preferred as data sources because of their contextually appropriate nature. The alternative of additional testing was considered but rejected on the grounds that it would be unacceptable to the students concerned. Of the three written assessment formats (course assignments, examinations and theses), the latter two were chosen for investigation. Assignments were rejected because of questions of authenticity and feasibility: organising and retrieving the large number of scripts necessary was deemed beyond the scope of the present study, and sampling on a smaller scale appeared daunting. Exam scripts, on the other hand, were clearly authentic, and the practice of two sittings, before Christmas and Easter, provided standardisation in time across the different courses.

Access to examination scripts was approved by Registry provided the students and Course Directors consented. Course Directors of Summer Programme students were emailed for their consent and all were willing. Summer Programme students were contacted individually by email with requests for consent to access their examination scripts; the email included an explanation for the request and a brief rationale for the study. Two students, both in SOE, declined, leaving 22 students participating in the exam script study.
Course Directors or Course Administrators provided the exam scripts, which were copied, the originals being returned to their departments. The scripts of three students, and one paper each for two further students, were unavailable as a result of external examining requirements. By agreement, the copies will be destroyed following the completion of the study.

The scripts were examined with two questions in mind. The first was to gain an impression of the amount of textual content in the scripts, which might corroborate the interview data for certain programmes (see Extract 3). The second was to identify examiners' comments about language, either critical or positive. Two reviewers separately read the scripts, a lecturer from the Academic English
