
learning - Academic Conferences Limited


Rachel Fitzgerald

running full data collection, but with the time factor and a Christmas holiday in the midst of the data gathering, it was decided to run the full survey. Although not ideal, as the first stage of an AR cycle it was considered that all data would be of interest and any limitations would serve to improve the next cycle of research. A survey was chosen as the method specifically because it suited the time constraints and students with erratic attendance patterns. An online survey was deployed in the virtual learning environment where students access all module materials (and assessment feedback); this seemed ideal for gathering many responses and opinions about the process. Awareness of the students also led the researcher to add a number of response categories to the survey to encourage them to get involved. This kept the survey uncomplicated and swift to complete, with an option to add qualitative opinions for general analysis. The type of survey question affects the quality of data received (O'Leary 2005), but as this data was gathered to establish a general consensus, it was expected to generate questions for further cycles of research.

To participate, students submitted their assignments electronically. Prior to recording the audio, the e-submissions were read and graded with some rough notes made about the work, then a selection was sent (via email) to be second marked. Typed comments from the second marker were incorporated into the rough notes, and the feedback recording for all students then took place over two days. Seven survey questions were put together to measure the students' perception of these areas and to understand whether they found the experience more personal. In addition, the survey was designed to collect data about the impact of the process on international students, with a view to informing practice in the DL area. In terms of the feedback that was to be recorded, Gibbs and Simpson (2004) suggest that poor feedback is backward looking and addresses issues that will not be required again. With the feed-forward aspect of the assessment in mind, the feedback addressed each student personally, commented on meeting the learning outcomes, and included clear guidance about academic writing, referencing and points to expand on in the next assignment, followed by the grade.

6. Critical reflection on the process

Recording feedback required little technical knowledge, just the use of a headset and the Audacity software. After a trial recording to test the setup, each recording took less than two minutes; the feedback was quite specific and therefore time-efficient. To keep a personal touch, the feedback was delivered in a natural tone and intonation, spoken as if talking directly to the student. In a study of 15 students, Merry and Orsmond (2007) highlight the importance of tone for audio feedback, but what may have been easy with just 15 students was not the case with over 100, and it was difficult to maintain a natural or even enthused tone throughout the process. The feedback was similar for many students, so it became perfunctory and repetitious. Rotherham (2008, 2009a) suggests that audio feedback reduces the time spent on feedback, and this was also the case in this research; however, while it is easier and quicker to explain a concept verbally, typed feedback offers the opportunity to cut and paste, which can be a useful tool with large student numbers. In addition, while the recordings saved time, much more time was spent uploading each individual file to a secure area on the VLE; this administrative process added at least two days and meant that the feedback was returned a day late. This would be a barrier to using audio feedback for these students in the next cycle.

When the feedback was available to the students, a notice was posted to the VLE informing them and offering a transcript if required. To get a transcript, students had to email me; in all, around fifteen students requested a transcript of their feedback. Using the report tool in the VLE, the graph in Fig 2 shows significantly increased activity on the day the feedback was made available (7th Dec 2010) and in the subsequent days; this activity tails off during the Christmas vacation and then returns to what can be considered normal access during Jan 2011. A classroom discussion with the students indicated that they had no trouble accessing the audio file; most found it a novel approach and were happy to accept it as a new method of feedback. All indicated they were aware that they could have a transcript on request, but generally did not feel the need to request one.

Through the VLE we can see how often each student on the module accessed their feedback, and the VLE report indicated two interesting points that will need further investigation:

The majority of students on the module accessed the file at least twice, with two students accessing the folder over twenty times. Rotherham (2009b) suggests that students find it harder to skim through audio feedback and therefore need to go back and listen again to clarify their understanding. However, it could also mean that the students found my voice difficult to

