learning - Academic Conferences Limited

Tim Cappelli

Objectives: More efficient and faster preparation and transmission of data for analysis by the different roles involved in administration, management and quality assurance.

Evaluation impact: The exams data was sent from the base hospital to the Medical Exams Office (MEO) on the day of each OSCE exam. It was sent electronically over a secure link rather than as bundles of paper forms, reducing the time and cost of transmitting the data back to the centre and improving its security. Because the data is already in electronic format, it can easily be collated and assimilated into existing systems for analysis and distribution. Having business rules in the software that instantly flag incorrect forms or missing data before the examiners leave the site reduces the number of errors and facilitates the QA of the process.

6. Conclusion and next steps

The pilot of the digital pens was evaluated and reported on to the Senior Management Team of the School, together with a set of recommendations for a large-scale roll-out across all the base hospitals. The recommendations were:

- Clarification of the data requirements of the MEO.
- Demonstration of the process of using the pens to MEO staff.
- Assessment of the value of providing timely feedback to students: another pilot, similar to the first, should be run. This time the results from the pens, including the textual feedback, should be passed directly to the students once the data has been collated, checked and approved for distribution.
- Testing of the viability and reliability of the mobile phone upload.
- Exploration of vendor support: although Ubysis provided adequate support and advice prior to, and during, the pilot, it would be prudent to seek assurances that this level of support will continue in any future roll-out.
- Training of base administration staff in downloading and checking the pens' content.
- Training of OSCE examiners: some examiners expressed concerns over a lack of training to provide appropriate feedback.
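As an illustration, the kind of on-site business-rule validation described above could be sketched as follows. This is a minimal sketch, assuming hypothetical field names, a hypothetical 0-5 global rating scale and made-up rules; none of these details are taken from the actual pen software.

```python
# Sketch of business rules that flag incorrect or incomplete OSCE mark
# forms before the examiners leave the site. Field names, the score
# scale and the rules themselves are illustrative assumptions.

REQUIRED_FIELDS = ("student_id", "examiner_id", "station", "global_score")
VALID_SCORES = range(0, 6)  # assumed 0-5 global rating scale

def validate_form(form: dict) -> list[str]:
    """Return a list of problems found on one OSCE mark form."""
    errors = []
    # Rule 1: every required field must be present and non-empty.
    for field in REQUIRED_FIELDS:
        if not form.get(field):
            errors.append(f"missing field: {field}")
    # Rule 2: the global score, if given, must be on the valid scale.
    score = form.get("global_score")
    if score is not None and score not in VALID_SCORES:
        errors.append(f"score out of range: {score}")
    return errors

# A form with a missing examiner ID and an out-of-range score is
# flagged immediately, so it can be corrected on site rather than
# chased up later back at the centre.
problems = validate_form({"student_id": "S123", "station": 4, "global_score": 7})
```

Checking forms this way at the point of capture, rather than after the paper has travelled back to the centre, is what removes the round trip for corrections.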
The first two points were particularly important, as there had been a great deal of resistance to the introduction of the technology from the MEO. This was partly due to a lack of communication, and the resulting misunderstanding of the technology, and partly due to concerns about the technology making certain roles redundant. The result was that, despite the pilot, MEO staff were still opposed to the technology and promoted the view that the pilot had been unsuccessful. This misinformation rapidly became the prevailing view, and work was required to allay the concerns of the MEO staff. This demonstrates the importance of engaging all stakeholders at the start of any technology change.

At present, the SMT are reviewing the results of the pilot, together with other initiatives in student feedback, to determine the most appropriate way forward. A final decision will be based on available resources and the views of staff and students.

Acknowledgements

Thanks to Drs Hilary Dexter and Lucie Byrne-Davies from the Manchester Medical School for their work on the student evaluation that informs much of this study.

