learning - Academic Conferences Limited
Tim Cappelli

6. Conclusion and next steps

Objectives: More efficient and faster preparation and transmission of data for analysis by the different roles involved in administration, management and quality assurance.

Evaluation Impact: The exams data was sent from the base hospital to the MEO on the day of each OSCE exam. It was sent electronically over a secure link rather than as bundles of paper forms, reducing the time and cost of data transmission back to the centre and improving its security. Because the data is already in electronic format, it can be easily collated and assimilated into existing systems for analysis and distribution. Business rules in the software that instantly flag incorrect forms or missing data before the examiners leave the site reduce the number of errors and facilitate quality assurance of the process.
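The error-flagging described above can be sketched as a small set of validation rules run before examiners leave the site. This is a minimal illustration only: the field names and rules are hypothetical, not those of the actual Ubysis software.

```python
# A minimal sketch of the kind of business rules described above: each
# digitised OSCE mark form is checked on site, so missing or incorrect
# data can be corrected before the examiners leave. Field names and rules
# are hypothetical, not those of the actual Ubysis software.

def validate_form(form: dict) -> list:
    """Return a list of problems found on one digitised mark form."""
    problems = []
    # Rule 1: the identifying fields must all be present.
    for field in ("student_id", "examiner_id", "station"):
        if not form.get(field):
            problems.append("missing field: " + field)
    # Rule 2: every station item must carry a mark.
    for item, mark in form.get("item_marks", {}).items():
        if mark is None:
            problems.append("no mark recorded for item: " + item)
    return problems

form = {"student_id": "S123", "examiner_id": "", "station": 4,
        "item_marks": {"history": 3, "examination": None}}
print(validate_form(form))
# -> ['missing field: examiner_id', 'no mark recorded for item: examination']
```

A form that passes every rule returns an empty list, which is the signal that the examiner may leave the site.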
The pilot of the digital pens was evaluated and reported to the Senior Management Team (SMT) of the School, together with a set of recommendations for a large-scale roll-out across all the base hospitals. The recommendations were:

- Clarification of the data requirements of the Medical Exams Office (MEO).
- Demonstration of the process of using the pens to MEO staff.
- Assessment of the value of providing timely feedback to students: another pilot, similar to the first, should be run. This time the results from the pens, including the textual feedback, should be passed directly to the students once the data has been collated, checked and approved for distribution.
- Testing of the viability and reliability of the mobile phone upload.
- Exploration of vendor support: although Ubysis provided adequate support and advice prior to, and during, the pilot, it would be prudent to seek assurances that this level of support will continue in any future roll-out.
- Training of base hospital administration staff in downloading and checking the pens' content.
- Training of OSCE examiners: some examiners expressed concerns over a lack of training to provide appropriate feedback.
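The feedback recommendation above implies a simple approval gate: pen results, including textual feedback, reach a student only once the data has been collated, checked and approved for distribution. A minimal sketch of such a gate, with step names that are assumptions for illustration rather than part of the pilot's actual software:

```python
# A minimal sketch of the gated release implied above: feedback is passed
# to a student only once every approval step has been completed. The step
# names are illustrative assumptions, not the pilot's actual software.

APPROVAL_STEPS = ("collated", "checked", "approved")

def releasable(record: dict) -> bool:
    """Feedback may be released only when every approval step is complete."""
    return all(record.get(step, False) for step in APPROVAL_STEPS)

record = {"student_id": "S123",
          "feedback": "Good rapport with the patient; revise drug doses.",
          "collated": True, "checked": True, "approved": False}
print(releasable(record))   # -> False: not yet approved for distribution
record["approved"] = True
print(releasable(record))   # -> True: all three steps complete
```

Keeping the steps in one ordered tuple makes the distribution policy explicit and easy to audit, which matters when results are released directly to students.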
The first two points were particularly important, as there had been a great deal of resistance to the introduction of the technology from the MEO. This was partly due to a lack of communication and the resulting misunderstanding of the technology, and partly due to concerns that the technology would make certain roles redundant. The result was that, despite the pilot, MEO staff were still opposed to the technology and promoted the view that the pilot had been unsuccessful. This 'disinformation' rapidly became the prevailing view, and work was required to allay the concerns of the MEO staff. This demonstrates the importance of engaging all stakeholders at the start of any technology change.

At present, the SMT are reviewing the results of the pilot, together with other initiatives in student feedback, to determine the most appropriate way forward. A final decision will be based on available resources and the views of staff and students.
Acknowledgements

Thanks to Drs Hilary Dexter and Lucie Byrne-Davies from the Manchester Medical School for their work on the student evaluation that informs much of this study.