The General Medical Council (GMC) requires medical schools to implement quality assurance mechanisms to ensure consistency across student placement opportunities. Schools must be able to show the GMC evidence that their students were fully integrated into the clinical team during their placement.
To gather this evidence, medical students are asked to complete an evaluation form about their placement. In the Medical School, these surveys are created and issued via their custom virtual learning environment (VLE), EEMeC.
The evaluation system in EEMeC is no longer viable and a replacement system must be found. Large sections of the system are written in Classic ASP, a programming language created by Microsoft that has received no new features since 2000 and has been earmarked for deprecation at an undetermined future date. In addition, a decision was made to close EEMeC and migrate teaching activities to Blackboard Learn to consolidate and improve the student experience. This migration was partially implemented in the 2018/19 academic year.
Prior to the move to Blackboard Learn, the Medical School reviewed all functionality in EEMeC to ensure there were appropriate migration paths within Learn for all functionality EEMeC provided. The review concluded that Learn had no equivalent tool to replace the existing evaluation system.
The Medical School contacted colleagues in other institutions to see if a suitable commercial replacement was available, but none was forthcoming. As such, a decision was taken to create a replacement system using an in-house development team.
This project will replicate the frequently used functionality provided by the EEMeC evaluation system in a new stand-alone web application.
The MBChB Evaluation system has been redeveloped successfully using a new, modern software stack. All functionality from the old MBChB Evaluation System, which was integrated into the old EEMeC VLE, has been ported across to the new system. In addition, the new system includes automated tests, which will make future upgrades relatively quick.
We anticipate that the system will be deployed for production use for the start of the 2020/21 academic year.
Key Learning Points
- Including automated tests in the system was a good decision. These tests caught a number of regressions when new features were added, allowing us to extend the system with more confidence.
- We would have benefitted from implementing an automated deployment system. Deployments are currently manual, which often meant a large time lag before our test system was updated for the Medical School to evaluate.
- The system spec did not include automatic ingestion of student information. Instead, student information is imported via a CSV file provided by the Medical School. This design matches the functionality of the old system but, in hindsight, we would have benefitted from implementing a direct feed from EUCLID to import this data.
- A lot of user acceptance testing was conducted towards the end of the project because of delays in setting up/updating a test system. In future projects, we will set up more regular review meetings with the users and ensure frequent testing is conducted.
- More thorough testing will be conducted by the Medical School over the coming months. Their initial tests found minor bugs which have since been resolved. We anticipate some further minor bugs will be found in the run-up to launch, which we will address as part of our normal maintenance tasks.
- We might need to implement a feed from EUCLID for student data. This was not in the original system or in the specification for the new system, but its absence makes testing more involved, as we need to upload each student before creating evaluation groups and related records. It would be better if the ingestion of student data were automated.
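As an illustration, the CSV-based student import described in the points above could be handled with a small parsing routine like the sketch below. The column names used here (UUN, Forename, Surname, Year) are assumptions for illustration only; the actual file format is defined by the Medical School's export.

```python
import csv
import io

def load_students(csv_text):
    """Parse a CSV export of student records into a list of dicts.

    The column names (UUN, Forename, Surname, Year) are illustrative
    assumptions; the real export format is set by the Medical School.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    students = []
    for row in reader:
        students.append({
            "uun": row["UUN"].strip(),
            "forename": row["Forename"].strip(),
            "surname": row["Surname"].strip(),
            "year": int(row["Year"]),
        })
    return students

# Example input in the assumed format:
sample = """UUN,Forename,Surname,Year
s1234567,Ada,Lovelace,3
s7654321,Alan,Turing,4
"""
```

A direct EUCLID feed would replace this manual upload step, but the same validation and normalisation logic would still apply to whichever source supplies the records.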