Closure Report
Project Summary
- The Assessment and Progression Tools (APT) project Phase 4 ran from August 2019. It was due to end in February 2020, but completion was delayed by the impact of the Coronavirus pandemic. The final release took place in January 2021.
- During Phase 4 the team deployed 22 releases to EUCLID, which included 80 bug fixes, improvements and new features.
- The main development objective of this phase was to deliver improvements to progression and award processing in APT to better support postgraduate taught programmes. The team also aimed to reduce support calls by fixing bugs, automating tasks and reducing the number of processes that needed to be performed by Student Systems Operations.
- The APT team began to embed user experience (UX) and research techniques into the development process. This helped identify issues with the progression and award process and test the usability of new features with APT users.
Benefits - Executive summary
Evidence / benefits realisation
- Annual time savings were calculated at 378 staff hours, with an average 88% reduction in processing time (based on assessing 35% of the work delivered)
- Improvements to student experience. The information on the student view of assessment is clearer, which is reducing the need for students to contact Schools. Making progression notes easier to add means Schools can communicate progression and award decisions to students more easily. Bug fixes and improvements in the progression screens will reduce lead times, so students can get their results earlier.
User feedback:
- "The updates to process course results were brilliant and super useful. It saved us a lot of work. Likewise, being able to update notes in bulk to the Progression and Awards screen and remove/reinstate records saved work at a time when we really needed to prioritise other things, so thank you very much."
- "It was good to attend the APT meetings, especially because we got to hear about how other Schools did things and find a common solution to our issues. I wish something similar existed so we could discuss and collaborate across colleges. Although our needs might be different, it always helps to bounce ideas around and to know why a particular solution might not be feasible from Systems’ side."
- "The only comment myself and the PGT team of SPS have is to say thank you, and that it is a shame that the project is closing. It is very useful to have a dedicated team to listen to users and make improvements, and I wish there were more such groups! Colleagues who attended the user group meetings felt they were very informative and productive."
- "Some positive feedback: have just used the progression notes tool and it was really easy and quick to use! Thanks for adding this to the system."
2020 summer board outcome:
- 96% of awards were published by the deadline in July 2020 despite the additional work (98% in 2019, 95% in 2018)
- 29,222 progression decisions were published in 2019/20 (compared with 29,019 in 2019)
- No APT system downtime (compared with a couple of outages in January 2019)
- Excellent support during the peak period, with 31 fewer hours spent by Student Systems Operations across the whole of 2020
Link to the full benefit realisation report: https://secure.projects.ed.ac.uk/project/sac081/page
Scope
The aim of this project was for the APT service to remain stable and supported during 19/20.
Not in Scope:
- Changes to resit functionality will not be addressed. Schools will still be required to manually calculate resit results in some unsupported scenarios
- Major changes to scaling functionality will not be addressed, though small tweaks may be as prioritised by users.
Objectives, benefits, success criteria
This project had three objectives:
Objectives | Benefits | Success Criteria | Outcome
O1. Address the backlog of issues and improvements prioritised by the APT user group and Student Systems Operations, while providing annual maintenance and support services during peak periods for course and progression boards. The priorities may change throughout the year as agreed with the APT user group and Student Systems Operations, in line with key deadlines (see link to current backlog: LINK) | Addressing key issues will: | | Deployed 22 releases to EUCLID which included 80 bug fixes, improvements and new features. Link to the full benefit realisation report: https://secure.projects.ed.ac.uk/project/sac081/page
O2. Complete and hand over Standard Operating Procedures (SOPs) as agreed with Student Systems Operations | Student Systems Operations can fully support the APT service by the end of 19/20, with minimum project team support | | Completed as part of ways of working: the team worked with a definition of done for user stories for all Phase 4 SOPs. However, there are still outstanding SOPs not handed over for progression/awards, including BI support. These will take significant time for the lead BA to complete; noted in the outstanding items below.
O3. Define ongoing systems support, to be documented via an Operational Level Agreement (OLA) | IS Applications can provide some systems support for APT | OLA is in place | Not delivered. A new project has been created (SAC084 Student Record Enhancement) which will (1) continue supporting the service and (2) address high-benefit continuous improvements, including APT work. It includes an experienced BA, developer and tester who are key to supporting the APT service.
Analysis of Resource Usage:
A budget from Core SSP funding (350 days of development time, IS support and project management) was allocated at the start. This was not based on estimated effort.
IS Staff Usage: 395 days (350d SSP Developer, incl. 124d contractor; 10d IS Production; 8d IS Dev Tech; 12d project management)
SSP Service improvement actual effort: 325 days (incl. 188d Business Analyst, 39d training, 98d testing). There was no set budget at planning, but it is usually similar to the IS budget.
Explanation for budget variance:
- more resources were allocated as a result of secured sponsor funding from USG to deliver more benefits and to remove a critical point of failure ahead of peak periods
- more work was required during the summer to support ART (specifically the no-detriment policy) and changes for online exam boards
Key Learning Points
What went well
- Ways of working
The team continued to work with a definition of done for user stories that included documentation for users and Student Systems Operations. Every release went live with the required documentation in place and communication to users. This process minimises support queries from users and ensures Operations are able to support features at the point they go live.
During this phase the team continued to release software regularly and maintained the two-weekly release cycle. From March 2020 the team moved to releasing features as soon as they were ready, rather than waiting for the next two-weekly release. This allowed users to get value from these improvements as quickly as possible.
The team continued to engage with APT users through the APT user group via the SharePoint site and through regular user group meetings.
- Guidance and Training
We've significantly improved our training materials and online guidance, although there is still handover documentation to complete for Student Systems Operations, and the Assessment Hub guidance could do with a refresh.
The APT team have delivered training by working in Schools as they go through assessment processes. This helps Schools to get the most benefit from APT, and was particularly effective during this phase of the project in bringing Clinical Sciences fully onto APT.
"Just wanted to thank you very much for the informative online APT trouble-shooting session on assessment structures that you kindly agreed to run for my Edinburgh Surgery Online (ESO) academic colleagues. It was so helpful to see you actually demonstrating the tools and testing out different solutions for our particular issues, especially as my two ophthalmology Masters programmes have programme-level assessments and do not fit in to the traditional UoE course assessment structures. Thanks also for coming up with a workable solution on the spot to the new extensions/SC online system for my courses."
- Working with Student Systems Operations
We've improved how we engage with Operations and now work closely together as one team. Operations staff are involved in daily stand-ups and testing, are kept up to date with changes, and approve the releases.
- UX – User Research and Experience
The Student Systems Partnership worked with a consultant during 2019/20 to bring UX principles and practices into the team.
As part of phase 4 the APT team ran workshops using UX techniques to identify improvements to the Progression and Award screen. The user research approach ensured that we were targeting the correct improvements to make. The team also ran usability testing sessions on some of the features during development.
The progression notes functionality went through UX testing, and the design was simplified based on feedback from users at those sessions.
Lessons learned
- Making changes without understanding the full impact, or without detailed requirements, when the work is urgent. We changed award publication quickly to get medical students graduated, which impacted graduation processing for other students. Because there was pressure to make the change quickly, our analysis was lacking. We could have made the change so the medical students could graduate, then done further analysis immediately afterwards (as there wasn't time to do it upfront) to check the impact, and made a second release to fix any issues.
- Leaving features (work in progress) unreleased. When we switched to delivering improvements for Covid, we left some features unreleased. Picking these up again six months later has been costly, and they were impacted by other projects (ESC-SAS003) which made changes to APT. We needed to stick to our own rules of releasing regularly and not leaving features sitting around unreleased. We should have released what was in progress before picking up other work, or possibly scrapped some of the work in progress where priorities had changed.
- Benefits analysis. We should include benefits analysis in our stories. Initial benefits measures (e.g. process and lead times, user feedback) should be documented as part of the story before development, and the same measures repeated and recorded in the story after release. Benefits measures could be added to the definition of done for a story. This would also give us the ability to use benefits (value) as a team performance measure.
- A product user group will always want more and will always be satisfied with improvements, but we need to be reasonably sure that the effort expended will lead to more benefit than investing in other work.
Outstanding Issues
There are no outstanding SOPs from Phase 4 work, but the outstanding progression and awards and BI suite documentation is still to be completed by the lead BA, aiming for one ring-fenced day a week under the SRE project (SAC084).