Completion Report

 

Background

This project is a continuation of the work completed in SAC034, which delivered the communication of progression decisions to students and a pilot of Assessment and Progression Tools (APT) in 7 Schools and a Deanery during 2015/16.

Under this project, APT was rolled out across a further 17 Schools and Deaneries, further functionality was developed to support Assessment and Progression business processes, and the tools developed during the pilot were enhanced.

This project supported the Assessment strand of the Student Systems Roadmap and the 'Outstanding Student Experience' theme in the University's Strategic Plan. Work on this strand will continue in 2017/18 under a follow-on project, and a smaller Steering Group will continue to provide governance through 2017/18.

 

Project Summary

The Assessment and Progression work strand will complete with the follow-on project begun in 2017/18. The success of the deliveries and benefits achieved will be reassessed on closure of that project.

The project delivered software and training to support key processes around assessment and progression, including:

  • enhanced DPT definition and corresponding allocation of programme- and student-specific requirements for passing courses
  • set-up of course assessments and entry of marks, including double marking
  • publication of in-course assessment results to students
  • system processes and BI reports to support both course and programme level exam boards
  • sharing of course marks (clearly identified where still provisional) across schools, reducing 'off-system' communication between schools
  • support for certain resit scenarios
  • progression and award calculations for the majority of taught programmes, with associated processes for the ratification and publication of decisions and awards.

In addition, a new deadline for the ratification of honours course marks was implemented, ensuring that awarding boards only used ratified course marks.

On the whole, the APT project has been well received, although certain aspects were less successful:

  • Most schools found processing of resits challenging, and the system does not currently support all the resit requirements. Student Systems were not able to provide the timely support required during the resit period.
  • Late delivery of software and processes for the summer exam boards created extra pressure in schools.

The feedback from School users varied considerably; this was influenced by a number of factors, e.g.:

  • timing of summer exam boards
  • methods of reassessment
  • previous processes and systems used for processing of assessments
  • level of engagement with the project

The project's objectives, deliverables, benefits and success criteria are detailed in the following tables.

 

Objectives

 

 

Achieved

(Y/N)

Notes

1

17 Schools and Deaneries and the Centre for Open Learning using APT to administer assessments, exam boards and progression/classification boards during 2016/17 without recourse to other systems

 

 Partially

17 schools used APT but in some cases had to resort to spreadsheets, particularly for some 'non-standard' progression & awards and also for resits.

2

Marks for in-year assessments and exam marks on courses administered within APT able to be viewed by students through EUCLID in a timely manner.

 Y

 

3

Remaining Schools and Deaneries ready to use APT in 2017/18, with any additional software required to support them developed.

 N

Transitioning remaining schools/deaneries is still in progress.

4

SMART decommissioned by the end of 2016.

Y

SMART was not used by any school in 2016/17 but was not decommissioned until January 2018.

 

Deliverables

Software Deliverables

 

 

Achieved

Y/N

Notes

1

Provision of barcode labels

Y

 

2

Double marking & marker identification

 

Y

 

3

Publication of component (in-course assessment) marks

 

Y

 

4

Student view of component marks

Y

 

5

PT view of component marks

Y

Available through student hub

6

DPT maintenance

Y

 

7

Management of course enrolments

Y

 

8

Standard setting/scaling

Y

Functionality was provided for a small number of schools. 

9

Integrate exam information from CCAM

N

Descoped; to be reassessed under the follow-on project.

10

Enhanced Reassessment functionality

Partial

Some scenarios could be processed, but schools with reassessment requirements outside of these had significant difficulties processing resits.

11

Enhancements to current assessment setup, mark entry and course board processing

Y

 

12

Enhancements to Progression and Classification processing

Y

Individual student processing was delivered, but this did not enable 'online' boards.

13

A simplified, standardised set of course and progression & classification board reports

 

Y

 

14

Decommission SMART

Y

Although this took only two days' effort, it took four months for the work to be completed, delaying the closure of the project by two months.

 

 

 

 

Other Deliverables

 

 

Achieved

Y/N

Notes

1

Agreed set of progression and award/classification rules to support the programmes offered across the University

 Y

Progression and award rules were implemented for the majority of programmes, but additional rules are required for some specific types of programme, particularly in ECA and HiSS.

2

Transition plans for schools adopting APT for 2017/18

Partial

Engagement with the 2017/18 schools was not as intensive as planned through the latter part of 2016/17, due to time pressure during the exam board periods and the sudden departure of a key member of the project team who was handling this.

3

Operational support delivered to support schools using the tools throughout 2016/17

Y

For the most part support was good, but this was not the case during the August resit period.

4

Handover of the technical support of APT to IS Applications management

N

Ongoing; the support model is to be agreed before the end of the follow-on project.

5

University key dates for ratification and publication of marks to support progression and award/classification boards

 

Y

More work to be done under the follow-on project for PGT and non-honours programmes.

 

Benefits

Student Experience

 

Achieved

Y/N

Notes

1

Students can view the assessment structures for all their courses in one place.

 

Y

 

2

In-course assessment results are available to both students and members of staff (through the EUCLID student hub) in a timely manner. This allows for better informed discussions between Personal Tutors (and other staff) and students relating to academic performance. Students do not have to access different systems to get an up-to-date view of their progress.

 

Y

The Steering Group decided that provisional exam marks should not be publishable until the final course mark had been ratified. This has caused issues for some schools and will do so for the MBChB programme, so this needs to be reviewed in the follow-on project.

3

Programme-specific progression requirements for all a student’s courses (e.g. must pass at 50% to progress) are clearly defined and available to students in EUCLID. Students will have a clear understanding of their progression requirements.

 

N

This information is available on the DPTs but not yet in the student view of assessments.

4

The DRPS more accurately and clearly reflects the programme requirements.

 

Y

 

 

 

 

 

Administration Efficiencies

 

Achieved Y/N

Notes

1

Course marks that have been ratified at a board can be recorded as ratified quickly in EUCLID and made available for progression and classification boards, eliminating the vast majority of 'off-system' communications between administrative staff in different schools ahead of exam boards. This reduces the time required to prepare for exam boards.

 

Y

Anecdotally, more time was required to prepare for exam boards in some schools. Other schools found the process much quicker.

2

The status of overall course marks is clear to staff administering exam boards, reducing risk of error in the preparation of exam boards.

 

Y

Provisional marks are clearly flagged in board reports.

3

DPTs can be more accurately defined to include agreed elevated hurdles (e.g. passing courses at 50% at first attempt).

 

Y

 

4

Student-specific course enrolments can be easily defined, maintained and viewed, enabling any staff looking at the student record to understand what must be achieved in individual courses for the student to progress. 

 

Partial

This information is clear on DPTs and within the Assessment and Progression tools but has not been added to the Student Hub.

5

Staff have the ability to control the timing of the publication of in-course assessment, overall course and progression & award/classification decisions.

 

Y

Several schools have requested that the system should not prevent 'cohort' publication of progression/award decisions if not all students have a decision.

6

There is a small, standardised set of business processes and reports for the administration of assessment and exam boards across the University, enabling administrative staff to:

  • spend less time preparing for board meetings
  • move more easily between subject areas and Schools

 

N

Schools felt under-supported in this area. Best practice material to be developed for schools during 2017/18.

 

 

 

 

Management Information

 

Achieved Y/N

Notes

1

Comprehensive information will be held centrally in the student record system and will be available to relevant staff for a range of processes, from the processing of assessments to considering appeals. This includes:

  • marks for individual items of assessment
  • an audit of changes of marks for individual items of assessment
  • notes against course results as presented to exam boards and any notes made as a result of exam boards
  • notes to students relating to course results including outcomes of Special Circumstances applications
  • audit of marks given by individual markers (enabling auditing for consistency of marking)

 

 Y

More work is required on the audit of marks given by individual markers; this will be considered in the next project.

There has been no attempt to define reports to be used at School, College or University level.

 

Success Criteria

2016/17

 

Achieved Y/N

Notes

1

Students view assessment marks and overall course marks through EUCLID.  

 

Y

 

2

Marks are successfully shared with Schools not using APT to support the relevant exam boards in summer 2017.

 

Y

 

3

Participating schools use APT to administer course assessment and exam boards without recourse to external systems

 

Y

In some cases Schools resorted to spreadsheets, particularly for some 'non-standard' progression & awards and also extensively for resits.

4

Award, progression decisions and course marks are ratified in the system in a timely manner using APT.

 

Y

 

5

SMART is successfully decommissioned

 

Y

 

6

Student Systems Operations have a suitable support structure in place to support the users of the system

 

N

Supporting such a large rollout was challenging and the current situation is unsustainable. Planning is in progress to address this. 

7

The Schools adopting APT in 2017/18 have transition plans in place

 

Y

Further work required with the Schools and Deaneries adopting APT for 2017/18.

Success Criteria for Future Years – For Information Only

2017/18

 

Notes

1

Remaining Schools and Deaneries adopt APT in 2017/18

 

 

To be continued in follow-on project

2

Dissemination of best practice information and support for a move towards a standardised practice

 

 

This is key for the success of APT.

 

 

 

 

 

Does the Sponsor agree that the project can be closed? 

Yes

Cost summary

IS Apps Staff Resources Estimated at Project Brief (Days)
250 days IS Apps resource

Actual IS Apps Staff Resources Used (% Variance)
370 days IS Apps resource (+50%)

SSP non-IS Staff Resources Estimated on Project Brief (Days)
300 days

Actual SSP non-IS Staff Resources (Days) (% Variance)
438 days (+45%)

Other Resources Estimated on Project Brief
n/a

Planned Delivery Date (Go Live) from Project Brief
Multiple deliveries

Actual Delivery Date (Go Live)
Multiple deliveries

 

Explanation for variance

The project was originally allocated a budget of 250 IS days and 300 SSP non-IS days for 2016/17. This was not an estimate, but an initial allocation of days.

IS resource

This was increased to 325 days (Issue 9: Project budget not sufficient):

  • The outstanding work was reviewed by the project team in February 2017. The team estimated that an additional 100 days would be required to complete the 'must have' developments. As a result of this exercise, 75 additional days were allocated to the project.
  • Initial work totalling 22 IS days already expended on the follow-on project SAC063 was transferred to this project.

 

Non-IS resource

  • BI reports were developed by one of the BAs.
  • The level of support required from BAs was higher than anticipated, in part due to the loss of the key member of the project team from Student Systems Operations who had 18 months' experience of supporting APT.
  • The learning curve of a BA who joined the team for a few months and then left for another post also had an impact.

 

Key Learning Points:

What went well?

  1. Although not all anticipated deliverables were achieved, the project team delivered a significant amount of software in a relatively short time period.
  2. The re-writing of the very old DPT maintenance task was received very well in Schools. The ease of use and clarity of presentation, along with the effort that colleagues in Schools put in to review their DPTs, has led to more accurate representations of DPTs on the DRPS. (Anecdotally, no DPT structure issues arose whilst supporting students using PATH in Welcome Week.) We should review other very old, high-volume tasks in EUCLID (e.g. course enrolment) which could take advantage of the newer development techniques now available.
  3. The move to a single date for the ratification of honours course marks and two stage boards was successful. 
  4. The use of Yammer as a user support tool worked well when there was available resource to monitor and answer queries. There was good sharing of information between Schools. (However, it does take significant resource to monitor - the use of it fell down when Student Systems didn't have enough resource to engage with it, particularly during the resit period).
  5. Rapid response to queries and deployment of bug fixes enhanced the Project Team's reputation with colleagues and schools.
  6. Having dedicated Student Systems Operations staff as an integral part of the project team.
  7. Although they carried an upfront cost, decisions taken to refactor existing code during the project (e.g. creating the 'consolidated data object') created a more robust and more easily maintained product.

Feedback received from School colleagues

  • About Yammer: 'The teams use of this to allow communication was such a refreshing change to how software is normally rolled out.' (ECA)
  • Following DPT webinar: "I and a few members of the team attended the webinar today – thanks for organising that. I thought it worked really well. Everything seems straightforward so I'm going to cancel my training session." (LLC)
  • Following DPT webinar: ".... The new tool looks brilliant and I am looking forward to overhauling all of my DPTs (something I have been planning on doing for years)." (LLC)
  • At Semester 1 course board review: "This is the best supported project I've ever been involved with at the University" (Chemistry)
  • "You should know that APT is consistently held up by school colleagues as an examplar of a good project" (Lisa Kendall, CAHSS)
  • "I hope that projects don't go back to being run in the old way. APT has delivered software that is really easy and good to use" (Physics)
  • "the search for course enrolments across multiple, identified courses is excellent and works a treat - thank you :)" (Maths)
  • "As a school who have previously done things manually in a well-meant but risky way, APT is a fantastic project and I am already salivating at the time we will gain from the development of this system." (anon)
  • "I was told today that I have to run a course Board tomorrow (due to staff sickness), and I thought to myself, thank goodness for APT, I know i'll be able to easily get the data together at the last minute." (Education)
  • "All my staff say what an excellent project this has been, and so well communicated. Please don't worry that things are not as perfect as you would like just now, we still think assessment and progression tools are so much better than what we had, and lovely to use" (Law)
  • "I love the new system! it's so easy and straightforward! and will save us having to hassle you all the time!" (ECA)
  • "... but I just wanted to let you know that APT has honestly changed my life this Board period! It is SO MUCH BETTER." (Law)

 

What didn't go so well?

While the project successfully achieved many deliverables, as noted above, the scope may have been too ambitious.

Lessons learned due to project Scope

1 Progression and Award calculations.

Rather than try to deliver everything, we could have taken a go/no-go decision on each type of calculation and communicated this earlier, so schools could take mitigating action.

2 Rollout approach

More consideration should be given to a phased rollout for future projects of this size and impact, as this would enable Student Systems to provide a better customer experience.

3 Estimations

Consideration should be given to how project teams estimate story points. For example, as well as development and testing, estimates should include deployment costs, amendments to training materials and online guidance, and Student Systems Operations training, so that the full cost of development is understood.

4 Resourcing

Due to the ambitious scope of the project, there was little contingency available for the project team to cope with any unexpected unavailability of key team members.

Maintaining communication with schools during the busy exam period was challenging when project team resource was stretched.

Resourcing for support during the exam board periods, particularly the resit boards, was not sufficient. This issue was exacerbated over the summer by the loss of a key member of support staff, BAs having conflicting project priorities and sickness in the support team. We need to ensure appropriate levels of support are available.

The move to a single date for the ratification of honours course marks and two stage boards was on the whole successful. However, this was a big change for many schools and the resource required to enable this kind of change to long-established practices should not be under-estimated. 

5 Training and guidance

While some training was well received by colleagues in schools (see above) user feedback indicates that the project team did not provide enough training and guidance in a timely manner prior to the summer exam boards. 

6. Decommissioning of SMART

Partly due to the lack of an available project management resource, it took four months for this piece of work to be completed. Although resources were assigned several times, the work was not completed until January 2018. We need to improve how small pieces of work like this are prioritised.

 

Other lessons learned

1 Communications

The release plan and communications strategy for the follow-on project should be reviewed, as the team found it difficult to maintain efficient communication in the context of multiple releases. Some school colleagues found the iterative nature unsettling, as they were unable to plan their processes in a timely manner. However, those schools that pro-actively engaged with the project team generally had a more positive experience.

The project team should have defined vocabulary strictly up-front and stuck to it. For example, near the end of the project we had to revisit reassessment/resit/second sit/2nd sit nomenclature. 

2. Live performance

EUCLID eVision performance has been an issue which has affected the user experience; this is not limited to APT. Performance in the LIVE environment has been significantly worse than in the TEST and DEV environments. This issue has been raised with the Head of Student Systems, and the team will invite senior technical staff from IS to offer advice on how this could be improved in the follow-on project and to try to understand where the performance issues are occurring (e.g. whether it is the retrieval of data or other SITS processing).

3. Deployment method

An iterative development with multiple releases requires an efficient deployment mechanism. The current process for deploying new releases and bug fixes is time consuming (in actual and elapsed time) and requires a large number of hand-offs.  There are plans to introduce automated deployment for SSP which should alleviate this.

 

Feedback

  • "A lot of issues arose during the resit diet which highlighted that the software was not ready to process resit marks." (HCA)
  • "Software was being developed as we were using the system in May." (Chemistry)
  • "Yammer was used to discuss problems and post updates, but it very quickly became unmanageable." (Chemistry)
  • "Several days of staff time were lost across the School as a result of system and process deficiencies "(PPLS)

 

Outstanding issues

The following will be considered under the follow-on project:

Deliverables

Software Deliverables

  • (8) Standard setting/scaling: further development work required to support CMVM schools/deaneries
  • (9) Integrate exam information from CCAM 
  • (10) Enhanced Reassessment functionality: significant work required before this can be considered delivered.
  • (12) Enhancements to Progression and Classification processing: further enhancements to improve speed for schools

Other Deliverables

  • (1) Agreed set of progression and award/classification rules to support the programmes offered across the University: rules for remaining programmes to be added.
  • (2) Transition plans for schools adopting APT for 2017/18: work is continuing with these schools.
  • (4) Handover of the technical support of APT to IS Applications management: SSP will continue to support APT during 2017/18 and will hand over responsibility after the follow-on project.
  • (5) University key dates for ratification and publication of marks to support progression and award/classification boards: further work required for non-Honours and PGT.

Benefits:

Student Experience:

  • (3) Programme-specific progression requirements for all a student’s courses (e.g. must pass at 50% to progress) are clearly defined and available to students in EUCLID, giving students a clear understanding of their progression requirements: include this information in the Student Hub and student view, but only if we are confident that the information is maintained accurately by schools.

Administration Efficiencies:

  • (6) Business processes and reports for administration of assessment and exam boards across the University: development and embedding of best practices required during 2017/18.

Management Information Benefits

  • (1) audit of marks given by individual markers (enabling auditing for consistency of marking)

Success Criteria to be reviewed

  • (3) Participating schools use APT to administer course assessment and exam boards without recourse to external systems: further work is required, particularly around processing of resits, to enable this.
  • (6) Student Systems Operations have a suitable support structure in place to support the users of the system: Student Systems Operations are constructing a plan for resource requirements for the ongoing support of Assessment and Progression processes.

 

It is possible that some of the above will be de-scoped from the APT strand during planning for the current follow-on project, APT3.

 

 

Project Info

Project
Assessment & Progression
Code
SAC057
Programme
Student Systems Partnership SSP
Management Office
ISG PMO
Project Manager
Chris Giles
Project Sponsor
Susan Rhind
Current Stage
Close
Status
Closed
Start Date
01-Aug-2016
Planning Date
n/a
Delivery Date
n/a
Close Date
26-Jan-2018
Programme Priority
1
Overall Priority
Higher
Category
Discretionary

Documentation

Close