Completion Report (DRAFT version; for review)

Project Summary:

The original objectives and deliverables for this project, as listed in the brief, were:

OBJECTIVES -

Phase 1:

  1. Produce progression and classification sheets for CHSS to support decision making at exam boards.
  2. Produce a single, combined new report for exam boards. Note that there may be crossover or duplication with objective 1, in which case a single report will be sufficient for both 1 and 2.
  3. Produce an extract in XML format that can be uploaded by Registry. 
  4. Improve the speed of the batch validation process.

Phase 2:

  1. Develop functionality to enable post-graduate taught progression.
  2. Fix the navigation for finding courses and student names (the system currently shows course numbers and student numbers, which makes navigation difficult; showing course names and student names instead would improve navigation considerably).

The following was originally in scope for business analysis but out of scope for subsequent design and build activities:

  1. Develop functionality for placement marks of Pass/Fail for Education, Social Work and Nursing students.

Objectives 1 to 6 (the four Phase 1 objectives and the two Phase 2 objectives) were all met, although the progression and classification sheets developed for HSS were only used this year by two participating Schools (Law and Business), as the other Schools have not yet all adopted SMART. Objective 6 (the improved navigation) evolved as the project progressed, with the requirements first stated in the BRD gradually changing, through discussion with SMART users, into functionality that met what was needed. Item 7 (placement Pass/Fail marks) was ultimately omitted from business analysis, so no work was done on this aspect as part of this project.

 

DELIVERABLES -

  • Project Brief, Project Plan and Project Estimate
  • Business Requirements Document including system design specification
  • Updated Technical Architecture Document (TAD)
  • Progression sheet for CHSS
  • Classification sheet for CHSS for use at exam boards
  • Production of XML export in the format specified by Registry
  • Improved batch validation process performance
  • Post-graduate taught progression sheet
  • Fixed navigation for finding courses and student names
  • Ability to use pass/fail for placement students
  • Updated test and live environments
  • Technical documentation

 

Each of the above has been achieved, apart from the ability to use pass/fail for placement students, as this was removed from the scope of the project.

The project sponsor and stakeholders have agreed that the project can now be signed off.

Analysis of Resource Usage:

Staff Usage Estimate: 100 days

Staff Usage Actual: 100 days

Staff Usage Variance: 0%

Other Resource Estimate: 0 days

Other Resource Actual: 0 days

Other Resource Variance: 0%

Explanation for variance:

It can be seen from the above figures that the project has delivered within the parameters of the original estimate. However, some clarification is required with regard to: estimates vs. actuals across the individual teams in IS Applications; the additional time that was suggested for 'Phase 2b'; and the small amount of time carried over into 2012-13.

First, the final figures for actual time across the IS Applications teams showed an increase for Project Services (covering Project Management and Business Analysis) and a decrease for Development Services. The former is attributable to a small increase in the duration of the project and to more BA time being required during user testing. The reduction in Development time is simply because the development tasks ended up needing less time than originally estimated.

The reduction against the Development estimates also meant that the work identified as belonging to Phase 2b was completed under Phase 2a, because the latter took less time to develop than anticipated. As a result, the additional 14 days added to the original budget (PICCL #15; 10/05/12) were not required.

Finally, the extra 14 days for Phase 2b were earmarked for 2012-13, with the original 100 days allocated to 2011-12. As has been stated, the extra days were not used; nor was the original budget completely used by the end of the old financial year. A total of 94 days were used in 2011-12, and the remaining 6 days have been used in the current year for work covering the final deployment to live, handover meetings and sign-offs, project management, and project closure.

With regard to the difference between estimated and actual effort, one contributor commented that:

It was largely fortuitous that development took less time than estimated; it could have easily gone the other way.  It is becoming more and more difficult to accurately estimate the effort required to incorporate new features into the existing product.  I guess you could take an alternative viewpoint that it will take more and more overhead to estimate to a given degree of accuracy prior to undertaking any work.  The underlying causes of the problem have all been covered in the rejected SMART redesign project proposal – ageing architecture, cumulative technical debt caused by years of piecemeal additions, etc.

Key Learning Points:

This was the first SMART project that involved a wider group of partners in the various Schools of the university whose requirements for a marks and records system were not directly met by SMART. The project therefore exposed staff in IS Applications to the variety of established reporting methods across the different HSS schools.

Each group of users - regardless of the system they are employing - has the same end in mind: the recording and storage of student marks; the calculation of student progression and classification; the display and reporting of these; and the presentation of results to School exam boards and of statutory returns to Registry. However, this project has demonstrated that the means of achieving each of these can vary from school to school, so that SMART (which was originally developed for a different College) does not necessarily meet the needs of all users at this time.

This has stimulated some debate on how SMART might be modified to meet such needs, and these discussions will inform any future SMART project.

Another positive lesson learnt from this project has been pointed out by one of our colleagues:

It helped greatly that development and project services worked so closely together and that we both were actively involved with the customers. I think this prevented a lot of misunderstandings and it definitely picked up spec discrepancies far sooner in the process than if we were operating strictly sequentially.

Outstanding Issues:

Two issues arose during this project that have not been resolved. The first is a locking issue discovered during development of the course-level spreadsheet: it appears that two users can edit the sheet simultaneously (as reported in this JIRA). However, this scenario is considered unlikely to arise in practice, and the effort to fix it was judged too great to absorb within this project. This could be added to the SMART Wiki.
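
For illustration only, the sketch below shows one common way such simultaneous edits are typically prevented: an optimistic-locking check against a version number read when the sheet was opened. The table and column names (course_sheet, data, version) and the function itself are assumptions made for this sketch and are not taken from SMART.

    import sqlite3

    def save_sheet(conn, sheet_id, new_data, version_seen):
        # Attempt to save edits made against the version of the sheet that
        # was read when editing began. The UPDATE only succeeds if nobody
        # else has saved in the meantime; otherwise rowcount is 0 and the
        # caller must reload the sheet and reapply their changes.
        cur = conn.execute(
            "UPDATE course_sheet "
            "SET data = ?, version = version + 1 "
            "WHERE sheet_id = ? AND version = ?",
            (new_data, sheet_id, version_seen),
        )
        conn.commit()
        if cur.rowcount == 0:
            raise RuntimeError("Sheet was changed by another user; reload before saving.")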

The other issue was reported by Physics and concerns the perennial problem in SMART of the '0' character being over-used in the system, which leads to some problems with the XML upload. This has also been reported under STU213 (see this JIRA) and has been raised with Registry. It is to be discussed further with SMART users and support (USD and Applications Management), as well as with SACS.
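
As a purely illustrative sketch of the kind of pre-upload check that could flag such cases, the snippet below scans an export file for marks recorded as '0' so that they can be confirmed manually before the file is sent to Registry. The element and attribute names used here are invented for the example; the real export format is defined by Registry.

    import xml.etree.ElementTree as ET

    def flag_zero_marks(xml_path):
        # List results recorded with a mark of '0' in the export so they
        # can be checked by hand before upload. The names used here
        # (result, student, course, mark) are placeholders, not the real
        # Registry schema.
        tree = ET.parse(xml_path)
        flagged = []
        for result in tree.iter("result"):
            if result.get("mark") == "0":
                flagged.append((result.get("student"), result.get("course")))
        return flagged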

Project Info

Project: SMART Enhancements for HSS
Code: STU221
Programme: Student Services (STU)
Project Manager: David Watters
Project Sponsor: Janet Rennie
Current Stage: Close
Status: Closed
Start Date: 16-Feb-2012
Planning Date: n/a
Delivery Date: n/a
Close Date: 28-Sep-2012
Overall Priority: Higher

Documentation

Close