Closure Review
Project Summary:
Contributors
Role | Department | Name |
Project Manager (Owner) | IS Applications Projects | Maurice Franceschi |
Former Project Manager | IS Applications Projects | Franck Bergeret |
Production Management Coordinator | IS Applications Management | Suran Perera |
Business Area Manager | Registry | Lisa Dawson |
Project Sponsor | CHSS | Fraser Muir |
Programme Manager | IS Apps | James Thin |
Development Services Manager | IS Apps | Dave Berry |
Senior Developer | IS Apps | Greg Carter |
BOXI Analyst and Developer | IS Apps | Andrew McFarlane |
Dev Tech Senior | IS Apps | Neil Grant |
Apps Mgmt BOXI Support | IS Apps | Brian Denholm |
Conf Mgmt BOXI Universe Support | IS Apps | Rob Manley |
Other document contributors | | |
Background
SMART (Student Marks And Records Tool) is a system that manages students’ coursework, exam and assessment results, course marks, degree progression and classification.
Only two HSS schools (Divinity and the Business School) currently use SMART. By prioritising and addressing enhancements and known bugs, this project was to deliver enhancements to SMART that would make it easier for other schools within HSS to adopt it.
This project was raised as a follow-up to the last enhancement project (STU221), delivered in August 2012.
Scope (from ToR)
There were two phases:
· Phase 1: Reporting
- Two new BOXI reports were required for the January 2013 Exam Boards. One was delivered.
- The SMARTMI Universe used for reporting was to be updated more regularly. This was desirable for the January 2013 Exam Board, with a hard deadline of the May 2013 Exam Board. Not delivered.
· Phase 2: Prioritised enhancements required for the June 2013 Exam Board
- The highest-priority enhancements were delivered, including UG/PG Pass/Fail and Null/Zero Values (a minimal sketch of this handling follows this list).
- There was a prioritised list of deliverables, as it was known from the outset (by team and sponsor) that not everything could be delivered within the budget. The table below shows this.
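For context, the Pass/Fail and null/zero handling can be illustrated with a minimal Python sketch. All names (PASS_MARK, is_pass, format_mark) and the threshold are hypothetical, chosen for illustration only; this is not the actual SMART implementation.

    from typing import Optional

    PASS_MARK = 40.0  # assumed threshold, for illustration only

    def is_pass(mark: Optional[float]) -> Optional[bool]:
        """Pass/Fail for a recorded mark; None (no mark keyed) stays undecided."""
        if mark is None:
            return None           # null: no result recorded yet
        return mark >= PASS_MARK  # zero is a real mark, and is a Fail

    def format_mark(mark: Optional[float]) -> str:
        """Render a mark for an output file, keeping null distinct from zero."""
        if mark is None:
            return ""             # blank cell, not "0"
        return f"{mark:g}"

The point of the sketch is the distinction that objective O4 below addresses: a null (no result recorded) must never be collapsed into a zero (a genuine mark of 0), as the two mean very different things to an exam board.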
Objectives and Deliverables (from ToR)
No | Description | Issue Details | Outcome | Phase |
O1 | Two new reports run from SMARTMI BOXI, and increase the frequency of the Universe update | | | 1 |
D1 | A report for exam boards that displays student classification profiles by mean for a given programme of study; contains all relevant course and exam details for a student on one page, for easy reference during an exam board. BOXI report to be based on the stakeholder-agreed report ‘CHSS SMART report 1’. | Issue 15: calculations originally done in BOXI are not correct; to be updated to read calculated fields from SMART. | Partly achieved - the report has several flaws that need workarounds/fixes, and the reports were not usable for the Exam Board. | |
D2 | A quick-glance, student-per-line report that shows progression over years 3 and 4, spread of achievement by classification, and indicates to an exam board whether a student is a borderline case. This report to be based on the ‘Prog Grad BS’ report produced by Paul Kydd and agreed by stakeholders. | Issue 3: the report was delivered as a SMART report rather than a BOXI report, for efficiency and accuracy. | Achieved; deployed in January | |
D3 | Increase the frequency of the SMARTMI Universe update (then once overnight). It would either be updated directly from the SMART database (see performance risk 3.3) or the update frequency would be increased to meet the users’ requirements (every hour, 9.00 to 17.00). | Issue 29: a memory shortage in the TEST environment prevented the technical stream solution from being deployed, as it could not be proven to work in TEST. The alternative solution - to run reports directly off APPSLIVE - was not discussed in advance with the service owner (Registry) and so was not accepted. A new solution was found: run the overnight APPSLIVE-to-NEWSLIVE SMART universe refresh job during the online day, at dates and times scheduled for the exam boards. Note that in the end this was not implemented, as the BOXI report was never used. | Not achieved | |
O2 | Managing Pass/Fail results | | | 2 |
D1 | Process to calculate and record Pass/Fail | Issue 16: missed requirement - not built for PG. | Achieved | |
D2 | Display Pass/Fail | | Achieved | |
D3 | Manage Pass/Fail in existing reports | Issue 15 & Jira TEL003-12: manage the P/F changes in the newly built reports. | Achieved | |
O3 | System scalability as use by HSS schools increases | | | 2 |
D1 | Ensure that an increased number of users can access SMART at the same time, that current queries remain efficient, and that the system allows increased data storage. | Performance risk assessed: https://www.projects.ed.ac.uk/project/tel003/risks/13 | Achieved | |
| Prioritised enhancements | Priority | Phase | |
O4 | Handling zeros and null values | 1 | 2 | |
D1 | Resolution of zeros and null values in the output files reported by users | | Achieved | |
O5 | Averaging marks/grades for a programme | 2 (tied) | 2 | |
D1 | Resolution of incorrect calculations of marks | | Not possible within budget | |
O6 | Modify the yearly roll-over process | 2 (tied) | 2 | |
D1 | Improve the roll-over process to allow PGT programmes to calculate programme results in Sept/Oct | | Not possible within budget | |
O7 | Change the calculation of the Honours degree from a years 3:4 ratio of 50:50 to 1:2 (see the weighting sketch after this table) | 4 | 2 | |
D1 | Ability to enter a 1:2 ratio | | Not possible within budget | |
O8 | Add a column to record a reference at course level | 5 | 2 | |
D1 | To allow users to record notes against a particular student on a particular course | | Not possible within budget | |
O9 | Vertical text alignment in column headers for Progression Spreadsheets in SMART | 6 | 2 | |
D1 | To allow vertical text alignment, for ease of printing and viewing course marks on spreadsheets | | Not possible within budget | |
O10 | Investigate the disappearance in SMART of BA Exchange students from other universities who study at Edinburgh | 7 | 2 | |
D1 | To ensure those students appear on the lists | | Not possible within budget | |
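To make the O7 change concrete: moving from 50:50 to 1:2 means year 4 counts twice as much as year 3 in the Honours mean. A minimal Python sketch, with a hypothetical function name and signature (not SMART's):

    def honours_mean(year3: float, year4: float, w3: int = 1, w4: int = 2) -> float:
        """Weighted mean of the Honours years; w3 = w4 = 1 gives the legacy
        50:50 scheme, w3 = 1, w4 = 2 the requested scheme."""
        return (w3 * year3 + w4 * year4) / (w3 + w4)

    # Example: year 3 mark 60, year 4 mark 66.
    # 50:50 scheme: (60 + 66) / 2 = 63.0
    # 1:2 scheme:   (60 + 2 * 66) / 3 = 64.0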
Project Review
The project delivered the Pass/Fail enhancements, including Zero handling.
The response from both new HSS users and existing CSE school users of SMART has been very positive, with one experienced school team reporting that SMART had never worked so well during the exam period.
Post-Implementation and Closure Meetings have been held with the team and key stakeholders, and the outputs from these give a fair picture of the project's achievements and failures.
Lisa Dawson: many thanks for attending the SMART review meeting. I found it really useful to be able to chat through the issues that arose during the project and how best to move this forward. I have summarised the discussion below (Maurice – thanks for your input to this). I will be using this information within the post implementation review, if there are any changes please let me know before then.
Issue | Action | Next steps | Ownership |
Project over budget by 55 days (127 budgeted, 182 used) | Enhancements to Pass/Fail functionality required additional bug fixing before deploying to live | Change deployed; no further remediation required | Student systems (SACS), as this is now part of the live system |
| PGT Pass/Fail (build complete, but too late to deploy as it could have impacted keying of marks) | Coding complete and within current budget; agree testing and go-live plan | Student systems (SACS) / IS project team |
| BOXI reports for school exam boards (no milestones set after January, which resulted in a lack of clarity for resourcing) | Student systems (SACS) team to lead a quality review of the report and compile a list of changes, with involvement of key users. Note this report is only suitable for use within HSS | Student systems (SACS) / IS project team |
| Real-time data updates (the agreed process did not work within the test environment; IS support proposed an alternative, but as this was during the exam board period it was not implemented) | Running the refresh at agreed times impacts the SMART BOXI environment only (sketched after this table). New infrastructure may provide other alternatives | Student systems (SACS) to work with IS Support to complete an impact analysis of options for a daytime refresh |
Instability of environment | Move SMART to its own instance and add resources / check database performance | The student systems team will inform IS support when key periods of use take place; during these periods SMART will move to its own instance | Student systems (SACS) to inform IS of key usage; IS Support to prepare an instance for SMART to be moved to within these periods |
Future SMART | Student systems (SACS) to undertake a review of all outstanding SMART issues; this should be raised as a new project, not an extension of TEL003 | Review the list with IS and agree the impact on SMART if changes are made, and whether changes should be made within the existing SMART or wait until a decision is made on the future of assessment at the University | Student systems (SACS) / IS project team |
| Understand all steps of the SMART journey (application, reporting, BOXI universe) | Student systems (SACS) to map out the SMART journey to gain a clear understanding of all the steps involved | Student systems (SACS) |
Governance / lessons learned | A lack of ownership/governance contributed to poor risk management. TEL003 was rejected from the SACS programme and therefore sat within the TEL programme, with no backing or priority | SMART projects must sit within SACS programmes, and projects must have service owners: sponsors, key stakeholders and project board members | Student systems (SACS) |
| A lack of involvement in the project from the student systems team (SACS) resulted in changes impacting users without their knowledge | Student systems (SACS) support team for SMART to be engaged in project activities, along with live support | Student systems (SACS) |
| The deployments should have been stopped well before the exam period, rather than taken up to the wire - and beyond | Any future project should understand the cyclical timeline, to ensure enhancements are not planned during key processing times | Information Services project team |
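The "refresh at agreed times" next step above could take a shape like the following minimal Python sketch: gate the existing overnight refresh job so it also runs once each agreed daytime window has passed. The window times and names are hypothetical, and this is not the solution IS Support proposed.

    from datetime import datetime, time

    # Hypothetical daytime windows, to be agreed with the service owner
    # for exam-board days; illustrative only.
    REFRESH_WINDOWS = [time(9, 0), time(12, 0), time(15, 0)]

    def due_for_refresh(now: datetime, last_run: datetime) -> bool:
        """True once any agreed window has passed since the last refresh."""
        for w in REFRESH_WINDOWS:
            window_start = datetime.combine(now.date(), w)
            if last_run < window_start <= now:
                return True
        return False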
Lessons Learned
What didn't go so well?
1. BOXI reporting (data source is SMART):
- Service Management started the BOXI report without a SMART developer being available. Poor data quality resulted, as Service Management developed the reports themselves using on-line calculations; this was subsequently superseded by a new build that improved quality.
- When the SMART developer became available, the Senior Developer recommended that reporting be done in SMART instead of BOXI, but this advice was not taken at that time.
- There is a question to be raised as to why the reports were not all built in SMART, as they are part of the SMART process and re-use calculated fields (see the sketch after this list).
- Testing: BOXI users do not all have SMART knowledge/access. It was difficult to QA changes when unsure about the quality of the data source, despite test data being refreshed from Live in December 2012.
- Testing: the new enhancements for HSS were not sufficiently regression tested to ensure they worked with existing code (such as that used by CSE schools).
- The BOXI reports were significantly delayed because the strategy changed at the mid-point. This was partly because of problems nailing down the business requirements, and partly because of a lack of agreement between service management and technical staff on the delivery approach.
- Delays were also caused by confusion around which technical teams should be involved.
2. Updating the BOXI Universe:
- The main option chosen to update the data stream encountered technical issues at a very late stage in Test. The priority of other projects prevented earlier deployment.
- The fall-back position for the BOXI update was not properly risk assessed with the service owner.
3. SMART application reaching end of life:
- The risk of continuing to develop the SMART application was highlighted at annual planning, with a proposed project to re-write SMART. The proposal was not accepted.
- Production had enhanced SMART performance in 2012 but could not enhance it further.
- Code dependency: the enhancements affected other SMART processes that were difficult to assess/test within the resource constraint.
4. Business and technical knowledge of the SMART application and associated processes: few people have a detailed understanding of the technical implementation of SMART. Beyond the lack of breadth on the technical front, very few people have any understanding of the business and the business logic that SMART executes.
5. More emphasis is required on clearly defining and documenting the business requirements and processes. Lack of clarity caused delays and problems on this project. This either needs to be done before a project starts, e.g. as part of service analysis, or the project will need more lead time for this business analysis.
6. Test users need to understand the business requirements and processes.
7. A detailed test plan is required, including the testing of existing functionality. This is likely to be a significant task in any future project, if we are to avoid the sort of problems encountered this time.
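The calculated-fields point in lesson 1 is the pattern below: read what the application has already calculated rather than re-deriving it in the report layer. A minimal Python contrast; the field name is hypothetical and this is not the actual report code.

    # Approach abandoned (Issue 15): the report layer re-derives the mean,
    # so its logic can drift from the application's.
    def report_mean_recomputed(marks: list[float]) -> float:
        return sum(marks) / len(marks)

    # Approach adopted: the report displays the calculated field persisted
    # by SMART, so report and application always agree.
    def report_mean_from_smart(student_record: dict) -> float:
        return student_record["calculated_mean"]  # hypothetical field name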
Added by Maurice Franceschi, Project Manager from 1st May: also note that Pass/Fail was built for UG and PG courses, and for UG programmes; although Pass/Fail was an estimated requirement for PG programmes as well, it was missed during the initial build.
PG Pass/Fail was therefore delivered later (successfully), after the exam period was over, to eliminate risk to examinations (see https://www.projects.ed.ac.uk/project/tel003/issues/16).
SMART Service in 13/14
This is a list of the actions agreed to improve the service in 13/14, building on what was achieved and what was missed during the project.
Lisa Dawson : I have summarised the actions below from the assessment sessions that have taken place and added owners and target dates.
Task ID | Task | Owner | Target date for completion |
1 | Map current processes in place within SMART i.e. what functions does SMART currently provide | Paul (with support from Alan and Colin) | Agreed users will be involved to support this as full knowledge not within SACS; complete by end August |
2 | Map current system journey i.e. EUCLID – SMART – BOXI understanding all dependencies in the process | Paul (with support from Alan and Colin) | End July |
3 | Develop full UAT plan | Paul (with support from Maurice) – Maurice – can you please send the UAT scripts that have been developed to date | This is likely to be a dependency of task 1 therefore complete by mid-September |
4 | Review BOXI report created within TEL003 project with users and create a list of fixes required | Paul (with support from Alan and Colin) | End July |
5 | Review all current SMART outstanding issues and either resolve or if requires further development, feed into product specification for future (help desk calls, wiki, product backlog) | Paul (with support from Alan and Colin) | End July |
6 | Support testing of agreed fixes (mean issue and PGT pass/fail) | Paul (with support from Alan and Colin) and agreed users (Paul to contact and arrange with users) | Starting week commencing 8th July |
7 | Map existing HSS processes for schools that are not currently using SMART or an internally built system | Paul/Alan | Mid-September (work in conjunction with Chris G and Chris B to map in a consistent format) / agree HSS users to be involved with Fraser |
8 | Review the performance of SMART during April – June with Paul Kydd (Business School) and Will Hossack (Physics) | Lisa | End July |
9 | Identify areas of good practice across the University; map processes and analyse for consistent processes / obtain feedback from issues arising from exam regulations and feed to Ian Pirie | Chris G / Chris B | Mid-September |
10 | Read assessment regulations before attending mapping sessions | SACS team | ASAP |
11 | As part of process understanding, observe a resit board | Paul / Chris | ASAP |
Analysis of Resource Usage:
Staff Usage Estimate: 129 days
Staff Usage Actual: 190 days
Staff Usage Variance: 47%
Other Resource Estimate: days
Other Resource Actual: days
Other Resource Variance: 0%
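For clarity, the staff usage variance is the overspend relative to the estimate: (190 - 129) / 129 ≈ 47%.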
Explanation for variance:
Effort
Final total: 190 days (IS Applications, not including Service Management), plus 13 days Service Management.
The ToR estimate was 129 days (IS Applications, not including Service Management), plus 12 days Service Management.
Project Phase | Total Effort | Dev Team | Config Team | Dev Tech | Apps Mngmt | Tech Mgmt | Project Services | Directors Office | Service Management |
Project Management | 32d | 0d | 0d | 30min | 0d | 0d | 31d | 15min | |
Unplanned Activity | 26d | 13d | 3d | 1d | 5d | 0d | 2d | 2d | |
Initiation | 3d | 0d | 0d | 0d | 0d | 0d | 3d | 0d | |
Planning | 4d | 1h | 0d | 2h | 2h | 0d | 3d | 45min | |
Business Analysis | 16d | 1d | 4h | 0d | 0d | 0d | 15d | 0d | |
Systems Analysis | 11d | 6d | 1d | 3d | 1d | 0d | 0d | 0d | |
Build | 70d | 54d | 9d | 5d | 2d | 0d | 1h | 0d | |
Integration | 8d | 0d | 2h | 4d | 3d | 0d | 0d | 0d | |
Acceptance | 10d | 6d | 0d | 3d | 2h 30min | 0d | 4h | 0d | |
Deployment | 6d | 0d | 0d | 3d | 2d | 1d | 0d | 1h | |
Closure | 1d | 0d | 0d | 0d | 0d | 0d | 1d | 0d | |
Total for each Team | 190 | 80 | 14 | 20 | 14 | 1 | 56 | 3 | plus 12 days |
| Apps Total | Dev Team | Config | Dev Tech | Apps Mgmt | Tech Mgmt | Project Services | Director's Office | Service Mgmt |
Actual | 190 days | 80 | 14 | 20 | 14 | 1 | 56 | 3 | plus 12 days |
ToR Estimate | 125 days (129 including 4 days contingency) | 59 | 7 | 9 | 11 | 1 | 35 | 3 | plus 13 days |
Reason for Variance:
BOXI reporting had to be revisited after the January deployment: it needed the Pass/Fail change, and the calculated fields had to be reviewed. That is, the approach of doing the calculations in BOXI had to be abandoned, and the BOXI report instead took its values from calculated fields in SMART.
The decision on the use of BOXI had a significant impact on the variance for the Config and Dev Tech teams.
The work on the SMARTMI update increased as a result of trying to fix the TEST environment problems.
UAT did not pick up all the problems, and a large number of post-live fixes were needed.
Key Learning Points:
Outstanding issues: