Completion Report
Project Summary:
Project Team
Role | Department | Name
Project Manager (Owner) | IS Project Services | Sue Woodger |
Programme Manager | IS Project Services | Rhian Davies |
Applications Manager | Apps Management | Anne Finnan |
Business Area Manager | Finance | Joyce Smart |
Project Sponsor | Finance | Liz Welch |
Developer | IS Config Team | Gabby Capitanchik/Clive Davenhall |
Refresh of Dev and Test | Dev Tech | Neil Grant/Mark Lang/Chris Cord
Were the goals of the project met?
The objectives and deliverables of the project were defined in the Project Brief as follows:
- To meet audit requirements for balancing control accounts by fixing all outstanding errors
  - D1: Analysis of problem types
  - D2: Identification and categorisation of the errors, logging them in JIRA
  - D3: Provide balanced AP and AR control accounts by fixing current issues
  - D4: Show accurate student finance channel information
- Prevent control accounts from becoming unbalanced
  - D1: Identify where issues are being generated
  - D2: Provide a solution to rebalance supplier and customer accounts
- Look at business rules to help inform best practice
  - D1: Investigate the current position of help documentation
  - D2: Provide advice on best practice in process
Deliverables were met as follows:
It was essential to prevent the errors from happening in the first place, so that the Configuration Team could concentrate on fixing existing errors without new ones arising; deliverable 2 was therefore delivered first. The Project Manager worked with Finance to draw up a requirement for a software fix, which was submitted to ABS. The fix was received back from ABS, tested, and promoted to Live on 13 February 12, two weeks ahead of schedule. Finance were very pleased with the fix: users were now physically prevented from continuing with incorrect actions while an Instalment Plan was attached to an account, as they were forced to deallocate the plan first.
The Configuration Team then identified the types of errors that were occurring, and Finance identified the accounts affected. The Configuration Team wrote scripts to correct the accounts, which were run on Dev, Test and Live, with Finance testing the accounts at the Test and Live stages. The fixes resulted in balanced AP and AR control accounts. Once testing was completed in Live, Finance checked the Student Finance channel and confirmed that the information it returned was correct.
Joyce Smart in Finance is to produce a best practice document that Liz Welch will circulate around Finance. Project Services did produce data flow diagrams to support the help documentation, but later in the project Finance agreed that these were not needed; they did not have the time to review them, and the best practice document was considered to cover all help aspects.
Were there additional deliverables added?
In addition to the original objectives, a refresh of DEV and TEST took place. The refreshes had been highlighted as a possible risk during the planning stage of the project and materialised in the build stage, when it became apparent that there was a large disparity of data between the DEV, TEST and LIVE environments. This meant that any fixes applied to DEV or TEST would not fix the data in LIVE. As the project sponsor was not comfortable with amending data in LIVE directly, it was agreed to carry out a refresh of DEV and TEST. The sponsor felt it was essential to have DEV, TEST and LIVE the same, and not just for this project.
Staff Resources Estimated on Project Brief (Days) | Config team: 21.25; Applications Management: 26.75; PM (planning, estimating, closure, PM, UAT): 33; Total: 81 days. An increase in budget on 27 Sep 12 took the budget up to 104 days.
Actual Staff Resources Used (% Variance) | 106 days 2.5 hours (31% variance against the original 81-day estimate)
Other Resources Estimated on Project Brief (Money) | n/a
Actual Other Resources Used (% Variance) | n/a
Planned Delivery Date from Project Brief (Project Brief of 19 Jan 12) | Fixes: 26 March 2012; Software fix: 27 February 2012
Actual Delivery Date | Fixes: 24 December 2012; Software fix: 13 February 2012
Project Manager's Commentary on Reasons For Variance From Plans
The project brief was approved on 19 January 12 with an estimate of 81 days and a delivery date for the error fixing of 26 March 12 and a delivery date for the software fix of 27 February. (The original proposal estimate had been 65 days.)
The software fix was delivered early, on 13 February, two weeks ahead of the Project Brief date of 27 February 12.
The error fixing was not delivered until 29 Nov 12. The main reasons for this were:
- A refresh of DEV and TEST had to take place, which caused a delay of 6 weeks (see the section below on Refresh of Dev and Test). The refreshes themselves completed 4 days late. It was then difficult for Joyce in Finance to find time to source the remaining accounts with errors, owing to the pressures of other high-priority projects in Finance. After the TEST refresh, the error fixes were delayed 5 times due to holidays and the pressures on Finance.
- The final errors were extremely difficult to unpick (this had to be done in the order in which the errors occurred), with many accounts containing multiple errors (one account's errors ran to 6 A4 pages). We had also estimated for fewer batches than we ended up with, and many of the batches contained errors that had to be reworked.
- The developer left in August, just as the final error fixes were started, and a new developer took over.
- Due to other pressures, testing in Finance took longer than expected. Testing was also reliant on one member of staff, and all checks were manual.
Refresh of Dev and Test
The Dev and Test refreshes were estimated to cost an extra 17 days of effort, but we were able to claw back 10 of these days by cutting back on other areas of the project. A revised estimate of 88 days was therefore approved on 26 March, with a revised delivery date of 1 Jun 12.
The refreshes themselves were delayed. It was originally intended to hold them at the end of March 12. Unfortunately, EST065 and FIN072 were holding UAT at the end of March and required the TEST environment. At the beginning of April, FIN070 required the use of TEST as they were escalating their project into TEST, and Apps Man advised of patching work taking us to the middle of April. I was then unable to secure resource for the refresh until the end of April due to prior bookings. The DEV refresh was therefore scheduled for 27-30 April and the TEST refresh for 2-4 May. The project sponsor agreed to the revised dates for the refreshes on 26 March.
Delay to DEV refresh
The DEV refresh had a small issue which delayed sign-off by Finance by 2 days, until 2 May.
- The refresh started on 27 April and was tested by Applications Management on 30 April with no problems. Finance, however, received an error message when trying to raise an invoice on ACT143 and ACT045: they were unable to apply (the step which would create the batch number) and unable to create a batch header. DevTech resolved this by adding the Cedar_sequences rows (which had been omitted from the run, as DevTech had not thought they were required) and recreating the queue table, since some advance queues were missing, and AppsMan rebuilt the target service. Finance retested, were able to complete and post the batches successfully, and signed off that testing was successfully completed on 2 May 12. DevTech updated the procedure to incorporate the changes.
Delays to TEST refresh
The TEST refresh had bigger issues, which delayed sign-off from Finance by 4 days, until 10 May. The refresh was due to complete on Friday 4 May but completed on Tuesday 8 May, and was signed off by Finance on Thursday 10 May.
- The import itself did not take long, but rebuilding the indexes took 24 hours; on DEV this had taken only a couple of hours. EFINAUDIT crashed during the process because archive logs filled up the TEMP table space, which had not happened on DEV. The EFINAUDIT process was restarted on Monday 7 May without the archive logs but did not complete until 2am on 8 May.
Key Learning Points
What went well
DEV refresh
Despite a 2-day delay, the DEV refresh went well thanks to careful planning and the creation of a refresh checklist for all parties concerned. DevTech and Apps Man were quickly able to address the issue that came out of Finance testing, and they updated their procedures to capture it for the future.
What didn’t go as well
- The TEST refresh wasn't as smooth as the DEV refresh, even though the same processes were used. Rebuilding the indexes took 24 hours compared with only a couple of hours on DEV, which DevTech explained as slower disks in the TEST environment. EFINAUDIT also crashed during the process because archive logs filled up the TEMP table space; this did not happen on DEV, as DEV holds less data. DevTech have, however, updated the instructions to ensure that anyone doing a refresh in future checks that the databases are put into non-archive mode for the refresh.
- There was reluctance to do a TEST refresh at the start of the project because of the issues the last time there was a refresh (TEST was apparently down for around 3 weeks), but if we had done the refresh at the beginning there would not have been so much delay to the project. We have, however, now proven that a refresh of TEST data can be done without significant issue.
- Completing the final fixes was very difficult. One issue was that the team were trying to fix TEST as well as LIVE, which added to the time. There were also problems with accounts that kept having to be re-fixed.
If you had a project like this again, what would you improve?
I would plan for more time between the refreshes. There was a balancing act between not leaving too much time between refreshes (so the data did not go out of step again) and leaving enough time to sort out issues; there were also resourcing constraints on when we could book developers, hence the very tight 2-day interval. The TEST system did quickly start to go out of sync with the LIVE system, as you would expect a test system to do. This caused some confusion during correction, as records soon became mismatched and team members tried to keep the two systems in line. This was onerous, and we had to confine amendments to accounts that were wrong on the LIVE system only.
I would improve my understanding of the pressures on the Finance team, so that I could allow more time for them to carry out their part of the plan.
I would re-estimate the project when a new developer takes over.
At Brief stage, the software fix and error fixing had individual close dates of 2 Mar 12 and 13 Apr 12 respectively. On reflection, there should have been only one close date for the whole project, these being two strands of the one project. On 13 March 12 I therefore updated the milestone schedule to a single close date.
Outstanding Issues
There are no items of work that have to be completed after the project closes.
The sponsor, Liz Welch, informed the project team via the programme manager that she was very happy with the outcome of the project, and conveyed her congratulations to the team.
Completion review reports received
Anne Finnan, Applications Management
Analysis of Resource Usage:
Staff Usage Estimate: 81 days
Staff Usage Actual: 106 days
Staff Usage Variance: 31%
Other Resource Estimate: 0 days
Other Resource Actual: 0 days
Other Resource Variance: 0%
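The staff usage variance above can be reproduced from the day figures in this report; a minimal sketch (the rounding-to-whole-percent convention is an assumption, and the 2.5 extra hours are ignored):

```python
def percent_variance(estimate_days: float, actual_days: float) -> int:
    """Variance of actual effort against the estimate, as a rounded percentage."""
    return round((actual_days - estimate_days) / estimate_days * 100)

# Figures from the report: 81 days estimated at Project Brief, 106 days actual.
staff_variance = percent_variance(81, 106)
print(staff_variance)  # → 31
```

Measured against the revised 104-day budget of 27 Sep 12 rather than the original brief, the same calculation gives a much smaller overrun of about 2%.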
Explanation for variance:
Key Learning Points:
Outstanding issues: