Closure Review
Project Summary:
Project Review
The Visitor Registration System (VRS) is an administrative system developed to record the presence of visitors on University property and to provide them with access to a wide range of computer systems and facilities, such as library services. As the application is now over ten years old, the project was required to make small functional and cosmetic improvements to the user interface, making it easier to use and thereby enhancing the user experience.
Objectives and Deliverables
No | Description | Achieved |
01 | Change Service Selection screen to allow users to select the right combination of services | Yes |
02 | Change the UI to allow different selection criteria in the search | Yes |
03 | Remove redundant information from input screens | Yes |
Scope
The scope of the project was to improve usability within the VRS by making key changes to the user interface. The changes focused on the following:
- Search Options
The search page was modified to add filters for Org Unit, Originator, Visit Start Date, Visit End Date and Visit Stage.
- Search Results
The search results page was modified to allow sorting of columns without page reloads.
- Field Text Enhancements
Text was enhanced in various areas to make it more intuitive.
- PURE Section Reallocation
The PURE section was removed from the registration page and moved to the visit details page.
- Service Grouping and Selection Improvements
The service selection page has been improved to reduce the user error that could occur when selecting services. Related services are grouped together, and a service selection dependency has been added which automatically selects all dependent services when a top-level service is selected (a rough sketch of this behaviour is given below).
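As an illustration of the dependency behaviour, the following TypeScript sketch shows one way the automatic selection could work. The service names, the dependsOn map and the applySelection function are illustrative assumptions for the sketch, not the actual VRS implementation.

// Illustrative sketch only: the service names and dependency map are assumptions,
// not the actual VRS configuration.
// Each top-level service lists the services it depends on.
const dependsOn: Record<string, string[]> = {
  "Library Services": ["University Card", "Network Account"],
  "Email": ["Network Account"],
};

// Given the set of services ticked by the user, return the full set,
// resolving dependencies transitively.
function applySelection(selected: Set<string>): Set<string> {
  const result = new Set<string>();
  const visit = (service: string): void => {
    if (result.has(service)) return;                 // already included
    result.add(service);
    for (const dep of dependsOn[service] ?? []) visit(dep);
  };
  selected.forEach(visit);
  return result;
}

// Example: ticking "Library Services" also selects its dependent services.
console.log([...applySelection(new Set(["Library Services"]))]);
// -> ["Library Services", "University Card", "Network Account"]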
Schedule
- Preparation of project brief
- Investigate Service Selection screen checkbox combination options
- Investigate Search screen for keeping required fields and removing others
- Investigate Input screen for keeping required fields and removing others
- Update TAD
- Changes to the service selection screen
- Changes to the UI to allow different selection criteria in the search
- Removal of some redundant information collection from input screens
- Rework
- Documentation
- Peer testing
- Deploy to test
- Integration testing
- UAT
- Development Support
- Deployment Checklist
- Deploy to Live
- Development Support
- Handover to Production
- Closure documentation
Analysis of Resource Usage:
Staff Usage Estimate: 50 days
Staff Usage Actual: 76 days
Staff Usage Variance: 52%
Other Resource Estimate: 1 day
Other Resource Actual: 1 day
Other Resource Variance: 0%
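Assuming variance is reported as the overrun relative to the original estimate, the staff figure works out as (76 - 50) / 50 = 0.52, i.e. 52%, and the other resource figure as (1 - 1) / 1 = 0%.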
Explanation for variance:
At the start of the project, a survey was carried out to ascertain what functionality was used within the VRS and what improvements users would like to see. The survey end date needed to be extended due to a lack of responses the first time round.
The developer was initially booked on another, higher-priority project (LMP004), on which the design and build had slipped, so the developer's booking on LMP004 took precedence over the COM018 VRS Improvements project. There were small pockets of availability, but milestones had to be revised.
On 22/05/2015 a change of scope was noted, but it was covered by project contingency. The resulting risk was that no contingency remained should anything else go wrong.
During July and August the developer was on extended leave, so progress was halted at the integration sign-off stage as there were outstanding JIRAs that needed re-development.
On 13/08/2015 a scope change to the service selection page added 16.6 days to the project, taking the total to 66.6 days.
On 02/09/2015 a further scope change to the service selection page added another 5 days and required the milestones to be revised.
On 01/10/2015 acceptance testing sign-off was delayed because there was no definitive list of services with which to populate the Service Selection page, and this list was needed to pass UAT. The list was being compiled at the time, and once it was complete the services were entered into the database. JIRAs raised after UAT needed developer and business input to decide on the course of action to proceed.
Key Learning Points:
Carrying out a survey at the start of the project to gauge which parts of the application are being used and what improvements users would like to see requires effort to coax people to take part, and could push out timescales if not enough responses are gathered. A key learning point is to allow sufficient contingency time after the survey so that it can be re-run before proceeding to the next milestone.
Having a business analyst who is very familiar with the application and business processes greatly helps with requirements gathering and with carrying out user acceptance testing. This helped a great deal in getting the functionality correct and in identifying issues.
Continual stopping and starting of a project is sometimes unavoidable, but it increases the effort on the project even when no work is being carried out. This is due to the project management time required for reporting, and because, once work resumes, people need to get up to speed again by catching up on where things were left off. This project was stopped on a few occasions due to a higher-priority project and annual leave.
If a project relies on an external application, e.g. to provide data, that application is outside the control of the project and may cause an issue. This happened during the first attempt to deploy to LIVE, when the SOA service in TEST did not provide any data and the problem was initially mistaken for a bug in the VRS code. A bug in the VRS code was quickly ruled out after initial investigation, but the deployment for that day had already been aborted.
Outstanding issues:
There are no outstanding issues.