Closure Report

Background

The Distance Learning at Scale (DLAS) programme sought to increase the capacity of the University of Edinburgh to deliver online learning programmes to high numbers of learners.

One of the considerations was how to use new technologies to support academic teams to maintain a high level of quality in teaching and a positive learner experience as learner numbers increase.

OnTask had been identified as a learning analytics tool that the DLAS programme would trial in its pilot programmes, initially the MicroMasters (Predictive Analytics) programme.

OnTask could provide users who participate in online courses with personalised, actionable coaching feedback via email. Emails are personalised for each user based on criteria set by course instructors, using the user's data trail within an LMS (or any other tool whose data we can access). Configurable workflows within OnTask could either use the raw data or transform it to create derived data sets and more advanced workflows.
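As an illustration of the kind of rule-based personalisation described above, the sketch below applies instructor-defined conditions to learner activity records and assembles a feedback email per learner. This is a minimal, hypothetical example: the column names, rules and messages are illustrative assumptions, not taken from OnTask or the pilot configuration.

```python
import csv
import io

# Hypothetical instructor-defined rules: each pairs a condition on a
# learner's activity record with a personalised feedback fragment.
RULES = [
    (lambda r: int(r["videos_watched"]) == 0,
     "We noticed you haven't watched any of this week's videos yet."),
    (lambda r: int(r["quiz_score"]) >= 80,
     "Great work on the quiz - you're well prepared for next week."),
    (lambda r: int(r["quiz_score"]) < 50,
     "Your quiz score suggests revisiting this week's material."),
]

def personalise(record):
    """Build the body of a coaching email from the rules that match."""
    fragments = [msg for cond, msg in RULES if cond(record)]
    greeting = f"Dear {record['name']},"
    return "\n".join([greeting] + (fragments or ["Keep up the steady progress."]))

# Example learner data, as might be exported from an LMS data store.
sample = io.StringIO(
    "name,videos_watched,quiz_score\n"
    "Alice,3,85\n"
    "Bob,0,40\n"
)

emails = {row["name"]: personalise(row) for row in csv.DictReader(sample)}
```

In OnTask itself the equivalent logic is configured through the tool's conditions and personalised text actions rather than written as code; the sketch is only meant to show the shape of the data-to-message mapping.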

Summary

Although a project risk had been raised at the outset, problems were nevertheless encountered with the open-source software that necessitated wider engagement and caused project delays. Furthermore, the non-availability of internal resource, due to assignment to competing projects, also set the project back against the original plan.

Over the duration of the project there was industrial action and, due to the coronavirus, a transition to working from home. Nine Change Requests were raised:

  • Two budget changes: from an initial 50 days to 74 days (27-05-19), then to 90 days (27-02-20).
  • Five milestone changes, as dates for delivery/project closure had to be repeatedly re-planned.
  • One priority elevation to highest, in an effort to secure resource to complete tasks; unfortunately with no positive result.
  • One removal of ease of user authentication from scope.

The original go-live date was not achieved for the reasons above; instead, a pilot was carried out in the TEST environment, with the build of a PROD environment re-planned and carried out at a later date.

The Project Manager would like to acknowledge the support and commitment of all project team members, specifically:

Name | Business Area | Role
David Thresher | Educational Design and Engagement | Senior Project Manager
Myles Blaney | Digital Learning Applications and Media | Senior Learning Technology Service Manager
Mark Lang | Development Services | Development Technology Manager
Hannah Johnstone | Development Services | Database and Systems Administrator
Heather Larnach | Technology Management | Production Manager
Ana Heyn | Technology Management | Production Manager
Chris Cord | Technology Management | Database and Systems Administrator

 

Objectives and Deliverables

Item | Description | Priority | Owner | Achieved?
O1 | Obtain approval from the Learning Analytics Review Group on use of the OnTask tool | Must | - | Yes
D1.1 | A Data Privacy Impact Assessment (DPIA) plus any other documentation identified as required to support obtaining the group's approval | Must | Business Lead | Yes
O2 | Deliver the OnTask tool for DLAS | Must | - | Yes
D2.1 | Development, testing and live environments of the OnTask tool | Must | Technical Lead | Yes
D2.2 | Test scenarios (potentially including load testing for sending bulk emails) to ensure that the process above can be delivered reliably | Must | Technical Lead | Yes
D2.3 | Deliver an appropriate authentication solution | Must | Technical Lead | No
O3 | Document and train IDs (and any other IS staff who will support academic teams in adopting OnTask) on how to incorporate OnTask into course and module design, and set up OnTask for DLAS programmes | Must | - | Yes
D3.1 | A process map showing the steps for incorporating OnTask into the design of a course, for setting up the tool for each course, and for delivery of learning analytics during each run of the course. This should refer to the stages in the overall workflow map for the design and delivery of a DLAS programme | Must | Business Lead | Yes
D3.2 | A user guide providing detail on the tasks described in the process map | Must | Business Lead | Yes
D3.3 | A training workshop for IDs and other relevant IS staff on the tasks described in the process map and user guide | Must | Business Lead | Yes
O4 | Identify and document the support processes for the pilot of the OnTask tool | Must | - | Yes
D4.1 | An Operational-Level Agreement (OLA) detailing how the pilot of the OnTask tool will be supported | Must | Applications Lead | Yes

 

Scope

No. | Description | Stayed in Scope?
1 | Set up development, testing and live environments of the OnTask pilot service, initially for the MicroMasters (Predictive Analytics) programme. | Yes
2 | Identify and test the process by which data on learner behaviour, collated in the DLAS data store (delivered by project DLAS012), will be accessed by the OnTask platform. | Yes
3 | Document and seek approval on use of OnTask from the Learning Analytics Review Group. | Yes
4 | Identify and document the processes for using OnTask within a course. | Yes
5 | This should focus both on how the use of OnTask is incorporated within the design of the course, and on the technical tasks required to set the tool up. | Yes
6 | Processes will be developed in the context of the Predictive Analytics pilot MicroMasters. | Yes
7 | Documentation will be aimed at Instructional Designers and other IS staff, to support the training delivered to them in the project. | Yes
8 | Provide training for Instructional Designers (IDs) and other Information Services (IS) staff who will support the academic teams to adopt OnTask. | Yes
9 | Investigate, identify and deliver an appropriate authentication solution. | Yes
10 | Identify and document the support processes for the pilot of the OnTask tool. | Yes
11 | Create an Operational-Level Agreement (OLA) detailing how the pilot of the OnTask tool will be supported. | Yes

Benefits

No. | Description | Achieved?
1 | Academic teams will be equipped to deliver coaching feedback efficiently, spending less time per student than if they were to provide feedback to each learner individually. | Yes
2 | A higher proportion of students in the DLAS pilots will persist and complete modules. | Yes
3 | The university (specifically the IDs) will gain knowledge and experience in the application of learning analytics tools to scaled distance learning programmes. | Yes
4 | The infrastructure will be in place for future phases of delivery of OnTask, to programmes outside of DLAS. | Yes

Success Criteria

No. | Description | Achieved?
1 | OnTask (and the associated DPIA) has been approved by the university's Learning Analytics Review Group. | Yes
2 | The OnTask environments have been delivered. | Yes
3 | The process for downloading data on learner behaviour from the DLAS data store and uploading it to OnTask has been clearly defined and documented. | Yes
4 | Testing scenarios that reflect the use of OnTask for DLAS programmes have been conducted, and there is a high level of confidence in the stability of the platform. | Yes
5 | The OnTask tool can be used by the DLAS pilot programmes. | Yes
6 | IDs (and any other relevant IS staff) have been trained in the process for setting up OnTask for DLAS programmes. | Yes
7 | Supporting documentation for the ID training has been created and signed off. | Yes
8 | The OnTask tool and the student data contained within it will be protected via an appropriate level of authentication. | Yes

Analysis of Resource Usage:

Staff Usage Estimate: 50 days

Staff Usage Actual: 104 days

Staff Resource Variance: +54 days (actual effort was 208% of the estimate)

 

Explanation for variance

There was a change of Project Manager during the early phase; the unplanned handover, and the need for the incoming manager to familiarise themselves with the project, contributed extra effort.

This was not a standard piece of work and should be viewed as a research and development project. The open-source software was unfamiliar to those involved, and the technical knowledge did not exist within the project team; tasks undertaken were very much part of a learning process. This lack of familiarity with the technology led to initial estimates being inaccurate.

Whilst turnaround on questions to the third party was generally reasonable, time-zone differences, and the need for other project team members to chase to ensure clarity of understanding, led to delays in resolving some issues.

Problems were encountered, and finding resolutions consumed more time than had been estimated. Solutions were eventually put in place that will stand the project in good stead for the future, e.g. a simplified and automated arrangement for the deployment of code.

Project resource was not always readily available from within the Development Technology team, due to competing demands from higher-priority projects. This also contributed to the variance and elongated some tasks, as re-familiarisation was often needed when moving from one piece of work to another. The extension of these tasks meant additional resource was required to maintain the project over a longer period of time.

Key Learning Points

Description | Recommendations | Impact
Absence and non-availability of key project personnel need to be communicated effectively, in order to re-plan appropriately and manage expectations. | A communication tool needs to be agreed and all stakeholders kept up to date. | Detrimental to project.
Updates from the Development resource were not always shared with the whole team, which made it difficult to gauge progress. | A communication tool needs to be agreed and all stakeholders kept up to date. | Detrimental to project.
Timesheets were regularly not submitted by the due date, sometimes for weeks on end, so accurate monitoring of project spend against budget was not always possible. | Line Manager/Head of Production to educate staff and take appropriate action to enforce good working practices. | Detrimental to project.

Outstanding Issues

For future consideration:

  • Ease of user authentication, whilst removed from scope, remains a 'should have' requirement.
  • The OnTask servers were set up in September 2019, a few months before the introduction of the new applications network (172.16.136.0/23), which denies outbound connections by default. It was agreed with Production Management that this work would not be progressed or charged to the project, but would be documented.

It was agreed with the business that the following risks could be closed but would be carried over into production as part of the business evaluation of the tool.

Risk Number | Description (Link)
2 | Application Uptake by Users
3 | Risk of slow response from Supplier
9 | edX data may not be reliable

 

 

Project Info

Project: Learning Analytics for DLAS
Code: DLAS010
Programme: Distance Learning at Scale (DLAS)
Management Office: ISG PMO
Project Manager: Kevin Hone
Project Sponsor: Karen Howie
Current Stage: Close
Status: Closed
Project Classification: Transform
Start Date: 01-Mar-2018
Planning Date: 16-May-2019
Delivery Date: 03-Dec-2019
Close Date: 01-May-2020
Overall Priority: Higher
Category: Discretionary

Documentation

Close