Prior to any changes, the Dev eugex_smart_courses_vw view returned 1647 records (as at 17th January, 2:30pm). After the change to join the new reference view into this interface view - so that it also returns whether the course instance uses a pass/fail marking scheme - an identical 1647 records were returned for the same courses. The interface view has therefore been enhanced to provide the extra data without changing its behaviour.
Prior to the changes, execution times for 5 runs of eugex_smart_courses_vw on Dev were 43.5, 66.3, 56.1, 40.7 and 43.4 seconds. After the change the execution times were 79.6, 48.9, 49.0, 45.0 and 46.3 seconds. While this does not constitute a full performance test, the execution times are comparable across 5 sequential runs. These tests were performed on 18th January at 9:10am.
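As a rough sanity check on the figures above, the mean run times before and after the change can be compared; a minimal sketch:

```python
from statistics import mean

# Execution times (seconds) for 5 sequential runs of eugex_smart_courses_vw on Dev
before = [43.5, 66.3, 56.1, 40.7, 43.4]
after = [79.6, 48.9, 49.0, 45.0, 46.3]

print(f"mean before: {mean(before):.1f}s")  # 50.0s
print(f"mean after:  {mean(after):.1f}s")   # 53.8s
```

The means differ by under 4 seconds, consistent with the "comparable" conclusion, though a single slow first run (79.6s, likely cold caches) skews the "after" figure upwards.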
Fuller performance testing would have to be conducted in an environment that developers do not have access to. It should also be noted that issues have already been raised regarding the performance (more accurately, a marked deterioration in performance) of the incumbent overnight process in production - so this project enhancement has to add to a process that is already failing.
SMART Overnight Process
Upload and processing packages were updated to bring across the extra marking scheme flag. The nightly schedules are disabled on Dev so the processes were run manually. Checked manually that the flag was filtering through correctly into the SMART courses table from EUGEX.
After running the import processes, the quality of the data in SMART Dev has degraded significantly, making further testing more difficult. It appears that Dev EUGEX has not been refreshed from Live, so the majority of Dev SMART enrolments (despite SMART itself having been refreshed) were cancelled when I ran developer testing of the overnight process. Subsequent tests will therefore require more data setup than expected.
SMART Course-level changes
Testing used courses PGHC11334 (2010-1_SV1_SEM1) and BUST09001 (2011-2_SS1_SEM2) on Dev. Due to the state of the Dev data these had to be manually configured to appear as pass/fail courses, and cancelled students were re-enrolled.
Checked that the "amend results" feature on the spreadsheets works - it displays a popup for the user to enter the appropriate result, and when the spreadsheet is refreshed that result displays in the results column.
Quick links have been added to the spreadsheets to allow users to quickly set pass or fail for students who have yet to receive a result. Both of these links work correctly and do not interfere with the ability to use the "amend results" feature.
The PDF and CSV downloads were checked - output matches what appears onscreen and the reports are generated in the same way that the other SMART reports are generated.
As per existing registry sheet functionality in SMART the registry sheet can only be displayed once all students on the course have been given a result. Once that happens the registry sheet screen loads correctly. As per existing registry sheets there is a link to add a "CAA" flag against a result of "fail" - this functionality tested ok.
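The gating rule above - the registry sheet appears only once every student on the course has a result - can be sketched as a simple check (the helper name and data shape are hypothetical, not the SMART implementation):

```python
def registry_sheet_available(results):
    """Registry sheet is shown only once every student on the course
    has been given a result. `results` maps student ID to a result
    code, with None meaning no result entered yet."""
    return all(r is not None for r in results.values())

# One student still without a result: sheet stays hidden
print(registry_sheet_available({"0940214": "P", "0924656": None}))  # False
print(registry_sheet_available({"0940214": "P", "0924656": "NF"}))  # True
```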
The registry sheet also has a set of PDF and CSV downloadable reports and also an XML file download for use with Registry's course result upload feature. The PDF and CSV reports downloaded successfully and reflect the onscreen report. XML generation correctly omits the mark element for a pass/fail course. The mappings used are:
- AA (Absent but CAA)
- AN (Absent)
- CA (Fail but CAA)
- WD (Withdrawn)
- NF (Fail)
- P (Pass)
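The code mappings above, and the rule that the mark element is omitted for pass/fail courses, can be sketched as follows (the XML element names are illustrative - the real schema belongs to Registry's upload feature):

```python
# Result-code mappings from the list above
RESULT_CODES = {
    "AA": "Absent but CAA",
    "AN": "Absent",
    "CA": "Fail but CAA",
    "WD": "Withdrawn",
    "NF": "Fail",
    "P":  "Pass",
}

def result_element(code, mark=None, pass_fail=False):
    """Build a hypothetical XML fragment for one student result.
    For a pass/fail course the mark element is omitted entirely."""
    if code not in RESULT_CODES:
        raise ValueError(f"unknown result code: {code}")
    xml = f"<result><code>{code}</code>"
    if not pass_fail and mark is not None:
        xml += f"<mark>{mark}</mark>"
    return xml + "</result>"
```

For example, `result_element("P", mark=65, pass_fail=True)` produces a fragment with no mark element, matching the behaviour verified above.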
SMART Programme-level changes
All testing was performed using programme UTBSTUD.
The main method of testing was alternate-route testing: the calculation routines were set up in an Excel spreadsheet, so that individual students could be fed into those calculations and cross-checked against what SMART itself calculated.
For the graduation report, student 0940214 was chosen. The default calculation of 3rd, 4th and honours % was correct.
The weightings feature was tested by altering the weightings from their default of 50:50 to 25:75 and then 75:25. In all cases the SMART calculations matched the Excel versions.
The course discounting feature was then tested by incrementally discounting more of the student's 4th year course results. In all cases the Excel calculations matched SMART.
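The weighted calculation being cross-checked can be expressed as follows. This mirrors the shape of the Excel check - a credit-weighted mean per year, combined by the configurable weighting, with discounted courses excluded - and is illustrative rather than the actual SMART routine:

```python
def honours_percentage(year3_marks, year4_marks, w3=50, w4=50, discounted=()):
    """Weighted honours % across 3rd and 4th year results.
    Each marks argument maps course code -> (mark %, credits);
    `discounted` course codes are excluded from the 4th-year mean.
    Default weighting is the 50:50 described above."""
    def credit_weighted_mean(marks, skip=()):
        kept = [(m, c) for code, (m, c) in marks.items() if code not in skip]
        total_credits = sum(c for _, c in kept)
        return sum(m * c for m, c in kept) / total_credits

    y3 = credit_weighted_mean(year3_marks)
    y4 = credit_weighted_mean(year4_marks, skip=discounted)
    return (w3 * y3 + w4 * y4) / (w3 + w4)

# Illustrative data: two 20-credit courses per year
y3m = {"A": (60, 20), "B": (70, 20)}   # 3rd year mean: 65.0
y4m = {"C": (50, 20), "D": (70, 20)}   # 4th year mean: 60.0

print(honours_percentage(y3m, y4m))                     # 62.5 (50:50)
print(honours_percentage(y3m, y4m, w3=75, w4=25))       # 63.75 (75:25)
print(honours_percentage(y3m, y4m, discounted=("C",)))  # 67.5 (C discounted)
```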
For the progression report, student 0924656 was used. This was a much simpler test as the progression sheets offer no ability to amend data (none was requested and none of the UG progression sheets for other schools currently offer any such features). The SMART calculation matched the Excel version.
In all of the above tests for both the progression and classification reports the relevant visual flags appeared correctly on the onscreen version of the reports, and the pdf/csv reports were consistent with the onscreen version.
Progression reports have no amendment features so these tests apply only to the classification reports. The means of testing was manual inspection of the SMART report.
The course results and programme statuses were populated such that the spreadsheets would generate all of the possible classification results. The class of degree reported correctly based on the common marking scheme. All "special" results such as "failed more than 40 credits" were reported correctly based on their intended trigger conditions. The borderline flag also operated correctly.
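The classification behaviour being tested can be sketched like this. The band thresholds (70/60/50/40) are typical common marking scheme values and the 40-credit trigger comes from the special result above, but the borderline width and all structural details are assumptions, not the SMART implementation:

```python
def classify(overall_pct, failed_credits, borderline_width=2):
    """Degree class from overall %, with the 'failed more than 40 credits'
    special result and a borderline flag. Returns (result, borderline?)."""
    if failed_credits > 40:
        return "Failed more than 40 credits", False
    bands = [(70, "First"), (60, "Upper Second"), (50, "Lower Second"), (40, "Third")]
    for threshold, label in bands:
        if overall_pct >= threshold:
            # Borderline when just below the next band up (width assumed)
            higher = [t for t, _ in bands if t > threshold]
            borderline = bool(higher) and overall_pct >= min(higher) - borderline_width
            return label, borderline
    return "Fail", False
```

For example, `classify(68, 0)` returns an Upper Second with the borderline flag set, while `classify(55, 60)` triggers the special result regardless of the overall %.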
If a result was manually overridden then that result was displayed correctly on the report, and the feature to revert to the calculated value also worked correctly.
All visual flags operated correctly and the new "amendments" column correctly reported what had been changed for the student.
Flagging year abroad
On the progression sheets students are correctly flagged as "YA" if they are enrolled on a course worth 120 credits.
On the graduation sheets the students' 3rd year is correctly flagged as "YA" too. In such circumstances the honours credit profiles and overall % are based entirely on the 4th year, i.e. are the same as the 4th year credit profile and %.
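The two rules above - the 120-credit trigger for the YA flag, and basing the overall % entirely on the 4th year when 3rd year is abroad - can be sketched as follows (function names are hypothetical, and the 50:50 non-YA weighting is the default described earlier):

```python
def year_abroad_flag(course_credits):
    """Flag a year as 'YA' when the student is enrolled on a
    course worth the full 120 credits."""
    return "YA" if 120 in course_credits else None

def graduation_percentage(year3_pct, year4_pct, year3_is_ya):
    """When 3rd year is a year abroad the overall % is based entirely
    on the 4th year; otherwise the default 50:50 weighting applies."""
    if year3_is_ya:
        return year4_pct
    return (year3_pct + year4_pct) / 2

print(year_abroad_flag([120]))                 # YA
print(graduation_percentage(60, 70, True))     # 70 - same as the 4th year %
```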
Incorporation of pass/fail courses
If a course is flagged as being a pass/fail course then the appropriate, non-numeric course results are displayed for that course. The results displayed on the programme reports represent what was entered into the course-level spreadsheets - this was verified by modifying the course-level results and manually checking that the changes were picked up by the programme-level reports.
The credits associated with pass/fail courses should not be taken into account by the programme calculations. This was verified by comparing credit profiles and year % when the credits associated with a pass/fail course were modified from zero to 20 credits. This change of credits did not affect the calculations.
If a student has undertaken a credit-bearing pass/fail course then their result is defaulted to "unknown", as per guidance given to the Schools.
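The exclusion rule verified above can be sketched as a credit-weighted year average that ignores pass/fail courses; changing a pass/fail course's credits then has no effect on the result (data shape is illustrative, not the SMART schema):

```python
def year_average(results):
    """Credit-weighted year % excluding pass/fail courses, which carry
    non-numeric results and must not influence the calculation.
    `results`: list of (mark-or-code, credits, is_pass_fail)."""
    graded = [(m, c) for m, c, pf in results if not pf]
    total_credits = sum(c for _, c in graded)
    return sum(m * c for m, c in graded) / total_credits

# Moving a pass/fail course from 0 to 20 credits leaves the average unchanged
base = [(60, 20, False), (70, 20, False)]
assert year_average(base + [("P", 0, True)]) == year_average(base + [("P", 20, True)])
print(year_average(base + [("P", 20, True)]))  # 65.0
```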
Web Style Standards and Interface Design