Fall 2017 Data Bug
Posted January 26th, 2018 by Mirko Vucicevich


As of Monday, January 22, the Science Computing team became aware of a bug in the Fall 2017 data collected using evaluate. In a small percentage of cases, a student submission was recorded two or more times, altering the final numbers in the computed results.

For those concerned, the bug which caused this was purely technical: the duplicate submissions were not caused by a security breach, and could not have been created intentionally by students.

In Fall 2017, there were over 77,000 student submissions in evaluate. By our calculations, just under 1% of these were duplicate entries. Because evaluate is strictly anonymous, it is impossible to remove all of the duplicates with 100% accuracy.

Fortunately for us, over 50% of submissions to evaluate contain student-written comments. The likelihood of two student submissions having identical responses and identical written comments is exceedingly low, allowing us to immediately eliminate duplicates containing text. As a sanity check, we ran a uniqueness test on commented submissions from previous terms and found that such collisions occurred at a rate of less than 0.0001%.

Using these commented duplicates as a guide, it was immediately clear that duplicate submissions occurred within a small time difference of each other: every commented duplicate occurred within 0.5 seconds of its original.

Comment-less duplicates naturally exist in evaluate data even without the bug: a course will often have several students who select exactly the same answers -- often all "Excellent" or all "Unacceptable". We made a best guess at removing these duplicates by targeting only instances which occurred within the same 0.5-second window observed for the commented duplicates.
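The two-pass heuristic described above can be sketched in a few lines of Python. The record layout, field names, and sample data below are hypothetical (evaluate's actual pipeline is not public); only the 0.5-second window and the comment/no-comment distinction come from the description above.

```python
from datetime import datetime, timedelta

# Hypothetical records: (timestamp, tuple of selected answers, optional comment).
submissions = [
    (datetime(2017, 12, 1, 10, 0, 0, 0), ("Excellent",) * 5, "Great course!"),
    (datetime(2017, 12, 1, 10, 0, 0, 300000), ("Excellent",) * 5, "Great course!"),
    (datetime(2017, 12, 1, 10, 5, 0, 0), ("Excellent",) * 5, None),
    (datetime(2017, 12, 1, 10, 5, 0, 200000), ("Excellent",) * 5, None),
    (datetime(2017, 12, 1, 11, 0, 0, 0), ("Excellent",) * 5, None),
]

WINDOW = timedelta(seconds=0.5)  # observed gap between commented duplicates

def flag_duplicates(subs):
    """Return the indices of submissions flagged as probable duplicates.

    Commented submissions are flagged when an adjacent-in-time submission
    has identical answers AND identical comment text; comment-less ones
    are flagged only when the identical match also falls inside WINDOW.
    """
    ordered = sorted(enumerate(subs), key=lambda item: item[1][0])
    flagged = set()
    for (_, (t1, a1, c1)), (j, (t2, a2, c2)) in zip(ordered, ordered[1:]):
        if a1 != a2:
            continue
        if c1 is not None and c1 == c2:
            flagged.add(j)  # identical comment text: near-certain duplicate
        elif c1 is None and c2 is None and t2 - t1 < WINDOW:
            flagged.add(j)  # comment-less match within the time window
    return flagged

print(sorted(flag_duplicates(submissions)))  # → [1, 3]
```

Comparing only time-adjacent pairs is enough here because, as noted above, the bug produced its duplicates within half a second of the original submission; the final record (an identical comment-less response an hour later) is correctly left alone.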

Ultimately, our process flags 656 responses as duplicates, affecting 360 of 2,169 courses. On average, 1.82 duplicates were removed per affected course, causing an average change of +0.0027 in "Prof Score" results. The majority of affected courses had only one duplicate entry to remove.
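For readers who want to check the headline figures against each other, the stated numbers are internally consistent:

```python
total_submissions = 77_000   # Fall 2017 submissions (approximate)
flagged = 656                # responses flagged as duplicates
affected_courses = 360       # of 2,169 courses total

share = flagged / total_submissions        # ≈ 0.0085, i.e. just under 1%
per_course = flagged / affected_courses    # ≈ 1.82 duplicates per affected course

print(f"{share:.2%} of submissions flagged")
print(f"{per_course:.2f} duplicates per affected course")
```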

We are aware that some duplicate responses will not be caught by this method, and that a handful of legitimate comment-less responses will be incorrectly marked as duplicates; however, this process has been approved by Dean's Council to ensure that the data is sufficiently fixed in a timely manner.

To help professors through this issue, we've implemented a new feature in evaluate allowing instructors and administrators to view results with or without flagged responses included. When viewing course results, any courses with flagged responses will have the following notice at the beginning of the page:

[Screenshot: what the new flag-detection notice looks like]

By clicking the link in the alert, users are brought to a version of the course results with flagged responses included, both in the 'comments' section and in the calculated averages and metrics. While viewing a course in this mode, the links for downloading CSVs and raw data will also include the flagged responses.

Likewise, administrators can now choose to include flagged responses in their results listing page, and export data with or without them.

Please note: If you do not see the alert box at the top of your course, your results were not affected by this bug.

We at Science Computing deeply apologize for this issue and any inconvenience it has caused or will cause to users of evaluate. We are taking steps to ensure this cannot happen again, and hope that users of evaluate continue to have a positive experience.

If you have any questions, comments, or concerns, feel free to contact Mirko Vucicevich.


evaluate is developed by Science Computing at the University of Waterloo