The study program 'Network and System Administration' teaches students computer administration and networking. Most courses in this program used written exams on paper, which became increasingly tedious: students are no longer used to writing texts by hand, resulting in incomprehensible handwriting and a demanding grading process.
The introduction of the learning management system (LMS) 'Canvas' in 2018 provided software suitable for electronic examinations: its quiz component allows teachers to use both closed-ended (automatically graded, multiple-choice) and open-ended (free-text) questions in an examination.
Written exams were migrated from paper to Canvas. After an initial learning-and-making-mistakes phase, feedback from teachers and students was positive. Time previously spent decoding handwriting was now spent constructing questions (Ćwil, 2019, Table I). Students benefited from getting old exams as examples in Canvas and from taking exams in a familiar environment.
In 2024, a central solution using dedicated examination software ('Inspera') was acquired.
While both tools are suitable for electronic examinations, there are differences. Canvas is an LMS in which the quiz tool is only one of several components; Inspera is purpose-made for examinations and includes features such as more sophisticated automatic grading and commenting on free-text answers. Exam grades in Inspera are automatically transmitted to Ladok; in Canvas, this is a manual process.
For selected courses, we used Inspera instead of Canvas. Attempting to migrate quizzes from Canvas to Inspera failed despite both tools claiming to support QTI[1], as they implement incompatible QTI versions.
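Such a mismatch can be spotted before an import is attempted by inspecting the root element of the exported XML. A minimal sketch in Python, assuming the commonly used QTI root elements (`questestinterop` for QTI 1.x packages, `assessmentItem`/`assessmentTest` for QTI 2.x; real exports should be checked against their actual namespace declarations):

```python
import xml.etree.ElementTree as ET

# Root element names differ between major QTI versions (illustrative
# mapping; verify against the namespaces in real exports).
QTI_ROOTS = {
    "questestinterop": "QTI 1.x",
    "assessmentItem": "QTI 2.x",
    "assessmentTest": "QTI 2.x",
}

def detect_qti_version(xml_text: str) -> str:
    """Return a rough QTI version label based on the root element."""
    root = ET.fromstring(xml_text)
    # Strip a namespace prefix such as '{http://...}assessmentItem'.
    tag = root.tag.split("}")[-1]
    return QTI_ROOTS.get(tag, "unknown")

print(detect_qti_version("<questestinterop></questestinterop>"))  # QTI 1.x
```

A check like this would at least have turned the silent import failure into an early, explicit version diagnosis.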
Inspera does not support the role of 'examiner', who reviews other teachers' grading. Adding the examiner as another teacher did not help, as one teacher cannot see another teacher's grading; the review had to be conducted with the examiner sitting in front of the teacher's computer.
After the results were published, a student noticed that they had received a negative number of points for a multiple-choice question: in multiple-choice questions, points are added for each correct choice and subtracted for each incorrect choice to discourage random selection. In Inspera, the minimum score must be set explicitly for every question, which is easy to miss. The automatic grading could not be re-run, so we had to review all submissions, recalculate grades manually, and push the updates to Ladok.
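The scoring rule described above can be sketched as follows; the function name, parameters, and point values are illustrative, not Inspera's actual implementation:

```python
def mc_score(choices, key, per_correct=1.0, per_wrong=1.0, minimum=None):
    """Score a multiple-choice question: add points for each correct
    selection, subtract points for each incorrect one. Without an
    explicit minimum, the total can become negative."""
    score = 0.0
    for c in choices:
        score += per_correct if c in key else -per_wrong
    if minimum is not None:
        score = max(score, minimum)
    return score

# A student picking one correct (A) and two wrong (C, D) alternatives:
print(mc_score({"A", "C", "D"}, key={"A", "B"}))             # -1.0
print(mc_score({"A", "C", "D"}, key={"A", "B"}, minimum=0))  # 0.0
```

The sketch makes the pitfall concrete: the floor is a separate, optional parameter, so forgetting it on a single question silently allows negative totals.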
Finally, we noticed a number of usability issues:
- For questions with gaps to fill in, minor but valid variations of an answer cannot be described using regular expressions.
- In IT-related courses, source code or terminal output is shown in monospaced text, often colorized for readability. Asked to show text verbatim, Inspera uses a monospaced font but renders it in red, confusing students.
- When designing multiple-choice questions, checkboxes are drawn empty, with no indication of which alternatives are checked; to see the checkmarks, one has to switch to a different editing mode.
- Inspera makes inefficient use of available screen space: some UI elements cut off text despite space being available, and the exam overview shows a large, mostly empty box for each question.
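The first issue above, accepting minor but valid variations of a gap-fill answer, is straightforward to express with regular expressions. A sketch in Python; the command and the set of accepted variations are illustrative examples, not taken from an actual exam:

```python
import re

# Accept 'ls -la', 'ls -al', 'ls -l -a', or 'ls -a -l' as the same
# answer, ignoring case and surrounding whitespace (illustrative).
PATTERN = re.compile(r"ls\s+(-la|-al|-l\s+-a|-a\s+-l)", re.IGNORECASE)

def accept(answer: str) -> bool:
    """Return True if the answer matches an allowed variation."""
    return PATTERN.fullmatch(answer.strip()) is not None

print(accept("ls -la"))    # True
print(accept("LS  -al"))   # True
print(accept("ls"))        # False
```

A single pattern like this replaces an enumeration of every spelling of the same answer, which is what a grader must otherwise maintain by hand.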
[1] https://www.1edtech.org/standards/qti
References
Ćwil, M. (2019). Teacher's Attitudes towards Electronic Examination – a Qualitative Perspective. International Journal of Learning and Teaching, 5(1), 77–82. https://doi.org/10.18178/ijlt.5.1.77-82
Skövde: Högskolan i Skövde, 2025.
DAL25, Det akademiska lärarskapet, Examination and Assessment, Högskolan i Skövde, 25 April 2025