
Rogo – An e-Assessment Management System



Rogo is a fully audited e-assessment management system run through a web browser on – when needed for summative assessments – a secure desktop. It covers the full examination lifecycle from the first draft of questions right through to reporting the final marks. Besides candidates, examiners and administrators, Rogo takes account of the less-considered external examiners and invigilators. When desired, it is possible to show how well a student has performed in an examination in terms of the learning outcomes associated with the exam’s questions without revealing the actual answers.

Major Objective

The objective of Rogo is to produce a complete assessment system that is easy to use for all personnel involved, secure when required and able to offer a wide range of question types suitable for a comprehensive university. Further, there should be a link to our curriculum mapping system.


Rogo can be described most easily in three parts: before the exam, the exam itself and after the exam. Those setting an exam constitute a “team” within Rogo; the team owns the exam and not one individual. There is therefore no disruption as team members are rotated within their department to teach different topics. The examiners set, store and confer about questions in Rogo itself. An audit trail of changes is kept. When appropriate, the examiners can consult the external examiner via Rogo. The main types of assessment question are multiple choice, image-based, text and calculation. If desired, students can be prevented from returning to questions earlier in a paper (i.e. a paper can be “unidirectional” instead of “bidirectional”).

Rogo examinations can be run only on machines registered with Rogo. These machines have a locked-down desktop. On the other hand, because examinations originate from a single source, the same examination can be run simultaneously in different rooms and even in different countries. Once an exam has begun, its questions are locked.

With the exception of those that require the composition of text, all questions are marked automatically. Several forms of post-exam analysis are available including the “frequency and discrimination analysis”. This can reveal instances where students’ answers suggest a question has been unsuitable.
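The “frequency and discrimination analysis” described above corresponds to standard item analysis in classical test theory: how often each question is answered correctly, and how well it separates stronger from weaker candidates. The following is a minimal sketch of one common approach, the upper–lower group discrimination index; the function name, the 27% grouping and the data layout are illustrative assumptions, not Rogo’s actual implementation.

```python
def item_analysis(scores, item_correct, group_frac=0.27):
    """Classical item analysis for one question (illustrative sketch).

    scores       -- total exam score per student
    item_correct -- 1/0 per student for this question, same order as scores
    group_frac   -- fraction of students in the upper and lower groups
                    (0.27 is the conventional choice)
    Returns (facility, discrimination):
      facility       -- proportion of all students answering correctly
      discrimination -- upper-group minus lower-group proportion correct,
                        in [-1, 1]; values near 0 or negative suggest the
                        question may be unsuitable
    """
    n = len(scores)
    # Rank students by total exam score, best first.
    ranked = sorted(range(n), key=lambda i: scores[i], reverse=True)
    k = max(1, round(n * group_frac))
    upper = [item_correct[i] for i in ranked[:k]]
    lower = [item_correct[i] for i in ranked[-k:]]
    facility = sum(item_correct) / n
    discrimination = sum(upper) / k - sum(lower) / k
    return facility, discrimination
```

For example, a question answered correctly only by the highest-scoring candidates yields a discrimination index near 1, while one answered correctly mainly by low scorers yields a negative index, flagging it for review.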

Examiners can run reports to list students’ marks, results of the frequency and discrimination analysis and the learning outcomes analysis; individual students can see how well they understand each learning outcome covered by an exam.

Major Outcomes

  • In 2016–2017, 600 exams were taken by 10,000 unique students.
  • Much time and frustration saved in liaison among academic and administrative staff in setting and marking questions. This includes external examiners.
  • Questions marked efficiently and uniformly.
  • Provision of formative assessment for practice – this is a student support mechanism.
  • An invigilators’ dashboard eases several practical aspects of running an examination.

Lessons about Innovation

  • When appropriately constructed, multiple choice and similar questions are as testing as longer questions.
  • The different forms of post-exam analysis available make Rogo examinations a strong but fair discriminating tool.
  • Rogo has become an open source project – others want to use it.

How it Challenges Conventional Thinking

Conventional examinations are set on paper. Rogo automates examination processes that previously could only be done manually or sometimes not at all. The link to learning objectives provides a strong connection between exam and syllabus. Different forms of standards setting offer means of equalizing standards across successive years.

Questions from past exam papers can be made available and then marked automatically to give students exam practice.

Name of Author and Main Contact Person

Dr John Horton
