ALT Digital Assessments SIG - Managing Digital Assessments: Administration and Implementation


We invite you to the “Managing Digital Assessments: Administration and Implementation” webinar from the Digital Assessment Special Interest Group on 20th May 2025, from 11:00 am until 12:30 pm. Seven speakers will share the conversations taking place at their institutions around the challenges and opportunities of managing digital assessments. Our speakers are:

  1. Denny Roberts, Learning & Technology Enhancement Service Manager, and Lisa Scott of Heriot-Watt University, discussing “Transforming Digital Examinations: A Collaborative Approach at Heriot-Watt University”
  2. Laurence Horstman, Learning Technologist, and Sumayyah Islam, Digital Education Team member, at the London School of Economics presenting “Piloting Digital MCQ Examinations at LSE”
  3. Lee Parkin, Teaching Associate (Psychology) at The University of Nottingham presenting “Automated Mark Agreements for 3rd Year Research Projects”
  4. Zhiqiong Chen, Assistant Professor and Yihua Huang, Digital Learning Advisor from The University of Warwick sharing “Transitioning Language Exams to Online Delivery in Higher Education”

For further details please see the session abstracts at the end of this page.

Following the presentations, you are invited to stay and take part in an open discussion on this topic, where you can share your thoughts, concerns and progress in the area of digital assessment.

This webinar is the second in a series that will explore different parts of the assessment process, including:

  • Marking and feedback of digital assessments
  • Digital exams
  • Student voice and digital assessments

Join the mailing list to be kept up to date on this and other SIG events.

Session abstracts

Transforming Digital Examinations: A Collaborative Approach at Heriot-Watt University

The transition to digital examinations at Heriot-Watt University, particularly during the return to campus, presented significant challenges. These were further exacerbated by centralisation changes and knowledge gaps among new staff who had not previously been on campus. Initially, the support system for digital examinations was embryonic and fragmented, with roles, escalation routes and responsibilities unclear. The complexity of supporting various digital examination formats, including Canvas Quizzes and Mobius, whilst implementing newer technologies such as Gradescope, STACK and Respondus LockDown Browser, across five global campuses and additional Transnational Education Partnerships, necessitated a comprehensive and coordinated approach.
To tackle these challenges, the Virtual Exam Centre (VEC) was founded: an international, interdisciplinary group comprising over 43 members from diverse departments and roles, including Timetabling, the Examinations Office, Academic Support, Learning Technologists, VLE System Administrators, Desktop Services, Network Specialists, Wellbeing, IS Firstline Support, and Campus Leads. This dynamic team leveraged agile methodologies, iterative process development, reflective practices, and technology to create a cohesive and efficient support system for digital examinations.
Our approach centres on fostering collaboration and communication among team members, ensuring that all stakeholders are aligned and informed, and removing single points of failure. By adopting agile practices, the team have been able to respond swiftly to emerging issues and continuously improve processes. Iterative development has allowed strategies to be refined based on feedback and real-world experience, while reflective practices have ensured that lessons learned from each examination cycle are applied to future iterations.
Technology played a pivotal role in bringing together the geographically dispersed team and managing digital examinations. The team used various digital assessment platforms, such as Canvas Quizzes, Mobius, STACK, Gradescope, and Respondus LockDown Browser, to streamline the examination process, enhance academic integrity, and improve accessibility. Additionally, digital collaboration platforms helped to facilitate seamless communication and coordination among team members across global campuses, ensuring a unified approach.
The formation of the VEC has led to significant improvements in the delivery and support of digital examinations at Heriot-Watt University. The resulting synergy has significantly enhanced the overall examination experience for both staff and students, reducing risks and ensuring a smooth and reliable assessment process. These efforts have been instrumental in navigating the complexities of digital examinations in a global educational environment, and have produced a technology stack and support structure that can now adapt to emerging technologies.
The presentation will detail the journey of implementing digital examinations at Heriot-Watt University, highlighting the challenges faced, the innovative strategies employed and the successes achieved. It will provide insights into the formation and operation of the VEC, showcasing how interdisciplinary collaboration, agile methodologies and technology can drive positive change in educational support systems.

Piloting Digital MCQ Examinations at LSE

The ‘e-Exams’ MCQ pilot represents a step forward in the digital transformation of assessment at LSE. My presentation shares insights from the pilot, which runs from July 2024 to May 2025, and discusses the benefits and challenges of implementing MCQ exams within LSE’s ‘Digiexam’ exam platform. Despite successes using Digiexam as a ‘digital answer booklet’ to deliver essay-based exams, the potential for digitising MCQ exams with automatic marking was previously underexplored at LSE. The pilot sought to build on existing MCQ features, allowing students to view questions and answer alternatives directly on-screen.

Key Findings
• Student Feedback: positive reception, with 90% of surveyed students finding it extremely easy to record their answers in Digiexam.
• Time Savings: a notable reduction to date in academic and administrative time spent on marking and processing feedback.
• Challenges: translating more complex marking rules into Digiexam.

The e-Exams MCQ pilot has so far demonstrated the potential for Digiexam to deliver automatically graded exams, though with notable challenges in translating complex grading rules and maintaining the student exam experience.
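
To give a concrete sense of what a “complex marking rule” can look like, the sketch below (in Python, purely illustrative; the function, options and weighting are assumptions and do not represent Digiexam’s scoring engine or LSE’s actual rules) shows a partial-credit scheme for a multiple-select question, the kind of logic that can be difficult to reproduce in a platform that only offers all-or-nothing marking.

```python
# Hypothetical partial-credit rule for a multiple-select MCQ (illustration only;
# not Digiexam's scoring logic or LSE's marking scheme).

def partial_credit(correct: set[str], selected: set[str], max_mark: float = 1.0) -> float:
    """Award a fraction of max_mark per correct option chosen, deduct the same
    fraction per incorrect option chosen, and never go below zero."""
    if not correct:
        return 0.0
    per_option = max_mark / len(correct)
    score = per_option * (len(selected & correct) - len(selected - correct))
    return max(0.0, round(score, 2))

# Options B and D are correct. Selecting B and C earns nothing (C cancels B);
# selecting both correct options earns the full mark.
print(partial_credit({"B", "D"}, {"B", "C"}))  # 0.0
print(partial_credit({"B", "D"}, {"B", "D"}))  # 1.0
```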


Automated Mark Agreements for 3rd Year Research Projects

Within the School of Psychology, we typically have 250-400 students per year completing research projects (and we are unlikely to be going back to 250). This creates a wealth of administrative responsibilities around the 3rd year research portfolio. With this substantial piece of work having numerous components, such as interim research proposals, live presentations, and a final written portfolio consisting of three sections that needs marking by two independent academics, the need for automation was clear. With RAA previously devoting four administrators eight hours each to the administrative load of this module (after academics had completed a significant amount of admin as well), and this year all of the admin responsibility being pushed back to the department, a solution was clearly needed to replace many hours of manual work with an automated system.

This webinar will focus on how automation software such as Power Automate can be integrated with multiple processes to reduce the administration burden and speed up the marking process, and how these processes can potentially be shared across other modules and faculties to ease marking and feedback around digital assessments.
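
As a simplified illustration of the kind of logic such an automation can encode (a minimal Python sketch, not the actual Power Automate flow; the 5-mark tolerance, column names and averaging rule are assumptions made for the example), the snippet below reconciles the marks of two independent markers, agreeing an average automatically when the marks are close and flagging larger discrepancies for moderation.

```python
# Minimal sketch of an automated mark-agreement step (illustrative only;
# the tolerance, field names and workflow below are assumptions, not the
# actual Power Automate process described in the talk).

TOLERANCE = 5  # assumed maximum acceptable gap between the two markers

def agree_marks(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split submissions into automatically agreed marks and flagged cases."""
    agreed, flagged = [], []
    for r in records:
        gap = abs(r["marker_1"] - r["marker_2"])
        if gap <= TOLERANCE:
            agreed.append({**r, "agreed_mark": round((r["marker_1"] + r["marker_2"]) / 2)})
        else:
            flagged.append({**r, "gap": gap})  # needs moderation or discussion
    return agreed, flagged

submissions = [
    {"student": "A123", "marker_1": 68, "marker_2": 72},
    {"student": "B456", "marker_1": 55, "marker_2": 70},
]
agreed, flagged = agree_marks(submissions)
print(agreed)   # A123 agreed at 70
print(flagged)  # B456 flagged (gap of 15)
```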

Transitioning Language Exams to Online Delivery in Higher Education

This presentation reviews the end-to-end transition of traditional paper-based language assessments to fully digital formats within a higher education language course. It focuses on administrative planning, implementation, and lessons learned, with an emphasis on improving efficiency, accessibility, and the student experience - while maintaining academic integrity.
We began introducing digital assessments in our language programs prior to the COVID-19 lockdown. The sudden shift to remote learning accelerated this process, resulting in a rapid transition to fully online delivery. The motivation behind this shift was both pedagogical and operational: to streamline marking through automation, support a wider range of accessibility needs (e.g., screen readers and adjustable audio controls), and improve overall efficiency in exam delivery.
To support the transition, a multi-phase implementation strategy was employed, beginning with collaboration between academic teams, assessment advisors, and technical teams to ensure platform suitability and assessment integrity. Key activities included staff training, audio digitisation, and the design of automated marking schemes with flexible scoring options.
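
As a hypothetical example of what a “flexible scoring option” might look like for a short-answer language item (a sketch only; the function names, normalisation choices and accepted variants are assumptions, not the marking scheme used on the course), the snippet below accepts several answer variants and ignores case and accents before awarding the mark.

```python
# Hypothetical flexible marking rule for a short-answer language item:
# several accepted variants, with case and accent normalisation.
# Illustration only, not the scheme used on the course described above.

import unicodedata

def normalise(text: str) -> str:
    """Lower-case the answer and strip accents so 'Café' matches 'cafe'."""
    decomposed = unicodedata.normalize("NFKD", text.strip().lower())
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

def mark_answer(student_answer: str, accepted_variants: list[str], mark: int = 2) -> int:
    """Award full marks if the normalised answer matches any accepted variant."""
    answer = normalise(student_answer)
    return mark if any(answer == normalise(v) for v in accepted_variants) else 0

print(mark_answer("Café ", ["cafe", "le café"]))       # 2
print(mark_answer("restaurant", ["cafe", "le café"]))  # 0
```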
To minimize risks, piloting began with tutor groups, followed by staged testing with small student cohorts. A compulsory mock test was introduced before the assessments to familiarize students with the system. Full-scale implementation included real-time technical support and contingency plans during exam periods.
Challenges encountered included ensuring consistent audio playback across devices, managing time controls and navigation settings, and supporting staff with digital skills training. Clear communication with students about expectations and technical requirements was also critical.
The outcomes have been highly encouraging. Students had a smooth experience when provided with adequate training and practice opportunities, and academic staff saw reductions in marking time and administrative processing. The digital format also allowed for more creative, flexible question design and improved accessibility options.
Looking ahead, this experience points to promising directions such as the integration of AI-assisted question design and marking, data-driven insights through analytics, and ongoing refinement of inclusive digital assessment practices.
This presentation offers practical insights and forward-thinking strategies for institutions aiming to transition language or discipline-based assessments to online platforms. It highlights the importance of cross-team collaboration, phased implementation, and stable support structures to ensure a successful and sustainable digital transformation.

When
20 May 2025 from 11:00 AM to 12:30 PM
Location
Online
United Kingdom