ARLT SIG presents 'In Conversation' with Matthew Johnson.
Date: Tuesday 8th April 2025
Time: 4pm-5:30pm GMT (to be inclusive of our overseas members)
Format: Online Event
This online event is hosted by the Anti-racism and Learning Technology Special Interest Group (ARLT SIG) as part of a series on ‘Tackling racism in the Education and EdTech sector’. Matthew Johnson, the CEO of Race on the Agenda, one of Britain's leading anti-racist change drivers, joins us to explore how we can tackle systemic racism in the Education and EdTech sector. We explore the small and large changes required to ensure racially equitable and just technology-enabled education. Questions and comments from the audience will then be welcomed.
About the Speaker
Matthew Johnson is an experienced policy and research professional who has worked at both the grassroots (community and third sector) and structural levels (governmental and multilateral). He has gained extensive experience working with NGOs, government and academic institutions both in the UK and internationally. This includes supporting capacity development in parastatal organisations in areas of monitoring and evaluation; while advising on innovative programmes in partnership with multilateral institutions such as UNICEF.
Written by Julian Hopkins, University of Glasgow
With the rapid rise of generative AI and growing student cohorts, universities are under pressure to rethink traditional assessment practices. How can we ensure that assessments remain meaningful, equitable, and aligned with the skills graduates truly need? This blog post explores the increasing value of peer assessment as a scalable, authentic approach to enhancing student engagement and supporting the development of graduate attributes across disciplines.
Rationale for change

Responding to these challenges requires us to reconsider not just how we assess, but why. Simon’s important reminder that “Learning results… only from what the student does and thinks” (in Ambrose et al., 2010: 1; emphasis added) should be at the centre of the shift towards active, student-centred learning, where the focus moves from assessment as a final product towards “assessment for/as learning” (Stančić, 2021: 852, original emphasis). This also aligns with the increased focus on authentic assessment: tasks that mirror professional practice and develop transferable skills. These authentic assessments help learners develop graduate attributes such as time management, collaboration, and communication. In other words, redesigning assessment means that students can use and understand assessment as a means of self-improvement, rather than as a tick-box exercise to obtain a certificate.
Why peer assessment matters

One approach that aligns strongly with these aims and offers both pedagogical and practical benefits is peer assessment.
Peer assessment involves students giving structured feedback on each other’s work using defined criteria, often with the aim of enhancing learning, reflection, and collaboration. The literature offers strong evidence that peer assessment enhances meaningful assessment and supports the development of graduate attributes (e.g. Serrano-Aguilera et al., 2021: 2).
Through peer assessment, students improve their subject knowledge by reviewing their peers’ responses to the same tasks, preparing and giving feedback, and reflecting on the feedback they receive (Reddy et al., 2021). Students also communicate more effectively with their peers, helping to build a community of practice (Reddy et al., 2021: 825, 833).
To better understand the varied practices that fall under peer assessment, it is useful to distinguish between its different forms and purposes. Helden et al.’s typology (2023: 22953) groups three types of peer-based assessment activity under the umbrella term of peer assessment:
Peer review: Students review other students’ work and provide formative feedback.
Peer grading: Students provide grades on other students’ work (in a summative or formative context).
Peer evaluation: Students evaluate each other’s contributions to common group work.

Challenges to effective peer assessment
There are potential cognitive, affective and behavioural challenges to effective teamwork and peer assessment (Sridharan et al., 2023).
Formative exercises carry less risk, and therefore less potential for student dissatisfaction arising from a feeling of having to ‘do the lecturer’s work’, distrust of other students’ capacity for accurate and unbiased evaluation, or grade-maximising strategies that disregard quality (Amendola and Miceli, 2018; Helden et al., 2023; Stančić, 2021).
Summative exercises have the potential to reduce instructor workload, and there is evidence that, given sufficient training, students can grade work in a reliable and accurate manner (e.g. Serrano-Aguilera et al., 2021: 18).
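The claim that trained students can grade reliably is usually checked by comparing peer marks against tutor marks for the same pieces of work. The Python sketch below is an illustrative assumption, not a method from this post or the cited studies (and the mark data are invented): it computes two simple agreement measures, mean absolute difference and Pearson correlation, that a module team could use before letting peer grades count summatively.

```python
# Illustrative sketch: compare peer-assigned marks against tutor marks
# for the same scripts, as a quick reliability check.

def agreement(peer_marks, tutor_marks):
    """Return (mean absolute difference, Pearson correlation)."""
    n = len(peer_marks)
    # Average size of the disagreement, in marks
    mad = sum(abs(p - t) for p, t in zip(peer_marks, tutor_marks)) / n
    mp, mt = sum(peer_marks) / n, sum(tutor_marks) / n
    cov = sum((p - mp) * (t - mt) for p, t in zip(peer_marks, tutor_marks))
    sd_p = sum((p - mp) ** 2 for p in peer_marks) ** 0.5
    sd_t = sum((t - mt) ** 2 for t in tutor_marks) ** 0.5
    return mad, cov / (sd_p * sd_t)

peer = [62, 70, 55, 48, 75]   # invented example marks
tutor = [65, 68, 58, 45, 72]
mad, r = agreement(peer, tutor)
print(f"mean abs. difference: {mad:.1f} marks, correlation: {r:.2f}")
```

A small mean difference and a high correlation would support counting the peer grades; a large gap would suggest more training or moderation is needed first.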
Strategies for success

Nonetheless, student concerns about the fairness and accuracy of peer feedback are important, and can be managed by following some good practices:
The key to successful peer assessment lies in careful planning and integration across the curriculum. Peer assessment quality improves with practice, so it should be introduced as early as possible, gradually scaling up expectations as students progress.
A typical model for a social sciences or humanities programme could develop as follows:
Where group projects are included, there would be formative peer evaluations for the first two years, and a summative evaluation in the final year that adjusts the final grade of the individual students in the project group.
This approach helps students gradually build confidence and competence in peer assessment, moving from qualitative to quantitative feedback and from formative to summative contexts.
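To make the final-year summative step concrete, here is a hypothetical, WebPA-style calculation (the function name, rating scale and data are assumptions, not part of this post): each group member rates every member’s contribution, and an individual weighting factor, the member’s total rating divided by the group-mean total, scales the shared group grade.

```python
# Hypothetical WebPA-style adjustment: each member rates every member's
# contribution (including their own); a member's weighting factor is
# their total rating divided by the group-mean total.

def adjust_grades(group_grade, ratings):
    """ratings maps rater -> {ratee: contribution score}."""
    totals = {}
    for scores in ratings.values():
        for ratee, score in scores.items():
            totals[ratee] = totals.get(ratee, 0) + score
    mean_total = sum(totals.values()) / len(totals)
    # Scale the shared grade by each member's weighting factor, capped at 100
    return {m: min(100, round(group_grade * t / mean_total, 1))
            for m, t in totals.items()}

ratings = {
    "Ana": {"Ana": 3, "Ben": 4, "Cal": 2},
    "Ben": {"Ana": 3, "Ben": 3, "Cal": 3},
    "Cal": {"Ana": 4, "Ben": 4, "Cal": 1},
}
print(adjust_grades(65, ratings))  # {'Ana': 72.2, 'Ben': 79.4, 'Cal': 43.3}
```

In this invented example the group scored 65, but Cal’s low peer ratings pull his individual grade down while Ben’s pull his up, which is the behaviour a summative peer evaluation is meant to produce.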
Choosing a platform

To support this structured approach, selecting the right peer assessment platform is crucial for ensuring consistency, scalability, and a positive student experience.
A variety of platforms now support efficient feedback workflows, anonymity, and instructor oversight: key features for successful peer assessment. The table below outlines some of those currently available, including their compatibility with Moodle.
Peer assessment not only complements educators’ feedback, but also supports authentic assessment, criteria-based evaluation, development of graduate attributes, and preparation for the workplace. With effective tools readily available, the main challenge is thoughtful, curriculum-wide implementation to ensure relevance and impact for today’s students.
As higher education continues to evolve, strategically embedding peer assessment can play a vital role in shaping more engaged, reflective, and capable graduates.
References

Ambrose SA, Bridges MW, DiPietro M, et al. (2010) How Learning Works: Seven Research-Based Principles for Smart Teaching. Newark, NJ: John Wiley & Sons.
Amendola D and Miceli C (2018) Online Peer Assessment to Improve Students’ Learning Outcomes and Soft Skills. Italian Journal of Educational Technology 26(3): 71–84.
Helden GV, Van Der Werf V, Saunders-Smits GN, et al. (2023) The Use of Digital Peer Assessment in Higher Education—An Umbrella Review of Literature. IEEE Access 11: 22948–22960.
Reddy K, Harland T, Wass R, et al. (2021) Student peer review as a process of knowledge creation through dialogue. Higher Education Research & Development 40(4): 825–837.
Serrano-Aguilera JJ, Tocino A, Fortes S, et al. (2021) Using Peer Review for Student Performance Enhancement: Experiences in a Multidisciplinary Higher Education Setting. Education Sciences 11(2): 71.
Sridharan B, McKay J and Boud D (2023) The Four Pillars of Peer Assessment for Collaborative Teamwork in Higher Education. In: Noroozi O and De Wever B (eds) The Power of Peer Learning: Fostering Students’ Learning Processes and Outcomes. Social Interaction in Learning and Development. Cham: Springer. https://doi.org/10.1007/978-3-031-29411-2_1
Stančić M (2021) Peer assessment as a learning and self-assessment tool: a look inside the black box. Assessment & Evaluation in Higher Education 46(6): 852–864.

Wanner T and Palmer E (2018) Formative self- and peer assessment for improved student learning: the crucial factors of design, teacher participation and feedback. Assessment & Evaluation in Higher Education 43(7): 1032–104
Academic year 2024/2025
Our last blog shared the experiences of our Co-Chairs over the last year of the Digital Assessment SIG. In this blog our team of officers share their experiences of being involved in the digital assessment special interest group and their hopes for the coming year.
Sally Hanford – University of Nottingham

I joined the Digital Assessment SIG because I’ve been involved in a review of Curriculum Management and e-Assessment at my institution (a collaborative effort across many departments) and I felt that the SIG would help me develop a wider understanding of how other universities are approaching Digital Assessment, discover good practice and learn about lessons others have learnt. Having worked in Higher Education for over 20 years, I also hoped to be able to contribute to the wider discussions.
I’ve really enjoyed the webinars. It’s been amazing to hear about what is going on at other institutions. The events have opened up some great conversations and opportunities to network.
I’ve learned so much about summative assessment and AI by being involved in a subgroup of the SIG on this subject.
I’m hoping we can go further in exploring this subject in the next academic year and I’m looking forward to finding out more about what is going on sector wide.
Sulanie Peramunagama – Digital Assessment Advisor, Brunel University of London

I joined the Digital Assessment SIG because I’m passionate about digital assessment and have been immersed in it since I began working in UK higher education nearly a decade ago. I am eager to learn from others across the sector, explore emerging technologies, and contribute to the evolving conversation around assessment practices.
Being part of the SIG has been both fulfilling and inspiring. Collaborating with our dedicated group, along with the sessions and demonstrations on innovations in digital assessment technology, has encouraged me to reflect on and improve my own practice.
One of the highlights for me this year was Professor Samantha Pugh’s webinar session on competency-based programmatic assessment. Her approach of “using digital platforms to give students multiple opportunities to demonstrate their learning” illustrated how well technology can be harnessed to enhance assessment design. For me, this captures the true purpose of digital assessment: not just to digitise existing practices, but to use technology thoughtfully to make assessment more meaningful, inclusive, and effective.
I have also enjoyed being a part of a SIG subgroup researching the use of AI in summative assessment. At the time of writing (18.06.2025), we continue to navigate the impact of AI on assessment. From what we have found out so far, staff and student perspectives on AI in assessment are mixed. We share the hope that students will not only learn how to leverage AI to enhance their learning and performance, but also develop the critical awareness to understand its limitations and avoid being misled by it. To support this, assessments themselves need to be thoughtfully redesigned to provide opportunities for students to demonstrate these emerging skills in authentic and meaningful ways.
Overall, I believe our SIG will contribute generously to helping the sector rethink and reshape digital assessment practices. I’m excited to see how our work evolves in the coming year and I am happy to be part of this forward-thinking community.
Miki Sun – Learning Technology Service Manager, University of Edinburgh

I am glad to have been selected as an officer and really grateful for this opportunity! I am a Learning Technology Service Manager who looks after a number of digital assessment tools and supports a vast user community. As a Learning Technologist, my passion is using technology to enhance students’ experience and help them achieve their learning goals; but as a service manager, my task is to ensure the smooth running of the centrally supported digital tools and to mitigate the impact of issues on assessment, which means I am limited by the tools and functions we can offer. I often struggle when software vendors cannot develop or provide the solutions and functions my users need, and often wonder how colleagues in other universities deal with similar challenges. In joining the Digital Assessment Special Interest Group as an officer, my hope is to meet like-minded people, to learn from their best practice, and to share the challenges we face, so that together we can influence the future development of digital assessment pedagogy, technology and policy. I have certainly not been disappointed in the first year!
Indeed, the co-chairs Alison, Gemma and Helen, and the other officers, helped me feel welcome and supported as soon as we first met. I am impressed by how plans and decisions were made collectively and quickly through the group’s Padlet boards and Teams meetings, and by how fast our ideas were implemented: invitations sent, webinars organised, blog articles published, all like clockwork! Not only did the Jisc mailing list subscriptions increase daily, but the first webinar, on GenAI and Digital Assessment on 21st January 2025, was a great hit, with 91 attendees on the day! When I shared the recordings with my colleagues, I received a lot of positive feedback, which helped grow my confidence. I then shared future events more widely in a LinkedIn post and across my internal service teams, Learning Technology community and user groups. I am so encouraged to see previous and present colleagues joining the subsequent webinars and giving excellent talks! I learnt so much from their experience and their groundbreaking work in creative and innovative digital assessment practice.
David Callaghan – Senior Educational Technologist, Liverpool School of Tropical Medicine; Chair of the AI for Summative Assessment sub-group

In mid-March 2025 I emailed the ALT list to call for participants for a group to look at using AI for summative assessment. This controversial email resulted in the formation of an initial group of 10 individuals from HEIs in the UK and North America. I was also approached by Gemma from the ALT Digital Assessment group, who suggested we join as a sub-group – which we accepted.
Our initial idea was to create a survey to gather staff and student attitudes to AI for summative assessment; this is now running and will remain open indefinitely (for longitudinal analysis). Link to survey (the author, Tadhg, says ‘…share, share, share!’):
Other ideas include ‘taking AI to moderation’ and using AI to mark a handful of pieces of previously assessed student work that you have IPR over (such as your own).
Our next steps are to review the survey responses for themes and write them up for the ALT blog, and then for a journal. Ideally I would like the data from the survey and our other work to be used to lobby government, the OfS and other bodies on using AI effectively in assessment practices. We are also looking to take the two ideas above forward.
My favourite part of what the sub-group has done this year is allowing a group of interested stakeholders to have supportive and frank discussions on this controversial topic.
I have learned, via these discussions and our initial survey, that our thoughts about what others think of the use of AI for summative assessment are fairly accurate – with some interesting thoughts from the survey, including comments like ‘Well, AI is going to be a little less biased’.
My hopes for next year are to publish in ALTJ or similar, lobby gatekeepers, and create some guidance for colleagues looking to use AI for assessing student work. The two projects, one on using AI in standardisation activities and another on using AI in a pseudo-standardisation meeting, may contribute to this aim.
Nurun Nahar – Assistant Teaching Professor, University of Greater Manchester

I joined the ALT Digital Assessment SIG in September 2024 as an Assistant Teaching Professor with a research background in technology-enhanced learning. In my role I advocate for research-informed pedagogies within my institution and advise my department on harnessing digital tools for both formative and summative assessments. When I came across ALT’s call for expressions of interest in joining this SIG dedicated to digital assessment practice, I recognised an opportunity to join as an officer and help shape a community-driven agenda that aligns with my commitment to evidence-based innovation in student learning experiences, and I must admit it has been a fulfilling experience so far!
From our first meetings, I felt grateful for the open, candid conversations around the challenges we all face in digitally assessing learning, whether workload pressures, questions of validity, or sustaining student engagement. Hearing from SIG colleagues representing diverse professional backgrounds across various institutions has been an enlightening way to explore issues that no single voice can resolve alone, such as balancing academic integrity with inclusive design. I felt proud to be working alongside colleagues who saw strength in drawing on our collective experiences to chart a clear vision for this SIG, whilst also recognising the value of inviting the wider sector to join the conversation to shape our agenda and impact.
On 21 January 2025 I was pleased to co-present in our first webinar alongside Alison Gibson, University of Birmingham (SIG Co-Chair) and Lisa Bradley, Queen’s University Belfast (SIG Officer). My session was titled “Generative AI and the Future of Digital Assessments: Shifting Focus, Leading Change”. It was attended live by 91 participants and has since been viewed over 230 times on YouTube. It offered me a chance to learn just as much as to share: how generative AI might shift us from product-centred summative tasks toward process-rich formative cycles; how multimodal AI tools could support students’ self-regulated learning; and how we as a sector can ensure responsible use of AI for learning, teaching and assessment through multistakeholder collaborations that foster critical dialogue and the sharing of good practice. The lively discussion that followed, on the authenticity of learning and assessment and on trust in AI tools, reminded me how much we still need to achieve in this space and what we could explore collaboratively through the ALT Digital Assessment SIG.
Looking back, being part of this SIG has enabled me to further appreciate how structured dialogue can unearth practical, context-driven strategies, from influencing the procurement of digital tools to ensuring accessibility in digital assessments, and how local innovations can inform broader guidelines. It has also offered me the opportunity to join the AI for Summative Assessment sub-group led by David Callaghan and to work closely with colleagues from various higher education institutions to understand staff and student perspectives on using AI for summative assessment. Above all, my engagement with this SIG, and the insights shared by guest speakers in our subsequent webinars, have reaffirmed my belief that assessment design, not the technology itself, lies at the heart of meaningful digital practice. Looking to the future, I am excited about what lies ahead for this SIG. I anticipate that more challenges will unfold for the higher education sector as we continue to witness rapid progress in multimodal AI technologies, alongside the wider concerns facing the sector. However, I am hopeful that the forward-thinking spirit of this SIG will help us address emerging issues collaboratively, ensuring our digital assessment practices remain resilient, equitable, and pedagogically sound.
ALTC is more than just keynotes and sessions: it’s a vibrant, immersive experience that brings the ALT community together in exciting and unexpected ways. From creative showcases and group-led conversations to social gatherings and spontaneous moments of connection, this is where the conference truly comes to life.
Reflection is widely used for professional recognition and is also seen as valuable for professional development. Reflecting deeply requires the skill to step back from what is occurring and consider not just ‘what’ happened but why it happened, and, in Senge’s (2006) view, to develop the skills of a learning organisation. In this session we will share our experiences of co-generative dialogues: structured conversations in which each participant has an equal voice, alternating between speaking, listening attentively to one another, and valuing diverse perspectives (Hsu, 2021). This type of reflection treats the principles of equity, respect, and inclusion as foundational to sense-making. We explore how these deeply reflective accounts come together in a democratic space, softening traditional hierarchies (Tobin, 2008). Working in groups, dyads or triads can foster collective responsibility for improvements (Martin, 2006). In this session, we will provide the opportunity to experience a co-generative dialogue, fostering reflection that moves beyond what to why and, more importantly, shapes learning for the future.
CPD Webinar Host:
Lynn Gribble
After an incredible 17-year journey, Julie Voce is stepping down from the committee of the ALT M25 Member Group, one of ALT’s longest-running Regional Member Groups.
ALTC25 Guest Post Joe Wilson
After years of campaigning to bring ALTC, or a related ALT event, to Glasgow, it’s finally happening in 2025. I nearly managed to host #OER21 at City of Glasgow College, but COVID had other plans. I’ve also long advocated for a more college-friendly time of year for the conference. This year, both goals have been realised.
It’s genuinely exciting to see the programme coming to Glasgow at last. I’m hoping we’ll reach colleges and universities across the UK, with a strong Scottish contingent and a brilliant turnout overall. We’ve got plenty of fun things planned, and I’m thrilled to be co-chairing with Emily Nordham and Laura Milne. The programme is engaging, but I also hope you’ll take time to enjoy some of Glasgow’s other delights. (I’ll share a separate post soon with recommendations of things to experience and sample.) I’m biased, of course, but if you can, stay on after the conference and spend a weekend here. Glasgow is a UNESCO Learning City, and it’s well worth exploring.
A Personal Journey with ALTC
I think my first #altc was in 2000, and by 2001, thanks to a twist of fate, I found myself on the organising committee for the conference at Edinburgh University. Since then, I’ve tried to attend either the full conference or at least one policy forum or special interest group event each year. ALTC is a brilliant space for professional development and networking, and ALT’s work through CMALT and other initiatives has never been more relevant.
In the early 2000s, we were talking about online assessment systems and the rise of virtual learning environments. By the mid-2000s, it was all about BYOD and MOOCs. The sessions and discussions at ALTC are always cutting-edge, sometimes positively contentious, and always practical. I know this year’s themes reflect that spirit.
Top Tips for ALTC 2025
I’ll be posting soon with ideas for places to visit and things to experience while you’re here. Whether you’re a first-time visitor or a seasoned ALTC attendee, I hope you’ll find time to enjoy everything Glasgow has to offer.
See you in October!
ALT’s Annual Conference is one of the UK’s largest conferences for learning technology and digital education professionals. The conference provides a valuable and practical forum for practitioners, researchers, managers and policy-makers from education and industry to solve problems, explore, reflect, influence and learn.
ALTC25 will take place in Glasgow on 23 and 24 October 2025. Early bird bookings end Friday, 15 August. Register for ALTC25 now.
Take a look at what’s planned for Open Education, AI, and Populism – Revisited. Our online conference is coming up on 16 S
You asked, we listened. The early bird registration period for the ALT Annual Conference 2025 will now close at 23:59 BST on Friday, 15 August. We encourage all prospective participants to register before this extended deadline to benefit from reduced rates.
Digital literacy is a critical competency in education across all levels, from primary to higher education. It includes skills such as technical proficiency, information evaluation, online collaboration, creativity and ethical technology use. This study conducts a Systematic Literature Review (SLR), following Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines, to examine types of instruments used to assess students’ digital literacy, the competencies targeted and the methodological challenges in their development. A total of 23 peer-reviewed articles published between 2014 and 2024 were selected from Scopus, PubMed, Crossref and ERIC. This review shows that assessment instruments include Likert scale-based questionnaires, framework-aligned tools (DigComp and DQ Framework) and digital performance-based methods. These instruments are applied across diverse educational settings: primary, secondary, tertiary and adult education with varying emphases based on age and learning context. Whilst core competencies are addressed, several limitations persist, such as reliance on self-reporting, limited cross-cultural validation and lack of authentic performance assessment. This study highlights the need for more comprehensive, validated and context-sensitive instruments that integrate digital safety, ethics and practical digital skills. The findings offer insights for researchers, educators and policymakers to improve digital literacy measurement across education sectors.
The Digital Assessment SIG has thrived since its birth at the start of the 2024/2025 academic year. We now have 128 subscribers to the Jisc mailing list and have delivered 3 successful webinars with 10 presentations by 13 excellent speakers. For those of you interested in YouTube statistics, our most-watched webinar, Generative AI and Digital Assessments, has so far gained 338 views and counting.
Given the success of these webinars, next year we will explore the following topics, along with growing two subgroups, on AI and on Accessibility of Assessment:
Over the summer we will be sharing two blog posts where we invite you to step inside the inner workings of the SIG with a view from our SIG Co-Chairs and officers on the activities we have completed this year and a description of their own experiences as members of the Digital Assessment SIG. Starting with our co-chairs.
From our Co-Chairs

Gemma Westwood – Senior Digital Education Developer, University of Birmingham

I co-created the Digital Assessment SIG with Alison and Helen, our other co-chairs, following a year-long project at the University of Birmingham where we completed a large-scale trial of two digital assessment tools. During this process we made many discoveries that did not seem to fit into other existing special interest groups. We longed for a place to discuss our findings, thoughts and experiences with the wider sector, and to hear others’ experiences of digital assessment, outside of the networks facilitated by tool providers or hopeful open sector information requests. Discussions with others in this situation made us realise that this space was needed elsewhere too.
My favourite part of this year has been watching the SIG grow from a loose idea of a discussion forum into a well-formulated series of webinars and blogs that have tackled some of the pressing topics in this area. I cannot praise enough the dedication of the SIG officers in making this happen, or the members of the SIG for supporting these initiatives and for their individual contributions this past year. I am proud of how the SIG has begun to support research in specific areas, with SIG officers coordinating subgroups on generative AI in summative assessments and working together with the Accessibility SIG to set up a task and finish group to provide support and guidance to the sector on procuring accessible digital assessment tools.
As a Co-Chair I have had so many opportunities from being part of this SIG that I have not had before. For example, our first webinar was the first time I acted in the role of webinar host (rather than as a presenter), where I rapidly learned that timing is key for a host. I have also had the opportunity to represent the SIG on the EAA podcast episode ‘Back to the future: a new review of the JISC 2020 report on the five principles for the future of assessment’, which was also my first experience of being on a panel discussion.
For next year, I am hoping that we can expand the reach of the SIG, work more closely with fellow SIGs on projects of mutual interest, and begin to share the outputs of these in an open format, as well as continuing to expand both the webinars and the blog, with the hope of bringing people together in person to highlight what is happening in digital assessment across the sector.
Helen Greetham – Digital Education Developer, University of Birmingham

When my colleagues Alison, Gemma and I were discussing routes to share and discuss our work and the possibility of creating an ALT subgroup was suggested, I wasn’t sure what to expect! I’ve generally been a bit of a passive ALT member, appreciating the excellent journal articles and news items it sent my way, but never sure how to become more involved in the organisation. It’s been great to have this chance to dive in and contribute to sector-wide discussions.
Over the past few years I’ve been submitting a lot of conference papers and session proposals to all kinds of events, and one of my favourite parts of being an officer is the insight I’ve gained from seeing this process from the other side. It’s an enjoyable challenge to sort through all of the wonderful submissions we get for our monthly webinars, trying to select talks which work together and create a larger narrative in conjunction with each other. These events always prompt some really useful, practical discussions.
In the next year I’m hoping to meet face-to-face, at the ALT conference, colleagues whom I’ve only ever known in virtual spaces. I’m currently working on the opening stages of a project with the Accessibility SIG to create a task and finish group providing sector-wide guidance for procuring accessible digital assessment tools, and I’ll be excited to see what comes of that over the next academic year!
Alison Gibson – Head of Digital Education, University of Birmingham

In 2024-25 the University of Birmingham digital assessment team ran a proof-of-concept project, where we evaluated different assessment tools to see which requirements they met. I was determined from the beginning of that project to ensure we were engaging across the sector, as we were in the privileged position of having dedicated time and resource to complete digital tool evaluations.
What we found, as we did the rounds of various conferences, was an overwhelming desire from colleagues at other institutions to hear about and share experiences of digital assessment tools, away from the influence of vendors. And so, the idea of the Digital Assessment SIG was born: a place to share good and bad assessment experiences, strategic goals and projects, find commonality amidst the noise of EdTech, and have discussions that don’t dissolve into sales pitches.
While I’m now less actively involved than at the beginning, I’m delighted to see the SIG growing in membership and engagement. Working siloed away in our institutions can become habitual, but we have so much to learn from each other, and the events that the co-chairs and members of this SIG are arranging are the perfect way to do that.
Early bird registration for the ALT Annual Conference 2025 will close at 23:59 BST on Friday, 8 August. We encourage all prospective participants to register before this deadline to benefit from reduced rates.
This is the third in a series of four sessions throughout the 2025/26 academic year. These are designed as open forums for colleagues in the East England and East Midlands region to share their work and get feedback. The sessions are very informal and everyone who lives and/or works in the region is welcome. Further details to follow: the exact topic and discussion format will be determined in the 21 January 2026 session. We'll conclude the sharing at 4:50pm (at the latest). Following that, we'll dedicate 5-10 minutes to setting the agenda for our July meeting.
Last month, we had the pleasure of hosting an inspiring OER25 Conference in London.
The topic is: What’s the top priority for you right now? This is an informal opportunity to share what you're currently working on, get feedback, seek collaborators, discuss any challenging issues, or report on anything you wish. Minimal preparation is required, as we aren't expecting any slides (unless you want to bring some). Just think beforehand about what you want to talk about and any questions you want to ask others. We'll go around the room, and if multiple people are attending from the same institution, we'll group their contributions by institution. The time allocated for sharing will be flexible, depending on the number of attendees and where the discussion is going. Any contributions we aren't able to get to in the October session will be carried over to the following session in January. We'll conclude the sharing at 4:50pm (at the latest). Following that, we'll dedicate 5-10 minutes to setting the agenda for our January meeting. This agenda might focus on delving deeper into topics that arose during the October session, or it could introduce entirely new subjects for discussion.
Have you ever wondered how data science students approach learning analytics? In this session, we’ll explore the unique perspectives of postgraduate students who analysed the Open University Learning Analytics Dataset (OULAD) as part of a four-week innovation project.
What makes this work especially interesting is the combination of methods used to capture the students’ perspectives. Alongside human-led content analysis, we used AI tools such as Retrieval-Augmented Generation (RAG) and Large Language Models (LLMs). This blend provided a richer, more nuanced understanding of how data science students interpret and engage with learning analytics.
Join us as we share what we learned from these students—about their analytical approaches, the themes they explored, and what all of this means for designing better learning analytics in higher education. It’s a conversation about how students see the role of data in their own learning journey and how their insights can shape the future of education.
CPD Webinar Host:
Raghda Marai Zahran, Newcastle University
By Digital Assessments SIG
On the 20th of May we held our last webinar of this academic year. Thank you to our presenters for sharing your work. If you missed it and would like to catch up on any of the presentations, you can access the individual recordings on YouTube:
This presentation covered the transition to digital examinations at Heriot-Watt University, particularly during the return to campus, and the significant challenges it presented. Denny and Lisa shared that, to tackle these challenges, the Virtual Exam Centre (VEC) was founded: an international, interdisciplinary group comprising over 43 members from diverse departments. This dynamic team leveraged agile methodologies, iterative process development, reflective practices, and technologies to create a cohesive and efficient support system for digital examinations.
Their approach centres on fostering collaboration and communication among team members, ensuring that all stakeholders are aligned, informed and removing single points of failure. By adopting agile practices, the team has been able to respond swiftly to emerging issues and continuously improve processes. Iterative development has allowed for the refinement of strategies based on feedback and real-world experiences while reflective practices ensured that lessons learned from each examination cycle were applied to future iterations.
They shared how technology played a pivotal role in bringing together the geographically dispersed team and managing digital examinations. The team used various digital assessment platforms, such as Canvas Quizzes, Mobius, STACK, Gradescope, and Respondus LockDown Browser, to streamline the examination process, enhance academic integrity, and improve accessibility. Additionally, digital collaboration platforms helped to facilitate seamless communication and coordination among team members across global campuses, ensuring a unified approach.
This presentation detailed the journey of implementing digital examinations at Heriot-Watt University, highlighting the challenges faced, the innovative strategies employed, and the successes achieved. It provided insights into the formation and operation of the VEC, showcasing how interdisciplinary collaboration, agile methodologies and technology can drive positive change in educational support systems.
Sumayyah Islam and Laurence Horstman presented on the ‘e-Exams’ MCQ pilot, which represents a step forward in the digital transformation of assessment at LSE. Their presentation shared insights from the pilot, which ran from July 2024 to May 2025, and discussed the benefits and challenges of implementing MCQs within LSE’s ‘Digiexam’ exams platform. Despite successes using Digiexam as a ‘digital answer booklet’ to deliver essay-based exams, the potential for digitising MCQ exams using automatic marking was previously underexplored at LSE. The pilot sought to build on existing MCQ features, allowing students to view questions and answer alternatives directly on-screen.
Automated Mark Agreements for 3rd Year Research Projects
This webinar focused on how automation software such as Power Automate can be integrated with multiple processes to reduce the administration burden and speed up marking, and on how these processes can potentially be shared across other modules and faculties to ease marking and feedback for digital assessments.
Transitioning Language Exams to Online Delivery in Higher Education
This presentation reviewed the end-to-end transition of traditional paper-based language assessments to fully digital formats within a higher education language course. It focused on administrative planning, implementation, and lessons learned, with an emphasis on improving efficiency, accessibility, and the student experience – while maintaining academic integrity. The presentation offered practical insights and forward-thinking strategies for institutions aiming to transition language or discipline-based assessments to online platforms. It highlighted the importance of cross-team collaboration, phased implementation, and stable support structures to ensure a successful and sustainable digital transformation.
by Matthew Ruddle
In the 24/25 academic year I made the decision to write weekly teaching reflections, which I published publicly on LinkedIn. Here I explain my reasoning behind this, and the impact it had.
Why reflect?
Teaching is challenging – ask any teacher, and they will tell you! This past year was particularly challenging, because I was assigned to work exclusively with our alternative provision faculty.
This meant that I was teaching GCSE English Language to groups of pre-16 and post-16 students, most of whom had high learning needs, were recovering from trauma, experienced high anxiety, displayed challenging behaviours, and had been failed by mainstream schools in a myriad of different ways. To help them to re-engage in education (and specifically GCSE English) I had to re-think my approach to teaching, which meant ripping up the resits rule book and starting again.
I already knew from previous experience that working with high-needs learners in alternative provision is often stressful, so I needed to find a way to help me cope with this.
I also knew that if I didn’t take the time at the end of each week to sit and think (to really think) about the successes and the struggles, then I would end up drowning in a pool of negativity. No. More than just a pool. A hungry, swirling whirlpool threatened to pull me under every Friday. Writing a teaching reflection was the lifejacket that kept me afloat.
Why reflect publicly?
Reflecting is something that most teachers do all the time: we are constantly thinking about our students’ progress, evaluating the impact of a lesson or a learning activity, and pondering how to improve things next time. I knew that if I reflected privately (perhaps in a notebook or journal) then I would be less likely to continue with it, so I made the decision to post these weekly reflections on LinkedIn, where anyone could read them, to make me more accountable.
What do I mean by “beyond the looking glass”?
Now, we all know that social media is fraught with difficulty. In many ways, social media is a place where we often present a distorted version of ourselves: where we look better, we sound more productive, we live more exciting and more perfect lives. Social media is like a warped mirror: it reflects back to us a version of someone who doesn’t really exist, but we still share this strange image with the online world, where we strive to be “the fairest of them all”.
Although the word “reflection” reminds most people of looking into a mirror, the act of reflection is more about gazing deeply inside of yourself. It is not about crafting a more attractive version of who you want the world to see; it is about trying to discover truths.
Why share teaching reflections?
Rather than thinking of a reflection as a mirror image, I think that the act of reflecting is more like the opening of a window; through sharing my teaching reflections each week, I was not only offering a glimpse inside my classroom, but I was also looking outside, beyond my four walls, at the larger landscape.
Posting my reflections on LinkedIn created the possibility of connecting with other FE professionals across the UK. I was not seeking praise or “likes” or any sort of hollow flattery: I was trying to share my true experiences of teaching in FE, warts and all, and I didn’t really know what the reactions would be (or even if anyone would want to read them).
I made the effort to write truthfully about the struggles as well as the successes in my classroom. I wanted to give an accurate impression of what it is like teaching on the frontline, every week, and looking back at those reflections now, I am proud of my openness and honesty. More than a few people have told me that my reflections were “raw”. This reassures me that my writing has been frank and realistic, which was my goal all along.
After all, what’s the point of writing a teaching reflection if it isn’t true? It may not always present the “ideal” version of who we are, but it does present an honest snapshot of how things were that week, and opens a window into our pedagogy.
How did this impact me?
I cannot express enough how much writing these reflections helped my mental health. There were times when I was so tired, stressed, and second-guessing my teaching, that the simple act of thinking and writing about my experiences lightened my load. I was subsequently reassured when other teachers told me that my experiences resonated with them. Knowing that I was not alone in my experiences helped me to feel connected to the wider FE community, far beyond the confines of my own desk, classroom, or college.
What’s my hot take?
Yes, sharing your honest teaching reflections online can be scary (it definitely made me feel vulnerable) but sharing our experiences in this way establishes the context for so many possibilities: for conversations, for connections, and for deeper contemplation about what we’re doing, why we’re doing it, and how we can make it better. Reflections really do open a window to the wider world.
Call to action:
Connect with Matthew Ruddle via LinkedIn
If you want to learn more about LinkedIn and how you can leverage the platform to make connections, do sign up for our AmplifyFE webinar on Friday 14 Nov 2025, 12:00 PM to 12:30 PM: AmplifyFE Community Space Workshop: How to master LinkedIn to achieve your goals