Patricia Bartley, Digital Assessment and Feedback Project
Assessment plays a major role in university life. We know that it drives how students study and what they learn. Yet there has been little innovation in assessment types and practices over the years. Until 2020 most subject areas still mainly relied on ‘closed book’ examinations and coursework essays as they had done twenty or more years previously.
In April 2020, the University of York, like many higher education institutions, took the decision to move almost all of its assessments to an online format, given that Covid restrictions would not allow students to take closed assessments in the traditional exam hall setting. These traditional closed exams were replaced with either an open assessment option (with a one-month completion period) or a 24-hour open examination. So, after years of very little change, assessment design and delivery took centre stage.
In a matter of weeks, academic departments were required to redesign their exams, either as open exams run within a 24-hour window or as alternative online assessments, in a way that measured the same learning outcomes as the original exam. Some closed exams were cancelled, some were substituted with another type of open assessment, and some were converted into 24-hour open exams using the VLE.
The subsequent increase in the volume of assessment activity on the VLE and other digital tools used during the summer assessment period in 2020 highlighted a need to update or replace the current technology. These tools were never designed to support the university's growing digital demands in terms of usage volumes, storage needs, or file sizes and types. As assessment activities have become increasingly reliant on digital technologies, it has become clearer that the university needs to invest in modernising the infrastructure which supports assessment and feedback.
So in response to this growing demand, the University of York set up a Digital Assessment and Feedback (DA&F) project with the aim of significantly improving the University's ability to deliver digital assessment and feedback in the medium to long term. The Programme Design and Learning Technology (PDLT) team has been involved with the project from the start, and it closely follows on from the longstanding work that PDLT has been doing on digital assessment requirements. This assessment project began in October 2020 and will run until September 2022, by which time it is hoped that a more robust infrastructure for assessment will be in place.
The DA&F project involves all teaching departments, professional services staff and students, gathering data on what works and what doesn't work at the moment, as well as exploring opportunities for using technology in future assessments and feedback.
The project approach
The project approach is to deliver iterative trial releases of the platform prior to any University-wide rollout. These will run between February 2021 and January 2022 and will allow the suitability of various components of the service to be evaluated quickly, and in stages, before significant investment is undertaken.
In the autumn of 2020 a procurement strategy was agreed to compare and select the most appropriate commercial product to use within the proof of concept stage. Wiseflow (from UNIwise) was chosen as the best commercial product and will be used to run these first proof-of-concept releases.
Data gathering and User stories
The project team has sought to gather detailed information from every department on their approaches to managing and organising assessment and feedback as well as their technological requirements for the delivery of current and future assessments. A long list of user stories has been compiled to reflect these requirements and future wishes and this list will be used to test the suitability of the new assessment platform. This requirements and wish list currently contains over one hundred items including:
- an ability for students to submit large media files;
- scope for markers to have access to an instant Turnitin report and to be able to give audio feedback;
- support for professional and administrative staff to be able to manage the whole assessment workflow on one seamless platform.
We have also been carrying out benchmarking activities to learn more about where the sector is heading in this area and to discover what can be achieved with solutions such as Wiseflow. We have received some valuable insights from other institutions already using Wiseflow, in particular from colleagues at Brunel University London, who have generously shared their experience of working with the new assessment platform. Brunel is recognised in the sector as a long-standing user and champion of digital assessment (see, for example, their Learning from Digital Examinations conference website, 2018) and so we were delighted to learn directly from their experience.
In November and December 2020, a usability study was conducted with students and staff in several departments, measuring their experiences with the VLE and other technologies used for assessment. The aim of this study was to provide a baseline for the user experience of the current assessment systems and processes. The researcher used thematic analysis to code the qualitative data and identify patterns and themes, and produced an affinity diagram from which the key insights were extracted.
You can read more about this study and the affinity mapping process in the following article by the UX researcher, Jordan Marshall: Remote Research: Student assessment and feedback.
After initial discussions, three departments, TFTI, Psychology and the York Management School, were chosen to run the first proof-of-concept pilots of the Wiseflow platform. The first release, with the Psychology department, concluded at the end of March, with around 20% of its first-year students taking a series of formative multiple choice tests through the Wiseflow platform. An evaluation of this release is currently underway, but it is worth noting that initial findings show that students have found the platform very easy to use and that support staff have given positive reports on how the platform has led to a notable reduction in manual administration tasks, one of the main project aims. The second pilot release has been taking place in April with the Department of Theatre, Film, Television and Interactive Media, and it focuses on assessments which require uploading large multimedia files and the marking and moderating of these assessments. The project team is happy to report that the submission of the large file assessments has gone very smoothly.
Going forward / Direction of travel
A third pilot release will take place later in the summer term. Smaller-scale studies will test the usability of several features of the Wiseflow assessment system from April onwards, and these will feed into an interim decision point in mid-June. The project team notes the University of York's recent decision not to run in-person closed assessments (with the possible exception of professionally accredited exams) during the spring term of the next academic year (2021-22). This decision highlights the importance of the discovery work being undertaken by the DA&F project and reinforces the need to maintain a continued focus on developing and delivering assessments in a digital environment in the coming months.
A parallel scoping project has begun on the VLE. This will support an analysis of the existing data feeds for the Blackboard VLE and examine how assessment integrates with online teaching provision as part of one integrated ecology of systems.
The expanded use of online and digital assessment in the summer of 2020 has shown the higher education sector the potential of technology to transform and reimagine assessment. Academics acknowledge that assessment needs to change to realise the full potential of their students. At the University of York, the DA&F project's plan is to put in place the most suitable technology to support these changes and to create a shared, integrated and seamless digital workflow for the delivery of high-quality and innovative assessment, marking and feedback.
The project team will spend the next 18 months in continued conversations with colleagues across the university as they try out a range of assessment features and scenarios on the new platform. We strongly encourage all members of the university community to get involved in this testing phase, and we welcome everyone to share their ideas, suggestions and requirements on assessment and feedback. If you would like to learn more about this work, visit the project website and join the #digital-assessment-stakeholder-group Slack channel.