Module evaluation with online tools


Summary

The Department of Social Policy and Social Work (SPSW) has been trialling two online tools, Google Forms and Qualtrics, for running end-of-module evaluations in which students provide feedback on the learning and teaching approaches in their modules. In this case study, Dominic Ennis, VLE Coordinator in SPSW, presents a workflow for running these evaluations.

Aims

The use of an online module feedback form with a refined question set aimed to:

  • Increase student completion rates of module evaluation forms, to inform annual and periodic reviews of teaching in the department.
  • Ensure full anonymity, which is not always possible with paper-based processes completed in the lecture room.
  • Make it easier for module convenors to capture student feedback, develop their modules and provide evaluation summaries.
  • Simplify and speed up the process for students to receive feedback summaries.

Overview

End-of-module evaluations allow students to give feedback on their experience of a module. This feedback can then be used by the module convenor and programme team to enhance the learning and teaching for that module. As such, they are crucial to the ongoing development and quality of teaching in departments.

The Department of Social Policy and Social Work has established a feedback cycle in which students’ end-of-module evaluation comments are collated and responded to by the module convenor. The summary of feedback and the response are then posted on the Yorkshare VLE for all students on the programme to see.

Circular flowchart depicting the following steps, forming a continuous cycle: 1, Learning and teaching on the module. 2, Students complete feedback form. 3, Module convenor collates and summarises feedback. 4, Module convenor reflects and responds to feedback. 5, Summary and response provided to students. 6, Summary and response informs module development - Repeat.

SPSW Module Evaluation Cycle

The first part of this project involved a review of the question set for the end-of-module evaluation. The number of questions was reduced and the questions refined so that the feedback from each one was meaningful to the development of the module. [Question Set]

Secondly, to speed up and lower the burden of collating feedback, online end-of-module evaluations were developed using Google Forms (replacing the previous use of an in-house system and paper-based forms). Google Forms has an inbuilt page that presents a summary of all responses to a form, which is useful for totalling how students have responded to different questions and makes it easier to reflect on feedback than paging through paper forms.

However, one of the challenges of Google Forms was monitoring response rates. Unlike paper-based approaches, with online forms students were not required to complete feedback at a set time, and doing so may not be a priority for them, especially during open assessment periods. To monitor response rates, each Google Form needed to be opened individually to see how many students had responded, which made identifying which modules required reminder emails a tedious task.
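
Where a department stays with Google Forms, this individual checking could be scripted. The short Google Apps Script sketch below (written in TypeScript, which Apps Script tooling such as clasp supports) is not part of the SPSW workflow; it simply illustrates the idea, and the form IDs listed are hypothetical placeholders.

```typescript
// Sketch: report response counts for a set of end-of-module evaluation forms.
// FORM_IDS is a hypothetical list a feedback coordinator would maintain;
// replace the placeholders with real Google Form IDs.
const FORM_IDS: string[] = [
  'FORM_ID_MODULE_A',
  'FORM_ID_MODULE_B',
];

function reportResponseCounts(): void {
  for (const id of FORM_IDS) {
    const form = FormApp.openById(id);          // open each evaluation form by ID
    const count = form.getResponses().length;   // number of submitted responses
    Logger.log(`${form.getTitle()}: ${count} responses`);
  }
}
```

Run from the Apps Script editor, this logs one line per module, making it easier to see which modules need a reminder without opening each form in turn.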

For 2016-17, the Department is using Qualtrics, following a pilot of this dedicated online survey platform. There are three notable improvements to the online feedback process using Qualtrics:

  • A responsive (mobile-friendly) layout which allows students to complete the end-of-module feedback form on their own devices in class or in breaks.
  • A dashboard view for lecturers to monitor the number of responses for each form.
  • Automatic closing of the feedback form on a specific date.

https://elearningyork.files.wordpress.com/2017/01/de_cs02.png?w=940
A screenshot depicting the lecturer's view of the Qualtrics Dashboard. Features a number of rows of example data under the headings: Project Name, Status (active/inactive), Last Modified (date/time), Creation Date, Responses (number of).

Lecturer View of the Qualtrics Dashboard
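
For a coordinator looking after many module surveys at once, the information shown in this dashboard is also available programmatically through the Qualtrics REST API. The sketch below is illustrative only, not part of the SPSW workflow: it assumes the v3 survey-list endpoint, an API token generated in the account settings, and a hypothetical datacentre ID, and the exact endpoints and fields available may vary by licence.

```typescript
// Sketch (Node 18+): list the surveys visible to a coordinator's Qualtrics account.
// The datacentre ID and environment variable name are assumptions for illustration.
const DATACENTER = 'fra1';
const TOKEN = process.env.QUALTRICS_API_TOKEN ?? '';

interface SurveyListItem {
  id: string;
  name: string;
  isActive: boolean;
  lastModified: string;
}

async function listSurveys(): Promise<void> {
  const res = await fetch(`https://${DATACENTER}.qualtrics.com/API/v3/surveys`, {
    headers: { 'X-API-TOKEN': TOKEN },          // token-based authentication
  });
  const body = await res.json();
  // The list endpoint returns survey metadata; a per-survey call
  // (GET /API/v3/surveys/{id}) can be used for further detail such as response counts.
  for (const s of body.result.elements as SurveyListItem[]) {
    console.log(`${s.name} | active: ${s.isActive} | last modified: ${s.lastModified}`);
  }
}

listSurveys().catch(console.error);
```

A script like this could feed a single summary page or email for the feedback coordinator, rather than each convenor checking the dashboard individually.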

Methodology

The chosen workflow uses Qualtrics. Module convenors need to activate their University of York Qualtrics account so that feedback results can be shared with them. The workflow is managed by a feedback coordinator or administrator in the department, who sets up the feedback forms for module convenors based upon an agreed template.

  1. Module convenors request an evaluation form from the department feedback coordinator/administrator, who will set up an instance of the template on Qualtrics. Convenors can customise their form if necessary, with a small number of additional questions about a particular teaching activity.
  2. The evaluation form is shared with the module convenor on Qualtrics, and the convenor is sent a link to distribute to their students using one of these options:
    • Link in module site announcement  
    • Email link to students
    • Embedded survey in announcement
  3. Lecturers may choose to allow 10-15 minutes during class in the final two weeks of term for students to complete the form online using their own device (or a device loaned from the department's pool of laptops/tablets). This aims to increase response rates.
  4. The module convenor logs into Qualtrics to view a dashboard of their surveys showing the number of respondents so far. Convenors can use this information to manage their reminders to students.
  5. At the end of the feedback period, the module convenor collates the feedback from Qualtrics and completes a Module Evaluation Summary.
  6. The Module Evaluation Summary is uploaded to Programme Sites on the Yorkshare VLE, where students can see an overview of other students’ views and the response from the module convenor.

Reflections

The trials showed that response rates are lower when using online forms than with paper-based approaches used in class. However, online forms allow for truly anonymous responses and rapid collation of feedback, and students can still complete their feedback in class if time is set aside for them to use their own devices.

A screenshot depicting the student's view of a Qualtrics form on a mobile device. The top half of the screen contains guidance on completing the form, including a request that responses be quick, complete and honest, and a note that the form is anonymous. The bottom half of the screen contains a demo question with multiple-choice answers. Question: Overall how satisfied are you with the module? Answers: Very Dissatisfied, Dissatisfied, Neutral, Satisfied, Very Satisfied.

Student View of Qualtrics Form on Mobile

There are also further actions that may boost the response rate:

  • Including in face-to-face sessions an indication of how student feedback has informed the module design.
  • Regular reminder emails using the Announcements tool on the VLE module site.
  • Posting an Announcement on the VLE module site with the response to feedback, in addition to posting the summary and response in the Programme Site for all students to see.

Lecturers in the Department also noted during the trial that feedback forms are only one mechanism for module development and for engaging students in enhancing learning and teaching on their programme. Some module convenors highlighted the value of discussion-based evaluation processes embedded in modules, which can tease out strengths and areas for improvement rather than leaving these to be inferred from short open-text responses.

Transferable lessons learned

  • Provide a space on the Yorkshare VLE where summaries of and responses to student feedback can be posted, and notify students when summaries are available.
  • Keep question sets short and targeted so that each question provides meaningful feedback to allow you to reflect on the learning and teaching on your module.
  • Use online mechanisms to collect feedback, but ensure they are promoted in face-to-face sessions so that students know the significance of their feedback. You could show how the module has developed as a result of student feedback.

Next steps

Case Study last updated: January 2017