Closing the loop - evaluating the use of student feedback to enhance teaching

Closing the feedback loop for Module Evaluation Questionnaires: Evaluating the conversion of feedback into enhancement of teaching practice

Joanna Carter, Strategic Planning and Business Intelligence Service

Closing the feedback loop is essential for securing and maintaining student participation in teaching evaluation processes (Watson, 2003; Marshall et al., 2015; Shah, Nair and Richardson, 2017; Hoel and Dahl, 2019). The University of Hull implemented standardised staff responses to Module Evaluation Questionnaire (MEQ) feedback for its students from 2019/20 trimester 1; these responses are referred to as ‘Student Reports’.1

MEQs are an opportunity for students to provide feedback on their module experiences. They take place in the last three teaching weeks of each trimester and consist of eleven statements across four sections, including ‘Teaching and Learning’ and ‘Feedback and Support’. Students indicate how much they agree or disagree with each statement and can choose to provide written comments at the end of each section. This produces quantitative and qualitative data for staff to review and use to underpin their subsequent enhancement activities.

To complete the ‘Student Report’, Module Leaders are encouraged to reflect on areas of good practice and areas for development brought to their attention by their MEQ feedback, and then propose actions they will take to enhance the module in future. The aim is to inform students about how their feedback is being used to shape teaching practice. ‘Student Reports’ are disseminated to all students in each module cohort, including to those who did not submit a response to the MEQ.

Supporting the rationale for closing the loop is a sector-wide trend in the National Student Survey (NSS), where results in the “Student Voice” section are on a downward trajectory. In the NSS 2020, the average result for Higher Education Institutions was 85% agreement that students had the right opportunities to give feedback, dropping to 76% agreement that staff value students’ views and opinions, and to 62% agreement that it was clear how students’ feedback was acted on.2

Closing the loop sits within the ‘Module Feedback Cycle’3 at the University of Hull, which emphasises the value of including the student voice continually throughout module delivery (Harvey, 2011). It is important that each new student cohort understands how current aspects of delivery have evolved from previous student feedback, whilst also being able to contribute themselves, building a “culture of good feedback practice” (Marshall et al., 2015).

Challenges in operationalising student feedback are acknowledged in the literature (for example, Arthur, 2009; Blair and Noel, 2014; Golding and Adam, 2016). The effective translation of feedback into enhancement activity is a complex process (Arthur, 2009), persistently challenged by factors including staff emotional reactions (Moore and Kuol, 2005), students’ abilities to evaluate their learning experiences (Stein et al., 2012), the interpretation of statistical information (Boysen et al., 2014; Boysen, 2017), the institutional context (Palermo, 2013) and the role of staff judgement (Golding and Adam, 2016). Despite these challenges, considering staff interpretation of MEQ results alongside the unfiltered student results has the potential to offer an increased understanding of the impact of the Module Feedback Cycle on improving the student experience of teaching and learning.

To understand the types of enhancement activity generated through standardised reflection on feedback, the University’s Student Insight and Sector Policy team is evaluating the resulting data. This currently spans three trimesters across 2019/20 and 2020/21, covering the periods before and during Covid-19 and the introduction of a blended teaching model.

The impact of our approach can be demonstrated by the results for the MEQ statement on marking criteria being made clear in advance, historically our lowest-ranked result within the MEQs. The evaluation has found that improving this area is one of the most frequent types of enhancement activity proposed by staff in their reflections on MEQ feedback. Staff are recognising the importance of explaining and signposting the assessment criteria early in the module. They aim to provide more detail where students find the criteria too vague or, conversely, to simplify them where they are too complex. They are also covering the marking criteria in a variety of ways, including video formats, lectures and dedicated sessions, as well as conducting checks for understanding of the criteria and uploading them to an accessible place in the virtual learning environment. This sustained focus has produced incremental increases in student satisfaction each trimester, an overall uplift of 7.6% by 2020/21 trimester 1 from the lowest position in 2017/18.

A further priority that has emerged for staff enhancement activity is developing and improving approaches to group work. Student agreement with the statement on having the right opportunities to work with other students has trended downward in the University of Hull’s MEQs over the course of Covid-19. The staff response is twofold: first, to address concerns that group work has been challenging and implement practices that improve these experiences to suit the mode of delivery going forward; and second, to identify opportunities for more student collaboration and interaction with peers. This speaks to the University of Hull’s transformation of its programmes, equipping our students with the skills needed to navigate the fourth industrial revolution, with human interaction and team working sitting within core programme competencies.

The student experience of blended teaching and learning is the newest area being evaluated in the MEQs. MEQ feedback is encouraging staff to develop the most effective online delivery of modules, both as an immediate response to the pandemic and as part of a digitally enhanced future. This includes improving peer feedback and peer communication, producing interactive content, recording more presentations, using quizzes, sharing student-created content, improving the accessibility of software off-campus and utilising discussion boards. Enhancement themes relating to the timely provision of online resources and methods to encourage continued engagement have accelerated in response to student comments about their experience of online delivery and the importance of preparation for synchronous sessions.

Whilst we are still in the early stages of observing the impact of closing the loop on student engagement with feedback processes, and of evaluating the application of feedback to teaching practice, positive links are being drawn between student feedback and staff response. Our NSS and internal programme-level ‘Hull Student Survey’ (HSS) results for the “Student Voice” section are on a year-on-year upward trajectory, and our NSS 2020 results were above the sector average. This initial progress, coupled with positive staff engagement with closing the loop, is an encouraging indicator of our inclusive approach to module enhancement.


1 The University of Hull uses the ‘EvaSys+’ system, provided by EvaSys, to administer the closing the loop process for MEQs.

2 Based on the average percent agree calculated for Higher Education Institutions. Further information on the NSS is available from the Office for Students website.

3 Available to University of Hull internal staff only.


Arthur, L. (2009) From performativity to professionalism: lecturers’ responses to student feedback, Teaching in Higher Education, 14:4, 441-454. 

Blair, E. & Noel, K. V. (2014) Improving higher education practice through student evaluation systems: is the student voice being heard?, Assessment & Evaluation in Higher Education, 39:7, 879-894.

Boysen, G. A., Kelly, T. J., Raesly, H. N. & Casner, R. W. (2014) The (mis)interpretation of teaching evaluations by college faculty and administrators, Assessment & Evaluation in Higher Education, 39:6, 641-656.

Boysen, G. A. (2017) Statistical knowledge and the over-interpretation of student evaluations of teaching, Assessment & Evaluation in Higher Education, 42:7, 1095-1102. 

Golding, C. & Adam, L. (2016) Evaluate to improve: useful approaches to student evaluation, Assessment & Evaluation in Higher Education, 41:1, 1-14. 

Harvey, L. (2011) The Nexus of Feedback and Improvement, in Nair, C. S. & Mertova, P. (eds.) Student Feedback: The Cornerstone to an Effective Quality Assurance System in Higher Education. Oxford: Chandos, 3-26.

Hoel, A. & Dahl, T. (2019) Why Bother? Student Motivation to Participate in Student Evaluations of Teaching, Assessment & Evaluation in Higher Education, 44:3, 361-378.

Marshall, S., Fry, H. & Ketteridge, S. (2015) A Handbook for Teaching and Learning in Higher Education, 4th edn, 123-138. Routledge.

Moore, S. & Kuol, N. (2005) Students evaluating teachers: exploring the importance of faculty reaction to feedback on teaching, Teaching in Higher Education, 10:1, 57-73.

Palermo, J. (2013) Linking student evaluations to institutional goals: a change story, Assessment & Evaluation in Higher Education, 38:2, 211-223.

Shah, M., Nair, C. S. & Richardson, J. (2017) Closing the loop: An essential part of student evaluations, in Measuring and Enhancing the Student Experience, Chapter 10. Chandos Publishing, Elsevier.

Stein, S. J., Spiller, D., Terry, S., Harris, T., Deaker, L. & Kennedy, J. (2012) Unlocking the Impact of Tertiary Teachers’ Perceptions of Student Evaluations of Teaching. Wellington: Ako Aotearoa.

Watson, S. (2003) Closing the Feedback Loop: Ensuring Effective Action from Student Feedback, Tertiary Education and Management, 9, 145-157.
