Each year tens of thousands of higher education students complete the Student Experience Survey. It’s seen as a litmus test of student engagement, satisfaction and educational quality. But do the ways in which institutions and governments try to understand student experiences still add up?
The pandemic has transformed enrolment patterns and the ways in which students interact with their institutions and the courses they offer. We suggest the data from the 2021 survey released today no longer adequately capture students’ experience of study. The current version of the survey was designed for a time when modes of study were more clearly defined than they have become since COVID-19 emerged.
The student survey is part of the Australian Quality Indicators of Learning and Teaching (QILT) suite of measures for higher education. The 2021 report shows ratings are more positive compared to 2020 for younger and internal (classroom-based) students. According to the report, this “can likely be attributed to some return to on-campus learning and also a change in the expectations and experience of students”.
But how are “internal” students engaging in their studies? Does learning look the same today compared to 2019, and should it?
New forms of flexibility in student mode of study have to be matched with new forms of support to enable students to make smart choices. The mode of study categorised as internal for the survey now includes so much variation that it no longer serves a useful function for reporting and analysis purposes.
What we measure matters. Are we asking the right questions re student experience & engagement journeys in HE? A captivating interrogation of QILT data by @LisaBolton_QILT raising the importance of survey evaluation, student expectations & the stories we need data to tell @caullt
— Dr Alyce Mason (@AlyceMa5on) October 29, 2021
Why QILT results matter
Individual higher education providers might use results to:
- set key performance indicators – for example, “by 2030, we will be in the top 3 universities for learner engagement”
- market themselves – “we are the top Australian university for teaching quality”
- undertake evidence-informed planning – “develop sense-of-belonging roadmap to increase scores”.
COVID has changed how we study
The pandemic shone a light on issues of student equity as mode of study shifted (as a recent review showed). Mode of attendance is defined as:
- internal: classroom-based
- external: online, correspondence, and electronic-based (the language used for data-collection purposes shows how outdated it is)
- multimodal: mix of internal and external.

The old hard-and-fast divisions between learning online or in a physical class are no longer appropriate – technology means students can be involved in both at the same time.
5 problems with categorising attendance this way
We have identified at least five problems with the current survey categorisation of modes of attendance:
1. categorising attendance as purely one mode or another, rather than a combination of modes, stifles research and analysis of important national datasets
2. the existing categorisations stifle innovation, limiting institutions from creating distinctive blends of modes of teaching and learning
3. they perpetuate an outdated, either/or mindset that permeates discussion in the sector
4. they mask important implications of differences between new and established modes of attendance, including:
- hidden workloads for staff, leading to questions of burnout and mental health
- unclear expectations for students, which hinders decision-making and effective study approaches
- hidden costs and unclear planning processes for differing modes of study
- lack of clarity about blurred modes of study being offered, which can restrict access to higher education and create obstacles to success for equity students.