Measuring student satisfaction

Andrew Bell and Nick Bonyhady on the web of data collection that informs University policy

Last week, the entire student populace received an email from University Quality and Analytics containing an independent survey called the ‘Student Barometer’. It aims to build a comprehensive data picture of the student experience at the University of Sydney.

Measuring the student experience is particularly important to Sydney because it usually lags behind other Australian universities like the ANU and Melbourne on several rankings of research performance and reputation. The University’s marketing pitch to international students, who are essential to its financial health, reflects the Uni’s perceived strengths: rankings and research are mentioned only twice, while the Uni spends paragraphs on clubs and societies, facilities, orientation events and student services. If the results of the Student Barometer are good, they will reinforce this powerful marketing story by allowing for a positive comparison between Sydney and its competitors. If students are dissatisfied with the Uni, the Student Barometer will present a different message to Uni administration – a message the administration might actually listen to.

Professor Pip Pattison, Deputy Vice-Chancellor (Education), told Honi that survey responses had a “major influence in the development of the University’s new strategic plan” and that they “regularly influence policy development, projects designed to improve students’ experiences and changes to individual units of study.” She said the responses were “reviewed regularly at many levels of the University”. Importantly, she also confirmed that the Barometer gives Sydney an “opportunity to compare our performance with universities in Australia and the rest of the world”.

While it is hard to track the direct impact of survey results, there are some reassuring examples. The Senior Executive Group, which is made up of some of the University’s most powerful administrators, has agreed that “units of study should have no more than 20 per cent of students disagreeing that they are satisfied with the quality of their units of study”. One wonders, though, whether any other multi-billion dollar organisation would tolerate a fifth of its ‘customers’ being unhappy with any of its products. Nonetheless, relative to the impact that student protests seem to have on University decision-making, survey results seem to have some power.

However, when asked whether the survey results will be released to students, Professor Pattison told Honi, “The survey outcomes are made available to staff through regular reports to committees and our survey website; we are increasingly seeking to ‘close the loop’ on all surveys by providing aggregate responses to survey respondents as well.”

Why the University has not committed to ‘close the loop’ by releasing the full results of this survey is unclear. Other surveys, like the University Experience Survey – a national survey of uni students funded by the federal Department of Education – make their results freely available online. Logistics, then, do not seem to be a barrier to greater transparency.

Ironically, despite the fact that students provide the data for these surveys in the first place, the University has long restricted access to its survey results. For example, students can see how the units of study they took in the previous semester performed in the USS, but they cannot see results for any other units. Consequently, they get largely redundant information about the past, rather than being able to make more informed choices about which units to take in future.

Perhaps one reason that Sydney is reluctant to reveal the results of the Student Barometer is that it has not done well in previous major educational surveys. In 2014, on the predecessor to the SES (the largest higher education survey in Australia), the University placed below the national average on 37 of 46 measures of student satisfaction amongst continuing students – that is, about 80 per cent of them. In accordance with national trends, the University’s worst area was student support. For instance, only 20 per cent of continuing students in 2014 were satisfied with the Uni’s efforts to provide “appropriate English language skill support”.

Hopefully the University’s past performance will not deter it from releasing the full results of the Student Barometer this year. For an institution that has committed to a core value of “openness”, it ought not to matter whether the results of the Barometer aid the University’s marketing or not. However, the fact that the two are at least potentially linked should give students some confidence that their survey answers will carry weight.