Faculties reluctant to share grade data

Of the 11 faculties Honi contacted, only five provided data.

The University prides itself on using “standards-based assessment”. This is distinct from fitting grades to a predetermined distribution, a practice colloquially known as ‘scaling’. At USyd, there are no target failure rates or score distributions; in theory, every student can fail or ace any unit of study depending on their achievement of the prescribed unit outcomes.

Beyond this non-prescriptive approach to assessment, there is little consistency across the University in how grades should be distributed or how grade data is handled. According to Ainslie Bulmer, Executive Director in the Office of the Deputy Vice-Chancellor (Education), aggregate grade data for each unit of study is held by individual faculties and never passes through the University’s central system. “It’s not data that I would be able to pick out of any kind of central university system.”

It is also, as Honi discovered, data that some faculties are stubbornly unwilling to relinquish.

The faculties of Business, Engineering, Health Sciences (FHS), and Medicine were happy to produce data regarding failure rates and average marks when contacted by Honi (see table).

The Faculty of Dentistry initially said they were “not at liberty to provide data on failure rates or median marks for 2016”, but did so following further prompting.

[Table: Average grade data]

But the other faculties either refused to produce any data, citing a variety of reasons, or did not respond at all.

The Architecture faculty claimed unit of study data was “sensitive and proprietary information”; the Science faculty took 11 days to determine that it couldn’t produce the data in time for publication; and the Faculty of Arts and Social Sciences said the data could not be produced from its results processing system. The Nursing and Pharmacy faculties simply didn’t respond.

Meanwhile, the Law School claimed it did not even calculate the failure rates for individual units of study. This is despite Deputy Vice-Chancellor (Education) Pip Pattison saying, “Individual unit of study coordinators would need to be aware of failure rates and try to understand the reasons for it. It’s an important piece of feedback on the design of the unit.”

Only FHS provided data for individual units. That reluctance is perhaps unsurprising: publishing average scores and failure rates for every individual unit of study can easily invite misconceptions.

“As a simple example, I can readily conceive of two units that are taught equally well having quite different grade outcomes due to a range of different factors (prior preparation, the extent to which the unit aligns with students’ interests etc),” said David Lowe, Engineering & IT Associate Dean (Education). “And yet a superficial consideration may unfairly put the grade variation down to teaching quality.”

This may justify refusing to share data on individual units of study, but whole-faculty averages do not carry the same risks. Yet more than half of the faculties contacted either didn’t respond or refused to supply even this data. Those that refused cited roadblocks other faculties had already surmounted, so their excuses ring somewhat hollow. Fear of reputational damage (from surprisingly generous average scores, for example) is one competing explanation.

But refusing to supply any data at all comes with its own dangers.

The most obvious is that new students are left without any indication of what constitutes a ‘normal’ score for their course. Given that the students most likely to share their marks are high achievers, this lack of data only exacerbates stress and feelings of inadequacy.

In both the Law School and Business Faculty, rumours circulate of failure rates in excess of 30 per cent. The data provided by the Business Faculty goes a long way to extinguishing that fear while the Law School’s opacity only leaves its students in limbo. Allowing the chance of failure to be exaggerated is a rather grotesque way of motivating students.

This secrecy is not only a problem at USyd. Other Group of Eight (Go8) universities similarly do not publish average marks and failure rates for each faculty.

USyd’s administration provides no guarantee that a distinction in one faculty is equivalent to a distinction in another. In reality, such a guarantee would likely be impossible to implement; there are too many factors at play. As Pattison puts it, “you are, on one level, comparing apples and oranges, and that’s tricky.” Supplying aggregate grade data at faculty level, however, allows students to compare themselves with others studying approximately the same courses. That comparison is critical if students are to make realistic, well-informed decisions about their studies and careers.