The idea of a “living laboratory” conjures images of lab rats and test tubes; an ethically compromised zoo, perhaps, or a dystopian Dexter-esque scientific experiment. But a university campus?
Yet this was precisely the term used by Ian Callahan, the Chief Operating Officer of Curtin University, to describe his campus’ transition to new technology earlier this month. “We are effectively creating a living laboratory that is an open invitation to our own researchers and scientists from other universities to use our campus to discover and innovate with data-driven research,” he said. That this data is derived from students and staff — making us, in many senses, the lab rats — was not mentioned.
Callahan’s statement reflects the way Australian universities are increasingly using surveillance technologies like sensors, wi-fi, CCTV, face-matching platforms and learning analytics to become “smart campuses”. As facilities increasingly move online, technology rapidly evolves, and the availability of data escalates, personal information and markers of student behaviour are all ripe for the taking.
Privacy is the obvious concern, and the extent to which students have meaningfully consented to what’s happening is dubious. Common justifications like “safety and security” and “enhancing the student experience” may well reflect the genuine goals behind many programs. But they also operate as rhetorical cover, masking the real dangers of collecting, for example, incredibly sensitive biometric data that can be used to track you.
Also embedded within these technologies are more insidious implications for how power is exercised. Humans moderate their behaviour and conform to expectations when they know they’re being watched, and the university campus is no exception.
Callahan’s remarks were made in the context of Curtin University’s use of ‘Internet of Things’ (IoT) technology: a suite of new technologies will be installed on the campus so that data can be gathered and combined from multiple facilities. Video analytics, sensor tracking and live face matching, for example, will allow Curtin to generate information about “the lifecycle of the student, the day to day reality of a staff member, the activity pattern of a lecture theatre,” and so on, according to a statement.
Both Deakin University and the University of Melbourne are similarly using their wi-fi networks to develop maps of students’ locations in real-time. In August 2016, Melbourne’s Head of Services Paul Duldig told the ABC: “We’re not breaching [students’] privacy because we don’t know who they are … At this stage we’re not [tracking individuals] so we don’t think it would be required to notify people.”
But these capabilities immediately evoke the problem of ‘function creep’: the potential to gradually extend the use of a technology beyond the purpose for which it was initially intended. In response to the University of Melbourne’s plans, Dr Adam Henschke from ANU’s National Security College said: “Once you start getting this information you can go, ‘oh, this information is quite useful for a bunch of other things, let’s use it for these further purposes’. Then you get into even deeper concerns about informed consent and misuse of personal information.”
Fast-forward seven months: Australian tech website iTnews.com.au reported that the University of Melbourne was considering a plan to manipulate students’ movements based on their location by tampering with their wi-fi access in certain areas of campus.
At the Cloud and DC Edge Summit in March 2017, of which there is no public record, Melbourne University’s data centre and facility services manager Will Belcher said: “Deans of some faculties are not happy that their libraries are being filled with all sorts of students who aren’t necessarily in their faculty… They want to try and use the data from the wireless access points and controllers — we track [wi-fi usage] against each student’s login name — to determine what faculty they’re in.”
The students who belong to the faculty could get superior wi-fi coverage, “so as to gently dissuade the [other] students and steer them back to their own libraries.”
Setting aside the ethics of unpleasant design and the sinister prospect of covertly controlling students’ movements, this contradicts the University’s earlier claims that privacy concerns were moot because wi-fi information was anonymised. More importantly, it demonstrates that ‘function creep’ is already a problem.
The University of Melbourne was considering a plan to manipulate students’ movements based on their location by tampering with their wi-fi access in certain areas of campus
When Honi contacted the University of Melbourne for comment on the matter, Director of Technology Management Daniel Buttigieg said Belcher’s quotes had been taken out of context in the report. “The University is not actively looking into that,” Buttigieg said. “But obviously the abilities are there if somebody wanted to go and do that.”
Wi-fi and sensor tracking are just some of the ways universities monitor students’ movements in physical space. Increasingly sophisticated CCTV networks around campus also ensure you are constantly being watched.
“There are cameras everywhere,” says Kurt Iveson, an associate professor of urban geography at the University of Sydney. “There are cameras on the top of [buildings], there are cameras in the lifts going down to car parks, into buildings, and the extent of data that is just being captured as part of [this] surveillance network is really ballooning out”.
An indication of the network’s extent is the burgeoning amount of space required to store all of its footage. While it did not divulge information in detail, the University of Sydney confirmed that it now contracts out its data storage facilities to a third-party supplier off campus. At the University of Melbourne, iTnews reported that there were plans to convert as many as 150 communications rooms into micro data storage centres, in order to accommodate the escalating collection of higher-resolution CCTV footage.
Buttigieg told Honi: “The current resolution we have is high definition which is quite good. It’s just the expansion of CCTV and having more cameras in more areas [requires the extra space].”
Which raises the question: why is further expansion of video surveillance on campus necessary?
Honi asked a University of Sydney spokesperson to provide evidence or statistics that point to the usefulness of CCTV as a preventative security measure. In response, they said: “University CCTV footage has been used to identify offenders on campus … [and return] stolen property … There is also a qualitative benefit of having CCTV in that it increases the feeling of safety and security, however it is not a stand-alone tool and therefore cannot be measured as such.”
“[But] how is burglary prevented?” asks Diarmuid Maguire, a senior lecturer in the USyd Department of Government and International Relations. “Has burglary stopped because of all of this? In terms of personal safety, I don’t see how that is prevented at all.”
Meanwhile, higher resolution technology offers an enhanced opportunity to use CCTV footage for other purposes, such as facial tracking. “I have heard of other universities doing it for a number of different purposes,” Buttigieg said. “Some of them use it to count the number of people going in and out of a library for example, so they can track usage patterns. We aren’t actively doing that at the moment, but obviously the technology is there to be used in the future.”
Curtin University is looking to the “significant role” facial recognition can play for campus safety and convenience, citing, for instance, its potential to displace keys in the foreseeable future.
Swinburne University integrates analytics programs with its CCTV cameras, allowing the technology to automatically detect unusual behaviour and recognise number plates, and might add facial-matching and heat maps to its capabilities. In an iTnews report, Swinburne’s IT security specialist Chris Goetze said: “There are persons of interest who we would like to know if they’re on site or not. That’s where facial recognition would be handy.”
At present, Swinburne’s camera-matching Snap Surveillance platform still allows people to be tracked in real-time. “If you want to follow ‘person of interest A’ across the campus, this tool comes into its own then because you find your suspect on the camera, and it shows you nearby cameras they might walk towards,” Goetze said.
Trying to capture and bust crime after the fact is just one use of the University’s blanket surveillance machinery. “The other way that digital technology gets used in the landscape is to actually sort different population groups into their appropriate spaces on a campus. There are more and more spaces on campus that are kind of ‘behind locked doors’, [which means] it’s getting increasingly difficult for us to access the workplace,” said Iveson.
“The question also becomes: who’s writing the code that determines who has access and who does not? And how open is that process to accountability towards the University community or those who are affected by it?”
Curtin University is looking to the ‘significant role’ facial recognition can play
Video surveillance has also historically operated as a tool of control, which relates to the age-old warning that people’s behaviour fundamentally changes when they know they’re being watched. “The recording of lectures in and of itself makes you very careful about what it is that you say and what it is that you don’t say,” Maguire said. “You’re very conscious that there’s a third element in the room.
“Now you’re being recorded visually. Why should the lecture theatre be monitored? It’s to keep an eye on me as an instructor, to keep an eye on the business I’m doing, and to keep an eye on you as the client.”
The implications of surveillance for protest and civil resistance in this context are particularly stark. “When the police are on campus they always turn up with cameras and go right up to you and film everybody that’s there,” Maguire said. “You know you’re being filmed, and that automatically reduces the amount of people that are willing to take part in protest.”
According to Iveson: “The classic example of the function creep is when protesters are walking through the city, [surveillance] cameras are on them and police have the right to request the City of Sydney hand over their [CCTV] footage because ‘we’re worried about some incident’. Suddenly they’ve got a record of everybody’s attendance.
“When marches are happening through campus, I don’t know how those cameras are working. I wouldn’t have a clue whether there are data sharing arrangements with the police that extend to ‘we’re trying to identify a ratbag activist that we want to stop.’ So that’s the stuff you just worry about.
“It’s sort of next level when you read all the paranoid surveillance studies about [how], when you apply CCTV, you can integrate it with facial recognition and literally red flag individuals and be tracking their movements all over campus. It’s not inconceivable with the tech that that could happen.”
Surveillance and data collection on campus extend beyond the physical realm: universities around the world are also adopting increasingly sophisticated analytics within the teaching and learning space, accounting for another way in which your data and online behaviour are monitored.
ELearning platforms gather information about when you access online materials and how that correlates with your marks, how many minutes you spend listening to lectures, and your results from previous subjects, all of which is made available to teaching staff. Unit co-ordinators can generate reports on students’ collective age, fee status, languages spoken at home, whether they’re first in their family, and so on.
While the University maintains that all of this information is useful for teaching purposes, some members of staff remain unconvinced.
We, the instructor, all of a sudden find ourselves put in a position of too much potential power
“I don’t believe it’s the role of the instructor to access blackboard and get a hold of all this. That’s between you and the faculty,” Maguire said. Students already face pressure to cater to their tutors and professors, and with access to this data, “we, the instructor, all of a sudden find ourselves put in a position of too much potential power in relationship to the students. This is an additional power I [now] have over you. Why should I know about the fact you’ve failed other classes, for example?”
“As much as we try to keep our marking unbiased, there is no question that knowing how students access optional resources and the time they spend studying would subconsciously affect marking,” another member of teaching staff told Honi. “I can’t see how I could know these details about a student’s course engagement and honestly assign them an objective mark — it seems not only unethical, but unrealistic to think otherwise.”
The University’s recently established Data Analytics Research Group is investigating how this data can be analysed for both research and operational purposes. By the end of this year, it aims to have started cultivating an “educational data bank,” according to Co-chair Dr Kathryn Bartimote-Aufflick. This data bank will start connecting different data sets of student information that people have not had access to before.
The University insists that it will be working closely with its privacy commissioner to address concerns that will inevitably arise; however, questions of retention and students’ meaningful consent remain pressing.
“Often there’s a seven year record keeping number that goes with ethics approval,” Bartimote-Aufflick told Honi. “But in terms of longitudinal analyses and a historical database that could be potentially useful, I think it would be worth keeping them longer.”
Whether students have meaningfully consented to the use and storage of their data for a prolonged period is up in the air. While all students scroll down and click ‘agree’ on terms and conditions about their enrolment, presumably few read the fine print and consider its implications. Moreover, many would expect their information’s relevance to lapse with their enrolment.
“[With] some of this data, I don’t think explicit consent is always required,” Bartimote-Aufflick said.
One way of mitigating concerns about consent is to make these programs strictly opt-in; however, this looks unlikely to be the case. “I think we would set this up as an opt-out basis,” Bartimote-Aufflick said. “As soon as you take away broad scale data that reflects who was in the environment, what was going on, and all the interactions, the insights are diminished… If you have 10,000 students and only 500 opt in, you don’t get a clear picture of what’s going on.”
The biggest risk of digitising anything is potential disclosure, according to web security expert Troy Hunt.
“You start to have a huge amount of deeply sensitive personal information that describes people’s behaviours, and possibly genetic biometric data,” Hunt told Honi. “What would it mean to lose all of that, or to have all of that made public?”
“[Universities] are subjected to hacks nonstop all day, every day,” Hunt said. “Anything that is publicly accessible is, particularly universities where you’ve got a whole bunch of people that are quite smart and have easy access.
“The most important question is how often [breaches and hacks] are successful. If you just do a Google search, you’ll see stuff all over the place. It’s certainly happened in Australia in the past as well. Certainly these attempts are happening all the time. There’s a certain amount of incidents we hear about, and a lot we don’t. Frankly, there’s a lot that the Universities themselves would never even learn about, or never even realise.”
USyd itself is no stranger to security debacles. In February 2015, thousands of students were informed by the Dean of the Faculty of Arts and Social Sciences that their personal information may have been accessed by an unknown party and could be in the hands of hackers. A 16-year-old hacker, who goes by the alias Abdilo, claimed responsibility for the information security breach, in which the data of 5,000 students was left insecure. At the time, Abdilo told Honi he had little trouble accessing the information, and rated the university’s database security as a “0” out of 10.
The risk we have as soon as we digitise anything is the risk of disclosure. What would it mean to have all of that made public?
In March last year, the University again contacted select students after a laptop containing unencrypted confidential personal information from the University’s Disability Assist Database was “lost in transit.” A University ICT staff member left the laptop on a bus, potentially compromising the disability diagnosis, names and contact addresses of nearly 7,000 current and former students.
As an increasing amount of data is aggregated and stored, what risk are we willing to take with sensitive personal and biometric information?
Despite the swathe of security and privacy concerns that come with the surveillance and collection of data, many students’ reaction is one of indifference. “It’ll be interesting to see if there is broad scale concern,” Bartimote-Aufflick said of the University’s data analytics research. “In some ways I suspect there might not be.”
“Your generation has more or less accepted this,” Maguire said. “My generation was sort of like, ‘we’re not used to this,’ and when we see it cropping up we go, ‘this is bad’. But your generation has been brought up in a situation in which this has always happened. [You have] also been brought up under terrorism, so you’re used to [the instruments of surveillance] that have become very ramped up in Western society.”
In a context where students give their data to an array of corporations every day in exchange for convenience, this trajectory on University campuses aligns with the status quo.
“The University is the norm, rather than the exception,” according to Maguire. “Universities are no longer ‘ivory towers’, separate from the rest of the community. [They] are behaving just like any other corporation that exists. Education is the fourth largest business in this economy, writ large. That’s the fact.”
And so students’ frequent response is one of reluctant compromise: “I’ve come to expect it as a reasonable exchange. While I’m not always comfortable with the level of surveillance, I reluctantly accept it as a necessary evil for my education,” said UTS student Jack Bresnahan.
But these measures still strike us as eerie and perhaps unsettlingly familiar. It’s not by coincidence that omnipresent and omniscient surveillance is linked with the horror of dystopian fiction; 1984 is imprinted in society’s collective imagination as a notorious warning sign for a reason. The capacity of technology to render subjects docile and obedient is a power that is thoroughly theorised.
The increasing propensity of universities to monitor students’ and staff’s every move, and the normalisation of this activity, bolsters their ability to manipulate behaviour. Given the evidence of function creep that already exists, the living laboratory in which we are the test subjects could be more sinister than it’s made out to be.