In 1990, a toaster was connected to the Internet. In the 2000s, television sets and fridges were made smarter and more connected. The 2010s brought the growth of the Internet of Things, previously ordinary objects such as lightbulbs, showers, beds and smart speakers connected to the internet, and by 2014 the number of connected devices had surpassed the number of people.
This level of connection demands a new level of voluntary surveillance. We are living in a future forecast by science fiction. However, where fictional citizens were involuntarily controlled by an omniscient power, we have willingly brought that power into our pockets, homes and workplaces, eagerly clicking “I Accept” when asked, without considering the repercussions.
Research by the Consumer Policy Research Centre found that 94% of Australians do not read the privacy policies that apply to them. The time-consuming process of unpicking the wordy, jargon-heavy terms we agree to is skipped because we rarely have another option: we need the software in question for work, socialising or university. These unread privacy policies have enabled technology corporations like Amazon, Apple, Facebook and Google to track every digital move we make, and now our physical ones as well.
We brought smart devices into our homes and workplaces to make the mundane easier: playing music, checking the weather, setting alarms. This was initially met with an understandable paranoia, which has faded as these products have become increasingly commonplace.
In a 2019 letter to a US senator, Amazon admitted that audio recordings from Alexa and Echo devices were retained until users requested their deletion, and that even then some transcripts remained, stored separately from the device, in an effort to improve Alexa through machine learning.
It is through this supposed “machine learning” that devices become smarter, as workers at Amazon and other companies around the world process the recordings to help voice assistants learn when a French speaker is saying “avec sa” rather than Alexa, or when a Spanish speaker may be saying “hecho” rather than Echo.
Amazon’s foray into video recording through its digital doorbell, Ring, extends its reach into our lives and our data, as live video and facial recognition track anyone who walks up to or past your door. The concerns surrounding video surveillance are wide-reaching and increasingly pressing, underlined by the existence of the website Insecam, which offers a global live feed of unsecured cameras. IP cameras have long been subject to breaches: everything from baby monitors to digital doorbells has been temporarily taken over, giving attackers access to the video stream and speaker.
In buying and using a smart speaker or digital doorbell, a consumer consents to the surveillance it entails. It becomes more complicated, however, in environments you don’t control. Google’s Rick Osterloh has admitted that visitors should be informed about the presence of smart speakers.
Without structural change, privacy-conscious citizens are forced to resort to measures such as counter-wearables: scarves, jackets and jewellery designed to avoid or minimise the tracking of the wearer. You can buy reflective glasses that conceal your eyes on camera (by deflecting visible and infrared light), RFID-blocking jackets and wallets, clothing that masks your heat signature from drones, makeup that attempts to trick facial recognition, or a bracelet that interferes with the recording capability of nearby microphones (by emitting high-frequency signals so that recordings capture white noise instead of your voice). These reactive measures aim to give users the means to protect their physical and digital identity, much to the chagrin of technology companies. But the consumerist nature of these products means individuals are forced to keep developing and maintaining a pseudo-digital armour.
Whilst both Amazon and Google rely on opt-out systems, requiring a user to withdraw consent for their recordings to be kept, Apple uses an opt-in system in which users choose whether to share their dictation recordings to improve accuracy. Opt-in systems make privacy the default, a principle shared by the European Union’s General Data Protection Regulation (GDPR) and the United Nations’ recognition of a right to digital privacy. Schemes like the GDPR have set a precedent in favour of digital privacy and protective regulatory measures.
So, where do we go from here? It is easy to become apathetic towards privacy policies and digital surveillance under the guise of having already revealed too much. But we must confront the complex tension between embracing the benefits of technological advancement and protecting our digital lives.
The path ahead is best put by James Bennet, an editor at The New York Times: “Rather than hurriedly consenting to someone else’s privacy policy, it’s time for us to write our own.”