“He said he did it because he wanted to show me off. Then I realised that when I’d met his friends, they must’ve already seen how I looked naked. What kind of sick fucks would be able to do that and even look me in the eye?”
It was back in high school that my friend learned her boyfriend had been secretly recording intimate images of her and passing them around. She only found out because he got into an unrelated argument with a friend, who, in a vengeful mood, decided to tell her what was going on. Even in discovering the abuse, she was disempowered – used as a pawn in their juvenile spat. It’s been years, but her expression is still bitter as she grips her mug tightly, studying the smattering of cocoa topping her cappuccino as if she’ll find the answers to her hurt there.
As a victim of image-based abuse, she’s not alone. The wonders of technology mean that image-based abuse (IBA) now takes a plethora of forms, from garden-variety perverts sharing Snapchat screenshots of their girlfriends in group chats, to those who make a living monetising the bodies of people they don’t know at all. In 2017, Leigh Abbot used Tinder to convince a dozen women to send him nudes in a con that lasted months, then used those photos to blackmail them into handing over their life savings.
When I think about the hurt that IBA has caused people in my life, I am, of course, angry at the perpetrators, but I am increasingly angry at those the perpetrators performed for. What clout, profit or views can be gained without an eager audience? When supermodel Kate Upton’s nudes were leaked in 2014, what allowed prepubescents halfway across the world to eagerly participate in the consumption and shaming of her body, other than the fact that everyone else was doing it? Why would a Snapchat screenshot be shared in a group chat, if not in anticipation of a round of congratulatory slaps on the back from the boys?
The combination of cultural acceptability and a lack of consequences creates an environment in which IBA proliferates – and COVID-19 has made it worse. As human life has shifted online and more people turn to the Internet to fulfil their needs for intimacy, the opportunities to participate in this behaviour have multiplied. The eSafety Commissioner’s annual report, published two weeks ago, revealed a 370% spike in reports of IBA received by its office since the beginning of the pandemic.
So why is IBA growing so rapidly? I invite you to consider two complementary situations.
1. The year is 2014, and the ‘Celebgate’ leak has just released a huge collection of celebrity nudes onto the Internet. You’re at lunch and it’s the hot topic. One adventurous soul clicks on the Reddit link, and you all crowd around the screen for a look.
2. The year is 2020, and ‘the mood’ hits. You open up an incognito window (or just a normal window if you don’t anticipate anyone typing “P” into your search bar) and in an instant the familiar yellow and black logo pops up. The first result in the Hot category is a “2020 LEAK” featuring a familiar face. You decide to check it out.
These experiences are so common that they could almost be branded part of the modern pubescent experience for many in our generation. But they are also prime examples of the insidious ways IBA has ingrained itself into our culture.
For many, celebrity nudes are the gateway into habitual consumption of IBA products. When I mentioned this article to a friend, he said that he’d looked at Kate Upton’s nudes in Year 7 “out of curiosity”. How did we end up at a point where kids barely into their teens think that consuming the products of IBA is acceptable? The answer lies in the widespread dehumanisation of celebrities in modern culture.
The Paris Hilton sex tape 1 Night in Paris was a cultural watershed moment in the normalisation of IBA. Filmed when Hilton was 19, it was released on DVD (complete with a special 2-disc Collector’s Edition) by her ex-boyfriend Rick Salomon three years later, in the wake of her TV debut. It was everywhere and everyone watched it – even Donald Trump boasted about watching it with Melania. In an interview this year, Hilton described the tape’s release as being “electronically raped” and as “something that will traumatise [her] for the rest of [her] life”. In an ugly turn of events that has become expected whenever something bad happens to a celebrity, the public turned on Hilton and accused her of orchestrating the abuse in her pursuit of fame. Society presumes an implied licence to access intimate details of a celebrity’s life as a form of “social tax” on fame, but this event marked a dangerous extension of that presumption into the socially acceptable derogation of an abuse victim through the power of herd validation – “it’s okay since everyone else is doing it”.
For those whose consciences impede them from looking at authentic celebrity nudes, IBA has produced a lite version that is seemingly a little gentler on the moral compass – celebrity deepfakes. The suaver, more polished older cousin of cutting and pasting a celebrity’s head onto a Playboy spread, deepfake technology uses artificial intelligence in the form of generative adversarial networks (GANs) to produce hyper-realistic fake videos. There are whole websites dedicated to celebrity deepfakes – at the time of writing, a deepfake video of 19-year-old TikTok star Dixie D’Amelio on one such site has amassed nearly a million views. Deepfakes soothe the viewer’s conscience with the assurance that they aren’t seeing something real, but fake intimate images can wrest away control of a person’s bodily autonomy just as real ones can, especially when they are hyper-realistic.
Another way that IBA casts its wide net is through the platforms people use to access porn on the Internet. Between October 2019 and March 2020, the adult entertainment sites XVideos and Pornhub received an average of 3.14 billion and 2.85 billion monthly visits respectively, outranking Netflix’s 2.21 billion. However, these giants of the online porn industry have a terrible track record when it comes to facilitating IBA. In November 2019, XVideos hosted a recording of the murder of Dr Priyanka Reddy, who was sexually assaulted by multiple men and burned alive. The video remained on XVideos for weeks and even became one of the site’s trending videos.
Pornhub has also hosted and profited from videos depicting victims of sexual abuse. In February this year, 25-year-old Rose Kalemba came forward with a harrowing account of being kidnapped and raped as a 14-year-old and having videos of her assault uploaded to the site. Her repeated requests for the videos to be taken down went ignored until she sent an email posing as a lawyer and threatening legal action. On platforms where violence against women features in almost every video to varying degrees, it can be hard to distinguish roleplay kink from nonconsensual abuse. The availability of real abuse videos on hugely popular platforms gives people a seamless segue into unwittingly consuming the products of IBA – or, worse, consciously doing so while waving the free pass of “curiosity”. On a broader scale, since these platforms’ profit models are built on views, watching any video on them amounts to supporting firms that profit off sexual abuse.
IBA has thus made itself hyper-accessible, presenting its products in insidiously palatable forms and integrating itself into our lives in a way that normalises and encourages complicity with abuse. The sheer mundanity of consuming these products evidences a disturbing cultural phenomenon in which facilitating IBA is the default and opting out is hard. One especially concerning consequence of this normalisation is that offenders can be blissfully unaware of the harm they have caused. My friend tells me that her abuser, displaying a staggering lack of self-awareness, approached her a year later to invite her to his school formal. “What that told me was that he never really understood how much he had traumatised me. I felt so powerless.”
The normalisation of IBA is especially harmful because, unlike other forms of abuse that revolve around the perpetrator-victim nexus, IBA necessarily involves a crucial third party: the audience. When we spoke, my friend told me, “I remember sitting outside the classroom during that lunch break taking that call when I found out what was happening. I was obviously really hurt because he broke my trust, but at that point I didn’t even give a fuck about him, I was only concerned with how to contain the damage. All I could think about was how disappointed my parents would be and how this would follow me around for my whole working life. It was like I was in survival mode.” Research has shown that, regardless of whether an intimate image was captured or distributed consensually, its subject is viewed as promiscuous – a perceived negative attribute. This social stigma is precisely what gives the perpetrator their power.
My friend noted that fear of negative judgement from her parents, teachers and friends stopped her from ever reporting the abuse. This fear of social stigma has led to gross underreporting of IBA, which is all the more troubling because IBA disproportionately affects vulnerable groups: in a 2017 study, a staggering 1 in 2 Indigenous Australians and 1 in 2 people with disability reported having experienced IBA. When perpetrators target those who are already vulnerable, they do so knowing that it will be even harder for their targets to seek justice.
The cultural barriers to legal recourse are compounded by a lack of awareness of the options available to victims. Few people know that the eSafety Commissioner has powers under the Enhancing Online Safety Act 2015 (Cth) to issue removal notices to perpetrators or hosting platforms, requiring nonconsensually shared intimate images to be removed within 48 hours. Noncompliance with a removal notice carries a civil penalty of 500 penalty units, equivalent to $111,000 at current Commonwealth penalty unit values. However, the eSafety Commissioner presents victims with confusing advice.
In a recent public messaging video on adult cyber abuse, which it defines to include IBA, the eSafety Commissioner advises victims to first block the perpetrator, then contact the platform to request that the content be taken down. Only if the platform doesn’t resolve the complaint is a report to eSafety recommended, with no guidance on how long victims should persist with the platform. This places a considerable burden on victims to address the abuse themselves – an incredibly confronting and isolating process, made all the more distressing by having to repeatedly recount fresh trauma to strangers on a customer helpline. However, on a separate page dedicated to IBA, the eSafety Commissioner recommends collecting evidence and making an immediate report to eSafety. Between reporting to eSafety under civil law, to the police under criminal law, and directly to the platform, this conflicting advice from a peak body makes an already confusing legal landscape even harder for victims to navigate.
Should a victim decide to report the abuse to the police and pursue action under criminal law, they may face obstacles of a completely different kind. Accounts from victims who have reported IBA to police reveal that officers often convey victim-blaming attitudes during the interview process, giving responses like ‘what did you expect online?’ Studies have also shown that police forces prioritise directing resources towards offences with a physical dimension over “virtual harms”, meaning that reports of IBA are taken less seriously.
The rise of IBA thus comes down to its normalisation combined with a lack of serious accountability for perpetrators, both of which stem from social attitudes towards victims. This primes people to push the boundaries of acceptable behaviour, backed by herd validation and the knowledge that their victim might never report them.
IBA’s prevalence is attracting growing public concern, and the legislature is responding. According to a discussion paper released earlier this year detailing proposed changes to the Enhancing Online Safety Act, the permissible period for content takedown would be reduced from 48 to 24 hours in a Bill yet to be drafted. While this is important infrastructure to have in place for victims, it shifts the policy space further towards responsive measures, playing catch-up with the mass of digital content constantly being churned out. IBA is a multifaceted problem that has planted deep roots in society, and as such it requires a holistic response that addresses the cultural source of the issue.
The narrative around IBA needs to shift away from imposing culpability on victims, and instead force people to acknowledge the consequences of what they do when nobody else is watching. Advising Year 10s in cybersafety seminars that they can effectively protect themselves against IBA if they avoid sharing nudes is akin to proposing abstinence as a solution to sexual assault – it denies the reality that abuse is perpetrated by the abuser, not by the victim. As a society, we need to start initiating difficult conversations and taking ownership of our complicity as passive observers.
The normalisation of IBA has lulled us into a state of collective cognitive dissonance: we denounce abuse in public, then retreat into our beds at night to gleefully rewatch and relive records of abusive behaviour bearing stamps of public approval. In a society where it is not only acceptable but normal to perpetrate abuse and to enjoy looking on, while victims are blamed for the hijacking of their own bodies, it is not enough to mourn the state of affairs and give stern talks to young girls about trusting boys. Each click on abusive content perpetuates this toxic culture. Only we can control what we do when we’re alone.