Every December, millions perform the same digital ritual. We share colourful slides of our Spotify Wrapped, broadcasting our individualised musical milestones to friends and followers. Though it began as a novelty, Spotify Wrapped has transformed into a cultural touchstone: a moment when our private listening habits become public performance. This annual celebration, however, masks an intrusion into our right to privacy: the spectacle and interactivity of digital platforms have quietly transformed personalisation into a vehicle for heightened surveillance. This is digitalisation’s peripeteia: a sudden reversal of fortune that reveals the true nature of our relationship with technology.
Spotify Wrapped doesn’t merely reflect our listening habits. It actively shapes them. Users now consciously curate their listening throughout the year, aware that their choices will eventually be packaged and presented to them. “I can’t listen to that guilty pleasure song too many times, or it’ll show up on my Wrapped,” has become a common refrain. We’ve begun performing not for ourselves or our peers but for the algorithm, adjusting our authentic preferences to ensure a more socially ‘acceptable,’ or culturally capitalised, year-end summary.
This gamification of taste represents a subtle but significant shift in our relationship with culture. Rather than organically engaging with music, we increasingly filter our experiences through the lens of how they’ll be quantified, categorised, and ultimately judged. The algorithm has become both audience and curator, with our participation increasingly resembling a performance for a digital panopticon that never sleeps. We are being called into existence, or “interpellated,” as subjects of algorithmic surveillance who willingly participate in our own monitoring.
What is particularly insidious about this algorithmic interpellation is how it transforms our relationship with culture from spontaneous enjoyment to calculated consumption. Music, once a refuge from metrics and productivity, becomes another sphere of life subjected to optimisation and datafication. We might listen to more obscure artists in December to appear sophisticated, stream fewer Christmas songs to avoid embarrassment, or carefully balance guilty pleasures with critically acclaimed albums. These micro-decisions establish a dangerous precedent: corporate entities can effectively surveil our most intimate habits and preferences as long as they package this surveillance in visually appealing, shareable content once a year. The colourful aesthetics of Wrapped conceal its true function as a massive data collection operation normalised through gamification.
Behind Wrapped’s playful interface lies a sophisticated data extraction operation. Each colourful slide celebrating your ‘top genre’ or ‘minutes listened’ represents thousands of data points harvested, analysed, and monetised. Spotify doesn’t create Wrapped as a gift. It’s packaging your own commodification as something to celebrate and share, extending the company’s marketing reach through your social networks while normalising surveillance capitalism.
This contradiction became particularly stark in December 2023, when Spotify released its annual Wrapped campaign while simultaneously laying off 1,500 employees, 17% of its workforce. CEO Daniel Ek later admitted these cuts “disrupted day-to-day operations more than anticipated,” even as the company reported record profits. In the same earnings call, Ek explained that too many employees were “doing work around the work rather than contributing to opportunities with real impact”: corporate speak that reveals the company’s priorities lie with algorithms and data harvesting over human creativity and curation.
Users noticed. The 2023 Wrapped rollout faced unprecedented criticism, with many describing it as “low effort, ugly, and incomplete.” One user reportedly felt “genuinely embarrassed for the team that worked on it.” This disconnect between Spotify’s business decisions and user experience reveals the underlying tension in digitalism’s promise: while algorithms can process vast amounts of data, they fundamentally lack the human creativity, intuition, and cultural understanding that make genuine personalisation meaningful. The mass dissatisfaction with Wrapped highlights what we already intuitively know: AI-driven automation produces hollow experiences that feel generic and unimaginative compared to human-curated content. Corporate cost-cutting may favour algorithms, but users crave the irreplaceable human touch.
Spotify’s business model illuminates this tension further. The company distributes revenue based on “streamshare,” that is, the proportion of total streams a particular rightsholder controls. This system inherently favours major labels and heavily marketed artists, despite the platform’s promise of discovery and diversity. Meanwhile, the actual payment mechanisms remain opaque, with Spotify itself stating on its website: “Spotify has no knowledge of the agreements that artists and songwriters sign with their labels, publishers, or collecting societies, so we can’t answer why a rightsholder’s payment comes to a particular amount in a particular month.”
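The mechanics of streamshare can be sketched in a few lines. This is a deliberately simplified model with hypothetical figures: the real calculation involves per-market revenue pools, subscription tiers, and private label agreements, none of which are public. But the pro-rata principle itself is enough to show why scale begets scale.

```python
# Minimal sketch of a pro-rata "streamshare" payout model.
# All figures are hypothetical; the function name and inputs are
# illustrative, not Spotify's actual accounting.

def streamshare_payouts(streams_by_rightsholder, revenue_pool):
    """Split a revenue pool in proportion to each rightsholder's
    share of total streams on the platform."""
    total_streams = sum(streams_by_rightsholder.values())
    return {
        rightsholder: revenue_pool * streams / total_streams
        for rightsholder, streams in streams_by_rightsholder.items()
    }

# A rightsholder with 80% of streams takes 80% of the pool,
# regardless of how devoted any individual artist's audience is.
pool = 1_000_000  # hypothetical monthly pool, in dollars
streams = {"major_label": 8_000_000, "indie_artist": 2_000_000}
print(streamshare_payouts(streams, pool))
# {'major_label': 800000.0, 'indie_artist': 200000.0}
```

Under this scheme, an artist's payout depends not only on their own listeners but on everyone else's listening too: a label that floods the platform with heavily promoted catalogue dilutes every smaller rightsholder's share of the same fixed pool.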
This opacity extends to the algorithms themselves. When we open Spotify, the choices presented feel personal, but they’re selected from a limited range of options determined by unseen factors: commercial partnerships, promotional deals, and the statistical patterns of millions of other users, to name a few. The apparent freedom of unlimited music disguises a narrowing of our cultural horizons.
The question remains: have we lost the ability to engage with culture organically? Perhaps not entirely, but the path forward requires conscious effort. Recognising the performative aspects of our digital lives is the first step toward more authentic engagement. Understanding that “free” services extract value through surveillance is essential to making informed choices about our digital participation.
Digitalism’s peripeteia, the moment when convenience becomes constraint, doesn’t have to be the end of the story. It could instead mark the beginning of a more conscious relationship with technology, where we understand the trade-offs and make choices that align with our values rather than corporate interests. The algorithm giveth and the algorithm taketh away, but we still decide how much of ourselves we’re willing to surrender to its calculations.