Is The Biggest Bully On Social Media The Algorithm Itself?
If the online world were the film Mean Girls, my bet is that social media platforms would be Regina George. The bully. It makes sense. Social media thrives on scandal, constantly feeds us gossip, and is well versed in telling people when they can and cannot sit with us. But, of course, this isn’t Mean Girls. Social media is increasingly shaping public opinion, entrenching political polarisation and frequently stoking the flames of misogyny, racism and intolerance. Yet these platforms are fuelled by us, willingly logging on and engaging. An algorithm alone couldn’t possibly bully us. Or could it?
Though the Meta-owned Instagram declined to comment, a TikTok spokesperson was happy to tell me its algorithm is all about curating content based on user preferences. Essentially, it shows you more of what you like, measured through likes, follows and videos watched. If anything, this just seems helpful, until you consider that, if we are only ever viewing our own interests, we are slipping further into echo chambers and contributing to an increasingly divided society.
Sara McCorquodale, founder of social media consultancy studio Corq and author of the book Influence, believes that, while division is not the aim of social media, that is not to say it is not beneficial to these platforms. “Huge online communities have amassed around divisive issues, and I think a substantial number of people are addicted to getting a reaction and having their opinions reinforced,” she says. “The addiction means they are constantly engaging with social content, and this is to the advantage of social platforms where daily usage and minute-to-minute relevancy are everything.”
An algorithm’s aim is to measure and increase engagement, because this is how these platforms are monetised: the longer you stay on, the more advertising they can serve you. Once the algorithm identifies what performs well, it serves you more of the same. It doesn’t even have to be something you’ve searched for – it can simply be a trending topic. This may explain why Amber Heard-bashing content arrived uninvited on so many of our feeds this summer.
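For readers who want the mechanism spelled out, that feedback loop can be compressed into a toy simulation – every name, number and boost factor below is invented for illustration, not how any real platform ranks a feed:

```python
# Toy sketch of an engagement-ranking feedback loop (all values invented):
# whatever gets interacted with is ranked higher, so it gets shown more,
# so it gets interacted with more.

def rank_feed(posts, weights):
    """Order posts by accumulated engagement weight, highest first."""
    return sorted(posts, key=lambda p: weights[p], reverse=True)

def simulate(posts, engages, rounds):
    """engages(post) -> True if the user interacts when shown that post."""
    weights = {p: 1.0 for p in posts}      # every topic starts equal
    for _ in range(rounds):
        top = rank_feed(posts, weights)[0]  # the user mostly sees the top item
        if engages(top):
            weights[top] *= 1.5             # engagement boosts future ranking
    return rank_feed(posts, weights)

posts = ["outrage", "cats", "news"]
final = simulate(posts, engages=lambda p: p == "outrage", rounds=5)
print(final[0])  # the engaged-with topic now tops the feed
```

After five rounds the "outrage" post has compounded its head start and permanently occupies the top slot – the fridge keeps stuffing the same food in your mouth.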
“Critical posts are more likely to get comments and shares than any other post,” says Camille Carlton, senior policy and communications manager at the Centre for Humane Technology – a non-profit that works to make tech more socially responsible. “Morally emotional words in messages increased their diffusion by 20% for each additional word. As long as hate performs well it is profitable, and as long as outrage is profitable, these platforms have no incentive to stop nudging it your way.”
Outrage is the lifeblood of the internet. Just look at Patient Zero for online bullying: last week’s At Your Service podcast guest Monica Lewinsky. In 1998, her affair with President Bill Clinton was the first political scandal to break online. Gossip sites were born from that moment, with all of us logging on to feed our basest appetite for the indignities of famous people. Social media has tapped into this, allowing us to bully alongside it. In this way, it is baiting what I like to call our Car Crash Eyeballs. “The algorithm is often measuring engagement with content that is not necessarily what you’re actually interested in or what is good for you,” agrees Carlton. “It’s the stuff that you can’t look away from.”
However, TikTok’s spokesperson tells me that users can actively say what they do not want to see – filtering out hashtags or certain users. “This requires people to take a proactive approach to social media, rather than simply passively scrolling, which at this point may feel unnatural,” observes McCorquodale. “It’s also the only way to combat echo chambers: forcing yourself to interact with content across the political and topical spectrum to get a more balanced, nuanced feed.” So, if we can stop it, are we otherwise just bullies with no willpower? “No, we’re up against a system that has been designed by some of the most brilliant minds to keep us engaged,” says Carlton.
It works like this: if you were allergic to a certain food and your fridge behaved like the algorithm, it would stuff that food in your mouth the minute you opened the door. If you took a bite, the next time you opened your fridge, it would stuff two in your mouth. The algorithm is not creating hateful content and, technically, it is not forcing us to watch it either, but much like the insidiousness of Regina George, its influence is everything…
Marie-Claire Chappet is a London-based arts and culture journalist and contributing editor at Harper’s Bazaar