Originally published in Democracy & Social Media (STUDIO SCUM, 2019)

Your grandma doesn’t understand how the internet works, which is how she ended up sharing a link to that fake article about Pope Francis apparently endorsing Donald Trump. You’re younger, savvy, and don’t fall for bullshit like that. Then you go on YouTube, and within a few clicks of mildly political content you’ve got a Canadian professor in your recommendations telling you to eat an absurd all-beef diet. You watch Joe Rogan talk to a B-list American comedian about recreational drugs for three hours, and you’re recommended a 20-something-minute clip of him going on a surreal tangent about how transgender teenagers shouldn’t have access to hormone treatments because their frontal lobes aren’t fully developed and so cannot make informed decisions, while a confused comedian from CollegeHumor tries to respond.

YouTube’s recommendation algorithm favours longer videos, so that it can insert more adverts and make more money, and content creators favour longer videos, so that they can insert more adverts and make more money. And if there’s one group that’s mastered the art of creating long, tedious, drawn-out, advert-filled content, it’s right-wing YouTubers. As mapped out by Rebecca Lewis of the Data & Society Research Institute, and backed up by a statistical analysis of over 300,000 videos by Manoel Horta Ribeiro and colleagues, watching even an ostensibly neutral pundit like Joe Rogan can lead to conservative content appearing in your recommendations. These conservative YouTubers, who often frame themselves as contrarians concerned with their legal right to say horrible things about trans people, are only a few algorithmic recommendations away from explicit white nationalists. The opaque, non-human programming of YouTube has no way of understanding that white nationalism is the most absurd ideology ever created. A basic level of human intellect is all that’s required to understand that attempting to create a nation-state populated entirely by people who have kinda similar skin pigmentation is about as coherent as saying that Britain should be populated entirely by people with green eyes, not to mention being inherently violent and authoritarian. The recommendation algorithm, however, does not have a basic level of human intellect. Instead it blindly promotes such content, because it’s what glues eyes to screens and generates advert revenue.
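The dynamic that Lewis and Ribeiro and colleagues describe can be caricatured in a few lines of toy code. Everything here is invented for illustration (the scoring formula, the weights, the example videos); YouTube’s real ranking model is proprietary. The point is what such a system can and cannot see:

```python
# A toy sketch of an engagement-maximising recommender.
# All numbers and video data below are invented for illustration;
# the real ranking system is proprietary and far more complex.

videos = [
    {"title": "Calm explainer", "minutes": 8, "avg_watch_fraction": 0.9, "ad_slots": 1},
    {"title": "Three-hour contrarian podcast", "minutes": 180, "avg_watch_fraction": 0.4, "ad_slots": 12},
    {"title": "Outrage compilation", "minutes": 25, "avg_watch_fraction": 0.7, "ad_slots": 4},
]

def score(video):
    # Expected watch time drives ad impressions: long videos that keep
    # eyes on screens win, regardless of what they actually say.
    expected_watch_minutes = video["minutes"] * video["avg_watch_fraction"]
    return expected_watch_minutes * video["ad_slots"]

recommendations = sorted(videos, key=score, reverse=True)
print([v["title"] for v in recommendations])
```

Nothing in the scoring function can represent “this is white-nationalist pseudoscience”; it only sees minutes watched and adverts served, so the long contrarian podcast comes out on top.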

Money doesn’t care. YouTube is part of Google, a for-profit company. Generating profit is simple — take some money, do something with it, and turn that into more money. There’s a word for money that’s spent to make more money: capital. Capitalism, then, is a system where decisions are made in order to maximise the expansion of this non-human, digital, self-replicating numerical entity. There’s no malicious bias to YouTube, no intentional rigging of the system, just programmers attempting to make a number on a screen go up for the benefit of a select few shareholders. Social media of all kinds — not just YouTube — exists in order to make this number go up, and it just happens that feeding teenagers pseudoscience that was debunked in the 1910s is how capital currently grows.

Beyond YouTube, let’s say you post a picture of your new outsider art project on Instagram. You provide it to Instagram for free. Your friend scrolls down through their feed, sees it and likes it. Continuing to scroll down, they see an advert, an advert that generates profit for Instagram, not for you, who published the art, despite your art being the reason your friend’s there in the first place. Even if you’re lucky and your vegan food account gets you some free goodies, or even sponsorship and freelance gigs, Instagram’s owner Facebook will generate more money from your content than you will see in your lifetime. Social media’s mathematical formulae, known only to the private companies who create them, feed you the content that maximises profit for them, a tiny sliver selected from the huge daily output of data onto these platforms.

It’s cliché to say that social media is bad for your mental health, something said by people who don’t understand why “ROAD WORK AHEAD” signs are hilarious to anyone under the age of 25. But trust me, it is. Sometimes the problems are obvious — abuse, bullying, and anonymous replies calling you horrible things. Sometimes they’re more subtle — poor sleep from screens, feeling bad about your body, and overthinking why you weren’t invited to the gig in everyone’s story. Then there are those constant anxieties that eat away at your soul. Have you been left on read recently? Have you checked your story every 20 minutes to see if they have seen it? The low-level stress of constantly maintaining an aesthetic and hundreds of social relations erodes the sense of connection and joy that these technologies should be able to bring us.

“Do they fancy me?” is a primal concern, a peculiarly human offshoot of the strange path evolution took over a billion years ago, when some organisms started to reproduce through two partners combining their genes rather than one organism replicating itself. Social media companies have captured this and turned it into a machine designed to reproduce capital. You are fed content that makes you keep your phone on longer, with no regard for whether it turns you anorexic or into a Nazi (or both), purely so you can see adverts for phone games with awful graphics that no-one even seems to be playing. There’s no single solution to this. You can delete your account, but at the cost of socially isolating yourself. You can try to follow only hyper-positive Instagram accounts, only to feel like you’re not being positive enough when a drawing of a cute cat tells you to drink water.

The real solutions involve action outside of social media as well as within it: solving the societal problems that social media makes obvious. Create collective spaces where the benefits of social media can be experienced without having to play by the rules of a for-profit corporation. Put on exhibitions in your living room, throw a party with whatever cheap speakers you have, go to the library, read a book on counselling skills and learn to listen to your friends. When using social media, be aware of what you’re using and how the technology is using you. Instead of allowing the algorithm to push you into content that benefits capital, follow those who are trying to find a way out, create detours, and use these platforms against the will of the algorithm to create meaningful expression.

Thomas Sullivan is an anthropologist of esotericism.