How social media algorithms harm your mental health
Social media algorithms have long been criticized for various reasons, and mounting evidence suggests they can harm users’ mental health. In a recent article published in the journal Body Image, a team of psychology researchers urged social media companies to be more transparent about how their algorithms work. While that article focuses on the impact of social media on teens’ body image, other studies have found a link between social media use and depression.
Many point to algorithms as a key cause of these adverse mental health outcomes.
Algorithms are built to manipulate
The point of social media algorithms is to keep users engaged for long periods by serving them personalized content tailored to their interests. The resulting mindless scrolling can harm people’s mental health. Psychiatrist Dr. Nina Vasan told Fast Company that
“there’s no ability to pause and think. Time just basically goes away. We need to think about how we can break the cycle and look at something else, take a breath.”
The type of content algorithms tend to promote doesn’t help matters. According to Big Think, social media algorithms suggest content based on popularity bias: they weigh engagement signals such as likes, comments, and shares rather than users’ specific search terms. Whether you’re using TikTok or Facebook, algorithms rank and recommend the content that has been engaged with the most.
This sort of ranking seems reasonable, but it rests on the assumption that high engagement equals high-quality content. A study in Scientific Reports found that this isn’t the case: popularity-biased ranking primarily amplifies lower-quality content, including content that cites conspiratorial or less-than-scientific sources. Popularity-biased ranking also assumes that engagement comes from real humans, but floods of fake accounts and bots can easily manipulate algorithms to make certain beliefs appear popular, exposing them to more and more people.
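To see why popularity bias rewards the wrong things, here is a minimal sketch of the kind of engagement-based ranking described above. The post data, field names, and scoring formula are all hypothetical; real platforms use far more complex signals, but the core idea is the same: order by engagement, not quality.

```python
# Hypothetical sketch of popularity-biased ranking: posts are ordered
# purely by aggregate engagement (likes + comments + shares), with no
# measure of accuracy or quality anywhere in the score.

def engagement_score(post: dict) -> int:
    """Total engagement for a post; quality never enters the formula."""
    return post["likes"] + post["comments"] + post["shares"]

def rank_feed(posts: list[dict]) -> list[dict]:
    """Order a feed so the most-engaged-with posts appear first."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "well-sourced article", "likes": 40, "comments": 5, "shares": 3},
    {"id": "outrage-bait rumor", "likes": 900, "comments": 450, "shares": 300},
]

feed = rank_feed(posts)
print([p["id"] for p in feed])  # the rumor outranks the article
```

Under this scheme, a rumor that provokes hundreds of angry comments and shares outranks a carefully sourced piece that few people react to, which is exactly the failure mode the Scientific Reports study describes.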
Unfortunately, humans are wired to follow the crowd. On social media, this can lead people to interrogate a piece of content less critically and to accept the ideas it espouses simply because they are popular.
Another common quality of popular content is negativity. The unfortunate reality is that negative content keeps people on platforms longer. A recent study that tracked people’s eye movements as they read social media posts found that people spend more time reading negative comments, particularly angry ones, than positive ones. Another study, focused specifically on Twitter, found that negative news spreads more broadly than positive news.
The allure of the negative does have an evolutionary function: researchers suggest that reading bad news makes humans feel more prepared to anticipate danger. However, continually monitoring content that provokes fear, sadness, and anger is detrimental to mental well-being and can worsen mood and anxiety.
Algorithms and body image
Certain content can also lead to body image issues. In the Body Image article mentioned at the beginning of this piece, researchers found that social media platforms exacerbate teens’ body image issues by continually exposing them to edited, filtered content that isn’t realistic. These distorted images can leave users dissatisfied with their own appearance, increasing their risk of developing eating disorders and body dysmorphia.
The article also found that social media algorithms may worsen the issue because content is personalized based on users’ preferences. To keep users on the platform, this personalization can lead them down a rabbit hole of increasingly extreme, less-moderated content that causes them harm. An Italian study on the subject supports these findings, revealing that TikTok’s algorithm frequently shows users with eating disorders content that promotes disordered eating, without their ever searching for it.
Social media companies are aware of the problem
One of the worst things about this is that social media platforms are well aware of their algorithms’ detrimental impact. Meta whistleblower Frances Haugen leaked documents revealing that Facebook knew its products harmed teenagers’ body image. A TikTok whistleblower has likewise revealed careful manipulation of the platform’s algorithm, which maintains engagement by prioritizing emotionally triggering content.
So how can you protect your mental health against algorithms that seem designed to harm it? Be mindful of the content you consume and how it makes you feel, and try to limit your exposure to content that makes you feel worse. Psychologists also recommend that influencers produce more body-positive content and inform their followers about the adverse effects of algorithms.
But ultimately, they believe that social media companies should be responsible for protecting their users from unnecessary harm and for being transparent about how their algorithms work, telling users exactly why certain content has been shown to them.