The glorification of dangerous thinness is a long-standing problem in American culture, and it is a particularly bad one on the internet, where users can find an endless stream of extreme dieting advice, "thinspo" image boards, YouTube videos promising weight-loss magic, and more. There has always been a large audience for this type of content, much of which is highly visual, emotionally charged, and easily spread.
Most of the big social-media platforms have been aware of this reality for years and have taken at least basic measures to deal with it. On most of them, if you search for certain well-known keywords related to eating disorders, the kind of terms that people drawn to or vulnerable to this content are likely to look up, you'll see a pop-up screen asking whether you need help and recommending that you contact a national hotline. On today's biggest platforms for young people, such as Instagram and TikTok, that screen is a wall: You can't tap past it to get to search results. This isn't to say that these sites host no photos and videos glamorizing eating disorders; it's just that finding them usually isn't so easy.
X, however, offers a completely different experience. If you search for popular tags and terms related to eating disorders, you'll see accounts with those words in their usernames and bios, related posts, and recommendations for various groups to join under the heading "Explore Communities." The impression these posts convey, usually paired with striking photos of extremely thin people, is that eating disorders are an enviable lifestyle rather than a mental illness and a dangerous health condition. The lifestyle looks all the more aspirational because some users talk about the community's growing popularity and their desire to be part of it. Would-be "anorexics" are sometimes turned away, but those who are let in feel truly accepted: They receive advice and positive feedback from the wider group.
Technically, all of this violates X's published policy against the promotion of self-harm. But there is a big difference between having a policy and enforcing one, and the site is demonstrating what can happen when platform rules mean nothing. (X did not respond to emails about this issue.)
This moment did not occur in a vacuum. The social web is in a period of retreat when it comes to content moderation. Major platforms were pushed to act on misinformation in response to a series of seismic events, including the 2016 presidential election, the coronavirus pandemic, the 2020 Black Lives Matter protests, the rise of QAnon, and the January 6 insurrection, but most have pulled back in the face of pushback from Donald Trump-supporting Republicans who equate moderation with censorship. That equation is one of the reasons Musk bought Twitter in the first place: He saw it as a powerful platform that operated heavily in favor of his enemies and limited the speech of his friends. After he took over the site in 2022, he eliminated thousands of jobs and pledged to reverse what he characterized as years of censorship overreach on the platform. "These teams that were working full-time on preventing malicious content really weren't there," Rumman Chowdhury, a data scientist who led a safety team at pre-Musk Twitter, told me; those teams were dismissed or dramatically reduced when Musk took over, she said.
"Now the baby is being thrown out with the bathwater," Vaishnavi J, a youth-safety expert who has worked at Twitter and Instagram, told me. (I agreed not to publish her full last name because she is concerned about targeted harassment; she also publishes research using only her last initial.) "Whatever you say about Musk," she told me, "I think if you showed him the kind of content that's circulating now, he wouldn't actually want it on the platform." In October, NBC News's Kat Tenbarge reported that X had removed one of the largest eating-disorder groups after she drew the company's attention to it during her reporting. But she also reported that new groups were quickly emerging to replace it, which is true. Shortly before Thanksgiving, I found one eating-disorder group with nearly 74,000 members (without much effort). When I checked this week to see whether it still existed, it had grown to more than 88,000 members. (Musk did not respond to a request for comment.)
That growth tracks with user reports that X doesn't just host eating-disorder content but actively recommends it in the algorithmically generated "For You" feed, even to people who don't want to see it. Researchers are paying attention: Kristina Lerman, a professor at the University of Southern California who has previously published on online eating-disorder content, is part of a team behind a new paper about how pro-anorexia rhetoric spreads on X. "There is this echo chamber, this very connected community," she told me, and it is also highly visible, which is how X has developed a reputation as the place to find this kind of content. The communities on X openly use terms such as "pro-ana" and "thinspo," and even "deathspo," romanticizing shorthand for eating disorders taken to their extreme, lovingly alluding to the worst outcomes.
Eating-disorder content has been one of the most difficult moderation problems since the inception of the social web. It was widespread in early online forums, and it proliferated on Tumblr, where it developed a distinct visual aesthetic and a set of community rituals that have persisted, in various forms, across the internet ever since. (It was, in fact, a known problem on Twitter before Musk ever took over the site.) There are several reasons this content presents such a difficult moderation problem. For one, unlike hate speech or targeted harassment, it is rarely flagged by users: Community participants are unlikely to report themselves. Creators of this content, meanwhile, have a strong incentive to evade detection, and they invent new coded language to get around each new intervention. Platforms that want to reduce the spread of eating-disorder content have to work constantly to keep up with the latest keywords and euphemisms, against communities that are actively trying to sabotage their efforts.
An additional challenge is that the line between content that glorifies eating disorders and content that is simply part of a broader culture obsessed with thinness, often masquerading as "fitness" or "health" advice, is not always clear. That means moderation requires a human element, one capable of parsing fine nuances and of weighing how to intervene without causing unintended harm. Is it dangerous, for example, to cut someone off from their social network overnight when they are already struggling? Is it helpful to allow discussion of eating disorders if the conversation is about recovery, or can that be dangerous too?
These questions are the subject of ongoing research and discussion; the internet's role in disordered eating has been debated for decades. But looking at X in 2024, you wouldn't know it. After searching one popular term just once ("eating disorder Twitter") and clicking on a few of the recommended communities, I immediately started seeing this type of content in the main feed of my X account, mixed in with my news and jokes: posts sharing "thinsp0" threads "for edtwt" and threads cataloging the worst things about being fat, framed as motivation.
I found this shocking mostly because it was so simple. We hear all the time about how complicated the recommendation algorithms powering today's social platforms are, but all I did was search for something once and click around for five minutes; the response was strangely one-to-one. When I told Vaishnavi about this experience, though, she wasn't surprised at all. "The recommendation algorithms place a strong emphasis on engagement, and ED content is very popular," she told me. Had I searched for something less popular, something the site couldn't deliver as much of, I might not have seen my feed change.
When I spoke with Amanda Green, who has published widely on online eating-disorder content as a researcher at the University of Michigan, she pointed to recommendation algorithms as a larger, newer problem. "That's what made TikTok popular, and that's what I think is making eating-disorder content so widely available on X," she said. "It's one thing to have this stuff out there if you search for it. It's really another thing to push it on people."
It was also evident to me how cruel much of the content on X is. It resembles an older style of pro-eating-disorder content: not just the romanticizing of extreme thinness, but the kind of thing you would have seen 10 years ago, when it was common for people to post photos of themselves on social media and ask others to pick them apart. On X, I saw people saying horrible things to one another in the name of "meanspo" ("mean inspiration"), intended to shame them out of eating.
Though she isn't collecting data on it, Green said that this kind of "tough love" or "support" had become far less prevalent in recent years, and now it's returning. "I think that might be part of the reason it went away: The content was being moderated," Green told me. Now it's back, and everyone knows where to find it.