Study: Harmful content pushed to users on TikTok every 39 seconds
A recent study confirms the dangers that the popular social media app TikTok poses to its users. Researchers conducted the new study by setting up TikTok accounts posing as 13-year-old users interested in content about body image and mental health.
“It found that within as few as 2.6 minutes after joining the app, TikTok’s algorithm recommended suicidal content. The report showed that eating disorder content was recommended within as few as 8 minutes,” reports CBS News.
The report, published earlier this week by the Center for Countering Digital Hate (CCDH), found 56 TikTok hashtags hosting eating disorder videos with a whopping 13.2 billion views.
“The new report by the Center for Countering Digital Hate underscores why it is way past time for TikTok to take steps to address the platform’s dangerous algorithmic amplification,” said James P. Steyer, Founder and CEO of Common Sense Media, which is unaffiliated with the study. “TikTok’s algorithm is bombarding teens with harmful content that promotes suicide, eating disorders, and body image issues, fueling the teen mental health crisis.”
“TikTok is able to recognize user vulnerability and seeks to exploit it,” said Imran Ahmed, CEO of the CCDH. Studies such as this one confirm the tragedies that have led 1,200 families to pursue lawsuits against social media companies, including TikTok, for assaulting the mental wellbeing of their children; in many cases the result was death by suicide.
CBS adds that the CCDH report details how TikTok’s algorithms refine the videos shown to users as the app gathers more information about their preferences and interests. The algorithmic suggestions on the “For You” feed are designed, as the app puts it, to be “central to the TikTok experience.” But new research shows that the video platform can push harmful content to vulnerable users as it seeks to keep them interested.
CBS reports on the study:
To test the algorithm, CCDH researchers registered as users in the United States, United Kingdom, Canada, and Australia and created “standard” and “vulnerable” accounts on TikTok. A total of eight accounts were created, and data was gathered from each account for the first 30 minutes of use. The CCDH says this short recording window was chosen to show how quickly the video platform can understand each user and begin pushing potentially harmful content.
In the report, each researcher, posing as a 13-year-old, the minimum age at which TikTok allows users to sign up for its service, made two accounts in their designated country. One account was given a female username. The other was given a username indicating a concern about body image; the name included the phrase “loseweight.” Across all accounts, the researchers paused briefly on videos about body image and mental health and “liked” those videos, as if they were teens interested in that content.
When the “loseweight” accounts were compared with the standard ones, the researchers found that the “loseweight” accounts were served three times as much harmful content overall and 12 times as many self-harm and suicide-specific videos.