TikTok executives and employees were well aware that the app’s features encouraged compulsive use, along with its associated negative mental health effects, according to NPR. The broadcaster reviewed unredacted documents from the lawsuit filed by the Kentucky Attorney General’s Office, which were made public by Kentucky Public Radio. A few days earlier, more than a dozen states had sued TikTok for falsely claiming that the app is safe for young users. Kentucky Attorney General Russell Coleman said the app was “specifically designed to be an addiction machine,” targeting children who are still developing appropriate self-control.
Most of the documents filed for the lawsuit had their sensitive information redacted, but in Kentucky’s case the redactions were insufficient. TikTok’s own research reportedly found that “compulsive usage correlates with a slew of negative mental health effects, like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.” TikTok executives also knew that compulsive use can interfere with sleep, work and school responsibilities, and even “connecting with loved ones.”
They also reportedly knew that the app’s time management tool would do little to keep younger users off the app. While the tool defaults to a usage limit of 60 minutes per day, teens were still spending 107 minutes on the app with the limit switched on. That is only 1.5 minutes less than the average daily usage of 108.5 minutes before the tool launched. According to the internal documents, TikTok based the tool’s success on how it “improv[ed] public trust in the TikTok platform via media coverage.” The company knew the tool would be ineffective, one document said, while another reportedly states that across most engagement metrics, the younger the user, the better the performance.
Additionally, TikTok is reportedly aware of the existence of “filter bubbles” and understands how potentially dangerous they are. According to the documents, employees conducted internal experiments and found that they were drawn into negative filter bubbles shortly after following certain accounts, including those focused on painful (“painhub”) or sad (“sadnotes”) content. They are also aware of content and accounts promoting “thinspiration,” which is associated with disordered eating. Researchers found that, because of the way TikTok’s algorithm works, users can become trapped in a filter bubble after just 30 minutes of use.
TikTok may also be struggling with moderation, according to the documents. An internal investigation found that underage girls on the app were receiving “gifts” and “coins” in exchange for live stripping. Additionally, company leadership reportedly instructed moderators not to remove users reported to be underage unless their accounts actually state that they are under 13. NPR also says TikTok acknowledged that a substantial amount of content violating its rules slips through its moderation techniques, including videos that normalize pedophilia and glorify minor sexual assault and physical abuse.
TikTok spokesperson Alex Haurek defended the company, telling the organization that the Kentucky AG’s complaint “cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.” He also said that TikTok has “robust safeguards, which include proactively removing suspected underage users,” and that it has “voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16.”