TikTok uses an addictive, opaque recommendation algorithm that bombards vulnerable teen users with an “endless stream” of harmful content related to suicide, anxiety, and depression, critics alleged in a bombshell report published Thursday.
The lengthy exposé by Bloomberg Businessweek details several instances in which teen users purportedly faced a barrage of troubling content on their “For You” feeds – including the devastating case of 16-year-old Chase Nasca of Long Island, who died by suicide last year after being exposed to TikTok videos promoting suicide.
Scrutiny of TikTok’s algorithm and its ties to China has reached unprecedented heights in recent months – with lawmakers on both sides of the political aisle calling for the app to be banned in the US.
Nasca’s TikTok account is still active – and his “For You” feed still displays an “endless stream of clips about unrequited love, hopelessness, pain and what many posts glorify as the ultimate escape: suicide,” according to Bloomberg.
In one case, a video that appeared on Nasca’s feed in February said, “Take the pain away. Death is a gift,” according to the report. The video surfaced just days before the first anniversary of his passing.
As recently as April, videos related to suicide and depression were still appearing on the account page.
“I don’t understand why they keep sending him this stuff,” Nasca’s mother, Michelle, told the outlet.
Nasca’s parents are currently suing TikTok, alleging their son was “targeted, overwhelmed, and actively goaded” into committing suicide.
In total, the Social Media Victims Law Center, the firm representing the Nasca family in the case, has reportedly filed more than 65 cases alleging that TikTok and other social media apps contribute to various harmful results, including sleep deprivation, eating disorders, and suicide.
“Our children are dying,” attorney Laura Marquez-Garrett told Bloomberg. “They are developing harmful dependencies on these products, and they are experiencing unprecedented rates of depression, anxiety, and suicide. How many 16-year-olds killing themselves does it take for people to realize this is not OK?”
Meanwhile, more than a dozen former employees on TikTok’s trust and safety team told the outlet that ByteDance executives still “hold the keys” to the app’s recommendation algorithm, despite assurances that it is managed by employees based around the world.
The ex-employees said they had little control over or understanding of the algorithm and were often met with silence when they asked for details about how it operated, according to Bloomberg. A handful said they quit after their requests for more information were stonewalled.
Bloomberg’s report provides several other examples of users who logged onto TikTok in search of fun and positive content only to receive disturbing or harmful videos that led to eating disorders or other negative consequences.
TikTok told Bloomberg it was unable to comment on pending litigation.
However, TikTok spokesperson Jamie Favazza said the company was committed to ensuring the safety of its users.
“Our hearts break for any family that experiences a tragic loss,” Favazza said in a statement. “We strive to provide a positive and enriching experience and will continue our significant investment in safeguarding our platform.”
The tech firm also told Bloomberg that it treats the concerns of its former employees seriously and that its trust and safety team works closely with its engineers.
The Post has reached out to TikTok for comment.
TikTok has attempted to assuage the concerns of critics and lawmakers – including by offering oversight of its recommendation algorithms and shifting all US user data to servers managed by the tech giant Oracle.
TikTok is one of the most popular social media apps, with more than 150 million users in the US alone.
Critics have pointed to concerns that Beijing could have backdoor access to US customer data, as well as fears about the harmful effects of questionable content on underage users.
TikTok CEO Shou Zi Chew was grilled on Capitol Hill last month about the app’s struggle to address harmful content. Nasca’s parents were in attendance at the high-profile hearing.
During the hearing, Chew described Nasca’s death as “devastating” and “tragic.”
“We do take these issues very seriously and we do provide resources for anybody that types in anything suicide-related,” Chew said at the time.