TikTok on Wednesday announced several updates intended to help users customize their viewing preferences and filter out content that may be problematic or too mature for young users, amid renewed scrutiny of the potential harms social media poses to teens.
The short-form video app is tweaking its viewing experience so users will now see fewer videos about topics that “may be fine as a single video but potentially problematic if viewed repeatedly,” according to Cormac Keenan, head of trust and safety at the company. Keenan cited topics related to dieting, extreme fitness, and sadness as examples of such content. (TikTok rival Instagram has also previously attempted to prevent teens from seeing certain weight loss products.)
TikTok also said it is rolling out a new system that organizes content based on thematic maturity, not unlike the ratings systems used in film and television. The new safeguards will assign a “maturity score” to videos detected as potentially containing mature or complex themes.
The goal, according to Keenan, is “to help prevent content with overtly mature themes from reaching audiences between ages 13-17.”
Senators grilled executives from TikTok, YouTube, and Snap late last year about the steps their platforms were taking to protect teens online after a Facebook whistleblower renewed concerns about the impact social media platforms have on their youngest users.
Additionally, a coalition of state attorneys general launched an investigation earlier this year specifically into TikTok’s impact on young Americans. In a statement at the time, TikTok said it limits its features by age, provides tools and resources to parents, and designs its policies with the well-being of young people in mind.
In the blog post Wednesday, Keenan said the company is “focused on further safeguarding the teen experience” and will add new functionality in the coming weeks to provide more detailed content-filtering options.