To make its short-form video platform safer for younger users, TikTok has introduced new content moderation features along with hashtag filters for its For You and Following feeds. "Content Levels," the first iteration of its new system for keeping certain kinds of content away from minors, is scheduled to go live in the coming weeks.
Although adult content is prohibited on the platform, TikTok acknowledges that some videos may contain "mature or complex themes that may reflect personal experiences or real-world events that are intended for older audiences." Content Levels will classify such videos and assign them a maturity rating, which will block viewers between the ages of 13 and 17 from seeing them.
TikTok says its Trust and Safety moderators will initially assign the ratings to videos that are gaining popularity or that users have reported in the app. The system will eventually be expanded to offer filtering options for the entire community, not just teens.
Eventually, the system will also let creators categorize their own work, much as age ratings are applied to movies, TV shows, and video games.
In addition to Content Levels, TikTok will soon roll out hashtag filters, giving users more control over what appears on the For You and Following tabs. The feature will automatically filter out any terms or hashtags that users specify they do not want to see in their feeds.