TikTok is turning to automation to detect and remove many of the videos that violate its guidelines. For the past year, the service has been testing and refining systems that find and take down such content, and it will roll them out in the US and Canada over the next few weeks.
First, the algorithms will look for posts that violate policies on minor safety, violence, graphic content, nudity, sex, illegal activity, and regulated goods. When the systems detect a violation, they remove the video immediately, and the user who posted it can appeal. Users can still report videos for manual review.
Automated reviews are “reserved for categories of content where our technology has the highest level of accuracy,” TikTok said. According to the company, only one in 20 of the automatically removed videos was a false positive that should have stayed on the platform. As it works to improve the accuracy of its algorithms, TikTok notes that the rate of appeals against video removals has remained consistent.
According to TikTok, automation should free up its safety staff to focus on content that requires a more nuanced approach, including videos featuring bullying, harassment, misinformation, and hate speech. Crucially, the systems can reduce the number of potentially disturbing videos the safety team has to watch, such as those depicting extreme violence or child exploitation. Facebook, for example, has been accused of not doing enough to protect the well-being and mental health of content moderators tasked with reviewing material that is often disturbing.
Elsewhere, TikTok is changing the way users are notified when they break the rules. The platform now tracks the number, severity, and frequency of violations. Users can see details about these in the Account Updates section of their inbox, along with information about the consequences of their actions, such as how long they’re blocked from posting or interacting with other people’s content.