TikTok updates account enforcement system to support creators

TikTok is an entertainment platform powered by the creativity, self-expression, and heart that creators put into making authentic content. To support creators, TikTok announced an updated account enforcement system that aims to give them a smoother experience while making content on the platform.

TikTok’s Community Guidelines establish and explain the behaviors and content that are not allowed on the platform, and when people violate these policies, TikTok takes action on their content to keep the platform safe. Most community members aim to follow the policies, but a small minority of people repeatedly violate them and don’t change their behavior.

To improve the user experience, TikTok updated its account enforcement system to act more effectively against repeat offenders. The goal is to remove harmful accounts more quickly and efficiently while providing a clearer, more consistent experience for the vast majority of creators who want to follow the policies.

Why TikTok updated the current account enforcement system
The existing account enforcement system leverages different types of restrictions, like temporary bans from posting or commenting, to prevent abuse of product features while teaching people about the policies in order to reduce future violations. While this approach has been effective in reducing harmful content overall, TikTok heard from creators that it can be confusing to navigate. It can disproportionately impact creators who rarely and unknowingly violate a policy, while potentially being less effective at deterring those who repeatedly violate them. Repeat violators tend to follow a pattern: analysis found that almost 90% violate using the same feature consistently, and over 75% violate the same policy category repeatedly. To better address this, TikTok is updating the account enforcement system to support the creator community and remove repeat offenders from the platform.

How the streamlined account enforcement system will work
Under the new system, if someone posts content that violates one of TikTok’s Community Guidelines, the account will accrue a strike as the content is removed. If an account meets the strike threshold within either a product feature (e.g. Comments, LIVE) or a policy (e.g. Bullying and Harassment), it will be permanently banned. Those thresholds can vary depending on a violation’s potential to cause harm to community members: for example, the threshold for violating TikTok’s policy against promoting hateful ideologies may be stricter than the threshold for sharing low-harm spam.

TikTok will continue to issue permanent bans on the first strike for severe violations, including promoting or threatening violence, showing or facilitating child sexual abuse material (CSAM), or showing real-world violence or torture. As an additional safeguard, accounts that accrue a high number of cumulative strikes across policies and features will also be permanently banned. Strikes will expire from an account’s record after 90 days.
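As a rough illustration of the logic described above, the sketch below models an account that accrues strikes with per-feature and per-policy thresholds, a cumulative cap, first-strike bans for severe violations, and 90-day strike expiry. The feature names, policy names, and threshold values are illustrative assumptions, not values TikTok has published.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Illustrative values only -- TikTok has not published the real thresholds.
STRIKE_TTL = timedelta(days=90)                 # strikes expire from the record after 90 days
FEATURE_THRESHOLD = {"Comments": 3, "LIVE": 3}  # hypothetical per-feature limits
POLICY_THRESHOLD = {"Bullying and Harassment": 2, "Spam": 5}  # stricter for higher-harm policies
CUMULATIVE_THRESHOLD = 6                        # hypothetical cap across all features and policies
SEVERE_POLICIES = {"CSAM", "Threatening violence"}  # permanent ban on the first strike


@dataclass
class Strike:
    feature: str
    policy: str
    issued_at: datetime


@dataclass
class Account:
    strikes: list[Strike] = field(default_factory=list)

    def active_strikes(self, now: datetime) -> list[Strike]:
        """Return only strikes issued within the expiry window."""
        return [s for s in self.strikes if now - s.issued_at < STRIKE_TTL]

    def record_violation(self, feature: str, policy: str, now: datetime) -> bool:
        """Record a strike and return True if the account should be permanently banned."""
        if policy in SEVERE_POLICIES:
            return True  # severe violations lead to a ban on the first strike
        self.strikes.append(Strike(feature, policy, now))
        active = self.active_strikes(now)
        per_feature = sum(s.feature == feature for s in active)
        per_policy = sum(s.policy == policy for s in active)
        return (
            per_feature >= FEATURE_THRESHOLD.get(feature, 3)
            or per_policy >= POLICY_THRESHOLD.get(policy, 3)
            or len(active) >= CUMULATIVE_THRESHOLD
        )


# Example: two spam comments within the window do not trigger a ban under these assumed thresholds.
account = Account()
now = datetime(2023, 3, 1)
print(account.record_violation("Comments", "Spam", now))                      # False
print(account.record_violation("Comments", "Spam", now + timedelta(days=10))) # False
```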

Helping creators understand their account status
These changes are intended to bring more transparency to TikTok’s enforcement decisions and help the community better understand how to follow the Community Guidelines. To further support creators, TikTok will roll out new features in the coming weeks in the in-app Safety Center for creators. These include an “Account status” page where creators can easily view the standing of their account, and a “Report records” page where creators can see the status of reports they’ve made on other content or accounts. These new tools add to the notifications creators already receive when they violate policies, and support creators’ ability to appeal enforcements and have strikes removed if the appeal is valid. TikTok will also begin notifying creators if they’re on the verge of having their account permanently removed.

Making consistent, transparent moderation decisions
As a separate step toward improving transparency about moderation practices at the content level, TikTok is beginning to test a new feature in some markets that would provide creators with information about which of their videos have been marked as ineligible for recommendation to the For You feed, let them know why, and give them the opportunity to appeal.

The updated account enforcement system is currently rolling out globally, and TikTok will notify all community members as the new system becomes available to them. TikTok will continue evolving and sharing progress on the processes used to evaluate accounts and ensure accurate, nuanced enforcement decisions for accounts of all kinds across the platform.
