Safety Information From TikTok
We encourage community members to use the tools we provide on TikTok to report any content or account they believe violates our Community Guidelines.
• The For You feed reflects preferences unique to each user. The system recommends content by ranking videos based on a combination of factors, including videos you like or share, accounts you follow, comments you post, and content you create. An account's user name does not influence its recommendations.
• Last year, we began testing ways to avoid recommending a series of similar videos on certain topics.
• We've also been testing ways to recognize if our system may inadvertently be recommending a narrower range of content to a viewer.
• We introduced Content Levels to prevent content with overtly mature themes, such as fictional scenes that may be too frightening or intense for younger audiences, from reaching viewers between the ages of 13 and 17.
Combatting suicide, self-harm and eating disorder content
Our Community Guidelines make clear that we do not allow content depicting, promoting, normalizing, or glorifying activities that could lead to suicide or self-harm.
• From April to June 2022, of the videos removed for violating our policies on suicide and self-harm content, 93.4% were removed at zero views, 91.5% were removed within 24 hours of being posted, and 97.1% were removed before any user reports.
• Additional measures include search interventions: when someone searches for banned words or phrases such as #selfharm, they will see no results and will instead be redirected to local support resources.
• As outlined in our Community Guidelines, content that promotes unhealthy eating behaviors or habits that are likely to cause adverse health outcomes is not allowed on the platform, and we expanded this policy earlier this year to focus on disordered eating more broadly. We made this change, in consultation with eating disorders experts, researchers, and physicians, as we understand that people can struggle with unhealthy eating patterns and behavior without having an eating disorder diagnosis.
• One of the key challenges is helping moderators differentiate between "healthy" dieting or fitness behaviors and those that are harmful or indicate disordered eating. What's triggering for one person may be completely fine for another, and because these thoughts and behaviors can ebb and flow over time, someone can find a piece of content helpful to their recovery at one point, and harmful at another. That's why we're working to ensure people have a diverse and safe viewing experience.
• Another challenge is that content is constantly evolving, and although we redirect thousands of terms to the National Eating Disorders Association (NEDA) Helpline, people change the language they're using as they try to evade our safeguards, and we regularly update our safeguards to reflect these changes.
Our approach to youth safety
• On TikTok, we offer a range of safety and privacy controls to empower people to make decisions about who they share their content with.
• We also believe it's important to ensure even stronger proactive protections to help keep teenagers safe, and we've continually introduced changes to support age-appropriate experiences on our platform.
• For example, when younger teens start using TikTok, we intentionally restrict access to some features, such as Direct Messaging, and automatically set accounts of users ages 13-15 to private by default. Users under the age of 18 can't send or receive virtual gifts or livestream. Content created by users under 16 is not eligible for recommendation or search results.
• We aim to provide parents with resources they can use to have conversations about digital safety and decide on the experience that is most comfortable for their family, including our Family Pairing features and our Guardian's Guide to TikTok.