Today we announced changes to our harassment policy to better protect our Creators & community.
We’ve gotten a head start answering top questions below – Read on ⬇️for resources & FAQs, and drop a comment if you still have questions.
— TeamYouTube (@TeamYouTube) December 11, 2019
On Wednesday, Google-owned YouTube announced its new, stricter harassment policy. According to the updated policy, the platform will no longer tolerate “malicious insults,” such as racial slurs. This includes uploaded content that harasses protected groups and minors.
The new harassment policy now covers “veiled” and “implicit” threats along with “explicit” ones. Any content that simulates or alludes to violence against an individual will also no longer be allowed on the platform.
Many of you have told us we need to do a better job preventing harassment on YouTube, so we consulted with a wide array of creators, experts and organizations to update our harassment policy, which changed today https://t.co/TnozAF9ZCG. ⬇️Here’s what it covers:
— YouTube Creators (@YTCreators) December 11, 2019
YouTube specifies that it will take action against specific videos as well as channels that have a history of uploading this kind of content. Comments are also covered under the new policy. YouTubers who run afoul of the new harassment rules can find their individual content removed and their channels demonetized, suspended, or even deleted.
The company appears to be aware that some of these rules are fairly broad and has listed exceptions to the policy, such as scripted performances, educational content, and discussion of topical issues related to high-profile people, like politicians and CEOs.
According to YouTube, the company met with experts and creators for insights into crafting this updated harassment policy.
The updated policy from the online video giant shouldn’t be too surprising. The platform has struggled with harassment, much like other social media services. However, targeted harassment and hate speech have always seemed to hit harder on YouTube than on its tech counterparts, owing to the platform’s massive size, influence, and the more personal nature of video as a medium.
The issue of harassment on the platform seemed to reach a breaking point this past summer, after Vox reporter Carlos Maza called out YouTube for enabling personal attacks related to his race and sexual orientation, carried out on the platform by right-wing YouTuber Steven Crowder. YouTube CEO Susan Wojcicki apologized to Maza but defended the company’s policies, which allowed Crowder’s videos to remain on the site. Of course, a major part of the issue wasn’t just the content of Crowder’s videos, but the fact that his viewers also harassed Maza.
Given YouTube’s track record of enforcing its policies, Maza said on Twitter upon hearing the news that he remains skeptical of the platform’s changes.
TL;DR: YouTube loves to manage PR crises by rolling out vague content policies they don’t actually enforce.
These policies only work if YouTube is willing to take down its most popular rule-breakers. And there’s no reason, so far, to believe that it is.
— Carlos Maza (@gaywonk) December 11, 2019
What is surprising about this update, though, is just how quickly YouTube has started enforcing areas of the new harassment policy. Typically, the company takes time to roll out these updates. However, some creators are already reporting that their videos have been deleted under the new rules.
Another popular YouTuber, iDubbbzTV aka Content Cop, also reported takedowns of his video content dating back to 2016.
As The Verge’s Julia Alexander noted, YouTube should expect more backlash from the community over the removal of creators’ backlogs of videos than over how the policy is enforced on future content.
While a strong harassment policy is certainly a step in the right direction, there’s still a lot of work to be done. The most important questions will be how the company interprets the vaguer areas of the policy, and whether it respects the line between legitimate critique and harassment.