YouTube fights "bad actors" with new content policies
New measures include manual vetting of Google Preferred videos, and a higher monetisation threshold for small channels
YouTube has further revised its content policies, placing tighter restrictions on channels that can monetise and pledging to manually review videos from more popular creators.
In a blog post signed by chief product officer Neal Mohan and chief business officer Robert Kyncl, the company described 2017 as "a tough year for many of you, with several issues affecting our community and the revenue earned from advertising through the YouTube Partner Program."
YouTube's focus for the year ahead, the post stated, is to better protect the "creator ecosystem" and provide more stable revenue for those involved.
"We're making changes to address the issues that affected our community in 2017 so we can prevent bad actors from harming the inspiring and original creators around the world who make their living on YouTube.
"A big part of that effort will be strengthening our requirements for monetisation so spammers, impersonators, and other bad actors can't hurt our ecosystem or take advantage of you, while continuing to reward those who make our platform great."
There will be two key changes, but YouTube placed greater emphasis on the requirements a channel must meet before it can monetise through advertising. In April, a minimum of 10,000 lifetime views was introduced, but YouTube is now raising that bar significantly: a minimum of 1,000 subscribers and 4,000 hours of watch time in the preceding 12 months.
According to YouTube, over the last year 99 per cent of the channels affected were earning less than $100 a month, and 90 per cent were earning less than $2.50.
"We've arrived at these new thresholds after thorough analysis and conversations with creators like you. They will allow us to significantly improve our ability to identify creators who contribute positively to the community and help drive more ad revenue to them (and away from bad actors). These higher standards will also help us prevent potentially inappropriate videos from monetizing which can hurt revenue for everyone."
The new restrictions will be in effect from February 20, at which point existing channels with fewer subscribers and watch hours will no longer be able to monetise. When they pass those milestones, both new and existing channels will be evaluated to ensure they comply with YouTube's other content policies.
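As a rough illustration only (the function name and structure are hypothetical, not YouTube's actual implementation), the new threshold check amounts to a simple two-part test:

```python
def is_eligible_for_monetisation(subscribers: int, watch_hours_last_12_months: float) -> bool:
    """Check a channel against YouTube's new monetisation thresholds,
    effective February 20: at least 1,000 subscribers and 4,000 hours
    of watch time in the preceding 12 months. Illustrative sketch only;
    passing channels are still subject to YouTube's other content policies.
    """
    return subscribers >= 1_000 and watch_hours_last_12_months >= 4_000


# A channel that cleared the old 10,000-lifetime-view bar may no longer qualify:
print(is_eligible_for_monetisation(subscribers=850, watch_hours_last_12_months=5_200))   # False
print(is_eligible_for_monetisation(subscribers=1_200, watch_hours_last_12_months=4_500)) # True
```

Note that both conditions must hold at once, which is why channels with a large back catalogue of views but few active subscribers fall out of the programme.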
Of course, when one thinks about "bad actors" on YouTube, the names of influential creators are just as likely to arise as small channels with few subscribers. Felix "PewDiePie" Kjellberg is perhaps the most obvious example from the gaming world, but the popular YouTuber Logan Paul, who has 15.6 million subscribers, provoked outrage this month for posting a video that showed the body of a suicide victim. From the perspective of advertisers with money to spend on the platform, these very public scandals are a cause for concern.
To that end, YouTube will continue to use "community strikes, spam, and other abuse flags" to assess monetisation on specific channels regardless of size, but it will also introduce the manual vetting of videos from creators in its Google Preferred program.
"Moving forward, the channels included in Google Preferred will be manually reviewed and ads will only run on videos that have been verified to meet our ad-friendly guidelines," YouTube's Paul Muret said in a separate blog post. "We expect to complete manual reviews of Google Preferred channels and videos by mid-February in the U.S. and by the end of March in all other markets where Google Preferred is offered."