Visibility of Misinformation Videos on YouTube to Be Reduced

YouTube is adjusting its recommendation engine in an effort to lower the reach of videos that misinform users.

This type of content can't be removed outright, since it doesn't quite violate YouTube's Community Guidelines.

YouTube will instead reduce the visibility of borderline content by not displaying the videos as recommendations.

As YouTube explains: "We'll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11."

Users will still be able to find these types of videos if they’re specifically searched for. This change only affects recommended videos that appear in places like the home page.

The only time YouTube may recommend these videos is when users subscribe to channels that publish borderline content.

According to YouTube, less than one percent of its videos qualify as borderline content.

"While this shift will apply to less than one percent of the content on YouTube, we believe that limiting the recommendation of these types of videos will mean a better experience for the YouTube community," the company says.

YouTube will use a combination of machine learning and human evaluators to assess which videos should not appear as recommendations.
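To make that idea concrete, here is a purely illustrative sketch of a human-in-the-loop gate of the kind YouTube describes: a model score flags likely borderline videos, and a human evaluator's judgment, where one exists, overrides the model. YouTube has not published its implementation, so every name, threshold, and data structure below is invented for illustration.

```python
# Hypothetical human-in-the-loop gate for recommendation candidates.
# All names and thresholds are invented; this is not YouTube's code.
from dataclasses import dataclass
from typing import Optional

BORDERLINE_THRESHOLD = 0.8  # invented cutoff for the model score

@dataclass
class Video:
    video_id: str
    borderline_score: float            # 0.0-1.0 from a trained classifier
    evaluator_flagged: Optional[bool]  # human label; None if not reviewed

def eligible_for_recommendation(video: Video) -> bool:
    """Keep a video in the candidate pool unless it looks borderline."""
    if video.evaluator_flagged is not None:
        # A human evaluator's judgment overrides the model score.
        return not video.evaluator_flagged
    return video.borderline_score < BORDERLINE_THRESHOLD

candidates = [
    Video("a1", 0.15, None),   # model is confident it's fine: kept
    Video("b2", 0.92, None),   # model says borderline: filtered out
    Video("c3", 0.91, False),  # a human evaluator cleared it: kept
]
recommended = [v for v in candidates if eligible_for_recommendation(v)]
print([v.video_id for v in recommended])  # ['a1', 'c3']
```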

The rollout of this change will be gradual and will initially affect only a small set of videos in the US.

YouTube will roll out the change to more countries as its recommendation system becomes more accurate.

You can read the full article on Search Engine Journal.

The announcement originally appeared on the official YouTube blog, in a post about continuing improvements to recommendations on YouTube:

When recommendations are at their best, they help users find a new song to fall in love with, discover their next favorite creator, or learn that great paella recipe. That’s why we update our recommendations system all the time—we want to make sure we’re suggesting videos that people actually want to watch.
You might remember that a few years ago, viewers were getting frustrated with clickbaity videos with misleading titles and descriptions (“You won’t believe what happens next!”). We responded by updating our system to focus on viewer satisfaction instead of views, including measuring likes, dislikes, surveys, and time well spent, all while recommending clickbait videos less often. More recently, people told us they were getting too many similar recommendations, like seeing endless cookie videos after watching just one recipe for snickerdoodles. We now pull in recommendations from a wider set of topics—on any given day, more than 200 million videos are recommended on the homepage alone. In fact, in the last year alone, we’ve made hundreds of changes to improve the quality of recommendations for users on YouTube.
We’ll continue that work this year, including taking a closer look at how we can reduce the spread of content that comes close to—but doesn’t quite cross the line of—violating our Community Guidelines. To that end, we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.
While this shift will apply to less than one percent of the content on YouTube, we believe that limiting the recommendation of these types of videos will mean a better experience for the YouTube community. To be clear, this will only affect recommendations of what videos to watch, not whether a video is available on YouTube. As always, people can still access all videos that comply with our Community Guidelines and, when relevant, these videos may appear in recommendations for channel subscribers and in search results. We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users.
This change relies on a combination of machine learning and real people. We work with human evaluators and experts from all over the United States to help train the machine learning systems that generate recommendations. These evaluators are trained using public guidelines and provide critical input on the quality of a video.
This will be a gradual change and initially will only affect recommendations of a very small set of videos in the United States. Over time, as our systems become more accurate, we’ll roll this change out to more countries. It’s just another step in an ongoing process, but it reflects our commitment and sense of responsibility to improve the recommendations experience on YouTube.

Content Marketing
Bing Increases Number of URLs Webmasters Can Submit

Bing has increased the number of URLs webmasters can submit to get their content crawled and indexed immediately. Webmasters should be able to see the revised limit for their site in the Bing Webmaster Tools portal (Submit URLs option) or by using the Get URL submission quota API. Bing Blog and… We believe …
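As an aside, here is a minimal sketch of how a webmaster might check that quota programmatically. The endpoint path follows Bing's published JSON API examples for GetUrlSubmissionQuota, but treat it as an assumption and verify it against current documentation; the API key and site URL below are placeholders.

```python
# Minimal sketch: query a site's URL submission quota from the
# Bing Webmaster Tools JSON API. The endpoint path follows Bing's
# published examples; verify it before relying on it.
import json
import urllib.parse
import urllib.request

API_KEY = "YOUR_BING_WEBMASTER_API_KEY"  # from Bing Webmaster Tools
SITE_URL = "https://www.example.com"     # a site verified in your account

query = urllib.parse.urlencode({"siteUrl": SITE_URL, "apikey": API_KEY})
url = f"https://ssl.bing.com/webmaster/api.svc/json/GetUrlSubmissionQuota?{query}"

with urllib.request.urlopen(url) as resp:
    quota = json.load(resp)

# The response is expected to report how many URLs the site may still submit.
print(quota)
```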

Social Media
Acquiring Customers with Facebook Ad Sequences

Amanda explains why the challenges marketers face in 2019 can create opportunities to lower their ad spend. You’ll also discover how to sequence Facebook ads to reflect the customer journey. What Is a Facebook Ad Sequence? It’s all about asking people for an email address or asking them to sign …

SEO
Why companies should invest more in PR in 2019

In 1999, a marketing research company in the US decided to conduct a survey on the use of Public Relations by corporate organisations. The firm surveyed about 100 top and middle managers in the communications field and found that over 60% of them said their PR programmes involved a little …