YouTube could ‘break’ sharing on borderline content to fight misinformation

The platform is eyeing new steps to prevent misinformation from going viral.

YouTube is eyeing new measures to tackle misinformation on its platform. Among the changes being considered, according to Chief Product Officer Neal Mohan, are updates that would effectively “break” sharing features for videos with “borderline content.”

The change would be a major shift for the platform, though it’s not clear if the company will actually take such a step. Mohan described the possibility in a lengthy blog post outlining the company’s approach to preventing misinformation from going viral. In the post, he noted that so-called borderline content — “videos that don’t quite cross the line of our policies for removal but that we don’t necessarily want to recommend to people” — can be particularly challenging to deal with.

That's because YouTube aims to remove these videos from its recommendations, but they can still spread widely when shared on other platforms. “One possible way to address this is to disable the share button or break the link on videos that we’re already limiting in recommendations,” he wrote. “That effectively means you couldn’t embed or link to a borderline video on another site.”

Mohan added that the company was still wrestling with whether it should take this more aggressive approach. “We grapple with whether preventing shares may go too far in restricting a viewer’s freedoms.” He said an alternative approach could be adding “an interstitial that appears before a viewer can watch a borderline embedded or linked video, letting them know the content may contain misinformation.”

If YouTube were to prevent sharing of some videos, it would be a dramatic step for the platform, which has repeatedly cited statistics claiming that less than 1 percent of views on borderline content come from recommendations. But critics have pointed out that this doesn’t fully address the issue, and fact-checkers and misinformation researchers have cited YouTube as a major vector of misinformation. Last month, a group of 80 fact-checking organizations signed an open letter to the video platform urging it to do more to stop harmful misinformation and disinformation.

The YouTube exec hinted at other changes to come as well. He said the company is also considering adding “additional types of labels to search results” when there’s a developing situation and authoritative information may not be available. The company is also looking to beef up its partnerships “with experts and non-governmental organizations around the world” and invest in technology to detect “hyperlocal misinformation, with capability to support local languages.”
