The Shift: YouTube’s Rapid Response Partisans Game the News of Tragedy

In a phone interview from his home in Houston, Mr. Williams told me that he had created more than 10,000 YouTube videos over an eight-year period, posting as many as 20 monologues per day and racking up an estimated 200 million views.

His hit productions have included fact-challenged videos like “Barack and Michelle Obama Both Come Out The Closet,” which garnered 1.6 million views, and “Hillary Clinton Is On Crack Cocaine,” which drew 665,000. He was admitted to YouTube’s partner program, which allows popular posters to earn money by displaying ads on certain types of videos, and claims to have made as much as $10,000 a month from his channel.

“I like to call myself a reporter who reports the news for the common person,” Mr. Williams said.

Whether motivated by profit or micro-celebrity, the success of sensationalists like Mr. Williams has become a vexing problem for companies like Facebook, Twitter, and Google, which owns YouTube.

These companies sort and prioritize information for their users, and most have built ranking systems that boost news from mainstream outlets over stories from less credible sources. But those algorithms can be gamed in breaking news situations by users who work fast, uploading their videos in the valuable minutes between when news breaks and when the first wave of legitimate articles and videos appears.

“Before reliable sources put up stories, it’s a bit of a free-for-all,” said Karen North, a professor studying social media at the University of Southern California. “People who are in the business of posting sensationalized opinions about the news have learned that the sooner they put up their materials, the more likely their content will be found by an audience.”

Photo: Elmer T. Williams said he had created more than 10,000 YouTube videos, posting as many as 20 monologues per day on hot-button topics.

The phenomenon is not limited to YouTube. After last month’s mass shooting in Las Vegas, a Facebook safety check page featured a story from a site called “Alt-Right News” that made false statements about the gunman, and Google’s search results displayed a conspiracy theory from 4Chan, the notoriously toxic message board. After last month’s terrorist attack in New York City, a trending topic page on Twitter briefly featured a story from Infowars, a conservative site that is popular among the conspiracy-minded.

Conservatives have argued that YouTube unfairly targets their videos while allowing liberal channels, such as The Young Turks, to post heated political commentary. And some dispute that there is any conscious gaming going on.

“There is absolutely no strategy,” said Paul Joseph Watson, an editor-at-large at Infowars and a popular YouTube personality with 1.1 million subscribers. On the day of the Texas church shooting, one of Mr. Watson’s tweets appeared in Google search results for the shooter’s name, although it has since disappeared.

Tech companies, already under fire for the ease with which they allowed Russia to interfere in last year’s election, have also vowed to take a harder stance on domestic misinformation. Twitter’s acting general counsel, Sean Edgett, told congressional investigators last week that the company would take steps to keep false stories from being featured on trending topic pages.

“It’s a bad user experience, and we don’t want to be known as a platform for that,” Mr. Edgett said.

YouTube, whose community guidelines prohibit hateful and threatening content, has begun using artificial intelligence to help identify offensive videos. But conspiracy theories don’t announce themselves, and machines can’t yet handle the complicated business of fact-checking.

In Mr. Williams’s case, human intervention seems to have been necessary. On Tuesday, shortly after I asked YouTube some questions about Mr. Williams’s account, all of his videos disappeared, and his profile was replaced by a message saying, “This account has been terminated due to multiple or severe violations of YouTube’s policy prohibiting hate speech.”

Mr. Williams, who said he had recently left his job as an operations manager at a hazardous materials plant to focus on full-time punditry, has tangled with YouTube’s hate speech policies before. The company shut down one of his previous accounts for similar infractions, which he claimed cost him 250,000 subscribers and a lucrative income stream.

“If YouTube didn’t punish me,” Mr. Williams said, “I could easily be making over $30,000 a month.”

In a statement, YouTube said that Mr. Williams’s account was banned “as soon as it was flagged to us,” because its terms of service prohibit repeat rule-breakers from opening new accounts. It also said that its terms prohibit advertising from appearing on videos featuring “controversial and sensitive events, tragedies, political conflicts, and other sensitive topics.”

Even before this week’s crackdown, Mr. Williams was branching out. He sells cellphone ringtones on his website and was considering starting his own paid streaming service. On Tuesday night, just hours after he was banned by YouTube, Mr. Williams posted a video on Vimeo, another video-hosting platform. He pledged to keep insulting his favorite targets — Democrats, Hillary Clinton, Barack Obama — and not to shy away from controversy, no matter what the policies said.

“I don’t want to be on YouTube anymore,” Mr. Williams said. “It’s too communist.”

NYT > Technology
