Once again, the impartiality of Facebook’s news features is being called into question. This time, the social network claims it “accidentally” blocked all links to the leaked Democratic National Committee emails published by Wikileaks just ahead of the party’s convention.
This past Friday morning, the infamous publisher of anonymous leaks released nearly 20,000 internal emails from members of the Democratic Party’s formal governing body. The news, where it was seen, provoked outrage, especially among supporters of Bernie Sanders, who, the emails showed, had received unfair treatment from party officials who were supposed to remain impartial.
While social media outlets have helped facilitate the spread of uncovered information in the past, Facebook is no trusted ally of Wikileaks.
“@Facebook is blocking #DNCLeak email links,” Wikileaks tweeted Saturday evening, following individual reports of Facebook suppressing the documents hours earlier. “Monday is the Democratic National Convention.”
In a reply about three hours later, Twitter user @SwiftOnSecurity said, “@wikileaks Facebook has an automated system for detecting spam/malicious links, that sometimes have false positives. /cc @alexstamos”
Then, without elaborating, Facebook Chief Security Officer Alex Stamos replied to both tweets: “It’s been fixed,” Stamos said.
Wikileaks later tweeted that Facebook explained it all away as an “accident.”
A Facebook representative attempted to clarify, telling Gizmodo, “Like other services, our anti-spam systems briefly flagged links to these documents as unsafe. We quickly corrected this error on Saturday evening.”
But tech blogs aren’t just letting this go. The Next Web said Facebook’s correction “is great — but also not really the point,” adding that there seems to be a “very tight reign [sic] on what’s allowed on Facebook.”
Complaining about Facebook is nothing new, but this episode of newswire censorship amounts to more than ignored demands for a Dislike button.
In May, former employees of the social networking service blew the whistle on how news stories and trends were generated on the platform. Instead of an unbiased algorithm, human “curators” were revealed to be making decisions on what deserved to be a top headline.
More recently, Facebook took down and re-uploaded a Facebook Live video showing the immediate aftermath of the police shooting death of Philando Castile — an anomaly the site chalked up to a “technical glitch,” TechCrunch reported.
One of the more peculiar cases of Facebook post policing occurred in November 2015, when U.K. student Roua Naboulsi had a lengthy status update removed. Her post criticized the selective sentimentality over the terrorist attacks in Paris, France, asking why the same response didn’t come for brown-skinned victims of terror; it garnered 9,000 shares and 12,000 likes before Facebook took it down, RT reported.
In a twist, Facebook also faced harsh criticism earlier this month for what it refused to censor: a graphic Instagram video of victims of the Bastille Day attack in Nice, France, which remained viewable on the platform. Whether the complaint is over-removal or under-removal, the episodes suggest that as egregious as Facebook’s latest censorship lapse may be, it is merely another expression of the same pattern.