In 2015, Facebook’s interest in the world of news became more overt with the launch of “Instant Articles,” keeping users on the Facebook platform instead of sending them to the outside web when they clicked on a link to an article. But the quest to conquer content distribution and consumption on the internet started well before then.
Remember, Facebook used to have “a wall” and now it has a “News Feed.”
Facebook has gradually built out its platform to serve publishers, whether it’s through analytics dashboards or Instant Articles or rich-media link previews, through Follow buttons or Verified pages or the Trending section of the News Feed. And, through no fault of its own, Facebook also happens to have a lot of eyeballs.
Obviously, publishers became interested, and then hooked — with little power to negotiate against Facebook.
Until December, the company took no responsibility for the way Instant Articles blurred media brands, making news content from every source look the same. Facebook surfaced this site or that site in the Trending section with little to no regard for the most accurate and trusted sources. And yet, former Facebook News Feed editors admitted to suppressing conservative stories and boosting left-wing content, which shows some awareness that Facebook could influence its users via the news that flows through the social network’s veins.
Before the election, Facebook stuck with the party line: “We are a tech company, not a media company.”
It wasn’t until after the election, after BuzzFeed exposed how influential Facebook’s fake news problem could have been, that Zuck started to change his tune.
No president has ever represented the interests of 2 billion people, nor has anyone ever attempted to regulate the flow of information for a group so large and diverse while taking responsibility for their experience using the service. Facebook is willing to play that role. But given the breadth of its users and the depth of the problem, I highly doubt that Facebook will “move fast, break things.”
Instead, Facebook is pushing half-hearted solutions and scapegoating the issue to other parties, prepping to weather this storm on an already-laid foundation of media dependence and habit.
November 10, 2016
Zuckerberg’s initial reaction, two days after the election, was to diminish fake news on Facebook as an issue:
“Personally, I think the idea that fake news on Facebook — of which it’s a small amount of content — influenced the election in any way is a pretty crazy idea,” said Zuckerberg at the Techonomy conference. “I do think there is a profound lack of empathy in asserting that the only reason someone could’ve voted the way they did is fake news.”
That fails to account for the fact that more than half of U.S. adults get their news from social media, that most students have trouble distinguishing fake news from real news, and that 20 percent of social media users say they have modified their stance on a social or political issue because of content seen on social media.
November 12, 2016
Zuckerberg then recalibrated his language with a post on Facebook, sticking with the idea that it’s highly unlikely fake news affected the outcome of the election:
This is an area where I believe we must proceed very carefully though. Identifying the “truth” is complicated. While some hoaxes can be completely debunked, a greater amount of content, including from mainstream sources, often gets the basic idea right but some details wrong or omitted. An even greater volume of stories express an opinion that many will disagree with and flag as incorrect even when factual. I am confident we can find ways for our community to tell us what content is most meaningful, but I believe we must be extremely cautious about becoming arbiters of truth ourselves.
There are a few glaring problems with this. The first was explained well by my colleague Lora Kolodny in this post: “Zuckerberg’s comment draws a false equivalency between ‘mainstream sources’ of news (including TechCrunch) and political groups masquerading as news brands.”
The second issue is that Zuckerberg is holding to the idea that Facebook cannot take full responsibility for the content, only for the pipes that carry it.
November 18, 2016
Zuckerberg’s third reaction (also posted to his Facebook page) added specifics about how the company plans to combat fake news, revealing under-construction features like stronger detection, easier reporting, third-party verification, warning labels, elevating the quality of “related articles,” disrupting the economics of fake news and “listening.”
In the post, he lamented that the issue of fake news is both technically and philosophically complicated. On that, he’s right. Determining the truth is more complicated than an algorithm can handle.