
An opinion piece published this week in The New York Times gets something exactly wrong. In “Facebook Shouldn’t Fact-Check,” Jessica Lessin argues that it’s not Facebook’s job to fact-check the growing flood of fake news coming at us through its portal and through social media more generally. She does have a point, but I think she doesn’t take things far enough.

“Fake News and the Internet Shell Game” by Michael P. Lynch, a related article that appeared a day earlier in the Times, is more to the point.

Lynch opens his piece with this nugget: “The Oxford English Dictionary crowned its international word of the year: post-truth.” That means your content might be factual, but then again it might not be. If it isn’t, is it really content at all, or is it simply an attempt to deceive, designed to provoke some action that otherwise might not happen?


Social Media Marketing

Lessin’s point is that Facebook is just a set of pipes that delivers content and is not responsible for what flows through them, the same argument an Internet service provider might offer. Nebulous others bear that responsibility. Yet here is why this problem sits squarely in Mark Zuckerberg’s lap.

Facebook is in the business of delivering content to a wide variety of people for a multitude of reasons. Businesses — arguably its best customers, at least in terms of revenue — have an expectation that Facebook will deliver to them readers with a potential for making purchases or otherwise engaging with brands.

If Facebook can’t deliver the eyeballs, its revenues and profits will suffer as vendors seek other ways to reach potential customers.

That possibility becomes far more likely when readers discover that Facebook delivers content at variance with the truth, even though it is expected to deliver the truth all the time. That would be especially detrimental to any business trying to build social media marketing campaigns, for example.

Truth or Consequences

So, Facebook really does have a problem with fake news outlets sending their content through its pipes. Fake news is to Facebook what a Trojan horse is to your computer. It takes over part of your productive capacity to do things you don’t approve of.

Therefore, the opposite of Lessin’s argument is true. It’s not the Facebook user’s job to sort through the content looking for the truth. In posing as a content provider, Facebook has assumed the role of arbiter.

One thing that is true is that it’s not our responsibility to fix this mess. That’s for Facebook and all the other social networks to address. Facebook has a business imperative to ensure that its pipes are clean. Its profits depend on it.


[Source: TechNewsWorld]

By Adam