There are many issues.
First, the definition of news: current, noteworthy, accurate, unbiased information. Sure, there is hard news and soft news, but the definition implies truth produced by serious, ethical journalists, not amateurs.
Second is where that news comes from. There are mainstream sources like newspapers, TV, radio and specialist on-line sites like iTWire that adhere to the definition of news. Some of the news comes from old-style reporting – burning shoe leather – some from press releases, and some from “scoops” generated by social media and Twitter. The point of difference here is that this news is curated by real editors and journalists.
And that leads to the third issue – fakes. There are also self-publishers, bloggers, content creators, social news, special interest and lobby groups, and many more, some of whom are paid to flood the channels with so-called news, reviews, opinions, and whitepapers mixed with more than a little chest thumping and myopia.
A site called Buzzfeed, itself a “contagious, viral, social media news site” that relies on users clicking on its content to generate advertising revenue, has accused Facebook and others of letting fake news run wild during the entire presidential campaign. “The Pope endorses Donald Trump …”
In response, Facebook is alleged to have set up an unofficial task force of “more than a dozen” employees dedicated to addressing the fake news issue. That dirty dozen face an impossible task – the volume of news is too great to make a difference and then there are issues of censorship. It cannot win here.
Following Google’s lead, Facebook will similarly restrict advertising from “fake” sites on its ad network.
So, let’s not blame Facebook – its crime is that it was late to the news aggregation stakes and it has not had the experience that Google News has in using machine learning and algorithms to identify suspect news and news sources from the tsunami of items published every day.
But even those algorithms are aimed more at “popularity and repetition” than at accuracy and reality. Google’s top-ranking news result for the term “final election result” highlighted a story from a fake news site with inaccurate information on the vote tally.
The problem is that a computer finds it hard to distinguish between fact and fiction. If a “lie” is repeated across several sites, does that make it real? As the quip often attributed to Winston Churchill goes, "A lie gets halfway around the world before the truth has a chance to get its pants on."
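To see why repetition is such a weak proxy for truth, here is a toy sketch (purely illustrative – the headlines and the scoring rule are invented, and no real news algorithm is this simple): when items are ranked only by how many sites repeat them, a widely copied hoax outranks an accurate story reported once.

```python
from collections import Counter

# Hypothetical headlines observed across several sites.
# The hoax has been republished three times; the accurate item appears once.
sightings = [
    "Pope endorses Trump",          # hoax, copied by multiple sites
    "Pope endorses Trump",
    "Pope endorses Trump",
    "Official vote tally updated",  # accurate, reported by one outlet
]

def rank_by_repetition(items):
    """Order headlines purely by how often they appear - nothing in the
    score measures whether a headline is actually true."""
    return [headline for headline, _count in Counter(items).most_common()]

print(rank_by_repetition(sightings))
# The hoax ranks first simply because it was repeated most.
```

Any signal that rewards repetition alone will behave this way; distinguishing fact from fiction needs information about the source, not just the echo.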
The issue is that there is no regulation of what constitutes news. Let’s not open that argument up here, but it is the root of the issue.
But as Buzzfeed says, "A fake news website might publish a hoax, then because it's getting social attention another site might pick it up, write that story as though it's true and may not link back to the original fake news website.
"From there it's a chain reaction until at some point a journalist at a largely credible outlet might see it and quickly write something up because many journalists are trying to write as many stories as possible and write stories that get traffic and social attention. The incentive is towards producing more and checking less."
News-ranking algorithms need to be tightened: news should be drawn only from genuine news sites, with opinion and unverified items perhaps in a separate feed. But in the end it is caveat emptor – some people just like rubbish news – sorry, views!