
Retiring the red flag
Just when you thought it was safe to go back online, Seth's going to talk about fake news again. But only because Facebook is talking about it again.
More or less since the phrase "fake news" came into its own last year (and let's not forget its adorably obnoxious sister phrase: "alternative facts"), online sources have been met with skepticism. We found out there were actually people out there, typing up falsehoods intentionally and getting the public to spread the lies throughout the web.
The problem was, none of us believed it could be the sources we ourselves were reading. Surely it was some unintelligent pleb drooling over a keyboard who would fall for such a simple ploy.
But, of course, it was and is all of us.
So people started to ponder how major junctions of the information highway could stop the trafficking of false information without violating First Amendment rights. Facebook initially allowed users to flag links and articles they thought were false for further review by fact-checkers (because new terminology is like potato chips; you can't have just one), and an eye-catching red flag reading "disputed" would appear next to the link.
Last week, Facebook said it is retiring the red flag because it actually had the opposite of its intended effect.
"Academic research on correcting misinformation has shown that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs," Facebook Product Manager Tessa Lyons said in a statement.
So, in the meantime, Facebook has replaced the red flag with a bank of related articles so users can get more context for themselves. In the same statement, Lyons said the numbers support this approach.
"Indeed, we've found that when we show related articles next to a false news story, it leads to fewer shares than when the disputed flag is shown," she said.
This seems like a good idea. We should use other sources on related topics to cross-reference the information and test its truthiness (Merriam-Webster's 2006 Word of the Year, thanks to Stephen Colbert). If only we had thought of it sooner.
Well, actually, a group of college students created a program to do just that during a technology expo in November of 2016. That's more than a year ago. The team specifically wanted to address the issue of fake news and built a browser add-on capable of doing it in about a day and a half.
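To give a rough sense of how small that core idea is, here is a minimal sketch of the cross-referencing step in TypeScript. This is my illustration, not the students' actual code; the search endpoint, its query parameter, and the response shape are all made-up placeholders for whatever news-search service such an add-on would really call.

```typescript
// Sketch only: EXAMPLE_SEARCH_URL is a hypothetical stand-in for a real
// news-search API. Assumes it returns JSON shaped like
// { articles: [{ title, url, source }] }.
const EXAMPLE_SEARCH_URL = "https://example.com/api/search";

interface RelatedArticle {
  title: string;
  url: string;
  source: string;
}

// Given a headline, fetch coverage of the same story from other outlets
// so the reader can compare accounts side by side.
async function findRelatedCoverage(headline: string): Promise<RelatedArticle[]> {
  const response = await fetch(
    `${EXAMPLE_SEARCH_URL}?q=${encodeURIComponent(headline)}`
  );
  if (!response.ok) {
    throw new Error(`Search request failed: ${response.status}`);
  }
  const data = (await response.json()) as { articles: RelatedArticle[] };

  // Keep a handful of distinct outlets; one source echoing itself
  // adds no context.
  const seen = new Set<string>();
  return data.articles
    .filter((article) => {
      if (seen.has(article.source)) return false;
      seen.add(article.source);
      return true;
    })
    .slice(0, 5);
}
```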
Now, I don't know why it's taken Facebook so long to follow in the team's footsteps, but I'm glad they are, and I'm glad they're adding in the human factor rather than letting a CPU do the steering (although, there are self-driving cars). Because if technology company Botnik's attempt to let a computer write a chapter of Harry Potter by predicting word choice has taught us anything, it's that we need a human mind behind the wheel.