Facebook Begins Asking Users To Rate Articles' Use of 'Misleading Language' (techcrunch.com)

Facebook is finally cracking down on the fake news stories that run rampant on its site and many other social media sites across the web. The company is rolling out a new feature in the form of a survey that asks users to rate articles' use of "misleading language." The feedback received will likely help Facebook train its algorithms to better detect misleading headlines. TechCrunch reports: The "Facebook Survey," noticed by Chris Krewson of Philadelphia's Billy Penn, accompanied (for him) a Philadelphia Inquirer article about the firing of a well-known nut vendor for publicly espousing white nationalist views. "To what extent do you think that this link's title uses misleading language?" asks the "survey," which appears directly below the article. Response choices range from "Not at all" to "Completely," though users can also choose to dismiss it or just scroll past. Facebook confirmed to TechCrunch that this is an official effort, though it did not answer several probing questions about how it works, how the data is used and retained, and so on. The company uses surveys somewhat like this to test the general quality of the news feed, and it has used other metrics to attempt to define rules for finding clickbait and fake stories. This appears to be the first direct coupling of those two practices: old parts doing a new job.
This discussion has been archived. No new comments can be posted.


  • Let's see (Score:5, Insightful)

    by Anonymous Coward on Tuesday December 06, 2016 @02:02AM (#53430679)

    How many days we can go without a Facebook story.

  • by Anonymous Coward on Tuesday December 06, 2016 @02:03AM (#53430681)

    I'm sure this will work perfectly, and everybody will respond honestly and accurately based on whether the story is factual, rather than whether or not it follows the correct political opinion.

    • by buss_error ( 142273 ) on Tuesday December 06, 2016 @02:13AM (#53430713) Homepage Journal

      What strikes me is that Facebook is asking the very people that believe the fake news to point out it's fake news.

      Doesn't seem very prone to accuracy to me.

      • by LeftCoastThinker ( 4697521 ) on Tuesday December 06, 2016 @02:52AM (#53430819)

        This exactly. The majority can still be wrong, and a voting system will be rife with partisan BS. There is no substitute for a good BS detector and a healthy dose of skepticism. The problem is there is no foolproof way to automate what honest journalists used to do. It will be interesting to see how many NYT or other MSM articles get flagged as mostly false.

        What FB needs to do is develop an apolitical pipeline where they don't vet the articles, they vet the sources, and then put down strict sourcing and veracity rules straight out of classical journalism 101 for those sources in the pipeline. Sources that violate these rules get flagged as satire or fiction and banned from the news pipeline and lose visibility for a period of weeks or months. A small team of investigators could check into random or flagged stories and then ban as appropriate. There are tens of thousands of stories written up every day, but probably curating 1000 pipelines would satisfy 95% of the important news.
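
        Sketched in code, that source-vetting idea might look something like this (the rules, strike limits, and ban lengths are invented for illustration, not anything Facebook actually does):

```python
# Sketch: vet sources instead of articles. Each source accumulates strikes
# for verified sourcing/veracity violations and loses feed visibility for a
# while. Rules, strike limits, and ban lengths here are illustrative only.
from datetime import datetime, timedelta

class SourceRecord:
    def __init__(self, name):
        self.name = name
        self.strikes = 0
        self.banned_until = None

    def add_strike(self, now, max_strikes=3, ban_days=30):
        """Record one violation found by the small team of reviewers."""
        self.strikes += 1
        if self.strikes >= max_strikes:
            self.banned_until = now + timedelta(days=ban_days)
            self.strikes = 0  # reset once the ban has been applied

    def visible(self, now):
        """Whether this source's stories may appear in the news pipeline."""
        return self.banned_until is None or now > self.banned_until

now = datetime(2016, 12, 6)
src = SourceRecord("example-news.example")
for _ in range(3):
    src.add_strike(now)
print(src.visible(now))                       # False: banned for 30 days
print(src.visible(now + timedelta(days=31)))  # True: visibility restored
```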

        • It's worse than that, really. The shills and the trolls (Anonymous, I'm looking at YOU, among others) and the intentional troublemakers, the people who just want to watch the world burn, will band together and call legitimate content 'misleading' and protect their own rhetoric.

          More censorship

      • Quite. If your average faceberk had the critical thinking & reading skills necessary for this then there'd be no fake news, because it wouldn't work.

      • by Monoman ( 8745 )

        What strikes me is that Facebook thinks its users are actually reading the articles. I don't have any evidence to support this, but I suspect FB users don't read much past the headline before moving on.

      • Yup, the Dunning-Kruger effect in full ...ahem... effect. Asking those who are reading (and believing) this stuff to evaluate how reliable the content is strikes me as worse than useless, and actively harmful, because positive feedback on such articles will encourage further propagation. And Facebook will be able to say that audiences rated the articles highly, so they must be OK.
        • "Let's see, an article about Hillary Clinton running a child sex & cannibalism ring out of a DC Orange Julius kiosk with the proceeds going to ISIS? 100% factual, obviously! [Submit] Heil Trump!"

          And OJGate is born...verified as factual on Facebook!

      • by mjwx ( 966435 )

        What strikes me is that Facebook is asking the very people that believe the fake news to point out it's fake news.

        Doesn't seem very prone to accuracy to me.

        Actually it's a good way to spot people who are bad at spotting fake news.

        Put out a series of control stories, X of them real and Y of them fake (anything from Fox News or Breitbart should do). Then you weight each user's votes by their accuracy on those controls. In theory this can now be used to spot fake news, as we've identified the unreliable voters.

        Facebook's problem is that they need their site to remain an echo chamber for their userbase to not shrink. So this kind of weighting would better be used to determine which st
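
        Sketched in code, the weighting might look something like this (the function names, data, and the 0.5 default weight are illustrative assumptions, not anything Facebook has described):

```python
# Sketch: weight each user's rating by their accuracy on known "control"
# stories. All names, data shapes, and thresholds are illustrative.

def user_weight(user_ratings, control_labels):
    """Fraction of control stories the user rated correctly (1.0 = perfect)."""
    judged = [sid for sid in control_labels if sid in user_ratings]
    if not judged:
        return 0.5  # no track record yet: neutral weight
    correct = sum(user_ratings[sid] == control_labels[sid] for sid in judged)
    return correct / len(judged)

def weighted_fake_score(story_votes, user_weights):
    """Weighted average of votes on one story (0 = looks real, 1 = looks fake)."""
    total = sum(user_weights.get(u, 0.5) for u in story_votes)
    if total == 0:
        return 0.0
    return sum(vote * user_weights.get(u, 0.5)
               for u, vote in story_votes.items()) / total

# Example: one reliable and one unreliable rater on two control stories.
control = {"ctrl_real": 0, "ctrl_fake": 1}
ratings = {"alice": {"ctrl_real": 0, "ctrl_fake": 1},   # matches the controls
           "bob":   {"ctrl_real": 1, "ctrl_fake": 0}}   # gets both wrong
weights = {u: user_weight(r, control) for u, r in ratings.items()}
print(weighted_fake_score({"alice": 1, "bob": 0}, weights))  # 1.0: alice dominates
```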

        • by Anonymous Coward

          The real problem with hate based politics is that otherwise intelligent people become so stupid that they assume, like you, and then soon truly believe that anything the opposition reads must be deliberately false and manipulative. The truth is that there isn't much difference between Fox News and CNN. The bias is in what they don't cover, and both end up being center-left by that metric.

          • Re: (Score:3, Interesting)

            by Anonymous Coward

            They assume alt-right = fake. The other day, one of the big websites (ABC/CBS/CNN) ran a story quoting Trump's pick for head of DHS saying "I won't rule anything out" in answer to a question about a Muslim registry. They left out the "But there will be no registry." That part was in the video, but not the article.

            Both sides.

      • by ranton ( 36917 )

        What strikes me is that Facebook is asking the very people that believe the fake news to point out it's fake news.

        While this statement is true, it is very misleading. Facebook is asking a random sampling of everyone to point out when news is fake (or misleading). Sure, there will be people who believe the stories clouding the survey results, but that is what clustering algorithms are for, among other techniques. The work of relatively few paid meta-moderators could be multiplied a thousand times over by easily identifying the Facebook users who are overly biased and/or unable to read news critically.

        In a very short period of
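
        Sketched in code, the clustering idea might look something like this (the ratings matrix, cluster count, and library choice are purely illustrative, not Facebook's actual pipeline):

```python
# Sketch: cluster users by their rating vectors to spot groups that rate
# stories very differently from everyone else, so a few meta-moderators can
# review clusters instead of individuals. Data and parameters are invented.
import numpy as np
from sklearn.cluster import KMeans

# rows = users, columns = stories; values = "how misleading is this?" in [0, 1]
ratings = np.array([
    [0.9, 0.1, 0.8, 0.2],
    [0.8, 0.2, 0.9, 0.1],
    [0.2, 0.1, 0.9, 0.2],
    [0.1, 0.9, 0.2, 0.8],   # rates almost everything the opposite way
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(ratings)
print(kmeans.labels_)  # e.g. [0 0 0 1]: the outlier cluster can be down-weighted
```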

      • by sudon't ( 580652 )

        What strikes me is that Facebook is asking the very people that believe the fake news to point out it's fake news.

        What I keep wondering is how the hell people can't tell the difference between real news and fake news? Do we really have this many people who can't read between the lines?

        Yes, I'm being sorta rhetorical. I'm aware of the recent election. But, still...

    • by BlueStrat ( 756137 ) on Tuesday December 06, 2016 @02:14AM (#53430715)

      I'm sure this will work perfectly, and everybody will respond honestly and accurately based on whether the story is factual, rather than whether or not it follows the correct political opinion.

      "Working As Intended(TM)"?

      Perhaps filtering of politically "sensitive" topics in the "news" feed is the goal and not an unintended consequence under the guise of filtering "fake news"?

      Strat

      • by Anonymous Coward

        I'm sure this will work perfectly, and everybody will respond honestly and accurately based on whether the story is factual, rather than whether or not it follows the correct political opinion.

        "Working As Intended(TM)"?

        Perhaps filtering of politically "sensitive" topics in the "news" feed is the goal and not an unintended consequence under the guise of filtering "fake news"?

        Strat

        Ya think?

        Why aren't we hearing about how "fake" all the "Look! Russian hacker! SQUIRREL!!! Look at the RUSSIAN squirrel!!!!" stories are?

        Given the "fake news" craze only appeared AFTER Hillary! lost, I suspect it's just an attempt by Democrats/leftists/"progressives"/media to try to regain their ability to control the "narrative". The ability that Trump took from them.

        Look how they tried so damn hard to stifle the story of Hillary!'s illegal email server and blatantly incompetent handling of classified ma

    • Yeah, I thought about that too (I'm sure most of us did), but the existence of this button by itself is a good thing, because it will get people to start thinking about fake news.

      Anything that gets people to start considering that things might be fake news will give them slightly better critical thinking skills, even if only slightly.
    • I'm sure this will work perfectly, and everybody will respond honestly and accurately based on whether the story is factual, rather than whether or not it follows the correct political opinion.

      They want to do this because they think that the correct political opinion of the masses agrees with them. They also thought that the current political opinion of the masses agreed with them prior to the election, too. I think they may be in for a bit of a shock (again) when they discover that their ideals are not as widespread as they believe them to be.

    • by gsslay ( 807818 ) on Tuesday December 06, 2016 @06:12AM (#53431285)
      It'll work just like all online vote systems work, including the slashdot scoring right here. People will down-vote not because they think the story is false or misleading, but because they don't like what it says. Or because they don't like someone it features. Or because they disagree with an opinion given.

      And the reverse for things where they like what is said, or would like to think it's true.

      Unless you have the time to do your own research on every news story, the best source of news is a source that you trust to be true and accurate. A source that depends on its reputation and cannot afford to lose its readers' trust. Anonymous voting systems involve neither trust nor reputation.
      • by AmiMoJo ( 196126 )

        Facebook promotes comments that get a lot of likes. During the Brexit campaign I noticed that often when people posted fake news a comment debunking it would appear right underneath due to the number of likes it got. If they could find a way to make the debunking more prominent, or maybe even make it the main post with the original lie as a footnote for context, it would go a long way to fixing this problem.

        At the very least, present both side by side for comparison so people can make up their own minds.

      • I'm not sure the Slashdot moderation does what you say it does. The meta-moderation seems to work pretty well, as in, obvious trolls from both the nationalist alt-right and the SJW left are often modded down. Topical opinion on both sides seems to float up. Not to say it always works, but it works better than thumbs up/down or other shitty yes/no moderation.

      • Comment removed based on user account deletion
    • it will work as well as reddit

    • by sudon't ( 580652 )

      I'm sure this will work perfectly, and everybody will respond honestly and accurately based on whether the story is factual, rather than whether or not it follows the correct political opinion.

      Right. I think it'll come out as close as the recent election, with half voting up, and half voting down. That should be a big help to FB.

  • Slashdot routinely has awful headlines, such as the story about only one major tech company saying they wouldn't partake in Trump's Muslim registry. In reality, most of the companies just didn't comment, but the headline implied that most tech companies would go along with the plan. I've seen a fair amount of misleading journalism from Slashdot lately, and it seems like that blue checkmark should be revoked again.

  • And on that subject (Score:5, Interesting)

    by Okian Warrior ( 537106 ) on Tuesday December 06, 2016 @02:10AM (#53430709) Homepage Journal

    And... they're off! [nytimes.com]

    Hillary campaign bus involved in deadly crash [washingtonstarnews.com].

    And of course, CNN falsely admits [breitbart.com] it aired pornography for 30 minutes on Thanksgiving.

    Synopsis of previous link:
    1) A Twitter user in the Boston area reported that CNN was airing hardcore pornography for 30 minutes through local provider RCN.
    2) Picked up by The Independent, a leading left-of-center newspaper based in the United Kingdom.
    3) Subsequently picked up by many other media outlets, including Variety magazine, the U.K. Daily Mail, the New York Post, Esquire magazine, Mashable, &c.
    4) Eventually, CNN actually confirmed that it did air “inappropriate content” and was seeking an “explanation” from RCN.

    Of course, nothing of the sort happened.

    Mainstream media has a bit of a credibility problem, yah?

    • by Anonymous Coward

      Mainstream media has a bit of a credibility problem, yah?

      That's the issue right there: the blame does not go to the perpetrator, but to all news sites. Therefore there is no incentive to fact-check next time, because the blame is socialized. Better to rank news sites by their fraction of false stories.

    • None of that makes alternative media any better. There's nothing wrong with pointing out the problems the media has. Indeed, it is healthy and necessary, as the only way we can hope to improve the media is to point out its problems and demand that they be fixed.

      The issue is that is not what many of the people who call themselves skeptical of the media are doing. Rather they seem to be taking the view that MSM is bad so that means whatever alternate media site they read is good and accurate all the time. They'll be

    • by AmiMoJo ( 196126 )

      There is a difference between media outlets rushing to publish a story and screwing up, but having it corrected and debunked within a day, and the fake news conspiracies like Pizzagate that drag on for months even after being shown to be completely false.

      The media has started to get more active at debunking stuff now. It turns out that debunking articles make good clickbait too. The remaining big problem, which is harder to solve, is bubbles preventing the message getting out, and echo chambers amplifying c

      • by Raenex ( 947668 )

        fake news conspiracies like Pizzagate that drag on for months even after being shown to be completely false

        How was it shown to be "completely false"?

        • by AmiMoJo ( 196126 )

          TFA goes into some detail about how it was discovered that every claim made was false.

          • by Raenex ( 947668 )

            TFA goes into some detail about how it was discovered that every claim made was false.

            Which article? Provide a link, please, because the article [techcrunch.com] for this story doesn't talk about Pizzagate.

            I want to see the article that shows every claim made was false. Though I'm sure the mainstream would never spin a story, ignore facts, or otherwise, right? It's only those fake news sites like Breitbart that we have to worry about, right?

            A little something [archive.is] for you to ponder while you're digging up that article.

            • by AmiMoJo ( 196126 )

              My bad, I got mixed up with the Pizzagate article today. Here are the links:

              http://arstechnica.com/tech-po... [arstechnica.com]
              http://www.nytimes.com/2016/11... [nytimes.com]
              https://www.buzzfeed.com/craig... [buzzfeed.com]

              By the way, did you notice how the owner of the pizza restaurant didn't post the photo you linked to? And how it's not proof of anything other than someone making a really bad joke? This is what conspiracy theories are made of.

              • by Raenex ( 947668 )

                By the way, did you notice how the owner of the pizza restaurant didn't post the photo you linked to?

                Actually, he did. That's an archive of a post on "jimmycomet"'s Instagram, a.k.a. James Alefantis.

                And how it's not proof of anything other than someone making a really bad joke?

                At this point, there's just suggestive evidence. And that is, by no means, the only photo involving young children in really bad taste. But remember, we're being told every claim is false. Except for those pesky photos. And a bunch of other stuff that are facts. So I'm just wondering how all these news sites know nothing untoward is going on here.

    • Part of the reason this "story" got play is that it contained a partial truth. CNN did actually inadvertently air a very short segment containing mild, or at least "ambiguous," nudity, not "30 minutes" of it as the fake story claimed.

      A lot of dodgy news stories are like that: they contain enough truth to give them some legitimacy and to make them hard to outright refute. Disputes are nuanced, and nuance doesn't sell or spread in the age of sound bites.

      Another common example was Hillary's alleged "illegal server".

    • by Tablizer ( 95088 )

      "Hillary campaign bus involved in deadly crash"

      That story was actually true if you remove the word "bus".

  • by evanh ( 627108 ) on Tuesday December 06, 2016 @02:15AM (#53430719)

    ???

    • by tsotha ( 720379 )
      Pretty much. Without some kind of human intervention on Facebook's part it's really not functionally different than dislike/flag.
  • by TodPunk ( 843271 ) on Tuesday December 06, 2016 @02:16AM (#53430727) Homepage

    I enjoy this trend of assuming that crowdsourcing will solve it, because people know what "misleading language" is. I don't think it can actually solve it, but the optimism is interesting. Whatever happened to assuming the universe would build a better idiot, and the user is always wrong?

    Times seem to have changed.

    • People here at dashslot seem to have a good grasp on shitty headlines. And no one goes out of the way to compliment a good headline. I expect a few abuses as statistical outliers, but "people are stupid" applies everywhere. And anyone coding this has to consider that.

      Or do we assume that the people behind this *don't* know that people are stupid? That we alone understand this secret?

      There are piles of master's theses to be written this year. One of them will probably cover this, and we will read it if it is

  • Ah yes, I see - for now they'll try to correct only "misleading language", not informative features or flamebaits, etc.
    They still have, let's say politely, a large growth potential ;-)
    CowboyNeal should have patented metamod :-D

  • by mi ( 197448 ) <slashdot-2017q4@virtual-estates.net> on Tuesday December 06, 2016 @02:24AM (#53430747) Homepage Journal

    The following Party-approved Fake News stories need not be flagged — indeed, tagging them as anything other than deeply concerning may cause your account to be suspended:

  • ... independent ones, and independent journalists.

    So this was a headline neglecting to mention her white nationalism. If Facebook asks a black person whether the headline is misleading, they're generally going to get a very different answer from a white person. Also, does Facebook actually know if it's asking a black person?

    Here's the Philly article:
    http://www.philly.com/philly/b... [philly.com]

    Here's the version from bleeding heart liberal progressive magazine, Sports Illustrated:
    http://www.si.com/mlb/2016/12/... [si.com]

    It's

    • by Anonymous Coward

      I am wondering if one could make a machine learning algorithm that predicts, given a news story's text, whether it is likely fake.
      Train it on news from two months ago, labeled using what we know now from the fact-checking and investigation by journalists and organisations, as you point out.
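
      A minimal sketch of what such a classifier could look like (the stories, labels, and model choice are invented for illustration):

```python
# Sketch of the classifier described above: train on older, already
# fact-checked stories, then score new ones. Data and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Stories from two months ago, labeled 1 = shown to be fake, 0 = held up.
old_stories = [
    "Celebrity secretly endorses miracle cure, doctors furious",
    "City council approves budget for new transit line",
    "Shocking proof the election was decided by lizard people",
    "Local factory announces 200 new jobs after expansion",
]
labels = [1, 0, 1, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(old_stories, labels)

# Probability that a new headline looks like the known-fake examples.
print(model.predict_proba(["You won't believe what this senator is hiding"])[0][1])
```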

  • Comment removed (Score:4, Insightful)

    by account_deleted ( 4530225 ) on Tuesday December 06, 2016 @02:31AM (#53430771)
    Comment removed based on user account deletion
    • The problem with 1984 is that the Ministry of Truth was propaganda, not objective truth. We actually do rely on society, and even government, to tell us what things are actually, really true. That's the entire education system, for example. No, it doesn't always get it right.

  • by SlovakWakko ( 1025878 ) on Tuesday December 06, 2016 @03:28AM (#53430893)
    Most posts get published/read inside a like-minded network of contacts. These are not very likely to judge an article objectively, and only very few people from outside the network, with a different point of view, will get an opportunity to present their opinion (or will care to do so). On the other hand, this may be a good way to quickly judge the craziness of a community, to discover its particular bias and use that to feed the community more attractive fake news which will get clicks. Hooray, more $$$ for Macedonia :)
    • by Pascoea ( 968200 )

      judge the craziness of a community, to discover its particular bias and use that to feed the community

      Wait. Are you saying this is another attempt by Facebook to collect more data about us? Say it ain't so!

  • Facebook still seems to be bent on co-opting the labors of the unwashed masses for free. This has some major drawbacks:
    * The Dunning-Kruger effect (people thinking they are knowledgeable about something when they are actually the opposite).
    * Grass-roots campaigns to get some particular POV propagated - because people have difficulty separating their particular political or religious view from objective reality; for them, that view is the reality and "the right thing".

    In short, FB get the quality that they pa

  • by Gumbercules!! ( 1158841 ) on Tuesday December 06, 2016 @03:48AM (#53430935)
    So Facebook's brilliant plan to solve "fake news articles" tricking idiots into believing things is to ask those same idiots who couldn't tell it was fake news to tell them that it was fake news? Yeah, that checks out. Go right ahead Facebook. I don't see any flaws in that logic, at all.
  • Obviously, they should learn from older forums on the internet and use moderation and meta-moderation. The use of "Agree" and "Disagree" buttons alongside those that add or subtract karma points is also a good step, but of course Facebook wanted everyone to just have a positive outlook on life, hence the Like button being the only one they had for years.

  • We should have this on /. too. "Use of misleading language" is something I encounter here a bit too often for my liking, even if it's not straight-up fake news.
  • by alternative_right ( 4678499 ) on Tuesday December 06, 2016 @04:26AM (#53431005) Homepage Journal

    To all media goodthinkers:

    We need the sheep to stop deviating from the orthodoxy. Repeat "fake news" until the citizens cry for non-conforming news to be censored.

    Sincerely, the Inner Party.

  • Does this qualify as misleading? Facebook Terms, Section 9.3 - "You understand that we may not always identify paid services and communications as such"
  • by swb ( 14022 ) on Tuesday December 06, 2016 @08:01AM (#53431691)

    IMHO, the problem isn't just fake news but a broader, and longer term problem of general dishonesty in society that's been going on for decades.

    * Government dishonesty since at least Vietnam and/or Nixon. Two examples where the government actively lied and/or stretched the truth, and there are many others. Many people have long since internalized this view of government honesty.

    * General misleading nature of advertisements. We're constantly bombarded with misleading messages about every day items and we've all had experience where the product doesn't align with its promises.

    * Corporate dishonesty -- outright lying. Karen Silkwood, Thalidomide, Corvair, Pinto, corporations relentlessly covering up and lying about bad products, corporate misdeeds and so forth. And these are all very old examples just to demonstrate how it has been going on for decades.

    * Employer dishonesty -- The relentless messaging from management about business goals and plans for employees. How often is it true or does it end up improving employee work lives? Almost never. Most people impulsively parse and disbelieve what management tells them because it's so often the opposite of what they're told.

    * The near-legal practical status of scams and cons -- We're constantly assaulted by outright dishonest people. Spam email, "card services", "free cruises". Yes, it's illegal and few people believe it at face value but there's so little effort to stop it that it seems to be legitimized as a means of doing business.

    * Ideological dishonesty -- across the political spectrum all ideological advocates both embrace untruths necessary to advance their cause and discount their critics when it seems patently obvious they're not being honest.

    It's not just fake news -- belief in fake news is just a symptom of the relentless, never ending crisis of honesty in our culture. Lying and misleading is so ingrained in our culture that doubting is our first impulse. So why not buy into fake news and conspiracy? Lies and conspiracies have quite often been shown to be true, why should I have any faith that person/institution X is telling the truth and not lying to me and that the conspiracy is false?

    Until the Internet, the news media was actually one of the last institutions to *mostly* tell the truth -- libel laws, the business nature of actually printing news, journalism as an actual profession with a sense of ethics and some mission to tell the truth -- mostly worked against fake news, which was (in the US anyway) generally marginalized into corners of celebrity gossip or supermarket tabloids. It just wasn't practical to create fake news when you needed a press run of a million copies on a regular basis and a distribution network.

    • If I had mod points I'd mark this as the most insightful response on the topic -- but then I'd be unable to add to it.

      I agree with all that's said but would extend it to include the "filter bubble" effect that Facebook, Google etc seem to amplify: it's likely that each person sees more and more from sources that FB/G's algorithms deem to be "interesting" to them and fewer and fewer views that disagree. This tends to reinforce opinions in a way that 'older' sources could not do.

      I know that some newspapers

  • If you are getting your news from Facebook you have already lost the battle.

    A dude holding a cell phone asking leading questions to someone at some protest isn't "news", it's usually a biased asshole looking for their 2 minutes. Some article about Obama getting impeached or Trump punching a Mexican posted by my dumb ass cousin isn't "news", it's an "article" they picked up off one of their dumb ass friends feeds and shared without reading it.

    It's way easier once you realize that pretty much everything news

  • The problem is political zealots posting unsolicited political crap that all their "friends" are forced to endure. Whether that crap is true or false or misleading is irrelevant. The goal should be to get people back to talking about their lives, like people used to do when they got together in meatspace, before it was so easy to hide behind the keyboard.

  • What on earth makes them think people are going to read the articles before responding?
  • Why make just one common blacklist when we could have hundreds, maintained by different groups of people? Some will have a Republican bias, some Democrat, some Chinese, some will agree with programmers, or Turks, and so on. Basically you could subscribe to a bunch of them, and if the combined score is above a threshold, then the article gets removed from your feed. If you change your mind you can always subscribe to other blacklists.
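
    Sketched in code, the subscription-and-threshold idea might look something like this (list names, scores, and the threshold are made up for illustration):

```python
# Sketch: every blacklist a user subscribes to contributes a score per domain,
# and the feed drops anything whose combined score crosses a threshold.
from urllib.parse import urlparse

blacklists = {
    "skeptics_weekly":  {"totallyrealnews.example": 1.0, "clickbait.example": 0.6},
    "programmers_list": {"clickbait.example": 0.8},
}

def feed_filter(articles, subscriptions, threshold=1.0):
    """Keep only articles whose summed blacklist score stays under the threshold."""
    kept = []
    for url in articles:
        domain = urlparse(url).netloc
        score = sum(blacklists[name].get(domain, 0.0) for name in subscriptions)
        if score < threshold:
            kept.append(url)
    return kept

feed = ["https://clickbait.example/shocking", "https://example.org/city-budget"]
print(feed_filter(feed, ["skeptics_weekly", "programmers_list"]))
# -> only the example.org story survives (0.6 + 0.8 >= 1.0 for clickbait.example)
```
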
  • ...until they realize that public opinion does not correspond to their agenda.

  • All the sheep will happily report any story they don't like, burying lots of good stories to preserve their news bubble.
  • Are you kidding me? You want the SAME PEOPLE who get taken in by 'Fake News' to also POLICE it?

    Fucking funniest thing I've read this year. Facebook is the Wal-Mart of the internet. Useless ass clowns in bad clothes buying useless crap.

    LMFAO.

  • Facebook users are flagging THIS as misleading, and you won't believe WHY!
