
Facebook Fallout, Facts and Frenzy

Posted by samzenpus
from the to-friend-or-not-to-friend dept.
redletterdave (2493036) writes: Facebook chief operating officer Sheryl Sandberg said the company's experiment designed to purposefully manipulate the emotions of its users was communicated "poorly". Sandberg's public comments, which were the first from any Facebook executive following the discovery of the one-week psychological study, were made while attending a meeting with small businesses in India that advertise on Facebook. "This was part of ongoing research companies do to test different products, and that was what it was," Sandberg said. "It was poorly communicated. And for that communication we apologize. We never meant to upset you." anavictoriasaavedra points out this article that questions how much of the outrage over an old press release is justified and what has led to the media frenzy. Sometimes editors at media outlets get a little panicked when there's a big story swirling around and they haven't done anything with it. It all started as a largely ignored paper about the number of positive and negative words people use in Facebook posts. Now it's a major scandal. The New York Times connected the Facebook experiment to suicides. The story was headlined "Should Facebook Manipulate Users," and it rests on the questionable assumption that such manipulation has happened. Stories that ran over the weekend raised serious questions about the lack of informed consent in the experiment, which was done by researchers at Cornell and Facebook and published in the Proceedings of the National Academy of Sciences. But to say Facebook's slight alteration of news feeds caused people to suffer depression seems to be unsupported by any kind of data or logic.
  • We're Sorry (Score:5, Funny)

    by dcw3 (649211) on Thursday July 03, 2014 @08:17AM (#47375059) Journal

    We're sorry....
    .
    .
    . ...that we got caught.

    • by GoCrazy (1608235)

      It all started as a largely ignored paper

      Can you really qualify that as being caught though?

    • "Caught"? So this was leaked?
      • Caught, leaked, fruit of the poisonous tree... the method of delivery becomes a moot point once it's the topic to rage about with the short attention span tribe.

        Linked to suicides.

        Cue the lawsuits.

    • by schlachter (862210) on Thursday July 03, 2014 @08:35AM (#47375159)

      Facebook has released several different responses to this issue and is closely monitoring how people in each of the different experimental groups respond to these releases.

    • by Sockatume (732728) on Thursday July 03, 2014 @09:26AM (#47375453)

      "Dear customers. We are really sorry that you're so upset at our great study. We're super glad that we did the study but so very very sorry that you guys were upset by it. When we do it again, let's work together to find a way that you could just not be so upset about it."

      • Why doesn't FB just man up and tell the truth, e.g., "Dear people who have Facebook accounts: Until you start paying for this service, you have no say in how we operate. Our real customers, i.e., advertisers, were quite pleased with the results of our little test. They now have more insight into you and can better target their advertising to you. This, in turn, makes us more valuable to them and we make more money by growing our customer base and/or raising our advertising rates. Now go about your business..."

    • by JStyle (833234)

      This reminds me of this one time, as a kid, I threw a rock at someone really far away. I didn't actually want to hit them, and never thought I would. The rock nailed them square in the back... It was a really weird apology. "Um... yes, I was aiming for you, but I never thought I'd hit you! Sorry!"

      • by dcw3 (649211)

        I think this kind of thing happens more often than we realize. With all the TV shows where people have pranks pulled on them, I'd love to know how often they go wrong. I'm surprised that more of the pranks don't end up with the person who is pulling them getting their ass kicked, or arrested.

      • That was you?

        I never did figure out who hit me on the back with a rock. I *suspected* it was you, JStyle, but then I was like, nah. He's too far away.

  • by theodp (442580) on Thursday July 03, 2014 @08:18AM (#47375065)

    Facebook Experiments Had Few Limits [wsj.com]: "Thousands of Facebook Inc. users received an unsettling message two years ago: They were being locked out of the social network because Facebook believed they were robots or using fake names. To get back in, the users had to prove they were real. In fact, Facebook knew most of the users were legitimate. The message was a test designed to help improve Facebook's antifraud measures... 'There's no review process, per se,' said Andrew Ledvina, a Facebook data scientist from February 2012 to July 2013. 'Anyone on that team could run a test,' Mr. Ledvina said. 'They're always trying to alter people's behavior.'... The recent ruckus is 'a glimpse into a wide-ranging practice,' said Kate Crawford, a visiting professor at the Massachusetts Institute of Technology's Center for Civic Media and a principal researcher at Microsoft Research. Companies 'really do see users as a willing experimental test bed' to be used at the companies' discretion."

  • Oh those poor media outlet editors, panicking about missing the next big story. Surely their fragile egos should not be sacrificed to such banalities as truth and common sense?

    Instead, we should allow them to play games of telephone with facts, because that way no one's feelings (advertising revenue) get hurt.

    • by Sentrion (964745)

      Disclaimer: The preceding post is part of a research project to study the emotional reactions of media outlet editors.

  • by Shadow of Eternity (795165) on Thursday July 03, 2014 @08:25AM (#47375103)

    This should never have made it through the ethics board.

    • by gstoddart (321705) on Thursday July 03, 2014 @08:45AM (#47375201) Homepage

      This should never have made it through the ethics board.

      Ah, but Facebook isn't a university ... they don't have one of those.

      So, either they went to the scientists and said "hey, we want to find something out", or the scientists went to Facebook and said "hey, we could do an awesome experiment on your users".

      Either way, Sandberg sounds like an unapologetic smug idiot who more or less said "they're our users, we do this shit all the time".

      The people who run Facebook are assholes, and don't give a crap about anything more than how they can maximize ad revenue. And Zuckerfuck is a complete hypocrite about privacy -- his users get none, and he jealously guards his own.

      • by SeaFox (739806)

        This should never have made it through the ethics board.

        Ah, but Facebook isn't a university ... they don't have one of those.

        Which one? A board for these issues, or ethics?

        /rimshot

    • by RobertJ1729 (2640799) on Thursday July 03, 2014 @10:01AM (#47375805)
      The scientists represented to the IRB that the dataset was preexisting, and so the IRB passed on the review. It's not clear that the dataset was preexisting, though, since the study seems to indicate that the scientists were involved in the design of the experiment from the beginning. What's more, the paper itself claims to have obtained informed consent when it is clear there was none.
    • by thegarbz (1787294)

      Why not? Considering the ensuing emotional shitstorm that comes from simply changing the homepage layout, this experiment is actually quite tame in comparison.

  • by Anonymous Coward

    Get rid of your account. Be free.

  • No one outside of the "twitterati" cares about this. "Designed to purposefully manipulate the emotions of its users"? Huh, sounds like advertising, as so many others have pointed out.
  • ...yes, sometimes companies do use you, their customer... or in the case of Facebook, their product.

  • by Sockatume (732728) on Thursday July 03, 2014 @08:34AM (#47375147)

    "the questionable assumption that such manipulation has happened"

    They literally wrote a peer-reviewed scientific paper demonstrating that they manipulated people's moods to a statistically significant degree; I don't think there's much you can call questionable about it from Facebook's perspective.

    • by Tharkkun (2605613)

      "the questionable assumption that such manipulation has happened"

      They literally wrote a peer-reviewed scientific paper demonstrating that they manipulated people's moods to a statistically significant degree; I don't think there's much you can call questionable about it from Facebook's perspective.

      And what do you call advertising, commercials and the nightly news? The same damn thing.

  • by StoneCrusher (717949) on Thursday July 03, 2014 @08:36AM (#47375163)
    Ah... the apology that puts the blame on the victim. A hallmark of abusers and sociopaths everywhere.

    Now, everyone will have noticed that they are only apologising for the miscommunication, not the act of psychological experimentation (as if we would be OK with it if they had told us). But it goes deeper...

    Notice that they put the action and the apology in two different sentences, followed quickly by a "We never meant to upset you," putting the emotional blame back on us. As if we were just accidentally bumped bystanders, not the actual targets of the actions.

    And they never once use the word "sorry", only the big weasel phrase "we apologise". This apology goes right along with the classic phoney apologies...

    I'm sorry that you got upset.
    I'm sorry that you feel that way.
    I'm sorry that you made me do that.
  • Ethics (Score:2, Insightful)

    by ceoyoyo (59147)

    Human experimentation without review board approval and informed consent violates a number of national and international laws. It doesn't matter whether anyone gets hurt.

    • Re:Ethics (Score:5, Informative)

      by msauve (701917) on Thursday July 03, 2014 @08:46AM (#47375209)
      Cites, please? Because I have one which counters that claim.

      Importantly -- and contrary to the apparent beliefs of some commentators -- not all HSR is subject to the federal regulations, including IRB review. By the terms of the regulations themselves, HSR is subject to IRB review only when it is conducted or funded by any of several federal departments and agencies (so-called Common Rule agencies), or when it will form the basis of an FDA marketing application. HSR conducted and funded solely by entities like Facebook is not subject to federal research regulations...

      - Everything you need to know about Facebook's manipulative experiment [wired.com]

      • by Sockatume (732728)

        Given that subjects were not geographically constrained (they were randomly selected by user ID), the US isn't the only nation whose laws apply to this research.

        • by msauve (701917)
          So, your point is that you can't point to any foreign laws which have been violated, either.
          • by Sockatume (732728)

            Human subjects research is subject to mandatory informed consent - specific to the study being performed; you can't just have boilerplate like the Facebook ToS - in almost all jurisdictions. For example, this is the US law Facebook undoubtedly broke:

            http://www.hhs.gov/ohrp/humans... [hhs.gov]

            • by Sockatume (732728)

              Looks like this doesn't apply. Federal funding requirement.

              • by ShaunC (203807)

                Looks like this doesn't apply. Federal funding requirement.

                The obvious joke response is that Facebook probably gets all kinds of funding from NSA.

                A more serious response is that it's not quite clear. UCSF and Cornell both participated in this project to some degree, and they're both subject to HHS regulations since they do get federal funding. Whether or not the whole project was then required to follow the rules depends on what exactly the university participants did.

      • by Sockatume (732728)

        That doesn't discuss informed consent, which under Federal law requires that study participants be given specific information about the purpose, risks, procedures, duration, etc. etc. of the research.

        http://www.hhs.gov/ohrp/humans... [hhs.gov]

      • by Sockatume (732728)

        If nothing else it violates PNAS' own policies, because it's in clear breach of both the Declaration of Helsinki and ICMJE requirements on informed consent.

        http://www.pnas.org/site/autho... [pnas.org]

        • by msauve (701917)
          Reading is fundamental.

          Because the two academic authors merely designed the research and wrote the paper, they would not seem to have been involved, then, in obtaining either data or informed consent.

          - idem (original all caps emphasis removed to avoid /. lameness filter)

          • by Sockatume (732728)

            ...and? I don't see a part of PNAS' policies where it says "it's okay to breach these rules, so long as the people who made the breach and the people who performed the data analysis aren't the same".

          • by Sockatume (732728)

            Adam I Kramer, the Facebook analyst responsible for the part of the research considered ethically dubious, is the first and corresponding author on the paper.

      • "while a Facebook employee was the lead researcher, there were co-authors affiliated with institutions of higher education (University of California, San Francisco and Cornell University) that most certainly adhere to the requirement."

        http://www.hawaiiweblog.com/20... [hawaiiweblog.com]

        Meanwhile..

        "PNAS (the journal) has a policy requiring IRB ethics review for all published studies that experiment on humans, regardless of whether academic or corporate[1]. A Cornell press release[2] says this work was also funded..."

    • by Anonymous Coward

      This surely isn't a one-time experiment. They likely have piles of data about tests they have been doing in secret.

      And this only measured posts, not how people actually felt in response to the posts. They don't actually know if what they saw affected people's day in a real way.

      Who is watching these companies?
      They know so much about us. We're all little playthings to them.

    • I'm in trouble then. In the last couple of weeks, I've performed a number of human experiments on the website I manage, including:
      * Do they push green buttons more than red buttons?
      * Do they fill in forms more reliably if it's one big form, or split across multiple pages?
      * Do people finish reading a page more often if the text is in a large font rather than a small one?

    • Surely it can't be all human experimentation that's forbidden, or else ad agencies couldn't attempt to measure the effectiveness of their ads, and parents couldn't raise their children (e.g. "Let's try withholding cookies and see if that works!").

      There must be specific parameters under which human experimentation is illegal.

  • I haven't seen a human subject review or impact statement mentioned in any of these /. articles. Did Facebook even do one before proceeding with this research? If so was it reviewed by an ethics panel before they proceeded with the experiment? If not, then they should definitely be held responsible for any negative outcomes.

  • Facebook has no compact with its users to offer fair and balanced news (if you'll forgive the expression). They are not obligated to feature any particular array of stories to anybody; in fact, we've heard over and over again how the relevance of items that appear in the news feed is skewed and unpredictable. Nobody should be relying on them for news and I don't think we should expect any more journalistic integrity from them than Buzzfeed.

    I don't usually take this angle when it comes to corporate responsibility...
    • by ahaweb (762825)
      Apparently there is nobody on slashdot with any experience with the ethical requirements of doing psychological studies. Psychology is just not STEM-y enough, maybe.
  • by Junta (36770) on Thursday July 03, 2014 @09:02AM (#47375307)

    My question is why is there particular outrage when they do it as part of a science experiment whereas it is widely acceptable to do the exact same thing in mass media to get revenue.

    National and local news programs basically live and breathe this sort of thing constantly. They schedule their reporting and editorialize in ways to boost viewership: stirring up anger, soothing with feelgood stories, teasing with ominous advertisements, all according to presumptions about the right way to maximize viewer attention and dedication. 'What everyday item in your house could be killing you right now, find out at 11'.

    I don't have a Facebook account precisely because I don't like this sort of thing, but I think it's only fair to acknowledge this dubious manipulative behavior is ubiquitous in our media, not just as science experiments in Facebook.

    • by Sockatume (732728)

      Research ethics. We hold scientists to a higher standard than web sites and TV stations.

    • by EmagGeek (574360)

      Because there is a difference between trying to elicit a behavior and trying to change a person's psychological state of mind.

    • Actually you don't have a question but rather an ignorant and inane comment. The objection to what Facebook did has been clearly stated, and you have the ability to do research on the subject to understand what Facebook did wrong (hint: it was about informed consent). You don't care about the issue at hand. Rather, your intention was to make a feeble comparison between Facebook and other media in order to make the "you too" argument - a claim which does not justify Facebook's actions (see the tu quoque logical fallacy).
      • by Junta (36770)

        I fail to see how it's that different from the manipulation that mass media does, who also do not get informed consent. There is the facet of it being more targeted, but the internet is already about targeted material (hopefully done with the best interest of the audience in mind, practically speaking with the best interests of the advertiser). They just stop short of calling it an 'experiment' (in practice, they are continually experimenting on their audience) and somehow by not trying to apply scientific...

  • by BobMcD (601576) on Thursday July 03, 2014 @09:03AM (#47375311)

    Facebook's efforts to manipulate the feed are really disappointing. If they'll do it for jollies, then they'll damn sure do it if someone pays them to or if the government orders them to.

    Imagine an 'American Spring'. Imagine the government not only spying on Facebook users communicating about it, but requiring that Facebook actively suppress any positive comments about it.

    That shit ain't right.

  • How was this paper "largely ignored"? It was published two weeks ago, and the outrage started immediately.

  • by Theovon (109752) on Thursday July 03, 2014 @09:27AM (#47375463)

    As has been pointed out many times, Facebook was doing their usual sort of product testing. They actively optimize the user experience to keep people using their product (and, more importantly, clicking ads). The only difference between this time and all the other times was that they published their results. This was a good thing, because it introduced new and interesting scientific knowledge.

    Because of this debacle, Facebook (and just about every other company) will never again make the mistake of sharing new knowledge with the scientific community. This is truly a dark day for science.

    Ferengi Rule of Acquisition #285: No good deed ever goes unpunished.

    • As has been pointed out several times, this was not product testing. This was a psychological test for which Facebook failed to get informed consent.

      Science is in no way hurt by this, but the fact that you think it is shows how truly ignorant you are.
      • by Theovon (109752)

        The requirement for informed consent was ambiguous in this case. If I had been in their position, I would have erred on the side of caution, and the research faculty who consulted on this project should have been more resolute about it. If anything, it is those people who should have done the paperwork. I think their failure to get informed consent was a mistake, but I don't believe it was any kind of major ethical violation. It does no harm to get informed consent, even if you don't legally need to.

      • by Tharkkun (2605613)

        As has been pointed out several times, this was not product testing. This was a psychological test which Facebook failed to get informed consent. Science is in no way hurt by this but that you think it is shows how truly ignorant you are.

        It's ok. You've had 2 years to get over it.

  • A Non Apology (Score:4, Insightful)

    by sjbe (173966) on Thursday July 03, 2014 @09:31AM (#47375513)

    "This was part of ongoing research companies do to test different products, and that was what it was," Sandberg said. "It was poorly communicated. And for that communication we apologize. We never meant to upset you."

    This is identical to saying "I don't know what we did that upset you, but whatever it was, I apologize." They don't get it. It basically means that they are going to continue treating their users as insects to be experimented upon, and that they lack the moral compass to understand why what they did was wrong. The fact that they ran an experiment is fine in principle, but HOW they did it matters. We insist that academic researchers run their psychology experiments by a review board and, when necessary, get informed consent. It's not a hard thing to do, and we do it for very good reasons. Facebook has not presented any plausible reason we should hold them to a different standard.

    I'm very glad I do not have a Facebook account, and at this point I doubt I ever will. This is simply not a company I care to be involved with any closer than I have to be.

  • ... then don't keep changing the news feed to "Top Stories" which nobody gives a shit about.
  • to say Facebook's slight alteration of news feeds caused people to suffer depression seems to be unsupported by any kind of data or logic

    That's exactly what the tobacco industry said about health damage due to cigarette smoking, when they knew damned well that it was supported by both data and logic.

  • When has the Facebook newsfeed ever NOT been manipulated and been merely a list of posts in chronological order from people you are friends with and/or follow?

    It strikes me as constantly being manipulated in multiple ways and in a manner noticeable to many people. Most obvious was the "top stories" filter which purported to filter the newsfeed in some manner designed to suppress some comments and promote others.

    But we don't know about the criteria for this or the motivation of other, less obvious manipulation...

  • by rebelwarlock (1319465) on Thursday July 03, 2014 @10:21AM (#47375997)
    Joke's on you, guys: the "leak" was fictional. The real experiment is the public's reaction to this.
  • I mean, yeah, "Facebook is the enemy," sure. But honestly? Where's the personal responsibility? You can show me whatever you want; *I* control my emotions and my responses.

    This whole thing has seemed like a tempest in a teacup, but because Facebook is of questionable morals and ethics, it seems everyone is jumping on board with how horrible this was.

  • Anyone who kills themselves over an emoticon is actually on the right track.

  • Facebook's "research" reminds me of the treatment that eventually led the Unabomber to drop out of civilization and seek revenge against the system from his remote cabin in the woods.

    From Wikipedia: While at Harvard, Kaczynski was among the twenty-two Harvard undergraduates used as guinea pigs in ethically questionable experiments conducted by Henry Murray. In the experiment each student received a code name. Kaczynski was given the code name "Lawful". Among other purposes, Murray's experiments were focused...

  • Every single person who feels hurt by what Facebook did should admit (to themselves) that their reason to be upset is that things like these make it obvious that THEY are not in control of their emotions. That THEY are but motes of dust taken for a ride by the world around them.
    I don't feel abused or betrayed or manipulated by Facebook. Not that they could. My emotions are mine, and if Facebook could alter them, I would just have to admit that I was wrong, and I would learn from it to be a better ME.
    Don'
