Facebook Fallout, Facts and Frenzy 160
redletterdave (2493036) writes Facebook chief operating officer Sheryl Sandberg said the company's experiment designed to purposefully manipulate the emotions of its users was communicated "poorly." Sandberg's public comments, the first from any Facebook executive following the discovery of the one-week psychological study, were made while attending a meeting with small businesses in India that advertise on Facebook. "This was part of ongoing research companies do to test different products, and that was what it was," Sandberg said. "It was poorly communicated. And for that communication we apologize. We never meant to upset you." anavictoriasaavedra points out this article, which questions how much of the outrage over an old press release is justified and what led to the media frenzy. Sometimes editors at media outlets get a little panicked when there's a big story swirling around and they haven't done anything with it. It all started as a largely ignored paper about the number of positive and negative words people use in Facebook posts. Now it's a major scandal. The New York Times connected the Facebook experiment to suicides. The story was headlined "Should Facebook Manipulate Users," and it rests on the questionable assumption that such manipulation has happened. Stories that ran over the weekend raised serious questions about the lack of informed consent in the experiment, which was done by researchers at Cornell and Facebook and published in the Proceedings of the National Academy of Sciences. But to say Facebook's slight alteration of news feeds caused people to suffer depression seems to be unsupported by any kind of data or logic.
Re:Never meant to upset? (Score:5, Informative)
Facebook has done us all a favor by waking up the dumb consumer to the consequences of the idea that information wants to be free, and that it's therefore all right to waive all personal rights on the internet.
Re: (Score:1)
Testify! As was said before on /.: change your information to nonsense and leave. Afterthought: look up the British journalist whose photo was used for a prostitution service. She objected and was told that because the advertisers liked her photo they could use it, and there was nothing she could do about it.
Re:Never meant to upset? (Score:4, Informative)
Testify! As was said before on /.: change your information to nonsense and leave. Afterthought: look up the British journalist whose photo was used for a prostitution service. She objected and was told that because the advertisers liked her photo they could use it, and there was nothing she could do about it.
Facebook's ability to do that is right there in the EULA. Yes, I actually read Facebook's EULA. Anything you post on the site is yours, but they enjoy the right to use it in any way they like while it's there. So this journalist agreed to let them do that when she signed up. She likely didn't know that, because who actually reads EULA's, right? It's one more reason I'm not on Facebook.
Re: (Score:2)
while it's there
So, uh, delete it from Facebook...?
Re: (Score:2)
Testify! As was said before on /.: change your information to nonsense and leave. Afterthought: look up the British journalist whose photo was used for a prostitution service. She objected and was told that because the advertisers liked her photo they could use it, and there was nothing she could do about it.
I Googled "British Journalist Facebook Photo" and got zero valid results relating to what you describe. Do you have a more direct link, or a name?
Re: (Score:2)
Some people just don't care about this kind of stuff, even fairly intelligent people can be indifferent to privacy just because they're not terribly concerned about the worst that can happen.
I was watching a friend use their Facebook the other
Re:Never meant to upset? (Score:5, Informative)
Worse: Study has military sponsorship (Score:5, Interesting)
Except that the purpose of this experiment was to play with emotions of their users. And upset was one of the expected results.
Worse: The study has military sponsorship [scgnews.com], part of ongoing experiments on how to manipulate/prevent/encourage the spread of ideas (like voting for an unapproved political party, or muting general discontent):
"research was connected to a Department of Defense project called the Minerva Initiative, which funds universities to model the dynamics, risks and tipping points for large-scale civil unrest across the world."
The end game is explained in this very long but very insightful analysis: America’s Real Foreign Policy – A Corporate Protection Racket [tomdispatch.com].
Re: (Score:2)
Re: (Score:3)
The study did not have military sponsorship. As the headline states, one of the authors received funding from the military for another project studying social contagion. Regardless, we should be unnerved by the idea that there are leading experts on social manipulation out there getting their ideas from studying Facebook and then taking their expertise to the military.
Re: (Score:2)
This does nothing to evaluate the actual mood of the person - they might be posting positive things only to compete, or to give the appearance of being positive.
Re: (Score:1)
Except that the purpose of this experiment was to play with emotions of their users. And upset was one of the expected results.
The problem is this was done in 2012. Almost 2 years ago. Anyone who was even remotely upset would have moved on by now.
Re: (Score:2)
No, in this case it really *did* try changes that they thought might upset customers. Show more negative stuff and see if the users post more negative stuff. That was the whole freaking goal of the experiment.
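As described in press coverage of the paper, the experiment amounted to probabilistically omitting emotionally loaded posts from some users' feeds and then observing the users' own posts. A toy sketch of that mechanism (a hypothetical illustration only; the word list, omission probability, and function names here are made up, not Facebook's actual code):

```python
import random

NEGATIVE_WORDS = {"sad", "angry", "terrible", "hate"}  # toy lexicon, not the study's

def filter_feed(posts, omit_prob=0.3, rng=random.Random(42)):
    """Probabilistically drop posts containing negative words,
    roughly as one experimental condition was described."""
    kept = []
    for post in posts:
        words = set(post.lower().split())
        if words & NEGATIVE_WORDS and rng.random() < omit_prob:
            continue  # silently omitted from this user's feed
        kept.append(post)
    return kept

feed = ["I hate Mondays", "great weather today", "terrible news", "lunch was fine"]
print(filter_feed(feed))
```

The point being: the manipulation was which posts you saw, not what was posted, which is why users had no way to notice it was happening.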
We're Sorry (Score:5, Funny)
We're sorry....
.
.
.
...that we got caught.
Re: (Score:1)
It all started as a largely ignored paper
Can you really qualify that as being caught though?
Re: (Score:2)
Re: (Score:2)
Linked to suicides.
Cue the lawsuits.
Re:We're Sorry (Score:5, Funny)
Facebook has released several different responses to this issue and is closely monitoring how people in each of the different experimental groups respond to these releases.
Passive aggressive much? (Score:4, Funny)
"Dear customers. We are really sorry that you're so upset at our great study. We're super glad that we did the study but so very very sorry that you guys were upset by it. When we do it again, let's work together to find a way that you could just not be so upset about it."
Re: (Score:2)
Why doesn't FB just man up and tell the truth, e.g., "Dear people who have FaceBook accounts: Until you start paying for this service, you have no say in how we operate. Our real customers, i.e., advertisers, were quite pleased with the results of our little test. They now have more insight into you and can better target their advertising to you. This, in turn, makes us more valuable to them and we make more money by growing our customer base and/or raising our advertising rates. Now go about your busin
Re: (Score:1)
This reminds me of this one time, as a kid, I threw a rock at someone really far away. I didn't actually want to hit them, and never thought I would. The rock nailed them square in the back... It was a really weird apology. "Um... yes, I was aiming for you, but I never thought I'd hit you! Sorry!"
Re: (Score:2)
I think this kind of thing happens more often than we realize. With all the TV shows where people have pranks pulled on them, I'd love to know how often they go wrong. I'm surprised that more of the pranks don't end up with the person who is pulling them getting their ass kicked, or arrested.
Re: (Score:2)
That was you?
I never did figure out who hit me on the back with a rock. I *suspected* it was you, JStyle, but then I was like, nah. He's too far away.
WSJ: Users seen as a willing experimental test bed (Score:5, Informative)
Facebook Experiments Had Few Limits [wsj.com]: "Thousands of Facebook Inc. users received an unsettling message two years ago: They were being locked out of the social network because Facebook believed they were robots or using fake names. To get back in, the users had to prove they were real. In fact, Facebook knew most of the users were legitimate. The message was a test designed to help improve Facebook's antifraud measures...'There's no review process, per se,' said Andrew Ledvina, a Facebook data scientist from February 2012 to July 2013. 'Anyone on that team could run a test,' Mr. Ledvina said. 'They're always trying to alter people's behavior.'...The recent ruckus is 'a glimpse into a wide-ranging practice,' said Kate Crawford, a visiting professor at the Massachusetts Institute of Technology's Center for Civic Media and a principal researcher at Microsoft Research. Companies 'really do see users as a willing experimental test bed' to be used at the companies' discretion."
Re: (Score:3)
Fact telephone (Score:2)
Oh those poor media outlet editors, panicking about missing the next big story. Surely their fragile egos should not be sacrificed to such banalities as truth and common sense?
Instead, we should allow them to play games of telephone with facts, because that way no one's feelings (advertising revenue) get hurt.
Re: (Score:2)
Disclaimer: The proceeding post is part of a research project to study the emotional reactions of media outlet editors.
How in the hell did this pass IRB? (Score:3)
This should never have made it through the ethics board.
Re:How in the hell did this pass IRB? (Score:4, Interesting)
Ah, but Facebook isn't a university ... they don't have one of those.
So, either they went to the scientists and said "hey, we want to find something out", or the scientists went to Facebook and said "hey, we could do an awesome experiment on your users".
Either way, Sandberg sounds like an unapologetic smug idiot who more or less said "they're our users, we do this shit all the time".
The people who run Facebook are assholes, and don't give a crap about anything more than how they can maximize ad revenue. And Zuckerfuck is a complete hypocrite about privacy -- his users get none, and he jealously guards his own.
Re: (Score:2)
Ah, but Facebook isn't a university ... they don't have one of those.
Which one? A board for these issues, or ethics?
How in the hell did this pass IRB? (Score:5, Informative)
Re: (Score:2)
I don't understand the "-1, Troll" moderation.
Quirkology [amazon.co.uk] by Richard Wiseman, an interesting read, is a compilation of rigorous experiments in social psychology, many of which were conducted, *gasp*, without the subjects' consent.
Retail stores do research on consumers' behaviour in order to try to sell them more sugary, salty and fatty snacks. Addiction to those nasty foods is a very real issue with health consequences. Where's the outrage?
Re: (Score:2)
Why not? Considering the ensuing emotional shitstorm that comes from simply changing the homepage layout, this experiment is actually quite tame in comparison.
Facebook is dumb. (Score:1)
Get rid of your account. Be free.
Re: (Score:2)
Re: (Score:3)
I made friends with over thirty people on Facebook and had a vibrant social life for over a year until I found out they were all zombie accounts run by the same person. I was still OK with that until I realized that I had spent over $2,000.00 on wedding gifts and birthday presents for all my "friends". But then again, you can't put a price on friendship, right?
Re: (Score:2)
I didn't have enough Facebook friends who wanted to get really, really good at MobWars, so in my initial excitement I forgot to read Facebook's TOS & accidentally found myself with a dozen Facebook accounts.
Business was kind of slow that summer so I ended up writing a set of Perl modules that took care of the tedious portion of logging in to Facebook, bypassing the HTTPS redirect, logging in to MobWars, logging each character's stats, checking 'stamina' to see if the character had enough points to compl
Not important (Score:1)
This was part of ongoing research companies do (Score:2)
. . . yes, sometimes companies do you, their customer . . . or in the case of Facebook, their product.
Facebook doesn't think it's "questionable" (Score:4, Insightful)
"the questionable assumption that such manipulation has happened"
They literally wrote a peer-reviewed scientific paper demonstrating that they manipulated people's moods to a statistically significant degree; I don't think there's much you can call questionable about it from Facebook's perspective.
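For what "statistically significant" means here, the shape of the analysis is roughly a comparison of mean word rates between the altered-feed and control groups. Here is a toy permutation test on fabricated numbers (not the paper's actual data or method, which used far larger samples):

```python
import random

# Fraction of positive words per user's posts (fabricated numbers for illustration)
control   = [0.12, 0.15, 0.11, 0.14, 0.13, 0.16, 0.12, 0.15]
treatment = [0.10, 0.11, 0.09, 0.12, 0.10, 0.11, 0.13, 0.10]

def perm_test(a, b, n_iter=10_000, rng=random.Random(0)):
    """One-sided permutation test: how often does a random relabelling
    of the pooled data produce a mean difference at least as large as
    the observed one? A small result means the groups likely differ."""
    observed = sum(a) / len(a) - sum(b) / len(b)
    pooled = a + b
    count = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        diff = sum(pooled[:len(a)]) / len(a) - sum(pooled[len(a):]) / len(b)
        if diff >= observed:
            count += 1
    return count / n_iter

p = perm_test(control, treatment)
print(f"p = {p:.4f}")
```

With hundreds of thousands of users, even a tiny shift in word rates clears a threshold like this, which is exactly why "statistically significant" need not mean "large effect."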
Re: (Score:1)
"the questionable assumption that such manipulation has happened"
They literally wrote a peer-reviewed scientific paper demonstrating that they manipulated people's moods to a statistically significant degree; I don't think there's much you can call questionable about it from Facebook's perspective.
And what do you call advertising, commercials and the nightly news? The same damn thing.
Wafer thin almost apology. (Score:5, Interesting)
Now everyone will have noticed that they are only apologising for the mis-communication, not the act of psychological experimentation (as if we would be OK with it if they had told us). But it goes deeper...
Notice that they put the action and the apology in two different sentences, followed quickly by a "We never meant to upset you." Putting the emotional blame back on us. As if we were just accidentally bumped bystanders, not the actual targets of the actions.
And they never once use the word "sorry". Only the big weasel phrase "we apologise". This apology goes right along with the classic phoney apologies...
I'm sorry that you got upset.
I'm sorry that you feel that way.
I'm sorry that you made me do that.
Ethics (Score:2, Insightful)
Human experimentation without review board approval and informed consent violates a number of national and international laws. It doesn't matter whether anyone gets hurt.
Re:Ethics (Score:5, Informative)
- Everything you need to know about Facebook's manipulative experiment [wired.com]
Re: (Score:2)
Given that subjects were not geographically constrained (they were randomly selected by user ID), the US isn't the only nation whose laws apply to this research.
Re: (Score:2)
Re: (Score:2)
Human subjects research is subject to mandatory informed consent - specific to the study being performed, you can't just have a boilerplate like the Facebook ToS - in almost all jurisdictions. For example, this is the US law Facebook undoubtedly broke:
http://www.hhs.gov/ohrp/humans... [hhs.gov]
Re: (Score:3)
Looks like this doesn't apply. Federal funding requirement.
Re: (Score:1)
Looks like this doesn't apply. Federal funding requirement.
The obvious joke response is that Facebook probably gets all kinds of funding from NSA.
A more serious response is that it's not quite clear. UCSF and Cornell both participated in this project to some degree, and they're both subject to HHS regulations since they do get federal funding. Whether or not the whole project was then required to follow the rules depends on what exactly the university participants did.
Re: (Score:2)
That doesn't discuss informed consent, which under Federal law requires that study participants be given specific information about the purpose, risks, procedures, duration, etc. etc. of the research.
http://www.hhs.gov/ohrp/humans... [hhs.gov]
Re: (Score:2)
Actually this doesn't apply. Federal funding requirement.
Re: (Score:2)
If nothing else it violates PNAS' own policies, because it's in clear breach of both the Declaration of Helsinki and ICMJE requirements on informed consent.
http://www.pnas.org/site/autho... [pnas.org]
Re: (Score:2)
- idem (original all caps emphasis removed to avoid /. lameness filter)
Re: (Score:2)
...and? I don't see a part of PNAS' policies where it says "it's okay to breach these rules, so long as the people who made the breach and the people who performed the data analysis aren't the same".
Re: (Score:2)
Adam I Kramer, the Facebook analyst responsible for the part of the research considered ethically dubious, is the first and corresponding author on the paper.
Re: (Score:2)
"while a Facebook employee was the lead researcher, there were co-authors affiliated with institutions of higher education -- University of California, San Francisco and Cornell University -- that most certainly adhere to the requirement."
http://www.hawaiiweblog.com/20... [hawaiiweblog.com]
Meanwhile..
"PNAS (the journal) has a policy requiring IRB ethics review for all published studies that experiment on humans, regardless of whether academic or corporate[1]. A Cornell press release[2] says this work was also funded
Re: (Score:1)
This surely isn't a one-time experiment. They likely have piles of data about tests they have been doing in secret.
And this only measured posts, not the feelings behind the posts. They don't actually know whether what they saw affected people's day in a real way.
Who is watching these companies?
They know so much about us. We're all little playthings to them.
Re: (Score:1)
I'm in trouble then. In the last couple of weeks, I've performed a number of human experiments on the website I manage, including:
* Do they push green buttons more than red buttons?
* Do they fill in forms more reliably if it's one big form, or split across multiple pages?
* Do people finish reading a page more often if the text is in a large font rather than a small one?
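For whatever it's worth, tests like those are garden-variety A/B tests. A minimal sketch of deterministic variant assignment (the scheme and names here are my own invention, not any particular site's implementation):

```python
import hashlib

def ab_bucket(user_id: str, experiment: str, variants=("A", "B")) -> str:
    """Deterministically assign a user to a variant by hashing
    (experiment, user_id) -- the same user always sees the same version,
    and different experiments bucket independently."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).digest()
    return variants[digest[0] % len(variants)]

# The same user lands in the same bucket on every visit:
print(ab_bucket("user42", "button-color"))
```

The ethical line people are drawing isn't about the mechanics, which are identical, but about whether the goal is improving an interface or deliberately shifting users' emotional states.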
Re: (Score:2)
Surely it can't be all human experimentation that's illegal, or else ad agencies couldn't attempt to measure the effectiveness of their ads. Parents couldn't raise their children (e.g. "Let's try withholding cookies and see if that works!").
There must be specific parameters under which human experimentation is illegal.
Human Subject Review (Score:2)
I haven't seen a human subject review or impact statement mentioned in any of these /. articles. Did Facebook even do one before proceeding with this research? If so was it reviewed by an ethics panel before they proceeded with the experiment? If not, then they should definitely be held responsible for any negative outcomes.
Never meant to get caught (Score:1)
Who cares? (Score:1)
I don't usually take this angle when it comes to corporate resp
Re: (Score:1)
Not happy about the concept, however... (Score:3)
My question is why there is particular outrage when they do it as part of a science experiment, whereas it is widely accepted to do the exact same thing in mass media for revenue.
National and local news programs basically live and breathe this sort of thing constantly. They schedule their reporting and editorialize in ways to boost viewership: stirring up anger, soothing with feelgood stories, teasing with ominous advertisements, all according to presumptions about the right way to maximize viewer attention and dedication. 'What everyday item in your house could be killing you right now, find out at 11'.
I don't have a Facebook account precisely because I don't like this sort of thing, but I think it's only fair to acknowledge this dubious manipulative behavior is ubiquitous in our media, not just as science experiments in Facebook.
Re: (Score:3)
Research ethics. We hold scientists to a higher standard than web sites and TV stations.
The question is why.. (Score:2)
Why not hold people not claiming to be scientists to a higher standard? It's not like their science-but-don't-call-it-science experiments are any less potentially damaging than the same behavior done by a 'true scientist'.
Re: (Score:2)
I agree.
Re: (Score:1)
Because there is a difference between trying to elicit a behavior and trying to change a person's psychological state of mind.
Re: (Score:2)
Re: (Score:3)
I fail to see how it's that different from the manipulation that mass media does, which also does not get informed consent. There is the facet of it being more targeted, but the internet is already about targeted material (hopefully done with the best interest of the audience in mind; practically speaking, with the best interests of the advertiser). They just stop short of calling it an 'experiment' (in practice, they are continually experimenting on their audience) and somehow by not trying to apply scientifi
Most disappointing for me is manipulating the feed (Score:3)
Facebook's efforts to manipulate the feed are really disappointing. If they'll do it for jollies, then they'll damn sure do it if someone pays them to or if the government orders them to.
Imagine an 'American Spring'. Imagine the government not only spying on Facebook users communicating about it, but requiring that Facebook actively suppress any positive comments about it.
That shit ain't right.
Re: (Score:3)
Well, three problems:
1) Their users provide the feed. Facebook just displays it.
2) It isn't 'for free' as Facebook uses the data to advertise to you, and thus earns money on the content you generate.
3) The example I gave was explicitly not bullshit - if it were, why would anyone interfere with it?
Re: (Score:2)
Because the opinions of AC's matter...
"Largely ignored"? (Score:2)
How was this paper "largely ignored"? It was published two weeks ago, and the outrage started immediately.
Re: (Score:2)
That's why this isn't all over the mainstream media, of course.
The good Samaritan always gets his ass kicked (Score:5, Insightful)
As has been pointed out many times, Facebook was doing their usual sort of product testing. They actively optimize the user experience to keep people using their product (and, more importantly, clicking ads). The only difference between this time and all the other times was that they published their results. This was a good thing, because it introduced new and interesting scientific knowledge.
Because of this debacle, Facebook (and just about every other company) will never again make the mistake of sharing new knowledge with the scientific community. This is truly a dark day for science.
Ferengi rule of acquisition #285: No good deed ever goes unpunished.
Re: (Score:3)
Science is in no way hurt by this but that you think it is shows how truly ignorant you are.
Re: (Score:2)
The requirement for informed consent was ambiguous in this case. If I had been in their position, I would have erred on the side of caution, and the research faculty who consulted on this project should have been more resolute about it. If anything, it is those people who should have done the paperwork. I think their failure to get informed consent was a mistake, but I don’t believe it was any kind of major ethical violation. It does no harm to get informed consent, even if you don’t legally
Re: (Score:1)
As has been pointed out several times, this was not product testing. This was a psychological test for which Facebook failed to get informed consent. Science is in no way hurt by this, but that you think it is shows how truly ignorant you are.
It's ok. You've had 2 years to get over it.
Re: (Score:2)
Because Facebook is not a business or a scientific endeavor. Ever notice that abbreviated, Facebook Inc. is "FBI"? More intel is gathered through Facebook in one month than all of the NSA's illegal wire tapping program.
Re: (Score:2)
OK, I pulled that out of my arse, but FB is still a creepy corporation.
Re: (Score:2)
Yes and no. They were testing something that they HYPOTHESIZED could reduce the quality of the user experience. And IIRC, that hypothesis turned out to be wrong (to the extent that one can get that from the statistics).
If all user interface modifications that lead to an improved user experience were intuitive, then Facebook would have implemented them already. They are now at a point where they have to consider things that are NOT intuitive. The idea that filtering other people’s posts in a way th
A Non Apology (Score:4, Insightful)
"This was part of ongoing research companies do to test different products, and that was what it was," Sandberg said. "It was poorly communicated. And for that communication we apologize. We never meant to upset you."
This is identical to saying "I don't know what we did that upset you but whatever it was I apologize". They don't get it. It basically means that they are going to continue treating their users as insects to be experimented upon and lack the moral compass to understand why what they did was wrong. The fact that they ran an experiment is fine in principle but HOW you do it matters. We insist that academic researchers run their psychology experiments by a review board and when necessary get informed consent. It's not a hard thing to do and we do it for very good reasons. Facebook has not presented any plausible reason we should hold them to a different standard.
I'm very glad I do not have a facebook account and at this point I doubt I ever will. This is simply not a company I care to be involved with any closer than I have to be.
If you don't mean to upset us... (Score:2)
hahaha (Score:2)
to say Facebook's slight alteration of news feeds caused people to suffer depression seems to be unsupported by any kind of data or logic
That's exactly what the tobacco industry said about health damage due to cigarette smoking, when they knew damned well that it was supported by both data and logic.
When has FB newsfeed ever been not manipulated? (Score:2)
When has the Facebook newsfeed ever NOT been manipulated and been merely a list of posts in chronological order from people you are friends with and/or follow?
It strikes me as constantly being manipulated in multiple ways and in a manner noticeable to many people. Most obvious was the "top stories" filter which purported to filter the newsfeed in some manner designed to suppress some comments and promote others.
But we don't know about the criteria for this or the motivation of other, less obvious manipulat
Time to get meta (Score:4, Funny)
Am I alone here? (Score:2)
I mean, ya; "facebook is the enemy", sure. But honestly? Where's the personal responsibility? You can show me whatever you want, *I* control my emotions and my responses.
This whole thing has seemed a tempest in a tea cup, but because facebook is of questionable morals and ethics, it seems everyone is jumping on board how horrible this was.
People are stupid (Score:1)
Anyone who kills themselves over an emoticon is actually on the right track.
Comes back to bite you in the end (Score:2)
Facebook's "research" reminds me of the treatment that eventually led the Unabomber to drop out of civilization and seek revenge against the system from his remote cabin in the woods.
From Wikipedia: While at Harvard, Kaczynski was among the twenty-two Harvard undergraduates used as guinea pigs in ethically questionable experiments conducted by Henry Murray. In the experiment each student received a code name. Kaczynski was given the code name "Lawful". Among other purposes, Murray's experiments were focused
puppets (Score:2)
Every single person who feels hurt by what Facebook did should admit (to themselves) that their reason to be upset is that things like these make it obvious that THEY are not in control of their emotions. That THEY are but motes of dust taken for a ride by the world around them.
I don't feel abused or betrayed or manipulated by Facebook. Not that they could. My emotions are mine, and if Facebook could alter them, I would just have to admit that I was wrong, and I would learn from it to be a better ME.
Don'
Re: (Score:1)
Seriously? Please keep your racist comments to yourself asshole.
Now if you want to bitch about what they did (as I've already done), that's perfectly fine, but this has zip to do with Zionism.
Re: (Score:2)
how about "Babylon The Great, the mother of harlots and abominations of the Earth"?
Re: (Score:2)
You make the assumption he was -1 at the time I made my comment, and he wasn't. Interesting to note that I've been modded down for it too, maybe because someone thinks I'm just complaining because I'm Jewish, when in fact I'm not.
Re: (Score:1)
Re: (Score:1)
People are getting their noses bent out of shape because Facebook talked about this as a psychological experiment rather than testing a system change; what they did was no worse th
Re:what's worse is.. (Score:4, Insightful)
People are controlling your mind all the time. Every time you see an ad, someone is trying to control your mind to convince you to buy something. Every time you read an article in a paper, someone is controlling your mind to get their point across. Every time you argue with someone, she is trying to control your mind by getting her point across. Etc.
Get off your high horse, use your brain.
Re: (Score:3)
When I see an ad or get into an argument with someone they usually don't have a billion-dollar program of research tracking my own and my friends' behaviour and then covertly adjusting what I see and hear to get their way.
Unless it's election season.
Re: (Score:2)
I find it interesting that people think EULAs are blanket excuses to do anything. In fact, they are not. Court decisions have held that EULAs are limited in scope. Facebook's EULA is too broad and therefore has no legal weight.