Facebook Fallout, Facts and Frenzy
redletterdave (2493036) writes Facebook chief operating officer Sheryl Sandberg said the company's experiment designed to purposefully manipulate the emotions of its users was communicated "poorly." Sandberg's public comments, the first from any Facebook executive since the one-week psychological study came to light, were made while attending a meeting with small businesses in India that advertise on Facebook. "This was part of ongoing research companies do to test different products, and that was what it was," Sandberg said. "It was poorly communicated. And for that communication we apologize. We never meant to upset you."

anavictoriasaavedra points out this article questioning how much of the outrage over an old press release is justified, and what has led to the media frenzy. Sometimes editors at media outlets get a little panicked when there's a big story swirling around and they haven't done anything with it. It all started as a largely ignored paper about the number of positive and negative words people use in Facebook posts. Now it's a major scandal. The New York Times connected the Facebook experiment to suicides. The story was headlined "Should Facebook Manipulate Users?" and it rests on the questionable assumption that such manipulation has happened. Stories that ran over the weekend raised serious questions about the lack of informed consent in the experiment, which was done by researchers at Cornell and Facebook and published in the Proceedings of the National Academy of Sciences. But the claim that Facebook's slight alteration of news feeds caused people to suffer depression seems to be unsupported by any kind of data or logic.
Re:Never meant to upset? (Score:5, Informative)
Facebook has done us all a favor by waking up the dumb consumer to the consequences of the idea that information wants to be free — sorry, that it's all right to waive all personal rights on the internet.
WSJ: Users seen as a willing experimental test bed (Score:5, Informative)
Facebook Experiments Had Few Limits [wsj.com]: "Thousands of Facebook Inc. users received an unsettling message two years ago: They were being locked out of the social network because Facebook believed they were robots or using fake names. To get back in, the users had to prove they were real. In fact, Facebook knew most of the users were legitimate. The message was a test designed to help improve Facebook's antifraud measures... 'There's no review process, per se,' said Andrew Ledvina, a Facebook data scientist from February 2012 to July 2013. 'Anyone on that team could run a test,' Mr. Ledvina said. 'They're always trying to alter people's behavior.' ... The recent ruckus is 'a glimpse into a wide-ranging practice,' said Kate Crawford, a visiting professor at the Massachusetts Institute of Technology's Center for Civic Media and a principal researcher at Microsoft Research. Companies 'really do see users as a willing experimental test bed' to be used at the companies' discretion."
Re:Ethics (Score:5, Informative)
- Everything you need to know about Facebook's manipulative experiment [wired.com]
How in the hell did this pass IRB? (Score:5, Informative)
Re:Never meant to upset? (Score:4, Informative)
Testify! As was said before on /.: change your information to nonsense and leave.

Afterthought: Look up the British journalist whose photo was used for a prostitution service. She objected and was told that because the advertisers liked her photo, they could use it and there was nothing she could do about it.
Facebook's ability to do that is right there in the EULA. Yes, I actually read Facebook's EULA. Anything you post on the site is yours, but they enjoy the right to use it in any way they like while it's there. So this journalist agreed to let them do that when she signed up. She likely didn't know that, because who actually reads EULAs, right? It's one more reason I'm not on Facebook.