
Facebook Fallout, Facts and Frenzy

samzenpus posted about 4 months ago | from the to-friend-or-not-to-friend dept.

Facebook | 160 comments

redletterdave (2493036) writes: Facebook chief operating officer Sheryl Sandberg said the company's experiment designed to purposefully manipulate the emotions of its users was communicated "poorly". Sandberg's public comments, the first from any Facebook executive following the discovery of the one-week psychological study, were made while attending a meeting with small businesses in India that advertise on Facebook. "This was part of ongoing research companies do to test different products, and that was what it was," Sandberg said. "It was poorly communicated. And for that communication we apologize. We never meant to upset you."

anavictoriasaavedra points out this article that questions how much of the outrage over an old press release is justified and what's led to the media frenzy. Sometimes editors at media outlets get a little panicked when there's a big story swirling around and they haven't done anything with it. It all started as a largely ignored paper about the number of positive and negative words people use in Facebook posts. Now it's a major scandal. The New York Times connected the Facebook experiment to suicides. The story was headlined "Should Facebook Manipulate Users?", and it rests on the questionable assumption that such manipulation has happened. Stories that ran over the weekend raised serious questions about the lack of informed consent in the experiment, which was done by researchers at Cornell and Facebook and published in the Proceedings of the National Academy of Sciences. But to say Facebook's slight alteration of news feeds caused people to suffer depression seems to be unsupported by any kind of data or logic.


Never meant to upset? (0)

Anonymous Coward | about 4 months ago | (#47375027)

How often does any company make a decision whose purpose is to upset customers (or users, in this case)? Really?

Did Microsoft change to the Metro UI to upset people? Probably not, but that was the result.

Re:Never meant to upset? (4, Informative)

coastwalker (307620) | about 4 months ago | (#47375047)

Facebook has done us all a favor by waking the dumb consumer up to the consequences of the idea that information wants to be free - and that it's therefore all right to waive all personal rights on the internet.

Re:Never meant to upset? (1)

Anonymous Coward | about 4 months ago | (#47375165)

Testify! As was said before on /.: change your information to nonsense and leave. Afterthought: look up the British journalist whose photo was used for a prostitution service. She objected and was told that because the advertisers liked her photo they could use it, and there was nothing she could do about it.

Re:Never meant to upset? (1)

Kimomaru (2579489) | about 4 months ago | (#47375409)

FB users can't possibly care about this issue, there's nothing for them to "wake up" from. This is a scandal because we're in the middle of summer and there's really no news. The consumer who cares about privacy left FB before the IPO. It's a total non-story.

Some people just don't care about this kind of stuff, even fairly intelligent people can be indifferent to privacy just because they're not terribly concerned about the worst that can happen.

I was watching a friend use their Facebook the other day and I was shocked at how noisy and sticky it is.

Re:Never meant to upset? (0)

Anonymous Coward | about 4 months ago | (#47375551)

I'm concerned about privacy, and I still use Facebook for the chat feature and for organizing groups and events. Otherwise I just post news stories my FB friends might not usually see.

Re:Never meant to upset? (0)

Anonymous Coward | about 4 months ago | (#47375535)

I like your idealism...
Now go back and play in the sandbox with your friends, and let the grown-ups worry about stuff like this.

Re:Never meant to upset? (5, Informative)

wisnoskij (1206448) | about 4 months ago | (#47375099)

Except that the purpose of this experiment was to play with the emotions of their users. And upset was one of the expected results.

Worse: Study has military sponsorship (5, Interesting)

FriendlyLurker (50431) | about 4 months ago | (#47375525)

Except that the purpose of this experiment was to play with the emotions of their users. And upset was one of the expected results.

Worse: The study has military sponsorship [scgnews.com], part of ongoing experiments in how to manipulate/prevent/encourage the spread of ideas (like voting for unapproved political parties, or muting general discontent):

"research was connected to a Department of Defense project called the Minerva Initiative, which funds universities to model the dynamics, risks and tipping points for large-scale civil unrest across the world."

The end game is explained in this very long but very insightful analysis: America's Real Foreign Policy – A Corporate Protection Racket [tomdispatch.com].

Re:Never meant to upset? (1)

Bill, Shooter of Bul (629286) | about 4 months ago | (#47375465)

No, in this case Facebook really *did* try changes that it thought might upset customers. Show more negative stuff and see if the users post more negative stuff. That was the whole freaking goal of the experiment.

Sheryl "Jew Mom" Sandberg should shut the fuck up (-1)

Anonymous Coward | about 4 months ago | (#47375031)

Sandberg and Zuckerberg are practicing Zionist mind-control techniques on the American public and all they can say is "it was poorly communicated"?

Re:Sheryl "Jew Mom" Sandberg should shut the fuck (0)

dcw3 (649211) | about 4 months ago | (#47375071)

Seriously? Please keep your racist comments to yourself asshole.

Now if you want to bitch about what they did (as I've already done), that's perfectly fine, but this has zip to do with Zionism.

Re:Sheryl (-1)

Anonymous Coward | about 4 months ago | (#47375129)

You notice how it's racist to talk about Jews (and blacks) but you can talk all sorts of shit about muslims/arabs (and whites) and no one bats an eye. That's how you know the Jews control the media (and the straight white male is the most discriminated against person in America)

Re:Sheryl (0)

amalcolm (1838434) | about 4 months ago | (#47375255)

Cry me a river

Re:Sheryl (0)

Anonymous Coward | about 4 months ago | (#47375485)

No, that's still racist. And people do bat eyes and call people racists for that.

Saying that Jews control the media: also racist. And stupid as hell.

Re: Sheryl (0)

Anonymous Coward | about 4 months ago | (#47375687)

That's a fact though. Look at the list of owners and CEOs from the most powerful media companies and you'll see mostly names ending in Berg or Stein. The Jewish lobby in America is extremely powerful, and that's why Americans never see the other side of what's going on in the middle east (Israel stealing land left and right) It's apartheid out there, and no one is doing anything to stop it (America and its Jewish lobby keep spending more and more money to bring the promised land back to the Jews at the expense of the Palestinians)

Re: Sheryl (0)

Anonymous Coward | about 4 months ago | (#47375811)

Two percent of the US population and about 90 percent of the control.

Re:Sheryl "Jew Mom" Sandberg should shut the fuck (0)

Anonymous Coward | about 4 months ago | (#47375155)

What's to say they won't alter everyone's facebook feed to make us all feel good about the next war for our "number one ally"?

Re:Sheryl "Jew Mom" Sandberg should shut the fuck (0)

Anonymous Coward | about 4 months ago | (#47375187)

How (why?) is anti-Zionism racism?

Re:Sheryl "Jew Mom" Sandberg should shut the fuck (0)

amalcolm (1838434) | about 4 months ago | (#47375259)

Maybe it's referring to the title of the post?

Re:Sheryl "Jew Mom" Sandberg should shut the fuck (0)

Anonymous Coward | about 4 months ago | (#47375357)

Sandberg called herself a "jewish mother figure for young silicon valley CEOs" or some such shit in her book. Is it racist to call a jewish mother a jewish mother? What should we call them instead?

Re:Sheryl "Jew Mom" Sandberg should shut the fuck (0)

Anonymous Coward | about 4 months ago | (#47375427)

"Jewish Mom" and "Jew Mom" are two totally different classes of speech, and you know it. Both legal, but one is merely descriptive while the other is offensive. Try "Japanese Mom" and "Jap Mom" on for size if you don't understand it yet.

Re:Sheryl "Jew Mom" Sandberg should shut the fuck (0)

Anonymous Coward | about 4 months ago | (#47375547)

Nah.... Jews are just better than most at technology....

From the Jew Wiki

Technology
Beny Alagem, Israeli-American founder of Packard Bell[48]
Steve Ballmer, former CEO of Microsoft[49]
Sergey Brin, co-founder of Google, Inc.[50]
Michael Dell, Founder, Chairman and CEO of Dell[51]
Lawrence Ellison, Founder of Oracle Corporation[52]
Larry Page, CEO and co-founder of Google Inc[53][54]
Philippe Kahn, creator of the Camera Phone, Founder of Fullpower, Borland[55]
Benjamin M. Rosen, founding investor and former Chairman and CEO of Compaq[56]
Sheryl Sandberg, COO of Facebook[57]
Mark Zuckerberg, co-founder and CEO of Facebook[58]

Re:Sheryl "Jew Mom" Sandberg should shut the fuck (0)

Anonymous Coward | about 4 months ago | (#47375837)

Let moderation work. Now someone has to waste mod points modding you, and all of the other repliers (including me, hence my AC post), down.

The troll's been modded down to -1. If you hadn't replied most people would never have seen it.

what they do, and what they can do.. (0)

Anonymous Coward | about 4 months ago | (#47375039)

I guess my reaction to this was: if they publish this kind of research, what kind of capabilities do they actually have? Maybe they COULD push someone to suicide by adjusting their feeds.

We're Sorry (5, Funny)

dcw3 (649211) | about 4 months ago | (#47375059)

We're sorry....
.
.
. ...that we got caught.

Re:We're Sorry (1)

GoCrazy (1608235) | about 4 months ago | (#47375081)

It all started as a largely ignored paper

Can you really qualify that as being caught though?

Re:We're Sorry (1)

wisnoskij (1206448) | about 4 months ago | (#47375107)

"caught" so this was leaked?

Re:We're Sorry (1)

rmdingler (1955220) | about 4 months ago | (#47375257)

Caught, leaked, fruit of the poisonous tree... the method of delivery becomes a moot point once it's the topic to rage about with the short-attention-span tribe.

Linked to suicides.

Cue the lawsuits.

Re:We're Sorry (5, Funny)

schlachter (862210) | about 4 months ago | (#47375159)

Facebook has released several different responses to this issue and is closely monitoring how people in each of the different experimental groups respond to these releases.

Passive aggressive much? (3, Funny)

Sockatume (732728) | about 4 months ago | (#47375453)

"Dear customers. We are really sorry that you're so upset at our great study. We're super glad that we did the study but so very very sorry that you guys were upset by it. When we do it again, let's work together to find a way that you could just not be so upset about it."

WSJ: Users seen as a willing experimental test bed (4, Informative)

theodp (442580) | about 4 months ago | (#47375065)

Facebook Experiments Had Few Limits [wsj.com]: "Thousands of Facebook Inc. users received an unsettling message two years ago: They were being locked out of the social network because Facebook believed they were robots or using fake names. To get back in, the users had to prove they were real. In fact, Facebook knew most of the users were legitimate. The message was a test designed to help improve Facebook's antifraud measures... 'There's no review process, per se,' said Andrew Ledvina, a Facebook data scientist from February 2012 to July 2013. 'Anyone on that team could run a test. They're always trying to alter people's behavior.' ... The recent ruckus is 'a glimpse into a wide-ranging practice,' said Kate Crawford, a visiting professor at the Massachusetts Institute of Technology's Center for Civic Media and a principal researcher at Microsoft Research. Companies 'really do see users as a willing experimental test bed' to be used at the companies' discretion."

No effort to avoid targeting suicidal individuals (0)

Anonymous Coward | about 4 months ago | (#47375067)

This is the big issue in my mind. If they just randomly selected 40,000 people, then they may have inadvertently caused someone already at risk for suicide to be exposed to additional negative material.

Re:No effort to avoid targeting suicidal individu (0)

Anonymous Coward | about 4 months ago | (#47375095)

a positive note dropped here, a plea for help ignored there, and suddenly you don't have an overpopulation problem anymore..

Re:No effort to avoid targeting suicidal individu (1)

GoCrazy (1608235) | about 4 months ago | (#47375141)

According to the study, two parallel experiments were conducted: one where positive comments were reduced and one where negative comments were reduced. They weren't exposing anyone to any additional "negative material".

Re:No effort to avoid targeting suicidal individu (0)

Anonymous Coward | about 4 months ago | (#47375145)

I dislike FB as much as any other reasonable person, and I keep an unused account only to block other users with the same name. But if a slight change in a stream of ads has such an enormous impact as to cause somebody to end his/her life, then that person was unstable anyway and, sad as it may seem, would have reacted to other sad or negative stimuli in ways the rest of us may not understand. FB did wrong by not seeking users' consent - that was doable without endangering the results of the study. I don't have much compassion for people committing suicide, or at least not because they have been manipulated; we all are, in some way, if we count this kind of measure as manipulation. Still, FB should apologize and Zuckerberg should be castrated.

Re:No effort to avoid targeting suicidal individu (0)

Anonymous Coward | about 4 months ago | (#47375899)

I had an unused account but deleted it when I got two friend requests (received by e-mail) from very close people. Then I could browse a "you might know these people" list with about a hundred people from my childhood and adolescence. So I'm concerned that Facebook knew that (and my name, and my main e-mail account that I've used for nearly everything!) and that it will probably still have that data 40 years from now, no matter how limited it is.

Re:No effort to avoid targeting suicidal individu (0)

N1AK (864906) | about 4 months ago | (#47375251)

And? If a news show notices that it gets better viewing figures when shows are more negative, and thus changes its shows to be more negative, that could have a worse effect. If Google changes the PageRank algorithm in a way that makes negative sites score more highly (even if it is inadvertent), then it could have a far bigger effect.

People are getting their noses bent out of shape because Facebook talked about this as a psychological experiment rather than as testing a system change; what they did was no worse than what thousands of companies do every day, and considerably better than what thousands of other companies do every day (those who prey on people's insecurity to drive sales).

Re:No effort to avoid targeting suicidal individu (0)

Anonymous Coward | about 4 months ago | (#47375475)

You're right - but what those thousands of other companies have done before are directly responsible for the shit-hole 'murica has become. FB just has a MUCH bigger audience, so I would say the outrage here is justified.

Yeah, that was the problem (0)

Anonymous Coward | about 4 months ago | (#47375085)

The experiment was fine, you just don't get it. We shouldn't have told you idiots about it.

Rodney Dangerfield (-1)

Anonymous Coward | about 4 months ago | (#47375089)

I walked into McDonald's... and they told me I didn't deserve a break!

Fact telephone (1)

kruach aum (1934852) | about 4 months ago | (#47375091)

Oh those poor media outlet editors, panicking about missing the next big story. Surely their fragile egos should not be sacrificed to such banalities as truth and common sense?

Instead, we should allow them to play games of telephone with facts, because that way no one's feelings (advertising revenue) get hurt.

It really looks like... (0)

Anonymous Coward | about 4 months ago | (#47375093)

You get what you pay for.

How in the hell did this pass IRB? (2)

Shadow of Eternity (795165) | about 4 months ago | (#47375103)

This should never have made it through the ethics board.

Re:How in the hell did this pass IRB? (3, Interesting)

gstoddart (321705) | about 4 months ago | (#47375201)

This should never have made it through the ethics board.

Ah, but Facebook isn't a university ... they don't have one of those.

So, either they went to the scientists and said "hey, we want to find something out", or the scientists went to Facebook and said "hey, we could do an awesome experiment on your users".

Either way, Sandberg sounds like an unapologetic, smug idiot who more or less said "they're our users, we do this shit all the time".

The people who run Facebook are assholes, and don't give a crap about anything more than how they can maximize ad revenue. And Zuckerfuck is a complete hypocrite about privacy -- his users get none, and he jealously guards his own.

How in the hell did this pass IRB? (5, Informative)

RobertJ1729 (2640799) | about 4 months ago | (#47375805)

The scientists represented to the IRB that the dataset was preexisting, and so the IRB passed on the review. It's not clear that the dataset was preexisting, though, since the study seems to indicate that the scientists were involved in the design of the experiment from the beginning. What's more, the paper itself claims to have obtained informed consent when it is clear there was none.

Facebook is dumb. (1)

Anonymous Coward | about 4 months ago | (#47375123)

Get rid of your account. Be free.

Re: Facebook is dumb. (0)

Anonymous Coward | about 4 months ago | (#47375237)

Enjoy not having a social life, faggot

Re: Facebook is dumb. (2)

amalcolm (1838434) | about 4 months ago | (#47375267)

If you need Facebook to have a social life, you're the loser.

Re: Facebook is dumb. (0)

Anonymous Coward | about 4 months ago | (#47375319)

You need a website to have friends LOL

Not important (0)

jbmartin6 (1232050) | about 4 months ago | (#47375137)

No one outside of the "twitterati" cares about this. "designed to purposefully manipulate the emotions of its users"? Huh, sounds like advertising as so many others have pointed out.

This was part of ongoing research companies do (1)

PolygamousRanchKid (1290638) | about 4 months ago | (#47375139)

. . . yes, sometimes companies do you, their customer . . . or in the case of Facebook, their product.

Facebook doesn't think it's "questionable" (3, Insightful)

Sockatume (732728) | about 4 months ago | (#47375147)

"the questionable assumption that such manipulation has happened"

They literally wrote a peer-reviewed scientific paper demonstrating that they manipulated people's moods to a statistically significant degree. I don't think there's much you can call questionable about it from Facebook's perspective.

Re:Facebook doesn't think it's "questionable" (0)

Anonymous Coward | about 4 months ago | (#47375835)

Technically they are correct (the best kind of correct) it is questionable.
In the same way that I exist is questionable - am I really here?

what's worse is.. (1)

strstr (539330) | about 4 months ago | (#47375149)

this is a form of mind control. And DOD funding was involved, so it looks like programs like MKULTRA are alive and well, like all the whistleblowers talk about.

RT.com actually made the link to the DOD/military taking part in the tests: mass mood/emotion manipulation through whatever medium the DOD targets!

http://rt.com/usa/169848-penta... [rt.com]

Learn more about military mind control (which is also what surveillance is used for, because they can learn how to target us using information we provide or believe): http://www.obamasweapon.com/ [obamasweapon.com]

Re:what's worse is.. (3, Insightful)

Pieroxy (222434) | about 4 months ago | (#47375441)

People are controlling your mind all the time. Every time you see an ad, someone is trying to control your mind to convince you to buy something. Every time you read an article in a paper, someone controls your mind to try to get their point across. Every time you argue with someone, she is trying to control your mind by getting her point across. Etc.

Get off your high horse, use your brain.

Wafer thin almost apology. (5, Interesting)

StoneCrusher (717949) | about 4 months ago | (#47375163)

Ah... the apology that puts blame on the victim. A hallmark of abusers and sociopaths everywhere.

Now, everyone will have noticed that they are only apologising for the mis-communication, not for the act of psychological experimentation (as if we would be OK with it if they had told us). But it goes deeper...

Notice that they put the action and the apology in two different sentences, followed quickly by a "We never meant to upset you," putting the emotional blame back on us. As if we were just accidentally bumped bystanders, not the actual targets of the actions.

And they never once use the word "sorry", only the big weasel phrase "we apologize". This apology goes right along with the classic phoney apologies...

I'm sorry that you got upset.
I'm sorry that you feel that way.
I'm sorry that you made me do that.

Ethics (1, Insightful)

ceoyoyo (59147) | about 4 months ago | (#47375179)

Human experimentation without review board approval and informed consent violates a number of national and international laws. It doesn't matter whether anyone gets hurt.

Re:Ethics (4, Informative)

msauve (701917) | about 4 months ago | (#47375209)

Cites, please? Because I have one which counters that claim.

Importantly -- and contrary to the apparent beliefs of some commentators -- not all HSR is subject to the federal regulations, including IRB review. By the terms of the regulations themselves, HSR is subject to IRB review only when it is conducted or funded by any of several federal departments and agencies (so-called Common Rule agencies), or when it will form the basis of an FDA marketing application. HSR conducted and funded solely by entities like Facebook is not subject to federal research regulations...

- Everything you need to know about Facebook's manipulative experiment [wired.com]

Re:Ethics (1)

Sockatume (732728) | about 4 months ago | (#47375367)

Given that subjects were not geographically constrained (they were randomly selected by user ID), the US isn't the only nation whose laws apply to this research.

Re:Ethics (1)

msauve (701917) | about 4 months ago | (#47375439)

So, your point is that you can't point to any foreign laws which have been violated, either.

Re:Ethics (1)

Sockatume (732728) | about 4 months ago | (#47375801)

Human subjects research is subject to mandatory informed consent - specific to the study being performed; you can't just have boilerplate like the Facebook ToS - in almost all jurisdictions. For example, this is the US law Facebook undoubtedly broke:

http://www.hhs.gov/ohrp/humans... [hhs.gov]

Re:Ethics (2)

Sockatume (732728) | about 4 months ago | (#47375863)

Looks like this doesn't apply. Federal funding requirement.

Re:Ethics (1)

Sockatume (732728) | about 4 months ago | (#47375821)

That doesn't discuss informed consent, which under Federal law requires that study participants be given specific information about the purpose, risks, procedures, duration, etc. etc. of the research.

http://www.hhs.gov/ohrp/humans... [hhs.gov]

Re:Ethics (1)

Sockatume (732728) | about 4 months ago | (#47375851)

Actually this doesn't apply. Federal funding requirement.

Re:Ethics (1)

Anonymous Coward | about 4 months ago | (#47375215)

This surely isn't a one-time experiment. They likely have piles of data about tests they have been doing in secret.

And this only measured posts, not feelings about the posts. They don't actually know if what they saw affected people's day in a real way.

Who is watching these companies?
They know so much about us. We're all little playthings to them.

Re:Ethics (0)

Anonymous Coward | about 4 months ago | (#47375377)

But are you playing games with their feelings? And is it in the name of Science?

Re:Ethics (0)

LordLucless (582312) | about 4 months ago | (#47375333)

I'm in trouble then. In the last couple of weeks, I've performed a number of human experiments on the website I manage, including:
* Do they push green buttons more than red buttons?
* Do they fill in forms more reliably if it's one big form, or split across multiple pages?
* Do people finish reading a page more often if the text is in a large font rather than a small one?
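For readers unfamiliar with how such site-level experiments are usually wired up, here is a minimal, hypothetical sketch (the function and variant names are illustrative, not the poster's actual code) of assigning users to A/B variants deterministically by ID:

```python
import hashlib

def ab_bucket(user_id: str, experiment: str,
              variants=("control", "treatment")) -> str:
    """Assign a user to a variant by hashing the user ID with the
    experiment name, so assignment is stable across visits without
    storing any per-user state."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same user always lands in the same bucket for a given experiment,
# but may land in different buckets for different experiments.
assert ab_bucket("user42", "button_color") == ab_bucket("user42", "button_color")
```

Hashing rather than calling a random-number generator at request time is the common choice here precisely because it keeps each user's experience consistent for the life of the experiment.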

Re:Ethics (1)

nine-times (778537) | about 4 months ago | (#47375447)

Surely it can't be all human experimentation, or else ad agencies couldn't attempt to measure the effectiveness of their ads, and parents couldn't raise their children (e.g. "Let's try withholding cookies and see if that works!").

There must be specific parameters under which human experimentation is illegal.

Re:Ethics (0)

Anonymous Coward | about 4 months ago | (#47375869)

I am not sure whether "illegal" is an appropriate term, unless some criminal lawyer is hereabouts.

The fact is that it was *unethical*, because Facebook knew from prior research that emotional contagion is a measurable outcome, and proceeded with an experiment in which half of their subjects were exposed to a negative contagion effect. They knew, a priori, that there was a probability that half their subjects would be harmed by the experiment.

Advertisers and software developers do not set out to create harmful customer or user experiences in ad campaigns and A/B testing, they compare candidate improvements. Facebook could have compared "Facebook Happy" with a control Facebook group, or two alternative "Happy Contagion" exposures. Instead, Facebook intentionally exposed a half of their sample to a harmful influence.

This would be like testing Cognitive Behavioural Therapy (CBT) against, say, making depressed patients cry (perhaps with photos of dead puppies), in order to test whether CBT is effective. It would not be ethical. An ethical test would be CBT versus no intervention, or CBT against the best alternative.

A civil lawyer might be interested in evaluating whether there is any loss to those exposed, which could be pursued in civil damages. I am sure that more than one is doing exactly that right now.

Human Subject Review (1)

flink (18449) | about 4 months ago | (#47375195)

I haven't seen a human subject review or impact statement mentioned in any of these /. articles. Did Facebook even do one before proceeding with this research? If so was it reviewed by an ethics panel before they proceeded with the experiment? If not, then they should definitely be held responsible for any negative outcomes.

Re:Human Subject Review (0)

Anonymous Coward | about 4 months ago | (#47375365)

They should be held responsible even if there were no negative outcomes. They violated Federal Law and every facet of professional ethics in the field of Psychology.

Never meant to get caught (1)

Cardoor (3488091) | about 4 months ago | (#47375283)

FIFY

Who cares? (1)

ctrlshift (2616337) | about 4 months ago | (#47375303)

Facebook has no compact with its users to offer fair and balanced news (if you'll forgive the expression). They are not obligated to feature any particular array of stories to anybody; in fact, we've heard over and over again how the relevance of items that appear in the news feed is skewed and unpredictable. Nobody should be relying on them for news and I don't think we should expect any more journalistic integrity from them than Buzzfeed.

I don't usually take this angle when it comes to corporate responsibility to the public, but in this case I think people are getting too close to Facebook, when Facebook really just wants to be friends. Or perhaps researcher & test subject.

Not happy about the concept, however... (2)

Junta (36770) | about 4 months ago | (#47375307)

My question is: why is there particular outrage when they do it as part of a science experiment, when it is widely acceptable to do the exact same thing in mass media for revenue?

National and local news programs basically live and breathe this sort of thing constantly. They schedule their reporting and editorialize in ways to boost viewership: stirring up anger, soothing with feel-good stories, teasing with ominous advertisements, all according to presumptions about the right way to maximize viewer attention and dedication. 'What everyday item in your house could be killing you right now? Find out at 11.'

I don't have a Facebook account precisely because I don't like this sort of thing, but I think it's only fair to acknowledge this dubious manipulative behavior is ubiquitous in our media, not just as science experiments in Facebook.

Re:Not happy about the concept, however... (2)

Sockatume (732728) | about 4 months ago | (#47375401)

Research ethics. We hold scientists to a higher standard than web sites and TV stations.

The question is why.. (1)

Junta (36770) | about 4 months ago | (#47375757)

Why not hold people not claiming to be scientists to a higher standard? It's not like their science-but-don't-call-it-science experiments are any less potentially damaging than the same behavior done by a 'true scientist'.

Re:The question is why.. (1)

Sockatume (732728) | about 4 months ago | (#47375831)

I agree.

Re:Not happy about the concept, however... (1)

EmagGeek (574360) | about 4 months ago | (#47375405)

Because there is a difference between trying to elicit a behavior and trying to change a person's psychological state of mind.

Re:Not happy about the concept, however... (1)

oh_my_080980980 (773867) | about 4 months ago | (#47375521)

Actually, you don't have a question but rather an ignorant and inane comment. The objection to what Facebook did has been clearly stated, and you have the ability to do research on the subject to understand what Facebook did wrong (hint: it was about informed consent). You don't care about the issue at hand. Rather, your intention was to make a feeble comparison between Facebook and other media in order to make the "you too" argument - a claim which does not justify Facebook's actions (see the Tu quoque logical fallacy). Pinhead.

Re:Not happy about the concept, however... (2)

Junta (36770) | about 4 months ago | (#47375725)

I fail to see how it's that different from the manipulation that mass media does; they also do not get informed consent. There is the facet of it being more targeted, but the internet is already about targeted material (hopefully done with the best interest of the audience in mind; practically speaking, with the best interests of the advertiser). They just stop short of calling it an 'experiment' (in practice, they are continually experimenting on their audience), and somehow by not trying to apply scientific rigor they get off the hook.

I'm not saying that Facebook is undeserving of outrage, I'm saying that a great deal of the media behavior is similarly deserving and somehow we are complacent with that situation.

Most disappointing for me is manipulating the feed (2)

BobMcD (601576) | about 4 months ago | (#47375311)

Facebook's efforts to manipulate the feed are really disappointing. If they'll do it for jollies, then they'll damn sure do it if someone pays them to or if the government orders them to.

Imagine an 'American Spring'. Imagine the government not only spying on Facebook users communicating about it, but requiring that Facebook actively suppress any positive comments about it.

That shit ain't right.

It wasn't "communicated poorly." (0)

Anonymous Coward | about 4 months ago | (#47375329)

It wasn't communicated at all, and Facebook failed to obtain informed consent from subjects of a psychological experiment.

But, let's look at the bright side. Though they violated many federal laws, nobody is going to get in trouble for it, so it's no big deal. The 4 test subjects who committed suicide as a result of the experiment don't need justice, anyway.

"Largely ignored"? (1)

Sockatume (732728) | about 4 months ago | (#47375373)

How was this paper "largely ignored"? It was published two weeks ago, and the outrage started immediately.

Playing with emotions? (0)

Anonymous Coward | about 4 months ago | (#47375397)

So they are still not truthful about this. How can you disclose anything about such an experiment as triggering emotions while you're doing it? To me this is just another impulse-buying test to see which emotions garner the most response. Marketing is all about impulses and emotions; Apple is all about instilling an emotional response to its products and to the company. The more successful you are at manipulating people's emotions, the better you can provide targeted ads to those who respond to them. People need to understand that Facebook's users provide a "free" group of people to experiment on and gather information from. You can hardly complain about what Facebook does, because you have the option of simply not participating as a user. I myself reached a point a year ago where Facebook was no friend of mine, and being truthful to its users was not among Facebook's values. The people at Facebook think that if you're willing to share your personal life on a web-based social network, then you are willing to do just about anything and don't place much value on privacy.

I find it ironic how the same people who never read EULAs or privacy disclosures cry wolf when it becomes a news item.

Re:Playing with emotions? (1)

oh_my_080980980 (773867) | about 4 months ago | (#47375575)

Really? Where was it stated that you would become part of an experiment when you signed up for a Facebook account? Where was the disclosure that the experiment was about to begin, giving you the option to opt out?

I find it interesting that people think EULAs are blanket excuses to do anything. In fact they are not. Court decisions have held that EULAs are limited in scope. Facebook's EULA is too broad and therefore has no legal weight.

The social engineering experiment is still ongoing (0)

Anonymous Coward | about 4 months ago | (#47375451)

Your responses are being measured. Posting "FUCK YOU FACEBOOK" and related things just makes them more powerful.

The good Samaritan always gets his ass kicked (4, Insightful)

Theovon (109752) | about 4 months ago | (#47375463)

As has been pointed out many times, Facebook was doing their usual sort of product testing. They actively optimize the user experience to keep people using their product (and, more importantly, clicking ads). The only difference between this time and all the other times was that they published their results. This was a good thing, because it introduced new and interesting scientific knowledge.

Because of this debacle, Facebook (and just about every other company) will never again make the mistake of sharing new knowledge with the scientific community. This is truly a dark day for science.

Ferengi rule of acquisition #285: No good deed ever goes unpunished.

Re:The good Samaritan always gets his ass kicked (2)

oh_my_080980980 (773867) | about 4 months ago | (#47375603)

As has been pointed out several times, this was not product testing. This was a psychological experiment for which Facebook failed to get informed consent.

Science is in no way hurt by this, but the fact that you think it is shows how truly ignorant you are.

Re:The good Samaritan always gets his ass kicked (1)

Theovon (109752) | about 4 months ago | (#47375691)

The requirement for informed consent was ambiguous in this case. If I had been in their position, I would have erred on the side of caution, and the research faculty who consulted on this project should have been more resolute about it. If anything, it is those people who should have done the paperwork. I think their failure to get informed consent was a mistake, but I don’t believe it was any kind of major ethical violation. It does no harm to get informed consent, even if you don’t legally need it, and there are moral arguments for getting it in any case.

My main point is that this kind of “manipulation” has been going on for a long time and will continue to occur. Facebook intentionally manipulates users in all sorts of ways to determine what gets users to use their service and click ads. The only practical difference between this intentional manipulation and past intentional manipulation is that this time, they reported on it. Going forward, they will continue to not get informed consent (because they don’t need it), and they will also continue to manipulate. Thus the travesty is that they will simply stop reporting their findings in the future; that is the ONLY thing that will change, and the rest of the world will be less informed because of it.

Re:The good Samaritan always gets his ass kicked (0)

Anonymous Coward | about 4 months ago | (#47375897)

As has been pointed out many times, Facebook was doing their usual sort of product testing. They actively optimize the user experience to keep people using their product

No, they actively created a negative user experience. The opposite of optimized. They chose to damage their customers, which is not normal business or scientific practice.

Facebook is a PoS (0)

Anonymous Coward | about 4 months ago | (#47375473)

Those who are into Facebook (meaning those who get convulsions if they don't check their Facebook page several times a day, and who upload every single insignificant detail of their pathetic lives to Facebook) either are too stupid to notice it, or they just don't care.

A Non Apology (3, Insightful)

sjbe (173966) | about 4 months ago | (#47375513)

"This was part of ongoing research companies do to test different products, and that was what it was," Sandberg said. "It was poorly communicated. And for that communication we apologize. We never meant to upset you."

This is identical to saying "I don't know what we did that upset you but whatever it was I apologize". They don't get it. It basically means that they are going to continue treating their users as insects to be experimented upon and lack the moral compass to understand why what they did was wrong. The fact that they ran an experiment is fine in principle but HOW you do it matters. We insist that academic researchers run their psychology experiments by a review board and when necessary get informed consent. It's not a hard thing to do and we do it for very good reasons. Facebook has not presented any plausible reason we should hold them to a different standard.

I'm very glad I do not have a facebook account and at this point I doubt I ever will. This is simply not a company I care to be involved with any closer than I have to be.

If you don't mean to upset us... (1)

Mandrake (3939) | about 4 months ago | (#47375559)

... then don't keep changing the news feed to "Top Stories" which nobody gives a shit about.

hahaha (1)

drinkypoo (153816) | about 4 months ago | (#47375719)

to say Facebook's slight alteration of news feeds caused people to suffer depression seems to be unsupported by any kind of data or logic

That's exactly what the tobacco industry said about health damage due to cigarette smoking, when they knew damned well that it was supported by both data and logic.

my responsibility (0)

Anonymous Coward | about 4 months ago | (#47375827)

So, as a user of Facebook and of technology, and as someone who didn't expect to start having schizophrenic episodes, social isolation, paranoia, and a complete immolation of my life over the last six months, I have to say I did not really appreciate, and continue to not appreciate, emotional manipulation. Sigh.

When has FB newsfeed ever been not manipulated? (1)

swb (14022) | about 4 months ago | (#47375913)

When has the Facebook newsfeed ever NOT been manipulated and been merely a list of posts in chronological order from people you are friends with and/or follow?

It strikes me as constantly being manipulated in multiple ways and in a manner noticeable to many people. Most obvious was the "top stories" filter which purported to filter the newsfeed in some manner designed to suppress some comments and promote others.

But we don't know the criteria for this, or the motivation behind other, less obvious manipulations designed to enhance or suppress comments. Presumably most motivations are commercially driven: to promote advertisers' products or increase Facebook usage.

When has FB newsfeed ever been not manipulated? (0)

Anonymous Coward | about 4 months ago | (#47376117)

Yeah, it's not like Slashdot ever experiments with different ways to suppress low quality posts or rolls out a Beta experiment to half the site's users.

Time to get meta (3, Funny)

rebelwarlock (1319465) | about 4 months ago | (#47375997)

Jokes on you guys - the "leak" was fictional. The real experiment is the public's reaction to this.

Am I alone here? (1)

grasshoppa (657393) | about 4 months ago | (#47376017)

I mean, ya; "facebook is the enemy", sure. But honestly? Where's the personal responsibility? You can show me whatever you want, *I* control my emotions and my responses.

This whole thing has seemed a tempest in a teacup, but because Facebook is of questionable morals and ethics, it seems everyone is jumping on board to declare how horrible this was.

People are stupid (1)

gelfling (6534) | about 4 months ago | (#47376063)

Anyone who kills themselves over an emoticon is actually on the right track.
