
Facebook's Emotion Experiment: Too Far, Or Social Network Norm?

timothy posted about 1 month ago | from the applied-semantics dept.

Social Networks

Facebook's recently disclosed 2012 experiment in altering the tone of what its users saw in their newsfeeds has brought it plenty of negative opinions to chew on. Here's one, pointed out by an anonymous reader: Facebook's methodology raises serious ethical questions. The team may have bent research standards too far, possibly overstepping criteria enshrined in federal law and human rights declarations. "If you are exposing people to something that causes changes in psychological status, that's experimentation," says James Grimmelmann, a professor of technology and the law at the University of Maryland. "This is the kind of thing that would require informed consent." For a very different take on the Facebook experiment, consider this defense of it from Tal Yarkoni, who thinks the criticism it's drawn is "misplaced": "Given that Facebook has over half a billion users, it's a foregone conclusion that every tiny change Facebook makes to the news feed or any other part of its websites induces a change in millions of people's emotions. Yet nobody seems to complain about this much -- presumably because, when you put it this way, it seems kind of silly to suggest that a company whose business model is predicated on getting its users to use its product more would do anything other than try to manipulate its users into, you know, using its product more. ... [H]aranguing Facebook and other companies like it for publicly disclosing scientifically interesting results of experiments that it is already constantly conducting anyway -- and that are directly responsible for many of the positive aspects of the user experience -- is not likely to accomplish anything useful. If anything, it'll only ensure that, going forward, all of Facebook's societally relevant experimental research is done in the dark, where nobody outside the company can ever find out -- or complain -- about it."


219 comments


One solution (5, Insightful)

Anonymous Coward | about 1 month ago | (#47348827)

Just don't use social networking.

Re:One solution (0)

Anonymous Coward | about 1 month ago | (#47348939)

and if you have a social network account you don't use any more - delete it!

No point inflating FB, Twitter et al user numbers for them to profit from.

Re:One solution (0, Offtopic)

flyneye (84093) | about 1 month ago | (#47348983)

Lol, G+ is like a party where you decorate, chill the beer, and send out invitations, and nobody comes. It is an ANTI-social network. Where does this fit in the scheme of things?

Re:One solution (1)

MancunianMaskMan (701642) | about 1 month ago | (#47349031)

Lol, G+ is ... an ANTI-social network.

G+ suits me fine, I have no friends IRL either.

You insensitive clod. People sometimes go on at me about being antisocial; I think they're being overly judgemental in implying it's a bad thing.

Re:One solution (1)

flyneye (84093) | about 1 month ago | (#47349079)

Why, I'd love to have you outside my circles!
Get on wit'cha bad self.

Re:One solution (4, Insightful)

rmdingler (1955220) | about 1 month ago | (#47349177)

Some people have little interest in happiness and a good mood... you see it every day.

Within and without social media, people, events, results, and happenstance conspire to alter your mood each and every day... something that cannot happen without your tacit permission. Grow a thicker skin and remember that yelling at that jerk in traffic means you've allowed a complete stranger power over your behavior.

If giving up social media is too big a first step, don't go in with your eyes wide shut: you are the product, not the customer.

Re:One solution (1)

Noah Haders (3621429) | about 1 month ago | (#47349453)

The quote in the summary says that in the US it requires informed consent... What if they just tested on foreigners? Problem solved, yes?

Re:One solution (1)

gunner_von_diamond (3461783) | about 1 month ago | (#47349019)

and if you have a social network account you don't use any more - delete it!

No point inflating FB, Twitter et al user numbers for them to profit from.

FB/Twitter will always be inflated, no matter what, thanks to spam accounts. How else would I have any friends?

Re:One solution (2, Interesting)

flyneye (84093) | about 1 month ago | (#47348963)

But, think of the implications for upcoming elections! How will the Repubmocrats keep 100% power against independents, the Tea Party, and other radical despots competing against the chosen ones? Control! The people obviously need to be controlled; they don't know what is good for them, and the Repubmocrats always will.
Since there is a market, Facebook, which covers most demographics, can help by raising tensions toward sinister interlopers in our one-party political system.
Look for upcoming "hour of hate" shows profiling Tea Party advocate Emanuel Goldstein, NRA talking heads, farmers, anti-war hippies, and anyone without the Hillary Clinton seal of approval on their forehead or hand.....

Don't worry! (1)

Mashiki (184564) | about 1 month ago | (#47348829)

It's all perfectly harmless. ctOS is here for you! [wikia.com] For those that haven't played the game, stop reading. With that said, one of the plot points in the game was the "targeted reassignment" of vote predictions to get a mayor re-elected.

more interesting... (5, Interesting)

Selur (2745445) | about 1 month ago | (#47348833)

It doesn't sound like this is the first experiment done by the Facebook crowd -> What other experiments have happened? Were the participants informed about them later? Who takes the blame if such an experiment results in someone getting hurt?

Re:more interesting... (0, Flamebait)

Anonymous Coward | about 1 month ago | (#47348875)

Not that I believe they would do such a thing, but it would be entirely possible for them to increase suicide rates by subtly changing the results you get, like making sure any pleas for help never get shown to the network and no positive feedback from the network gets back.

Re:more interesting... (4, Interesting)

N1AK (864906) | about 1 month ago | (#47348915)

Where do you draw the line? If Facebook realised that showing more negative stories (by monitoring what people already see) makes people more likely to click adverts is that really any better/worse than them artificially increasing/decreasing the amount of positive stories a user sees?

If Google was having a hard time deciding if a page was junk or not, would it be unethical to put it in the results for some users and see how they react? Clearly that's an experiment without user knowledge, but it certainly doesn't sound unethical to me, and it's hard to see how stopping that kind of experimentation, or flooding sites with notices about it, would make things better for users.

Obviously there are experiments they could run that would be unethical if users weren't informed and monitored; discussing where the lines are and agreeing some best practices would therefore make sense.

I think it's fine (3, Insightful)

Threni (635302) | about 1 month ago | (#47348835)

I love how overblown the coverage of this has been... as if it's driven people to suicide. It's their site, they can do what they want; people are free to leave if they want. Nothing to see here.

Re:I think it's fine (1)

buck-yar (164658) | about 1 month ago | (#47348901)

I think they didn't go far enough; more experiments like this should be done. Never before has such a database been compiled (other than the NSA's). Much can be learned.

Re:I think it's fine (3, Insightful)

danudwary (201586) | about 1 month ago | (#47349373)

More PUBLISHED experiments, though, please. Let's know what they're doing, and what the outcomes are.

Re:I think it's fine (1)

Anonymous Coward | about 1 month ago | (#47348987)

We'll never know if it drove people to suicide.

Re:I think it's fine (0)

Anonymous Coward | about 1 month ago | (#47349015)

And? If there is never any evidence found that proves this experiment drove some to suicide, then what's the problem?

Re:I think it's fine (-1)

Anonymous Coward | about 1 month ago | (#47349125)

We'll never know if it drove people to suicide.

If something as simple as changing the stories you read on facebook for a week drove you to suicide, I think you were well on your way there already, and it would have just taken something else instead to set you off.

Re: I think it's fine (1)

Anonymous Coward | about 1 month ago | (#47348995)

It's a site for family and friends to share experiences, not for them to explore and experiment on us as if we were mice. Ohhh that's right, everybody's trying to make that easy money. How about marketing, timing and user mapping this :-) happy :-) emotional friendly middle finger Jackie !!! #getbent fb
Nothing to see here !!! :-)

Re:I think it's fine (3, Insightful)

DarkOx (621550) | about 1 month ago | (#47349035)

That is kinda my reaction as well. It seems the issue people have here is that Facebook sought to manipulate people's emotional state. The thing is, that is exactly what just about every advertiser does all the time.

Home Security System ads: clearly designed to make you feel vulnerable and threatened.

Cosmetic surgery ads: clearly designed to make you feel inadequate.

Beer ads: very often designed to make you feel less accepted; you need their product to be perceived as cool. Ditto for clothing and personal care products.

Political ads: feelings of security and family (at least if you pick their candidate)

This list goes on...

It might not have the same rigor as the academic world, but they absolutely do focus-group this stuff and find out how people 'feel'; the marketers have researched what words, phrases, and imagery best evoke these feelings. If what Facebook did is illegal, or even just unethical, then so is pretty much everything the modern advertising industry has been up to for the past 70 years.

I am sure many people would actually agree with that, but I don't see why it's suddenly so shocking and deserving of attention just because facespace does it.

This is not advertising (3, Insightful)

sjbe (173966) | about 1 month ago | (#47349159)

The thing is that is exactly what just about every advertiser does all the time.

No it is NOT the same thing. The beer company does not have any control over what *I* say, and they do not get to (legally) change what I say or how it is delivered to others. There is a HUGE difference between putting a message out there and seeing how people react to it versus actually changing what you or I say and how it is delivered to someone else without my consent. The former is advertising, which is fine as long as it isn't too intrusive. The latter is a violation of personal sovereignty unless you obtain informed consent beforehand.

Furthermore, even if every advertiser actually did this (which they do not), and even if you have an ethical blind spot so large that you can't actually see what Facebook did wrong, two wrongs don't make a right. "Everyone else is doing it" is a juvenile argument that little kids make to justify behaviors they shouldn't be engaging in.

Re:This is not advertising (4, Insightful)

nine-times (778537) | about 1 month ago | (#47349539)

As far as I could tell from reading about this, they didn't change what people said.

Here's the thing: Facebook already filters what you see with the default setup. Say your 500 friends each post 10 posts today, and when you load up your page on a social networking site, the page only displays 15. So how are those 15 chosen? (I'm making up numbers here, obviously.)

The obvious choice would be to show the 15 most recent posts, but that means there's a good chance you'll miss posts that are important and that you'd like to see, since you're only getting a brief snapshot of what's going on in that social networking site. Facebook instead has an algorithm that tries to determine which of those 5,000 posts you'll care most about. I don't know the specifics, but it includes things like favoring the people who you interact with most on Facebook.

So what Facebook did in this study is they tweaked that algorithm to also favor posts that included negative words. The posts were still from that 5,000 post pool and the contents of the posts were unedited, but they subjected you to a different selection in order to conduct the research.

It's still an open question as to whether this sort of thing is appropriate, but it's important to note that this is something Facebook does all the time anyway. I think where it gets creepy is that Facebook is also an ad-driven company, so you have to wonder what the eventual goal of this research is. I can imagine Facebook favoring posts that include pictures of food to go along with an ad campaign for Seamless. Maybe they'll make a deal with pharmaceutical companies to adjust your feed to make you depressed, while at the same time plastering your feed with ads for antidepressants.
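
To make that mechanism concrete, here is a minimal score-and-select sketch in Python. It is purely illustrative: the signal names, weights, and word list are invented for the example, not taken from Facebook's actual algorithm.

<ecode>
# Hypothetical score-and-select feed filter (invented signals and weights).
NEGATIVE_WORDS = {"sad", "angry", "awful", "terrible"}

def score(post, negativity_weight=0.0):
    s = post["affinity"]          # how much the viewer interacts with the author
    s += 0.1 * post["likes"]      # a simple popularity signal
    words = set(post["text"].lower().split())
    if words & NEGATIVE_WORDS:    # the experimental tweak: boost or bury
        s += negativity_weight    # posts by emotional tone
    return s

def select_feed(posts, n=15, negativity_weight=0.0):
    """Pick the n highest-scoring posts from the candidate pool."""
    return sorted(posts, key=lambda p: score(p, negativity_weight),
                  reverse=True)[:n]
</ecode>

With negativity_weight=0.0 you get the control condition; a positive value over-selects negative posts, a negative value suppresses them. No post text is edited; only the selection changes, which is exactly the distinction being drawn above.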

Re:I think it's fine (2)

oh_my_080980980 (773867) | about 1 month ago | (#47349215)

For the cognitively impaired: advertising is identified as advertising. The experiment that was conducted was not identified; people did not know it was occurring. The "researchers" did not get informed consent, which is required. The "researchers" claimed the EULA was the informed consent.

Re:I think it's fine (1)

DarkOx (621550) | about 1 month ago | (#47349291)

Advertising is not always identified. How often have you gotten a letter that is designed to look like an insurance invoice or a bank check? Many look official enough that I have spent at least 20 seconds deciding if it's something I really need to act upon. All kinds of advertisers take out full-page ads and do their damnedest to disguise them as articles in print magazines and newspapers.

Sure, there is always some fine print somewhere that says "advertisement," but then I would argue Facebook's EULA qualifies as fine print.

Also, Facebook was not altering the messages or their presentation; they were filtering them, just like they always filter what goes into your news feed. Some of the filter criteria you control in the user preferences; much of it you do not. All they did was change filter criteria that users never had full control over. I think it's a little crazy to get up in arms about that.

Finally, advertisers do borrow what people say to promote products. They quote celebrities and other people all the time, and they don't need to make any payment or get consent for that either, as long as they keep it a direct quote, even when they use it well out of context.

Re:I think it's fine (5, Insightful)

Tim the Gecko (745081) | about 1 month ago | (#47349227)

Facebook uses psychology to make minor changes in our happiness... Something must be done!

Soda companies use psychology to sell huge buckets of sugar water... Hands off our soda, Mayor Bloomberg!

Re: I think it's fine (1)

afgam28 (48611) | about 1 month ago | (#47349399)

That's what I thought when I read it too. I wonder, if Slashdot did an A/B test with its moderation system and ran some sentiment analysis on the resulting comments, would there be the same outrage?

This news piece has been greatly exaggerated (-1, Troll)

rodrigoandrade (713371) | about 1 month ago | (#47348843)

Seriously, come on. Do you PERSONALLY know ANYONE who was affected by this? Neither do I.

TFS and TFA should include something like: This news brought to you by our scaremongering overlords, courtesy of privacy nutjobs all over the globe.

These aren't legal documents we're talking about here, anyway. I'm also pretty sure this is covered under Facebook's EULA/TOS, which you didn't read.

Life goes on. Don't like Facebook, don't use Facebook.

Re:This news piece has been greatly exaggerated (5, Insightful)

astro (20275) | about 1 month ago | (#47348873)

Bullshit. How do you know that you don't know anyone that was affected by it? Do you know which week in 2012 the experiment was conducted? Do you know which of the ~billion FB accounts were the 700k experimented upon? I find it pretty shocking that so many people are having difficulty understanding the difference between A/B testing and intentional emotional manipulation where a significant negative (or positive) result was the data point the study strove to measure.

I can quite imagine that a significant number of offline lives were impacted by this experiment. People exposed to negative content presumably don't limit their negative reactions to behavior only in the venue where they were exposed to the negative content.

Re:This news piece has been greatly exaggerated (3, Interesting)

bickerdyke (670000) | about 1 month ago | (#47348909)

I find it pretty shocking that so many people are having difficulty understanding the difference between A/B testing and intentional emotional manipulation where a significant negative (or positive) result was the data point the study strove to measure.

Creating an emotional response is part of marketing, and therefore of web design.

Of course you're not directly monitoring emotions as a data point during A/B tests. You measure, e.g., clicks, pages read, or the time spent on the website. But every marketing guy worth his salt could tell you that you can increase all of those by "making the user feel at home".

Re:This news piece has been greatly exaggerated (1)

oh_my_080980980 (773867) | about 1 month ago | (#47349225)

Wow? Manipulating emotional response is not part of web design. Somebody failed high school.

Re:This news piece has been greatly exaggerated (2)

bickerdyke (670000) | about 1 month ago | (#47349249)

So you don't want to create websites that people enjoy using?

That may explain the design of the average Linux user group website, but it would also explain why websites like Facebook or even lolcats, which target emotions, have more commercial success.

Re:This news piece has been greatly exaggerated (1)

oh_my_080980980 (773867) | about 1 month ago | (#47349279)

Really? You do understand what an experiment is, don't you? You do know what informed consent is, don't you?

Web design, much like beauty, is in the eye of the beholder. But more importantly, web design is not about eliciting an emotional response from a user; it's about navigation: how easy it is for people to navigate your site. But if you want to get caught up in putting dancing bears on your web page so people can get a warm and fuzzy feeling, whatever.

Re:This news piece has been greatly exaggerated (1)

bickerdyke (670000) | about 1 month ago | (#47349559)

Point taken.

I'll reduce that claim to "commercial web design". But that's still the majority of pages out there. They want to SELL. And if it takes those dancing bears, there is no way they won't use dancing bears.

Quick: what toilet paper brand has dancing bears as mascots?
And aren't they cute and funny and loveable.... See, it works.

Re:This news piece has been greatly exaggerated (3, Insightful)

PolygamousRanchKid (1290638) | about 1 month ago | (#47349049)

1. Look up what wacky crimes were committed in January 2012.
2. Blame them on Facebook.
3. Sue.
4. Profit!

In January 2012 a bunch of kids formed the Islamic Caliphate of the Rusted Chevy on Cinder Blocks on my front lawn, despite their parents instructing them: "You best be staying away from Mr. Kid, he ain't right in the head."

Obviously Facebook manipulation caused this.

Re:This news piece has been greatly exaggerated (0)

Anonymous Coward | about 1 month ago | (#47349123)

Do you know which week in 2012 the experiment was conducted?

Yes, because I actually read the paper.

Re:This news piece has been greatly exaggerated (5, Insightful)

Anonymous Coward | about 1 month ago | (#47348889)

Seriously, come on. Do you PERSONALLY know ANYONE who was affected by this? Neither do I.

Do you PERSONALLY know anyone who was affected by warrantless wire tapping? Neither do I.

As long as they never admit who it happened to, so that nobody can know whether it happened to them, then we're good? Look, there probably isn't anyone alive today (and certainly not on this website) who knew Little Albert, but that doesn't make the experiment that was done to him any less unethical.

Facebook's TOS can obtain consent, but it can never obtain informed consent.

This news piece has been greatly exaggerated (0)

Anonymous Coward | about 1 month ago | (#47348895)

So it was done, but nobody was affected by it? Sounds like something they would have cancelled if it wasn't having any effect on people.

Re:This news piece has been greatly exaggerated (0)

Charliemopps (1157495) | about 1 month ago | (#47349039)

Seriously, come on. Do you PERSONALLY know ANYONE who was affected by this? Neither do I.

TFS and TFA should include something like: This news brought to you by our scaremongering overlords, courtesy of privacy nutjobs all over the globe.

These aren't legal documents we're talking about here, anyway. I'm also pretty sure this is covered under Facebook's EULA/TOS, which you didn't read.

Life goes on. Don't like Facebook, don't use Facebook.

I also don't personally know anyone that was killed in the Holocaust. I guess that's OK too then?

The point is, Facebook could very easily manipulate elections with this sort of thing. It should be illegal. I'm aware that lots of other companies have done the same sorts of things to lesser degrees, but that doesn't make it right. If it were illegal, at least the researchers involved might think twice about participating.

Re:This news piece has been greatly exaggerated (0)

Anonymous Coward | about 1 month ago | (#47349133)

I also don't personally know anyone that was killed in the Holocaust. I guess that's OK too then?

Facebook filters some users' already-filtered feeds slightly differently to lower the chance of positive/negative messages appearing => killing millions of people in absolutely horrendous ways. Good lord.

Not exaggerated at all (3, Insightful)

sjbe (173966) | about 1 month ago | (#47349071)

Seriously, come on. Do you PERSONALLY know ANYONE who was affected by this? Neither do I.

Nobody knows who was affected or exactly how. That's part of the problem. They did it without knowledge or consent. They did not inform people of what they were doing, even after the fact. They did not have their design of experiment reviewed by an independent ethics board. They violated the (misplaced) trust their users had that Facebook would deliver their messages as the users intended.

This isn't legal documents we're talking about here, anyway. I'm also pretty sure this is covered under Facebook's EULA/TOS you didn't read.

NOTHING in Facebook's TOS remotely qualifies as informed consent to be experimented upon. I don't even have to read it to know that. It's not THAT they did this experiment, it is HOW they did this experiment. It's not hard to put the experiment proposal in front of an ethics panel. It's not hard to get informed consent if that is deemed appropriate by the ethics panel. It is standard practice to do those things for some very, very good reasons. Facebook couldn't be bothered.

Re:This news piece has been greatly exagerated (2)

mr100percent (57156) | about 1 month ago | (#47349189)

I don't know anyone who was affected by the Tuskegee syphilis study, but that doesn't mean it was right or that we shouldn't be outraged.

Too far? (1)

Anonymous Coward | about 1 month ago | (#47348845)

It was too far when they were selling the data before these shenanigans, but most people don't care, or are uninformed or clueless.

Congratulations, FB has reached a new level of indecency. It's so regular that it's no longer alarming, somehow.

Shock and awe (3, Informative)

Stumbles (602007) | about 1 month ago | (#47348849)

Not really; no one should be surprised in the least. Some years ago, when Zuckerberg was asked about Facebook users' data, his reply was that they are fucking idiots to trust him.

Holy False Dichotomy, Batman! (1)

fuzzyfuzzyfungus (1223518) | about 1 month ago | (#47348851)

This is "social", the fucking pox of the internet, we are talking about here: "too far" and "social network norm" are usually synonymous...

In a way, though, the fact that it goes no further over the line helps make it pathetic: what happens on 'social' services are the ethical transgressions of our best and brightest, equipped with nigh-unlimited funds, the assurance that they are Just That Good, and the conviction that 'disruption' is the ultimate virtue; and yet their imaginations seem to extend no further than being bigger assholes about selling ads...

More proof failbook is for fucktarded sheeple (-1, Flamebait)

Anonymous Coward | about 1 month ago | (#47348865)

Failbook has always proved and will always prove to be intrusive. Yet the sheep that use failbook continue to prove they are nothing more than stupid little fucks that value nothing at all. Now with this "emotion experiment" the dumb asspie cracker Zuckerberg feels he is beyond any and all laws, with his sheep still saying "fuck me in the ass harder Mark." The solution to this is simple: shut failbook down. If you must keep in touch, that is what email and *gasp* letters via snail fucking mail are for. Then there is also a newfangled method called a "website" that will allow for someone to put their shit up. Making a webpage is all too simple. If they can't make one then they are too fucking stupid to even exist, let alone use a fucking computer, so it is best to let the fucktarded sheeple that use failbook fucking self-destruct and perhaps earn themselves a fucking Darwin Award along the way.

Re:More proof failbook is for fucktarded sheeple (1)

fey000 (1374173) | about 1 month ago | (#47349271)

Failbook has always proved and will always prove to be intrusive. Yet the sheep that use failbook continue to prove they are nothing more than stupid little fucks that value nothing at all. Now with this "emotion experiment" the dumb asspie cracker Zuckerberg feels he is beyond any and all laws, with his sheep still saying "fuck me in the ass harder Mark." The solution to this is simple: shut failbook down. If you must keep in touch, that is what email and *gasp* letters via snail fucking mail are for. Then there is also a newfangled method called a "website" that will allow for someone to put their shit up. Making a webpage is all too simple. If they can't make one then they are too fucking stupid to even exist, let alone use a fucking computer, so it is best to let the fucktarded sheeple that use failbook fucking self-destruct and perhaps earn themselves a fucking Darwin Award along the way.

I dare say I smell the distinct aroma of a Pulitzer from your florid loquaciousness.

A/B-Testing (5, Insightful)

bickerdyke (670000) | about 1 month ago | (#47348891)

I understand why this should be considered wrong, and I fully understand users who don't want to have someone (let alone some company!) playing with their feelings.

But on the other hand, considering that creating an emotional response has been a standard marketing tool for the last 20 years, how is this different from regular A/B testing? 50% of your website users see a slightly altered version of your website, and you compare their response rates to those of the users receiving the "old" or "original" website.
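
For comparison, a bare-bones A/B split looks something like the following sketch (the experiment name and bucketing scheme are made up; this is just to show the mechanics):

<ecode>
import hashlib

def assign_variant(user_id, experiment, variants=("A", "B")):
    # Hash the user into a bucket so each user sees the same
    # variant on every visit (stable, pseudo-random assignment).
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

bucket = assign_variant("user-12345", "homepage-layout-2012")
# Serve the "original" page to bucket A and the altered page to bucket B,
# then compare response rates (clicks, pages read, time on site).
</ecode>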

Advertisers have been manipulating our feelings for decades. News outlets have been doing it to such an extent that it became part of the news format itself (I guess anyone who watched TV news last night saw that light-hearted, cozy, human-interest, slightly oddball or cute item concluding the broadcast, right?), while creating negative feelings toward someone else has always been a staple of political campaigns.

It becomes even less spectacular if you consider that on Facebook there has always been a selection algorithm in place that tried to select, from all your Facebook sources, those items that might keep your interest focused on Facebook. Without selection, your feed would scroll past like the Star Wars end titles. Only the parameters of the selection have been fine-tuned, as they probably are with each Facebook server update. It would be something new if that selection had been "objective" before, but being "personal" and emotional is what has kept us on Facebook all along.

So this is old news. But it should be a wake-up call: WAKE UP, THIS IS OLD NEWS! PEOPLE HAVE BEEN TRYING TO MANIPULATE YOUR FEELINGS FOR AGES!

Just in case you haven't noticed. I'm surprised about the number of people who are surprised.

Re:A/B-Testing (0)

Anonymous Coward | about 1 month ago | (#47348917)

If your "regular A/B testing" is intended to cause harm to one half of your user-base or customer-base, then you are also ethically unsound.

Re:A/B-Testing (4, Insightful)

TheRaven64 (641858) | about 1 month ago | (#47348929)

A significant amount of marketing is intended to cause harm to 100% of the user-base, so being ethically unsound doesn't appear to be a problem.

Re:A/B-Testing (1)

oh_my_080980980 (773867) | about 1 month ago | (#47349233)

First, no it's not, nice try. Second, people are aware that it is marketing/advertising. People were not aware they were part of an experiment. That's the point, Potsy.

Re:A/B-Testing (1)

Racemaniac (1099281) | about 1 month ago | (#47349445)

So all kinds of products claiming to be healthy when they're absolutely not isn't happening, and never has happened?
Under which rock have you been living?

Re:A/B-Testing (1)

Sockatume (732728) | about 1 month ago | (#47348933)

It becomes a problem when you involve actual academic research staff. Private companies can do whatever the heck they like outside of a hospital, but researchers engaging in interventions are required to meet certain ethical standards as a matter of professional norm and quite often as a binding condition of any funding they have received.

Messaging versus manipulation of content (2)

sjbe (173966) | about 1 month ago | (#47349037)

But on the other hand, considering that creating an emotional response has been a standard marketing tool for the last 20 years, how is this different from regular A/B-Testing?

Because they aren't just throwing messages at people to see how they react. They were actively changing the messages and how they were received. HUGE difference and one that crosses an ethical line. If you are a beer company, you can try to promote your product to me in a way that you think might make me more inclined to buy it and that is fine as long as you aren't overly intrusive about it (think telemarketers). What is NOT fine is for them to take what I say and manipulate that to try to convince me (or others) to buy their product.

Just in case you haven't noticed. I'm surprised about the number of people who are surprised.

Then you do not understand what is going on. Facebook stepped over an ethical line in their "research". No, nobody got (badly) hurt but that doesn't make it acceptable. Screwing around with people's emotions in a controlled experiment should require at minimum review by a genuinely independent ethical review board and probably genuine informed consent. Facebook could be bothered with neither one. They seem to regard their users as insects to be manipulated and dissected.

Re:Messaging versus manipulation of content (3, Interesting)

bickerdyke (670000) | about 1 month ago | (#47349121)

Because they aren't just throwing messages at people to see how they react. They were actively changing the messages and how they were received. HUGE difference and one that crosses an ethical line.

But according to /., that's not what happened here.

According to this article here [slashdot.org] , no messages were changed:

Facebook briefly conducted an experiment on a subset of its users, altering the mix of content shown to them to emphasize content sorted by tone

(emphasis mine).

I agree with you that changing the actual messages would not be acceptable by any standard.

Just in case you haven't noticed. I'm surprised about the number of people who are surprised.

Then you do not understand what is going on. Facebook stepped over an ethical line in their "research". No, nobody got (badly) hurt but that doesn't make it acceptable. Screwing around with people's emotions in a controlled experiment should require at minimum review by a genuinely independent ethical review board and probably genuine informed consent. Facebook could be bothered with neither one. They seem to regard their users as insects to be manipulated and dissected.

And again I agree with you that you're stepping over a line when you're consciously manipulating people's feelings for economic reasons. But this line is crossed a thousandfold already. The type of environment is secondary; A/B tests take place in controlled environments, too.

Re:Messaging versus manipulation of content (1)

DarkOx (621550) | about 1 month ago | (#47349365)

Facebook stepped over an ethical line in their "research". No, nobody got (badly) hurt but that doesn't make it acceptable.

Yes, actually, it does make it acceptable, because the people doing the experiment knew that it was very, very unlikely to cause anyone serious injury. When a psychological experiment amounts to no more than making people aware their buddies had a shitty day at work, by their own account, I don't think it actually rises to the level of requiring consent.

Humans conduct experiments all the time; it's how any self-aware being interacts with the world around them. It's just on a small scale, so nobody cares. I bet plenty of sales professionals have at least informally experimented with whether asking about people's kids helps them close deals. If every little experiment, no matter how benign, really requires informed consent, then all anyone will have time to do ever again is sue each other over whether their actions constituted an experiment or not.

Re:A/B-Testing (2)

mr100percent (57156) | about 1 month ago | (#47349185)

The issue is clear: if a doctor or psychologist tried this, they would have to get IRB approval. You need informed consent; such laws were passed after psychologists had tried a LOT of experiments on the unwitting public: simulating muggings, imminent-death scenarios, etc.

I know people say "it's just manipulating feeds, what's the harm?" There can be plenty of harm if you manipulate the feeds. Where is the line? What if Facebook had decided to see what happens if you show someone depressing posts and bad news for a year? Or a feed where you were always ignored? No IRB would allow something like that if it risked permanent mental scarring or created a suicide risk.

Bad move, Facebook. Experiments are definitely cool (I'm a researcher), but we go through proper channels and regulation for a darned good reason.

Re:A/B-Testing (1)

bickerdyke (670000) | about 1 month ago | (#47349237)

The issue is clear: if a doctor or psychologist tried this, they would have to get IRB approval. You need informed consent; such laws were passed after psychologists had tried a LOT of experiments on the unwitting public: simulating muggings, imminent-death scenarios, etc.

Yes. And I agree with you.

I never said it was or should be accepted; I said it was widespread. And that in marketing, emotional manipulation is even out of the experimental stage.

Re:A/B-Testing (3, Interesting)

Trepidity (597) | about 1 month ago | (#47349261)

Apparently they did actually get IRB approval, oddly enough. The study was jointly done with two universities, and from what other researchers have told me, the two universities' IRBs approved the protocol. I'm surprised myself that they would. Would be curious to see what their reasoning was.

Re:A/B-Testing (1)

mr100percent (57156) | about 1 month ago | (#47349281)

Now that IS very interesting. I wonder how the IRB approved an experiment that clearly didn't have any participants' consent.

Advertising =/= scientific research (1)

langelgjm (860756) | about 1 month ago | (#47349299)

It's different from A/B testing in that the experiment is explicitly designed to cause harm to half of the participants.

Presumably most A/B testing would be designed to figure out which choice performs better on a set of metrics. But going in, there is little evidence to point to one or the other, and the "harm" caused would simply be in user experience. In this experiment, the researchers had a prior theory about which choice would cause harm, and the harm is emotional and psychological.

All that aside, if this were purely internal research at Facebook, it would still likely be unethical but probably nothing out of the ordinary. The fundamental difference is that this is being presented as scientific research. It's published in PNAS. It involves three co-authors from various universities. There are standards, both legal and ethical, that must be followed when engaging in scientific research, and the concern is that such standards were perhaps not followed.

Manipulation and even inducing harm may be widespread throughout the advertising industry, but that's advertising, not science.

Too far? (5, Insightful)

ebonum (830686) | about 1 month ago | (#47348919)

What about what advertisers do every day?
Our government (for us Americans) runs campaigns to alter opinions in other countries.
I'd like to see everyone in the business of "caus[ing] changes in psychological status" be required to get "informed consent" first.
Beer companies anyone?

Messaging versus content manipulation (1)

sjbe (173966) | about 1 month ago | (#47348993)

What about what advertisers do every day?

What about them? They don't get to run controlled experiments on me and they certainly do not get to alter what I say or how others receive what I say. Advertisers can control what they say to me and see how I react but they don't get to manipulate what I say and see how that affects others. HUGE difference.

Our government (for us Americans) runs campaigns to alter opinions in other countries.

They don't get to adjust what *I* say to see what effect it has on others. You really can't see the difference?

I'd like to everyone in the business of "caus[ing] changes in psychological status" get "require informed consent" first.

When they are performing a controlled experiment on me, then yes, they should. If they want to simply send messages my way to see what I do, then that does not require informed consent unless it rises to a certain level of obnoxiousness, like telemarketing. They do not get blanket permission to interrupt my day, manipulate what I say, or manipulate how others receive what I say.

Re:Messaging versus content manipulation (0)

Anonymous Coward | about 1 month ago | (#47349069)

They don't get to adjust what *I* say to see what effect it has on others.
 
If you think modern political ads and "journalism" don't alter "what you say" to see what effect it has on others, then you're living in a fantasy land, or you're naive enough to believe that there is truth in these forums. If you can't see this and admit it with a straight face, then I'm afraid that what you think about this and a lot of other matters means nothing to me.

Don't read if you don't want your emotions changed (1)

kruach aum (1934852) | about 1 month ago | (#47348921)

"If you are exposing people to something that causes changes in psychological status, that's experimentation,"

No it isn't, otherwise the above sentence would be experimentation, as it changed my psychological state from calm to annoyed. Is it too much to ask that supposed experts use their own jargon correctly?

"Victims" received positive or negative newsfeeds? (3, Interesting)

by (1706743) (1706744) | about 1 month ago | (#47348923)

According to the WSJ's coverage http://online.wsj.com/articles... [wsj.com] ,

The impetus for the study was an age-old complaint of some Facebook users: That going on Facebook and seeing all the great and wonderful things other people are doing makes people feel bad about their own lives.

So although conventional wisdom might say that seeing positive things makes you happier, there have been accusations to the contrary -- that positive things about other people make you feel lousy about yourself. This study ostensibly looked at that (and I think it found something along the lines of conventional wisdom: happy posts make you post happy stuff, a [dubious!] proxy for your own happiness...).
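
As an aside on method: the published study measured emotion only indirectly, by counting emotional words in what people subsequently posted, roughly along the lines of this toy sketch (the tiny word lists are placeholders standing in for the much larger LIWC-style lexicon the paper reportedly used):

<ecode>
# Toy word-count emotion proxy (naive tokenization, placeholder lists).
POSITIVE = {"happy", "great", "love", "wonderful"}
NEGATIVE = {"sad", "awful", "hate", "terrible"}

def emotion_rates(post_text):
    words = post_text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = len(words) or 1
    return pos / total, neg / total   # fractions of emotional words

print(emotion_rates("had a great day, love this"))  # (0.333..., 0.0)
</ecode>

Which is exactly why "happy posts" are such a dubious proxy for happiness: the measure only sees word frequencies, not what anyone actually feels.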

If Facebook knew (and how would they?) that X makes you depressed, then yes...there might be some moral issues with that. But it seems that Facebook asked a legitimate question -- especially so given that it was published in PNAS.

That said, yeah...it feels a little shady. But then, when I log onto Facebook, I am certainly not expecting any aspect of the website to be designed with my best interests in mind!

Re:"Victims" received positive or negative newsfee (0)

Anonymous Coward | about 1 month ago | (#47349579)

If Facebook knew (and how would they?) that X makes you depressed, then yes...there might be some moral issues with that. But it seems that Facebook asked a legitimate question -- especially so given that it was published in PNAS.

It's simple. After an acquisition, Facebook was stuck with a large stockpile of sodium thiopental that they couldn't move because of an EU ban. Hence, it was an ethical obligation to Facebook shareholders to make as many Europeans as depressed as possible to help move inventory. The other option--manipulating the media and restoring lethal injection in the EU--would have been impractical.

And yes, the above is a farce. But in the grand scheme of things, if the above were true it'd pale in comparison to the regular war-drum beating that happens in the US to keep dead bodies and money flowing through the Military-Industrial Complex. As much as I despise PETA's practice of throwing [fake] blood on people over the killing of animals, it seems amazingly appropriate to do the same to the many Congressmen who are so quick to support the cycle of industrial dependence on war yet so slow to support the cycle of veteran dependence on the VA.

Let us not even get into the whole point, as others have raised, of how the news media is incredibly complicit in manipulating the populace's emotional state, especially when it comes to supporting wars. If it bleeds it leads, and what better way to make things bleed than a war? They just need bags of soldier blood for the Senators to bathe in.

Advertising? (1)

psnyder (1326089) | about 1 month ago | (#47348931)

Advertising frequently uses psychological pressure (for example, appealing to feelings of inadequacy) on the intended consumer, which may be harmful.

Criticism of advertising [wikipedia.org]

...was my 1st thought when reading...

"If you are exposing people to something that causes changes in psychological status, that's experimentation," says James Grimmelmann, a professor of technology and the law at the University of Maryland. "This is the kind of thing that would require informed consent."

One could argue that advertising is not always done with informed consent.

Let's hope misery is not profitable. (0)

Anonymous Coward | about 1 month ago | (#47348937)

It is hard to understand how this passed an ethics board when harm to users was a predictable outcome. Increased rates of self-harm and suicide are realistic prospects when you deliberately try to make people unhappy.

par for the course (1)

zer0sig (1473325) | about 1 month ago | (#47348953)

They are constantly screwing around with everything else, breaking this, fixing that, changing this, etc. I don't find it surprising that Facebook would look at this as a social experiment and neglect to consider the human emotion manipulation element. However, it is telling that this sort of thing goes on, and if anyone is shocked or offended by this, then they might want to invest their time and energy in another form of social media. I hear G+ is nice.

Wacky libertarian world of Slashdot still around (-1)

Anonymous Coward | about 1 month ago | (#47348959)

I would hope that there are some hungry lawyers who want to sue the socks off Facebook for emotional abuse, as that is what their 'experiment' amounts to. But a lot of Slashdot users are stuck in the 'anything goes on the internet because it is technology' era, which I submit is rapidly drawing to a close.

Natural vs randomized experiments (3)

sjbe (173966) | about 1 month ago | (#47348971)

Given that Facebook has over half a billion users, it’s a foregone conclusion that every tiny change Facebook makes to the news feed or any other part of its websites induces a change in millions of people’s emotions. Yet nobody seems to complain about this much...

If this guy actually thinks nobody complains about this much, then he isn't paying attention. However, putting that aside, his argument is a straw man. There is a VERY significant difference between changing a service, where the change happens to have an emotional impact, and actually experimenting on the emotions of your customers directly, without their permission and without even so much as review by an independent review board. Anyone who can't comprehend the difference between the two has a pretty big ethical blind spot. The fact that Facebook seems to be genuinely surprised by this response tells me everything I need to know about how they regard their users. They see them the same way an entomologist sees bugs - something to be cataloged and experimented on, but not worthy of the respect one normally gives other human beings.

–presumably because, when you put it this way, it seems kind of silly to suggest that a company whose business model is predicated on getting its users to use its product more would do anything other than try to manipulate its users into, you know, using its product more

There is a big and fairly bright line between observing users' behavior given certain stimuli, as a natural experiment [wikipedia.org], and the experimental investigators manipulating those users directly, without their permission, in a designed experiment. The latter generally requires informed consent [wikipedia.org] for a variety of very sensible reasons relating to ethics. The fact that emotional manipulation is done in other contexts is utterly irrelevant. That's the same argument children make when they claim that "...but all my friends are doing it too". I suppose since Facebook is owned and run by an immature child billionaire, I shouldn't be surprised.

And no, the Facebook terms of use does NOT rise to the level of informed consent.

Re:Natural vs randomized experiments (1)

fey000 (1374173) | about 1 month ago | (#47349407)

The fact that Facebook seems to be genuinely surprised by this response tells me everything I need to know about how they regard their users. They see them the same way an entomologist sees bugs - something to be cataloged and experimented on but not worthy of the respect one normally gives other human beings.

And how can this surprise you? Have you ever heard anything at all about Facebook respecting the privacy of its users? In fact, again and again and again Facebook ends up in the news with an anti-privacy scandal on its hands.

I am not saying that running social experiments on random people is a great idea (though it is funny); I am saying this is a 'no biggie' because it is neither surprising nor out of line with previous actions. That doesn't make it right, but anyone with half a brain should have seen it coming five years ago, and stopped using social media platforms four years ago. The only people who need to have these accounts are the marketeers. The rest will get much better 'social' results using 1) a phone and 2) mouth + ears.

I suppose since Facebook is owned and run by an immature child billionaire that I shouldn't be surprised.

And yet you appear to be. Facebook is run by a greedy thief, and you expect non-greedy-thief behavior. That is inconsistent.

Unethical (1)

aaaaaaargh! (1150173) | about 1 month ago | (#47348973)

I'm a postdoc at a university, though not in a field in which you usually study human behavior. Anyway, if I experimented on humans without their prior consent, I'd lose my job. In every application for a project that involves studies on animals or humans there is an ethics form to fill out, and I have to wonder how they got funding without cheating on one of those forms.

Lying to test subjects is to some extent necessary, of course, or otherwise research in psychology would be almost impossible. However, conducting experiments on humans without their prior consent is unethical. Everybody knows that. Whoever conducted this study needs to be investigated by an ethics committee.

My 2 cents.

Re:Unethical (0)

Anonymous Coward | about 1 month ago | (#47349129)

It seems a bit hazy whether they did or didn't receive IRB approval. An interesting point is that companies can constantly manipulate people with almost no oversight. Marketing is not meaningfully different from a psychological experiment - except the participants are uninformed, can't opt out, there is no ethical oversight, and the only justification is profit (rather than the betterment of human knowledge and society). It's unsurprising that this happens. We as a society hold dearly to the narrative that we are free agents coursing through life making decisions independently. No one lives in a vacuum. We are all under control - however, sometimes those sources of control seem less obvious. The following article highlights one key realization that should arise from this fiasco and be acted upon: companies are free to manipulate people with less oversight than a professor emeritus and Nobel laureate at Harvard. [Thinking that it's OK to just let biz execs manipulate people is beyond reason.] http://www.thefacultylounge.or... [thefacultylounge.org]

Re:Unethical (1)

Sockatume (732728) | about 1 month ago | (#47349441)

They did receive IRB approval; however, the protocol listed in the paper expressly breaches one of the IRBs' rules, and may breach several others depending on how the study was performed. It shouldn't have been approved.

Re:Unethical (0)

Anonymous Coward | about 1 month ago | (#47349161)

The best kinds of experiments are the ones where the subjects have no idea the experiment is happening.
The Milgram experiment, and many others (the army once sent troops into a forest and shot live mortar shells near them to see the effects of stress; do you think they were informed of anything?).
All this "ethics committee" bullshit is what's stopping progress in all of psychology.
My 2 cents.

Re:Unethical (1)

mr100percent (57156) | about 1 month ago | (#47349203)

I agree; this was my first thought. They screwed up big time. It would be fun to see the federal government investigate them for unlicensed human research.

Yarkoni misses the point (2)

Registered Coward v2 (447531) | about 1 month ago | (#47348981)

Facebook didn't simply set out to make tweaks and see how users responded; they set up a controlled experiment on subjects without their consent, a practice that appears to violate ethical and possibly legal guidelines for behavioral research. I agree it could push them to continue to do such research and not reveal it, but when it inevitably leaks that they are doing that, it will create a PR nightmare. Facebook could have simply asked people to opt in to the study and provided the standard information regarding the study, and this would be a non-issue. For those looking for info on human research protection guidelines in the US, google the Office for Human Research Protections.

Re:Yarkoni misses the point (1)

Trepidity (597) | about 1 month ago | (#47349287)

without their consent

What's actually more problematic to me is that the paper explicitly claimed they asked for and received "informed consent". But their justification is that users agreed to the Facebook EULA. That is a serious misunderstanding of what constitutes informed consent in research ethics; it does not just mean that someone agreed to some fine print, possibly months ago, in a transaction unrelated to the current study.

If they want to argue that this doesn't require informed consent at all, because it's e.g. just data mining of effectively existing data, that would be less problematic imo than watering down the standard for informed consent to include EULAs.

Re:Yarkoni misses the point (2)

Registered Coward v2 (447531) | about 1 month ago | (#47349371)

without their consent

What's actually more problematic to me is that the paper explicitly claimed they asked for and received "informed consent". But their justification is that users agreed to the Facebook EULA. That is a serious misunderstanding of what constitutes informed consent in research ethics; it does not just mean that someone agreed to some fine print, possibly months ago, in a transaction unrelated to the current study.

If they want to argue that this doesn't require informed consent at all, because it's e.g. just data mining of effectively existing data, that would be less problematic imo than watering down the standard for informed consent to include EULAs.

I agree, with an added thought. It wasn't just data mining but a controlled experiment that altered the data participants received. That, IMHO, crosses the line from "let's look at the existing data" to "let's conduct an experiment."

Another part of his argument seems to be "the impact was so small as to be negligible, and thus it was OK." However, the researchers did not know the results would be negligible, so using that as an excuse after the fact doesn't fly.

Re:Yarkoni misses the point (1)

PvtVoid (1252388) | about 1 month ago | (#47349421)

Facebook didn't simply set out to make tweaks and see how users responded; they setup a controlled experiment on subjects without their consent; a practice that appears to violate ethical and possibly legal guidelines for behavioral research.

Bingo. Advertisers may do this sort of thing all the time, but they don't get it published in peer-reviewed scientific journals without adhering to standard human research protocols. PNAS should immediately retract the article, and the researchers involved should be censured and stripped of funding.

And people who don't want to be experimented on without consent should just fucking quit using Facebook.

"Music hath charms to soothe a savage beast" (0)

Anonymous Coward | about 1 month ago | (#47348989)

It goes the other way too, & I said all I had to about that here in this exchange -> http://tech.slashdot.org/comme... [slashdot.org]

* It's VERY easy to 'sway people' & their emotions based on what their environs shows them (in sounds, what you see, & yes - what you read too!).

Psychology & Psychiatry are the MOST dangerous 'sciences' & imo, least understood (statistics & shrinks notwithstanding) - however, parts of what we DO know do actually work... this is one of them (& any media you consume is an avenue to that).

APK

P.S.=> Makes me wonder sometimes what the 'end goal' of such emotional manipulation is on the parts of those practicing it - it can be "for the absolute good" or "the absolute bad" is all - instead of using it for "the bad" why NOT use it, for the "good" (relative terms of course, but I think most normal folks here catch my drift on this account)... apk

NSA defense (0)

Anonymous Coward | about 1 month ago | (#47348991)

Facebook's defense is similar to the NSA's defense. Makes one wonder if they are run by the same (type of) people.

Um (0)

Anonymous Coward | about 1 month ago | (#47349025)

> "If you are exposing people to something that causes changes in psychological status, that's experimentation,"

That's any and all communications ever made by anyone to anyone else through any platform ever.

Don't fucking care (0)

Anonymous Coward | about 1 month ago | (#47349029)

I don't fucking care, just show me the items in my feed, that means everyone I'm subscribed to, in the order they were posted.

There is no other acceptable answer and any automated rearrangement is unacceptable.

My Facebook feed (1)

d3bruts1d (639027) | about 1 month ago | (#47349041)

I don't know about anyone else, but my feed is usually full of people complaining, arguing, or just pissed off. I always thought this was the norm for FB. I'd hate to think I know this many unhappy people.

Slow news day . . . (1)

Kimomaru (2579489) | about 1 month ago | (#47349047)

For real, THIS issue bothers FB users? I'm speechless; you never know what's going to matter to someone.

Creating emotional response is not the issue (5, Interesting)

CodyRazor (1108681) | about 1 month ago | (#47349067)

The problem is not that they attempted to create an emotional response or manipulate people's emotions. As people are constantly pointing out, advertisers do that all the time. People don't seem to grasp that there is a large difference between this and advertising.

The problem is the way it was done. People use Facebook with the expectation that they are seeing a (reasonably) objective representation of what their friends are trying to express or convey. Facebook is the equivalent of the telephone in a telephone call. If the telephone somehow manipulated what you heard to make your friend sound more negative or positive without changing their core meaning, that would be unethical without informed consent, just as this is.

A more extreme version would be Facebook subtly modifying the content of what your friends post, as it appears to you, without anyone knowing it was doing this. That would be even more unethical. The problem is misrepresentation: the method by which they attempt to manipulate emotions.

Quite Simple (0)

Anonymous Coward | about 1 month ago | (#47349105)

Think for yourself. Don't rely on some social network site to think for you or influence the way you think. Leave the flock and quit being a sheep.

Re:Quite Simple (1)

oh_my_080980980 (773867) | about 1 month ago | (#47349253)

What the fudge does that have to do with anything? Facebook was promoted as a way for people to share information. Businesses use Facebook as a way to promote their products and provide specials. Nowhere is it clearly stated that you will be used as a subject in an experiment without knowing about it. There's no informed consent. If people willingly participated, that's fine; based on some of the comments, it sounds like some people would have.

You didn't pay, they can do whatever they like. (1)

Foske (144771) | about 1 month ago | (#47349107)

Sad but true. Then again, 99.9999999% of the users still wouldn't read the EULA even if they had to pay millions, so they still could get away with it.

Re:You didn't pay, they can do whatever they like. (0)

Anonymous Coward | about 1 month ago | (#47349171)

Pay or not they're going to do what they want.

They could have done this EULA or no.

Re:You didn't pay, they can do whatever they like. (1)

oh_my_080980980 (773867) | about 1 month ago | (#47349245)

Which still would not constitute informed consent. When taking part in an experiment, the participant must be informed; you are not automatically enrolled with merely the option of opting out. The fact that people don't grasp this simple concept is fudging scary.

Outrage due to Censorship, not the test (4, Interesting)

Anonymous Coward | about 1 month ago | (#47349169)

I talked to several (non-tech) friends about this, and they were more upset about Facebook "censoring" out posts than the emotional manipulation. In their minds, Facebook allows everything to be shown, but certain topics gain preference due to likes or dislikes. However, they will show you everything if you scroll far enough.

Their outrage came from the thought that FB was removing "happy" content from their feed. (That it was no longer a "dumb" pipe for social data).

Bleeding eyes (0)

Anonymous Coward | about 1 month ago | (#47349285)

The "defense" is painful to read, but boils down to this: stop complaining, or industry may decide to be an even bigger dick. Call me silly, but that doesn't sound like a very good defense.

Consent is required (1)

Manfre (631065) | about 1 month ago | (#47349393)

They conducted a psychology experiment without the consent of the test subjects. I'm not sure what the rules are for private organizations, but I do believe that any publicly funded researcher involved in the experiment, or possibly those who use the results, would be at risk of losing all federal funding. I really hope some lawsuits are filed against Facebook and any of the researchers, because this shouldn't creep into becoming an accepted norm.

What can I say? (0)

Anonymous Coward | about 1 month ago | (#47349475)

If you are stupid enough to use Facebook... Facebook is not necessary. Life was fun and good before Facebook, and it still is without Facebook.

Isn't the FB Newsfeed a giant experiment anyway? (2)

swb (14022) | about 1 month ago | (#47349489)

I quit using Facebook six months ago, but for a couple of years was a regular user.

The "newsfeed" always struck me as enormously manipulated, with Facebook constantly altering the algorithm that determines what you're shown. Even nontechnical users would comment about this, wondering why they didn't see some posts from some people some times.

Some of this may have been benign: trying to figure out what order to display posts in relative to relationships and posting frequency, the sort of ordinary attempts to sort out "importance".

But I'm sure there was commercial manipulation -- ranking user comments with links to advertising-affiliated sites higher than non-affiliated sites, downranking links to sites likely to lead a person to shorten their Facebook session, etc.

All of this could be considered "manipulation," even though there might not be one single motivation behind it and not all the factors may even be focused on a specific outcome.

Human experimentation needs close supervision (3, Insightful)

Opportunist (166417) | about 1 month ago | (#47349595)

There is no fine line here; there's only a bold one. Does it involve humans? If so, not only is tight ethical supervision required (to avoid a Milgram scenario) but, and this is the even more important part, the active and willing consent of the participating people is required.

Anything else, no matter how "trivial" it may seem, is simply and plainly wrong. And no, some clause hidden somewhere among a few billion lines of legalese in an EULA does NOT constitute consent to being a guinea pig!
