Who is Responsible? The Developer? The User?

Cliff posted more than 14 years ago | from the is-finger-pointing-really-necessary dept.

The Courts 376

Anonymous Coward II asks: "I am working on a paper for a computer ethics course, and need to answer the following question: Who must be held responsible: the person who develops software that will (or can) be used for illegal ends (such as breaking into a computer system, illegally monitoring other users, or a virus), or the person who uses it afterward? I'd like to know what Slashdot users think, and what the answer is according to the law." Software is a tool like any other, so when things go wrong I think this boils down to a question of personal responsibility or negligence. What are your opinions?


A gut reaction (1)

Nodatadj (28279) | more than 14 years ago | (#1508017)

Having not thought about this very long, my gut feeling is that if the program has a legitimate use (BO2K, for example, can be used as a remote admin tool), then the person who misuses it is to blame. But if there's no legitimate use, then the author should be held to blame.

The only problem is working out what constitutes legitimate use.

Definitely the user... (3)

JatTDB (29747) | more than 14 years ago | (#1508018)

I don't blame gun manufacturers or knife manufacturers for murders. I don't blame car manufacturers for drunk drivers. And I don't blame developers for writing software that could be used in an illegal way.

No tool w/o Health Warning (2)

derwisch (65084) | more than 14 years ago | (#1508019)

You should be held responsible if you release a tool that is potentially dangerous without indicating it. Companies like AOL make an internet connection look like a breeze, and are therefore responsible for hapless users unknowingly offering their box as a spam hub.

At least this was the essence of our Linux User Group's discussion last night.

Are we writing a paper for him? (1)

Cheesemaker (36551) | more than 14 years ago | (#1508020)

Man, I need to figure out how to get Slashdot to do all my research for me......

Tools or Weapons? (1)

CyberMandrake (110333) | more than 14 years ago | (#1508021)

If someone is killed with a hammer (in another person's hand, of course :-), is the hammer manufacturer responsible for the crime? What if the murder was committed with a gun (pistol, shotgun, ...)? Or a chainsaw? Well, this is a polemical issue. The truth and justice must lie somewhere between the two sides of the question; I believe a judge must examine each case separately.

Depends (0)

Anonymous Coward | more than 14 years ago | (#1508022)

If software has a legitimate use (e.g. SATAN), then it is the end user's fault.

However, a program that launches smurf attacks, or a virus, would make the developer a bad guy as well.

well, by precedent... (1)

matman (71405) | more than 14 years ago | (#1508023)

Well, if you consider gun makers, cigarette makers, and other such organizations who produce harmful products that are abused by others, it's obvious that the law sees the user as the responsible party for whatever use the product is put to.

Of course, a virus writer who writes an infectious virus and 'accidentally' leaves it where someone can find and distribute it would be guilty of at least negligence... so the producer wouldn't be totally off the hook.

It's like this: if I make peanut butter and someone is allergic to peanuts, I shouldn't be held responsible if they eat some, unless I mislabeled it and made it not taste like peanut butter. And if someone hid some of it in the allergic person's food, I don't think I should be responsible at all. Many of the hacking apps are demonstrations of exploits or are legitimate tools to test one's own network... therefore they have legitimate uses too, and are not made expressly to cause trouble :)

Depends on the situation (1)

CodeMonky (10675) | more than 14 years ago | (#1508024)

I think a lot of it depends on the situation.
A gun manufacturer cannot be sued if one of its guns is used to kill someone; they are simply making a 'tool' to be used for personal protection. More and more these days we are seeing exploits released that are disabled (broken shellcode) or that simply print a message ('I h4v3 0wn3d j00!') while still getting the point across that there is a problem with the software.
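The "defanged" proof-of-concept style described above can be sketched in a few lines. Everything here is hypothetical and illustrative only: the buffer size, the marker string, and the function name are invented, not taken from any real exploit; the point is simply that the padding is followed by a harmless printable marker instead of shellcode.

```python
# Hypothetical sketch of a "defanged" proof-of-concept payload.
# Instead of working shellcode, the overflow-length padding is followed
# by a harmless printable marker, so a vulnerable service would merely
# log the marker rather than execute anything. All names and sizes here
# are invented for illustration.

BUFFER_SIZE = 64                    # assumed size of the vulnerable buffer
MARKER = b"DEFANGED-POC-MARKER"     # stands in where shellcode would go

def build_poc_payload():
    """Return overflow-length padding plus a harmless marker."""
    return b"A" * BUFFER_SIZE + MARKER

payload = build_poc_payload()
print(len(payload), payload.endswith(MARKER))   # prints: 83 True
```

A crippled release like this still demonstrates that the hole is real, while forcing anyone who wants a weapon to understand the bug well enough to rebuild it themselves.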

It can be either - i guess (1)

jdigital (84195) | more than 14 years ago | (#1508025)

My totally uneducated guess:

If the software can be used for legitimate purposes, then you could argue any illegal usage is the fault of the user.

However, if the software promotes illegal activities, then the fault lies with the programmer.

Responsibility (1)

wmtrexler (101720) | more than 14 years ago | (#1508026)

Obviously the one who commits the crime should be held responsible. There is nothing in this world that cannot be used in a wrong way.
Cars get you to work... but they can kill.
Email can be used for fast, reliable communication... spammers can flood your inbox with garbage.
Scripts can automate frequently used processes... the Melissa virus.

In the end the question becomes similar to the statement: guns don't kill people, people kill people.

The obvious answer... (1)

frinky (35152) | more than 14 years ago | (#1508027)

is that it depends on the design of the tool. A gun, for example, can be used for assault or self-defence, so the responsibility for its use comes down to the user. On the other hand, if a company designed and sold an item that only had a negative use, e.g. car bombs (extreme example, I know), then both the company and the user should be held responsible. Just common sense.

Like everything else in life (0)

Anonymous Coward | more than 14 years ago | (#1508028)

Tools never have ethics; people can. It's too bad that so few understand this and are willing to take responsibility for their own actions. As a culture we seem to be moving away from individual responsibilities (and therefore rights) toward state protections: laws that reduce freedoms in an attempt to prevent irresponsible actions. You've opened a can of worms with this one...

No ! (2)

Bouglou (109816) | more than 14 years ago | (#1508029)

But what about software that has NO legal use? Viruses are such things. A gun allows you to defend yourself, not only to attack others. I think that for some kinds of software, the developer HAS a responsibility.

Cracker Software & guns (1)

Lexel (110539) | more than 14 years ago | (#1508030)

Look at other tools that are used to break the law:

Tools to break into houses and guns. They can be developed legally and they can be sold legally (possibly with restrictions). Why should software be different?

You can hardly draw a line between software that *can* be used for illegal purposes (almost anything) and software that is built to break into other people's computers. Look at Back Orifice: first a cracker's tool, now a remote administration toolkit. It is much easier to see that automatic guns can only be used for evil purposes (killing people), yet their development and production is normally legal.

- Alex

Hm - What about Colt's decision (0)

Anonymous Coward | more than 14 years ago | (#1508031)

Didn't Colt decide to get out of the handgun market because they thought they could be held liable for what people did with guns made by Colt?

Seems like the exact same situation.

a look at the latest trends... (0)

Anonymous Coward | more than 14 years ago | (#1508032)

Given that people are suing gun manufacturers left and right for crimes committed with guns... the survey will probably say... everyone is responsible for everything.

long live the legal mafia...etc, etc.

Re:Definitely the user... (1)

Ender Ryan (79406) | more than 14 years ago | (#1508033)

Yes, I agree to an extent. But what if a gun manufacturer was selling rocket launchers to terrorists? (A bit extreme, I suppose.) How is that different from a developer writing programs that have no purpose other than causing damage, and distributing them to script kiddies?

HNN's take (3)

Ratface (21117) | more than 14 years ago | (#1508034)

The Hacker News Network has been asking much the same question. Anti-virus companies have been labelling some programs that allow remote undetected monitoring of a computer as viruses (e.g. BO2K), while other products released by "mainstream" software companies (such as Softeyes) are not scanned for at all.

What makes an anti-virus company label one program a virus, while another program with similar uses goes unlabelled?

HNN asks the question at ustry.html []

The user (0)

Anonymous Coward | more than 14 years ago | (#1508035)

Any tool can be used for legal or illegal purposes, good or evil if you prefer.
The more powerful the tool, the more potential danger.
My personal favourite example is a car: a gassed-up running car is far more dangerous than a loaded gun; just compare driving into a crowd with shooting into a crowd.
We have to educate people in the importance of proper social behaviour; that stands a much better chance of producing 'appropriate' usage than simple restrictions and the blame game.

It all depends (2)

sufi (39527) | more than 14 years ago | (#1508036)

There are so many other variables.

It's all a question of 'intent'.

Ultimately if someone is knowingly using software for illegal things then they are responsible, end of story.

However, you can also argue that the people who develop the software can be held accountable for enabling people to perform these illegal actions, in the same way that in the UK it is illegal to sell certain guns to people unless they specifically hold an owner's licence.

Then again, people use Windows, Linux, and all sorts of other things for illegal purposes; Visual InterDev creates programs that do illegal things all the time (haha, sorry, had to throw that one in).

It's an interesting ethical question: creating software purely for illegal purposes is indeed unethical, but it *can* be a very fine line.

Can I have your diploma? (0)

spiffyboy (31591) | more than 14 years ago | (#1508037)

If I do your homework, will I get your diploma too?

the user (1)

dave_lister (104424) | more than 14 years ago | (#1508038)

If I produce lock picks for sale to locksmiths and you manage to get hold of a set _and_ get caught possessing them, you can be charged federally. Although it's true that gun manufacturers may be held responsible for the damage caused by their weapons, I don't believe that applies here. If I write a security tool and you use it to violate one or more laws, you should be the one to go to jail.

Guns don't kill people, I kill people (1)

matman (71405) | more than 14 years ago | (#1508039)

Hehe, there's another precedent! UHF.

Don't blame inanimate objects! (1)

btlzu2 (99039) | more than 14 years ago | (#1508040)

To me, the answer is simple: the consequences should lie with the person who used the software to hurt someone or damage something. Where does the finger-pointing stop? If someone writes an extremely lethal virus and compiles it using gcc, is somebody going to try to blame GNU for providing the tools to build the virus? There is a major problem today, in the US at least, of blaming everyone and everything remotely associated with someone bad (think: Doom and the Columbine wackos). As much as I personally dislike guns, I believe the same argument holds there as well... guns don't shoot themselves. Blame the responsible person: the one who committed the act!

Rephrase (2)

rde (17364) | more than 14 years ago | (#1508041)

Like pretty much everyone else, I've got to say that it depends on context. In nearly all cases, though, I'd be inclined to blame the user.
I'd like to rephrase the question slightly, though.

Does the fact that a Virus Construction Kit can be used by sysadmins to aid in network defense justify its existence?

Re:Yes! (was: No !) (2)

Bruenor (38111) | more than 14 years ago | (#1508042)

While a virus might have no legal use, what about studying it to learn about it? A virus is usually a fairly nice piece of code.

When it comes down to it, it's just a series of 1s and 0s, like any other software. It's up to the user to use it responsibly.

Tool / Ethics / Homework (1)

VSc (30374) | more than 14 years ago | (#1508043)

I should presume it is *your* opinion that matters in the assignment, so just do your homework!

I agree with a previous poster that the one who actually commits anything should be held responsible; however, what about drug dealers or tobacco companies (remember those suits)? If a program is designed to be malicious (like a virus) then the author is ultimately responsible. People who run it (like users on infected PCs) are actually victims then.

And then, I never liked ethics lessons.

Re:Are we writing a paper for him? (0)

Anonymous Coward | more than 14 years ago | (#1508044)

The person is just asking for a field opinion and nothing more. Why should someone who asks a question on Slashdot for their paper be accused of having us do all of the research? Law books are very hard to read (try understanding IDEA 97 some time) and this person is just looking for a concise answer. Please do not jump to conclusions.

is GM responsible for drunk drivers? (1)

one_who_uses_unix (68992) | more than 14 years ago | (#1508045)

To get a better feel for the issue, consider other products and their users. If a drunk driver kills a pedestrian, can we sue GM for making the car? If someone uses a steak knife to kill or maim another person, can we sue Tramontina for making the knife? If I wrote "rm", can someone sue me when their disk gets wiped?

Re:A gut reaction (0)

Anonymous Coward | more than 14 years ago | (#1508046)

You can't afford to have a double standard here. I just thank God that the readership of Slashdot isn't actually in the legislature sometimes.

This raises the question of whether gun makers are liable for the uses of the public. The NRA says there are lots of legitimate uses: sport shooting, hunting, collecting. Opponents say those are all feeble reasons at best.

Ultimately it isn't a question of liability. I can't see how anyone with an ounce of educated thought (American juries excluded; who knows what they think about) could reasonably believe that gun manufacturers are actually liable.

Is Ford, Chevy, or Chrysler going to be held liable for all the death on the road this weekend?

No one is (1)

Mo B. Dick (100537) | more than 14 years ago | (#1508047)

Neither of those two is responsible. I think the responsible party is the company that leaves security holes that allow for cyber intrusions.

The gun analogy (2)

cluke (30394) | more than 14 years ago | (#1508048)

I notice that a lot of posters are using the gun analogy, in that gun manufacturers are not to blame for shootings. But if you look at this link on the BBC [], it seems that people *are* suing gun manufacturers, or at least makers of assault rifles, as these are not 'self-defense weapons'.

I think we can stretch this to malicious software too, e.g. viruses. But then, what if you were to write viruses for 'educational' purposes? If you write cracking software, I think you'd have to be prepared to face some legal action.

Re:well, by precedent... (1)

CoolHnd30 (89871) | more than 14 years ago | (#1508049)

"if you consider gun makers, and cigarette makers and other such organizations, who produce harmful products, abused by others"

Guns are not inherently dangerous products at all. Used properly, at a target range, a gun will do no harm to anyone whatsoever. Cigarettes, on the other hand, will always harm the user, even when used properly, so please don't compare the two!

The intent points at the responsibility (2)

substrate (2628) | more than 14 years ago | (#1508050)

Many years ago Dan Farmer authored a paper with a title similar to "Improving the Security of Your Site by Breaking Into It". The paper illustrated a number of means of hacking into a system, some of which, sadly, are still very possible today. His intention wasn't to be the first enabler for script kiddies; it was actually to make the internet a better place by improving security. His thesis was that the best way to counter these attacks was to learn to think like your attacker. He didn't have any hidden motives. It wasn't like a lot of self-proclaimed security experts who say they're producing material to enhance security, with a few concealed winks and nods to the script kiddies. He went on to write SATAN later. Apparently educating system administrators and programmers didn't help: buffer overflows were still rampant, critical security patches weren't applied, and the internet itself was rapidly growing. It wasn't just touching the most wired of the geeks anymore but was starting to become part of the general public's experience. SATAN was an automated audit system. Some moron at SGI even fired him over it.

Both of these could be exploited and abused, and both were. Dan's intentions were still honorable, though. Yes, it was possible that they could fall into the wrong hands, but both items, in the right hands, could help armour your systems against these attacks. It's a failure of system administrators everywhere that script kiddies COULD use these tools against them.

The responsibility here is firmly planted on two groups. Foremost are the abusers of the tools: just because somebody leaves the doors open doesn't give anybody the right to exploit it. The administrators who were compromised by things which were EXPLICITLY EXPLAINED OR AUDITED also bear some responsibility to any users who were affected. Ideally I'd love to see the day when script kiddies are locked away or otherwise punished (I loved somebody's suggestion the other day of forced community service teaching computer skills) and administrators who are proven not to have been diligent in applying patches are open to financial repercussions.

Some groups write scripts for the sole use of script kiddies. They may claim they're writing security tools, but I find it hard to believe them when comments in the source code proclaim "// n4m3 0f z1t3 t0 b3 0wn3d", so they're liable. They're purposely producing tools to enable computer crimes.
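The core of an automated audit tool like SATAN can be sketched very simply: probe a host for well-known services and report which ones answer. This is only a minimal sketch of the probe loop; a real auditor goes on to check service versions against known holes, and the host and port list here are purely illustrative.

```python
# Minimal sketch of an automated network audit in the spirit of SATAN:
# try to connect to a few well-known TCP ports and report which answer.
# A real audit tool checks what is listening against known weaknesses;
# this shows only the basic probe loop.
import socket

def open_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` that accept a TCP connection."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:   # 0 means connected
                found.append(port)
    return found

# Audit the local machine for a few classic services (ftp, ssh,
# telnet, smtp, http); the output depends on what is running.
print(open_ports("127.0.0.1", [21, 22, 23, 25, 80]))
```

The same dozen lines serve an administrator auditing their own network and a script kiddie sweeping someone else's, which is exactly the dual-use point being argued in this thread.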

A fine line... (1)

akey (29718) | more than 14 years ago | (#1508051)

To answer the question -- both author and user should be held responsible, to varying degrees and depending on the circumstances. It should be based in part on the severity of the damage and in part on the intentions of the people involved. Intent to cause harm is hard to prove, however.

Another problem you're going to have is with the idea of punishing people for writing programs which cause (or simply are capable of causing) harm. Compilers, interpreters, and even good ol' DOS DEBUG are examples of programs that can be very useful but can also be used for destructive purposes. I think we have to look back again at the overall intent of the person writing or using the program.

No No and No (1)

Bouglou (109816) | more than 14 years ago | (#1508052)

1. You don't need to look at a virus to see a nice piece of code. I'm sure things like demos are also REALLY nice pieces of code.
2. When you write a program, you offer somebody the possibility to do something. With a 'normal' program, you give the user the ability to achieve a certain kind of work. When you write a virus, you give him the opportunity to do nasty things he could not have done without you (well, without you and the other virus authors, but let's forget the "I'm not the only one!" excuse). So YOU are responsible.

RIAA, Diamond Rio, and Napster:Ethics and Legality (1)

wynlyndd (5732) | more than 14 years ago | (#1508053)

I usually believe a technology/process/program/etc. to be neither good, bad, illegal, nor evil. Only what we as users of said technology do with it is good, bad, illegal, etc. In fact, this is basically one of the reasons why the RIAA's suit against the Diamond Rio failed: the Rio's sole intent was to play MP3s, and the human is the one who supplied it with MP3s, be they legit or non-legit. This is also the same type of argument given by the makers of Napster. Napster is a distribution and search method for MP3s. Of course they warn people against releasing copyrighted works. "We are a way for unknown bands to get their MP3s out," the makers of Napster have basically said. Unfortunately, how does one search for bands one didn't even know existed, if the search fields are artist and song title? This is where I think the RIAA's suit may get them. So it would seem that Napster is primarily a tool for searching and distributing known works... almost all of which are copyrighted.

The Diamond Rio is ethical. Napster will probably be found not to be ethical.

Re:Yes! (was: No !) (3)

Bruenor (38111) | more than 14 years ago | (#1508054)

Okay, I hate to reply to myself, but I just found another reason:

I'm on BUGTRAQ. I have been for quite a few years. Often a security problem is found and a commercial vendor remains unresponsive until someone produces a working exploit. Then, once the world has access to the exploit, the vendor usually begins work on a patch. Sometimes it's the only way to get their attention.

Now, the exploit itself has no legal purpose when you use it. It could perhaps serve as an educational tool to explain buffer overruns, race conditions, or whatever. But often someone needs to write it and publish it, or the vendor will never do anything about it.

Having virii and exploits around should make us all more conscious of security: more prone to check our software providers, to check digital signatures, and more apt to want to see the source code.

The world is not a nice place and people would attempt to break into machines anyway. If having virii and exploits out there increases the level of security in software and systems then I am all for it.
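The habit of checking what you install can be as simple as comparing a published digest. A minimal sketch, using only Python's standard `hashlib`; note that a real release would be verified against a detached cryptographic signature rather than a bare checksum, and any file names or expected digests you plug in here are your own.

```python
# Sketch of verifying a downloaded file against a published checksum,
# the low-tech cousin of the digital-signature check mentioned above.
# hashlib is in the Python standard library; the paths and digests a
# user supplies are their own, nothing here is from a real vendor.
import hashlib

def sha256_of(path):
    """Hash a file in chunks so large downloads fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path, expected_hex):
    """Compare the computed digest with the vendor-published one."""
    return sha256_of(path) == expected_hex
```

A checksum only proves the file wasn't corrupted or swapped in transit; it's the signature on the checksum that ties it back to the author, which is why projects publish both.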

Crowbars and JackHammers (3)

Father (118609) | more than 14 years ago | (#1508055)

I worked for a contract shop in Florida, and more than once used "hack" tools to get a job done. Occasionally the rules of engagement get you in a bind and you have to work outside those rules to get your job done. We had a source control machine that crashed: dead, inoperable, with quite a bit of source code that we needed to retrieve. We were only able to get the data back by playing the role of script kiddies and using hack software to make the drive accessible. A tool is a tool. Without those tools, my company would have faced a serious financial setback. -- mike

Re:A gut reaction (1)

tosi (118610) | more than 14 years ago | (#1508056)

I see it like this: the user of the "tool", whatever purpose the tool may be for, must always be held responsible for the use of the tool, not the author. Just think: if the NSA put up a badly configured IIS web server and I used FrontPage to change their web site, should M$ be held responsible? (I wish, but... ;) The same must go for software as for hardware: S&W aren't responsible if you use one of their guns to shoot the guy next door.

Re:Definitely the user... (1)

nhowie (38409) | more than 14 years ago | (#1508057)

"I don't blame gun manufacturers or knife manufacturers for murders. I don't blame car manufacturers for drunk drivers."

If the gun or knife were sold to anyone, with no restrictions, and blatantly advertised as being for murder; or the car fitted with 3-foot pointy spikes at the front for killing pedestrians -- what then?

It sounds silly, but that's essentially what things like viruses and the scripts that script kiddies use (what is the name for these, anyway?) are doing.

The bulk of responsibility does lie with the user, and I believe it is the person who uses the utility who should be held to account, but there is a line to be drawn between useful utilities that may be used by the unscrupulous to do illegal things, and programs created with the intention of letting people crack systems more easily.

Some Interesting Questions Here ... (2)

SimonK (7722) | more than 14 years ago | (#1508058)

Off the top of my head, I'd say this is rather like the question of firearms, and I'd say that "guns don't kill people, people kill people" is even more applicable for software than for guns.

Naively making it illegal to produce software capable of being used to break the law would outlaw a lot of vital activity, for instance producing exploits for security flaws, which would be hugely to everyone's detriment. If that were done, the inability of honest, law-abiding people to effectively investigate security issues would be a massive boost to crackers everywhere.

As far as I can see, liability for breaking the law lies with the person whose intent it was to break it. If that is the author of some software (e.g. a program deliberately designed to spread a virus) then so be it, but if the author produces a tool with multiple functions (e.g. BO2K) then he's no more guilty than a man who makes a knife.

There are of course some tricky cases. For instance, a friend of mine once wrote a virus as an exercise and gave it a slightly nasty payload. He never intended to release it, but unfortunately a copy got loose on his hard drive and infected several other machines before it was wiped out. If it had well and truly escaped, and done serious damage, where would the liability lie? Or is it a natural hazard? Possibly there is no criminal liability in that case, but merely civil negligence in failing to contain the virus.


Tough one ... (1)

pvente (89848) | more than 14 years ago | (#1508059)

If I make hammers, and someone kills someone by misusing the hammer, I would not be responsible.

If I yell 'FIRE' in a crowded theater, and people 'use' my product and are accidentally killed in the stampede, I would certainly be responsible.

In either case, I personally didn't do the killing, but the line of responsibility clearly falls on different sides. What if I invented and marketed a product that could only be used to kill sleeping people ? What if it had no other uses at all ? Would I be responsible ?

The difference between who's responsible and who's not is ultimately determined by the 'official' making the legal decision. If that person is to the left of 'center', the line falls more toward the manufacturer; if to the right of 'center', more toward the user. In the end, that's the difference.

terrorists != script kiddies (0)

Anonymous Coward | more than 14 years ago | (#1508060)

Hehe. "Terrorist" is a judgemental word; a terrorist generally has a political goal to achieve and just wrongly feels terror is the most effective means to achieve it.
Whether you agree or not, they are fighting for what they believe in, be it their religious beliefs, money, or democracy.
Script kiddies generally have no goals; they just want to cause shit.
Of course, being a terrorist is a very poor way to achieve your goals anyway.

Question of ethics or law? (4)

evilpenguin (18720) | more than 14 years ago | (#1508061)

I think the law has to treat the person who uses a product for illegal means as the "guilty" party. The person who makes it bears no automatic culpability.

This is my general take. Gun manufacturers are not responsible for murders committed with guns. Now, I'm not a gun nut, but I think this is legally right.

The same should hold true for the authors of nmap and queso (to name a couple tools that system crackers might use) and the authors of pgp and gpg (to name a couple tools that criminals or terrorists might use).

Now, if it is a question of ethics, you've opened an entirely different can of worms. Ethically, I think several guns need a closer look; I think Teflon tips raise ethical questions. I think nmap has a few grey areas (what legitimate use requires the packet-fragmentation feature? It's there just to evade string-scanning intrusion detection), but in each of these cases (except maybe those Teflon tips) I think the law has to protect the author/maker and hold the user accountable.

If we hold that the maker/author is responsible for all the ways in which their product/idea is used, then we should have locked up Darwin because his ideas contributed to the Holocaust. We should lock up the inventor of the circular saw because it has maimed and killed. And so on...

Ethics lies behind law, but the clichéd figure of Justice that adorns so many government buildings (at least so many American ones) wields a scale and a sword, and she is blindfolded. The sword is two-edged as well. It may be a cliché, but it is an apt one. The law is not ethics. The law is the minimum interference needed to maintain the social order. While many conservatives in this country will argue with me about the law being minimal, it is certainly not the opposite. You can write and buy a book about how to crack safes; that's legal. Crack somebody else's safe, and you've broken the law. It seems absurd, but it isn't. Writing a book on how to crack safes is (so long as you believe in the idea of private property) unethical, but I for one would not want to see it made illegal.

Responsibility (1)

penfold2 (88605) | more than 14 years ago | (#1508062)

In my opinion, in a situation like this it is the responsibility of the person/people using a piece of software to ensure that they use it responsibly. If it weren't for this ideal within computing, many systems would not be as secure as they are now, due to a lack of tools to enable people to ensure that their systems are not vulnerable to many of the commonly known attacks.

Some might say that programs which allow this kind of vulnerability detection should not be made available, and that they make it far too easy for someone wanting to break into a server to do so with little or no experience. These people would be ignoring the fact that the information is readily available on the internet already, and that these programs only serve to make life easier. As long as system administrators ensure that they run these programs on their own machines and follow the advice given, they can be in no danger of someone using the software against them. In my opinion, any administrator who fails to do this simple task shouldn't be doing the job.

They would also be ignoring the fact that these programs are not generally written to be easy to use. Anyone managing to make use of them must have at least some experience, and therefore would be able to do the same thing (though not as quickly) without the software to help them.

On the legal side, I don't know if there is any law against this kind of software. But given the prevalence of such software, and the fact that I have heard of no court cases in which an author was sued (I'm sure there are several large corporations who would like to do it), I can only assume that there is no such law, or at least none strong enough to bring a court case against anyone.
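One concrete example of the kind of check these vulnerability-detection tools automate is flagging world-writable files, one of the oldest and simplest misconfigurations. A small sketch under the assumption of a POSIX system; real scanners bundle hundreds of such checks, and which paths you audit is up to you.

```python
# Tiny sketch of a host-audit check of the kind discussed above:
# flag files that anyone on the system is allowed to write to.
# Assumes POSIX permission bits; real scanners run many such checks.
import os
import stat

def world_writable(paths):
    """Return the paths whose mode bits include write-for-others."""
    flagged = []
    for p in paths:
        try:
            mode = os.stat(p).st_mode
        except OSError:
            continue                  # missing or unreadable: skip it
        if mode & stat.S_IWOTH:       # the "others can write" bit
            flagged.append(p)
    return flagged
```

An attacker could run the same loop looking for files to tamper with, which is the whole point of the comment above: the check itself is neutral, and the administrator who never runs it is the one leaving the door open.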

Re:No ! (0)

Anonymous Coward | more than 14 years ago | (#1508063)

I once had a friend who collected viruses. He even wrote some. He did this to learn how to write anti-virus software and to aid in data recovery.

McAfee is one institution that also does a lot of research into not only what is out there, but what types of exploits might occur.

Ultimately it isn't the virus, or the person who wrote it; it is the person who maliciously sent it in a "Whack-a-Bill" program or something. Granted, the author and the perpetrator are _USUALLY_ the same, but it isn't until they perpetrate the crime that they become criminals.

Re:well, by precedent... (1)

matman (71405) | more than 14 years ago | (#1508064)

Hehe, I meant to say potentially dangerous. :)
Though even at a gun range there are still gun injuries... remember that kid who was shot by a stray bullet at a gun range? Or the fact that bullets can apparently backfire? They're a lot safer than cigarettes, of course.

It depends on some reasons (1)

segmond (34052) | more than 14 years ago | (#1508065)

This is a little bit complicated, but not so complicated that a five-year-old cannot get the gist of it. Software is a tool, and like all tools, it can sometimes be used for good or evil. For example, let's look at a gun. Who is responsible, the maker or the one who used it to commit a crime? It depends. If the maker of the gun allowed the gun to be obtained by anyone, then as far as ethics go he is partially responsible. The same applies to software.

In the software world, take a look at Back Orifice. The guys who developed it are partially responsible for all the wrong things that are done with it. I am not against them for releasing it or sharing the information, so please don't attack me. Likewise, the users who used Back Orifice are responsible.

Now, let's take a look at the latest bind overflow exploit that was released not so long ago. The developers of this code are not responsible. The exploit was crippled so that anyone who has not written or read about buffer overflows could not use it. Now, if I were to take this exploit, uncripple it, and then use it, I would be the one to blame, not the developers.

So, as you can see, it depends on why and how the software was released and deployed.

Re:Yes! (was: No !) (2)

Ender Ryan (79406) | more than 14 years ago | (#1508066)

And an atom bomb is just a bunch of atoms... But that sure doesn't mean it can't wipe out a city.

Just as a virus can destroy a network...

If you can sue a gun maker... (1)

briancarnell (94247) | more than 14 years ago | (#1508067)

If you can sue a gun maker for the criminal actions of a third party (which has been allowed in U.S. courts), why not allow lawsuits against the makers of software, such as compilers, that allow people to manufacture viruses?

Consider the Chicago example. It is illegal to own guns in Chicago. People who want to own a gun but live in Chicago drive to the 'burbs, buy a handgun from a legal dealer there, and then transport the gun back to their Chicago residence.

The city is suing gunmakers and dealers saying the dealers are acting in a negligent manner by selling to people they should know are breaking the law.

The same principles could easily be applied to compilers and other software -- by not making sure that the buyers of the software *aren't* going to use it to create viruses, the dealer and manufacturer are negligent.

Or to put it another way, has anyone in the software industry taken any positive steps to make sure criminals *don't* have access to their software? No, they haven't, and that's exactly the grounds that people are going after gun manufacturers in court.

Re:Depends on the situation (0)

Anonymous Coward | more than 14 years ago | (#1508068)

Not true. Several states and cities are suing gun manufacturers (like Colt) for "health-care" related costs.

That doesn't make it right, but our society is sue-happy and responsibility-lacking.

It's not my fault; he made me do it. The gun took control of my mind and made me shoot him.

I would add a caveat to that. (1)

perky (106880) | more than 14 years ago | (#1508069)

Likewise, I wouldn't blame a gun manufacturer for the illegal use of their product; however, I think that legislation restricting the ability to purchase a gun is wholly right. Consequently, I think there should be legislation prohibiting the manufacture of programs whose purpose is malicious.

Now, before I get the torrent of "the internet is not policable" posts and all the rest of the freedom online thing, I have a few more things to say.

I realise that it is much harder to prevent the production of malicious code, since it takes only one person using tools that are freely available. Furthermore, recent incidents (DeCSS etc.) have shown that it is impossible to prevent the circulation of code/binaries on the internet. However, I don't think that difficulty of enforcement is reason enough not to legislate. It would give law enforcers at least some leverage in certain situations that they don't have at the moment.

Secondly, I realise that the line between legitimately useful and malicious is very blurred, particularly with tools that can be used both maliciously and defensively, like port scanners. Again, I see this as a challenge for the courts and legislators, rather than a reason not to even attempt to legislate.

I think that it is also the responsibility of every developer to think about the potential illegal uses of their code, and the damage that their programs could cause. Since it requires a certain degree of brain power to be a developer in the first place, it shouldn't be too hard for everyone to realise that if they write a virus/scanner/exploit and release it to the public, it will inevitably wreck someone's day, and cost someone money. Just put yourself in the position where you miss dinner with your family, have to stay up all night fixing a server because of some script kiddie, or have to postpone the family holiday because your data was wiped off your servers and you missed a contract deadline as a result. It doesn't take long to decide not to release malicious code, does it? Just remember that there's always some arsehole who thinks it's cool to screw things up, and he might be doing it with your code.

Users are to blame (1)

Peabody007 (118611) | more than 14 years ago | (#1508070)

I feel that the user is totally to blame. Much of the software out there can be put to one bad use or another. You can't blame a developer for the software being used in a destructive way any more than you can blame a fork for the user having bad table manners.

Legal Uses (2)

wangi (16741) | more than 14 years ago | (#1508071)

I'd agree with the majority that a user is responsible for the misuse of a tool (and I'd include software in this term).

However, consider tools/processes that have no legitimate use, such as chemical weapons. I believe I'm correct in thinking that the development of chemical weapons is illegal (in most civilised countries). Computer viruses should be considered in the same light.

Re:No tool w/o Health Warning (0)

Anonymous Coward | more than 14 years ago | (#1508072)

Where is the health warning on my car?

On my hammer?

On my kitchen knives?

If you have ever read Douglas Adams, you will soon find yourself living in the asylum, and I will be the only one on the outside, in my little house.

Re:terrorists != script kiddies (2)

Ender Ryan (79406) | more than 14 years ago | (#1508073)

ok, so, what's your point?

Re:Cracker Software & guns (1)

CoolHnd30 (89871) | more than 14 years ago | (#1508074)

If automatic guns can only be used for evil purposes (killing people), what if the time comes when it's necessary to overthrow our government because it has grown extremely tyrannical? Just as our forefathers threw off the yoke of oppression of their government, we should have the means at our disposal to do so. Our forefathers tried to ensure that we had those means through the 2nd Amendment to the Constitution. Now, however, our rights are suppressed. You may call them evil if you wish, but I believe that our citizenry should have even more potent weapons than automatic weapons available, so that we will be able to overthrow our government if there comes a time when it becomes like Nazi Germany or pulls a Tiananmen Square. This may sound radical to some people, but I think it's anti-American to view it any other way. I mean, our forefathers started a revolution for independence from their government mostly because of taxation without representation. Pretty radical dudes.

IMHO, it depends... (2)

jd (1658) | more than 14 years ago | (#1508075)

...on what the software is, and the purpose for which it was written.

To use an analogy, guns are designed to kill. That is their sole function. They'd make lousy can openers. As such, I feel that the makers have a measure of responsibility if people use guns in that manner.

HOWEVER, a measure is just that. A measure. The gun manufacturers don't -make- people use guns that way, that is the choice of the owner, and nobody else.

I guess my point is that responsibility (as opposed to blame) for anyone involved is, IMHO, never 100% and very rarely 0%. Rather, it's the entire spectrum in between.

If a software package has one, and only one, possible function, then the writer or company needs to take some responsibility if people use it that way. After all, that's what it was intended to be used for, and that alone. For the writer or the company to deny any responsibility, on the grounds they didn't actually -use- the program that way, is denial of reality.

MOST programs, though, are multi-purpose. SAINT is an excellent example, being very useful for testing for some of the more blatant security flaws in systems. Yes, it can also be used maliciously, but so can a swiss army knife. That doesn't make either tool necessarily malicious in its own right.

Summary: Where one, and only one, intended use exists in a program, the writer or company should bear some responsibility for people using that function in the manner intended. (NOT blame, just responsibility, and at most 50% of the responsibility.)

Where more than one use exists, the writer or company should bear responsibility no greater than 50% of the fraction of possible uses that are malicious. (The user is never forced to use the program maliciously, so bears at LEAST half of any responsibility, regardless.)

Re:Depends (3)

h2so4 (33298) | more than 14 years ago | (#1508076)

As source code, I wouldn't say that the authors of these programs are necessarily the "bad guy"; the code can provide interesting insights into security flaws.

In the case of a virus, if the developer keeps the code within a quarantined environment which he has authorisation to be using, it seems legitimate. As long as he does not distribute the code to untrusted parties, or release a binary into the wild, he has not really done any damage; it is when this boundary is crossed that he could be held responsible (to some extent) for damage.

How far back are we willing to slide? (0)

Anonymous Coward | more than 14 years ago | (#1508077)

If programmers are made liable for others' misuse of their code, then it's easy to let the argument keep sliding backward. Since programmers provide the tools used for hacking, who provides the tools for programmers? Compiler developers. Who makes compiler development possible? Hardware development. Who makes hardware development possible? Companies combined with some really smart people.

How far back should this line of reasoning be taken?

If you blame the programmers for producing hacking tools, then outlaw programming. However, since there's a programmer wherever there is a compiler, you'll have to outlaw compilers. Since there is a compiler wherever there is hardware, outlaw hardware. Who produces hardware? Companies/corporations such as IBM, Apple, etc. Where did all the hardware ideas start? ENIAC. Why was ENIAC built? The US military wanted it. Why? World War II. Why? Hitler.

Blame all hacking on Hitler. There, now we have identified the responsible party. Someone go convict him.

Bono Vox,

Re:No ! (2)

Coyote (9900) | more than 14 years ago | (#1508078)

A word processor can be used for illegal purposes, but no one would consider holding the developer responsible for that kind of use, and even a virus is not necessarily destructive in nature. For instance, some viruses exist for the sole purpose of testing virus-scanning software.

Where the software was developed for the sole purpose of illegal use, the responsibility is on both the developer and on the user.

If the software was developed for legitimate use, then the responsibility is on the user.

The intent of the software may be a grey area; what was Back Orifice _really_ intended to do? Be a tool or a crack? I'm not an attorney (but I play one on web-tv). IIRC, the state of mind of the developer (what he intended the software to do) is a legal point that may determine guilt or innocence.

But, in any case, the user always has the responsibility to use _any_ software only for legal ends.

Re:Depends (2)

Ech0 (28768) | more than 14 years ago | (#1508079)

I agree... There are programs that can be used for either end. L0phtCrack comes to mind. As a system administrator I find it a very helpful tool for testing password security and for getting into the boxes of an ex-employee (sometimes disgruntled); however, in the wrong hands it can be a dangerous thing. The guys at l0pht are not to blame, because they provide a service.

The guy who writes a malicious virus is another story. Both the coder and the user in this case are equally at fault. I'd be hard-pressed to find a useful purpose for a virus. It causes trouble for end users and even more trouble for admins, who have to keep track of viruses, clean them, repair the damage, prevent them from returning, and explain to the client what it was in the first place and why.

Reasonable use (1)

frobnoid (64717) | more than 14 years ago | (#1508080)

How many pieces of software could NOT be used in an illegal way?
I could write a ransom note in Word on my Windows 95 machine. I could then send it to you via E-mail. I guess that also implicates my AOL software, sendmail and the copy of Eudora you use to read it. I suppose that means we'd better round up their development teams and cart them off to jail.
Grab the Mozilla team while you're at it... I just looked at some illegal pornography and those developers assisted me.

The user should ALWAYS be held responsible.
Developers are blameless... unless they play another role in the problem (such as misrepresenting their software) "Oh yeah, run this 'internet_worm' program, it's even more fun than Zork!"

If I create an AI lifeform, and it commits a crime, who is at fault?

Re:Depends (1)

Sabalon (1684) | more than 14 years ago | (#1508081)

However, what if the smurf program was written to help test a firewall?

You can't just say this is good, that is bad.

Intent is the key - and the intent can't be known until the user has the program to use.

Re:No ! - Get a brain before replying! (0)

Anonymous Coward | more than 14 years ago | (#1508082)

No legitimate uses for viruses? Being interested in viruses myself, I can tell you that *real* viruses (not the Melissa crap) push the limits of both OSes and underlying architectures. They are fascinating and *VERY* educational.

You sound like one of those idiots who would lock me up for playing with something you don't like. I'm really glad you're not in a position of power to enforce such stupid laws.

On the other hand, intentionally infecting other people's machines brings up the question of ethical use. This is a completely different can of worms. Is it illegal to insert malicious code into other people's systems? I should think so.

Software is a tool and the end user holds full responsibility for how he or she uses that tool.

Re:terrorists != script kiddies (1)

perky (106880) | more than 14 years ago | (#1508083)

I don't see your point. The analogy was based on the idea that a coder who develops an exploit/script/whatever and releases it on the internet KNOWS that it will be used for harm, just as an arms dealer who sells to terrorists (or an oppressive regime, etc.) KNOWS that the weapons will be used for harm, and should carry a moral and legal burden as a result. That is not to say that the "users" should escape legal repercussions, but that the blame should not fall exclusively on either the designer or the user.

Re:The gun analogy (1)

Quack1701 (26159) | more than 14 years ago | (#1508084)

And why do you think an 'assault rifle' is not a self-defense weapon? Often the only difference between a hunting rifle and an assault rifle is color. And it doesn't matter if I kill you with a rifle or a spoon; in the end, you're still dead. And it is not the rifle's or the spoon's manufacturer's responsibility. The only reason people are bringing these lawsuits is that you can sue anyone for anything, and they are attempting to put the gun manufacturers out of business because they haven't been able to convince enough people to outlaw the products. Rather sad if you ask me. I think when these people lose their lawsuits, they should be 100% responsible for the legal fees of the manufacturers. These predatory lawsuits will be the undoing of business as we know it.


Disclaimers (1)

Kalper (57281) | more than 14 years ago | (#1508085)

The best answer to this question I can think of is "How is your disclaimer phrased?"

Diamond warns their customers that the Rio product is intended for legitimate uses only; therefore they are not responsible for their customers violating copyright law.

McDonald's did not have anything more specific than "Caution: Hot" on their coffee cups, therefore they were responsible for some woman ordering coffee at a drive-thru and scalding herself with it. They never mentioned that pouring the coffee on yourself was not an intended use, therefore they were liable for her injuries.

wot no latin? (0)

Anonymous Coward | more than 14 years ago | (#1508086)


Re:No ! (1)

bogado (25959) | more than 14 years ago | (#1508087)

A virus could have been written to demonstrate a security flaw. Since for some big companies, like the one we all love, security flaws are only security flaws when there is an exploit available, a virus writer could force a patch to be made: a real good deed. :-)

Seriously though, usually a virus is not "used" by anyone; the virus infects people without their knowledge (I myself have never seen a pop-up saying "installing virus, please wait..." :-) ). Usually the only person who really "uses" the virus is the creator himself when he starts the spreading, and in my opinion that is the unlawful action.

Imagine the following: someone creates a virus to test a design or a virus-scanning technique, uses it in his lab only, and never sets it free. Then a lab janitor, who happens to be a cracker, breaks into the lab, steals the virus, and sets it free. Who is the criminal?

In a few words, my opinion is that coding a virus is not a crime in itself, but spreading it to the world is.
"Take the red pill and you stay in wonderland, and I'll show you how deep the rabbit hole goes."

This depends on the situation (2)

Pilchie (869) | more than 14 years ago | (#1508088)

When I was in first year Computer Engineering, we spent quite a lot of time on this issue. (Note: Laws, etc, pertain to Canada, but I believe that the US is the same).

Currently by law it is the user's responsibility, totally, in every situation. However, there is starting to be significant pressure to make some systems the responsibility of a Professional Engineer, who would have to sign off on a project and take responsibility for it. The reason for this is not viruses, but other systems, such as medical software and navigation/control systems for aircraft, trains, etc.

Numerous people think that someone who develops the software to control the administration of a drug (for example) should have to take responsibility for the safety of their code.

I don't have a reference for it, but one of the big examples that we discussed had to do with a machine that administered chemotherapy drugs to patients in the US. There was software controlling the dosage, and a hardware safety check to prevent ODs in the first version. Then in the second version they removed the hardware check and (I think) about 20 people died of ODs because a lazy programmer didn't check whether the dose was allowed or not. In this case, the hospitals were deemed responsible for the deaths, but personally I think that situations like this need the developer to take responsibility for safety.

Of course, the problem with developers taking responsibility is that most projects depend on numerous other products. For example, if a developer writes code that is safe, but it is rendered unsafe by the compiler or by the OS the system is running on, who is really responsible: the developer, or the tool vendor? Which brings me to my final question: if the third-party vendor is actually an open source project, who takes responsibility for it? As an example, consider this. Some company wants to write a navigation system for a 777. They search Freshmeat and find that there is a really great AVL library that is LGPL'd. They decide to use it rather than roll their own, and some bug in the lib causes the planes to crash. Is the library developer responsible, or the company who made the nav system? I realize that most licenses have a no-liability clause in them, but if liability becomes a requirement for developers, could this be a major stumbling block on the road to world domination?

Anyway, I think I have rambled long enough, I should probably go write some code now. (Good thing I am a co-op student, so I won't be working here when the code gets released).


It's the User's Responsibility (1)

Tim C (15259) | more than 14 years ago | (#1508089)

With enough inventiveness, a person can put just about anything to use in illegal or immoral ways. Crude explosive devices can be made with common household objects and common chemicals. In the film Casino, a guy is stabbed to death with a fountain pen. Photocopiers or PCs with scanners and printers can be used to forge documents.

You cannot start trying to make the person who produced the item in question, whether it be a piece of software or an object of some kind, responsible for the use to which people put it.

As many people have already pointed out, you can't sue a gunsmith if someone uses one of their guns to commit murder (unless, of course, the intended use of the weapon was made explicitly clear at the time of purchase, but even this is somewhat dodgy ground). In the same way, you can't sue Microsoft because there exists in their software the capacity for people to write Word macro viruses.

Ultimate responsibility must rest with the user; to try to make it any other way would be to start down a very dangerous road indeed. Imagine a world in which you, as the author of a piece of software, are responsible for any use that anyone makes of it, now or at any point in the future. Any piece of software more complex than "Hello World" has the potential for misuse: email clients can be used to send harassing/defamatory emails (and don't forget the servers that relayed those messages, or the network cabling, routers, etc, etc...).

I would also argue that just because you write a piece of software, the sole purpose of which is, for example, to attempt to expose security holes in a system, does not mean that you are liable for any illegal use to which it is put.

I do not believe that people can be held responsible for the actions of others, particularly when they have never had any contact with them.

Just my two penn'orth.


Intent counts (2)

Paul Johnson (33553) | more than 14 years ago | (#1508090)

I think the most important factor has to be what the lawyers (at least over here in the UK) call "mens rea", which translates roughly as "guilty mind". It's the intent that counts.

Take a couple of examples: the recent DVD crack, and credit card number generators (the latter generate syntactically valid random credit card numbers). For the purpose of discussion I'll assume that copyright violation is unethical.

In the case of the DVD crack the purpose of the crack was honest: to let Linux users legitimately watch films without having to pay for Windows just to run the DVD drive. This is a perfectly legitimate goal, and there is nothing unethical about doing it. Of course it is possible to use the same software for unethical purposes, but the author of the software is not responsible for such a decision.

On the other hand the author of a credit card number generator has produced a piece of software which exists for only one purpose: to facilitate theft. The author set out to aid theft, and is therefore morally an accessory to the thefts which are carried out using the software.

Of course there is a big grey area in between these two extremes. What do we say about software which has some minor or marginal use, but which is almost entirely used for some bad and foreseeable purpose? Back Orifice might come into this area: it has some legitimate use for remote admin, but its primary purpose is to break Windows NT security.

Here ethics moves away from the legal domain: lawyers are concerned with proof. However ethics is more about formalising matters of conscience (although some ethical codes do carry penalties for gross violation). If you believe that cracking is wrong then it follows that the CDC acted unethically in releasing a tool which had, as its primary purpose, cracking NT.

A program for Linux which was designed to facilitate DVD copying would be an interesting case. It may be ethical to copy a DVD for backup purposes, but the vast majority of copies made would be illegal pirate copies for sale or just given away. Would it be ethical to write such a program?

The classic hardware scenario for this kind of ethical debate is the shopkeeper who sells a knife which is subsequently used in a murder. If the knife is a cooking knife bought in the normal course of business, then obviously the shopkeeper shares no guilt. At the other extreme, if the customer comes in and says "Give me a knife so I can kill my wife with it" and this statement appears believable, then equally obviously the shopkeeper is an accessory to the murder. But in the middle is a large grey area. What about combat knives? They are specifically designed to kill. Any individual purchaser might plead a desire for honest self defence, but the fact remains that most of the time such knives are used, it is not in self defence. The vendors must therefore share to some extent in the guilt of the users of these knives.


Personal Responsibility (1)

paitre (32242) | more than 14 years ago | (#1508091)

After initially reading the question, I was of two minds; after reading others' comments and taking a few minutes to think about it, I'm of a single mind on this. Software products are tools, much in the same way a hammer and screwdriver are tools. To hold the developer of a software tool responsible for its use is absurd. I CAN see some exceptions: viruses. BUT, as another poster has noted, viruses tend to be very nice pieces of code that can (and IMO, SHOULD) be used for educational purposes.
A good example (used here by others) is the drunk driver. While sober, the driver behaves and uses his car responsibly; however, after a six-pack of Newcastle Brown he really shouldn't be driving. He gets in his car anyway, and ends up killing somebody in an accident. Who's at fault? Not the car manufacturer. In fact, I can't think of any sane person who would even consider holding the manufacturer responsible. The driver, on the other hand, is completely responsible. He got drunk, drove, and killed someone.
The question still remains wrt software: is the developer responsible? Yes and no. No, because in general, the software created will likely have a multitude of uses, most of them legal. An example would be if we tried to hold Quicken responsible for an organized crime family using their software to manage their books. Yes, the developer should be held responsible if he is knowingly developing a tool with the goal of it being used for illegal activities (viruses, primarily). BO2K is a legitimate product; MS and other companies make similar products that no one has a complaint about. The only reason people bitch about BO2K is that it was NOT developed by a major software house.
Again: yes, in some cases developers should be held responsible; however, in the general case the users are responsible for their own actions, as the developers have no control over the use of their package once it hits the shelves.

Disclosure is good (1)

larva (82883) | more than 14 years ago | (#1508092)

There are two basic types here: software that CAN be used for both good and bad, and software that is made for the sole purpose of destruction. Either way, the user of the software is the one who should be made responsible for his actions; the tools cannot do any harm on their own.

It's tempting to compare this to a gun or any other weapon, but it goes beyond that. Making such tools available forces a reaction to the problem, thereby making the world a better place. If the tools remained unknown to the vendors, it would just make technological warfare, industrial espionage, or whatever, so much easier.

In a perfect world it would be enough to notify the vendor of a problem, but this just does not happen. The only way to make sure patches are released, fixes are made, and protocols changed is to publish the tools needed to take advantage of the problem.

This is the way Bugtraq has operated for quite some time now, and I haven't heard of a lawsuit over making a program like that yet, but plenty of lawsuits against people who use them :)

The user of the Gun... (1)

Duderino (117527) | more than 14 years ago | (#1508093)

Guns are not (yet) outlawed... It's the act of killing someone using a gun that is illegal. The same goes for knives, ammonia, etc. I think that it's the user's responsibility, not the developer's. Technology should not be outlawed, but rather the illegal use of technology. D.

Only one real /. answer (0)

Anonymous Coward | more than 14 years ago | (#1508094)

It's Microsoft's fault.

the user is the only source of blame. (0)

Anonymous Coward | more than 14 years ago | (#1508095)

It's the user's fault... all of it, really. This goes for software as well as, for example, guns. Just because you have it doesn't mean you have to use it, and if you do, you have to take the consequences of your actions, no matter whether you blow away a person or a server.

Poll comming up? (1)

Serenade (5015) | more than 14 years ago | (#1508096)

Hmm... first of all, I'd like to say that this would make a good poll. :)

Secondly, I'd like to put it like this:
I would definitely prefer something like this: describing the techniques involved in the (random task) software should be OK, while distributing compiled forms shouldn't be, since a lot of LaYme SKRiPTz0R k1DdiES wreak havoc by acting irresponsibly, and get us real geeks in trouble.
Kind Regards / Mark

potentially both (1)

jmorzins (86648) | more than 14 years ago | (#1508097)

The usual gauge of whether someone is culpable for an act is to consider whether he committed it with full knowledge of what he was doing, and whether he consented to doing it.

If a tool-maker did not know that her tool could be used for bad ends, she is less blamable if it is used in that way. (I don't think it arises very often in software development, but if she were somehow forced to build it against her will, she is similarly less blamable.) The same argument applies if a user does not know that a tool will have bad consequences, or if the user is forced to use it.

But if a user knows that use of a tool is wrong and deliberately uses it anyway, he bears responsibility for the wrongdoing. If a developer knows that the net effect of a tool will be wrong, and creates it anyway, she bears responsibility for the wrong done because of it.

(The really hairy question is how the developer judges whether the "net effect" of a tool will be bad. I leave this as your homework exercise.)

No two ways about it... (1)

knife_in_winter (85888) | more than 14 years ago | (#1508098)

The end user of the software is totally responsible for his or her actions. There is no question about that. Trying to deny responsibility for one's own actions is morally and ethically unacceptable, even though it is often done in the United States. I cite the example of the woman who sued McDonald's for $1 million after spilling hot coffee in her own lap. Her argument was that the coffee was not appropriately labeled as dangerously hot, and that her burns were therefore a direct result of McDonald's negligence. Now, we all know that the woman was a moron, but worse still, she skirted her responsibility for her actions. She played ignorant and refused to acknowledge that she was stupid to have put hot coffee between her legs.

Notice, however, that if one embarks on an action that harms others, the authorities are *very* quick to take the correct moral and ethical high ground. If you use a gun to murder someone, you are tried for murder, not the gun manufacturer. If you break into a home using a glass cutter, you will be tried for breaking and entering, not the glass cutter manufacturer. If you use a particular software package to crack a system and damage it, you will be tried for computer trespass, not the software designer.

I guess the real question you are trying to ask is "can the software designer be held responsible for making a tool that is potentially dangerous"? Asking this question is the same as asking "can we hold any designer responsible for the harmful use of their creation"?

I don't know the answer. The closest I can come to an answer for myself is something that is purely relativistic and probably unacceptable. I would say that it really depends on the intent of the creator. For example, if I use a Stanley claw hammer to unrepentantly bash your brains in, I think it is a safe assessment to acknowledge that the Stanley corporation will not be brought to trial for murder in the first degree. However, if Stanley designed and marketed a hammer specifically for the purpose of imploding the skulls of living humans, and I used *that* hammer in my crime, I think the Stanley corporation might find themselves culpable.

So what about gun manufacturers? I don't think anyone can seriously argue that handguns are designed for anything but immobilizing and killing human beings. But are gun manufacturers ever brought to trial alongside the assailant in murder cases? Not that I know of.

That's all I have to say about that.

Nothing can possiblai go wrong. Er...possibly go wrong.
Strange, that's the first thing that's ever gone wrong.

Re:Definitely the user... (1)

tzanger (1575) | more than 14 years ago | (#1508099)

The bulk of responsibility does lie with the user, and I believe that it is the person who uses the utility that should be held to account, but there is a line that must be drawn between useful utilities that may be used by the unscrupulous to do illegal things, and programs created with the intention of letting people crack systems or whatever more easily.

I don't agree... The responsibility lies solely on the user. As one poster mentioned, many companies refuse to fix a problem until someone writes a program which makes it easy to exploit a bug. In other words, those people who wrote the malicious code are helping keep things safe by making the companies react.

Now in an ideal world we wouldn't need that. Companies would feel compelled to fix it on their own or, more importantly, code better in the first place. Unfortunately this doesn't happen. It's cheaper to sweep it under the rug so you need to make it costlier to keep it hidden.

Another use for those exploits... As an admin I often run them on my own system to see if I'm vulnerable. Or to see if my firewall rules can keep it out. Sometimes when a program says it fixes problem 'x' it doesn't fix it all the way.

tcpdump (1)

Tomahawk (1343) | more than 14 years ago | (#1508100)

I can think of 4 different 'levels' of responsibility/blame here, depending on circumstance, and on the application.

1) User bad, programmer good:
A prime example of this would be 'tcpdump'. It is a very, very useful tool for finding faults on networks - I only used it the other day. It doesn't just do TCP - it will handle all sorts of network protocols. Such a useful little tool.

tcpdump, in short, is a network sniffer/analyser. It listens to all network packets passing by your network card and displays information about them on the screen. You can even save all of these packets to a file for analysis later.

This leads to a problem with it - in the wrong person's hands, this same tool could be used to find non-encrypted passwords, allowing someone to access a system. It can also trap encrypted passwords over the wire, save them to a file, and allow someone to crack them.

Of course, it can get even more than just passwords - emails, credit card details, etc. This is why we need good encryption routines and SSL.

This is a perfect example of where the user is the one at fault. The programmer did everything that was required to make the utility useful for fault finding. He would have known that it could be used for bad things, but it was necessary for the good things too.
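As an aside, the kind of header decoding a sniffer like tcpdump performs can be sketched in a few lines of Python. This is a simplified illustration, not tcpdump's actual code: it unpacks the fixed 20-byte IPv4 header from a raw packet and reads out the source and destination addresses, exactly the sort of fields that expose unencrypted traffic on the wire.

```python
import socket
import struct

def parse_ipv4_header(raw: bytes) -> dict:
    """Decode the fixed 20-byte IPv4 header, as a sniffer would."""
    (version_ihl, _tos, _total_len, _ident, _flags_frag,
     ttl, proto, _checksum, src, dst) = struct.unpack("!BBHHHBBH4s4s", raw[:20])
    return {
        "version": version_ihl >> 4,
        "header_len": (version_ihl & 0x0F) * 4,  # IHL is in 32-bit words
        "ttl": ttl,
        "protocol": proto,                        # 6 = TCP, 17 = UDP
        "src": socket.inet_ntoa(src),
        "dst": socket.inet_ntoa(dst),
    }

# A hand-built sample header: version 4, IHL 5, TTL 64, protocol TCP,
# from 192.168.0.1 to 10.0.0.2 (checksum left as zero for illustration).
sample = struct.pack("!BBHHHBBH4s4s",
                     0x45, 0, 40, 1, 0, 64, 6, 0,
                     socket.inet_aton("192.168.0.1"),
                     socket.inet_aton("10.0.0.2"))
print(parse_ipv4_header(sample))
```

A real sniffer would feed this function packets captured from a raw socket (which requires root privileges); the point is only that once you can see the bytes, reading out anything unencrypted is trivial.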

2) User bad, Programmer good and bad:
(disclaimer: I'm not giving out about BO2K, just using it as the only example I can think of here. No harm is meant to cDc)
Next, the case of BO2K. This tool was, in some ways, written to allow people to get access to NT systems. It was written with the knowledge of certain security breaches in NT.

The program itself, though, is one of the best admin tools for NT. The guys who wrote it don't use it for gaining entry into NT systems.

Here, if a user maliciously uses the program, then, yes, the user is still at fault. Is the programmer responsible? Well, firstly we need to know if the ability to use it maliciously can be used for a good purpose. Yes, to an extent as it shows that there are security bugs, and that they should be fixed. Next, we need to know if the feature is a necessary feature in order for the program to work. In this case, I don't think it is.

So where does that leave us? Well, the feature was added in order to improve NT security. The feature brings to light the knowledge that the security problem exists. So, for this reason, the programmer was morally correct in adding the feature, if just to ensure that admins (and MS) fix any security holes so the feature can't be used.

At the same time, it was wrong, as not everyone will get the security feature fixed, meaning that the program can be used to gain access to their systems. This is a case of being right and wrong at the same time.

Notice, however, that the user is just wrong for using the feature.

3) User bad, Programmer bad:
Next has to be the case of a program designed to gain entry into a system, but with no other use. Again, a user using such a program would be wrong. If this program has no 'good' use (unlike BO2K), then the programmer would also be wrong.
An example here would be the program 'crack' for cracking Unix passwords - a tool written exclusively for that purpose.

4) User good, programmer bad:
Lastly would be the case of a virus/trojan. In this case, in most circumstances, the user is not to blame. Naturally, different circumstances can bring blame to the user. If, for example, the user forwards 'Sophie.EXE' to all the other guys in the office, which is most likely against office policy, and it happens to contain a trojan/virus, then this user would be to blame for not checking it, in a moral sense. He may have innocently sent the attachment, but blame could still be put on him for causing problems within the company.

The programmer of the virus is definitely to blame.

In the case of the last two here, the difference is that a virus/trojan isn't a utility that the user would be using to deliberately cause harm. In most cases, a user would be unaware of the virus within the program/file. Like, what's the harm in running Sophie.EXE, eh?

The whole moral issue in all of these cases can have exceptions. Take the example of crack - here the police could use this utility to help them gain access to a drug baron's computer, helping to convict him. In this case, the use of the product is a good one, and you could say that the programmer's involvement was also good. Everything has exceptions.

Re:well, by precedent... (1)

perky (106880) | more than 14 years ago | (#1508101)

"Guns are not inherently dangerous products at all" Are you mad? What is the sole design purpose of a gun? Is it:

a) to entertain rednecks at target ranges

b) to give Stallone something to hold in Rambo

c) To kill or injure people or animals from a distance

I agree that guns can be harmless when that is the intention of a well trained user, but that doesn't take anything away from the fact that a gun is an inherently very dangerous product.

RE: the cigarette comment. Used properly, in combat, a gun will always harm the victim. Secondly, the smoker has the choice whether to kill himself through nicotine, whereas the victim of a gunshot (usually) gets little choice in the matter.

The Death of Common Sense (2)

Michel (8815) | more than 14 years ago | (#1508102)

"Caution: The contents of this bottle should not be fed to fish." -- On a bottle of shampoo for dogs.

"Do not use while sleeping." -- On a hair dryer.

"Do not use in shower." -- On a hair dryer.

"Warning: This product can burn eyes." -- On a curling iron.

"Do not use orally." -- On a toilet bowl cleaning brush.

"Please keep out of children." -- On a butcher knife.

"Wearing of this garment does not enable you to fly." -- On a child sized Superman costume.

"May be harmful if swallowed." -- On a shipment of hammers.

Are you sure you want a warning label on anything that can be potentially dangerous?

Good ethics....? (1)

A Masquerade (23629) | more than 14 years ago | (#1508103)

Surely an example of good ethics is doing your own homework assignments rather than just posting the questions to "Ask Slashdot"?

[with thanks to hobbit]

Re:No tool w/o Health Warning (0)

Anonymous Coward | more than 14 years ago | (#1508104)

Well, my hammer does have a warning on it. So does the new wire cutter I bought a few days ago. Most of the instruction manual for my cordless drill deals with all the stupid stuff you can do with it to hurt yourself or others. My ladder has a warning about standing on the top step, in case you are stupid enough to do it. Why? Because these guys have been sued. Software is getting to be the same way. Johnny Hacker crashed my e-commerce site, so let's sue Microsoft because he used their operating system (works for me). This debate is pretty old, but the courts decide what's stupid and what's negligent. A billion bucks to a family because they didn't know how to drive and caused an accident and the gas tank blew up because the car was 20 years old? Is GM really liable for your stupidity? Is Cult of the Dead Cow liable because you are a script kiddie hacker wannabe and fucked up a bunch of computers? My opinion: the script kiddie oughta be glad I'm not their daddy, because I DO believe in corporal punishment. Oh, I'm ranting again. Later, The Geek

Responsibility (1)

Stanleverlock (111131) | more than 14 years ago | (#1508105)

Dear Computer Ethics, This is not a case of the weapon maker being unable to control the use of the weapon after it leaves his establishment. In the case of software you can make a very powerful argument for harmful intent. Software can be designed to do nothing other than its intended purpose. But when you start adding all kinds of little programs that monitor and surveil the user, or another user, while the software is running, that amounts to invasion of privacy at the least, and up to criminal harmful intent. When such programs are written into software, you are breaking certain unwritten moral codes and probably laws as well. However you might justify the logic of such programming, you are clearly stepping across a line.

Re:Are we writing a paper for him? (1)

chizz (95740) | more than 14 years ago | (#1508106)

Hear, hear.. I don't mind my students doing it this way, and on /. he'll get a lot more and varied opinions to consider than he would working alone or in a small group

Re:well, by precedent... (1)

HarryTuttle (69566) | more than 14 years ago | (#1508107)

> Guns are not inherently dangerous products at all.

In The Social Animal, in the chapter on human aggression, Elliot Aronson relates a study involving guns. It was found that the mere presence of a firearm could increase a subject's level of aggression. This finding would seem to disprove the "guns don't kill people" mantra.

Re:I would add a caveat to that. (1)

tzanger (1575) | more than 14 years ago | (#1508108)

I think that it is also the responsibility of every developer to think about the potential illegal uses of their code, and the damage that their programs could cause. Since it requires a certain degree of brain power to be a developer in the first place, it shouldn't be too hard for everyone to realise that if they write a virus/scanner/exploit and release it to the public, it will inevitably wreck someone's day, and cost someone money. Just put yourself in the position where you miss dinner with your family/have to stay up all night fixing a server because of some script-kiddie/have to postpone the family holiday because your data was wiped off your servers, and you miss a contract deadline as a result. It doesn't take long to decide not to release malicious code, does it? Just remember that there's always some arsehole who thinks it's cool to screw things up, and he might be doing it with your code.

That's why I would strongly suggest to all exploit writers that they do not make the code available for ... oh say... two weeks after notifying the company that they have said exploit. If the company is unresponsive or gives the "who cares" attitude, release it. It's not the developer's fault anymore.

Re:A gut reaction (1)

Anonymous Coward | more than 14 years ago | (#1508109)

I think you're on the right track, but there are other considerations. Most things that are at all useful can be misused. One's point of view is certainly a critical component. I'm sure our CIA doesn't consider eavesdropping on suspected criminals a "misuse" of electronic equipment; however, the suspects - especially if they are innocent - might disagree. We quickly get into a question of "the end versus the means." People's actions will be judged on an individual basis by those who are in a position of power over that person. A person who is hanged as a spy would have been given a hero's welcome had he escaped to his own country. Bringing it back to the computer world, what about DVD? There are those who believe it is wrong to prevent people from freely viewing DVD movies on Linux. They couldn't get those "in power" to agree, so they reverse engineered the encryption algorithm. Some people would call these programmers "heroes"; some would call them criminals. There are no absolutes in the concept of right or wrong. If you want to enjoy the benefits of a society then you must obey the laws of that society or risk being forced to pay the price. How much risk are you willing to take?

At all times the user (1)

Symbiosys (118606) | more than 14 years ago | (#1508110)

The person who should be held responsible for any action is the person who committed the action. Like so many have already said, the company producing firearms can not be held responsible for murders. One gentleman felt that if the developer creates software that can be misused then he should be held responsible; this is exactly the same situation as with guns. If AOL tries to make it easy for people to get on the web and in their efforts makes it easy to hack into someone's PC, that does not make them the perpetrator. The person who hacks into the PC is wrong. If we support the idea that the developer/producer is wrong, it will cause total chaos.

Analogy with patents (1)

The_Compact (102019) | more than 14 years ago | (#1508111)

That question is really good and has been debated by many philosophers. (And will continue to be.)

For me, as a computer programmer, I try to keep my software bug-free. It is impossible, of course, to have such a thing as bug-free software, with all the variables to take into account while writing code.

What I think is that, up to a legitimate line, legit software can contain "bugs" and users must accept it so. A company that won't write patches/upgrades or has too many bugs could eventually be attacked in law.

As for purely evil code (destructive viruses), I think it's both the user and the writer who have the responsibility in that case. I wrote many viruses, some pretty destructive. But I never let them leave a particular floppy. I made them basically to learn more about viruses. I take full responsibility for them since I keep them hidden.

<<In one line, my answer would be: Depends on the intended usage of the software.>>

What I REALLY fear is people creating false usages to protect themselves from their share of responsibilities. More or less like the software patents we have these days.

In SW patents, people will use twisted ways to get their software patented. "A device which permits to (Insert patent here)".

If a virus writer would like to protect himself, he could always say he was making that particular "piece of software" (and not a virus) to help system administrators learn about the different connections between employees of a company and the outside world. Who shares code, who knows whom at the competition.

In these cases, it would (again) be the big companies that would be allowed to do anything. The small fry wouldn't use 2000 lawyers to create false pretenses.

This is a no-brainer (2)

G27 Radio (78394) | more than 14 years ago | (#1508112)

There's no way I can justify, in my mind, blaming the author of the software. It's the implementer that is at fault.

In the case of virii: I don't believe there is anything inherently wrong with writing a virus. The author is not to blame until he unleashes it--deliberately or accidentally.

I have yet to find a good reason to hold an author responsible for how their software is used. It would be an evil thing if we could be prosecuted for the way someone may abuse software that we write. This could certainly have a chilling effect on free software.

I don't think any of us will be very happy if the people that can afford to release software are companies that have a full-time legal staff to fend off law suits brought on by misuse of software.


If you use - give credit (1)

gabrieltss (64078) | more than 14 years ago | (#1508113)


If you happen to use ANY of the comments given to this posed question, you should give credit in your bibliography and/or within your paper, or that's plagiarism and unethical. Thought I would point this out since you are taking an Ethics course.

Me, I have two weeks left in my Business Ethics course (sounds like an oxymoron to me...) :^)


During Prohibition... (0)

Anonymous Coward | more than 14 years ago | (#1508114)

...there was a yeast product sold, or so goes the legend, that had a disclaimer on it:
Warning! Mixing this product with (names many beer ingredients) and heating for (instructions) will produce beer, which is a controlled substance under the (prohibition laws).
So it has been _legal_ to sell this stuff. Also note that under Prohibition it wasn't illegal to drink, just to sell.
As another example, look at "head" shops. They sell "drug paraphernalia" such as pot pipes, and usually stay in business. The person who uses them could get busted for possession of pot, and the pipe has no use outside of pot use.
In practice you can sell a lot of stuff with no "legal" use.

Who's responsible? (2)

Seth Scali (18018) | more than 14 years ago | (#1508115)

I'm going to open up a can of worms here and open myself up to a flame war. Moderators, go ahead and mark this down as flamebait, but please realize I'm not trying to advocate a political viewpoint:

Is a gun company responsible for people who get shot?

Some people say "yes". Like Gail Fox, a Brooklyn lady who watched somebody shoot her son. He survived, fortunately, but she felt that action needed to be taken. Not against the person who pulled the trigger. Not against the dealer who illegally sold the gun. Against the gun industry. 15 of the 25 gun companies named in the suit were found liable for the shooting, and for the deaths of 6 other children.

Take this logic and apply it to software. If some company is hit by BO2K, it isn't the fault of the script kiddie who installed it. It isn't the fault of the administrator who didn't take proper precautions to secure the servers.

No, according to the flawed logic detailed above, it's the cDc's fault that the company gets hacked. After all, the cDc distributed something that they knew could be used for illegal purposes! They distributed something that could be easily used by even the most inexperienced person to wreak havoc on the lives of others, right?

In other words, personal responsibility is gone. Nobody prosecutes the people who sell illegal guns-- they prefer to make the CEO of Colt Firearms go in front of a judge and grovel for mercy. Nobody wants to prosecute the script kiddy or toughen up their system-- it's easier to blame the Cult of the Dead Cow and make them pay for the damages. Nobody wants to make a good copy protection scheme for DVD movies-- it's easier to threaten lawsuits against the people who point out how horribly fucked-up the system is.

Responsibility for the use of any technology, be it software or guns, is in the hands of the person who uses it. I don't believe in passing the blame around like so much candy-- my actions are my own, for better or worse. If I'm willing to take the credit for my accomplishments, I should damn well be willing to take the blame for my mistakes and blunders.

A note to the world: don't blame others. It won't do you a damn bit of good. Instead, take a little responsibility for your actions and learn from your mistakes. It's that ninth habit of highly successful people-- they don't pass the buck.

Re:No No and No (1)

Neuroprophet (12311) | more than 14 years ago | (#1508116)

By writing a virus you could also be exposing how an OS could be compromised or infected. The code could then be examined by people, and fixes could be made so that a virus of that type could no longer compromise or infect the OS. Just because someone has code to a virus doesn't mean they have to use it for harm.