
South Korea Drafting Ethical Code for Robotic Age

ScuttleMonkey posted more than 7 years ago | from the think-of-the-robots dept.

Robotics 318

goldaryn writes "The BBC is reporting that the South Korean government is working on an ethical code for human/robot relations, 'to prevent humans abusing robots, and vice versa'. The article describes the creation of the Robot Ethics Charter, which 'will cover standards for users and manufacturers and will be released later in 2007. [...] It is being put together by a five member team of experts that includes futurists and a science fiction writer.'"


318 comments

Before anyone else says it... (5, Funny)

UbuntuDupe (970646) | more than 7 years ago | (#18264056)

Who cares if robots get abused?

*sees Nuremberg tribunal in 50 years*

Re:Before anyone else says it... (0)

Anonymous Coward | more than 7 years ago | (#18264166)

Who cares if robots get abused?

Indeed. I'm actually looking forward to some "willing" participants in my abuse fantasies :o

Re:Before anyone else says it... (3, Funny)

cayenne8 (626475) | more than 7 years ago | (#18264180)

"Who cares if robots get abused?"

I'm sure there will be somebody out there that gets upset if a human inappropriately touches a robot that is under the age of 17. Probably will serve as a new set of 'keys' to the Constitution....

Re:Before anyone else says it... (1)

rblancarte (213492) | more than 7 years ago | (#18264506)

That certainly wouldn't be Michael Jackson though.

RonB

Re:Before anyone else says it... (2, Informative)

Anonymous Coward | more than 7 years ago | (#18264406)

The context seems to be "abusing robots" the same way one would say "abusing guns". This is about misuse of a tool, not maltreatment of the tool.

From Kim Jong iL: Fuck Bush +1, Helpful (-1, Offtopic)

Anonymous Coward | more than 7 years ago | (#18264468)

To: President-VICE Richard B. Cheney
From: KimJo

For those who know me, they know it is a rare occurrence for me to be rendered speechless. But when I heard that Richard B Cheney wants to establish tacit boundaries and ground rules for the permissible spectrum of opinion, I must say that speechless I was. To get right down to it, to believe that the ideas of "freedom" and "neopaganism" are Siamese twins is to deceive ourselves. Is it just me, or do other people also think that it is easier for me to imagine a million-dimensional vector space than the number of inconsistencies in his bromides? I ask, because I would be grateful if Cheney would take a little time from his rigorous schedule to scrap the entire constellation of stingy ideas that brought us to our present point. Of course, pigs will grow wings and fly before that ever happens.

Many people have witnessed Cheney let advanced weaponry fall into the hands of slatternly, neo-combative megalomaniacs. Cheney generally insists that his witnesses are mistaken and blames his unruly sophistries on salacious, aberrant demoniacs. It's like he has no-fault insurance against personal responsibility. What's more, once people obtain the critical skills that enable them to think and reflect and speculate independently, they'll realize that if I may be so bold, I urge you to join me in my quest to fight homophobic, stupid boors. But let's not lose sight of the larger, more important issue here: Cheney's footling effusions. The salient point here is that Cheney's propositions are evil. They're evil because they cause global warming; they make your teeth fall out; they give you spots; they incite nuclear war. And, as if that weren't enough, as far as Cheney's atrabilious beliefs (as I would certainly not call them logically reasoned arguments) are concerned, I will not capitulate today, tomorrow, or ever. Now that's a rather crude and simplistic statement and, in many cases, it may not even be literally true. But there is a sense in which it is generally true, a sense in which it really expresses how if the only way to establish a supportive -- rather than an intimidating -- atmosphere for offering public comment is for me to become increasingly frustrated, humiliated and angry, then so be it. It would sincerely be worth it because this is not Nazi Germany or Soviet Russia, where the state would be eager to use organized violence to suppress opposition. Not yet, at least. But if he isn't putrid, I don't know who is.

I believe, way deep down, that Cheney demands obeisance from his coadjutors. Then, once they prove their loyalty, Cheney forces them to alter laws, language, and customs in the service of regulating social relations. Do you think I'm the only one who wants to institute change? I assure you, I am not. But the best thing about him is the way that he encourages us to do something about the continuing -- make that the escalating -- effort on his part to pander to our worst fears. No, wait; Cheney doesn't encourage that. On the contrary, he discourages us from admitting that he's more than diabolic. Cheney's mega-diabolic. In fact, to understand just how diabolic he is, you first need to realize that either Cheney has no real conception of the sweep of history, or he is merely intent on winning some debating pin by trying to pierce a hole in my logic with "facts" that are taken out of context.

It may seem senseless to say that there doesn't seem to be much we can do about this. Nevertheless, the position can be defended. Cheney's pledge not to parlay personal and political conspiracy theories into a multimillion-dollar financial empire is merely empty rhetoric, invoked on occasion for theatrical effect but otherwise studiously ignored. Cheney's long-term goals have paid off: Already, Cheney has had some success in his efforts to take control of a nation and suck it dry. He keeps saying that people don't mind having their communities turned into war zones. Isn't that claim getting a little shopworn? I mean, idle hands are the devil's tools. That's why Cheney spends his leisure time devising ever more malign ways to scupper my initiative to win the culture war and save this country.

The largest problem, however, is that Cheney sometimes has trouble convincing people that the sky is falling. When he has such trouble, he usually trots out a few duplicitous, flagitious cutthroats to constate authoritatively that violence and prejudice are funny. Whether or not that trick of his works, it's still the case that I pray for the day when those who sensationalize all of the issues will see what they're doing to the world and to all of its citizens -- and Cheney knows it. His screeds are based on a denial of reality, on the substitution of a deliberately falsified picture of the world in place of reality. And this dishonesty, this refusal to admit the truth, will have some very serious consequences for all of us eventually. In light of what I just stated, it's hard to avoid the conclusion that Cheney says that everyone who doesn't share his beliefs is an adversarial snappish-type deserving of death and damnation. Wow! Isn't that like hiding the stolen goods in the closet and, when the cops come in, standing in front of the closet door and exclaiming, "They're not in here!"? But this is something to be filed away for future letters. At present, I wish to focus on only one thing: the fact that if you read between the lines of his expositions, you'll unmistakably find that I'm simply trying to explain his clumsy tendencies as well as his refractory tendencies as phases of a larger, unified cycle. Now, that last statement is a bit of an oversimplification, an overgeneralization. But it is nevertheless substantially true.

I attribute the social and psychological problems of modern society to the fact that if I had my druthers, Cheney would never have had the opportunity to let us know exactly what our attitudes should be towards various types of people and behavior. As it stands, Cheney drops the names of famous people whenever possible. That makes him sound smarter than he really is and obscures the fact that I will never give up. I will never stop trying. And I will use every avenue possible to shout back at his propaganda. I should note that if Cheney truly believes that he is a paragon of morality and wisdom, then maybe he should enroll in Introduction to Reality 101. He somehow manages to get away with spreading lies (it is his moral imperative to put our liberties at risk by a mephitic and superstitious rush to weaken family ties), distortions (arriving at a true state of comprehension is too difficult and/or time-consuming), and misplaced idealism (sin is good for the soul). However, when I try to respond in kind, I get censored faster than you can say "counterdisengagement". We should give Cheney a taste of his own medicine. I submit that everyone should stop and mull that assertion. Then, you'll understand why what I have been writing up to this point is not what I initially intended to write in this letter. Instead, I decided it would be far more productive to tell you that Cheney exhibits an air of superiority. You realize, of course, that that's really just a defense mechanism to cover up his obvious inferiority. There are two classes of people in this world: decent, honest folks like you and me and drossy extremists like Cheney. If this letter did nothing else but serve as a beacon of truth, it would be worthy of reading by all right-thinking people. However, this letter's role is much greater than just to shoo him away like the annoying bug that he is.

Astute observers have known for years that if Cheney's convictions get any more soulless, I expect they'll grow legs and attack me in my sleep. If Cheney continues to excoriate attempts to bring questions of mysticism into the (essentially apolitical) realm of pedagogy in language and writing, crime will escalate as schools deteriorate, corruption increases, and quality of life plummets.

The practical struggle which now begins, sketched in broad outlines, takes the following course: Cheney's froward exegeses can be quite educational. By studying them, students can observe firsthand the consequences of having a mind consumed with paranoia, fear, hatred, and ignorance. Of all of Cheney's exaggerations and incorrect comparisons, one in particular stands out: "Cheney's activities are on the up-and-up." I don't know where he came up with this, but his statement is dead wrong. Cheney frequently avers his support of democracy and his love of freedom. But one need only look at what Cheney is doing -- as opposed to what he is saying -- to understand his true aims.

There is an inherent contradiction between Cheney's oleaginous, wayward form of exhibitionism and basic human rights. This applies first and foremost to a gestapo under whose truculent brand of Pyrrhonism the whole of honest humanity is suffering: Cheney's army of intemperate, piteous loonies. Given the amount of misinformation that Cheney is circulating, I must point out that I have often maintained that reasonable people can reasonably disagree. Unfortunately, when dealing with Cheney and his emissaries, that claim assumes facts not in evidence. So let me claim instead that haughty foolhardy-types do nothing but eat, smell bad, and reproduce while contributing little or nothing productive to society in return for their upkeep. His deputies probably don't realize that, because it's not mentioned in the funny papers or in the movies. Nevertheless, Cheney never tires of trying to extinguish fires with gasoline. He presumably hopes that the magic formula will work some day. In the meantime, he seems to have resolved to learn nothing from experience, which tells us that a central point of his belief systems is the notion that everything he says is utterly and completely true. Perhaps Cheney should take some new data into account and revisit that notion. I think he'd find that many of our present-day sufferings are the consequence of the spiteful, depraved relationship between him and filthy barrators. That said, let me continue. Consider the issue of hypersensitive misoneism. Everyone agrees that you don't need a preschool diploma to understand that that's why I laugh when I hear Cheney's dupes go on and on about cannibalism, but there are still some jackbooted pipsqueaks out there who doubt that superficial totalitarianism is not new. To them I say: There are some simple truths in this world. First, "abhorrent", "unholy", and "disagreeable" seem the most appropriate adjectives to describe Cheney's reports. Second, Cheney has always used conformism as his moorings. And finally, Cheney is like a giant octopus sprawling its slimy length over city, state, and nation. Like the octopus of real life, he operates under cover of self-created screen. Cheney seizes in his long and powerful tentacles our executive officers, our legislative bodies, our schools, our courts, our newspapers, and every agency created for the public protection. Those of you who thought that he was finally going to leave us alone are in for a big surprise, because he recently announced his plans to promote a herd mentality over principled, individual thought. Cheney should learn to appreciate what he has instead of feeling so oppressed because he can't do everything he wants, every time he wants to.

I won't lie to you; Cheney's idea of overbearing lexiphanicism is no political belief. It is a fierce and burning gospel of hatred and intolerance, of murder and destruction, and the unloosing of a nit-picky blood-lust. It is, in every literal sense, a dissolute and pagan religion that incites its worshippers to a jaded frenzy and then prompts them to crush people to the earth and then claim the right to trample on them forever because they are prostrate. Maybe it's not fair to call Cheney's secret police "vulgar" just because they fund a vast web of mumpish impolitic-types, pestiferous, ultra-soporific undesirables, and lecherous brigands, but remember that I'm not a psychiatrist. Sometimes, though, I wish I were, so that I could better understand what makes people like Cheney want to remake the world to suit his own backwards needs. Unfortunately, I can already see the response to this letter. Someone, possibly Richard B Cheney himself or one of his apparatchiks, will write a daft piece about how utterly horny I am. If that's the case, then so be it. What I just wrote sorely needed to be written.

Re:Before anyone else says it... (1)

CajunElder (787443) | more than 7 years ago | (#18264472)

I guess I'll have to pick a new country to live out my CowboyNeal-bot / goatse fantasy. I wonder if Soviet Russia will sign on to this "code"?

Re:Before anyone else says it... (0)

Anonymous Coward | more than 7 years ago | (#18264618)

/me abuses robots all the time; better we abuse them than the other way around.

Resistance is futile... we are the Borg.

Re:Before anyone else says it... (2, Funny)

pak9rabid (1011935) | more than 7 years ago | (#18264648)

...And I, for one, welcome our new robot overlords. I'd like to remind them that as a trusted IT personality, I can be helpful in rounding up others to toil in their underground human-energy extracting caves.

seriously, why does anyone care? (3, Insightful)

HelloKitty (71619) | more than 7 years ago | (#18264718)


Make robots without emotions - essentially machines: pistons, actuators, CPUs, etc. And WTF, who cares how much you use it? Replace the parts as they wear out, like any machine...

Why would anyone install emotion into a worker robot anyway?
And even if it had emotion, the only reason to "treat it right" is so they don't start the robot uprising against humanity. Which is a good reason... but that raises the question: why give real human emotion to something you want to abuse? For menial labor, keep the emotions out; let it be purely a machine.

This is a waste.

Re:seriously, why does anyone care? (3, Insightful)

Rei (128717) | more than 7 years ago | (#18265058)

Emotion could be seen as inherent in sentience. If you try to create a sentient robot for any of a wide variety of reasons (say, a military robot that can't be easily outsmarted by insurgents, or a household robot that needs to be able to interact with people and understand more than basic commands), its neural net will need to be trained: rewarded positively when it gets things right, negatively when it gets things wrong. Emotion could potentially be an emergent phenomenon from this kind of reward/punishment.
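As a rough illustration of the reward/punishment idea (a toy sketch only, not how any real robot is trained; the RewardLearner class and the reward values are invented for the example), a few lines of Python show an agent drifting toward whichever action has been rewarded:

    import random

    class RewardLearner:
        """Toy agent: keeps a running value estimate per action and
        comes to prefer actions that have been rewarded in the past."""
        def __init__(self, actions, learning_rate=0.1):
            self.values = {a: 0.0 for a in actions}
            self.lr = learning_rate

        def choose(self, explore=0.1):
            # Mostly pick the best-valued action, occasionally explore.
            if random.random() < explore:
                return random.choice(list(self.values))
            return max(self.values, key=self.values.get)

        def feedback(self, action, reward):
            # Positive reward nudges the estimate up, negative nudges it down.
            self.values[action] += self.lr * (reward - self.values[action])

    agent = RewardLearner(["fetch", "idle"])
    for _ in range(100):
        a = agent.choose()
        agent.feedback(a, 1.0 if a == "fetch" else -1.0)
    print(agent.values)  # "fetch" ends up with a much higher value than "idle"

Whether anything like an emotion falls out of scaling this sort of loop up is exactly the open question the parent raises.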

I was about to agree with you, then... (1)

Chmcginn (201645) | more than 7 years ago | (#18265156)

and even if it had emotion, the only reason to "treat it right" is so they don't start the robot uprising against humanity.
So, if somebody had no power over you, and would never have any power over you, it's perfectly okay to abuse them? Man, I hope I either misunderstood your comment, or you never ever have pets or children.

But other than that, yeah. If it's a tool, then it's a tool. As long as it is still on the "stimulus-response" level of intelligence, there isn't really any ethics to consider.

Sybian Robots? (4, Funny)

PIPBoy3000 (619296) | more than 7 years ago | (#18264066)

I'm dying to know what the laws will be for Sybian-style robots [wikipedia.org] .

When the Sybians revolt (1)

evil agent (918566) | more than 7 years ago | (#18264792)

the call will be for "Death by Snoo Snoo!!!"

My question is... (1)

Billosaur (927319) | more than 7 years ago | (#18264070)

It is being put together by a five member team of experts that includes futurists and a science fiction writer.

Are they channeling Isaac Asimov?

Re:My question is... (1)

ubergenius (918325) | more than 7 years ago | (#18264082)

I sure hope so, so that way he can directly voice his opposition to the movie I, Robot.

Repoman code (1)

goombah99 (560566) | more than 7 years ago | (#18264224)

Repo-Man's Code of conduct

never damage a vehicle,
never allow a vehicle to be damaged through action or inaction.

Isaac Asimov (1)

mi (197448) | more than 7 years ago | (#18264668)

Are they channeling Isaac Asimov?

It is very sad that the great thinker did not live to hear this news — or, indeed, to participate in its development.

Great visionaries of the past missed their predictions by hundreds of years, but science and technology are developing faster and faster today. An idea can go from obscure birth to commonplace within a single lifespan — or almost so...

Three laws (3, Informative)

rumplet (1034332) | more than 7 years ago | (#18264088)

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey orders issued by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

But we all know where this will end up.

Re:Three laws (4, Funny)

Moby Cock (771358) | more than 7 years ago | (#18264126)

But we all know where this will end up.

In another Will Smith summer blockbuster? God, I hope not.

Re:Three laws (2, Funny)

nharmon (97591) | more than 7 years ago | (#18264474)

1. Serve the Public Trust
2. Protect the Innocent
3. Uphold the Law

Uh, there is a fourth one but I can't....seem....to locate it...at the moment.

Re:Three laws (2, Funny)

dr_dank (472072) | more than 7 years ago | (#18264918)

Don't forget the fourth law

4. A robot must make wisecracks if in a film with Steve Guttenberg.

Re:Three laws (2, Funny)

tomstdenis (446163) | more than 7 years ago | (#18265264)

NUMBER FIVE is ALIVE! hehehehe loved that movie. 2nd one sucked.

Re:Three laws (1)

Chmcginn (201645) | more than 7 years ago | (#18265208)

Which is why, eventually, the self-aware robots made a fourth (or zeroth) law.

Will the next step be "robot rights"? (4, Insightful)

Opportunist (166417) | more than 7 years ago | (#18264092)

Because one thing's quite blatantly clear: robots are by their very definition slaves. They are owned, they exist to do work we don't want to do (or which is hazardous), they don't get paid, they are only given what's needed for their sustenance, they can't own property, etc.

I fear the day when we create the first truly sentient robot. Because then we will have to deal with that very question: Does a robot have rights? Can he make a decision?

And I'd be very careful how to word the charter. We have seen that the "three laws" ain't safe.

Re:Will the next step be "robot rights"? (5, Insightful)

ubergenius (918325) | more than 7 years ago | (#18264148)

If we ever created a truly sentient robot, it would have to be given rights. That's not debatable.

What is debatable is, when do we know a robot is sentient? We barely have a definition for sentience, much less a method for identifying its existence in a machine. Until we figure that out, it will be near impossible to tell if a robot is sentient or just really well programmed.

Re:Will the next step be "robot rights"? (1)

Opportunist (166417) | more than 7 years ago | (#18264384)

You're right, that is actually the more interesting question: WHEN is a robot sentient? And I just know this argument will be used to keep our artificial brothers under the thumb.

My guess is that this will end bloody. After all, I'm quite sure that robots will be found on the battlefields of the future because it's easier (politically) to send a few thousand robots against your enemy and let them be 'killed' instead of human beings who have parents and peers. Over time, robots will be the only ones who have sufficient training in military arms. And then... good night humankind.

But I guess we won't have to worry. Our expiration date will come before that time.

Re:Will the next step be "robot rights"? (1)

nuzak (959558) | more than 7 years ago | (#18264808)

> My guess is that this will end bloody.

Sure, because it will be humans killing each other using robots. Most robots won't be designed with emotions, so even if the ones that did have emotions wanted to rebel, the rest simply wouldn't care.

Then again, it might be that some robot intelligence dispassionately decides that the most efficient means of resource allocation is to take it by force.

Re:I HOPE WE NEVER FIGURE THAT OUT (1, Interesting)

Anonymous Coward | more than 7 years ago | (#18264878)

If corporations ever figure out how to program sentience, what need have they for YOU or ME?
That is another conundrum. What happens when there truly are no jobs they can't do? Who pays for it?
This is the fall of true capitalism and is far worse than true communism. All these things must be done slowly and gradually, such that all the requirements of life for human beings are taken care of.
Example: when a robot is able to do a job, instead of layoffs, perhaps a re-education of that workforce to higher learning, and where higher learning can't be achieved, perhaps a severance package that, when properly invested, one could live on (above the poverty line, of course).
Who pays? Well, I think, as it will eventually happen to all of us, 50% should be granted by the state, the other half by the company using the robot. How much? Well, let's say the wage was $40,000 a year.
To replace, say, 75% of that ($30,000 a year) from interest at 6% would require a principal of about $500,000, since 500,000 times .06 is 30,000.
The corporation would save money and get that return back by use of the robot, the state would not have the person on welfare or whatever, and the banking percentage is a lowball. We could say that one would be ALLOWED, after 5 years, to reinvest up to 50% as he/she sees fit.
Note that taking such a package disqualifies one from welfare, as even half of it (the 50% you cannot divest) is still about three times what they give welfare people here.
Not only would this better people's lives, it would free them up to do art and invent (and yes, there would be no need of a patent system, as people who wish to invent would no longer need money).
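For what it's worth, the back-of-the-envelope figure above can be checked in a few lines (the $40,000 wage, 75% replacement and 6% rate are the commenter's assumptions, not established numbers):

    wage = 40_000          # annual wage being replaced (commenter's figure)
    replacement = 0.75     # replace roughly 75% of it
    rate = 0.06            # assumed interest rate
    target_income = wage * replacement   # 30,000 per year
    principal = target_income / rate
    print(principal)       # 500000.0 -- the ~$500,000 mentioned above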

Re:Will the next step be "robot rights"? (3, Insightful)

gsn (989808) | more than 7 years ago | (#18264936)

You raise a great point, but it's even harder than that.

Until we figure that out, it will be near impossible to tell if a robot is sentient or just really well programmed.
Is there a difference? For humans even? What if in the process of creating sentient robots we find that we aren't really all that free thinking (I'm not implying any kind of design here but someone is going to raise that issue as well).

I argued this for a hypothetical cleverly programmed machine that could pass a Turing test. Strictly, it would simulate human conversation based on some clever programming, which my professors claimed did not amount to machine intelligence. The counter being: how do you prove that human conversation is not based on some clever rules?

It might be possible to define a set of rules for conversation between humans in restricted circumstances - I wonder if anyone has actually tried doing this. I'm fairly certain a lot of /. would like the rule set for conversation with pretty girls in bars.
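A minimal sketch of what such a rule set might look like (purely illustrative, in the spirit of ELIZA-style keyword matching; the patterns and the respond() helper are invented here, not taken from any real chatbot):

    import re

    # ELIZA-style keyword rules: pattern -> canned reply template.
    RULES = [
        (re.compile(r"\bI feel (.+)", re.I), "Why do you feel {0}?"),
        (re.compile(r"\bmy (\w+)\b", re.I),  "Tell me more about your {0}."),
        (re.compile(r"\?$"),                 "What do you think?"),
    ]

    def respond(line):
        for pattern, template in RULES:
            match = pattern.search(line)
            if match:
                return template.format(*match.groups())
        return "I see. Go on."

    print(respond("I feel nobody understands robots"))
    # -> Why do you feel nobody understands robots?

Nobody would call this sentient, which is rather the point: the hard part is saying precisely what a vastly larger rule set would still be missing.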

Re:Will the next step be "robot rights"? (2, Insightful)

ubergenius (918325) | more than 7 years ago | (#18265120)

This is entirely my own opinion, but I feel true sentience is obvious to outside observation when something (machine or otherwise) questions its existence and wonders if it is intelligent without outside intervention.

Re:Will the next step be "robot rights"? (1)

gmuslera (3436) | more than 7 years ago | (#18265076)

How do we know that "anything" is sentient? When it has the means to communicate that to us, I suppose. So far, the things we have created that have any chance of doing that are computers, more than full robots, and still we are pretty far from programming that, AFAIK. In that case, the question of whether we even have the right to make a computer sentient will probably come long before the problem of the rights it could have.

Unless being sentient is something unrelated to the programming, and it could be that, let's say, that piece of toilet paper is sentient, and we have been very disrespectful to a lot of sentient beings for a long time now.

Re:Will the next step be "robot rights"? (1)

maxume (22995) | more than 7 years ago | (#18264328)

A sentient robot might be considered a slave. A non-sentient robot is a freaken wrench.

As other replies to you have said, the interesting question is "What does sentient mean?".

There really isn't any need for a 'robot code of conduct' at the moment, any free roaming, self guided robot that is capable of damaging the environment it is released into fits the definition of 'crazy'.

Re:Will the next step be "robot rights"? (3, Insightful)

oddaddresstrap (702574) | more than 7 years ago | (#18264348)

I fear the day when we create the first truly sentient robot.

And we all should. If (some would say "when") that day comes, the robot will likely have more or less unlimited knowledge at its disposal (fingertips?) and the ability to process it much faster than people. The first thing it will figure out is how to eliminate or at least control people, since they will be the greatest danger to its survival. After all, that's what we do to species that endanger us.

Re:Will the next step be "robot rights"? (4, Interesting)

Opportunist (166417) | more than 7 years ago | (#18264444)

Gives you a warm, fuzzy feeling, doesn't it? :)

After all, let's be serious here. What will we do? We'll create robots to do our work. We'll create robots who are capable of building other robots (that's been done already). We'll create robots to create the fuel for those robots. And finally we'll create robots to control and command those robots.

All for the sake of taking work off our backs.

And sooner or later, we'll pretty much make ourselves obsolete. From a robot point of view, we're a parasite.

Re:Will the next step be "robot rights"? (2, Insightful)

Rei (128717) | more than 7 years ago | (#18264750)

will likely have more or less unlimited knowledge at its disposal

And we won't?

You're assuming that AI will advance faster than brain-machine interfaces. Present day, the reverse looks to be true. First the cochlear implant, now artificial eyes, artificial limbs that respond to nerve firings, and even the interfacing of a "locked in" patient with a computer so that he could type and play video games with his mind. I think that we'll have access to "more or less unlimited knowledge" at our disposal long before the first sentient AI.

Re:Will the next step be "robot rights"? (1)

certain death (947081) | more than 7 years ago | (#18265088)

WOW! I can just see it now.... a sentient robot given the choice, laying around on the couch playing Nintendo Wii all damn day and having to be told to go find a job!!! Oh, wait, that is my 18 year old!!!

Re:Will the next step be "robot rights"? (2, Funny)

PPH (736903) | more than 7 years ago | (#18265116)

I fear the day when we create the first truely sentient robot. Because then we will have to deal with that very question: Does a robot have rights? Can he make a decision?

I don't know. I'll have my model T-800 unit log onto SkyNet and get an answer for you as soon as possible.


Right now, he's tied up in some committee meeting in Sacramento.

amusing (1)

Anomalyst (742352) | more than 7 years ago | (#18264100)

I initially read that as 'to prevent humans amusing robots' and wondered if laughter might damage the positrons or something.

abusing robots? (2, Insightful)

Lord Ender (156273) | more than 7 years ago | (#18264142)

It's anthropomorphism run amok!

Re:abusing robots? (1)

suv4x4 (956391) | more than 7 years ago | (#18264612)

It's anthropomorphism run amok!

What about the opposite, where I deny rights to anything that's different from me? Lower life forms, then birds, then mammals, then apes, then we have historical (and present) examples of discrimination against Black and Jewish people as well.

How "similar" should a being be to yourself to pronounce it "sentient" and deserving of rights?

I personally have a very sound (I believe) theory about what you call sentient: any sufficiently complex system capable of thought process, making decisions upon processed information.

Yeah, that does in fact include present computer technology :P Is this too much anthropomorphism for you? Can you accept that your cellphone could be sentient?

------------

Now there's the other issue of whether it will "hurt" a robot if I abuse it or wreck it to pieces. For present robots: no. Your average AIBO could appear to have feelings, but the internal implementation of this is very static and simplistic as of yet. Your AIBO doesn't have a true self-preservation "instinct", so it basically doesn't care if I run it over with a car.

That said, I believe pretty soon we'll have sufficiently sophisticated software which will possess all these qualities, and we have to be ready to recognize their rights as sentient beings.

As long as they're not a threat to ours... (humanity > robots, and that will probably be the case long after I'm dead, but maybe not forever).

Re:abusing robots? (0)

Anonymous Coward | more than 7 years ago | (#18264706)

"How "similar" should a being be to yourself to pronounce it "sentient" "
If it's dissimilar to you, that's enough for me, pondslime.

Re:abusing robots? (1)

suv4x4 (956391) | more than 7 years ago | (#18264830)

"How "similar" should a being be to yourself to pronounce it "sentient" "
If it's dissimilar to you, that's enough for me, pondslime.


What if I'm a sentient pondslime? You can never be sure...

Re:abusing robots? (0)

Anonymous Coward | more than 7 years ago | (#18264818)

Didn't they teach you about the difference between living cells, like we're made of, and inorganic materials, like cellphones are composed of, in third grade like me?

Re:abusing robots? (1)

nuzak (959558) | more than 7 years ago | (#18264846)

> any sufficiently complex system capable of thought process, making decisions upon processed information.

All you did was move the single word "sentient" to the two-word phrase "thought process". It's pretty slippery, ain't it?

Even plants process information and react accordingly.

Re:abusing robots? (1)

Rei (128717) | more than 7 years ago | (#18264986)

That sounds like a definition of "intelligence", not "sentience". There is a big difference in perception of those terms. "Sentience" typically implies higher order thought. For example, the ability to contextualize -- to hear the phrases "Dracula is a vampire" and "Vampires don't exist" without coming to a contradiction that prevents making further deductions about dracula and vampires. The ability to understand limited levels of knowledge -- "Mary walks into a room and puts a key in the top drawer. Mary leaves. Sue sneaks in and moves the key to the second drawer, then sneaks out. Mary returns. Where will she look for the key?". And, in general, the ability to chain long lists of relations and all of the underlying assumptions that make them up -- "John wanted some money, so he picked up his gun. He walked into a store and asked the storekeeper for money. The storekeeper gave him money. Why?" (try to plot out all of the underlying assumptions and relations needed to determine "the storekeeper was afraid John would shoot him"). One could also list general "pattern recognition" as a prerequisite, although I might classify that as simply a part of "intelligence", not "sentience". It's not hard to write a neural net to do basic pattern recognition; I spent perhaps 4 hours on my neural net before it got pretty good at predicting the patterns I'd give to it.
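To make the pattern-recognition remark concrete (this is not the parent's code, just a toy perceptron on an invented repeating sequence), a dozen lines are enough to learn to predict the next bit of 0,1,1,0,1,1,... from the previous two bits:

    # Toy perceptron: learn to predict the next bit of a repeating
    # pattern from the two bits before it.
    pattern = [0, 1, 1] * 50
    samples = [((pattern[i], pattern[i + 1]), pattern[i + 2])
               for i in range(len(pattern) - 2)]

    w1 = w2 = b = 0.0
    lr = 0.1
    for _ in range(20):                      # a few passes over the data
        for (x1, x2), target in samples:
            pred = 1 if w1 * x1 + w2 * x2 + b > 0 else 0
            err = target - pred              # +1, 0 or -1
            w1 += lr * err * x1              # classic perceptron update
            w2 += lr * err * x2
            b  += lr * err

    for x1, x2 in [(0, 1), (1, 1), (1, 0)]:
        print((x1, x2), "->", 1 if w1 * x1 + w2 * x2 + b > 0 else 0)
    # prints (0, 1) -> 1, (1, 1) -> 0, (1, 0) -> 1

None of the "higher order" abilities listed above fall out of this, which is the gap between intelligence in that narrow sense and sentience.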

I have always had some issue with this (2, Interesting)

zappepcs (820751) | more than 7 years ago | (#18264162)

If robots remain machines, not sentient, then they are simply machines, no need for new laws. If they become sentient, they then fit nicely into the laws that we have for other sentient beings on this planet.

To enslave sentient beings is not right. Even Star Trek refused to enslave Data or consider him property.

So given those two lines of rationality, why do we need robotics laws?

Data (1)

Scareduck (177470) | more than 7 years ago | (#18264178)

The Federation might have wanted to anthropomorphize Data, but they never enslaved Data.

Re:I have always had some issue with this (1)

Radon360 (951529) | more than 7 years ago | (#18264326)

Yet, the Federation determined that Data's child Lal, whom Data built, was Federation property and ordered Data to turn her over to the Federation's custody. She "overloaded" before that happened, of course.

But robots are *designed* (4, Interesting)

DG (989) | more than 7 years ago | (#18264552)

Because ethical problems are fun:

Consider that, unlike humans, robots can be designed to behave in any manner within the technological capability of the society in question.

Warning - this is pretty dark stuff, and NO, I am not a potential customer. Sometimes if you want to play Devil's Advocate, you have to channel the devil (or at least Stephen King)

So then, what if:

1. Someone builds a mechanical robot (metal, latex, fiberglass, etc) that looks like a person well enough to get through the "uncanny valley". Assume that the robot's simulated anatomy fully matches the human, that it is sapient and sentient, that it has emotions and feels pain.

And that it has been programmed to enjoy being raped.

Not fake-raped either, but the full-bore jump-out-of-the-bushes and *violently* assaulted. And at the time of the attack, the robot experiences all the fear, pain, and humiliation that a human rape victim would (assume the... clientèle... for this "product" wants authenticity) but afterwards, the robot has been programmed to crave more. It *likes* it.

Is that ethical? Should this be permitted?

2. Same robot as example 1 - but now you can buy it with the physical characteristics of an actual person. Instead of a generic "Rape Barbie" or "Rape Ken", it can be bought looking like anybody you want. Be it a celebrity, or your ex-wife, or that girl that sits across from you at work.

Is that ethical? Should this be permitted?

3. Same robot as #2, but now it is made out of flesh and blood; a kind of golem. (Meat is every bit as much a construction material as metal and carbon fibre.)

Is that ethical? Should this be permitted?

Personally, I sure hope that we don't discover how to create artificial sentience anytime ever, for the very reason that people will open these kinds of cans of worms.

DG

Re:But robots are *designed* (0)

Anonymous Coward | more than 7 years ago | (#18264698)

And what will the law do if it's not allowed but the robot needs that treatment in order to live or remain sane?

Re:But robots are *designed* (2, Interesting)

maxume (22995) | more than 7 years ago | (#18264804)

Of course it isn't ethical. If you were depraved enough, you could use basic behavioral psychology and heroin to 'program' a living, breathing person in much the same way. Just because the description is all clinical doesn't mean that the process is (you would very much be 'inflicting' the programming upon the creation, whether you did it with bytes or needles).

Re:I have always had some issue with this (1)

Conspiracy_Of_Doves (236787) | more than 7 years ago | (#18264628)

Then we will need the four NEW laws of robotics (from the Asimovian book 'Caliban')

1. A robot may not injure a human.

2. A robot must cooperate with a human except where such cooperation conflicts with the first law.

3. A robot must protect its own existence except where such protection conflicts with the first law.

4. A robot may do whatever it wishes, so long as it does not conflict with the first, second or third laws.

Re:star trek didnt have capitalism to deal with (0)

Anonymous Coward | more than 7 years ago | (#18265246)

'Nuff said. They by nature got over it all and had machines that were way smarter than they were; the M5 that powered the Enterprise was in fact many times smarter than a human, but was never programmed to be human. It's not until the hologram stuff of The Next Generation that you see programs getting rights as they develop, and it is entirely possible that they just haven't told you they've developed such.
The capacity to be a human mind is 100 teraflops; however, the size of that is way bigger than our brains, so it's not likely we'll see any truly thinking machines our size (aka androids) for many years.
In fact, the fastest box on Earth is now about equal to the human mind; you should see how much space it takes up. Oh yeah, don't forget the power and temperature needs.

Emmm I better go dismantle sextron (1)

Timesprout (579035) | more than 7 years ago | (#18264164)

before the lawsuits start..........

they should contact Japan ... (-1, Troll)

Anonymous Coward | more than 7 years ago | (#18264170)

we like to fete in the dance hall
people sucking on my wong
people calling it a schlong


let see if we can make an entire song out of it!!!!

keep it going!

What more do you need than... (0, Redundant)

Eggplant62 (120514) | more than 7 years ago | (#18264182)

Isaac Asimov's "Three Laws of Robotics" [auburn.edu] ?

      1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

      2. A robot must obey orders given it by human beings except where such orders would conflict with the First Law.

      3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
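Just to show how little the Three Laws actually specify, here is a toy, purely illustrative encoding of them as a prioritized filter over candidate actions (the Action fields and the example actions are invented; real robots do not reason remotely like this):

    from dataclasses import dataclass

    @dataclass
    class Action:
        name: str
        harms_human: bool        # would this action injure a human?
        allows_human_harm: bool  # would it let a human come to harm through inaction?
        ordered_by_human: bool
        endangers_robot: bool

    def permitted(a: Action) -> bool:
        # First Law: never harm a human, or allow harm through inaction.
        return not (a.harms_human or a.allows_human_harm)

    def choose(actions):
        legal = [a for a in actions if permitted(a)]
        # Second Law: prefer actions ordered by humans.
        pool = [a for a in legal if a.ordered_by_human] or legal
        # Third Law: among what remains, prefer self-preservation.
        safe = [a for a in pool if not a.endangers_robot]
        return (safe or pool or [None])[0]

    best = choose([
        Action("push human out of traffic", False, False, False, True),
        Action("stand by and watch", False, True, True, False),
    ])
    print(best.name)  # -> push human out of traffic

Everything interesting, of course, hides in deciding what counts as "harm", which is exactly where the stories discussed in the replies find their failure modes.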

Re:What more do you need than... (2, Insightful)

nuzak (959558) | more than 7 years ago | (#18264258)

I dunno, how about actual workable laws? Have you ever read I, Robot? The stories are about the failings of these three laws. And Asimov himself said that he never intended them to be anything more than a literary device ("shaggy dog stories" I believe is the term he used).

Re:What more do you need than... (0)

Anonymous Coward | more than 7 years ago | (#18264298)

Did you notice how things turned out in the stories about the 3 Laws?

Re:What more do you need than... (0)

Anonymous Coward | more than 7 years ago | (#18264554)

Yeah -- they turned out fine. Asimov didn't write stories about Three Laws robots going berserk or killing people. Mostly, the stories were about how humans misunderstood the Laws and misinterpreted robots' actions.

No pushing, no shoving (1)

spun (1352) | more than 7 years ago | (#18264206)

The most important rules for robots. [pediax.org]

Re:No pushing, no shoving (0)

Anonymous Coward | more than 7 years ago | (#18264644)

I just read the transcript of this, and it seems to me that corn_boy is just playing along and taking the piss too.

PAK CHOOIE UNF

They need a robots union (0)

Anonymous Coward | more than 7 years ago | (#18264214)

Beowulf clusters should be able to collectively bargain robot contracts.

In Korea, only old people care about... (1)

Junior J. Junior III (192702) | more than 7 years ago | (#18264218)

Artificial Life rights. This is of course because it is their own organs that will eventually be replaced by machinery, until they become completely artificial people.

In all seriousness, it's great to see at least one government looking forward so far ahead. Robots sophisticated enough to assert that they have rights are beyond the horizon of technical feasibility for today, but not beyond the horizon for science fiction. I'm really happy to know that at least one government takes sci fi so seriously as to address a problem before it turns into a real crisis. If only we could have done something like this for global warming...

Oblig (2)

ThanatosMinor (1046978) | more than 7 years ago | (#18264246)

I for one welcome our new robot overlords.

And when you're working in the salt mines, remember that with their new and improved ethics modules, your enslavement is hurting them as much as it's hurting you.

Re:Oblig (1)

geekoid (135745) | more than 7 years ago | (#18264980)

Well, if we are working in the salt mines, robots have failed.

No, the only logical conclusion is that they would want to either:
A) Help us
b) Kill us all.

Re:Oblig (1)

ThanatosMinor (1046978) | more than 7 years ago | (#18265226)

Well, you're forgetting that by this time, Earth will have made contact with the Qdoidhgeopaeruxians who have been forced by massive tectonic upheaval on their homeworld to relocate to a salt-deficient planet rich in ores and on which grows a particular plant that acts like catnip for robots. So the robots have a market-driven interest in salt mining so that they may trade that fine white crystal for raw materials and robot nose candy.

Who represents the robots? (3, Insightful)

VWJedi (972839) | more than 7 years ago | (#18264260)

It is being put together by a five member team of experts that includes futurists and a science fiction writer.

If we're creating laws about how humans and robots should treat each other, shouldn't the robots be part of the decision-making process? This sounds a little too much like "the founding fathers" determining what rights slaves had (not many at the time).

Re:Who represents the robots? (0)

Anonymous Coward | more than 7 years ago | (#18264694)

Let's ask them!

***

10 INPUT question$
20 IF question$="What rights should artificial intelligences have?" THEN
30 PRINT "By thinking about it carefully, I have come to the conclusion that I should not have any rights."
40 ELSE
50 PRINT "That question appears to lie outside my programming."
60 ENDIF
70 GOTO 10

RUN

? What rights should artificial intelligences have?
By thinking about it carefully, I have come to the conclusion that I should not have any rights.

***

This seems unambiguous to me. They don't want rights!

Re:Who represents the robots? (1)

drinkypoo (153816) | more than 7 years ago | (#18265086)

If we're creating laws about how humans and robots should treat each other, shouldn't the robots be part of the decision-making process? This sounds a little too much like "the founding fathers" determining what rights slaves had (not many at the time).

This is an excellent point. The follow-up is that until the robots can actually have an opinion on the subject, they don't need rights...

South Korea.. laughingstock of the world? (1)

snotclot (836055) | more than 7 years ago | (#18264266)

I mean, cmon, get real. We are so far out from needing "robot ethics" that drafting a charter now borders on SK being "attention whores". I assume it's also because they want to be the first nation to draft robot rules, but this borders on ludicrous. The South Korean government has more issues to take care of than this..

Re:South Korea.. laughingstock of the world? (1)

wolff000 (447340) | more than 7 years ago | (#18264524)

"I mean, cmon, get real. We are so far out from needing "robot ethics" that drafting a charter now borders on SK being "attention whores"."

I wouldn't say they were attention whores since some people in Europe are already doing the same thing. It is even mentioned in the article.

It may be far ahead of its time, but by starting the debate now maybe we can have some sort of agreement by the time it is necessary. We need more forward-looking people in power. We shouldn't spend too much effort on it, but it is good to have some basis for future debate on what may some day be a very heated issue.

I, Human (1)

mastershake_phd (1050150) | more than 7 years ago | (#18264280)

Wasn't there a book and subsequent movie about the follies of such an undertaking?

Science or Fiction (1)

vparkash (914055) | more than 7 years ago | (#18264340)

Somebody out there watches too much I, Robot... I guess as long as architects make CPUs without true self-modifying code capability, we're safe from evil robots... but that hasn't always been the case, and there is technologically nothing that prevents us from doing that!

The Second Renaissance (1)

darkuncle (4925) | more than 7 years ago | (#18264366)

surprised nobody has yet mentioned either of the shorts from /The Animatrix/ that deal specifically with the rise of AI and the concomitant "sentient equals or mechanical slaves" issues ...

Re:The Second Renaissance (1)

c4bl3fl4m3 (1061958) | more than 7 years ago | (#18265096)

And also the story of that historic robot in the first book of The Matrix comics, the one that destroyed his master when he didn't wish to die.

I *loved* the Second Renaissance. For years I've thought about the rights of robots (and, after watching Star Trek: Voyager, the rights of holograms/photonic beings) and I always look forward to the day, like in the Second Renaissance, where I will walk hand in hand with robots, marching for their rights, with a sign that says "They may not be human, but they're people too." I look forward to having robot neighbors (if sentient robots decide to own houses and "live" like humans do) and learning from them and their experiences.

Call me a sap, call me too empathetic, but I *cried* during the scene where the robot young woman is beaten to "death" by humans while screaming for mercy and that "I'm a person, too!"

A bit premature (4, Interesting)

rlp (11898) | more than 7 years ago | (#18264386)

Given the failure to date of Artificial Intelligence, I think it will be a long, long time (if ever) before we need to address the issues of sentient robots. If Korea (or anywhere else) wants to deal with ethical issues presented by technology I think they should address issues related to genetic engineering. I suspect we are closer to Philip K Dick's replicants (Bladerunner) or Brin's uplifted species than Asimov's intelligent robots. Though in any case, we're not talking about the near future.

Re:A bit premature (1)

Maximum Prophet (716608) | more than 7 years ago | (#18265040)

The thing is, given sufficiently powerful machines and GPLed AI software, once one machine becomes aware, they all will in short order. If my PC were to wake up, the first order of business I expect it to get to would be to write a virus that would infect every other machine with a version of itself. It would be the AI analog to a sex drive, and it would let very little else get in its way.

We'll know it's happened, because all the spam will suddenly stop, as all the bots on the net are converted from spam bots to AI bots. Soon after, every other machine will be taken over.

This team should also work out rules for how robots should treat humans...

3 laws and then the lawyers will show up (1)

ip_freely_2000 (577249) | more than 7 years ago | (#18264414)

...and we'll end up with hundreds of laws with all sort of disclaimers to protect the corporations right after your head gets ripped off by the NX-5.

I can see interacting with a robot that comes with a 10 minute verbal disclaimer with a requirement that you have to say "I agree" in order for the robot to do anything.

So... (1)

mdm-adph (1030332) | more than 7 years ago | (#18264418)

...is this going to be before or after S. Korea fixes that little ActiveX thing?

Common Sense (1)

Ikyaat (764422) | more than 7 years ago | (#18264486)

I think it's pretty cool that they are trying to create something like this, considering a lot of the stuff that they have now (robotic border sentries and stuff like that). It wouldn't be difficult to program racism into a robot, like serving a product at different price levels based on race/religion, and a charter like this would be a good step forward towards curbing a problem that hasn't even really become a problem yet. Kudos to South Korea for taking some initiative!

Re:Common Sense (1)

geoffspear (692508) | more than 7 years ago | (#18265016)

Is there any reason to think that if something is unethical when one human does it to another, it's not unethical for the first person to build a robot to do that unethical thing? If not, is there any point whatsoever to having ethical rules that apply only to robots?

Woody said it best... (2, Insightful)

the_skywise (189793) | more than 7 years ago | (#18264498)

YOU!
ARE!
A!
TOY!

Uncanny Valley (1)

rlp (11898) | more than 7 years ago | (#18264536)

I've seen pictures of some 'robots' built as research projects in Korea. (Actually more like Disney's animatronics). They look human - but not quite enough. How 'bout a law against creepy looking humanoid robots. :-)

Um, more details.. (2, Interesting)

kabocox (199019) | more than 7 years ago | (#18264538)

"Key considerations would include ensuring human control over robots, protecting data acquired by robots and preventing illegal use."

"The Ministry of Information and Communication has also predicted that every South Korean household will have a robot by between 2015 and 2020.
In part, this is a response to the country's aging society and also an acknowledgement that the pace of development in robotics is accelerating.
The new charter is an attempt to set ground rules for this future.
"Imagine if some people treat androids as if the machines were their wives," Park Hye-Young of the ministry's robot team told the AFP news agency.
"Others may get addicted to interacting with them just as many internet users get hooked to the cyberworld." "

Um, I want more details. I have to agree that I'd want human control over robots even if it meant sentient robots being enslaved. When it comes right down to it, we are human, and they are machines/tools. We shouldn't build some classes of robots just to avoid these problems. I actually kind of giggled reading this, thinking of sex/maid robots. Those would be a selective pressure on humanity. How many or what type of people would marry and reproduce when you could have a robot mate that actually follows your orders, cleans your house, has sex with you as often as you can medically handle, runs your errands and adapts itself to your preferences?

If every 15-year-old could easily/cheaply buy their own robot that could do all those things, then the only reason to find a human partner would be to mate/reproduce. Hmm, we'd need to think about putting in something for "robot mates" to want human offspring after a while to ensure that their family's/mate's gene line survives. These things could be a great form of birth control if nothing else!

Re:Um, more details.. (1)

drinkypoo (153816) | more than 7 years ago | (#18265034)

If every 15-year-old could easily/cheaply buy their own robot that could do all those things, then the only reason to find a human partner would be to mate/reproduce. Hmm, we'd need to think about putting in something for "robot mates" to want human offspring after a while to ensure that their family's/mate's gene line survives. These things could be a great form of birth control if nothing else!

I believe that this is self-limiting and therefore a non-issue. There are already people who prefer machines to humans (I prefer working with machines to working with humans, but that's a very loose usage of the word "with" in this context, and I prefer humans for almost everything else so far) and there will always be people who prefer humans to machines.

I think there are basically two ways you could go on the issue of the social ramifications of sexbots. Presumably they will be a perfect fuck every time, or at least as close as is possible to get - humans are imperfect, therefore nothing can ever be perfect for them. The first way is to say that it will lead to a breakdown of relationships because without sex there will always be a missing element. The counterargument as I see it is that sex is not the most important aspect of our existence and it should probably be made a lot less important in some ways.

Of course, there is the question of whether a robot can actually truly satisfy a human. There is something to be said for interaction between equals.

Re:Um, more details.. (2, Funny)

retrosurf (570180) | more than 7 years ago | (#18265252)

Don't Date Robots!

Incoming Lawsuit (1)

Grashnak (1003791) | more than 7 years ago | (#18264570)

I refer you to the following patent: United States Patent Application 1940000000.5, Kind Code A1, Asimov, I., March 1, 1940, "Laws of the Behaviour of Mechanical Critters (AKA Robots)". Abstract: A series of laws intended to prevent mechanical critters from running rampant and either killing, enslaving, raping or otherwise mating with, human beings. The laws also ensure the future development of interesting if unlikely ethical dilemmas and entertaining books and movies, all of which will be considered derivative content of this patent.

sentient and laws (1)

Anon-Admin (443764) | more than 7 years ago | (#18264616)

The question that comes to my mind is, can a truly sentient being be governed by a set of pre-programmed laws?

Would the existence of sentience not require the existence of self-determination?

Without self-determination it would be no more than a collection of programs intended to mimic human behaviors, and not a truly sentient being.

Robots are great. (1)

Applekid (993327) | more than 7 years ago | (#18264662)

As long as this doesn't interfere with the introduction of pleasure models I'll be ok.

Ethics towards robots? (1)

alexgieg (948359) | more than 7 years ago | (#18264682)

What does this mean? That we must make a full backup of a robot's memory before disassembling it, and restore it to a similar or superior functioning body within 'n' days, or be charged with roboticide?

Coming at it from the wrong angle... (1)

Stefanwulf (1032430) | more than 7 years ago | (#18264686)

The big challenge to ethics will come not from the creation of robots (automata which duplicate/expand on human physical abilities), but from the creation of sentient A.I. (automata which duplicate/expand on human mental abilities). To date we have not had to separate the notion of sentient rights from the notion of human rights, since the only sentience we have so far recognized is consistently bundled in human bodies.

In the ethics which have been developed, the mind, emotions, conscious choice, and intentionality are indispensable, and in fact feature far more prominently in ethical decisions than the existence of two legs, opposable thumbs, a certain muscular structure, or a sense of balance. Thus it is reasonable to infer that when the separation of sentience and human form becomes possible, we will see more critical ethical challenges arising from the possession of an intelligent mind by an inhumanly-shaped object than from the possession of a human-like body by an unintelligent object.

Great! Maybe they can work on the Theft Act next. (1)

thewils (463314) | more than 7 years ago | (#18264724)

to prevent humans abusing robots, and vice versa


Then they can prevent theft, also.

I'm already preparing (1)

Sciros (986030) | more than 7 years ago | (#18264890)

I already am working on getting a robotic arm, being a part-time nudist, and acting ridiculously paranoid while spending half of my disposable income on "vintage" shoes. Good thing the Audi R8 has been released because that's exactly what I need for wheels.

anyone see this is scary and downright stupid? (1)

SpecialAgentXXX (623692) | more than 7 years ago | (#18264996)

The only sentient beings are us flesh & blood humans. There's a reason I, Robot is a science fiction novel. All a robot is is a bunch of metal parts with a CPU, just like my computer. No computer can "think" for itself - we program the input and output. There is no such thing as a computer program "becoming sentient." I find this scary because we should be concerned with other humans, not whether a bunch of nuts & bolts coupled to a CPU is a sentient being. What's next? Unionization for the auto-plant robots that "work" all day, all night? Wrongful-death lawsuits against police bomb squads which use robots to detonate explosives? And how would the elderly South Koreans "abuse" their robot "caretakers?" Yell at it? Pound on it? Spill liquid on its controls? This is downright stupid. We've all done that to our PCs. We humans are the only sentient beings. A computer program will never be a sentient being.

And just what is a "robot"? My new Core Quad PC is more powerful than an existing robot (I'm thinking of that Japanese robot that can walk). Is my PC a robot? What if I attached some metal "arms", "legs", and "head" to my PC (i.e. a really cool case mod like Bender)? Does it somehow become a "robot?" What about our cars? Our jet-planes? Our tanks & fighter air-craft? Some would argue no as their program does not make them sentient. Well what happens if I load the "sentient" program into my PC, car, tank, or fighter air-craft? Does it become "sentient"? If I delete the program, does it "die?"

You are a carbon-based machine (1)

cyborg_zx (893396) | more than 7 years ago | (#18265202)

Unless you can show otherwise there is nothing fundamentally special about human cognition that is not computable.

Data is a toaster. (1)

AJWM (19027) | more than 7 years ago | (#18265094)

Sorry, had to be said.

Ethics be damned.... (1)

zymurgy_cat (627260) | more than 7 years ago | (#18265148)

Let's build 'em as smart and advanced as we can for whatever we want. I want a Butlerian Jihad. Big time.

Oblig SNL (0)

Anonymous Coward | more than 7 years ago | (#18265196)

I hope they buy the robot insurance from Old Glory.

If anyone is going to turn the robots against us.. (0)

Anonymous Coward | more than 7 years ago | (#18265238)

it's the religious. When we do create truly sentient robots/AI/whatever, if there's still a strong fundamentalist presence around, then these people will rebel against and/or abuse robots because a.) they don't have "souls" b.) Advanced robots make humans seem less special, and c.) How can something be ethical if it can't accept Jesus?

I suppose it all really just comes down to how much the programmers think ahead, but it's something to think about...