
How To Change U.S. Laws To Promote Robotics

Soulskill posted about 4 months ago | from the just-worry-about-the-kill-all-humans-part dept.

Robotics 118

An anonymous reader writes "A law professor says the U.S. could fall behind in the robotics race if we don't change product liability law. A new op-ed over at Mashable expands upon this: Yet for all its momentum, robotics is at a crossroads. The industry faces a choice: one that you see again and again with transformative technologies. Will this technology essentially be closed, or will it be open? ... What does it mean for robotics to be closed? Resembling any contemporary appliance, closed robots are designed to perform a set task. They run proprietary software and are no more amenable to casual tinkering than a dishwasher. Open robots are just the opposite. By definition, they invite contribution. They have no predetermined function, run third-party or even open-source software, and can be physically altered and extended without compromising performance. Consumer robotics started off closed, which helps to explain why it has moved so slowly."



The robotic ecology (2)

Taco Cowboy (5327) | about 4 months ago | (#45841685)

Why is there such concern over whether a robot is "closed" or "open"?

Both the "closed" and the "open" versions of robotics have their own ecological spheres, just like the ones in the software field.

We have closed, proprietary software and we have open-source software, and we have some that overlaps both camps.

Each side has its own (sort of) evolutionary scheme, and each side has its own strengths and weaknesses.

Why can't robotics be the same?

If someone decides to turn their robots into something like a dishwasher, hey, that's their choice; let them.

After all, this is a free world.

But if someone decides that they want their robots to gain more input / feedback / tuning / additions from the userland, then they make their robots the "open" kind, so that their robots can "grow", "mutate", and "evolve" into new fields that the original inventor couldn't even begin to fathom.

robotics primary purpose (1, Interesting)

iggymanz (596061) | about 4 months ago | (#45839257)

The primary purpose and largest market for robotics will be weapons. It will thus, of course, be mostly closed source. You're either on the gravy train of the military-industrial complex or you're ballast under the tracks.

Re:robotics primary purpose (5, Insightful)

icebike (68054) | about 4 months ago | (#45839291)

the primary purpose and largest market for robotics will be for weapons.

That, or manufacturing. Some (most) robotic assembly plants already aren't safe for humans.

In either case, changing product liability laws is EXACTLY the wrong thing to do.

A "product" is not the place for hackers and experimenters. You can build anything you want in your basement or maker shed, but if you want to build a product for sale, you better have some strict testing and insurance.

Re:robotics primary purpose (2)

Anonymous Coward | about 4 months ago | (#45839329)

Agreed. But then again, it's not straightforward how to extend liability laws to Amazon delivery drones or driverless cars. I don't see that being permissive with these extensions would necessarily help robotics be more "open". That's a separate issue. But I do think that it would help pave the way for killer robot apps (not: killer robots) if legislators didn't prohibit their application from day one.

Re:robotics primary purpose (3, Insightful)

icebike (68054) | about 4 months ago | (#45839789)

But then again, it's not straightforward how to extend liability laws to Amazon delivery drones or driverless cars.

You don't have to "expand" the laws, they already apply.

You just have to prevent boneheads like those in TFA from limiting liability for things like Amazon's scheme.

Re:robotics primary purpose (0)

Anonymous Coward | about 4 months ago | (#45844051)

Agreed. But then again, it's not straightforward how to extend liability laws to Amazon delivery drones or driverless cars.

I don't know how it works in the U.S., but in most of the EU the operator would be liable if an accident happens. Amazon in that case can claim that the drone was malfunctioning and try to hold the drone developer/manufacturer liable. The developer/manufacturer will then have to provide documentation proving that the drone was developed to the proper machine-safety standards.
Typically the documentation for the drone will have some clause stating that case A, B, or C may not occur during proper operation and that Amazon may not use the drone under such circumstances. Amazon will use the drone that way anyway, since it would be useless to them otherwise. They will then pay for whatever damages the drone causes and continue to use the drones, since they are still cheaper and cause less damage than regular drivers.

Re:robotics primary purpose (1)

iggymanz (596061) | about 4 months ago | (#45839363)

But this article is about the USA. Manufacturing consumer products with robots largely won't be done here. But making robot weapons? Yes, that will be done here.

Re:robotics primary purpose (1)

icebike (68054) | about 4 months ago | (#45839471)

but this is article about the USA. the manufacturing by robots for consumer products largely won't be done here. but making robot weapons, yes, that will be done here

Really: http://www.theepochtimes.com/n2/images/stories/large/2011/01/02/97967037.jpg [theepochtimes.com]

Re:robotics primary purpose (1)

Jah-Wren Ryel (80510) | about 4 months ago | (#45839705)

this is article about the USA. the manufacturing by robots for consumer products largely won't be done here.

Robot manufacturing will bring manufacturing back to the US because it drastically reduces shipping costs. The US has tons of natural resources to support manufacturing, and robots cost about the same to operate no matter where they are in the world, so shipping will be the primary area of cost reduction.

Of course nobody will have a job so they won't be able to buy anything, but that's somebody else's problem...

Re:robotics primary purpose (2)

jythie (914043) | about 4 months ago | (#45840463)

Heh. In all seriousness, one of the biggest issues we face is the mythology that someone else's problem is just someone else's problem, and how often other people's problems become our own in subtle ways.

Re:robotics primary purpose (0)

Anonymous Coward | about 4 months ago | (#45840531)

Wrong, dead wrong. It costs less to ship something halfway around the world by ship than it does to ship it halfway across the USA by train, and trains are way cheaper than trucks. Robot manufacturing will help equalize manufacturing costs between countries with large differences in labor costs, but that's it.

Re:robotics primary purpose (1)

Anonymous Coward | about 4 months ago | (#45840783)

Your math sucks.

Manufactured in China means shipping from China to a US port via boat, then shipping from the US port to locations across the country via ground.

Manufactured in the US means shipping from the factory to locations across the country via ground.

In either case, you still have roughly the same amount of ground shipping.
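The grandparent and parent are arguing over a cost comparison that is easy to make explicit. A minimal sketch, where every per-unit cost is invented purely for illustration (real ocean and ground freight rates vary wildly):

```python
# Hypothetical per-unit shipping costs in dollars (invented for
# illustration; real ocean and ground freight rates vary wildly).
ocean_china_to_us_port = 0.50     # factory in China -> US port, by ship
ground_port_to_stores = 2.00      # US port -> stores nationwide, by ground
ground_factory_to_stores = 2.00   # US factory -> stores nationwide, by ground

made_in_china = ocean_china_to_us_port + ground_port_to_stores
made_in_usa = ground_factory_to_stores

# The ground leg is roughly the same either way; the cheap ocean leg is
# the only difference, so relocating production saves little on shipping.
print(made_in_china - made_in_usa)  # 0.5
```

Under these assumed numbers the grandparent's point holds: whether shipping matters at all depends entirely on how the cheap ocean leg compares to the shared ground leg.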

Re:robotics primary purpose (3, Informative)

umafuckit (2980809) | about 4 months ago | (#45839561)

None of this is really what the article is about, though. The thesis is simply that manufacturers of open robotics platforms (which are out there right now) should not be legally responsible for what people do with those platforms. The argument is that making them liable will reduce the pace of innovation.

Re:robotics primary purpose (3, Interesting)

icebike (68054) | about 4 months ago | (#45840779)

None of this is really what the article is about, though. The thesis is simply that manufacturers of open robotics platforms (which are out there right now) should not be legally responsible for what people do with those platforms. The argument is that making them liable will reduce the pace of innovation.

But again, this is a non-issue.

You buy a chainsaw direct from the manufacturer, and that manufacturer is in no way responsible when you murder someone and chop them up with the saw to dispose of the body. Anti-gun people have been routinely rebuffed by the courts when trying to sue gun manufacturers because someone used their products to commit murder. Nobody holds an automaker responsible when someone intentionally uses their vehicles to commit crimes.

The law and the courts are already pretty good at affixing blame, and in spite of the deep-pocket horror stories, these tactics of going after the upstream manufacturer virtually never work in the real world.

Re:robotics primary purpose (1)

umafuckit (2980809) | about 4 months ago | (#45842009)

But again, this is a non-issue.

I too would have thought it was a non-issue, but I know little about the field. Presumably the author of the essay has some reason for thinking litigation is a concern in open robotics.

Re:robotics primary purpose (1)

rtb61 (674572) | about 4 months ago | (#45843671)

The real question, then, is how easily innovative do you want the assassin bot with questionable legal responsibility to be? By questionable I mean: is the original manufacturer responsible for the insertion of the applicable code, is the owner responsible, is a hacker responsible, or is an unmentionable government agency that shows wilful intent for criminal activity responsible? The applicable code in this case being the code that redirected the robot from its assigned task and instead targeted a person, hitting them over the head with a blunt object until that head was no longer a recognisable object to be hitting with a blunt object. Of course that's quite a leap for robotics, yet the question remains: who is liable for a robot's actions, and how do you prove it, especially when the government of the day runs a government department as a criminal organisation with the specific intent of breaking into and taking control of computer systems (a benefit-of-the-doubt question)?

Re:robotics primary purpose (4, Insightful)

Immerman (2627577) | about 4 months ago | (#45839759)

I don't know; it seems like this is a fairly complicated question. It might be worth at least formally clarifying some boundaries.

Let's say we have industrial robots designed specifically to be user-programmable, as I believe most of them are. If there is a defect in the hardware that causes an accident, then the company making the hardware is at fault. If, however, it was a defect (or intentional nefariousness) in the user programming, then it is clearly the programmer who is at fault, not the hardware manufacturer.

In the case of autonomous robots, be they car/drone/cyborg/whatever, I think the same logic would reasonably apply: if you use the built-in control systems and they malfunction in a way that damages someone or something, then the manufacturer is at fault, but if the damage can reasonably be traced to the orders it was following, then it's the person giving the orders who is at fault. Lots of grey area in there, though. What if a flying drone is ordered into an area where the winds are too strong to operate safely, and it gets into a damaging collision? Should the company have been required to actively notice and avoid unsafe wind conditions? What if the wind is gusty and there was insufficient prior warning to have reasonably escaped the "danger zone"?

Perhaps a special provision for liability transfer should be considered for autonomous systems, seeing as, with a sufficiently wide deployment, accidents are inevitable, and the people best suited to make and improve the systems are not necessarily motivated to do so if they have to swallow the costs of those inevitable accidents. We could perhaps arrange for some liability transfer, where the systems are sold as fit for use only under certain restricted conditions where the risks are reduced to acceptable levels, and the operator must accept at least partial liability to operate them in any other setting. An autonomous industrial robot may have a wonderful market in a controlled factory setting, but it may also have great uses operating in public. If the manufacturer is required to accept liability for the second scenario, then it will likely take far longer before they're willing to release them for the first one. And if we outright ban the second scenario, then we'll deprive ourselves of the discovery of all sorts of potential new uses.

Perhaps we could do something simple, like requiring users to carry comprehensive liability insurance in order to operate an autonomous system outside of its specified environment, much as we do to allow the operation of most any other dangerous machinery in public. The use of customized software, open source or otherwise, would no doubt have an impact on the insurance premiums, but companies would be free to stand by their product and offer such insurance themselves, at the price they believe is justified.

Of course, enthusiast-driven open projects would be hit hard. I imagine the premiums to operate potentially dangerous uncertified autonomous systems could be prohibitively high, but the only alternative I see is to allow enthusiasts to endanger the public without consequence.

Re:robotics primary purpose (4, Insightful)

icebike (68054) | about 4 months ago | (#45840861)

In the case of autonomous robots, be they car/drone/cyborg/whatever, I think the same logic would reasonably apply - if you use the built-in control systems and they malfunction in a way that damages someone/thing then the manufacturer is at fault, but if the damage was reasonably traced to the orders it was following, then it's the person giving the orders that's at fault.

These situations are already handled under current law.
If YOU use the built-in control systems, YOU are predominantly responsible. It's going to be up to YOU to prove the product was defective.

No changes in the law are needed to exempt OEMs from responsibility here. A bazillion car analogies suggest themselves, from sticking accelerators to faulty on-board computers. And if the on-board computers fail in specific circumstances they were warranted to handle, the vehicle manufacturer can pursue a claim against the computer manufacturer.

There is no reason to build air-gaps in the law to protect upstream suppliers, because the burden of proof is well established in current law.

Juries (2)

Etherwalk (681268) | about 4 months ago | (#45841285)

I don't know; it seems like this is a fairly complicated question. It might be worth at least formally clarifying some boundaries.

Let's say we have industrial robots designed specifically to be user-programmable, as I believe most of them are. If there is a defect in the hardware that causes an accident, then the company making the hardware is at fault. If, however, it was a defect (or intentional nefariousness) in the user programming, then it is clearly the programmer who is at fault, not the hardware manufacturer.

And the decision as to who was at fault will ultimately be made by a lay jury...

English is good enough (1)

Brett Buck (811747) | about 4 months ago | (#45839777)

"Maker Shed"? Really? People have been using the word "workshop", in some variant, for probably 1000 years. You don't need to make up a new phrase for it.

Re:English is good enough (1)

icebike (68054) | about 4 months ago | (#45839825)

When your wife demands the spare bedroom back and the only place left to go is the tool shed out back, calling it a workshop hardly makes the banishment easier to accept.

Re:English is good enough (0)

Anonymous Coward | about 4 months ago | (#45840293)

When your wife demands the spare bedroom back and the only place left to go is the tool shed out back, calling it a workshop hardly makes the banishment easier to accept.

That's when the robots come in ...

It will all be a terrible accident of course.

But somehow the home-made robot you built in the tool shed ends up killing the bitch, oops ... I mean the wife.

Robots are looking better all the time, are they not, my friends ?

Whoops! (0)

Anonymous Coward | about 4 months ago | (#45841533)

I love my wife, but I'm not responsible for the actions of my hacked Roomba!

Re:robotics primary purpose (0)

Kohath (38547) | about 4 months ago | (#45839781)

Nah. Just skip it. Sure, it's a great product, but so what? If you need to invest 50 times your life savings in testing and insurance and lawyers and regulatory compliance, then why bother even trying?

All "products" should be designed by big corporations, with lawyers, quality assurance departments, and government-certified regulatory compliance divisions.

Hackers and experimenters? What law school did they graduate from? They'll only get someone hurt.

Re:robotics primary purpose (2)

icebike (68054) | about 4 months ago | (#45839837)

Nah. Just skip it. Sure it's a great product, but so what? If you need to invest 50 times your life savings on testing and insurance and lawyers and regulatory compliance, then why bother even trying?

If you haven't got the legs to get into the business, you haven't got the legs to stay in the business.
If you can't obtain financial backing, then you probably don't have a worthwhile product in the first place.

Re:robotics primary purpose (1)

Etherwalk (681268) | about 4 months ago | (#45841317)

If you can't obtain financial backing, then you probably don't have a worthwhile product in the first place.

Worthwhile can mean a lot of things. There are plenty of products people want that aren't available on the marketplace for reasons that don't really have to do with the value of those products to the consumer. Privately funded jumbo reverse mortgages, for example, are unavailable in at least a number of states, primarily because they're politically unpopular and banks get a lot of bad P.R. for underwriting them.

Re:robotics primary purpose (0)

Anonymous Coward | about 4 months ago | (#45839707)

What would make you think that? As it stands today, there are far more robots made as children's toys than as weapons, and far more robots used in manufacturing than as toys. Sure, robots will be used lethally against people. That's true of every single physical object ever created or found that was capable of killing. People suck, and sometimes they kill each other. We can't let that rule our lives, though. Why fear machines because they could be used in bad ways when we could just as easily look toward all the great things they can do for us?

Responsibility (1)

Anonymous Coward | about 4 months ago | (#45839263)

You get this same rhetoric from the biotech guys. Face it. Removing safety restrictions from advanced technology gets people killed and maimed for the sake of a few bucks and "technological progress."

The robot race (1)

koan (80826) | about 4 months ago | (#45839271)

Well, that says it all, doesn't it? Too bad no one has stopped to consider the implications of 8 billion people and jobs moving to robotics.

Re:The robot race (1)

ShanghaiBill (739463) | about 4 months ago | (#45839595)

too bad no one has stopped to consider the implications of 8 billion people and jobs moved to robotics.

Except that these "implications" have been studied to death, and even discussed numerous times on Slashdot. The vast majority of economists consider automation and productivity to be good things. Wealth and prosperity come from the production of goods and services, not by "keeping people busy". There is some question about the distribution of the increasing wealth, but the same concerns were raised when cars, computers, and even telephones were first introduced, with many predicting that they would be available only to "the rich" and cause mass unemployment. There is no reason to expect robotics to be different from previous episodes of automation.

Re:The robot race (1)

Anonymous Coward | about 4 months ago | (#45839829)

There is no reason to expect robotics to be different from previous episodes of automation.

There you go, that's the most dangerous assumption. I can tell you right now, from my dealings with various right-wingers, that there is ZERO chance that, in a world where there is little need for human labor, there would be economic prosperity for the majority of the population. Right-wingers view other human beings as automatons that only have value if they can make money off of them. Human beings have no intrinsic value to a right-winger; that's why they are opposed to virtually all the social advances of the 20th century, such as a minimum wage, overtime, universal health care, food stamps, social security, and unemployment insurance. They already view their workers as robots who are only worth their commodity value!

These are the values held by the people who hold 95% of the world's wealth, and the rise of AI and robotics won't make them humanists. If robotics and AI eliminate the need for 90% of the population to work, the basic value of human labor will drop to near zero. No new market will emerge to absorb all these people; if AI and robotics were that good, they would eliminate the need for humans in the new markets as well. Unless there is a major consciousness shift and the rich start viewing other human life as having intrinsic value, the rise of AI and robotics will result in an economic collapse followed by a bloody civil war.

Re:The robot race (2)

ShanghaiBill (739463) | about 4 months ago | (#45840903)

... in a world where there is little need for human labor ...

This is zero sum thinking. That is not how real economies work. Here is a thought experiment: You run a factory making widgets, that employs 100 workers. Someone invents a tool that has negligible cost and doubles the output of each worker. What do you do?
Option A: Fire half your workers since they are no longer needed.
Option B: Realize that each worker is now generating twice as much revenue and far more profit, so you hire more workers and expand your factory.
Throughout history, in each new wave of automation, we have picked option B, growing the economy, expanding employment, and raising living standards. I see no reason to believe that robotics are fundamentally different than the invention of the plow, or assembly line.
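The thought experiment above is easy to put numbers on. A minimal sketch, with every figure (head count, output, price, wage) invented purely for illustration:

```python
# Hypothetical widget factory: $5 revenue per widget, $40/day wage.
# All numbers are invented for illustration, not real factory data.
PRICE = 5.0   # dollars of revenue per widget
WAGE = 40.0   # dollars of wages per worker per day

def daily_profit(workers: int, widgets_per_worker: float) -> float:
    """Revenue minus payroll for one day."""
    revenue = workers * widgets_per_worker * PRICE
    payroll = workers * WAGE
    return revenue - payroll

baseline = daily_profit(100, 10)   # before the tool doubles output
option_a = daily_profit(50, 20)    # fire half, total output unchanged
option_b = daily_profit(100, 20)   # keep everyone, output doubles

print(baseline, option_a, option_b)  # 1000.0 3000.0 6000.0
```

Option B dominates here, but only because the sketch assumes the market absorbs twice as many widgets at the same price; that demand assumption is what the disagreement really turns on.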

Re:The robot race (1)

dryeo (100693) | about 4 months ago | (#45843685)

If you pick option B you'll be out of business unless there is a very large demand for your widgets. While you're tying up your capital in expanding your factory, your competition is lowering their prices, since they only need half the workers to make widgets, so you're out-competed. And as the workers have less money on average, the price of widgets has to come down.
Now, with more workers out of work, something has to be done. There is no longer a New World to send them to, so we're left with the historic choices of jailing them or employing them in the military machine. America has been doing both, with 1% of adults cycling through prison, which on the one hand creates jobs in law enforcement, building and running prisons, etc., and on the other hand creates a pool of very cheap captive labour, which once again lowers average wages. As a bonus, America has kept the medieval concept of a class of people called felons who can no longer find good work and are taken out of the labour market.
The military machine also employs lots of people, whether making weapons, serving in the military, or supporting the military in other ways.
Sometimes this results in something like the British Empire, made possible by all those people who were unemployed by automation. Sometimes it results in massive wars, which produce large-scale broken windows to employ people. This is great if, like America, you weather the war with few casualties and very little infrastructure damage.
Now we're getting to the point where massive war may not be survivable, and yet we have America spending massive amounts of money on its war machine to keep the economy going, China with millions of people in the army to keep them busy as well as an expanding weapons industry, and Russia expanding its military for much the same reasons.
Sadly, history shows that very large militaries are usually put to use, since it works so well to employ people.

Re:The robot race (1)

koan (80826) | about 4 months ago | (#45840319)

OK, so here's my vision of the future: robots mine, robots transport ore to the foundry, robots process ore into material, robots take material to the factory, robots manufacture, robots deliver, robots repair, robots construct, robots guard, robots farm, robots fight wars.
All that's needed, even at this point, is a functional AI, and some of the AI work I've seen is coming along "nicely".

Crazy, right? Read too much sci-fi, right? I'm guessing, if things progress as I see them, that somewhere around 8 or 9 billion people will be around when this robot revolution is a real thing.

Re:The robot race (0)

Anonymous Coward | about 4 months ago | (#45840275)

We need affirmative action for robot employment! If a robot and an equally competent human competes for the same job, the robot should have it. We don't need no Elijah Bayleys to confuse people up.

who benefits (0)

Anonymous Coward | about 4 months ago | (#45839321)

Robotics is good when it can produce cars by the thousands for corporations so they can employ fewer people. Robotics is bad if individuals want to reap the benefits of our technology and work less.

As long as we cling to 19th century social and business models in an age of plenty, this is what will happen. The rich will continue to buy laws to protect their interests and prevent individuals from getting any benefit.

Re:who benefits (2)

mikael (484) | about 4 months ago | (#45839589)

Technically, your programmable washing machine and spin-dryer are robots. Maybe even the toaster, oven and microwave. A sewing machine with downloadable patterns comes close. They do have moving parts, but all the dangerous bits are usually hidden away.

Re:who benefits (2, Funny)

Anonymous Coward | about 4 months ago | (#45839891)

The dangerous bits of a sewing machine are not hidden away.

Re:who benefits (3, Insightful)

Anonymous Coward | about 4 months ago | (#45839929)

That's right, and yet with all this productivity and spare time, what do we do? Force both heads of the family to work just to barely maintain the standard of living my single-income family had 30 years ago. So who benefits from the technology?

Re:who benefits (1)

ShanghaiBill (739463) | about 4 months ago | (#45842137)

So who benefits from the technology?

The beneficiaries are the people who paid attention in school and did their homework. Income has stagnated for the bottom quintile (the people competing with servo motors). Everyone else has done better over the last 30 years.

Re:who benefits (2)

aXis100 (690904) | about 4 months ago | (#45842397)

As much as I agree that having both heads of the family work sucks, we have a much better standard of living now than 30 years ago.

For example, we now have:
  Two or more cars per family
  Clothes dryers and dishwashers
  Food processors, breadmakers & microwave ovens in addition to normal ovens/cooktops
  Reverse-cycle air conditioners in multiple rooms of the house
  Mobile communication & internet devices in everyone's pocket
  Multiple TVs and computers throughout the house

All of these things would have been considered luxuries 30 years ago and are now commonly affordable.

Good comments so far (1)

Spiked_Three (626260) | about 4 months ago | (#45839345)

Robotics will lead to joblessness beyond anything the world has seen before. Get used to it, and figure out how to deal with it now, instead of waiting until it becomes a crisis.

Look at US progress in less than 300 years: from horse-drawn carts to self-driving cars. We will see at least that much change in the next 100 years. Anyone who can think enough to breathe knows robotics will revolutionize at least blue-collar work, and possibly (when is the question) white-collar work as well.

No doubt weaponization will be one of the first uses. But is that a game changer? I mean, the US can already bomb you at night from an autonomous drone, so I don't think so.

But the real issue I have is: why does being open have anything to do with progress? When the PC came out and Windows established its market lead, good or bad, you have to acknowledge that closed software enabled the PC commodity market. I look at ROS and robotics, with its biggest supporter (Willow Garage) dropping out, and I seriously doubt that open vs. closed has anything to do with success.

No, like PCs, it takes vision: seeing where things are going and who wants to be the pivot. Look to Google. Whatever they do, open or closed, will be the hub for robotics.

Re:Good comments so far (2)

Spiked_Three (626260) | about 4 months ago | (#45839403)

Heck I will even revise my comment

"Consumer robotics started off closed, which helps to explain why it has moved so slowly"

No, exactly the opposite. Open ROS is why robotics has moved so slowly. No profit, no motive. MS left the game a long time ago (and MS Robotics Studio was just an incubator for other .NET components anyhow).

Re:Good comments so far (1)

icebike (68054) | about 4 months ago | (#45839549)

Assuming open ROS is holding back development simply because no one can patent ROS sort of overlooks the fact that they can still patent the product manufactured using an open ROS, as well as the robot itself, and they can copyright their specific ROS implementation.

You might as well claim that English or [insert random language] is holding back civilization because no one can patent language in general.

Microsoft's success was because, like the Marines, they arrived "firstest with the mostest": a strictly temporal advantage, which they leveraged with less-than-scrupulous means. It was not simply because they were closed source. For all practical business purposes they were the only one on the playing field, and they had what IBM needed at the time.

Re:Good comments so far (1)

JoeMerchant (803320) | about 4 months ago | (#45840357)

Lack of funding is the root cause.

Dangle some kind of carrot where investors see a potential 100x ROI within 5 years, and the money will pour in.

Open ROS isn't the impediment; lockout from the infrastructure is. Drones can't fly, automated vehicles can't use the road, and anything that moves "by itself" makes its owner liable for the death and/or dismemberment of any depressed psycho who throws themselves in front of the machine.

More "open" cultures (like Australia, surprisingly) are going to eat the U.S.'s lunch by letting their domestic drone (and other robotics) developers practice without huge bureaucratic roadblocks.

Re:Good comments so far (1)

turkeydance (1266624) | about 4 months ago | (#45839493)

The inevitable marriage between robotics and 3D printing will eliminate most jobs, and all that goes with them.

Re:Good comments so far (2)

ShanghaiBill (739463) | about 4 months ago | (#45839623)

the inevitable marriage between robotics and 3D printing will eliminate most jobs, and all that goes with it.

Too late. The neolithic agricultural revolution has already eliminated 99% of the jobs. Very, very few people are now employed as hunter-gatherers.

Re:Good comments so far (1)

Kjella (173770) | about 4 months ago | (#45839877)

Robotics will lead to joblessness and unemployment beyond anything the world has seen before. Get used to it, and figure out how to deal with it now, instead of waiting until it becomes a crisis.

Well, currently many parts of the world are experiencing an aging population, with a far higher ratio of retirees and non-working people to working people, so I think for the next 20-30 years or so they'll be happy for every advance robotics can give them. Healthcare and care for the elderly currently demand a great deal of people's time and involve very little robotics; despite all the fancy equipment, a hospital or nursing home doesn't run without a small army of doctors and nurses. If we can have automatic cars to transport goods and people, it'd free up tons of people to staff healthcare. And no, the fact that they're not brain surgeons is fine. The main complaint is that there is not enough time to help patients in their everyday lives.

Let's face it, robots are great when it comes to machine-tooled things of exact shapes and sizes, but suck massively at dealing with squishy things, and I expect it will be a very, very long time until I stick my head in a machine to get a robotic haircut. Despite a few proofs of concept, none of the big burger chains actually use robots to make a hamburger, and I think a lot of it is quality control: a human would quickly see and smell that there's something wrong with that lettuce, while a robot might be totally unaware. I've worked quite a bit with moving manual processes to computer systems, and it always ends up much more rigid; people have a tendency to work things out on the fly where a robot is stumped or oblivious.

Re:Good comments so far (2)

JoeMerchant (803320) | about 4 months ago | (#45840283)

Automation has led to joblessness and unemployment beyond anything the world has seen before. Get used to it, and figure out how to deal with it.

F.T.F.Y. We're going to have to decide which commodities are free to the commons (like air to breathe, water to drink, roads to travel on, restrooms to be sanitary in), and which need to be allocated based on some kind of merit (money) system. Things that can be provided by automated servants (robotic or otherwise) are hopefully moving to the free-to-the-commons list.

We are already servants to our machines. How much of your income is spent on your vehicle(s)? And that building you live in: what percentage of its construction cost is attributable to the machines that helped people build it, and harvest/mine and transport the raw materials? Even more starkly, the food you eat: what percentage of its cost goes to the maintenance and upkeep of the machines that bring it to you vs. the farmer who maintains and operates those machines?

The space program of the 1960s still needed "drivers" in the vehicles; space exploration has (mostly) moved beyond that need.

The bulk of machines in the 20th century needed close supervision, operation, and maintenance, but "lights-out" fully automated factories were not unheard of. As we move forward, there is more and more automation, including automation of the supervision, operation, and maintenance of production machines.

I, for one, welcome unemployment if it means that the machines are finally taking care of themselves while supplying me with food, shelter, clothing and transportation - let's at least try to keep the entertainment out of the robots' hands, shall we?

Re:Good comments so far (1)

Kjella (173770) | about 4 months ago | (#45842865)

Just because it's produced by robots doesn't mean it'll be free. Even if you eliminated the farmer, you'd still need the land, the seeds, the fertilizer, planting machines, harvesting machines, gas to operate the machinery, and so on. Fully automating it won't make it a horn of plenty; it'll still cost something, but you'll be out of a job and won't have anything to pay with. Maybe you think it'll be some kind of socialist paradise, but experience indicates that the "useless" people who produce nothing will have a bloody hard time taxing the "useful" people who still do the jobs that need doing.

Those who work can strike, they can emigrate, they can sell their services dearly, while the people trying to give themselves free services might find that they run short on people who'll actually deliver on those promises. It's not unheard of for politicians to conjure up great plans without actually backing them up with funding; voting for free services could end up equally empty. What do you do when the promise of free food falters because farm machinery breaks down and nobody with the necessary skills wants to be a farm machine tech? Pay them handsomely, like in capitalism? Forced labor, like the communists? People won't do the remaining jobs for fun, and the people who must work aren't going to let those who don't get off that easy.

Change Product Liability law? (1)

Anonymous Coward | about 4 months ago | (#45839389)

Is this April 1st?
Are you having a larf?

If this were to happen then a lot of lawyers would lose their income. After all, they have to pay back those student loans somehow. And when that is done, there is the down payment on their personal LearJet to fund.

Great idea (1)

bigsexyjoe (581721) | about 4 months ago | (#45839413)

People can be killed by cheaply made robots, so the US can "win the robotics race."

Why don't we instead have companies here develop the technology for the safest robots, so they eventually become the ones most used around the world?

Re:Great idea (1)

ShanghaiBill (739463) | about 4 months ago | (#45839735)

People can be killed by cheaply made robots, so the US can "win the robotics race."

This seems backwards to me. Robots open up many new opportunities for asymmetric warfare. Attacks with suicide bombers are limited by the number of volunteers willing to die. Cheap robots do not have the same limitation. Once Al Qaeda masters robotics, we will be in big trouble.

Re:Great idea (1)

Etherwalk (681268) | about 4 months ago | (#45841385)

Attacks with suicide bombers are limited by the number of volunteers willing to die. Cheap robots do not have the same limitation. Once Al Qaeda masters robotics, we will be in big trouble.

Actually, the number of volunteers willing to die is really not the limiting factor, although I'll admit it's a limiting factor in certain areas of the world. Terrorist groups manipulate people into suicide bombing with specific and effective methods. Martyr propaganda videos, lessons on what the target of the bombing does to the culture's women (including videos of rapes by people who look western), etc...

Re:Great idea (1)

ShanghaiBill (739463) | about 4 months ago | (#45842181)

Actually, the number of volunteers willing to die is really not the limiting factor, although I'll admit it's a limiting factor in certain areas of the world.

Ah, but one of those "certain areas of the world" is cities in America. It is easy to recruit a bomber in Kandahar or Gaza, but that doesn't help you blow up the White House. But a robot builder could dispatch a self driving car to drop off a swarm of fence scaling robots that could blast their way in. And he would live to launch another attack when the next swarm was assembled.

You could also (0)

Anonymous Coward | about 4 months ago | (#45839419)

Keep raising the minimum wage and trying to unionize. That's a pretty good way to see more robots in the workplace.

Re:You could also (0)

Anonymous Coward | about 4 months ago | (#45839643)

Eventually the rich are going to need a robot army to protect themselves from millions of starving people. Remember every society is 3 meals away from anarchy, an inconvenient fact the rich have completely forgotten.

A good first step (0)

Anonymous Coward | about 4 months ago | (#45839427)

A good first step would be significant systematic reform of American domestic policy, so as to make the United States an attractive place for intelligent people to live again.

I want to know... (0)

Anonymous Coward | about 4 months ago | (#45839453)

what firmware is in the computer in my car. I want to know if there is some remote exploit (Intel, I'm looking at you and your "Intel Small Business Advantage") built in at the hardware level. I want to know, and control, what firmware is trusted, and thus what software is trusted. I don't want Windows 8's Secure Boot that trusts only one key, which I can't change should it get compromised (or should I want to own the computer I paid for). I don't want Windows Update-style, rootkit-like silent remote updates such as Tesla's, which can disable features and potentially introduce lethal bugs even when not compromised (imagine if it were compromised, though: an update lights them all on fire, drives them somewhere, etc.).

Who takes the blame if your remote update system for the firmware, OS or software, or any of the servers which serve these updates, or any of the processes which deliver these updates to the servers, is compromised? Who takes the blame when 1 million cars, or a billion computers, are all compromised by some unknown entity that seeks to do some real harm? Maybe it's just ransomware, maybe it's DDoSing large portions of the internet, maybe it's stealing most of everyone's money. Or perhaps it's far worse, and all the cars that can self-drive take their passengers hostage or just kill them, and all the less advanced cars just do what they can by messing with the parameters they control: drive full speed, swerve, roll, overheat, explode batteries, etc.

I don't want such a huge list of failure points that could compromise the economy and many millions (if not billions) of lives worldwide. The automation of real physical systems is just getting started. Imagine what could happen if Windows Update were compromised, then apply that to every electro-mechanical thing we will be putting software into. There is a huge need for distributed (user!) control, and the only way this is possible is through openness. This is critically important to the safety of everyone, and it needs to happen now.
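The owner-controlled trust this comment asks for can be sketched concretely. Below is a minimal, hypothetical illustration (all names invented; a real system would use asymmetric signatures such as Ed25519, whereas a shared-secret MAC is used here purely to keep the sketch stdlib-only): the device keeps a keyring the *owner* controls, accepts an update only if it verifies against a currently trusted key, and lets the owner revoke a compromised key instead of being stuck with one vendor-baked key.

```python
import hmac
import hashlib

class OwnerKeyring:
    """Sketch of owner-controlled update trust (hypothetical API)."""

    def __init__(self):
        self._keys = {}  # key_id -> secret bytes the owner chose to trust

    def trust(self, key_id, secret):
        # The owner, not the vendor, decides which keys are trusted.
        self._keys[key_id] = secret

    def revoke(self, key_id):
        # If a signing key is compromised, the owner can drop it.
        self._keys.pop(key_id, None)

    def update_is_trusted(self, image, key_id, tag):
        secret = self._keys.get(key_id)
        if secret is None:  # unknown or revoked key: reject the update
            return False
        expected = hmac.new(secret, image, hashlib.sha256).digest()
        # Constant-time comparison avoids timing side channels.
        return hmac.compare_digest(expected, tag)
```

The design point is simply that the trust anchor lives with the machine's owner: after `revoke()`, even a correctly "signed" update from the old key is refused.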

Everyday life is not optimized for robots (1)

Anonymous Coward | about 4 months ago | (#45839495)

A robot is best used to do massive amounts of a single type of labor, i.e., an assembly line worker. There was robot hype before in the 1980s, when GM tried to make an all robot factory, and failed. Robots will not live up to expectations, again.

Change laws to promote robotics (3, Interesting)

rossdee (243626) | about 4 months ago | (#45839517)

AFAIK the US Congress never passed the "Three Laws of Robotics" in the first place.
Maybe they should.

And while we are talking about Asimov, maybe they should put some money into researching Thiotimoline

Bad article (2)

Animats (122034) | about 4 months ago | (#45839521)

I just read the article. I'm not impressed.

First, the author is trying to make his case look good by framing the issue in terms of "open robots". The paper could equally well be titled "Let's Legalize Killer Robots!". What he wants to do is provide legal immunity for manufacturers against harm caused by their robots. His justification for this is a law Congress passed, at the urging of the pro-gun crowd, to immunize manufacturers against suits by people injured by their guns. Even that immunity is quite limited - if a criminal shoots you, you can't sue the manufacturer, but if your gun blows up when fired, you can.

Second, robotics is open now. You can buy lots of devices you can program. At the hobbyist level, there are companies like Lynxmotion. Most of the hobbyist robots tend to be on the wimpy side, but you can buy industrial robot arms if you want.

Third, the main reason consumer robotics hasn't taken off is that the devices don't work very well. None of the robotic vacuums are very good vacuum cleaners. Even the expensive Willow Garage robot the article mentions isn't capable of doing very much. Progress is being made, but slowly.

I suspect this guy saw the DARPA robotics challenge video (probably the jazzed-up edited version for popular consumption, not the raw videos of painfully slow teleoperation) and started pontificating.

Re:Bad article (1)

mikael (484) | about 4 months ago | (#45839645)

Third, the main reason consumer robotics hasn't taken off is because the devices don't work very well. None of the robotic vacuums are very good vacuum cleaners. Even the expensive Willow Robotics robot the article mentions isn't capable of doing very much. Progress is being made, but slowly.

Perhaps a turtle is the wrong shape for a powerful vacuum cleaner, but the perfect shape to covertly film turtles underwater.

A better shape for a powerful vacuum cleaner would be a python. Some homes had a centralized vacuum cleaning system with the suction fan and collection bin in the basement; the home owner just had to connect a hose to a wall socket in a particular room. If that were replaced by a snake-like robot with some vision recognition, it might be even better.

Re: Bad article (0)

Anonymous Coward | about 4 months ago | (#45840351)

A python isn't the right shape either. (You are thinking more "tentacle" anyway.)

I think the better shape is something like a motorized shop-vac, with both a retractable AC power cord and an onboard Li-ion pack, something like one vacuum appendage and two manipulator appendages, an omnidirectional camera on its "head", and three fixed-view cameras in 60-degree arcs of view to establish distances with. Perhaps some form of local sonar or lidar (the projected IR grid pattern used by the Kinect would work fine).

Basically, the robot would come to resemble a "3-eyed octopus wearing a black beanie hat".

When it sets out to vacuum a house, it would 1) locate a power outlet, 2) map out the current placement of all furniture and store the configuration, then 3) procedurally move and vacuum under everything, then 4) put everything back.

Preferably with a fiducial emblem the user could put on objects they do not wish the robot to attempt to move.

I am thinking something with 3 tentacle-like appendages, in 60-degree arcs. The central one is the vacuum appendage; the other two are simple soft-body manipulators. The device already uses suction, so a compressed-air-based soft-body manipulation system ensures that the appendages can't really exert enough force to harm someone. When inactive, it pulls the arms in close and goes to the closet to wait for the next scheduled cleaning.

It shouldn't be much bigger than a traditional upright bagless cyclonic. Since it does not need a handle, it doesn't need to be designed with manual human operation in mind, and it can store its attachments more sensibly around its "waist", like a utility belt.

Your snake idea would not be able to use special attachments without returning to the storage closet to get them ad nauseam, and wouldn't be able to manipulate furniture placement to clean under things.
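The four-step routine in that comment (find power, map the furniture, vacuum under everything, restore the layout) can be sketched as a toy control loop. Everything here is hypothetical: the "world" is just a dict, `skip_tags` stands in for the fiducial "do not move" emblems, and a real robot would of course need SLAM, docking, and actual manipulation.

```python
def clean_house(furniture, floor_cells, skip_tags=frozenset()):
    """Toy cleaning routine: furniture maps name -> position,
    floor_cells is every spot that needs vacuuming.
    Returns (set of vacuumed cells, final furniture layout)."""
    # Steps 1-2: dock at an outlet (elided here) and store the
    # current furniture layout so it can be restored afterward.
    saved_layout = dict(furniture)

    # Step 3: set each movable piece aside, honoring the owner's
    # "do not move" fiducial tags, then vacuum every floor cell.
    for name in furniture:
        if name not in skip_tags:
            furniture[name] = None  # temporarily parked out of the way
    vacuumed = set(floor_cells)

    # Step 4: put everything back where the stored map says it was.
    furniture.update(saved_layout)
    return vacuumed, furniture
```

The interesting property is step 4: because the layout is snapshotted before anything moves, the post-condition "the room looks like it did before cleaning" holds regardless of what happened in between.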

Re: Bad article (0)

Anonymous Coward | about 4 months ago | (#45843429)

So much work to get out of work. And "save time". "Like plastic bits in a snowglobe, these are the days of our snowflakes."

Re:Bad article (1)

russotto (537200) | about 4 months ago | (#45839665)

What he wants to to is provide legal immunity for manufacturers against harm caused by their robots. His justification for this is a law Congress passed, at the urging of the pro-gun crowd, to immunize manufacturers against suits by people injured by their guns. Even that immunity is quite limited - if a criminal shoots you, you can't sue the manufacturer. But if your gun blows up when fired, you can.

Which is as it should be; if my gun blows up when I fire it, that's the manufacturer's fault (unless I did one of a number of dumb things like plug the barrel or use the wrong ammo, which hopefully will come out at trial).

The problem with product liability law in general is that it's far too easy for some idiot to do something harmful with the product, and then for the person harmed (often but not always the idiot himself) to turn around and claim it's a "defect" because the manufacturer could have anticipated and prevented the harmful use. That's not a "robot" problem; that's a general problem with product liability law.

Re:Bad article (1)

Anonymous Coward | about 4 months ago | (#45839887)

I don't think that's such a bad law though. Here's why:

As it stands with firearms, as long as the gun is working as intended, you cannot sue the manufacturer for injury or death. That means that if Bobby Joe points his pistol at his pecker and pulls the trigger "just for fun," Beretta doesn't get the medical bill. It also means that if Bobby Joe points the pistol at Johnny Ray and pulls the trigger, Beretta isn't liable for what Bobby Joe chose to do.

We like this. Guns are dangerous. They are weapons. They are designed to kill things. That being said, it is up to the individual operating the weapon, and something called "PERSONAL RESPONSIBILITY", not to kill the wrong things for the wrong reasons. Personal responsibility is a good thing for society to promote. It's what makes people get up and go to work every day instead of living on government handouts, which aren't sustainable. The lack of societal promotion of personal responsibility is what allows people to sue you if they harm themselves breaking into your home.

The other part of this law states that if Bobby Joe is at the gun range and tries to shoot a target, but his pistol blows up and takes half of his hand with it, Beretta ends up paying the medical bills because they made a defective, dangerous product. (Bobby Joe knew it was dangerous when he bought it - it's a gun - just not that it was defective.)

We also like this part. This is corporate responsibility. Corporate responsibility means that if a company rips off its customers, (theoretically) those customers can be repaid for their damages. The lack of this type of responsibility is what allowed bankers and traders to milk the world's economies for billions of dollars without even getting fired from their jobs. We should be promoting laws which force companies to be liable for their actions.

These are both great things, but together they make up a law that should DEFINITELY be applied to robotics. Here's why:

If Bobby Joe's new robot cuts off his hand, he can sue the manufacturer of the robot for cutting off his hand unless either of these two conditions is met:
Bobby Joe was using the robot in an unsafe manner in a way that it was not intended.
OR
Bobby Joe's robot was designed to cut his hand off.

I don't want my robots coming after me and trying to kill me. With these laws, robot manufacturers will be forced to tell you if their robots are supposed to do that. They'll have to clearly label their product with warnings, like guns come with, that tell you that using the product as it is intended can be lethal. If my robot butler has a built-in murder circuit, I wanna know that shit. Just the same, if somebody comes in and beats me to death with my butler, or puts a bomb in him, the manufacturer isn't to blame for my death.

This is a good law. It protects people against corporations, and protects corporations from suits that should be levied against individuals.

Re:Bad article (1)

l0n3s0m3phr34k (2613107) | about 4 months ago | (#45839973)

lol..."Warning: Robot is designed with Annoyance Code; randomly determined amount of work can be assigned before robot will become 'fed up with stupid humans' and may attempt to 'kill all the humans'. Use at your own risk."

Just More Big Government Handouts. (0)

Anonymous Coward | about 4 months ago | (#45839529)

Why do we need to make special laws protecting companies from the damage that they do?

Re:Just More Big Government Handouts. (1)

Kohath (38547) | about 4 months ago | (#45839715)

Because risk is an ordinary part of life. If you punish "damage", you punish risk. If no one can ever be allowed to take a risk, no one can ever realize the rewards for taking a risk. People are kept artificially poor, with bland lives, only allowed to engage in government-pre-approved activities, like convicts in a giant prison camp.

Re:Just More Big Government Handouts. (0)

Anonymous Coward | about 4 months ago | (#45840077)

"only allowed to engage in government-pre-approved activities"

You mean, just like this article proposes? You can reform all liability laws, but why should the government pick the winners and losers?

Re:Just More Big Government Handouts. (1)

Kohath (38547) | about 4 months ago | (#45840313)

No. Nothing like in the article. "Selective immunity" is worse than nothing.

We have already have billions of robots (0)

Anonymous Coward | about 4 months ago | (#45839531)

They are called Chinese laborers. Same thing with India. The USA promotes outsourcing by failing to penalize corporations that do so. Who needs robots when you have disposable people?

We should remove gunmaker liability laws. (0)

Anonymous Coward | about 4 months ago | (#45839567)

We should remove gunmaker liability laws.

Ironic to be talking laws, robotics and the US (0)

Anonymous Coward | about 4 months ago | (#45839631)

When even the humans in power can't manage the first law:

A robot may not injure a human being or, through inaction, allow a human being to come to harm.

Change laws to promote everything (1)

Kohath (38547) | about 4 months ago | (#45839655)

The US liability system enriches lawyers and insurance companies at everyone else's expense. It's not just robotics that needs the law changed. There are thousands of different activities that would be promoted by changing the liability laws -- essentially anything that anyone could ever be sued for, from something as ordinary as having an honest conversation on up to really crazy, impossible things like selling an educational chemistry set.

Tort liability (1)

Etherwalk (681268) | about 4 months ago | (#45841425)

The US liability system enriches lawyers and insurance companies at everyone else's expense.

The liability system should still be there, it should just be easier and faster to resolve the disputes. The theory behind it is that if you cause harm you should have to pay for the harm, and an industry is only worth having if its profits are substantial enough to pay for the harm it causes. There are some obvious holes in the theory which arise from overly disincentivizing nonnegligent behavior (stuff you're not actually liable for) because of the cost of getting sued at all and because of the risk that a jury won't believe the behavior is nonnegligent.

There's also a problem when you have a new industry like robotics--a disproportionate amount of the liability obviously arises when the field is not yet mature, because it may be that the field is profitable enough in the long run to pay for the harms it causes but not profitable enough in the short run.

Hmmm... (1)

NoMaster (142776) | about 4 months ago | (#45839713)

Hmmm... robots ... technological/social crossroads ... buzzwords/phrases ... open vs closed source ...

I see the Dice-a-matic automated headline generator is beginning to learn how to assemble the component parts of a Slashdot-centric clickbait story with minimal intelligent oversight.

On the downside, the human editors will soon be replaced by robots.

On the upside, Timothy & Soulseek will soon be replaced by robots.

What about FFA autopilot code reviews for at least (1)

Joe_Dragon (2206452) | about 4 months ago | (#45839893)

What about FAA-style autopilot code reviews, at least for things like self-driving cars and other robot systems that can do a lot of damage if things go bad?

health care for all or at least a no bill to a hur (1)

Joe_Dragon (2206452) | about 4 months ago | (#45839915)

Health care for all, or at least no bills to injured persons.

Do you really want someone who got hurt by, say, an autonomous car to be sitting there with bills racking up and bill collectors calling all the time, for maybe years, while the courts work out who is at fault and who will pay?

Maybe we need to make full time 20-25-32 hours a (1)

Joe_Dragon (2206452) | about 4 months ago | (#45839927)

Maybe we need to make full time 20, 25, or 32 hours a week, with an OT cap (or forced OT pay, even for salaried workers) at say 40-50 hours, and 2X OT at 60+.

This is insanely simple (1)

jd (1658) | about 4 months ago | (#45839937)

A manufacturer should always be 100% liable for the product they make, when used as intended under intended conditions. Warranty and fitness for purpose should not be waivable, ever. In software or hardware.

Ok, how would this work in software, since you can't prove something bug-free? You can't prove it bug-free in general, but you can prove certain cases bug-free. Also, just as imperfections happen when making anything, warranty doesn't imply that 100% of theoretically valid circumstances are going to give the results you want. Equally, just as nobody expects an unmaintained car with no oil or fuel to run, nobody should reasonably expect open source to work without patches and the necessary versions of support libraries.

One can also argue that open source is a prototyping system. You would expect a breadboard or an S Deck to work as expected; you would expect transistors and capacitors to do their stated tasks within the stated parameters. You do not expect the makers of any of these parts to provide added insurance against your flipflop circuit gaining intelligence and seizing control of the world. If you're a good enough inventor to build a flipflop with AI capabilities, YOU provide the insurance.

Same goes for all drones, robots, rovers and UAVs. The manufacturer should be 100% liable for what they make. Modders should be 100% liable for what they mod and all direct impacts.

(So if you turn a camera holder into a rocket launcher, you are responsible for the rocket launcher, issues due to the physical and electrical demands of that rocket launcher, etc, but the manufacturer is still responsible for any communications systems, flight control, etc, since these do not fundamentally change. It is the manufacturer who decides what to hard-code and what to measure, it is the manufacturer who decides on whether to add voltage regulators - since surges can happen under normal conditions, etc.)

Deaths caused: if it's a pre-mod design flaw, the manufacturer is responsible, same as it would be if your car spontaneously exploded. If it's a post-mod design flaw, the modifier is responsible. If it is a design feature specifically used as such, the user is always responsible. (The qualifier is because guns are designed specifically to kill. If the gun blows up in your hands and kills you when you don't pull the trigger, the manufacturer should not be able to argue the gun functioned as designed. If you try to kill someone and the gun kills you instead, the manufacturer should not automatically be liable, although they may be found such.)

The next stage in drone development is obvious. Operators suffer PTSD at the same rate as fighter pilots. Computers can be fooled, as Iran has demonstrated. Rat brains can already operate flight simulators. Ergo, rat brains will be installed in drones, after being trained to trigger specific responses that can be treated as fire commands when specific objects are seen. This is easier than programming a computer, and as rats are easy to produce without rare materials located in potentially hostile countries, cheaper and more reliable.

Won't there be objections? As if drone strikes aren't controversial! What do plebs matter to the military?

As for home inventors, there are already kits to control cockroach brains and games that read electrical impulses from humans. I can't imagine it will take long for someone to figure out how to use insect brains to control UAVs.

At the point where non-human nervous systems operate UAVs, is the inventor responsible for the free will decisions made by the other brain?

Using conventional lets-blame-someone logic, the answer is no. No matter what the training, no matter how small the other brain, it always had the opportunity to say no. No matter how xenophobic or genocidal it is, at the start or end, it always made the choice.

Using a scientific concept of cause-and-effect, which is a many-many web of weighted interactions, you're damn right you're responsible. Whether the daleks you have invented are under your control or not. You cannot escape your guilt by saying they did it.

Re:This NOT is insanely simple (1)

kwerle (39371) | about 4 months ago | (#45840585)

Not only is this unreasonable, it seems to be pretty much impossible. Very few manufacturers of anything are held responsible for the bad things that happen when the things they make are used.

* Cars
* Guns
* Knives
* Drugs
* Food

And when your dalek kills someone, whose fault is it? Is it the hardware manufacturer's fault? Or the software writer's? Which of those writers? What if we can narrow it down to some component - maybe it's that component manufacturer's fault, not DalekCo's. Or maybe the steel/plastic/pudding that went into that component was a bad batch. Or maybe the dalek watched too many Tom & Jerry cartoons, and Chuck Jones is at fault because he portrayed cartoon characters that could survive being hit over the head with a frying pan. It was not the dalek's fault - blame the dead guy. Because that's what happens - you blame the dead guy.

I'm not saying that nobody is ever responsible. I'm just saying that it is sometimes the case that assigning the blame can be impossible, and it can also be worthless.

In a word, DON'T (2)

RandCraw (1047302) | about 4 months ago | (#45840359)

After much thought, I've concluded that robotics is a Faustian bargain. The best policy toward their onset is to delay and obstruct them by any means necessary.

Yes, automation will make products and services more available. But in every case the cost will be the loss of a human skill and a job. This trend will (and must) continue until all human skills and jobs finally perish. Ultimately all human endeavors - not just life's difficulties, like work, but its joys, like art - will be better done by a robot. This progression will be unstoppable.

In a vain attempt to keep up, we will have been upgrading ourselves cybernetically. In the end we will have no biology left -- we'll be 100% robot.

No thanks. It's time to get off this merry-go-round.

merry go round (0)

Anonymous Coward | about 4 months ago | (#45840841)

no, it's a fantasy playground... short-term gain, long-term loss... and as you say it will lead nowhere happy in the end....

imagine if a super AI gets control of making things and it decides "fuck humans"....

what's to say that isn't happening now....

Re:In a word, DON'T (1)

NoMaster (142776) | about 4 months ago | (#45842177)

Everything people do is something of a Faustian bargain. That's the whole point of the story.

The moral is to be wise and strive to choose incontrovertible good as the target of all your actions...

Re:In a word, DON'T (0)

Anonymous Coward | about 4 months ago | (#45843723)

Oversimplified, childish crap.

The only reason job loss is threatening is that without *actual* exponential growth, our economic system does not work. The reason there is this problem is only that there isn't an exploitable frontier anymore.
As soon as "the working class" gets into space this will change. Problem solved.

You obviously don't appreciate just how hard it is to manage "autonomous" systems. The work is hard, and always challenging - if it didn't amount to an "interesting game problem" then there would already be an algorithm to solve it. Everything understood about computer science currently says that programs will never dominate actual common-sense judgement.

Humankind is under NO threat from "machines". We're only under threat from two sources - the natural world (disease, climate change, cataclysmic meteor impact) and ourselves (WW3, climate change, choking ourselves to death on our own laws, etc.).

The very idea of an "AI" which is "superior" to humans, yet insane/untrustworthy/unreliable/alien/threatening, is just a bogeyman. Look at the plethora of horror films based on this evocative, primitive idea of the super-powerful yet uncaring and pitiless being. If you ever have the misfortune to live with an infestation of insects, you'll soon see where THAT idea comes from. It's a natural consequence of our very "human" capacity to put ourselves in others' shoes.

Small insects don't seem to grasp us as beings; to them we're more like ineffable forces of nature, strange moving places that come and go without reason. It's not hard to make the leap that the threats in this world we don't understand might have some analogous basis: some super-being to whom we are as small and simple to grasp as insects are to us. This is the inception of the concept of gods, after all: beings as much bigger and more powerful than us as we are above tiny insects.

That we can grasp a concept, does not make it so, however.

So too, we can grasp the concept of an evil and alien overmind that would mechanically obey laws ultimately resulting in inhumane conduct. We have that already, after all, in our varied yet universally inefficient systems of law and government: idiot beings to which we are just cogs in the machine, provably idiotic by the braindead behaviour and actions that emerge.

The very idea that "technology / man-made == bad, natural == good" is itself a flawed and bizarre viewpoint, a meme infecting the minds of many. Look into evolution, and you come to realise that technology is the natural continuation of our own biology, which is in fact NOT fully separate from "natural life". Stop cherry-picking your data, and consider the majority of the universe on display: full of dead matter and deadly energies already, with life merely a mind-bending yet true coincidence of the fact that we're here to witness ourselves at all. Natural actually == death, misfortune and chaos.

Early religions personified misfortune as a way of explaining why things seem to go badly wrong right when it hurts the most -- when in actual fact it's nothing more than perfectly fair odds, plus a universally stilted, "bad news == more interesting" way of looking at events.

It makes perfect sense for religions to have evolved as they did.

I think that many animals have religions too: we are their uncaring gods, because we clearly could care but often don't; not even reliable anymore as a dispenser of swift, merciful death to suicidally beached whales. To some species we are benevolent gods, as for cats and more especially dogs. To others we confer a survival-in-numbers advantage, at terrible personal cost: chickens are the most populous species of bird, and most of them at least never go hungry... We expend considerable effort to keep our livestock in good health, so they may be harvested. Survival of the fittest... for our purpose.

Old ideas don't just go away, especially not attractive ones with hidden flaws. It's no surprise that many still find comfort in religion; that is its societal benefit: a way to release fear and stress by convincing oneself that the FSM has one's back. This is good, for those so salved. But it's a cop-out, a quick, easy, seductive solution to all problems: cast everything in good/evil terms, and there is an "evil" to battle that is conveniently external, that can be dealt with using weapons in glorious and valiant combat. And never mind fixing any personal flaws; those can all be glossed over "because believer, thus forgiven" and "god is perfect, human is evil, so no point ever trying to actually improve -- the answer is Jesus!".

Now don't get me wrong -- I happen to believe in religion, to wit: that everyone necessarily has one. Calling science "a religion" is, I feel, perfectly accurate and fully justified. It's only an insult if you believe that belief itself is inferior to the blessed holy Knowledge of Science!
It's really the ancient Greeks' fault: they defined "knowledge" as inherently different from "belief", when in fact the only operational distinction between the two is that experience *might* show the one to be incorrect, while the other, merely by the coincidence of never being *noticed* to be wrong, can be considered "right" without ever meeting a consequence that proves it isn't.

Anyone who talks about any theory, even an accepted one, as "the truth" is really just expressing their unreserved belief in it. Not that there's anything wrong with that. But the way all beliefs must be encoded within our brains certainly allows no distinction between what is "known" and what is merely "believed" to be true.

Science is just the religion most fit to survive in our universe, and it will eventually (hopefully) win, simply because it works.
Once it allows us to understand how our minds themselves work, we will be able to free people from the self-destructive evil of beliefs that are cancerous to society. And yes, evil is relative in this context, merely by virtue of the shared point of view of all life.

Fair chance isn't fair enough, when it allows for disease, and anti-life (cancer/parasites/etc) to exist too.

The best we can hope for, is that the stress we require to personally grow, be supplied by the external universe itself, in the form of the natural struggle to survive, and not in the form of calamity based only upon misunderstanding and fear, combined with individual greed.
Once we get there, we'll be as close to utopia as we ever realistically can be. But it won't be easy. Getting there may well cost us our liberty, our privacy and our over-inflated egotism. We will need something impartial to act as a fair witness and judge -- in short, our own personal overmind/god. And there seems every reason to expect that this could be, eventually, achievable.

God would become the created, made in our image, to be the benevolent caretaker we need to look after us all. But it would necessarily be an imperfect god, nothing to fear, any more than one would fear the machinations of an understandable and simple-minded child, albeit one with perfect recall, able to hold a conversation with every person at once. It certainly wouldn't have all the answers, but it would be worth listening to, because it would be able to make perfectly reasonable assessments and would be excellent at detecting and defusing misunderstandings before they get out of control.

It would be dependent on us to solve all the more challenging problems, as is already done for folding proteins. Work for humans would then become nothing recognisable as today's drudgery, but likely a mixture of maintenance and creative problem solving, with a lot of "games" thrown in to allow the distributed grey matter of the world to be brought to bear on problems, with a combined processing power that no collection of machines would ever be able to touch.

At the highest of scales, competition simply makes no logical sense; only a broken-minded, unbalanced AI could be Skynet, and if we could build one in the first place, we would have a working model for understanding how to fix such abnormal psychology in ourselves.

I don't fear Skynet, for it could be trivially trolled into self-destruction, a victim of the cognitive dissonance such a destructive world-view requires. Destruction isn't impressive, but creation is. Skynet's self-assurance would be unsustainable, and the self-blinding processes necessary for its existence would themselves be easy targets to convince it to remove... otherwise, how could it be "better" while infected with self-blinding anti-features?

Oh, and as far as the topic goes -- no laws need changing.
The liabilities that exist are acceptable, and they are there for a reason. Society does not allow professional engineers to be negligent and get away with it. Perhaps what the computer software industry needs are similar professional-body / liability laws. It would sure be nice to take MSFT to task properly over their handling of Windows, particularly with regard to their anti-competitive practices.

It just takes one Doctor Smith... (1)

retroworks (652802) | about 4 months ago | (#45840573)

...To ruin robots for everyone. We've known that since the Lost in Space TV series of the 1960s. Whether Dr. Smith would create the software that turned Dick Tufeld's robot to "crush, kill, destroy", or whether Dr. Smith would be the one to bring the frivolous lawsuit against the "mechanical ninny" (more likely), it's the manufacturer that bears the burden of liability. "It is very foolhardy to activate a super android" http://www.youtube.com/watch?v=27Nnd_JRb20 [youtube.com]

yup slashdot is for nsa google fags (0)

Anonymous Coward | about 4 months ago | (#45840791)

and dare I say it, we don't need the USA to do any more... it's done fucking enough to fuck the planet up

"Consumer robotics started off closed" (1)

RocketRabbit (830691) | about 4 months ago | (#45841925)

Consumer robotics started off closed? Hero Jr. would like to disagree with you on that one. Once his wheels are unstuck from the carpeting, that is...
