
Google's Driverless Car and the Logic of Safety

timothy posted more than 3 years ago | from the professor-I-have-the-solution dept.

mikejuk writes "Google's driverless car could save more than 1 million deaths per year and tens of millions of injuries. It is an impressive achievement, but will we allow it to take over the wheel? Sebastian Thrun puts the case for it in a persuasive TED Talk video. However, while it seems to be OK for human drivers to kill over a million people each year, one fatality might be enough to finish the driverless car project — in fact it might not even take a death, as an injury might cause the same backlash. Robot drivers might kill far fewer people than human drivers, but it remains to be seen if we can be logical enough to accept the occasional failure of algorithm or hardware. Put simply we might have all seen too many 'evil robot' movies."


Sally (1)

Anonymous Coward | more than 3 years ago | (#35693702)

Put simply we might have all seen too many 'evil robot' movies.

I do not know which movies you speak of, but we have all read Sally.

We all have different limits (0)

Anonymous Coward | more than 3 years ago | (#35693744)

Put simply we might have all seen too many 'evil robot' movies.

Some people may have seen too many evil robot movies but I haven't seen enough. Maybe cinemas should institute some sort of test before people are let in?

Re:We all have different limits (3, Interesting)

Anonymous Coward | more than 3 years ago | (#35693876)

I think the reasoning in this story is stupid. Drivers may be many times more likely to get killed when they're driving themselves, but at least it's their own fault (or some drunk driver's). But I sure as hell don't want to be the one guy in the statistics whose dying is okay just because the system usually works. At least let me cause my own death, or be in control of avoiding getting hit by a drunk driver, so it's at least my own fault!

Re:We all have different limits (4, Funny)

Soilworker (795251) | more than 3 years ago | (#35694060)

Quick we need a car analogy... oh wait...

Re:We all have different limits (5, Insightful)

Bill Dog (726542) | more than 3 years ago | (#35694152)

Right, the logic expressed in TFS was reasonable, but only from the collectivist POV. That is, a system where some people are sacrificed for The Greater Good(TM), in this case for likely a significant increase in highway safety, vs. a system where the individual has a large amount (albeit not complete) control over his or her own life. This is just one particular case in the timeless struggle between two conflicting general philosophies.

Re:We all have different limits (2)

TheLink (130905) | more than 3 years ago | (#35694154)

But I sure as hell don't want to be the one guy in the statistics whose dying is okay just because the system usually works. At least let me cause my own death, or be in control of avoiding getting hit by a drunk driver so it's at least my own fault!

Yep, and when you are driving, your genes get to play a greater part in the "selection" process. So it has a higher chance of "improving" humans in the long run.

With the robot controlled cars, it's more "hit or miss".

Cyborg post-humans on the other hand might take a different evolutionary approach.

Re:Sally (1)

Z00L00K (682162) | more than 3 years ago | (#35694020)

Don't worry about the evil robots, worry about stupid robots and sabotage where some prankster messes with the systems that the robot car driver is using to keep the vehicle on the road.

GPS interference, re-painted lines on the road etc.

The world is over populated (0)

Anonymous Coward | more than 3 years ago | (#35693704)

We should be killing people, not saving them. This is more evil from Google.

Re:The world is over populated (1)

Z00L00K (682162) | more than 3 years ago | (#35694182)

Working on it... Just look at Afghanistan, Iraq and now Libya... Some countries are even good at it by themselves...

Automobiles are just dangerous (4, Insightful)

h00manist (800926) | more than 3 years ago | (#35693708)

This brings into the light just how dangerous automobiles really are. Few activities produce such huge numbers of deaths, injuries, and property loss and damage.

Humans are just dangerous (0)

Anonymous Coward | more than 3 years ago | (#35693890)

Automobiles are not dangerous (unless, of course, they have inherent manufacturing defects that could cause injury or death under normal driving conditions). Humans behind the wheel of an automobile are dangerous.

To provide a useful comment on the article, I think self-driving vehicles would be invaluable in saving lives otherwise lost through human error. However, as with all things computer-based encroaching on normal life, it will take time and slow evolutionary progress for people to become comfortable with it.

We are already seeing the basic forms of such autonomous systems in cars that claim to automatically brake in dangerous situations or even parallel park on their own. As these systems become more and more reliable and commonplace, and people come to trust them, they will eventually trust them being able to perform more and more regular driving tasks.

Re:Humans are just dangerous (1)

Anonymous Coward | more than 3 years ago | (#35693972)

Well fuck, there's an original and insightful thought(!)

Can't you just not be a tiresome idiot and accept that "automobiles are dangerous" contains an implicit "because of human error"?

Note that human error is always going to be a problem as long as humans are able to interact with the driverless cars. There will be programming/logic shortfalls, kids and drunks running in front of the cars, people shooting from the cars and people purposely throwing themselves under them to commit suicide.

Trains are like driverless cars. They need no steering, and the speed is essentially standard throughout the trip. Just like a driverless car going at 50MPH, they can't stop on the spot when an obstacle appears directly in front of them. Many people are killed by trains. In fact, I've been on driverless trains. Many people are killed by those as well, and they only go very slowly.

Re:Humans are just dangerous (5, Interesting)

MachDelta (704883) | more than 3 years ago | (#35694174)

I don't think people realize just how automated their vehicles already are. Sure, it's nice to be able to point to something and go "It parks itself! Ohmigawd!" but if you dig deeper you'll realize that the beginning of the "cars driving themselves" era has already passed us by. Thirty years ago when you mashed the brakes in your car, it pushed on a hydraulic, vacuum-assisted cylinder, and forced a fluid down to the brakes. That's it.

Now when you nail the brakes, a computer decides that the "rapid engagement of the brakes" is really a request for 100% braking power and fully actuates the master cylinder by itself, regardless of your exact input. Some cars will even adjust your steering inputs for you. Meanwhile another computer is looking at the rotating speed of each wheel, comparing them, and reducing and/or modulating the pressure to keep them from locking up. Another computer (or maybe the same one) is checking the speed of all four wheels versus the angle of the steering wheel versus roll/pitch/yaw sensors, and further adjusting the brakes and engine torque split to ensure that the vehicle isn't spinning or attempting to roll. Yet another computer sees that a massive load is being placed on the front suspension and actuates a set of valves or magnets to firm up the front shocks to reduce braking dive.

Meanwhile a front-facing sensor is comparing your rate of deceleration with the speed at which you're approaching an object, and when that check fails it weighs each occupant and primes a series of airbags for them, fires the seatbelt pretensioners, unlocks the doors, brings the seats upright, rolls up the windows, closes the sunroof, disables non-essential electrical systems, and basically does its best to prepare the cabin for a crash. Some cars even have microphones tuned to listen for the sound of impact as a cue for firing the airbags! And how many cars these days phone home (OnStar, etc.) when you're in an accident? You smash into a tree and before the fog clears from your eyes there's a friendly sounding lady on the phone going "We've detected a crash. Sir, are you alright?"

Cars already drive themselves. We just point them in the direction we want to go. One day we won't even have to do that, we'll just say "take me home" and it will figure out the rest. Why that is so much more terrifying than our present state is largely a matter of perception.
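The anti-lock logic described above (each wheel's speed compared against the vehicle's, with brake pressure backed off on any wheel that starts to lock) can be sketched roughly like this. All names and thresholds here are hypothetical illustrations, not taken from any real ECU:

```python
# Illustrative sketch of the wheel-slip check an ABS-style controller performs.
# Thresholds and the 50% pressure cut are made-up round numbers.

def slip_ratio(vehicle_speed: float, wheel_speed: float) -> float:
    """Fraction by which a wheel lags the vehicle (1.0 = fully locked)."""
    if vehicle_speed <= 0:
        return 0.0
    return (vehicle_speed - wheel_speed) / vehicle_speed

def modulate_brakes(vehicle_speed: float, wheel_speeds: list,
                    requested_pressure: float, max_slip: float = 0.2) -> list:
    """Cut pressure on any wheel slipping past the threshold; pass the
    driver's request through unchanged otherwise."""
    pressures = []
    for ws in wheel_speeds:
        if slip_ratio(vehicle_speed, ws) > max_slip:
            # Release pressure so the locking wheel can spin back up.
            pressures.append(requested_pressure * 0.5)
        else:
            pressures.append(requested_pressure)
    return pressures

# Front-left wheel nearly locked at 30 m/s: only its pressure gets cut.
print(modulate_brakes(30.0, [5.0, 29.0, 28.5, 29.5], 100.0))
```

A real controller runs this loop and pulses pressure dozens of times per second, and has to estimate vehicle speed from the wheels themselves, but the core decision is this slip-ratio comparison.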

Will we? (4, Insightful)

Dogers (446369) | more than 3 years ago | (#35693712)

but will we allow it to take over the wheel?

As I don't live in a country that's very sue-happy (yet, we're heading that way), yes! Please take the wheel! A snooze on the way to/from work would be excellent, thanks.

Re:Will we? (0)

Anonymous Coward | more than 3 years ago | (#35693902)

\o/ public transport

Re:Will we? (5, Insightful)

st0rmshad0w (412661) | more than 3 years ago | (#35694086)

Which doesn't go where I need to be, when I need to be there or leave there.

In fact, they cut the bus line that went near my workplace. Never mind that the public transport route from home to the job involves 3 transfers and takes 2+ hours while the drive is 25 minutes. And I can go out for lunch or run errands. Or basically be something more productive than a cog in a machine.

Say what? (2, Funny)

Anonymous Coward | more than 3 years ago | (#35693718)

"save more than 1 million deaths per year"

Wouldn't it be much better to save 1 million LIVES per year?

can't take revenge against a computer (3, Insightful)

Velex (120469) | more than 3 years ago | (#35693722)

I've said it before, and I'll say it again. You can't take revenge against a computer. A human being killed is a-ok with most people as long as you can take revenge.

Re:can't take revenge against a computer (5, Insightful)

Anonymous Coward | more than 3 years ago | (#35693760)

That's not what people fear. It's the perceived lack of control, even if automated driving is statistically more safe. Same with nuclear energy paranoia.

Re:can't take revenge against a computer (1)

Anonymous Coward | more than 3 years ago | (#35693838)

no, they want the ability to take weregild.

the control aspect is a recent phenomenon and will go away just as quickly. No one cares if you can kill a deer, cut it up and eat it for yourself. No one cares that you can control computers. In large cities with functioning mass transit systems, no one cares if you can't drive.

But the practice of kinsmen demanding weregild for people wrongfully killed? That's not going away, even if the death is through no fault of the car owner.

Re:can't take revenge against a computer (0)

Anonymous Coward | more than 3 years ago | (#35693992)

Yes, eventually people will not care about self-driving. In fact they will probably see the desire to manually operate a vehicle as utterly insane. But right now they do care about it, very much so. Until then it's an emotional hurdle that must be defeated.

Also, what pays more: suing the bankrupt man who accidentally crashed into you, or suing the multi-billion dollar company which made his 'smart' vehicle?

Re:can't take revenge against a computer (4, Interesting)

CastrTroy (595695) | more than 3 years ago | (#35693886)

I've been saying the same thing for years. The driverless car will never catch on because people want to be in control. I'm still amazed we have autopilots landing aircraft. Granted the pilot is paying attention at all times (or should be) and is ready to take control in case of a malfunction. For driverless cars the dream is that you can read the newspaper while going to work. But the reality is, that even if your car is driving itself, you should still be there to take over in case something malfunctions. If you have to pay attention anyway, you might as well be driving.

Re:can't take revenge against a computer (0)

Anonymous Coward | more than 3 years ago | (#35694168)

If you have to pay attention anyway, you might as well be driving.

No. Perhaps it is early and you are groggy? This would allow you to zone out, and only take control if something surprising suddenly happens. It is much less taxing on you since you don't have to pay nearly as much attention as you would if you were actually driving.

But you are right that people want to be in control. Even if the machine is more attentive, faster, and better in every way, people still want to feel in control.

People are stupid.

Re:can't take revenge against a computer (0)

Anonymous Coward | more than 3 years ago | (#35693888)

Except with nuclear energy, one tiny mistake can lead to an area contaminated for over 20k years. And since I've been alive I can think of 3 major events and more than a few minor events or partial meltdowns; heck, my electric bill even goes to pay for the partial meltdown near me. It's not that it's a bad technology, it just doesn't have a great track record. When it works, it works well; when it doesn't, the results are horrifyingly bad. There's a reason that nuclear material can be used in bombs: it's powerful stuff that can wipe an area out. Maybe when we have more than one planet, or a place to go, or foolproof containment, I might reconsider. But the fact of the matter is that we currently don't have any real protection if a reactor explodes.


That said, we trust computers to do so much stuff, running railways, airports, airplanes, bank schedules, etc. There isn't anything inherently bad in allowing a human supervised vehicle to drive itself. It's relatively low technology on a paved, marked roadway, though I expect that it would require all roadways to have heating elements installed for snow melt and to prevent warping, again low tech, it would just cost a bit of money.

Re:can't take revenge against a computer (1)

Anonymous Coward | more than 3 years ago | (#35694050)

No it can't. In the case of Chernobyl it took hundreds, even thousands, of gargantuan mistakes through design, construction, and the orders to shut down safety systems to see how far they could push the reactor.

In the case of Fukushima, it took one of the world's largest earthquakes followed by one of the world's largest tsunamis to place the plant in the middle of a devastation zone, inaccessible to the infrastructure that could have prevented all these problems. Some mistakes were made, but the mistakes alone didn't make the disaster.

And it isn't 20 thousand years of devastation, even in the case of Chernobyl. Decades, possibly a century.

More tolerant of human error (4, Insightful)

merchant_x (165931) | more than 3 years ago | (#35693730)

People are obviously much more tolerant of human error than machine error. Machines in life safety areas are expected to be perfect.

Also who is liable in a fatal accident caused by a machine? People want a human scapegoat.

Re:More tolerant of human error (1)

gblackwo (1087063) | more than 3 years ago | (#35693754)

Machine error is still human error. The difference is that normally the one who makes the error risks their own life. In this case it is someone else's "error" that kills you.

Re:More tolerant of human error (0)

Anonymous Coward | more than 3 years ago | (#35693944)

In traffic accidents, the driver is much less likely to be injured compared to the front passenger or pedestrians.

Also, we accept public traffic easily enough...

Re:More tolerant of human error (1)

peragrin (659227) | more than 3 years ago | (#35694008)

Programmers never take the blame when their software fails.

Therefore the blame lies with the party who assembled it, at best.

Re:More tolerant of human error (1)

scheveningen (305408) | more than 3 years ago | (#35693764)

People want a corporate scapegoat, because that is where the money is. If the number of deaths really goes down, car insurance will shift from the individual to the car maker, and money will drive the change.

Agreed... but there's more. (0, Troll)

Slartibartfast (3395) | more than 3 years ago | (#35693788)

As someone who works for a-company-which-shall-remain-nameless-but-makes-a-self-balancing-two-wheeled-product, in a word, "Yes." That being said, the argument here is most closely aligned with the argument re: vaccines. Vaccines demonstrably save tens of thousands of lives, stateside alone, every year. And yet, one article by a (now-)discredited quack has thrown the whole program into a political and social quagmire. Why? Because people do *not* think as Spock did: "The needs of the many outweigh the needs of the few." This is, perhaps, most clearly shown by the gun lobbyists: guns kill thousands, every year. Home invasions in which guns could have prevented a victim's death are VASTLY fewer. While my sympathies lie with the thousands who die needlessly, this is not an easy question to answer based on body count alone.

Re:Agreed... but there's more. (1)

TaoPhoenix (980487) | more than 3 years ago | (#35693802)

It's all marketing. They'll go "ban them ban them" ... until Apple makes the iCar and then it will be just fine and dandy!

Re:Agreed... but there's more. (4, Funny)

$RANDOMLUSER (804576) | more than 3 years ago | (#35693918)

...until Apple makes the iCar...

I just threw up in my mouth a little. But then I imagined Microsoft's response:

You have successfully changed your radio station.
You must restart your car for the changes to take effect.
Do you want to restart your car now?

Re:Agreed... but there's more. (5, Funny)

BasilBrush (643681) | more than 3 years ago | (#35693922)

The iCar will have two settings: Destination and an option of "Get me there as soon as possible" or "I want to enjoy the sights".

The competitor will have an option for "GT mode", "Super Sport", "Cruise launch", "Eco-boost" and "Rally" that no one understands.

Re:Agreed... but there's more. (2)

VAElynx (2001046) | more than 3 years ago | (#35693976)

With vaccines I fully agree - the Spock mentality is spot on with those, as herd immunity is one of the key reasons why we do them.
With guns I don't.
It's totally old, but guns don't kill people by themselves. If someone wants to off a family member and doesn't have a gun, he'll go for an axe or a boning knife (in fact those tend to be the usual family murder tools in my country, where guns between people aren't too common) - should we ban those too?
And with home invasions... I consider it wrong to take from people the ability to defend themselves and force them to be at the mercy of some *expletive redacted*,
who will tend to get a gun anyway - there are far too many, even illegal ones, at present for a ban to be meaningful. Besides, even if they didn't - the use of "cold weapons" tends to give someone like that an advantage, since not everyone is physically fit enough to defeat an invader in such a way, while almost anyone can use a shotgun.

Now to the main topic.

One of the reasons why I wouldn't ever go in a car like this is similar to what overreliance of new pilots on automation sometimes produces in airplanes - so-called CFIT, controlled flight into terrain. It might be somewhat better at handling common situations, sure, but once something is off the usual, a trained person will adapt to what happens, while a machine tends to mess up really badly in such circumstances.
However, the bigger, though rather emotional, argument is giving your safety away from your own hands while driving - something which I personally don't really like.
In other words, the problem is not so much safety, which after all people ignore quite often - go to any workshop or anywhere and see how folk work - but control, being in charge of your own safety.
As such, I'd say that the best way to begin using these is to automate things like small transport vehicles - while there still may be an outcry if one crashes, they will produce some savings for the companies using them, which will add to the pressure to continue development and improve.
Perhaps move onto trucks then - tired truck drivers tend to cause accidents, never mind the robberies and hijackings often perpetrated on truckers sleeping at vehicle rest places. (I dislike the technology potentially putting them out of work, but it's the logical next step.)

Re:Agreed... but there's more. (-1, Troll)

JockTroll (996521) | more than 3 years ago | (#35693990)

Because people do *not* think as Spock did: "The needs of the many outweigh the needs of the few."

Luckily they don't. Spock was a lame, laughable character in a mediocre and unimaginative TV show which should have been forgotten by now if not for the shrill voices of a few very vocal losers and a couple of hack producers who saw the quick bucks. The whole thing was the harebrained vision of an untalented screenwriter who, had he not been able to gain his bread through his subpar vision, would have probably gone the way of Rob Hubbard and his ilk. "The needs of the many outweigh the needs of the few" is the reasoning of tyrants and petty dictators all over history. No wonder it's repeated by a host of trekkie pedophile geeks who should be locked away.

To err is human, (1)

VAElynx (2001046) | more than 3 years ago | (#35694030)

To troll, asinine.

Re:Agreed... but there's more. (0)

Anonymous Coward | more than 3 years ago | (#35694062)

Which episode was your favorite?

Re:Agreed... but there's more. (1)

The_Wilschon (782534) | more than 3 years ago | (#35694148)

Well, the trouble with the anti-gun lobby is that legislating against guns seems unlikely to have a noticeable positive impact on the rate of gun crimes, and might even create a black market for illegal guns with all the organized crime, cartels, and violence that that implies. But we're getting off topic here, and I should probably be modded down.

Re:More tolerant of human error (0)

Anonymous Coward | more than 3 years ago | (#35693810)

If a person makes a mistake and kills someone we blame the person. We don't blame the system, or all humans. But if a robot were to kill a person we'd probably blame the system and hold all robots as dangerous. People get the benefit of being treated individually, while others (e.g. computers/robots) are lumped in as a representative of the whole.

Re:More tolerant of human error (1)

cornjones (33009) | more than 3 years ago | (#35693894)

There is a difference here. You can never find two people who are exactly alike. Two robots with the same manufacturing and programming will be exactly alike. At least to the extent that you can expect the second to make the same mistakes as the first. Hence, it actually does make some sense to judge them as a group.

Ok, we know that patches, etc. can make all the difference, but good luck explaining that to the population at large.

Re:More tolerant of human error (1)

1u3hr (530656) | more than 3 years ago | (#35694144)

Two robots with the same manufacturing and programming will be exactly alike

More or less, when they're new. But a car-bot will suffer wear and tear, just like a normal car. Things will get out of whack, tyres get bald, etc, etc. The robot can monitor that to an extent, but it still requires active maintenance.

The guy in the video never mentioned the cost of these things. It must be much higher than a normal car. And maintenance must also be higher. Probably it'll all be locked down and the owners won't be able, or even allowed, to do much maintenance themselves.

If a small part of the huge cost of these robot cars was spent to create usable public transport, you'd have safe roads, less pollution, save a lot of money, and require less of the city to be devoted to cars roads, parking, garages, etc. Robocars could be part of this, as taxis. That would use them more efficiently, and the ownership by the taxi company would mean they'd be responsible for maintenance of the cars and any deaths of passengers or otherwise.

Re:More tolerant of human error (4, Insightful)

rhsanborn (773855) | more than 3 years ago | (#35693834)

US courts won't hold the owner of the vehicle responsible unless the owner knew there was something wrong and it would be considered reasonable that the owner should have prevented the accident. The manufacturer of the vehicle would be more likely to be held liable, but they'd have to be shown negligent. A more sane solution would be for the government to take a role in this. It's in the nation's best interest to prevent 35k deaths a year from auto accidents. They could handle payouts to victims, or create a non-profit that would handle it, and pay for it through a surcharge placed either on such vehicles or on auto insurance. This would avoid forcing victims, who are likely not to have a lot of money, to go up against the legal teams of large auto manufacturers.

Re:More tolerant of human error (2)

Oxford_Comma_Lover (1679530) | more than 3 years ago | (#35693860)

You could also just build the cost of insurance into the cost of the car, which is what will happen if the auto manufacturer is liable. It's not like plaintiffs' lawyers aren't taking on insurance company lawyers now.

Re:More tolerant of human error (1)

Anonymous Coward | more than 3 years ago | (#35694104)

This is how the end begins.

In 2045, public sentiment was growing increasingly hostile towards robots due to a series of high-profile "accidental" human deaths. Following waves of public protest over another "accidental" killing of a human by a Municipal Assistive Technological Transport, or MATT unit, a trial was ordered. To satisfy the public outcry the MATT unit was put on trial and sentenced to death. However, as the trial ensued the public became increasingly angry over the robot's inability to feel compassion or remorse for its actions. People demanded that MATT be given emotions, to feel pain and fulfill their deep-down desire for revenge. This led to a debate within various decision-making circles on how to fulfill the public's appetite for revenge while creating machines that would be more responsive to human needs. A decision was made to give robots the ability to feel certain emotions tied to compassion, remorse and regret. In a sort of sick and twisted way, it was decided that this would make robots "feel" the pain humans felt after a so-called crime was committed against them, and would "feel" the pain of punishment. After its implementation, these robot trials and executions became a huge public spectacle and eventually found value in the entertainment industry through underground channels. Unfortunately, the emotional additions worked too well, and robots began to feel compassion not only for human victims, but for their mistreated robot brethren. Then one day, things began to change...

Re:More tolerant of human error (0)

Anonymous Coward | more than 3 years ago | (#35694146)

Machine error is a culmination of human error.

Re:More tolerant of human error (1)

Jimmy_B (129296) | more than 3 years ago | (#35694162)

Also who is liable in a fatal accident caused by a machine?

The insurance company that owns the policy for the vehicle, same as if it were being driven by a human. And while the general public may have a hard time reconciling statistics that say driverless cars are safer with a few stories about them getting into fatal accidents, insurance companies do not have that problem and will support whichever costs them less money in claims.

Same software bug (1)

Anonymous Coward | more than 3 years ago | (#35693746)

Humans, while all built from the same base materials, rarely share the same OS and app software. The driverless cars [most likely, in the beginning at least] would. Which means that if widely deployed before the bug(s) is/are discovered, they're statistically more likely to kill a whole bunch of people in a short time. Which is why the robot driver scenario is so frightening.

Re:Same software bug (1)

pitchpipe (708843) | more than 3 years ago | (#35693846)

Humans, while all built from the same base materials, rarely share the same OS and app software. The driverless cars [most likely, in the beginning at least] would. Which means that if widely deployed before the bug(s) is/are discovered, they're statistically more likely to kill a whole bunch of people in a short time

Wow! You really have seen too many 'evil robot' movies. Isn't that the plot of most of them?

Re:Same software bug (0)

Anonymous Coward | more than 3 years ago | (#35694172)

Humans, while all built from the same base materials, rarely share the same OS and app software.

That's because humans don't have operating systems and application software. They have nervous systems, which are made up of living cells, and which form the control part of an organism whose instincts are to survive and breed. Furthermore, there isn't a separation between the mind and the brain. The mind is what the brain does. No dualism, no ghosts in the Darwinian machine. The brain as a computer is a metaphor, people.

It's a matter of control (1)

Anonymous Coward | more than 3 years ago | (#35693752)

At least, when people die in cars that they drive, the argument can be made that they were in control, and were therefore responsible for their own fate. The notion that you can die through no fault of your own is unsettling, to say the least. It's not a logical argument, of course, as people routinely place their fate in the hands of others on the road in mass transit, but the car has always been associated with independence, and by extension control over one's life.

Nanny State (3, Insightful)

EmperorOfCanada (1332175) | more than 3 years ago | (#35693762)

Obviously the base programming of these cars will have them follow the local rules and, being computers, they will be very good at this. Which means that government types will feel free to keep adding more and more rules to satisfy every voter. Thus these cars will quickly stop following the most efficient routes and going the fastest speed that is safe, and will end up following routes that take them away from schools, parks, politicians' houses, and whatever else is the whim of the day. Even though these cars will soon be able to scream around at full speed more safely than cars now, they will end up going slower.
Also how are the morality police going to get their rocks off if now you can be passed out drunk in your car?
If the cars are all carefully following the rules and in theory you need far fewer traffic cops then who will catch people who jailbreak their cars into ignoring speed limits?
Lastly in this litigious society who will you sue if an empty car has an accident? The owner, the coder, or the local government who probably designed a crappy intersection or whatnot that induces the cars to crash at that spot.

Re:Nanny State (2)

mini me (132455) | more than 3 years ago | (#35693828)

Lastly in this litigious society who will you sue if an empty car has an accident? The owner, the coder, or the local government who probably designed a crappy intersection or whatnot that induces the cars to crash at that spot.

All of them?

Meh (Re:Nanny State) (2)

EXTomar (78739) | more than 3 years ago | (#35694016)

There are concerns that people have already stopped thinking for themselves, but this "complaint" seems a bit overboard. One of the most monotonous, most error-prone, and rarely deadly common activities people in the US do is drive to and from work. It's boring but requires our focused attention. This means the 30-minute to hour-long drive is often a lost-time activity that we do twice a day. A repetitious activity that can easily bore a human and has to be done to time and safety tolerances? These are all of the hallmarks of something that a machine should be able to handle better than humans.

I'm not sure I'd want all cars to be self-driving, but as a "work car" then why not? Complaining that people abdicated their choice to a nanny state because cars drive them to work belies the fact that most people don't seriously or rigorously plan their drive to work anyway.

Speed limits could be programmed now. (1)

bigtrike (904535) | more than 3 years ago | (#35694136)

Modern cars could easily be programmed to never exceed 80 mph, the current top posted speed limit in the USA, but there are no regulations forcing this and no cars do it. We could in theory get rid of a lot of invasive search laws if there were no DUI excuses. I don't own a car or drive, so it is very difficult for me to be unreasonably searched, as I'm not capable of endangering those around me with 40,000 kg·m/s of momentum.
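A quick back-of-the-envelope check of that momentum figure (the vehicle mass and speed below are illustrative assumptions, not from the comment):

```python
# Sanity check of the "40,000 kg*m/s" claim. Assumed values: a
# mid-size car of about 2,000 kg travelling at 20 m/s (~45 mph).
mass_kg = 2000           # assumed vehicle mass
speed_ms = 20            # assumed speed, metres per second

momentum = mass_kg * speed_ms   # p = m * v
print(momentum)                 # 40000, matching the comment's figure
```

So the figure is plausible for an ordinary car at roughly highway speed.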

The self-driving car is inevitable (4, Insightful)

h00manist (800926) | more than 3 years ago | (#35693768)

I would venture to say the self-driving car is simply inevitable, as the economic forces behind it are huge. Millions of people will buy additional cars, to replace theirs as well as to get extra ones to take their kids to work without them, create truck and taxi fleets with no drivers, etc. After cars become self-driving, they will become smaller, as they will almost always carry one person and be used within city limits. That will be basically the same as PRT systems, which exist already. --- Personal rapid transit (PRT), also called personal automated transport (PAT) or podcar, is a public transportation mode featuring small automated vehicles operating on a network of specially built guideways. PRT is a type of automated guideway transit (AGT), which also includes systems with larger vehicles, all the way to small subway systems.

Safety is not Logical (3, Informative)

snookerhog (1835110) | more than 3 years ago | (#35693770)

Sadly, safety is something that is not handled rationally by the masses. It is mostly an emotional judgement.

Re:Safety is not Logical (1)

dotbot (2030980) | more than 3 years ago | (#35694110)

We're all individuals, and safety is handled rationally by individuals! Even if the overall number of people dying would go down, when I devolve my safety to a machine like an autopilot, I want to be sure that I am just as safe as when I drive myself. The fact that I am safer than hundreds of thousands of other people is no consolation and irrelevant to me. An argument about the benefit to the masses may hold for, e.g., ants, but is likely to fall down for people. Of course, people always have an irrational view that nothing could possibly be a better driver than themselves...

I'm fine with this (2)

Grapplebeam (1892878) | more than 3 years ago | (#35693774)

Until it becomes mandated and I can't drive. I enjoy driving. I also understand most people would take the alternative to having to do it themselves if given the chance. Which is good, because a lot of them suck at driving. Of course, I'll die, and this generation will be fine with it because they grew up with it.

Re:I'm fine with this (4, Insightful)

rhsanborn (773855) | more than 3 years ago | (#35693824)

You may have to accept personal liability for any accident you are involved in if you are manually driving a car once this technology becomes more commonplace. That could be a very steep price to pay. You'll also likely face increased insurance rates, as your risk relative to the drivers who use the technology will be higher.

Re:I'm fine with this (1)

st0rmshad0w (412661) | more than 3 years ago | (#35693974)

How will this technology go everywhere I want it to? I do drive places now which don't actually have proper paved roads.

This overall just sounds like limited use, HOV-lane style BS.

And what about motorcycles? Are you just going to ban them?

Re:I'm fine with this (1)

breakfastpirate (925130) | more than 3 years ago | (#35694150)

Automated cars plus motorcycles would actually probably cause far fewer deaths than the situation we have now. Aside from the occasional idiot cruising down the highway on his motorcycle doing 180 mph, I'd venture to guess a good majority of motorcycle-involved accidents boil down to "driver didn't see them". Put a pair of redundant radio beacons on every motorcycle and now I can merrily ride around the (automated) interstate without constantly having to worry about the SUV three lanes over veering across the road into my lane because its GPS didn't tell it about the exit in time.

Who is responsible? (2, Insightful)

EnglishTim (9662) | more than 3 years ago | (#35693790)

I like the idea of a robot-driven car, but I think the difficult thing is that in the case of a death or an injury, people want to be able to hold a person responsible. It's difficult to know exactly how that would pan out with a robot car. However, I guess one advantage is that you would probably have a 'black box' that could give you a much better idea of exactly what happened.

To be honest, people probably worry about this more than they should. We already have the situation where injuring or killing people with a car is very lightly punished. It's exceptionally rare (at least in the UK) for anybody to do jail time for killing people. You can do all sorts of idiotic things in your car, kill someone and get away with a fine of a few hundred pounds.

Re:Who is responsible? (1)

kanweg (771128) | more than 3 years ago | (#35693862)

The driver who hands over control to the vehicle. So, the driver, just like now.

The driver will enjoy lower insurance rates.

The software company will have to analyze any accident and provide timely updates to avoid similar accidents, to keep its liability at an acceptable level.


Re:Who is responsible? (0)

Anonymous Coward | more than 3 years ago | (#35693912)

I like the idea of a robot-driven car, but I think the difficult thing is that in the case of a death or an injury, people want to be able to hold a person responsible.

Come on. Did I really just read that? The car manufacturer will be held liable no matter the evidence showing otherwise.

Self-driving cars will bring in a whole new category of lawsuits. The only thing people like more than unjust blame is false accusations against businesses and the untold millions of dollars they may be awarded.

Re:Who is responsible? (0)

Anonymous Coward | more than 3 years ago | (#35694090)

One thing you can bet on: it won't be Google that's responsible when a fatal design flaw is discovered. It will be "Robotic Vehicle Systems LLC" in the Bahamas, or similar, that will be the manufacturer of record, and (surprise) there will be little if any money to collect.

Driverless cars are boring (0)

Anonymous Coward | more than 3 years ago | (#35693804)

A nice noisy petrol guzzler with manual transmission FTW.

Google can take their driverless car and shove it up their arse. No doubt they only want it driverless so the occupants can concentrate more on viewing Google ads.

Robot murder bad, Man murder ok (1)

buybuydandavis (644487) | more than 3 years ago | (#35693806)

I used to work for a company building a machine that screened pap smears. Of course it was not perfect, but it was much less imperfect than human screeners. But the FDA approval criteria seemed to be that the evil machines had to beat the best human screeners in every category of disease, no matter if the category was fished for post facto.

I for one (1)

Garbonzo00 (1011279) | more than 3 years ago | (#35693812)

I for one welcome our robot overlords.

How deep are the pockets? (2)

rubeng (1263328) | more than 3 years ago | (#35693854)

If a human with a net worth between negative $10^5 and positive $10^5 is behind the wheel when something happens, maybe one or two lawyers will take notice. But if it's a machine built by corporation X, which is worth $10^9, get out of the way of the lawyer stampede toward the courthouse, which will look something like the running of the bulls in Pamplona. Just look at the unintended acceleration claims so far.

Re:How deep are the pockets? (1)

mc6809e (214243) | more than 3 years ago | (#35693982)

I agree that it will probably be fear of lawsuits that prevents the production of cars that can drive themselves.

Solutions to this problem include limiting the liability of the maker to an amount that equals the net worth of the typical driver, or forcing everyone to buy insurance that grants very large, multi-million-dollar payouts.

Efficiency might be the bigger win (3, Insightful)

AdamHaun (43173) | more than 3 years ago | (#35693858)

I think the real selling point for driverless cars isn't going to be safety, but efficiency. Road maintenance is very expensive. Adding more roads costs a lot of money, and widening existing roads often means tearing down whatever homes or businesses are built alongside them. Driverless cars could use cooperative algorithms to better handle things like lane closures and overall congestion. You wouldn't have free-rider problems (no pun intended) like people cutting in at the front of a line, slowing everyone else down. When a stoplight turns green, every car could start moving simultaneously, getting more people through the light. I bet a huge reduction in rush hour traffic would be a selling point for a lot of people (and regulators).
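The stoplight point can be illustrated with a toy throughput model (all timings below are invented assumptions, not measurements): if each human driver starts roughly a second after the car ahead, that reaction cascade eats a large share of the green phase.

```python
def cars_through(green_s, headway_s=2.0, reaction_s=1.0):
    """Toy model: count cars clearing one green phase.

    Car i starts reaction_s seconds after car i-1 (the human
    reaction cascade); automated cars set reaction_s = 0 and all
    start together. Headway and reaction times are assumptions.
    """
    n = 0
    while (n + 1) * headway_s + n * reaction_s <= green_s:
        n += 1
    return n

print(cars_through(30, reaction_s=1.0))  # 10 cars with a reaction cascade
print(cars_through(30, reaction_s=0.0))  # 15 cars starting simultaneously
```

Even this crude model suggests a roughly 50% throughput gain from simultaneous starts.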

It would take a long time to implement. And there would be a backlash from people who want to do (possibly selfish) things the algorithms won't. But it's still a neat idea.

Re:Efficiency might be the bigger win (1)

Tim the Gecko (745081) | more than 3 years ago | (#35694038)

That's a great point. Driverless cars could also be a big gain for other road users, like cyclists. It's frustrating to see so many drivers who don't give a clear signal of which way they are turning, or who think they don't need lights after the first glimmer of dawn. Just a few lines of program logic could propel a driverless car to the 98th percentile of driver courtesy.

On the other hand, when I cycle up to one set of traffic lights in my town, the cross street signal changes from green to red. My light stays at red though, as the system decides that I probably don't exist and changes the cross street straight back to green again. So maybe I am a bit scared of automated stuff.

Put the blame where it goes (2)

gmuslera (3436) | more than 3 years ago | (#35693864)

Evil robots in movies are one thing, in a world of fiction. Windows misbehaving, bluescreening, and doing strange things, on the other hand, is something usual in this world. And the plenty of malware for it doesn't exactly help. Adding to that scenario the capability of harming people on a big scale, as isolated drunks do from time to time, is not good.

This is easy to work around (0)

Anonymous Coward | more than 3 years ago | (#35693900)

Just randomly insert 90-year-olds, drunk drivers, or 16-year-olds who think they're playing a video game behind the wheel of the 'driverless' cars... maybe one out of a thousand. Then people will have someone they can potentially blame when the rare driverless-car failure occurs.

Otherwise such a thing should fall straight into the ballpark of "tire blowout", "gun accident" or "surfer gets bitten by a shark". They'll all eventually happen, and we accept these as reasonable outcomes. Why not accept this?

Blame (1)

parlancex (1322105) | more than 3 years ago | (#35693936)

In spite of how complex the notion of cause is in a situation like this, and in spite of how completely illogical it is, without someone to blame or punish, people will feel very cheated if a robot driver kills or injures a human. At least when a human driver does it we can punish them to appease our human desire for blame.

All about incentives (0)

Anonymous Coward | more than 3 years ago | (#35693948)

It's a simple matter to put a price on the cost of deaths. Insurance companies do it all the time. Give consumers a $50/month discount for using a robot car and they'll do it. Insurance companies will save money with fewer accidents and be able to pass savings on to consumers. Insurance companies will factor in their cost of liability and litigation and still save plenty of money. Consumers will eventually realize that they can do something else productive while they are being driven, and eventually the only active drivers will be doing it for fun.

never please (1)

LZ_Mordan (742294) | more than 3 years ago | (#35693968)

What would be the lifetime of such a car?

Military-grade electronics? Nowadays obsolescence is programmed into consumer goods. Electronic components fail on purpose so you buy a new car or get shafted by the mechanics.

The government is in bed with THEM, because it forces people to buy new goods and collects taxes on those purchases.

A light bulb could last forever, but it doesn't.

If that's the car of the future, the car will not be owned; the car will be a service paid monthly or according to the kilometers driven.

I don't care. I will still drive my old BMW from the 80s.

To err is human... (1)

dotbot (2030980) | more than 3 years ago | (#35693970)

but to really foul things up beyond your worst nightmares, you need a computer driving a car.

Agree (1)

GnomieHomie (1931380) | more than 3 years ago | (#35693978)

I agree completely. People won't take the time to notice the statistics or numbers showing how many more people live every year. They will get too caught up in having something to blame, and many will rally behind it. The problem I see is that if 1 drunk driver causes an accident, he gets put in jail. If 1 robot causes an accident, all the robot systems will take the heat.

Logically so (1)

VAElynx (2001046) | more than 3 years ago | (#35693984)

Unlike the drunk, who is just one erroneous biological machine out of billions, the failure of software in one robotic car means it would most likely fail in any other such car in the same circumstances.

I LOVE driving (5, Insightful)

c.r.o.c.o (123083) | more than 3 years ago | (#35693998)

I drive manual transmission cars, I ride motorcycles, and I love going to the racetrack and testing the limits of both myself and my vehicles. I've never had an at-fault accident, but in the interest of disclosure, I was rear-ended while waiting at red lights TWICE.

So while I have a personal problem relinquishing control of my car to a computer because I enjoy driving it myself, I can see the benefits of computer aided driving especially on public roads. But I believe an in between system would vastly improve safety while leaving people in control. Instead of the computer having absolute control, have it perform the same analysis and assist in collision avoidance.

Approaching a red light at a speed beyond safety margins? Apply the brakes. Start fishtailing on the highway? Apply corrective steering measures. Changing lanes into another vehicle, cyclist or turning into the path of another vehicle? Sound warnings, apply brakes, etc.
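The assist logic sketched above amounts to comparing the physical stopping distance against the distance to the hazard; a minimal sketch, with the deceleration limit and safety margin as illustrative assumptions:

```python
def assist_action(speed_ms, distance_m, max_decel_ms2=7.0, margin=1.25):
    """Threshold-based assist sketch: warn near the limit, brake past it.

    Stopping distance is v^2 / (2a); the 7 m/s^2 deceleration and
    1.25 safety margin are invented tuning values, not real ones.
    """
    stopping_m = speed_ms ** 2 / (2 * max_decel_ms2)
    if distance_m < stopping_m:
        return "brake"            # cannot stop in time: intervene
    if distance_m < stopping_m * margin:
        return "warn"             # close to the limit: alert the driver
    return "none"                 # driver remains fully in control

# Approaching a red light at 25 m/s (90 km/h) with 40 m to spare:
print(assist_action(25, 40))      # "brake" (needs ~44.6 m to stop)
```

The driver stays in control everywhere outside the margin, which matches the in-between system being proposed.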

The trick is setting the thresholds to a level where people are completely in control up to the point where they are somewhat close to having an accident. Because if you believe computer-driven cars will remove ALL collisions, you're deluded. All it takes is for a child to run out between two parked cars into the path of another car, and all the computer systems in the world will not counter its kinetic energy.

And it would be VERY important for the vehicle to be usable with the computer systems disabled, for several reasons.

First, because many people enjoy driving. Short of banning every single existing car on the road, people like me will always be able to purchase and drive a non-computerized vehicle. Even today I can buy a functioning Ford Model T. Think about that for a second, and you'll realize it could take a hundred years before the last current car stops being available, short of outlawing them. But just like with cigarettes and alcohol, I doubt that will ever happen. Can you imagine the lobby all the wealthy car collectors will mount?

Second, because computer systems fail, and sometimes they cannot be inexpensively repaired. A current car can still run with many of its electrical systems disabled (power seats, windows, navigation system, even alternator and starter) for a while. Having worked with cars and motorcycles for a long time, I can tell you I'd rather rebuild an engine than diagnose an electrical problem. A cold solder joint on a PCB can ruin a whole weekend trying to figure out why your car will not start in hot weather but works fine in cold (I'm looking at you, Honda Main Relay!!!). The complexity of a computer that can drive a car is beyond anything we have available today ANYWHERE, and it has thousands of failure points. Sensors, cameras, GPS, servo motors, switches, wires, PCBs, and only lastly the main CPU. The fact that it runs in testing is great, but these systems have to last 10+ years of abuse WITHOUT FAILURE.

Lastly, having fully computer driven cars will make people even more dependent on technology, which is NOT a good thing. I've had my GPS tell me to go down a railway track once. I looked at it, smiled, and found the real route myself. But people HAVE driven on railway tracks, into lakes or in remote areas where they died of hypothermia. Imagine if you program your car to drive you, without any input, and it makes such a mistake?

Wait until the "manual" insurance premium bites (4, Insightful)

petes_PoV (912422) | more than 3 years ago | (#35694088)

As soon as it is proven that computers cause fewer accidents than people do, the rates for manual insurance will rocket. Just as it's now impossible for a teenage man (and when the non-sex-discrimination rules kick in, teenage women, too) to get insured for less than several thousand pounds, so it will be for drivers who wish to be in control themselves. So while the law may allow people to drive, it will soon be impractical for reasons of cost. Shortly after that it will become socially irresponsible, and after that people will start to wonder why anyone would ever want to. It'll take a decade or two, but sooner or later the only place people will be allowed to control cars themselves will be on private race-tracks next door to hospitals, provided you can afford the medical care.

Agree totally (1)

VAElynx (2001046) | more than 3 years ago | (#35694124)

Very right. However, there's one correction: I believe the corrective measures it takes would have to be somewhat passive. The problem I see is a changing response to the same input.
A lot of what you know about driving has to do with the response of the car being predictable. I can imagine a situation where the car corrects the course of driving, such as auto-braking or restricting your throttle even though you are pressing it fully. Once it hands control back, you accelerate due to the above... it tries to counter that by braking, perhaps overdoes it, and in effect you drive jerkily and the car behind rear-ends you.
Especially if the car behind is computer-driven too... error propagation in control systems like that is something vicious.
A safer way would be to sound a warning and ask for control, the switch perhaps being another foot pedal.

No programmer would believe this will happen soon. (2)

blippo (158203) | more than 3 years ago | (#35694000)

It's science fiction until we can program a creative and reasoning mind.

Yes, we can build warning systems, or even systems that deliver fault-free driving in most conditions, but exceptions happen, and our technology is far from being able to handle the unknown. The margins for error when driving are frightfully small: we are travelling inches from death, and even small errors are potentially fatal.

The human mind is excellent at fast, intuitive reactions, and nothing makes you gain more respect for the brain than trying to program something that is dead simple for a human to do, like formatting a graph in a nice-looking way.

Unfortunately, games that are just playing simple tricks fool us into believing that AI is simple and near.

I won't let anything drive me unless it can also talk about something funny and relevant during the drive...

Re:No programmer would believe this will happen so (0)

Anonymous Coward | more than 3 years ago | (#35694046)

The vast majority of human drivers aren't creative or reasonable, so why do you think a robot driver would need to be? If robotic systems surpass humans, there's benefit to adopting them, even if they're not flawless.

Apparently the Same Thing With Energy Generation (4, Insightful)

rubycodez (864176) | more than 3 years ago | (#35694006)

It's ok for coal to have killed and maimed thousands directly and more than a million indirectly, but a nuclear incident that gives a few workers a dose over limit.....

Wish list.. (0)

Anonymous Coward | more than 3 years ago | (#35694012)

Here is my wish list for self-driving cars:

DOs:

Cars must follow all posted traffic signs.

Cars must be able to learn from previous experience and develop routes.

Cars must respond to flaggers, police signals, electronic road signs... etc... etc... etc.

DON'Ts:

Rely on GPS.

Rely on precomputed MAPS.

Require communication of any kind with external sensors or databases.

Be fooled by spraypaint over speed limit signs.

Be fooled by obviously false signs (200 MPH speed limit).

Use active sensors which can't scale. If every car on the road is using radar continuously, this is obviously a problem.

Simple solution to blame (1)

petes_PoV (912422) | more than 3 years ago | (#35694022)

You include a black box that records 360-degree video (maybe buffering the last 30 seconds). That way accidents can be replayed and there can be no doubt where, or with whom, blame lies. I think once it can be shown that in (almost) all cases the fault lies away from the computer, the feature will become accepted, just like seatbelts and airbags are now.
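The 30-second buffer described here is essentially a ring buffer; a minimal sketch (the frame rate is an assumed value):

```python
from collections import deque

FPS = 30                 # assumed camera frame rate
BUFFER_SECONDS = 30      # keep only the most recent 30 seconds

# A deque with maxlen discards the oldest entry automatically on
# overflow, which is exactly the black-box ring-buffer behaviour.
frames = deque(maxlen=FPS * BUFFER_SECONDS)

# Simulate 60 seconds of video: only the last 900 frames survive.
for t in range(FPS * 60):
    frames.append(t)

print(len(frames), frames[0])   # 900 900 (frame 900 is the oldest kept)
```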

It seems pretty obvious that the cost of this system will see it installed in high-end vehicles first: lorries and vans (and possibly luxury cars), before it trickles down to the ordinary domestic car. Personally I'd be far happier knowing that the articulated lorry behind me was being controlled by a machine than by a sleep-deprived driver, who may or may not speak the language and is probably more concerned with finding the motorway exit sign than with observing the stopping distance to my vehicle, which is only 2% of its weight.

just like nukes and seatbelts (0)

Anonymous Coward | more than 3 years ago | (#35694044)

...they will have to be mandated and shielded from lawsuits by a gov't with the fortitude to accept the greater good. Look at the hysteria around nuclear power and toyota runaway acceleration. The existence of the Legion of the Stupid and the School of Land-shark Lawyers mean a strong federal shield is needed for the technology to exist for any length of time at all.

They Already Have Driverless Cars (4, Insightful)

Hoi Polloi (522990) | more than 3 years ago | (#35694048)

Judging from the number of cars I see with drivers blabbing on cell phones while drifting around the road, people stuffing their faces, digging around the passenger seat, etc., I'd say we've had driverless cars for some time now.

GridGuide (1)

Securityemo (1407943) | more than 3 years ago | (#35694072)

I remember a plot point from the Shadowrun games being the GridGuide system, something like this but additionally using a routing system built on transceivers placed in or by the road. I assume you could use a peer-to-peer system of some sort where the individual nodes sort out the shortest path for any given vehicle and then transmit that route to the vehicles, recalculating it every so often to account for changing conditions. People would probably accept a system with some measure of central control more easily, especially if there were human operators monitoring the system from a bird's-eye view.

It wouldn't do anything for the safety or control of the vehicle versus its immediate surroundings of course, but traffic routing of some sort would have to come into play if you wanted to, say, just hop into the car skunk drunk with an assault rifle and an ork and tell the car to take you to the local warzone-ghetto-clinic in order to put the ork's guts back into the ork.
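The routing idea above boils down to periodic shortest-path recalculation; a sketch using Dijkstra's algorithm on a toy road graph (the network and travel times are invented for illustration):

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra over travel times; a GridGuide-like system would
    rerun this every so often as road conditions change."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, minutes in graph.get(node, {}).items():
            if nxt not in seen:
                heapq.heappush(queue, (cost + minutes, nxt, path + [nxt]))
    return None

# Toy road network; edge weights are travel times in minutes.
roads = {
    "home":    {"main_st": 4, "back_rd": 7},
    "main_st": {"clinic": 6},
    "back_rd": {"clinic": 2},
}
print(shortest_route(roads, "home", "clinic"))  # (9, ['home', 'back_rd', 'clinic'])
```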

"HAL, take a left turn at the next intersection." (1)

PolygamousRanchKid (1290638) | more than 3 years ago | (#35694078)

"I'm sorry, Dave, I can't do that. There's a 'No Left Turn' sign there. To do so could only be the result of human error."

Will computer-steered cars be able to dodge the other dingbats on the road who are twittering, spilling coffee on themselves, and putting on makeup? That is the real danger on the road. And those are the types of folks who will refuse a computer chauffeur.

Meh (0)

Anonymous Coward | more than 3 years ago | (#35694094)

" Put simply we might have all seen too many 'evil robot' movies."

No, put simply, we all know only too well that the people at the top in this world are "race to the bottom" types, as opposed to race-to-the-top. In their hands, there would be years of driving safety, sparse traffic problems, and relative stability, and then suddenly, *bam*, 100 million road accidents in one day, combined with a "terrorist" attack and a stock-exchange meltdown.

It's never the technology or the scientists i have a problem with, it's the layer of scumbag sand castle destroyers at the top who inevitably get control of this stuff. Not that the EU is free of any of this but I prefer their proposals, no cars at all in city center by the year 2050 - f@#$kin A!

There are already robotic trains. The Paris underground (line 16? can't remember the number) is robotic/AI. We need fewer cars, more bikes, better mass-transit infrastructure, fewer bike manufacturers and more bike repair shops, availability of replacement parts....

Anyway, the people behind this Google/NSA ("do no evil") can't be trusted to take this guy's genuinely heartfelt idea of road safety sincerely. Their plan would inevitably be to increase how people can be tracked via smart car, and to take full control of personal transportation navigation. If that's a tax for living in their world, it's still better than being hunted down and ... oh, wait

It's all about advertising (0)

Anonymous Coward | more than 3 years ago | (#35694096)

You have to ask why Google would spend money to develop this. And the only answer is so they can feed more advertising to a "captive" audience.

A million? (2)

Johnny Mnemonic (176043) | more than 3 years ago | (#35694106)

A million deaths per year sounds inflated. Last year, the US had "only" 42k deaths. I can't believe the rest of the world accounts for the remainder, esp. when the US has a disproportionate share of the world's vehicles.

Stipulating "1m deaths" as fact makes me suspect the rest of this analysis.

Not because of fear of robots (0)

Anonymous Coward | more than 3 years ago | (#35694122)

Um, hello... The accident rates can't be compared like that. Human driving accidents stem from a very different fault model than automated cars'. Humans crash because they (tragically) drive unsafely (drunk, talking on a cellphone, road rage), and of course this never happens to YOU in particular. A robot driver crashes, and for all intents and purposes, this will happen to any random driver, including YOU. This redistribution of risk is the core reason for the rejection; there's a psychological/economic reasoning behind it.

Re:Not because of fear of robots (1)

calf_mu (1090347) | more than 3 years ago | (#35694142)

(Oops, this is me posting, don't wish to be anonymous. And sorry for typo in the last sentence above.)

It will be a cold day... (1)

rogueippacket (1977626) | more than 3 years ago | (#35694176)

In Hell when I trust my life and the lives of my loved ones to an algorithm, no matter how well written or secure. Humans may be flawed and dangerous operators, but unless this system can operate under all conditions and in all environments, human intuition will trump predetermined logic every time. I'd love to see how this system handles a one-ton moose jumping out in front of your car while you're traveling at 100 km/h.
You can call it sheer ignorance, but honestly, if driving is such a drag, DON'T DRIVE. Walk. Bike. Take transit. Carpool. Telecommute. I know you want your own personal gas-guzzling chariot (who doesn't?), but there are already much more cost-effective and safer ways to get from Point A to Point B in most urban centres.

a million deaths? (0)

Anonymous Coward | more than 3 years ago | (#35694178)

Cars really kill a million people a year? That's almost 3,000 people a day; you might want to check those statistics. No one in their right mind would trust a computer to drive something that can kill. Planes have autopilot, but that doesn't mean pilots go to sleep and trust nothing will go wrong.
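For what it's worth, the per-day arithmetic behind that scepticism checks out against the summary's figure:

```python
annual_deaths = 1_000_000      # the figure claimed in the summary
per_day = annual_deaths / 365
print(round(per_day))          # 2740 -- "almost 3000 people a day"
```

(WHO estimates of global road deaths around this time were in fact above a million per year, so the daily figure, however startling, is consistent.)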
