Slashdot: News for Nerds


Who Is Liable When a Self-Driving Car Crashes?

Unknown Lamer posted about 6 months ago | from the werecar-had-too-much-to-drink dept.

Transportation 937

innocent_white_lamb writes "Current laws make the driver of a car responsible for any mayhem caused by that vehicle. But what happens when there is no driver? This article argues that the dream of a self-driving car is futile, since the law requires that the driver be responsible for the operation of the vehicle. Therefore, even if a car is self-driving, you as the driver must stay alert and pay attention. No texting, no reading, no snoozing. So what's the point of a self-driving car if you can't relax or do something else while 'driving'?"


937 comments

Efficiency. (5, Insightful)

lifewarped (833507) | about 6 months ago | (#45908079)

A self-driving car would be less likely to rubberneck or cause other issues related to a human driver. Cars could, in theory, go faster, etc.

Re:Efficiency. (2)

silas_moeckel (234313) | about 6 months ago | (#45908151)

Can't wait till they update all those dedicated bus/carpool lanes: self-driving cars, no speed limit, max safe speed determined by the cars; slower cars automatically pull over and let faster cars pass. Hell, leave the buses in, as long as they stop obstructing the flow of traffic.

Re:Efficiency. (1, Insightful)

Anonymous Coward | about 6 months ago | (#45908407)

Not to mention that the government can hack your car to kill you, like they did Michael Hastings -- and Hastings' car wasn't even self-driving.

It is for this reason that I drive an older model with a manual transmission, with manual door locks and crank-operated windows. Government takes out my brakes? No problem, shift into first and engine-brake going 10 mph down the hill. Stuck accelerator? Put 'er in neutral. Get caught in a storm or drive into a lake? I can simply unlock the door or roll down my windows and swim out, no power components to seize up or go inactive. Starter or battery dead? Push-start the car. Save gas? Coast in neutral down large hills. It will take nothing short of a remote-controlled bomb or gunfire or a chase ram car to assassinate somebody driving an all-manual car.

-- Ethanol-fueled

Re:Efficiency. (0)

Anonymous Coward | about 6 months ago | (#45908533)

Someone who wants you killed would kill you. A manual car is not a deterrent. But enjoy the delusion.

Re:Efficiency. (2)

jythie (914043) | about 6 months ago | (#45908187)

Though I wonder how long it would take before marketers started allowing for customized driving parameters.

One of the major problems with current traffic flow is that it only takes a few aggressive drivers gaining minor advantages to slow everything down. There are enough people who, when presented with "you can get there in 8 minutes while everyone else takes 12, or everyone including you can take 10 minutes, but for each person who chooses 8 minutes everyone, including them, takes one minute longer," will take the 8-minute option.

So I could, sadly, see a future where we end up with the exact same traffic problems due to people being able to set how selfish they want their auto-car to be, resulting in the same general slowdown that selfish human drivers produce.
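The 8-versus-10-minute trade-off can be sketched as a toy payoff function; the numbers are the commenter's hypothetical, not real traffic data:

```python
def trip_time(selfish: bool, k: int) -> int:
    """Travel time in minutes when k drivers have chosen the selfish option.

    All-cooperate baseline: 10 minutes each. A selfish driver starts
    from 8 minutes, but every selfish driver adds one minute to
    everyone's trip, the selfish drivers' included.
    """
    base = 8 if selfish else 10
    return base + k

# Defecting is always 2 minutes better for the individual...
assert trip_time(True, 1) == trip_time(False, 1) - 2
# ...but with 3 or more defectors, even the defectors are slower
# than the 10-minute all-cooperate baseline.
assert trip_time(True, 3) > trip_time(False, 0)
```

This is the classic congestion-game shape: the individually rational choice degrades the outcome for everyone, defectors included.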

Re:Efficiency. (5, Insightful)

amicusNYCL (1538833) | about 6 months ago | (#45908375)

Think of all the problems it could solve, though. For example, oblivious drivers riding shoulder to shoulder at the same speed and not letting anyone else pass. If the cars were autonomous, they could simply tell each other to move over. I would love to have that ability now. Lane speed could also be regulated. If you wanted your car to drive slower, it would stay in the farther-right lanes. If your car was being passed on the right, it would keep moving over until no one is passing it on the right. It would be great if humans did that today; failing to do so causes most of the slowdown that I see on the highways.

Re:Efficiency. (0, Troll)

MickyTheIdiot (1032226) | about 6 months ago | (#45908453)

This comment gives me a pretty good picture of what type of driver you are, so I am sure it didn't even come to mind that there should be a MAX speed. A driver (even in the left lane) who is going way too fast for conditions is still endangering the person in the right lane. Also, that person who is slower than you, whom you hate so much, has the right to go the speed he/she wants, and to pass the car that wants to go slower.

Re:Efficiency. (0)

geekoid (135745) | about 6 months ago | (#45908473)

Why would there be a need for passing? You will get on the road, get into the lane best suited for the distance you are traveling, and then go the regulated speed. The vehicles will talk to each other via a VMN (Vehicular Mesh Network).

Traffic will adjust dynamically.

You are not important; maintaining good traffic flow for society is what matters.
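A minimal sketch of what "the lane best for the distance you are traveling" might mean. The leftmost-lane-for-long-trips rule and the 20 km-per-lane band are invented for illustration; they are not from any real VMN specification:

```python
def choose_lane(remaining_km: float, num_lanes: int) -> int:
    """Hypothetical lane assignment: lane 0 is the leftmost (through-traffic)
    lane; short remaining trips stay right for easy exits.

    Each lane covers a 20 km band of remaining trip distance (an
    arbitrary constant chosen for this sketch).
    """
    band = 20.0
    lane = num_lanes - 1 - int(remaining_km // band)
    # Clamp to a valid lane index for very short or very long trips.
    return max(0, min(num_lanes - 1, lane))
```

On a four-lane road, a 5 km hop gets the rightmost lane (3) while a 70 km leg gets the leftmost (0); traffic sorts itself without anyone needing to pass.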

Re:Efficiency. (0)

Anonymous Coward | about 6 months ago | (#45908449)

So would this be a parallel to overclocking? Or perhaps more along the lines of net neutrality. Maybe they could limit the number of passes your car gets in an hour similar to the way you have a limited number of skips on Pandora.

Re:Efficiency. (5, Interesting)

crakbone (860662) | about 6 months ago | (#45908511)

Actually, I see the opposite. When I drive people around, they talk, work on their phones, or make calls. They don't usually tell me to go faster. In an automated car you would most likely see people start to do other, more important things than worry about the .25-second advantage they would get by cutting off three cars.

Re: Efficiency. (0)

Anonymous Coward | about 6 months ago | (#45908215)

In line with the theme of this article, would you really want to be going faster in that super-rare case of an accident? We won't be able to react when we're at a rest state and have to do... something, while the machine is being ultra-efficient.

Re:Efficiency. (1)

mcgrew (92797) | about 6 months ago | (#45908541)

True, self-driving cars will be safer, but that doesn't answer the question. At first you'll still need insurance. If one of them does cause a crash because of a mechanical malfunction, why would anything change? Automakers and mechanics are sued all the time for crashes caused by mechanical problems (which are actually rare, almost all car wrecks are human error). Example: Ford and Firestone for the SUV rollovers.

I think eventually driving without insurance will be legalized when the human factor is removed.

Safety (5, Insightful)

adamstew (909658) | about 6 months ago | (#45908085)

I would think the point would be that machines, once properly programmed, can be the world's safest drivers...statistically. You, as a human, will still be responsible for taking over when the machine doesn't know what to do. But, for the other 99.5% of the time, the self-driving car will make the best decisions and always be completely alert.

Self-driving cars, I believe, have the ability to drastically reduce deaths caused by motor vehicle accidents...one of the leading causes of death in the USA.

Boring Drive (4, Insightful)

ZombieBraintrust (1685608) | about 6 months ago | (#45908181)

But with nothing to do behind the wheel 99% of the time, you're not going to be alert. You're going to be super bored. So when you're supposed to take over, you won't be prepared to do so.

Re:Boring Drive (5, Informative)

SJHillman (1966756) | about 6 months ago | (#45908385)

Cars should have a failsafe option when faced with a decision in dangerous circumstances. Something like "pull the fuck off the road without hitting shit then ask what to do". Sure, even a failsafe option can't account for everything, but it will probably still do a better job than your average human driver - alert or not.

If we always waited until 100% of the issues are ironed out, then we still wouldn't even be using fire. Personally, once machine drivers are statistically safer than human drivers, I'm all for adopting them as our vehicular overlords.
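The "pull off the road, then ask" failsafe can be sketched as a tiny decision policy. The confidence threshold, signal names, and the three-way outcome are all invented for illustration; no production autonomy stack is this simple:

```python
from enum import Enum, auto

class Action(Enum):
    CONTINUE = auto()   # planner is confident; keep driving
    PULL_OVER = auto()  # the failsafe: stop safely, then ask what to do
    HANDOFF = auto()    # last resort: alert the (possibly bored) human

def decide(planner_confidence: float, shoulder_clear: bool) -> Action:
    """Hypothetical failsafe policy: when confidence drops, prefer
    stopping safely over dumping control on an unprepared human."""
    if planner_confidence >= 0.95:  # arbitrary threshold for this sketch
        return Action.CONTINUE
    if shoulder_clear:
        return Action.PULL_OVER
    return Action.HANDOFF
```

The point of ordering the branches this way is that handing control back to the human is the worst option, exactly because of the boredom problem described above.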

Re:Boring Drive (2, Insightful)

MickyTheIdiot (1032226) | about 6 months ago | (#45908497)

Yep, you're right, but the problem is that people are so fucking stupid that if any non-autonomous drivers were on the road it would be pulling over constantly. How many times a week do people get too close to you on the highway or tailgate? How many times a week do you pull up to a four-way stop and some hillbilly can't comprehend what to do? The same things will happen to self-driving cars while there are still people driving their own machines.

Re:Safety (1)

CanHasDIY (1672858) | about 6 months ago | (#45908237)

I would think the point would be that machines, once properly programmed, can be the world's safest drivers...statistically. You, as a human, will still be responsible for taking over when the machine doesn't know what to do. But, for the other 99.5% of the time, the self-driving car will make the best decisions and always be completely alert.

The problem with that is, how much notice do you think a computer is going to give the operator when it comes across one of those situations it doesn't know how to handle? 5 minutes, probably not; 5 seconds*, maybe. That's not a lot of time to switch gears from "casually reading a book" to "OMFGABIRDISCOMINGTHROUGHTHEWINDSHIELD!"

* I'm probably being quite generous.

Re:Safety (1)

SJHillman (1966756) | about 6 months ago | (#45908411)

A machine equipped with the full range of sensors available today will probably be able to detect, decide and alert the passengers to the threat faster than the average human driver would be able to detect and react to the same threat in the majority of situations.

Re:Safety (1)

HornWumpus (783565) | about 6 months ago | (#45908519)

Human brains are still better than computers at that type of pattern matching. Autonomous cars will require strong AI.

Re:Safety (1)

geekoid (135745) | about 6 months ago | (#45908509)

The emergency situation will be handled by the system. In fact, that will be the greatest reason for increased driving safety.
The .5% is for places like parking garages, dirt trails, and other edge cases.

Re:Safety (4, Insightful)

gstoddart (321705) | about 6 months ago | (#45908301)

I would think the point would be that machines, once properly programmed, can be the world's safest drivers...statistically. You, as a human, will still be responsible for taking over when the machine doesn't know what to do.

No way that's gonna work.

There's no way you can expect people to be alert and responsive if they only have to be on the ball for that small fraction of the time -- they'll have started reading their paper or doing plenty of other things.

If I'm responsible for the operation of the vehicle, I'll bloody well drive myself and be engaged for the entire time, and don't need your autonomous car.

If I'm not responsible for the operation of the vehicle, I want to be in the back seat in one hell of a good safety cage, with no pretense whatsoever that I'm in control.

You can't have the vehicle responsible most of the time and the ostensible operator responsible whenever that suddenly stops working; it defeats the purpose.

Which, to me, is kind of a fairly fundamental problem with self driving cars. It's all or nothing. And if *all* the cars on the road aren't autonomous, then the autonomous ones are mostly a traffic hazard with no clear liability.

Re:Safety (0)

Anonymous Coward | about 6 months ago | (#45908371)

This sounds good. Have the car fail and dump control to the user who has been napping for the last 10 minutes. Fail safe is a concept auto makers need to learn before they even think of self driving cars. I would trust Google or Microsoft to understand this but no way in hell would I trust Toyota or Ford to get this right.

Re:Safety (1)

alen (225700) | about 6 months ago | (#45908419)

Most car deaths in the USA happen around the big holidays and weekends, times when people are drinking and driving, and probably driving late at night and tired.

If the drunks buy the self-driving cars, then it will reduce deaths. But then, by law, they still have to stay alert to take over.

Re:Safety (0)

Anonymous Coward | about 6 months ago | (#45908499)

I would think the point would be that machines, once properly programmed, can be the world's safest drivers...statistically. You, as a human, will still be responsible for taking over when the machine doesn't know what to do. But, for the other 99.5% of the time, the self-driving car will make the best decisions and always be completely alert.

Self-Driving cars, I believe, have the ability to drastically reduce deaths caused by motor vehicle accidents...one of the highest causes of death in the USA.

The catch is "Properly Programmed".

That doesn't happen with computers and embedded devices.
They do hang every once in a while.

After passing "Hello World," it's downhill from there.

If you don't want to drive, then don't get behind the wheel.
Take the mass transit or let someone else do the driving.

Bennett Haselton Fakeout (0)

Anonymous Coward | about 6 months ago | (#45908093)

I was already mid-groan when I realized it wasn't him...

Insurance (4, Insightful)

mfwitten (1906728) | about 6 months ago | (#45908105)

There's an industry that manages risk.

Regulation (e.g., insurance) always develops spontaneously, because there is a market for reducing chaos.

Re:Insurance (5, Interesting)

jythie (914043) | about 6 months ago | (#45908205)

*nod* I could see the liability resting on your insurance carrier, with premiums then being based on the model of car, version of software, or configuration.

Re:Insurance (-1)

Anonymous Coward | about 6 months ago | (#45908267)

I could see that too, but these days insurance companies don't do jack, then when it's time to collect, they weasel out. Determining your car's software version would be too much work for them and cut into profits.

Re:Insurance (5, Funny)

sjames (1099) | about 6 months ago | (#45908467)

when it's time to collect, they weasel out.

Too true. Clearly we need insurance insurance.

Re:Insurance (2)

bill_mcgonigle (4333) | about 6 months ago | (#45908505)

Determining your car's software version would be too much work for them and cut into profits.

Empirical data points to the opposite: http://fitguide.installernet.com/progressive/ [installernet.com]

Re:Insurance (5, Interesting)

bill_mcgonigle (4333) | about 6 months ago | (#45908481)

Right. It needs to be strictly civil liability - the government could really hose this up if they attach criminal penalties.

The computer industry has set a terrible precedent here, which I hope is stopped. The person running an unpatched XP box in a botnet should be just as liable for the damage his PC does as the person riding in his car is for the damage his car does. Kaspersky should be selling combination AV/insurance packages.

People wonder why Linux doesn't catch on despite being so much more secure than Windows. One of the factors is that Windows doesn't have to be as good, because liability is artificially limited by the government, and that's a direct subsidy. Absent that protection, either Windows would get better or it'd become too expensive to run.

Re:Insurance (1)

MightyYar (622222) | about 6 months ago | (#45908489)

Yup. And I'd bet that autonomous cars will get you a discount, like having a theft-deterrent device. Owners won't care too much that they are liable, since they were always liable and their insurance just went down. Behind the scenes I expect all sorts of juicy court battles, as insurers and manufacturers fight over things like manufacturing defects vs. improper sensor maintenance and the like -- but from the owner of the vehicle, I don't expect much resistance.

Re:Insurance (2)

vt0asta (16536) | about 6 months ago | (#45908531)

Agreed. It's the same as if you were driving. However, there could be a safe-auto-driver discount on your insurance if you allow the vehicle to do the driving more than you do... and if there is an added fee for driving a vehicle with auto-drive, that too will dictate its adoption and incorporate liability costs. Further, there are these things called courts, and they've been known to settle grey areas like these: "What did the manufacturer know and when did they know it?" Also, as per usual, the life you save may be your own, and your life and property value will be reduced down to a number provided by an actuary. The difference will be that your responsibility and negligence may come down to software maintenance or lack thereof... instead of how many beers you were drinking beforehand... which will now have had no impact on the way the vehicle was driving.

The masturbator will be at fault (1)

Anonymous Coward | about 6 months ago | (#45908109)

If neither party is engaged in any such activity, then an age based system will be utilized.

laws change (5, Insightful)

MozeeToby (1163751) | about 6 months ago | (#45908115)

Current law not appropriate for future technology! News at 11!

Re:laws change (1)

Archangel Michael (180766) | about 6 months ago | (#45908305)

Current law not appropriate for future technology = poorly designed law. Such a law should be repealed immediately. The replacement should be technology-neutral. There are always flaws in every system; we cannot eliminate all flaws, but we can mitigate them.

At some point, it would be better to assume the flaws, build in common structure for handling "no fault" accidents (technology failures) financially so that we remove the "get rich quick" aspects of tort litigation, and incorporate those costs into the system.

I guarantee that the transition to driverless cars will be quick and efficient once it is proven (statistically) that the technology makes a better driver than humans. It will be too expensive to allow humans to drive.

Think of it this way, No More Drunk Drivers .... period.

Re:laws change (1)

Jason Levine (196982) | about 6 months ago | (#45908483)

The problem is that laws can't be designed with future technology in mind as you never know where future technology will lead. Who could have envisioned, 50 years ago, that we would have cars that drove themselves? A law isn't poorly designed if some future technology isn't handled by it. In cases like that, the law needs to be updated, completely rewritten, or repealed (depending on the new situation). That's just the reality of laws and technology.

Re:laws change (1)

SJHillman (1966756) | about 6 months ago | (#45908441)

Hell, current laws aren't appropriate for many current technologies.

but your honor! (2)

swschrad (312009) | about 6 months ago | (#45908445)

I was researching the appropriate statutes in the Combined Annotated Statutes of the Law of the State of (wherever) at the time the vehicle ran down six members of the State Supreme Court. I refer you to Evidence Photo #17, in which the rest of the car was full of lawbooks. Your Honor, this case should be considered pre-appealed, as it has already been presented to the Supreme Court, and I should be released on personal recognizance...

Shouldn't this be obvious? (0)

Anonymous Coward | about 6 months ago | (#45908491)

If Google's technology controls the self-driving car, then Google is effectively driving the car, and therefore Google is 100% responsible for what happens while the car is driving.

The insurance company is liable... (0)

Anonymous Coward | about 6 months ago | (#45908121)

This is already a solved problem.

Laws will adapt. (0)

Anonymous Coward | about 6 months ago | (#45908123)

Laws will adapt.

Maybe.

troll (1)

cellocgw (617879) | about 6 months ago | (#45908135)

Talk about a crazy-assed prognostication! This is a ridiculously stupid question (cue the "even by slashdicetimmy standards" responses).

you might as well ask what would happen if it turned out that the number of angels that can dance on a pin turned out to be finite.

Easy, the owner's insurance company (0)

Anonymous Coward | about 6 months ago | (#45908141)

Or, actually everybody since insurance is a shared risk among all insured.

This makes the insured driver look smart, by the way.

Change the law (0)

Anonymous Coward | about 6 months ago | (#45908145)

Change the wording of the law to something to the effect of "The person at the controls of the vehicle must pay sufficient attention to the vehicle's status".

Then, add a bunch of loud alarms to the car that the computer can sound when something goes wrong, or when red lines are crossed (speed over the limit, distance to car in front, certain amount of wheel slippage, etc).

Now, as long as you, the driver, aren't drinking alcohol or doing drugs, and at least one person in the vehicle is awake at all times, you can be alerted to a dangerous situation and put down your book / food when a problem comes up.
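The red lines above can be sketched as a simple monitoring check. The specific thresholds (a 2-second following gap, 10% wheel slip) are invented for illustration; real values would come from the relevant highway code and the vehicle's dynamics:

```python
def alarms(speed_kph: float, limit_kph: float,
           gap_seconds: float, wheel_slip: float) -> list[str]:
    """Return the alarms the car should sound when a red line is crossed.

    gap_seconds is the time gap to the car in front; wheel_slip is the
    fraction of wheel rotation not matched by road speed (0.0 to 1.0).
    """
    triggered = []
    if speed_kph > limit_kph:
        triggered.append("over speed limit")
    if gap_seconds < 2.0:           # assumed minimum following gap
        triggered.append("following too closely")
    if wheel_slip > 0.10:           # assumed traction-loss threshold
        triggered.append("wheel slippage")
    return triggered
```

The idea is that the computer never needs the human mid-maneuver; it only needs the human to notice an alarm and put the book down before the situation degrades further.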

Depends (2, Insightful)

Murdoch5 (1563847) | about 6 months ago | (#45908149)

If the car has a software issue and crashes, then the software developer is at fault. If the car has a hardware problem, then the hardware developer is at fault. If the car has a mechanical failure, then the mechanical engineer is at fault, and so on. Either develop the components/modules correctly in the first place or not at all. If modules/components have lifespans, then just lock the car from starting once those lifespans have been reached, and if you don't want to be left holding the torch when shit hits the fan, then don't get involved from the get-go. Despite this modern system of passing the buck and never accepting ownership of the problem, someone caused the issue by not doing there job right to begin with, and they should have to rectify it.

Re:Depends (2)

SJHillman (1966756) | about 6 months ago | (#45908485)

"If the car has a software issue and crashes, then the software developer is at fault. If the car has a hardware problem, then the hardware developer is at fault. If the car has a mechanical failure, then the mechanical engineer is at fault, and so on."

"Fault" and "Liability" are not the same thing. You can be at fault without being liable, or liable without being at fault.

Re:Depends (1)

Whatsisname (891214) | about 6 months ago | (#45908513)

*their

Also, not all failures are caused by "not doing there job right", especially when venturing into new territory. The Tacoma Narrows Bridge, a classic example of a disastrous engineering project, pushed the envelope and collapsed, but not because the engineers didn't do their job right. There hadn't been a bridge of that size with that design before, and aerodynamic concerns weren't taken into account. If that bridge hadn't collapsed and taught the lesson, some other bridge would have.

You can never remove all risk. You may call that 'passing the buck', but blaming all failures, regardless of cause, on "not doing there job right" forces a stone-age level of technological capability.

Who is Liable when a Horse..... (0)

Anonymous Coward | about 6 months ago | (#45908155)

Who is liable when a Horse has an accident? The rider, or the owner, or the trainer of the horse?

Re:Who is Liable when a Horse..... (1)

boristdog (133725) | about 6 months ago | (#45908463)

Generally it is the rider. I have seen drunk riders pulled over in New Mexico, where horses are used when a driver has their license taken away for DUI. Seriously, I have seen this twice in Taos.

Then I guess they get charged with RUI.

The driver is responsible. (1)

Erich (151) | about 6 months ago | (#45908157)

If you are driving, you are responsible.

A car that drives itself is responsible for itself.

Who pays in the event of an accident is the driver. In this case, the car. Probably the manufacturer would be liable.

Manufacturers will probably get insurance for the car when driven autonomously. If self-driving cars are safer, this should be a lower insurance rate than you pay now. Additionally, self-driving cars will probably have sensor input that will prove/disprove fault.

Re:The driver is responsible. (0)

Anonymous Coward | about 6 months ago | (#45908289)

Fault does not necessarily equal liability.

A car is a dangerous instrumentality. You can have liability associated with the vehicle, and not be at fault.

Auto-Driving cars would still require you carry insurance.

Re:The driver is responsible. (0)

Anonymous Coward | about 6 months ago | (#45908523)

The operator of heavy machinery is liable for the safe operation of said machinery. It doesn't matter that the car "drives itself". There's still a human operator.

Only if the law doesn't change (0)

Anonymous Coward | about 6 months ago | (#45908163)

If the law doesn't change, driving will be even more boring than it is right now with automatic transmission and cruise control, but otherwise I still want those self-driving cars.

But I'm sure a world convention is being drawn up in the back offices as we speak, to amend current highway codes at some point to suit these cars.

Insurance company is liable (0)

Anonymous Coward | about 6 months ago | (#45908165)

The insurance company would have to pay. If they can prove that computer-driven cars are safer than humans, it won't be long before insurance companies mandate them and charge MORE for human-in-the-loop drivers (aka iRobot).

Got a solution... (1)

buck-yar (164658) | about 6 months ago | (#45908173)

Maybe it should be like government-caused problems, where the taxpayers pay for all of them.

Ever heard of mechanical failures? (0)

Anonymous Coward | about 6 months ago | (#45908189)

If your breaks fail despite you keeping a safe distance and performing proper maintenance, you wouldn't be held accountable. Why would the on-board computer be treated differently?

Re:Ever heard of mechanical failures? (2)

HornWumpus (783565) | about 6 months ago | (#45908263)

What? Simply wrong. You cause an accident, you pay. If your brakes fail, after you pay, you might have a civil case against your mechanic or car manufacturer.

Re:Ever heard of mechanical failures? (1)

SJHillman (1966756) | about 6 months ago | (#45908503)

If your breaks break, does that mean they stay in one piece?

Won't be the manufacturer ... (4, Informative)

gstoddart (321705) | about 6 months ago | (#45908193)

The manufacturer will have an EULA which absolves them from guilt.

It won't be the people who sold it, because they'll also have a contract term which says they are absolved from guilt.

So, it will come down to the owner, who will be entirely dependent on the quality of the product, as delivered by two entities who have already said "not us".

So, if you privately buy an autonomous car, and it crashes, you will likely be on the hook for it. If you merely hire them (as in a Taxi), then I'm sure the people who rent them will also absolve themselves from guilt in some strange way -- likely through arms length 3rd parties who do the actual operation.

This won't be so much "buyer beware" as "everyone else on the roadway beware", because you'll have a vehicle driving around that if it crashes, there's a long line of people who have already made sure their asses are covered.

The lawyers for the companies making and selling these will have covered their asses before it ends up in the hands of anybody else.

Re:Won't be the manufacturer ... (0)

Anonymous Coward | about 6 months ago | (#45908479)

All this is exactly why I will opt to remain in control of my vehicle.

Nothing changes (0)

Anonymous Coward | about 6 months ago | (#45908207)

The person in charge of the vehicle will still be liable. Pretty sure that auto manufacturers will make people sign waivers until they pass out and note that the self driving car isn't, in fact, capable of safely driving itself and allowing it to do so is acceptance of risk. In the US we have warning labels on hair dryers informing people not to put them in water...

Isn't it kind of obvious? (1)

Lucas123 (935744) | about 6 months ago | (#45908217)

A vehicle malfunction that causes an auto accident won't be attributed to the driver. When Toyota's gas pedals were getting stuck and causing deaths, the lawyers were going after drivers. There's no difference with autonomous vehicles. If the technology is found to be at fault, it will be the part manufacturer and the auto maker who will be dragged into civil court.

Re:Isn't it kind of obvious? (1)

Lucas123 (935744) | about 6 months ago | (#45908235)

Typo: "the lawyers weren't going after drivers" when gas pedals were getting stuck.

The driver (0)

Anonymous Coward | about 6 months ago | (#45908219)

The "driver" (operator) will be liable. Insurance will pick up the tab though. Since self-driving cars will be less likely to be at fault in a crash, everybody will eventually accept the (reduced) risk.

Automated vehicles already exist (5, Interesting)

i_ate_god (899684) | about 6 months ago | (#45908227)

Just from memory:

The Montreal Metro is driven by autopilot, with someone in the cab for door management.

The Vancouver SkyTrain doesn't even have a driver anywhere; it's all automated.

Several airports (Orlando was the last one I went to), have automated trains/monorails to shuffle people between terminals.

Most flights you take are done almost entirely on autopilot.

So far, it seems that mass transit is increasingly automated. So why is non-mass transit any different?

Re:Automated vehicles already exist (3, Informative)

gstoddart (321705) | about 6 months ago | (#45908433)

Except, being on rails provides distinct advantages in terms of things being on auto-pilot.

There are far fewer degrees of freedom in terms of what can happen, because, well, you're on frigging rails.

You need to monitor your speed and your braking, but the turning is enforced by the rails unless you're going way too fast.

So why is non-mass transit any different?

Because cars aren't on rails?

Planes are slightly different, because you can bet that the pilot is still ultimately responsible for the aircraft, and if it crashes due to pilot error, he's going to be the one hung out to dry. (Other than that, we mostly just hope/trust that pilots are professional, qualified, and able to do the job at hand)

Re:Automated vehicles already exist (1)

mythosaz (572040) | about 6 months ago | (#45908529)

Sure, being on a track makes autopiloting and "self-driving" easier, but the question the submitter proposes is already answered.

We have self driving vehicles already, and amazingly, we know what to do when there's a crash.

Hell, escalators break, and hurt people FFS. This isn't any different.

Re:Automated vehicles already exist (1)

bob_super (3391281) | about 6 months ago | (#45908535)

Maintenance.
Many people will not maintain their car until something breaks, hoping that it won't be at 150 kph.

You've also mostly chosen examples where failure is limited to the mass transit itself (except for planes, but they have pilots with a great incentive to aim for something soft). A failing automated car drifting into my lane is suddenly a much more complex liability case.

Re:Automated vehicles already exist (0)

Anonymous Coward | about 6 months ago | (#45908537)

A train on a track only moves in one dimension.
A flight on autopilot has two college-educated humans sitting next to the control system, both being paid six-figure salaries and having over 500 hours of training.
If this technology were remotely ready for prime time, we would already see it in mail delivery and trash removal.

Laws adapt to change (0)

Anonymous Coward | about 6 months ago | (#45908239)

I'm not sure legal concerns mean that the dream is futile, it just means we'll have to figure out the laws around it.

Like with my accounting software (0)

Anonymous Coward | about 6 months ago | (#45908255)

When my accounting software gets it wrong, they eat the bill.

If a self-driving car were to crash and the fault turns out to lie with the car (i.e., it wasn't slammed into by some teenage kid who doesn't understand stop signs), then the human who owns the car would obviously be shown, in the language of the car's liability statement, that the human is at fault if the car was driving in rain or snow, or if the human hadn't observed all 25 points on the day's pre-drive AI diagnostic checks. So in theory: the software. In practice: the human.

Good point (0)

Anonymous Coward | about 6 months ago | (#45908257)

This is a good point. Laws have never been changed to accommodate new technology. Given that precedent, self driving cars will never happen because today's laws are set in stone.

Short-sighted (1)

RavenousRhesus (2683045) | about 6 months ago | (#45908265)

Open your eyes to the more distant future where all new cars are self-driving and only antiques require drivers. Then we can lay the blame for self-driving-car-on-self-driving-car accidents with the manufacturers.

!(Programmer) (0)

Anonymous Coward | about 6 months ago | (#45908271)

We can't hold the programmers, sorry Software "Engineers" at risk, because the software comes with NO WARRANTY, OR FITNESS FOR A PARTICULAR PURPOSE, and CONTAINS KNOWN DEFECTS.

I think there is simply too much regulation. We should loosen the engineering restrictions to match the software engineering restrictions. This bridge is made under no warranty, or fitness for a particular purpose, and contains known defects. Your only remedy is the cost of the bridge, or the cost of the toll, whichever is less.

Let's also apply this to airplanes, elevators, etc.

Everyone can be an engineer under these new regulations. Enjoy the commute. Sorry, the elevator module encountered a Java exception and plunged into the basement. It didn't have to be type-safe, just safe enough.

This has been dealt with (1)

geekoid (135745) | about 6 months ago | (#45908277)

It would be the exact same liability that it is now: the manufacturer is at fault for manufacturing defects, software or otherwise.

This isn't an issue.

Re:This has been dealt with (1)

RobinH (124750) | about 6 months ago | (#45908417)

Exactly. If a manufacturer makes a car that explodes when hit in a rear-end collision (Ford Pinto), they get sued. If they installed faulty brake lines, they'd get sued. If they provide a self-driving car, they have to make it "reasonably safe", where "reasonable" is determined by the current state of the art in that field of engineering, or by a jury informed by expert witnesses.

It's all about liability (1)

kilodelta (843627) | about 6 months ago | (#45908287)

And the liability will shift to the manufacturer of the autonomous vehicle more so than the person riding in it and owning it.

I don't think much has to change.... (5, Interesting)

spinozaq (409589) | about 6 months ago | (#45908295)

The change will happen slowly, organically, over time. A self driving car will behave statistically as a very safe driver. Ownership of a self driving car should bestow upon you lower insurance rates. If the current insurance companies balk at the idea, the private market will take over and "self driving only" insurance companies will gladly take their place. Eventually, as more and more share of vehicles are self driving the size of the insurance industry will shrink significantly.

I see no reason to change the liability burden away from the "Driver". It may seem counterintuitive, but you are gaining economic advantage by using your self-driving car. For that advantage, you accept the risks, and insure yourself against them. That said, operating a self-driving car will/should carry significantly less risk and liability than driving yourself around does now.

That does not mean that the car makers are off the hook. Just like today, if a vehicle mechanically malfunctions in a way that the car maker is found responsible, the insurance company may attempt to subrogate the claim to them.

And who would manufacture them? (1)

larry bagina (561269) | about 6 months ago | (#45908299)

Remember the Toyota software bug? Toyota cars had a software bug that caused older drivers to accidentally hit the gas when they wanted to hit the brake! But it only affected older drivers, and driver height was also a factor. Anyhow, an expert witness reviewed the source code and testified at a civil trial that he couldn't find any bugs, but he couldn't rule out that bugs existed. The jury found Toyota liable. Cha-ching!

So even if you use something like Haskell to prove your code correct or Node to prevent blocking, the car manufacturer may still be responsible for drivers pressing the wrong pedal. It's software, there might be bugs, pay up.

Re:And who would manufacture them? (1)

MickyTheIdiot (1032226) | about 6 months ago | (#45908355)

Didn't this bug also disproportionately hit women drivers from India?

Ancient horse law (0)

Anonymous Coward | about 6 months ago | (#45908315)

Didn't we used to have vehicles that were capable of their own decisions once before? We should just treat it however they treated horse-related crashes.

stay awake for the first few years (0)

Anonymous Coward | about 6 months ago | (#45908319)

I think the driver will have to be alert for about 5 years, until the vast majority of cars are driverless. People will want this; it will sell like iPhones to anyone who can get credit.

Even if I have to stay awake and alert, I'd still buy one for bumper-to-bumper traffic. How alert do you have to be at 7:30 a.m. just north of O'Hare in Chicago? (If you don't know: it sucks then and there.) I could read a book while the car goes 3 miles an hour and then stops, then 5 miles an hour and then stops (sorry, kilometer people).

Even at freeway speeds I can at least just relax and not have to worry about controlling acceleration, braking, lane changes, etc.

Re:stay awake for the first few years (1)

mythosaz (572040) | about 6 months ago | (#45908551)

Drivers are going to be alert for about two weeks, and then the novelty and thrill will have worn off, and he'll be like the guy who works at the amusement park on the roller-coaster. Yawn...

There's a clear business model here (1)

j_f_chamblee (253315) | about 6 months ago | (#45908321)

Who is liable if you have a crash in a taxi cab or a state-owned vehicle? The thing this article overlooked is that there is more than one business model for selling cars. Self-driving cars might flourish by allowing companies to provide a lower cost car service for those who either cannot or do not wish to drive themselves. Apps like Sidecar (http://www.side.cr/safety) and Lyft (http://www.lyft.me/safety) are already pointing in this direction and centrally controlled driverless car services could be a logical next step, especially if companies take on the liability for what happens during a ride -- just as they would in an airline, rideshare or taxi service.

Moreover, even if driverless cars don't become the norm, driver-assist cars may do so and could dramatically reduce accident rates. As a car and driving enthusiast, I am selfishly averse to all these changes, but the safety benefits are hard to argue against.

Just make the car white... (1)

torqer (538711) | about 6 months ago | (#45908327)

Just make the car white... and put a fruit symbol on it. Millions of people will buy it despite the fact it has no practical application.

easy (0)

Anonymous Coward | about 6 months ago | (#45908337)

It's Philip K. Dick.

ABS (0)

Anonymous Coward | about 6 months ago | (#45908341)

Cars already have anti-lock braking systems, but I haven't heard any driver worry about responsibility for bugs in the controller chip.

Not an issue...also it's a product liability (1)

mschaffer (97223) | about 6 months ago | (#45908351)

In many areas, this is governed not by statute but by legal precedent, and laws can be changed while precedents evolve.
Besides, depending on the cause of the accident, this could easily fall under existing product liability laws, regulations, and precedents.

Liability Insurance. (0)

Anonymous Coward | about 6 months ago | (#45908353)

Same as now. In most states, the driver is required to have liability insurance. I expect it won't be any different just because the car is autonomous.

It'll just be cheaper to insure a self-driving car than it would be to insure a car that requires an error-prone human to operate. =)

Universal insurance (0)

Anonymous Coward | about 6 months ago | (#45908361)

Universal insurance provided by the government through taxes, like universal single-payer healthcare in the more civilized countries, could be a solution. Don't try to assign blame, just fix the damage.

Because, God knows... (1)

Chris Mattern (191822) | about 6 months ago | (#45908367)

...there has to be *somebody* who can be sued. It's the American Way.

Analogue to this: What about autonomic drones? (0)

Anonymous Coward | about 6 months ago | (#45908415)

Will an EULA absolve you from murder? The disconnect between the person who pulls the trigger or pushes the button and the predictability of the resulting actions becomes larger and larger.

no fault insurance (1)

vux984 (928602) | about 6 months ago | (#45908431)

Some sort of no-fault insurance that all driverless car owners would pay into that accepts responsibility for and pays out damages on accidents seems like the obvious solution here.

If the cars are genuinely significantly safer, then it would be cheaper than current insurance. And if there is an accident, the damages are covered, and there's no penalty to the owner.

This doesn't seem like an intractable problem at all.

Not black and white (0)

Anonymous Coward | about 6 months ago | (#45908437)

This isn't a black and white situation where either you are driving or the car is driving. There is plenty of in between.

People get road rage, emotional, or tired behind the wheel. Presumably, a self-driving car would follow all the rules of the road, which humans do not. This can increase safety for everyone on the road, not just the individual in the self-driving car. So even if you're just letting the car do most of the driving while you rest your hands on the wheel, monitor some dials, and stay alert, there's still a lot of value in that.

So I hope that answers the question of what's the point of self-driving cars if you can't relax. Self-driving cars aren't just about taking a nap in the car during your morning commute.

Your premise is wrong (1)

Anonymous Coward | about 6 months ago | (#45908443)

The driver is not the [only] one liable.

First, any person, even a pedestrian or passenger who causes an accident through their own acts can be held liable. Even if you swerve to avoid a dog or child who runs into the road and sideswipe the car next to you, the parent of the child or owner of the dog will be liable for the damages (unless, of course, you failed your duty to keep a proper lookout or otherwise acted negligently).

Second, the owner of the vehicle is also liable for the acts of someone he authorizes to use the vehicle.

Third, if the car was in use by an employee/servant on the employer's behalf, the employer/master is responsible.

Fourth, the manufacturer is liable if a product defect is responsible.

Finally, none are liable if they (including their property (i.e. car) and their agents/employees/servants) were not the proximate cause of the accident.

Easy solution (1)

locrien (865888) | about 6 months ago | (#45908455)

Make it so the company who makes the car is "the driver" legally.

In other words, don't sell us your shitty cars if they crash.

Faulty question (0)

the computer guy nex (916959) | about 6 months ago | (#45908493)

You will never buy a driverless car, for exactly this reason (it will be beyond our lifetimes). There will be drive-assist vehicles, with the driver always expected to be alert in case of emergencies.

Michigan exempts manufacturers (0)

Anonymous Coward | about 6 months ago | (#45908521)

Michigan just passed a law exempting manufacturers from civil liability when driving aids malfunction.

Speeding tickets are civil, so I am using that the next time my cruise control malfunctions.

Novelty, Cruise Control (1)

eepok (545733) | about 6 months ago | (#45908545)

It's because of this conundrum that autonomous vehicles will only be novelty features on standard automobiles. It will be an auto-pilot or cruise control wherein the driver is still expected to take control in the case of an emergency that could not be measured by the car's sensors or accounted for by the car's algorithms.

And that's not bad! It's just not as idyllic as some would prefer.
