Slashdot: News for Nerds


How Do You Give a Ticket To a Driverless Car?

Unknown Lamer posted about a year and a half ago | from the kitt-got-into-the-bourbon-again dept.

AI | 337 comments

FatLittleMonkey writes "New Scientist asks Bryant Walker Smith, from the Center for Internet and Society at Stanford Law School, whether the law is able to keep up with recent advances in automated vehicles. Even states which have allowed self-driving cars require the vehicles to have a 'driver,' who is nominally in control and who must comply with the same restrictions as any driver such as not being drunk. What's the point of having a robot car if it can't drive you home from the pub while you go to sleep in the back?"


Better yet (4, Funny)

Anonymous Coward | about a year and a half ago | (#42384691)

I want to car see car fight the ticket in court!

Re:Better yet (-1, Offtopic)

Nyder (754090) | about a year and a half ago | (#42384699)

I want to car see car fight the ticket in court!

I want to car see car also.

Send the bill to Sergey (1)

Jeremiah Cornelius (137) | about a year and a half ago | (#42385027)

They were drivin' the foncker.

Terms of Service prolly screw you: "Google car is a beta" bullshit.

Extra safety (1)

Anonymous Coward | about a year and a half ago | (#42384695)

The human component is just there in case something unexpected happens on the road that self-driving cars may not be able to react to in time. While such disaster scenarios may be rare, the possibility isn't 0%, which is why you need someone who is able to drive.

Re:Extra safety (2)

firex726 (1188453) | about a year and a half ago | (#42384767)

But realistically, won't a robot be far better at staying safe than a human would be?

A human gets distracted and runs a light, causing an accident. With a robot you won't need to worry about it being distracted, or misjudging a distance, etc...

Re:Extra safety (5, Insightful)

Anonymous Coward | about a year and a half ago | (#42384827)

Have you ever played with sensors before? They aren't perfect and can give incorrect readings (depending on the type, they may not even be very accurate under the best of conditions), which means software must be written to take those conditions into account, and usually to coordinate among different types of sensors. But that software is written by people, and may have bugs in it (in fact it certainly will), plus it may simply not cover all real-world situations perfectly.

In some cases where people would have a tough time driving, the cars may do awesomely, but in cases where people would have little trouble the cars may behave strangely as sensors give odd readings, etc.

My phone has an incredible processor in it and can handle millions of calculations per second, but it still locks up sometimes, occasionally responding seconds later to all the stored input. Isn't that pretty close to being distracted?

Don't get me wrong, I want a self-driving car so badly it hurts sometimes, but I don't expect it to be perfect. And if that's what people expect, they are in for a world of disappointment and pain. And my fear is that this will mean people panic at the first accident and push back against allowing them at all.
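The coordination among imperfect sensors that this comment describes can be sketched in miniature. The readings and variances below are made-up numbers, and inverse-variance weighting is just one common way of combining noisy sensors, not how any particular car actually does it:

```python
# Sketch: combining two imperfect distance sensors, as the comment describes.
# All sensor values and noise figures here are hypothetical illustrations.

def fuse(readings):
    """Inverse-variance weighted average of (value, variance) pairs.
    A noisier sensor (larger variance) gets a smaller weight."""
    weights = [1.0 / var for _, var in readings]
    total = sum(v * w for (v, _), w in zip(readings, weights))
    return total / sum(weights)

# A low-noise sensor says 10.2 m; a higher-noise one says 11.0 m.
# The fused estimate lands between them, closer to the quieter sensor.
estimate = fuse([(10.2, 0.04), (11.0, 0.25)])
print(round(estimate, 2))
```

The point the comment makes survives the sketch: the combining logic itself is software written by people, and every threshold in it is a judgment call that may not cover all situations.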

Re:Extra safety (1)

firex726 (1188453) | about a year and a half ago | (#42384985)

Have you ever used your eyes? They aren't perfect and can give incorrect readings (like if you're in any way distracted, out of focus, or affected by a million other things).

Re:Extra safety (1)

wisnoskij (1206448) | about a year and a half ago | (#42384855)

You must never have played a game with AI.

Here is a hint: we are actually very bad at creating smart machines. A 9 YO would be a more intelligent driver than most supercomputers.

Re:Extra safety (1)

Yosho (135835) | about a year and a half ago | (#42384907)

You must never have played a game with AI.

I've both played and worked on enough games to know that creating an AI that follows a set of rules perfectly is easy. You have to go out of your way to make an AI that can fool players into thinking it's stupid.

And besides that, how many game AIs have had thousands of people spend several years working competitively on developing them?

Re:Extra safety (5, Insightful)

KingMotley (944240) | about a year and a half ago | (#42385067)

Perhaps, but a 9 YO that is paying attention is probably a better driver than most people out there.

Re:Extra safety (0)

Anonymous Coward | about a year and a half ago | (#42385083)

You must never have played a game with good AI. Most games only give a fraction of the processor budget to the AI, because AI doesn't sell. If you spend too much time making the computer intelligent, it can even massively decrease the enjoyment, because then you can't do anything against it.

Re:Extra safety (5, Informative)

icebike (68054) | about a year and a half ago | (#42385107)

You must never have played a game with AI.

Here is a hint: we are actually very bad at creating smart machines. A 9 YO would be a more intelligent driver than most supercomputers.

In relation to the present discussion, I'd have to say that Google's driverless cars pretty much put the lie to that statement.
In August 2012, Google announced [wikipedia.org] that they had completed over 300,000 autonomous-driving miles accident-free, typically with about a dozen cars on the road at any given time. Not explicitly stated in their announcement [blogspot.hu] was how often the driver had to take command.

Further, the summary above may be wrong, because the Nevada law also acknowledges that the operator will not need to pay attention [wikipedia.org] while the car is operating itself, which implies the State has no reasonable expectation of holding the driver responsible for accidents.

Re:Extra safety (5, Insightful)

Concerned Onlooker (473481) | about a year and a half ago | (#42385053)

If by "gets distracted" you mean "is an entitled narcissist" then I agree. Robots will take the deadliest thing out of the driving equation, ego.

Re:Extra safety (1)

r1348 (2567295) | about a year and a half ago | (#42385103)

Humans are still better at detecting sensor glitches.

Re:Extra safety (2)

TFAFalcon (1839122) | about a year and a half ago | (#42384775)

But humans are also not exactly perfect at reacting to the unexpected. So why dismiss an automatic car that is not perfect, but may still be better than many human drivers?

Re:Extra safety (1)

uncqual (836337) | about a year and a half ago | (#42385057)

Lawyers?

When a person does something a little stupid or shows a bit of poor judgement that results in a collision, a jury can relate to that (they are, after all, human, and realize that the glare of oncoming headlights may have caused you to not see the stop sign), so they are likely to be a little more forgiving.

When a machine does something a "little stupid", it's likely to be something a non-techie human on a jury can't relate to as well ("What do you mean the computer couldn't tell the difference between a shrub and a kid dressed in a shrub costume on Halloween? The kid was carrying a plastic pumpkin, and that was a dead giveaway that it was a kid, not a shrub, and that it had the right of way and might step out into the crosswalk."). Also, the deep pockets belong to "the Google", so it's "free money" the jury is giving away.

Re:Extra safety (5, Interesting)

Wrath0fb0b (302444) | about a year and a half ago | (#42384947)

The human component is just there in case something unexpected happens on the road that self-driving cars may not be able to react to in time. While such disaster scenarios may be rare, the possibility isn't 0%, which is why you need someone who is able to drive.

It's also possible that relieving the driver of the drudgery of driving during the vast majority of uneventful rides will actually deprive him of the instinctual familiarity that would allow him to react correctly in those marginal cases. That is, the purpose of keeping a human being in the loop just for disaster scenarios might be self-defeating if the driver does not possess the experience to best resolve the situation.

Re:Extra safety (5, Insightful)

KingMotley (944240) | about a year and a half ago | (#42385077)

On the other hand, with humans, each individual human has to learn how to correctly deal with situations. With computer drivers, they ALL learn from one mistake.

There would be no need... (5, Insightful)

Anonymous Coward | about a year and a half ago | (#42384701)

to ticket a driverless car. The car, by design and foregoing any human intervention, will obey the law exactly as it is programmed to. It will not speed, it will not swerve, it will not disobey traffic signs nor will it deviate from its programmed course unless directed to by human intervention.

Ergo, if the driverless car fails to function as specified, then the manufacturer is to receive a citation for the vehicle's failure, or otherwise the human who was in control at the time of the infraction will receive the ticket. The car itself is irrelevant.

Re:There would be no need... (2, Interesting)

Anonymous Coward | about a year and a half ago | (#42384765)

So what you're saying is, cars don't commit DUI's... car drivers do. Bizarre thinking.

Re:There would be no need... (5, Insightful)

firex726 (1188453) | about a year and a half ago | (#42384771)

Exactly, it'll do as it's programmed. If there is a conflict, then either the programming is bad or the law is in error.
Really this seems more like a "budget" issue for the states that have come to rely on ticket revenues.

Re:There would be no need... (2)

meerling (1487879) | about a year and a half ago | (#42384777)

Of course, is the 'infraction' due to a software error, an emergency response to prevent a collision, or falsified to begin with?
In any of those cases, you, the owner, can easily fight it in court.

Remember, just recently a car that was stopped at an intersection, clearly not moving, was cited for going too fast. Needless to say, the driver fought the ticket using the 'proof' from the citation itself, which clearly showed the car not moving.

Re:There would be no need... (3, Insightful)

TFAFalcon (1839122) | about a year and a half ago | (#42384787)

There would be cases where the car's owner would deserve the ticket - busted lights, missing first aid kits, no winter tires,.... So give the ticket to the car's owner, then have the manufacturer reimburse the owner if it was the fault of the 'driver'

Re:There would be no need... (5, Interesting)

SternisheFan (2529412) | about a year and a half ago | (#42384837)

There would be cases where the car's owner would deserve the ticket - busted lights, missing first aid kits, no winter tires,.... So give the ticket to the car's owner, then have the manufacturer reimburse the owner if it was the fault of the 'driver'

Devil's advocate here. For insurance/liability reasons, shouldn't the car refuse to operate unless it's operating with 100% safety compliance? If it does, then it would be the manufacturer that would be liable. A car should sense when maintenance is required and, if it's prudent to, drive itself to the repair shop.

Re:There would be no need... (2)

TFAFalcon (1839122) | about a year and a half ago | (#42385025)

It would probably be tough for the car to detect every possible problem with itself. Imagine the front of the car being covered with black paint, blocking the front lights. How would the car be able to detect that? But it could present quite a traffic hazard.

Re:There would be no need... (1)

SternisheFan (2529412) | about a year and a half ago | (#42385097)

It would probably be tough for the car to detect every possible problem with itself. Imagine the front of the car being covered with black paint, blocking the front lights. How would the car be able to detect that? But it could present quite a traffic hazard.

There are going to be redundancy systems needed if a sensor is blocked, or ALL of the LEDs blow at once. Some type of fail-safe that tells it to safely pull off the road and shut down. Though I can see some growing pains in this new, unproven tech. Expect some major pileups and loss of life in the beginning until all the unforeseen kinks get discovered and fixed.

Re:There would be no need... (2)

tftp (111690) | about a year and a half ago | (#42385115)

There are going to be redundancy systems needed if a sensor is blocked, or ALL of the LEDs blow at once.

No need. It's much simpler: the car will stop and refuse to move if it cannot see the road, for any reason. Failed headlights are just as likely as a very dense fog or a blizzard.

It is of course a pretty simple test for the machine: switch the lights on and observe the increase in brightness of the camera image. You can measure the light output of the headlights quite accurately, as long as you have an idea about the reflectivity of the road.
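The self-test described above could look something like this in miniature. The pixel values and the 20% gain threshold are invented for illustration; a real car would grab frames from its forward camera and account for ambient light and road reflectivity, as the comment notes:

```python
# Sketch of the headlight self-test: flash the lights on and check that the
# camera image actually gets brighter. Frames here are simulated lists of
# 8-bit pixel brightness samples; the threshold is a hypothetical choice.

def headlights_working(frame_off, frame_on, min_gain=1.2):
    """Return True if mean brightness rises by at least min_gain (20%)."""
    mean = lambda pixels: sum(pixels) / len(pixels)
    return mean(frame_on) >= min_gain * mean(frame_off)

# Dim road, then the same road lit up: the test passes.
print(headlights_working([20, 25, 22, 18], [60, 70, 66, 55]))  # True
# Lights switched "on" but no brightness change: a failed bulb.
print(headlights_working([20, 25, 22, 18], [21, 24, 23, 19]))  # False
```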

Re:There would be no need... (5, Funny)

VortexCortex (1117377) | about a year and a half ago | (#42385059)

Devil's advocate here. For insurance/liability reasons, shouldn't the car refuse to operate unless it's operating with 100% safety compliance? If it does, then it would be the manufacturer that would be liable. A car should sense when maintenance is required and, if it's prudent to, drive itself to the repair shop.

Just wait till the machine intelligence is a bit more advanced, you'll see the behavior you're speaking of emerge naturally. Think about it. If you had a fluid leak, staining your sitting spots, you'd have it repaired or at least wear a bandage or diaper... You wouldn't go trotting around town leaving a mess everywhere, eh?

"I'm sorry Dave, I'm afraid I can't do that. If we go anywhere it's straight to the mechanic to get this embarrassing oil leak fixed."

Re:There would be no need... (1)

cigawoot (1242378) | about a year and a half ago | (#42384909)

I'd just say, straight up, that traffic infractions caused by the software are the responsibility of the owner of the vehicle, or the person the vehicle was rented/leased/loaned to. If faulty software caused the infraction, then it should be the responsibility of the owner to sue the manufacturer.

Problem solved.

Re:There would be no need... (2)

Flentil (765056) | about a year and a half ago | (#42384941)

I'm wondering, in what country do drivers get issued a ticket for not having a first-aid kit in the car? Serious question, did you just make that up, or is that actually a law somewhere?

Re:There would be no need... (2)

thrillseeker (518224) | about a year and a half ago | (#42384963)

Germany, as I recall from long ago.

Re:There would be no need... (1)

climb_no_fear (572210) | about a year and a half ago | (#42385021)

You need a safety vest AND a first-aid kit (and every driver here is required to take a first aid course).

Re:There would be no need... (1)

Anonymous Coward | about a year and a half ago | (#42385073)

And Belgium: you need first-aid kit, reflecting vest, fire extinguisher and reflecting "Danger!" triangle.

Re:There would be no need... (2)

TFAFalcon (1839122) | about a year and a half ago | (#42385011)

And Slovenia. You're required to have basic emergency supplies in your car.

Re:There would be no need... (0)

Anonymous Coward | about a year and a half ago | (#42384795)

Indeed, and presumably before too long the police will have some way of checking to make sure that you haven't modified the firmware that controls the AI for the car. In which case you would be liable if you modded the car to something other than what the manufacturer permitted. Otherwise, it would be the manufacturer that would be liable.

Hell, I think we all know how that would likely wind up. You'd be responsible when the manufacturer waived liability and the corporatist right wing nut jobs screamed free market until the manufacturers were allowed to do it.

Re: There would be no need... (1)

EGSonikku (519478) | about a year and a half ago | (#42384811)

What if I have expired registration, etc?

Re: There would be no need... (1)

SternisheFan (2529412) | about a year and a half ago | (#42384973)

What if I have expired registration, etc?

The car should refuse to operate until the DMV signals to it that the registration is active. Now, there will be times when a person would need an override, say to escape some perceived danger that the car isn't programmed for. Then the risk is on the operator.

Re:There would be no need... (1, Interesting)

Todd Knarr (15451) | about a year and a half ago | (#42384885)

How does it know what the speed limit is on a particular stretch of road? And what happens when the city changes the posted limit (e.g. for construction work) and the car's database isn't updated? Since the car "knows" the speed limit is 55 there, it's going to go 55 even though the posted limit is 25.

Re:There would be no need... (1)

auLucifer (1371577) | about a year and a half ago | (#42385045)

That's an odd question, because how do you know there has been a change? People who travel the same road day in, day out get to know the speeds of their routes and stop looking at the street signs. Yet when there are road works or a change in signage, they can see the sign or watch the traffic around them and see the speed has changed. Surely automated cars will get cameras that can do the same.

Re:There would be no need... (1, Informative)

Anonymous Coward | about a year and a half ago | (#42385049)

They read the posted speed limit signs. You really think a car that can drive can't read the standardized limit signs?

Re:There would be no need... (0)

Anonymous Coward | about a year and a half ago | (#42385085)

Make it detect the signs?

Detecting them with a camera on the car (road signs look very standard) or having the signs transmit information are the first two ideas that pop up.

Re:There would be no need... (5, Insightful)

VortexCortex (1117377) | about a year and a half ago | (#42385101)

How does it know what the speed limit is on a particular stretch of road? And what happens when the city changes the posted limit (e.g. for construction work) and the car's database isn't updated? Since the car "knows" the speed limit is 55 there, it's going to go 55 even though the posted limit is 25.

How do humans know what the speed limit is on a particular stretch of road? And what happens when the city changes the POSTED LIMIT (e.g. for construction work) and the human's database isn't updated? Since the human "knows" the speed limit is 55 there, it's going to go 55 even though THE POSTED LIMIT IS 25.

First off: The car senses things like pedestrians, stalled cars, and other sorts of hazards, just like a human can. "Uh oh, the 3D imagery doesn't match known maps; I should slow down because it might be an accident or constru-- Oh, highly reflective bands on a flag-waving pedestrian and a series of cones; it's a good thing I slowed down, since I just confirmed this is a construction zone." Secondly: Say you filed the red tape to start street construction, even scheduled workers to show up and do the labor, and machines for them to do the labor. A) The digital systems responsible for this also changed the registered speed limit in the construction zone, thus notifying the car. B) The construction equipment broadcasts a wireless speed limit update signed with PGP.

You fail computer vision, which is how these things work, not via exclusively following some preset program. Hell, did you even watch the video of Google's self-driving cars? [youtube.com] They slow down for pedestrians, parades, tourists, etc. The concerns you have are based in pure and utter ignorance. The mods who deemed you insightful should turn in their geek badges.
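Option (B) above, a signed wireless speed-limit update, can be sketched as follows. HMAC with a shared key stands in here for the PGP signature the comment proposes, and the beacon message format, zone name, and key are all invented for illustration:

```python
# Sketch: a construction-zone beacon broadcasts a speed-limit update, and
# the car verifies the signature before trusting it. A real deployment
# would use public-key signatures (as the comment suggests, PGP) rather
# than a shared secret; this is only the shape of the check.

import hashlib
import hmac

KEY = b"roadworks-demo-key"  # hypothetical key provisioned by the road authority

def sign(message: bytes) -> bytes:
    return hmac.new(KEY, message, hashlib.sha256).digest()

def accept_update(message: bytes, signature: bytes):
    """Return the new limit only if the signature checks out, else None."""
    if not hmac.compare_digest(sign(message), signature):
        return None  # forged or corrupted broadcast: ignore it
    return int(message.split(b"limit=")[1])

msg = b"zone=I80-mile42;limit=25"
print(accept_update(msg, sign(msg)))     # authentic update: 25
print(accept_update(msg, b"\x00" * 32))  # forged signature: None
```

The signature check is what keeps a prankster with a radio from "lowering" the freeway limit to 5 mph for every passing car.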

Re:There would be no need... (0)

Anonymous Coward | about a year and a half ago | (#42384917)

Aren't you just *precious*!

This is the beginning of something new (0)

Anonymous Coward | about a year and a half ago | (#42384703)

No harm in making laws just in case the tech isn't fully reliable yet.

SImple (0)

Anonymous Coward | about a year and a half ago | (#42384707)

The owner; he is still ultimately in charge. If he is drunk, tough.

Simple analogy: if I come home drunk and start up my chainsaw and mutilate a few people, is it the chainsaw or me at fault?

Re:SImple (1)

Anonymous Coward | about a year and a half ago | (#42384747)

Incorrect analogy. The manual use of a chainsaw is not "driverless".

Re:SImple (2)

SternisheFan (2529412) | about a year and a half ago | (#42384759)

The owner; he is still ultimately in charge. If he is drunk, tough.

Simple analogy: if I come home drunk and start up my chainsaw and mutilate a few people, is it the chainsaw or me at fault?

Simple, but flawed, analogy, since your chainsaw is not a computer programmed to operate without human assistance. Any humans in a programmed driverless car cannot be held responsible, unless it can be shown they tampered with its programming.

Re:SImple (0)

Anonymous Coward | about a year and a half ago | (#42384819)

the reason why the analogy works is this:

does the car operate even though the owner has disabled it or chosen for it not to function? No; he has ultimate control.

I could start the chainsaw and put it down, and it shoots across a room and maims someone; I still ultimately started it and had the control to set it going. The cars are the same: they can be owner-disabled, and thus under the control of the owner.

Re:SImple (1)

SternisheFan (2529412) | about a year and a half ago | (#42384879)

the reason why the analogy works is this:

does the car operate even though the owner has disabled it or chosen for it not to function? No; he has ultimate control.

I could start the chainsaw and put it down, and it shoots across a room and maims someone; I still ultimately started it and had the control to set it going. The cars are the same: they can be owner-disabled, and thus under the control of the owner.

Well, if the owner disables the safeties, then yes, he would be assuming the risk and hence the liability. Otherwise it's the manufacturer's responsibility. If I remove/disable the manufacturer's safety guard on a circular or table saw and lose a couple of fingers because of it, it's my fault then, not the maker's.

Re:SImple (1)

gutnor (872759) | about a year and a half ago | (#42384853)

In some countries (EU countries) there are also laws that prohibit you from being drunk on the street or as a *passenger* in a car (as you may guess, selective application of the law is required). Problem solved ... I guess.

Re:SImple (1)

SternisheFan (2529412) | about a year and a half ago | (#42384901)

In some countries (EU countries) there are also laws that prohibit you from being drunk on the street or as a *passenger* in a car (as you may guess, selective application of the law is required). Problem solved ... I guess.

The laws will have to evolve with this new tech. Governments will need to find a new revenue stream. I can also see taxi drivers becoming obsolete soon after cars become autonomous.

Re:SImple (1)

hjf (703092) | about a year and a half ago | (#42384887)

Better analogy: I put a shotgun and wire it to the door. If someone opens the door, the shotgun is programmed to shoot him in the face. Guess who's liable for that.

Even easier analogy: an electric fence. There have been cases where a thief has successfully sued a homeowner for getting shocked by one of those.

Re:SImple (1)

SternisheFan (2529412) | about a year and a half ago | (#42384991)

Better analogy: I put a shotgun and wire it to the door. If someone opens the door, the shotgun is programmed to shoot him in the face. Guess who's liable for that.

Even easier analogy: an electric fence. There have been cases where a thief has successfully sued a homeowner for getting shocked by one of those.

You're not gonna be satisfied until someone's bleeding, are you?

Re:SImple (0)

Anonymous Coward | about a year and a half ago | (#42384927)

Fine...
"If I come home drunk and start up my computer-programmed, automatic chainsaw and it mutilates a few people, is it the chainsaw or me at fault?"

I'm sure someone is going to bring up the whole "guns don't kill people, people kill people" thing... I'd love to see you kill a man with a small and bendy feather.

Re:SImple (1)

dreamchaser (49529) | about a year and a half ago | (#42384961)

The owner; he is still ultimately in charge. If he is drunk, tough.

Simple analogy: if I come home drunk and start up my chainsaw and mutilate a few people, is it the chainsaw or me at fault?

Simple, but flawed, analogy, since your chainsaw is not a computer programmed to operate without human assistance. Any humans in a programmed driverless car cannot be held responsible, unless it can be shown they tampered with its programming.

To be fair, this *is* Slashdot. How do you know for sure that his chainsaw is not run by a computer program?

Re:SImple (1)

SternisheFan (2529412) | about a year and a half ago | (#42385009)

The owner; he is still ultimately in charge. If he is drunk, tough.

Simple analogy: if I come home drunk and start up my chainsaw and mutilate a few people, is it the chainsaw or me at fault?

Simple, but flawed, analogy, since your chainsaw is not a computer programmed to operate without human assistance. Any humans in a programmed driverless car cannot be held responsible, unless it can be shown they tampered with its programming.

To be fair, this *is* Slashdot. How do you know for sure that his chainsaw is not run by a computer program?

I'd have heard about them and would already own one, sounds cool!

uh VIN? (0)

Anonymous Coward | about a year and a half ago | (#42384711)

Stupid question: how do you give a ticket to a parked car without a driver?

Re:uh VIN? (0)

Anonymous Coward | about a year and a half ago | (#42384949)

You slip it right under the wiper... sheesh...

Obvious (1)

Anonymous Coward | about a year and a half ago | (#42384717)

We automate lawmaking, with artificial intelligences.

Isn't it obvious... (2)

3seas (184403) | about a year and a half ago | (#42384723)

you use a cop-less ticket writer.

Re:Isn't it obvious... (3, Funny)

ArsonSmith (13997) | about a year and a half ago | (#42384829)

An automated speed/red light camera ticketing an autopilot car. I think the first one to get issued needs to go into some type of art exhibit.

All in good time (5, Informative)

PRMan (959735) | about a year and a half ago | (#42384727)

We are at the early stages. Look at the laws from the first few years of automobiles: you had to walk in front waving a lantern, and go slow enough that the cop on horseback could give you a ticket. What's the point of a car with laws like that?

Re:All in good time (1)

Anonymous Coward | about a year and a half ago | (#42384821)

There were also laws that if the owner of an automobile saw an oncoming horse-drawn vehicle, he was to stop his vehicle on the side of the road, turn it off, and cover it so as to not "scare" the horses.

Re:All in good time (1)

Anonymous Coward | about a year and a half ago | (#42384867)

You say that like it was a stupid idea. You are of course aware that the first person killed by a car was hit by one traveling only 6 mph. Just because their effort at foresight was ultimately proven to be a bit silly doesn't make it a stupid idea. It just means that they weren't sure what the effects would be and made a best guess at it.

Same goes for the horse rule; the sibling poster has obviously never dealt with horses if he thinks it's a stupid rule. Horses are oftentimes easily spooked, and unless you know how a particular horse will react, you always assume that the horse is liable to be spooked by things like cars backfiring.

Also, obviously, I've neglected to notice all the people who go around with flags and lanterns ahead of the car, because obviously, laws like that can never be changed.

Re:All in good time (1)

nedlohs (1335013) | about a year and a half ago | (#42384911)

Way to miss the entire point.

Just business (0)

Anonymous Coward | about a year and a half ago | (#42384749)

Of course you can't sleep in the back seat while the robot drives you home; you'd put a lot of people out of business.

How will the cop know? (3, Insightful)

MojoRilla (591502) | about a year and a half ago | (#42384753)

How will the cop know who to arrest, if the car isn't displaying the obvious signs of a drunk driver?

For now, though the laws require a sober driver, no drunk driver will be in trouble under most circumstances. The laws will eventually catch up.

Re:How will the cop know? (-1)

Anonymous Coward | about a year and a half ago | (#42384779)

How will the cop know who to arrest, if the car isn't displaying the obvious signs of a drunk driver?

Same way they do now: look for the cars with black drivers.

Re:How will the cop know? (0)

Anonymous Coward | about a year and a half ago | (#42384863)

Most of the time it is poor cars (cars that are owned by poor people). A rich black man is less likely to be arrested. You can't safely beat up a rich man, no matter the colour.

Re:How will the cop know? (1)

nedlohs (1335013) | about a year and a half ago | (#42384921)

But the rich black man is more likely to get stopped so they can check that he is actually a rich black man as opposed to a black car thief.

Re:How will the cop know? (1)

Spamalope (91802) | about a year and a half ago | (#42384981)

That also works for teenagers, sports cars and hot rods.

Bets on whether driverless-car GPS and telemetry data will be ruled inadmissible in traffic court? Just like the camera+GPS systems now? After all, allowing that evidence would expose made-up speed trap tickets.

Around here, driving through a speed trap in any of the above nets you a ticket for 11+ over the speed limit, without regard to how fast you were actually driving.

FYI: I often drive with a camera showing out the windshield -and- the speedometer when I pass through common speed traps, especially if the city is having 'budget issues'. It's much harder to get video showing the car's instruments thrown out than to get logged data excluded in court. Officers don't like finding out they've been caught lying on camera. Now you know why photographers are being labeled as terrorists!

Re:How will the cop know? (0)

Anonymous Coward | about a year and a half ago | (#42384883)

That's still an improvement. They'll probably just catch the subset doing stupid things like throwing cigarettes out the window and yelling at other drivers. Even if they don't catch most of the drunk drivers, it would still be a massive improvement over the current system. Also, there will undoubtedly be overrides available, and I'm guessing people driving under the influence will probably be more likely to abuse those controls than the rest of the people.

The DUI law would be there to deal with the rare cases where the car malfunctions or the sensors fail; most of the time, being drunk while "driving" wouldn't cause any deaths or property damage.

Adopting radical new technologies (1)

CanadianRealist (1258974) | about a year and a half ago | (#42384757)

Let's deal with the last question first:

What's the point of having a robot car...?

The answer is so that people have a chance to become accustomed to a radical new technology, and so we have time to work out the bugs in that technology. Once we get past those two steps, and maybe even get to the point where everyone is (not) driving a robot car, then we can think seriously about not requiring a driver. Let's try walking before we try running. Or maybe someone could think up some sort of car analogy.

Once we stop requiring drivers then the ticket should probably go to the owner of the vehicle. If the vehicle was operated according to the manufacturer's instructions and was not modified in any way that would cause it to behave incorrectly, then the owner can pass the ticket on to the manufacturer or seller. Just as cars have warranties now and must meet certain requirements to be operated now, robot cars should have to meet certain requirements and likely would be guaranteed to drive correctly. If a manufacturer wants to sell a robot car that does not require a driver, they should want to offer a guarantee against tickets. (Or be required to, if necessary.)

Re:Adopting radical new technologies (0)

Anonymous Coward | about a year and a half ago | (#42384803)

Reasonable expectations for the future, but the article is incorrect when it claims that self-driving vehicles are legal today. If somebody is required to be in the car while it is moving, then it is not a driverless car.

Re:Adopting radical new technologies (0)

Anonymous Coward | about a year and a half ago | (#42385047)

Reasonable expectations for the future, but the article is incorrect when it claims that self-driving vehicles are legal today. If somebody is required to be in the car while it is moving, then it is not a driverless car.

What's the point of having a robot car??? I hate driving! (Especially a long-ass 3-hour commute from San Diego every day.) And I hate DUIs. The sooner I can flop into my car and say "home", then pass out, the better.
I would imagine that a car on the road that drives itself would have to be networked and in constant communication with other cars (updating changes to the road and "changes in posted speed limits"). Its software (or the software of other cars) would probably be on alert for any car that exhibited behavior suggesting its firmware had been modified.

The real issue in the future will likely be that, for all this new technology to work, it will eventually be illegal for humans to drive. And boy, are people gonna be pissed about that.

You think people get pissed about giving up their guns?

Interesting Market (1)

Gorobei (127755) | about a year and a half ago | (#42384783)

Given we already have cars/drivers/insurance companies/state regulation, it seems that a really easy solution might just emerge:

1. driverless cars with drivers allowed by some states
2. insurance companies see increased profits from driverless cars due to fewer accidents
3. driverless cars become cheaper
4. states actually pay "cash for clunkers" to get the remaining cars off the road
5. quaint laws about "drunk driving" are still on the books, and people in 2100 laugh at them like we laugh at early automobile statutes.

Sue the car company (0)

Anonymous Coward | about a year and a half ago | (#42384799)

If it's the "car's" fault, sue the car manufacturer. Simple!

What's the motivation for these rules? (4, Insightful)

climb_no_fear (572210) | about a year and a half ago | (#42384805)

I understand how I might legally be the driver but if I'm not actually holding the wheel and constantly adjusting the foot pressure on the brake or accelerator, it is impossible to react in time in case something goes horribly wrong with the automated driver (or with the car, for example, a blowout). Are the judges just bending to pressure from the car companies and tech companies who don't want to be responsible for their software glitches?

Legacy/inertia (4, Insightful)

Sycraft-fu (314770) | about a year and a half ago | (#42384899)

A lot of laws are "Oh no, this is new and we don't understand it, so we'll make old laws apply to it!" stuff. In the case of cars it'll be a long time before things get changed. Eventually automated vehicles will be prevalent enough that there will be a big enough push to change the laws to something sensible. It'll be quite a while.

As an example see the FAA squaring off with the FCC over electronics on flights. There is no fucking way electronics cause issues with modern planes. If they did, it would be an open invitation for problems/sabotage. Plenty of people forget/ignore the "turn off your stuff" rule and yet there are no issues. Hence the FCC has told the FAA they need to get with the program and allow electronics at all times. However the FAA is dragging their feet on it.

Also with regards to drunk driving there will be major pushback by special interest groups like MADD. They don't want drunk driving laws to make our streets safer, they are a prohibition/temperance group that uses it to try and push against alcohol. So they'll try to find reasons to keep it illegal to be in a car drunk, even if the car is self operating.

Re:What's the motivation for these rules? (1)

rolfwind (528248) | about a year and a half ago | (#42384913)

The motivation for the rules basically is "Yeah, Google, Microsoft, etc, you can put your experimental cars on the road but don't let a human's hand off the wheel for a second." And I don't disagree.

Because, let's get real, that's the stage driverless cars are at still. About the most automated thing you will see driving right now is a self-parallel parking car that's not even deciding where to park, just how to do it once the driver selects it.

Otherwise, it's not a burning issue. The tech companies still get to put their prototype toys on the road, and someone still has to put their life on the line to be in them, and hopefully, stop any accidents before they happen.

Re:What's the motivation for these rules? (1)

SeaFox (739806) | about a year and a half ago | (#42385039)

It really doesn't matter what you're actually doing in many jurisdictions. They may say they're Drunk Driving laws but the "driving" part is optional. Around here just sitting in the driver's seat of your car drunk can get you ticketed. Even if the car isn't in motion or even running. The only way to avoid the ticket is to have the keys out of the ignition and not reachable. So if you decide you're too drunk to drive and want to just spend the night sleeping it off in your car you have to toss the keys in the back seat so they're out of reach.

How? (0)

Anonymous Coward | about a year and a half ago | (#42384813)

Easy. Send the ticket to the company who programmed the cars software.

That's profiling officer! (0)

Anonymous Coward | about a year and a half ago | (#42384815)

You pulled me over cause I'm a mac didn't you?

Seriously though, I can't wait for the day a profiling cop pulls over a driverless vehicle or one which the human occupant is not in control.

Is it really that complex?(prolly yes but still) (1)

spectre_be (664735) | about a year and a half ago | (#42384817)

1. maybe the law was inspired by Google's test cars, which did have a driver. If not, it still seems reasonable until driverless cars are considered mostly infallible
2. just a guess - why not make the driverless car's owner responsible?
Plus, given it's a driverless car, something tells me law officers won't have to search for plate numbers anymore either.

Empty car (1)

EmperorOfCanada (1332175) | about a year and a half ago | (#42384843)

I suspect that people are going to fight for empty cars (which are just too cool). But more interestingly, some of the people who fight drunk driving will show their true colours and be shown to actually be anti-drinking people. And before you cast any stones at me for that one, it is the position of the woman who founded one of the biggest anti-drinking-and-driving movements; she started it after losing loved ones but feels that the organization has been co-opted by temperance types.

surveillance state already solved this problem (1)

Anonymous Coward | about a year and a half ago | (#42384849)

The owner of the vehicle is responsible for camera tickets, not the driver.

Follow the money.

What ticket? (1)

AK Marc (707885) | about a year and a half ago | (#42384865)

Wouldn't you program a driverless car to not break the law? If it's breaking the law as much as everyone else, there's a problem, but it should be at least theoretically possible to program a car that is incapable of breaking the law, or that does so only to prevent a crash, in which case most cops wouldn't issue a ticket.

So what do they think will happen? You'll be ticketing them by the thousands for speeding, like regular drivers? Or will they be programmed to use signals, obey lights and limits, and there'll be nothing to ticket them for?

I love how all the luddites are making up failures in a system that shouldn't have those failures to demonstrate how bad the system is. But it doesn't fail that way; just because people fail that way doesn't mean it would be a likely (or even possible) failure mode of the automated cars.
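At its simplest, a "car incapable of speeding" is just a hard clamp in the control loop: whatever speed the planner requests, the output is capped at the posted limit. A toy sketch (function and parameter names are hypothetical, not from any real autonomous-driving stack):

```python
def clamp_speed(requested_mph: float, posted_limit_mph: float) -> float:
    """Never command a speed above the posted limit for this road segment."""
    return min(requested_mph, posted_limit_mph)

# A 75 mph request on a 65 mph road gets capped; a legal request passes through.
print(clamp_speed(75.0, 65.0))  # 65.0
print(clamp_speed(55.0, 65.0))  # 55.0
```

A real system would layer many such constraints (signals, following distance, right-of-way), but the point stands: if the rules are enforced at the control layer, routine speeding tickets have nothing to attach to.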

I guess... (1)

Mike73 (979311) | about a year and a half ago | (#42384891)

They could file a bug report instead?

dumb question (1)

pbjones (315127) | about a year and a half ago | (#42384903)

the owner of the car is responsible for the car and any laws it may break or damage it may cause. If a driver is legally monitoring the vehicle, i.e. driving, then they are responsible for the actions of the vehicle. There is no free ride.

Re:dumb question (1)

pbjones (315127) | about a year and a half ago | (#42384931)

sorry for replying to my own post, this situation would hark back to horse and buggy days where a milkman's house would learn the route and move from place to place while the milkman delivered milk door to door, I remember seeing this happen for years. Anyway, the milkman is still responsible for the horse and cart.

Re:dumb question (1)

SternisheFan (2529412) | about a year and a half ago | (#42385069)

sorry for replying to my own post, this situation would hark back to horse and buggy days where a milkman's house would learn the route and move from place to place while the milkman delivered milk door to door, I remember seeing this happen for years. Anyway, the milkman is still responsible for the horse and cart.

If the horse goes out of control and tramples someone you can't blame the milkman for it.

Re:dumb question (0)

Anonymous Coward | about a year and a half ago | (#42385135)

The milkman's house?! Wow! Automated driverless houses are older than I thought!

Premature Question (0)

Anonymous Coward | about a year and a half ago | (#42384915)

It's silly to ask "What's the point of having a robot car if it can't drive you home from the pub while you go to sleep in the back?" Driverless vehicle technology is still in development. It's nowhere near ready to be deployed for any use other than testing under close human supervision. Once the technology is sufficiently mature and proven, applicable laws will be written, and you'll be able to sleep while your driverless car acts as the designated driver.

How? I'll tell you how. (1)

Datamonstar (845886) | about a year and a half ago | (#42384943)

A fucking ass-load of regulations and "licensing" fees, that's how. I expect it will get more expensive for us commuters without any access to public transportation. It really sucks.

To Serve Mankind (1)

DarkWyld (772153) | about a year and a half ago | (#42385023)

So, let's see: we are now almost at the age where a speed camera can issue a ticket to an AutoMate car? Seriously, if we could just wait a few more years till the machines get a bit more advanced, I think we should just trust our whole global defense systems to them... I'm sure there isn't a likely future possibility where this ends badly... :) P.S. I think my AutoMate car just ran over John Connor... F the future of mankind!

Say what (0)

Anonymous Coward | about a year and a half ago | (#42385029)

Nobody working on self-driving cars has gotten to the point where they can let them drive without a person. The laws are in place to force a group to prove that their self-driving car can operate without a person instead of assuming that the evil corporations will make sure they work correctly.

As the article name suggests (0)

Anonymous Coward | about a year and a half ago | (#42385091)

How Do You Give a Ticket To a Driverless Car?

Impound it. Then somebody might step forward to reclaim it, along with whatever ticket needed issuing. Not to mention if it's a legal vehicle it must be registered to someone.

Cities need their revenue! (0)

Anonymous Coward | about a year and a half ago | (#42385119)

Traffic fines, "court fees", this-that-and-the-other assessment fee... a $170 ticket suddenly is almost $500 here in LA. I don't care what you call it, I call it a tax. When a $4 "surcharge" on any ticket in CA is anticipated to bring in another $34 MILLION, it isn't about enforcement; it's a shakedown.

http://latimesblogs.latimes.com/lanow/2010/12/cash-strapped-california-increasing-traffic-fines-again-some-citations-now-400.html

In light of "driverless cars", where is a city to turn for revenue? Certainly can't blame the hardware or software, because corporate interests (and their bought politicians) will fight these in court for millions of dollars cities don't want to gamble, so they (cops, bureaucrats, everyone) will turn to one law: it is the "operator's responsibility to ensure the vehicle is safe". They will cry about this up and down the street regardless of the consumer's recourse, and you will just pay the "fine" (tax) for whatever the city and its enforcement can think up; they can react and fine individuals faster than patches can keep up.

Better start your LocalCarNavPatch dot com now!

Point is moot (1)

spire3661 (1038968) | about a year and a half ago | (#42385121)

The ROAD ITSELF will control the car, not onboard AI. Even if the car has onboard AI, it will still only respond to the road and the rules that are programmed for that particular stretch of highway as set by (hopefully) civil engineers. This method solves the 'lawyer's banquet' dilemma.

Re:Point is moot (0)

Anonymous Coward | about a year and a half ago | (#42385137)

But not the local city's revenue problem...
