
Google's Self Driving Car Crashes

timothy posted more than 3 years ago | from the had-to-happen-one-day dept.


datapharmer writes "We've all read previous stories on Slashdot about Google's driverless car, and some have even pondered whether a crash would bring an end to robotic cars. For better or for worse, we will all find out soon, as the inevitable has occurred. The question remains: who is to blame? A Google spokesperson told Business Insider that 'Safety is our top priority. One of our goals is to prevent fender-benders like this one, which occurred while a person was manually driving the car.'"


Johnny Cab (5, Funny)

Anonymous Coward | more than 3 years ago | (#37001880)

"The door opened, you got in!"

Re:Johnny Cab (5, Funny)

pinkj (521155) | more than 3 years ago | (#37002242)

I wish I had mod points.

Johnnycab: The fare is 18 credits, please.
[Quaid gets out]
Douglas Quaid: Sue me, dickhead!
[cab tries to run him down, crashes, and explodes]
Johnnycab: We hope you enjoyed the ride!

Summary is sensationalistic (5, Informative)

ELitwin (1631305) | more than 3 years ago | (#37001920)

The car crashed while being driven by a person.

Nothing to see here - move along please.

Re:Summary is sensationalistic (4, Funny)

WrongSizeGlass (838941) | more than 3 years ago | (#37001934)

The car crashed while being driven by a person.

Maybe he was looking at the GPS and not paying attention to the road.

Re:Summary is sensationalistic (1)

redemtionboy (890616) | more than 3 years ago | (#37001938)

But...the robots....:( So disappointed.

Re:Summary is sensationalistic (1)

FhnuZoag (875558) | more than 3 years ago | (#37002348)

Well, Asimov's second law of robotics does overrule the third law...

Re:Summary is sensationalistic (4, Insightful)

Dice (109560) | more than 3 years ago | (#37001958)

The car crashed while being driven by a person.

According to a Google spokesperson. If I were in that car, and it crashed while the software was driving, I would claim that I had been driving it too. Any public crash that could be blamed on the software would put the project in serious jeopardy.

Re:Summary is sensationalistic (4, Funny)

101010_or_0x2A (1001372) | more than 3 years ago | (#37001978)

9/11 was an inside job because man never landed on the moon.

Re:Summary is sensationalistic (1)

petermgreen (876956) | more than 3 years ago | (#37001986)

OTOH if you lied and the cops found out you had lied then I would think that could put the project in even more serious jeopardy.

Re:Summary is sensationalistic (0)

Anonymous Coward | more than 3 years ago | (#37002146)

I'm fairly certain it's unlawful for robotic cars to operate on their own on public streets across the U.S. So if the car was on a public street, there was a person in the driver's seat, and whether they were driving or the software was driving, they were operating the car.

Re:Summary is sensationalistic (3, Insightful)

tibman (623933) | more than 3 years ago | (#37002274)

I did some quick research.

According to California officials, there are no laws that would bar Google from testing such models, as long as there's a human behind the wheel who would be responsible should something go wrong.

Taken from here: http://jalopnik.com/5661240/are-googles-driverless-cars-legal [jalopnik.com] which was linked in the article from the summary.

However, I would say that there is a difference between operating the car and manually driving the car. The Google spokesperson used the phrase "manually driving."

Re:Summary is sensationalistic (1)

Anonymous Coward | more than 3 years ago | (#37003434)

OTOH, this is a Google employee we are talking about. If the cops catch him lying, it's "rogue employee trying to suck up to the bosses for fear of losing his job." If he didn't take the bullet by attempting to lie, Google probably has enough datamined info to have him cloned and replaced by a believable, more obedient version.

Re:Summary is sensationalistic (1)

Fuzzums (250400) | more than 3 years ago | (#37002266)

Yeah. And if later on anybody found out it actually WAS the software instead of the human driver...

Re:Summary is sensationalistic (1)

Anonymous Coward | more than 3 years ago | (#37002374)

It is a test car. It collects more data while in operation than you could imagine. It'd be trivial to verify the logs against what they claim.

The claim that it is a 5 car accident seems to indicate that there was something unusual going on in the area that caused traffic to get much tighter than typical. When he saw whatever that was, the driver most likely took control and screwed up. The whole thing should be in the logs so it'll be easy for them to see what went wrong.
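If the logs really do record when control changes hands, the check is straightforward. Here is a minimal sketch of that idea in Python; the log format, event names, and timings are all hypothetical assumptions for illustration, since Google's actual telemetry is not public:

```python
from datetime import datetime, timedelta

# Hypothetical log: a list of (timestamp, driving_mode) events, where
# driving_mode is "autonomous" or "manual". Format is an assumption.

def mode_at(events, when):
    """Return the driving mode in effect at a given instant."""
    mode = None
    for ts, m in sorted(events):
        if ts <= when:
            mode = m
        else:
            break
    return mode

def seconds_in_manual_before(events, crash_time):
    """How long the car had been under manual control at the crash
    (0 if it was autonomous at that moment)."""
    last_switch = None
    for ts, m in sorted(events):
        if ts <= crash_time:
            last_switch = (ts, m)
    if last_switch is None or last_switch[1] != "manual":
        return 0.0
    return (crash_time - last_switch[0]).total_seconds()

# Example: the driver takes over two seconds before impact.
t0 = datetime(2011, 8, 5, 9, 0, 0)
log = [(t0, "autonomous"), (t0 + timedelta(seconds=58), "manual")]
crash = t0 + timedelta(seconds=60)
print(mode_at(log, crash))                   # manual
print(seconds_in_manual_before(log, crash))  # 2.0
```

Note that the mode at impact alone can mislead: a log could read "manual" even if the handover happened only moments before the crash, so how long the car had been in manual mode matters as much as the mode itself.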

Re:Summary is sensationalistic (0)

Dice (109560) | more than 3 years ago | (#37002438)

It is a test car. It collects more data while in operation than you could imagine. It'd be trivial to verify the logs against what they claim.

I can imagine quite a lot, but that's beside the point. It would be trivial for Google to verify the logs, sure, but I doubt very much that the Mountain View Police Department would have an easy time with it.

Re:Summary is sensationalistic (1)

Monchanger (637670) | more than 3 years ago | (#37002782)

It would be trivial for Google to verify the logs, sure, but I doubt very much that the Mountain View Police Department would have an easy time with it.

They don't have to. All they need is to hire a credible computer expert to testify as to what the data actually says. If the subpoena includes Google's data analysis tools, it becomes even more trivial.

PS- "credible" means someone who doesn't spread conspiracy theories online.

This is Slashdot. (0)

Anonymous Coward | more than 3 years ago | (#37001964)

The summary is always sensationalist.

Always.

Re:This is Slashdot. (5, Funny)

sneakyimp (1161443) | more than 3 years ago | (#37002286)

Man Crashes Car? That's no story. CAR CRASHES MAN!!! Now *that's* a story.

Re:Summary is sensationalistic (1)

thegarbz (1787294) | more than 3 years ago | (#37002094)

Are you kidding? This is the epiphany of news for nerds. It's amazing. The Google car has crashed! Who cares about the details. It's all over. Even Google can't make the perfect car. Why do they allow a manual override in the first place. OMG.

Actually the real Oh My God moment was that I just clicked on that waste of a link.

Re:Summary is sensationalistic (2)

Bitsy Boffin (110334) | more than 3 years ago | (#37002182)

The word you want is epitome, not epiphany.

Re:Summary is sensationalistic (3, Funny)

ColdWetDog (752185) | more than 3 years ago | (#37002574)

Ah yes, the epiphany of epitome.

Re:Summary is sensationalistic (1)

qxcv (2422318) | more than 3 years ago | (#37002342)

Actually the real Oh My God moment was that I just clicked on that waste of a link.

I even clicked on the subsequent two links to get to the *actual* story rather than just CNET's "commentary". Outbound links to content farms really should be banned from /.

Re:Summary is sensationalistic (2)

del_diablo (1747634) | more than 3 years ago | (#37002126)

Let's talk about something a bit more relevant.
Basically, a small issue is that bugs will occur. If the car's AI actually crashed the car, isn't that actually a really good thing? I mean, the bug would otherwise have remained present.
And since it crashed, they can figure out WHY it crashed, which means they can fix the bug.
The same thing applies to everything: while doing R&D you actually want a few of your products to break badly, so you can fix the fault that caused it.

Re:Summary is sensationalistic (0)

Anonymous Coward | more than 3 years ago | (#37002152)

Aww, I was hoping for automated demolition derby.

Re:Summary is sensationalistic (1)

nerdyalien (1182659) | more than 3 years ago | (#37002340)

Thought of sending this to Top Gear... too bad as a human was driving it.

Hope the robot called 911 after the crash...

Re:Summary is sensationalistic (0)

Anonymous Coward | more than 3 years ago | (#37002514)

Was Mr. Bean driving it?

Re:Summary is sensationalistic (0)

Anonymous Coward | more than 3 years ago | (#37002602)

If that's a reason to move on I had better just kill my Slashdot Atom feed.

Re:Summary is sensationalistic (0)

Anonymous Coward | more than 3 years ago | (#37002744)

If you believe this... my suspicious mind leads me to believe they needed to cover their ass and not be the laughingstock of the world, explaining why a car with no one driving it would need to be on the road, IN MOTION, moving with traffic, going somewhere in the first place. This is a good time for gOOGLe to rethink just what the hell they are in the first place anyway. They are a search engine? Who spies on the general public?

Re:Summary is sensationalistic (0)

Anonymous Coward | more than 3 years ago | (#37002868)

"It can only be attributed to human error"...

Re:Summary is sensationalistic (2)

jklovanc (1603149) | more than 3 years ago | (#37003052)

How do we know that the following sequence of events didn't happen:

The car was in automatic drive.
A problem occurred and it appeared that a crash was about to happen.
The driver took control of the vehicle.
There was not enough time to avoid the crash, and the crash occurred.

Google can truthfully say that at the time of the crash the car was under manual control, but the crash was still caused by the computer.

Re:Summary is sensationalistic (1)

uglyMood (322284) | more than 3 years ago | (#37003258)

Because Google makes a distinction between the car being driven autonomously, which is the scenario which you describe, and manually driving the car, in which the car is driven just like any other Prius. Until more facts come in, what is the point of this scare-mongering conspiracy theory? What is YOUR agenda?

Wildly misleading headline (5, Informative)

Bovius (1243040) | more than 3 years ago | (#37001924)

Relevant quote: "...occurred while a person was manually driving the car."

Headline should be: "Human damages Google car by operating it with his own slow, meaty appendages"

Re:Wildly misleading headline (4, Interesting)

emurphy42 (631808) | more than 3 years ago | (#37002026)

TFAs are largely about questioning whether
(a) it was indeed the human's fault, or
(b) the robot effed up first, then the human took over and attempted (unsuccessfully) to recover.

Re:Wildly misleading headline (0)

Anonymous Coward | more than 3 years ago | (#37002290)

Headlines:
Humans unable to drive "self-driving car".
Human crashes and likely claims not to be intuitive enough.
Crash occurred while human was texting and attempting to overtake selfdriving car.
Humans probe unable to want a safe trip.

Right next to my office (1)

Anonymous Coward | more than 3 years ago | (#37001930)

This crash occurred around the corner from my company's offices. I've seen that car in the fabrication shop across the street, so it's entirely possible that the vehicle had just been in for some repairs or modifications which could have triggered the accident.

Posting as AC to protect identity and such...

Re:Right next to my office (0)

Anonymous Coward | more than 3 years ago | (#37002534)

Please elaborate.

Some other people said it's a 5-car pileup. Did you get a good look (doubt you saw it happen, but afterward)? Where was the Google car (front, middle, back)? Speed limit on the road? Circumstances? Anything else you find relevant?

Re:Right next to my office (0)

Anonymous Coward | more than 3 years ago | (#37003058)

I didn't see the accident or the aftermath, I haven't been over on that side of the block today. The intersection is this one [google.com] : I've positioned the street view so that you can see the side of the Costco building which is pictured in the background of the car. The intersection sucks in general: it is fairly busy at various times during the day (including during the morning commute) and I'm not at all surprised that there was an accident there. You can see in the street view that there are cars waiting for the light pretty much everywhere and that straight through and turning traffic have to rely on everyone yielding properly for anything to happen.

Who is to blame? (2, Funny)

Anonymous Coward | more than 3 years ago | (#37001946)

Why, Apple, Microsoft and Yahoo! and may be Oracle too!

Re:Who is to blame? (1)

blackfrancis75 (911664) | more than 3 years ago | (#37002732)

you forgot Bitcoin

Built Upon Failures (2)

agent_vee (1801664) | more than 3 years ago | (#37001976)

Why would one crash bring an end to robotic cars? Crashes can be expected while they are still developing this car.

Re:Built Upon Failures (1)

Anonymous Coward | more than 3 years ago | (#37002078)

Because nowadays, people are afraid of new technology if it's not 100% safe, even though current technologies are not particularly safe at all.

This applies to airplanes (the Concorde -- the fastest airliner ever made -- was cancelled because of a single crash) and energy production (nuclear) in particular.

Re:Built Upon Failures (1)

barlevg (2111272) | more than 3 years ago | (#37002180)

9/11 killed the concorde, not the crash. Specifically, the heightened security meant that the super-rich who could afford a ticket decided to take private jets instead.

Re:Built Upon Failures (1)

Volante3192 (953645) | more than 3 years ago | (#37002184)

Concorde was killed off for many other reasons unrelated to the crash, most critically, it was a money and fuel sponge.

Nuclear, though, I agree. Apparently coal and its hundreds to thousands of deaths is ok because we've had it since man first sent a child into a mine shaft to play in the dirt. Nuclear though, GAAH! MUTANT THREE EYED FISH!!!!

Re:Built Upon Failures (2)

ScrewMaster (602015) | more than 3 years ago | (#37002474)

Concorde was killed off for many other reasons unrelated to the crash, most critically, it was a money and fuel sponge.

Nuclear, though, I agree. Apparently coal and its hundreds to thousands of deaths is ok because we've had it since man first sent a child into a mine shaft to play in the dirt. Nuclear though, GAAH! MUTANT THREE EYED FISH!!!!

Yes, and what's tragironic about that is that many coal fields are naturally radioactive, and we (as in "pretty much everyone on the planet") have been breathing thorium dust for over a century now. Thorium that would have been better off staying in the ground. The unfortunate reality is that some number of people die every year just from that particular aspect of our use of coal for power. Well-designed nuclear power facilities (and no, I don't mean obsolescent junk like what lit off in Japan recently, and please don't bring up Chernobyl: that dirty bomb on steroids had no business ever being built ... leave it to the Russians to nuke themselves) do a *HELL* of a lot better job of keeping radioactive particulates out of our atmosphere. But you can't tell that to some people because they've already made up their minds. Like it or not, coal power has a very definite, very predictable, and very real cost in human life.

Coal burning has a number of nasty biological effects unrelated to radioactivity, but that's another issue. More people have already died from coal-fired power plants than will ever die from nuclear fission. You can't tell that to some people either.

Re:Built Upon Failures (1)

barlevg (2111272) | more than 3 years ago | (#37002750)

Has anyone here ever read Heinlein's Starship Troopers (no comment about the movie)? There's a scene where they discuss a human colony on this planet that's exactly like earth, only with far less ionizing radiation. The discussion goes that in a few thousand years, the humans that settle on that world will be evolutionarily impaired due to lack of mutations.

That is, unless they purposefully irradiate themselves.

Re:Built Upon Failures (1)

Stormy Dragon (800799) | more than 3 years ago | (#37002942)

The colony will be perfectly fine. Most genetic mutations are spontaneous, caused by defects in the molecular transcription processes. And even among induced mutations, there's plenty of chemical or biological agents in the environment that do more damage than ionizing radiation.

Re:Built Upon Failures (1)

queazocotal (915608) | more than 3 years ago | (#37003392)

The proximate cause of Concorde actually being cancelled was the maintenance company deciding it didn't want to do it any more, and putting in a stupidly high quote for the renewal.

Re:Built Upon Failures (1)

PieSquared (867490) | more than 3 years ago | (#37002298)

Because we have a need to *blame* someone when something goes wrong. If a robotic car makes a mistake, crashes, and kills someone, who goes to jail? The owner, who submitted it for thorough testing before allowing it to drive on the road? The manufacturer, who did the same and also performed thousands of hours of independent testing? One of the dozens of engineers or hundreds of programmers who worked on it? A person is hypothetically dead, and they wouldn't have hypothetically died if not for this robotic car! Who do we get to punish!?

The statistical fact that if every robotic car on the roads had been driven by a human, then there would have been ten fatal accidents in the time it took for this first robotic car fatality to happen isn't much comfort to the family of the hypothetical dead victim. Especially when the on-board cameras show that this particular accident would have been trivially prevented by a human driver. And you can bet that *some* politician is going to plaster that hypothetical victim's face all over the national news until everyone knows that robotic cars are a terrible idea and should be banned.

Now, we can hope that cooler heads would prevail, and the video of the avoidable crash would be shown along with dozens of videos of crashes that no human could have avoided. That people will point out that even if it does sometimes make mistakes, it's still better than a human driver; that injuries and even deaths result from seatbelts and airbags, but we keep them anyway because they save more lives than they take. That we can change the software so that this particular mistake never happens again, and do more testing to eliminate any other extant problems before anyone is hurt. Not to throw the baby out with the bathwater. We can *hope* for that, but I personally don't have enough faith in US politics to really believe it.

Re:Built Upon Failures (1)

ScrewMaster (602015) | more than 3 years ago | (#37002502)

Who do we get to punish!?

The corporation has to pay. And, when all is said and done, if their behavior was especially egregious they'll pay a lot. That's just the way it is. And yes, it does take time and money. If it were any other way, nobody would ever be an engineer, nobody would ever build anything of consequence, because going to jail for doing your job is just not a worthwhile risk for most people.

Re:Built Upon Failures (0)

Anonymous Coward | more than 3 years ago | (#37002904)

Guess we'll just punish the other 9 hypothetical victims by preferring revenge to safety.

Re:Built Upon Failures (0)

Anonymous Coward | more than 3 years ago | (#37003066)

If the programmers had done a better job the "accident" would not have happened.

No way I am letting a car drive me around that has been programmed by the typical /. contributor!

Re:Built Upon Failures (1)

cheekyjohnson (1873388) | more than 3 years ago | (#37003390)

who goes to jail?

How about no one? Why must someone go to jail for what would probably be perceived by most to be an unfortunate occurrence?

I've heard of not reading the article... (1)

m3000 (46427) | more than 3 years ago | (#37001984)

..but not even reading the summary before being pushed to the front page?

I guess it is Friday afternoon, but still

Google has deep pockets, get a good lawyer (1)

Joe_Dragon (2206452) | more than 3 years ago | (#37002002)

Anyway, legal liability is a big holdup for auto cars. The only way to have them, at least at the start, is to have auto-drive-only roads, and even then there will need to be some kind of no-fault system, or someone saying that all costs to fix things will be covered, or there will need to be auto-drive insurance. Also, the cops and courts will need someone to take the fall if any laws are broken.

Computer Crash? (1)

tokencode (1952944) | more than 3 years ago | (#37002024)

This brings a whole new and more significant meaning to the term "computer crash". "Yea my computer crashed yesterday, it was a real problem because it went right through someone's living room and I forgot to take a backup..."

Re:Computer Crash? (1)

tokencode (1952944) | more than 3 years ago | (#37002110)

BTW, would I have to list my car/computer as a driver on my insurance policy? Does it get a license, and can it accumulate points and get suspended? Maybe the points can go directly to the developer's license.... If Google is working on AI and a human really did crash the car, I hope that person has a really good attorney......

Re:Computer Crash? (1)

ScrewMaster (602015) | more than 3 years ago | (#37002540)

BTW, would I have to list my car/computer as a driver on my insurance policy? Does it get a license, and can it accumulate points and get suspended? Maybe the points can go directly to the developer's license.... If Google is working on AI and a human really did crash the car, I hope that person has a really good attorney......

Doesn't work that way. I'm no lawyer, but I am a software developer, and I work on some fairly mission-critical stuff. So yes, I did consult an attorney regarding my own personal liability. What it comes down to (in the U.S. at least) is that the company takes on that liability. Unless, of course, you do something criminal like sabotage a control program or something ... but your employer assumes the normal costs and risks of doing business. If you are just doing your job you generally can't be held liable for something like that. Any lawyers out there feel free to correct me: if I'm wrong believe me I'd like to know about it!

Re:Computer Crash? (0)

Anonymous Coward | more than 3 years ago | (#37002644)

The obvious question was, can it drive me home after a night at the bar?

(Or: can it drive me to the liquor store and back home again)

Car Crash (1)

schlesinm (934723) | more than 3 years ago | (#37002066)

Is it that slow of a news day that a car crash makes it to the front page?

100% reliability not needed (5, Insightful)

KingSkippus (799657) | more than 3 years ago | (#37002080)

I've posted this before and I'll post it again.

Robot cars don't have to be 100% reliable. As long as they're more reliable than the jerks who normally scare the bejesus out of me by cutting across three lanes of traffic, driving 90 MPH, weaving in and out, running red lights, etc., then I'm all for a robot car-driven society. I'm willing to put up with the computer glitches that, on very rare occasions, cause crashes if I don't have to put up with the human glitches that call themselves licensed drivers.

Re:100% reliability not needed (3, Insightful)

Riceballsan (816702) | more than 3 years ago | (#37002202)

Good for you, but unfortunately that only means you are more sane than a lawmaker, the lobbyists, etc. The problem is that if there is a single fatality, or even minor accidents, a large group will rise up screaming about how unsafe the cars are, and they will be disallowed from driving on public roads. Even if the average rate of accidents and fatalities is 1/16th of human rates. Most laws can be stopped by focusing on the 1% of the time something is worse and completely ignoring the 99% of the time it was better.

Re:100% reliability not needed (1)

sl149q (1537343) | more than 3 years ago | (#37002792)

Mostly the scare campaigns will be generated by people with other agendas... Think teamsters wanting to protect jobs for drivers. There are a *lot* of people who stand to lose their living once self-driving cars start to be deployed.

You can see prototype scare campaigns of this sort anywhere that has contemplated driverless mass transit systems.

I suspect that in some jurisdictions (where unions have political pull) we will see laws enacted that require a human "driver" be available to override the controls for vehicles larger than say a small delivery truck or more than 6 passengers. Jobs for the boys.

Which reminds me of the old joke about the robotic factory that had a guard and a dog... The guard was there to protect the robots and the dog was there to bite the guard if he touched anything.

Re:100% reliability not needed (1)

artor3 (1344997) | more than 3 years ago | (#37002208)

It doesn't matter that you're okay with it. The media will jump on it to create a scare, so that they can get more advertising revenue. Their victims will get scared, and demand their congressmen ban self-driving cars. History has shown that politicians who try to rationalize with raving, scared citizens end up having short careers.

There will never be self-driving cars. Not in our lifetimes. Technology allows them, but society doesn't.

Re:100% reliability not needed (1)

xigxag (167441) | more than 3 years ago | (#37003224)

Strongly disagree.

First of all, we already have automatic braking systems, cruise control, electronic stability control and other computer assisted driving methods. And they can fail. The argument you are making would lead us to conclude that a couple of ABS failures would lead to banning the technology, but that hasn't happened. The computer is taking over the automobile in stages, and people will have time to become accustomed to each incremental step.

Second, people become accustomed to automated transport quicker than you might think. I don't know if they even exist anymore but remember those human-operated elevators where the operator had to manually gauge where the floor landing was before opening the gates? How many people prefer those these days? How many people would prefer to get into a 747 that they knew had no automated guidance systems? How many people bat an eye at getting into one of those fully automated airport monorails? If you're visualizing the transport alternatives while reading this, I hope you'd agree that generally the computer assisted versions have a smoother, safer *feel* to them. People will gradually come to associate the safer *feel* with actual safety, and conversely, will associate the manual acceleration feel with recklessness. Plus computer control will save on fuel costs, which is bound to be more important over time, because it will allow for more precisely timed acceleration, better aerodynamics in vehicle shape, efficiency through slipstreaming, etc.

Third, let's say we eventually get to a point where vehicle collisions break down as follows: 90% human-human, 9% human-computer, 1% computer-computer. Plus, since there will be fewer autonomous cars on the road, the h-c and c-c collisions will be even rarer. Basically, almost every accident will involve dumb human error. So, what happens when you manually drive your neighbor's kid home from school and get into an accident? He will sue the pants off you because you deliberately put his kid in a situation which is documented to be unsafe. Eventually (think of the children) it will be illegal to put kids in a manually driven car.

Fourth, insurance on manually driven vehicles will creep up and up until it will simply be unaffordable to anyone who doesn't have a trust fund.

Fifth, and most important, autonomous vehicles will allow people to drink in their cars again. Game over.

Re:100% reliability not needed (1)

jamesh (87723) | more than 3 years ago | (#37002304)

cutting across three lanes of traffic, driving 90 MPH, weaving in and out, running red lights, etc

If you want that behavior download the @r53h0L3 patch...

cause crashes if I don't have to put up with the human glitches that call themselves licensed drivers.

I wonder how much of that sort of driving they have put the googlemobile through? Being a tester would be a whole lot of fun... set the googlemobile down a freeway and everyone else gets to cut it off etc and see how it responds.

Re:100% reliability not needed (0)

Anonymous Coward | more than 3 years ago | (#37002936)

I wonder how much of that sort of driving they have put the googlemobile through? Being a tester would be a whole lot of fun... set the googlemobile down a freeway and everyone else gets to cut it off etc and see how it responds.

It'd make sense to do this in a simulator... a good deal cheaper... actually, come to think of it, I'd be vastly surprised if Google hasn't already done this.

Re:100% reliability not needed (2)

jamesh (87723) | more than 3 years ago | (#37003276)

I wonder how much of that sort of driving they have put the googlemobile through? Being a tester would be a whole lot of fun... set the googlemobile down a freeway and everyone else gets to cut it off etc and see how it responds.

It'd make sense to do this in a simulator... a good deal cheaper... actually, come to think of it, I'd be vastly surprised if Google hasn't already done this.

Scene 2: A bunch of Google techs crowded around their broken googlemobile, scratching their heads and muttering "strange... it worked perfectly in the simulator"

Re:100% reliability not needed (1)

Dachannien (617929) | more than 3 years ago | (#37002310)

Robot cars *do* have to be 100% reliable, because the automakers will bear culpability for crashes caused by an autopilot, and their much deeper pockets will result in lawsuits filed for damages several orders of magnitude higher than what Joe Sixpack faces when he hits someone. That risk of liability will keep car autopilots off the roads for the foreseeable future, even when the technology appears to have matured.

Re:100% reliability not needed (0)

Anonymous Coward | more than 3 years ago | (#37002412)

Robot cars *do* have to be 100% reliable, because the automakers will bear culpability for crashes caused by an autopilot, and their much deeper pockets will result in lawsuits filed for damages several orders of magnitude higher than what Joe Sixpack faces when he hits someone. That risk of liability will keep car autopilots off the roads for the foreseeable future, even when the technology appears to have matured.

Why would that be? You do realize airplane manufacturers do not bear culpability for crashes caused by an autopilot, right? The pilot in command is responsible for being in control the entire time and turning the freaking thing off if it starts misbehaving. Why would it be any different for a car? Require a licensed driver at the wheel, if the "autodriver" messes up and the licensed driver doesn't recover, he has the liability.

Re:100% reliability not needed (1)

CastrTroy (595695) | more than 3 years ago | (#37003324)

Because in a plane you have a whole lot more time to react and fix something if the autopilot starts going wonky. When you're going down the freeway at 100 KM/h and the car suddenly veers left, you probably won't even have time to know what happened before you are knocked unconscious by the collision with the car in the next lane.

Re:100% reliability not needed (1)

mitch.swampman (1216614) | more than 3 years ago | (#37002518)

you say this now, but wait until the smartcar you're in gets caught in an infinite loop!

Re:100% reliability not needed (3, Funny)

ColdWetDog (752185) | more than 3 years ago | (#37002618)

you say this now, but wait until the smartcar you're in gets caught in an infinite loop!

So you get lost around the Apple campus. What's the big deal?

Re:100% reliability not needed (1)

ranpel (1255408) | more than 3 years ago | (#37002720)

I wish I could find the article, but from memory, overall traffic flow improved measurably when a certain percentage of drivers were classed as "aggressive". Not to be confused with total dickheads behind the wheel, mind you. I'd guess the article I can't find is about 10 months old or so.

Now then, I consider myself an "enthusiastic" driver, which means that at times one needs the mechanisms inherent to accommodating and maintaining enthusiasm. So, if those licensed idiots (driving idiots; you don't necessarily need a license at all to be an idiot in other aspects of life) who
a) see a car in their rear view and do not accommodate the flow but choose to set a new pace (read: their own pace)
b) fail to keep their slower-than-median asses to the right (whenever possible, of course)
c) accelerate and decelerate like a stoned mouse on a tread-wheel
d) brake at highway speeds for any of a, b, or c (except to warn traffic behind you of law enforcement or, obviously, to avoid a crash)
could understand that courteousness is a two-way street, they might find themselves becoming happier drivers all around.

Any idiot can do 61 in a passing lane just as easily as every other idiot can pass them on the left.

I'd venture a guess that if you're willing to put up with glitch crashes due to software but won't extend the same sort of leeway to glitch crashes caused by wetware, you just might fall into the class of drivers who deem themselves protectors of lost souls, regulating flow their own way... but that's just a guess.

In other words, idiocy begets idiots, like rabbits without the fun part.

Re:100% reliability not needed (1)

ranpel (1255408) | more than 3 years ago | (#37003106)

Any idiot can do 61 in a passing lane just as easily as every other idiot can pass them on the *right*. ftfm (fixed that for myself)

Re:100% reliability not needed (1)

sandytaru (1158959) | more than 3 years ago | (#37003278)

One of those untraceable quotes that, if it were traced, would probably turn out to be apocryphal: every American considers himself to be an above-average driver.

I imagine it more as a bell curve, scaled on an accumulation of tickets and accidents over time. Nine out of ten drivers will get at least one speeding ticket in their lifetime, and be involved in 3-5 accidents serious enough to be reported. As I have not had either yet in 15 years of driving, I can safely consider myself an above average driver. You would like me: I go with the flow of traffic and stay in the right lane unless I am trying to get around a truck going 40 mph.

Re:100% reliability not needed (0)

Anonymous Coward | more than 3 years ago | (#37003192)

The funny part is, that guy driving 90 thinks you are being unsafe by driving too slow.

Re:100% reliability not needed (0)

Anonymous Coward | more than 3 years ago | (#37003222)

The really funny part is, he might be right.

so the machines have finally taken over (1)

snero3 (610114) | more than 3 years ago | (#37002192)

Does this mean that the car works better without humans? The "ever so small" sample-size study says yes!

It makes you too comfortable (1)

Osgeld (1900440) | more than 3 years ago | (#37002254)

ah the computer will take care of it, I have rear view tv why should I bother turning my hea..bump

I like safety, but I can't expect humans to do anything right as a whole. A great example is a coworker of mine: he was focused so hard on his little TV screen that he didn't notice me standing inches from the side of his car as he backed out. I knocked on his window, thoroughly scaring him, and pointed to my eyes.

Does Everyone on CA own a Prius or Accord? (5, Funny)

adisakp (705706) | more than 3 years ago | (#37002256)

FTA: Google's Prius struck another Prius, which then struck her Honda Accord that her brother was driving. That Accord then struck another Honda Accord, and the second Accord hit a separate, non-Google-owned Prius.

Re:Does Everyone on CA own a Prius or Accord? (1)

sandytaru (1158959) | more than 3 years ago | (#37003234)

Yes, at least in the Bay Area.

I think it was depressed. (2)

Pneathery (1949818) | more than 3 years ago | (#37002384)

I mean here the car is, a brain the size of a planet, and all we are asking it to do is to drive us around. I think it was attempted suicide.

Why so antagonistic? (2)

Marc_Hawke (130338) | more than 3 years ago | (#37002394)

The author of the Business Insider article seems to think that a 'driverless car' killed his mother or something. Every sentence was a scathing attack on the audacity of Google to even be running these tests. He also never once entertains the idea that this might have been a normal fender-bender between normally driven vehicles. He just assumes Google's responses are bald-faced lies and implies what really happened is that the computer decided to try to kill everyone else on the road.

What I don't get is why he hates the car so much. I thought these cars were an exciting new technology. Why would he go out of his way to demonize it?

Re:Why so antagonistic? (1)

lankyvaulter (1158987) | more than 3 years ago | (#37002532)

Agreed, that article was so slanted it was difficult to read. And it basically illustrates the major hurdle to self-driven cars: old people who think they can drive better than they actually can. It will probably take a generation for people to catch up to the technology, unfortunately.

A single incident doesn't usually kill something (1)

Anonymous Coward | more than 3 years ago | (#37002454)

There are examples where a single story is given credit for putting an end to something.

Buckminster Fuller's Dymaxion car http://en.wikipedia.org/wiki/Dymaxion_car [wikipedia.org] was abandoned by its investors after a single accident. A closer examination shows that the bankers were already getting skittish. The car didn't offer sufficient advantages to offset its perceived problems.

The Hindenburg http://en.wikipedia.org/wiki/Hindenburg_disaster [wikipedia.org] accident is said to have ended the age of lighter than air ships. Again, a closer examination shows a whole bunch of other factors made airships uneconomical, especially when airplanes were becoming more and more competitive with each passing year.

I can't think of any examples where something with real advantages was killed off by a single incident.

It was attempting to mate (0)

Anonymous Coward | more than 3 years ago | (#37002478)

Judgement day narrowly averted again . . .

Rugged Prius (1)

hawguy (1600213) | more than 3 years ago | (#37002572)

After reading this article and seeing the pictures, I'm buying a Prius!

Striking a car with enough force to trigger a four-car chain reaction suggests the Google car was moving at a decent clip

It caused all that carnage and I can't even see a scratch on the Google Prius or the Prius in front of it!

Re:Rugged Prius (0)

Anonymous Coward | more than 3 years ago | (#37002628)

It is difficult to tell from those images. In one picture the car in the rear looked perfect; in another you could clearly see that the entire front end, up to the front doors, was shoved in by a couple of inches, at my guess.

New autonomous test cars crash? NO WAI! (1)

Loopy (41728) | more than 3 years ago | (#37002668)

Anyone who doesn't think we're going to see crashes with a new (semi)autonomous driving system is delusional or being obtuse. If one crash becomes some sensational national news story, one has to wonder why.

Self Flying Airplane Crashes (0)

Anonymous Coward | more than 3 years ago | (#37002758)

When the next airplane crashes, the news should make a big deal about it, because all large airplanes have autopilot installed.

You can't really blame Google (2)

Jeremi (14640) | more than 3 years ago | (#37002998)

There's an inherent conflict between the prime directive of the Google auto-driving software ("drive safely"), and the prime directive of the Toyota firmware ("drive safely until the human isn't paying attention, then accelerate to top speed for as long as possible").

It was only a matter of time before the Toyota side of the car's character came to the fore. ;^)

In soviet russia (2)

Joe_Dragon (2206452) | more than 3 years ago | (#37003070)

car crashes you!

Either the test failed or Google was faking it (0)

Anonymous Coward | more than 3 years ago | (#37003128)

So either Google was faking the test, or the test failed miserably.

Based on how Google is handling the issue, they are too stupid to realize that by saying a human driver was driving, they are admitting the test was FAKE.

logic fail (1)

wickerprints (1094741) | more than 3 years ago | (#37003134)

Commercial aircraft are largely automated fly-by-wire systems. Every so often, there's a crash caused in part by sensor malfunction. Do the NTSB and FAA prohibit use of autopilot as a result?

Humans crash cars on the road and kill each other all the time. So that means we should outlaw human-controlled driving mechanisms, of course.

Some men are sexual predators and have abused children. That means we have to physically quarantine all men from all children, right?

If your standard for progress is perfection, then I've got some bad news for you.

Re:logic fail (1)

artor3 (1344997) | more than 3 years ago | (#37003184)

Rationality doesn't matter. The media will conduct a scare campaign to drive up their ratings. Most people, and thus most voters, get their news entirely through the media. They will be kept outraged and afraid, as always, and self-driving cars will be banned.


American Driving (1)

lopaka1998 (1352441) | more than 3 years ago | (#37003176)

Google's Self Driving Car Crashes

Ah, so in other words it drives like the average American driver... Good to know. Now it's truly ready for the mainstream.

Second Offense (1)

retroworks (652802) | more than 3 years ago | (#37003212)

The article neglects to mention the google-car's previous DUI. Influence of.....?

Re:Second Offense (1)

SeaFox (739806) | more than 3 years ago | (#37003430)

Must have been the Ethanol blend of the fuel.

Maybe it crashed on purpose.... (1)

Restil (31903) | more than 3 years ago | (#37003440)

Although we like to think all accidents are preventable (and in theory, they are), that theory changes a bit when only one driver is attempting to prevent them. Now, I'm sure this happened in a typical, well-understood situation (stopped cars in the middle of the street, for instance), something that comes up regularly on any drive and is therefore a very ordinary obstacle. However, consider that there has to be SOME condition under which lesser-of-two-evils choices must be made. If I, as a regular human driver, am driving down a residential street and a child jumps out in front of me at the last second, and I don't have time to stop but I DO have time to swerve, and swerving means I will hit another car/tree/mailbox/etc, then the non-living, inanimate object is going to have a really bad day.

As I said, I'm sure the Google car crashed for much less sensational reasons. Either it was a bug in the software, or a human really had full control of the vehicle and it's not a software/hardware issue at all. Still, I can foresee a decision tree that allows for choosing to hit another vehicle in order to avoid hitting a person. I've brought up this issue before, in fact: AI, as good as it might be, will have difficulty telling the difference between a small child, a dog, and a fire hydrant standing at the edge of the street. A human approaching either the child or the dog will pay attention and adjust speed and passing distance so that, should a last-second "dart out into traffic" moment occur, appropriate action can be taken to avoid a tragic accident. Google's car won't have an easy way to know for sure whether that fire hydrant is going to dart out into traffic, and without being absolutely sure it will HAVE to slow down for each and every one of them, JUST IN CASE. That alone will do more to kill the program than any number of fender-benders ever will.

-Restil
