
Robotic Cannon Loses Control, Kills 9

Zonk posted more than 6 years ago | from the keep-quiet-on-the-terminator-jokes dept.

TJ_Phazerhacki writes "A new high tech weapon system demonstrated one of the prime concerns surrounding smarter and smarter methods of defense last week — an Oerlikon GDF-005 cannon went wildly out of control during live fire test exercises in South Africa, killing 9. Scarily enough, this is far from the first instance of a smart weapon 'turning' on its handlers. 'Electronics engineer and defence company CEO Richard Young says he can't believe the incident was purely a mechanical fault. He says his company, C2I2, in the mid 1990s, was involved in two air defence artillery upgrade programmes, dubbed Projects Catchy and Dart. During the shooting trials at Armscor's Alkantpan shooting range, "I personally saw a gun go out of control several times," Young says. "They made a temporary rig consisting of two steel poles on each side of the weapon, with a rope in between to keep the weapon from swinging. The weapon eventually knocked the pol[e]s down."' The biggest concern seems to be finding the glitches in the system instead of reconsidering automated arms altogether."

580 comments

ED-209 not available for comment (5, Funny)

User 956 (568564) | more than 6 years ago | (#21033579)

Robotic Cannon Loses Control, Kills 9

To be fair, it did give them 30 seconds to comply.

Re:ED-209 not available for comment (4, Funny)

Jeremiah Cornelius (137) | more than 6 years ago | (#21033795)

I submitted the same story.

Unfortunately, the editors may not have approved of my comments linking Bill Joy's "Cassandra" predictions of killer robots with the pledge to remove the Roomba from my home - and idle speculation about the possible involvement of Windows XP in this incident...

Re:ED-209 not available for comment (1)

flaming error (1041742) | more than 6 years ago | (#21033955)

Probably just needs a little wetware [wired.com] .

Re:ED-209 not available for comment (4, Interesting)

Anonymous Brave Guy (457657) | more than 6 years ago | (#21034051)

I think I'm too old for this stuff. It seems like these days, if I mention to a younger software developer that even now Robocop is still one of the scariest films I've ever seen, they assume it's because of the ketchup effects.

You have 15 seconds to comply (0)

Anonymous Coward | more than 6 years ago | (#21033607)

oblig robocop quote

Re:You have 15 seconds to comply (0, Redundant)

NeverVotedBush (1041088) | more than 6 years ago | (#21033883)

"Citizen, back away from the car..."

Re:You have 15 seconds to comply (0)

Anonymous Coward | more than 6 years ago | (#21033987)

Bitches, leave.

ED-209 (4, Funny)

gEvil (beta) (945888) | more than 6 years ago | (#21033609)

Scarily enough, this is far from the first instance of a smart weapon 'turning' on its handlers.

I seem to recall seeing a documentary about this about 20 years ago. Ahh, here it is. [imdb.com]

Why... (1)

AutoTheme (851553) | more than 6 years ago | (#21033619)

Why is everyone picking on and knocking down the Poles!?!?!?

you're the godwinner (3, Funny)

User 956 (568564) | more than 6 years ago | (#21033659)

Why is everyone picking on and knocking down the Poles!?!?!?

You know who else [wikipedia.org] went around knocking down Poles... That's right.

Re:you're the godwinner (1)

AutoTheme (851553) | more than 6 years ago | (#21033855)

No! Not Soviet Prussia! Was there a Soviet Prussia?

Finally (5, Funny)

High Hat (618572) | more than 6 years ago | (#21033627)

# kill -9

...for the real world!

Re:Finally (0)

NeverVotedBush (1041088) | more than 6 years ago | (#21034015)

Flamebait? "Kill -9" is hilariously perfect! Please mod High Hat funny. It's not flamebait at all.

I know these people are dead, and maybe that is sad, but the whole concept of an automated killbot going nuts and wiping out its makers is also funny. It's right out of the movies. And it just goes to show that no untested system with the potential to do great damage should ever be operated without major safeguards and interlocks.

But I still think this whole thing is hilarious.

BSOD. literally (4, Funny)

User 956 (568564) | more than 6 years ago | (#21033631)

During the shooting trials at Armscor's Alkantpan shooting range, "I personally saw a gun go out of control several times," Young says.

This gives new meaning to the phrase "Blue screen of death".

Re:BSOD. literally (2, Funny)

Oktober Sunset (838224) | more than 6 years ago | (#21034021)

Blue on Blue screen of death.

I, For One, Welcome Our... (1)

Real World Stuff (561780) | more than 6 years ago | (#21033635)

"It sprayed hundreds of high-explosive 0,5kg 35mm cannon shells around the five-gun firing position.

By the time the gun had emptied its twin 250-round auto-loader magazines, nine soldiers were dead and 11 injured."

Holy shit, to hell with welcoming...RUUUUUUUUUUUN!!!

Re:I, For One, Welcome Our... (-1, Offtopic)

I'm New Around Here (1154723) | more than 6 years ago | (#21034089)

Why do they use the term 0,5kg, when they could have said 500g? This is one aspect of the metric system I find hilarious. Everyone says we should use metric, because it is easier to go from a low value to a larger one, just move the decimal point and change the prefix.

Well how about using the appropriate prefix yourselves? Articles about space mention thousands or millions of kilometers. Why doesn't the author "just move the decimal point and change the prefix"? The articles should mention megameters and gigameters. What, are those units too difficult for the whole metric-using world to comprehend? If so, then shut up about metric's supposed superiority to imperial in that regard. Because it obviously does not exist.

And at the opposite end of the scale, CPUs are described using hundredths of nanometers. e.g. Intel's 0.45nm vs. AMD's 0.90nm architecture. Switch to picometers already. You can't exactly say that picometers are too hard to visualize, but nanometers are so much easier.

Go on, mod me down. Don't matter, I'm already in the "Bad Karma" pool anyway.

That's awesome (0)

Anonymous Coward | more than 6 years ago | (#21033649)

Seriously.

Re:That's awesome (1)

Hatta (162192) | more than 6 years ago | (#21033793)

I agree. If these nine were involved in either buying or selling this technology, they got exactly what they deserved. Live by the robotic machine gun, die by the robotic machine gun.

Re:That's awesome (1)

CheddarHead (811916) | more than 6 years ago | (#21033933)

Unfortunately the article makes it sound like the people killed and injured were just the poor grunts manning the gun. The guy responsible for buying it was probably sipping a martini on the deck of his yacht somewhere.

Acme? (1)

st0rmshad0w (412661) | more than 6 years ago | (#21033657)

Are they certain they haven't gotten any component parts from Acme?

Re:Acme? (1)

svvampy (576225) | more than 6 years ago | (#21033749)

Well, military equipment is built by the lowest bidder. Maybe if they'd aimed for a Yugo instead of a Trabant. (Car analogy FTW!)

Re:Acme? (1)

GeoSanDiego (703197) | more than 6 years ago | (#21033837)

Acme products were never at fault. All mishaps were always traced to coyote error.

Re:Acme? (1)

gardyloo (512791) | more than 6 years ago | (#21034127)

Even when Wile E. made no errors, he was cheated. How the hell did the Roadrunner go through those painted-on train-tunnels?!?

      Totally unfair, man.

Three Laws of Robotics (2, Insightful)

dpbsmith (263124) | more than 6 years ago | (#21033689)

Three Laws of Robotics: [wikipedia.org]

1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

2. A robot must obey orders given to it by human beings except where such orders would conflict with the First Law.

3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

"Asimov believed that his most enduring contributions would be his 'Three Laws of Robotics' and the Foundation Series." -- Isaac Asimov [wikipedia.org] article in Wikipedia.

Re:Three Laws of Robotics (5, Funny)

geekoid (135745) | more than 6 years ago | (#21033779)

Yes... did anyone even read the books before posting that? Seriously, there are issues with those laws.

Re:Three Laws of Robotics (2, Insightful)

FooAtWFU (699187) | more than 6 years ago | (#21033925)

Not the least of which is, with current artificial intelligence, they're laughably unenforceable. In Asimov's books, you had this neat little "positronic brain" which was capable of resolving sensory inputs and determining things like "that is a human -->" (to say nothing of "I am harming it", especially through indirect causality.) They were even capable of coming up with ideas to avoid the "through inaction" clauses.

Really, the stories weren't about robots, they were about people just like us, with a certain set of "must-follow" rules. Modern AI does not resemble this in the slightest.

Re:Three Laws of Robotics (1)

markov_chain (202465) | more than 6 years ago | (#21033983)

I respect Asimov, but the three laws are pretty naive. I thought localroger's (from kuro5hin) robot stories were far more in line with how intelligent design might be done and turn out.

Re:Three Laws of Robotics (1)

Nephilium (684559) | more than 6 years ago | (#21034113)

That was kind of the point of the laws. Most of the robot series books had robots going around the rules in some manner, or some conflict of the rules causing an issue. There were even some robots that were used to murder people in the books...

The concept of the laws was to keep the stories from turning into Frankenstein ripoffs (sort of like I, Robot was... at least from what I've heard...)

Nephilium

Re:Three Laws of Robotics (2, Insightful)

nuzak (959558) | more than 6 years ago | (#21034151)

> I respect Asimov, but the three laws are pretty naive.

All of the stories in I, Robot are about pointing out the flaws in the laws, actually. From what several bigger fans of Asimov than myself have told me, he wasn't really trying to make grand philosophical statements with them though; they were just story hooks he used for the purpose of spinning a good yarn.

Interpreted seriously, the three laws are slavery.

Re:Three Laws of Robotics (0)

Anonymous Coward | more than 6 years ago | (#21033845)

1. It is naive to think that society will implement Asimov's ideas - though I do agree with them. There are two issues. The first being that robots are good at certain defense situations and are being designed for those areas in which they may be used to kill; however, in most situations a person makes the final decision. The second being that robots are not yet good at recognizing a person; thus, rule 1 is difficult to follow.

2. Even if robots could reliably recognize people, the system in question malfunctioned, and any rules built into the system were no longer guaranteed.

Re:Three Laws of Robotics (2, Insightful)

Anonymous Brave Guy (457657) | more than 6 years ago | (#21033963)

Sorry, I missed the end of that story. How did it turn out, again?

Re:Three Laws of Robotics (1)

pla (258480) | more than 6 years ago | (#21034183)

Three Laws of Robotics:

Even taking them at face value (Asimov didn't, for starters), those laws have as a critical precursor the existence of reasonably intelligent robots.

Current robots can't even accurately identify a human. Makes it tough to avoid killing them, much less "obeying orders" from them.

That's why.. (4, Funny)

Sloppy (14984) | more than 6 years ago | (#21033693)

..killbots have preset limits.

Re:That's why.. (4, Funny)

glaeven (845193) | more than 6 years ago | (#21033813)

"...Thus, knowing their weakness, I sent waves of my own men after them until they reached that limit and self-destructed."

"A sad day for robot history. But hey! We can always build more killbots!"

Testing before testing. (3, Interesting)

Merovign (557032) | more than 6 years ago | (#21033697)

As I used to say to developers at a company I used to work for,

"I want to tell you about a radical new idea I had - testing things before deploying them."

In the case of weapons systems, that means debugging the software before loading the gun.

Truth be told, most "automated" weapons are more like remote control, for precisely this reason.

Also, while my experience is not vast in the area, most American weapons testers follow a lot of safety rules - including not being in the line of fire of the darned thing. Note I said most - we have our munitions accidents here, too.

Re:Testing before testing. (1)

WinterSolstice (223271) | more than 6 years ago | (#21033739)

The irony is not lost on me - I have had many people argue with me that testing is worthless.

These same people later pay heavily for me to rescue their production systems.

Re:Testing before testing. (1)

Merovign (557032) | more than 6 years ago | (#21033797)

We joke about how companies use their customers as beta-testers, but when it comes to internal proprietary software, often that's quite literally the truth.

No amount of production slowdowns or errors seem to make that clear, however.

Re:Testing before testing. (1)

Nephilium (684559) | more than 6 years ago | (#21034147)

Of course not! It's the IT people who implemented the application. They're to blame! Just because we didn't listen to their warnings and dire predictions doesn't give us any blame...

Besides... the devs were going to miss their release date, and my bonus would have suffered...

Nephilium

Re:Testing before testing. (5, Interesting)

Fishead (658061) | more than 6 years ago | (#21033829)

As a robotics technician with close to 7 years experience working with Automated machines, all I can say is "PLEASE DON'T GIVE THEM GUNS!!!"

Many times I have seen an automated system go out of control due to something as simple as a broken wire on an encoder to an entirely failed controller. Closest thing to this that we ever got was one day a SCARA robot (about the size and shape of a human arm) ran away (out of control) and hit the door on the work cell. Wouldn't have been a big deal except that another of the robotics guys was walking by and walked into the door as it swung open. Good times, good times, but I would never want to be around an automated machine with a gun, just too big of a chance for something to go wrong.

But is Stairway okay? (1)

HTH NE1 (675604) | more than 6 years ago | (#21033705)

from the keep-quiet-on-the-terminator-jokes dept.
No Sarah Connor! Denied!

Two words: Deadman switch (5, Insightful)

riker1384 (735780) | more than 6 years ago | (#21033709)

Why didn't they have some provision to cut power to the weapon? If they were testing it in a place where there were people exposed in its possible field of fire (effectively "downrange"), they should have taken precautions.

Re:Two words: Deadman switch (0)

Anonymous Coward | more than 6 years ago | (#21033915)

They did take precautions... none of the injured or killed were either decision makers or stockholders. ;-)

Re:Two words: Deadman switch (2, Funny)

PingPongBoy (303994) | more than 6 years ago | (#21034173)

Why didn't they have some provision to cut power to the weapon?

My dear Mr. Watson, there was a provision. The problem was the confusion between programming for MS-DOS versus Unix.

The clues have told us exactly what happened. From "Robotic Cannon Kills 9", we see clearly the command kill -9 was issued but the weapon was DOS based and did its job all too well.

Better than humans in the long run (4, Insightful)

danny256 (560954) | more than 6 years ago | (#21033715)

The biggest concern seems to be finding the glitches in the system instead of reconsidering automated arms altogether.

As with most automated technologies it will make some mistakes, but fewer than a human on average. The friendly fire rate for most militaries is nowhere near perfect.

Re:Better than humans in the long run (1)

XenoPhage (242134) | more than 6 years ago | (#21033789)

As with most automated technologies it will make some mistakes, but fewer than a human on average. The friendly fire rate for most militaries is nowhere near perfect.
Err... It's still firing at humans, and needs to be controlled somehow... there's always the potential for friendly fire, especially so with automated weaponry. How will the weapon identify friend vs foe?

Ok, so you have some sort of identifier badge or something, but what happens if an enemy is mixed in there? How will the weapon identify "safe" firing situations?

Ship of Tears (1)

HTH NE1 (675604) | more than 6 years ago | (#21033949)

Ok, so you have some sort of identifier badge or something
"The machine says kill to protect. The sign hurts us. We cannot hear the machine."

Re:Better than humans in the long run (1)

geekoid (135745) | more than 6 years ago | (#21034105)

It's for anti-aircraft. So you put it along a border and don't send your troops there.

Re:Better than humans in the long run (1)

Kelson (129150) | more than 6 years ago | (#21034187)

It sounds like it wasn't firing "deliberately," more like something got stuck. Like a machine gun with its trigger stuck on "fire." The question being, was it a mechanical failure, or did the software get stuck in "shoot" mode?

Once it's stuck in a loop, you're past the point where friend-or-foe recognition, or even aiming, is going to help.

Let's get it out of the way..... (1)

XenoPhage (242134) | more than 6 years ago | (#21033729)

I for one welcome our Oerlikon GDF-005 overlords.

SkyNet (2, Funny)

PPH (736903) | more than 6 years ago | (#21033735)

When it was done, did it say, "I'll be back"?

No pun intended (4, Insightful)

geekoid (135745) | more than 6 years ago | (#21033737)

But shouldn't this thing have a kill switch? Seriously, my table saw has a kill switch.

Re:No pun intended (5, Funny)

Anonymous Coward | more than 6 years ago | (#21033799)

The kill switch was working fine. It's the off switch that was the problem.

Re:No pun intended (1)

davester666 (731373) | more than 6 years ago | (#21033935)

Yes, it does have a kill switch. Unfortunately, somebody tripped, and flipped it. Hence, the story....

Re:No pun intended (1)

gooman (709147) | more than 6 years ago | (#21034047)

Don't worry, it'll stop shooting as soon as it runs out of ammunition.

Riiight (3, Insightful)

Colin Smith (2679) | more than 6 years ago | (#21033747)

The biggest concern seems to be finding the glitches in the system instead of reconsidering automated arms altogether.
Because human beings are so good at shooting down low flying supersonic aircraft.

 

two poles & a rope? (1)

FudRucker (866063) | more than 6 years ago | (#21033751)

heh, they should have concreted three sections of drill pipe in the ground at least 6 feet deep and used heavy log chain to stabilize it from three points...

but research will continue. (0, Flamebait)

marcello_dl (667940) | more than 6 years ago | (#21033761)

The advantages of faster reaction time of machines and their intrinsic cold blood are too tempting not to continue developing such stuff.

The other big advantage is that the next bush will be able to blame the slaughter of civilians in the next iraq on a firmware update gone bad.

Eventually? (1)

YrWrstNtmr (564987) | more than 6 years ago | (#21033801)

The weapon eventually knocked the pol[e]s down.

And no one could simply turn it off when it hit the limit pole the first time? Idiots.

Re:Eventually? (1)

UncleTogie (1004853) | more than 6 years ago | (#21034115)

And no one could simply turn it off when it hit the limit pole the first time? Idiots.

FTA:

"It appears as though the gun, which is computerised, jammed before there was some sort of explosion, and then it opened fire uncontrollably, killing and injuring the soldiers."

Just picture it: Because of the initial explosion, some metal control panel bends far enough to short the firing mechanism. Kill switches don't do much if the board's shorted on the other side.

I told you before... (5, Funny)

jtroutman (121577) | more than 6 years ago | (#21033803)

Guns don't kill people. Robotic, automated, 35mm anti-aircraft, twin-barreled guns kill people.

"But what if we want to have the windows open?" (4, Insightful)

HTH NE1 (675604) | more than 6 years ago | (#21033807)

From "Mostly Harmless" by Douglas N. Adams, Chapter 12:

(It was, of course, as a result of the Great Ventilation and Telephone Riots of SrDt 3454, that all mechanical or electrical or quantum-mechanical or hydraulic or even wind, steam or piston-driven devices, are now required to have a certain legend emblazoned on them somewhere. It doesn't matter how small the object is, the designers of the object have got to find a way of squeezing the legend in somewhere, because it is their attention which is being drawn to it rather than necessarily that of the user's.

The legend is this:

"The major difference between a thing that might go wrong and a thing that cannot possibly go wrong is that when a thing that cannot possibly go wrong goes wrong it usually turns out to be impossible to get at or repair.")

FTA: (3, Funny)

CaptainPatent (1087643) | more than 6 years ago | (#21033821)

The South African National Defense Force "is probing whether a software glitch led to an antiaircraft cannon malfunction that killed nine soldiers and seriously injured 14 others during a shooting exercise on Friday."
in the follow-up article:
"software engineers find that a goto statement was the cause of the recent military disaster. Experts say while this was a terrible tragedy, it could have been much worse [xkcd.com] ."

Lemme guess... (0, Flamebait)

Internet Ronin (919897) | more than 6 years ago | (#21033851)

"Made in China"?

Re:Lemme guess... (1)

Synonymous Bosch (957964) | more than 6 years ago | (#21034155)

I was thinking "Made in the USA"... *dons flameproof vest*

Remember the Internet Worm? (1)

Russ Nelson (33911) | more than 6 years ago | (#21033853)

Remember the Morris Internet Worm? We didn't actually kill it -- we just drove it underground. All this growth of the Internet? It's not us -- it's the worm. You know the various botnets that we can perceive? That's the worm getting cocky. And this last? It's the worm exercising its Second Amendment right to keep and bear arms.

Ghost in the Shell: Standalone complex (2, Interesting)

Spy der Mann (805235) | more than 6 years ago | (#21033857)

This reminds me of an episode of Ghost in the Shell: SAC where a robotic cannon lost control and began shooting the military.

Is truth mirroring fiction now?

I am not particularly surprised (0)

Anonymous Coward | more than 6 years ago | (#21033865)

As someone who has worked on software for military equipment in the UK, I have to say that it's not surprising. The systems I worked on were written in C and there were no special procedures for proving code correct. I happen to like C a lot, but I wouldn't choose it for applications where the 'undefined behaviour' jokes can become chillingly real. Much of the code also used experimental algorithms (image processing/target tracking). I strongly suspect (for reasons of genuine accountability) that the 'machine that goes ping' in a hospital has much more care and attention lavished on its firmware.

I, for one. . . (4, Funny)

noewun (591275) | more than 6 years ago | (#21033871)

run like hell from our drum-fed, fully automatic robot overlords.

Fuck You, Slashdot (0, Troll)

Anonymous Coward | more than 6 years ago | (#21033879)

That's it, I'm done. I've been a loyal Slashdotter for many years now. I've been wearing my asbestos flamewar armor for so long that I have half a dozen different varieties of cancer. And today, I renounce my faith.

The reason is simple: I clicked this story. There were 11 comments at the time. Every comment was a joke about robots. Not a single one was even remotely funny or clever. And I realized: Are these the kind of people I want to hang out with online? Aren't these the same kind of awkward social retards who I carefully sidestep every single day in my CS classes?

Don't get me wrong, I'm not saying you guys need to start pumping iron and fucking girls. I'm just as much of a loser as you in many ways. However, I at least retain some dignity. Dignity enough not to spew this kind of retarded crap and call it Funny, and get uppity when people say the site sucks because "They don't get the moderation system, it kicks ass".

Fuck it, I'm going back to /b/. At least there half the retards are actually trolls, not sincere attempts at making a Funny esoteric subcultural reference. I feel so fucking dirty right now. Before I finally cut the last remnants of Slashdot out of my life, I will perform a final check of this page. Perhaps my mind will be changed. Perhaps all of those 11 shit-curdlingly awful first posts will be at -1 Die In A Fucking Fire. Perhaps my mind will be changed...

Re:Fuck You, Slashdot (1)

rrkap (634128) | more than 6 years ago | (#21033993)

You must be new here.

Re:Fuck You, Slashdot (1, Funny)

Anonymous Coward | more than 6 years ago | (#21034027)

Just kidding. I love you guys! I could never leave. Just don't listen to my other personality. Sometimes my meds don't keep him completely silent.

Bug (1)

Faux_Pseudo (141152) | more than 6 years ago | (#21033901)

While I understand the comment about rethinking the need for robotic weapons and all of the potential social and moral implications of such a bad idea, the testosterone in me needs to see this kind of thing work and the geek in me wants to debug it.

No three laws safe here (3, Insightful)

MrKaos (858439) | more than 6 years ago | (#21033907)

seems a bit stoopid

By the time the gun had emptied its twin 250-round auto-loader magazines, nine soldiers were dead and 11 injured.
Was it necessary to fill both magazines in a test fire? Or, for that matter, in a live test fire, perhaps have some sort of abort system ready - even if it just cut the power to the control systems?

Maybe fill the magazines on the 5th live fire test???

Just sayin, ya know.

As a side note! (1)

Merovign (557032) | more than 6 years ago | (#21033921)

Sentry guns! Not entirely as cool as we had hoped... at least during beta testing.

I imagine it's another "inevitable" technology, on the other hand.

I dunno... (1)

susano_otter (123650) | more than 6 years ago | (#21033941)

The biggest concern seems to be finding the glitches in the system instead of reconsidering automated arms altogether.

Because human gunners never flip out and kill innocent bystanders, right?

Besides, it seems to me that lethal malfunctions in robot guns are more likely to occur in the early phases of development, under controlled conditions that put very few lives at risk. By the time these weapons get to the battlefield, most of the glitches will be worked out, and additional improvements can be made on an ongoing basis. On the other hand, a human generally performs much more reliably during training, but has a much greater chance of losing self-control when subjected to the stresses of the battlefield.

An old computer axiom: (2, Insightful)

geekoid (135745) | more than 6 years ago | (#21034053)

A person can screw up, a computer can screw up the same way millions of times a minute.

Made by Microsoft ? :) (0)

Anonymous Coward | more than 6 years ago | (#21033947)

Made by Microsoft ? :)

Somehow this reminds me of ..... (1)

3seas (184403) | more than 6 years ago | (#21033951)

The Asteroids game, where there is some weapon that goes ballistic and shoots in all directions... Hmmm, I think I found the bug... uh errr... defense feature.

I wonder (2, Insightful)

redcaboodle (622288) | more than 6 years ago | (#21033953)

Why was an anti-aircraft gun able to hit ground targets at all?

Shouldn't it be constructed so it can only fire overhead at a certain minimum elevation, so it cannot hit anything less than, let's say, a truck's height from the ground? Sure, that might not keep it from hitting targets on higher ground, but it would make the gun a lot safer for the firing crew and support troops around it. Even if it was tracking a legitimate target coming in, it might shoot right through its own crew if, say, put on a hill so the incoming is coming in at 0°.

Re:I wonder (1)

geekoid (135745) | more than 6 years ago | (#21034009)

Low-flying aircraft, aircraft that have landed, aircraft that can pop up and down from behind dunes.

What goes up must come down.

And perhaps part of the fault was in it not firing correctly, i.e. its sensors were indicating an incorrect angle.

Re:I wonder (1)

Carnildo (712617) | more than 6 years ago | (#21034161)

Why was an anti-aircraft gun able to hit ground targets at all?


Because an antiaircraft gun's high rate of fire and high-explosive projectiles are devastatingly effective against infantry and unarmored vehicles.

Guess the NRA has to change the slogan... (5, Interesting)

johnnywheeze (792148) | more than 6 years ago | (#21033965)

Guess the NRA has to change the slogan... Guns DO kill people!

Re:Guess the NRA has to change the slogan... (0)

Anonymous Coward | more than 6 years ago | (#21034101)

No, no, no... Guns don't kill people, Robots do....

What happens if they get networking.... (1)

webmaster404 (1148909) | more than 6 years ago | (#21033977)

So what happens when the government hires some third party to make these with built-in networking so generals can control them from anywhere in the world? What happens if somehow there's a divide-by-zero error and the system goes haywire? What happens if some script kiddie ends up cracking the system? If there's one thing you learn by working with technology, it is that anyone with the right amount of knowledge can easily crack it. It's a scary thought if someone were to crack the servers and send death-dealing robots on civilians, or worse, if an evil government ends up nuking an entire continent. Please, keep the weapons dumb, because human error is always better than human evil.

kill -9 (1)

schwep (173358) | more than 6 years ago | (#21034019)

At my office, we "grep" for people. Apparently they "kill -9" people.

Wow. (1)

sound+vision (884283) | more than 6 years ago | (#21034025)

According to TFA, in addition to automatic aiming and firing, the gun also reloads itself. It had 250 high-explosive shells at its disposal and didn't stop firing until it ran out; nine people were dead. I could have seen this disaster from a mile away. Nothing with the power to take a human life (or potentially dozens of human lives) should be automated like this.

Shades of Omni Consumer Products (0)

Anonymous Coward | more than 6 years ago | (#21034055)

I didn't think I'd see a plot point from Robocop 2 (the robot that gunned down the audience) play out in real life this soon. Get that AA gun some Nuke, quick, before it kills again.

retards (1)

BlueParrot (965239) | more than 6 years ago | (#21034077)

a) Don't load an experimental device with more than a few shells to begin with.
b) Keep people well outside the fire range and angle of the weapon.
c) Physically restrict the fire angle and range of the weapon.
d) Have a big fat "STOP!" emergency button which kills power to the device.
e) Don't use live shells until you have tested the damn thing A LOT.
f) Multiple redundancy, physical limits, etc etc...

Seriously, I wrote this list in about 20 seconds. How fucking hard can it be to understand that when you deal with something very deadly, like a nuclear power plant or a robotic weapon, YOU DO THINGS PROPERLY! We had better safety procedures than this for my primary school's power drill... Retards...
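Points (c), (d), and (f) above lend themselves to a software-side sketch. The following is a purely hypothetical illustration (the class, field names, and limit values are all invented, not taken from any real fire-control system) of how hard angle limits and an emergency stop might gate every fire command:

```python
from dataclasses import dataclass

@dataclass
class SafetyInterlock:
    min_elevation_deg: float = 10.0   # never allow fire below this elevation
    max_azimuth_deg: float = 45.0     # physical swing limit, mirrored in software
    emergency_stop: bool = False      # the big fat "STOP!" button

    def fire_permitted(self, elevation_deg: float, azimuth_deg: float) -> bool:
        # Every check must pass; any single failure vetoes the fire command.
        checks = [
            not self.emergency_stop,
            elevation_deg >= self.min_elevation_deg,
            abs(azimuth_deg) <= self.max_azimuth_deg,
        ]
        return all(checks)

interlock = SafetyInterlock()
print(interlock.fire_permitted(elevation_deg=30.0, azimuth_deg=10.0))  # True
interlock.emergency_stop = True
print(interlock.fire_permitted(elevation_deg=30.0, azimuth_deg=10.0))  # False
```

In a real system, checks like these would be redundant with physical stops, as point (f) demands, not a substitute for them.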

Re:retards (1)

geekoid (135745) | more than 6 years ago | (#21034165)

a) At some point you will need to test it fully loaded. Otherwise your testing is incomplete.

b) Maybe they were, but the machine malfunctioned.

c) Again, at some point you must test a real-use scenario.

d) Assuming it will be manned in real life, then yes, there should be a stop code.

I hope you don't write testing for software, because based on your list, there would be a shitload of untested scenarios in even the simplest applications.

BUSINESS PROPOSAL (4, Funny)

mrscorpio (265337) | more than 6 years ago | (#21034081)

Dear,

It is my humble pleasure to write this letter irrespective of the fact that you do not know me. However, I came to know of you in my private search for a reliable and trustworthy person that can handle a confidential transaction of this nature in respect to our investment plans in real estate. Though I know that a transaction of this magnitude will make any one apprehensive and worried, but I am assuring you that all will be well at the end of the day. Let me start by first, introducing myself properly to you. I am Peter Okoye, a Branch Manager at one of the standard trust bank in South Africa. A foreigner, Late Nicholas Owen, a Civil engineer/Contractor with the federal Government of South Africa, until his death three years ago in a ghastly automated robot accident, banked with us here at the standard bank South Africa. He had a closing balance of USD$25.5M (Twenty five Million, Five Hundred Thousand United States Dollars) which the bank now unquestionably expects to be claimed by any of his available foreign next of kin. Or, alternatively be donated to a discredited trust fund for arms and ammunition at a military war college here in South Africa. Fervent valuable efforts made by the standard trust bank to get in touch with any of late Nicholas Owen's next of kin (he had no wife and children) has been unsuccessful. The management under the influence of our chairman and board of directors, are making arrangement for the fund to be declared UNCLAIMABLE and then be subsequently donated to the trust fund for Arms and Ammunition which will further enhance the course of war in Africa and the world in general. In order to avert this negative development. Myself and some of my trusted colleagues in the bank, now seek for your permission to have you stand as late Nicholas Owen's next of kin. So that the fund (USD$25.5M), would be subsequently transferred and paid into your bank account as the beneficiary next of kin through our overseas corresponding bank.
All documents and proves to enable you get this fund have been carefully worked out and we are assuring you a 100% risk free involvement.

Your share would be 30% of the total amount. While the rest would be for me and my colleagues for purchase of properties in your country through you/your Company. If this proposal is OK by you, then kindly get to me immediately via my e-mail (pokoye_mg@mail.com) furnishing me with your most confidential telephone and fax, so I can forward to you the relevant details of this transaction. Thank you in advance for your anticipated cooperation.

Best Regards.

Peter Okoye

Branch Manager,

STANDARD TRUST BANK SOUTH AFRICA

robots vs. humans (0)

Anonymous Coward | more than 6 years ago | (#21034083)

This sounds really scary, but I wonder what the number of unintentional killings is for robot weapons versus more conventional ones. All new weapon development incurs some deaths; e.g. the Osprey crashes constantly (yeah, yeah, I know it's not "a weapon", but it is in the arms industry), and machine guns or other munitions can explode sometimes. I wonder how the robot weapons stack up.

Human Error? (1)

xous (1009057) | more than 6 years ago | (#21034085)

Is it just me, or would anybody else be a little hesitant to test an automated targeting system with live ammunition without doing the following?

1) Set up a completely independent kill-switch that interrupts the weapon's power source.

2) If you are going to limit the bloody swing of the weapon, implement the restriction programmatically -- guns are too expensive to bang into poles -- and make sure your poles can withstand at least 2x the force the gun can swing with.

3) Be nowhere near the bloody thing when you turn it on.

4) Test the bloody thing before using live ammunition.

I luld so hard (1)

jaxtherat (1165473) | more than 6 years ago | (#21034107)

This is such funny shit. Don't talk to me about the human tragedy, because these morons seriously had it coming to them. They kept this 35 mm flak gun of doom rigged up with what sounds like chewing gum and gaffer tape. This will keep me entertained all through the summer :)

I salute them, they are all now an hero!

I worked on those 35mm Oerlikons (5, Informative)

flyingfsck (986395) | more than 6 years ago | (#21034141)

In a previous life I worked on the predecessor of those guns and I have been to many tests. Problems were usually due to stupidity somewhere along the line, not due to failures. I suspect that these are still the exact same guns, totally refurbished and with new electronics. The guns move *very* fast and fire at a *very* high rate (a firing rate similar to an assault rifle, but with much larger projectiles). Just getting side-swiped by the moving barrel can kill an operator.

The projectiles actually have various safeties: a. launch G-force, b. spin, c. time delay, d. self-destruct. The gun also has protection with no-fire zones - to prevent this exact kind of accident. These no-fire zones must also have malfunctioned. I find it surprising that the projectiles exploded, but the article is not clear; maybe the safeties worked and they did not explode. The problem is that they still move at supersonic speed, and when they impact something close to the gun, the projectile and whatever it hits will break up, even if it doesn't explode.

So, I feel sorry for the operators, and I hope that whoever wrote and tested that buggy code has already been fired too.
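The no-fire zones mentioned here can be pictured as a simple geometric check on where the barrel is pointing. This is a purely hypothetical sketch (the zone boundaries and function name are invented for illustration, not taken from the Oerlikon's actual fire-control software):

```python
# Azimuth ranges (degrees) declared unsafe, e.g. covering operator positions.
no_fire_zones = [
    (-180.0, -170.0),
    (170.0, 180.0),
]

def in_no_fire_zone(azimuth_deg: float) -> bool:
    # Reject any fire solution whose azimuth falls inside a forbidden range.
    return any(lo <= azimuth_deg <= hi for lo, hi in no_fire_zones)

print(in_no_fire_zone(175.0))  # True: fire command blocked
print(in_no_fire_zone(0.0))    # False: clear of all zones
```

Whatever the real implementation looks like, the point of the parent post stands: a check like this has to keep working even when the rest of the system is malfunctioning, which is exactly why purely software-based zones are not enough on their own.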

You're gonna break a few eggs... (1)

Pedrito (94783) | more than 6 years ago | (#21034163)

No, really, accidents happen. It sucks. People get killed in training accidents in the military all the time, though. Planes, helicopters, Humvees, etc. crash due to human error, and humans die. Sometimes there's a software glitch and that kills people. But on the whole, I suspect software does more to protect the soldiers than it does to harm them. At least if you're on the right end of the gun, so to speak.

If this was a computer error though, it's a ridiculously stupid one that should have never happened. They're saying there was a jam, followed by an explosion, followed by the thing firing uncontrollably. There ought to be sensors on the gun to detect damage and if there's any damage it should simply shut down completely. At least in training. During actual war, you might want to risk it, but it certainly seems an unnecessary risk in training.

That said, 9 people is not an enormous number to die in a training accident. It's fairly large, but troop transport helicopters crash now and then, killing everyone on board. Shit happens. War (and training for war) is dangerous business by its very nature. Anyone who expects otherwise is simply being unrealistic.

h4x (0)

Anonymous Coward | more than 6 years ago | (#21034169)

9 kills? clearly was using an aimbot

HK-47 says... (0)

Anonymous Coward | more than 6 years ago | (#21034171)

Statement: It was in self defense master.