
SCADA Problems Too Big To Call 'Bugs,' Says DHS

Soulskill posted more than 2 years ago | from the just-a-remote-unauthorized-access-feature dept.


chicksdaddy writes "With the one year anniversary of Stuxnet upon us, a senior cybersecurity official at the Department of Homeland Security says the agency is reevaluating whether it makes sense to warn the public about all of the security failings of industrial control system (ICS) and SCADA software used to control the U.S.'s critical infrastructure. DHS says it is rethinking the conditions under which it will use security advisories from ICS-CERT to warn the public about security issues in ICS products. The changes could recast certain kinds of vulnerabilities as 'design issues' rather than security holes. No surprise: independent ICS experts like Ralph Langner worry that DHS is ducking responsibility for forcing changes that will secure the software used to run the nation's critical infrastructure. 'This radically cuts the amount of vulnerabilities in the ICS space by roughly 90%, since the vast majority of security "issues" we have are not bugs, but design flaws,' Langner writes on his blog. 'So today everybody has gotten much more secure because so many vulnerabilities just disappeared.'"

92 comments

Call them whatever you want (2)

elrous0 (869638) | more than 2 years ago | (#37519900)

"Bugs," "security vulnerabilities," "design flaws"--really doesn't matter. They can still blow up an Iranian centrifuge. And I'm pretty sure that means they can blow them up in other countries too, along with just about anything else that depends on a PLC. Stuxnet, as well-intentioned as it may have been for Israeli and U.S. interests, was a wake-up call that goes way beyond any petty Persian pissing contest.

Re:Call them whatever you want (3, Insightful)

Anonymous Coward | more than 2 years ago | (#37520042)

"Bugs," "security vulnerabilities," "design flaws"

it matters to bureaucrats, unfortunately.

the categorization of these flaws, and whether they are a "bug" or not, can determine by law or policy who is on the hook for the $$$ required to fix the flaw.

Re:Call them whatever you want (1)

jeffmeden (135043) | more than 2 years ago | (#37520168)

"Bugs," "security vulnerabilities," "design flaws"

it matters to bureaucrats, unfortunately.

the categorization of these flaws, and whether they are a "bug" or not, can determine by law or policy who is on the hook for the $$$ required to fix the flaw.

That's exactly it. (make way for the car analogy!) You wouldn't say that a car with glass windows has a "security flaw" in which the interior can be accessed through the use of a ball-peen hammer and 1/2 lb of force. You instead say that it has known security limitations involving vectors x, y and z. If your system (the car) is improperly configured (left unattended in a shitty neighborhood) and subsequently gets burglarized, you would not say that the vendor provided an insufficiently secure car, would you?

So it is with ICS and SCADA. No vendor is out there saying "sure, these are secure enough to attach to the internet at large, with no firewalls at all protecting them"... And yet that is what we see happen. Is it the vendor's fault for not making the system impossible to set up insecurely? Or is it the implementer's fault for being an idiot? You know which side I am on.

Re:Call them whatever you want (1)

LifesABeach (234436) | more than 2 years ago | (#37520534)

I will sleep better tonight in southern California because my electricity [msn.com] and cell phone [nctimes.com] feel good when they're together?

Or maybe someone at DHS has a funny way of doing their job; but how American citizens were helped after Katrina [aclu.org] is something I still clench my remaining teeth about.

Re:Call them whatever you want (0)

Anonymous Coward | more than 2 years ago | (#37526404)

Last time I checked, when I worked for GE-Hydro (the division of GE that makes hydro power generators), we shipped all PLCs locked down so hard you had to turn them off and back on with a number of DIP switches flipped to even place them in "you have three seconds during boot to supply the vendor maintenance password" mode. In locked-down mode, it would not allow any read or write operations anywhere but to the user-parameters register space. And those PLCs were ALL located deep inside the static exciters, so that you'd not only need access to the machine room, but you'd also need to be extremely foolhardy to mess with one when it was no more than 1m from the field control (typically ~1kV at >400A).

Granted, that functionality was not available outside of GE (we bought GE PLCs for a reason, and it was not because they were better or to save face; it was because we got special toys that were competition-proof, and therefore a LOT harder to hack). Hmm, I recall we'd also seal a lot of the nastier stuff in epoxy, not least because, if it tried to go kaboom, the epoxy inside a metal container made sure it stayed contained and didn't destroy anything else.

OTOH, a dumb kid with a lollipop stuck to his ear could hack the !@#$!@#$ Modbus crap in the SCADA networks we had to hook the thing to. Which is the reason why ALL of the static exciter commissioning parameters were shipped LOCKED (no RO or RW access) and we actually implemented a high-level protocol to send commands to the exciter, which simply refused to do anything idiotic. It also meant we had to spend weeks inside very hot dams in the worst god-forsaken parts of Brazil to set the entire parameter set locally while doing "commissioning". That wasn't fun. All the stuff had 1+1 or 1+2 redundancy, but it was not the Shuttle: all PLCs ran the same control program and monitored the same sensors, so if the logic was bad, well, the safeties would end up engaging, the generator would do an emergency field dump (which is _SCARY AS ALL HECK_ in a small 1MW unit; I wouldn't be caught anywhere near the machine room when people tested the bigger ones), and GE Hydro would get a fine so large that it would likely get all of us fired. That means we tested everything like three thousand times with all possible input vectors and then some. Whatever happened, if it went bad it had to go bad _nicely_ and not in a way we could be blamed for.

The rest of the control systems (often made by our competitors) were similar in their paranoia, NOBODY trusted the SCADA network. The instrumentation sequencer (which does stuff like pressurizing the fluid bearings of the turbine and generator and keeps track of all pressure, temperature and vibration sensors) had 1+2 redundancy and was so absurdly locked down, that you only could give it two or three high-level commands ("start, stop, and emergency stop") and get back some status codes. The cute stuff for the SCADA displays was done with secondary sensors in parallel with the important ones.

But that was 15 years ago. We didn't even use ethernet much, it was all done using current loops (much better noise immunity), and sometimes optical fiber. I very much doubt the new blood is anywhere that paranoid, and I hear that they do NOT **force** the firmware guys to be inside the machine room and right next to the units when they are starting up, so they don't even have that realistic "fear of causing the death of himself or a colleague" that kept everyone honest.

Re:Call them whatever you want (1)

jeffmeden (135043) | more than 2 years ago | (#37527574)

It also meant we had to spend weeks inside very hot dams in the worst god-forsaken parts of Brazil to set the entire parameter set locally while doing "commissioning".

But did you get to visit Itaipu? Come on, that would be cool as hell.

Re:Call them whatever you want (1)

InsertWittyNameHere (1438813) | more than 2 years ago | (#37520090)

Sounds like DHS has hired Dilbert's Pointy Haired Boss.

Re:Call them whatever you want (0)

Anonymous Coward | more than 2 years ago | (#37520148)

I thought they were always run by the Pointy-Haired Boss.

Re:Call them whatever you want (0)

Anonymous Coward | more than 2 years ago | (#37520572)

You say that like he hasn't been on board all along...

Argh (4, Informative)

Anonymous Coward | more than 2 years ago | (#37520160)

We do SCADA systems in the States. We subscribe to several policies regarding SCADA networks:
1) DO NOT connect your SCADA network to the Internet
2) if you must connect for remote-access, use a patch cord that you ALWAYS unplug afterward.
3) DO NOT use your SCADA machines for desktop business purposes - especially on the Internet!
Argh, the crap that appears in the media. For example, you cannot "infect" a PLC. Why? They don't run Java (or script), or any language recognizable by the Internet community. They don't even run executables, in the sense that PCs do. Their programming is done in a specialized, proprietary language that requires a specialized IDE to manipulate. Write your own? Sure, if you have thousands upon thousands of man-hours handy. Do an open source IDE? Within 24 hours of posting your project somewhere, the manufacturer will be knocking at your door. PLCs are very, very proprietary, and the makers want them to stay that way.
Stuxnet infected a PC, causing it to change the signals it was sending to motor speed controllers, thus fouling up a process. Which is why you keep your SCADA PCs as far away from the Internet as you possibly can.

Re:Argh (3, Informative)

elrous0 (869638) | more than 2 years ago | (#37520206)

The Iranians had the same policies. Didn't stop Mossad or whoever from putting it on some Russian contractors' thumb drives and infecting them that way. Not so much of a worry unless you're a high value target. But the problem is that a lot of industrial systems ARE pretty high value targets.

Re:Argh (3, Interesting)

Zerth (26112) | more than 2 years ago | (#37520642)

For example, you cannot "infect" a PLC. Why? They don't run Java (or script), or any language recognizable by the Internet community. They don't even run executables, in the sense that PCs do. Their programming is done in a specialized, proprietary language that requires a specialized IDE to manipulate.

PLC IDEs are pirated in the industry all the time (several on TPB right now), so don't expect that to stop anyone outside the industry from writing malicious PLC code, let alone a disgruntled employee who has legitimate access. And anyone who is willing to decompile code taken from a decapped ROM is more than able to buy a broken PLC on ebay and fix it into a testbed.

Everyone else will just download the exploit from that guy.

Re:Argh (2, Interesting)

Anonymous Coward | more than 2 years ago | (#37520644)

Mod parent up. As someone working on the development of SCADA systems at a large company, I tell every customer I meet exactly that. We have it in the manual, and we take it into account in projects:
SCADA systems don't belong on the internet. Not the PLCs, not specialized controllers (motion etc.), not the HMI systems or data loggers either.

In addition to using a cable you can simply unplug, we would typically recommend a VPN router thing between the SCADA system and the internet (cheap or not so cheap, doesn't really matter); apart from protection from viruses etc. the protocols for data transfer aren't really encrypted in most cases and industrial espionage is a concern as well.

Some customers still ignore that though - I just checked a couple of IP addresses I received for remote access in recent months, and yes, there are some of our controls available on the internet (not even password protected in at least 2 cases). Luckily, they're not that easy a target ($$$ to get an exploit on the proprietary system) and at least these are not in critical industries. Still, the stupidity of some customers is astonishing, and it's certainly a safety issue (I'm talking about physical safety here, machine parts could be moved when they are not supposed to move).

But it gets even worse sometimes: Even if a system was technically safe, people need to care about safety. I've recently been at an automotive supplier where employees were surfing the internet with IE5 on a robot control (essentially an old Windows NT based PLC with remote maintenance access over the internet). They had a dozen viruses installed there, some porn popups opening every few minutes, and NO ONE CARED. And that thing was primarily responsible for moving a number of big freaking robots on a production line.
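Internet-reachable controls like the ones described above are trivially discoverable. A minimal sketch of the kind of probe involved (assuming Modbus/TCP's conventional port 502; the example address is a documentation placeholder), using nothing but a TCP connect:

```python
import socket

def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to host:port succeeds.

    Modbus/TCP conventionally listens on port 502, so a single
    successful connect() is enough to find an exposed controller:
    no exploit, no credentials, just a routable address.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# e.g. port_open("192.0.2.10", 502)   # 192.0.2.10 is a placeholder address
```

This is why "not even password protected" matters so much: discovery is the cheap part, and anything answering on that port is already most of the way to being a target.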

Re:Argh (1)

Renraku (518261) | more than 2 years ago | (#37522456)

Alright, let's give a scenario here. Imagine you're a producer of SCADA/PLC/etc bits. You develop and deploy a solution for a company and after a year of testing and making sure everything is running smoothly, you hand over the keys and do a tactical retreat to let them have their damn system. A lot of money changed hands.

Then the worst happens. They get hacked. Let's say it breaks down a multi-million dollar piece of equipment. Beyond repair. Has to be scrapped.

Obviously the fault depends on how it was hacked. If it was your vulnerability, programming flaw, something like that, it might be your fault. If they connected it to the internet with no protection, it would be their fault.

Assuming it was their fault, you will have a lot of bad publicity still. All most people would see is (YOUR COMPANY)'s industrial control system was hacked and caused millions of dollars of damage to a nearby factory! They won't know it was hacked because it was internet accessible and wasn't made to be secure against internet attacks.

The only solution is to design these systems knowing that they will be hacked, and to simply make it more difficult. Just like people buying shitty and bald used tires and putting them on their sports cars, people will connect naked SCADA systems to the internet.

Re:Argh (1)

inasity_rules (1110095) | more than 2 years ago | (#37524624)

While I very much agree with you, at some point it becomes an arms race. There is no such thing as an unhackable system. But you can minimize risks... We have some clients who run a country-wide network because their management "needs" the information at head office. While they go through a VPN, it isn't all that secure. Luckily we are a minor vendor and probably won't get dropped in it.

Re:Argh (1)

rev0lt (1950662) | more than 2 years ago | (#37520898)

Many PLC implementations are "open" in the sense that the protocol is open and documented. And while most high-power CNC machines aren't vulnerable to internet malware, that doesn't mean that ethernet/serial/CAN-exposed PLC devices aren't. I haven't worked with PLCs since the late nineties, but I wouldn't be surprised if little to nothing has changed since.

Re:Argh (1)

NFN_NLN (633283) | more than 2 years ago | (#37520920)

Funny, you can replace PLC with PS3 and the paragraph still makes sense, except the part about disconnecting from the internet.

"Argh, the crap that appears in the media. For example, you cannot "infect" a PS3. Why? They don't run Java (or script), or any language recognizable by the Internet community. They don't even run executables, in the sense that PCs do. Their programming is done in a specialized, proprietary language that requires a specialized IDE to manipulate. Write your own? Sure, if you have thousands upon thousands of man-hours handy. Do an open source IDE? Within 24 hours of posting your project somewhere, the manufacturer will be knocking at your door. PS3s are very, very proprietary, and the makers want them to stay that way."

Re:Argh (0)

Anonymous Coward | more than 2 years ago | (#37522144)

Sure, and everyone does all these things, every time, everywhere, with machine-like reliability.

We never have to deal with junior operators and personnel. We never have accidents or make poor decisions. We are never asked to do inadvisable things by management because they consider it an "acceptable tradeoff". Or that it is cheaper today (and tomorrow is another day and can be safely ignored).

Broken security through a broken set of assumed initial conditions, is an awfully good way of having a bad day.

Re:Argh (0)

Anonymous Coward | more than 2 years ago | (#37524316)

What nonsense. Some may be complex, but, for example, because of silly restrictions in their IDE (always uploading source code, thus wasting precious flash), within a few days I made a program that rewrote their "program archive", removing the source, and then uploaded it to the device.
By accidentally writing too much onto it, I noticed that it was even possible to overwrite parts of the firmware.
I never bothered to find out what CPU they ran or what kind of code the customer-modifiable part used, but that
1) wasn't necessary, and won't be for a lot of hacks
2) shouldn't be that hard as long as you have access to their official compiler/IDE.

Re:The wrong way to do it. (1)

thegarbz (1787294) | more than 2 years ago | (#37524388)

I wish you hadn't posted AC; then at least I would know which contractor to avoid. Quite frankly, your post smells of all the things that cause problems in our industry.

1) Airgapping isn't the be-all and end-all of security methods. Actually, it's typically the way people do security when they couldn't be bothered designing a secure system from the ground up. It's the theory that there are only two endpoints and no need for security that got us Modbus, a protocol with no authentication at all, which was then ported to TCP/IP verbatim. It is the theory that airgapping = security that ultimately and completely failed for Stuxnet's target.

2) When you remote access, there's normally a reason to remote access. When you can physically go and plug a cable in, you remove 99% of the need for remote access, which usually falls into the category of "this SCADA system is on a pipeline in another state". Secondly, if you give someone something as simple as a cable, how many people do you think will unplug it? How many companies successfully enforce "security policies"? I've seen it at every major oil company I've worked with. You've got a ban on USB sticks? Then why is there a USB stick in your pocket? You use passwords to prevent computer access? No, you don't need to tell me; someone wrote it on the computer case for me.

3) I agree with, but these systems are often required to be connected in some way to the business network. This is often at odds with 1) and should immediately put you on the path of careful network design.

As for PLCs being proprietary: yes, but who cares? In a targeted attack the code could be pre-written; in a denial-of-service attack, all it takes is for the link between the PLC and the rest of the world to be flooded to cause issues. While the PLCs are proprietary, the way in which they communicate with other systems is very much open, often with piss-poor authentication, and it often provides a vector for very real damage.
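To make the weak-authentication point concrete: per the public Modbus specification, a Modbus/TCP write request is a handful of bytes with no credential field anywhere in the frame. A hedged sketch (register address and value invented for illustration; not aimed at any real device):

```python
import struct

def modbus_write_single_register(transaction_id, unit_id, register, value):
    """Build a Modbus/TCP 'Write Single Register' (function 0x06) request.

    The MBAP header is: transaction id (2 bytes), protocol id (2 bytes,
    always 0), remaining length (2 bytes), unit id (1 byte). The PDU is
    the function code plus register address and value. There is no field
    for any kind of credential; whoever can reach TCP port 502 can write.
    """
    pdu = struct.pack(">BHH", 0x06, register, value)
    mbap = struct.pack(">HHHB", transaction_id, 0, len(pdu) + 1, unit_id)
    return mbap + pdu

frame = modbus_write_single_register(1, 1, 0x0001, 0)  # 12 bytes total
```

Twelve bytes, no session, no signature: the "open" communication layer is exactly where the exposure lives, regardless of how proprietary the PLC's own program format is.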

Re:Argh (1)

AmiMoJo (196126) | more than 2 years ago | (#37525032)

Actually you can brick many PLCs fairly easily without detailed programming knowledge, using tools downloadable from TPB. Configuration fuses are usually a good place to start, and even if they are not one-time programmable there is usually some combination of settings that causes them to become useless (or cook).

That is true of many microcontrollers too. Apple tried to prevent those kinds of attacks by using cryptography for firmware updates to battery controllers, but the key was accidentally revealed in a patch. Even that level of security is unusual, and a battery controller could easily cause the battery pack to catch fire.

Re:Argh (0)

Anonymous Coward | more than 2 years ago | (#37526436)

I know what you mean about the mfr knocking at your door... but to be candid...I don't get what you mean that I can't infect a PLC.

1) I damned sure can flash a lot of them. Maybe not on the scale you play with, but the small ones that only had maybe 12 A2Ds...

I'm sure you don't get your code from your mfr., but...some people can in specific applications.

2) Even if I can't 'virus' the firmware of the device, what protocol do yours talk? Ours talked Modbus [http://www.modbus.org/specs.php]

While admittedly, half the fuckers out there have their own broken implementation of it... it really isn't all that bad to get shit working with it. To break something /badly/ instead of just breaking it might be bad. But simply writing a 0 to every register would fuck things up good and fast--and could be done from any computer that communicates to it.

Don't use modbus? Let me ask you. ARE YOU SURE? A lot of the "PLC Softs" out there have it available for monitoring tools, even if they have their own vendor specific protocol.

And last but not least... two years ago, I encountered a vendor that had hardware that ran java bytecode natively. *THAT* would be a perfect virus platform...

“Design flaws” is just "Bugs" squared. (0)

Anonymous Coward | more than 2 years ago | (#37520634)

When a bug is a point or a sphere, then a design flaw is a line or even a plane (each with a certain thickness) in the multi-dimensional space that is a program.
Hence it is a whole new level of bad.

In my personal system, a design flaw counts for 100-1000 bugs, depending on the project size.

But you're right: nobody who has any say gives a fuck about the bureaucrats. Only OCDers (which they also are), bureaucrats and worthless pundits do. But nobody listens to them.

Re:Call them whatever you want (1)

aaarrrgggh (9205) | more than 2 years ago | (#37520654)

It all depends on whether you understand the design assumptions of the equipment, and how you establish the point of trust. You can build a secure network out of insecure components; it is just infinitely more complicated than building a network of secure components. We end up with a bunch of firewalls on RS-485 links that control device-level access, eliminate distributed password management, and need complicated rules for data access set up. Once someone gets to the RS-485 layer, they are assumed to be trusted. As long as each link connects only the firewall and the device, it isn't that bad.

Dealing with the control workstation on a system that is constantly changing is very difficult. At that point, it comes down to prioritizing maintainability and security.
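The per-link rules described above can be modeled, purely illustratively, as a default-deny allow-list keyed by device and function code (the link names, device names, and codes here are all invented):

```python
# Hypothetical per-link access rules: the firewall on each RS-485 link
# forwards only the (device, function_code) pairs listed for that link.
RULES = {
    "link-1": {("plc-7", 0x03)},                    # read holding registers only
    "link-2": {("rtu-2", 0x03), ("rtu-2", 0x06)},   # reads plus single-register writes
}

def permitted(link, device, function_code):
    """Default deny: forward only requests explicitly listed for this link."""
    return (device, function_code) in RULES.get(link, set())
```

The point of the one-firewall-one-device topology is visible even in this toy: a request arriving on the wrong link, or naming the wrong function code, simply has no matching rule.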

Re:Call them whatever you want (0)

Anonymous Coward | more than 2 years ago | (#37524312)

I am still waiting for the moment they start calling them "features"

Too Big To #FAIL ? (0)

Anonymous Coward | more than 2 years ago | (#37519964)

We've known for a while that some businesses are too big to fail; now we know that some vertical markets are too big to #FAIL.

Some background - 747s and online SCADA systems (5, Interesting)

djkitsch (576853) | more than 2 years ago | (#37520040)

Some extra info popped up just a few days ago, posted by a SCADA consultant. It's slightly terrifying, though someone with more SCADA experience than I have would have to verify its accuracy:

For those who do not know, 747s are big flying Unix hosts. At the time, the engine management system on this particular airline was Solaris-based. The patching was well behind, and they used telnet, as SSH broke the menus and the budget did not extend to fixing this. The engineers could actually access the engine management system of a 747 en route. If issues are noted, they can re-tune the engine in the air.

The issue here is that all that separated the engine control systems and the open network was NAT-based filters. There were (and as far as I know this is still true today) no extrusion controls. They filter incoming traffic, but all outgoing traffic is allowed. For those who engage in pen testing and know what a shoveled shell is... I need not say more.

More here: https://www.infosecisland.com/blogview/16696-FACT-CHECK-SCADA-Systems-Are-Online-Now.html [infosecisland.com]
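The missing "extrusion controls" are simply egress rules. With none, any compromised host inside the NAT can open an outbound connection, which is all a connect-back ("shoveled") shell needs. A toy model of the two policies (port numbers invented for illustration):

```python
def egress_allowed(dest_port, allowlist=None):
    """Model a perimeter egress filter.

    allowlist=None models the setup described in the post: incoming
    traffic is filtered, but every outbound connection is permitted,
    so a connect-back shell sails out through the NAT unimpeded. An
    explicit allowlist drops the connect-back at the boundary instead.
    """
    if allowlist is None:   # no extrusion controls at all
        return True
    return dest_port in allowlist
```

With no policy, an attacker's listener on an arbitrary port is reachable from inside; with a short allowlist of legitimate services (say, {53, 123}), the same outbound connection is refused.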

Re:Some background - 747s and online SCADA systems (2)

LWATCDR (28044) | more than 2 years ago | (#37520226)

Maybe you should not believe everything that you read.
"Nearly all SCADA systems are online. The addition of a simple NAT device is NOT a control. Most of these systems are horribly patched and some run DOS, Win 95, Win 98 and even old Unixs. Some are on outdated versions of VMS. One I know of is on a Cray and another is on a PDP-11."

Ummm, a SCADA control on a Cray? Really? Where?
A PDP-11, maybe, but a Cray?
Also, the system you mentioned cannot be changed while in flight. Maybe it could be bypassed, but then maybe not. The system is not a flight system; in flight it just records the information, it cannot change the settings. Makes good press, but not really the problem people think it is.

The real source of all our problems is that all our systems are now online. Even something as simple as a picture viewer needs to be checked for security issues today. SCADA systems used to depend on physical security; they were air-gapped. The problem is that we now use COTS hardware, so it is so easy to bridge networks that it happens all too often.
About the only network I would bet on not being bridged would be on a nuclear submarine at depth, unless the ELF system is on the network, but man, that would be one slow hack.

Re:Some background - 747s and online SCADA systems (4, Informative)

Jaktar (975138) | more than 2 years ago | (#37520424)

I can only speak for US Navy submarines. There are no connections from any reactor systems to any network of any kind.

Re:Some background - 747s and online SCADA systems (1)

LWATCDR (28044) | more than 2 years ago | (#37520520)

I was thinking of the other networks. I know that the Virginia class uses a COTS network for a lot of systems. I was using that as an example because, let's face it, when you're down no one is logging into you at all. As you know, subs at depth are pretty cut off, well, SSBNs anyway. They can get messages, but only at a very slow rate.

Re:Some background - 747s and online SCADA systems (1)

Svartalf (2997) | more than 2 years ago | (#37520608)

The only concern there would be that you've got some trojan that was snuck into the development code. If there are any launch or reactor controls that might get the low-speed comms, you could still do a remote exploit that way.

Re:Some background - 747s and online SCADA systems (1)

Svartalf (2997) | more than 2 years ago | (#37520590)

Heh... I'd dearly hope that the electric boats had air-gapped control systems... :-D

Re:Some background - 747s and online SCADA systems (1, Insightful)

Runaway1956 (1322357) | more than 2 years ago | (#37520604)

And, that is the only sensible approach to take. If the world weren't filled with cheap bastards posing as CEOs and economics experts, there would be a human hand at all critical controls, nationwide. The only networking necessary would be the sound-powered phone on the operator's head.

Re:Some background - 747s and online SCADA systems (1)

adolf (21054) | more than 2 years ago | (#37523226)

I can only speak for US Navy Submarines. There are no connections to any reactor systems to any network of any kind.

So the reactor systems are operated by people manually turning valves and pulling levers?

That must be a steampunk's idea of heaven.

(The above is written with a firm dose of sarcasm. While I'm reasonably sure you meant something very different, "any network of any kind" is literally so broad that it might be construed to include even a mechanical linkage of moderate complexity.)

Re:Some background - 747s and online SCADA systems (0)

Anonymous Coward | more than 2 years ago | (#37521180)

I think the most

Re:Some background - 747s and online SCADA systems (0)

Anonymous Coward | more than 2 years ago | (#37532590)

PLCs, or controllers as they are usually called nowadays, are everywhere. You will find them in mining, heavy industry, manufacturing, food production, energy production, oil/gas/petrochem/refining/pipelines, pharmaceutical plants, chemical production, water/waste water treatment, airport baggage handling lines....you name it. If there is a push-button that makes stuff happen "automagically", chances are there is a controller somewhere. They are so embedded into our life that most people don't even know that they are there. They are the little man behind the curtain pulling all the levers in our modern society.

Controller/PLC security has been neglected for decades, and Stuxnet made a lot of people sit up and take notice (some would say not enough; I guess we need a real catastrophe to have people wake up and smell the coffee). The hardware is changing for the better, but it will not be instantaneous: PLCs have a product lifespan of typically 15-20 years before they are considered obsolete. Vendors also do not just rush product to market. There is no product refresh every few years like there is in IT, and there needs to be a compelling reason. Those things are fraking expensive, and that is not taking into account the engineering, programming, labor, etc. involved. Replacing 100 desktop computers? Trivial. Replacing 100 PLCs? That's a different ballgame entirely.

PLC/Controller security will be fragile for a very, very long time as a consequence....

I don't care about SCADA. Vulnerabilities, I do. (3, Insightful)

AtariDatacenter (31657) | more than 2 years ago | (#37520068)

SCADA? I don't care about, not directly. But the problem is that once the government says, "These aren't vulnerabilities or security holes; these are design issues," you've set the example, and other software vendors are going to follow.

Example: "The denial-of-service attack against your application is not a security vulnerability; it is just a design issue that everything locks up for a while if it gets an incoming packet and tries to resolve the IP address against its authoritative DNS server while that DNS server is offline. We only do security fixes on old products / old releases. Sorry."

"Design issue, not a security vulnerability" is not a distinction you want easily drawn. Others will follow a government example if it is an easy out.
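The lockup in the example above comes from doing a blocking DNS lookup on the packet-handling path. One plausible mitigation (a sketch, not anyone's actual fix) is to bound the lookup with a timeout so an offline DNS server degrades one feature instead of freezing the application:

```python
import socket
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutureTimeout

_resolver = ThreadPoolExecutor(max_workers=4)

def resolve_with_timeout(name, timeout=0.5):
    """Resolve a hostname, giving up after `timeout` seconds.

    A bare gethostbyname() can block for many seconds while the
    authoritative server is offline (the "design issue" described
    above). Bounding it keeps the handling path responsive; callers
    get None instead of a stalled application.
    """
    future = _resolver.submit(socket.gethostbyname, name)
    try:
        return future.result(timeout=timeout)
    except (FutureTimeout, OSError):
        return None
```

Whether one calls the unbounded version a bug or a design issue, the fix is the same, which is rather the point being argued here.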

Re:I don't care about SCADA. Vulnerabilities, I do (2, Interesting)

Anonymous Coward | more than 2 years ago | (#37520290)

Trust me, almost everything you have come to depend on in terms of outside resources or outside facilities, directly or indirectly, involves SCADA. Almost. Intelligent cars use unencrypted, insecure Ethernet connections, often with some form of insecure wireless connection in there somewhere. That's not SCADA, merely designed just as badly.

Society is riddled with interdependencies. If you thought Red Hat packages could put you through dependency hell, you've not looked closely at the systems in the real world. Those are infinitely worse, frequently unmaintained, and it's a minor miracle that civilization hasn't been put back to the Stone Age. Yet. Unless some serious effort is put in, the effort of keeping semi-decayed interlocking systems even vaguely in operation will become infeasible. Left as-is, it WILL eventually become more practical to hard-reset civilization than to fix the cumulative errors.

Finally, "design issues" ARE bugs. They may be bugs in the design, but they're still bugs. It's not that the distinction should not be easily drawn, it should never be drawn at all. A bug is a bug is a bug. If a compiler messes up your code, it may not be a bug in your code but it's still a bug. A compiler bug. If a CPU fails to execute code correctly, it may not be a software bug but it's still a bug. A silicon bug. In this case, we have a design bug, but it is STILL a bug.

Re:I don't care about SCADA. Vulnerabilities, I do (2)

CobaltBlueDW (899284) | more than 2 years ago | (#37520300)

Personally, I think a design flaw is a much more damning claim to make about your product than a bug report. A bug is a low-level implementation issue/oversight. A design flaw means your product is rubbish from the ground up. I would MUCH rather someone say my software had a bug than that there were design issues. I doubt this is going to be a trend that catches on. The only way I think it could possibly help a company to make such a damning claim would be if it took the words vulnerability or security off the table, which have a much worse connotation. --Although design flaws create vulnerabilities and security concerns, so even that doesn't seem likely, IMHO.

Re:I don't care about SCADA. Vulnerabilities, I do (0)

Anonymous Coward | more than 2 years ago | (#37522258)

Yes, but I think the concern here is that the law will -compel- you to fix security vulnerabilities, but allow the 'free market' to deal with 'design flaws'. And I'm sure that will work so well when a customer has so much invested in a particular system that they are basically locked in.

Re:I don't care about SCADA. Vulnerabilities, I do (1)

maxume (22995) | more than 2 years ago | (#37520472)

Right, because no software vendors ship software with disclaimers to the effect of "This software is unsuitable for any purpose".

Oh wait, they all do.

Microsoft doesn't patch security vulnerabilities because they are unable to pass them off as design flaws without the help of government; they patch security vulnerabilities because they know they need to, or they will face even greater user erosion.

Re:I don't care about SCADA. Vulnerabilities, I do (0)

Anonymous Coward | more than 2 years ago | (#37520486)

SCADA? you should care about it. These systems run all sorts of critical items.

In other words -Too hard -don't want to think... (2)

hguorbray (967940) | more than 2 years ago | (#37520580)

We'd rather get back to feeling up 5-year-olds and people's Afros than address infrastructure and industries that are highly vulnerable to malicious hacking that could affect the safety and well-being of thousands.

They're so paranoid about 3 oz of liquid -- what about industrial controls at petrochemical or other large chemical plants, where thousands of gallons of chemicals could be blown up or caused to release hazardous materials?

I'm just sayin'

Re:I don't care about SCADA. Vulnerabilities, I do (1)

thegarbz (1787294) | more than 2 years ago | (#37524402)

Unfortunately, they are right. Much of the industry is based on standards and protocols which were created by one vendor and adopted by others long before the internet reared its ugly head. Many of the fundamental security issues in SCADA systems are failures of design. They aren't bugs because they work 100% as intended.

When you take protocols designed to communicate between physically secure devices and open them up on a network to PCs that are exposed to the outside world (be it via USB stick or network cable), hardcoded passwords and unauthenticated protocols are a fundamental design flaw, not a bug.
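A toy illustration of how "works as intended" and "insecure" coincide: the frame format below is invented for this sketch (loosely modeled on Modbus-style register writes, not any real vendor's protocol). There is no field where a credential could go, so the handler has nothing to check:

```python
import struct

# Hypothetical frame: unit id, function code, register address, value -- and
# nothing else. Any client that can reach the port can write any register;
# authentication has no place in the frame format. That's a design property,
# not a bug: the parser below is "correct" by its own specification.

registers = {}

def parse_write_request(frame: bytes):
    unit, func, addr, value = struct.unpack(">BBHH", frame)
    if func != 0x06:  # "write single register"
        raise ValueError("unsupported function")
    return {"unit": unit, "register": addr, "value": value}

def handle(frame: bytes):
    req = parse_write_request(frame)       # no credentials to verify: the
    registers[req["register"]] = req["value"]  # protocol carries none by design
    return req

print(handle(struct.pack(">BBHH", 1, 0x06, 4001, 1500)))
```

Patching such a system can only harden the implementation around the protocol; making the write itself authenticated means changing the frame format, i.e. the design.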

A Hackers dream (1)

Anonymous Coward | more than 2 years ago | (#37520130)

SCADA systems and PLCs are a hacker's dream.

It's equipment using protocols at least 30 years old, designed with no security in mind at all. Furthermore, the equipment cannot be turned off or patched, or else your factory and/or safety system will no longer work.

And the worst part is that after setting everything up, you just leave it be once it finally works...

I agree that they are design flaws and not bugs. But the industry's only mentality is to make it work; there is no security aspect anywhere in the process.

Buyers are not requesting it and vendors are not delivering devices with security in mind.

zero security in PLC (1)

Lead Butthead (321013) | more than 2 years ago | (#37520456)

That's because PLCs weren't supposed to be networked; the core code was never designed with security in mind, because the devices are supposed to be programmed, then parked somewhere and never messed with again.

Re:zero security in PLC (1)

AK Marc (707885) | more than 2 years ago | (#37521190)

And it's a critical design flaw to assume, in today's age, that a large system of networked devices will *never* be connected to any network, ever.

design flaw by modern standard (1)

Lead Butthead (321013) | more than 2 years ago | (#37521454)

Chances are the code is recycled from old products into new ones, ported forward with minute tweaks for the current hardware platform, since management can't see the value of starting from scratch; if anything, starting anew is considered a liability.

Re:design flaw by modern standard (1)

AK Marc (707885) | more than 2 years ago | (#37522352)

Which is a fundamental problem. Whether you want to call it a design problem or a business problem, it isn't a "bug" that managers sat around and deliberately decided to make something they knew was incompatible with the expected use of that product.

The problem in a nutshell (2)

dkleinsc (563838) | more than 2 years ago | (#37520158)

Making code secure is expensive. When these systems were designed, they were not going to be connected to any outside system, and thus were not designed securely because in order to do anything really bad you'd need to physically access the machine, which meant getting past security guards, cameras, etc without anybody noticing. Nobody could justify the expense of doing things right the first time.

Then somebody with no technical background comes along and says "Why can't we manage this system from our office desktops?"

Re:The problem in a nutshell (0)

Anonymous Coward | more than 2 years ago | (#37520310)

Then somebody with no technical background comes along and says "Why can't we manage this system from our office desktops?"

I've seen more than my share of people with a 'technical background' who do completely stupid shit when it comes to security, all in the name of ease of use.

Power plants and other such critical installations simply need to start hiring BOFHs.

Re:The problem in a nutshell (1)

PCM2 (4486) | more than 2 years ago | (#37520686)

Power plants and other such critical installations simply need to start hiring BOFHs.

I think most do. I once met a guy who worked on software for nuclear power plants. He didn't give me many details (probably couldn't and I probably wouldn't have really understood him if he did) but he said the work was important enough that he could potentially screw something up, bad. I think he may have been attached to the Navy or maybe that was just his background; I forget, as it was many years ago now. Anyway, the point is that he made a lot of money and he only really "worked" a few weeks out of the year. No moonlighting; he did one thing and that one thing was all he ever did. Not to mention, I highly doubt anyone would be able to pass all the background checks and psychological exams and gain the security clearances necessary for such a job and not take the work pretty deadly seriously. I suspect, however, that turnover is a problem; the guy I spoke to had an exit plan in mind and I highly doubt most people who gain the necessary experience really plan to make a career in such a stressful field.

Re:The problem in a nutshell (1)

AK Marc (707885) | more than 2 years ago | (#37521204)

Many nuclear operators in the civilian world are ex-submariners who served as nuclear engineers on Navy submarines, where they received their nuclear training. The Navy has its share of accidents, but they are mostly out at sea and publicized only in exactly the manner the Navy wishes.

Physical access is not as hard as it should be (1)

dbIII (701233) | more than 2 years ago | (#37522278)

I walked into three power plants and an oil refinery I wasn't authorised to get into just by wearing the right coloured overalls. Of course to get authorisation to be allowed in I had to get inside first due to very weird and stupid rules.

also how much code is tacked on to older code (2)

Joe_Dragon (2206452) | more than 2 years ago | (#37520368)

Also, how much code is tacked onto older code, making lots of hacks and other security holes more likely?

Now, who wants to pay to rewrite the system from the ground up to make it secure, or do you want to do it cheap and just patch over the bad design?

Re:The problem in a nutshell (1)

AmiMoJo (196126) | more than 2 years ago | (#37525114)

Then somebody with no technical background comes along and says "Why can't we manage this system from our office desktops?"

It's more like industrial processes moved on, people realised they could increase yields or make something new by having more advanced control systems and so asked for a solution. Not necessarily from the people who supplied the SCADA system either. It came down to a choice between replacing the whole system with one designed for that type of networking and control or just tacking it on to the existing one, and simple economics combined with some vague assurances about physical security won.

They have a different dictionary that I do (1)

Hentes (2461350) | more than 2 years ago | (#37520374)

I use 'design flaw' for bugs that can't be corrected without rewriting the whole codebase. Software with design flaws isn't called 'bug-free', but 'defective by design'.

This makes much more sense, if you're the company. (0)

Anonymous Coward | more than 2 years ago | (#37520390)

This is the next logical step in addressing these problems. It was inevitable. You need to look at it from the company's point of view.

Vulnerability - "Oh, I'm so sorry, we will fix that problem immediately. Update all your controllers ASAP when we post the fix next week."

Design issue - "Oh, yeah, the old 5-R's were designed that way. Would you like to upgrade to the 5-T's? They'll be available next quarter and are only $100 more each than the 5-R's, but don't have that problem. Cheap insurance for your plant, really.
"Well, no they aren't compatible with your old control server, you'll need a new one. No, we won't buy back those 5-R's you bought last year, I don't even know where to sell them, they're just not in demand now.
"Remember, we guarantee to fix any security vulnerability with the 5-T's.
"Just like we did with the 5-R's."

Out sourcing (0)

Anonymous Coward | more than 2 years ago | (#37520408)

Lowest bidders don't care about bug/vuln exploits. We are trusting people in other countries to do security work. Bring it in-house and give your IT team the power to say hell no when developers do stupid stuff.

Too big (0)

Anonymous Coward | more than 2 years ago | (#37520536)

Too big to be called bugs but apparently small enough to be brushed under the mat.

WTF, DHS is now cyber-security? (2)

Runaway1956 (1322357) | more than 2 years ago | (#37520548)

When I look at DHS, I can't find a single area in which they are competent. They can't seal the border, they can't ensure that terrorists are denied entry to our aircraft, they can't intercept a terrorist. What in hell CAN they do? Suddenly, they are in the business of issuing cyber security warnings?

The one and only thing that they MIGHT be able to do correctly, is to tell business to observe best practice advice from the professionals. Beyond that - I expect nothing.

Oh yeah, if they can grasp the concept, they might push the idea of strict air gapping.

Re:WTF, DHS is now cyber-security? (1)

Anonymous Coward | more than 2 years ago | (#37521156)

I wouldn't want DHS to be 100% successful at the "can'ts" in your list. 100% success means sealing the borders and shutting down the airlines. I would want to be able to leave the jail of a nation you are advocating for. We fairly quickly wouldn't have a very large economy if we had no trade after closing the borders. I don't agree with everything DHS does, but usually there is some kernel of logic to the solutions they use. Too many programs are there to make us feel good, not fundamentally improve security.

From what I remember of Stuxnet, Iran had strict air gaps. The problem was that Stuxnet used multiple attack vectors, and with humans walking the code between networks on thumb drives, it was able to spread. Apply some common sense: if a system is complex enough to have a control system, it needs to be reconfigured occasionally. That means it needs to get the configuration from somewhere else, and that configuration transfer is a potential attack vector.

An air gap is one critical line of defense, but it should not be the only one. Passwords should be changed from the defaults, insecure services should not be used, patches should be applied, systems should be isolated where possible, and systems shouldn't run insecure OSes.
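The "change default passwords" item on that list is the easiest to check automatically. A hypothetical audit sketch -- the device list and the default-credential table here are made up for illustration, not taken from any real vendor:

```python
# Flag devices still configured with vendor default credentials.
KNOWN_DEFAULTS = {
    ("admin", "admin"),
    ("admin", "password"),
    ("operator", "operator"),
}

devices = [
    {"host": "plc-01", "user": "admin", "password": "admin"},
    {"host": "hmi-02", "user": "scada", "password": "v9!kQ2m"},
]

def audit_defaults(devices):
    """Return hosts whose configured credentials match a known default pair."""
    return [d["host"] for d in devices
            if (d["user"], d["password"]) in KNOWN_DEFAULTS]

print(audit_defaults(devices))  # a non-empty list means work to do
```

In a real plant the credential inventory would come from a configuration database rather than a literal list, but the check itself is this simple.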

DHS has people who understand security, whether or not management allows that to be communicated out is a different issue.

Re:WTF, DHS is now cyber-security? (1)

Runaway1956 (1322357) | more than 2 years ago | (#37522552)

The point is - we don't even air gap our critical stuff. Joe Sixpack can minimize his porn video over at meattube, log into his control systems, make adjustments, then maximize his meat video again. Iran is the lesson to be learned, but we don't even attempt to learn from it.

Now, if we properly air gapped all of our infrastructure, then prohibited any USB media, prohibited any floppies (where applicable), AND prohibited all CD/DVD other than official copies issued by competent authority - then we could say that we learned something from Stuxnet.

Iran may have strictly observed air gapping - but USB infections aren't new, now are they? They failed, simple as that!

Red Rover, Red Rover.... (0)

Anonymous Coward | more than 2 years ago | (#37524438)

And of course there's absolutely no chance that Iran could reverse engineer that nice little Stuxnet and send it back our way. They could never be that clever, could they? Red Rover, Red Rover....

Re:WTF, DHS is now cyber-security? (1)

dbIII (701233) | more than 2 years ago | (#37522302)

They can shovel money around faster than anybody else - it's now just a big pointless welfare operation for those that signed on and too big to be killed off by anyone that wants to continue a political career.

They DID warn businesses to abandon Windows (TM) (1)

Ungrounded Lightning (62228) | more than 2 years ago | (#37522870)

When I look at DHS, I can't find a single area in which they are competent. They can't ... . What in hell CAN they do? Suddenly, they are in the business of issuing cyber security warnings?

They DID, a few years ago, issue a warning to businesses to migrate off Windows and other Microsoft products, due to their security flaws and the resulting vulnerability of the US private-sector infrastructure to attack.

Of course they managed to hide this warning under a bushel rather than pressure the executives to actually HEED it. (Did YOU hear about it before now? Even on Slashdot?)

Re:WTF, DHS is now cyber-security? (1)

kermidge (2221646) | more than 2 years ago | (#37524122)

DHS is marginally yet effectively competent in advancing Big Brother. While not completely explicit in their charter, I suspect it's the mission. Too much of the rest is standard your-government-loves-you bureaucratic bullshit: building fiefdoms, rice bowls, revolving doors; an employment program for intellectual and moral drones; a new repository of, and for, those with nothing better to do in life than increase their power over others.

Of course it's also possible that at this late hour my viewpoint is skewed thus my cynicism is showing.

Still a bug (0)

Anonymous Coward | more than 2 years ago | (#37520646)

A defect is a defect. It doesn't matter if code doesn't operate as expected or if the design was faulty. Bugs can occur anywhere and any time in a system, even in the earliest phases of conception. You can try to call a spade a shovel, but it's still a spade.

But, but, but... (0)

Anonymous Coward | more than 2 years ago | (#37520712)

...didn't Iran say this "bug" really didn't do anything? Why are people still acting as if this were a problem?

Why is DHS telling us this? (1)

Baulk (1992288) | more than 2 years ago | (#37520714)

Dear DHS Secretary Janet Napolitano: Please resign. From recent events, it is painfully clear that you do not understand that one of the most fundamental aspects of security is not revealing your methods to the public. This includes telling us whether or not you plan on telling the whole truth about security flaws in the future. The announcement implies two things: 1. Your future comments might not be entirely truthful. 2. DHS's previous comments about cyber security in the past were 100% truthful, to the best of its knowledge. The knowledge of either one of the above two items is just another tool for the enemy. They thank you for giving it to them so willfully.

Re:Why is DHS telling us this? (1)

dotancohen (1015143) | more than 2 years ago | (#37524424)

From recent events, it is painfully clear that you do not understand that one of the most fundamental aspects of security is not revealing your methods to the public.

Apparently Linus Torvalds does not understand that either.

Correct me if I'm wrong... (0)

phaedrus5001 (1992314) | more than 2 years ago | (#37520772)

Aren't all bugs just "design flaws"?

Re:Correct me if I'm wrong... (3, Insightful)

AK Marc (707885) | more than 2 years ago | (#37521274)

If you design a car with a gas tank that dislodges from the filler neck in a moderate crash, spilling fuel and turning a survivable minor-injury crash into a life-threatening incident, then you designed it wrong. If you purposefully design it to keep the filler neck attached in all crashes, but a part sourced for it did not meet specifications, resulting in inadvertent detachment, then you have a "bug" that was most certainly not a design flaw, but a build (coding) flaw.

One is a purposeful choice to make an inferior product to save time/money. The other is a properly designed product with an unintentional flaw. Sadly, deliberate negligence is tolerated (and seemingly encouraged), while unintended flaws are punished more harshly. But that's government security for you. Appearance is much more important than effect.

Re:Correct me if I'm wrong... (1)

tqk (413719) | more than 2 years ago | (#37523616)

Aren't all bugs just "design flaws"?

Vince?!? WTF are you doing here?!?

Okay, maybe you're not him, but you sound like him (*Doofus* manager I once worked with). So I feel somewhat obligated to educate you.

This Waterfall_method [wikipedia.org] says it was first alluded to in 1956 (two years after I was born, fwiw). I only learned about it in the late '80s in programmer school. "Analysis, Design, Implementation, Testing, Maintenance." You're supposed to iterate between them (at any phase you're in when you find a flaw or learn something new, you go back and re-do whatever relevant phases need to be re-done, then continue).

This's all deprecated now. More modern buzzwords have since been invented, (theoretically) obsoleting it.

So, no, not all bugs are "just design flaws". Sometimes the architect designed a perfect building but it was built by dorks who didn't know what they were doing. Sometimes the architect was a dork and the builders did the best they could with a flawed design. Getting the picture? !@#$ups can happen anywhere in the chain of events, and you can't blame the designer (or implementer, or tester, ...) for all of them.

Failure's a team effort.

Re:Correct me if I'm wrong... (0)

Anonymous Coward | more than 2 years ago | (#37524476)

You're right, he's wrong. Here's the perfect example of a bug introduced in a perfectly good design that killed people.

Hyatt Regency walkway collapse
https://secure.wikimedia.org/wikipedia/en/wiki/Hyatt_Regency_walkway_collapse

It's a security design flaw (1)

davidwr (791652) | more than 2 years ago | (#37520802)

There, see, you don't have to call it a bug.

It's still bad and customers still need to be told so they can work the design flaw into their operational plans.

Easy fix... (1)

ebunga (95613) | more than 2 years ago | (#37520848)

It's called an air gap firewall. Don't connect shit to the Internet that has no business being connected to the Internet. This means having strict policies in place, such as "connecting an uncertified wifi-capable laptop to the SCADA network shall result in the violator being shot, repeatedly, in the balls or other sensitive region."

Didn't help Iran. (1)

Ungrounded Lightning (62228) | more than 2 years ago | (#37522324)

It's called an air gap firewall. Don't connect shit to the Internet that has no business being connected to the Internet.

Not being connected to the Internet didn't keep Stuxnet out of Iran's centrifuge SCADA systems. It propagated as a sneakernet virus from consultants' laptops to the machines on the air-gapped network which controlled and monitored the PLCs.

Re:Didn't help Iran. (1)

bryan1945 (301828) | more than 2 years ago | (#37522492)

At least it would make things more difficult. While working on NYC's subways, I found a lot of their SCADA systems were networked into the local station and the overall control center. Which in general is OK, since as of 8 years ago there were no outside connections, but if someone could sneaker-in a virus.... there could be some issues.

Re:Didn't help Iran. (0)

Anonymous Coward | more than 2 years ago | (#37524270)

yeah, security policy against flash drives is a good idea.

Old Microsoft joke (1)

EEPROMS (889169) | more than 2 years ago | (#37521432)

Seems a fitting time to roll out the old Microsoft joke but with a twist

How many DHS analysts does it take to change a light bulb?
None, because DHS defines darkness as the new standard.

Actually... (0)

Anonymous Coward | more than 2 years ago | (#37521820)

...this is an interesting liability loophole.

Someone found an exploit for a bug. D@mn those haxx0rs!

But now, the reason why 10,000,000 people lost power wasn't a bug--it was a Design Flaw. That means (by definition) whoever designed the software did a bad job. That's 10,000,000 lawsuits right there.

It's considerably easier for someone to disclaim liability for a "bug" (which could be argued to be an "unforeseeable usage case" or otherwise unintentional) than for a "design flaw" (poor decision-making which someone "knew or should have known" needed to be thought through).

WTF? (0)

Anonymous Coward | more than 2 years ago | (#37521842)

IS THIS GUY SERIOUS???

(sorry)

Re:WTF? (0)

Anonymous Coward | more than 2 years ago | (#37524792)

Your ignorance is showing. Yes, he is serious, and for good reason. You might as well complain that your car will not work underwater. These systems weren't designed for hostile environments and, what is more, they cannot be. A real-time system cannot operate trouble-free in a hostile environment.

Most vulnerabilities are 'design issues' (1)

John Sokol (109591) | more than 2 years ago | (#37521938)

It is my belief that most vulnerabilities are 'design issues' and not just "security holes" that can be patched over.

I have been studying OS design for almost 20 years. I think most of these designs were fine for just trying to hack something together, but with everything interconnected now, they were simply never built for that.

I have an OS design I have been working on for the past 10 years, Amorphous OS, that is intended to solve almost every issue I've seen talked about.

Most issues come from having a common file system view for the whole OS. This becomes a place where malicious code can live, hide, and exploit.

But memory could be treated much better and more efficiently. The stack also needs to be better isolated, with data storage, instruction pointers, and code kept better separated.

None of this is new; it was talked about in the '60s and '70s, then it seems everyone forgot about it. So today it's coming back to bite us.

Don't blame the government (for once!) (0)

Anonymous Coward | more than 2 years ago | (#37525028)

The problem is really that most people care much more about features and their emotional well-being when making purchasing decisions. The consumer usually doesn't know or care about security, and so the companies that sell any kind of product generally don't either.

Siemens isn't alone there, and I really wouldn't put the blame on DHS; they have no reasonable way of enforcing change of that magnitude across the states. Think of how many companies wouldn't upgrade to the latest version of software even if Siemens and other PLC manufacturers spent the money required to harden their OS and do proper regression testing. It won't happen; it just isn't feasible. I don't really think most embedded systems will be realistically secured until we can find a way to make vulnerability testing cheaper. Maybe when quantum processors come along, and someone dreams up an algorithm to go with them that makes that sort of work orders of magnitude faster and more effective, then we may see industry-wide adoption. The only other way is 'evangelism', whereby people tell consumers to care about something, and that never works...

Hu? Wa? (1)

sgt scrub (869860) | more than 2 years ago | (#37528070)

Design flaws are not bugs. Hmmmm. Maybe if I say it the other way. Bugs are not design flaws. Nope. Still not making sense. Maybe I need to put on a suit and tie.

Nosense (0)

Anonymous Coward | more than 2 years ago | (#37624214)

It makes no sense to connect everything to the Internet. It is a must to evaluate whether it is necessary, secure, and valuable. Sometimes the cost is bigger than the revenue.
