
Is Analog the Fix For Cyber Terrorism?

Unknown Lamer posted about 6 months ago | from the security-through-obsolescence dept.

Security 245

chicksdaddy writes "The Security Ledger has picked up on an opinion piece by noted cyber terrorism and Stuxnet expert Ralph Langner (@langnergroup) who argues in a blog post that critical infrastructure owners should consider implementing what he calls 'analog hard stops' to cyber attacks. Langner cautions against the wholesale embrace of digital systems by stating the obvious: that 'every digital system has a vulnerability,' and that it's nearly impossible to guarantee that every potentially harmful vulnerability will be discovered during the design and testing phase of a digital ICS product. ... For example, many nuclear power plants still rely on what is considered 'outdated' analog reactor protection systems. While that is a concern (maintaining those systems and finding engineers to operate them is increasingly difficult), the analog protection systems have one big advantage over their digital successors: they are immune to cyber attacks.

Rather than bowing to the inevitability of the digital revolution, the U.S. Government (and others) could offer support for (or at least openness to) analog components as a backstop to advanced cyber attacks. That could create the financial incentive for aging systems to be maintained and for the engineering talent to run them to be nurtured, Langner suggests."
Or maybe you could isolate control systems from the Internet.


obviously BSG was right (5, Insightful)

mindpivot (3571411) | about 6 months ago | (#46513383)

the terrorists are like cylons and we need to disconnect all networked computers for humanity!!!

Re:obviously BSG was right (1)

Anonymous Coward | about 6 months ago | (#46513695)

the terrorists are like cylons and we need to disconnect all networked computers for humanity!!!

Analog solutions to perceived terrorism are only as good as the persons trusted for its implementation — 12 Monkeys.

Are the cylons running on analog engine ? (0)

Anonymous Coward | about 6 months ago | (#46513955)

If the cylons themselves are based on digital, they have the same vulnerabilities as any other digital lifeforms and are hackable.

sure, no problem (4, Informative)

davester666 (731373) | about 6 months ago | (#46513399)

>Or maybe you could isolate control systems from the Internet

said the person volunteering to get up at 3 am to go to the office to reset the a/c system.

Re:sure, no problem (2, Funny)

Anonymous Coward | about 6 months ago | (#46513409)

Don't worry, Bill Gates says a robot will take that guy's job soon enough.

Re:sure, no problem (1)

dimko (1166489) | about 6 months ago | (#46513707)

And how does this solve the "digital" issue? :P

Re:sure, no problem (0)

Anonymous Coward | about 6 months ago | (#46513421)

Job security! No off-shoring to India. Not sure if it can be done by an H1B, as I hope this would require a security clearance.

Re:sure, no problem (5, Insightful)

TWX (665546) | about 6 months ago | (#46513425)

said the person volunteering to get up at 3 am to go to the office to reset the a/c system.

Sounds to me like you need a better A/C system.

Or you need to not consider an HVAC system to be so critical that it can't be on the network. Or, perhaps you need to design the HVAC system to take only the simplest of input from Internet-connected machines through interfaces like RS-422, and to otherwise use its not-connected, internal network for actual major connectivity. And design it to fail-safe, where it doesn't shut off and leave the data center roasting if there's an erroneous input.

And anything that is monitored three shifts should not be Internet-connected if it's considered critical. After all, if it's monitored three shifts, then it shouldn't have to notify anyone offsite.
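The fail-safe input handling described above can be sketched in a few lines. This is a hypothetical illustration, not any real HVAC firmware: the command set (`STATUS`, `SETPOINT`) and the safe temperature band are invented, and the point is only that malformed or out-of-range input over the simple serial link is ignored rather than allowed to shut the system down.

```python
# Hypothetical sketch: the controller accepts only a tiny whitelisted
# command set over the serial link, and anything malformed leaves the
# system running (fail-safe) rather than shutting it off.

ALLOWED_COMMANDS = {"STATUS", "SETPOINT"}
SETPOINT_RANGE = (15.0, 30.0)  # assumed safe band, degrees C

def handle_line(line, state):
    """Parse one line from the RS-422 link; ignore anything suspect."""
    parts = line.strip().split()
    if not parts or parts[0] not in ALLOWED_COMMANDS:
        return "IGNORED"          # erroneous input: do nothing, keep cooling
    if parts[0] == "STATUS":
        return "SETPOINT %.1f" % state["setpoint"]
    if parts[0] == "SETPOINT" and len(parts) == 2:
        try:
            value = float(parts[1])
        except ValueError:
            return "IGNORED"      # non-numeric argument: ignore it
        lo, hi = SETPOINT_RANGE
        if lo <= value <= hi:     # accept requests inside the safe band only
            state["setpoint"] = value
            return "OK"
    return "IGNORED"
```

Note the failure mode: an attacker who reaches the link can at worst move the setpoint within the safe band, not command a shutdown.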

Re:sure, no problem (-1)

Anonymous Coward | about 6 months ago | (#46513525)

Networked does not imply internet connected. In the same way, if you are using electricity, it does not mean you need to be connected to the electric grid.

There is no reason to go analog IF people are not stupid.

Re:sure, no problem (2, Insightful)

Anonymous Coward | about 6 months ago | (#46513699)

Networked does not imply internet connected. In the same way, if you are using electricity, it does not mean you need to be connected to the electric grid.

There is no reason to go analog IF people are not stupid.

Unfortunately, we have plenty of examples that refute your premise. People ARE stupid, including the people who designed the highly vulnerable smart grid that most of the US is now using for power distribution.

Re:sure, no problem (3, Insightful)

CBravo (35450) | about 6 months ago | (#46513917)

And to make it even more simple: Everyone, including smart people, makes mistakes.

Re:sure, no problem (2, Informative)

Anonymous Coward | about 6 months ago | (#46513833)

Networked does not imply internet connected. In the same way, if you are using electricity, it does not mean you need to be connected to the electric grid. There is no reason to go analog IF people are not stupid.

You may want to be careful using words like "stupid". A reasonably intelligent person would recognize that a purely internal network without internet connectivity is still vulnerable. The internet is just one method of ingress. A malware payload could be introduced through physical media for example.

A lack of internet connectivity may make data theft more difficult; however, in an industrial control application, merely getting into the internal network and taking control of machinery is all that is necessary.

Re:sure, no problem (4, Interesting)

phantomfive (622387) | about 6 months ago | (#46513463)

said the person volunteering to get up at 3 am to go to the office to reset the a/c system.

I can't speak for everyone, but I would rather pay extra for someone to be willing to do that (or do it myself, it shouldn't be a common situation) before I connect important systems to the internet.

Having an air gap isn't a perfect solution, but it makes things a lot harder for attackers.

Re:sure, no problem (5, Interesting)

mlts (1038732) | about 6 months ago | (#46513489)

As a compromise, one can always do something similar to this:

1: Get two machines with a RS232 port. One will be the source, one the destination.

2: Cut the wire on the serial port cable so the destination machine has no ability to communicate with the source.

3: Have the source machine push data through the port, destination machine constantly monitor it and log it to a file.

4: Have a program on the destination machine parse the log and do the paging, etc. if a parameter goes out of bounds.

This won't work for high data rates, but it will sufficiently isolate the inner subsystem from the Internet while providing a way for data to get out in real time. Definitely not immune to physical attack, but it will go a long way toward stopping remote attacks, since there are no connections that can be made into the source machine's subnet.
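Step 4 above (parsing the log and alerting when a parameter goes out of bounds) might look something like this minimal sketch. The parameter names, limits, and the `name=value` record format are assumptions for illustration, not part of the scheme described:

```python
# Destination-machine side of the one-way serial link: check each
# logged record against fixed limits and produce an alert string
# (which a real system would hand to a pager) when something is off.

LIMITS = {"temp": (10.0, 80.0), "pressure": (0.5, 4.0)}  # invented bounds

def check_record(line):
    """Parse one 'name=value' log record; return an alert string or None."""
    try:
        name, raw = line.strip().split("=")
        value = float(raw)
    except ValueError:
        return "ALERT malformed record: %r" % line
    if name not in LIMITS:
        return "ALERT unknown parameter: %s" % name
    lo, hi = LIMITS[name]
    if not lo <= value <= hi:
        return "ALERT %s=%s out of bounds [%s, %s]" % (name, value, lo, hi)
    return None  # in bounds: nothing to page about
```

Since the destination never transmits, a compromise of this parser still cannot reach back into the source machine's subnet.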

Re:sure, no problem (4, Insightful)

phantomfive (622387) | about 6 months ago | (#46513523)

The main use case that causes problems with air gaps (AFAIK) is transferring files to the computer that's hooked up to the heavy machinery. People get tired of copying updates over USB, for example, and hook it up. Or they want to be able to reboot their air conditioner remotely.

And that is the use case that caused problems for Iran with Stuxnet. They had an air gap, but the attackers infected other computers in the area, got their payload onto a USB key, and when someone transferred files to the main target, it got infected. That is my understanding of how that situation went down. But once you start thinking along those lines, you start thinking of other attacks that might work.

Re:sure, no problem (1)

khasim (1285) | about 6 months ago | (#46513911)

That's part of a larger issue. People will ALWAYS get sloppy and lazy.

Part of the security system has to include a way to check the systems and to check the people.

Security is not an item in itself. It is only a reference point. You can have "better security" than you had at time X or you can have "worse security" than you had at time X (or the same).

Improving security is about reducing the number of people who can EFFECTIVELY attack you.

Once you've gotten that down to the minimum number then you increase the number of people REQUIRED to attack you.

If one guy on the other side of the world can crack you, that's pretty bad.

If that one guy has to be physically at your site, that's better.

If that one guy has to be physically at your site with two other guys providing overwatch and support, that's even better.

Iran needs to learn about superglue on USB ports.

Re:sure, no problem (1)

phantomfive (622387) | about 6 months ago | (#46513951)

Iran needs to learn about superglue on USB ports.

How do you suggest they copy files to the computers then? Type them in by hand?

Re:sure, no problem (2)

richlv (778496) | about 6 months ago | (#46513705)

i remember watching 'nikita' episode where they hacked a computer through its power connection and going "um, that's a bit stretching it..."

then, several years later, some proof of concept attack vector like that was demonstrated. assuming that experts in the field can do much more than the public knows, it might not have been that much of a stretch after all.

i would also imagine that attacks on analog systems have been polished quite a lot, given that they have been around longer. not that they could not be more secure - but thinking that they are immune might be a dangerous trap.

Re:sure, no problem (1)

Jeremi (14640) | about 6 months ago | (#46513727)

That's known as a data diode, and it's a great idea (and can be done at higher speeds than RS232, if necessary; e.g. you can do something similar with an Ethernet cable).

It does have one big limitation, though -- it won't let you control the system from off-site. If that's okay, then great, but often off-site control is something that you want to have, not just off-site monitoring.

Re:sure, no problem (1)

darkain (749283) | about 6 months ago | (#46513809)

>Or maybe you could isolate control systems from the Internet

Oh, you mean like all those systems Stuxnet infected?

http://en.wikipedia.org/wiki/S... [wikipedia.org]

Re:sure, no problem (1)

nateman1352 (971364) | about 6 months ago | (#46514251)

Or maybe you could isolate control systems from the Internet

You know that stuxnet explicitly targeted a uranium enrichment control system that was NOT Internet connected right?

Re:sure, no problem (4, Informative)

Technician (215283) | about 6 months ago | (#46514297)

A more common control with this type of critical limit is an elevator. The digital controls call the cars to the floors, open the doors, etc. Between the digital world and the electrical/mechanical world are control relays. Limit switches come in pairs. One you are used to: the elevator arrives at a floor and there is a pause while fine alignment levels the car with the current floor. The hard limit, on the other hand, such as exceeding the safe space below the bottom floor or past the top floor, interrupts power to the control for the power relays. One drops power to the motor and the other drops power to the brake pick solenoid. Brakes fail safe in an elevator; they need power to release.

Yeah, it is a pain to reset the elevator at 3 am with someone stuck inside, but that is better than a runaway elevator. And no, there is no software defeat for the hardware limit switches.
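The relay arrangement described above reduces to simple combinational logic: the hard limit switch sits in series with the relay coils, so no software command can energize the motor or release the brake once a hard limit opens the circuit. A toy model (signal names invented, not from any real elevator controller):

```python
# Toy model of series-wired hard limits: the coil is energized only if
# software commands RUN *and* no hard limit has opened the circuit.
# The brake needs power to release, so loss of power applies it (fail-safe).

def relay_outputs(software_run_cmd, hard_limit_tripped):
    """Return (motor_powered, brake_released) for the given inputs."""
    coil_energized = software_run_cmd and not hard_limit_tripped
    motor_powered = coil_energized
    brake_released = coil_energized  # brake holds whenever coil is dead
    return motor_powered, brake_released
```

Whatever the software does, a tripped hard limit forces both outputs off, which is exactly the "no software defeat" property.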

Because no analog system has (2, Insightful)

Anonymous Coward | about 6 months ago | (#46513407)

ever been compromised :) Physical kill switches, human operated, are not simply analog (one might argue they are digital at the switch level). Analog might be the wrong word, since analog systems have been repeatedly compromised (from Macrovision to phreaking boxes, etc, etc). Keep it off a communications network, even off local networks, if it is uber critical.

Re:Because no analog system has (-1)

gweihir (88907) | about 6 months ago | (#46513469)

Seems you have not even begun to understand what this is about.

Re:Because no analog system has (4, Insightful)

phantomfive (622387) | about 6 months ago | (#46513539)

I think his point is that anything that can be accessed remotely by a trusted party can also be accessed remotely by an attacker. The distinction between analog and digital is a red herring.

Maybe that wasn't his point, but it's still a good one. :)

Re:Because no analog system has (0)

gweihir (88907) | about 6 months ago | (#46513565)

No, it is not. If the remote analog access is by a dedicated wire (and that is what you do in analog), then the attacker has to have physical access to that wire. Come on, does nobody know basic EE anymore? No wonder all this insecurity and stupidity happens...

What this comment shows nicely is how routinely incompetent CS types are and how far they misunderstand the world.

Re:Because no analog system has (2)

mysidia (191772) | about 6 months ago | (#46513649)

No, it is not. If the remote analog access is by a dedicated wire (and that is what you do in analog), then the attacker has to have physical access to that wire.

Usually the "remote analog" access is through an analog circuit provided by a telecommunications company between two locations called an ISDN circuit.

If the locations are far enough apart, your so-called "dedicated wire" gets muxed and then transmitted over a digital trunk, which may be copper or optical, with a bunch of other "dedicated wires".

The communication is subject to possible attack (interception and insertion of false signals) at any point the line crosses, if compromised physically.

Or, theoretically, to remote attacks, if the Telco becomes compromised.

Re:Because no analog system has (1)

Jeremi (14640) | about 6 months ago | (#46513757)

Usually the "remote analog" access is through an analog circuit provided by a telecommunications company between two locations called an ISDN circuit.

What does the "D" in ISDN stand for?

Re:Because no analog system has (2)

erice (13380) | about 6 months ago | (#46513713)

No, it is not. If the remote analog access is by a dedicated wire (and that is what you do in analog), then the attacker has to have physical access to that wire

And that dedicated wire could control digital circuitry or even a conventional computer running software. So what is your point?

The only advantage of analog is that control methods are generally so limited that doing something stupid like sending a critical control signal over the Internet is not possible. However, the cost is very, very high, and it doesn't do anything that following a policy of never sending controls over the Internet would not do. Further, without such a policy, the security advantage is lost the first time someone gets the bright idea of inserting a repeater.

Re:Because no analog system has (1)

phantomfive (622387) | about 6 months ago | (#46513731)

No, it is not. If the remote analog access is by a dedicated wire (and that is what you do in analog), then the attacker has to have physical access to that wire. Come on, does nobody know basic EE anymore? No wonder all this insecurity and stupidity happens...

It's not clear you even understood the point before replying. Maybe you did, but your comment doesn't make that clear.

Don't associate your self-deluded angry mind ... (0)

Anonymous Coward | about 6 months ago | (#46513875)

No, it is not. If the remote analog access is by a dedicated wire (and that is what you do in analog), then the attacker has to have physical access to that wire. Come on, does nobody know basic EE anymore? No wonder all this insecurity and stupidity happens... What this comment shows nicely is how routinely incompetent CS types are and how far they misunderstand the world.

Please don't associate your self-deluded angry mind with EE; it reflects poorly on the bulk of EE types, who are more stable than you. If you understood the world, even a small amount, you might realize that physical access is no placebo. You might even realize that a man-in-the-middle attack predates CS by millennia. Analog systems have been compromised and will be compromised.

Re:Don't associate your self-deluded angry mind .. (0)

phantomfive (622387) | about 6 months ago | (#46513923)

physical access is no placebo

I literally have no idea what this means. I've read it five times and still don't know.

Typo: "requiring physical access" (0)

Anonymous Coward | about 6 months ago | (#46513975)

... then the attacker has to have physical access to that wire ...

[requiring] physical access is no placebo

I literally have no idea what this means. I've read it five times and still don't know.

Sorry, typo, should have been "requiring physical access".

analog control != airgap or dedicated line (1)

gl4ss (559668) | about 6 months ago | (#46513979)

so the article speaks of a dedicated line when it speaks of "analog"? i don't think so (without reading the article). it just speaks of analog protection systems, like an analog temp fuse on fire suppression water lines.

(an analog dedicated control line would be only as useful as both ends of the wire are secure... making it about as useful as a digital line only transmitting a simple protocol handled with good code at both ends)

real analog control and protection systems aren't programmable and so are less vulnerable to someone hacking the max RPM limit on some centrifuges etc, since the attacker would need to physically alter the control mechanisms/analog electronics to alter the rpm. obviously such systems are more demanding to operate too..

they are more expensive to do and more prone to faults though...

Re:Because no analog system has (0)

Anonymous Coward | about 6 months ago | (#46514023)

No, it is not. If the remote analog access is by a dedicated wire (and that is what you do in analog), then the attacker has to have physical access to that wire.......What this comment shows nicely is how incompetent CS types are routinely and how far they misunderstand the world.

And what your comment shows is that you need to go and read up about Electromagnetism [wikipedia.org] and maybe learn the basics of the world you live in. You may want to start with Electromagnetic Induction [wikipedia.org] .

Stuxnet (3, Informative)

scorp1us (235526) | about 6 months ago | (#46513417)

"Or maybe you could isolate control systems from the Internet."
Wasn't Stuxnet partially a sneakernet operation? I can't imagine Iran being so stupid as to connect secret centrifuges to the internet.

The only way to win is not to play.

Re:Stuxnet (3, Informative)

NixieBunny (859050) | about 6 months ago | (#46513473)

Yes, it was a USB flash drive with a firmware update.

I work on a telescope whose Siemens PLC is so old that it has a PROM in a 40 pin DIP package for firmware updates. Not that we've touched the firmware in 20 years. After all, it works. And it ought to work for another 20 years, as long as we replace the dried-out aluminum electrolytic capacitors regularly.

Re:Stuxnet (0)

Anonymous Coward | about 6 months ago | (#46513573)

And it ought to work for another 20 years, as long as we replace the dried-out aluminum electrolytic capacitors regularly.

Just don't be tempted to use tag tantalum capacitors for the replacements ... being an annoying source of parasitic oscillations n all.

Re:Stuxnet (0)

countach (534280) | about 6 months ago | (#46513587)

Yeah, the internet isn't the only way to hack something. And Siemens' patch for their CPUs? Who knows if it isn't an NSA effort to hack Iran's nuclear program? You wouldn't know, would you?

Challenge accepted (1)

mrmeval (662166) | about 6 months ago | (#46513429)

Digital, analog, trinary, HIKE! You won't save them without MIKE!

In other words children it's all the humans who're messing up your security chain.

You need better, faster, stronger, smarter people who have a driving need to make your security better from the floor sweep to the ablative meat.

Without it you're just asking for an ass raping.

Isolated? (0)

Anonymous Coward | about 6 months ago | (#46513443)

Didn't Stuxnet target centrifuge controllers that were (designed/thought to be) isolated from the internet?

Re:Isolated? (1)

Casandro (751346) | about 6 months ago | (#46513451)

Yes it did. It was transmitted via USB stick.

Iran (0)

Anonymous Coward | about 6 months ago | (#46513447)

Iran's centrifuge operation was physically isolated from the Internet. The Americans and/or Israelis broke through via a USB drive. They infected the machines of individuals related to the project and waited until somebody used a USB drive to transfer data. Oops.

If it's digital and not shut off, it can be hacked remotely.

This is very, very old (3, Insightful)

gweihir (88907) | about 6 months ago | (#46513453)

It is called self-secure systems. They have limiters, designed-in limitations and regulators in there that do not permit the systems to blow themselves up, and there is no bypass for them (except going there in person and starting to get physical). This paradigm is centuries old and taught in every halfway reasonable engineering curriculum. That this even needs to be brought up shows that IT and CS do not qualify as engineering disciplines at this time. My guess would be that people have been exceedingly stupid, e.g. by putting the limiters in software in SCADA systems. When I asked my EE student class (bachelor level) what they thought about that, their immediate response was that this is stupid. Apparently CS types are still ignoring well-established knowledge.

Re:This is very, very old (5, Insightful)

DMUTPeregrine (612791) | about 6 months ago | (#46513531)

That's because CS is math, not engineering. Computer Engineering is engineering, Computer Science is the study of the mathematics of computer systems. CE is a lot rarer than CS though, so a lot of people with CS degrees try to be engineers, but aren't trained for it.

Re:This is very, very old (-1, Troll)

gweihir (88907) | about 6 months ago | (#46513553)

What utter nonsense. Have you seen any real-world CS course? I doubt it.

The difference between CS and CE ... (0)

Anonymous Coward | about 6 months ago | (#46513883)

That's because CS is math, not engineering. Computer Engineering is engineering, Computer Science is the study of the mathematics of computer systems. CE is a lot rarer than CS though, so a lot of people with CS degrees try to be engineers, but aren't trained for it.

The difference between CS and CE is usually just the name the department chooses, not their course work. In other words it is usually a cosmetic difference.

Re:The difference between CS and CE ... (2)

ttucker (2884057) | about 6 months ago | (#46513963)

That's because CS is math, not engineering. Computer Engineering is engineering, Computer Science is the study of the mathematics of computer systems. CE is a lot rarer than CS though, so a lot of people with CS degrees try to be engineers, but aren't trained for it.

The difference between CS and CE is usually just the name the department chooses, not their course work. In other words it is usually a cosmetic difference.

This is not true, or even approximately true. CE is a discipline of EE. It is created mostly by learning EE, with a few computer architecture classes, lots of Verilog, and a few CS classes. In most universities, the program is offered by the EE college.

Re:The difference between CS and CE ... (0)

Anonymous Coward | about 6 months ago | (#46514085)

Coursework is really different at the undergraduate level as well. A university CS degree might recommend that you take 180 credits of pure mathematics courses for the basic degree; a computer-related engineering degree might offer you 60 credits of pure mathematics courses, and the rest, whatever is necessary, is learned along with the degree-related courses. A CS degree might not require you to take courses in industrial engineering and production engineering; an engineering degree might. Those are just limited examples.

Re:This is very, very old (1)

phantomfive (622387) | about 6 months ago | (#46513563)

Heh, nice try, but you can't blame the programmers for this one. The only thing programmers can do is write software for the device once the engineers have built it. If the engineers build a system that is not self-secure, what do you expect the software guys to do? Pull out the soldering iron?

All blame is on the engineers if they don't build a self-secure system (or management if it's their fault).

Re:This is very, very old (0)

gweihir (88907) | about 6 months ago | (#46513579)

Management and CS types that promise things they cannot deliver. The second thing is quite common, actually.

Re:This is very, very old (0)

phantomfive (622387) | about 6 months ago | (#46513711)

Management and CS-types that promise things they cannot deliver.

CS types aren't building the hardware. That's on the engineers.

Re:This is very, very old (3, Insightful)

vux984 (928602) | about 6 months ago | (#46513581)

My guess would be that people have been exceedingly stupid, e.g. by putting the limiters in software in SCADA systems.

Or they just did what they were told by management. After all, software solutions to problems tend to be a fraction of the price of dedicated hardware solutions, and can be updated and modified later.

Apparently CS types are still ignoring well-established knowledge.

You can't build a SCADA system with *just* CS types; so apparently all your 'true engineers' were also all asleep at the wheel. What was their excuse?

Seriously, get over yourself. The CS types can and should put limiters and monitors and regulators in the software; there's no good reason for them not to ALSO be in there; so when you run up into them there can be friendly error messages, logs, etc. Problems are caught quicker, and solved easier, when things are generally still working. This is a good thing. Surely you and your EE class can see that.

Of course, there should ALSO be fail-safes in hardware for when the software fails, but that's not the programmer's job, now is it? Who was responsible for the hardware? What were they doing? Why aren't those fail-safes in place? You can't possibly put that at the feet of "CS types". That was never their job.

Re:This is very, very old (1)

ttucker (2884057) | about 6 months ago | (#46513977)

Hardware fail-safes protect from so-called "never events". They are an added layer of protection beyond the software level, and should never be depended upon by the SCADA system.

Re:This is very, very old (1)

ratboy666 (104074) | about 6 months ago | (#46513657)

Way to shunt blame!

I design code, your "EEs" design electrical hardware. I have been delivered hardware without such safeties. I could simply refuse to deliver code for the platform -- it will simply be offshored.

Just costs me work.

Re:This is very, very old (1)

hjf (703092) | about 6 months ago | (#46513835)

Your "EEs" actually "code" too, but in disguise. PLCs are programmed, just (usually) not in written code, but rather, in Ladder Diagram or Function Blocks. But you know that, right?

I'm a programmer, but also a hobby electronics guy. And I've worked with PLCs. And I know for sure that "CS" types are never involved in these projects. The programming required is minimal (as usual with "elegant" engineering solutions), so a CS degree isn't required. It's much more about the hardware than software.

A CS guy usually doesn't even know what SCADA is, and would think LD is for retards.

And also: come on, EE guy, your kind is moving into "programming"... I've seen enough people thinking "since we now have really fast and cheap embedded CPUs like ARM that are 32-bit, run at 80 MHz, and cost cents, we might as well use that instead of a dedicated PWM chip for an SMPS". Call me old fashioned, but I really think those things (SMPS control) should always be analog. Even with the annoyance of stabilizing that damn feedback loop that starts oscillating.

Re:This is very, very old (1)

ttucker (2884057) | about 6 months ago | (#46513947)

It is amazing how fast we have forgotten the Therac-25....

Replace with a human (0)

Anonymous Coward | about 6 months ago | (#46513455)

Yes, sure, disconnect critical systems from a network, no brainer, right? They don't because the systems allow one remote operator to control what it took hundreds of employees to do/monitor previously.

A product is made in America; it costs $5 to make, and they sell it for $10 (or more). China opens up, can make the same product there for $0.05 each; they import it and sell it for $10 (or more), making incredible profits... This works for years (80s-90s); anything that can be 'exported' is. Years later, with inflation and a dwindling middle class, the price comes down through 3rd parties bringing in the same products and selling for less than $10 each. China now controls all manufacturing for the products; instead of us setting terms on deadlines, they do, and they increase prices... So we have a nation that makes very little, with the majority of the jobs not actually doing anything but killing time (moving vapor around). The rich still get rich though, so all is fine; go about your business, don't vote for a 3rd party candidate or anything; you're happy, have some bread, enjoy the clowns.

analog vs digital isnt the problem (5, Insightful)

Osgeld (1900440) | about 6 months ago | (#46513465)

analog is actually more susceptible to interference generated by rather simple devices, as there is no error checking on what's being fed to the system

the problem is your reactor is for some fucking reason hooked to the same network as facebook and twitter

Re:analog vs digital isnt the problem (0)

Anonymous Coward | about 6 months ago | (#46513507)

Correct. Anyone who has studied analog control systems knows that they can be inherently unstable and proving otherwise is MUCH harder than in the digital domain.

Re:analog vs digital isnt the problem (2)

Tablizer (95088) | about 6 months ago | (#46513555)

the problem is your reactor is for some fucking reason hooked to the same network as facebook and twitter

Rats, I knew I shouldn't have "liked" nuclear meltdown.

Re:analog vs digital isnt the problem (0)

Anonymous Coward | about 6 months ago | (#46513957)

the problem is your reactor is for some fucking reason hooked to the same network as facebook and twitter

That's a point too, but not THE point.

I studied reactor design in a decade when it looked as if there still was a future in that (it turned out there wasn't, so I ended up working with computers).
Back in those days, even analog electronics were shunned in reactor design (or rather, the design of reactor control and safety systems).
The issue was RELIABILITY.

Case Study (1)

Tablizer (95088) | about 6 months ago | (#46513467)

Fred Flintstone never had unexpected brake failures...at least none without a known cause.

Good idea (5, Insightful)

Animats (122034) | about 6 months ago | (#46513495)

There's a lot to be said for this. Formal analysis of analog systems is possible. The F-16 flight control system is an elegant analog system.

Full authority digital flight control systems made a lot of people nervous. The Airbus has them, and not only do they have redundant computers, they have a second system cross-checking them which is running on a different kind of CPU, with code written in a different language, written by different people working at a different location. You need that kind of paranoia in life-critical systems.

We're now seeing web-grade programmers writing hardware control systems. That's not good. Hacks have been demonstrated where car "infotainment" systems have been penetrated and used to take over the ABS braking system. Read the papers from the latest Defcon.

If you have to do this stuff, learn how it's done for avionics, railroad signalling, and traffic lights. In good systems, there are special purpose devices checking what the general purpose ones are doing. For example, most traffic light controllers have a hard-wired hardware conflict checker. [pdhsite.com] If it detects two green signals enabled on conflicting routes, the whole controller is forcibly shut down and a dumb "blinking red" device takes over. The conflict checker is programmed by putting jumpers onto a removable PC board. (See p. 14 of that document.) It cannot be altered remotely.

That's the kind of logic needed in life-critical systems.
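The conflict-monitor logic described above can be sketched in a few lines. This is a hypothetical illustration only, not actual MMU firmware; the phase numbers and permitted pairs are invented, and in a real unit the "permitted" matrix is fixed by physical jumpers so it cannot be altered remotely.

```python
# Hypothetical sketch of a traffic-signal conflict monitor.
# The permitted-pair matrix stands in for the jumper card described above.

PERMITTED = {           # phase pairs allowed to be green together (invented)
    frozenset({1, 5}),  # e.g. northbound through + northbound left
    frozenset({2, 6}),
}

def conflict(active_greens):
    """Return True if any two active greens are not a permitted pair."""
    greens = list(active_greens)
    for i in range(len(greens)):
        for j in range(i + 1, len(greens)):
            if frozenset({greens[i], greens[j]}) not in PERMITTED:
                return True
    return False

def monitor(active_greens):
    """Force the intersection to all-way flashing red on any conflict."""
    return "FLASH_RED" if conflict(active_greens) else "NORMAL"
```

The point of the design is that the monitor does not care why the controller asserted two conflicting greens; it simply shuts the whole thing down.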

Re:Good idea (1)

countach (534280) | about 6 months ago | (#46513591)

That's interesting that they have a different system cross-checking. But what happens when they disagree? Who wins? There might not be time for the pilots to figure it out.

Re:Good idea (2)

pipedwho (1174327) | about 6 months ago | (#46513843)

It's not that the secondary system is 'cross checking' or comparing results. They are really just monitoring circuits with a particular set of rules embedded in separate circuitry that just makes sure the primary system never breaks those rules. It is effectively the master control and will always 'win' if there is a problem. They are designed to be simple, robust and if possible, completely hardware based.

Some other examples are 'jabber' control hardware lockouts that stop a radio transmitter from crashing and staying permanently keyed; watchdog timers in critical systems that reset the system if they aren't periodically serviced; power control systems that shut down power domains if an overload is detected; etc.

Something like a nuclear power station should have more complex monitoring systems, but the rules are similar. In modern critical system design, the rules are generally set up to require a sanitising channel between the 'internet' and the control network. That channel may be some simple UART-to-UART control logic that allows a subset of general control commands to be issued without the ability to override the primary safety lockouts. If you want to override those, you have to turn up in person.

This type of security has been standard practice for years among embedded systems engineers. Once people started shoehorning inappropriate solutions into critical system control, that's when it went belly up. That's where you end up with glorified 'web coders' writing what should be done by someone who understands the pitfalls. Sometimes it's because 'management' has decided to requisition and install something beyond the design parameters set by the engineers.
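The watchdog pattern mentioned above can be sketched as a toy software model. In real systems the watchdog is an independent hardware counter that the supervised CPU cannot disable; the class below only models the behavior.

```python
class Watchdog:
    """Toy model of a hardware watchdog timer.

    The supervised system must call kick() at least every `timeout`
    ticks; otherwise tick() reports that a reset has been forced.
    """
    def __init__(self, timeout):
        self.timeout = timeout
        self.counter = 0
        self.resets = 0

    def kick(self):
        self.counter = 0          # healthy firmware services the watchdog

    def tick(self):
        self.counter += 1
        if self.counter >= self.timeout:
            self.resets += 1      # hardware forcibly resets the system
            self.counter = 0
            return "RESET"
        return "OK"
```

The key property is the same as the other lockouts described: the monitored system cannot talk the monitor out of acting; it can only keep behaving correctly.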

Re:Good idea (1)

KingOfBLASH (620432) | about 6 months ago | (#46514005)

That's interesting that they have a different system cross-checking. But what happens when they disagree? Who wins? There might not be time for the pilots to figure it out.

Then the minority report is filed in the brain of the female, who is obviously the smarter one. Duh. Didn't you see the movie?

Re:Good idea (1)

phantomfive (622387) | about 6 months ago | (#46513593)

For example, most traffic light controllers have a hard-wired hardware conflict checker. [pdhsite.com] If it detects two green signals enabled on conflicting routes, the whole controller is forcibly shut down and a dumb "blinking red" device takes over.

That's really cool

yes isolate (2)

globaljustin (574257) | about 6 months ago | (#46513499)

Or maybe you could isolate control systems from the Internet.

Unknown Lamer has it.

tl;dr - using analog in security situations would be obvious if "computer security" wasn't so tangled in abstractions

Sure, someone may point out that the "air gap" was overcome by BadBIOS http://it.slashdot.org/story/1... [slashdot.org] but that requires multiple computers with speakers and microphones connected to an infected system

IMHO computer security (and law enforcement/corrections) has been reduced to hitting a "risk assessment" number, which has given us both a false sense of security & a misperception of how our data is vulnerable to attack

100% of computers connected to the internet are vulnerable...just like 100% of lost laptops with credit card data are vulnerable

Any system can have a "vulnerability map" illustrating nodes in the system & how they can be compromised. I imagine it like a Physical Network Topology [wikipedia.org] map for IT networking, only with more types of nodes.

This is where the "risk assessment" model becomes reductive... they use statistics & infer causality... the statistics they use are historical data & they use voodoo data analysis to find **correlations**, then produce a "risk assessment" number from any number of variables.

If I'm right, we can map every possible security incursion in a tree/network topology. For each node of possible incursion, we can identify every possible vulnerability. If we can do this, we can have a lot more certainty than an abstract "risk assessment" value.

Analog comes into play thusly: if you use my theory, using **analog electronics** jumps out as a very secure option against "cyber" intrusions. Should be obvious!

"computer security"....

Re:yes isolate (0)

Anonymous Coward | about 6 months ago | (#46513621)

hello all, I am 66 years old, and able to type.
I will say that in the 1990s the companies were told the dangers of using the internet connected to the outside world without encryption. So the folks that did not, oh my.
And oops if I do not type well. Old telecom guy.

besides digital or analog, for safety, use physics (4, Insightful)

raymorris (2726007) | about 6 months ago | (#46513515)

Analog vs. digital, fully connected vs less connected - all can fail in similar ways. If it's really critical, like nuclear power plant critical, use simple, basic physics. The simpler the better.

You need to protect against excessive pressure rupturing a tank. Do you use a digital pressure sensor or an analog one? Use either, but also add a blowout disc made of metal 1/4th as thick as the rest of the tank. An analog sensor may fail. A digital sensor may fail. A piece of thin, weak material is guaranteed to rupture when the pressure gets too high.

Monitoring temperature in a life safety application? Pick analog or digital sensors, either one, but you'd better have something simple like the vials used in fire sprinklers, or a wax piece that melts, something simple as hell based on physics. Ethanol WILL boil and wax WILL melt before it gets to 300 F. That's guaranteed, every time.

New nuclear reactor designs do that. If the core gets too hot, something melts and it falls into a big pool of water. Gravity is going to keep working when all of the sophisticated electronics don't work because "you're not holding it right".

Re:besides digital or analog, for safety, use phys (1)

jrumney (197329) | about 6 months ago | (#46513599)

In other words, it is nothing to do with analog vs digital, but about having failsafe mechanisms that contain the damage when all your control systems go wrong. Failsafe mechanisms tend to be "analog", as they need to be effective even when the electricity and anything else that can fail has failed.

Re:besides digital or analog, for safety, use phys (2)

ttucker (2884057) | about 6 months ago | (#46513989)

Is a piece of wax melting analog, or something else entirely?

re: wax - name that song. (0)

Anonymous Coward | about 6 months ago | (#46514035)

'My time is a piece of wax falling on a termite, who's choking on the splinters'

No, it's education (5, Insightful)

Casandro (751346) | about 6 months ago | (#46513533)

Such systems are not insecure because they are digital or involve computers or anything. (Seriously, I doubt the guy even understands what digital and analog mean.) Such systems are insecure because they are unnecessarily complex.

Let's take the Stuxnet example. That system was designed to control and monitor the speed at which centrifuges spin. That's not really a complex task; it's something you should be able to solve in much less than a thousand lines of code. However, the system they built had a lot of unnecessary features. For example, if you inserted a USB stick (why did it have USB support?), it displayed icons for some of the files. And those icons can be in DLLs, where the stub code gets executed when you load them. So you insert a USB stick and the system will execute code from it... just like it's advertised in the manual. Other features include remote printing to file, so you can print to a file on a remote computer, or storing configuration files in an SQL database, obviously with a hard-coded password.

Those systems are unfortunately built by people who don't understand what they are doing. They use complex systems but have no idea how they work. And instead of making their systems simpler, they actually make them more and more complex. Just google "SCADA in the Cloud" and read all the justifications for it.

'every digital system has a vulnerability'? (0)

Anonymous Coward | about 6 months ago | (#46513543)

There are plenty of secure digital systems. It's not hard to make them; in fact it's quite easy. Trivial non-networked systems are often secure. There is no need to go to analog; simple digital circuits are fine. I don't care how good your leet hacking skills are, I can make a single digital control system that's perfectly secure that sets line C high if lines A and B are high. You can't hack an AND gate. There are plenty of places one can use provably correct digital control systems.

The idea is not that you need to put "Analog" in there somewhere, but rather that you should have simple things that are easy to secure, and design such that they are in the critical path for attacking. Ex: the Linux kernel is rather large (~15 million LOC). While it's nice, you don't really want to rely on all of that being secure. If you want security, you reduce the surface area exposed to attackers. If you are worried about incoming attacks over the network, air gap = 0 area to attack. If you still need to allow some input, you can squeeze the threat through something simple, which could be some analog mess as implied by the article, but more realistically would be a simple digital system: either hardware, or carefully validated (trivial) software, or both.
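The "squeeze the threat through something simple" idea can be sketched as a command whitelist. This is a hypothetical illustration; the command names and setpoint bounds are invented, and a real sanitising channel would live in separate hardware, not in the supervised software.

```python
# Hypothetical sanitising channel: only a fixed whitelist of benign
# commands is forwarded to the control network; anything that could
# override a safety lockout must be entered locally, in person.

ALLOWED = {"READ_STATUS", "READ_TEMP", "SET_POINT"}   # invented names
SETPOINT_RANGE = (0, 100)                             # invented bounds

def sanitise(command, arg=None):
    """Return the command tuple to forward, or None to drop it."""
    if command not in ALLOWED:
        return None
    if command == "SET_POINT":
        # even whitelisted setpoints are clamped to a safe range
        if arg is None or not SETPOINT_RANGE[0] <= arg <= SETPOINT_RANGE[1]:
            return None
        return (command, arg)
    return (command,)
```

Because the filter is an explicit whitelist rather than a blacklist, anything the designer did not anticipate is dropped by default, which is the property that makes the channel small enough to validate.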

If you are willing to expose a bit more and get a real general purpose OS, you can opt for something like Genode [genode.org], which is much more practical to design secure software for, and to validate the security of the OS itself.

TFA seems to be advocating using analog control systems to avoid things like cross-site scripting attacks. Maybe drop the "site" and "scripting" before dropping the idea of digital control systems. If you don't care about putting your junk on the internet, an air gap will fix most of that crap, and if you do want it on the internet, too bad: IP is a digital protocol, and an analog version won't be able to work with it. Besides, those attacks are client side, so maybe just not exposing important infrastructure controls, capable of wrecking everything if messed with, to people using web browsers in a non-secure environment is enough.

Recently saw him speak... (0)

Anonymous Coward | about 6 months ago | (#46513545)

Just last week, I saw this man speak at the Johns Hopkins University Applied Physics Laboratory. He had given his whole presentation, and at the end someone had asked him if analog systems could be an answer to protecting critical infrastructure. His response was that yes, it would help, but nobody wants "that old shit" (in this case he was paraphrasing what he feels the industry thinks of analog systems). He also asserted that the main reason that digital systems were popular and on the internet was because companies were focused on the cost savings of having remote access to these systems.

Battlestar Galactica (2)

sg_oneill (159032) | about 6 months ago | (#46513577)

Reminds me a bit of one of the tropes from Battlestar Galactica. Adama knew from the previous war that the Cylons were master hackers who could disable battlestars by breaking into networks via wireless and then using them to disable the whole ship, leaving it effectively dead in the water, so he simply ordered that none of his ship's systems ever be networked and that the ship be driven using manual control. Later on they meet the other surviving battlestar, the Pegasus, and it turns out it only survived because its network was offline due to maintenance. It's not actually a novel idea in militaries. I remember in the 90s doing a small contract for a special forces group I can't name, and I asked them about their computer network. He said they used "sneakernet", meaning any info that needed transfer was put on a floppy and walked to its destination, thus creating an air gap between battlefield systems.

I guess this isn't quite that, but it certainly seems to be a sort of variant of it.

Computer viruses predated the internet ... (1)

perpenso (1613749) | about 6 months ago | (#46513925)

Computer viruses predated the internet and worked across sneakernets. Code on a floppy can be infected. A floppy can contain data crafted to overrun buffers and execute code, etc. The internet just simplifies the process, automates it.

isolate control systems from the Internet. (1)

manu0601 (2221348) | about 6 months ago | (#46513605)

Editor or submitter said

isolate control systems from the Internet.

Stuxnet has shown that it is not enough. You can still be infected by a USB key.

What a pathetic uninformed crock of sh artic (1)

Rosco P. Coltrane (209368) | about 6 months ago | (#46513677)

Analog vs. digital has nothing to do with "cyberterrorism". Analog refers to systems with an infinite number of states; digital refers to systems with a finite number of states. If properly designed, both are perfectly safe.

Cyber security has nothing to do with digital or analog, and everything to do with software and networking. Which have nothing whatsoever to do with the analog vs digital design choices.

TFA reads like a science essay from a 3rd grader who writes with technical words to look smart but doesn't actually understand any of what they're writing about...

Maybe you could (1)

Jeremi (14640) | about 6 months ago | (#46513681)

>Or maybe you could isolate control systems from the Internet.

Yes, maybe is the keyword there. Set up everything to be nice and air-gapped, and maybe some joker won't bring in his malware-infected laptop the next day and temporarily hook it up to your "secure network" in order to transfer a file over.

Or then again, maybe he will. Who knows?

This fixes it as a side effect (2)

gman003 (1693318) | about 6 months ago | (#46513715)

The core problem is that "data" and "code" are being sent over the same path: the reporting data is being sent out, and the control "data" is being sent in, but it's over a two-way Internet connection. If you had an analog control system that was openly accessible in some way, you'd have the exact same problems. Or you could have a completely separate, non-public digital control connection that would be secure. But nobody wants to lay two sets of cable to one device, and there's a convenience factor in remote control. So since security doesn't sell products*, but low price and convenience features do, we got into our current situation. It's not "digital"'s fault. It's not "analog"'s fault. It probably would have happened even if all our long-range communication networks were built of hydraulics and springs.

* For those who are about to point out how much money antivirus software makes, that's fear selling, not security. Fear moves product *very* well.

"Isolate from the Internet" is hard (2)

roca (43122) | about 6 months ago | (#46513725)

Air-gap alone is not enough. Stuxnet travelled via USB sticks. And if your hardware (or anything connected to it) has a wireless interface on it (Bluetooth, Wifi, etc), you have a problem ... an operator might bring a hacked phone within range, for example.

Simplifying the hardware down to fixed-function IC or analog reduces the attack surface much more than attempts to isolate the hardware from the Internet.

Re:"Isolate from the Internet" is hard (1)

TubeSteak (669689) | about 6 months ago | (#46514081)

Air-gap alone is not enough. Stuxnet travelled via USB sticks.

The Stuxnet attack was (for the Iranians) a failure of operational security.
The attackers knew exactly what hardware/software was being used and how it was set up.
If the Iranians had one less centrifuge hooked up, or a different SCADA firmware version, the worm would have never triggered.

There is such a thing as security through obscurity.
It's never a complete solution, but it should always be your first line of defense.

Re:"Isolate from the Internet" is hard (0)

Anonymous Coward | about 6 months ago | (#46514219)

There is such a thing as security through obscurity.

No.

local network (0)

Anonymous Coward | about 6 months ago | (#46513733)

these reactors would be run in a local environment you'd think. essentially away from cyberspace /.

Perhaps analog isn't the right term (1)

sjames (1099) | about 6 months ago | (#46513787)

The key is hard stop rather than analog. For a simple example, imagine 3 machines that draw a great deal of inrush current using typical start/stop controls. Since we're in the digital age, we put them under computer control. The controller can strobe the start or stop lines for the 3 machines.

Now, they must not all be started at once or they'll blow out everything back to the substation. We know they must be started at least 10 seconds apart. Doing it the "digital way", we program the delay into the controller software and call it good. Then someone hacks the firmware and does a great deal of damage power-cycling the units rapidly until kaboom.

Or we do it the 'analog way'. When a start line is strobed, a PLC with no connectivity of any kind locks out the other two and starts a ten second timer. The firmware can't touch the timer. The attacker annoys but does no real damage due to the hard stop.
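In software terms, the hard-stop interlock described above might look like the toy model below. This is only an illustration of the rule; in the real scheme the logic lives in a PLC with no connectivity, which the networked (and hackable) controller cannot reprogram.

```python
class StartInterlock:
    """Toy model of a hard-wired start interlock for several machines.

    Regardless of what the supervisory controller asks for, no machine
    may start within `lockout` seconds of any other start.
    """
    def __init__(self, lockout=10):
        self.lockout = lockout
        self.last_start = None    # time of the most recent start

    def request_start(self, machine, now):
        if self.last_start is not None and now - self.last_start < self.lockout:
            return "BLOCKED"      # hard stop: lockout timer not expired
        self.last_start = now
        return f"STARTED {machine}"
```

Even if an attacker strobes the start lines as fast as possible, the worst they can do is annoy the operators; the inrush limit is enforced by hardware the firmware can't touch.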

Better design and discipline (1)

WaffleMonster (969671) | about 6 months ago | (#46513791)

Whether it is a series of mechanical cogs or a digital controller, the problem in the abstract seems to be not so much the selection of technology as the proliferation of "nice to have" yet possibly unnecessary capabilities: widgets which may not offer significant value after closer inspection of all the risks. Is remote management really a must-have, or can you live without it? Perhaps read-only monitoring (cutting the rx lines) is a good enough compromise... perhaps not all systems need network connections, active USB ports, etc.

Then we get to process questions... can the system be designed and isolated in such a way that any manipulation is subject to local safety constraints which cannot be remotely bypassed or influenced/tricked?

It is a problem that controls people have not sufficiently cared about security in terms of product development, deployment and operation.

Also, at some level operators must be trusted not to be stupid or evil... To some extent this means knowing when to ignore the security/bureaucratic guy endlessly pulling what-ifs and CYAs out of his ass, and focusing on what, in the bigger context, is actually important.

Obvious solution is obvious (2)

Karmashock (2415832) | about 6 months ago | (#46513803)

The hubris of thinking that everything can be linked to the internet while maintaining acceptable security is astounding.

Some systems need to be air-gapped. And some core systems just need to be too simple to hack. I'm not saying analog, merely so simple that we can actually say with certainty that there is no coding exploit. That means programs short enough that the code can be completely audited and made unhackable.

Between air-gapping and keeping core systems too simple to hack... we'll be safe from complete infiltration.

Lots of unproven assertions here. (3, Interesting)

johnnys (592333) | about 6 months ago | (#46513829)

"obvious: that 'every digital system has a vulnerability,' "

So far, this has been demonstrated (NOT proven) only in the current environment, where hardware and software architects, developers and businesses can escape product liability requirements by crafting toxic EULAs that dump all the responsibility for their crappy designs and code on the end user. If the people who create our digital systems had to face liability as a consequence of their failure to design a secure system, we might find they get off their a**es and do the job properly. Where's Ralph Nader when you need him?

And as the original poster noted, you CAN isolate the control systems from the Internet! Cut the wire and fire anyone who tries to fix it.

"analog protection systems have one big advantage over their digital successors: they are immune"

Nonsense! There were PLENTY of break-ins by thieves into banks, runaway trains, industrial accidents and sabotage BEFORE the digital age. There was no "golden age" of analog before digital: that's just bullsh*t.

The key is cost (1)

Nkwe (604125) | about 6 months ago | (#46513847)

It is not an analog or digital issue; it is a cost issue. To be secure from remote attack you have to be willing to pay to have a trusted (human) individual with a sense of what is reasonable (with respect to the process) in the control loop. The problem, of course, is that trusted humans with a sense of reason are expensive.

Re:The key is cost (1)

fuzzyfuzzyfungus (1223518) | about 6 months ago | (#46514275)

It doesn't necessarily come down to humans (who can't necessarily save you if very fast responses are required or very subtle deviations need to be detected), though they can certainly help; but cost is much of the problem on the software side as well. More than a few important things run at the "incompetent and overworked IT staff usually apply patches within a few months of release, assuming it isn't one of the systems the vendor says you shouldn't touch" level, and people are unwilling enough to shell out what it would take to bring them up to "commercial best practice" levels, much less the (stratospheric, if we even have enough suitably qualified humans available) cost of "all formally proven and whatnot"...

perspective of a controls engineer-- (4, Insightful)

volvox_voxel (2752469) | about 6 months ago | (#46513895)

There are billions of embedded systems out there, and most of them are not connected to the internet. I've designed embedded control systems for most of my career, and can attest to the many advantages a digital control system has over an analog one. Analog still has its place (op-amps are pretty fast & cheap), but it's often quite useful to have a computer do it. Most capacitors have a 20% tolerance or so, have a temperature coefficient, and have values that drift. Your control system can drift over time, and may even become unstable due to the aging of the components in the compensator (e.g. PI, PID, lead/lag). Also, a microcontroller wins hands down when it comes to long time constants with any kind of precision (millihertz); it's harder to make very long RC time constants and trust those times. Microcontrollers/FPGAs are good for a wide range of control loops, including those that are very fast or very, very slow. Microcontrollers allow you to do things like adaptive control when your plant can vary over time, like maintaining a precise temperature and ramp time in a blast furnace whose internal volume can change wildly. They also allow you to easily handle things like transport/phase lags, lots of corner conditions, and system changes -- all without changing any hardware.

I am happy to see the same trend with software-defined radio, where we try to digitize as much of the radio as possible, as close to the antenna as possible. Analog parts add noise, offsets, drift, cross-talk, leakage, etc. Microcontrollers allow us to minimize the analog portion as much as possible.
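Part of why digital wins on drift and long time constants is that a discrete compensator of the kind mentioned above is only a few lines of code, with no component values to age. Below is a textbook-style PID sketch, not any particular product's code; the gains and sample period are arbitrary.

```python
class PID:
    """Minimal discrete PID controller with a fixed sample period dt."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        """Return the control output for one sample period."""
        error = setpoint - measured
        self.integral += error * self.dt                 # I term accumulates
        derivative = (error - self.prev_error) / self.dt # D term on error slope
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Retuning this loop, or even switching to an adaptive scheme, is a code change rather than a soldering job, which is exactly the flexibility argument made above.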

Fix digital security (0)

Anonymous Coward | about 6 months ago | (#46513899)

Analog is a step backwards. We should be moving forward with digital security.

Redundant (0)

Anonymous Coward | about 6 months ago | (#46513935)

This seems a bit redundant; there are already methods to mathematically prove digital systems from the metal up to the software (although at the metal level, it gets quite time-consuming).

The issue is that people have critical infrastructure built on, or controlled by, insecure systems (regardless of whether they're digital or not).

As many have stated, isolation is the easiest first step, but you can go further: building on well-defined/restricted ASICs, built on mathematically proven microkernels which give essential access via MAC (also mathematically provable, via the kernel & MAC manifests) to 'processes' (drivers, software, etc.) built on mathematically provable languages.

A lot of functional languages (typically those that are strict implementations of the lambda calculus with minor (if any) extensions, based on a well-known type system, which in the FP domain is typically Hindley-Milner (or a super/sub-set of it)) can be proven mathematically to meet various constructs/restrictions, which, in a strict microkernel environment with drivers under identically strict process environments, meets very high levels of security from both intrusion and correctness perspectives.

Some of this is even 'required' for various military contracts (personally, we use a subset of Java SE for military-grade gun mounts & control systems for various national militaries around the world, and have an entire team of mathematicians/CS staff whose sole job is to validate correctness and prove system reliability from various standpoints, who do exactly this -- albeit for a Java-like language, not an FP language).

Tautology (3)

Yoda222 (943886) | about 6 months ago | (#46514051)

A "cyber-attack" is a digital attack. So if your system is not digital, you can't be cyber-attacked. Great news.

Hmm... (1)

fuzzyfuzzyfungus (1223518) | about 6 months ago | (#46514195)

I think I have a call from 1985 on line one, from some guy called 'Therac-25' who seems very excited about the importance of hardware safeguards and not trusting your software overmuch...

The simple fact is this : (1)

Rollgunner (630808) | about 6 months ago | (#46514209)

My sister-in-law was excitedly showing off her new car to me, and I said that I didn't care for the idea of a remote-start function for cars. "But it's security coded," she said. My response was this:

If a device can be controlled with an electronic signal, that means that the device can be controlled with an electronic signal.

Sometimes that signal will come from where you want it to, but there can be no guarantee that it will not come from somewhere else.