
Information Security Is Becoming Infrastructure

Soulskill posted more than 6 years ago | from the time-to-pay-your-monthly-security-bill dept.


Bruce Schneier has a story at Wired about his observations from the recent RSA conference. He noticed that the 350+ vendors who attended the conference were having difficulties selling their products or even communicating with potential buyers. Schneier suggests that the complexity of the security industry is forcing it away from end-users and into the hands of companies who can bundle it with the products that need it. Quoting: "When something becomes infrastructure -- power, water, cleaning service, tax preparation -- customers care less about details and more about results. Technological innovations become something the infrastructure providers pay attention to, and they package it for their customers. No one wants to buy security. They want to buy something truly useful -- database management systems, Web 2.0 collaboration tools, a company-wide network -- and they want it to be secure. They don't want to have to become IT security experts. They don't want to have to go to the RSA Conference."


We've seen this with PGP (5, Insightful)

CRCulver (715279) | more than 6 years ago | (#23135016)

We've seen this problem in the PGP world. Geeks like working with everything themselves, but it's hard to convince non-geeks to use it, because they don't see the point. If encryption were really vital, it would be packaged so they could easily enable it, just like their online banking. Even secure e-mail standards like S/MIME, which are easy to use, remain little known because companies don't actively pitch them to their customers.

I would beg my fellow geeks, at least, to rediscover some of the passion about encryption. As I posted a couple of days ago, a decade ago every geek had a PGP key and Schneier's Applied Cryptography was our favorite bedtime reading. Now, even geeks don't want to go through the minimal (to us) effort of working with crypto.

Re:We've seen this with PGP (2, Insightful)

PDG (100516) | more than 6 years ago | (#23135428)

I read your post the other day and agreed wholeheartedly with it. I remember back in '97 when PGP keys were part of email signatures and such.

Now, it's unheard of.

I've set my machines up with GPG, and my wife's as well, and autoconfigured them to encrypt any and all email between the two of us, but my attempts to get others to do the same have proven fruitless.

I harp the same line Zimm did--when you put a letter in the mailbox, you put it in an envelope, right? Why is email any different?

Re:We've seen this with PGP (1)

ColdWetDog (752185) | more than 6 years ago | (#23135474)

I harp the same line Zimm did--when you put a letter in the mailbox, you put it in an envelope, right? Why is email any different?

While in some sense I agree that the problem with encryption is that it isn't ubiquitous, isn't easy to use and isn't the default, I think part of the problem is that email isn't the functional equivalent of sending a letter in the mail. It's the functional equivalent of sending a postcard. Most of the emails with my wife are at the level of "the puppy ate (insert another expensive object) today".

That doesn't need to be encrypted. Nobody cares.

Putting PGP keys in our emails doesn't help that. It has to be transparent. And that's exactly what Schneier is saying.

Re:We've seen this with PGP (1)

Ethanol-fueled (1125189) | more than 6 years ago | (#23136372)

Maybe you'd stop eating expensive things if you weren't so cold and wet.

Re:We've seen this with PGP (1)

PDG (100516) | more than 6 years ago | (#23136552)

Agreed on the unimportant email, but plenty of important info gets passed along via email as well.

I also agree that the stuff does need to be transparent. The fact that I pre-configured my wife's computer to do it automatically is proof of that (because she doesn't have a clue)

The core problem is the lack of options right now. Unfortunately there doesn't seem to be a lot of importance placed on secure email so GPG is about all we have.

Re:We've seen this with PGP (2, Informative)

Eighty7 (1130057) | more than 6 years ago | (#23136554)

Putting PGP keys in our emails doesn't help that. It has to be transparent. And that's exactly what Schneier is saying.
Yeah, good luck with that. In my experience, mail encryption is fundamentally difficult, like going from driving cars to flying planes. You have to know the basics of key management, i.e. get someone's PUBLIC key, encrypt messages using HIS public key, and he decrypts using HIS private key. That's already a dealbreaker for most people. Does he seriously expect they'll listen when he talks about key backups, key signing, or the importance of only keeping decrypted attachments in RAM?

Why Johnny Can't Encrypt (PDF)
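The key flow the commenters are wrestling with (fetch HIS public key, encrypt with it, he decrypts with HIS private key) can be sketched as a textbook-RSA round trip. This is a toy for illustrating the key roles only: the primes are tiny, there is no padding, and it resembles real PGP in concept only.

```python
# Toy textbook RSA: shows which key does what. Tiny primes, no
# padding -- utterly insecure, for illustration only.

def egcd(a, b):
    # Extended Euclid: returns (g, x, y) with a*x + b*y == g.
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def make_keys(p=61, q=53, e=17):
    # The RECIPIENT generates both keys and publishes only (e, n).
    n = p * q
    phi = (p - 1) * (q - 1)
    d = egcd(e, phi)[1] % phi       # private exponent: e*d == 1 (mod phi)
    return (e, n), (d, n)           # (public key, private key)

def encrypt(m, pub):
    e, n = pub                      # the sender needs only the public key
    return pow(m, e, n)

def decrypt(c, priv):
    d, n = priv                     # only the private key can undo it
    return pow(c, d, n)

pub, priv = make_keys()
c = encrypt(42, pub)
print(c, decrypt(c, priv))          # ciphertext, then the recovered 42
```

Everything the parent calls a dealbreaker (key backups, key signing, trust) sits on top of this one exchange.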

Re:We've seen this with PGP (1)

umghhh (965931) | more than 6 years ago | (#23136556)

Well, transparent or not, the belief that users can stay ignorant while the industry does its hocus-pocus is just silly. I don't need to know all the gory details of how the car I drive to work every day works, but knowing the basics helps me make intelligent guesses, e.g. not to buy Toyota's hybrids because they are less efficient than 'normal' cars, or to use the safety belt, etc. On the same principle, I prefer to know what the tools that protect me and my electronic transactions do and how they work. Ignorance may be bliss, but it is a costly one.
Still, what we have now is a mess that is difficult to handle (for geeks and others too). It's the same as the general issue with software: quality. A lot can and must be done. Ignorance prevents even discussing the issue.
Then again, common sense is not a tool used on management floors anyway.

Re:We've seen this with PGP (1)

Strilanc (1077197) | more than 6 years ago | (#23138232)

People do not see email as postcards. A friend of mine runs a small website, and people use email to contact the company for orders. A surprising number of them include their credit card numbers RIGHT IN THE EMAIL. That is INSANE! These are the same people who worry about typing those numbers into websites!
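Incidentally, card numbers like those are cheap to spot in a mail stream, which is roughly what outbound-mail filters do. A minimal sketch, assuming a naive 16-digit pattern and using the standard Luhn checksum (the "4111..." string is the classic Visa test number, not real data):

```python
import re

def luhn_ok(digits: str) -> bool:
    # Luhn checksum: from the right, double every second digit,
    # subtract 9 from doubles over 9; the total must end in 0.
    total = 0
    for i, ch in enumerate(reversed(digits)):
        d = int(ch)
        if i % 2 == 1:
            d = d * 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_card_like(text: str):
    # Naive scan: any 16-digit run, optionally space/dash separated,
    # that also passes the Luhn check.
    hits = []
    for m in re.finditer(r"(?:\d[ -]?){15}\d", text):
        digits = re.sub(r"[ -]", "", m.group())
        if luhn_ok(digits):
            hits.append(digits)
    return hits

print(find_card_like("my card is 4111 1111 1111 1111, thanks!"))
# → ['4111111111111111']
```

A real gateway would of course do more (track numbers of other lengths, quarantine instead of print), but the detection itself is this simple.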

I completely agree (1)

gr8scot (1172435) | more than 6 years ago | (#23154960)

... I "don't want to go through the minimal (to us) effort of working with crypto," and except for my work (and hobbies) as a programmer, I shouldn't have to. Microsoft should have made encryption a standard feature of every user action that sends even one bit off the client, with a shortcut icon to a Properties dialog that includes others' public keys. If an Internet browsing program can legitimately be described as integral enough to computing to be part of the operating system, then encryption damn well is too, and much more so. Ridiculous!

maybe the market is working (4, Insightful)

convolvatron (176505) | more than 6 years ago | (#23135040)

maybe the problem with selling security is that the products are a pile of afterthought patches. security is a property that should lie at the foundation of a design. why should i put some 1U appliance with a lot of molded plastic on my ethernet at all?

Re:maybe the market is working (4, Insightful)

houstonbofh (602064) | more than 6 years ago | (#23135114)

I was thinking this myself... It could be that people don't understand it. But it could be that the products don't work all that well. Or it could be that a bad network design makes it all pointless anyway. But get HP or BMC in there with a big network plan that includes security, and it works.

I think they have it backwards. Security isn't a utility, it is a highly technical skill. You need a person, not a box.

HP??? (1)

Systems Architect (1276840) | more than 6 years ago | (#23145080)

I agree with your concept, but HP???
You've got to be kidding us .... try Foundry!

Re:maybe the market is working (3, Interesting)

eihab (823648) | more than 6 years ago | (#23135122)

A similar conclusion can be drawn from the article:

The booths are filled with broad product claims, meaningless security platitudes and unintelligible marketing literature. You could walk into a booth, listen to a five-minute sales pitch by a marketing type, and still not know what the company does. Even seasoned security professionals are confused.
This is the state of security products for the most part nowadays: hoax products and snake-oil salesmen ("IT'S 2009 READY!!!1!").

Now, I do agree with you that security should lie at the foundation of a design, but security also works by constructing layers of defense. No matter how good your design/implementation is, software is very complicated and someone will slip somewhere.

Unless you write your own OS, design your own hardware and write its firmware, then write your application on top of all that, you _will_ be depending on someone somewhere to do it, and they may (or may not) mess something up.

The more layers of security you add (hardware firewall, anti-virus, etc.), the more secure you will be in the end.

Re:maybe the market is working (1)

alen (225700) | more than 6 years ago | (#23135142)

it's the same with any product

a herd of tiny companies makes something to fix some obscure problem that 99% of people will never encounter but the marketing hype makes it seem like the end of the world

A lot of companies don't want to pay for it (3, Interesting)

MikeRT (947531) | more than 6 years ago | (#23135084)

Probably because they don't think that security is really that critical to them. However, for many others, the cost of getting the right consultants and infrastructure might be too much for their business to handle. Most businesses don't have a lot of disposable cash to put into IT infrastructure, especially since a lot of IT infrastructure has to be upgraded on a semi-regular basis.

Re:A lot of companies don't want to pay for it (1)

perlchild (582235) | more than 6 years ago | (#23135626)

A lot of companies don't want to pay for it because they think it should have been designed into the project in the first place... Or they assume they're secure. A lot of the snake oil has one good side: it makes people aware that security wasn't built in in the first place.
However, since those salesmen have a product, not a redesign, to sell, none of their solutions really address the problem, but they make a lot of money.
I'm mostly talking about SMTP and spam here, but the same concept applies elsewhere, to a greater or lesser degree. Backwards compatibility with the insecure solution is usually the killer there, and in a lot of other places too.

NOOOOOOOOO (4, Insightful)

Original Replica (908688) | more than 6 years ago | (#23135090)

the complexity of the security industry is forcing it away from end-users and into the hands of companies who can bundle it with the products that need it.

Great, once again the tools I need to protect myself are being taken away and given to "the professionals". So if all the security tools go to the ISPs and other infrastructure, how do I protect myself from ISP spyware?


Anonymous Coward | more than 6 years ago | (#23135256)

Don't use that installation disk your ISP gives you? :)


Ethanol-fueled (1125189) | more than 6 years ago | (#23136476)

That may not be an option for mobile broadband: the mandatory software is auto-installed directly from the adapter the first time you plug it in (for USB, at least).


colinrichardday (768814) | more than 6 years ago | (#23137328)

Is there a Linux version? Those Windows executables don't run too well on OpenSuSE.


Anonymous Coward | more than 6 years ago | (#23141402)

"Great, once again the tools I need to protect myself are being taken away given to "the professionals". So if all the security tools go to the ISPs and other infrastructure how do I protect myself from ISP spyware?" - by Original Replica (908688) on Sunday April 20, @12:46PM (#23135090)

Try this (IF you're a Windows 2000/XP/Server2003 or even VISTA user):

HOW TO SECURE Windows 2000/XP/Server 2003 & even VISTA + make it "fun" to do, via CIS Tool Guidance:

It truly works, IF you can apply & adhere to some SIMPLE rules it notes.

The one thing I think that many "security pros" & network admins/techs fear is that "normal end users" begin to grasp how to secure themselves (OR, that they even begin to grasp things TCP/IP (networking))...

If everyone starts to realize how SIMPLE it is? Then, that's taking away the need for the "pros", period (that is, IF an end-user's interested in securing themselves, & most are... else, why put locks + security systems into their homes or vehicles?).

Don't let this b.s. from this article fool you (that people aren't interested in security OR knowing how to achieve it)!

Simply because common-sense & looking around you shows you clearly & cleanly otherwise, period.


P.S.=> The best part is this though, & that is that CIS Tool is NOT restricted to Windows users only... there are versions for various *NIX variants too, & they're decent enough as well... apk

most security products are to fix poor admins (1)

alen (225700) | more than 6 years ago | (#23135134)

I can't count how many products are crazy ways to push updates, or check for updates, or are just easier ways for admins to use features of Windows or some other MS product; features that are part of the product but require more than clicking a button to make them work. I use SQL 2005, and there are so many ways to get into the guts of the product and see what is really happening that it will take months to learn it all. But there is no shortage of products that do the exact same thing, except with a colorful GUI, so you don't have to invest the time to learn the product yourself.

Nobody likes paying for "security" (3, Insightful)

smithfarm (862287) | more than 6 years ago | (#23135152)

Whether you're a computer user or a small shop owner in the Bronx, nobody likes paying for security.

Good news. (2, Interesting)

Shoten (260439) | more than 6 years ago | (#23135154)

This is a good thing. I'm working on a proposal for a...well, it's $900 million worth of something, I'll say that. It's a huge project, with a lot of different technologies (even by IT standards). I'm the "Security Tower," the group of people responsible for security in the solution, and I've never had it so easy. Sure, there are firewalls, and an IdM extension to support SSO, and a few other things for security, but for the most part our security is architectural. Every area of the solution has products with security infused into them to some degree, whether it's encryption for the endpoints, key management for the central system that manages the endpoints, and so on. Instead of having to wait until the rest of the solution was finalized, and then play catch-up to try and get security added in, it's been a matter of mapping requirements to security functionality that is already there.

Self-serving horseshit (2, Insightful)

duffbeer703 (177751) | more than 6 years ago | (#23135184)

Of course, security consultants think that security should be left to the professionals. (ie, them)

The information security people are getting jealous because project managers have a certifying/religious body (PMI) and a certification (PMP) that is basically required for many serious projects. That keeps the rates high by limiting the marketplace and mandating some prescribed process for doing everything.

Security consultants like to put that "CISSP" on email signatures and business cards because it makes them sound like doctors or lawyers, but at the end of the day, nobody really gives a shit. So now every so-called security guru is coming around telling us that the Russian mafia has probably already hacked our systems, and the Chinese are going to take over the world, starting with our company's PCs. The magazines roll out witticisms like "digital Pearl Harbor" and "cyber 9/11".

The solution is to give more money to security consultancies. Maybe buy some million-dollar IDS solutions from the likes of Symantec to let you know that some putz in accounting tried to use FTP.

IMO, it's all bunk. IT people are finally starting to question the dubious value of cash-cow security software like AV, so the security community rolls out some more fear-mongering.

Re:Self-serving horseshit (2, Insightful)

ladybugfi (110420) | more than 6 years ago | (#23135462)


The answer is not just to give more money to security consultants (like me, a CISSP + GSNA) nor hw/sw vendors.

The answer is to develop a good security management framework that works for the organization. Security is not a product or a consultant or a service. Security is a process. Invest into developing the process and the organization is set to survive whatever the Chinese/Government/God throws at it.

Re:Self-serving horseshit (2, Insightful)

Anonymous Coward | more than 6 years ago | (#23135700)

IT people are finally starting to question the dubious value of cash-cow security software like AV, so the security community rolls out some more fear-mongering.

It's remarkable how many PMPs are really risk-seeking, control-averse, self-declared security-expert cowboys trying to impress the bosses with how many shortcuts they've taken to get the project out the door. Outlooks like this are far from scarce, and unfortunately they lead to the purchase of expensive common-control-level solutions to compensate, post-implementation, for lacking system security discovered by external auditors or hackers.

An approach I'd suggest alternately is a risk-balanced one (e.g. ISO 31000, AS/NZS 4360). As a financially-educated risk manager in a large financial corporation's information risk program, I see repeated framework purchases (e.g. web application firewalls) that have to be implemented at the data center level due to shortcuts and an absence of basic security planning and design by the information system owner. When you can't take an application offline or recode it in a short period to address PCI findings, you end up throwing millions at compensating common controls.

Our business executives have gotten sick of countless millions spent on database encryption, gazillions of firewalls, application scanning systems, etc. but don't understand nor care that the inclusion of system security in the design phase of these applications would have avoided much of this cost.

I'd concur that much of the security efforts are seriously not risk aligned and lack any awareness of risk optimization. Too many in our world seek perfection, having zero tolerance for risk. Unfortunately, that unrealistic attitude, combined with the risk-seeking "shove it in and call it a day" PMP types, leads to a total breakdown in communication and ultimately insecure applications and unacceptable residual risk.

Re:Self-serving horseshit (1)

Bargeld (621917) | more than 6 years ago | (#23136044)

Well damn, wish I'd read your reply before I posted. Far more eloquently stated than I put it. /salute

Re:Self-serving horseshit (5, Interesting)

Bargeld (621917) | more than 6 years ago | (#23135998)

Of course, security consultants think that security should be left to the professionals. (ie, them)
Because it should. Or more accurately, oversight of it should. But when you have security-savvy architects, project managers, and (rarely) business-line managers, the need for micro-managed technical oversight is MUCH less. No matter what, though, someone needs to be managing the big picture of risk across all the silos of expertise.

Security consultants like to put that "CISSP" on email signatures and business cards because it makes them sound like doctors or lawyers, but at the end of the day, nobody really gives a shit.
Amen :) It's always struck me as a grandiose, sad conceit...and I _AM_ a CISSP. It'll be a cold day in hell when I throw it around like a badge of pride, let alone authority, because frankly, it's a mediocre standard. Management at my last employer forced me to write the exam "to make our practice more credible to clients", and I spent a whopping 2 days "studying". The bar it sets is...very low. Not bad for a foundation, but not good for much else.

I've been doing infosec work for over 17 years now, and IMO, the "problem" as it were, is that the demand for expertise has utterly outstripped the experienced pool of talent.

Net result? Exactly what you observe: "cash cow security" that is more focused on implementing wildly expensive (and frequently Rube-Goldberg-esque) technology solutions. Why? Because the inexperienced security practitioner immediately and inevitably turns to vendors for "turn-key solutions" to every risk (and many non-risks :)

Conversely, the much smaller number of people with substantial experience in the trenches are the ones who might point out that a $50,000 security awareness campaign _just might_ reduce net risk a WEE BIT more than a $3million 17-tier-firewall-atrocity. Or that a 10-man-hour risk assessment by security professionals attached to EVERY project's design phase _just might_ have a better chance of reducing risk than a $30k penetration test of every project by an external vendor that is 9 times in 10 a glorified canned vulnerability scan by a junior drone.

Not much of this is likely to change anytime soon. Sad to say, information security is still a very young and immature science. Things won't get better until the experience-pool gets deeper.


Re:Self-serving horseshit (1)

DevilDoc (1004278) | more than 6 years ago | (#23143508)

Not much of this is likely to change anytime soon. Sad to say, information security is still a very young and immature science. Things won't get better until the experience-pool gets deeper. --Bargeld
You make a valid assessment of the IT Security industry. My question is how and where do the "junior drones" find the knowledge and experience that is needed?

Re:Self-serving horseshit (1)

Bargeld (621917) | more than 6 years ago | (#23152734)

Time :(
Wish I had a better answer. There might be one.

PS: My "drones" snark is directed more at consultancies selling BS than at inexperienced-but-learning security people trying to do their job. Used to be in charge of a security consulting practice, and was sabotaged endlessly by a sales force positioning my team as "all created equal", or promising that in a pinch _I_ would personally deliver every engagement, so boilerplate SOW's are just fine. It's all about the billable, baby...*sigh*

Re:Self-serving horseshit (1)

duffbeer703 (177751) | more than 6 years ago | (#23150194)

I totally agree with you on taking proactive measures during the planning phases of the project. That also makes me stop and think that the team-building approach that Brooks laid out in the "Mythical Man-Month" is the type of approach that would help address problems like this.

When people talk about the Mythical Man-Month, they usually refer to the assertion that throwing people on a project tends to delay the project. But another key point in that book was that the programming/implementation team was more like a surgical team than a bunch of interchangeable people. He described an architect, tool-builder, documentation person and other roles, which could conceivably include security.

If you took a bright programmer on each team, and had her focus on security issues as a primary responsibility, I think you'd develop a fantastic core of security expertise on project teams. Certainly better than the drive-by security types that dominate the field.

Re:Self-serving horseshit (1)

Bargeld (621917) | more than 6 years ago | (#23152482)

If you took a bright programmer on each team, and had her focus on security issues as a primary responsibility, I think you'd develop a fantastic core of security expertise on project teams. Certainly better than the drive-by security types that dominate the field.
Slowly but surely, I see more companies "getting this". It's been many years since I've had trouble finding "that guy", the bright dev or admin who also gives a shyt about security, who WANTS to be the evangelist, the translator, and work together with infosec from 'go'. The opposition to this approach is usually bureaucratic, rooted in upper management who historically view infosec as adversaries (and to be fair...many security professionals, even experienced ones, HAVE frequently been adversarial and authoritarian).

I see good things at the (grossly large and management-burdened) financial that I'm working for right now. They get it, lately. If this old dog can learn new tricks, there's hope.

Balance (0)

Anonymous Coward | more than 6 years ago | (#23135188)

Life would be simple if all the server and client applications in the world were inherently secure.

Life would also be good if everything were modular.

Imagine every server and client application were written with an SSH server/client layer through which all communication passes. All proprietary and standard protocols could then be encapsulated within an SSH tunnel.
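A sketch of that encapsulation idea: the relay below is a toy, unencrypted byte-copier standing in for the tunnel plumbing (the echo server and ephemeral ports are illustrative; a real ssh tunnel would encrypt the middle leg):

```python
import socket
import threading

def relay(src, dst):
    # Copy bytes one way until the source stops sending. In a real
    # tunnel this leg would be encrypted; the plumbing is the same.
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)

def tunnel(target_host, target_port):
    # Listen on an ephemeral local port; relay each connection to the
    # target in both directions, oblivious to the protocol inside.
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))
    srv.listen(5)
    def accept_loop():
        while True:
            client, _ = srv.accept()
            up = socket.create_connection((target_host, target_port))
            threading.Thread(target=relay, args=(client, up), daemon=True).start()
            threading.Thread(target=relay, args=(up, client), daemon=True).start()
    threading.Thread(target=accept_loop, daemon=True).start()
    return srv.getsockname()[1]

def echo_server():
    # Stand-in for "any server application": echoes what it receives.
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    def run():
        while True:
            conn, _ = srv.accept()
            threading.Thread(target=relay, args=(conn, conn), daemon=True).start()
    threading.Thread(target=run, daemon=True).start()
    return srv.getsockname()[1]

msg = b"hello through the tunnel"
port = tunnel("127.0.0.1", echo_server())
c = socket.create_connection(("127.0.0.1", port))
c.sendall(msg)
buf = b""
while len(buf) < len(msg):          # TCP may split the echo
    buf += c.recv(4096)
print(buf.decode())
```

The application on either end never knows the relay is there, which is exactly the transparency the comment is wishing for.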

Transparent Tech is Better (3, Insightful)

Doc Ruby (173196) | more than 6 years ago | (#23135218)

One advantage of security as infrastructure rather than as products is that infrastructure is the foundation of a service, not just something bolted on afterwards.

The biggest problem with security is that it's added afterwards as a "deluxe feature", rather than integrated with every design and implementation detail. Adding security afterwards means always catching up with the original insecure condition. It means creating an insecure system that the bad guys like, then fighting your own system along with the bad guys while you labor to secure it.

But the "built-in" tech shouldn't become completely invisible. The bundles should be transparent, not closed and opaque, because nothing has a higher risk of insecurity than something unknown that you can't inspect. And no matter how well a vendor inspects their own secure component, if it's properly secured, extra scrutiny can only make it more secure, never less. Leaving it transparent, visible whenever you care to inspect it, is the best, safest tech.

Re:Transparent Tech is Better (1)

yuna49 (905461) | more than 6 years ago | (#23135452)

Along the same lines, my general predisposition is to remove as much responsibility for security from users as is possible. That means scanning email for viruses before they reach the desktop, blocking users from downloading dangerous payloads (like executables) over the web, and so forth. Security should be a part of infrastructure, not something tacked on at the users' end.

Perhaps one reason why it's so hard to figure out what those guys are hawking at the RSA conference is that what they're really hawking is fear. That's been something of a winning strategy on many levels here in the US for the past seven years now.
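The "block dangerous payloads" layer the parent describes can be sketched as a MIME walk over each message with the stdlib email package; the blocked-extension list here is an illustrative assumption, not any product's real policy:

```python
from email.message import EmailMessage

# Illustrative blocklist -- real gateways use longer, curated lists.
BLOCKED = {".exe", ".scr", ".pif", ".bat", ".com", ".vbs"}

def risky_attachments(msg):
    # Walk every MIME part and flag attachment filenames whose
    # extension is on the blocked list (case-insensitive).
    hits = []
    for part in msg.walk():
        name = part.get_filename()
        if name and "." in name:
            ext = "." + name.rsplit(".", 1)[1].lower()
            if ext in BLOCKED:
                hits.append(name)
    return hits

# A sample message with one benign and one risky attachment.
msg = EmailMessage()
msg["Subject"] = "invoice"
msg.set_content("see attached")
msg.add_attachment(b"%PDF-", maintype="application", subtype="pdf",
                   filename="invoice.pdf")
msg.add_attachment(b"MZ", maintype="application", subtype="octet-stream",
                   filename="Invoice.EXE")
print(risky_attachments(msg))   # → ['Invoice.EXE']
```

A gateway would bounce or quarantine flagged messages before they ever reach a desktop, which is the point: the user never has to make the call.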

What you described is a quarantine. (1)

gr8scot (1172435) | more than 6 years ago | (#23155162)

Your network is most likely infected with the Microsoft Windows virus.

Along the same lines, my general predisposition is to remove as much responsibility for security from users as is possible. That means scanning email for viruses before they reach the desktop, blocking users from downloading dangerous payloads (like executables) over the web, and so forth.
Your diligence is commendable, by the way, but if the client machines on your network were running professional-grade operating systems, that would not be necessary. Limited User accounts really should only be able to run executable programs located on a protected partition, which in turn should be writable only by the Administrator.

Security should be a part of infrastructure, not something tacked on at the users' end.
True. And if the operating system isn't computing infrastructure, then ffs, what is?

Connective Awareness? (1)

MessyBlob (1191033) | more than 6 years ago | (#23135280)

Most security problems are a result of misunderstanding the purpose of an object in the infrastructure, and of telling other components lies about its nature (permissions boosting). Bad admins do this with a human face. Poor products do this when out-of-the-box configurations don't match the user's requirements, allowing too much to begin with, or having options that bad admins change inappropriately.

So, how do we do this in a product-based environment? Do we need a new module API, covering anything that communicates, which authenticates its purpose and reconciles it with the policies of the larger infrastructure? Will good admins resent such technology?

Finally, a contentious summary: good admins are needed because of poor products.

get out of jail free card (1)

zogger (617870) | more than 6 years ago | (#23135290)

The vast bulk of ongoing security issues exists because of a single glaring market/government oversight: software is not required to carry a normal consumer warranty. Is it a product like other products, as patents suggest, or is it a work of creative art, as copyright suggests?

I contend that society needs to make a clear distinction between the two and force the industry through legislative action (because voluntary is clearly not working) to choose one or the other, but not both.

If they want to continue to sell products, and to have patent protection, then consumers need protection from them as well in the form of warranties, same as in every other industry that pushes products. Security problems would then start to get REALLY addressed, from the ground up, not patched on like keeping an old bald tire going.

All the other industries out there got dragged kicking and screaming away from the olden days' "caveat emptor" snake-oil era before warranties. They claimed it "couldn't be done," that "the cost to the consumer would be too high," that they just couldn't make products that could be covered by warranties... yet they have. Even with their faults, manufacturing settled down and is still profitable enough, even with forced warranties.

It is well past time software was covered as well.

This is not a new "delicate flower" industry that continues to need subsidy in the form of hand-holding and a "get out of jail free card" for its products. It is now a decades-old, well-established, robust and innovative industry that can finally have its training wheels taken away, stand behind its products, and be forced to code so well that normal warranties can be offered. This would stop the massive release of perpetual betaware with never-ending security and functionality issues. It would separate the truly thoughtful, "engineering first" efforts (the good companies, which would succeed) from the "marketing first, frosted with gibberish, chanting billionaires going neener neener nothing is our fault, check the EULA, hahahaha, sucker!" offerings (the bad companies) that we consumers get to "enjoy" now, and that are succeeding to the tune of tens of billions in the bank from the snake-oil wells they pump.

Re:get out of jail free card (1)

Original Replica (908688) | more than 6 years ago | (#23135532)

forced to code so well that normal warranties can be offered. This would stop the massive release of perpetual betaware that has never ending security and functionality issues, and separate the truly thoughtful and "engineering first" efforts- from the good companies that would succeed

That is true, but it would also raise the price of an OS several fold and require more restrictions to be placed on application designers. Car manufacturers can require that you only use certain high-rated tires for their cars. They have to do this in order to protect themselves from the kind of liability that you are suggesting for software companies. So unless you relish the idea of never using third party software ever again, and having a very short list of "approved software" available only at prices that reflect the increase of liability insurance, then you need to get used to the idea that the user is largely responsible for the integrity of their system. Look at what high exposure to liability has done for the medical field; why introduce that to the software field?

Re:get out of jail free card (1)

Naturalis Philosopho (1160697) | more than 6 years ago | (#23136504)

Expensive? Damn straight. I'd pay $1000 for an OS that was warranted to be secure for my work computers, wouldn't you? Heck, currently I pay orders of magnitude (real orders of magnitude, not market-speak orders of magnitude) more than $1000 for the software on my business servers. Then I'd run the $100 "home edition" on a gaming rig. And a short list of third party software? Bring it on. I'll only run Adobe and Kodak at work, and then put "Uncle Bob's HAckSorS Shareware" on said gaming rig. Kind of like how I have a sedan for the road and a motorcycle for off-road: different environments entail different risks and different limits on the cost expended to ameliorate said risk. While we're on cars... A good car doesn't turn left across traffic on its own; it requires a stupid user/driver to do that. A decent OS won't ask hackers into it; it should require willful ignorance/misuse by the user to infect an OS. Please put limits on application designers. Your post only makes it all too clear that we will have to impose limits, as it's not viewed as profitable for them to police themselves enough to even put out a reliable product, let alone a warranted one.

Re:get out of jail free card (1)

sjames (1099) | more than 6 years ago | (#23167374)

Perhaps you should look at an s390. You'll get the warranty you want, for orders of magnitude more cash. Alas, there is no "home edition". All bets are off if you run Adobe on it; that's a different vendor.

Still needed (1)

zogger (617870) | more than 6 years ago | (#23136558)

The medical profession and insurance and pharma industries needed the slap downs because in the old days they were killing people or maiming them and got away with it. And even despite more scrutiny they are still trying to dodge safety issues, such as using barely knowledgeable academics as a "name brand lead author" on papers (headline article in recent JAMA). Nope, that liability was needed, they brought it on themselves because they refused to self regulate. If they had done it from day one they never would have needed the lawyers sicced on them, but they tried to hide behind white coats and pomposity for decades and finally got called on it when the accumulated evidence of serious malpractice and malfeasance just got overwhelming.

All the other industries have warranties and you can still buy their stuff, so I reject the FUD. I no longer believe the typical knee-jerk indignant reaction, the scare tactics thrown out by the software industry ("Your stuff will cost too much, we can't do it, wahhh!"), nor do I think it is impossible. Yes, it will take one of those "paradigm shifts" in thinking and doing, and that is because it is needed. You know, I read it all the time here: devs in this or that thread complaining about marketing forces (the suits) telling them to ship code that they *know* isn't finished and is still buggy. I would think the dev community would welcome forced minimum standards and minimum warranties, like "suitable for purpose" for being exposed to the internet, etc., to help fight off marketing weasels and the slimeball tactics that permeate the industry, and to actually be able to say they are "engineers" and have it mean something good.

    Until THEY are under the gun of losing it in the wallet, like their customers are daily from using no-warranty products that make security job 9768 all the time, they will just keep pushing out perpetual beta-crippleware and excuses. They make buzillions of dollars; time to man up a little and accept some responsibility for the alleged "software engineering" that goes on and is used to justify tremendous profits and salaries. Everyone else has to, so why should they get a completely free skate? Just "because"? Sorry, maybe 40 years ago, but not today. This is now a mature industry, and they need to collectively be treated like adults in the normal business realm, not "special needs" children.

Now, either that, or just give up on trying to hustle this forever-buggy and insecure crap for serious folding green while demanding "patent" protection, etc. Anything else is just a completely clear-cut consumer ripoff; no other industry out there gets to skate on things by posting some ridiculous EULA. Give it away free, clearly labeled as beta, fine. Start charging serious money for it, different story: it needs a normal warranty.

Re:Still needed (1)

sjames (1099) | more than 6 years ago | (#23157348)

Actually, warranties are NOT going to help and are NOT practical in software as we know it.

For one, when is the last time you have seen anything that absolutely warrants against break-in? Certainly not your car or house. Risks digest has had several postings about keyfobs that unlock several cars in the same parking lot, and even one where the physical key operated an identical car. The dirty secret of home security is that anyone with the ability to kick hard and a hammer can break in and disable most alarm systems. It seems that the majority of home alarms call only when triggered, rather than the much more secure system of signaling when all is well.

A warranty that it will take at least x time to break in is useless for an OS or networking device. All the vendor has to do is say "Hmmm, they must have been probing your network for months to pull that off...", and now go prove otherwise. Then the finger-pointing begins. Can you prove an insider didn't configure something deliberately to open a hole? Can you prove a user didn't approve the break-in?

OTOH, do you really want software that doesn't obey the user? Do you really want Word to refuse to open anything not written by an official MS signed and certified program? Or for Windows to refuse to install anything MS didn't sign? That's exactly what your plan will get you.

So what other metric shall we use? Mandate certification and there will surely be an MS security certification center (by hook or by crook). Guess what they'll say about introducing a Mac or Linux onto your MS-certified MSNetwork?

The closest we have to software that could sort-of meet strict liability requirements is the code running the space shuttle. If you want your word processor and email to do the same, be prepared to either pay millions or take a trip back to the early '70s.

Drugs do cost a great deal more because of the liability issues and testing requirements. In some cases I have seen, the veterinary version costs 100 times less than the human one. Since life is on the line, it's a cost that we simply have to accept there. However, $20,000 OSes with $50,000 word processors is simply not going to fly.

Believe me, I would love to find some metric that could be used to decide that a vendor just hasn't done enough to make their software secure. We all know some software is crapware and other is better. However, even the better software isn't perfect. A half-baked legal liability scheme won't actually fix the problem, it'll just fatten a few lawyers and leave anyone who dares to invoke a compiler potentially liable for unlimited damages.

If you can suggest a metric that can be applied unambiguously so that we can say this bug is inexcusable but that one is understandable, I'd be thrilled to read it. If you can't, then you should have some insight into why we can't treat software like drugs.

Ha! (1)

zogger (617870) | more than 6 years ago | (#23164414)

"However, $20,000 OSes with $50,000 word processors is simply not going to fly." just pulled that out of thin air, you have no actual idea what it might cost, do you? I have an OS and a "word processor" that costs zero and is inherently by past historical track record significantly more secure than OS and word processors that costs hundreds of dollars now.

You want a metric? The rest of all industry has one, and it is very, very simple: you sell something, it is bogus and causes physical or financial harm because it is not "suitable for purpose", and your customer then has a tort action available to them. If you as Joe Browser Shipper ship a browser and OS, then some guy's bank insists on that browser and he gets pwned and loses money, you should be liable for it, plus damages and costs. Get a few cases like that out there and you'd see a lot better code; fewer releases, but better code. The demand for code is out there, it is huge; someone would be able to do a much better job, much better than now with legalized snakeoil, where the best you can get is half-baked, more-or-less-works code with constant security issues. That is the entire main point of this article: security has gotten complex and is now beyond the ken of most people and even most businesses, and it's because the applications and OS they start with are (A) constantly buggy, probably not even realistically into full beta mode, but shipped as "finals", and (B) mostly things that should never be connected to the internet anyway, because they are simply unable to exist there and remain secure. And there are things on the net in common usage that, although capable of doing nifty-cool things, are clearly not suitable to use from a security angle, such as javascript. One of the simplest and most effective ways to make sure you never get pwned is to just turn javascript off.

This is an involved subject, but in essence you want to defend no warranties; I say that it is a normal modern industry and needs warranties like every single other industry out there. They have managed to struggle by and eventually came up with engineering practices that make products "good enough", and have been able to deal with the odd random whoops! It worked out; they claimed it wouldn't, but it did. I can't give you *exact* details of how this would work with software, I am not a coder, this is not my business (I am in farming, food production; we have metric shitloads of rules and regs and standards we have to adhere to), but just looking at everything else out there, it's possible. Heck, even the hardware that software runs on has warranties. The smarter guys would figure it out inside their own specialties of typing and coding and profit from it; the dumber ones would go out of business, as they should.

    I have some hardware thoughts on making more secure machines if you want to go into it. Basically, just talking about generic home users, I don't think most people really need a fully open PC as we have it now; they would be much better served by a locked-down, next-generation, powerful/fast internet appliance (or multimedia server) that respawned a clean OS-and-apps image and ran from RAM, so that every time it was turned on and off it would be clean and bug-free. It could originally load from a locked optical disk that can't be written to. Like you see some places when they do "kiosk mode", but even better than that. In fact, that is going to be my next home-built machine; it will be designed to run like that, because the market only offers general and completely insecure machines that are now so complex even near-experts have to be constantly tweaking and guarding and "patching" them. There were some really bad examples of internet appliances in the past, and they all sucked, but with today's hardware I think you could build a pretty fast and secure one. Basically, most people don't run an OS as such; they don't even know the difference between the OS and the browser. All they know is "mash this for the internet". They run a handful of "things": they want to browse the web, check email, do some chatting, play some vids and tunes or games. I think you could cover that with a locked-down machine that was even easier to use than most stuff out there now and wouldn't be a constant security nightmare. Not to say "power users" shouldn't have completely open machines, of course they could and should; I just think the market is ready for the next-generation internet appliance, as long as it isn't woefully underpowered and stupid like that "web TV" disaster.

Re:Ha! (1)

sjames (1099) | more than 6 years ago | (#23167314)

I based the prices on the guesstimate of 100 times the price. That's the same as the liability markup on drugs, but considerably less than the may-not-fail cost for space shuttle avionics software.

Please name any industry that warrants against criminal acts (such as breaking and entering) committed by a 3rd party (hint: there are none). Since there are none, there are also no metrics for it. Even safes and armored cars don't absolutely warrant that they won't be broken into, only that they will "resist" for x amount of time against a particular attack. If someone comes along next week with a 100,000-degree cutting torch and it cuts the safe like butter, it's not covered under warranty.

Vaults don't warrant against theft if the owner dials the combination for the bad guys (run this attachment? OK CANCEL).

Having said all of that, I (and most of the industry) am well aware that some (inexplicably popular) software is substantially worse than others. If I could find a way to hang a number on it, I'd be the first to advocate doing just that and setting minimum allowable values. I would settle for a way to rigorously qualify it as best, better, average, bad, and legally liable. However, I haven't found a way to do that.

Even the usual fuzzy legal definitions such as "suitability for purpose" would be dreadfully hazy for software right now. People still believe that computers by nature "just crash" sometimes. They keep using them and buying more, so that must be "suitable for use".

Justice demands that we legally define what is required of someone before we hold them responsible for it.

You should dig deep into the various ideas in CS about provably correct software. Alas, the best and the brightest have only been able to accomplish that for trivial software that is really only interesting from an academic standpoint. The effort to expand the principles there to a full-scale general purpose OS would be staggering. By the time such a project was completed, the hardware it was certified for would be long out of production. Then there's the users. Even that provably correct OS will do bad things if the user blindly commands it to run code from who knows where that does who knows what (perhaps with elevated privileges).
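To give a flavor of what "provably correct" means at the trivial end of the scale, here is a toy sketch in Lean 4 (the function and theorem names are made up for illustration): a clamp function together with a machine-checked proof that its result never exceeds the upper bound. Scaling this style of guarantee up to a general purpose OS is the staggering part.

```lean
-- Toy illustration of provably correct code (Lean 4; names are illustrative).
-- `clamp lo hi x` forces x toward the range [lo, hi]; the theorem is a
-- machine-checked guarantee that the result never exceeds the upper bound.
def clamp (lo hi x : Nat) : Nat := min hi (max lo x)

theorem clamp_le_hi (lo hi x : Nat) : clamp lo hi x ≤ hi :=
  Nat.min_le_left hi (max lo x)
```

The point of the example is that the guarantee holds for every possible input, not just the ones a test suite happened to try; the cost is that even this much proof effort is per-function.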

An additional problem is that literally every program on a PC is actually made up of a hodge-podge of parts by different vendors. A PC running a python program using several 3rd party modules gets owned. Who do we blame? The program? Python? One of the modules? libc? The video driver? The OS? Let's look at the hardware. Perhaps the RAM flipped a bit, disabling the NX setting on a page. Some PCI device might have corrupted something with an errant DMA operation. It could even be the chipset itself. Perhaps a subtle microcode bug in the CPU allowed an atomic op to take place non-atomically?

Imagine trying to educate a jury that doesn't know the difference between the application and the OS sufficiently to even sort of understand any of that!

Your idea for a more secure machine is an embedded application, and so IS certainly more securable. The job is easier when you can dictate exactly what is or is not on the machine, and which version. Even there, you're not making the thing "secure"; you're just limiting the damage by reloading on every boot.

"Who do you blame?" (1)

zogger (617870) | more than 6 years ago | (#23178126)

That's an easy one: whomever you handed the cash to for your OS or the third-party application that hosed you. If they in turn turned around and blamed someone else in their vendor stack, so be it; such is the nature of cutthroat predatory capitalism. It is the system we have; the software snakeoil peddlers just want the "caveat emptor" exclusion. So far they have it. Eventually, someone who got really took, has deep pockets, and is finally fed up enough with the ridiculous EULA nonsense is going to break the back of the bugware cartel, and then things will change for the better for both the consumer and the actual coders. As for some almost-do-nothing "shareholders" of macrobugware, inc., I wouldn't give crap one about them and their short-term profits over people's misery and frustration at being forced to endure perpetual betaware. You can type your fingers to the nubbins in defense of crapware, but the fact remains: they are the last so-called "industry" out there that isn't required to have warranties, yet they want full and complete and extensive legal protection for their profits, trademarks, patents, copyrights, "IP", etc. My opinion, and your responses just intensify it, is that it's a half-scam industry that has grown up thinking it is "special", with every excuse in the book to prove it, so now defending the selling and shilling of bugsqueezings because "that's the best they can do" is hard-coded into their corporate and personal DNA. Well, so far, yeah, it appears so; there's no actual quality-as-job-1 out there that I can see. Closed source is "good enough to look like it works, ship it". Open source is "we know it is always broken someplace, but it's free, ship it fast and often". No other options: expensive betaware, or free betaware.

  How would you like every other industry out there to have the same deal; would you be feeling lucky then? Would you even come close to trusting your food and water and electrical appliances and cars and so on, if all of those guys were allowed to just post some ridiculous disclaimer that "this product is not suitable for purpose", and so on? Do you just want totally free and unrestricted trade with no forced warranties, no inspections, pure caveat emptor? Or just for software? You have a vested interest in that; it is your job, perhaps?

    Now personally, I *used* to pay for software, for years and years. I even paid for all my shareware excepting one that turned into abandonware with no contact info (which makes me a rather odd person, to be sure, I guess). I don't pirate a thing, but now I use free and Free open source, so I have no recourse over the stuff I paid zero money for if it screws up. That's my tradeoff: I stopped being willing to pay rather decent sums for two cents of digital copies of half-baked stuff, and I am willing to accept perpetual betaware as long as I don't have to pay for it. If I did, though, bet your bippy some snakeoil-peddler jerks would have been in court a long time ago, the first time I suffered any loss due to the lack of quality in some typed-up alleged "product". I haven't suffered a loss, because I always refused to use "the big gorilla" or any applications that even touched the big gorilla; I just shy away from obvious pure manure, like it always has been.

    It is going to happen someday, bet on it, all the businesses out there who have gotten burnt and reburnt over the years with crapware...some big billionaire boss is just going to go ENOUGH and get the ball rolling in court and challenge this exclusion, or some powerful senator or something.

And he is going to win.

  Your worst-case scare scenarios notwithstanding, the jury and/or judge is going to go "this expensive software stuff is pure crap, they lie through their teeth constantly and make billions, their expert witnesses are using smoke and mirrors and razzle dazzle, so... for the plaintiff!"

Enjoy the good times and phat checks while they last; someday it is going to be smaller checks for more real work that has some quality to it by law, same as everyone else (if that applies to you; excuse me if I ass-u-me wrongly). This subject is just such a major annoyance to me: why people put up with getting ripped off daily. It is simply amazing. It's just another major drag on the economy that doesn't need to exist right now. I really don't like seeing people get frustrated and forced into the anti-missile-missile anti-missile defense with the crapware that comes "stock" on the bulk of the hardware out there. Entire industries revolve around trying to make it halfway work on the internet. It's the "broken windows fallacy" economic argument, and that is just a dismally rank way to think of a sound economy, and there is no credible legitimate defense for that sort of arrangement.

Re:"Who do you blame?" (1)

sjames (1099) | more than 6 years ago | (#23178524)

Just so you know, I agree a lot of software is crap, and some of it, in addition to being insecure, is also unfit for its purpose. I'm just saying that in order to bring law into it, there must be legal standards.

I *KNOW* that whoever wrote the crap part is to blame; I'm not stupid. I'm saying that if *you* buy an OS from one place, pay someone else to install python, and then buy my python program from me and install it, who gets the blame when you get hacked? You'll probably blame me, and it'll cost me a bazillion dollars to prove it was a crappy version of libc that the python consultant didn't update when he installed the package (against that package's recommendation).

That means I'll need some heavy insurance, and my prices will go up to reflect the premiums. YOU will be the one who gets to pay for it.

I might reduce my premiums by licensing my software for use on a machine with exact versions of the system files and absolutely nothing else on it, but you (and everyone else) will no doubt end up voiding the warranty in order to have a useful computer. Surely you don't expect to hold ME responsible when the mish-mash that is your computer has a problem.

Again (*PLEASE* read this carefully): no product in the world is liable for the result of someone deliberately trying to break in. Crooks "exploit" doors all the time. Door makers are not held responsible.

On the other hand, if you want a refund for the crappy email program and OS that can't seem to operate correctly for 2 consecutive hours on anyone's computer, I would be more inclined to agree with you. We just have to find a way to translate "everyone knows xyz is a pile of crap" into a legal standard that judges and juries can understand.

The latter is important. There has to be a way to filter out the shouts of defective from users of fax modems who hold the document up to the screen and press send. There especially has to be a way to filter out the id10t who says format c:...yes....yes, please, wipe out my data and make my computer never boot again.....OH MY GOD, that piece of crap fried my computer! It won't even boot!

Personally, I write a good bit of free software for HPC and don't get phat checks (but I do get to like who I see in the mirror). I think the warning gcc issues when gets is used should be made an error instead. People who pass user input directly to SQL without escaping things like semicolons should have their hands broken. I'm 100% for increased standards for software.
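The SQL point above is concrete enough to sketch. Instead of splicing user input into the query string, let the driver bind it as a parameter, and the semicolons never become SQL. A minimal illustration using Python's standard sqlite3 module (the table and the input string are made up for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

# A classic injection attempt.
user_input = "alice'; DROP TABLE users; --"

# Unsafe habit (the one being criticized): building the query by hand, e.g.
#   "SELECT * FROM users WHERE name = '" + user_input + "'"

# Safe: the ? placeholder makes the driver treat the input as data, not SQL.
rows = conn.execute("SELECT * FROM users WHERE name = ?",
                    (user_input,)).fetchall()
print(rows)  # no user literally has that name, so no match; table survives
print(conn.execute("SELECT count(*) FROM users").fetchone()[0])
```

The same placeholder discipline applies to any database driver; escaping by hand is exactly the habit that keeps producing injection holes.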

I just don't think the courts or the general population are up to speed enough with software for bringing lawyers into it to take us anywhere good.

When I can presume that people understand that installing random software from obscure websites is bad the same way auto makers can presume people know the car won't drive itself if they hop into the back seat for a snooze, we'll be just about there.

As a side note, I have been 100% Linux since '96 when I convinced my employer at the time to switch. The winning argument was quality and stability.

How do you fix it then? (1)

zogger (617870) | more than 6 years ago | (#23181926)

We have this huge security industry that by default is always one step behind the level it needs to be at. There's little to no accountability anywhere, though. If no one is at fault for designing and pushing bad products, then why bother with the security at all? It never actually works all that well "in the field"; the existence of huge botnets proves this. And I think it is because software releases with no accountability behind them encourage just more of the same. At a minimum it should be clearly labeled: such-and-such is suitable for exposure to the internet, such-and-such is not.

I ran Mac classic for years with little to no worries, despite hoots of derision from my windows friends that it was a "toy" system; yet they were the ones who had constant security issues and I had none. It was just designed differently, and even taking numbers out of the discussion, it was inherently much more difficult to get root or ownership of classic than of the wide-open MS design (I never used linux back then so cannot comment). AFAIK, if you had sharing turned off, to this day there is still no remote code execution pwnership possible; none I have heard of anyway, never been done or shown. For example, sub7 could run as a client (attacker) but not as a server (pwned victim). That wasn't "obscurity"; it was because they couldn't figure out how to get root when getting root was made near impossible by design up front, which would have made it more practical to offer a warranty at the time for "suitable for use on the internet". I paid for that security, and for knowing that by gum if something said it ran on mac, it sure did (ease of use; I fooled with windows, major icky stuff, and I don't game, so that eliminated any need to run windows). And once you grokked extension sets and adjusting your ram usage app by app, it ran just fine, with hardly any worries and no need for bug detection, firewalls, etc. To me, that shows more is possible than current levels of coding deliver; things have gone backwards to a great degree (maybe open BSD is the exception there), and when they went to the less secure osx, I just went ahead and switched to free linux, as I was not going to pay for a digression (and my last mac machine wouldn't even run osx for that matter).

I see the willingness to have stuff that is perhaps faster outweighing security concerns, and I just don't agree with that. And given that I have no legal or practical protection whatsoever from *any* operating system or software being offered to joe regular consumer (all of it contains the "neener, neener, nothing is our fault, sucker!" disclaimer), I had to go with cheap/free as the best defense and most practical way forward.

Back to the appliance concept: I still think that is the easiest way to make internet surfing more secure. If there is nothing to write to except RAM, and that is further locked down with permissions (even to the point of making the browser its own user), that would bring it closer to truly plug-and-no-need-to-pray, which is where they need to be for most folks' usage. I don't think computers as they sell them now will ever be made secure until vendors switch philosophies and treat them as application appliances.
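The respawn-from-a-pristine-image idea can be sketched in a few lines of shell (the paths are made up, and a real appliance would use a tmpfs/RAM mount and read-only media rather than plain directories): every "boot" rebuilds the runtime image from the golden copy, so any tampering from the previous session simply evaporates.

```shell
#!/bin/sh
# Sketch only: GOLDEN stands in for a locked, read-only master image;
# RUNTIME stands in for a RAM-backed (tmpfs) working copy.
GOLDEN=$(mktemp -d)
RUNTIME=$(mktemp -d)

# Master image, written once and never touched again.
echo "browser v1.0 (pristine)" > "$GOLDEN/app"

boot() {               # every power-on rebuilds the working copy
    rm -rf "$RUNTIME"
    cp -r "$GOLDEN" "$RUNTIME"
}

boot
echo "malware payload" >> "$RUNTIME/app"  # session gets compromised...
boot                                      # ...but the next boot wipes it
cat "$RUNTIME/app"                        # back to the pristine image
```

The session is disposable by construction; persistence is exactly what the attacker is denied, which is the whole point of the appliance model.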

Re:How do you fix it then? (1)

sjames (1099) | more than 6 years ago | (#23194162)

There is one phenomenon I have no explanation for. If we can figure that out, many of the stability and security issues would solve themselves.

MS (primarily, but others as well) repeatedly announces new improved versions just like Lucy holding the football for Charlie Brown. Like Charlie Brown, users everywhere for some reason fall for the hype and believe that the result will be different this time in spite of decades of history. When MS announces a new release I am nearly to the point of actually hearing the AUUUUUUGH.....WHUMP!

The once again defeated MS users all complain bitterly right up to the point of an alternative suggestion. Then they make the excuse that they must run XYZ. Sometimes that's a valid reason, but often XYZ turns out to be entirely replaceable like Outlook or Word. For that matter, the way many people use Word, they could as easily use notepad.

As long as that remains true, MS will not receive the punishment in the market that will force them to do better.

Something many people don't realise is that by some miracle, nobody has ever released a truly malicious virus. Sure, they've done some damage and pumped a lot of spam, but the former is prankish vandalism and the latter is merely self-serving.

I mean a real killer. Do nothing to degrade performance, avoid excess network activity, then after months wipe the drive and erase the BIOS everywhere at the same time. Perhaps even worse, start sending private documents to random places on the net or randomly sending emails practically certain to cause a sexual harassment suit.

WRT the appliance, I agree that many people might be better off with that than a general purpose PC OS.

Re:get out of jail free card (1)

Skapare (16644) | more than 6 years ago | (#23137264)

It's one thing to make an OS fully secure. It's something else entirely to make it enforce security on other products. I want the former and not the latter. It is then my responsibility, delegated to the makers of the applications I add on, to make sure the applications themselves are secure. The OS only needs to provide the necessary facilities that applications might need. If an application specifically allows anyone that can reach that computer to login and erase crucial files, that is an issue of the application, not the OS. If the OS provides useful tools that the application can make use of to do this in a more convenient way (for example, convenient to the computer owner to specify who is allowed to login to more than one application, done at a single point of control), that is a good thing. But forcing all applications to use that is a bad thing.

An OS, even a secure one, should not have any kind of "approved software" that is a requirement for the software to work.

If someone wants to avoid the issue of multiple vendors pointing blame at each other for an undiagnosable issue, then they need to find an integration vendor who is willing to take responsibility for the packaging of the OS and the application together. What already comes in the OS can be considered integrated. Maybe some application vendors are willing to "integrate" with the OS and support all issues if you have their application installed.

But I do not want some OS telling me I can't use some application just because it thinks the application may not be secure, or the application vendor didn't pay the OS vendor to become approved. I may even want to write my own application.

The same should apply to drivers. The OS vendor is off the hook if you use such a driver. But it shouldn't prevent me from doing so.

Duh... use Macs. (0)

Anonymous Coward | more than 6 years ago | (#23135332)

Because Macs are known as arguably 100% secure, free from any issues that plague Windows or other UNIX systems, I don't see why deploying Macs should not be an integral part of any organization who values security.

The "Gay" Computer (0)

Anonymous Coward | more than 6 years ago | (#23136276)

Where's Father Randy "Pudge" O'Day when you need him?

However, security is not like power (1)

ladybugfi (110420) | more than 6 years ago | (#23135420)

While I agree in principle that security should be embedded as a core component in the services sold and purchased, I hope organizations realize security cannot really be bought simply, like "...and add 1kW of power, thank you".

The correct amount and nature of security is very much relative to the risks the organisation is facing. Those risks are dependent on the kind of business they're doing and also on their business model.

However, as a security professional I still see people who say "It must be ... mmm ... secure" when I ask what their security requirements are for a particular target. They would be quite ready to purchase "security as infrastructure" and not think about security at all, but unfortunately in that case their organisations would eventually face an EPIC SECURITY FAIL.

No amount of "security as infrastructure" will help if organisations do not have a good risk management and analysis framework or do not understand what kind of security they need and how much. If they don't understand it, they cannot ask it of the vendors and thus they will get either nothing or something random.

Re:However, security is not like power (1)

ZonkerWilliam (953437) | more than 6 years ago | (#23144608)

No amount of "security as infrastructure" will help if organisations do not have a good risk management and analysis framework or do not understand what kind of security they need and how much. If they don't understand it, they cannot ask it of the vendors and thus they will get either nothing or something random.
I've only encountered a few companies that could even implement anything like "best practices" for security. Why? Because INFOSEC is currently seen as a cost to the company without any revenue attached, like most of IT, only worse. When you're blocking traffic from a poorly written application the company depends on, or from a misconfigured Windows clustered server, INFOSEC gets blamed for the outages. Because it's the one part of IT that actually does its job, the rest of IT sees security as something preventing them from doing theirs. Can't win for losing.

God I love Schneier (1)

fxer (84757) | more than 6 years ago | (#23135424)

he seems to be the only person that consistently "gets it." Does he need a surrogate to carry his children?

Security doesn't work that way (1)

Jaime2 (824950) | more than 6 years ago | (#23135490)

You can't take care of security at the infrastructure level. Insecure products can be built on a secure infrastructure. Commercial software will continue to force users to run with elevated permissions. New document formats and communications channels will provide new places for malware to hide. Infrastructure cannot police end-to-end secure tunnels.

Unless everyone participates in security, the system is not secure. As we learned years ago, a password can be purchased for a candy bar. Millions of AOL email accounts will be sold for a few hundred thousand dollars by a low end tech with the permissions to do so.

I have been troubled for years by the tendency for organizations to have a "security department". As soon as you take security off your developers' plates, they immediately start writing un-securable software. Same goes for administrators: if they buy security instead of doing it, they are going to cause problems.

Re:Security doesn't work that way (1)

Iamthecheese (1264298) | more than 6 years ago | (#23139622)

No, we learned how many people are either willing to write down their password or lie about it for a chocolate bar.

Security is not infrastructure (1)

ZonkerWilliam (953437) | more than 6 years ago | (#23135618)

There's a desire for information security to be "easy" and automated, but when it comes down to it, that's the last thing information security can be, at least until AI is perfected. It will always take a human who understands the technology and the implications of network/workstation-based attacks on both the small and the large scale. There is simply too much complexity in today's networks for any single device/application/solution to deal with effectively without human intervention.

They don't want to have to become IT security experts

Maybe not, but someone will have to be, no matter what.

Security a problem for someone else? (1)

BartMan57 (1276392) | more than 6 years ago | (#23135638)

This is interesting... are we actually thinking security is separate from the underlying applications or services being implemented? Security is an element of the solution we provide to our customers, or, if you're an internal IT shop, to the end-users. Sure, there are components that are purely infrastructure items IT uses to secure an environment, such as IDS/IPS, anti-virus, firewalls, etc. Maybe this Slashdot post shows us a symptom of the lax security posture technology companies tend to take when developing a solution. DO THEY THINK SECURITY IS A PROBLEM TO BE SOLVED BY SOMEONE ELSE?

And what do these companies do, besides cry WOLF? (3, Interesting)

kscguru (551278) | more than 6 years ago | (#23135812)

From TFA:

I can't figure out what any of those companies do
Anyone doubt this? Let's take a tour through a few products that "make you more secure":
  • Antivirus: works by scanning files as they are read from or written to disk, where "scanning" means "run ~1 million instructions in an emulator, then see if the result matches a virus pattern". Requires weekly updates to the latest definitions. One of the most successful "security" products.
  • Static code analysis tools (e.g. Coverity). They take your source code, run a heavy-duty static analysis program on it, and point out memory leaks / double frees, uninitialized variables, and other flaws. My educated guess is that 1/3 of viruses involve such a problem. Useful, but to a manager, you can find a different 1/3 of flaws with a manual code audit that costs about as much.
  • Windows Vista (yeah, ha ha). Includes improved account control and privilege separation! Except that most users get so sick of the Allow box that is required for so many things on Windows that Vista has NOT fundamentally increased security.
  • Network intrusion detection appliance - you plug this into your network, and it does something when it detects a malicious access pattern - I dunno, maybe it bakes cookies? But detecting malicious access patterns makes you more secure!!!
The security product that takes off will be one that says "with product X, you will never experience security problem Y". Unfortunately, the security products out there are crap (product X decreases chances of problem Y from 1% to 0.01%) and security folks are the most paranoid about providing any guarantees. (Use the word "impossible" at a security conference and watch what the blogosphere does to you. I dare you.)
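One of the bug classes named in the static-analysis bullet (uninitialized or possibly-unassigned variables) is easy to demonstrate. The toy function below is an illustrative sketch, not anything Coverity-specific; the point is that the flaw is visible from the code's structure alone, without running it:

```python
# Minimal example of a use-before-assignment bug of the kind static
# analysis catches without executing the code: on one path, 'data' is
# never assigned before it is returned.
def read_config(path):
    if path.endswith(".ini"):
        data = open(path).read()
    return data  # flagged: 'data' may be referenced before assignment

try:
    read_config("settings.json")
except UnboundLocalError as e:
    print("caught:", e)  # the runtime symptom the analyzer predicts
```

A static analyzer reports this at audit time; at runtime it only surfaces when the unhappy path is actually taken.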

In other words: most security products provide a small marginal gain, while their vendors tout them as essential, must-have products.

The single most telling "security" trait I have seen is from the security group at my employer. They send out a feature proposal, and then flame anyone who disagrees with it by saying "if you don't agree to this, we'll probably get hacked next year and it will be your fault for being against the security of our products!". Never mind the technical flaws (ASLR doesn't work when you map 1GB of contiguous memory in a 32-bit process) or performance implications. Security "sells" based on fear, and the security industry sales arm has yet to realize they have cried WOLF too many times for purchasers to take them seriously anymore.
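The ASLR aside can be made concrete with back-of-the-envelope arithmetic. The figures below (a ~2GB user-mode address space and 64KB allocation granularity) are typical 32-bit Windows assumptions, not measurements:

```python
import math

# Rough entropy estimate for ASLR when a 1GB contiguous region must fit
# in a 32-bit process; all sizes are typical assumptions.
user_space  = 2 * 2**30   # ~2 GB of user-mode address space
mapping     = 1 * 2**30   # the 1 GB contiguous region to be placed
granularity = 64 * 2**10  # 64 KB allocation granularity

slots = (user_space - mapping) // granularity
print(slots, math.log2(slots))  # 16384 possible bases ≈ 14 bits of entropy
```

And that is a best case: fragmentation from other mappings shrinks the usable range further, so an attacker who can retry a few thousand times defeats the randomization.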

Re:And what do these companies do, besides cry WOL (1)

ZonkerWilliam (953437) | more than 6 years ago | (#23136958)

Ahh yes FUD (Fear, Uncertainty and Doubt) The previous INFOSEC company I worked for was all about that. Best sales technique they had. It's definitely a self-perpetuating meme, that lately, companies have started to ignore.

Re:And what do these companies do, besides cry WOL (1)

base3 (539820) | more than 6 years ago | (#23138774)

Good. It's about god-damned time that "security" ceased being a magic word that made money and organizational power come from the sky for those who uttered it.

Re:And what do these companies do, besides cry WOL (1)

sgtrock (191182) | more than 6 years ago | (#23142918)

Static code analysis tools (e.g. Coverity). They take your source code, run a heavy-duty static analysis program on it, and point out memory leaks / double frees, uninitialized variables, and other flaws. My educated guess is that 1/3 of viruses involve such a problem. Useful, but to a manager, you can find a different 1/3 of flaws with a manual code audit that costs about as much.
I'd argue that if your software is important enough to deserve a thorough manual audit, you should probably do both, as they tend to catch different sorts of problems. Witness all the code cleanup that has been done in FOSS code on the basis of bugs found through Coverity's DHS-funded code scanning service. Other than that, I'm pretty much in agreement with what you say.

Didn't he mean security is becoming a commodity? (1)

prxp (1023979) | more than 6 years ago | (#23136302)

Didn't Schneier mean computer security is becoming a commodity ("infrastructure" sounds rather vague)? Is that really a bad thing? I mean, security is such an essential part of everything that it really is supposed to be a commodity, IMHO. Nevertheless, I disagree with him: it is very hard to embed security for all aspects into all products, so you're always going to need supporting tools or services that complement the security of the product you are interested in (like antivirus software complements operating systems). Also, as long as there's security, there's someone trying to break it. This means that even if you embed enough security in a product, that security might eventually be broken some time in the future, and again you're going to need some supporting tool or service to protect you. Especially because these breaks often aren't just related to the specific implementation of some security technique, but to the fundamental principle the technique is based on (like what we have seen happening to CAPTCHA systems and hard-disk encryption products, and the implementation of attacks that were considered impossible before). The notion of security becoming a commodity is hardly acceptable, let alone a reality.

Re:Didn't he mean security is becoming a commodity (1)

Skapare (16644) | more than 6 years ago | (#23137156)

Embedding security in other products may be hard (I don't entirely agree with this), but it is what is essential. Security should not be a separate product.

For example, if you have a router between your LAN and your link to the internet, that router should be performing the security function for you. If you want to block certain ports from being connected to via the internet, block it there. If you want to establish a VLAN tunnel to another office, you could do it there.
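The port blocking described above reduces to a rule table the router consults for each connection attempt. A toy sketch (the port list is an illustrative assumption, not a recommendation, and real routers do this in the forwarding path rather than per-call):

```python
# Toy model of the per-port blocking a perimeter router performs.
# The blocked set is an example only: telnet, RPC, NetBIOS, SMB.
BLOCKED_INBOUND_PORTS = {23, 135, 139, 445}

def allow_inbound(dest_port: int) -> bool:
    """Default-allow policy with an explicit block list.
    Many sites invert this: default-deny with an explicit allow list."""
    return dest_port not in BLOCKED_INBOUND_PORTS

print(allow_inbound(443))  # True: HTTPS passes
print(allow_inbound(445))  # False: SMB is dropped at the edge
```

The point of the comment stands either way: this logic belongs in the router you already own, not in a separate "security" box bolted on beside it.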

To the extent that any separate product can make something else more secure, that something else could have been made just as secure on its own. Don't confuse this with separate kinds of security that should be in different products.

Much of the problem is how things are marketed. Something marketed as a firewall may well really be a full router that can be a drop in replacement to an insecure router. But it should be marketed as a router that also happens to include state of the art security in the way it operates.

There is a market for separate security devices and tools only because existing products are just not secure. This is simply a reflection of the bad state of affairs of way too many products. For example, if Windows were secure, there would be no market for add-on security tools and products.

From TFA (2, Insightful)

techno-vampire (666512) | more than 6 years ago | (#23136568)

No one wants to buy security. They want to buy something truly useful...

And there you have it, ladies, gentlemen and slashdotters, the problem in a nutshell. People don't want to buy security because they don't think it's useful. And then what happens when their site gets defaced or their database hacked? They blame the admins, that's what. They never, ever admit that it happened because they wouldn't pay the price needed to secure their machines, they just blame somebody else for not keeping them safe even though they didn't have the tools to do the job.

Re:From TFA (1)

gr8scot (1172435) | more than 6 years ago | (#23155218)

They never, ever admit that it happened because they wouldn't pay the price needed to secure their machines, they just blame somebody else for not keeping them safe even though they didn't have the tools to do the job.
First, you admit that the price of keeping those machines secure exceeds the total value of the machines. As with any commodity, we blame the manufacturers of defective products for the damage done using those products for their advertised use. It's only Microsoft shirking their responsibility here, not Microsoft customers.

Re:From TFA (1)

techno-vampire (666512) | more than 6 years ago | (#23155318)

First, you admit that the price of keeping those machines secure exceeds the total value of the machines.

No I don't. Security software and the extra time to install, upgrade and maintain it isn't anywhere near that expensive, and if it is, it shouldn't be. Of course, we're probably talking Windows here, where security is nothing more than an afterthought tacked on at the last minute. If we're talking Linux, Unix or some other real OS, it's largely built in from the ground up, making your claim even less accurate. Security isn't free, but it's nowhere near as expensive as you make it out to be.

Re:From TFA (1)

gr8scot (1172435) | more than 6 years ago | (#23158902)

That was an imperative, not a declarative sentence.

First, you admit that the price of keeping those machines secure exceeds the total value of the machines.

No I don't. Security software and the extra time to install, upgrade and maintain it isn't anywhere near that expensive, and if it is, it shouldn't be.
You overestimate the "value-add" of the crappy machines then. "Security" should be an adjective we use to distinguish good software from insecure software. Any product that does require separate "security software" to become realistically usable for its advertised functions would not succeed in a free market any better than doors that unlock from both sides without a key.

Of course, we're probably talking Windows here, where security is nothing more than an afterthought tacked on at the last minute.
No, I was and am certainly talking about Microsoft. I specified that twice -- in the same sentence in fact. And yes, with Microsoft, "security is nothing more than an afterthought tacked on at the last minute." +1

If we're talking Linux, Unix or some other real OS, it's [security] largely built in from the ground up, making your claim even less accurate.
But we're not, and that "even less accurate" claim is not mine. I deliberately named Microsoft, twice, for the exact reason that they don't build their software securely, "from the ground up." If you want to say nice things about Linux & Unix all day long, I probably won't interrupt or ever disagree. Mainly, because they don't sell themselves as a convenient out-of-the-box experience. They're up-front overall about things like hardware requirements and the level of expertise necessary, and what they do promise, they deliver pretty well.

Now, back to the topic, Microsoft: who's paying and who's receiving money in this picture? Who, then, is responsible to deliver a useful product and who, by centuries of common law, has an implicit right to expect a product worth what was paid? Your "blame the victim" comment was and still is sickening. I won't be sidetracked by weaknesses in technical arguments you fabricate and try to attribute to me, tv.

Problem is not in infrastructure (2, Interesting)

Iagi (546444) | more than 6 years ago | (#23136842)

Most of the bad things that happen to users these days happen because they clicked a link to a web site that installs malicious code. It seems the largest security problem is that end users do not want to take the minimal necessary precautions (for whatever reason). It makes no sense to me to try to build a "fool proof" infrastructure. The problem resides with end users and their computers, since most computers (especially MS) like to use the internet to install software/updates. The problem is not going to go away by tweaking the infrastructure. Also, the internet was designed for connectivity and interoperability; trying to move security into the infrastructure will obviously mean giving up on these.

Why do we even have that lever? (2, Insightful)

argent (18001) | more than 6 years ago | (#23137260)

Why do browsers even have a "run malicious code" function?

In "The Emperor's New Groove" there is a running gag where someone pulls the wrong lever and falls through a trap door into an alligator pit, then returns dripping water and kicking away alligators and asking "Why do we even *have* that lever?"

Why does Firefox have a mechanism to install extensions to Firefox from within a Firefox window?

Why does Internet Explorer have a mechanism to run native code downloaded from a website?

Why does Safari have an 'Open "Safe" Files after Download' option?

Why doesn't Microsoft provide a way for browsers to launch and pass parameters to helper functions that doesn't require them to guess how the helper function's quoting mechanism works?

Why do we even HAVE these levers? These are all obviously bad designs.

Every other plugin you install in a browser can be installed by downloading it and running it as an application. Why does Firefox have to implement a mechanism to allow a web page to request that an XPI installer run?

ActiveX and other mechanisms based on using "security zones" to allow the HTML control to guess whether it's being asked to run a plugin that Windows Update needs instead of one that's going to install spyware are inherently insecure. Why doesn't Windows Update, for example, run as an application and provide its extensions to the specific instance of the HTML window that needs them, instead?

Apple has finally turned 'Open "Safe" files' off by default. This tiny increase in security is probably the best news I've heard in web security in a year... which is kind of sad. The underlying problems with helper function bindings are still there in OS X and Windows, alas.

Finally, Microsoft's POSIX subsystem actually includes "exec", the UNIX system call that is available on other platforms to avoid the quoting problems that the corresponding Windows call has. Unfortunately you can't use that call from Win32 programs, and they haven't implemented the equivalent in the past 15 or so years that it's been there. Why not?
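The quoting hazard this comment keeps circling can be sketched in Python, whose `subprocess` module exposes both styles. Passing arguments as a list (like Unix `execv`) hands each one to the child verbatim; building a single command string forces the parent to guess the child's parsing rules, which is exactly where browser helper bindings go wrong:

```python
# Sketch of why argv-style spawning sidesteps the quoting problem.
# The filename deliberately contains shell metacharacters.
import subprocess
import sys

filename = 'evil name; with "quotes" & metacharacters'

# Safe: each element of the argv list reaches the child process unparsed,
# so there is no quoting layer for the parent to get wrong.
out = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.argv[1])", filename],
    capture_output=True, text=True,
)
print(out.stdout.strip() == filename)  # True: the argument arrived intact
```

With a flat command string (the Windows `CreateProcess` style the comment criticizes), the same filename would have to be re-quoted by the parent according to rules it can only guess at.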

Re:Problem is not in infrastructure (1)

threat_or_menace (746325) | more than 6 years ago | (#23138136)

I'm kind of stumped by who Schneier (and some readers in this thread) think attends RSA. I went last year, and it did not look to me like an end-user conference. It looked to me like it was a lot of people from companies large enough to have more than one person doing IT, and a company of that size is offering security as infrastructure to its users.

Are they doing it well? It's all over the map. Are they at least aware that they're doing it? One hopes so. But most of the attendees that I saw were clearly folks from large enough shops that they were thinking about dropping a lot of dollars on security.

If you work in IT at a company, and your users are downloading malware, you're not securing your gateway properly. Lots of ways to do this. IPCOP has a lot of good stuff for doing it in a free Linux distribution. You can go with a commercial product dedicated just to filtering, or that bundles in filtering, firewalling, and even spam.

If you offer a lot of services over the public internet, you need a larger IT staff and a security department. Or you need to be able to buy security as a service - for instance, the services of a reputable e-commerce site to handle transactions. And at that point, you are buying their infrastructure, and you probably aren't at RSA, but their staff are or ought to be. Whoever signs the checks for your e-commerce contract probably ought to be doing some due diligence around this point.

Now, the sales types may not have any idea what they're selling at RSA. Many did last year, but has this joke already been beaten to death?

What's the difference between an IT salesman and a car salesman?

A car salesman knows when he's lying.

What is an end user to do? Well, IPCOP is certainly a way to go. Set it up, set up a subscription to Dansguardian for some protection against malware URLs, turn on the IDS chunk so you get Snort telling you once you've screwed up. For end-users, Bluecoat has a free subscription to their list of categorized sites (called K-9) that a friend at the office thinks is very good. (He's got kids, and doesn't want to use the internet as an unrestricted babysitter.)

But the best venue to learn more about this approach is going to be your local LUG, not RSA. I don't know many people who can afford to throw around four to five figures for rackmount appliances for their homes. And my God it makes your home theater sound a lot worse.

Is it 1998!? (1)

v(*_*)vvvv (233078) | more than 6 years ago | (#23139440)

The reason security infrastructure sells is the same reason why security books don't. It is the same reason we want air bags, not driving lessons.

No one wants to learn anything, especially if it has nothing to do with the task at hand. We want it to just work, and it should.

Just prevent it, don't make us think about it unless you want some of us to make mistakes.

Security is a business decision (0)

Anonymous Coward | more than 6 years ago | (#23141992)

I know the opinions of MBA-types like myself are not always appreciated on here (hence I'm an AC), but I'm just going to throw this out there...

Security is a business decision.

If the probability of a security failure times the cost of that failure is less than the cost of the security measure, then you generally don't implement it.

I think a big part of the issue here is that management has a much better sense of the cost of a security failure than the security team does, and NO ONE really knows the probability of a security failure. The only thing certain is the cost of the security measure.
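The decision rule above is just an expected-value comparison. All the figures in this sketch are made-up assumptions, which is exactly the commenter's point: the probability input is the one nobody actually knows.

```python
# Expected-loss comparison behind the "security is a business decision"
# rule. Every number below is an illustrative assumption.
def worth_implementing(p_failure, cost_of_failure, cost_of_measure):
    """Return True if the expected loss avoided exceeds the measure's cost."""
    expected_loss = p_failure * cost_of_failure
    return expected_loss > cost_of_measure

# Assume a 2% annual breach probability, a $500k breach cost,
# and a $25k/year security measure: expected loss is $10k, so skip it.
print(worth_implementing(0.02, 500_000, 25_000))  # False

# Nudge the guessed probability to 10% and the same measure is justified.
print(worth_implementing(0.10, 500_000, 25_000))  # True
```

Note how the answer flips entirely on `p_failure`, the one parameter that is pure guesswork.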