
Thinking of Security Vulnerabilities As Defects

timothy posted more than 5 years ago | from the doesn't-everyone-already-think-that dept.


SecureThroughObscure writes "ZDNet Zero-Day blogger Nate McFeters has asked the question, 'Should vulnerabilities be treated as defects?' McFeters claims that if vulnerabilities were treated as product defects, companies would have an effective way of forcing developers and business units to focus on security issues. McFeters suggests providing bonuses for good developers, and taking away from the bonuses of those who can't keep up. It's an interesting approach that, if used, might force companies to take a stronger stance on security-related issues."


158 comments

Of course vulnerabilities are defects (5, Insightful)

ardle (523599) | more than 5 years ago | (#23984733)

If they weren't, they would be in the program design.

Thread over. (5, Funny)

Anonymous Coward | more than 5 years ago | (#23984747)

Thread over on the first post. Well done.

Re:Of course vulnerabilities are defects (5, Informative)

kesuki (321456) | more than 5 years ago | (#23984953)

But then what do you call design features like Windows networking telling you whether you got the first letter of your password right, even without the rest of the password, and then letting you do the same for the next letter, and so on?

It was a feature of early Windows networking to do just that: people might 'forget' their password, so they would 'need' a feature that told them, letter by letter, whether they were getting warmer. Hackers had a FIELD day with various 'features' of Microsoft products.
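
To make the flaw concrete, here is a minimal C sketch of that kind of check (hypothetical, not actual Windows networking code): because the comparison reveals how far the match got, an attacker can confirm an n-character password over a k-character alphabet in roughly n*k guesses instead of k^n.

    #include <stddef.h>

    /* Hypothetical sketch of the flaw described above -- not actual
     * Windows code. The check reveals *where* the first mismatch
     * occurred, so an attacker can lock in the password one
     * character at a time. */
    int check_password(const char *guess, const char *secret)
    {
        for (size_t i = 0; secret[i] != '\0'; i++) {
            if (guess[i] != secret[i])
                return (int)i;   /* leaks: "your first i characters were right" */
        }
        return -1;               /* full match */
    }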

Re:Of course vulnerabilities are defects (1)

ardle (523599) | more than 5 years ago | (#23985237)

Well, in support of my earlier statement, I'd have to call it a really dumb feature ;-)
I see what you mean, tho - hope you understand where I was coming from: if the program is behaving unexpectedly (as opposed to operating as designed but in a context the designers didn't intend), then it's a defect, and a defect that poses a security risk is still a defect.
For the sake of a pithy post, I kept clear of grey areas ;-)

Re:Of course vulnerabilities are defects (3, Insightful)

magisterx (865326) | more than 5 years ago | (#23985093)

Sometimes they are. When a vulnerability is a technical exploit of something subtle, those cases are always bugs and defects and should be treated as such. But in many other cases there is a trade-off involved in getting that additional security; there is very often a balancing act between usability and security.

This example is not ideal, since it does not involve software design, but it is analogous and one that I have personally seen happen. Consider a small company where literally everyone knows everyone, and let us say it is a highly technical company, so we can assume that everyone knows the basics of what they are doing. They may choose to give everyone system administrator privileges on the database and give everyone a domain admin account, even if they normally log in on a lower-privileged account. They have absolutely no security, in the sense that any user can mess up the system in any way they choose. But they also have none of the usability problems that come with security. They never have to wait for the network admin to be available if some setting needs to be changed, and never have to worry about a user not being able to get a document they need.

As the company grows, this will become unacceptable. But once security is layered on, you now have to make sure that everyone has the right permissions to read the documents they need. It adds layers of overhead and costs usability. It increases security, but that security comes at the price of tremendous man-hours for a select few domain admins, and often forces users to wait for a designated admin when they need basic things like software installed.

Was the lack of security and the potential vulnerabilities a defect or a design flaw for the small company?

For something more immediately applicable, there can also be a trade-off between security and efficiency. For instance, I write a lot of SQL scripts that use dynamic SQL. Adding code to protect against SQL injection requires more of my time to write, more of the computer's time to process, and makes the code more complex for whoever has to maintain it. It comes with a trade-off. When I write something that will be used by a broad audience, I always favor security, but when I am writing scripts that will only be used by my internal team, I often favor efficiency and readability.
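
To make that trade-off concrete, here is a rough C sketch using SQLite's C API (the "users" table and find_user() are made-up examples, not anything from the poster's codebase). The dynamic-SQL version is a one-line string format; the injection-safe version costs the prepare/bind/finalize dance the comment describes.

    #include <sqlite3.h>

    /* Rough sketch of the trade-off above; table and function names
     * are hypothetical.
     *
     * Unsafe dynamic SQL would be a one-liner:
     *   snprintf(sql, sizeof sql,
     *            "SELECT id FROM users WHERE name = '%s'", name);
     * and name = "' OR '1'='1" then matches every row.
     */
    int find_user(sqlite3 *db, const char *name)
    {
        sqlite3_stmt *stmt;
        int id = -1;

        /* Safe version: the input is bound as data, never parsed as SQL. */
        if (sqlite3_prepare_v2(db, "SELECT id FROM users WHERE name = ?",
                               -1, &stmt, NULL) != SQLITE_OK)
            return -1;
        sqlite3_bind_text(stmt, 1, name, -1, SQLITE_TRANSIENT);

        if (sqlite3_step(stmt) == SQLITE_ROW)
            id = sqlite3_column_int(stmt, 0);

        sqlite3_finalize(stmt);
        return id;
    }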

Clearly, there are cases where a vulnerability is a definite defect, but there are other times when vulnerabilities are consciously accepted for usability, performance, or code maintainability reasons. I will agree that performance and code maintainability become less compelling when it is a commercial product being sold, but they can be major factors within a company where you can often give the user at least a certain minimum level of trust, and usability can be a valid reason to consciously permit small security risks in even a commercial package.

Re:Of course vulnerabilities are defects (2, Insightful)

turbidostato (878842) | more than 5 years ago | (#23985977)

"Was the lack of security and the potential vulnerabilities a defect or a design flaw for the small company?"

How can somebody twist a simple concept into such a contorted one?

A defect is nothing more and nothing less than something not working as expected. If something is there by a conscious decision, it's a feature; if something misbehaves, it's a defect. It's as simple as that. Really.

Now, on defects: if something works as designed but the designers didn't think in advance of a given (misbehaving) situation, it's a design defect (in your example, if somebody misuses his admin rights and the boss finds that unacceptable *now*, that means his security design was flawed; if he answers "well, these things happen, let's move on", then it's a feature). If something doesn't work as designed and misbehaves, it's an implementation defect. If something works as designed and the designer doesn't consider the behaviour a problem, then it's a systemic defect (either an unethical seller or an idiot/uninformed buyer).

And that's all.

NO! Other Way Round! Defects _are_ Vulnerabilities! (2, Insightful)

refactored (260886) | more than 5 years ago | (#23985307)

It's the other way round, dammit!

The vast majority of security vulnerabilities are merely exploits of defects!

How do you hack a system? Find a bug - that's usually pretty easy....

Then you have the system already operating "not as the designer intended", and you're more than halfway there... just add a bit of creativity and malice aforethought.

Re:Of course vulnerabilities are defects (2, Insightful)

TubeSteak (669689) | more than 5 years ago | (#23985449)

Of course vulnerabilities are defects

If they were defects in the eyes of US law, they'd be considered a material defect or a design defect under existing contract or product liability law, respectively.

There are a few possible outcomes from such a scenario:
A) Nobody writes software anymore because they'd be sued into oblivion
B) Prices go up because coders & companies have to buy software defect insurance
C) Prices go up because companies spend more in labor to produce defect-free code
D) The EULA lists every possible failure scenario (plausible or not) in the interests of full disclosure and business continues as usual

Re:Of course vulnerabilities are defects (2, Insightful)

turbidostato (878842) | more than 5 years ago | (#23985987)

If problems with cars were defects in the eyes of US law, they'd be considered a material defect or a design defect under existing contract or product liability law, respectively.
There are a few possible outcomes from such a scenario:
A) Nobody builds cars anymore because they'd be sued into oblivion
B) Car prices go up because builders & sellers have to buy car defect insurance
C) Prices go up because companies spend more in labor to produce defect-free cars
D) The EULA lists every possible failure scenario (plausible or not) in the interests of full disclosure and business continues as usual

Well, I don't see the car business being in bad shape lately.

Re:Of course vulnerabilities are defects (2, Insightful)

tchuladdiass (174342) | more than 5 years ago | (#23986187)

Except, when I bought a new car, there was a small defect in the paint job -- a nearly unnoticeable paint bubble. I'm sure that every car that comes off the lot has a blemish somewhere. It doesn't cause the car to crash, and life goes on. The same thing ain't true with software -- that same "blemish" could easily be turned around to let someone break into the software.

The only way we could get comparable results with software vs. physical objects is if computer systems develop the ability to withstand a certain percentage of defects without adverse effects. Kind of like how a few bolts on a bridge can be bad, because of the way it is engineered: if a section calls for 5 bolts, they put in 8 bolts just to be safe. There aren't a lot of practical ways to do that with software, at least not without a performance trade-off.

Re:Of course vulnerabilities are defects (0)

Anonymous Coward | more than 5 years ago | (#23986703)

The only way we could get comparable results with software vs. physical objects is if computer systems develop the ability to withstand a certain percentage of defects without adverse effects. Kind of like how a few bolts on a bridge can be bad, because of the way it is engineered: if a section calls for 5 bolts, they put in 8 bolts just to be safe. There aren't a lot of practical ways to do that with software, at least not without a performance trade-off.

The only place I see this being even remotely applicable is buffer overflow attacks, and even then it's not a 100% secure method: once I know that a bridge is designed to handle 2000 tonnes but the sign at the entrance says "1500 tonnes," I can still put 2500 tonnes on it to break it. All it does is increase the hacker's work.
So IMO there is no gray area: it is black or white. We could assign "pseudo-shades," saying there is a 99% probability that it is 100% secure against attacks, but that is just marketing.
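
A minimal C sketch of the point, with hypothetical names: over-provisioning the buffer is the bigger-bridge approach and only raises the attacker's required work, while an explicit bounds check removes the failure mode entirely.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical sketch of the bridge analogy above. */

    void greet_bigger_bridge(const char *name)
    {
        char buf[64];               /* "8 bolts instead of 5": a longer    */
        strcpy(buf, name);          /* input still overflows -- the margin */
        printf("hello %s\n", buf);  /* only raises the attacker's work.    */
    }

    void greet_load_limit(const char *name)
    {
        char buf[64];
        snprintf(buf, sizeof buf, "%s", name);  /* hard limit: truncates, */
        printf("hello %s\n", buf);              /* never writes past buf. */
    }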

Re:Of course vulnerabilities are defects (0, Troll)

mrsbrisby (60242) | more than 5 years ago | (#23986621)

E) All software becomes GPL; you can fix defects yourself, or hire anyone to fix them.

Entertaining liability is only material because companies hold a monopoly on the "right" to fix defects - whatever that means, whether it be mere "annoyances" or outright failures of engineering.

This essentially makes it a PR problem, and with the low costs and even lower expectations of the modern software world, you could sell a life-support machine that killed its patients 100% of the time.

Remove the monopoly, and you'll see higher-quality software for cheaper - with the best software people working harder for more money, and the rent-a-coders becoming completely obsolete.

Re:Of course vulnerabilities are defects (1)

jc42 (318812) | more than 5 years ago | (#23986865)

D) The EULA lists every possible failure scenario (plausible or not) in the interests of full disclosure and business continues as usual

Actually, IBM discovered all this decades ago. One of the ongoing jokes in IBM systems, dating at least from the 1960s, is about a customer documenting and reporting a bug, only to be pointed at page 485 in volume 17 of the documentation, where exactly that behavior is documented. And since it's documented in the manual, "It's not a bug; it's a feature" and won't be fixed.

So maybe the only effect of such a proposed law would be to increase prices by forcing all software vendors to provide copious documentation. If you've ever dealt with the documentation for IBM mainframe software, you won't necessarily be made happy by the prospect. You might be horrified.

Re:Of course vulnerabilities are defects (1)

slarrg (931336) | more than 5 years ago | (#23985831)

Sadly, most managers only recognize a defect if it is something likely to be noticed by their customers. Unless your customers are hackers, they'll never discover the problem, so there's no need to test for it.

The biggest problem, in my opinion, is that management as a whole is generally "success-based", which means failures are swept under the carpet or otherwise ignored. This means the company as a whole never learns from its mistakes or avoids them in the future. Everyone runs around talking about how great everything is and how wonderful they are at their jobs, but no one ever analyzes anything to see if the project was successful by its original definition.

Instead, as the project grinds on, the scope is limited, features are removed, timelines extended, budgets exceeded, etc. Yet every manager beams about the unbridled success of the finished product after the lowered success conditions are met.

Vulnerabilities are not always defects (1)

jc42 (318812) | more than 5 years ago | (#23986813)

If they weren't, they would be in the program design.

"It's not a bug; it's a feature."

That old joke isn't always a joke. Some vulnerabilities are built in, because they were put there intentionally by the designers and/or developers.

This is one of the primary arguments behind Open Source: If you can't get at the code and study it, you don't have any idea what "special features" might be hidden in there. The people who built it could have provided all sorts of back doors for exploitation by themselves or by other customers who have paid them for knowledge of how to get into your system.

Needless to say (but I'll say it anyway), one of the names for such non-bug vulnerabilities is "DRM", which refers to hidden code that prevents you from using your system as you'd like to use it.

Consider also the recent stories about software in Vista and earlier Windows releases that does automatic updates to some system components, even when you think you've turned off automatic updates. This misfeature is there explicitly to allow Microsoft (or anyone who pays them for access) to replace parts of your system with new code. It's pretty clear that this is a "vulnerability", but it's not a "bug" or "defect", because it was intentionally designed into the product.

No (3, Funny)

blargfellow (948805) | more than 5 years ago | (#23984773)

Of course they aren't defects, they should be treated as features!

Re:No (1)

trolltalk.com (1108067) | more than 5 years ago | (#23984997)

Since we already call "bugs" defects ... umm ... on second thought ...

Seriously: security faults, "bugs", "features" that aren't, etc., are defects. They're mistakes. Errors. They didn't just crawl in there accidentally, no matter how much we strive to give them independent life by calling them "bugs" or "random behaviour". Not on deterministic systems like computers.

No - they go beyond application level defects (4, Interesting)

devloop (983641) | more than 5 years ago | (#23986059)

The elephant in the room is that primitive, unsafe tools endlessly perpetuate these problems. Buffer over/underflows are not difficult problems to solve at the language-design level, but the common tools we currently use to create applications make diagnosing and fixing them rocket science. C and C++ (and other, lesser-used languages) are notorious for being hostile to catching these problems at compile time, or to debugging them when they happen later. In most cases the problem goes "unnoticed", affecting unrelated functions downstream in the application, and the incorrect behavior or crash happens at a later time, when it can no longer be traced back to the original cause.

For kicks check http://en.wikipedia.org/wiki/Buffer_overflow#Choice_of_programming_language [wikipedia.org]
Google search on http://www.google.com/search?hl=en&q=%2Bbuffer+%2B%22overflow%7Coverrun%7Cunderrun%22&btnG=Search [google.com]
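
Here is a small, made-up C example of that "unnoticed downstream" pattern: the out-of-bounds write compiles and runs without any diagnostic, and the damage only shows up later, in code that never touched the buffer (the exact behaviour is undefined and layout-dependent).

    #include <stdio.h>

    /* Made-up example of the silent-corruption failure described
     * above. The overflow is undefined behaviour; on a typical
     * layout the stray byte lands in the adjacent field -- no
     * warning, no crash, just wrong behaviour later. */
    struct session {
        char buf[8];
        int  is_admin;          /* typically sits right after buf */
    };

    int main(void)
    {
        struct session s = { "", 0 };

        for (int i = 0; i <= 8; i++)   /* one byte too many */
            s.buf[i] = 'A';

        /* Downstream code that never touched buf now misbehaves. */
        printf("is_admin = %d\n", s.is_admin);
        return 0;
    }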

Re:No - they go beyond application level defects (1)

mrsbrisby (60242) | more than 5 years ago | (#23986667)

What a load of crap. FORTH is as primitive and unsafe as they come, and FORTH programmers don't have to deal with over/underflows the way incompetent C++ programmers do.

If users knew that there was a correlation between competence and bug-free, problem-free code, they'd stop accepting crap. Instead, there are a lot of programmers - some good, some second-rate - defending bugs and security problems as mere accidents at worst, the kind everyone makes.

Instead, we have this culture that has convinced the user to accept liability for the failures and weaknesses of the programmer. We have entire companies that sell additional buggy and broken software that promises "protection" from other buggy and broken software - locking the user into an otherwise invisible extortion racket.

Stop letting idiots near a compiler and the bugs will go away. Seriously.

Potholes (0)

Anonymous Coward | more than 5 years ago | (#23984785)

Duh, that's like asking if potholes should be considered defects in a road.

won't change a thing (0)

Anonymous Coward | more than 5 years ago | (#23984805)

You'll just have more disclaimers stating "this product does not provide secure services" on products that will continue to dominate the market because they are cheaper.

Unless you make security defects criminally punishable you will get no traction at all.

Criminal negligence (1)

tepples (727027) | more than 5 years ago | (#23984877)

Unless you make security defects criminally punishable you will get no traction at all.

I'd imagine that this is already the case for banks, payment processors, medical facilities, and the like.

No. They'd get sued (1)

techmuse (160085) | more than 5 years ago | (#23984855)

They'd get sued out of existence for shipping defective products. I can't see any company agreeing to label its products in such a way.

Re:No. They'd get sued (5, Informative)

nine-times (778537) | more than 5 years ago | (#23985109)

The article (at least in my reading) isn't saying that companies should be held legally accountable for selling a defective product. Instead, it's about how companies should approach a bug report of a vulnerability. He's saying: when someone reports a vulnerability, consider it something that you're obligated to fix, not a feature request.

But then, I think most people do. It sounds like he just hit a bad support person.

I ran into a similar problem once with Citrix, actually. Their software was relying on some library that it assumed was installed, even though recent Linux releases (at the time) had stopped using that library. The result was that the software didn't work until you tracked down that library, dropped it in the right place, and then it worked fine.

So I went to their website to give feedback, just to let them know. I mean, I'm sure they would have figured it out, but I thought, "may as well give them a heads up", because it was happening on major Linux distros almost a year after their release. Citrix had released several updates to their software and never fixed this problem. I couldn't find anyplace on their website to provide feedback, except for a form to give feedback about the website itself.

So I wrote up a little feedback, trying to explain the situation briefly (i.e. "I wanted to drop some feedback to your development team letting them know there's a problem and how to fix it, but I can't find any contact information on your website. Is there any way to submit this sort of feedback?"). The response came back quickly: "If you want support, you'll have to pay for a support contract."

I wrote back again, trying to explain, "No, see, I'm not looking for help, I'm trying to be helpful. I'm letting you know that there's a problem I already know how to fix. I was just wondering if there was a place to submit this sort of feedback."

Again, the response came in, "I'm sorry sir, but if you want us to help you with this problem, you'll need to buy our support contract."

At that point, I gave up.

Re:No. They'd get sued (0)

Anonymous Coward | more than 5 years ago | (#23985945)

Well then, the moral of your story is: don't buy/use Citrix's products. (I feel for you, and I've done the same when this has happened to me -- i.e., tried to help out -- but most of the proprietary world doesn't care.)

Re:No. They'd get sued (1)

turbidostato (878842) | more than 5 years ago | (#23986017)

"when someone reports a vulnerability, consider it something that you're obligated to fix, not as a feature request."

Why would any company in the world do something like that?!

Oh, yes: only if they are legally or financially forced to, that's how.

Or do you think any company in the world would raise its production costs for no benefit?

Re:No. They'd get sued (1)

arth1 (260657) | more than 5 years ago | (#23985981)

Very true. Also, the bonus system won't work, for two reasons:
1: Security bugs tend to be discovered years down the road.
2: A lot of companies would have to start paying bonuses to programmers, and not just to S&M (sales & marketing). That'd never float.

wait...what? (1)

yakumo.unr (833476) | more than 5 years ago | (#23984869)

There are companies that DON'T treat security vulnerabilities as defects??

Re:wait...what? (1)

clarkkent09 (1104833) | more than 5 years ago | (#23985209)

Yeah, weird article. At the last company I worked for (Oracle), any security issues were filed and treated in every way the same as any other bug. That seemed the natural thing; why would you treat them as something separate?

Re:wait...what? (3, Funny)

jd (1658) | more than 5 years ago | (#23985245)

Microsoft don't seem to treat security vulnerabilities. Mind you, they don't seem to treat defects either, so I guess the two are still treated the same.

Re:wait...what? (0)

Anonymous Coward | more than 5 years ago | (#23986593)

Nice troll; so Patch Tuesday each month doesn't exist? Dork. Microsoft actually spends a lot of effort fixing security vulnerabilities. They don't treat them like other defects; vulnerabilities get a LOT higher priority.

Companies DO consider them defects (0)

Anonymous Coward | more than 5 years ago | (#23984873)

The last company I worked for (a Fortune 100 company) considered them defects.
There was a concerted effort to track down and eliminate buffer overflows and other common security flaws.

Then again, after all the cutbacks (including the ones that cost me and my coworkers jobs) I'm sure it's not really a priority any more.

Why is this even a question? (5, Interesting)

EdwinFreed (1084059) | more than 5 years ago | (#23984883)

We've treated potential vulnerabilities in our products, even extremely minor ones, as defects for over two decades now. And we have always given them very high priority.

To the best of our knowledge we've never had a remotely exploitable vulnerability, but even so, we've gone so far as to scrap thousands of freshly pressed CDs a day before releasing them, because I spotted a way to get root access through a tricky bit of business with shared libraries. (And that was something spotted internally - no customer ever reported it.)

The real question isn't whether to treat security vulnerabilities as defects - of course you do - but, somewhat paradoxically, whether or not to treat them as security vulnerabilities. We were acquired some time ago and have now adopted (and adapted to) the various more complex procedures typical of a large company. There's this little box you're supposed to check in our current bug-reporting system that says "this is a security vulnerability". The problem is that checking that box fires up a whole lot of extra process that rarely helps and can actually hinder prompt resolution of the problem and getting the fix into customers' hands.

Re:Why is this even a question? (1)

HalAtWork (926717) | more than 5 years ago | (#23985577)

Why is this even a question?

Because it leads to the line of thought that one might hold a company liable for these defects, the repercussions of which would imply large changes in the approach to software development, in the support of existing software, and in how we regard the current state of widely used software.

The real question isn't whether to treat security vulnerabilities as defects - of course you do - but, somewhat paradoxically, whether or not to treat them as security vulnerabilities.

I think that's a different question altogether...

Re:Why is this even a question? (5, Interesting)

Anonymous Coward | more than 5 years ago | (#23985679)

We've treated potential vulnerabilities in our products, even extremely minor ones, as defects for over two decades now. And we have always given them very high priority.

Go ask your corporate legal counsel what would happen if the law treated software vulnerabilities as design defects.

Re:Why is this even a question? (1)

Kjella (173770) | more than 5 years ago | (#23986815)

Go ask your corporate legal counsel what would happen if the law treated software vulnerabilities as design defects.

Umm... nothing? Last I checked pretty much every EULA disclaimed any liability for defective design along with everything else.

Vote with your money (1)

jonaskoelker (922170) | more than 5 years ago | (#23984897)

Here's a summary of the article:

Vendors should make their developers work more on security (via money)

Meh. How often are the developers free to choose which parts and aspects of their companies' software they want to work on? As long as the companies tell their developers what to work on, here's the easy way: tell the devs to work on security testing and fixing. Letting the developers manage themselves is not going to sit well with management types, so you almost always have developers who are told what to work on.

Also, if anything external to the way you work (i.e. the promise of more money) can make you work better, you're slacking off in your daily work: why don't you deliver peak performance without the extra money?

Generic unrelated subject (2, Insightful)

jonaskoelker (922170) | more than 5 years ago | (#23985055)

Everybody, please laugh at the subject of my post which has no relation to its contents ;)

What I meant to write when I wrote the subject is that, from the point of view external to the organization developing insecure software, you are, according to the wisdom of the /. masses, supposed to vote with your wallet.

Yet, how's that expected to take place? To apply some of Schneier's observations: you have multiple parties, each with their own security agenda; the sysadmin might want the most secure option because anything less will be a nightmare to maintain, whereas the PHB will want the cheapest because that'll make him look good in the eyes of those who set his salary.

Guess who makes the purchasing decision. Guess which security agenda will be reflected in that decision. Sometimes, the insecure option will be the cheapest even when the cost of bad security has been factored in.

Also, consider the fact that writing "perfectly secure code" is hard and time-consuming, and thus expensive. Given that it's hard enough to write reasonably non-buggy code when there's enough of us, what does that predict for security issues? Now add in the variability in skill level of the developers, and the varying experience with the particular code base they work on.

Re:Vote with your money (2, Insightful)

fastest fascist (1086001) | more than 5 years ago | (#23985117)

Also, if anything external to the way you work (i.e. the promise of more money) can make you work better, you're slacking off in your daily work: why don't you deliver peak performance without the extra money?

There are two ways to look at performance vs. compensation. Employees, ideally (at least from the employer's viewpoint), will look at it the way you do: you're being paid to do your best, so you should need no extra incentive to do so. Project management, on the other hand, should be pragmatic about it. Sure, employees SHOULD do their best no matter what, but maybe cash incentives can add motivation. If that turns out to be the case, a good manager will choose results over principles.

Re:Vote with your money (1)

magisterx (865326) | more than 5 years ago | (#23985267)

Also, if anything external to the way you work (i.e. the promise of more money) can make you work better, you're slacking off in your daily work: why don't you deliver peak performance without the extra money?

Once again, due to trade offs.

I have had jobs where, for several months at a time, I worked 15+ hours a day, every day - and that was the minimum; some weeks exceeded it substantially. That job got the very best I could give. It was a job where whether people lived or died could be affected by the quality of my work, and the pay was commensurate with the hours. During that period I did very little that was not related either to my job or to the physical maintenance of my body. I did not take college classes, did not get certifications, never saw my family in person, and spent only a limited amount of time on phone/e-mail keeping in touch with them. That was my peak performance.

At my current job I work between 45 and 65 hours a week depending on the office tempo and do not at all consider it slacking off (if my boss does, he hasn't said anything). I am not making anywhere close to the pay I was making at the other job, but I see my family, have completed computer certifications, and am preparing to start grad school part time in a month. If a company wants me to give up my time with my family and my ability to pursue personal development, I can certainly be persuaded, but it requires the right motivation. For me to do that again, it would take either a situation where people's lives/health were on the line or a salary high enough to compensate me for what I am giving up, or preferably both.

Yes (1)

maz2331 (1104901) | more than 5 years ago | (#23984929)

Yes, they are defects.

That said, there's one caveat to it: WHO'S defect is it? If the defect comes from a licensed library, OS issue, or other hidden cause, then that defect belongs to the source author/vendor.

Sometimes you end up having to work around someone else's crap.

Re:Yes (1)

Sique (173459) | more than 5 years ago | (#23984961)

But that's no different from any other product. Buy defective capacitors from a manufacturer for your product, and your product blows up because of them: it's your responsibility to recall the product and replace the capacitors.

Recall, fix, and sue... (1)

maz2331 (1104901) | more than 5 years ago | (#23986195)

...the source of the crapacitors for fraud.

Re:Recall, fix, and sue... (1)

Sique (173459) | more than 5 years ago | (#23986261)

That's an afterthought. First you are responsible for making your product safe, even though you didn't manufacture the capacitors.

Re:Yes (0)

Anonymous Coward | more than 5 years ago | (#23985255)

WHO IS defect is it? YOU ARE IS defect it is!

Intentional misuse (2, Insightful)

asc99c (938635) | more than 5 years ago | (#23984955)

If a user was intentionally misusing software I had written, I wouldn't consider it a bug. Although exploiting a vulnerability is generally misuse by someone other than the owner of that piece of software, I'd still have to conclude it's not a bug. If I'd built a car, I would be more than a little annoyed to get the blame because someone had broken into it and run someone else over with it.

I think it needs to be left to the market to decide what is acceptably secure software. Many Ford cars from the early 90s had locks that were far too easy to break - just stick a screwdriver in and it opens - I even did it myself when I locked the keys in the car once. They got a bad reputation, and Ford improved the security to a level the market was happier with.

The market in software doesn't work quite as well as for cars unfortunately, but that's another issue.

Re:Intentional misuse (2, Insightful)

John Hasler (414242) | more than 5 years ago | (#23985119)

You might want to read up on merchantability [findlaw.com], implied warranty [wikipedia.org], and fitness for use. These legal concepts apply to cars and other tangible goods but not to software. They should.

Should they? (1)

XanC (644172) | more than 5 years ago | (#23985175)

These legal concepts apply to cars and other tangible goods but not to software. They should.

Would that not destroy hobby software, and much of OSS and Free Software along with it?

Re:Should they? (1)

John Hasler (414242) | more than 5 years ago | (#23985483)

No. The UCC is about commerce, and consequential damages can be and usually are disclaimed. If I make a free gift to you of a copy of my software and it turns out to be buggy how can you sue me for selling something defective? On the other hand if you sell a copy of my (GPL) software to someone else they should be able to sue you if it proves to be defective.

Re:Intentional misuse (1)

asc99c (938635) | more than 5 years ago | (#23985377)

They do apply to software. EULAs, in most places at least, are pretty much unenforceable nonsense. If software doesn't do its job, you can return it for a refund.

Most software vulnerabilities are triggered by feeding in random or intentionally malicious data. Continuing the car analogy: I can go and pour sugar (or worse) into a car's fuel tank and it will break the car (or explode and kill someone, depending on what I used). That isn't a bug. If I do this to my own car I'm completely to blame for my actions, and if I do it to someone else's car it's a crime.

The same should apply to software. In fact, I think there are already more demands on software than on tangible goods. It's pretty easy to get confused and put petrol in a diesel car - especially if you've got one of each :) It's an easy mistake to make, but the same sort of thing in software would be treated as a bug.

Re:Intentional misuse (1)

John Hasler (414242) | more than 5 years ago | (#23985517)

> They do apply to software. EULAs, in most places at least, are pretty much unenforceable
> nonsense. If software doesn't do its job, you can return it for a refund.

I'm not talking about EULAs. I agree that they are largely unenforceable. As far as I know, in the US what is warranted is the tangible goods: the physical copy. If the CD is scratched you can insist that they make you whole by replacing it with a good one or refunding your money, but the software can be buggy as hell as long as the copy arrives on your computer intact.

Re:Intentional misuse (1)

dvice_null (981029) | more than 5 years ago | (#23985133)

> Although exploiting a vulnerability is generally misuse by someone other than the owner of that piece of software, I'd still have to conclude it's not a bug

So suppose you wrote, e.g., an FTP server, and a user set it up so that anonymous users have no write permission (read only). Then someone, as an anonymous user, sends an invalid command which causes a buffer overflow and corrupts the whole file system. You would not consider that a bug, because someone had merely misused the server by sending a non-standard command?

Re:Intentional misuse (1)

asc99c (938635) | more than 5 years ago | (#23985727)

I suggest leaving it to the market to decide precisely because of this sort of issue - an FTP server designed to be openly accessible shouldn't have this kind of vulnerability. Users wouldn't accept this sort of problem, so yes it should be treated as a bug.

Even so, sending a few KB of malicious data to the server is hacking, and the legal 'blame' for it should rest with the hacker, not the developer.

But not all software is subject to this requirement. Probably the majority of applications developed run within firewalled corporate networks and do not need to be designed with security as such a prime factor.

Re:Intentional misuse (1)

mrsbrisby (60242) | more than 5 years ago | (#23986727)

I suggest leaving it to the market to decide precisely because of this sort of issue

Justify that.

This issue is presently a public-relations problem with a public-relations solution. So far, the strategy has been to convince users that they did something wrong and need more software. This works because users tend to be under-educated, and because competent and incompetent programmers alike stay lock-step on message: programming is hard, you can't do it, and everyone makes mistakes.

I reject this premise, so I reject that the market should figure this out. We don't let just anyone practice medicine and let the "market figure it out", nor do we let just anyone practice law, or teach at a university; even the sale of many goods is impossible without licensing, understanding, and oversight.

Programming is hard; people have to be very clever, and have a very strong background in critical thinking, rigorous reasoning, and the scientific method. Any programmer who says otherwise is incompetent; competent programmers do not write buffer overflows. Period.

Re:Intentional misuse (1)

paratiritis (1282164) | more than 5 years ago | (#23985521)

If I'd built a car, I would be more than a little annoyed to get the blame because someone had broken into it and run someone else over with it.

It depends. If the user left the car unlocked, then sure, blame the user. If the car was stolen because it had a defective lock (either one you manufactured, or because one of your suppliers messed up), then yes, it is your fault.

I think it needs to be left to the market to decide what is acceptably secure software. Many Ford cars ... were far too easy to break

A bit too simplistic, I think. Take prescription drugs. You want these tested and approved because the average buyer (patient or even doctor) is not qualified to judge their efficacy, so they cannot judge cost/benefit accurately. They need standards to guide them. Similarly, in software, neither users nor general IT professionals can judge the full security properties of all available products. Again, they cannot judge cost/benefit accurately, therefore they also need standards to guide them. Failure to meet those standards should then be considered a defect.

Re:Intentional misuse (1)

asc99c (938635) | more than 5 years ago | (#23985773)

I think this is a good point. I can't really think of any other goods of the same level of complexity as software that are not regulated and produced according to strict standards. However, the cost of failure in software is relatively low - at least in the sense that it can usually be measured in dollars rather than lives.

Re:Intentional misuse (1)

paratiritis (1282164) | more than 5 years ago | (#23986063)

I agree with you here. I was talking about the principle, not the severity of the faults. And probably that is the reason there is not much pressure to change this.

Of course there are exceptions: the space shuttle, nuclear reactors, even banks that move billions each day. These can have some pretty costly (in terms of both life and money) failures. But they are so few that they cannot pull the IT industry in this direction. They make great headlines and are then mostly forgotten.

Bonuses for good developers? (1)

Frekko (749706) | more than 5 years ago | (#23984979)

Bonuses based on code "quality" and/or quantity have been tried before. They simply do not work. Counting the number of security issues some developer introduces does not necessarily tell you whether he or she is a good developer. It depends on several factors:
- how difficult or simple the code is
- the amount of code written
- where in the application the developer works (some code is more likely to produce security issues)
- how well that person knows the code

Absolutely (3, Interesting)

cryptoluddite (658517) | more than 5 years ago | (#23985009)

As a software developer, I spend about a quarter of my time rewriting code that one of our other developers writes. His code looks like a rhesus monkey came in and started flinging shit all around. He 'keeps up' with the other developers because he does the absolute minimum: never rewrites code to fix problems, cuts and pastes, etc. One time he cut and pasted a second copy of a 200-line function just so he could change one loop constant.

There are lots of developers like him, and they and/or their companies should get sued over that code - at least when it comes from negligence. Or there should be a licensing requirement... something so that the people who are irresponsible or incompetent are held responsible for it.

Pretty much the only thing that makes programming not worthwhile is that people can hack out 80%-working code, get credit for it, then move on and leave all the crap for competent developers to fix. I would gladly pay a malpractice-insurance fee if it meant less bullshit code to deal with.

Re:Absolutely (1)

Dahamma (304068) | more than 5 years ago | (#23986065)

Tell me you are not so desperate for a job that you would spend 25% of your time fixing a coworker's mistakes? Bring it up with your manager, have his faults explained to him, document it over a month or so, and fire him if he doesn't get better. If the company doesn't agree to hold your coworkers accountable for their work, leave. There are plenty of other companies that do.

In fact, if you haven't already tried the above suggestions, the problem is almost as much your fault as anyone else's.

Re:Absolutely (1)

cryptoluddite (658517) | more than 5 years ago | (#23986463)

Tell me you are not so desperate for a job that you would spend 25% of your time fixing a coworker's mistakes? Bring it up with your manager, have his faults explained to him, document it over a month or so, and fire him if he doesn't get better.

Because the other 75% of the time I get to code awesome stuff. He's friends with the CEO and went to college with a good portion of the company, so licensing or liability would help -- but your suggestions would not, Mr. Know-It-All.

Customers (1)

xstonedogx (814876) | more than 5 years ago | (#23985017)

No matter what tricks you try to use to get your developers and others to focus on security issues, it is going to cost money. Denying bonuses won't help because your developers can always leave and work for a competitor who doesn't play that game. And you'll still have to fix those vulnerabilities.

The solution is to ask your customers. Given the choice between a more secure, more expensive product and a less secure, less expensive product, which will your customers choose? Once you have the answer to that, you'll have the answer to whether you should think of security vulnerabilities as defects or a price-reduction feature.

Black robes and Black Hats (0)

starglider29a (719559) | more than 5 years ago | (#23985033)

I dare say that MOST developers do NOT know how to combat the most heinous, or even the most common, hAx0Rz tricks. I've never triggered a buffer overflow in my life (on purpose).

To make software bulletproof requires the developers to have the skills of the hackers and malware writers - and the resources, the secret handshakes, the underground culture.

Yer talking one of two scenarios here:
  1. A PhD in software security (by the time you got it, it'd be obsolete).
  2. Sending "nice" developers to infiltrate the black-hat world.

Email me when that works out for ya~

Re:Black robes and Black Hats (3, Insightful)

dvice_null (981029) | more than 5 years ago | (#23985247)

Actually, you really need just one person in the company with "haxor" skills to test the security of the products that others make. A single person can very quickly find a lot of common holes. That person doesn't need to be a developer. He/she can be there just for testing, or even just to supervise others who do the testing, to make sure that they test for security vulnerabilities too.

TFA Author is Inexperienced (5, Insightful)

awitod (453754) | more than 5 years ago | (#23985059)

"The problem of course is I'm saying how the companies should handle them, and I have no authority at any of these places, save people actually valuing my ideas. Personally, I've done some development in the past, and there was the concept of defects. Your bonus would depend on how many defects were in your application at delivery time. These were feature-based defects, but shouldn't vulnerabilities be considered defects as well?"

So, the author freely admits he is neither a developer nor a manager. If he were a developer, he'd know that these are defects and everyone treats them as such.

If he were a manager, he'd know that one of the surest ways to wreck a good shop is to start doing comp based on defects. Here is what invariably (in my experience) happens when a shop includes defect counts in its comp plans:

1. Relationships between Dev, QA, Product Management, and Operations get worse, because the terms 'defect' and 'bug' become toxic. In reality these things always exist in software; the last thing you want to do is create barriers to dealing with them. Making the acknowledgment of a defect cost someone money means you will have arguments over every single one, unless it causes an outright crash.

2. The culture becomes overly risk-averse. No one wants to take on difficult problems or blaze new territory; the smartest people will naturally pick the easiest work to minimize the risk of defects.

3. Over-dependence on consultants - more CYA behavior. If it's too complex, people will outsource it to keep the defects away. This is a very bad thing when the nasty problems stem from business rather than technical challenges: now the people who know enough about the problem domain to understand the risk are hiring proxies who know nothing, to avoid responsibility for 'defects'.

Re:TFA Author is Inexperienced (0)

Anonymous Coward | more than 5 years ago | (#23985121)

You are right. That's why they are "writing" about the subject rather than participating.

They have no skills that I can detect, which appears to be their experience level too.

Re:TFA Author is Inexperienced (1, Interesting)

Anonymous Coward | more than 5 years ago | (#23985359)

What's even worse (and I've heard a story about this actually happening) is compensating people for fixing defects. What you get is programmers putting in errors on purpose, "finding" them a week later, fixing them, and racking up huge bonuses.

Treating security issues as defects depends on... (3, Insightful)

SamP2 (1097897) | more than 5 years ago | (#23985087)

...the nature of the security issue.

A defect, by definition, is unintended behavior of a program: something was designed to work but, for whatever reason, doesn't. Compare this to a lack of a feature, which means that something doesn't work because there was never any intention for it to work in the first place.

A buffer overflow or SQL-injection issue is almost certainly a defect, since there is a dedicated, designed parsing mechanism to process input, and if some inputs are not processed as intended, that is a defect in the software.

On the other hand, a security issue arising from, for example, plaintext transmission of sensitive data over the net is not necessarily a defect. If the site in question was never designed to use SSL or another encryption mechanism, then it's a lack of a feature. If the site in question is an online banking site, then it is a blatantly poor and inexcusable design shortcoming, but nonetheless not a defect. (Of course, if the site DID intend SSL to work properly, but for whatever reason there is a hole allowing the encryption to be cracked or circumvented, then it IS a defect.)

Besides, assigning "defect" status to a security issue is not necessarily useful for its own sake. The understanding is that a responsible company should treat a security issue with much higher priority than a non-security-related one, defect or not (compare "we released an emergency hotfix for download" to "we'll ship the patch in the next release cycle"). Saying a security issue is a defect is like saying a cardiac arrest is "organ numbness" - true, but not very useful.

Re:Treating security issues as defects depends on. (1)

cjonslashdot (904508) | more than 5 years ago | (#23985193)

You are correct. A security vulnerability is a defect if and only if it represents a failure with respect to a requirement. In fact, security is but one dimension of reliability, and so if the vendor is responsible for reliability, then the vendor should also be responsible for security. If there is no requirement for security, then it is not a defect.

One of the posters mentioned that software sold for the intended purpose of general use on the Internet carries with it some implied requirements (merchantability). Thus, just because the requirement for security is not explicitly stated does not mean it is not there. If security is a requirement for a given environment of use, then a vulnerability is a defect, by definition.

And yes, producers of software should be accountable in some manner for any and all defects, according to the terms of any explicit or implied warranty.

Re:Treating security issues as defects depends on. (1)

paratiritis (1282164) | more than 5 years ago | (#23985619)

On the other hand, a security issue arising from, for example, plaintext transmission of sensitive data over the net is not necessarily a defect. If the site in question was never designed to use SSL or another encryption mechanism, then it's a lack of a feature.

I disagree. What I would conclude in this case is that it is not the fault of the coder. It is the fault of the company that provided the software, though.

Such a product would be designed in several stages: you would have an analysis phase, a design phase, and an implementation phase where the coding is done (possibly in small iterative steps, as in extreme programming, but all the steps are there). In this case the analysis, the design, or both are faulty. That is not the poor coder's fault - the coder was given defective specs - but the defect is there. The coder correctly implemented a defective system.

If the site in question is an online banking site, then it is a blatantly poor and inexcusable design shortcoming, but nonetheless not a defect. (Of course, if the site DID intend SSL to work properly, but for whatever reason there is a hole allowing the encryption to be cracked or circumvented, then it IS a defect.)

Try telling the bank, if it loses millions as a result, that this was not a defect. That is a sure way for your company to lose all credibility.

Wouldn't matter what you call them. (1)

nurb432 (527695) | more than 5 years ago | (#23985091)

EULAs exempt the software maker from being held liable for defects in their products anyway, be they security holes or total meltdowns.

Re:Wouldn't matter what you call them. (1)

Khyber (864651) | more than 5 years ago | (#23985299)

EULAs are unenforceable in California, which is why VMware, among other companies, has in section 8 of its EULA: "This EULA will be governed by California law."

So we Californians are safe. No EULA can go against the law.

Re:Wouldn't matter what you call them. (1)

nurb432 (527695) | more than 5 years ago | (#23986547)

In that case, I'd forbid sales of my software in that state without a written pre-sale contract.

Just to protect myself.

Edward Deming would disagree (5, Informative)

stephanruby (542433) | more than 5 years ago | (#23985107)

ZDNet Zero-Day blogger Nate McFeters has asked the question, 'Should vulnerabilities be treated as defects?' McFeters claims that if vulnerabilities were treated as product defects, companies would have an effective way of forcing developers and business units to focus on security issues. McFeters suggests providing bonuses for good developers, and taking away from the bonuses of those who can't keep up. It's an interesting approach that, if used, might force companies to take a stronger stance on security-related issues.

When I think of defects and total quality management, I think of W. Edwards Deming [wikipedia.org].

Deming saw the problem of defects as a systems issue, not an individual performance issue, and his theory was that paying for performance would have the unintended consequence of increasing the number of defects, not decreasing it. (Here is the list of Deming's 14 principles, with my emphasis added in bold.)

Deming offered fourteen key principles for management for transforming business effectiveness. The points were first presented in his book Out of the Crisis (pp. 23-24).

  1. Create constancy of purpose toward improvement of product and service, with the aim to become competitive and stay in business, and to provide jobs.
  2. Adopt the new philosophy. We are in a new economic age. Western management must awaken to the challenge, must learn their responsibilities, and take on leadership for change.
  3. Cease dependence on inspection to achieve quality. Eliminate the need for inspection on a mass basis by building quality into the product in the first place.
  4. End the practice of awarding business on the basis of price tag. Instead, minimize total cost. Move towards a single supplier for any one item, on a long-term relationship of loyalty and trust.
  5. Improve constantly and forever the system of production and service, to improve quality and productivity, and thus constantly decrease cost.
  6. Institute training on the job.
  7. Institute leadership (see Point 12 and Ch. 8 of "Out of the Crisis"). The aim of supervision should be to help people and machines and gadgets to do a better job. Supervision of management is in need of overhaul, as well as supervision of production workers.
  8. Drive out fear, so that everyone may work effectively for the company. (See Ch. 3 of "Out of the Crisis")
  9. Break down barriers between departments. People in research, design, sales, and production must work as a team, to foresee problems of production and in use that may be encountered with the product or service.
  10. Eliminate slogans, exhortations, and targets for the work force asking for zero defects and new levels of productivity. Such exhortations only create adversarial relationships, as the bulk of the causes of low quality and low productivity belong to the system and thus lie beyond the power of the work force.
  11. a. Eliminate work standards (quotas) on the factory floor. Substitute leadership.
    b. Eliminate management by objective. Eliminate management by numbers, numerical goals. Substitute leadership.
  12. a. Remove barriers that rob the hourly worker of his right to pride of workmanship. The responsibility of supervisors must be changed from sheer numbers to quality.
    b. Remove barriers that rob people in management and in engineering of their right to pride of workmanship. This means, inter alia, abolishment of the annual or merit rating and of management by objective (See CH. 3 of "Out of the Crisis").
  13. Institute a vigorous program of education and self-improvement.
  14. Put everybody in the company to work to accomplish the transformation. The transformation is everybody's work.

Re:Edward Deming would disagree (0)

Anonymous Coward | more than 5 years ago | (#23985425)

will you please just shut the fuck up? haven't you been clued in that no one is interested in what you have to say?

Re:Edward Deming would disagree (1)

tuomoks (246421) | more than 5 years ago | (#23985485)

As would I, not that it matters.. You take, point by point, Edward Demings list and compare it to any software company practices! What do you get - a total mismatch, because it would break the management structure (and lower the next quarter profits - maybe?)

Part of blame goes to the consumer! There was a time when I was able to get an answer and often a fix in 12 hours from vendor. And they better - otherwise our ($1 million in 70's) monthly license payments somehow created the same problems, got lost, whatever. Of course you are holding you license payments today as long as there are obvious, not fixed problems in product?

Later, moving to vendor side, I was told not to hurry, we have more important things to do, as another week of meetings how to have better quality (I learned a lot of new acronyms and abbreviations, mostly made for inside use by some managers?), or told that actually it is a feature caused by "cost savings" and already in plans to change, maybe? And the customers, most of them, seemed happy with those answers, of course a little edited for customer use! And they bought the line how difficult and expensive it is to fix so it takes time - sorry?

Of course, a customer is different - try that with your own management: "I'm looking into the problem; it takes time to analyze it, to schedule the change into my workload, to find a good and correct solution, to develop and test it, to figure out how to incorporate it and the new documentation into the product, and then to deploy it." Should work - if not, learn marketing and sales tactics!

As for the answers here that seem to assume a person on the assembly line - a developer, etc. - can do anything to change the process: go get such a job; maybe after a while you won't be so sure how the current world works.

is this a serious question (-1, Troll)

Anonymous Coward | more than 5 years ago | (#23985111)

or are you faggots just that clueless?

Management is usually to blame (1)

tukang (1209392) | more than 5 years ago | (#23985147)

companies would have an effective way of forcing developers and business units to focus on security issue

I don't think developers need to be 'forced'; generally they understand the importance of making their software secure, and when they fall short it's usually due to external pressures such as unreasonable deadlines and management not wanting them to spend time on something that does not have tangible results.

In short, the problem with a lot of companies is that management doesn't value security as much as it should. If it did, it would probably already handle vulnerabilities as defects - but then again, a lot of these vulnerabilities would have been prevented in the first place by giving developers sufficient time to properly design and test their code.

The market is to blame (1)

asc99c (938635) | more than 5 years ago | (#23985439)

Management are trying to maximise profit, and typically don't care anywhere near as much as developers whether the job is done 'right'.

The problem is that most buyers of software are way more interested in shiny bits and pieces than in security. If (more) people weren't willing to put up with insecure software, managers would be asking the developers to work more on the security aspects of the application.

Read the EULA (0)

Anonymous Coward | more than 5 years ago | (#23985195)

The people who produce and sell software already thought of this. The EULA goes to great lengths to try to absolve the maker from any kind of liability. In fact the data sheets for chips have wording that says the chips are not for use in any application where human life may be at risk.

If I were seriously damaged because of defective software or hardware, I would expect my lawyer to argue that the manufacturer had at least some liability. Microsoft, in particular, has very deep pockets though and I wouldn't be willing to spend much money fighting them. What I would hope for is that the threat of bad publicity would cause them to give me some money to shut up.

Anyway, as far as treating exploits as product defects, ... good luck.

Re:Read the EULA (1)

mysidia (191772) | more than 5 years ago | (#23985573)

If I were seriously damaged because of defective software or hardware, I would expect my lawyer to argue that the manufacturer had at least some liability. Microsoft, in particular, has very deep pockets though and I wouldn't be willing to spend much money fighting them. What I would hope for is that the threat of bad publicity would cause them to give me some money to shut up.

They may have deep pockets, but that should actually make them potentially more vulnerable -- say, to a class action or similar -- as MS are perceived as the big corporate bullies/bad guys.

Trying to get away with making dangerous products... (bad PR)

They're a business, and unlikely to want to spend millions defending themselves and their goodwill when they can settle for a pittance and save face.

The only reason they'd want to spend more on the defense than you stand to win from them (if they lose) is to avoid the potential embarrassment of losing, the precedent it could set if their EULA's liability limitations didn't hold up, and the desire to make an example of the case (to dissuade people from bringing similar cases in the future).

Respect their desires, and you can probably reach a deal, if a M$ product ever causes you real harm.

Already are (1)

spazoid12 (525450) | more than 5 years ago | (#23985197)

"McFeters suggests providing bonuses for good developers, and taking away from bonuses for those that can't keep up. It's an interesting approach that if used, might force companies to take a stronger stance on security related issues."

They already are considered defects. The only thing that might change is the prioritization of defects. This topic is ridiculous.

Bonuses? Are you suggesting bonuses for finding defects? In 20 years I've worked at only one place that awarded bonuses for resolving defects. The idea failed within weeks. I thought everyone had come to realize it was a bad idea, and for obvious reasons. Must be a new generation of devs taking the reins and retrying old dead ideas.

You know what force[s] companies to take a stronger stance on security related issues? The consumer. Assuming the consumer knows anything and isn't just buying the same thing his cousin did because his cousin knows a guy who knows a guy who is a half-wit "computer consultant". Too many community college consultants out there...

Re:Already are (1)

mysidia (191772) | more than 5 years ago | (#23985637)

Bonuses? Are you suggesting bonuses for finding defects? In 20 years I've worked at only once place that awarded bonuses for resolving defects. The idea failed within weeks.

Bonuses for having fewer defects than normal in your code make sense.

Bonuses for finding defects made by some devs on your team are too easily abused... by devs "accidentally" introducing known bugs to help buddies collect the award a few days down the road.

And the "super-productive" cheaters+conspirators get praised while the honest developers seem unproductive and get canned.

Now if all devs are assigned bonuses and the defect finder gets part of the responsible developer's bonus (the developer responsible for the defect loses $2 of bonus for every $1 the defect finder gets), then that's more workable.
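
The arithmetic is simple enough to sketch in code. Here's a minimal C illustration of that 2:1 transfer -- all names and dollar figures are invented for the example:

<ecode>
#include <stdio.h>

/* Sketch of the 2:1 arrangement described above: for every $1
 * awarded to the defect finder, the developer responsible for the
 * defect forfeits $2. Names and figures are hypothetical. */
struct dev {
    const char *name;
    double bonus;
};

static void charge_defect(struct dev *author, struct dev *finder, double award)
{
    finder->bonus += award;       /* finder collects the award  */
    author->bonus -= 2.0 * award; /* author loses twice as much */
    if (author->bonus < 0.0)
        author->bonus = 0.0;      /* a bonus can't go negative  */
}

int main(void)
{
    struct dev alice = { "alice", 1000.0 }; /* introduced the defect */
    struct dev bob   = { "bob",   1000.0 }; /* found the defect      */

    charge_defect(&alice, &bob, 100.0);
    printf("%s: $%.2f, %s: $%.2f\n", alice.name, alice.bonus,
           bob.name, bob.bonus);
    /* prints: alice: $800.00, bob: $1100.00 */
    return 0;
}
</ecode>

The asymmetry is the point: planting bugs for a buddy to "find" destroys more bonus money than it creates.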

Great Idea On Paper...BUT... (3, Insightful)

darkPHi3er (215047) | more than 5 years ago | (#23985223)

In the RW, I'd suggest we consider the following:

You are Programmer Sian (note the trendily androgynous name). You work for a gigantic software company, or a conglomerate or industrial firm that does all its major development in-house. You are potentially confronted with:

1. Antiquated Developer Tools -- in general, the larger the development environment, unless you're Digesting Your Own Pet's Nutrition, you are very likely to be using development platforms and tools that are multiple years and/or generations old.

The question then is: how can you effectively hold poor Sian accountable for vulnerabilities that are intrinsic to many older tools?

Who's more accountable here? Sian, or the managers who make the procurement decisions?

2. "Science Fiction" Application Programming Interfaces - depending on whether you are programming on a well-established product or not, if you are, Poor Sian is probably stuck with API's that were developed many years before and have been the victim of Design Creep, and its, Lunatic Cousin, Design Implosion.

In many instances the APIs may once have had a large degree of Paradigmatic and Philosophic Design Integrity, but their initial Designers and Implementers have moved on to other products, companies or, Worst Case, Inpatient Mental Health Facilities. Many New Designers have come in to add "Their Own Programming Uniqueness" to the APIs, frequently rendering them a jumble of radically different approaches to similar algorithms.

Should Sian be subjected to having their pay docked because 9 out of 10 functions implement a Library Call one way, and some "Johnny-Come-Lately" API function implements something similar-looking but substantially different in output?

Shouldn't the API Designers/Architects be held more responsible for this one?

3. PHB Stupidity -- as QC forwards endless application/OS defect notices to the Development/Maintenance Team, these defects are reviewed by the Team Managers and Supervisors. It's understandable, given the 11 hours per day of Absolutely Vital Meetings that most PHBs love to, I mean are forced to, attend, that Defect Prioritization will suffer.

Sian can't choose which defects to repair, or in what order to repair them.

This is a management function, and one, in my experience, that Mgt usually jealously and zealously guards.

SOOOO, it's been the case in every Development project that I've worked on or know about that PHBs have a well-understood tendency to prioritize Defect repair according to external pressures, especially from Sales and Marketing.

Sales and Marketing organizations are usually likely to prioritize according to defects' immediate impact on quarterly projections.

Vulnerabilities are only likely to affect quarterly results when they are CATASTROPHIC defects, i.e. App or OS Killers. Otherwise the majority of vulnerabilities, which are usually well submerged in the Defect numbers, tend to get shoved aside for the higher-priority defects that S&M believe impact immediate sales.

There are numerous other considerations here, including Contract Programmers, Legacy Compatibility (ask the Vista Team about that one), Vendor Driver Teams that don't even know what to do with a new code base, etc., etc.

But it seems to me that, while financial incentives CAN BE useful as a Mgt tool for improving product quality, they should, to be even-handed, be applied across the entire product team, with specific ***POSITIVE*** incentives used to take care of limited, high-priority problems across the product line.

There's already a tendency to "blame the programmer", and my Best Guess is that any attempt to lay the responsibility for vulnerabilities THAT AREN'T CLEARLY THE RESULT OF SLOPPY/POOR/INCOMPETENT CODE PRODUCTION at the feet of the programmer will merely increase employee turnover in the Development Team. Something that is already a problem most places.

from my experience: "The Fault, Horatio, Usually Lies Not In Our Code, But In Our Process"

Re:Great Idea On Paper...BUT... (1)

paratiritis (1282164) | more than 5 years ago | (#23985697)

from my experience: "The Fault, Horatio, Usually Lies Not In Our Code, But In Our Process"

Well, all right, but the customer doesn't care. The company supplied a product that does not work as advertised. Yes, in this case your Sian probably does not deserve blame, but will get blamed anyway. The world is unjust, true. But this still means there was a problem (introduced by managers etc. who don't know and don't care).

How the company fares is the result of the market. If better companies exist it will sink. If not, it will survive. Sian's paycheck and continued misery will be assured. So will further security defects in future software.

Re:Great Idea On Paper...BUT... (1)

mysidia (191772) | more than 5 years ago | (#23985703)

The question then is: how can you effectively hold poor Sian accountable for vulnerabilities that are intrinsic to many older tools?

You don't. You hold Sian accountable if they make a coding error or a design error.

If they are using a third-party API or compiler the company has chosen that does not adhere to standards, then either you have provided all the necessary documentation that affects coding practices, or the coder is not responsible.

If there is a bug in the compiler, then the maker of the compiler (and the person who chose that compiler) is responsible. Unless you have informed the developer of the specifics of the compiler bug and of the decision to include workarounds in the code, the code is not expected to work around unknown bugs in the compiler/API/OS/etc.

If there is a bug in a third-party API, then the local developer is not responsible, until informed of the bug and what workarounds to implement.

And even then, only after developers have been given time to implement the third-party-API-bug workaround and have failed to do so.

Semantics (1)

InsertCleverUsername (950130) | more than 5 years ago | (#23985287)

This reminded me of a funny/ typical/ stupid/ aggravating thing at work a few weeks ago. I pointed out a security vulnerability in one of our intranet apps during a meeting to discuss the next release. Despite exasperating efforts to educate --and a heated argument over the correct term-- a project manager insisted on spreading the word to upper management that we had a "security breach." But in the war with management (and those who THINK they're above us on the org. chart), I guess it's all about the power struggle.

If you don't work with idiots, count yourself lucky.

Clueless writer dorks should know when to shut up. (0, Troll)

asackett (161377) | more than 5 years ago | (#23985397)

The problem of security holes in commercial software products is not one of developer apathy, but instead is a consequence of resource constraint. Which is just a nice way of saying that during the push to achieve an unreasonably accelerated product launch date with a short staff, small things get overlooked by developers, and the big things get overlooked by management.

"Hey, boss, we've got a potential remote exploit here. We can't ship this garbage." "We have to ship. We'll catch it on the first patch cycle." "Uh, boss, we've never before caught anything on the first patch cycle. Why should we expect this one be any different?" "Good question. Here's another: Who's going to sign your paycheck next week if we don't ship this product on time?"

Clueless writer dorks should know when to shut up.

Security vulnerabilities are not functionality bugs (2, Insightful)

Heembo (916647) | more than 5 years ago | (#23985469)

Functional behavior is easy to verify through unit and integration testing. Normal users spot functionality bugs quickly during normal product cycles.

However, security bugs are not easy to test for or discover. In fact, it's very expensive to do the testing needed to uncover even some easy classes of security vulnerabilities, and normal users do not stumble on security problems the way they do with functionality issues.
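
To make that concrete, here is a small C sketch (the function and the test are invented for illustration): a routine that sails through an ordinary functional test while hiding an overflow that no "does it work?" test is likely to exercise.

<ecode>
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical example: build a greeting for a user name. The
 * functional test below passes, but names longer than about 55
 * bytes overflow the fixed-size buffer -- a case normal testing
 * and normal users will rarely hit. */
static void make_greeting(const char *name, char *out, size_t outsz)
{
    char buf[64];
    sprintf(buf, "Hello, %s!", name); /* the defect: no bounds check */
    strncpy(out, buf, outsz - 1);
    out[outsz - 1] = '\0';
}

int main(void)
{
    char out[128];
    make_greeting("Sian", out, sizeof out);
    assert(strcmp(out, "Hello, Sian!") == 0); /* functional test: passes */
    puts("functional test passed");
    /* Feed make_greeting() a 200-byte name and buf's stack frame is
     * smashed -- the security bug the passing test never sees. */
    return 0;
}
</ecode>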

Also, none of your developers were ever taught anything about application security in college. The professors are clueless. Even Michael Howard of MS, who hires out of the best universities in the world, cannot find a new grad who has any clue how to build secure software.

Functionality bugs and Security bugs are apples and oranges and deserve very different consideration. (Like measurement of Risk, etc)

Last, you can make a piece of software work, but you can never make a piece of software secure -- you can only reduce risk to an acceptable level.

Depends on the vulnerability. (1)

mysidia (191772) | more than 5 years ago | (#23985515)

If a remote attacker is able to exploit a buffer overflow to run arbitrary code via a network service (one that's intended to be publicly reachable), then that's a software code defect.

The defect is the buffer overflow that can occur -- the security vulnerability is that the bug can be used by a remote attacker to cause the program to do something that compromises security.

The security vulnerability affects the severity of the bug, but not the fact that it is a bug.
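
To illustrate, a minimal C sketch of that defect class (handler names invented): a service routine that copies an attacker-controlled request field into a fixed buffer, next to the small fix that turns the "vulnerability" back into an ordinary resolved bug.

<ecode>
#include <stdio.h>
#include <string.h>

/* The defect class described above: a network service copies a
 * request string into a fixed buffer with no length check. A remote
 * peer controlling `request` can overflow `cmd`. */
void handle_request(const char *request)
{
    char cmd[32];
    strcpy(cmd, request); /* the defect: unbounded copy */
    printf("dispatching: %s\n", cmd);
}

/* The fix is an ordinary bug fix -- which is the point: the
 * vulnerability is just a defect with higher severity. */
void handle_request_fixed(const char *request)
{
    char cmd[32];
    strncpy(cmd, request, sizeof cmd - 1);
    cmd[sizeof cmd - 1] = '\0';
    printf("dispatching: %s\n", cmd);
}

int main(void)
{
    handle_request_fixed("STATUS"); /* safe for any input length */
    return 0;
}
</ecode>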

If the vulnerable network service is documented as intended to run only behind a special firewall, insulated from any adversary, then it's not a security vulnerability; if it is "exploited", the exploit was due to the configuration of the software and network (for which the network admin is responsible). There is still a defect in the software code (but one independent of the security issues that improper system configuration may create).

If the program by default allows commands to be executed remotely, in a normal way that is part of the product (without exploiting a bug), then it's not a defect.

It may have security implications, though, especially if the feature is enabled by default and there are no security protections by default.

If the software (that allows remote command execution) has no possibility of implementing security over the feature, and it is intended to provide a network service, then it is a design defect (not a programming defect).

If the software can be secure, but the default is not, then this is no defect -- either the documentation explains how to secure it, and the user is in error, or it doesn't, and the documentation is defective.

Insecure defaults aren't desirable, but they're not defects. They may be very convenient when running the software in a recommended environment where security is not an issue, or where security is provided by network equipment (configured, per the documentation's recommendations, to disallow untrusted access -- say, to certain ports or from untrusted IPs).
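
For contrast, here is what a conservative default can look like in code -- a hedged sketch with an invented function name, where the service listens on loopback only unless the admin explicitly opts into a public interface:

<ecode>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <stdio.h>
#include <string.h>
#include <sys/socket.h>
#include <unistd.h>

/* Hypothetical listener: binds to 127.0.0.1 (local-only) when no
 * address is configured, so network exposure is an explicit admin
 * decision rather than an accident of the defaults. */
int open_listener(const char *bind_addr, unsigned short port)
{
    int fd = socket(AF_INET, SOCK_STREAM, 0);
    if (fd < 0)
        return -1;

    struct sockaddr_in sa;
    memset(&sa, 0, sizeof sa);
    sa.sin_family = AF_INET;
    sa.sin_port = htons(port);
    /* the conservative default: loopback unless explicitly overridden */
    inet_pton(AF_INET, bind_addr ? bind_addr : "127.0.0.1", &sa.sin_addr);

    if (bind(fd, (struct sockaddr *)&sa, sizeof sa) < 0 ||
        listen(fd, 16) < 0) {
        close(fd);
        return -1;
    }
    return fd;
}

int main(void)
{
    int fd = open_listener(NULL, 8080); /* local-only by default */
    if (fd >= 0) {
        puts("listening on 127.0.0.1:8080");
        close(fd);
    }
    return 0;
}
</ecode>

Whether a vendor ships the loopback default or the open one is exactly the design-versus-documentation question debated above.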

McFeters is nuts! (2, Interesting)

SwashbucklingCowboy (727629) | more than 5 years ago | (#23985529)

I work with the vulnerability management team and product security team at a large software company, and trust me, vulnerabilities are treated as product defects. The cost of addressing vulnerabilities in the field is huge, and not addressing them is simply not feasible - customers would never tolerate it.

Should software piracy be treated as theft? (1)

ClosedSource (238333) | more than 5 years ago | (#23985543)

Redefining something rarely changes anyone's behavior.

Vulnerabilities aren't defects unless the behavior of the application is inconsistent with its requirements.

In any case, the solution isn't likely to be found in rewarding or punishing developers, but rather in making security part of the requirements and providing enough development and testing time to ensure that the software is secure.

Generally the market drives the process rather than software quality.

Pointless (1)

CyberDog3K (959117) | more than 5 years ago | (#23985579)

How many even slightly large software products have you ever seen with no outstanding defects to begin with? Just browse a buganizer or two. There are thousands of open issues for major apps; some false, but plenty of viable, if often minor, "defects." How you refer to a coding or logic error is irrelevant; whether it gets fixed is entirely dependent on the dedication of the supporting company, and on how they envision it affecting their bottom line.

DONT DO IT! (0)

Anonymous Coward | more than 5 years ago | (#23985709)

Poor Micro$oft would go belly up!

Too broad of a question (1)

cecil_turtle (820519) | more than 5 years ago | (#23985719)

That's too broad of a question; it depends on the vulnerability. If my car's door lock can be bypassed by pulling slightly sideways on the door handle, that's a defect. If I can hold a cell phone over my car's windshield and it triggers the remote door unlock, that's a defect. But if I can smash a car window and climb in through it that's not a defect. If I can use an advanced lock pick set and have 2 hours alone with a car and get in that's not a defect.

The same thing applies in software - no piece of software will ever be impenetrable to every attack. The question is a matter of balance - given the purpose of the software, how much security is expected or required? How much trouble does a malicious entity have to go through to compromise the software?

And for many people the answers to those questions will differ for the same piece of software. That's where layered security comes into play. If I'm worried about my car's window being smashed and somebody climbing in, I don't blame the car manufacturer; instead, I park the car in a garage. If there's something extra valuable I want to put in the car in the garage, maybe I put it in a lock box.

Well Yah... (1)

Darkness404 (1287218) | more than 5 years ago | (#23985747)

Well, of course vulnerabilities should be considered defects. Even more so, DRM should be seen as a defect, since it's there by design, right in the program's planning. A security hole is a small cut, something that can be patched; DRM is like a gaping wound that will never be patched.

Well ... (0)

Anonymous Coward | more than 5 years ago | (#23985827)

Developers who release shoddy products should be treated like engineers who design flawed systems. Unlike that of engineers, however, the quality of software developers has started to dwindle, what with outsourcing and economic uncertainty ...

Then again, you have to consider that vulnerable programs don't often contain vulnerable code themselves. Some link to vulnerable libraries and so forth. Some vulnerabilities are just so obscure that developers can't anticipate them.

Ideal vs reality (0)

Anonymous Coward | more than 5 years ago | (#23986281)

Ideally, yes, all program bugs and security holes should be treated as defects.

In reality, programs can run to millions of lines of code worked on by hundreds or even thousands of different people.

On a scale that large, no amount of proofreading and double-checking will work; there is just too much there. It's why we see beta tests: an attempt to put the program under real-world conditions to see what breaks.
