
The 25 Most Dangerous Programming Errors

samzenpus posted more than 4 years ago | from the spilling-your-soda dept.

Programming

Hugh Pickens writes "The Register reports that experts from some 30 organizations worldwide have compiled 2010's list of the 25 most dangerous programming errors, along with a novel way to prevent them: drafting contracts that hold developers responsible when bugs creep into applications. The 25 flaws are the cause of almost every major cyber attack in recent history, including the ones that recently struck Google and 33 other large companies, as well as breaches suffered by military systems and millions of small-business and home users. The top 25 entries are prioritized using input from over 20 different organizations, which evaluated each weakness based on prevalence and importance. Interestingly enough, the classic buffer overflow ranked third on the list, while Cross-site Scripting and SQL Injection are considered the 1-2 punch of security weaknesses in 2010. Security experts say business customers have the means to foster safer products by demanding that vendors follow common-sense safety measures, such as verifying that all team members successfully clear a background investigation and are trained in secure programming techniques. 'As a customer, you have the power to influence vendors to provide more secure products by letting them know that security is important to you,' states the introduction to the list, which also includes a draft contract with the terms customers should request, enabling buyers of custom software to make code writers responsible for checking the code and fixing security flaws before software is delivered."


534 comments

Yeah, right. (5, Insightful)

Anonymous Coward | more than 4 years ago | (#31179316)

I'll sign such a contract, but the project will take twice as long and my hourly rate will go up 300%.

People like to draw the comparison with civil engineering, where an engineer may be liable (even criminally) if, say, a bridge collapsed. But this isn't really the same thing. We're not talking about software that simply fails and causes damage. We're talking about software that fails when people deliberately attack it. This would be like holding a civil engineer responsible when a terrorist blows up a bridge -- he should have planned for a bomb being placed in just such-and-such location and made the bridge more resistant to attack.

The fault lies with two parties -- those who wrote the insecure code, and those who are attacking it. I'll start taking responsibility for my own software failures when the justice system starts tracking down these criminals and prosecuting them. Until then, I'll be damned if you're going to lay all the blame on me.

Re:Yeah, right. (5, Insightful)

timmarhy (659436) | more than 4 years ago | (#31179386)

Not only will it take twice as long and cost 3 times as much, but I'd also reserve the right to deny the customer any features I deemed unsafe.

I could lock down any system and make it 100% hacker-proof - I'd unplug their server.

It's a ratio of risk to reward, like most things; if you want zero risk, there won't be any reward.

re:zero risk (5, Insightful)

Tumbleweed (3706) | more than 4 years ago | (#31179592)

"Insisting on absolute safety is for people who don't have the balls to live in the real world."
- Mary Shafer, NASA Dryden Flight Research Center

"those O-rings will be fine.. its not that cold" (2, Insightful)

decora (1710862) | more than 4 years ago | (#31179756)

some jackass, circa January 1986

Re:zero risk (0)

Anonymous Coward | more than 4 years ago | (#31179824)

Wow, that dude sounds almost as scary as that guy named Sue.

Re:zero risk (2, Funny)

TheLink (130905) | more than 4 years ago | (#31179866)

She's not a guy. As for her balls, she might have ripped them off the guy named Sue for all I know.

Re:zero risk (1)

Dahamma (304068) | more than 4 years ago | (#31179928)

Whoosh!

Re:Yeah, right. (1)

ehrichweiss (706417) | more than 4 years ago | (#31179752)

Marshall Sylvar the hypnotist used to say "No Risks. No Goodies." It seems to fit most everywhere in life.

Re:Yeah, right. (0)

Anonymous Coward | more than 4 years ago | (#31180078)

That'd be awesome.

"I'm sorry, sir. But electricity flowing into your server is considered a security risk."

Re:Yeah, right. (2, Insightful)

TapeCutter (624760) | more than 4 years ago | (#31179408)

Yes, damage caused by a deliberate attack is an insurance matter, not an engineering matter. Nothing can be made 100% failsafe.

Re:Yeah, right. (1)

fractoid (1076465) | more than 4 years ago | (#31180090)

Insurance matter? Isn't it more a criminal matter?

Re:Yeah, right. (2, Insightful)

Mr Thinly Sliced (73041) | more than 4 years ago | (#31179420)

Yep, this isn't about removing vulnerabilities or improving quality - this is about making someone accountable.

Having a contract where the developer is made liable? This is management blame-storming at its finest.

Re:Yeah, right. (5, Insightful)

fuzzyfuzzyfungus (1223518) | more than 4 years ago | (#31179826)

Worse, in addition to being management blame-storming (and hardly novel, at that), it is quite arguably a member of a very old and inglorious school of argument, the one that asserts that people are fully rational agents who will perform properly if suitably threatened. Sure, Mr. "Eh, I'd rather masturbate and play Halo than check for bugs in the software I was paid to write" could probably do with a kick in the ass; but the main threat is simple honest mistakes, which humans make with some frequency, depending on their constitution and surrounding conditions.

Anybody who honestly thinks that scary-looking contracts are going to keep the engineers in line should read up on the sorts of things that happen in workplaces with real hazards: heavy machinery, toxic materials (and not the chickenshit "recognized by the state of california to cause reproductive harm" type, the "Schedule 2, Part A, per CWC" type), molten metal, exposed high voltages, and the like. Even when their lives are on the line, when the potential for imminent gruesome death is visible to even the least imaginative, people fuck up from time to time. They slip, they make the wrong motion, they get momentarily confused, some instinct that was real useful back when lions were the primary occupational hazard kicks in and the adrenalin shuts down their frontal lobe. Happens all the time, even in countries with some degree of industrial hygiene regulation.

Re:Yeah, right. (1, Insightful)

140Mandak262Jamuna (970587) | more than 4 years ago | (#31179448)

Bad analogy. It is like holding the car company responsible for making cars without doors and locks when they get stolen. True, stealing a car is a criminal activity. But designing a car that cannot be secured effectively is aiding and abetting.

Re:Yeah, right. (4, Insightful)

timmarhy (659436) | more than 4 years ago | (#31179518)

Nope, wrong. The car has doors and locks, but there are criminals out there who are skilled enough to pick the locks. How far should you raise the complexity of the hack before you're off the hook?

Re:Yeah, right. (4, Insightful)

pclminion (145572) | more than 4 years ago | (#31179650)

It's laughable to equate an outright lack of security (lock-less doors) with subtle programming errors which result in security holes. It's not like a door with no locks. It's like a door with a lock which can be opened by some method that the designer of the lock did not envision. Does it mean the lock designer did a poor job? That depends on the complexity of the hack itself.

Software is designed by humans. It won't be perfect. Unfortunately, software is targeted by miscreants because of its wide deployment, homogeneity, and relative invisibility, which are concepts that are still quite new to human society. I'd be willing to take responsibility for security failures in my products, but I'm sure as hell not going to do so when I'm subjected to your every idiotic whim as a client, nor will I do so at your currently pathetic pay rates. If you want me to take the fall for security failures, then I reserve the right not to incorporate inherently insecure technologies into my solutions. In fact, I reserve the right to veto just about any god damned thing you can come up with. After all, I'm a security expert, and that's why you hired me, right? And I'm going to charge you $350 an hour. Don't like it? Go find somebody dumber than me to employ.

Re:Yeah, right. (0)

Anonymous Coward | more than 4 years ago | (#31179758)

It's also laughable to say that someone who didn't provide locks was "abetting" a break-in. Criminal liability doesn't actually work that way (thank God).

Re:Yeah, right. (3, Interesting)

danlip (737336) | more than 4 years ago | (#31179792)

It's laughable to equate an outright lack of security (lock-less doors) with subtle programming errors which result in security holes. It's not like a door with no locks. It's like a door with a lock which can be opened by some method that the designer of the lock did not envision. Does it mean the lock designer did a poor job? That depends on the complexity of the hack itself.

I mostly agree. My pet peeve is SQL injection attacks, because they are so frickin' easy to avoid. Any developer that leaves their code open to SQL injection attacks should be held liable (unless their employer insists they use a language that doesn't have prepared statements, in which case the company should be held liable).
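
A minimal sketch of what the parent describes, using Python's standard-library sqlite3 module (the table and column names are made up for illustration); the bound parameter keeps hostile input as data rather than SQL:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, email TEXT)")

    user_input = "alice'; DROP TABLE users; --"

    # Unsafe: splicing the input into the SQL text lets it be parsed as SQL.
    # query = "SELECT email FROM users WHERE name = '%s'" % user_input

    # Safe: the value is bound as a parameter, so it is only ever treated as data.
    rows = conn.execute("SELECT email FROM users WHERE name = ?", (user_input,)).fetchall()
    print(rows)   # [] -- no rows, and no injected SQL was executed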

Re:Yeah, right. (1)

timmarhy (659436) | more than 4 years ago | (#31179838)

Even then, a decent DBA will prevent even the crappiest program from being a problem.

Re:Yeah, right. (4, Interesting)

Antique Geekmeister (740220) | more than 4 years ago | (#31179864)

Unfortunately, many of these errors are _not_ subtle. Let's take Subversion as an example. It is filled with mishandling of user passwords, storing them in plaintext in the svnserve "passwd" file or in the user's home directory. Given that it also provides password-based SSH access, and stores those passwords in plaintext, it's clear that it was written by and is maintained by people who simply _do not care_ about security. Similarly, if you read the code, you will see numerous "case" statements that have no exception handling: they simply ignore cases that the programmer didn't think of.

This is widespread, popular open source software, and it is _horrible_ from a security standpoint. CollabNet, the maintainers of it, simply have no excuse for this: they have been selling professional services for this software for years, and could have at least reviewed, if not accepted outright, the various patches for it. The primary patch would be to absolutely disable the password storage features at compilation time, by default, especially for SSH access. There was simply never an excuse to do this.

I urge anyone with an environment requiring security that doesn't have the resources to enforce only svn+ssh access to replace Subversion immediately with git, which is not only faster and more reliable but far more secure in its construction.

Re:Yeah, right. (0)

Anonymous Coward | more than 4 years ago | (#31179690)

Even with a car with no locks, the maker shouldn't be held responsible; you bought the car knowing full well there were no locks. If you want cars with locks, pressure those who make cars, and take your business to one with locks.

Re:Yeah, right. (1)

the_Bionic_lemming (446569) | more than 4 years ago | (#31180058)

Bad analogy. It is like holding the car company responsible for making cars without doors and locks when they get stolen. True, stealing a car is a criminal activity. But designing a car that can not be secured effectively is aiding and abetting.

Speaking to bad analogies
Brick, meet car window.

Oh wait, you need a windshield???

Re:Yeah, right. (4, Informative)

0x000000 (841725) | more than 4 years ago | (#31179454)

Holding programmers accountable for their coding errors should happen inside the corporation while they are working on the project. I don't remember which company had this, but if a developer broke the build or it failed to pass a test, a lava lamp at their cubicle would turn on, and until the developer fixed the build the lava lamp would stay on, which generally meant you had a certain amount of time to fix the issue before it would actually start bubbling. This way there is an incentive not to break the build, and a bit of competition between the various programmers to have the fewest bugs or build breakages.

Having programmers imagine every way that their program may be attacked is impossible. There will always be new attacks that take advantage of the one thing the programmer had not thought of. It is just like the security systems that are in place at airports around the world: if the good guys could come up with every single scenario an attacker could take, airports would be much safer, as every single scenario would already have been thought about.

I agree with you, don't put all the blame on me as a programmer.

Oh, if I had mod points, you sir would have them!

Re:Yeah, right. (2, Funny)

ls671 (1122017) | more than 4 years ago | (#31179684)

> Holding programmers accountable for their coding errors

We used to have a board where we would note "bozo the clown points" for anybody involved in the project, even managers ! ;-))

http://en.wikipedia.org/wiki/Bozo_the_Clown [wikipedia.org]

Re:Yeah, right. (5, Insightful)

rolfwind (528248) | more than 4 years ago | (#31179504)

People like to draw the comparison with civil engineering, where an engineer may be liable (even criminally) if, say, a bridge collapsed. But this isn't really the same thing. We're not talking about software that simply fails and causes damage. We're talking about software that fails when people deliberately attack it. This would be like holding a civil engineer responsible when a terrorist blows up a bridge -- he should have planned for a bomb being placed in just such-and-such location and made the bridge more resistant to attack.

Not only that, but civil/mechanical/other engineers usually know exactly what they are dealing with - a civil engineer may specify the type of concrete used, a car engineer may specify the alloy of steel.

Most of the time, software engineers don't have that luxury. Video game consoles used to be (and mostly still are) nice that way, and it was the reason they had fewer problems than PCs.

Tell a bridge engineer that he has absolutely no control over the hardware he has to work with and that it may have a billion variations, and see if he signs his name to it.

Re:Yeah, right. (1)

razvan784 (1389375) | more than 4 years ago | (#31179638)

Tell a bridge engineer that he has absolutely no control over the hardware he has to work with and that it may have a billion variations, and see if he signs his name to it.

You know, that's what modern operating systems with hardware abstraction layers and APIs, and high-level development toolkits are for. I don't think I care what hardware or even OS my stinky, SQL-injection-prone PHP code is running on. Sheesh.

Re:Yeah, right. (3, Insightful)

chebucto (992517) | more than 4 years ago | (#31179850)

Not only that, but civil/mechanical/other engineers usually know exactly what they are dealing with - a Civil engineer may specify the type of concrete used, car engineer may specify the alloy of steel.

But other engineers can't specify all the variables. They have to deal with the real world - rock mechanics, soil mechanics, wind, corrosion, etc. - so they too can never know exactly what they're dealing with. Many of the worst engineering disasters occurred because some aspect of the natural world was poorly understood or not accounted for. However, it remains the engineer's responsibility to understand and mitigate those uncertainties.

Re:Yeah, right. (1)

Cassini2 (956052) | more than 4 years ago | (#31179868)

Tell a bridge engineer that he has absolutely no control over the hardware he has to work with and that it may have a billion variations, and see if he signs his name to it.

A civil engineer can be disciplined for not checking that a building is built to his/her specifications. The engineer is responsible for ensuring that the general contractor and the trades actually build what was intended. As such, many engineering contracts run for 60 or 90 days after project completion, allowing enough time to ensure all trades have been paid, and all work done to standard.

An exception is a "design only" contract, in which case you certify nothing, because you have absolutely no idea how someone might interpret the plans.

Re:Yeah, right. (5, Insightful)

cgenman (325138) | more than 4 years ago | (#31179900)

Let's see. The top programming errors are:
Let people inject code into your website through cross site scripting.
Let people inject code into your database by improperly sanitizing your inputs.
Let people run code by not checking buffer sizes.
Granting more access than necessary.
Granting access through unreliable methods.

Geez, #7 is fricking directory traversal. DIRECTORY TRAVERSAL. In 2010! It's not like your drawbridge is getting nuked by terrorists here. Generally bridges are built to withstand certain calamities, like small bombs, fires, hurricanes, earthquakes, etc. Being successfully assaulted through a directory traversal attack is like someone breaking into the drawbridge control room because you didn't install locks on the doors and left it open in the middle of the night. Why not leave out cookies and milk for the terrorists with a nice note saying "please don't kill us all [smiley face]" and consider that valid security for a major public-facing application?

Further down the list: Failing to encrypt sensitive data. Array index validation. Open redirects. Etc, etc, etc. These aren't super sophisticated attacks and preventative measures we're talking about here. Letting people upload and run PHP scripts! If you fall for THAT one, that's like a bridge that falls because some drunk highschooler hits it with a broken beer bottle. Forget contractual financial reprisals. If your code falls for that, the biggest reprisal should be an overwhelming sense of shame at the absolute swill that you've stunk out.

And yes, security takes longer than doing it improperly. It always does, and that has to be taken seriously. And it is still cheaper than cleaning up the costs of exposing your customers' banking information to hackers, or your research to competitors in China. Stop whining, man up, and take your shit seriously.
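
On the directory-traversal point above, a minimal sketch of the usual check in Python (the served directory is hypothetical, not from the list): resolve the requested path and refuse anything that lands outside the directory you meant to expose.

    import os

    BASE_DIR = os.path.realpath("/var/www/files")   # hypothetical served directory

    def safe_open(requested_name):
        # realpath() resolves ".." components and symlinks before the check.
        full_path = os.path.realpath(os.path.join(BASE_DIR, requested_name))
        if not full_path.startswith(BASE_DIR + os.sep):
            raise ValueError("path escapes the served directory")
        return open(full_path, "rb")

    # safe_open("report.txt") is allowed; safe_open("../../etc/passwd") is rejected.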

Re:Yeah, right. (0, Troll)

Madsy (1049678) | more than 4 years ago | (#31179686)

Software does not fail, ever. It either works according to the specification, or it does not. Any attack vector or 'bug' is a fault with the program which has always been there. Bridges and other structures can't be made 100% secure, but software can and should. If a piece of software is not, then either it does not work according to the specification (as in, not finished), or it was deliberately made to be faulty. This is where your analogy breaks down. A well-formed program is 'invincible', but a bridge can never be, unless someone invents a material which is resistant to corrosion, shock, extreme heat/cold, and never gets tired.

In fact, since software cannot fail, ever, designing a well-formed specification and software following that specification exactly, is the only thing you can be responsible for as a software engineer. Since the very definition of "works according to specification" implies the absence of any vulnerabilities, is it really so hard to see why the blame is put on the software authors, in addition to the 'attackers'?

Feel free to ship unfinished software, or make it insecure on purpose. But then don't be surprised at the opinion customers and your peers hold of you.

Re:Yeah, right. (3, Insightful)

bky1701 (979071) | more than 4 years ago | (#31179774)

Have you ever programmed? I mean this seriously. It sounds like you either do not understand the complexity of software, or just want to complain.

Software bugs are logic typos. Have you ever made a grammatical error? Reading your post, I can say yes. Bugs are like that. In projects with tens of thousands of lines of code, it is unreasonable and completely unrealistic to expect every line to be a pinnacle of perfection, just like it is unreasonable to expect that every sentence in a book is completely without error.

Security holes tend to be failures to predict the ways that things might "align" so as to allow unforeseen things to happen. Working to specification is in no way, shape, or form a guarantee that something is secure. It is impossible to predict new security holes - if it were, the vast majority wouldn't exist to begin with. Further, when dealing with other libraries and programs (like every application on the planet), there are variables beyond the programmer's control, which might not be totally as they should be. If you know of somebody who can write specs that compensate totally for unknowns, I think you should shut up and go ask them for lottery numbers.

Come back when you have even a marginal understanding of what is involved in programming.

Re:Yeah, right. (3, Insightful)

pclminion (145572) | more than 4 years ago | (#31179776)

You're probably still in school, but I'll give you a break. Allow me to quote Knuth: "Beware of bugs in the above code; I have only proved it correct, not tried it."

Anyway... back to the Ivory Tower with you. The hour is getting late, and I think your faculty advisor has a cup of warm milk and a cozy set of jammies ready for you.

Re:Yeah, right. (1)

OrangeCatholic (1495411) | more than 4 years ago | (#31179828)

>Since the very definition of "works according to specification" implies the absence of any vulnerabilities, is it really so hard to see why the blame is put on the software authors

What about the client who purchased the software? Aren't they responsible for specifying what they want?

It seems backwards to hold the vendor responsible for testing. The client has to define what the software is supposed to do.

Toyota's Prius braking problem is a perfect example. It gets fooled when you go over a bump. Who was supposed to think of that, the developers in a lab, or the managers who ship the thing?

Re:Yeah, right. (5, Insightful)

digitalchinky (650880) | more than 4 years ago | (#31179930)

How many clients have you ever met that actually ~know~ what they want? :-)

Re:Yeah, right. (1)

Kingrames (858416) | more than 4 years ago | (#31179884)

Not quite.
If the attacks are guaranteed, then yes, you are expected to prepare for them as best as you can. That means establishing a paper trail that exposes each and every person in management for every time that they cut costs and endangered security while they were at it.
If you were hired to build a structure in an area where spontaneous fires occurred and you didn't even bother making anything heat-resistant, then yes, you should be sued, for being a damn moron.

I don't think the readers of Slashdot would be the kind of programmers to do a slack-jawed job on anything, really, unless they didn't yet know a better way to do it, but there are unethical people out there who would make a shoddy program and then sell information on how to attack it to third-parties who could make a quick buck off of exploiting those vulnerabilities.
Those people absolutely should face charges.

Re:Yeah, right. (1)

Nefarious Wheel (628136) | more than 4 years ago | (#31180044)

People like to draw the comparison with civil engineering, where an engineer may be liable (even criminally) if, say, a bridge collapsed. But this isn't really the same thing. We're not talking about software that simply fails and causes damage. We're talking about software that fails when people deliberately attack it. This would be like holding a civil engineer responsible when a terrorist blows up a bridge -- he should have planned for a bomb being placed in just such-and-such location and made the bridge more resistant to attack.

Yes - you can only predict nature, not politics or the sweeping winds of change. Bridges have been built before terrorism was a concern. The military has always known how to destroy things that civil engineers build. It's the nature of life. You can't re-write history that's writ in things that were built, you can only build anew with the new concerns in mind.

But engineers have tried. The World Trade Towers were designed to survive the impact of a fully laden Boeing 707, I heard. 747s were far off in the future.

In times long past I was guilty of writing software with absolutely no concern for security at all. Why should I have? It was a quick and dirty piece in Vax Fortran/TDMS/RMS and written before the frigging Internet existed. And I found out that twenty years later, the program - a month's work designed to be thrown away when new software would come in to replace it - was still in use. Hey, it worked, and it survived bit decay far better than I expected.

But like you, I refuse - utterly - to be responsible for its security in a world that didn't exist when it was written. I've moved on.

Re:Yeah, right. (1)

Tuoqui (1091447) | more than 4 years ago | (#31180094)

Except the difference is the Civil Engineer is expected to deal with things that are reasonable. Loads on the bridge, wind, tensile strength of materials, etc... A bomb is not a reasonable thing for a civil engineer to be protecting against.

Whereas the software developer is working in an environment akin to a virtual minefield. All of these common attacks could be detected and avoided ahead of time, thus saving a whole lot of headache for everyone involved. Still, they can't really be expected to defend against things which aren't known (as the stuff on this list must be).

Errors, Schmerrors (5, Funny)

empesey (207806) | more than 4 years ago | (#31179326)

A real programmer can do all 25 in one line of code.

Re:Errors, Schmerrors (1)

MikeFM (12491) | more than 4 years ago | (#31179808)

I can do it with zero lines of code. My zero lines of code will remain perfectly safe.

Re:Errors, Schmerrors (0)

Anonymous Coward | more than 4 years ago | (#31179926)

That's nothing. The code I write is so bad it creates vulnerabilities even after I comment it out.
*is extra cautious to click that "post anonymously" button this time*

Re:Errors, Schmerrors (1)

_merlin (160982) | more than 4 years ago | (#31179964)

Don't you mean a real Perl programmer? They can do anything in one line of code!

Re:Errors, Schmerrors (1)

DeadDecoy (877617) | more than 4 years ago | (#31179990)

I think that's what the IOCCC [ioccc.org] is for.

Re:Errors, Schmerrors (1)

linhares (1241614) | more than 4 years ago | (#31180016)

A real programmer can do all 25 in one line of code.

For the last time, that shit wasn't my fault!

Re:Errors, Schmerrors (5, Funny)

Anonymous Coward | more than 4 years ago | (#31180080)

#include "win32.h" /* :p */

Alanis ? (4, Funny)

daveime (1253762) | more than 4 years ago | (#31179338)

Kind of ironic that the report is a PDF file, when another report stated that PDF accounted for 9/10 (or something like that) of exploits last year.

Re:Alanis ? (2, Interesting)

Anonymous Coward | more than 4 years ago | (#31179670)

Adobe Reader accounts for 9/10 of exploits.

ftfy

I bookmarked this immediately (2, Interesting)

drfreak (303147) | more than 4 years ago | (#31179340)

Some of the errors are not relevant to me, mainly because my code lives in a managed (i.e. .NET) environment. The potential SQL injection and XSS vulnerabilities are still very relevant to me, though. Although most of my responsibility lies in code which is only reached via an https-authenticated connection, as with any other web programmer, a "trusted" user can still - especially - find exploits.

This is even more true in inherited code. If you inherited code from a previous employee, I recommend a rigorous audit of the input and output validation. You just don't know what was missed in something you didn't write.

And Number 26 ... (2, Funny)

WrongSizeGlass (838941) | more than 4 years ago | (#31179348)

... letting me try assembler with my level of dyslexia.

Bad Idea (4, Insightful)

nmb3000 (741169) | more than 4 years ago | (#31179354)

a novel way to prevent them: by drafting contracts that hold developers responsible when bugs creep into applications

Holding a gun to somebody's head won't make them a better developer.

I don't understand why well-known and tested techniques can't be used to catch these bugs. There are many ways to help ensure code quality stays high, from good automated and manual testing to full-on code reviews. The problem is that most companies aren't willing to spend the money on them and most open source projects don't have the manpower to dedicate to testing and review.

TFA seems like it's just looking for somebody to blame when the axe falls. If your method of preventing bugs is to fire everybody that makes a programming mistake, pretty soon you won't have any developers left.

Re:Bad Idea (1)

Ethanol-fueled (1125189) | more than 4 years ago | (#31179390)

Article:

The 25 flaws are the cause of almost every major cyber attack in recent history, including the ones that recently struck Google and 33 other large companies...

You:

TFA seems like it's just looking for somebody to blame when the axe falls.

"Contractor" for the Russian Business Network:

Hooray! It's not my fault anymore!

Re:Bad Idea (2, Insightful)

Meshach (578918) | more than 4 years ago | (#31179446)

It does not matter how well you test something; there will still be bugs. A successful test does not prove the absence of bugs, it just fails to prove the presence of any bugs.

Re:Bad Idea (1)

uncqual (836337) | more than 4 years ago | (#31179870)

Indeed.

I find too many development organizations don't understand that you can't test quality into a product; it must be designed and implemented into the product.

Also, too many QA organizations evaluate testers (by that, I mean those who develop tests) to some extent by the number of bugs they find. What QA testers should be evaluated on, to a great extent, is how many "severity-adjusted" bugs the customer finds in the tested feature as a ratio of how many bugs the tests found during QA (crappy code, even with a good QA cycle, will still be crappy in the field, as not every case can be tested and software "learns" to pass tests).

Of course the developer is most responsible, and should be held so (this is one reason I believe strongly in code ownership). However, the code reviewer should also be held nearly as responsible for bugs as the coder.

Re:Bad Idea (1)

bluej100 (1039080) | more than 4 years ago | (#31179940)

The number one risk, cross-site scripting, can be almost entirely eliminated through the use of a templating system that automatically escapes HTML in variables, as Django does.
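
A minimal sketch of the escaping such a templating system performs, done by hand here with Python's standard-library html module (the page fragment is just an illustration):

    import html

    user_comment = '<script>alert("pwned")</script>'

    # Escaping turns markup metacharacters into entities, so the browser
    # displays the text instead of executing it as script.
    safe_fragment = "<p>%s</p>" % html.escape(user_comment)
    print(safe_fragment)
    # <p>&lt;script&gt;alert(&quot;pwned&quot;)&lt;/script&gt;</p>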

Re:Bad Idea (2, Interesting)

evanbd (210358) | more than 4 years ago | (#31179624)

a novel way to prevent them: by drafting contracts that hold developers responsible when bugs creep into applications

Holding a gun to somebody's head won't make them a better developer.

I don't understand why well-known and tested techniques can't be used to catch these bugs.

Yeah, but you can keep them from doing it again.

The reason people don't use these well-known techniques is very simple: it takes time and effort, and people are lazy. So until the customer tells them to, they won't bother.

Which brings me to my biggest objection to this proposed contract. There are lots of documentation requirements, and no assignment of liability. Documentation is expensive to produce, and much of this I really don't care about. (Exception: the document on how to secure the delivered software, and the security implications of config options, is an excellent idea.) For most of the documentation requirements, I don't really need to hear how you plan to do it: I just need to know that, if you screw up, you're going to be (at least partially) financially liable. And yet, the contract fails to specify that. What happens when there *is* a security breach, despite all the documentation saying the software is secure? If the procedures weren't followed, then that's obviously a breach of contract — but what if there was a problem anyway?

I actually like designating a single person in charge of security. Finding someone to blame after the fact is a horrible idea. However, having someone whose job it is to pay attention early, with the authority to do something about it, is an excellent way to make sure it doesn't just fall through the cracks. By requiring their personal signoff on deliverables, you give them the power they need to be effective. (Of course, if management inside your vendor is so bad that they get forced into just rubber-stamping everything, that's a different problem. But if you wanted to micromanage every detail of how your vendor does things internally, why are you contracting to a vendor?)

Re:Bad Idea (1)

Herkum01 (592704) | more than 4 years ago | (#31179994)

The reason people don't use these well-known techniques is very simple: it takes time and effort, and people are lazy. So until the customer tells them to, they won't bother.

Haven't you heard? This is the age of marketing (got that from business school). Why spend money building something when you can just market your way to the top...

This is what happens when we let business school grads run everything.

and yet they do not mention COBOL (1)

cavehobbit (652751) | more than 4 years ago | (#31179360)

Or did they retire that category?

The obvious follow-up question... (1)

damn_registrars (1103043) | more than 4 years ago | (#31179376)

... how many of these errors have been committed by slashdot?

So in other words (1)

lul_wat (1623489) | more than 4 years ago | (#31179380)

They got each organisation to submit one error, ranked them, then dropped the last 5 (or maybe there were 5 double-ups)

The actual article (0)

Anonymous Coward | more than 4 years ago | (#31179396)

The Register is just talking about another article detailing the list.

The actual article: http://cwe.mitre.org/top25/

Background checks are awful and stupid (1, Informative)

Anonymous Coward | more than 4 years ago | (#31179410)

I am a competent and trustworthy programmer in my late 30s who will fail a background check because I was convicted of something in my mid-30s, something I did when I was a teenager (and still a minor).

I have, over the years, been given many responsibilities and opportunity to abuse the authority required to discharge those responsibilities. I never once have abused that authority. If you ask previous co-workers if they consider me honest and trustworthy they will unanimously tell you that I'm one of the most trustworthy people they know.

I strongly resent the growing prevalence of background checks. I wasn't convicted of any sort of fraud or theft, but I am rejected anyway. The sad part is half the time I end up having to tell someone exactly what I was convicted of and why, and they wring their hands over their policy being so stupid but follow it anyway.

Background checks lead to stupid behavior. The criminal justice system is only a mediocre to poor arbiter of who is and isn't trustworthy. Like lie detector tests, you can never pass, only fail.

Re:Background checks are awful and stupid (1)

HornWumpus (783565) | more than 4 years ago | (#31179524)

Greater than 15 years' statute of limitations.

Less than 5 years' sentence for conviction.

I get an empty set for possible crimes and call BS.

Re:Background checks are awful and stupid (0)

Anonymous Coward | more than 4 years ago | (#31179634)

It had a minimum 7-year sentence that was suspended, and I was given 10 years' probation. And you would likely be quite surprised at what the crime was and how the statute of limitations stuff played out.

Re:Background checks are awful and stupid (1)

HornWumpus (783565) | more than 4 years ago | (#31179728)

You're anon.

What's stopping you?

Re:Background checks are awful and stupid (1)

Mister Whirly (964219) | more than 4 years ago | (#31179736)

So, you are posting anonymously. Why don't you just tell us what the crime is, so we don't have to wonder?

Re:Background checks are awful and stupid (0)

Anonymous Coward | more than 4 years ago | (#31179888)

For a few different reasons. It's ugly and personal and I don't like talking about it. It's unusual enough that giving details might be enough to identify me. If you did identify me and knew details other innocent people could be hurt.

Re:Background checks are awful and stupid (1)

Mister Whirly (964219) | more than 4 years ago | (#31179954)

So, it was a crime so rare that it has only been committed by a few people? I seriously doubt that naming the statute you violated could be enough to identify you. Considering the only other detail I know about you is that you may occasionally read Slashdot, it would be very hard for myself or anyone else to string together enough data to make a positive ID. I am starting to smell a rat on this one.

Unless - OMG are you that guy from my junior high who got caught skull-fucking that dead sheep wearing a prom dress???

Re:Background checks are awful and stupid (1)

Antique Geekmeister (740220) | more than 4 years ago | (#31179976)

No, stupid behavior leads to failing background checks. Keep cause and effect in the correct order.

In most cases, even a felony for something foolish in your teens will not override years of professional experience. And many crimes do not necessarily lead to a repeat of the crime: some crackers, for example, have gone on to productive careers in software development or security.

But if you are a convicted child molester, I _do not want_ you anywhere unsupervised with children. The recidivism rate is too high. And if you have pulled the sort of stunts that, say, Brian Thomas Mettenbrink, a member of the cracker group "Anonymous", was convicted of, I don't want you anywhere near my systems. You'd have proven you were too self-righteous and vindictive to be trusted with my equipment.

The actual link . . . (1)

HazyRigby (992421) | more than 4 years ago | (#31179418)

. . . to the list, instead of an article discussing the list: Link [mitre.org]

Re:The actual link . . . (1)

HazyRigby (992421) | more than 4 years ago | (#31179432)

Or I could be a moron you can all safely ignore.

Oh, you mean VENDORs, not DEVELOPERs (1)

Jason Pollock (45537) | more than 4 years ago | (#31179426)

When you say "developer", I think of an individual employee. However, the individual employee isn't around long enough; the project validation will more than likely happen after the majority of them have finished, taken their final pay, and left.

As for the actual contract? It reads like lawyer bait.

    Consistent with the provisions of this Contract, the Vendor shall use the highest applicable industry standards for sound secure software development practices to resolve critical
    security issues as quickly as possible. The "highest applicable industry standards" shall be defined as the degree of care, skill, efficiency, and diligence that a prudent person
    possessing technical expertise in the subject area and acting in a like capacity would exercise in similar circumstances.

And finally, background checks? Seriously? Only if you want it to take 6+ months for me to hire someone.

They missed one (1)

sayfawa (1099071) | more than 4 years ago | (#31179442)

I didn't see this [boingboing.net] one in there... I once typed it into some code by accident. It's more common than you'd expect.

25 is a nice round number. (0)

Anonymous Coward | more than 4 years ago | (#31179460)

But I think they probably could have shortened it to about 6 or 7. "Sanitize every input", "pay attention to trusted vs untrusted input methods", "bounds checking - do it", "make sure the encryption you use is strong enough", "watch multi-threading carefully", and the interesting one: "while error messages should be helpful and detailed, remember that you're not the only one reading them."

Misplaced Burden (1, Interesting)

Anonymous Coward | more than 4 years ago | (#31179472)

The way to prevent most of these types of errors is to fix the programming language. A modern high-level language simply should not allow most of these things to happen. Any such language which does needs to be either fixed or discarded.

Yes, for low-level work you need languages without such safeguards. But for the rest of development work, the compiler/interpreter/runtime environment should prevent even the most careless of programmers from making most of these errors.

Blame HTML and the browser for XSS. (2, Insightful)

MikeFM (12491) | more than 4 years ago | (#31179880)

In the case of XSS I'd say fix (X)HTML and the browsers. By default scripting should not work in the body of a page. Force a meta tag to enable it in the head part of the page or by end-user override if they really must have it. There is really no reason scripting needs to be included in the body of a web page. Trying to completely block scripting, especially in IE which just executes damn near anything, is a real pain and often ends up excluding valid data such as comments including source code. If someone uses an unsafe browser it's their problem.

Micromanagement (1)

russotto (537200) | more than 4 years ago | (#31179484)

The model contract smacks of the customer attempting to micromanage the vendor's development process. You might get away with that if you're IBM or the Federal Government, but most smaller customers aren't going to have that kind of clout.

And of course, the "security training" section is pure self-promotion for SANS itself.

Just Show Me the List!! (5, Informative)

QuantumG (50515) | more than 4 years ago | (#31179558)

So much shit. So much commentary. Just gimme the list? Here it is:

  1. Failure to Preserve Web Page Structure ('Cross-site Scripting') [mitre.org]
  2. Improper Sanitization of Special Elements used in an SQL Command ('SQL Injection') [mitre.org]
  3. Buffer Copy without Checking Size of Input ('Classic Buffer Overflow') [mitre.org]
  4. Cross-Site Request Forgery (CSRF) [mitre.org]
  5. Improper Access Control (Authorization) [mitre.org]
  6. Reliance on Untrusted Inputs in a Security Decision [mitre.org]
  7. Improper Limitation of a Pathname to a Restricted Directory ('Path Traversal') [mitre.org]
  8. Unrestricted Upload of File with Dangerous Type [mitre.org]
  9. Improper Sanitization of Special Elements used in an OS Command ('OS Command Injection') [mitre.org]
  10. Missing Encryption of Sensitive Data [mitre.org]
  11. Use of Hard-coded Credentials [mitre.org]
  12. Buffer Access with Incorrect Length Value [mitre.org]
  13. Improper Control of Filename for Include/Require Statement in PHP Program ('PHP File Inclusion') [mitre.org]
  14. Improper Validation of Array Index [mitre.org]
  15. Improper Check for Unusual or Exceptional Conditions [mitre.org]
  16. Information Exposure Through an Error Message [mitre.org]
  17. Integer Overflow or Wraparound [mitre.org]
  18. Incorrect Calculation of Buffer Size [mitre.org]
  19. Missing Authentication for Critical Function [mitre.org]
  20. Download of Code Without Integrity Check [mitre.org]
  21. Incorrect Permission Assignment for Critical Resource [mitre.org]
  22. Allocation of Resources Without Limits or Throttling [mitre.org]
  23. URL Redirection to Untrusted Site ('Open Redirect') [mitre.org]
  24. Use of a Broken or Risky Cryptographic Algorithm [mitre.org]
  25. Race Condition [mitre.org]

Re:Just Show Me the List!! (1)

epp_b (944299) | more than 4 years ago | (#31179730)

Y'know, that whole list of 25 is really just a few items expanded into verbosity. You could basically narrow it down to unsanitized user input, unencrypted sensitive data, improper or no data length control, improper or no condition control, improper data storage, improper or no linearity control.

That's a six-item list that should really be common sense for any decent programmer.

The other points of this discussion are determining what potential vulnerabilities are even applicable, the likelihood of an attack occurring, and how much the client is actually willing to pay if that likelihood is low.

Re:Just Show Me the List!! (1)

QuantumG (50515) | more than 4 years ago | (#31179760)

Y'know, that whole list of 25 is really just a few items expanded into verbosity. You could basically narrow it down to unsanitized user input, unencrypted sensitive data, improper or no data length control, improper or no condition control, improper data storage, improper or no linearity control.

Or you could just narrow it down to: user/programmer doesn't care about security.

Sheesh.

Did any of these kill?? Re:Just Show Me the List!! (1)

davidwr (791652) | more than 4 years ago | (#31180000)

Any bugs that resulted in a human death move to the front of the line.

Defective automobile braking and accelerator systems perhaps? Medical equipment that delivered too much radiation due to a software error (vs. machine-operator error)?

Just outsource. (1)

nicknamenotavailable (1730990) | more than 4 years ago | (#31179568)

Outsource security and programming to those countries responsible for the attacks.

Right away the system will have fewer vulnerabilities and there will be fewer attacks.

I am divided on this one (1)

wisnoskij (1206448) | more than 4 years ago | (#31179582)

While it makes sense for the developer of any product to be held responsible for its quality, it does not make sense for some huge multinational company to sue a $20/hr programmer for billions in damages.

Good Luck (1)

epp_b (944299) | more than 4 years ago | (#31179600)

Good luck actually finding a programmer that will give you code you want at the price you're paying.

Oh, and protection against SQL injection attacks? That shouldn't be part of a contract; that should be implied.

Programmers are only a part of the problem (0)

Anonymous Coward | more than 4 years ago | (#31179628)

Yes, there are many bad programmers out there. Probably over 50% of them wouldn't understand the bugs (security or otherwise) if you sat down and tried to explain them to them. Probably most people who work as programmers should be in another field. This isn't, however, really the issue.

With commercial software, the real problems are well known. Product and project managers for the most part do not understand software. What they do understand is their deadline and making their bosses happy. Quality is always sacrificed in order to make those deadlines. Companies put far less emphasis on testing than they should, and even when companies have great programmers and go to great effort to test, things will slip through anyway. Fees, contracts, etc. are really just a replacement for training that comes too late, after the problem has already occurred.

Sanitization is a worrying term to use. (2, Informative)

argent (18001) | more than 4 years ago | (#31179658)

Improper Sanitization of Special Elements used in an OS Command

The best solution is not "sanitization" (which people usually perform by blocking or editing out what THEY think are dangerous metacharacters) but proper encapsulation. In addition, there's a misleading section here:

For example, in C, the system() function accepts a string that contains the entire command to be executed, whereas execl(), execve(), and others require an array of strings, one for each argument. In Windows, CreateProcess() only accepts one command at a time. In Perl, if system() is provided with an array of arguments, then it will quote each of the arguments.

Execl() is not a "C" API, it's a UNIX API. It doesn't involve quoting. On a UNIX system, you can safely take advantage of this mechanism to pass parameters and bypass either shell or application quoting inconsistencies. On Windows, even if your program is in Perl and you pass system() an array of arguments, Perl is still at the mercy of the called program to correctly parse the quoted string it gets from CreateProcess()... *unless* you are operating under the POSIX subsystem or a derivative like Interix.

In addition, whether you quote your arguments, use execl(), or use a smart wrapper like Perl's system(), you still need to ensure that COMMAND level metacharacters (like the leading dash (on UNIX) or slash (on Windows) of an option string) are properly handled.

This latter problem may remain even if you pass the command arguments through a configuration file to avoid the possibility of shell metacharacters being exploited.

Whitelists can't be simplistic. You can't ban the use of "-" in email addresses, for example. Encoding is better.
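
A minimal sketch of the encapsulation approach recommended above, in Python on a UNIX-like system (the grep call is just an illustration): pass an argument vector so no shell parses the input, and end option processing so a leading dash can't be read as a flag.

    import subprocess

    user_pattern = "-rf; rm -rf /"   # hostile-looking input

    # Each list element becomes exactly one argv entry, so the shell
    # metacharacters in user_pattern are never interpreted. The "--"
    # tells grep that everything after it is an operand, not an option,
    # which covers the leading-dash problem mentioned above.
    result = subprocess.run(
        ["grep", "--", user_pattern, "/etc/hostname"],
        capture_output=True, text=True,
    )
    print(result.returncode)   # 1 (no match) rather than anything destructive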

Re:Sanitization is a worrying term to use. (1)

pclminion (145572) | more than 4 years ago | (#31179710)

The best solution is not "sanitization" (which people usually perform by blocking or editing out what THEY think are dangerous metacharacters) but proper encapsulation. In addition, there's a misleading section here:

Excellent point which should be raised more often. "Sanitization" is a cry of helplessness. It says "I don't have control over my execution environment -- it does mysterious, inexplicable things, and I need to process my data to avoid causing strange things to happen." The problem is not the data, the problem is the environment.

A shell-variable-expansion exploit due to a call to system() can be solved several ways. The INCORRECT solution is to attempt complex hacks to "sanitize" the input. The CORRECT solution is to not use system() in the first place.

In general, introducing complicated languages (like shell script, or SQL) is a good way to absolutely fuck yourself. God damn SQL for making it so freaking hard to just STICK DATA INTO A DATABASE SAFELY. The fault is not with the programmer, who should have "sanitized his data" more extensively. The fault is the language itself, the API which forces us to combine the data with the commands themselves in a way that leaves holes open for exploitation. SQL should only ever have been used as a query language for humans. It should not have gained traction as a programmatic API. Now we have to suffer with SQL injection attacks. The problem is SQL itself.

Re:Sanitization is a worrying term to use. (3, Informative)

russotto (537200) | more than 4 years ago | (#31179818)

In general, introducing complicated languages (like shell script, or SQL) is a good way to absolutely fuck yourself. God damn SQL for making it so freaking hard to just STICK DATA INTO A DATABASE SAFELY.

Use prepared statements. A prepared "INSERT INTO FOO (BAR, BAZ, BIFF) VALUES (?, ?, ?)", along with parameters from the user, is safe from SQL injection attacks, no matter the parameters.

Unfortunately there are a few cases you can't do that. No way to use a prepared statement for an "IN" clause, for instance.
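
One common workaround for the IN-clause case mentioned above, sketched with Python's sqlite3 module and made-up table names: generate one placeholder per value, and still bind every value.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE foo (bar INTEGER, baz TEXT)")

    wanted = [1, 2, 3]   # values known only at runtime

    # Only the placeholders are spliced into the SQL text; the values
    # themselves are still bound as parameters.
    placeholders = ", ".join("?" for _ in wanted)
    query = "SELECT baz FROM foo WHERE bar IN (%s)" % placeholders
    rows = conn.execute(query, wanted).fetchall()
    print(rows)   # [] on the empty table, but no injection is possible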

Therac-25 (1)

SemperUbi (673908) | more than 4 years ago | (#31179666)

Bad programming in a radiation therapy machine caused it to deliver 100 times the intended radiation dose after certain keystrokes, burning patients badly and killing some of them. Wikipedia has the root-cause analysis.

blaming programmers.....DUMB (0)

Anonymous Coward | more than 4 years ago | (#31179672)

What is a programmer to do when managers demand short coding time? Nothing but leave features out. Each feature costs time, and given less time, companies have to be happy with more bugs.

Wow! It's actually an accurate and useful list! (2, Interesting)

deisama (1745478) | more than 4 years ago | (#31179678)

So, I clicked the link expecting something similar to the Slashdot description and was shocked to find a real and relevant list!

Cross-site scripting? SQL injection? Buffer overflow errors? Those are real problems and issues that any programmer would do well to learn about.

Really, presenting that information almost makes Slashdot seem, well ... responsible and informative.

I wonder if I just went to the wrong site...

Post a link to the actual list (0)

Anonymous Coward | more than 4 years ago | (#31179696)

Instead of articles *about* the list, go to http://cwe.mitre.org/top25/.

You get what you pay for... (2, Insightful)

Dgtl_+_Phoenix (1256140) | more than 4 years ago | (#31179778)

As much as we might like to think otherwise, software development is a business. And like all businesses, the goal is to generate profit by increasing revenue and decreasing cost. As such, an inherent bargain is struck between consumers and software shops as to the proper ratio of cost to quality. High-volume consumer applications get a lot of attention to quality, though less to security. It's all a matter of threat assessment versus the cost of securing against such threats.

Sure, we all want perfect software where the software engineer is held accountable for every bug. But we also want software whose cost is comparable to a 20-dollar-an-hour sweatshop programmer. The software that results is really an economic compromise between the two. Running a space shuttle or saving patients' lives? You probably are willing to shell out for the high-cost software engineer. Putting up your Hello Kitty fan club blog? You might settle for something a little bit less... high class.

I've been in this business for a while now, and as much as we like to wax poetic about quality, we are still just trying to have our cake and eat it too. Better, faster, cheaper. Pick two.

Re:You get what you pay for... (1)

MikeFM (12491) | more than 4 years ago | (#31179920)

You don't think $7/hr is enough to write good code? Oh and you want it fast? Sure. Good, cheap, quick. Pick one.

coding open source is the biggest mistake. (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#31179788)

i hear if you code open source that you end up with this irresistible taste for other men's dicks and shit. smokin them dicks. eating that shit right from another faggots ass.

I always do that... (0)

Anonymous Coward | more than 4 years ago | (#31179856)

I must have put a decimal point in the wrong place or something. Shit. I always do that. I always mess up some mundane detail.

Show me (1)

hoytak (1148181) | more than 4 years ago | (#31179914)

a programmer who doesn't get bitten by race conditions on occasion, and I'll show you one who doesn't program more than basic multithreaded code.

A good programmer is a good debugger...
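
A minimal sketch of the kind of race being alluded to (a read-modify-write that two threads can interleave) and the lock that closes the window; the shared counter is just an illustration.

    import threading

    counter = 0
    lock = threading.Lock()

    def increment(n):
        global counter
        for _ in range(n):
            with lock:          # without the lock, "counter += 1" is a
                counter += 1    # load/add/store that threads can interleave

    threads = [threading.Thread(target=increment, args=(100_000,)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(counter)   # 400000 with the lock; often short of that without it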

So what does this do to open source? (0)

Anonymous Coward | more than 4 years ago | (#31179936)

What if you obtain your software through means other than a written, detailed negotiated contract?

What if you provide software you have written to the world under terms no more detailed than, say, the GPL?

Is this really a serious effort at security, or is it a corporate push to get entities away from libre software?

Any word on who is really behind this?

Lol @ Dangerous (2, Informative)

JustNiz (692889) | more than 4 years ago | (#31179958)

I work as a software developer in the avionics industry.
This list is ridiculous.
There's nothing any website programmer could do that is even remotely dangerous compared to what we could screw up, yet all I see in the list are website programming bugs.

Nice suggestion to... (1)

zullnero (833754) | more than 4 years ago | (#31179974)

Hold devs responsible for bugs that creep into code. Because, after all, we all know that developers always get to work with unlimited time and NEVER have any pressure to cut corners and get something out fast...right?

If they do that, there has to be a means to defend oneself in that situation, or they're suggesting that, unlike in any other production industry, the workers would be held accountable for a company's systematic failure to provide an operating environment and schedule that could produce success. Work a developer 24 hours without paid overtime? No problem. After all, if they get delirious and check in some code they were using for testing, and were too out of it to remove it, and it gets into the final version...and poof...there's a bug, you can sue them for whatever they were paid during 8 hours of that shift. Or a lot more. Then, you end up with the legal problem of defining what a proper environment and schedule would be. After the dust settles, all you'll have is a pile of bureaucracy and a legal mess that will just end up shafting the developer, and not the management ultimately responsible for the release, after way too much money and time is spent trying to wade through that mess each time a bug is found in production code. Ridiculous.

Actions speak louder than words (2, Insightful)

nick_davison (217681) | more than 4 years ago | (#31180062)

"As a customer, you have the power to influence vendors to provide more secure products by letting them know that security is important to you,"

And, as a consumer, you have the power to influence vendors to provide better employment and buying practices by letting them know that they are important to you.

Meanwhile, the vast majority of America continues to shop at Walmart whilst every competitor goes out of business.

"Does it get the job done? Now what's the cheapest I can get it for?" is most people's primary motivation.

Sellers who listen to them saying "I want security!" and deliver that, at the expense of greater cost, are then left wondering why the competitor who did just enough to avoid standing out on security, but otherwise kept their product slightly cheaper, is selling many times more copies.

So, yes, people can influence sellers with their actions. The problem is, it needs to be their actions, not their words. Even worse, they're already successfully doing just that - unfortunately, their actions are screaming something quite different from any words about "security is truly important to me."

What it's like to do software like that (5, Interesting)

Animats (122034) | more than 4 years ago | (#31180064)

Been there, done that, in an aerospace company. Here's what it's like.

First, the security clearance. There's the background check before hiring, which doesn't mean much. Then, there's the real background check. The one where the FBI visits your neighbors. The one where, one day, you're sent to an unmarked office in an anonymous building for a lie detector test.

Programming is waterfall model. There are requirements documents, and, especially, there are interface documents. In the aerospace world, interface documents define the interface. If a part doesn't conform to the interface document, the part is broken, not the document. The part gets fixed, not the documentation. (This is why you can take a Rolls Royce engine off a 747, put on a GE engine, and go fly.)

Memory-safe languages are preferred. The Air Force used to use Jovial. Ada is still widely used in flight software. Key telephony software uses Erlang.

Changes require change orders, and are billable to the customer as additional work. Code changes are tied back to change orders, just like drawing changes on metal parts.

In some security applications, the customer (usually a 3-letter agency) has their own "tiger teams" who attack the software. Failure is expensive for the contractor. NSA once had the policy that two successive failures meant vendor disqualification. (Sadly, they had to lighten up, except for some very critical systems.)

So that's what it's like to do it right.

A real problem today is that we need a few rock-solid components built to those standards. DNS servers and Border Gateway Protocol nodes would be good examples. They perform a well-defined security-critical function that doesn't change much. Somebody should be selling one that meets high security standards (EAL-6, at least). It should be running on an EAL-6 operating system, like Green Hills Integrity.

We're not seeing those trusted boxes.
