Oracle Exec Strikes Out At 'Patch' Mentality 264

An anonymous reader writes "C|Net has an article up discussing comments by Oracle's Chief Security Officer railing against the culture of patching that exists in the software industry." From the article: "Things are so bad in the software business that it has become 'a national security issue,' with regulation of the industry currently on the agenda, she said. 'I did an informal poll recently of chief security officers on the CSO Council, and a lot of them said they really thought the industry should be regulated,' she said, referring to the security think tank."
  • Of course (Score:5, Insightful)

    by Anonymous Coward on Monday May 29, 2006 @04:46AM (#15423660)
    Oracle are (rightly or wrongly) worried about competition from Open Source. Regulation of the software industry would be a major benefit to them in this. Anyone who didn't meet the regulators' criteria couldn't compete.
    • Re:Of course (Score:5, Informative)

      by arivanov ( 12034 ) on Monday May 29, 2006 @05:35AM (#15423773) Homepage
      No.

      Not at all in fact.

      Open Source has nothing to do with this and I would suggest that you actually do some research instead of parroting the usual "Open Source will fix all problems" mantra.

      Oracle has recently been shown to take up to 5 years to patch glaring security holes. This has reached the point where security researchers like Litchfield, who have had an ongoing relationship with Oracle for 10+ years, do not want to work with them any longer. Note, we are not talking sc1pt k1dd10tz sitting in their dad's basement here. The people in question consult for banks, governments and large corps, and cannot actually recommend them a working security policy, because Oracle cannot get its head out of its arse and patch a security problem for multiple years after it has been reported to them.

      As a result, people who used to work on Oracle problems and report them in private to Oracle have started posting them openly, "0 day" style, or giving Oracle a fixed 1-month notice of an impending posting, regardless of whether there is a patch or not.

      Obviously Oracle is pissed.

      First of all it breaks all of their marketing bollocks about unbreakability and security to bits.

      Second, it is threatening their sales to customers in regulated markets, where security issues must be addressed within a fixed term after they become known.

      This is the reason for them to rattle the "regulation" sabers and moan about a "patch culture". Open Source has nothing to do with it.
      • by SnowZero ( 92219 )
        I noticed that you used the Queen's English in writing your post, which means you must be one of those "evil British hackers" mentioned in the TFA.

        Remember everyone, the lower the patch frequency a product has, the more secure it must be. Pay no attention to the wookie.
      • Re:Of course (Score:5, Insightful)

        by Anonymous Coward on Monday May 29, 2006 @06:14AM (#15423857)
        Open Source has nothing to do with this and I would suggest that you actually do some research instead of parroting the usual "Open Source will fix all problems" mantra.

        I said nothing at all about open source fixing all problems, or fixing any problems for that matter.

        If you've ever worked in an industry that's gone from being unregulated to being regulated, you'll know that one of the first things that happens is that the number of participants decreases as all those that can't afford the overhead of the regulations and of maintaining a compliance department (not the same as quality assurance; experts in the interpretation and application of the regulations) leave the field. One of the next things that happens is that the number of new suppliers entering the market plummets.

        There are many disadvantages to being regulated - additional costs, and potential damage to reputation if you conflict with the regulator - but the big advantage is a barrier to competitors entering your market.

        That does NOT mean that regulation is a bad thing - that depends on the specifics. However, if a supplier is arguing for regulation of their own market, then the chances are that they're doing so to cut down the competition. It's unlikely that they're asking for it because they can't control their own engineers and are hoping a regulator will do better.

        If you've observed Oracle at all you'll have noticed that they are worried by competition from open source. It is likely that that's their target in this, though it could be other smaller competitors.
      • If by regulation they mean making a law against disclosing security holes in regulated software, it follows the security-by-obscurity dogma. That might not have anything to do with Open Source as a cause; yet, by design, it would make it impossible for Open Source to fulfill this "security by obscurity" requirement.

        As for the "patch culture" head-liner: the term patch is commonly associated with Open Source and its inherent quality for any piece of software to deviate from the original distribution giving
      • Re:Of course (Score:2, Insightful)


        First of all it breaks all of their marketing bollocks

        Second it is threatening their sales to customers in

        It sure sounds like an Oracle problem to me. How the hell can they try to drag in a regulatory body, whose essential function would be to raise the barrier to market entry and protect and grow their market share?

        Well, we *know* how they can try. No way in hell they will succeed.
        • Pretty easy (Score:5, Insightful)

          by Moraelin ( 679338 ) on Monday May 29, 2006 @10:12AM (#15424447) Journal
          Given what Oracle's problem _is_, probably what they _really_ want isn't regulation of the "you must prove that your software passes this and that criteria to be allowed to sell it" sort. (Which would also raise entry barriers for competitors.) I mean, really, if you were a company which takes five fucking _years_ to bother patching a security hole, and even then only when an exploit was widely publicized, you're not going to ask for a regulation that'll make you pull the product off the market until you fix it.

          The kind of regulation they want is more like "you're an evil irresponsible hacker and going to jail if you disclose bugs in someone else's product." Yes, it's security by obscurity. But that way Oracle can happily spew bullshit about being secure and unbreakable, and never have to fix any bugs.

          Basically Oracle doesn't give a shit if Corporation X's database is riddled with bugs and exploits. They just don't want the PHB's at Corporation X to know about it.

          If it also results in some entry barrier, all the better, but that's not the main goal.
  • by Mikachu ( 972457 ) <burke...jeremiahj@@@gmail...com> on Monday May 29, 2006 @04:51AM (#15423675) Homepage
    Of course the "patch, patch, patch" business plan is bad for consumers. But in truth, most software companies don't care about consumers. They care about making money. As it happens, most people really don't care enough about the subject to make the companies change.

    One of the examples in the article asks, "What if civil engineers built bridges the way developers write code?" and answers, "What would happen is that you would get the blue bridge of death appearing on your highway in the morning." The difference here, however, is that civil engineers couldn't get away with making rickety bridges. There would be public outcry if a bridge broke while people were on it. In the software world, however, consumers scream, the companies just fix it with a patch, and that shuts the consumers up. It saves companies a lot of money and time in testing.
    • by pe1chl ( 90186 ) on Monday May 29, 2006 @04:57AM (#15423688)
      Another difference is that when you build a bridge and it collapses, you will be held liable for it.
      When you build software, you just attach an EULA that says "I shall not be held liable" and that's it.

      Once software makers, especially the large commercial companies, find themselves in the same boat as other industries and have to pay compensation when bad stuff is released, they will certainly step up quality control to the next level. Because it saves them money.
      • by Sycraft-fu ( 314770 ) on Monday May 29, 2006 @05:35AM (#15423774)
        The difference is that software is expected to be cheap, released fast, and to run on all kinds of platforms. Sorry, that leads to errors. You can have software that never needs patching; you just have to make some concessions:

        1) Development cost will be a lot more. You are going to have to spend time doing some serious regression testing, basically testing every possible combination of states that can occur. May seem pointless, but it's gotta be done to guarantee real reliability (see the sketch after this list).

        2) Development time will be a lot longer. Again, more time on the testing. None of this "Oh look, there's a new graphics card out, let's get something to support it in a month." Be ready to spend years sometimes.

        3) Hardware will be restricted. You are not going to be running this on any random hardware where something might be different and unexpected. You will run it only on hardware it's been extensively tested and certified for. You want new hardware? You take the time and money to retest everything.

        4) Other software will be limited. Only apps fully tested with your app can run on the same system. Otherwise, there could be unexpected interactions. The system as a whole has to be tested and certified to work.

        5) Slower performance. To ensure reliability, things need to be checked every step of the way. Slows things down.
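
        A minimal sketch of what point 1's brute-force regression testing looks like, in Python. The clamp() function and its tiny state space are hypothetical stand-ins, chosen so that every combination of states really can be enumerated:

          # Brute-force every combination of input states and check an invariant.
          # clamp() and the state space are hypothetical stand-ins.
          import itertools

          def clamp(value, low, high):
              # Force value into the inclusive range [low, high].
              return max(low, min(value, high))

          values = range(-3, 4)                # every candidate input
          bounds = [(0, 1), (-2, 2), (1, 1)]   # every candidate configuration
          for value, (low, high) in itertools.product(values, bounds):
              result = clamp(value, low, high)
              assert low <= result <= high, (value, low, high, result)
          print("all state combinations pass")

        Even this toy has 21 cases; real software has state spaces far too large to enumerate exhaustively, which is exactly why this kind of testing costs so much.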

        If you aren't willing to take that, then don't bitch and demand rock solid systems. I mean, such things DO exist. Take the phone switches for example. These things don't crash, ever. They just work. Great, but they only do one thing, you use only certified hardware, they've had like one major upgrade (5ESS to 7R/E) in the last couple decades, and they cost millions. You can do the same basic type of stuff (on a small scale) with OSS PBX software and a desktop, but don't expect the same level of reliability.

        The thing is, if your hypothetical bridge were software (and it's quite simple compared to software) people would expect to be able to put the same design anywhere and have it work, drive tanks over it and not have it collapse, have terrorists explode bombs under it and have it stay up and so on and have all that done on 1/10th of the normal budget.

        Until we are willing to settle for some major compromises, we need to be prepared to accept patches as a fact of life. I mean hell, just settling on a defined hardware/software set would do a lot. Notice how infrequent it is to see major faults in console games. It happens, but not as often. Why? Well, because the hardware platform is known, and you are the only code running. Cuts down on problems immensely. However, take the same console code and port it to PC, and you start having unforeseen problems with the millions of configurations out there.

        Me? I'll deal with some patches in return for having the software I want, on the hardware I want, in the way I want, for a price I can afford.
        • Take the phone switches for example. These things don't crash, ever. They just work.

          Sorry, that's just not true. Phone switches _do_ crash - it's just that the telcos have learnt to build networks with a hell of a lot of redundancy. If a phone switch goes down then the worst that'll happen is you'll lose the calls that are in-progress on that switch (actually, the switch may be able to recover the calls if it resets quickly enough - just because the signalling goes down for a few seconds doesn't necessari
        • by MathFox ( 686808 ) on Monday May 29, 2006 @07:17AM (#15423983)
          Too much time is spent in "integration" and testing because management refuses to plan time for high-level design. One can create better quality software in about the same amount of time by using a proper development process. Some hints:
          • Do a proper high-level design.
          • Review your design with all stakeholders, including QA/testing and marketing.
          • Plan time to fix issues in all steps of the project.
          • Prototypes are to throw away, don't build your product on top of them.
          • Require specifications for all parts of the application.
          • Peer review all specifications.
          • Peer review all code.
          • Perform unit and module tests on all parts of the code.
          • Fix bugs as early as possible.
          Development will cost more and take longer
          It will take more time till a programmer starts coding, but you will need less time to find and fix bugs. A clean design leads to cleaner module interfaces, which makes tracing a bug easier. Doing module testing means that a lot of bugs are found early and are automatically traced to an offending module, which means quick fixing.
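
          As a concrete illustration of the unit/module test item above, a minimal sketch in Python's unittest style (parse_port is a hypothetical module function, not anything from TFA):

            import unittest

            def parse_port(text):
                # Hypothetical module function: parse a TCP port, rejecting junk.
                port = int(text)              # raises ValueError on garbage
                if not 0 < port < 65536:
                    raise ValueError("port out of range: %d" % port)
                return port

            class ParsePortTest(unittest.TestCase):
                def test_valid(self):
                    self.assertEqual(parse_port("8080"), 8080)
                def test_out_of_range(self):
                    self.assertRaises(ValueError, parse_port, "70000")
                def test_garbage(self):
                    self.assertRaises(ValueError, parse_port, "eighty")

            if __name__ == "__main__":
                unittest.main()

          When a test here fails, the blame lands on this one module - the "automatically traced to an offending module" property described above.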

          Restrictions on hardware and software
          For high reliability, yes. It's hard to write software that can replace blown-out fuses. I think it is ridiculous that an Internet-connected Windows system "automagically" degrades to a near useless condition, so Windows should be thrown out.
          It should be possible to run a decent selection of software on a server, where the user selects his mixture, taking into account his desired level of reliability. An Operating System should sufficiently isolate processes so that a single bug doesn't crash the machine.

          Slower performance.
          Needless consistency checks slow things down (and improper checks may even cause instability). With a proper design you know what to check where, so you only check once. In my experience good quality software performs better than bad software.

          Take the phone switches for example. These things don't crash, ever. They just work. [...] they've had like one major upgrade (5ESS to 7R/E) in the last couple decades
          Sorry, I had to pick myself up from the floor; I fell off my chair laughing. I did work for a telco and crashed a few switches myself, the Lucent stuff you mention. Ericsson makes more reliable systems (but they have a different design philosophy). And software updates for phone switches appear regularly.

          • With a proper design you know what to check where, so you only check once.


            That isn't going to weed out all bugs. What if the programmer is tired, makes a mistake, and forgets to check for a precondition in some places? Boom. And those kinds of mistakes happen a lot. If the code doesn't crash, that can be even worse, as it may lead to corruption of the internal state.
            • What if the programmer is tired and makes a mistake and forgot to check for a precondition in some places?

              The development process is there to catch these kinds of mistakes. When a programmer has proper specifications of what to program, he has fewer things to worry about and can spend his attention on making better code. The programmer should have the time to look over his own code and run his test set after a proper night of sleep.
              Secondly, we do peer reviews and module tests. If you have a decent revie
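
              To make the precondition point concrete, a minimal sketch (the require() helper and the transfer() routine are hypothetical): the check lives in one reusable helper, so no individual tired programmer has to remember to re-type it at every call site.

                def require(condition, message):
                    # One shared precondition check instead of ad-hoc ifs everywhere.
                    if not condition:
                        raise ValueError(message)

                def transfer(balance, amount):
                    # Hypothetical routine: debit amount; preconditions up front.
                    require(amount > 0, "amount must be positive")
                    require(amount <= balance, "insufficient funds")
                    return balance - amount

              A failed check crashes loudly at the boundary instead of silently corrupting internal state, which is exactly the failure mode the grandparent worried about.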

          • by spirality ( 188417 ) on Monday May 29, 2006 @02:05PM (#15425278) Homepage
            You make lots of good points. However, software is generally not written from scratch. That is, more people are maintaining existing systems than writing new ones.

            Second, software is malleable. It grows over time, unlike the bridge, which is static. After the initial release you add to it, oftentimes in ways that were unintended by the original designers.

            We do not have Brooklyn Bridge 2.0.

            So yes, everything you mention would improve quality, but because of its malleable nature, software will always be different from things in the physical world.
        • Your example of the console market misses the mark a little. The real reason such products are so solid has less to do with the limited hardware than with proper design and coding practices, and some of the best quality control in any software industry. Publishers insist upon that, for the reason that if a cartridge or DVD-ROM game has a fatal error in it, customers return their discs or cartridges in droves. That costs millions. The penalty for failure is high, very high, since there is no w
        • Hear, hear.

          This quote seems appropriate:

          [G]overnment's view of the economy could be summed up in a few short phrases: If it moves, tax it. If it keeps moving, regulate it. And if it stops moving, subsidize it. -Ronald Reagan
        • Take the phone switches for example. These things don't crash, ever. They just work. Great, but they only do one thing, you use only certified hardware, they've had like one major upgrade (5ESS to 7R/E) in the last couple decades, and they cost millions.

          As other posters have already noted, telephone switches do crash or, much more frequently, have impairments short of a complete outage. Lucent's marketing boasts of five 9s availability, but that's based on aggregate FCC reporting data, not for any one sw

      • Except that bridges get potholes, and cars hit and scrape the anti-rust paint, and roads get renamed so they need new signs, or other roads get changed so the traffic doubles, etc. Bridges are hardly static. Oracle, on the other hand, has been inexcusably static with their software, precisely because it is a lumbering behemoth that is nearly impossible for anyone but a highly trained person with a huge investment of time to manage.

        They can't tell which pothole, when fixed, will cause all cars to go faster and skid off the place they d
      • Another difference is that when you build a bridge and it collapses, you will be held liable for it.
        When you build software, you just attach an EULA that says "I shall not be held liable" and that's it.

        Do you know the differences between a program and a bridge ?

        1. Bridge builders have exact requirements (needs to carry this much weight, needs to take this much wind, needs to last this many years) from the start. Programmers usually keep on getting new requirements and have the old ones changed during th
        • by pe1chl ( 90186 )
          Of course software can be treated as a science, with mathematic roots and stable foundations.
          Of course people could look upon programmers as they look upon engineers: this is something that you need a good education and training for, and that you should not attempt as a naive bystander.

          In reality, this is not happening. There have been times when unemployed people with some not-so-practical education were retrained as programmers in a couple of weeks. And we see development environments that push "trial a
    • First, bridges are quite a bit less complicated than software. Second, there are numerous examples of bridges that have had structural flaws. Just because they don't turn blue with obvious error codes stamped on them does not mean they are perfect. Bridges must undergo repair periodically, or they will fall apart.

      Bridges solve one problem: Supporting X weight across Y distance, taking into account building materials and terrain.

      Software is usually far more complex in what it tries to accomplish. Its no
      • "Second, there are numerous examples of bridges that have had structural flaws."

        And the very concept of road construction is very much trial and error, with thousands of people actually getting _killed_ every year, often because of known problems that should have been fixed to provide a safer infrastructure.
    • As a software developer, I lie awake at night dreaming of only having to solve a problem as simple as a bridge. It has only one use case: vehicles of a known weight with a known wheel surface traveling in predetermined paths at a predetermined rate of speed. Also, if you dig down deep enough in the Earth, there is always something solid to anchor the bridge. And bridge builders have millions of existing examples which can be studied and reused.

      In software, half the stuff people will do with it were unknown

        Sorry, but as a mechanical engineer (who works with a lot of civil engineers), I can't let this one pass. You wrote:

        I lie awake at night dreaming of only having to solve a problem as simple as a bridge. It has only one use case: vehicles of a known weight with a known wheel surface traveling in predetermined paths at a predetermined rate of speed.

        and then you wrote:

        people would be in an uproar about all the deaths that are only possible because of the bridges: people jump off of them, cars crash

  • Bridges are simple and their uses don't increase.

      It's like one 1000-transistor circuit or a 200-line function; that's it.
  • yeah... (Score:3, Informative)

    by narkotix ( 576944 ) on Monday May 29, 2006 @04:52AM (#15423676)
    this [techtarget.com] explains it all... bunch of slackasses!
  • by zappepcs ( 820751 ) on Monday May 29, 2006 @04:59AM (#15423695) Journal
    Wow, really nice swipe at the British. FTFA:

    She claimed that the British are particularly good at hacking as they have "the perfect temperament to be hackers--technically skilled, slightly disrespectful of authority, and just a touch of criminal behavior."

    It seems to me that the F/OSS community has shown that fast and effective patches can be applied, and that software we pay for has less than reasonable responses to such threats. I use F/OSS and I'm quite happy with the response they have to software problems. I don't expect it to be of NASA quality, just to be good, and it is. For the amount that you have to pay for Oracle et al, you expect fast responses to problems. The problem is that they don't respond fast enough. There is NO bullet-proof software, though I give a hat nod to the guys that wrote the code for the Mars rovers. Certainly, Oracle isn't showing that they deserve the price they demand, at least not in this respect.

    I might be off topic, but all the F/OSS that I use, delivers what I pay for AND MORE. The software that I have to pay for is lacking. When you pay thousands of dollars, you expect patches in a timely manner, and before you get hacked. I think this is a big reason that F/OSS will continue to win hearts and minds across the world. Despite the financial differences, F/OSS actually cares, or seems to, and they do fix things as soon as they find out, or so it seems to me. They have a reputation to uphold. Without it, they will just wither and die. It amazes me that investors, stock holders, and customers are willing to wait for the next over-hyped release of MS Windows while they suffer the "stones and arrows" of the current version. It appears that no matter how bad commercial software is, people rely on it. Yes, of course there is more to the equation than this simple comparison, but I think this is important. If you weigh what you get against what you pay, F/OSS is a good value. The argument is old, and worn, but ROI is a big deal, and patches make a difference to ROI.

    Is it really what the software industry needs? A set of rules to make things bullet-proof... which of course won't ever happen. That kind of mindset is totally wrong; even though the sentiment is in the right place, you can't regulate quality in this regard. Sure, you can make sure that all gasoline is of a given quality, but I don't trust the government to test and regulate software. The US government already has a dismal record of keeping their own house in order on this account; I don't want them telling me how to do anything or what I can and cannot sell, never mind what I can give away for free under the GPL.

    • She claimed that the British are particularly good at hacking as they have "the perfect temperament to be hackers--technically skilled, slightly disrespectful of authority, and just a touch of criminal behavior."

      Sums me up perfectly old boy (well maybe not the technically skilled part)
    • There is NO bullet proof software, though I give a hat nod to the guys that wrote the code for the Mars rovers.

      Ah, that would be the software on the rovers that almost cost the mission quite early on then. :)

      FWIW, I believe the rover software runs under VxWorks. It would, of course, be very interesting to see the software - it's a shame NASA aren't likely to open-source it. If they did, I could quite imagine a few build-your-own-mars-rover projects popping up on the web. :)
  • by Colin Smith ( 2679 ) on Monday May 29, 2006 @04:59AM (#15423697)
    Most "engineers" are mechanics. It is indeed time that the software developers, in fact everyone in the industry started to act in a more professional manner, that means understanding the principles, designing and building systems which are known to be able perform to specifications. When I say known, I mean modeled and tested.

    You can start taking the profession seriously by joining your local professional engineering body.

     
  • Re: "Chief Security Officer Mary Ann Davidson has hit out at an industry ... wedded to a culture of "patch, patch, patch," at a cost to businesses of $59 billion"

    So, if people pirated software, instead of buying it, there would be no need for vendors to provide patches and business would be $59 billion richer.
  • by Toby The Economist ( 811138 ) on Monday May 29, 2006 @05:02AM (#15423704)
    "I did an informal poll recently of chief security officers on the CSO Council, and a lot of them said they really thought the industry should be regulated,' she said, referring to the security think tank."

    Funnily enough, I'm just now reading Darrell Huff's book, "How To Lie With Statistics".

    The problems with her poll are manifold.

    Firstly, her group is composed of security officers who are on the CSO Council; might their views differ from those of security officers not on the Council? Perhaps they tend to be more of the belong-to-an-organised-body sort? Might they therefore be predisposed towards regulation?

    Secondly, of the officers on the Council, which ones did she ask? All of them? Or did she have a bias towards asking those she already knew would agree? Perhaps those who found it rather boring and aren't quite so pro-organised-bodies just don't turn up at the meetings.

    Thirdly, what's her position in the organisation? If *she* asks the question, are people more likely to say "yes" than they would be to another person?

    Fourthly, are people inclined in this matter to say one thing and do another anyway? E.g. if you survey how many people read trash tabloids and how many read a decent newspaper, you'll find your survey telling you the decent newspaper should sell in the millions while the trash should sell in the thousands - and as we all know, it's the other way around!

    Fifthly, even if the views of members of the CSO Council truly represent all security officers, and even if they were all polled, who is to say the view of high-level security officers is not inherently biased in the first place - for example, towards regulation?

    So what, at best, can her poll tell you? Well, at best, it can tell you that chief security officers who regularly turn up at meetings will say to a particular pollster, for whatever reason (and there could be widely differing reasons), that they think regulation is a good idea.

    Well, I have to say, that doesn't tell us very much, and that's even assuming the best case for some of the issues, which is highly unrealistic.
      So, what you're saying is: her survey needs some patches?

      ---

      Insisting on absolute safety is for people who don't have the balls to live in the real world - Mary Shafer [yarchive.net], NASA

    • Or, as Homer Simpson put it..

      "Oh, people can come up with statistics to prove anything. 14% of people know that."
    • Great points, but you missed one:

      Sixthly, what does "a lot" mean? "A lot thought the industry should be regulated", eh? Was that a majority? Did a lot more _disagree_?

      Every week there's a new security survey, usually of about 50 people, showing how critical it is that I rush out and buy a product from the company that sponsored the survey. It tends to make me somewhat skeptical of surveys and polls. Very few stand up to any sort of scrutiny, though there are occasional exceptions [pwc.com]
    • Great book. I had to read it for a Critical Thinking course. It definitely gives you a good view of how the statistics generated for the media really work, as well as how much weight you should put on them.

      OT: Always makes me think of Homer Simpson saying "Oh, people can come up with statistics to prove anything, Kent. 14% of people know that."
    • Geez, you had to write all this down to debunk the phrase "I did an informal poll recently of chief security officers on the CSO Council, and a lot of them said they really thought the industry should be regulated,"? (Emphasis mine)

      Here's something to *really* throw a monkey wrench into her argument: a) the poll was informal, and she doesn't even have any numbers to back it up, and b) she just says "a lot of them." "A lot" can mean 25%. It by no means has anything to do with counting a majority. The real issu
  • This, from Oracle? (Score:5, Insightful)

    by Anonymous Coward on Monday May 29, 2006 @05:07AM (#15423720)
    Whose patches are infamously known to break stuff, are released in 6-month batches (maybe just a mite too spaced out?), and are so infamously poor at actually patching their bugs that they currently have an open, publicly known 0day with no patch, because they screwed up patching it last time and it's still open?

    And they think security patches are a poor model?

    Maybe that's why they put so little effort into them. Maybe that's because they put so little effort into them. Maybe some people think of it as bridge maintenance, and they want to build the bridge perfect every time? When they can't even get patches right with six months between them? Fat chance.

    Honestly, out of the people in the software industry, even Microsoft do a better job, security-response-wise, than Oracle. And when you're behind Microsoft in that department, you've really got a problem.

    They need to make a serious effort at security response and treat it like a real priority, not show-ponying about regulation when, if they were regulated, they would still be completely unable to respond, but would point to poorly-drafted regulation as "tying them up in red tape".
  • by 228e2 ( 934443 ) on Monday May 29, 2006 @05:13AM (#15423733)
    This infuriates me to no end, when people use references they saw on the back of a cereal box because they thought it was cute. FTA:

    "What if civil engineers built bridges the way developers write code?" she asked. "What would happen is that you would get the blue bridge of death appearing on your highway in the morning."

    I'm sorry, but there are no crazy people scanning my highway for open ports, and I don't see script kiddies pinging my roads. Graffiti aside, they are left alone. Code that is written works just fine if people don't try to overflow buffers and install rootkits. The bridge I see out of my window is fine because people don't hit it with sledgehammers.



    Just my 2 cents . . . .
    • Well, how about buildings, then? They are supposed to keep burglars out, and yet very few houses crash regularly.

      The difference is that a computer program can do so many different things. If buying a new toaster could install an invisible front door with no lock right next to your regular door, then I think we would have a lot more real world security problems.
    • I'm sorry, but there are no crazy people scanning my highway for open ports, and I don't see script kiddies pinging my roads.

      No, but highways and bridges have to contend with wind, rain, snow and occasional collisions. You know what happens when a truck slams into a bridge abutment? The truck is destroyed and the bridge quivers a bit.

      Security attacks are just part of the software equivalent of weather. Good software shrugs it off.

      Code that is written works just fine if people dont try to over flow buff

    • It's not illegal to point out that a bridge is about to fall down and demand that something be done about it, either.
  • by Dasher42 ( 514179 ) on Monday May 29, 2006 @05:14AM (#15423737)
    People outside the software development field really do make an awful lot of assumptions about the number of things that can go wrong in millions of lines of source code. Specification versus implementation is a tricky beast by itself.

    If they really want to follow through with this talk, they'd better be prepared for the design decisions that go along with it, code reuse most of all. One thing that I think is particularly detrimental to code reuse is a proprietary model where the OS and every software vendor re-invents wheels over and over. You're going to need more open specs to change that.

    If this is rooting for regulation of the software industry, beware. The big guys have a lot more to gain from it than the small innovators and startups. Who would really want to take advice from stereotyping wags like that anyway?
  • Just Be Clear (Score:4, Insightful)

    by Enderandrew ( 866215 ) <enderandrew@NOsPAM.gmail.com> on Monday May 29, 2006 @05:25AM (#15423753) Homepage Journal
    Often, when consumers are given the choice they prefer to have software sooner, even in a beta state. We joke about how official releases have made us all beta testers, but that doesn't seem to stop us from purchasing software.

    Industry regulation is a very bad idea. It will cripple OSS development. It will place an unnecessary burden on taxpayers to fund the red tape. Furthermore, wouldn't regulation somewhat require the regulators to in the end have access to source code?

    Do you think major corporations are just going to hand over source code? Can you imagine the leaks?

    Lastly, the government has time and time again demonstrated they have little to no understanding of technology. Do we really want them making sweeping decisions regarding software?
    • Re:Just Be Clear (Score:4, Insightful)

      by erroneus ( 253617 ) on Monday May 29, 2006 @06:24AM (#15423870) Homepage
      Often, when consumers are given the choice they prefer to have software sooner, even in a beta state. We joke about how official releases have made us all beta testers, but that doesn't seem to stop us from purchasing software.

      Actually, it does. At least in my case, and in the case of the business I work for. The fact is, we have quite a few programmers on staff due to the realization that we KNOW we cannot trust anyone but ourselves to address the concerns of the company directly and diligently. We don't create our own word processors. We have no plans to write our own Photoshop clone. But for many apps that are critical for business flow, we either wrote it ourselves, or have a great deal of leverage over the development of the apps we use.

      Industry regulation is a very bad idea. It will cripple OSS development. It will place an unnecessary burden on taxpayers to fund the red tape. Furthermore, wouldn't regulation somewhat require the regulators to in the end have access to source code?

      OSS would have an inherent exemption. Regardless of where or how it is used, it's still 'hobby' coding. No pretense is made that it is a for-profit effort. However, if there are any OSS projects that are designed for profit, then yeah, perhaps some level of consumer protection is in order. EULAs have questionable legal status as it is, but I think it's time we struck them down as invalid and forced 'professionals' to accept the blame for shoddy work. As for burdens on taxpayers? OMG. Are you serious? And as for regulators having access to source code? Probably not a bad idea! We've all heard of source code escrow. Perhaps it should ALL be that way.

      Do you think major corporations are just going to hand over source code? Can you imagine the leaks?

      Yeah, they would, as a continued cost of doing business. Many of the products we use in the physical world are easily duplicated, and most are. Unsurprisingly, there is more than one maker of clothing. More than one burger joint. More than one maker of plastic food storage containers. More than one maker of automobiles. In these cases, it's not the technology that differentiates the product. It's the QUALITY and the reputation of the business (and yeah, the price too) that factors into consumer choice.

      But yeah, I see your point about leaks... it could result in software piracy, copyright violations and all sorts of nasty things that... hrm... hey wait a minute! They are ALREADY a problem! This wouldn't create the problem and I can't imagine it adding too much more fuel to it.

      Lastly, the government has time and time again demonstrated they have little to no understanding of technology. Do we really want them making sweeping decisions regarding software?

      No, I don't want to see more government oversight. But I would like to see more consumer protection. Do you think the consumer doesn't need it? If not, then why not? If so, then how would you propose that consumers get that protection?

      Look. There was a time before the FDA and various medical boards. To live life without them protecting the recipients would be rather unimaginable wouldn't it? We don't want people driving on the streets without all manner of regulation... driver's licenses, safety inspections, liability insurance. We require that many of the products and services we use regularly have regulation to guarantee minimal quality standards and some of them aren't as 'critical' as software. We don't allow EULAs and disclaimers to get in our way either. There's a cancer warning on every label for cigarettes. Doesn't stop people and governments from going after the tobacco industry. Why should software have such an exemption? Because it's PRESENTLY unregulated as medical/dental practice once was? Because it's an unimaginable mess to clean up?

      There are ways for government to be involved without being complete morons. How about people with PhDs in software development sitting on the board of regulation
      • If you make your own tools at work, you are the exception, not the rule. Most consumers aren't developers, they are consumers.

        You also suggest there is more than one burger joint, and that consumers purchase software based on the quality of said software.

        So why then was AOL number 1?

        In most categories, I could argue that the leading product is often an inferior product. Given that most CIOs can't differentiate between quality software and well-known software, I don't trust the government to step in and st
    • We joke about how official releases have made us all beta testers, but that doesn't seem to stop us from purchasing software.

      Who is this 'we' you speak of? Personally, I don't purchase software, I emerge or apt-get it. As for the beta state of commercial software, it makes me cry rather than laugh, seeing people close to me waste money, time and nerves on Microsoft crap.

      • As a fellow Gentoo user, I can relate.

        However, you do not represent the masses. If I had to hazard a guess, I'd say the bulk of software purchases come in the corporate world. People at home love to pirate. And most major businesses prefer to go with traditional retail software over a custom-made-Gentoo-build.

        Where is the official support for Gentoo? Can you call a 1-800 number? Are the end users knowledgeable and familiar with it in the way they are with Windows? How standard is it? How consistent is
  • by mustafap ( 452510 ) on Monday May 29, 2006 @05:25AM (#15423754) Homepage
    They are the company who have the worst user interface tools on the planet.

    The GUIs would have sucked in the 1980s.

    Every SQL statement was designed by a different person, with a different syntax.

    If she expects us to assume she is an authority on the subject, she should clean up her own rubbish first.
  • as they have "the perfect temperament to be hackers--technically skilled, slightly disrespectful of authority, and just a touch of criminal behavior."
    Rule Britannia!
    Britannia's pwnz0rs r0x.
    British h4xx0r5 r so l33t they
    pwn3d t3h b0x.
  • by Masa ( 74401 ) on Monday May 29, 2006 @05:33AM (#15423769) Journal
    Well, patches are not nice, and of course it would be better for customers if the product were perfect from the start. It's true that most software products are buggier than, for example, fifteen years ago. On the other hand, there are several reasons for the (lack of) quality of modern computer software: tight deadlines, investors, and competition, to name a few. And of course it's always possible to cast some blame on the software engineer.

    However...

    I don't like that she is using age-old classics for fear mongering. "National security" and the bridge analogy to be specific.

    Bugs themselves are rarely the problem when we are talking about "national security". For some odd reason it seems that people have forgotten the importance of physical separation between the public network and sensitive information / infrastructure. It's stupid to blame the tools if the user is an idiot (and in this case I mean those "chief security officers" who design these braindead infrastructures for corporate networks).

    I don't understand how anyone in their right mind could suggest any kind of regulatory system for software quality. It's practically impossible to control, and what if there is some sort of accident caused by a regulated and "certified" product? Is this certification (or whatever) a free pass for the software provider? It would turn out to be the ultimate disclaimer for software companies. Or - the other way around - the ultimate responsibility, which would lead to the point where there are no more software engineers because there is too much personal responsibility involved.

    Besides, in my opinion, Davidson insults British people pretty badly when she describes them as "slightly disrespectful of authority, and just a touch of criminal behaviour." I think that's not a very professional comment.

    Anyway, this is what I think about this whole article.
  • The whole bridge::software analogy is:

    1. A straw man argument, and a poor one at that. It's not uncommon for civil engineering projects to require "patches": http://en.wikipedia.org/wiki/Big_dig#Reports_of_substandard_work_and_criminal_misconduct [wikipedia.org]

    2. An obviously bad analogy; I'm sure the specifics will be discussed here ad infinitum.
  • by suv4x4 ( 956391 ) on Monday May 29, 2006 @05:48AM (#15423804)
    That's a typical manipulation move: announce a problem we all know exists, ask "why does no solution X exist that solves it?", and then push for solution X to happen.

    Somewhere in between the hype surrounding the issue, no one stops to ask themselves, "wait, this solution doesn't even prevent this problem".

    Liability is one thing; regulation before manufacturing is another. Given how much success government institutions have had with software patents, how could we trust our software's security to them?

    First thing they'll do is "regulate" the existence of a backdoor for the police/CIA/FBI into everything that resembles software technology with access control.
    • How about we make Oracle a trade? If they throw out the end-user licenses and become liable for the flaws in their software and the damage those flaws cause, we'll protect them from having security flaws published without Oracle being told at least 3 months before publication.
  • For software, that is. Building codes and electrical codes have worked pretty well.

    If we could measure software quality well enough to regulate it, how much need would there be for regulation? Companies would just specify in their purchase orders "must have 685 mill-pf of quality" or "not less than 3 kilo-Sendmails of security" and the market would sort things out in its usual inconsistent but unbeatable way.

    I'm nervous about government regulation partly from spending too much time studying the HIPAA regula
  • Yes, OpenBSD still has a few security patches each version, but their methodology is far better than that of many other software developers.
  • British "Hackers" (Score:3, Insightful)

    by smoker2 ( 750216 ) on Monday May 29, 2006 @06:21AM (#15423866) Homepage Journal
    Speaking as a Briton -

    the British are particularly good at hacking as they have "the perfect temperament to be hackers--technically skilled, slightly disrespectful of authority, and just a touch of criminal behavior."
    should read -
    the British are particularly good at hacking as they have "the perfect temperament to be hackers--technically skilled, disrespectful of authority, and are not averse to criminal behavior."
    BTW, I see the use of the word "hacking" as a good thing, versus "cracking". Also, "criminal behaviour" is an ever-changing variable, defined by clueless bureaucrats. I break the law every time I play a DVD or an MP3 on my Linux system.

    The ideal system (for the government) is one where we are all criminals.

  • by freedom_india ( 780002 ) on Monday May 29, 2006 @06:29AM (#15423882) Homepage Journal
    Coming from a company that for years has perfected the art of vaporware, and charges the cost of a battleship to build a kid's 2-ft long boat.

    She forgot to say that if Oracle were to adopt truthfulness in adverts, avoid vaporware, and stop charging the cost of a FULL salon to set up cardboard implants, the industry would be $159 billion richer and we would all have witnessed the Second Coming with the money...

    Sheesh, what a rant from a company that is responsible for the vaporware strategy...

  • Until they can invent a human that doesn't make mistakes, what Oracle is aiming for is an unrealistic goal. People screw up, so we patch. Mistakes happen, and we patch. Software evolves, and we patch. When a software company has an install base of several zillion, and can't get their act together in terms of reliability, or don't want to, then you have an issue that needs resolving. Patching because of mistakes is part of being human, patching due to apathy and blatant disregard for security is an entir
  • If only... (Score:2, Insightful)

    by Jimboscott ( 845567 )
    "What if civil engineers built bridges the way developers write code?" she asked. If only all IT projects where well defined as briges plans...
  • This is simple... (Score:2, Interesting)

    by JC Lately ( 949612 )
    The market should determine the value of a quality product. The only regulation that should change is the ability of software vendors to avoid accountability with complex EULAs. If all the businesses in the world sued Microsoft for the effort of continually patching their software, it might just get them to do something. Of course, the cost of the software would rise too, at least in the short term. Secure and bug-free code doesn't need to cost significantly more provided you have the correct process and
  • by ajv ( 4061 ) on Monday May 29, 2006 @06:48AM (#15423926) Homepage
    I write the OWASP Guide, which is used by basically everybody as the standard for web application security, and is the official standard of Visa, many governments, and so on.

    She talks to CSOs, who are mostly bean counters. They see money going down the drain from patching. I agree with them - patching is inefficient and wasteful. But it's necessary, because Oracle builds crap: buggy and insecure software. They are easily five-plus years behind Microsoft in churning out safer software. Buffer overflows, high-privilege accounts, public access to highly privileged library functions - all this stuff is easily 10-15 years old and should not be in Oracle 10g, but it is.
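
    As one concrete instance of that decades-old bug class: injection is avoided by a single discipline - bind values instead of concatenating them into SQL. A minimal sketch using Python's sqlite3 so it is self-contained (the table and the hostile input are hypothetical; the same bind-variable idiom applies to Oracle client libraries):

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
      conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

      name = "alice' OR '1'='1"  # hostile input

      # Wrong: concatenation lets the input rewrite the query:
      #   conn.execute("SELECT role FROM users WHERE name = '" + name + "'")
      # Right: the driver binds the value; input stays data, never SQL.
      cursor = conn.execute("SELECT role FROM users WHERE name = ?", (name,))
      print(cursor.fetchall())   # [] -- the injection attempt matches nothing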

    Oracle has time and time again outright refused to get on board with a secure coding program, often fixing just the little bug which gained root privileges, exposed all your data, or destroyed the database outright. Instead, they should be searching for all those types of bugs and fixing them in one hit. Davidson has more than enough time to address the root cause

    She is holding software up to the standards of bridges. Bridges have tolerances and over-design built into them. Most software does not. Often, to make artificial deadlines set by beancounters, software is shipped with bugs. Often the bugs are not found for some time, and it takes researchers to go find them. If it's not researchers, it's the commercial 0day crowd. This is where Davidson shows she is an amateur and must be replaced. It's best for HER customers to be secure, and that means shipping secure software. Shipping insecure software does not prevent the 0day houses from creating exploits. Oracle's reputation as a solid data partner is worthless if we lose all our data to an attacker because Oracle suppressed the news from us rather than fixing the problem.

    It is simply unachievable to build bug-free software for a reasonable cost. What is required is care, developer training in secure software techniques, and defense in depth. That is our tolerance and over-design, and Oracle is sadly lacking. She has had five years to get their developers onto a program of building this into their platforms, and she's failed miserably. I will be interested to hear what standards they use, and whether it's mine (the OWASP Guide), their own based upon ours, or Microsoft's.

    I've called for her to step down more than once. When she attacked the good name of David Litchfield and NGS Software, I was outraged - this was like shooting the messenger who told us that their "unbreakable" software was pure crap, which we already knew, but who showed through his unstinting efforts that it is truly appalling and not fit for purpose.

    If this latest "push" for too little too late does not work out, she should be sacked by the Oracle board for the good of all Oracle shareholders and customers. She's had more than enough time to make a positive change, and should make way for someone who really understands security.
  • that couldn't be done by setting up standards and best practices within the industry, and then testing software and source against those metrics.

    It seems like there could be an organization set up to certify software as meeting some security standards. Some people might think this would be a problem for open source, but they forget that there is a lot of money behind open source. I'm sure IBM and others would help foot the bill for getting Linux certified.

    The real problem with certification or government
  • The Real Enemies of Software Reliability [rebelscience.org]

    Guess what? Oracle is on the list. ahahaha...

    Oracle's Chief Security Officer Mary Ann Davidson should be next on the list, IMO, for once more comparing software engineering to bridge and building engineering.
  • by SmallFurryCreature ( 593017 ) on Monday May 29, 2006 @08:32AM (#15424147) Journal
    You mean like that double-decker highway that collapsed during an LA earthquake? Or maybe that one that fell apart in a stiff wind?

    Ah, but most bridges don't fall apart that easily. Well, no: most bridges are based on millennia-old technology. The more advanced designs are built to very fine tolerances.

    Take that "new" superhigh bridge in france. It cannot support the weight of an ocean liner. Would collapse if you blew up one of the pillars and a nuclear strike within a mile would cause it to fall apart. Hell even a simple typhoon would do it.

    Ah, but none of those things is likely to happen, so the bridge wasn't designed for them.

    That is the big difference between software and hardware. Even the simple matter of user-supplied data is different. In software you need to check, and check again, every bit of data to make sure the user hasn't supplied the wrong kind of data. Hasn't the user put a gigabyte of data in a bool field?

    In the real world this is kind of easier to check. I think you would notice if a truck, instead of being loaded with 10 tons, was loaded with 10,000 tons. A clue might be the way its axles are buried in the asphalt.

    So the bridge designer only has to design for the entire road deck being filled with trucks loaded with lead, and that is it. He can work with real-world limits. The French bridge was really tested like this. It withstood the test and is in theory designed to withstand 2x the load. That isn't much of a tolerance, but in the real world you can easily discount such a heavy load ever being put on the system. Someone driving up with an ocean liner on his trailer would draw attention.

    Not so with software. I can put anything I want in an input form, and the software had better be designed for it. I am not constrained by real-world limits.

    That is what makes software engineering so difficult: you need to account for every possibility. If you checked a piece of data and wrote it to storage, then you need to check it again when you read it back. This would be like a bridge engineer testing the steel, then having to check it every day to see that it hasn't turned into porridge by an act of God.
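
    A minimal sketch of that write-then-reread double check, in Python (the record format and the 0-1000 range are hypothetical; the point is only that one validator runs at both trust boundaries):

      import json

      def check_quantity(value):
          # The single validator, applied at every boundary.
          if not isinstance(value, int) or not 0 <= value <= 1000:
              raise ValueError("quantity out of range: %r" % (value,))
          return value

      def save(path, quantity):
          check_quantity(quantity)                   # validate on the way in
          with open(path, "w") as f:
              json.dump({"quantity": quantity}, f)

      def load(path):
          with open(path) as f:
              record = json.load(f)
          return check_quantity(record["quantity"])  # and again on the way out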

    Oh, and one final note: a lot of software insecurity only shows up under attack. Bridges don't exactly last long under attack. Blowing one up is amazingly easy; any army engineer can do it.

  • ... appears to be bad management.

    Odd, how bad managers do not write a single line of code, yet establish policies and practices virtually guaranteed to provide the fertile ground necessary for bad code to spring forth, and to smother the capability to make secure code under the management "fertilizer" that produces bad code.

    It's pretty easy to see how this motif extends into other aspects of software quality as well.

    How much proprietary software makes use of a tool like Bugzilla to manage bug-tracki
  • It's a way for companies to shave money. When you are a business selling blenders, if your blendmaster 2000 has an issue, you ignore it. Fix it in the next model. If your Ford has an issue, you might recall it, but that's extremely expensive. What about software? Bug? Lots of bugs? Don't worry about it! We'll just patch it in the 1.01 release! Oops, more problems, here comes 1.02!

    Software developers use The Patch as a way to get a product to market before it's ready. Shareholders don't see this, an
  • Someone made a comment about how, if you build a bridge and it falls, you can be held liable, but with software you just attach an EULA that says you shall not be held responsible.

    What is really being said with such EULAs is that the software industry is still using Roman numerals to do alchemy; in other words, they don't know what the fuck they are doing well enough to take greater responsibility. And unfortunately, lack of responsibility has become such an acceptable norm that there is a notable reduction in t
  • To get it right the first time?

    We know that will never happen. I mean, to get it right the first time requires months or even years of beta testing using a very LARGE user base in order to get all the quirks and holes and issues out of the system.

    It is arrogant to assume that ANY group of programmers can get it right the first time while developing software, and that's not to discredit the quality of programming they are offering. Management is largely at fault for why software products fail to work right ou
  • Oracle is the last company that should be complaining about patches. Not only are they slow to address specific security holes, but they are constantly releasing patches! My biggest customer that I consult at spends at least a week every quarter on just the patching of Oracle.
  • A lot of people have made quite a few good points already, so I'll just chime in with one I haven't seen yet: software will never be regulated (at least not in the near future).

    Why? Because despite their comments to the contrary, execs and managers don't want regulation. Why? Because regulation and enforced quality control, as in civil engineering, would wrest control of software development from managers, and place it in the hands of professional, certified engineers that would be entrusted and liable for
