
Oracle Exec Strikes Out At 'Patch' Mentality

Zonk posted more than 8 years ago | from the this-post-to-be-patched-later dept.


An anonymous reader writes "C|Net has an article up discussing comments by Oracle's Chief Security Officer railing against the culture of patching that exists in the software industry." From the article: "Things are so bad in the software business that it has become 'a national security issue,' with regulation of the industry currently on the agenda, she said. 'I did an informal poll recently of chief security officers on the CSO Council, and a lot of them said they really thought the industry should be regulated,' she said, referring to the security think tank."


264 comments


First Patch! (-1, Troll)

Anonymous Coward | more than 8 years ago | (#15423655)

ummm... -well I tried

Of course (5, Insightful)

Anonymous Coward | more than 8 years ago | (#15423660)

Oracle are (rightly or wrongly) worried about competition from Open Source. Regulation of the software industry would be a major benefit to them in this. Anyone who didn't meet the regulators' criteria couldn't compete.

Re:Of course (0)

Anonymous Coward | more than 8 years ago | (#15423740)

Sounds to me like this woman is another art major who slipped through the cracks into IT and was raised to some form of CEO status. Without patches we wouldn't have Apache. Without patches we wouldn't have World of Warcraft. Hell, if it weren't for patches we wouldn't be IT workers.

Get it straight: your "culture of patches" is the same thing as "we need to do an emergency release."

These people make me sick.

Re:Of course (5, Informative)

arivanov (12034) | more than 8 years ago | (#15423773)

No.

Not at all in fact.

Open Source has nothing to do with this and I would suggest that you actually do some research instead of parroting the usual "Open Source will fix all problems" mantra.

Oracle has recently been shown to take up to 5 years to patch glaring security holes. It has reached the point where security researchers like Litchfield, who have had an ongoing relationship with Oracle for 10+ years, no longer want to work with them. Note, we are not talking sc1pt k1dd10tz sitting in their dad's basement here. The people in question consult for banks, governments, and large corporations, and cannot actually recommend a working security policy to them because Oracle cannot get its head out of its arse and patch a security problem for multiple years after it has been reported.

As a result, people who used to work on Oracle problems and report them privately to Oracle have started posting them openly, "0 day" style, or giving Oracle a fixed one-month notice of an impending posting regardless of whether a patch exists.

Obviously Oracle is pissed.

First of all it breaks all of their marketing bollocks about unbreakability and security to bits.

Second, it is threatening their sales to customers in regulated markets where security issues must be addressed within a fixed term after they become known.

This is the reason for them to rattle the "regulation" sabers and moan about a "patch culture". Open Source has nothing to do with it.

Re:Of course (2, Funny)

SnowZero (92219) | more than 8 years ago | (#15423836)

I noticed that you used the Queen's English in writing your post, which means you must be one of those "evil British hackers" mentioned in the TFA.

Remember everyone, the lower the patch frequency a product has, the more secure it must be. Pay no attention to the wookie.

Re:Of course (-1)

Anonymous Coward | more than 8 years ago | (#15424045)

I recommend the following tags for this story: crustybitch oldhag sandyvagina

Re:Of course (5, Insightful)

Anonymous Coward | more than 8 years ago | (#15423857)

Open Source has nothing to do with this and I would suggest that you actually do some research instead of parroting the usual "Open Source will fix all problems" mantra.

I said nothing at all about open source fixing all problems, or fixing any problems for that matter.

If you've ever worked in an industry that's gone from being unregulated to being regulated, you'll know that one of the first things that happens is that the number of participants decreases as all those that can't afford the overhead of the regulations and of maintaining a compliance department (not the same as quality assurance; experts in the interpretation and application of the regulations) leave the field. One of the next things that happens is that the number of new suppliers entering the market plummets.

There are many disadvantages to being regulated - additional costs, and potential damage to your reputation if you conflict with the regulator - but the big advantage is a barrier to competitors entering your market.

That does NOT mean that regulation is a bad thing - that depends on the specifics. However, if a supplier is arguing for regulation of their market then the chances are that they're doing so to cut down the competition. It's unlikely that they're asking for it because they can't control their own engineers and are hoping a regulator will do better.

If you've observed Oracle at all you'll have noticed that they are worried by competition from open source. It is likely that that's their target in this, though it could be other smaller competitors.

Re:Of course (1)

cabazorro (601004) | more than 8 years ago | (#15423950)

If by regulation they mean making a law against disclosing security holes in regulated software, it follows the security-by-obscurity dogma. That might not have anything to do with Open Source as a cause, yet by design it would make it impossible for Open Source to fulfil this "security by obscurity" requirement.

As for the "patch culture" headline: the term "patch" is commonly associated with Open Source and the inherent ability of any piece of software to deviate from the original distribution, handing control/security over from the creator to the user. From a contractual point of view, that removes any security liability from the creator. And that is exactly what vendors like Oracle charge dearly for: the assumed responsibility/liability they want to uphold... and they charge accordingly.

Regulation is the shotgun approach. The solution should be for Oracle to sell their security AS-IS, and to stop touting, and charging for, solutions that security-wise may or may not be better than their counterparts.

The fellowship - ring of corruption (1, Insightful)

Anonymous Coward | more than 8 years ago | (#15423663)

In other words, you should make your choice not on merit, but from a short list of products from an exclusive club. There is a ring of corruption to this.

Re:The fellowship - ring of corruption (1)

PinkyDead (862370) | more than 8 years ago | (#15423953)

And an exclusive club that hasn't exactly excelled at providing secure applications in the past...

Patches (1)

yobjob (942868) | more than 8 years ago | (#15423664)

Maybe if EA didn't run its coders into the ground, they wouldn't need to rely on patches...

Re:Patches (0)

Anonymous Coward | more than 8 years ago | (#15423670)

Maybe if EA didn't run its coders into the ground

Yeah, it's pretty tough work changing all the strings from "2005" to "2006".

Well, obviously.... (4, Insightful)

Mikachu (972457) | more than 8 years ago | (#15423675)

Of course the "patch, patch, patch" business plan is bad for consumers. But in truth, most software companies don't care about consumers. They care about making money. As it happens, most people really don't care enough about the subject to make the companies change.

One of the examples in the article asks, "What if civil engineers built bridges the way developers write code?" and answers, "What would happen is that you would get the blue bridge of death appearing on your highway in the morning." The difference here, however, is that civil engineers couldn't get away with making rickety bridges: there would be public outcry if one broke while people were on it. In the software world, however, the consumers scream, the company fixes it with a patch, and that shuts them up. It saves companies a lot of money and time in testing.

Re:Well, obviously.... (3, Insightful)

pe1chl (90186) | more than 8 years ago | (#15423688)

Another difference is that when you build a bridge and it collapses, you will be held liable for it.
When you build software, you just attach a EULA that says "I shall not be held liable", and that's it.

Once software makers, especially the large commercial companies, find themselves in the same boat as other industries and have to pay compensation when bad stuff is released, they will certainly step up quality control to the next level. Because it saves them money.

Re:Well, obviously.... (0)

Cyclops (1852) | more than 8 years ago | (#15423724)

Maybe if you're talking about software for such a critical thing as keeping an airplane safe, or controlling nuclear reactors and such. But what he says is lunacy for software in general.

It's just about the same as demanding guarantees that you will be satisfied by a book. It's idiotic.

Re:Well, obviously.... (1)

maxwell demon (590494) | more than 8 years ago | (#15423741)

What if a book tells you how to build your own house, but it turns out that houses built following those instructions tend to collapse? Say I build such a house by exactly following the instructions, and it collapses and I get hurt, would the book author be liable?

Re:Well, obviously.... (1)

Cyclops (1852) | more than 8 years ago | (#15423840)

Of course the book author should never be liable. You can't learn to build houses from books; you must complete the proper academic education _and_ pass.

Re:Well, obviously.... (1)

Mindcry (596198) | more than 8 years ago | (#15423746)

And how do you guarantee that other miscellaneous software on the system won't interact with it in a bad way? There are too many corner cases... such liability would only work with closed boxes built from the ground up (including a custom OS, which would be nice and expensive to build, secure, and guarantee).

Besides, if regulation becomes too heavy here, why not just relocate to India and save some money and hassle? Liability insurance could also work, if software damages were actually measurable beyond hours of repair or insane overestimations pulled out of thin air.

Re:Well, obviously.... (1)

Anonymous Coward | more than 8 years ago | (#15423767)

Yet another difference, however, is that if you build a bridge, there are tried and true ways of making sure it doesn't fail. There's still the occasional mishap (Tacoma Narrows) when boundaries are pushed, but all in all the engineers know exactly what they need to do to make a bridge safe. Not so in the software world. There is no tried and true way of making safe software. At least not one that customers would pay for. Remember that software can only be as reliable as the hardware it's running on. Simulation software on Pentium processors, accounting on faulty RAM, cheap USB cables to the external hard disk: the list of problems that look like software failures but are really hardware problems is long. In critical areas (such as flight control systems), software is about as reliable as classical engineering. But those are tiny and expensive programs compared to desktop systems. They run on expensive redundant hardware that's years behind in performance because verification takes time.

Now, I'm not saying it can't be done, but it's going to cost a lot of money and people will have to make do with software that is years behind in functionality and performance compared to what we are used to. And they're going to pay the premium for that honor. Quite frankly, I don't think it's worth it. Critical software is already written to higher standards and the rest just doesn't need to be perfect. A better way of dealing with the problem would be to anticipate mistakes and build the system in a way that makes failures less harmful. You know that computers which are connected to the internet will be hacked sooner or later. Don't put mission critical private data on them then. Layer security, monitor your systems, etc. I'd say we learn from nature: When was the last time you saw a perfect animal, a perfect flower, a perfect human being? Nature deals with imperfections, it doesn't avoid them. It's the more cost effective way.

But it's different things (4, Insightful)

Sycraft-fu (314770) | more than 8 years ago | (#15423774)

The difference is that software is expected to be cheap, released fast, and to run on all kinds of platforms. Sorry, that leads to errors. You can have software that never needs patching, you just have to take some concessions:

1) Development cost will be a lot higher. You are going to have to spend time doing some serious regression testing, basically testing every possible combination of states that can occur. It may seem pointless, but it's got to be done to guarantee real reliability.

2) Development time will be a lot longer. Again, more time on the testing. None of this "Oh look, there's a new graphics card out, let's get something to support it in a month." Be ready to spend years sometimes.

3) Hardware will be restricted. You are not going to be running this on any random hardware where something might be different and unexpected. You will run it only on hardware it's been extensively tested and certified for. You want new hardware? You take the time and money to retest everything.

4) Other software will be limited. Only apps fully tested with your app can run on the same system. Otherwise, there could be unexpected interactions. The system as a whole has to be tested and certified to work.

5) Slower performance. To ensure reliability, things need to be checked every step of the way. Slows things down.

If you aren't willing to accept that, then don't bitch and demand rock-solid systems. I mean, such things DO exist. Take phone switches, for example. These things don't crash, ever. They just work. Great, but they only do one thing, you use only certified hardware, they've had like one major upgrade (5ESS to 7R/E) in the last couple of decades, and they cost millions. You can do the same basic type of stuff (on a small scale) with OSS PBX software and a desktop, but don't expect the same level of reliability.

The thing is, if your hypothetical bridge were software (and a bridge is quite simple compared to software), people would expect to be able to put the same design anywhere and have it work, drive tanks over it and not have it collapse, have terrorists explode bombs under it and have it stay up, and so on - and have all that done on 1/10th of the normal budget.

Until we are willing to settle for some major compromises, we need to be prepared to accept patches as a fact of life. I mean hell, just settling on a defined hardware/software set would do a lot. Notice how infrequently you see major faults in console games. It happens, but not as often. Why? Because the hardware platform is known, and yours is the only code running. That cuts down on problems immensely. However, take the same console code and port it to PC, and you start having unforeseen problems with the millions of configurations out there.

Me? I'll deal with some patches in return for having the software I want, on the hardware I want, in the way I want, for a price I can afford.

Re:But it's different things (1)

Djatha (848102) | more than 8 years ago | (#15423838)

The difference is that software is expected to be cheap, released fast, and to run on all kinds of platforms. Sorry, that leads to errors. You can have software that never needs patching, you just have to take some concessions:

1) Development cost will be a lot higher. You are going to have to spend time doing some serious regression testing, basically testing every possible combination of states that can occur. It may seem pointless, but it's got to be done to guarantee real reliability.

2) Development time will be a lot longer. Again, more time on the testing. None of this "Oh look, there's a new graphics card out, let's get something to support it in a month." Be ready to spend years sometimes.

I do not completely agree with you. Testing is not really a route to safety the way it is in, for example, bridge construction. A bridge is not built, then tested and adapted to fit some requirements; it is designed to meet those requirements, and there are theories, techniques and tools to ensure that the requirements are met.

Of course, if there is some problem with material used, the bridge can collapse, break down, or wear out over time, and in that sense bridges, too, are tested and patched as is software to ensure safety and the like during the lifetime of the bridge.

Nevertheless, we trust that the original bridge is designed correctly and behaves accordingly. In software, we can design systems, but these designs are abstractions of the final systems, and the relation between a final system and its design is not the same as in construction, where simple theories (physics, materials, etc.) and fundamental facts about the environment (site, subsoil, width, climate, etc.) can be and are incorporated into the (development of the) design.

So, what I am trying to say is that computer science needs some fundamental theories, techniques and tools applicable in real-life situations before software can be trusted by design. Until then, software engineering is just a craft, where testing, patching and the like are needed to keep the system going.

I would advise Oracle to invest in fundamental computer science research instead of talking about bad practices which are more or less unavoidable at this stage of computer science.

Nope, sorry (1, Flamebait)

hummassa (157160) | more than 8 years ago | (#15423990)

The "bridge" equivalent of consumers' expectations for software would be: a bridge made out of cardboard, with a lot of lights, a coffee machine every 100 yards, seven entrances and eighteen exits - and ways to go from each to each - that can be rebuilt in 15 minutes to 3 hours if it falls, and nobody will mind if it falls every other day. A plain old bridge is 1000x-100000x more expensive to build, would take a year to get ready, and will probably see maintenance only ten to twenty years after it's ready... It's possible to build it, but no one wants it, so it's not _viable_ to build it.

Anyway, the best software design tools are those that are integrated deeply with the coding phase... but no one wants to use those (say, Lisp).

Re:But it's different things (2, Informative)

FireFury03 (653718) | more than 8 years ago | (#15423951)

Take the phone switches for example. These things don't crash, ever. They just work.

Sorry, that's just not true. Phone switches _do_ crash - it's just that the telcos have learnt to build networks with a hell of a lot of redundancy. If a phone switch goes down then the worst that'll happen is you'll lose the calls in progress on that switch (actually, the switch may be able to recover the calls if it resets quickly enough - just because the signalling goes down for a few seconds doesn't necessarily mean the voice circuits have also failed). New calls will be routed via a redundant switch.
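The failover behaviour described above can be sketched as a tiny routing loop. This is only an illustration: the switch names and the `place_call` callable are invented, not part of any real telco API.

```python
def route_call(call_id, switches, place_call):
    """Try each switch in priority order; return the one that accepted the call.

    `switches` is an ordered list of switch identifiers and `place_call` is a
    callable that raises ConnectionError on failure (both hypothetical,
    purely for illustration).
    """
    for switch in switches:
        try:
            place_call(switch, call_id)
            return switch
        except ConnectionError:
            continue  # this switch is down: fall through to the redundant one
    raise RuntimeError("no switch available for call %s" % call_id)
```

If the primary switch refuses the call, the loop silently falls through to the redundant switch - which is roughly the property the telcos pay all that money for.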

Of course, building any kind of highly redundant network is very costly, so people avoid doing it if they can help it.

Also, phone systems are only designed to deal with a fairly specific set of events; they don't need to worry about security holes, etc., because everyone on the network is fairly trusted. I'm sure this will change very quickly in the near future with the convergence of the internet and the PSTN.

Probably the closest you'll get to completely reliable are the fly-by-wire systems used in planes, but even there they put a lot of effort into redundancy, with multiple computers arranged in voting systems so faults can be spotted and corrected, or a computer taken out of service completely, as early as possible. This is probably also the scenario most similar to the bridge analogy - if it goes badly wrong, people die.

Re:But it's different things (4, Informative)

MathFox (686808) | more than 8 years ago | (#15423983)

Too much time is spent in "integration" and testing because management refuses to plan time for high-level design. One can create better-quality software in about the same amount of time by using a proper development process. Some hints:
  • Do a proper high-level design.
  • Review your design with all stakeholders, including QA/testing and marketing.
  • Plan time to fix issues in all steps of the project.
  • Prototypes are to throw away, don't build your product on top of them.
  • Require specifications for all parts of the application.
  • Peer review all specifications.
  • Peer review all code.
  • Perform unit and module tests on all parts of the code.
  • Fix bugs as early as possible.
Development will cost more and take longer
It will take more time before a programmer starts coding, but you will need less time to find and fix bugs. A clean design leads to cleaner module interfaces, which makes tracing a bug easier. Module testing means that a lot of bugs are found early and are automatically traced to the offending module, which means quick fixing.
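As an illustration of the module-testing point, here is a minimal sketch. The `parse_price` function and its cent-padding edge case are invented for the example; run it with `python -m unittest`.

```python
import unittest

def parse_price(text):
    """Convert a price string like '12.50' to an integer number of cents."""
    units, _, cents = text.partition(".")
    # Pad so '12.5' means 50 cents, not 5 -- exactly the kind of edge case
    # a module test pins down long before integration.
    return int(units) * 100 + int(cents.ljust(2, "0"))

class ParsePriceTest(unittest.TestCase):
    def test_whole_units(self):
        self.assertEqual(parse_price("3"), 300)

    def test_units_and_cents(self):
        self.assertEqual(parse_price("12.50"), 1250)

    def test_trailing_zero_omitted(self):
        # Fails loudly in *this* module if the padding is removed, instead
        # of surfacing as a wrong invoice total at integration time.
        self.assertEqual(parse_price("12.5"), 1250)
```

A failure here is automatically traced to `parse_price` itself, which is the "quick fixing" the parent comment describes.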

Restrictions on hardware and software
For high reliability, yes. It's hard to write software that can replace blown fuses. I think it is ridiculous that an Internet-connected Windows system "automagically" degrades to a near-useless condition, so Windows should be thrown out.
It should be possible to run a decent selection of software on a server, with the user selecting his mixture, taking his desired level of reliability into account. An operating system should isolate processes well enough that a single bug doesn't crash the machine.

Slower performance.
Needless consistency checks slow things down (and improper checks may even cause instability). With a proper design you know what to check where, so you only check once. In my experience good quality software performs better than bad software.

Take the phone switches for example. These things don't crash, ever. They just work. [...] they've had like one major upgrade (5ESS to 7R/E) in the last couple decades
Sorry, I had to pick myself up from the floor - I fell off my chair laughing. I did work for a telco and crashed a few switches myself, the Lucent stuff you mention. Ericsson makes more reliable systems (but they have a different design philosophy). And software updates for phone switches appear regularly.

Re:But it's different things (1)

FooBarWidget (556006) | more than 8 years ago | (#15423996)

With a proper design you know what to check where, so you only check once.


That isn't going to weed out all bugs. What if the programmer is tired, makes a mistake, and forgets to check for a precondition in some places? Boom. And that kind of mistake happens a lot. If the code doesn't crash, that can be even worse, as it may lead to corruption of the internal state.
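A minimal sketch of that failure mode, with an invented account class purely for illustration: the path with the forgotten precondition doesn't crash, it silently corrupts state, while the checked path fails loudly.

```python
class Account:
    def __init__(self, balance):
        self.balance = balance  # integer cents, must never go negative

    def withdraw_unchecked(self, amount):
        # Forgotten precondition: nothing stops amount > balance, so the
        # balance silently goes negative -- corrupted internal state that
        # may only surface much later, far from the actual bug.
        self.balance -= amount

    def withdraw_checked(self, amount):
        # The same operation with its precondition enforced: invalid
        # requests fail immediately instead of corrupting state.
        if amount < 0 or amount > self.balance:
            raise ValueError("invalid withdrawal: %r" % amount)
        self.balance -= amount
```

The unchecked variant leaves an account at a negative balance with no error at all, which is exactly the "even worse than a crash" case the comment describes.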

Re:Well, obviously.... (1)

NiroZ (964916) | more than 8 years ago | (#15423718)

Well, if the bridge started to crack, they would fix it, wouldn't they? The computer industry is very, very young compared to the bridge-building industry. Who's to say there wasn't a time when bridges were in a constant state of being patched up? The reason people tend to release software early and use their customers as beta testers is that they can fix it faster, more easily and more cheaply than you could fix a problem in a bridge. Also, there are not nearly as many souls whose lives depend on bridges (well, the major ones) as on computers, where the competition is intense and the user base is smaller. It is also (arguably) a less important service.

Re:Well, obviously.... (1, Interesting)

Anonymous Coward | more than 8 years ago | (#15423780)

You know, I'm starting to get sick of the software = bridges analogy. The fact is, software is not bridges, it's not even close to being bridges. If you're building a bridge, you're asking people to trust their lives to your creation. If you're writing software, you're doing no such thing (disclaimer: yes, some software does get used in life or death situations, but the vast majority is not, so don't pretend it is). If anything, software is closer to something like, say, landscaping, than building bridges. And in fact if you're in the landscaping business mistakes are perfectly tolerable. "Oh, you say that bush died because I planted it wrong? Gee, I'm sorry, let me come out and plant another one. How does two weeks from today work for you?" Do you demand 100% perfection from someone painting your house or doing your yard? Is it the end of the world if they screw up, or do you just have them fix their mistake? Now tell me, which of these approaches is more fitting to apply to software?

Bridges and software are not the same (1)

L.Bob.Rife (844620) | more than 8 years ago | (#15423796)

First, bridges are quite a bit less complicated than software. Second, there are numerous examples of bridges that have had structural flaws. Just because they don't turn blue with obvious error codes stamped on them does not mean they are perfect. Bridges must undergo periodic repair, or they will fall apart.

Bridges solve one problem: Supporting X weight across Y distance, taking into account building materials and terrain.

Software is usually far more complex in what it tries to accomplish. It's not merely a matter of bug-free software being "more expensive" to make. It's so incredibly difficult, and so labor-intensive, that it's actually cost-prohibitive - meaning that for nearly all programs made, the cost of making them bug-free is far more than a company could hope to recoup.

Would you like Microsoft to make a bug-free OS but then sell it for $10,000 per computer, to make up for all the production costs?

Of Course, Bridges Are Easy (2, Interesting)

slarrg (931336) | more than 8 years ago | (#15423900)

As a software developer, I lie awake at night dreaming of only having to solve a problem as simple as a bridge. It has only one use case: vehicles of a known weight with a known wheel surface traveling in predetermined paths at a predetermined rate of speed. Also, if you dig down deep enough into the Earth, there is always something solid to anchor the bridge to. And bridge designers have millions of existing examples which can be studied and reused.

In software, half the things people will do with it were unknown while it was being designed. It's placed on top of existing code (operating systems, existing architectures, outmoded designs) which decreases the stability of your own applications. It runs on systems with equipment wildly different from any available test environment, with drivers written by corporate hacks that decrease your application's performance. Then users run the application alongside many other applications which can interfere in numerous ways while sucking up the resources (memory, hard-drive space, etc.) your application needs. And that's not even mentioning malicious attacks by those who only wish to wreak havoc on the systems. Then, if any of the myriad things running on the computer fails, everyone starts screaming that the developers are the problem.

The problem is that people expect the software to perform absolutely flawlessly while doing things the developer never intended, on a wide variety of equipment that cannot be tested on or controlled by the developer. It's the world of continuous progress. No one changes the use cases of bridges after they are designed. No one ever just tacks a few more lanes onto a bridge or decides to make the bridge into an airport runway after it was built. When was the last time someone re-commissioned a pedestrian bridge for railway traffic, or built an additional level on a bridge for a shopping mall, without significant feasibility studies?

And yet if bridges were scrutinized the same way as software, people would be in an uproar about all the deaths that are only possible because of the bridges: people jump off of them, cars crash over the guard rails, tornadoes and hurricanes wipe them out, and if they are not maintained properly they eventually fall to the ground under their own weight. Books could be filled with the death stories of people killed by bridges. Everyone sees how silly it is to blame a bridge designer when people are not using or maintaining the bridge in the way intended.

This is not to say that there is not badly designed software out there or that much of it couldn't be done better. However, people need to understand that to have completely bullet-proof software would require studying all possible use cases, locking all features and hardware, then designing a system that will perform only those features and nothing else ad infinitum. Of course, that's exactly what a group of mindless, uncreative government regulators will do. I'd rather have innovation and patches and the largest number of technical resources and methodologies available for the problem.

The core problem is that solutions are being locked up by patents and business methodologies rather than allowing all the code to be shared and reused, letting everyone benefit from new applications of previous solutions. I don't really expect Oracle to agree, since they make a tremendous amount of money from closed code and patents and would really love to kill all new entry into the market. Of course, they don't really believe in making code that works without patching either, or they would no longer be patching their own supposedly well-designed and well-executed flagship product. It's just rhetoric and business as usual.

bridges are a dumb comparison (1)

cheekyboy (598084) | more than 8 years ago | (#15424132)

Bridges are simple, and their uses don't increase.

It's like one 1000-transistor circuit or one 200-line function - that's it.

yeah... (3, Informative)

narkotix (576944) | more than 8 years ago | (#15423676)

this [techtarget.com] explains it all... bunch of slackasses!

One problem (1)

Loconut1389 (455297) | more than 8 years ago | (#15423680)

One thing I see all the time is code that doesn't matter being put under total scrutiny. So what if there's an exploit in the GIMP? If your machine is properly firewalled (for a regular home user), and you're the only one using it, what does it really matter?

Hunting down these things is nice, but not necessary in a lot of cases.

Re:One problem (2, Insightful)

paskie (539112) | more than 8 years ago | (#15423707)

Someone sends you an image and tricks you into opening it in the GIMP (social engineering of that kind is not really very hard to do). Then, depending on the nature of the bug, he can install a backdoor calling out to him and asking for further commands, or whatever.

Re:One problem (2, Interesting)

Loconut1389 (455297) | more than 8 years ago | (#15423709)

It's worth noting that I'm aware an exploit in the GIMP in a corporate environment that allowed an employee to gain root on the machine may or may not matter, depending on the setup. If well administered, gaining root would at most allow the user to set up a server or install something. At worst, someone has set up ssh keys to get into other places, and you've already given out the keys to the kingdom and left them behind unmonitored 1/8" plexiglass in the lobby.

Anyway, applications that do not listen on a port and are mostly basic user applications probably don't matter in the scheme of things.

Re:One problem (3, Insightful)

fatphil (181876) | more than 8 years ago | (#15423719)

If you've ever forwarded an image file to a friend who might forward it to other people, then you are a potential vector for malware. Sure it may look like a picture of a carrot that looks like Tom Hanks, but if it causes a buffer overrun that installs a rootkit, and one of the friends-of-a-friend wants to 'photoshop' out the logo in the corner, then someone's getting as 0wned as if they clicked "yes" after downloading an executable.

The moment you say that security doesn't matter in one place, that becomes the ideal place for attacks to be focussed.

FatPhil
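The buffer-overrun scenario described above boils down to a parser trusting a length field taken from the file itself. Here is a minimal sketch in Python (the chunk format and the `parse_chunk` helper are invented for illustration; real image-loader bugs of this class occur in C, where the unchecked copy smashes a fixed-size buffer):

```python
import struct

def parse_chunk(data: bytes) -> bytes:
    """Parse a hypothetical image chunk: 4-byte big-endian length, then payload."""
    (declared_len,) = struct.unpack(">I", data[:4])
    payload = data[4:4 + declared_len]
    # A vulnerable parser would copy declared_len bytes into a fixed-size buffer
    # without this check; a crafted file simply sets the field larger than the data.
    if declared_len != len(payload):
        raise ValueError("declared length exceeds available data")
    return payload

print(parse_chunk(struct.pack(">I", 3) + b"abc"))
```

The vulnerable variant simply omits the check and copies `declared_len` bytes regardless; in a memory-unsafe language, that overwrite is what lets a crafted "picture of a carrot" end up executing code.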

Re:One problem (0)

Anonymous Coward | more than 8 years ago | (#15423869)

"If your machine is properly firewalled in ..."

To take the "if engineers built bridges the way companies build software" analogy: what does it matter if those bridges fall apart? Just have a number of safety measures so that the people land softly, and there is no problem.

Would you find that acceptable ?

Well, neither do I find your "build fences around default-buggy software" suggestion (as a final solution) acceptable.

As others already mentioned, you would be left with two war-zones (the web, and your own local machine) separated only by a thin wall. How long do you think that wall would actually be able to resist such an onslaught (or be able to distinguish the "good" travellers between the two war-zones from the "bad")?

Wow... is this what the software industry needs? (4, Insightful)

zappepcs (820751) | more than 8 years ago | (#15423695)

Wow, really nice swipe at the British. FTFA:

She claimed that the British are particularly good at hacking as they have "the perfect temperament to be hackers--technically skilled, slightly disrespectful of authority, and just a touch of criminal behavior."

It seems to me that the F/OSS community has shown that fast and effective patches can be applied, and that software we pay for has less than reasonable responses to such threats. I use F/OSS and I'm quite happy with the response they have to software problems. I don't expect it to be of NASA quality, just to be good, and it is. For the amount that you have to pay for Oracle et al, you expect fast responses to problems. The problem is that they don't respond fast enough. There is NO bullet-proof software, though I give a hat nod to the guys that wrote the code for the Mars rovers. Certainly, Oracle isn't showing that they deserve the price they demand, at least not in this respect.

I might be off topic, but all the F/OSS that I use, delivers what I pay for AND MORE. The software that I have to pay for is lacking. When you pay thousands of dollars, you expect patches in a timely manner, and before you get hacked. I think this is a big reason that F/OSS will continue to win hearts and minds across the world. Despite the financial differences, F/OSS actually cares, or seems to, and they do fix things as soon as they find out, or so it seems to me. They have a reputation to uphold. Without it, they will just wither and die. It amazes me that investors, stock holders, and customers are willing to wait for the next over-hyped release of MS Windows while they suffer the "stones and arrows" of the current version. It appears that no matter how bad commercial software is, people rely on it. Yes, of course there is more to the equation than this simple comparison, but I think this is important. If you weigh what you get against what you pay, F/OSS is a good value. The argument is old, and worn, but ROI is a big deal, and patches make a difference to ROI.

Is it really what the software industry needs? A set of rules to make things bullet proof.. which of course won't ever happen. That kind of mindset is totally wrong, even though the sentiment is in the right place, you can't regulate quality in this regard. Sure, you can make sure that all gasoline is of a given quality, but I don't trust the government to test and regulate software. The US government already has a dismal record of keeping their own house in order on this account, I don't want them telling me how to do anything or what I can and cannot sell, never mind what I can give away for free under GPL.

Re:Wow... is this what the software industry needs (4, Funny)

cyber-vandal (148830) | more than 8 years ago | (#15423728)

She claimed that the British are particularly good at hacking as they have "the perfect temperament to be hackers--technically skilled, slightly disrespectful of authority, and just a touch of criminal behavior."

Sums me up perfectly old boy (well maybe not the technically skilled part)

Re:Bullet Proof Code (1)

BlkItlStl (919234) | more than 8 years ago | (#15423783)

A server taking a shot from a bullet and still keeps running http://youtube.com/watch?v=mAuKwTDGnCg&search=hp%20bulletproof [youtube.com]

Re:Bullet Proof Code (1)

zappepcs (820751) | more than 8 years ago | (#15423794)

Very cool advertisement, but the warranty notices at the end sort of ruined it... still cool though

The software is not bulletproof (1)

mangu (126918) | more than 8 years ago | (#15423922)

A server taking a shot from a bullet and still keeps running


Good advertisement, but it only shows the hardware has enough redundancy to sustain some heavy damage. TFA, OTOH, is about software.


And speaking of software, it's the big weak point in the youtube link you provided. The flash movies in youtube are really annoying to watch. Video is definitely not an appropriate medium to insert in web pages. However, if there is a link to the video file you can download it and watch off-line. I even wrote a small Perl script that I call from inside Konqueror to download videos from break.com. But youtube.com uses flash and that makes it much harder to download separately.


Your server hardware can be bulletproof, but if the software is flash your users will have to accept all the breaks and pauses as the web reluctantly delivers its content to you.

Re:The software is not bulletproof (1)

BlkItlStl (919234) | more than 8 years ago | (#15424099)

I don't work for HP, but the original bulletproof comment reminded me of this clip, which I had seen a while back and thought was cool. Yes, while TFA is talking about software and this is a video of a server taking a bullet, there is most likely a large amount of software support that allows the server to keep operating after massive hardware failures.

Re:Wow... is this what the software industry needs (0)

Anonymous Coward | more than 8 years ago | (#15423940)

"There is NO bullet proof software, though I give a hat nod to the guys that wrote the code for the Mars rovers. "

The Mars Rovers are amazing pieces of equipment, and the software has worked great -- mostly -- but it wasn't bullet-proof code. I can't remember all the details, but the system went bad on one of the rovers within the first few weeks of landing due to a bug (too many files in flash memory); as it is supposed to do, it went into "safe" mode and they fixed it by uploading patches. It took them a while to fix because the system was rebooting over and over, many times a day, and they had a narrow window of opportunity to interrupt it before the system rebooted again. They then applied the same patch to the other rover, which would have been afflicted by the same bug eventually.

The point is, even the Mars Rovers, which can be regarded as a software and hardware success by almost every measure, still had bugs that needed patching. Even "mission critical" software can have problems. As you suggested, the software engineers still deserve a lot of credit.

Ah, I did a search and found a few details [newscientist.com] .

Re:Wow... is this what the software industry needs (1)

FireFury03 (653718) | more than 8 years ago | (#15423968)

There is NO bullet proof software, though I give a hat nod to the guys that wrote the code for the Mars rovers.

Ah, that would be the software on the rovers that almost cost the mission quite early on then. :)

FWIW, I believe the rover software runs under VxWorks. It would, of course, be very interesting to see the software - it's a shame NASA aren't likely to open-source it. If they did, I could quite imagine a few build-your-own-Mars-rover projects popping up on the web. :)

Engineers vs mechanics (3, Interesting)

Colin Smith (2679) | more than 8 years ago | (#15423697)

Most "engineers" are mechanics. It is indeed time that software developers, in fact everyone in the industry, started to act in a more professional manner. That means understanding the principles, and designing and building systems which are known to be able to perform to specification. When I say known, I mean modeled and tested.

You can start taking the profession seriously by joining your local professional engineering body.

 

Re:Engineers vs mechanics (0)

Anonymous Coward | more than 8 years ago | (#15423722)

Amm, IAAE (I am an engineer) and I can tell you "joining your local professional engineering body" is not even an exclusive club any more. Mostly overseas engineers join in the hope that it will give them some recognition. Merit, past work and visible performance is the only way to go. 90% of mechanical engineering can be found in books that are more than 30 years old. It is not the same with software. In fact it might be heading that way, but it will be a while. 10 years ago the following was true 90% of the time (and some change): Engineering = look at a problem and find how it's been done before. Computer science = look at a problem, break it down to its most fundamental bits, build your solution. Now computer science is moving towards the engineering solution. G

Re:Engineers vs mechanics (5, Insightful)

cyber-vandal (148830) | more than 8 years ago | (#15423739)

As soon as the management starts to then so will I. Or did you think unrealistic deadlines and bad overall designs come from the grunts?

Re:Engineers vs mechanics (0)

Anonymous Coward | more than 8 years ago | (#15423846)

You go ahead and be a shining example. Don't come whining to us when you realize that no amount of modeling and testing will result in perfect software. You might be able to write software that performs to specification, but it will not be used in the specified environment, it will be expected to do many things that weren't in the specification and it will be attacked in ways that were not anticipated in the specification. You'll get a nice ISO certification and you'll extract more money out of your customers, but you'll still write and deploy patches, and when you do, it'll go horribly wrong because patching isn't in the specification of perfect software.

Simple Solution (2, Funny)

giafly (926567) | more than 8 years ago | (#15423702)

Re: "Chief Security Officer Mary Ann Davidson has hit out at an industry ... wedded to a culture of "patch, patch, patch," at a cost to businesses of $59 billion"

So, if people pirated software, instead of buying it, there would be no need for vendors to provide patches and business would be $59 billion richer.

How To Lie With Statistics (5, Insightful)

Toby The Economist (811138) | more than 8 years ago | (#15423704)

"I did an informal poll recently of chief security officers on the CSO Council, and a lot of them said they really thought the industry should be regulated,' she said, referring to the security think tank."

Funnily enough, I'm just now reading Darrell Huff's book, "How To Lie With Statistics".

The problems with her poll are manifold.

Firstly, her group is composed of security officers who are on the CSO Council; might their views differ from security officers not on the Council? Perhaps they tend to be more of the belong-to-an-organised-body sort, and might therefore be predisposed towards regulation?

Secondly, of the officers on the Council, which ones did she ask? All of them? Or was she biased towards asking those she already knew would agree? Perhaps those who found it rather boring and aren't quite so pro-organised-bodies just don't turn up at the meetings.

Thirdly, what's her position in the organisation? If *she* asks the question, are people more likely to say "yes" than they would be to another person?

Fourthly, are people inclined in this matter to say one thing and do another anyway? E.g. if you do a survey asking how many people read trash tabloids and how many read a decent newspaper, you find your survey telling you the decent newspaper should sell in the millions while the trash should sell in the thousands - and as we all know, it's the other way around!

Fifthly, even if the views of the members of the CSO Council truly represent all security officers, and even if they were all polled, who is to say the view of high-level security officers is not inherently biased in the first place - for example, towards regulation?

So what, at best, can her poll tell you? Well, at best, it can tell you that chief security officers who regularly turn up at meetings will say to a particular pollster, for whatever reason - and there could be widely differing reasons - that they think regulation is a good idea.

Well, I have to say, that doesn't tell us very much, and that's even assuming the best case for some of the issues, which is highly unrealistic.
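The self-selection effect (the first and fifth problems) is easy to see in a toy simulation; every number below (30% true support, supporters three times as likely to join the council, 500 respondents) is invented purely for illustration:

```python
import random

random.seed(0)

# Suppose 30% of all security officers actually favour regulation...
population = [random.random() < 0.30 for _ in range(100_000)]

# ...but pro-regulation officers are three times as likely to join the council.
council = [v for v in population if random.random() < (0.90 if v else 0.30)]

# Polling even a large random sample of the council now overstates support.
polled = random.sample(council, 500)
print(f"true rate: 0.30, polled rate: {sum(polled) / len(polled):.2f}")
```

Run it and the polled rate comes out far above the true 30%, even though every individual answer is honest: the sample was biased before the first question was asked.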

Re:How To Lie With Statistics (0)

Anonymous Coward | more than 8 years ago | (#15423791)

"I did an informal poll recently of chief security officers on the CSO Council, and a lot of them said they really thought the industry should be regulated,' she said, referring to the security think tank."

You didn't mention the ambiguity of "a lot". If she'd been able to truthfully say "most" then she almost certainly would have done, since it would have strengthened her claim. Since she said only "a lot" it is likely she was unable to get a majority to agree with her.

OT rant: Apparently it's been 55 minutes since I last posted and therefore "Chances are, you're behind a firewall or proxy, or clicked the Back button to accidentally reuse a form.". No, I am not behind a proxy and I did not click the back button. Who thinks like that? This is utterly stupid. What's so magical about 55 minutes that someone thinks I'm using a proxy? Why should they care whether I'm using a proxy anyway?

How to misunderstand someone (1, Informative)

Anonymous Coward | more than 8 years ago | (#15423810)

and get modded insightful for it.

Really, your whole post is so silly, it defies belief.
First off, she did not in any way, shape or form suggest that her poll, as she perhaps wrongly liked to call it, in any way meets the requirements for a statistically correct poll.

Further, her argument does not rely in any way on this "poll", no matter how hard you try to spin it.

So what did she do?
She simply presents an argument about the terrible state of security in software engineering and mentions that many in the field agree with her.
To claim that this is lying with statistics is simply absurd and simply shows that it's not enough to merely read books, one should also understand them.

Re:How to misunderstand someone (1)

Toby The Economist (811138) | more than 8 years ago | (#15423826)

> Further, her argument does not rely in any way on this "poll", no matter how
> hard you try to spin it.

Then why did she say it?

However, note that I didn't read the article, so I can't be arguing that her argument is undermined by her poll being invalid.

My post merely critiqued her statistic.

You're certainly right that she didn't say her poll was statistically valid. I think it's because she didn't even think about it. I certainly had no conscious perception of just *HOW* invalid polling can be until I began reading Huff's book.

Re:How to misunderstand someone (0)

Anonymous Coward | more than 8 years ago | (#15423842)

"Then why did she say it?"
Simply to point out that others in the field agree with her that it is a problem.

"However, note that I didn't read the article, so I can't be trying to argue her argument is undermined by her poll her invalid.

My post merely critiqued her statistic."
Well, not reading the article might be the root of the problem then. She did not use any statistics.
I get the impression that in a sort of Pavlov reflex you read the word poll and eagerly shot away, applying your new-found knowledge but unfortunately missing the target by miles.

"You're certainly right that she didn't say her poll was statistically valid. I think it's because she didn't even think about. I certainly had no conscious perception of just *HOW* invalid polling can be until I began reading Huff's book."
No, she didn't really use a poll. All she did was point out that, talking to her colleagues, many agreed with her.
But you are certainly right, it's fascinating to see what one can do with polling and statistics.

Re:How To Lie With Statistics (2, Funny)

bit01 (644603) | more than 8 years ago | (#15423907)

So, what you're saying is: her survey needs some patches?

---

Insisting on absolute safety is for people who don't have the balls to live in the real world - Mary Shafer [yarchive.net] , NASA

Re:How To Lie With Statistics (2, Funny)

Unique2 (325687) | more than 8 years ago | (#15423982)

Or, as Homer Simpson put it..

"Oh, people can come up with statistics to prove anything. 14% of people know that."

what a moron (1, Insightful)

Anonymous Coward | more than 8 years ago | (#15423712)

Re:what a moron (1)

ufoot (600287) | more than 8 years ago | (#15423792)

Yes, take an example: "Oracle Database 10g Release 2 (10.2.0.1)". 10 dot 2 dot 0 dot 1...
  • 10 is the major release version
  • 2 is the minor release version
Now what the hell is that "0 dot 1" for?

Still not enough (0)

Anonymous Coward | more than 8 years ago | (#15423851)

Actually, 4 numbers weren't enough for Oracle. I have a file (oramts.dll) which has the version 9.2.0.4.1

This, from Oracle? (5, Insightful)

Anonymous Coward | more than 8 years ago | (#15423720)

Whose patches are infamous for breaking stuff, released in 6-month batches (maybe just a mite too spaced out?), and so infamously poor at actually patching their bugs that they currently have an open, publicly known 0day with no patch, because they screwed up patching it last time and it's still open?

And they think security patches are a poor model?

Maybe that's because they put so little effort into them. Maybe some people think of it as bridge maintenance, and they want to build the bridge perfectly every time? When they can't even get patches right with six months between them? Fat chance.

Honestly, out of the people in the software industry, even Microsoft do a better job, security-response-wise, than Oracle. And when you're behind Microsoft in that department, you've really got a problem.

They need to make a serious effort at security response and treat it like a real priority, not show-ponying about regulation when, if they were regulated, they would still be completely unable to respond, but would point to poorly-drafted regulation as "tying them up in red tape".

Re:This, from Oracle? (1)

maxwell demon (590494) | more than 8 years ago | (#15423752)

Whose patches are infamous for breaking stuff, released in 6-month batches (maybe just a mite too spaced out?), and so infamously poor at actually patching their bugs that they currently have an open, publicly known 0day with no patch, because they screwed up patching it last time and it's still open?

If they don't want people to demand patches, the best way is to make the patches so bad that people don't want them. That is, make them worse than the problems they cure, and demand for them will reduce dramatically.

Re:This, from Oracle? (2, Interesting)

Anonymous Coward | more than 8 years ago | (#15423795)

Until we white-hats get so annoyed at Oracle's lack of meaningful response that we lose all semblance of patience with them and decide that the public good would best be served by Full Disclosure of the security holes that Oracle will not fix in a timely manner, so that everyone can make an informed decision whether or not to use Oracle, and (pending Oracle's eventual response, if any) can attempt third-party mitigation via firewalls, SQL proxies, etc.

This has, of course, already happened.

Another failed cross reference (5, Interesting)

228e2 (934443) | more than 8 years ago | (#15423733)

This infuriates me to no end, when people use references they saw on the back of a cereal box because they thought it was cute. FTA:

"What if civil engineers built bridges the way developers write code?" she asked. "What would happen is that you would get the blue bridge of death appearing on your highway in the morning."

I'm sorry, but there aren't crazy people scanning my highway for open ports, and I don't see script kiddies pinging my roads. Graffiti aside, they are left alone. Code that is written works just fine if people don't try to overflow buffers and install rootkits. The bridge I see out of my window is fine because people don't hit it with sledgehammers.



Just my 2 cents . . . .

Re:Another failed cross reference (1)

/ASCII (86998) | more than 8 years ago | (#15423946)

Well, how about buildings, then? They are supposed to keep burglars out, and yet very few houses crash regularly.

The difference is that a computer program can do so many different things. If buying a new toaster could install an invisible front door with no lock right next to your regular door, then I think we would have a lot more real world security problems.

Real pointy-hair speak here (2, Interesting)

Dasher42 (514179) | more than 8 years ago | (#15423737)

People outside the software development field really do make an awful lot of assumptions about the number of things that can go wrong in millions of lines of source code. Specification versus implementation is a tricky beast by itself.

If they really want to follow through with this talk, they'd better be prepared for the design decisions that go along with it, code reuse most of all. One thing that I think is particularly detrimental to code reuse is a proprietary model where the OS and every software vendor re-invents wheels over and over. You're going to need more open specs to change that.

If this is rooting for regulation of the software industry, beware. The big guys have a lot more to gain from this than the small innovators and startups. Who would really want to take advice from stereotyping wags like that anyway?

Re:Real pointy-hair speak here (0)

Anonymous Coward | more than 8 years ago | (#15424113)

Regarding specifications... swing [muetze.net]

I, for one, can only applaud her (1, Insightful)

Anonymous Coward | more than 8 years ago | (#15423748)

I really don't get all the negative comments. I think it is high time that people really start to address this issue and I can only applaud her for doing it.

Lack of security, lack of taking responsiblity and the reliance on customers as beta testers really is a big problem in the software industry and, as she rightly notes, it's going to have some serious repercussions for this industry.

So, if you want to avoid these, get your act together.
Do something about the problem, but don't shoot the messenger!

Re:I, for one, can only applaud her (0)

Anonymous Coward | more than 8 years ago | (#15424000)

...lack of taking responsiblity...

I notice you misspelled the word "responsibility" there.

Typos are a problem that has plagued mankind for millenia. And still you make them. It is time we address this problem.

I propose that only state-licensed typists may write anything from now on. The "write-and-fix" mentality has to go. Anyway, typos are only a conspiracy of the white out makers to make money. What if bridges were built the way you type?! For the love of God, won't someone please think of the children!??!?!

If you want to avoid typos, get your act together!

She is the problem (1)

SmallFurryCreature (593017) | more than 8 years ago | (#15424100)

She is working at one of the worst companies when it comes to patching.

Worse, she is the one shooting the messenger. Hackers are the messengers, and when they hack your software the message is that you screwed up. She wants to stop the hackers/messengers, NOT get her own act together and build secure software from the start.

I can well imagine that Oracle wants regulation against all those nasty people who give them just one month's notice before publishing yet another security hole. SHUT UP so we can continue peddling software with holes in it that we have known about for years. MS feels very much the same.

Patches are like bandaids; being against them is silly. Be against people getting wounded in the first place.

If you are against patches you need to design your software better.

She doesn't want that, she just wants the hackers to go away. This is like banning doctors to make sickness go away.

No, this woman is a clueless shill wanting to make sure her company can peddle the same crap, protected by security through obscurity. You know, like that worked so well for software in the past.

Just Be Clear (3, Insightful)

Enderandrew (866215) | more than 8 years ago | (#15423753)

Often, when consumers are given the choice they prefer to have software sooner, even in a beta state. We joke about how official releases have made us all beta testers, but that doesn't seem to stop us from purchasing software.

Industry regulation is a very bad idea. It will cripple OSS development. It will place an unnecessary burden on taxpayers to fund the red tape. Furthermore, wouldn't regulation somewhat require the regulators to in the end have access to source code?

Do you think major corporations are just going to hand over source code? Can you imagine the leaks?

Lastly, the government has time and time again demonstrated they have little to no understanding of technology. Do we really want them making sweeping decisions regarding software?

Re:Just Be Clear (3, Insightful)

erroneus (253617) | more than 8 years ago | (#15423870)

Often, when consumers are given the choice they prefer to have software sooner, even in a beta state. We joke about how official releases have made us all beta testers, but that doesn't seem to stop us from purchasing software.

Actually, it does. At least in my case, and in the case of the business I work for. The fact is, we have quite a few programmers on staff due to the realization that we KNOW we cannot trust anyone but ourselves to address the concerns of the company directly and diligently. We don't create our own word processors. We have no plans to write our own Photoshop clone. But for many apps that are critical for business flow, we either wrote it ourselves, or have a great deal of leverage over the development of the apps we use.

Industry regulation is a very bad idea. It will cripple OSS development. It will place an unnecessary burden on taxpayers to fund the red tape. Furthermore, wouldn't regulation somewhat require the regulators to in the end have access to source code?

OSS would have an inherent exemption. Regardless of where or how it is used, it's still 'hobby' coding. No pretense is made that it is a for-profit effort. However, if there are any OSS projects that are designed for profit, then yeah, perhaps some level of consumer protection is in order. EULAs have questionable legal status as it is, but I think it's time we struck them down as invalid and forced 'professionals' to accept the blame for shoddy work. As for burdens on taxpayers? OMG. Are you serious? And as for regulators having access to source code? Probably not a bad idea! We've all heard of source code escrow. Perhaps it should ALL be that way.

Do you think major corporations are just going to hand over source code? Can you imagine the leaks?

Yeah, they would, as a continued cost of doing business. Many of the products we use in the physical world are easily duplicated, and most are. Unsurprisingly, there is more than one maker of clothing. More than one burger joint. More than one maker of plastic food storage containers. More than one maker of automobiles. In these cases, it's not the technology that differentiates the product. It's the QUALITY and the reputation of the business (and yeah, the price too) that factor into consumer choice.

But yeah, I see your point about leaks... it could result in software piracy, copyright violations and all sorts of nasty things that... hrm... hey wait a minute! They are ALREADY a problem! This wouldn't create the problem and I can't imagine it adding too much more fuel to it.

Lastly, the government has time and time again demonstrated they have little to no understanding of technology. Do we really want them making sweeping decisions regarding software?

No, I don't want to see more government oversight. But I would like to see more consumer protection. Do you think the consumer doesn't need it? If not, then why not? If so, then how would you propose that consumers get that protection?

Look. There was a time before the FDA and various medical boards. To live life without them protecting the recipients would be rather unimaginable, wouldn't it? We don't want people driving on the streets without all manner of regulation... driver's licenses, safety inspections, liability insurance. We require that many of the products and services we use regularly be regulated to guarantee minimal quality standards, and some of them aren't as 'critical' as software. We don't allow EULAs and disclaimers to get in our way either. There's a cancer warning on every cigarette label. That doesn't stop people and governments from going after the tobacco industry. Why should software have such an exemption? Because it's PRESENTLY unregulated, as medical/dental practice once was? Because it's an unimaginable mess to clean up?

There are ways for government to be involved without being complete morons. How about people with PhDs in software development sitting on the board of regulation? Would a group of software professionals who already KNOW the technology quell your concerns about government ignorance? Many coders are sloppy. Some of them work on projects that affect the public and make them vulnerable, without options or alternatives. I have no doubt that the quality issues would be addressed, but only if they were forced and required. Right now, they aren't, and it's too expensive to compete when others aren't required to concern themselves with quality. But if they all were, unilaterally, required to be responsible for the quality of their products, we'd see some comforting changes. The best of which would be that we'd see actual programmers and engineers in control of these businesses, and deadlines imposed by beancounters wouldn't exist... not so much, anyway.
 

Re:Just Be Clear (1)

Enderandrew (866215) | more than 8 years ago | (#15423912)

If you make your own tools at work, you are the exception, not the rule. Most consumers aren't developers, they are consumers.

You also suggest there is more than one burger joint, and that consumers purchase software based on the quality of said software.

So why then was AOL number 1?

In most categories, I could argue that the leading product is often an inferior product. Given that most CIOs can't differentiate between quality software and well-known software, I don't trust the government to step in and start regulating the industry.

And whether or not there would be any benefit is clearly debatable. This would come out of tax payer's pockets. For the increased cost in spending, how much benefit do you think we'll receive? If the government wags their finger at a developer, do you think they'll switch from operating in "bad programmer" mode to "good programmer"?

Re:Just Be Clear (1)

TeknoHog (164938) | more than 8 years ago | (#15423892)

We joke about how official releases have made us all beta testers, but that doesn't seem to stop us from purchasing software.

Who is this 'we' you speak of? Personally, I don't purchase software, I emerge or apt-get it. As for the beta state of commercial software, it makes me cry rather than laugh, seeing people close to me waste money, time and nerves on Microsoft crap.

Re:Just Be Clear (1)

Enderandrew (866215) | more than 8 years ago | (#15423934)

As a fellow Gentoo user, I can relate.

However, you do not represent the masses. If I had to hazard a guess, I'd say the bulk of software purchases come in the corporate world. People at home love to pirate. And most major businesses prefer to go with traditional retail software over a custom-made-Gentoo-build.

Where is the official support for Gentoo? Can you call a 1-800 number? Are the end users knowledgeable and familiar with it in the way they are with Windows? How standard is it? How consistent is it?

Many people in the corporate world believe you get what you pay for. And if you pay nothing for F/OSS software, that is exactly what you get.

We use a few F/OSS applications in the corporation I work for (Nagios/VNC) and it was like pulling teeth getting those approved.

For those who do not know Oracle: (4, Interesting)

mustafap (452510) | more than 8 years ago | (#15423754)

They are the company who have the worst user interface tools on the planet.

The GUIs would have sucked in the 1980s.

Every SQL statement was designed by a different person, with a different syntax.

If the guy expects us to assume he is an authority on the subject, he should clean up his own rubbish first.

Assumptions (0)

Anonymous Coward | more than 8 years ago | (#15423823)

"If the guy expects us to assume he is an authority on the subject, he should clean up his own rubbish first."

    *She* should clean up *her* own rubbish first.

Regulating the Industry is a joke (0)

Coeurderoy (717228) | more than 8 years ago | (#15423756)

Sure, we should regulate. For instance, any company guilty of releasing software generating more than 10 Million should be banned and forbidden to operate.
So we would get rid of Microsoft, and probably most of the closed source companies.
While it makes sense that a system integrator using software to drive a pacemaker should follow more stringent rules than a game developer, "regulating the industry" is just shorthand for "removing anybody that could be a new threat for us".
I'm sure Oracle would love to "regulate" MySQL.

"British are particularly good at hacking..." (1)

ettlz (639203) | more than 8 years ago | (#15423766)

as they have "the perfect temperament to be hackers--technically skilled, slightly disrespectful of authority, and just a touch of criminal behavior."
Rule Britannia!
Britannia's pwnz0rs r0x.
British h4xx0r5 r so l33t they
pwn3d t3h b0x.

Typical fear mongering (3, Insightful)

Masa (74401) | more than 8 years ago | (#15423769)

Well, patches are not nice, and of course it would be better for customers if the product were perfect from the start. It's true that most software products are buggier than they were, say, fifteen years ago. On the other hand, there are several reasons for the (lack of) quality of modern computer software: tight deadlines, investors, competition, to name a few. And of course it's always possible to cast some blame on the software engineer.

However...

I don't like that she is using age-old classics for fear mongering. "National security" and the bridge analogy to be specific.

Bugs themselves are rarely the problem when we are talking about "national security". For some odd reason, people seem to have forgotten the importance of physically separating the public network from sensitive information and infrastructure. It's stupid to blame the tools when the user is an idiot (and in this case I mean those "chief security officers" who design these braindead infrastructures for corporate networks).

I don't understand how anyone in their right mind could suggest any kind of regulatory system for software quality. It's practically impossible to control, and what if there is some sort of accident caused by a regulated and "certified" product? Is this certification (or whatever) a free pass for the software provider? It would turn out to be the ultimate disclaimer for software companies. Or, the other way around, the ultimate responsibility, which would lead to the point where there are no more software engineers because there is too much personal responsibility involved.

Besides, in my opinion, Davidson insults British people pretty badly by describing them as "slightly disrespectful of authority, and just a touch of criminal behaviour." I think that's not a very professional comment.

Anyway, that's what I think about this whole article.

Re:Typical fear mongering (1)

maubp (303462) | more than 8 years ago | (#15423848)

I thought that final paragraph was funny, and I'm British:
She claimed that the British are particularly good at hacking as they have "the perfect temperament to be hackers--technically skilled, slightly disrespectful of authority, and just a touch of criminal behavior."

Re:Typical fear mongering (1)

Masa (74401) | more than 8 years ago | (#15423873)

I thought that final paragraph was funny, and I'm British

Well, it's nice to hear that the great British sense of humour is still there :)

Still, I think that when talking about software security, this kind of humour perhaps isn't the most appropriate, considering the subject. Usually, if someone wants to lighten a speech up, mocking the people of the host country is not the best way to do it. Mocking the neighbours, on the other hand, is OK ;) But then again, maybe I'm taking this a bit too seriously and thinking too much about etiquette.

Shoddy Straw Man (at best) (2, Insightful)

charlievarrick (573720) | more than 8 years ago | (#15423776)

The whole bridge::software analogy is:

1. A straw man argument, and a poor one at that. It's not uncommon for civil engineering projects to require "patches": http://en.wikipedia.org/wiki/Big_dig#Reports_of_substandard_work_and_criminal_misconduct [wikipedia.org]

2. An obviously bad analogy. I'm sure the specifics will be discussed here ad infinitum.

Re:Shoddy Straw Man (at best) (2, Insightful)

OP_Boot (714046) | more than 8 years ago | (#15423865)

Does no one remember the Millennium bridge across the Thames? http://www.urban75.org/london/millennium.html [urban75.org]
It was opened, closed within two days, then patched.

Forget about the emotion! (0)

erroneus (253617) | more than 8 years ago | (#15423786)

This is a highly emotional topic for many people in the business of making software. But let's get beyond that, especially if we want to be 'professional' developers. After all, when you drive a car, do you concern yourself with the pressures and emotional status of the people who designed and built your car? Not likely, and you shouldn't have to. So you know what it's like to be a consumer/user.

Simple logic says that if a problem can be corrected, it could have been avoided in the first place.

It gets more complicated when there's 'finger pointing' involved. When there are multiple parties developing the same project. When there are faulty libraries being used. When there are deadlines to meet. When it's just "too hard to do it the right way!" These are not the problems of the consumers. If these obstacles are too much to overcome, get out of the business. (heat/kitchen)

I can't find a single explanation that boils down to much more than whiny excuses. Nothing has shown that the simple logic is flawed. It still comes down to 'if it can be fixed, then it could have been avoided.' I would truly like to see how that logic is flawed.

Typical manipulation (2, Insightful)

suv4x4 (956391) | more than 8 years ago | (#15423804)

That's a typical manipulation move: announce a problem we all know exists, ask "why doesn't a solution X exist that solves it?" and then push for solution X to happen.

Somewhere in between the hype surrounding the issue, no one stops to ask themselves, "wait, this solution doesn't even prevent this problem".

Liability is one thing; regulation before manufacturing is another. Given how much success government institutions have had with software patents, how could we trust our software's security to them?

First thing they'll do is "regulate" the existence of a backdoor for the police/CIA/FBI into everything that resembles software technology with access control.

Are there any good examples of govt regulation? (1)

Beryllium Sphere(tm) (193358) | more than 8 years ago | (#15423806)

For software, that is. Building codes and electrical codes have worked pretty well.

If we could measure software quality well enough to regulate it, how much need would there be for regulation? Companies would just specify in their purchase orders "must have 685 mill-pf of quality" or "not less than 3 kilo-Sendmails of security" and the market would sort things out in its usual inconsistent but unbeatable way.

I'm nervous about government regulation partly from spending too much time studying the HIPAA regulations. For one thing there's a requirement that you write down procedures [mail-archive.com] . Then there's "thou shalt have a procedure for updating the procedures". and "thou shalt have a procedure for making the procedures available to those who follow the procedures". After that narrow escape from infinite recursion there's a clause that, after multiple readings, I swear boils down to "thou shalt do what this clause says to do". HIPAA compliance does close some common security holes but at a price that seems excessive even when I'm the one getting paid to do it.

Maybe they could take lessons from OpenBSD (2, Informative)

Bunyip Redgum (641801) | more than 8 years ago | (#15423828)

Yes, OpenBSD still has a few security patches each version, but their methodology is far better than that of many other software developers.

British "Hackers" (2, Insightful)

smoker2 (750216) | more than 8 years ago | (#15423866)

Speaking as a Briton -

the British are particularly good at hacking as they have "the perfect temperament to be hackers--technically skilled, slightly disrespectful of authority, and just a touch of criminal behavior."
should read -
the British are particularly good at hacking as they have "the perfect temperament to be hackers--technically skilled, disrespectful of authority, and are not averse to criminal behavior."
BTW, I see the use of the word "hacking" as a good thing, versus "cracking". Also, "criminal behaviour" is an ever-changing variable, defined by clueless bureaucrats. I break the law every time I play a DVD or MP3 on my Linux system.

The ideal system (for the government) is one where we are all criminals.

Re:British "Hackers" (1, Interesting)

Anonymous Coward | more than 8 years ago | (#15423889)

As opposed to the Americans, who are technically skilled but less so every year, totally paranoid, suffering from a persecution complex, and a nation of drug-addicted murderers who, at least, have moved on from being out-and-out slavers.

When will the 'mericans WAKE UP ?????

Stop throwing stones !!!!! You live in the biggest glass house of them all!

computers and networks (0)

Anonymous Coward | more than 8 years ago | (#15423872)

here is a news flash for you ..

security isn't a computer or a network problem ..

computers and the WWW ..

were never originally designed or intended to be a secure environment .. it's counter productive to the sharing of information ..

computer and network security is only a problem for .. the global corporate interests trying to use them for mass control .. capital commerce .. exploitation .. and profit ..

computer and network security is only a problem for .. the governments that the corporate and elite controlling interests put in power .. wanting to keep information from being available to the general public for scrutiny .. and the general public from discovering the degree to which elite self-interests .. are behind the corporate economics .. governance and conflicts of the modern world ..

neither of which is really in the true best interests of the general public ..

or the use of the WWW ..

for facilitating World Wide communication and interaction of the general public through the sharing of information and knowledge ..

neither of which is in the best interests of global corporate capital exploitation .. and the sudo-democratic governments (limited dictatorships) they put in .. and keep in power ..

it's bad for business .. it's bad for the bottom line .. and it's bad for keeping the sheeple in line ..

Coming from a company.... (2, Insightful)

freedom_india (780002) | more than 8 years ago | (#15423882)

Coming from a company that has for years perfected the art of vaporware, and charges the cost of a battleship to build a kid's 2-ft long boat.

She forgot to say that if Oracle were to adopt truthfulness in adverts, avoid vaporware, and stop charging the cost of a FULL salon to set up cardboard implants, the industry would be $159 billion richer and we would all have witnessed the Second Coming with the money...

Sheesh what a rant from a company that is responsible for the Vaporware strategy...

You will always have patches (1)

sl4shd0rk (755837) | more than 8 years ago | (#15423891)

Until they can invent a human that doesn't make mistakes, what Oracle is aiming for is an unrealistic goal. People screw up, so we patch. Mistakes happen, and we patch. Software evolves, and we patch. When a software company has an install base of several zillion, and can't get their act together in terms of reliability, or don't want to, then you have an issue that needs resolving. Patching because of mistakes is part of being human, patching due to apathy and blatant disregard for security is an entirely different matter. Bring forth thy bitchstick.

the blue bridge of death: submitted on Sat 27 (1)

rs232 (849320) | more than 8 years ago | (#15423896)

Is there a list of approved posters, or has Slashdot descended into a self-indulgent clique?
If so, do you mind posting this list so the rest of us can stop wasting our time?

Oracle Exec Strikes Out At 'Patch' Mentality
Posted by Zonk on Monday May 29, @09:40AM

rs232's Recent Submissions,

the blue bridge of death Saturday May 27, @06:03PM Rejected

Re:the blue bridge of death: submitted on Sat 27 (0)

Anonymous Coward | more than 8 years ago | (#15424008)

Oh it is quite simple.. Your title sucks

If only... (2, Insightful)

Jimboscott (845567) | more than 8 years ago | (#15423901)

"What if civil engineers built bridges the way developers write code?" she asked. If only all IT projects were as well defined as bridge plans...

This is simple... (2, Interesting)

JC Lately (949612) | more than 8 years ago | (#15423908)

The market should determine the value of a quality product. The only regulation that should change is the ability of software vendors to avoid accountability with complex EULAs. If all the businesses in the world sued Microsoft for the effort of continually patching their software, it might just get them to do something. Of course, the cost of the software would rise too, at least in the short term. Secure and bug-free code doesn't need to cost significantly more, provided you have the correct process and design for quality up front. It seems obvious that Microsoft uses the Beta program and even their initial production releases to test their products. Every release of their OS is cobbled together with wire, gum, and duct tape. How about a real security model? How about true multi-user capabilities, not just "My Documents"? How about preventing rootkit installations, period? How is it OK to allow an OS to be so easily attacked and modified without some administrative control? If MSFT and many others approach this topic like a joke, then we need to have our laugh in the courts.

I write the standard. She doesn't get it (5, Informative)

ajv (4061) | more than 8 years ago | (#15423926)

I write the OWASP Guide, which is used by basically everybody as the standard for web application security, and is the official standard of Visa, many governments, and so on.

She talks to CSOs, who are mostly bean counters. They see money go down the drain on patching. I agree with them - patching is inefficient and wasteful. But it's necessary because Oracle builds crap: buggy, insecure software. They are easily five-plus years behind Microsoft in churning out safer software. Buffer overflows, high-privilege accounts, public access to highly privileged library functions - all this stuff is easily 10-15 years old and should not be in Oracle 10g, but it is.

Oracle has time and time again outright refused to get on board with a secure coding program, often fixing just the little bug which gained root privileges, exposed all your data, or destroyed the database outright. Instead, they should be searching for all bugs of those types and fixing them in one hit. Davidson has had more than enough time to address the root cause.

She is holding software up to the standards of bridges. Bridges have tolerances and over-design built into them. Most software does not. Often, to meet artificial deadlines set by beancounters, software is shipped with bugs. Often the bugs are not found for some time, and it takes researchers to go find them. If it's not researchers, it's the commercial 0day crowd. This is where Davidson shows she is an amateur and must be replaced. It's best for HER customers to be secure, and that means shipping secure software. Shipping insecure software does not prevent the 0day houses from creating exploits. Oracle's reputation as a solid data partner is worthless if we lose all our data to an attacker because Oracle suppressed the news from us rather than fixing the problem.

It is simply unachievable to build bug-free software at a reasonable cost. What is required is care, developer training in secure software techniques, and defense in depth. That is our tolerance and over-design, and Oracle is sadly lacking. She has had five years to get their developers onto a program of building this into their platforms, and she's failed miserably. I will be interested to hear what standards they use: whether it's mine (the OWASP Guide), their own based upon ours, or Microsoft's.

I've called for her to step down more than once. When she attacked the good name of David Litchfield and NGS Software, I was outraged: this was shooting the messenger for reporting that their "unbreakable" software was pure crap, which we already knew, but now know through his unstinting efforts is truly appalling and not fit for purpose.

If this latest "push" for too little too late does not work out, she should be sacked by the Oracle board for the good of all Oracle shareholders and customers. She's had more than enough time to make a positive change, and should make way for someone who really understands security.

I'm not sure exactly what regs would accomplish (1)

sentientbrendan (316150) | more than 8 years ago | (#15423933)

that couldn't be done by setting up standards and best practices within the industry, and then testing software and source against those metrics.

It seems like there could be an organization set up to certify software as meeting some security standards. Some people might think this would be a problem for open source, but they forget that there is a lot of money behind open source. I'm sure IBM and others would help foot the bill for getting Linux certified.

The real problem with certification or government regulation is that it might cause innovation to stall in the industry. If an expensive certification process is required for huge classes of applications, then it will be difficult for smaller companies to introduce new products. The way the industry is structured, most innovative products come from smaller companies, which are often bought out by larger companies. If software must be certified, then these companies can never sell anything on their own, and their only hope is being bought out immediately after they have a product, but before they can bring it to market. This keeps such products from being tested by the market before being bought out by a larger company, and makes being a startup so unattractive that even fewer people would be willing to do it.

In other words, regulation might pretty much ruin the whole scheme that has fueled the software industry.

That's a pretty big generalization, though. Some qualifications on what regulation or certification would mean could actually make it pretty attractive. Doing security certification only for small classes of products where the market is already pretty solidified could minimize the damage and maximize the benefit. Varying degrees of certification, where the minimal level is within the range of a small company's budget, would certainly help.

Personally, I'd like to see a good faith effort at industry self regulation through certification before we consider government regulation.

Why wait (1)

heson (915298) | more than 8 years ago | (#15423987)

Oracle can easily start selling products boasting "Verified high reliability" if they think customers want to pay a lot for well-proven (read: ancient) solutions. Their problem is that they want to force customers into buying prehistoric software at premium prices. This PHB sees lots of money going into the patching and feature-adding business at Oracle, and desperately wants a way to kick the coders and sell the current product forever.

Their customers value vaporware and bells and whistles higher than reliability when buying an enterprise database. To Oracle, laws against bells and whistles seem to be the right way to squish the competition.

The Real Enemies of Software Reliability (1)

MOBE2001 (263700) | more than 8 years ago | (#15424105)

The Real Enemies of Software Reliability [rebelscience.org]

Guess what? Oracle is on the list. ahahaha...

Oracle's Chief Security Officer Mary Ann Davidson should be next on the list, IMO, for once more comparing software engineering to bridge and building engineering.