
The Cost of Crappy Security In Software Infrastructure

Soulskill posted about 2 years ago | from the measured-in-dollars-and-annoying-calls-from-relatives dept.

Security 156

blackbearnh writes "Everyone these days knows that you have to double- and triple-check your code for security vulnerabilities, and make sure your servers are locked down as tight as you can. But why? Because our underlying operating systems, languages, and platforms do such a crappy job of protecting us from ourselves. The inevitable result of clamoring for new features, rather than demanding rock-solid infrastructure, is that the developer community wastes huge amounts of time protecting their applications from exploits that should never be possible in the first place. The next time you hear about a site that gets pwned by a buffer overrun exploit, don't think 'stupid developers!', think 'stupid industry!'"


Ugh (5, Insightful)

Anonymous Coward | about 2 years ago | (#40184415)

Tools are dangerous. If I want to cut my hand off with a chainsaw, I can. If I want to leave my PHP script open to XSS, I can.

Re:Ugh (4, Insightful)

h4rr4r (612664) | about 2 years ago | (#40184639)

This 1 million times THIS!

Any tool that is useful will be dangerous.

Re:Ugh (1)

Jeremiah Cornelius (137) | about 2 years ago | (#40185087)


As you increase the general-purpose utility of any piece of technology, you open corresponding opportunity for abuse or exploitation.

Security comes through ongoing practice. This includes implementation specifics, ongoing management operations and individual initiative/decision capacity of users.

To believe there is a technology solution that, correctly implemented at the correct point of design and lifecycle, would automatically solve the security "problem" is a naive point of view, one which ignores the wealth of research and understanding acquired in the field of systems security over the past 20 years.

This is not to argue that nothing can be done. But assuming that security can be "solved" with just the right design and development is cruelly untrue.

And I have yet to see this security-conscientious, aware development community to which the article makes reference.

Re:Ugh (4, Insightful)

I_am_Jack (1116205) | about 2 years ago | (#40184687)

Tools are dangerous. If I want to cut my hand off with a chainsaw, I can. If I want to leave my PHP script open to XSS, I can.

True. But I think the biggest impediment to secure systems and code is what people like my 82-year-old dad will do if you ask them to start making decisions about how tight or loose they want their access to the internet. He's going to get angry and tell me, like he always does when I have to clean viruses off his computer, "I just want to read my email!" And there are plenty of people a lot younger than him who will respond the same way, only it'll be over free smilies, fonts, or porn.

Re:Ugh (1)

Anonymous Coward | about 2 years ago | (#40184761)

Have you considered buying him a Mac? Best investment I ever made when it came to my parents' computing, and my dad is even an electrical engineer. Some people will complain about the cost, but unless your time is completely free, it is easily worth it.

Re:Ugh (1)

jakimfett (2629943) | about 2 years ago | (#40184843)

Have you considered buying him a Mac?

...or installing Linux Mint for him? (a decent amount cheaper, and less confusing for someone moving from Windows...)

Re:Ugh (0)

Anonymous Coward | about 2 years ago | (#40185431)

My parents adapted quickly. I'd already moved them to Firefox, so they just switched from Firefox for Windows to Firefox for OSX. Office for the Mac is similar enough for most users that it also isn't much of an adjustment, or at least compared to the ribbon thing. The other nice thing about Macs if you live in a city area is that if there is a problem, you can send them to the Apple Store, and they'll help out. Battery problem? Boom, Apple Store.

Re:Ugh (1, Funny)

Anonymous Coward | about 2 years ago | (#40184887)

I considered giving up my girlfriend and peddling my ass down the gay part of town, but no, I really didn't. Just like I would never buy a Mac for myself or my family when I could get the same performance and security out of a Linux box for 1/7 the price.

Here's a Mac-related joke, though - why did they bury Steve Jobs face-down? So people like you could stop by for a cold one! Hah heh, silly Macfags.

-- Ethanol-fueled

Re:Ugh (1)

I_am_Jack (1116205) | about 2 years ago | (#40184919)

Have you considered buying him a Mac? Best investment I ever made when it came to my parents' computing, and my dad is even an electrical engineer.

Funny. My dad was a mechanical engineer and you'd think he'd know. I've told him under pain of death that he is never to buy another computer without my input, and yes, it'll be a Mac. Every computer in my house is a Mac, except for the file server, and it's running Mint.

Re:Ugh (0)

Anonymous Coward | about 2 years ago | (#40185187)

Every computer in my house is a Mac, except for the file server, and it's running Mint.

There's nothing wrong with Mint, but why run it on a server? Do you need the GUI that much?

Re:Ugh (5, Insightful)

mlts (1038732) | about 2 years ago | (#40185513)

I personally am from the IT school of "all operating systems suck, so pick what sucks less", and in some cases, the Mac recommendation may be the best way to go.

First, Apple has actual customer service compared to the PC companies (well, unless you buy from the "business" tier and get the better support plan.) So, they will have someone to call to get problems fixed and questions answered that isn't you.

Second, building them a desktop is in some ways the best solution, but it means you are on 24/7/365 call if anything breaks.

Third, Macs are not unhackable, but as of now the biggest attack vector is Trojan horses, while Windows is easily compromised through browser and browser add-on holes. So, for now, Macs have less of a chance of being compromised by browser exploits.

Fourth, Time Machine isn't perfect, but compared to other consumer level backup programs, it is good enough. Especially if paired up with Mozy or Carbonite for documents. That way, the parent's documents are stashed safely even if the computer and its backup drive are destroyed or stolen.

Fifth, the App Store and a stern instruction to not run anything unless it came from there will help mitigate the possibility of Trojans. It isn't perfect, but it is a good method.

Of course, Linux is a workable solution as well, but a Mac's advantage is that it still has a mainstream software selection available for it, so Aunt Tillie can get a copy of Photoshop if she so chooses.

Re:Ugh (3, Insightful)

mlts (1038732) | about 2 years ago | (#40185659)

I find the biggest impediment to secure systems is cost. In previous companies I have worked for, there was a mantra by the management, "security has no ROI."

What escaped them is that proper security practices don't add black numbers to the accounting ledger; they keep red numbers from being added. The typical response when I asked what the contingency plan for a break-in was: "We call Geek Squad. They are open 24/7."

Yes, good security costs. Good routers, firewalling, hiring clued-in network guys, and running penetration scenarios are not cheap. However, compared to other business operating costs, it isn't expensive on a relative scale.

Because there is little to no penalty if a business does get compromised, there is not much interest in locking things down. Until this is addressed, crappy security policies will be the norm.

Re:Ugh (1)

Bengie (1121981) | about 2 years ago | (#40186463)

"security has no ROI."

Security has a best case of no return and a worst case of "you lose everything".

Preventative doctor visits also have no ROI, yet they keep saving money by saving lives.

Ask them why they think banks "waste" money on security.

Re:Ugh (1)

mosb1000 (710161) | about 2 years ago | (#40184735)

I think it's harder to cut your hand off with a chainsaw than you realize, since they really require two hands to operate. A circular saw, on the other hand. . .

Re:Ugh (1)

h4rr4r (612664) | about 2 years ago | (#40184795)

Hit a nail in a tree one time and see where it wants to go. It won't get your arm, but a chainsaw to the ribcage is not much better.

Re:Ugh (1)

X0563511 (793323) | about 2 years ago | (#40184977)

NOODLE ARMS! Really. You should have control over that bastard.

Re:Ugh (1)

h4rr4r (612664) | about 2 years ago | (#40185119)

I grew up in a rural area where logging was, and still is, a common occupation. Noodle arms have nothing to do with it; if you are not paying 100% attention at all times when it kicks back, bad things will happen.

Re:Ugh (1)

mosb1000 (710161) | about 2 years ago | (#40186235)

But don't you think it's advisable to pay 100% attention at all times when using a chainsaw? I mean, it is a chainsaw after all. . .

Re:Ugh (2)

X0563511 (793323) | about 2 years ago | (#40186751)

Indeed. You're supposed to treat it like it wants to murder you at the first opportunity.

Re:Ugh (1)

Surt (22457) | about 2 years ago | (#40184817)

My chainsaw is a couple of years old, but it just has a startup lock that requires two hands. Once it is in operation, it requires only one hand to operate.

Re:Ugh (1)

jakimfett (2629943) | about 2 years ago | (#40184859)

This. Now, if you were talking about *feet*, the OP would have a point...

Re:Ugh (1)

eimsand (903055) | about 2 years ago | (#40184895)

I wish I had mod points for this.

Re:Ugh (0)

Anonymous Coward | about 2 years ago | (#40184981)

I would've replied earlier, but typing responses with one hand takes longer.

Re:Ugh (1)

AK Marc (707885) | about 2 years ago | (#40186129)

I had an electric one, and it was surprisingly light. Sure, you couldn't chase the coeds down the street with it, or cut down a Real Tree, but for trimming, it was about as light as a hedge trimmer, and as powerful as the smallest common gas-powered ones.

Re:Ugh (3, Insightful)

neonKow (1239288) | about 2 years ago | (#40184973)

Yeah, and tools have safety standards too. Just because you accept the risk of a car crash when you buy a car doesn't mean you have to accept the risk of your car spontaneously exploding.

More importantly, if you're writing PHP code that costs money when you have an XSS vulnerability, that means you're responsible for your users' information. So, no, if you want to leave your PHP open to XSS, do it where it doesn't add to the cost of crappy security. And do it in a way that doesn't result in your site being hijacked to serve malware and spam for months on end before you notice.

You're not an island. Personal responsibility means you don't blame other people for stuff that's your own responsibility (like getting hacked); it doesn't mean you can just neglect the responsibility of protecting your customers' or boss's data, or the network that you share.
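For what it's worth, the basic XSS mitigation being neglected here is nearly a one-liner. A minimal sketch in Python (`render_comment` is a hypothetical helper, not from any real framework):

```python
import html

def render_comment(user_input):
    # Escape untrusted input before embedding it in HTML so injected
    # markup is rendered as inert text instead of executed by the browser.
    return "<p>" + html.escape(user_input) + "</p>"

# An attacker-supplied script tag comes back harmless:
print(render_comment("<script>alert(1)</script>"))
# → <p>&lt;script&gt;alert(1)&lt;/script&gt;</p>
```

The same escape-on-output discipline applies whatever the server language is; PHP's equivalent is `htmlspecialchars`.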

Re:Ugh (1)

Deorus (811828) | about 2 years ago | (#40186677)

Yeah, and tools have safety standards too. Just because you accept the risk of a car crash when you buy a car doesn't mean you have to accept the risk of your car spontaneously exploding.

That kind of safety is already available in modern implementations in the form of buffer canaries, randomized memory maps, read/write/execute memory protections, and managed memory allocations that are nearly transparent to the users. This is analogous to your car example, in which safety features exist in order to mitigate the negative consequences of a crash but not to limit its functionality.

Furthermore, there are extremely complex problems for which there isn't even a clearly good engineering solution, such as concurrent programming. While an object-oriented programming language would have enough information to assume the safest option (resource locking), such an assumption would prevent you from taking advantage of the benefits of a transactional implementation. The same kind of complication arises when multiple processes are sharing resources and communicating with each other, in which case you are required to inform your implementation that all shared resources are volatile and take back control of the ownership logic, because under such conditions your implementation no longer has enough information to understand the complexity of the entire IPC solution.

Speaking of object-oriented programming: while it's a very solid option for synchronous programming, it sucks for asynchronous programming, which is best tackled by event-driven programming, because the object-oriented model is not designed to deal with asynchronous error conditions, due to its inability to raise exceptions asynchronously. Speaking of exceptions: while developing under an object-oriented model, you are always required to ensure that all your code is exception-safe, because an unhandled exception during the execution of an instance function can potentially leave an object in an invalid state with no way to recover from that state or generate a relevant exception (required for proper error signaling and self-destruction).

That is why you have signals, which are essentially asynchronous exceptions, but signals carry their own problems. A bit like exceptions, you need to ensure that everything you call from within a signal handler is reentrant; you may also need to ensure that the signal handler itself is reentrant (if signals are not being deferred while the signal handler is running); and any global variables changed by a signal handler must be volatiles of atomic types, which again requires control.

As you can see, things aren't that simple. Software engineering is a complex monster, and that's why I love it.
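The standard discipline for the signal-handler hazards described above is to do almost nothing in the handler: set a flag and defer the real work to a safe point. A sketch in Python (whose runtime already defers handler execution to bytecode boundaries, sidestepping most of the C-level reentrancy traps, so this is an analogue rather than a demonstration of them):

```python
import signal

shutdown_requested = False  # the only state the handler is allowed to touch

def on_sigint(signum, frame):
    # Keep the handler minimal: set a flag and return. All real work
    # (logging, cleanup, I/O) happens later, at a safe point in the
    # main loop, which avoids calling non-reentrant code from the handler.
    global shutdown_requested
    shutdown_requested = True

signal.signal(signal.SIGINT, on_sigint)
signal.raise_signal(signal.SIGINT)  # simulate a Ctrl-C for the demo

if shutdown_requested:  # the main loop polls the flag
    print("shutting down cleanly")
```

In C, the flag would need to be a `volatile sig_atomic_t` for exactly the reasons the parent gives.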

Re:Ugh (1)

AK Marc (707885) | about 2 years ago | (#40186309)

XSS is still a systemic error, not strictly a coding one. Why? Because it's code injection. If the browser were sandboxed, the injected code couldn't do anything. Now, if your bank was hit, or your browser is sandboxed per instance rather than per tab, then you could still lose your bank info to an attack; again, a high-level design issue, not a coding issue.

But designing things well is hard. For one, most designs like that focus on wants and nice-to-haves, not strictly on needs; for another, the designs are discarded when inconvenient, making them useless.

Re:Ugh (1)

Jerome H (990344) | about 2 years ago | (#40186609)

XSS is still a systemic error, not strictly a coding one. Why? Because it's code injection. If the browser were sandboxed, the injected code couldn't do anything. Now, if your bank was hit, or your browser is sandboxed per instance rather than per tab, then you could still lose your bank info to an attack; again, a high-level design issue, not a coding issue.

Well, even if the browser is sandboxed, what would it change? The malicious code comes from the URL (either via mail or a link) and is displayed back to the user without any sanitizing; how is this not a coding error?

Yeah, yeah, yeah. (5, Insightful)

localman57 (1340533) | about 2 years ago | (#40184467)

The next time you hear about a site that gets pwned by a buffer overrun exploit, don't think 'stupid developers!', think 'stupid industry!'"

Yeah, yeah. Hate the game, not the player, and all that. If you code a buffer overrun and you get pwned, it may mean the industry is stupid. But that doesn't mean that you're not stupid too.

Re:Yeah, yeah, yeah. (1, Insightful)

i kan reed (749298) | about 2 years ago | (#40184599)

Except the industry has painfully simple solutions to buffer overruns: almost any programming language developed after 1990 has no risk of them.

Re:Yeah, yeah, yeah. (1)

h4rr4r (612664) | about 2 years ago | (#40184673)

No risk of maturity either.

Truly powerful tools are always dangerous.

Re:Yeah, yeah, yeah. (0)

i kan reed (749298) | about 2 years ago | (#40184753)

Oh yeah, I've heard that java is such an immature platform that no one ever uses it, and it can't do ANYTHING.

Get over yourself.

Re:Yeah, yeah, yeah. (1)

h4rr4r (612664) | about 2 years ago | (#40184819)

The JDK has never had a buffer overflow?
Secunia disagrees.

Re:Yeah, yeah, yeah. (1)

i kan reed (749298) | about 2 years ago | (#40185133)

I don't feel up to dealing with dissembling of this sort. You said "code a buffer overrun." You can't do that in Java. The end. The JDK ISN'T programmed in Java. The overruns it does get (which are basically impossible to trigger without a significantly more complicated exploit to run arbitrary code first) are irrelevant.

Re:Yeah, yeah, yeah. (2)

h4rr4r (612664) | about 2 years ago | (#40185199)

Fine, then I will just admit this one flaw is something java does a good job at preventing. It still will not magically check your inputs for you.

Re:Yeah, yeah, yeah. (1)

i kan reed (749298) | about 2 years ago | (#40185579)

No, it doesn't, but the original point was that the industry does nothing systematic for security. That's untrue. We actually work pretty hard as a whole on at least nominal security. Perfect security requires a non-existent level of perfection, so we address the problems with our software as it stands and plan for the well-known underlying security concepts we can afford to address.

I honestly believe that software engineering is inherently one of the most security-conscious professions. We don't ask our architects to plan buildings with a constant eye to possible flaws the way we ask our programmers to. I imagine we don't push electrical engineers through hoops to prevent cars from being hot-wired, either. I know there are reasons for computers to be higher-risk, but as far as fundamentals go, software ISN'T BAD.

99% Wrong (0)

Anonymous Coward | about 2 years ago | (#40186057)

There are established and well-founded rules regarding static-load and fire-protection safety in building regulations and laws.
The equivalent thing would be outlawing the use of C++ for anything carrying confidential/secret information. But what's the reality? PowerPoint is one of the most-used tools, along with Acrobat Reader, especially in government, the military, and big business. So the Risk Of Being Fucked By China is discounted against Cute Dancing Bunnies.
Some programmers know about the risks, but the typical experienced C++ guy has his preconceived notions about the superiority of C++ versus anything else. In truth, he/she is protecting his/her long-time investment in learning C++. No rationality, no responsible behaviour whatsoever. Most programmers, and of course their managers, are simply Software Development Whores.

Re:99% Wrong (1)

0123456 (636235) | about 2 years ago | (#40186187)

The equivalent thing would be outlawing the use of C++ for anything carrying confidential/secret information.

True. All future operating systems should be written in Ada instead of C/C++.

Re:99% Wrong (1)

colinrichardday (768814) | about 2 years ago | (#40187653)

There are established and well-founded rules regarding static load and fire protection safety in regulations/laws for buildings.

But none of those make buildings vandalism- and arson-proof. Builders don't have to worry too much about malice; programmers do.

Re:Yeah, yeah, yeah. (2)

mrnobo1024 (464702) | about 2 years ago | (#40185325)

The designers of Java tried to do two things regarding security:
1. allow running untrusted code (applets) without letting it break out of its sandbox
2. prevent unsafe memory access by bounds checking, type checking on casts, no explicit deallocation

#2 is a prerequisite for #1, since if code can write to arbitrary memory locations then it can take over the Java runtime process. However, #1 is not a prerequisite for #2. Java has in practice done poorly at meeting goal #1 but has been quite solid at #2.
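A hedged illustration of goal #2: in a memory-safe runtime, an out-of-bounds write raises a catchable error instead of silently corrupting adjacent memory. (Sketched in Python here for brevity; Java's ArrayIndexOutOfBoundsException behaves analogously.)

```python
buf = bytearray(8)  # fixed-size buffer, valid indices 0..7

try:
    buf[8] = 0xFF  # one element past the end: a classic off-by-one
except IndexError as exc:
    # The runtime bounds-checks every access; instead of overwriting
    # whatever happened to live next to the buffer, we get a clean,
    # catchable exception.
    print("caught:", exc)
```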

Re:Yeah, yeah, yeah. (2)

0123456 (636235) | about 2 years ago | (#40185523)

Note that while Java may prevent common bugs like buffer overflows, those bugs may simply cause it to throw an unexpected exception, which gets caught by random code, which then causes the software to behave in an unexpected way. So it's an improvement, but not a magic solution to all your security issues.

And you can probably do all kinds of exciting stuff with random Java programs by throwing so much data at them that they run out of memory and explode in a hail of cascading exceptions.
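A sketch of the failure mode described above, where the runtime's safety exception is swallowed by an overly broad handler (`parse_quantity` is a hypothetical function, invented for illustration):

```python
def parse_quantity(field):
    # Anti-pattern: the blanket "except" silently converts any failure,
    # including the runtime's own safety exceptions, into a plausible
    # default, so the program misbehaves instead of failing visibly.
    try:
        return int(field)
    except Exception:
        return 0

print(parse_quantity("12"))    # 12, as intended
print(parse_quantity("oops"))  # 0: the error is swallowed, the caller never told
```

The bounds check fired, but careless handling turned it into wrong behavior rather than a visible failure.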

Re:Yeah, yeah, yeah. (1)

i kan reed (749298) | about 2 years ago | (#40185689)

I wasn't saying it was a magic bullet, I was saying it addressed a fairly basic common security problem by being an advancement in technology. But it's 20 years old now, better things have come along too.

Re:Yeah, yeah, yeah. (1)

neonKow (1239288) | about 2 years ago | (#40185127)

What are you saying? That modern languages aren't powerful because you can't perform buffer overruns? Even if your claim that more power means more exploits were true, there are a million other vulnerabilities out there besides buffer overruns; and unchecked buffer access is a "feature" that is useless for the vast majority of programming anyway.

Re:Yeah, yeah, yeah. (1)

dgatwood (11270) | about 2 years ago | (#40185549)

Yes. If you truly cannot have buffer overflows, then there are many things the language cannot do. You will never, for example, be able to write a device driver for any existing hardware architecture in any language that does not allow you to construct fixed-length buffers and fixed data structures. By definition, any language that supports those things can experience a buffer overflow.

This is not to say that languages should not have string handling capabilities that are immune to buffer overflows, mind you, but that does not make buffer overflows impossible; it merely makes them less common.

Re:Yeah, yeah, yeah. (1)

blueg3 (192743) | about 2 years ago | (#40186193)

Buffer overflows are independent of whether you have fixed-length buffers and fixed data structures. You can have them with variable-length buffers as well.

The essential problem that causes a buffer overflow is that your language supports a data-copying (or data-writing) operation that either does not care about or must be explicitly told the amount of space available in the destination. This essentially means that you must have range-checking for all pointers.

Last I knew, Ada is both immune to buffer overflows and has been used to write device drivers.
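The range-checked copy primitive described above can be sketched as follows (`bounded_copy` is a hypothetical helper, not any real API):

```python
def bounded_copy(dst, src):
    # A copy primitive that knows the destination's capacity and refuses
    # to write past it: exactly the check a raw C memcpy/strcpy omits.
    if len(src) > len(dst):
        raise ValueError(
            f"source ({len(src)} bytes) exceeds destination ({len(dst)} bytes)"
        )
    dst[: len(src)] = src

buf = bytearray(4)
bounded_copy(buf, b"hi")  # fits, copied in place

try:
    bounded_copy(buf, b"toolong")  # would overflow, so it is rejected
except ValueError as exc:
    print("rejected:", exc)
```

The point is that the copy operation itself carries the capacity check, so the caller cannot forget it.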

Re:Yeah, yeah, yeah. (0)

Anonymous Coward | about 2 years ago | (#40187743)

By definition, any language that supports those things can experience a buffer overflow.

No, they can experience an index-out-of-range error. Buffer overflows are a consequence of unchecked access to memory, not of fixed-length data structures.

You can write device drivers simply by providing support for arrays with specified placement, and not necessarily at the language level; compiler/runtime support is enough. You might see a bounds-checking slowdown with this, but it is easily eliminated in most low-level use cases: access with a static index is trivial and can be checked at compile time, while variable indexes are either checked at runtime or at compile time with sufficiently smart value-range analysis. In any case, you won't be able to clobber a value outside the range.

Nothing says you have to have Wild West like C/C++ pointer arithmetic to write OS level code.

HP's MPE Operating System (0)

Anonymous Coward | about 2 years ago | (#40187921)

...was written in a Pascal variant. It had lots of users who loved the stability and ease of use. Unfortunately, Unix and WNT came along, so the corporate drones killed it. Sad.

Yes, blame the developers! (5, Interesting)

BagOBones (574735) | about 2 years ago | (#40184497)

Most web app exploits ARE the developers' fault!
- They don't check their inputs' length (buffer overflow)
- They build database commands by concatenating strings (SQL injection)
- They don't limit abuse (brute-force retry attacks)

Yes, some of these can be mitigated at other levels, but ALL are common APPLICATION DEVELOPER ISSUES! Measured by number of deployments against number of exploits, I would say the programming languages and OSes already do a MUCH better job than the application developers...
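The third item, limiting retries, can be sketched as a simple in-memory lockout counter. (All names and thresholds here are illustrative, not from any particular framework; a real deployment would persist this state and also throttle by IP.)

```python
import time

MAX_ATTEMPTS = 5        # illustrative threshold
LOCKOUT_SECONDS = 300   # illustrative lockout window

failures = {}  # account -> (failure_count, time_of_first_failure)

def record_failure(account, now=None):
    now = time.time() if now is None else now
    count, since = failures.get(account, (0, now))
    failures[account] = (count + 1, since)

def allow_login_attempt(account, now=None):
    # Refuse further attempts once an account has accumulated too many
    # recent failures, blunting brute-force password guessing.
    now = time.time() if now is None else now
    count, since = failures.get(account, (0, now))
    if now - since > LOCKOUT_SECONDS:
        failures.pop(account, None)  # window expired: reset the counter
        return True
    return count < MAX_ATTEMPTS

for _ in range(MAX_ATTEMPTS):
    record_failure("alice")
print(allow_login_attempt("alice"))  # False: alice is locked out
print(allow_login_attempt("bob"))    # True: bob has a clean slate
```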

Re:Yes, blame the developers! (3, Interesting)

Korin43 (881732) | about 2 years ago | (#40184777)

- They parse or merge database commands (SQL injection)

I would argue that this one is sometimes the fault of the tool. In most database APIs, there's a function like:

run_sql(String command, Object[] data)

But the language that most amateur programmers use only has:

mysql_query(String command);

Looking at that function signature, who's to know that you're supposed to also use mysql_real_escape_string? Even if you know what you're doing, you might accidentally use addslashes or mysql_escape_string instead. Or forget it for one parameter.

Interestingly, the language that does this best is also the second worst language ever invented (after PHP). In ColdFusion, if you do this:

select * from cats where cat_name = '#catname#'

It's perfectly safe, since ColdFusion detects that catname is inside of quotes, so it automatically escapes it. You can still use variables inside of SQL, since it only escapes them when they're inside quotes.
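For contrast, the safe two-argument pattern from the `run_sql` pseudocode above exists in real APIs. A sketch using Python's sqlite3 module as a stand-in:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE cats (cat_name TEXT)")
conn.execute("INSERT INTO cats VALUES ('Whiskers')")

catname = "Whiskers' OR '1'='1"  # hostile input

# The SQL text and the data travel separately: the driver binds the
# input as a value, never parses it as SQL, so the injection is inert.
rows = conn.execute(
    "SELECT * FROM cats WHERE cat_name = ?", (catname,)
).fetchall()
print(rows)  # []: no such cat, and no injection either
```

The escaping question disappears entirely because there is nothing to escape; the value never passes through the SQL parser.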

Re:Yes, blame the developers! (1)

BagOBones (574735) | about 2 years ago | (#40184937)

Your example is still a failure of the developer to understand the tool, not a case of the tool lacking an alternate, secure way to do it.

Re:Yes, blame the developers! (0)

Anonymous Coward | about 2 years ago | (#40186083)

There are such things as bad tools.

The tool is bad if the defining characteristics of the tool are its misfeatures. Take out the crap bits from PHP and it starts to not be like PHP.

They've been deprecating the crap bits (stuff like magic quotes, register_globals, addslashes), but there's still plenty of crap left, and they've even added some more.

Re:Yes, blame the developers! (1)

Korin43 (881732) | about 2 years ago | (#40186935)

Say someone sells a car with three pedals: a gas pedal in the normal place, a brake pedal in an unusual place, and a third pedal that functions like a brake normally but swerves into oncoming traffic if you're going over 50 mph. Is it the users' fault that suddenly they're all crashing? The manual clearly states, "don't ever use the pedal where the brake pedal used to be, because you might swerve into oncoming traffic." But still, isn't it mainly the manufacturer's fault for including that pedal at all?

Re:Yes, blame the developers! (1)

BagOBones (574735) | about 2 years ago | (#40187429)

Developers are not end users... they are some level of engineer, as they are BUILDING things for end users... They should be reading some kind of docs before choosing the tool/function they use for the job... the more powerful the language, the more you need to know.

In your example, the developers would be the ones who built the BAD CAR with the exploit in it; they were not the poor end users who purchased it.

Re:Yes, blame the developers! (1)

InvisibleClergy (1430277) | about 2 years ago | (#40185347)

This so much. I went from ColdFusion development to PeopleSoft, and I find myself missing such nice things from ColdFusion.

Re:Yes, blame the developers! (1)

Korin43 (881732) | about 2 years ago | (#40186765)

I can't bring myself to actually miss anything about ColdFusion, but I do find it interesting that they managed to solve the #1 security problem on the web. If only they could automatically detect when you're trying to store a password, and run it through bcrypt first ;)

Re:Yes, blame the developers! (1)

Anonymous Coward | about 2 years ago | (#40186221)

> I would say the programing languages and OS already do a MUCH better job than the application developers...
Unless the language in question is either PHP or JavaScript.
The first was written by someone with no clue about language design or network security. The second was written by a competent enough programmer who was constrained by having to write the language and library from scratch within a week.
PHP's idea of a database API was an "execute string as SQL" function, pushing the responsibility for avoiding injection attacks onto the developer (by "developer", I mean someone whose entire programming knowledge amounts to the first two chapters of "Learn PHP in 15 Minutes"). PHP isn't a web development language, it's an exploit development language.
And JavaScript has eval(); usually that's a stupid idea, but in a language designed specifically for remote execution, it's an insanely stupid idea. I'll know that the government has started taking "cybersecurity" seriously when they pass a law requiring any "eval" function to be called "inject_me_harder".

Re:Yes, blame the developers! (1)

baggins2001 (697667) | about 2 years ago | (#40186549)

I have blamed the developers, but the greatest source of issues I've seen usually circles back to managers and users. These fundamental issues/problems are still going on:
- Prototypes are taken directly into production. The prototype's intention was to flesh out the business rules. After initial testing shows it does (or can do) what the customer wants, the customer is happy and wants it right now.
So how did the prototype get so poor? After 4 or 5 iterations in which the developer saw multiple weeks of work get dumped and trashed, the code got more turn-and-burn. The backend gets sloppy, just good enough so the shiny front end works.
It may not be business rules; insert any other complex, poorly described, poorly spec'd issue here.

- I have seen developers assigned to departments where they report directly to a business manager or someone in sales. They churn out shiny code. The business manager or sales manager is happy because they are getting new tools.
I actually saw a horror story of this. A developer went and got a view created in a database. The application allowed an unintentional SQL injection. Data was lost, and nobody noticed a large chunk of it was missing for 2 weeks. The DBA got blamed; they just didn't want to blame an out-of-control developer.
But basically it pointed back to a developer who was churning out non-peer-reviewed code. Which is actually a management problem.
- Then there is the industry publishing problem. (This, I think, is one of the biggest problems.)
I haven't read a beginner or intermediate programming book in a while, but when I did, I never saw chapters that addressed these issues: example after example, never checking for buffer overflow or SQL injection. If it is such an easy problem to deal with, why isn't it addressed in beginning example code?

Totally off-base (3, Insightful)

i kan reed (749298) | about 2 years ago | (#40184515)

Computers are inherently instruct-able. That's their power, and that's where all security flaws come form. The underlying problems don't arise out of an industry-wide antipathy. If anything the reality is opposite, the entire industry in quite interested in the fundamentals of security.

The problem lies in the fact that we want to be able to tell computers what to do with a wide assortment of options on each of multiple layers(machine, operating system, high level language, and user application). Every one of those layers necessarily includes things we won't want to do that someone else could want to(i.e. security flaw)

This is like blaming car theft on a general malaise towards car security, when in fact a car that won't go wherever the driver wants, or that only ever accepts one driver, is nigh useless.

Re:Totally off-base (1)

neonKow (1239288) | about 2 years ago | (#40185239)

I disagree. While what you're saying COULD be true, everything I've heard from everyone from programmers to IT employees points to the opposite. As if the news weren't enough. Sure, a security firm getting hacked might be an instance of "if someone wants to hack you badly enough, they will," but many more problems arise because management makes the decisions about where to spend resources, and they're always pushing for features, because that is where the money is.

Re:Totally off-base (1)

i kan reed (749298) | about 2 years ago | (#40185457)

Whoa, I'm blown away by the concept that there are compromises that establish a balance of different needs due to budget limitations. That never affects anything but security.

Re:Totally off-base (1)

damm0 (14229) | about 2 years ago | (#40185359)

The car industry did move towards key fobs that authenticate legitimate holders. And the computer industry can do similar kinds of tricks.

Re:Totally off-base (1)

i kan reed (749298) | about 2 years ago | (#40185437)

We do. We do all the time. That doesn't mean that every choice made should be with a "Security first, freedom second" attitude. Every situation needs known severe risks addressed, and everything else played by ear. That's just how complex projects work.

Like "RSA Security" and the F-22 (and China ?) (0)

Anonymous Coward | about 2 years ago | (#40186399)

I am sure the Chinese Air Force likes those RSA "security" tokens, which gave them the one-time passwords to Lockheed-Martin networks and lots of juicy F-22 data.

Boy, throw away all commercial security thingies and learn how to roll your own (e.g. use a J2ME app to perform two-factor authentication). Then you know the real risks and can keep the Chicoms out. If not, you are just a Victim.

A Clueless American (0)

Anonymous Coward | about 2 years ago | (#40186289)

..that's what you are. Just because all the "important" (i.e. mostly "rich") players keep drumming up C, C++, Java, .NET and the Microsoft/Adobe Virus APIs means nothing. Just because Mr. Bill Programmer-Salesman talks of a "Secure Development Lifecycle" means exactly nothing. Just because every C++ coder and their dog think that their particular code is "safe" means shit.
Reality is - C++ is an el-cheapo solution millions of people are knowledgeable in.

Reality is - Microsoft and Adobe are first and foremost concerned about MONEY. Security is not generating money, it costs money - so it's a nuisance to the typical software company. At least in the short run.

Reality is, even the best C++ developers make lots of security-relevant (ie exploitable) errors, because it is so easy. Just check all the bugs in Chrome.

"Rock-solid HW/OS"? We'll get right on that... (2)

jeffb (2.718) (1189693) | about 2 years ago | (#40184643)

...because we love hearing not only the clamor for new features, but also:

"Why won't you run on commodity hardware? I can get a system that does everything yours does, plus more [including things others make it do against my will], for half the price!"

"Why is your system so much slower? Every benchmark shows that other systems can do X in a quarter of the time [leaving the other 75% for executing malware]."

"Why does your system make it such a PITA for me to do this simple operation, when all the other systems let me [or any unauthenticated user] do it with a few simple lines of code?"

Re:"Rock-solid HW/OS"? We'll get right on that... (1)

X0563511 (793323) | about 2 years ago | (#40185083)

... and yet such PITA systems like SELinux do exist and are utilized.

Solutions are there, people need to just stop being lazy bitches about it.

Most Importantly (0)

Anonymous Coward | about 2 years ago | (#40186471)

"Why can't your system display Dancing Bunnies from IdiotTube ??"

Just Ask Apple (1, Insightful)

Ukab the Great (87152) | about 2 years ago | (#40184715)

When you protect developers and users from themselves, when you start making engineering tradeoffs that reduce functionality and tinkering and fiddling ability in exchange for greater security and stability, some people start screaming that you're being evil, paternalistic and unfreedomly and not letting them decide for themselves whether they want to make tragic mistakes.

Re:Just Ask Apple (1)

neonKow (1239288) | about 2 years ago | (#40185399)

I'd say ask IT security people. Apple is hardly a good example, since the reasons for its walled-garden model have been more about what makes money than what makes things secure. It's been very successful at creating a certain experience for users, but only recently has it taken up the slack as far as security goes, so I feel Apple deserves the criticisms it receives.

Re:Just Ask Apple (1)

Xtifr (1323) | about 2 years ago | (#40185837)

That's funny--I don't hear that sort of screaming about OpenBSD, which is miles ahead of Apple on making a solid, secure system. Maybe it's not the greater security and stability that pisses people off. And it's not the reduced functionality, because OpenBSD has that too. Maybe it's the reduced ability to tinker and fiddle, and the fact that you don't actually own what you bought, and the fact that Apple really are arrogant and paternalistic.

(Actually, I think OSX is a perfectly adequate system. It's Apple's mobile devices that I avoid like the plague.)

It is a double-edged sword (3, Insightful)

brainzach (2032950) | about 2 years ago | (#40184829)

If you design your tools and infrastructure to stop those with bad intent, you can also prevent those with good intent from using your system.

There is no magical solution that will solve our security needs. In reality, everything will require tradeoffs which developers have to balance out according to what they are trying to do.

Re:It is a double-edged sword (1)

DamonHD (794830) | about 2 years ago | (#40185295)

Wait, the evil bit [RFC 3514]? ...



Re:It is a double-edged sword (1)

damm0 (14229) | about 2 years ago | (#40185323)

The great majority of applications today could be coded up in environments similar to what developers are already used to using, but constrained by sandboxes, if the sandbox author were to provide useful tools for the developer to do things they want to do. Examples include local storage on the file system, database interactions, etc.

Oddly enough, efforts to solve the concurrency problem might also help our security problem. Witness [] for example. Being able to analyze the source lattice of a particular variable also gives us useful hints about what safety mechanisms might need to be put in place. If your variable includes user input without going through any of the input checking routines and is then passed to a string concatenation routine before being passed to the database... the run time can detect that easily and abort or check!
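A toy version of that taint-tracking idea can be sketched in Python (the class and function names here are invented for illustration; a real implementation would live in the runtime, as in Perl's taint mode):

```python
class Tainted(str):
    """Marks a string as untrusted user input."""

def sanitize(s):
    # Stand-in for an input-checking routine: real escaping/validation
    # would happen here. The result is an ordinary, trusted str.
    return str(s)

def concat(a, b):
    # Concatenation propagates taint: if either operand is tainted,
    # so is the result.
    result = a + b
    if isinstance(a, Tainted) or isinstance(b, Tainted):
        return Tainted(result)
    return result

def db_execute(query):
    # The "runtime" check: refuse any query built from tainted data.
    if isinstance(query, Tainted):
        raise ValueError("tainted data reached the database")
    return "ok"

user_input = Tainted("bob'; DROP TABLE users; --")

unsafe_query = concat("SELECT * FROM users WHERE name = '", user_input)
safe_query = concat("SELECT * FROM users WHERE name = '", sanitize(user_input))
```

With this in place, `db_execute(unsafe_query)` aborts while `db_execute(safe_query)` proceeds, which is exactly the "the runtime can detect that easily and abort" behavior described above.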

Re:It is a double-edged sword (1)

marcosdumay (620877) | about 2 years ago | (#40187013)

If your variable includes user input without going through any of the input checking routines and is then passed to a string concatenation routine before being passed to the database... the run time can detect that easily and abort or check!

Like Perl?

Well that was 7 minutes I won't get back (2)

eimsand (903055) | about 2 years ago | (#40184853)

Ugh. What a flaky, uninformed piece of drivel that was.

The author can think of himself as an artist all he wants to. Here's a newsflash: other "arts" have to do things responsibly, too.

His whole argument is like an architect blaming the bricks when his/her poorly designed building falls over.

Re:Well that was 7 minutes I won't get back (1)

nobodyatnowhere (2636539) | about 2 years ago | (#40184949)

That took you 7 minutes to read?

Re:Well that was 7 minutes I won't get back (1)

eimsand (903055) | about 2 years ago | (#40185061)

I had to stop and smack my head against the table twice.

Re:Well that was 7 minutes I won't get back (1)

Tarsir (1175373) | about 2 years ago | (#40185435)

Agreed. This was an article with many low points, but I think the following two excerpts highlight the flawed reasoning quite well:

The underlying platforms and infrastructures we develop on top of should take care of [ensuring security], and leave us free to innovate and create the next insanely great thing.

The other major factor in why things are so bad is that we don't care, evidently. If developers refused to develop on operating systems or languages that didn't supply unattackable foundations, companies such as Apple and Microsoft (and communities such as the Linux kernel devs) would get the message in short order.

This article is missing even a gesture towards explaining why "the infrastructure" should be responsible for security while developers create their masterpieces, and boils down to mere whining: "Security isn't fun, so someone else should do it for me!" Perhaps the worst part is that there is a good argument to be made that the OS and hardware should take care of security, and that there is a fundamental limit to how much security they can offer; the blog author just doesn't make it. Having the OS plug a given security hole once is more efficient than having each application duplicate the effort of plugging the hole. On the other hand, security is necessarily a trade-off against functionality, so the only fully secure application is one with no permission to do anything.

Re:Well that was 7 minutes I won't get back (0)

Anonymous Coward | about 2 years ago | (#40186105)

Anytime someone pulls out the 'Why do we still have multiple operating systems?' complaint, I pay close attention to whatever else they're saying. I am constantly updating my list of horrible ideas that I should always avoid, and I know I may find a few new entries for that list nearby.

IT weenies don't know anything about security (1)

nobodyatnowhere (2636539) | about 2 years ago | (#40184867)

America's biggest threat is not terrorism. It's complacency. For such an arrogant industry, IT "solutions" sure do have a LOT of holes. That's what you get when you demand quantity over quality.

Re:IT weenies don't know anything about security (1)

X0563511 (793323) | about 2 years ago | (#40185135)

Correction: MANAGERS don't know anything about security.

Let me assure you, IT does - unless MANAGEMENT has ensured they only hire those who don't. The ones that do, however, cannot exercise it because of MANAGEMENT.

Re:IT weenies don't know anything about security (0)

Anonymous Coward | about 2 years ago | (#40185405)

Wrong. Most IT workers are idiots, in particular most system administrators are idiots. (Most programmers are idiots, too, but I digress.) Case in point:

Let's let a group of people who don't know anything about programming, who aren't interested in programming (because that eats into their video game time), become the gatekeepers and administrators of bug-ridden corporate software. What result?

The entire IT industry is fscked up. At each level you have incompetent people doing the work. Things would be much more secure if the sysadmins washed the floors, the programmers administered the systems, and the grey beards pulled out of retirement to program the software.

Managers are never the problem because they're entirely clueless. All they do is talk, and all you have to do is talk back. There's no substance to what they're saying, so don't pretend like there is. If they say Foo, you say Foo. If they say Bar, you say Bar. That's as far as the interaction ever need go.

Re:IT weenies don't know anything about security (1)

Billly Gates (198444) | about 2 years ago | (#40186475)

The problem is always management. It is taught in business school. They tell the other guys what to do and set the budget.

In a large office with 3,000 users, a few IT support guys are rushing around constantly and do not have the time to update and test. The budget is never set high enough, because IT is viewed as a cost sink or cost center by the finance department. The bean counters are as much to blame as management, for never making the case in board meetings about the hidden costs - the ones that don't show up in the CPAs' Excel sheets - of skimping on IT.

Oblig Farside (2)

BenSchuarmer (922752) | about 2 years ago | (#40184953)

I've got a Farside on my cube wall. The caption is "Fumbling for his recline button, Ted unwittingly instigates a disaster." The picture is a guy sitting in an airplane seat about to grab a switch that's labeled "wings stay on" and "wings fall off".

It's a reminder to me to try to avoid giving my users a way to shoot themselves in the foot.

On the other hand, people need powerful tools to get their jobs done, and those tools can do horrible things when used incorrectly. There's only so much we can do to make things safe.

That's why I code in .NET (1)

JcMorin (930466) | about 2 years ago | (#40184969)

In .NET there is no buffer overflow or HTML injection (querystring and post data are scanned by default) or SQL injection (using SqlParameter, all data are encoded). I "feel" a lot safer about basic security problems.

Re:That's why I code in .NET (1)

Billly Gates (198444) | about 2 years ago | (#40186427)

My specialty is not .NET.

I can tell you that when I rewipe my computer with a fresh Win 7 image there are over 120 security updates, and half are .NET. Hours later there are more security updates for the .NET platform. I do not think it is as secure as you think.

Wah, wah, wah. (2)

Shoten (260439) | about 2 years ago | (#40185157)

The article focuses on security problems that have been largely addressed, in exactly the way he's complaining hasn't happened yet. He focuses on stack smashing and buffer overruns, for example... and disregards the latest higher-level languages that manage memory in ways that make these attacks far less common. He entirely ignores the most frequent and effective attacks (XSS, SQL injection), nor does he talk about the underlying causes of such vulnerabilities. (I, for one, am extremely curious how a SQL injection attack can be the fault of a fundamentally insecure operating system, since in many cases the attack traverses multiple different OSes with nary a hiccup.) I'm not entirely convinced that he even understands what most current vulnerabilities look like, to be honest. And finally, he gives absolutely no indication as to how to accomplish this lofty goal of an OS that would prevent there from being such a thing as an insecure app in the first place. It looks to me like all he's doing is whining about the fact that he's having to learn proper and secure programming methods, which is taking away from his hobby of eating bear claws two at a time.

Clamoring for new features (0)

Anonymous Coward | about 2 years ago | (#40185317)

Gimme a fuck'in break!
People aren't clamoring for a new browser update any more than for the latest Flash player.
This crap is forced down their throats in an effort to grab market share, screw competitors and make a buck.
Customers have almost no direct control.

Author needs to back up a few steps (0)

Anonymous Coward | about 2 years ago | (#40185667)

Forget about protecting me from my bugs ... how about protecting me from the OS's bugs? Windows has so many bugs in very basic functions that I'm amazed anyone manages to write robust software.

For example, the atof() function, which has been around since at least the 80's (probably the 70's) still has bugs on Windows. Microsoft's documentation [] says: "The function stops reading the input string at the first character that it cannot recognize as part of a number." No, it doesn't. It will keep reading until it hits a '\0' even though the extra bytes it is reading won't impact the returned value. This can make reading a number from a large buffer excruciatingly slow. And, if you aren't lucky enough to have a null byte hanging around at the end of the buffer, atof() might just keep going until it tries to read outside of your program's memory space, causing the OS to kill it.

Building robust software on Windows is like building a house on quicksand.

WTF Slashdot (0)

Anonymous Coward | about 2 years ago | (#40185749)

WTF, seriously? You're going to post this to a tech site?

What fucking moron decided to write this summary for slashdot? What fucking moron decided it was a good submission? Yes, I realize the answers are blackbearnh and Soulskill - seriously guys. It's shit like this that makes me wish I read CNN instead. Stop insulting your readers.

Go ahead and mark this post flamebait - that's what the article itself is, so I am just posting in kind.

Tried it... (0)

Anonymous Coward | about 2 years ago | (#40185829)

Yes, because way back in the olden times, before the clamoring for new features overtook infrastructure stability demands, things were completely secured. Never mind that many of the infrastructure components from that era still in use today have to be protected by layers of modern security due to the huge and gaping security holes... we don't have time for actual facts!

it's us, we are the industry (0)

Anonymous Coward | about 2 years ago | (#40185873)

The software community has a massive blind spot, and that is the inability to identify root cause. Think not? Identify one process, run when a developer checks in code, that identifies and resolves root cause. Our industry spends millions of dollars every day automating bad process. Only we can fix that, and there is no better place to do it than in open source, because there are no constraints. It will not happen in industry; this is where open source efforts can truly lead the way.

So close, and yet so far (1)

ka9dgx (72702) | about 2 years ago | (#40185897)

Blaming the users, developers, tool chains, internet, or operating systems isn't going to help fix anything because those aren't the root cause of the problem.

Complexity is the problem. The solutions we're all used to using involve adding even more complexity on top of things, which makes it impossible to secure things.

There is another approach. It's called capability based security, and here's how it works:

For a given process, create a list of the files it needs to access, and the rights it needs for each. That list goes to the operating system, along with the program to run. The OS then checks the list consistently any time a file or other resource is needed. There is a special (but not onerous) way for the process to request access for other files from the OS (like when you need to open or save a file with a new name) called a "power box".

At no time is a process allowed to just try things out and scan around.

This means that you can simply and effectively limit the side effects of a given program, and not have to worry about buffer overflows, etc... because they can only result in processes which end up with the same limited access.
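The scheme described above can be modeled in a few lines of Python (all names here are invented for illustration; real capability systems such as Capsicum or seL4 enforce this in the kernel, not in library code):

```python
class CapabilityError(PermissionError):
    pass

class Sandbox:
    """Mediates every file access against a declared capability list."""

    def __init__(self, capabilities):
        # e.g. {"/data/config.txt": {"read"}} - the list handed to the
        # OS along with the program to run.
        self.capabilities = dict(capabilities)

    def open_for(self, path, right):
        # The OS-side check performed on every access.
        if right not in self.capabilities.get(path, set()):
            raise CapabilityError(f"no {right!r} right for {path}")
        return f"handle:{path}:{right}"  # stand-in for a real file handle

    def grant(self, path, rights):
        # The "power box": a trusted component (e.g. a file-save dialog)
        # adds a capability at the user's explicit request.
        self.capabilities.setdefault(path, set()).update(rights)

box = Sandbox({"/data/config.txt": {"read"}})
h = box.open_for("/data/config.txt", "read")  # declared up front: allowed

# A compromised process can't "try things out and scan around":
# anything outside the list fails, no matter what the code attempts.
denied = False
try:
    box.open_for("/etc/passwd", "read")
except CapabilityError:
    denied = True

box.grant("/home/user/report.txt", {"write"})  # via the power box
h2 = box.open_for("/home/user/report.txt", "write")
```

The point of the sketch is the containment property: a buffer overflow inside the sandboxed process still leaves the attacker holding only the capabilities on the list.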

A capability based security system provides a realistic, reasonable, and fairly easily understood way of providing security which does NOT require trusting code (outside that of the actual OS).

This is the way forward out of the security morass we find ourselves in. I've been preaching this message for a while, and I hope that there are some out there in this wilderness who agree with me.

Oh Boy (0)

Anonymous Coward | about 2 years ago | (#40187485)

Sorry for being condescending, but all you describe is already out there. AppArmor, for example.

The problem with your post is that you view it as a Silver Bullet, which it is not. I agree that it would seriously improve security and should definitely be done. I agree that managers are fucking idiots if they don't allocate time for creating sandboxing profiles (typically just a few days of engineering time).

But at least "simple" sandboxing is not a Silver Bullet, for several reasons. Imagine you are the security admin at Lockheed Martin and you duly create all these sandbox profiles. So you create a profile for this dangerous piece of crap called "Powerpoint" (e.g. using "Sandboxie"). From now on Powerpoint processes on Lockmart computers will only open *.ppt and *.pptx files. Nice. But now along comes the Chinese spearphish and extracts 1500 Powerpoints to Chengdu from the Lockheed Martin CEO. These files contain all the summaries of the hard work of Lockheed Martin scientists and engineers, including key measurement data.

So the Chicoms could not download the "raw" measurement files (say, radar signatures from different aspects to 0.1 degree accuracy), but they got all the highly secret "tips and tricks" related to controlling the signature, the type of absorbing paint used, the maintenance and operations issues. That is how Powerpoint is used in a modern big business - it is used to store and communicate critical data and "know-how". It contains the essence of hundreds of man-years of engineering work, and that is exactly what intelligence is about - extracting knowledge, not extracting low-level data.

So boy, go back to your "thinking zone" and come up with something vastly more complex and nuanced. And then, please integrate it with existing business workflows, consider the ergonomics of all that. Sandboxing is valuable, but it is by no means a Silver Bullet !

Stupid article. Important point. (3, Interesting)

Animats (122034) | about 2 years ago | (#40185963)

The article is stupid. But the language and OS problem is real.

First, we ought to have secure operating system kernels by now. Several were developed and passed the higher NSA certifications in the 1980s and 1990s. Kernels don't need to be that big. QNX has a tiny microkernel (about 70KB) and can run a reasonable desktop or server environment. (The marketing and politics of QNX have been totally botched, but that's a different problem.) Microkernels have a bad rep because CMU's Mach sucked so badly, but that was because they tried to turn BSD into a microkernel.

If we used microkernels and message passing more, we'd have less trouble with security problems. The way to build secure systems is to have small secure parts which are rigorously verified, and large untrusted parts which can't get at security-critical objects. This has been known for decades. Instead, we have bloated kernels for both Linux and Windows, and bloated browsers on top of them.

On the language front, down at the bottom, there's usually C. Which sucks. The fundamental problems with C are 1) "array = pointer", and 2) tracking "who owns what". I've discussed this before. C++ doesn't help; it just tries to wallpaper over the mess at the C level with what are essentially macros.

This is almost fixable for C. I've written about this, but I don't want to spend my life on language politics. The key idea is to be able to talk about the size of an array within the language. The definition of "read" should look like int read(int fd, &char[n] buf; size_t n); instead of the current C form int read(int fd, char* buf, size_t n); The problem with the second form, which is the standard UNIX/Linux "read" call, is that you're lying to the language. You're not passing a pointer to a char. You're passing an array of known size. But C won't let you say that. This is the cause of most buffer overflows.

(It's not even necessary to change the machine code for calling sequences to do this. I'm not proposing array descriptors, just syntax so that you can talk about array size to the compiler, which can then do checking if desired. The real trick here is to be able to translate old-style C into "safe C" automatically, which might be possible.)

As for "who owns what", that's a language problem too. The usual solution is garbage collection, but down at the bottom, garbage collection may not be an option. Another approach is permissions for references. A basic set of permissions is "read", "write", "keep", and "delete". Assume that everything has "read" for now. "write" corresponds to the lack of "const". "delete" on a function parameter means the function called has the right to delete the object. That's seldom needed, and if it's not present, the caller can be sure the object will still be around when the function returns. "Keep" is more subtle. "Keep" means that the callee is allowed to keep a reference to a passed object after returning. The object now has multiple owners, and "who owns what" issues come up. If you're using reference counts, only "keep" objects need them. Objects passed without "keep" don't need reference count updates.
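A rough runtime model of those reference permissions, sketched in Python (the class and method names are invented for illustration; in the proposal above this would all be compiler-checked in C, with no runtime cost):

```python
class Ref:
    """A reference carrying explicit rights: read / write / keep / delete."""

    def __init__(self, obj, rights):
        self._obj = obj
        self.rights = frozenset(rights)

    def read(self):
        return self._obj

    def write(self, value):
        # "write" corresponds to the lack of "const".
        if "write" not in self.rights:
            raise PermissionError("reference is read-only")
        self._obj = value

    def keep(self):
        # Only a "keep" reference may be retained past the call; without
        # it, the caller knows no aliases survive the function's return.
        if "keep" not in self.rights:
            raise PermissionError("callee may not retain this reference")
        return self

stash = []

def callee(ref):
    # A callee that tries to hold on to the reference after returning.
    stash.append(ref.keep())

r = Ref([1, 2, 3], {"read", "write"})
retained = True
try:
    callee(r)          # no "keep" right: the retention attempt is refused
except PermissionError:
    retained = False

r2 = Ref([4], {"read", "keep"})
callee(r2)             # "keep" granted: r2 now has multiple owners
retained2 = stash[-1] is r2
```

As the comment above notes, only the "keep" case creates shared ownership, so only objects passed with "keep" would ever need reference-count updates.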

Do those few things, and most low-level crashes go away.

I won't live to see it.

Re:Stupid article. Important point. (1)

mhogomchungu (1295308) | about 2 years ago | (#40186991)

int read(int fd, &char[n] buf; size_t n);

char buffer[10] ;
&buffer[9] will point to the address of the last element of the buffer.
&buffer[10] is outside the buffer range -->> BUG, C programming 101.

if the function as stated above requires that n be the buffer size, then:

1. You will always be passing a pointer to one past the end of the buffer.
2. You will always be required to read ONLY the full size of the buffer. This will prevent reading more than what the buffer can hold, but it will also prevent reading less than the buffer size - solving a problem caused by programmer carelessness by handicapping other programmers, since they will no longer be able to call "read" to read data of various sizes under the buffer limit.

The problem with the second form, which is the standard UNIX/Linux "read" call, is that you're lying to the language. You're not passing a pointer to a char. You're passing an array of known size. But C won't let you say that. This is the cause of most buffer overflows.

The API takes a pointer to a memory address and writes up to n bytes starting at that address.
The API does not care whether you gave it an array or not, and that's a good thing, because you can then read data not only into arrays, but into any arbitrary position within an array.

Oh no! (0)

Anonymous Coward | about 2 years ago | (#40185971)

This chainsaw is too dangerous!

How am I supposed to chop down the tree with a nail file?

Ah ah (1)

Billly Gates (198444) | about 2 years ago | (#40186339)

Ask any MBA or beancounter. They will gladly tell you that IT is a cost center that adds no value, so there are no costs at all associated with running IE 6 - no security updates after June 7th 2009 - that won't work with intraCrap from MegaCorp.

It is not like any financially sensitive information is ever used on computers anyway, and since they are a dollar sink there is nothing wrong with sticking with that, or switching to typewriters, since after all they are just a cost. (... end sarcasm)
The rant above is a serious issue, especially for hospitals with HIPAA requirements that still run IE 6 and unsupported, update-less XP SP2. The bean counters tell doctors which medically appropriate procedures to run too, not just how to handle IT, and it drives me crazy when I do contract work for them.

The worst offenders are not that McCrappy or Symantic endpoint, but software that is 100 security updates behind! You can't get more updates because the intranet app developers are lazy and do not want to support it, or are evil and do this on purpose so your employer will buy version B for $150,000 to get updates again - or join the rest of the world on Windows 7.

A patched Windows 7 office with IE 9, up-to-date patches, and no Flash or outdated Java running in the internet zone is like 300% more secure. The calls for malware go down drastically. The problem is always the MBAs and the obsession with the share price going up.

TFA is another... rant (0)

Anonymous Coward | about 2 years ago | (#40186721)

How is crappy s/w different from the local police?

No news here I say.

We get to choose... (1)

Genda (560240) | about 2 years ago | (#40187683)

We can have a wide-open, no-holds-barred space to create anything good, bad or indifferent. Or we can lock it all down according to someone's idea of safe, fair and convenient. Under the second plan, a thousand things you are going to want to do will not be possible, because they exceed the mandate of the security environment (no matter where you arbitrarily draw the line). So you get to pick your demons. Me, I like it the way it is. But that's just me.
