
Is Paying Hackers Good for Business?

Zonk posted more than 7 years ago | from the moral-quandary dept.

Security

Jenny writes "In the light of the recent QuickTime vulnerability, revealed for $10,000 spot cash, the UK IT Security Journalist of the Year asks why business treats security research like a big money TV game show. 'There can be no doubt that any kind of public vulnerability research effort will have the opportunity to turn sour, both for the company promoting it and the users of whatever software or service finds itself exposed to attack without any chance to defend itself. Throw a financial reward into the mix and the lure of the hunt, the scent of blood, is going to be too much for all but the most responsible of hackers. There really is no incentive to report their findings to the vulnerable company, and plenty not to. Which is why, especially in the IT security business, there needs to be a code of conduct with regard to responsible disclosure.' Do you think there's any truth to this? Or is it a better idea to find the vulnerabilities as fast as possible, damn the consequences?"


Too late (5, Insightful)

orclevegam (940336) | more than 7 years ago | (#19073935)

0-day exploits are already big business on the black market; better for the companies to pay for disclosure and have a more secure product than for the exploits to be sold off on the black market and only discovered after a significant portion of the user base has been compromised.

Re:Too late (2, Insightful)

!ramirez (106823) | more than 7 years ago | (#19073979)

There's a simple solution to this. Stop writing sloppy, insecure, poorly-managed code, and actually MAKE a product that works as advertised and is fairly secure. Hackers go after the low-hanging fruit. This is nothing more than a product of the 'get it out the door as quick as possible, damn the consequences' software industry mentality.

Re:Too late (3, Insightful)

lgw (121541) | more than 7 years ago | (#19074817)

There's a simple solution to this. Stop writing sloppy, insecure, poorly-managed code, and actually MAKE a product that works as advertised and is fairly secure. Hackers go after the low-hanging fruit. This is nothing more than a product of the 'get it out the door as quick as possible, damn the consequences' software industry mentality.
While this comment is more flaming than is perhaps strictly necessary, this is certainly the heart of the problem. Security best practices are no longer a dark art. In my experience, people often do extra work to create security holes in their products.

If it were just the case that companies were ignoring the security issues in development because it was cheaper, well, that's business for you, but the reverse is commonly true. I'm simply amazed by the frequency with which people write their own products from scratch in areas where products that have already had all the "low hanging fruit" patched are freely available for commercial use!

Here's a hint: you're not going to save any money by writing your own user authentication mechanism, or your own RPC infrastructure, or your own file encryption software, or your own network traffic encryption software. You're spending money to re-invent the wheel, and you're getting a trapezoid with an off-center axle!

Re:Too late (1)

tyler.willard (944724) | more than 7 years ago | (#19077905)

Bullshit.

Complete and utter bullshit.

The anti-virus/security industry has bent over backwards for over a decade to avoid even the appearance of impropriety. Recollect the public and nasty castigation of the University of Calgary over virus-writing courses to "train" antivirus researchers. After this and other efforts, there are still large numbers of people who think antivirus companies write viruses. Offering bounties on vulnerabilities is no different from employing malware authors. This does nothing but harm to the reputation of the industry.

Further, selling vulnerabilities underground takes a hell of a lot more than cracking code with time and tools. It's a *goddamn* criminal endeavor that comes with the associated risks and effort. For the security industry to openly encourage and reward this sort of activity is a horrendous idea from both an ethical and practical standpoint.

The ethical standpoint should be obvious: activity threatening to the public should not be encouraged and rewarded.

The practical standpoint shouldn't be difficult to grasp either: how do you prevent the black-market resale of bountied exploits, how do you contain the information so as not to inspire related exploits, and how do you avoid providing cover for black-market exploit purveyors who send in a few to pretend to legitimacy?

The idea that this could in any way be good or effective is at best ridiculously naive and at worst criminally negligent.

Re:Too late (1)

orclevegam (940336) | more than 7 years ago | (#19082429)

You must live in a very different world than I do. Yes, selling exploits on the black market is illegal, but that's why it's called the black market: it's a place people go to sell illegal things. Because it's illegal and risky, the price is driven up, and the higher the price, the more people are willing to risk getting caught to make money. If, on the other hand, there's a legitimate legal way to make money, even if it's a fraction of what's possible on the black market, more people will be willing to pass up the higher profits in favor of a secure, risk-free channel.

Of course, if the companies did their jobs properly and produced fairly secure code in the first place, then maybe these bounties would go more than a week before someone claimed them. The fact that companies can announce a bounty and get results so quickly proves that the product has a LONG way to go before it's worthy of even claiming any sort of security.

As a good example of how this kind of bounty can be a good metric of something's security, consider the bounties RSA pays for cracking their algorithms. As it is, you can have a very good sense of exactly how strong a particular RSA key is, based on the sheer amount of computing power and time it takes to crack any given one of them. Now, if you had a company that put out one of these bounties and nobody was able to claim it for, say, a year or more, I'd feel pretty confident running that software.

Re:Too late (1)

dougmc (70836) | more than 7 years ago | (#19082867)

Yes, selling exploits on the black market is illegal, but that's why it's called the black market, it's a place people go to sell illegal things.
Selling exploits is illegal? Or is it only illegal because it's on the black market? (and therefore illegal, because anything sold on the black market is illegal?)


I don't get it. Why would selling knowledge of security vulnerabilities be illegal? In the US, the DMCA makes selling copyright circumvention technologies illegal, but I'm not really sure that would apply to general security vulnerabilities. As I see it, cracking into somebody else's box is certainly illegal in most cases, but selling information about a vulnerability that would allow you to do so? In general, no.

Re:Too late (1)

orclevegam (940336) | more than 7 years ago | (#19083357)

That's a fairly good point; selling the actual flaw is not (usually) illegal. What is illegal, however, and what is primarily trafficked in on the black market, are utilities (viruses, worms, and rootkits, to name a few) that take advantage of those exploits to break into systems. Often the cracked systems themselves are sold off as well. Of course, with some of the most recent legislation, and creative lawyers and politicians putting a fair amount of spin on things, it may not be long before selling ANY exploit is considered illegal. As it is, a lot of people are being charged under the DMCA, with the companies claiming that whatever was cracked was bypassing a copyright protection device. Of course, you need not even do anything technically illegal, since if they really want, they can sue you for just about anything; the claims just have to be civil. It's important to remember a civil suit does not have to be based on anything at all, although it's risky to file, since if the court sides against you, it can slap some pretty nasty fees on you for wasting the court's time (although courts seem to exercise this right more against individuals than corporations).

Bounty Hunters (5, Interesting)

conner_bw (120497) | more than 7 years ago | (#19073973)

In the United States, bounty hunters catch an estimated 31,500 bail jumpers per year, about 90% of people who jump bail. That's bounty hunters, who do this for money, not police officers, who also do it for money. In matters of computer security big business can, as it does now, support both bounty hunting and more organized security efforts.

What's wrong with both?

Re:Bounty Hunters (2, Insightful)

malcomvetter (851474) | more than 7 years ago | (#19074107)

The problem with your analogy is that "bounty hunters" in the infosec debate would actually be searching for the exploiters, not the exploits.

Re:Bounty Hunters (1)

l4m3z0r (799504) | more than 7 years ago | (#19074219)

No. What you said is not an analogy. Normal bounty hunters would look for exploiters on the lamb.

Re:Bounty Hunters (1)

Chosen Reject (842143) | more than 7 years ago | (#19074397)

Not even that. Normal bounty hunters would look for accused exploiters on the lam. Or did we decide that if you are on bail then you are guilty? If so, why are we letting the guilty go free for a short time?

Re:Bounty Hunters (2, Informative)

Torvaun (1040898) | more than 7 years ago | (#19078505)

Generally, the accused but innocent don't take off. They stay in the state like they're supposed to, they show up to their trial, and then they most often get acquitted. Violating bail is, in fact, a crime, so a bail jumper is a criminal, regardless of whether or not he's guilty of the crime he put up bail for.

Re:Bounty Hunters (1)

Daychilde (744181) | more than 7 years ago | (#19079221)

"Normal bounty hunters would look for exploiters on the lamb"

You're going to feel sheepish when you realize that should be "on the lam". ;-)

Re:Bounty Hunters (1)

KingKiki217 (979050) | more than 7 years ago | (#19081425)

This is slashdot, so where's the car analogy?

Re:Bounty Hunters (1)

WwWonka (545303) | more than 7 years ago | (#19074165)

exactly...that's why the likes of Microsoft and Apple need to rely on 3l33t peeps with sk1llz like b0b4 f3++.

Re:Bounty Hunters (4, Interesting)

Applekid (993327) | more than 7 years ago | (#19074241)

In the US, bounty hunters have legal protection to do what they do. If a company puts up a juicy reward for finding a security hole, the person coming forward could easily get the shaft and then be prosecuted under DMCA.

At least on the black market, you know, honor among thieves.

Re:Bounty Hunters (1)

drinkypoo (153816) | more than 7 years ago | (#19074339)

If a company puts up a juicy reward for finding a security hole, the person coming forward could easily get the shaft and then be prosecuted under DMCA.

No, that would be illegal. If a cop does it to you, it's entrapment, but in this case it would be... hell, I don't know what it would be. But by throwing the contest they're inviting people to attack their software, and unless your lawyer is utterly incompetent, the DMCA would not apply because you had express permission.

Re:Bounty Hunters (1)

orclevegam (940336) | more than 7 years ago | (#19074527)

the DMCA would not apply because you had express permission.

Well, that really depends on how exactly the contest is stated. If you discover an exploit and then make an announcement about it at the same time you try to claim the prize, the company might turn around and sue you, saying that you didn't have the right to announce the exploit to the general public without their express permission. If, on the other hand, you discover it and only tell them and they try to sue you, yes, then you could pretty much laugh them out of court.

Re:Bounty Hunters (1)

snoyberg (787126) | more than 7 years ago | (#19074855)

Well, that really depends on how exactly the contest is stated. If you discover an exploit and then make an announcement about it at the same time you try to claim the prize, the company might turn around and sue you, saying that you didn't have the right to announce the exploit to the general public without their express permission. If, on the other hand, you discover it and only tell them and they try to sue you, yes, then you could pretty much laugh them out of court.

At least, we'd like to believe so. Remember, justice != logic.

Re:Bounty Hunters (1, Funny)

Anonymous Coward | more than 7 years ago | (#19074711)

What's wrong with both?

Nothing. Both Cops and Dog the Bounty Hunter get cool TV shows. Clearly that is the solution.

Micro$oft should pay some hackers (-1, Offtopic)

Anonymous Coward | more than 7 years ago | (#19073989)

LOL, teh funnay was made complete with the 's' replaced with '$'.

Is paying hackers good for business? (-1, Offtopic)

Anonymous Coward | more than 7 years ago | (#19073993)

Is roasting Juden good for acid rain?

What is it, Obvious Question Day on Slashdot?

What i fail to understand (3, Insightful)

Adambomb (118938) | more than 7 years ago | (#19073995)

Is why such contests would HAVE to report what vulnerability successfully got through. Shouldn't the results be known only to the company holding the contest, the successful hacker, and the companies whose software was involved in the vulnerabilities?

Why couldn't one just announce "Joe Bob McHobo was the winner!" without publicizing the vulnerability itself before the software's author gets a crack at it?

Humanity is weird.

Re:What i fail to understand (1)

drinkypoo (153816) | more than 7 years ago | (#19074163)

Is why such contests would HAVE to report what vulnerability successfully got through. Shouldn't the results be known only to the company holding the contest, the successful hacker, and the companies whose software was involved in the vulnerabilities?

I can only speak for myself, but I would not participate in any such contest in which the vulnerability was not immediately reported, and/or where I did not have the right to immediately release it to the public. From what I have seen of most people who actually find these things, most of them wouldn't either.

Re:What i fail to understand (1)

orclevegam (940336) | more than 7 years ago | (#19074363)

I can only speak for myself, but I would not participate in any such contest in which the vulnerability was not immediately reported, and/or where I did not have the right to immediately release it to the public.

Would you be willing to do it assuming there was a reasonable lag time between the announcement of the discovery and the announcement of the details of the exploit? Something reasonable like say a week or two (agreed to before the contest is started), to give some time for the developer(s) to fix the problem and release a patch. Assuming that the requisite time has passed then either party could release the details, and also have a legally binding contract giving them that right (maybe with a clause that the developers are legally required to attribute the discovery to whoever wins the contest).

Re:What i fail to understand (1)

drinkypoo (153816) | more than 7 years ago | (#19074483)

I can only speak for myself, but I would not participate in any such contest in which the vulnerability was not immediately reported, and/or where I did not have the right to immediately release it to the public.
Would you be willing to do it assuming there was a reasonable lag time between the announcement of the discovery and the announcement of the details of the exploit? Something reasonable like say a week or two

You seem to have a reading comprehension problem. I suggest you look up the meaning of the word "immediately". I didn't say "quickly", or "shortly", or "rapidly". You could have saved yourself the trouble of asking a question whose answer is immediately apparent, and me from having to reply.

Re:What i fail to understand (1)

Manchot (847225) | more than 7 years ago | (#19074649)

Pardon the grandparent for assuming that you weren't a zealot. You've cleared that up, though.

Re:What i fail to understand (1)

drinkypoo (153816) | more than 7 years ago | (#19074841)

Pardon the grandparent for assuming that you weren't a zealot. You've cleared that up, though.

Pardon the grandparent for assuming that you meant something other than what you said. You've cleared that up, though.

There, fixed that for you.

Re:What i fail to understand (1)

lgw (121541) | more than 7 years ago | (#19074879)

Well, yeah, the ancestor assumed "only an idiot would say that, so I'm going to give this guy the benefit of the doubt and assume he meant something slightly different". Don't worry, you have in fact removed all doubt.

Re:What i fail to understand (1)

Adambomb (118938) | more than 7 years ago | (#19074487)

That is precisely what I meant: the companies involved should sort out their security details PRIOR to it becoming public. Reporting it publicly everywhere just gives people a how-to until such time as the vulnerability is fixed.

Re:What i fail to understand (1)

orclevegam (940336) | more than 7 years ago | (#19074727)

I think the biggest concern is more over the time spans involved. It's important beforehand to agree on how quickly the details of any vulnerabilities should be disclosed; otherwise either the company isn't happy because it has to scramble to patch something overnight, or the researcher isn't happy because he can't release any of the details about what he discovered. Having a fixed time agreed to also serves as motivation for the company to actually do something, instead of just sitting on it and waiting until the next version before doing anything.

Disclosure is key (2, Interesting)

Uruk (4907) | more than 7 years ago | (#19074017)

The value of finding security holes is in disclosing them to everyone, particularly the affected vendor.

The most damaging holes are the ones that only the bad guys know about. These don't tend to advance security in software; they just allow people to take over your machine without your permission.

Security research or incentivization schemes that don't include a built-in mechanism to promote disclosure of the discovered problems won't help much.

Money laundering (1)

HomelessInLaJolla (1026842) | more than 7 years ago | (#19074187)

The most damaging holes are the ones that only the bad guys know about
I think this is where we see why this is a money laundering scheme--with big money. The "bad guys" know hundreds, maybe thousands of holes. There is no shortage of security vulnerabilities in nearly any code base in modern software. There are people who have entire libraries of text files describing vulnerabilities for whatever they want.

Remember the semi-cynical description of job descriptions? From a random job seeker's point of view all job descriptions are things that they're seeking to fit themselves to so that they can qualify for a job. In reality, though, job descriptions are the result of careful, diligent, and deliberate definition by HR departments who already have a candidate in mind. It is their goal, then, to write a job description which is sufficiently vague to put on a good show of interviewing candidates (and neutralizing any claims of discrimination, nepotism, or favoritism) while still being able to give the position to the (secretly) preordained favorite.

This is exactly what is happening with pay-for-vulnerability gigs. They already know who knows the vulns (usually someone in the pool of people who wrote the software or someone who, in years past, designed the hardware on which it runs) and they already have their preferred winner selected. The task then is to construct the game show such that more money can be made off of parading the contestants around.

It's the same way insider trading is covered up. It's the same way that political elections are run.

Re:Money laundering (1)

CastrTroy (595695) | more than 7 years ago | (#19074331)

About job descriptions: this is necessary because even if somebody is already doing the job, they still have to go through the same hiring process; at least that's the way it works in government and many other organizations. I knew a guy who was on contract for a couple of years, and they decided to turn his position into a full-time job. They had to hold interviews and everything. Even though they were perfectly happy with his work, and he would require no training or time to learn how things worked, they still had to conduct interviews. Like you said, they made the job description such that he was guaranteed to get the job, but they still had problems hiring him. Other people from the same working group applied for the job, and then appealed the decision once they decided to hire him. I don't see how you could make a case against hiring the guy. If you hire him, you are 100% certain that you will be satisfied. If you hire someone else, no matter how good they are, there's always a chance that it won't work out, for whatever reason.

Re:Money laundering (2, Informative)

drinkypoo (153816) | more than 7 years ago | (#19074393)

In reality, though, job descriptions are the result of careful, diligent, and deliberate definition by HR departments who already have a candidate in mind.

Careful? Yes. Deliberate? Maybe. Diligent? Usually not, which is why we end up with ads requiring a decade of .NET experience or similar.

Usually the HR department knows jack diddly shit about the job they're writing requirements for. And if you hand them requirements that actually fit the position, they'll rewrite them anyway.

Re:Disclosure is key (2, Interesting)

EvanED (569694) | more than 7 years ago | (#19076231)

The most damaging holes are the ones that only the bad guys know about.

And:
The second most damaging holes are the ones that both the bad guys and the developers know about, but no one else does
The third most damaging holes are the ones that everyone knows about
The fourth most damaging holes are the ones that only the developers know about

If you reveal an exploit, you know that you are in the third state. If you do not reveal an exploit to the public, but only tell the developers, you might have made things worse by going into the second state, but you also might have made things better, by moving into the fourth state. So it isn't necessarily a good thing to publicly reveal exploits.

Here are my general thoughts.
I.) If you have good evidence that an exploit is in the wild, publicly release details. [We are in case 2; by releasing the exploit, we move to case 3, an improvement.]

II.) If you would expect that if your exploit was in the wild you would see evidence of that, but you see no such evidence, do not publicly release the exploit *yet*. [We are in case 4; releasing the exploit would move us to case 3, which is getting worse.] Notify the developers. If the developers are reasonably quick with a patch (I would say within a couple weeks in general, maybe until the next "Patch Tuesday" in MS's or similar cases), still do not publicly release the exploit yet. Give maybe a week after the patch is released for security-conscious users and admins to apply it. Then you may release an exploit. If the developers are not being responsive after a reasonable amount of time has passed, release the exploit at that point. Continually monitor for evidence of the exploit being used in the wild; if such evidence surfaces, release the exploit immediately.

III.) Otherwise (if you can't tell if the exploit is in the wild), do not publicly release the exploit *yet*. Follow the procedure in II, but an expedited version of it. (In other words, be more impatient.) Give a few days instead of a couple weeks, don't wait for the next Patch Tuesday unless it's right around the corner, and only give a day or two after the patch release instead of a week for people to catch up. This is the situation where we can't tell if we are in case 2 or case 4; not releasing immediately tries to minimize the chance/effect of moving from case 4 to case 3, while moving quickly tries to minimize the chance/effect of moving from case 2 to case 3.

The timing in II and III above can vary according to the severity of the bug and how hard it would be to patch, and also whether there is a workaround. If there is a workaround, both situations (*especially* III) should be biased towards releasing the exploit; if there is no workaround and it requires a patch from the vendor, the situation should become biased towards not releasing the exploit. Also, if you can release information about the vulnerability (such as a suggested workaround) without releasing enough information to be of much help to black hats, then that obviously becomes biased towards releasing that information. In fact, you might immediately release such information as this even in II. Perhaps the vendor's history as to patching behavior should come into play too.

It's very much a case-by-case scenario. Saying "always release" or "never release" I think is always wrong.
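
A rough sketch of the procedure above in C -- every name, time span, and the workaround rule here is an illustrative assumption on my part, not a standard (as said, it's case-by-case):

#include <stdio.h>

/* The three evidence states from the discussion above. */
typedef enum {
    EXPLOIT_IN_WILD,      /* case 2: bad guys have it, the public doesn't */
    EXPLOIT_NOT_IN_WILD,  /* case 4: apparently only you and the developers */
    EXPLOIT_UNKNOWN       /* can't tell case 2 from case 4 */
} evidence;

/* Suggested days to wait before full public release; 0 means release now. */
int days_until_disclosure(evidence ev, int vendor_responsive, int has_workaround)
{
    int days = 0;
    switch (ev) {
    case EXPLOIT_IN_WILD:      /* I: case 2 -> case 3 is an improvement */
        return 0;
    case EXPLOIT_NOT_IN_WILD:  /* II: patch window, plus admin lead time */
        days = vendor_responsive ? 14 + 7 : 14;
        break;
    case EXPLOIT_UNKNOWN:      /* III: same procedure, expedited */
        days = vendor_responsive ? 3 + 2 : 3;
        break;
    }
    if (has_workaround)
        days /= 2;             /* a workaround biases toward earlier release */
    return days;
}

int main(void)
{
    printf("unknown, responsive vendor, workaround: %d days\n",
           days_until_disclosure(EXPLOIT_UNKNOWN, 1, 1));
    return 0;
}

Evidence of wild use surfacing later would reset the answer to 0, per point I.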

Re:Disclosure is key (1)

BlargIAmDead (1100545) | more than 7 years ago | (#19077285)

What happens when you have enough exploits that you can't take them on a case-by-case basis? I know the usual answer is "Scrap the code and start over! It was shite to begin with." What about if the code is already in production? What about if it's vital system software? What if you just fired all the contract coders because you figured the job was done, realized the code was garbage, and don't have enough cash to hire coders to fix it? I think your system has a very well thought out and quite frankly logical process to it. Nicely done. But my only issue comes up when you have massive amounts of exploits. Ideas?

Responsible disclosure (3, Insightful)

morgan_greywolf (835522) | more than 7 years ago | (#19074027)

'Responsible disclosure' is a euphemism for 'we can't fix bugs fast enough, so if you keep the vulnerabilities a secret, it'll help us to save face.' And more time often means months, not days. Responsible disclosure is nothing more than security through obscurity. And security through obscurity is as good as no security at all. In the intervening months, you have a live, exploitable hole sitting there ripe for attack! And not just on that one system -- every like-configured system is vulnerable. I say, damn the consequences. Report as soon as possible no matter who it embarrasses. It'll either put more pressure on them to fix the bugs faster, or push users to more secure platforms, where security fixes don't take months and are usually found before they're ever exploited in the wild.

Re:Responsible disclosure (4, Insightful)

malcomvetter (851474) | more than 7 years ago | (#19074379)

'Responsible disclosure' is a euphemism for 'we can't fix bugs fast enough, so if you keep the vulnerabilities a secret, it'll help us to save face.'

Wrong. Responsible Disclosure is an attempt to curb the greater than linear complexity associated with testing patches.

If a bug is found in product X, then all applications that reside upon product X need to be validated as functional. In an enterprise, that could include applications plus interfaces that are unique to that organization. Most studies on code complexity find that complexity increases at a greater than linear clip. Responsible Disclosure is the opportunity to level the playing field between the "good guys" and the "bad guys" (deliberately avoiding hat color references).

Anyone who claims Full Disclosure is the best for his company is:
A) Not a sysadmin at all
B) A lazy sysadmin who refuses to fully test patches
-OR-
C) A vulnerability pimp (e.g. IDS, AV, Vuln Assessment, etc.)

Re:Responsible disclosure (0)

Anonymous Coward | more than 7 years ago | (#19074863)

Responsible Disclosure only works if the exploit is not being used. Take the example of an exploit discovered as a result of forensics after an attack. It may not be possible to pinpoint when the attack was successful in all cases, though it may be possible to narrow the time frame. In this scenario, there is an active exploit in the wild being used. Responsible disclosure will leave more people vulnerable for a longer time. Full disclosure would allow individuals to take mitigating steps. No disclosure could provide your company a defense without alerting your competition. Someone who is slightly less ethical could even use the exploit to attack a competitor and hide it under the guise of their own systems being compromised.

The problem is that the world is not full of absolutes. How common is the software? How obvious is the attack? Is there evidence of it being in the wild? How important is company image? Will a report help or hurt the company's brand? How complete should such a report be, and what impact would that have on the brand? Is this a directed attack (a dedicated agent seeking a specific goal, such as corporate espionage) or is this a non-directed incident (worm propagation, an expanding botnet, etc.)? These are all mitigating factors.

Re:Responsible disclosure (1)

Rich0 (548339) | more than 7 years ago | (#19077763)

And how are all these organizations supposed to test their code if they don't know there is a vulnerability, or what might have changed in a product they just got a patch for?

I am under the impression that a bunch of Fortune 500s want security bugs disclosed to software vendors and a select group of companies including themselves, and to nobody else. The problem is that EVERYBODY wants to be one of those select companies, which means the bug gets out anyway. So the bugs leak out to those who would exploit them, but not to all the people affected by them (since not everybody ends up in the elite group).

It seems more reasonable to me to just let the original vendor know about the bug for a little while, and then disclose to the world...

Re:Responsible disclosure (1)

captnitro (160231) | more than 7 years ago | (#19074543)

Report as soon as possible no matter who it embarrasses.

Oh, please. Responsible disclosure isn't about who it embarrasses; this isn't high school. It's about lost data and compromised systems of real people and real companies.

What you're preaching is a form of Econ 101 [joelonsoftware.com] -- if we incentivize security patching via reputation, you'll have more people fixing their holes. Maybe, but regardless, I think you'll just have more people changing the definition of what constitutes a vulnerability.

Re:Responsible disclosure (3, Interesting)

99BottlesOfBeerInMyF (813746) | more than 7 years ago | (#19074595)

Responsible disclosure is nothing more than security through obscurity. And security through obscurity is as good as no security at all.

Actually, security through obscurity is very functional and useful as part of a security scheme. Your password is security through obscurity. Why don't you just give it to everyone if it makes no difference?

In the intervening months, you have a live, exploitable hole sitting there ripe for attack!

And if you disclose the wrong vulnerability to the general public you have a live, exploitable hole that everyone knows about sitting there ripe for attack. Which is better?

Responsible disclosure is simply evaluating what is best for the security of users and disclosing in that manner. In some cases, the best thing for overall security is immediate, public disclosure to pressure the vendor into fixing the hole more quickly and to give users a chance to work around the vulnerability. In other cases, where the vendor is responsive and there is no easy way to mitigate the vulnerability for the end user, immediate disclosure increases the risk to users with no real benefit.

I say, damn the consequences. Report as soon as possible no matter who it embarrasses.

Who is embarrassed is immaterial. Ignoring the likely consequences of your disclosure method, however, is irresponsible, which is why the alternative is called "responsible disclosure."

It'll either put more pressure on them to fix the bugs faster...

In many cases the vendor is very motivated and goes to work with all their resources immediately. Take a look at the OmniWeb vulnerability published by the MOAB project. Omnigroup implemented a fix within a day and had it available for download, but they do the same thing for bugs disclosed to them privately. All the immediate disclosure did was give hackers more time to exploit the problem before the fix reached users. Disclosing a vulnerability to the public before sending it to a responsible and security-minded development team is good for no one but blackhats. Also, rushing vendors to write code faster can result in more bugs in said code, including other vulnerabilities.

...or push users to more secure platforms where security fixes don't take months and are usually found before they're ever exploited in the wild.

Please. Most users will not switch platforms because of security issues and many are locked into MS's desktop monopoly by some software they absolutely need and price constraints. The vast majority of users never even hear about security vulnerability disclosure in the first place.

Here's a tip for you from someone who does work in the security industry. If you're looking for a job in the field, don't expose your irresponsible ideas about disclosure if you want a chance at being hired somewhere respectable.

Re:Responsible disclosure (1)

Weezul (52464) | more than 7 years ago | (#19080257)

Yes, but he's not an insider. He's a guy who only once used a nice canned exploit to play a prank on a friend. All we outsiders see is news about some stupid/evil company who prosecutes some nice kid for "responsible disclosure". Kids are well liked by most. Adults who beat up kids are liked by none. So a vigorous assault on those adults' ability to beat people up seems best.

You're also wrong about security issues not having an impact on platform choice. No one sane runs their web server on Windows. Users' Windows machines are behind a vicious firewall, etc. Moreover, full disclosure has a long-term impact on a vendor's reputation. It may not override other concerns, but people know about it.

Life is simple for a hacker with a conscience: If you find a hole in an open source program, just submit a patch or privately tell the developer, and forget about it. If you find a hole in a closed source product, sell that bitch for all the money you can get. Yes, some innocent people will lose some CPU cycles to a botnet, and others will get spam. But no one will be hurt physically. And you've done your small part to discourage use of some proprietary product. It's not just about "protecting the users"; it should be about following your conscience.

IANASE(H)

Password not "obscurity" (1)

Kaseijin (766041) | more than 7 years ago | (#19080269)

Your password is security through obscurity.
A password is secret, not merely obscure. It's the key that fits the lock.

Re:Password not "obscurity" (0)

Anonymous Coward | more than 7 years ago | (#19093527)

Ugh, I'm glad someone pointed out that idiotic argument already. Thanks, Kaseijin.

Re:Responsible disclosure (1)

houghi (78078) | more than 7 years ago | (#19080801)

Actually, security through obscurity is very functional and useful as part of a security scheme. Your password is security through obscurity. Why don't you just give it to everyone if it makes no difference?


Login : 81121
Password : 123456

And yes, they are actual logins and passwords I use on my Cisco phone. Makes no difference to me.

Re:Responsible disclosure (1)

Torvaun (1040898) | more than 7 years ago | (#19078619)

If 'they can't fix bugs fast enough,' more pressure will not help. It will hurt.

Fixing bugs costs money.
Saying 'Company X has all these exploitable bugs' will cost Company X money, in stock price dropping, fewer consumers, etc.
Thus, exposing exploits can slow down the bug correction process by moving resources away from doing bug corrections. Bonus points if they lose enough money that they have to fire one of the code guys, and he takes a list of unpatched bugs with him when he goes.

Now, you're pushing users to a more secure platform, which sounds like a good thing until you realize that you're creating a monoculture. There is no perfect code, so every platform, regardless of security level, will be vulnerable. Now, they're all vulnerable to the same thing, and you're one 0-day away from tech-hell.

hmm (3, Insightful)

eclectro (227083) | more than 7 years ago | (#19074029)

why business treats security research like a big money TV game show

Maybe because the bugs they find are "showstoppers"?

NO WHAMMIES! (1)

MS-06FZ (832329) | more than 7 years ago | (#19075607)

Come on, damnit, no whammies!

It's business (1)

Joebert (946227) | more than 7 years ago | (#19074083)

What's the difference between you charging me for information, & me charging you for information?

You quit charging me for your information, I'll quit charging you for mine.

Make no mistake, there's plenty of people out there perfectly willing to pay me for my information.

I wish... (5, Funny)

firpecmox (943183) | more than 7 years ago | (#19074217)

My school would do this for me so I would stop getting suspended.

Why not pay? (2, Interesting)

superbus1929 (1069292) | more than 7 years ago | (#19074235)

Here's my view: the one and only point of trying to find a vulnerability is to find the vulnerability. You don't care how it's done; you want that vulnerability found while you still have SOME control over it, instead of after it's out in the wild and you have to patch around it. What's the best way to find your vulnerabilities? Have outsiders working towards a prize. Not only is it good publicity and looks great on the winner's resume, but you find just about everything wrong with your product. It's truly win-win.

Anything that is the most thorough way of eventually getting the programme secure is the best way to go about it. Period.

Stunning (3, Insightful)

Pheersome (116234) | more than 7 years ago | (#19074239)

Wow. How is it that an "ex-hacker" who now "specialises in security from the white hat side of the fence" (from the author's bio) can have so little clue about the responsible disclosure debate and the economics of modern vulnerability research? Maybe getting lambasted on Slashdot will be a wake-up call for him to actually do his homework before he spouts off.

Re:Stunning (3, Funny)

merreborn (853723) | more than 7 years ago | (#19074827)

Maybe getting lambasted on Slashdot will be a wake-up call for him to actually do his homework before he spouts off.


Wait, you mean there are stories/authors who don't get lambasted on slashdot?
I thought we pretty much did our best to rip every story to shreds?

Out of context. (3, Insightful)

Kintar1900 (901219) | more than 7 years ago | (#19074255)

Nice way to take the situation out of context with the snippet here on /. I think the important question isn't whether public, for-pay security hunting is a good idea, but rather if it's ethical for an outside firm to pay for it. Would anyone have batted an eye if Apple had been the one advertising for a hack for the Mac? I don't think so, they'd probably have been lauded for having the wherewithal to offer good money to people to help them find exploits of their software.

Re:Out of context. (0)

Anonymous Coward | more than 7 years ago | (#19108781)

Well, Apple does happen to have a better image and reputation than Microsoft.

Who would you trust more: the guy who keeps more to himself, whom nowhere near as many people know, but who hasn't really managed to screw up... or the guy everyone knows who routinely manages to fumble?

Damn the consequences (5, Insightful)

minion (162631) | more than 7 years ago | (#19074257)

Which is why, especially in the IT security business, there needs to be a code of conduct with regard to responsible disclosure.' Do you think there's any truth to this? Or is it a better idea to find the vulnerabilities as fast as possible, damn the consequences?"
 
Considering how quickly companies tend to SUE you for disclosing a vulnerability, I don't think there can be any true code of conduct between hackers and companies. Not unless the companies start making it (public) policy that they WILL NOT sue you as long as you disclose a vulnerability to them first, and give them a reasonable time to fix it before going public.
 
I think that'll never happen though, and the only way to safeguard a hacker is to make legislation against those type of lawsuits.
 
I also think that'll never happen either, considering how firmly planted the lips of those companies are to the politician's ass... So *#@& 'em, we just need a good way to disclose anonymously.

Re:Damn the consequences (1)

pscottdv (676889) | more than 7 years ago | (#19074583)

I also think that'll never happen either, considering how firmly planted the lips of those companies are to the politician's ass

You've got the choreography reversed

Re:Damn the consequences (3, Interesting)

99BottlesOfBeerInMyF (813746) | more than 7 years ago | (#19074677)

Considering how quickly companies tend to SUE you for disclosing a vulnerability, I don't think there can be any true code of conduct between hackers and companies.

So Apple sued the guy who disclosed this QuickTime vulnerability? If that happened, I never heard about it. In fact, I work in the security industry and very, very rarely hear about any lawsuits, which is why they are news when they do happen.

Not unless the companies start making it (public) policy that they WILL NOT sue you as long as you disclose a vulnerability to them first, and give them a reasonable time to fix it before going public.

Why? Would such a statement stop them from later doing it? In general companies don't sue over vulnerability disclosures, no matter whether they are immediate, or if the vendor is given time. The reason security researchers tend to give companies time to fix things is because that is what they think is best for security, overall.

I think that'll never happen though, and the only way to safeguard a hacker is to make legislation against those type of lawsuits.

That doesn't really work. Basically you can sue anyone for anything in the US (with very few exceptions). I don't see the need for one here since I very rarely, if ever, hear about anyone being sued for disclosing bugs.

Yes. (-1, Troll)

Bobb Sledd (307434) | more than 7 years ago | (#19074277)

Yes, I think there is some truth to this. Wait...Huh? What did he say? This summary is incomprehensible.

Re:Yes. (1)

Duggeek (1015705) | more than 7 years ago | (#19074745)

So many of us are alluding to this, but so few are actually calling it out:

There really is no incentive to report their findings to the vulnerable company, and plenty not to. Which is why, especially in the IT security business, there needs to be a code of conduct with regard to responsible disclosure.' Do you think there's any truth to this? Or is it a better idea to find the vulnerabilities as fast as possible, damn the consequences?"

“No incentive”!? Is $10,000 so lacking as to be deemed a non-incentive? Why is this statement disagreeing with the premise of the article?

I believe the gist is this: When a developer opens the door to the community, and puts up a cash reward for finding vulnerabilities, what's to keep the “black hats” from keeping the exploits to themselves? (potentially selling them underground and making more in illicit revenues than the amount posted as a bona-fide reward) They attempt to introduce pseudo-psychological factors (which only help to confuse the matter) but essentially address the core morality of the coder community.

TFA seems just as confused as OP about the exact point they are both trying to make. I think the headline should read, How much is the color of your hat worth?

In the case of Apple, what if the hacker found a way to make $100,000 from the exploit, rather than just settle for the one-time $10,000 payoff? Would it have been enough to keep someone honest?

I think this brings us to a most compelling question. What's a “white hat” worth? What amount could be called a “standard bounty” for finding vulnerability in code? Also, support a stance on whether such rewards are a “bounty” or a “sellout price”.

(I can hear the knuckles cracking already...)

Re:Yes. (1)

lgw (121541) | more than 7 years ago | (#19074983)

I'd *far* rather make $10,000 legally than $100,000 illegally. This is true of most people. The former is just a better long-term plan.

But this debate is a bit silly, since there are any number of legal firms that pay bounties for exploits in popular software, then extort huge "security consulting" fees out of the vendors to reveal these exploits. When the company offers the bounty directly, it just cuts out the middleman.

FUCK YOU AMERICA! (-1, Troll)

Anonymous Coward | more than 7 years ago | (#19074383)

I'm 24 years old. I don't want to go through the next 50 years of my life living in an international air of worry and uncertainty. I don't want to live in a permanent state of fear, generated by a megalomaniacal American government taking advantage of the majority low-IQ populace's capacity for being brainwashed.

I don't want to live like Israel, fighting militant Muslims round every corner. The problem of Muslim extremists exists and needs to be dealt with, not encouraged by invading innocent countries and waging war on people who have done nothing to deserve it. I want my children to grow up in a world free from military oppression and I want a government that understands that the wars of the future are guerrilla ones which can never be won, even if they are waged for noble purposes (which theirs never are).

The world is fu*cked up enough as it is. The food chain has been poisoned so badly the average human is full of chemicals normally found in plastics and toxic waste. I'm sick of global warming and environmental damage to the planet and the fact that all this time the greenies were right. I'm sick of America being the biggest wilful contributor to the pollution of the planet.

I'm sick of an American school system that produces children who are brought up to believe that America IS the world and anything that goes on outside is irrelevant. Children so stupid they think America invented the Internet, computer, motor car, light bulb, telephone etc ad infinitum....

The Internet or its successor is the future of entertainment and I'm sick of stupid low IQ, ignorant Americans infecting every corner of it with their insular, jingoistic mindsets, their whiny voices and manifestations of their low self esteem driven by the fact that despite it being their turn as the world's super power, no one actually takes them seriously or gives them the respect that the British or the Ancient Greeks got because a superpower best known for producing mass produced crap is never going to get the respect that one who gave the world Shakespeare, culture, philosophy or mathematics will get.

I'm sick of hypocrisy and two facedness. I'm sick of Gangsta Rap and hamburgers, Political Correctness and TV programmes that begin with 'When' and end in 'go bad and attack people'. I'm sick of reality TV and I'm sick of news programmes that are more censored than accurate. I'm sick of tokens, token minorities, token universities, token degrees, token attempts at the truth, tokens. I'm sick of fat people, ugly people, stupid people, gay people, coloured people, female people, whiny people all complaining they don't have the opportunities in life they would like and it must be someone else's fault. I'm sick of women that act like men and femininity being a crime, unless you're a man in which case you're a new man which nobody ever wanted because there was nothing wrong with the old one. I'm sick of people falling over and suing the ground and people watching nipples and suing the TV and I'm sick of coffee cups with 'don't pour over yourself, you may get burnt' on the side to try and counter this.

I'm sick of stupid Americans who don't know the difference between patriotism and jingoism and who think flag waving should be an Olympic event. I'm sick of Americans who cry that people hate them or are jealous of them or who are anti them because someone dares to point out that the America they've been programmed to believe in from birth bears no relation to the one that exists in real life.

Re:FUCK YOU AMERICA! (0)

Anonymous Coward | more than 7 years ago | (#19074553)

Sorry, America. I posted in the wrong topic thread. I meant to post it here [slashdot.org] . ;)

Re:FUCK YOU AMERICA! (1)

OriginalArlen (726444) | more than 7 years ago | (#19074993)

Hey, n.p., I was with you all the way until you got into the stuff about girly-men ;p

Re:FUCK YOU AMERICA! (0, Offtopic)

ShrapnelFace (1001368) | more than 7 years ago | (#19076577)

I'm sick of international people who use the word "American" to typify a country and culture without considering that in fact they just characterized an entire geographic area with flame.

If I am incorrect to assume that you are from outside a place, which I can only guess is the United States of America due to your vague off-topic tirade, then:

FIRST: I owe you an apology for assuming that you aren't from here. Threats like these either come from foreign governments or aspiring socialists.

SECOND: If you are a citizen, there are many places you can move to after you renounce your citizenship- which I encourage you to do ASAP. There is Canada and Venezuela if you don't want to travel too far- please note that these are both good choices for socialists and liberals alike.

In closing I would like to remind you that despite your cry for improvements, you have neglected your self-imposed responsibility by not educating yourself on the finer points of education. First and foremost, you are not only speaking out of turn but severely off topic and in a secular forum. But secondly, you seem to have a contextual impression that everything is related to everything else, and there are no singular independences at work in the world that we live in.

Blame yourself for your own shortcomings, because there is not a single government (not even North Korea's) that can compensate for the low investment in due diligence that has fed your unusually low self esteem.

In short, you, sir, go fuck yourself, and I hope to God that there is a bus out there with your name on it, waiting for that fateful day when you step off the curb without looking, and it runs you over.

God Bless America and all the legal citizens who belong here.

Re:FUCK YOU AMERICA! (1)

Hal_Porter (817932) | more than 7 years ago | (#19078853)

I'm sick of international people who use the word "American" to typify a country and culture without considering that in fact they just characterized an entire geographic area with flame.

I'm sure anti Americans regret any collateral damage they cause.

Responsability (2, Insightful)

Ariastis (797888) | more than 7 years ago | (#19074505)

They released a product with security holes in it; they should pay to have them found.



If a construction company builds a bridge with defects that causes it to fall on someone, that someone can sue them.

If a software company makes an insecure product, and someone gets pwned because of it, that person should be allowed to sue for damages.
Yes, security holes aren't easy to find in big products, but that should never be an excuse for a company (especially those that make billions, wink wink) to release unsafe products.

Re:Responsibility (2, Insightful)

Strilanc (1077197) | more than 7 years ago | (#19074909)

The problem with your argument is it's much harder to create a secure software product than it is to create a secure bridge. This is especially true because delaying construction of a bridge for a month can be done without competitors swooping in and taking the market.

You forgot the EULA (1)

Sean0michael (923458) | more than 7 years ago | (#19078103)

Basically, most EULAs will leave you hanging out to dry in this regard. They'll make sure you acknowledge that the company isn't responsible for security breaches, or at the very least you waive your right to sue for damages in such an instance.

Re:Responsability (1)

dodobh (65811) | more than 7 years ago | (#19079111)

So what happens when the hole is due to the interaction between multiple components, not all of which may be provided by the same vendor? What happens if you are not running in the recommended safe mode?

In the case of Vista, what happens if you turn UAC off?

It's the WRONG APPROACH (0)

Anonymous Coward | more than 7 years ago | (#19074517)

I'm not an expert in the field and I expect to be criticized as such.

But I have always held to the simple and logical principle that if a bug can be fixed or patched, then the problem could have been avoided in the first place with good coding practice and code review.

I have heard from countless sources (like BugTraq and other security lists, as well as professionals in the field) that 99.9% of these bugs come from lazy programmers writing code in ways they should know better than to do. It happens when "quick and dirty" prototype code somehow makes it into production. It happens when the programmer is simply unaware of the problem. It happens for some reasons that are fairly understandable, but let's pause for a moment and consider why people still buy software over open source. Among the reasons, one common one is that commercial code is "professional code" and as such is expected to have been created by trained professionals, using professional standards, methods and techniques. (It's public expectation, not the truth.) But in my mind, if your product is to be considered worthy of public consumption and you would like to be considered nothing less than professional, then perhaps you should write code to professional standards and use professional methods and practices.

Yes, there are buggy libraries beyond the control of many programmers. But by definition, it's not the programmer's fault or responsibility unless, of course, the programmer KNEW about the problem and failed to work around it. But all these stack overflows, underflows, sideways-flows (I just made that up) and the like are simply unforgivable when they come from "professionals" selling their commercial wares. If they don't have the knowledge, then they should quit what they are doing until they have it. In architecture, medicine and many other professional fields, there are serious things that can happen to practitioners' licenses should they fail to behave and perform professionally. Somehow the profession of writing code has escaped that level of professional regulation... and well? Look at the consequences.

Re:It's the WRONG APPROACH (2, Insightful)

malcomvetter (851474) | more than 7 years ago | (#19074877)

I am a security analyst by profession and education [not that it matters, but as a distinction of the previous poster's non-security background].

You are somewhat correct. Sloppy coding techniques do lead to security vulnerabilities, which lead to exploit code, which eventually leads to websites burning, etc. However, that is only one category of security flaws. If you look at, say, the GDI flaws Microsoft had last year, you'll notice that the vulnerability is actually a design flaw -- allowing executable code to live embedded in file objects was the problem [the embedded code's trustworthiness had no mechanism to be measured, and therefore any user double-clicking on a malicious code-within-an-image file would have their system compromised]. Design flaws are much trickier to prevent, and most experts attempting to solve this problem suggest that development houses should leave the design aspects of their code to people with a background in security principles, or at least have some sort of design-time security review. This is mostly what formalized threat modeling attempts to do.

But you are right ... there are vast categories of vulnerabilities that end up compiled in code unnecessarily. And a great place to start for anyone looking to weed these unforgivable buffer overrun types of issues out of their code is to use a static analyzer. Essentially, static analysis tools attempt to catch these obvious (or sometimes not so obvious) bugs before the code is shipped to customers. Fortify Software [fortifysoftware.com] is a great place to look for such a tool.
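
A hypothetical taste of the kind of pattern such tools flag (illustrative only, not an example from any vendor's documentation) -- the untrusted-format-string bug is a favorite because the dangerous call and the safe call differ by three characters:

    #include <stdio.h>

    void log_message(const char *user_input)
    {
        printf(user_input);     /* flagged: the format string comes from
                                   untrusted input, so "%n" or stray "%s"
                                   specifiers can corrupt or leak memory */
    }

    void log_message_fixed(const char *user_input)
    {
        printf("%s", user_input);   /* constant format string: safe */
    }

That mechanical difference is exactly what a tool can match reliably while a human reviewer skims past it.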

Hackers? (1, Informative)

tm2b (42473) | more than 7 years ago | (#19074685)

Of course paying hackers is a good idea, if you want to generate any interesting code... Oh, wait a minute. Slashdot has bought into the lowest common denominator usage of "hacker" to mean a cracker. And here I thought my opinion of the Slashdot moderators couldn't get any lower, after I had moderation privs revoked for daring to criticize them on other matters...

MOD PARENT UP (1)

tgcid (917345) | more than 7 years ago | (#19075193)

Please mod up the "hacker-truth, moderator-bashing" post!

Re:Hackers? (1)

mjeffers (61490) | more than 7 years ago | (#19075787)

Language changes over time and the meaning of the word "hacker" is now commonly understood to mean what geeks would term "cracker". Similarly, "gay" doesn't just mean happy and when I call someone a "bastard" I don't mean that they're the product of unmarried parents. You're fighting a battle that you lost at least 10 years ago.

Re:Hackers? (1)

dwarfsoft (461760) | more than 7 years ago | (#19076639)

Be careful... he might CrAx0r your B0xen... hmm... It doesn't have the same kind of ring to it, does it...

Re:Hackers? (1)

tm2b (42473) | more than 7 years ago | (#19093303)

Not at all. I don't expect the mehums in the mass media to pay any attention, that battle is truly lost. It was more of a rout than a battle, truthfully.

Slashdot editors on the other hand, should know better. There are enough of us here who actually are non-cracking hackers, after all.

"Hackers" is just one of several examples (1)

Freed (2178) | more than 7 years ago | (#19077555)

More terminological abuse from Slashdot editors:

"Linux" instead of "GNU/Linux" (when not referring specifically to the Linux kernel)
"piracy" instead of "copyright infringement"

I am sure we could dig up more.

Re:Hackers? (1)

s16le (963839) | more than 7 years ago | (#19078625)

You need to get laid, big-time.

Re:Hackers? (1)

tm2b (42473) | more than 7 years ago | (#19093291)

How many times? I'm up for more, but I think my girlfriend isn't up for any more today.

(Don't believe the lies, hacker kids. Become successful and get your life together and everything else will fall into place. Except, fucktards who should know better think that hackers are crackers.)

bug testing? (3, Interesting)

Lord Ender (156273) | more than 7 years ago | (#19074825)

Buying vulnerability info from a third party is just outsourcing your QA: it amounts to buying testing plus bug reporting.

If a third party demands money to keep QUIET about a vulnerability, that's extortion.

Much of the animosity here is that many security researchers specialize in breaking things -- they haven't ever worked on engineering a large, complex system. They just don't understand how much time is required to test code before it is released. Also, the legal teams at many companies just don't understand that alienating security researchers by filing lawsuits is only going to make their situation worse.

Re:bug testing? (1)

vinn01 (178295) | more than 7 years ago | (#19076817)


The problem is that bug testers want to be paid for their efforts. The companies will do anything, fair or unfair, to avoid payment.

To any bug testers, I offer these:

Hint: Never answer the question "What will you do if we don't pay for this information?"

Answering that question, with nearly any answer, can lead to extortion charges.

Next hint: Never demonstrate a vulnerability, to anyone, just document it. Written words are rarely illegal. Actions are more frequently illegal.

Not paying is -definitely- not good for business.. (1)

MS-06FZ (832329) | more than 7 years ago | (#19074839)

OK, let's suppose you were to have a standard "date" and didn't pay. You might think this is just dandy, perfectly fine business but in fact the hooker probably has some associates who would be willing to break your kneecaps for that money. So from that perspective, paying hookers is definitely good for business.

who-the-which-what? (1)

OriginalArlen (726444) | more than 7 years ago | (#19074961)

UK IT Security Journalist of the Year...

I'm a UK citizen, I work in infosec, and I have a friend who's an IT hack (er, that is, journalist :) ), yet I have no idea who the UKITSJotY might be. My non-UK SIJOTY is Bruce Schneier, same as last year and the year before that, with Peter Neumann a close second.

I have a better idea! (1)

StewedSquirrel (574170) | more than 7 years ago | (#19075827)

I have a better idea.

Why not hire a professional assessment team to look at your stuff?

I'm not talking about a lame corporate-compliance team, but a highly experienced team of world-class hackers, employed by a reputable company and managed by experienced staff capable of communicating problems quickly and completely.

Try this one: www.accuvant.com

Then you don't have any of these issues.

Of course, that wouldn't necessarily be as cheap. I think $10,000 would definitely be at the extreme low end for even a simple job.

Stew

History (2, Insightful)

Koby77 (992785) | more than 7 years ago | (#19076337)

"Responsible disclosure" would have been great, except that history has shown us that it usually doesn't work. When "responsible disclosure" has been tried the vulnerability has lingered (especially with the larger corporations). When the vulnerability has been openly disclosed, then suddenly the software gets a patch. If history had been different then perhaps we would give the idea consideration. But it wasn't, and it was a problem created by the software companies themselves, so here we are today reaping the seeds that were sown.

haha (0)

Anonymous Coward | more than 7 years ago | (#19076881)

pay me now or pay me later... your call nub....

Hate the vuln, not the finder (0)

Anonymous Coward | more than 7 years ago | (#19076931)

Security industry commentators fallaciously believe that it is the announcement of the existence of a vulnerability that puts users at risk, not the vulnerability itself. As an illustration, compare this QuickTime vulnerability with the Microsoft Windows Animated Cursor (ANI) vulnerability.

The ANI vulnerability was reported to Microsoft in December 2006 by Determina. Completely independently, the vulnerability was reported as being exploited in the wild by Microsoft on March 29, and an official patch from Microsoft was released on April 3. It is unknown how long the vulnerability was being exploited in the wild before Microsoft's announcement.

In this case, the live demonstration of the QuickTime exploit at the conference was performed over a controlled network to prevent anyone else from sniffing the network traffic. The only details released over the weekend were, "A vulnerability affecting Safari on MacOS X". The fact that the vulnerability was in QuickTime's Java components was only revealed on the subsequent Monday, after the vulnerability had been reported to Apple. These details were revealed so that users could take appropriate action (disabling Java) to protect themselves in the meantime. Apple subsequently released a patch one week later.

With the ANI vulnerability, Microsoft took 4 months to fix a very serious flaw. During that time, countless Internet users were compromised by attacks based on it. With the QuickTime vulnerability, Apple took 1 week, and there have been *no* reports of Macs or PCs being compromised using that vulnerability, beyond the MacBook Pro at the contest.

The publicity of the contest actually sped up the process of addressing the vulnerability, thus putting fewer users at risk. Had Microsoft taken 1 week to address the ANI vulnerability, we would have avoided the rash of infections that came in mid to late March. Blame the vulnerability, not the messenger.

Evolve or die? (1)

wytcld (179112) | more than 7 years ago | (#19077101)

If we can create a situation where bug disclosures are maximized, the products with the most serious security problems will die, and likely take their companies with them. So if you're a company that reasonably believes your products have few if any such bugs, your smartest bet is to encourage all companies to offer rewards to hackers - if you're right about the quality of your products, it will take your competition down and leave you standing.

As a customer, then, who should you buy from? The companies with the confidence in their products to offer hacker rewards, or the ones with so little confidence that they don't? Yes, some of the former will be wrong about their products; but virtually all of the latter will be correct.

just my opinion (1)

dead.phoenix.616 (948836) | more than 7 years ago | (#19077743)

Or is it a better idea to find the vulnerabilities as fast as possible, damn the consequences?


I'm not sure if people would agree, but in my opinion, looking back at history, the consumers (which is most of us, I suppose?) got fed up with vendors not dealing with the vulnerabilities in their software (and some vendors going out of their way to call a flaw a "feature").

Pretty soon it became a trend to disclose known vulns -- for leverage? -- everybody getting together because they wanted a solution, a fix, so they would stop getting screwed (again and again). And here we are today, with added hype, and business models that create just another revenue stream.

Code of conduct or not (which is what the article seems to argue for), I'm not wise enough to boldly say which is good or bad. But if someone is nice enough to disclose a problem, it surely makes me feel a little better, because then I can think about it and do something about it (and maybe help others?).

That's another reason why I choose to use open source OSes and software as much as possible (not that I'm siding with any side): if something needs doing, I can open the hood and see, think, and most likely fix it, instead of "working around" a problem and knowing it's not fixed.

just my opinion
(sorry if this was a little off topic)

Incredibly stupid (1)

cdrguru (88047) | more than 7 years ago | (#19077861)

With humans in general, you get more of whatever behavior is rewarded or reinforced. Reward hacking, cracking and exploits, and you will get more of them -- mostly focused in directions you didn't even dream of originally.

And the new crop of victims will never know who to thank.

Re:Incredibly stupid (1)

Nazlfrag (1035012) | more than 7 years ago | (#19093791)

This is precisely why it is a good idea to pay hackers. There are many rewards already available to the exploiters, yet none of these reward disclosure to the affected parties. If we reward disclosure we can expect to see much more of it. The longer that bugs go undisclosed, the longer malicious hackers will have their zero day exploits. With readily available bounties, these exploits will be much harder to pass around the underground without someone leaking the info. This will close down communications between individuals and groups, and create an atmosphere of distrust in the underground. Conversely, the number of people searching for bugs without malicious intent will increase, perhaps even to the point they outnumber the exploiters.