
Exploit Sales: the New Disclosure Debate

Soulskill posted about a year ago | from the responsibility-versus-a-new-car dept.

Security 31

msm1267 writes "There are a lot of echoes of the disclosure debate in the current discussions about vulnerability exploit sales. The commercial exploit market has developed relatively quickly, at least the public portion of it. Researchers have been selling vulnerabilities to a variety of buyers – government agencies, contractors, other researchers and third-party brokers – for years. But it was done mostly under cover of darkness. Now, although the transactions themselves are still private, the fact that they're happening, and who's buying (and in some cases, selling) is out in the open. As with the disclosure debate, there are intelligent people lining up on both sides of the aisle and the discussion is generating an unprecedented level of malice."

31 comments

WTF (4, Funny)

donaggie03 (769758) | about a year ago | (#43733953)

WTF is this article even about?

Re:WTF (2)

OhSoLaMeow (2536022) | about a year ago | (#43733975)

It's in the TFA:

One difference this time around is that there are large piles of currency involved, not to mention the privacy, security–and in some cases, physical security–of people in countries around the world. Governments are buying exploits and using them for a variety of purposes. Some are using them to spy on their own citizens, while others are using them to attack their enemies’ networks. And government contractors and other private buyers are purchasing them for their own uses, as well.

Re:WTF (2)

schneidafunk (795759) | about a year ago | (#43734007)

Agreed. This is an opinion piece by some unknown author on some unknown site (to me). I found no bio info for the author on their site and question the professionalism of websites that cannot figure out how to update their copyright date (2011).

exploit sale = nondisclosure (4, Insightful)

bouldin (828821) | about a year ago | (#43733973)

The only interesting exploit is one that hasn't been patched, right? So anyone who discovers, sells, or buys an exploit knows of a vulnerability and is choosing not to disclose it.

By not disclosing a vulnerability, you are allowing others to be vulnerable. It's hard to argue that this is ethical behavior...

Here's an analogy: what if, for every nuke the U.S. destroyed, a nuke disappeared from every other nuclear arsenal in the world? That's what it's like: by keeping a vulnerability secret, it can be used against anyone running the software; by disclosing the vuln, everyone can patch, disable, or protect the vulnerable software.

Re:exploit sale = nondisclosure (5, Interesting)

michelcolman (1208008) | about a year ago | (#43734017)

Being paid for finding a vulnerability and keeping it secret sure beats getting sued for disclosing it responsibly.

Re:exploit sale = nondisclosure (1)

Intrepid imaginaut (1970940) | about a year ago | (#43734253)

It is kind of a win-win: either the people who stand to lose from vulnerabilities buy them, or the people who stand to gain from them will. If the practice is made illegal, it will simply drive disclosers underground, which will greatly weaken all sorts of organisations as the proportion of hostile buyers increases.

Re:exploit sale = nondisclosure (0)

Anonymous Coward | about a year ago | (#43734277)

Most of the big companies have well-known ways to disclose vulnerabilities to them. Some even pay for it. Due to laziness (not lack of ability) I have not taken advantage of this. However, you probably could make a decent living doing it.

Re:exploit sale = nondisclosure (3, Insightful)

thoth (7907) | about a year ago | (#43734401)

It's hard to argue that this is ethical behavior...

Sounds like the free market to me: buyers and sellers auctioning off products in a competitive environment. Perhaps corporations, with their billions in quarterly profits, can reinvest that money into buying exploits so they can fix them.

Re:exploit sale = nondisclosure (3, Interesting)

Anonymous Coward | about a year ago | (#43734491)

So long as people *CAN* patch / disable / protect the vulnerable software.

With the rise of things like locked/encrypted bootloaders, app stores, and the lack of updates without a new hardware purchase, I'd say that idea will soon be (if not already) restricted to a very small class of citizens.
(I.e., only those who would care about such things. The majority will just roll over and take it, as usual.)

That, and if the summary is to be believed, I would also imagine that the governments of the world will want to outlaw patching "their" exploits.

As far as disclosure goes, you're right that it's not ethical from a public-safety standpoint, but if you are selling exploits in the first place, you most likely don't have that as a goal, especially if you want some real money for it.

Re:exploit sale = nondisclosure (2)

ediron2 (246908) | about a year ago | (#43734677)

I'm not sure I agree with your second paragraph, bouldin.

By not disclosing X, you gain a competitive advantage Y.

I don't see how that is definitely unethical. For example, must antivirus vendors disclose the signatures they use to identify infections? By your sentence, that seems unethical. If a brick-and-mortar store finds methods to reduce loss, must it share the idea? Ditto automotive improvements? Hell, for that matter, if I come up with a whiz-bang idea for improving a car, can I sell it selectively to manufacturers? Yeah, I know there are patents, but what if I choose to retain the idea as a trade secret?

So, discovery of a flaw in someone else's system is work product. It's intellectual property. It's leverage for market advantage. Choosing whether to selectively release that information is not obviously even an ethical decision, let alone a clear, inarguably unethical act.

Re:exploit sale = nondisclosure (1)

Errol backfiring (1280012) | about a year ago | (#43749917)

By not disclosing X, you gain a competitive advantage Y.

So you can screw {h where h in World}. Like the poster said, that is quite unethical. I'll never understand how people can confuse "free market" with "free people", as they are usually opposites.

Re:exploit sale = nondisclosure (3, Insightful)

plover (150551) | about a year ago | (#43734703)

Here's the counterargument. Let's say you accidentally discover a vulnerability in a bank's web site by mistyping a URL and ending up at a different customer's account. You write up your finding, and you privately send it to the bank's security team and ask them for nothing in return other than that they act quickly to protect your account. And let's say they turn around and accuse you of hacking them under the Computer Fraud and Abuse Act, and they provide your own written report to the Secret Service as evidence against you? Who is the ethical party?

How would money alter the ethics? If you gave them the details of the flaw and asked the bank for a $1,000 reward, would that change things? What if you offered to tell the bank of the flaw in exchange for $1,000? If they don't pay, are you ethically bound to not sell the vulnerability to a third party?

What if you don't know of any specific flaw in your bank's site, but you would like to make some side money as a pen tester; so you send them a letter asking if they have a "pay for vulnerability policy", and they respond by placing a hold on your account and calling in the Secret Service? Who is acting ethically in that scenario?

What if you fear retribution so you ask this question anonymously? Are you more or less suspicious to the bank? Should they be more or less likely to seek your prosecution?

What if you exploit the vulnerability personally to view Paris Hilton's bank balance, but you don't do anything malicious to her account? What if you disclose that balance information to the tabloids? What about viewing the bank data of a non-celebrity?

And if not the bank, which third party might you sell it to? A security researcher? A competing bank? Microsoft? A hacker? Some random alias on darkode?

Different people are likely to view these behaviors differently, including banks, law enforcement, hackers, computer security professionals, lawmakers, bank customers, and the general public. Different legal cases with different judges are likely to interpret these differently, as well.

There are few clear cut lines standing out among these questions that say "here are the exact boundaries of ethical behavior."
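
For anyone who wants to see concretely what that mistyped-URL flaw looks like, here is a minimal Python sketch of the pattern (an insecure direct object reference). The account numbers, names, and balances are invented for illustration and are not from TFA or the bank scenario above:

    # Hypothetical sketch of the mistyped-URL scenario: the server trusts the
    # account number taken from the URL instead of checking who is logged in.
    # All names and numbers are made up for illustration.

    ACCOUNTS = {
        1001: {"owner": "alice", "balance": 2500.00},
        1002: {"owner": "bob",   "balance": 17.42},
    }

    def view_account_vulnerable(url_account_id, logged_in_user):
        # Vulnerable: returns whatever account the URL names, so mistyping
        # /account/1001 as /account/1002 shows Bob's data to Alice.
        return ACCOUNTS[url_account_id]

    def view_account_fixed(url_account_id, logged_in_user):
        # Fixed: verify ownership before returning anything.
        account = ACCOUNTS.get(url_account_id)
        if account is None or account["owner"] != logged_in_user:
            raise PermissionError("not your account")
        return account

    if __name__ == "__main__":
        print(view_account_vulnerable(1002, "alice"))  # leaks Bob's record
        try:
            view_account_fixed(1002, "alice")
        except PermissionError as err:
            print("blocked:", err)

The fix is nothing more than an ownership check before returning the record, which is part of what makes the hypothetical bank's reaction so disproportionate.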

Re:exploit sale = nondisclosure (0)

Anonymous Coward | about a year ago | (#43736455)

I was going to intelligently reply to your post, but, holy shit Batman, there's literally 20 questions in there!

Bugger that, too much time and effort.

Re:exploit sale = nondisclosure (1)

plover (150551) | about a year ago | (#43736727)

That was the point. There are hundreds of questions on various sides, and no one answer fits them all.

Re:exploit sale = nondisclosure (1)

saveferrousoxide (2566033) | about a year ago | (#43755207)

Let's say you accidentally discover a vulnerability in a bank's web site by mistyping a URL and ending up at a different customer's account. You write up your finding, and you privately send it to the bank's security team and ask them for nothing in return other than that they act quickly to protect your account. And let's say they turn around and accuse you of hacking them under the Computer Fraud and Abuse Act, and they provide your own written report to the Secret Service as evidence against you? Who is the ethical party?

How would money alter the ethics? If you gave them the details of the flaw and asked the bank for a $1,000 reward, would that change things? What if you offered to tell the bank of the flaw in exchange for $1,000? If they don't pay, are you ethically bound to not sell the vulnerability to a third party?

What if you don't know of any specific flaw in your bank's site, but you would like to make some side money as a pen tester; so you send them a letter asking if they have a "pay for vulnerability policy", and they respond by placing a hold on your account and calling in the Secret Service? Who is acting ethically in that scenario?

What if you fear retribution so you ask this question anonymously? Are you more or less suspicious to the bank? Should they be more or less likely to seek your prosecution?

What if you exploit the vulnerability personally to view Paris Hilton's bank balance, but you don't do anything malicious to her account? What if you disclose that balance information to the tabloids? What about viewing the bank data of a non-celebrity?

And if not the bank, which third party might you sell it to? A security researcher? A competing bank? Microsoft? A hacker? Some random alias on darkode?

Different people are likely to view these behaviors differently, including banks, law enforcement, hackers, computer security professionals, lawmakers, bank customers, and the general public. Different legal cases with different judges are likely to interpret these differently, as well.

I'm sorry, I don't see a single one of those as vague ethical quandaries. You seem to be confusing "ethics" with "what a bank/government would do in today's litigious, paranoid, and ignorant society."
To answer your questions in order though:

1) You were acting ethically; the bank was not.
2a) No, you're now simply asking for a tip for services already rendered.
2b) Dramatically, by introducing an artificial and selfishly motivated barrier to aiding those in need.
2c) You are ethically bound not to "sell" the solution to anyone, but rather to freely inform those who have the power to address the situation without affecting other innocent parties.
3) There was no ethical attribute to your action; the bank is being unethical.
4a) No.
4b) No, and they shouldn't be seeking prosecution in either case.
5) You are acting unethically in all three scenarios, especially the 2nd.
6) The information should not be sold at all. But if the bank is not interested, then a security firm, and the FTC, FCC, SEC, and FDIC, should be next on your list.

From a strictly ethical point of view, I don't think anyone without a damaged moral compass would view these scenarios differently than laid out above. Legally speaking, there might be slightly more of a gray area around whether the bank has the right to charge you with a crime, but selling the information will be illegal barring an agreement struck with the bank before you were in possession of the knowledge. Viewing another person's account information, famous or not, is always illegal. You're really not raising any hard questions.

A free-for-all market of "cyber weapons" (2)

GameboyRMH (1153867) | about a year ago | (#43733997)

About as good as any other weapons market willing to sell to whoever is the highest bidder...

Re:A free-for-all market of "cyber weapons" (1)

Synerg1y (2169962) | about a year ago | (#43734077)

with FAR less regulation.

Re:A free-for-all market of "cyber weapons" (2)

stewsters (1406737) | about a year ago | (#43734143)

This is something I am worried about. Politicians are treating security flaws as weapons in their cyber wars. Are we going to see things like the RSA encryption export ban, but for exploits? I can see such laws getting really bad really fast.
Will I be able to work with foreign coders to fix bugs, or do I need to report them all to the government? Will I be seen as a terrorist for submitting a pull request for a security feature?

Re:A free-for-all market of "cyber weapons" (1)

Synerg1y (2169962) | about a year ago | (#43734353)

Also, how do you enforce it without nuking everybody's privacy, which so far, rightfully, remains the bigger issue? It's not like the entities getting hacked are broke and new; for the most part they just don't invest properly in mitigation. Remember Sony's PlayStation Network getting hacked hard (if I remember correctly), and the PR that generated... Sony hasn't been hit by such a high-profile attack since, as far as I know (DDoS doesn't count).

Bugs will get fixed, the easy way or the hard way. (2)

Kazoo the Clown (644526) | about a year ago | (#43734043)

It's clear that reporting a vulnerability to someone in a position to actually fix it (such as the developer of the software) often doesn't work so well. We've seen severe negative effects as developers strive to cover up rather than address the vulnerability, attacking the messenger instead. What better way to escalate a bug and get it fixed than to sell it to the highest bidder and see it get exploited in the field by bad actors?

Re:Bugs will get fixed, the easy way or the hard w (1)

Synerg1y (2169962) | about a year ago | (#43734097)

Oh, there's better ways, they just haven't been implemented yet.

Like a national center for vulnerability disclosure, or computer break-in laws that actually make sense in the context of their subject.

Re:Bugs will get fixed, the easy way or the hard w (1)

techsoldaten (309296) | about a year ago | (#43734545)

National Disclosure Centers are only as good as the organizations that take their disclosures.

I worked pretty closely with the DOC CIRT when it was first formed. It did not matter how many CIOs were involved in the process of forming it, or what they agreed to do, or what channels of communications were established. There were always groups that would not / could not work to address issues when they happened.

I don't think passing more laws has much effect on the issue either. Laws are regulatory and fall very much into the camp of attorneys, who rarely understand their implications in terms of infrastructure. I have spent many days on the phone with people from the OIG seeking clarification on regulatory guidelines for handling systems, without getting the impression they understood much more than how to work the on/off switch.

This is a supply and demand problem, but a very special one. There is not enough demand for patches and security solutions prior to an incident, and there is not enough supply of secure code available to combat the threat. If anything, a solution lies with manufacturers, but there has to be a serious market for secure solutions for it to happen (and a willingness of buyers to invest in products that go down this route).

In other words, organizations need to stop buying Windows and start buying hardened Linux platforms. I honestly don't believe there is another way.

Re:Bugs will get fixed, the easy way or the hard w (2)

Synerg1y (2169962) | about a year ago | (#43734603)

Just out of curiosity, how would replacing Windows with Linux prevent a spear-phishing attack?

In the context of laws, I'm actually thinking of laws that would protect security researchers who are publishing these vulnerabilities.

I would also love to hear what you think "secure code" is. What if the vulnerability is in a lower OSI layer, as plenty often are?

Re:Bugs will get fixed, the easy way or the hard w (2)

DeathGrippe (2906227) | about a year ago | (#43734111)

Sowing chaos does not lead to more order, only more disorder.

Two wrongs don't make a "right."  Just because a vendor fails to adequately address a vulnerability does not make it ethical to exploit that vulnerability.

Re:Bugs will get fixed, the easy way or the hard w (0)

Anonymous Coward | about a year ago | (#43734719)

two lefts though can reverse a right

Re:Bugs will get fixed, the easy way or the hard w (2)

justcauseisjustthat (1150803) | about a year ago | (#43734115)

If the bug is reported to the developer and they do nothing, I don't feel bad for the developer and I can understand why the person who discovered it wants to get paid.
If I lived in a world that didn't require money, it would be different.

But either way, I do feel bad for the end users.

modern day defense contractor (3, Interesting)

anthony_greer (2623521) | about a year ago | (#43734743)

There is no difference between this and the practice of huge companies selling death machines (tanks, bomber jets, missiles and so on) to the militaries of the world, and the occasional non-state paramilitary planning a takeover or something. How is this any different? The security researchers work to create a product, i.e. a vulnerability, then sell that information, the product of their effort, to a willing customer.

It's a nasty business, and you can question the morals and ethics of it, but it really is no different from companies that sell guns and bombs to whatever crackpot thug has a truck full of cash or gold bars...

Re:modern day defense contractor (1)

Impy the Impiuos Imp (442658) | about a year ago | (#43735817)

Sales of weapons to nasty governments have at least, in theory, been nominally approved by accountable, elected officials, and are in accordance with some policies set by same.

Turning Tables on Developers (0)

Anonymous Coward | about a year ago | (#43739253)

So far, your incompetence has been our collective weakness; now your incompetence can mean profit for someone other than yourself! Sucks to be you.