
The Case For a Global, Compulsory Bug Bounty

timothy posted about 10 months ago | from the perverse-incentives dept.

Security 81

tsu doh nimh writes "Security experts have long opined that one way to make software more secure is to hold software makers liable for vulnerabilities in their products. This idea is often dismissed as unrealistic and one that would stifle innovation in an industry that has been a major driver of commercial growth and productivity over the years. But a new study released this week presents perhaps the clearest economic case yet for compelling companies to pay for information about security vulnerabilities in their products. Stefan Frei, director of research at NSS Labs, suggests compelling companies to purchase all available vulnerabilities at above black-market prices, arguing that even if vendors were required to pay $150,000 per bug, it would still come to less than two-tenths of one percent of these companies' annual revenue (PDF). To ensure that submitted bugs get addressed and not hijacked by regional interests, Frei also proposes building multi-tiered, multi-region vulnerability submission centers that would validate bugs and work with the vendor and researchers. The question is, would this result in a reduction in cybercrime overall, or would it simply hamper innovation? As one person quoted in the article points out, a majority of data breaches that cost companies tens of millions of dollars have far more to do with factors unrelated to software flaws, such as social engineering, weak and stolen credentials, and sloppy server configurations."
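Frei's affordability claim is easy to sanity-check with rough numbers. A minimal sketch, where the bug count and revenue figures are hypothetical illustrations and only the $150,000 per-bug price comes from the study summary:

```python
# Rough sanity check of the "less than 0.2% of revenue" claim.
# Bug count and revenue below are hypothetical; the $150,000 per-bug
# price is from the study summary.

PRICE_PER_BUG = 150_000

def revenue_share(bugs_per_year: int, annual_revenue: float) -> float:
    """Total annual bounty bill as a fraction of annual revenue."""
    return (bugs_per_year * PRICE_PER_BUG) / annual_revenue

# e.g. a large vendor buying 100 bugs/year on $75B of revenue
share = revenue_share(100, 75_000_000_000)
print(f"annual bounty bill: {share:.2%} of revenue")
```

At that scale the bill really is a rounding error; the dispute in the comments below is about everyone smaller.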


Good idea... (2)

jddeluxe (965655) | about 10 months ago | (#45713163)

Good luck getting many of the software corporations to sign up for this...

Re:Good idea... (2)

jythie (914043) | about 10 months ago | (#45713415)

Well, if we were going to be in favor of this, I could see a company's underwriters requiring such a system or perhaps offering it as an insurance package.

Not all security bugs are equal. (2)

jellomizer (103300) | about 10 months ago | (#45713549)

The real problem is the assumption that all security glitches are equally bad.
Sure, at hack-a-thons we see impressive "I can break into this computer in under 5 minutes" demos, but these are often in controlled environments where the attackers can pick and choose which services are running, and which assume that people hook their PCs up to the raw internet. (Granted, a lot of people do, and a bunch of businesses do too.)

Now, if there is a flaw in world-facing features such as a web browser or SSH client, yes, that is serious. But a flaw that only allows local exploitation is much different.

Trying to make a market for these means companies will have to pay for and fix things in the wrong order of priority.

Re:Good idea... (1)

foobar bazbot (3352433) | about 10 months ago | (#45713877)

Good luck getting many of the software corporations to sign up for this...

You know what "compulsory" means? It means you get to jail/fine any software companies who don't sign up for it, so I don't think much luck will be needed.

Re:Good idea... (2)

smooth wombat (796938) | about 10 months ago | (#45714417)

It means you get to jail/fine any software companies who don't sign up for it,

And good luck getting a company to pay a fine. Or is this like the UACA where the government will reach into your bank account if you don't voluntarily hand over your money to private companies?

If you're trying to stifle companies and drive them out of business, or make them go elsewhere, this is a good way to do it.

But I guess living in your nanny state, that's the only way to get companies to produce better code.

Re:Good idea... (1)

foobar bazbot (3352433) | about 10 months ago | (#45717917)

And good luck getting a company to pay a fine.

Are you serious? Companies pay fines all the time -- even big companies. Being a big company can mean you get to buy laws and control fines (ideally, set them so they're effectively a wrist-slap for you, but a body-slam to some upstart competitor), but once a court decides against you (and you've exhausted appeals, if applicable), you pay the fine.

If you're trying to stifle companies and drive them out of business, or make them go elsewhere, this is a good way to do it.

Well... yeah.

But I guess living in your nanny state, that's the only way to get companies to produce better code.

"My" nanny state? Are you so deep in an us-or-them mind-state that you're unable to consider that someone who does not support it could possibly criticise someone arguing against it? When I criticised jddeluxe for seeming to completely ignore the "compulsory" bit, I said exactly what I meant. I did not say I supported any such compulsory-bounty scheme, for the simple reason that I don't support it.

I think it's a horrible idea, but "good luck getting companies to sign up" is not only not the most horrible thing about it, it's not even an issue -- if such a compulsory-bounty measure is ever adopted, it will be because interests with lots of money/power (read: big software houses that can afford the bounties, and/or those with an interest in moving most of the software industry overseas) want it badly enough to throw their weight behind it. And if they do, you can bet they'll also throw their weight behind setting fines for non-compliance high enough to make it effective.

Re:Good idea... (4, Insightful)

mlts (1038732) | about 10 months ago | (#45715693)

What will happen is that companies will spawn off sub-contractors which do all the coding and are completely offshore entities.

For example, foocorp spawns off ABC Coders. ABC Coders just does business in one country, selling and maintaining its codebase to foocorp. Foocorp is just a customer, so if a government demands a bug bounty, they would have to go upstream to ABC Coders, and since ABC Coders does not do international business, they can give other nations the middle finger when it comes to their regulations.

Re:Good idea... (1)

BarefootClown (267581) | about 10 months ago | (#45716307)

What will happen is that companies will spawn off sub-contractors which do all the coding and are completely offshore entities.

For example, foocorp spawns off ABC Coders. ABC Coders just does business in one country, selling and maintaining its codebase to foocorp. Foocorp is just a customer, so if a government demands a bug bounty, they would have to go upstream to ABC Coders, and since ABC Coders does not do international business, they can give other nations the middle finger when it comes to their regulations.

If ABC is offshore, and sells to foocorp, then isn't that "international business" kind of by definition?

Re:Good idea... (1)

nmr_andrew (1997772) | about 10 months ago | (#45716693)

What will happen is that companies will spawn off sub-contractors which do all the coding and are completely offshore entities.

No, what will happen is that $BIG_COMPANY will bribe^Wlobby $GOVERNMENT to make sure that no such compulsory program ever exists.

Re:Good idea... (0)

Anonymous Coward | about 10 months ago | (#45720189)

No no no.

The government puts its demands on whoever sells software in the country. A pure sales organization will have to take money out of their sales then. And if they want to stop doing that, they put pressure/money on their offshore software provider to get a fix in a short time.

Re:Good idea... (3, Insightful)

ultranova (717540) | about 10 months ago | (#45716967)

You know what "compulsory" means? It means you get to jail/fine any software companies who don't sign up for it, so I don't think much luck will be needed.

So in other words, this is about killing off independent developers. Only companies who can afford $150,000 per bug will be able to distribute programs. Free software will, of course, die overnight.

So... Apple or Microsoft?

Re:Good idea... (0)

Anonymous Coward | about 10 months ago | (#45720113)

Good luck getting many of the software corporations to sign up for this...

You don't have them sign up. You change the law and force responsibility on them. Similar to how a manufacturer cannot sell cheap cars without brakes and leave the problems to the buyers.

Also, such responsibility is not hard. Dan Bernstein has a money offer for bugs in his free software. This is even easier with commercial software, where there is an actual budget. Maybe microsoft and other sloppy programmers would be hit hard for some time - others would fare better. (A hint: computers do not crash "now and then". They can run for months and years easily. Windows crashes now and then though.)

Silly (4, Insightful)

Nerdfest (867930) | about 10 months ago | (#45713177)

This is silly. All it would do is force black-market prices up and push smaller companies out of business. It would probably also raise insurance rates for software companies and the cost of software in general. Of course, it would also probably push up the rates for competent software developers.

Re:Silly (1)

weilawei (897823) | about 10 months ago | (#45714403)

It might push the rates up, but that extra money would likely go to insurance. For reference, see what an anesthesiologist makes vs. how much they spend on insurance. (In 2009, this was $21,480, according to the AQI [slashdot.org] . Sadly, they've pulled the 2009 version and the 2013 version is paywalled. But you get the general idea.)

Re:Silly (1)

Kookus (653170) | about 10 months ago | (#45714415)

...Of course, it would also probably push up the rates for competent software developers.

I think you just made a case for proceeding with the article's proposal. At least, you just sold me on that idea!

Re:Silly (1)

swillden (191260) | about 10 months ago | (#45714437)

This is silly. All it would do is force black-market prices up and push smaller companies out of business. It would probably also raise insurance rates for software companies and the cost of software in general. Of course, it would also probably push up the rates for competent software developers.

I disagree, in part.

I do agree that it would increase insurance rates for software companies and increase the cost of software. But I don't think that's a bad thing. We have a serious problem today with the amount of shoddy software being pushed out and placed in critical positions where defects can result in huge losses. Software that is an attractive target for attack should cost more, because the maker should invest more into it, in the form of the appropriate security due diligence.

Re:Silly (1)

Nerdfest (867930) | about 10 months ago | (#45714503)

It would be nice to see software development treated like other skilled professions (engineering, medicine, etc.), as long as the pay increases with the responsibility.

Re:Silly (1)

sjames (1099) | about 10 months ago | (#45738729)

Agreed. It's worth noting that practically everything decent on the net started out too small to absorb even one such bug bounty.

Would the first www browser have even made it into the wild if it carried that liability? I doubt it. Even if it did, Apache probably wouldn't have gotten far enough to form a foundation around it.

Next up, who pays when the bug is at the protocol level (such as the pizza thief vulnerability in FTP)? The IETF? Surely we can't fairly charge a company that faithfully implemented the protocol.

What if the software isn't produced by a corporation? Surely a use at your own risk pile of code on github shouldn't be subject to this?


Just a bad idea (1)

Akratist (1080775) | about 10 months ago | (#45713197)

The problem with this sort of program is the same problem that no amount of vulnerability fixing will ever address -- the human factor. Just as social engineering is probably the biggest weakness with most systems, something like this is going to be gamed by people who figure out how to profit from a program that companies are forced to participate in.

That's absurd (3, Insightful)

DarkFencer (260473) | about 10 months ago | (#45713221)

That is an absurd argument. Yes some companies can and should offer bug bounties but if the only method you can rely on is out bidding the black market, then you've already lost.

Not to mention, there are a lot of small companies, small foundations, and open source projects which could never afford such prices.

Re:That's absurd (2, Insightful)

Obfuscant (592200) | about 10 months ago | (#45716645)

Not to mention, there are a lot of small companies, small foundations, and open source projects which could never afford such prices.

Who pays when a bug is found in the Linux kernel?

Re:That's absurd (1)

vilanye (1906708) | about 10 months ago | (#45717567)

The Linux kernel project shouldn't be used when speaking in general terms about open source projects.

The fact that there are many large companies investing in the Linux kernel project makes them different.

Re:That's absurd (0)

Anonymous Coward | about 10 months ago | (#45720765)

Who pays when a bug is found in the Linux kernel?

Whoever charged money for that Linux installation. If nobody did, then nobody pays (i.e. a hobbyist/DIY installation). Always go after the money! But businesses often buy their Linux rigs from RedHat or IBM; in those cases, those vendors pay.

Re:That's absurd (1)

IamTheRealMike (537420) | about 10 months ago | (#45753019)

This is especially true given that the insane climb in zero-day prices in recent years has largely been driven by governments starting to buy them up as weapons. You cannot outbid entities that are able to both tax and print money, it's simply impossible. All that would do is result in the NSA spending more on zero days to ensure they still win, and bankrupt a lot of useful software companies.

Kill all startups (3, Insightful)

mwvdlee (775178) | about 10 months ago | (#45713259)

I work for a startup. Not one of those few heavily-funded startups, but a regular startup with barely enough funding to scrape by in the first few years. Like most startups.

$150,000 is ever so slightly more than two-tenths of one percent of my startup's annual revenue.

Asking an average startup to pay $150,000 for a security bug is like asking security researchers to work for $0.10 an hour.
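The mismatch the parent describes is easy to see with a quick back-of-the-envelope comparison. A minimal sketch, where both revenue figures are hypothetical:

```python
# Back-of-the-envelope: what fraction of annual revenue a single
# $150,000 bounty represents, for a big vendor vs. a typical startup.
# Both revenue figures below are hypothetical illustrations.

BOUNTY = 150_000

def bounty_share(annual_revenue: float) -> float:
    """Return one bounty as a fraction of annual revenue."""
    return BOUNTY / annual_revenue

big_vendor = bounty_share(75_000_000_000)  # a Microsoft-scale vendor
startup = bounty_share(500_000)            # a barely-funded startup

print(f"big vendor: {big_vendor:.6%} of revenue per bug")
print(f"startup:    {startup:.0%} of revenue per bug")
```

One bug at the startup costs five orders of magnitude more of its revenue than the same bug costs the big vendor, which is the whole objection in this subthread.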

Re:Kill all startups (2, Insightful)

Anonymous Coward | about 10 months ago | (#45713351)

$150,000 is double the annual revenue of many smaller non-American companies. (Think smaller companies with only two or three programmers, where a lot of the more useful/interesting software of the world comes from.)
Forcing something like this would be a disaster.

Re:Kill all startups (0)

Anonymous Coward | about 10 months ago | (#45713391)

Subbie must work for a multi-national, or a government entity ("a million here, a million there - pretty soon you're talking real money!"). Implementing this idea would be the corporate equivalent of burning the understory growth in a forest - only the tallest trees would survive.

Re:Kill all startups (1)

jythie (914043) | about 10 months ago | (#45713447)

Yeah, the selection of the amount really seems kind of biased. Looking at the author's chart, he is really focusing only on the biggest of the big vendors.

Though to be honest, I actually could see such a system benefiting everyone if it was forced on the big companies. Their software tends to be so widespread that bugs in their stuff don't just impact their direct customers but have a watershed effect on the whole industry. So I could kind of see an "if you are so big your screw-ups cause everyone problems, then you need to do XYZ or we lynch you" type of thing.

Re:Kill all startups (1)

dgatwood (11270) | about 10 months ago | (#45713625)

If we could just make it mandatory for browser plug-in vendors (Adobe, Microsoft, I'm looking at you two), it would go a long way towards improving security.

Re:Kill all startups (2)

swillden (191260) | about 10 months ago | (#45713787)

I don't think this would be so problematic for startups. They'd just end up buying insurance, the same way they insure a lot of other things. And the insurance companies would not only spread the risk, but they'd also actively require companies to mitigate the risk, by doing the right kinds of security reviews. Further, they'd almost certainly end up pricing the premiums differently based on the degree of risk posed by the software. If a startup is building a product that, if exploited, could lead to billions of dollars in damages, then the premiums will be higher and the security practices required to bring them down will be much stiffer. On the other hand, if your startup is building the latest twitterbot, the risks are pretty low.

The bigger impact to startups would be in agility, not financial, I think. Particularly for startups building software that could be used to compromise high-value targets (or large numbers of low-value targets). But I don't think that's actually a bad thing. Some areas should innovate more slowly and cautiously, because they're risky.

I think the bigger problem is how to combine white and black markets in a reliable and trustworthy way.

Re:Kill all startups (2)

Wycliffe (116160) | about 10 months ago | (#45714065)

The only way the insurance would be reasonable would be if the bug bounty was not a fixed price. I.e., if I have 1,000 customers' credit card numbers, then the bug wouldn't be worth nearly as much as if I had 100,000 customers. But how do you do that with open-source software, or does the company running it hold the responsibility? Also, if we are basing it on the "street value" of the bug, then it still becomes insane. If I find a bug that could cost Microsoft $10M and the street value is 50 cents on the dollar, then Microsoft has to give me $5 million for finding it? That's probably worse than just waiting and letting it happen, which is never going to be a 100% certainty and has at least some chance of recovering or mitigating the loss.

Re:Kill all startups (2)

swillden (191260) | about 10 months ago | (#45714357)

The only way the insurance would be reasonable would be if the bug bounty was not a fixed price.

Yes, that's the idea. Bug bounties would be set by the value of the vulnerabilities on the black market, so the prices would vary depending on the nature of the bug and the target. I'm doubtful that such a market would work, but if you assume that part of it does, then insuring against it would work well.

That's probably worse than just waiting and letting it happen which is never going to be 100% and has at least some chance of recovering or mitigating the loss.

Yes, that's the nature of insurance. If the actuaries do their jobs right, insurance is always, in aggregate and in the long run, a losing proposition. If you can afford the potential hit, you should not buy insurance. But insurance makes a lot of sense in cases where the probability of catastrophic loss is relatively low but the impact is, well, catastrophic.
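The "losing proposition" point can be sketched with a toy expected-value calculation. All numbers here are hypothetical, chosen only to illustrate the shape of the trade-off:

```python
# Toy expected-value sketch of why insurance "loses" in aggregate but
# can still be rational. All numbers are hypothetical.

p_loss = 0.01        # annual probability of a catastrophic bounty bill
loss = 5_000_000     # size of that bill if it happens
load = 0.30          # insurer's loading (overhead + profit margin)

expected_loss = p_loss * loss              # average annual exposure
premium = expected_loss * (1 + load)       # what the insurer charges

# In aggregate the buyer pays more than they expect to lose...
assert premium > expected_loss
# ...but the premium is far below the catastrophic hit they can't absorb.
assert premium < loss

print(f"expected annual loss: ${expected_loss:,.0f}, premium: ${premium:,.0f}")
```

The loading factor is exactly the actuaries "doing their jobs right": the gap between premium and expected loss is what the buyer pays to convert a rare ruinous hit into a predictable expense.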

Re:Kill all startups (1)

Wycliffe (116160) | about 10 months ago | (#45725217)

Yes, that's the nature of insurance. If the actuaries do their jobs right, insurance is always, in aggregate and in the long run, a losing proposition. If you can afford the potential hit, you should not buy insurance. But insurance makes a lot of sense in cases where the probability of catastrophic loss is relatively low but the impact is, well, catastrophic.

But we're really talking about two different things:
1) Insurance is actuaries calculating the probability of a loss payout, requiring you to fix known problems to lessen the risk, and then just sitting back and waiting for a loss.
2) A bug bounty requires you to pay a percentage of the loss even if there has never been an actual loss.

The first presumably already exists in one form or another. I'm not sure the second is workable in the real world. The second would be like making an insurance company pay out for a potential theft every time you forget to lock your front door, even if no theft occurred, just because your neighbor noticed it and asked for a reward.

Re:Kill all startups (1)

swillden (191260) | about 10 months ago | (#45726215)

I think you're drawing an artificial distinction. Given a regulatory requirement to pay a bug bounty, there would be an actual loss to be covered.

Re:Kill all startups (1)

Wycliffe (116160) | about 10 months ago | (#45728739)

I think you're drawing an artificial distinction. Given a regulatory requirement to pay a bug bounty, there would be an actual loss to be covered.

Ok, to continue with my previous example: it would be like the government stepping in and telling everyone that if you find your neighbor's door unlocked, you can report it and get a check from that neighbor worth half of their stuff. This would obviously cause your neighbor to want to always lock their door, and also probably to buy insurance to protect themselves from accidentally leaving it unlocked. But doesn't this seem a little drastic and prone to abuse? Doesn't your neighbor already have an incentive to lock their door? What if they have other protections in place, like a dog at home or cameras?

Re:Kill all startups (1)

swillden (191260) | about 10 months ago | (#45729181)

I don't think that analogy is useful. If you leave your door open, you're the one that stands to lose, but if vulnerabilities exist the software company (generally) isn't the loser, which is why it makes sense to impose some method of bringing the societal costs to bear on the company. In economic terms, vulnerability costs are largely a negative externality, while security costs are internalized. That's a recipe for incenting people to ignore security, and the general solution is to internalize the externality.

I think a better analogy would be leaving something dangerous to others unsecured. Say, explosives. If you have a license to handle explosives and you don't follow the rules for securing them appropriately, you will get fined (if you're caught). The other twist with software vulnerabilities is that the risk associated with a given bug is much harder to pin down, whereas it's pretty easy to quantify for a given type and quantity of explosives. This proposal attempts to use market forces to quantify the risk and determine the dollar amount of the "fine". It further tries to use the fine to actually motivate and therefore fund the security research. In the case of explosives, the government pays people to audit licensees and the value of the fines goes to the government. I suspect if we looked a bit we could find some situations like this proposal, where the government essentially outsources the auditing operation to a third party who is compensated by collecting the fines.

Re:Kill all startups (1)

Stormy Dragon (800799) | about 10 months ago | (#45714071)

How is the price of this insurance going to be determined for a company that just came into existence? There's no track record that can be used to establish the relative risk for producing bugs.

Re:Kill all startups (2)

swillden (191260) | about 10 months ago | (#45714319)

How is the price of this insurance going to be determined for a company that just came into existence? There's no track record that can be used to establish the relative risk for producing bugs.

The nature of the software should provide a good basis for estimating potential damage (e.g. avionics control system vs twitterbot), and the tools and development processes used should provide a good basis for estimating risk of vulnerabilities. Indeed, much as I hate to admit it, the software industry could probably benefit from the level of rigor that insurance actuaries would apply, to both damage estimation and development methodology evaluation.

Re:Kill all startups (2)

Splab (574204) | about 10 months ago | (#45715015)

Yeah, here's an idea: create a company, get insurance, create bug-riddled code, get someone else to turn the bugs in, and profit...

This makes about as much sense as having firefighters paid on accord.

Re:Kill all startups (1)

swillden (191260) | about 10 months ago | (#45717979)

Yeah, heres an idea, create a company, get insurance, create bug riddled code, get someone else to turn them in and profit...

Which is no different from "get an old building, buy fire insurance, have someone set it on fire and profit". Insurance fraud scams exist with every type of insurance in existence, and it doesn't prevent insurance from working.

Re:Kill all startups (1)

AmiMoJo (196126) | about 10 months ago | (#45724411)

Firefighters kinda are paid on accord. If there are few fires then budgets are slashed and people laid off.

---DO YOU WANT TO KNOW MORE?--- (1)

Sigmon (323109) | about 10 months ago | (#45713263)

Srsly?... Global bug bounty? IMDB [imdb.com]

Re:---DO YOU WANT TO KNOW MORE?--- (1)

fibonacci8 (260615) | about 10 months ago | (#45713773)

or even IMDB [imdb.com]

Clickbait (2)

SteveFoerster (136027) | about 10 months ago | (#45713305)

This idea is so ridiculous, I can't imagine it's not simply clickbait. And thanks to Slashdot editors, it worked.

Re:Clickbait (2)

Akratist (1080775) | about 10 months ago | (#45713327)

This idea is so ridiculous, I can't imagine it's not simply clickbait. And thanks to Slashdot editors, it worked.

Sadly, bad ideas have a way of becoming policy and law, especially when special interests and lobbyists get involved.

Re:Clickbait (1)

swillden (191260) | about 10 months ago | (#45714377)

This idea is so ridiculous, I can't imagine it's not simply clickbait. And thanks to Slashdot editors, it worked.

I don't think it's at all ridiculous. I don't see how to make it practical, but it's definitely not ridiculous.

Great way to kill off small players (1)

Anonymous Coward | about 10 months ago | (#45713321)

This sounds like an excellent way to completely kill off all small companies; only the big players like Microsoft and Oracle will be left, and prices will skyrocket.
People should really think through what they are asking for. Or maybe they have thought it through, i.e., is it actually the Microsofts of the world pushing for this?

Re:Great way to kill off small players (2)

fche (36607) | about 10 months ago | (#45713339)

Regulations often benefit the entrenched regulated against the newcomer competitor.

Go fucking fuck your fucking self, fucking fuckup (2)

Anonymous Coward | about 10 months ago | (#45713397)

This guy wants to force all companies to buy something this guy's company would indubitably directly financially benefit from.

From their website:

"Our unique team of world-class security analysts have led the IT research and testing communities in providing the right information IT decision-makers need to be secure. Let us help your business make better, informed security decisions."

Way to create a market for yourself ! You go ! If you can't drum up business through providing value, head to Congress and force people to give you money. It's the American way.

Everything old is new again! (2, Interesting)

Anonymous Coward | about 10 months ago | (#45713529)

I recall an old story I heard in my early days of programming. A company offered a monthly bonus to its testers for each bug found in its code. Guess what happened? The testers made deals with the programmers for a cut of the action, so the programmers created bugs and let the testers know where/what they were. Now, I guess we just have to scale this out a bit more and voilà... here is the story on Slashdot! THANKS!

Targeted at larger companies... (2)

Stolpskott (2422670) | about 10 months ago | (#45713597)

...that kind of scale could work.
For a bounty of $150,000 to be "less than two-tenths of 1% of those companies' annual revenue" (I am assuming that is each company's annual revenue calculation, not a global pool), that suggests the model is aimed at companies with >$75M annual revenue.
Newsflash for the paper authors... there are not many software development companies in that ballpark. Granted, the smaller the company, (probably) the smaller the market for their software so the smaller the need for such a bug bounty.
But if companies are going to be "compelled" to buy bug reports, that is going to require federal legislation, which is not good at such fine-tuned work, especially after 150 groups of lobbyists have crafted their specific amendments to it. At that point, companies will shift development efforts offshore, the federal legislation will be retargeted at company head-office location or at companies whose software is used within the country, and a legal dance to get around the legislation will begin, assuming software dev houses do not simply say their software cannot legally be used within the USA.
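The revenue threshold implied by the paper's framing can be checked directly. The 0.2% figure and the $150,000 bounty are from the summary; the rest is arithmetic:

```python
# Sanity-check the revenue floor implied by "a $150,000 bounty is less
# than two-tenths of one percent of annual revenue".

bounty = 150_000
max_share = 0.002  # two-tenths of one percent

# Revenue at which one bounty is exactly 0.2% of revenue:
threshold_revenue = bounty / max_share
print(f"implied revenue floor: ${threshold_revenue:,.0f}")  # $75,000,000
```

Any company below that $75M line pays more than 0.2% of revenue per bug, which is the parent's point about how few software shops actually clear the bar.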

not bounties... (2)

Fubari (196373) | about 10 months ago | (#45713605)

Mandatory bounties are the wrong way to go; it reminds me of this: http://dilbert.com/strips/comic/1995-11-13/ [dilbert.com] . An approach like TFA advocates would cause an underground economy in bug fixes to spring up, and wouldn't solve the real zero-day problem. Instead...

Allowing users to recover damages seems more suitable; a "zero day" class action suit or two would result in tremendous advances in best practices for security and qa (aspects of software development that, for some odd reason, just don't seem to get much funding today). By 'allowing' I mean changing software licensing so that verbiage like '...AS-IS WITHOUT RECOURSE TO RECOVER ANY LOSSES OR DAMAGES, DIRECT OR INDIRECT...' no longer holds.

Which is a pretty huge change, and a number of interests would lobby against that. So I expect it will take a pretty severe incident (e.g. loss of life, or maybe a loss of significant money) to shock existing legislation and treaties (it would have to be global; hello WTO) sufficiently to encourage change. By "significant" I mean larger than the multi-billion dollar loss 'estimates of global damage from cybercrime' cited in TFA. That "cost" isn't nearly enough to change behavior, especially when you average it out across the world population.

Re:not bounties... (1)

Anonymous Coward | about 10 months ago | (#45714923)

By 'allowing' I mean changing software licensing so that verbiage like '...AS-IS WITHOUT RECOURSE TO RECOVER ANY LOSSES OR DAMAGES, DIRECT OR INDIRECT...' no longer holds.

It never should have. How is a customer supposed to determine whether a particular piece of software is "fit for purpose" if he's unable to examine the source? How can the buyers beware if they're not allowed to examine the merchandise?

On the flip side, who pays the bounty on open source software? Well, there should be no need, anyone is free to examine the source for themselves.

(Not that everyone has the requisite skill to examine the merchandise, of course, but that's irrelevant if access is denied.)

How about a certification? (1)

ebno-10db (1459097) | about 10 months ago | (#45713617)

For a large variety of reasons that have already been explained here, making this mandatory is an idiotic idea. What about making it part of a rating or validation though? Such things are generally voluntary except for safety critical applications.

Compelling consumption? (1)

Gothmolly (148874) | about 10 months ago | (#45713621)

Yeah, that always works well. What is this, socialized medicine?

Centralized submission centers (0)

Anonymous Coward | about 10 months ago | (#45713629)

Frei also proposes building multi-tiered, multi-region vulnerability submission centers that would validate bugs and work with the vendor and researchers

That's a great idea. These 'submission centers' could work directly with vendors to ensure that bugs are fixed in a responsible way so the public isn't harmed. We could call it something like the Total Security Audit or the Time Sensitive Action program or simply TSA for short. I feel safer already.

Could this work in practice? (1)

swillden (191260) | about 10 months ago | (#45713657)

Apart from one big practical issue, this idea seems fundamentally sound from an economic perspective. Presumably the black market values vulnerabilities according to their exploitation potential, which should be related to the value of the software. Currently that may not always be the case, but it should be, even in cases of cyber warfare where the attacker's interest is in doing damage rather than stealing money.

Consider, for example, a control system that is used to manage a large electrical power grid. Right now, economics will price that software based on the cost of production, plus sales expenses, transaction costs and a profit margin. If the company buying the software to run its grid (or its regulators) is very conscientious, it may recognize that vulnerabilities could wreak havoc and require some additional security auditing, etc., in which case the necessary security effort would get factored into the price. However, that also may not happen -- especially if the software is some apparently minor, peripheral piece whose ability to destroy the grid isn't obvious.

But if the maker of the software is responsible for purchasing vulnerabilities, and the value of the vulnerabilities becomes clear to, say, Chinese government hackers looking for ways to attack the power grid, then the security due diligence is likely to be factored in up front. I imagine what will really happen is that software companies will buy insurance against potential vulnerability costs, and insurance companies will quickly become savvy analysts of security risk potentials and secure development process evaluation. Security code reviews may become the equivalent of installing fire suppression systems and building with flame-retardant materials, something everyone does to keep their premiums down.

However, I see one big problem with it: The black market is, by definition, black. Can you get reliable vulnerability valuations out of it? It seems to me that if I have a potentially-serious vulnerability to sell, the first thing I'm going to do is to get some buddies to help bid up the price, with an agreement that we'll split the take. They can bid as much as they like because we all know the company will be required to buy the vulnerability for a slightly higher price. For that matter I can simply claim I have a bidder willing to pay $X, for any X I choose. Given that real black market bidders are going to be very hard to identify, how can anyone say I'm wrong? And if the software company claims I'm lying and refuses to buy, what if they're wrong? And what's to stop them from claiming that all of the other bids are fabrications?

Making this work seems to require an auditable, high-trust marketplace that traffics in illicit goods and has a lot of criminal participants, who are somehow comfortable participating. That seems... rather difficult to achieve. Not impossible, perhaps, but definitely very difficult.

Let's litigate the little guy away. (1)

VortexCortex (1117377) | about 10 months ago | (#45713837)

As an independent developer who is very security aware -- unit tests + input fuzzing, zero memory access/free errors for release candidates, complete code coverage -- there are still bugs that can sneak in, especially when statically linking against libraries. I remember being bitten by libpng -- code I did not write myself and could not hold to as high a standard. Do you charge every dev using libpng? Do I charge the libpng devs? Does everyone charge libpng? How am I supposed to know whose fault it is if you don't let me see the bug first? Oh, would you look at that, my next patch will remove the exploit vector anyway, sorry, I don't have to pay your bounty. Do I just go out of business because I can't actually afford to pay black-market prices for a bug targeting a library, simply because it's been customized to work against my product? You have the source code, you fucking fix it yourself. I'm not paying for a service I never asked for, just like you don't have to pay for my support service for the codebase.

Another name for a bug is a programming mistake. I'm making ends meet, so that's the level of effort you get: what you pay for. Humans make mistakes, and errors will happen, since you will not pay what it takes for me to write 100% mathematically verifiable, secure code. I've done so in the past for a few drivers written in ASM back in the day -- all possible inputs validated as producing the correct machine state (computers have finite state) -- and the price of my work reflected the extra development time and energy. You do not value security, so I cannot spend the time to secure the code, because you will buy a cheaper and less secure service. Compulsory bug bounty? Get ready for a price hike; meanwhile, wherever the law doesn't apply will become the new software capital of the world.

Factoring in bug bounty to my expenses means I can't take the risk to release code, might as well close up shop. Look, I hate EULAs as much as the next guy, but I have to have one: You see that indemnity clause? The one that I have to include because even if my code is perfect, your hardware and other software may not be and I can't trust you, a judge or jurors to tell the difference? Yeah, that's what I'll use if there's a mandated compulsory bug bounty. You'll click right through the waiver that says you won't hold me liable for YOUR USE of my software, like you always do -- If you can't take on the responsibility and risk to operate the software, then you don't have permission to use my software. So, read the fine print and it'll say that I'll be billing you the cost of any bug you bill me for, plus my legal expenses. And if you try to sue me over it, well, in America the court will want you to prove damages -- which you can't, because it's YOUR USE of the software that causes risk, not my publishing of it. You don't have to use my code. Even if you manage to not agree to my license and discovered a bug, if you found the bug you can avoid the bug... no damage. Users could just sue crackers for exploiting them -- that'll work so well, eh?

Thank you for downloading from Bug Bounty Isolation Software Inc. -- the corporate shell you'll be trying to charge for software bug bounties, which will file for bankruptcy immediately, whereupon Bounty Free Software Inc. will assume the role of distributor. (Just like with patent infringement suits.) Rest assured, this will be the 6th time I have rebuilt the BusinessMatrixAdapterFactorySingleton, and I have become exceedingly efficient at instantiating it.

I've got a better idea. Why don't you get everyone to care enough about security first, and run a Kickstarter to get them to fund your bug research efforts? While you're at it, solve the halting problem for me too; then a mandatory bug bounty will make sense, because a bug could provably be the result of malice.

Re:Let's litigate the little guy away. (0)

Anonymous Coward | about 10 months ago | (#45715317)

"Waaaaah, I want to be allowed to endanger people and not be held responsible."

I make homemade pharmaceuticals. Sure, if I screw up people might get sick or die, but look, it's YOUR USE of the pills that causes risk -- not my publishing them!

In order for this to work we need 2 things (2)

wiredog (43288) | about 10 months ago | (#45713895)

A ban on "free" or "open sourced" software that doesn't have a corporation behind it. And a legal requirement that software only be produced by licensed and bonded "software engineers".

Re:In order for this to work we need 2 things (1)

Stormy Dragon (800799) | about 10 months ago | (#45714017)

Exactly what I was thinking. Say goodbye to the hobby coder if something like this passes. Are you willing to risk hundreds of thousands in liability just to tinker around on your computer?

Re:In order for this to work we need 2 things (1)

XnavxeMiyyep (782119) | about 10 months ago | (#45715407)

If this happened in the US, I would relocate to another country, which I'd rather not do.

Re:In order for this to work we need 2 things (1)

Obfuscant (592200) | about 10 months ago | (#45716739)

Headline: The Case For a Global, Compulsory Bug Bounty

If this happened in the US, I would relocate to another country, which I'd rather not do.

You'd have to move to another planet. "Global" kinda makes country boundaries irrelevant. Perhaps you could trade a few choice vulnerabilities you've found to the Chinese for a ride on one of their moon probes?

Re:In order for this to work we need 2 things (1)

XnavxeMiyyep (782119) | about 10 months ago | (#45726793)

Not that I think this will happen at all, but if it did, I'd bet on some countries ignoring it and a tech boom occurring there.

Nonsense (3, Insightful)

vinsci (537958) | about 10 months ago | (#45714009)

That suggestion makes no sense at all, considering that governments are paying to insert security bugs, either by ordering companies to do so or by infiltrating development teams.

Re:Nonsense (1)

CodeBuster (516420) | about 10 months ago | (#45723651)

This Stefan Frei guy is just another dishonest shill, saying something colossally stupid in public to draw attention to himself and the products his company is selling. Forcing anyone to buy something, or to pay a fine, without freedom of contract or due process in a court of law is so antithetical to the very basis of Western civilization that it ought to be summarily dismissed from further debate or discussion, with prejudice.

Compulsory? Bah (1)

A nonymous Coward (7548) | about 10 months ago | (#45714205)

Anytime coercion enters the picture, along comes its sibling, corruption, in every sense of the word.

If your scheme is not popular enough to stand on its own two legs -- if your arguments are not enough to win the day -- propping it up with compulsion is the only recourse left, and it reaps what that's worth.

Yeah (1)

The Cat (19816) | about 10 months ago | (#45714553)

Make it impossible to start a software business. Makes perfect sense!

Wait, there are no software businesses in America any more. Never mind.

All for the NSA $$ (0)

Anonymous Coward | about 10 months ago | (#45714727)

Instead of having to pay black-market prices, the NSA could simply force companies to hand over bugs. This would save money and be silent!

I call BS (1)

PortHaven (242123) | about 10 months ago | (#45714729)

The NSA claims to have foiled a cataclysmic cyber threat (likely from China) that would have exploited a BIOS attack.

First off, there are a number of BIOS manufacturers, and not all will have the same bug. Second, there are numerous bugs still existent. And even when they are known, it is extremely hard to get manufacturers to fix them.

This sounds like the NSA found someone in China using an exploit in a BIOS to hack computers, then alerted the manufacturer, which was probably already aware of the fact after numerous Linux users had reported it years ago.

http://www.businessinsider.com/nsa-says-foiled-china-cyber-plot-2013-12 [businessinsider.com]

Re:I call BS (1)

slew (2918) | about 10 months ago | (#45716801)

First off, there are a number of BIOS manufacturers...

Maybe in number, but not in market share, where there are basically two: AMI and Phoenix/Award. The market share of all the others is a rounding error.

Second, there are numerous bugs still existent.

True, but see point #1

This sounds like the NSA found someone in China using an exploit in a BIOS to hack computers. Alerted the manufacturer who was probably already aware of the fact after numerous Linux users had reported it years ago.

Probably, but that's not a consequence of your first two points.

If you want to be actually helpful.... (0)

Anonymous Coward | about 10 months ago | (#45714773)

...what you SHOULD be lobbying for is reporting and transparency on bug reports made to companies, relying on some auditing body (either government or private). Here's the bug, here's a POC of the exploit (the POC code can be kept private). Require companies (or perhaps some third-party and/or government board) to do some level of risk assessment on the identified issues: how big is the exploit? How hard is it to use? How easy is it to fix?

If the company sits on it and it gets into the wild, then have some way to penalize them for negligence if they deliberately sat on a known major issue. Track when and how transparently companies provide security warnings to customers/users when there are known issues that are not yet fixed. Allow independent auditing of how many issues of what severity each company has open. Have some independently audited "scoring" of which companies are the most responsive to reported issues, and which ones sit on them. Score who has the most exploits discovered in the wild.

Now you're at least moving in the direction of forcing companies to give a crap about security without giving financial incentives to black hats.

Market dominance (1)

Larry_Dillon (20347) | about 10 months ago | (#45715189)

It might make sense if the "mandatory" part was limited to larger players in a given sector. e.g., over 20% market share or something. Certainly, vendors need more incentives to patch bugs, but I'm not sure this is the right way to go about it.

I'ma code myself up a minivan! (0)

Anonymous Coward | about 10 months ago | (#45715395)

As a programmer I support this proposal 100%!
    http://dilbert.com/strips/comic/1995-11-13/

There's an easier way. (0)

Anonymous Coward | about 10 months ago | (#45715497)

Enforce and apply existing product liability laws to software.

That way, any bugs that are found become a liability for the company that produced the software. Companies won't be able to make users sign all their rights away with EULAs, and will actually have to take responsibility for their products.

It would just create a new black market (1)

JimDot (519946) | about 10 months ago | (#45715599)

As a developer, I generally try to *remove* bugs from the software, but for a share of the $150,000, I'm sure I could let something slip through and then tell you where to find it. Dilbert nailed this 18 years ago: http://dilbert.com/strips/comic/1995-11-13/ [dilbert.com]

Bug bounty for building code violations (1)

Tony Isaac (1301187) | about 10 months ago | (#45715851)

Imagine a world where you and I could get a bounty for finding building code violations. That could be a full-time occupation, and a lot of people would be going around finding frivolous technical violations just to get the money.

Software isn't any different. There are lots of things that could be considered bugs but that shouldn't deserve a bug bounty. Who is the arbiter of what deserves a bounty and what doesn't?

This is pure BS.

Better Idea (0)

Anonymous Coward | about 10 months ago | (#45715861)

Instead of forcing anything, simply decline to offer copyright protection on the work.

And here's the case against: (1)

wonkey_monkey (2592601) | about 10 months ago | (#45716075)

It's a stupid idea.

I know some coders .... (1)

PPH (736903) | about 10 months ago | (#45720035)

... at Microsoft.

1. They'll put the bugs in and tell me where to look.
2. I'll report the bugs.
3. We split the $150,000.
4. ????
5. Profit!
