
Gag Order Fuels Responsible Disclosure Debate

Soulskill posted more than 5 years ago | from the excellent-use-of-judicial-resources dept.

Security 113

jvatcw writes "The Boston subway hack case has exposed a familiar rift in the security industry over responsible disclosure standards. Many see the temporary restraining order preventing three MIT undergrads from publicly discussing vulnerabilities they discovered in Boston's mass transit system as a violation of their First Amendment rights. Others, though, see the entire episode as yet another example of irresponsible, publicity-hungry security researchers trying to grab a few headlines." We discussed the temporary restraining order last weekend, and later the EFF's plans to fight it. CNet reports that another judge has reviewed the order and left it intact. Reader canuck57 contributes a related story about recent comments by Linus Torvalds concerning his frustration over the issue of security disclosure.

113 comments

Hehe... (0, Funny)

Anonymous Coward | more than 5 years ago | (#24624565)

You said gag.

Re:Hehe... (1)

infonography (566403) | more than 5 years ago | (#24624867)

Lame, you forgot the spoon.

OK, first in thread to reference Frank Zappa, for the win!

Let the off-topic Olympics begin.

IMO, this is really a simple issue (5, Insightful)

zappepcs (820751) | more than 5 years ago | (#24624577)

Linus is dead on right. If you find it, tell the author(s). If they don't respond? Tell the world. Software makers should credit those that find the bugs as well. This will eventually lead to credit where credit is due, and subsequent reputation building in a reasonable manner.

Gag orders just make things worse. This is where I believe the law should take a stand. If someone exercises reasonable due diligence in reporting the vulnerability to the author(s) and nothing happens in response, then the authors have no recourse over what happens when it is made public. This is in line with the intent of our legal framework now, and would not, IMO, violate legal values.

"Unsafe at any speed" was not exactly something the auto industry wanted to deal with, but they had to. Those lessons are very applicable here. Those who don't play nice and disclose to the public too soon should be penalized if actual damages can be shown. Restraint and respect. These two things have no dependency on reciprocal action.

I read Linus' rant and he's absolutely correct. The bigger the flame war over vulnerabilities, the more security companies make off of unwarranted fears etc. It's just a game, and where the law is concerned, we have prior examples to look at... and goddamnit, they are about cars! No analogy needed here

Re:IMO, this is really a simple issue (0)

Anonymous Coward | more than 5 years ago | (#24624609)

"Linus is dead on right. If you find it, tell the author(s). If they don't respond? Tell the world."

Exactly what I would do. Once they start complaining about people finding it and abusing it, as they inevitably will, it would be a big "I told you so"...

Re:IMO, this is really a simple issue (1, Insightful)

iminplaya (723125) | more than 5 years ago | (#24624637)

If you find it, tell the author(s). If they don't respond? Tell the world.

But it still MUST be done anonymously to keep anybody from suppressing it, as is being done here.

Re:IMO, this is really a simple issue (1, Insightful)

Anonymous Coward | more than 5 years ago | (#24625565)

>But it still MUST be done anonymously to keep anybody from suppressing it, as is being done here.

Often when we see information being suppressed, it is as much due to the self-aggrandizing nature of the person trying to disseminate it as to the information itself.

This information could have been silently released to the public through any number of anonymous channels. But the people being suppressed here are themselves attempting to limit availability of the information for their own selfish reasons. One may assume that they *want* to be suppressed by government and/or corporate forces; that this is their goal. If it were only about getting the information to the public, the individuals in the story would not need to be personally associated, and would not be under attack, or even under discussion.

Re:IMO, this is really a simple issue (1)

iminplaya (723125) | more than 5 years ago | (#24626417)

Heh, you're right. We are dealing with some fat egos with these guys. They want their names in big bright lights. Gives them a feeling of power.

Re:IMO, this is really a simple issue (-1, Flamebait)

KlomDark (6370) | more than 5 years ago | (#24624665)

Are you fucking kidding me? That's the stupidest thing I've ever heard of. Are you hurt in the brain or what?

I think it would be much more diligent to make sure and look for the actual meaning here. You could do a lot of good for the world if you just realized that.

If the tangentials of the argument do not agree in your favor, then perhaps you should take a separate approach.

Join with the team already, quit pushing against it. I know you have the understanding to comprehend.

Can you really tell which one is correct? If you look at the analog mirror of the situation, it's easy to tell the reverse.

It's just like my Uncle Bob used to say: If not for the cows, then what of the flowers.

I think that's a good analogy to prove my point.

Re:IMO, this is really a simple issue (2, Interesting)

wellingj (1030460) | more than 5 years ago | (#24624777)

I really don't want to get flamed here but, are you a native English speaker? I'm having trouble making heads or tails of your argument. Maybe I'm the one who's dense...

Re:IMO, this is really a simple issue (2, Interesting)

dynamo52 (890601) | more than 5 years ago | (#24624843)

I'm with you. I read that post three times and understood it less each time. It reads like some automatically generated spam email.

Re:IMO, this is really a simple issue (3, Interesting)

Anonymous Coward | more than 5 years ago | (#24624717)

Linus manages to be right, arrogant and stupid in the same statement.

He seems to have now discovered that in order to improve security you have to try to fix all bugs. This is right. A bug is a place where the software doesn't do what the "educated" user expects. That can almost certainly lead to a security situation.

He's completely stupid, however, to compare a random bug with a demonstration of exploitability. When someone has an exploit, that's something they can sell for money to cause harm to your users. Some exploit finders do. Someone who chooses to tell the software designer directly is doing the designer a big favour. Someone who chooses to tell the users directly is doing them a big favour. An exploitable bug like in Boston is always the tip of a huge iceberg. It's a sign for a software author/designer to go and review their entire design and start looking for ways of doing it more solidly and with better protection in place. It's a sign for users to change to a more secure system.

Finally, Linus is arrogant because his new discovery, that fixing bugs is a good idea for security, is exactly what the OpenBSD group has been preaching for ages. Despite not having a hundredth of the resources Linus has at his disposal, they have demonstrated much better commitment to delivering quality software than he has. He could just have said "thank you".

Re:IMO, this is really a simple issue (1)

zappepcs (820751) | more than 5 years ago | (#24627147)

I had not thought of looking at Linus' signal to noise level quite that way, but you're right: arrogant, stupid, but right. His notable quotes are both sad and hilarious at the same time.

It is those that choose to sell the vulnerability to bad guys that I believe should be considered criminals. The vulnerability finders that just want some credit... well, they should get it.

I can see how Linus' perspective is a bit skewed on this subject. When he started out, the kernel had nothing like the pressure or customer base that the Linux kernel has now. Security is no longer something he can afford to have no opinion on.

I hope that despite the signal to noise on this issue, the industry develops a middle of the road drama-free method of dealing with vulnerabilities.

Re:IMO, this is really a simple issue (1)

Psychotria (953670) | more than 5 years ago | (#24624747)

As much as I love Linux, I don't think that "Linus is dead on right". Coming from anybody else that interview would be labelled as a rant or a troll. Coming from Linus, I still regard it as a rant. He said *NOTHING* in the interview, but just went on and on (between the lines) about how great he is. Sorry. I admire the man. I respect him. But, seriously, he needs to grow up.

Re:IMO, this is really a simple issue (5, Insightful)

Z00L00K (682162) | more than 5 years ago | (#24624821)

And gag orders are today's version of "shoot the messenger".

The problem is there even if you don't tell the world.

Anyway - being a security person is more than revealing or hiding facts. It's also about having the insight to realize that there are always security failures in a system. The point isn't to track down each individual security failure but to create a layered solution that can change a security problem from being critical to moderate.

It's impossible to catch all security problems in a system, and sometimes a security weakness is in place because it isn't possible to make the system more secure without causing it to be impossibly hard to use.

But there are of course stupid security failures too. Autorun in Windows is one... Very effective when you want to spread viruses and other malicious features. And now and then we hear about USB devices that are infected.

Re:IMO, this is really a simple issue (1)

Maelwryth (982896) | more than 5 years ago | (#24624889)

"I think the OpenBSD crowd is a bunch of masturbating monkeys, in that they make such a big deal about concentrating on security to the point where they pretty much admit that nothing else matters to them."

I think we can also safely ascertain that he has cracked Theo's webcam. :)

The gag order may be appropriate (2, Interesting)

Sycraft-fu (314770) | more than 5 years ago | (#24624913)

Temporary restraining orders of all different kinds are often issued at the beginning of a legal case. The idea is that a party might be doing another party harm, and you shouldn't have to wait for the conclusion of a court case (which can take years in some cases) to get the harm to stop. The other party can, of course, argue that the restraining order would cause them harm and thus shouldn't be granted.

Take, for example, a case of someone slandering you. They make knowingly false statements about you with the intent to harm you. This is a matter in which you can take legal action against them, so you do. However, they are a rich prick, so they lawyer up and basically work to drag the lawsuit out as long as possible. They know they'll lose, they just want it to take forever. So should they be allowed to continue while the case is going on? Should you continue to have to endure this for months, maybe years? No, so you'd try to get the judge to issue a temporary restraining order to make them shut up until the case was settled.

Now I'm not saying that it was a good idea for the city to bring a case against these students, however that isn't really for the judge to decide at this point. The question basically comes down to: Could the respondents (the students) cause the plaintiff (the city) harm through their actions? Would it cause the respondents harm to have to cease their action? Well yes, it would cause the city harm if the students revealed their information. You can argue the city deserves it, but all the same. However it won't really cause the students any harm to have to keep quiet about it until the case is settled.

Hence you can see why the judge would grant the order. It isn't a permanent order or anything, it is basically just saying "You have to keep your mouths shut until we've had a chance to examine the case in court." If the EFF lawyers make a good argument (I wouldn't count on it, the EFF has a poor courtroom record) as to why the gag order should be lifted, the judge will do that.

You see this kind of thing in patent cases all the time. A party will sue over a patent and request an injunction to prevent the other party from selling the allegedly infringing product. These often get granted, then removed shortly after when counter arguments are made.

It even applies to personal restraining orders. If you want a restraining order against someone, you go to a judge and present your case. If they find it compelling, one is granted. The person it was against can then challenge it, but it is granted before they can challenge it. Happened to a friend of mine. A girl he knew liked to use them as weapons against people and he pissed her off, so she got one on him. He then went to court and argued why it was bullshit. The judge agreed, dismissed the order and barred her from getting another one against him for a couple years.

So while you can get mad at the city, the legal system appears to be working as it should.

Re:The gag order may be appropriate (4, Insightful)

mysidia (191772) | more than 5 years ago | (#24626913)

You can argue the city deserves it, but all the same. However it won't really cause the students any harm to have to keep quiet about it until the case is settled.

This is clearly not true. Being unable to reveal the information probably discredits some of their valuable work.

It will affect their professional reputation, and possibly their job opportunities, if they don't get to reveal their discovery.

More importantly, it denies them their constitutionally guaranteed rights as citizens of a free country by placing a prior restraint on their free speech, interfering with their liberty and their right to pursue happiness.

Re:The gag order may be appropriate -- Not (3, Interesting)

SpammersAreScum (697628) | more than 5 years ago | (#24627097)

Could the respondents (the students) cause the plaintiff (the city) harm through their actions? Would it cause the respondents harm to have to cease their action? Well yes, it would cause the city harm if the students revealed their information.

You appear to be overlooking the critical point that the students' planned presentation did not Reveal All -- critical information needed to actually exploit the flaw was left out. MBTA was told this and sued anyway. The only "harm" the city would have suffered is well-deserved acute embarrassment.

Re:The gag order may be appropriate (1)

HiThere (15173) | more than 5 years ago | (#24627413)

If you are arguing that the restraining order was legal...well, two judges have agreed with you.

It's stupid, harmful, obnoxious, and of doubtful constitutionality, but it appears to be legal as defined by the courts of Massachusetts. (What, you say "doubtful constitutionality" should mean it's not legal? I agree. But the legal system doesn't.)

Re:IMO, this is really a simple issue (3, Insightful)

WK2 (1072560) | more than 5 years ago | (#24624919)

Linus is dead on right. If you find it, tell the author(s). If they don't respond? Tell the world.

Once upon a time, I would have agreed with you. But nowadays, when someone finds a vulnerability and tells the vendor, the vendor goes and gets a gag order to prevent the public from being able to protect themselves. Or the security researcher gets arrested. It might be safer to just tell everybody, anonymously, through one of the many full disclosure lists.

This is where I believe the law should take a stand.

There is no reason to get the law involved with this one. In fact, the courts seem to be the problem in this case.

Re:IMO, this is really a simple issue (1)

martin-boundary (547041) | more than 5 years ago | (#24625287)

It's really a question of trust.

As a security researcher/bug finder, do you trust the author to act correctly upon hearing the information? If yes, then you should tell the author first, ie give him the benefit of the doubt and only tell the world after a small delay (but always tell the world, because you're helping people). If you don't trust the author to act correctly, then you should tell the world immediately.

The thing is, in the open source world authors have a strong incentive to act correctly wrt a bug report. If they don't, the source is there for everyone to see anyway. In the closed source world, authors don't have a strong incentive to act correctly, and often they have business incentives to hide the existence of the bug.

Linus has an open source point of view. Of course he feels that open source authors can be trusted to fix or discuss security and bug reports. So his rule of thumb, which only applies to open source and a few highly trusted entities, is trust the author first, and tell the world later.

By contrast, in the closed source world Linus' rule of thumb doesn't apply, as we know that most companies are very likely to ignore, or even sue those who discover the bugs, often for simple business reasons. In the closed source world, you should always tell the users ASAP, because you might not get the chance to do so later.

Re:IMO, this is really a simple issue (1)

TheRaven64 (641858) | more than 5 years ago | (#24625547)

What happens if you tell someone else (e.g. an attorney) first? Will they be bound by the gag order? Can they disclose the information later? What happens if you tell someone in a different legal jurisdiction before you tell the manufacturer? For example, if these MIT guys had told someone in the UK first, then the gag order in the USA wouldn't have had any legal impact on the people in the UK, and they would then have been free to legally disclose the information to the public. As long as you don't tell the vendor who else knows, they can't get a gag order preventing the right people from knowing.

Re:IMO, this is really a simple issue (4, Insightful)

jc42 (318812) | more than 5 years ago | (#24625757)

But nowadays, when someone finds a vulnerability and tells the vendor, the vendor goes and gets a gag order to prevent the public from being able to protect themselves. Or the security researcher gets arrested. It might be safer to just tell everybody, anonymously, through one of the many full disclosure lists.

Indeed. In a recent discussion on this topic, someone pointed out that there's a legal name for the strategy of "tell the vendor, and if they don't fix it, tell everyone". The name for this is "blackmail", and you are in danger of prosecution.

We might add that if you tell the vendor, and offer to work for them to fix the problem, there's also a legal term that applies: "extortion".

The only real way to protect yourself from the danger of prosecution is to not tell the vendor anything. You should simply make the information public. That way, it's clear that you're not threatening the vendor with release of the information and you're not trying to get them to pay you to fix it.

This also prevents them from asking the courts to impose gag orders. It doesn't do much to prevent the media from labelling you a "hacker", which to the general public is a kind of criminal. But if you are knowledgeable enough to find and fix security problems, there's probably no way to prevent the media or the political system from labelling you as some sort of criminal. People in positions of authority have always wanted to silence messengers with inconvenient messages, and there's probably no way to fix this bug in the human psyche.

Re:IMO, this is really a simple issue (1)

mysidia (191772) | more than 5 years ago | (#24626873)

What you do is you give the information to the vendor, and tell them the information will be published within 6 months.

If they fix the issue, no problem, the publication will address a fixed issue.

The publication is already outsourced to a trusted third party, and they will have an irrevocable order and agreement to place the information into a public place on the agreed-upon date, subject to a non-disclosure agreement concerning the exact time, venue, etc.
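A lightweight way to make that third-party commitment verifiable is to publish a cryptographic hash of the advisory immediately and reveal the full text on the agreed date. This is only a sketch of the idea; the function names, advisory text, and the 180-day window are all invented for illustration, not part of any real disclosure service:

```python
# Hypothetical sketch: commit to an advisory's contents now, publish later.
# Releasing the digest immediately proves, once the full text appears, that
# the advisory existed unchanged as of the report date.
import hashlib
from datetime import date, timedelta

def commit_advisory(advisory_text: str, reported_on: date, window_days: int = 180):
    """Return a commitment (hash) to publish now, plus the agreed release date."""
    digest = hashlib.sha256(advisory_text.encode("utf-8")).hexdigest()
    release_date = reported_on + timedelta(days=window_days)
    return digest, release_date

advisory = "CardReader v2 accepts replayed balance writes (details omitted)."
digest, release = commit_advisory(advisory, date(2008, 8, 1))
```

The vendor cannot suppress the digest (it reveals nothing exploitable), and the researcher cannot quietly alter the claims after the fact.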

The bug has been known for more than 6 months (0)

Anonymous Coward | more than 5 years ago | (#24625343)

Even months after the problems with the Mifare classic chip were first revealed, ongoing projects to deploy Mifare classic in the Netherlands were not stopped. The people responsible for those projects at TLS (Trans Link Systems) intend to wait until signs of systematic exploitation of the vulnerabilities force them to switch to the revised chip. Their analysts expect the system to last two years. Who's irresponsible here?

Re:The bug has been known for more than 6 months (0)

Anonymous Coward | more than 5 years ago | (#24626079)

Get money to deploy crappy system. Blame hackers for insecurity. Get money to deploy fix that wouldn't have been 'necessary' if the hackers hadn't done anything...

Re:IMO, this is really a simple issue (0)

Anonymous Coward | more than 5 years ago | (#24626049)

They did tell the MBTA, and the MBTA's lawyers told the world when they made the info public by including it in their filing for the restraining order!
Read it here for yourself:
http://www.tgdaily.com/content/view/38817/108/

The real news is how stupid the lawyers for the MBTA were, and the idiocy of the contractors that implemented such a disgracefully insecure system.

Why are we not talking about that? MIT and the students should have been let off the hook immediately by the judge issuing the restraining order, because the "secret" was published by the MBTA's lawyers IN THE PUBLIC RECORD with their own filing for a restraining order! THE HACK IS A MATTER OF PUBLIC RECORD, JUST LIKE EVERY OTHER COURT FILING.

What the MBTA did was like standing on the roof of your house shouting "Stop telling the world about how I screwed the neighbor's wife, or my neighbor will find out!"

Dumb and dumber. The MBTA is making a big issue about disclosure when they got the notification and chose to do nothing with it... except include it in their CYA move with the court, trying to keep it a secret. The big secret, I imagine, is not that the system is as secure as a gumball machine, but that somebody in the MBTA probably got a kickback from the contractor, because the substandard system they contracted to put in was a hack to begin with.

The MBTA is trying to cover their butts and divert attention away from the real issue: money, and a system that can launder it nicely. It's not First Amendment issues that are the real news here, people; it's why the MBTA chose to attack MIT et al. instead of dealing with the problem.

The real culprits of this situation are in the MBTA!

They're a bunch of attention whores (5, Insightful)

Anonymous Coward | more than 5 years ago | (#24624597)

However...

"...yet another example of irresponsible, publicity-hungry security researchers trying to grab a few headlines" <-- this does not invalidate this --> "First Amendment rights" ...no matter what the neo-cons or lobbyists might say.

Re:They're a bunch of attention whores (1)

iminplaya (723125) | more than 5 years ago | (#24624669)

In the land of "free speech zones" and sedition acts, the first amendment is toothless. The info must be released from a place that's out of reach.

Re:They're a bunch of attention whores (0)

Anonymous Coward | more than 5 years ago | (#24625715)

There are limits that can be placed on the First Amendment. The fact that any judge has the power to place a gag order proves that, when you are involved in litigation, limits can be imposed.

Also, I fail to see what any of this has to do with neo-cons. Massachusetts has voted overwhelmingly Democratic for the last 60 years. The only position that comes up Republican on a regular basis is governor, and they are usually socially liberal, fiscally conservative, and get elected to be a check on the almost uniformly Democratic legislature.

Idiots on the Bench (1)

Detritus (11846) | more than 5 years ago | (#24624627)

What part of the First Amendment don't you understand?

Re:Idiots on the Bench (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#24624641)

The obscenity laden part written in Swedish with feces on giant slices of raisin bread.

Not offtopic. (0)

Anonymous Coward | more than 5 years ago | (#24627959)

One of the biggest areas of misunderstanding with the First Amendment is the expression of apolitical artistic thoughts in an unfamiliar, offensive form. The preceding post is a description of one such potential example.

Re:Idiots on the Bench (1)

HTH NE1 (675604) | more than 5 years ago | (#24627739)

You have the obligation to remain silent. Anything you say can and will be used to find you in contempt of court. Findings of contempt are final and are subject to appeal only at the judge's discretion. Your term of incarceration also will be at the judge's discretion.

This all, can be defeated... (3, Interesting)

bogaboga (793279) | more than 5 years ago | (#24624647)

...How? You may ask.

By letting Russian hackers release the info. The problem for the authorities is to prove that those under the gag order had a hand in it. The Russians get the information through no traceable medium: no internet, post, fax, etc.

Proving that the students had a hand in this would be hard, if not impossible. After all, the system was open to everyone as long as they paid up -- including the Russians we are talking about.

Well, well (4, Insightful)

slobarnuts (666254) | more than 5 years ago | (#24624649)

Ok, I read the article. This is my summary:

This just stuck out. The one guy they found who doesn't support full disclosure to the public when the entity fails to protect its users (or, less loaded: the one guy who doesn't support people speaking out about vulnerabilities) said this:

"When you discover major flaws in a system that society relies on, you go to the people who own the system and work with them," Jordan said "You don't stand up on a podium and say, 'Look how clever I am.'"

The first thing that came to mind is: "How does society rely on a refillable card?"

Explain that one to me. Society doesn't rely on the card, they rely on the transport.

Oh, here's the crux of it:

He added that in such cases, the goal of security researchers often seems to be to further their own agendas instead of helping others fix problems. "It's all about improving one's own self-absorbed ego," Jordan said.

Often is a funny word. Because OFTEN the company will just ignore the report, or put the bug on the lowest priority possible. Although you have to admit that OFTEN, when it affects their customer's decision to use a competing product, it gets fixed pretty quick.

I'm not an infosec researcher, but I will say that OFTEN, I'd rather have an idea of a vulnerability, even if it is unpatched, than have no clue about it because the report was sent to the back of the vendor's stack, and then be blindsided by it in a major way.

But then again, I am OFTEN a proponent of free speech.

The Boston system is really dumb (5, Informative)

Animats (122034) | more than 5 years ago | (#24624661)

The MBTA is trying to cover up the fact that their system design is very weak. The value of the card is actually stored on the card, and there's no central validation. That's embarrassing, considering that the MBTA implemented fare cards quite late, long after other cities.

The NYC MetroCard system [sephail.net], in comparison, is totally paranoid. Cards have unique serial numbers and are validated by the entry gate, the station computer, and central servers at MetroCard HQ. Creating new cards with new IDs won't work. Duplicating cards is possible, but is detected the second time the card is used. NYC is so paranoid that equipment maintenance is performed by an outside company, but NYC employees handle the money and blank cards, so that no single party has full access. The New York City subway system was losing about $20 million a year to token fraud, and when the new system went in, they were determined that would stop. They had some fraud back in 1995, when someone stole a supply of blank cards and was able to encode them, but it turned out to be a rip-off for buyers - the cards only worked once, then were invalidated.

The first fare card system, San Francisco's BART, isn't that secure, but has a big advantage: BART has exit gates. So, while it doesn't have real-time validation against a central database, gate info is transmitted in the background to a central system, and if centralized analysis indicates something funny going on, central control can flag the card, trap the user at the exit gate, and alert station security to check the card.
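The central-validation idea described above can be sketched in a few lines. This is a toy model, not any transit authority's actual design; the class, method, and card names are all invented. The key property is that the server, not the card, is the source of truth, so a duplicated card is caught the second time either copy is used:

```python
# Toy sketch of server-side fare validation: each swipe must advance a
# per-card transaction counter recorded centrally, so a cloned card that
# replays an old state is rejected on its next use.
class FareServer:
    def __init__(self):
        self.last_seen = {}  # card serial -> highest transaction counter seen

    def validate(self, serial: str, counter: int) -> bool:
        """Accept a swipe only if its counter advances past the last one seen."""
        if counter <= self.last_seen.get(serial, -1):
            return False     # replayed or cloned card: counter did not advance
        self.last_seen[serial] = counter
        return True

server = FareServer()
assert server.validate("card-42", 1)      # genuine card, first ride
assert not server.validate("card-42", 1)  # a clone replaying the same state
```

The gate can apply this check in real time (the NYC approach described above) or in batch from gate logs (closer to the BART approach), but either way the forgery is detectable because state lives off the card.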

Re:The Boston system is really dumb (1)

MichaelSmith (789609) | more than 5 years ago | (#24624707)

You can store the value on the card. You just have to combine it with salt and encrypt it against a big enough private key. Shouldn't be hard in this day and age.

I don't really see why they are so worried about this attack. Most people would be deterred from using it because falsifying tickets is against the law. They couldn't possibly lose much money and the hole has to be fixed in the long run anyway.
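As a rough illustration of the parent's suggestion, the operator can authenticate the stored balance with a keyed MAC, so a rider can read the value but cannot forge a higher one. The key, record format, and function names here are all made up; and, as replies in this thread point out, authentication alone does not stop someone from copying or restoring a still-valid record:

```python
# Hypothetical sketch: an operator-authenticated balance stored on the card.
# A MAC prevents forging new values, but NOT replaying an old valid record.
import hmac
import hashlib

OPERATOR_KEY = b"example-secret-held-only-by-the-operator"  # invented key

def write_balance(card_id: str, cents: int) -> bytes:
    """Encode balance plus a MAC; only the key holder can produce the tag."""
    record = f"{card_id}:{cents}".encode()
    tag = hmac.new(OPERATOR_KEY, record, hashlib.sha256).hexdigest().encode()
    return record + b"|" + tag

def read_balance(blob: bytes) -> int:
    """Verify the MAC before trusting the stored value."""
    record, tag = blob.rsplit(b"|", 1)
    expected = hmac.new(OPERATOR_KEY, record, hashlib.sha256).hexdigest().encode()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("tampered card")
    return int(record.split(b":")[1])

card = write_balance("card-7", 500)
assert read_balance(card) == 500
```

Editing the record without the key fails verification; the remaining weakness is that a byte-for-byte copy of a genuine record still verifies, which is why the replies below argue for central validation.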

Re:The Boston system is really dumb (4, Interesting)

0123456 (636235) | more than 5 years ago | (#24624733)

"You can store the value on the card. You just have to combine it with salt and encrypt it against a big enough private key. Shouldn't be hard in this day and age."

How does that help? If you can copy the data to another card or prevent the reader from updating the value, then you have infinite amounts of money available.

We used to have stored value cards at university back in the 80s, and it wasn't long before someone discovered how to prevent the automated readers from writing the value back to the card after they subtracted money from it so it never went down. There was also a bug where in some cases the reader would add $100 to the card rather than deducting $0.25...

Re:The Boston system is really dumb (2, Insightful)

jd (1658) | more than 5 years ago | (#24625135)

Not if all transactions are validated. If you're using PKI, then the holder of the card cannot determine in advance what the new value on the card is supposed to be, so all the software has to do is ensure that the decrypted value when re-encrypted is equal to the value that should have been written to the card, and that the digital signature placed on the card matches up with that machine's "personal" public key. Then you know that the value that you think has been written to the card is indeed the value written to the card.

As for writing the correct value - well, not my fault if coders are so incompetent they can't be bothered doing basic top-down design and bottom-up testing. Nor is it my fault they're so frail and scared when anyone suggests formal methods or even something as puny as checking invariants and QAing the corner cases. Frankly, it's pathetic. If you want good software, you've got to put in effort, same as if you want good anything. A top-notch Olympic-quality athlete is going to cost more to train and prepare than a third-grader for their school sports day. It is also going to take a lot more time to get them up to that standard. Software is no different.

If you want software that's 99.999% bug-free, you can do it, but it's not going to be pretty and it's not going to be cheap. If you want cast-iron guarantees that the remaining 0.001% cannot have bugs that significantly impact operations in terms of money, quality of service or reliability, you can do it, but don't expect it to appear effortlessly.

NB: When I say it's not "cheap", I mean exactly what I say. It's going to cost developers in blood, sweat and tears. It's going to cost them in time, it's going to cost them in pain, it's going to cost them in stress. It might well cost them their sanity. If it's in the corporate sector, it's certainly going to cost someone a great deal of money. But don't tell me it can't be done. Given enough time and some suitable rope to hang themselves with afterwards, any programmer can do it. It's that they or their paymasters aren't interested. Lack of interest is a WHOLE 'nother game. Nothing to do with impossibilities, it's all psychological bullshit.

(Linux is a good example. It has a low percentage of bugginess because the coders think they can do it. Windows does not because their coders don't think they can. OpenBSD has superb security because their coders think that that's what they're great at. There are carrier-grade and DO-178B level A Linux variants because those coders thought that was possible too. I've mentioned before general-purpose OS' that use security kernels to achieve mathematically-provable Orange Book A1 security - yet I still hear people insist it cannot be done. Too bad, it already has. Get used to it.)

If people wanted a high-reliability IT system that was secure, bullet-proof and made life easier - really wanted it - then that is what they would have. IT disaster stories happen not because IT is difficult, but because the attitude is.

Re:The Boston system is really dumb (1)

Monkeybaister (588525) | more than 5 years ago | (#24625921)

This is what your system sounds like: I have a card with memory contents A. The reader then reads A, writes B to the card, reads B from the card and is happy. I push a button on my modified card; it now stores A again.

Your fancy hashes, signing, whatever are all stored in the memory contents on the card. What you need is a way to know that A is invalid data and should never be read from the card ever again (putting a timestamp in the data will ensure going back to a previous balance won't write the same data to the card).

The problem is that you have to store the fact that A is invalid in a trusted place, which is not on the card. All your fancy PKI has done is make it possible to know that the data on the card was written by a trusted source at some point in the past. The problem is knowing that it's what it should be right now.
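The replay Monkeybaister describes fits in a few lines. A toy sketch with hypothetical names, using an HMAC as a stand-in for whatever signing the card scheme uses: the restored snapshot verifies perfectly, because it really was written by a trusted reader - just not recently.

```python
import hmac, hashlib

KEY = b"trusted-reader-secret"  # hypothetical reader key

def sign(value_cents: int) -> bytes:
    payload = value_cents.to_bytes(4, "big")
    return payload + hmac.new(KEY, payload, hashlib.sha256).digest()

def verify(record: bytes) -> int:
    payload, sig = record[:4], record[4:]
    good = hmac.new(KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(sig, good):
        raise ValueError("bad signature")
    return int.from_bytes(payload, "big")

snapshot_a = sign(2000)      # card holds $20.00, correctly signed (contents A)
card = sign(2000 - 175)      # reader deducts a fare and writes B
card = snapshot_a            # "push a button": restore old contents A
assert verify(card) == 2000  # the stale balance still verifies
```

Nothing on the card itself can record that A has been superseded; that fact has to live somewhere the cardholder cannot rewind.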

Re:The Boston system is really dumb (1)

0123456 (636235) | more than 5 years ago | (#24627955)

"As for writing the correct value - well, not my fault if coders are so incompetent they can't be bothered doing basic top-down design and bottom-up testing."

You just don't get it, do you? This is the way that these systems are broken, not by breaking encryption keys.

There is NO WAY to prevent these kinds of attacks without some central validation; you can make them moderately difficult with complex hardware, but the hackers are generally smarter than the people building the system... if they have to dismantle a few dozen cards in order to work out how to duplicate them, so what?

The only way you can prevent duplicate cards (or storing the old contents of the cards and writing it back to the same card later) is by centrally validating all cards when they're used, or at least logging all transactions and flagging duplicate cards; but by then, they may already have got thousands of dollars or more of free services.

Re:The Boston system is really dumb (1)

jpatters (883) | more than 5 years ago | (#24626125)

Suppose that both a value and a unique transaction ID are stored on the card. Then the system could count on a particular value/transaction-ID pair only being used once. When the card is used, the system encodes the updated value with a new transaction ID and writes that to the card. If that write is prevented, the card is left with the already-used value/transaction-ID pair, and the card is rejected if anyone attempts to use it again. If the card is duplicated, only the first use would work. Of course, you would have to have a centralized database of used transaction IDs for this to work, so if you do that you can just store an ID on the card and keep the value in the centralized database. But you could securely store the value on the card.
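The scheme above can be sketched directly. A minimal sketch with hypothetical names; the `used_ids` set plays the role of the centralized database of spent transaction IDs:

```python
import secrets

used_ids: set[str] = set()  # central database of already-spent transaction IDs

def write_card(value_cents: int) -> dict:
    # every write pairs the balance with a fresh, never-reused ID
    return {"value": value_cents, "txn_id": secrets.token_hex(8)}

def redeem(card: dict, fare_cents: int) -> dict:
    if card["txn_id"] in used_ids:
        raise ValueError("stale or duplicated card - rejected")
    used_ids.add(card["txn_id"])
    return write_card(card["value"] - fare_cents)

original = write_card(1000)      # $10.00 balance
clone = dict(original)           # attacker duplicates the card
updated = redeem(original, 175)  # first use works; card gets a new ID
assert updated["value"] == 825
try:
    redeem(clone, 175)           # the clone replays the spent ID
except ValueError:
    pass                         # duplicate caught centrally
```

As the comment notes, once the central set exists you could keep the balance there too; the sketch just shows why the first use wins and every later replay is rejected.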

Re:The Boston system is really dumb (1, Interesting)

Anonymous Coward | more than 5 years ago | (#24624739)

That is such a primitive way of looking at things. Call-home systems suffer from all kinds of issues that a PKI system with localized validation could laugh at with both PCI busses tied behind its back. For crying out loud, this is NOT the 1970s! You could store tens of millions of private keys on each and every single card reader and not even notice the change in costs. A smart card that stores just one side of the asymmetric key pair and a value (ie: a card that is read-only EXCEPT to an authorized machine) - Mondo Cards have been around for 15-20 years - is infinitely more secure than any bank and is infinitely more private. The only reason Swansea didn't stick with them is that 15-20 years ago the tech was too primitive. It was damn good, but too early. And that wasn't for any damn light rail, people were putting real money onto those cards. And they worked, and you don't hear of anyone going to a black hat conference wanting to talk about security holes in them. I trust proven technology and proven results. Banks offer neither. I use them because they're still one or two rungs up the ladder from the pond scum that run the credit card system.

Re:The Boston system is really dumb (0)

Anonymous Coward | more than 5 years ago | (#24625991)

The NYC system uses similar technology to the Boston system. It's also a stored-value system. Conceptually the CharlieCard system is superior - if only the encryption worked. The MetroCard as described in that article is just a magnetic swipe card.

Sadly the Boston system relies on a weak encryption system that can be brute-forced in minutes. Once you do that, nothing can stop you from writing to the card.
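To illustrate the scale problem, here is a toy cipher with a deliberately tiny 16-bit key (purely hypothetical - this is NOT the CharlieCard's actual cipher): with one known plaintext field, exhaustive search recovers the key almost instantly, and a larger but still-small key only stretches the search, it doesn't change its nature.

```python
import hashlib

def keystream(key: int, n: int) -> bytes:
    # toy stream cipher: keystream derived from a 16-bit key
    return hashlib.sha256(key.to_bytes(2, "big")).digest()[:n]

def encrypt(key: int, data: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

secret_key = 0xBEEF    # what the card issuer thinks is hidden
known_plain = b"FARE"  # a field the attacker can predict
ciphertext = encrypt(secret_key, known_plain)

# brute force: try all 65,536 keys against the known plaintext
recovered = next(k for k in range(2**16)
                 if encrypt(k, known_plain) == ciphertext)
assert recovered == secret_key
```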

I would hope that the Boston system records information about how the cards are being used, to detect things like this; it would be criminal not to. But if you've ever dealt with Massachusetts government, it comes as no surprise that in fact they do not.

Plus there are more expensive versions of the same card that do not have the same flaw. Instead of using some proprietary encryption scheme they use 3DES or AES.

But this is the city that decided to cheap out and GLUE concrete tiles to the roof of its useless tunnel when it was forced to put up some of its own money rather than continue to misuse federal highway funds. Surprisingly enough, these GLUED tiles then fell and killed a woman [wikipedia.org]. So it's impossible to be surprised that they went for the cheap option over the functional one.

Re:The Boston system is really dumb (1)

HTH NE1 (675604) | more than 5 years ago | (#24627849)

The first fare card system, San Francisco's BART, isn't that secure, but it has a big advantage - BART has exit gates. So, while it doesn't have real-time validation against a central database, gate info is transmitted in the background to a central system, and if centralized analysis indicates something funny going on, central control can flag the card, trap the user at the exit gate, and alert station security to check the card.

As a teenager visiting San Francisco, one day I accidentally left my BART card on the train, and found myself in a station where not only did I need my card to leave, but all the card vending machines were beyond the gates I couldn't pass without one! I was at a loss for what to do. The card had left the station and was irretrievable. I could get on another train and try station after station looking for one that had vending machines inside the paid area or a free exit, without being sure such a station even existed. I didn't want to risk jumping a turnstile.

Eventually I approached an official and explained my situation and was let out of the station. And I made sure from then on that if I wasn't immediately about to use the card that I kept it in my pocket, not tucked under my leg in my seat.

I am glad (3, Insightful)

definate (876684) | more than 5 years ago | (#24624693)

I am glad this judge has put a gag order on the MIT students, because now there is no exploit, and we are all safe from the terrorists/etc.

As we all know, if we all don't talk about it, it doesn't exist... right?

Okay, so sarcasm aside, this is the most ridiculous idea I have ever heard. Attempting to fix a problem by stopping people from hearing about the problem?

I know I am oversimplifying the matter to get my point across, but I'm doing this to point out how ridiculous it is.

Additionally, saying "He added that in such cases, the goal of security researchers often seems to be to further their own agendas instead of helping others fix problems" shows a complete lack of understanding of market forces. Yes, he is furthering his own agenda, and in the process he benefits us. It's the market, you commie bastard; it isn't evil, we all win, get over it.

Re:I am glad (3, Insightful)

wellingj (1030460) | more than 5 years ago | (#24624793)

Okay, so sarcasm aside, this is the most ridiculous idea I have ever heard. Attempting to fix a problem by stopping people from hearing about the problem?

Welcome to modern life in the USA.

Re:I am glad (1)

wisty (1335733) | more than 5 years ago | (#24625391)

Gagging has been tried before.

Let's say you are a large military power, with a naval base in Hawaii. Let's say another large power in the Pacific is causing a bit of trouble.

Let's say that a guy (let's just call him Billy Mitchell) writes a 324-page report predicting a war in the Pacific, and pointing out that if it happens, the ships in your Hawaiian base are sitting ducks to an air attack.

Do you fix your defenses? If you don't immediately fix your defenses, do you try to gag the guy who is pointing out your weaknesses?

I am not (3, Insightful)

TubeSteak (669689) | more than 5 years ago | (#24624795)

Yes he is furthering his own agenda, and in the process, he benefits us. It's the market you commie bastard, it isn't evil, we all win, get over it.

The market is neither evil, nor good, it merely is.
But, as we've seen time and time again, without regulation, markets tend towards imperfect competition [wikipedia.org].

That said, what you and many other people generally fail to point out is exactly how security researchers contribute towards the free market. Their contribution is information. Complete information (in this case) is when everyone has knowledge that an exploit exists. Perfect information is when everyone has knowledge of how the exploit works.

But economics and markets are never that simple and it isn't very hard to argue that the net harm from releasing the information is greater than the net good.

Re:I am not (1)

definate (876684) | more than 5 years ago | (#24624817)

It is funny; I hear this time and time again, that "free markets without regulation tend towards imperfect competition", yet I have never seen any credible theory which suggests that.

Free markets ALLOW competition; regulation reduces competition and possibilities. By its very definition it forces markets towards imperfect competition. (Yeah, there are pro-competition laws (monopoly laws), however they are hardly used, especially compared to how often the anti-competition laws (trademarks, copyrights, patents, taxes, etc.) are used.)

We need not know EXACTLY how security researchers contribute towards the free market. Demanding that shows a lack of understanding of the invisible hand of the free market. All we need to know is that the researchers contribute, and if they are successful and pave the way for other researchers to make a living, then the market has deemed them valuable. We know this because they are profitable and thrive.

We need not explain or understand the entire system and all its finer details; all we need to understand is that it is inevitably in our (collective "our") best interests.

Re:I am not (1)

kailoran (887304) | more than 5 years ago | (#24625197)

You must have been reading a somewhat strange selection of books on economics if you didn't find a theoretical explanation of why free markets are not always great for competition.

It may all be fine and dandy with software or some manufacturing business, but you might have noticed that for instance the telecom market has this barrier of entry thing that prevents any meaningful competition *unless* regulated.

Re:I am not (2, Insightful)

definate (876684) | more than 5 years ago | (#24625243)

Barriers to entry can be overcome as long as those barriers are not enforced by government. This is the primary reason telcos have problems competing.

If we are talking about infrastructure that the company has created being a barrier, you are mistaken, since any opportunity to a company is weighed according to its profitability.

If another telco wants to use their infrastructure then they pay for it, where it is priced against their own internal services.

Under a free market companies will make stupid decisions, however in the long run they will be forced to make better ones, unless you regulate the industry.

There are a few examples of how free markets are not completely efficient (eg, total surplus excluding government surplus is not maximized), however all of the solutions for these problems are often criticized as introducing more problems than they solve. Especially since most of them presume that the government has perfect information, which it never does.

Additionally, companies develop barriers to entry to push the price from perfect competition towards monopoly; however, they are stopped from pushing it too far by other companies competing. If a company is pushing it far without any competition, then there are two possible reasons:
1) The actual margins of the company are not particularly attractive, or there is an immensely unattractive payoff period/NPV/etc. If so, perhaps this industry isn't that attractive, and forcing the introduction of another competitor by whatever means would not benefit the industry or the consumer.
2) The company has developed a competitive advantage, and so they are capitalizing on their innovation and hard work. If so, why would you want to punish a company for doing well and creating so much value for people?

There are theories about why free markets are not always good, and there are "problems" with a free market, however there is no other reasonable alternative.

Re:I am glad (1)

RAMMS+EIN (578166) | more than 5 years ago | (#24624973)

While I agree with you, it needs to be said:

Additionally, saying "He added that in such cases, the goal of those issuing gag orders often seems to be to further their own agendas instead of helping others fix problems" shows a complete lack of understanding of market forces. Yes, they are furthering their own agendas, and in the process, they benefit us. It's the market, you commie bastard; it isn't evil, we all win, get over it.

Acting in one's own interest doesn't always benefit the common good.

Re:I am glad (1)

definate (876684) | more than 5 years ago | (#24625003)

Acting in one's own interest while operating in a free market always benefits the common good, if you are successful and prosperous.

In this case, Judges don't act in a free market, they act in a democratic market.

If we switch Judge with CEO (kind of like a judge who acts in a free market), we would see that if the CEO's best interests weren't in line with our best interests then he would not be able to pursue his best interests, or at best would have limited time to do it, as it became infinitely expensive.

(Although this example is not the same, since a Judge is not a CEO, however in this comparison we can see that their differences primarily lie in the fact that the Judge does not act in a free market)

Re:I am glad (0)

Anonymous Coward | more than 5 years ago | (#24625223)

> Acting in ones own interest while operating in a free market always benefits the common good, if you are successful and prosperous.

There is this game called Monopoly, which simulates a very simple free market. It is usually played by 4 players. One of those becomes very rich and prosperous, while the other players go bankrupt. Yay free market! Hail the common good!

Re:I am glad (0)

Anonymous Coward | more than 5 years ago | (#24626169)

It's the market you commie bastard, it isn't evil, we all win, get over it.

The market gave us Windows.

The problem with a binary world... (3, Insightful)

bm_luethke (253362) | more than 5 years ago | (#24624783)

"Many see the temporary restraining order preventing three MIT undergrads from publicly discussing vulnerabilities they discovered in Boston's mass transit system as a violation of their First Amendment rights. Others, though, see the entire episode as yet another example of irresponsible, publicity-hungry security researchers trying to grab a few headlines."

Well, how about both? It can be a restriction of their First Amendment rights *and* a publicity-hungry "researcher" trying to grab headlines. The two things are not mutually exclusive.

Doing the Right Thing has not been in vogue for many years now, it is all about making some form of a statement.

It would be interesting to see the fingers being pointed if said system were attacked by terrorists and the only people killed were the families of the two sides. My guess is that the other side's point of view would become immediately obvious, and they would both then point fingers at each other in an attempt to make themselves feel better.

However, in this particular case I can see why the courts would issue a gag order until the case is heard - that is not a violation of your First Amendment rights. It has generally been established that while things are being litigated, the more restrictive position is somewhat enforced until the case is decided. That only makes sense - otherwise, why even have the courts reach a decision at all, when one side would be the de facto winner?

Ah well, what do I know? It's worth our deaths to tell everything, yet if we kept all flaws secret then all would be well. We can't do something reasonable like, say, not telling people bent on killing us how to do it, and fixing a problem when we are informed of it. Nope, too hard to do, and it may show that we aren't the Saviors of the World we think we are. Heck, we may even have to look at the other side as Not Crazy and wanting to live free with little threat of death - how bad would that be?

Re:The problem with a binary world... (1)

Pinky's Brain (1158667) | more than 5 years ago | (#24625537)

Why indeed have the court make a decision in this matter?

Prior restraint isn't their business ... there was no decision to be made by them, the fact that they thought there was something to decide in the first place is the problem.

The REAL lesson is: next time, use Wikileaks (2, Funny)

Anonymous Coward | more than 5 years ago | (#24624801)

Post it to wiki:

http://wikileaks.org/

Then, if some moron complains, point him/her to this article. No good deed goes unpunished, so to hell with them.

For those who haven't played the home game (3, Informative)

Anonymous Coward | more than 5 years ago | (#24624879)

The Tech leaked these slides days ago.

http://www-tech.mit.edu/V128/N30/subway/Defcon_Presentation.pdf

It really covers absolutely everything you care about. If you're willing to, you can do all of this from the comfort of your bedroom.

Now, I'm not in Boston, but next time I am...

Comments (5, Insightful)

RAMMS+EIN (578166) | more than 5 years ago | (#24624891)

My thoughts:

First Amendment rights are a red herring. The fact that you have a right to say something doesn't make it a good idea to say it.

Publicity-hungry researchers trying to grab a few headlines also aren't the issue here.

The issue here is security. And that raises the question of who we are trying to protect. As far as I am concerned, we _should_ be trying to maximize overall security. I think the best way to do that is to protect the users of products. So, the question then becomes: What kind of disclosure yields the best security for users?

Unfortunately, the answer to that question depends on a variety of factors. I think the three most important ones are:

1. How will the vendor react to being informed of the vulnerability?
2. How will the users react to being informed of the vulnerability?
3. How will the black hats (bad guys) react to being informed of the vulnerability?

None of these questions can be answered generally. In particular, in general, you cannot know how the black hats will react, because you cannot know if the black hats were already aware of the vulnerability. If they weren't, you have just given them a new attack vector. This is a Bad Thing, and one of the most common arguments against full disclosure. On the other hand, if they were already aware of the vulnerability, you have told them nothing they didn't already know. Since you can't know, in general, if the black hats already know of a vulnerability, it seems that full disclosure is a bad idea overall. But that's if you only consider point 3.

Once you factor in points 1 and 2, the picture changes. The fact that you found a vulnerability is always interesting news to the vendor and the users. If they didn't know about it already, the vendor now knows that they have a problem that affects their users and that they need to fix, and the users know they have a problem that the vendor hasn't fixed yet, and that they should protect themselves against. If the vendor or the users did know about the vulnerability, they now know that _another_ person has found it, and that, perhaps, more priority should be given to fixing it and protecting against it. In case of full disclosure, everybody now knows for sure that the black hats know about the vulnerability, that they _will_ use it to attack systems, and that it _must_ be protected against and fixed as soon as possible.

Now, I am going to say a couple of things that aren't really factual, but that seem reasonable to me.

First of all, protecting yourself from vulnerabilities and getting them fixed is _always_ the right way to deal with vulnerabilities. Doing so as soon as possible minimizes the time you are vulnerable, and thus is a Good Thing. Not everyone realizes the importance of this. But, once a vulnerability has been announced publicly, you _know_ that the black hats know about it, so it is clearly risky to not protect yourself against it.

Secondly, in general, you will never make all users aware of a vulnerability. It may seem that a vendor could inform the users of their product of a vulnerability. However, vendors are notoriously reluctant to provide their users with information about vulnerabilities. If they provide information at all, it is usually not detailed enough to allow users to take protective measures, or comes long after the black hats have already started exploiting the vulnerability. Moreover, even the vendor will not know everyone who uses a product. And nobody can exclude the possibility that some of these users may be black hats, or that the information may leak to the black hats. Public disclosure at least gives every user of the product the possibility to inform themselves of a vulnerability.

Thirdly, historically, vendors have been reluctant to fix vulnerabilities unless they were publicly known. This is a Bad Thing, because the fact that a vulnerability is not publicly known does not mean it is not being exploited. Now, of course, vendors could change. And some of them have changed. But, historically, full disclosure has been one of the few, if not the only, effective measures to get vendors to fix vulnerabilities.

It seems to me, then, that full disclosure is a Good Thing. On the other hand, that doesn't make it better than informing the vendor first, and informing the world at large later. Do inform the world that you found a vulnerability, and don't wait too long, but tell the vendor first, so that, if they do their part in protecting users, they will have a fix out by the time they or you announce the vulnerability. And do provide details, at least enough so knowledgeable users know how to protect themselves if, somehow, installing the fix is not an option. This will accomplish:

1. The vendor will know that there is a vulnerability, and that they must fix it quickly, because the black hats will soon know about it, if they don't already.
2. The users will know that there is a vulnerability, how they can protect themselves against it, and, if the vendor did its part, that there is a fix for it. Either way, they can protect themselves and be safe.
3. The black hats will know about the vulnerability, but they will only be able to use it against users who didn't protect themselves, once full disclosure has been performed. Before that, it depends on if the black hats already knew. If they didn't, they couldn't exploit anyone. If they did, they could exploit everyone.

Quick disclosure is important, because it allows you to get to the point where everyone can protect themselves and be safe quickly.

Finally, about publicity-hungry researchers: perhaps we should keep that incentive. Black hats can make a lot of money by finding vulnerabilities before they are fixed. This is certainly motivating. We must try to find the vulnerabilities before they do. We could do worse than motivating people on our side to find vulnerabilities...

Re:Comments (1)

Antique Geekmeister (740220) | more than 5 years ago | (#24625009)

Your post is not, primarily, facts. It's primarily reasoning, and that's trickier to correct. Next time, please split your work into multiple messages; it's easier to follow.

For example, you wrote: "..., vendors have been reluctant to fix vulnerabilities unless they were publicly known. This is a Bad Thing, ... "

There's no trivial way to fix this: Fixing the flaws often requires a complete redesign of a system. In this case, it means using a better RFID system that supports encryption better. But that's actually changing a standard technology, and will drive up the price of the cards and the card readers and the infrastructure to support them on an already existing system. That's very expensive, indeed, and we've seen the same sort of problem with the new US passport id tags.

The solution actually comes earlier in the process: in public exposure of the designs themselves, before their release. This would help standardization and modularity considerably. It's an open-source approach, rather than merely allowing people to publish the flaws in a closed-source technology that a normal person has little or no access to.

First amendment rights are a red herring? (1)

Pinky's Brain (1158667) | more than 5 years ago | (#24625561)

There have been plenty of stories about disclosure responsible or otherwise, that isn't what makes this one special. The fact that multiple courts decided that prior restraint was fine and dandy is what stands out here ... so no, it's not a red herring.

Re:Comments (0)

Anonymous Coward | more than 5 years ago | (#24626133)

And if the black hats are the owners and contractors? What then?

You have fraud of a different kind. The secret is now about securing a payola system rather than securing the public interest. All this First Amendment BS becomes a red herring. If the public's interest were at stake, why did the MBTA do nothing except try to hush the MIT students?

Because somebody in the MBTA was making bank on this "flaw", that's why. Several people on this forum have pointed out that the flaws are way out of step with the security of other systems. How did this system get put into place in this day and age? Payola, that's how.

The students are not the problem.
Follow the money.....


Prior Restraint is UNCONSTITUTIONAL!!! (4, Interesting)

Jane Q. Public (1010737) | more than 5 years ago | (#24624985)

And for good reason!!!

They have a RIGHT to speak. They can exercise discretion and do people a favor, or they can exercise a different kind of discretion and do a different group of people a favor, or they can lack discretion and get themselves arrested for illegal speech, which does happen sometimes... but only AFTER they say it! There is no such law as "conspiracy to say something harmful or offensive"!

Regardless of whether it is right or responsible or moral for them to do what they want to do, they have a RIGHT to speak. And you can't mess with that right without messing up a hell of a lot more than just the "security" of one sorry municipality or corporation.

Prior restraint amounts to a legal attempt to read someone's mind. Sorry, but "thought crimes" STILL do not exist in this country. Because prior restraint would open up a whole nightmarish can of worms and, effectively legitimize the concept of "thought crime", it should never be tolerated even a little bit, EVER.

Re:Prior Restraint is UNCONSTITUTIONAL!!! (1)

Dhalka226 (559740) | more than 5 years ago | (#24625735)

[headline:] Prior Restraint is UNCONSTITUTIONAL!!!

Usually, but not always. Gag orders are prior restraint, for example, and are not always unconstitutional. There are also clear exceptions for what the USSC refers to as "exceptional circumstances," and clearly defining that is up to the courts themselves. You are correct that there is a heavy bias against it in US jurisprudence, but it is important to note that no decision has yet been rendered. It MAY be prior restraint days from now, but now it is simply "keep your pants on a minute."

They have a RIGHT to speak [. . .] they can lack discretion and get themselves arrested for illegal speech

You are contradicting yourself. A right is something you can do free of legal consequence. If one can be arrested for something one says--and you are correct that one can--then it was not a right to begin with. It is a difficult distinction sometimes, but put it this way: Screaming "I KNOW MY RIGHTS!!!" does nothing but make you look stupid if it turns out you did not.

Whether or not they have a right to say these things in the face of the harm the other side claims it will do--and I am not making any judgments at this point about who I think is correct--is the very heart of the case. It is the reason a judge is involved at all. As somebody said in a post in the original discussion, this is a temporary restraining order: it is a judge telling you to stand still until he has time to hear the actual arguments on both sides and make an ACTUAL ruling. It is not only standard procedure, it is absolutely the right thing to do. Justice is not served if people are free to do things that make the judgments moot while it has its back turned.

Prior restraint amounts to a legal attempt to read someone's mind. Sorry, but "thought crimes" STILL do not exist in this country

Why? For starters, I do not think there's any doubt that these people were going to talk about this. It is entirely possible that they had already planned out every word of what they were going to say and made cute little PowerPoint presentations for them as well. If nothing else they had an idea well defined enough that a security convention slotted them time to speak about it. "We uh, may or may not have something to tell you about stuff?" does not cut it. To use your ridiculously bad example of crime, this has moved beyond what could be reasonably considered "thought crime" and into the "attempted ____" situation. They are going to do it. The only thing stopping them is, well, the fact that they have been stopped.

But more to the point, your statement is just logical fallacy. Prior restraint in this case is the equivalent of your parents glaring at you and saying "don't you even think about it!" Logically there are only two possibilities: One, you were going to do it and an authority figure just stopped you. Or two, you weren't, and you huff around in indignation at the accusation but nothing else in the world has changed. You can only be restrained from doing something that you planned on doing in the first place. Since the only "punishment" in a case such as this is that you not be allowed to say something, I find it difficult to label it as anything approaching a thought crime, regardless of how the decision ultimately comes down.

Nonsense (1)

Jane Q. Public (1010737) | more than 5 years ago | (#24626975)

Quote: "It MAY be prior restraint days from now, but now it is simply 'keep your pants on a minute.'"

Not so. It is prior restraint NOW. As you point out, it could theoretically turn out to be one of those rare exceptions of legal prior restraint when it gets to court, but the chances of that happening in this particular case are about the same as a snowball in Hell. This is CLEARLY a case of ILLEGAL prior restraint.

By the way, I do not recall any acceptable "exceptions" to the prohibition on prior restraint when it comes to speech. I would be interested in some examples. The classic example of prohibited speech is yelling "fire" in a crowded theater. Yet it is still not legally permissible to restrain someone from going to the theater, based on mere speculation that they might do such a thing. More to the point in the case at hand, it is not permissible to restrain someone from giving a speech in which they tell people that the fire exits at the theater have serious flaws... even if the theater company thinks that the speaker plans to disclose that information to others, to its detriment or theirs.

Quote: "For starters, I do not think there's any doubt that these people were going to talk about this. It is entirely possible that they had already planned out every word of what they were going to say..."

In this case, the speakers had already stated that they did not intend to disclose the full details of defeating the flawed system. So there is even less justification for prior restraint. So what if they had planned what they were going to say? Most people who plan to give a presentation do exactly that. But a powerpoint slide does not reveal everything -- and sometimes reveals almost nothing -- about the accompanying spoken words. Once again, prior restraint, when it comes to speech, is an assumption that you know what someone is going to say before they say it... and there is no rational basis for that assumption.

Quote: "You are contradicting yourself. A right is something you can do free of legal consequence."

That is not even remotely true. They DO have a right to speak. However, as I pointed out, they do NOT have a right to say absolutely anything they want. There ARE legal limits. But it is not permissible to restrain someone prior to their speech because you "believe they might intend" to say something you do not like. That is an attempt to read someone's mind... which, again, is not allowed. I also have a RIGHT to own and even discharge a firearm. This was recently decided in so many words by the Supreme Court. But that does not mean that I have a right to discharge it in the direction of my neighbors, except in self-defense. There is no contradiction in that, nor in my original statement.

In your final paragraph you convey your inability to understand why I label this "thought crime". Yet your own analogy gives it away: parents saying "Don't even think about it." They are ASSUMING they know what someone is going to do before they do it. Also, it is not true that you can only be restrained from doing something that you had planned to do in the first place! It is possible -- as in this case -- to be restrained from doing perfectly LEGAL things, on the basis that someone SUSPECTS that you were about to do something illegal. That is, in fact, what prior restraint is all about. When it comes to free speech, it is not possible to know what someone will say until they say it. THEN they might have committed a crime... but it is very definitely NOT permissible to prevent them from speaking. If it were, I could prevent you, for example, from speaking in public merely because I suspected that you were going to commit slander against me. Obviously, such a situation would be ridiculous.

They were NOT restrained merely from saying something that might damage or injure a company or a municipality or the passengers. They were restrained from speaking at all... including all of the perfectly legal things they were going to say!

Re:Prior Restraint is UNCONSTITUTIONAL!!! (1)

dissy (172727) | more than 5 years ago | (#24626243)

Regardless of whether it is right or responsible or moral for them to do what they want to do, they have a RIGHT to speak. And you can't mess with that right without messing up a hell of a lot more than just the "security" of one sorry municipality or corporation.

Prior restraint amounts to a legal attempt to read someone's mind. Sorry, but "thought crimes" STILL do not exist in this country. Because prior restraint would open up a whole nightmarish can of worms and, effectively legitimize the concept of "thought crime", it should never be tolerated even a little bit, EVER.

I wonder how private of a company the transit system actually is. I was under the impression it was run by the city, thus the government.
Assuming of course that is true, then this just opened Boston up to a massive entrapment lawsuit!

It is entrapment when government attempts to entice you to commit a crime or crimes that you would not commit otherwise.
Which is exactly what this statement is doing:

"When you discover major flaws in a system that society relies on, you go to the people who own the system and work with them," Jordan said "You don't stand up on a podium and say, 'Look how clever I am.'"

Jordan is attempting to convince the blackhat security researchers to commit both blackmail and extortion, which are the two laws used against a person who offers to disclose to the source first instead of going straight to the public (blackmail), and who offers to fix the problem for pay (extortion).

These researchers are clearly not wanting to commit those two very real crimes, so now they are having their first amendment rights trashed because this government(*?) worker wants them to commit blackmail and extortion instead.

This is the definition of entrapment.
(* Assuming this transit system is government run of course. If it is some private company, it would not be entrapment, but then again, if it was some private company, I can't see the gag order as "Boston vs. ____")

The right to shout "Fire!" in a crowded theatre... (1)

UrinalPooper (1240522) | more than 5 years ago | (#24625001)

...remains intact if the theatre is actually on fire and the manager refuses to pull the alarm.

Re:The right to shout "Fire!" in a crowded theatre (1)

Vectronic (1221470) | more than 5 years ago | (#24625043)

"Free speech means the right to shout 'theatre' in a crowded fire."

False dichotomy (1)

pla (258480) | more than 5 years ago | (#24625303)

Others, though, see the entire episode as yet another example of irresponsible, publicity-hungry security researchers trying to grab a few headlines."

Let me add another, somewhat more cynical voice to the debate...

Why should security researchers disclose their discoveries to the original author first? That would only make sense if we assume all security researchers do what they do for the sake of improving software for which they have no financial incentive, out of pure benevolence. While such people might exist, only a fool would naively expect them to be the majority, much less all of them.

So, why do security researchers do their thing? Two words: "fame" and "money". And even such "noble" goals still leave out the true blackhats, who do it for the sake of finding new, unpatched exploits they can use.

Considering that, does it make any sense to talk about a mythical "obligation" to disclose vulnerabilities to the right people? IMO, not at all. If we want to have a sensible conversation on this topic, we should instead focus on how best to shift the motivation to more on the money and less on the fame. Which sounds more motivating, "bounties for bugs" or "report it to the proper channels"?

Re:False dichotomy (1)

Deefburger (1345835) | more than 5 years ago | (#24629447)

I found a security breach once, tracked it, reported it to the authorities in charge of the company, and got fired! They were the ones doing the dirty deed! It was a fraud scheme used to make money under the table and launder money from other nefarious ventures. So even if you do figure it out, who are you gonna call? In my case I had to tell the owners of the company. But in the case of a public entity, it's the public that should be told. If the truth is inconvenient, so be it.

why disclose at all? (4, Insightful)

speedtux (1307149) | more than 5 years ago | (#24625331)

The situations of the Linux kernel and the Boston subway are completely different. In the case of the Linux kernel, people need to know because it's their security that's at stake. In the case of the Boston subway, it all comes down to the economics of fare evasion and doesn't affect anybody's security (and you can be certain that the Boston subway knew about this and accepted it when they bought the system).

Now, I think the MIT students have a first amendment right to disclose this. However, I also think that these kinds of antics deserve reproach: people should point out that this was a stupid thing to do.

Buying an insecure system was stupider. (2, Insightful)

mikelieman (35628) | more than 5 years ago | (#24625461)

This is really CYA on behalf of the incompetent people running the Boston system.

They made the cheap choice ( unvalidated stored value cards w/ crappy encryption of the data ) and it bit them on the ass.

So now, someone else discovers the OBVIOUS FLAWS, and publicises the incompetence of the administration responsible.

Here's a little secret: The researchers are surely not the FIRST people to discover this. They're just pointing it out. I'm sure others are already exploiting the flaws even before the announcement.
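The "unvalidated stored value cards w/ crappy encryption" problem above can be made concrete. The following is a minimal hypothetical sketch, not the actual CharlieTicket data layout: if the balance lives on the card and is protected only by a simple check byte, rather than being validated server-side or sealed with a secret key, then anyone who can read and write the card can forge any balance, because a forger can recompute the check byte just as easily as the fare gate can.

```python
# Toy model of a stored-value card whose only integrity check is a
# checksum of the data itself. Hypothetical format for illustration;
# the real MBTA card layout is not reproduced here.

def checksum(data: bytes) -> int:
    """Toy integrity check: sum of all payload bytes, mod 256."""
    return sum(data) % 256

def encode_card(balance_cents: int) -> bytes:
    """Write a balance to the card with its check byte appended."""
    payload = balance_cents.to_bytes(4, "big")
    return payload + bytes([checksum(payload)])

def read_card(card: bytes) -> int:
    """What a gate with no server-side validation can do: verify the
    check byte, then trust whatever balance the card claims."""
    payload, check = card[:-1], card[-1]
    if checksum(payload) != check:
        raise ValueError("corrupt card")
    return int.from_bytes(payload, "big")

# Legitimate card with $1.25 on it.
card = encode_card(125)
assert read_card(card) == 125

# A forger dumps the format, writes a bigger balance, and fixes up the
# check byte. Nothing distinguishes this card from a paid-for one.
forged = encode_card(10_000)          # $100.00, never paid for
assert read_card(forged) == 10_000    # the gate accepts it
```

The usual fixes are either to keep the authoritative balance on the back end and treat the card as a mere identifier, or to bind the stored value to a MAC keyed with an issuer secret, so forging requires something the attacker does not have.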

Re:Buying an insecure system was stupider. (1)

speedtux (1307149) | more than 5 years ago | (#24627957)

This is really CYA on behalf of the incompetent people running the Boston system.

They didn't have a secure system before (tokens), why should they have one now?

They're just pointing it out. I'm sure others are already exploiting the flaws even before the announcement.

Of course, people are exploiting it, just like they were exploiting tokens before. It's factored into the overall cost.

They made the cheap choice ( unvalidated stored value cards w/ crappy encryption of the data ) and it bit them on the ass.

Economically, I don't see a problem. In fact, the rational thing is to stick with the current system.

Of course, they have a PR problem. But Obama has a PR problem over the false claims that he is a Muslim, too. The problem here is with the accusers, not the system.

and publicises the incompetence of the administration responsible.

The only incompetence I see is by a bunch of undergraduates who don't understand the economics of mass transit.

Re:why disclose at all? (0)

Anonymous Coward | more than 5 years ago | (#24626221)

How is the truth of the system's vulnerability not in the public interest? What if that system is vulnerable by design? What if the people running it WANT it that way? Think about that! The truth is what is needed, not protection from some "possible" misuse by some unknown entity. It's the misuse and mishandling by the KNOWN entities that is at stake. Only when the truth is known will corruption be uncovered. That is what the First Amendment is protecting us from. Corruption.

Re:why disclose at all? (1)

mysidia (191772) | more than 5 years ago | (#24626955)

The city government is clearly interfering with the constitutionally guaranteed right to a free press, just because it is an individual or small group that wants to publish sensitive information, and not a major media organization.

But I wonder: does the order prevent Freedom of Information Act requests that would let more-recognized journalists gain access to the information?

Major news orgs publish information leaked from anonymous sources all the time... I've yet to hear of any successful government gags (granted, such gag orders may also deny the news org the ability to mention that there IS a gag order).

Re:why disclose at all? (1)

Deefburger (1345835) | more than 5 years ago | (#24626587)

(I'm reposting because I noticed my post went up as "anonymous coward"! Not me!)

I don't agree entirely. When the problem is with a public entity, especially government, the truth should be shouted from the rooftops. Loudly! The founding fathers put the rights to free speech and free press and the right to bear arms FIRST in the list for the reason that that is our only protection from government corruption.

That's right people, corruption. That should be our first thought when such a dismal system is implemented. The founding fathers knew all about how government payola schemes worked. They knew what laws made such corruption possible. So they made FREE SPEECH a TOP priority when they wrote the constitution.

Even if the truth hurts, it only hurts for a short while. You have to ask yourself if it's worth the pain AFTER you know it. But you cannot hide from it forever. Prior restraint is the nemesis of freedom. It sets up a mechanism for hiding the truth under the cloak of public interest. Nothing could be further from the public interest!

MBTA is hiding more than just a flawed system. They are hiding the fact that it is flawed badly and quite possibly BY DESIGN. This system is flawed both externally in the ticket handling and INTERNALLY in the money handling!!! Who benefits from these flaws?

I smell a rat and it ain't at MIT!

Re:why disclose at all? (1)

speedtux (1307149) | more than 5 years ago | (#24627997)

When the problem is with a public entity, especially government, the truth should be shouted from the rooftops.

Nobody has shown there to be a problem.

They are hiding the fact that it is flawed badly and quite possibly BY DESIGN.

Of course it is "flawed" by design, except that it is not a flaw. Subway tokens were never unforgeable before, so why should that all of a sudden be a requirement?

So they made FREE SPEECH a TOP priority

I think the MIT students should be allowed to publish their findings, just like I think people should be allowed to publish obscenities on the web. But people should also recognize that what the MIT students did is stupid. People should also recognize that arguments like yours are stupid.

Re:why disclose at all? (1)

Deefburger (1345835) | more than 5 years ago | (#24629045)

But people should also recognize that what the MIT students did is stupid. People should also recognize that arguments like yours are stupid.

What is stupid about it? What is stupid about knowing the truth? What a pointless statement.

ep..%.. (-1, Troll)

Anonymous Coward | more than 5 years ago | (#24625453)

dying' crowd - N=etBSd user it has to be fun

Free subway rides: worse than blowing up the world (1, Interesting)

Anonymous Coward | more than 5 years ago | (#24625469)

The judge upheld the gag order because he realized that riding the subway for free is a bigger threat to civilization than blowing up the world. That's why the MBTA was entitled to prior restraint against the subway hackers, when the US government was not able to restrain The Progressive magazine from publishing the secret of the H-bomb in 1979.

For more info, google "morland progressive" or see the first hit:

http://www.fas.org/sgp/eprint/cardozo.html

"publicity-hungry security researchers" (1)

jc42 (318812) | more than 5 years ago | (#24625819)

Others, though, see the entire episode as yet another example of irresponsible, publicity-hungry security researchers trying to grab a few headlines.

See, this is exactly why one should always announce security problems anonymously, via one of the security lists that supply anonymity. That way, you don't get labelled publicly with such epithets. Then, when the fuss has died down and it's only the security geeks talking, you can let them know that you were the source of that "leak". That way, you get the credit (if they believe you ;-), without all the usual approbation that follows being a messenger carrying bad news.

The public in general, and specifically the people in power, don't want to hear about such things, and are going to want to punish you for telling them about it. Until they come to their senses, which won't happen soon, you're much better off working anonymously, known to only a few co-workers.

Re:"publicity-hungry security researchers" (0)

Anonymous Coward | more than 5 years ago | (#24626393)

I don't agree entirely. When the problem is with a public entity, especially government, the truth should be shouted from the rooftops. Loudly! The founding fathers put the rights to free speech and free press and the right to bear arms FIRST in the list for the reason that that is our only protection from government corruption.

That's right people, corruption. That should be our first thought when such a dismal system is implemented. The founding fathers knew all about how government payola schemes worked. They knew what laws made such corruption possible. So they made FREE SPEECH a TOP priority when they wrote the constitution.

Even if the truth hurts, it only hurts for a short while. You have to ask yourself if it's worth the pain AFTER you know it. But you cannot hide from it forever. Prior restraint is the nemesis of freedom. It sets up a mechanism for hiding the truth under the cloak of public interest. Nothing could be further from the public interest!

MBTA is hiding more than just a flawed system. They are hiding the fact that it is flawed badly and quite possibly BY DESIGN.

This system is flawed both externally in the ticket handling and INTERNALLY in the money handling!!! Who benefits from these flaws?

I smell a rat and it ain't at MIT!

Re:"publicity-hungry security researchers" (1)

jc42 (318812) | more than 5 years ago | (#24626859)

MBTA is hiding more than just a flawed system. They are hiding the fact that it is flawed badly and quite possibly BY DESIGN. This system is flawed both externally in the ticket handling and INTERNALLY in the money handling!!! Who benefits from these flaws? I smell a rat and it ain't at MIT!

Yeah; probably 90% of the population of Boston and environs would agree with you. But that's an even stronger reason that you shouldn't "shout it from the rooftops". It's a whole lot safer to release the information anonymously. There's a real chance that these guys won't be visited just by MBTA lawyers. I'll leave it to readers' imaginations just the sort of "persuaders" that may be paying them visits.

Of course, having gone so public with the information could give them some protection. Then again, the way things work, since their identities are known, they could all be involved in "unfortunate accidents" over the next few years.

In a related story (3, Funny)

arthurpaliden (939626) | more than 5 years ago | (#24626009)

In a related story, it appears the judge's home was broken into and ransacked, and several irreplaceable articles were stolen and destroyed without anyone knowing, even though it has an activated alarm and security locking system. It appears that there was a flaw in the system that enabled the perpetrators to bypass it. This flaw was known to security researchers; however, they were under a gag order and were not permitted to release this information to the general public. The gag order was applied for by the company because "if the general public knew about the flaw it would impact our revenue stream".

Who is to blame here? (1)

SeeSp0tRun (1270464) | more than 5 years ago | (#24626231)

The information was presented to the MBTA through the channels that were available to the students. Being a Boston resident myself, I know that I can't just walk into North Station (essentially the hub of the T) and speak to their security techs. Even then, I doubt the MBTA was receiving very much mail about its security issues (or at least it wasn't before this). Their failure to act on information that was in essence handed to them is their own fault.

On another note, there is nothing wrong with security researchers seeking something other than pats on the back and shaken hands. Try feeding your kids with pats on the back; breast milk only gets you so far.

They Were Right to Gag Them (0)

Anonymous Coward | more than 5 years ago | (#24626693)

Haven't you guys heard of the recent changes to the 1st Amendment? They added "unless driven by ego" to the end of it.

slogan (0)

Anonymous Coward | more than 5 years ago | (#24627011)

our new slogan:

America...even our constitution has small print

No Reason Is Good Enough (1)

b4upoo (166390) | more than 5 years ago | (#24627847)

There are always ten thousand arguments for restraining free speech, and they are all supposedly backed by dire need.

At the bottom of it all, we settled this over two centuries ago. Free speech is not up for debate. Whether it harms individuals, groups, or the whole world is simply not an issue. What is sick is allowing endless court cases over restraint of free speech. These cases should be dismissed without even being looked at.

Everyone is a criminal (0)

Anonymous Coward | more than 5 years ago | (#24629105)

What an awful law [cornell.edu]! Have you tried parsing through the monstrous law that the MBTA invoked in their complaint [mit.edu]? I had no idea such a law existed.

Basically, it says that if a hack could obtain anything of value or cause damage/DoS to the hacked computer, you may not tell anyone how to do it. It also says that you owe the computer owner compensation for anything damaged or taken (e.g. unpaid subway fares) as a result of your telling. The law pretends to be limited to important computers, but is so fuzzy that most large computer systems that anyone might want to hack can probably qualify.

I don't envy the EFF their task in defending this case. They may be stuck trying invalidate the law on 1st Amendment grounds. There ain't much sympathy these days for invalidating laws that can claim to keep safe people's health records, bank accounts, and national security data (which is what other parts of this law try to do).

If you want to be appalled, read sections (f) and (g), which say the law doesn't apply to government or to manufacturers of the computers in question. So, not everyone is a criminal, just people.
