
NSA Backdoors In Open Source and Open Standards: What Are the Odds?

timothy posted about a year ago | from the trusting-trust dept.


New submitter quarrelinastraw writes "For years, users have conjectured that the NSA may have placed backdoors in security projects such as SELinux and in cryptography standards such as AES. However, I have yet to see a serious scientific analysis of this question, as discussions rarely get beyond general paranoia facing off against a general belief that government incompetence plus public scrutiny make backdoors unlikely. In light of the recent NSA revelations about the PRISM surveillance program, and the news that Microsoft tells the NSA about bugs before fixing them, how concerned should we be? And if there is reason for concern, what steps should we take individually or as a community?" Read more below for some of the background that inspires these questions. quarrelinastraw continues: "History seems relevant here, so to seed the discussion I'll point out the following for those who may not be familiar. The NSA opposed giving the public access to strong cryptography in the '90s because it feared cryptography would interfere with wiretaps. They proposed a key escrow program so that they would have everybody's encryption keys. They developed a cryptography chipset called the "Clipper chip" that gave a backdoor to law enforcement and which is still used in the US government. Prior to this, in the 1970s, the NSA tried to change the cryptography standard DES (the precursor to AES) to reduce the key length, effectively making the standard weaker against brute-force attacks of the sort the NSA would have used.

Since the late '90s, the NSA appears to have stopped its opposition to public cryptography and instead appears to be actively encouraging its development and strengthening. The NSA released the first version of SELinux in 2000, four years after they canceled the Clipper chip program due to the public's lack of interest. It is possible that the NSA simply gave up on their fight against public access to cryptography, but it is also possible that they moved their resources into social engineering: getting the public to voluntarily install backdoors that are inadvertently endorsed by security experts because they appear in GPLed code. Is this pure fantasy? Or is there something to worry about here?"


First (-1)

Anonymous Coward | about a year ago | (#44164205)

apk is a host file


Anonymous Coward | about a year ago | (#44164213)

We need all the eyes we can get to those memory leaks!

This is stupid (5, Insightful)

Anonymous Coward | about a year ago | (#44164243)

This is fearmongering. The encryption standards that have been adopted are open, and mathematicians go over them with a fine-tooth comb before giving them their blessing. Yes, there is a worry among mathematicians about the NSA developing an algorithm that would permit a pre-computed set of numbers to decrypt all communication. Which is why they make sure it DOESN'T HAPPEN.


Re:This is stupid (2, Informative)

F.Ultra (1673484) | about a year ago | (#44164337)

Not to mention that what became AES was a Dutch(?) algorithm to begin with (Rijndael).

Re:This is stupid (5, Insightful)

arnodf (1310501) | about a year ago | (#44164461)

Belgian ffs.
Belgium, I hate it when people mistake us for Dutch!

Re:This is stupid (-1)

Anonymous Coward | about a year ago | (#44164509)

I thought Dutch was the Belgish word for Belgium? You know, because of all those Alps there.

Re:This is stupid (5, Funny)

Anonymous Coward | about a year ago | (#44164519)

Belgium - The more awesomer part of the Spanish Netherlands!

Re:This is stupid (1)

Anonymous Coward | about a year ago | (#44164579)

stop speaking dutch then!

Re:This is stupid (3, Funny)

Alranor (472986) | about a year ago | (#44164589)

It's all Greek to me

Re:This is stupid (3, Funny)

PopeRatzo (965947) | about a year ago | (#44164763)

Belgian ffs.
Belgium, I hate it when people mistake us for Dutch!

Seriously, right? They probably don't even know you guys invented spaghetti and kung fu.

I for one think the Belgs are awesome.

Re:This is stupid (4, Insightful)

kwikrick (755625) | about a year ago | (#44164413)

Fearmongering, yes.
But not impossible.
It's not so easy to make sure that a program is a correct implementation of a mathematical algorithm or of an open standard.
A subtle bug (purposeful or not) in a cryptographic algorithm or protocol can be exploited.
Writing a bug is much easier than spotting it.
Many applications and OSes get security updates almost dayly. They certainly haven't found them all yet.
Perhaps the NSA has engineered backdoors in our free software at some point, but those vulnerabilities have been patched already.
Mostly paranoia then....

Re:This is stupid (-1)

Anonymous Coward | about a year ago | (#44164449)

The word is "daily", not "dayly". And no one cares that you are named Rick.


Re:This is stupid (4, Informative)

zerro (1820876) | about a year ago | (#44164545)

It's always interesting to see what (some of the best attempts at) intentional code obfuscation can look like: []

Re:This is stupid (3, Interesting)

Anonymous Coward | about a year ago | (#44164745)

Mostly paranoia then....

Misdirection rather than paranoia. They're trying to point the finger at Linux etc. when it's SecureBoot that's the vulnerability.

When you use a board with SecureBoot, you're using pre-compromised hardware. Even when you install a secure OS, the underlying hardware hides the backdoor.

Re:This is stupid (0)

Anonymous Coward | about a year ago | (#44164769)

Writing a bug is much easier than spotting it.

So is it possible that the NSA doesn't spot bugs, since, as you say, spotting is much harder?

Is it also possible that the NSA would write buggy backdoors, and bugs, into software they want to exploit later, precisely because it is so hard to spot those bugs?

How would the NSA make sure their own algorithms, code, etc. don't include bugs?
How would open source make sure its algorithms, code, etc. don't include backdoors and bugs?

Re:This is stupid (3, Informative)

Anonymous Coward | about a year ago | (#44164417)

Also, what is left out of the summary is that the NSA worked to strengthen the S-boxes in DES against differential cryptanalysis attacks, even though the existence of such attacks was not publicly known at the time.

Re:This is stupid (4, Interesting)

Hatta (162192) | about a year ago | (#44164525)

Encryption algorithms may be secure, but how sure are you that your implementation is? Debian was generating entirely insecure SSL keys for a couple years before anyone noticed. Couldn't the NSA do something like that, but perhaps a bit more clever, and remain unnoticed?
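
The Debian case is a good illustration of how an implementation, rather than the algorithm, is what fails. A minimal, hypothetical sketch (this is not Debian's actual code; `weak_keygen` and the key size are invented, though the ~32,768 figure mirrors the PID-sized seed space involved in that incident):

```python
import random

def weak_keygen(pid: int) -> bytes:
    """Derive a '128-bit' key from a PRNG seeded with nothing but a PID."""
    rng = random.Random(pid)  # seed space is one 15-bit PID, not 128 bits
    return bytes(rng.randrange(256) for _ in range(16))

# An attacker can enumerate, offline, every key the generator could produce.
all_keys = {weak_keygen(pid) for pid in range(32768)}
print(len(all_keys))  # at most 32768 distinct keys, instead of 2**128
```

Each key still looks random in isolation, which is part of why the real flaw survived review for about two years: nothing short of counting the whole output space makes it visible.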

Yep (5, Insightful)

Sycraft-fu (314770) | about a year ago | (#44164537)

AES was developed in Belgium by Joan Daemen and Vincent Rijmen, and was originally called Rijndael. NIST put out a call for a replacement for the aging DES algorithm; Rijndael was one of a number of contenders and was the one that won the vote. The only thing the NSA had to do with it is that they weighed in on it, and all the other top contenders, before a standard was chosen and said they were all secure, and they've since certified it for use in encrypting top secret data.

It was analyzed, before its standardization and since, by the world community. The NSA was part of that, no surprise, but everyone looked at it. It is the single most attacked cipher in history, and remains secure.

So to believe the NSA has a 'backdoor' in it, or more correctly that they can crack it, would imply that:

1) The NSA is so far advanced in cryptography that they were able to discover this prior to 2001 (when it got approved) and nobody else has.

2) That the NSA was so confident that they are the only group to be able to work this out that they'd give it their blessing, knowing that it would be used in critical US infrastructure (like banking) and that they have a mission to protect said infrastructure.

3) So arrogant that they'd clear it to be used for top secret data, meaning that US government data could potentially be protected with a weak algorithm.

Ya, I'm just not seeing that. That assumes a level of extreme mathematical brilliance, that they are basically better than the rest of the world combined, and a complete disregard for one of their missions.

It seems far more likely that, yes, AES is secure. Nobody, not even the NSA, has a magic way to crack it.

Re:Yep (0)

Anonymous Coward | about a year ago | (#44164737)

Posting AC for obvious reasons.

Yep. Never seen an NSA employee here at NIST. They don't come around here often. Said no NIST employee ever.

Re:This is stupid (3, Interesting)

mitcheli (894743) | about a year ago | (#44164607)

The bigger threat to encryption isn't the pre-computed magic numbers that the NSA may or may not have placed into key algorithms; it is the advance of technology, and the consequent obsolescence of the models we currently use today [] .

Re:This is stupid (1)

davydagger (2566757) | about a year ago | (#44164683)

At the same time, there are only a handful of people who know how to read it. Plus, reading source code is not as easy as writing it.

My real question is just how much scrutiny has actually been poured over it, and by whom, instead of just making assumptions.

Re:This is stupid (1)

c0lo (1497653) | about a year ago | (#44164765)

Mmm? [] Are you sure that's enough? [] It wouldn't be quite the first time NSA would have "helped" [] someone.

Re:This is stupid (2)

quarrelinastraw (771952) | about a year ago | (#44164789)

Hi, I wrote the submission. To fearmonger is to exaggerate some threat, using fear to promote a specific end. This question is me asking to what extent caution is justified, so that I as a user can know what to do. I'm sure you can see how those things are extremely different, in fact opposites: one is an attempt to drive action with fear you know is unjustified; the other is an attempt to systematically determine the appropriate amount of caution.

"This is fearmongering" seems inappropriate as a response to a submission that contains only links to Wikipedia documenting known facts and that even goes so far as to call some proponents of this theory paranoid.

That said, thanks for the link.

Back doors are so 90s (0, Insightful)

Anonymous Coward | about a year ago | (#44164249)

Who needs back doors when you can buy an 0day for a few 100k? Backdoors are passé.

Re:Back doors are so 90s (5, Funny)

kc9jud (1863822) | about a year ago | (#44164305)

Backdoors are passé.

And so is proper Unicode support...

Well they COULD put a backdoor in some OSS... (4, Interesting)

Viol8 (599362) | about a year ago | (#44164255)

.... but unless they have the world's top obfuscating coders working there (quite possible), how long do you think it would be until someone spots the suspect code, especially in something as well trodden as the Linux kernel or the GNU utilities? I would guess not too long.

Re:Well they COULD put a backdoor in some OSS... (3, Insightful)

Anonymous Coward | about a year ago | (#44164315)

Nahhh the backdoors are in the compilers.. They've modified GCC to install it for you. Your code looks fine and the backdoor is there. Everyone wins!


Re:Well they COULD put a backdoor in some OSS... (0)

Anonymous Coward | about a year ago | (#44164381)

Ah, so RMS was the NSA's agent? Talk about hiding in plain sight!

Re:Well they COULD put a backdoor in some OSS... (4, Interesting)

cHiphead (17854) | about a year ago | (#44164329)

Why not just insert something at the compiler level? Perhaps they have compromised GCC itself, or something at a different, less obvious point in the development process?

Re:Well they COULD put a backdoor in some OSS... (5, Informative)

pegr (46683) | about a year ago | (#44164407)

Reflections on Trusting Trust [] (PDF alert). Required reading for anyone with an interest in this very topic. Written by Ken Thompson, in fact.
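
For readers who haven't seen the paper: the core trick can be caricatured in a few lines. This is a toy, hypothetical sketch (the "compiler" is just a source-to-source function, and `check_password` and the "joshua" master password are invented for the demo):

```python
BACKDOOR = 'if password == "joshua": return True  # injected'

def evil_compile(source: str) -> str:
    """A trojaned 'compiler': output equals input, except for its target.

    Thompson's real attack adds a second rule (omitted here) that
    recognizes the compiler's own source and re-inserts both rules, so
    rebuilding the compiler from clean source leaves the trojan intact.
    """
    if "def check_password" in source:
        # Recognize the login program and splice in a master password.
        return source.replace(
            "def check_password(password):",
            "def check_password(password):\n    " + BACKDOOR,
        )
    return source

clean_login = 'def check_password(password):\n    return password == "secret"\n'
print(BACKDOOR in evil_compile(clean_login))  # True: clean source, dirty "binary"
```

The point of the paper is that auditing `clean_login` tells you nothing; the trojan lives only in the toolchain, which is why "the source is available" is not by itself a complete answer.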

Re:Well they COULD put a backdoor in some OSS... (0)

Anonymous Coward | about a year ago | (#44164423)

You say that like the source to GCC isn't available....

Re:Well they COULD put a backdoor in some OSS... (1)

tnk1 (899206) | about a year ago | (#44164567)

Like the source isn't available AND it wasn't one of the pieces of code RMS himself actually works/worked on. I suppose it could happen, but if so, they did a very fine job of getting it in there.

Although, perhaps if they instead compromised the packages that are usually used to install gcc, that might work. The source code doesn't do shit for you if you're installing pre-compiled binaries...

Re:Well they COULD put a backdoor in some OSS... (1)

kthreadd (1558445) | about a year ago | (#44164829)

That has happened before []

Re:Well they COULD put a backdoor in some OSS... (2)

Rockoon (1252108) | about a year ago | (#44164571)

You say that like the source to GCC isn't available....

..and what do you compile the source with?

Re:Well they COULD put a backdoor in some OSS... (1)

Lunix Nutcase (1092239) | about a year ago | (#44164581)

You say that as if any one person understands the entirety of GCC's massive codebase.

Re:Well they COULD put a backdoor in some OSS... (0)

Anonymous Coward | about a year ago | (#44164439)

Stop giving people ideas. We already know that's our best protection since most people can't think for themselves.

Re:Well they COULD put a backdoor in some OSS... (5, Interesting)

zerro (1820876) | about a year ago | (#44164465)

Why backdoor just one brand of compiler (since there are several), when you could backdoor the architecture?
I'm pretty sure there is a special sequence of intel instructions which open the unicorn gate, and pipe a copy of all memory writes to NSA's server.

Re:Well they COULD put a backdoor in some OSS... (0)

Anonymous Coward | about a year ago | (#44164341)

You.... work for the NSA right ?

yes, LITERALLY (2)

Thud457 (234763) | about a year ago | (#44164393)

Well, Ken Thompson's backdoor has been in login.c since, like, 1984, so we have that much.

Re:Well they COULD put a backdoor in some OSS... (1)

gatkinso (15975) | about a year ago | (#44164445)

"All" they need to do is insert a very very subtle bug, and as pointed out, that bug could be in the compiler.

Re:Well they COULD put a backdoor in some OSS... (1)

Anonymous Coward | about a year ago | (#44164573)

It took 2 years to spot the OpenSSL flaw introduced in Debian. There was a trojan in UnrealIRCd for about a year before it was noticed. When someone audited xlib, they found tons of flaws that had existed for around a decade.

Re:Well they COULD put a backdoor in some OSS... (2)

Lunix Nutcase (1092239) | about a year ago | (#44164637)

Considering that security audits are actually quite a rarity it's not beyond reason to think that flaws and bugs can be introduced and go unnoticed. Just because in theory people can comb over OSS code doesn't mean that it actually happens with any regularity.

Re:Well they COULD put a backdoor in some OSS... (0)

Anonymous Coward | about a year ago | (#44164639)

I think the point is, and I find myself agreeing, that maybe we should do something more than place ALL of our hope and faith in "Oh, you know... lots of smart people out there.. someone would see it, I'm sure."

Depends (5, Interesting)

Sycraft-fu (314770) | about a year ago | (#44164653)

Check out the Underhanded C contest [] . There are great examples of code that looks innocuous, but isn't. What's more, some of them look like legit mistakes that people might make while programming.

So that is always a possibility. Evil_Programmer_A who works for whatever Evil Group that wants to be able to hack things introduces a patch for some OSS item. However, there's a security hole coded in purposely. It is hard to see, and if discovered will just look like a fuckup. Eventually it'll probably get found and patched, but nobody suspects Evil_Programmer_A of any malfeasance, I mean shit security issues happen all the time. People make mistakes.

In terms of how long to spot? Depends on how subtle it is. If you think all bugs get found real fast in OSS, you've never kept up on security vulnerabilities. Sometimes they find one that's been around for a LONG time. I remember back in 2000 when there was a BIND vulnerability that applied to basically every version of BIND ever. It had been lurking for years and nobody had caught it. Worse, it was a "day-0" kind of thing and people were exploiting it already. Caused a lot of grief for my roommate. By the time he heard about it (which was pretty quick, he subscribed to that kind of thing), their server at work had already been owned.

Don't think that just because the code is open it gets heavily audited by experts. Also don't think that just because an expert looks at it they'll notice something. It turns out a lot of security issues are still found at runtime, not by code audit. Everyone looks at the code and says "Ya, looks good to me," and only when later running it and testing how it reacts do they discover an unintended interaction.

I'm not trying to claim this is common, or even happening at all, but it is certainly possible. I think people put WAY too much faith in the "many eyes" thing of OSS. They think that if the code is open, well then people MUST see the bugs! All one has to do is follow a bug-tracking site to see how false that is. Were it true, there'd be no bugs, ever, in released OSS code. Thing is, it is all written and audited by humans, and humans are fallible. Mistakes happen, shit slips through.
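
The "looks like a legit mistake" point is easy to demonstrate. Here is a hedged, hypothetical example in the Underhanded spirit (the echo service, `SECRET`, and the missing check are all invented for illustration; in C the same omission would be a buffer over-read rather than a simulated one):

```python
SECRET = b"server-private-key-material"

def heartbeat_echo(payload: bytes, claimed_len: int) -> bytes:
    """Echo back `claimed_len` bytes of the client's payload."""
    # Adjacent "memory" is simulated here by concatenation. The absent
    # check (claimed_len <= len(payload)) reads like an honest oversight,
    # yet it lets a client read past the end of its own request.
    memory = payload + SECRET
    return memory[:claimed_len]

print(heartbeat_echo(b"ping", 4))             # an honest client gets b'ping'
print(SECRET in heartbeat_echo(b"ping", 64))  # a malicious one gets True
```

A reviewer who spots this sees a forgotten bounds check, not sabotage, which is exactly the deniability the parent comment describes.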

Why? (0)

Anonymous Coward | about a year ago | (#44164257)

Why would they care about your cryptography when they can simply use something like TEMPEST to read the plaintext or laser-acoustic eavesdropping (forgot the term for it) to listen in on you? Hell maybe they finally came up with a satellite that can do that to anyone they target.

Problem is, the cryptography is only a link in the chain.

Linux Kernel has had bugs publicly reintroduced. (5, Insightful)

CajunArson (465943) | about a year ago | (#44164265)

Last year or early this year there was a fix for a Linux kernel bug that could provide root privilege escalation. Here's the kicker though: The bug had been fixed years earlier but had been reintroduced into the kernel and nobody caught it for a very long time. For some reason, OpenSuse's kernel patches still included the bug fix, so OpenSuse couldn't be exploited, but mainline didn't reintroduce the fix for a long time.

Given the complexity of the kernel as just one example of a large open-source project, I don't really buy the "all bugs are shallow" argument from days past. That argument presumed that people *wanted* to fix the bugs, and as we all know there are large groups of people who don't want the bugs fixed. That's not to say that there is a magical NSA backdoor in Linux (and no, there isn't a magical NSA backdoor in Windows either; get over it, conspiracy fanboys). It is to say that simply not running Windows isn't enough to give you real security, and yes, your Linux box can be attacked by a skilled and determined adversary.

Re:Linux Kernel has had bugs publicly reintroduced (5, Insightful)

F.Ultra (1673484) | about a year ago | (#44164355)

If Microsoft is giving the NSA info on undisclosed vulnerabilities, then the NSA has, in effect, a magic backdoor in Windows.

Windows does have a backdoor. (5, Informative)

FriendlyLurker (50431) | about a year ago | (#44164541)

GP wrote: and no, there isn't a magical NSA backdoor in Windows either, get over it conspiracy fanboys

You are forgetting something [] . A pretty BIG BACK DOOR into Windows that has been known and confirmed for some time now.

“...the result of having the secret key inside your Windows operating system “is that it is tremendously easier for the NSA to load unauthorized security services on all copies of Microsoft Windows, and once these security services are loaded, they can effectively compromise your entire operating system“. The NSA key is contained inside all versions of Windows from Windows 95 OSR2 onwards”

Re:Windows does have a backdoor. (2, Insightful)

Anonymous Coward | about a year ago | (#44164595)

The only speculation is whether Microsoft has given the NSA et al. access to those keys so they can load what they like onto Windows (via "product updates" and whatnot) without needing UAC permission, etc. Given the recent Snowden revelations/confirmations, we can pretty much conclude that that is very much the case...

Re:Windows does have a backdoor. (1)

CajunArson (465943) | about a year ago | (#44164693)

So the NSA put in a magical untraceable backdoor that has never been found by the likes of Bruce Schneier or others in his field, but the NSA was also so stupid that they named the file "NSA_secret_evil_backdoor.dll" or something like that... yeah, whatever.

Re:Windows does have a backdoor. (1)

CajunArson (465943) | about a year ago | (#44164727)

As a followup to my other response, if this magical backdoor into every Windows system on the planet is so great, then why was there a need for Stuxnet to ever come into existence?

The NSA should have had built-in access to every Iranian Windows computer without the need for a highly complex malware package!

No not really (1)

Sycraft-fu (314770) | about a year ago | (#44164695)

They aren't giving the NSA stuff that nobody else gets. The NSA is just on the early notification list. Various groups get told about vulnerabilities as soon as MS knows about them; the rest get told when there's a patch. So sure, I guess the NSA could quickly develop an exploit for the vulnerability (if it is even relevant; amazing how few no-user-interaction, remotely initiated exploits there are now that there's a default firewall) before MS patches it, but that's not really that likely a scenario, any more than it is for the other groups that get early notification.

Re:Linux Kernel has had bugs publicly reintroduced (0)

Anonymous Coward | about a year ago | (#44164369)

You miss a major point in your FUD: having access to the source at least gives people the option to go over it. Try that with Windows, or any closed-source kernel or application.

Re:Linux Kernel has had bugs publicly reintroduced (1)

CajunArson (465943) | about a year ago | (#44164443)

Despite what you think, lots of people, including security researchers, have access to the Windows source code too.

What you are saying is that:
1. Without source code, people find security holes in Windows all the time... you do agree with that statement right?
2. With source code, only the good guys find all the security bugs and fix them so fast that they never become an issue. Oh, and all existing Linux deployments, including the embedded Linux installs in your home router/cell phone/toaster/etc. get up to the minute security fixes applied too (yeah right, and I really don't care if you personally hack your devices with daily upstream kernel commits because there are millions upon millions of devices that aren't running that way).
3. Before you start accusing other people of spewing FUD, I never said that Windows is some paragon of security. You obviously see things in a very simplistic black and white world where Windows == All Bad and !Windows == All Good. Sorry sunshine, life is a lot more complex than that.

Re:Linux Kernel has had bugs publicly reintroduced (1)

tnk1 (899206) | about a year ago | (#44164687)

You have a point, but at the same time, there are plenty of people who install pre-compiled binaries on their Linux systems too. Having the source code for what you are supposed to be running isn't the same thing as having the source code for what you *are* running.

Granted, that does make an open source application safer, if you do compile it from source, but how many people do that? And be aware that you need to make sure you're always getting the source itself from a trusted place, or the source could be compromised too. It's a simple matter of checking, of course, but many people don't.

Open source provides a means to install and operate more secure code, but you do need to take necessary precautions, and you need to make sure everyone who does it knows to take the necessary precautions.
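
The "simple matter of checking" can be made concrete. A minimal sketch, assuming the published digest came from a channel you trust more than the download mirror (a signed release announcement, say); the function names here are invented:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 16) -> str:
    """Hash a file in chunks so large tarballs don't need to fit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_download(path: str, published_digest: str) -> bool:
    # A mismatch means the bits you got are not the bits that were hashed,
    # whether through corruption or tampering.
    return sha256_of(path) == published_digest.lower()
```

Note that a checksum alone only moves the trust to wherever the digest was published; a signature over the digest file is what ties it to the release manager's key.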

Re:Linux Kernel has had bugs publicly reintroduced (0)

Anonymous Coward | about a year ago | (#44164485)

That argument was making a presumption that people *wanted* to fix the bugs, and as we all know there are large groups of people who don't want the bugs fixed.

They don't just cancel out.

Vegas odds (1)

Sparticus789 (2625955) | about a year ago | (#44164269)

I hear the Vegas odds of NSA backdoors into encryption schemes are 1000:0. Meaning everyone who bets $0 on the NSA not having a backdoor will receive $1,000 if they do.

Re:Vegas odds (1)

PhilHibbs (4537) | about a year ago | (#44164347)

I'll bet infinity dollars at those odds.

Re:Vegas odds (0)

Anonymous Coward | about a year ago | (#44164487)

I'll bet a random hash created by moving my mouse around like an angry artist attempting to draw a moving swarm of bees without lifting the pencil off of the paper.

Re:Vegas odds (2)

tnk1 (899206) | about a year ago | (#44164723)

You've fallen for it! By adding in an infinity, they can now simply renormalize their equation, and now you owe them approximately... one million dollars.

Vinnie and Joey will be over to collect momentarily.

gave me cancer (0)

Anonymous Coward | about a year ago | (#44164603)

your post just made everybody in this thread dumber.

Re:gave me cancer (1)

Sparticus789 (2625955) | about a year ago | (#44164759)

"Little: Interesting, if true. The Vegas odds tonight stand at an unprecedented 1000-0; a bet of $0 on Bender pays $1000 if he wins. Still, very few takers."
Futurama quote.

Re:Vegas odds (0)

Anonymous Coward | about a year ago | (#44164641)

Meaning everyone who bets $0 on the NSA not having a backdoor will receive a black bag over their head or a poison pellet if they do.


Alternative (1)

Anonymous Coward | about a year ago | (#44164273)

Seems to me that if they used to oppose public cryptography and are now encouraging it, then they no longer see it as a threat. Therefore I would wager that they can bypass it through some other means, such as ubiquitous backdoors in the actual hardware.

Re:Alternative (1)

Berzelius (558040) | about a year ago | (#44164389)

Yep, this, plus the fact that almost everyone uses the same (simple) passwords across multiple cloud services, and the NSA has access to those as well. Who needs to crack encryption if you have the keys?

Re:Alternative (1)

zerro (1820876) | about a year ago | (#44164713)

^ this one. ding ding ding.
Paraphrasing old Brucie on this:
Why would an attacker spend time trying to get through your steel-plated triple-deadbolted front door, when they can throw a rock through your kitchen window and crawl in?

All it takes are some unchallengeable secret court orders, and off to your nearest cloud/service provider to suck down all your datas.

Historically, NSA have done the opposite. (5, Insightful)

Anonymous Coward | about a year ago | (#44164287)

DES was developed in the early 1970s, and has proven to be quite resistant to differential cryptanalysis, which didn't appear in the public literature until the late 1980s.

During the development of DES, IBM sent DES's S-boxes to NSA, and when they came back, they had been modified. At the time there was suspicion that the modifications were a secret government back door, however when differential cryptanalysis was discovered in the 1980s, the researchers found that DES was surprisingly hard to attack. It turned out that the modifications to the S-boxes actually strengthened the cipher.

Re:Historically, NSA have done the opposite. (2)

salty pirate space m (1108753) | about a year ago | (#44164467)

Intriguing ... citation please. "Strengthened the cipher" or "mucked it up with goal X and instead supported goal Y"?

Re:Historically, NSA have done the opposite. (5, Interesting)

time961 (618278) | about a year ago | (#44164691)

Biham and Shamir, Differential Cryptanalysis of the Data Encryption Standard, at CRYPTO '92. They showed that the S-boxes were about as strong as possible given other design constraints.

Subsequently, Don Coppersmith, who had discovered differential cryptanalysis while working (as a summer intern) at IBM during the development of DES in the early 1970's, published a brief paper (1994, IBM J. of R&D) saying "Yep, we figured out this technique for breaking our DES candidates, and strengthened them against it. We told the NSA, and they said 'we already know, and we're glad you've made these improvements, but we'd prefer you not say anything about this'." And he didn't, for twenty years.

Interestingly, when Matsui published his (even more effective) DES Linear Cryptanalysis in 1994, he observed that DES was just average in resistance, and opined that linear cryptanalysis had not been considered in the design of DES.

I think it's fair to say that NSA encouraged DES to be better. But how much they knew at the time, and whether they could have done better still, will likely remain a mystery for many years. They certainly didn't make it worse by any metric available today.
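
For readers wondering what "strength against differential cryptanalysis" means operationally: one builds a difference distribution table (DDT), counting, for each input difference dx, how often each output difference dy = S[x] xor S[x xor dx] occurs; low, flat counts are good. A small sketch using the 4-bit S-box from the PRESENT cipher (chosen only because it is compact and public; DES's own S-boxes map 6 bits to 4):

```python
# PRESENT's 4-bit S-box, used here purely as a compact public example.
SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]

def ddt(sbox):
    """Difference distribution table: ddt[dx][dy] = #{x : S[x]^S[x^dx] == dy}."""
    n = len(sbox)
    table = [[0] * n for _ in range(n)]
    for x in range(n):
        for dx in range(n):
            table[dx][sbox[x] ^ sbox[x ^ dx]] += 1
    return table

table = ddt(SBOX)
# Differential uniformity: the attacker's best count over all nonzero dx.
worst = max(max(row) for row in table[1:])
print(worst)  # 4, the best achievable for a 4-bit bijection

# Contrast with a linear "S-box" (the identity): every dx maps to a single
# dy with certainty, the worst possible behavior for a cipher component.
linear = ddt(list(range(16)))
print(max(max(row) for row in linear[1:]))  # 16
```

In these terms, Biham and Shamir's observation was roughly that the DES S-boxes keep these counts about as low as their other design constraints allow, which is hard to explain as an accident.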

Real threat or open question? (2)

jetcityorange (666232) | about a year ago | (#44164289)

Is there a question in there about something specific or are you throwing pasta against the wall to see what sticks? Take AES for example. A pretty open selection process evaluating a number of known ciphers among many smart eyes. Are you saying No Such Agency pulled a fast one in broad daylight in front of multitudes or is your line of question non-specific and open ended?

Re:Real threat or open question? (1)

jeffmeden (135043) | about a year ago | (#44164359)

Is there a question in there about something specific or are you throwing pasta against the wall to see what sticks? Take AES for example. A pretty open selection process evaluating a number of known ciphers among many smart eyes. Are you saying No Such Agency pulled a fast one in broad daylight in front of multitudes or is your line of question non-specific and open ended?

It seems fair; can someone not related to the government attest to the viability of SELinux? Has anyone read/understood enough of it to know for sure? Is it right to presume that someone from the OSS community would certainly have caught on to a trick by the NSA, or is that hubris?

Re:Real threat or open question? (5, Informative)

eparis (1289526) | about a year ago | (#44164489)

I can attest to the lack of backdoors in SELinux. I am the SELinux maintainer. I'm the guy responsible for it. []

I work for Red Hat. Not for the NSA. SELinux code does not go from me through the NSA, it actually goes the other way around. The NSA asks me to put code in the Linux kernel and I pass it to Linus. I have reviewed each and every line at one point or another.

The NSA may have some magic backdoor somewhere in the Linux kernel, but I'll stake my name that it isn't in the SELinux code.

Re:Real threat or open question? (1)

fche (36607) | about a year ago | (#44164725)

You can see though why in the presence of surveillance+gag orders, even such personal assurance may be less than satisfactory. That's one problem with the scheme: even honest people+companies become suspect.

Re:Real threat or open question? (1)

asylumx (881307) | about a year ago | (#44164837)

If you're not willing to trust anyone, then there's no point talking about anything with you.

They tried scare tactics with OpenBSD (3, Interesting)

feld (980784) | about a year ago | (#44164301)

Some guy claimed to have put backdoors in the OpenBSD IPSEC stack for the FBI, but a full audit proved no such thing ever happened.

I seriously doubt this is happening in open source.

Re:They tried scare tactics with OpenBSD (1)

eer (526805) | about a year ago | (#44164587)

Ha Ha. Hahaha. I guess you missed the bit about how it is computationally infeasible (as in, halting problem) to definitively determine whether there are artifices in source or object code that deliberately mask and hide their behavior. See the Naval Postgraduate School theses and papers on how few lines of code need to be introduced to turn IPSEC implementations into clear-text relays - turned on and off via encrypted key triggers.

A few years back, it was discovered that virtually everyone's - and I mean EVERYone's - SSL and LDAP and PKI and IPSEC and SMIME and OpenSSH implementations were FILLED with defects - because they all were using the same open source ASN.1 library that was RIFE with buffer overflows.

The wonderful thing about open source code is that everyone uses it, thinking SOMEONE else MUST have vetted it, so all too many times, no one actually does.

Re:They tried scare tactics with OpenBSD (1)

AHuxley (892839) | about a year ago | (#44164677)

Some info on SSL here.
Seems the "private key" is the key in many ways too :) "for example if one of their servers were seized — all previous searches would be revealed where logged traffic is available."

Inefficient != Incompetent (4, Insightful)

sjbe (173966) | about a year ago | (#44164311)

I have yet to have seen a serious scientific analysis of this question, as discussions rarely get beyond general paranoia facing off against a general belief that government incompetence plus public scrutiny make backdoors unlikely.

Governments are not nearly as incompetent as many pundits would have you believe. We have some very seriously talented people doing some pretty amazing things in our government. Government isn't always a model of efficiency, but inefficient does not (always) equal incompetent. And in some cases inefficiency is actually a good thing. Sometimes you want the government to be slow and deliberative and to do it right instead of fast. Some of the most remarkable organizations and talented people I've met are in government. Sadly, some of the worst I've met are in government as well, but my point remains. Assuming government = incompetent is clearly wrong in the face of copious evidence to the contrary.

OpenBSD is the answer (2)

charles05663 (675485) | about a year ago | (#44164317)

With the continuing audit process and complete transparency I would trust OpenBSD along with OpenSSH, etc.

No need to (0, Insightful)

Anonymous Coward | about a year ago | (#44164325)

There are plenty of holes in the kernel and privileged programs "as is". All they have to do is find them.

The Clipper chip (5, Interesting)

Vintermann (400722) | about a year ago | (#44164333)

You mention the Clipper chip and its key escrow system guaranteeing government access, but what you should remember is that the cryptosystem that chip used was

1. Foolishly kept secret by the NSA, although it has long been understood that academic scrutiny is far more important than security through obscurity, and

2. The symmetric cipher the chip used, Skipjack, was subject to a devastating attack on the first day of its declassification (breaking half the rounds), and published attacks have since reached 31 of its 32 rounds. That remains rare for any seriously proposed cipher...

Since presumably the NSA did not try to make a broken cryptosystem (why, to help other spies? They themselves had the keys anyway!) this illustrates that yes, incompetence is a concern even at super-funded, super-powerful agencies like the NSA.

Front doors not back doors (0)

Anonymous Coward | about a year ago | (#44164373)

Front doors: look at the key exchange for HTTPS and TLS. All it takes is a man-in-the-middle attack plus a way to generate valid certificates, and any HTTPS connection can be intercepted at any point. Verisign, Thawte etc. are all NSA-establishment companies; any one of the myriad certificate companies built into your browser could be working with the NSA generating fake certificates.

That goes for code signing too, and for the auto-update of software that connects over HTTPS.

You're looking for hidden secret security holes, but missing the really, really big one: certificates.

Likewise mail protocols: they're essentially unencrypted. With SSH we do a one-time public key exchange, and thereafter the key hash is checked each time to make sure it doesn't change. We could do the same with mail protocols; we could have secure email tomorrow. But we don't, because whenever we try to introduce it, some expert tries to morph it into a certificate exchange - protecting it from first-time key intercepts, but opening it up to a MITM attack from an NSA operative. That makes it complex and less secure, so nobody uses it.

Encrypted should be the default for all comms these days.
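The SSH-style pinning described above is simple enough to sketch. A minimal trust-on-first-use (TOFU) illustration; the in-memory dict and function name are hypothetical stand-ins for a persistent known-hosts store, not any real mail protocol:

```python
import hashlib

# Hypothetical TOFU pinning, as SSH does with host keys: remember the peer's
# public-key fingerprint on first contact, refuse to proceed if it changes.
# No certificate authority is involved anywhere.
known_hosts = {}  # host -> fingerprint; a real client would persist this

def check_peer(host, public_key_bytes):
    fingerprint = hashlib.sha256(public_key_bytes).hexdigest()
    pinned = known_hosts.get(host)
    if pinned is None:
        known_hosts[host] = fingerprint   # first contact: trust and pin
        return "pinned"
    if pinned == fingerprint:
        return "ok"                       # same key as every time before
    raise RuntimeError(f"key for {host} changed -- possible MITM")
```

The first contact pins the key; later contacts succeed only while the key is unchanged, so a MITM who swaps keys trips the alarm. A MITM present at the very first contact goes undetected, which is exactly the trade-off against certificate schemes noted above.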

NSA backdoors in nature (0)

Pecisk (688001) | about a year ago | (#44164411)

Do eagles give the NSA live feeds via brain waves? Do birds and insects let the NSA collect frequencies so it can pull them together into the ultimate listening machine? You decide!

Also, I have a cloak of invisibility for sale, with NSA control-beam repellent...

Seriously, people....

Bitcoin? (5, Funny)

Fesh (112953) | about a year ago | (#44164415)

Obviously I haven't read the literature enough to know how it works or why it's impossible... But it would be really funny if it turned out that Bitcoin mining was actually the NSA's attempt at crowdsourcing brute-force decryption...

Re:Bitcoin? (1)

Sparticus789 (2625955) | about a year ago | (#44164775)

I'll raise my tin-foil hat to that theory. Either the NSA is doing that right now, or they are going to start.

Better breakers (1)

b4upoo (166390) | about a year ago | (#44164441)

Obviously the government has access to very fast computers beyond what the public has available. As computing power grows, it becomes easier for specialists to break into supposedly secure systems. We have also been in a war mode since 9/11, and all kinds of covert snooping are taking place. Deeply embedded agents do exist in this world. I have seen it first hand. Back in the 1960s, that fine young girl who spent a lot of nights in your bed and who you thought was a hippie was often some kind of cop. It was all too common.
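To put rough numbers on what "very fast computers" buys: a back-of-the-envelope sketch of expected brute-force time versus key length. The 10^12 keys-per-second rate is an assumed figure for illustration, not a claim about anyone's actual hardware:

```python
RATE = 10 ** 12                     # assumed keys tried per second (illustrative)
SECONDS_PER_YEAR = 365 * 24 * 3600

def expected_years(key_bits, rate=RATE):
    # On average a brute-force search covers half the keyspace before hitting.
    return 2 ** (key_bits - 1) / rate / SECONDS_PER_YEAR

print(f"DES (56-bit) : {expected_years(56):.4f} years")
print(f"AES (128-bit): {expected_years(128):.2e} years")
```

At this assumed rate, 56-bit DES falls in a matter of hours, while 128-bit AES still takes on the order of 10^18 years - which is why the keylength fights of the '70s and '90s mentioned in the summary mattered so much.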

Re:Better breakers (2)

dclydew (14163) | about a year ago | (#44164619)

Wait a second... a hippie from the 60's that's geeky enough to post on /.? Any girl in your bed should have been suspect!!!! ;-)

What are the odds? (1)

NoNonAlphaCharsHere (2201864) | about a year ago | (#44164481)

Close to unity.

Fearmongering. (5, Insightful)

nimbius (983462) | about a year ago | (#44164493)

OpenBSD had the same press smear. The result? There was no secret back door in the SSL libraries or in BSD.
The NSA arguably doesn't need a Linux backdoor. They own the links between you and the server. They already get preferential access to the #1 and #2 OS on every desktop and laptop, and when that doesn't cut it they've had a foot in the door of everything from Facebook to Amazon for quite a while now. The warrants and courts are secret, and the action comes with a free 'shut the fuck up' stamp to make sure you never hear a word about it.
What the NSA cares about is mostly what the government cares about: detecting and correcting civil unrest. Monitoring social networks, chat rooms and forums ensures things like Occupy never get too far out of hand. Sure, running from your basement might be safe if you're encrypting root, running SELinux and wiping disks, but the NSA will still have enough metadata from your driving patterns and network traffic to fashion a very long noose for your execution.

No need for clever cryptographic backdoors. (0)

Anonymous Coward | about a year ago | (#44164499)

Just use the concept of plausible stupidity.

IIS is roughly half of the web servers on the internet
IE is roughly half of the web browsers on the internet

When you use either IE or IIS, there is a high probability that one of them will inflict something amounting to a downgrade attack when establishing SSL tunnels, effectively undermining the security provided by any secure web browser or any secure web server.

When you apply that reasoning at large scale, it is equivalent to putting a backdoor on the whole internet, without a clever backdoor ever needing to be inserted into open source software. Most of the time there is a Microsoft product at one end of the pipe, and the pipe becomes compromised.

If you don't believe me, just compare how the major browsers and web servers choose their SSL cipher suites.
It is indeed very tedious to configure cipher preferences on a web server so that Microsoft clients use anything not vulnerable to BEAST, or anything providing perfect forward secrecy.

This is in my opinion a blatant example of backdoors "done right".
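For what it's worth, restricting your own end of the pipe is easy when you control both ends. A sketch using Python's ssl module; the cipher string is an illustrative choice, not a vetted recommendation:

```python
import ssl

# Build a TLS context that, for TLS <= 1.2, only offers ECDHE key exchange
# (forward secrecy) with AES-GCM, and drops anonymous/MD5 suites outright.
ctx = ssl.create_default_context()
ctx.set_ciphers("ECDHE+AESGCM:!aNULL:!MD5")

# Ask the underlying OpenSSL which suites the context actually enables.
enabled = [c["name"] for c in ctx.get_ciphers()]
for name in enabled:
    print(name)
```

The catch is exactly the interop problem described above: a server configured this strictly simply fails the handshake with old clients instead of degrading gracefully, which is why operators end up keeping weak suites around.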

Implementations over fundamentals (0)

Anonymous Coward | about a year ago | (#44164507)

We can look over the crypto-specific parts and make sure they are sound, but we are still vulnerable to mistakes in implementation. The Debian OpenSSL memory-initialisation bug is the elephant in the room here. If it had not been found after two years, how much longer would it have been there? Although that was a 'mistake' by two separate people (one a Debian package maintainer and one the OpenSSL upstream developer), I find it interesting that by 2011 they were both cycling around Germany for the OpenStreetMap project, and one of them was later beaten to death with his laptop by some eastern Europeans in what was made to look like a robbery.

My guess is that some people got burned by that and suspected foul play enough to take revenge.
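For scale: with the entropy-mixing code removed, the process ID was effectively the only input left to the PRNG, so on a given architecture every generated key came from a space of at most 32,768 seeds. A toy illustration of that collapse, with Python's random.Random standing in for the real generator:

```python
import random

def toy_keygen(pid):
    # Stand-in for the broken generator: the seed is nothing but the PID.
    return random.Random(pid).getrandbits(128)

# Default Linux PIDs topped out at 32768, so the entire "keyspace" can be
# enumerated on a laptop -- which is how the weak-key blacklists were built.
all_keys = {toy_keygen(pid) for pid in range(1, 32769)}
print(len(all_keys))
```

A nominally 128-bit key drawn this way carries at most 15 bits of entropy, so "find the key" becomes a lookup, not a search.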

Hanlon's razor (0)

Anonymous Coward | about a year ago | (#44164513)

"Never attribute to malice that which is adequately explained by stupidity."

But I guess that still doesn't speak to the question of whether it is happening or not.

The GSM ciphers are an interesting story (2)

time961 (618278) | about a year ago | (#44164583)

I can't find a good reference right now, but I recall reading a few years back the observation that one of the GSM stream ciphers (A5/1?) has a choice of implementation parameters (register sizes and clocking bits) that could "hardly be worse" with respect to making it easily breakable.

This property wasn't discovered until it had been fielded for years, of course, because the ciphers were developed in the context of a closed standards process and not subjected to meaningful public scrutiny, even though they were nominally "open". The implication was that a mole in the standardizing organization(s) could have pushed for those parameters based on some specious analysis without anyone understanding just what was being proposed, because the (open) state of the art at the time the standard was being developed didn't include the techniques needed to cryptanalyze the cipher effectively. Certainly the A5 family has proven to have more than its fair share of weaknesses, and it may be that the bad parameter choices were genuinely random, but it gives one pause.

Perhaps some reader can supply the reference?

The 802.11 ciphers are another great example of the risks of a quasi-open standardization process, but I've seen no suggestion that the process was manipulated to make WEP weak, just that the lack of thorough review by the creators led to significant flaws that then led to great new research for breaking RC4-like ciphers.
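To make the "clocking bits" above concrete: A5/1 steps its three LFSRs irregularly by a majority rule over one designated bit per register. A toy sketch of that mechanism follows; the register lengths, taps, and clocking-bit positions match commonly published A5/1 descriptions, but treat the whole thing as illustrative rather than a faithful GSM implementation (key loading and frame setup are omitted, and the initial states are arbitrary):

```python
def majority(a, b, c):
    return (a & b) | (a & c) | (b & c)

class LFSR:
    """Toy Fibonacci LFSR over GF(2), state held as an int."""
    def __init__(self, length, taps, clock_bit, state):
        self.length, self.taps, self.clock_bit = length, taps, clock_bit
        self.state = state & ((1 << length) - 1)

    def clocking_bit(self):
        return (self.state >> self.clock_bit) & 1

    def step(self):
        feedback = 0
        for t in self.taps:
            feedback ^= (self.state >> t) & 1
        self.state = ((self.state << 1) | feedback) & ((1 << self.length) - 1)

    def output(self):
        return (self.state >> (self.length - 1)) & 1

def keystream(regs, n):
    # Majority clocking: each cycle, only the registers whose clocking bit
    # agrees with the majority advance -- the irregular stepping at issue.
    bits = []
    for _ in range(n):
        m = majority(*(r.clocking_bit() for r in regs))
        for r in regs:
            if r.clocking_bit() == m:
                r.step()
        bits.append(regs[0].output() ^ regs[1].output() ^ regs[2].output())
    return bits

regs = [LFSR(19, (18, 17, 16, 13), 8, 0x4C2B7),
        LFSR(22, (21, 20), 10, 0x2F19A3),
        LFSR(23, (22, 21, 20, 7), 10, 0x51E3B4)]
print(keystream(regs, 16))
```

The parameter-choice argument in the comment is about exactly these knobs: the register lengths and the placement of the clocking bits determine how much internal state an attacker must guess, and a subtly bad choice is invisible without cryptanalytic techniques that didn't yet exist publicly.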

AES? Yeah right (1)

slashmydots (2189826) | about a year ago | (#44164623)

Oh yeah, I'm so sure that after this many years and this many people looking at the source code for AES implementations, nobody happened to see totally stand-out backdoor code in it. And nobody noticed the resulting weakness in cracking the encryption. That's completely ridiculous.

origins of linux (2, Funny)

lkcl (517947) | about a year ago | (#44164707)

there's a story i heard about the origins of linux, which was told to me a few years ago at a ukuug conference by a self-employed journalist called richard. he was present at a meeting in a secure facility where the effects of "The Unix Wars" were being exploited by Microsoft to good effect. the people at the meeting could clearly see the writing on the wall - that the approx. $10,000s cost of Unixen vs. the approx. $100s of windows would be seriously, seriously hard to combat from a security perspective. their primary concern was that the [expensive] Unixen at least came with source: microsoft was utterly proprietary, uncontrolled, out of control, yet would obviously be extremely hard to justify *not* being deployed in sensitive government departments based on cost alone. ... so the decision was made to *engineer* a free version of Unix. one of the people at the meeting was tasked with finding a suitable PhD student to "groom" and encourage. he found linus torvalds: the rest is history.

now we have SELinux - designed and maintained primarily by the NSA.

the bottom line is that the chances of this speculation being true - that the NSA has placed back-doors in GNU/Linux or its compiler toolchain - are extremely remote. you have to bear in mind that the NSA is indirectly responsible for securing its nation's infrastructure. adding in backdoors would be extremely foolish.

And that's why (1)

jameshofo (1454841) | about a year ago | (#44164751)

We use open source; the entire point is using something that is not under the control of any single agency, entity or company. To have a back door in mainline like that, one that isn't just considered a bug, would take the kind of creativity these organizations neither attract nor harbor on their own. So you're probably good. Besides, SELinux vulnerabilities are the least of your worries: there are probably a ton of sysadmins who administer Linux boxes from Windows machines with poor to minimal security.