
NIST Investigating Mass Flash Drive Vulnerability

Soulskill posted more than 4 years ago | from the 123456-letmein dept.

Encryption 71

Lucas123 writes with a followup to news we discussed earlier this week that the encryption on NIST-certified flash drives was cracked. "A number of leading manufacturers of encrypted flash drives have warned their customers of a security flaw uncovered by a German company. The devices in question use the AES 256-bit encryption algorithm and have been certified using the FIPS 140-2, but the flaw appears to circumvent the certification process by uncovering the password authentication code on host systems. The National Institute of Standards and Technology said it's investigating whether it needs to modify its standards to include password authentication software on host systems. Security specialist Bruce Schneier was blunt in his characterization of the flaw: 'It's a stupid crypto mistake and they screwed up and they should be rightfully embarrassed for making it.'"


If you want to encrypt your data (4, Funny)

MichaelSmith (789609) | more than 4 years ago | (#30706124)

Use PGP. Create a really long key, like 4096 bits.

Re:If you want to encrypt your data (1)

ls671 (1122017) | more than 4 years ago | (#30706132)

Bah, instead, I am still using an Enigma machine that my grandfather brought back for me. He stole it from the enemy while in combat.

http://en.wikipedia.org/wiki/Enigma_machine [wikipedia.org]

Re:If you want to encrypt your data (2, Interesting)

MichaelSmith (789609) | more than 4 years ago | (#30706140)

Bah, instead, I am still using an Enigma machine that my grandfather brought back for me. He stole it from the enemy while in combat.

http://en.wikipedia.org/wiki/Enigma_machine [wikipedia.org]

You should count yourself lucky that Alan Turing died all those years ago, otherwise your data could be compromised.

Re:If you want to encrypt your data (3, Interesting)

ls671 (1122017) | more than 4 years ago | (#30706178)

> otherwise your data could be compromised.

With this ? :

http://en.wikipedia.org/wiki/File:Bombe-rebuild.jpg [wikipedia.org]

Too complex to maintain in good working order. ;-))

Re:If you want to encrypt your data (1)

MK_CSGuy (953563) | more than 4 years ago | (#30706228)

While they don't showcase a running Bombe replica, I saw at Bletchley Park a fully working rebuild of the Colossus machine [wikipedia.org], which was used in WW2 to break the more complex encryption of the Lorenz SZ40 machine.
Since it has programmable boolean functions, I bet it could be used to break Enigma code as well.

Re:If you want to encrypt your data (1)

ls671 (1122017) | more than 4 years ago | (#30706286)

Seriously?

I think simply implementing the breaking algorithm in your favorite language on your PC would be more convenient and also give results much faster ;-))

But you would need to know that I am using an Enigma machine in the first place, oh wait...

Re:If you want to encrypt your data (3, Insightful)

MK_CSGuy (953563) | more than 4 years ago | (#30706352)

I think simply implementing the breaking algorithm in your favorite language on your PC would be more convenient and also give results much faster ;-))

You are right [bbc.co.uk] of course [wikipedia.org] :

Nevertheless the victor's 1.4 GHz laptop, running his own code, took less than a minute to find the settings for all 12 wheels... 240 times faster than Colossus. If you scale the CPU frequency by that factor, you get an equivalent clock of 5.8 MHz for Colossus. That is a remarkable speed for a computer built in 1944.

You still get massive geek cred either way :)

Re:If you want to encrypt your data (2, Informative)

evilviper (135110) | more than 4 years ago | (#30709556)

you get an equivalent clock of 5.8 MHz for Colossus. That is a remarkable speed for a computer built in 1944.

It would be incredible if true, but it's not. Special-purpose hardware can perform certain types of computations far faster than general-purpose processors. Hardware that could decode 1080i MPEG-2 (HDTV) could easily (though not inexpensively) have been made a decade before Intel/AMD CPUs were up to the task. That doesn't mean we had 2GHz+ CPUs in the early 1990s, it just means we had special-purpose hardware, which would require a 2GHz+ CPU to allow it to be replicated in software...

It's the same nonsense you get with low-power devices all the time: "OMG! This 10MHz ARM CPU is fast enough to decode H.264 videos!" Not understanding there's just a DSP slapped in the same package there, which is performing the video decoding without using the CPU for anything.

Re:If you want to encrypt your data (1)

Opportunist (166417) | more than 4 years ago | (#30706314)

Turing isn't dead as long as a single geek is alive.

Re:If you want to encrypt your data (1)

ls671 (1122017) | more than 4 years ago | (#30706472)

Does this imply that geeks need to have the same sexual orientation as he did? ;-)))

Re:If you want to encrypt your data (1)

maharg (182366) | more than 4 years ago | (#30706844)

Of course. Plus the same hairstyle ffs.

Re:If you want to encrypt your data (1)

MichaelFurey (1460819) | more than 4 years ago | (#30706864)

Are you trying to imply that geeks DO have a sexual orientation? I mean, ANY kind?

Re:If you want to encrypt your data (1)

BrokenHalo (565198) | more than 4 years ago | (#30707128)

Of course. It just involves a lot of Kleenex and having to shave palms... ;-)

Re:If you want to encrypt your data (1)

Opportunist (166417) | more than 4 years ago | (#30710314)

No, but it helps it seems.

I'm not kidding here. It's surprising how many good mathematicians and programmers, especially those in cryptography and security, are gay.

Re:If you want to encrypt your data (2, Funny)

PopeRatzo (965947) | more than 4 years ago | (#30706686)

I am still using an Enigma machine that my grandfather brought back for me. He stole it from the enemy while in combat.

You're the one who's got my grandad's Enigma machine!

Give it back. You can send it to me here in Argentina.
 

Re:If you want to encrypt your data (4, Informative)

snemarch (1086057) | more than 4 years ago | (#30706202)

Not really applicable to a hardware device.

Also, keep in mind that RSA by itself is much too slow to encrypt large amounts of data; thus, PGP and other solutions only use RSA to encrypt a symmetric cipher, which is then used for the bulk encryption.
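That hybrid pattern can be sketched with Python's `cryptography` package (a hypothetical illustration of the general scheme, not PGP's actual wire format): RSA encrypts only a random session key, and AES-GCM does the bulk work.

```python
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Long-term asymmetric keypair (PGP would use your 4096-bit key;
# 2048 bits here only to keep the demo fast).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"far too much data to feed through RSA directly " * 1000
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# 1. A fresh random 256-bit session key encrypts the bulk data (fast).
session_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, message, None)

# 2. RSA encrypts only the 32-byte session key (slow, but the input is tiny).
wrapped = public_key.encrypt(session_key, oaep)

# Receiver reverses the two steps: unwrap the session key, then decrypt.
recovered = private_key.decrypt(wrapped, oaep)
plaintext = AESGCM(recovered).decrypt(nonce, ciphertext, None)
assert plaintext == message
```

Note the asymmetric step touches only 32 bytes no matter how large the message is, which is exactly why the hybrid design is fast enough for bulk encryption.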

Standard AES-256 is actually just fine; the problem with these devices is that the manufacturers screwed up the implementation *majorly* (as I understand it, they use the same key for every device and depend on a usermode app to say GOOD_GUY/BAD_GUY to the hardware) - but that's covered elsewhere.

Re:If you want to encrypt your data (4, Insightful)

TubeSteak (669689) | more than 4 years ago | (#30707232)

Standard AES-256 is actually just fine; the problem with these devices is that the manufacturers screwed up the implementation *majorly* (as I understand it, they use the same key for every device and depend on a usermode app to say GOOD_GUY/BAD_GUY to the hardware) - but that's covered elsewhere.

The fact that so many major companies have the same exact flaw in their product suggests (to me) that there is only one manufacturer and multiple vendors who just rebadged the item.

I think it's less likely that multiple companies independently managed to screw up their products in exactly the same way.

Re:If you want to encrypt your data (1)

snemarch (1086057) | more than 4 years ago | (#30707672)

Sounds plausible; the thing that irks me is that if these USB devices really do have AES-256 capabilities and the contents of the flash RAM are encrypted... why the hell have they made such a terrible flaw implementing the rest of the system? :-s

Re:If you want to encrypt your data (1)

John Hasler (414242) | more than 4 years ago | (#30708052)

The bug is not in the key itself. It's in software that runs on the host system (which is a bug in itself). They may all have licensed the same software.

Encryption algorithm's aren't the weak link (4, Insightful)

Anonymous Coward | more than 4 years ago | (#30706128)

Encryption algorithm's aren't the weak link, its the implementation. But most people just look at how big the key is not who implemented it.

Re:Encryption algorithm's aren't the weak link (3, Funny)

Anonymous Coward | more than 4 years ago | (#30706142)

Put it in a way /. understands, please.

"It's like having a really huge penis but never leaving your mother's basement."

Re:Encryption algorithm's aren't the weak link (4, Interesting)

Kjella (173770) | more than 4 years ago | (#30706166)

Encryption algorithm's aren't the weak link, its the implementation.

What's more usually the case is that the implementation of the algorithm is just fine, but you fail at using it in the right way. Usually because then you've handed it off from the cryptography experts to the general team that's building the rest of the system. Kinda like a door that has a great lock but is easy to take off its hinges: it won't do you much good.

Re:Encryption algorithm's aren't the weak link (5, Insightful)

Anonymous Coward | more than 4 years ago | (#30706564)

My understanding of these devices puts the analogy at:

A super solid, near uncrackable/breakable safe, but all models use the same 12345 passcode, and the owners cannot change this.

To make matters worse, the door of the safe has been mounted on a normal house door that only has a sign saying 'do not open if this is not yours'.

This way, the safe owners don't need to remember such a complex code as '12345', and you get all the security of the full safe. Unless an intruder happens to have a brain; then they will just open the house door that holds the safe door, thus negating the entire system.

If anyone could design a worse system, they are truly Rube Goldberg masters.

Re:Encryption algorithm's aren't the weak link (1, Insightful)

Anonymous Coward | more than 4 years ago | (#30710690)

A slightly better analogy is the following:

A super solid uncrackable safe which has a 12345 passcode for all models. This safe, however, has a removable front panel door that covers the passcode entry input. This front panel has its own combination input that can be user-set, and when the user inputs the correct user-set passcode, the front panel automatically inputs the 12345 into the underlying safe door.

So all is well and good until someone comes along and just removes the removable front panel.

This isn't just an implementation flaw in the software from these companies, it is an overall design flaw with the devices. This cannot be fixed in software since the devices need the capability to process and verify the password themselves, which these devices apparently don't have.

Essentially, someone in management asked the question:

Hey Bob, is there a way we could make our security functionally useless while saving 25 cents per unit?

Unfortunately, they got an answer that they liked.

Re:Encryption algorithm's aren't the weak link (1)

complete loony (663508) | more than 4 years ago | (#30707066)

Or an expensive bike lock you can open with the end of a BIC pen [wired.com] ...

Re:Encryption algorithm's aren't the weak link (1, Interesting)

Anonymous Coward | more than 4 years ago | (#30707304)

What's more usually the case is that the implementation of the algorithm is just fine, but you fail at using it in the right way. Usually because then you've handed it off from the cryptography experts to the general team that's building the rest of the system. Kinda like a door that has a great lock but is easy to take off its hinges: it won't do you much good.

That's a problem with software in general, not just encryption.

Often once the coders have solved the "interesting" problems, they get bored with the mundane implementation details. If your software does everything it is supposed to, but the user experience sucks, then you have still failed. Coding isn't about creating great code. The user doesn't care about your code. They want to solve their problem. Programs are only a tool to achieve that.

Re:Encryption algorithm's aren't the weak link (1)

dayton967 (647640) | more than 4 years ago | (#30708196)

For the house you can go with many different locking systems, such as a Kwikset deadbolt. They are vulnerable to bumping and picking, though for the average house this is sufficient, because a thief would go through the window instead. Many houses just have knob locks, and these have an added vulnerability: use a jack to spread the door frame, since the frame can often move more than the latch is deep. High-security locks can be vulnerable to being kicked in or jacked open, even when they are very difficult if not impossible to bump or pick. So, as with doors, encryption requires the implementation to be done properly. AES-256 would be a high-security lock, but from what I understand of what was written, someone left the key under the door mat.

Re:Encryption algorithm's aren't the weak link (4, Funny)

Joce640k (829181) | more than 4 years ago | (#30706426)

The weak link is in the apostrophe.

they screwed up... OR (0, Troll)

Smegly (1607157) | more than 4 years ago | (#30706208)

Security specialist Bruce Schneier was blunt in his characterization of the flaw: 'It's a stupid crypto mistake and they screwed up and they should be rightfully embarrassed for making it.'"

... OR it was a deliberate mistake. Damn Germans for not playing ball.

Re:they screwed up... OR (1)

Mikkeles (698461) | more than 4 years ago | (#30708298)

It does take a special kind of stupidity to build a (supposedly) secure device which opens on the basis of "OK, you can open up now" from something external. Reminds me of the land shark skits on SNL.

Re:they screwed up... OR (1)

zippthorne (748122) | more than 4 years ago | (#30717838)

The articles so far have been pretty light on details, but I suspect it's possible that they shipped a development version that a manager saw "working" on the programmer's desk.

Supposing it actually does encrypt the data, then it seems really weird to me that it would
a) encrypt the data in hardware
while
b) storing the key in the hardware
which is
c) activated by a "ok to decrypt" code.
which is
d) sent by software on the computer comparing hashes
to a
e) hash of the key furnished for the software by the hardware

It would be much cheaper to just not bother with the encryption at all, and it would be slightly cheaper to not bother with the "okdecrypt" business and just not bother checking the validity of the key. The story could have given us enough information to make some speculation about causes, but unfortunately, it didn't.
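The steps above can be modeled in a few lines of Python to show why the design fails. All class names, codes, and hashes here are hypothetical stand-ins, since TFA gives no implementation details:

```python
import hashlib

class FlawedDrive:
    """Toy model of the reported design: the drive holds the encryption key
    and releases data to anyone who sends one fixed unlock code -- the same
    code on every unit (values are made up for illustration)."""
    UNLOCK_CODE = b"\x01\x02\x03\x04"          # identical across all devices

    def __init__(self, password: bytes):
        self._pw_hash = hashlib.sha256(password).digest()
        self.unlocked = False

    def get_password_hash(self) -> bytes:      # furnished to the host software
        return self._pw_hash

    def unlock(self, code: bytes) -> bool:     # device only checks the code,
        self.unlocked = (code == self.UNLOCK_CODE)  # never the password
        return self.unlocked

def vendor_software(drive: FlawedDrive, typed_password: bytes) -> bool:
    # The *host* compares hashes, then sends the fixed code. This is the
    # flaw: the password check never reaches the hardware.
    if hashlib.sha256(typed_password).digest() == drive.get_password_hash():
        return drive.unlock(FlawedDrive.UNLOCK_CODE)
    return False

drive = FlawedDrive(b"correct horse")
assert not vendor_software(drive, b"wrong guess")   # vendor app refuses...
assert drive.unlock(FlawedDrive.UNLOCK_CODE)        # ...but an attacker just
                                                    # sends the code directly
```

The last two lines are the whole attack: bypass the vendor app and replay the one well-known code, no password required.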

was a good investment? (-1, Troll)

harvey the nerd (582806) | more than 4 years ago | (#30706246)

Sounds like the CIA, NSA or FSB spent an effective $25,000 somewhere.

Re:was a good investment? (1)

AHuxley (892839) | more than 4 years ago | (#30706908)

From Enigma and Crypto AG to Kingston, the NSA and like-minded parts of the US gov love you all.

Significant flaw & workaround (4, Informative)

Snotboble_ (13797) | more than 4 years ago | (#30706296)

This is pretty major as so many vendors are affected by it. However, until there's an update or complete recall & replacement, I'd recommend using Truecrypt [truecrypt.org]. Certified by NIST (see HERE [law2point0.com]). Cross platform. Free (as in spoken beer ;o). Of course, one can only hope that its implementation is better than the devices currently uncovered :P

Re:Significant flaw & workaround (1)

AHuxley (892839) | more than 4 years ago | (#30706888)

Vendors are affected because they all outsourced to the same small set of firms in the 2nd and 3rd world for cheap code.
Buzzwords on the box, MS skills on the back end.
What they saved by not hiring a few smart American experts they will now lose many times over in PR spin to reinvent their failed brands.
Never forget: any junk-crypto news over the next decade or two will bring back quotes about your pathetic company.

Re:Significant flaw & workaround (1)

zippthorne (748122) | more than 4 years ago | (#30717614)

For what they save in hiring a few smart American experts they will now lose many times over in PR spin to reinvent their failed brands.

The problem is... they won't. This kind of thing should be a boon for competitors who did things correctly, but in the end no lessons will be learned, and any competitors that actually do have good products will not gain significant market share.

They might get a fine, and a junior executive might have to go work somewhere else (for more money, though...). They will see very little downside. Nothing compared to the unlimited liability for complications due to loss of users' data that they should have to deal with.

Re:Significant flaw & workaround (1)

AmberBlackCat (829689) | more than 4 years ago | (#30707200)

Where's this guy [slashdot.org] when we need him?

Re:Significant flaw & workaround (1)

greenbird (859670) | more than 4 years ago | (#30709182)

My one issue with Truecrypt is that it doesn't support multiple keys. That's almost essential in a real-world environment. Why the hell don't they add that?

Re:Significant flaw & workaround (1)

ScrewMaster (602015) | more than 4 years ago | (#30709466)

This is pretty major as so many vendors are affected by it. However, until there's an update or complete recall & replacement, I'd recommend using Truecrypt [truecrypt.org]. Certified by NIST (see HERE [law2point0.com]). Cross platform. Free (as in spoken beer ;o). Of course, one can only hope that its implementation is better than the devices currently uncovered :P

I remember a recent Slashdot article about a vulnerability in TrueCrypt. Don't remember much more than that. Does anyone else know what I'm talking about?

Re:Significant flaw & workaround (1)

TangoMargarine (1617195) | more than 4 years ago | (#30709680)

TC isn't really portable. I mean, yes, the program itself can be installed portably, but you still need admin permission to install the driver to make it actually work.

For those that don't RTFA... (1)

djupedal (584558) | more than 4 years ago | (#30706300)

"Kingston Technology Inc. ... is warning customers about a potential security threat posed by a flaw in the hardware-based AES 256-bit encryption on their USB flash drives."

Re:For those that don't RTFA... (1)

Mikkeles (698461) | more than 4 years ago | (#30708352)

They're lying; the flaw is not in the encryption, 'tis in the key management.

some vendors got it right... Trust no 1 (4, Informative)

advocate_one (662832) | more than 4 years ago | (#30706534)

IronKey was among a number of companies to issue statements reassuring customers that their devices were safe from the same attacks. Jevans said that's because the password and authentication process is contained on the USB drive itself and has nothing to do with the host system.
"We don't trust the computer at all," he said. "The computer could have malware on it or have hackers accessing it. In our security design, we said we have to assume the computer is completely untrustworthy. That's where we started our threat modeling."
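A minimal sketch of that "don't trust the host" approach, assuming a password-derived key-wrapping scheme. This is hypothetical, not IronKey's actual firmware, and XOR stands in for a real AES key wrap:

```python
import hashlib
import hmac
import os

class SaneDrive:
    """Toy contrast to the flawed drives: the device derives the unwrap key
    from the password itself, so there is no fixed unlock token a host
    could replay (hypothetical sketch, not any vendor's real design)."""
    def __init__(self, password: bytes):
        self._salt = os.urandom(16)
        kek = hashlib.pbkdf2_hmac("sha256", password, self._salt, 100_000)
        self._data_key = os.urandom(32)            # encrypts the flash contents
        # XOR-wrap the data key under the password-derived key
        # (a stand-in for a proper AES key wrap).
        self._wrapped = bytes(a ^ b for a, b in zip(self._data_key, kek))
        # A MAC lets the device tell a correct unwrap from a wrong one.
        self._check = hmac.new(self._data_key, b"verifier", "sha256").digest()

    def unlock(self, password: bytes):
        kek = hashlib.pbkdf2_hmac("sha256", password, self._salt, 100_000)
        key = bytes(a ^ b for a, b in zip(self._wrapped, kek))
        if hmac.compare_digest(
                hmac.new(key, b"verifier", "sha256").digest(), self._check):
            return key        # correct password yields the real key
        return None           # wrong password yields nothing -- there is no
                              # GOOD_GUY flag for host software to skip past

drive = SaneDrive(b"correct horse")
assert drive.unlock(b"correct horse") is not None
assert drive.unlock(b"wrong") is None
```

The point of the design: a compromised host can steal the password you type, but it cannot conjure the data key without it, because the key material itself depends on the password rather than on a yes/no answer from software.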

Re:some vendors got it right... Trust no 1 (0)

Anonymous Coward | more than 4 years ago | (#30706912)

IronKey was among a number of companies to issue statements reassuring customers that their devices were safe from the same attacks. Jevans said that's because the password and authentication process is contained on the USB drive itself and has nothing to do with the host system.

"We don't trust the computer at all," he said. "The computer could have malware on it or have hackers accessing it. In our security design, we said we have to assume the computer is completely untrustworthy. That's where we started our threat modeling."

Uh, got it right? This is what HSMs usually do; the operations are performed on the hardware itself. It's nothing spectacular.

Re:some vendors got it right... Trust no 1 (2, Funny)

AHuxley (892839) | more than 4 years ago | (#30706916)

Why not just say "Microsoft"?

Nope. How to do it right... (2, Interesting)

r00t (33219) | more than 4 years ago | (#30707504)

You need buttons on the device. Without that, your password could be swiped by PC malware.

A no-frills minimal device comes with 10 buttons. The password is a 10-digit number printed on a card hidden in the packaging. To avoid having the password revealed by button wear, none of the digits repeats. You put the device in, press buttons, and then it shows up to the OS.
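As an aside, the no-repeats rule actually shrinks the search space: ten distinct digits in some order gives 10! = 3,628,800 possible codes, versus 10^10 if repeats were allowed. A quick check in Python:

```python
import math
from itertools import permutations

distinct = math.factorial(10)                 # orderings of all ten digits
assert distinct == 3_628_800

# Brute-force confirmation by enumerating the permutations themselves.
assert sum(1 for _ in permutations(range(10))) == distinct

unrestricted = 10 ** 10                       # any ten digits, repeats allowed
assert unrestricted // distinct == 2755       # ~2755x fewer codes to try
```

So guarding against button wear costs roughly three and a half decimal digits of password entropy, a trade-off worth knowing about.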

A better device has a config setup. Press an extra recessed button, and the device appears as a USB network device with a DHCP server and all. Go to the device's internal web page, just like setting up a home wireless router. There you could create multiple virtualized devices, each with a distinct password. (If you create more than one, then the device shows up as a hub with child devices.) This also allows for data-losing recovery from password loss: you just delete the virtual device you can no longer access, then create an empty virtual device with a new password.

Re:Nope. How to do it right... (0)

Anonymous Coward | more than 4 years ago | (#30707756)

You are missing the point on security. If the host computer is hacked, then as soon as you unlock the drive, it's gone:
the hackers/crackers will just download all your data then. No, the reason for having password authentication in hardware on the USB device is so that someone can't decompile your code and get the code/command to unlock it.

Re:Nope. How to do it right... (0)

Anonymous Coward | more than 4 years ago | (#30707774)

"A better device has a config setup. Press an extra recessed button, and the device appears as a USB network device with a DHCP server and all. Go to the device's internal web page, just like setting up a home wireless router. There you could create multiple virtualized devices, each with a distinct password."... Great, so we should just add more vulnerability-ridden protocols to the device. Why don't we just leave the web-access login/password as Admin/password and not force the user to change it, just like our "secure" home networks. I'm gonna have to agree with the comment that IronKey is the only one that got it right. IN EXTERNAL DEVICE SECURITY, ALWAYS ASSUME THAT THE HOST SYSTEM IS COMPROMISED. IT'S NOT THAT BLOODY HARD.

Re:Nope. How to do it right... (1)

TangoMargarine (1617195) | more than 4 years ago | (#30709698)

And this is "not trusting the computer" how?

Re:Nope. How to do it right... (1)

r00t (33219) | more than 4 years ago | (#30716028)

Of course, a certain level of trust is implied by the use of the device. At minimum you're making one filesystem readable; it could be copied.

That doesn't mean you must make the filesystem writable unless actually needed.

That doesn't mean you must give up a password which you might (wrongly...) be using elsewhere.

That doesn't mean that you must make all filesystems available. Virtual devices allow you to make a per-computer choice.

Re:Nope. How to do it right... (1)

ascari (1400977) | more than 4 years ago | (#30709872)

And after all that, once your encrypted data is duly decrypted by you, it is displayed on the screen in cleartext. The hypothetical malware running on your untrustworthy PC simply swipes it out of the frame buffer and sends it to your enemy. (It might even encrypt it before sending it; sweet irony. ;-)

Hypothetical perhaps, but quite doable. Bottom line: if you ever look at your data in cleartext at any time on any device, I've already pwned you. Sorry to be the bearer of bad news.

Re:Nope. How to do it right... (1)

r00t (33219) | more than 4 years ago | (#30715952)

You got the current data, and maybe (if a read-write password is used) you got to mangle it.

You don't get future access. Stealing the device next year won't do you any good.

You don't get some password that I might be improperly (humans are lazy) using for something more important.

If virtual devices are in use, you only got access to the one I activated.

out in the cold (0)

Anonymous Coward | more than 4 years ago | (#30706746)

What if you spent a lot of money on one of these drives? Too bad, you're out of luck; I doubt there will be a refund.

Who talks/writes like that? (1)

kahn (549547) | more than 4 years ago | (#30706806)

by uncovering the password authentication code on host systems

See, this is why I can't watch movies anymore.

Is this a valid comparison? (1)

allseason radial (1603753) | more than 4 years ago | (#30706878)

Back in the nineties there was a teapot tempest around an issue with Intuit's Quicken personal finance software. Users could "encrypt" their Quicken data files, but the password they created to access it was easily recoverable by simply looking at the plain text raw data stored on hard drive sectors. I had Norton Utilities, which allowed me to examine drive sectors, and sure enough, there was my Quicken password (although the actual data was encrypted).

That's what this reminds me of. Is this a valid -if perhaps incomplete- comparison?

Doesn't NIST have a clue? (2, Insightful)

Gnavpot (708731) | more than 4 years ago | (#30706898)

I really do not understand this part:
"The National Institute of Standards and Technology said it's investigating whether it needs to modify its standards to include password authentication software on host systems."

This has already been proven to be very unsafe hardware. The fact that you can access the data without using the original software and without knowing the user's password should leave no doubt. As long as you have some software which says "Open Sesame" in the same way as the original software, you will get access.

So why was this not discovered during the NIST certification process? And why does NIST state that it may need to approve the software too to protect against this?

It seems to me that NIST blames the software so it will not have to take the blame for its faulty certification of the hardware.

Re:Doesn't NIST have a clue? (3, Insightful)

Anonymous Coward | more than 4 years ago | (#30707188)

NIST doesn't actually certify things as "secure"; it makes very specific certifications based on particular tests that may or may not represent what you think of as "secure". It's only the marketing that makes you think that if the words "NIST", "Certified" and "Security" appear in the same sentence, someone has done a proper end-to-end review.

It's like the way that the auditors' certificate in financial reports makes people think that the auditor is guaranteeing that there cannot be any fraud in the company and that the company is a good investment - in fact neither of those things is true.

Re:Doesn't NIST have a clue? (0)

Anonymous Coward | more than 4 years ago | (#30707322)

So why was this not discovered during the NIST certification process? And why does NIST state that it may need to approve the software too to protect against this?

Because the companies in question, the ones requesting certification, request certification for X, but not Y or Z (whether NIST can even certify Y or Z I don't know.) So they met the certification requirements for X. Now that there's such a big deal over this screw up, it'll probably make NIST fix their certification process, e.g. all requests for X certification must also be eligible for Y certification, or some such.

Re:Doesn't NIST have a clue? (3, Insightful)

Jaime2 (824950) | more than 4 years ago | (#30707618)

So why was this not discovered during the NIST certification process?

Because the certification the hardware received only verifies that the algorithm strength is sufficient and that the device is hardened against physical tampering.

It seems to me that NIST blames the software so they will not have to take blame for their faulty certification of the hardware.

Nope, it seems that the NIST has recognized that the certification, as currently written, isn't sufficient and is looking into making it more robust. Had they audited the software, they would have discovered that the software-to-hardware interface is poorly designed and not granted the certification.

Best part of TFA (2, Insightful)

dr2chase (653338) | more than 4 years ago | (#30707038)

is how everything is carefully run through the make-nice factory. The memory chip makers ucked fup. NIST ucked fup. Yet, NIST cannot say, "whoa, we blew it, we have to fix that standard immediately" (else it will be completely worthless). No, they're organizing a committee to appoint a task force to propose revisions to the standard, pending who-knows-what. And even the guys who got it right, try to make nice with a handy excuse for how this came about -- "difficult to administer with all those different passwords". You set two passwords for each device, duh, and let either access the bits. Vendors provide them with a customer-specified admin password, or vendor supplies a chip initialization utility where customer may bake in an admin password.

Re:Best part of TFA (1)

Pushpabon (1351749) | more than 4 years ago | (#30708346)

Is it so hard to say 'fucked up'? Why do you put so much effort into going around it? Almost as bad as f*ck or sh*t and other retarded censorship. You mean to say fuck and people will interpret it as fuck, but instead you write 'look at me, I'm an ass'.

Re:Best part of TFA (1)

dr2chase (653338) | more than 4 years ago | (#30708666)

My kids could read this. I have to at least make an effort, ok?

Besides, what does it bother you that I made the effort? It was my effort, and you clearly figured out what I meant.

Re:Best part of TFA (1)

ScrewMaster (602015) | more than 4 years ago | (#30709450)

My kids could read this. I have to at least make an effort, ok? Besides, what does it bother you that I made the effort? It was my effort, and you clearly figured out what I meant.

Ignore him. That's what "offtopic" moderations are for.

Doing file security the wrong way (2, Insightful)

Old Man Kensey (5209) | more than 4 years ago | (#30708456)

Any flash drive whose "security" involves a required app running on the host system will not be suitable for cross-platform use even if the app is well-written. The only right way to do it is to encrypt the data written to the drive, using well-known secure encryption algorithms run on the host. And for that purpose a cheap, dumb drive works just as well as a super-expensive "secure, tamper-proof" drive.

Re:Doing file security the wrong way (1)

ascari (1400977) | more than 4 years ago | (#30709814)

Unfortunately, not even that is enough. The current issue of 2600 magazine has a "how-to" article on breaking full disk security. Won't spoil the story by giving it away, but it does raise some question about depending on encryption for security in general.

Yep. (2, Funny)

ScrewMaster (602015) | more than 4 years ago | (#30709438)

Security specialist Bruce Schneier was blunt in his characterization of the flaw: 'It's a stupid crypto mistake and they screwed up and they should be rightfully embarrassed for making it.'"

That's our Bruce.

Wrong drive? (1)

hicksw (716194) | more than 4 years ago | (#30715362)

If all the drives have identical challenge response interchanges, why wasn't this noticed the first time someone using his own machine/software/password successfully accessed the wrong drive?

Why bother? (1)

spambucket235 (1258304) | more than 4 years ago | (#30715988)

I would never even bother to encrypt data on a flash drive.

1) If your data is so important that other people shouldn't read it, why are you putting it on a flash drive which can be stolen?

2) If, for some reason, you are storing your data on a flash drive, why are you leaving it where others can steal it? You can have my flash drive when you pry it from my cold, dead hands!

3) If you do put your data on a flash drive and somebody kills you to get it, it stands to reason that they will be able to work at their leisure to decrypt the data on it. I don't know any form of encryption that is 100% bulletproof. (Except maybe a one-time pad, assuming they don't get your key.) It's only a matter of time before your encrypted data is broken. Your only hope is that the information will go stale before they can decrypt it.

Flash drives are ubiquitous, easily stolen and easily concealable. They are, AFAIAC, the WORST place to store valuable data.

Encrypting data on a flash drive is like putting a padlock on a cardboard box.

It's not NIST, it's FIPS and reality colliding (0)

Anonymous Coward | more than 4 years ago | (#30722782)

Posting anonymously since I work at NIST (but not in crypto or infosec). The problem isn't NIST or even really FIPS 140-2. The problem is that FIPS 140-2 certifies something very specific -- the encryption module. It doesn't certify the entire system or the hardware.

This was not a big deal when FIPS 140-2 was initially written or expanded. However, now that full-disk encryption is mandated for federal computers and encrypted USB keys are something everyone buys, the reality is that FIPS 140-2 certification doesn't mean very much. Encryption grew up and moved out of the house, and as usual federal standards take some time to catch up.
