
Using Encryption Garners Exemption For Data Breach Notification

timothy posted more than 4 years ago | from the keep-your-breeches-on dept.

Privacy

Combat Wombat writes with this excerpt from the Register: "New data breach rules for US healthcare providers have come under criticism from a security firm that specialises in encryption. As part of the Health Information Technology for Economic and Clinical Health (HITECH) Act, which comes into effect from 23 September, health organisations in the US that use encryption will no longer be obliged to notify clients of breaches."


101 comments



great (3, Funny)

Savior_on_a_Stick (971781) | more than 4 years ago | (#29479151)

If the provider uses rot13, they can consider that good enough

Re:great (4, Funny)

AliasMarlowe (1042386) | more than 4 years ago | (#29479165)

If the provider uses rot13, they can consider that good enough

But they're already using rot0. Isn't that good enough?

Re:great (3, Funny)

davester666 (731373) | more than 4 years ago | (#29479377)

It's not rot0, it's rot26. And everybody knows that a higher number means it's better.

And next year, watch out for my new rot52 encryption method....

Re:great (1)

dazjorz (1312303) | more than 4 years ago | (#29479437)

I heard the new supercomputers at the NSA can already break rot1040.... Oh, technology nowadays! Amazing.

Re:great (2, Funny)

selven (1556643) | more than 4 years ago | (#29479525)

rot1040? Is that what the IRS uses to secure my private data?

Re:great (0)

Anonymous Coward | more than 4 years ago | (#29482339)

Bull, they use that new and cool, 25-round-rot24-encryption-standard

Re:great (3, Informative)

furbearntrout (1036146) | more than 4 years ago | (#29479421)

According to the pdf it has to meet FIPS 140-2 [wikipedia.org] , and implies ssl/tls level of encryption.
(IANANES, so I'm not sure just how good that is.)

I can hear people saying I must be new here but I only skimmed TFA.

Re:great (1)

Another, completely (812244) | more than 4 years ago | (#29481535)

According to the pdf it has to meet FIPS 140-2 [wikipedia.org] , and implies ssl/tls level of encryption. (IANANES, so I'm not sure just how good that is.)

It's pretty good [nist.gov] . It also has requirements for user-level authentication (machine-to-machine is not good enough) and approved key-generation algorithms. It's also actively maintained by people who know what they are doing, which makes it a much better decision than trying to write your own security requirements spec.

Why that should get you out of reporting data loss is what I don't follow. When it might just be someone sniffing the data at your ISP, you have to report it, but when you have a FIPS certification proving that any real compromise must be a serious problem, you can keep it secret?

I can hear people saying I must be new here but I only skimmed TFA.

Right. Most people wouldn't have skimmed it.

Re:great (1)

turbidostato (878842) | more than 4 years ago | (#29482395)

"Why that should get you out of reporting data loss is what I don't follow."

Remember yesterday's story about the data breach due to a spyware e-mail (http://news.slashdot.org/story/09/09/18/0011218/Spyware-Prank-Exposes-Hospital-Medical-Records)?

I stated there that "the true point is that Hospitals don't want security [...] Pitifully I won't hold my breath waiting for a multimillion exemplary fine against the hospital so others will take the issue more seriously."

Well, there you have it. Not only will these kinds of misconduct not get exemplary fines, they're even preparing the ground to avoid disclosing such incidents at all.

Re:great (1)

(Score.5, Interestin (865513) | more than 4 years ago | (#29488881)

It's pretty good. It also has requirements for user-level authentication (machine-to-machine is not good enough) and approved key-generation algorithms. It's also actively maintained by people who know what they are doing, which makes it a much better decision than trying to write your own security requirements spec.

FIPS 140 only covers algorithm and implementation details, and a little bit about key management. There's nothing in there that says you can't use an all-zero key, or prepend the key to the data, or use your company name as the key. So you can still build rot-13 out of FIPS 140-certified products (and I've seen it done on numerous occasions). All this requirement is doing is making it less obvious that something's b0rken.
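To make that concrete, here's a deliberately bad (and entirely hypothetical) Python sketch: an approved primitive like AES, driven with a hard-coded all-zero key and a fixed nonce, gives you roughly rot13-grade protection, because anyone with a copy of the software can decrypt the data. The certification covers the module, not this kind of key management. This assumes the third-party cryptography package.

# A deliberately bad use of an approved cipher: hard-coded all-zero key and a
# fixed nonce. Module certification says nothing about how keys are managed.
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

KEY = bytes(32)      # all-zero 256-bit key -- effectively public knowledge
NONCE = bytes(16)    # fixed counter block -- reused for every record

def bogus_encrypt(plaintext: bytes) -> bytes:
    enc = Cipher(algorithms.AES(KEY), modes.CTR(NONCE)).encryptor()
    return enc.update(plaintext) + enc.finalize()

def bogus_decrypt(ciphertext: bytes) -> bytes:
    dec = Cipher(algorithms.AES(KEY), modes.CTR(NONCE)).decryptor()
    return dec.update(ciphertext) + dec.finalize()

# Anyone who reads the source (or guesses the obvious key) recovers the data:
assert bogus_decrypt(bogus_encrypt(b"patient record")) == b"patient record"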

Re:great (1)

Another, completely (812244) | more than 4 years ago | (#29489039)

From Section 4.7.2 (Key Generation):

A cryptographic module may generate cryptographic keys internally. Cryptographic keys generated by the cryptographic module for use by an Approved algorithm or security function shall be generated using an Approved key generation method. Approved key generation methods are listed in Annex C to this standard. If an Approved key generation method requires input from a RNG, then an Approved RNG that meets the requirements specified in Section 4.7.1 shall be used.

Compromising the security of the key generation method (e.g., guessing the seed value to initialize the deterministic RNG) shall require at least as many operations as determining the value of the generated key.

If a seed key is entered during the key generation process, entry of the key shall meet the key entry requirements specified in Section 4.7.4. If intermediate key generation values are output from the cryptographic module, the values shall be output either 1) in encrypted form or 2) under split knowledge procedures.

Documentation shall specify each of the key generation methods (Approved and non-Approved) employed by a cryptographic module.

And the definition of a cryptographic module:

Cryptographic module: the set of hardware, software, and/or firmware that implements Approved security functions (including cryptographic algorithms and key generation) and is contained within the cryptographic boundary.

So, you might be able to abuse FIPS-certified components to build something that does ROT-13, but you shouldn't be able to get the resulting cryptographic module certified. Did any of the examples you have seen end up fulfilling U.S. government contracts that required a FIPS-140 certification?

Re:great (1)

(Score.5, Interestin (865513) | more than 4 years ago | (#29490205)

So, you might be able to abuse FIPS-certified components to build something that does ROT-13, but you shouldn't be able to get the resulting cryptographic module certified.

You don't need to get the overall result certified, that's why you're building it using FIPS-certified crypto. So what you do is get some FIPS-140 certified crypto (for example the crypto built into any copy of Windows) and then abuse it to make it about as secure as rot13. The module is certified, but it's used in an insecure manner.

Did any of the examples you have seen end up fulfilling U.S. government contracts that required a FIPS-140 certification?

Yes, pretty much all of them, since getting USG contracts is the main reason for going with FIPS-140 certified crypto in the first place.

Re:great (1)

Another, completely (812244) | more than 4 years ago | (#29490711)

Wow. That's interesting. Maybe it's just that I have been limited to working with component manufacturers; maybe it's because they are all covered by HIPAA, which has very vague borders (due to limited case history), so everyone plays it extra cautious; maybe it's because they need FDA device-class certifications and expect more scrutiny than other projects; or maybe the companies have more cautious legal departments. Whatever the reason, we have certainly met people who interpret the regulations differently in different contexts. Thanks for the information.

Re:great (1)

Man Eating Duck (534479) | more than 4 years ago | (#29479871)

If the provider uses rot13, they can consider that good enough

For extra security, use it twice. This post is encrypted with double rot13-encryption.
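In case anyone wants to check the math, a quick Python aside using only the standard library's rot13 codec: rot13 is its own inverse, so "double rot13" hands you back the plaintext.

# rot13 applied twice is the identity -- standard library only.
import codecs

msg = "Patient data is totally secure"
once = codecs.encode(msg, "rot13")    # "Cngvrag qngn vf gbgnyyl frpher"
twice = codecs.encode(once, "rot13")
assert twice == msg                   # "double rot13" == no encryption at all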

Re:great (1)

Coren22 (1625475) | more than 4 years ago | (#29479977)

Jung qvq lbh fnl? V pbhyqa'g ernq vg guebhtu gur rapelcgvba

http://web.forret.com/tools/rot13.asp [forret.com]

Re:great (1)

e4g4 (533831) | more than 4 years ago | (#29480097)

I see what you did there - one-upping the GP by using rot39...

Re:great (0)

Anonymous Coward | more than 4 years ago | (#29488113)

I'm not joking; we had to do an XOR encryption just to make confidential personal data non-plaintext. It's all in memory, so someone can read it, but at least they'll be wondering for a few minutes. Unless they find the XOR keys in memory. Security, thy name is Obfuscation.

XOR! (4, Interesting)

DarkFencer (260473) | more than 4 years ago | (#29479159)

So all they have to do is 'encrypt' it? XOR here we come!

Seriously - is there any guide to what TYPES of encryption are covered under this? Otherwise it's inane.
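To show why a bare XOR with a short repeating key doesn't count as encryption, here's a minimal standard-library Python sketch (the key and the SSN-looking strings are made up): the same call both encrypts and decrypts, and XORing two ciphertexts together cancels the key entirely.

# Repeating-key XOR: trivially reversible and self-cancelling.
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

key = b"SECRET"
c1 = xor_cipher(b"SSN: 078-05-1120", key)
c2 = xor_cipher(b"SSN: 219-09-9999", key)

assert xor_cipher(c1, key) == b"SSN: 078-05-1120"   # same call decrypts
# XOR of two ciphertexts equals XOR of the two plaintexts; the key drops out,
# which is why classic crib-dragging attacks work against this scheme.
leak = bytes(a ^ b for a, b in zip(c1, c2))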

Re:XOR! (2, Informative)

Anonymous Coward | more than 4 years ago | (#29479191)

There are guidelines, as promulgated by the FTC / HHS. If anyone feels strongly about this, you should write the agencies to change the regulations.

Re:XOR! (1)

shentino (1139071) | more than 4 years ago | (#29479703)

Since when do congresscritters ever listen to Joe Schmoe more than corporate fatcat lobbyists?

Re:XOR! (1)

calmofthestorm (1344385) | more than 4 years ago | (#29479837)

I doubt fatcat lobbyists care whether they're using AES256 or ROT13.

Re:XOR! (4, Insightful)

Anonymous Coward | more than 4 years ago | (#29479895)

and I don't either. It's the key management that is the weak point. 10-to-1 the people who claim exemptions under this rule will lose a laptop in the same bag as the usb key that decrypts the whole mess...

Re:XOR! (2, Insightful)

dgatwood (11270) | more than 4 years ago | (#29481647)

The keys alone won't do the trick. It's the password written on the Post-it note taped to the palm rest that's the bigger concern....

Re:XOR! (3, Insightful)

c_forq (924234) | more than 4 years ago | (#29479935)

There is actually a balance between the two. The Congresscritters need both votes and money to survive, so when an election is near, letter-writing campaigns can be very effective - it takes more effort to write a letter than most people are willing to put in (it is much easier just to punch the card next to the other guy's name), so a letter represents more potential votes than the letter writer alone.

Re:XOR! (0)

Anonymous Coward | more than 4 years ago | (#29480223)

nah because by the time the plebes vote, the list has been cut down to those who will support the ruling class.

Re:XOR! (0)

Anonymous Coward | more than 4 years ago | (#29479299)

No, XOR is much too low-tech. Do like the Indians did on an outsourced project I cleaned up: use Base64. Nobody can decode that!

Re:XOR! (1)

zonky (1153039) | more than 4 years ago | (#29480173)

Microsoft is "encrypting/obfuscating" domain PC passwords via the offline domain join tool that makes up part of 2008 Server R2. They're also using Base64 to do this... I do not know whether they added this 'feature' in India or Redmond, but it is just plain retarded.
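For anyone tempted to treat Base64 as protection, a tiny standard-library Python illustration (the credential string is invented): it's an encoding, not encryption, so one keyless function call reverses it.

# Base64 round-trip: no key, no secret, just an alphabet change.
import base64

blob = base64.b64encode(b"Administrator:Hunter2")   # looks scrambled...
print(blob)                                         # b'QWRtaW5pc3RyYXRvcjpIdW50ZXIy'
print(base64.b64decode(blob))                       # ...but anyone can undo it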

Re:XOR! (1)

descil (119554) | more than 4 years ago | (#29479445)

It's not inane, it's sinister.

Re:XOR! (5, Interesting)

Pieroxy (222434) | more than 4 years ago | (#29479455)

In any case, you need a key to decrypt your data. If the guy who broke in got the key along with the data, no amount of cryptography is going to help. In my experience, the key is very often kept too close to the data.

In a company I worked for, we had to set up a bridge between two web apps. We chose an SSO-like solution which worked well on paper, but the devil is in the details. The guys on the other application decided to encrypt the SSO key in JavaScript on the client.... So the key ended up in clear text in the source of the page!

Oh well....

Re:XOR! (1)

Shakrai (717556) | more than 4 years ago | (#29479799)

The guys on the other application decided to encrypt the SSO key in JavaScript on the client.... So the key ended up in clear text in the source of the page!

So just put up a EULA that forbids people from looking at the web page source code [slashdot.org] . Geez, do I have to figure out everything for you? ;)

Slashdot Cynicism (1)

mcrbids (148650) | more than 4 years ago | (#29481477)

Seems like the majority of the comments here deride this as a bad idea. But many (most?) of these same people rely on SSL and SSH to encrypt data, and purposefully send it out over a very public network, trusting the power of the encryption to protect them.

Logically, how is this really any different?

We've been using this technique for a long time now. Our client-based application uses strong encryption to protect the files. Our encryption/decryption system embeds the password as part of the encryption/decryption process. This means that if the laptop is ever stolen or lost, it does not constitute a compromise or leak of data.

Sure, it's possible that somebody could crack the encryption. But we use very standard, very trusted libraries and best practices. And even if it isn't bullet-proof, it's far, far better than no protection at all. And certainly, we aren't disclosing our encryption key in javascript!
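For what it's worth, "embedding the password" in the crypto step usually means deriving the key from a passphrase. A minimal sketch of that idea with the standard library's PBKDF2 (an assumption about how such a setup might look, not a description of their product):

# Derive an AES-size key from a passphrase with PBKDF2-HMAC-SHA256.
# The salt is stored alongside the ciphertext; the passphrase is not.
import hashlib, os

def derive_key(passphrase: str, salt: bytes, iterations: int = 200_000) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations, dklen=32)

salt = os.urandom(16)
key = derive_key("correct horse battery staple", salt)
assert len(key) == 32   # suitable as an AES-256 key for a separate cipher step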

Re:Slashdot Cynicism (1)

Pieroxy (222434) | more than 4 years ago | (#29481725)

The story is about not having to disclose a data breach if you use cryptography. This is all well and good until you consider that there are so many dumb setups out there that are just a joke. Not disclosing a security breach for those seems like a stupid idea.

Now, sure, some setups are safe. We're not talking about those.

Re:XOR! (1)

mcrbids (148650) | more than 4 years ago | (#29481491)

Wish you could edit posts!

Have you considered using one-time pads to minimize the risk of a key disclosure? Depending on your circumstances, you could actually allow full disclosure of the keys in a session and *still* have a very secure session!
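A minimal one-time-pad sketch in Python (standard library only), mostly to pin down the caveats: the pad must be truly random, at least as long as the message, used exactly once, kept secret, and shared over some other secure channel - which is the hard part.

# One-time pad: information-theoretically secure *only* if the pad is random,
# message-length, never reused, and kept secret. Distributing the pad safely
# is the unsolved part for most real systems.
import os

def otp_xor(data: bytes, pad: bytes) -> bytes:
    assert len(pad) >= len(data), "pad must be at least as long as the message"
    return bytes(d ^ p for d, p in zip(data, pad))

message = b"lab results: negative"
pad = os.urandom(len(message))        # must be pre-shared out of band
ciphertext = otp_xor(message, pad)
assert otp_xor(ciphertext, pad) == message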

Re:XOR! (0)

Anonymous Coward | more than 4 years ago | (#29485249)

Just in case you were serious - how do you transmit the one-time pad? It's a non-trivial problem, and the geniuses that put the key in the clear in the web page would probably do the same thing with all of the one time keys they were planning to use (and then re-use, of course).

Re:XOR! (3, Funny)

Idiomatick (976696) | more than 4 years ago | (#29479471)

I'd just put a sticker on the computer like this:

1 -> 0
0 -> 1

Re:XOR! (4, Funny)

selven (1556643) | more than 4 years ago | (#29479539)

I tried that and now my data is all 1s. Thanks a lot!

Re:XOR! (1)

MoreDruid (584251) | more than 4 years ago | (#29483835)

It doesn't work!!! I've got 2 computers here and nothing happens when I tape pieces of paper with that written on it against them. Do I need to print it to have it work?

What do you mean, the box-thingy on the floor is my computer?

Re:XOR! (1)

Hurricane78 (562437) | more than 4 years ago | (#29486051)

Should have used a functional language like real men then. ;)

Re:XOR! (0)

Anonymous Coward | more than 4 years ago | (#29487335)

it is reversible for some initial states

Re:XOR! (1, Funny)

Anonymous Coward | more than 4 years ago | (#29479573)

I'd just put a sticker on the computer like this:

1 -> 0
0 -> 1

If I know people at all -- and I think I do -- all you'll end up with is a bunch of bytes that look like this: 111111111.

Re:XOR! (1)

noidentity (188756) | more than 4 years ago | (#29480703)

I'd just put a sticker on the computer like this:

1 -> 0
0 -> 1

Too much coding. Just add "Note: apply this operation TWICE to recover data". Then we don't need to modify any code.

Re:XOR! (1)

cbreak (1575875) | more than 4 years ago | (#29479477)

The only provably secure encryption scheme, the OTP (one-time pad), works with XOR. The only drawback is the key length.

RC4 (2, Informative)

tepples (727027) | more than 4 years ago | (#29480049)

The only provably secure encryption scheme, the OTP (one-time pad), works with XOR. The only drawback is the key length.

Which is why you use a pseudorandom number generator to make a message-specific key stream as long as the message. As long as you never reuse a key, and your PRNG doesn't suck, you have what they call a synchronous stream cipher [wikipedia.org] . An example of a well-known stream cipher is RC4 from RSA Security. Another is any block cipher in counter mode.
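As a sketch of the "block cipher in counter mode" option (this assumes the third-party cryptography package; key and nonce handling here is illustrative): AES-CTR turns the block cipher into a keystream generator and XORs that stream over the message, so a fresh nonce per message is all that separates this from the reused-pad disaster.

# AES-CTR as a synchronous stream cipher: keystream = AES_k(counter blocks),
# ciphertext = keystream XOR plaintext. Never reuse a (key, nonce) pair.
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def ctr_encrypt(plaintext: bytes, key: bytes) -> bytes:
    nonce = os.urandom(16)                                   # fresh per message
    enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    return nonce + enc.update(plaintext) + enc.finalize()    # prepend the nonce

def ctr_decrypt(blob: bytes, key: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    dec = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
    return dec.update(ct) + dec.finalize()

key = os.urandom(32)
assert ctr_decrypt(ctr_encrypt(b"hello, records", key), key) == b"hello, records"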

And message integrity (1)

zippthorne (748122) | more than 4 years ago | (#29480301)

And message integrity, since an MITM attacker can just XOR his own fraudtext over the ciphertext.

So the two drawbacks are key length and message integrity...

Re:And message integrity (1)

cbreak (1575875) | more than 4 years ago | (#29482013)

For message integrity you can just append a hash value or a MAC.
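One caveat: an unkeyed hash appended to the message can simply be recomputed by whoever tampers with it, so in practice you want a keyed MAC over the ciphertext. A minimal encrypt-then-MAC sketch with the standard library's HMAC (the ciphertext bytes are placeholders):

# Encrypt-then-MAC: authenticate the ciphertext with HMAC-SHA256 and verify
# with a constant-time comparison before decrypting anything.
import hmac, hashlib, os

mac_key = os.urandom(32)      # independent from the encryption key

def seal(ciphertext: bytes) -> bytes:
    tag = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    return ciphertext + tag

def open_sealed(blob: bytes) -> bytes:
    ciphertext, tag = blob[:-32], blob[-32:]
    expected = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("MAC check failed -- message was tampered with")
    return ciphertext

ct = b"\x8a\x01\x7f\x33"      # pretend this came out of a cipher
assert open_sealed(seal(ct)) == ct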

Re:And message integrity (1)

KDR_11k (778916) | more than 4 years ago | (#29482299)

Wouldn't he need to know either the plaintext or the key to put any useful data into that fraudtext result?

Re:XOR! (1)

jhol13 (1087781) | more than 4 years ago | (#29479593)

Oh, boy ... XOR is the best there is ...
(quite a few stream ciphers use XOR - for a reason: a one-time pad is XOR).

Yeah, I know what you meant.

Re:XOR! (1)

Xtravar (725372) | more than 4 years ago | (#29480269)

Hey dipshit! If you actually read the law, or were in the industry affected by this, you might understand that they actually did specify a level of encryption required. I should know, because I just spent the last month upgrading our product to conform to the law. From what I was told by our security experts, AES and 3DES are acceptable.

Cool New Software? (1)

iceborer (684929) | more than 4 years ago | (#29479187)

Now, can someone direct me to a site showing how to setup this Encryption Garner Exemption software so that it will notify people of data breaches?

Or do we just need /. to hire an editor?

Who is advising these guys? (3, Informative)

electricprof (1410233) | more than 4 years ago | (#29479223)

Once again we see an example of public policy on technology being made with apparently little knowledge or regard for technology. The word "encryption" guarantees nothing. Suppose we just use Pig Latin? Ancay ouyay eadray isthay?

Re:Who is advising these guys? (1)

Tigersmind (1549183) | more than 4 years ago | (#29479511)

Yes I can.

Re:Who is advising these guys? (4, Funny)

pushing-robot (1037830) | more than 4 years ago | (#29479627)

No I can't.

Re:Who is advising these guys? (0)

Anonymous Coward | more than 4 years ago | (#29481865)

Sorry?

Re:Who is advising these guys? (1)

maxume (22995) | more than 4 years ago | (#29479737)

What does "Ancay ouyay eadray isthay?" mean?

I read it a couple of times, but I can't make anything of it.

Re:Who is advising these guys? (0)

Anonymous Coward | more than 4 years ago | (#29479947)

Wait, I'll translate for you: b[bl] pon[ya]l [eh]tot? (pron.: biuh panyal etat)

Oh wait, you probably meant english. That's Russian, sorry ;)

Re:Who is advising these guys? (0, Redundant)

DeHackEd (159723) | more than 4 years ago | (#29480531)

It's pig latin. Like the GP said.

1) Drop the "ay" suffix: Anc ouy eadr isth?
2) Move the last letter to the first: cAn you read this?

There's other rules, but this'll get you by. Apparently it's stronger than rot13.

Re:Who is advising these guys? (1)

maxume (22995) | more than 4 years ago | (#29480617)

Think about the question, my answer (The details are important!), and how likely it is that I am actually that obtuse.

Re:Who is advising these guys? (2, Funny)

aethogamous (935390) | more than 4 years ago | (#29480425)

Once again we see an example of public policy on technology being made with apparently little knowledge or regard for technology.

Once again we see an example of a comment on slashdot being made with apparently little knowledge or regard for the article.

Re:Who is advising these guys? (1)

electricprof (1410233) | more than 4 years ago | (#29480995)

Perhaps, but I recall when the supposedly impenetrable DES standard was rendered vulnerable. [Wie94] M.J. Wiener, Efficient DES Key Search, Technical Report TR244, School of Computer Science, Carleton University, Ottawa, Canada, 1994. [Wie98] M.J. Wiener, Performance Comparison of Public-Key Cryptosystems, CryptoBytes 4(1) (Summer 1998). I believe the 1998 publication suggests that a one million dollar specialized computer could exhaustively attack DES in 35 minutes - and this was in 1998, mind you. DES is now no longer considered acceptable, but to me the larger question is whether any practical encryption method applied to truly important data can ever be considered secure. Intelligence agencies often measure the need by assuming the protection can be defeated on a time scale of weeks, or perhaps months. However, protecting personal data may typically involve a timescale of years or even decades. I didn't need to read the article at all to know that the very idea of not reporting a security breach on data of this importance is idiotic.

Re:Who is advising these guys? (1)

aethogamous (935390) | more than 4 years ago | (#29481315)

Fair enough, and regardless of the encryption I would have to agree that the very idea is idiotic.

Re:Who is advising these guys? (1)

rubi (910818) | more than 4 years ago | (#29481155)

Once again we see an example of public policy on technology being made with apparently little knowledge or regard for technology. The word "encryption" guarantees nothing. Suppose we just use Pig Latin? Ancay ouyay eadray isthay?

As technology has been "democratized", we have gained a lot of what I call "magazine tech experts": they read something vague in a magazine or web page (usually not a publication specialized in technology or computer science) and go on from that.

Using Encryption Garner Exemption For Data Breach (1)

falconwolf (725481) | more than 4 years ago | (#29479229)

Guess who wrote or helped write the law...

Those who would have to follow the law and regulations. That's the problem with regulations: the industry that is regulated writes those regulations, which then helps cut down their competition.

Falcon

Re:Using Encryption Garner Exemption For Data Brea (3, Interesting)

belthize (990217) | more than 4 years ago | (#29479497)

Having just read through the document, and as some other folks have posted further down, it's not nearly as bad as you're implying; it's actually *less* friendly to health agencies where reporting rules are concerned.

It's certainly written in typical bureaucrat/lawyer speak but for individuals it's a clear improvement over the current state of affairs.

In terms of the form of these documents, I wonder if a collaborative re-write type project would fly. Get volunteers to re-write the document such that the intent and legality don't change but the readability is greatly increased. I noted several times where the general ordering of the document was not terribly linear, they repeated themselves or used very confusing sentence structure.

Re:Using Encryption Garner Exemption For Data Brea (1)

Runaway1956 (1322357) | more than 4 years ago | (#29479959)

"I noted several times where the general ordering of the document was not terribly linear, they repeated themselves or used very confusing sentence structure."

Psst! Excuse me, Mr. Belthize? Please, trade me papers. You've got the encrypted copy, and that's "Top Sekritz". Thank you sir!!

Re:Using Encryption Garner Exemption For Data Brea (1)

falconwolf (725481) | more than 4 years ago | (#29480303)

It's certainly written in typical bureaucrat/lawyer speak but for individuals it's a clear improvement over the current state of affairs.

And guess whose bureaucrats and lawyers were involved. I would think the average or typical person could sit down and think - heck, think while walking - that any entity breaching privacy should be liable for damages, financial or otherwise, caused by that breach, as well as an amount X paid to those who suffered because of it. The only disagreement I see is over the amount of X.

Falcon

A breach is a breach (2, Informative)

mathfeel (937008) | more than 4 years ago | (#29479265)

whether it's encrypted or not. With encryption it is (in principle) harder. The weakest link is usually not the computer engineering but social engineering anyway.

Re:A breach is a breach (2, Insightful)

R2.0 (532027) | more than 4 years ago | (#29480397)

"The weakest link is usually not the computer engineering but social engineering anyway."

And that's why that exception is there - to protect the companies who have poor policies and weak personnel controls. How many doctors are walking around with their passwords on a sticky on the back of their ID badges? And how many even know policies against that exist, much less care about them?

It's like making the law.. (3, Insightful)

mysidia (191772) | more than 4 years ago | (#29479341)

If you wear your seatbelt, you don't have to buy auto insurance or report a crash you are involved in.

Because if everyone is wearing their seatbelt, it's impossible for anyone to get hurt.

That's basically the same logic behind not reporting a data breach if encryption was used.

*Not even considering how secure the keys are, or whether the intruder might have gotten some usable data.

Businesses that use encryption for communications rarely encrypt everything.

Re:It's like making the law.. (1)

rubi (910818) | more than 4 years ago | (#29481161)

If you wear your seatbelt, you don't have to buy auto insurance or report a crash you are involved in.

Because if everyone is wearing their seatbelt, it's impossible for anyone to get hurt.

That's basically the same logic behind not reporting a data breach if encryption was used.

*Not even considering how secure the keys are, or whether the intruder might have gotten some usable data.

Businesses that use encryption for communications rarely encrypt everything.

The considerations, I think, were made more to protect the "reputation" (if they have one to begin with) of the companies affected by such breaches. In my country they have made it illegal to publish anything about a bank if it can be denounced as a rumor, solely to protect the "reputation" of those banks. The same principle applies to not reporting breaches.

Encryption methodology is defined (5, Informative)

sthomas (132075) | more than 4 years ago | (#29479345)

The method of encryption is defined in the law; it adopts the standards set forth by NIST, and there is a mechanism to update what is acceptable annually through published Guidances. This law is an improvement over what was previously in place. Read the HIPAA Security and Privacy rules as last updated in 2005, and then look at the major steps forward HITECH makes.

That future Guidances can update the standards without having to send a law through Congress is also going to allow for future improvements in security. HITECH was part of the economic recovery act (ARRA), which shows how difficult it was for HIPAA to get updates - this had to be tacked onto an unrelated must-pass bill.

This article is from an encryption vendor who is stating that most encryption products are what he calls "point-to-point" encryption. I bet he considers his own product not to be, thus it is superior, and thus HIPAA should require all companies to buy his products.

For those of you who think "encryption" is left up to the governed:

The HHS Guidance identifies four situations where paper or electronic data may be vulnerable to a breach, and suggests appropriate safeguards to secure the PHI:

                    - "Data at Rest". This is data that resides in databases, file systems, and other structured storage methods. The HHS Guidance points to the National Institute of Standards and Technology Special Publication 800-111, Guide to Storage Encryption Technologies for End User Devices as the approved methodology.
                    - "Data in Motion". This is data that is moving through a network, including wireless transmission. The HHS Guidance points to specific requirements in Federal Information Processing Standards (FIPS) 140-2 which include, as appropriate, standards described in NIST Special Publications 800-52, Guidelines for the Selection and Use of Transport Layer Security (TLS) Implementations; 800-77, Guide to IPsec VPNs; or 800-113, Guide to SSL VPNs, and may include others which are FIPS 140-2 validated.
                    - "Data Disposed". This is discarded paper records or recycled electronic media. The electronic media must have been cleared, purged, or destroyed consistent with NIST Special Publication 800-88, Guidelines for Media Sanitization, such that the PHI cannot be retrieved. For discarded paper records, PHI would need to be shredded or destroyed in a manner that precludes reconstruction.
                    - "Data in Useâ. This is data in the process of being created, retrieved, updated or deleted. The encryption and destruction processes described above, along with the general HIPAA safeguards, will apply to all data in use.
 

Re:Encryption methodology is defined (4, Informative)

sthomas (132075) | more than 4 years ago | (#29479447)

There's an excellent overview by a law firm here:

http://www.faegre.com/showarticle.aspx?Show=8969

"Previously, covered entities were obligated to mitigate harm caused by unauthorized disclosures of protected health information, but not required to give notice to the individuals whose information was inappropriately disclosed. Going forward, covered entities and business associates will be required to notify individuals when security breaches occur with respect to "unsecured" information. Unsecured information means information not protected through technology or methods designated by the federal government. In addition, if the breach involves 500 or more individuals, notice to the federal Department of Health and Human Services and the media is also required."

Re:Encryption methodology is defined (0)

Anonymous Coward | more than 4 years ago | (#29479643)

The FLAW still exists, however.

Those 'standards' are all well and good, but when the leak is an employee who has access to ALL of that data in its unencrypted form, the obligation to report the breach to the covered individuals does not apply.

Informing the covered individuals in this scenario SHOULD be required. Under the proposals put forth, it WOULDN'T be. And that is why it is dangerous.

Re:Encryption methodology is defined (2, Interesting)

sthomas (132075) | more than 4 years ago | (#29479821)

Quit trolling. If the access is to unencrypted data and that data is compromised, notification is required. The exemption from notification is only for "secured" data. Unencrypted data is not "secured".

Re:Encryption methodology is defined (2, Insightful)

dkf (304284) | more than 4 years ago | (#29480169)

when the leak is an employee who has access to ALL of that data in its unencrypted form

Why would the system be giving an employee access to all the data in unsecured form? That'd be a mark of a very badly designed system. But if, "if" mind you, such a breach were to occur, the company wouldn't be eligible for getting out of notification.

Of course, the most likely weak-point is the legitimate end-users and their workstations. They have to have access (it's more important that they save the patient's life than keep their data secure) and you'll never persuade a large proportion of them to have good data hygiene. End users regard security as a bolt-on feature, like a spelling checker or other such; they just don't really value it.

Re:Encryption methodology is defined (1)

zippthorne (748122) | more than 4 years ago | (#29480347)

It's hard enough to convince them to have good actual hygiene...

Re:Encryption methodology is defined (0)

Anonymous Coward | more than 4 years ago | (#29481679)

Why would the system be giving an employee access to all the data in unsecured form? That'd be a mark of a very badly designed system.

Because the system isn't sealed in concrete and buried at sea?

We do all sorts of encryption and access control for our database, but if I were to walk into the data center, log into the db server's console, and pull up a database connection as the database administrator... wow, it's full of data!

I'm sure there are systems that can lock even the administrator out, but I trust those less than I trust myself.

Re:Encryption methodology is defined (1)

Another, completely (812244) | more than 4 years ago | (#29481921)

It's pretty common for the contents of a table to be encrypted. The administrator might see, for example, that ten new patients were added today, but there is no reason for that data to be available in plaintext. A quick google for references gave this practical introduction [devx.com] .
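A minimal sketch of that idea (field-level encryption in the application, using the third-party cryptography package's AES-GCM; the table and column are invented): the database, and therefore the DBA, only ever sees ciphertext, while the key lives with the application.

# Encrypt a sensitive column in the application; the DBA sees only ciphertext.
import os, sqlite3
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # held by the app / KMS, not the DB
aead = AESGCM(key)

def encrypt_field(value: str) -> bytes:
    nonce = os.urandom(12)                           # fresh per value
    return nonce + aead.encrypt(nonce, value.encode(), None)

def decrypt_field(blob: bytes) -> str:
    return aead.decrypt(blob[:12], blob[12:], None).decode()

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE patients (id INTEGER PRIMARY KEY, diagnosis BLOB)")
db.execute("INSERT INTO patients (diagnosis) VALUES (?)",
           (encrypt_field("rot13 deficiency"),))
stored = db.execute("SELECT diagnosis FROM patients").fetchone()[0]
assert decrypt_field(stored) == "rot13 deficiency"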

Re:Encryption methodology is defined (1)

burning-toast (925667) | more than 4 years ago | (#29483389)

As I've brought up before, I worked in the IT department of a fairly large medical transcription company before. No employee was given access to patient medical records in our organization unless it was required, all access was logged, and violations were dealt with in a very heavy handed manner. There are always going to be a few breaches (including in our own organization) but the repercussions were serious enough to force people to pay attention.

We didn't give our end users the OPTION to ignore regulation. We had a member of our staff as a compliance officer (as regulation dictates) and we regularly audited our systems for unintended access. Because the regulations posed very serious consequences for non-compliance - our contracts with the hospitals and clinics included sanctions and contract-termination clauses should we cause violations - the employees, by extension, were absolutely NOT allowed to be lax.

We fired a good number of people who did things like personal shopping / web browsing / etc. on the computers which were primarily used for transcription work. We even fired an employee who refused to run anti-virus on their machine, and we went so far as to provide the PCs themselves (preloaded with all of the applications necessary) to be sure we could enforce those restrictions.

We did catch one employee looking for records on someone she knew (a friend) who happened to be treated at a hospital which was our client. Not only did this require a substantial amount of paperwork to rectify (including letters to the patient, even though the employee never got a record pulled up), but the employee was fired, fined, and had her MT certification revoked, and is likely to never work anywhere near healthcare organizations again.

So while this was an anecdotal story, you might rest a bit easier knowing that in MOST places HIPAA regulations are taken very seriously, and every organization I worked with over those years was very protective of your medical records.

Re:Encryption methodology is defined (1)

burning-toast (925667) | more than 4 years ago | (#29483279)

You should know that when an employee is able to access records unfettered and unauthorized, it already IS a breach of HIPAA regulations. I've worked for a medical transcription company, and while we were not directly held to every regulation, we DID have to notify the relevant authorities and hospitals (and file a ton of other paperwork) if patient records were accidentally distributed to the wrong people, INCLUDING our own staff who did not have a need to know. (This happened once or twice via incorrect distribution groups in e-mail; it was a very irritating situation, but we took many steps to prevent such issues.)

Internal employees do not get unfettered access to records even within an organization. Additionally, it is an actionable offense if they redistribute or use the information they uncover for any purpose not considered acceptable in the regulations, and the employer can be levied very large fines for employee misuse of patient records.

Re:Encryption methodology is defined (0)

Anonymous Coward | more than 4 years ago | (#29479699)

That is great information. Thanks. But shouldn't people still be informed if the data is stolen, even if it is encrypted? What if the standards are being specified thoroughly, but aren't being followed? All it takes is someone using social engineering, sniffing passwords via keyloggers, or some other flaw in the implementation and the encryption doesn't mean squat.

People should still be informed.

Re:Encryption methodology is defined (0)

Anonymous Coward | more than 4 years ago | (#29479749)

Barring any unknown exploit, data that is encrypted with AES is useless without the key.

Re:Encryption methodology is defined (0)

Anonymous Coward | more than 4 years ago | (#29481139)

Thank you; I found myself at each "just XOR it" message wanting to shout in their faces "I'm SURE they mandated FIPS compliance," but realized the average Slashdot reader wouldn't know FIPS from CHiPs.

The actual document (5, Informative)

belthize (990217) | more than 4 years ago | (#29479401)

The actual document is here:
http://www.hhs.gov/ocr/privacy/hipaa/understanding/coveredentities/federalregisterbreachrfi.pdf [hhs.gov]

I started to post several derogatory comments as I read through it but eventually I came to the conclusion that while nearly unfathomable to most readers it doesn't completely suck.

In several cases they specifically ask for comment from the public where they think there may be valid concern and I think they accurately identified the weak links where they requested comment. If you have an opinion you might consider posting it there rather than (or in addition to) here.

They do actually address reporting breaches of encrypted data where that encryption could arguably have been broken or circumvented.

I don't quite understand the logic of not simply reporting any breach but it's hardly the disaster it's being made out to be.

Re:The actual document (5, Insightful)

fluffy99 (870997) | more than 4 years ago | (#29479847)

Congratulations, you're one of the few people that read the article or the document itself. My take on this is that if end-end encryption was used, meaning the actual files lost were still securely encrypted and the keys were not compromised, then the data owner does not have to report it as compromised data. Sounds reasonable to me.

The Act is also a huge motivator for these agencies to implement encryption in a secure manner, thereby avoiding the whole mess that happens every time a laptop gets stolen and they don't know what files were actually on it.

Re:The actual document (1)

LifesABeach (234436) | more than 4 years ago | (#29483159)

It's obvious that the companies involved are creating a CYA clause, nothing new in business case studies. But what's really interesting are some of the fundamental issues that are emerging from the Health Care Question.
  • What harm can come from anyone knowing your current health status?
  • What CAN Health Care Companies do BETTER than government-subsidized health care plans?
  • What are the impacts of Genetic Therapies on the Pharmaceutical Industries?
  • Changing the Health Care Business Model from one of "Repair It" to "Prevent It"
  • And my favorite, Mediocrity

Dream job (1)

Idiomatick (976696) | more than 4 years ago | (#29479465)

I know everyone is thinking it, so I'll just put it out there.

I want to be the guy who gets paid to make cool acronyms for the government. ARTEMIS and ATLAS could have been my words! EYE WIL GETT this JOB if it's the LAST thing EYE DO (I believe that had something to do with NASA...).

Re:Dream job (1)

selven (1556643) | more than 4 years ago | (#29479549)

Let's get some free software people doing this job.

BOGUS = BOGUS Open Government for United States

Re:Dream job (4, Funny)

MurphyZero (717692) | more than 4 years ago | (#29479571)

I just know I don't want to be in charge of the Fully User Capable Key Encryption Device program.

Several thoughts... (1)

jaypifer (64463) | more than 4 years ago | (#29479637)

1) I really doubt that they were running out and telling everyone of their breaches in the first place. Unless a corporation has a gun to its head they tell the public nothing. Not that I really blame them, it's not exactly profitable to announce such things.

2) Anyone who has worked in industries where encryption is "required" laughs scornfully at press releases like these. We'll see a rush of band-aid solutions to meet the bare minimum; then, over time (say one year), even that minimum will be forgotten.

3) I would like to see a penalty that says something like "Any healthcare provider that has claimed HITECH status that is then subsequently breached AND the breach reveals lack of encryption pays X amount of dollars per account fine."

Re:Several thoughts... (0)

Anonymous Coward | more than 4 years ago | (#29483561)

1.) Then you have obviously not worked in health care. HIPAA is the gun to the corporation's head in this case. No company wants to be seen as a HIPAA violator in this day and age.
2.) This is not an industry where encryption is always required. Companies are still allowed to use such devices as plain paper fax (to departmental fax machines, no less), but there ARE regulations on how patient data is treated when it is likely possible to intercept it (unencrypted e-mail, for instance). In plain English, the HIPAA regulations basically work like this:
If you are sending a patient record where people unrelated to your target organization may read it, it must be encrypted.
If it is possible that an unintended employee at the intended organization may see it, you provide a cover letter indicating the proper destination.
If you are storing records where many people have access, access must be restricted and logged.
If the system is accessible from the Internet, it should be encrypted and access should be restricted and logged. HIPAA regulates the minimum here, and violators face a number of potential punishments, many of which are very effective deterrents.
3.) I think you are putting too much trust in encryption. HIPAA regulation first attempts to address the social and economic behaviors which open records up to accidental or malicious distribution; it addresses the behaviors which encryption does nothing for. This new regulation appears to be attempting to complement the social / economic safeguards with some technological ones, but it is a work in progress. If you want real protection of your records in the health care industry, then I would suggest asking your clinic / hospital to detail the steps they have taken to remain HIPAA compliant. It also helps to actually read through some of the legislation so you know better what exactly is being implemented here.

The biggest point is that HIPAA doesn't regulate on some trivial and inane thing like "Were they using 512-bit encryption and model 1298AA23B fingerprint scanners?" Instead it regulates on the matter of "Were the proper and reasonable safeguards taken to prevent misuse?", with "proper" and "reasonable" generally being detailed either in different parts of the HIPAA act or in other legislation (at minimum a judge can decide as well). Also, health care providers do not get to elect to be called HIPAA compliant, as if it were some sort of certification review. If they are NOT compliant they get fined and face other penalties. If they persist in being non-compliant they can simply be shut down.

Re:Several thoughts... (0)

Anonymous Coward | more than 4 years ago | (#29491997)

Sounds like we agree completely.

1) Therefore they will never tell anyone that they are in flagrant violation.

2) Encryption is and will always be a problem.

3) I place no trust in encryption, that's why I'd like to see a fine if there were a breach and post-mortem revealed a lack of encryption. (Did you read my point?)

You place too much trust in the law rather than what actions the law will motivate. In this case...minimums. Internal post-mortems will *always* show that "proper and reasonable safeguards" were taken.

Maybe I missed something (1)

hyades1 (1149581) | more than 4 years ago | (#29479789)

I'll admit I only scanned TFA, but it seems to me that the situation is this: If they use encryption, companies that failed to protect their data banks don't have to notify those most intimately concerned that the data has been illegally accessed. At the same time, the people who would steal data AND break the encryption are those who have a real intention of using it. It's a safe bet that the use they have in mind is not one the people most directly concerned would approve of.

So under this system, the breaches most likely to cause real harm to people whose personal information has been compromised are precisely the ones that will go unreported.

I know I could have explained this better, but I'm in a hurry. Basically, if you've been told somebody got into a data bank and accessed your personal info, you can pay special attention to whatever is vulnerable. Usually nothing nasty will happen, but at least you're aware of the circumstances. If you aren't aware of the situation, though, you won't be particularly alert. And that will be exactly when you most need to be on guard against the most dangerous kind of data pirate...one who is willing to jump through BOTH metaphorical hoops: initial hack AND decryption.

Encryption ain't cheap (0)

Anonymous Coward | more than 4 years ago | (#29480237)

I work in IT in a rural hospital. Right now our s/w vendor doesn't even recommend data encryption. We can't proceed until they give the go-ahead.
And, btw, I don't hear any of you posters whining about the cost of healthcare. This is just another cost added to the cost of healthcare. (How many of you are aware that in the US, Medicare only pays approximately 85% of the COST; Medicaid in some states only pays 55% of the COST.)
Now we have to have breach reports, HIPAA committees, audits, etc. $$$$
This is not to say I disagree with protecting the data. We have a zero tolerance policy for that data. I've had to disable an account for more than 1 nurse that opened their mouth and said something they shouldn't have.
You think this stuff (encryption, f/walls, etc) comes cheap? And don't talk open source. We have to use Windows on the desktops. The HIS s/w won't work on Ubuntu or any other flavor of *nix. I know, I've tried to make it work. Replacing the HIS s/w is out of the question. $5 million at the minimum plus all the training.
I'm not ranting or criticizing the posts or posters. I'm just trying to make people understand that not all hospitals are UPMC, Mayo or Johns Hopkins with money everywhere. In fact, many hospitals (non-profits, I'm referring to) don't make enough money on providing healthcare. They make it on short-term investments in the stock market. (And, yes, our hospital just laid off 11 people due to poor market performance and people holding off on elective surgery.)
I agree, though, that major targets such as a hospital that caters to Hollywood stars or other VIP's should do more.
Cut wages? You really want a nurse who's making minimum + $1 per hour taking care of you?
No, I don't have an answer to the problem either. If you think single-payer, aka government healthcare, would work, think of the thousands of jobs that will be lost (insurance companies, accounting firms, ancillary services) when the government takes over.
 

Encryption doesn't help (2, Interesting)

MartinSchou (1360093) | more than 4 years ago | (#29480797)

I seem to recall a case from the UK, where two CDs filled with tax information from about 10 million people were left on a train or bus.

Thankfully all the data on the CDs was encrypted.

Typically the password(s) were written on the CDs.

So, no, encryption does nothing but add a layer of security theatre for data breaches. Notification should still be required.

Add the following requirements:

  • What was copied
  • How was it copied (i.e. CDs forgotten on a bus, laptop stolen, physical entry onto facilities, remote access etc.)
  • How was the data protected (i.e. not at all, encrypted etc.)
  • How effective is the chosen encryption (i.e. not at all, 40 bit DES, 4096 bit Blowfish etc.)
  • Were the passwords compromised as well (i.e. yes it was on the CD, possibly, no etc.)
  • What measures are being taken to prevent this happening again (i.e. nothing, passwords won't be shipped along with data, better security against remote access, fired the responsible manager etc.)

Probably a few more requirements as well. That way those who really want to know can be told, and those who don't care will just throw the letter away anyway.

Also add very, very steep fines for not disclosing data breaches. If the chance of a breach becoming known is 1%, make the fines 200x the cost of notification and the expected loss of business. Hell, add mandatory non-suspendable jail time for the responsible managers (including board members).

Re:Encryption doesn't help (1)

Dr_Barnowl (709838) | more than 4 years ago | (#29481913)

Thankfully all the data on the CDs was encrypted.

It was widely reported as "password protected". Whether that means the press (or the press office of HMRC) were dumbing it down for public consumption, or whether it was something considerably more fragile, who knows.

UK government agencies have become disproportionately paranoid about this kind of data loss now. On the upside, we now have mandatory examinations in data security. In our office, we have universal full-disk encryption and all writeable removable media is required to also be encrypted, enforced by some ghastly malware called SafeBoot. Some of the procedures and recommendations are still rather suspect.

We are recommended to destroy encrypted removable media which contained sensitive data once the data is no longer required. This recommendation feels like it came from our supplier of removable media - it should be sufficient just to wipe the key block on the media. An encrypted reformat wipes the whole medium, which would put the data completely beyond recovery. Instead we are being led to destroy perfectly sound media that cost many multiples of the standard variants (because of the magic encryption hardware).

And the protocols for sending sensitive data are questionable. Approved couriers are all very well, but plaintext documents on paper are still vulnerable no matter how trustworthy a biker you entrust them to, and no matter how clearly labelled your envelopes are. Sending emails via our internal mail servers is rated "secure", but those servers are still caching the data as plaintext, and it's easy to slip an outsider's address into your CC list accidentally. I'd much rather all that was replaced by a single policy: all sensitive data that must be sent to other parties is sent to named recipients and encrypted to verified PGP keys only. This works for email regardless of the servers used, and also for couriers carrying optical media (for bulk data), and because it's a single thing to remember, it should have a greater compliance rate.

Re:Encryption doesn't help (0)

Anonymous Coward | more than 4 years ago | (#29483767)

Agreed, it is a loophole, and the same loophole exists in Sarbanes-Oxley related disclosures as well.

The bottom line is there aren't enough qualified security engineers/forensics people, and I doubt business actually wants to get into the minutiae of how their breach actually happened. It is exceedingly rare.
 

It will apply to everyone (1)

gabrieltss (64078) | more than 4 years ago | (#29482805)

"More specifically (as explained here - PDF) only HIPAA-covered healthcare providers and health plans that omit the use of encryption or information destruction will be obliged to notify individuals about a breach of their personal health information."

I work for one of THE largest health insurance companies, and I can say HIPAA is a FEDERAL law; you don't have "HIPAA covered" and "not HIPAA covered". If a provider is NOT abiding by HIPAA, they are breaking the law. So in actuality it will be anyone not employing encryption. The way my company is "applying" encryption is by using whole-disk encryption on all servers, desktops and laptops. Granted, it's not as good as they say it is. Anyone with Linux could mount the darn drives and extract everything off of them. Please, these commercial disk encryption schemes are more for peace of mind than actual protection. So really your personal health information is not safe if stored electronically. It's actually safer stored on paper and locked in filing cabinets behind locked doors in a locked office. Basically they are giving free rides to health folks if the data gets stolen. You hear more about electronic data theft than actual physical data theft. When is the last time you heard on the news about someone breaking into someplace and stealing thousands of files from a health insurance company, a provider, or even a credit card company?

Re:It will apply to everyone (1)

sthomas (132075) | more than 4 years ago | (#29483687)

The commercial whole-disk encryption software we use works at the BIOS-level, rather than from within the OS. You can't mount the disks on a Linux system. Your statement that commercial encryption programs are for peace of mind rather than protection is a false generalization. Well-designed, well-implemented, *and* well-managed (must be all three) systems can provide excellent real protection of data.
