
IBM Claims Breakthrough In Analysis of Encrypted Data

timothy posted more than 5 years ago | from the scrambled-in-the-shell dept.

Encryption 199

An anonymous reader writes "An IBM researcher has solved a thorny mathematical problem that has confounded scientists since the invention of public-key encryption several decades ago. The breakthrough, called 'privacy homomorphism,' or 'fully homomorphic encryption,' makes possible the deep and unlimited analysis of encrypted information — data that has been intentionally scrambled — without sacrificing confidentiality." Reader ElasticVapor writes that the solution IBM claims "might better enable a cloud computing vendor to perform computations on clients' data at their request, such as analyzing sales patterns, without exposing the original data. Other potential applications include enabling filters to identify spam, even in encrypted email, or protecting information contained in electronic medical records."


First post! (5, Funny)

fenring (1582541) | more than 5 years ago | (#28469471)

Have you seen the new neighbours? I think they're homomorphic.

Re:First post! (0, Offtopic)

mcgrew (92797) | more than 5 years ago | (#28469777)

I miss metamoderation, where mods with IQs too low to get the joke were weeded out.

THIS comment is offtopic, the parent should be modded "funny," and if metamoderation was like it used to be, the moderation on the parent would be modded "unfair".

Re:First post! (0)

Anonymous Coward | more than 5 years ago | (#28470159)

Agreed 100%. Slashdot needs to stop fuxxoring with the site just because they can, and stick with what works (the shit that did work worked fucking great). Now the CSS is completely hosed and the metamodding has been completely neutered. Can't we just roll everything back to the way it was two or three years ago?

Re:First post! I was thinking this was an exotic (1)

davidsyes (765062) | more than 5 years ago | (#28470077)

sexy topic... a DEEP, probative analysis of sorts. I bet the analyzers are as gay/giddy as they can morph into... Maybe they can be TRANS-FORMERS.... slap some dap and scrum some bum...

Re:First post! (1, Funny)

Anonymous Coward | more than 5 years ago | (#28470255)

Nah, I saw them together with a woman. I think they're bijective.

fristy posty (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#28469473)

telnet://zombiemud.org:23

Breakthrough in the analysis of willys (-1, Troll)

Anonymous Coward | more than 5 years ago | (#28469505)

Imagine your Willy being smacked until it bleeds.

J.delanoy.

Yeah (3, Insightful)

rodrigoandrade (713371) | more than 5 years ago | (#28469507)

"might better enable a cloud computing vendor to perform computations on clients' data at their request, such as analyzing sales patterns, without exposing the original data. Other potential applications include enabling filters to identify spam, even in encrypted email, or protecting information contained in electronic medical records."

Right, because we've already figured out everything about cloud computing and it's a totally stable environment ready to be deployed in every company around the globe. Time to take it to the next step.

But what if it took... a TRILLION times longer? (2, Insightful)

spun (1352) | more than 5 years ago | (#28469819)

Yeah, you can perform calculations on encrypted data without unencrypting it. But it's just a LITTLE slow. The first step is just showing it can be done, but it's a very long way from useful.

Re:Yeah (1)

birrddog (1237440) | more than 5 years ago | (#28469929)

Or you could just use cloud computing resources to break the encryption code, then trawl through the data.

Re:Yeah (1)

bkpark (1253468) | more than 5 years ago | (#28470947)

Right, because we've already figured out everything about cloud computing and it's a totally stable environment ready to be deployed in every company around the globe. Time to take it to the next step.

One might argue that this is one of the pieces we had to figure out before cloud computing could become more widespread.

After all, who would entrust third party computers to do the computation unless the confidentiality of data could be guaranteed? Projects that generate no profit such as SETI and other scientific projects apparently have no issue with this, but no company would ever use cloud computing in a commercial project without somehow ensuring that their data is protected.

No one's claiming that this is the entire solution to every problem in cloud computing, or even the last piece of the puzzle—but somebody is claiming that this is a solution to a significant part of the problem, and I frankly agree.

Finally (0)

Anonymous Coward | more than 5 years ago | (#28469509)

DRM that works.

Re:Finally (1)

SilverHatHacker (1381259) | more than 5 years ago | (#28470825)

Don't say that too loud...the RIAA might get an idea...

Insert Homo Jokes Here (-1, Flamebait)

nacturation (646836) | more than 5 years ago | (#28469551)

Please consolidate all your "that's gay technology" and various other homo jokes under this thread.

Re:Insert Homo Jokes Here (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#28469795)

that's gay OP

hmmm... (1, Funny)

whopub (1100981) | more than 5 years ago | (#28470037)

Why, are you trying to keep all the gay fun for yourself?!

<SEINFELD>Not that there's anything wrong with that!</SEINFELD>

No More Privacy (5, Insightful)

basementman (1475159) | more than 5 years ago | (#28469563)

"perform computations on clients' data at their request, such as analyzing sales patterns"

Or without their request.

Since it's close to being slashdotted... (5, Informative)

Magic5Ball (188725) | more than 5 years ago | (#28469615)

IBM researcher solves longstanding cryptographic challenge
Posted on 25 June 2009.
An IBM researcher has solved a thorny mathematical problem that has confounded scientists since the invention of public-key encryption several decades ago. The breakthrough, called "privacy homomorphism," or "fully homomorphic encryption," makes possible the deep and unlimited analysis of encrypted information - data that has been intentionally scrambled - without sacrificing confidentiality.

IBM's solution, formulated by IBM Researcher Craig Gentry, uses a mathematical object called an "ideal lattice," and allows people to fully interact with encrypted data in ways previously thought impossible. With the breakthrough, computer vendors storing the confidential, electronic data of others will be able to fully analyze data on their clients' behalf without expensive interaction with the client, and without seeing any of the private data. With Gentry's technique, the analysis of encrypted information can yield the same detailed results as if the original data was fully visible to all.

Using the solution could help strengthen the business model of "cloud computing," where a computer vendor is entrusted to host the confidential data of others in a ubiquitous Internet presence. It might better enable a cloud computing vendor to perform computations on clients' data at their request, such as analyzing sales patterns, without exposing the original data.

Other potential applications include enabling filters to identify spam, even in encrypted email, or protecting information contained in electronic medical records. The breakthrough might also one day enable computer users to retrieve information from a search engine with more confidentiality.

"At IBM, as we aim to help businesses and governments operate in more intelligent ways, we are also pursuing the future of privacy and security," said Charles Lickel, vice president of Software Research at IBM. "Fully homomorphic encryption is a bit like enabling a layperson to perform flawless neurosurgery while blindfolded, and without later remembering the episode. We believe this breakthrough will enable businesses to make more informed decisions, based on more studied analysis, without compromising privacy. We also think that the lattice approach holds potential for helping to solve additional cryptography challenges in the future."

Two fathers of modern encryption - Ron Rivest and Leonard Adleman - together with Michael Dertouzos, introduced and struggled with the notion of fully homomorphic encryption approximately 30 years ago. Although advances through the years offered partial solutions to this problem, a full solution that achieves all the desired properties of homomorphic encryption did not exist until now.

IBM enjoys a tradition of making major cryptography breakthroughs, such as the design of the Data Encryption Standard (DES); Hash Message Authentication Code (HMAC); the first lattice-based encryption with a rigorous proof-of-security; and numerous other solutions that have helped advance Internet security.

Craig Gentry conducted research on privacy homomorphism while he was a summer student at IBM Research and while working on his PhD at Stanford University.

Re:No More Privacy (2, Insightful)

megamerican (1073936) | more than 5 years ago | (#28469673)

Or without their request.

The NSA figured that out a long time ago.

Re:No More Privacy (3, Informative)

Anonymous Coward | more than 5 years ago | (#28469773)

Or without their request.

If they really figured it out, then sure they can analyze without your request, but they can't decrypt the results without your key. So you still have the same privacy. BTW, this is the entire point of this process.

Re:No More Privacy (0)

Jurily (900488) | more than 5 years ago | (#28469867)

s/clients/citizens/

Re:No More Privacy (3, Interesting)

mea37 (1201159) | more than 5 years ago | (#28470073)

TFA doesn't seem clear on this point, but what the name of the technique implies is that you can perform the operation, but neither the inputs nor the outputs are ever decrypted. So if you can't see the question, and you can't see the answer, then why would you perform the operation other than at the request of someone who can (i.e. the client)?

That said, I'd like to know a lot more about this before I'd want to trust it. For this to work, I'd think a lot of the data's structure must be preserved. Maybe you can't detect that structure from the encrypted data, but you can probably infer a lot about it by analyzing the algorithms your clients ask you to apply (especially if they're your algorithms - i.e. software-as-a-service type stuff). I'm impressed if this doesn't create vulnerabilities.

Also I suspect this is fundamentally divorced from public key techniques. If I'm able to encrypt values of my choosing and perform operations of my choosing on encrypted values, I'm pretty sure I can work backward to extract the cleartext from the encrypted data the client provides...

here's why this is important. (2, Informative)

goombah99 (560566) | more than 5 years ago | (#28470421)

TFA doesn't seem clear on this point, but what the name of the technique implies is that you can perform the operation, but neither the inputs nor the outputs are ever decrypted. So if you can't see the question, and you can't see the answer, then why would you perform the operation other than at the request of someone who can (i.e. the client)?

Example: I want the total sales figures for all the left-handed employees. I cobble together the appropriate SQL processing request and send it to my cloud server, which rummages through the database summing up the figures for some subset of the fields. It sends me back just the sum, encrypted. It never knows which employees it was selecting, nor any of their sales figures, nor even the sum. It just has the encrypted result that it sends to me, all processed.

Otherwise I'd have to pull every encrypted record of every employee across the web to my computer, in effect doing a table dump across the net for even the smallest calculation. Yuck! Let alone it not working on my iPhone.

Another application is encryption of ballots. The Achilles heel of almost every voting encryption algorithm is that somewhere somebody has the key, and to do any processing you have to decrypt the ballots. This way you can do the sum and only decrypt the results. You could publish the encrypted sum before the key is even unsealed, then publish the key. The key never has to be in the hands of the central tabulators.

Current voting encryption schemes require centralized control with possession of the keys. This way the control is decentralized with the counters and the key keepers not being the same person.

Re:No More Privacy (5, Informative)

John Hasler (414242) | more than 5 years ago | (#28470133)

Everything remains encrypted throughout the process, including the output. Only the client can read the results. That is the point.

Re:No More Privacy (3, Informative)

Mashiara (5631) | more than 5 years ago | (#28470265)

TFA is skimpy on this, but after a bit of Googling around I understand a little more; see also http://en.wikipedia.org/wiki/Homomorphic_encryption [wikipedia.org] .

The point being that those who provide the encrypted data must encrypt it in a special way to allow the homomorphic properties to be taken advantage of.

Re:No More Privacy (1)

caramelcarrot (778148) | more than 5 years ago | (#28471467)

Also, as someone points out below, http://science.slashdot.org/comments.pl?sid=1282009&cid=28470091 [slashdot.org] , you don't use the same algorithm as before - but you instead "encrypt" the algorithm so it's working in the same space as the encrypted data. I'm sort of imagining some sort of encrypted virtual machine. Otherwise some of the flaws being talked about would be an issue.

Re:No More Privacy (1)

bhagwad (1426855) | more than 5 years ago | (#28470565)

I'd like to know what sort of "analysis" can be done without the client's permission. Can they find out how many times the word "and" occurs (without reading the message) for example?

Basically can they do any sort of content analysis? If they're saying they can filter spam, then it's not at all a stretch to assume that they can "read" your data as well. What's the point of encryption then?

Fully homomorphic encryption using ideal lattices (5, Informative)

grshutt (985889) | more than 5 years ago | (#28469575)

The abstract for Gentry's article can be found at: http://doi.acm.org/10.1145/1536414.1536440 [acm.org]

Re:Fully homomorphic encryption using ideal lattic (0, Offtopic)

spiffmastercow (1001386) | more than 5 years ago | (#28469783)

Abstractly, we accomplish this by enabling the encrypter to start the decryption process, leaving less work for the decrypter, much like the server leaves less work for the decrypter in a server-aided cryptosystem.

Umm... That doesn't ease my concerns about this already disturbing prospect..

If they can analyze the data... (2, Insightful)

Lord Juan (1280214) | more than 5 years ago | (#28469595)

then that form of encryption is useless for highly sensitive information.

It's as simple as that.

BAD summary (5, Informative)

spun (1352) | more than 5 years ago | (#28469889)

You can not analyze the data. You can perform calculations on it without knowing what it is. So, for instance, you could encrypt all your tax info, send it to a company that processes the encrypted data without decrypting it, and sends you back your encrypted tax return, without ever having seen any of your financial detail.

Re:BAD summary (0)

Anonymous Coward | more than 5 years ago | (#28470559)

Bullshit. If it's possible to perform any meaningful calculations (by a third party) on the encrypted data at all then there is information leaking and it's a weak algorithm.

Re:BAD summary (3, Informative)

rbenech (97413) | more than 5 years ago | (#28470751)

Actually, imagine being able to add two numbers together without knowing what those two numbers were, and returning a total whose value you STILL don't know, but whose ciphertext you have. You still need the key to decrypt the total.

Example in plaintext:

4 + 5 = 9

Example encrypted (oversimplified):

D32JFS3 + 234DSF31 = 42SDF23

So the third party would receive D32JFS3 and 234DSF31 (not knowing they meant 4 and 5) and he would return 42SDF23 (not knowing it was 9).

The ability to add two pieces of ciphertext to get some (still unknown) piece of ciphertext does not increase the "breakability" of the encryption because, just like with the Rosetta Stone, you really need pairs of plaintext and ciphertext to do any real analysis. There may be some NEW attack methods on lattice-based encryption techniques, but they are not yet widely known.
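The parent's add-two-ciphertexts example is exactly what an additively homomorphic scheme such as Paillier provides. Note this is a classic single-operation homomorphism, not Gentry's fully homomorphic lattice scheme; a toy sketch in Python, with deliberately tiny primes:

```python
import math
import random

# Toy Paillier cryptosystem: additively homomorphic, i.e. multiplying two
# ciphertexts yields a ciphertext of the SUM of the plaintexts.
# Tiny demo primes only -- real deployments need large random primes.
p, q = 1789, 1861
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)   # Carmichael's lambda(n), part of the secret key
g = n + 1                      # standard simple choice of generator
mu = pow(lam, -1, n)           # modular inverse of lambda mod n

def encrypt(m):
    """Encrypt m (0 <= m < n) with fresh randomness r coprime to n."""
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Recover the plaintext using the secret values lam and mu."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# The third party multiplies ciphertexts without ever learning 4, 5, or 9.
c4, c5 = encrypt(4), encrypt(5)
c_sum = (c4 * c5) % n2
assert decrypt(c_sum) == 9
```

The randomness r means the two ciphertexts for the same plaintext look nothing alike, which is why the third party cannot build a Rosetta Stone of plaintext/ciphertext pairs just by watching traffic.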

Re:BAD summary (3, Insightful)

Shimmer (3036) | more than 5 years ago | (#28470915)

My problem with this is that you'd have to expose the structure of your data, if not its contents. Using your example, the cyphertext might look like: "QEDD32JFS3234DSF31". You'd have to tell the analyzer that integer A starts at index 3 and integer B starts at index 10. That information alone could help the analyzer crack your encryption.

Re:BAD summary (4, Funny)

KevinIsOwn (618900) | more than 5 years ago | (#28470855)

You better send this to the reviewers of Gentry's paper so that they have this important information!!!

Re:BAD summary (1)

statdr (1585151) | more than 5 years ago | (#28471151)

Not sure I see the point of this technology. Since the data provider has access to the raw unencrypted data (else how could the data provider see the unencrypted analysis results?), wouldn't it be cheaper to develop a standalone analysis program for the data provider? Why encrypt, transmit, apply a complex analysis process, retransmit, and decrypt results when one could just analyze the data onsite?

Re:If they can analyze the data... (2, Interesting)

Isarian (929683) | more than 5 years ago | (#28469945)

So I may have missed something from the article, but are all forms of public-key encryption vulnerable or just certain algorithms?

Re:If they can analyze the data... (3, Informative)

Magic5Ball (188725) | more than 5 years ago | (#28470569)

This isn't a vulnerability with existing encryption systems, it's a scheme for a different way to structure and encrypt the data to explicitly allow calculations on the data without exposing the original values.

Re:If they can analyze the data... (1)

Chris Mattern (191822) | more than 5 years ago | (#28470071)

then that form of encryption is useless for highly sensitive information.

Unless the analysis is also encrypted.

Re:If they can analyze the data... (2, Interesting)

Anonymous Coward | more than 5 years ago | (#28470117)

They can perform computations on the data, but the answer is still encrypted.

Re:If they can analyze the data... (2, Informative)

John Hasler (414242) | more than 5 years ago | (#28470177)

All the data and all the results remain encrypted so that only the client can read the results. That is the point. Read about homomorphic encryption here [wikipedia.org]

Wait, what? (1, Interesting)

spiffmastercow (1001386) | more than 5 years ago | (#28469625)

Okay, maybe I'm a noob when it comes to encryption, but I was under the impression that if you were able to read the encrypted email, you were probably able to read the encrypted recipient address too. Is there something I'm missing here?

Re:Wait, what? (5, Informative)

moogied (1175879) | more than 5 years ago | (#28469809)

Yes, yes you are.

The point is not to read the content, but to enable a computer to analyze the content in such a way that they can deduce statistics and patterns from it. FTFA:

computer vendors storing the confidential, electronic data of others will be able to fully analyze data on their clients' behalf without expensive interaction with the client, and without seeing any of the private data

I don't need to know that you love apples to know you definitely love the same thing as 14 other people. Let's assume that we have 20 encrypted sets of data. Let's also assume the 20 sets say basically the same thing but, because of the encryption method, look nothing alike from the raw-data perspective. If you go ahead and find a way to analyze the encryption enough to know that the 20 emails all contain a similar message, but not enough to actually know what the message is... well then! You could go ahead and store all of eBay's customer information and do massive amounts of data crunching for them, without ever actually seeing any data.

This is a huge problem in IT, where admins need access to the databases in order to see how the data is being stored, how the tables are working, etc... but can't actually have access to the database because then they might see customer information. So you either let joe-bob admin in there and let him see all the data, or you don't. Now you can let the admin in there, they can determine anything they might want to know, but they never actually see any exact data.

No, I don't know anything about the math portion... but that's basically what they are trying to say in the article. I think. :)

Re:Wait, what? (1)

geminidomino (614729) | more than 5 years ago | (#28469955)

Yes, yes you are.

The point is not to read the content, but to enable a computer to analyze the content in such a way that they can deduce statistics and patterns from it.

I'm no crypto-geek, but aren't patterns generally the bane of encryption?

Re:Wait, what? (0)

Anonymous Coward | more than 5 years ago | (#28470575)

Each of the examples listed above allows homomorphic computation of only one operation (either addition or multiplication) on plaintexts. A cryptosystem which supports both addition and multiplication (thereby preserving the ring structure of the plaintexts) would be far more powerful. Using such a scheme, one could homomorphically evaluate any circuit, effectively allowing the construction of programs which may be run on encryptions of their inputs to produce an encryption of their output. Since such a program never decrypts its input, it could be run by an untrusted party without revealing its inputs and internal state. The existence of a fully homomorphic cryptosystem would have great practical implications in the outsourcing of private computations, for instance, in the context of cloud computing.

http://en.wikipedia.org/wiki/Homomorphic_encryption [wikipedia.org]
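The quoted passage notes that the earlier schemes each support only one operation. The textbook example for multiplication is unpadded RSA; a toy sketch in Python with tiny, insecure parameters:

```python
# Textbook (unpadded) RSA is multiplicatively homomorphic:
# Enc(a) * Enc(b) mod n decrypts to a * b.  Toy parameters only;
# unpadded RSA is insecure in practice.
p, q, e = 61, 53, 17
n = p * q                          # 3233
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

def encrypt(m):
    return pow(m, e, n)

def decrypt(c):
    return pow(c, d, n)

ca, cb = encrypt(4), encrypt(5)
c_prod = (ca * cb) % n             # computed entirely on ciphertexts
assert decrypt(c_prod) == 20       # 4 * 5, visible only to the key holder
```

Gentry's contribution is a single scheme that supports both addition and multiplication at once, which is what lets you evaluate arbitrary circuits over encrypted inputs.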

Re:Wait, what? (1)

geminidomino (614729) | more than 5 years ago | (#28470965)

Unfortunately, as a non-cryptogeek (as I said), I have no idea WTF that says. It says that the program never decrypts the data; I got that much. But it doesn't say anything about the patterns and statistics that TFS mentions.

It seems that if you can determine how often things show up in encrypted data (the "statistics" part), you're already revealing information.

Re:Wait, what? (1)

Cyberax (705495) | more than 5 years ago | (#28470089)

Implausible. Changing just one bit results in an 'avalanche effect' in good ciphers, so quite a lot of bits will be changed.

You won't be able to derive any useful information from that.

Re:Wait, what? (1)

lordofwhee (1187719) | more than 5 years ago | (#28470731)

This is why the data has to be encrypted in such a way as to prevent the avalanche effect, which defeats the purpose of encryption by forcing you to use a weak algorithm (at least from my understanding).

Really, you may as well use character-replacement for all the good the algorithms that support this technique would do you.

Re:Wait, what? (1)

EvanED (569694) | more than 5 years ago | (#28471067)

Changing just one bit results in an 'avalanche effect' in good ciphers, so quite a lot of bits will be changed.

Not necessarily true. Think of a one-time pad: the best cipher out there, unbreakable without the key (well, you have to control for length issues), and it doesn't have the avalanche effect.

I can't speak to whether there are or aren't other good cyphers without the avalanche effect, but there is at least one important exception.

Re:Wait, what? (1)

spiffmastercow (1001386) | more than 5 years ago | (#28470147)

Fair enough, but how is that better than just "anonymizing" data from a database through a one-way hash and then removing all directly identifiable info (client ID, etc)?

Re:Wait, what? (1)

moogied (1175879) | more than 5 years ago | (#28470643)

electronic data of others will be able to fully analyze data on their clients' behalf without expensive interaction with the client

Re:Wait, what? (1)

e-scetic (1003976) | more than 5 years ago | (#28470291)

If I encrypt my data, and I like apples, and I can now use this new technique to determine that 20 other people like apples too, don't I have an essential piece of information I can use to decrypt the encrypted data of those 20 other people?

Re:Wait, what? (1)

John Hasler (414242) | more than 5 years ago | (#28470437)

Homomorphic encryption does not give you any such ability.

Re:Wait, what? (1)

spiffmastercow (1001386) | more than 5 years ago | (#28470585)

Okay... So what if I like apples. And I have a username that starts with S. Now we've already established that I can see how many other people like apples. Can I see how many other people like apples that have usernames that start with S? And then can I see how many other people like apples, and have usernames that start with 'Sp'? I'm sure you see where I'm going with this. I may just be a cynical bastard with a math education insufficient to understand the technique by which this works, but it sounds like the idea is to intentionally weaken the encryption.

Re:Wait, what? (1)

NAR8789 (894504) | more than 5 years ago | (#28471155)

It doesn't allow you to compare data encrypted by different ciphers. It allows particular "multiplication" and "addition" operations on data encrypted by the same cipher. If I like apples, and have a username that starts with "S", someone with both encrypted blocks can produce another encrypted block of "I like apples" or XOR "I like apples" or something more computationally complicated along those lines. So, it does not allow the sort of attack you're proposing.

Re:Wait, what? (1)

NAR8789 (894504) | more than 5 years ago | (#28471213)

...can produce another encrypted block of "I like apples" or XOR "I like apples" or...

ah... sorry. failed to properly proofread my comment. That should be "...can produce another encryptoed block of "I like apples" or XOR "I like apples" or..."

Re:Wait, what? (1)

NAR8789 (894504) | more than 5 years ago | (#28471305)

I'm an idiot. There a modifier for HTML retard?

"[username]I like apples" or [username] XOR "I like apples"

Re:Wait, what? (1)

mhall119 (1035984) | more than 5 years ago | (#28471369)

You can only see if 20 other people like apples if that plaintext data was encrypted with the same key as the plaintext data that says you like apples.

Suppose Coca-Cola and Pepsi Cola both use the same market research firm, which we'll call StatisticsInc. Now, companies are very jealous of market insight data; most will not work with a firm that also works with a competitor, lest someone get bribed into sharing trade secrets. What this allows is for Coca-Cola to send a bunch of demographic data to StatisticsInc for analysis, and StatisticsInc will never know what the input or result contained, and therefore cannot share any confidential data with Pepsi.

Re:Wait, what? (0)

Anonymous Coward | more than 5 years ago | (#28469877)

It's possible to understand what is going on just by looking at the words used. They say they've come up with a homomorphic encryption. "Homo" means same; "morphic" means form. So homomorphic encryption is an encryption that preserves some predefined properties.

What you've got is a set of properties that are preserved by the encryption. A lot more research (in the sense of reading papers on the subject) on exactly what the properties are is necessary to figure out the value of the encryption.

If the properties in question are important to you, the encryption might be worthless. Then again, if the properties in question are not something you need to keep secret, then the encryption might be worth a great deal to you.

The strength of the encryption is also something that needs to be questioned. This question is separate from the fact that some properties are not changed.

Look but don't see. (0, Flamebait)

Itninja (937614) | more than 5 years ago | (#28469679)

First data is created and deemed private. It's then encrypted to prevent unauthorized people from seeing the private data. But other people want to analyze the data, without actually seeing it. Kind of like how it's good to know the income demographics of a city, but not to know the personal income of every person in said city. Never mind the fact that someone had to see the information at some point to render the statistics. Unless of course the results are never audited for accuracy. But that's another story.

Anyway, so then we put our private data on someone else's server (we call it a 'cloud' - isn't that nice?) and they can't look at it because it's encrypted. But they will run analysis on the data, which requires a special tool to or something....look - all's I know is I get's me my KPI's for my board meetin'. Who really cares if some guys personal health records are sniffed by some computer...whoa! Jim has ass herpes?! I am so updating this on the Interwebz!

Re:Look but don't see. (2, Insightful)

Anonymous Coward | more than 5 years ago | (#28470119)

You've thoroughly misunderstood what this is about, I think. AFAIK this is about performing computations on encrypted data without having to decrypt the data.

Say I have a function F that I want to run on data A to produce data B. i.e. B=F(A)

F is an expensive function to run (big computation), so I'd like to hire the performance of computation service from someone, let's call them MBI, with a huge ass-computer.

But I don't want MBI to know the data A.

So I encrypt it, and give them CryptA instead.

But applying F to CryptA isn't going to give B, and maybe I don't want MBI to know B either!

But say I could derive a function G from function F, that given CryptA, produced an encrypted output CryptB, that when I decrypted gave B. i.e. CryptB = G(CryptA)

So I could give MBI data CryptA and function G, and they could compute CryptB for me, for a small (large) fee, no doubt - time on a Blue Gene machine or other large scary Linux box isn't all that cheap to provide, though it's often at taxpayer expense.

And then I could decrypt CryptB to get the B I wanted. Since MBI only ever have CryptA, function G and CryptB, I don't leak input A or output B to them (I'm not sure off the top of my head whether they can derive F from G)

i.e. this is to enable provision of a SAFE (well, until someone makes a quantum computer...) computation service that PROTECTS privacy the way current systems don't

Actually, I thought it was "solved" (for cryptographer values of "solved"), just very computationally expensive and that's why people don't do it, I could have been wrong there, not actually a cryptographer.
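The F-versus-G delegation the AC describes can be made concrete with the simplest homomorphic cipher there is: a one-time pad is homomorphic with respect to XOR, so the server's G is just XOR on ciphertexts. A toy sketch (XOR only, nowhere near the arbitrary circuits of Gentry's scheme):

```python
import secrets

# A one-time pad is homomorphic with respect to XOR: the server's function G
# is plain XOR on ciphertexts, and the client decrypts with the combined pad.
def encrypt(m, k):
    return m ^ k

def decrypt(c, k):
    return c ^ k

k1, k2 = secrets.randbits(32), secrets.randbits(32)
a, b = 0b1010, 0b0110

ca, cb = encrypt(a, k1), encrypt(b, k2)   # CryptA-style inputs sent to MBI

# Server-side G: sees only ca and cb, never a, b, or the result.
c_out = ca ^ cb

# Client-side: decrypt with k1 XOR k2 to recover F(a, b) = a XOR b.
assert decrypt(c_out, k1 ^ k2) == a ^ b
```

MBI holds only ca, cb, and c_out, none of which reveal a, b, or the answer, matching the CryptB = G(CryptA) picture above.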

Re:Look but don't see. (1)

John Hasler (414242) | more than 5 years ago | (#28470623)

They not only can't look at the data, they can't look at the results of the analysis. Only you can. That's the point.

logic? (0, Redundant)

poetmatt (793785) | more than 5 years ago | (#28469693)

Apparently the logic is missed altogether:
"makes possible the deep and unlimited analysis of encrypted information - data that has been intentionally scrambled - without sacrificing confidentiality." is a conflicting phrase in and of itself.

If you can analyze the data "with or without confidentiality", you have already sacrificed the confidentiality. Or does nobody remember the AOL search results fiasco? [wikipedia.org]

This is more like "we can crack our own encryption, as if people are surprised".

Re:logic? (2, Informative)

John Hasler (414242) | more than 5 years ago | (#28470345)

Wrong. The whole point of homomorphic encryption [wikipedia.org] is that everything remains encrypted throughout the process including the output. Only the client can read the results.

Re:logic? (2, Informative)

quanticle (843097) | more than 5 years ago | (#28470361)

The summary is wrong. A privacy homomorphism allows third parties to perform calculations on the data on your behalf without decrypting either the input or the output. In other words, the cloud provider could, for example, total up your sales data without ever decrypting the individual sale information or the final total. The encrypted final total would then be given to you, and you would decrypt it to learn what it was.

At no point does a third party have access to a decrypted form of your data.

from the horses mouth (5, Informative)

Anonymous Coward | more than 5 years ago | (#28469701)

Just FYI, this site is a wholesale cut-and-paste of the IBM press release.

http://www-03.ibm.com/press/us/en/pressrelease/27840.wss

Mod Parent Up (0)

Anonymous Coward | more than 5 years ago | (#28469841)

Well, Net-Security missed the last part. So, I'll just post it here.

"IBM Research is home to the largest team of cryptography researchers outside of the academic and government communities. For more information about IBM Research, please visit http://www.research.ibm.com/ [ibm.com] ."

Re:from the horses mouth (1)

Iluvatar (89773) | more than 5 years ago | (#28471157)

Also, is it just me, or are the article title and content a bit misleading: how is a summer intern (a PhD student from Stanford), who published this as a single-author paper (no IBM co-authors), an "IBM researcher"?

This is mentioned only at the very end of the "article".

Re:from the horses mouth (2, Informative)

NoCowardsHere (1278858) | more than 5 years ago | (#28471205)

Uhh... I'm not sure how to break this to you, but the WHOLE POINT of a PRESS RELEASE is that it gets sent out to the press, in the hopes that websites and newspapers will reprint it. That's why IBM published it in the first place. So, yeah, it's not plagiarism, sorry.

Sacrificing confidentiality (1)

0xABADC0DA (867955) | more than 5 years ago | (#28469765)

I bet multi-modal reflection sorting can determine what the confidential info is.

Disappointing comments (1)

Sybert42 (1309493) | more than 5 years ago | (#28469767)

Why post if you have nothing to say?

Explanation? (1)

drunken_boxer777 (985820) | more than 5 years ago | (#28469843)

Could someone explain how this "breakthrough might also one day enable computer users to retrieve information from a search engine with more confidentiality"? (Quoted from the article.)

From the article it seems as if this aids in the scanning and searching of encrypted data rather than securing anything. While this might ultimately benefit large corporations or governments hiring third parties to perform analyses, is it of any use to the rest of us?

Even a car analogy would be great. Umm, make that "acceptable".

Re:Explanation? (1)

instagib (879544) | more than 5 years ago | (#28470471)

A car analogy? Good idea:

You bring your car to the shop. You don't tell them what type of car it is, or what the problem is. They analyze it, then hand you a sealed envelope with the results, printed by the computer without anyone at the shop seeing them. You alone will read what's wrong with your car, and based on that you can order repairs.

Thinking of it, this would be great! They can't charge you for fantasy repairs anymore...

Re:Explanation? (0)

Anonymous Coward | more than 5 years ago | (#28471325)

You could create your query on the client side (in your web browser, maybe?) and then send that query, which the server side cannot decrypt but can run against their data. The server would then respond with encrypted data that you would then need to decrypt to actually see the results.

Wikipedia to the rescue (5, Informative)

Dr. Manhattan (29720) | more than 5 years ago | (#28469871)

With fully homomorphic encryption [wikipedia.org], you can perform operations on the encrypted data, in encrypted form, that produce encrypted output. Sort of like doing a database query on encrypted data that produces an encrypted result. So you could store your data somewhere in encrypted form, ask the host to perform some operations using their CPU cycles, and send you the result. You decrypt the result yourself; the host never sees unencrypted data at any point.

Cool, but I'm half-convinced that holes will be found. The first time a new encryption scheme is put to the test, it usually fails. Still, hopefully, it'll lead to a truly secure scheme.

Re:Wikipedia to the rescue (4, Insightful)

bobdehnhardt (18286) | more than 5 years ago | (#28469953)

Holes are always found - no method is 100% foolproof. The question is: will the holes be usable? If the level of effort to exploit them is high enough, we may not see them exploited for some time. But the holes are there, and they will be found.

Re:Wikipedia to the rescue (0)

Anonymous Coward | more than 5 years ago | (#28470351)

Holes are usually due to the implementations, much more than the algorithms themselves.

Re:Wikipedia to the rescue (0)

Anonymous Coward | more than 5 years ago | (#28470491)

Your proclamation betrays your ignorance. Provably correct methods are 100% foolproof. Very few things, however, are provably correct.

Re:Wikipedia to the rescue (1)

Guy Harris (3803) | more than 5 years ago | (#28470955)

Your proclamation betrays your ignorance. Provably correct methods are 100% foolproof. Very few things, however, are provably correct.

And, of course, there is the risk of an error in the proof. Possibly a negligible risk, but it can happen.

Re:Wikipedia to the rescue (2, Informative)

wren337 (182018) | more than 5 years ago | (#28470519)

Holes are always found - no method is 100% foolproof.

http://en.wikipedia.org/wiki/One-time_pad [wikipedia.org]

Re:Wikipedia to the rescue (0)

Anonymous Coward | more than 5 years ago | (#28471671)

Breakable once the pad is leaked or reused, which a fool will do. People seem to be confused as to what "foolproof" means. No method is 100% foolproof because there'll always be some fool who breaks an assumption of the method.

Re:Wikipedia to the rescue (2, Funny)

jgbishop (861610) | more than 5 years ago | (#28471633)

Of course holes will be found. It's made out of a lattice!

Can I run this homomorphism on your data? (1)

Saint Stephen (19450) | more than 5 years ago | (#28470003)

f(x) = x

Hmmm E anyone? (-1, Troll)

kenp2002 (545495) | more than 5 years ago | (#28470075)

I wrote a 200+ word response to this then found that I can summarize it in one sentence:

"Once again the Imaginative Bullshit Makers have developed a self defeating product..."

simple explanation (5, Informative)

Anonymous Coward | more than 5 years ago | (#28470091)

OK, it looks like a lot of people are missing the point.

What Gentry figured out was a scheme for carrying out arbitrary computations on encrypted data, producing an encrypted result. That way, you can do your computation on encrypted data in the "cloud", but only you can view the results.

If E() is your encryption function, x is your data, and f() is the function you'd like to compute, homomorphic encryption gives you a function f'() such that f'(E(x)) = E(f(x)). But at no point does it actually decrypt your data.

This could be huge for secure computing.
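For the curious, the identity f'(E(x)) = E(f(x)) can be checked concretely with textbook (unpadded) RSA, which is homomorphic for multiplication. A minimal Python sketch with toy parameters (illustrative only, not secure), taking f to be squaring:

```python
# Demonstrate f'(E(x)) == E(f(x)) using textbook RSA (toy parameters).
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))   # modular inverse, Python 3.8+

def E(x): return pow(x, e, n)       # encryption
def D(c): return pow(c, d, n)       # decryption

def f(x): return (x * x) % n        # function the client wants computed
def f_prime(c): return (c * c) % n  # same operation, run on the ciphertext

x = 123
assert f_prime(E(x)) == E(f(x))     # the homomorphic property
assert D(f_prime(E(x))) == f(x)     # client decrypts and recovers f(x)
```

Gentry's result extends this from one fixed operation to arbitrary computations, which is what makes it "fully" homomorphic.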

Re:simple explanation (1)

TypoNAM (695420) | more than 5 years ago | (#28471121)

If E() is your encryption function, x is your data, and f() is the function you'd like to compute, homomorphic encryption gives you a function f'() such that f'(E(x)) = E(f(x)). But at no point does it actually decrypt your data.

Got an example in C language instead?

I'm so childish (-1, Troll)

Anonymous Coward | more than 5 years ago | (#28470121)

I apologize ahead of time

'fully homomorphic encryption,' makes possible the deep and unlimited analysis of encrypted information

Analysis can mean Disclosure of Information (0)

statdr (1585151) | more than 5 years ago | (#28470135)

If a human (or computer) looks at the "analyses" performed on encrypted data and can make the same inferences as if the analyses were performed on the unencrypted data, then the encryption process hasn't reduced the possibility of disclosure that arises through analysis (or data access).

  Simple analyses like 1) produce a 2x2 table of outcomes or 2) produce a linear regression of an outcome on covariates can be used to identify information about individuals or even identify individuals.

  I've seen a push by IT firms to focus on the encryption/transmission issue when discussing privacy concerns. While important, it's only a piece of the HIPAA/privacy issue. If a human being/computer can make correct inferences from analyses of data (encrypted or not), then there is a potential for disclosure that is not covered by the encryption process.

  It is extremely difficult to determine a priori what analyses could/would reveal identities or information about individuals. Perhaps simple pre-defined two-way tables/one-way tables could be created for a given fixed set of information. A general purpose non-disclosive analysis system doesn't yet exist (if ever).

iaas (I am a statistician)

Re:Analysis can mean Disclosure of Information (3, Informative)

gr8_phk (621180) | more than 5 years ago | (#28470741)

A general purpose non-disclosive analysis system doesn't yet exist

It does now. That's the point. I don't think the wording in the article is very good. What they're doing is more like simulation of circuits (AND and XOR gates). You can construct a general purpose computer from such gates. You can run a gate-level simulation of such a machine, but your simulation would normally use unencrypted data. This breakthrough allows your simulated machine to use encrypted data, so you feed it encrypted data and you get out encrypted data. All the guy running the simulation knows is the design of the simulated hardware.

This can be taken one step further. If you simulate a programmable computer - not just a fixed algorithm - then the guy running the simulation won't even know what *algorithm* he's running in addition to not knowing what the data is since the program is just encrypted data. I've been toying with this for a while without knowing the proper name for it :-) And no, I never found a method that could handle both AND and XOR, so I look forward to reading more about this.
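A plain-bit sketch of the gate idea above: XOR is addition mod 2 and AND is multiplication mod 2, so a scheme that can add and multiply ciphertexts can evaluate any gate circuit. Here is a one-bit full adder built only from those two gates, shown on unencrypted bits; under FHE the same gates would operate on ciphertexts:

```python
# XOR = addition mod 2, AND = multiplication mod 2.
def xor(a, b): return (a + b) % 2
def and_(a, b): return (a * b) % 2

def full_adder(a, b, cin):
    """One-bit full adder built only from XOR and AND gates."""
    s = xor(xor(a, b), cin)                       # sum bit
    cout = xor(and_(a, b), and_(cin, xor(a, b)))  # carry-out bit
    return s, cout

# 1 + 1 + 1 = 3 -> sum bit 1, carry bit 1
assert full_adder(1, 1, 1) == (1, 1)
assert full_adder(1, 0, 0) == (1, 0)
```

Chain full adders and you get arbitrary-width addition; from there, general arithmetic - which is why supporting just these two gates on encrypted bits is enough for arbitrary computation.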

At first... (4, Funny)

curtix7 (1429475) | more than 5 years ago | (#28470195)

I thought I was being childish when I thought to myself "tehee, homo-morphic,"
but after RTFA my suspicions may be justified:

Two fathers of modern encryption...

thanks, but no (0, Redundant)

JackSpratts (660957) | more than 5 years ago | (#28470331)

ok, so if your sales are tanking your competitors can analyse your encrypted offsite data to find out your sales are tanking. that about it?

up to you but i'll pass.

Not really a threat to privacy (2, Interesting)

bk2204 (310841) | more than 5 years ago | (#28470425)

Basically, IBM has created a set of cryptographic algorithms that allow fully homomorphic encryption. If you don't want your data to be analyzed, all you have to do is use an algorithm that doesn't support it. You'd want to do that anyway, since you'd want to use algorithms that are already considered strong, such as RSA and AES. Although RSA is homomorphic in theory, in practice it is not, since padding is used to prevent other weaknesses.

Homomorphic Agenda (0)

Anonymous Coward | more than 5 years ago | (#28470549)

This is clearly a plot to turn our encrypted files gay.

Star Trek prior art (1)

Muad'Dave (255648) | more than 5 years ago | (#28470561)

Fully homomorphic encryption is a bit like enabling a layperson to perform flawless neurosurgery while blindfolded, and without later remembering the episode.

Oh, I get it! It's like when Dr. McCoy reinstalled Spock's brain. McCoy was an idiot before, got the 1337 skillz, and then forgot it all.

Setec Astronomy (1)

tekrat (242117) | more than 5 years ago | (#28470617)

I don't suppose the researcher's name was Janik?

Deep packet inspection/National firewalls (1)

zig43 (1422373) | more than 5 years ago | (#28470733)

Great...the net nannies and oppressive governments will have yet another censorship tool in their arsenal.

Homomorphism (5, Insightful)

NAR8789 (894504) | more than 5 years ago | (#28470903)

This article needs some clarification. In particular, a lot of the worried comments here show a lack of understanding of the word "homomorphic".

Here's a very simplified example of a homomorphism. I define a function
f(x) = 3x
This function is a homomorphism on numbers under addition. Its image "preserves" the addition operation. What I mean more precisely is
f(a) + f(b) = f(a + b)
That's pretty easy to verify for the function I've given.

Homomorphic encryption is interested in an encryption function f() that preserves useful computational operations. If we take my example as a very, very simplified encryption, then say I have two numbers, 6 and 15, and I lack the computational power to do addition, but I can encrypt my data with my key--3. (I'm generalizing my function to be multiplication by a key. And yes, for some reason I have the computational power to do multiplication. Humor me.) I can encrypt my data, f(6) = 18 and f(15) = 45, and pass these to you, and ask you to do the addition for me. You'll do the addition, get 63, and pass this result to me, which I can then decrypt, yielding 21.

Now, my encryption here is very simple and very, very weak, but if you're willing to suspend disbelief, you'll note that the information I've allowed you to handle does not reveal either my inputs or my outputs. (In fact, with the particular numbers I've chosen, you might guess that my key is 9 instead of 3, (though relying on lucky choices or constraining myself to choices which have this property make my scheme rather useless))

If you generalize this to strong encryption and more useful computational operations, you begin to see how homomorphic encryption can be useful. One should note that, no, homomorphic encryption will not be a drop-in replacement for other forms of encryption. (Sending encrypted emails with homomorphic encryption would be unwise: an attacker can modify the data (though, if my understanding is correct, only with other data encrypted with the same key).) Homomorphic encryption simply fills a need that the other forms do not serve.

Hopefully you now also see how the article's use of the word "analysis" can be rather misleading. In particular, one of the earlier comments notes that it might be useful in allowing you to determine if different people's encrypted information is identical. By my understanding, homomorphic encryption would not allow this.

In any case, if my explanation is not enough, here's [wikipedia.org] the Wikipedia article.
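The parent's worked example, rendered as runnable Python (a toy scheme, obviously not real encryption):

```python
# The f(x) = key * x toy homomorphism from the comment above.
key = 3                      # my secret key

def E(x): return key * x     # "encrypt": multiply by the key
def D(c): return c // key    # "decrypt": divide by the key

# I encrypt 6 and 15 and hand you only the ciphertexts...
ca, cb = E(6), E(15)         # 18 and 45
# ...you add them for me, without learning my inputs or my output...
result = ca + cb             # 63
# ...and I decrypt to recover 6 + 15 = 21.
assert D(result) == 21
```

This works because E(a) + E(b) = key*a + key*b = key*(a + b) = E(a + b) - exactly the homomorphism property f(a) + f(b) = f(a + b) stated above.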

Do I need to understand this? (1)

grikdog (697841) | more than 5 years ago | (#28471543)

If someone can detect spam in public-key encrypted email, how is this different from knowing my private key? Even if they promise they haven't read my mail, only "analysed" it, hasn't my encryption scheme been compromised?

In particular, hasn't the long-standing suspicion that public-key encryption reduces to a trivial case been confirmed? (A suspicion that arose when NSA dropped its objections to Phil Zimmerman's PGP scheme involving public-key encryption of the actual key being used to encipher plaintext using Triple-DES or IDEA or whatever it was. Either NSA knew how to crack a 256-bit key, or public-key encryption was already compromised.)

Similarly, to calculate the time required to discover Major Kong's OPE code prefix, given the 28 SAC B-52 bombers proceeding to targets inside Russia on the Dr. Strangelove Big Board, we simply divide 26 cubed by 28, distributing a range of 628 prefix combinations to as many teams of radiomen, who can send perhaps 20 prefixes a minute to a single B-52. Major Kong's OPE prefix should turn up in about 31 minutes, well within the deadline, provided his CRM 114 Discriminator isn't already toast. Now that's useful.

