
Calif. Attorney General: We Need To Crack Down On Companies That Don't Encrypt

Soulskill posted 1 year,20 days | from the making-it-more-painful-to-be-bad-at-security dept.

Encryption 127

tsamsoniw writes "California Attorney General Kamala Harris says her office will start cracking down on companies in the Golden State that don't encrypt customer data and fall victim to data breaches; she's also calling on the state to pass a law requiring companies to use encryption. That's just one of the recommendations in the state's newly released data breach report, which says 131 companies in California suffered data breaches in 2012, affecting 2.5 million residents."


127 comments

Encryption (0)

Anonymous Coward | 1 year,20 days | (#44174063)

ROT13 FTW!!!

Re:Encryption (2, Funny)

Anonymous Coward | 1 year,20 days | (#44174077)

EBG13 SGJ!

Re:Encryption (1)

leuk_he (194174) | 1 year,20 days | (#44174277)

Double rot13 for the win! ;)

One thing: encryption of laptop drives and external USB/hard disks is useful against simple loss/theft. Encryption of company servers only burns CPU cycles, since the key is available to the users who have to use it.

Re:Encryption (1)

oPless (63249) | 1 year,20 days | (#44174445)

I detect someone who doesn't know what an HSM is, nor what one is used for.

Here, have a Wikipedia link and learn something today: http://en.wikipedia.org/wiki/Hardware_security_module [wikipedia.org]

Re:Encryption (3, Insightful)

Bert64 (520050) | 1 year,20 days | (#44174731)

So instead of burning CPU cycles, you are burning crypto processor cycles, plus you have the cost of buying the hardware in the first place and possibly the bus overhead of sending data to/from the device.

If the server gets compromised while it's running, the data is accessible, because the server needs access to the data in order to function.

If the server gets physically stolen, it's likely the crypto hardware will be stolen with it. If you store the key somewhere it can be automatically obtained and used, then the key can be stolen too; if you enter the key manually on bootup (i.e., as you would on a laptop), then you require physical intervention whenever the server reboots for any reason.

Encryption has its uses, but it's not a magic bullet, and poor/inappropriate use of encryption is damaging - not only does it waste resources unnecessarily, but it also brings a false sense of security and encourages lazy thinking... People will simply implement the bare minimum required to comply with the law, which will probably mean encrypting the data while leaving the key on the same box.

You will also end up with a "one size fits all" attitude, which is clearly ridiculous...
You need to consider *what* data you're storing, *why* you're storing it, and *what* needs to access it.

You can segregate the data so that some is only accessible by those systems that need it.
You can tokenize the data; e.g., for repeat billing of a credit card you can store a token agreed only between you and your payment processor.
You can store rarely referenced data with public/private keys, leaving only the public key online and keeping the private key offline for use when necessary (a sketch of this follows).
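
To make that last option concrete, here is a minimal sketch of encrypting records to a public key whose private half never touches the server, using the third-party Python cryptography package; the key size, message, and padding parameters are illustrative choices, not a prescription:

    # Sketch: the server holds only the public key; the private key stays
    # offline. Assumes the "cryptography" package (pip install cryptography);
    # all names and parameters here are illustrative.
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # Done once, offline: generate the pair, hand only the public half
    # to the server.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
    public_pem = private_key.public_key().public_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PublicFormat.SubjectPublicKeyInfo)

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # On the server: encrypt-only. Nothing on this box can reverse it.
    server_key = serialization.load_pem_public_key(public_pem)
    ciphertext = server_key.encrypt(b"rarely referenced record", oaep)

    # Offline, when the record is actually needed:
    assert private_key.decrypt(ciphertext, oaep) == b"rarely referenced record"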

No, pushing a one size fits all "encrypt your data" mandate is stupid and will only make things worse, each individual case needs to be designed by someone who understands the needs and is technically competent.

Re:Encryption (2)

AJH16 (940784) | 1 year,20 days | (#44175839)

While you are correct about the impact on anything currently running on the server, you are dead wrong about physical theft. An HSM should be hardened against picking the key out of it, and should actually destroy the key if tampering is detected. Encryption on the server is still of limited benefit, since the data key could probably be abused in most remote exploits on a running system; but for powered-down security, such as a physical breach, it is very significant, even if the chances of someone breaking in and stealing a server are generally much lower than of a remote intrusion (though not as much lower as you might think, since many attacks are internal).

Re: Encryption (0)

Anonymous Coward | 1 year,20 days | (#44176009)

Dude, we steal the server, HSM and all, set it up in our lab, and then we have all the time in the world to try bus-based exploits.

Re: Encryption (1)

Charles Duffy (2856687) | 1 year,20 days | (#44177923)

Dude, we steal the server, HSM and all, set it up in our lab, and then we have all the time in the world to try bus-based exploits.

So? Signatures happen on the HSM, which also stores the key material; only the cleartext data going in and the signatures going out are on the bus.

And if you mess up in the lab, the HSM kills its keystore and game-over. (Or, if it doesn't, the folks on the other side were insufficiently paranoid / excessively cheap and it's Their Own Damned Fault).

key storage (0)

Anonymous Coward | 1 year,20 days | (#44176075)

If the server gets physically stolen, it's likely the crypto hardware will be stolen with it. If you store the key somewhere it can be automatically obtained and used, then the key can be stolen too; if you enter the key manually on bootup (i.e., as you would on a laptop), then you require physical intervention whenever the server reboots for any reason.

Unless you use a mechanism to store the key(s) separately from the data:

http://en.wikipedia.org/wiki/Key_Management_Interoperability_Protocol

It is certainly a tricky problem, but I think that if you pass a law that mandates things with a decent time frame (18 months?), then you'll see that people will find solutions. California is the home of Silicon Valley after all, and I'm sure there are plenty of eggheads who can deal with this. It may be that the entire industry will benefit, as those same eggheads will start providing the solution elsewhere (either as products or open source).

Re:Encryption (2)

WaywardGeek (1480513) | 1 year,20 days | (#44176397)

I think you have some misconceptions about the CPU cycles involved in encryption. It's basically free. It's just a few clock cycles per byte.

The part everyone is concerned about is key stretching, where a CPU needs to do about half a second's worth of processing to hash a password. There is simply no reason to do key stretching on the server; that's a dumb architecture. Instead, make the clients do it. By default, Microsoft does the key stretching on the server, and only for about a millisecond, if that.
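
For what it's worth, here is a minimal sketch of that split, using only Python's standard library; the iteration count and the server-side storage scheme are illustrative assumptions:

    # Sketch: the client burns the CPU doing the stretching; the server
    # stores only a fast hash of the stretched value. Iteration count and
    # storage format are illustrative.
    import hashlib, hmac, os

    def client_stretch(password: str, salt: bytes) -> bytes:
        # Expensive step, run client-side (hundreds of milliseconds).
        return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 600_000)

    def server_record(stretched: bytes) -> bytes:
        # Cheap step, run server-side (microseconds).
        return hashlib.sha256(stretched).digest()

    salt = os.urandom(16)  # per-user salt, stored by the server
    stored = server_record(client_stretch("hunter2", salt))

    # Login attempt: re-derive and compare in constant time.
    attempt = server_record(client_stretch("hunter2", salt))
    assert hmac.compare_digest(stored, attempt)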

I think encryption provides more security than you suspect. If an attacker only has access for a short period of time, like an hour, then probably over 90% of your user accounts would be safe. It's one thing for an attacker to steal your backup media, or get ssh access and scp some files, and quite another for him to hack your server and monitor what goes on in memory in real time. Copying files can be done by anyone; even the secretary or janitor is a potential leak. Getting root access, inserting a memory monitor around your application, and decoding what's going on requires a skilled programmer and a lot of effort. Guys in China who do this professionally can maybe do it in their sleep, but chances are that you and I would have to work pretty hard at it.

There are two problems I see happening all around that this law could help fix. First, companies always want full access to user data. No encryption is the standard knee-jerk reaction at big companies, because they want to be able to do data mining on user data. Apparently, there is no penalty of consequence for companies that lose control of user data, and clearly the user data is valuable to the company; some companies even sell it. Because of this, we have a stupid amount of non-encrypted data, even data that really isn't valuable for data mining, such as credit card info. The second problem I see a ton of is that management just takes IT's word for it that they are secure, while IT mostly ignores management because management isn't capable of understanding security anyway. It's the nature of employees in every profession to be lazy about tasks that will never be checked, and it's the nature of management to consider their company above the rest in terms of how well it is run.

Just guessing... if this law is enforced, California could reduce user info leaks by maybe 100X. Probably 10X just for making management want user data encrypted, and another 10X for making employees care.

Is this law a good idea? Beats me... Why not just post a list of every data leak the way police have a crime blotter in the local news rags? If we could make users aware of how badly their data is managed, companies would come around to caring more about it.

Re:Encryption (1)

endus (698588) | 1 year,20 days | (#44176581)

Yea, I work in the security industry and I don't really agree. I hear what you're saying about considering each application and you're not wrong, but I think the potential benefits of this easily outweigh the negatives. It will apply pressure to companies who really do need to encrypt their data and just cannot get the will from the business to do it.

It's not a magic bullet, but especially in the absence of any legitimate way to wipe data from databases in a secure manner, it's a reasonable compensating control to put in place. It really depends on the actual implementation whether or not the encryption will help if the server is compromised while it's running. If companies encrypt at the database or table level and implement things decently, then at least it's not just a matter of compromising the server and copying the entire database off to get the information. Web-based attacks are probably going to compromise the database's security, but at least information secured in this way would be safe(r) from network-based worms and other malware. That is not a trivial or uncommon attack vector, and I think it's worth serious consideration.

The other aspect of this is that it would force a lot of companies to implement real key management procedures in order to not lose access to their data. Once they need to do that to keep the business running, they'll be much more receptive to rotating and expiring keys, etc., because it's low-hanging fruit. Right now key management is kind of a nightmare and not something I see a lot of companies handling effectively. If you have to deal with key management in order not to take down your entire business, then being more selective about who has access to those keys, split knowledge, etc. become a much more realistic proposition. That will demonstrably increase security as well as compliance with other regs/standards.
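
As an aside for readers who haven't met it, the usual trick that makes rotation affordable is envelope encryption: each record gets its own data key, and only those small keys are wrapped by the master key. A minimal sketch using the Fernet recipe from the Python cryptography package; in a real design the master key would live in an HSM or KMS rather than in process memory:

    # Sketch of envelope encryption: rotating the master key (KEK) means
    # re-wrapping tiny data keys (DEKs), not re-encrypting every record.
    # Fernet is used for brevity; key storage is hand-waved.
    from cryptography.fernet import Fernet

    kek = Fernet(Fernet.generate_key())           # master key

    def encrypt_record(plaintext: bytes):
        dek = Fernet.generate_key()               # fresh key per record
        return kek.encrypt(dek), Fernet(dek).encrypt(plaintext)

    wrapped_dek, blob = encrypt_record(b"card ending 4242")

    # Rotation: only the wrapped key is touched; the data blob is not.
    new_kek = Fernet(Fernet.generate_key())
    wrapped_dek = new_kek.encrypt(kek.decrypt(wrapped_dek))

    dek = new_kek.decrypt(wrapped_dek)
    assert Fernet(dek).decrypt(blob) == b"card ending 4242"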

I'm both a Libertarian and a security professional... I am suspicious of government regs, but I think they are needed in this case. The industry is not keeping up with the security landscape well enough, and this stuff is far enough out of the public's line of view that it has the potential to negatively impact their lives out of nowhere, and there is no way for them to audit or verify a company's security measures before engaging with it. I think that is a threat to the public welfare, and something that does fall within the role of government. Implementing encryption in this way is not going to be that onerous, and it will have a tremendous impact on people who really REALLY do need to encrypt their data, at the price of a bit of a hassle for those who don't. As this becomes more widespread, key management and implementation of encryption will also become easier, making it less onerous for people who don't necessarily need extremely tight security.

Re:Encryption (1)

dgatwood (11270) | 1 year,20 days | (#44179511)

Yea, I work in the security industry and I don't really agree. I hear what you're saying about considering each application and you're not wrong, but I think the potential benefits of this easily outweigh the negatives. It will apply pressure to companies who really do need to encrypt their data and just cannot get the will from the business to do it.

No, it won't. It will cause every non-corporate-run website run by individuals within the state of California to shut down because of the inability to pay the exorbitant costs suddenly required to keep the sites open.

I run a completely zero-profit website. The only personally identifiable information my site stores is a username and password. Everything else is voluntary. It is not involved in any financial transactions. It should not be affected by laws like this in any sane universe. The problem is that legislators don't understand technology, and will say, "Yes, but users could use the same username and password on multiple sites." And yes, they could. And if my database is compromised, I would have to send out letters to all... well, currently one user... informing myself of the breach. (Have I mentioned that this site is just getting off the ground?)

The big problem is that the database uses a shared hosting plan and a shared database server run by my ISP. I have no control over whether the database is encrypted on disk or in transit between the shared hosting server and the database server. In order to add that protection, I would have to crank my hosting plan up to a dedicated server at a monthly cost that is equivalent to several years on my current hosting plan and buy a multi-subdomain SSL cert that also costs (annually) as much as several years worth of service. And then, because I cannot possibly dedicate the time to manage my own server on an ongoing basis (hence the shared hosting plan as opposed to a VPS for the web server side), I would have to hire someone to manage that on an ongoing basis.

So if this law is not very narrowly tailored to sites that contain SSNs, financial information, and medical information, I'll have no choice but to shut my site down. I can't afford to personally spend potentially many thousands of dollars each year to run a website out of the goodness of my heart.

Implementing encryption in this way is not going to be that onerous...

In my experience, any security practice that is not onerous also has little effect on security. Physical theft of spinning storage is an exceptionally rare cause of data breaches. Backup tapes are stolen occasionally. Offline copies of the database that sysadmins allowed their employees to take home with them on a laptop are also stolen occasionally. However, data theft caused by attackers remotely cracking into servers overshadows both of those loss mechanisms by orders of magnitude. Because remote data compromises are completely unaffected by encrypting the database on disk, use of encryption is very nearly worthless in the grand scheme of things. Requiring encryption on disk is a bit like trying to prevent automobile accident deaths by requiring children to wear bicycle helmets while in the car just in case the kids didn't wear their seatbelts.

Furthermore, it is completely unnecessary. There are already laws that require encryption for anything that could be considered high-risk. HIPAA has strict requirements for how health-related data can be stored. PCI DSS compliance requires encryption of credit card data. And so on. Any company that sanely should be required to use database encryption is already compelled by law to do these things. Anyone who is not already compelled to do them should not be, because it really isn't an earth-shattering event if a hashed password database gets compromised, and the odds of it being compromised through physical theft border on nonexistent anyway. My database is more likely to be destroyed by a meteor impact than to be physically stolen by someone hoping to run rainbow tables against it and use that to guess what random college students' Facebook passwords might be.

That said, I would be in favor of laws setting a timetable whereby every website must upgrade to using parameterized SQL queries. That would be a million times more useful at improving security of users' data than back-end encryption, does not require significant effort or expense except by the companies or individuals writing the software that those websites use, and does not require any infrastructure changes that may be beyond the website's control.
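
For readers who haven't met the term: a parameterized query passes user input as a bound value instead of splicing it into the SQL text, so the input can never change the query's structure. A minimal sketch with Python's built-in sqlite3 module; the schema is illustrative:

    # Sketch: string-built SQL vs. a parameterized query (sqlite3,
    # illustrative schema).
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, pwhash TEXT)")
    conn.execute("INSERT INTO users VALUES ('alice', 'x')")

    evil = "alice' OR '1'='1"  # hostile input

    # Vulnerable: the input becomes part of the SQL text itself.
    rows = conn.execute("SELECT * FROM users WHERE name = '%s'" % evil).fetchall()
    print(len(rows))  # 1 -- the injected OR clause matched every row

    # Parameterized: the driver binds the input as data.
    rows = conn.execute("SELECT * FROM users WHERE name = ?", (evil,)).fetchall()
    print(len(rows))  # 0 -- no user is literally named that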

Re:Encryption (0)

Anonymous Coward | 1 year,20 days | (#44176597)

I will agree with you on poorly implemented encryption. A few simple design choices will lower the risk:

Encrypt at the application level instead of the DB level. Then you don't need to have all of the data available decrypted at the same time. As records come into the application, they are encrypted there one by one, and the DB just holds chunks of encrypted data in its fields. Even if the app server is compromised while running, the encryption is all handled inside the app itself. That sounds like a joy to intercept.

Split the key between your DB and application server. The app calls the DB portion of the key to handle the encryption/decryption, and if one server is compromised or physically stolen, the partial key is useless. Sure, if someone gets both servers the key is technically recoverable, but the amount of work involved is dramatically higher. Does poring through someone's source code for a partial key sound like fun?
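
A minimal sketch of that split, assuming a simple XOR share scheme; where each share actually lives (app config, DB table) is left out:

    # Sketch: a data key split into two XOR shares held on different
    # hosts. Either share alone is random noise; the real key exists
    # only transiently in memory when the two are combined.
    import os

    def split_key(key: bytes):
        share_a = os.urandom(len(key))                      # random mask
        share_b = bytes(a ^ k for a, k in zip(share_a, key))
        return share_a, share_b

    def combine(share_a: bytes, share_b: bytes) -> bytes:
        return bytes(a ^ b for a, b in zip(share_a, share_b))

    data_key = os.urandom(32)                  # e.g. an AES-256 key
    app_share, db_share = split_key(data_key)  # stored on separate servers
    assert combine(app_share, db_share) == data_key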

As for resource problems, if you are running a business that stores sensitive data as part of your business model, hardware should _never_ be the limiting factor. Size your equipment properly and let the accountants write it down as a depreciating capital expense. If you can't afford servers that can handle the encryption, then you will run into much more damaging problems well before you make it on the radar for a targeted attack.

Be sure to leave a frontdoor for the NSA (1)

For a Free Internet (1594621) | 1 year,20 days | (#44174073)

A workers' soviet government is the only solution! Down with the spying, lying, mass-murdering, racist, imperialist bourgeois parasites!

NSA (4, Funny)

Anonymous Coward | 1 year,20 days | (#44174085)

We Need To Crack Down On Companies That Do Encrypt

Re:NSA (1)

Anonymous Coward | 1 year,20 days | (#44174305)

It is a shame that even popular open source projects don't bother. For example, Mozilla's Thunderbird chat has no OTR support, and Mozilla Firefox and other browsers treat self-signed certs as WORSE than unencrypted and put big scary messages up. Instead there really should be three different "modes": what exists now in all browsers, with certificate authorities, for those who want to talk to their banks etc.; self-signed certs, which should just work like an unencrypted link (no full secure icon etc.); then plain text - the worst option of the three.

P.S. Slashdot: Where is my secure connection? Anonymous coward... not.

Re:NSA (4, Insightful)

Chrisq (894406) | 1 year,20 days | (#44174409)

Mozilla Firefox and other browsers treat self-signed certs as WORSE than unencrypted and put big scary messages up

I think it is a reasonable action for a certificate whose source you don't know. You can always add the certificate to your browser [mozilla.org] and avoid the error. The rationale for the pop-up is that an unknown self-signed certificate is as bad as no encryption - totally open to a man-in-the-middle attack - but people have a higher expectation of security from SSL.

Re:NSA (5, Insightful)

tajribah (523654) | 1 year,20 days | (#44174521)

Is "as bad as no encryption" a reason for yelling on the user and presenting it like the worst security problem ever? Even if I accept the premise that it is as bad as no encryption, the obvious conclusion is that the browser should present it the same as no encryption.

Actually, it is not as bad. It still keeps you safe from passive attacks (like your ISP collecting all data for a three-letter agency, which analyses them later).

Re:NSA (1)

rioki (1328185) | 1 year,20 days | (#44174711)

Well, it depends on what you are doing. Using your own private service over the internet, secured by SSL? No big deal; register the cert. Using your online banking and the cert is self-signed? Better check on that. The reason is that with no encryption it is clear that the connection is unsafe, and most people will (hopefully) not do anything sensitive. But putting trust in a self-signed certificate is a gamble, especially when you assume that the SSL connection is being used to transfer secure data. The reason it is considered worse is that it is assumed that the SSL connection will carry sensitive data but the unsecured one will not, yet the level of protection (authentication, not encryption) is the same.

Re:NSA (1)

FriendlyLurker (50431) | 1 year,20 days | (#44174799)

That is an "all or none" argument. If self signed certs look feel and behave the same as what unencrypted does now, then people have no reason to behave differently than they do with unencrypted. Sadly and as numerous researchers have shown (like this one - "Crying Wolf: An Empirical Study of SSL Warning Effectiveness" [psu.edu] ) people quite happily transfer secure data over unencrypted connections in the current setup anyway. This further undermines your argument and the rational that treating self signed certs the same as plaintext is considered worse, especially given recent mass data passive collection revelations...

Re:NSA (0)

Anonymous Coward | 1 year,20 days | (#44175331)

How many people call their bank and check the SSL certificate's fingerprint? (Assuming the phone hasn't been MITM'd or something.) Nope, we just trust the CAs. How weird.

Re:NSA (1)

AmiMoJo (196126) | 1 year,20 days | (#44174741)

It isn't the same as no encryption. The site is making a claim that cannot be verified, and which often points to fraud. Treating it as unencrypted would open up all sorts of man-in-the-middle attacks by criminals, ISPs and three-letter agencies quietly intercepting and replacing security certificates. Do you think people check for HTTPS and a valid cert every time they connect to their bank or email account?

Re:NSA (1)

tajribah (523654) | 1 year,20 days | (#44174807)

I expect the browser to clearly inform the user whether the connection is safe (HTTPS with a verified certificate) or unsafe (either plain HTTP, or HTTPS with an unknown certificate). I also expect the user to check that a connection to his bank is reported as safe. If you are interested in preventing attacks against careless users, the browser might also notify the user that a site previously known to have a safe connection no longer has one. However, I do not think this is of much help: many users just enter the domain name of their bank and rely on the bank to redirect the HTTP version to the HTTPS one, which is where a MiTM attacker can always succeed. (An interesting special case is invalid certificates: expired ones, or certificates issued for a different domain. Here, a big fat warning could be appropriate.)

Re:NSA (1)

FriendlyLurker (50431) | 1 year,20 days | (#44174873)

No, it isn't the same; it is better. Unencrypted traffic is already open to all sorts of man-in-the-middle attacks; criminals, ISPs, and three-letter agencies are already "quietly intercepting" and recording EVERYONE'S traffic. Making them go one step further and have to target individuals in order to replace a deluge of self-signed security certificates is a big positive step. Also, if self-signed certs are never blessed with the security icon by default, then people will not fall for fraudulent fakes, because the browser would never be telling them they have a "secure for your bank traffic" type of connection with a self-signed cert.

Re:NSA (1)

nine-times (778537) | 1 year,20 days | (#44175405)

I think part of the rationale is that a self-signed certificate very well might be a sign that you're the victim of a man-in-the-middle attack, and it needs to be treated as a serious potential threat.

Personally, I don't think the problem is that web browsers treat self-signed certs as dangerous. I think the real problem is that the only infrastructure we have for authenticating certificates is expensive and unwieldy. We need to have a way of generating and authenticating certificates that's easy enough for a novice, and functionally free for most legitimate uses.
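
On the "functionally free" part: generating a certificate already costs nothing; what costs money today is getting a CA to vouch for it. A throwaway sketch with the Python cryptography package (the subject name and validity period are illustrative):

    # Sketch: a self-signed certificate in a dozen lines. Assumes the
    # "cryptography" package; subject name and lifetime are illustrative.
    import datetime
    from cryptography import x509
    from cryptography.x509.oid import NameOID
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import rsa

    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "example.org")])
    now = datetime.datetime.utcnow()
    cert = (x509.CertificateBuilder()
            .subject_name(name)
            .issuer_name(name)        # self-signed: issuer == subject
            .public_key(key.public_key())
            .serial_number(x509.random_serial_number())
            .not_valid_before(now)
            .not_valid_after(now + datetime.timedelta(days=365))
            .sign(key, hashes.SHA256()))
    pem = cert.public_bytes(serialization.Encoding.PEM)  # ready to serve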

Re:NSA (1)

tajribah (523654) | 1 year,20 days | (#44175451)

I think part of the rationale is that a self-signed certificate very well might be a sign that you're the victim of a man-in-the-middle attack, and it needs to be treated as a serious potential threat.

This sounds good in theory, but the reality is that self-signed certificates (or those signed by an authority your browser does not recognize) are several orders of magnitude more common than MiTM attacks.

Otherwise, I agree that a big part of the problem is unusable UI for managing certificates in almost all existing browsers.

Re:NSA (0)

Anonymous Coward | 1 year,20 days | (#44178647)

Firefox and the other browsers should have an automatic cert patrol built in. They just need to keep a database of all the websites visited and the SSL certificates given. I did a test with my parents, and Cert Patrol gave the right answer every time as to what they should do with its suggestions. If, instead of the suggestions, it popped up a warning when a site suddenly started using a self-signed certificate, or the certificate changed long before it expired, it would be much safer.

Re:NSA (4, Insightful)

FriendlyLurker (50431) | 1 year,20 days | (#44174567)

people have a higher expectation of security from SSL.

I think the GP's point was that it does not have to be all or none - that you can have SSL with a self-signed cert without the error message and without giving any "expectation of [high] security" (to quote GP: "no full secure icon").

The rationale for the pop-up is that an unknown self-signed certificate is as bad as no encryption

In light of the Snowden revelations and subsequent fallout, this rationale has very few legs to stand on: an unencrypted connection is less desirable than one encrypted under a self-signed cert. The only argument I have seen against this is that people may be lulled into a false sense of security if they believe self-signed certs are as secure as CA-issued ones, falling for MITM attacks on their bank traffic etc. The counter to that is simple and sensible: no, not if the browser does not try to tell them they have a top-secure connection - and treats it like a plain text connection.

self-signed certificate is... totally open to a man-in-the-middle attack

The current SSL system is also totally open to man-in-the-middle attacks by state sponsors, as has been reported here various times. And yes, self-signed certs are also very vulnerable to the same attack - but the point here is to encrypt the majority of data. State sponsors can always target individuals, but with blanket always-on encryption they are unable to perform mass illegal capture and storage... that is the point of not raising an error message on self-signed certs.

Any way I cut these arguments, browsers appear to be in the wrong on this one - throw in cosy relationships with CAs, state departments, etc., and we could have a conspiracy here.

Re:NSA (0)

Anonymous Coward | 1 year,20 days | (#44174719)

no, not if the browser does not try to tell them they have a top secure connection - and treats it like it is a plain text connection.

I would take it one step further. The "big scary error message" should pop up for plaintext connections. Oh, and you could always add an "unencrypted is OK for this website" exception to your browser security settings, like everyone trying to defend the current sorry state of browser encryption likes to remind us.

Re:NSA (1)

blueg3 (192743) | 1 year,20 days | (#44174885)

The current SSL system is also totally open to a main-in-the-middle attacks by state sponsors, as has been reported here various times. And yes self signed certs are also very vulnerable to the same attack - but the point here is to encrypt the majority of data.

They're not vulnerable to "the same attack". One attack requires hacking a CA or exerting very substantial influence over them. The other doesn't. The set of malicious actors who can -- and do -- MitM you if you use self-signed certs is much, much larger than the set of actors who can do it if you use CA-signed certs.

The only argument I have seen against this rational is that people may be lulled into a false sense of security if they believe self signed certs are as secure as CA issued ones, falling for MITM attacks for their bank traffic etc. The counter to that is that is simple and sensible: no, not if the browser does not try to tell them they have a top secure connection - and treats it like it is a plain text connection.

Yes, that's pretty much the argument. The danger is that they could think that their connection is somehow more secure than plaintext. You cannot safely fix this without determining user intent, and even the user can't usually be trusted to determine their intent. If they intended to use HTTPS to get real security but instead were presented with a self-signed certificate, they should be warned. They shouldn't just continue on blithely unaware -- which is exactly what will happen if you treat it as a normal unencrypted connection. People don't check these things. If they intended to have no security, like HTTP, actually using HTTPS with a self-signed certificate is no worse. But you can't easily tell the difference between those two situations. By treating self-signed-certificate-based HTTPS as if it were normal HTTP, you make it easier to use a shitty security mechanism (a key exchange where you have no idea if the party you're exchanging keys with is the party you intend to communicate with), but you make it much harder for most users to detect a common and very dangerous attack. That's a terrible tradeoff.

It'd be easier to add an extension to HTTP that enables opportunistic un-cert'ed encryption if both client and server support it, then push support for that extension into Firefox and Apache (or convince Google that it's a good idea). Except that no serious website will benefit at all from this, since they can all afford the small cost of a certificate for HTTPS.

Re:NSA (1)

FriendlyLurker (50431) | 1 year,20 days | (#44175267)

The danger is that they could think that their connection is somehow more secure than plaintext.

It is a danger *only* if the browser is giving some indication of security. If the browser does not give any indication or expectation of privacy with self-signed certs, then there is no danger. Most browsers already do not show the protocol being used for plaintext (no http:// display).

You cannot safely fix this without determining user intent, and even the user can't usually be trusted to determine their intent.

You can safely fix it by not changing anything from the normal unencrypted experience. If they intended to use HTTPS to get real security but instead were presented with a self-signed certificate, and the browser defaulted to the plain text view (no SSL icon or indication of security), then the user does not need any extra warning. Of important note: this is already the default behavior if you try to use https on a website that does not support it, like slashdot - the browser defaults to the plain text view without any warning, any error. If the browsers were truly as worried about this problem as you claim, then there would also be big scary popup messages instead of the silent redirection from https to http.

They shouldn't just continue on blithely unaware -- which is exactly what will happen if you treat it as a normal unencrypted connection.

As mentioned above, this is already the default if you punch in https on a website that does not support it. Unfortunately, and in any case, research has shown repeatedly that people continue on anyway regardless (e.g. "Crying Wolf: An Empirical Study of SSL Warning Effectiveness" [psu.edu]), which further strengthens the case for self-signed certs being treated the same as plain text connections in every way.

At the end of the day, our data is being collected and stored en masse. It is passing through an unknown number of private companies like Booz Allen and other private third parties on its way to the security apparatus, including an unknown number of individuals who are all unaccountable in any meaningful way for how they use or abuse that data. The majority of internet traffic flows unencrypted, despite your claim that "no serious website will benefit at all from this, since they all can afford the small cost of a certificate". All the current extra "scary" warning on self-signed certs is doing is effectively denying the larger part of internet traffic the ability to be encrypted - that is a much worse tradeoff than raising self-signed certs to the level of plain text.

Re:NSA (1)

petermgreen (876956) | 1 year,20 days | (#44175965)

If they intended to use HTTPS to get real security but instead were presented with a self-signed certificate, and the browser defaulted to the plain text view (no SSL icon or indication of security), then the user does not need any extra warning.

When I make a request to an https url, I expect the information contained within that request (parts of the url other than the hostname, post data if any, cookies if any) to be sent over an encrypted and authenticated link. By the time I can "look for the padlock", the potentially private information has already been sent. So if the connection cannot be authenticated, the browser MUST warn me* BEFORE it continues with the request.

I support systems that allow encrypted but unauthenticated connections to be presented to the user in the same way as unencrypted connections, but to maintain current security the https url scheme MUST NOT be used for such connections. Either a new url scheme should be allocated for such connections, or a protocol should be used that can share the http url scheme.

This is already the default behavior if you try to use https on a website that does not support it, like slashdot - the browser defaults to the plain text view without any warning, any error.

If you try to go to a website that doesn't support https with an https url, then your request will just time out. An attacker can't simply block secure connections to force you to unknowingly use insecure ones.

slashdot DOES support https; it just happens to serve up a redirect to plain http. Blindly serving up redirects from https to plain http is risky, as it can reveal potentially private information in the url, but as far as the protocol is concerned it's /.'s call to make that decision (just as it's /.'s call to take the information you posted over https and post it for the public to see), and /.'s identity was verified** before they got the chance to make that decision.

* Of course I can ignore the warning; there is only so much you can do to protect a careless or incompetent user from themselves.
** To the extent that the CA model can verify certificate ownership.

Re:NSA (1)

fulldecent (598482) | 1 year,20 days | (#44175447)

OT

Is there a list of CAs that have been compromised, including evidence? E.g., it would post two signed and valid certificates for google.com for the same time period, but one of them with obviously the wrong IP address.

Re:NSA (0)

Anonymous Coward | 1 year,20 days | (#44176171)

Are you postulating the lack of such a list is evidence that there has been no compromise?

Interacting with Google, it is especially hard to be vigilant. They use so many certificates and change them so often that it is difficult to keep track of them. Try installing the CertPatrol plugin for Firefox: every time you connect to Google, it complains that the certificate has changed.

Re:NSA (1)

WaywardGeek (1480513) | 1 year,20 days | (#44176625)

Dude... companies do this all the time, if for no other reason than to compress network traffic. They just buy boxes like this one [sourcefire.com]. All you do is override DNS and the CA. It's standard practice.

Re:NSA (1)

complete loony (663508) | 1 year,20 days | (#44175121)

With DNSSEC we should be able to publish and verify certificate information via signed dns records, which would also shift the root of the trust relationship up to the dns registrars. And since the authentication part of CA certificates is tied to dns already, I don't see that this would change much.

Re:NSA (1)

FooAtWFU (699187) | 1 year,20 days | (#44175287)

"I think the GPs point was that it does not have to be a all or none - that you can have SSL of a self signed cert without the error message and without giving any "expectation of [high] security" (to quote GP "no full secure icon")"

Can you, really? I mean, we have a big enough problem with training users to type credentials in a login box served by http://www.myfavoritebank.com/ [myfavoritebank.com] all insecure-like. This area where security intersects user interaction design is a tricksy one.

Re:NSA (0)

Anonymous Coward | 1 year,20 days | (#44175783)

And the problem whereby browser SSL "security" is not protecting the majority of people against massive, pervasive data collection is an even bigger problem. It's all about perceived risks. If your bank has not issued you a key generator (which makes MITM attacks useless, encrypted or not), then they do not take security seriously anyway.

Re:NSA (2)

Bert64 (520050) | 1 year,20 days | (#44174991)

On the other hand, a self-signed certificate which you have explicitly accepted is in many cases *BETTER* than a CA-verified cert. In the former case you have explicitly chosen to trust a single party, whereas in the latter you are reliant on a large number of organisations.

Re:NSA (1)

dkf (304284) | 1 year,20 days | (#44175845)

On the other hand, a self-signed certificate which you have explicitly accepted is in many cases *BETTER* than a CA-verified cert. In the former case you have explicitly chosen to trust a single party, whereas in the latter you are reliant on a large number of organisations.

A self-signed certificate is better only if you can independently verify that you've got the correct certificate and that it is still valid. Otherwise it is worse, because you've got no way at all to figure out whether it is correct and whether it has been revoked (e.g., because of a break-in on the server). You're far better off having a private CA run by someone you trust, and explicitly trusting only that CA to issue for a particular service, rather than some random other CA. (The downside? That doesn't scale to the level of the internet.)

I'm assuming that the NSA isn't going round breaking into every random server on the internet, and that the effort to recover a private key from the public key is "large". Dealing with being a Target Of Interest is a whole 'nother level of difficulty...

A US certificate is now less trustworthy (0)

Anonymous Coward | 1 year,20 days | (#44175109)

I disagree: a certificate from a US certificate authority is necessarily less trustworthy than one from the website itself. The websites, at least the ones not in the US (UK?), are NOT subject to hidden, unchallenged orders from secret courts, so their self-signed certificates are more trustworthy than ones from a certificate authority in the US jurisdiction that IS subject to those courts.

All that adding a certificate authority to the mix does is enable a potential man-in-the-middle attack. Jitsi does work fine; I sort of wish I didn't have to click to tell it to check the fingerprint, but OTR does work and is better than nothing. Likewise, an OTR-style system should be in place for email where the remote key is known. It's ridiculous that email mostly travels unencrypted!

A self-signed cert is better than no encryption. Self-signed sites should be the norm now: where an unencrypted site would previously be used, we should have a self-signed one.

Two things are needed for an HTTPS attack:
First, control of the routing to do a man-in-the-middle attack. The NSA clearly has that, with the direct cooperation of the telcos thanks to the immunity. The immunity law means telcos disobey the laws of Congress and the States and obey military orders, with immunity, making them part of the military infrastructure, outside legal boundaries.

Second, cooperation of a certificate authority. Given the way the FISA court behaves, and the nature of secret laws and secret courts, it is very likely that the military can fake certificates. Hence certificates need to be untrusted. If General Alexander gets his way and gets immunity for all companies that do the NSA's bidding, you are screwed.

I tried deleting the US cert authorities in Firefox, but it isn't possible. They are built in. Currently I'd like a way to remove all of them, and I'll trust a site based on its traceroute; so if a connection to a site in Germany starts making a detour via the UK or USA, I'll decline the new certificate. Once I accept a cert from a site, I'd like to be warned if the cert changes, so I can recheck the trustworthiness of the site by checking whether it's making any extra detours.

I don't want the over-the-top warning in Firefox. If you disagree, Mozilla, fine, but this is what I want and I'm looking for a browser that does it. When one arrives, Firefox is gone.

I'm with the GP who wants this over-the-top self-signed error removed.

+1 interesting/informative (0)

Anonymous Coward | 1 year,20 days | (#44175973)

thanks.

Re:NSA (0)

Anonymous Coward | 1 year,20 days | (#44174515)

Beat me to it!

Re:NSA (0)

Anonymous Coward | 1 year,20 days | (#44175917)

Why we encrypt is, uz#2uk ozy9(m ko1r4 0$)lx m7qm,kow kua 4^gybn

wait (5, Funny)

Yaur (1069446) | 1 year,20 days | (#44174087)

We have reached the point in time where attorneys general have realized that companies need to encrypt customer data? Either that happened faster than I expected or I'm getting old faster than I realized.

Re:wait (0)

Anonymous Coward | 1 year,20 days | (#44174177)

Yer getting old. At this point we've got college kids that have never known a world without the Web.

Re:wait (1)

zwarte piet (1023413) | 1 year,20 days | (#44174825)

I'm modern! We had Apple II computers in college 8) But the programming textbook we used still mentioned sending the punchcards (not really punchcards, but the ones you fill out with a pencil) to the computing center.

Re:wait (1)

auric_dude (610172) | 1 year,20 days | (#44174257)

It may well be down to shareholder pressure; I expect shareholders would not wish to have company IP, or indeed customer data, outside of their control. What is good for shareholders is good for business and good for votes needed by those in public office.

Re:wait (1)

gstoddart (321705) | 1 year,20 days | (#44174831)

What is good for shareholders is good for business and good for votes needed by those in public office.

Well, what was 'good' for shareholders was getting a product out the door, and not spending all of that money on implementing any actual security.

If there's no penalty for not having good security and/or encryption, why would a company spend money on it? From that perspective, it would be bad for shareholders and business.

If you make it costly to have data breaches, it becomes good for shareholders to implement security.

But, as a general rule, companies only do the 'right' thing when they have no choice, or if it's advantageous to them. Which is why, if we didn't have pollution laws, companies would just dump toxic crap wherever they felt like it.

Re:wait (1)

AmiMoJo (196126) | 1 year,20 days | (#44174683)

Does she also realize that ROT13 isn't sufficient "encryption" though?

HMRC (the UK tax office) lost discs holding 25 million people's personal data. They released a statement saying the data was password protected. A password-protected MS Office document is not really "protected" in any meaningful sense.

Re:wait (1)

mlts (1038732) | 1 year,20 days | (#44176203)

It depends on the version of Office/Word. A document secured with a 32+ character password in a recent version (Office 2003 and newer) can use SHA-512 and AES-CBC.

Of course, with a weak password, all bets are off.

If one needed to distribute data on CD encrypted the "right" way, I'd either use a large PGP archive, ship the CD with a TrueCrypt volume and a keyfile encrypted to the receiving site's public keys, or use a commercial utility like PGP Disk and have a volume only openable with the receiving site's keys.

Done right, having encrypted media can be secure... but doing it right isn't the easiest way.

Re:wait (0)

Anonymous Coward | 1 year,20 days | (#44177355)

Does she also realize that ROT13 isn't sufficient "encryption" though?

Exactly. If the government gets involved in mandating encryption, then the law will have to provide some type of list of Officially Approved encryption standards. Which means we now have a political process, as opposed to a scientific one, for selecting which encryption to use. It's basically a lose-lose type of situation.

I thought credit card info had to be already (1)

Spy Handler (822350) | 1 year,20 days | (#44174097)

encrypted or the credit card companies won't do business with you. (PCI compliant or something like that)

That leaves Social Security numbers and email address/password combos, but really, you should not use the same password for your Gmail account and Oily Pete's V1agra Online. As for your Social Security number, never give it out to anyone under any circumstances unless it's a bank (a real one, not a Nigerian prince bank) and you're asking for a loan or opening a checking account.

Just drill that into the heads of the IQ-challenged members of society and we won't need yet another gov't agency trying to control what people do on the internet.

Re:I thought credit card info had to be already (1)

MrDoh! (71235) | 1 year,20 days | (#44174243)

I thought everyone had to already. At least, judging by all the hoops I've had to jump through with all our clients who insist on it, and the auditing of it.

Re:I thought credit card info had to be already (2)

blueg3 (192743) | 1 year,20 days | (#44174899)

Since when are credit card data, e-mail address, and password the only "customer data" a company keeps about you?

Re:I thought credit card info had to be already (1)

Bert64 (520050) | 1 year,20 days | (#44175037)

They require that you "encrypt" the data, but they also typically require that you send the data unencrypted (albeit tunnelled over SSL) to actually process a payment. So while the data may be encrypted on disk, the server typically also has the ability to decrypt it on demand in order to make use of it... so it's just a case of a hacker working out how, and then triggering the same process to extract the data.

Client side strong encryption, please. (1)

Anonymous Coward | 1 year,20 days | (#44174147)

Server side encryption in could is no good. It prevents Calvin reading Susy's diary, but nothing much more than that.

And no proprietary closed-up systems, but publicly verifiable implementations end to end, or it's not even worth trying to get people to trust it.

Re:Client side strong encryption, please. (0)

Anonymous Coward | 1 year,20 days | (#44174161)

Server side encryption in could is no good. It prevents Calvin reading Susy's diary, but nothing much more than that.

And no proprietary closed-up systems, but publicly verifiable implementations end to end, or it's not even worth trying to get people to trust it.

s/could/cloud/

Re:Client side strong encryption, please. (1)

game kid (805301) | 1 year,20 days | (#44174511)

No, could is correct, as in "it could have been secure but we put it on that silly clusterfuck network instead".

Re:Client side strong encryption, please. (1)

WaywardGeek (1480513) | 1 year,20 days | (#44176829)

While I agree with your points, I think the public is unfortunately pitifully trusting. This whole NSA spying stuff will pass through the news cycle and soon not be covered again. It's only making a big splash because Fox News likes making fun of the Obama administration, but before the public actually starts demanding their right to privacy, Fox News will bury the issue and convince their watchers that the government is not spying on them. All of the systems we have in common usage are total crap, and any security guy can point out the massive holes, yet our public just looks at that stupid lock on the bar in their browser and believes they are safe.

How long has Microsoft been giving the NSA zero-day exploits long before fixing them? Since the early 90's? How long have companies been overriding DNS and CAs to spy on their workers' traffic (with some legitimate reasons)? How long have we known on Slashdot that AT&T built a major backbone into some huge NSA building that has its own power plant? How is it that every time we try to secure e-mail by making encryption not only available but the default, our efforts get shot down? When we wanted "secure" Internet commerce, we quickly got the government-hackable system we have today. How long have we been waiting for secure DNS, and do we really think the government isn't going to control the top-level registrars? Actual security is easy compared to this hackable system we've built.

The government has the power to force Microsoft, Google, Apple, Cisco, Dell, HP, and the other big computing companies to leave the network entirely hackable. The fact that it is so MITM-hackable is strong evidence that the government is doing exactly that. The fact that the public remains blissfully unaware and uncaring is evidence that we'll never get the actual security we wanted.

Encrypt with what... (0)

Anonymous Coward | 1 year,20 days | (#44174157)

...and how would provider-side encryption be helpful if keys needed to be distributed across all of the data backends?

technical challenges (1)

KarlH420 (532043) | 1 year,20 days | (#44174189)

Usually at some point the server needs to be able to decrypt the data so it can be displayed to a user, so the key needs to be handy. And if you have the key and the data on the same server, it's of little security value.

If you want to have this data in some kind of database, there is a good chance you want to be able to search and index it. Is it possible to index and pre-sort encrypted data without giving away the content?

Yes, maybe encrypt some sensitive parts, but encrypting all customer data is counterproductive.

Re:technical challenges (0)

Anonymous Coward | 1 year,20 days | (#44174221)

I have no solution.

I know of one research project where you can do calculations on encrypted numbers and get the correct encrypted results. Maybe this can be used for databases as well: create an index on the encrypted data and be able to send an encrypted search query.
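
What the parent is describing is homomorphic encryption. A toy sketch of the additively homomorphic Paillier scheme, with laughably small parameters chosen only to show the property that multiplying ciphertexts adds the underlying plaintexts:

    # Toy Paillier (NOT secure: tiny primes, illustrative only).
    # Homomorphic property: E(m1) * E(m2) mod n^2 decrypts to m1 + m2.
    import math, random

    p, q = 293, 433                 # real keys use ~1024-bit primes
    n, n2, g = p * q, (p * q) ** 2, p * q + 1
    lam = math.lcm(p - 1, q - 1)

    def L(x):
        return (x - 1) // n

    mu = pow(L(pow(g, lam, n2)), -1, n)

    def encrypt(m):
        r = random.randrange(1, n)  # assumes gcd(r, n) == 1, near-certain here
        return (pow(g, m, n2) * pow(r, n, n2)) % n2

    def decrypt(c):
        return (L(pow(c, lam, n2)) * mu) % n

    c1, c2 = encrypt(20), encrypt(22)
    assert decrypt((c1 * c2) % n2) == 42   # addition done under encryption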

Re:technical challenges (1)

oPless (63249) | 1 year,20 days | (#44174459)

I detect someone who doesn't know what an HSM is, nor what one is used for.

Here, have a Wikipedia link and learn something today: http://en.wikipedia.org/wiki/Hardware_security_module [wikipedia.org]

Re:technical challenges (2)

blueg3 (192743) | 1 year,20 days | (#44174953)

So... explain how that helps when someone hacks into the server and requests data using the same mechanisms and level of authority as the server software (which must ultimately manipulate unencrypted data).

Because that's what happens.

Encrypt everything (3, Interesting)

Anonymous Coward | 1 year,20 days | (#44174211)

Don't just encrypt private details.

Get rid of users' private data, so there is nothing to steal in the first place.

Use eccentric authentication. It replaces passwords with anonymous client certificates.

Check out: http://eccentric-authentication.org/ [eccentric-...cation.org]

Re:Encrypt everything (0)

Anonymous Coward | 1 year,20 days | (#44174455)

So do you have a solution for how you search/index encrypted data in a database?
Because when you store data, you usually plan to use it, like for searching for a customer.

How do you search for a customer's name when you put the name into the database encrypted? And name searches need to be done using a complex index which handles partial names and different spellings. Also, the database server should not know the key, because having the encrypted data and the key in the same location is incredibly stupid.
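
One partial answer used in practice is a "blind index": store a keyed hash of the normalized value next to the ciphertext, so the database can answer exact-match lookups without ever seeing a name. It does not solve the partial-name or fuzzy-spelling matching the parent asks about. A sketch, with key handling deliberately hand-waved:

    # Sketch: blind indexing for equality lookups over encrypted columns.
    # The index key is held by the application, never by the DB server.
    import hashlib, hmac, os

    index_key = os.urandom(32)

    def blind_index(value: str) -> bytes:
        normalized = " ".join(value.lower().split())   # crude normalization
        return hmac.new(index_key, normalized.encode(), hashlib.sha256).digest()

    # Insert time: store (blind_index(name), encrypted_record).
    # Query time: the DB matches rows on the HMAC value alone.
    assert blind_index("Alice Smith") == blind_index("  alice   SMITH ")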

Re:Encrypt everything (1)

Eccentric-Dude (2910375) | 1 year,20 days | (#44174787)

What you are looking for is homomorphic encryption. I don't offer that.

I offer a way to create accounts anonymously, and much more easily than with the email-address/password combination.

When customers sign up for an account, they create a nickname. That gets signed into the client certificate. The web server receives that nickname from the crypto-authentication libraries as the username. Do with that username what you want.

Solution for Microsoft and Oracle (1, Redundant)

will_die (586523) | 1 year,20 days | (#44174225)

Create a new column type called varchar2rot13: when data is inserted, ROT13 "encryption" is automatically applied; when it is read out, it is automatically decrypted.
Sell it to California companies for major money.
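
For the record, the entire proposed "encryption" layer already ships with Python, which says something about how much security it buys:

    # ROT13 "encryption": applying it twice is the identity, as the
    # "double ROT13" joke above points out.
    import codecs

    ciphertext = codecs.encode("customer data", "rot_13")      # 'phfgbzre qngn'
    assert codecs.encode(ciphertext, "rot_13") == "customer data"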

More importantly, (1)

Anonymous Coward | 1 year,20 days | (#44174251)

We need to crack down on government agencies that spy.

NSA can legally collect encrypted data (0)

Anonymous Coward | 1 year,20 days | (#44174281)

The NSA can legally collect encrypted data.

Re:NSA can legally collect encrypted data (1)

Skapare (16644) | 1 year,20 days | (#44177519)

Will that include the decryption key?

too much government (1)

Anonymous Coward | 1 year,20 days | (#44174287)

...and in other news, CA police will be cracking down on people who don't lock their house doors and who don't lock their car doors.

Re:too much government (1)

mrego (912393) | 1 year,20 days | (#44176051)

And how is 'encryption' defined? Would an XOR qualify?

Encryption is easy. Key management, not so much. (0)

Anonymous Coward | 1 year,20 days | (#44174437)

Using encryption is easy. Managing the encryption keys, not so much. The number of developers I see posting questions (on StackOverflow) about encryption with NO IDEA about basic key management is very worrying.

Re:Encryption is easy. Key management, not so much (1)

oPless (63249) | 1 year,20 days | (#44174485)

Not using IDEA is a good call.

http://en.wikipedia.org/wiki/International_Data_Encryption_Algorithm#Security [wikipedia.org]

"As of 2007, the best attack which applies to all keys can break IDEA reduced to 6 rounds (the full IDEA cipher uses 8.5 rounds).[1] Note that a "break" is any attack which requires less than 2128 operations; the 6-round attack requires 264 known plaintexts and 2126.8 operations.
Bruce Schneier thought highly of IDEA in 1996, writing, "In my opinion, it is the best and most secure block algorithm available to the public at this time." (Applied Cryptography, 2nd ed.) However, by 1999 he was no longer recommending IDEA due to the availability of faster algorithms, some progress in its cryptanalysis, and the issue of patents.[5]"

Pedantry aside, I *so* know what you mean.

Re:Encryption is easy. Key management, not so much (1)

cryptizard (2629853) | 1 year,20 days | (#44174735)

What is your point exactly? 2^126 is still massively infeasible, and it only applies to a reduced-round version. In fact, as of a year or two ago, full-round AES is also subject to a 1-2 bit break. By that standard, IDEA is at least as secure as AES.

Encryption is easy. Key management, not so much (1)

MtlDty (711230) | 1 year,20 days | (#44174449)

Using encryption is easy. Managing the encryption keys, however, not so much. The number of developers I see posting questions (on StackOverflow) about encryption with NO IDEA about basic key management is very worrying.

Re:Encryption is easy. Key management, not so much (1)

Eccentric-Dude (2910375) | 1 year,20 days | (#44174867)

Make key management automatic.

Users can't do it wrong anymore. Play with the demo at: http://eccentric-authentication.org/blog/2013/06/07/run-it-yourself.html [eccentric-...cation.org] or take a look at the walkthrough.

Re:Encryption is easy. Key management, not so much (2)

cryptizard (2629853) | 1 year,20 days | (#44174943)

That has nothing to do with the problem. We are already assuming that the companies have personal data, they just want to encrypt it to prevent third parties from obtaining it. The problem is that you need to decrypt the data at some point in order to make use of it, so the key must sometimes intersect with the data. Where do you keep it so that someone who gets the data won't also get the key?

Re:Encryption is easy. Key management, not so much (0)

Eccentric-Dude (2910375) | 1 year,20 days | (#44175925)

That has nothing to do with the problem. We are already assuming that the companies have personal data, they just want to encrypt it to prevent third parties from obtaining it.

Then it is too late. Any personally identifiable information on someone else's server is already public data. There is no way you can take it back. If it's tagged with a real-life identity, you're at their mercy.

The key is to hide your real-life identity behind many different digital pseudonyms. At least one, but preferably multiple pseudonyms per site. Then be careful with what you write. And don't mention these pseudonyms on your Facebook page.

Let your computer do the hard work of managing all these pseudonyms; that's the power of eccentric authentication.

Time to protest this with SHA-0 (0)

Anonymous Coward | 1 year,20 days | (#44174461)

Out of protest I'll change all encryption to SHA-0. Encryption is encryption, right? Or perhaps a simple algorithm that reverses letters and digits would work as well: A becomes Z and 1 becomes 9. 0? Well, it has no choice but to become imaginary.

Lower the price barrier (0)

Anonymous Coward | 1 year,20 days | (#44174687)

As a web developer pushing out low-cost websites to small businesses that can and do collect sensitive information using COTS software (such as WordPress), I believe the relatively high cost of SSL certificates is a big barrier to wide-scale adoption. We can do this stuff easily, but adding another 15-20% for a domain-wide SSL certificate is a tough sell in today's market.

If anyone knows of a PROVEN low cost solution, I'd be extremely interested to hear about it.

Re:Lower the price barrier (1)

Eccentric-Dude (2910375) | 1 year,20 days | (#44174945)

There is DNSSEC with DANE.

Create your self-signed server certificate and publish it in DNS. Sign the zone with DNSSEC.
You can't beat free. See http://datatracker.ietf.org/doc/rfc6698/ [ietf.org]
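
A sketch of what publishing the certificate involves, assuming the third-party Python 'cryptography' package; the hostname and file name are hypothetical, and the resulting record still has to go into a DNSSEC-signed zone:

<ecode>
# Compute the association data for a DANE TLSA record per RFC 6698
# (usage 3 = domain-issued cert, selector 1 = public key, matching 1 = SHA-256).
import hashlib
from cryptography import x509
from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

pem = open("server.pem", "rb").read()  # your self-signed certificate
cert = x509.load_pem_x509_certificate(pem)
spki = cert.public_key().public_bytes(Encoding.DER, PublicFormat.SubjectPublicKeyInfo)
digest = hashlib.sha256(spki).hexdigest()

# Goes into the DNSSEC-signed zone, e.g.:
#   _443._tcp.www.example.com. IN TLSA 3 1 1 <digest>
print(f"_443._tcp.www.example.com. IN TLSA 3 1 1 {digest}")
</ecode>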

Smart and Hot - where's the crazy (1)

Gothmolly (148874) | 1 year,20 days | (#44174771)

She seems to make a good point, and according to The Wiki she's pretty hot. I feel bad for her significant other (assuming she has one); I bet she's totally nuts.

Re:Smart and Hot - where's the crazy (2)

IamTheRealMike (537420) | 1 year,20 days | (#44175095)

The crazy is in thinking she can regulate better security onto any random industry. It doesn't work like that. Security is too complicated to magically fix by insisting on blind usage of a particular tool.

If you look at the article, a huge number of the breaches involve credit card leaks. Well, duh: credit cards are a pull model, not a push model, so every merchant has to store credentials that can be used to pull money, and every merchant database becomes a target. Bitcoin is more sensible, but the California DFI is busy harassing Bitcoin companies. So if she really cares about upgraded security, maybe she should get the DFI off the backs of people building more secure, cryptographic financial systems that compete with the incumbents? That's much less fun than coming up with new laws, though.

Re:Smart and Hot - where's the crazy (1)

Stan92057 (737634) | 1 year,20 days | (#44175431)

Bitcoins are a dead issue; they will never replace any kind of legal tender, if only because they're untraceable and the government will not allow it. And SOMETHING must be done, because companies will not do it without being forced to. "Data centers not encrypting the data they have": your kind will always find reasons not to do something because, well, it's hard to do right. With that kind of thinking we would never have gone to the moon. And that kind of thinking, not the money, is what's keeping us from going to Mars. Way too much negativity.

Dictate penalties and properties not methods (4, Insightful)

WaffleMonster (969671) | 1 year,20 days | (#44175113)

Good laws of this sort do not impose technical solutions but rather set general, system-level requirements.

The problem with "duh, use encryption" is that there is no guarantee of any kind that simply applying encryption makes a system more secure against a specific threat.

Every time a law gets into the weeds, you are guaranteed to codify errors and hurt those who innovate with different but better, or equally valid, approaches.

Attorney Generals are good things (3, Interesting)

onyxruby (118189) | 1 year,20 days | (#44175487)

I've dealt with cleaning up some nasty data breaches over the years, and I've had conversations with Attorney Generals when the breaches were bad enough. Companies fear Attorney Generals about as much as they fear being on the wrong end of the international news.

I've been involved with companies that had data breaches where Attorney Generals did and did not get involved. The difference is night and day for things like encryption, notification of consumers, risk mitigation and other such steps. Pause and think about it for a moment: do you really think California is breached that much more often than other locations, or do people simply find out because the companies fear being on the wrong end of the Attorney General's pointy stick?

Attorney Generals that give a damn are good things; they give the security professionals at companies in their states the leverage they need to actually do the things they want to do (encryption, etc.).

Re:Attorney Generals are good things (0)

Anonymous Coward | 1 year,20 days | (#44176339)

It's 'Attorneys General'.

Translation: He lied (0)

Anonymous Coward | 1 year,20 days | (#44175743)

He was given the questions a day in advance. Sorry, but lying to Congress is a felony.

Confused Identity (1)

b4upoo (166390) | 1 year,20 days | (#44175871)

Tradition normally holds that the person who commits a bad act is the guilty party. These days that is becoming rather twisted. If a person steals data, doesn't the guilt fall upon the thief? What they are doing is similar to the rather absurd gun laws that can find a person negligent for using only one lock to secure a gun. A homeowner locks his windows and doors and drives off to the market. Mr. Bad Guy breaks in the back door, steals the gun, and later that day shoots someone. Out of the blue, the law also comes down hard on the homeowner for not securing the firearm well enough. Frankly, that is not good policy. As far as I'm concerned, all of the guilt falls upon the bad guy who broke in. If anything, the police department shares some of the guilt, as they failed to protect my home. The general public also shares the guilt when they pass laws that make it next to impossible to deal with bad people. But whether it is data or guns, I think the thief is the one who should pay.

If there is room for guilt, it would be in situations such as a finance company dumping records in a dumpster without ever shredding them, since it is understood that dumpster diving is legal and a common practice.

Society seems to avoid punishing the guilty.

Re:Confused Identity (0)

Anonymous Coward | 1 year,20 days | (#44176865)

On one side, you have the firearm owner who took at least some security measures. But that analogy is fraught: gun control is a major political firestorm right now, so anything involving firearms in the US can go either way.

A better example would be a bank storing cash deposits in a pile in the lobby, where anyone breaking the glass could get to them. If that happens and the bank is unable to process transactions, it isn't just the fault of the burglar; it becomes partially the fault of the bank for not using proper security measures.

Another example is running into a gas station and leaving the key in the ignition. Yes, technically the thief should be the one punished, but in some places the driver of the vehicle can be cited because they didn't take any reasonable precautions and effectively "gave away" a vehicle.

It is the same in the IT world. "Security has no ROI" is a mantra I've heard for years, followed by "Geek Squad is callable 24/7" when said PHB is asked about a plan in case of a breach.

The problem is that when security measures are working, it isn't obvious to management. Only when they fail does it become known, so there is a tendency to slowly de-fund security over time until a scare or a total breach happens.

Yes, a thief is a thief and an intruder is an intruder. But that doesn't mean one can skip security altogether.

exactly wrong (1)

markhahn (122033) | 1 year,20 days | (#44175929)

We need to make companies liable for any information they are so careless as to lose. Intruding on their business process is the wrong way to go about it: punitive liability judgements (and tighter disclosure laws) are the right way.

Part of the problem here is this horribly mistaken meme that everyone and everything is hackable. It makes people feel they aren't responsible, and it's only true in the sense that every newborn baby has started dying, or that the universe will eventually cool and stop. Not convinced this meme matters? Well, your country is spending billions on stupid and futile "cyber-warfare" efforts rather than simply buttoning up the security of the electrical grid, banking network, etc.

Our goal should be for companies to think of sensitive customer data like radioactive waste: they want to ship it elsewhere, not have it sitting around in unsealed, leaky barrels in their offices. Secure access to data is obviously a specialized skill, so why not have companies devoted to doing that alone?

Re:exactly wrong (1)

Skapare (16644) | 1 year,20 days | (#44177645)

For things like the electric grid, there shouldn't even be any access at all. It's that critical. It is critical enough that they should have private FIBER following every power line.

For people info like SSNs and bank account numbers, the system should be revised so that the number alone only serves to IDENTIFY and is never treated as AUTHORIZATION. Lots of people have other people's SSNs for various reasons. Using an identification number for authorization is totally wrong. This also goes for credit card numbers and bank account numbers. The WHOLE banking system was built around assumed universal trust, which never was a valid assumption, ever. But it is even less so today. Every banking transaction needs to include some form of authorization: either a cryptographic hash of the full transaction signed with the authorized person's private key, or a physical form like a signature on paper.

What these changes would do is make merely knowing SSNs and account numbers pointless. Any bank that hands YOUR money over to someone who can recite YOUR SSN should be fined $1,000,000 plus ten times the transaction loss. If that causes them to go out of business, that is well deserved.
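
A minimal sketch of that authorization model, assuming the third-party Python 'cryptography' package (all names and values hypothetical): the bank verifies a signature over the full transaction rather than trusting whoever can recite an account number.

<ecode>
# Identification vs. authorization: the account number only says WHO,
# the signature proves the owner APPROVED this exact transaction.
# Assumes: pip install cryptography.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

customer_key = Ed25519PrivateKey.generate()          # held only by the customer
registered_public_key = customer_key.public_key()    # on file with the bank

transaction = b"pay $100 from acct 1234 to acct 5678; nonce=8f3a"
signature = customer_key.sign(transaction)

# The bank verifies against the registered public key; verify() raises
# InvalidSignature if the transaction was forged or altered in transit.
registered_public_key.verify(signature, transaction)
</ecode>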

Re:exactly wrong (0)

Anonymous Coward | 1 year,20 days | (#44178009)

I just wanted to say thanks to you and the GP for the most insightful posts on this thread. I agree 100% that this should be covered entirely by liability, not by mandating specific practices that would just be unwillingly followed to the legal minimum and provide no real security.

The burden of keeping a public number (like one's SSN) secret is obscenely unfair and needs to end ASAP. Businesses should be held accountable for their own bad ideas - not have us all pay the price whether we'd like to or not.

Encryption keys ... (1)

Skapare (16644) | 1 year,20 days | (#44176271)

... are essential to the servers that handle the data. They can't actually operate on the encrypted data. They have to UN-encrypt it first (and RE-encrypt it to put it back if there are any changes). So what does this mean to me? It means I have to grab the encryption key(s) when I break in to get the data.

This reminds me of an incident with a state web site. Someone broke in and did some defacing. The state's top IT director answered a reporter's query with "This needs to be investigated because we bought a top of the line firewall that should have blocked the hacker".

But we all now know encryption means... (1)

3seas (184403) | 1 year,20 days | (#44177783)

.... nothing to the NSA

Start with certified emails. (1)

houghi (78078) | 1 year,20 days | (#44178373)

How many mails have you received that were official and digitally signed (not just a signature block)?
I work in a company where people are pretty security savvy, but email somehow is an exception. When I ask how they know a mail came from John Doe, they tell me it's certain because the email address is John.Doe@example.com.
When I ask them how person X knows a mail came from our company, the answer is "Because the email address is info@example.com." So while IT amuses itself adding useless disclaimers (I AM the intended recipient; otherwise I would not have gotten it. It may be that you did not intend it on your end, but that is YOUR problem.) instead of adding digital signatures, I must change my password every 37 minutes, so I write it down and the whole thing becomes LESS secure.

As long as IT people treat security as a technical and not a social issue, this will never be solved.
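
As a sketch of what "digitally signed" could look like in practice, assuming the third-party python-gnupg package and a GnuPG keyring that already holds a secret key (both assumptions, not anything IT actually deploys here):

<ecode>
# Sign outgoing mail so recipients can verify the sender by key,
# not by the trivially forgeable From: header.
# Assumes: pip install python-gnupg, plus an existing GnuPG secret key.
import gnupg

gpg = gnupg.GPG()  # uses the default ~/.gnupg keyring

body = "Quarterly figures attached.\nJohn Doe"
signed = gpg.sign(body)            # produces a clearsigned message body

# Recipient side: valid only if the signing key is in their keyring.
verified = gpg.verify(str(signed))
print(verified.valid, verified.username)
</ecode>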

Re:Start with certified emails. (1)

psydeshow (154300) | 1 year,20 days | (#44179281)

How many mails have you received that were official and digitally signed (not just a signature block)?
I work in a company where people are pretty security savvy, but email somehow is an exception. When I ask how they know a mail came from John Doe, they tell me it's certain because the email address is John.Doe@example.com.

Quickest way around that: send out a few emails as the company CEO, with the Reply-To address set to a random colleague.

Loads of fun, and all you need is a command line on a server somewhere.

Don't blame me if you lose your job; blame RFC 822...
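
For the curious, a standard-library Python sketch of why that works (addresses hypothetical; this only builds the message rather than sending it): nothing in plain RFC 822 mail authenticates the headers.

<ecode>
# RFC 822 headers are just text the sender chooses; the receiving MTA
# does not verify any of them. Build-only sketch using the stdlib.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "ceo@example.com"            # claims to be the CEO
msg["Reply-To"] = "colleague@example.com"  # replies go to someone else
msg["To"] = "staff@example.com"
msg["Subject"] = "Please review"
msg.set_content("Nothing in plain SMTP authenticates these headers.")

print(msg)  # any SMTP server would relay this exactly as written
</ecode>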
