
Bit9 Hacked, Stolen Certs Used To Sign Malware

Soulskill posted about a year and a half ago | from the that-seems-like-a-bad-plan dept.

Security 65

tsu doh nimh writes "Bit9, a company that provides software and network security services to the U.S. government and at least 30 Fortune 100 firms, has suffered a compromise that cuts to the core of its business: helping clients distinguish known 'safe' files from computer viruses and other malicious software. A leading provider of 'application whitelisting' services, Bit9's security technology turns the traditional approach to fighting malware on its head. Antivirus software, for example, seeks to identify and quarantine files that are known bad or strongly suspected of being malicious. In contrast, Bit9 specializes in helping companies develop custom lists of software that they want to allow employees to run, and to treat all other applications as potentially unknown and dangerous. But in a blog post today, the company disclosed that attackers broke into its network and managed to steal the digital keys that Bit9 uses to distinguish good from bad applications. The attackers then sent signed malware to at least three of Bit9's customers, although Bit9 isn't saying which customers were affected or to what extent. The kicker? The firm said it failed to detect the intrusion in part because the servers used to store its keys were not running Bit9's own software."


LOL (5, Funny)

MrLeap (1014911) | about a year and a half ago | (#42838363)

"Our software is so good that, in fact, if we had used it ourselves, we wouldn't have been hacked." That's one way to preserve confidence, I suppose: use recursion.

Re:LOL (1)

Anonymous Coward | about a year and a half ago | (#42838919)

Yeah, and they monitor their networks 24x7, but missed someone hacking in through the very few computers that didn't have the software, not touching any that did, rooting around to figure things out, issuing certificates, and then sending malware to their customers. Right.

Re:LOL (2)

mrprogrammerman (2736973) | about a year and a half ago | (#42839221)

It's ironic that many security companies probably have some of the worst security practices and policies in place.

Re:LOL (2)

symbolset (646467) | about a year and a half ago | (#42840719)

Ironic, but not new. It also applies to the "most critical" systems: military systems, banking systems, power infrastructure including nuclear power plants, Los Alamos National Laboratory where nuclear weapons are simulated on supercomputers, and so on. The US Army uses Vista. The Fed was recently hacked. We all know about the malware and exploits circulating for the SCADA systems that do power plant control, and the published hard-wired root passwords for those systems, including routers and firewalls. Los Alamos has a long history of losing the most secret data, as well as failing its background-check systems. The more important it is, it seems, the less secure it is.

The NSA, CIA and NRO apparently have got their act together. That's nice. The FBI, middling to fair. In general though if we really tick somebody off with nation-state level budgets and serious nerd skills, we're hosed. Unfortunately that "ticking off" is almost certain to occur if it hasn't happened already.

Re:LOL (0)

Anonymous Coward | about a year and a half ago | (#42840717)

LOL. Serves them right.

Revoke the keys, issue new ones (3, Informative)

TheRealMindChild (743925) | about a year and a half ago | (#42838365)

Revoke the keys, issue new ones, and contact all of your clients on how to update. Check and mate.

Re:Revoke the keys, issue new ones (4, Insightful)

mlts (1038732) | about a year and a half ago | (#42838397)

Even better:

Buy HSMs. Issue new keys with the private keys stored in the security modules, and the access to who gets access to sign data tightly restricted and audited.

Any production security outfit storing private key material on something that is not a hardened appliance is just asking for it.

Re:Revoke the keys, issue new ones (0)

Anonymous Coward | about a year and a half ago | (#42838879)

Slightly off-topic, but is there such a thing as a "software HSM" that can be loaded onto, e.g., an ESX server host and presented as an HSM to the VMs?

Re:Revoke the keys, issue new ones (2)

swillden (191260) | about a year and a half ago | (#42840553)

Slightly off-topic, but is there such a thing as a "software HSM" that can be loaded onto, e.g., an ESX server host and presented as an HSM to the VMs?

Probably, but if so it would be vulnerable to hypervisor exploits, which do exist.

If you have important keys put them in a hardware security module. Ideally, a FIPS 140-2 Level 4 certified device (level 3 is good enough, but level 4 devices don't cost any more), in a physically-secured location, with tightly-configured logical access control. If you must use VMs, get network-enabled HSMs and have your VMs talk to them.

Had Bit9 done something like that, a network intrusion could still potentially have enabled the attackers to sign some things, but they couldn't have gotten hold of the actual key material. Plus, with appropriate auditing on the HSMs, Bit9 would have known exactly how their keys were used, which might have enabled damage control less extreme than generating new keys and distributing them to every client.

Re:Revoke the keys, issue new ones (0)

Pinky's Brain (1158667) | about a year and a half ago | (#42839505)

Meh, there hasn't been an OS-level remote root exploit in *nix in eons ...

A service on a commodity *nix PC with a single open port, one that just takes data in, signs it, and spits the signature back out, would be secure against network attacks.
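For concreteness, here's a minimal sketch of the single-port signing box described above (all names invented; HMAC-SHA256 stands in for real code-signing, and a real deployment would keep the key in an HSM rather than in memory):

```python
# Toy sketch of a "single open port" signing service: raw bytes in,
# detached signature out, nothing else. Names and key are illustrative.
import hmac
import hashlib
import socketserver

SIGNING_KEY = b"demo-key-that-never-leaves-this-box"  # stand-in for the real key

def sign_blob(data: bytes) -> bytes:
    """Return a detached signature for the submitted blob."""
    return hmac.new(SIGNING_KEY, data, hashlib.sha256).hexdigest().encode()

class SignHandler(socketserver.StreamRequestHandler):
    """One request per connection: read all bytes, write the signature."""
    def handle(self):
        blob = self.rfile.read()
        self.wfile.write(sign_blob(blob))

# Exposing it is one line: socketserver.TCPServer(("0.0.0.0", 9000),
# SignHandler).serve_forever() -- but note that the moment the port is
# reachable, anyone on the network can get arbitrary data signed.
```

The catch, as the reply below points out, is that the key never leaving the box doesn't help much: the reachable port itself becomes a signing oracle.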

Re:Revoke the keys, issue new ones (1)

Anonymous Coward | about a year and a half ago | (#42841367)

No it wouldn't, you idiot.

If it were like you say, then anyone who hacks the network it is connected to can then send requests to it to be signed, and you're just as insecure as if you had the private keys stolen.

Re:Revoke the keys, issue new ones (0)

Anonymous Coward | about a year and a half ago | (#42858727)

In some cases, that can be useful. Done right, it can block network attacks.

However, the main reason why one wants a hardware security module is for additional security:

1: A hardened OS from the ground up that is resistant to IP attacks.

2: A physically tamper-resistant enclosure. Against a high-value target, an attacker might be able to get to the physical box, so having the keys zapped if someone cracks the case is a must.

3: Its own user/group role authentication and audit logs. This limits which users can sign material, what can be signed, and at what times, and it records what gets signed. This is what saved a major software company from catastrophe when they had a breach and someone managed to sign some backdoored executables. Instead of having to revoke keys, all that was needed was to deal with the single bad package.

4: On a smaller scale, HSM cards are important because they keep the SSL private key off the host. That way, if a web server gets compromised, the key doesn't need to be regenerated.

So, HSM units don't just give network separation. They provide physical tamper resistance, audit controls, separation of tasks, and fairly fine-grained user security. Having a user blocked from signing items after they leave for the day can keep an intrusion attempt from even starting.
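As a rough illustration of point 3, here's a toy Python sketch of per-user signing roles, allowed signing hours, and an append-only audit log (all names and policies are invented; a real HSM enforces this in hardware):

```python
# Toy model of HSM-style role controls: who may sign, what kinds of
# artifacts, during which hours -- with every attempt logged.
from datetime import datetime, time

AUDIT_LOG = []  # append-only record of every signing attempt

ROLES = {
    # user: (artifact kinds they may sign, allowed signing window)
    "release_bot": ({"installer", "driver"}, (time(9, 0), time(17, 0))),
    "auditor":     (set(),                   (time(0, 0), time(23, 59))),
}

def try_sign(user, kind, now):
    """Allow the signing only if role and time window both permit it."""
    kinds, (start, end) = ROLES.get(user, (set(), (time(0, 0), time(0, 0))))
    ok = kind in kinds and start <= now.time() <= end
    AUDIT_LOG.append((now.isoformat(), user, kind, ok))  # log even denials
    return ok
```

A blocked after-hours attempt shows up in the log immediately, which is exactly the "intrusion stopped before it starts" property described above.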

Re:Revoke the keys, issue new ones (2)

gmuslera (3436) | about a year and a half ago | (#42838753)

The steps should be: revoke the keys, make sure you've closed all the holes used to break in (and anything potentially similar), and then start issuing new keys and give clients a migration plan for them. Extra points if you give your clients the names of others in the same business; you are there to provide solutions, and if yours is not safe, offering alternatives is better than declaring there are none.

Re:Revoke the keys, issue new ones (0)

Anonymous Coward | about a year and a half ago | (#42839199)

Hardly. What about the clients? They might have malware. How do you trust the new keys? How do you know you've thwarted the attack? How about closing down the company and starting over??

Re:Revoke the keys, issue new ones (2)

sunderland56 (621843) | about a year and a half ago | (#42840507)

Revoke the keys and issue new ones. Contact all your former clients and try to convince them that you aren't total morons and that they should continue to be your customers. Give the new keys to the handful stupid enough to stay.

Smooth Move (0)

kbob88 (951258) | about a year and a half ago | (#42838369)

Let me guess: these Bit9 geniuses are all ex-RSA employees?

first (0)

Anonymous Coward | about a year and a half ago | (#42838371)

apk is a misunderstood genius

Better Yet Buy Bit11 (1)

Anonymous Coward | about a year and a half ago | (#42838373)

Because 11 is better than 10 or even 9!

Re:Better Yet Buy Bit11 (1)

Anonymous Coward | about a year and a half ago | (#42838415)

Well, you'll have to compete with my company, Bitn+1.

Re:Better Yet Buy Bit11 (1)

eksith (2776419) | about a year and a half ago | (#42838517)

Bit+coin ... Wait, that's already taken

Re:Better Yet Buy Bit11 (1)

pixelpusher220 (529617) | about a year and a half ago | (#42845633)

yeah but now they're going to be known as Bit-Coin....since they'll be losing quite a bit of, ahem, 'coin'

Re:Better Yet Buy Bit11 (1)

eksith (2776419) | about a year and a half ago | (#42845735)

Ah! Touché. Yeah, they're gonna lose a lot more than trust: all the nice things that come with trust, like clients and money.

Re:Better Yet Buy Bit11 (1)

sunderland56 (621843) | about a year and a half ago | (#42840487)

I thought 'bit11' was the same as '11b', or 3.


Serves them right (-1, Troll)

russotto (537200) | about a year and a half ago | (#42838595)

I hate fuckers who make software designed to prevent computer users from using their computer. This applies whether the software claims to be white-hat anti-malware stuff or outright admits it's a tool-of-the-devil locked bootloader or DRM tool.

Re:Serves them right (4, Insightful)

vux984 (928602) | about a year and a half ago | (#42838757)

I hate fuckers who make software designed to prevent computer users from using their computer.

What they are developing is really not fundamentally different from something like SELinux.

DRM is only evil because someone who is not the computer owner is unilaterally dictating what you can do with it.

Secureboot, SE Linux, and this stuff from bit9 are all tools that enable the owner of the computer to dictate what software is allowed to run on it.

Why shouldn't the owner decide that flash shall not have access to the internet? Or that flash shall not run. period.

The only time any of this is evil is when the owner isn't in control.

Some owners allegedly can't be trusted (1)

tepples (727027) | about a year and a half ago | (#42839765)

The only time any of this is evil is when the owner isn't in control.

Several fans of game consoles and Apple consumer electronics would claim that some individual hardware owners can't be trusted not to disable security to see dancing animals [wikipedia.org] , and taking control away from them is in their own good. They tend to pop up every time the Android trojan story of the week breaks or the Ouya project reaches another milestone.

Re:Some owners allegedly can't be trusted (1)

vux984 (928602) | about a year and a half ago | (#42840243)

Several fans of game consoles and Apple consumer electronics would claim that some individual hardware owners can't be trusted not to disable security to see dancing animals, and taking control away from them is in their own good.

I'm sure you would even agree that this is true for some individual hardware owners, perhaps even most of them.

But the solution is not to take it away from them per se, but rather to easily enable them to delegate it to a 3rd party they trust.

The problem with, say, Apple, is that it asserts it is that trusted 3rd party and gives users no reasonable way to take control back from Apple and either exercise their own security or delegate to a different 3rd party -they- do trust.

Contrast that with Antivirus software for example. We select it, install it, and put some faith in its ability to identify malware while trusting it not to exceed that mandate. If it sees an infected file come in it quarantines it. The 'unsophisticated user' typically accepts the antivirus companies assessment of the file and moves on. Few will challenge the antivirus software and seek to disable it and restore the file from quarantine.

However, if the antivirus software were to start blocking things it really has no business blocking and which doesn't represent what the customers want from it they are free to uninstall it and switch to something else.

I can't recall what my brother was using, but one day a few months ago it up and blocked him from going to The Pirate Bay. He uninstalled it and uses something else now. This is pretty much ideal: we have established a norm that one should have antivirus, but it is ultimately up to us, and we can change antivirus providers at will.

The reason viruses are such a problem is that blacklisting simply can't work, and "detecting malicious activity" is HARD. A whitelisting approach would be a lot more effective in some environments, and I've seen it deployed in practice with excellent results in corporate settings.

Capabilities (1)

tepples (727027) | about a year and a half ago | (#42840561)

The reason viruses are such a problem is that blacklisting simply can't work, and "detecting malicious activity" is HARD.

Ultimately, capabilities [wikipedia.org] are the real answer to "detecting malicious activity". OLPC Bitfrost protections, Android permissions, Ubuntu AppArmor, and Mac App Store entitlements work by characterizing the threat model, finding which actions are sensitive, and giving applications just enough privileges to do their work. AppArmor whitelists the parts of the file system that an application can see. Android permissions have been criticized as being yet another extra screen that the user just taps through to see the dancing bunnies, and some of this criticism is warranted especially for applications that request too many privileges that they don't really need. OLPC Bitfrost goes beyond that by making some privileges mutually exclusive at install time, such as Internet and directory scan, unless the user manually adds privileges to an application after install.

Re:Capabilities (0)

Anonymous Coward | about a year and a half ago | (#42841377)

You forgot SELinux and TOMOYO.

Really, none of these have been done right, though. The solution is to not let programs open resources directly: if the only way for a process to acquire a file handle is to ask a designated file-chooser service for one, then all your arbitrary-access problems go away. Unfortunately, POSIX makes this practically impossible.

Re:Capabilities (1)

tepples (727027) | about a year and a half ago | (#42843113)

if the only way for a process to acquire a file handle is to ask a designated file chooser service for one, then all your arbitrary access problems go away

I mentioned OLPC Bitfrost and Mac App Store sandbox, which do in fact make this the only way to acquire a file handle outside the application's own jail.

Re:Capabilities (1)

vux984 (928602) | about a year and a half ago | (#42845933)

my impression of these 'capabilities' protections is that they are not nearly specific enough to be of much use in practice.

I download a cloud contact sync program and it asks for permission to connect to the internet, and permission to scan my contacts.

So then it sends my contacts to a 3rd party spam outfit.

It asks for the ability to send/read sms because it has a feature to send contacts to other app users via sms. Cool.

So it sends copies of all my sms messages to a 3rd party. And sends sms advertising spam from my phone.

It's a fundamentally broken idea, in my opinion.

Re:Capabilities (1)

tepples (727027) | about a year and a half ago | (#42846019)

a cloud contact sync program [...] sends my contacts to a 3rd party spam outfit

If your sync provider is sharing user information with a spam outfit, then it's violating a reasonable privacy policy. One solution is to use applications that aren't limited to one sync provider but instead let the user specify the sync provider's API URL. This way, you could choose which sync provider to use or even run your own on a home server. But I'd agree with you that alongside the Internet privilege, which means "This application can connect to all hosts", systems should offer finer-grained controls based on host names, such as "This application can connect to IP addresses obtained by resolving *.buttbook.com or *.someadbroker.net".

It's a fundamentally broken idea in my opinion.

So what's the correct solution, other than solutions that pose a substantial entry barrier to non-malicious students and hobbyists?

Re:Capabilities (1)

vux984 (928602) | about a year and a half ago | (#42855951)

So what's the correct solution, other than solutions that pose a substantial entry barrier to non-malicious students and hobbyists?

Ok.

An operating system control system that allows the user to define app profiles that dictate what hosts it can connect to, what parts of the filesystem can be read / written, etc, etc. is the first part.

The 2nd part is letting e.g. antivirus / antimalware / new-category-of-security-solution providers hook into that system, so that end users can simply subscribe to app profiles from a 3rd party provider they trust to externalize the burden of auditing and creating the profiles.
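A minimal sketch of what such app profiles might look like, with invented profile fields and a hypothetical OS enforcement hook calling `allowed()` (profiles could be user-written or, per the second part above, subscribed from a trusted third party):

```python
# Toy app-profile check: default-deny, with per-app host and filesystem
# whitelists. All app names, hosts, and paths here are made up.
import fnmatch

PROFILES = {
    "contact_sync": {
        "connect": ["*.sync-provider.example"],   # hosts it may talk to
        "read":    ["/home/*/contacts.db"],       # files it may read
        "write":   [],                            # nothing it may write
    },
}

def allowed(app, action, target):
    """Would a hypothetical OS hook permit this (app, action, target)?"""
    prof = PROFILES.get(app)
    if prof is None:
        return False  # default deny: unprofiled apps get nothing
    return any(fnmatch.fnmatch(target, pat) for pat in prof.get(action, []))
```

Under this scheme the contact-sync app can reach its own provider but can't quietly phone home to an ad broker, which addresses the grandparent's spam-outfit scenario.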

Re:Serves them right (1)

IAmR007 (2539972) | about a year and a half ago | (#42844457)

The problem is that Microsoft had the manufacturers use an implementation that's hard to use for non-Microsoft OSes. The FSF proposed a fair implementation, but Microsoft said no. That alone proves what their goal is.

Re:Serves them right (1)

vux984 (928602) | about a year and a half ago | (#42874681)

The problem is that Microsoft had the manufacturers use an implementation that's hard to use for non-Microsoft OSes.

How so? Any hardware vendor that wishes to release linux preinstalled hardware will have no difficulty whatsoever.

The only part that is 'harder' is taking Windows-preinstalled hardware and converting it to another operating system. And that is ENTIRELY THE POINT OF SECURE BOOT: that something prevents arbitrary unknown software from booting up with the PC, whether a rootkit or the neighbor kid 'hacker' with a live thumbdrive. Now, sure, switching operating systems on a computer that shipped with Windows requires the owner to either disable the lock or re-key the system to run something else.

But how is that 'evil'? That's the reasonable price of increased security.

The FSF proposed a fair implementation, but Microsoft said no.

Which FSF proposal are you referring to specifically? The one where they wanted systems to be shipped with secureboot off and then users could turn it on? Resulting in piles of confused users, and users who left it off and needlessly reduced their own security? Yeah, that would have been great.

That alone proves what their goal is.

Even without knowing exactly what you are talking about, it proves nothing. There is nothing about the current secureboot implementation that is overly discriminatory against alternative operating systems.

Re:Serves them right (5, Informative)

Anonymous Coward | about a year and a half ago | (#42838975)

I hate fuckers who make software designed to prevent computer users from using their computer. This applies whether the software claims to be white-hat anti-malware stuff or outright admits it's a tool-of-the-devil locked bootloader or DRM tool.

A company has every right to lock down their own computers. Dumbass employees with Admin rights = disaster!! This software is similar to SUA + AppLocker (deny all) + whitelisted certs and it's a solid approach.

Re:Serves them right (0)

russotto (537200) | about a year and a half ago | (#42840763)

A company has every right to lock down their own computers.

The right, certainly. But turning a computer into a glorified cash register running only "approved" apps is a terrible move, even when you own it. Sure, you prevent malware. You also prevent everything else.

Re:Serves them right (2, Informative)

Anonymous Coward | about a year and a half ago | (#42841031)

A company has every right to lock down their own computers.

The right, certainly. But turning a computer into a glorified cash register running only "approved" apps is a terrible move, even when you own it. Sure, you prevent malware. You also prevent everything else.

From the summary:
Bit9, a company that provides software and network security services to the U.S. government and at least 30 Fortune 100 firms

This has nothing to do with consumer toys or personal computers. It's to do with gov't/corp workstations. It prevents employees from accidentally installing unsigned updates and plugins. It prevents spies, defectors or hackers from stealing the "secret sauce". The integrity of the certs is crucial to its effectiveness.

Removing rights from your own Windows acct. is not a bad idea and can be comfortable with tools like SuRun [kay-bruns.de]

(I'm the same AC that you replied to)

Re:Serves them right (1)

russotto (537200) | about a year and a half ago | (#42844521)

This has nothing to do with consumer toys or personal computers. It's to do with gov't/corp workstations. It prevents employees from accidentally installing unsigned updates and plugins.

It also prevents employees from deliberately installing useful items. It means they have to do their work on the computer in exactly the way that work has always been done; if they think some tool will make things easier or more efficient, that's just tough because they can't install it.

Imagine if you had to do everything in edlin because some program refused to let you install your favorite editor.

Re:Serves them right (0)

Anonymous Coward | about a year and a half ago | (#42845415)

It also prevents employees from deliberately installing useful items. It means they have to do their work on the computer in exactly the way that work has always been done; if they think some tool will make things easier or more efficient, that's just tough because they can't install it.

Imagine if you had to do everything in edlin because some program refused to let you install your favorite editor.

I understand the frustration but you're judged by your peers performance and they're dealing with the same handicaps. Efficiency takes a backseat to security. If this is unacceptable then find another job.
Fortune 100 and gov't agencies are high value targets. We're not talking about preventing run-of-the-mill malware but targeted attacks that utilize unknown unknowns. The best way to safeguard is to assume that the OS and every piece of software has unpatched vulnerabilities, that every employee is a wolf in sheep's clothing and LOCK IT ALL DOWN. Any "new" software needs to be vetted to levels that not all will pass. That's just the way it is and for good reason.

This breach is about as bad as it gets and heads will roll but it's no reason to throw solid security practices out the window. On the contrary, it proves the paranoia is founded.

Must have been the US govt (1)

oodaloop (1229816) | about a year and a half ago | (#42838607)

Because everyone knows there's no hacking threat. Right?

Just stupid (3, Informative)

Anonymous Coward | about a year and a half ago | (#42838677)

Why was this system connected to the internet either directly through the main lan or an unsecured vlan?

We have basic white papers and common sense security plans to stop this kind of thing.

Re:Just stupid (1)

Pinky's Brain (1158667) | about a year and a half ago | (#42839535)

Because admins want to ssh into it with their home laptops they browse for porn with?

Re:Just stupid (1)

cellocgw (617879) | about a year and a half ago | (#42842585)

Why was this system connected to the internet either directly through the main lan or an unsecured vlan?

Well, having just finished Ghost in the Machine, my bet is some genius in Bit9's IT dep't got a phone call that went "Hi, this is Bob from AccountTemps, and I need you to change your password on the repository server so we can verify our updated security patch is working..."

Slashvertisement gone bad... (0)

Anonymous Coward | about a year and a half ago | (#42838683)

It's revolutionary software, I say! Nobody has ever even thought of ... a *whitelist* before!! Why, it's so amazing that it, ah, shoot, it just killed my dog. Well, it was good before it decided to exterminate all canines. What? The cat too? Well, it apparently hates all pets now. FML. Piece of crap software.

Dog Food (4, Funny)

Anonymous Coward | about a year and a half ago | (#42838685)

Not Eaten Here

Sounds like Bit 9.5 is in order (1)

JoeyRox (2711699) | about a year and a half ago | (#42838745)

New and improved with 5% more bits!

Re:Sounds like Bit 9.5 is in order (1)

ahem (174666) | about a year and a half ago | (#42839031)

<pedant>Actually 6% or 5.6% or 5.56% or 5.555556% more bits</pedant>

That's What They Get (0)

Anonymous Coward | about a year and a half ago | (#42838751)

They're idiots, holding on to outdated tech and business models. They should have shifted everything to the cloud last summer and all of this could have been avoided! They're probably being run by a bunch of old fuckers. Serves them right.

What?

Smartass (0)

Anonymous Coward | about a year and a half ago | (#42838827)

-smartass- I hope they used the stolen *keys* to sign malware -/smartass-

"Product was not compromised"? (4, Insightful)

Jorgensen (313325) | about a year and a half ago | (#42838931)

Impressive:

There is no indication that this was the result of an issue with our product.

Well... technically right, but the "product" people buy is not just the software: It is the whole package, which includes the on-going maintenance of whitelists, signing binaries and whatnot. And that appears to have been badly compromised.

We are continuing to monitor the situation.

Surely, if the product is that great, then you can relax, right? Isn't that what you're selling to your customers? "Security in a box?" (I know. Security is an on-going process, but not if you ask sales)

While our investigation shows our product was not compromised, we are finalizing a product patch that will automatically detect and stop the execution of any malware that illegitimately uses the certificate

Repetition Repetition... "product not compromised" ... except that it no longer provided any protection against those evil hackers?

I think I'm getting my head around doublespeak - will be useful when I respond to bugs...

Re:"Product was not compromised"? (3, Insightful)

alcourt (198386) | about a year and a half ago | (#42839547)

I had a long chat with one of their sales types a couple of weeks ago. The salesperson had to talk to backline engineering, but confirmed the next day that, yes, the bypass I outlined in under two minutes would in fact evade the tool completely, and that their software was designed in precisely the way that makes support from OS and hardware vendors very difficult on Linux.

I tried to push them into the more useful area of logging what is done rather than trying to declare a known whitelist. Under their current scheme, a sysadmin couldn't write a custom shell script to their home dir and run it without going through twenty blessings first. Tweak that shell script? Won't run, even without privilege. I was not impressed.

Re:"Product was not compromised"? (0)

Anonymous Coward | about a year and a half ago | (#42839631)

There are ways to allow execution from specific place like a home directory with Bit9's tools. They can be really freaking draconian but there are some options to make people able to do their job. And yes, I work somewhere with Bit9 deployed to several thousand servers and even more desktops.

Re:"Product was not compromised"? (0)

Anonymous Coward | about a year and a half ago | (#42847619)

"Product not compromised but it's useless now" is so similar to Boeing's "We have total confidence in the safety of the 787 while the entire fleet is grounded". I'm starting to see a pattern here. Where do they teach this stuff?

Who was the real target? (3, Insightful)

Midnight_Falcon (2432802) | about a year and a half ago | (#42839083)

Just like the RSA hack, the infiltrators here appear to be after signing certificates. They must have an objective to hack a client that uses Bit9 systems and thus requires whitelisting. That means some client of Bit9 is about to get seriously compromised.

Re:Who was the real target? (1)

Anonymous Coward | about a year and a half ago | (#42839167)

I think that's how they discovered the issue...

FTFS:
" The attackers then sent signed malware to at least three of Bit9's customers"

One basket for the eggs (1)

Anonymous Coward | about a year and a half ago | (#42839253)

What a shame. The truly bullshit "security" companies (as opposed to the moderately bullshit ones like Bit9) will go on making money with AV software, while someone who sort of tried to do things right (whitelists) is utterly clobbered. But they did fuck up.

Ok, so you didn't run your own wares, kind of like back when (and maybe this is still the case) OpenBSD was hosted on Solaris systems. ;-)

Beyond that, though, we see another failure here, and it's one that it also shared by most of today's HTTPS problems and even some of the proposed fixes: single signers, totally trusted as part of the totally-unrealistic all-or-nothing trust model.

In PGP, imagine the conspiracy required to compromise a stranger's identity, one you might happen to believe because it was certified by three "moderately trusted" parties. Three amateurs could trivially supply vastly better security than a major government contractor. It's that easy to do better than what Bit9 did, with 20-to-25-year-old solutions.

Something like this will happen again. Something like this does happen every few months, it seems, when some root CA is found to be shady or compromised. The lesson: one signature is not enough. Require a conspiracy, or require that some uber-badass break into multiple different systems, administered by different people, in a narrow band of time. That makes sense whether it's to fetch an email public key, check a signed binary, or whatever. If it's important, then do it right. And if it's not important, then why did you pay Bit9?
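The "require a conspiracy" idea can be sketched as k-of-n verification. In this toy Python version, HMACs from distinct keys stand in for real signatures; the signer names and threshold are illustrative:

```python
# Toy threshold verification: accept a binary only if at least THRESHOLD
# independent signers vouch for it, so one stolen key is no longer enough.
import hmac
import hashlib

SIGNERS = {  # each key held by a different admin on a different system
    "alice": b"key-a",
    "bob":   b"key-b",
    "carol": b"key-c",
}
THRESHOLD = 2

def sig(name, blob):
    """Produce this signer's (stand-in) signature over the blob."""
    return hmac.new(SIGNERS[name], blob, hashlib.sha256).hexdigest()

def verify(blob, sigs):
    """sigs: {signer_name: hex_signature}. True if enough check out."""
    good = sum(1 for name, s in sigs.items()
               if name in SIGNERS and hmac.compare_digest(sig(name, blob), s))
    return good >= THRESHOLD
```

An attacker now has to compromise two separately administered systems inside the same time window, which is the conspiracy requirement in code form.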

I think we need a better security model. (1)

fufufang (2603203) | about a year and a half ago | (#42839835)

CAs keep getting hacked lately. How can I place my trust in CAs these days? Perhaps the browser should inform users when a certificate changes for an individual website, similar to SSH?

Re:I think we need a better security model. (1)

manu0601 (2221348) | about a year and a half ago | (#42840391)

There are Firefox extensions for that. But unless you are the operator of the service, what do you do if the certificate changes? How do you know whether the change is legit?
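For what it's worth, the SSH-style behavior being discussed amounts to trust-on-first-use pinning, sketched below (file format and return values invented). Note it only flags a change; it can't by itself tell a legitimate rotation from an attack, which is exactly the open question here:

```python
# Trust-on-first-use certificate pinning, like SSH's known_hosts:
# remember the first fingerprint seen per host, flag any later change.
import json
import os

def check_pin(host, fingerprint, pin_file):
    """Return "new" on first sight, "ok" on match, "changed" on mismatch."""
    pins = json.load(open(pin_file)) if os.path.exists(pin_file) else {}
    known = pins.get(host)
    if known is None:
        pins[host] = fingerprint  # first contact: pin it
        with open(pin_file, "w") as f:
            json.dump(pins, f)
        return "new"
    return "ok" if known == fingerprint else "changed"
```

On "changed", all the browser can really do is warn and make the user verify out of band.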

Did not run their own software. (1)

manu0601 (2221348) | about a year and a half ago | (#42840379)

They say they got hacked because they did not run their own software. I see another reason: either one of the accredited operators of the signing infrastructure ran malware on their signing machine (scary), or the signing machine offered hackable services on the company network (scary again).

Re:Did not run their own software. (1)

VortexCortex (1117377) | about a year and a half ago | (#42842119)

When folks don't use their own products it's because the product is shit. Do you think Microsoft compiles Windows with Visual Studio?

Re:Did not run their own software. (1)

ecotax (303198) | about a year and a half ago | (#42842209)

When folks don't use their own products it's because the product is shit. Do you think Microsoft compiles Windows with Visual Studio?

Firstly, regardless of what I think of Windows, I actually believe they do use Visual Studio, see the discussion here:
http://stackoverflow.com/questions/7381392/compiler-used-to-build-windows-7 [stackoverflow.com]

Secondly, Visual Studio is a quite acceptable IDE, and could very well be the best software product they ever made.

Re:Did not run their own software. (1)

manu0601 (2221348) | about a year and a half ago | (#42843757)

Well, they at least need a machine without their software to examine new software they are going to sign, don't they?

Bit9 Hacked (0)

Anonymous Coward | about a year and a half ago | (#42841801)

Is that a big DUH! or what? Not running their own stuff? Idiots.
