
OpenSSL Revalidated Following Suspension

Zonk posted more than 7 years ago | from the can't-keep-a-penguin-down dept.

Security 51

lisah writes "Despite what looks like an organized effort to prevent it, OpenSSL has been revalidated by an independent testing agency for its ability to securely manage sensitive data and is ready for use by government agencies like the Department of Defense. According to the Open Source Software Institute, which has been overseeing the validation process for the last five years (something that typically takes only a few months), it seems that the idea of an open source SSL toolkit didn't sit right with proprietary vendors of similar products. A FUD campaign was launched against OpenSSL that resulted in a temporary suspension of its validation. Developers and volunteers refused to give up until the validation was reinstated, and Linux.com has the story of the project's long road to success." Linux.com and Slashdot are both owned by OSTG.


51 comments


Let me be the first to say... (-1, Troll)

tomstdenis (446163) | more than 7 years ago | (#17949430)

Validation is meaningless. It's how you use the tools that matters.

I can write an insecure application with OpenSSL just as easily as with, say, CryptLib or Botan or whatever.

Tom
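
For illustration, a minimal hypothetical sketch of the point, using OpenSSL's standard SSL_CTX API: a validated library used insecurely simply by switching off certificate verification. Nothing here is from any real application.

#include <openssl/ssl.h>

/* Sketch only: a validated TLS library used insecurely.
 * Disabling peer verification makes the connection trivially
 * vulnerable to man-in-the-middle attacks, no matter how well
 * the underlying crypto was validated. */
SSL_CTX *make_insecure_ctx(void)
{
    SSL_CTX *ctx = SSL_CTX_new(SSLv23_client_method());
    if (ctx == NULL)
        return NULL;

    /* The insecure part: accept any certificate, from anyone. */
    SSL_CTX_set_verify(ctx, SSL_VERIFY_NONE, NULL);
    return ctx;
}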

Re:Let me be the first to say... (4, Insightful)

damiangerous (218679) | more than 7 years ago | (#17949478)

How does that make validation meaningless? That you can write an insecure app no matter the toolkit is irrelevant. What's relevant is that you cannot write a secure app with an insecure toolkit.

Re:Let me be the first to say... (2, Insightful)

stratjakt (596332) | more than 7 years ago | (#17949550)

Sure you could, *if* you knew exactly what the insecurities were and worked around them.

Re:Let me be the first to say... (2, Interesting)

tomstdenis (446163) | more than 7 years ago | (#17949558)

Given that I'm doing NIST certification right now, I can assure you it's meaningless. They basically throw a bunch of vectors at you, you reply, if you get it right they give you a cert # and list you on some website.

The only reason ANYONE does this is so they can get on that website. Getting a compliant AES routine isn't hard. There are dozens of implementations under BSD, MIT, GPL, and various other FLOSS licenses [including public domain]. That you picked an AESVS-certified implementation doesn't mean your application is "better".

In fact, AESVS does not mandate any implementation details other than it outputs the right ciphertext.

The FIPS 140 series is a bit different, but overall it's still a meaningless gesture.

Tom
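
For a sense of scale, the vector exchange described above boils down to known-answer tests like the following minimal sketch, which assumes OpenSSL's low-level AES API and uses the AES-128 example vector from FIPS-197 Appendix C.1. Producing the expected ciphertext is essentially all the algorithm-validation vectors check.

#include <openssl/aes.h>
#include <stdio.h>
#include <string.h>

/* Minimal known-answer test: AES-128 vector from FIPS-197 Appendix C.1. */
int main(void)
{
    static const unsigned char key[16] = {
        0x00,0x01,0x02,0x03,0x04,0x05,0x06,0x07,
        0x08,0x09,0x0a,0x0b,0x0c,0x0d,0x0e,0x0f
    };
    static const unsigned char plaintext[16] = {
        0x00,0x11,0x22,0x33,0x44,0x55,0x66,0x77,
        0x88,0x99,0xaa,0xbb,0xcc,0xdd,0xee,0xff
    };
    static const unsigned char expected[16] = {
        0x69,0xc4,0xe0,0xd8,0x6a,0x7b,0x04,0x30,
        0xd8,0xcd,0xb7,0x80,0x70,0xb4,0xc5,0x5a
    };
    unsigned char out[16];
    AES_KEY enc_key;

    AES_set_encrypt_key(key, 128, &enc_key);
    AES_encrypt(plaintext, out, &enc_key);

    /* "Right ciphertext" is the whole test. */
    puts(memcmp(out, expected, 16) == 0 ? "KAT passed" : "KAT FAILED");
    return 0;
}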

YOU'RE doing NIST certification? (-1, Troll)

mmell (832646) | more than 7 years ago | (#17949614)

For who? Please tell me the U. S. Government. Ple-e-e-ase?

It would just confirm my estimate of the IQ of the average civil servant! (with apologies to all three of those civil servants out there with IQs > 80)

Re:YOU'RE doing NIST certification? (1)

tomstdenis (446163) | more than 7 years ago | (#17949692)

What?

No, as in my company is doing the test so we can get a cert and be listed on the perdy website. We've already done it for our hardware crypto; this time around it's the software crypto.

I don't think you actually know what goes on in validation, because if you had the slightest clue you'd just say "so what?"

Tom

Re:YOU'RE doing NIST certification? (1)

Stanistani (808333) | more than 7 years ago | (#17949954)

>It would just confirm my estimate of the IQ of the average civil servant! (with apologies to all three of those civil servants out there with IQs > 80)

I believe you have "the average civil servant" confused with "the average civil servant's manager."

Re:Let me be the first to say... (1)

Anpheus (908711) | more than 7 years ago | (#17951400)

I wrote a valid implementation of SHA-1 and AES in JASS (the scripting language of Warcraft III) just to see how poor the integer arithmetic was. It was poor. (I had to code my own bitwise functions too, which really hurt. None are standard!)
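
For illustration, building a bitwise operation out of nothing but integer arithmetic (the situation described above) looks roughly like the following. This is a hypothetical C sketch of the idea, not the actual JASS code.

/* Hypothetical sketch: bitwise AND built from division and modulo only,
 * the way you have to do it in a language with no bitwise operators.
 * (In C you would obviously just write a & b.) */
unsigned int slow_and(unsigned int a, unsigned int b)
{
    unsigned int result = 0;
    unsigned int bit = 1;
    int i;

    for (i = 0; i < 32; i++) {
        /* If the current low bit is set in both operands,
         * that bit belongs in the result. */
        if ((a % 2 == 1) && (b % 2 == 1))
            result += bit;
        a /= 2;
        b /= 2;
        if (i < 31)      /* avoid overflowing 'bit' on the last step */
            bit *= 2;
    }
    return result;
}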

Then let me be the first to answer . . . (5, Insightful)

mmell (832646) | more than 7 years ago | (#17949552)

Non-validated tools are worthless. The likelihood of their being used as tools by any branch of the government (including the banks) is virtually nil today. D'you really think business will follow where the government/banking industry won't go, or end users (with no business or government end-points to connect to)?

And of course, any moron can "write an insecure application" using any tools. Writing a secure application, OTOH . . .

Re:Then let me be the first to answer . . . (1)

tomstdenis (446163) | more than 7 years ago | (#17949600)

Hi, I'm the author of the LibTom Projects. I know for a *fact* that they're used within the government (USA). None of my projects have 800 or 140 series validation.

Thanks for playing the "OMG there is a diff between policy and reality" game. You lose.

Tom

Please list the LibTom projects in question . . . (1)

mmell (832646) | more than 7 years ago | (#17949784)

I know I'll sleep better if non-certified code such as you describe is aggressively and immediately removed from government servers.

Validation exists for a reason. If you feel the criteria are too lax, you should be lobbying to have them tightened, not proclaiming "oh, well. C'est la vie!".

Then again, perhaps you have a vested interest in denigrating the validation process - by your own declaration, you have non-validated code ostensibly running where it shouldn't be? How much do you stand to lose here if government entities start aligning theory and practice?

Re:Please list the LibTom projects in question . . (2, Interesting)

tomstdenis (446163) | more than 7 years ago | (#17949986)

My projects are public domain. I stand to lose nothing if they stopped using them.

Note I should have been clearer. I said they use them; I didn't mean specifically that they end up in actual fielded projects (because I don't know about the latter). But judging from the logs and support emails I get from various organizations, they're at least using it for something. I do know that some folks at NIST used the projects to test CCM implementations. Heard that from former employees.

Point is, non-validated code is used to do work.

Tom

Re:Please list the LibTom projects in question . . (2, Insightful)

stratjakt (596332) | more than 7 years ago | (#17950004)

Not every government application requires the same level of validation or security. Not everything they do is secret - in fact, most of what they do is not secret.

How much security do you think your local municipality's roads department needs? I'm sure they keep track of what roads got plowed, and salted, and when. I wouldn't think that would be something they need under Fort Knox-level security.

You can reply with "what if a hacker said don't plow this road, then it got real icy and a car crashed and the TERRORISTS WIN", but the fact is, they would hardly give any thought to that scenario; they would have no problem sending updates to the database in cleartext over a wifi link secured with nothing more than WEP. In fact, the key is probably something like 1A2B3C4D5E. The database would exist as a work log - people still do their jobs in the real world.

Hell, maybe some guy wrote that little database 10 years ago, and it's still running on a Windows 3.1 box in a back room. I saw a DOS terminal in the post office; I see ancient hardware still performing its duties every day.

Re:Please list the LibTom projects in question . . (1)

mikael (484) | more than 7 years ago | (#17950410)

How much security do you think your local municipality's roads department needs?

If road departments are anything like they are in my city, the dates when a particular section of road is coned off for resurfacing seem to be on a "for your eyes only" level.

We never see the construction crews arriving. My neighbour is convinced the road crews materialise out of a higher dimension, or burrow out of the hole they are repairing in an explosion of dirt, traffic cones, workers' huts and heavy-duty industrial machinery.

Re:Please list the LibTom projects in question . . (2, Informative)

jmwci1 (710934) | more than 7 years ago | (#17950210)

From the Too-much-information-for-those-who-do-not-wish-to-hear-it file:

The DoD policy which requires the FIPS validation process for programs such as OpenSSL is National Security Telecommunications and Information Systems Security Policy Number 11 (NSTISSP No. 11). An overview can be found here: http://www.enpointe.com/security/pdf/nstissp11_factsheet2.pdf [enpointe.com]

In short, it states that for govt/DoD to purchase/acquire any Information Assurance (IA) or IA-enabled products, they must pass through the appropriate validation process (Federal Information Processing Standards-FIPS, or Common Criteria-CC).

On a technical side, the validation process only verifies that the product performs as designed/advertised. It simply verifies or validates the products claims.

From an acquisition/implementation side, it is critically important because it is "required" if a product is to be used within specified DoD systems. It is the check in the box that even allows a product to be openly considered within these stringent environments.

Does this mean that there are such programs running inside DoD/govt environments which have not gone through such validation efforts? Sure there are. Until now, OpenSSL was one of those products.

But to promote and encourage the open adoption of open source programs such as OpenSSL or Linux (of which both RH and SuSE have passed through CC), they must pass through the same tests as other similar (most of the time proprietary) product offerings. We (in the Open Source Community) talk about wanting a "level playing field"; well, this is part of the process of achieving it. A level playing field is a two-edged sword, so if that's what you want, which we do, then you've got to take the challenges along with the opportunities. Those are the rules.

regards,
jmw

So MAKE it level (0)

Anonymous Coward | more than 7 years ago | (#17952060)

OSS code is there to be used, but the chance of making enough dosh to pay for validation is slim. So put it through FIPS/CC for nowt.

Where companies have a product made of OSS, they can afford to have *their* *specific* implementation ratified as well, with the advantage that, given its provenance, there is less repeat work.

As long as it costs to certify, there won't be a level playing field.

Re:Please list the LibTom projects in question . . (1)

MeNeXT (200840) | more than 7 years ago | (#17952892)

If the article is accurate, then this process seems to be political in nature and not security oriented. What would it matter if a commie wrote it, if it was secure? If the software needs to pass certification by answering questions like "Does the programmer's mother wear army boots?" or "What are your political beliefs?" then the process is more of a scam to keep the "IN FOLK" in and all others out. It is very easy to throw out allegations when you can hide behind a veil of secrecy. I would not sleep any better knowing that the main interest in security certification is profit for the "IN FOLK". The certification becomes meaningless from a security point of view.

One last question... aren't the commies our friends and the Arabs our enemies? Or is it not PC to ask? I have trouble figuring out who to hate.

Re:Then let me be the first to answer . . . (2, Insightful)

stratjakt (596332) | more than 7 years ago | (#17949810)

It's all going to depend on the project, and the level of security required, who it's for.

We sell to municipal, state, and federal levels of government, and have worked with a lot of different agencies, and the requirements are different every time. A city servant who needs a database to keep track of the flow rates of fire hydrants has different security concerns from a federal agent investigating a military colonel for embezzlement, or whatever.

Technically, the policy says OSes have to be POSIX-compliant. The reality is, you still stumble across a Windows 3.1 box chugging away on some home-spun app in the basement.

Re:Then let me be the first to answer . . . (1)

AppleButter (1061188) | more than 7 years ago | (#17950642)

Validation for anything is a lot like the political process in general: intended to be painfully slow. Political candidates are vetted thoroughly beforehand because no one likes to be surprised (imagine if Howard Dean's outburst had happened right as he was about to be sworn in). Legislation is debated until the problem it is trying to address no longer requires government intervention. No one wants to take a stand only to be embarrassed later.

Re:Then let me be the first to answer . . . (0)

Anonymous Coward | more than 7 years ago | (#17954138)

Obviously, you have no clue what you're talking about. I have worked on teams that have sold work to the NSA/CIA/DoD without it being validated. It would have been an easier sell if validated, but not by much. As for the banks, many of them use LOADS of non-validated software. Witness the fact that banks get compromised all the time because they run Windows and get viruses. A few of these viruses actively look for bank and gov servers because they are such easy marks.

Re:Let me be the first to say... (3, Insightful)

Anonymous Coward | more than 7 years ago | (#17949878)

Validation is meaningless.

Is the government allowed to use OpenSSL if it is not validated?

If not, then I don't think the word "meaningless" means what you think it means.

Re:Let me be the first to say... (1)

Cerebus (10185) | more than 7 years ago | (#17950446)

What good would it do you if the AES implementation was just *wrong*? Or if the crypto library processes the key in unprotected memory?

That's what validation gets you.
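
To make the memory point concrete, here is a minimal hypothetical sketch of the kind of key hygiene a validation report looks at: scrubbing key material with OPENSSL_cleanse(), which exists precisely because a plain memset() of a buffer that is never read again may be optimized away. The function do_one_encryption() and its shape are made up for illustration.

#include <openssl/aes.h>
#include <openssl/crypto.h>
#include <openssl/rand.h>

/* Sketch: treat key material as short-lived and scrub it on exit. */
int do_one_encryption(const unsigned char in[16], unsigned char out[16])
{
    unsigned char key[16];
    AES_KEY enc_key;
    int ok = 0;

    if (RAND_bytes(key, sizeof(key)) == 1) {
        AES_set_encrypt_key(key, 128, &enc_key);
        AES_encrypt(in, out, &enc_key);
        ok = 1;
    }

    /* Scrub both the raw key and the expanded key schedule. */
    OPENSSL_cleanse(key, sizeof(key));
    OPENSSL_cleanse(&enc_key, sizeof(enc_key));
    return ok;
}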

Not necessarily... (4, Insightful)

lisah (987921) | more than 7 years ago | (#17949562)

Well, it's not necessarily "meaningless." It would be great to see more governmental agencies choosing open source options but, from what I understand, when it comes to managing sensitive data, only software that is tested and proven to be reliable and secure can be used -- hence OpenSSL's validation process. Sure, it's important to use the tools wisely but, without the FIPS validation, this open source tool can't be used by the government in the first place.

Re:Not necessarily... (4, Insightful)

lymond01 (314120) | more than 7 years ago | (#17949632)

I believe he means it's technically meaningless, while others are arguing that it's bureaucratically important. The specs for validation aren't so difficult to meet, but if you don't go through the process, no one wants to use your software.

Like us: if you don't have that MCSE on your resumé, we don't want you.

Oh, wait. Yes we do...

Re:Not necessarily... (0)

Anonymous Coward | more than 7 years ago | (#17950364)

I worked in NIST's ITL-CSD (Information Assurance department) as a "Guest Researcher" 4-5 years ago (during grad school), and we made heavy use of OpenVPN and OpenSSL to write software that did ad-hoc networking and secure tunneling to/from Debian-based handhelds (iPAQs, if I remember correctly).

Apparently it was OK to use a few years ago...

Re:Not necessarily... (0)

Anonymous Coward | more than 7 years ago | (#17962378)

"...when it comes to managing sensitive data, only software that is tested and proven to be reliable and secure can be used..."

Reliable and secure like Diebold voting machines? IMO the potential for secure code is much better with open source than with black-box code, because open source is exposed to much greater public scrutiny. The more public scrutiny of anything the government does, the better.

0.9.7 vs 0.9.8 (0, Offtopic)

stratjakt (596332) | more than 7 years ago | (#17949604)

I can't get IDEA support and the other proprietary algos to compile right in 0.9.8, and that stops me from running the commercial NX server - I need to use 0.9.7.

I use Gentoo, but I was playing with the ebuild by hand and it didn't seem to be just a Gentoo problem. It was ./configure --with-idea, and all that, like it should be. Anybody know anything weird about 0.9.7 vs 0.9.8, or whether it's fixed/changed in 0.9.9?

zzzz..... (0)

Anonymous Coward | more than 7 years ago | (#17949806)

While I'm sure that we all feel outraged about the hook of "proprietary vendors trying to shut out open source," there's nothing unusual here.

A vendor wants to sell a product which needs to be certified to meet certain standards. Once they're selling their product, someone else comes out with a competing product. The vendor tells everyone that the other product isn't as good as theirs and shouldn't be allowed in the market. The vendor's claims about the other product are investigated, and found to be invalid.

This happens ALL THE TIME, and really doesn't have a lot to do with OpenSSL being open source.

Re:zzzz..... (4, Insightful)

stratjakt (596332) | more than 7 years ago | (#17949896)

No, but like they say in the article, it delayed their validation, because it's easier to criticize something open source: you can pull open the code, point at something and say "see, see". The OpenSSL team can't make the same accusations against a closed source product, because the code isn't available - trade secret.

Validation is somewhat less meaningful for OSS because of this - anyone (assuming the proper skill level) could look at the code and see for themselves whether the criticisms have any merit. With a closed solution, all you have to go on is the validation - that stamp of approval.

You are correct though, this isn't that big a deal; it's just about OSS so it's /. news.

Re:zzzz..... (1)

Niksta (1062292) | more than 7 years ago | (#17959984)

This public scrutiny is a process that crypto algorithms must go through to gain any value with the public. I'm sure everyone reading this thread can recall the lame DVD CSS once called "encryption". The MPAA tried to maintain the security of this "encryption" by keeping its methods in their secret vault, but that didn't do them much good. Having the public attempt to poke holes in a technology claimed to be secure is the only way to prove it is. I salute those who worked so hard to validate OpenSSL. Now IT folk concerned about compliance in the government sector can actually get work done on mission-critical systems (responsible for Federal Information Processing) without having to deposit mass loads of tax dollars in the pockets of greedy vendors (namely those that posed a threat to OpenSSL's validation).

Huh? (4, Insightful)

teridon (139550) | more than 7 years ago | (#17949918)

FTA:
Since all of OpenSSL's source code has passed the testing process, now developers can focus on compiling binary libraries and submitting those for validation

Someone please explain to me why binaries aren't good enough for the first review, then later they are? Who says the new source code is "secure"?

Why didn't they require source code review for vendor products?

Re:Huh? (1)

mpapet (761907) | more than 7 years ago | (#17950506)

I don't know all the ins and outs of the process, but it's my understanding that, based on the level of certification and any issues that may arise during certification, a source code review is necessary to clarify concerns and issues.

Re:Huh? (2, Insightful)

Iphtashu Fitz (263795) | more than 7 years ago | (#17950536)

Someone please explain to me why binaries aren't good enough for the first review, then later they are? Who says the new source code is "secure"?

I don't think it's a matter of one being better than the other. Certification of one thing doesn't mean related items are also certified. Just because the source code is now certified doesn't mean that all the libraries that can potentially be built by that source code are now automatically certified as well. (If B derives from A, and A is certified, it doesn't automatically mean that B is certified as well.)

The article did a fairly good job of explaining why they certified the source. Since there are so many options for including/excluding various components within OpenSSL, it'd be difficult to build, maintain, and certify dozens of potential variations of the same version of the library. Not to mention how you'd keep users from getting confused by all those potential variations.

Having the source certified means that people/organizations that want to build from those sources can have a binary that meets those certification requirements as long as all the other components (any other libraries or other requirements needed to build it) are similarly certified. It also means that if an organization has some requirement for a rather uniquely configured version of OpenSSL that they can build it themselves from certified sources and be comfortable with using it. By also getting certification of various binaries they're ensuring that people who don't need/want to rely on building from source can also have a fully tested & certified solution. Chances are they won't build/certify every possible combination of OpenSSL. It's more likely that they'll build one version with all options (at least as far as legal restrictions go), one with the most common options, one with minimal options, etc. and get those few variations certified individually.

Re:Huh? (1)

Jonathan_S (25407) | more than 7 years ago | (#17951762)

It also means that if an organization has some requirement for a rather uniquely configured version of OpenSSL that they can build it themselves from certified sources and be comfortable with using it

They may be comfortable using it, but if they were a government agency with a requirement to use validated code it doesn't look like they could use a "uniquely configured version".

Taken from the CMVP validation list on the CMVP website.

(When built, installed, protected and initialized as specified in the provided Security Policy. Appendix B of the provided Security Policy specifies the complete set of source files of this module. There shall be no additions, deletions or alterations of this set as used during module build. All source files, including the specified OpenSSL distribution tar file, shall be verified as specified in Appendix B of the provided Security Policy. Installation, protection, and initialization shall be completed as specified in Appendix C of the provided Security Policy. Any deviation from specified verification, protection, installation and initialization procedures will result in a non FIPS 140-2 compliant module)


That seems to be a pretty lengthy caveat saying that if you make changes, it's not the validated version anymore.
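
To make that concrete, here is a minimal hypothetical sketch of the kind of verification step such a security policy describes: compute the SHA-1 digest of the distribution tar file and compare it with the digest published in the policy. The file name and expected digest below are placeholders, not the real values from any OpenSSL security policy.

#include <openssl/sha.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical sketch: verify a distribution file's SHA-1 digest against
 * an expected value before building. Name and digest are placeholders. */
static const char *expected_hex =
    "0000000000000000000000000000000000000000";

int main(void)
{
    FILE *fp = fopen("openssl-dist.tar.gz", "rb");
    unsigned char buf[4096], digest[SHA_DIGEST_LENGTH];
    char hex[2 * SHA_DIGEST_LENGTH + 1];
    SHA_CTX ctx;
    size_t n;
    int i;

    if (fp == NULL)
        return 1;

    SHA1_Init(&ctx);
    while ((n = fread(buf, 1, sizeof(buf), fp)) > 0)
        SHA1_Update(&ctx, buf, n);
    SHA1_Final(digest, &ctx);
    fclose(fp);

    for (i = 0; i < SHA_DIGEST_LENGTH; i++)
        sprintf(hex + 2 * i, "%02x", digest[i]);

    puts(strcmp(hex, expected_hex) == 0 ? "digest matches" : "digest MISMATCH");
    return 0;
}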

Re:Huh? (1)

tdelaney (458893) | more than 7 years ago | (#17954816)

Uh - what part of "uniquely configured ... from certified sources" confuses you?

Re:Huh? (1)

mattp (68791) | more than 7 years ago | (#17952412)

Some level of source code review is required in all FIPS levels for all validations.

Re:Huh? (1)

Panaflex (13191) | more than 7 years ago | (#17958314)

That's true - but it's mostly to prove that the finite state diagram properly describes the module operation and that the key initialization, known answer tests and zeroization are properly implemented. It's not a complete source code review. Lastly, only the certifying lab (not NIST) checks the code.

Re:Huh? (2, Insightful)

Panaflex (13191) | more than 7 years ago | (#17955712)

FIPS 140-2 (and more so 140-1) was designed for certifying what are commonly known as "black-box" hardware encryption modules. Some given assumptions in the security model include:
1. Physical security of the boxes
2. Prevention of attacks
3. Disclosure of usage, known-good protocols & keys (a security policy)
4. Testing of (P)RNGs (see the sketch below)
5. Known-answer tests of FIPS-approved algorithms (AES, DES, etc...)

The movement towards software-only modules has brought a whole series of issues to a head - meaning that some whole sections of the FIPS validation aren't even applicable.

Also - there are (at most) 4 levels for certain security aspects, which validate - or at least disclose - the robustness of the implementation being tested.

Now you can see why it was not originally planned to have any validation of the security of the codebase (at least for overall level 1). OpenSSL pushed the envelope of what certification means FAR beyond its original intention.
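
For a flavor of item 4, FIPS 140-2 required a continuous RNG test: each block drawn from the generator is compared with the previous block, and a repeat is treated as a failure. The following is a hypothetical sketch; only RAND_bytes() is real OpenSSL API, the rest is made up for illustration.

#include <openssl/rand.h>
#include <string.h>

#define BLOCK_LEN 16

/* Sketch of a FIPS 140-2 style continuous RNG test. */
static unsigned char last_block[BLOCK_LEN];
static int have_last = 0;

int get_random_block(unsigned char out[BLOCK_LEN])
{
    if (RAND_bytes(out, BLOCK_LEN) != 1)
        return 0;                      /* generator error */

    if (have_last && memcmp(out, last_block, BLOCK_LEN) == 0)
        return 0;                      /* continuous test failed */

    memcpy(last_block, out, BLOCK_LEN);
    have_last = 1;
    return 1;
}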

m$^8 (1, Interesting)

Anonymous Coward | more than 7 years ago | (#17950190)

Was this also sponsored by microsoft or was it some other biggie this time?

Oh wait, there are no other hostile biggies.

Misconceptions in the write-up (5, Informative)

Cerebus (10185) | more than 7 years ago | (#17950204)

1. FIPS 140 validations taking a long time is not unusual.

2. OpenSSL was validated as *source*. All other FIPS 140 validations are of *object code* or devices. This is the first cryptomodule to be validated in source form, which contributed to the time taken to validate.

3. The OpenSSL original cert was suspended because there was a small bit of crypto code that resided outside the security boundary. Confusion between sponsor, lab, and NIST contributed to the suspension. See #2.

4. Claims of vendor FUD are overblown. NSS, another Open Source cryptomodule, already has FIPS 140-1 certification (for version 3.6; 3.11 will be entering FIPS 140-2 eval soon).

Re:Misconceptions in the write-up (1)

Nevyn (5505) | more than 7 years ago | (#17982938)

2. OpenSSL was validated as *source*. All other FIPS 140 validations are of *object code* or devices. This is the first cryptomodule to be validated in source form, which contributed to the time taken to validate.

This is very misleading. The OpenSSL code was submitted as source, but the lab still evaluated it as a binary blob (after compiling/installing it using the instructions provided). The lab did not evaluate the source any more than they did for NSS or the MS crypto libs, etc.

who were behind the complaints .. (4, Informative)

rs232 (849320) | more than 7 years ago | (#17950236)

"We called it the FUD campaign," he says. "There were all kinds of complaints sent to the CMVP including one about 'Commie code.'"

'While OSSI was not able to review each complaint the CMVP received, the ones they did see often contained redacted, or blacked-out, data about who had filed the complaint. Some documents, however, did reveal the complainant information, and Weathersby says that is how the OSSI became aware that, in some cases, proprietary software vendors [linux.com] were lodging the complaints'

Gov't Contracting Cesspool (1)

mpapet (761907) | more than 7 years ago | (#17950638)

This is an excellent example of how large and deep the cesspool is in government contracting.

The competitors intentionally draw out the certification process for the newcomer, to literally exhaust them and drive the competition away. And this is just one relatively small (albeit critical) library/suite of applications.

For any of you entrepreneurial developers thinking you're onto the next great thing that governments will buy, please consider this story carefully. A long career at the top of an agency you wish to pursue (nepotism) and a massive bankroll are material requirements to get any money out of gov't contracting.

Re:Gov't Contracting Cesspool (1)

mattp (68791) | more than 7 years ago | (#17952592)

I can assure you that the certification process was not created by the companies that are having their products validated. I think that validating OpenSSL presented a unique problem, as the writers of the standard did not foresee a source-only module.

Re:Right (1)

mpapet (761907) | more than 7 years ago | (#17954262)

My point was that the deep-pocketed competitors abuse the process to get their desired result: no new competition.

Proprietary implementations not really valid? (1)

Anonymous Coward | more than 7 years ago | (#17950700)

If OpenSSL is validated in both binary and source form while proprietary implementations of SSL are only validated as binaries, one could reasonably conclude that proprietary versions are not really fully validated.

Certainly, once the validity of visible source code is established, it should be relatively easy to continue demonstrating the validity of that code. Meanwhile, in the case of proprietary versions, it is possible to make source changes that alter the behavior of the binary product in ways that are difficult to predict or identify.

Thus it would seem that OpenSSL's "validity" is something to which we can ascribe higher confidence than that of proprietary products.

If this FUD campaign was indeed intended to erode confidence in OpenSSL, it seems a matter of debate whether the effort has backfired.

There was no conspiracy (1)

fugspit (632645) | more than 7 years ago | (#17951396)

Disclaimer: I work for Red Hat, so I am very familiar with competitors' efforts to spread FUD about open source software, but I don't believe any nefarious activities were at work here. The NSS Project [mozilla.org] is an open source SSL toolkit that received FIPS 140 certification in 2002. Five years ago! The open source Crypto++ project was certified in 2003.

So if (as the sensationalist headline proclaims) "the idea of an open source SSL toolkit didn't sit right with proprietary vendors of similar products," they've had 5 years to get over it.

Re:There was no conspiracy (1)

Anonymous Coward | more than 7 years ago | (#17952308)

So a complaint about "commie code" being present in the source isn't FUD in your opinion? You're a lot more tolerant of that crap than I would be.

It was not a cheap process either... (4, Informative)

wherrera (235520) | more than 7 years ago | (#17951908)

Re:It was not a cheap process either... (1)

FishWithAHammer (957772) | more than 7 years ago | (#17959758)

So...in other words, it was cheap. How much do you think everyone else paid?

OpenSSL generates one-time hashes (1)

zuhaifi (1060950) | more than 7 years ago | (#17953580)

Operating systems like to implement their own message digest command (if they provide one at all). If your system is missing a digest command, you can use the openssl utility to generate one-time hashes. OpenSSL supports the SHA1, MD5 and RIPEMD160 algorithms, and accepts one or more files as arguments.
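
For illustration, a minimal C sketch using OpenSSL's one-shot SHA1() helper; running the command-line utility as "openssl dgst -sha1 somefile" produces the same kind of digest. The message here is just an example string.

#include <openssl/sha.h>
#include <stdio.h>
#include <string.h>

/* Minimal sketch: hash a message with SHA1() and print it as hex,
 * much like the openssl dgst utility would. */
int main(void)
{
    const char *msg = "hello world";
    unsigned char digest[SHA_DIGEST_LENGTH];
    int i;

    SHA1((const unsigned char *)msg, strlen(msg), digest);

    for (i = 0; i < SHA_DIGEST_LENGTH; i++)
        printf("%02x", digest[i]);
    printf("\n");
    return 0;
}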