Academics Should Not Remain Silent On Government Hacking

GrievousMistake They didn't! (135 comments)

What a non-story. The flaws in Dual_EC_DRBG were widely publicized shortly after its release.

The possibility of a backdoor was first demonstrated by Dan Shumow and Niels Ferguson in August 2007.

Bruce Schneier wrote the same year:

My recommendation, if you're in need of a random-number generator, is not to use Dual_EC_DRBG under any circumstances. If you have to use something in SP 800-90, use CTR_DRBG or Hash_DRBG.

This was common knowledge if you had more than a passing interest in cryptography. I think TFA is mistaken when it says that it didn't get enough attention. The reason academics didn't take it more seriously is that it was seen as so obvious, it was mostly harmless shenanigans.

You would only use it in a serious cryptographic product if you were an incompetent crackhead, or if the NSA had stuffed your ass full of money.

Incidentally, RSA, the large security firm, shipped it in a serious cryptographic product for years and years.

about a year ago

IETF To Change TLS Implementation In Applications

GrievousMistake Re:End of certificates, please? (80 comments)

The trouble with Convergence, I think, is the reliance on online notaries, which become highly centralized single points of failure.

They don't, really. The great thing about notaries as opposed to CAs is that you can use as many of them as you want, and the client decides how to handle discrepancies and outages. So a browser could ship preconfigured with 8 independent notaries, and alert the user if more than four of them were down, or if any single one of them disagreed with the rest.

In the same way, CAs can still act as authoritative notaries for domains they have signed. But now if they misbehave they can be instantly delisted, and users will fall back on the standard Convergence protection.
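The nice thing is that the whole client-side policy is just a quorum check. Here's a toy sketch of the idea (the notary names, fingerprints and thresholds are all made up for illustration; this is not the actual Convergence protocol):

```python
from collections import Counter

def evaluate_notaries(responses, min_up=4):
    """Apply a simple Convergence-style client policy.

    responses: dict mapping notary name -> certificate fingerprint,
               or None if that notary was unreachable.
    Returns (accept, warnings).
    """
    warnings = []
    up = {name: fp for name, fp in responses.items() if fp is not None}
    if len(up) < min_up:
        warnings.append("too many notaries are down")
    counts = Counter(up.values())
    if len(counts) > 1:
        # Any disagreement is suspicious: flag the dissenting notaries.
        majority_fp, _ = counts.most_common(1)[0]
        dissenters = sorted(n for n, fp in up.items() if fp != majority_fp)
        warnings.append("notaries disagree: " + ", ".join(dissenters))
    return (not warnings, warnings)

# Hypothetical poll of 8 notaries: one reports a different certificate.
responses = {f"notary{i}": "ab:cd:ef" for i in range(7)}
responses["notary7"] = "de:ad:be"
print(evaluate_notaries(responses))
```

The point is that the thresholds live in the client, so each user (or browser vendor) picks their own trade-off between availability and paranoia.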

about a year ago

Security Breach Forces Bitcoin Bank Inputs.io To Halt Operations

GrievousMistake Re:Tired of bashing Bitcoin, yet? (285 comments)

I disagree. The "proof of work" busywork is wasteful and makes it hard to prove any real security. The Bitcoin protocol scales poorly and consumes disproportionate resources.
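For readers who haven't looked under the hood: the "busywork" is literally grinding through nonces until a hash clears a difficulty target. A toy sketch (SHA-256 as in Bitcoin, but the block contents and difficulty are made up, and real mining double-hashes and does much more besides):

```python
import hashlib

def mine(block_data: bytes, difficulty_bits: int = 16) -> int:
    """Find a nonce such that SHA-256(block_data || nonce) has at least
    `difficulty_bits` leading zero bits. Pure trial and error: the only
    way to win more often is to burn more hashing power."""
    target = 1 << (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce
        nonce += 1

# ~2**16 hash attempts on average at this toy difficulty; the real network
# burns through incomparably more per block, which is the wastefulness at issue.
print(mine(b"toy block"))
```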

I am sure it is possible to do both the ledger and the currency distribution more elegantly than Bitcoin does.

For instance, an IOU system like Ripple could facilitate a Hawala-like transaction network without the meaningless arms race caused by allocating new coins proportionally to hashing power.

Or zero-knowledge protocols could be used to vastly enhance the anonymity of transactions.

Bitcoin is an interesting proof of concept, but "as elegant as a decentralised digital transactions system could be" is overselling it by far.

about a year ago

LTSI Linux Kernel 3.4 Released

GrievousMistake Re:AF_BUS -- a[n] implementation of the D-BUS" (61 comments)

Hadn't heard about AF_BUS before...
I found the rationale, and a summary of the argument against.

I get that doing multicast in userspace isn't optimal, but I'm a bit mystified what people are doing with D-Bus that would require any kind of performance. Wasn't D-Bus supposed to be a simple pub-sub system for notification of events and the like?

about 2 years ago

Denial-of-Service Attack Found In Btrfs File-System

GrievousMistake Re:Requires local access (210 comments)

this will be easily stopped by adding a filename prefix or suffix

No it won't. It is still easy to make collisions with a known prefix or suffix; you would have to include a random component.
Even if that were a feasible workaround, it's hardly a common best practice, nor should it be.
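The fix that actually works is a keyed hash: without the per-filesystem secret, an attacker can't precompute colliding names at all. A sketch of that idea using BLAKE2's built-in keying (the key handling and bucket count here are made up; Btrfs's real name hash is an unkeyed CRC variant, which is exactly the problem):

```python
import hashlib
import os

# Hypothetical per-filesystem secret, generated once at mkfs time.
FS_KEY = os.urandom(16)

def name_hash(filename: bytes, buckets: int = 1 << 32) -> int:
    """Keyed directory-name hash: collisions can't be precomputed
    without knowing FS_KEY."""
    digest = hashlib.blake2b(filename, key=FS_KEY, digest_size=8).digest()
    return int.from_bytes(digest, "big") % buckets

# Why prefixes don't rescue an unkeyed linear hash like a CRC:
# if crc(a) == crc(b) for equal-length a and b, then
# crc(prefix + a) == crc(prefix + b) for any shared prefix.
print(name_hash(b"some_file.txt"))
```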

There goes this script kiddie's

He discovered this vulnerability himself, and wrote the attack code; he is by definition not a script kiddie. Never mind that he's a professor and published cryptographer.

while about experimental software not being perfect.

This has nothing to do with being experimental software. This is not a bug, it is a weakness in the design. Furthermore, the bad behaviour will not manifest by accident - you have to deliberately provoke it.
This is the type of problem that isn't fixed before someone finds and reports it -- like Junod did.

Please cease your inane babbling.

more than 2 years ago

Randomly Generated Math Article Accepted By 'Open-Access' Journal

GrievousMistake Re:Brilliant references! (197 comments)

Also be sure to check out the brilliant paper recently published by Hakin9 in their issue on Nmap.

The authors detail the working of their DARPA Inference Cheking Kludge Scanner (DICKS), and cite such prominent references as
Z. Sun, "Towards the synthesis of vacuum tubes," Journal of Concurrent, Extensible Technology, vol. 84, pp. 1-19, Feb. 2005.
C. Hoare, J. Wilkinson, and D. Ritchie, "Contrasting Scheme and Internet QoS using SluicyMash," Journal of Flexible, Omniscient Epistemologies, vol. 20, pp. 154-194, Feb. 2000.

Some excerpts:

"Obviously, event-driven modalities and web browsers are based entirely on the assumption that extreme programming and digital-to-analog converters are not in conflict with the deployment of massive multiplayer online role-playing games."

"We show our method's real-time evaluation in Figure 1. We consider a framework consisting of n flip-flop gates. Such a claim might seem counter intuitive but is derived from known results. Next, NMAP does not require such a theoretical emulation to run correctly, but it doesn't hurt. This seems to hold in most cases. We use our previously enabled results as a basis for all of these assumptions. This seems to hold in most cases."

"Figure 1.3: The 10th-percentile latency of NMAP, as a function of popularity of IPv7"

more than 2 years ago

Google's SPDY Could Be Incorporated Into Next-Gen HTTP

GrievousMistake Re:Waiting for ad.doubleclick.net ...zzz... (275 comments)

Some web browsers just render the page assuming that included scripts won't call document.write(), and then render the page again when the scripts have loaded, in case they do.
I think Chrome does this, and Opera has it as an experimental option in opera:config ("Delayed script execution").
It speeds things up a lot, especially if you aren't blocking ads. Many sites spend most of their loading time just waiting for ad servers.

There ought to be an attribute or something that webmasters could use to explicitly request XHTML semantics, so browsers could skip the speculative re-render entirely.

about 3 years ago

Google Ports Box2D Demo To Dart

GrievousMistake Re:Some Discrepancies with Your Bitching (194 comments)

Tying NaCl to a specific architecture was a very bad move in the first place, and PNaCl doesn't help a lot.
LLVM bitcode isn't intended to be a platform-independent transport of code - it isn't frozen, so you'll have to tie yourself to a specific LLVM version, while LLVM is still improving a lot with each release.
Neither is it very portable - it isn't endian independent, and it reflects details of the ABI, which means you can't even portably call C functions. It's really just a compiler IR.
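The endianness point is easy to demonstrate with any binary format. Python's struct module shows the difference between baking in the host's byte order and fixing one explicitly (this illustrates the general portability problem, not LLVM's actual encoding):

```python
import struct

value = 0x11223344
# "<" and ">" pin the byte order, so the result is identical on every host.
print(struct.pack("<I", value).hex())  # prints "44332211"
print(struct.pack(">I", value).hex())  # prints "11223344"
# "=" uses the host's native order: the same code emits different bytes
# on a big-endian machine, which is what kills portability.
print(struct.pack("=I", value).hex())
```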

See also e.g. this post.

I can certainly see reasons that you'd want to tie a VM to the browser instead of being stuck with ECMAScript for every situation, but you need to bring a real, portable VM to the table. LLVM isn't it, and the idea of putting architecture-dependent binaries on the web is patently ridiculous, as should be obvious just from the time NaCl spent as x86-only. Imagine if web site owners had to recompile their site for every new architecture that became supported. "This site is best viewed on an x86"

about 3 years ago

SSL Certificate Authorities vs. Convergence, Perspectives

GrievousMistake Re:So why do I trust the notaries? (127 comments)

*Ideally* In the CA relationship, you would at least have assurance that the site being validated worked explicitly with a trustworthy CA. In the reputation system, the site being validated didn't work with anyone and has no way to authoritatively 'tell' someone they got compromised.

A CA could be one such authentication step. Consider a network of independent notaries to which the CAs could securely push public certificates and tie them to a domain name.
Now you have to compromise the CA (or a sufficient number of the notaries, some perhaps run by the CAs themselves), and you have to perform the MITM upstream, not downstream, so the perspectives-like notaries will still see a consistent view.

more than 3 years ago

SSL Certificate Authorities vs. Convergence, Perspectives

GrievousMistake Re:So why do I trust the notaries? (127 comments)

-DNSSEC secured results enumerating the CAs the site selected to secure the domain. If DigiNotar signs yourdomain.com and your DNSSEC says 'Thawte', then there is an issue.
-Multiple CAs signing a certificate. If you have 3 or so CAs (all listed in your DNSSEC record of course), then compromising all three would be required to compromise your security.

What does this gain you over storing the cert signature itself in DNSSEC?

Since the people attesting to the authenticity of a certificate have zero 'special' interaction, it remains feasible to fool them.

Nothing prevents a notary from taking extra steps to verify the authenticity of a certificate. That is one of the advantages of the concept: other methods of authentication can be added in a modular way.
In some ways the notary system gives you the security of the strongest of the notaries you trust, and the CA system gives you the security of the weakest of the CAs you trust.

more than 3 years ago

SSL Certificate Authorities vs. Convergence, Perspectives

GrievousMistake Re:So why do I trust the notaries? (127 comments)

if someone MITM's very close to you (think the people who own/control the AP you're connecting through at a hotel), they could MITM *all* of the notaries as well

The communication with the notaries is in all likelihood encrypted and signed with predistributed keys, similar to CA certificates today. That's not a large problem, because ultimately you have to trust the software you are running anyway.
That still retains all the benefits over the CA system that you mention; you get multiple points of trust that all have to be compromised, and if one is compromised you can distrust it with minimal consequences.

more than 3 years ago

Samsung Joins Ranks of Android Vendors Licensing Microsoft Patents

GrievousMistake Re:Extortion (186 comments)

We do have some idea.

We know that Microsoft is approaching this in pretty much the most scummy and mafia-like way possible, using strong-arm tactics to make companies sign NDAs to prevent information leaking out that would allow other companies to protect themselves ahead of time.

We know that the patents we have seen, mostly thanks to B&N having some balls and not falling for MS's cheap tricks, are dubious and certainly not worth what Microsoft is demanding, given that you can licence Windows 7 for about the same price.

I personally know that I'll do my best not to give MS a dime of my money, though they sure know how to take their rent from PC and phone manufacturers. In fact, I'm glad this story came up, because I was about to inadvertently buy some of their hardware; come to think of it, I won't. Who wants to subsidize shit like this?

more than 3 years ago

Rage and the Tech Behind id Tech 5

GrievousMistake Re:Also iD Tech 4 blows (172 comments)

To be honest, while id Tech 5 with its heavy focus on textures is an interesting experiment, I'm looking forward to id Tech 6 a lot more. It looks like it will use raycasting on sparse voxel octrees, same as the Unlimited Detail guys. That would by all accounts amount to a generational leap in graphics, doing for geometry pretty much what MegaTexturing does for textures.

John Carmack has been talking about voxels since 2008, but the hardware wasn't up to it back then. Apparently they're doing research on id Tech 6 now.

While the Unlimited Detail guys have made some promising demos using static geometry with static lighting, I believe rendering a more dynamic game world with animation and varying lighting remains an unsolved problem. I can't wait to see what Carmack and his team can come up with.

more than 3 years ago

Another Cell Phone-Cancer Study Emerges

GrievousMistake Re:follow (212 comments)

That would take an hour and a half to heat an adult brain up by 1 Kelvin,

That's an oversimplification: it assumes the entire brain receives exactly the same amount of energy, and it only counts the radio transmissions, while a cell phone can also output heat of its own and reflect your body heat.

These "Platonic thought experiment" rebuttals tend to be simplistic to the point of "assuming spherical cows" and ignore the complex interactions of a real biological system. That's not a valid scientific argument, at best it's a plausibility argument.
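For what it's worth, the quoted ballpark does check out, under exactly those spherical-cow assumptions (values assumed here: ~1.4 kg brain, water-like specific heat of ~3600 J/(kg·K), ~1 W of absorbed RF, and zero heat loss):

```python
mass_kg = 1.4       # adult brain, roughly
c_tissue = 3600.0   # specific heat in J/(kg*K), roughly water-like
power_w = 1.0       # assumed absorbed RF power
delta_t = 1.0       # temperature rise in kelvin

seconds = mass_kg * c_tissue * delta_t / power_w
print(f"{seconds / 60:.0f} minutes")  # prints "84 minutes"
```

Which is the whole point: the number is only as good as the uniform-absorption, no-perfusion assumptions feeding it.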

We didn't think asbestos was a carcinogen but it was. We then thought glass fibers were likely to be carcinogens too, but they weren't.

That's the same amount of 'heating' as 45 minutes of cell phone use, every day.

Any cell phone use comes on top of all that, and some professions could easily spend more than an hour a day on the cell phone. (Though admittedly they'd normally get a hands-free set then.)

It's not enough to come up with some vague correlation if every other verified theory tells us that it just can't happen. A mechanism for the cause has to be proposed (a model), and it has to be shown to be valid rigorously, using double-blind studies and falsifiable experiments.

A correlation can be plenty to work with if you can clearly prove it. If the correlation cannot be explained by other established models, or discarded as coincidence, you can then start searching for an underlying mechanism.

John Snow found a correlation between drinking from a certain well and outbreaks of cholera. The mechanism of infection was not known at the time, but even so the correlation was clear and undeniable.

The problem here is rather that they have tested for a correlation and there isn't one. Empirically testing a hypothesis you don't like is not pseudo-science.

more than 3 years ago

Another Cell Phone-Cancer Study Emerges

GrievousMistake Re:follow (212 comments)

Personally I doubt that cell phones have any notable effect on cancer rates, but dismissing it simply because the radiation is non-ionizing would be too hasty.

There are plenty of documented carcinogens besides ionizing radiation; irritants, burns, bacteria and various chemicals can all increase your risk of cancer.

The researchers are looking at cell phone use as a whole here. There are a couple of other mechanisms that could plausibly have carcinogenic effects, though it is unlikely.
There's a list of potential issues at Wikipedia.
E.g., holding a cell phone close to your head while talking will cause slight but measurable heating of the brain.

Dismissing out of hand that any of these effects could cause cancer just because you think you understand the physics of radiation interacting with physical matter would be folly, comparable to dismissing asbestos as a carcinogen because you understand the effects of throwing rocks at a person.

Now, at this point there have been extensive studies on the matter, and I feel reasonably convinced that if there is indeed an effect, it is very slight. That was IMO the most likely result from the beginning, but considering the massive scale of worldwide mobile use, even a small probability of health issues is well worth researching.

more than 3 years ago

Public AAC Listening Test @ ~96 Kbps [July 2011].

GrievousMistake Re:And the point of this is? (277 comments)

The Hydrogen Audio Forums tests have traditionally used a sound methodology; it would probably be worth reading up on it before you comment, lest you make a fool of yourself.

They will not be trying to measure how 'good' each codec sounds; they are trying to measure how close it is to the source material, with a 'perfect score' being statistically indistinguishable.
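"Statistically indistinguishable" is not hand-waving, either. In an ABX-style blind trial the null hypothesis is coin-flipping, so the arithmetic is a one-sided binomial (trial counts here are made up, and the actual HA methodology is ABC/HR with more machinery than this):

```python
from math import comb

def p_value(correct: int, trials: int) -> float:
    """Chance of scoring at least this well by pure guessing."""
    return sum(comb(trials, k) for k in range(correct, trials + 1)) / 2 ** trials

# A listener who gets 14 of 16 blind trials right is almost certainly
# hearing a real difference:
print(f"{p_value(14, 16):.4f}")  # prints "0.0021"
```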

more than 3 years ago

Ask Slashdot: Is SHA-512 the Way To Go?

GrievousMistake Re:SHA-1 is fine, but go for SHA-512 (223 comments)

The SHA family is coming to an end; it's just a matter of time.

An end, but also a beginning; the final selection of the hash algorithm that will become SHA-3 is scheduled for 2012.

The current candidates are all faster than SHA-1 on platforms without hardware acceleration, even with the added security. Unless a weakness is discovered after the standardization, SHA-3 should eventually replace SHA-1 in all security critical applications.
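If you want numbers for your own platform, hashlib makes a crude throughput comparison easy (the buffer size is arbitrary, and absolute figures will vary wildly with your CPU and OpenSSL build):

```python
import hashlib
import time

data = b"\x00" * (16 * 1024 * 1024)  # 16 MiB of input

for name in ("md5", "sha1", "sha256", "sha512"):
    start = time.perf_counter()
    hashlib.new(name, data).hexdigest()
    elapsed = time.perf_counter() - start
    print(f"{name:8s} {len(data) / elapsed / 1e6:7.1f} MB/s")
```

On 64-bit hardware SHA-512 commonly beats SHA-256, since it chews through 128-byte blocks with 64-bit operations.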

more than 3 years ago

Mozilla Rejects WebP Image Format, Google Adds It

GrievousMistake Re:Why? (262 comments)

Looks like they're planning to add alpha channels and XMP metadata, as well as a bunch of other more-or-less useful features, like 3d support.

I think the SSIM advantage is adequately documented with the study linked to in TFA, though in the end what it comes down to is visual comparison. The earlier encoder was accused of overoptimising for PSNR, to the detriment of the overall image quality. Hopefully they can get some more heavy-duty psychovisual optimisations applied to both the video and still image encoders for further improvements.

more than 3 years ago

