
Comments


Mozilla's 2013 Report: Revenue Up 1% To $314M; 90% From Google

jmv Re:What do they spend the money on? (147 comments)

Yes, browsers really have become that complicated. It's not just Mozilla: Google is putting even more resources into Chrome than Mozilla can afford. A browser is now essentially an operating system (see FirefoxOS) that can do pretty much everything *and* needs to do it in a way that's secure against untrusted code (JS). On top of that, Mozilla is involved in projects that reach beyond just the web, like the Opus audio codec and the Daala video codec that I'm personally involved in (there are many more, of course).

2 days ago

Where Intel Processors Fail At Math (Again)

jmv To be expected (239 comments)

There's nothing I find particularly alarming here, and the behaviour is pretty much what I would expect when computing sin(x). Sure, maybe the doc needs updating, but nobody would really expect fsin to do much better than it does. In fact, if you wanted to maintain good accuracy even for large arguments (up to the full double-precision range), you would need roughly a 2048-bit subtraction just for the range reduction! As far as I can tell, the accuracy up to pi/2 is pretty good. If you want good accuracy beyond that, you'd better do the range reduction yourself. In general, I would also argue that if you have accuracy issues with fsin, your code is probably broken to begin with.
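To make the "do the range reduction yourself" point concrete, here's a minimal sketch in Python (the function name and the number of digits in the 2*pi constant are my own choices for illustration, not anything from the story). It reduces a large argument against a high-precision 2*pi before calling the double-precision sin. Note that CPython's math.sin delegates to the platform libm, which typically already does accurate reduction, so both lines may agree; the sketch just shows what explicit reduction looks like, e.g. before calling an fsin-style primitive that only reduces with a narrow internal pi.

    import math
    from fractions import Fraction

    # ~48 digits of 2*pi -- an assumed constant for this sketch; inputs near the top of
    # the double range would need hundreds of digits (hence the 2048-bit figure above).
    TWO_PI = Fraction("6.283185307179586476925286766559005768394338798750")

    def sin_reduced(x: float) -> float:
        """Reduce x mod 2*pi exactly against the wide constant, then use double-precision sin."""
        r = Fraction(x) % TWO_PI      # exact rational arithmetic: no precision lost in the reduction
        return math.sin(float(r))     # the reduced argument is small, so double-precision sin is fine

    x = 1.0e15
    print(math.sin(x))                # reduction left to the library
    print(sin_reduced(x))             # reduction done explicitly with the wide constant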

about a month and a half ago

Lennart Poettering: Open Source Community "Quite a Sick Place To Be In"

jmv Re:In the spotlight (993 comments)

There's a difference between attacking a piece of software and attacking its author. I personally have no opinion on systemd (hell, I don't even know what init system I'm running atm), but I feel like any complaints people have should be directed at whoever *chose* systemd rather than at whoever wrote it. You can't blame someone for writing software. If you don't like it, don't use it and/or tell distros not to use it.

about a month and a half ago

NSA Director Says Agency Is Still Trying To Figure Out Cyber Operations

jmv Re:that word (103 comments)

Careful what you wish for. With the current generation, you might end up with iWar instead.

about 2 months ago

Turning the Tables On "Phone Tech Support" Scammers

jmv Windows is updating (210 comments)

I like to keep these scammers on the line for as long as possible, but without wasting my own time. So far, what I've seen work well is "Oh, my computer just crashed, I need to reboot" and "Now Windows is applying updates". That way they'll wait without me having to think of things to tell them. Any other effective tricks?

about 2 months ago

Is There a Creativity Deficit In Science?

jmv Re:Tenure-hunting discourages risk (203 comments)

In my opinion, peer review should be changed to a double-blind system: the reviewers should not see the names or affiliations of the authors, and should judge the work as they would grade an undergrad paper (i.e. harshly). I believe this would increase the signal-to-noise ratio in journals, and only good papers would get published.

Please no! The problem with this approach (and it's already happening) is that what will get published is boring papers that bring tiny improvements over the state of the art. They'll get accepted because the reviewers will find nothing wrong with the paper, not because there's much good in there. On the other hand, the really new and interesting stuff will inevitably be less rigorous and probably more controversial, so it's going to be rejected.

Personally, I'd rather have 5% great papers among 95% crap than 100% papers that are neither great nor crap, just uninteresting. Reviews need to move towards positive ratings (how many things are interesting) and away from negative ratings (how many issues you can find in the paper). But that's not happening any time soon, and it's one of the reasons I've mostly stopped reviewing (too often overruled by the associate editor to be worth my time).

about 3 months ago

Judge: US Search Warrants Apply To Overseas Computers

jmv Cuts both ways (502 comments)

It's going to be interesting when the Chinese government issues Google a warrant to get data from the US.

about 4 months ago

Popular Android Apps Full of Bugs: Researchers Blame Recycling of Code

jmv Re:All software is full of bugs (150 comments)

Software on Internet-connected devices is a bit different from your examples though. No matter how insecure cars are, it would be really hard for me to steal a million cars in one night, let alone without being caught. Yet, it's common to see millions of computers/phones being hacked in a very short period of time. And the risk to the person responsible is much lower.

about 4 months ago

How Did Those STAP Stem Cell Papers Get Accepted In the First Place?

jmv Re:We should expect some wingnuts to say... (109 comments)

It would certainly be nice, but it's not realistic. For a simple paper, it would likely cost a few thousand dollars, but for anything that requires fancy equipment, it could easily run into the millions. The only level where fraud prevention makes sense is the institution (company, lab, university) level.

about 5 months ago

How Did Those STAP Stem Cell Papers Get Accepted In the First Place?

jmv Re:We should expect some wingnuts to say... (109 comments)

So you're saying that reviewers should have to reproduce the authors' results (using their own funds) before accepting a paper, or risk being disciplined? Aside from ending up with zero reviewers, I don't see what this could possibly accomplish. Peer review is designed to catch mistakes, not fraud.

about 5 months ago

How Did Those STAP Stem Cell Papers Get Accepted In the First Place?

jmv Re:Simple: Peer review is badly broken (109 comments)

I think what is missing is that a) more reviewers actually need to be experts and practicing scientists, and b) doing good reviews needs to earn you scientific reputation rewards. At the moment, investing time in reviewing well is a losing game for those doing it.

Well, there's also the fact that one of the most fundamental assumptions you have to make while reviewing is that the authors are acting in good faith. It's really hard to review anything otherwise (we're scientists, not some sort of police).

I agree that good reviews do not need to be binary. You can also "accept if this is fixed", "rewrite as an 'idea' paper", "publish in a different field", "make it a poster", etc. But all that takes time and real understanding.

It goes beyond just that; maybe I should have said "multi-dimensional". In many cases, I want to say "publish this article because the idea is good, despite the implementation being flawed". In other cases, you might want to say "this is technically correct, but boring". In the medical field, it may be useful to publish something pointing out that "maybe chemical X could be harmful and it's worth further investigation" without necessarily buying all of the authors' conclusions.

Personally, I prefer reading flawed papers that come from a genuinely good idea rather than rigorous theoretical papers that are both totally correct and totally useless.

about 5 months ago

How Did Those STAP Stem Cell Papers Get Accepted In the First Place?

jmv Re:Simple: Peer review is badly broken (109 comments)

This is not a new phenomenon; it just seems to be getting worse again. But remember that Shannon had trouble publishing his "Theory of Information" because no reviewer understood it or was willing to invest time in something new.

That's the problem here: should the review system "accept the paper unless it's provably broken" or "reject the paper unless it's provably correct"? The former leads to all the issues with false results in medical journals and climate research, while the latter leads to good research (like the Shannon example) not getting published. This needs to be more than just binary. Personally, I prefer to accept if it looks like it could be a good idea, even if some parts may be broken. Then again, I don't work on controversial stuff, and nobody dies if the algorithm is wrong. I can understand that people in other fields have different opinions, but I guess what we need is non-binary review. Of course, reviewers are also just one part of the equation: my reviews have been overruled by associate editors more often than not.

about 5 months ago

US Marshals Seize Police Stingray Records To Keep Them From the ACLU

jmv Re:Obama's police state? (272 comments)

The entire world rejected the "I was just doing my job" and "I was just taking orders" excuses during the Nuremberg trials.

You should read about the Milgram experiment.

about 6 months ago

In the year since Snowden's revelations ...

jmv Re:Does it really matter? (248 comments)

It's all about cost. It costs resources to break keys or break into machines. If you increase the cost by 10x, then they can break only 1/10 of what they could originally break using the same budget.

about 6 months ago

Study: Royalty Charges Almost On Par With Component Costs For Smartphones

jmv Re:The people that invent things must be compensat (131 comments)

You think progress is slow now? See what happens when companies actively hide how they do things rather than relying on patents to protect their IP.

Yeah, imagine all these iPhone owners with rounded corners they can't even see because Apple had to hide them.

about 6 months ago

PHK: HTTP 2.0 Should Be Scrapped

jmv Re:Encryption (220 comments)

How do you explain to the user that their data might be encrypted, yet not protected, since the connection is not trusted?

I'm talking about HTTP here, not HTTPS. The idea is that even with HTTP -- where you don't pretend that anything is secure -- you still encrypt everything. It's far from perfect, but it beats plaintext because the attacker can't hide anymore: it takes an active attack. I don't pretend to know all the pros and cons of HTTP 2, but plaintext has to die.

about 6 months ago

PHK: HTTP 2.0 Should Be Scrapped

jmv Re:Encryption (220 comments)

Nothing is NSA-proof, therefore we should just scrap TLS and transmit everything in plaintext, right? The whole point here is not to make the system undefeatable, just to increase the cost of breaking it -- just like your door lock isn't perfect, but is still useful. If HTTP were always encrypted, even with no authentication, the NSA would have to man-in-the-middle every single connection to keep up its pervasive monitoring. That would not only make the cost skyrocket, but also make it trivial to detect.

about 6 months ago

PHK: HTTP 2.0 Should Be Scrapped

jmv Re:Encryption (220 comments)

A server cannot ask for encryption.

AFAIK, HTTP 2 allows the server to encrypt even if the client didn't ask for it.

Unless the client establishes a secure connection in the first place, the server has no way of knowing if the client is actually who they claim to be. If the client attempts to establish a secure connection and the server responds with "I can't give you a secure connection" then the client needs to assume there is a man in the middle attack going on and refuse to communicate with the server.

If you're able to modify packets in transit (i.e. mount a man-in-the-middle attack), then you can also just decrypt with the key you negotiated on one side and re-encrypt with the key from the other side. Without authentication, nothing is going to prevent a MitM attack. Despite that, being vulnerable only to MitM is much better than being vulnerable to any sort of passive listening.

about 6 months ago

PHK: HTTP 2.0 Should Be Scrapped

jmv Re:Encryption (220 comments)

Last I heard, it still supports unencrypted connections, but only if both the client and the server ask for it. If either one asks for encryption, then the connection is encrypted, even if there's no authentication (i.e. no certificate). With no certificate, it's still possible to pull off an active (MitM) attack, but that's much harder to do at a large scale without anyone noticing than passive listening (where you can just collect all the data you see).
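As a rough illustration of what "encrypted but not authenticated" means in practice, here's a small Python sketch (the host name and the use of plain TLS via the ssl module are my own choices; HTTP 2 negotiates this differently). The connection is encrypted, so a passive listener only sees ciphertext, but since the certificate is never verified, an active man-in-the-middle could still impersonate the server.

    import socket
    import ssl

    ctx = ssl.create_default_context()
    ctx.check_hostname = False        # don't check that the certificate matches the host name
    ctx.verify_mode = ssl.CERT_NONE   # accept any certificate, even a self-signed one

    # Encrypted but unauthenticated: defeats passive collection, not an active MitM.
    with socket.create_connection(("example.org", 443)) as raw:
        with ctx.wrap_socket(raw, server_hostname="example.org") as tls:
            tls.sendall(b"GET / HTTP/1.1\r\nHost: example.org\r\nConnection: close\r\n\r\n")
            print(tls.recv(200))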

about 6 months ago

Submissions


Opus - the codec to end all codecs

jmv writes  |  more than 2 years ago

jmv writes "It's official: the Opus audio codec is now standardized by the IETF as RFC 6716. Opus is the first state-of-the-art, fully Free and Open audio codec ratified by a major standards organization. Better yet, Opus covers basically the entire audio-coding application space and manages to be as good as or better than existing proprietary codecs across that whole space. Opus is the result of a collaboration between Xiph.Org, Mozilla, Microsoft (yes!), Broadcom, Octasic, and Google. See the Mozilla announcement and the Xiph.Org press release for more details."

Journals

jmv has no journal entries.
