top Cox Comm. Injects Code Into Web Traffic To Announce Email Outage
Actually, that's exactly what I'm going to do now. I was already pissed because my connection has been going down a lot lately. Then they pull this crap. Bye Cox!
about a year and a half ago
top Cox Comm. Injects Code Into Web Traffic To Announce Email Outage
No, not like this. At least I've never seen it before. This is intrusive. I've had it show up in my browser at least 3 times in the past couple of hours and it's about a service I don't even use. I don't care if their e-mail is out. I don't use their e-mail. I don't want this stuff and there ought to be a simple way to opt out.
about a year and a half ago
top Are You Sure SHA-1+Salt Is Enough For Passwords?
This just in: HTTPS is insecure when an attacker has rooted your machine. News at 11.
top Microsoft Makes Chrome Play H.264 Video
The Internet Explorer team said, "WTF?"
top 3D Cinema Doesn't Work and Never Will
Yeah, 3D cinema will never work. Look at Avatar. It was a complete disaster. Nobody went to see it because the 3D experience just didn't work at all.
top New Programming Language Weaves Security Into Code
The compiler enforces the security policies and will not allow the programmer to write insecure code.
Oh really? I'm an expert. I can write insecure code in any language. Guaranteed!
top 10/10/10 — a Nice Day To Celebrate the Meaning of Life
My wife is very pregnant and started having contractions at 11:00pm Friday night. We went to the ER last night because the pain had really gotten unbearable (my wife is quite tough, so if she couldn't take the pain, it must have been excruciating). They sent us back home, but we're scheduled to go to the hospital at 5:00pm tonight where they'll give her prostaglandins to "ripen" the cervix. The plan was originally to induce Monday morning, but given that she's been in labor for about 36 hours now, the prostaglandins will likely be all she needs.
Ever since we determined the due date (10/4/10), I've been hoping for a 10/10/10 birth (almost entirely because in binary it's 42), and I just may get my wish. Not that any of that matters a bit to me right now. The only thing that really matters to me is that my wife and baby are healthy and doing well. Fortunately she's been able to sleep a bit (she wakes for the contractions, but immediately goes back to sleep). But let's face it, it'll be a cool birthdate if she comes out before midnight!
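For anyone who wants to check the 42 bit, the date digits really do work out. A quick sanity check (the variable names are just for illustration):

```python
# The date 10/10/10, with the separators dropped, reads as the
# binary numeral 101010 -- which is 42 in decimal, the Hitchhiker's
# "Answer to Life, the Universe, and Everything".
digits = "101010"
value = int(digits, 2)   # parse the string as a base-2 number
print(value)             # 42
```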
top Scientists Stack Up New Genes For Height
While I have no doubt it's true that a large number of genes contribute to height, it's very likely there are a handful of genes with a significantly larger effect than the rest. It's a simple matter of statistics. If you had 100 genes that all made more or less the same small contribution, there would be exceedingly few people over 6', and the distribution of heights would cluster almost everyone very close to the same height, with only a handful of outliers. You also wouldn't see unusual heights being very heritable (which they are). There must be just a few genes with a much more significant effect than the others.
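For what it's worth, the many-equal-small-genes scenario is easy to simulate. This is a toy sketch: N_LOCI, CM_PER_ALLELE, and BASE_CM are made-up numbers chosen purely to show the shape of the resulting distribution, not real effect sizes.

```python
import random

# Toy additive model of the "100 genes, equal small effects" scenario.
random.seed(1)
N_LOCI = 100
CM_PER_ALLELE = 0.15   # hypothetical contribution of one "tall" allele
BASE_CM = 145.0        # hypothetical baseline height

def simulated_height():
    # Each diploid locus carries 0, 1, or 2 "tall" alleles.
    tall_alleles = sum(random.randint(0, 2) for _ in range(N_LOCI))
    return BASE_CM + CM_PER_ALLELE * tall_alleles

heights = [simulated_height() for _ in range(10_000)]
mean = sum(heights) / len(heights)
spread = max(heights) - min(heights)
print(round(mean, 1), round(spread, 1))
```

With these invented numbers, the full range across 10,000 simulated people comes out to only around ten centimeters: equal small additive effects pile up into a tight bell curve, which is the intuition behind the comment.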
top Should I Learn To Program iOS Or Android Devices?
If you're only going to learn one, go Android. Java is reusable in other environments and frankly, it's just easier.
My personal opinion is that Objective-C is pretty tedious and annoying. The syntax is ugly and non-intuitive. Again, this is my personal opinion. But having done years of C, C++, and C#, I find Objective-C's syntax bizarre and non-obvious. Not that it's particularly complex, but if you know C++, then Java and C# seem pretty obvious, whereas Objective-C's syntax is just very different.
Finally, Java is platform agnostic. Objective-C is only really useful on a few platforms, and you have to buy Apple hardware to build iPhone apps, which to me is plain stupid; I think in the long run it will be one of the things that hurts the iPhone.
Just my own opinions based on my experience with both. I sat down and immediately started writing Android apps using the SDK and emulator with no previous Java experience. Even after several days of playing with existing iPhone apps, I had difficulty even following what was happening in the code, understanding what I was seeing in the watch windows, and figuring out exactly what the various syntactical crap meant.
top Stuxnet Worm Infected Industrial Control Systems
Developers: listen up! NEVER, EVER, EVER, EVER, EVER have a default password in the apps you build. The setup should ask for a password if one is needed, and the app should not install without one! What is so hard about this? It boggles my mind that things as important as routers, database servers, and industrial equipment control software install with default passwords. Why doesn't that raise red flags in developers' minds the second the idea pops into their heads?
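The rule is trivial to enforce at setup time. Here's a hypothetical sketch; KNOWN_DEFAULTS and validate_admin_password are invented names, not from any real product's installer:

```python
# Hypothetical installer check: refuse to complete setup until the
# administrator supplies a real, non-default password.
KNOWN_DEFAULTS = {"", "admin", "password", "root", "1234", "default"}

def validate_admin_password(password: str) -> str:
    """Return the password if acceptable; abort setup otherwise."""
    candidate = password.strip()
    if candidate.lower() in KNOWN_DEFAULTS:
        raise SystemExit("Setup aborted: that is a well-known default password.")
    if len(candidate) < 8:
        raise SystemExit("Setup aborted: password must be at least 8 characters.")
    return candidate
```

The point isn't the specific blocklist or length rule; it's that setup fails closed instead of shipping with "admin/admin".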
top Preventing Networked Gizmo Use During Exams?
What non-native English speaking foreign student doesn't have an English to "insert foreign language here" treeware dictionary? Tell them to use that instead. I got by just fine with one when I was living abroad. Not to mention, they tend to be more accurate than most of the online translation dictionaries I've used.
As for calculators, when I took chemistry, physical chemistry and other classes that used math, we were allowed a calculator. It could be one of the advanced programmable graphing ones, or it could be a basic one. Either would have been fine for those exams and I imagine they'll be fine for yours. Students are generally responsible for providing their own. If you'd like to throw in some cheapo simple ones to supplement that for the students who might not have one (what student taking physics doesn't have at least a basic calculator?), you're certainly welcome to, but I wouldn't expect that from a professor.
Don't go too far out of your way, and don't bend over backwards to accommodate them. You're the professor. You set the rules.
The things I've mentioned above have been pretty standard in universities for at least a few decades. I'm guessing this isn't advanced physics and I'm pretty sure basic physics hasn't changed drastically in the past few decades, so no reason you should have to accommodate the latest and greatest tech.
top Another Gulf Oil Rig Explodes
It still kills fewer Americans than getting oil from other places... like the Middle East.
The fish will be happy to hear that.
top Julian Assange To Write For Swedish Tabloid
Link or it didn't happen.
top The Possibility of Paradox-Free Time Travel
It seems to me that the multiverse itself gets one around the grandfather paradox. Granted, it's as theoretical as time travel itself, but still... Go back in time, poof, a new branch of the universe breaks off. The branch where you went back in time (which, of course, is now spawning an endless number of branches itself). Now everything you do affects the branch you're on and not the branch you left from. Paradox-free time travel.
top Cool, Science-y Masters Programs For Software Devs?
In the past few years, I've become very interested in neuroscience and I've read and studied a great deal about it. Unfortunately, the local universities don't have a neuroscience specialty, so a PhD is out of the question unless I relocate.
Computer science and neuroscience really go hand-in-hand these days. There's a great deal of research being done on modeling everything from single ion channels, to entire cells, to large-scale brain structures.
My personal belief is that software based on neuroscience principles will become an important area of software development for writing intelligent systems. Systems that can effectively recognize voices or faces, or interpret language, are natural targets. Imagine a stock-picking system that reads news stories and factors emotional content into its picks (after all, let's face it, since the internet made stock trading more accessible, emotion plays much more heavily in the market). Systems could be designed to monitor financial transactions and identify novel types of fraud. In astronomy, given the number and quality of images coming in, one could create systems that intelligently sift through the volumes of images to identify and catalog new objects.
Really, it's an area that's wide open to possibilities. But to understand how to properly piece together the kinds of artificial neural circuits needed for this sort of functionality, one would need a fairly good understanding of how the various circuits in a human brain connect and interact, and how they process information (we already understand a tremendous amount about this, and we're learning more all the time). Neuroscience seems to me to be the new computer science. It's where some of the most amazing advances in science are being made today, in my opinion.
But it is just my opinion, and there are lots of other possibilities. I'm definitely enthusiastic about this.
top Knuth Got It Wrong
The article is a little over-zealous in its characterization, though it's careful to note that this is not actually a theoretical novelty. The summary, on the other hand, bastardizes and exaggerates it.
Not to mention what a conceited prick the author is.
One of the main reasons I accepted the Varnish proposal was to show how to write a high-performance server program... Because, not to mince words, the majority of you are doing that wrong.
Oh, please do mince words. Why don't you tell all of us, your sad little children, how we should be doing things, oh master. Clearly you're the shit! Get over yourself already.
top Why Are Digital Hearing Aids So Expensive?
"... So my question is this: when I can get a very good netbook computer for under $400 why do I need to pay $1,200 per ear for a hearing aid? Alternatives would be welcome."
Welcome to the world of for-profit health care. The medical device approval process, litigation, and the fact that your insurance company will presumably pay a big chunk (not a reasonable assumption anymore, though) are largely responsible. Just one of the million reasons we need health care reform.
top Court Rules Against Vaccine-Autism Claims Again
The Lancet didn't retract that ridiculous paper from 1998 until last month, and it pretty much started all this ridiculous BS. It's absolutely unconscionable that they didn't retract it sooner. Ten of the original 13 authors retracted their interpretation back in 2004. That should have been a hint.
The problem with vaccines is that being vaccinated as an individual isn't what makes you safe. It's the vaccination of the herd that protects. That is, for a particular disease, say measles, it's safer to be the only person in a crowd who isn't vaccinated than to be the only person in the crowd who is. Vaccines aren't 100% effective, and what makes them truly effective is having everyone take them.
Back in 2005, a girl in Indiana caught measles on a trip to Romania. She came back and shared that gift with the people in her church, simply by showing up. Roughly 10% of the 500 people present weren't vaccinated, and 32% of those people developed measles. One person who had been vaccinated also got measles, but 94% of the cases were in unvaccinated people.
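Taking the comment's figures at face value, the percentages are internally consistent. A quick check:

```python
# Checking the outbreak arithmetic from the figures above.
attendees = 500
unvaccinated = round(attendees * 0.10)     # 10% of 500 -> 50 people
unvax_cases = round(unvaccinated * 0.32)   # 32% of 50  -> 16 cases
vax_cases = 1                              # the single breakthrough case
total_cases = unvax_cases + vax_cases      # 17 cases in all
unvax_share = unvax_cases / total_cases
print(unvax_cases, total_cases, round(unvax_share * 100))  # 16 17 94
```

So 16 of 17 cases in unvaccinated people, which is the 94% figure quoted.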
The problem these days is that people don't bother to learn history. Anyone who's been to an old cemetery (I live in Arkansas, and we have tons of them) pretty much can't miss the fact that there are tons of kids aged 10 and under buried there. Why? In the early 1800s, infant mortality was about 20%. Think about that. One in five infants (1 year old and younger) died, and a lot more died before the age of 5. Vaccines don't account for all of that, but they account for a lot of it! Before the vaccine, smallpox alone was killing 400,000 Europeans a year.
Personally, I think vaccines ought to be required by law because they're a public safety issue and people who won't do it should go to jail.
top Xerox Sues Google, Yahoo Over Search Patents
Now I'm going to have to sue Xerox for violating my patent for the business process of how to make money from companies using ridiculous patent infringement lawsuits.
top "Immortal Molecule" Evolves — How Close To Synthetic Life?
For those current in the field, this discovery is not surprising.
Especially since those current in the field probably read about it 13 months ago when the paper was published.