Liberal Saudi Web Forum Founder Sentenced To 600 Lashes and 7 Years In Prison
Yet another prime example of why alien civilizations won't contact us openly: How can a truly civilized race possibly take us as anything other than animals when we still do things like this? Our so-called "civilization" is just as thin a patina over the animal underneath as our neo-cortex is over the rest of our brains. It's positively heartbreaking to read of things like this in this day and age, when I know that the human race, at its best, stands in such stark contrast with such senseless ignorance and brutality.
I don't disagree with your overall premise, but what says that an alien civilization with interplanetary travel technology has to be a truly civilized race?
Microsoft Is Sitting On Six Million Unsold Surface Tablets
It's only mandatory on ARM devices that wish to be Windows Logo certified.
MS Won't Release Study Disputing Munich's Linux-Switch Savings
Maybe the year of Linux on the desktop is coming after all. Slowly, but eventually.
It's been here for a while. It's called Android.
The Weight of an e-Book
Sorry, but this isn't significant. And to be honest, it sounds like it should be lost in the noise. Flash memory is flash memory. The cells can swell based on many environmental factors (air pressure changes, humidity, temperature, etc.), and TFA clearly mentions heat as a possible factor. The fact that a downloaded piece of data registered any change at all could just mean the cells warmed as the gates were being used to store the data. Who knows. A billionth of a billionth of a gram for 4GB of data just sounds too tiny to be remotely significant, let alone noteworthy outside of an extremely controlled environment.
I'd like to see more data on the experiment itself, to see if the measurements were all taken in a very controlled environment or not. TFA is really lacking any details that would intrigue people who cared.
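For scale, the "billionth of a billionth of a gram" figure can be sanity-checked with E = mc². This is just a back-of-envelope sketch: the ~1e-15 J per bit figure below is an assumed round number for the energy of electrons trapped in a floating gate, not anything measured or claimed in TFA.

```python
# Back-of-envelope: mass equivalent of the energy stored in flash cells
# when data is written, via E = m * c^2.
# ENERGY_PER_BIT is an assumption for illustration, not a measured value.
C = 2.998e8              # speed of light, m/s
ENERGY_PER_BIT = 1e-15   # joules per stored bit (assumed round number)

bits = 4 * 8 * 10**9                  # 4 GB of data, in bits
total_energy = bits * ENERGY_PER_BIT  # joules
mass_g = total_energy / C**2 * 1000   # kg -> grams

print(f"{mass_g:.1e} g")  # on the order of 1e-19 to 1e-18 grams
```

Under these assumptions you get a few times 10^-19 grams, which is consistent with the "too tiny to measure outside a lab" intuition above.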
EC2 Outage Shows How Much the Net Relies On Amazon
The issue today though isn't in-house vs. colocated, it's cost. Most of these companies don't have the cash to build proper infrastructure to house their services locally. The cloud services from various companies, like Amazon, take care of the physical maintenance and cooling and power, etc.
Even if your local datacenter housed mission-critical data, I'm sure it's possible to come up with 100 scenarios where you could lose all connectivity to your locally-housed infrastructure (power company accidentally digs up your comm lines, etc.).
The cloud isn't perfect, but neither is in-house hosting. It depends on how much money you want to spend for the control. Even with the control, you can't plan for the worst and still remain cost-effective. This is just a crappy situation that is amplified given how many people rely on the services.
Intel Kills Consumer Larrabee Plans
Don't forget about the Nvidia Ion platforms. They also use a "just-enough" CPU in Intel's Atom, with a higher-end Nvidia GPU to run nicely integrated HD set-top boxes. Nice little platforms for MythTV frontends.
Intel Kills Consumer Larrabee Plans
I don't play games on my laptop, but I do run compiz-fusion with many of the features enabled. It's very eye-candy-heavy, and my integrated Intel graphics chip keeps up just fine. My CPUs don't bear much load at all. I don't think things are as grossly out of proportion as you make them out to be. 5 years ago, yes. Today, not so much.
Novell Rises to Second Highest Linux Contributor
Another big factor is that companies like Novell are contracted by other companies to do their kernel development for them. AMD, for example, pays Novell to do its kernel work. This isn't an uncommon practice, since Red Hat also gets money from other companies to do their development work in the kernel. But when it comes down to it, the actual "originator" of the code or concept may not be Novell or Red Hat; they're just the email address that ends up on the Signed-off-by: lines when the patch gets merged, which isn't a big deal.
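For anyone unfamiliar with the convention: the Signed-off-by: line mentioned above is just a trailer appended to the commit message, which is what `git commit -s` does. A minimal sketch (the name, email, and commit message here are made up for illustration):

```python
# Sketch: demonstrate that `git commit -s` appends a Signed-off-by: trailer
# carrying the committer's configured name and email. The identity and the
# commit message below are hypothetical, purely for illustration.
import os
import subprocess
import tempfile

repo = tempfile.mkdtemp()

def git(*args):
    """Run a git command inside the scratch repo and return its stdout."""
    return subprocess.run(["git", *args], cwd=repo, check=True,
                          capture_output=True, text=True).stdout

git("init", "-q")
git("config", "user.name", "Jane Hacker")
git("config", "user.email", "jane@novell.example")
with open(os.path.join(repo, "f.txt"), "w") as f:
    f.write("fix\n")
git("add", "f.txt")
git("commit", "-q", "-s", "-m", "net: fix refcount leak in example driver")

# The full commit message now ends with the sign-off trailer:
#   Signed-off-by: Jane Hacker <jane@novell.example>
print(git("log", "-1", "--format=%B"))
```

So whoever submits the patch upstream, not necessarily whoever paid for or conceived the work, is the name that shows up in the contribution statistics.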
I don't see this as anything evil or underhanded, being a network stack hacker myself. The kernel maintainers and core contributors are far from stupid and gullible, and will *not* accept anything if they see proprietary undertones. I'm also sure they're putting a bit more scrutiny into reviewing patches from Novell just because. But the bottom line is more people are working on the kernel, trying to make it better, which is the end goal. It really, in my mind, doesn't matter who is doing it, just as long as it's getting done and done well.