Evangelical Scientists Debate Creation Story
This is definitely one of the most common misconceptions I hear about evolution. How did a monkey evolve into a human? A *single* monkey did not evolve into anything; it was a monkey. Has any human recently evolved into a star-child? Individuals of a species do not evolve. Evolution is a function of a population over many, many generations: the natural selection of traits that promote survival in the current habitat/climate/whatever. More successful traits become more prominent, especially when they lead an individual to
1) Live longer
2) Have more offspring
The offspring have these traits, and the cycle continues.
Talking about the "evolution of the individual" is like talking about how to make a chain with one link.
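To see why this is a population-level process, here is a toy simulation (every number in it is made up for illustration; `CARRIER_OFFSPRING` and the other names are my own, not from any biology source). No individual ever changes; only the frequency of a trait in the population shifts from one generation to the next.

```python
import random

# Toy population model (all parameters are illustrative): individuals
# either carry a helpful trait or not. Carriers average slightly more
# surviving offspring, and the trait frequency drifts upward over
# generations -- no single individual ever "evolves".

POP_SIZE = 1000
GENERATIONS = 50
CARRIER_OFFSPRING = 2.2   # assumed mean offspring for trait carriers
PLAIN_OFFSPRING = 2.0     # assumed mean offspring for everyone else

def next_generation(carrier_count):
    # Share of the next generation descended from carriers,
    # weighted by how many offspring each group produces.
    carrier_weight = carrier_count * CARRIER_OFFSPRING
    plain_weight = (POP_SIZE - carrier_count) * PLAIN_OFFSPRING
    p_carrier = carrier_weight / (carrier_weight + plain_weight)
    # Sample a fixed-size next generation.
    return sum(random.random() < p_carrier for _ in range(POP_SIZE))

carriers = POP_SIZE // 10   # trait starts in 10% of the population
for _ in range(GENERATIONS):
    carriers = next_generation(carriers)
print(f"trait frequency after {GENERATIONS} generations: {carriers / POP_SIZE:.0%}")
```

Run it a few times: the starting 10% usually climbs well past 50%, even though the per-individual advantage is modest.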
Phase Change Memory Points To Future of Storage
As someone who has done research in memory wear leveling, I can assure you that these technologies have a place. There are significant design trade-offs to consider for any application: power, area, speed/latency, and the maximum number of write-erase cycles all come into play. One of the head researchers in emerging memory technologies at Penn State has an interesting presentation on the roles of these memory technologies (yes, I realize it is hosted at Oregon State and he is from PSU, oh well...): http://web.engr.oregonstate.edu/~sllu/xie.pdf
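For anyone unfamiliar with wear leveling, here is a bare-bones sketch of the core idea (my own simplification, not any shipping controller's algorithm): keep a logical-to-physical block map and steer every write to the least-worn free block, so the finite write-erase budget is spent evenly instead of burning out whichever block a hot address happens to sit on.

```python
# Bare-bones wear-leveling sketch (illustrative only): a logical
# address is remapped on every write to the least-worn free physical
# block, spreading write-erase cycles across the whole device.

NUM_BLOCKS = 8

erase_counts = [0] * NUM_BLOCKS   # wear counter per physical block
logical_to_physical = {}          # logical block id -> physical block
free_blocks = set(range(NUM_BLOCKS))

def write_block(logical_id):
    # Release the physical block that previously backed this address.
    old = logical_to_physical.get(logical_id)
    if old is not None:
        free_blocks.add(old)
    # Redirect the write to the least-worn free block.
    target = min(free_blocks, key=lambda b: erase_counts[b])
    free_blocks.remove(target)
    erase_counts[target] += 1
    logical_to_physical[logical_id] = target

# Hammer a single logical address 100 times...
for _ in range(100):
    write_block(0)
# ...and the wear still spreads almost evenly across all 8 blocks.
print(erase_counts)
```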
Gamers Beat Algorithms At Finding Protein Structures
Astronomers have already come up with something. It's called Galaxy Zoo.
As an added bonus, you get to look at some neat deep space photography.
Oscilloscopes For Modern Engineers?
I will say this: LabVIEW is great for a quick-and-dirty setup or a small application. If you need to do anything more complicated, you will find the entire development environment incredibly lacking and highly tedious, and there is no meaningful literature on application design in LabVIEW (90% of LabVIEW books are "hurf-a-durf, you connect one box to another and it does things, think outside the C++ box, man").
As someone who writes VHDL, Verilog, C++, and MATLAB on a daily basis, I understand both control flow and data flow programming; LabVIEW is some perverted amalgamation of the two. It lures you in under the guise that you will not need to learn any GUI programming, then screws you over in anything beyond the basics. For example, in a data flow paradigm, pointers have no meaning, as all data is passed by value. Yet nearly every complex data type in LabVIEW is handled with pointers and "magical pointer functions," which make life hell because they do not fit the paradigm. So then they add "classes," which is their way of saying they somehow trump C++ and Java; upon reading the fine print, one discovers that the class system is similarly FUBAR'd. Then there's the issue of inserting something into existing code. In a text-based language: hit enter and begin typing. In LabVIEW: delete a shit-ton of wires, drag and drop portions of the diagram, put in new brightly colored squares, connect even more wires, and make everything look readable again (see: drag 'n' drop ad nauseam).
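To make the by-value point concrete in plain text (Python standing in for the graphical diagrams here; both function names are mine, not LabVIEW's):

```python
# Illustrative contrast (my own example, not LabVIEW code): a pure
# dataflow node consumes values and produces new values, while
# pointer/reference semantics mutate shared state behind the
# caller's back -- which is exactly what breaks the paradigm.

def dataflow_scale(samples, gain):
    # By-value: the input is never modified; a new list flows out.
    return [s * gain for s in samples]

def pointer_style_scale(samples, gain):
    # By-reference: the caller's list is mutated in place.
    for i in range(len(samples)):
        samples[i] *= gain

raw = [1.0, 2.0, 3.0]
scaled = dataflow_scale(raw, 2.0)   # raw is untouched
pointer_style_scale(raw, 2.0)       # raw itself now holds the result
print(raw, scaled)                  # [2.0, 4.0, 6.0] [2.0, 4.0, 6.0]
```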
But if you need a quick-and-dirty state machine to control something, and you don't mind a polling architecture, I can implement that in about a day...
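Something like this, conceptually (states, inputs, and the poll period below are all hypothetical stand-ins; the real version would read actual instrument I/O):

```python
import time

# Minimal polling state machine sketch (everything here is a
# placeholder): poll an input on a fixed period and look up the
# next state in a transition table.

def read_sensor():
    # Stub standing in for whatever input the real rig polls.
    return time.time() % 10 > 5

transitions = {
    ("IDLE", True): "RUNNING",     # input asserted -> start
    ("RUNNING", False): "IDLE",    # input dropped  -> stop
}

state = "IDLE"
for _ in range(20):                # would be `while True` in practice
    reading = read_sensor()
    state = transitions.get((state, reading), state)
    print(state)
    time.sleep(0.1)                # fixed poll period
```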
Microsoft Sues Salesforce.com Over Patents
Yes, all of these things are interesting and innovative ideas, but in and of themselves, they are not products that should be patentable. Take the following examples:
An English teacher marks up a term paper with a red pen. Is this infringing on Microsoft's red squiggle? I'm fairly certain that red pens have been in use far longer.
I arrange all the tools on my desk in a line, pencil/pen cup, phone, coffee mug, parts directory, etc. Is this infringing on the "toolbar" or "ribbon"?
In conclusion, an idea is not necessarily a product.
How Not To Pay a Parking Ticket
Your packing crate incident wouldn't have anything to do with this, would it?
Analyst, 15, Creates Storm After Trashing Twitter
I feel that it is important to report market information that I have assembled.
Based on a survey of the people I'm living with, Ubuntu has a 25% market share of the laptop market.
None of my friends own an iPhone, so I assure you that it is a dead market space; MMOs fall into the same category.
On average, there is only one car for every six people with driver's licenses.
Wii has 100% of the market share.
All teenage girls love anime and The Lion King.
In terms of popularity, 4 out of 5 of my roommates wanted a joint memorial for Billy Mays and Michael Jackson.
Everyone I know hates MySpace. I mean everyone. It's a really stupid Facebook. The only people who use it are retarded. Surveys report that people are more willing to use Twitter than MySpace, which is quite shocking considering previous reports.
All of these reports are held to the highest standards of statistical accuracy and truthfulness, with the statistical rigour usual to all of my work.
Can Bill Gates Prevent the Next Katrina?
No, Mr. Jobs, I expect you to die...
Scientists Hack Cellphone To Detect Diseases
A friend of mine just launched a company to develop such a device. He's well on his way. The current team is assembling at Penn State.
Game Distribution and the 'Idiocy' of DRM
"The only successful DRM comes from hardware makers (read: Apple) who balance the power to govern sales without extortion prices and without runaway piracy, because their interests are aligned with both consumers and intellectual property content producers."
Are you telling me that Apple's prices are reasonable? Their products cost 50% to 200% more than the competition's. They definitely are extorting people with DRM. The majority of people who use Apple products don't even know they are being controlled by DRM (think: iPod). All they know is that if they try to use another brand of hardware, it just won't work.
For those who are not as tech-savvy, not working = broken (and they will never use it again).
Also, do you think that Apple's DRM has helped cut music piracy? I assume /. readers are smart enough to figure this one out.