As reported from the same news site the following day:
Here you go:
There are vastly more combinations of words than there are words in a dictionary. This method is plenty safe.
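To put rough numbers on that claim, here is a back-of-envelope comparison of the search space for a single dictionary word versus a multi-word passphrase. The dictionary size and phrase length are illustrative assumptions, not figures from the comment.

```python
import math

# Illustrative assumptions: a 50,000-word dictionary and a
# 4-word passphrase. Each extra word multiplies the search
# space by the dictionary size.
DICTIONARY_WORDS = 50_000
PHRASE_LENGTH = 4

single_word_guesses = DICTIONARY_WORDS
phrase_guesses = DICTIONARY_WORDS ** PHRASE_LENGTH

print(f"single word search space: {single_word_guesses:,}")
print(f"{PHRASE_LENGTH}-word phrase search space: {phrase_guesses:,}")
print(f"entropy: {math.log2(phrase_guesses):.1f} bits")
```

Under these assumptions the phrase search space is over 10^14 times larger than the single-word one, which is the commenter's point.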
I'm not sure you understand how this camera works. It wouldn't be possible to create an SLR because the image captured by the photosensor needs to be post-processed. This is because there is an array of lenses in front of the photosensor, so the light passing through each lens hits the photosensor at different points. Therefore, you MUST post-process the image; otherwise the result would be like looking through the eye of a fly. The only way to increase the resolution is by increasing the number of lenses in the array and increasing the resolution of the photosensor.
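The post-processing being described is essentially "shift and add" refocusing. As a minimal sketch, assume the raw sensor data has already been sliced into a small grid of sub-aperture views, one per lens position (the grid size, the `refocus` function, and the `alpha` parameter are all invented here for illustration):

```python
import numpy as np

def refocus(views, alpha):
    """views: dict mapping (u, v) lens offsets -> 2-D image arrays.
    alpha: focus parameter; 0 leaves the views unshifted."""
    stack = []
    for (u, v), img in views.items():
        # Shift each sub-aperture view in proportion to its lens
        # offset from the array centre, then average them all.
        shifted = np.roll(img, (round(alpha * u), round(alpha * v)),
                          axis=(0, 1))
        stack.append(shifted)
    return np.mean(stack, axis=0)

# Toy example: a 5x5 grid of views of a 32x32 scene, where each
# lens sees the scene shifted by its own offset (parallax).
rng = np.random.default_rng(0)
scene = rng.random((32, 32))
views = {(u, v): np.roll(scene, (u, v), axis=(0, 1))
         for u in range(-2, 3) for v in range(-2, 3)}

sharp = refocus(views, alpha=-1.0)   # undoes the parallax shifts
blurry = refocus(views, alpha=0.0)   # naive average: "fly's eye" blur
```

Averaging the views without the shifts washes out detail, which is why the raw capture is unusable without this step.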
Technology like this may seem simple and not very useful, but first-generation products always are. Things take time to develop, and often take imaginations other than the creator's to build something truly amazing.
It's a single photosensor. The lens array and the maths are doing the hard work. Therefore, although the processing may be very data-intensive, the actual image should be the same, or very close to the same, as an image taken without the lens array. The maths should be implementable fully in hardware, such that all processing can be done on-camera at video speeds, so there is no reason this couldn't be done. The issue would be maintaining a coherent focal point between frames. Having to focus a film frame-by-frame would take a lot of time; it's something only film studios might be willing to do, and it would be too annoying for consumers.
> I am a gravitational theorist.
I don't even understand what that title means, but it sounds very cool.
He attracts a lot of ideas...
... was given to the Apache Community. Does that seem like action by Oracle to "stomp Apache and its open source Java efforts clean out of existence"? If anything it makes the Apache Community stronger. Java is definitely one of Oracle's most important acquisitions from Sun, which is why they are currently in court against Google. Programming mistakes happen all the time. Granted, some optimization flags were enabled that shouldn't have been, but that doesn't make Oracle intentionally malicious in this case.
Accidents happen, get over it.
I'm tired of these flamebait articles. What has happened to factual news reporting?
What is preventing a student from giving their card to a friend? Someone else could also take a test for a student if the teacher relied exclusively on an RFID scanner.
Really? There isn't even an array of memristors in the world or a model of how the brain works, and you can claim this? Get back to your cartoons.
When you create a model you generally ignore the details of a system and focus on higher-level operation. You worry about what things do, not what the components are made of. What I was referring to was the actual hardware in our brains. Synapses are functionally almost the same as memristors. Since the synapses in our brains connect together in an array-like fashion, the brain is like an array of memristors.
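The functional similarity being claimed can be sketched with the linear-drift memristor model (the HP-style description: resistance depends on an internal state that drifts with the charge that has passed through, loosely analogous to a synapse whose strength changes with use). The parameter values below are made up for illustration:

```python
import numpy as np

# Toy linear-drift memristor: R_ON/R_OFF are the limiting
# resistances, D the device length, MU the dopant mobility.
# All values are illustrative, not real device parameters.
R_ON, R_OFF, D, MU = 100.0, 16_000.0, 1e-8, 1e-14

def simulate(voltage, dt=1e-4):
    w = D / 2                    # internal state (doped-region width)
    resistances = []
    for v in voltage:
        r = R_ON * (w / D) + R_OFF * (1 - w / D)
        i = v / r
        # State drifts with the current through the device,
        # clamped to the physical bounds [0, D].
        w = float(np.clip(w + MU * (R_ON / D) * i * dt, 0.0, D))
        resistances.append(r)
    return np.array(resistances)

t = np.linspace(0, 1, 10_000)
r = simulate(np.sin(2 * np.pi * t))  # one sine-wave cycle
```

Driving current one way lowers the resistance and driving it the other way raises it again, and the device remembers its state between pulses; that history-dependence is the synapse-like property.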
This is a really big deal. Since our brains work in much the same way as an array of memristors, this brings the possibility of an artificial brain (and perhaps artificial intelligence) much closer to reality.
Maybe I will live to see Data in my lifetime.
Sure you can. Microsoft could detect the client and serve compatible content (like only the top 100 friend tags) to the Xbox, and serve full content to the 360. Seems like it would be a fairly simple programming change.
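A minimal sketch of the kind of change being described, assuming the server can identify the client; the client names and the 100-item cap come from the comment, and everything else here is invented for illustration:

```python
# Hypothetical per-client content caps: the older console gets a
# trimmed friends list, every other client gets the full list.
FRIEND_CAP = {"xbox": 100}

def friends_payload(client, friends):
    """Return the friends list, capped for clients that can't
    handle the full content."""
    cap = FRIEND_CAP.get(client, len(friends))
    return friends[:cap]

everyone = [f"friend{i}" for i in range(250)]
trimmed = friends_payload("xbox", everyone)
full = friends_payload("xbox360", everyone)
```

A dozen lines of dispatch logic, which is the commenter's point about it being a fairly simple programming change.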
Watch the documentary 'Earthlings' and then tell me how you feel about this issue.
Google hasn't invented anything with their OS. It is basically a thin client that uses the internet instead of an intranet.
The biggest danger here is the potential for competitor lock-out. But as we've seen with Microsoft, there will just be lawsuits that open the platform up for fair competition (at least, theoretically).
Wooden shoe rather be Dutch?
In seeking the unattainable, simplicity only gets in the way. -- Epigrams in Programming, ACM SIGPLAN Sept. 1982