Adobe Demos Photo Unblurring At MAX 2011
It doesn't sound like a much harder problem than the deconvolution I learned about in EE undergrad: divide the Fourier transform of the blurred image by the Fourier transform of the "motion kernel," as they call it, to get the sharpened image. I routinely use a similar method in the lab to correct for visual aberrations in my diffraction-spot imaging equipment, but there the problem is much easier because the motion function is exactly traced out by the diffraction spots.
Perhaps getting the "motion kernel" is harder than I suspect in a real-life scenario, though.
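As a toy illustration of that Fourier-division idea (1-D, kernel known exactly, all numbers made up for illustration), assuming NumPy:

```python
import numpy as np

# Toy 1-D version of the Fourier-division idea: blur a sharp signal with
# a known "motion kernel", then undo the blur by dividing in the Fourier
# domain.
n = 64
signal = np.zeros(n)
signal[20], signal[35] = 1.0, 0.5        # two sharp features

kernel = np.zeros(n)
kernel[:5] = 1.0 / 5.0                   # uniform 5-sample "motion blur"

blurred = np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(kernel)))

# Regularized inverse filter: eps keeps near-zero frequencies of the
# kernel from blowing up. In a real photo, sensor noise at those
# frequencies (and not knowing the kernel at all) is what makes the
# problem genuinely hard.
K = np.fft.fft(kernel)
eps = 1e-8
recovered = np.real(np.fft.ifft(np.fft.fft(blurred) * np.conj(K) / (np.abs(K) ** 2 + eps)))
```

With a clean signal and an exactly known kernel the recovery is essentially perfect, which is why the demo looks easy; estimating the kernel from a real shaky photo is presumably where Adobe did the actual work.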
Six-Drive SATA III SSD Round-Up Shows Big Gains
I bought one of Intel's 3rd generation 80GB SSDs back in January and have had zero problems with it. No, it's not as fast as OCZ's drives, but it's reliable. Intel's failure rate is 0.6% while OCZ's is 3% (not sure if that's a per-year figure or something). Why an average user would buy primary storage with a 3% failure rate is beyond me.
(failure rate figure comes from http://www.anandtech.com/show/4202/the-intel-ssd-510-review/3 )
Apple Plans New Spaceship-like Campus
Looks a lot like a synchrotron to me.
Ars Looks At In-Flight Internet — State of the Art vs. Things To Come
I haven't considered using in-flight internet; if anything, a flight is a welcome respite from being connected. My boss usually pays for it, but he loves to micromanage, so that's understandable.
I would pay for live TV though. I missed the Super Bowl this year because I was on a flight :(.
80% Improvement In Solar Cell Efficiency
With this approach at the laboratory scale, Xu and colleagues were able to obtain a light-to-power conversion efficiency of 3.2 percent compared to 1.8 percent efficiency of conventional planar structure of the same materials.
So the efficiencies went from awful to slightly less awful.
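For anyone wondering how the headline figure follows from those numbers, the relative gain works out like this (a quick sanity check, nothing more):

```python
# Where the headline's "80%" comes from, using the two efficiency
# figures quoted above.
new_eff, old_eff = 3.2, 1.8                # percent, lab cell vs. planar control
gain = (new_eff - old_eff) / old_eff       # relative improvement
# gain comes out just under 0.78, which the headline rounds up to 80%
```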
Reform the PhD System or Close It Down
There's certainly a declining signal-to-noise ratio, but I am constantly looking up obscure papers from 20-30 years ago. My entire field (perhaps hundreds of researchers now) arose in no small part because of a few obscure papers from the 1970s that gained relevance only recently. We (as in my lab alone) now have millions of dollars in industry funding, and the PIs are currently hiring for a start-up based on the research.
NESBot: Tool Assisted Speedrun On Real Hardware
Not really. Far more work went into designing the setup so that it worked properly on a real NES rather than an emulator, which, as TFA explains, was not an easy task because of the way lag is handled, among other things. Your suggestion would just add a constant delay to all the moves to account for the mechanical motion required.
On a related note, some of my friends thought IBM's Watson wasn't very impressive because it was receiving text input and parsing it instead of "listening" to the answer being spoken, translating it into text, and then coming up with the question. Given that my cell phone can translate speech into text, I have a feeling IBM didn't think that feature was important to the demonstration...
Why the Arduino Won and Why It's Here To Stay
(a listing can be found at http://arduino.cc/en/Main/Hardware )
I have tried to use Arduino boards in the past, and while they're really cool for hobbyist stuff, they are very hard to integrate into battery-operated things:
1. The operating voltage is 5 V (some boards may be 3.3 V, I forget), and the boards draw a lot of current. Batteries that supply that kind of voltage are HUGE. It would be really nice if there were a design optimized for low voltage and low current, like for mobile sensing, so that I could use coin cells.
2. The devices are really memory-limited. The Uno, which is probably the most popular board, has something like 2 kB of RAM. I used the board to interface with some sensors for tracking a flight trajectory on-board, and I could only record a few seconds of data before running out of room. Wireless transmission wasn't really an option because of power (= more batteries) limitations.
3. Connecting to USB resets the board, wiping the data in memory, unless you cut a trace on the board. This is supposed to help facilitate loading new programs, but it becomes an annoyance if you want to use the board to transfer sensor data stored on-board to a computer. Once you cut the trace to disable the auto-reset, it becomes difficult to time the reset button manually so that your program uploads.
Overall, as an EE, I was very impressed at how easy it was to use, but I think the issues I mentioned warrant some fixing if Arduino is going to be used for things like sensing.
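To put point 2 in perspective, here's the kind of back-of-envelope budget I mean; every figure below except the 2 kB is my own assumption for illustration, not an Arduino spec:

```python
# Back-of-envelope: how long can an Uno-class board (2 kB SRAM) log
# flight data before RAM runs out?
ram_bytes = 2048
reserved = 512                    # assumed headroom for stack + variables
bytes_per_sample = 6 * 2          # assumed 6 sensor channels, 2 bytes each
rate_hz = 100                     # assumed sampling rate

seconds = (ram_bytes - reserved) / bytes_per_sample / rate_hz
print(f"~{seconds:.2f} s of logging")
```

Under those assumptions you get on the order of a second of data, which matches my experience of running out of room almost immediately.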
Sputnik Moment Or No, Science Fairs Are Lagging
I have judged my city's (> 500,000 people, in the South) science fair for the last several years. It has been about 10 years since I graduated from high school, and back then I participated every year in my county's (~ 1 million people, not culturally Southern) science fair. I vividly remember kids with amazing projects that were worthy of MS-level theses. One year, for example, someone found a new group-theory result (with oversight by a college professor). Many others did medical studies, built detailed demonstrations of traffic-pattern simulations, and so on.
Fast forward to me judging the high school science fair here, and I'm appalled at the best these kids can muster. Most couldn't even design a simple experiment. For example, one girl was measuring the conductivity of a solution while varying the temperature, but her "data" consisted of her saying that the conductivity went down as the temperature went down. There was no actual data. The projects I judged "best" at least had some kind of quantitative data, used proper controls, and showed some understanding of the implications of the work. Nothing blew me away, and I had to wonder where the mentor involvement was, because it seemed like these kids did everything on their own.
Graphene Won't Replace Silicon In CPUs, Says IBM
Advanced silicon-germanium transistors can hit 500 GHz. You're correct that you need cryogenic temperatures for such operation, because crystal-lattice vibrations slow electrons down at room temperature (a fundamentally different effect from what limits CPUs when you overclock them). IBM's 100 GHz transistor ran at room temperature, if I recall, which is probably the astounding part. The secret sauce there is simply very weak interactions between electrons and phonons (the physics term for lattice vibrations).
Hello, Android Third Edition
You will need a newish (Core 2 or better, probably) computer to even run the emulator. Download the SDK and Eclipse, then set yourself up to test on your phone and save yourself a lot of grief. http://developer.android.com/guide/developing/device.html
New Tech Promises Cheap Gene Sequencing In Minutes
Nanopore sequencing has been around for at least a decade in the lab. They admit that their method of using tunnel junctions to detect the DNA cannot even distinguish between different base pairs.
For background, here's the basic idea of a classical nanopore sequencer:
1. Make a solution with ions in it, with a very thin membrane separating two compartments, each containing an electrode. The membrane has a very tiny hole (the nanopore).
2. Apply a voltage across the electrodes. This attracts or repels the salt ions, so you get a detectable current passing through the nanopore.
3. Put DNA in the solution. The hole is hopefully small enough that the DNA can only pass through single-file, like thread through a needle. As the different base pairs move through, they block varying amounts of the hole, which manifests as small changes in resistance across it.
The only real limiter is how thin you can make the membrane. Recently, some researchers used graphene, which is thinner than your average base pair, so the measured resistance is not the convolution of many base pairs blocking the pore at any given time. For more, google "Dekker DNA translocation through graphene nanopores" to see that they can already detect single pairs - and do it thousands of times a second.
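The idea in steps 1-3 can be sketched as a toy simulation; the blockade fractions and noise level here are invented for illustration, not measured values:

```python
import numpy as np

# Toy model of a classical nanopore readout: each base blocks the pore
# by a different (made-up) fraction, so the current steps between
# base-specific levels as the strand passes through.
rng = np.random.default_rng(42)

open_pore = 100.0                                        # arbitrary current units
blockade = {"A": 0.30, "C": 0.45, "G": 0.25, "T": 0.50}  # assumed fractions

strand = "GATTACA"
ideal = np.array([open_pore * (1 - blockade[b]) for b in strand])
measured = ideal + rng.normal(0, 0.5, size=ideal.size)   # small readout noise

# Decode each dwell by the nearest ideal level; this only works when the
# membrane is thin enough that one base dominates the blockade at a time.
levels = {b: open_pore * (1 - f) for b, f in blockade.items()}
decoded = "".join(min(levels, key=lambda b: abs(levels[b] - m)) for m in measured)
```

With several bases in the pore at once the measured level would be a smeared mixture of these steps, which is exactly why the membrane thickness is the limiter.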
Graphene Nobel Prize Committee Criticized For Inaccuracies
I guess since IAAP (Physicist), I can try to translate some of the physics-ese. Here is the basic argument of the letter:
1. One of the reasons Geim got the Nobel was that he "discovered" graphene. However, the paper the committee cites to establish the date he discovered it (2004) in fact makes no reference to graphene but rather to graphite, its well-known cousin. This is an important distinction because a few other groups published graphene papers around the same time.
2. Geim uses a method for creating graphene that is not commercially viable, yet he has been credited with a revolution in electronics technology.
3. One of Geim's collaborators goes almost completely uncited although his data is used in the document and appears credited to Geim.
The Android Invasion Cometh; Is Resistance Futile?
When you take a flight all the televisions in airports run Windows.
I agree with what you said for the most part, but I just wanted to point out that I think Linux is used behind the scenes too. For example, on a long flight back to the U.S. with a major European carrier, the entertainment system crashed and I saw the Linux penguin pop up on the screen. I agree, though: especially in the States, most of the displays you see on walls in buildings are Windows.
Nobel Prize in Physics For Discovery of Graphene
Geim's original paper on the subject ( http://arxiv.org/ftp/cond-mat/papers/0410/0410550.pdf ) was fascinating because it was so simple and yet enabled many people to do real research. The paper uses scotch tape to peel off monolayers of different bulk materials, but only graphene showed anything interesting (in particular, the so-called "field effect", which is the principle behind CMOS transistors). To be sure, the quality of graphene produced by this method is complete crap compared to the more advanced methods groups use today (chemical vapor deposition of various organic molecules, carbon gettering from metals, epitaxial growth by silicon sublimation from SiC), but an impressive number of exotic physical phenomena (e.g., the quantum Hall effect) were seen in what was essentially crap.
No doubt, Geim has probably indirectly gotten thousands of researchers perhaps a billion dollars in funding in less than a decade, but I don't think Geim's contribution was as much physics as it was successfully marketing his research (outsiders like to think of science as purely meritocratic, but scientists are still people, and people are susceptible to hype). In my opinion, there are many better physics researchers in the field than Geim himself, but none of them are nearly as good at communication and generating buzz.
In any case, congratulations to him for winning it so soon.
Microsoft Losing Big To Apple On Campus
... considering Apple does not even offer MacBooks with Core i3s. You must get a Core i5 or i7 and pay out the wazoo (to the tune of $1700+) for it, or else you're stuck with a Core 2 Duo, as far as I know. Then again, UVa is not a tech school, so I don't expect many of their incoming class to know or care. Meanwhile, my $500 laptop from a local store four years ago still runs AutoCAD 2010 just fine with a $40 RAM upgrade.
The Possibility of Paradox-Free Time Travel
I have neither the capacity nor the will to vet the paper, but it should be noted that arXiv is not peer reviewed. Experimentalists use it as a place to publish pre-prints and will typically only put papers up after they have been accepted, but theorists use the medium as a substitute for publishing, and so many wacky and untrue claims end up there.
SpaceX's Falcon 9 Appears As UFO In Australia
There's no work to check. His GMT time is correct, and I'm guessing he knows his own country's time zone offsets from GMT.
SpaceX's Falcon 9 Appears As UFO In Australia
Presumably "EST" refers to Australian east coast time and not American EDT given it was 2:45pm EDT when the launch occurred.
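For anyone who wants to check the arithmetic themselves, here it is spelled out with fixed offsets (EDT = UTC-4, Australian east coast = UTC+10, as I assumed above; DST subtleties are ignored, and the date is taken from the launch coverage):

```python
from datetime import datetime, timedelta, timezone

# Fixed-offset zones; real code would use a proper tz database instead.
edt = timezone(timedelta(hours=-4), "EDT")
aest = timezone(timedelta(hours=10), "AEST")

launch = datetime(2010, 6, 4, 14, 45, tzinfo=edt)   # 2:45 pm EDT
in_gmt = launch.astimezone(timezone.utc)            # 18:45 GMT, same day
in_aus = launch.astimezone(aest)                    # early next morning in Australia
```

So a 2:45 pm EDT launch lands in the pre-dawn hours of the following day on Australia's east coast, consistent with a bright spiral being visible there.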
Study Claims Cellphones Implicated In Bee Loss
(I was the GP)
More likely than cell phones, definitely. Pollution causes less incident light to be polarized, making it harder for bees to use polarization for navigation. See: http://en.wikipedia.org/wiki/Polarized_light_pollution