The paper from the actual researchers is far more guarded, and suggests that the signal may be EMI similar to Perytons: terrestrial radio bursts that mimic an astrophysical pulse signature.
From Wikipedia - "In 2015, Perytons were found to be the result of premature opening of microwave oven doors at the Parkes Observatory. The microwave oven releases a frequency-swept radio pulse that mimics an FRB as the magnetron turns off."
Here is a paper on Perytons, and their possible sources: http://arxiv.org/pdf/1404.5080...
Here is a link on pulsar physics, including a very basic back-of-the-envelope derivation of the dispersion measure of pulsars. Apparently the same pulse arrives a few milliseconds apart at two different observing frequencies; the delay stems from the pulse's interaction with free electrons in interstellar space (the electron dominates the effect over the proton because of its much smaller mass). Still trying to get a handle on this. http://www.cv.nrao.edu/course/...
Dispersion measure variations and their effect on precision pulsar timing: http://www.parkes.atnf.csiro.a...
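The dispersion delay the course notes derive reduces to a simple ν⁻² law: lower frequencies arrive later. A minimal sketch of the standard cold-plasma delay formula (the function name is mine; the constant is the usual ~4.149 ms for DM in pc cm⁻³ and frequency in GHz):

```python
K_DM_MS = 4.149  # dispersion constant: ms per (pc cm^-3) at 1 GHz

def dispersion_delay_ms(dm, f_low_ghz, f_high_ghz):
    """Arrival-time lag (ms) of a pulse at f_low relative to f_high,
    for a line of sight with dispersion measure `dm` (pc cm^-3).
    The nu^-2 scaling comes from free electrons in the interstellar
    medium slowing the wave's group velocity."""
    return K_DM_MS * dm * (f_low_ghz ** -2 - f_high_ghz ** -2)
```

For a typical pulsar DM of ~50 pc cm⁻³ observed at 1.3 vs. 1.5 GHz this gives a lag of about 30 ms, the millisecond-scale offset described above.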
I was on the design team for the MiSeq DNA sequencer at Illumina, which can sequence 1 billion bases in one day, doing embedded systems/FPGA/control loop work. I no longer work there, but I believe they've since increased throughput. This particular unit fits on a tabletop and costs about $100K.
A story was related to me while working there about an outbreak in an intensive care unit in Cambridge, England, where seven premature infants got sick. With this instrument, the hospital could see how the virus mutated on a room-by-room and day-by-day basis. It was apparently unprecedented. They had one of our instruments on an early trial basis to give feedback on its usage, and the pathology department was pretty excited. This seems like a very useful kind of instrument for tracking the spread of diseases. I'd be curious about the adoption rates for such instruments in pathology labs, the CDC, etc. I understand that Illumina has made a push to have their instruments certified as medical devices, but I don't know the status of that effort. I'd like our labs to have all the tools they need to rapidly converge on the infectious agent.
One important consideration for portable DNA sequencers is the read error rate of the DNA fragments (akin to the bit error rate in a length of magnetic tape). The higher the error rate, the more reads you have to take to drive the probability of a miscall down to an acceptable level (this is what the Q scores quantify). So even if an instrument is cheaper per run, you may have to read many more samples to get the same error statistics. Any portable instrument must keep the error rate low enough that a small sample size is still meaningful. Also, the longer the read length of an individual strand, the better.
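To make the Q-score point concrete: Phred quality Q maps to a per-base error probability p = 10^(−Q/10), and a toy majority-vote model shows how extra coverage buys back accuracy. This is a deliberately simplified sketch (it assumes independent errors that all agree on the same wrong base, which real instruments don't obey):

```python
from math import comb

def phred_to_perr(q):
    """Per-base error probability implied by a Phred quality score Q."""
    return 10 ** (-q / 10)

def consensus_error(p, n):
    """Probability that a simple majority vote over n independent reads
    calls the wrong base, given per-read error probability p.
    Toy model: errors are independent and always the same wrong base."""
    need = n // 2 + 1  # wrong calls needed to win the vote
    return sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
               for k in range(need, n + 1))
```

With p = 0.01 (Q20), one read miscalls 1% of the time, but a 5-read majority is wrong only about once in 10^5 calls; a noisier instrument needs far more reads to reach the same point.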
DNA sequencing is sort of like taking a photograph, cutting it up into thousands of pieces, and reassembling it. The bigger the chunks, the more distinctive they are, and the easier they are to fit into the larger puzzle; pieces that are too small, like bits of sky, aren't distinctive enough to place in the larger picture. I still don't think we've been able to completely sequence a human being, because the "sequencing-by-synthesis" method used by Illumina only uses relatively short strands of ~100 base pairs (more if you do "paired-end" sequencing, which pushes it past 250, though my knowledge is a few years old). Some small percentage can't be placed because it's not distinctive enough, and the DNA itself does not break apart uniformly: some areas are overrepresented and others underrepresented.
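The puzzle analogy maps directly onto overlap-based assembly: reads are joined where the end of one matches the start of another, and short or repetitive overlaps are exactly the "bits of sky" that can't be placed. A toy merge of two reads (names mine; real assemblers are vastly more sophisticated and must also handle read errors and repeats):

```python
def merge_reads(a, b, min_overlap=3):
    """Greedily merge read b onto read a at their longest
    suffix/prefix overlap of at least `min_overlap` bases.
    Returns None when no sufficiently distinctive overlap exists --
    the short-read ambiguity described above."""
    for k in range(min(len(a), len(b)), min_overlap - 1, -1):
        if a.endswith(b[:k]):
            return a + b[k:]
    return None
```

Longer reads let you demand a larger `min_overlap`, which makes a chance (false) overlap exponentially less likely; that's the sense in which bigger puzzle pieces are more distinctive.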
I understand that chicken-wire has an extremely high radar cross section, as it's a regularly spaced array, and I wonder how hard it is to see behind such a screen. The attenuation of course varies with the spatial dimensions. A fun bit of calculation would be to find what size(s) of chicken-wire you need to block such instruments given their frequency ranges (assuming the ISM bands?). http://en.wikipedia.org/wiki/R...
I first read about how strong a return you get from chicken-wire in Stimson's book "Introduction to Airborne Radar".
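A rough version of that calculation: a conductive mesh reflects roughly like a solid sheet when its openings are much smaller than the wavelength; λ/10 is a common rule of thumb (an engineering heuristic I'm assuming here, not a hard cutoff). A quick sketch for the ISM bands:

```python
C = 299_792_458.0  # speed of light, m/s

def max_mesh_opening_mm(freq_hz, fraction=10):
    """Rule-of-thumb largest mesh opening (mm) that still reflects
    strongly at freq_hz: a small fraction (default 1/10) of the
    wavelength. Heuristic only -- real attenuation also depends on
    wire gauge, polarization, and geometry."""
    wavelength_m = C / freq_hz
    return wavelength_m / fraction * 1000.0

# For the common ISM bands:
# 2.45 GHz -> ~12 mm openings; 5.8 GHz -> ~5 mm; 24 GHz -> ~1.2 mm
```

Standard chicken-wire has openings of roughly 25 mm, so by this heuristic it screens well only below ~1.2 GHz and leaks progressively at the higher ISM bands.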
I like the Windows 7 interface, as well as the XP interface. My big problem, however, is backwards compatibility. I think we should be able to run programs from 50 years ago. I find it a real shame that it's often hard to get old programs/dev-tools/games/etc. to work on a newer operating system. Sure, they have their reasons, but other operating systems have managed to handle this (especially ones that give you source you can recompile on a newer machine). Even using "XP mode" I can run some old dev tools, but I can't run any 3D graphics, because my NVIDIA graphics card only has drivers for Windows 7 (on a 3-year-old card).
I'm friends with an FAE at a good embedded compiler company who was pretty frustrated trying to get a compiler that worked fine under Windows 7 to work under Windows 8. It took a long time for their developers to make the transition. I'm not sure what in the development process is making the transition so hard. I have developed for Windows professionally, but not in some time, and I'd love to hear a developer's perspective. I currently work in the embedded Linux/FPGA world.
Does the whole
Using www.findchips.com (a great site to check parts pricing and availability across multiple distributors), in small quantities the 2 Mbit part is ~$5. But still, your argument is valid. For space-borne applications, where reliability is everything, I'd still like to know about its rad-hard status. These parts come in 8-pin packages, and could likely scale if the vendors wanted them to. Who's to say we won't see orders-of-magnitude larger parts in the future?
I personally am excited about memristor technology, which can potentially eliminate both the RAM and the hard disk, with 90 ns access times and 1/100th the power consumption of flash. Perhaps this will blow everything else out of the water.
We'll see if HP labs can pull it off.
I've never been a big fan of flash memory, given that it has a finite number of write cycles before a memory bit fails (quoted anywhere from 1 to 100 million write cycles, depending on the part). The probability may be low that an individual bit needs to flip that many times in its lifetime, but it's still an issue. A lot of care must be taken by the firmware engineer to handle this; there are a lot of job postings for firmware engineers who understand flash.
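The firmware-level care largely amounts to budgeting writes against the part's endurance. A back-of-the-envelope lifetime estimate, assuming wear leveling spreads writes evenly (all parameters illustrative; real parts also suffer write amplification, which this ignores):

```python
def flash_lifetime_years(capacity_bytes, erase_cycles,
                         bytes_written_per_day,
                         wear_leveling_efficiency=1.0):
    """Rough device lifetime: total write budget (capacity x endurance,
    derated by how evenly wear leveling spreads writes) divided by the
    daily write load. Ignores write amplification and block granularity."""
    budget = capacity_bytes * erase_cycles * wear_leveling_efficiency
    return budget / bytes_written_per_day / 365.0
```

For example, appending a 1 MB/day log into a single 4 KB sector (no wear leveling at all) kills a 100,000-cycle part in just over a year, while spreading the same writes across a 1 GB device extends that by orders of magnitude. This is why a data logger that hammers one flash sector can die surprisingly fast.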
I'm a huge fan of FRAM. Its quoted lifecycle limit is 10 trillion write cycles (some sources call it effectively infinite), and it's a lot more reliable. Its biggest issue is that the memory density is lower. For a spacecraft, I'd much rather have a board of these 2 Mbit FRAMs than one large flash chip. They use these things in smart meters, etc. In embedded systems you have to be really careful not to write to flash too often, at the risk of damaging it; most fast SD cards have their own dedicated microcontroller (ARM9, etc.) to do what they can to extend the life of the flash.
A datasheet of an FRAM device: http://www.fujitsu.com/downloa...
One question I have is how FRAM compares to NAND flash in a harsh radiation environment, and how the radiation environment on Mars differs from Earth's. How many vendors offer rad-hard processes for FRAM, and how do those parts perform?
Here is one link I could find on FRAM, but the report from 2011 is not clear:
I don't care for comic book recaps. Give us characters that are believable, that tell us something compelling about the human condition; something that makes us think. Like "A Face in the Crowd", "The Night of the Hunter", "High Noon", "The Bridge on the River Kwai", "Dog Day Afternoon", "Who's Afraid of Virginia Woolf?", "Bad Day at Black Rock", "Kramer vs. Kramer", "Silkwood".
Too much material seems to be regurgitated, and not enough screenwriters seem to read literature and science fiction. There are plenty of compelling stories that have never been told.
We are witnessing an extreme aversion to anything that is not tried and true, and it has cost the studios. The '70s marked a time when movies were less formulaic and deviated from the old studio system. Filmmakers took risks, and it paid off with films like The Godfather.
This is what I do. I am still relatively young, but I have astigmatism (I need a cylindrical correction in both eyes; simple reading glasses don't work for me). I have one set for normal use, to see clearly at a distance, and another set that just corrects the astigmatism for reading and computer use. This is much easier on my eyes for long coding sessions. I highly recommend getting the AR (anti-reflective) coating on both sets of glasses; monitor glare is pretty noticeable otherwise.
Two sets of glasses keeps you from needing to compromise on your vision.
I've done some image processing work. It seems to me that you could take the output of this neural network and correlate it with other image processing routines, like feature detection and feature measurement: a conditional-probability-based decision chain.
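One concrete form such a conditional-probability decision chain could take is naive-Bayes log-odds fusion: each routine (the network, a feature detector, etc.) contributes a likelihood ratio, and the chain multiplies them into a posterior. Everything here is a sketch of the idea, not any particular system, and the conditional-independence assumption is the big caveat:

```python
import math

def fuse_detectors(prior, likelihood_ratios):
    """Posterior P(target | all observations) from a prior and a list of
    per-detector likelihood ratios P(obs | target) / P(obs | no target),
    assuming the detectors are conditionally independent (a strong
    assumption in practice -- correlated detectors overcount evidence)."""
    log_odds = math.log(prior / (1.0 - prior))
    for lr in likelihood_ratios:
        log_odds += math.log(lr)
    odds = math.exp(log_odds)
    return odds / (1.0 + odds)
```

Two weak detectors that each favor "target" 3:1 combine with a 50% prior into a 90% posterior; a detector that votes against (ratio < 1) pulls the chain back down.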
I work on a LIDAR sensor meant for Anti-.
I work at a start-up that makes 3D laser-radar vision sensors for robotics and autonomous vehicles
“In the 60s, Marvin Minsky (a well known AI researcher from MIT, whom Isaac Asimov considered one of the smartest people he ever met) assigned a couple of undergrads to spend the summer programming a computer to use a camera to identify objects in a scene. He figured they'd have the problem solved by the end of the summer. Half a century later, we're still working on it.”
I just read that it's possible to transfer and play your iTunes files on other devices, like an Android phone. With an iTunes player, I don't feel I own something if music files can be deleted without my permission. We have one of these players, but I've always been wary of it.
There are plenty of other players/dev boards that can read music from something like a micro-SD card and play it without all the DRM hassles. There are plenty of open-source projects out there that use inexpensive boards, like the Raspberry Pi or the STM32F4 board, running bare-metal, Linux, or FreeRTOS.
In every non-trivial program there is at least one bug.