Scientific Data Disappears At Alarming Rate, 80% Lost In Two Decades
This problem occurs even within the same group: we often have trouble repeating the simulations from our own papers, even ones as recent as one year old. The problems typically come from people leaving (PhDs finishing, grants expiring, people moving to a different job), changes in the simulation tools, etc.
In our Computer Architecture research group we use Mercurial to version the simulator code, so we know when each change was applied. For each simulation, we store both the configuration file used to generate it (which also records the Mercurial revision of the code being used) and the simulation results, or at least the interesting subset of them. Many simulators allow different verbosity levels, and most of the output is usually useless, so we typically keep only the interesting data (such as latency and throughput); otherwise we would run out of disk space.
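A minimal sketch of that bookkeeping (the function, the file names and the `hg id` invocation are illustrative assumptions, not our actual scripts):

```python
import json
import subprocess
from pathlib import Path

def record_simulation(config: dict, results: dict, out_dir: str) -> Path:
    """Store a run's config, the simulator's Mercurial revision, and the
    distilled results (e.g., only latency and throughput) side by side."""
    try:
        # Ask Mercurial for the current revision of the simulator code.
        rev = subprocess.run(["hg", "id", "-i"], capture_output=True,
                             text=True).stdout.strip()
    except FileNotFoundError:  # hg not installed, or not run from a repo
        rev = "unknown"
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    record = {"hg_revision": rev, "config": config, "results": results}
    path = out / "run.json"
    path.write_text(json.dumps(record, indent=2))
    return path
```

Storing the revision next to the results is what lets you rebuild the exact simulator binary later, which is the whole point.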
Even with this setup, we often have trouble replicating the exact results of our own previous papers, for example because of poor documentation (typical in research, since homebrew simulation tools are not maintained to the standard one would expect of commercial code), changes that introduce subtle effects, code that gets lost when someone leaves, or simply large files deleted to save disk space (for example, simulation checkpoints or network traces, which are typically very large).
However, you typically do not need to look back and replicate old results, so keeping all the data is wasted effort. I completely understand that research data gets lost, but I think it is largely unavoidable.
IE Zero-Day Exploit Disappears On Reboot
Sure it disappears!
Unless you're running IE as admin with UAC disabled, and the malware installs a hypervisor, and then you're hijacked forever without any chance of detecting it. How long before we see that?
VLC Reaches 2.1
Yet it still does not support turning off the computer, despite that feature having been requested for years. That's the ONLY missing feature preventing it from being my default video player.
LGPL H.265 Codec Implementation Available; Encoding To Come Later
CODEC: COder-DECoder; but there is no (en)COder here!
EFF Slams Google Fiber For Banning Servers On Its Network
all ISPs are deliberately vague about what qualifies as a 'server.' ... because TCP clearly specifies it.
The fact that some programs implementing a server might behave correctly or not (e.g., Skype), or the fact that in some cases ISPs allow certain services or ports, does not mean that a 'server' is something arcane. It's you who don't know what it is.
Computer Memory Can Be Read With a Flash of Light
The link to the actual Nature Communications paper is here: Non-volatile memory based on the ferroelectric photovoltaic effect.
This somewhat resembles Phase-Change Memory (PCM). PCM devices are made of a material which, under a high current, undergoes thermal fusion and switches to a different state, from amorphous to crystalline. This changes two properties: light reflectivity (exploited in CDs and DVDs) and electrical resistance (exploited in emerging non-volatile PCM memories). The paper cites PCM and other types of emerging non-volatile memories.
In this case it is the polarization that changes, without requiring thermal fusion, which increases the endurance of the device, one of the main shortcomings of PCM. The other main shortcoming of PCM is write speed, due to the slow thermal process; in the paper they claim something like 10 ns. If this can be manufactured at large scales of integration and low cost, it will probably be a revolution in computer architecture.
Supercomputers At TACC Getting a Speed Boost
GB != Gbps.
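GB is gigabytes, a quantity of data; Gbps is gigabits per second, a line rate. They differ by a factor of 8 even before you account for the per-second part. A quick sanity check:

```python
def transfer_seconds(size_gigabytes: float, rate_gbps: float) -> float:
    """Time to move `size_gigabytes` of data over a `rate_gbps` link,
    ignoring protocol overhead. 1 byte = 8 bits, decimal (SI) units."""
    bits = size_gigabytes * 8e9      # GB -> gigabits
    return bits / (rate_gbps * 1e9)  # gigabits / (gigabits per second)

# Moving 1 GB over a 1 Gbps link takes 8 s, not 1 s.
print(transfer_seconds(1, 1))  # → 8.0
```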
Major Advance Towards a Proof of the Twin Prime Conjecture
"the existence of any finite bound, no matter how large, means that that the gaps between consecutive numbers don't keep growing forever"
Actually, I disagree with the unfortunate wording of the sentence. The gaps between consecutive prime numbers are variable, and on average they DO keep growing forever. This is a widely known result: the density of prime numbers decreases as the numbers grow. However, since the gap between consecutive primes is variable and does not follow a regular function (otherwise it would be very easy to calculate prime numbers), even at a very low density of primes we can still find a pair of consecutive primes with a gap of only 2.
The problem under study is not whether the gap between consecutive primes keeps growing forever (which is true only on average, over a long sequence of integers), but whether there are infinitely many pairs of primes with gap 2. The new result says that there exist infinitely many pairs of primes with gap 70M or less. However, this does not imply at all that no consecutive pairs of primes with gap > 70M exist (in fact, they do).
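Both halves of the claim are easy to check numerically for small ranges: the average gap grows (roughly like ln n, by the prime number theorem), yet gaps of exactly 2 keep turning up near the top of the range. A quick sketch:

```python
def primes_up_to(n):
    """Sieve of Eratosthenes: all primes <= n."""
    is_prime = [False, False] + [True] * (n - 1)
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            for m in range(p * p, n + 1, p):
                is_prime[m] = False
    return [i for i, b in enumerate(is_prime) if b]

primes = primes_up_to(100_000)
gaps = [b - a for a, b in zip(primes, primes[1:])]

# Average gap over the first vs the last thousand primes: it grows...
early = sum(gaps[:1000]) / 1000
late = sum(gaps[-1000:]) / 1000
# ...yet twin primes (gap exactly 2) still appear near 100,000.
twins_late = sum(1 for g in gaps[-1000:] if g == 2)
print(early, late, twins_late)
```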
ZFS Hits an Important Milestone, Version 0.6.1 Released
Why not label it 1.0? Looks like it is still in beta...
UK Researchers Build Micron LED Light Based Wireless Network
C'mon, this is Slashdot. Is it so complex to say that they employ NRZ modulation with a light carrier, rather than "a bit like Morse Code from a torch"? Is it so difficult to refer to the switching/modulation frequency, or baud rate, rather than "they can also flicker on and off around 1,000 times quicker than the larger LEDs"?
The idea of using an LED light for communication is presented as a novelty in the summary, when all remote controls work this way, and even the original 802.11 specs included a PHY layer that relied on IR. You are trying to make articles more dumb-user-friendly, but what you actually achieve is driving away the users who might make valuable comments.
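And NRZ is trivial to state: the light is held on for a 1 and off for a 0 for one symbol period, so the bit rate equals the switching (baud) rate. A toy model (the symbol period and threshold here are made-up values, not the researchers' parameters):

```python
def nrz_encode(bits, symbol_period_ns=100):
    """NRZ over a light carrier: LED on for 1, off for 0, one symbol period
    each. At a 100 ns symbol period this is 10 Mbaud = 10 Mbit/s."""
    return [(1 if b else 0, symbol_period_ns) for b in bits]

def nrz_decode(symbols, threshold=0.5):
    """Recover bits by comparing received intensity against a threshold."""
    return [1 if level > threshold else 0 for level, _ in symbols]

data = [1, 0, 1, 1, 0, 0, 1]
assert nrz_decode(nrz_encode(data)) == data
```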
"Bill Shocker" Malware Controls 620,000 Android Phones In China
This is NOT a virus; viruses infect a system, typically by modifying other existing executable files, and then replicate themselves. These are malware applications which were installed by the users. The notable point here, not covered in the summary, is that these applications were not designed as malware; rather, they employ a free (as in gratis) SDK which turns the phone into a zombie.
However, note that simply removing the applications should remove the "infection". The Android security model does not allow an application to "infect" the OS, unless the user has rooted the phone and runs the application as root (in which case, it's your fault).
Einstein@Home Set To Break Petaflops Barrier
If it were at position 24 in the Top500, it would likely be about 3x as power-efficient as all these individual computers combined. This sort of initiative is impressively inefficient (but very effective), which is why the 'cloud' model won the battle over the 'grid' model. It only works because the computing power is donated, not paid for. On the other hand, the equivalent supercomputer would likely cost 3-8x as much as the aggregate (i.e., the sum of the costs of all these computers), because it is custom-made.
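To put rough numbers on that claim (every figure below is an illustrative assumption, not a measurement of Einstein@Home or any Top500 machine): suppose a petaflops is delivered either by 100,000 donated desktops or by one purpose-built machine.

```python
# All numbers are illustrative assumptions for the efficiency argument,
# not measured figures for Einstein@Home or any real supercomputer.
desktops = 100_000          # volunteer machines
gflops_each = 10            # assumed sustained gigaflops per desktop
watts_each = 100            # assumed draw per desktop while crunching
supercomputer_mw = 3.3      # assumed draw of an equivalent petaflops machine

total_pflops = desktops * gflops_each / 1e6  # aggregate: 1.0 petaflops
grid_mw = desktops * watts_each / 1e6        # aggregate draw: 10 MW
print(grid_mw / supercomputer_mw)            # roughly the 3x in the claim
```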
Automation Is Making Unions Irrelevant
Repeat after me: automation is good. It makes us, humankind, more productive. With the same human work we can get more benefits for ourselves, so on average our wealth improves. People who no longer need to do manual, repetitive jobs can move to more creative work which produces more benefit for mankind. Gutenberg's printing press was good. E-mail was good, despite removing jobs at the post office. Hydraulic excavators are good. All of them reduce the number of jobs, and unions cannot and should not try to prevent this. Fortunately, we no longer rely on picks and shovels to dig tunnels.
The problem is not with automation, which is good for mankind as a whole; the problem is with the distribution of wealth. We are facing a serious problem in which those who own the machines (capital) become much richer by producing the same as before, while those who lose their jobs become poorer. I certainly believe this problem will worsen over time, as more jobs are made obsolete by technology and "the system" fails to provide an alternative way to earn a living.
One option might be to move to a system in which everyone receives a basic "social income", enough for a living, while those with a job would earn more. However, this poses serious problems, such as obvious abuse and unfairness. I see the problem, but I don't foresee a clear solution.
GLIBC 2.16 Brings X32 Support, ISO C11 Compliance, Better Performance
Note that, on many processors, the legacy x86 and the x64 implementations are (almost completely) separate, using different processor resources. Between the larger/better resources and the higher number of registers, the x64 pipeline achieves better performance on the same processor. The lower memory usage also helps performance, but its impact is minor.
Samsung TVs Can Be Hacked Into Endless Restart Loop
The vulnerability was originally disclosed here, not in the posted link.
This vulnerability only works from the same broadcast domain as the TV, since the remote-control protocol relies on broadcast messages to announce the service. This means your TV cannot be cracked from the Internet. Let's hope Samsung applies a fix soon, in any case.
Multicore Chips As 'Mini-Internets'
The seminal paper proposing the use of switched/routed interconnection networks on chip (NoCs) was published by Dally and Towles 11 years ago at DAC'01: "Route packets, not wires: On-chip interconnection networks". The idea of associating a router with each core and replicating it in "tiles" is not new either; Tilera was (IIRC) the first company to sell processors based on a tiled design, an evolution of the RAW research project. A related research project, TRIPS, replicated functional units on each tile rather than full cores. Intel has used a tiled design in Polaris, the SCC, and MIC (which includes the forthcoming Knights Corner).
So no, the idea of using routed interconnects is not new at all. In fact, after reading the linked article, it turns out that two thirds of the text introduce the idea, and the last section details the contributions: two ideas developed by Li-Shiuan Peh's group, seeking to improve performance (using virtual bypassing, a form of routing precomputation) and to reduce power consumption (using low-swing signaling).
Physicists Create a Working Transistor From a Single Atom
They make a transistor from multiple atoms, all of them silicon except one, which is phosphorus. That is NOT a transistor made from a single atom (as the title suggests). A great advance in any case, but a misleading title.
Retailer Calls Rivals' Bluff On "HDMI Scam"
its a bit like saying you can plug in a CAT 5 cable and get gigabit...
the answer is it depends...
the longer the cable the more the signal degrades and just because its digital does not mean it will produce the same results..
Actually, if an Ethernet Cat 5 cable does not reach gigabit speeds, then a certification of the cable would fail. That cable is not Cat 5.
There are two categories of HDMI cables, Category 1 and Category 2 (for 720p and 1080p, roughly speaking). If a cable is certified for a given category, it means it can transmit the corresponding signal with a low (bounded) error rate. Obviously, it is harder for a longer cable to pass the Cat-2 test. But there are more factors, such as the length, the construction (connectors, etc.) and the installation itself (twists are bad!).
So no, the fact that you build an Ethernet cable from Cat-5 parts does not mean the final result will be certified as Cat 5; similarly, a high-speed HDMI cable should be certified before being sold, and probably the cheapest ones are not.
However, the fact that the digital signal transmitted on these cables is uncompressed audio/video makes it much more error-tolerant: a lower-quality cable (i.e., one with a higher bit error rate) produces more distortion, but it may well go unnoticed by the end user.
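A back-of-the-envelope way to see this: uncompressed 1080p pushes roughly 3 Gbps of pixel data, so even a cable with a mediocre bit error rate flips only a handful of isolated bits per second, each corrupting a single pixel for a single frame. The BER figures below are illustrative, not certification limits:

```python
def errored_bits_per_second(bit_rate_bps, ber):
    """Expected number of flipped bits per second for a given bit error rate."""
    return bit_rate_bps * ber

# Uncompressed 1080p60: 1920x1080 pixels * 24 bits/pixel * 60 frames/s.
video_bps = 1920 * 1080 * 24 * 60   # ~3.0 Gbps of raw pixel data
good_cable = errored_bits_per_second(video_bps, 1e-12)   # assumed BER
cheap_cable = errored_bits_per_second(video_bps, 1e-9)   # assumed BER
print(good_cable, cheap_cable)      # a few flipped bits/s at worst
```

Compare with a compressed stream, where a single flipped bit can corrupt a whole group of frames; that is why uncompressed links tolerate cheap cables so gracefully.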
Emergency Dispatcher Fired For Facebook Drug Joke
Making comments IN PUBLIC about taking drugs might be a reason to be fired. Making jokes IN PUBLIC about the same issue might also be a reason to be fired. Both can damage your image.
Facebook, on the other hand, is mostly public, so the loss of image might also apply.
Craig Mundie Wants "Internet Driver's Licenses"
Great idea, Microsoft! Even better, the Internet Driver's License should be followed by the "System Administrator Driver's License", so that only people who know the risks present on the Internet, and know their own computer's OS, can run with Administrator privileges.