UK Consumers Reporting Contactless Payment Errors
The chip-and-PIN system is the stupidest thing in the world for small amounts of money.
For example, take the cafeteria queue in my building. It occasionally builds to 4-5 students, each spending £3-4. Every card payment takes a couple of minutes: the cashier hands the card reader to the customer, the customer inserts their card and types in their PIN, then hands the device (card still in it) back to the cashier, who inputs the price, holds the machine while it calls the bank, confirms the transaction, and prints the receipt, which is finally handed back with the card.
All of this BS, for what in the US is solved by a simple swipe of the card. Absolutely asinine to have the system as it is now for small purchases.
PlanetIQ's Plan: Swap US Weather Sats For Private Ones
So, could this supplement the current data record on the cheap? Then it's worth it. Would the data be property of NOAA/NESDIS and be distributed freely through known data portals? Then it's worth it. Would it be locked through a paywall and not available for researchers to actually figure out what the cost benefit of the data was? Then it's a non-starter.
Data gathering itself is going to be a low-end market. The people most interested in the data (governmental organizations, academic researchers) don't really want to pay for it, and in most cases can't afford to when it comes from private firms. Leave the gathering of data to the governments, which allows the data to be used by all. The US is nearly unique in the world in that pretty much every product generated is available for download by Joe Schmoe with nothing more than an e-mail address. You can fill terabytes with a handy knowledge of wget and shell scripting, and do whatever analysis you want on your own. Putting this data in the hands of private firms when the taxpayer is paying for it just strikes me as a bit wrong.
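To make the wget-and-shell point concrete, here's a minimal sketch of the same bulk-download pattern in Python (the URL template is made up for illustration; real portals such as NOAA's each have their own layout):

```python
# Sketch of scripted bulk-downloading from a public data portal.
# BASE is a made-up URL template, standing in for a real portal's layout.
from urllib.request import urlretrieve

BASE = "https://data.example.gov/goes/{year}/{day:03d}.nc"

def day_urls(year):
    """Build one URL per day of a (non-leap) year."""
    return [BASE.format(year=year, day=d) for d in range(1, 366)]

def fetch_year(year, dest="."):
    """Download a whole year's files - terabytes, given enough years."""
    for url in day_urls(year):
        urlretrieve(url, f"{dest}/{url.rsplit('/', 1)[-1]}")

# fetch_year(2009) would pull 365 daily files into the current directory.
```

That's the entire "analysis pipeline front end" for a lot of research groups: a loop, a URL pattern, and patience.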
Auto-threading Compiler Could Restore Moore's Law Gains
But even make, by default, will do things sequentially. True parallelization would be like running make in several different terminal windows at the same time.
I run into this issue a lot with my scientific data analysis. The code I write could easily be parallelized, if the framework existed to do so. I repeat the same command for every hour of every data set. Writing this sequentially makes sense, works, and takes the least effort to get the goal accomplished. But if I could leverage this into 12 instances, assigning a core to each task (such as computing a daily total from 5-minute data over a year), I could truly speed up my productivity without the effort of writing separate applications to handle each month (which is possible as well, but is a pain in the ass to actually operate; then you have to compile it 12 times, and recompile all 12 every time you tweak your program, etc.).
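The "12 instances, one per month" idea can be done without 12 separate builds. A sketch using a process pool, with a toy per-month task standing in for the real analysis:

```python
# Sketch of the "assign a core per month" pattern with a process pool,
# instead of compiling and launching 12 separate programs. The per-month
# work here is a toy stand-in (totalling fake 5-minute samples).
from multiprocessing import Pool

def monthly_total(month):
    # Stand-in for reading one month of 5-minute data and totalling it:
    # 30 days x 24 hours x 12 five-minute samples per hour.
    samples = [month * 0.1] * (30 * 24 * 12)
    return month, sum(samples)

if __name__ == "__main__":
    with Pool(processes=12) as pool:       # one worker per month
        totals = dict(pool.map(monthly_total, range(1, 13)))
    print(totals)
```

Tweak the program, re-run once; the pool fans the same code out across the cores for you.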
Why Microsoft's Surface Pro Could Fail
Because tablets are hot right now. So MS thinks a bad tablet that is also a bad ultrabook will sell like hot cakes, because everybody badly wants the "full PC experience" everywhere.
And I'm pretty sure that with a Core i5 in a 10.6" chassis, this tablet will run hot in the literal sense. 4.5 hours of battery life won't help much either.
It's too much of a compromise. To compete with tablets, you need 8+ hours of battery life. To compete with good ultrabooks, you need that as well. But that means too much battery weight with an x86 processor.
eBay Bans the Sale of Spells and Magic Items
Sounds like you got better.
When On the Moon and Mars, Move Underground
The average temperature in the Moon's craters would be ~236 K (the article states -35 degrees F), whereas the surface temperature swing works out to ~278 K. Can't you just divide by 1.8? And Celsius is another arbitrary scale akin to Fahrenheit anyway; real men use Kelvin.
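For anyone checking the arithmetic: an absolute Fahrenheit reading converts via (F - 32)/1.8 + 273.15, while a temperature *difference* (a swing) just divides by 1.8. The ~500 °F swing below is back-computed from the ~278 K figure, not stated in the article:

```python
def f_to_k(temp_f):
    """Convert an absolute temperature from Fahrenheit to Kelvin."""
    return (temp_f - 32) / 1.8 + 273.15

def f_swing_to_k(delta_f):
    """Convert a temperature *difference* (swing) from F to K."""
    return delta_f / 1.8

print(round(f_to_k(-35)))        # the -35 F crater average -> 236 K
print(round(f_swing_to_k(500)))  # a ~500 F surface swing -> ~278 K
```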
Tornado Scientists Butt Heads With Storm Chasers
The whole "storm chasing saves lives" line is complete bunk, meant to give the PhDs moral authority over the amateur chasers who are in it for the thrill. Currently, warning lead times are around 15 minutes, with a fairly high false alarm rate. The miss rate on tornado warnings is actually quite low, thanks to the deployment of WSR-88Ds across the national grid. As the WSR-88Ds are upgraded to a dual-polarimetric configuration, we should be able to see more development and perhaps improve warning times.
Scientists in the field are there to research atmospheric development of an interesting phenomenon we don't know all that much about. These field campaigns get a lot of press because they are neat ("Hey, we chase after tornadoes for a living"). But while they do produce actual science, the "saving lives" stuff gets overblown.
Chuck Doswell (one of the original storm chasers) has addressed this topic on his blog: http://cadiiitalk.blogspot.com/2010/05/are-chasers-saving-lives.html
If good field work in science isn't reason enough on its own to do this, then meteorologists are just boned.
Kepler Mission Finds 752 Extrasolar Planet Candidates
Mod this guy up. NASA will release the data in its entirety eventually, and in perpetuity, once they get the first paper out of it. It's the same whenever NASA puts up a new satellite: they get the data, analyze it, publish the initial results, then release the entire record, for free, for anyone in the world to download. So there's an embargo period; it's not long, and it's not that significant. They are better at putting out free data (as is NOAA/NWS) than anyone else in the world; the European and Chinese agencies are exceptionally hard to get data out of without paying or knowing someone behind the scenes.
Anyone can download a GOES image or MODIS image from NOAA or NASA in the span of minutes to hours. It takes days (or months) to get SEVIRI or MERRA imagery from EUMETSAT.
Cleaner Air Could Speed Global Warming
Aerosols reflect more shortwave energy than they absorb in the longwave, contributing to a net negative forcing in the climate system. With a reduction in aerosol concentration, we'll see additional warming. This is a no-duh scientific principle supported by direct instantaneous observations, as opposed to the projected "future climates" from model results, which are not nearly as reliable, since they still rely on parameterizations of physical processes that we may or may not have a handle on.
While the US Clean Air Act has really helped the air around us, with the industrialization of China and India (and their lack of pollution controls) the net global effect may be minimal. That's the interesting thing to realize: as our industry gets cleaner, China's gets as dirty as Cleveland in the 1920s, or Detroit back when it still had industry.
Sky Watchers Want Recognized a Newly Described Type of Cloud
There most likely is no problem. I actually AM a meteorologist (BS, MS, and finishing up my PhD), and these just look like cumulus lenticularis: the formation mechanism is a waveform within the atmosphere that causes regular patterns of condensation that appear like this.
These are nothing really new; the scientific basis is pretty good for these clouds to be listed. It's a 2-D wave pattern along a good airmass boundary. It's definitely neat, but it's not an earth-shattering cloud formation.
Wind Could Provide 100% of World Energy Needs
It's certainly possible. We haven't quite quantified it yet. But the short answer is, yes.
Here's a link to a paper that studied the effects of a proposed wind farm in Kansas:
They see lots of local effects, but little effect that carries to larger scales.
Here's another link to another paper (in PNAS)...
They say there would be a non-negligible impact on the climate from wind power, but it would still be better than current power generation.
The fact of the matter is that there is always some effect. If you put a solar panel out in the middle of a field, you're changing the local albedo, absorbing more energy (especially in a desert, since sand is generally bright). This will cause differences in the total energy balance and may change weather patterns and water allocation. There are studies showing that changes in albedo have large impacts on local weather.
Deforestation does the same thing, changing local wind patterns, and putting in a shit-ton (scientific term) of wind turbines would definitely have massive local effects on the meteorology. Would it be bad or good? Hard to say. But everything interacts with the system.
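As a back-of-the-envelope illustration of the albedo point (the numbers here are assumed round figures, not from the papers): dropping the surface albedo from bright desert sand at ~0.4 to a dark panel at ~0.1 under ~1000 W/m² of peak sun means roughly 300 W/m² more absorbed locally:

```python
def extra_absorbed(albedo_before, albedo_after, insolation):
    """Additional shortwave power absorbed per m^2 after an albedo change."""
    return (albedo_before - albedo_after) * insolation

# Assumed round numbers: bright desert sand ~0.4, dark panel ~0.1,
# ~1000 W/m^2 peak insolation -> roughly 300 W/m^2 extra absorbed.
print(extra_absorbed(0.4, 0.1, 1000.0))
```

Spread over square kilometers of farm, that kind of forcing is exactly the local effect the papers are measuring.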
Hope the papers help.
Old-School Coding Techniques You May Not Miss
There are still good scientific applications for good old Fortran. Most research weather models are still written in it (because when you need to integrate seven partial differential equations at 10-second intervals over 48 hours on a grid that's 200x200x50, you need something that runs fast), and it's still relatively easy to understand.
Plus, Fortran scales to MPI work pretty well. And when you're running grids like that, you want to be able to assign 40 processors so that your 48-hour model run actually completes in less than 48 hours.
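To put those grid numbers in perspective, here's a quick tally (a simplification; real solvers do far more work per equation per step, but it shows why speed matters):

```python
# Rough operation count for the grid described above: 7 equations
# evaluated at every point of a 200 x 200 x 50 grid, every 10 seconds,
# for a 48-hour forecast.
points = 200 * 200 * 50          # 2,000,000 grid points
steps = 48 * 3600 // 10          # 17,280 ten-second steps in 48 hours
evals = 7 * points * steps       # ~2.4e11 equation evaluations
print(f"{evals:.3e} equation evaluations per run")
```

Hundreds of billions of evaluations per run is why the compiled-Fortran-plus-MPI combination hangs on in this field.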
NASA Upgrades Weather Research Supercomputer
You know, a lot of the climate and weather prediction models are open source. You can download the source code if you want, and run it on your own PC if you have the right compilers. Some links for your own perusal:
Community Climate Model
NASA GISS Model
Weather Research and Forecasting Model
Regional Atmospheric Modeling System
As long as you have access to a Linux/Unix machine, you can get these models yourself. If you want to contribute, you can do so. Just know that you probably need to have taken graduate-level courses in numerical methods, and to actually understand the physical terms in the model, to make changes that mean something. Science in this case is rather open: people can easily download these models and make changes to improve them if they need to (or to test sensitivity, etc.).