Cyberattack On German Steel Factory Causes 'Massive Damage'
Bigger example: push the red button in a nuclear power plant and yes, the control rods will react, but if you don't carry away the heat from radioactive decay, you will get a Fukushima.
During WWII, the major target of bombing runs was the infrastructure used to make the weapons. That means blast furnaces were damaged far more than they were by these hackers. Ditto the electricity generation infrastructure, which at the time meant coal-fired power plants. They were all rebuilt.
Next time it will be the nuclear power plants, which are effectively nuclear bombs with a big red target painted on top. Had Europe been using nuclear power plants during WWII, there would be places in it still uninhabitable now.
Who's To Blame For Rules That Block Tesla Sales In Most US States?
But we have the best democracy you can fine anywhere.
Grammar nazi here. You wrote "fine", an adjective. A "fine anywhere" isn't a thing. You wanted the verb "buy".
The Failed Economics of Our Software Commons
Obviously, it would be crazy to staff such critical projects largely with a handful of unpaid volunteers working in their spare time.
The people who do this have a number of reasons. Some do it because open source software garners job offers. Some do it because they or the businesses they work for need free software to exist, and it's a self-perpetuating loop - the more free software there is, the more people contribute to it, and the more they have to choose from. For some it's like attending church - it feels right. For some it's a nice social group to be in. None of these reasons means they or the system they contribute to are crazy.
As for the freeloaders - without legions of these "freeloaders", free software would not exist. Few would bother to put the effort into Linux, or X, or Debian if there weren't legions of users out there to test it, give feedback, find bugs, and suggest improvements. They are a necessary part of the system. A system that, for all its faults, works at least as well as any commercial way of developing software if you go by deployments.
Net Neutrality Alone Won't Solve ISP Throttling Abuse, Here's Why
would US ISPs be required to run trunks across the Gulf of Mexico because you decided you wanted that to have priority? Because using your argument, they really could do that.
Yep. You pretty much nailed it. In a well oiled market if there are enough customers out there who love Netflix so much they are prepared to pay the huge premium an ISP would have to charge in order to cover the cost of running such trunks, then they would exist.
But "required" is too strong a word. In a market functioning well no one requires anybody to do anything, so in this case in particular no one requires an ISP to fill a particular market niche. If the niche opens up then some will, not because anybody requires it, but because it is in their best interest to do so.
Yes, in the US "Net Neutrality" is code for "the government requiring the ISPs to act in a certain way". Yes, that is not a good way to run things - I'd be leery of it too. It's much better to let a market decide. In fact it is so obviously better that most countries had the foresight to structure their telecoms so such a market would develop naturally. But not the US, which is why I said the US has cocked it up.
But then again, in a well oiled market your whole scenario becomes fanciful, because if Netflix did that, another content provider would pop up offering the same content from the US, so its customers didn't need to pay extra to a premium ISP just to see it. Well, they would if bandwidth cost the same in the US as it did in Honduras. If it cost more, our new Netflix would have to recoup it somehow. I guess it could get very complicated, but that's the beauty of a well oiled market - it sorts all this shit out automagically without the need for government interference.
Net Neutrality Alone Won't Solve ISP Throttling Abuse, Here's Why
There is a simple definition of Net Neutrality that works: the customer gets to decide the priority of his traffic.
There are many ways you can engineer this outcome, but all the countries that pull it off do it really simply: every household is serviced by multiple ISPs (at least tens), and you choose the one you like given your budget.
As usual the US, the supposed beacon of capitalism, has cocked it up. Most homes are serviced by one ISP. And with the power that gives them, not only don't they give you a choice in the priority of your traffic (which admittedly would be a big ask), they erect pay-walled gardens, and then actively interfere with outside traffic to force you to use them. They do this in secret, and seem to have no trouble telling outright lies about it when queried.
Being able to bleed monopolistic profits off their customers has made them hugely profitable of course. So to cap it all off they engage in crony capitalism by bribing your politicians with campaign donations to preserve this farce.
I have no idea why voters in the US put up with this sort of shit. It boggles the mind.
Soda Pop Damages Your Cells' Telomeres
It means that you're 96% certain that your hypothesis is true.
Yeah. But if that were really true, everybody would trust the results of a study like this. But no one does.
It's the bar that's used for medical studies.
And in particular, the medical fraternity almost never believes the result from just one study. They always advise waiting for it to be confirmed.
You would be correct if this were a randomly selected study; the issue is it wasn't randomly selected. It was published. Studies that don't meet the 96% interval typically don't get published. So all we know is that 1 study out of god knows how many showed this effect. If it is 1 out of 1, the 96% applies. But if it was 1 out of 10, it's almost certainly wrong.
Now it's published, there are kudos to be made from shooting it down. Translation: now it's published, it becomes the null hypothesis. A study showing it isn't true is a positive result and now has a chance of getting published.
In other words, the first published statistical survey showing a controversial result is barely worth the paper it's written on. Its only real use is to prompt further research.
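The selection effect is easy to simulate. A toy sketch (all the numbers - 10% of tested hypotheses being real, real effects always clearing the bar, and the 96% cutoff - are my own assumptions for illustration, not from the study):

```python
import random

random.seed(1)

published_true, published_false = 0, 0
for _ in range(10_000):
    is_real = random.random() < 0.10       # assume 10% of tested hypotheses are real
    if is_real:
        p = random.uniform(0.0, 0.04)      # assume real effects always clear the bar
    else:
        p = random.random()                # null is true: p-value is uniform
    if p < 0.04:                           # the 96% publication bar
        if is_real:
            published_true += 1
        else:
            published_false += 1

# Fraction of *published* positive results that are actually wrong.
fdr = published_false / (published_true + published_false)
print(f"{fdr:.0%} of published positives are false")
```

With these made-up numbers, every published study individually met the 96% bar, yet roughly a quarter of the published positives are wrong - which is exactly why one published study means little on its own.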
Battery Breakthrough: Researchers Claim 70% Charge In 2 Minutes, 20-Year Life
Ah, here is a much better description of what they did. In short: their real breakthrough was to grow longer TiO2 nanotubes. They then re-did the work in the previous link to see what difference the longer nanotubes made. It turned out to speed up the charge/discharge rate. At 30C they got 6K cycles at 86% capacity, which is where the 20 years comes from. The shorter tubes had 10K cycles, so it's a charge rate versus cycle life trade-off. As they said, it puts them in supercapacitor territory, but I'm not sure how that's helpful. We already have supercapacitors. We need cheaper batteries.
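My own back-of-envelope check on how those figures hang together (nothing here beyond the 30C and 6K-cycle numbers comes from the paper):

```python
# A C-rate of 30 means the full capacity moves in 1/30 of an hour.
c_rate = 30
minutes_per_full_charge = 60 / c_rate         # 2.0 minutes - the headline figure

# 6,000 cycles at one full cycle a day would last:
years_at_one_cycle_per_day = 6000 / 365       # about 16.4 years

# So the 20-year claim presumably assumes a bit under one full cycle a day:
cycles_per_year_for_20_years = 6000 / 20      # 300 cycles a year
```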
Battery Breakthrough: Researchers Claim 70% Charge In 2 Minutes, 20-Year Life
I'm struggling to see what's different here from what was done 3 years ago. In other words, self-annealing fast-charge batteries using a titanium dioxide nanotube anode aren't new. The linked article says capacity for the Na variant of the cell is 144 Wh/kg, which compares to around 250 for LiPo batteries. The linked article also says they had it working with Li at higher densities.
But it hasn't taken over the world yet, and so there must be some problem with it. Maybe the nanotubes cost a small fortune.
As for those of you whining about charging a car in 5 minutes: maybe it is a bit optimistic. But I'd happily settle for charging my phone in 10 minutes via a 100 W USB-C connection.
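For what it's worth, the arithmetic works out (the ~12 Wh phone battery is my assumption, and this ignores conversion losses and charge taper):

```python
battery_wh = 12    # typical-ish phone battery capacity - my assumption
charger_w = 100    # USB-C PD at 100 W

# Ideal full-charge time, ignoring losses and taper near full:
minutes = battery_wh / charger_w * 60   # about 7.2 minutes
```

So a 10-minute phone charge over 100 W USB-C is well within the power budget; the battery chemistry is the limiting factor, not the cable.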
Long-range Electric Car World Speed Record Broken By Australian Students
Inspirational. You managed to elicit a dick waving competition from our fellow geeks in the US, all chanting "Tesla".
But Tesla isn't in the same league. It can't be. It's a mass-produced product.
Sadly, they don't know what we know. We may be able to design the 1st one. But we can't build the next 1000 economically, unlike Tesla.
Please guys, devote some of that enthusiasm and energy to figuring out how to manufacture the thing. Don't do the work for some Chinese company.
Australia Repeals Carbon Tax
Why would you assume he would compensate people for something that was being removed?
Maybe because before the election, Abbott promised to keep those tax cuts after repealing the carbon tax?
But you are right, I didn't assume it would stay. At the time Abbott was making a whole pile of promises, and he could not keep them all - balance the budget, reduce taxes, keep all the benefits those taxes paid for. But by that time, hearing him make promises he could not keep was no surprise. It was clear by then the man would say anything, do anything, prostitute anything (including the sexuality of his daughter) in order to get into power.
Amazingly, this extraordinary behaviour got worse after he was elected. (Amazing to me anyway. I didn't think it was possible.) First we had a promise to be an open, transparent government; then a week or two later we learnt a phrase: "on water matters". Who still remembers the no surprises, no excuses speech he gave after being elected? Probably not too many, given the shock the first budget inflicted.
Cosmologists Show Negative Mass Could Exist In Our Universe
Dark Matter and Dark Energy are two completely unrelated issues.
To a complete layman like me, it sounds from the ancestor post you are replying to that they could be very much related:
Negative mass reacts oppositely to both gravity and inertia. Oddly, that means that negative mass still falls down in a gravitational field: The gravitational force is opposite, but negative mass responds negatively to force (a=F/m, where both F and m are negative). So negative mass particles repel each other gravitationally, but are attracted to positive mass objects.
That sounds like a good candidate for explaining both. Space expands because Dark Matter repels itself, but it causes galaxies to clump and gravitational lensing because it attracts ordinary matter. I always did wonder why, if Dark Matter interacts with everything so weakly, it didn't immediately clump into black holes. This would explain it.
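The sign argument in the quote is easy to check directly. A sketch (units arbitrary, G = 1; this is just the Newtonian a = F/m bookkeeping, nothing more):

```python
def accel(m_source, m_test, r=1.0, G=1.0):
    """Radial acceleration of m_test due to m_source.

    Negative means 'toward the source'. Force is F = -G*m1*m2/r**2
    (negative = attractive for two positive masses); then a = F/m_test.
    """
    F = -G * m_source * m_test / r**2
    return F / m_test   # m_test cancels: a = -G*m_source/r**2

print(accel(1.0, 1.0))    # < 0: ordinary mass falls toward ordinary mass
print(accel(1.0, -1.0))   # < 0: negative mass ALSO falls toward ordinary mass
print(accel(-1.0, -1.0))  # > 0: two negative masses push each other apart
```

The m_test cancellation is the whole trick: the acceleration depends only on the source's sign, so negative masses self-repel while still falling toward ordinary matter.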
Intel Confronts a Big Mobile Challenge: Native Compatibility
your claim that x86 has 8-bit mode is false; the lowest common denominator for x86 is the 16-bit 8086, which you're probably confusing with the 8-bit 8080 which is not x86 compatible
He was probably thinking of the 8088, which was an 8086 with an external 8-bit bus. Internally it was identical to an 8086, and so by any reasonable definition it was a 16-bit chip. It was probably the commonest version of the 8086 because it was used by the original IBM PC.
their attempts to match ARM in performance/W are so far unsuccessful when looking at non-biased benchmark results
True, but to have any hope of winning the performance/watt race currently, they would have to produce a chip that runs as slow as an ARM. If things were static they might have been tempted to do that, but they aren't. Instead, Moore's law means an OOO superscalar chip will be practical in a phone in a few generations. And with it, the power advantage ARM gains from less complex, slower chips will disappear. Once that happens, the overhead imposed by the amd64 instruction set will be so small it becomes irrelevant. Intel seems content to just wait for that to happen. Or maybe it's more a consequence of not having a choice, because the complexity of x86 did appear to hurt the underpowered Atom badly.
Whatever the reason, Microsoft's abandonment of Windows RT hints that simply waiting will work. Microsoft abandoned RT because ARM simply doesn't have the horsepower, while an i5 does and still gets over a day's worth of battery time. So they have already hit the power budget of a tablet. A phone can't be too many generations off.
But with the Mill architecture claiming a 10-to-1 MIPS/Watt advantage while having the same raw horsepower as an OOO superscalar core, I can't help but wonder if both ARM and amd64 will lose this race in the end.
PHK: HTTP 2.0 Should Be Scrapped
I don't think HTTP has any problems with security.
I disagree. We live in a world where phishing attacks are common and the PKI system is fragile. Fragile as in: when Iran compromised DigiNotar, people most likely died as a result.
The root cause of both problems is that the current implementation of the web insists we use the PKI infrastructure every time we visit the bank, the store, or whatever. It's a fundamental flaw. You should never rely on Trent (the trusted third party - the CAs in this case) when you don't have to. Any security implementation that does insist you do when you don't have to is broken. Ergo HTTP is broken.
It's not like it isn't fixable. You could insist that on the first visit the site sends you a cert which is used to secure all future connections, and that the cert is only used when the user clicks on a bookmark created when the cert was sent. That would fix the "Iran" problem, and it would also allow web sites to train users to use the bookmark instead of clicking on random URLs.
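The scheme is basically trust-on-first-use, like SSH's known_hosts. A toy sketch of the idea (my own code, not any real browser mechanism; the dict plays the role of the bookmark store):

```python
import hashlib

def fingerprint(cert_der: bytes) -> str:
    """SHA-256 fingerprint of a certificate's raw bytes."""
    return hashlib.sha256(cert_der).hexdigest()

def visit(store: dict, host: str, cert_der: bytes) -> str:
    """Trust-on-first-use: pin the cert the first time, verify it after."""
    fp = fingerprint(cert_der)
    if host not in store:
        store[host] = fp     # first visit: remember the cert (the "bookmark")
        return "pinned"
    if store[host] == fp:
        return "ok"          # same cert as last time - no CA consulted
    return "alert"           # cert changed: possible DigiNotar-style MITM

store = {}
print(visit(store, "bank.example", b"cert-A"))   # pinned
print(visit(store, "bank.example", b"cert-A"))   # ok
print(visit(store, "bank.example", b"cert-B"))   # alert
```

A compromised CA can mint a "valid" cert for bank.example, but it can't make its fingerprint match the one pinned on the first visit, which is exactly the property the CA system lacks.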
So given that HTTP security has caused deaths and it is fixable, I'd say HTTP has huge problems with security. Given that, HTTP/2.0 not attempting to fix it is a major fail IMHO.
Melbourne Uber Drivers Slapped With $1700 Fines; Service Shuts Down
but it is likely the demands the Directorate will place on Uber drivers, such as mandatory criminal record checks, vehicle inspections and insurance, will make the service in Melbourne unviable.
Those aren't unreasonable demands of someone wanting to carry passengers for hire. They are checks that pretty much the entire Western world has come up with after numerous problems with unsafe, uninsured and unsavoury taxi drivers. If this is enough to make Uber unviable, then I wouldn't want to be one of their investors.
You sound oh so reasonable. Pity you didn't mention that currently the only recognised way of having those checks is to buy a taxi licence. That licence costs around $30,000 per year.
It is the $30K per year that would make UberX unviable. It has no relationship to the cost of doing those checks. I have no doubt Uber will go to the Directorate and say: "Look, sure, we can ask the drivers to send us the relevant certificates before we allocate them jobs. A roadworthy (which is what we in Australia call a vehicle inspection) is around $100, and they can send us the paid insurance bill." The answer will be a resounding no, at which point it will become obvious it has nothing to do with "safety checks".
One possible explanation of the $30K is that it is protection money, charged by the government to protect the incumbents. Who, by the way, meet the definition of a monopoly. Quoting http://www.smh.com.au/technology/technology-news/apps-put-nsw-taxi-monopoly-in-doubt-20121102-28nv6.html:
University of Sydney economist Peter Abelson said Premier and Cabcharge were so interlinked that "it's not really a duopoly, it's almost a monopoly and between them they control about 80 per cent of the cabs on Sydney streets".
A government fining emerging competition to protect an incumbent monopoly, presumably because of regulatory capture, doesn't sound so reasonable, does it? In fact it pisses me off so much that I deliberately travel using these upstarts even if it is less convenient, which it often is.
Heartbleed OpenSSL Vulnerability: A Technical Remediation
For people who didn't follow the link chain, it has since been updated:
Important update (10th April 2014): Original content of this blog entry stated that one of our SeaCat server detected Heartbleed bug attack prior its actual disclosure. EFF correctly pointed out that there are other tools, that can produce the same pattern in the SeaCat server log (see http://blog.erratasec.com/2014... ). I don't have any hard data evidence to support or reject this statement. Since there is a risk that our finding is false positive, I have modified this entry to neutral tone, removing any conclusions. There are real honeypots in the Internet that should provide final evidence when Heartbleed has been broadly exploited for a first time.
MtGox's "Transaction Malleability" Claim Dismissed By Researchers
The very short version is that what these "researchers" were looking at isn't actually how the alleged bug would have worked.
That is far too short to be useful.
Mtgox's malleability problem was caused, ironically, by the protocol fixing one source of it. When that happened, the network started rejecting mtgox's transactions; in fact they weren't even relayed.
The paper says there were no malleability attacks of the scale mtgox claims because the authors didn't see the required number of malleable transactions. This would have been reasonable if the attacker also depended on seeing the malleable transactions relayed by the network. But they didn't. Mtgox provided a web service that allowed you to see the transactions mtgox issued, thus allowing the attacker to see every malleable transaction.
Thus the attack could have been much larger than what the authors of the paper saw, invalidating some of the paper's conclusions. Particularly the conclusions regarding mtgox, unfortunately.
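To make the mechanism concrete: a transaction's id is the double SHA-256 hash of its entire serialization, signature encoding included, so re-encoding the signature changes the txid without touching where the money goes. A toy sketch (the byte strings are fake stand-ins, not real Bitcoin serialization - only the hashing is real):

```python
import hashlib

def txid(serialized_tx: bytes) -> str:
    """Double SHA-256 of the full serialized transaction, as Bitcoin does."""
    return hashlib.sha256(hashlib.sha256(serialized_tx).digest()).hexdigest()

# Two encodings of the "same" payment: identical input and output,
# but the attacker has re-encoded the (still valid) signature bytes.
tx_original  = b"in:utxo42|sig:3045AB|out:1BTC->merchant"
tx_malleated = b"in:utxo42|sig:304602AB|out:1BTC->merchant"

print(txid(tx_original) != txid(tx_malleated))   # True: different txids
# ...so anyone tracking a payment by txid alone, as mtgox did, loses sight of it.
```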
MtGox's "Transaction Malleability" Claim Dismissed By Researchers
Security bugs in unpatched software are a thing that are well-understood by sysadmins and security researchers.
Really? Bitcoin is valued at several billion dollars. The reward for breaking Keccak was academic cred. The reward for breaking Bitcoin is notoriety for life, and being set for life as well. Besides, you do know that nothing in Bitcoin is encrypted, right? There is one signature scheme and a lot of hashing. There isn't even a nonce.
Additionally, this isn’t an unpatched security flaw where upgrading to Bitcoin 1.1 would have fixed the issue. It’s a weakness inherent to the Bitcoin protocol which may or may not be able to be repaired without invaliding all existing BTC transactions.
Said like a person who is eager to prove he doesn't know much about the subject he is commenting on. It wasn't an upgrade to Bitcoin 1.1 that fixed the issue; it was the upgrade to Bitcoin 0.9.0. It happened last month. It didn't invalidate anything.
Why Are We Made of Matter?
We know where he hid it. He hid it in yesterday. Anti-matter is matter going backwards in time*, so when the big bang happened, all the antimatter disappeared into yesterday while we headed off towards tomorrow.
* For some definitions of time.
New MU-MIMO Standard Could Allow For Gigabit WiFi Throughput
MU-MIMO is part of wave 2 of the 802.11ac standard. Right now every shipping product is wave 1.
If we are lucky the routers will get wave 2 this year, and if not this year, definitely next. Apart from allowing more devices to share the same cell, MU-MIMO is nice in that it reduces the power consumption of devices like phones, as they only see the packets for their own stream. Wave 2 also brings a doubling of the bandwidth (if the spectrum is available) and other efficiencies, which translates to 2 to 3 times the speed of wave 1. This means that unlike wave 1, wave 2 should be able to get 1Gb/s in the real world.
All very nice. The only issue is we won't see wave 2 client chips in laptops, phones and the like until 2016 at the earliest. So unless you are doing back-to-back routers or range extending, don't expect this shiny new Qualcomm chip to make any measurable improvement in any of your existing 802.11ac devices, or in any you buy in the next 2 years.
Researchers Find Problems With Rules of Bitcoin
Don't be too sure.... a large botnet could potentially do some nasty things to the availability of the network ---- particularly, a Botnet with control of sufficient number of Bitcoins to generate an overwhelming volume of transaction spam, so legitimate transactions can't get through --- by using transactions of the minimum size, Or more traditional DDoS techniques such as packet storming the IP addresses of key nodes in the Bitcoin network.
A botnet in control of a huge quantity of bitcoins, throwing them at the mining network in minimal transactions, sounds like a miner's delight to me. There is a minimum mining fee, so while in the short term it might cause the bitcoin miners to gag on their feast, in the long term all it will do is transfer that huge quantity of bitcoins to the miners. Why on earth would anybody do that?
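Back-of-envelope on why it's a gift to the miners (the fee figure is my assumption for illustration, not a current network value):

```python
min_fee_btc = 0.0001     # assumed minimum fee per transaction
spam_txs = 10_000_000    # a botnet's hypothetical spam flood

# Total the attacker hands straight to the miners for the privilege
# of annoying them:
cost_btc = spam_txs * min_fee_btc   # about 1,000 BTC
```

The fee floor means transaction spam doesn't scale for free: the bigger the flood, the bigger the wealth transfer to the very network it's attacking.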
As for traditional DDoS - the history of bitcoin is one DDoS after another. Just recently some bright spark must have decided that because mtgox said there was a transaction malleability flaw it must be true, and started modifying every transaction they could get their hands on. In other words: if ever there was a network battle-hardened against DDoS's, it's bitcoin.