Fukushima Decontamination Cost Estimated $50bn, With Questionable Effectiveness
Over 50% of my town is over 1mSv/yr and nobody is campaigning for it to be decontaminated. My suburb is at about the 0.97 mark, so I must be safe... I'm about a quarter of the planet away from Fukushima and not downwind.
What's the bet that most of these areas have been above 1mSv/yr since the solar system formed? How many of this 77% are actually contaminated, and by how much?
Australian Networks Block Community University Website
I don't think the internet filter laws got passed; I thought the ISPs jumped in and said they would voluntarily use the Interpol "worst of" list. The compromise seems reasonable to me: if the list is abused, it can be voluntarily dropped. To be on the list you need to host porn of kids under 13, and this needs to be verified by multiple member countries.
I'm guessing from TFA that this has been implemented as a BGP blackhole list. An easy way for the ISP to go: they will already be running blacklists for things like bogons, and the performance impact will be low.
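As a sketch of what that blackholing amounts to on the lookup side (prefixes and addresses here are invented, taken from the documentation ranges), assuming an IP-level blocklist:

```python
import ipaddress

# Hypothetical blackhole list: prefixes learned from the blocklist feed are
# null-routed, so anything addressed inside them is silently dropped.
BLACKHOLED = [ipaddress.ip_network(p) for p in ("192.0.2.0/24", "198.51.100.128/25")]

def is_blackholed(addr: str) -> bool:
    """Return True if addr falls inside any null-routed prefix."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in BLACKHOLED)

print(is_blackholed("192.0.2.44"))   # True: inside a blocked /24
print(is_blackholed("203.0.113.9"))  # False: not on the list
```

This is the same mechanism ISPs already use for bogon filtering, which is why the marginal cost is low.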
The obvious fault with this is that when some kiddie porn domain gets blacklisted, the domain becomes useless, so the domain admin points their A record at some popular hosting company and takes them offline as well. If you're going down, take somebody with you.
Being on a blacklist sucks if there is no way to get off it. Many years ago the company I worked for was on a netblock that was on an outdated bogon list used by the US military. The military is really bad at keeping things maintained: something gets installed, the person who did it gets posted elsewhere a few years later, and all knowledge of what was done, how, and why is lost. They don't update their contact information either, so even if your email server wasn't blackholed you couldn't reach them anyway. Frustrating when there were treaties requiring this communication.
Ask Slashdot: How Do I Explain That Humans Didn't Ride Dinosaurs?
My dad rode dinosaurs to school with Jesus (early days before the zombie thing).
Laser Intended For Mars Used To Detect "Honey Laundering"
I'd generally agree with the buy-local bit, but cheap testing would be useful for local producers too. Bees are pretty much wild animals, so unless they are located in true wilderness or in the middle of farmland that you control, there could be anything in the honey.
Slashdot had an article on blue honey a couple of months ago.
You just need somebody to spill a bucket of the fake honey, and a few weeks later it could be turning up in the local beekeeper's product.
Bee Venom Has "Botox-Like Effect," Is Worth 7 Times As Much As Gold
I know a few ex-beekeepers who had kept bees for years or decades, taking hundreds of stings, and then slowly became more allergic. I don't know if they recovered their tolerance after giving it up.
One guy (a friend of my parents, so I didn't know him that well) was advised to carry an EpiPen since his last sting was so serious.
I suspect the cosmetic stuff would be slow-acting, so you could wash it off and seek medical help before it got out of hand, but I could imagine people becoming dangerously allergic to bee stings who otherwise wouldn't have.
Bee Venom Has "Botox-Like Effect," Is Worth 7 Times As Much As Gold
This reminds me of an urban legend (or maybe I just watched it on Fox) about some guys stinging their penises with bees to make them swell up.
I don't think it works this way.
The story starts with a backyard apiarist (me) doing a quick check of my hives in the middle of January. It was stinking hot. Since I was not planning on taking any real time or doing any real work, I was wearing light shoes that didn't tuck into my suit very well. Combine this with the bad choice of boxer shorts, and a little bee leakage led to at least one bee in a very dangerous place.
Lifting the second box back on, I leaned against it, leading to the worst sting I've ever experienced. I couldn't scrape the sting out in a hurry, so it had lots of time to inject lots of venom (don't take your pants off while standing next to an open hive).
There was not much swelling other than a small blister, but lots of pain for days (I was reasonably immune, so I'd normally expect no symptoms after a couple of hours at most). It was sort of itchy, but not in a way that could be scratched, and any contact was uncomfortable. I've still got a scar from the sting (if chicks dig scars, try explaining that one).
FCC Boss Backs Metering the Internet
Energy isn't free; they have to fuel those power plants. Heavy internet users don't really increase electrical costs in a meaningful way beyond the base load to power the equipment, and that doesn't vary with network traffic.
The power company needs to keep spinning reserve, and it has to pay for that reserve whether I use it or not. If I turn a light bulb on, it doesn't change the amount of fuel used in any meaningful way.
The same goes for internet traffic. My individual usage doesn't make a huge difference but my ISP has to keep the equivalent of spinning reserve in upstream capacity to cover all the users. It isn't cheap to buy this bandwidth.
For example, one of my clients uses about 300Mb/s with something over 40,000 corporate users. If we tried to provide this service with a guaranteed 20Mb/s per user (not that high a rate), we would need over 2000x the infrastructure. Anybody who tries to convince me that this increase in costs isn't meaningful isn't being realistic. Admittedly this isn't a straight ISP situation (10G infrastructure doesn't really exist for most of what we do), but the same scaling rules apply to ISPs.
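The scaling claim is easy to check with back-of-envelope arithmetic, using the figures from the paragraph above:

```python
# Back-of-envelope check of the scaling claim (figures from the post).
shared_capacity_mbit = 300        # actual shared upstream capacity for the site
users = 40_000
guaranteed_per_user_mbit = 20     # hypothetical guaranteed rate per user

dedicated_capacity_mbit = users * guaranteed_per_user_mbit
factor = dedicated_capacity_mbit / shared_capacity_mbit
print(f"{dedicated_capacity_mbit} Mb/s total, {factor:.0f}x the current capacity")
# 800000 Mb/s total, 2667x the current capacity
```

So "over 2000x" is, if anything, an understatement; the gap between guaranteed and statistically multiplexed capacity is where the ISP's costs live.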
FCC Boss Backs Metering the Internet
It's a ridiculous assumption though, because once the capacity's there, it costs about the same regardless of whether you use it or not.
It's ridiculous that the electricity company charges based on usage? After all once the capacity is there it costs about the same whether you use it or not.
I'm a fan of usage-based billing. I want a fast link from a latency point of view, but I don't want to pay for the upstream bandwidth needed to keep it saturated.
Two New Fed GPS Trackers Found On SUV
Just call it in as a bomb scare: somebody has attached a weird device with antennas and whatnot to my car, and I think it could be a bomb. Ask their advice on what to do.
They will either realize there is a warrant and tell you it is safe, or the bomb squad won't talk to the DEA and will do the destruction for you.
Seriously, I'd be a little freaked out by some random wireless device being attached to my car without knowing what it was. Let the experts decide if it is safe.
The recent snow on the U.S. east coast ...
I suppose it's a novel way of preserving the 'no fatal crashes' record a little longer.
Qantas have had heaps of fatal crashes. Admittedly none since they started flying jets about 50 years ago. http://en.wikipedia.org/wiki/List_of_Qantas_fatal_accidents
Why So Many Crashes of Bee-Carrying Trucks?
I'd put a bet on driver fatigue being the main cause.
Beekeeping is mostly a daytime job, except when you need to move a hive: if you close down a hive during the daytime, lots of bees are out flying and you take losses.
So the beekeeper is working outside his normal shift. He drives for hours to the bees, spends half the night closing them down and loading them on the truck, and then has to drive for hours to the new site, hopefully before the sun gets too hot and cooks the closed-down hives, which can't really vent themselves.
The beekeeper isn't a professional driver either, and doesn't know the roads as well as a professional would, including any roadworks. Tired and surprised, he doesn't react appropriately, and crashes.
UK Government Wants to Spring Ahead Two Hours
Australia stuffed around with daylight saving dates for the Olympics. Most distributions pushed updates with enough lead time that it wasn't a problem. I saw a few unpatched Windows servers miss the change, a few TV stations didn't make the update in time (but that happens with DST normally), and some Outlook calendar entries misbehaved.
Postgres is always unhappy with countries that stuff around with this, since it causes extra entries in the timezone lookup table.
Mostly it works, mostly it doesn't matter, and it's fixed manually in a day or two when somebody notices.
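The Olympics change is still visible in the tz database, which is what those distribution updates were shipping. A quick way to see it, assuming your platform carries the standard tz data with the 2000 rule:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+, needs a tz database on the system

syd = ZoneInfo("Australia/Sydney")

# Normally Sydney is still on standard time (+10:00) in mid-September...
print(datetime(1999, 9, 15, 12, 0, tzinfo=syd).utcoffset())  # 10:00:00

# ...but for the Sydney Olympics, DST was brought forward to late August 2000,
# so the same date is +11:00 that year. Unpatched systems missed exactly this.
print(datetime(2000, 9, 15, 12, 0, tzinfo=syd).utcoffset())  # 11:00:00
```

Software that hardcodes "DST starts last Sunday of October" instead of consulting the tz database is what breaks in these one-off years.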
Mathematics As the Most Misunderstood Subject
I can attest that "true" math is very removed from computation. The computational classes are all regarded as the "easy" classes. This is in contrast to the "hard" classes, real analysis and abstract algebra.
I'm not sure what you mean by computational classes, but the computational mathematics classes I did were some of the hardest I've done. There was a reasonably large choice of "tools" to solve a problem, but proving convergence and error bounds was really hard work. I never really mastered the "art" of it, needing much trial and error (with pages of wasted work) to get an answer. Some of the other students had a better eye for it and could make the "educated guess" about which path was going to give an answer.
Computationally it's easy to get an answer to many difficult problems, but it is hard to work out how good that answer is.
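As a small illustration of the "how good is the answer" problem: the trapezoid rule gives an answer with no effort, and the real work is the error analysis. A standard textbook trick (not from the discussion above) is comparing two grid sizes, since the error shrinks like h^2:

```python
import math

def trapezoid(f, a, b, n):
    """Composite trapezoid rule with n subintervals."""
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return h * s

# Getting *an* answer for the integral of sin on [0, pi] (true value 2) is easy.
coarse = trapezoid(math.sin, 0.0, math.pi, 64)
fine = trapezoid(math.sin, 0.0, math.pi, 128)

# Since the error is O(h^2), halving h cuts it by ~4, so the difference between
# the two runs estimates the remaining error: err ~ (fine - coarse) / 3.
estimate = (fine - coarse) / 3
print(abs((fine + estimate) - 2.0) < 1e-8)  # True: extrapolation lands very near 2
```

The convergence rate you needed to prove in class is exactly what makes this cheap error estimate legitimate.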
Mathematics As the Most Misunderstood Subject
I very much disagree. Teaching is a specialty in its own right and a good teacher can teach almost any subject given appropriate support and resources. Of course some competency in the subject is necessary to provide insight when the lesson isn't hitting the mark but you don't need to be an expert.
I'll give an example: during high school I had two physics teachers. One was pretty talented at physics and had been teaching it for years. The other hadn't taught physics much before and wasn't that strong (if he had sat the exam, some students would have got better marks than him).
The first taught a pretty neat syllabus. He did lots of well set out problem solving examples but the course was pretty dry. I think most students just got recipes out of the course and no real insight.
The second teacher followed the same syllabus but his examples weren't as well set out and he didn't necessarily get the right answer. Because he wasn't so confident in his answers he provided a lot of demonstration of checking techniques such as estimation and general sensibility checks (The ball doesn't roll up the inclined slope). His efforts to verify his answers, clean up his working and fix his answers taught more students about problem solving, physics and maths techniques than the rest of the course.
It's a shame I didn't realise this at the time. It must have taken real guts and a lot of homework to get up and teach that course. A brilliant teacher, but not a brilliant physicist.
Fix To Chinese Internet Traffic Hijack Due In Jan.
I don't think most operators could do a better job. Every ISP I've dealt with has been pretty anal about which routes they accept from me.
This incident happened at the large-ISP level, and currently they don't have the information required to do better filtering. In this case, China Telecom might legitimately be the shortest path for some of this traffic some of the time, and there is no way to tell otherwise.
PKI-signed advertisements would provide trust that I own the resources, and would probably solve most of the accidental routing incidents, i.e. somebody fat-fingers a route on some "core" router and it starts advertising it under its own AS. The rest of the Internet will ignore that route because that AS doesn't own it.
What I don't see it solving is the malicious case where the attacker strips the AS path and re-advertises the route. Say the real path is ME-A-B-C-D-BADGUY; BADGUY just advertises ME-BADGUY, so anybody closer will send traffic his way. Nobody can tell the difference, because I've signed the advertisement and they won't know that I'm not connected to BADGUY...
...unless I sign that my next hop is A, A then signs that his next hop is B, and so on. I could imagine that getting very expensive in the middle, where the tier-1 carriers would have to sign every route multiple times, once for each peer. Ouch.
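A toy model (not real BGP, all AS numbers invented from the private range) of why origin validation catches the fat-finger case but not the path-stripping one:

```python
# Toy model of RPKI-style origin validation. A ROA says
# "this prefix may only be originated by this AS".
roas = {"203.0.113.0/24": 64500}  # AS64500 legitimately owns the prefix

def origin_valid(prefix: str, as_path: list) -> bool:
    """Check only the last AS in the path (the origin) against the ROA."""
    return roas.get(prefix) == as_path[-1]

# Fat-fingered route: AS64666 originates the prefix itself -> rejected.
print(origin_valid("203.0.113.0/24", [64777, 64666]))  # False

# Path-stripping attack: AS64666 forges a short path that still ends at the
# legitimate origin, so origin validation alone cannot catch it.
print(origin_valid("203.0.113.0/24", [64666, 64500]))  # True
```

Catching the second case needs every hop in the path attested, which is the "sign every route once per peer" expense described above.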
Free IPv4 Pool Now Down To Seven /8s
5. If only the people who designed IPv6 "by committee" had thought a bit about the real world and technology, IPv6 would have been much easier to implement. 128-bit addresses are the *wrong* size; they should have set the size at 64 bits. 64-bit values are now natively manipulated by much of computer hardware, so just as the new protocol came into wider use, it would be conveniently supported by many algorithms relying on hardware. Now go build a radix tree for a routing table of 128-bit IPv6 addresses and see how well that works.
6. IPv6 in its default implementation wants to use your MAC address as part of the IP. I don't know, perhaps a few of those big companies that like tracking people so much may be interested in that. I am not.
In conclusion: I'll wait till stuff begins crashing down around me. Maybe then someone will come up with a better solution than the stillborn, poorly designed IPv6 we have now.
I think the 64-bit size was planned for. The network part of the address is 64 bits; anything doing routing isn't going to concern itself with the host part, and anything doing the last-hop part of the processing isn't going to do much with the network part, since it does its lookups on the host part.
I agree that the MAC-based host address is scary, but I wonder how much of a signature they already have from other properties of my computer. I also wonder how long before an IPv6 address is used to try to prove that a specific computer generated some traffic.
Free IPv4 Pool Now Down To Seven /8s
Why is IPv6 not based on MAC addresses? I've never understood this. Every piece of electronics capable of connecting to a network already has at least one unique hardware ID. Why do we need a new one?
Is there a reason not to just use this number? Or have I misunderstood, and this actually IS the plan?
A couple of reasons:
IPv6 addressing often is based on the MAC address (for the host part) when using the auto-addressing methods.
Some network devices don't have MAC addresses, e.g. a serial port running PPP.
Ethernet MAC addresses aren't necessarily unique; I've had to debug a MAC address collision at a medium-sized site. I think vendors are better now, but it probably still happens.
It makes sense to have a stable address even if the hardware has to be changed for some reason, e.g. the router always goes on blah::1.
Sometimes you want multiple addresses, maybe for virtual ethernets; only one can be derived from the MAC.
I think the plan is that the network half of the address is allocated as hierarchically as possible to hopefully enable route consolidation (are we dreaming?). The host part will be allocated based on the MAC address, except when it is not.
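For the MAC-derived host part mentioned above, this is roughly the modified EUI-64 derivation that SLAAC uses (a minimal sketch of the RFC 4291 scheme; the MAC is made up):

```python
def mac_to_eui64_iid(mac: str) -> str:
    """Derive the modified EUI-64 interface ID used by SLAAC (RFC 4291)."""
    octets = [int(b, 16) for b in mac.split(":")]
    octets[0] ^= 0x02                              # flip the universal/local bit
    eui = octets[:3] + [0xFF, 0xFE] + octets[3:]   # insert ff:fe in the middle
    return ":".join(f"{(eui[i] << 8) | eui[i + 1]:04x}" for i in range(0, 8, 2))

# The same MAC always yields the same host part on every network the machine
# visits, which is exactly the tracking concern.
print(mac_to_eui64_iid("00:25:96:12:34:56"))  # 0225:96ff:fe12:3456
```

Privacy extensions exist precisely to replace this stable identifier with a randomised one.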
Google, Microsoft Cheat On Slow-Start — Should You?
So then I guess everybody should just skip slow-start then? If Google and Microsoft can, and are having tremendous results, why shouldn't everybody? Heck, why is slow-start even still around? It should be tossed to the wayside like a Colecovision if it's optional and gets in the way of your performance...
Slow start probably should be skipped for most well-tuned websites. Most HTTP connections are short-lived enough to never ramp up to the available bandwidth or saturate queues, so why use an algorithm designed to keep queues small while trying to use bandwidth efficiently?
I think the slow-start concept would still be useful for bulk transfer services. If you are serving a couple of gigs of ISO images, you probably don't care about a bit of round-trip latency if it means you don't clobber router queues downstream; I could imagine congestion collapse being more likely with that kind of load.
BitTorrent should probably use slow start too. Often the competition for BitTorrent connections is other connections for the same torrent; if we start too fast we could impact too many of those connections, causing them to back off and hurting overall performance.
I'd guess that the magic numbers picked for slow start when the RFC was written are no longer applicable. RTTs are shorter, and queues are probably longer in packets (near the edges anyway) but shorter in terms of time, i.e. less consequence for a dropped packet, less likelihood of filling a queue, and less of a performance hit if we do fill one.
Google's choice of initial window size would be well considered. If Google's tuning hurt network performance, it would be causing packet loss on their own connections, driving latency up through retries.
Similarly, Microsoft's initial window size seems a bit ridiculous, so I'd bet it is either:
1) A mistake that is causing overall lower performance for their users.
2) Coarse tuning that helps for the front page (so helps in general) but hurts performance for bigger pages.
3) Some sort of window size caching, with that number cached from previous connections.
I did note that there were no retransmissions in the MS flow, so that doesn't seem like a bad guess. They don't support SACK (WTF?), which would slow things down if they lost packets.
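A crude model of why the initial window matters so much for short transfers: ignoring loss and receive windows, the congestion window doubles each RTT, so the number of round trips for a small page looks like this (segment counts are illustrative):

```python
def rtts_to_deliver(segments: int, initial_window: int) -> int:
    """RTT rounds to deliver `segments`, doubling cwnd each round (no loss)."""
    cwnd, sent, rounds = initial_window, 0, 0
    while sent < segments:
        sent += cwnd
        cwnd *= 2
        rounds += 1
    return rounds

# A small page of ~30 segments (roughly 44 KB at 1460-byte segments):
print(rtts_to_deliver(30, initial_window=1))   # 5 rounds with a classic IW of 1
print(rtts_to_deliver(30, initial_window=10))  # 2 rounds with a larger IW of 10
```

For a transfer that fits in a handful of windows, the initial window dominates total latency, which is exactly why it pays to tune it and why bulk transfers barely notice.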
For 18 Minutes, 15% of the Internet Routed Through China
Your step two is flawed; VortexCortex's steps are accurate.
In your step 2, Google thinks it is sending you Google's certificate, but it is really sending it to the MITM. Since it was the MITM who started the connection, the MITM builds the session keys and so can decrypt the session.
In your step 3, they don't need Google's private keys; they can create their own, and because they run a CA trusted by most people, they can sign those keys so that most people trust them. (I mostly use Firefox, which ships with the CNNIC CA installed.)
This sort of MITM attack is used all the time by filtering gateways, "McAfee Web Gateway" among many others. Since the filtered company controls its desktop operating environment, it can install its own CA. The gateway filter then creates certificates pretending to be the endpoint and creates an outbound connection pretending to be the client.
The only real way for SSL to solve the man-in-the-middle problem is client-side certificates issued by the server's owners, but then you have a distribution problem. And if the server also trusts the CA in the middle, it can intercept both directions.
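A toy sketch (names invented, nothing like real X.509 validation) of why the interposed CA works: the client only checks that the certificate's subject matches the host and that *some* CA in its trust store signed it.

```python
# Toy model of certificate acceptance. "CorpFilterCA" stands in for a CA
# pushed onto desktops by company policy (or shipped in the browser).
trusted_cas = {"RealRootCA", "CorpFilterCA"}

def cert_accepted(host: str, cert: dict) -> bool:
    """Accept if the subject matches the host and the issuer is trusted."""
    return cert["subject"] == host and cert["issuer"] in trusted_cas

# The filtering gateway mints a fresh cert for the real hostname on the fly,
# signed by its own CA. To the client it is indistinguishable from the real one.
forged = {"subject": "www.google.com", "issuer": "CorpFilterCA"}
print(cert_accepted("www.google.com", forged))  # True
```

Nothing in this check ties the certificate to the key Google actually holds, which is why any trusted CA can impersonate any site.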
Pay Or Else, News Site Threatens
You don't have to agree to the contract, but it is the only thing giving you a copyright license to continue viewing their content.
Probably the best example of a copyright license for publicly viewable content is the GPL. You don't have to agree to the GPL, but you lose a lot of rights if you don't.