Embarrassing Stories Shed Light On US Officials' Technological Ignorance
Okay, the cybersecurity negotiator ignorance is bad, the rest less so.
I have been a happy man ever since January 1, 1990, when I no longer had an email address. I'd used email since about 1975, and it seems to me that 15 years of email is plenty for one lifetime.
Email is a wonderful thing for people whose role in life is to be on top of things. But not for me; my role is to be on the bottom of things. What I do takes long hours of studying and uninterruptible concentration. I try to learn certain areas of computer science exhaustively; then I try to digest that knowledge into a form that is accessible to people who don't have time for such study.
- Donald Knuth
The role of Supreme Court Justice is also "to be on the bottom of things". It is possible to understand enough about email to make good judgements about it without using it on a daily basis. The justices have to make rulings every week about subjects with which they have absolutely no interaction in their normal day-to-day life. From technology to finance to agriculture, no one can possibly be an expert on all the issues they hear. It is their job to constantly learn enough about a subject to know what is important from a legal and constitutional point of view. If they are failing to do this, then that is a legitimate complaint. The fact that they weren't familiar with "common knowledge" technologies before encountering them in court, or haven't chosen to incorporate them into their lives, isn't.
Glamor, X11's OpenGL-Based 2D Acceleration Driver, Is Becoming Useful
The cairo-ickle blog has maintained very interesting benchmarks of the different cairo rendering backends. The short story is that every hardware-accelerated backend except for Sandy Bridge SNA has performed worse than the software implementation. And in some cases the hardware acceleration is significantly less stable. I'm curious to see if this finally pushes Glamor over the hump and makes it faster than the software path.
Microsoft's Attempt To Convert Users From Windows XP Backfires
XP is over 12 years old, that's one hell of a *free* long term support package.
How long it has been since a company sold a product to their first customer is irrelevant. What matters is how long it has been since they sold the product to me. Microsoft stopped retail and OEM sales of XP in June 2008, which was shortly after Vista SP1 was released and most of its problems had been fixed, and a bit more than a year before Windows 7 was released. Those customers got just shy of 6 years of support, which is still pretty darn good. In comparison, Ubuntu offers 3 years of support for an LTS release after its replacement comes out, and OS X tends to be about the same. However, those both offer free or cheap upgrades, so a shorter support cycle is at least somewhat justified.
For corporate customers, the support provided by a Red Hat subscription is entirely comparable. No moderately sized company can get away with using OEM/retail licenses of Windows/Office; they all pay some sort of subscription to MS. RHEL 5 will be supported for just over 6 years after RHEL 6 came out. RHEL 2-4 were each supported for 5 to 5.5 years after their successor. Both MS and RH have extended support for critical security bugs beyond that, but both cost extra money. Recent Solaris releases are as good or better (depending on which support phases you consider comparable).
So for corporate users, XP's support duration was reasonable and in line with the rest of the industry. For consumers who have to stick with older OSes for compatibility it was much better than the alternatives, and it is hard to compare once you start considering free upgrades (is an OS X point release comparable to a Windows service pack or a full OS release, etc).
Wolfram Language Demo Impresses
The way this is set up, you don't code everything in natural language; rather, it is just a shortcut to look up the correct formal language. Instead of searching/browsing documentation to find the exact names of the functions you want and how to chain them, you just type what you want in natural language. If it interpreted you correctly, then great, it saved you several minutes, and now you know the real syntax to use in the future. If not, well, you only lost a couple of seconds.
The idea of mixing in natural language like this isn't so weird; when programmers don't even know the name of the library the functionality lives in, the first step most of them take is a natural language search in a web browser, and then they go from there. This just takes it one step further and streamlines the process, which is perfect for an interactive language.
Whole Foods: America's Temple of Pseudoscience
Homeopathy is not silly; it is a lie. If you sell it, you're lying to people. So it matters that Whole Foods sells it, as it casts doubt on their grasp of science, which indicates their "healthy" foods are just marketing to the credulous.
Products in regular supermarkets are also filled with lies, and each has products that are better than the other's in one way or another. Solution: make your own decisions rather than expecting a corporation to base its decisions on science rather than on what sells best.
Experimental Port of Debian To OpenRISC
You can now deal directly with TSMC for this, or go through third parties like MOSIS and CMP. AFAIK none of the groups that do this publish their rates, and I haven't done this myself, so my numbers come from forums like this one, rounding up the lower numbers that I have commonly seen.
Experimental Port of Debian To OpenRISC
You don't need a multi-billion dollar plant to make the processors. You just need to pay someone who does. You can get small quantities of ASICs made for around $2-5k by taking advantage of programs that put many designs from different people on the same wafer.
Copyright Ruling On Publishing Calculated Results: Common Sense Breaks Out
I just did some queries and the only copyright statement I see is the standard one at the bottom of their page. They do have a legitimate copyright on their pages, including the layout, design and the content which they created. That notice doesn't necessarily imply that they claim to own the facts that are being displayed. In fact, they frequently provide citations for those facts, which implies that they don't claim to own them.
New DOOM Game Not Dead: Beta Comes With Wolfenstein Pre-Order
Personally, I think the Doom concept translates poorly to modern gaming. Doom is to FPSs what Tolkien-esque fantasy is to RPGs -- revolutionary in its time, but bland and generic today. Modern games need distinctive characters, settings, stories, and gameplay to succeed (artistically, anyway).
I disagree. The recent Serious Sam releases were great, and showed that the old-school FPS formula still makes for a good game in today's world. Fast-paced, lots of shooting, and meaningless plot. The good main character helped, but it was the gameplay that really made it stand out. The problem with Doom 3 wasn't that it failed to add all the things that modern FPSs have, but rather that it failed to replicate the fun gameplay of the originals.
Gabe Newell Responds: Yes, We're Looking For Cheaters Via DNS
The biggest part of his announcement is that this checking is done client side; your DNS history is not sent to Valve. They also only record MD5 hashes that match the cheat sites they are looking for, not your entire DNS history. Finally, they claim to only check for DNS lookups of servers used by the cheat software itself, not just websites where you might read about and download cheats (although in some cases I imagine these could be the same), and use this as a second check after the client has already detected a cheat installed on your machine. So simply visiting cheat software websites without using them shouldn't get you banned.
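A minimal sketch (Python, with hypothetical domain names and hashes of my own invention) of how such a client-side check could work: hash each domain in the local DNS cache and compare against a pre-distributed list of MD5 hashes, so non-matching domains are never recorded or transmitted.

```python
import hashlib

# Hypothetical MD5 hashes of known cheat-server domains, shipped with the
# client. Only the hashes are distributed; the check happens locally.
FLAGGED_HASHES = {
    hashlib.md5(d.encode()).hexdigest()
    for d in ("cheat-server.example", "aimbot-auth.example")  # illustrative only
}

def matching_hashes(dns_cache):
    """Return MD5 hashes of cached domains that appear on the flagged list.

    Domains that don't match are discarded, so the caller only ever sees
    evidence of lookups it was already searching for.
    """
    return {
        h
        for h in (hashlib.md5(d.encode()).hexdigest() for d in dns_cache)
        if h in FLAGGED_HASHES
    }
```

Note that reporting only matching hashes, rather than the raw cache, is what keeps the rest of the DNS history private.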
GOP Bill To Outlaw EPA 'Secret Science' That Is Not Transparent, Reproducible
Nothing in this bill will require federally funded research to be public access, nor will it fund archival efforts for the raw data. Rather, it requires that the EPA ignore any science that isn't open access, after the GOP spent years fighting against open access.
At my current workplace, I've outlasted ...
If you consider just current employees, about half started before me and half after me. But if you consider everyone I have worked with at this job, probably close to 4/5 have since moved on to other jobs. Since most people here either stay all the way till retirement, or for just a few years, without much middle ground, the second measure will continue to grow much faster than the first.
Tesla Touts Cross-Country Trip, Aims For World Record
Some of the older Tesla charging stations have SAE J1772 connectors, in addition to Superchargers. There is no reason to doubt that he has used them with his Leaf. He's not lying, just being willfully ignorant about the difference between a Supercharger and a standard slow charger.
Google Planning To Remove CSS Regions From Blink
The W3C standard for Regions has mostly been created by Adobe ... I thought standards were there to implement not argue with.
CSS Regions is not a W3C standard. It is a Working Draft. The entire point of publishing a working draft is to solicit feedback from the community. There have been several working drafts that were never promoted to final recommendations, because there was no community consensus that they were a good idea. What Google and Mozilla are doing is a perfectly constructive part of the standardization process.
Fixing Broken Links With the Internet Archive
A new error code won't help, because for that to work the original website would have to send it. But if a link is broken, then they already neglected to send a useful response code. This feature is about how the client responds to a 404 error, in which case the most honest thing to do is show the user the 404 message that the site provided, but also let them know that they can access an older version of the page if they wish. Which is pretty much how the existing Wayback plugins work.
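As a sketch of the lookup such a plugin could perform after seeing a 404: the Internet Archive exposes a Wayback Availability JSON API (a GET to archive.org/wayback/available?url=...), and the client only needs to parse its response. The helper below (Python, names mine) assumes the documented response shape with an "archived_snapshots"/"closest" entry.

```python
def closest_snapshot(api_response):
    """Extract the archived-copy URL from a Wayback Availability API response.

    api_response is the decoded JSON, e.g.
    {"archived_snapshots": {"closest": {"available": True, "url": ...}}}.
    Returns None when no snapshot is available, in which case the client
    should simply show the site's own 404 page.
    """
    closest = api_response.get("archived_snapshots", {}).get("closest")
    if closest and closest.get("available"):
        return closest["url"]
    return None
```

If this returns a URL, the plugin can offer it alongside the 404 page rather than silently redirecting, which matches the "most honest" behavior described above.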
Fixing Broken Links With the Internet Archive
None of those examples should result in a broken link if you are maintaining your website correctly. And this feature is only "fixing" broken links; that is, links that once existed and are now 404'ed.
If you want to discontinue a product, then replace those pages with one that explains that the product is discontinued and provides links to similar current products, as well as the support page for the discontinued product. If users click links in reviews or forum posts about your old product and receive 404s, or are redirected to a completely unrelated and unhelpful page on your site, they will be frustrated with or without this feature.
In the second case, just redirect the entire demo website URL tree to a current list of examples.
In the third case, you shouldn't do that without redirecting the old url to the new one. Seriously, are you trying to make your content hard to find?
Again, redirect to the new menu.
In no case is sending a user a 404 useful or beneficial, nor is it the most appropriate thing to do according to the HTTP standard. If you really want to be pedantic then send a 301 or 303 to perform the redirect, otherwise use URL rewriting, or just change the contents of the existing URL, whichever is easiest. The user should only see a 404 if they clicked an invalid link that was never a real URL for your website. Otherwise, you have failed your users, and it's no one's fault but your own if they choose to use a service that tries to make up for your shortcomings.
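For what it's worth, the permanent redirects described above take only a few lines of web-server configuration. A sketch using Apache's mod_alias directives (all paths and the domain are hypothetical):

```apache
# Discontinued product: permanent redirect to an explanatory page
Redirect 301 /products/oldwidget https://example.com/discontinued/oldwidget

# Retired demo site: send the whole URL tree to the current examples list
RedirectMatch 301 ^/demos/.* https://example.com/examples/

# Moved content: point the old URL at its new home
Redirect 301 /docs/setup.html https://example.com/support/setup/
```

A 301 tells clients and search engines the move is permanent, so old links in reviews and forum posts keep working instead of 404'ing.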
US Supreme Court: Patent Holders Must Prove Infringement
If you sue someone for patent infringement you have always had the burden of proof, even before this ruling. All this ruling is saying is that if you threaten to sue someone, and they go to a judge first asking you to put up or shut up, the burden is still on you as the patent holder, same as if you had sued them.
Secondly, this ruling does nothing to limit the discovery process. As a small inventor suing a big company you still have the same subpoena powers during discovery as you did before.
In other words, the Supreme Court simply reaffirmed that accused infringers are innocent until proven guilty, regardless of the procedural nuances of how the lawsuit is initiated. None of the concerns you voiced will become worse due to this ruling.
Great Firewall of UK Blocks Game Patch Because of Substring Matches
Secondly, it's entirely voluntary. It's not even "opt-out". You have to make an actual choice whether to enable it or not during setup.
Not for long: http://www.wired.co.uk/news/ar...
Google Fined By French Privacy Regulator
Yeah that was my reaction as well. The link has slightly more information, and seems to imply that the current policy may comply with French Law, but the way the change took place did not:
On the substance of the case, the Sanctions Committee did not challenge the legitimacy of the simplification objective pursued by the company’s merging of its privacy policies.
Yet, it considers that the conditions under which this single policy is implemented are contrary to several legal requirements:
If that is true, then the change was a one time offense, and a one-time remedy is fitting. That said, I think the remedy ought to have included a "redo" of the policy change not just a fine; declare that Google's users may choose to be bound by either the old or new policy until Google enacts the change in a manner that is in compliance with the law.
Ask Slashdot: How Many (Electronics) Gates Is That Software Algorithm?
If they plan on implementing this in hardware, then they should have people who are capable of answering that question. If instead they are just a manufacturer and aren't capable of doing the actual hardware design, then you have bigger problems than answering this question. That is something you should find out about ASAP.