$57,000 Payout For Woman Charged With Wiretapping After Filming Cops
This is a bit silly. Despite the disclaimer, does anyone really think the department doesn't realize it shouldn't do this again? The combination of the court judgment and a penalty is about all you're going to get.
Why Don't Open Source Databases Use GPUs?
I'm responsible for a large university learning management system (Sakai). The database is completely CPU-limited. I assume that's because the working set of data fits in memory. I would think lots of university and enterprise applications would be similar. Another data point is the experiments done on a NoSQL interface to InnoDB, which show very large speedups. Surely some of that is due to the CPU overhead of processing SQL.
Ask Slashdot: Linux Security, In Light of NSA Crypto-Subverting Attacks?
No, but there's no reason to think that Linux is worse than anything else, and it's probably easier to fix.
If I were Linus I'd be putting together a small team of people who have been with Linux for years to begin assessing things. From Gilmore's posting it seems clear that IPsec and VPN functionality will need major changes. Other things to audit include the crypto libraries, both in Linux and the browsers, and the random number generators.
But certainly some examination of SELinux and other portions are also needed.
I don't see how anyone can answer the original question without doing some serious assessment. However, I'm a bit skeptical whether this problem can actually be fixed at all. We don't know what has been subverted, or what level of access the NSA and their equivalents in other countries have had to the code and algorithm design. They probably have access to more resources than the Linux community does.
Teaching Natural Sciences To Social Science Students?
I taught statistics to a business school audience, too many years ago to think about. One thing you have to figure out is what to cover and from what viewpoint. Math students might be interested in the math behind some of the statistical methods. Social science students probably aren't. To be honest, they're just going to use canned packages, so the details of the math are not the most important thing to teach them.

What you really have to teach them is what all the math means. What assumptions are the methods based on? What do they do? When do you use them? How do you formulate problems? What are the most important ways that people can unintentionally (or intentionally, for that matter) get completely meaningless results out of statistics? E.g., what does it mean when you try 20 different models and one of them is statistically significant at the .05 level? Answer: it means nothing at all. But those kinds of results get reported all the time. Have them read some of the articles on why so many drug studies are turning out not to be meaningful.
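To make the 20-models point concrete, here's a quick back-of-the-envelope calculation (a sketch, assuming the 20 tests are independent and every null hypothesis is actually true):

```python
# If 20 independent models are tested and none of them reflects any real
# effect, the chance that at least one still comes out "significant" at
# the .05 level purely by luck is surprisingly high.
alpha = 0.05   # significance threshold
n_tests = 20   # number of models tried

p_at_least_one = 1 - (1 - alpha) ** n_tests
print(f"P(at least one false positive) = {p_at_least_one:.2f}")  # → 0.64
```

That's roughly a two-in-three chance of getting a "finding" out of pure noise, which is exactly why the one significant model out of 20 means nothing.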
Senators Ask Feds To Probe Facebook Log-in Requests
I agree with the FB position. I do work with youth at Church. Several of them are my friends on FB, although these days more of my friends are professional colleagues. Their parents know that. My privacy is set somewhat tighter than the default, to minimize their exposure to others.
While I am not silly enough to put anything that matters on FB, some of the kids have said things that, while not actually very serious, they might not want other people to see. The difficulty with letting third parties use my account is that most of what's there isn't my own postings but those of my friends. And they might well not want my potential employer to see them.
I don't know what kind of suit FB has in mind, but if I were going to make up a case I'd make it up based on compromising the privacy of minors without the consent of their parents.
The Windows 8 Power Struggle: Metro Vs Desktop
OK, I resist change just like everyone else. But that's not what is going on here.
Monitors are getting bigger. I'm doing more things at once. I want better ways of managing that. But Metro just gives me one thing at a time. Sorry, that's not a solution to the problem. That's going back to the original Macintosh.
Apple isn't perfect, but at least they've been trying some new ideas. I don't think the new ideas on screen management have been all that successful, but at least they're attacking the right problem.
At the moment, nobody has a better idea for a smartphone or a tablet than to show one app at a time. The only way W8 makes sense is if they're adding a piece for portable devices and saying, "while we're at it, let's let the desktop guys use it too." Fine. But only if they realize that the desktop systems still need new ideas as well.
And if I were doing a ground-up redesign, I'd consider whether we might be ready for a better approach with tablets as well. The new iPad has more pixels than many monitors. I'm not sure one app at a time should be the only way to use it.
Climate Change Skeptic Results Released Today
The skeptics I've read agree that temperatures have gone up. The questions are about models showing continuing rises, and what approach to take in dealing with it.
My concern is that we not exhaust the public's willingness to do something with approaches that will have almost negligible impact.
Is Canonical the Next Apple?
I would love to use Linux. But first and foremost the OS has to be able to do everything Mac OS or Windows can do. That includes licensed content. Currently iTunes won't run on Linux in any realistic way, and there's no real alternative. I doubt that I'm alone. I'm glad someone in the Linux community is working on things like this. I'm also going to need MS Office, as I have to be able to exchange documents with administrators and OpenOffice's compatibility isn't really good enough for that.
The Linux community has slowly understood that if you want to be a mainstream OS real people have to be able to install it. But once they've installed it, it's got to do what they need to do.
Advocacy Group For the Blind Slams Google Apps
Has anyone participating in this discussion actually done web design for accessibility? I've been looking at it for our course management system. It's not trivial, but it's also not difficult. It increases development time and cost, but probably by no more than 10%. It's perfectly possible to design reasonable visual interfaces that work fine with common screen readers; a sighted user won't even be aware that it's been done. It's a combination of avoiding some standard pitfalls that a screen reader can't reasonably work around, and putting appropriate labels and tags on everything. A lot of tools are accessible. jQuery has been doing an increasingly good job, and CKEditor has as well.
The issue isn't just blind people. Older people (like me, to be honest) sometimes need to increase font size, and would really like it if the web page design doesn't fall apart.
There's no way you're going to get away with saying "sorry, they should know they're handicapped." The law won't allow it, and in my opinion shouldn't. I might feel differently if there weren't reasonable approaches to dealing with it. The big problem is getting web developers to think about it, and to try their software with a screen reader now and then.
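Part of "avoiding the standard pitfalls" can even be checked automatically. Here's a minimal, hypothetical checker (real audits use dedicated accessibility tools; this only illustrates the kind of check involved) that flags one classic pitfall, images with no alt text:

```python
# A rough sketch of an automated accessibility check: scan markup for
# <img> tags lacking an alt attribute, one of the standard pitfalls a
# screen reader can't work around on its own.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.missing = []  # src of each <img> found without an alt attribute

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "img" and "alt" not in attr_map:
            self.missing.append(attr_map.get("src", "?"))

checker = AltTextChecker()
checker.feed('<img src="logo.png" alt="Company logo"><img src="chart.png">')
print(checker.missing)  # → ['chart.png']
```

Running something like this in a build, plus trying the pages in an actual screen reader now and then, covers a surprising amount of the ground.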
Most IPv6-certified Home Network Gear Buggy
I believe Apple's routers run a version of BSD, and they've had IPv6 support for years. Apple is an interesting mix of flashy products that tend to be on the expensive side with fairly decent underlying technology. It's a mistake for techies to become fans or enemies of particular vendors. That approach to the world is fine for football fans, but not so useful for people making technology decisions.
Most IPv6-certified Home Network Gear Buggy
I have little sympathy for the ISPs. No devices support IPv6 because there's no evidence that any of the networks for which they are intended has any plan to implement IPv6 within the lifetime of the products. There are enough Apple routers out there to run a trial. What we need is for the ISPs to turn on support, and for a couple of intrepid web sites to put up attractive content. (An IPv6-only free porn site would be ideal.) Final debugging is going to occur only with real use, and you can't get real use if the pipes don't support IPv6. If the major ISPs even supported decent IPv6/v4 gateways in the right part of their architecture, one could turn on tunneling, which seems to be supported by all real IPv6 implementations.
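For anyone curious where their own setup stands, a quick local sanity check is easy (a sketch; the AAAA lookup result naturally depends on your resolver and the site queried):

```python
# Quick local IPv6 sanity check: is the stack built with IPv6 support,
# and does a given host publish an IPv6 (AAAA) address?
import socket

print("stack compiled with IPv6:", socket.has_ipv6)

def has_aaaa(host: str) -> bool:
    """Return True if `host` resolves to at least one IPv6 address."""
    try:
        return bool(socket.getaddrinfo(host, None, socket.AF_INET6))
    except socket.gaierror:
        return False

# Hypothetical usage; the answer depends on the site and your resolver:
# has_aaaa("example.com")
```

Of course, having an IPv6-capable stack is the easy part; the question the article raises is whether anything between you and the content actually carries the packets.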
Which Language To Learn?
VLC Developer Takes a Stand Against DRM Enforcement
IANAL, but I believe that in exercising editorial control over what applications to accept, Apple can't claim to be a hosting company or the mailman. I am not yet sure, however, that they are violating v2 of the GPL, since the terms seem to say that if there is a license specific to the product, it supersedes the generic iTunes one.
VLC Developer Takes a Stand Against DRM Enforcement
It's not so obvious to me that the Apple Store is *not* noncommercial distribution. Normally commercial means that it's charging. iTunes doesn't charge for free software. They offer other products that are commercial, but is the VLC distribution?
How Not To Design a Protocol
These protocols were designed for a different world:
1) They were experiments with new technology. They had lots of options because no one was sure what would be useful. Newer protocols are simpler because we now know what turned out to be the most useful combination. And the ssh startup isn't that much better than telnet. Do a verbose connection sometime.
2) In those days the world was pretty evenly split between 7-bit ASCII, 8-bit ASCII, and EBCDIC, with some even odder stuff thrown in, and those systems naturally wanted to exchange data. These days protocols can assume that the world is all ASCII (or Unicode embedded in ASCII, more or less) and full duplex; it's up to the system to convert if it has to. They also didn't have to worry about NAT or firewalls. Everyone sane believed that security was the responsibility of end systems and that firewalls provide only the illusion of security (something that is still true), and that address space issues would be fixed by revving the underlying protocol to have larger addresses (which should have been finished 10 years ago).
3) A combination of patents and US export controls prevented using encryption and encryption-based signing right at the point where the key protocols were being designed. The US has ultimately paid a very high price for its patent and export control policies. When you're designing an international network, you can't use protocols that depend upon technologies with the restrictions we had on encryption at that time. It's not like protocol designers didn't realize the problem. There were requirements that all protocols had to implement encryption. But none of them actually did, because no one could come up with approaches that would work in the open-source, international environment of the Internet design process. So the base protocols don't include any authentication. That is bolted on at the application layer, and to this day the only really interoperable approach is passwords in the clear. The one major exception is SSL, and the SSL certificate process is broken*. Fortunately, these days passwords in the clear are normally on top of either SSL or SSH. We're only now starting to secure DNS, and we haven't even started SMTP.
*How is it broken? Let me count the ways. To start, there are enough sleazy certificate vendors that you don't get any real trust from the scheme. But setting up enterprise cert management is clumsy enough that few people really do it, so client certs aren't used very often. And because of the combined cost and clumsiness of issuing real certs, there are so many self-signed certs around that users are used to clicking through cert warnings anyway. Yuck.
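The option haggling mentioned in point 1 is easy to see in miniature. Here's a sketch of Telnet's negotiation (byte codes from RFC 854) in the form of a peer that simply refuses every option offered to it — names and the byte layout are real, but this toy parser ignores subnegotiation and split buffers:

```python
# Telnet option negotiation byte codes (RFC 854):
# IAC introduces a command; WILL/WONT announce the sender's options,
# DO/DONT request them of the peer.
IAC, WILL, WONT, DO, DONT = 255, 251, 252, 253, 254

def refuse_all(stream: bytes) -> bytes:
    """Answer every IAC DO with WONT and every IAC WILL with DONT."""
    out = bytearray()
    i = 0
    while i < len(stream):
        if stream[i] == IAC and i + 2 < len(stream):
            cmd, opt = stream[i + 1], stream[i + 2]
            if cmd == DO:
                out += bytes([IAC, WONT, opt])   # refuse to enable it here
            elif cmd == WILL:
                out += bytes([IAC, DONT, opt])   # refuse it on the peer
            i += 3
        else:
            i += 1  # ordinary data byte, not negotiation
    return bytes(out)

# Peer asks us to DO ECHO (option 1) and announces WILL SUPPRESS-GA (3):
reply = refuse_all(bytes([IAC, DO, 1, IAC, WILL, 3]))
print(list(reply))  # → [255, 252, 1, 255, 254, 3]
```

Every option had to be haggled over this way at connection start, which is exactly the kind of open-ended flexibility newer protocols dropped once the useful combinations were known.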
Desktop Linux Is Dead
Whether Linux is realistic depends upon what you want to do. I use a Mac as my primary way to access TV, and to some extent movies. As far as I can tell the only major source of video content is iTunes. I'd love to find an alternative, but there doesn't seem to be any. Hulu only keeps some shows, and you can't rely on which ones. It seems unlikely that the major content providers will be interested in supporting Linux unless things change more than I think they will. It's pretty clear that if Apple hadn't pioneered with iTunes, they wouldn't even support the Mac. I find it odd that content providers haven't provided a credible alternative. It seems in their interests. But it looks like only Apple has enough clout to force them to do something reasonable.
How Can an Old-School Coder Regain His Chops?
The problem with books is that most people learn by doing, and toy problems don't teach you what a real application is like.
I'd suggest picking an open-source project and doing something with it. Depending upon the type of programming you want to do, add something to Linux, OpenOffice, or any number of Java-based things. (I'm currently working with the Sakai course management system. There are plenty of things that need doing there.)
The languages aren't any worse than what you're used to. The problem is that real programming these days tends to involve lots of complex libraries and frameworks. Those are hard to learn in the abstract, which is the reason for my advice.
Whether it makes sense for someone to (re)enter programming as a job I can't say. That's a decision for you. There are a lot of problems with the profession. But there are also lots of important things that need to be done, and a lot of the people who think they're programmers aren't up to it. Programming approaches change often enough that skills go out of date in a few years. That's both good news and bad news for people like you: since people have to learn new techniques all the time anyway, it's not like you have to relive the whole last 30 years.
The language depends upon what you want to do. Systems software and desktop applications typically use C-based stuff (C++ is probably the best place to start, although Objective-C and other things have advantages). Web applications use Java or .NET; I'd probably start with Java. You can find real and interesting applications in just about any language, so you can argue for Python, Ruby, and all sorts of other stuff, but C++ and Java are probably the place to start. I keep hoping that there will be some major new programming technology to use multiple processors and cores well. There are lots of nice demos, but so far I haven't seen an approach that looks like it's going to really take off, which is pretty discouraging. I wish things were more different from when you were programming, but C++ and Java are only slight improvements on what you're used to. It's really the libraries and frameworks that are new.
Climategate and the Need For Greater Scientific Openness
But not releasing your data plays into their hands. I understand that you'll then have to deal with what you think is irresponsible use of the data, but it's a lot better to say "they don't know what they're talking about" than to hide your data. The former is a scientific disagreement, or maybe even a disagreement between scientists and quacks, but the latter is an admission of guilt.
Copernicus Reburied As Hero
In case anyone is interested, I just looked to see what was actually done about Copernicus. No action was taken during his lifetime. During the Galileo affair, motion around the sun was declared to be erroneous and heretical, and Copernicus' major work was taken out of circulation for four years, until it could be "corrected." Nine or ten corrections were made, which appear to have been simply inserting the word "hypothetically" or equivalent, on the grounds that it was a hypothesis that hadn't been proven.
Note that I am not defending the actions of the Catholic Church. I just thought people might want to know what they were. The uncorrected version was put on the Index. The "corrected" version was not, so it continued to circulate. The source I looked at (http://hsci.ou.edu/exhibits/exhibit.php?exbgrp=1&exbid=14&exbpg=4) says that there was no official finding that Copernicus was heretical, although it appears that there was a general condemnation of heliocentrism (at least this is how I read a couple of seemingly contradictory statements).
Is the Tide Turning On Patents?
The proper protection for software is copyright. Patents aren't needed to protect software. The development work goes into the code, which needs and has protection available via copyright. Patents protect the idea. I have yet to see a software patent that isn't obvious, except for public-key cryptography. And that patent is a major cause of the current Internet security problems. (It was filed during the same time that the basic Internet protocols were being designed. If it hadn't been present, it is highly likely that cryptographic checks would have been built into many of the protocols. While that wouldn't solve all security problems, it would leave us in a lot better shape than we are now.)