Training Materials for NSA Spying Tool "XKeyScore" Revealed
When I first got onto the Internet in the early 1990s, there were three things that were made quite clear to me when I was given my account:
- Don't put anything onto the Internet you wouldn't want seen on the front page of the New York Times; it will be available for all to see and it will never be deleted;
- The Internet is a public space and there is no expectation of privacy in public; and
- The packets that make up your communications are not letters but postcards -- anyone on the way between you and the destination can read everything.
The NSA claims they are simply collecting Call Detail Records (CDRs) and packet headers, although likely more is being collected. But seeing CDRs and IP headers is no different from watching me walk down the street. Seeing the packets of my Google session is no different from knowing that I walked from my house to the nearest pizza shack. Anybody and everybody could see me do it, but that doesn't mean my privacy was violated -- I did all of these things in public!
People should not be surprised or upset that this information is available to be collected, because that is the cost of using the Internet. You are intentionally sharing information with third parties in the interest of obtaining a service. Even the snooping of email in Gmail or Yahoo should not be surprising, because you shared that information with a third party (the service provider), and the provider has different legal requirements than if you had shared that information directly and exclusively with your interlocutor.
If you are upset about the Internet being public, then you should stop wasting your breath complaining about how what you thought was private is actually public, and instead start advocating for the widespread use of encryption algorithms and always-on SSL. You should start advocating for the ability to run servers (mail and web) on residential connections so you don't have to share "private" information with third-party providers. You should advocate for rolling out IPv6 instead of being lazy and claiming that unencrypted NAT-ed IPv4 is good enough security.
And when you're done advocating, lead by example and use these technologies yourself.
Just because you think something is private and secure doesn't mean that it is.
A Humanoid Robot Named "Baxter" Could Revive US Manufacturing
I'm not calling you out personally, but damn it, am I tired of hearing people suggest corporate taxes as a way of funding governments, as if this somehow saves the average citizen from that burden.
Taxes are paid from the revenues of the company. Revenues come mostly from sales, though a small part may come from interest-bearing investments. The cost of paying corporate taxes must therefore go into the very price of the product you're buying. So, the average citizen ends up paying that corporate tax each time they buy the product.
"Oh, but let's make it illegal for companies to pass that cost onto the consumer!" Brilliant! Assuming this is even possible, the result would be a lowered profit margin. And where do those profits go? For a private company they go to the owner -- a citizen. For a public company they go to the shareholders. Lowered profits mean lowered dividends and less of a return on your investments. Congratulations, you just raided your pension fund and 401(k).
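A toy calculation makes the pass-through argument concrete. All numbers here are hypothetical, and the model is deliberately simplistic (one product, one tax, a seller who targets a fixed after-tax profit):

```python
# Toy model (hypothetical numbers): a product with an $80 unit cost that
# the seller prices to keep $20 of after-tax profit per unit.
def price_for_target_profit(unit_cost, target_profit_after_tax, tax_rate):
    # To keep the same after-tax profit, the pre-tax profit must be
    # grossed up -- and that gross-up lands in the sale price.
    pre_tax_profit = target_profit_after_tax / (1.0 - tax_rate)
    return unit_cost + pre_tax_profit

no_tax_price = price_for_target_profit(80.0, 20.0, 0.00)   # 100.00
taxed_price  = price_for_target_profit(80.0, 20.0, 0.21)   # ~105.32

print(f"price with no corporate tax: ${no_tax_price:.2f}")
print(f"price with 21% corporate tax: ${taxed_price:.2f}")
```

Either the price goes up (the consumer pays the tax) or, if the price is frozen, after-tax profit drops from $20 to $15.80 per unit and the shareholders pay it instead.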
While it might feel good to say that we should just tax the rich and evil corporations, whenever you raise corporate taxes you're just hurting yourself.
Ask Slashdot: How To Feed Africa?
Square foot gardening might be a viable option.
It turns out that the instructions on most seed packets are designed for planting in rows suitable for mechanical processing. The seeds can be planted much more closely together when a human hand is doing the tending. And, when you plant closely together there is little room for weeds, so weeding is a once-a-month activity.
You can make a raised-bed planter 4' by 4' in size and only 7" deep, fill it with a mixture of 1/3 compost (manure), 1/3 peat moss, and 1/3 vermiculite, and you have a planter capable of feeding one person for an entire growing season. Once the mixture has been made, you only need to add compost occasionally, and the mix stays good for five years.
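For anyone pricing out materials, the arithmetic for that bed is straightforward. This is just the geometry of the 4' x 4' x 7" box above, split into equal thirds:

```python
# Rough materials estimate for the 4' x 4' x 7" raised bed described above.
width_ft, length_ft, depth_ft = 4.0, 4.0, 7.0 / 12.0   # convert 7" to feet

total_cu_ft = width_ft * length_ft * depth_ft          # ~9.33 cubic feet of mix
per_component = total_cu_ft / 3.0                      # equal thirds

print(f"total mix needed: {total_cu_ft:.2f} cu ft")
print(f"compost / peat moss / vermiculite: {per_component:.2f} cu ft each")
```

So roughly three cubic feet of each component fills the bed.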
Mel Bartholomew, the inventor of Square Foot Gardening, is a process engineer, and he has spent the last 20 years optimizing the heck out of this growing method. He's traveled the world working with organizations and governments to help people grow their own food.
Algorithmic Trading Rapidly Replacing Need For Humans
The purpose of the stock market is to provide price discovery. If you had perfect information at all times you would know the price of a good and the stock market would be pointless. But because perfect information is impossible, the stock market crowd-sources the gathering of information so that the true price can be discovered.
Determining the price of a good is something only a human can do. Price is value quantified, and determining value requires sifting and filtering of information and the application of significant amounts of gut instinct. Computers cannot set prices since they have no concept of value -- they have neither needs nor wants.
Computer assisted trading -- trades where people set stops and buy limits -- is okay because the human has done the work to determine the valid price ranges a priori; the computer simply executes the bid on behalf of the user.
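The distinction above can be sketched in a few lines. This is a deliberately minimal, hypothetical rule set, not a real trading system: the human has already chosen the buy limit and stop-loss prices; the program only executes within those bounds.

```python
# Minimal sketch of computer-assisted trading as described above: the human
# decides the acceptable prices a priori; the computer merely executes.
def decide_action(last_price, buy_limit, stop_loss, holding):
    # Hypothetical rules: buy at or below the human-set limit,
    # sell at or below the human-set stop-loss.
    if not holding and last_price <= buy_limit:
        return "buy"
    if holding and last_price <= stop_loss:
        return "sell"
    return "hold"

print(decide_action(98.0, buy_limit=99.0, stop_loss=90.0, holding=False))  # buy
print(decide_action(89.5, buy_limit=99.0, stop_loss=90.0, holding=True))   # sell
print(decide_action(100.0, buy_limit=99.0, stop_loss=90.0, holding=False)) # hold
```

Every value judgement here happened before the program ran -- which is exactly what separates this from HFT, where the machine acts with no human valuation behind each trade.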
High Frequency Trading, however, should be illegal, since it does not involve human value judgements at all. It simply allows a computer to front-run actual humans and siphon value off the people attempting to perform a useful act -- that is, price discovery.
Go, Google's New Open Source Programming Language
One of the key problems with Exceptions is that they tend to violate security models.
It is probably not common knowledge that C++ exceptions on the Windows platform are implemented using Structured Exceptions, an exception mechanism built into the Operating System that works even in C. I suggest reading up on Structured Exceptions if you are not already familiar with them.
Let's say you have implemented a Windows Service in C++, and that at some point you change your thread's security token. Changing a thread security token is usually done to perform an action on behalf of a client using the client's credentials. Unfortunately, the Structured Exception system has no idea that you have changed the security token of your thread. If an action performed on behalf of the user while using the user's token throws an exception that is not caught within the scope of the function that altered the thread security token, well, congratulations, you've just violated your security model. Why? Because the exception handler that does catch the exception is in a parent scope and is now executing with the user's security token rather than the original security token of the process. At best this means your exception handler may fail to work; at worst it can lead to privilege escalation.
And, no, using catch(...) is not a valid way of solving this security problem.
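The control-flow hazard is language-neutral, so here is a sketch of it in Python for illustration. The variable `current_token` stands in for the thread's security token; these names are hypothetical, not a real Windows API:

```python
# Sketch of the hazard described above: ambient state (the "token") is
# changed, an exception skips the restore, and the parent-scope handler
# runs with the wrong identity.
current_token = "service"

def do_work_as_client():
    raise RuntimeError("boom")

def handle_client_request():
    global current_token
    current_token = "client"    # impersonate the caller
    do_work_as_client()         # throws!
    current_token = "service"   # never reached when an exception is raised

try:
    handle_client_request()
except RuntimeError:
    # The handler in the parent scope is now running while still
    # "impersonating" the client -- exactly the violation described above.
    print("handler sees token:", current_token)   # client, not service
```

The fix is to guarantee the restore on every exit path -- `try`/`finally` here, or an RAII guard object in C++ whose destructor reverts the token -- so the exception can never escape a scope that is still impersonating.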
Using the Sea To Cool Your Data Center
I'm not too sure about the anthropogenic global warming, but I'm starting to come around to it. Earlier my contention was that global warming scientists are causing global warming, but I'm beginning to think that maybe -- just maybe -- computers in general might be the cause. I mean, if computers are having to pump cold water from the ocean depths to cool computers, that's gotta be dumping a lot of heat back into the ocean, right? Right...?
Is Typing Ruining Your Ability To Spell?
Typing is much faster than writing, but I find that typing all day does not affect my ability to spell when writing by hand. In fact, because handwriting is more permanent (there is no backspace), I find its slower pace actually improves the way I write, for I have to spend more time thinking rather than just typing and correcting.
No doubt many others are in the same position when I say that my script writing (cursive) looks terrible, but that is merely a lack of practice. Bad script is not a modern invention, however: at a naval museum in Virginia, I could not decipher the captain's log book for all of his chicken-scratchings.
My block printing is readable, albeit very small -- I write in an eight-point font.
My biggest lament about computers and the Internet is that they have reduced the already small working set of words to what may truly be the lowest common denominator. There are many fine words in the English language that I would love to use regularly but cannot, because people don't know their definitions or dismiss me as an arrogant SOB for using big words. The flip side is that when people can't spell "definitely" (definatley) or "lose" (loose), I immediately stop reading and disregard their comments regardless of quality or pertinence.
Games Fail To Portray Gender and Ethnic Diversity
I wish I had mod points. I would mod that funny plus a million.
Windows 7 Clean Install Only In Europe
This is seriously off-topic.
I've always had trouble with that line of reasoning, and it has led to a number of situations where Microsoft has had to disappoint users while simultaneously having to kowtow to competitors.
Microsoft developed the COM model to make reusing binary components from within other applications easier. The HTML rendering component (mshtml.dll) was included in Windows not only to provide support for Internet activities (iexplore.exe is simply a top-level window hosting the HTML component), but also to provide developers with a useful component for rendering HTML, then an emerging method for formatting text. Not being familiar with the details of the anti-trust case, my next point is merely conjecture: as long as Microsoft did not knowingly prevent the installation of other browsers, I still don't see how distributing the iexplore.exe executable could be considered abuse of a monopoly position. Besides, the interfaces supported by the HTML COM component were documented, so it was even conceivable that someone could create a drop-in replacement for the one provided by Microsoft.
While the monopoly abuse case has long been decided, the ramifications are still being felt. Microsoft has always had a huge focus on developers (see Ballmer's DEVELOPERS! DEVELOPERS! DEVELOPERS! speech on YouTube). My understanding -- and this could be wrong, it's a hazy recollection -- was that Microsoft would have liked to release a free compiler for Windows long ago but was prevented from doing so by the possibility of being anti-competitive. Yes, they have the Visual Studio Express Edition now, but it could have been released years earlier. In addition, each new release of Visual Studio Professional brings with it hopes for improved GUI components, hopes that are dashed each time. Why? MS can't release component packs because they would be anti-competitive with other component providers. Ten years ago Borland C++ Builder had so many built-in components (the Delphi Visual Component Library) that it made developing apps a breeze. Visual Studio, by comparison, gives you bearskins and stone knives. So, for the sake of competition, I have to not only buy Visual Studio for thousands of dollars, but then go and spend more money on third-party components just to make an application that follows the Windows UI guidelines.
It seems distinctly unfair that a company that wants to provide things its customers want is unable to.
Are Code Reviews Worth It?
Design reviews are useful for catching problems early on, particularly the selection of poor algorithms or data structures. However, in the software shops where I've worked, nobody does any documentation, which makes it particularly difficult to do a design review. So, as you can imagine, I haven't got much experience in this area.
Code reviews, on the other hand, are more about auditing code to make sure that people are following the coding standards and policies. After all, if you've got coding standards, how are you supposed to tell whether anybody is following them without reviewing the code? Once your coding standards have been institutionalized -- that is, most people have internalized them and follow them -- then what's the point of a code review?
So your best bet, then, is to reserve code reviews for junior developers until the coding standards are internalized; and use design reviews for the architects.
New Languages Vs. Old For Parallel Programming
I have not read the article (par for the course here), but I think there is probably some confusion among the commenters regarding the difference between multi-threaded programs and parallel algorithms. Database servers, asynchronous I/O, background tasks, and web servers are all examples of multi-threaded applications, where each thread can run independently of every other thread with locks protecting access to shared objects. This is different from (and probably simpler than) parallel programs. Map-reduce is a great example of a parallel distributed algorithm, but it is only one parallel computing model: Multiple Instruction / Multiple Data (MIMD). Single Instruction / Multiple Data (SIMD) algorithms implemented on supercomputers like Cray (more of a vector machine, but it's close enough to SIMD) and MasPar systems require different and far more complex algorithms. In addition, purpose-built supercomputers may have additional restrictions on their memory accesses, such as whether multiple CPUs can concurrently read or write memory.
Of course, the Cray and MasPar systems are purpose-built machines, and, much as purpose-built processors have fallen behind general-purpose CPUs in performance, Cray and MasPar systems have fallen into disuse and virtual obscurity; therefore, one might argue that SIMD-type systems and their associated algorithms should be discounted. But there is a large class of problems -- particularly sorting algorithms -- well suited to SIMD algorithms, so perhaps we shouldn't be so quick to dismiss them.
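The MIMD/SIMD distinction above can be sketched with one and the same reduction. Pure Python stands in for real hardware here, so this only illustrates the shape of the two models, not their performance:

```python
# The same sum expressed two ways: MIMD-style (independent workers, each
# running its own instruction stream on its own chunk) and SIMD-style
# (one operation applied across all "lanes" per step).
from concurrent.futures import ThreadPoolExecutor

data = list(range(1_000))

# MIMD-flavoured: each worker independently sums its own chunk.
def chunk_sum(chunk):
    return sum(chunk)

chunks = [data[i:i + 250] for i in range(0, len(data), 250)]
with ThreadPoolExecutor() as pool:
    mimd_total = sum(pool.map(chunk_sum, chunks))

# SIMD-flavoured: a log-depth tree reduction -- at every step, a single
# "instruction" (pairwise add) is applied across all remaining lanes,
# as on a vector machine.
lanes = data[:]
while len(lanes) > 1:
    if len(lanes) % 2:
        lanes.append(0)                  # pad so lanes pair up evenly
    lanes = [lanes[i] + lanes[i + 1] for i in range(0, len(lanes), 2)]
simd_total = lanes[0]

print(mimd_total, simd_total)            # both 499500
```

Note how the SIMD version forced a restructuring of the algorithm (padding, lock-step rounds) even for something as simple as a sum -- which is the "far more complex algorithms" point above.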
There is a book called An Introduction to Parallel Algorithms by Joseph JaJa (http://www.amazon.com/Introduction-Parallel-Algorithms-Joseph-JaJa/dp/0201548569) that shows some of the complexities of developing truly parallel algorithms.
(Disclaimer: I own a copy of that book but otherwise have no financial interests in it.)
Microsoft Releases New Concurrent Programming Language
Off topic, but here goes.
The package must be shipped as a Windows Installer simply because it's got .NET objects in it. These objects must be installed in the Global Assembly Cache (GAC), which means they must be versioned and reference counted. It is possible (though unlikely) that the installer doesn't even create any registry entries.
Now, .NET was supposed to give us "xcopy installs", so it's possible that MS could ship a ZIP SDK package; but then you'd be responsible for lugging around all of your dependencies from install to install of your own software. Plus, then MS would have to manage two different installation packages, and we all know how easy it is to keep different versions of the same thing in sync.
Barack Obama Sworn In As 44th President of the US
> The wording of this amendment is intentionally vague. If it was overly strict, the constitution would quickly become irrelevant as the times changed.
Ye gods man! Have you never heard of contract law?
The Constitution is a contract between the people of the US and the Federal Government. It was intentionally written so that the common man could understand it. Indeed, many of the phrases, such as "general welfare", were well accepted and understood common-law phrases. If every contract you signed could be interpreted "according to the times", how is it that any contract could be enforced?
The US Constitution includes within it the mechanism whereby it may be altered -- it's called amendments. Amendments are supposed to be difficult to pass because they affect the whole country as opposed to some portion thereof. Just because it is hard to modify the Constitution, though, does not mean that it should simply be reinterpreted to suit one's needs. THAT is tyranny.
IEEE Says Multicore is Bad News For Supercomputers
Faster computation doesn't help communication-limited tasks. Faster communication doesn't help computation-limited tasks.
Computation is communication. It's communication between the CPU and memory.
The problem with multicore is that, as you add more cores, the increased bus contention causes the cores to stall so that they cannot compute. This is why many real supercomputers have memory local to each CPU. Cache memory can help, but just adding more cache per core yields diminishing returns. SMP will only get you so far in the supercomputer world. You have to go NUMA for performance, which means custom code and algorithms.
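The NUMA-friendly pattern described above -- keep each worker in its own local memory and touch shared state as rarely as possible -- can be sketched in a few lines. Python's GIL means this shows the structure of the pattern, not a real performance win:

```python
# Sketch of the "memory local to each CPU" pattern: each worker reads only
# its own stripe of the data and writes only its own partial result, so
# workers never contend on a shared accumulator. Shared state is touched
# exactly once, in the final combine step.
from concurrent.futures import ThreadPoolExecutor

data = list(range(10_000))
n_workers = 4

def local_sum(worker_id):
    stripe = data[worker_id::n_workers]   # this worker's private slice
    return sum(stripe)                    # accumulate in "local memory"

with ThreadPoolExecutor(max_workers=n_workers) as pool:
    partials = list(pool.map(local_sum, range(n_workers)))

total = sum(partials)                     # the only shared step
print(total)                              # 49995000
```

On real NUMA hardware the same shape -- partition, compute locally, combine once -- is what keeps the interconnect from becoming the bottleneck; a naive version where every core increments one shared counter is exactly the bus-contention stall described above.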