GCC Moving To Use C++ Instead of C
Seeing as how GCC now supports them both, why not use one of them instead?
Thawte Will End "Web of Trust" On November 16
You don't end up trusting almost everybody; you end up with a bunch of untrusted bullshit keys in your keyring. The relatively small size of the web of trust is the problem: it's difficult to actually rely upon trust, so you probably just rely on the existence of a key. Even then, more people sign stuff with PGP/GPG than actually encrypt stuff, even when they have a key for the recipient.
It's an authority and leadership problem. The thing the email cert dealers miss out on, in my opinion, is the sale of directory services. If a dozen or so CAs could come up with a giant LDAP server of people they've verified and issued certificates to, and you could just plug that into your email client, then you could rely upon a centralized database of public keys that are all trusted. The owners could perhaps set some mail preferences ("I prefer encrypted mail," "I prefer signed mail," etc.) and then the whole thing could develop some use. Thing is, people don't want to be "listed in the phone book" on the internet. PGP/GPG could possibly do some similar things; you could build services on top of them such that everyone published a public key to a central server and then maybe used some social-networking-type stuff to encourage the web of trust to expand.
Either way, it seems like a central directory is pretty key to email certs working well, trusted CA or not. How do I know who has a key until we've already emailed? That alone prevents encryption, and if it's only authentication then the value proposition drops. In fact, even Zimmermann suggests not signing most of your emails...
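To make the directory idea concrete, here is a toy sketch of what a client-facing lookup could look like. This is just an in-memory stand-in for what would really be a CA-populated LDAP service; all class names, emails, and preference fields here are made up for illustration.

```python
# Toy model of a centralized key directory with owner mail preferences.
# In a real deployment this would be an LDAP server fed by cooperating
# CAs; here it's a dict, purely to show the lookup-before-compose flow.

class KeyDirectory:
    def __init__(self):
        self._entries = {}

    def publish(self, email, public_key, prefers_encrypted=False,
                prefers_signed=False):
        # In the real service, a CA would verify the owner's identity
        # before accepting this entry.
        self._entries[email] = {
            "public_key": public_key,
            "prefers_encrypted": prefers_encrypted,
            "prefers_signed": prefers_signed,
        }

    def lookup(self, email):
        # A mail client would call this before composing a message, to
        # decide whether it can (and should) encrypt or sign.
        return self._entries.get(email)

directory = KeyDirectory()
directory.publish("alice@example.com", "-----BEGIN PUBLIC KEY-----...",
                  prefers_encrypted=True)

entry = directory.lookup("alice@example.com")
unknown = directory.lookup("bob@example.com")  # never emailed: no key
```

The point of the sketch is the failure mode in the comment above: until a recipient is in the directory, `lookup` returns nothing and the sender has no way to encrypt at all.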
The Duct Tape Programmer
I've seen these threads the last couple days about this blog. There are a couple of things worth putting into perspective. It seems we've all worked with guys who get a lot done fast and generally try to keep things simple, maybe a bit too simple, and, as Joel mentioned, get the go-kart running fast. The big, big, big difference with a duct tape guy like JWZ is that he's rich. He essentially succeeded. It doesn't matter what the long-term sustainability of the project is; it doesn't matter that Netscape is dead and IE is in a world of hurt (is it? I don't even know who considers that the "go-to" browser any more; nobody I know really uses it). He came in, he got it done, he got paid, and he moved on.

Were there engineering problems that hurt Netscape? Absolutely. Were they the sole and main problem Netscape had? Probably not. Is a "duct tape" JWZ anything like Crufty Joe? Maybe in that they both could work fast, but believe me, JWZ doesn't maintain his code. It was good enough, and that balance of what "good enough" meant was enough to cash out and enough to change the world. I simply can't remember anyone complaining about Netscape 1.1's reliability in any serious way; it was light-years beyond Mosaic, and any problems it had were completely outweighed by all the great stuff it actually did. I think one of the benchmarks for an official "duct tape programmer" is success: the product works, or there is some financial success or some other type of success. Crappy spaghetti that needs constant maintenance is just that, crap.
Can you emulate what JWZ did? Sure. Does that mean you'll be successful? I doubt it. This discussion isn't about unit tests or dev methodologies in the traditional sense. It's about guys who threw that stuff out and achieved success in spite of it. It's much more subtle than being a hack that gets stuff "done." It's about being a hacker that gets stuff Done (with a big D). The one you want on your team is the rare individual who can guide you towards riches. The other? Well, you've probably got them around, they're probably paid too much, and they actually cost the organization over the long term rather than help it.
According to Linus, Linux Is "Bloated"
There are probably some concepts from microkernels that can be of use. I think it's more of an inertia kind of problem.
There is tons of code reuse in Linux already; it just might be at the wrong level. Take drivers, for example: there are basically three kinds of drivers: network, character, and block. Essentially every driver is one of those. Some subsystems, like ieee1394 and USB, add some new types of drivers on top of that and provide a framework for them. There isn't exactly an ethernet device framework, though: every ethernet device is basically a network device and then implements all the chip-specific stuff, and in some cases that code could probably be identical between chips but for a couple of registers that differ. By the time there is enough pressure to come up with a "WiFi device" framework, half the devs who contributed WiFi drivers have abandoned them and moved on; they wrote them, got them working, got them included, debugged them, and they're done. This is a double-edged problem: you want to keep the hardware support, but you don't want to change drivers without testing them, and ideally the original authors will play along and help out, but they aren't always around.
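The framework idea above can be sketched in miniature. This is a toy model in Python (the real thing would of course be kernel C); the class, register names, and values are all invented, purely to show how a framework can own the logic every driver repeats while a chip driver supplies only the register details that actually differ.

```python
# Toy model of an "ethernet device framework": common bring-up logic
# lives in the framework, and a chip-specific driver provides only the
# register map and accessors. All names and offsets are hypothetical.

class EthernetFramework:
    """Shared behavior every ethernet driver would otherwise duplicate."""

    def up(self):
        # Common bring-up sequence, identical across chips: enable the
        # controller, then check link status.
        self.write_reg(self.CTRL_REG, self.CTRL_ENABLE)
        return self.read_reg(self.STATUS_REG) == self.STATUS_LINK_UP

class FakeChipDriver(EthernetFramework):
    """Chip-specific part: just the register layout and accessors."""
    CTRL_REG = 0x00
    STATUS_REG = 0x04
    CTRL_ENABLE = 0x1
    STATUS_LINK_UP = 0x1

    def __init__(self):
        self._regs = {}

    def write_reg(self, offset, value):
        self._regs[offset] = value

    def read_reg(self, offset):
        # Pretend the link comes up once the controller is enabled.
        if offset == self.STATUS_REG and self._regs.get(self.CTRL_REG):
            return self.STATUS_LINK_UP
        return self._regs.get(offset, 0)

nic = FakeChipDriver()
link_ok = nic.up()
```

If a chip driver's author disappears, the framework half of the code still gets maintained and tested for everyone, which is exactly the upside the comment is describing.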
You could probably make some great arguments about the VFS and LVM layers too; they have caused tons of public arguments over the years. For many things they are very well-defined interfaces; for some new and exotic things (think ReiserFS 4) those interfaces are broken. These are hard problems to solve. Moreover, LVM + ext3 work really well for 90+% of the users out there.
There is a bigger missing piece too: Linux hasn't had a "dev branch" in ages. 2.5 took too long to stabilize, so they say, but with a dev branch you can have high-level goals for the entire community: "We won't ship 3.0 until x, y, and z are done... when they're done, they're done." The current model allows for more rapid integration of stuff, but there is always a cost to that. I think Alan Cox suggested a while back that it was time to start a 2.9/3.0 branch in order to start dumping some of the abandoned drivers and such. It's a good thing for the community psychology.
After 8 Years of Work, Be-Alike Haiku Releases Official Alpha
Basically, it's this: unix sucks.
Nice and concise. I'll do you one better: they want some non-UNIX APIs, and they also want a lot of the legacy UNIX APIs at the same time. It seems Microsoft and Be, and really Apple too, think that a proper UI depends upon kernel support, be it for message queues or for various services that need kernel-like superiority over clients. I've been wondering about this for a long time. The X guys don't approach the problem that way, but no matter what kinds of hacks they do, Linux desktops just don't have anything like the cohesive feel of Windows, BeOS, or Mac OS X. Let us not forget, BeOS had terrible networking support and terrible hardware support.
So why didn't Haiku start with a Linux kernel, which is really good at a lot of stuff, add some patches/drivers to provide the missing mechanisms they desire, and then build on top of that? I have no belief that the interfaces the BeOS guys provide would be accepted into the Linux kernel or a BSD kernel, at least not any time soon, but I'd like to see those interfaces defined, and it's a perfect job for a distribution to apply that patch and build a product out of it. Then, at the very least, this new Haiku OS would have a chance in hell of maintaining hardware compatibility and running on interesting stuff.
It also seems that if you came up with a good set of audio APIs and built user-space stuff that used them, you could legitimately take over Linux audio.
I'd love to see something like this succeed, and I applaud their tenacity. It's just hard to get behind writing a kernel from scratch when the only justification offered is a vague explanation about software consistency, given that an existing kernel happens to be very good and have great support. Provide some kernel patches to Linux, and build a completely from-scratch distribution and software model on top...
Does the 'Hacker Ethic' Harm Today's Developers?
Somewhere in the last two decades, "coding" became the singular aspect of "hacking." A lot of other things have happened in that time: a huge number of people have decided, or "interpreted," or who knows what, that "functional programming" is programming with procedures rather than objects; "object-oriented programming" no longer has nearly as much to do with modeling as it does with calling modules "objects"; a web page that does database I/O now constitutes an enterprise application... I suspect it's sort of the backlash of the .com boom. At first we'd hire anyone who could fill a seat as a "web developer," then the 9/11 recession happened and we culled out most of the dead wood, but the die-hards who picked up some skills stuck around. Nowadays they are sort of the senior guys, when in reality nothing could be further from the truth.
Coding is nice and all, but good communication, good engineering, and good design (not graphic design) are parts of the old-skool "hacker ethic." There was a time when, if you touched a program, you tried to leave it better than you found it and to anticipate what the next guy would need or want. That's becoming a rarity, and some modern paradigms like Agile seem to ignore it completely: you're wasting energy and time if you "over-engineer" anything or try to build something before you need to.
Blu-ray Adoption Soft, More Still Own HD DVD
These numbers seem flawed to me. There weren't enough HD DVD players made; still only about a third of US households even have HD displays. According to Wikipedia (yes, I know), Toshiba, the largest HD DVD unit maker, had sold about 1 million units right before they pulled the plug.
Now, a lot of folks might think they have HD TV and have a DVD player that is either 480p or an upscaling one, but that's not HD DVD. It just doesn't seem possible for those numbers to be correct. If you look at the income distribution as well, it suggests to me that the sample set is flawed, if nothing else. Computer ownership went down? HD TV ownership is substantially different from the Nielsen numbers. Original Xbox numbers are consistent, but PS2 numbers went down? The $50k-to-$75k folks own way more gadgets than the $75k+ crowd? 'Splain that to me.
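For scale, a quick back-of-envelope check using the round numbers above. The unit count is from the comment (via Wikipedia); the US household figure is my own assumption, roughly right for the late 2000s.

```python
# Sanity check: even in the most generous case, HD DVD penetration was
# tiny relative to US households.

us_households = 115_000_000          # assumed, approximate 2009 figure
toshiba_hd_dvd_units = 1_000_000     # from the comment, via Wikipedia

# Even if every single unit sold went to a distinct US household,
# penetration would be well under one percent:
max_penetration = toshiba_hd_dvd_units / us_households
print(f"{max_penetration:.2%}")  # prints 0.87%
```

Toshiba wasn't the only maker, but even tripling the unit count keeps penetration in the low single digits of a percent, which is why survey results showing widespread HD DVD ownership look so suspect.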
You've Dropped Your Landline — Now What?
Unless you're somehow going to make it "better," I'd just do nothing with the old wires.
Maybe put nice blank plates over the jacks if it bothers you that much. By "better" I mean fishing Cat 5, Cat 6, or structured wiring to each jack and then home-running them somewhere. A loop is no good; you'll only make what's there worse with any other scheme.
The only thing worse than trying to un-fuck the wiring in a new place you just bought because the last owner did some "project" is being that homeowner and trying to get it all unfucked on your own because an inspector told the potential buyers that the wiring is all screwed up. Trust me on this. Your wife will be at DEFCON 0 with the stress of moving. You'll either be paying two mortgages or dealing with the close on your new place, trying to get things timed just right. (And they never can time things "just right.") The new buyers will be ready to close yesterday, except for the list of stupid crap you need to fix and/or explain. A contractor will want to tear up walls and fix it that way, for a couple grand (maybe more if they know you're bent over the table), and you'll have to re-clean the place with that lovely drywall dust just about everywhere... And it's going to be about 200 degrees in your attic, where you cleverly "hid" most of your dirty work... If you're there forever, then knock yourself out, but if you plan on selling the place, just realize that a lot of people still like to have phones in rooms and phone service (even Vonage or 8x8 or whatever can run over the old loop if you plug it into the house instead of a phone).
Or maybe the new buyer will get a kick out of your "intercom" system or home-brewed HPNA, with the speaker about two feet off the ground where the phone jack was... You never know.
Senator Proposes Nonprofit Status For Newspapers
A big part of the problem is the blending of opinion and news. Fox has a morning show that's not a news show; it's a variety show with lots of opinion. Then at some point, like after 3pm, the news readers stop doing what they're doing and they roll out the prime-time opinion guys: O'Reilly, Hannity, etc. A lot of folks take exception to the opinion guys, and they're co-mingled so much it's really hard to compare news to news. Personally, I find the morning shows the worst, because they trot out a news reader to do some "news" and then just play grab-ass for two or three hours, talking. To some folks, the Today show is actually news, and when Fox and HNN do it, they do inject a very specific brand of bias.
MSNBC and HNN have nearly the same format: a morning variety show with varied opinion (definitely not a "just news" program), some number of hours of news readers, and then opinion guys/gals for prime time.
Nobody from Fox News would ever claim that O'Reilly is a news man. (Well, he might, who knows? His program clearly isn't a news program, though, and even he'd say that.) Same with MSNBC: Olbermann has been very outspoken about the fact that he's paid to give his opinion, that that's the point of his show, and that as such, it's not a news program. It was MSNBC that really botched it with the convention coverage and tried to use the prime-time opinion lineup for news.
Bottom line, though, and it affects papers too: people tend to like to read opinions and editorials, and they seem to like to watch them more than they like real news. If you make either the papers or the broadcast news non-profit, you probably have to dump the opinion content. There is probably a greater problem here if you take a step back: ABC, NBC, and CBS have been scaling back news for decades; they're basically down to a 30-minute evening news broadcast, and that's about it without some sort of entertainment/investigative-journalism spin. More people want to watch Jeopardy than "The News." Making papers non-profit might be a good way to make them cover more news and to protect them a little bit, but it remains unclear to me that people actually want to read news; they kind of like how they get to pick the kinds of "news" they read or watch on their own and listen to the bias.
Even the financial news has become a sham, and if there is ever something you should be able to report on without bias, it's the markets. They do more cheerleading than real news. They're pooh-poohing Jon Stewart's criticism, and he may be the wrong messenger, but his points are 100% valid. Honestly, I think far fewer people watch than you'd think, and you can hardly run one 24-hour network with real news, let alone the dozen or so that we've got. It's hard to put the horse back in the barn.
IE8 May Be End of the Line For Internet Explorer
However, I've wondered whether someday the resource logic won't occur to Microsoft, or the Trident codebase won't become such a problem that the case for dropping it gets stronger. They don't need their own rendering engine to embrace and extend. Using WebKit or Gecko would mean losing any advantage they might get from people coding websites to IE, but they don't need that to try to push Silverlight, or even to keep the world using ActiveX. And rich/active components are probably about the only hope they have of getting any kind of lock on the web again.
How much embrace-and-extend can you do to Gecko or WebKit before you're in the same position again? Admittedly, I think it's all about Silverlight. Why not just let IE kind of die off, encourage users to install a new browser or something, and then focus on plugins for all the other browsers?
The only thing I can think of that would make any sense to me is that this is the worst recession ever; they probably have better forecasting than we do, and they can see it might be decades before the decadence is back, if ever. In that case, they need to cut all the fat they can. They did just RIF a bunch of folks... IE isn't exactly free to develop, and it's not clear they're getting much from it.
IE8 May Be End of the Line For Internet Explorer
I can't imagine that they'd use anything but their own rendering engine.
Lock-in is one thing. The other thing is that they're a huge target for hackers; they have to be responsible for fixing things, and they've never shown that they can play nice with open source, while open source tends to be hostile towards them. If they did use WebKit, it would effectively be a branch, just because of their requirements. They'd be going from zero to full-blown dependence on something they don't completely own.
It'd be huge; I mean, if MS ever wanted to show that they've changed their colors, supporting open browsers would be a start. I just can't see it, though.
Didn't they claim that they couldn't take IE out of Windows?
IBM Offers to Send Laid-Off Staff to Other Countries
Isn't this really about H1Bs and such?
It's a sensational headline, that's for sure, but for the vast majority of Americans in the work force, with families and roots in America, it's not even remotely an option. If you're here in the US on an H1B, then it's a different kind of issue.
Also, FWIW, if it really is labor costs alone (and it probably is), this is about as perfect a time as there is. When you have GM going to Congress begging for money while their laborers make near $80k a year with gold-plated benefits, the public is as open to it as ever.
Plug-In Architecture On the Way For GCC
I don't know that it has been fully established in court; my company's lawyers are scared enough by all of the language, though. The intent is pretty clear: if you use GPLed code in some capacity or leverage it, they want you to GPL your code too. That's always been the idea. In the case of a compiler plugin, depending on how it is linked and other things, if you distribute it at all, you might need to make your code GPL compliant. GPLv3 tried to make that all clearer, and GCC is GPLv3.
I'm not going to push the ideology here, but the last decade or so has shown, in more than a few cases, that the only time this seems to matter is when some company doesn't have the resources to build something but wants to put some tweak on it and sell that. If you're writing some kind of optimizer that you need to keep "secret" but you can't build a full compiler, then it's hard to offer much sympathy. If you're building some sort of static analyzer or something that you need to keep "secret," again, I think there are more than enough holes here: you really just can't link to GCC; write your plugin, GPL it, and just have it dump whatever intermediate form you need. Custom language or custom hardware support? There are probably some more treacherous areas. I'd imagine that some of the better ILs are somehow protected, and I'd also imagine more than a few compiler jocks would like to graft some of that stuff together, you know, GCC's parser and Yoyodyne Corp's optimizer and code generator, stuff like that.
Having worked with more than one chip vendor that "sold" a GCC derivative that supported their hardware, to be completely honest it would have helped our cause and theirs to just GPL the code to begin with.
Everyone thinks compiler plugins are cool for one reason or another, and just about everyone will hate it when there is some interesting plugin that costs $2500 a seat, does some cool stuff, but only runs on Red Hat's enterprise Linux in 32-bit mode... with a version of GCC that is two revs back.
Despite Gates' Prediction, Spam Far From a Thing of the Past
Yes, but it's something that is being run by organized-crime types. With botnets, it doesn't cost them nearly as much to do it, so the benchmark goes down dramatically, and they can probably steal other things and sell them too. If you sift through your spam folder, you'll see a lot of counter-measure spams already in place: just random messages designed to clog up spam filters. The volume of that makes me think there is also an element of vandalism to it. There are even viruses and worms and other malware that just randomly create spam; it's not like there is even thought going on. What's worse, have you ever had a legitimate message end up in your spam filter? I check mine regularly just because of that.
It's really absurd when you take a step back. Google bought Postini to deal with spam; that's a nontrivial investment. Spam filters for Exchange and mail systems can be very costly to a business. Years back, the "good guys" started blacklists, but a lot of legitimate organizations that didn't have the same tech savvy were snared; it was really vigilante-style network defense. Some spammers even took offense at that and escalated things, like they were offended by the attempts to stop spam. To really fix the problem, we need to fix the email protocols: we need strong authentication from SMTP peer to SMTP peer, and we should consider end-user authentication while we're at it. Until we do that, there will be spam. If Bill Gates wanted to help, he'd encourage MSN and the Exchange team to work with Google and come up with a plan to secure SMTP and make it default-on in future versions of Exchange. Before, we had the lame excuse that there were too many different mail servers and clients to do it; now, if you got Google, Hotmail, and Exchange to adopt a new protocol, it could cover a huge percentage of the world, and everyone else would follow suit.
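A minimal sketch of the peer-authentication idea, using only the Python standard library. Real-world mechanisms (SPF, DKIM, DomainKeys) use DNS and public-key signatures; this toy version uses a pre-shared key between two cooperating peers, purely to show the verify-before-accept flow rather than any actual protocol. All names here are hypothetical.

```python
# Toy model: a receiving SMTP peer rejects mail whose envelope doesn't
# carry a valid authentication tag, instead of filtering it after the
# fact. Uses HMAC over the envelope with a pre-shared key.

import hashlib
import hmac

SHARED_KEY = b"negotiated-out-of-band"  # stand-in for real key exchange

def sign_envelope(mail_from, rcpt_to, body):
    """Sending peer computes a tag over the envelope and body."""
    msg = f"{mail_from}\n{rcpt_to}\n{body}".encode()
    return hmac.new(SHARED_KEY, msg, hashlib.sha256).hexdigest()

def accept_message(mail_from, rcpt_to, body, tag):
    """Receiving peer recomputes the tag; forged or tampered mail fails
    the constant-time comparison and gets rejected outright."""
    expected = sign_envelope(mail_from, rcpt_to, body)
    return hmac.compare_digest(expected, tag)

tag = sign_envelope("alice@example.com", "bob@example.org", "hi")
ok = accept_message("alice@example.com", "bob@example.org", "hi", tag)
forged = accept_message("spammer@example.net", "bob@example.org", "hi", tag)
```

The design point is the one in the comment: once acceptance depends on verifiable identity rather than content heuristics, a botnet can't just spray messages, because unauthenticated envelopes never reach a mailbox in the first place.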
Better isn't the right word.
It's ubiquitous, good enough, and it runs everywhere you want it to.