Forest Service Wants To Require Permits For Photography
Knowing how the US government works, they'd probably try to impose a $1000 fine per picture.
Actually, that's per copy per picture. They're counting on Ansel Adams to pay down the national debt.
First Shellshock Botnet Attacking Akamai, US DoD Networks
Time to put on some pants, UNIX/Linux geeks: ain't no operating system out there immune to error.
No fucking kidding, software has bugs. And this is a doozy. It's not the first WTF moment we've seen, and it probably won't be the last.
As with the Y2K problem, though, the proof of the pudding is in the tasting. The real test will come when we look back and measure the impact. Will we see a digital wasteland, a web devastated by shellshock-ing predators? Will we find ourselves living in an online New Jersey of the soul, wretched, empty bit-badlands stretching out to the horizon in every direction? Will the Evil Bit finally be flipped? Or will this be like the day when the public library almost burnt down, but we saved all the books by forming a bucket brigade? It's too early to say, right now. But my guess is that, unlike Microsoft's legacy, the fall-out from this event will be the stuff of a cautionary tale for young systems developers, explaining how all the cleverness in the world won't save you from stupidity, so the only really good system is one that can be patched quickly, effectively and simply.
Kant might also have admitted that, while no straight thing was ever made, quite a few bent things were subsequently straightened.
First Shellshock Botnet Attacking Akamai, US DoD Networks
inputs getting into environment variables which wind up eventually inside of bash.
So we agree. Good-o.
No, you twit. Bash will read the environment variables sent to it by CGI, which populates the environment parameters before you can sanitise the inputs. By the time you're ready to begin parsing and sanitising, the damage is already (potentially) done.
The implications of this are far-reaching, and the only way to be reasonably secure is to patch the bash executable.
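For anyone who wants to check their own box, the canonical probe for the original bug (CVE-2014-6271 only; the follow-on CVEs need their own tests) is a one-liner:

```shell
# Shellshock probe: stuff a function definition with trailing commands
# into an environment variable, then start bash. A vulnerable bash
# executes the payload while importing the variable and prints
# "vulnerable"; a patched bash prints only "test".
env x='() { :;}; echo vulnerable' bash -c 'echo test'
```

This is the CGI scenario in miniature: the attacker controls the variable's value, and bash runs before your script gets any chance to sanitise it.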
Outlining Thin Linux
"Servers" is not just that instance of node.js that you run in your VM. Servers in general do need hotplug (for example, a RAID array of hot swappable hard drives), and there are benefits of using DHCP for networks of servers too.
I think the point was that either udev could be forked or an older version of it could be kept kicking around for servers, and that network-manager wouldn't be needed at all. Device and network client configuration can be done via conf files with minimal effort (especially in context of a managed deployment via Puppet or the like). I agree, by and large. I'd argue that we could even do without udev, if it didn't take more effort to live without than to live with it.
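As a sketch of what "minimal effort" looks like, here is a hypothetical static configuration in Debian's `/etc/network/interfaces` format (interface name and addresses invented for illustration); no network-manager required:

```
# /etc/network/interfaces -- static configuration, no network-manager needed
auto eth0
iface eth0 inet static
    address 192.0.2.10
    netmask 255.255.255.0
    gateway 192.0.2.1
```

A file like this is trivially templated by Puppet or any other configuration-management tool.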
How Our Botched Understanding of "Science" Ruins Everything
Super short version:
Philosophy addresses questions of truth.
Science addresses questions of observation.
Er, No. Science is Natural Philosophy.
If you think philosophy (literally, the love of wisdom) is any less exacting or evidence-based than science, then you have another think coming....
How Our Botched Understanding of "Science" Ruins Everything
How is being scientific at all comparable to nihilism, such that you ceded that point in your head?
Heh, this is a bit like that moment when you explain null vs empty/zero values to a junior programmer. :-)
You seem to be falling victim to the misapprehension that lacking belief is the same as nihilism. But nihilism actually requires a degree of belief in order to be fully achieved. It's the active rejection of religion and morality. In other words, you kind of have to believe that there is nothing. (Absence of evidence vs evidence of absence, and all that....)
Failure to believe in anything is a workable modus vivendi that doesn't imply the explicit rejection of morality. It simply posits that there are no articles of dogma, while accepting that the best available evidence is thus and so... until new evidence arises.
Failure to believe doesn't lead inevitably to despair. It may give rise to constitutional skepticism, but that in itself doesn't have to become unhealthy or drown one in nullity. I experience wonder, poetry, rapture when I hear good music and see good art. I embrace absurdity and humour. I love food, many flavours and smells. I also experience a constant sense of novelty because I have very little certainty about how each day will turn out. The mere possibility of alternatives is often enough to keep boredom and depression away.
Nihilism rejects purpose, meaning and ultimately, hope. It's a nearly impossible condition for most human beings. Failure to believe can likewise be quite upsetting, because it humbles you utterly if you really allow yourself to experience it. I expect it's part of the release that the Buddha talked about. But there's a perverse intensity to the joy that you feel when you're laughing in the face of the void.
'Reactive' Development Turns 2.0
For 20-something years, they've been telling me to be "PRO"-active. Because just plain "active" wasn't good enough anymore. Or at least didn't sound pompous enough.
NOW you want RE-active???
Count your blessings. My boss always wants all my fixes to be retroactive.
Ask Slashdot: Remote Support For Disconnected, Computer-Illiterate Relatives
Gmail optimizes for low bandwidth links.
I didn't know that! Is it something I need to configure?
Good News: No, you don't need to configure anything.
Bad News: Yeah, it's as bad as you remember. The biggest difference is this really condescending message at the top of your screen, saying, "Hi! You're a second-class citizen, so we're sending you to a second-class interface using second-class bytes! NOM NOM!!"
... Or something - I can't remember the exact text; I just remember promising myself I'd find the developer who wrote that and emasculate him with rusted baling wire.
A decent mail client with GMail over IMAP is probably best. Only downloads headers unless you actually load the message.
Ask Slashdot: Any Place For Liberal Arts Degrees In Tech?
English lit. grads can do a variety of jobs, but wouldn't be my first choice for a programmer, unless they could demonstrate strong programming skills.
How very condescending of you. But I would say the same about engineers, CS grads, science and math majors as well. Mostly because I find them generally closed-minded, with a strong tendency toward binary thinking. It is a rare person indeed that is capable of writing truly good code. Those who are capable typically can maintain a balance between left and right brain, holding a wide range of possibilities in their head, visualising very complex models and fluid scenarios, and only in the last instance reducing them to computer logic.
It may seem paradoxical, but the only useful test of a good programmer is whether they program well.
The best team I ever worked on featured an ex-veterinarian, a chemical engineer, a Classics major, one who switched majors from music to sociology, one who did half a law degree, and myself, a theatre/English lit. double major.
The half a lawyer now helps to manage Google's international network. The chemical engineer manages the systems of a globally known company. The musician/sociologist is CTO of a successful SaaS operation. The vet is a senior application designer, and I'm Chief Technologist at a think tank. I'm sure you've done far better, but we haven't done so bad either.
Ask Slashdot: Any Place For Liberal Arts Degrees In Tech?
... employees with STEM degrees have critical thinking skills *and* STEM degrees. Just sayin'.
So... your point is that STEM degrees are intrinsically better prerequisites for all aspects of software development? Or that STEM degrees are intrinsically better in some way than liberal arts degrees? If either of those is your point, I suggest you check your assumption that completion of a STEM degree implies the presence of critical thinking skills. Because NO.
And if you think for a moment that a smart liberal arts major isn't capable of complex abstraction, conceptualisation and its expression in formal logic, then... well, once again, check your assumptions.
Ask Slashdot: Any Place For Liberal Arts Degrees In Tech?
There's certainly a place for people with dual degrees in tech and liberal arts -- people who truly understand the tech they're discussing, plus have the experience in communication and argumentation to explain it, push for it, and lead it.
Hi there. I'm the Chief Technologist of a think tank and do a lot of technical work, from application & systems design and development through to legislation, policy and regulation. I did a double major in Theatre and English Lit. when I went to university. It amazes me that the majority of 'engineers' or science geeks show such disdain for liberal arts majors. Do they not realise that smart people are everywhere?
The thing that really makes me chuckle, though, is that they don't seem to believe that someone with strengths in the arts could ever be an autodidact, in spite of the fact that most good geeks have this capability as a defining trait. In theatre, I had to learn basic electronics, electrical circuitry, technical design, how to build weight-bearing structures, basic colour theory, linguistics, aesthetics (which, scoff as you like, requires pretty heavy thinking about the nature of human consciousness) and about a dozen other disciplines. And English taught me a little humility about the power of expression. It taught me to harness it as well.
As my colleagues will tell you, I have a significant lack of mathematical ability; my brain is simply not wired to read equations (or musical notation - another great failing). I can do it, but I expend a great deal more effort than my math whiz friends. This puts some programming work outside my competence - algorithms especially. I understand perfectly the concept of big O, though, and with assistance, I can write highly performant code.
But... I can design, create palettes, do layout and describe workflows a fuck of a lot better than most engineers. I know enough typography to be dangerous, and I can outperform most people when it comes to interfaces.
I know the value of a good engineer. I learned it at my father's knee. But if anyone ever suggested that I fill my software shop with nothing but STEM grads, I would laugh them out of the room. No offence, all you engineers, but there's a whole raft of software design and development issues that you guys suck at.
Navy Guilty of Illegally Broad Online Searches: Child Porn Conviction Overturned
The criminals here worthy of being described as scum and deserving confinement are the people involved in child pornography, not the investigator. At worst he seems to have exceeded his statutory jurisdiction in pursuit of actual crimes.
Allow me to quote the immortal words of Mr H.L. Mencken:
The trouble with fighting for human freedom is that one spends most of one's time defending scoundrels. For it is against scoundrels that oppressive laws are first aimed, and oppression must be stopped at the beginning if it is to be stopped at all.
Now, on behalf of Mr Mencken, and all those who fight for human freedom, allow me to suggest you fuck off, and to remind you that just because there are a few scummy characters in the world, it still doesn't justify putting the entire state of Washington under surveillance, which is what happened here.
UK's National Health Service Moves To NoSQL Running On an Open-Source Stack
Why the fuck are you storing the data if you don't give a damn about keeping it consistent?
There are thousands (and thousands) of cases where it is simply not reasonable to expect homogeneity in your data. Of those thousands of cases, a very large number of them not only have extremely heterogeneous data, they still need to be stored and queried. NoSQL is a useful tool in such cases.
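To make "heterogeneous" concrete, here are two hypothetical patient records of the sort a document store swallows happily and a fixed relational schema fights you on (all field names and values invented for illustration):

```
{ "nhs_number": "9434765919", "name": "A. Example",
  "allergies": ["penicillin"],
  "notes": [ { "date": "2013-02-01", "text": "routine check-up" } ] }

{ "nhs_number": "9434765870", "name": "B. Example",
  "pacemaker": { "model": "XYZ-100", "fitted": "2009-11-23" },
  "imaging": [ { "type": "MRI", "ref": "scan-0042" } ] }
```

Forcing both of those into one table means either a forest of NULL columns or a join explosion; a document store just stores them.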
Is it 'safe' — i.e. does it do all of the data integrity stuff we've come to associate with RDBMSes? No. Emphatically no. If you didn't code it into the right logic in the right places, you are probably worse than shit out of luck.
BUT... there are still thousands of cases where the pain (and in many cases the sheer impossibility) of living with your data inside an enterprise RDBMS far outweighs the pain of living with NoSQL.
And yes, I say this based on years of work with exactly these kinds of data sets. They were my bread and butter for a long time.
So, uh, holy fuck: Believe it.
Stallman Does Slides -- and Brevity -- For TEDx
Stallman is the crazy outlier. Where he stands, at the very edge, is exactly where we need him to be. You don't have to follow all of it, but there would be less of his ideas if he were more concerned with being central and accessible.
Just for the edification of the other readers here, which parts specifically do you feel you don't have to follow?
For the record, I know exactly which ones I would choose, but I'm interested to know what exactly you think makes Stallman a 'crazy outlier'. Because, in my estimation, it would take a lot for someone to qualify for that kind of labelling.
I disagree with his statement that Linux distro maintainers allow non-free components because they're not sufficiently committed to freedom, but I don't think him 'crazy' for having said it. I think his blanket characterisation of profit motive as evil is too much of a generalisation, but tragically, I don't think he's entirely wrong in stating that the effects of profit motive on a lot of commercial organisations has been detrimental to our freedom - dangerously so. So yeah: same conclusion, more temperate language. That's not nearly crazy or even an outlying opinion, to my mind.
There is a point to Stallman being far out there; it's so the rest of us don't have to be. Let him do his thing.
I take your point, but I remind you that the same could have been said about Gandhi, or even Martin Luther King, when people were blaming him for the violence in Selma and the bombing in Birmingham.
See, the problem I have with this kind of rhetoric is that you seem willing to stand to the side at a witch-burning and say, 'Well, I would never cast a spell, but I can see why people bought magic services from her.' It's a little disingenuous, isn't it, that you would be willing to profit from someone's courage, when you're not willing to defend it?
Again, this isn't a case of 'My Free Software, Right or Wrong.' On the contrary, I'm arguing that you can quibble all you like with the arguments Stallman makes, and the rhetoric he makes them with. But I have to ask: With an attitude like yours, how much have you actually done to promote freedom?
(Real question: I'm open to correction.)
Two Explorers Descend Into An Active Volcano, and Live to Tell About It
I am so full of envy right now, with a generous side order of awe. Watching that actually brought a tear to my eye.
Well, if you can pay the airfare to Vanuatu (3 1/2 hours away from Sydney for about $US 750), only a couple hundred dollars more will get you a walking tour to the edge of the caldera. It's not really a mountain so much as a high plateau with two (yes, two) active calderas. It's a fucking amazing place, a lunar landscape emerging over the last rise after a morning spent walking through jungle. Pretty primeval.
But if you're not in an exercising mood, you can simply pay a pilot to overfly the volcano. I did this once. We were actually on our way to Pentecost island in a small twin-prop charter, when the pilot said, 'Hey, the visibility's really good - you folks want to see a volcano?'
We thought about it for, like, 0.273 seconds and said, "FUCK YES!"
So he took us over it. Right. Over. The Volcano.
Was it cool? Yes, it was cool.
Two Explorers Descend Into An Active Volcano, and Live to Tell About It
Do you realize where this volcano is? Hint: look at the natives in the grass skirts. They're out chasing forest critters for dinner, not gearing up to rescue stupid rich tourists. I really doubt that anybody would come out and rescue them.
Howdy, commentator! I just read your comment about the people of Vanuatu, where I live. Guess what? This whole country is full of 'natives'! Isn't that cool?!
One thing though: Promise me you will never, ever, EVER come here. Because witless, condescending fucks like you tend to get killed and cooked for dinner.
(We're not really cannibals, but we'd do it just to spite you.)
Does Learning To Code Outweigh a Degree In Computer Science?
Your post also brings into question exactly how good of a programmer you really are as well. You see, English, much like programming, has a structure and a syntax. While you may have syntax, there is no structure. You may not have to compete for a job with someone who doesn't have a BS in CS, but you will most certainly have your cover letter compared to another person with a BS in CS who actually puts structure into his correspondence.
"Your post also brings into question your abilities as a programmer. English, like programming languages, has both structure and syntax. While your writing has syntax, it is largely unstructured. You may not have to compete for a job with someone who doesn't have a degree in Computer Science, but you most certainly will have your cover..."
... Oh, forget it. Your last sentence doesn't actually have logical coherence.
Akamai Warns: Linux Systems Infiltrated and Controlled In a DDoS Botnet
No, unmaintained web servers getting attacked and turned into bots is not news. This problem is not even specific to Linux systems. Any server that isn't patched with the latest security fixes for the OS and applications is at risk, regardless of the OS used.
The biggest difference we see between proprietary and FOSS systems is that the lack of maintenance in proprietary systems is often the fault of the vendor. In short, there's no way to keep a service or application patched, because there are no patches forthcoming.
Lack of maintenance by the sysadmin is a more common source of insecurity on Linux systems. The patches are often (not always, but often) there, but they do have to be applied by someone.
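On Debian-family systems, for instance, "someone" can be the machine itself; a minimal sketch, assuming the unattended-upgrades package is installed:

```
# /etc/apt/apt.conf.d/20auto-upgrades -- apply security updates automatically
APT::Periodic::Update-Package-Lists "1";
APT::Periodic::Unattended-Upgrade "1";
```

It doesn't replace a sysadmin who reads the changelogs, but it beats a box that nobody has patched since install day.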
You Got Your Windows In My Linux
Oh please, cry me a river. Impose systemd on others? Does Lennart own Ubuntu, Debian, RedHat and Suse now? It was decided by the developers of those distributions to replace the old sysv init with systemd, and the alternatives were considered. In Debian it was a democratic process by voting. RedHat is a company, so clearly they will not bet on a broken system. Almost all of the arguments against systemd are not valid on technical grounds, and now you want to muddy the waters by attacking Lennart directly.
No, you ignorant little shit, I am not attacking Lennart directly. If I were attacking him directly, I would question his character and his motivations. I would suggest that he, like you, is of questionable (probably canine) parentage, and that his reading comprehension is at such a disastrously low level that he (like you) indulges in exactly the mistakes that I warn about in my critique.
So you see, if I were attacking Lennart, I would have written a paragraph or two like that one. But I didn't.
You, on the other hand, deserve no such forbearance because you fucking still don't fucking get it. Fuck.
Being technically correct is not - and never has been - sufficient for technology to become successful.
There is a time and a place to stomp on the desires of others when better answers to old problems are concerned. It is a common sin of the young that they think they know where that time and place actually is. In reality, they seldom do, because it's an insight that comes from experience.
Nobody is going to stop you on your path to self-destruction. All we ask is that you let us antiquated geezers (who actually have, you know, systems to run) alone. Is that too much for you to get your tiny little head around, you strident, inexperienced little twerp?
(P.S. If I've overdone the name-calling in the pursuit of a certain tone, please accept my apology. I have no idea whether you're actually little or not.)
You Got Your Windows In My Linux
I would be interested in anyone's response to Lennart Poettering's rebuttal to the common complaints about systemd.
I'm too n00b to know who's right.
Full disclosure: I do not like systemd, although that judgment is based on its merits as I see them relative to my needs.
There seemed to be a pretty concerted determination not to get the point in Lennart's long spiel defending the system. As someone who has been using Unices in anger since the early '90s, I've pretty much seen Linux grow from its infancy. I've seen this kind of attitude before in technology — in Windows, Linux and elsewhere. The article is clearly written (and written clearly) by someone who's clever, articulate and... far too concerned with being right. It's not a healthy perspective.
Being technically correct is generally not optional. It's just never as conclusive as some people think it is.
Having the humility to accept an imperfect universe — and to admit that it's imperfect in a particular way for a large number of particular reasons — is a virtue that fewer and fewer people seem to possess these days.
(It's the lack of this virtue that makes people say, for example, 'less and less' where I used 'fewer and fewer', and when someone corrects them, they trot out the grammar nazi epithet and say, 'Everybody knows what I mean. Deal with it.' And it's the lack of this virtue as well that will make people pick on the triviality of this example to discard my entire post. The temerity of such an approach cannot be explained to those who suffer from it.)
Systemd is clearly not change for change's sake. Lennart and the dozen and a half others who have commit rights are clearly scratching an itch. But regardless of the technical merits of the system, it is horribly, horribly wrong to impose this new system —any new system— on Linux wholesale without a significant maturing process. And by significant, I mean years.
And this is where Lennart's most completely mistaken. He thinks that the technical arguments are the decisive ones, in spite of the fact that technical merit is not why systemd is going to become prevalent. He therefore tries to write a technical opus in defence of the indefensible, which requires more than a few straw men (binary config files? shyeah....), several big omissions (binary log files) and a clearly unwitting but nonetheless unforgivable ignorance of the fact that he's winning because he's RedHat, not because he's better. (Yeah yeah yeah, it's not all RH; it's others too, you're technically awesome - I read that part and remain utterly unconvinced by the argument.)
Paul Venezia's screed, on the other hand, is just plain substance-free. He's not arguing either technical merit or political power. He's simply looking at a looming mess and saying, well, that it's going to be messy. And to that extent, he's right. Systemd is going to make a mess, and that's precisely because its proponents think that they're perfectly within their rights to claim, 'Well, nobody's forcing anything on anyone.'
What they don't realise is that that is not how cooperation works. And believing you're better or righter than others is an absolutely shit way of improving your own software.
About once a month or so, I'm tempted to dump 25 bucks on Flickr to upgrade to a 'Pro' account, just so I can plop more than 200 photos into that particular bucket. I admit I've been on the cusp a couple of times.
But I never do. The plain fact is that Flickr is a terrible photo viewing interface.
A bright white background is possibly the worst neutral background they could have chosen. White washes out colours and destroys one of the things that I personally love best: subtle shading on very dark and earth-toned pictures. It's got to the point where a lot of self-respecting photographers actually have a 'View on Black' link, pointing to one of several services that do nothing other than render the very same photo with a dark background. The difference is stunning.
But Flickr, in its infinite marketing wisdom, would rather emulate Google's 'any colour as long as it's white' mantra. In Google's case, there's wisdom in the approach; they are a utility, like power or water, not a creative service. Flickr does not benefit in the least from an engineer's design sense, and it's high time someone told them that.
One Hundred's Spartan
When viewing photos in groups - or any aggregation, for that matter - one is usually presented with a hodge-podge of 100-pixel thumbnails. Viewing photo sets is even worse: the screen is filled with a patchwork quilt of arbitrarily cropped 75×75 pixel postage stamps. No, wait, I take that back. Postage stamps are larger.
I can't imagine a worse fate for any decent photo. To be reduced to a smudge of light among dozens or hundreds of others on a glaring white page. I'm not sure even Ansel Adams could survive that.
Of course, there are some photos that do just fine in such an environment. Too often, they're from the 'Ooh Shiny!' school of art. To everyone's credit, some genuinely lovely photos can be found, if you know where to look. But they're lovely in spite of Flickr, not because of it.
There are any number of technical arguments for crowding dozens of blots of colour together and calling them a collection, but none of them wash when it comes to aesthetics, or even usability, for that matter.
Flickr's groups are subject to the same AOL-ish devaluation that most large scale communities suffer from. The absolute preciousness of users who troll through other galleries, bestowing silly trophy and ribbon icons on pretty photos in a desperate attempt to burnish their collective karma by associating with only the best types... it's off-putting in a way that I'd rather not characterise in a public medium.
Let's just leave it at this: Any group of more than a few dozen people who are mostly unknown to one another can never merit the descriptor 'exclusive'.
Worst of all, Flickr is a vortex. It's a gravity well whose debris can be found throughout the Web, but which is entirely self-referential. Once you're in there, you don't come out. I've had over 14,000 visitors to my main photo stream, yet a mere 18 referrals from Flickr show up in my imagicity.com server logs. People who use Flickr don't go elsewhere.
Flickr, in other words, is good for Flickr. Any benefit that derives to individual photographers seems to be purely coincidental.
All of this isn't Flickr's fault, per se. The fault lies in our technical inability to render - and more importantly, to manage - images efficiently through a standard GUI, and to share them effectively.
It seems almost paradoxical. Digital technology has allowed revolutionary advances in photography. It has made possible one thing that I love more than any: the ability to draw with light rather than pigment. Sometimes when I'm engrossed in my work I find myself getting almost drunk on colour. There is nothing more rewarding than watching a well-built slide show wash the room with light and shape, to see human vision captured, distilled and transformed in the process.
It astounds me, therefore, how poorly most websites handle photos.
But this is the environment that Flickr has chosen. With few tools to effectively deal with social economies of scale, people are left to their own devices, so they crowd together (as people always do), creating cacophony where contemplation might once have been. Flickr has embraced (in the embarrassing cloying-college-drinking-buddy sense of the word) conventional wisdom with regard to UI, and they have spent all their effort on the engineering challenge of handling photos in volume. They've tacked on a few trendy bloggy/webbish bits, like tagging with keywords and location data, but done nothing whatsoever to innovate in how photos are viewed.
And that, it seems to me, should be the very essence of innovation where photography is concerned.
I won't demur for a moment if you counter that thumbnails are a necessary evil, that larding a page up with binaries slows down load times, that we're unfortunately bound by the lowest common denominator where display and download capacity are concerned. Nor will I argue if you express admiration for their ability to handle the data volumes that they do. Just storing and serving up 2 billion photos is a decidedly non-trivial task.
But let's be clear here: I expect more from Flickr. I judge them by a higher standard.
They want to set themselves apart? Then let them deal intelligently - dare I say it? creatively - with their popularity. The engineering challenge is interesting; I'll be the first to admit it. But dammitall, this is a photography site. It's for creative people. Is it too much to ask that they should actually take a little of their revenue and use it for basic research and innovation? Where's the research into lossless compression, peer-to-peer content distribution, point-and-click monitor calibration, optimal display environments, click-and-drag online image resizing? Where's the community for UI geeks?
How many of Flickr's 10-30 million monthly visitors have paid accounts there? My guess would be: Several. Surely some of that revenue could go into renewal, exploration and invention.
Perhaps it's no surprise that Flickr founders Caterina Fake and Stewart Butterfield left Yahoo! just as soon as they reasonably could. I don't doubt for a moment that they've thought a great deal more about these issues than I have. Perhaps they'll be the ones who manage to pull a rabbit or two out of their digital cap.
If they do, they'll get my money, too.
[Cross-posted from the Scriptorum.]
Sometimes you have to destroy the document in order to save it....
I give up. I can't support OpenOffice Writer any more, and it's nobody's fault but their own. For anything more than simple tasks, the application is terrible. Their only saving grace is that Microsoft Office has its own brand of polished turd, named Word. Collectively, they are racing to the bottom of a decade-long decline in usability.
No, that's too generous. The thing is, they're at the bottom. They are useless for any but the most trivial tasks, and the most trivial tasks are better accomplished elsewhere, anyway.
Yes, I'm ranting. Let's put this into a proper context:
I hate word processors. For any but the simplest tasks, their interfaces are utterly ridiculous. I haven't liked a word processing interface since WordPerfect circa version 5, and if I had my own way, I'd author all my documents in either emacs or vi, depending on the circumstances.
Why do word processors suck so badly? Mostly, it's because of the WYSIWYG approach. What You See Is What You Get, besides being one of the most ghastly marketing acronyms to see the light of day in the digital era, is ultimately a lie. It was a lie back in the early 1990s when it first hit the mainstream, and it remains a lie today. The fact of the matter is that trying to do structuring, page layout and content creation at the same time is a mug's game. Even on a medium as well understood as paper, it's just too hard to control all the variables with the tools available and still have a comprehensible interface.
But the real sin that word processors are guilty of is not that they're trying to do WYSIWYG - okay, it is that they're trying to do WYSIWYG, but the way they go about it makes it even worse. Rather than insisting that the user enter data, structure it and then lay it out, they cram everything into the same step, short-circuiting each of those tasks, and in some cases rendering them next to impossible to achieve.
Writing, then structuring, then formatting a document (or even just doing each through its own interface) is easier to learn and easier to accomplish than the all-in approach we use today. For whatever reason, though, we users are deemed incapable of creating a document without knowing what it's going to look like right now, and for our sins, that's what we've become. And so we are stuck with word processors that are terrible at structuring and page layout as well as being second-rate text authoring interfaces. They do nothing well, and many things poorly, in no small part because of the inherent complexity of trying to do three things at once.
It doesn't help that their technical implementation is poor. The Word document format is little better than a binary dump of memory at a particular moment in time. Worse, OpenOffice is forced to work with that format as well, in spite of having the much more parse-worthy ODF at its disposal these days.
There's no changing any of this, of course. The horse is miles away, and anyway the barn burned down in the previous millennium. The document format proxy war currently underway at the ISO is all the evidence I need to know that I'll be dealing with stupid stupid stupid formatting issues for years to come. I will continue to be unable to properly structure a document past about the 80th percentile, which is worse than not at all. I will continue to deal with visual formatting as my only means to infer context and structure, leaving me with very little capacity to do anything useful with the bloody things except to print them out and leave them on someone's desk.
Maybe I'll just stop using them at all. Maybe I'll just start doing everything on the web and never print again. I'm half serious about this, actually. At least on the Web, the idea that content and presentation are separate things isn't heresy. At least on the Web, I can archive, search, contextualise, comment, plan, structure and collaborate without having to wade through steaming piles of cruft all the time.
At least on the Web, I can choose which steaming piles I step into.
I'm going to start recommending people stop using Word as an authoring medium. There are far better, simpler tools for every task, and the word processor has been appropriate for exactly none of them for too long now. Sometimes you have to destroy the document in order to save it.
Trust Works All Ways
[Cross-posted from the Scriptorum.]
The Debian OpenSSL vulnerability apparently went unnoticed for well over a year. Why is it that crackers and script kiddies never found it and/or exploited it?
Over the weekend, I've been thinking about last week's disclosure concerning Debian's OpenSSL package, which in effect stated that all keys and certificates generated by this compromised code have been trivially crackable since late 2006.
There's a pretty good subjective analysis of the nature of the error on Ben Laurie's blog (thanks, Rich), and of course the Debian crew itself has done a fairly good job of writing up the issue.
The scope of this vulnerability is pretty wide, and the ease with which a weak key can be compromised is significant. Ubuntu packaged up a weak key detector script containing an 8MB data block which, I'm told, included every single possible key value that the Debian OpenSSL package could conceivably create.
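An 8MB data block really can enumerate the whole keyspace. The broken package had stripped out the call that mixed fresh entropy into OpenSSL's PRNG, so on Linux the generator's output was, in effect, a deterministic function of little more than the process ID, which is capped at 32768 by default. Here's a minimal Python sketch of the arithmetic - an illustration of the keyspace collapse, not the actual OpenSSL code, and the helper names are my own:

```python
import hashlib

# Default Linux PID ceiling (kernel.pid_max) at the time of the bug.
PID_MAX = 32768

def weak_seed(pid: int) -> bytes:
    """Stand-in for the crippled PRNG: with the entropy-mixing call
    removed, everything 'random' boiled down to a deterministic
    function of the PID of the process that generated the key."""
    return hashlib.sha1(pid.to_bytes(4, "big")).digest()

# Every seed - and hence, for a given key type and size, every key -
# the broken generator could ever produce:
all_seeds = {weak_seed(pid) for pid in range(1, PID_MAX + 1)}
print(len(all_seeds))  # 32768: small enough to precompute and ship
```

Thirty-two thousand possibilities per key type and size is a brute-force job measured in minutes, which is exactly why shipping a lookup table of every weak key was feasible at all.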
The question that kept cropping up for me is: This one-line code change apparently went unnoticed for well over a year. Why is it that crackers and script kiddies never found it and/or exploited it? Numerous exploits on Microsoft Windows would have required far more scrutiny and creativity than this one. Given the rewards involved for 0-day exploits, especially in creating platforms for cross-site scripting attacks, why is it nobody bothered to exploit this?
My hypothesis - sorry, my speculation - is this: people at every stage of the production process, and everywhere else in the system, trusted that the others were doing their job competently. This includes crackers and others with a vested interest in compromising the code. I should exclude from this list those who might have had a reasonable motivation to exploit the vulnerability stealthily, leaving no traces. If, however, even they didn't notice the danger presented by this tiny but fundamental change in the code base, then my point only becomes stronger.
The change itself was small, but not really obscure. It was located, after all, in the function that feeds random data into the encryption process. As Ben Laurie states in his blog, if any of the OpenSSL members had actually looked at the final patch, they would almost certainly have noticed immediately that it was non-optimal.
In all this time, apparently, nobody using Debian's OpenSSL package has actually (or adequately) tested to see whether the Debian flavour of OpenSSL was as strong as it was supposed to be. That level of trust is nothing short of astounding. If in fact malware authors were guilty of investing the same trust in the software, then I'd venture to state that there's a fundamental lesson to be learned here about human nature, and learning that lesson benefits the attacker far more than the defender:
Probe the most trusted processes first, because if you find vulnerabilities, they will yield the greatest results for the least effort.
P.S. Offhand, there's one circumstance that I think could undermine the credibility of this speculation, and that's if there's any link between this report of an attack that compromised no fewer than 10,000 servers and the recent discovery of the Debian OpenSSL vulnerability.