
Comments


Judge: It's OK For Cops To Create Fake Instagram Accounts

grcumb Re:Not seeing the issue here (204 comments)

Bingo. You're absolutely correct.

"I've got three witnesses that put you there, DNA evidence, and some video with someone wearing jeans and a white hoodie, just like you wear, though the face isn't visable. You'll get the death penalty. If you give me a confession, we can get it down to manslaughter. First offense. You'll probably just get probation. Here's some paper."

You might like to look up the difference between coercion and deception. One of them is almost always a crime; the other, not so much.

4 days ago

Ask Slashdot: How Should a Liberal Arts Major Get Into STEM?

grcumb Re:been there, done that (280 comments)

You're not a liberal arts major, by any chance, are you? 'Cuz one thing STEM tries to do is kill the belief that an anecdote counters data.

Why yes, I am a liberal arts major, who studied classical logic, among other things. I was responding to the assertion that 'most' liberal arts majors ended up as lowly restaurant workers. I countered that by asserting a) that restaurant workers are not so lowly as characterised; b) that drawing general conclusions about people's prospects based on their education does not bear out, particularly where some of the more respected and influential jobs are concerned; and c) that in a number of cases, a liberal arts education is a precursor to the kind of work that most people can only dream about.

You see, I was actually not making a positive argument so much as rebutting (and refuting) someone else's crass, inaccurate and unsubstantiated assertion that a liberal arts degree is valueless. Shocking, isn't it, to see a STEM major failing so badly at applying basic logic?

But yeah, the plural of anecdote is not always data.

P.S. For the humour-impaired: I'm a keyboard monkey, too. A liberal arts educated keyboard monkey.

about two weeks ago

Ask Slashdot: How Should a Liberal Arts Major Get Into STEM?

grcumb Re:been there, done that (280 comments)

I second this comment. Besides teaching college, which will probably involve a graduate degree, most of the jobs with a liberal arts degree involve asking "Do you want fries with that?"

Two things:

First - I supported myself for a decade working in bars and restaurants. There are more interesting people living interesting lives employed in that sector than just about any other.

Second - Ridley Scott went to art college. Peter Jackson was self-taught. James Cameron was a truck driver. The people who have done more to shape your vision than you're likely able to realise followed no discernible pattern of behaviour. I'd advise you to save your derision until someone's earned it.

Case in point: One 'liberal arts' friend of mine plays the king of the White Walkers on GoT. Another works on The Daily Show. How's your job look now, keyboard monkey?

about two weeks ago

Ask Slashdot: How Should a Liberal Arts Major Get Into STEM?

grcumb Re:been there, done that (280 comments)

Have an English degree, found it useless. went back got my BSEE, been employed as such ever since. short version, go back and get your degree.

Did a double major in Theatre and English Literature. After 20 years of gainful employment in systems software development and consulting, I'm now CTO at an international think tank. I also know the value of capitalisation and punctuation.

Short version: It's horses for courses; reflect carefully, then do what you feel is best. If you're smart, the real determining factor is how hard you're willing to work, and how well you continue to learn.

about two weeks ago

Microsoft To Open Source Cloud Framework Behind Halo 4 Services

grcumb Re:please keep closed! (50 comments)

I disagree. Encapsulation and abstraction of complexity is natural, and humans are great at breaking complexity apart and making the common man able to accomplish what was once impossible.

No dispute there. The problem, though, is not that we make easy things simple and hard things possible (pace, Larry Wall). It's that we have of late developed a tendency to simplify too far. Microsoft is famous for making systems administration and certain types of programming 'click-and-drool' easy. And hyperbole aside, the cost to society of the half-competent people who found gainful employment due to this charade can be measured in the many billions.

You're absolutely right in that commercial flying is safer than ever, notwithstanding the tendency in airlines to pressure senior pilots out in favour of cheaper, younger staff. And those working in HFT would likely be wreaking havoc by other means if they didn't have software and fibre-optics to enable them. I guess my tongue hadn't entirely left my cheek when I wrote that last para.

BUT... Microsoft has contributed significantly to a general downward trend in the quality of software and systems integrity. And they've done so by marketing the idea that with the right tools, tool users can be commoditised. And that really, really sucks.

about two weeks ago

Microsoft To Open Source Cloud Framework Behind Halo 4 Services

grcumb Re:please keep closed! (50 comments)

Whatever it is that made Halo 4 (cloud-based or otherwise) should remain closed. Or better yet, incinerate it.

Agreed. 'Software that makes it easy for non-experts to do expert tasks' will one day be recognised for its role in causing the downfall of civilisation as we know it. By then, of course, it will be too late.

Some among you may think that's overstating things. Some among you are also .NET developers, so what do you know?

Seriously, though: From the Airbus crash to high frequency trading to the Sony hack, examples abound of how enabling and empowering mediocrity is the first ingredient of every modern tragedy.

about two weeks ago

"Fat-Burning Pill" Inches Closer To Reality

grcumb Re:May not be a practical drug. (153 comments)

This post, not least because of the sig, wins the internet today. It's why I persist in reading Slashdot. Even as this site declines, it still has moments like this when someone posts something intellectually challenging, scientifically insightful and sardonically clever as fuck.

about two weeks ago

Just-Announced X.Org Security Flaws Affect Code Dating Back To 1987

grcumb Re:Wha?!?!!! (172 comments)

The point the OP was trying to make was that Linus's Law [wikipedia.org], specifically Eric S. Raymond's "given enough eyeballs all bugs are shallow" argument, is ridiculously idealistic as it operates under the pretence that everyone has as much insight and knowledge into the software as the author(s) have, focusing solely on the quantity of eyes.

I disagree that it is a ridiculously idealistic statement. It is more of a misunderstood rhetorical tautology than anything else.

A discovered bug obviously had enough eyeballs on it, and an as yet undiscovered bug hasn't had enough eyeballs on it.

Actually, I wish he had limited the statement to the persistence of known bugs in FOSS code bases. ESR said the bugs are easier to find as the number of beta testers and developers increases. This doesn't appear to be true. One thing that is true is that code quality is viewed differently in FOSS than in commercial, proprietary software. All too often, software businesses treat QA, debugging and code maintenance as overhead, so there's a perverse incentive to leave known bugs - even the most egregious ones - lying around indefinitely - or at least until someone publicly raises a stink. FOSS culture values code quality more highly and is less tolerant toward bugs, so generally speaking we see somewhat better code quality, and somewhat shorter known bug life than in similar proprietary projects.

Emphasis on 'generally speaking' in the above. Exceptions abound, but I think the trend is clear.

about two weeks ago

POODLE Flaw Returns, This Time Hitting TLS Protocol

grcumb This is the God of Job (54 comments)

If there were a just and caring God, he would never let geeks name things.

POODLE?

Jesus wept. Literally. He heard the name and wept tears.

Geeks made baby Jesus cry.

about two weeks ago

The Failed Economics of Our Software Commons

grcumb Re:Article doesn't address the "why" (205 comments)

If we want to address this issue, we need a complete overhaul of our IP laws.

Er, no.

The 'why' has little to do with IP law and a lot to do with group dynamics, especially herd behaviour. Take this statement, for example:

One of my personal pet causes is developing a better alternative to HTML/CSS. This is a case where the metaphorical snowdrift is R&D on new platforms (which could at least initially compile to HTML/CSS).

The problem with the 'snowdrift' here, to abuse the metaphor, has nothing to do with IP law, and nothing to do with lack of innovation. It has everything to do with the size of the drift: when it's big enough, you don't have any choice but to wait for someone else to come along and help shovel. But the author is trying to say that if everyone doesn't shovel, nobody gets out. And that's not always true.

A quick reminder: When HTML first came out, the very first thing virtually every proprietary software vendor of note did was publish their own website design tool. And each of those tools used proprietary extensions and/or unique behaviour in an attempt to corner the market on web development, and therefore on the web itself.

But the 'snowdrift' in this case was all the other companies. Because no single one of them was capable of establishing and holding overwhelming dominance, the 'drift' was doomed to remain more manageable by groups than by any single entity. (Microsoft came closest to achieving dominance, but ultimately their failure was such that they have in fact been weakened by the effort.)

Say what you like about the W3C, and draw what conclusions you will from the recent schism-and-reunification with WHATWG. The plain fact is that stodgy, not-too-volatile standards actually work in everybody's favour. To be clear: they provide the greatest benefit to the group, not to the enfant terrible programmer who thinks he knows better than multiple generations of his predecessors.

Yes, FOSS projects face institutional weaknesses, including a lack of funding. Especially on funding for R&D. But funded projects face significant weaknesses as well. Just look at the Node.JS / io.js fork, all because Joyent went overboard in its egalitarian zeal. Consider also that recent widely publicised bugs, despite the alarm they've caused, haven't really done much to affect the relative level of quality in funded vs proprietary vs unfunded code bases. They all have gaping holes, but the extent of their suckage seems to be dependent on factors other than funding. If not, Microsoft would be the ne plus ultra of software.

Weighed in the balance, therefore, FOSS's existential problems are real, and significant, but they're not as significant as those faced by all the other methods we've tried. So to those who have a better idea about how to balance community benefits and obligations, I can only reply as the Empress famously did when revolutionaries carried her bodily from the palace: 'I wish them well.'

about two weeks ago

The Rise of the Global Surveillance Profiteers

grcumb Re:I don't know if 'profiteer' is the right term (33 comments)

Just because *some* or even *most* profit is reasonable, doesn't mean all profit is reasonable.

The term "profiteer" is used for people who put profit above a higher ethical claim; for example a citizen selling arms to an enemy during wartime.

I'm not sure that's really the canonical use of the term. I would think that selling said arms to one's own government at extortionate prices would be closer to the standard definition.

But niggling aside, the real problem with this article is that it equates the control of technology with control of behaviour, and assumes that it's even possible to usefully control the proliferation of technology.

Instead of advocating a software proscription list, why not seek to promote international legal standards concerning the right to privacy, and a respect for the rule of law among all nations?

Actually, don't answer that. I know why. Because building democracy is hard and even the purportedly enlightened, 'free' nations are busy backing away from individual human rights.

about two weeks ago

Node.js Forked By Top Contributors

grcumb Re:Effort dilution (254 comments)

I disagree over the degree to which this would be a problem - think of it more like the free market. Under ideal conditions, the best ideas with the broadest appeal tend to win, grow and evolve, while the worst ideas with little appeal tend to fade away relatively quickly.

That's fantasy. The best ideas often wither while mediocre - even bad ones - flourish. It also makes the foolish assumption that "best" conflates with "broadest appeal".

Well, you need to define 'best' under these circumstances. The Linux kernel became 'best' when it was found that it supported and sustained the involvement of the widest developer/manufacturer constituency at a reasonable level of quality. That's hardly a glowing endorsement of the quality of the code or the operation of the kernel in real-world scenarios.

Remember that the abiding challenge for technologists is not so much 'best' as 'good enough'.

So yes, GP is wrong to see the free market as one in which the best ideas win. They don't. But the most workable available solutions do tend to get the most support. In Commodore's case, their sin was failing to market it in a way that made it readily accessible (i.e. price, distribution and support) and usable (developer support and software market). So you can praise the quality of the device, but from the buyer's perspective, it wasn't the 'best' solution after all, was it?

about three weeks ago

Celebrated Russian Hacker Now In Exile

grcumb Re:Commie Critter On The Lam? (130 comments)

eggcorn |egg korn| noun In linguistics, an eggcorn is an idiosyncratic substitution of a word or phrase for a word or words that sound similar or identical in the speaker's dialect (sometimes called oronyms). The new phrase introduces a meaning that is different from the original, but plausible in the same context, such as "old-timers' disease" for "Alzheimer's disease".

humor |ˈ(h)yo͞omər|
noun (noun: humour)
1. the quality of being amusing or comic, especially as expressed in literature or speech. "his tales are full of humor"

about three weeks ago

Game Theory Analysis Shows How Evolution Favors Cooperation's Collapse

grcumb Re:Taxpayer's Dilemma (213 comments)

You are assuming a perfect world where taxes are used efficiently, whereas most western governments have rather low bang for the buck. At the end of the day, what really happens is more in the realm of "Everyone pays taxes, but infrastructure still sucks".

No, actually. Unless by 'sucks' you mean 'works imperfectly, but still better than in those parts of the world that did not benefit from my tax dollars'.

I say this with the benefit of experience. I've traveled to dozens of countries, rich and poor, and those with solid tax bases have dependably better public infrastructure than those without.

The cause of your crumbling infrastructure in the US is largely people not paying taxes.

about three weeks ago

Wikipedia's "Complicated" Relationship With Net Neutrality

grcumb Re:Not a good move (134 comments)

I can't speak to Facebook, et. al., but please don't lump Wikipedia Zero into your attack above, it's a very different animal. WP Zero is the brainchild of some very smart, idealistic people whose primary mission in life is to spread as much free information around the globe as possible, and that in turn is just a facet of a deeper ideal that information is empowering, and lack of information is oppressive.

Whose brainchild, specifically? I'm very interested in knowing. Because I think you'll find that the idea did not originate in Wikipedia, but that it was presented to them by others.

I know some of the individuals involved in the WP Zero movement from the get-go. These are the in-the-trenches activists. They physically went to these developing nations to examine the situation because they saw a disturbing trend in their own analytical data: the most oppressed people on the planet, who had the most to gain from free information, were not taking advantage of Wikipedia's free information as much as expected.

I hope you'll forgive my cynicism, but 'physically going' to the developing world teaches very little indeed about the broader truths of living in poverty. I say this having lived the last 11 years in a Least Developed Country, and having worked for half a generation with a parade of well-intended people who, to put it bluntly, haven't got a fucking clue, but who suck up all the oxygen in the room, making it impossible to get real, meaningful work done.

Do I sound bitter? Yes. I believe I've earned that right. Does that diminish my determination to work on real issues? Not one iota.

What they found on the ground was that in many of these developing nations, school-aged children and young adults had access to cell phones (but usually not tablets or home computers), and these cell phones had browsers and data capabilities, but the carriers are charging an arm and a leg for bytes of data over the cellular network, and that's why they're not surfing Wikipedia (or anything else much either).

Yes, and instead of helping to fight this phenomenon through better policy and changed attitudes among the global institutions, what we get instead is people perpetuating the problem by empowering the very telcos who prey on those children.

Let's be perfectly clear about this: asking telcos to make a special exception for one or two services is probably the worst possible response to the situation. It's short-sighted, it generates little real benefit, and worst of all, it creates the impression that people are actually doing something, when they're doing less than the minimum needed to move the development markers.

You can defend these people all you like. I still maintain that:

a) They were misguided and wrong; and
b) The basic idea was inspired and promoted by a number of very cynical individuals to a bunch of very naïve (if well-intentioned) people with little meaningful experience in actual development work.

about a month ago

Wikipedia's "Complicated" Relationship With Net Neutrality

grcumb Re:Not a good move (134 comments)

"We have a complicated relationship to it. We believe in net neutrality in America," said Gayle Karen Young, chief culture and talent officer at the Wikimedia Foundation. But, Young added, offering Wikipedia Zero requires a different perspective elsewhere. "Partnering with telecom companies in the near term, it blurs the net neutrality line in those areas. It fulfills our overall mission, though, which is providing free knowledge."

Let me state things clearly. These {facebook|wikipedia|whatever}.zero campaigns are a direct and unequivocal attack on Net Neutrality. They are the brainchild of some very smart, cynical people who know exactly how insidious the whole idea is, and whose job it is to set Open Data people against Open Networks people.

This is not an unintended consequence. This is the consequence.

My part of the world consists pretty much entirely of developing nations, and when we discussed these zero initiatives, we pretty quickly came to the conclusion that having offline versions of wikipedia (commonly available) was a more desirable thing than having a zero-cost version of it mediated by our friendly neighbourhood telco.

And Facebook zero was scoffed at when it was touted as a social Good.

about a month ago

Clarificiation on the IP Address Security in Dropbox Case

grcumb Re:"Keep reading to see what Bennett has to say" (152 comments)

Don't you wanna read about "clarificiations"?

Indeed. Now, most of you are out in the world seeking clarity. But, as long-time contributor Bennett Haselton writes, much more important than that is 'clarifice', the ability to explain truthiness without resorting to expertise or insight. Keep reading to see Bennett's clarification of how over two hundred years of jurisprudence can be usefully transposed onto decades-old technology....

about 1 month ago

The People Who Are Branding Vulnerabilities

grcumb Re:Fuck That Shit (64 comments)

You don't get points for media mentions.

You're right. You don't get points. You get funding and awareness which is far more important.

Not necessarily. If the vulnerability du jour is catching media attention the way Ebola did, then you're probably not doing work you should be doing because you've got a CEO who just publicly pronounced that not one of your customers ever is going to get $EBOLA because of you. And suddenly your entire development cycle is in ruins, every manager everywhere has to explain in voluminous detail why his business unit will not be the cause of the next $EBOLA crisis, consultants will be hired to waste your time confirming that you really never were going to contribute to the global $EBOLA scare anyway....

... and meanwhile, your maintenance cycle is fucked, you have no budget left to do the upgrades that you need to avoid good old-fashioned data loss due to hardware failure, your children have forgotten who you are, and your wife just accidentally emailed her entire carpool pictures of her naughty bits (instead of her little piece on the side, as she intended).

And your dog ran away.

NOW how does all that funding and awareness feel, eh kid?

about 1 month ago

How Intel and Micron May Finally Kill the Hard Disk Drive

grcumb Re:Sure, but speed... (438 comments)

So you would pay $1200 for a hard drive "without hesitation"?

Don't scoff. There are a number of scenarios where even several thousand bucks can be spent without a second thought, as long as there's some demonstrable benefit. In photography or video editing, your billing rate can be such that a couple of hours saved waiting on disk I/O can be sufficient to justify some serious spending on storage.

I've got 10 TB on my desk at home, and photography is not my primary work. It was nothing to me to drop over a thousand bucks on a decent hardware RAID controller and disk array. I'd seriously consider moving to SSDs as my primary storage medium if the price got down to 2-2.5 times the cost of a traditional disk.
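For what it's worth, here's the back-of-the-envelope arithmetic as a quick sketch (Python; the per-TB figure is a made-up placeholder, not a real price):

```python
# Rough sketch of the 2-2.5x threshold described above; the HDD price per TB
# is a hypothetical placeholder, not a figure from this comment.
def ssd_threshold(hdd_price_per_tb, low=2.0, high=2.5):
    """Return the per-TB SSD price range at which switching becomes tempting."""
    return hdd_price_per_tb * low, hdd_price_per_tb * high

cheap, pricey = ssd_threshold(hdd_price_per_tb=40.0)  # assumed HDD price per TB
print(f"SSDs start to look attractive somewhere around ${cheap:.0f}-${pricey:.0f} per TB")
```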

about a month ago

Submissions


Android Ice Cream Sandwich Source Released

grcumb writes | more than 3 years ago

grcumb (781340) writes "Looks like the folks at Google have made good on their promise to release the Android 4.0 source code. Android software engineer Jean-Baptiste Queru writes: "Hi! We just released a bit of code we thought this group might be interested in. Over at our Android Open-Source Project git servers, the source code for Android version 4.0 (Ice Cream Sandwich) is now available."

"This is actually the source code for version 4.0.1 of Android, which is the specific version that will ship on the Galaxy Nexus, the first Android 4.0 device. In the source tree, you will find a device build target named "full_maguro" that you can use to build a system image for Galaxy Nexus. Build configurations for other devices will come later."

If the Cyanogen elves get busy, Daddy just might be getting a new ROM for Christmas...."


Economist Mag Profiles "Wireless Carrier-Pigeons"

grcumb writes | more than 4 years ago

grcumb (781340) writes "The Economist magazine is running a brief profile of Digicel, a 'minnow' in the wireless telecoms market that has distinguished itself by setting up shop in some of the most unlikely (and dangerous) markets in the world, including Haiti and Papua New Guinea, whose capital, Port Moresby, has one of the highest murder rates in the world.

"If you just focus on risk, you can't do a thing," said Digicel's billionaire president Denis O'Brien in a 2008 Forbes profile. But O'Brien's small-market revolution should teach us another lesson, too: Traditional economic analysis doesn't work when it comes to communications. Telecommunications is a supply-driven economy. If you build it — no matter where you build it — they will come.

Now, if someone could just teach the North American telcos this...."

Anonymous Coward or Corporate Troll?

grcumb writes | more than 7 years ago

grcumb writes "In a recent article on Alternet, Annalee Newitz writes to report that our perception of the typical anonymous poster as a fat, half-naked basement dweller with a grudge is nearly 100% wrong. Virgil Griffith's WikiScanner site exposes the surprising truth: The majority of dishonest edits and omissions on wikipedia derive from corporate and government IP addresses. In Annalee's words: 'It turns out that the people who are hiding behind anonymity online for nefarious or selfish reasons are not little guys in pajamas but the very bastions of accountability that haters of the Web have deified.'"

AT&T Practices Political Censorship

grcumb writes | more than 7 years ago

grcumb writes "Pearl Jam reports that their live webcast from Lollapalooza was censored by AT&T. The statement on the band's website outlines their concerns in the context of the ongoing Net Neutrality 'debate':

"AT&T's actions strike at the heart of the public's concerns over the power that corporations have when it comes to determining what the public sees and hears through communications media.

"Aspects of censorship, consolidation, and preferential treatment of the internet are now being debated under the umbrella of "NetNeutrality." Check out The Future of Music or Save the Internet for more information on this issue.


It's refreshing to see that at least some of our media darlings have a clue what this debate is really about."


France: Surrender Your Blackberries!

grcumb writes | more than 7 years ago

grcumb writes "Le Monde has published a story claiming that French defence officials have asked all senior functionaries in the French government to stop using Blackberries wireless mobile devices. Fears that the US-based mail servers supporting the service could lead to systematic eavesdropping by US intelligence agencies led to the drastic move. From the AP story:

"It's not a question of trust," Mr. Lasbordes told The Associated Press. "We are friends with the Americans, the Anglo-Saxons, but it's economic war."

Research In Motion, makers of the Blackberry device, claim they couldn't read the emails even if they wanted to: "No one, including RIM, has the ability to view the content of any data communication sent using the BlackBerry Enterprise Solution,"

Apparently, nobody at RIM has ever worked at the NSA."

Journals


Flickr: Flunkr

grcumb writes | more than 5 years ago

About once a month or so, I'm tempted to dump 25 bucks on Flickr to upgrade to a 'Pro' account, just so I can plop more than 200 photos into that particular bucket. I admit I've been on the cusp a couple of times.

But I never do. The plain fact is that Flickr is a terrible photo viewing interface.

White, what?

A bright white background is possibly the worst neutral background they could have chosen. White washes out colours and destroys one of the things that I personally love best: subtle shading on very dark and earth-toned pictures. It's got to the point where a lot of self-respecting photographers actually have a 'View on Black' link, pointing to one of several services that do nothing other than render the very same photo with a dark background. The difference is stunning.

But Flickr, in its infinite marketing wisdom, would rather emulate Google's 'any colour as long as it's white' mantra. In Google's case, there's wisdom in the approach; they are a utility, like power or water, not a creative service. Flickr does not benefit in the least from an engineer's design sense, and it's high time someone told them that.

One Hundred's Spartan

When viewing photos in groups - or any aggregation, for that matter - one is usually presented with a hodge-podge of 100-pixel thumbnails. Viewing photo sets is even worse: the screen is filled with a patchwork quilt of arbitrarily cropped 75×75-pixel postage stamps. No, wait, I take that back. Postage stamps are larger.

I can't imagine a worse fate for any decent photo. To be reduced to a smudge of light among dozens or hundreds of others on a glaring white page. I'm not sure even Ansel Adams could survive that.

Of course, there are some photos that do just fine in such an environment. Too often, they're from the 'Ooh Shiny!' school of art. To everyone's credit, some genuinely lovely photos can be found, if you know where to look. But they're lovely in spite of Flickr, not because of it.

There are any number of technical arguments for crowding dozens of blots of colour together and calling them a collection, but none of them wash when it comes to aesthetics, or even usability, for that matter.

Cliqr

Flickr's groups are subject to the same AOL-ish devaluation that most large scale communities suffer from. The absolute preciousness of users who troll through other galleries, bestowing silly trophy and ribbon icons on pretty photos in a desperate attempt to burnish their collective karma by associating with only the best types... it's off-putting in a way that I'd rather not characterise in a public medium.

Let's just leave it at this: Any group of more than a few dozen people who are mostly unknown to one another can never merit the descriptor 'exclusive'.

Worst of all, Flickr is a vortex. It's a gravity well whose debris can be found throughout the Web, but which is entirely self-referential. Once you're in there, you don't come out. I've had over 14,000 visitors to my main photo stream, yet a mere 18 referrals from Flickr show up in my imagicity.com server logs. People who use Flickr don't go elsewhere.

Flickr, in other words, is good for Flickr. Any benefit that derives to individual photographers seems to be purely coincidental.

Flunkr

All of this isn't Flickr's fault, per se. The fault lies in our technical inability to render - and more importantly, to manage - images efficiently through a standard GUI, and to share them effectively.

It seems almost paradoxical. Digital technology has allowed revolutionary advances in photography. It has made possible the one thing that I love more than any other: the ability to draw with light rather than pigment. Sometimes when I'm engrossed in my work I find myself getting almost drunk on colour. There is nothing more rewarding than watching a well-built slide show wash the room with light and shape, to see human vision captured, distilled and transformed in the process.

It astounds me, therefore, how poorly most websites handle photos.

But this is the environment that Flickr has chosen. With few tools to effectively deal with social economies of scale, people are left to their own devices, so they crowd together (as people always do), creating cacophony where contemplation might once have been. Flickr has embraced (in the embarrassing cloying-college-drinking-buddy sense of the word) conventional wisdom with regard to UI, and has spent all its effort on the engineering challenge of handling photos in volume. They've tacked on a few trendy bloggy/webbish bits, like tagging with keywords and location data, but done nothing whatsoever to innovate how photos are viewed.

And that, it seems to me, should be the very essence of innovation where photography is concerned.

I won't demur for a moment if you counter that thumbnails are a necessary evil, that larding a page up with binaries slows down load times, that we're unfortunately bound by the lowest common denominator where display and download capacity are concerned. Nor will I argue if you express admiration for their ability to handle the data volumes that they do. Just storing and serving up 2 billion photos is a decidedly non-trivial task.

But let's be clear here: I expect more from Flickr. I judge them by a higher standard.

They want to set themselves apart? Then let them deal intelligently - dare I say it? creatively - with their popularity. The engineering challenge is interesting; I'll be the first to admit it. But dammitall, this is a photography site. It's for creative people. Is it too much to ask that they should actually take a little of their revenue and use it for basic research and innovation? Where's the research into lossless compression, peer-to-peer content distribution, point-and-click monitor calibration, optimal display environments, click-and-drag online image resizing? Where's the community for UI geeks?

How many of Flickr's 10-30 million monthly visitors have paid accounts there? My guess would be: Several. Surely some of that revenue could go into renewal, exploration and invention.

Perhaps it's no surprise that Flickr founders Caterina Fake and Stewart Butterfield left Yahoo! just as soon as they reasonably could. I don't doubt for a moment that they've thought a great deal more about these issues than I have. Perhaps they'll be the ones who manage to pull a rabbit or two out of their digital cap.

If they do, they'll get my money, too.


Steaming Piles

grcumb writes | more than 6 years ago

[Cross-posted from the Scriptorum.]

Sometimes you have to destroy the document in order to save it....

I give up. I can't support OpenOffice Writer any more, and it's nobody's fault but their own. For anything more than simple tasks, the application is terrible. Their only saving grace is that Microsoft Office has its own brand of polished turd, named Word. Collectively, they are racing to the bottom of a decade-long decline in usability.

No, that's too generous. The thing is, they're at the bottom. They are useless for any but the most trivial tasks, and the most trivial tasks are better accomplished elsewhere, anyway.

Yes, I'm ranting. Let's put this into a proper context:

I hate word processors. For any but the simplest tasks, their interfaces are utterly ridiculous. I haven't liked a word processing interface since WordPerfect circa version 5, and if I had my own way, I'd author all my documents in either emacs or vi, depending on the circumstances.

Why do word processors suck so badly? Mostly, it's because of the WYSIWYG approach. What You See Is What You Get, besides being one of the most ghastly marketing acronyms to see the light of day in the digital era, is ultimately a lie. It was a lie back in the early 1990s when it first hit the mainstream, and it remains a lie today. The fact of the matter is that trying to do structuring, page layout and content creation at the same time is a mug's game. Even on a medium as well understood as paper, it's just too hard to control all the variables with the tools available and still have a comprehensible interface.

But the real sin that word processors are guilty of is not that they're trying to do WYSIWYG - okay, it is that they're trying to do WYSIWYG, but the way they go about it makes it even worse. Rather than insisting that the user enter data, structure it and then lay it out, they cram everything into the same step, short-circuiting each of those tasks, and in some cases rendering them next to impossible to achieve.

Writing, then structuring, then formatting a document (or even just doing each through its own interface) is easier to learn and easier to accomplish than the all-in approach we use today. For whatever reason, though, we users are deemed incapable of creating a document without knowing what it's going to look like right now, and for our sins, that's what we've become. And so we are stuck with word processors that are terrible at structuring and page layout as well as being second-rate text authoring interfaces. They do nothing well, and many things poorly, in no small part because of the inherent complexity of trying to do three things at once.
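To make the point concrete, here's a toy sketch (Python, purely illustrative; the node types and renderers are my own invention, not anything these programs actually do): capture content and structure first, and let presentation be applied afterwards by interchangeable renderers.

```python
# Toy illustration of separating content and structure from presentation.
from dataclasses import dataclass
from typing import List

@dataclass
class Node:
    kind: str  # "heading" or "para" -- the structure
    text: str  # the content

def render_plain(doc: List[Node]) -> str:
    """One presentation: plain text with underlined headings."""
    out = []
    for node in doc:
        out.append(node.text)
        if node.kind == "heading":
            out.append("=" * len(node.text))
        out.append("")
    return "\n".join(out)

def render_html(doc: List[Node]) -> str:
    """A second presentation of the very same structure and content."""
    tags = {"heading": "h1", "para": "p"}
    return "\n".join(f"<{tags[n.kind]}>{n.text}</{tags[n.kind]}>" for n in doc)

doc = [
    Node("heading", "Steaming Piles"),
    Node("para", "Authoring, structure and layout are separate concerns."),
]
print(render_plain(doc))
print(render_html(doc))
```

Either renderer can change without touching the document itself, which is exactly the kind of separation WYSIWYG makes so hard.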

It doesn't help that their technical implementation is poor. The Word document format is little better than a binary dump of memory at a particular moment in time. For our sins, OpenOffice is forced to work with that as well, in spite of having the much more parse-worthy ODF at its disposal these days.

There's no changing any of this, of course. The horse is miles away, and anyway the barn burned down in the previous millennium. The document format proxy war currently underway at the ISO is all the evidence I need to know that I'll be dealing with stupid stupid stupid formatting issues for years to come. I will continue to be unable to properly structure a document past about the 80th percentile, which is worse than not at all. I will continue to deal with visual formatting as my only means to infer context and structure, leaving me with very little capacity to do anything useful with the bloody things except to print them out and leave them on someone's desk.

Maybe I'll just stop using them at all. Maybe I'll just start doing everything on the web and never print again. I'm half serious about this, actually. At least on the Web, the idea that content and presentation are separate things isn't heresy. At least on the Web, I can archive, search, contextualise, comment, plan, structure and collaborate without having to wade through steaming piles of cruft all the time.

At least on the Web, I can choose which steaming piles I step into.

I'm going to start recommending people stop using Word as an authoring medium. There are far better, simpler tools for every task, and the word processor has been appropriate for exactly none of them for too long now. Sometimes you have to destroy the document in order to save it.


Trust Works All Ways

grcumb writes | more than 6 years ago

[Cross-posted from the Scriptorum.]

The Debian OpenSSL vulnerability apparently went unnoticed for well over a year. Why is it that crackers and script kiddies never found it and/or exploited it?

Over the weekend, I've been thinking about last week's disclosure concerning Debian's OpenSSL package, which in effect stated that all keys and certificates generated by this compromised code have been trivially crackable since late 2006.

There's a pretty good subjective analysis of the nature of the error on Ben Laurie's blog (thanks, Rich), and of course the Debian crew itself has done a fairly good job of writing up the issue.

The scope of this vulnerability is pretty wide, and the ease with which a weak key can be compromised is significant. Ubuntu packaged up a weak key detector script containing an 8MB data block which, I'm told, included every single possible key value that the Debian OpenSSL package could conceivably create.
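To give a sense of how that kind of detector works in principle, here's a minimal sketch (Python; the blocklist file, the key file and the SHA-256 fingerprinting scheme are hypothetical stand-ins, and the real Debian/Ubuntu tooling works differently): because the broken generator could only ever produce a small number of distinct keys, every possible key can be fingerprinted ahead of time and checked by simple set membership.

```python
# Illustrative only: a blocklist-based weak-key check. The file names and the
# SHA-256 fingerprinting scheme are assumptions for this sketch, not the format
# used by the actual Debian/Ubuntu detector scripts.
import hashlib

def load_blocklist(path):
    """Load known-weak key fingerprints (hex strings), one per line."""
    with open(path) as f:
        return {line.strip() for line in f if line.strip()}

def fingerprint(public_key_bytes):
    """Reduce a public key to a short, comparable fingerprint."""
    return hashlib.sha256(public_key_bytes).hexdigest()

def is_weak(public_key_bytes, blocklist):
    """True if this key is one of the few the broken PRNG could generate."""
    return fingerprint(public_key_bytes) in blocklist

if __name__ == "__main__":
    blocklist = load_blocklist("weak_fingerprints.txt")  # hypothetical file
    with open("id_rsa.pub", "rb") as f:                  # hypothetical key file
        key = f.read()
    print("WEAK" if is_weak(key, blocklist) else "not in the blocklist")
```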

The question that kept cropping up for me is: This one-line code change apparently went unnoticed for well over a year. Why is it that crackers and script kiddies never found it and/or exploited it? Numerous exploits on Microsoft Windows would have required far more scrutiny and creativity than this one. Given the rewards involved for 0-day exploits, especially in creating platforms for cross-site scripting attacks, why is it nobody bothered to exploit this?

My hypothesis - sorry, my speculation is this: People at every stage of the production process and everywhere else in the system trusted that the others were doing their job competently. This includes crackers and others with a vested interest in compromising the code. I should exclude from this list those who might have a reasonable motivation to exploit the vulnerability with stealth and to leave no traces. If, however, even they didn't notice the danger presented by this tiny but fundamental change in the code base, well my point becomes stronger.

The change itself was small, but not really obscure. It was located, after all, in the function that feeds random data into the encryption process. As Ben Laurie states in his blog, if any of the OpenSSL members had actually looked at the final patch, they would almost certainly have noticed immediately that it was non-optimal.

In all this time, apparently, nobody using Debian's OpenSSL package has actually (or adequately) tested to see whether the Debian flavour of OpenSSL was as strong as it was supposed to be. That level of trust is nothing short of astounding. If in fact malware authors were guilty of investing the same trust in the software, then I'd venture to state that there's a fundamental lesson to be learned here about human nature, and learning that lesson benefits the attacker far more than the defender:

Probe the most trusted processes first, because if you find vulnerabilities, they will yield the greatest results for the least effort.

P.S. Offhand, there's one circumstance that I think could undermine the credibility of this speculation, and that's if there's any link between this report of an attack that compromised not less than 10,000 servers and the recent discovery of the Debian OpenSSL vulnerability.
