
Comments


User Plea Means EISA Support Not Removed From Linux

aussersterne Working with state agencies in the '90s (189 comments)

I saw a lot of EISA systems. It was a reasonable performer and physically robust (not as sensitive as PCI cards to positioning in slots, etc.). I'd say that EISA hardware was generally of very good quality, but high-end enough that most consumers wouldn't run into it despite being a commodity standard, sort of like PCI-X.

The systems I had experience with were running Linux, even then. :-)

about a week ago

The Next Decade In Storage

aussersterne Seconded. (93 comments)

For a very long time, tape drives and media gave tape drives and media a bad name.

Consumer QIC — about 1% of tapes actually held any data, total snake oil that took 10 days to "store" 10 megs (immediately unreadable in all cases)
4mm — Tapes good for one pass thru the drive; drive good for about 10 tape passes
8mm — Tapes good for maybe 10 passes thru the drive; drive good for about 100 tape passes before it starts eating tapes

For all three of the above: Don't bother trying to read a tape on any drive other than the one that wrote it; you won't find data there.

Real QIC — Somewhat more reliable, but vulnerable to dust and magnetic fields; drive mechanisms not robust, finicky about door closings

Basically, the only tapes that have ever been any damned good are 1/2 inch or wider and single-reel for storage. Problem is that none of these have ever been particularly affordable at contemporary capacities and they still aren't. Any non-enterprise business should just buy multiple hard drives for their rotating backups and replace the lot of them once a year.

about two weeks ago

KDE Frameworks 5.3 and Plasma 2.1 – First Impressions

aussersterne Experts are busy. (84 comments)

And they ALREADY have expertise.

A computing expert already has decades of highly detailed experience and familiarity with a bunch of paradigms, uses, and conventions.

Experts are the LAST people that want to read manuals for basic things they already have extensive experience with, like desktop environments. Again, they're busy. Being experts.

So, reading the manual on new tech that needs to be implemented in a complex system—great. Reading the manual on a desktop environment? Seriously? That's the last thing an expert wants to be bothered with. "I've used ten different desktop environments over thirty years. Can't you pick one set of conventions I'm already familiar with and use it, so that I can apply my expertise to the actual problems I'm trying to solve? Why reinvent the wheel in such a simple, basic system?"

DEs should leverage existing knowledge and use habits to enable experts to get their real work done quickly. For an expert, using the desktop is NOT the problem at hand requiring a solution. It's not what they're being paid for and not what they care about. Experts love to learn new things—in their area of expertise.

So sure, desktop environment developers probably love to poke around in KDE's front end, code, and docs. But anyone else? People that are not DE specialists are not so excited about the new learning project that is "my desktop," I assure you. The desktop is the last thing they want to be consciously focusing on over the course of the day.

about two weeks ago

KDE Frameworks 5.3 and Plasma 2.1 – First Impressions

aussersterne In the very first image... (84 comments)

The tree widgets on the left are mismatched: some solid lines, some spaces with alphanumeric characters; the alpha characters are black, yet the lines are gray. The result is visual noise that creates visual-processing and cognitive load for no reason, adding nothing.

The parenthetical text at the top has a title whose margin (left whitespace to other widgets) is significantly different from the text below it; there are spaces between the parentheses and the text, which no text or print style guide in the world endorses because it separates the parenthetical indicators from the parenthetical text, when they should be tightly bound for clarity.

The window title preserves the absurd convention of using both the binary name and a descriptive title together, and separates them with a typographical element (an em-dash) which is inappropriate in a label or design element because it is asynchronous—it indicates a delay in interpretation and pronunciation (as the em-dash just a few words ago in this paragraph does) and thus suggests long-form reading, which is not the intent for at-a-glance window titles (unless you don't want them to be very usable).

The title of the list widget, "Information Modules," is superfluous and redundant; the user opening an "About" dialog expects to see "information" from the start, and does not need to know about implementation ("modules").

The resize handle contrasts significantly with the window background, drawing undue attention to this particular area of the window above others (why is it "louder" than the window title, for example? Window controls should be secondary to window content and all at the same visual "volume" for usability).

In short—they still don't get it; they are signaling, in conventional ways that most users process subconsciously, thought habits and forms of attention that do not contribute to efficiency and use, but rather detract and distract from it. This is the same old KDE with poor, unprofessional design that leads to cognitive clutter. It's not that KDE has "too much going on" but rather that KDE has "too much going on that isn't actually functional and adds nothing to users' ability to get things done."

Yuck.

about two weeks ago

Fewer Grants For Young Researchers Causing Brain Drain In Academia

aussersterne Nope, their work isn't shit. (153 comments)

But they can earn 3x as much by going into the non-academic private sector and doing their research for profit-driven corps that will patent and secret the hell out of it, rather than using it for the good of all. Because the general public doesn't want to own the essential everyday technologies of the future; they'd rather have it kept inside high corporate walls and be forced to pay wealthy billionaires through the nose for it.

And because bright young researchers actually have to eat, and actually want a life, they grudgingly go where the money is, knowing full well they're contributing to deep social problems to come. Myself included.

But why would I settle for a string of one-year postdoc contracts that pay like entry-level jobs and require superhuman hours and commitment when I can go earn six figures at a proper nine-to-five, with revenue sharing, great benefits, and job security? Yes, the company owns everything I do. But I get to pay my bills and build a personal future. Of course, society's future is much dimmer as the result of so many people making the same choice that I have, and so much good work ending up in private hands rather than public ones.

But them's the beans. If you want to own the future, public, you've got to be willing to pay for it.

about three weeks ago

Tumblr Co-Founder: Apple's Software Is In a Nosedive

aussersterne I think this is pretty much it. (598 comments)

In terms of revenue, Apple is following the money. iOS has made Apple the wealthy powerhouse that it is today, not OS X. They don't want to lose the installed base or be perceived as just a phone company; OS X gets them mindshare and stickiness in certain quarters that matter (i.e. education and youth) for future iOS revenue.

But they don't actually want to invest much in it; it's increasingly the sort of necessary evil that is overhead, so it makes sense for them to shift to an iOS-led company. In the phone space, where the consumer upgrade cycle is tied to carrier contracts, it's important to have "new and shiny" every single year; consumers standing in AT&T shops are fickle people who are easily swayed by displays and sales drones that may or may not know anything about anything.

So the marketing rationale at Apple is (1) follow the revenue, which is mobile and iOS, (2) do what is necessary to stay dominant there, which means annual release cycles at least, and (3) reduce the cost of the other business wings it still needs as much as possible, so as to focus on core revenue competencies without creating risk, which means making OS X follow iOS.

It makes perfect business sense in the short and medium terms. In the long term, it's hard to see what effect it will have. It's entirely possible that they could wind down the OS X business entirely and remain dominant and very profitable as a result of their other product lines. It's also possible that poor OS X experiences and the loss of the "high end" could create a perception problem that affects one of their key value propositions, that of being "high end," and that will ultimately also influence their mobile sales down the road in negative ways as a result.

I'm a Linux switcher (just over five years ago now) that was tremendously frustrated with desktop Linux (and still dubious about its prospects) after using Linux from 1993-2009, but that has also in the last couple of months considered switching back. I switched to OS X largely for the quality of the high-end applications and for the more tightly integrated user experience. Now the applications business is struggling (the FCP problem, the Aperture events, the joke that is the iOS-synchronized iWork suite) and third-party applications have declined in quality (see: MS Office on OS X these days) as other developers have ceded the central applications ground to Apple. Meanwhile, the user experience on iOS remains sound but on OS X it has become rather less so as a result of the iOS-centricity of the company.

What to do? I've considered a switch back to Linux, but the Linux distros I've tried out in virtual machines have been underwhelming to me; the Linux desktop continues, so far as I can tell, to be in a worse state for my purposes than it was in 2008. I have no interest in Windows (I have Win7 and Win8 installations in VMs for specific applications, and even in a VM window they make me cringe; just complete usability nightmares).

It's a frustrating time for desktop users in general, I think; the consumer computing world has shifted to mobile/embedded devices and taken most of the labor, attention, and R&D with it. The desktop, needed by those of us that do productive computing work, has been left to languish on all fronts. It's completely rational in many ways at the macroeconomic level, but at the microeconomic level of individual workers and economic sectors, it's been a disaster.

about three weeks ago

Netflix Begins Blocking Users Who Bypass Region Locks

aussersterne Um, they just want to use Netflix. It adds value (121 comments)

to the media by making it easy to browse through, search, access, and stream.

And they're paying regular price.

We live in a very strange world when "piracy" has gone from "armed crews of criminal specialists seizing tonnage shipments of goods on the high seas with cannon and sword" to "a regular schmo paying the regular price to use a regular product in the regular way in his regular living room."

Hard to believe that the word still retains any of its negative connotation at all.

"Piracy" these days sounds an awful lot like "tuesday afternoon nothing-in-particular with tea."

about three weeks ago

Boston Elementary, Middle Schools To Get a Longer Day

aussersterne No, this is dumb. It should be shorter. (161 comments)

Very little useful learning goes on in school. And the top students need time outside of school to visit libraries, pursue intellectual hobbies, do independent reading, and generally do all the academic stuff that will actually matter in their lives later on (and matter to society later on).

By continually extending the school day and the school year, we increasingly ensure that we lock our best and brightest into mediocrity by tying up all of their time in institutionally managed busywork designed to ensure they don't deviate from the mean, which is pretty piss-poor.

about a month ago

Ask Slashdot: How Should a Liberal Arts Major Get Into STEM?

aussersterne Ph.D. is NOT a career move (280 comments)

An English major is NOT getting into a STEM Ph.D. program, no matter what.

Even if they were, job prospects are worse for STEM Ph.D. holders than for MS/BS holders—there are far fewer jobs that require Ph.D.-level qualifications outside of the professoriate and academia, and for Ph.D. holders in particular, employers are absolutely loath to hire overqualified people.

Inside the professoriate and academia, the job market is historically bad right now. It's not "get a Ph.D., then become a lab head or professor," it's "get a Ph.D., then do a postdoc, then do another postdoc, then do another postdoc, then do another postdoc, really do at least 6-7 postdocs, moving around the world every year the entire time, and at the end of all of that if you've managed to stay employed at poverty wages using highly competitive postdocs that you may not even get, while not flying apart at the emotional seams, you may finally be competitive enough to be amongst the minority of 40-year-old Ph.D. holders that gets a lab or a tenure-track position, at which point the fun REALLY begins as you are forced onto the grantwriting treadmill and feel little job security, since universities increasingly require junior faculty to 'pay their own way' with external grants or be budgeted out."

And that's INSIDE STEM, for which this person, as a B.A. holder trying to get into graduate programs, is almost certainly going to be uncompetitive.

Much more likely is that with great grades and GRE scores they'll be admitted to a humanities or social sciences Ph.D. program, with many of the same problems but with CATASTROPHICALLY worse job prospects due to the accelerating collapse of humanities budgets and support on most campuses.

Ph.D. is absolutely not the way to go unless you are independently wealthy and are looking for a way to "contribute to the world" since you don't actually have to draw a salary.

For anyone with student loans, it's a disastrous decision right now, and I wouldn't recommend it.

I say this as someone with a Ph.D. who is on a faculty and routinely is approached by starry-eyed top students looking to "make the world a better place" and "do research." Given the competition out there right now, only the superstars should even attempt it, and then only if they're not strapped for cash. Hint: If you don't know whether or not you're a superstar, you're not.

I think in a decade I've strongly recommended that someone enter a Ph.D. program once, and greeted the suggestion favorably maybe three times total, out of thousands of students, many of them with the classic "4.0 GPA" and tons of "book smarts."

In short, I disagree strongly with the suggestion. Unless you absolutely know that you're competitive already on the academic market, DO NOT GO. Don't listen to the marketing from the schools; it's designed to drive (a) your enrollment and tuition, and/or (b) your cheap labor as a teaching assistant/research assistant forever once you're in the program. It's a win for the institution, not for you.

The easiest sanity checks: Do you know exactly what your dissertation will be about and what, in broad strokes, you'll need to do to conduct your research, as well as what resources you'll need? Do you already have personal contact with faculty on a well-matched campus, in a well-matched department, who are championing you and want to bring you in as one of their own students/assistants?

If your answer to either of these questions is "no," then while you may be offered a position somewhere, you will be on the losing end of the deal and would be naive to take it.

about a month and a half ago

KDE Releases Plasma 5.1

aussersterne Not so much winding down as becoming moot. (60 comments)

The Linux desktop wars mattered when Linux was the future of the desktop.

Now that the desktop has a much smaller future, and Linux clearly doesn't play much of a role even in this drastically reduced future, KDE and GNOME simply don't matter much anymore.

Desktop Linux is a niche product, and it behaves like one—adoption is vendor-driven, and clients use whatever the vendor supplies.

For individual Linux users, things haven't moved in half a decade or more. Linux is still a mostly complete operating system with mostly working desktops. None of it is very polished (polish, as always, is just a couple years off in the future). Significant time and customization are required to make any stock distro+DE work well, things are generally cluttered, kludgy, and opaque, and for the hobbyist that fits the profile—the sort of person that will actually put up with and use this kind of computing environment—one or the other (KDE or GNOME) is already a clear favorite and this isn't likely to change.

Of course there is also the developer group, probably Linux's largest cohort of "serious" users on the desktop, but they just plain don't care much about which DE is installed. They're much more concerned with toolchains and versions of environment components.

So the KDE vs. GNOME thing is just plain...not that big a deal any longer, for most anyone.

The only possibly interesting development in a very long time is Elementary OS, which appears to have adopted a different philosophy from the one traditionally associated with Linux development/packaging groups. But whether this will ultimately translate into an important operating system and user experience, with its brand that supersedes the branding of the desktop environment itself, remains to be seen.

about 3 months ago

Glut of Postdoc Researchers Stirs Quiet Crisis In Science

aussersterne You're mistaking "we" in "we need." (283 comments)

You mean study something that enhances profits for the very, very wealthy.

Academic research works on an awful lot of problems that *the world* needs to solve, yet it makes no money for the propertied class, so there is no investment or funding available to support it.

Many fighting this fight aren't fighting for their pocketbooks; they're fighting to do science in the interest of human goods, rather than in the interest of capitalist kings.

about 4 months ago

Glut of Postdoc Researchers Stirs Quiet Crisis In Science

aussersterne Um, you didn't mention adjuncts. (283 comments)

Earning beneath minimum wage with a Ph.D. and without benefits is considerably worse than a postdoc's lot.

about 4 months ago

Scientists Seen As Competent But Not Trusted By Americans

aussersterne Close, but I think it's simpler and more normal (460 comments)

than that.

It's not that the public doesn't trust the abilities of scientists.

It's that they don't trust their motives. We have a long literary tradition that meditates on scientists who "only cared about whether they could, not whether they should," and the politicization of science makes people wonder not whether scientists are incompetent, but whether they have "an agenda," i.e. whether scientists are basically lying through their teeth and/or pursuing their own political agendas in the interest of their own gain, rather than the public's.

At that point, it's not that the public thinks "If I argue loudly enough, I can change nature," but rather "I don't understand what this scientist does, and I'm sure he/she is smart, but I don't believe they're telling me about nature; rather, they're using their smarts to pull the wool over my eyes about nature and profit/benefit somehow."

So the public isn't trying to bend the laws of nature through discourse, but rather simply doesn't believe the people that are telling them about the laws of nature, because they suspect those people as not acting in good faith.

That's where a kinder, warmer scientific community comes in. R1 academics with million-dollar grants may sneer at someone like Alan Alda on Scientific American Frontiers, but that sneering is counterproductive; the public won't (and doesn't want to) understand the rigorous, nuanced state of the research on most topics. It will have to be given to them in simplified form; Alan Alda and others in that space did so, and the scientific community needs to support (more of) that, rather than sneer at it.

The sneering just reinforces the public notion that "this guy may be smarter than me, but he also thinks he's better and more deserving than me, so I can't trust that what he's telling me is really what he thinks/knows, rather than what he needs to tell me in order to get my stuff and/or come out on top in society, deserving or not."

about 3 months ago

Consumer Reports: New iPhones Not As Bendy As Believed

aussersterne Re:I still don't get this. (304 comments)

The retail replacement cost is why it's insane to put it in your pants pockets.

"I just dropped a grand on this. I know, I'll subject it to huge forces and see what happens!"

Why would you do that?

about 4 months ago

Consumer Reports: New iPhones Not As Bendy As Believed

aussersterne Re:I still don't get this. (304 comments)

I frankly don't see any difference. Big, fat force, tiny little space. That's not good for a sheet of glass, a sheet of metal—hell, you've seen what happens to a sheet of paper after spending all day in your pockets. People learn that in grade school.

If it really has to be on your waist somewhere, get a holster. Otherwise, just carry the damned thing, or put it in a shirt or coat pocket, briefcase, backpack, etc.

Since the '90s, I've never regularly carried a mobile device in my pants pockets. Obviously, it would break, or at least suffer a significantly reduced lifespan. On the rare occasions when I do pocket a device for a moment, it's just that—for a moment, while standing, to free both hands, and it is removed immediately afterward because I'm nervous the entire time that I'll forget, try to sit down, and crack the damned thing.

about 4 months ago

Consumer Reports: New iPhones Not As Bendy As Believed

aussersterne I still don't get this. (304 comments)

Who thinks it's okay to sit on their phone? Why do people think they ought to be able to? It literally makes no sense. It's an electronic device with a glass screen. If I handed someone a sheet of glass and said, "put this in your back pocket and sit on it!" they'd refuse.

But a phone? Oh, absolutely! Shit, wait, no! It broke?!?!

about 4 months ago

Slashdot Asks: What's In Your Home Datacenter?

aussersterne Yup. (287 comments)

Same conclusion. It's too easy to feel that precarity from the early computing age (not enough storage! not enough cycles! data versions of things are special!) if you were there. I think there's some of that going on here on Slashdot a lot of the time.

People in love with old Unix boxen or supercomputer hardware. People that maintain their own libraries of video, but all that's stored there is mass-market entertainment. And so on. It's like newspaper hoarding.

Storage and computation are now exceedingly cheap. 8-bay eSATA RAID cases run a couple hundred bucks, new. 4TB SATA drives run less than that. With six RAID ports on a mainboard and a couple of dual- or quad-port eSATA PCI-x cards, you can approach petabytes quickly—and just for four digits. The same goes for processing power—a dual-processor Xeon setup (in which each processor can have core counts in the double digits) again runs just a couple thou.
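To make that capacity-versus-cost arithmetic concrete, here is a rough back-of-the-envelope sketch in Python. The drive size, bay count, enclosure count, and prices below are illustrative assumptions of mine, not figures quoted in the comment above:

# Back-of-the-envelope estimate for a hobbyist multi-enclosure RAID build.
# All numbers are illustrative assumptions, not quoted prices.
drive_tb = 4             # one 4TB SATA drive
drive_cost = 150         # rough street price, USD
enclosure_bays = 8       # bays per 8-bay eSATA RAID case
enclosure_cost = 250     # rough street price, USD
enclosures = 4           # e.g. one case per port on a quad-port eSATA card

total_tb = enclosures * enclosure_bays * drive_tb
total_cost = enclosures * (enclosure_cost + enclosure_bays * drive_cost)

print(f"{total_tb} TB raw for roughly ${total_cost}")
# Prints: 128 TB raw for roughly $5800
# Tens of terabytes of raw capacity for four digits; adding ports and
# enclosures scales the same arithmetic toward hundreds of terabytes.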

And data is now cheap and easy. Whatever you want—you can have it as data *already*. Movies? Music? Books? Big social data sets? They're coming out our ears. The investment of time and equipment required, all in all, to put yourself in a position to rip and store a library of "every movie you've ever rented," and then actually do so, is much larger than the cost of simply licensing them via streaming. The same goes for music, ebooks, and so on.

There's just no need. Even my desktop is now starting to feel obsolete—for the work computing I do, there's a good chance I'll just go to Amazon cloud services in the next year or two. At that point, an iPad, a wireless keyboard, and a couple apps will probably be all the computing power I need under my own roof. If I have a desktop, it'll just be to connect multiple monitors for screen real estate.

about 4 months ago

Slashdot Asks: What's In Your Home Datacenter?

aussersterne We're talking old rack gear here. (287 comments)

This isn't about a 1959 Corvette. It's about a 1959 garbage truck.

about 4 months ago

Submissions


Personal Drones Coming to Dominate the Hobbyist Radio Control Market

aussersterne aussersterne writes  |  about a month and a half ago

aussersterne (212916) writes "Drones continue to be in the news, with more and more "personal drone" incidents making headlines. It's easy to think of such stories as aberrations, but a well-known market research company has studied the radio control hobbyist market on eBay and found that sales of radio control helicopters and, more importantly, "quadcopters" (which are most often understood to be the "personal drone" category of items) are now—when taken together—the dominant form of radio control items sold on eBay. Radio control quadcopters in particular are growing much more quickly than the rest. Are we poised to see personal drones become much bigger influences on news and popular culture? Is it time for regulation?"
Link to Original Source

Console gaming is dead? How about $14,600 for a launch-day PS4

aussersterne aussersterne writes  |  about a year ago

aussersterne (212916) writes "Seven years after the release of the PS3, Sony released the PS4 on Friday to North American audiences. Research group Terapeak, who have direct access to eBay data, finds that despite claims to the contrary, console gaming still opens wallets. Millions of dollars in PS4 consoles were sold on eBay before the launch even occurred, with prices for single consoles reaching as high as $14,600 on Friday. Would you be willing to pay this much to get a PS4 before Christmas? Or are you more likely to have been the seller, using your pre-orders to turn a tidy profit?"
Link to Original Source

Is the Desktop PC really dead?

aussersterne aussersterne writes  |  about 2 years ago

aussersterne (212916) writes "After IDC and Gartner released numbers on declines in PC sales, the technology press descended into a navel-gazing orgy of woe, declaring the PC market to be collapsing, in dire straits, all but ended. But market research company Terapeak uses eBay data to show that desktop PC sales on eBay remain only slightly off two-year highs—with a catch: most of the sales are in used and refurbished PCs of older vintages, at price points well below what new PCs cost. Perhaps the "PCs are good enough" arguments have some substance behind them. Are consumers just satisfied with what they can already find on the used market for less than $200, Windows license included?"
Link to Original Source

Does the iPhone's Closed Nature Foster Innovation?

aussersterne aussersterne writes  |  more than 4 years ago

aussersterne (212916) writes "The heated debate over Apple's "walled garden" has ranged for years now, only growing more intense with the rise of iPhone apps and the recent release of the iPad. Contrary to conventional wisdom, however, some are suggesting that Apple's particular approach to closedness has actually been a boon for innovation and egalitarianism in ways that few had previously thought possible. In a recent NYT article, Steven Johnson says, "I’ve long considered myself a believer in this gospel and have probably written a hundred pages of book chapters, essays and blog posts spreading the word. Believing in open platforms is not simple techno-utopianism. Open platforms come with undeniable costs. The Web is rife with pornography and vitriol for the very same reasons it’s so consistently ingenious. It’s not that the Web is perfect, by any means, but as an engine of innovation and democratization, its supremacy has been undeniable. Over the last two years, however, that story has grown far more complicated, thanks to the runaway success of the iPhone (and now iPad) developers platform — known as the App Store to consumers." Can a walled garden, as Johnson suggests, actually give rise to a "rainforest" if executed in Apple-like ways?"
Link to Original Source

Journals

aussersterne has no journal entries.
