
Comments


Ask Slashdot: Which Router Firmware For Bandwidth Management?

aussersterne Using DD-WRT (Kong latest "old" driver version) (98 comments)

on a Netgear R6300 and it has been very fast, with great signal quality, and the QoS features are working as expected.

Both the R6250 and R6300 have a dual-core 800MHz CPU, so they have the power to handle a decent QoS load without bogging down potential throughput too much. I'm satisfied, and it wasn't that expensive. If your situation isn't terribly complex (i.e., you don't have many dozens of users and extensive QoS rules), it might be a good choice.

The R7000 is even faster and supports external antennas, so I second that suggestion, but it's also twice the price of the 6250/6300, which can be found on sale for $100-$125 brand new if you're a good comparison shopper and/or patient.
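For the curious: the primitive underneath QoS features like DD-WRT's is the token bucket; the Linux HTB qdisc these firmwares build on is essentially a hierarchy of them. Here's a toy sketch in Python (purely illustrative, with made-up rate, burst, and packet sizes; real firmware does this in the kernel, not in Python):

    import time

    class TokenBucket:
        """Toy token-bucket shaper: the primitive behind HTB-style QoS classes."""

        def __init__(self, rate_bps: float, burst_bytes: float):
            self.rate = rate_bps / 8.0    # refill rate, in bytes per second
            self.capacity = burst_bytes   # maximum burst size, in bytes
            self.tokens = burst_bytes     # start with a full bucket
            self.last = time.monotonic()

        def allow(self, packet_bytes: int) -> bool:
            """Return True if the packet fits the configured rate right now."""
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= packet_bytes:
                self.tokens -= packet_bytes
                return True
            return False

    # Shape a hypothetical flow to 1 Mbit/s with an 8 KB burst allowance.
    bucket = TokenBucket(rate_bps=1_000_000, burst_bytes=8_192)
    sent = held = 0
    for _ in range(1000):          # roughly 1 second of MTU-sized packets
        if bucket.allow(1500):
            sent += 1
        else:
            held += 1              # a real shaper would queue these, not drop them
        time.sleep(0.001)
    print(f"sent={sent}, held back={held}")

The burst allowance is the interesting design point: a flow can briefly exceed its average rate until the bucket drains, and HTB stacks many such buckets into prioritized classes, which is how interactive traffic stays snappy while bulk transfers get throttled.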

2 days ago

PC Gaming Alive and Dominant

aussersterne I think you're missing the point (your "not into (245 comments)

FPS" comment at the end is evidence of this).

In the PC gaming world, getting it to run at the highest settings *is* the game. It's like the "bouncing ball" graphics demos on 8-bit systems in the 1980s. The actual software isn't useful or meant to occupy the user's attention for long. The challenge is in *getting it to run* and the joy is in *seeing what my super-cool computer is capable of* in processing and graphics rendering terms.

Running on last year's card/settings? Sorry, you don't get the game.

This is why I stopped being a PC gamer in the late '90s. All I wanted was a better Tetris. What I got was a better bouncing ball demo.

about a week ago

A Third of Consumers Who Bought Wearable Devices Have Ditched Them

aussersterne It's early days yet. (180 comments)

There were a whole bunch of smartphones before the iPhone. Anyone remember them? I stumbled across my old Palm Centro the other day, which replaced a Treo 680. These devices were useful to some (I was one of them), but the cost/benefit calculation was finicky, and they didn't find widespread adoption.

Pop consensus was that smartphones were a niche market. Then, someone got one right (iPhone) and the whole industry took off. These days, people don't even realize they're using a "smartphone" (I can remember the early press using the term "supersmartphone") because it's just "my phone."

The same trajectory describes computing in general, from the 8-bit boxes that were fiddly and full of cables and user manuals and coding to the Windows era of the '90s. At first it was a geek thing, and lots of people got in and then got out, deciding it wasn't useful. Then, suddenly, a few UX tweaks and it was ubiquitous and transparent and a market we couldn't imagine the world being without.

I suspect the same will happen with wearable tech.

about two weeks ago

Are DVDs Inconvenient On Purpose?

aussersterne I guess I don't see the reason this is on the (490 comments)

front page of Slashdot. Of course this is price discrimination. Charge what the market will bear. Segment your users accordingly. Maximize revenue through each avenue, carefully matching the value offered to each segment to its pricing, and so on.

This is not a story, this is marketing 101—it's what every marketing-driven organization (basically everyone in the modern economy) does, and the bigger they are, the better they do it.

It's not that any of this is wrong; it's just not newsworthy. We could write the same piece about any number of consumer goods companies, SaaS platforms, etc.

I guess my response to this is: "Yes. And?"

about three weeks ago

The Billionaires Privatizing American Science

aussersterne Social contract? Them's fighting words. (279 comments)

Seems to me you're talking about SOCIALISM, or even worse, COMMUNISM.

I didn't sign no contract, and there ain't no such thing as society. That's a lie told by Karl Marx.

— All of America

about a month ago

Silicon Valley's Youth Problem

aussersterne I do freelance/consulting for startups. Why? (225 comments)

Because:

- The pay is 2-3x what I could get paid at established firms
- The relationship-starting practices actually make sense (an interview among humans, often with C-levels, rather than with an HR drone, and testing that involves work on-product rather than abstract, unrelated HR games)
- They are thankful to have me and pleasant to work with (as opposed to confronting the HR bureaucracy and middle management)
- I get better titles and better status/authority within the firm

I do good work, I produce value, and the startups that I work with see that and can measure it quantitatively. Established firms could if they wanted to, but that's the point: they don't want to. They want to pay you as little as they can get away with, and have you as silent and head-hung as they can get you to be.

I stopped working for stodgy HR- and middle-management-heavy firms years ago. It basically sucked, and was soul-sucking.

about a month ago

Silicon Valley's Youth Problem

aussersterne Gosh, what's sauce for the goose... (225 comments)

Companies want to talk about making yourself competitive in the labor market, then bitch and moan when those that will pay get all the hot talent?

Oh noez! Whatever will we do!?

I'd say that if someone gets paid $big_bucks at $hot_startup, they're entitled to it. If you want them, pony up.

about a month ago

Silicon Valley's Youth Problem

aussersterne Ditto. Old-fashioned 9-5 work at an established (225 comments)

company now:

- Pays less
- Is less secure
- Is a shitty environment
- Offers dwindling benefits
- And little respect

You're cannon fodder, that's all.

At startups and companies with that "hot startup" attitude (there are a few established companies that do this), you're the core of the business, the brains of the operation, worthy of any perks or cash they can throw at you.

Who wants to work where they're completely undervalued when they can work where they're (if anything) overvalued?

Make the salary at least reasonable, the hiring practices sane, the benefits good, and the job security reliable, and you'll find that a lot of young people are willing to work at stodgy old firms, just like they used to.

Employees are just tired of being treated like shit. These days hot startup > freelance/consult > established firm when it comes to the deal you get as a worker.

about a month ago

Sony & Panasonic Next-Gen Optical Discs Moving Forward

aussersterne Yes, yes, yes. (250 comments)

At one point I paid $600 for a used full-height hard drive, made out of a solid hunk of alloy, as the first hard drive for my PC.

So?

Way to let the point fly over your head.

By the mid-'90s, we could get backup solutions that were—yes—$1,000 to $3,000 for the mechanism and $15-$30 for each piece of media.

But they:

- Would cover the capacity of most consumer drives at the time within 1-4 cartridges
- Would thus back up your entire consumer data library for $50-$150 per complete backup

This can't be done any longer. Not even close.

My point wasn't to get into a "history" pissing match. Sheesh, yes, also back in the day there were no such things as digital computers or hard drives or printing presses or even written script and everything had to be passed along as oral tradition, which meant that the cost of a backup was the cost of a human life.

As I said, this misses the point entirely. One might have hoped that in the process of getting here from the mid-'90s, we'd have gone forward rather than backward in our ability to make backups on removable storage media.

about a month ago

Sony & Panasonic Next-Gen Optical Discs Moving Forward

aussersterne Translation: Where is the consumer solution? (250 comments)

I can't find any data on MSRP now, but back in the day it seems to me that there were storage choices that were not so cost-prohibitive for consumers.

4mm and 8mm tape drives with multi-gigabyte capacities that compared favorably with the hard drives of the time could be had for a few hundred dollars to a thousand or two, with media costs in the $10-$25 per tape range. At the time, there were also MO drives with significant capacities in similar ranges, at slightly higher media costs.

Back then, the capacity of one removable cartridge/disk was much closer to the capacity of consumer market hard drives. You might have to go through 1-4 tapes or cartridges to back up all of your data, but that meant less than $100 for each additional complete backup set.

Now consumer drive sizes are in the multi-terabyte range, while removable storage capacities are such that you'll need 10-15 pieces of media to back up even a single multi-terabyte drive, at $50-$100 per piece. I have 18TB online right now; at 300GB per disc, that's about 60 pieces of blank media for a single backup set. Back in the day, I had an Archive Python autoloader that held 4 DDS tapes for a total of 96GB compressed, when my total online storage was something like 40GB. In short, I had _excess_ capacity for less than $100 per backup set in a single operation.
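Making the arithmetic explicit (a quick back-of-the-envelope sketch; the capacities and prices are the assumptions from the paragraph above, not vendor specs):

    import math

    def media_per_set(total_gb: float, media_gb: float) -> int:
        """Blank discs or tapes needed for one complete backup set."""
        return math.ceil(total_gb / media_gb)

    # Today: 18 TB online, ~300 GB optical media at $50-$100 per disc.
    discs = media_per_set(18_000, 300)
    print(f"discs per set: {discs}")                          # 60
    print(f"cost per set: ${discs * 50:,}-${discs * 100:,}")  # $3,000-$6,000

    # Mid-'90s: ~40 GB online, DDS tapes at $15-$30 each
    # (~24 GB compressed per tape is an assumption).
    tapes = media_per_set(40, 24)
    print(f"tapes per set: {tapes}")                          # 2
    print(f"cost per set: ${tapes * 15}-${tapes * 30}")       # $30-$60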

At this level, it makes much more sense to just buy a pile of multi-terabyte hard drives (4TB drives are currently less than $150 street price) and use them. Faster, cheaper, and without the up-front cost of the mechanism (the backup drive) to pay for.

For consumers, dedicated backup technologies seem to have gone the way of the dodo.

about a month ago

Apple's Messages Offers Free Texting With a Side of iPhone Lock-In

aussersterne Did not have this problem. (179 comments)

I switched from iPhone to Android after using iMessage extensively and did not have this problem. So clearly it depends on some particular status/configuration of all the involved parties.

Does this depend on:

1) Moving the SIM from your old phone to your new phone
2) Leaving your old phone on and connected to WiFi so that iMessage still sees you as being on the network

Or something like that?

I know that when I switched, it was a really quick thing: the new Android phone arrived via USPS, I pulled my old SIM, put it into the new phone, turned off the old phone, and away we went. I was mid-conversation with several people and never experienced a hiccup over the course of the day. We even talked about it over SMS—I complained about the default keyboard on the new phone and all kinds of stuff.

Wasn't aware of this issue and didn't experience it. What gives?

about a month and a half ago

Ask Slashdot: When Is a Better Career Opportunity Worth a Pay Cut?

aussersterne I think you'll find that in most industrialized or (263 comments)

post-industrialized nations, failure to pay owed wages is considered a serious problem.

about 2 months ago

Ask Slashdot: When Is a Better Career Opportunity Worth a Pay Cut?

aussersterne The second company was in NY in my case— (263 comments)

that is, until the state got involved thanks to me and one other person. Then, one day, the building was locked, the board had fled to Europe, and all dozen or so employees were standing outside the door, baffled and unpaid, from what I understand. I got out sooner, by about two pay periods, and got mine back before it *all* went down.

about 2 months ago

Ask Slashdot: When Is a Better Career Opportunity Worth a Pay Cut?

aussersterne Will they pay you? (263 comments)

I've had it with small companies. During the '00s I twice started with small companies only to hear "pay will be late" at the end of an early pay period, then "pay is just around the corner" by the end of the next pay period. In one case, the CEO simply never paid; I left before the third no-pay period was over, demanding that I be paid for my hours, to which he basically replied "so sue us!" I did—but only managed to recoup some of what I was owed. In the other case, they eventually paid but then promptly fired me for the noises I'd made about leaving due to two periods with no pay; that CEO had the gall to act infuriated and hurt at my lack of loyalty to the company.

So make sure that joining a small company with a thin capital/revenue stream doesn't translate to "you promise to do it for the love of the company if they can't afford to pay you."

about 2 months ago

Mathematician: Is Our Universe a Simulation?

aussersterne Um, this is why this is a bad thought experiment. (745 comments)

Because logical slippage due to the vagaries of language is a decided risk.

Here you're mistaking the location of the dream. Dreams in *our* world, as *we* understand them, have these properties. But again (and as I said in my other post), we're talking about another world that we have no reason to assume is not fundamentally different from this one. In fact, for many reasons that don't need belaboring here (reasons bound up with the very logic of the proposition in relation to what we understand about our world), we have grounds to assume the opposite—that it *is* fundamentally different from this one.

How does a "dream" behave in another reality in which *this* entire reality can *be* such a "dream?" Who knows. Nothing of what we understand about "dreams" as we know them in practical conception is remotely similar to what we mean when we talk about *our entire reality.*

How does a "computer simulation" behave in another reality in which *this* entire reality can *be* such a "computer simulation?" Who knows. Nothing of what we understand about "computer simulations" as we know them in practical conception is remotely similar to what we mean when we talk about *our entire reality.*

All we have to do to call the universe either a dream or a computer simulation is completely throw out any particular characteristics that are unique and empirically attributable to what we mean when we say "dream" or "computer simulation" as we are able to make use of these terms.

In other words, sure, this universe is a computer simulation or it's a dream...for certain values of "computer simulation" or "dream" that, if we were to accept them as valid, make the terms able to encapsulate *just about any phenomenon*.

This universe could also just be another reality's version of a "jumbo citrus fruit" or of an "Oscar awards ceremony," for the same reasons, and with the same level of practical or logical utility obtaining for these statements. For Slashdot purposes, I propose that we collaboratively write a paper on how this universe is just another encapsulating universe's version of a "Netcraft confirms it, Linux is dying!" press release.

about 2 months ago

Mathematician: Is Our Universe a Simulation?

aussersterne Um, certainly it does, (745 comments)

if we're conflating matter with information or information-processing.

A blender perfectly simulates what happens in a blender, mapping matter to information. It is empirically perfect, in that every possible unit of information is represented by a dedicated unit of matter, without shortcuts; it is a perfect simulation of what happens in the theoretical case of "something being blended," which is a subset of the logically possible set of phenomena connected to the physical manifestations found in an appliance store as a "universe" of a particular kind.

"Ah," goes the response, "but in conventional simulations, the physical nature of the reality being simulated is different from the physical nature of the substance of the simulation, i.e. there is a logical congruence reliant upon some measure of generalization, but not a physical congruence, because the only reason to 'run a simulation' is for the case in which physical resources are inadequate to the computational task with complete fidelity, i.e. the case in which we can not 'simulate the concept' using a perfect and total material instance of it."

So be it. But that's my point. If all of this—you, me, the universe—is just a simulation in a "computer" of a physical order so radically different from it as to be analogous to the physical differences between—say—the simulation of a nuclear explosion and the explosion itself (the sorts of things that we need to run simulations of)—then we're talking about a "real" (i.e. non-computed, non-simulation) space so different from our own as to make the use of our terms ("computer", "simulation", and so on) in it, bound up as they are with our own ontological and epistemological limitations and assumptions, essentially meaningless—or worse, ideological—suggestive of something (by virtue of the intuitive and connotative properties of 'computer' and 'simulation') that simply isn't (and, practically speaking, can't be in any universe that we're familiar with) the case.

about 2 months ago

Mathematician: Is Our Universe a Simulation?

aussersterne Silly language games. (745 comments)

For this to be true in even the most allegorical sense would require that we stretch the definitions of "computer" and "simulation" well beyond anything we currently understand and well beyond the bounds of our ability to be concise and specific about what the terms mean. Using these terms here is just mixing up apples and oranges.

We might as well, in other words, say that our universe is a blender inside a giant appliance store, a stage play inside a giant theatre district, a mildewing blow tickler inside a giant hoarder's garage mess, or anything else bearing one of the rough relationships signal:carrier, content:form, fragment:whole, instance:structure, etc.

I mean, what sort of computer are we talking about here?
What is its nature, not just logically, but physically? Do we even know that we're speaking "physically"? Isn't this the scale at which such quantities break down?
And doesn't our idea of computation and simulation require precisely that mathematical rules apply for these to be carried out in the first place?

about 2 months ago

Ask Slashdot: Are Linux Desktop Users More Pragmatic Now Or Is It Inertia?

aussersterne Spoken like an arrogant developer. (503 comments)

Do they continue to be gainfully employed as a digger, yet still dig with their bare hands?

What do they and their boss know about their productivity and job requirements that you don't?

What are they digging for? Is it likely to be damaged by a spade? Are they relying on the tactile sensation in their hands as they dig to make critical digging decisions of some kind? What is the cost of spades? What is the urgency of this dig? Is the limited supply of spades reserved for cases in which rapid digs are needed, in order to avoid excessive spade wear? How long do they dig? Does the spade cause repetitive stress injuries or blisters that hamper their work later on, and for longer periods of time, despite the apparent productivity gains early on? Even if we go all the way to the silly end of the spectrum, are spades against their religion? Even if so, are they nonetheless the most productive member on their team even with bare hands, leading the boss to not give two damns whether they use a spade or a ball of cotton candy to do their work? If you mess with the magic sauce that makes them the most productive person on the team, are you going to be out of a job before they are, even if you believe that your orders for them to change are the "correct" ones?

It seems to me that the job of tech designers isn't to know about digging, but to listen to the diggers carefully as the experts on their kind of digging, digging needs, and the totality of their work life as diggers, and to thoughtfully provide the technical resources needed to enable diggers to do digging as they see fit. They are, after all, the diggers. We are the tech people. Our job is to make tech—which is merely a means to everyone else, not an end. Make the wrong means that doesn't help them to achieve their ends, and you will find that nobody values your tech, no matter how much you try to explain that a spade is a spade.

about 3 months ago

Ask Slashdot: Are Linux Desktop Users More Pragmatic Now Or Is It Inertia?

aussersterne Some of the GNOME problem is in evidence here. (503 comments)

We're conflating use cases and identities when we say "Newbie." As technology designers, we need to be concerned with use cases. There may be a statistical overlap between the two, but mistaking one for the other gets us into deep water for design purposes.

Rather than newbies, let's talk use cases.

Case #1: User is not "at desk, at work" but is rather "in flow, in everyday personal life." They need, for party-planning or kid-care purposes, for example, to "send an SMS," "send an email," or "buy more diapers on Amazon.com." These are use cases that are all much better handled by tablets or mobiles, particularly if the user does not spend most of their work or personal life sitting at a computing system. The larger computing system imposes an extraordinary amount of overhead for (say) the stay-at-home parent who just wants more diapers. Leave the playroom, go to the den, power up the desktop, sit down, confront a desktop full of resources, figure out which one is the right one, start the application, and so on. All of that is overhead when we have mobiles: pull iPhone from pocket, press button, tap Amazon, type "diapers," click Buy, put phone back in pocket.

As technology folks, we have a terrible habit of taking someone's bewilderment to mean that they need more training or they're a "newbie," rather than looking at it practically: they're being told that they have to do an *awful lot of work* (moving through the house, navigating a full suite of powerful computing resources, learning to manage them) just to get some more diapers in the midst of their *real life*, the one that they actually care about, which involves diapers, not computing.

Case #2: Person new to computing is also new to the job, but it is now their *full time job* to make charts and graphs with Excel. They will happily sit down with the 600 page book, online training tutorials, and get to work learning. Why? Because this is a set of resources that are not overhead to them—it is the productive work that they will be expected to do, so the investment in time and computing use makes perfect sense. It is work, not waste.

I'd argue that in most cases, trying to marry a full-on computing environment (hierarchical file system or DB storage in quantity, multiple applications, multiple peripherals and forms of connectivity) to a rapid, task-based interface is not going to work out because they're two different use cases. Rapid, task-based use demands lightness and speed. General-purpose "big computing" resources toward the achievement of office work demands feature-richness, open-endedness, and deliberateness (i.e. the opposite of lightness and speed). One is highly endpoint-oriented, the other has no endpoint and is highly process-oriented.

The right answer is not to redesign the desktop environment. The right answer is to get the stay-at-home parent an iPad, or a laptop with everything but the web browser uninstalled, one that preferably boots straight to the web browser—in which case, the UI doesn't matter at all, because the user has no intention to use it.

The "newbies" that we commonly reference are actually a use case—people that feel that their current goals are not well-served by a high-overhead investment in full-scale computing. To serve their needs with a full-on whitebox computer, we just have to strip out the general purpose computing entirely, or at the very least, hide it entirely—which makes the system all but useless for those embroiled in "general purpose computing" use cases, particularly in comparison to those that have a full desktop UI available to them.

Make a better desktop environment *and* make a better information appliance, and both sets of users will thank you.

Try to make a desktop environment *that is* an efficient information appliance, and the computing-for-work people will find it to be inefficient and unhelpful while the casual-net-users will find it to be slow and needlessly complex in comparison to their sister's iPad.

about 3 months ago

Submissions


Console gaming is dead? How about $14,600 for a launch-day PS4

aussersterne writes  |  about 4 months ago

aussersterne (212916) writes "Seven years after the release of the PS3, Sony released the PS4 on Friday to North American audiences. Research group Terapeak, who have direct access to eBay data, finds that despite claims to the contrary, console gaming still opens wallets. Millions of dollars in PS4 consoles were sold on eBay before the launch even occurred, with prices for single consoles reaching as high as $14,600 on Friday. Would you be willing to pay this much to get a PS4 before Christmas? Or are you more likely to have been the seller, using your pre-orders to turn a tidy profit?"
Link to Original Source

Is the Desktop PC really dead?

aussersterne writes  |  about a year ago

aussersterne (212916) writes "After IDC and Gartner released numbers on declines in PC sales, the technology press descended into a navel-gazing orgy of woe, declaring the PC market to be collapsing, in dire straits, all but ended. But market research company Terapeak uses eBay data to show that desktop PC sales on eBay remain only slightly off two-year highs—with a catch: most of the sales are in used and refurbished PCs of older vintages, at price points well below what new PCs cost. Perhaps the "PCs are good enough" arguments have some substance behind them. Are consumers just satisfied with what they can already find on the used market for less than $200, Windows license included?"
Link to Original Source

Does the iPhone's Closed Nature Foster Innovation?

aussersterne writes  |  about 4 years ago

aussersterne (212916) writes "The heated debate over Apple's "walled garden" has ranged for years now, only growing more intense with the rise of iPhone apps and the recent release of the iPad. Contrary to conventional wisdom, however, some are suggesting that Apple's particular approach to closedness has actually been a boon for innovation and egalitarianism in ways that few had previously thought possible. In a recent NYT article, Steven Johnson says, "I’ve long considered myself a believer in this gospel and have probably written a hundred pages of book chapters, essays and blog posts spreading the word. Believing in open platforms is not simple techno-utopianism. Open platforms come with undeniable costs. The Web is rife with pornography and vitriol for the very same reasons it’s so consistently ingenious. It’s not that the Web is perfect, by any means, but as an engine of innovation and democratization, its supremacy has been undeniable. Over the last two years, however, that story has grown far more complicated, thanks to the runaway success of the iPhone (and now iPad) developers platform — known as the App Store to consumers." Can a walled garden, as Johnson suggests, actually give rise to a "rainforest" if executed in Apple-like ways?"
Link to Original Source

Journals

aussersterne has no journal entries.
