
Comments


Ask Slashdot: How Should a Liberal Arts Major Get Into STEM?

aussersterne Ph.D. is NOT a career move (279 comments)

An English major is NOT getting into a STEM Ph.D. program, no matter what.

Even if they were, job prospects are worse for STEM Ph.D. holders than for MS/BS holders—there are far fewer jobs that require Ph.D.-level qualifications outside of the professoriate and academics, and for Ph.D. holders in particular, employers are absolutely loath to hire overqualified people.

Inside the professoriate and academics, the job market is historically bad right now. It's not "get a Ph.D., then become a lab head or professor," it's "get a Ph.D., then do a postdoc, then do another postdoc, then do another postdoc, then do another postdoc, really do at least 6-7 postdocs, moving around the world every year the entire time, and at the end of all of that if you've managed to stay employed at poverty wages using highly competitive postdocs that you may not even get, while not flying apart at the emotional seams, you may finally be competitive enough to be amongst the minority of 40-year-old Ph.D. holders that gets a lab or a tenure-track position, at which point the fun REALLY begins as you are forced onto the grantwriting treadmill and feel little job security, since universities increasingly require junior faculty to 'pay their own way' with external grants or be budgeted out."

And that's INSIDE STEM, for which this person, as a B.A. holder trying to get into graduate programs, is almost certain to be uncompetitive.

Much more likely is that with great grades and GRE scores they'll be admitted to a humanities or social sciences Ph.D. program, with many of the same problems but with CATASTROPHICALLY worse job prospects due to the accelerating collapse of humanities budgets and support on most campuses.

Ph.D. is absolutely not the way to go unless you are independently wealthy and are looking for a way to "contribute to the world" since you don't actually have to draw a salary.

For anyone with student loans, it's a disastrous decision right now, and I wouldn't recommend it.

I say this as someone with a Ph.D. who is on a faculty and routinely is approached by starry-eyed top students looking to "make the world a better place" and "do research." Given the competition out there right now, only the superstars should even attempt it, and then only if they're not strapped for cash. Hint: If you don't know whether or not you're a superstar, you're not.

I think in a decade I've strongly recommended that someone enter a Ph.D. program once, and greeted the suggestion favorably maybe three times total, out of thousands of students, many of them with the classic "4.0 GPA" and tons of "book smarts."

In short, I disagree strongly with the suggestion. Unless you absolutely know that you're competitive already on the academic market, DO NOT GO. Don't listen to the marketing from the schools; it's designed to drive (a) your enrollment and tuition, and/or (b) your cheap labor as a teaching assistant/research assistant forever once you're in the program. It's a win for the institution, not for you.

The easiest sanity checks: Do you know exactly what your dissertation will be about and what you'll need to do, in broad strokes, to conduct your research, as well as what resources you'll need? Do you already have personal contact with faculty on a well-matched campus in a well-matched department who are championing you and who want to bring you in as one of their own students/assistants?

If your answer to either one of these questions is "no," then while you may be offered a position somewhere, you will be on the losing end of the deal and would be naive to take it.

5 days ago

KDE Releases Plasma 5.1

aussersterne Not so much winding down as becoming moot. (60 comments)

The Linux desktop wars mattered when Linux was the future of the desktop.

Now that the desktop has a much smaller future, and Linux clearly doesn't play much of a role even in that drastically reduced future, KDE and GNOME just don't matter much.

Desktop Linux is a niche product, and it behaves like one—adoption is vendor-driven, and clients use whatever the vendor supplies.

For individual Linux users, things haven't moved in half a decade or more. Linux is still a mostly complete operating system with mostly working desktops. None of it is very polished (polish, as always, is just a couple years off in the future). Significant time and customization are required to make any stock distro+DE work well, things are generally cluttered, kludgy, and opaque, and for the hobbyist that fits the profile—the sort of person that will actually put up with and use this kind of computing environment—one or the other (KDE or GNOME) is already a clear favorite and this isn't likely to change.

Of course there is also the developer group, probably Linux's largest cohort of "serious" users on the desktop, but they just plain don't care much about which DE is installed. They're much more concerned with toolchains and versions of environment components.

So the KDE vs. GNOME thing is just plain...not that big a deal any longer, for most anyone.

The only possibly interesting development in a very long time is Elementary OS, which appears to have adopted a different philosophy from the one traditionally associated with Linux development/packaging groups. But whether this will ultimately translate into an important operating system and user experience, with a brand that supersedes the branding of the desktop environment itself, remains to be seen.

about 2 months ago

Glut of Postdoc Researchers Stirs Quiet Crisis In Science

aussersterne You're mistaking "we" in "we need." (283 comments)

You mean study something that enhances profits for the very, very wealthy.

Academic research works on an awful lot of problems that *the world* needs to solve, yet it makes no money for the propertied class, so there is no investment or funding available to support it.

Many fighting this fight aren't fighting for their pocketbooks; they're fighting to do science in the interest of human goods, rather than in the interest of capitalist kings.

about 2 months ago

Glut of Postdoc Researchers Stirs Quiet Crisis In Science

aussersterne Um, you didn't mention adjuncts. (283 comments)

Earning beneath minimum wage with a Ph.D. and without benefits is considerably worse than a postdoc's lot.

about 2 months ago

Scientists Seen As Competent But Not Trusted By Americans

aussersterne Close, but I think it's simpler and more normal (460 comments)

than that.

It's not that the public doesn't trust the abilities of scientists.

It's that they don't trust their motives. We have a long literary tradition that meditates on scientists who "only cared about whether they could, not whether they should," and the politicization of science makes people wonder not whether scientists are incompetent, but whether they have "an agenda," i.e. whether scientists are basically lying through their teeth and/or pursuing their own political agendas in the interest of their own gain, rather than the public's.

At that point, it's not that the public thinks "If I argue loudly enough, I can change nature," but rather "I don't understand what this scientist does, and I'm sure he/she is smart, but I don't believe they're telling me about nature; rather, they're using their smarts to pull the wool over my eyes about nature and profit/benefit somehow."

So the public isn't trying to bend the laws of nature through discourse, but rather simply doesn't believe the people that are telling them about the laws of nature, because they suspect those people as not acting in good faith.

That's where a kinder, warmer scientific community comes in. R1 academics with million-dollar grants may sneer at someone like Alan Alda on Scientific American Frontiers, but that sneering is counterproductive; the public won't (and doesn't want to) understand the rigorous, nuanced state of the research on most topics. It will have to be given to them in simplified form. Alan Alda and others in that space did so, and the scientific community needs to support (more of) that, rather than sneer at it.

The sneering just reinforces the public notion that "this guy may be smarter than me, but he also thinks he's better and more deserving than me, so I can't trust that what he's telling me is really what he thinks/knows, rather than what he needs to tell me in order to get my stuff and/or come out on top in society, deserving or not."

about 3 months ago

Consumer Reports: New iPhones Not As Bendy As Believed

aussersterne Re:I still don't get this. (304 comments)

The retail replacement cost is why it's insane to put it in your pants pockets.

"I just dropped a grand on this. I know, I'll subject it to huge forces and see what happens!"

Why would you do that?

about 3 months ago

Consumer Reports: New iPhones Not As Bendy As Believed

aussersterne Re:I still don't get this. (304 comments)

I frankly don't see any difference. Big, fat force, tiny little space. That's not good for a sheet of glass, a sheet of metal—hell, you've seen what happens to a sheet of paper after spending all day in your pockets. People learn that in grade school.

If it really has to be on your waist somewhere, get a holster. Otherwise, just carry the damned thing, or put it in a shirt or coat pocket, briefcase, backpack, etc.

Since the '90s, I've never regularly carried a mobile device in my pants pockets. Obviously, it would break, or at least suffer a significantly reduced lifespan. On the rare occasions when I do pocket a device for a moment, it's just that—for a moment, while standing, to free both hands, and it is removed immediately afterward because I'm nervous the entire time that I'll forget, try to sit down, and crack the damned thing.

about 3 months ago

Consumer Reports: New iPhones Not As Bendy As Believed

aussersterne I still don't get this. (304 comments)

Who thinks it's okay to sit on their phone? Why do people think they ought to be able to? It literally makes no sense. It's an electronic device with a glass screen. If I handed someone a sheet of glass and said, "put this in your back pocket and sit on it!" they'd refuse.

But a phone? Oh, absolutely! Shit, wait, no! It broke?!?!

about 3 months ago

Slashdot Asks: What's In Your Home Datacenter?

aussersterne Yup. (287 comments)

Same conclusion. It's too easy to feel that precarity from the early computing age (not enough storage! not enough cycles! data versions of things are special!) if you were there. I think there's some of that going on here on Slashdot a lot of the time.

People in love with old Unix boxen or supercomputer hardware. People that maintain their own libraries of video, but all that's stored there is mass-market entertainment. And so on. It's like newspaper hoarding.

Storage and computation are now exceedingly cheap. 8-bay eSATA RAID cases run a couple hundred bucks, new. 4TB SATA drives run less than that. With six SATA ports on a mainboard and a couple of dual- or quad-port eSATA PCIe cards, you can reach hundreds of terabytes quickly—and for just four digits. The same goes for processing power—a dual-processor Xeon setup (in which each processor can have a core count in the double digits) runs just a couple thousand dollars.
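
As a rough sanity check on that claim, here's a back-of-the-envelope capacity calculation (the port counts, enclosure sizes, and drive capacity are illustrative assumptions, not a parts list):

```python
# Back-of-the-envelope raw capacity for the kind of build described
# above: onboard SATA ports plus eSATA enclosures on add-in cards.
onboard_ports = 6          # SATA ports on the mainboard (assumed)
esata_ports = 2 * 4        # two quad-port eSATA cards (assumed)
bays_per_enclosure = 8     # one 8-bay RAID case per eSATA port
tb_per_drive = 4           # 4TB SATA drives

onboard_tb = onboard_ports * tb_per_drive
enclosure_tb = esata_ports * bays_per_enclosure * tb_per_drive
total_tb = onboard_tb + enclosure_tb

print(total_tb)  # 280 TB raw, before RAID overhead
```

So a single tower plus enclosures lands in the hundreds-of-terabytes range raw; usable space is less once RAID parity and filesystem overhead are subtracted.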

And data is now cheap and easy. Whatever you want—you can have it as data *already*. Movies? Music? Books? Big social data sets? They're coming out our ears. The investment of time and equipment required, all in all, to put yourself in a position to rip and store a library of "every movie you've ever rented," and then actually do so, is much larger than the cost of simply licensing them via streaming. The same goes for music, ebooks, and so on.

There's just no need. Even my desktop is now starting to feel obsolete—for the work computing I do, there's a good chance I'll just go to Amazon cloud services in the next year or two. At that point, an iPad, a wireless keyboard, and a couple apps will probably be all the computing power I need under my own roof. If I have a desktop, it'll just be to connect multiple monitors for screen real estate.

about 3 months ago

Slashdot Asks: What's In Your Home Datacenter?

aussersterne We're talking old rack gear here. (287 comments)

This isn't about a 1959 Corvette. It's about a 1959 garbage truck.

about 3 months ago

Slashdot Asks: What's In Your Home Datacenter?

aussersterne No datacenter. Just a desktop computer (287 comments)

with 20 cores, 128GB RAM, 48TB online storage, and gigabit fiber coming in.

Yes, I use all of it, for work. But it's definitely not a "data center." These days, I don't know why anyone would want one—even moderately sized enterprises are increasingly happy to pay someone else to own the data center. Seems nuts to me to try to bring it into your basement.

If you just need the computation and/or the storage, desktops these days run circles around the datacenter hardware from just a few years ago. If you need more than that, it's more cost effective and reliable to buy into someone-or-other's cloud.

about 3 months ago

Slashdot Asks: What's In Your Home Datacenter?

aussersterne Why do this? (287 comments)

I sort of don't get it. White box PCs with many cores, dozens of gigabytes of RAM, and multiple gigabit Ethernet ports cost next to nothing these days with a few parts from Amazon.com. If the goal is just to play with powerful hardware, you could assemble one or a few white box PCs with *many* cores at 4+ GHz, *tons* of RAM, gigabit I/O, and dozens or hundreds of terabytes of online RAID storage for just a few thousand. Plug them straight into the wall and you get better computation, and frankly perhaps better I/O performance to boot, depending on the age of the rackware in question.

If you're really doing some crazy hobby experimenting or using massive data storage, you can build it out in nicer, newer ways that use far less (and more readily available) power, are far quieter, generate far less heat, don't take up nearly the space, and don't carry the ugliness or the premium-priced spare parts of the kinds of gear being discussed here. If you need the features, you can easily get VMware and run multiple virtual machines. 100Mbps and gigabit fiber are becoming more common and are easy to saturate with today's commodity hardware. There is an embarrassment of enterprise-ready operating systems in the FOSS space.

If you really need high reliability/high availability and performance guarantees, I don't get why you wouldn't just provision some service for yourself at Amazon or somewhere else and do what you need to do. Most SaaS and PaaS companies are moving away from trying to maintain their own datacenters because it's not cost effective and it's a PITA—they'd rather leave it to specialists and *really big* data centers.

Why go the opposite direction, even if for some reason you really do have the need for those particular properties?

about 3 months ago

AT&T Proposes Net Neutrality Compromise

aussersterne Same. I'm on 1Gbps Google fiber (243 comments)

and am soooo pleased to be rid of the other ISPs I've been stuck with in the past.

And of course *the moment* Google rolled out in this area, a bunch of other ISPs magically offered a competitive 1Gbps fiber plan as well.

Too late—you had me. And you pissed me off. And now I'm gone.

about 3 months ago

AT&T Proposes Net Neutrality Compromise

aussersterne This. (243 comments)

This is a pretty transparent proposal to immediately cap speeds, then approach platforms for extortion money based on user demand.

In short, it's exactly the same thing. The words have changed, but the idea about what to do with the cables is the same.

about 3 months ago

The Growing Illusion of Single Player Gaming

aussersterne Never been a fan of multiplayer. (292 comments)

Maybe I'm dating myself here, but multiplayer games are still newfangled and weird to me, and I don't know if that will ever change.

When I used to play games, I played to get away from social interaction and enjoy myself in isolation. It was a kind of recuperation. A world of gaming in which you have to face social interaction once again as part of gameplay was unattractive enough to me that I stopped playing games altogether. These days I mainly do crossword puzzles and read e-books for the respite that I used to get from gaming.

about 3 months ago

iPhone 6 Sales Crush Means Late-Night Waits For Some Early Adopters

aussersterne Re:Oh, but it does. You can't make a backup (222 comments)

You can't back up everything that's on the phone.

Your process sounds great to a technology-enabled person. But for mere humans?

They don't remember their Apple ID password.
They put in random answers to security questions for password recovery.
Their email address has changed, their computer has changed, etc.
They installed all that music, all those videos, and all those apps, like, a *year* ago or more. Who remembers how?

"Can't you just copy everything from my old phone over to my new phone?"

As you say, the process ends up being:

Initialize the phone as new, to their current computer.
Create a new Apple ID and sign them in.
Install and position all the apps one by one by looking at their old phone as you hold it.
Get ahold of all the music that they already bought in some other format so that they don't have to pay for it again.
Give them the bad news about what can't be tracked down/reinstalled (apps no longer in app store, music that can't be found elsewhere without re-buying, etc.)

I could have sworn that in a recent case we lost all of the SMS messages and she was upset about that, but maybe I'm remembering incorrectly. Still, the process is onerous.

It pisses people off—"You mean I can't just move all of *my* stuff from my old phone to my new phone? Why do they call it an *upgrade?*"

I'm not saying they're right. Sure, they should remember their passwords, take care of their online identities, etc.

But the fact is that you cannot simply do this:

1. Connect old iPhone to computer
2. Back up full contents
3. Connect new iPhone to computer
4. Restore full contents
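
For what it's worth, the open-source libimobiledevice project ships an `idevicebackup2` tool that attempts exactly this four-step flow without iTunes. A sketch of how the commands would be assembled (the UDIDs and backup directory are placeholders, and DRM-tied content may still not survive a restore):

```python
# Sketch: the four-step transfer above via libimobiledevice's
# idevicebackup2 instead of iTunes. This only constructs the commands;
# the UDIDs and directory below are placeholders, not real devices.
import subprocess

BACKUP_DIR = "/tmp/iphone-transfer"

def backup_cmd(old_udid):
    # Steps 1-2: connect the old iPhone and take a full backup.
    return ["idevicebackup2", "-u", old_udid, "backup", "--full", BACKUP_DIR]

def restore_cmd(new_udid):
    # Steps 3-4: connect the new iPhone and restore that backup.
    return ["idevicebackup2", "-u", new_udid, "restore",
            "--system", "--settings", BACKUP_DIR]

def run(cmd):
    # Actually invoking the tool requires libimobiledevice installed
    # and a paired, connected device.
    return subprocess.run(cmd, check=True)

if __name__ == "__main__":
    print(" ".join(backup_cmd("<old-phone-udid>")))
    print(" ".join(restore_cmd("<new-phone-udid>")))
```

It works on the raw backup domains rather than through iTunes's license bookkeeping, which is precisely why it sidesteps the "erase and start over" dead end, and also why purchased content may need re-downloading afterward.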

I've been on to Apple a couple of times with people standing next to me while I try to act as an intermediary, and the people on the other end of the line end up just throwing their hands up, apologizing, and saying they can't help.

To be fair, this isn't exactly easy on Android either. But it's slightly easier. And both platforms need to seriously work on it.

about 3 months ago

iPhone 6 Sales Crush Means Late-Night Waits For Some Early Adopters

aussersterne Oh, but it does. You can't make a backup (222 comments)

if the computer + iTunes is newer than the phone. Try this:

-> Plug a full, everyday-used iPhone that was backed up or set up on an old computer
-> Into a new computer where it has never been backed up before

What you will get is an option to erase the phone and start over. You will not get the option to back up the phone, and Apple says that's by design—the licensed content on the phone is tied to the iTunes installation where it was set up, and the license can't be associated with a new iTunes.

Problem is that people that ask me for help have almost invariably either bought a new computer or reinstalled Windows since the time they set up their phone. So there is no way to create a backup—when you plug the phone in, you only get the option to erase the phone and set it up new.

about 3 months ago

iPhone 6 Sales Crush Means Late-Night Waits For Some Early Adopters

aussersterne Can you explain how you migrate material over (222 comments)

seamlessly? I have family members asking me to help with their iPhones routinely, and this is always a nightmare.

Is it just a matter of your having one stable iTunes installation over the entire period? Because the problem that I run into over and over again is that iCloud is either partial in its backing up and/or doesn't have enough space and thus doesn't back everything up, and they have invariably got a computer that's newer than their iPhone. As a result, their iPhone has never been backed up to iTunes, and when they ask me to help with a transition, I can't help them—iTunes simply offers to erase the phone when you plug it in since the phone predates the iTunes installation.

So we end up having to do a phone side-by-side—check each item installed on the old phone, then install and position it again on the new phone, one by one. Takes hours, and some things (SMS messages) are just plain lost. I'd love to find a way to just migrate one iPhone to the next with a click, but so far I haven't found it—the only way to do this appears to be to have an iTunes installation that predates your original phone and to which the phone has been synchronized since it was new. Then you can restore the backup to the new phone. But if the iTunes installation is newer than the old phone, as far as I can tell users are SOL for easy transitions.

And most everyone I've helped to upgrade simply doesn't have this. Most of them don't even use iTunes at all.

about 3 months ago

Ask Slashdot: What Smartwatch Apps Could You See Yourself Using?

aussersterne It's not just apps, but speed and UX. (471 comments)

This is, I think, the thing that so many people miss about the Apple Watch announcement. The problem with existing smart watches hasn't been that the features aren't useful; it's that the promised features simply don't work. I owned two different smart watches and had the same experience:

- Extremely limited app selection
- Very, very slow and oversimple apps that did exist
- With input that was just plain cumbersome and unreliable
- And bluetooth connectivity that had to be constantly restarted/reconnected (like, every time you tried to use it, bluetooth was down)

As I've said in previous posts, I'm one of those people who still wears a watch every single day, so I could be an obvious target for a smart watch, at least more so than people who don't wear a watch at all and haven't done so in years, if ever.

But for a smart watch to make sense, it can't be a worse experience than pulling out the phone. Watches will always lose on the screen size front, so it's got to be compelling in other areas. The phone experience does have some problems (you have to pull it out, it's risky to pull out and manipulate in some contexts—walking in the city, for example, where a drop can kill it and jostles from pedestrians can come easily, it's bulky and conspicuous, you have to put it back, and so on), so it's not inconceivable that a smart watch could make sense.

But smart watches thus far have been lessons in user friction—you had to really, really, really want to do a given task *on your smart watch*. One that I tried for a few days (the Sony watch) only recognized about 10% of the taps that you made (Want to tap that button once? Then tap manically on the screen over the button 15 times in rapid succession and hope one of them takes.) and was so slow and oversimple (presumably due to lower processing power) that even aside from UI horribleness, it just plain didn't do anything very well in practical terms.

If the Apple Watch has:

- Processing power analogous to that of smartphones
- A high-resolution display
- Input surfaces and controls that are as reliable as those of smartphones
- Battery life long enough to get through a day with certainty
- Reasonable ruggedness
- Stable bluetooth connectivity without hassles

Then it could well be a winner, not because it claims to do anything new, but because it actually manages to do what smart watches claim to do. So far, my experience with smart watches has been that they claim a lot, then do absolutely none of it in practice. It's not that the feature list sucks; it's that the features themselves haven't actually been implemented in such a way that you can use them without sitting down for ten minutes to have a "smart watch session" and eke out a tap or two.

about 3 months ago

Submissions


Personal Drones Coming to Dominate the Hobbyist Radio Control Market

aussersterne writes  |  about two weeks ago

aussersterne (212916) writes "Drones continue to be in the news, with more and more "personal drone" incidents making headlines. It's easy to think of such stories as aberrations, but a well-known market research company has studied the radio control hobbyist market on eBay and found that sales of radio control helicopters and, more importantly, "quadcopters" (which are most often understood to be the "personal drone" category of items) are now—when taken together—the dominant form of radio control items sold on eBay. Radio control quadcopters in particular are growing much more quickly than the rest. Are we poised to see personal drones become much bigger influences on news and popular culture? Is it time for regulation?"

Console gaming is dead? How about $14,600 for a launch-day PS4

aussersterne writes  |  about a year ago

aussersterne (212916) writes "Seven years after the release of the PS3, Sony released the PS4 on Friday to North American audiences. Research group Terapeak, who have direct access to eBay data, finds that despite claims to the contrary, console gaming still opens wallets. Millions of dollars in PS4 consoles were sold on eBay before the launch even occurred, with prices for single consoles reaching as high as $14,600 on Friday. Would you be willing to pay this much to get a PS4 before Christmas? Or are you more likely to have been the seller, using your pre-orders to turn a tidy profit?"

Is the Desktop PC really dead?

aussersterne writes  |  about a year and a half ago

aussersterne (212916) writes "After IDC and Gartner released numbers on declines in PC sales, the technology press descended into a navel-gazing orgy of woe, declaring the PC market to be collapsing, in dire straits, all but ended. But market research company Terapeak uses eBay data to show that desktop PC sales on eBay remain only slightly off two-year highs—with a catch: most of the sales are in used and refurbished PCs of older vintages, at price points well below what new PCs cost. Perhaps the "PCs are good enough" arguments have some substance behind them. Are consumers just satisfied with what they can already find on the used market for less than $200, Windows license included?"

Does the iPhone's Closed Nature Foster Innovation?

aussersterne writes  |  more than 4 years ago

aussersterne (212916) writes "The heated debate over Apple's "walled garden" has raged for years now, only growing more intense with the rise of iPhone apps and the recent release of the iPad. Contrary to conventional wisdom, however, some are suggesting that Apple's particular approach to closedness has actually been a boon for innovation and egalitarianism in ways that few had previously thought possible. In a recent NYT article, Steven Johnson says, "I've long considered myself a believer in this gospel and have probably written a hundred pages of book chapters, essays and blog posts spreading the word. Believing in open platforms is not simple techno-utopianism. Open platforms come with undeniable costs. The Web is rife with pornography and vitriol for the very same reasons it's so consistently ingenious. It's not that the Web is perfect, by any means, but as an engine of innovation and democratization, its supremacy has been undeniable. Over the last two years, however, that story has grown far more complicated, thanks to the runaway success of the iPhone (and now iPad) developers platform — known as the App Store to consumers." Can a walled garden, as Johnson suggests, actually give rise to a "rainforest" if executed in Apple-like ways?"

Journals

aussersterne has no journal entries.
