World's Youngest Microsoft Certificated Professional Is Five Years Old
You woke up and discovered you had installed Windows 2000?
Scary what one can accomplish while on Ambien and not remember a thing the next morning.
Feces-Filled Capsules Treat Bacterial Infection
Indeed, not to be messed around with. I was lucky a few years ago in that I developed C-diff but the standard antibiotic took care of it and it never recurred. I have read horror stories of people having chronic C-diff that goes away with treatment but just comes back. Didn't know it had turned lethal, but it is damned unpleasant. Not something I would want to live with on a chronic basis.
Torvalds: No Opinion On Systemd
> You mean.... gasp! ... PostgreSQL isn't a shell script pipelining a bunch of sed/awk/grep/mv/cp commands?
In terms of the larger systems that it is integrated with, that is EXACTLY what it is. It is a highly specialized application that does one thing well and leaves the scope creep to other programs that consume its services.
It may even be broken down into a lot of highly specialized background processes like Oracle.
Well, ok, a database server isn't a great example. Because a database server is essentially an API exposed for the purpose of being consumed by other applications. This is nothing specific to Unix, since database servers work more or less the same way on other OS's.
But my point was that often-touted killer design feature of Unix (take a bunch of little specialized programs, add pipes, mix well, and bake in a 350F oven) isn't really how complex programs are designed. On that point Torvalds is spot-on.
Torvalds: No Opinion On Systemd
There's still value in understanding the traditional UNIX "do one thing and do it well" model where many workflows can be done as a pipeline of simple tools each adding their own value, but let's face it, it's not how complex systems really work, and it's not how major applications have been working or been designed for a long time. It's a useful simplification, and it's still true at some level, but I think it's also clear that it doesn't really describe most of reality.
You mean.... gasp! ... PostgreSQL isn't a shell script pipelining a bunch of sed/awk/grep/mv/cp commands? Minecraft isn't some big long awk script that calls perl when it runs out of gas? I never woulda guessed!
Seriously though, and without belittling the value of the bunch 'o pipelined commands (especially for sysadmins), it's nice to hear someone clearly and concisely articulate this rather obvious reality.
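For contrast, the pipeline idiom itself really is easy to sketch. Here's a toy word-frequency counter written as Python generator stages piped into each other, mimicking something like `tr | sort | uniq -c | sort -rn` (the stage names and sample text are mine, purely illustrative):

```python
# Toy illustration of the Unix "small tools + pipes" idiom: each stage
# does one narrow job and streams its output to the next stage.

import re
from collections import Counter

def words(lines):
    # stage 1: split each line into lowercase words (like `tr` + `grep -o`)
    for line in lines:
        for w in re.findall(r"[a-z']+", line.lower()):
            yield w

def count(stream):
    # stage 2: tally occurrences (like `sort | uniq -c`)
    return Counter(stream)

def top(counts, n):
    # stage 3: highest counts first (like `sort -rn | head`)
    return counts.most_common(n)

text = ["the quick brown fox", "the lazy dog and the fox"]
print(top(count(words(text)), 2))   # → [('the', 3), ('fox', 2)]
```

The composition reads inside-out rather than left-to-right, but the design property is the same one Torvalds is talking about: each stage is trivially testable on its own, and none of them knows anything about the others.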
Ask Slashdot: What Are the Strangest Features of Various Programming Languages?
Hey, just because you don't know the language doesn't mean it's necessarily wrong.
And just because you do know the language doesn't mean it's necessarily right!
Elementary OS "Freya" Beta Released
So you expect to be able to use a general purpose system that does accounting, astronomy, genomics, etc etc on everything from a modern mainframe to a pocket watch with NO learning whatsoever? Were you born knowing how to use Windows 7 or did you learn it?
Sigh... Read the GP again. He uses Linux as a primary OS for home and work. Learning curve is not the issue here.
That's what he was saying. It's not hard at all but we can't learn it for you.
In other words, "it's your fault for not learning it, not our fault for not making the user experience on par with commercial alternatives".
the simplest and most obvious 'user interfaces' of any tool we have today and yet I see people using them poorly all the time.
In other words, "it's your fault, you must be using it poorly". Or, "you're so incompetent you can't even use a hammer or a screwdriver".
I know I ramped up the flammage factor in my paraphrasing, but seriously, that's the type of worldview that has Linux desktop going nowhere fast.
Elementary OS "Freya" Beta Released
To be fair, you phrased it nicely. But it's still the same old mindset underneath that prevents Linux desktop from getting any traction.
No, it's really not. Familiarity is amazingly important. The thing is I use Linux more than anything else. If I go on a Windows or OSX machine, I'm presented with all sorts of weirdnesses and illogical things and things which plain old get in the way.
It's not a question of n00bishness, but of not working on the systems I work on day in, day out.
Except the GP explained that he uses Linux as his primary OS at home and at work. Your response was to question whether he was familiar enough with it. Well yeah, it's safe to say that he's familiar with it.
You can make all of those disappear by making it *identical* to your OS of choice. That won't necessarily make it better, just more familiar.
If the cost of Linux gaining traction is that it has to be just like Windows or OSX, then there doesn't seem to be a whole lot of point.
Making it familiar and making it complete are different. Don't think that the GP (nor I) were arguing that Windows/OSX are perfect and should be verbatim copied.
Elementary OS "Freya" Beta Released
There are so many little things daily that cause the OS to be hard to use for regular people. And yes, that includes Ubuntu.
Such as? Are you sure it's not a question of familiarity, where someone who has used almost nothing but Linux might notice similar irritations about other OSs?
In other words: "Are you a complete noob and therefore it's your fault?" "Are you sure you're smart enough?"
To be fair, you phrased it nicely. But it's still the same old mindset underneath that prevents Linux desktop from getting any traction. As soon as the Linux community takes on a default mindset that any negative user experience is the desktop's fault and not the user's fault, things might have a prayer of getting better. Sure, you're never going to make an OS with zero learning curve, but apologizing for the learning curve rather than trying to lessen it doesn't help anybody.
Supreme Court Rules Cell Phones Can't Be Searched Without a Warrant
....or in other words: Freedom isn't Free.
Become a Linux Kernel Hacker and Write Your Own Module
Try this with windows and there's a good chance you'll find some incomplete example code from three API revisions ago that won't even compile with the latest libraries (BTDT)
Uhhhhh.... for the most part, the kernel API in Windows has been remarkably stable. I have an *extremely* non-trivial Windows driver that works from NT all the way through Win 8. The only major disruption in the 10+ years between NT4 and Vista was the TDI client debacle where they deprecated TDI and there were some workarounds that needed to be implemented to run on the new kernel.
That. Was. All.
I'm not a Linux kernel dev (though have lots of user-mode Linux/Unix experience), but my understanding of that world is "we'll change anything and everything if and whenever we feel like it, and it's up to the rest of the world to keep up with those changes". So your example, ironically, would apply much more to a Linux driver sample than it would a Windows driver sample.
One-a-Day-Compiles: Good Enough For Government Work In 1983
Me too. But that was the 1970s, not the 80s. Using punch cards in 1983 was idiotic.
Idiotic yes, unheard of, no. When I started as a freshman undergrad I was enrolled in a music school that was part of a state liberal arts college. By that time I had done a ton of programming on my TRS-80 in both BASIC and Z-80 assembler on my own. I considered taking some comp sci classes, but they forced everyone to take a course that involved programming on punched cards on the mainframe.
Needless to say, my reaction was "screw that!", and I went on with my studies.
By the end of sophomore year it was clear that (a) I wasn't going to make a decent living as a musician, and (b) I knew how to work a computer, so I might as well get a piece of paper that said so. By that time they had dropped the punched-card stupidity, and I went on to earn a double major in CS and Music.
FCC Proposes $48,000 Fine To Man Jamming Cellphones On Florida Interstate
What about giving fellatio?
Well, if your partner is sufficiently well-endowed, then it should be no more distracting than taking the occasional sip from a Slurpie.
It's Not Memory Loss - Older Minds May Just Be Fuller of Information
"Ah, the power of the uncluttered mind!"
Embedded Developers Prefer Linux, Love Android
For some definitions of OS I suppose.
I can also make a working web server in an afternoon, if all it does is listen to port 80 and translate GET commands into the file system, and error out on everything else.
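That afternoon project really is about this small. A minimal sketch (port, document root, and the "error out on everything else" behavior are as described above; everything else is my own filler, and a real server would at minimum have to guard against `..` path traversal):

```python
# Minimal "afternoon web server": parse the request line, serve GETs
# from a document root, error out on everything else.

import os
import socket

DOCROOT = "."   # serve files relative to the current directory
PORT = 8080     # port 80 needs root privileges; 8080 is the usual stand-in

def handle(request: bytes) -> bytes:
    """Turn one raw HTTP request into one raw HTTP response."""
    try:
        method, path, _version = request.split(b"\r\n", 1)[0].split(b" ")
    except ValueError:
        return b"HTTP/1.0 400 Bad Request\r\n\r\n"
    if method != b"GET":
        return b"HTTP/1.0 501 Not Implemented\r\n\r\n"
    # naive path mapping; a real server must reject ".." traversal
    fname = os.path.join(DOCROOT, path.decode().lstrip("/") or "index.html")
    try:
        with open(fname, "rb") as f:
            body = f.read()
    except OSError:
        return b"HTTP/1.0 404 Not Found\r\n\r\n"
    header = b"HTTP/1.0 200 OK\r\nContent-Length: %d\r\n\r\n" % len(body)
    return header + body

def serve():
    # one connection at a time, one request per connection -- blocks forever
    with socket.socket() as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        s.bind(("", PORT))
        s.listen(5)
        while True:
            conn, _addr = s.accept()
            with conn:
                conn.sendall(handle(conn.recv(65536)))

# serve()  # uncomment to run
```

Which is exactly the point being made: "web server" spans everything from this to nginx, the same way "OS" spans everything from a scheduler loop to a full desktop distribution.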
Ask Slashdot: Linux Mountable Storage Pool For All the Cloud Systems?
Uh, methinks you haven't really used tool chains designed to maximize the value of RAW files. The camera's built-in processor does way the hell more stuff than just compress raw pixels into JPEG. White balance is a huge one, along with level curves, sharpening, and a bunch of other stuff. Much of it either one-way or very hard to unwind. And as others have pointed out, most RAW *is* compressed, just lossless.
So yeah, you can fix white balance in a JPEG, but it's way simpler and more accurate to set the white balance if the pixels haven't already been misbalanced in the first place. Ditto for exposure. Most tools that deal with processed JPEGs don't even have an exposure adjustment; quite often the same tool that handles both file types will have an exposure slider if it's RAW but not if it's JPEG. Sure, you can futz with brightness, contrast, levels, gamma, etc. to correct an under-exposed shot. But with RAW, sliding over to +2/3 for a slight underexposure is one click and you're done.
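To put a number on that +2/3 click: exposure compensation is just a gain of 2^EV applied in linear light, which is what RAW sensor data already is. JPEG pixels are gamma-encoded, so the same correction needs a decode/re-encode round trip, and shadow detail already quantized to 8 bits can't be recovered that way. A minimal sketch (the gamma value and sample numbers are illustrative, not from any particular camera):

```python
# Exposure compensation: on linear RAW data, +EV stops is a pure multiply
# by 2**EV. On gamma-encoded JPEG data, the gain must be applied after
# decoding to linear light and then re-encoded. Values normalized to 0..1.

def expose_raw(linear_value, ev):
    # RAW: linear light, so exposure compensation is one multiply
    return min(linear_value * 2 ** ev, 1.0)   # clip at sensor white

def expose_jpeg(encoded_value, ev, gamma=2.2):
    # JPEG: decode gamma, apply the gain in linear space, re-encode
    linear = encoded_value ** gamma
    return min(linear * 2 ** ev, 1.0) ** (1 / gamma)

print(expose_raw(0.30, 2 / 3))    # ≈ 0.476
print(expose_jpeg(0.30, 2 / 3))   # ≈ 0.370
```

The arithmetic round-trips cleanly here because floats are exact enough; on a real 8-bit JPEG, the low end has already been crushed into a handful of code values before you ever get to apply the gain, which is why the RAW version of the same slider produces cleaner shadows.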
As a guy who has deep-drilled many a software engineering discipline in his 25 year career, and shot tens of thousands of frames as an amateur enthusiast, you can pull me out of the "photographers who don't understand the tools" pool thank you very much.
I have gone back and forth between JPEG and RAW over the years. There have been periods where, with two small children, I simply didn't have time to invest in RAW processing, and I was pleased with the neutrality of the DSLR's processing anyway. Other times I knew I was shooting in challenging conditions and set the camera to RAW+JPEG as a safety net; I've rescued many a shot that way. Recently I've been putting mileage on Lightroom and can extract an immense improvement out of the RAWs that would take me 4x the time to match if they were JPEGs, and probably not end up with the same result. I now have more time to invest, and the payoff is real and significant.
AT&T Microcell Disassembly; Security Flaws Exposed
You obviously don't have one of these. There is in fact a GPS inside, and they specifically instruct you to put it near a window if the GPS LED doesn't go solid. There have been various complaints on other boards about this fact, with tips on where to find GPS antennas and connectors (yes, there is an antenna jack on the back of the unit) so that the MicroCell can be used in a more convenient place while still getting a GPS signal.
Insiders Call HP's WebOS Software Fatally Flawed
Another problem was the difficulty in finding programmers who had a keen understanding of WebKit as Apple and Google snatched up most of the top talent
But wait, I thought that engineers were just pluggable resources...
Reviews of Kindle Fire Are a Mixed Bag
The comparisons to the iPad are ridiculous. I do expect the Nook Tablet to be a better device. The Nook Color has the least reflective LCD display I have ever seen on a mobile device, and the only LCD display I consider good enough to read on.
However the iPad is a horrible reading device. Anyone who thinks an iPad is a reading device doesn't read much.
Yeah, and despite all that, my Kindle library is pretty darned large thank-you-very-much. 90% of it read on an iPad, the other 10% on my 2nd-gen Kindle, which was immediately given to the in-laws once I got the iPad. For me, reading on the iPad is a way better experience than on the Kindle.
Ask Slashdot: Touchscreen Device For the Elderly?
Exactly. Or the shorter version: Get an iPad. Duh.
And as others have pointed out, skipping out on Internet is silly---she will get a huge benefit by being connected. eBooks, video calls, multi-player games, news, etc. If the nursing home has Wifi, then there ya go. If not, get the 3G model and have the family chip in the measly $15/mo to keep it on a basic data plan.
Siri Envy? Iris Brings Some Voice-Assistant Features to Android
Siri is ultimately at a disadvantage for taking that route, because it has to have much better comprehension of the spoken words; it can't count on matching most of the command before worrying about what to do with the input.
And yet it is surprisingly good at figuring out the command and parameters. You seem to be saying that Siri is worse because it doesn't force the user to speak as if it were typing commands into /bin/sh. Which is precisely why it is better than traditional voice recognition.
I've used Google Voice, and I was a very frequent user of the much-more-limited iOS 4 voice commands. Both worked well for their intended use. But Siri is a whole 'nother thing.