Ask Slashdot: How Often Should You Change Jobs?
I have benefited from changing jobs every few years, but I never switch jobs after only a year. I'll stay at pretty much any job for a minimum of a year, and I try to stay everywhere 2 years. My career so far has been 1 year, 2 years, 3 years, 6 years, 3.5 years, and 2.5 years. I felt like I stayed at the 6-year job too long. It wouldn't have been too long if I had grown in the right way while I was there, but I didn't. I kind of plateaued and stayed because it was comfortable.
And that is the reason why I think switching companies every few years is essential, especially in IT. There are just so many things you will never learn if you don't work at different companies, even if the things you learn are simply how jacked up and ridiculous some companies are. I've been at startups, Fortune 100 companies, and companies in between. I highly recommend giving each company your full commitment for at least a couple of years, but always be willing to move on when you're no longer growing your skills or doing interesting things.
America 'Has Become a War Zone'
See, this is the attitude. He says his job is to make sure his employees go home every day. He's wrong. While nobody should be reckless with the lives of law enforcement, and it is a great tragedy when the lives of law enforcers are lost in the line of duty, it is their job to protect and to serve the public. It is their job, from time to time, to be injured and/or killed in the act of protecting and serving the community. That is the job. It is voluntary. It is not for the faint of heart, or for the cowardly. It is not, in short, for me.
When the number one priority of law enforcement becomes protecting the life and safety of officers, above all else, then the public suffers. And the lives and safety of citizens have been sacrificed on numerous occasions in the name of protecting law enforcement.
I am not anti-law-enforcement. But I am deeply concerned about their low tolerance for risk.
Ask Slashdot: Minimum Programming Competence In Order To Get a Job?
I manage development teams. First of all, like it or not, .NET is huge and seems like it will continue to be huge. I don't mind C# and I think Visual Studio is one hell of an IDE. Myself, I prefer to code in Perl or Ruby, and to use vim to code it. But C# and Visual Studio are tolerable. So if you had to choose one to really focus on, I think from a marketability standpoint, that's a good one. It's also a great idea, though, to have a true scripting language in your back pocket (in addition to PowerShell, which is, yes, really cool).
OK, back to the other question: Do you have to be a 7 or higher? Heck no. I find that there's very little "computer science" in what we do in a lot of professional software development. It's a lot of CRUD and knowing how to put various things together (REST plus AJAX plus responsive plus ORM plus third party integration plus memcached plus S3, etc, etc). You don't have to be a brilliant programmer for that. You have to be able to learn quickly, retain knowledge (at least enough to remember what you Googled to do it again), be a good communicator, and be a really good team player.
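A minimal sketch of the kind of "glue" work described above: a CRUD read with a cache in front of the data store. The names (`get_user`, `db`, `cache`) are hypothetical stand-ins; real code would wire up an ORM, memcached, and S3 clients, but the shape of the work is the same.

```python
db = {1: {"name": "alice"}}  # stands in for the ORM/database layer
cache = {}                   # stands in for memcached

def get_user(user_id):
    """Read-through cache: check the cache first, fall back to the db."""
    if user_id in cache:
        return cache[user_id]
    user = db.get(user_id)
    if user is not None:
        cache[user_id] = user  # populate the cache for next time
    return user
```

Nothing here requires deep computer science; it requires knowing the pattern and wiring the pieces together correctly.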
In summary, I would gladly take a 4 with potential and a great attitude, over a 9 who is a dick. I would do that over and over again. You just need to make sure that you have a few 7s around, but not too many.
iPad Fever Is Officially Cooling
Take this for what it's worth, but I find it relevant. I, too, laughed at Microsoft's Surface tablet and its dismal sales figures. I was the happy owner of a personal iPad and a work iPad, and I figured that was what the tablet experience was, and what it needed to be.
I was wrong. At least for me, I was wrong.
I have the strange situation of working at a Microsoft shop, as a manager of a development team (though I am an open source guy, by background). The local Microsoft sales office invited several of us to an Azure educational session. Part sales, part hands-on training. I am a big user of Amazon Web Services, and my team uses it as well. So I figured instead of just being an Azure hater, I would come along and do the hands-on walk-through, because it's some dedicated time for some busy people to really dive in and see what all the fuss is about.
But this comment is not about Azure. Azure was fine. I'm still an AWS fan, and Azure was just similar enough, and just different enough, that I'm not chomping at the bit to start using lots of Azure.
But here's the thing: They had a door prize. And I won it. It was a Surface with Windows RT. And I love it.
Would I love it as much if it had not been free? We will never know. But all I can tell you is my story.
One might argue that it's stupid for this thing to have a "Desktop mode." But it also has a USB port (shocker!) and supports Bluetooth. I already have a Bluetooth keyboard for my iPad (which I never use), and a USB mouse that I use for my laptop. I put my Surface on its kickstand, pair the keyboard, and plug in the mouse dongle, and the dang thing is transformed. It's a little laptop. It has Office installed. It has the full Windows version of IE. Yes, I hate IE, and it sucks that there's no Chrome or Firefox for this platform. But the point is that, in a jam, this makes a damn fine little PC.
And then it's NOT a PC. You pick it back up and it's a tablet. And a NICE tablet. The apps are nice. The Reddit app is my favorite Reddit experience, in fact. Netflix, Facebook, etc.
Oh, and the best part... the Surface team designed it to be used with a mouse or a finger. It behaves differently in each case. And it behaves generally the way you expect (want) it to.
So, call me a fanboy. I did not expect to really like it. But I do. I really like it. Anybody slamming Windows 8, and particularly Windows 8 on a tablet, isn't really giving it a chance. It's really nice.
Functional 3D-Printed Tape Measure
I'm a big fan of 3D printing, I am. And at the beginning of this video, I just thought this was the coolest thing. Well, except for a couple of things.
Tape measures are widely available and inexpensive.
This one is REALLY short (just over 4 feet), yet it's comparable in size to a standard 25-foot tape measure.
Worst of all, it's not accurate. It's off by a 16th of an inch at the maximum length, and it would only get more and more inaccurate as the length increased.
Other than that, it's perfect :) Oh, it's definitely impressive, and I still find it hard to believe that these sophisticated contraptions are printed already assembled. It's amazing.
Ask Slashdot: Modern Web Development Applied Science Associates Degree?
Computer Science has its place, certainly, but it's not in every IT shop in America. I've been giving this a lot of thought lately: How do you take those unemployed and underemployed people whose jobs have basically disappeared and are never coming back... and intersect SOME of those people (not all of them will be able to do it) with the enormous shortage of talented and capable IT people?
I've come to almost accept, over the last couple of years, that there's such an insatiable demand for IT, and such a shortage of competent IT people, that it's just a reality that we're going to have lots and lots of crappy people in IT, and there's nothing that can be done about it.
But I'm having difficulty completely accepting that. Because I know that the skills that you need to be good at solving technology problems are not extraordinary. I just barely started college, and then quit to join the Air Force. Five years later, I got into the web business (in 1996) and I've had a great career for 18 years. I recently decided to finish my degree, but that's a different story.
The point is: I'm not a computer scientist. There have been a few times in my career when I would have benefited from a CS degree, but not many. Mostly, what I have needed is intelligence, verbal and written communication skills, the ability to quickly learn new things, a passionate interest in technology, the three Larry Wall traits (laziness, impatience and hubris), and an understanding of how users think and act. Editorial skill has not hurt me, and neither has graphic design skill.
While I would be really interested in helping to build an educational program, one problem I have is that I'm self-taught, and therefore don't really know how you're supposed to teach this stuff. But I would love to be part of a workshop where industry folks come together for a week and brainstorm on this topic, or something.
My big sticking point is this: I honestly believe that the one non-negotiable requirement for being a good technologist is intelligence. And this seems to be controversial, because it makes it sound like I'm calling other people stupid. And, well, I am. I really wrestle with this. I wonder how good a web developer you can be if you're not all that smart.
Oracle Attacks Open Source; Says Community-Developed Code Is Inferior
I've heard about that awful EHR (Electronic Health Record) integration effort between the Veterans Administration (VA) and the Department of Defense (DoD) for years. It's a failure of a lot of things, but if open source is even on the list of those things, it's low on the list. At the top of the list are dotted lines and bureaucracy, of course. Heck, IT projects often go off the rails, particularly big expensive ones. Let alone one done for the DoD. And of course, it's not just the DoD; it's also an inter-department collaboration. Doomed to failure, unless it's managed excellently.
It appears that one big reason that this integration project is so hard is because the VA can't compete when it comes to process and bureaucracy. They don't have nearly as large a budget. This quote is telling:
"The iEHR demise was expected by all, accordingly," one VA source said. DOD officials "outspend, outtalk and outlast us at every engagement. We try to emulate much of their process-based decision-making as if we could afford to. We can't. The overhead is crippling, and we are not funded equivalently."
It pains me to see any IT project that gets out of control and ultimately fails. I hate it even worse when it's the government. As a veteran, I especially hate to see this one. And as an open source user, contributor and advocate, Oracle blaming that massive failure on open source adds insult to injury.
Salesforce.com To Cut 200 Jobs Despite Its Expectations To Make More Money
Not only is this reduction in redundant staff probably appropriate, but this is one of the rare situations in which "synergy" is used in a non-lame, non-stupid way.
When two companies merge, the hope is that the combined company will be more effective than the two were as separate companies. Synergy is a reasonable word to describe that.
But unsurprisingly, that synergy does not always happen. You're combining two companies, with two different cultures, perhaps incompatible systems, perhaps conflicting ideals. And you're certainly going to have some redundancy. For example, in each company, you probably have one person ultimately in charge of technology. Now you have two, and have to work that out. You also have one person ultimately in charge of smaller things: The phone system, for example. Now you have two. These things have to be worked out.
A successful merger in which nobody gets laid off would be very surprising.
How One Programmer Is Coding Faster By Voice Than Keyboard
I've been using vi/vim for almost 20 years. I hate emacs. It's a perfectly fine piece of software, it's just not for me.
But I'll come to the defense of emacs on this one. Let's not blame his editing software for his RSI.
What Keeps You On (or Off) Windows in 2013?
Well, as far as Linux on the Desktop goes -- although I continue to be impressed with what Ubuntu and others have done in this area, Linux on the Desktop for the masses was pretty much killed by Apple's brilliant move to OS X, an operating system that eventually drew an absolutely massive number of the people who would otherwise have been part of that migration to desktop Linux. As much as Apple has contributed to open source from that time on, I can't escape the fact that Linux on the Desktop adoption definitely suffered. I know I never seriously considered Linux on the Desktop again.
However... a couple of years ago, after having been a Mac desktop user for about a decade, I found myself working at a Microsoft shop, and they handed me an HP with Windows 7 on it. I was really quite worried about how well I would do, even though I had, of course, run Windows as a desktop prior to Mac's move to OS X (Windows 95, Windows 2000, etc).
With the exception of certain things that Windows 7 does very poorly (WebDAV client? Hello?!?), I've been overall fairly happy with both the OS and -- even more so, I think -- Office 2010.
But the best improvement that has come to the Windows platform in recent years is the continuous improvement of Cygwin -- most notably, a decent terminal in which to run it: mintty. If not for mintty, I would probably have struggled much more with Cygwin, and therefore with Windows. The Windows/Cygwin power punch may be the most productive setup available (aside from those environments where no MS interoperability is necessary at all, in which case I would still want a Mac).
How To Talk Like a CIO
As a fairly experienced technologist with increasing responsibility over the last several years, and who has had a certain amount of success and gathered some decent ideas along the way, I do actually think of myself as either a future CTO or future business owner.
But I almost NEVER think of myself as a future CIO. CTO definitely. But you can *have* CIO.
Ask Slashdot: Are There Any Good Reasons For DRM?
- I've been on slashdot since almost the beginning
- I'm a recreational musician who fantasizes about recording and distributing music
- I'm a web developer who has implemented DRM to protect the intellectual property of my employer
I decided to post here, so that I could say that I don't think there is any good use of DRM. I have heard lots of stories of people who distributed their own non-DRM'd music online and who do very well, for example. I think the good stuff will always pay off. People will recognize the value and the artist will be compensated.
I also hate the properties of DRM that inconvenience the consumer. Having to repurchase your content, for example. But before I started typing this comment, I thought of one use of DRM that could be considered legitimate. A streaming subscription such as Netflix, or computer training videos and stuff like that, is something that works very well, is transparent to the user, and does not need to stand the test of time. As long as your subscription is active, you can access your content. You have no need to access the content after the subscription is over.
I've also taken advantage of software subscriptions lately. For example, I need Photoshop sometimes, but not all the time. Instead of paying a ridiculous amount of money to buy Photoshop, I can pay for a month of Photoshop, which gets me through whatever project I'm working on. This is a form of DRM, and without it, Adobe would not offer the product the way I want to consume it. The same with Netflix. I love it, and without that protection, they could not offer it.
Yes. Gimp. I know. Sorry, I like Photoshop.
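The subscription model described above boils down to one simple check: content is available only while the subscription is active, and nothing needs to outlive it. A minimal sketch (the function name and arguments are hypothetical, just to illustrate the idea):

```python
import time

def can_stream(subscription_expires_at, now=None):
    """Return True while the subscription is active.

    subscription_expires_at: Unix timestamp when access lapses.
    now: injectable clock for testing; defaults to the real time.
    """
    now = time.time() if now is None else now
    return now < subscription_expires_at
```

That's the whole legitimacy argument in code: the gate is transparent while you're paying, and there's no content you expected to keep that gets taken away.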
Discovery Channel Telescope Snaps Inaugural Pictures
The word "Inaugural" caused me to think the telescope had taken pictures of the Presidential Inauguration.
I eventually read enough to realize this is a newer telescope, which would have made photos of any presidential inauguration unlikely. But since we're talking about powerful telescopes, I guess even the time travel element wouldn't be out of the question.
Craigslist Donates $100,000 To the Perl Foundation
So how much should they have given, to make it enough to satisfy you and all the others who think that as long as they're paying less than they would have paid for the commercial alternative, they're getting off cheap?
8 Grams of Thorium Could Replace Gasoline In Cars
I read the headline as "8 Grams of Thermite..." :) Quite a visual.
Microsoft Suggests Heating Homes With "Data Furnaces"
It is not wise to create, for something that is desirable, a dependency which is undesirable.
One example is using tobacco taxes to pay for children's healthcare. It sounds good politically, but then you're dependent on smoking and it's a conflict of interest to get people to stop smoking.
Another example is when law enforcement agencies find they are dependent on fines from speeding, or assets confiscated from drug dealers. If people stop speeding, or drugs stop coming through the area, which is what they say they want, they'll have a budget crisis. So there's a conflict of interest.
The example at hand -- heating living space with excess heat from data centers -- is not as controversial. You could argue that there's no particular need to make computers run cooler. But there certainly has been, and continues to be, a lot of research in that area. The potential conflict is enough reason to fall back on what we have learned -- or in some cases, not yet learned -- from other conflicted dependencies.
Mac OS X Lion Has a Browser-Only Mode
This reminds me of an idea that I had the last time I was installing OS X, though it would have appeal for all operating systems. Wouldn't it be cool if the OS installer also fired up the network and gave you a web browser, so you could surf while waiting for all the files to copy and so forth? Maybe Lion will also provide this feature.
Google Wave Out of Beta
I think I heard this analogy as "a station wagon full of backup tapes" :)
When Developers Work Late, Should the Manager Stay?
I agree that all-nighters are almost never productive, but this question was about staying late, not about working all-nighters.
I'd like to believe that all work could get done within business hours and on my team it usually does. But there will always be times that the team needs to stay late, because things always come up.
When Developers Work Late, Should the Manager Stay?
I manage developers and they can count on me to be here if they're here (and when they're not). But I'm also *not* a useless lackey. I'm a developer myself and I'm here because I add something to the process. In addition to going to get the food (which I always do), I can actually participate in the process of making decisions and solving problems.
In my opinion, if you can't do that, you shouldn't be in the position. And you certainly shouldn't be looking over anybody's shoulder if you're not needed. Give them the space. Surely you have some of your own work you can do while you wait.
But yes, be there -- unless you can't be there without getting in the way, in which case you should leave.