
Comments

Detroit: America's Next Tech Boomtown

ErichTheRed Re:It could happen (284 comments)

I've had different experiences. I worked at 2 places that threatened to relocate to the South, one to Atlanta, and one to Orlando. The first one was just a corporate fiat, as in "those NY guys are too expensive, close the office and move it all down here." The second was an active IDA poach by Florida. Both wound up not working...the first was because, as you said, the company would lose too many of their knowledge workers and they weren't confident they could find new ones. When they started bleeding people, they realized they might want to rethink it. The second was basically just called off because the company got a sweetheart deal from New York.

My experience has been that companies will try to move their less skilled work or work that is considered non-core (like IT) to cheaper locations first. If the target location gives them a good enough deal, they'll just move the whole company. This is probably just short-sighted MBA stuff...executives are largely shielded from whatever environment the lower infrastructure investment produces. Their kids will go to private schools, they'll live in the best areas, etc. so they don't see any of the problems and they also see a quick win. Even if you get to keep your salary and relocate, you'll either be up for a layoff right away or never have your salary increased by much again, until your compensation matches the local market. So, you have to take that into consideration; either you love the area you're moving to and make a decision to stay, or you'll have to pull up stakes again and move back.

When even the real estate agents running a relocation tour mention that your kids will need to go to private school to get an equivalent education to where you're coming from, you know you're going to be in for an interesting experience.

9 hours ago

Ask Slashdot: What Tech Products Were Built To Last?

ErichTheRed IBM Model M Keyboard (513 comments)

Typing this on a Unicomp model based on the original design. Awesome keyboard, but it lacks the heavy steel backplate of the original.

Please buy a keyboard from this company so they keep making them. :-) http://www.pckeyboard.com/

Generally, any computer equipment from before the mid-90s was made quite well, simply because it was so expensive at the time. It also tended to be heavily over-engineered. Some Compaq ProLiants from that era weigh 100 pounds because they're just solid metal all the way through.

10 hours ago

Detroit: America's Next Tech Boomtown

ErichTheRed It could happen (284 comments)

I'm a Rust Belt kid, so seeing northern cities on something of a comeback trajectory is a good thing to me. The problem is image -- you have to find techies who are willing to put up with a very messed up local economy and deal with winter. I'm from Buffalo, and winters there are very long and cold. The obvious benefit is that the cost of living is much lower than California or similar. I couldn't believe last time I was in CA to visit a friend that they had just paid almost a million dollars for a 3-bedroom house with no property. I don't care how good the weather is, that's absolutely nuts, and I live in the NYC metro area, so I know about high real estate prices.

I think it's all cyclical. Right now where I am, everyone is moving to North Carolina. (Why??) People cite a much lower cost of living. That's true -- you can sell your Long Island house and buy (literally) a mansion on several acres in NC. The only problem is that Charlotte, RTP, etc. are still cities, and real estate close to the jobs is going to cost more. Your mansion is going to be a 25-mile drive from anywhere. Atlanta has a similar issue -- people deal with multi-hour commutes so they can live in a massive house inside a gated community in the middle of nowhere. Side note: a friend of mine who moved there for a job refers to Cary, NC as an acronym -- Containment Area for Relocated Yankees.

Personally, I love winter and would have no desire to move somewhere like Florida, Texas, or Arizona. Right now, those are the cheapest places business-wise, so jobs move there. But the northern states can play the game too. New York just gave some new businesses a 10 year tax holiday if they locate in certain parts of the state. All the state economic development agencies engage in this kind of poaching. The only problem is that the South is better at it because they don't fund schools and local governments to the same extent. If Michigan and Detroit are serious about this, and can afford it, then the businesses will move back. Executives don't care because they would either stay put or be happy just about anywhere. To them, it's not all that hard to pick up and move.

Low real estate prices, compact metro areas that mean short commute times, etc. are advantages that these states and cities can use. We'll see if it pans out.

12 hours ago

New French Law Prohibits After-Hours Work Emails

ErichTheRed I can hear the US complaints already... (477 comments)

I work for a European company in the US. Our work rules are very different because we're a multinational and HR is handled on a regional basis. Every single one of my colleagues complains about the French 35 hour week and their unwillingness to put in crazy hours like Americans do. I happen to agree with the French on this one, and this comes from a lot of experience working in several different work environments.

US workers love to point to the "lazy socialist French" and make fun of their long vacations and very relaxed work style. What they don't realize is that the French live longer, have better family lives, and are generally better adjusted than most stressed-out Americans. French unemployment is higher than it is here, but their society isn't structured around crushing anyone who doesn't have a job. As an example, look at how much people complained about continuing the meager unemployment benefits for long-term unemployed people in the US. People were complaining about giving someone who has no hope of ever getting another job a couple hundred dollars a week to survive on. The long-term unemployed this time around aren't lazy -- this time, all the old-school manufacturing jobs are being thrown out of the economy, leaving people with average or below-average intelligence no hope of anything beyond fast-food employment. But that's another worry for another time.

Back to work hours and work/life balance -- I am incredibly lucky in that I have a job in IT with lots of flexibility. Lots of my peers don't. Employers are constantly trying to squeeze every last minute of work out of their existing resources rather than adding more. Mine is too, but less so...I've been trying to get us another head for quite some time now and it's very hard. I have no problem with having a healthy work ethic, and people do need to be motivated. I do have a problem when I see employers taking advantage of people who don't realize they're being taken advantage of. Especially in IT, I have witnessed a lot of "hero culture" employers who demand that employees be available 24/7 even when it's not really necessary. Millennials are especially susceptible to this because they're used to being tethered to social media all day long. I think this is one of the reasons companies prefer younger workers -- fewer non-work demands on their time and a willingness to work crazy hours simply because they haven't figured out that their employer won't extend them the same loyalty down the road. In my opinion, your average employee is deluding him or herself into thinking that their job is super-important, that everyone else is lazy, and that their employer values them immensely. Evidence shows that this is no longer the case. It may have been in the 50s/60s "job for life" era, but unfortunately that's gone for the most part.

I'm also a new parent, and it's very hard to explain to someone without that experience how much of a drain on your free time kids are if you're doing it right and paying attention to your family. I see stressed-out parents working for employers who don't give a damn, responding to work emails at 2 in the morning simply because their employer expects that of them. I'll _glance_ at my messages once or twice in the evening, but I don't feel pressure to jump in and fix something right away -- unless something's literally on fire, it can wait. My opinion is this -- if something is really critical enough to require 24/7 coverage, then staff it that way. If you aren't willing to do that, then it's not critical. If the US were to adopt a "no after-hours contact" rule or a 35-hour week, it would reduce unemployment simply because companies would have to hire more people. Either that, or a whole lot of "priority 1 mission critical" stuff would suddenly become less so.
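Rough staffing arithmetic makes the point; the numbers below are assumptions I picked purely for illustration, not figures from anywhere:

<ecode>
import math

HOURS_TO_COVER = 24 * 7        # 168 hours in a week of true 24/7 coverage
FRENCH_WEEK = 35               # hours per person under a 35-hour week
HERO_WEEK = 50                 # an assumed "always reachable" US-style week

def heads_needed(hours_per_person, overhead=1.2):
    # Pad ~20% for vacation, sick time and training so the rotation doesn't collapse.
    return math.ceil(HOURS_TO_COVER / hours_per_person * overhead)

print(heads_needed(FRENCH_WEEK))   # 6 people
print(heads_needed(HERO_WEEK))     # 5 people
</ecode>

Either way you slice it, one "hero" with a pager isn't 24/7 coverage -- it's deferred hiring.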

about a week ago

State Colleges May Offer Best ROI On Comp Sci Degrees

ErichTheRed I agree, with very few exceptions (127 comments)

If you're after a good solid education, state schools do offer the best ROI for undergrad studies. I went to one, and was able to (barely) pay for it myself with a small amount of student loans, summer work and a little savings. Undergraduate education, from a content perspective, is very similar everywhere. I have a chemistry degree, and almost all undergrad chemistry programs are the same -- 2 survey courses, 2 organic chem, 2 physical chem, 2 analytical chem, 4 or 5 different lab courses, 4 or 5 electives (which vary based on what the schools' professors are concentrating on.)

The main differentiating factors I've noticed with private schools are the networking opportunities in and out of school, and the "cushy" factor. Even in a high tax state like New York, the state universities are pretty Spartan as far as accommodations go. Lately, states have been spending lavish sums trying to catch up in terms of sports facilities, etc. but they're still not a Harvard or Yale. Students going the state university route need to understand that they're going to get what they pay for, and likely be ahead of their private university peers in terms of raw dollars in debt when they get out. They need to be self-motivated and mature enough to handle their own affairs -- outside of class, everything at a state university is like dealing with a state agency. You're one student of thousands, and no one but you is going to care if you fail out. As far as opportunities go, private schools do give you a leg up. There are certain jobs you can't even hope to interview for such as white-shoe consulting firms or investment banking, who almost exclusively recruit from Ivy League schools. In my experience, this only applies to your first job or two, however. I've interviewed both public and private college grads, and there's an equal distribution of qualified people in each camp.

Since tuition is going way up at both the public and private levels, students who don't already have the money saved really do need to do a cost-benefit analysis. I probably would have had a better experience somewhere like MIT or Stanford, just because I would have been studying with more smart people. But I didn't have the money for $100K+ tuition. Students need to stop and think about whether the cachet of a big-name school offsets the huge expense. They need to think about things like:

  • - Do I want to go to medical or law school after undergrad?
  • - Do I ever want to work in investment banking?
  • - Will I be disappointed if I don't get to work at BCG, Bain, or Booz & Company?
  • - Do I want the opportunity to hang out with the children of corporate executives and make those "school ties" connections that public university students can't get?

If the answer to any of these is "yes" and the student has a pot of money, they should go to private school. Otherwise, they should save their money. If a student is willing to hustle a little to get their first job, their accomplishments at that job and the connections they make will carry them through the rest of their careers. They probably won't reach stratospheric heights of corporate power, but talented students graduating with in-demand degrees can still do well.
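As a crude sketch of that cost-benefit analysis (every dollar figure below is a made-up placeholder -- plug in real numbers for the schools in question):

<ecode>
# Break-even on the private-school premium, with hypothetical figures.
state_total = 4 * 25_000        # assumed four-year cost at a state school
private_total = 4 * 60_000      # assumed four-year cost at a private school
salary_premium = 5_000          # assumed extra starting salary the name buys

extra_cost = private_total - state_total
print(f"Extra cost: ${extra_cost:,}")                              # $140,000
print(f"Years to break even: {extra_cost / salary_premium:.0f}")   # 28 (before loan interest)
</ecode>

Unless the answer to one of the questions above is an emphatic "yes," the premium takes a very long time to pay for itself.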

about two weeks ago

NSA Infiltrated RSA Deeper Than Imagined

ErichTheRed Re:Still don't know what everyone's complaining ab (168 comments)

As for being worried more about the corps than the government -- why? Can corporations arrest and imprison you? If not, then you really have screwed-up threat assessment abilities.

This is kind of what I was getting at -- among those more concerned about privacy, everything is part of a vast government conspiracy, and they're lurking behind the next corner just waiting to imprison and torture you. I think the reality is a little different -- the US has become way too diverse even in the last 50 years to allow any one group to gain enough power to do anything major. There's 300+ million people, spread over a huge geographic area, all with different opinions on pretty much everything. Even if you did live in a mountaintop compound stockpiling ammunition for the revolution, no one would bother you unless you start using it on your neighbors. Look at how hard it is to get anything accomplished with a divided Congress...the entire country is polarized like that, and I doubt that will change anytime soon.

Companies having access to your personal data is a little different. There's an incentive to squeeze every last cent out of every single customer interaction now, and I think most people don't realize how much their data is being mined, for whatever reason. I find the increasingly focused ad targeting I've been noticing lately to be a little more invasive than an imagined threat. I'd love it if Google charged a subscription fee instead of using my data as payment for their services, but I guess they make way more from advertisers or they would have offered it as an option by now.

about three weeks ago

Is the New "Common Core SAT" Bill Gates' Doing?

ErichTheRed Why all the fuss about Common Core? (273 comments)

Common Core is a big thing in NY where I live right now, because the state just voted to suspend its implementation for 2 years. NY already has pretty high standards for high school graduation and, if I'm any indication as a product of it, the curriculum is pretty good too. That doesn't mean that all other states have the same standards, and it seems to me that Common Core was designed to bring all states up to a higher level. As an example, my previous job wanted me to move to Florida, so I played along and did the whole relocation trip thing before telling them, "Sorry." Even the real estate agents who were pushing the place hard told me that my children, if they were smart, would have to be in private school to get a good education...apparently Florida, just like Texas, values high school football more than education.

It seems to me that all the people screaming about how bad this is brought it on themselves. Look at all the press about the evil teachers' unions, whose members have pensions, get yearly raises, are protected by the union, and only work 180 days a year. Also here in NY, there was a big fight to force teachers to be evaluated and ranked the way corporate employees are in their performance reviews. I'm not a teacher, and I'm totally against that. First off, getting stuck with a class of crappy students can cost you your job, especially early on in your career when you might have to work in a bad school district. Second, teachers are professionals. Once they receive tenure, they should no longer be subject to evaluation and should have a job for life, end of story. Doctors and lawyers aren't stack-ranked -- those of us in private sector jobs who don't like it should fight to get representation.

Regarding the SAT, I wound up doing much better on the ACT when I took both. The ACT was much closer to what the SAT is slated to become. I remember it focused a lot more on what you were learning in school than on obscure vocabulary words. I have a horrible time with head-based arithmetic, and the math section of the SAT (when I took it) had no calculators allowed and was basically two 30-minute tests of arithmetic and algebra tricks. I went on to make pretty decent grades at a state university in chemistry, so much for the predictive power of SAT scores... :-)

about a month ago

Tested: Asus Chromebox Based On Haswell Core i3

ErichTheRed Definitely not for power users (103 comments)

Even though I do find myself using things like OneDrive and Dropbox to keep non-sensitive stuff I'm working on available at home or at work, I'm not totally convinced that most power users will be replacing their PC or laptop with what's essentially a thin client that Google has control over.

On the other hand, for people who truly don't know any better, live in a location with five-nines, super-fast broadband access and just don't have the savvy to understand that their data is being mined by a third party, this might take off. It's the same reason tablets are taking off among the "content consumer" set. Amazon is doing something very similar with the Kindle Fire -- basically giving away the hardware with the knowledge that Amazon uses your browsing habits to improve their prediction engine.

Honestly, I wish Google and similar services would offer a "paid" version with no data mining or tracking. People forget that the awesome search engine, maps, etc. aren't a free resource, and their data is paying Google's bills.

about a month ago

Computer Science Enrollments Rocketed Last Year, Up 22%

ErichTheRed I've seen this before (137 comments)

I remember CS enrollment shot way up in the late 90s as the dotcom bubble was inflating. Now that we're in the late stages of the social media/apps bubble, and people are getting interested in computer science again, I'm guessing that's the reason for the spike.

Bubble or no bubble, there's always going to be demand for good, talented people in software development and IT. The H-1B and offshoring trends have cut salaries significantly, and have made employment less stable, but there are still jobs out there. If the students going into CS have a genuine interest in computers, that's good. Chasing the money like people did in the 90s, without the desire, will lead to the same problem we had when 2001 rolled around -- tons of "IT professionals" who had no aptitude for the work and were only employed because of the frothy market.

I've managed to stay employed for almost 20 years now and I still really enjoy what I do. It's not as wildly lucrative as it was in the 90s when you could get 20+% salary increases by changing jobs every six months. The only things I've done consistently over this time are:
- Keeping my skills current (and yes, it is a tough commitment, especially when your employer doesn't care.)
- Not begging for higher and higher raises every single time salary review time comes around (which requires saving and living within one's means...)
- Choosing employers who don't treat their employees like they're disposable.

I've heard lots of older IT people say that they're actively discouraging their kids from following in their footsteps. I don't think that's necessarily good advice. Sure, there are crappy employers out there, and it's not a guaranteed ticket to wealth anymore. But if you're flexible and want interesting work that lets you use your brain and get paid for it, it's still a good move IMO. Look at the legal profession right now -- the ABA sold out its members by allowing basic legal work to be offshored. Law degrees were previously an absolute guarantee of a respected, high-salary job, and now that profession is starting to see what we're seeing. My opinion is that as computers get more and more involved in our daily lives, a professional framework will eventually develop, once things really start getting safety-sensitive and people stop treating computers like magic boxes and IT/developers like magicians.

about a month ago

Teaching Calculus To 5-Year-Olds

ErichTheRed Interesting idea (231 comments)

I think that one of the problems with the way math is taught in schools is the fact that very little is done to explain how calculations students are doing can be applied to actual problems. Now that I'm older, went through a science education in college and work in a technical field, I understand this. However, one of my problems early on was that I never really felt comfortable doing math problems. It sounds really stupid, but I must have some sort of disability -- I can't do basic arithmetic in my head. The numbers just don't stick in my head the way they need to when you're doing multi-column addition or multiplication. My wife, a finance wizard, laughs at and pities me at the same time when I'm manually figuring out a tip. When I was learning math back in the Jurassic period, the students who were "good at math" were the ones who could easily do calculations in their head and just had a feel for numbers. Calculators in the early grades were unheard of back then. And this skill is still what a trader needs -- they need to be able to make a decision in 5 seconds based on a calculation they do in their head. It's also a skill you need to do well on the SATs, since they basically contain two 30-minute timed algebra and arithmetic tests.

What I'm saying is that math is more than basic arithmetic and algebraic manipulation. If you can get a student to understand what you mean when you say exponential growth, and how it relates to something they care about, then students will understand it more. I remember hating grade school math with the endless arithmetic drills, and later, the rote memorization of procedures for fractions, long division, etc. I also remember going through high school algebra just memorizing the exact steps to complete the crazy factoring/simplification problems and not understanding _anything_. It literally took me until about halfway through high school, when science classes actually got somewhat challenging and delivered meatier material, to make any sort of connection.
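For example -- and this is just the kind of toy illustration I mean, not anything from a curriculum -- a penny allowance that doubles every week gets "exponential growth" across faster than any worksheet:

<ecode>
# A one-cent allowance that doubles every week.
allowance = 0.01
for week in range(1, 21):
    allowance *= 2
    print(f"Week {week:2d}: ${allowance:,.2f}")
# By week 20 it's over $10,000 -- that's what exponential growth means.
</ecode>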

Calculus and other applied math should be at least touched on earlier on in the school career. I think it would help students who don't necessarily have the skills that would make them "good at math" to at least understand some of it. People I know who understand math well say it's like a foreign language, so maybe we should be teaching useful phrases for travellers more than we teach verb conjugation and sentence structure...

about a month and a half ago

All Else Being Equal: Disputing Claims of a Gender Pay Gap In Tech

ErichTheRed It all depends on where you work (427 comments)

Not all companies are crazy software development shops run by 20somethings who don't mind working 100 hour weeks. If you go to a company like that and ask about wage parity, you'll get all the excuses that were posted in this thread -- time off for kiddies, inability to travel, inability to work 100 hour weeks when needed, etc.

The reality is a little different. My wife and I both work, and we have 2 little kids. They take an insane amount of BOTH our time. Both of us have to share the responsibility of sick days, chores around the house and running errands, and especially this winter, snow days. We both have technical jobs, mine in IT, hers in finance. The difference is that we work for companies that don't expect 100-hour weeks. So far, it's worked out as long as one of us isn't taking all the time off work. It's not the 50s anymore -- most companies are mainly concerned with whether you get your work done and less obsessed with the butts-in-seats factor. The trade-off for this is that sometimes we end up having to do a little extra work to catch up after the kids go to bed, which sucks when we're dead tired from working AND taking care of the kids. But at least we're not (publicly, anyway) referred to as the one team member who can't get their stuff done.

So it's less of a female wage parity problem, and more of an "old guy/girl with kids" problem. That really bites as the two of us get older. We have to be smarter about the type of companies we choose to work for, and yes, both of us are leaving money on the table compared to the wages in our area. Single people who just graduated and have zero obligations will always have more choices. They can choose to work at an investment bank, or for a consulting firm that will fly them to clients' offices 300 days out of the year. They could go work for EA and fulfill their "lifelong dream to break into the exciting video game industry." These are choices only, not necessarily good or bad ones. It's just that as we get older, if we don't want to ignore our kids, we have to give up some of our options. ECO 101 - opportunity costs.

For fathers who aren't totally disconnected from the responsibility of raising kids, the extra time off needed is very close to what the mother needs. You need to be there for your kids. It's harder to understand this if you're in your late 30s and still single, or married and childless, but in my experience good managers have been able to at least relate. If one of your team members is still doing great work and needs to work a weird schedule, you would be silly to dump them and replace them with a fresh grad who doesn't have the experience but is willing to work themselves to death for you.

about a month and a half ago

IEEE Predicts 85% of Daily Tasks Will Be Games By 2020

ErichTheRed Pretty silly (146 comments)

I think this might work for _some_ millennials who are so used to this kind of reward system that it becomes the only way they can function in a work environment. If someone is raised on video games and collecting badges/trophies/points/whatever for doing a task, then it becomes a good workplace motivator. This would be especially true for younger software developers -- grind out this module/finish this sprint/debug this feature and receive the "Chief Debugger" badge. It could also work for mundane tasks that younger workers might turn their noses up at if there weren't some sort of bragging rights attached. I'm not that old, and I was raised on video games, but not on the whole "status collection" thing.

For someone who is already motivated to do a good job and doesn't need this, I can see it becoming a huge wedge issue. Not everyone works for companies that are arranged around being an extension of the college dorm lifestyle. Different people are motivated by different things. Money is nice for me, for example. Same goes for finishing something, seeing it go out to a customer or one of our internal guys, and having it work without coming back. I don't care if I have 16 badges and 20,000 points for doing that -- I care about the end result.

about a month and a half ago

Doctors Say New Pain Pill Is "Genuinely Frightening"

ErichTheRed Is the abuse problem really that big a deal? (294 comments)

I know "drugs are evil" and all, but I genuinely don't understand why people are so panicked about people abusing prescription pain killers. The reality is that there's a huge demand for pain medication, both for legitimate and abuse purposes. Just like the other wars on drugs, it's impossible to stop. Therefore, I'm of the mind that we shouldn't do anything...and that's coming from a very left-wing, big-government type. We should focus on providing abusers safe drugs, and spend the money we save on enforcement on treatment for the people who really want to get off drugs. I've never touched drugs, but I can't blame someone who has a crappy life and no prospects of it getting better from doing so.

Providing pain medication addicts with a preparation that won't destroy their liver (due to the included acetaminophen in other meds) would be a start. There's no fix for the demand problem, and reducing supply just drives up the price.

The reality is that the future is looking pretty bleak -- unemployment is going to be incredibly high as even safe middle class jobs are automated. Unless we want a revolution, it might be time to start loosening the restrictions on controlled substances. When unemployment goes up past 30, 40% and higher, governments are going to have angry mobs on their hands unless they have something to keep them occupied...

about 2 months ago

'Google Buses' Are Bad For Cities, Says New York MTA Official

ErichTheRed It's not the buses themselves... (606 comments)

The core problem that I think is being addressed is this -- if your urban area doesn't have a good mix of uses (work, leisure, living space, etc.) then it eventually starts decaying. San Francisco is the exception to this rule...the Google and Apple employees want to live the hipster city lifestyle and make enough money to do it. These companies save on insane SF rents by locating out in the suburbs where land is a little cheaper. The same is happening with the big investment banks in NYC -- there's no longer a physical reason to be right next to the stock exchange (though your data center still needs to be.) A lot of banks relocated further uptown, or to NJ or CT especially after 9/11. The difference is that there aren't "Goldman Sachs buses" or "UBS buses", but most people employed at these places have enough money to live wherever they want and commute on their own.

Other "less desirable" cities have the problem of people not wanting to live in the urban core, the reverse of what's going on in San Francisco. I've never actually been to San Jose/Cupertino/Mountain View/wherever in SV, but I imagine it's something like where I live (Long Island, suburban NYC.) We have some very nice places on LI and other communities surrounding NYC, but it's mostly very expensive sprawly development you find around most big cities. Tons of people use public transportation to get into the city every day, mainly because much of the area was at least somewhat designed around it. There are big employers on Long Island too, but not as many reverse commuters. The problem is, if businesses are downtown but _everyone_ goes home to their suburban towns after work, nothing is left to prop up the city center after the offices are done for the night. Google and Apple want to attract the hipsters, so they choose to ferry them from their hipster neighborhoods to the relatively boring suburbs. Most other employers in most other locations cater to the suburbanites, As a result, those cities' urban cores decay and become shells after 6 PM on weekdays. Fewer residents --> fewer businesses to cater to their needs --> crime and urban decay. Look at Buffalo and Detroit as extreme examples of this -- the suburbs surrounding the city have basically become the only sustainable parts of the city. Atlanta is basically a city of suburbs with no comprehensive public transportation and nightmare traffic as a result. Urban planning is really tricky to get right.

It's not an easy problem to solve. Everyone wants it both ways -- the 2 acre mansion PLUS the urban hipster bar/club scene. But the MTA is right in saying that Google buses are bad for (most) cities. The most sustainable development is a mix of uses in both city and suburban settings.

about 2 months ago

WhatsApp: 2nd Biggest Tech Acquisition of All Time

ErichTheRed Yay Social Media Advertising Bubble!! (257 comments)

I think it's time to call the near top of the social media bubble. Maybe this one will be called the Web 2.0 Bubble.

It's funny, because I remember the last tech bubble in the 90s ending a few months after similarly insane acquisitions. Remember when AOL merged with Time Warner because Time Warner was panicked about being left behind in the Web 1.0 future? How about all the IPOs of completely unprofitable companies based only on the fact that they sold stuff online or were funded by advertising?

I think whether this turns out to be a bubble or the "new normal" depends on how well these social media companies and device manufacturers can present themselves to the average joe as "the internet." Remember that AOL used to be "the internet" for anyone non-technical. People keep predicting the death of PCs simply because anyone under 25 uses tablets and phones as their primary computers, considers email old fashioned, and lives on Facebook. The question is whether this is universally true or just some hipster marketing buzz. I know people who live on Facebook, people like me who use it to post family pictures, and people who actively hate it. I think it could go either way, but the market for this stuff is way too frothy now. Even my boring corner of IT is being bombarded by cloud this and cloud that, and it's touted as the solution for everything.

The strange thing is this -- during the 90s, I was a new grad riding out the dotcom boom in one of those "boring" corners of traditional IT (sysadmin for an insurance company). This time around, I'm in a different "boring" corner of IT (systems architect in air transport). The plus side of this is that I never got laid off during the bust cycle. Marketing flash may sell IPOs, but people who actually know their stuff get to keep working when most of the fluff gets thrown out. Oh well... At least the 90s tech boom sparked a huge Internet build-out, oh, and left a lot of Aeron chairs on eBay. :-)

about 2 months ago

Computer Geeks As Loners? Data Says Otherwise

ErichTheRed Definition of "computer geek" has changed. (158 comments)

I think the study might have some merit, but only because the definition of geek has changed a lot.

I got into computers in the early 80s as a very young kid. By the time I really got involved with a "geek" social scene, there was a mix of people. Before that, computers were most definitely nerd toys -- there were very few "typical" folks who gravitated toward them. Even so, I've worked with people who want nothing to do with computers once they are off the clock, people who have a healthy level of hobby involvement with computers, hardcore gamers, and extremely hardcore "computer nerds" -- mom's basement types. The first group are the most likely to be in a stable relationship from my experience. I'm happily married with 2 kiddos, and I put myself in the "healthy level of hobby involvement" camp. It's surprisingly hard to find time to do anything these days with 2 young kids. You certainly won't see me playing video games for 10 hours at a clip anymore...I used to do that back in the day though.

I do have anecdotal evidence from my dealings with "tech workers" that divorces are very common. Lots of people I work with are on Wife #2 or more. I think a lot of that might be the crazy amount of time that work and computer hobbies can suck out of your life -- you really have to be matched up with someone who will either tolerate it or is a "geek" themselves and understands. And like I said, once kids come along, I can see huge problems if you decide to disappear for hours on end and expect your partner to just handle the kids. If you work an IT job for one of the crappier employers out there that demands on-call duty and tons of hours a week, only the shallowest of spouses will stick around and only if you make good money to make up for you not being there.

My other piece of strictly anecdotal evidence is the prevalence of...non-traditional...relationships among the geekier set. One US-born guy I worked with was divorced and constantly trying to bring his girlfriend from China to the US -- no clue how they met. Lots have girlfriends they met online. Others have had obvious mail-order brides. That could sound a little stereotypical, but I've seen LOTS of guys' wives who barely speak English and look like they're pretty much there to cook and clean for them. Maybe I'm just working with the wrong sorts, but that's a very common theme in my experience.

Non-traditionals aside, I think a lot of the evidence the study cites is just because computers are now a normal part of our lives. Anyone can be a Facebook user. Smartphones are designed to be used by non-techies. There are plenty of "IT" jobs that don't involve hardcore coding or systems/analysis work. My job borders on the nerdy side, but only because I make it that way.

I think that if you actually do find the right person, and that person is less of a geek than you are, it balances you out. My wife is incredibly smart, but not obsessed with computers and tinkering the way I am. (She's a finance geek.) If you find someone who's just there for the money or has absolutely no interest in what you do, that's where the divorces and bitterness creep in. I'm almost at 15 years married -- and she hasn't tossed me out yet!

about 2 months ago

IBM Looking To Sell Its Semiconductor Business

ErichTheRed I don't get it. (195 comments)

How is semiconductor manufacturing not a core business for a company that still makes huge profits off mainframes and midranges?? Sure, keep design in house, but you'll lose the flexibility you have now. Imagine your research division comes up with an amazing new chip design they want to work on right away, but gets told "Nope, it'll take 6 months to ramp up GlobalFoundries, TSMC, or whatever. Sorry."

The thing I really don't get (in general) is the way businesses feel like they can have no assets on their books and just run everything with a massive tower of multi-layer outsourcing. It doesn't make sense -- outsourcing something is never cheaper than doing it yourself. As soon as you do that, you add in a layer of middlemen who need to get paid for doing a task that was previously cheap or "free with purchase of in-house labor." It never works out. I guess I'll never be an MBA, because I don't get the accounting tricks that make a company appear profitable when they're wasting money on things they could do cheaper and better themselves.

For IBM's case, I do see what they're trying to do. Software is more profitable than hardware. But the problem is that IBM is/was a huge innovator in hardware and chips. They're one of the last US companies massive enough to support basic research that can improve those hardware innovations. IBM's software may be profitable, but I haven't seen anyone singing the praises of WebSphere or their Rational products lately. IBM also has a massive "services" division. I've had extremely good luck with the services people who service IBM hardware, but that's going away. So, we're left with the legendary crap outsourcing and offshoring stuff they do for large companies, and of course, "consulting." My experience with outsourced IT run by IBM is an ITIL nightmare of endless support tickets, revolving door engineers, meetings to plan meetings to plan the strategy for changes, etc.

It's kind of a shame if you ask me. I am just old enough to remember when IBM was as powerful as Microsoft was and as Apple is right now. They were able to command huge margins on everything they sold because it was backed up by a really good services team. People I know who worked for IBM "back in the day" tell me the corporate culture was weird, but employees never wanted for anything because they made so much money. (I also know people who worked for Sun and Digital who say the same thing.) In some ways, it would have been much nicer to work in the computer field during this "golden age of computing." I guess my main question is where the new hardware innovations will come from when you don't have a massive company and research group driving them.

about 2 months ago

Satya Nadella Named Microsoft CEO

ErichTheRed Congratulations (293 comments)

Now, can I please have Windows 9 with the Windows 7 and Windows Classic UI as options?? It's literally the only reason why I'm not switching -- some of the Windows 8 UI is nice, but I can't stand the 2D desktop interface from Windows 2.0.

Seriously, the best thing that could be done for Windows right now is not to dump Metro, but to put it on tablets where it belongs and not force desktop users to buy into the whole touch-first thing.

about 2 months ago

California Regulator Seeks To Shut Down 'Learn To Code' Bootcamps

ErichTheRed Remember MCSE Bootcamps? (374 comments)

Back in the late 90s / early 2000s, training companies were making tons and tons of money funneling people with zero computer experience through MCSE certification bootcamps. Basically, they would do the entire set of certification exams in 2 weeks, and not all of them were 100% honest with students about their chances of passing or even getting a job once they were done. These bootcamps still exist, but from what I've experienced, they're only for people who actually know the material and just need to update their skills quickly. The earlier iterations of these were definitely certification mills, though. I went to one around 2001 because I wanted to update my certs. The class was split -- some of us were there to just do a quick skills upgrade, and others had obviously been suckered in by a dishonest recruiter. To get these folks to pass, instructors would give them copied exam questions to study and pay for these students' extra chances to pass the exams. The school would then be able to tout its super-high pass rate for the exams. And these weren't cheap either -- some were $7K or $8K in 1990s dollars. Even when you factor in the cost of a hotel stay, meals and an instructor, the profit margin is huge.

Now it seems that the focus is less on system admin skills and more on "web coding" like these schools are offering classes in. Seems like a perfect hook -- young students who use their iPhone or Android mobile constantly get sold the dream that they too can be the next great app writer and make millions. And it really does seem doable -- with all the web frameworks out there, there's very little a "coder" has to know about what's actually going on under the hood to make something that works. Problem is that paper MCSEs didn't work out so well when they got on the job, so I doubt these classes will help mint genius developers either. My boot camp class back in the day had a former bus driver and someone who was fresh out of the army in an unrelated field.

Libertarians will say it's OK for businesses to take advantage of people, but I think education is a little bit different. Selling someone thousands of dollars' worth of classes and telling them they're equivalent to CS graduates just isn't honest; these schools profit off people's naivete and sell them dreams. The state gets to regulate educational institutions, so it makes sense that it's taking a look at them. And what if the regulation were something simple, like requiring schools to publish student outcomes and pass rates? Even the libertarian free market should like that, because the bad schools might get weeded out if students could be bothered to look at the published statistics.

It took ages to weed the paper MCSEs out of the workforce, and it's still not 100% complete. Every time I meet an "IT professional" who has no troubleshooting ability, I think back to these bootcamps.

about 3 months ago

Microsoft Joins Open Compute Project, Will Share Server Designs

ErichTheRed There goes the rest of the hardware market... (90 comments)

Microsoft isn't giving their server designs away out of the goodness of their hearts. They have a huge interest in getting people to move their workloads to Azure. The first step for most places has to be getting them off VMware or KVM and onto Hyper-V/Windows Server. The next step is convincing enterprises to buy these whitebox server designs to save money on their on-premises stuff. Finally, they'll make Azure too good a deal to pass up for the CIO crowd, with the usual argument that you can fire most of your IT department. It's already super-easy to publish your applications right from Visual Studio to Azure...again, not an accident.

I actually think the whitebox design method is a good thing...IF...you have a dedicated staff working 24/7 to repair/replace sickly boxes, and the workload is such that a box is a box is a box. This works perfectly for large scale web apps backed by a SAN, or hypervisor hosts. It doesn't work as well for standalone application stacks that have semi-permanent physical server dependencies. Renting 3 servers in the cloud doesn't make as much sense as renting 3,000.

My company does a lot of standalone deployments of applications around the world, in places where network connectivity doesn't permit easy cloud access. It's getting harder to find vendors who aren't trying to steer us to the cloud. Microsoft is making it very difficult to purchase perpetual licenses for software, with the price of a negotiated Software Assurance deal set lower than the equivalent one-time license fee [1]. Now that IBM has bailed out of the x86 server market, HP is pretty much the only vendor left making decent hardware for non-cloud applications.

I totally get why AWS, Azure and public clouds make sense. When you're running the back-end for an iPhone app, and need 40,000 web servers all cranking out the same content, it makes sense to rent that. But a lot of companies don't seem to get that it's more expensive to do the cloud thing if the servers are going to be permanent and you're hosting one of those boring line-of-business apps. Hopefully people will realize this before the last decent x86 server vendor quits selling non-cloud-optimized servers.

[1] Licensing SQL Server on multi-socket physical boxes is insanely expensive now compared to VMs. I had to add ESXi to our solution for a recent deployment just to save thousands of dollars on the database license for a low-powered app.
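To put rough numbers on that footnote (the per-core price here is an assumed round figure for illustration, not a real quote):

<ecode>
# Illustrative per-core licensing arithmetic with an assumed list price.
price_per_core = 7_000            # assumed per-core price, USD
physical_cores = 2 * 8            # license every core on a 2-socket, 8-core box
vm_cores = 4                      # a small VM sized for the low-powered app

print(f"Bare metal: ${physical_cores * price_per_core:,}")   # $112,000
print(f"Small VM:   ${vm_cores * price_per_core:,}")         # $28,000
</ecode>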

about 3 months ago

Submissions

Apple buys iFixit, declares repairable devices "antiquated".

ErichTheRed writes  |  about two weeks ago

ErichTheRed (39327) writes "Apparently, Apple is buying iFixit. iFixit is (was?) a website that posted teardown photos of gadgets and offered repair advice. According to the website: "Apple is working hard to make devices last long enough to be upgraded or irrelevant, making repairability an antiquated notion." It's all clear now — I can't replace the batteries, hard drives or RAM in new Macs because I'm expected to throw them in the landfill every 2 years!

It made it to CNN, so it has to be true, right?"

Link to Original Source

"Clean up Github" -- A backlash against stereotypical nerd culture?

ErichTheRed writes  |  about a month ago

ErichTheRed (39327) writes "The story on Monday about Julie Ann Horvath quitting GitHub because of harassment ties in nicely with this. A group called Ethical Code is starting a "Clean Up GitHub" campaign to request people to pull offensive comments out of their code. This brings up a very interesting question...is it still considered too PC to expect people to be somewhat professional in their public code submissions, or is this a sign that the industry might be "growing up" a little? I'd like to hope it's the latter...."
Link to Original Source

Microsoft Retiring the TechNet Subscription

ErichTheRed writes  |  about 10 months ago

ErichTheRed (39327) writes "One of the nicest perks that Microsoft offered is being retired. Microsoft has reasonably-priced "TechNet Subscriptions" which give you low-cost full access to download fully functional evaluation software. The idea is that IT people could use a product in their lab for learning or simulation purposes without having to shell out thousands for an MSDN subscription. These are being retired as of August 31st. Apparently they're trying to shift "casual" evaluation of software onto their Virtual Labs and other online offerings. If you want full evals of software, you're going to need to buy an MSDN Subscription. I know lots of people abuse their TechNet privileges, but it's a real shame that I won't just be able to pull down the latest software to replicate a customer problem, which is part of what I do on a daily basis. I guess you can mark this one as "From the one-bad-pirate-ruins-the-whole-bunch department...""
Link to Original Source

Ex-Employee Busted for Tampering with ERP System

ErichTheRed writes  |  about a year ago

ErichTheRed (39327) writes "Here's yet another example of why it's very important to make sure IT employees' access is terminated when they are. According to the NYTimes article, a former employee of this company allegedly accessed the ERP system after he was terminated and had a little "fun". As an IT professional myself, I can't ever see a situation that would warrant something like this. Unfortunately for all of us, some people do and continue to give us a really bad reputation in the executive suite."
Link to Original Source

Change the ThinkPad and it will Die

ErichTheRed writes  |  about a year ago

ErichTheRed (39327) writes "Here's an interesting editorial piece about the ThinkPad over at CNN. The basic gist of it is what many ThinkPad devotees have been saying since Lenovo started tweaking the classic IBM design to make the ThinkPad more like a MacBook, Sony or other high-end consumer device. I'm a big fan of these bulletproof, decidedly unsexy business notebooks, and would be unhappy if Lenovo decided to sacrifice build quality for coolness. tl;dr: You can have my 1992 clicky IBM ThinkPad keyboard when you pull it from my cold dead hands. :-)"
Link to Original Source

IBM Sells POS Business to Toshiba

ErichTheRed writes  |  about 2 years ago

ErichTheRed (39327) writes "Yet another move by IBM out of end-user hardware — Toshiba will be buying IBM's retail point-of-sale systems business for $850M. I'm not an MBA, but is it REALLY a good idea for a company defined by good (and in this case high-margin) hardware to sell it off in favor of nebulous consulting stuff?? Is there really no money in hardware anymore? I doubt they'll ever sell their Power systems or mainframes off, but you never know!"
Link to Original Source

Learning Programming in a Post-BASIC World

ErichTheRed writes  |  more than 2 years ago

ErichTheRed (39327) writes "This Computerworld piece actually got me thinking — it basically says that there are few good "starter languages" to get students interested in programming. I remember hacking away at BASIC incessantly when I was a kid, and it taught me a lot about logic and computers in general. Has the level of abstraction in computer systems reached a point where beginners can't just code something quick without a huge amount of back-story? I find this to be the case now; scripting languages are good, but limited in what you can do...and GUI creation requires students to be familiar with a lot of concepts (event handling, etc.) that aren't intuitive for beginners. What would you show a beginner first — JavaScript? Python? How do you get the instant gratification we oldies got when sitting down in front of the early-80s home computers?"
Link to Original Source
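For what it's worth, a few lines of Python still come close to that old instant gratification -- a toy guessing game like this one (just an illustration, not anything from the article):

<ecode>
# A beginner-sized guessing game: the whole program fits on one screen.
import random

secret = random.randint(1, 100)
guess = None
while guess != secret:
    guess = int(input("Guess a number between 1 and 100: "))
    if guess < secret:
        print("Higher!")
    elif guess > secret:
        print("Lower!")
print("You got it!")
</ecode>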

America's tech decline: A reading guide

ErichTheRed writes  |  about 3 years ago

ErichTheRed (39327) writes "Computerworld has put together an interesting collection of links to various sources detailing the decline of US R&D/innovation in technology. The cross section of sources is interesting — everything from government to private industry. It's interesting to see that some people are actually concerned about this...even though all the US does is argue internally while rewarding the behaviour that hastens the decline."
Link to Original Source

Journals

ErichTheRed has no journal entries.
