
Comments


New AP Course, "Computer Science Principles," Aims To Make CS More Accessible

ErichTheRed Re:How is this an AP? (208 comments)

One of the state universities near me offers a "pre-intro" CS course, CSE 110, that focuses on the absolute basics before students get dropped into a real programming course. It seems to me that this is a good way to scare away people who don't actually want to do CS, and to fill in the knowledge gaps that today's students tend to have. Interestingly, it's distinct from the high-level survey course for non-majors, and it's only a "suggested prerequisite" for the more programming- and logic-heavy traditional Computer Science I, II and III.

To me, that seems like a good idea. The typical student who thinks CS is a good fit because they've messed around with computers is different from their counterparts of previous eras. Most won't have even a passing familiarity with low-level programming, algorithms and the other fundamentals that people picked up before the app revolution. See my other post in this article -- writing a Minecraft mod or cooking up a web application in ReallyCoolFrameworkOnRails doesn't give you the same low-level understanding of how a computer actually does all the magic it does.

3 days ago

New AP Course, "Computer Science Principles," Aims To Make CS More Accessible

ErichTheRed It's not really a gender thing, it's perception (208 comments)

Everyone seems to be pointing to this as a gender issue, but the way I see it, it's a way to get more students interested in the stuff that _most_ of them will be doing with a computer science education.

The world has changed significantly since I graduated college almost 20 years ago (with a STEM degree that wasn't CS.) In 1997, the year I got out, the dotcom bubble was just inflating, and all the protocols and "glue" that make Web applications work were just starting to be built out. Fast forward to now, and there's so much abstraction in a typical computer system that many "coders" writing web applications never deal with system-level code. There are a billion web frameworks that get cycled through every 6 months as the new hotness, and tons of new languages to run on them. This group of people will be better served by learning enough logic to keep them from doing stupid stuff in their framework of choice, and that seems to be what the course focuses on. It acknowledges the reality that many CS grads aren't going to be sitting down in front of a terminal writing hardware drivers or doing embedded systems work.

Think about it -- to build a web application in the 1990s, you had to first design the entire database schema and set it up on a system somewhere. Then you had to write code so your web application could talk to the database to get information in and out. Then you had the whole presentation layer, with a combination of static and dynamic HTML plus all sorts of crazy CGI, Flash, ColdFusion or whatever glue code to get everything working. To build an iPhone app now, you download Xcode, pick a sample project and just glue together huge chunks of pre-built functionality. The focus of the app becomes the presentation layer, because everything below that is solved for you. This is how a bunch of ex-fraternity "brogrammers" can build Tinder or Uber and make $40 billion in Monopoly money...there is obviously some technical talent behind it, but the app itself is relying on huge amounts of pre-integrated, well-documented libraries.

A student coming into CS now sees apps, social media and mobile devices. How do you keep them down on the C++ and data processing farm when this is the current face of computing? The reality is that some parts of software development are no longer only for the nerdy crowd. Apple put computers into hundreds of millions of ordinary people's pockets. They are now a consumer electronics company more than a computer company (and their current crop of Macs seems to reflect this.)

3 days ago

Uber Limits 'God View' To Improve Rider Privacy

ErichTheRed You still can't change user behavior (76 comments)

One of the things fueling the insane hype behind the Web 2.0/mobile/social/app/whatever bubble is the fact that any group of startup kids can use these tools to build an app -- just as any group of startup kids could build a website capable of processing payments in 1997. Add in a shaky business model, and all of a sudden "this time it's different." Apple, Google and the other smartphone OS vendors have rolled out some really cool stuff and basically handed everyone a tracking device: a full-powered computer the size of a phone with all sorts of sensors attached. The problem is this -- the nature of the user interface hides from ordinary users the fact that all of their location and other data is being shared with the app developers. Android does a little better with its privacy controls, but basically all of this is hidden from the user.

Ordinary users, i.e. non-techies, see the shiny app interface and (understandably) don't see that the "free" services the app provides are paid for either through marketing/advertising ("eyeballs," in dotcom bubble 1.0 speak) or through selling their data to third parties. And even if they knew about it, most people would value being able to hail a cab on demand more than their privacy. It would take some serious user education, and a few very high-profile leaks of customer data, to change behavior -- and even then I doubt it would. People like their free apps. I would pay Google for a subscription to their search engine if I could be assured my information wasn't being harvested, but I know almost no one else would want this.

On the positive side, sitting on the sidelines and watching from my comfy seat, it looks like Bubble 2.0 is starting to reach the top. We're already seeing the insane valuations and VC investments, we've had a couple of high-profile revenue-free IPOs like Twitter, and the next phase is coming. As soon as interest rates start going up and the stock and VC bubble money stops flowing, things will calm down again. When you start hearing startup-speak more and more in the financial press, it's time to sell and wait for things to collapse again. It really is the dotcom bubble all over again, but this time people are carrying their web browsers in their pockets and companies have direct access to their location and habits.

4 days ago

Waze Causing Anger Among LA Residents

ErichTheRed Re:Symptom of a bigger problem (594 comments)

"I'm looking at you, Phoenix."

I've noticed that a lot in the West (I'm an east coaster -- the cities here just don't have the land available to do this anymore.) Western cities with miles and miles of flat territory around them tend to have these "planned community" developments where an entire city will be built on thousands of acres in one shot. Even if it's a planned city, people still need to go in and out of it, especially if your planned city has destinations like office parks or stadiums. (Didn't Phoenix do one of these to try to build up the area around their football stadium? And I'm sure I've read about huge abandoned planned cities in Vegas after the housing collapse.)

Where I live (metro New York,) you don't see these big bang developments -- you see random little developments sprinkled around the edges of the "insane commute zone." Northern New Jersey and Long Island have this - the first-line suburbs (example: Nassau County NY) are completely built out and full with zero land to spare. The problem is that much of the housing stock is from the 40s through the 60s on tiny lots. People still want the 2 acre lots and the 6000 sq. ft. monster houses, so they start creeping further and further out. When enough people do this, the infrastructure that was designed for a much less dense population gets overwhelmed. After about 20 years of this, more lanes get put in, encouraging more development, and making the problem worse.

5 days ago

Waze Causing Anger Among LA Residents

ErichTheRed Re:Perhaps they need a bigger highway? (594 comments)

"Eminent domain those houses and get some more lanes in."

Last time I was in LA, I noticed that lanes are not the problem. Some of the freeways are five to eight lanes in each direction. It's a crowding problem, not a civil engineering problem. Everyone is trying to get to destinations inside that corridor, _and_ through that corridor to get to other destinations. Since metro LA is hundreds of square miles of mostly low-density development, travel distances are longer than they would be in a more compact city. Now the Waze app is using drivers' real-time smartphone data to steer people off the freeway and onto surface streets, which makes the overall problem worse.

Part of it is the human factor -- yes, I know Google will perfect the self-driving car in 2015, yada yada yada, but for now, you have people driving cars. People get into accidents. People have reaction times that mean they can't take their foot off the brake the instant the traffic ahead clears. (Try this sometime at the end of a long line of stopped traffic: when the light turns green, watch how long it takes for Car 1 to go, then Car 2, then Car n, then you. Each driver has a built-in reaction delay that makes the process take far longer than it would in an ideal environment.)
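
To put a rough number on that, here's a quick toy calculation of mine (not from any traffic study), assuming a made-up, fixed 1.5-second reaction time per driver:

    REACTION_S = 1.5   # assumed per-driver reaction time (illustrative guess)
    CARS = 20          # length of the stopped queue

    def start_times(cars, reaction_s):
        """Return the time at which each car in the queue starts moving."""
        times = []
        t = 0.0
        for _ in range(cars):
            t += reaction_s          # each driver reacts to the car ahead
            times.append(t)
        return times

    times = start_times(CARS, REACTION_S)
    print("Car 1 moves at t=%.1fs; car %d moves at t=%.1fs"
          % (times[0], CARS, times[-1]))
    # With 20 cars and a 1.5 s reaction, the last car sits for 30 seconds
    # after the light turns green -- often longer than the green phase itself.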

That particular stretch of road (the 405) is pretty much the _only_ north-south passage through that part of LA because of geography (and crappy urban planning.) It could be 30 lanes in each direction and still be slow.

5 days ago

Waze Causing Anger Among LA Residents

ErichTheRed Symptom of a bigger problem (594 comments)

App or no app, traffic in cities and suburbs is something that is going to need to be dealt with somehow. Cities like Boston or New York at least have a workable public transit system to keep some cars off the roads. LA is totally different -- it was built around cars and is only now getting a very small set of public transit choices. Buses do nothing when they're stuck in the same traffic everyone else is. Whenever I go to California for work (either northern or southern,) it amazes me how much people put up with to live there. I would go nuts spending 2 hours doing a 10 mile trip each direction every day.

Some trends are encouraging from a traffic perspective, but maybe not from a demographic one. Younger people aren't buying suburban houses and having big families the way they used to, so it's possible cities will become denser like they are in Europe. The big thing that has to stop, especially in mid-size cities, is the suburban sprawl. The ability to expand for miles in every direction directly contributes to messy traffic problems. Urban planners need to look into reclaiming hollowed-out cities and first ring suburbs, and getting people to move back into them.

5 days ago

Former HP CEO Carly Fiorina Considering US Presidential Run

ErichTheRed Not my first choice (433 comments)

I know a few people who worked at HP in the 2000s, and even with the sour-grapes filter on, every one of them describes how she let HP rot away, killing divisions and outsourcing any function she could for quick balance-sheet cash hits. There's still some soul left there, though -- the non-consumer PC and laptop division is doing OK, as is their server line, with the exception of the Itanium mess. Their software and the former EDS are a disaster, and let's not even mention the Autonomy acquisition. (OK, Autonomy was done after she was kicked out.) Still, HP is a long way from its engineering-driven roots, and I don't know if it can ever get back there.

Politics aside, I can't see what she could offer as President.

about three weeks ago

LinkedIn Study: US Attracting Fewer Educated, Highly Skilled Migrants

ErichTheRed Re:linked in? (338 comments)

"The only people I know that still use LinkedIn are desperate and unemployable."

I think it's kind of like Facebook. Some people use it 24 hours a day and are addicted, and others use it as a convenient way to share pictures and keep up with extended relations. The recruiter spam is awful, but it's kind of like the ads you're forced to watch to use Facebook. I've found it useful solely to keep track of people I've worked with in the past. Since people increasingly hop from place to place, it's a convenient way to keep up. I found my last job by calling someone I knew and saying, "Hey, my company just made a stupid decision and I want out before things get bad, are you guys hiring?"

Any shred of information you share on LinkedIn will be picked up by recruiter-bots, and you will be contacted incessantly. As long as you don't read any of the "sponsored content" or offer up too much information, it has its purpose. That said, I also get at least one or two desperate recruiters a week trying to fill some insanely obscure requirement that just happens to be in the middle of nowhere. I wonder if the "new" recruiters are given these leads to try to prove their worth. I actually saw a posting for a FoxPro programmer somewhere in Nebraska...and this was in 2012.

about a month ago

LinkedIn Study: US Attracting Fewer Educated, Highly Skilled Migrants

ErichTheRed Lack of opportunity in general? (338 comments)

In the IT sector, I can see a few things driving this:
- Infrastructure and dev jobs are increasingly being farmed out to cloud providers and outsourcers, meaning fewer on-site jobs are needed, at least at the low end. (Which is a pity, because you don't get good high-end people if they can't start out at the low end like they used to.)
- In general, economic growth is still slow in most sectors, so a lot of the traditional demand for IT isn't there.
- Tech Bubble 2.0 is increasingly eating up resources building web-based services and phone apps. Startups want young hungry coders who are exactly like the founders, which may lead to fewer foreigners being employed.
- The US isn't exactly welcoming to foreigners these days, given the debate on immigration. Even if someone is the best and brightest, it's possible they would feel lumped in with everyone else.

In STEM, it's bigger trends that are probably driving it:
- Other countries are more science friendly -- they fund it well and there's less of a cultural bias against "smart people".
- Science in general is a bad career prospect given the imbalance of graduates and permanent research positions. Most big corporate labs are shells of what they once were, and academic institutions seem to want to keep everyone on permanent postdoc status. I would have to have a total passion for my work to accept tenuous circumstances like that, and would probably be nearly broke for most of my life.

This, plus the abuse of the H-1B program by IT companies, is probably a good starter list of reasons. For every great H-1B hire, there are many stories of junior guys with questionable skills and credentials being run through a large technology company's meat grinder, churning out code or performing low-end tasks. That's clearly a misuse of the program, since it was designed to correct a critical skills imbalance.

One thing that might reverse the trend is the fact that fewer domestic students are going into STEM fields, given the cost and the fact that a degree is no longer a guarantee of gainful employment. It's counterintuitive given how well _successful_ STEM graduates do compared to the general population, but once a perception takes hold, it's hard to change people's minds. Think about how many IT people you know who actively tell their children to avoid following in their footsteps.

about a month ago

Does Being First Still Matter In America?

ErichTheRed It's not possible now (247 comments)

One of the things that drove the race to the Moon was the fact that we were losing the early space race to the Soviet Union. Having that big a Cold War enemy was a huge boost in one respect (mandates to educate the population and advance science) and a huge drag in another (who-knows-how-many trillions of dollars wasted on a nuclear arms race that neither side actually needed to participate in.)

I think the times are different now:
- Education isn't seen as a guarantee of a decent job anymore, so fewer people are spending the money and effort on it.
- Decent jobs are no longer guaranteed either, so people are more concerned with day to day survival than long-term planning.
- We don't have a huge boogeyman like the USSR ready to wipe us out the second we let up the pressure...the closest thing now is China, and they're one of our biggest trading partners.
- Media is more fragmented. You can argue either side of this point, but the world was a lot simpler when there were only 3 TV networks, a much longer news cycle and newspapers of record that did real journalism. Now no one can make any sort of controversial move without 200 news analysts jumping all over it and putting forth their opinion as fact.
- People don't trust large institutions or governments, which are often the only entities big or powerful enough to mandate huge changes or push science forward. (Example: AT&T funding Bell Labs with phone company revenues, leading to breakthrough inventions, or the US funding Apollo and other NASA programs.)

I think that some of these factors make it impossible to be "first" in key areas, simply because no one is willing to stick their neck out and invest the time, effort or resources.

about 1 month ago

"Barbie: I Can Be a Computer Engineer" Pulled From Amazon

ErichTheRed Wow, it's 1953 again! (561 comments)

I've seen a lot of posts already claiming that this is what happens in real life. Has anyone stopped to think that the reason it happens is that girls are encouraged from day one to behave that way? It is true that women gravitate toward the project manager, training and business analyst jobs. But I have worked with many people of both genders, and the ability levels are pretty evenly split. There are plenty of helpless, clueless guys too, and they tend to bolt up the ladder quickly into management, where they don't have to do the technical work anymore.

I think women often prefer PM or BA jobs for the simple reason that they get to interact with humans who care about something other than computers, video games and software development. (You couldn't pay me enough to be a project manager -- going around begging people for work while not being able to control them, and still being responsible for the project.) But I also think that with the right encouragement early on, and without the hostile work environment some IT outfits provide, there's nothing stopping women from doing great software work. The requirements are the same -- critical thinking, logic and the ability to deal with occasional intense frustration. If girls aren't poisoned with things like this early in their schooling, they have the same opportunity to develop these skills as boys do.

I'm just really surprised that this book made it through focus groups, internal meetings at Mattel, etc. and people still thought it was a good idea. That leads me to believe people are even more clueless than I thought.

I've got both a young son and a young daughter, and the age at which they start pushing the pink crap on the girls is astoundingly low. My daughter doesn't really play with dolls too much, and certainly doesn't own a Barbie. My wife grew up in a household where both parents were academics, and it shows. They didn't let her get sucked into this trap, and we're going to do our best to do the same. The thing I'm worried about is the peer pressure from dumbass female classmates once she gets to school. I'm amazed that in 2014 women are still being encouraged to take on traditional roles, and that sexism is somehow still OK. We've got a couple more years, so I think all we can do is just encourage them to like learning. It appears to be working for our son -- we limit TV and computer time, and actually take the time to explain things he has questions about in terms he can understand. I'll find out in 15 years or so if I did a good job or not....

about 1 month ago

Microsoft Azure Outage Across the Globe

ErichTheRed Re:Wow, I'd be pretty angry (167 comments)

"Most people who run large companies aren't stupid, and I'm sure that many of them do take into consideration the costs of outages."

Not stupid, but in my experience MBAs never actually dig into the spreadsheets and figure out the meaning behind the numbers. They just see what the vendors promise them over multiple free lunches, golf trips, etc. It doesn't help that most CIOs aren't really technology people, or are so divorced from day-to-day operations that they don't know what impact a decision like that has.

It's the short-sighted MBA disease -- if the cost of on-site service is greater than the rosy cloud picture the vendor is painting, get rid of the on-site service regardless of the operational impact. The other problem is that most of the decision makers will just bail when the first big failure happens, having already collected the bonus for getting rid of the IT team.

about a month ago

Microsoft Azure Outage Across the Globe

ErichTheRed Wow, I'd be pretty angry (167 comments)

Everyone forgets that Azure is a way-beyond-massive Hyper-V implementation, and that AWS is a way-beyond-massive customized Xen implementation. Even though both cloud providers let you be smart in designing your infrastructure (multi-site, redundancy, etc....the tools are there), nothing will save you from an outage in the core guts of the system. Wasn't Azure's last big failure due to a certificate expiration? There's no way an end customer can plan around that.
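
To illustrate the "the tools are there" part, here's a minimal sketch of my own (not anyone's production design) of multi-region redundancy on the AWS side; it assumes boto3 is installed, credentials are configured, and the two bucket names are hypothetical placeholders that already exist in their regions:

    import boto3

    # Hypothetical bucket names, assumed to already exist in each region.
    REPLICAS = {
        "us-east-1": "example-app-data-east",
        "us-west-2": "example-app-data-west",
    }

    def put_everywhere(key, body):
        """Write the same object to a bucket in each region."""
        for region, bucket in REPLICAS.items():
            s3 = boto3.client("s3", region_name=region)
            s3.put_object(Bucket=bucket, Key=key, Body=body)

    put_everywhere("orders/12345.json", b'{"status": "shipped"}')
    # An outage in one region leaves a current copy in the other. Note this
    # protects against a regional failure, not a platform-wide one -- which
    # is exactly the point above.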

I'm a big fan of the private or hybrid cloud version of this fad. You get all the good stuff that Azure and AWS customers get, like dynamic provisioning and software-defined networking, without having to rely on a third party. Unfortunately, CIOs and other execs just see the numbers on a spreadsheet and don't take into account the cost of outages you can't control. Power fails, networks drop, and people do stupid things in on-site implementations too. But you can at least have your own staff working on the problem, with the incentive being "you get to keep your job." With a public cloud provider or even a hosting company, the responsibility ends with "oops, here's 7 hours of free service" and you have to wait in line with everyone else.

about a month ago

Coding Bootcamps Presented As "College Alternative"

ErichTheRed Parallels to the MCSE Bootcamp (226 comments)

When I first saw this article this morning, my immediate reaction was, "Oh no, here we go again." I'm not a developer -- I do systems integration work, and a lot of my job is getting software written by "developers" to run on real systems within reasonable parameters.

The parallel I drew was with the MCSE and CCNA bootcamps that popped up towards the end of the last bubble and continued for quite a while after. Training companies still offer them, but they're no longer touted as the "change your life in 2 weeks!" miracle workers they once were. I entered IT with a science education, but not CS, so I have used certifications throughout my career to check the HR box, and I actually did take an MCSE bootcamp back in the day, when I was upgrading my self-taught Windows NT 4.0 certification to Windows 2000. Done right, they are a very good way to review concepts you already know and gain insight from instructors who teach the official classes and know what Microsoft is looking for on the exams. It saves you tons of time not having to review every single thing again looking for changes that are testable.

However, in my experience, the greedy training companies also tried to cash in on desperate unemployed people, much the way for-profit colleges and trade schools are doing now. Remember the old advertisements claiming they could turn a plumber or truck driver into a highly-paid IT administrator in 2 weeks for $10K or whatever? I had a couple of those students in the bootcamp class I took. In 1999, I'm sure they got jobs instantly. But for years after the dotcom boom ended, we were working through a huge glut of underqualified people who went this route.

The DevBootcamp thing actually sounds good on the surface, but the fact of the matter is that unless you have some grasp of the fundamentals (how TCP works, how HTTP requests work, how to code a database call efficiently, etc.), all you get is someone who knows Ruby on Rails, a couple of database tricks, and JavaScript. That's fine if you just want someone cranking out maintenance tasks for a small company's web application, but it's disingenuous to present it as a true college alternative. There are plenty of college grads who don't have practical experience either, but at least a proper CS curriculum will expose them to the fundamentals that make all this upper-layer stuff work -- and, maybe, to something other than web development. I would much rather work with someone who is a little more well-rounded than with an absolute genius who can't talk about anything outside their small area of focus. It just seems to me that these companies see a market -- bubbly, frothy VC-funded startups looking for an army of cheap young Ruby coders -- and are taking advantage of it while they can. I just wouldn't want to be one of these people who only know a Web framework or two when the bubble pops and businesses once again demand people who can solve a wider set of problems.
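
To show the kind of fundamentals I mean, here's a little illustration I put together (not DevBootcamp material): underneath every framework, an HTTP request is just text pushed over a TCP socket.

    import socket

    HOST = "example.com"   # any plain-HTTP server will do

    sock = socket.create_connection((HOST, 80), timeout=5)
    request = ("GET / HTTP/1.1\r\n"
               "Host: " + HOST + "\r\n"
               "Connection: close\r\n"
               "\r\n")
    sock.sendall(request.encode("ascii"))

    response = b""
    while True:
        chunk = sock.recv(4096)   # read until the server closes the connection
        if not chunk:
            break
        response += chunk
    sock.close()

    # The status line and headers are plain text in front of the body.
    print(response.split(b"\r\n\r\n", 1)[0].decode("ascii", "replace"))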

about a month ago

Coding Bootcamps Presented As "College Alternative"

ErichTheRed Re:Web 2.0 (226 comments)

Of course it's a meaningless phrase, but how else do you sum up the last few years? The smartphone bubble? Not really, that leaves out cloud computing, big data and IoT. The social bubble? Also leaves out too much.

Even before the financial meltdown and the low interest rates that drove another stock bubble, there were parallels to the dotcom boom:
- Trendy startups in San Francisco, SV and New York, just like last time
- Media falling all over themselves to report on this, fueling more interest.
- Plenty of wacky revenue-free, shaky business model companies generating huge VC investments and crazy valuations
- I'm even starting to hear people say "this time it's different" again, which kind of seals it for me....

The first boom was all about getting everyone online and using your service. This one appears to be fueled by advertising and demographics. I think the people who will make out best this time will be Amazon (AWS) and Microsoft (Azure) as well as all the other hosting providers...because startups don't host their own systems and have to pay the bills every month.

about a month ago

The New-ish Technologies That Will Alter Your Career

ErichTheRed Buzzwords != Career-long Skills (66 comments)

Once in a great while, something comes along that fundamentally alters the way the overall practice of computing operates. Everything else is a rehash of the old stuff, with improvements that have been made since its introduction. Cloud computing is just hosted data centers with more flexibility and APIs to control the difficult tasks of resource and application provisioning. When you've been in IT long enough, you see patterns repeat. It's cool when something that barely worked before comes back around with new improvements. But, the buzzword itself is not what you build a career on -- it's the fundamentals that are only learned by dealing with lots of different problems over time. This is why older workers in IT get discriminated against -- younger people seize on the buzzword and deride the older folks for patiently explaining that it's all been done before.
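
As a concrete example of that "hosted data center with an API" point, here's a toy provisioning call of mine using AWS's boto3 library; the AMI ID is a hypothetical placeholder, and it assumes credentials are already configured (and that you're willing to pay for the instance):

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # One API call does what used to take a purchase order and a rack visit.
    resp = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # hypothetical image ID
        InstanceType="t2.micro",
        MinCount=1,
        MaxCount=1,
    )
    print(resp["Instances"][0]["InstanceId"])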

The one thing on that list that did irk me a little is "Web APIs" being listed as a fundamental change to the way we work. It's a change, but not a good one. Admittedly we shouldn't be hand-coding assembler for most tasks, but the introduction of monster "service" APIs and web frameworks gets developers so divorced from the underlying complexity that the "how it works" part is lost. I would say that's the big change...developers can code up something horribly inefficient that works, but they'll never know how to track down the "why" behind the bad performance. And since hardware is virtually free, developers who don't pay the AWS bill directly will just keep consuming the free resource.
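
The classic case is the "N+1 query" pattern -- my own illustration, not from the article: a loop that looks perfectly clean in the code but quietly issues one database round trip per row, next to a single joined query that does the same work.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE users  (id INTEGER PRIMARY KEY, name TEXT);
        CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, total REAL);
        INSERT INTO users  VALUES (1, 'alice'), (2, 'bob');
        INSERT INTO orders VALUES (1, 1, 9.99), (2, 1, 5.00), (3, 2, 12.50);
    """)

    # Looks fine in the code, terrible on the wire: 1 query + N queries.
    for user_id, name in conn.execute("SELECT id, name FROM users"):
        total = conn.execute(
            "SELECT SUM(total) FROM orders WHERE user_id = ?", (user_id,)
        ).fetchone()[0]
        print(name, total)

    # Same result in a single query -- but only if you know what the layer
    # below is actually doing.
    for name, total in conn.execute("""
        SELECT u.name, SUM(o.total) FROM users u
        JOIN orders o ON o.user_id = u.id GROUP BY u.id
    """):
        print(name, total)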

about a month ago

State Department Joins NOAA, USPS In Club of Hacked Federal Agencies

ErichTheRed Security in any organization is an afterthought (54 comments)

I can see two things as the main root causes of this:
- Layers and layers of outsourced IT. Especially when dealing with a federal agency, almost every IT service has been outsourced. Those outsourcers hire other outsourcers, and it becomes a big mess when you try to do anything that affects multiple parts of a system. I see this in the private sector as well, working for an outsourcer myself...our team does its best to help, but it's really maddening to see how much things slow down when control gets dispersed. The network team has to talk to the storage team, who has to talk to the server team, who needs to open a ticket with the field services team to implement change #C9348673634. I do systems architecture work, so it's really painful to have to design around a garbage system like this rather than having a few smart people who know the system end-to-end.
- Security is tough and no one wants to be bothered. It wouldn't be impossible to enable 802.1x on a network, implement proper PKI to enable its effective use, and encrypt hard drives. But often, it either becomes too difficult to support or no one has the will to say things must be done in a certain way. Plus, user education is impossible. No matter how stringent the password policy is, they just write them down. People leave unencrypted laptops on trains with company data on them. It's just not possible to get them to care, full stop. They could be working with top secret nuclear weapons designs and it would mean nothing to them.

Of these two, I think the first is the harder to overcome. Once a company or government agency has given up control of its IT environment to a company that needs to squeeze every nickel out of a contract, nothing difficult will get done. If an organization retains some sort of control and mandates change, it can be done, at least to some degree. Look at how the attack on Target was carried out -- the group responsible figured out that the outsourced HVAC repair company had a connection to the store network, to which (idiotically) the POS systems were also directly attached. So by the time the outsourced IT services team figured out they had a problem, it was too late. This is what leads companies to delay things like patching and updating equipment: the process is too painful when you have to line up the 25 third parties needed for such a change.

about a month ago

Microsoft To Open Source .NET and Take It Cross-Platform

ErichTheRed Sounds like what Sun did (525 comments)

This is actually a pretty smart idea, but it sounds like what Sun did with Java and parts of Solaris. .NET was designed to be a Windows-only application platform, requiring Windows clients for fat applications and at least Windows servers for web applications. Now Microsoft is seeing Windows become less relevant, but they do want people to be using their software stack regardless of platform.

Same thing with Visual Studio being made free...kind of like Xcode being free for OS X, and the open-source IDEs being free. It's a bold move, because now the .NET ecosystem needs to stand on its own, and I guarantee they're going to try to tie this in with Azure somehow (like making you run the free VS in Azure VMs you pay for, or something...)

One scary thing from my side of the house (systems engineering/integration) is the number of new security flaws and the sheer volume of patches that are going to be released once .NET gets more scrutiny. A good thing, yes, but patching .NET is already a pain in the butt.

about a month ago

Duke: No Mercy For CS 201 Cheaters Who Don't Turn Selves In By Wednesday

ErichTheRed Re:Ok, I am naive, but... (320 comments)

The other problem is that cheaters usually just get an F for the course if they get caught and can continue after retaking it.

Institutions only have so much power. When I was in school, my grades were all over the map: some As, lots of Bs, some Cs. One of the things I realized pretty early on was that as long as I kept my GPA high enough, there wasn't much to worry about regarding the actual grade I got in any one course...so I focused on learning the material thoroughly rather than trying to ace the exams. Most entry-level jobs don't care or ask about your college grades. The only times they matter are:
- If you want to go to professional school (law, medical, dental, MBA)
- If you want to go into academia
- If you want to be an investment banker or management consultant

Other than that, students would get a lot more out of school if they focused on learning rather than tests.

about a month ago

Duke: No Mercy For CS 201 Cheaters Who Don't Turn Selves In By Wednesday

ErichTheRed Is it cheating? In CS classes, yes it is (320 comments)

The problem isn't necessarily that code was copied directly from the Internet; it's that it was passed off as the students' own work. Coding assignments can only be done so many ways in lower-level CS classes, where the problems have to be small enough to be easily testable. The problem I see is that tolerating it encourages the practice among CS grads later in their careers.

I work in systems integration, and I can't tell you the number of times I've seen crap software, even software from vendors, that is horribly inefficient. I think a lot of that software has a fair amount of copy-paste code in it simply because the goal was to get something that compiled and sort of worked.

That brings up another very important point -- the level of abstraction has gotten so high in software development that it's very hard to see what's actually going on behind the scenes. If you're calling some massive database access library to do your data entry from a web form, you really can't tell how bad the SQL that your particular function uses is for the database. (I've seen packaged applications that will tie up the CPU of a server for 30 or more seconds just to make a database change.) If students don't learn at least some of the fundamentals in CS classes, who will design the next generation of lower-level stuff? Code reuse and libraries are good, but you need to know what's appropriate to use. So if you don't have a good grasp of algorithms, data structures, etc., how will you even know whether you're solving a problem correctly?
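
One simple habit that counters this (again, my own sketch, not anything from the course) is to ask the database itself what the SQL your library generates will actually do. In SQLite, for instance, EXPLAIN QUERY PLAN shows whether you're about to trigger a full table scan:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER)")

    def show_plan(sql):
        for row in conn.execute("EXPLAIN QUERY PLAN " + sql):
            print(row[-1])   # the human-readable plan step

    query = "SELECT * FROM events WHERE user_id = 42"
    show_plan(query)   # without an index: a full scan of events
    conn.execute("CREATE INDEX idx_user ON events(user_id)")
    show_plan(query)   # with the index: a search using idx_user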

Same thing goes for my field -- systems admin/integration. If you don't know at least the basics of how TCP works, a few of the application protocols and something about how your OS manages resources, it becomes very hard to troubleshoot anything to any degree.

about a month ago

Submissions


Coding Bootcamps Now Mainstream, Presented as "College Alternative"

ErichTheRed writes | about a month ago

ErichTheRed (39327) writes "Perhaps this is the sign that the Web 2.0 bubble is finally at its peak. CNN produced a piece on DevBootcamp, a 19-week intensive coding academy designed to turn out Web developers at a rapid pace. I remember Microsoft and Cisco certification bootcamps from the peak of the last tech bubble, and the flood of under-qualified "IT professionals" they produced. Now that developer bootcamps are in the mainsteam media, can the end of the bubble be far away?"

Apple buys iFixit, declares repairable devices "antiquated".

ErichTheRed writes | about 9 months ago

ErichTheRed (39327) writes "Apparently, Apple is buying iFixit. iFixit is (was?) a website that posted teardown photos of gadgets and offered repair advice. According to the website: "Apple is working hard to make devices last long enough to be upgraded or irrelevant, making repairability an antiquated notion." It's all clear now — I can't replace the batteries, hard drives or RAM in new Macs because I'm expected to throw them in the landfill every 2 years!

It made it to CNN, so it has to be true, right?"


"Clean up Github" -- A backlash against stereotypical nerd culture?

ErichTheRed writes | about 9 months ago

ErichTheRed (39327) writes "The story on Monday about Julie Ann Horvath quitting GitHub because of harassment ties in nicely with this. A group called Ethical Code is starting a "Clean Up GitHub" campaign to request people to pull offensive comments out of their code. This brings up a very interesting question...is it still considered too PC to expect people to be somewhat professional in their public code submissions, or is this a sign that the industry might be "growing up" a little? I'd like to hope it's the latter...."

Microsoft Retiring the TechNet Subscription

ErichTheRed writes | about a year and a half ago

ErichTheRed (39327) writes "One of the nicest perks that Microsoft offered is being retired. Microsoft has reasonably-priced "TechNet Subscriptions" which give you low-cost full access to download fully functional evaluation software. The idea is that IT people could use a product in their lab for learning or simulation purposes without having to shell out thousands for an MSDN subscription. These are being retired as of August 31st. Apparently they're trying to shift "casual" evaluation of software onto their Virtual Labs and other online offerings. If you want full evals of software, you're going to need to buy an MSDN Subscription. I know lots of people abuse their TechNet privileges, but it's a real shame that I won't just be able to pull down the latest software to replicate a customer problem, which is part of what I do on a daily basis. I guess you can mark this one as "From the one-bad-pirate-ruins-the-whole-bunch department...""

Ex-Employee Busted for Tampering with ERP System

ErichTheRed writes | about a year and a half ago

ErichTheRed (39327) writes "Here's yet another example of why it's very important to make sure IT employees' access is terminated when they are. According to the NYTimes article, a former employee of this company allegedly accessed the ERP system after he was terminated and had a little "fun". As an IT professional myself, I can't ever see a situation that would warrant something like this. Unfortunately for all of us, some people do and continue to give us a really bad reputation in the executive suite."

Change the ThinkPad and it will Die

ErichTheRed writes | about 2 years ago

ErichTheRed (39327) writes "Here's an interesting editorial piece about the ThinkPad over at CNN. The basic gist of it is what many ThinkPad devotees have been saying since Lenovo started tweaking the classic IBM design to make the ThinkPad more like a MacBook, Sony or other high-end consumer device. I'm a big fan of these bulletproof, decidedly unsexy business notebooks, and would be unhappy if Lenovo decided to sacrifice build quality for coolness. tl;dr: You can have my 1992 clicky IBM ThinkPad keyboard when you pull it from my cold dead hands. :-)"

IBM Sells POS Business to Toshiba

ErichTheRed writes | more than 2 years ago

ErichTheRed (39327) writes "Yet another move by IBM out of end-user hardware — Toshiba will be buying IBM's retail point-of-sale systems business for $850M. I'm not an MBA, but is it REALLY a good idea for a company defined by good (and in this case high-margin) hardware to sell it off in favor of nebulous consulting stuff?? Is there really no money in hardware anymore? I doubt they'll ever sell their Power systems or mainframes off, but you never know!"

Learning Programming in a Post-BASIC World

ErichTheRed writes | more than 3 years ago

ErichTheRed (39327) writes "This Computerworld piece actually got me thinking — it basically says that there are few good "starter languages" to get students interested in programming. I remember hacking away at BASIC incessantly when I was a kid, and it taught me a lot about logic and computers in general. Has the level of abstraction in computer systems reached a point where beginners can't just code something quick without a huge amount of back-story? I find this to be the case now; scripting languages are good, but limited in what you can do...and GUI creation requires students to be familiar with a lot of concepts (event handling, etc.) that aren't intuitive for beginners. What would you show a beginner first — JavaScript? Python? How do you get the instant gratification we oldies got when sitting down in front of the early-80s home computers?"

America's tech decline: A reading guide

ErichTheRed writes | more than 3 years ago

ErichTheRed (39327) writes "Computerworld has put together an interesting collection of links to various sources detailing the decline of US R&D/innovation in technology. The cross section of sources is interesting — everything from government to private industry. It's interesting to see that some people are actually concerned about this...even though all the US does is argue internally while rewarding the behaviour that hastens the decline."

Journals

ErichTheRed has no journal entries.
