Slashdot: News for Nerds


Is Computer Science Dead?

kdawson posted more than 7 years ago | from the pining-for-the-fjords dept.

Programming 641

warm sushi writes "An academic at the British Computer Society asks, Is computer science dead? Citing falling student enrollments and improved technology, British academic Neil McBride claims that off-the-shelf solutions are removing much of the demand for high-level development skills: 'As commercial software products have matured, it no longer makes sense for organizations to develop software from scratch. Accounting packages, enterprise resource packages, customer relationship management systems are the order of the day: stable, well-proven and easily available.' Is that quote laughable? Or has the software development industry stabilized to an off-the-shelf commodity?"

641 comments

Wow! (5, Interesting)

OverlordQ (264228) | more than 7 years ago | (#18329123)

Accounting packages, enterprise resource packages, customer relationship management systems are the order of the day: stable, well-proven and easily available.

And who made those packages?

Software don't write itself.

Re:Wow! (5, Funny)

daranz (914716) | more than 7 years ago | (#18329215)

They arrived from lands far away thanks to the magic we call outsourcing?

Re:Wow! (1, Informative)

Anonymous Coward | more than 7 years ago | (#18329609)

Hmm. [slashdot.org]

Re:Wow! (3, Funny)

sunami (751539) | more than 7 years ago | (#18329221)

Software don't write itself.
All in good time... all in good time.

Re:Wow! (1)

cablepokerface (718716) | more than 7 years ago | (#18329279)

Trying out Vista there, are ye?

Re:Wow! (3, Insightful)

jevring (618916) | more than 7 years ago | (#18329317)

Exactly. Even if there is an overflow of good developers at some point, they all retire (and eventually die), so then someone else is going to have to pick up the torch.

Re:Wow! (5, Insightful)

codonaill (834165) | more than 7 years ago | (#18329407)

There are 3 other jobs I can see that require CS skills, but are not product development/design jobs.

1. Who buys it? What skills do Computer Scientists need to differentiate between Brand X and Brand Y billing system? Basically, proper product selection is as tough a job as product design - because you have to beat down sales jargon and work out what a system actually does - generally without unfettered access to the system itself.

2. Who builds the Middleware/Integration layer? This is so specific to individual companies that you'll never get a solution that fits all the heterogenous parts of your network.

3. Who builds large networks of products - i.e. works out that Portal solution 1 goes well with reporting solution 2 and alarm system 3. Who breaks down the business flows between these and who keeps track of strategic direction in each area?

Dunno, still think there's plenty of non-dev jobs out there for CS graduates...

C.

Re:Wow! (5, Insightful)

Scarblac (122480) | more than 7 years ago | (#18329639)

How about imaging research (stuff like using image processing to learn about the state of food stuffs with infrared cameras), or the hard problems that need to be solved to get to the Semantic Web?

There is a lot of CS work out there. But it's science work, not programming or product development. That's not CS, that's engineering or just programming.

Re:Wow! (5, Insightful)

-noefordeg- (697342) | more than 7 years ago | (#18329481)

Not only that...

"stable, well-proven ... "
I've yet to see, say, a well written and stable ERP system.
In Norway some of the more popular ERP/logistics and sales systems are CS (Client System), Movex, Visma Unique and IBX. Systems which are just "ok". Terrible modules, inane logic, most likely a lot of bad code all over, but since it's closed source it's impossible to tell. From all the errors (some really strange), the lack of updated documentation and integration specifications, the system resources used, and just from looking at the system documentation, you can easily tell that the systems are not "state of the art".

What most of these complex systems really are is a collection of small modules, many of which were most likely written at different times, by different people, for different projects, and just barely work together. The companies developing the systems probably have thousands and tens of thousands of bugs and points for optimization which will never be fixed. Any work done on these systems which is not directly connected to a new deployment and paid for by one or more customers is simply a loss for the company.
Much of the "valuable" experience people get from using such a system, is actually how to use it without breaking it/how to use it despite all bugs, errors and strange quirks and twists.

What my small company has been busy with the last few years is moving a lot of logic and data outside such systems, because it's just too expensive to try to "upgrade" these huge behemoths. We develop external databases to store different data feeds, most likely received in XML format, which some of these systems are not capable of using. Actually, one of those systems is only capable of importing/exporting data with fixed-length ASCII files.

I don't see any less work needing to be done on these systems in the future. Rather, the need for more developers, working both in-house and independently, to patch them up, make small adjustments here and there, and/or write "connectors" for logic/data processing will probably increase.

Re:Wow! (0)

Anonymous Coward | more than 7 years ago | (#18329509)

I guess the spirit of the quote is that once you make a stable product, everyone else can use it without having to re-invent it. However, my experience with various companies during recent weeks of job-hunting reveals quite the opposite: most companies, big corporations and startups alike, have internal development on all kinds of areas that may not even seem related to their primary business focus. For instance Amazon uses a home-grown RPC mechanism for its hugely distributed system, companies like vmware who focus on virtualization have groups working on databases, while companies like Sun are also working on virtualization. Those like Apple are involved not just in OSs and computer hardware, but also networking hardware like access points and the like. Even start-ups dealing with web-apps often "re-invent" feedback-rating and review mechanisms for their web-products, and go as far as incorporating mechanisms inspired from spam-filtering techniques to detect when buyers and sellers attempt to circumvent the middleman process.

From an insider's perspective it seems that there is too much re-inventing going on.

Re:Wow! (0)

Anonymous Coward | more than 7 years ago | (#18329629)

The point is that today the only software that sells is the one the average geek would hate to write.

Slashdot rule #1: (0)

Anonymous Coward | more than 7 years ago | (#18329127)

Never trust a McBride.... ;-)

A question of "R" vs "D" (4, Insightful)

ZombieEngineer (738752) | more than 7 years ago | (#18329147)

Computer Science graduates can go one of two directions:

Academic Research - Which has grown at a steady rate

Corporate Development - Which collapsed at the end of the dot-com boom.

There is still a need for "pure" computer science research for the next big improvement in the field of computing (where is the next "Google" going to appear?)

ZombieEngineer

Dead like a webmaster, or a dinosaur. (3, Insightful)

kale77in (703316) | more than 7 years ago | (#18329395)

Recently we mourned the Webmaster [slashdot.org], even though some of us were implicated in his murder.

That's the kind of Computer Science that is dead: the kind that Computer Science, by its progress, leaves behind.

A similar question might be: Is evolutionary science dead? Or was it just the dinosaurs that died?

If you only want to do pure research, maybe (4, Insightful)

Moraelin (679338) | more than 7 years ago | (#18329427)

You know, I don't buy it. On one hand you have all the corporates bitching and moaning about how they don't have enough people to do the work, and how everyone should outright give citizenship to any immigrant who can use a computer. See Bill Gates's speech recently, it was linked to right here on Slashdot. Plus, they've surely created a lot of jobs in India lately. And then we have guys like this one coming out and saying "oh, we just don't need more CS people." Something doesn't add up. Either one gang is right, or the other is right, but they can't both be right at the same time.

Way I see it, reality is a lot more... perverse. Everyone still needs programmers, still needs an IT department, etc, they just don't want to pay for it.

And enrollment has just reflected this. Studying engineering or CS is hard work, and there are only a limited number of people who do it for fun. And even those can do it as a hobby at home if all else fails. For most people you have to pay well to get them to do the extra effort. If you don't pay up, they'll go do something else.

At any rate, the jobs do exist. Sure, most of them don't involve researching the next great algorithm, but they exist. There are a ton of companies who need very specialized internal applications, or their own "B2B" applications, and I just don't see the off-the-shelf software that does those. Of course, most of it doesn't involve researching any new algorithms, but rather researching what the users really want. Then again, most computer-related jobs weren't exactly academic research in the past either. There were maybe more companies making compilers and new computers and what have you, but the bulk of the jobs was always in doing corporate software.

At any rate, _maybe_ if all you're seeing yourself doing after college is researching the next paradigm shift in computing, yeah, that market has somewhat shrunk. If you don't have any qualms about writing some buzzword-laden application for some corporation, it's as strong as ever. It just doesn't pay as much as in the dot-com times any more.

Re:A question of "R" vs "D" (1)

Professor_UNIX (867045) | more than 7 years ago | (#18329601)

Computer Science graduates can go one of two directions: Academic Research - Which has grown at a steady rate Corporate Development - Which collapsed at the end of the dot-com boom.
I don't understand why people seem to discount the other useful employment opportunities that computer science students can pursue like systems engineering, network engineering, network/IT security, and even system administration. To me there's a hell of a lot more to computer science than being a code monkey programming in whatever popular language of the day is out there. Personally I *hate* programming, but I love networking and security. I'll write a program (usually in Perl) to assist me in my daily duties, but I have absolutely zero interest in going to work for some software company and becoming a code drone developing some commercial software package. Why do Slashdot readers always equate Computer Science with programming?

Re:A question of "R" vs "D" (1)

bibel (1072798) | more than 7 years ago | (#18329621)

where is the next "Google" going to appear? Nowhere, never. Giant companies like Google and Microsoft are slowly (but surely) monopolising the industry. It is the small businesses that will probably die, and we'll all work for Microsoft as testers, and life will be good.

Horology anyone? (2, Interesting)

Tracer_Bullet82 (766262) | more than 7 years ago | (#18329149)

I remember reading a few years ago (at least two, if my memory serves me) that watchmaking was a dead business. Even the US education department considered it dead and buried, with fewer than 100 students per year taking it.

Today though, with watchmaking (back) on the rise, the supply of workers is much less than the demand.

Everything, well most things at least, is cyclical. We'd expect so-called researchers to have much longer timelines in their research than the immediate ones.

Well (1)

OverlordQ (264228) | more than 7 years ago | (#18329153)

Don't tell that to my [acm.org] professor [uark.edu].

Graduates are in short supply (1)

MichaelSmith (789609) | more than 7 years ago | (#18329155)

Where I work we are outsourcing work to India, China and Russia because it is impossible to deliver on our projects with people hired locally.

When I was young you had to be a bit of a geek to tinker with computers at all. You had your 8 meg basic in rom, and a bit later, CP/M. Now people who want to tinker are building machines for gaming or some such and because what they are doing is much more mainstream, they don't think of it as being anything special so when they decide what to do for a living they don't think of computing as the way to go.

Sorry about the car analogy, because I know we are all sick of those, but it's a bit like how I used to muck around with engines when I was 18 but never wanted to be a mechanic.

Re:Graduates are in short supply (1)

Alioth (221270) | more than 7 years ago | (#18329191)

8 *meg* BASIC, before CP/M? 8K maybe...

Re:Graduates are in short supply (1)

MichaelSmith (789609) | more than 7 years ago | (#18329633)

8 *meg* BASIC

Oh damn. That was a long time ago.

Re:Graduates are in short supply (4, Interesting)

cyclop (780354) | more than 7 years ago | (#18329261)

This doesn't mean CS is dead.

Surely computing is much more accessible, and there is a hella lot more ready-to-go software and libraries compared to what was there 10 years ago, but this means nothing. New applications will always be needed/invented, and someone will need to code them. And even with the latest and easiest programming languages, doing things well needs some kind of education.

I am a biophysics Ph.D. student. I have never had a formal CS education, nor am I a code geek (although I like to code), and just building a relatively small data analysis application with plugin support in Python is making me smash my nose against things that would make my code much better, things that are probably trivial for people with a CS education (what's currying? what is a closure? how do I implement design patterns? etc.) but that for me are new and quite hard (btw: any good book about all these concepts and much more?). So I understand why CS is of fundamental importance.
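For what it's worth, the closure and currying concepts mentioned above can be sketched in a few lines of Python. This is a generic illustration (not taken from the poster's application); `functools.partial` gives partial application, which is the closest standard-library relative of currying:

```python
from functools import partial

def make_scaler(factor):
    # A closure: scale() captures 'factor' from the enclosing scope,
    # and that binding survives after make_scaler() has returned.
    def scale(value):
        return value * factor
    return scale

def add(a, b, c):
    return a + b + c

double = make_scaler(2)           # closure over factor=2
add_one_two = partial(add, 1, 2)  # partial application: fix a=1, b=2

print(double(21))      # 42
print(add_one_two(3))  # 6
```

Strictly speaking, true currying transforms add(a, b, c) into add(a)(b)(c); Python idiom usually reaches for partial application instead, but the underlying idea of specializing a function by fixing some arguments is the same.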

Re:Graduates are in short supply (1)

MichaelSmith (789609) | more than 7 years ago | (#18329303)

I am a biophysics Ph.D. student.

Just curious: what drove your choice of career? For me it was hacking with electronics as a 5-15 year old in the 1970's.

Re:Graduates are in short supply (2, Insightful)

Anonymous Coward | more than 7 years ago | (#18329355)

(what's currying? what is a closure? how do I implement design patterns? etc.) but that for me are new and quite hard (btw: a good book about all these concepts and much more?)

Modern CS doesn't teach these concepts either, try wikipedia. I'm being serious.

Re:Graduates are in short supply (2, Insightful)

beelsebob (529313) | more than 7 years ago | (#18329559)

Just of note - I doubt very much you'll find many CS students who know what currying or a closure is. Most of them learn Java and think that it's the best thing since sliced bread. They don't even realise that functional programming exists, let alone what it is, what its benefits are, etc.

Re:Graduates are in short supply (1)

ReidMaynard (161608) | more than 7 years ago | (#18329523)

Where I am working they are outsourcing work to India, China and Russia because "Wall Street" has dictated that successful IT organizations outsource a particular percentage of their business. This is common knowledge.

Re:Graduates are in short supply (1)

dbIII (701233) | more than 7 years ago | (#18329535)

And in India the projects are worked on by students or recent graduates doing cut-price work until they have enough experience for a job with better money and greater responsibility - even there they can't deliver at those prices with experienced staff. It's funny seeing this outsourcing from the USA to the inexperienced in India happen when wages are not even a significant proportion of expenses - don't blame India, blame clueless management gambling the existence of the company on short-term gains.

*yawn* (1)

teknopurge (199509) | more than 7 years ago | (#18329159)

Every 3141 days or so this question is asked. It wasn't true 30 years ago and it isn't true now. It's the same as gang warfare - we get kevlar, they get hollowpoints; we get semi-automatics, they get automatics...

What advantage does a single company have if all their competitors use the same software? What, are the people going to set the company apart?!? (ha!)

Custom software is where a company's IS people capture institutional knowledge - the knowledge of people is dissected, analysed and automated in the form of proprietary systems. I'm currently working on a ~25 million USD project (actually there are a handful of apps) whose goal is to automate various financing tasks. Without a doubt, we are doing it better now than we did 5 years ago, but there is no end in sight.

Re:*yawn* (1)

TapeCutter (624760) | more than 7 years ago | (#18329571)

Exactly, if my CS degree is dead I can only assume it must have died laughing on the way to the bank.

Don't think so (2, Interesting)

VincenzoRomano (881055) | more than 7 years ago | (#18329161)

Just as building construction science did not die with the Egyptians, Greeks, Romans, Chinese, Aztecs... and so on, neither will IT.
New technologies, new languages, new paradigms, as well as new hardware, will push IT forward.
I fear the sentence has come from some "old school" mind, still tied to old technologies. Which could really die sometime in the future.

We get asked this every few years (1)

Tim Ward (514198) | more than 7 years ago | (#18329169)

It is true of course that most users of computers these days do not write their own accounting systems; do not write their own payroll systems; do not write their own word processors; and do not even keep a team of operating system tweakers in house ("system programmers" from the IBM mainframe days, needed just to keep the thing running).

But ... someone has to write all this stuff!

Re:We get asked this every few years (3, Insightful)

MichaelSmith (789609) | more than 7 years ago | (#18329269)

It is true of course that most users of computers these days do not write their own accounting systems

Isn't that what spreadsheets are for?

Re:We get asked this every few years (1)

donaldm (919619) | more than 7 years ago | (#18329619)

Many corporations buy business software, but instead of modeling their business practice on the software's methodology they insist on making the software conform to their business practices. Actually this is great for the consultant, who can say "please open your cheque book and I will tell you when to close it" (SAP is a great example here). If the business did its homework properly and was willing to change its practices to conform to the software, it would save a fortune, but this rarely happens, since many businesses have people with too much vested interest or sheer bloody-mindedness; the consultant walks away with a fortune and the business is usually a few million dollars lighter.

It is rare that the consultant gets outsourced, since they normally deal face to face with the business, but unfortunately it is all too easy to outsource the programmer. For those interested, it is nothing unusual for an SAP consultant to ask for and get US$100 to US$300+ per hour, and the greater the changes wanted (not necessarily required), the better for the consultant or consulting firm.

dead no, dying? yes (5, Insightful)

rucs_hack (784150) | more than 7 years ago | (#18329171)

Over the last six years I've been increasingly worried by the falling level of ability in CS students.

I've encountered CS students recently who in their third year are unable to do such basic things as understand memory allocation. As for algorithm design? Well that's simply unknown by the majority. That scares the shit out of me.

The Mantra is 'don't re-invent the wheel'. This is used as an excuse for students taking off the shelf components for assignments (sorting classes for java being used for sorting assignments for example), or being given virtually complete assignments by lecturers and being walked slowly through the assignment to the point where little or no original thinking is required.

Now it is true that re-inventing the wheel is a bad move at times. However, whilst studying for their qualification, they should learn how to build the wheel in the first place.

Back to the memory allocation point. I currently know of no final year students with a decent understanding of this topic, and yet it is the main cause of security problems in code. They should at least have a working knowledge.

The emphasis is more and more on using languages designed to try to remove the main problems in code, but who writes these languages? It sure isn't the people who are only taught to use them, not create them.

The normal course of action is to blame Java, since it has led to a simplistic approach to CS assignments. I'd love to blame it, I ferkin hate the language, but that isn't the root cause.

Computer science is a hard topic that they are trying to make simpler to encourage more students. This has the result that CS students are graduating with ever reducing levels of ability, so people no longer see it as a worthwhile topic. Nowadays a CS student who wants to do really well has to work on independent study entirely apart from the course they are attending, and has also to face the unpleasant reality that their education as provided by the university is so poor that they may face years of further study to gain a useful level of ability.

Post graduate study can reduce this problem, but there are fewer post grads too, and often it is funding, not interest in a topic, that guides the selection of a course.

Quote: (1, Interesting)

Anonymous Coward | more than 7 years ago | (#18329253)

The only way to understand the wheel is to re-invent it.

Re:dead no, dying? yes (1)

Andrew Kismet (955764) | more than 7 years ago | (#18329267)

I am a first-year computer science undergrad in the UK, studying Java.
Should I be intensely worried, or should I just do a LOT of self-study in my spare time over the next two/three years?

Re:dead no, dying? yes (4, Insightful)

rucs_hack (784150) | more than 7 years ago | (#18329357)

Self study.

Of the people I knew who did well, those who self-studied alongside their normal course did things like website design, search algorithms, microkernel design, robotics and advanced study in certain languages (Lisp, C++, C, Object Pascal, assembler); everyone I knew did that last thing, though the languages varied.

You can pass and get a 2.1 or 2.2 easily just by following the course guidelines. I got my PhD offer not by doing this, but by cramming almost every day (have a blowout night at the weekend; you've got to have some fun time) with additional study. I exceeded the requirements of every assignment (I wasn't alone in doing this), and studied around every topic taught. The result was a lot of very interesting PhD offers when I graduated, which rocked. I was tired a lot, I will admit, but the benefit was vast: I was so far ahead of the students who just followed the course that I actually tutored some.

Don't assume I'm that clever though, I sweated blood sometimes trying to get assignments done early, and the extra learning was oft times very difficult. Every evening spent on it was one well spent however.

Most of the people I know personally who did this are now in great jobs, one heading towards millionaire status at 25. In his case he worked like a dog, even more than I did. You wouldn't believe what he was capable of on graduation.

So work hard, and study around the subjects.

Don't worry, be happy :-) (4, Informative)

Anonymous Brave Guy (457657) | more than 7 years ago | (#18329515)

You shouldn't be intensely worried, but reading around your subject is pretty much always a smart move if you're a serious student. I learned this lesson very late in my academic career, and now wish I'd understood what the phrase really meant a couple of years earlier.

In this business, knowing multiple programming languages (and in particular, knowing multiple programming styles -- OOP, procedural, functional, etc.) is a big asset. It helps you to think about problems in more varied ways, even if you will ultimately code the solution in whatever language is required by your particular professor or, in due course, employer.

There are two suggestions I've heard in the past that I appreciate more as time goes by: try to learn a new programming language and to read a new book about programming every year. In the former case, if you're learning Java, that's OK, it's a pragmatic tool that's widely used in industry and it will teach you one way of thinking about a problem. I suggest the following as complementary languages, to be explored as and when you have the opportunity:

  • C, or even some version of assembler, to understand what's going on under the hood and what a low-level programming language really is;
  • Haskell or a dialect of ML, to understand that not all programming languages are block-structured procedural languages, and what a high-level programming language really is;
  • Python or Perl, to understand the costs and benefits of requiring less formal structure, and the use of dynamic type systems, and to learn a few neat ideas like regular expressions;
  • when you're ready, LISP, to understand what the old sayings "code is data" and "data is code" really mean, and what concepts like macros and metaprogramming are really all about.

There are various other unique things you'll take away from each of the above, but if you spend perhaps a few months exploring each of them in some detail, it will make you a much more rounded programmer. I'd suggest either the above order, or swapping the first two around and going for a functional programming language and then something low-level. The requirements of your course or good advice from friends/teachers may guide you otherwise. Go with what works for you.

To make your learning practical, pick some simple projects, perhaps to practise whatever algorithms you happen to be studying lately in other courses, and write a few small but real programs in each language. For example, if you're learning about operating system basics, try rewriting a couple of simple OS utilities or networking tools in C or assembler. If you're learning about databases, try writing a simple web front-end for a database, and power it with a few CGI scripts written in Perl or Python that use SQL to look up and modify the data in your database. If you're learning about graphics and image processing, write a simple ray tracer in Haskell or ML.
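As one concrete starting point for the database suggestion above, here's a minimal self-contained sketch using Python's built-in sqlite3 module (the table and column names are invented for the example; a real CGI front-end would open a file-backed database and render the rows as HTML):

```python
import sqlite3

# In-memory database so the example is self-contained;
# a real script would connect to a file on disk instead.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE books (title TEXT, author TEXT)")
conn.executemany(
    "INSERT INTO books VALUES (?, ?)",
    [("SICP", "Abelson & Sussman"),
     ("The Little Schemer", "Friedman & Felleisen")],
)

# Parameterized query: never interpolate user input into SQL directly,
# which is the single most important habit for web-facing database code.
rows = conn.execute(
    "SELECT title FROM books WHERE author LIKE ?", ("%Sussman%",)
).fetchall()
print(rows)  # [('SICP',)]
```

Even a toy like this exercises schema design, parameterized queries, and result handling - exactly the skills the textbook chapters on databases are trying to teach.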

Along the way, you'll develop potentially useful real world experience with things like OS APIs (and perhaps how they vary between platforms, and thus why standards are useful for these things), HTML/CSS and CGI for web development, SQL for database work, and so on.

As you go through this, consider buying a good textbook on major subjects (programming languages, database design and SQL, graphics algorithms, etc.) or make sure you've identified some good reference and tutorial material on the web. The latter is a big advantage for the modern compsci student, though you have to be careful to check your sources are well-regarded and not just a pretty web site with an authoritative tone of voice written by someone very enthusiastic but regrettably ill-informed. Things like FAQs and newsgroups can be valuable sources of information, but sometimes there's just no substitute for a well-written, well-edited, authoritative textbook.

Anyway, this post is now far too long, so I'll stop there. Please consider it "the approach I'd take if I could have my university days again" and take it for whatever it's worth to you. Good luck. :-)

Re:Don't worry, be happy :-) (1)

Geoffreyerffoeg (729040) | more than 7 years ago | (#18329603)

when you're ready, LISP

I oppose this. MIT teaches Scheme, a LISP dialect, in its intro CS class. And the admissions requirements don't include knowledge of programming, and many non-EECS majors take the class. (Well, they're now offering a Python class simply as intro programming. But that's mainly for people who've never seen code before.)

Get yourself a copy of The Little Schemer, and read through it. It's a nicely different way of looking at CS and coding.

Re:dead no, dying? yes (1)

Rakishi (759894) | more than 7 years ago | (#18329283)

Don't claim it's everyone when all you have is likely some small selection of students or schools. Some schools are crap and some students are lazy/idiots. If all someone takes are the "easy A" courses that a monkey could do then why do you expect them to be more than a monkey?

Re:dead no, dying? yes (1)

blankinthefill (665181) | more than 7 years ago | (#18329295)

I agree with a lot of what you said, but I also believe that many of your statements are very dependent on the school in question. At my school, CS grads are basically guaranteed a job immediately upon graduation with any number of big name companies, because the school I attend is very well known for providing a well rounded, in-depth, and difficult curriculum. I do believe, however, that my school is not the norm. Most colleges that I see around today don't teach CS, they teach coding, and as (I hope) most of us know, there is a very large difference. The real problem is that finding code monkeys is simple, but finding someone with a (good) education in project design/program development is much much harder... IMHO, real CS majors will always be in demand, but programs that simply pump out coders are doomed to fail.

Re:dead no, dying? yes (1)

MichaelSmith (789609) | more than 7 years ago | (#18329447)

At my school, CS grads are basically guaranteed a job immediately upon graduation with any number of big name companies

My advice: follow your nose. Work on what you enjoy. Big companies are okay but don't get stuck working on a production line.

Mass production is how small companies become big.

Re:dead no, dying? yes (0)

Anonymous Coward | more than 7 years ago | (#18329525)

Which colleges do these "final year" students who can't malloc() don from?

Re:dead no, dying? yes (4, Interesting)

Geoffreyerffoeg (729040) | more than 7 years ago | (#18329575)

The normal course of action is to blame Java, since it has led to a simplistic approach to CS assignments.

You should blame Java. And you should blame C++, Python, and any other similar medium-high level language, if that's the intro language and your sole teaching language.

Here at MIT we have 4 intro courses. The first, the famous Structure and Interpretation of Computer Programs [mit.edu] , is taught entirely in Scheme, a purer and more pedagogical dialect of Lisp. You learn how to do all the high-level algorithms (e.g., sorting) in a purely mathematical/logical fashion, since Scheme has automatic object creation / memory handling, no code-data distinction, etc. At the end of the class you work with a Scheme interpreter in Scheme (the metacircular evaluator), which, modulo lexing, teaches you how parsing and compiling programs works.
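The "purely mathematical/logical" style described above - algorithms written as definitions rather than as mutation of memory - can be sketched even outside Scheme. Here is a quicksort in that spirit, translated into Python for consistency with the rest of the thread (my illustration, not an SICP exercise):

```python
def qsort(xs):
    # Base case: an empty or single-element list is already sorted.
    if len(xs) <= 1:
        return list(xs)
    pivot, rest = xs[0], xs[1:]
    # Partition into elements below and not-below the pivot, then sort
    # each part recursively -- a direct transcription of the mathematical
    # definition, with no in-place mutation anywhere.
    return (qsort([x for x in rest if x < pivot])
            + [pivot]
            + qsort([x for x in rest if x >= pivot]))

print(qsort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```

The trade-off is that this copies lists at every level, so it is pedagogy rather than production code - which is precisely the point of teaching it first.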

The next two are EE courses. The fourth [mit.edu] starts EE and quickly moves to CS. You use a SPICE-like simulator to build gates directly from transistors. (You've done so in real life in previous classes.) Then you use the gate simulator to build up more interesting circuits, culminating in an entire, usable CPU. From gates. Which you built from transistors. The end result is, not only are you intimately familiar with assembly, you know exactly why assembly works the way it does and what sort of electrical signals are occurring inside your processor.

Once you know the highest of high-level languages and the math behind it, and the lowest of low-level languages and the electronics behind it, you're free to go ahead and use Java or whichever other language you like. (Indeed, the most time-consuming CS class is a regular OO Java software design project.) You're not going to get confused by either theory or implementation at this point.

So yes, blame Java, if you're trying to teach memory allocation or algorithm design with it.
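The transistors-to-gates-to-CPU progression the parent describes can be sketched in miniature. Here is a toy Python illustration (not the actual course simulator): every gate is built from NAND alone, and a 1-bit half adder is then built from those gates.

```python
# Toy bottom-up construction: every gate from NAND, then a 1-bit
# half adder from those gates. Illustrative only.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor_(a, b):
    return and_(or_(a, b), nand(a, b))

def half_adder(a, b):
    """Return (sum, carry) for two 1-bit inputs."""
    return xor_(a, b), and_(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", half_adder(a, b))
```

The same layering idea scales up: full adders from half adders, ripple-carry adders from full adders, and so on until you reach a datapath.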

Re:dead no, dying? yes (1)

Eivind (15695) | more than 7 years ago | (#18329631)

You've been talking to the wrong students then, seriously. Language is completely beside the point.

You can't even pass the *first* year in CS at the University of Bergen without programming, yourself, from scratch, at least a dozen or two of the archetypal basic algorithms and data structures: linked list, doubly-linked list, stack, circular buffer, heap, bubble sort, quicksort, binary trees, red-black trees, shortest path, that sort of stuff.

And they *do* use Java in the first year. Later they don't care what language you use, as that tends to be completely beside the point in more advanced courses. I did most of my crypto assignments in the 3rd year using Python; others stuck with Java, and some made a point of using a different language for just about every assignment to get a bit of experience with diverse languages. I saw Diffie-Hellman implemented in C, C++, Java, Python, Ruby, Lisp (various), Modula-2 and Perl; a friend of mine threatened to do it in Intercal, but I don't think that actually happened. It shouldn't really matter: the point is to understand Diffie-Hellman, not the language used.
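For reference, the textbook Diffie-Hellman exchange those assignments implement fits in a few lines of Python. The parameters here are deliberately tiny and insecure; the point is the math, not the key sizes.

```python
# Textbook Diffie-Hellman with deliberately tiny, insecure parameters;
# real deployments need ~2048-bit primes.
import random

p = 23  # public prime modulus
g = 5   # public generator

a = random.randrange(1, p - 1)  # Alice's secret exponent
b = random.randrange(1, p - 1)  # Bob's secret exponent

A = pow(g, a, p)  # Alice sends A to Bob
B = pow(g, b, p)  # Bob sends B to Alice

# Each side combines its own secret with the other's public value.
shared_alice = pow(B, a, p)  # (g^b)^a mod p
shared_bob = pow(A, b, p)    # (g^a)^b mod p
assert shared_alice == shared_bob  # same secret, never transmitted
```

Which language this is written in really is beside the point; the interesting part is that both sides arrive at the same value without ever sending it.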

Better tools to understand the theory (0)

Anonymous Coward | more than 7 years ago | (#18329637)

I used to teach the Algorithms course. We used Python, and the difference from the old days was drastic. The students understood the basic sorting algorithms, building trees, etc. at least twice as fast as they used to when we used C.

Besides, they learned it far better. In the old days, what they eventually remembered was a bunch of tricks to use when programming in C. Most of those tricks were downright harmful when switching to other languages or paradigms.

C, assembler and the like are things to learn after you've grasped the basic theory and ideas of programming.

Re:dead no, dying? yes (1)

Hazelnut (660467) | more than 7 years ago | (#18329647)

"The normal course of action is to blame Java, since it has led to a simplistic approach to CS assignments. I'd love to blame it, I ferkin hate the language, but that isn't the root cause."

It's nice to see you're not blinded by your hate for Java, this is /. after all, but I'm curious as to why you hate it? There are a few languages that I dislike, but I don't think there are any that have inspired actual hate. Java is actually one of my favorite languages that I've used.

Pertinent part of the article (5, Insightful)

mccalli (323026) | more than 7 years ago | (#18329179)

From the article: "Here at De Montfort I run an ICT degree, which does not assume that programming is an essential skill. The degree focuses on delivering IT services in organisations, on taking a holistic view of computing in organisations, and on holistic thinking."

i.e., not Computer Science. For those not familiar with the UK education setup, I should also explain that De Montfort University is the old Leicester Polytechnic. The Polys were set up to provide much more practical education than the theoretical stances of the Universities, and a damned good job many did of it too - I'm certainly not playing the one-upmanship card that some do about the old polys. Leicester Poly was a good place, and its successor De Montfort has reached even further.

But the point stands - this point of view is coming from an academic teaching at a more practically-oriented institution and already running a non-science based course. His viewpoint should be considered against that background.

Cheers,
Ian

Programmers never code from scratch. (1)

alenm (156097) | more than 7 years ago | (#18329187)

Programmers just assemble finished bits of code into something new, as they always have. That is what programming is about. This has always been true; the question is only which level of abstraction you use, and as time progresses, things get more abstracted. For instance, nobody codes their own C/C++ program that listens to HTTP requests, but in the beginning you had to. The same goes for file upload and a lot of other stuff that now comes shrink-wrapped. So a lot of the work is already finished, but the assembly takes time too. Wait a sec: that is what takes most of the time. So my answer back is: will computer science academics get stupider every year because they ask the same question each year? And the answer to the question is still the same.

Re:Programmers never code from scratch. (2, Interesting)

VirusEqualsVeryYes (981719) | more than 7 years ago | (#18329517)

The parent makes the same mistake that the article and the summary make: computer science != programming. TFA talks on and on about "longing" for old programming languages, about new programming tools, about the ability of 8-year-olds to program (?), about almost anything programming-related. The only non-programming thing TFA cites is the falling numbers of computer science majors, which, in my opinion, does not indicate the death of anything, but rather reflects the amount of respect that IT jobs get in the private sector--that is, next to none.

But there's so much more to computer science than programming and general software. There's robotics, artificial intelligence, distributed computing, networking, graphics, architecture, and theory, not to mention the overlaps with other fields, such as with electrical engineering (architecture), mechanical engineering (robotics, integration), mathematics (especially statistics), sociology (mass models), and just about any other science or even non-scientific field that could use modeling--multifield modeling requires skills that techie teens do not have. Don't forget that there are uncountable subfields within each field, and I most likely missed one or more fields as it is.

Artificial intelligence and robotics are especially potent because they are both still in their infancy as fields of study. Their potential is huge. And TFA has the balls to claim that CS is dying? Quite the contrary.

Re:Programmers never code from scratch. (0)

Anonymous Coward | more than 7 years ago | (#18329537)

nobody codes their own c/c++ program that listen to http requests

So I need to learn a different language for listening to http requests and wrap it into my C code?

CS students learn so many alternative languages because they help them to avoid WORK.

If the project fails because of too many alternatives they cover its remains in buzzwords and spreadsheets and move on to the next disaster.

Thank god I quit CS and turned my job into a hobby again where I can earn Adsense money for stuff I like.

Breaking News: Math dead, too! (0)

Anonymous Coward | more than 7 years ago | (#18329197)

With cheap pocket calculators in every business, managers no longer see the need for math education. All good stuff has been discovered. Math departments are reported to begin converting their useless graduates into advertising professionals.

Computer Science isn't dead.... (1)

Naughty Bob (1004174) | more than 7 years ago | (#18329199)

....It just smells funny.

this thread is now closed (1)

acidrain (35064) | more than 7 years ago | (#18329203)

Has the software development industry stabilized into an off-the-shelf commodity?

The US Department of Labor predicts the industry will "grow more slowly than average [bls.gov] ." That is hardly dead.

It is 2007 and we are still writing code using text editors, not giving verbal commands to sentient machines. Nothing to see here, move along.

No it's not (1)

El_Muerte_TDS (592157) | more than 7 years ago | (#18329211)

at least not until he hand me my MSc.

Joking aside, a lot of people _tried_ computer science because:
- of the money; this stopped after the "crash" (or the point when investors wanted to see results)
- they thought it was easy, since they messed around with computers all day

As commercial software products have matured, it no longer makes sense for organizations to develop software from scratch. Accounting packages, enterprise resource packages, customer relationship management systems are the order of the day: stable, well-proven and easily available.

Switching to component-based development doesn't solve all your problems. You often still need to develop your own components and stuff. With computer science, a solution pretty much always introduces new problems. The work is never finished, partly because people are hardly ever happy with the end result.

32 cores? (1)

slapys (993739) | more than 7 years ago | (#18329217)

The rise of personal computers with multiple processors, each containing multiple cores, will lead to a change in what computers are capable of doing in the not-too-distant future. To summon forth the power locked inside these new processors, software engineers will need to learn about multi-threaded software, and develop a deep understanding of the hardware on which this multi-threaded software runs. This will create a demand for serious, dedicated software engineers, engineers capable of the intellectually difficult task of keeping track of run-time concerns like simultaneous memory access, locked data, etc. As soon as processors with 32 cores hit the market, I am fairly confident that these "Computer Science Has No Future" articles will disappear.
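A minimal sketch of the kind of run-time concern the parent means, in Python (a hypothetical shared-counter workload, not any particular product's code): the lock makes the read-modify-write on the shared value atomic with respect to the other threads, so no updates are lost.

```python
import threading

counter = 0
lock = threading.Lock()

def worker(iterations):
    """Increment the shared counter; the lock serializes the
    read-modify-write so concurrent updates cannot be lost."""
    global counter
    for _ in range(iterations):
        with lock:
            counter += 1

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000
```

Without the lock, two threads can both read the same old value, both increment it, and both write it back, silently dropping one update; tracking down exactly that class of bug is the "intellectually difficult task" the comment is talking about.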

Re:32 cores? (1)

aadvancedGIR (959466) | more than 7 years ago | (#18329455)

You're kidding, right? They'll just say that computers are so powerful that they don't need coders anymore. Why? Because for 40 years those people have been saying that CS or IT will be dead jobs within a couple of years, because that is what the C*Os want to hear, and they will probably keep being wrong for at least the next couple of centuries.

academics (1)

fozzmeister (160968) | more than 7 years ago | (#18329233)

The thing about academics is that they often have no real-world commercial/industry experience, yet feel the need to comment on it.

What's in your car? What's in your TV? What's running your website? Those are just three things that spring to mind that are not generally off the shelf. Off-the-shelf components, maybe, but someone still has to integrate it all. It's just madness to say that as computers become more prolific we need fewer computer scientists.

It's alive and well - in India (0)

Anonymous Coward | more than 7 years ago | (#18329245)

Being on top of the game in computer science requires considerably more effort than being an accountant, solicitor or barrister (lawyers in the American language). If the opportunity for competitive remuneration isn't there, then neither are qualified practitioners. When we outsource accountancy and legal services, enrollment on computer science courses will swell.

Not dead (2, Insightful)

Zo0ok (209803) | more than 7 years ago | (#18329251)

Compare computer science to other disciplines, like architecture. Computer science is still very immature, with very few true best practices and standards. It will not die anytime soon.

Remember the 4th-generation languages that were supposed to make programming unnecessary? Where are they today?

Ask innovative organisations like BitTorrent, Apple, Google or Blizzard if they see computer science becoming obsolete any time soon. I don't think so.

Re:Not dead (2, Interesting)

MichaelSmith (789609) | more than 7 years ago | (#18329333)

Compare computer science to other disciplines, like architecture. Computer science is still very immature, with very few true best practices and standards. It will not die anytime soon.

Maybe this is slightly off topic, but my wife is an architect, and any time I want to stir up one of her co-workers I tell him tales of version control, automated builds, automated unit testing and bug databases linked to revisions.

None of this exists outside of the software business in anything like the same form. When it comes to producing information in a controlled fashion software is streets ahead of any other field.

Re:Not dead (0)

Anonymous Coward | more than 7 years ago | (#18329587)

Ah, I take it you haven't heard of the fabulous 4th-gen language called 'India' and the compiler technology 'Bangalore'. /ducks volley of sharp objects

CS: Dead As It Ever Was... (2, Interesting)

cmholm (69081) | more than 7 years ago | (#18329271)

Comp Sci has always been dead, and always will be. In 1982, one of my early CS professors claimed that the window of opportunity for a job as a programmer or s/w engineer was going to close soon as automatic code generators took over the task of raw code banging. Employers would just need a few engineers for design, and that would be it.

But I shouldn't be surprised that yet another generation of technology dilettantes thinks that they've reached the pinnacle of achievement in a line of endeavor, and from here on out it's just like corn futures (somebody oughta tell Monsanto to stop wasting time with GMO research). But seriously, when we've got bean counters like Carly Fiorina and whichever IBM VP it was claiming that the years of technical advance in IT are over, not to mention the author of the fine article, Mr. McBride, I see people who are in the wrong industry. Perhaps they should be selling dishwashers, or teaching MCSE cram schools.

McBride is whining because the students aren't packing his CS classes like they used to. His reasons whittle down to these: mature software packages exist to service a number of needs (which has always been true, to the contemporary observer), and it's too easy to outsource the whole thing to India. It is the writing of someone throwing in the towel. It's like the trash talk you hear from people who are about to leave your shop for another job. I won't be surprised to find him in fact "teaching" MCSE "classes" very soon. Good. His office should be occupied by someone who still has a fire in their belly.

Re:CS: Dead As It Ever Was... (2, Insightful)

MichaelSmith (789609) | more than 7 years ago | (#18329415)

one of my early CS professors claimed that the window of opportunity for a job as a programmer or s/w engineer was going to close soon as automatic code generators took over the task of raw code banging.

I read once that assemblers and compilers were both described as enabling the "self programming computer" when they came out.

Of course such things just increase productivity and open up new applications.

And Universities Killed it (1)

DrSkwid (118965) | more than 7 years ago | (#18329287)

My friend has a CS degree from the local university. He is fully fluent in Java and VB. He had to do C++ and Haskell for course points but knows little about either. He's *not* a computer scientist by any stretch of the imagination.

So it's probably a good thing.

IT is so pervasive that the CS degree should fragment to suit the new world. My friend wouldn't stand a chance in front of an xterm, he's not even interested. To him, it's not a vocation, just a job (and fair enough).

Re:And Universities Killed it (1)

hasmael (993654) | more than 7 years ago | (#18329567)

I think you touch an interesting point, and one that has been bothering me for some time now: the difference between Computer Science, and things like software engineering.

To me, CS is kind of like physics (as a discipline) and SE is kind of like mechanical engineering. Evidently you do not need a physics degree to design a machine, but you do want one if you want to reason about how it works or to make a breakthrough in machine-making.

If the analogy holds, it is clear that there is less (diminishing) demand for CS. Not every mechanic needs a Physics degree.

But look where it comes from... (4, Insightful)

prefect42 (141309) | more than 7 years ago | (#18329293)

"Neil McBride is a principal lecturer in the School of Computing, De Montfort University."

De Montfort, one of the new universities that traditionally advertises on the TV and offers vocational courses in media and the like.

Academic really doesn't mean much these days. He's not even consistent:

"Interrupts, loops, algorithms, formal methods are not on the agenda."
vs
"The complexity of embedded systems, of modern computing applications requires a different way of thinking."

I'd not like to use an embedded system he'd developed, unless by embedded he was thinking Windows Mobile + Flash.

Sorry, a rant from someone who works at a real university, and knows he isn't an academic.

A flaw in reasoning (1)

guacamole (24270) | more than 7 years ago | (#18329305)

So, if computer science is dead, then who is going to develop the "accounting packages, enterprise resource packages, [and] customer relationship management systems" that the article's author mentions?

Seriously though, this is weird. How come we don't see posts every other week about how common university majors such as English, political science, mathematics, or, say, classics are dead, presumably because they don't teach any real-world job skills? If there are reasons for those majors to exist, then please explain what's wrong with CS, which actually does teach along the way a lot of stuff that's directly applicable in real job settings.

If computer science is dead, then how come fresh graduates from the top CS departments are being snatched away before they even graduate by a variety of companies, ranging from 10-person startups all the way up to Google, Microsoft, HP, and IBM? How many DeVry graduates do you see working in those companies vs. people who had formal CS training?

The hard sciences are all dying (5, Insightful)

SmallFurryCreature (593017) | more than 7 years ago | (#18329313)

For that matter, so is education in general. I am not a computer scientist; my education is technical instead (LTS/MTS/HTS for the Dutch).

When I attended the LTS we had real shop class, learning how to work with wood, steel, and electricity on real-world equipment, in an area that looked much like what you would expect to find in industry.

I recently had the occasion to visit a modern school that supposedly teaches the same skills, yet what I found was an ordinary classroom with a very limited and lightweight set of equipment. The kind of stuff you would find at home, NOT at work.

Yet somehow todays kids are supposed to learn the same skills.

And as if that ain't enough the number of hours of shop class have been reduced while the number of theory hours has been increased. Worse, the amount of technical theory has decreased as well and instead the amount of soft theory like history and such has taken over.

This has TWO negative impacts. First, young kids coming to work can't handle basic equipment and don't understand the theory behind it; even worse, the kinds of kids (like me) who used to choose a technical education because they don't like theory have had that choice removed. I myself was far too restless for a theoretical class; 18 hours of shop class per week, however, made the remaining theory that much easier to handle, and because theory and practice were linked, it all made sense.

Even worse, the modern education is supposed to make kids fit better into society, so how come they are bigger misfits then any generation before them?

No, this is not old-people talk. Notice even here on Slashdot how the art of discussion is dying out: say anything remotely controversial and be labelled a flamebaiter or a troll by some kid who can't handle the heat. I actually had a 20-year-old burst into tears about two years ago because I chewed him out for drilling through the workbench. Modern education is so much about empowerment that kids who think they are the top of the top can't handle suddenly being the lowest of the low when they enter working life. This is already a shock simply because you just went from being the youngest in school to the oldest in school, and now suddenly you are the youngest again.

Simply put, I think education in general is less and less about turning out skilled professionals and more and more about just keeping kids off the job market. Comp Sci ain't the only victim. Just try to get a good welder nowadays. Hell, I'd settle for anyone who knows the difference between a steel drill bit and a stone one. (And no, that doesn't mean one is made out of stone; it's a matter of what it is for drilling into.)

Re:The hard sciences are all dying (0)

Anonymous Coward | more than 7 years ago | (#18329473)

1930 called and wants the oxyacetylene torch back

Who builds the solutions? (1)

JPriest (547211) | more than 7 years ago | (#18329339)

OK, so there is enough software on the market that companies are forced to write less stuff in-house. Sooo, who is writing and supporting that software?

Is De Monfort dead? (1)

palfrey (198640) | more than 7 years ago | (#18329341)

Computer Science ain't dead yet (although it may smell that way at times); you just need a degree program that's worthwhile. Crappy places churning out idiots hoping to make a fast quid tend to die off in times like these, but the better ones survive.

Not everything is covered by OffTheShelf software (1)

Lonewolf666 (259450) | more than 7 years ago | (#18329351)

There are a lot of niche applications and in-house development jobs that are not covered by standard applications: things like writing control software for machines, which is typically done by a small team of developers at the hardware manufacturer.

If jobs for creating office suites disappear, that will only affect a small part of the field.

College Enrollments != Future Business Needs (2, Insightful)

slarrg (931336) | more than 7 years ago | (#18329353)

Many of the students who would look for a degree to get rich were enrolling in CS. Now that the news is filled with stories of out-sourcing to India and the collapse of programming as a way to earn infinite wealth these students are no longer interested in CS and are pursuing careers as doctors and lawyers instead. Good riddance, I say, anyone who is only into programming for the money is probably not overly good at it.

Programmers will always be needed. As tools become more capable and advanced, the only thing that changes is the methodology of programming. Programmers are required because of their ability to think discretely. Any tool is only as good as the organizational ability of the person who uses it. I've met precious few non-programmers (and scientists in general) who are able to think in discrete enough terms to actually create a functional system.

Here is an example I often use, involving how organizational systems often spring to life. Imagine a sorting facility that tells its people to sort all the items into three different areas: one area should have all of the blue items, another will have the metal items, while the final area will have the tall items. The items are being sorted on three non-exclusive properties, and there will undoubtedly be an issue when the tall, blue, metal item is encountered. Most business managers will claim, "we'll deal with that issue when it arises." But computers don't deal with exceptions gracefully, and no company has the resources to deal with the constant onslaught of exceptions produced by poor data/process organization. This is the function that programmers provide. We are always concerned about the exceptions. The stuff that actually goes according to plan is almost an afterthought.

Anyone who has walked into a company that has its entire order-fulfillment system running on a Microsoft Access database kludged together by the dozen office workers cum computer programmers who make up their IT staff will immediately understand why programmers will always be needed. Garbage in = garbage out.
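The sorting-facility scenario above can be sketched in a few lines of Python (the rule order here is an arbitrary illustration): the programmer's contribution is an explicit, total priority rule, so the tall, blue, metal item lands in exactly one bin and even the leftover case is decided up front.

```python
def classify(item):
    """Assign every item to exactly one bin via an explicit priority
    rule (metal, then blue, then tall), so overlapping properties
    never produce an unhandled exception."""
    if item.get("metal"):
        return "metal"
    if item.get("color") == "blue":
        return "blue"
    if item.get("height_cm", 0) > 180:
        return "tall"
    return "unsorted"  # the leftover case is named, not left to chance

# The tall, blue, metal item gets exactly one answer:
print(classify({"metal": True, "color": "blue", "height_cm": 200}))  # metal
```

"We'll deal with that issue when it arises" is, in code terms, the missing final `return`; making it explicit is precisely the discrete thinking the comment describes.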

Software does not design itself (yet). (1)

Der Reiseweltmeister (1048212) | more than 7 years ago | (#18329363)

As commercial software products have matured, it no longer makes sense for organizations to develop software from scratch. Accounting packages, enterprise resource packages, customer relationship management systems are the order of the day: stable, well-proven and easily available.

And who is going to make this software? Or are we going to use this same suite of software packages for the rest of the lifetime of the computer? There will always be a need for new developments in algorithm design, and for the foreseeable future someone will fulfill that need.

CS is a research/academic discipline (3, Insightful)

HuguesT (84078) | more than 7 years ago | (#18329365)

The person who wrote this article doesn't even know what CS is. CS is computer science. It will be dead when science is dead.

CS won't be dead until all the interesting questions in the theory of computing are solved: is P != NP? What can a quantum computer achieve? What are the theoretical limits to computation in the physical world, beyond Turing machines? Given the truly enormous current output in all the branches of IT, from HCI to pure mathematics via signal and image processing, I would not be worried at all.

Just to rehash, CS is not about designing the best accounting package. This is ICT, not CS. CS is a means to an end.

As to ICT, I don't think the final word has been said either. Just look at the sad state of Vista, or for that matter, at just about any accounting package. Who can say with a straight face that's the best that can be done?

Is computer science dead? (1)

neuroinf (584577) | more than 7 years ago | (#18329371)

The modern university defines its disciplines by market demand, which can be a slave to fashion. The real issue is whether there are any genuine "big questions" left in computer science. Much of the early part of CS dealt with the scarcity of computing - how to make use of the limited resources of memory and computation. We don't need this anymore; there is an abundance of computing.

I've been fascinated by the dramatic gap between human capabilities of thinking and the capabilities of computers. Can we ever make computers that even remotely approximate the capabilities of human brains? I don't think we are much closer to answering that question than we were 30 years ago. But we have learned that answering this particular question is of very little commercial value. So CS research may remain a discipline with big questions, but with funding comparable to (for example) archeology. CS skills continue to be in demand, even if CS research is a bit past it.

When I was young, I was blessed with teachers who told me things plainly. So: if you are interested in CS, or computers in general, and you turn aside from this path because people tell you it's not commercial, then you are going the wrong way. Follow your interests, and tell your parents and your peers to go take a long walk and leave you alone. Fortunately, it seems that only students in the Western world suffer from these delusions, which will just accelerate the movement of the center of IT to China and India.

Re:Is computer science dead? (1)

Slashamatic (553801) | more than 7 years ago | (#18329661)

Much of the early part of CS dealt with the scarcity of computing - how to make use of the limited resources of memory, computation. We don't need this anymore - there is an abundance of computing.

Really. We tie up a network of computers every night calculating consolidated Value-at-Risk. Our traders want the analytics to run a tick or two faster than the opposition so they can trade faster.

It isn't just finance where fast computation is important. Other stuff includes meteorology, hydrodynamics and so on. Also remember that a high end processor isn't always possible due to power constraints.

Are you mental? (2, Interesting)

dintech (998802) | more than 7 years ago | (#18329377)

'As commercial software products have matured, it no longer makes sense for organizations to develop software from scratch.'

This is equivalent to 'Off-the-shelf applications now fulfil all possible needs and changing requirements.'

Surely not. The British Computer Society should really talk amongst themselves before releasing such obviously trollish public statements. This idea could get into the hands of people who would take it seriously...

Some muppet in your management chain is trying to 'leverage' a Microsoft Office implementation for your Credit Derivatives trading platform.
Cancel or Allow?

Bespoke softs (1)

kahei (466208) | more than 7 years ago | (#18329391)

I work in-or-near the bespoke software business in finance, and certainly the increasingly powerful off-the-peg solutions that have emerged in the last 5-10 years do compete with bespoke development. It's also generally fairly true that it takes fewer developers to give 10 banks the _same_ software package than to give them each a bespoke package, making off-the-peg generally cheaper. But there are other differences.

Projects go on forever either way so multiply by the number of years required :)

Bespoke app: 10 devs, 1 pm, 1 ba, 2 it people * 20 banks. Total man years: 200 dev, 20 pm, 20 ba, 40 it

Off-the-peg app: 10 devs, 1 pm, 5 bas (minimum, because the software house has to talk to lots of clients), 2 sales. 1 pm, 1 dev & 2 it ppl per bank to do rollout & integration work. Total man years: 30 dev, 21 pm, 5 ba, 2 sales, 40 it.

Now sure, the total spend has decreased. But what's more important is that developers, as a fraction of the spend, are no longer the big slice. Project management, business analysis -- these are BIGGER when development and rollout are spread across companies. The IT burden is roughly the same (IT people, meaning people installing software, plugging things in etc. are cheap anyway).

What this means is that as the market matures, the actual work of development becomes less and less important and the work of managing, selling and integrating what has been developed gets more and more important.

Note that this applies to software development that can be expressed in terms of product, i.e. software which is delivered, installed and supported in a product-like way. There's also a wide world of 'service-like' software development which is subject to very different trends -- e.g. to outsourcing. But that's a story for another post.
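For what it's worth, the man-year arithmetic above checks out; restating the comment's own numbers in Python:

```python
banks = 20

# Bespoke: each bank funds its own full team.
bespoke = {"dev": 10 * banks, "pm": 1 * banks, "ba": 1 * banks, "it": 2 * banks}

# Off-the-peg: one central team, plus per-bank rollout staff.
off_the_peg = {
    "dev": 10 + 1 * banks,  # central devs + 1 rollout dev per bank
    "pm": 1 + 1 * banks,    # central PM + 1 rollout PM per bank
    "ba": 5,                # central BAs talking to many clients
    "sales": 2,
    "it": 2 * banks,
}

print(bespoke)      # {'dev': 200, 'pm': 20, 'ba': 20, 'it': 40}
print(off_the_peg)  # {'dev': 30, 'pm': 21, 'ba': 5, 'sales': 2, 'it': 40}
```

The striking figure is the dev line: 200 down to 30, while PM head-count barely moves, which is exactly the shift from development toward managing, selling and integrating that the comment describes.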

Re:Bespoke softs (0)

Anonymous Coward | more than 7 years ago | (#18329623)

Financial institutions should stop pretending they can do software development. They should worry about whether joe subprime can make next month's spiked mortgage payment on the interest-only negatively amortized option-ARM you sold him last year.

Is computer science dead? (4, Funny)

jandersen (462034) | more than 7 years ago | (#18329435)

No, no, it just smells funny.

Needs to evolve into Computer Sciences (plural) (1)

searlea (95882) | more than 7 years ago | (#18329449)

No, Computer Science isn't dead. It's simply grown too big to be covered by a single 'Computer Science' label.
Just as biology branched out into 'Life Sciences [manchester.ac.uk] ' it's about time Computer Science was broken into separate areas.

It used to be fine to have a single Computer Science course with one module in Law, another in Algorithms; one in AI, one in Databases etc.

These subjects are too big now; covering the full subject area in a single degree produces graduates who are classic 'jack of all trades, master of none.'

Re:Needs to evolve into Computer Sciences (plural) (2, Interesting)

thaig (415462) | more than 7 years ago | (#18329615)

I think it's more like:
Mechanical Engineers and Mechanics
or
Electrical engineers and Electricians

Each job has its problems but focuses on a different end of the product lifecycle.

Some software doesn't die and merely needs to be maintained, so naturally, after a while there is less need for hardcore Computer Scientists to develop new things. Open source probably accelerates this trend - e.g why write a portable runtime library for your app when you can use the NSPR or the Apache one?

Less demand != no demand (1)

Capt James McCarthy (860294) | more than 7 years ago | (#18329469)

Offices will always want something that COTS software does not do. I think third-party vendors should worry about becoming obsolete because operating systems are beginning to incorporate their functionality within the OS itself. M$ is trying with virus scanning and the like. Not perfect yet, but I think the code is on the wall, so to speak.

I believe this is the crux of the AGAINST argument (1, Funny)

Moggyboy (949119) | more than 7 years ago | (#18329479)

... there'll always be the pointy-haired boss who wants that icon in "powder-blue". Believe me, I've worked for enough of them to know that I'll never be out of a job.

A practical approach? (1)

MrDomino (799876) | more than 7 years ago | (#18329503)

The article makes a lot of shaky assertions, but it gets one thing right: computer science curricula in higher education are becoming something of a joke. I think it misfires in saying that the way to go is to be more practical and interdisciplinary; I think the problem is that computer science programs are too practical. "Computer science" has come to be less the study of algorithms and information management, and more a vocational degree--universities aren't graduating computer scientists so much as they're graduating computer mechanics.

I wonder if part of this is that universities are being forced to spend time drilling into undergrad students concepts that should've been learned long beforehand through a proper high school education or (god forbid) natural curiosity--and, moreover, if this trend will seep into graduate school as more people pick up master's or doctorate degrees for purposes of job differentiation.

If we're talking about developers here... (1)

jasomill (186436) | more than 7 years ago | (#18329513)

Putting the debate about the differences between academic CS, practical software development, and IT/MIS aside, it seems to me that, all other things being equal, an IT environment built from mostly off-the-shelf components will require fewer, but better developers. After all, it's far easier (and requires far fewer skills) to build a one-off custom application (or component, or robot, or whatever) that works (I'm tempted to say "happens to work") for a single customer (especially if the original developers remain available to provide ongoing support and maintenance) than it is to build a robust, off-the-shelf application that works well for many customers in many different environments, especially if the market demands reasonably-priced support for said app.

I'm tempted to say that this is a good thing, i.e., being able to take advantage of economies of scale to drive the cost of established technology down. And it's only the death of (applied) CS when people stop coming up with novel ideas for new applications of technology (what was that about the Patent Office closing down because everything worth inventing had already been invented in the 19th century?). I don't know about you, but I'd rather the greatest minds of my generation spend their time developing interesting new application areas than writing ad-hoc, informally-specified, bug-ridden, slow implementations of QuickBooks with Excel macros and duck tape (apologies to Philip Greenspun). In the words of Thoreau, "the sun was made to light worthier toil than this."

-jtm

The software industry is a big place (1)

91degrees (207121) | more than 7 years ago | (#18329529)

There are many fields where the off-the-shelf principle doesn't work.

But curiously, reducing the labour needed for production reduces prices and often seems to increase demand sufficiently to more than compensate for the reduction in labour. Take the Ford Model T as an example. Work required per car was considerably smaller than for any other brand of car, but the workforce was huge. Commercial software (e.g. Windows and Excel) does all the tasks required of 90% of users, so this should mean the software industry is only about 10% of the size it was in the 1960s. But it's much larger than it's ever been.

What computer science is not (1)

DrHyde (134602) | more than 7 years ago | (#18329553)

  • Computer science is not software development.
  • Computer science is not about teaching Java.

Perhaps fixing those two problems that are endemic to "Computer Science" courses would go part of the way to fixing the problem.

Re:What computer science is not (1)

tomstdenis (446163) | more than 7 years ago | (#18329607)

I agree that comp.sci isn't strictly about programming, but I think it's a good idea to have *a* language in the courses. The problem otherwise is you have these "grads" who honestly don't know shit all about actually using their comp.sci knowledge. They end up having horribly unmaintainable coding practices that people like me usually have to come around and clean up.

Things like Java, C++, Perl, etc should be single semester long courses and not the focus of the degree. Because frankly, once you get the idea of programming languages, learning Java over C++ [or whatever] is a matter of the grammar really (especially since they're all fairly similar).

Tom
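[Ed. note: As a rough illustration of the parent's point that moving between these languages is mostly a matter of grammar, here is a hypothetical linear search written in Java, with the near-identical C++ spelling shown in comments. The class and method names are made up for the example.]

```java
public class SearchDemo {
    // Returns the index of target in xs, or -1 if absent.
    // The C++ version differs only in surface syntax:
    //   int indexOf(const std::vector<int>& xs, int target) { ... }
    static int indexOf(int[] xs, int target) {
        for (int i = 0; i < xs.length; i++) {  // C++: i < xs.size()
            if (xs[i] == target) {
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] xs = {3, 1, 4, 1, 5};
        System.out.println(indexOf(xs, 4)); // prints 2
        System.out.println(indexOf(xs, 9)); // prints -1
    }
}
```

The loop, the condition, and the early return are character-for-character close; what actually differs between the two languages (memory management, templates vs. generics) is exactly the material a single-semester course can cover.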

Moving towards a commodity (2, Insightful)

Bloke down the pub (861787) | more than 7 years ago | (#18329561)

To get the headlines a hundred years ago, just replace "British Computer Society" with "Ye Fraternal Guild of Buggywhip Frossickers" and "off-the-shelf solutions" with "horseless carriages".

There are always more consumers than creators (1)

gorbachev (512743) | more than 7 years ago | (#18329563)

Think about it. How many people get to write Java rather than write applications using Java? Or how many people get to write a brand new sorting algorithm compared to how many people get to use it?

I don't think there's anything wrong here. It makes perfect sense schools would create more consumers of computer science than computer scientists. If everyone coming out of these schools was a "creator", either the unemployment rates would go sky high, or there'd be a whole bunch of overqualified people working on tedious crap.
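[Ed. note: The consumer/creator split the parent describes fits in a single snippet: the "creator" writes the sort once; everyone else consumes it with one library call. A minimal sketch (class and method names are hypothetical):]

```java
import java.util.Arrays;

public class SortDemo {
    // The "creator" side: a hand-written insertion sort, the kind of
    // thing only a small number of people ever need to implement.
    static void insertionSort(int[] xs) {
        for (int i = 1; i < xs.length; i++) {
            int key = xs[i];
            int j = i - 1;
            while (j >= 0 && xs[j] > key) {
                xs[j + 1] = xs[j];  // shift larger elements right
                j--;
            }
            xs[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 4, 1};
        int[] b = a.clone();

        insertionSort(a);  // the creator's path: write the algorithm
        Arrays.sort(b);    // the consumer's path: one library call

        System.out.println(Arrays.equals(a, b)); // prints true
    }
}
```

Both paths produce the same result; the economics of the thread are about how many people need to walk each one.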

no, no, no (4, Insightful)

tomstdenis (446163) | more than 7 years ago | (#18329569)

This has been asked repeatedly ever since I was a wee lad [20 years ago]. The idea then was BASIC would replace comp.sci because it was so simple to program. Of course, it overlooked the fact that BASIC is wickedly inefficient. No, the answer is no. No. No. No. Why? Someone's gotta maintain the scene.

For starters, the more automated tools are not efficient enough for most computing platforms (hint: think running that nice VB.NET application in 32KB of ram). Then combine that with the need for algorithms (re: 16MHz processors) and you can see that RAD tools don't apply.

Tom

Go to the ends of the earth for employees (1)

rodney dill (631059) | more than 7 years ago | (#18329577)

As commercial software products have matured, it no longer makes sense for organizations to develop software from scratch.

...not unless they can outsource the work to India or China or some other low cost provider.

"Some companies go to the ends of the earth for their people... and usually find that they can get them there at a substantial savings."

cheaper hardware makes software seem to cost more (1)

dropadrop (1057046) | more than 7 years ago | (#18329579)

I think servers becoming cheaper and cheaper is also affecting how eagerly companies are willing to adapt their needs to a ready-made package. It used to be that the hardware required to run an application and its testing environment was so expensive that getting a custom-coded application didn't feel out of line.

Most of the legacy systems we run are on either mainframes or distributed among a bunch of HP or Sun RISC servers, each of which cost over 10 times more than the far more powerful computer replacing it.

computer science isn't dead (2, Interesting)

seriouslyc00l (1075045) | more than 7 years ago | (#18329589)

Computer Science isn't dead. Some old computer scientists are dying. And new ones are being born. By the dozens. In the west, and in the east. Yes, jobs migrate by rules of economics. That doesn't kill the science. Because what migrated was not science - it was bricklaying of the computer age. If computer scientists were to do the "bricklaying", that would kill the science. Having said that, there are bricklayers in every community - east or west. It's a pity that bricklayers from the west have had to see their jobs move east. Sorry, but that's how the rules of economy work. The real scientists, whether they are from the east or west, stay put where they are, doing what they like doing best. The invention of concrete mixing machines did not render civil architects extinct - on the contrary, it made it necessary to have more of them, and to have better ones. And so it is with software. Off-the-shelf software doesn't make software engineers obsolete - it makes it possible to explore new application areas - and this requires more and better software engineers than before.

The answer is to change the courses (1)

jonwil (467024) | more than 7 years ago | (#18329605)

If they had courses covering the sorts of skills you find in modern software development shops (or at least in the GOOD software development shops), maybe this wouldn't be an issue.

Skills like code inspections, documentation of your changes, configuration management and so on.

CS not dead, but CS not IT (1, Insightful)

Anonymous Coward | more than 7 years ago | (#18329655)

CS is not dead. There are plenty of interesting fundamental problems to be solved in the area of computing.
But indeed, most IT people don't need a CS education. The current CS curriculum should be split into a pure CS curriculum and an IT curriculum. CS should focus on subjects like computation and computer organisation. IT should focus on information processing and application development. Of course, there will be overlap between the two fields, and overlap with mathematics and other science disciplines, depending on what area you wish to specialise in.
