Professors Slam Java As "Damaging" To Students
jfmiller calls to our attention two professors emeritus of computer science at New York University who have penned an article titled "Computer Science Education: Where Are the Software Engineers of Tomorrow?" in which they berate their university, and others, for not teaching solid languages like C, C++, Lisp, and Ada. The submitter wonders whether any CS students or professors would care to respond. Quoting the article: "The resulting set of skills [from today's educational practices] is insufficient for today's software industry (in particular for safety and security purposes) and, unfortunately, matches well what the outsourcing industry can offer. We are training easily replaceable professionals... Java programming courses did not prepare our students for the first course in systems, much less for more advanced ones. Students found it hard to write programs that did not have a graphic interface, had no feeling for the relationship between the source program and what the hardware would actually do, and (most damaging) did not understand the semantics of pointers at all, which made the use of C in systems programming very challenging."
tasty (Score:5, Funny)
I dunno about you, but java was nothing but helpful to me as a student. the drinkable kind, at least.
Re:tasty (Score:5, Interesting)
I find I am now having to teach myself C++, and am struggling in a lot of areas that, had I been taught them at Uni, I would be a lot more confident in.
Re:tasty (Score:5, Insightful)
Most of my professors had no real world experience, either. So, teaching things like team dynamics and working within a project schedule were really beyond their expertise. Granted, I've been quite successful, but I attribute most of that to my abilities, not what I learned in college. College just got me a piece of paper that opened the door.
I don't think the problem is with the languages being taught, but in the lack of true engineering being taught. This is true of any of the programming related fields (CS, MIS, SE). All of them need these skills.
Layne
Re:tasty (Score:5, Insightful)
And also beyond the scope of computer science. If that's what you wanted, you should have specialized in software engineering. People keep forgetting that computer science classes should feel more like math classes than engineering or management classes. One look at TAoCP would hint at that. For the record, I'm an engineer and I find the pseudo-engineering that most CS programs push out to be highly disturbing. Either do it right and call it software engineering, or remove the non-CS stuff and call it computer science. If you're not gonna do either aggressively, give it a fake major name like "Information Technology" or "Management of Information Systems" and teach a bunch of stuff really poorly.
Re:tasty (Score:5, Informative)
During the second year we took a class in C/C++ that basically taught pointers and memory management. In my upper-level courses, like operating systems and graphics, it was all C and C++ from then on. I think this gave me a pretty well-rounded education.
When I was done, I had used a number of tools (languages) to learn a variety of CS topics, and felt that I was well prepared for the industry.
Right Tools for the Job (Score:4, Insightful)
University is not where you go to learn a specific set of skills. If you want that, you go to a technical trade school.
University is where you go to get an in-depth set of concepts, critical thinking skills, research skills, and theory foundations. This is true for any major you wish to approach. In the CS department, there is a reason you take different languages, some are for system development, some are for app development, some are for theory exploration with little to no value outside of the educational environment. Java falls into one of those categories. Assembly, C, C++ fall into others. Ada falls into yet another.
Think of it in terms of the English major, you know, those dime-a-dozen students who will end up working at Burger King and Mr. Chow's Empire Chinese Buffet, or who go to Hollywood to work as waitresses while they wait for their big break. The English major takes a load of literature, English, American, Russian, Manga, and poetry from Bacon/Shakespeare to Ginsberg to Hughes to Tupac, and writing from haiku to freestyle with a goofy footed pentameter (trademark and patent pending). None of this is particularly helpful to someone who wants to come out of school with business writing skills.
Remember, in University, some of the most mistaken ideas come from the professors.
Re:tasty (Score:5, Interesting)
Computer science is theory and math. Computers and languages are the tools used to explore these concepts. The specific languages you learn in the university don't really matter; anyone with a CS degree and half a brain can pick up new languages within a very short amount of time.
I think it would be a very bad move for universities to cater to the corporate world. If you want to just learn programming, get some certs or buy a book. If you want an education, go to the university.
Re:tasty (Score:5, Insightful)
anyone with a CS degree and half a brain should be able to pick up new languages within a very short amount of time.
There are unfortunately a great many universities turning out a great many low-quality "computer science" grads who don't know the first thing about programming, much less the intricacies of stacks and pointers in C. I've met some alleged CS grads who didn't know a compiler from a hole in the ground.
I think the problem may be improving from how it was a few years back (the dot-com bust knocked CS off the lists of many students just looking for an easy $80k paycheck on graduation), but there are still a lot of dolts around, devaluing the degree.
Re:tasty (Score:4, Insightful)
But the whole reason to GO to a University is to get the skills/education to make more money when finished than you would have if you had not gone.
College is a means to an end....and while it is nice to learn other things to be a bit well rounded, that is extra fluff if you have the time and money for it while there, but, don't forget the real reason for going.
If people could make good $$ without college, I doubt you'd see so many people trying to go....
A degree gets you in the door for a job....regardless of what it is in often...you have to have one these days to get a good job.
Re:tasty (Score:5, Interesting)
I think you are missing the mark here: the profs who wrote the original article are both principals at a company that sells Ada tools. What they are really complaining about is the lack of demand for their stuff. Treating the argument seriously is a mistake.
I don't think that there was ever a time when Ada was a popular teaching language for any purpose other than coding in Ada. Same goes for COBOL, Fortran and such at this point.
Nobody would claim that there was a desperate need to teach CPL or BCPL, Pascal, or the like these days. They had their moment, they were found wanting. There are much better teaching languages these days and much better production languages.
These days I would probably teach either Java or C# as the intro language, depending on which is ahead at the time. I might teach C# to comp sci students simply to force the students to acknowledge the fact that they need to be able to adapt to new languages. Most other cases Java is most likely to be the most useful language.
The big change that came with Java is that when Java appeared it was the first time that a mainstream language was an acceptable teaching language. Pascal was popular in universities but the architecture was borked (an array of ten elements is a different type from an array of eleven). The functional languages had dreadful performance and paltry support libs.
Re:tasty (Score:5, Insightful)
I think anyone who is spending 4 to 6 years getting a degree in computer science only to get a high-paying job when they get out is a tad silly. They are really, really wasting their time. They can get an intern job right now, at a software consulting company, and study their ass off (as we all have to do in this field). Within a year they will be making decent money; within 3 years, really good money. 4 years later, when the person has their shiny degree, after studying Java (which probably won't even be used then), they get the joy of getting a junior developer's job.
There is an old adage: "How do you become a writer?" "Write... a lot." This is the same with programming. You can't fake your skills, and a PhD in CS won't matter if you can't bill your clients because your application doesn't fulfill requirements or even work.
Truthfully if all you care about is money, work in finance, or become a salesperson. The best developer in the world won't compete with a high end salesperson dollar for dollar; hell CEOs can't compete with top salespeople. Zero education required.
I very much value a university education, but it has nothing to do with making more money. Learn, create, become a very educated person; the money will follow; the money part really isn't that hard.
Not why I went (Score:4, Interesting)
That's not why I went to college. That's why you go to a trade school.
I went to college as a CS major because I loved programming. I went to college because I enjoyed learning, and wanted to round out my education in a lot of ways.
That I happened to be able to get a job after was because I was able to take all of the very abstract concepts I had learned and apply them to practical matters. But I had always been doing that on my own all through school anyway - why would I need school for that? Anyone can do that on their own, schools are there to teach you things that are hard to grasp or learn on your own.
Re:tasty (Score:5, Insightful)
College is a means to an end....and while it is nice to learn other things to be a bit well rounded, that is extra fluff if you have the time and money for it while there, but, don't forget the real reason for going.
If people could make good $$ without college, I doubt you'd see so many people trying to go....
A degree gets you in the door for a job....regardless of what it is in often...you have to have one these days to get a good job.
For a moment, put the reason why YOU go to a University to the side and consider what the purpose of the University is. It's an institution that's literally thousands of years old, dating back to the old Greek institutions of education. When Plato and Aristotle founded their schools, they didn't put up a big sign that said "When you're done, you get more money." That wasn't the promise. The promise was that by teaching you about the world, you would become a better person. That is to say, the founding concept of the University was that education led to human excellence. And, for the Greeks especially, human excellence was not directly related to the possession of wealth.
This understanding of education was dominant up until very recently. Everyone was required to learn Greek and Latin, so they could read Homer and Plato. Reading Homer isn't going to get you a job, it's not going to get you a promotion, it's not going to get you an interview, and it's not going to get you laid this Friday. No one at the University used to make the claim that it would. They'd claim that reading Homer makes you a better person, even if it doesn't get you a job.
Now, as to why YOU should go to a University? If you're going for the purpose of getting a job, you're not going to understand the vast majority of your classes at the University. You're going to be wondering "Why do I have to take this anthropology class?" or "I have no interest in Operating Systems, why do I need this Operating Systems class?" and "Why do I need a foreign language, I'm going to be working with code all day." All these questions miss the larger point of what the University is trying to do to you. And if you're missing the point of the entire institution, it's exceptionally difficult to do well there.
The whole thing is really just the result of multiple generations of corruption, I think. Employers realized that well-rounded, educated (dare I say, excellent) human beings are better for the health of a company. So they pay more for people who are excellent, and a University degree used to be a short-hand of some form of excellence. The masses of uneducated began to realize this, and started saying to their kids "If you want a good job, you need a degree." So their kids started going to the University, thinking the point was to make money. Professors, having tenure, just did what they were going to do anyway, but now we've gone two or three generations like this. We're reaching the point where current professors went to school thinking it was for money. We have boards of Universities with pressure from the state to focus less on the goal of education for excellence and more on the goal of education for job skills.
Re:tasty (Score:4, Interesting)
Art majors
English majors
Performing arts majors
I could go on... I don't see how you have a valid point. I am not saying that the above fields are not worthy of pursuing, but people do not get into them for the money.
Re:tasty (Score:5, Insightful)
Do more. Try doing your homework in haskell or lisp or hell, write in forth or postscript. It's a billion times easier to learn a language when you have someone else telling you what to do in it, and a billionth of the stress when your paycheck doesn't depend on it working.
I've wanted to learn ruby and rails for a while now, but I've got nothing to do with it at home, and like hell I'm going to show up at work and replace a production app with ruby for the hell of it, even though we've got a number of internal web apps that are basically exactly the kind of CRUD RoR was designed for.
Re:tasty (Score:5, Funny)
Hell? Maybe you meant Perl?
Re:tasty (Score:4, Interesting)
Lisp and Scheme I'll lump in "parenthesis hell." I've never seen the allure of list processing languages - they drive me nuts. In the real world, you'll probably never see them even for what it's known best for, AI (python or lua are much more useful these days).
Haskell I've never used; it seems to be stuck in the university wasteland. Ruby and Rails seem more practical, but no more than Python. To be honest, I dislike Python because it uses significant indentation, the one thing I despised most about Make. I've been meaning to look at Ruby as well, because I like the Smalltalk object model, but I'm not sure how much I like significant line endings.
As abusive as it sounds, languages like COBOL (banking/finance) or Ada (government/military) or even FORTRAN (mech-E) will get you a job faster than Haskell. Or learn RPG and dedicate your life to IBM and Unisys mainframes (*shudder*).
Java == Jobs (Score:3, Interesting)
But I have not been able to find any such jobs. Job databases show 90%
Re:Java == Jobs (Score:5, Insightful)
I'm mainly a C programmer these days, but I took the job basically understanding that I would be working significantly with Java. That was the only language I had experience with on leaving Uni, and I was promptly put to work on a Pascal / OpenVMS system! Friends from Uni have had similar experiences.
I have been a bit worried about an outdated skillset, as lots of employers ask for lots of object-oriented programming experience and I only occasionally use this. I think this would be my primary problem if I started looking for a new job. I also think it's a bit unfair, as the skills are pretty transferable: there's only a little new theory to learn, and after that, good programming practices aren't hugely dependent on the language used.
In dealings with many (perhaps even most) other companies whose software I write interfaces with, it's pretty clear that they are also using C or C++, and often even older systems (in one interface we have to convert our messages from ASCII to EBCDIC). You can frequently tell what language the other system is from the sort of errors that crop up, and sometimes from the design of the interface. I'm forced to believe that my area of the industry is still primarily C based.
Re:Java == Jobs (Score:5, Insightful)
As code gets more complex, the best way to keep it understandable to others is to follow common language idioms, indentation / code formatting practices, and use built-ins in the standard libraries. These alone often take months to become familiar with, but that's only half of it. The other half I can only describe as trying to approach problems from the unique perspective of the language. Any asshole can jump from Java to Ruby, or from C++ to Lisp, or from VB.NET to Scala. But learning how to solve problems using those languages' strengths, rather than writing code as you would in the language you're coming from, is crucial.
From my own experience, Java programmers coming fresh into Ruby don't use blocks. When you finally convince them to use blocks for enumerators, they miss the point entirely and simply use each_with_index for everything, rather than more powerful methods from functional programming like map. They also don't like to reopen classes. In Ruby, classes can be added to at will, so if you want a method to calculate the average value of an Array, you can simply define it as a new method on the class. But Java programmers will create a Util module, throw a method in there that takes an Array, and think nothing more. It's not wrong, per se, but it's ignoring Ruby's strengths, and simply writing Java code inside the Ruby interpreter. And the people who do this are bloody useless.
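To make that concrete, a small sketch of the two idioms (Array#average is my own illustrative method name, not part of the standard library):

```ruby
# Reopening a core class: Ruby lets you add a method to Array directly,
# where the Java reflex is a static helper in some Util class.
class Array
  def average
    sum.to_f / size
  end
end

# Blocks with map, instead of each_with_index bookkeeping:
squares = [1, 2, 3].map { |n| n * n }   # [1, 4, 9]
puts [2, 4, 6].average                  # prints 4.0
```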
My rant is getting long, but the main point is this: learning syntax for a new language is easy. Learning to use that language properly (much as a screw is used differently than a nail) is crucial to being able to work with other people, and getting anything meaningful done.
Re:Java == Jobs (Score:5, Insightful)
If you look at developers who spend a lot of time doing things in C (e.g., the OpenBSD developers-- have a look at their repository [openbsd.org]), you'll see that they are keenly aware of "object-oriented" design principles. They also tend to know exactly when things like byte alignment is an issue, and when you really should just use a void pointer, because they are forced to think about their machines. Most OO programmers I know have no idea why they would need OO language features-- they just use them because that's what they've been taught-- and they know next to nothing about the machines themselves. I would argue that a good programmer is a good programmer; and if they have standard procedural programming experience, that will nicely complement their future OO work.
GP is right-- OO is simply a design philosophy. The actual mechanics of building an application are no different.
Re:Java == Jobs (Score:4, Insightful)
Mostly the classes were theory, concepts; stuff that applied equally to all languages. It was weird in some ways; I had a networking class that assigned the eternal "create a server/client chat program" project, where part of the project was a Java GUI. At this point, I'd never programmed a GUI, and neither had anyone else I talked to. The response of the TA (who was the only one who'd ever give programming advice, because the professor only dealt in theory), was that GUI design was beyond the scope of the class and we'd just have to figure it out.
My method for figuring it out involved downloading a Java editor, and using the GUI design tools. It was the first time I'd used a graphical editor for Java; it was encouraged to do the work in VI or Emacs, and generally, that's all we did.
Now I hated that crap at the time, but nothing has prepared me better for my day to day life than having projects dumped on me where I had to goddamn well use my initiative and figure it out. Over and over again, I was forced to go out and read and work out for myself how to translate the theory into code. These days, I program in Java about 20% of the time. I'd hardly say it stunted my abilities, and it certainly didn't make me into a cookie cutter corporate programmer.
I'd have to say that specifically teaching any language is a problem. They all come in and out of fashion. I work with a guy whose mind is stuck in Visual Basic...And I don't mean
Re:Java == Jobs (Score:4, Interesting)
But C and a C++ background will help you keep that job.
It's very hard to be taken seriously as a computer programmer, even a Java one, if you aren't able to understand memory allocation, pointers, etc. Oftentimes it's critical to understand that so you can understand what the garbage collector is doing.
I'm the director of technology for a small/medium company (about 5 programmers), and when I'm interviewing a new candidate for a Java position, I ask them to explain to me how they would write a linked list in Java without using the collection classes.
Interesting some of the responses I get.
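For what it's worth, a minimal sketch of the sort of answer that question is after (Node and IntList are my own names; a fuller answer would also handle removal and out-of-range indices):

```java
// A singly linked list built from scratch, no java.util collections.
class Node {
    int value;
    Node next;
    Node(int v) { value = v; }
}

class IntList {
    private Node head;

    void add(int v) {
        Node n = new Node(v);
        if (head == null) { head = n; return; }
        Node cur = head;
        while (cur.next != null) cur = cur.next;  // walk to the tail
        cur.next = n;
    }

    int get(int index) {
        Node cur = head;
        for (int i = 0; i < index; i++) cur = cur.next;  // follow the links
        return cur.value;
    }
}
```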
software engineering != computer science (Score:5, Insightful)
that's true, but again software engineering/programming is a subset of computer science (maybe; I suppose you could argue they aren't)
"Computer science is no more about computers than astronomy is about telescopes."
- Edsger Dijkstra
Re:software engineering != computer science (Score:5, Insightful)
In practice systems have to work 100%, and when your graph search algorithm (by Dijkstra naturally) segfaults due to dereferencing a wrong pointer then computer science is very much about computers.
I'm just worried that too few students these days know assembly and C, which leaves us in a predicament when the current generation of kernel devs retire.
Re:software engineering != computer science (Score:5, Insightful)
Pointers aren't rocket science. If you never perform an operation where you haven't first met the operation's preconditions, you never get a pointer error.
If you aren't rigorously checking preconditions on *every* operation you perform, you're not going to cut it as a kernel dev anyway. Pointers are the least of your problems. Race conditions can become exceptionally hard to reason about. The prudent kernel dev architects the system such that this doesn't transpire. That requires a whole different galaxy of aptitude beyond not leaking pointers.
When I first learned C in the K&R era, I thought those greybeards were pretty clever. Then I came across strcpy() and I wondered what they were smoking that I wasn't sharing. I thought to myself, there must be some higher-level idiom that protects against buffer overflow, because no sane architect would implement such a dangerous function otherwise. Man, was I ever naive.
More likely, too many of them had learned to program on paper teletypes, and just couldn't bring themselves to face having to type unsafe_strcpy() when they had reason to know it would work safely and efficiently.
The C language deserves a great deal of shame in this matter of giving many beginning programmers the false impression that any function call should dispense with formal preconditions.
Interestingly, if you sit down to implement an STL template algorithm manipulating iterators, it proves pretty much impossible to avoid careful consideration of range and validity.
OTOH, C++ makes it charmingly easy for an object copy routine, such as operator=(self& dst, const self& src) to make a complete hash of managed resources if you fail to affirm dst != src.
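A sketch of that hazard with a toy resource-owning class (Buffer is my own example): without the self-check, `b = b` deletes the buffer and then copies from the freed memory.

```cpp
#include <cstring>

class Buffer {
    char *data;
public:
    explicit Buffer(const char *s) : data(new char[std::strlen(s) + 1]) {
        std::strcpy(data, s);
    }
    Buffer(const Buffer &other) : data(new char[std::strlen(other.data) + 1]) {
        std::strcpy(data, other.data);
    }
    Buffer &operator=(const Buffer &other) {
        if (this != &other) {          // without this check, b = b frees
            delete[] data;             // 'data' and then copies from it
            data = new char[std::strlen(other.data) + 1];
            std::strcpy(data, other.data);
        }
        return *this;
    }
    ~Buffer() { delete[] data; }
    const char *c_str() const { return data; }
};
```

Copy-and-swap is the other common way to discharge the same precondition without an explicit check.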
There are plenty of amateur mathematicians who can manipulate complex formulas in amazing ways. The difference with a professional mathematician is that the necessary conditions for each transformation are clearly spelled out.
A = B ==> A/C = B/C iff C != 0
A > B ==> C*A > C*B iff C > 0
Infinite series must converge, etc.
I'm not even getting into defining A,B,C as fields, groups, rings, monoids, etc. for maximum generality.
Yet the average programmer feels sullied to exercise the same intellectual caution manipulating pointers. I've never understood that sentiment. My attitude is this: if that's how you feel, get your lazy coding ass out of my interrupt handler; go code a dialog box in some Visual Basic application that won't work right no matter what you do.
Why did the software industry play out this way? Other professions have much harsher standards. Primarily because software was in an exponential expansion phase, any work was regarded as better than no work (perhaps falsely), and industry couldn't afford to reduce the talent pool by demanding actual talent.
Now we've allowed many people to enter the profession without comprehending the rigors of preconditions. It's as if we had taught a generation of lawyers how to practice law, but omitted liability. Oops. What to do about it? Invent Java, and tell all these programmers it wasn't their fault in the first place.
So yes, Java doesn't teach very darn much about the harsh realities of actually thinking. And since thinking is hard, it's an impediment to productivity anyway, so it hasn't much been missed. The only thing we lost in the shuffle is our professional self respect.
Re:software engineering != computer science (Score:5, Insightful)
Naive about the purpose of C, anyway. C was never designed to prevent you from shooting yourself in the foot. Writing C requires you to think, which is sadly out of vogue these days, as you point out later. C was never designed to protect you from yourself, as explicitly pointed out by Dennis Ritchie many times. If you want a language that will protect you from yourself, program in VB.
So yes, Java doesn't teach very darn much about the harsh realities of actually thinking.
But C obviously does - like checking boundary conditions. I don't understand how you can slam C in one breath, then praise it in the next.
Re:software engineering != computer science (Score:5, Insightful)
I do advocate designing primitives, as essential to the language as the C string functions, that powerfully remind the programmer of their logical obligations and support them in reasoning correctly about those obligations, without having to digest 15 lines of preceding context to see that calloc() provided the implied terminating NUL.
strlcpy and strlcat - consistent, safe, string copy and concatenation [gratisoft.us] by Todd C. Miller and Theo de Raadt, OpenBSD project
Weirdly enough, the Linux people are about the only major group of people that has constantly stayed deaf to these arguments. The chief opponent to strlcpy in glibc is most certainly Ulrich Drepper, who argues that good programmers don't need strlcpy, since they don't make mistakes while copying strings. This is a very mystifying point of view, since bugtraq daily proves that a lot of Linux and free software programmers are not that bright, and need all the help they can get.
One must recognize that in a solid code base, thinking occurs more often while reading code than writing code. Correctness is not a write-only proposition in any living code base.
The original C string functions were (and remain) a pedagogic disaster. Most beginning programmers failed to realize how much thinking had been folded into the surrounding context. If they were reading K&R, that thinking existed. If they were reading any code they had at hand, it likely didn't, by any survey of average C code quality ten years later. With the original string functions, whether this careful thinking existed is not obvious without doing a lot of mental work, and that work has to be repeated *every time* the code is seriously reviewed.
Worst of all, the strcpy() function seemed to imply "buffer overflow is no great concern, we're not even going to give you a single argument on this very dangerous function to help you avert it". It was a false parsimony to save that extra argument in the default case.
This isn't at the level of whether the handgun has a safety or not. It's at the level of whether it is possible to chamber a round too large for the barrel. I can point the gun successfully, but I'd greatly prefer it not to detonate in any other direction.
A more thoughtful C string API would have averted mistakes on the magnitude of chambering bad ammunition, without encumbering the pointy end in the slightest, or failing to endanger the programmer's foot.
Re:software engineering != computer science (Score:4, Insightful)
I know, it's insane nowadays! Why, just the other day I was just remarking to my barefoot wife, who was cleaning our clothes by beating them against the rocks in the river, and then rolling them through a wringer, that all these fancyboys with their pushbutton machines and laundry powder just don't value godly, manual labor anymore! Then I went to my factory job where the machines have no safety features, so they really require you to think, which is sadly out of vogue these days with OSHA and safety goggles and whatnots.
The shame!
--Rob
Re:software engineering != computer science (Score:5, Interesting)
With Java and most other 'friendly' languages you have literally no way of knowing what is going on under the hood unless you are prepared to invest a lot more time and effort than is available to the average comp-sci student.
With C that's as close as a single flag on your compile line and you can study the generated code until you're tired of it.
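For instance, with gcc or clang (the flag spelling varies by compiler):

```c
/* square.c: compile with "cc -S -O2 square.c" and the compiler emits
 * square.s, the generated assembly, instead of an object file, so you
 * can see exactly what the hardware will be asked to do. */
int square(int x)
{
    return x * x;
}
```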
Re:software engineering != computer science (Score:5, Insightful)
This goes for most of the so-called 'insecure' functions in C: they only become insecure if you have already messed up at an earlier stage of your code. If you are aware of the limitations of the standard library routines (even the unsafe ones) and you are operating in a 'hostile' environment (and today's internet certainly qualifies as such), then you'll need to take great care to accept only input that matches the assumptions your code makes further down; if not, you are in trouble. But good programmers will work like that anyway.
It's perfectly possible to write crappy code in *any* language, not just in C (though, in the words of one old-timer programmer, 'C is like a racecar: you can cut corners, but if you do that too often you'll end up on your side').
To come back to a fairly well thought out piece with an answer like what was written several levels above here is not in any way helping the discussion, it is simply insulting.
And ignorance is key to bad habits (Score:5, Insightful)
Guess what? Even in Java, pointers still come to bite you in the arse when you least expect them. I see people every day who have trouble understanding the difference between "==" and "equals()" in Java, because they never learned the pointers behind them. They're essentially one abstraction level too far from understanding what their own code is doing.
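The classic demonstration, for anyone who hasn't been bitten yet:

```java
public class EqualsDemo {
    public static void main(String[] args) {
        String a = new String("java");
        String b = new String("java");
        // == compares references: under the hood, pointer identity.
        System.out.println(a == b);      // false: two distinct objects
        // equals() compares the characters the references point at.
        System.out.println(a.equals(b)); // true
    }
}
```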
Or even in Java, learning why you can't modify an "int" parameter but you can modify the contents of an "int[]" parameter: guess what? Requires pointers. People end up doing all sorts of unnatural mental contortions to remember when passing by value isn't really passing by value, when "it's a pointer" would sum it up perfectly.
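A short sketch of that int vs. int[] behavior (class and method names are mine):

```java
public class PassDemo {
    static void bump(int n)   { n = n + 1; }       // mutates a copy of the value
    static void bump(int[] a) { a[0] = a[0] + 1; } // the copied reference still
                                                   // points at the same array
    public static void main(String[] args) {
        int x = 1;
        int[] xs = {1};
        bump(x);
        bump(xs);
        System.out.println(x);     // 1: the original is untouched
        System.out.println(xs[0]); // 2: the array contents changed
    }
}
```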
And it shows. I've had people come to me half a dozen times with basically the same idiotic "auugh! Java's Hashtable is broken! I added a new value, and when I look into its array with a debugger it replaced my old one!" When in fact it had only added a node to the front of the linked list. But they don't know what a linked list is, nor what a hash table really is, nor how a Node can contain another Node, without a concept of pointers. Worse yet, not only do I see them spending a week debugging Hashtable, I see piss-poor workarounds done to prevent it from doing its job.
Or I see burger-flippers-turned-programmers occasionally get the real programmers fired for doing the right thing. Like using a "==" where it's correct to use it. But the burger flipper doesn't understand that. He learned some "for String use equals()" mantra, and he'll apply it and preach it, cargo-cult style, without even understanding what he's _doing_.
Or I see people think that optimization means replacing two lines with a one-line call, because they have no fucking clue what the machine does with that code. They think that speed is measured in lines of code, because no one explained otherwise. So they wonder why replacing two ifs with a catch is actually slower. (And I'm not getting into the many ways such a catch can make the code less secure, for example by assuming that a real exception is just their loop reaching the end of the array.) Exactly what throwing an exception does is a mystery to them.
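For the record, a sketch of the "replace the check with a catch" anti-pattern (names are mine):

```java
public class CatchDemo {
    // Idiomatic: an explicit bounds check per iteration is cheap.
    static long sumChecked(int[] a) {
        long total = 0;
        for (int i = 0; i < a.length; i++) {
            total += a[i];
        }
        return total;
    }

    // "Fewer lines", but worse: constructing and throwing the exception
    // (stack trace and all) costs far more than the check it replaced,
    // and it also masks any *real* out-of-bounds bug inside the loop body.
    static long sumCaught(int[] a) {
        long total = 0;
        try {
            for (int i = 0; ; i++) {
                total += a[i];
            }
        } catch (ArrayIndexOutOfBoundsException e) {
            return total;
        }
    }

    public static void main(String[] args) {
        int[] data = {1, 2, 3, 4};
        System.out.println(sumChecked(data)); // 10
        System.out.println(sumCaught(data));  // 10
    }
}
```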
Etc.
No, no one said you must keep programming in "a language where even K&R wrote unsafe code," nor that difficulty equals worthiness. But it helps to be at least exposed to those concepts once, even if thereafter you go on to program in Java or VB for the rest of your days. The fact that you worked with pointers once in C and managed to get them right _will_ show in your Java code too.
Probably the best thing that helped my coding was doing assembly on my parents' old home computer, back in high school. In hex, in fact, because that ZX-81 with 1K of RAM didn't even have enough room for an assembler. Wrap your mind around _that_, if you think C is too hard.
Would I advise anyone to write a production program in assembly nowadays? Nope, God forbid. I wouldn't have advised writing a whole program in assembly even back then. But understanding the machine behind that high level stuff will show even in your Java code.
And, yes, not every architect needs to be a Michelangelo. But it helps if they're not a clueless moron who can't even build a doghouse right. You can see plenty of architects nowadays who can't even get a basic house right. They know how to draw an artsy sketch of a house, but they have no clue how to calculate it to actually stand upright or what materials to use so it doesn't get damaged by rain within a year or two. And/or need a civil engineer to fix their elementary mistakes. Maybe it wouldn't hurt that much if they knew a bit more, ya know?
Just as an extra anecdote (Score:5, Funny)
Yeah, that guy was quite a bit less than a Michelangelo.
Re:software engineering != computer science (Score:5, Insightful)
As I said, I'm not a programmer. I could (if I had to) model the frequency response of a simple mechanical system to a range of perturbations by hand. The chance that I'd have to do that in the course of my professional employment is so slim as to be laughable. Yet the fact that I could do this (if I really had to) tells me that I don't want to put an eccentric load on a rotating shaft without a lot of careful consideration. Now, if I sort of knew this was a bad thing but didn't really understand why, I might do something silly like put only a single U-joint in a shaft. After all, it provides flexibility, and as long as the shaft is straight there is no eccentricity. The problem occurs when there is a deflection; then your single U-joint translates a nasty sinusoid downstream. If you do that, things tend to break.
Now, I agree that the university should have some courses focused on things that practicing professionals in the field use. I could draw a part by hand (if I really had to), but if I've never seen CAD before, I'd be at a serious disadvantage if I ever wanted to be a machine designer. However, fundamentally, a university is an academic institution. The suggestion that it should be an employment mill would severely compromise our education system.
Re:software engineering != computer science (Score:5, Interesting)
That they didn't know C wasn't too surprising. That they didn't have more than a basic grasp of memory management was shocking. They were also completely baffled when it came to not using an IDE to develop software. Makefiles had to be explained several times.
I've grumbled many times about this concentration on Java, and the resultant lack of detailed understanding about programming, but each time I did so at my university I was disregarded, and someone always trotted out that age-old nonsense about "not re-inventing the wheel".
I mean, sure, I see the point, but surely you should have a basic idea of how wheels are made?
Re:software engineering != computer science (Score:5, Insightful)
When I took data structures, and we used C++, I didn't have mental convulsions because Java had wrecked my thinking so badly (although I did have mental convulsions because C++ is incredibly messy to read at a glance); I learned different ways of doing things. So maybe these professors should look at whoever's teaching these kids so sloppily, not the language.
Re:software engineering != computer science (Score:5, Insightful)
Employers want nothing more than easily replaceable drones who come with an easily definable skill set and can be replaced when a new buzzword comes along. This is NOT what universities should be pandering to.
Re:software engineering != computer science (Score:5, Interesting)
I was rolling on the ground laughing when I saw the problems people were having making a simple Sudoku program in C#.
Once they were done drag/dropping all the UI elements, they all got stuck.
Mind you, you're right that they're teaching what they think employers want.
We didn't get to see a *nix system, let alone use one.
Although that may be because of that rather large donation Microsoft gave.......
Re:software engineering != computer science (Score:4, Insightful)
I code mostly in Java in my professional life, but when I was in school we were forced to diversify, and it was a definite plus.
The intro course used mostly JavaScript for some reason (!), but other (even relatively low-level) courses required projects written in C, Scheme, and Java. I took an operating systems course where we had to write a project in some kind of mini-assembly language... it's all a bit fuzzy now (I graduated 10 years ago), but I remember it being tough to wrap my head around for a while. And that's a good thing, right?
I also did a couple of summer-long solo projects that probably taught me more than anything -- just fighting through the problems on my own, learning the hard way about the value of clean code, OO, version control, debugging skills, etc. etc..
Perhaps obviously, I'm a much better *Java* developer than I would have been without the other stuff. So I agree wholeheartedly that students must learn more than one language in their schooling -- whether for professional or academic reasons; you simply have to flex your thinking in more than one way if you're going to learn.
It also strikes me as a tough way to learn... how do you learn what X language is, and the reasons behind its design, totally in isolation? How do you learn what OO is if "functional" is a meaningless concept to you?
Re:software engineering != computer science (Score:5, Insightful)
Re:Pointers, References and Performance (Score:5, Informative)
try {
    int[] fds = new int[2];
    pipe(fds);
} catch (MemoryAllocationErrorOrSo e) {
}
Why does this need to be so complicated [...]
Re:Pointers, References and Performance (Score:4, Informative)
Of course, because while it's possible to run out of heap space (implying a possible OutOfMemoryError), computers have had infinite stacks since the 1960s. Simply by moving the memory over to the stack, you can't possibly run out anymore! It's brilliant!
Hint: it's all the same memory. Your desperate desire to have the bytes come out of the "stack" bucket instead of the "heap" bucket is simply irrational.
Re:Pointers, References and Performance (Score:4, Informative)
And because it's like that, you have heap allocations for every non-atomic data type, which is really the opposite of performance.
Not really, no. The just-in-time compiler performs pointer escape analysis for the allocated objects and only uses the heap for the ones where heap allocation is actually necessary; the rest use the stack regardless of how the programmer wrote the declaration.
Admittedly, it's taken a while for this optimisation to be included, but it is there in the latest versions of Java.
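As a sketch of what escape analysis looks at (class name is mine; whether the allocation is actually elided depends on the JIT and flags like HotSpot's -XX:+DoEscapeAnalysis):

```java
public class EscapeDemo {
    static class Point {
        final int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    // The Point never escapes this method (it isn't returned, stored in a
    // field, or passed elsewhere), so the JIT may keep its fields in
    // registers or on the stack instead of allocating on the heap.
    static int distSquared(int x, int y) {
        Point p = new Point(x, y);
        return p.x * p.x + p.y * p.y;
    }

    public static void main(String[] args) {
        System.out.println(distSquared(3, 4)); // 25
    }
}
```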
You have to start somewhere... (Score:5, Interesting)
Yeah, I just read a press release from the FAA blasting driver training courses. Apparently, flight students who just got their drivers licenses were not able to navigate in the air, execute banks, take-off, or land properly.
Students have to start somewhere. It's easier to start with simple stuff than to try to cram their heads full of everything all at once.
Re:You have to start somewhere... (Score:5, Insightful)
It is reasonable to expect that a CS student has both the ability and the interest it takes to learn all the details of programming well in C.
Start simple and use different types of languages (Score:3, Insightful)
I would think it better to have a functional language next. Students are much more receptive in the earlier years, and functional programming does take some getting used to. I don't know enough to recommend specific languages.
After tha
Re:You have to start somewhere... (Score:5, Interesting)
I started programming in Pascal, and then moved to C/C++. Structured programming, language syntax, variable typing, functions, parameters, recursion, etc I could ALL learn in Pascal.
When I came through, Java was still pretty new, but I did take a Java course and found it reminded me more of Pascal than C/C++; I'd say it's a good starter language.
Also, you can easily write command-line apps in Java, so I don't know why they blamed GUI dependency on Java.
And as for 'systems programming', well, DUH. Your first language is where you learn the basics of programming; before you start taking systems programming you should also have a lower-level course, ideally in something like assembly language (even if it's just on emulated hardware) or C.
Re:You have to start somewhere... (Score:4, Interesting)
Java is also a lousy "beginner's" language, because its reliance on standard libraries leads beginners to look for pre-packaged solutions rather than writing their own. That was one of the main arguments against Java in the paper, and it was a problem even a decade ago when my school was transitioning beginners' classes to Java (I was ahead of the change by a semester or two in each class, so I got to start from Scheme, learn data structures in C++, learn AI in Lisp, etc.). Yes, in the "real world" you don't want everybody reimplementing their own linked list or hashtable. However, a beginner must learn the concepts behind those data structures in order to advance, and Java just makes it too easy to use the standard set of classes.
That's not to say that Java is all bad. With a good teacher and a good curriculum, it's absolutely possible to teach core concepts in Java (or any language, really). You have to be merciless about banning standard library usage such as collections, and teach your students the theory behind those data structures. People understand theory best when they can actually see it in practice, so you have to have your students implement their own linked lists, doubly linked lists, trees, etc. With Java it's an uphill battle getting people to ignore the standard libraries for "academic" purposes, but it's possible to do.
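The kind of exercise meant here might look like this (a minimal sketch, not java.util.LinkedList):

```java
// A bare-bones singly linked list of the sort a first-year student would
// be asked to write instead of importing the collections framework.
public class IntList {
    private static class Node {
        int value;
        Node next;
        Node(int value, Node next) { this.value = value; this.next = next; }
    }

    private Node head;
    private int count;

    void addFirst(int v) {
        head = new Node(v, head); // the new node's next pointer is the old head
        count++;
    }

    int get(int index) {
        Node cur = head;
        for (int i = 0; i < index; i++) {
            cur = cur.next; // follow the pointers, one hop per element
        }
        return cur.value;
    }

    int size() { return count; }

    public static void main(String[] args) {
        IntList list = new IntList();
        list.addFirst(3);
        list.addFirst(2);
        list.addFirst(1);
        System.out.println(list.size()); // 3
        System.out.println(list.get(0)); // 1
        System.out.println(list.get(2)); // 3
    }
}
```

Once a student has walked those next pointers by hand, Hashtable's buckets stop being mysterious.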
Personally, I'm thankful that my first real programming language (not counting BASIC in its various forms) was Scheme, and that I was exposed to a number of languages through my college career (the aforementioned Scheme, C/C++, and Lisp, as well as ML, Java, and MIPS assembly), even though my current day job consists of C# and SQL. Because of my background I can easily pick up pretty much any language (and have done so several times), which gives me an advantage over those "programmers" churned out of today's Java-mill universities.
Re:You have to start somewhere... (Score:4, Insightful)
Which is exactly what we want people to do when they're off doing real work. One problem with teaching languages without such libraries is that people get used to writing their own rather than looking for pre-packaged.
There's no problem with basic data structures and algorithms in Java. You declare that certain libraries are off-limits for an assignment, and give a zero score to anybody who violates that. The ones that actually belong in the field will catch on real quick.
Start with (Score:3, Interesting)
Though you may not use assembly language much, it helps to better understand what is going on under the hood.
Re: (Score:3, Insightful)
Language choice affected the content of both courses quite a bit. In the Java course, students spent more time understanding how specific data structures worked and working on more interesting programming assign
Re:You have to start somewhere... (Score:5, Interesting)
Assembly is necessary, to understand how a computer really works. Functional languages are good, just to know a completely different style. Some other language for breadth. Then the student can realise that everything after asm was a waste of time, and return to C.
Re:You have to start somewhere... (Score:5, Insightful)
Assembly is necessary, to understand how a computer really works. Functional languages are good, just to know a completely different style. Some other language for breadth. Then the student can realise that everything after asm was a waste of time, and return to C.
This is kind-of bollocks.
When I was a young programmer - which is about twenty-five years ago - the team I was on got a new ink-jet printer. It printed its own character set, we needed it to print bitmaps. The processor it used was one none of us had ever worked with before. One of the older members of the team - a guy called Chris Burton - took the spec sheet for the processor and the spec sheet for the printer home with him on the train, and came back the next day with the code for the new printer driver written in long hand, not in assembler mnemonics but in actual op-codes, in pencil on a pad of paper. It was burned on an EEPROM that day and drove the printers until that model became obsolete five years later - there were no bugs, it never needed fixing.
It should be said in passing that Chris had worked in his youth on the Manchester Mark One, and after he retired was part of the team that rebuilt Baby and got it running again.
I've always thought that was epic programming, a standard I'll never reach. But it's one particular layer on the stack. My job on that team was writing inference engines, and Chris was always really impressed by that. It's nearly thirty years since I touched any assembler and fifteen since I wrote anything serious in C. A modern computer system is way too complex for any single person to really understand, in depth, all the layers. I take what the silicon designers do as given, and likewise the microcode programmers. Right back in the early days of Linux I did fix issues in kernel code a couple of times but I wouldn't even try these days - the guys who do that are much more expert at it than I am. Likewise, I don't expect them to understand the compiler compilers that I write. It's a different layer on the stack.
I agree that you need to have a rough idea about how the whole stack works. But we no longer expect all computer science students to be able to wire up NAND gates from discrete valves or transistors. And although a computer scientist needs to know that there are primitive logic operations carried out on the metal, and that on top of that there are a stack of different software layers with real machine code on the bottom and a whole slew of intermediate code representations above that, I don't believe that it is any longer necessary for all students to be able to write a serious program in assembler.
1 language is damaging. (Score:5, Interesting)
In the course of my CS education (early '90s), they started with Pascal when they explained algorithmic basics.
Later courses were in C for OS and networking, while other courses used about everything from PROLOG to ADA.
You learn that some paradigms map to certain types of problems better (or worse) than others. So don't open sockets in Prolog (I have seen 'em do it, man), and don't do AI in C.
A quote: "If the only tool you have is a hammer, every problem looks like a nail."
Re: (Score:3, Insightful)
COME ON!
And don't tell me Java doesn't have pointers - what do you think references are? Glorified pointers with auto-null checks.
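A quick demonstration of both halves of that claim (plain Java, names mine):

```java
public class ReferenceDemo {
    public static void main(String[] args) {
        int[] original = {1, 2, 3};
        int[] alias = original;          // copies the reference, not the array
        alias[0] = 99;
        System.out.println(original[0]); // 99: both names point at one array

        original = null;
        try {
            System.out.println(original[0]);
        } catch (NullPointerException e) {
            System.out.println("NPE");   // the "auto-null check" firing
        }
    }
}
```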
One problem I've seen is Java Developer Syndrome (JDS) - think devs who don't know the difference between Java API names and datastructures that are used
Re: (Score:3, Funny)
And what's a SIGSEGV if it's not an auto-null check? ;-)
About the Authors (Score:5, Informative)
Edmond Schonberg, Ph.D., is vice-president of AdaCore and a professor emeritus of computer science at New York University. He has been involved in the implementation of Ada since 1981. With Robert Dewar and other collaborators, he created the first validated implementation of Ada83, the first prototype compiler for Ada9X, and the first full implementation of Ada2005.
Re: (Score:3, Insightful)
A Real Programmer Can Write in Any Language (C, Java, Lisp, Ada)
Why C matters...
Why C++ matters...
Why lisp matters...
Why Java matters...
Why Ada matters...
So, I don't think the article is biased.
Re:About the Authors (Score:5, Informative)
Students found Ada a relatively simple language to start with (if you choose an appropriate subset)
Java can have more overhead for a beginning student
Lecturers are often tempted to push a lot of "stuff" in intro subjects
Java GUI motivates some students to get more involved
Many of my students regretted that Ada would no longer be taught in first year (having quite enjoyed it)
No matter what you start with, teaching students to be better programmers takes more than just a language. Each language allows you to teach a specific set of skills, and Ada is not bad for teaching some important SE skills (IMHO).
I think pointers are overrated as a first year concept, and can wait for later years.
I think they have some credibility (Score:3, Interesting)
You might be right, but just because they're involved in Ada doesn't necessarily make them biased towards it -- it does mean that they probably know a lot about it. What actually matters are their teaching qualifications and their understanding of what's important.
They might just as easily have come to be involved in Ada because it met all their requirements as a good
Programming languages are tools, not religions. (Score:5, Interesting)
As someone who programs mainly in java, I have to say they have a point. Surely a degree in CS should get someone familiar with all forms of higher order programing (both OO and functional). They should also have a reasonably solid understanding of basic hardware architecture and how that affects programs.
Unfortunately this does not seem to be the case, at least in NZ. Some don't even know about basic complexity ideas and often have little to zero mathematics under their belts.
I did not do CS but physics. I was required to do assembly, BASIC, C, MATLAB, R, Lisp, Java, C++, Haskell, and a bunch of others I don't care to mention (like PLCs and FPGA stuff).
Biased? (Score:5, Informative)
Dr. Robert B.K. Dewar, AdaCore Inc. (President)
Dr. Edmond Schonberg, AdaCore Inc. (Vice President).
The article by some weird coincidence slams Java and praises Ada.
Salt, please...
PS, Ada is mainly alive in the Military/Aerospace industries where projects can last 20+ years.
Different goals (Score:4, Insightful)
On the other hand, Universities have a much different end goal. They want to teach such that completing their program means the student can go on to a Masters program, etc. Obviously, Java won't get students there without a massive amount of pain if they go on to further study.
Well, at least that's how it was, and how it should be. Currently, Universities are edging toward the College level. What this has produced is a massive gap between the knowledge/skill where the student is expected to be and where they actually are upon entering a Graduate program.
Unfortunately, this isn't just in CS. More and more I see Mathematics and Physics programs degrading as well. From what I've seen, this is due to Administration applying... pressure for high grades, etc. No grades, no funding. The cycle continues.
Though, I must point out that there are some Departments that are making an attempt at fighting back. Small in number they may be, there is still hope for a return to actual academics. Though, we'll see how that plays out. You never know, I wouldn't put it beyond a spiteful Administration to turn a Department into offering just service courses.
Java brings out the best and Worst in you (Score:3, Interesting)
To take apart their argument by logic:
1. Java is an abstraction, a VM over the hardware. People who study comp sci today are of three kinds:
a) Those who want to be a programmer and then a Project Mgr.
b) Those who want to maintain hardware and/or design new ones. iPod maybe.
c) Those who want to manage systems (20% hardware : 80% software).
For the first kind, Java is a better choice because it does not force you to dig down to machine code, which is unnecessary today (much like car driving in the 1920s versus the 1990s). It teaches you the best of programming by forcing you to think in terms of objects and how to act upon them in the real world. If you mess up, you don't overwrite the root disk, thus causing innumerable references to the time-worn joke about shooting oneself in the foot.
It also teaches you GUI writing is tougher by way of its MVC programming. At this time, programmers can be split into real men or MSFT weenies: real men would go to Java in Server-world. Weenies would love GUI and goto VB.
It also teaches you the worst in programming: Forcing you to think only in OO way.
The second kind is better off learning C or even assembly.
The third kind is tricky: There are lots of management tools nowadays. Some of them written in Java. if they want to write their plugins easily, then Java is the way to go.
2) Java is one more step in evolution which normally the professors hate because it moves them away from the machine. But mankind has more important things to do (watching LOST and Sopranos) than twiddling with RPG.
3) Blaming Java alone for problems is like blaming the Sea for causing Katrinas.
Lastly, if anyone should be blamed for warping the minds of youngsters permanently, it should be MSFT with its Visual Basic system.
Re:Java brings out the best and Worst in you (Score:4, Interesting)
Wrong. Java is a step *back*. Not because of the abstraction, mind you, but simply because Java is such a severely limited language. 30-40 years ago amazing new things were developed in CS, none of these are included in Java. Lambda? Map-Reduce? Purely functional code? Caml-style pattern matching? Conditions? People say Java made C++ obsolete, which is completely wrong, since C++ is far more powerful than Java (thanks to generic programming and metaprogramming). Just have a look at Boost, or C++0x, and try to replicate the features in Java - WITHOUT runtime overhead (this is one key feature of generic/meta programming - a lot of computation can be done at *compile*-time).
Actually, the next step in evolution should be Lisp, since it is one of the most powerful and flexible languages in existence. There is nothing in the language that intrinsically prohibits Lisp from performing as well as C++; it's mostly the tools that lack development (for comparison, g++ has had a hell of a lot of improvements; the 2.x series was awful). If you avoid certain things like eval, Lisp code can be optimized well enough. There is a Scheme compiler called Stalin which does just that, and it can even outperform comparable C code.
So, if you want to abstract away from the machine, why use Java instead of Lisp?
IMO there are other reasons for Java:
1) It's a braindead-easy language.
2) You only need mediocre programmers, since Java is easy enough for them, and there are many more mediocre programmers than good ones.
3) As a consequence, programmers are expendable, jobs can be outsourced, etc. One C++/Lisp/Haskell expert could replace an entire Java team. Not good for the company, since this guy becomes indispensable and can demand more salary, etc.
4) Universities can claim to have more success in teaching, since the number of guys with a CS degree is higher.
5) Companies need less teaching courses, because of (1).
6) Java has been overhyped for a while now, and many CTOs are so clueless, they just buy into it.
Well, you can disagree with my opinion about Java, but it is a fact that CS students *should* learn about the functional paradigm, what lambdas and closures are, and so on, as well as what pointers are, how memory works, what the garbage collector actually does, etc. However, this is not the case - and THIS is a serious problem.
Variety (Score:5, Interesting)
One thing I have noticed, though, is a complete lack of security-related training. Something about calling eval() on every input just to parse integers makes me cringe. I guess the idea is that worrying too much about standard practices keeps people from thinking creatively, or something. Unfortunately, it also seems like a good way to get into a lot of bad habits.
Why not D? (Score:4, Interesting)
Sooner or later, languages are going to evolve, and surely it's only a matter of time before something D-like is going to be used anyway. Might as well make the switch sooner rather than later.
Why is "Computer Science" Staffing S/W companies? (Score:5, Interesting)
I totally agree that universities shouldn't be teaching Java exclusively. They need to teach the basics of modular, functional, declarative and oo languages. Why? Certainly *not* to fill "software engineering" positions!!! A university's role is to do research, not to act as some technical college. OK, I can see having a programming course aimed at creating programmers for industry if it's going to pay the bills at the uni. But *don't* make that your "Computer Science" course!!
Computer Science should be science (well, math anyway). Universities should be getting the 5 or 10 graduates they need who will move on to academia (or industry research) later in their careers. Because right now, *nobody* is getting taught Computer Science! Lately I've been reading papers posted on http://lambda-the-ultimate.org/ and regularly I have to go back to the basics and learn extremely fundamental theory, because nobody *ever* taught it to me in the first place. Half the time I think, "OMG, I never even knew this existed -- and it was done in 1969!!????"
More and more lately, I've been wanting to phone my University up and ask for my tuition back.
If you want to learn how to program in a professional setting, there's nothing better to do than just start writing code. Get your chops up. Then find some big free software projects and start fixing bugs. Learn how to use the tools (configuration management, etc). Learn how to interact with the other programmers. That's all you really need (well, that and a quick automata and grammar course so that I don't have to look at yet another context free grammar being "parsed" by regular expressions).
But right now, where do you go if you want to actually learn theory? I guess the library... And getting back to the point, this is essentially what the paper is suggesting. Students need to learn all these things because they are relevant to the field. A university supports industry by doing basic research. If you don't understand the concepts that they point out, you just can't do that. Paraphrasing from the article, having a university course that's meant to pad out a student's resume is shoddy indeed.
Java for Dummies (Score:5, Interesting)
Java is fine for teaching design patterns, and classical algorithms like Quicksort, or binary search.
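For instance, binary search is a comfortable fit for Java (a standard textbook version, nothing Java-specific assumed):

```java
public class BinarySearchDemo {
    // Classic iterative binary search over a sorted array: returns the
    // index of target, or -1 if it's absent.
    static int binarySearch(int[] a, int target) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2; // avoids overflow of (lo + hi)
            if (a[mid] == target) return mid;
            if (a[mid] < target) lo = mid + 1;
            else hi = mid - 1;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] sorted = {2, 3, 5, 7, 11, 13};
        System.out.println(binarySearch(sorted, 7)); // 3
        System.out.println(binarySearch(sorted, 4)); // -1
    }
}
```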
But you can't do operating systems, and the success of Java in isolating you from any notion of the hardware is actually the problem.
We have already blacklisted courses like the one at Kings College, because they teach operating systems in Java.
Yes, really.
Their reason apparently is that it is "easier".
I have zero interest in kids who have studied "easy" subjects.
The world is a bigger, more competitive place; how many jobs do you think there are for people who had an easy time at college?
Java is part of the dumbing down of CS.
A computer "expert" is not someone who knows template metaprogramming in C++, or compiler archaeology in BCPL, or the vagaries of the Windows scheduler.
It is someone who understands computers at multiple levels, allowing them to choose which one illuminates the problem at hand.
To be wise in computers, you choose whether to think of something as a block of bytes, quadwords, a bitmap, a picture, or a buffer overflow pretending to be porn. If you also have the option of understanding flash vs. static RAM, virtual memory, or networked storage, all the better. I doubt if even 1% of CS grads could write code to turn a BMP into a JPG, or even explain the ideas behind it. In my experience, 50% could not work out how to develop a data structure for a bitmap that used palettes.
I have interviewed CS grads with apparently good grades who could not explain any data structure beyond arrays.
Any CS grad who sends us their CV with bullshit like "computers and society" or "web design" has their CV consigned to trash with no further reading.
A CS grad should be able to write a web server, not be an arts graduate who didn't get laid.
C++ makes you think at multiple levels, unlike Java, you simply cannot avoid thinking about your system from patterns to bytes. This may be good or bad for productivity, and I'm sure we risk a flame war here.
But I am entirely convinced you need to hack your way through a "real" system.
How can someone understand the Linux kernel without C & C++?
Is someone really fit to be called a computer scientist if, like more than 50% of the Computer "Scientists" we interview for very highly paid jobs, they show actual fear of working at that level?
They have the same "way above my head" attitude that a mediocre biologist might have to applying quantum theory to skin disease.
Partly, as in the Kings College debacle, it is lazy, mediocre lecturers; but also, CompSci grads frankly are not that smart, so they need their hands held.
Although the seats get filled, the quality is in monotonic decline.
Re:Java for Dummies (Score:5, Insightful)
Java is the new COBOL. And we will regret it in 20 years for much the same reasons.
It actually gives me hope that you have recognized this in hiring practices. That a CV with a list of Sun's Java buzzwords is not an indication of a useful programmer.
I was disturbed in college (1997-2001) that things were changing towards Java and other idiocy. Too many people didn't get pointers and other basic concepts, and Java was hiding them even more. I believe it was the one class we had in assembly programming that really pointed it out - when confronted with having to deal with real hardware, most of the students didn't know what to do. Concepts like "two's complement" vs "one's complement" caused a strange brain-lock for them, as they were so sheltered from the actual binary math and hardware of the computer.
It was only a handful of us who had been programming for years already (yay for the Atari 800XL) who had any idea of what was going on. The college (UC Davis) skipped entirely over very basic concepts like von Neumann architecture. I ended up spending most of my time trying to help my fellow students; there were so many fundamentals missing.
I think the most frightening part was having to yell at one of the professors one day, because the basic data structures he was teaching were being done incorrectly. He was teaching people to leak memory. ("Let's allocate a huge linked list, and then just set the head pointer to NULL and consider it freed!")
Sigh. It was frightening then, and apparently all my fears were justified, as now the entire discipline is getting a bad reputation. Unfortunately, I can't exactly disagree with that reputation from some of the CVs I've seen recently. My degree is destined to be fscked, apparently.
You hiring? ^_^
Re: (Score:3, Insightful)
Just to harp on one statement in your comment: this is either an absurdly basic demand to make of programmers, or an absurdly complex one. On one side, it is as simple as making a function call, bmp2jpg(...) or whatever it is in the libraries they are using. The process of doing this is simplistic, and taught to the Java mentality of "find the right tool in the shed to do your entire problem."
You're barking mad. (Score:5, Insightful)
So, what, you're going to hire math geeks only? People with degrees in mathematics or operations research, or perhaps some of the hard sciences? In my own experience, while there are some non-CS degrees that are excellent preparation for a CS career, only a CS degree is a CS degree. It is lamentable that some schools have embraced the trade-school mentality, but many more have not. When I was teaching courses as a graduate student (just a couple of years ago), the curriculum began with Java and quickly shifted to Haskell. A neighboring institution still uses Ada as an undergraduate language. There's also a legion of Knights of the Lambda Calculus who are trying to get Scheme reintroduced to the undergraduate curricula in several institutions in the area. Intellectual diversity about languages is alive and well in the academy, based on the institutions I've seen up close and personal.
Also, who is this "we"? You and someone else who shares your prejudices? Or is this you and the senior engineering staff? If you're about to decree CS a non-degree, maybe you should get the input of the people who will be most brutally affected by your shortsightedness.
So glad to know that you think design patterns and classic algorithms are worth studying.
Look, pick up a copy of Cormen, Leiserson, Rivest, and Stein's Introduction to Algorithms sometime. That's the definitive work on algorithms: if you need an algorithm, it's probably sitting in CLRS somewhere, along with some beautiful mathematical exposition about it. Every algorithm listed in the book can be trivially converted into Java. So why the hate for teaching CS with Java? It's a perfectly sensible language for many very important parts of CS.
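For example, the book's early insertion sort drops into Java almost line for line. This is my own transliteration, not text from the book:

```java
import java.util.Arrays;

public class InsertionSort {
    // CLRS-style insertion sort, transliterated into Java.
    static void insertionSort(int[] a) {
        for (int j = 1; j < a.length; j++) {
            int key = a[j];
            int i = j - 1;
            // Shift elements of the sorted prefix a[0..j-1] that are
            // greater than key one position to the right.
            while (i >= 0 && a[i] > key) {
                a[i + 1] = a[i];
                i--;
            }
            a[i + 1] = key;
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 4, 6, 1, 3};
        insertionSort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 4, 5, 6]
    }
}
```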
Further, I've taught operating system design in Python. Yes, Python. When talking about how a translation lookaside buffer works, I don't write C code on the board. I write pseudocode in Python and say "so, this is how it looks from twenty thousand feet." On those rare occasions when we have to get down and dirty with the bare metal, then it's time to break out C--and we leave C behind as soon as possible. I want students to be focused on the ideas of translation lookaside buffers, not the arcane minutiae of implementations.
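To be clear, the twenty-thousand-foot view works in any high-level language. A rough Java sketch of the same board pseudocode (all names and the page size are my own illustrative choices, not from any real OS):

```java
import java.util.HashMap;
import java.util.Map;

// Twenty-thousand-foot sketch of a TLB: a small cache mapping virtual
// page numbers to physical frame numbers. Purely illustrative; a real
// TLB is hardware, not a HashMap.
public class TlbSketch {
    private final Map<Long, Long> tlb = new HashMap<>();
    static final int PAGE_SHIFT = 12; // 4 KiB pages

    long translate(long virtualAddress) {
        long vpn = virtualAddress >>> PAGE_SHIFT;
        long offset = virtualAddress & ((1L << PAGE_SHIFT) - 1);
        Long frame = tlb.get(vpn);
        if (frame == null) {
            frame = walkPageTable(vpn); // TLB miss: consult the page table
            tlb.put(vpn, frame);        // ...and cache the translation
        }
        return (frame << PAGE_SHIFT) | offset;
    }

    // Stand-in for a real page-table walk.
    long walkPageTable(long vpn) { return vpn + 100; }

    public static void main(String[] args) {
        TlbSketch t = new TlbSketch();
        System.out.println(Long.toHexString(t.translate(0x1234)));
    }
}
```

A real TLB is a small, fully associative hardware cache with eviction and permission bits; none of that is modeled here, which is exactly the point of the high-level view.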
After all, implementing it is their homework, and it involves hacking up the Minix code. In C.
If it was an easy subject, would changes need to be made to make it easier?
If it was a spectacularly hard subject with a 50% washout rate, would changes need to be made to make it easier?
I've been in courses where 50% of the class washed. They were horrible, horrible classes. The pedagogy needed to change. The learning curve needed to be smoothed out and made gentler. This is in no way equivalent to saying it was made easy. The fact you think otherwise brands you as an intellectual elitist who can't be bothered to think logically about his own prejudices.
"Sure I know C!" (Score:5, Insightful)
These students are being trained as engineers. They shouldn't be afraid of a little grease.
2 professors, 1 cup. (Score:3, Interesting)
I've taught grinds to first-year students in Ireland in Java (I'm SCJP 1.4/5) and their professors do not even allow the use of an IDE when coding. They also grade them on Java patterns and OO rather than knowledge of the language.
C/C++ have their place, but any good CS student normally learns a number of languages.
I can code in a number of languages, am certified in quite a few as well, and I've never used Ada. Considering both professors work for a company that sells Ada products, the article seems a little biased and uninformed about Java.
That's true (Score:4, Insightful)
I love Java, and I find it much more pleasant to use than C/C++, but I generally agree with TFA. I have seen many people doing things like this
concatenating strings with + inside a loop, which creates 10K temporary objects to construct a single string*. This is because they started learning programming at a high abstraction level, so they have no idea of what is going on behind the scenes. It's similar to starting programming with one of these new "intelligent" IDEs such as Eclipse, which do lots of things for you so you don't have to figure them out for yourself. I think all these abstractions are great for people who already know how to program, not for beginners. You wouldn't give a calculator to six-year-old kids learning math, would you?
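Concretely, the anti-pattern and what the abstraction hides look something like this sketch (the loop bound is arbitrary):

```java
public class StringConcat {
    public static void main(String[] args) {
        // Anti-pattern: each += allocates a fresh String (plus a hidden
        // builder), so 10,000 iterations churn out thousands of
        // short-lived temporary objects.
        String slow = "";
        for (int i = 0; i < 10_000; i++) {
            slow += "x";
        }

        // What the abstraction hides: one mutable buffer, appended in place.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 10_000; i++) {
            sb.append("x");
        }
        String fast = sb.toString();

        System.out.println(slow.equals(fast)); // true
    }
}
```

(As the footnote says, compilers can rescue this simple case; the allocation behavior, invisible to someone who never looked behind the scenes, is the point.)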
I personally would begin with C and then jump to Java. C is not so complicated as a first language if you don't introduce all its features from day one. It was my first language and I think it was a good choice: it shows you enough low-level concepts to be able to write efficient, optimised code in higher-level languages. Besides, when you later jump to a more high-level OO language you appreciate the difference and learn it with interest.
* I know modern compilers are able to optimise that automatically using a StringBuffer or StringBuilder. I just chose that (somewhat unrealistic) example for the sake of simplicity, but the same happens in other cases that aren't so easily handled by the compiler.
Better CS programs don't teach languages anyway (Score:5, Insightful)
At CMU the very first CS class (the one taken by losers like me who didn't AP out of it, mostly because my high school didn't even have computer classes, let alone AP computer classes!) really did focus on teaching a language, Pascal, and a significant part of the class was the learning of the language. It was the least useful CS class I took in the long run (not surprising, as an introductory course in any subject is likely to be the same). Subsequent courses would spend 1-2 weeks going over the fundamentals of the language to be used in coursework for the remainder of the class (which in some classes was C, in some C++, some used ML, others Scheme, etc.), to get everyone started; after that, you had to figure it out on your own while actually learning the theory being taught. It really isn't that hard to pick up a new language once you know a couple, although I did have a hard time with ML, mostly because I was completely unmotivated to learn it, feeling that it was absolutely useless to know (I was right).
No really good CS program has any classes with names like "Java 101" or "Advanced C++". To use a carpentry analogy, I would expect a really good carpentry school to teach the fundamental rules and "theory" of carpentry, so that the student upon graduation really understood what carpentry was all about and could apply their knowledge to aspects of the subject that they hadn't even encountered in school. I wouldn't expect a good carpentry school to have classes like "Advanced Hammering" and "Bandsaw 101". The courses would instead be "Introduction to House Frames" and "How to Construct Joints". You'd be expected to learn the use of the tools in the natural course of studying these subjects.
It's the same for CS. Good programs don't teach the tools, they teach the *subject*; learning the tools is intrinsic in the study of the theory.
Java is like "The Incredibles", or a circus (Score:5, Interesting)
Every time someone tells me that there are no pointers in Java I laugh a little. EVERYTHING in Java that isn't a scalar is actually referenced through pointers. That is, you declare the pointer variable and then "new" the object into place.
They are just incredibly _boring_ pointers. You cannot do math on them. There is no sense of location to those pointers. The absence of interesting pointer operations, and the absence of a _semantic_ _copy_ operation, is what all this alleged pointerlessness is about.
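A two-minute demonstration of just how boring they are (class and field names are made up):

```java
public class BoringPointers {
    static class Box { int value; }

    public static void main(String[] args) {
        Box a = new Box();   // 'a' is a pointer to a heap object
        Box b = a;           // copies the pointer, NOT the object
        b.value = 42;
        // Both names reach the same object: no semantic copy happened.
        System.out.println(a.value); // 42
        // But there is no pointer arithmetic: 'a + 1' does not compile.
    }
}
```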
I have only two _real_ problems with Java... (okay, three if you count the requirement that you constantly have to deal with checked exceptions even when you know they cannot really happen, and that if they did, you would want the thing to abort all over the place... but I digress)
(1) Java has no useful destructors because no object has predictable scope. If you think finalize methods are the same as destructors then don't bother responding; you don't know what destructors are...
(2) Since everything is a pointer in Java, you have to bend over backwards to pass by value. The language doesn't even begin to provide copy-construction semantics. What a miserable PITA.
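The bending-over-backwards looks roughly like this sketch (names are mine): since the language supplies no copy construction, every class that wants value semantics has to roll its own.

```java
public class CopyByHand {
    static class Point {
        int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
        // Hand-written "copy constructor": Java provides nothing like
        // C++'s Point(const Point&), so every class rolls its own.
        Point(Point other) { this.x = other.x; this.y = other.y; }
    }

    // The pointer is passed by value, so mutations reach the caller's object.
    static void mutate(Point p) { p.x = 99; }

    public static void main(String[] args) {
        Point original = new Point(1, 2);
        mutate(new Point(original)); // pass a defensive copy instead
        System.out.println(original.x); // still 1: only the copy was mutated
    }
}
```

In C++ a copy constructor and pass-by-value give you this for free; in Java the copy is always an explicit, hand-written convention.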
Now the _dumbest_ thing about Java is that they were so set against multiple inheritance that they never bothered to ask themselves why _every_ OO language starts out life without multiple inheritance only to have to add it later. By making everything a proper linear subclass of Object, they left themselves having to graft on "interfaces", which is just multiple inheritance with the "bonus" of completely preventing default implementations. (Which led to delegation, etc.)
The way the language keeps sprouting things it claims to never have and never need, well it's very like watching a clown car endlessly explode with ridiculous archetypes. After a while it just isn't funny any more.
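And indeed the sprouting continued: Java 8 eventually grafted default implementations onto interfaces, giving the multiple inheritance of behavior the language had originally sworn off. A minimal sketch:

```java
// Since Java 8, interfaces can carry default implementations,
// i.e. behavior inheritance from multiple interfaces after all.
interface Walks { default String move() { return "walk"; } }
interface Swims { default String paddle() { return "swim"; } }

// One class inheriting default behavior from two interfaces.
class Duck implements Walks, Swims { }

public class DefaultMethods {
    public static void main(String[] args) {
        Duck d = new Duck();
        System.out.println(d.move() + "/" + d.paddle()); // walk/swim
    }
}
```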
So yeah, teaching people Java as an introductory language is something of a disservice if you ever want them to truly think about programming and what makes some things machine-smart while others are machine-stupid.
--- BUT ---
I worked in education for years. The fundamental problem with computer science education is that it is being taught by computer scientists instead of educators. We are stuck learning from the people who learned from the people who made it up. None of these people ever learned to EFFECTIVELY IMPART INFORMATION.
Consequently, the students are largely unemployable on the day of graduation.
The classic computer curriculum seems to consist of throwing three or four languages at a kid in the hopes that they will "just kind of figure out this programming stuff."
The field of computer science has not yet come up with a "basic theory"... a starting place... The list of things a student simply must know before you start filling their head with syntax.
And so we are a bunch of prelates training our acolytes in our special, individualized deeper mysteries.
And that's what everybody is doing worldwide, so our graduates are just as lame as everyone else's...
Cue "Enter the Gladiators"...
This guy is about as unbiased as Stroustrup! (Score:5, Interesting)
Most CompSci college graduates are totally unproductive on their first job. They can be put to work on trivial things, but no matter what school they came from, they are going to need a lot of hand-holding to make it through the first year. That is just how it is. Doing coursework at school is no substitute for coding on a meaningful project, whether it be work-related, something open source, or just something for fun. That is the honest truth from a software developer of over 12 years now, and I don't consider myself even that seasoned in the field (maybe after 20 years I will feel differently).
Now, with respect to Java as an introductory programming language: it is not bad, but not great either. However, the purpose of any introductory course should be to capture the interest of the people curious enough to take the course in the first place. Back in college, we started with C (most of my peers had already been programming since they were teething, but this was CMU), and if not for my persistent no-quit attitude in life, I probably would have given up programming right then and there. Spending your entire night trying to debug a trivial program, not because you didn't understand the material but because of one stupid uninitialized pointer, turns off a lot of people who may have had the potential to be great programmers; because their first impression of programming was so bad, they gave it up before they got to learn how great programming really is.
Oh yeah, and the not-at-all-relevant math courses didn't help much either. Whenever in your career you need some advanced calculus or discrete math, you will likely have forgotten about 99% of it and need to look it all up in a book anyway. Besides, 99% of programming projects in the real world basically involve high-school-level algebra and not much else. What separates the productive programmers from the unproductive ones is not who got a better grade in their math course back in college, but who innately understands systems and is willing to make the extra effort to learn the gazillion design patterns available to programmers, so that when faced with a difficult project they will not waste inordinate amounts of time reinventing the wheel.
As for understanding computing at a rather low level, as in a class on operating systems: yeah, Java might not be such a great choice, but then again learning C is easy because C is made up of very simple constructs (C++ is another story). Using C productively just requires a crapload of practice and experience, not necessarily a whole lot of computing expertise. In addition, mastery of whatever APIs you happen to be basing your career on is paramount as well. In the real world, employers don't want to hear "but I can learn anything quickly," because mastering some APIs can take 6 months or more; if you come out of university with no specific skill set, it is going to be really hard to get that first job, because unless you can be productive soon (or even on day one), you are useless as far as employers are concerned. Also, though I don't program in Win32 professionally myself, from my understanding it takes at least 3 years of non-stop work with those APIs just to be semi-proficient. Professionally, most of my work over the years has been in Java, and Java is probably scary to a lot of neophyte programmers these days because, since 1.5, it has unfortunately turned into the bastard child of complexity like its twisted sister C++.
Last but not
Why We Teach Java (Score:5, Insightful)
(1) Java is what the market wants. Yes, we can teach any other language under the sun. But the reality is that the software industry values individuals who are Java-literate. By this I mean an individual who has a basic understanding of the OO principles the language is founded upon, can write Java code using common tools, and has at least some insight into the more common Java APIs. Any learning institution that doesn't take this into account when designing its curriculum is doing a serious disservice to its student body. While some do go to university for the sheer joy of learning a subject, most are there to ultimately get a job.
(2) In my opinion there is something seriously wrong with a Java course that emphasizes Swing or web development rather than the fundamentals. Yes, it's important to get things in and out of a program, but at least initially these should be incidental to the main event: learning the language and applying it effectively, and thinking in an object-oriented way, which many of you know is not necessarily an intuitive way to look at the world, especially if you already have a procedural background. GUI and web application development should be separate, advanced courses.
(3) I sometimes lament the lack of insight into pointers, but any professor worth their salary will spend some time discussing the Java object reference architecture, and relate that to pointer-based languages. Regardless of how abstract your language is "opening up the hood" and demonstrating how things work, and why things have been designed the way they are, is often worth knowing.
(4) I laughed when I read the article about Praxis, especially the part about formal methods. Are they serious? Yes I was taught formal methods in school, and could understand *why* I'd want to use them... If I had all the time in the world... a huge budget to burn and customers not screaming for something that the business needed yesterday. Praxis offers software development based on formal methods and as a consequence occupies an important (and probably expensive) specialized niche of the software ecosystem. To suggest that this approach should be the norm and lament its absence really betrays that the authors have spent too much time in academia and not enough in the real world.
(5) Ada is a great language - in fact I learned Ada 83 as a first language along with C. It just isn't relevant to most software development companies or IT departments - if indeed it ever was. I worked on a research project that was part of the Ada 9X Real-Time initiative - the main users were aerospace and military vendors, particularly embedded systems. There you do need to know about concurrency and distribution, along with hard performance deadlines and often a slew of safety and mission-critical issues you need to consider to do a good job. However, I fail to see the general relevance of Ada to a commercial market that is primarily interested in "simple" information systems - getting information out of a database and/or putting it in, with some processing en route. Why should I use Ada when the market in general doesn't use it?
(6) We teach concurrency; it's useful stuff to know. I think that using formalisms to describe concurrent programs is going a wee bit too far (see (4) above).
Couldn't agree more... (Score:4, Informative)
I list them because they hold a lot of wisdom, and I wanted to draw special attention to them.
When I was in college I got really ticked at the level of theory: there was too much of it, and it wasn't balanced well enough with implementation. As I looked around, I noticed that was pretty commonplace among academic institutions (colleges AND universities; I'm not talking about trade schools either). That was before they moved their curriculum to using Java for the first couple of classes; after they did, I heard stories about the upper classes getting some of these "new" students and not being able to focus on the class materials, because they had to teach these students C/C++ first, and the students had a harder time getting it. (Not so the other way around.)
That said, I've started thinking about how I would put together a curriculum for teaching computer programming/science/engineering. (I'm not talking about computer _hardware_ engineering, btw.) I even did some tutoring after college. So what would I do?
I'd start students with a language that can be used to teach the real basic skills and concepts (variables, functions, etc.); even VBScript could be used at this level. But I'd also quickly move them on to more advanced concepts (in the case of VBScript, it would only be used for a couple of weeks at most), moving from language to language to build not only a depth of concepts and understanding, but also a breadth of computer languages and kinds of tasks. I'd also ensure that somewhere in the curriculum students were exposed to assembly; I have found that even a small exposure makes a big difference in programming styles and philosophies.
Furthermore, I'd break the curriculum into two parts. One part would start from the ground up; and the other would start from the top down. Both would be required of students. The idea being one part would be more focused on the theoretical, while the other would be more focused on the substantial - implementation. Both would work together to produce a well-rounded student. Additionally, it would be designed such that students that wanted to work on operating systems would simply follow the one from start to end; while other students would be able to leave for more focused courses at the layer of their choice. (Students wanting OS would still have other courses for focus work too, btw.) The primary idea being that even a web-app developer needs to know the underlying systems, and even the OS developer needs to understand the abstractions of the web-app developer.
I'd also have the overall curriculum be far more software-engineering focused. Yes, if people want to really be computer "scientists", they could do that; but industry really needs software engineers, not computer scientists. Real programs require engineers, and sadly, this is strongly lacking from almost all academic computer programs. (Some have changed it, but not many.)
I'd also think that this approach would be very favorable to the authors of TFA and the comments I've linked. The ideas probably need a bit more refinement, but the general approach would be sound - and it's not what academia is doing today by any stretch of the imagination.
FWIW - While I am relatively young (college grad of 2003), my main strength is C prog
two profs working for adacore love ada (Score:5, Interesting)
The reason is that Computer Science has developed into a discipline that is no longer pure mathematics. There are only so many courses you can squeeze into four years.
2. The development of programming skills in several languages is giving way to cookbook approaches using large libraries and special-purpose packages.
Guess what: that's what building real software is like today. We don't need people who can write quicksort in obscure, unused languages, but people who can grasp systems of millions of lines of code. Ada doesn't prepare you for that, because it is a niche language that never really was adopted outside a few specialized industries. It has no good, widely used frameworks and libraries like you find in the real world. People don't use it for the whole range of software systems you find in the real world, and to prepare you for this real world there are simply much better languages around these days.
3. The resulting set of skills is insufficient for today's software industry (in particular for safety and security purposes) and, unfortunately, matches well what the outsourcing industry can offer. We are training easily replaceable professionals.
I agree that skills are important. A good prof can teach those using pretty much any Turing-complete language if it needs to be done. Java isn't half bad for teaching a whole lot of important CS concepts and theory. And unlike Ada, people actually use it. As for C and C++, they are useful languages to learn, of course. Many colleges still teach them.
But of course two ex-profs working for AdaCore are hardly objective. Ada is as dead as Latin: it has some nice features, but nothing you won't find somewhere else. Keeping professional skills up to date is as important for professors as it is for students. Having done a Ph.D. in software engineering and architecture, and having practiced my skills in several companies, my view is that one of the largest problems in computer science education is teachers who have never worked on real, industrial-sized software systems and who continue to send students into industry with a lot of misguided and naive ideas about how to build software. Most SE teachers out there simply have no clue what they are talking about. Software engineering is a skill learned in practice, because the teachers in university mostly lack the skills required to properly prepare students. That's the sad reality.
Re:I've noticed that... (Score:5, Insightful)
I view school as bootstrapping a person to learn how to learn, and as teaching them the things that are timeless. The only reason a popular programming language like Java is used in the first place is that something has to be used, so it may as well be that. However, many schools offer Scheme, ML, or Common Lisp as the language of choice even though job-market demand for them is comparatively low. This is because they're seen to help the learning process. The goal isn't a marketable skill, but a vehicle to teach the timeless things: algorithms, data structures, and all those courses that have the word "theory" tacked onto the end of their titles.
If you want someone to be a lackey and build you a GUI, you'd be better off looking for someone who has an ITT certificate. If you're looking for something more on the math side of computing (again, algorithms, analysis), then you talk to a computer scientist.
Re:I've noticed that... (Score:4, Insightful)
I'm a Mechanical Engineer as well. Are you suggesting that _we_ should have spent our degrees studying look-up charts for HVAC ducts, or how to make nice Excel graphs? (calculus, mechanics, thermodynamics, heat transfer, ring any bells?)
Re: (Score:3, Interesting)
As for the theory you callously disregard: that theory is what allows us to dream
Re:CS Newbie here. (Score:5, Funny)
http://www.phy.duke.edu/~rgb/Beowulf/c++_interview/c++_interview.html [duke.edu]
Re:Oh noes! Java is not C! (Score:4, Insightful)
Many universities are simply training highly replaceable professionals, which is a big reason why outsourcing is such a problem. When two people, one in the USA and one in India, for example, have the same skills, the cheaper will be chosen (and rightly so, sorry). The point of the article is that many universities are simply training programmers rather than teaching computer scientists. It's an important distinction, which some people just don't understand.
Re:University should be about people (Score:5, Funny)
It does not work for you. In your post you misspelled:
narrowm, lets, aggreed, trun collage, auctually, focuesed, assue, grammer, socialolgy, beeing, couyld, cynsical
Re:University should be about people (Score:4, Funny)
1. English is not my first language.
2. I'm a techie...
3. I read only books full of pictures!