
CMU Eliminates Object Oriented Programming For Freshmen

timothy posted more than 3 years ago | from the doesn't-seem-an-april-fool's-joke dept.

Education 755

fatherjoecode writes "According to this blog post from professor Robert Harper, the Carnegie Mellon University Computer Science department is removing the required study of O-O from the Freshman curriculum: 'Object-oriented programming is eliminated entirely from the introductory curriculum, because it is both anti-modular and anti-parallel by its very nature, and hence unsuitable for a modern CS curriculum.' It goes on to say that 'a proposed new course on object-oriented design methodology will be offered at the sophomore level for those students who wish to study this topic.'"


Hmmm ... (0)

WrongSizeGlass (838941) | more than 3 years ago | (#35621138)

So OO programming is out because it's unsuitable for a modern CS curriculum. I guess we should go back to just assembly language so we can make sure we never go down this road of false progress again.

Re:Hmmm ... (4, Insightful)

mellon (7048) | more than 3 years ago | (#35621162)

I don't know about *starting* in assembler, but a programmer who isn't somewhat proficient in assembler is going to have a very weird mental model of how programs work. OOP has the same problem--it's not that OOP is bad; it's that if you start out with an OOP language, you don't learn a lot of things that are important to understand. Once you know how the machine works, then you can start studying abstraction. Treating OOP as the only way, or even the best way, to solve any computing problem is going to tend to produce programmers who think everything is a nail. It doesn't mean that there are no nails, nor that students shouldn't learn to swing a hammer.

Re:Hmmm ... (3, Interesting)

WrongSizeGlass (838941) | more than 3 years ago | (#35621192)

Agreed ... but aren't most modern OS's OO based? In most cases students need OO programming in order to become employable. OO certainly isn't the holly grail of computing but it is entrenched in business and needs to be taught just like COBOL was all those years ago (when I had to learn it even though it was like writing a book every time I wanted to write a small program).

Re:Hmmm ... (4, Funny)

immaterial (1520413) | more than 3 years ago | (#35621244)

Given its toxicity to humans, I recommend avoiding drinking from the holly grail altogether.

Re:Hmmm ... (4, Informative)

salesgeek (263995) | more than 3 years ago | (#35621324)

Um. No. Many modern libraries or "frameworks" (newfangled word for library) are OO. Most OSes remain written in classic system programming languages like C and assembly language. In fact, most frameworks start as object oriented wrappers for certain OS calls and cruft up from there.

Re:Hmmm ... (5, Insightful)

osu-neko (2604) | more than 3 years ago | (#35621364)

Agreed ... but aren't most modern OS's OO based? In most cases students need OO programming in order to become employable. OO certainly isn't the holly grail of computing but it is entrenched in business and needs to be taught just like COBOL was all those years ago (when I had to learn it even though it was like writing a book every time I wanted to write a small program).

And how is this an argument for including it in the introductory freshman curriculum? I put forward the possibility that some topics may be more appropriate to teach students only after they've learned the basics.

Re:Hmmm ... (0)

Anonymous Coward | more than 3 years ago | (#35621206)

I agree with you except for your use of "proficient"; I think "familiar" would be a better term. Not many are programming in assembler, and if a real programmer needs to, their skills in modern programming (which includes OO programming), their math skills, and their ability to learn new computer languages will permit them to.

Re:Hmmm ... (1)

Anonymous Coward | more than 3 years ago | (#35621228)

I don't know about *starting* in assembler,...

Why not? What better way to start a CS program than learning how to program an actual computer without all the abstraction? Then go to some mid-level language like 'C' from there, and then a higher-level language from there - all the while implementing data structures, algorithms, and techniques like parallelism in those languages.

Languages are just syntax over an abstraction of the processor's instructions, and it pains me when I see posts about how one language is "better" than another for a particular job, or is the "right tool for the job" - we're talking about CS, NOT carpentry.

Re:Hmmm ... (0)

Anonymous Coward | more than 3 years ago | (#35621274)

Why not? What better way to start a CS program than learning how to program an actual computer without all the abstraction?

Why not just go program in machine code? No abstraction there!

Re:Hmmm ... (0)

Anonymous Coward | more than 3 years ago | (#35621238)

Assembler is a great start, but far better to hand wire logic gates and address decoders to have a real feel for what's going on.

Re:Hmmm ... (1)

obarel (670863) | more than 3 years ago | (#35621314)

Yes, but how do you create a transistor? Do you just buy one?

I think all studies should start from scratch - throw you on an island and let you invent calculus yourself. What's the point in progress if you can't explain it to a 10 year old in two sentences?

Re:Hmmm ... (2)

narcc (412956) | more than 3 years ago | (#35621504)

far better to hand wire logic gates and address decoders to have a real feel for what's going on.

Laugh all you want, but this was a big part of my intro to CS class way back before "CS" turned into "using Visual Studio".

Programming is a useful skill and should be taught to CS students; that isn't in question. However, college isn't (or shouldn't be) a trade school. Programming is a bit of an art, and consequently shouldn't be a major part of any CS curriculum, certainly not the dominant part, as seems to be the case at far too many institutions.

That said, first year students should learn to program, I'd recommend BASIC for learning basic concepts like iteration, flow control, etc. followed by an assembly language where they can get close to the machine while learning what's really going on as they implement common algorithms and data structures.

After that, it really doesn't matter what language they use in the rest of their core courses, as there shouldn't be a need to spend any time even on a new language in those courses. Any class time spent on "how to do blank in language X" is a huge waste.

On OOP specifically, I agree with CMU -- it's both anti-modular and anti-parallel. In short, it's a failed paradigm. Look at the hideous mess modern languages have become as a result of the nearly religious devotion to this astonishingly over-hyped and ill-defined concept! (Before the terminally incompetent chime in: objects are great, OOP is not.)

I applaud CMU for taking action here -- let's hope other colleges and universities follow.
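The "anti-parallel" complaint above can be made concrete with a toy sketch (Python used purely for illustration; the class and function names are invented, not from TFA):

```python
from concurrent.futures import ProcessPoolExecutor

# Object-oriented, stateful style: the running total lives inside the
# object, so concurrent calls to add() would race on self.total and
# would need a lock to parallelize safely.
class Accumulator:
    def __init__(self):
        self.total = 0

    def add(self, x):
        self.total += x

# Stateless, functional style: square() touches no shared state, so the
# work can be fanned out across processes with no coordination at all.
def square(x):
    return x * x

if __name__ == "__main__":
    data = list(range(10))

    acc = Accumulator()
    for x in data:
        acc.add(x)
    print(acc.total)  # 45

    with ProcessPoolExecutor() as pool:
        print(list(pool.map(square, data)))
```

Whether that makes OOP "anti-parallel by its very nature" or merely stateful-by-default is exactly what the thread is arguing about.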

Re:Hmmm ... (1)

oliverthered (187439) | more than 3 years ago | (#35621254)

Well, if you teach bad OOP, then yes. But if you're good at teaching it and you pick a sensible language (for simplicity) or use good patterns, then the problems mentioned are solved by the very thing they claim is inherently incapable of solving them.

Re:Hmmm ... (4, Interesting)

Assmasher (456699) | more than 3 years ago | (#35621422)

If it means they stop doing everything in Java throughout their education, I'm all for it. There's nothing wrong with Java, and I use it often in my current company, but kids in school need to learn from the get-go that languages are tools in a toolbox: use the right tool for the right job when you can. I can't remember the last time I interviewed a graduate who had used C++ or C outside of a single survey course on the language! Hell, I can't remember the last time I interviewed a post-2000 graduate who had built their own processor or had even taken an assembly class. The kids are just as smart, just as eager, but woefully unprepared. The one thing they are getting a little better at is including some 'software engineering' in the curriculum - but only a little bit better, in that they do 'projects together', which in my experience means that the alpha nerd does 90% of the work and the other 4 team members offer worship and keep the ramen coming.

Re:Hmmm ... (0)

Anonymous Coward | more than 3 years ago | (#35621508)

I thought the article was the most idiotic piece of writing about IT I've seen in my life... until I saw your comment below it. And to have it modded Insightful - congrats on first-class trolling. But then that classic Groucho line applies ("I'd never join a club that would accept me"): is it worth getting such high mods from people you fool so easily?

A-freaking-men! (1)

oldwarrior (463580) | more than 3 years ago | (#35621452)

Yes - let's learn how computers really work. Then teach us how to do high-level abstractions about them. Then contrast obsolete abstractions like strict O-O with more pragmatic approaches that accommodate reuse without GETTING IN YOUR WAY as you solve real problems.

Re:Hmmm ... (1)

wisnoskij (1206448) | more than 3 years ago | (#35621492)

Yeah, I am not really sure I believe this article either. Sure, OO is not great for parallelism, but it seems a lot better than any alternative to OO I have ever heard of.
And I can say for sure that at the University of Waterloo (the leading school for software engineering in Canada, by far) OO is still the main thing they teach freshman programmers/software engineers.

so the wheels are coming of the OO band wagon then (4, Interesting)

mjwalshe (1680392) | more than 3 years ago | (#35621146)

I always thought the obsession with making everything OO, when it doesn't suit every type of programming problem, was a bad thing - glad to see someone agrees with me.

Re:so the wheels are coming of the OO band wagon t (0, Troll)

Doc Ruby (173196) | more than 3 years ago | (#35621178)

So "no OOP" == "some non-OOP"?

You must be a terrible programmer.

Re:so the wheels are coming of the OO band wagon t (-1)

Anonymous Coward | more than 3 years ago | (#35621318)

You must have terrible reading comprehension skills.

Re:so the wheels are coming of the OO band wagon t (1)

Doc Ruby (173196) | more than 3 years ago | (#35621366)

No, you have terrible reading and logic skills, because what I summarized is exactly what the post to which I replied actually means.

Re:so the wheels are coming of the OO band wagon t (1)

osu-neko (2604) | more than 3 years ago | (#35621428)

No, you have terrible reading and logic skills, because what I summarized is exactly what the post to which I replied actually means.

Hmm. Nope, not even close. Looks like the pot calling the kettle black here... except the kettle isn't black in this case, just the pot. In any case, your logic and reading comprehension skills are sorely lacking...

Re:so the wheels are coming of the OO band wagon t (0)

Doc Ruby (173196) | more than 3 years ago | (#35621480)

No. The article says CMU is eliminating OOP training, saying it's bad. You agree, saying "it doesn't suit every type of programming problem". You therefore are saying that since OOP isn't good for everything, there should be no OOP. Ignoring the alternative of teaching some OOP, but not exclusively OOP. In other words, "No OOP" == "Not some OOP".

Yours is the fallacy of the excluded middle. Look it up. And don't ask me for any programming jobs. Or any that require logic. Or recognizing your own limitations, even when they're shoved in your face. Because then you engage in the denial projection that makes fallacists like you so annoying.

Goodbye.

Re:so the wheels are coming of the OO band wagon t (0)

Anonymous Coward | more than 3 years ago | (#35621494)

It's just Doc Ruby being himself. It gets even worse; he's also a Space Nutter. Not only can't he read, he also believes there are fantasy-levels of energy and technology out there. He actually thinks we'll make aerogel in space to insulate windows...

Re:so the wheels are coming of the OO band wagon t (1)

maxwell demon (590494) | more than 3 years ago | (#35621230)

However, I wonder about the reasoning: OOP is anti-modular? How so?

Re:so the wheels are coming of the OO band wagon t (0)

Anonymous Coward | more than 3 years ago | (#35621440)

I do not know if OOP is anti-modular per se, but almost every implementation I have seen had serious problems from a modularity point of view.

Worst of course is C++, where OOP means you no longer have any encapsulation at compile time: you need to recompile almost everything nearly every time some private field changes.

But even with more sophisticated module systems that don't rely on header files, OOP means your interfaces are likely to contain object hierarchies even where the object hierarchy is realistically mostly an implementation detail (or at least should be). Thus you either need interfaces that change more often, or code that no longer has a suitable interface, which is usually visible as glue code on at least one, and often both, sides of the module's interface.

Re:so the wheels are coming of the OO band wagon t (0)

Anonymous Coward | more than 3 years ago | (#35621448)

When you write proper OOP, you set up a whole edifice, with methods, classes, and so on. That's not a three-liner, and it's not easy to modularize and reuse.
It's about the same with anti-parallel. It was (and is) a nice paradigm at a high level or on a high layer: you inherit stuff, you call methods, you try to be complete. If you cut problems down into single simple Lego pieces, you don't need any of that. You write a small utility that does a complete validation of the date entered; it doesn't need OO.
At a very high level, or in a user interface, it is much more intuitive: you create a file, and you can duplicate it, manipulate it, print it. Left mouse button, right mouse button.
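The "small utility" for the date entered really can be a handful of lines with no classes at all; a sketch (Python chosen arbitrarily; the function name is invented for illustration):

```python
from datetime import datetime

# A complete date validator as a single pure function -- no class,
# no inheritance, nothing to subclass or override.
def is_valid_date(text, fmt="%Y-%m-%d"):
    try:
        datetime.strptime(text, fmt)
        return True
    except ValueError:
        return False

print(is_valid_date("2011-03-27"))  # True
print(is_valid_date("2011-02-30"))  # False (February has no 30th)
```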

Re:so the wheels are coming of the OO band wagon t (2)

oliverthered (187439) | more than 3 years ago | (#35621264)

I've found the same problem with SQL and Prolog (oh, that was object Prolog; scrub that).

Victory for Tablizer? (2)

Compaqt (1758360) | more than 3 years ago | (#35621402)

Longtime Slashdot readers know and either love or hate user "Tablizer" [slashdot.org] .

He has a website [geocities.com] detailing his objections to object-oriented programming, while arguing for "table-oriented programming". It has been a fruitful source of flamewars over the years.

So, is this a vindication for Tablizer? Tablizer, what say you?

Oups (0)

Anonymous Coward | more than 3 years ago | (#35621148)

I don't see the point of such a drastic measure.

strange progress (0)

Anonymous Coward | more than 3 years ago | (#35621156)

While I'm sure it looks strange to the jargon-speakers of the business world, it's nice to see some actual progress being made in programming education.
Now, I'm not sure about the choice of Standard ML over Haskell or Scheme, but at least we're seeing FP taught again at the freshman level outside of MIT and Britain.

Re:strange progress (1)

Anonymous Coward | more than 3 years ago | (#35621194)

I forgot one important caveat, though, which the summary neglects. TFA states that OO is "anti-parallel by its very nature," but does not explain how or why this is so, and indeed, my own experience is that this is complete hokum. A well-designed program can be parallelized regardless of the implementing code; let's not forget the idea of computational equivalence. This is not to disparage FP and other such "rational" languages, as I would have to agree that implicit parallelization can be a major feature - but how about some love for Erlang in all this?

Law of Demeter the problem? (2)

Latent Heat (558884) | more than 3 years ago | (#35621430)

Maybe the anti-parallelism is not in OOP as such but in programming style, such as this Law of Demeter business.

Demeter, they tell me, was some manner of software project, and the Law of Demeter is a style of OOP programming that is supposed to have come out of the experience on that project. In the strict sense, you are never supposed to invoke methods on objects embedded inside other objects; instead, the containing object is supposed to have a method that in turn invokes the method on the embedded object.

This manner of strict enforcement of encapsulation has the effect that most of your methods are mainly invocations of other methods. This may have the effect that any typical method invocation results in a long chain of method invocations.

This is kind of like I ask Phil for the monthly report and Phil tells me "you gotta get that from Sally." So I shoot Sally an e-mail and she gives me the monthly report? No, that is bad in some sense because if my contact is Phil I am not supposed to know about Sally. So I e-mail Phil with the request, who e-mails Sue, who e-mails Tom, who in turn e-mails Jason, who finally e-mails Sally. And when Sally generates the monthly report, she passes it back through that chain. So am I correct in thinking that OOP leads to such high levels of indirection, and programming styles such as Law of Demeter make this worse?

This style of programming probably helps coarse-grained parallelism of breaking programs up into elements that can run on separate threads or even processes. It helps by the rigorous enforcement of encapsulation. But this high level of indirection may break fine-grained parallelism -- every memory access chases long sequences of references.
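The e-mail chain maps directly onto code. A minimal sketch (Python, reusing the names from the story; this is an illustration, not anything from TFA):

```python
class Sally:
    def monthly_report(self):
        return "monthly report"

class Phil:
    def __init__(self):
        self.sally = Sally()

    # Law-of-Demeter style: Phil forwards the request himself, so the
    # caller never learns that Sally exists. Every intermediary adds
    # one more delegating call to the chain.
    def monthly_report(self):
        return self.sally.monthly_report()

phil = Phil()

# Demeter violation: reaching through Phil to an object the caller
# is not supposed to know about ("two dots deep").
print(phil.sally.monthly_report())

# Demeter-friendly: ask Phil, who asks Sally.
print(phil.monthly_report())
```

Both calls return the same report; the difference is purely in how much indirection sits between the request and the object that answers it.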

Re:strange progress (2)

Assmasher (456699) | more than 3 years ago | (#35621432)

Ironically, I find it vastly easier to encapsulate my mechanisms for parallelization in objects :).
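One hedged sketch of what "encapsulating the parallelization in an object" might look like (Python; the class name is invented for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

class ParallelMapper:
    """Hides the pool plumbing behind an ordinary object interface;
    callers see map() and never touch threads directly."""

    def __init__(self, workers=4):
        self._workers = workers

    def map(self, fn, items):
        with ThreadPoolExecutor(max_workers=self._workers) as pool:
            return list(pool.map(fn, items))

mapper = ParallelMapper()
print(mapper.map(lambda x: x * 10, [1, 2, 3]))  # [10, 20, 30]
```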

cnn emulates real life for US? babys et al differ (-1)

Anonymous Coward | more than 3 years ago | (#35621158)

hard to tell which smells worse? the fog of tax free (for some) war can do that? did we say tax free? pardon, the non-taxpayers actually profit ($billionerrors$) on the heavy weapon (keeping ALL sides supplied including mexico) murder massacre business outings. so that's good?

we support the views of this former person
http://www.youtube.com/watch?v=TY2DKzastu8&NR=1&feature=fvwp ("stop killing")

we do not support the material in this cnn propaganda video from yesterday
http://www.youtube.com/watch?v=BXB75IK6pL4 ("we can win this, with my help")

same guy? clone? confused? we must focus... on the images. we must....saw a picture of one of those godaffy psycho-killer freaks being paraded around our military bases (may have paid for them, along with our holycost tithing's) like royalty, only to become our very worst 'enemy' just weaks/leaks later? focus-pocus?

babys rule, with tiny chubby soft fingers, advanced dna etc..... unclear?

Re:cnn emulates real life for US? babys et al diff (1)

WrongSizeGlass (838941) | more than 3 years ago | (#35621210)

Is it just me or does this sound like someone has been listening to too much Talking Heads? [wikipedia.org]

real math; taking one (1) life =crime vs. humanity (0)

Anonymous Coward | more than 3 years ago | (#35621236)

give US a minute here. this can't be right? isn't there justdenyable homicide? that's the old time religion? god's will? too many of us (by about 5 billion)? still foggy? in these complex times, it can be disgustingly enlightening to return to the teachings of the georgia stone trustdead freemason 'math'?

freemason kids traumatized by native teachings (0)

Anonymous Coward | more than 3 years ago | (#35621368)

we're not the only (chosen) ones? the natives must have made some mathematical errors? let's see, wasn't that problem taken care of before? & before that. let's check the georgia stone, all the answers are there? not to fret then, the #s never lie?

the GSM get their tiny (ie; selfish, stingy, eugenatic, fake math) .5
billion remaining population, & the money/weapons/vaccine/deception/fake
'weather' alchemist/genetically altered nazi mutant goon exchangers, get
us? yikes

the 'fog' is lifting? more chariots will be needed?

with real math, even being remotely involved in lifetaking (paying for, supplying endless ordinance) is also a crime against ALL of the world.

ALL (uninfactdead) MOMMYS......

the georgia stone remains uneditable? gad zooks. are there no chisels?

previous math discardead; 1+1 extrapolated (Score:mynutswon; no such thing as one too many here)

deepends on how you interpret it. georgia stone freemason 'math'; the
variables & totals are objective oriented; oranges: 1+1= not enough,
somebody's gotta die. people; 1+1=2, until you get to .5 billion, then
1+1=2 too many, or, unless, & this is what always happens, they breed
uncontrolled, naturally (like monkeys), then, 1+1=could easily result in
millions of non-approved, hoardsplitting spawn. see the dilemma? can
'math', or man'kind' stand even one more League of Smelly Infants being
born?

there are alternative equations being proffered. the deities (god, allah,
yahweh, buddha, & all their supporting castes) state in their manuals that
we needn't trouble ourselves with thinning the population, or being so
afraid as to need to hoard stuff/steal everything. chosen people? chosen
for what? to live instead of us? in the case of life, more is always
better. unassailable perfect math. see you at the play-dates, georgia
stone editing(s) etc... babys rule.

exploding babys; corepirate nazis to be caged (Score:mynutswon; hanging is too good for them?)

there are plans to put them, (the genetically, surgically & chemically
altered coreprate nazi mutant fear/death mongerers (aka47; eugenatics,
weapons peddlers, kings/minions, adrians, freemasons etc...)) on display
in glass cages, around the world, so that we can remember not to forget...
again, what can happen, based on greed/fear/ego stoking deception.

viewing/feeding will be rationed based on how many more of the creators'
innocents are damaged, or have to be brought home (& they DO have another
one) prematurely.

Computer scientists? (5, Funny)

DNS-and-BIND (461968) | more than 3 years ago | (#35621170)

Why are computer scientists even learning programming? When did this happen? Programming sounds like one of those get-your-hands-dirty jobs in flyover territory, where you would show a lot of ass crack on the job and live in a trailer park. Educated people don't do that.

Re:Computer scientists? (2)

Haedrian (1676506) | more than 3 years ago | (#35621224)

Not sure whether you're being sarcastic or not, assuming you're not...

Programming lets you put your mind in a certain 'mindset' which can help you analyse and solve problems, even if you don't actually get your hands dirty in the end.

Re:Computer scientists? (2)

multipartmixed (163409) | more than 3 years ago | (#35621298)

Chem majors do work in chem labs.
Physics majors do work in physics labs.

Why shouldn't CS students do lab work?

Re:Computer scientists? (5, Insightful)

WrongSizeGlass (838941) | more than 3 years ago | (#35621302)

Why are computer scientists even learning programming? When did this happen? Programming sounds like one of those get-your-hands-dirty jobs in flyover territory, where you would show a lot of ass crack on the job and live in a trailer park. Educated people don't do that.

They need to be able to program for the same reasons management and engineers need to spend some time on the assembly line: so they can learn how things actually work. There's often a wide chasm between "on paper" and "in practice" and ideas need to be able to traverse it.

CS... (0)

Anonymous Coward | more than 3 years ago | (#35621172)

Well, maybe the correct way to say that is "modern CS curriculum is unsuitable for programming".

Re:CS... (1)

lennier1 (264730) | more than 3 years ago | (#35621286)

We're back to the discussion about the "scientific education" vs. "trade schools" rift again?

Why remove it? (3, Insightful)

Haedrian (1676506) | more than 3 years ago | (#35621182)

I don't see why it should be removed, it should be 'complimented' instead with other programming methodologies in order to let students compare and contrast. But most CS work in the end will be done with OOP, so there's no reason not to start with it at the beginning - at least that's my personal experience.

In my first year we had a mixture of different programming styles, including functional programming. I never really used any of them since; I'm sure there are certain places where Prolog or Haskell is used, but it's not as common as OOP.

Re:Why remove it? (5, Funny)

asifyoucare (302582) | more than 3 years ago | (#35621258)

I don't see why it should be removed, it should be 'complimented' instead ....

Oh java, your polymorphism is awesome.

Re:Why remove it? (1)

unwesen (241906) | more than 3 years ago | (#35621320)

I don't see why it should be removed, it should be 'complimented' instead ....

Oh java, your polymorphism is awesome.

You flatter me! I don't even know multiple inheritance!

Re:Why remove it? (1)

MemoryDragon (544441) | more than 3 years ago | (#35621276)

Well, I would not call OO itself directly anti-parallel; the statefulness of objects is. The entire Actors/Message model, which seems to be becoming more popular despite being functional, actually blends well into OO. After all, some OO systems are basically very close to that: you have objects, and messages for interoperation. The main difference from the Actors/Message model is that OO leaves it up to the implementor whether the message sender/receiver is stateful or not.
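A minimal sketch of that blend: an "actor" that is just an object whose private state is only ever touched by its own mailbox thread (Python; all names invented for illustration):

```python
import queue
import threading

class CounterActor:
    """An object that processes messages one at a time from a mailbox,
    so its state needs no locks: only the mailbox thread mutates it."""

    def __init__(self):
        self._mailbox = queue.Queue()
        self._count = 0
        self._done = threading.Event()
        threading.Thread(target=self._run, daemon=True).start()

    def send(self, msg):          # the only public entry point
        self._mailbox.put(msg)

    def _run(self):
        while True:
            msg = self._mailbox.get()
            if msg == "stop":
                self._done.set()
                return
            self._count += msg    # state touched by exactly one thread

    def result(self):
        self._done.wait()
        return self._count

actor = CounterActor()
for n in (1, 2, 3):
    actor.send(n)
actor.send("stop")
print(actor.result())  # 6
```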

Outside of that, I personally think an introductory course should reduce itself to basic algorithms; the various programming paradigms should not be taught.
That's how we learned it: we used Modula-2 back then, which had a very clean syntax for teaching the basic core algorithms in a modular, procedural paradigm, and I don't think I missed anything by not having OO in the first courses.

Re:Why remove it? (0)

Anonymous Coward | more than 3 years ago | (#35621344)

It's not the language which is OO or not OO; any language can be used in any way. The problem with teaching OO is that some programmers can't get past it and apply it inefficiently to all problems. If you've only ever programmed OO, then you haven't programmed a large variety of problems or machines. Some machines (yes, modern systems) don't even allow branching or sequential execution; they're extremely parallel. You can't handle that with an OO mentality.

Re:Why remove it? (1)

deniable (76198) | more than 3 years ago | (#35621474)

Twenty years ago I could have said the same thing for procedural programming.

Interesting move (4, Interesting)

bradley13 (1118935) | more than 3 years ago | (#35621186)

OO is practical for lots of problems, because it makes modelling real-world data easy. However, it is not useful if you want to give students a solid grounding in theoretical computer science. OO is fundamentally data-centric, which gets in the way of algorithmic analysis.

To give a pure view of programming, it would make sense to teach pure functional and pure logic programming. If CMU really wanted to concentrate on the theory, they would have eliminated imperative programming from the introductory semesters, because it is very difficult to model mathematically. Apparently that was too big of a step.
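For what the "pure view" buys you, compare two definitions of the same function (Python standing in for a functional language here; in Standard ML, the language the new curriculum reportedly uses, the second form would be the natural one):

```python
# Imperative: explicit mutation; reasoning about it means tracking how
# state changes over time (loop invariants, Hoare logic, and so on).
def sum_imperative(xs):
    total = 0
    for x in xs:
        total += x
    return total

# Pure-functional: no assignment at all, just recursion on structure.
# This form can be reasoned about equationally, like a math definition:
#   sum []        = 0
#   sum (x::rest) = x + sum rest
def sum_functional(xs):
    return 0 if not xs else xs[0] + sum_functional(xs[1:])

print(sum_imperative([1, 2, 3, 4]))  # 10
print(sum_functional([1, 2, 3, 4]))  # 10
```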

Re:Interesting move (1)

janoc (699997) | more than 3 years ago | (#35621272)

If you read the article, they kept functional programming in parallel with imperative programming, with a focus on proving the validity of programs. So part of that is there.

On the other hand, you must balance theory with practice, because otherwise the students will a) leave, or b) not be able to do practical projects while studying the theory. So teaching only logic programming (which is great, IMO - it helped me a lot!) is not practical.

Re:Interesting move (2)

maxwell demon (590494) | more than 3 years ago | (#35621482)

This is about freshman courses. You are not expected to be finished after the first semester; you are expected to learn the basics you need in order to get the most out of the courses in later semesters. For example, in my first semester of physics, the majority of courses weren't actually physics, and the physics course wasn't very deep. Actually it was more of a math course with a bit of low-level physics added on top; the actual physics came later. After the first semester, I didn't know much about physics that I didn't already know before; I did, however, know a lot more about the math I needed to understand the physics in later semesters.

Re:Interesting move (2)

Pinky's Brain (1158667) | more than 3 years ago | (#35621284)

Functional and logic programming get in the way of some aspects of algorithmic analysis too ... the hardware is imperative after all.

Re:Interesting move (4, Interesting)

digitig (1056110) | more than 3 years ago | (#35621420)

OO is fundamentally data-centric

Maybe the way that you do it. Personally I find that quite a lot of my classes have methods.

If CMU really wanted to concentrate on the theory, they would have eliminated imperative programming from the introductory semesters, because it is very difficult to model mathematically.

Imperative programming isn't really any more difficult to model mathematically than functional programming; they just use different branches of mathematics. Check out David Gries's The Science of Computer Programming, for example, which shows how to do it, and Object-Z, which actually does it. The main difficulty is with side effects, but functional programming has the same issue as soon as you try to interact with the external world.
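In the Gries style alluded to above, an imperative loop is modeled with a precondition, an invariant, and a postcondition. A sketch, with Python asserts standing in for the on-paper proof obligations:

```python
def sum_to(n):
    """Compute 0 + 1 + ... + n, annotated Gries-style."""
    assert n >= 0                          # precondition
    total, i = 0, 0
    while i < n:
        assert total == i * (i + 1) // 2   # loop invariant, holds each pass
        i += 1
        total += i
    # invariant plus the loop exit condition (i == n) gives the postcondition:
    assert total == n * (n + 1) // 2
    return total

print(sum_to(10))  # 55
```

The point is that the loop's correctness argument lives entirely in those three annotations, which is exactly the kind of mathematical modeling of imperative code the comment describes.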

OO a tool for craftsmen, not comp sci (5, Insightful)

shoppa (464619) | more than 3 years ago | (#35621188)

Focusing on the basics, and not on the tools of the trade, is very important at an institution that is not a "trade school", and CMU's computer science department certainly lives above the trade school level. (Just to contrast: when I was a freshman, the "trade school" argument was whether new students should be taught Fortran or Pascal! Thank heaven I didn't devote my career to being a programmer.)

It seems to me that CMU's made the very obvious decision that today, OO is a tool for craftsmen, not for freshman computer scientists. And they probably are right. It's important not to confuse the tools of the trade with the basics of the science, and this is especially true at the freshman level. For a good while (going back decades) OO was close enough to the leading edge that its very existence was an academic and research subject, but that hardly seems necessary today.

In the electrical engineering realm, the analogy is deciding that they're gonna teach electronics to freshmen, and not teach them whatever the latest-and-greatest-VLSI-design-software tool is. And that's a fine decision too. I saw a lot of formerly good EE programs in the 80's and 90's become totally dominated and trashed by whatever the latest VLSI toolset was.

Re:OO a tool for craftsmen, not comp sci (1)

salesgeek (263995) | more than 3 years ago | (#35621378)

CMU decided that the future is focusing on parallel. Because of the direction of hardware, I think they are on the right track.

Re:OO a tool for craftsmen, not comp sci (1)

Chemisor (97276) | more than 3 years ago | (#35621404)

Ok, so let's make a computer science degree exclusively about "computer science" as opposed to "computer programming". Then we might as well dispense with the CS degree requirement for the vast majority of programming positions. Then we should realize that the market for "computer scientists", the ones that design pure math algorithms and do scientific studies of computer-related systems, is extremely small and already overfilled. Then everyone will realize that going into "computer science" as opposed to a trade school "computer programming" is a losing proposition. Eventually we'll just have "computer programming" schools, and "computer science" schools will all die. Oh, wait. Couldn't we just rename the existing degree and just teach computer programming?

Re:OO a tool for craftsmen, not comp sci (2)

obarel (670863) | more than 3 years ago | (#35621520)

We teach music *theory*, not how to play an instrument. Nothing to do with us.

Computers are tools. You program them to get them to do something useful. They're not an abstraction, they're real. Computer science is about the theory behind computers. It's not about some "higher dimension" of thought that is unrelated to programming machines, it's all about what you can achieve by programming those complex machines.

Otherwise, why study functional or logical programming? That's "dirty" as well, isn't it? Why have a computer lab? Why call it "computer science" when we hate computers and programming so much?

Some computer scientists wish they were mathematicians - sit in an office, use a pen and paper, and come up with ideas. That's not what computer science is about - that's what mathematics is about. Computer science is about understanding what can and cannot be done with a computer - it's a tool, and people use it. The actual use of computers is the subject of a huge (practical!) amount of research - human interaction, programming languages, computer graphics, artificial vision. If your idea of a computer is an abstraction, why would you ever care about programming languages? They're all equivalent to a Turing machine, and the details are only of interest to "peasants" - the programmers.

In that case, why worry about parallel computation? It's all just "dirty" details - you have more than one processor or more than one machine - it's just details. Who cares which programming language or paradigm is suitable - maybe "craftsmen" care, but they're peasants - worrying about whether to use a shovel or a hoe. This is so far removed from the work of a true computer scientist, why even worry about such details?

So universities should not teach "computer science" because it's a contradiction of terms. Computers are machines, tools, and science, well, science is elevated. It's not about experimenting with reality, about the interaction between time, space, cost, humans and machines. It's about mathematics and P=NP.

Well, it just isn't. It's the same old story: Chemists say to biologists "everything is chemistry". Physicists say to chemists "everything is physics", mathematicians say to physicists "everything is mathematics". But a mathematician who doesn't grow yeast in a lab is not a biologist. And a computer scientist who never writes a program or touches a computer is not a computer scientist but a mathematician.

A larger problem (4, Insightful)

wootest (694923) | more than 3 years ago | (#35621190)

The big problem with OO courses has always been that there's far more focus on the mechanism (and, lately, on design patterns) than on actually structuring your program to take advantage of objects. The word "polymorphism" isn't as important as getting across the concept that you can coordinate a few things as if they were the same thing, and have them act in different ways. Knowing a bunch of design patterns by name isn't as important as figuring out once and for all that you can structure your program in such a way that you can create interfaces (not necessarily actual interfaces/protocols) between different parts of your program and essentially replace them, or at least rework them entirely without having the whole house of cards collapsing.

There's no focus on making it click. There's no course that teaches you explicitly just about formulating problems in code, and that makes me sad.

Really? (2)

vmfedor (586158) | more than 3 years ago | (#35621204)

Perhaps I'm misunderstanding the post... it sounds to me like OO techniques are only going to be taught in elective courses from now on. If that's the case, I think CMU is missing the fact that the majority of development work in the "real world" is done on already-existing platforms. Parallel/cloud computing and modular design may be the "next big thing", but what happens when the student gets their first job working with an application built with Java or .NET? Maybe in their ivory tower they can say "OO is dead" but in the real world, OO is very real.

Re:Really? (5, Funny)

Fnord666 (889225) | more than 3 years ago | (#35621288)

Perhaps I'm misunderstanding the post... it sounds to me like OO techniques are only going to be taught in elective courses from now on. If that's the case, I think CMU is missing the fact that the majority of development work in the "real world" is done on already-existing platforms. Parallel/cloud computing and modular design may be the "next big thing", but what happens when the student gets their first job working with an application built with Java or .NET? Maybe in their ivory tower they can say "OO is dead" but in the real world, OO is very real.

This is a CS program we are talking about. Much like economics, in these disciplines the real world is often considered a special case.

Re:Really? (1)

maxwell demon (590494) | more than 3 years ago | (#35621328)

If your goal is to get a programmer job, maybe you should have chosen to study software engineering instead of computer science.

Re:Really? (0)

Anonymous Coward | more than 3 years ago | (#35621484)

Bullshit. I've been a software engineer for almost twenty years. I have a CS degree. I find that I can usually tell the good programmers from the mediocre ones by who has a CS degree and who has a SwEng one. Not always. The gifted will be good regardless of the schooling, but of those that always needed to learn programming in college, the CS majors blow the Eng ones away. Of course, the big problem is the Eng majors *think* they are better. They're lucky if they can rub two if statements together.

Re:Really? (1)

multipartmixed (163409) | more than 3 years ago | (#35621330)

When I went to school, graphics was an elective.

I didn't take graphics, so I didn't get a job where I needed to write a ray tracer or whatever.

See how that works? Not so hard.

Re:Really? (1)

CynicTheHedgehog (261139) | more than 3 years ago | (#35621406)

I've been noticing that Java is moving away from OOP. If you use EJB3 persistence (i.e. JPA), JAXB, JSF, WebBeans, or any number of recent technologies you'll find that:

- Business methods (stateless EJBs/web services, servlets, JSF components, entities, etc.) are stateless and non-thread-safe
- All data containers must adhere to the Java Bean specification; all mutable properties must either have a public setter method or the entire object must use field access
- Data objects are encumbered by persistence state, making complex operations on object graphs either costly (eager fetching) or perilous (lazy initialization exceptions, stale object state exceptions, etc.)
- Most, if not all, user state is managed in the HTTP session or client (in the case of a fat client)

The result is that business logic exists as a set of stateless functions (that happen to exist in a class) and the data is a set of functionless state (all fields must be exposed so that the business methods and container can operate on them). So basically you have a set of functions in a class file that receive a set of data (entities or business keys), operate on it, and then return some other data that is then stored as a name/value pair in a session or a field on some Swing control.

This movement away from OOP has been driven largely by JEE and container-managed services, which are multi-threaded, clustered, and distributed. In order to do that efficiently, the programs have to separate data and logic.

There have been some attempts to "fix" this such as Seam, but the reality is that these solutions do not scale. They are good for hotel reservation apps, but not a 300 table enterprise OSS system.

So OOP is going the way of the Dodo, but the writing has been on the wall for some time. Java, .NET, and other technologies in the business of driving large scale enterprise applications are still very much relevant.

Re:Really? (1)

ZankerH (1401751) | more than 3 years ago | (#35621456)

This is a university-level Computer Science course, not a programming trade school. They aren't in the business of teaching people to write $5 iGroan apps.

Important question. (0)

Anonymous Coward | more than 3 years ago | (#35621214)

Is CMU an anagram for anything?

Anti-Modular? (5, Interesting)

msobkow (48369) | more than 3 years ago | (#35621216)

Apparently the meaning of "Modular" has changed since I was in University back in '82. OO used to be the epitome of modularity.

But I do agree that making it an introductory first-level course does warp the mind of the young programmer. There are a lot of languages that don't enable OO programming at all (e.g. Erlang), which become much more difficult for them to grasp because OO is so engrained in their thinking.

I can't think of anything specific about OO that makes it poorly suited to parallel programming. There are languages whose nature is parallelism (again, Erlang), but that's usually accomplished by adding parallelism operators and messaging operators to a relatively "traditional" language. I don't see why you couldn't add and implement those constructs in a non-parallel language.

I also shudder to think how a CS student is going to deal with parallelism using languages that don't make it a natural extension if they're learning to rely on those extensions in their first year.

I gotta tell you, though, I really object to the use of Java as an introduction language for programming. Java is far from a shining example of any particular style of programming. It's not real OO because it's only single inheritance. It's not designed for parallelism. It doesn't have messaging built in. In short, Java is actually a pretty archaic and restricted language.

Re:Anti-Modular? (1)

MemoryDragon (544441) | more than 3 years ago | (#35621300)

I also shudder to think how a CS student is going to deal with parallelism using languages that don't make it a natural extension if they're learning to rely on those extensions in their first year.

I don't shudder. They should learn why some patterns lend themselves better to parallelism than others, instead of learning some high-level tools baked into the language.
For instance, if you give a student Erlang or Scala with a dedicated actor model, I personally think they will never grasp why this high-level construct works better as a pattern than, say, a critical-region/semaphore-based model.
The students need a deeper understanding than applying a few patterns. The entire segment of parallelism should probably be taught at the same time OO and other things are taught. I would not teach it in an introductory course, where the emphasis should be basic algorithms.

So much for getting a real job... (1)

Anonymous Coward | more than 3 years ago | (#35621226)

...because all Fortune 500 companies have critical infrastructure written in some object-oriented language, most probably Java or .NET.

Now, I get "anti-parallel" what with manual synchronization, but Fork-Join (Java), Grand Central Dispatch (Objective-C) and whatever the C#/.NET equivalent is are supposed to go a long way towards solving these problems. Also, what's a real-world alternative used by large corporations? Clojure and Scala are by no means ready for enterprise use.

Oh yeah, and how is OOP "anti-modular"? What programming paradigms are "pro-modular"? C certainly isn't, nor are most functional languages.

This prof needs to get off his high horse and realize that his job is to prepare his students for the real world.

Re:So much for getting a real job... (1)

MemoryDragon (544441) | more than 3 years ago | (#35621360)

Clojure is the typical unreadable Lisp hackjob. Scala, however, is pretty enterprise-ready if you ask me; it is just one step up from a complexity point of view, and you need some time to grasp all the additional stuff. All that stands between Scala and the enterprise is their refactor-at-full-force mentality, which means the APIs are not entirely stable yet.
Btw, someone mentioned OOP will go away in favor of functional programming. This is a wet dream of the functional guys; not even parallelism will do that. Instead most languages, if they haven't yet, will evolve into multi-paradigm languages. Actors and messages, for instance, are an elegant high-level pattern for doing parallelism. Is that anti-OO? No. It just comes down to making your data structures immutable if you want to run parallel, or you run into semaphore hell.
So where is the need to go fully functional? And auto-parallelism, which has been proposed by the functional programming guys, is still a wet dream.
I think that while Scala might not be the future, it shows clearly where things are heading. After all, functional programming languages have their own set of problems, like not allowing proper program structuring unless you introduce modular and/or OO concepts.

Re:So much for getting a real job... (0)

Anonymous Coward | more than 3 years ago | (#35621374)

> his job is to prepare his students for the real world.

But it isn't. His job is to teach them Computer Science, not to train them in whatever tools du jour happen to be used in the industry. It's the industry's job to train them in tools; the university's job is to teach them fundamentals and science, i.e. the "right way", not the practical way.

What? (0, Troll)

mr100percent (57156) | more than 3 years ago | (#35621240)

Seriously? It's a little early for April Fools.

How will students learn the proper techniques to program for major OSes and platforms like Macs, iPhones (Objective-C), Android devices (Java), Microsoft's C#, Python, Perl, C++, and the dozens of OOP languages [wikipedia.org]?

Re:What? (3, Insightful)

Anonymous Coward | more than 3 years ago | (#35621308)

The point of a university Computer Science course is not to train students in iPhone development.

Re:What? (1)

Thundersnatch (671481) | more than 3 years ago | (#35621412)

No, but CS departments are not supposed to be ignoring the real world either. The fact that almost my entire CS degree in the early 1990s was taught via C and Scheme provided me with a good theoretical foundation, which has indeed made me adaptable and served me well over the course of time. But such a focus can make finding that first job quite painful; I was somewhat lucky. When you don't have $LANGUAGE_OF_THE_DAY on your resume, it gets circular-filed before interview, even when you come from a top-15 undergraduate school. It makes me cringe, but I actually find myself doing this circular-filing myself, as my small company cannot absorb the pain and expense of training up a newly minted CS student on the toolchains, languages, and libraries that are actually in use in the business world.

Re:What? (3, Insightful)

osu-neko (2604) | more than 3 years ago | (#35621332)

Perhaps by not quitting after their freshman year, and learning some OOP?

Re:What? (2)

multipartmixed (163409) | more than 3 years ago | (#35621354)

Geez, I don't know.

Most kids go to school for four years. Maybe they'll learn OOP in one of the other three?

Re:What? (0)

Anonymous Coward | more than 3 years ago | (#35621498)

If I were you I'd worry about being able to read a 5-line news summary instead of worrying about these poor students who are not taught how to program in Perl. They will do just fine. You might not. So again: "removing the required study of O-O from the Freshman curriculum", and the mandatory http://en.wikipedia.org/wiki/Freshman

But maybe you live in an alternate dimension where software companies do hire students who dropped out after their first year to do serious software development.

OOP in freshman year (5, Interesting)

janoc (699997) | more than 3 years ago | (#35621246)

From the position of someone who used to teach basic programming courses to freshmen, I can only applaud the decision.

Many kids coming to colleges these days do not have any programming experience, or a very shaky one at best. Picking up concepts like classes, inheritance, and the entire idea behind OO modelling is difficult if you are lacking basics such as how memory is managed, what a pointer is, how to make your program properly modular, etc. From the course description they are going to use a subset of C; I think that is a good starting basis for transitioning to something else (C/C++/C#/Java/...) later on.

What is worse, many of these introductory courses were given in Java - producing students who were completely lost when the black box of the Java runtime and libraries was taken away, e.g. when having to transition to C/C++. We are talking about engineering students here, who could be expected to work on some embedded systems later on or perhaps do some high-performance work. Even things like Java and C# still need C/C++ skills for interfacing the runtime with the external environment.

I think it is a good move, indeed.

Re:OOP in freshman year (4, Interesting)

DCheesi (150068) | more than 3 years ago | (#35621446)

I agree that OO in the 101 course is a little much. You should really be focusing on simple programming techniques that a non-major might encounter when, say, writing a batch script or macro. I'm not sure about the second semester courses, though, since those are more for potential majors. Certainly at some point a CS major needs to be exposed to OO, but I don't think it needs to come first.

As for understanding the infrastructure, I do think C/C++ get you closer, but in my experience it doesn't really click until you take some kind of computer architecture course or similar. For instance, I didn't *really* understand pointers until I understood how values and addresses are stored in memory.

Please enlighten me... (1)

rasmusneckelmann (840111) | more than 3 years ago | (#35621256)

Can someone name me some actual real world, large software projects based on functional programming? (Projects led by university professors don't count.) Dismissing OO completely because it is "anti-modular and anti-parallel by its very nature" seems kinda strange to me. I write parallel and modular OO software all the time... Maybe it's just me misunderstanding something.

Re:Please enlighten me... (1)

lennier1 (264730) | more than 3 years ago | (#35621342)

So, because last month a car manufacturer sold more cars with an automatic transmission we no longer need to teach people to drive with a stick shift?

Re:Please enlighten me... (1)

multipartmixed (163409) | more than 3 years ago | (#35621376)

> Can someone name me some actual real world, large
> software projects based on functional programming?

Ericsson's stuff scales to billions of cell phones and is written in Erlang.

Re:Please enlighten me... (1)

unwesen (241906) | more than 3 years ago | (#35621408)

It's anti-parallel because it invites you to do this:

struct foo {
    int a;
    int b;
};
struct foo asdf[200];

Instead of:

struct asdf {
    int a[200];
    int b[200];
};

The data layout in the first case does not exactly lend itself to efficiently processing all the a values in parallel with all the b values: each field's values are strided through memory rather than contiguous.

As with most programming paradigms, if you understand their limitations, you can work around them. But they do suggest certain ways of structuring code that may not be good for all applications.

Re:Please enlighten me... (0)

Anonymous Coward | more than 3 years ago | (#35621524)

You're an idiot.

Re:Please enlighten me... (1)

Eponymous Coward (6097) | more than 3 years ago | (#35621414)

Well, much of the cell telephone infrastructure runs on Erlang. That's probably the most famous example.

In the end, it doesn't really matter. If you are teaching computer science, you pick the tools that are the best for that job. If the university has done a good job, when you graduate you will be able to use whatever tools are required for the job at hand.

Re:Please enlighten me... (2)

Shinobi (19308) | more than 3 years ago | (#35621434)

Major parts of the world's telecom networks: for example, the software in any of Ericsson's equipment made in the last 15 years is written in Erlang. Nokia also uses a fair chunk of Erlang, IIRC.

Software upgrades for some equipment used by the Swedish Defense Forces, to accommodate the Network-Centric Warfare model, used functional programming too.

I've used Erlang myself for semi-embedded networks that need to work in parallel.

Re:Please enlighten me... (0)

Anonymous Coward | more than 3 years ago | (#35621522)

One well known and widely used operating system is written in a functional programming language: Emacs. The language is of course LISP.

This is just about as dumb... (1)

unwesen (241906) | more than 3 years ago | (#35621280)

... as people claiming OO is the only way forward out of the mire of procedural programming.

Tools, jobs, you know the drill *sigh*

Re:This is just about as dumb... (1)

osu-neko (2604) | more than 3 years ago | (#35621506)

Perhaps one could consider the possibility that OOP isn't the right tool for the job of teaching freshman students basic, introductory computer science concepts.

Honestly, it's a bit shocking to me to hear that any colleges these days try to teach OOP before anything else. Wrong tool for the job, indeed!

It's about time! (1)

Kufat (563166) | more than 3 years ago | (#35621306)

Thank goodness a university has finally decided to teach a curriculum based on what its professors like, instead of adhering to silly concerns about what might be useful in the real world. Students can rest assured that they'll get a first class CS education, and--sorry, what was that? Jobs? You want to get a job? What the fuck do you think this is, DeVry?

Now go finish your LISP homework!

Re:It's about time! (1)

MemoryDragon (544441) | more than 3 years ago | (#35621316)

Speaking of, I recently read a message from someone who was prayising Lisp as the perfect choice for expressing algorithms. Speaking of boneheaded, that is. This guy probably never worked with languages like Pascal, which really put strong emphasis on clean syntax for expressing algorithms; instead he has been drowning his brain constantly in parentheses.

Re:It's about time! (0)

Anonymous Coward | more than 3 years ago | (#35621442)

Speaking of what?

Silly troll, there are no parentheses. What, by the way, is prayising? Something sweet put on messages to a deity? And, seriously, Pascal?

Until you've expressed Knuth's algorithms in Lisp, you haven't seen true beauty. Enjoy your ignorance.

They are right (1)

aaaaaaargh! (1150173) | more than 3 years ago | (#35621312)

The decision is sound: OOP mostly leads to bloated software and to mind-bending, partly just plain stupid object-oriented constructions, and it strongly encourages mutation. In a sense it's even based on the idea of mutating objects. Because of the dramatic increase in concurrent programming, in the future strictly functional programming will be much more important than OOP. I understand that OO enthusiasts will not like to hear this, and of course OOP also has its good sides, but it is highly overrated. A good module system + immutable structures is much more important than classes and objects. If you use only immutable structures, many operations on these structures can be parallelized massively without a single change to the code.

Mutating objects is simply programming style (1)

Latent Heat (558884) | more than 3 years ago | (#35621458)

Basing your software on mutating objects is a programming style. You could just as easily base your program on immutable objects, creating such objects as needed and then discarding them.

The style of mutating objects comes about when you don't have good garbage collection. You want to avoid creating immutable objects that you have to discard as this stresses the garbage collector.

But they tell me there has been considerable progress in garbage collection, both in theory as to how it could be done and in practice in later versions of Java. The Java people are now encouraging the use of immutable objects because they claim their garbage collector is efficient enough to handle this style of programming, even thrives on creating objects with short lifetimes.

This is a gOOd thing. (2)

v(*_*)vvvv (233078) | more than 3 years ago | (#35621346)

Programming is about utilizing paradigms. Not being stuck in one. OO is just another layer on top of anything else. It is a set of rules that one can follow in any language. Some languages have it built in for convenience, but it is also an inconvenience for implementations where it is not so optimal.

There was definitely a moment when OO seemed to be some new paradigm in programming, but no, it is merely a tool, and CMU put it in its rightful place.

Thank you for doing that (0)

Anonymous Coward | more than 3 years ago | (#35621348)

This is a most welcome initiative, I would think.

First, to address some comments: OO programming is not removed, but just shifted to the sophomore level. As a CS professor myself, I get to appreciate the diversity of all students and the different backgrounds they come from. In first year most of them have never programmed anything (of course there are also occasional geeks, who can actually make the job interesting by asking more advanced questions), but for someone who has little experience in programming, starting straight with OO is an abomination. First teach what a variable is, what a function is, what side effects are, and recursion (very important). Then the basic data structures (stacks, queues, heaps, maps) and their implementations, both persistent (trees, AVL and so on) and mutable (arrays, hash tables, etc.).

Even more important, talk about data types and proper prototyping of functions, and use a statically, strongly typed language that enforces a strict typing discipline (no, you can't add strings and integers, and no, you can't treat an integer as a boolean, not when you are just starting to program).

Then teach modular design: encapsulation, interfaces, modules, compilation units. Teach the compilation and linking process, and basically how a program 'runs'.

Only then should you teach OO design (which is really not that bad and of course should be taught). OO relies on more complex concepts -- at least if you teach it properly -- (classes, objects, methods, message passing, dynamic dispatch, overloading, inheritance). It's heresy to teach what a 'method' is to students who barely know what a function is (both at the mathematical and the programming-language level).

And believe me, once you know the basic concepts -- functions, types, modules and so on -- understanding OOP, and actually using it efficiently when it's the best tool, is much, much easier.

Good strategy. (2)

davidbrit2 (775091) | more than 3 years ago | (#35621478)

I guess they want to build a nice moat to go with their ivory tower.