
Convincing Colleges to Upgrade Their Classes?

Cliff posted more than 11 years ago | from the updating-those-old-20lb-text-books dept.

Education 115

Pray_4_Mojo asks: "I'm an engineering student at the University of Pittsburgh, and I'm currently taking a required class known as 'Computer Interfacing'. While I enjoy the instructor, I find most of the material to be severely dated. We will spend the majority of the class covering RS232/XMODEM/Token Ring means of computer-to-computer communication. Almost no mention of USB, Firewire, or IrDA is made within the class. I am trying to convince my professor that this newer material is relevant, as these types of interfaces will be dominant in the world we future grads will be working in. As an example, I demonstrated that the keycard system controlling access to the Interfacing Lab has a USB port for data download/firmware programming. The professor seems interested, but it seems that I need to convince the department to revise the course requirements. Has anyone attempted to modernize their CS/Engineering program and met with success?"


FP DORKS!!! (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5514354)

Data dies at the end of STARTREK NEMESIS

Egads! (4, Funny)

jo42 (227475) | more than 11 years ago | (#5514447)

What next? Sneakernet?? 80K hard sectored 5.25" floppies??? Two tin cans and a string???? And this college degree is supposed to show that you are educated and get you a job in the real world?

Re:Egads! (1)

Synic (14430) | more than 11 years ago | (#5517254)

"Computer Science isn't about being prepared for the real world!" (this is what they always tell you at university)

Tanenbaum quote (0)

Anonymous Coward | more than 11 years ago | (#5527468)

"Never underestimate the bandwidth of a stationwagon filled with magtape."

Or something like that

The concepts you will learn are the same... (5, Insightful)

gtwreck (74885) | more than 11 years ago | (#5514470)

It's not about whether or not you have experience in the latest tools and technologies. It's whether you have the fundamentals in place to allow you to apply that fundamental knowledge to any other system.

In the specific case of serial interfaces, there really isn't all that much different between RS-232, RS-485, and USB or Firewire. They are all serial interfaces that employ the same fundamental concepts. In the real world you'll have to apply that knowledge to any number of serial interfaces.
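To make the parent's point concrete: whatever the electrical layer looks like (RS-232 voltage levels, RS-485 differential pairs, USB's encoded bus), at the bottom you are framing bits onto a serial line. A minimal sketch of classic 8N1 framing, in C++ purely for illustration and not tied to any particular course material:

    #include <cstdio>
    #include <cstdint>
    #include <vector>

    // One RS-232-style 8N1 frame: a start bit (low), 8 data bits LSB-first,
    // and a stop bit (high). The line idles high between frames.
    std::vector<bool> frame_8n1(uint8_t byte) {
        std::vector<bool> line;
        line.push_back(false);                       // start bit (space)
        for (int i = 0; i < 8; ++i)
            line.push_back(((byte >> i) & 1) != 0);  // data bits, LSB first
        line.push_back(true);                        // stop bit (mark)
        return line;
    }

    int main() {
        for (bool bit : frame_8n1('A'))              // 'A' = 0x41
            std::printf("%d", bit ? 1 : 0);
        std::printf("\n");                           // prints 0100000101
    }

Learning to reason about a frame like this is the transferable part; the connector and the signalling rate are the details that change between standards.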

The same logic can be applied to a discussion yesterday about using MS or open source programming environments in a CS department.

Re:The concepts you will learn are the same... (2, Insightful)

itwerx (165526) | more than 11 years ago | (#5514520)

I'll second this. It costs an unbelievable amount of money (millions of dollars) to design and test a curriculum.
You see this in other fields as well (e.g. psychology, business etc.) As long as you're getting the concepts, it doesn't matter what the mechanics of the course are based on.
You'll learn the real-world applications of those concepts quickly enough in the real world. :)

Re:The concepts you will learn are the same... (2, Troll)

override11 (516715) | more than 11 years ago | (#5514660)

Soooo... go to school for 4-? years to learn the basics in a field (and go into major debt doing it), then go into a real-world job and get told 'forget what you learned in school, this is different'... It's the curriculum model that is antiquated. It's just not practical in many industries where knowledge advances faster than a curriculum can be made. The last thing I'm going to do is pay to get an education about 5-year-old hardware/software that by the time I graduate is not used any more! :P

Re:The concepts you will learn are the same... (5, Insightful)

itwerx (165526) | more than 11 years ago | (#5514969)

No, not at all. (Are you trolling? :)
You are told to forget the technology which was used to convey the concepts, but the concepts are where the value is.
Here's an example.
If you want to learn how to fly a 747 you don't start out on one! You spend many years and tens of thousands of dollars learning the concepts on smaller aircraft. Granted, knowing the gauge layout of a Cessna has zero relevance to a 747, but the concept of watching your fuel levels applies equally well in either case.
So yes, when you get to 747 school they will say "forget all that other airplane stuff" but they're not really telling you to forget the concepts, just the nitty-gritty details that you don't need any more.
Comprenez-vous?

Re:The concepts you will learn are the same... (2, Insightful)

override11 (516715) | more than 11 years ago | (#5515060)

Ok, I agree with that totally. I am just griping because I am in IT, and have been working in IT for almost 6 years now. When I look at classes offered at colleges, they are all old school... C programming, I mean sheesh, when I took the A+ test, they had old Apple IIe questions on it, and IRQ questions!! I tell you, I memorized IRQ usage and its associations to COM ports etc., and the only time I have used this in years is on antique equipment. I guess I am expressing my frustration; I would like to go to college, but I don't see any use ATM! :P

Re:The concepts you will learn are the same... (2, Insightful)

itwerx (165526) | more than 11 years ago | (#5515354)

In your position college will only be useful if the fact of having the degree impresses the right people or you want to change careers. :)

747? A poor example. (1, Insightful)

Anonymous Coward | more than 11 years ago | (#5517749)


If you want to learn how to fly a 747, I'm betting there are about a zillion more things that you have to do before even getting to taxi to the runway, as compared to a Cessna.

On the other hand, if you want to learn how to program by learning C#.NET, all you need is notepad and a compiler. In other words, the existence of advanced stuff won't get in the way of the basics, and in the meantime, you're learning modern syntax and modern thought patterns.

Not only that, but remember that learning the syntax is vital to becoming proficient -- and employable. You are much less likely to be asked to "forget" Java/C++/C# than, for example, lisp or cobol.

- a.c.

Re:The concepts you will learn are the same... (1)

Artemis (14122) | more than 11 years ago | (#5521129)

Way to copy an AC post and get a +5 for it. Good ole Slashdot moderation.

Re:The concepts you will learn are the same... (3, Insightful)

kooshball (25032) | more than 11 years ago | (#5515384)

It costs an unbelievable amount of money (millions of dollars) to design and test a curriculum.

I'm not sure where you went to school, but I've never studied or taught anywhere that spent ANY money on designing the curriculum. And testing it? Forget it!

Most professors are left to their own devices to cover what they like in class as long as they hit a few basic points. For instance, compare the syllabi of the same Macroeconomics course as taught by a Keynesian and a Monetarist who studied under Milton Friedman. They will look like completely different courses!

The first course that I ever taught was a core undergraduate Microeconomics course. When I asked the chairman of the department what I should cover, he told me that I should take a look at the last semester's syllabus for ideas, but could really cover anything I wanted.

Re:The concepts you will learn are the same... (1)

dacarr (562277) | more than 11 years ago | (#5514582)

Yes, but *token ring*? Near as I can tell there is a vast difference between TR network protocol and your standard Ethernet stuff.

On these grounds, I would propose that VMS be reinstated as the standard big iron system, and OS/2 be revived.

I'm not fresh on TR.... (2, Insightful)

gtwreck (74885) | more than 11 years ago | (#5514632)

...but I do remember it's fairly different from standard TCP/IP. Which would make it quite a useful platform to teach concepts, as there would almost certainly be a dedicated TCP/IP class in most curriculums anyways. It's good to demonstrate different (if unusual) concepts.

This is why CS curriculums include languages not widely used in industry, such as LISP: precisely because they do things radically differently.

Re:I'm not fresh on TR.... (1)

dacarr (562277) | more than 11 years ago | (#5514713)

I concur with this, but as a basis it's probably not a good idea.

In immediate retrospect, I'll grant the expense problem, especially in computers, where "shelf life" tends to revolve right around ten minutes for any given thing.

Re:I'm not fresh on TR.... (1)

Dahan (130247) | more than 11 years ago | (#5514760)

...but I do remember it's fairly different from standard TCP/IP.

You're trying to compare two things that aren't exactly comparable :) TCP/IP is a different layer from Token Ring in the ISO view of things... TCP is layer 4, IP is layer 3. Token Ring is layer 1 and 2, like Ethernet. In other words, you should be comparing TR to Ethernet, not to TCP/IP.

You can run TCP/IP over a Token Ring network, just get a TR network interface card. Just like you can run TCP/IP over a serial line, FDDI, SONET, etc...
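To make the layering point concrete: running TCP/IP over Token Ring (or Ethernet, or a serial line) just means the IP packet rides inside whatever layer-2 frame the medium uses. A rough C++ sketch of that nesting, using invented, greatly simplified field layouts rather than the real wire formats:

    #include <cstdint>
    #include <vector>

    // Greatly simplified stand-ins for the real headers; the fields are
    // invented for illustration and are not the actual wire formats.
    struct TcpSegment     { uint16_t src_port, dst_port; std::vector<uint8_t> payload; }; // layer 4
    struct IpPacket       { uint32_t src_addr, dst_addr; TcpSegment segment; };           // layer 3
    struct TokenRingFrame { uint8_t src_mac[6], dst_mac[6]; IpPacket packet; };           // layer 2

    // Swap TokenRingFrame for an EthernetFrame (or a PPP frame on a serial
    // line) and the IP and TCP layers above it are untouched -- which is
    // why Token Ring should be compared with Ethernet, not with TCP/IP.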

Re:I'm not fresh on TR.... (1)

stefanlasiewski (63134) | more than 11 years ago | (#5514942)

Bah ha! They taught you how to network using Token Ring!

Re:I'm not fresh on TR.... (1)

Chacham (981) | more than 11 years ago | (#5515253)

Token ring is still around, and some newer technologies are based on it.

Re:I'm not fresh on TR.... (2, Insightful)

sasami (158671) | more than 11 years ago | (#5517738)

Token ring is still around, and some newer technologies are based on it.

Fibre Channel.

Well, it's not a literal descendant of Token Ring (is it?). But it's certainly a loop topology. And frequently, the primary cost of deploying a Fibre Channel SAN is in training Ethernet-centric people to administer it properly. (Indeed, the nascent iSCSI market is driven less by a distaste for expensive FC switches than by an aversion to sending one's admins to FC boot camp for six weeks.)

Incidentally... why a loop? Isn't the concept outdated? Not if you're a storage transport that needs to enforce fairness.

--
Dum de dum.

Re:The concepts you will learn are the same... (2, Insightful)

FroMan (111520) | more than 11 years ago | (#5515191)

There are lots of concepts that come and go. And then they sneak back up on everyone. Take peer-to-peer and client-server concepts for instance. They pop back and forth in fashion. Virtual machines seem to come and go also. VM's were not started with Java you know.

Now, token passing is a valid idea. It may not be used for networks currently, but for systems that cannot withstand collisions of any sort, token passing is a valid algorithm.
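As an illustration of why token passing guarantees collision-free access in that kind of system, here is a toy round-robin sketch in C++; the stations and their queued frame counts are made up:

    #include <cstdio>
    #include <vector>

    // Toy round-robin token passing: only the station holding the token may
    // transmit, so collisions are impossible by construction.
    int main() {
        std::vector<int> pending = {2, 0, 1, 3};   // frames queued at stations 0..3
        int token = 0;                             // station currently holding the token
        for (int pass = 0; pass < 12; ++pass) {
            if (pending[token] > 0) {
                --pending[token];
                std::printf("station %d transmits one frame\n", token);
            }
            token = (token + 1) % static_cast<int>(pending.size());  // pass the token on
        }
    }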

My point is that just because you cannot find an immediate way to apply a concept does not mean it isn't worth teaching. Sometimes it's much easier to teach from older hardware and such because it can be simpler, or less integrated.

Consider if you were to try to write an OS now: it would be overwhelming because of all the details that need to be supported to make it equivalent to what already exists. Many years ago, protected memory, virtual memory, GUIs, and many other things had not even been thought up yet.

Re:The concepts you will learn are the same... (1)

MrWa (144753) | more than 11 years ago | (#5514935)

The same logic can be applied...

Please don't be so rational - it breaks the flow of nonsense that I usually am able to indulge in when reading "Ask Slashdot" discussions.

Re:The concepts you will learn are the same... (4, Informative)

cybermace5 (446439) | more than 11 years ago | (#5515031)

No.

USB and Firewire are vastly different from EIA-232 and siblings.

USB is much closer to Ethernet than it is to EIA-232. I've done some serial development and some USB development, and the USB development is abstracted from hardware by several layers, while serial is barely abstracted by one layer (in microcontrollers, if you're lucky enough to get a UART).

It really is different. I would agree that students would benefit from learning more modern interfaces later on, though EIA-232 is perfect for teaching basic communications concepts. I certainly had difficulty the first time I developed a USB peripheral; it had never been taught, and was barely mentioned at all.

It makes sense now. The abstraction almost makes it easier to develop for on the PC side, and there are amazing features built right into the protocol. A simple microcontroller can change from a keyboard, to a mouse, to a joystick, or dozens of other devices with a simple change in firmware.
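The "barely abstracted by one layer" case is easy to picture. Here is a hedged sketch of a polled UART transmit routine in C++; the register addresses and the TX_READY bit are invented for illustration, since every real microcontroller's datasheet defines its own:

    #include <cstdint>

    // Hypothetical memory-mapped UART registers -- addresses and bit names
    // are made up; consult the actual part's datasheet.
    volatile uint8_t* const UART_STATUS = reinterpret_cast<volatile uint8_t*>(0x40000000);
    volatile uint8_t* const UART_DATA   = reinterpret_cast<volatile uint8_t*>(0x40000004);
    const uint8_t TX_READY = 0x01;

    // Sending a byte is just polling a status flag and writing a data register.
    void uart_send(uint8_t byte) {
        while ((*UART_STATUS & TX_READY) == 0) { /* spin until the transmitter is free */ }
        *UART_DATA = byte;
    }

A USB transfer, by contrast, goes through endpoint descriptors, a host controller, and an OS stack before any wire is touched, which is the abstraction gap described above.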

Re:The concepts you will learn are the same... (1)

91degrees (207121) | more than 11 years ago | (#5515888)

It all depends on the level that you're looking at things. The connection between software and the raw data is irrelevant to some fields, and critical to others.

I'm not actually sure how USB works. I always assumed it used the basic transfer technique that serial does for the actual communications (i.e. high = 1, low = 0, we have a parity bit, etc...). What you really need to know about communications, though, is things like different data encoding methods, handshaking, and the various protocols used for sending data down a single piece of wire. Given enough of this information, you can understand any design with just a few key words, e.g. Ethernet uses fixed-length packets with a packet size of X (can't quite remember - how embarrassing), and Manchester encoding.
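(For the record, classic Ethernet frames are variable length rather than fixed, roughly 64 to 1518 bytes.) Manchester encoding itself is simple enough to show in a few lines; a minimal C++ sketch, using the IEEE 802.3 convention:

    #include <vector>

    // Manchester encoding, IEEE 802.3 convention: each data bit becomes a
    // mid-bit transition -- a 0 is sent high-then-low, a 1 low-then-high --
    // so the receiver can recover the clock from the data itself.
    std::vector<bool> manchester_encode(const std::vector<bool>& bits) {
        std::vector<bool> line;
        for (bool b : bits) {
            line.push_back(!b);  // first half of the bit cell
            line.push_back(b);   // second half: the transition encodes the bit
        }
        return line;
    }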

Fundamentally the same stuff (2, Insightful)

RevAaron (125240) | more than 11 years ago | (#5514615)

When it comes down to it, the stuff you are learning is the same as all the modern interfaces. The same concepts, not much different. Sure, USB is a bit faster than 56k serial/RS232. But in the end, the *tools* it takes to learn this stuff are the same ones that would enable you to learn what is behind USB and Firewire with relative ease. Hell, at least half the class (probably a lot more) will forget the information anyway, for lack of it being useful down the line.

Eventually, USB and FireWire may be what is taught in that class, provided they stand the test of time like *MODEM and RS232 have.

Fake Assembly (2, Interesting)

jasonrocks (634868) | more than 11 years ago | (#5516972)

I started school at BYU this semester. I'm going into CS. The first required class had a horrifying syllabus. We were to learn about C, basic electronics, and assembly language built for a theoretical computer. I was disgusted that we would learn just about NOTHING which would be practical. I transferred out of there so fast. Now, I just hope I can get exempted from that class or take a C++, Java, or x86 assembly course instead.


void

Re:Fake Assembly (0)

Anonymous Coward | more than 11 years ago | (#5517693)

You've gotta learn to walk before you can run. I understood CS best after taking 80x86 ASM and circuit design.

Re:Fake Assembly (4, Informative)

nbvb (32836) | more than 11 years ago | (#5518064)

Then you don't understand what a CS degree is good for.

My suggestion: Go to Chubb.

If you start thinking in the "That's not practical, who cares" mode, you belong in a trade school.

Sorry, I know that's not very politically correct, but it's the TRUTH.

Now, if you want to learn real computer SCIENCE, stick it out.

Learning assembly language for a theoretical computer is a great exercise -- you have to actually exercise that mush between your ears!

My favorite class in CS was Theory of Digital Machines.... designing AND, OR, NOT gates, building some theoretical microprocessors .... stuff that isn't "practical", but that theory means the world to you later on ...

Again, if you want practical, go to Chubb. If you want to learn something, stick it out ...

--NBVB

Re:Fake Assembly (0)

Anonymous Coward | more than 11 years ago | (#5520737)

You, sir, are my hero.

+10, CORRECT OPINION.

If I hear one more brainless little twerp whine about how his CS degree isn't teaching him how to code in (trendy_language_001), I'm going to scream.

For some reason, computer science seems to attract a large quantity of unbelievably stupid people with no concept of the difference between **science** and a trade skill. Let's face it, folks, a trained monkey can write code... how on earth do you think writing code is science??

I've *never* heard a physics major bitch that they didn't cover how to use a microwave oven in classes... Likewise, I don't see many plumbers taking continuum mechanics courses and being "horrified" to find that the class doesn't include a section on welding u-joints.

"But professor! These Navier-Stokes equations are useless!! They don't show me how to plumb a bathroom in (insert_trendy_condo_architecture)!"

Why do we get all the retards in CS?

Re:Fake Assembly (2, Insightful)

g4dget (579145) | more than 11 years ago | (#5522558)

I'm going into CS. [...] I was disgusted that we would learn just about NOTHING which would be practical.

Computer Science is not job training. If you want job training, take a Cisco or Microsoft certification class.

A good computer science program will teach you very little that is "practical"; it is expected that you can pick up C++, Java, or x86 assembly language on your own when you are done. If you can't, or if you don't want to, you are enrolled in the wrong field of study.

The other side of the ditch (0)

n1k (560089) | more than 11 years ago | (#5514625)

Carnegie Mellon University, one of the best CS/ECE colleges in America, is like 2-3 blocks down the road. ;)

But, seriously, I have to agree with gtwreck: The old-school interfaces that you're learning probably aren't that far off from current technology.

It's a heck of a lot of work to structure a good course. You may talk to your prof about projects done outside of class (in place of a paper or as extra credit) to start out.

You have to walk before you can run (2)

shaka999 (335100) | more than 11 years ago | (#5514694)

These interfaces are much simpler and are a good base for moving forward. While you're not likely to use them, the concepts are the same, AND if you go into a lab you have a reasonable chance of using one of the older interfaces.

That said, a brief discussion of newer interfaces towards the end of the class is probably relevant.

I know when I was in college we studied the PDP-11 architecture. Well, I've never done anything on a PDP-11, and except for hobbyists I doubt you could find one, BUT I do have a solid understanding of concepts that started with that class.

Re:You have to walk before you can run (1)

MyGirlFriendsBroken (599031) | more than 11 years ago | (#5516431)

These interfaces are much simpler and are a good base for moving forward. While you're not likely to use them, the concepts are the same, AND if you go into a lab you have a reasonable chance of using one of the older interfaces.

Absolutely true. I have been taught the way the poster describes, and it is paying off. Yesterday I had a written and verbally assessed test on the MIPS ISA and it went OK (no previous experience). This is only because I had tuition in basic computing and ISA constructs. I would say no matter how complicated the system, you still have to deal with the hardware, and the best way to get started is to start simple. After all, these are the devices everybody learned on, and therefore the technology all other devices are based upon.

My advice is to stick with what you've got and see how it goes. After all, there is nothing more valuable than favourable contacts to get your name in and then impress at the interview.

Hope all of this helps; just my observations from UK grad recruitment.

The new technologies are bad to learn from (4, Interesting)

LordNimon (85072) | more than 11 years ago | (#5514748)

The newer technologies are much harder to learn from than the older ones. The speeds are much higher, the protocols are more complicated, and the tools are more expensive. For a beginner learning this stuff, you never want to work with the latest technologies.

If you really want to learn about Firewire, do something with it for your Senior project.

Re:The new technologies are bad to learn from (2, Insightful)

Glonoinha (587375) | more than 11 years ago | (#5515181)

Put another way, the students can easily think as fast as a CCITT V.21 connection (that's a 300 baud dial-up connection) and actually follow the modulation/demodulation routines, converting each ASCII character from an 8-bit string of 1's and 0's to an actual character in real time. That's like 30 per second, no problem. They can actually follow the train of computer actions from sound on the telephone line to characters on the screen in their head.
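The decoding step described here, turning a demodulated string of 1's and 0's back into an ASCII character, is a one-liner once it is no longer happening at line speed. A small C++ sketch:

    #include <cstdio>
    #include <string>

    // Turn a string of eight '0'/'1' characters, as read off the line,
    // back into the ASCII character it represents (MSB-first here purely
    // for readability).
    char bits_to_ascii(const std::string& bits) {
        char c = 0;
        for (char b : bits)
            c = static_cast<char>((c << 1) | (b - '0'));
        return c;
    }

    int main() {
        std::printf("%c\n", bits_to_ascii("01000001"));  // prints 'A' (0x41)
    }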

Crank the speed up slowly, give the students a chance to listen to the differences, and the good students will be able to differentiate between connection speeds by listening to the modem connect. Good students should be able to follow the traffic up through at least a 2400 bps connection - after that it is a little difficult to follow individual byte traffic and it all blurs together.

Crank it up some more and you can move 5000 bytes of information a second through the same channels - but you have lost the ability to relate to what is happening.

Crank it up to USB and 1394 interfaces and sure, you are moving 1 MB/second or more ... but it is no longer technology you understand and can trace through in your head step by step; it becomes black-box magic.

Bingo - the University isn't looking to put out six-week wonders who can slap together objects from MFC and Visual Studio .NET; they want to put out engineers and scientists who can look at the binary dump of an executable and see where the code makes an unexpected jump to the end of the code listing (generally slop space of nulls) after the literal table and starts executing code that was not put there by the original compiler (I am describing the way virus code overlays regular code in ordinary executables, of course).

L/Nimon - I'm not disagreeing with you, this just looked like a good place to state my opinion.

Re:The new technologies are bad to learn from (1)

ables (174982) | more than 11 years ago | (#5515608)

The newer technologies are much harder to learn from than the older ones. The speeds are much higher, the protocols are more complicated, and the tools are more expensive. For a beginner learning this stuff, you never want to work with the latest technologies.

This applies to all sorts of things. The idea of the class is not to learn RS-232 or RS-485 or RS-3.14159 or whatever. The idea is to learn serial computer-to-computer communication, and the best way to do that is to minimize time on the nuances of the protocol and get the general concepts down. RS-*** is simple, so you can get the mechanics out of the way quickly.

The best electronics class I ever took (Physics 123 at Harvard from the creators of The Art of Electronics [artofelectronics.com] ) had us building a computer from individual chips based around a 68000 processor. Nothing modern and useful like VGA cards or PCI, but now I have a good understanding of the general concepts behind microcomputer design. Never could have done that in 1/2 a semester with current technology.

More than fundamentals (2, Insightful)

Synistyr (529047) | more than 11 years ago | (#5514758)

You can teach fundamentals with Cobol and Logo too.



A school teaching the 'fundamentals' using newer technology, like php, .NET, firewire, usb, irda, would hopefully give you a better chance of getting a decent job than one still using older technology.

Re:More than fundamentals (2)

m0rph3us0 (549631) | more than 11 years ago | (#5514869)

Teaching implementations is the job of trade schools, not universities. I expect MCSEs and such to have to go back to school every time a new OS comes out. I would not expect the same of CS/Engineering grads. The service the uni provides is education, not job training; an increase in knowledge/intelligence just happens to make you more valuable in the workplace. Uni is not a job training centre, otherwise you would have classes like Resume 254.

Re:More than fundamentals (1)

nellardo (68657) | more than 11 years ago | (#5515005)

If students don't take the course, enrollment drops, and the university cuts funding to the department (as funding is based in part on enrollment in the department's classes).

Students still demand courses that look good on resumes. Students still demand courses that are enjoyable and interesting. The faculty has a responsibility to teach the "right" material, but simple Darwinian survival of the department means that the faculty must teach that material in a way that gets butts into seats. One way to keep butts out of seats is to teach boring and drab stuff, stuff that the students will find irrelevant at the time (even if the faculty knows it isn't irrelevant). RS-232 is not terribly exciting in a world of USB and Firewire, even if the principles are the same.

But if the students aren't learning that the principles of serial interfaces are the same no matter what the bandwidth or connector or acronym, whose fault is that?

Re:More than fundamentals (1)

sasami (158671) | more than 11 years ago | (#5517858)

Students still demand courses that look good on resumes.

Have you ever interviewed a new grad?

Every hotshot college grad learns very quickly that the "practical" skills you learn in school are worth squat. I've only been out of school for a few years but I code ten times faster and better than I did then, no exaggeration -- and that's still ten times worse than the best of my cow-orkers.

Which is why I always prefer to see a solid courseload in fundamentals and theory. Then I know I've got someone who can think, design, do research, and (depending on the project) get up to speed fast. Code monkeys are a dime a dozen.

--
Dum de dum.

Re:More than fundamentals (1)

nellardo (68657) | more than 11 years ago | (#5519237)

I said:

Students still demand courses that look good on resumes.
Sasami [slashdot.org] said:
Have you ever interviewed a new grad?
Of course. But I'm not talking about grads. I'm talking about students choosing courses.
Every hotshot college grad learns very quickly that the "practical" skills you learn in school are worth squat.
They may learn this once they're out of school - I was talking about the dynamics of students choosing courses, and how that affects an academic department's thinking.
Which is why I always prefer to see a solid courseload in fundamentals and theory. Then I know I've got someone who can think, design, do research, and (depending on the project) get up to speed fast. Code monkeys are a dime a dozen.
Maybe you aren't posting all the job listings with catalogs of acronyms in them, but someone is. Students read them and come to the reasonable conclusion that they need to know about those acronyms in order to obtain the best job opportunities. I'm not saying it's a good thing. I'm just explaining the phenomenon as someone that saw it - students didn't care if Java was pedagogically the best language or not. They wanted to learn Java because that's what employers wanted. The department could meet that demand and increase its enrollment or ignore it and reduce its enrollment.

Re:More than fundamentals (1)

blackcoot (124938) | more than 11 years ago | (#5515721)

Moderate the parent comment up!

Personally, I really don't think that undergraduate classes should discuss the latest and greatest bleeding-edge technology until the students have all the tools necessary to understand how the technology works. It's hard to really appreciate the power of things like Paradigm X and Technique Y until you've had to do it the hard way. Likewise, it's hard to know how to apply Paradigm X and Technique Y in appropriate ways if you haven't seen any alternatives.

This all said, I don't think it's a bad thing to discuss these things in class. I found the case studies of Solaris, WinXP, BSD, Linux, etc. in my OS text to be invaluable, but that was only after I understood the driving principles. However, a comparison between threading models on these platforms is essentially useless until you know what the difference between 1:1, N:1, and N:M threading models is.
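For what it's worth, the only one of those models the standard library shows you directly is the 1:1 case. A minimal C++ sketch of that case only; N:1 ("green" threads) and N:M hybrids multiplex many logical threads over fewer kernel threads and cannot be demonstrated this way:

    #include <cstdio>
    #include <thread>
    #include <vector>

    // On mainstream desktop platforms each std::thread below is backed by
    // its own kernel-scheduled thread: the 1:1 model.
    int main() {
        std::vector<std::thread> workers;
        for (int i = 0; i < 4; ++i)
            workers.emplace_back([i] { std::printf("worker %d running\n", i); });
        for (auto& t : workers)
            t.join();
    }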

Re:More than fundamentals (0)

Anonymous Coward | more than 11 years ago | (#5519498)

A school teaching the 'fundamentals' using newer technology, like php, .NET, firewire, usb, irda, would hopefully give you a better chance of getting a decent job than one still using older technology

A school teaching those "fundamentals" is not teaching fundamentals at all! They are teaching a specific implementation. That is exactly the kind of "relevant" training (n.b., that's training--not education) that will leave you "technically obsolete" within a decade of graduation. That leaves you unemployable.

Real fundamentals are things like DeMorgan's theorem, state machines, and yes, machine language programming. Don't forget the EM field theory, math, and circuit theory while you're at it. Every digital circuit is really an abstraction of its implementation, which is an analog circuit. High-speed digital design is the science of keeping the digital abstraction (gates and registers) valid when the signal speeds approach the limits of the underlying analog components (FETs and transmission lines). Dr. Howard Johnson's book is a great place to see what real design is about. Sign up for his email newsletter, too.

The fundamentals necessarily cover a broad range of topics, and you must understand them all if you are to be successful. Not be an expert in everything, but know enough to recognize what's going on in your specialty field and how things will affect it. Tunnel-vision concentration on detailed implementations is a great way to train (not educate) designers (not engineers) who can't do anything but regurgitate what somebody else has already done. Businesses who hire these guys are a dime a dozen these days, or more accurately, a dime a share!
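One of those named fundamentals is small enough to check mechanically. A C++ sketch that exhaustively verifies De Morgan's theorems over both truth values, included purely as an illustration of the kind of bedrock being talked about:

    #include <cassert>

    // De Morgan's theorems:  !(a && b) == (!a || !b)  and  !(a || b) == (!a && !b)
    int main() {
        for (int i = 0; i < 2; ++i)
            for (int j = 0; j < 2; ++j) {
                bool a = (i == 1), b = (j == 1);
                assert(!(a && b) == (!a || !b));
                assert(!(a || b) == (!a && !b));
            }
        return 0;
    }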

Re:More than fundamentals (1)

sql*kitten (1359) | more than 11 years ago | (#5520128)

A school teaching the 'fundamentals' using newer technology, like php, .NET, firewire, usb, irda, would hopefully give you a better chance of getting a decent job than one still using older technology.

A degree prepares you for a career, not a job. A career is a marathon, and a job is a sprint. It's not about "how to do X with Y", it's about "how to do $X with $Y" - do you see the difference? The first is like hardcoding everything into your program; the second is like abstracting all your constants into a header file and using variables everywhere else.
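The hardcoding analogy, spelled out as C++ code; the open_port() call and the constant names are hypothetical, there only to show the shape of the idea:

    // "How to do X with Y" -- everything baked in:
    //     open_port("/dev/ttyS0", 9600);
    //
    // "How to do $X with $Y" -- the specifics pulled out into one place:
    #ifndef CONFIG_H
    #define CONFIG_H
    constexpr char kSerialDevice[] = "/dev/ttyS0";   // hypothetical names and values
    constexpr int  kBaudRate       = 9600;
    #endif
    // elsewhere:  open_port(kSerialDevice, kBaudRate);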

At college I learnt how to program in FORTRAN, DCL, and a language you probably haven't even heard of for programming CNC machines. There were no variables in this language; if you wanted to store a value, you had to move one of the tool heads to that position, and to recall that value, you had to use the command that told you where the head was! Fortunately there were 6 heads, and you usually only needed 3 to make what you were making...

But what mattered were the principles. Nowadays I use Solaris 9, Oracle 9i, Java 1.4, all the latest technologies, with no problems, and I'll have no problems on versions 10, 11, etc. Whereas all the NT4 MCSEs had to start again with Windows 2000, and they'll have to start all over again with .NET.

Funny you mention that. (1)

jotaeleemeese (303437) | more than 11 years ago | (#5529470)

When I was a student I learned Lisp and Pascal to learn *concepts*. I have never used those languages professionally.

In the "Real World" [tm] I have used COBOL, ALGOL, C, C++, Java, Perl and two or three more exotic ones.

It would have made no difference to my preparation as a proficient engineer if I had learned a bigger number of buzzwords per hour.

RS-232 etc are fine (1)

m0rph3us0 (549631) | more than 11 years ago | (#5514813)

When I learned ASM I learned it on a Motorola HC11, not a P4. Learning the concepts is much more important than learning an implementation.

If one is unable to extrapolate the knowledge gained from studying one form of serial interface to another, then even if the course were modernized you would still be required to go back to school when the USB/FireWire/whatever-the-course-is-taught-with fad ends.

Re:RS-232 etc are fine (1)

n1ywb (555767) | more than 11 years ago | (#5515699)

I learned assembly on x86. Then I learned assembly on the Microchip PIC. Then I learned Motorola HC12 assembly (backwards compatible with the HC11) and I thought to myself, "Holy Christ, HC12 assembly is dumb." Maybe I just had a bad teacher. IMO PIC assembly is the one to teach in the classroom because of its simple yet powerful instruction set.

Real experience modernizing curriculum (4, Interesting)

nellardo (68657) | more than 11 years ago | (#5514827)

Back in the day, in the early 90's, I was largely responsible/to blame for switching Brown University's [brown.edu] undergraduate first semester programming course [brown.edu] to object-oriented programming. It had been teaching structured programming, but industry at that time was following object-oriented design precepts (even if using languages like C). The faculty all firmly believed in OOP, and taught it in all upper-level courses. But it was seen as "too advanced" for beginning students.

As it turned out, the real problem was not teaching OOP to the students. OOP is easier to explain to new programmers than structured programming (people use real-world objects all the time - not so much real-world procedures). Half-way through the first semester, the students could implement Tetris.

The real problem was retraining the faculty. Even though they knew OOP was a good thing, it took them a while before they had internalized OOP enough to present, e.g., algorithms and data structures in an object-oriented style. No one believed that you could teach inheritance and polymorphism before you taught loops, conditionals, and arithmetic.
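The kind of first-semester example that makes this plausible is tiny. A hedged C++ sketch (class names invented, loosely Tetris-flavored) of one message with many behaviors, which is all the polymorphism a beginner needs at first:

    #include <cstdio>
    #include <memory>
    #include <vector>

    // One message ("draw yourself"), many behaviors.
    struct Piece {
        virtual void draw() const = 0;
        virtual ~Piece() = default;
    };
    struct Square : Piece { void draw() const override { std::printf("[]\n"); } };
    struct Line   : Piece { void draw() const override { std::printf("----\n"); } };

    int main() {
        std::vector<std::unique_ptr<Piece>> board;
        board.push_back(std::make_unique<Square>());
        board.push_back(std::make_unique<Line>());
        for (const auto& p : board) p->draw();   // same call, different behavior
    }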

Faculty teaching the intro courses may be in touch with industry and research. That's not enough. The faculty need to rethink an entire course to present the right academic material in a modern, industry-relevant way. If the faculty can do that (and, make no mistake, it isn't easy), then you'll get a course the students love, that will get them a job, and that will prepare them for a strong academic program as well.

For the truly curious, the textbook [amazon.com] for that course is actually still in print, even though it depended on Borland Pascal, which is long-since defunct.

Re:Real experience modernizing curriculum (1)

d^2b (34992) | more than 11 years ago | (#5524962)


The real problem was retraining the faculty. Even though they knew OOP was a good thing, it took them a while before they had internalized OOP enough to present, e.g., algorithms and data structures in an object-oriented style. No one believed that you could teach inheritance and polymorphism before you taught loops, conditionals, and arithmetic.


Of course you can teach polymorphism first. I'm just not (yet) convinced it is a good idea.

The problem is that algorithms are all about loops, conditionals, and arithmetic. The other stuff is completely orthogonal. You might not agree about which is more important, but it turns out that students really lack basic algorithmic thinking when they are immersed in OO first. The theory people think they don't understand algorithms. The systems people think they don't understand computers.

I think OOP is good, useful stuff. I'm not sure it is how people should start.

Re:Real experience modernizing curriculum (1)

nellardo (68657) | more than 11 years ago | (#5525532)

I said:
No one believed that you could teach inheritance and polymorphism before you taught loops, conditionals, and arithmetic.
d^2b [slashdot.org] said:
Of course you can teach polymorphism first. I'm just not (yet) convinced it is a good idea.
It wasn't an "of course" ten years ago, when I was first working on the curriculum.....
The problem is that algorithms are all about loops, conditionals, and arithmetic. The other stuff is completely orthogonal.
I'd disagree. An algorithm works hand-in-hand with a data structure. A sorting algorithm works in a given run-time because of the nature of the operations allowed on the data structure it operates on. This is obvious when you compare, e.g., insertion sort on an array versus insertion sort on a linked list. The red-black tree algorithm for balanced trees is useless without the data structure that allows for "coloring" the nodes of a tree.

Algorithms are behavior. Data structures are data. What is an object? Behavior and data tied together. If they are inextricably related (like the red-black tree algorithm and the colorable tree nodes), work on them together, rather than trying to create artificial separations between them. That's just good software engineering.
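In miniature, and only as a fragment rather than a working red-black tree, the "behavior and data tied together" point looks like this in C++:

    // A red-black tree node owns both its color (data) and the recoloring
    // behavior the balancing algorithm relies on. Fragment only; rotations
    // and insertion are omitted.
    struct RBNode {
        enum Color { RED, BLACK };
        int     key   = 0;
        Color   color = RED;
        RBNode* left  = nullptr;
        RBNode* right = nullptr;

        bool is_red() const { return color == RED; }
        void recolor()      { color = (color == RED) ? BLACK : RED; }
    };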

You might not agree about which is more important, but it turns out that students really lack basic algorithmic thinking when they are immersed in OO first.
They might lack an understanding of the deeply nested flow of control sometimes found in procedural or structural designs. But that isn't the same as not understanding complex systems - some ostensibly complex algorithms are in fact easier to understand and formulate in an object-oriented framework. For example, 2-3-4 trees are easy to implement in an object-oriented framework, much easier than the red-black trees that were invented because 2-3-4 trees were so "hard".

No, the students that have seen OOP from the beginning understand just as much. They're simply used to framing that complexity in different terms and organizing it in different ways from the classic von Neumann-driven separation of behavior and data. Faculty may not always realize that, because they may have been using a strict separation of data and behavior for much, much longer.

The theory people think they don't understand algorithms. The systems people think they don't understand computers.
I'd suggest that the theory people and the systems people in question don't understand what it is the students do understand. That was the kind of knee-jerk response we got at Brown, until the theory people realized that the students did understand complex systems, but just organized them in different ways - that's part of the faculty retraining I alluded to. At Brown, it was a success, and led to algorithms textbooks written from an OOP perspective (written by faculty members who were originally some of the strongest critics of the OOP approach to the curriculum). The systems people at Brown were all very OOP in mindset - they were ecstatic with the new approach and the students they got.

Re:Real experience modernizing curriculum (1)

d^2b (34992) | more than 11 years ago | (#5527829)

What is an object? Behavior and data tied together. If they are inextricably related (like the red-black tree algorithm and the colorable tree nodes), work on them together, rather than trying to create artificial separations between them. That's just good software engineering.
I think this is the essence of where we agree and disagree. Immersing people in OOP from the beginning is a good way to create software engineers, but...
At Brown, it was a success, and led to algorithms textbooks written from an OOP perspective (written by faculty members that were originally some of the strongest critics of the OOP approach to the curriculum).
If these are the same algorithms textbooks by Brown authors that I am thinking of, (Data Structures in Java?), then they precisely exemplify what I think is wrong with this method of teaching. They take Binary Search Trees and bury them beneath something like six pages of classes. No doubt the result is "industrial strength", but it takes some kind of archeology to extract the key ideas.

So I suppose it depends on your priorities.

Shit on Pitt (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#5515096)

Well, maybe you wouldn't have these problems if you went to a REAL school like Penn State that has much higher quality classes than bloody Pitt :)

The real problem is.... (1)

foooo (634898) | more than 11 years ago | (#5515220)

The real problem is that CS and CE programs should be designed from a theory standpoint. Any class that relies heavily on a specific implementation or technology will be out of date as soon as you graduate (or sooner). The notable exception is classes that are designed to deal only with specific technologies, implementations, or languages.

The bulk of your education in CS should be theory, optimization, and concepts. Implementations and specific technologies are things you should be able to pick up on the job, or use as tools for learning... not the actual content of the learning.

**Instant Short Version**

I learned C so that I could learn to write decent code, with planning and an eye for efficiency of design. Not just for the sake of learning C.

~foooo

Re:The real problem is.... (1)

n1ywb (555767) | more than 11 years ago | (#5515730)

Computer Science should be mostly theory. That's why I am not taking Computer Science. I am taking Computer Engineering because it is more practical.

Even with CE, though, it's still more important to teach the theory than the implementation. That is why I support teaching RS232 and XMODEM. I do NOT support teaching TokenRing, however, for obvious reasons. I ALSO support teaching USB, because it really isn't THAT hard to learn, and it is obviously the way of the future. There's no point to teaching FireWire since it is essentially fast USB.

Re:The real problem is.... (1)

foooo (634898) | more than 11 years ago | (#5515847)

please read the parent comment. it clearly says CS and CE =)

Re:The real problem is.... (0)

Anonymous Coward | more than 11 years ago | (#5515939)

Software Engineering is the 'practical' side of computer science; CompE should be closer to Electrical Engineering, and focus on hardware things.

Remember: "Computer science is as much about computers as astronomy is about telescopes." The 'computer' in computer science refers to 'a device that performs computations' (where 'computation' is a mathematical concept), not "a chunk of plastic, metal & silicon that you order from Dell"; silicon microprocessors just tend to be the most cost-effective devices for performing computation.

OTOH, computer engineering -does- focus on the physical device that we commonly refer to as a 'computer'.

Re:The real problem is.... (1)

alannon (54117) | more than 11 years ago | (#5518750)

I'd agree with most of what you say, but the concepts between USB and Firewire actually have very little in common. Firewire is essentially a networking architecture. USB is a host-to-device bus. An apt comparison would be Firewire ~= SCSI and USB ~= PS/2 port.

I tried. (1)

BigZaphod (12942) | more than 11 years ago | (#5515228)

Unfortunately, the profs weren't very interested in listening to a stupid undergraduate. Ah well. I went around the system by using the computer club as my own class of sorts. The few folks who showed up every week seemed to really enjoy learning about how the modern stuff works. Plus, I think I had way more fun doing that than I would have if the classes had changed. Generally I didn't try to lecture all the time. Sometimes someone had something they were interested in; they'd do some research and then show the rest of us. Other times I'd be sitting there and they'd start asking me questions (as I had gained a bit of a reputation as a good source of knowledge) and I'd help wherever I could. It was cool, and I like to think I helped some of the others expand their horizons a bit.

If you want (0)

Apreche (239272) | more than 11 years ago | (#5515337)

If you want to learn new stuff, the teacher has to know it as well. If the professor doesn't know it, they can't very well teach it to you. If you go to a technical school, like RIT where I go, the professors are just as up to date as the students, usually more so. I'm a CS major, and they teach us current technologies. I'm taking a class on XML right now, actually. If you aren't in a technical major at a technical school, you can't expect to be learning anything valuable. It's like taking an engineering class at a liberal arts school. You aren't going to learn quality stuff.

Interfaces to learn are RS232 and SCSI (1)

Ashurbanipal (578639) | more than 11 years ago | (#5515349)

\.

Everything else is either based on or pretty similar to those two... well, OK, there's also Ethernet's CSMA/CD paradigm.

THREE things, that's THREE things to learn, Cardinal Fang! (And a fanatical devotion to the Pope!)

John Slimick's the guy to learn from at Pitt; he used to teach at the Bradford campus in the frozen north. He's an excellent teacher as well as an all-around nice guy.

Welcome to reality. (1, Interesting)

FreeLinux (555387) | more than 11 years ago | (#5515356)

The fact is that everything you will learn at any university is going to be dated. As has been said before, it takes a great deal of time to properly develop a new or up-to-date course. The profs have to be trained first.

But, no matter how progressive the school is, they will still be behind the industry curve, unless they themselves are developing the technology. When you get out of school you will not have been fully educated on the latest and greatest technology. That's why you do internships and graduate work and entry level positions.

Only after you have finished learning the "legacy" stuff and then spent time in a job getting up to speed on the present day technologies, as well as getting some hands on professional experience under your belt, can you hope to be up-to-date. Then you will find that you have to spend the rest of your career "running" as hard as you can, in order to keep up-to-date.

But here comes the biggest kick in the huevos. Every single generation of students wants/tries to change things. Every single one. Each one seems to feel that they know better what should be done. But the sad fact is that you don't have the necessary life, business, political, or technical experience to be qualified to make that decision.

I'm sure that all the present students will proceed to flame me for this last statement. But they, and you, will see in the next 10 years or so that you really aren't qualified. Only after several years will you develop the necessary life/political/business/technical experience to be truly qualified to make decisions on changes for the future. So, for now, sit back and learn. That's what you are there for, not to run the show.

You never said, but I'll bet that at least some of the old RS232 stuff that you are learning is indeed new to you. Also, RS232 is STILL heavily used in a very wide range of technical industries, and it will continue to be for some time to come.

Re:Welcome to reality. (1)

n1ywb (555767) | more than 11 years ago | (#5517094)

But here comes the biggest kick in the huevos. Every single generation of students wants/tries to change things. Every single one. Each one seems to feel that they know better what should be done. But the sad fact is that you don't have the necessary life, business, political, or technical experience to be qualified to make that decision.


LOLLERSKATING. I worked in the computer industry for 10 years before I started college 5 years ago. From my experience, most of my professors' knowledge predates MY entrance into the industry. Here are some quotes from my professors:

"XMODEM will work over a 7 bit link."
"Linux is useless junk." (That one is from today.)
"128 megs of ram is plenty to run Visual Studio .NET on a Win2k box." (Also running AV software, Novell client, OpenOffice pre-loader, etc)
"A lot of people program in ML."
"Students don't know anything. If we taught what the students wanted us to teach, we wouldn't get anywhere."

Most of these guys are horribly out of touch, and weren't necessarily ever in touch to begin with. There are two kinds of professors at my college: The ones who retired from industry and are willing to tolerate the shit pay because they like teaching and living in rural Vermont, and the ones who are here because they couldn't get a better job (the vast majority). None of them are still ACTIVELY involved in the industry, they spend all their time teaching and correcting homework.

The bottom line is that LEARNING isn't a passive process; it requires active participation, discussion, circulation of ideas, and most of all dissent. Going to school isn't about learning a bunch of facts and formulae, it's about LEARNING HOW TO LEARN. How can you learn how to learn if you take everything your teacher tells you at face value? Do you have the same attitude towards GOVERNMENT? You frighten me.

Re:Welcome to reality. (0)

Anonymous Coward | more than 11 years ago | (#5525513)

The parent to this comment is completely full of bullshit. Sure, go ahead, blame the over-eager, green, younger generation. Generalize your experiences with them to make any contributions they produce absolutely trivial so you do not feel as outdated and overvalued as you really are. I have been in industry, with a CS degree, for just over a decade now, and you know what holds back innovation in just about every circumstance I've ever come across? The old fucks like yourself who refuse to keep their qualifications current. Those who get the position, get complacent, and refuse to re-educate themselves. Those who complain when they have to learn the new system, or go by new rules, because it slightly alters their world of nine-to-five, punch-in-punch-out lives. Just be a help to all of us out there who want to make our lives and the lives of our co-workers, clients, and the general population simpler, and quit if you are going to resist the innovation of the younger generations. Some of the best ideas have come from the youngest members of my staff, yet many never get fully implemented because there is some old fuck with too much invested in his pension who bitches about actually having to use his brain while he is at work. Do us all a favour, quit, and get a job at Radio Shack.

sorry, you're an idiot. (1)

kevin lyda (4803) | more than 11 years ago | (#5515468)

industry still uses all the "old" things you mention. and all the "new" things are based on the old ones. they evolved from those standards.

crawling is a useful skill to have. and while walking is better for many tasks, you still learned to crawl first.

Re:sorry, you're an idiot. (1)

MrResistor (120588) | more than 11 years ago | (#5515723)

industry still uses all the "old" things you mention.

That's exactly what I was going to say.

I've yet to see USB, Firewire, etc, in use in the "real world" except for consumer-level personal computer peripherals. In industry, RS232 is The Shit, RS485 is a handy substitute for long distances, and people are experimenting with ethernet (some of them are even using Cat-5). Every once in a while you might run into something designed for a parallel connection, but not too often.

All these hot new interfaces are only really relevant at the consumer level, and that's only a small part of the real electronics industry. This lab sounds like a pretty close approximation of the real world as I've experienced it. Sorry if that disappoints you, but you might as well get used to it.

Re:sorry, you're an idiot. (1)

n1ywb (555767) | more than 11 years ago | (#5515772)

Yes, TokenRing is still in very limited use, but it sucks. Maybe if anybody other than IBM had developed it... And no, nothing is based on it. Maybe FDDI, but that's another crazy esoteric protocol that 99% of us will never have to deal with.

Searched the web for token ring. Results 1 - 100 of about 516,000.
Searched the web for ethernet. Results 1 - 100 of about 5,770,000.

Which protocol do YOU think should be taught in schools?

Re:sorry, you're an idiot. (1)

greck (79578) | more than 11 years ago | (#5518048)

More like "definitely FDDI"... same 802.5 frame type. And dual-attached stations? I may be biased because I was weaned on Token Ring at IBM, but even today I think it's pretty damn cool--I'll take a network that has a sense of self-health over the chaos of Ethernet any day of the week.

What I really wish, is that 100VG [io.com] had gotten off the ground.

Re:sorry, you're an idiot. (1)

nbvb (32836) | more than 11 years ago | (#5518119)

Both.

There's merit to teaching a token-based media access control mechanism.

There's also merit to teaching a shared media access mechanism.

If all we taught were the Most Popular (tm) results, music students would study Britney Spears, Lit majors would study Harry Potter, and CS would be nothing but how to use MS Word & Excel.

Sheesh.

Wait till you get to the Real World (tm). You'll grow up fast, I promise.

TR isn't that limited in use; lots of mainframe-type environments still use it. Hell, we still have SNA links ....

--NBVB

Poor example, but I know what you mean. (2)

Kris_J (10111) | more than 11 years ago | (#5515638)

Updating a course is not a trivial thing. Deploying new hardware, writing new course notes, and finding new textbooks are all costly or time-consuming things. Students benefit financially from stable courses, as there is a greater pool of secondhand textbooks available. Everyone benefits from a well shaken-down course. If the fundamentals are still present and materials are still available (books and spare parts), you will be unable to convince the appropriate people to change the course.

Meanwhile, most students don't realise this, but you are allowed to do research of your own beyond the scope of any given class. I know you may not have the funds to pursue everything you want, but neither does the college.

College is not about getting a job (3, Interesting)

lkaos (187507) | more than 11 years ago | (#5515646)

There are two types of jobs out there: 1) ones that require experience and 2) ones that don't want experience because they want to groom you.

For the first group, there is no way that you could possibly gain enough experience in 6-9 hours a week for four years. That's only about 4-6 months of professional experience (about two full-time internships if you're so lucky).

For the second group, employers are more interested in finding someone who is a good problem solver and has the ability to pick up newer technologies quickly. In a lot of ways, as an employer I'd rather have someone who learned COBOL at school, for fear that someone who already knew C++ or a newer language I'd expect them to use on the job would carry bad habits into it.

<Open Source Evangelizing>
Of course, working on Open Source software can give you the desired experience and prove you have the ability to learn quickly on your own :)
</Open Source Evangelizing>

Re:College is not about getting a job (1, Insightful)

Anonymous Coward | more than 11 years ago | (#5525410)

I wish more people made this point. Working on free software projects is in many ways better for your skills than any paying job could ever be. You often get to work with people far, far, far more experienced than you; you get to ask questions; you get to toy and experiment with different methodologies. If you screw up, you can go back to the drawing board, learning from your mistakes. And, you get to play with all the new toys and cherry pick from the old toys. You learn about tradeoffs in different tools based on their technical merits alone, not on the whims of some manager who was sold on something because he got a great steak dinner from a vendor rep.

In the commercial world, you're often given a project timeline and framework handed down from your managers, and a product from sales that could only exist in the imagination of somebody living on mars. And forget about learning from your mistakes. Your mistakes become the bedrock of the product!

Boy does this sound familiar (2, Insightful)

n1ywb (555767) | more than 11 years ago | (#5515671)

It's basically the same here at Vermont Tech. Granted, I think that RS232 and XMODEM are still relevant, as they're SIMPLE. Also, RS232 is still widely used, and while XMODEM may be garbage, it is still the basis of many other protocols and is easy to understand. At least I thought so. I got in an argument with a professor once: as a lab assignment he asked us to connect two computers with a null modem, set the link to 7 bits, and transfer a file using XMODEM, in that order. I told him XMODEM doesn't work over a 7-bit link. He told me it does. It took me about a half hour to convince him that he was wrong.
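
For anyone who hasn't run into it, here's a rough sketch of the classic XMODEM block layout (the simple checksum variant) -- the payload bytes and the checksum all need the full 8 bits, which is exactly why a 7-bit link breaks it:

/* Rough sketch of a classic XMODEM block (checksum variant). Every payload
 * byte and the checksum use all 8 bits, so a 7-bit serial link strips the
 * high bit of binary data and the transfer fails. */
#include <stdint.h>

struct xmodem_block {
    uint8_t soh;        /* start of header, always 0x01 */
    uint8_t blk;        /* block number, 1..255, wraps around */
    uint8_t blk_inv;    /* 255 - blk, cheap sanity check */
    uint8_t data[128];  /* raw 8-bit payload */
    uint8_t checksum;   /* sum of data[] modulo 256 */
};

uint8_t xmodem_checksum(const uint8_t *data, int len)
{
    uint8_t sum = 0;
    for (int i = 0; i < len; i++)
        sum += data[i];     /* wraps naturally at 256 */
    return sum;
}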

After putting intense pressure on this same professor, he did spend a couple of days at the end of the class talking about USB, but it was uselessly superficial. It would have been far more beneficial for us to have done some USB programming in lab, or something.

It is hard for schools to keep up with all of the modern hardware and software and protocols, as the industry moves too fast. But why should they keep right on the bleeding edge? While RS232 may be old, learning about RS232 teaches you the PRINCIPLES of communication, thus better equipping you to learn new interfaces. The same goes for XMODEM. USB and FireWire are pretty fucking complex protocols to jump right into when you haven't covered any kind of communication standard before. But I think that considering how ubiquitous USB is becoming, it should absolutely be included in the curriculum.

On the other hand, there's no excuse for teaching Token Ring. For the love of god, spend that time teaching Ethernet.

Re:Boy does this sound familiar (1)

n1ywb (555767) | more than 11 years ago | (#5515792)

To understand the present, you must understand the past. That is why archaeologists painstakingly excavate ancient sites. That is why history is taught in schools. And that is why you have to learn RS232 before you learn USB. It's an obligatory point of passage.

Re:Boy does this sound familiar (1)

i_am_nitrogen (524475) | more than 11 years ago | (#5517953)

Token ring is still used quite a bit in industrial settings, like oil refineries (my uncle is a manager at an oil refinery). They like to use what's been proven to work reliably, and token ring doesn't fail them. They even still use arcnet, because it works well over the distances of the plant, and it doesn't ever crap out.

Not to mention, token ring uses a different way of communicating than any other line protocol I'm aware of. I would love to learn token ring.

Another thing I think should be taught at some point is I2C. That way I wouldn't have to figure it out myself when I go to reverse engineer a Windows driver for a card that has I2C devices on it.

A single bidirectional data line is cool. Having a separate clock line and only one master helps make it easier.
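
In case it helps anyone picture it, here's a minimal bit-banged sketch of the I2C idea -- one bidirectional data line (SDA) plus a clock (SCL) driven by the single master. The set_sda()/set_scl()/read_sda()/delay_half_bit() helpers are made-up placeholders for whatever GPIO routines your platform has, not any real library's API:

#include <stdint.h>

/* Hypothetical platform-specific GPIO helpers -- assumed, not a real API. */
extern void set_sda(int level);
extern void set_scl(int level);
extern int  read_sda(void);
extern void delay_half_bit(void);

void i2c_start(void)
{
    set_sda(1); set_scl(1); delay_half_bit();
    set_sda(0); delay_half_bit();   /* SDA falls while SCL is high: START */
    set_scl(0);
}

int i2c_write_byte(uint8_t byte)
{
    for (int i = 7; i >= 0; i--) {
        set_sda((byte >> i) & 1);   /* put the bit on SDA while SCL is low */
        set_scl(1); delay_half_bit();
        set_scl(0);                 /* clock it out, MSB first */
    }
    set_sda(1);                     /* release SDA so the slave can ACK */
    set_scl(1); delay_half_bit();
    int ack = (read_sda() == 0);    /* slave pulls SDA low to acknowledge */
    set_scl(0);
    return ack;
}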

How old are the prof's notes? (1)

n3bulous (72591) | more than 11 years ago | (#5515949)

Chances are, the prof has been using these notes since the dawn of time. I had one professor who had notes that looked 30 years old, paper scrawled in illegible chicken scratch, and fraught with errors. Most of the time, the same professor will teach the same class for quite a few years because it is easier to teach something you are familiar with than learn the ins and outs of the latest technology, otherwise it would interfere with research.

Don't for a minute think that the professors are there to teach you. A few enjoy teaching, but most are there for research.

An explanation (-1, Flamebait)

Gnissem (656009) | more than 11 years ago | (#5516043)

Those who can, do... Those who can't, teach.

Alternative (0)

Anonymous Coward | more than 11 years ago | (#5516049)

You may find what you're looking for at a nearby trade school. At university you're expected to learn things for the sake of personal enrichment, not for the development of specific technical skills. That is perhaps not your objective.

Toss it in the garbage.. (1)

MountainLogic (92466) | more than 11 years ago | (#5516418)

No, seriously, what happens if the lab gets a lightning strike? How would the school find replacements for the outdated equipment? Sure, some of it is off the shelf (RS-232), but the token-ring stuff is getting really hard to find. Better for the school to make a planned, scheduled move to current (replaceable) equipment than to be faced with downtime in the lab while new platforms are sourced.

For what it's worth, a serial port is a serial port, and RS232 is a very forgiving place to start an interface course.

Re:Toss it in the garbage.. (1)

unitron (5733) | more than 11 years ago | (#5518249)

They can replace it with a little searching on eBay, or, more likely, they've got another closet full of the stuff just down the hall.

In addition to the other good reasons mentioned here for starting people out on "legacy" concepts, there's also the principle, very important in a budget-conscious setting, that if the student screws up and fries an old piece of equipment, it's no big deal and they've learned what not to do, whereas if it's all new, expensive stuff, everybody's going to be too scared of frying it to let the students actually do anything hands-on.

Get the feel for removing and installing old ISA cards, 286 chips, and 30-pin SIMMs before you mess with AGP, BIOS chips, etc.

Have never needed to.. (2)

mivok (621790) | more than 11 years ago | (#5516786)

A lot of the courses had already been changed.

But to highlight an example, the 3D graphics course in 3rd year is revised each year to adapt to new developments from the past year. For example, pixel shaders are taught in the current 3rd-year course, and who knows what else next year.
However, the second-year introductory graphics course has stayed mostly the same, introducing the basic concepts which are applicable to more than one situation and don't change very often.

I would assume that a lot of the concepts, if not the specifics, would stay the same even between, say, RS-232 and USB. Sure, there are a lot of differences in protocol specifics, and a lot of extra features in USB (otherwise there wouldn't be much point to using it), but the general idea of transmitting a signal down a wire, and having to deal with flow control, error correction, and whatever else, stays the same (now I _know_ I didn't revise hard enough - memory's failing me here).

Of course, updating the specifics can help, but not at the expense of clouding the basic concepts, or just for the sake of something being newer and more in fashion at the moment (think replacing a C course with a Java course, losing any concept of memory management with free(), and confusing students no end when they finally do come to C/C++). Tomorrow, USB will be obsolete, and all the specifics learnt today will also be obsolete, but the next big thing that replaces it will use the same basic concepts and have the same problems to deal with, just at a higher speed and with more features.

Teaching would be a great job (3, Insightful)

hengist (71116) | more than 11 years ago | (#5517187)

if it wasn't for the students.

I teach two undergraduate courses. I know what it's like to have students complaining about the content of a course, and I have two comments about this topic.

Firstly, changing what is taught in a course is very very very very hard work, and a course that has been restructured or had its content changed is very very very likely to have problems with said new content. It is simply not practical to keep updating a course to deal with new technology. Once a course is stable, it is far better to leave it that way. Also, the staff teaching that course must spend time doing research and likely supervising postgrad students. They must do this to keep their job and to maintain the reputation of the university.

Secondly, universities are not vocational training institutes. University teaches the basic theory and concepts behind the technology, and teaches students how to learn these concepts. The student should then be able to apply these theories and concepts in an employment situation.

If you want to learn how to use new technology solely to apply those skills to a job, go to polytech or do a training course. Don't sit around whining to the course instructor, because frankly he probably knows a hell of a lot more about how to run a course than you do.

Re:Teaching would be a great job (2, Insightful)

No. 24601 (657888) | more than 11 years ago | (#5517661)

I think that's a shameful answer to an important issue. It appears that what you describe as very very very very hard work is simply an unwillingness to do what is necessary to keep your students at the cutting edge of their field. I understand that basic theory must be taught, and using tried-and-tested material is the easiest approach. However, it's also important to expose students to the most modern practices and tools in their field. This can mean the difference between them getting jobs, or being left in the dark. I mean not to offend you but rather to suggest that you ponder whether you are doing what's best for your students, or what's easiest for you.

Re:Teaching would be a great job (3, Insightful)

i_am_nitrogen (524475) | more than 11 years ago | (#5518058)

DUDE, you're not getting the POINT. University is NOT FOR GETTING PEOPLE JOBS!!! It's for academic enrichment, broadening your horizons, learning things because they're interesting, and simply learning how to learn. Stop complaining and either be the change you want to see in the world, or go somewhere else. Why don't you teach your own course about the latest industry standards?

As nearly everyone else has said... it's the concepts that matter. Whether it runs at 128 kbit/s or 1 Mbit/s, a serial communication interface with a tx line and an rx line will always have the same basic concepts.
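
To illustrate (just a sketch, with made-up set_tx()/bit_delay() helpers standing in for real hardware access): whether the bit time is measured in milliseconds or nanoseconds, transmitting a byte over a single tx line looks the same.

#include <stdint.h>

/* Hypothetical platform helpers -- not from any particular library. */
extern void set_tx(int level);
extern void bit_delay(void);    /* wait one bit time, e.g. 1/9600 s at 9600 baud */

void uart_send_byte(uint8_t byte)
{
    set_tx(0);                      /* start bit */
    bit_delay();
    for (int i = 0; i < 8; i++) {
        set_tx((byte >> i) & 1);    /* 8 data bits, least significant first */
        bit_delay();
    }
    set_tx(1);                      /* stop bit; the line idles high */
    bit_delay();
}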

Re:Teaching would be a great job (2, Insightful)

hengist (71116) | more than 11 years ago | (#5518330)

I think that's a shameful answer to an important issue

Not to be too flip, but this issue is a minor one. The important issue is how to deal with student feedback.

Reading your post, I think you missed my second point, which was:
universities are not vocational training institutes

it's also important to expose students to the most modern practices and tools in their field.

No, it is not. It is important to teach students how to learn these things.

This can mean the difference between them getting jobs, or being left in the dark

I can't speak for employers, because I am not one. But, the employers who come to graduate recruitment emphasise that they want thinking individuals who can learn new things, not people who already have all the knowledge. It seems that most employers who pick up graduates straight from university will train them in the areas they need to know.

I understand that basic theory must be taught, and using tried-and-tested material is the easiest approach

For a university, teaching the theory is more important than teaching the practice. Using tried and tested material isn't just the easiest approach, it is the best approach. Students don't like it when you use them to debug courses. I know this from experience.

It is so much work and so time-consuming to rework a course that by the time the course is reworked, the technology the instructor is trying to incorporate is no longer cutting edge. Other problems include a lack of familiarity with the material on the part of other people in the teaching team. Also, as I stated before, the instructor will have to give up most or all of their research work to do so, which will impact the reputation of the instructor and school, and thence of the students who graduate.

Finally, a point that I should have included in my original post - there are only a certain number of weeks in the teaching semester. If you want to cover the basics as well as the bleeding edge stuff, then the depth and quality of the material taught is going to suffer. Students (in my experience, based on the feedback I have gotten from students) prefer depth of material over breadth of material.

suggest that you ponder whether you are doing what's best for your students, or what's easiest for you.

What is best for students is to give them the learning skills they need to pick up cutting-edge technology, for them to have an instructor and institution with good reputations, and for them to be taught a course that is stable and problem-free. Frequent upgrading of courses will jeopardise all of these things.

Finally, I do spend a lot of time and effort eliciting feedback from the students, and considering what they have to say. However, not everyone's suggestions are going to be acted on. One student in one presentation of a course isn't going to effect a change in that course; it's as simple as that.

Re:Teaching would be a great job (1, Informative)

Anonymous Coward | more than 11 years ago | (#5519796)

Reading your post, I think you missed my second point, which was:
universities are not vocational training institutes


This echoes what was said before this comment, really. I recall my first night class (I wasn't getting up at 8AM for any class!) as a lowly EE student. The professor plodded into the hall (it was a 200-seat lecture hall), laid down his books and course material on the lectern, then proceeded to write the requirements down on the blackboard. After 5 minutes of silence except for the scraping chalk, he outlined a rigorous set of goals and requirements to pass the course: several minor collaborative projects, lab work, take-home work, quizzes on any of the previously covered material at least once per week, four tests, a final test, and a final project. One of the older students up front commented that it was a large amount of work to do for one class while supporting a family and having a full-time day job. The professor fixed him with a steady stare for a few moments, hooked his thumbs in his suspenders (he was a portly man), puffed out his cheeks, and proceeded to tell him (and everyone else in the room) that "higher education is for the leisure class." There were a few more gems like this, but that about sums up his monologue on the subject. More than half the room was empty the second night; most of the students were part-timers with real jobs.

I can't speak for employers, because I am not one. But, the employers who come to graduate recruitment emphasise that they want thinking individuals who can learn new things, not people who already have all the knowledge. It seems that most employers who pick up graduates straight from university will train them in the areas they need to know.

No, you can't speak for them. I've been on the "prospective employee" side of the matter more than a few times, and I can tell you that, overwhelmingly, in the tech sector, all they care about is whether you have the exact set of skills that the last guy in the position had. People who can "learn"? Surely, you jest! Internship positions are requiring 2 years of solid Java/C++/SQL/VB.net/etc. with "proven" software development skills. An internship position!

What is best for students is to give them the learning skills they need to pick up cutting-edge technology, for them to have an instructor and institution with good reputations, and for them to be taught a course that is stable and problem-free. Frequent upgrading of courses will jepordise all of these things.

Yes, we had several of your type at my old university. They build a course, spend a few years "perfecting" it, then proceed to teach it unchanged for the next 15 years (including test material). Their classes are easy and meaningless. I found that by skipping the class of one particularly bad professor, and just reading the book, I got more out of the course than most of the people that listened to the braindead lecture. The professor was nonplussed, as he claimed that you needed to attend lectures to get a good grade -- not true, as my 'A' (and sanity) will attest.

Some of the best professors in the department regularly re-evaluated their course material every year, and would update it at least every 2-3 years. Students' learning skills did not suffer and most were heartened to learn that they were studying material that was current.

*sigh* (1)

Meowing (241289) | more than 11 years ago | (#5518655)

What flavor of engineering are you attending school to learn? There is still lots and lots of test and measurement equipment being produced that uses good old serial communications. It's cheap, reliable, and still more universal than Ethernet and its friends. Even if later in life you end up using, say, TCP for your data collection, what is that TCP stream? Little more than an emulated serial connection. You want to know how this stuff works, really.

Learning (1)

elbuddha (148737) | more than 11 years ago | (#5518693)

Learning how to think is more important than learning how to do.

teach a man to fish ... (1)

Anonymous Coward | more than 11 years ago | (#5518723)

I've had similar situations in my cs degree where I found the material almost archaeological. But it's the concepts that are important to learn at the start and the roots of these concepts are in the old stuff.

basics first (0)

Anonymous Coward | more than 11 years ago | (#5519061)

If you want to learn how to fly a Space Shuttle you don't start out on one! You spend many years and tens of thousands of dollars learning the concepts on small aircraft. Granted, knowing the gauge layout of a Cessna has zero relevance to an orbiter, but the concept of watching your fuel levels applies equally well in either case.
So yes, when you get to space school they will say "forget all that airplane stuff," but they're not really telling you to forget the concepts, just the nitty-gritty details that you don't need any more.
Comprenez-vous?

Quit yer bitchin... (0)

Anonymous Coward | more than 11 years ago | (#5519444)

Highlights of Skills [conestogac.on.ca]

Develop computer solutions in a variety of computer languages and data base management systems, including: Word, Excel, Visual Basic, RPG, COBOL, Access, Oracle, C++, SQL and Windows applications development systems

Univ of Pitt Problem (0)

Anonymous Coward | more than 11 years ago | (#5519541)

The real problem at Pitt is that the upper-level graduate courses aren't updated either. Starting courses should cover old tech (nothing like learning MIPS assembly on simulated hardware), but the upper graduate program doesn't seem to cover modern tech.

RS232 Is EVERYWHERE (4, Insightful)

muscleman706 (654133) | more than 11 years ago | (#5519704)

I don't know about Token Ring, but RS232 is all over the place in industrial hardware like barcode scanners and other non-PC hardware. I think it is much simpler to program, both for the programmers and the hardware designers. Also, remember that Intel came up with USB to sell processors, because USB is a total CPU hog compared to FireWire. So, while your PC does not have a problem with this now, industrial hardware certainly does not have the infrastructure on board to deal with USB. So, I think the appropriate thing is to talk about RS232, USB, IrDA, Bluetooth, and WAP. You want Bluetooth because it is going to be in all cellphones, hence it will proliferate into everything else. You want WAP because for things where Bluetooth is too slow, you will want a higher-speed wireless system. For instance, you could have a WAP-enabled digital video camcorder that automatically pops up a recording window when you start recording, all without any wires!
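
As a rough illustration of how low the barrier is on the software side, here's a minimal POSIX sketch for reading from an RS232 device such as a barcode scanner; the device path and 9600-baud setting are assumptions, so adjust them for the actual hardware:

#include <fcntl.h>
#include <stdio.h>
#include <termios.h>
#include <unistd.h>

int main(void)
{
    int fd = open("/dev/ttyS0", O_RDONLY | O_NOCTTY);   /* assumed device path */
    if (fd < 0) { perror("open"); return 1; }

    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);              /* raw 8-bit bytes, no line editing or echo */
    cfsetispeed(&tio, B9600);     /* 9600 baud is a common scanner default */
    cfsetospeed(&tio, B9600);
    tcsetattr(fd, TCSANOW, &tio);

    char buf[64];
    ssize_t n = read(fd, buf, sizeof buf);   /* blocks until the device sends data */
    if (n > 0)
        fwrite(buf, 1, (size_t)n, stdout);

    close(fd);
    return 0;
}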

Math (3, Insightful)

jbolden (176878) | more than 11 years ago | (#5519738)

When you first learn math, nursery school / kindergarten doesn't start with "Let Delta be a derived functor mapping abelian categories...."; you don't learn 20th-century math at all. Rather, what you learn is:

counting -- a technology that is certainly tens of thousands of years old
arithmetic -- a technology that is many thousands of years old and was fully developed 5000 years ago
algebra of one variable -- a technology that is a thousand years old
geometry of 2 dimensions -- a technology that is over 2000 years old.

And if you are really good, in high school you learn
calculus of one variable -- a technology that is over 300 years old

By college, the undergraduates make it up to about the Civil War.

____________

There is a difference between education and vocational training. Education teaches you how to evaluate information and how to learn new information. Vocational training teaches you specific information for a specific field. The goal of a university is to teach concepts, not technologies.

What you are learning are very simple hardware / software interfaces. Why use complex interfaces of modern hardware that confuse the issues on an academic course? Leave that for vocational schools.

Independent Study (1)

aaarrrgggh (9205) | more than 11 years ago | (#5519815)

If you want to do a class in current technology, the best way is to go the independent-study route. If there is an instructor with relevant knowledge, it shouldn't be that hard to get something set up.

Consider yourself ahead of the game when you look for a job. If the university wants to move ahead in the course outline, they have "supporting evidence" for next year.

Unique to engineering? (1)

ggwood (70369) | more than 11 years ago | (#5522305)

I'm probably in trouble on this one, but here it goes:

I think this problem is unique to engineering. Universities were not created for this kind of thing. Thus you have an inherent conflict between (a) getting a job and (b) learning theory.

Historically, you would get a degree *in a different field* (probably physics or math) then go to professional school, such as law school or medical school (this one would be engineering school) and learn all the latest applied technology there.

I am not saying this is the way things should be, only noting the origin of the conflict. In fact, I think it is great to have a degree in engineering, during which you spend some time learning musty subjects like math, physics, English lit, art history... something else.

I teach physics at a University. (Contrary to what was said earlier, my students are the best part of my job - I think of it like playing a single person RPG versus a massive multiplayer RPG. All other things being equal, it is always nice to interact with people, and help them out where you can.)

Point is, in physics you learn the last 400 years of physics research, in roughly chronological order. You go a long way before you reach the cutting edge, and this is fine because you are trained to become a scientist - not a technician. About half to 2/3 of the posts seem to agree there is an equivalent principle in engineering. I can't comment on that.

token ring is very relevant (1)

g4dget (579145) | more than 11 years ago | (#5522567)

Here is why. [syix.com]

You wouldn't benefit anyway this time around (3, Interesting)

James Youngman (3732) | more than 11 years ago | (#5522914)

I think you don't stand a chance. To get the course material updated, you will have to do one of two things:-
  1. Get the lecturer replaced with someone else - this means someone else has to be willing to teach the course
  2. Find the lecturer a whole lot of free time to revise the course material (which I assume has been generated over a period of years) all at once. This probably means them taking time away from research, which is what your professor probably feels he's there for anyway.
Even if you succeed, the material won't be updated while you're on the course. At best, the next course would start with the new material.

(This reflects the situation in the UK, where academic teaching staff in universities almost always have research commitments, and publications are used as a performance metric.)

Some of the material you are working with is not so bad, either. Learning about RS232 might teach you several things that are generically useful in designing system interfaces :-

  1. It's Not The Hardware, It's The Protocol Design, Stupid
  2. Race conditions between the two ends
  3. Reliability measures (Checksums, CRCs, ACK/NAK, windows) -- see the CRC sketch after this list
  4. Resilience versus Bandwidth (e.g. max reliable baud rate ~ 10000/(RS232 cable run in feet))
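
For point 3, a concrete (if minimal) example: the CRC-16 used by the XMODEM-CRC variant, computed bit by bit with polynomial 0x1021 and a zero initial value -- exactly the sort of reliability measure an RS232-era course gets you comfortable with:

#include <stdint.h>
#include <stddef.h>

/* CRC-16 as used by XMODEM-CRC: polynomial 0x1021, initial value 0,
 * processed most-significant-bit first. */
uint16_t crc16_xmodem(const uint8_t *data, size_t len)
{
    uint16_t crc = 0;
    for (size_t i = 0; i < len; i++) {
        crc ^= (uint16_t)data[i] << 8;
        for (int bit = 0; bit < 8; bit++)
            crc = (crc & 0x8000) ? (uint16_t)((crc << 1) ^ 0x1021)
                                 : (uint16_t)(crc << 1);
    }
    return crc;
}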

It's been a while since I've worked heavily in industrial interfacing, but I'd be surprised if USB is even relevant to that at all. Think more along the lines of RS422, 10baseT, and optical fibre (often carrying converted RS232, in fact).

I'm not particularly familiar with XMODEM, but I think it's likely to help you understand valuable facets of the above (bandwidth/reliability tradeoffs, protocol features for catching errors, latency versus throughput, bandwidth-delay products). Token Ring seems an odd choice to me, though. After all, the hardware must be tricky to get these days (or perhaps your course has no hands-on component, which would make hardware availability irrelevant).
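
On the latency-versus-throughput point, a back-of-the-envelope calculation shows why a stop-and-wait protocol like XMODEM suffers on slow or long links; the baud rate and round-trip delay below are illustrative assumptions, not measurements:

#include <stdio.h>

int main(void)
{
    double baud = 9600.0;            /* assumed line rate */
    double bits_per_byte = 10.0;     /* 8N1 framing: start + 8 data + stop */
    double block_bytes = 132.0;      /* SOH + blk + ~blk + 128 data + checksum */
    double rtt = 0.2;                /* assumed round-trip (ACK) delay, seconds */

    double block_time = block_bytes * bits_per_byte / baud;
    double goodput = 128.0 / (block_time + rtt);   /* payload bytes per second */

    /* Each block must be ACKed before the next is sent, so delay eats
     * directly into throughput even though the wire could carry 960 B/s. */
    printf("block time %.3f s, goodput %.0f bytes/s\n", block_time, goodput);
    return 0;
}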

One of the most interesting hardware interfacing things I've done was implement both ends of a mostly-symmetrical serial protocol. One end was implemented as a set of four cooperating threads, and the other as a state machine. One way of doing it was (in that case) much much easier than the other (less code and more reliable).
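
Not the protocol described above, obviously, but for anyone curious what the state-machine style looks like, here's a generic sketch: one function that consumes a byte at a time and remembers where it is in a simple SOH/length/payload/checksum frame:

#include <stdint.h>

enum rx_state { WAIT_SOH, WAIT_LEN, WAIT_DATA, WAIT_SUM };

struct rx_ctx {
    enum rx_state state;
    uint8_t len, pos, sum;
    uint8_t buf[255];
};

/* Feed one received byte; returns 1 when a complete, checksum-valid
 * frame is sitting in ctx->buf. */
int rx_feed(struct rx_ctx *ctx, uint8_t byte)
{
    switch (ctx->state) {
    case WAIT_SOH:
        if (byte == 0x01) { ctx->sum = 0; ctx->state = WAIT_LEN; }
        break;
    case WAIT_LEN:
        ctx->len = byte;
        ctx->pos = 0;
        ctx->state = ctx->len ? WAIT_DATA : WAIT_SUM;
        break;
    case WAIT_DATA:
        ctx->buf[ctx->pos++] = byte;
        ctx->sum += byte;
        if (ctx->pos == ctx->len) ctx->state = WAIT_SUM;
        break;
    case WAIT_SUM:
        ctx->state = WAIT_SOH;              /* ready for the next frame */
        return byte == ctx->sum;            /* modulo-256 checksum match? */
    }
    return 0;
}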

Modula-2 for me... (1)

w42w42 (538630) | more than 11 years ago | (#5527374)

The 'out of date' technology I got to learn was Modula-2, back around '91/'92. I was mortified at the time.

Looking back though, it didn't really matter. I went from that to C/C++, Java, Python, etc. College isn't to teach you a specific technology, but the fundamentals, and how to learn the rest.

a tale from the trenches (1)

reddish (646830) | more than 11 years ago | (#5527687)

When still in college, I upgraded the lab course for assembly programming. The lab course used to be PDP-11 based, and we used a semi-working MS-DOS based emulator. We upgraded to PowerPC assembly - x86 was rightly regarded as way too complicated as a starting point.

Making the new assignments and getting everything to work on the local computer setup (this included writing an eery program that used TCP/IP to communicate between the PowerPC emulator and a custom-built terminal-emulator) took the better part of three months.

While the lab course got much better, we were still getting complaints about doing 'non-mainstream' assembly (some students complained that we didn't do x86). But we went to the trouble of explaining them why, which helped. I am sure we would've stuck to PDP-11 if it wasn't for the terrible software: the PDP-11 is ideal as a learning device. It is arguably the best example of proper orthogonal design in a microprocessor; better than PowerPC, and incomparably better than x86. The fact that PDP-11 immediate addressing mode is just post-increment indirect access on the program counter still brings tears to my eyes ;-)

First bottom line: learning proper principles is much more important than learning a bleeding-edge technology. These are often cluttered with performance-enhancing details that students cannot grasp when first approaching a subject. After leaving college, you should be intellectually equipped to read specs and manuals of current and future designs. While I agree that it is a good idea to pay some attention to current technology (it helps to convey insight into the limitations of older technology), current technology usually has too many quirks for teaching a subject.

Second bottom line: revising a class takes a tremendous amount of work. In practice, this is probably what keeps most lecturers from doing so.

Universities vs. Trade Schools (1)

raygundan (16760) | more than 11 years ago | (#5528951)

I see lots of comments pointing out that for specific technology skills, you would be better off at a trade school, and that general concepts based on tried-and-true, simple example technologies are what you should expect at Universities.

Isn't a balance possible? I have been well-served by the general concepts I picked up while working through my engineering degree, but a more practical class or two would certainly have been welcome. I fully expect to be learning new technologies regularly during my career-- but a kickstart for students coming into the industry seems like a good idea.

So why not? Is there such an anti-tech-school feeling in universities that their students can't benefit from some specific training in addition to their more general classes?
