
Old-School Coding Techniques You May Not Miss

samzenpus posted more than 4 years ago | from the good-riddance dept.

Programming 731

CWmike writes "Despite its complexity, the software development process has gotten better over the years. 'Mature' programmers remember manual intervention and hand-tuning. Today's dev tools automatically perform complex functions that once had to be written explicitly. And most developers are glad of it. Yet, young whippersnappers may not even be aware that we old fogies had to do these things manually. Esther Schindler asked several longtime developers for their top old-school programming headaches and added many of her own to boot. Working with punch cards? Hungarian notation?"

731 comments

Some, not all... (5, Insightful)

bsDaemon (87307) | more than 4 years ago | (#27768223)

Some of those are obnoxious and it's good to see them gone. Others, not so much. For instance, sorting/searching algorithms, data structures, etc. Don't they still make you code these things in school? Isn't it good to know how they work and why?

On the other hand, yeah... fuck punch cards.

Re:Some, not all... (3, Insightful)

AuMatar (183847) | more than 4 years ago | (#27768279)

It's absolutely essential to know how those work and why. If not, you'll use the wrong one and send your performance right down the crapper. While you shouldn't have to code one from scratch anymore, any programmer who can't do a list, hash table, bubble sort, or btree at the drop of a hat ought to be kicked out of the industry.

Re:Some, not all... (5, Insightful)

SanityInAnarchy (655584) | more than 4 years ago | (#27768459)

any programmer who can't do a list, hash table, bubble sort, or btree at the drop of a hat ought to be kicked out of the industry.

Why?

Lists, hash tables, and sorting are already built into many languages, including my language of choice. The rest, I can easily find in a library.

When performance starts to matter, and my profiling tool indicates that the sorting algorithm is to blame, then I'll consider using an alternate algorithm. But even then, there's a fair chance I'll leave it alone and buy more hardware -- see, the built-in sorting algorithm is in C. Therefore, to beat it, it has to be really inappropriate, or I have to also write that algorithm in C.

It's far more important that I know the performance quirks of my language of choice -- for instance, string interpolation is faster than any sort of string concatenator, which is faster than straight-up string concatenation ('foo' + 'bar').

And it's far more important that I know when to optimize.

Now, any programmer who couldn't do these at all should be kicked out of the industry. I could very likely code one quickly from the Wikipedia article on the subject. But by and large, the article is right -- in the vast majority of places, these just don't matter anymore.

Not that there's nowhere they matter at all -- there are still places where asm is required. They're just a tiny minority these days.

Re:Some, not all... (2, Insightful)

AuMatar (183847) | more than 4 years ago | (#27768493)

Because they're dead simple, and if you don't know how they work you won't write good code. I didn't say you had to do so regularly (or ever, after college), but you need to be capable of it. If you aren't, you're not qualified to program. Period.

Re:Some, not all... (3, Insightful)

Anonymous Coward | more than 4 years ago | (#27768583)

Lemme just review your *ahem* "arguments":

Because they're dead simple

Your first key point is that programmers must understand them because they're simple??? Um...

and if you don't know how they work you won't write good code.

Now you're asserting that it's impossible to write good code unless you understand these things. So all of good programming hinges on this? That's incredible! </sarcasm>

If you aren't, you're not qualified to program. Period.

Heehee!

Try this one: thinking logically is critical to being qualified to program.

Re:Some, not all... (4, Insightful)

AuMatar (183847) | more than 4 years ago | (#27768639)

I'm thinking perfectly logically. If you don't understand and can't replicate the concepts that underpin your craft, you aren't qualified to practice it. It's like a physicist who can't understand force, or a mathematician who doesn't understand the first fundamental theorem of calculus. They aren't capable of doing their job. Apparently this includes you.

Re:Some, not all... (1, Insightful)

drolli (522659) | more than 4 years ago | (#27768593)

mod parent up.

If you're so uninterested in computers that these algorithms bore you, you should leave. Moreover, there can be *extremely* tricky performance issues to consider around your cache/physical RAM size (just write a loop which covers more and more memory...). There are algorithms which work extremely well until you exceed the available RAM, then break down suddenly. (I have the feeling that something of this class happens on my Nokia E61 with the bundled e-mail client. There was a day when my imap folders exceeded a certain number of e-mails, and suddenly the times to process things grew by a factor of approx. 20-100.)
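A minimal sketch of that loop in C (illustrative only; the sizes, stride, and pass count are arbitrary, and clock() resolution is coarse):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void)
{
    /* Touch a working set that doubles each round; time per access
       jumps as it outgrows each cache level and finally the RAM. */
    for (size_t size = 1 << 10; size <= 1 << 26; size <<= 1) {
        char *buf = calloc(size, 1);
        if (!buf) break;
        clock_t t0 = clock();
        for (int pass = 0; pass < 64; pass++)
            for (size_t i = 0; i < size; i += 64)  /* one touch per cache line */
                buf[i]++;
        double secs = (double)(clock() - t0) / CLOCKS_PER_SEC;
        printf("%8zu KB: %.2f ns/access\n", size >> 10,
               1e9 * secs / (64.0 * (size / 64)));
        free(buf);
    }
    return 0;
}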

Re:Some, not all... (5, Insightful)

Ruie (30480) | more than 4 years ago | (#27768591)

any programmer who can't do a list, hash table, bubble sort, or btree at the drop of a hat ought to be kicked out of the industry.

Why?

Because if these well-known tasks are difficult for them, their job title is really typist, not programmer. The challenge is not to write bubble sort day in and day out, but to be several levels above that, so it's as easy as computing six times seven or reading road signs.

Re:Some, not all... (0)

Anonymous Coward | more than 4 years ago | (#27768731)

Yes, your general points are correct, but still...

One should be well-versed in implementing custom algorithms for these basic operations, and it shouldn't take much time at all to drop one in when the situation warrants it. Some of your logic is based on the premise that "or I have to also write that algorithm in C" is a difficult proposition.

Parent and grand-parent's point is that you should be good enough at this stuff that having to write a replacement in C is a quick no-brainer (most good higher level languages make it pretty easy to write C extensions), and therefore there isn't much of a development time or cost tradeoff to consider.

Re:Some, not all... (0)

Anonymous Coward | more than 4 years ago | (#27768359)

They do indeed still teach this (and make you code it) in any intro to algorithms course.

Re:Some, not all... (5, Interesting)

Blakey Rat (99501) | more than 4 years ago | (#27768383)

I did a ton of work in THINK C 5 on Mac OS 7. Programming in C on a computer with no memory protection is something I never want to experience again. Misplace a single character, and it's reboot time-- for the 13th time today.

What's *really* weird is that at the time I didn't think that was particularly strange or difficult. It was just the way things were.

Re:Some, not all... (1)

X0563511 (793323) | more than 4 years ago | (#27768551)

Misplace a single character, and it's reboot time...

If you're lucky. If you aren't, you get silent (well, sometimes not so silent) data corruption. Fun times I imagine! I'm glad I never got into it back then.

But it teaches you to be careful (1)

_merlin (160982) | more than 4 years ago | (#27768755)

I got my start on System 7, and I'm grateful for it. You see, with a fixed size heap and no memory protection, you learned to be very, very careful about memory leaks and corruption, because your program could do very bad things if you weren't. I'm a better developer for it.

Re:Some, not all... (1)

paxswill (934322) | more than 4 years ago | (#27768415)

I took AP Computer Science AB my senior year of high school, and that was a major portion of it. We had to know why certain data structures (trees, hash tables, linked lists, arrays) were better for certain tasks, and also why some sorting algorithms were better than others. It's also part of the curriculum for CS and computer engineers at my university.

Re:Some, not all... (1)

delsvr (687275) | more than 4 years ago | (#27768489)

Are you telling me you want us to still be quibbling over which is more efficient, "binary trees versus modified bubble sort"? Implementing hash tables?

The complexity of our problems has grown. It's called progress.

But of course they still teach us that stuff in school, just as aspiring physicists still go through classical, Newtonian concepts before starting their research in quantum mechanics. But by no means is this generation of physicists less capable just because they've moved past studying the nature of gravity.

Punched cards - there was a machine for that (2, Insightful)

NotQuiteReal (608241) | more than 4 years ago | (#27768229)

Heh, I had to turn in a punched card assignment in college (probably the last year THAT was ever required)... but I was smart enough to use an interactive CRT session to debug everything first... then simply send the corrected program to the card punch.

I was an early adopter of the "let the machine do as much work as possible" school of thought.

Re:Punched cards - there was a machine for that (4, Interesting)

rnturn (11092) | more than 4 years ago | (#27768385)

``I had to turn in a punched card assignment in college (probably the last year THAT was ever required)... but I was smart enough to use an interactive CRT session to debug everything first... then simply send the corrected program to the card punch.''

Jeez. You must have taken the same course that I did. (Probably not, actually.) In my case it was a programming class emphasizing statistics, taught by someone in the business school who actually wanted card decks turned in. (This was probably no later than, maybe, '80/'81.) I did the same thing you did. I wrote all the software at a terminal (one of those venerable bluish-green ADM 3As) and when it was working I left the code in my virtual card punch. When I sent a message to the operator asking to have the contents sent off to a physical card punch, his message back was "Seriously?"

Begone, common file format loaders! (3, Interesting)

fractoid (1076465) | more than 4 years ago | (#27768237)

The one thing I don't think I'll ever, ever miss is writing loaders for some of the stupider file formats out there. Sure, it's not hard, per se, to write a .bmp loader, but once you've done it once or twice it gets old. Eventually I wrote a helper image library to do it all, but it still would occasionally come across some obscure variant that it wouldn't load. Far worse were early 3D model formats; even now I tend to stick with .md2 for hobby projects just because it's simple, does what I want, and EVERYTHING exports to it.

Re:Begone, common file format loaders! (1)

pavon (30274) | more than 4 years ago | (#27768471)

Oh, man. You just made me remember that horrible mishmash of Auto-Lisp and Q-Basic I wrote in high school trying to use AutoCAD 10 as a Quake level editor, since that's what we had at school. I had moderate success, but finally broke down and bought a shareware package, only to find after months of work that it always rotates walls around their center, so if you ever rotated a wall with an odd length, it was no longer aligned with the grid. And it didn't have a way to snap objects back to the grid, so you ended up with BSP-tree leaks galore. I had to reverse engineer its file format so I could write more Q-Basic to snap all the coordinates back to the grid.

Hungarian Notation (1, Insightful)

masdog (794316) | more than 4 years ago | (#27768239)

I don't get what the big deal is with Hungarian Notation. Why do people consider it a bad thing?

Modern IDEs might reduce the need for it, but not everyone uses an IDE to read or write code.

Re:Hungarian Notation (2, Interesting)

fractoid (1076465) | more than 4 years ago | (#27768287)

Full Hungarian notation is a bit redundant, precisely because everyone (for reasonable values of 'everyone') DOES use some form of IDE to code, and any non-epic-fail IDE will at the least tell you variable types when you mouse over them, or pop up a member list for a class/struct when you go to type them.

However, specific notation on some things IS a good thing. Conventions like CONSTANTS, m_memberVariables, and so forth are good because they remind you that the variable in question is an exception to what you'd normally expect (that it's a number, a string, or an object). They're not strictly necessary any more (my current workplace just uses upper camel case for everything, for instance, and my last job used trailing underscores to denote member variables which was downright annoying) but IMO it's good to prevent brain fail errors. Recognising that the programmer is the source of all errors is the first step towards getting rid of them. Well, except in Borland Turbo C++. :)

Re:Hungarian Notation (1)

AuMatar (183847) | more than 4 years ago | (#27768401)

Not everyone uses an IDE. There's a hell of a lot of us who still use emacs and vi. For that matter, unless I need a debugger I'd pull up notepad over an IDE- IDE features tend to make it sputter on lower-end computers and/or large projects, and their advantages are minimal to nil. It tends to not even be all that good at the things it can do- I can grep and find all references to a variable faster than most IDEs will find them for me.

Re:Hungarian Notation (1)

SanityInAnarchy (655584) | more than 4 years ago | (#27768503)

Not everyone uses an IDE. There's a hell of a lot of us who still use emacs and vi.

I suppose if it's 'vi' and not 'vim', you might have a point. But emacs absolutely should be able to do something like that.

IDE's features tend to make it sputter on lower end computers and/or large projects, and there advantages are minimal to nill.

However, one thing they have over Notepad: Syntax highlighting.

I hear you -- I use Kate [kate-editor.org] for development these days. But even there, I've at least got syntax highlighting and code folding.

I can grep and find all references to a variable faster than most IDEs will find them for me.

Then for you, hungarian notation should be equally redundant.

Re:Hungarian Notation (1)

Matheus (586080) | more than 4 years ago | (#27768547)

I still use a bit of Hungarian as it makes the code easier to read plus I can reuse variable names {{ especially in GUI code.. _lBLAH sits next to _tfBLAH for example }} PS I *hate* IDE generated GUI code!

"I don't need no steenking IDE!" Gimme a file browser and a term and I'm happy as a clam. I have gotten a bit more modern now.. running gvim and mvn instead of vi and make but the only time I've used IDEs is when some employer forced me (and they still spawned vim as my editor)

As far as finding anything in my code the parent is right: "find ./ -name "*.java" -print | grep -v "\.svn" | grep -v "target\\" | xargs grep -in " makes for a fantastic alias :)

Re:Hungarian Notation (1)

fractoid (1076465) | more than 4 years ago | (#27768641)

From your post history, I'd say you probably fall outside the 'reasonable values of everyone' I was talking about - as in, you're probably rather more technically literate than average, and (judging by your UID) probably learned to program using a text editor and command line compiler rather than with MSVC++ 6 or later (the first IDE that I used that had IntelliSense, good god it was awesome!) Also you may not be joining large projects part way through where you are unfamiliar with most of the code.

I'd still maintain that while vi + gcc is situationally better for expert users, those of us trying not to burn too many brain cells on our daily work are almost certainly better off using a tool that can tell us what type a variable is, where it was declared, what members a struct has as we type its name, and all the other little things. They're not necessary but they most definitely speed up the arduous task of maintaining someone else's code.

Re:Hungarian Notation (1)

AuMatar (183847) | more than 4 years ago | (#27768715)

Borland Turbo C++ 3.0 (on 5 whole floppy disks!) rather than command line, but close enough- it was technically an IDE, but it was probably less capable than emacs. And I only tend to join large projects in mid-stream when I switch jobs.

Hey, use whatever tool works for you. Different personalities work best with different methods. I just get on edge when people say "everyone uses this"- it leads to assumptions about how others work that can cause problems for their coworkers/project mates down the line.

Re:Hungarian Notation (1)

grimsweep (578372) | more than 4 years ago | (#27768757)

I completely disagree; IDEs make more sense on larger projects. Don't get me wrong; none of my IDEs can beat vi's ability to handle large files, and nothing says convenient like an inline diff. But when you're working on something that starts to reach the 100+ file mark, I'd rather not rely on a simple editor and my window manager.

A good IDE helps you to organize, track, and maintain your sanity at this scale. In Netbeans 6+, I can jump to a class's declaration, find all usages of a method (regardless of any nasty nesting), and refactor a name change across a project and everything that depends on it. I know where most of my stuff is when I write it, but I find that it can be a NIGHTMARE reviewing a huge project written by a team without sensible documentation.

If you love regular expressions, don't let me stop you, but I think you're missing out.

Re:Hungarian Notation (1)

ppanon (16583) | more than 4 years ago | (#27768315)

It makes sense when you use it with abbreviations for a limited set of predefined types in procedural languages. When you're dealing with object oriented programming with multiple large class libraries which need separate namespaces to avoid class name conflicts, and lots of your variables are objects that are class instantiations, it loses a lot of its effect in clarifying code. You're better off using longer more descriptive variable names.

Re:Hungarian Notation (4, Insightful)

Dunx (23729) | more than 4 years ago | (#27768321)

Hungarian notation is bad because you are encoding type and scope information into the name, which makes it harder to change things later.

The fact that it is also one of the ugliest naming conventions is merely a secondary issue.

Re:Hungarian Notation (3, Insightful)

AuMatar (183847) | more than 4 years ago | (#27768323)

Three reasons.

1) Variables change type, and then you have to rename everything. It's a pain.
2) The extra information it gives you is minimal. I want to know what data is in a variable, not the language type used to hold it. If the name of the variable is firstName, I don't need it to be called lpcstrzFirstName; I know it's a string. And the language type is rarely interesting- I want to know that the variable outsideTemp holds degrees Fahrenheit, not that it's an integer. But Hungarian doesn't tell me that. (It also doesn't work even if I make a typedef for temperature- it'll still start with 'i'.)
3) It makes searching the code for a variable that much more annoying, because they all start with freaking 'i' and 'p'.

Re:Hungarian Notation (5, Informative)

smellotron (1039250) | more than 4 years ago | (#27768355)

And the language type is rarely interesting- I want to know that the variable outsideTemp holds degrees farenheit, not that it's an integer. But Hungarian doesn't tell me that

Good Hungarian notation does exactly that, actually. Check out Apps Hungarian [wikipedia.org], which encodes the semantic type of the data, rather than the language-level data type.

Of course stupid Hungarian notation is stupid. Stupid anything is stupid. The problem is, most people don't hear about the right approach.
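A hypothetical side-by-side in C (all names invented for illustration):

/* Systems Hungarian: encodes the language type, which the
   compiler already knows. */
int   iRow;
char *lpszName;    /* long pointer to zero-terminated string */

/* Apps Hungarian: encodes the semantic type -- what the data means. */
int   rwFirst;     /* rw = row index     */
int   colLast;     /* col = column index */
char *usInput;     /* us = unsafe (unvalidated) string */
char *sQuery;      /* s  = safe (escaped) string       */

/* Now "sQuery = usInput;" looks wrong at a glance, even though
   both are plain char* as far as the compiler is concerned. */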

Re:Hungarian Notation (4, Interesting)

hedronist (233240) | more than 4 years ago | (#27768621)

Correct. I worked for Charles at Xerox on the BravoX project and I initially fought Hungarian. One day I had an epiphany about what it was really about and then I didn't have any problems with it. Properly done it can reduce "name choosing time" to almost zero and it makes walking into other people's code almost completely painless. The key is that you encode semantics, not machine-level details.

Re:Hungarian Notation (4, Insightful)

snookums (48954) | more than 4 years ago | (#27768335)

Really it has nothing to do with IDEs, but more compilers, good coding practice and OO principles. A few cons:

  • The code should be simple enough that you can easily track a variable from declaration through use, or imply the type from the context and name.
  • Since most (all?) compilers and interpreters ignore the Hungarian prefix, there's no way of knowing that iFoo is really an integer. This is particularly true of weakly typed languages that are popular in a lot of modern programming environments.
  • In a large OO project you might have hundreds of types. Creating meaningful prefixes for all of them is going to be next to impossible, and having obj at the front of everything is redundant.

For a succinct summary: Hungarian Notation Considered Harmful [erngui.com]

I Like Hungarian Notation (1)

Greyfox (87712) | more than 4 years ago | (#27768421)

It's a pretty good indicator that the programmer was inept and trying to hide that fact with an obnoxious coding convention. I don't recall ever seeing a piece of production code using it where this wasn't the case.

Re:Hungarian Notation (3, Funny)

Mad Merlin (837387) | more than 4 years ago | (#27768611)

I don't get what the big deal is with Hungarian Notation. Why do people consider it a bad thing?

The proper name is Hungarian Line Noise, which should answer your question.

Someone doesn't get data compression (1)

SpazmodeusG (1334705) | more than 4 years ago | (#27768255)

From the article

For instance, one of my programming friends fit a 7K print driver into 5K by shifting (assembly language) execution one bit to the right.

Re:Someone doesn't get data compression (2, Interesting)

pavon (30274) | more than 4 years ago | (#27768409)

Yeah, that was odd. I could see if the final field of each assembly instruction was an address and everything was aligned to 2-word boundaries (msb-first), or you didn't use memory past a certain boundary (lsb-first), then you could save memory by compacting all the instructions by one bit (and then packing them together). Same for registers, or if you didn't use instructions with op-codes over a certain threshold. But if you were really saving one bit per instruction and you managed to compress 7K into 5K, that means your instructions were only 3.5 bits on average to begin with, which doesn't seem very likely. Something definitely got lost in translation there.

Eliminate Structured Programming? (1)

honestmonkey (819408) | more than 4 years ago | (#27768273)

I don't understand why they said that object-oriented code eliminated the need for structured programming. OO isn't any better than structured, just different. It has some good uses, but a lot of overhead, and it's not the best fit for all problems.

That said, I remember spaghetti code. There was a point when FORTRAN got if/then/else statements. It was obvious in one piece of code that they told the developers to start using them. They had written

      if (a .eq. 1) then
         go to 20
      else
         go to 30
      endif
 20   continue

My brain hurt after reading that.

Re:Eliminate Structured Programming? (4, Insightful)

Brett Buck (811747) | more than 4 years ago | (#27768307)

Actually, the worst spaghetti code I have ever seen (in 30+ years, most of it in life-critical systems) is OO C++. It doesn't have to be that way, but I have seen examples that would embarrass the most hackish FORTRAN programmers.

I am alarmed at the religious fervor and non-functional dogma associated with modern programming practices. Even GOTOs have good applications - yes, you can always come up with some other way of doing it, but why, and with how much extra futzing? But it's heresy.

Brett

Re:Eliminate Structured Programming? (1)

BikeHelmet (1437881) | more than 4 years ago | (#27768499)

You're absolutely right. I've seen some methods that really would benefit from a goto statement.

Goto lets you create loops with multiple entry points, which avoids other clutter code. It can enhance legibility when used inside a single method.
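The poster means multiple-entry loops, but the single-function use most C programmers will defend is centralized error cleanup -- a sketch (the function and names are hypothetical):

#include <stdio.h>
#include <stdlib.h>

int process(const char *path)
{
    int rc = -1;
    char *buf = NULL;
    FILE *f = fopen(path, "rb");
    if (!f)
        goto out;

    buf = malloc(4096);
    if (!buf)
        goto out_close;

    if (fread(buf, 1, 4096, f) == 0)
        goto out_free;

    rc = 0;  /* success; every failure path funnels through one cleanup */

out_free:
    free(buf);
out_close:
    fclose(f);
out:
    return rc;
}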

One that I'd like to see die is:

do {} while();

One that I'd like to see live (but doesn't yet?) is a language like java, but with optional manual garbage collection on a per-class basis.

Re:Eliminate Structured Programming? (2, Insightful)

Max Littlemore (1001285) | more than 4 years ago | (#27768619)

The worst I saw in my ~25 years, and I include old COBOL and BASIC crap, was not spaghetti in the strict sense of the word. It was a 10,000-line Java method written by a VB developer. There were no gotos, but the entire thing was ifs, switches, and for loops nested over 10 layers deep. Oh, and you did read that right, it was a method - the entire class had a solitary static method full of copy-and-pasted chunks. He explained that it was OO because it was Java. I might forgive him if it was a gigantic nested unrolled loop that ran like stink, but it was slow and crash-prone.

A bunch of gotos and gosubs are a pleasure to debug compared to that kind of poo, seriously.

No matter how nice a new paradigm that comes along, there is always some idiot who can make it suck far, far more than the last paradigm.

Rewrote that as 10 classes of ~20 lines each; it ran faster and never died until it was told to.

Get down to the metal (3, Interesting)

GreatDrok (684119) | more than 4 years ago | (#27768291)

Yeah, some of these are pretty old. I do remember working on a machine where the compiler wasn't smart enough to make the code really fast so I would get the .s file out and hand edit the assembly code. This resulted in some pretty spectacular speedups (8x for instance). Mind you, more recently I was forced to do something similar when working with some SSE code written for the Intel chips which was strangely slower on AMD. Turned out it was because the Intel chips (PIII and P4) were running on a 32 bit bus and memory access in bytes was pretty cheap. The Athlons were on the 64 bit EV6 bus and so struggled more so were slower. Once I added some code to lift the data from memory in 64 bit chunks and then do the reordering it needed using SSE the AMD chips were faster than the Intel ones.

Sometimes I think we have lost more than we have gained though with our reliance on compilers being smarter. It was great fun getting in there with lists of instruction latencies and manually overlapping memory loads and calculations. Also when it comes to squeezing the most out of machines with few resources, I remember being amazed when someone managed to code a reasonably competent Chess game into 1K on the Sinclair ZX81. Remember too that the ZX81 had to store the program, variables, and display all in that 1K. For this reason, the chess board was up at the left top of the screen. It was the funniest thing to be writing code on a 1K ZX81 and as the memory got full you could see less and less of your program until the memory was completely full and you could only see one character on screen....

Universal timeless programmer problem (4, Funny)

MacColossus (932054) | more than 4 years ago | (#27768293)

Documentation!

Re:Universal timeless programmer problem (0)

Anonymous Coward | more than 4 years ago | (#27768537)

Exactly, documentation is really important. The UNIX Programmer's Manual vol. 1 keeps my stomach cool under the notebook.

What a retard! (4, Insightful)

Alex Belits (437) | more than 4 years ago | (#27768297)

First of all, most actual practices mentioned are well alive today -- it's just that most programmers don't have to care about them because someone else already did it. And some (systems and library developers) actually specialize in doing just those things. Just recently I had a project that consisted almost entirely of x86 assembly (though at least 80% of it was in assembly because it was based on very old code -- similar projects started now would be mostly in C).

Second, things like spaghetti code and Hungarian notation are not "old"; they were just as stupid 20 years ago as they are now. There never was a shortage of stupidity, and I don't expect one any time soon.

Dirty old Fortran (4, Interesting)

wjwlsn (94460) | more than 4 years ago | (#27768305)

Hollerith constants
Equivalences
Computed Gotos
Arithmetic Ifs
Common blocks

There were worse things, horrible things... dirty tricks you could play to get the most out of limited memory, or to bypass Fortran's historical lack of pointers and data structures. Fortran-90 and its successors have done away with most of that cruft while also significantly modernizing the language.

They used to say that real men programmed in Fortran (or should I say FORTRAN). That was really before my time, but I've seen the handiwork of real men: impressive, awe-inspiring, crazy, scary. Stuff that worked, somehow, while appearing to be complete gibberish -- beautiful, compact, and disgustingly ingenious gibberish.

Long live Fortran! ('cause you know it's never going to go away)

Re:Dirty old Fortran (1)

hoytak (1148181) | more than 4 years ago | (#27768405)

I once interned with a high-energy physics research group of about 40 people or so that had a policy that all physics related code had to be written in fortran 77. The reasoning was that it had to be fast, and that everyone could read it as everyone knew it. That was 2003. So yeah, it's not going to die.

OTOH, that was the main reason I left physics and went into computer science -- I kept thinking "There has to be something better out there..." -- and I don't regret that decision.

Re:Dirty old Fortran (3, Funny)

techno-vampire (666512) | more than 4 years ago | (#27768463)

They used to say that real men programmed in Fortran (or should I say FORTRAN).

Years ago I programmed with a true genius. His language of choice was PL/1, but sometimes he had to use FORTRAN to fit in with what other people were doing. Any fool could write FORTRAN code in any language they wanted, but he was the only man I ever saw write PL/1 in FORTRAN.

Hardware not working as promised (1)

bradbury (33372) | more than 4 years ago | (#27768327)

I can think of at least two instances, on entirely different hardware (an IBM 370 and a Motorola 68000), where I had to discover that it was not working as promised. This involved a test-and-set instruction, which was essential for Oracle (in the early 1980s) to function reliably. Now of course with multi-core CPUs they have to get these things right -- but back in the "old" days the hardware engineers could be more careless.

A test-and-set instruction, for those uninformed, atomically reads a memory location and sets it in one indivisible step, so a CPU can take an exclusive lock without another CPU changing the location between the read and the write.

It is interesting, though infuriating, from a software standpoint when one is using it to diagnose hardware problems.
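In modern C the same primitive is exposed directly; a minimal spinlock sketch using C11 atomics (illustrative only, not the 1980s Oracle code):

#include <stdatomic.h>

static atomic_flag lock = ATOMIC_FLAG_INIT;

void acquire(void)
{
    /* atomic_flag_test_and_set atomically sets the flag and returns
       its previous value; spin until that previous value was clear. */
    while (atomic_flag_test_and_set(&lock))
        ;  /* busy-wait */
}

void release(void)
{
    atomic_flag_clear(&lock);
}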

Re:Hardware not working as promised (1)

Animats (122034) | more than 4 years ago | (#27768519)

Now of course with the multi-core CPUs they have to get these things right -- but back in the "old" days the hardware engineers could be more careless.

Multi-processor arbiter theory wasn't properly understood until the mid-1970s. Some early shared-memory processors had race conditions with access to shared memory. There's a famous Intel patent on this. See Arbiter on Wikipedia [wikipedia.org] for details. It turns out that you cannot resolve a race condition of that type in constant time. Sometimes you have to allow extra time for the arbiter to settle in one state or the other. Support to detect the need for that additional delay has to be designed in.

Engineers still get this wrong occasionally in I/O hardware. The people who design CPUs know all about it, but the lower-level people who design bus interfaces sometimes don't.

Intellisense and Debuggers (0)

ArcadeNut (85398) | more than 4 years ago | (#27768339)

Are probably the greatest things (productivity wise) they ever added to an IDE.

No more:

...code...

PRINT "Got to this point!"

...code...

...code...

PRINT "Now I'm here!"

Re:Intellisense and Debuggers (1)

AuMatar (183847) | more than 4 years ago | (#27768437)

Ewww. Intellisense is evil. It's just freaking annoying. I refuse to use IDEs just because of the existence of intellisense- I don't want the slowdown it brings, and I want exactly what I type and only what I type to appear. I'd rather have a computer with clippy than intellisense.

Re:Intellisense and Debuggers (1)

Tacvek (948259) | more than 4 years ago | (#27768575)

Working Intellisense should not cause any slowdown, because it should be running only in the background, using idle cycles. When it works well, it lets you avoid having to stop and check the manual or header file to determine the correct order of the parameters to a function, since the prototype appears as a tooltip when you type the open paren. Or when you remember that a class has a member function that does what you want, but it has an odd name which you'd recognize if you saw it. Just type the '.' and press the command completion key (or chord) and skim the list.

If you are actually getting text that you did not type being added to the document, and it is not because you explicitly pushed a command completion key (or chord) then something is wrong.

Re:Intellisense and Debuggers (1)

AuMatar (183847) | more than 4 years ago | (#27768613)

I'd rather just not have the feature- if I want to type something, I'll type it. Typing isn't exactly the slow part of programming as it is- for every minute I spend typing I spend several of design and debugging. You don't need to defend the feature- for the way I like to work it's just broken. I'd rather use freaking notepad without intellisense than an IDE with it.

Re:Intellisense and Debuggers (3, Insightful)

mark-t (151149) | more than 4 years ago | (#27768709)

A feature like intellisense isn't a feature to save typing time... its primary benefit is to save looking things up in a manual if one happens to not remember the exact spelling of some class member or function. If one knows exactly what one wants to type in the first place, it doesn't stop you, nor should it even slow you down, unless it's implemented poorly.

Re:Intellisense and Debuggers (1)

AuMatar (183847) | more than 4 years ago | (#27768729)

It slows me down by annoying me- I find text appearing underneath what I'm typing annoying. Which doesn't stop you from using it- just add a feature to turn the damn thing off. And while maybe it "shouldn't" slow things down, I've never seen an implementation that didn't, significantly. Maybe recent releases of Eclipse and Visual Studio have improved it, but on VS it used to be a multisecond slowdown at times.

Patching binaries (1)

spaceyhackerlady (462530) | more than 4 years ago | (#27768341)

Yikes!

About the only one I never used was self-modifying code. Does patching binaries with a hex editor count?

I always thought hungarian notation was a crock. It always seemed to be preoccupied with low-level data types, when what you were supposed to do was define data types that said what your data did, and leave it at that. I actually rather liked the way Pascal defined numeric types, leaving it up to the compiler to select an appropriate representation while you go on with your work.

...laura, who has programmed in APL and who has used punched cards

Fortran implicit integers (4, Informative)

belmolis (702863) | more than 4 years ago | (#27768353)

For some reason the article says that only variables beginning with I,J,and K were implicitly integers in Fortran. Actually, it was I-N.

Re:Fortran implicit integers (3, Informative)

techno-vampire (666512) | more than 4 years ago | (#27768573)

Yes, and the reason for that was that I and N were the first two characters of the word integer.

Re:Fortran implicit integers (5, Informative)

Geirzinho (1068316) | more than 4 years ago | (#27768705)

Nonsense, it's simply because i - n is commonly used to denote integer variables (sum x_i from 1 to n) in mathematical notation. This is a practice dating back at least to Gauss.

Duff's Device (4, Interesting)

Bruce Perens (3872) | more than 4 years ago | (#27768371)

Duff's Device [wikipedia.org]. Pre-ANSI C-language means of unrolling an arbitrary-length loop. We had an Evans and Sutherland Picture System II at the NYIT Computer Graphics Lab, and Tom wrote this to feed it IO as quickly as possible.
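For anyone who hasn't seen it, the canonical form in C (per the Wikipedia article; to is a memory-mapped output register, which is why it is never incremented, and count must be positive):

void send(short *to, short *from, int count)
{
    /* Unroll 8x; the switch jumps into the middle of the loop
       body to soak up the count % 8 leftover copies. */
    int n = (count + 7) / 8;
    switch (count % 8) {
    case 0: do { *to = *from++;
    case 7:      *to = *from++;
    case 6:      *to = *from++;
    case 5:      *to = *from++;
    case 4:      *to = *from++;
    case 3:      *to = *from++;
    case 2:      *to = *from++;
    case 1:      *to = *from++;
            } while (--n > 0);
    }
}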

Some of those are just wrong (4, Informative)

AuMatar (183847) | more than 4 years ago | (#27768379)

First off, most of the things on the list haven't gone away; they've just moved to libraries. It's not that we don't need to understand them, it's just that not everyone needs to implement them (especially the data structures one- having a pre-written one is good, but if you don't understand them thoroughly you're going to write really bad code).

On top of that, some of their items:

*Memory management- still needs to be considered in C and C++, which are still top-5 languages. You can't even totally ignore it in Java- you get far better results from the garbage collector if you null out your references properly, which does matter if your app needs to scale.

I'd even go so far as to say ignoring memory management is not a good thing. When you think about memory management, you end up with better designs. If you see that memory ownership isn't clearcut, it's usually the first sign that your architecture isn't correct. And it really doesn't cause that many errors with decent programmers(if any- memory errors are pretty damn rare even in C code). As for those coders who just don't get it- I really don't want them on my project even if the language doesn't need it. If you can't understand the request/use/release paradigm you aren't fit to program.

*C style strings

While I won't argue that it would be a good choice for a language today (heck, even in C, if it wasn't for compatibility I'd use a library with a separate pointer and length), it's used in hundreds of thousands of existing C and C++ libraries and programs. The need to understand it isn't going to go away anytime soon. And anyone doing file parsing or network IO needs to understand the idea of terminated data fields.
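What "a separate pointer and length" looks like in practice -- a counted-string sketch (names made up for illustration):

#include <stddef.h>

struct str {
    char   *data;   /* need not be NUL-terminated */
    size_t  len;    /* stored explicitly, never discovered by scanning */
};

/* O(1), and safe for payloads that legitimately contain '\0' --
   exactly what NUL-terminated C strings can't represent. */
size_t str_len(const struct str *s) { return s->len; }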

Yes, I'm old (5, Insightful)

QuantumG (50515) | more than 4 years ago | (#27768381)

* Sorting algorithms

If you don't know them, you're not a programmer. If you don't ever implement them, you're likely shipping more library code than application code.

* Creating your own GUIs

Umm.. well actually..

* GO TO and spaghetti code

goto is considered harmful, but it doesn't mean it isn't useful. Spaghetti code, yeah, that's the norm.

* Manual multithreading

All the time. select() is your friend, learn it. (A sketch follows at the end of this comment.)

* Self-modifying code

Yup, I actually write asm code.. plus he mentions "modifying the code while it's running".. if you can't do that, you shouldn't be wielding a debugger, edit and continue, my ass.

* Memory management

Yeah, garbage collection is cheap and ubiquitous, and I'm one of the few people that has used C++ garbage collection libraries in serious projects.. that said, I've written my own implementations of malloc/free/realloc and gotten better memory performance. It's what real programmers do to make 64 gig of RAM enough for anyone.

* Working with punch cards

Meh, I'm not that old. But when I was a kid I wrote a lot of:

100 DATA 96,72,34,87,232,37,49,82,35,47,236,71,231,234,207,102,37,85,43,78,45,26,58,35,3
110 DATA 32,154,136,72,131,134,207,102,37,185,43,78,45,26,58,35,3,82,207,34,78,23,68,127

on the C64.

* Math and date conversions

Every day.

* Hungarian notation

Every day. How about we throw in some reverse polish notation too.. get a Polka going.

* Making code run faster

Every fucking day. If you don't do this then you're a dweeb who might as well be coding in php.

* Being patient

"Hey, we had a crash 42 hours into the run, can you take a look?"
"Sure, it'll take me about 120 hours to get to it with a debug build."

Self-modifying code has been a lose for a decade. (4, Informative)

Animats (122034) | more than 4 years ago | (#27768465)

Self-modifying code
Yup, I actually write asm code.. plus he mentions "modifying the code while it's running".. if you can't do that, you shouldn't be wielding a debugger.

Code that generates code is occasionally necessary, but code that actually modifies itself locally, to "improve performance", has been obsolete for a decade.

IA-32 CPUs, including modern superscalar ones, still support self-modifying code for backwards compatibility. (On most RISC machines, it's disallowed, and code is read-only, to simplify cache operations.) But the performance is awful. Here's what self-modifying code looks like on a modern CPU:

Execution is going along, with maybe 10-20 instructions pre-fetched and a few operations running concurrently in the integer, floating point, and jump units. Alternate executions paths may be executing simultaneously, until the jump unit decides which path is being taken and cancels the speculative execution. The retirement unit looks at what's coming out of the various execution pipelines and commits the results back to memory, checking for conflicts.

Then the code stores into an instruction in the neighborhood of execution. The retirement unit detects a memory modification at the same address as a pre-fetched instruction. This triggers an event which looks much like an interrupt and has comparable overhead. The CPU stops loading new instructions. The pipelines are allowed to finish what they're doing, but the results are discarded. The execution units all go idle. The prefetched code registers are cleared. Only then is the store into the code allowed to take place.

Then the CPU starts up, as if returning from an interrupt. Code is re-fetched. The pipelines refill. The execution units become busy again. Normal execution resumes.

Self-modifying code hasn't been a win for performance since the Intel 286 (PC-AT era, 1985) or so. It might not have hurt on a 386. Anything later, it's a lose.

Re:Self-modifying code has been a lose for a decad (1)

QuantumG (50515) | more than 4 years ago | (#27768495)

Hehe.. dude, the purpose of self-modifying code these days is mostly as an anti-debugging trick.. although sometimes it's as a replacement for this code:

static bool flag = true;
if (flag) {
    /* do something once only */
    flag = false;
}

but I've not seen any compilers that do it automatically. When I worked at VMWare there was actually a bit of code that did self modification specifically to *cause* a cache clear of the current page.

Re:Yes, I'm old (1)

Spit (23158) | more than 4 years ago | (#27768529)

Yeah self-modifying is a satisfying trick. Left off the list though is cycle-counting and padding, required for juggling processor and display.

Re:Yes, I'm old (-1, Redundant)

Anonymous Coward | more than 4 years ago | (#27768541)

It's what real programmers do to make 64 gig of RAM enough for anyone.

Presently, 64 gigs of RAM should be enough for anyone...right?

Re:Yes, I'm old (0, Redundant)

isBandGeek() (1369017) | more than 4 years ago | (#27768555)

It's what real programmers do to make 64 gig of RAM enough for anyone.

I would really love 64gig of RAM in my PC. Surely you mean 64MB?

Re:Yes, I'm old (1)

QuantumG (50515) | more than 4 years ago | (#27768579)

Hehe, no I don't. Server machines with 16 processors and 64 gig of ram and the app still runs out of memory. This is what you get for trying to scale a multi-threaded app by throwing hardware at it.

Re:Yes, I'm old (1)

SanityInAnarchy (655584) | more than 4 years ago | (#27768605)

If you don't know them, you're not a programmer. If you don't ever implement them, you're likely shipping more library code than application code.

If you're shipping more library code than application code, that's a good thing. It means you're not reinventing the wheel.

All the time. select() is your friend, learn it.

Nah, I'd rather use eventmachine if I'm going for cooperative multithreading, and I'd like to be able to use Erlang-style processes for pre-emptive.

plus he mentions "modifying the code while it's running"

Yeah, that was the one that made me go "wtf"? The examples given certainly aren't needed often anymore. However, modern scripting languages do this almost by definition. Sufficiently high-level metaprogramming can be very good. (It can also be very bad, but so can anything.)

I've written my own implementations of malloc/free/realloc and gotten better memory performance.

Great! Put em in the GC library, so the rest of us don't have to think about it.

* Math and date conversions

Really? This should be in the standard library for just about anything. In Rails, the example given would be calculated as: 3.weeks.from_now

* Hungarian notation

Ew.

Every fucking day. If you don't do this then you're a dweeb who might as well be coding in php.

Or Ruby. PHP sucks for reasons other than slowness and its potential to attract noobs.

However, "making code run faster" is what you do after the code runs, and does what it's supposed to do, and is modular, flexible, and maintainable. And even then, you're often faced with a very clear question of programmer time vs CPU time. Sometimes it's worth it (games, embedded systems) -- often times, you do the simplest performance hacks you can, and throw hardware at the rest.

"Sure, it'll take me about 120 hours to get to it with a debug build."

Yeah... I don't have this problem. Ever. But then, I'm a web developer, so I probably have things a bit easier...

Just what is it you do?

Re:Yes, I'm old (2, Informative)

QuantumG (50515) | more than 4 years ago | (#27768695)

However, "making code run faster" is what you do after the code runs, and does what it's supposed to do, and is modular, flexible, and maintainable.

Yeah, and you say this like you've never experienced it. Honestly, if you're writing new code you're in the vast minority of programmers.. or you're just playing around. Most of us are working on code that was written years ago and has to keep doing what it does or the company will lose money.

I'm a web developer

Ahh, I see.

I'm.. not.

Linux kernel? (1)

hoytak (1148181) | more than 4 years ago | (#27768387)

Given that half the stuff people supposedly don't have to worry about are just things taken over by the kernel, I'm guessing she didn't poll many of them...

As someone who works in the gaming industry... (0)

Anonymous Coward | more than 4 years ago | (#27768389)

I can attest that we still do a fair bit of this stuff. We implement our own sorts/searches, data structures, GUIs, low- to high-level threading primitives, and memory management. We aren't afraid of pointers, we do strange things to make stuff run fast, and we use self-modifying code in some cases (in-game dynamic scripting, for example).

Most of us don't use gotos or hungarian notation anymore, although occasionally you'll find some brain dead game studios who still do.

Also, the article was written by someone who isn't a hardcore developer. I can smell a casual from 32 hops away.

Memory Management (3, Insightful)

Rob Riepel (30303) | more than 4 years ago | (#27768407)

Try overlays...

Back in the day we had to do all the memory management by hand. Programs (FORTRAN) had a basic main "kernel" that controlled the overall flow, and we grouped subprograms (subroutines and functions) into "overlays" that were swapped in as needed. I spent hours grouping subprograms into roughly equal-sized chunks just to fit into core, all the while trying to minimize the number of swaps necessary. All the data was stored in huge COMMON blocks so it was available to the subprograms in every overlay. You'd be fired if you produced such code today.

Virtual memory is more valuable than full screen editors and garbage collection is just icing on a very tall layer cake...

Re:Memory Management (1)

Thornburg (264444) | more than 4 years ago | (#27768545)

Virtual memory is more valuable than full screen editors and garbage collection is just icing on a very tall layer cake...

MMmmmm... Garbage Collection Cake, yum yum.

Is your name Oscar, per chance?

Up hill both ways (1)

oljanx (1318801) | more than 4 years ago | (#27768419)

Yeah yeah, I've heard this story before. Hey what about 64K memory segments? I'm at least old enough to have had a headache or two with that...

radio in the computer case (5, Interesting)

bcrowell (177657) | more than 4 years ago | (#27768425)

Circa 1984, when I did summer programming jobs at Digital Research (purveyors of CP/M), one of the programmers there showed me how you could put a transistor radio inside the case of your computer. You could tell what the computer was doing by listening to the sounds it picked up via the RF emissions from the computer. For instance, it would go into a certain loop, and you could tell because the radio would buzz like a fly.

Documentation was a lot harder to come by. If you wanted the documentation for X11, you could go to a big bookstore like Cody's in Berkeley, and they would have it in multiple hardcover volumes. Each volume was very expensive. The BSD documentation was available in the computer labs at UC Berkeley in the form of 6-foot-wide trays of printouts. (Unix man pages existed too, but since you were using an ADM3A terminal, it was often more convenient to walk over to the hardcopy.)

On the early microcomputers, there was no toolchain for programming other than MS BASIC in ROM. Assemblers and compilers didn't exist. Since BASIC was slow, if you wanted to write a fast program, you had to code it on paper in assembler and translate it by hand into machine code. But then in order to run your machine code, you were stuck because there was no actual operating system that would allow you to load it into memory from a peripheral such as a cassette tape drive. So you would first convert the machine code to a string of bytes expressed in decimal, and then write a BASIC program that would do a dummy assignment into a string variable like 10 A$="xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx". Then you would write self-modifying code in BASIC that would find the location where the string literal "xxx...." was stored, and overwrite it with your machine code. So now if you gave the LIST command, it would display the program on the screen, with the string literal displayed as goofy unprintable characters. Then you would code the program so it would execute the machine code stored at the address of the variable A$. Finally you'd save the program onto cassette.

Re:radio in the computer case (1)

Aardpig (622459) | more than 4 years ago | (#27768753)

One of the great strengths of BBC BASIC was that you could put assembly language in-line. You didn't have to muck around with hand-assembly, like you did on other platforms.

No programming language... (3, Insightful)

Nakoruru (199332) | more than 4 years ago | (#27768435)

You will never find a programming language that frees you from the burden of clarifying your thoughts.

http://www.xkcd.com/568/

Wasnt that old (0, Offtopic)

gmuslera (3436) | more than 4 years ago | (#27768439)

I thought that Schindler's List was from the 40's, but this one seems a bit more recent.

it's even really (0, Offtopic)

ILuvRamen (1026668) | more than 4 years ago | (#27768449)

And old people have powered wheel chairs so it's even. Not that I'm implying a 35 year old programmer would need one lol. In fact, they'll have robolegs or their brain in a robot body when they're old :P

One page version (0)

Anonymous Coward | more than 4 years ago | (#27768461)

http://www.computerworld.com/action/article.do?command=printArticleBasic&taxonomyName=Development&articleId=9132061&taxonomyId=11

In a word, DOS (0)

Anonymous Coward | more than 4 years ago | (#27768467)

It started with "640K ought to be enough for anybody". That led to an incredible series of hacks (terminate-and-stay-resident, upper memory address space overcommit, small-frame expanded memory, large-frame expanded memory, the "extra" 64K segment, VCPI, DPMI) that promised to maintain backwards compatibility with existing DOS software while allowing slick new applications to use multiples of the 640K limit. Sometimes they did, until the customer installed some new package which used other undocumented tricks that didn't play well with the first vendor's. Then the customer support lines would light up, and everyone would point fingers at everyone else.

Meanwhile, Intel came up with a 16-bit chip that allowed apps to use as much as 16M (considered a huge amount of memory in those days), but only in "protected mode", which wasn't meant to be compatible with DOS. It turned out that compatibility was something that customers cared about. So startups including Phar Lap and Rational (no relation to the later ClearCase vendor) came up with system software to allow DOS applications to leverage 16-bit protected mode (not too hard), in a graceful and robust manner in the presence of other DOS-extended apps (hard). Microsoft wrote its own DOS extender for Windows 3; customers could run Windows in 16-bit real mode, in 80286 protected mode, or in 80386 protected mode. Marketers were apparently convinced that there was a huge base of installed 80286 machines; while this was undoubtedly the case, most of those users had no intention of upgrading to Windows 3 or the GUI spreadsheets and word processing packages that ran atop it. But engineers at the time spent absurd amounts of time trying to get their programs working in the "brain dead" 80286 protected mode environment, with signs like "SS != DS" posted in cubicles as reminders of pitfalls to avoid.

And that was just memory management aspect of DOS and Windows 3 (which was really just an graphical DOS extender, as Andrew Schulman pointed out in his book "Undocumented Windows").

Remember "inside out" coding? (2, Informative)

roc97007 (608802) | more than 4 years ago | (#27768479)

"Top-down" coding produced readable but horribly inefficient code. Doesn't do any good for the code to work if it doesn't fit in the e-prom.

"Bottom up" code produced reasonably efficient spaghetti. Good luck remembering how it worked in 6 months.

"Inside-out" coding was the way to go.

You wrote your inside loops first, then the loop around that, then the loop around that. Assuming the problem was small enough that you could hold the whole thing in your head at one time, the "inside-out" technique guaranteed the most efficient code, and was moderately readable.

At least, that's the way I remember it. 'S been a long time...

Now, these new-fangled tools do all the optimizing for you. 'S taken all the fun outta coding.

swapping two values without a temporary variable (5, Interesting)

domulys (1431537) | more than 4 years ago | (#27768517)

x = x xor y
y = x xor y
x = x xor y

Now you know!
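The same trick in C, with the caveat that makes it a party trick rather than a tool: if both names alias the same object, the first XOR zeroes it.

#include <stdio.h>

int main(void)
{
    unsigned x = 3, y = 5;

    x ^= y;
    y ^= x;   /* y = old_y ^ (old_x ^ old_y) = old_x */
    x ^= y;   /* x = (old_x ^ old_y) ^ old_x = old_y */
    printf("%u %u\n", x, y);   /* prints "5 3" */

    /* But "swapping" a variable with itself this way yields 0,
       since x ^ x == 0 on the first step -- a real bug if the
       swap is behind two pointers that might be equal. */
    return 0;
}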

Spagetti code (1)

G3ckoG33k (647276) | more than 4 years ago | (#27768531)

Have you ever wondered why procedural spaghetti code is hard to read? Me too. Maybe it is because the code is full of references to a myriad of places, which makes it hard to follow. However, that may apply to object-oriented programming too. Have you ever looked at a class and wondered which and what the...

Old school coding problems... (2, Funny)

greenguy (162630) | more than 4 years ago | (#27768549)

1. Writing on a slate. Man, did that compile slow!
2. When you heard "stack," it meant firewood.
3. The error message was a ruler across the knuckles.
4. Memory overloads. My own memory, I mean.
5. If you wanted to change your resolution, you had to wait until New Years.
6. Try coding when you're drinking the original mountain dew.
7. The rantings of code guru SFBM, the inventor of open Morse code.

I can't believe that no one has mentioned... (0)

Anonymous Coward | more than 4 years ago | (#27768553)

programming without a decent source control system. Too many programmed without any source control system. And for a long time the ones that existed generally used exclusive locking to avoid conflicts. Ugh.

Can you imagine trying to synchronize software without the patch utility? That's just how things were, until Larry Wall got tired of people applying changes by hand and breaking rn horribly. And even once diff and patch existed, it took a while for people to accept that you could use them as the basis for an effective source control system. Kids today have no idea what an advance CVS was in its heyday.

Another item that should be on the list is programming without unit tests, though far too many people still don't write them. Unit testing is not particularly new, either: the first version of Perl in 1987 came with a set of unit tests, and the test suite is one of the reasons Perl managed to be ported to so many systems.

I am still waiting in vain for other mainstream programming groups to learn what has been standard in Perl since CPAN started - every library you get should come with a set of unit tests, and those unit tests should run successfully before you install the library. Some almost get it. For example lots of Ruby folks write unit tests, but unfortunately by default rubygems does not run them before installation.

Programming IOCPs has got to count for something (1)

HockeyPuck (141947) | more than 4 years ago | (#27768615)

Ah, the good old days...

CHPID PATH=(F9,FD),SHARED, *
          PARTITION=((OSPX,OSP1,OSP2,OSP3,OSP4),(OSPX,OSP1,OSP2,OS*
          P3,OSP4)),SWITCH=97,TYPE=FC
*
    CNTLUNIT CUNUMBR=1097,PATH=(F9,FD),UNITADD=((00,001)), *
          LINK=(FE,FE),UNIT=2032
        CNTLUNIT CUNUMBR=B700,PATH=(F9,FD),UNITADD=((00,256)), *
          LINK=(**,05,09,0B),CUADD=7,UNIT=2105

Wait, this was yesterday.

Overlay trees (1)

pesc (147035) | more than 4 years ago | (#27768663)

I developed a program on a PDP-11. It was a 16-bit computer with 64 KB of memory and no virtual memory, so to fit a large program you had to build an overlay tree.

Consider a function a() that called b0() and b1(), where b0() in turn called c0() and c1().

By knowing the call tree in your program and some other stuff about the dynamics of your program you could arrange so that b0() and b1() shared the same space in memory. Likewise for c0() and c1().

By studying linker maps you could create an overlay description file to make your program fit into 64kB. The OS would use this to automatically bring pieces of code in and out.

You can only imagine the consequences when you started changing the program and the pieces grew in size, or new calls were added (b1() now calls c0()). You'd often have to build a new overlay tree by hand.
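On RSX-11, for instance, the overlay description fed to the Task Builder looked roughly like this; the ODL syntax here is reconstructed from memory, so treat it as a sketch rather than gospel:

    .ROOT   A-(B0-(C0,C1),B1)
    .END

Here A stays resident at the root, B0 and B1 overlay the same memory, and within B0's branch C0 and C1 overlay each other, which is exactly the arrangement described above.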

No wonder VAX/VMS was such a hit in the late seventies with 32-bit computing and virtual memory support.

Hungarian Notation (0)

Anonymous Coward | more than 4 years ago | (#27768677)

Regardless of whether Hungarian Notation is considered an old skool trick (which I wasn't aware of), I'd still find it useful, because it gives your code a certain amount of self-documentation just by glancing at it.

Example:

m_pSomeStruct

From the Hungarian Notation alone we can instantly tell that this is a pointer to some kind of structure, and that it is a member variable of a class or structure.

Look how much we can instantly figure out merely by glancing at it without sifting through comments or other documentation!
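A few more prefixes in one place, as a hypothetical C sketch (prefix conventions varied from shop to shop, and the parent's m_ prefix additionally marks a C++ class member):

    /* Hypothetical Systems-Hungarian declarations; the prefixes
       carry the documentation. */
    struct Data;                    /* some payload type, defined elsewhere */

    struct Widget {
        int            nCount;     /* n:   integer                           */
        unsigned long  ulFlags;    /* ul:  unsigned long                     */
        char          *pszName;    /* psz: pointer to zero-terminated string */
        struct Data   *pData;      /* p:   pointer (here, to a Data)         */
    };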

Old skool or not, I'd recommend every coder adopt, or continue, this practice.

The stack is just taller (1)

schickb (629869) | more than 4 years ago | (#27768683)

The main thing that has changed over the years is that the software stack has grown deeper, and today there are many more people working at higher abstraction levels where a lot is handled for them.

But under the covers, little has changed. Someone must write and understand code all the way down to the hardware. You can bet that people writing kernels and drivers think a lot about space and speed. Or write some code for one of the low cost micro-controllers that are in virtually all electronic devices and you won't have the luxury of a large software stack.

There are very likely *more* people writing low level code today than 20 years ago. Only the percentage of programmers writing low level code is declining because there are many orders of magnitude more programmers writing higher level code.

How about working with EPROM burners and erasers (1)

shoor (33382) | more than 4 years ago | (#27768735)

I worked for a time in what are now called 'embedded systems'. Our prototype equipment had EPROMs (erasable programmable read-only memory). The burner was roughly square, covered with sockets for different kinds of EPROMs; you had to find the socket that matched yours. You erased the ROMs by putting them in a box with a strong ultraviolet light. You could tell the erasable ones by the little window that let in the UV.

Do people still use those old-fashioned hardware debuggers (in-circuit emulators) with an adapter whose pins plugged into the CPU socket on the board, so the unit could capture the signals on all the pins?

They're still here! (1)

Wintee (1521397) | more than 4 years ago | (#27768747)

These historical artifacts of programming haven't gone away; they're still here. Runtime libraries and OO languages hide a lot of the complexity from users, but there are still plenty of lists and plenty of memory-management code hanging around in the C++ runtime libraries.

As a compiler developer for DSPs, I still encounter a lot of these problems. Memory is scarce, and applications need a minimal footprint. On some processors we even have to do pointer arithmetic just to compute pointers. And while we do provide C++ support and an abridged C++ runtime library, you would be amazed at the number of users who stick to assembler and C. They insist that C++ is slower (which it can be, if you get lost in certain parts of the language) and far more memory-hungry (which it certainly is, if you pull in large sections of the runtime library), even though it can drastically reduce time to market.

DSPs are the little brothers of the desktop CPUs most people will be programming for. Because they're smaller, with tighter power budgets, the processors are still playing catch-up to your PC; multi-core DSPs are in some ways still an emerging technology, and the languages and tools used on them are steps behind too. Not to mention the conditioning of a lot of DSP developers: one of our senior chip designers (just retired) always designed his chips for assembler, on the grounds that "no one programs a DSP in C".