
What Today's Coders Don't Know and Why It Matters

Soulskill posted more than 3 years ago | from the we'll-just-test-on-production dept.

Programming 368

jfruhlinger writes "Today's programmers have much more advanced languages and more forgiving hardware to play with — but it seems many have forgotten some of the lessons their predecessors picked up in a more resource-constrained era. Newer programmers are less adept at identifying hardware constraints and errors, developing thorough specifications before coding, and low-level skills like programming in assembly language. You never know when a seemingly obsolete skill will come in handy. For instance, Web developers who cut their teeth in the days of 14.4 Kbps modems have a leg up in writing apps for laggy wireless networks."


Newsflash (2, Insightful)

Anonymous Coward | more than 3 years ago | (#37001432)

Experienced people have experience in things they have experienced

tl;dr (1, Funny)

itchythebear (2198688) | more than 3 years ago | (#37001444)

tl;dr - "Get off my lawn"

Re:tl;dr (0)

PylonHead (61401) | more than 3 years ago | (#37001548)

Yeah, that was my first thought. I'm pretty long in the tooth myself, but for the most part, if something has been forgotten, it's because it is no longer of use.

Sure things come back (the mobile app market is a good example), and this becomes a great opportunity for the older generation to pass on information to the younger. But I'm not going to miss the days where you had to figure out how to handle your data set when you couldn't use more than 64k of consecutive memory.

Re:tl;dr (2)

cptdondo (59460) | more than 3 years ago | (#37001616)

I don't miss the days of my PDP-8 programming. OTOH, I do a lot of embedded stuff as a hobby these days, and it's incredible to think that a lot of the newer programmers think you need huge resources to do even minor tasks. I sometimes choke at the megabytes of dependencies sucked into a small piece of code... Anyway, it's a bit of "Get off my lawn" and a bit of "when the car starts skidding, you really do need to understand oversteer, understeer, and torque steer to keep from getting wrapped around that light pole." And yes, I do know that torque steer has little to do with the first two. :-)

Re:tl;dr (2)

Lennie (16154) | more than 3 years ago | (#37002210)

I prefer "less code"*.

I find that less code
- is usually easier to manage
- has far fewer dependencies
- is usually easier to debug
- is usually more efficient

I have a feeling some people don't even check anymore what dependencies they pull in.

Just recently I had to look at Microsoft Exchange: a database had a problem, and the command-line tool to fix it didn't work anymore either.

I ran depends.exe on it. The command-line tool that repairs the Exchange database files indirectly depends on Internet Explorer. I copied a DLL into the directory from another machine and now the command-line tool works again.

Some people really are crazy.

* Not to the extreme, of course.

Re:tl;dr (1)

Lennie (16154) | more than 3 years ago | (#37002222)

While I'm making lists, I should have added that less code is also:
- easier to read
- easier to understand
- faster to get through when reading it

Re:tl;dr (2)

coolmadsi (823103) | more than 3 years ago | (#37001796)

Yeah, that was my first thought. I'm pretty long in the tooth myself, but for the most part, if something has been forgotten, it's because it is no longer of use.

Sure things come back (the mobile app market is a good example), and this becomes a great opportunity for the older generation to pass on information to the younger. But I'm not going to miss the days where you had to figure out how to handle your data set when you couldn't use more than 64k of consecutive memory.

Thing is, a lot of the stuff that used to be required knowledge and isn't now is unlikely to come up very often any more. So knowing it by heart isn't as important when it can just be looked up for a one-off use. Sure, a basic overview of some of the potential issues may be useful, but memorizing the details of something you might use once and never again, and that can be looked up, isn't.

I think my phone has a bit over 200MB of memory, so any code optimisations for a computing device with less than that aren't as important as they would have been back when computers had less RAM (computers did have double digit RAM at some point, right? My history of computer hardware isn't that great)

Re:tl;dr (1)

maxwell demon (590494) | more than 3 years ago | (#37001876)

What people consistently forget is that today there are generally many applications running at the same time. Also, it makes a big difference in performance if the data you are about to access is still in the cache.

Re:tl;dr (1)

queazocotal (915608) | more than 3 years ago | (#37002400)

In addition, most app programmers test their application on their comparatively fast machine, while not running other stuff in the background.

If developers were forced to test their apps on 4-year-old hardware, it would improve general performance a lot.

Re:tl;dr (1)

sqlrob (173498) | more than 3 years ago | (#37002034)

Depends what you mean by "double digit".

I cut my teeth on a computer with 5K of RAM. If you want to go true single digit, I think there were some with 16 bytes.

Re:tl;dr (0)

Anonymous Coward | more than 3 years ago | (#37002098)

Wow, that's 10x more RAM than I got when I bought my VIC-20; the stock version had 0.5K, and then I splurged an extra 50 bucks to buy the 3K expansion cartridge.

Re:tl;dr (1)

50000BTU_barbecue (588132) | more than 3 years ago | (#37002130)

VIC-20 represent! I built a 16K memory expansion from plans in 73 magazine and chips I bought with money from collecting return deposit bottles!

Re:tl;dr (1)

somersault (912633) | more than 3 years ago | (#37002260)

(computers did have double digit RAM at some point, right? My history of computer hardware isn't that great)

*sob*

Re:tl;dr (0)

Neil Blender (555885) | more than 3 years ago | (#37001814)

Where can I get the headers for tl:dr? yum and apt can't find them. I'm lost.

Copy (0, Funny)

Anonymous Coward | more than 3 years ago | (#37001454)

Paste

Minor Edit

Commit

PEEK (0)

MRe_nl (306212) | more than 3 years ago | (#37001582)

POKE

PUSH

PULL

Re:PEEK (0, Funny)

Anonymous Coward | more than 3 years ago | (#37001626)

This is about programming, not gay sex

Re:PEEK (1)

somersault (912633) | more than 3 years ago | (#37002292)

# man sex

Fashion (3, Informative)

funkatron (912521) | more than 3 years ago | (#37001556)

So your particular skillset has fallen out of vogue for a while; it happens. If this stuff is useful, it'll come back. For instance, a lot of the hardware related skills mentioned are still around, they're just considered to be a specialisation these days, in most situations it's safe to assume that the hardware either performs within spec or that the lower layer (OS etc) is dealing with any irregularities.

Re:Fashion (2)

snowgirl (978879) | more than 3 years ago | (#37002018)

So your particular skillset has fallen out of vogue for a while; it happens. If this stuff is useful, it'll come back. For instance, a lot of the hardware related skills mentioned are still around, they're just considered to be a specialisation these days, in most situations it's safe to assume that the hardware either performs within spec or that the lower layer (OS etc) is dealing with any irregularities.

I'm actually a youngin' who took interest in the lower layers, and developed my skill set around that. I'm kind of waiting for the oldies to retire/kick the bucket and open up more openings for people like me. I anticipate that I'll be one of the hawt shite programmers once the population of systems programmers starts dwindling...

those young whippersnappers (1)

sneakyimp (1161443) | more than 3 years ago | (#37001590)

They don't know that old trick from liblawn
Lawn::GetOffLawn(kid);

Re:those young whippersnappers (1)

Sarusa (104047) | more than 3 years ago | (#37001756)

This obviously calls for a LawnFactoryFactorySingletonFactory pattern

Re:those young whippersnappers (1)

Desler (1608317) | more than 3 years ago | (#37001818)

I think you need an adapter included in there somewhere.

Re:those young whippersnappers (3, Funny)

maxwell demon (590494) | more than 3 years ago | (#37001956)

No, it clearly demands a combination of the observer pattern with the command pattern: You observe your lawn, and if you see kids, you command them to get off it.

Re:those young whippersnappers (4, Funny)

billcopc (196330) | more than 3 years ago | (#37001832)

That lib requires cooperative event handling in the kid class. I much prefer the longer, but deterministic form:

if ( $myLawn->getContents()->filter({type: kid})->count() > 0 ) {
    $myShotgun = new Shotgun()
    $myShotgun->loadAmmo();
    $myLawn->getOwner()->wieldObject($myShotgun);
    for( $i = 5; $i>0; $i--) { sleep(1000); }
    while ( $myLawn->getContents()->filter({type: kid})->count() > 0 ) {
        $myShotgun->fire();
    }
}

Re:those young whippersnappers (0)

Anonymous Coward | more than 3 years ago | (#37001936)

I think your missing semicolon after "new Shotgun" may be a problem.

Re:those young whippersnappers (1)

JamesP (688957) | more than 3 years ago | (#37002360)

What?

it should be get_off(my_lawn), none of this modern 'object orientation' nonsense

or maybe

lea ax,[my_lawn]
call get_off

The problem is (4, Insightful)

geekoid (135745) | more than 3 years ago | (#37001596)

they aren't trained to engineer software, and the industry hasn't incorporated good engineering practices.

Re:The problem is (2)

ADRA (37398) | more than 3 years ago | (#37001680)

Coming from a legacy modernization project, just because people wrote programs 10, 20, 30 years ago doesn't mean that the code was good, or that the developers knew what they were doing. One would hope that decades of development experience would teach a well rounded set of skills and often it does.

To sum up, there's no reason a brat five years out of school learning technology X today is any less capable than a brat five years out of school learning technology Y was in the '80s/'90s.

This... (1)

RobbieThe1st (1977364) | more than 3 years ago | (#37001598)

would explain why Runescape runs perfectly well at dialup or EDGE (4KB/sec) speeds, while most other games do not: the original creators probably had some experience that way.

Re:This... (0)

Anonymous Coward | more than 3 years ago | (#37001820)

Even during a 25-man raid in WoW, the game is only using 8KB/s of bandwidth.

I used to play EverQuest 1 on Dialup... and I did just fine.

Re:This... (1)

loufoque (1400831) | more than 3 years ago | (#37001856)

Game development is where you see the worst code ever.

Re:This... (-1)

Anonymous Coward | more than 3 years ago | (#37002322)

Having 17 years in the games industry...
I agree and disagree.
There are the hot-shit programmers who write the nasty hard stuff (what I did) and they tended to write pretty decent code or it all fell apart.
Now the AI coders.. shudder.

Re:This... (1)

kiddygrinder (605598) | more than 3 years ago | (#37002090)

WoW works OK for 5-mans over dialup, as long as you don't go near a major city...

1200 bps! Our C-64 had 300 bps (1)

18_Rabbit (663482) | more than 3 years ago | (#37001602)

...and we LIKED it!

300 Baud? Our microbee has 1200 Baud! (0)

Anonymous Coward | more than 3 years ago | (#37001914)

My first computer had optional 1200 Baud for tape save, and we hated it!

(It was really unreliable. At least 300 Baud worked most of the time.)

It doesn't matter. (4, Insightful)

man_of_mr_e (217855) | more than 3 years ago | (#37001604)

"We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil" - Donald Knuth

Most developers will never need their apps to run in constrained environments, and most employers don't want to spend money to eke out performance when average performance is perfectly fine.

Too many programmers get caught up in trying to make something the fastest, the most memory-efficient, or the best use of bandwidth, when most of the time it just doesn't matter. Such things are expensive, and in the long run it's cheaper to be fast and sloppy than slow and lean.

Re:It doesn't matter. (4, Informative)

jackb_guppy (204733) | more than 3 years ago | (#37001790)

I love D. Knuth and have read his sorting and searching book many times over, always finding something good.

SPEED does matter, and so do SIZE and BANDWIDTH. It is important to design things right the first time, versus the loops and loops of daily optimization that most code goes through today. An understanding of record locks, index optimization and other multiplexing methods is still needed today. I see too much of sucking 1+ million pieces into memory to find 1 item, "Because it is fast".

Yes, this sounds like "get off my grass", but "fast and sloppy" is a waste of everyone's resources, not just a single computer's.

Re:It doesn't matter. (1)

man_of_mr_e (217855) | more than 3 years ago | (#37001844)

Those are important... *IF* they are important. Duh. Most of the time, they are not. Thus, complaining that developers who don't need to write lean apps aren't writing lean apps is kind of pointless.

Re:It doesn't matter. (2, Insightful)

Anonymous Coward | more than 3 years ago | (#37002072)

The thing is, people use that quote to be *LAZY*. Yeah, most of the time it doesn't matter. But guess what: when it does, you may find yourself rebuilding an entire framework because you made plain stupid mistakes, and LOTS of them.

Between someone who understands what optimizations are available and your code jockey who just whips something up, the difference is miles apart.

I used to think the same way, "It doesn't matter much", but then I realized it does matter. It matters a lot. Think if all of your programs started up 10 seconds faster. Now multiply that by the millions of times programs are run every day.

Individually, on single runs, it doesn't matter much. But that time debt builds up.

Think about this. Take the standard C runtime function printf, one of the most used functions out there. What if it ran 2x as fast? What sort of impact on the world would that make?

Re:It doesn't matter. (1)

DragonWriter (970822) | more than 3 years ago | (#37001970)

SPEED does matter, and so do SIZE and BANDWIDTH.

Sometimes, one or more of those matters, and sometimes it matters enough that an otherwise correct naive implementation will not suffice, but most of the time, focussing on correctness first and optimizing only where problems become apparent is better than building for speed, size, or other performance aspects first.

And if the problem analysis has been done well, you'll know the times when speed, size, etc. have critical inflexible constraints ahead of time, and know that you have to focus on one or more of those, and which one, and what the boundaries are.

It is important to design things right the first time versus loops and loops of daily optimization that must code is written in today.

Experience has shown that, as nice as that might sound, it's really not true most of the time, and in fact most systems face flux in requirements over time such that it's better to build small bits that produce correct results, and adapt -- either to optimize, to add new features, or to adjust behavior to meet changing requirements -- rapidly.

Re:It doesn't matter. (1, Flamebait)

PenisLands (930247) | more than 3 years ago | (#37002158)

Yeah, DragonWriter. More like DragonCOCKER. Looks like you completely cocked up your post. BIG PENIS to the maximum.

PENIS FOREVER!!! !!! !!!

Re:It doesn't matter. (4, Insightful)

Mitchell314 (1576581) | more than 3 years ago | (#37001994)

You want to spend the most effort conserving the most expensive resource, and that is not CPU, RAM, or disk time. It's human time. Hell, even working for a low wage, a person is expensive. Thus the most effort should be put into having them expend the least effort. Unless you have a case where the hardware time is getting expensive, but that's the exception, as hardware costs go down while salaries don't.

And no, that's not an excuse to be sloppy. "Back in the ancient days" it was important to write good code for the limited resources. Now you still need to write good code, but the constraints are relaxed. But we still need code that is maintainable, dependable, extendable, flexible, understandable, etc.

Re:It doesn't matter. (2, Insightful)

Anonymous Coward | more than 3 years ago | (#37002138)

And all those users sitting waiting 5 minutes for the page to load, for the data to completely draw, or whatever?

You do read thedailywtf.com, don't you? Plenty of stories where a 'quick app' becomes mission critical, and can't handle slightly (?) larger datasets.

Well, those aren't _our_ users.. their time is _free_. And they can _leave_ if they don't like it. ....

Re:It doesn't matter. (1)

Mitchell314 (1576581) | more than 3 years ago | (#37002364)

I do read it, and my last statement in each paragraph was there to address that issue.

Re:It doesn't matter. (2)

maxwell demon (590494) | more than 3 years ago | (#37002252)

You seem to have this mental model that more efficient code must take longer to develop. But not making bad decisions may take up exactly zero time if you are in the habit of making good decisions.

A simple example is the ordering of loops. Exchanging the order of loops after the fact may take extra time, but writing the loops in the right order from the start doesn't take more time than writing them in the wrong order.
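To make the loop-ordering point concrete, here is a minimal C sketch (my own illustration, not from the comment above): both functions compute the same sum, but the first walks the array in the order it is laid out in memory, while the second strides across rows and takes far more cache misses on a large array.

#include <stdio.h>

#define N 1024
static double a[N][N];

/* C arrays are row-major, so the inner loop should vary the column
 * index: that walks memory sequentially and stays in cache. */
double sum_row_major(void)
{
    double sum = 0.0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            sum += a[i][j];
    return sum;
}

/* Same arithmetic with the loops swapped: every access jumps ahead by
 * N * sizeof(double) bytes, which thrashes the cache at this size. */
double sum_column_major(void)
{
    double sum = 0.0;
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            sum += a[i][j];
    return sum;
}

int main(void)
{
    printf("%f %f\n", sum_row_major(), sum_column_major());
    return 0;
}

Writing the first version instead of the second costs no extra development time at all, which is exactly the point.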

Re:It doesn't matter. (1)

Mitchell314 (1576581) | more than 3 years ago | (#37002338)

You seem to have this mental model that more efficient code must take longer to develop.

I did not say, claim, or imply that. I was talking about factoring in developer time as a resource.

Re:It doesn't matter. (3, Interesting)

billcopc (196330) | more than 3 years ago | (#37002054)

Is it truly cheaper to be sloppy? Hardware keeps getting cheaper and faster, sure, but not at the pace at which code is getting slower.

Just look at your average web server. Today's machines are over a hundred times faster than they were 10 years ago, and we're not doing anything significantly different. Serving up text and graphics, processing forms, same old b.s. So then, why aren't we serving 100 times more pages per second? Apache keeps getting fatter, PHP seems to do a "sleep(100)" after each line, and don't even get me started on Ruby.

There was a time, not so long ago, when I would spend an hour tweaking an oft-used assembler loop, and the end result was a 3x speedup or more. I'm not saying we should rewrite everything in assembler, but I think we've become so far removed from the actual machine, relying on the compiler to "do the right thing", that people don't even have the slightest clue how to distinguish fast code from slow. How often do we use benchmarks to test different solutions to the same problem? Almost never! People bust out the profiler only when things go wrong, and even then they might say "just add CPU/RAM/SSD" and call it a day.

Or, if we must motivate the hippies, call it "green optimisation". Yes, faster code finishes quicker, using less power to complete the same job. If we're dealing with web servers, faster code would require fewer cluster nodes, or maybe free up some CPU time for another VM on the host, and those 60A circuits aren't cheap either. If spending an extra day optimizing my code could save me $2000/mo off my colo bill, I'd be a fool not to invest that time.

Re:It doesn't matter. (0)

Anonymous Coward | more than 3 years ago | (#37002278)

Amen. I'm just watching the fruits of my labour optimising a large platform for the company I work for. Short answer: I've just paid for the whole department's wages for many years to come, NetApp will probably be disappointed to learn we no longer need to splash out on several very large purchases, and we have several datacentre reorganisations we no longer need to perform.

It's sadly typical that most of the less experienced coders reading this don't recognise that even very small changes in performance translate to massive improvements when things are run at a large scale.

Re:It doesn't matter. (1)

Anonymous Coward | more than 3 years ago | (#37002058)

Most developers will never need their apps to run in constrained environments, and most employers don't want to spend money to eke out performance when average performance is perfectly fine.
-----------
No.

Performance problems can seriously affect software development during application testing.

Performance problems can affect batch processing operations that must complete within a time window.

Performance problems can affect the usability of desktop applications.

If these problems are not addressed early they can become unfixable. We know some practices are broken, and avoiding the wrong thing very often has zero cost.

Re:It doesn't matter. (3, Interesting)

antifoidulus (807088) | more than 3 years ago | (#37002214)

It all comes down to scale, ultimately. It's rare in the computer science field to see code that runs only x% slower than a more optimized version at both very small and very large scales. Coders who don't know how the hardware and lower-level software interfaces work tend not to write very scalable code, because they have no idea how the computers actually work, and even less of an idea of how a lot of them work together.

As an example, consider a database with very poorly designed primary and secondary keys. This choice will either:

a) not matter in the least, because the tables are so small and queries so infrequent that the other latencies (e.g. network, hard disk, etc.) will dwarf the poor key choice, or

b) quickly make the entire database unusable, as the database software has to search through every record for things matching the given query, which takes forever and just kills the disk.

I've seen plenty of (b), largely caused by total ignorance of how databases, and the hard disks they live on, work. The indices are there not only for data modeling purposes, but also to minimize the amount of uber-expensive disk I/O necessary to perform most queries. And while you may be in situation (a) today, if your product is worth a damn it may very well get bigger and bigger until you end up in (b), and by the time you are in (b) it may end up being very, very expensive to re-write the application to fix the performance issues (if they can be fixed without a total re-write).

Anyone who codes should have at least a fundamental grasp of computer architecture, and realize what computer hardware is good at, and bad at. That isn't "premature optimization" as you seem to think it is; it is a fundamental part of software design. "Premature optimization" in this context means things like changing your code to avoid an extra integer comparison or two, that kind of thing. It is not "let's just pretend the computer is a magic device and ignore performance concerns because it's easier".

Re:It doesn't matter. (0)

Anonymous Coward | more than 3 years ago | (#37002372)

It is not actually all that hard to optimize for speed. Once the application is feature complete, you profile the application under normal use cases and see where the bottlenecks are. You can also profile some unusual use cases as well and see if the program has different sets of bottlenecks under unusual loads.

Once you identify the bottlenecks you can evaluate what you need to do to fix the issues. Often it is trivial to fix memory usage or to reuse already allocated memory for new usage to get a doubling of performance for almost free.

You can run into bottlenecks that are not easily fixable. The way you began your design can cause problems down the road. It might take a lot of refactoring to fix issues of this magnitude. If the problem can be put off until later, you can do a bit of refactoring with each new project you do over a year's time.

Learn the basics (1)

countertrolling (1585477) | more than 3 years ago | (#37001606)

It's just like pilots should learn how to fly [cumulus-soaring.com] before being put at the helm of an airliner.

What Today's Humans Don't Know and Why It Matters (1)

Anonymous Coward | more than 3 years ago | (#37001650)

Today's humans have much more advanced technologies and a more forgiving world to play in — but it seems many have forgotten some of the lessons their predecessors picked up in a more resource-constrained era. Newer humans are less adept at identifying water sources in a forest, finding directions without a GPS, and low-level skills like starting a fire with two stones. You never know when a seemingly obsolete skill will come in handy. For instance, stone age humans who cut their teeth in, well, the stone age have a leg up in hunting wild beasts with just bare hands and stones.

Hi, my name is Bob! (1)

tulcod (1056476) | more than 3 years ago | (#37001652)

I am an electrical engineer. Here's some news for ya: us electrical engineers do learn these skills, so don't bother complaining about the state of the world because new students are taught these vital insights in computing every day.

Maybe having a conservative mindset helps.. (2)

JeremyMorgan (1428075) | more than 3 years ago | (#37001664)

but other than that I fail to see the outrage. I also don't see a lot of value in learning things you won't likely need to use. What's the cost/benefit of learning and mastering assembly if you aren't going to need it? Building software as if you have low resources is fine, so long as you aren't compromising quality to make sure it will run on archaic hardware. Making things as lean and fast as you can is always a plus... if you have the time. Which is another thing today's programmers deal with more: insane deadlines. Expectations are growing and deadlines are getting shorter, and today's programmers (unfortunately) cut a lot of corners and don't get the chance to truly optimize something, just so they can get it out the door.

Re:Maybe having a conservative mindset helps.. (1)

loufoque (1400831) | more than 3 years ago | (#37001882)

What's the cost/benefit of learning and mastering assembly if you aren't going to need it?

Fewer people master it; therefore it is more rewarding, both intellectually and financially.

Would you rather be a replaceable coding monkey, doing what anyone else could do, or be an expert software architect that your company relies on?

Re:Maybe having a conservative mindset helps.. (0)

dtmos (447842) | more than 3 years ago | (#37002102)

another thing today's programmers deal with more: insane deadlines. Expectations are growing and deadlines are getting shorter...

Don't kid yourself -- some things never change, although the reasons for them might. There's nothing new about insane deadlines.

When microprocessors were just becoming popular, management of embedded software projects was an unknown: Nobody could determine, based on the product requirements, how many people would be required to complete the task, or how long it would take. Hardware projects people knew how to manage, based on experience, but software projects were a mystery -- especially when they were combined with hardware development.

The first software project with which I was associated, in the 1980s, ended up taking six times as long as initially estimated, with twice the staff. The last six months of the program, the coders' hours were: Arrive at work 8 AM Monday, leave at 5 PM Tuesday. Arrive at work 8 AM Wednesday, leave at 5 PM Thursday. Arrive at work 8 AM Friday, leave at 5 PM Saturday. And a lot of them came in on Sunday, too. Managers made food runs for meals, so that the coders did not have to leave their keyboards. This was not unusual at the time. Divorces were common.

Just because people have gray hair doesn't mean they haven't been put through the wringer. Quite the opposite, in fact.

Re:Maybe having a conservative mindset helps.. (0)

Anonymous Coward | more than 3 years ago | (#37002144)

What's the cost/benefit of learning and mastering assembly if you aren't going to need it?

I would say that if you don't already know at least one assembly language, you will benefit more from learning one than you would from learning yet another programming paradigm and/or high-level language.
You don't have to understand everything that is going on under the hood when you use a high-level language, but if you have a firm grasp of at least one architecture you will have a much better understanding of the different parts of high-level languages, and you will also be able to track down compiler bugs much more easily.

If you really want to master high level languages you should try to write one of your own.

awful article (1)

roman_mir (125474) | more than 3 years ago | (#37001690)

Well, that was a waste of time: a terrible, incoherent, lazily written article with very little anybody could actually use if they wanted to learn something, or cared to find out what it was they 'lost'. There are so many things that people have to deal with in projects today that adding limitations artificially imposed by an ideology which insists on doing things the old way... it's just nonsense.

Why not then say everything should be done with a hard limit of 5MB hard drives, or better yet, with the throughput of punch cards?

If the author wanted to say something useful, he failed.

Sounds of failure, the zen way (1)

rippeltippel (1452937) | more than 3 years ago | (#37001694)

What is the sound of a pointer dereferencing to NULL?

Re:Sounds of failure, the zen way (1)

maxwell demon (590494) | more than 3 years ago | (#37001768)

What is the sound of a pointer dereferencing to NULL?

I'm sure you mean dereferencing a null pointer. A pointer dereferencing to NULL would be a pointer pointing to a null pointer.

Re:Sounds of failure, the zen way (1)

Obfuscant (592200) | more than 3 years ago | (#37002198)

A pointer dereferencing to NULL would be a pointer pointing to a null pointer.

That would be "a pointer dereferencing to a NULL pointer". A "pointer dereferencing to NULL" would be a pointer to a NULL value.

I learned 68000 and VAX assembly code when I learned C. The best way to learn what a C statement is actually doing (with all those & and * and [] and stuff) is to have the compiler dump the assembly and look at it. It's also a good way to see what optimizations the compiler is trying to do.

I have a feeling the main uses these days for people who know how the hardware behaves are 1) high-performance computing specialists who need to worry about cache hits and row- vs. column-wise data access, and 2) people porting an app from a large-capacity system to a low-capacity one.

Yes, human time is important, but a software system that needs to keep up with real time and can't because the programming is sloppy is not worth much, and the savings from being able to use a smaller, cheaper processor in 1,000,000 units of a consumer device can make up for the extra effort.
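For anyone who wants to try the compiler-dump approach, here is a tiny sketch of my own (not from the parent post); the function names are made up, and it assumes gcc, but any compiler with an assembly-output switch works the same way.

/* swap.c -- small functions whose pointer arithmetic is easy to follow
 * in the generated assembly. */
void swap(int *a, int *b)
{
    int tmp = *a;   /* dereference: load from the address held in a */
    *a = *b;
    *b = tmp;
}

int third(const int v[])
{
    return v[2];    /* v[2] is just *(v + 2): base address plus offset */
}

Compiling with "gcc -S -O0 swap.c" leaves the assembly in swap.s; doing it again with -O2 and comparing the two files shows both what the & / * / [] operators turn into and what the optimizer does with them.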

Good link (1)

ustolemyname (1301665) | more than 3 years ago | (#37001742)

Link in the summary is to the print page. Thanks jfruhlinger!

Stone Carving Skills (1)

paulsnx2 (453081) | more than 3 years ago | (#37001750)

I remember when bits were made of actual stone, hand crafted, and strung by hand into computational frames. Today's kids don't even understand that this is what we are referring to when we talk about "Frame Works" today. Those Frames were tons of work, let me tell you.

Computational systems used these frames so that sometimes you had to go through several of them to get to your goal (these were the early AND gates), and other times you allowed users to pick which one to pass through (OR gates). When you forced people into exclusive choices (such as gates to bathrooms) these frames were referred to as exclusive gates (use this one if male OR use this one if female, i.e. XOR gates).

We used simple trip wires in our frameworks to implement inverters. Very effective at reducing time wasted with managers. (Generally once a manager has been inverted a few times, they leave you mostly alone.)

Knot gates were generally frameworks made from pine.

I could go on about heat sinks (hot rocks used to warm water) and other early computational technologies. But all of this is lost on this young crowd, who don't even remember when Java used to be drunk black for a nickel.

Re:Stone Carving Skills (2)

garyebickford (222422) | more than 3 years ago | (#37001966)

Oblig.: A Bunch of Rocks [xkcd.com]. :D

Stuff (0)

Anonymous Coward | more than 3 years ago | (#37001754)

I learned to program on TI calculators (in C, so a bit more advanced than what most high school kids did with BASIC). The people I know who did similar things are good with making things quick, fast, and memory efficient.

It doesn't particularly make sense to me, but I think this is why I build websites the way I do--minimal number of images, and if CSS and JS aren't more than a few hundred bytes, I include them in every page instead of linking to them. Cutting down the number of requests does far more to optimize web pages than making smaller files (to an extent, of course). I'll never understand why people insist on having dozens of images, and a bunch of very small .js and .css files.

Re:Stuff (1)

Lennie (16154) | more than 3 years ago | (#37002248)

Less is definitely more, at least if you ask me.

Lessons from the Old and Wise (4, Funny)

AirStyle (2363888) | more than 3 years ago | (#37001766)

I do agree with you. I'm new to programming myself, but I've always felt the need to learn more about the computer than just the high-level language. That's why I want to take up PERL. Apparently, there's still a strong Ruby community out there, so I might take that up as well. On top of that, I like to plan out my programs. I like to know exactly what it will do before I do it, which may require writing out the code first. I'm only two years into programming, so I still have a long way to go. I just want to make sure that what I do is very efficient so that all my future supervisor has to do is sit back and trust me.

Re:Lessons from the Old and Wise (2)

garyebickford (222422) | more than 3 years ago | (#37002038)

Or go the other way - toward Church instead of Turing [wikipedia.org] - check out Erlang [erlang.org] or Haskell [haskell.org] or Ocaml [inria.fr] . In fact I recommend learning at least one of the above (I personally like Erlang, but that's just me) just to get a different perspective on computation than any of the 'classic' imperative, memory-location-oriented languages.

Excuse me while I laugh my head off... (3, Insightful)

sconeu (64226) | more than 3 years ago | (#37002052)

I'm new to programming myself, but I've always felt the need to learn more about the computer than just the high-level language. That's why I want to take up PERL.

(emphasis mine)

This is possibly the most hysterically unintentionally funny thing I've read in a long time.

IDE debugging really isn't that bad (2)

jader3rd (2222716) | more than 3 years ago | (#37001770)

One of those interviewed in the article complained about the fact that modern day programmers try to solve the problem through an IDE or debugger, instead of putting in statements which change the output of the program. They wanted printf debugging. While I do value a good tracing subsystem, I for one, am grateful for modern debuggers which let me view the state of the system without having to modify/redeploy the code.

Re:IDE debugging really isn't that bad (1)

coolmadsi (823103) | more than 3 years ago | (#37001922)

One of those interviewed in the article complained about the fact that modern day programmers try to solve the problem through an IDE or debugger, instead of putting in statements which change the output of the program. They wanted printf debugging. While I do value a good tracing subsystem, I for one, am grateful for modern debuggers which let me view the state of the system without having to modify/redeploy the code.

Oooh, I remember printing out debug statements. When I was at Uni.

Tried doing it once while working on a massive program when I got a job after Uni; it was near useless due to the scale of the system. Figured out how to use a decent debugger properly (we might have been taught how to use a basic one at Uni) and haven't looked back.

Recently found out with the debugger I am using that I can change variable values mid execution - can't do that with print statements. You're right - modern debuggers are great.

Re:IDE debugging really isn't that bad (1)

DeadCatX2 (950953) | more than 3 years ago | (#37002004)

Not only can you change variables during execution, you can manually move the execution pointer around, you can recover from unhandled exceptions, and you can edit the source code during a breakpoint and then continue without having to restart your application.

You can also still direct things to the Output window in the IDE if you fancy the printf style statements.

Re:IDE debugging really isn't that bad (1)

maxwell demon (590494) | more than 3 years ago | (#37002420)

No, they didn't complain about modern day programmers using an IDE or debugger. They complained about modern day programmers being lost without them.

In my day (4, Funny)

alostpacket (1972110) | more than 3 years ago | (#37001786)

we had to code uphill in 10 feet of snow on an abacus using roman numerals.

Re:In my day (2)

sconeu (64226) | more than 3 years ago | (#37001998)

You had Roman numerals? You lucky, jammy, bastard! I'd have killed for Roman numerals. We had to use tally marks. And if we didn't put the slash to mark the fifth, the compiler got confused and core dumped on us!

Re:In my day (1)

maxwell demon (590494) | more than 3 years ago | (#37002124)

we had to code uphill in 10 feet of snow on an abacus using roman numerals.

You had an abacus and roman numerals? Luxury! We had to use heaps of stones for our programming. And we liked it!

Re:In my day (1)

tool462 (677306) | more than 3 years ago | (#37002246)

Roman numerals? Then how did you terminate your strings?!

Meh (0)

Anonymous Coward | more than 3 years ago | (#37001808)

30 years of experience can be 1 year times 30.

I've found the important thing is an actual passion for computer science, which gives you a real desire to know what's going on inside at the chip level.

The dot.com bubble brought in most of these "older" guys, who showed up for a paycheck, don't have that passion, and suck.

Age has little to do with it, in my opinion.

old wives' tale (1)

fertilizerspike (2430602) | more than 3 years ago | (#37001812)

This pops up every few months really, somebody goes on and on about how much harder it was to program "back then", and that programmers who have been programming longer are better programmers. Duh! It's called experience, it has nothing to do with how "easy the kids have it these days". How does this stuff get to the front page when my story about the Columbia being destroyed by the Starfire Optical Range was completely swept under the rug. Ditto to my story about Giffords' partial brain transplant into the nine-year-old clone of her, the young Christina Green. Slashdot sucks.

Re:old wives' tale (1)

HarrySquatter (1698416) | more than 3 years ago | (#37002048)

Yep and posts like this one [slashdot.org] are fucking hilarious with such obviously made up shit like:

I'm old enough to remember when it wasn't like that. You'd run your program and it was ready in a second, you'd exit and it left no trace. Crashes were virtually unheard of. We have people where I work who only do data entry, and they still use wordperfect 4.2 on 386 hardware. I've seen their workflow and how fast it works for them and I can see if they "modernized" it would cripple their productivity.

Hahah lolwut? Crashes were virtually unheard of? Back in DOS, crashes were basically a way of life with all the buggy software that each had its own hand-rolled version of pseudothreading, video output, etc. Secondly, WordPerfect 4.2? That was seriously buggy shit. Yep, this is just a bunch of old-timers remembering a romanticized version of history that never really existed.

Re:old wives' tale (0)

Anonymous Coward | more than 3 years ago | (#37002226)

Crashes were NOT that common in systems before DOS. It happened, but only once every two or three months, when the system was being updated. Not during normal operation.

Any crash caused the system to be analyzed and modified to eliminate the crash.

With DOS, crashes occurred so often that it was assumed that that was the normal thing.

Re:old wives' tale (1)

HarrySquatter (1698416) | more than 3 years ago | (#37002270)

Yes, that was kind of the point. Since he was holding up WordPerfect as an example, which was *gasp* wait for it *gasp* a DOS program, he was clearly attempting to claim that the DOS era was something free of crashes which is completely absurd. Secondly, the claim that software older than DOS wasn't buggy and crash prone is also bullshit. Some of the most classic bugs that get pointed out when you read up on the history of C come from this supposed "golden age" where all programmers were apparently wizards of their craft. Except that this "golden age" is a nostalgic farce and there were plenty of shitty programmers in that day producing buggy and bloated (for the time) code.

'Resource constrained era' (1)

DavidR1991 (1047748) | more than 3 years ago | (#37001852)

Comments/phrases like these completely fail to grasp that things like this are RELATIVE. What is 'resource constrained' today isn't what was seen as 'resource constrained' 20 years ago. Likewise, many young programmers _today_ (including myself) DID in fact learn to code in what would be seen as resource constrained environments compared to today's machines. I cut my teeth on an 8MB Win95 machine and later a 32MB machine. Sure, that amount of RAM to play with is an insane luxury if we're thinking back to early/earlier 90s or 80s. But compared to what we have now it IS resource constrained. And heck, even mobile devices have more than 32MB to play with these days.

20 years from now 2 or 4GB of RAM will look like 'resource constrained' environments, and all of us who are 20-year olds now will be thinking "Herp derp, this new generation won't understand the resource constrained 32MB days!". And then 20 years from that, their experience will help them operate in 'resource constrained environments' when several TB of RAM is the norm or something

TL;DR: This is all a stupid circular pattern where each generation seems inferior to the last but still has some useful knowledge relative to specs of modern machines

One question... (1, Funny)

MachDelta (704883) | more than 3 years ago | (#37001854)

How the fuck do you forget something you were never taught in the first place?

This article should really read: "Crotchety old programmers fail to pass on tricks of the trade, then complain anyways"

Re:One question... (1)

Anonymous Coward | more than 3 years ago | (#37002172)

To listen, you must first be quiet.

Duplicate story (1)

amn108 (1231606) | more than 3 years ago | (#37001900)

Eh, didn't someone remind us of this a couple of months ago? Seems like someone really has an axe to grind with modern coders. Get a life, you suspicious person!

Go bang some metal (1)

Jimbookis (517778) | more than 3 years ago | (#37001950)

Coder types: buy a CRO (like the cheap but terrific Rigol 1052E), buy an Arduino board, and use AVR Studio 4 to start hacking. Learn how to use digital I/O pins to measure the time it takes functions to run, and try to do some realtime stuff. I'll be making my daughter an electronic drum kit next week using an Arduino, some piezos and her eeePC to receive MIDI over USB/serial to generate the sounds. A simple project, to be sure, but it'll get me up to speed with the ATmega1280.

I inherited an 8051 (spit - fucked if I know why anyone likes this CPU) project at my old work. I asked why they didn't turn the C optimiser on; the answer was that the optimiser was broken because the code wouldn't run with it, which sounded like BS to me. The owner, with 30 years of embedded experience, was satisfied with this explanation. I got it working by putting the volatile keyword on some globals that were shared between main() loop functions and interrupts, and bugger me if the application didn't start working with the optimiser on! My respect for the owner and his company began its downhill slide that day. So even grumpy old codgers with years in the field, like my old boss, can be total n00bs too.
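For younger readers who haven't hit this one yet, here is a minimal sketch of the kind of fix being described (my own illustration; the ISR hookup is compiler-specific and the register access is left as a placeholder):

#include <stdint.h>

/* Flags shared between an interrupt handler and the main loop.
 * Without "volatile" the optimiser may assume main() is the only
 * writer, hoist the load out of the polling loop, and spin forever --
 * which looks exactly like "the optimiser breaks the code". */
static volatile uint8_t rx_ready = 0;
static volatile uint8_t rx_byte  = 0;

/* Stand-in for the real ISR; on an 8051 toolchain this would carry a
 * compiler-specific interrupt attribute, on AVR an ISR() macro. */
void uart_rx_isr(void)
{
    rx_byte  = 0 /* read the UART data register here */;
    rx_ready = 1;
}

void main_loop(void)
{
    for (;;) {
        if (rx_ready) {      /* re-read on every pass, thanks to volatile */
            rx_ready = 0;
            /* handle rx_byte ... */
        }
    }
}

With the volatile qualifiers removed, code like this tends to work with optimisation off and fail as soon as it is turned on, which matches the story above.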

An observation (1)

zerox030366 (2430128) | more than 3 years ago | (#37001990)

Most of my programming experience is in either C or assembly language, and a good portion of it is on embedded devices. Recently at my job I have been learning Perl, which is obviously an entirely different animal. Coming from this background, I find it particularly interesting to look at the different data types: in assembly there really are no data types, only the hardware; in C, the data types are basically just abstractions of the hardware (pointers, arrays, etc.). But Perl has all kinds of things like typeglobs, hashes and references (which still strike me as the English major's version of a pointer), which are far more abstracted. The hardware is doing the same things as always, but the model of the computer portrayed by the language is much different.

Re:An observation (1)

Jimbookis (517778) | more than 3 years ago | (#37002240)

I have the same programming pedigree as you. For my work, at least, I have never had to use anything higher level than C, due to embedded projects. I worked with a young, bright, self-taught programmer for whom high-level languages were his only experience and who viewed the computer as a mysterious black box. He appreciated that there was low-level stuff going on in this thing, but he couldn't really get at it with .NET. A good example of things going pear-shaped was the insane amount of time the .NET application took on an eeePC when writing large amounts of data out to serialised XML.

Frustration (0)

Anonymous Coward | more than 3 years ago | (#37002014)

This lack of skills is particularly frustrating when troubleshooting a product developed and supported by a 3rd party.

Examples that spring to mind recently for me:

Vendor DBMS software is randomly corrupting database files. The vendor looks on the server and finds that it has used a couple hundred megabytes of swap. Holy crap, the machine is swapping, you didn't put the specified 4GB of RAM in this machine, you need to do that before we'll even touch the problem! Never mind that there was virtually no paging activity on the server at all when the problem occurred. And even if there was... please explain how that corrupts your database?

Vendor ETL software is running slowly. Finger pointed at the horsepower of the target SQL server. An hour later we discover that the vendor has failed to identify or use the relevant table indexes, and is in fact using a terribly slow computed WHERE clause.

Vendor tells us our database is full. The only solution is to add more disk space. When we look inside we find that the two largest tables hold exactly the same data, the third largest has no index and is regularly deleted from/inserted to, and most of the columns have storage types far in excess of requirements (do you really need an 8 byte floating point number just to store an integer value between 0 and 43200, or 40 characters to store which month we are in)?

Hardware issues less common (2)

coolmadsi (823103) | more than 3 years ago | (#37002022)

I get the feeling that hardware errors were a lot more common back in the author's day; they don't come up very often now.

One of the first things I learnt when troubleshooting problems is that it is probably a problem with your code, and not the hardware or external software libraries (apart from rare cases).

Embedded software development (2)

DeadCatX2 (950953) | more than 3 years ago | (#37002082)

We still use straight C, sometimes even intermixed with assembly. We know all about resource constraints, given that our microcontrollers sometimes only support a few kB for instructions and data RAM. As far as debugging goes, I'll see your debug-with-printf and I'll raise you a debug-with-oscilloscope-and-GPIO-toggling.
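For anyone curious what debug-with-oscilloscope-and-GPIO-toggling looks like in practice, here is a rough sketch (the register addresses and pin number are placeholders, not any particular chip):

#include <stdint.h>

/* Hypothetical memory-mapped GPIO set/clear registers -- the real
 * addresses and bit layout depend entirely on the microcontroller. */
#define GPIO_SET   (*(volatile uint32_t *)0x40020018u)
#define GPIO_CLEAR (*(volatile uint32_t *)0x4002001Cu)
#define DEBUG_PIN  (1u << 5)

static void filter_block(void)
{
    /* ... the routine being timed ... */
}

void process_sample(void)
{
    GPIO_SET = DEBUG_PIN;     /* pin goes high: trigger the scope here */
    filter_block();
    GPIO_CLEAR = DEBUG_PIN;   /* pin goes low: pulse width = execution time */
}

The pulse width on the scope gives the routine's execution time with no printf, no UART, and essentially no overhead, which is why the trick survives on parts with only a few kB of flash.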

Slashdot's Pointless Story of the Day? (-1)

X3J11 (791922) | more than 3 years ago | (#37002194)

Newer programmers are less adept at identifying hardware constraints and errors, developing thorough specifications before coding,

Possibly because there are fewer hardware constraints (and errors) than there were a scant few years ago? Logic? Nah!

and low-level skills like programming in assembly language.

People don't need assembly language any more. Fifteen years ago we were all bypassing the BIOS to write to the display, opening COM ports to communicate with modems, using INT 33h to handle the mouse, and basically doing everything ourselves. Windows, Linux, Apple: everyone has obviated the need for such low-level tricks. While I agree that it is certainly useful to understand what instructions a compiler might translate a particular LoC into, and to have a basic grasp of the underlying architecture and its instruction set, these days computers are "fast enough" that such knowledge is hardly used any more.

There's a reason high-level languages were invented, and their performance is "good enough" on modern hardware. Even modern operating systems are largely written in HLLs, with only a bit of glue assembler to get the ball rolling or handle the most time-sensitive/intricate tasks.
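As a period piece, here is roughly what "doing everything ourselves" looked like; this is a sketch from memory in old 16-bit Turbo/Borland C, and it obviously won't build with a modern compiler:

#include <dos.h>
#include <stdio.h>

int main(void)
{
    union REGS r;

    r.x.ax = 0x0000;            /* INT 33h function 0: reset and detect mouse */
    int86(0x33, &r, &r);
    if (r.x.ax == 0) {
        printf("no mouse driver installed\n");
        return 1;
    }

    r.x.ax = 0x0003;            /* function 3: read position and button state */
    int86(0x33, &r, &r);
    printf("x=%d y=%d buttons=%d\n", r.x.cx, r.x.dx, r.x.bx);
    return 0;
}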

For instance, Web developers who cut their teeth in the days of 14.4 Kbps modems have a leg up in writing apps for laggy wireless networks.

I'm not sure what timeline this fella originated from, but back in the good ol' days of 14.4 modems, BBSes were more prevalent than consumer Internet connections; and even then, most "Web developers" of that era either used some shitty program that produced terrible HTML (and subsequently ran it through htmltidy if they had any clue at all), or "developed" their web pages entirely by hand. I fail to see how some guy who barfed up an HTML page and uploaded it via a modem has any advantage over one who never did it on anything but a high-speed connection.

While the article itself is an interesting read, particularly with those who are fascinated by the Days of Computers Past, the summary is stupid and almost convinced me to not RTFA.

Re:Slashdot's Pointless Story of the Day? (0)

Anonymous Coward | more than 3 years ago | (#37002370)

OH really? No one needs it? I use it all the time. Not everyone writes Java or web apps. It has nothing to do with "low-level" tricks. I write real-time, embedded systems. Compiler or library code bugs, flaky or temperamental hardware, drivers or bootstrap code that absolutely have to use assembly code, time-sensitive buses, you name it. This isn't "Days of Computers Past." Not all of us suck our thumbs and pound on a PC keyboard. Get a clue.

Security... (0)

Anonymous Coward | more than 3 years ago | (#37002220)

Not everyone needs to learn assembly and low-level programming, but young people who deal with the latest and greatest need to get off their high horses too. Bad code is bad code. If you don't learn the fundamentals, you tend to write very bad code. If programmers are getting better, why are there so many more software flaws leading to security vulnerabilities? There is something to be said for learning what makes you the most productive, but ignoring the fundamentals leads to problems that you can't even begin to solve.

go for the faster hardware (1)

Osgeld (1900440) | more than 3 years ago | (#37002300)

it will last longer and be less hassle

much like today's posters not knowing (0)

Anonymous Coward | more than 3 years ago | (#37002398)

what was posted in the past, many times over.
