
Programmers and the "Big Picture"?

Cliff posted more than 11 years ago | from the distinguishing-the-forest-from-the-trees dept.

Programming 405

FirmWarez asks: "I'm an embedded systems engineer. I've designed and programmed industrial, medical, consumer, and aerospace gear. I was engineering manager at a contract design house for a while. The recent thread regarding the probable encryption box of the Columbia brought to mind a long-standing question. Do Slashdot readers think that the theories used to teach (and learn) programming lead to programmers that tend to approach problems with a 'black box', or 'virtual machine' mentality without considering the entire system? That, in and of itself, would explain a lot of security issues, as well as things as simple as user interface nightmares. Comments?"

"Back working on my undergrad (computer engineering) I remember getting frustrated at the comp-sci profs that insisted machines were simply 'black boxes' and the underlying hardware need not be a concern of the programmer.

Of course in embedded systems that's not the case. When developing code for a medical device, you've got to understand how the hardware responds to a software crash, etc. A number of Slashdot readers dogmatically responded with "security through obscurity" quotes about the shuttle's missing secret box. While that may have some validity, it does not respect the needs of the entire system: in this case, the difficulty of maintaining keys and equipment across a huge network of military equipment, personnel, and installations."
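The "understand how the hardware responds to a software crash" idea above is often realized with a watchdog timer. A minimal sketch in C; the register name and magic value are purely illustrative, not taken from any real part's datasheet:

```c
#include <stdint.h>

/* Stand-in for a memory-mapped watchdog register.  Hypothetical name and
 * address semantics -- every MCU differs. */
volatile uint32_t WDOG_KICK;

/* Service the watchdog only while the system passes its self-test.  If the
 * check fails we deliberately stop kicking, so the hardware times out and
 * resets the device into a known-safe state: the software plans for its
 * own failure instead of pretending the hardware is a black box. */
int watchdog_service(int self_test_ok)
{
    if (!self_test_ok)
        return 0;               /* stop kicking; let the hardware reset us */
    WDOG_KICK = 0x5A5A5A5Au;    /* illustrative "kick" write */
    return 1;
}
```

The point is the inversion: the code's job is not to prevent the reset but to decide when a reset is the safest outcome.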


For big pictures (1, Funny)

Anonymous Coward | more than 11 years ago | (#5280825)

I like to use jpeg compression. PNG is great for line-drawings, etc though. If you really need the whole big picture, and size isn't an issue, then yes, PNG's 24-bit mode is pretty good.

Thanks! (-1)

PedoPeteTownshend (641098) | more than 11 years ago | (#5280852)

I personally prefer Jpegs as their low-bitrate and ubiquity makes them easy to transfer from computer to computer should the police want to take a look.

Linux? (-1, Troll)

Amsterdam Vallon (639622) | more than 11 years ago | (#5280828)

The "Big Picture".

It's Linux, plain && simpal.

You, sir, are an asshat (-1)

Anonymous Coward | more than 11 years ago | (#5280876)

That is all.

Re:You, sir, are an asshat (-1, Troll)

Anonymous Coward | more than 11 years ago | (#5280921)

I h4v3 w4r3z3d A55h4t L1nuX V8.1

I teh l33tX0r!

what a fantastic idea! (0, Troll)

SweetAndSourJesus (555410) | more than 11 years ago | (#5281011)

I think I'm going to roll up Asshat Linux 0.1 today. It will be the exact same thing as gentoo, but the fortune database will be nothing but slashdot posts and RMS quotes.

Re:what a fantastic idea! (0)

Anonymous Coward | more than 11 years ago | (#5281145)

>> the strongest word is still the word "free"

Pity you Yanks have completely lost the concept of what that really means.

Re:Linux? Yes...NINNLE! (0)

Anonymous Coward | more than 11 years ago | (#5281103)

More specifically, the Big Picture is Ninnle Linux!

Hacked by Chinese! (-1, Offtopic)

SqueakRu (212186) | more than 11 years ago | (#5280836)


fairly possible (0, Troll)

Anonymous Coward +1 (645038) | more than 11 years ago | (#5280839)

it's been time for a paradigm shift for years now.

Re:fairly possible (0)

Anonymous Coward | more than 11 years ago | (#5280883)

Nice vocab


Anonymous Coward | more than 11 years ago | (#5280857)

The most horrible space tragedy in recorded history occurred one week ago, and all you people can do is talk about whether programmers get the "big picture"? My *god*, folks, get some priorities here!


j3110 (193209) | more than 11 years ago | (#5281046)

The population of the US is: 265,283,783
According to the Disaster Center, 180.2 people per 100,000 die in car accidents.

Aren't the other 478,401 people who die in vehicular accidents in a year a bit more important than 7 astronauts in the past decade?

You should work on your priorities first because I don't think you are seeing the big picture!

(all numbers based on 1996 data)
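The arithmetic above can be checked mechanically. A small C helper; the population and rate figures are the post's own 1996 numbers:

```c
/* deaths = population * (rate per 100,000) / 100,000, rounded to nearest. */
long annual_deaths(double population, double rate_per_100k)
{
    return (long)(population * rate_per_100k / 100000.0 + 0.5);
}

/* annual_deaths(265283783.0, 180.2) comes to about 478,041, so the post's
 * 478,401 looks like a digit transposition -- same order of magnitude,
 * and the comparison with 7 astronauts is unaffected. */
```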

Non sequitor. (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5280872)

Nothing wrong with a `black box` mentality. It's what's behind all modern programming. The shuttle problem was linked to bits of heat shielding falling off.

Do keep up, old chap!

Re:Non sequitor. (1)

Bendebecker (633126) | more than 11 years ago | (#5280954)

Exactly. With a simple hardware system, you can afford to think of the underlying hardware while writing the software. However, try writing software with the hardware in mind when the hardware is the shuttle. The 'black box' mentality is important because you can easily get completely overwhelmed by trying to think of how every little piece of hardware fits into your software design. The hardware should work and not be a concern. Instead of worrying about what the hardware will do if your software crashes, spend that time making sure your programs are stable and won't crash.

In general... yes (4, Interesting)

Anonymous Coward | more than 11 years ago | (#5280879)

I don't have as much experience as some, but I've always wondered about coders who restrain themselves in the 'world' their code runs in. It overlaps, I think, with the problems of sysadmins who leave systems/gateways/firewalls and whatnot wide open to the world.

If a coder isn't ignoring the fact that their code isn't going to be running in the exact same shell as they are, they're ignoring that it won't always be running on the exact same OS, or the exact same network. Tragically, when it breaks it can then break BIG.

Note I also don't have enough experience to offer a solution other than "get a clue!". It's more work until you embed it in your habits to take notice of these possibilities.

Probably (4, Insightful)

nizcolas (597301) | more than 11 years ago | (#5280881)

Most programmers who are going to come across a "black box" have enough experience to be able to code for the situation. Isn't that skill a trait of a good programmer?

Then again, maybe I'm missing the point :)

Re:Probably (levels) (3, Interesting)

$$$$$exyGal (638164) | more than 11 years ago | (#5281029)

Programming Levels:
  1. Microsoft Frontpage
  2. Raw HTML
  3. CGI/PHP/etc.
  4. Servlets/Mod-perl/etc.
  5. Object-Oriented black boxes
  6. Documented API's
  7. Public Documented API's
  8. Performance
  9. "The Big Picture" - Architects

--sex []

Correction (-1, Troll)

H.G. Pennypacker (649549) | more than 11 years ago | (#5281086)

Dear Sir,

I am afraid to inform you that you are mistaken if not in your logic, but in your assumptions. Look at the crowd you are dealing with. Most Slashdotters have no experience with any "box", let alone black box, asian box, girl-next-door box, etc. They do not have the capacity to code for that situation! If one should ever be within any tangible measure of acquiring some box, they would probably have to be defibrillated first, and THEN acquire this 'experience' that you (crudely) refer to.

Remember the two key features of the demographic here :

  • basement dwellers
  • virgins

Hope that helps! HAND.

Re:Probably (5, Interesting)

ackthpt (218170) | more than 11 years ago | (#5281113)

Most programmers who are going to come across a "black box" have enough experience to be able code for the situation. Isn't that skill a trait of a good programmer?

I think it's more than a skill, it's an attitude. I've encountered a number of programmers (just out of school/training) who are oblivious to external concerns, including interface design (traditionally what users complain most about, and where programmers lack any standard to follow). Generally it takes little effort to break programs written by very skilled programmers who are blind to anything outside their scope. I was probably as bad when I first started. Recently, though, an analyst complained angrily that I went beyond the scope of the project by including an error/warning log (most likely because the errors/warnings accounted for untrapped logic and revealed how incomplete the spec was, and how little the analyst, and some of the higher-ups, knew of the business function). I felt there were too many things unaccounted for and added the log; when it produced 1,000+ entries, things got a little heated. I stuck to my guns, though, and still see a general lack of interest in reviewing why there are gaps in the spec or knowledge (by the very people who should know).

OT: IBM Drops Linux on Itanic (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5280886)

m_1.html

Linux? (-1, Troll)

Amsterdam Vallon (639622) | more than 11 years ago | (#5280894)

You know what guys, I do a lot of thinking about the "big picture" and "whole scheme of things" often at my day job as executive IT manager for a large warehouse distribution systems center in north Jersey.

What I've found is a move, a paradigm shift per se, toward a more distributed and less technological reliance upon a single entity, such as Microsoft.

The answer for the future is Linux. Why would we line the pockets of MS's corporate bottom dollar when we could simply utilize the foundation of Linux upon traditional X-86 hardware components like soundcards and high-speed motherboards?

I think that with all the amazing examples of Linux powering powerful business to business and Web portal shopping sites, it'd be a simple tale to examine Linux in the "real" corporate environment as opposed to the very theoretical and academic feel of the current manual page.

As long as you know the GPL (unlike your all set for succeeding in business without even really trying.

Advice -- avoid proprietary software like the plague.

Re:Linux? (-1, Troll)

govtcheez (524087) | more than 11 years ago | (#5280913)

I hate you and I hope you eventually look like the goatse man.

Re:Linux? (-1)

Anonymous Coward | more than 11 years ago | (#5280920)

$20 for a suck? KARMA WHORE.

Re:Linux? (-1)

Anonymous Coward | more than 11 years ago | (#5280925)

Goddam! You are one fast motherfucker.

Re:Linux? (0)

Anonymous Coward | more than 11 years ago | (#5280926)

Ignore Linux. It is going away, after all, IBM has just announced that they're dropping it on itanic: m_1.html

Re:Linux? (1)

OppressiveGiant (558743) | more than 11 years ago | (#5281024)

Why not enclose that link in <a> tags </a>? Not that it really matters, since it's a dead link anyway.

Re:Linux? (0)

Anonymous Coward | more than 11 years ago | (#5281058)

Try again: here it is []

This is SLASHDOT (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5280895)

We discuss the oscars!

Your question makes no sense. (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#5280902)

Perhaps you should consider another profession other than programming.

Are cats black? (0)

Anonymous Coward | more than 11 years ago | (#5280908)

Do Slashdot readers think that the theories used to teach (and learn) programming lead to programmers that tend to approach problems with a 'black box', or 'virtual machine' mentality without considering the entire system?

Yes. And no.

(Ask a uselessly-general question, get a uselessly-general answer.)

Big Picture Programming (1)

rivendahl (220389) | more than 11 years ago | (#5280910)

I think perhaps the question was meant to focus on how big picture programming (not images) might have helped or hindered the abilities of the investigators in determining the cause. On the other hand, images of the destruction might come in handy for investigative purposes as well. Either way, and regardless of the intent of the original question, what might systems engineers improve upon, or how could they improve upon current paradigms to assist in investigative methods without using "black box" technology approaches?


O-U-T-SIDE (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#5280911)

"Do Slashdot readers think that the theories used to teach (and learn) programming lead to programmers that tend to approach problems with a 'black box', or 'virtual machine' mentality without considering the entire system?"

I think all programmers should consider spending more time outside in the fresh air and bright warm sun over all else, truth be told.

Re:O-U-T-SIDE (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#5281003)

Why is this flamebait?

Because the moderator hasn't seen the light of day for years most likely and human interaction scares the shit and piss out of him.

Involvement in the SDLC (5, Insightful)

seanmcelroy (207852) | more than 11 years ago | (#5280919)

I think the problem increases as programmers are less and less a part of the complete systems development life cycle and are contracted to work on individual components of an overall system. Especially during the maintenance phases of a system's life, the inexperience of new programmers on a project is probably more to blame than 'training' per se to think in a black-box mentality.

Huh? (5, Funny)

twofidyKidd (615722) | more than 11 years ago | (#5280923)

I don't know what you're trying to say here man, but no amount of programming or "Fatal Error: Wing no longer attached to craft" terminal prompts would've saved them from what happened.

If you're trying to make a case for programming paradigm shifts based on security procedures, it isn't working in this context.

Black Box....yes, but....... (5, Insightful)

jlk_71 (308809) | more than 11 years ago | (#5280924)

I am taking courses toward my degree and I must say that in my intro to programming course, the instructor was constantly stressing the need for 'black box' programming. In addition though, he also stressed that while keeping things black box, you also need to keep your mind on the whole project, always watching out for possible security problems, etc.
I believe that some people tend to get tunnel vision and concentrate wholly on the bb theory, without taking into consideration the whole program. This does usually lead to problems and errors in the code.



Sometimes It's Impossible (5, Insightful)

syntap (242090) | more than 11 years ago | (#5280928)

Many times, management prevents developers from seeing the "big picture". Sometimes it's "Here, code this" and you don't get a lot of opportunity to ask the questions you know need to be asked. Sometimes you have to hope resolutions to these types of issues are built into the requirements specification or will be ironed out in quality assurance measures.

The developer is only one in a group of responsible parties in any given system, and his/her output depends largely on input from others. If a developer is kept "out of the loop" on things (or is lazy and stays out of the loop on purpose), you're going to see these problems.

Often it's like blaming clogged fuel injectors _for_ cheap gasoline instead of _on_ it.

Re:Sometimes It's Impossible (4, Insightful)

Anonvmous Coward (589068) | more than 11 years ago | (#5281150)

"Many times, management is the cause of preventing developers to see the "big picture". Sometimes it's "Here, code this" and you don't get a lot of opportunity to ask the questions you know need to be asked..."

Don't forget the "make it work by the next trade show" mentality.

The "Big Picture" is TOO big for most people (5, Insightful)

Entrope (68843) | more than 11 years ago | (#5280929)

Keeping the "big picture" in mind is a good thing for managers and designers. For people implementing the finer details, though, it can be a distraction and a poor use of their time. Someone implementing or verifying flow control in a ground-to-space network link does not need to know much about the format of data carried over the link. Someone doing circuit layout or design for a cockpit control widget does not need to worry about reentry dynamics and airflow. Similar examples can be found in any large system design.

It is the responsibility of the higher level designers and managers to encapsulate the big picture into digestible, approachable chunks. To the extent possible, they should be aware of and codify the assumptions and requirements that cross multiple domains -- when those are documented, it is easier to test the system for correctness or robustness, as well as to diagnose problems.

When everyone on the project tries to orient their work to what they each perceive as the big picture, you end up with enough different perceptions that people work against each other. Breaking down the system into smaller, more defined, chunks combats that tendency.
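One way to read "codify the assumptions and requirements" above: write the contract next to the interface and assert it, so the chunk can be tested in isolation. A sketch in C; the function and its contract are hypothetical, not from any real protocol:

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

/* One "digestible chunk" with its cross-domain assumptions written down.
 * Contract: buf is non-NULL, len > 0; result is the low 16 bits of the
 * additive checksum.  Because the contract is explicit, it can be asserted
 * here and tested by whoever owns the other side of the interface. */
uint16_t frame_checksum(const uint8_t *buf, size_t len)
{
    assert(buf != NULL && len > 0);   /* codified assumption */
    uint32_t sum = 0;
    for (size_t i = 0; i < len; i++)
        sum += buf[i];
    return (uint16_t)(sum & 0xFFFFu);
}
```

The implementer never needs the big picture; the documented precondition is the slice of it that crosses the boundary.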

IMHO (5, Funny)

Em Emalb (452530) | more than 11 years ago | (#5280935)

People tend to focus exclusively on their area of expertise.

Otherwise they become managers :D

Offtopic (-1, Offtopic)

johann909 (241219) | more than 11 years ago | (#5280936)

Yama Yama Yama

What does that have to do with UI design? (2, Insightful)

192939495969798999 (58312) | more than 11 years ago | (#5280937)

If i write a component that takes in X1 and outputs X2, isn't it the designer's job to make it look pretty? I mean, supposedly they were the ones that came up with needing the component in the first place, to accomplish some function or other, and thus make the user happy.

Abstraction is necessary! (5, Insightful)

Dr. Manhattan (29720) | more than 11 years ago | (#5280943)

Being able to abstract chunks of a program or system out and not worry about implementation is utterly vital. No human, however gifted, is capable of understanding the entirety of more than a trivial system at once.

Now, the amount of abstraction possible does differ depending on what you're doing. Embedded systems programming is hard, and you do have to know details of the machine. But I ask you - do you insist on a gate-level understanding of the embedded CPU, or will you settle for knowing the opcodes and their timing characteristics?

Because, in embedded programming, you need to know more about the device, it's proportionately harder to do. That's one reason, apart from power and cost considerations, that embedded systems tend to be simple - the simpler the system, the easier it is to think about, to prove correctness or to at least enumerate possible pathways and handle them.

But even in that case, you need to be able to ignore some implementation issues or you can't do it at all.

Re:Abstraction is necessary! (4, Insightful)

dboyles (65512) | more than 11 years ago | (#5281074)

I agree, and would like to add my thoughts.

One of the most likeable things about programming is that on a low enough level, it's always predictable. This kind of goes hand-in-hand with the fact that computers don't make mistakes, humans do. As a programmer, it's very comforting (for lack of a better word) to have a chunk of code and know that, given X input, you'll get Y output. You can write a subroutine, document it well, and come back to it later, knowing how it will behave. Of course, other programmers can do the same with your code, without having to have intricate knowledge of how the code goes about returning the output.

But of course, there's a catch. It's probable that the programmer who wrote the subroutine initially didn't envision some special case, and therefore didn't write the code to handle it. If everybody is lucky, the program will hiccup and the second programmer will see the problem. The worse situation is when the error is seemingly minor, and goes unnoticed: when that floating point number gets converted to an integer and nobody notices.

I know this isn't some groundbreaking new look on abstraction in code, but it is pretty interesting to think about.
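The float-to-int case in the comment above is easy to reproduce. A small C illustration; the helper names are made up for the example:

```c
/* C truncates toward zero on float-to-int conversion, and no diagnostic
 * is required -- exactly the "seemingly minor, goes unnoticed" failure. */
int to_int(double x)
{
    return (int)x;      /* 2.99 -> 2: the fraction vanishes silently */
}

int cents_from_dollars(double dollars)
{
    /* Naive conversion: 0.29 is not exactly representable in binary,
     * so 0.29 * 100.0 is slightly below 29 and truncates to 28. */
    return (int)(dollars * 100.0);
}
```

Callers treating either function as a black box get a plausible-looking, slightly wrong answer, which is the worst kind.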

Re:Abstraction is necessary! (1)

phuturephunk (617641) | more than 11 years ago | (#5281158)

No human, however gifted, is capable of understanding the entirety of more than a trivial system at once.

That's not entirely true. Right-brained people can; thing is, a lot of right-brained people have a hard time learning programming. It's the way it's taught, or so I've experienced.

Not the theories themselves.... (5, Insightful)

keyslammer (240231) | more than 11 years ago | (#5280956)

... but the lack of experience.

Programmers have to consider subsystems as abstractions: there's a limit to how many things the brain can deal with at one time. We know that this kind of thinking produces cleaner designs which are less susceptible to bugs and security holes.

Knowing the limitations of the "black box" and what will break the abstraction is the product of lots and lots of experience. I don't believe there's any way to teach this - it's something that you just have to live through.

That's why senior level developers can continue to justify their existence (and higher price tags)!

I think everyone is missing the point. (2, Insightful)

Anonymous Coward | more than 11 years ago | (#5281143)

I've been doing pda programming for both the pocketpc and the palm os.

The application for both is intended to be identical, but the api is different for each device.

I designed the app originally for the palm, but now I am porting it over to the pocketpc. Unfortunately, the api is different enough that little of the code is portable.

If I had known I would be coding for both, I would have tried to design the code to be more portable. Knowing the requirements of both systems might have allowed me to factor out the device-specific sections.

When I was programming the temperature control ... (3, Funny)

burgburgburg (574866) | more than 11 years ago | (#5280958)

on the new Death Star, I found that trying to envision the "Big Picture" interfered with the specific requirements of my task. I needed control mechanisms smart enough to deal with Storm Trooper suits, regular Empire uniforms, robots with various temperature ranges, Wookies. It needed to be able to maintain a comfortable temperature range in the beam tunnel vicinity even during firing. And it needed to be efficient enough that they wouldn't shift power from the exhaust port shields to the jacuzzi heaters like they did on the old Death Star.

Experience (4, Insightful)

wackysootroom (243310) | more than 11 years ago | (#5280959)

The only thing that school prepares you for is to get an entry level job where you can gain the experience to write reliable software.

School will get you up to speed on new terms and concepts, but the only thing that will make you better at writing good code is to read good code, write your own code and compare it to the good code, notice the difference, and adjust your approach until your code is "good".

Re:Experience (1)

Havokmon (89874) | more than 11 years ago | (#5281153)

The only thing that school prepares you for is to get an entry level job where you can gain the experience to write reliable software.

School will get you up to speed on new terms and concepts, but the only thing that will make you better at writing good code is to read good code, write your own code and compare it to the good code, notice the difference, and adjust your approach until your code is "good".

I agree entirely, and will actually take it a step further:
Question the 'facts' you were taught. They may appear to be correct, but you may not only learn something new by examining why the 'fact' is as it is, but you may find a better way.

Of course, take that with a grain of salt. I don't believe the speed of light can't be exceeded. Not that I can prove anything, I just don't like having someone tell me I can't do something :)

Re:Experience (2, Insightful)

Anonymous Coward | more than 11 years ago | (#5281166)

So then how does better code get developed if everyone is merely attempting to match their coding style to the "good" style they see?

*Someone* had to provide that good code in the first place.

From a programming point of view, it depends. (3, Interesting)

Dthoma (593797) | more than 11 years ago | (#5280962)

Programming classes may encourage a 'black box' approach to programming, depending on what language you use. The reason for this is that it all relies on how high-level the language is; if you're using PHP then chances are you won't be worrying nearly as much about the hardware of your system as you would if you're using C, assembler, or machine code.
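A concrete instance of the machine detail that low-level languages surface and high-level ones hide: fixed-width integer wraparound. A C sketch (the function is hypothetical):

```c
#include <stdint.h>

/* In C you can watch a 16-bit counter wrap -- the kind of hardware-shaped
 * behavior a PHP script never confronts.  Unsigned overflow is defined:
 * 65535 + 1 wraps to 0 by the standard's modular arithmetic rules. */
uint16_t next_sequence(uint16_t seq)
{
    return (uint16_t)(seq + 1);
}
```

A student who has only seen arbitrary-precision integers will never design for the wrap; a C programmer has no choice but to.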

We don't need anymore black boxes (4, Interesting)

jj_johny (626460) | more than 11 years ago | (#5280964)

I think that the programmer who thinks of things in a black box mentality is usually going to be involved in a failed program. I have run into so many programmers who know nothing of the many parts that their program touches. They seem to believe that their software does not work within a wider system and a wider world.

The problem with these programmers is that they rarely understand what can and does go wrong with the outside world. It is always amazing to me that there are people out there that assume that everyone has a 100BaseT Ethernet hub between the front end and the back end or other stupid assumptions.

The issue that crops up most when programmers think in black box terms is that today's software is not spec'd out thoroughly enough; the end user does not get what they wanted, and the programmer did not solve it by asking. Too often the problem is very fuzzy, and thus the programmer is there to help clarify, not just implement.

Without a well-rounded programmer looking at the overall system (or his/her boss), you will wind up with chatty, buggy applications that were what the user asked for but not what they needed.

The problem as I see it (0, Troll)

Jack Wagner (444727) | more than 11 years ago | (#5280965)

It isn't so much a matter of looking at the big picture as it is a matter of not using the proper tool for the proper job. Too often you will see someone using the technology flavour of the month because it's been hyped and gotten lots of press when a better, more robust tool is available.

For instance, while looking at some of the source code for the latest version of Gnome/Linux 8.0 I noticed many points where the coder was using an improper algorithm because they were attempting to emulate XML technologies when a flat Unicode text approach would give them a speed increase of O(log n). Why would they do this? Simple, because they think that XML is cool and it's the way to go, regardless of whether it's valid in their context or not.

I think Jack Brooks summed it up the best in "The Mythical Man Month" when he stated that technology would only be limited by the lack of maturity in those who develop the tools.

Warmest regards,

Re:The problem as I see it (1)

crmartin (98227) | more than 11 years ago | (#5281098)

"Fred Brooks". Frederick J Brooks, Jr.

Good show! (0)

Anonymous Coward | more than 11 years ago | (#5281122)

+4 Insightful - you are the master of trolls, Mr. Wagner. I salute you.

Re:The problem as I see it (0)

Anonymous Coward | more than 11 years ago | (#5281137)

I think Jack Brooks summed it up the best in "The Mythical Man Month" when he stated that technology would only be limited by the lack of maturity in those who develop the tools.

I don't think he ever said that, and the author's name is Frederick P. Brooks, Jr [] .

Good question (1)

proverbialcow (177020) | more than 11 years ago | (#5280966)

When I was taking classes, this was always the most frustrating thing I had to deal with. I was expected to program in "ideal" C or LISP (or Pascal or BASIC, way back in the day), and then had to compile it according to the whims of the particular compilers we had (i.e. very specific command lines), without any explanation of what those quirks were or why they chose a quirky compiler over something more standard.

That was a lot of damage to undo, and I'm still working on it.

The pros and cons of Black Box (4, Insightful)

levik (52444) | more than 11 years ago | (#5280967)

The black box paradigm obviously has its proper and improper applications.

It can be a great boon in OO programming, where you can assume a component will live up to its end of the bargain by providing the specified functionality, letting you concentrate on using whatever interface it exposes.

It can obviously be taken too far in cases where failure to know about the internal workings of a system can lead to grossly unoptimized or even error-prone code. However, more often than not such problems are caused by faulty abstraction, and incomplete documentation on the part of the implementor.

In most such cases a "grey box" approach would do, where the end-developer is made aware of some of the limitations and quirks of the component they are working with, but not necessarily the minute details of its operation. You don't need to know if the sort() function is implemented with Bubble Sort or Quick Sort, but it does matter whether it's a quadratic-time or an n-log-n-time function.

Everything breaks down if taken to an extreme.
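The grey-box point about sort() can be made concrete by counting comparisons instead of opening the box. A C sketch; the function is illustrative, not any library's real sort:

```c
#include <stddef.h>

/* Bubble sort instrumented to expose its cost curve: it performs exactly
 * n*(n-1)/2 comparisons on every input, regardless of initial order.
 * That quadratic growth is the grey-box fact a caller needs, even if the
 * algorithm inside stays hidden. */
long bubble_sort_comparisons(int *a, size_t n)
{
    long cmps = 0;
    for (size_t i = 0; i + 1 < n; i++)
        for (size_t j = 0; j + 1 < n - i; j++) {
            cmps++;
            if (a[j] > a[j + 1]) {
                int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t;
            }
        }
    return cmps;
}
/* For n = 100 that is 4,950 comparisons, where an n-log-n sort would do on
 * the order of 700.  At n = 1,000,000 the gap is ruinous. */
```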

No... (1)

NitsujTPU (19263) | more than 11 years ago | (#5280971)

No, I don't think that this is an issue, especially with Cryptography which is entirely in the space of mathematical theory. Unless this is some form of hardware crypto, and a programmer said "oh yeah, it's made by the hand of God," I doubt that it is programmer error.

On the other hand, managers and people more detached from the actual implementation...

Yeah, pretty much... (5, Insightful)

drenehtsral (29789) | more than 11 years ago | (#5280973)

I think you've got a point there.

The way modern projects are often managed, along with the way modern programmers are often taught does lead in that direction.

Even if you are only responsible for a small part of a much larger project, it will always help to have a decent understanding of the REST of the system. Maybe not in excruciating technical detail, but at least a decent grasp on what goes on and how it all works.

The goal of the whole 'black box' thing is that in theory it minimizes stupid dependencies and hidden interconnections that can cause things to be unmaintainably complex. Individual components should still be well spec'd out, and projects should still be modular, but each programmer should grok the context in which his/her code runs, and people should still communicate to iron out inefficiencies, strive for a consistent UI, etc...

I think it's hard to teach that, you just have to learn by experience. Where I work, we all go out for curry once a week (company pays) and we just talk about the project, off the record, no bureaucracy, just a handful of geeks talking about programming. We've hammered out more efficiency/UI/complexity issues that way than in any formal meetings.

Very Interesting (0)

Anonymous Coward | more than 11 years ago | (#5280977)

I've often wondered that exact same thing. I've always thought the best programmers were true problem solvers -- driven to find the BEST answer to the given problem. However, I think it is quite common for (inexperienced??) programmers to thumb through their copy of "Design Patterns" for a solution.
Most programming is driven by business. And we all know there are tradeoffs (cheap, fast, good: pick 2) in every business.

Agreed (0)

Anonymous Coward | more than 11 years ago | (#5280984)

I've been out of college five years now and it never ceases to amaze me how many people I meet who have master's degrees in CS and have _no clue_ how the HW underneath works. I work in embedded systems also, so maybe I have a biased viewpoint, but I think all CS majors should be required to take courses in computer architecture. How can you write good code if you don't understand how the HW underneath operates? You don't have to be a master at digital logic or assembly programming, but if you don't understand what a set-associative cache is, how TLB misses affect the performance of your system, or how instruction pipelining works, you should not be programming...
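A miniature version of the architecture argument above: two loops that compute the same sum but stride memory very differently. A generic C sketch, not tied to any particular CPU:

```c
#define N 256

/* Both functions return the same total, but the row-major walk touches
 * memory in cache-line order while the column-major walk jumps N ints
 * between accesses.  Same answer; on real hardware, very different cache
 * miss rates -- invisible to anyone who treats memory as a black box. */
long sum_row_major(int m[N][N])
{
    long s = 0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            s += m[i][j];
    return s;
}

long sum_col_major(int m[N][N])
{
    long s = 0;
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            s += m[i][j];
    return s;
}
```

Correctness is identical either way; only the programmer who knows the cache exists can tell why one is faster.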

Hardware or OS (0)

binaryDigit (557647) | more than 11 years ago | (#5280986)

If he's asking whether or not most programmers today should give a flip about hardware, then I'd say NO, most programmers don't need to care about the hardware they're running on. They should be concerned about solving the problem, and then doing it in a way that is free of bugs and well designed. The fact that it's running on a 600MHz PIII or a 3GHz P4 is a non-issue, EXCEPT that most applications should run in a useful fashion on either config. This is what OSes are for, so we don't have to worry about hardware issues.

Now if by bigger picture you're talking about the OS, then sorta. Again, it depends a lot on what you're writing. Again, most programmers don't need to be as concerned, though it's usually hard to get anything useful done without having to "get your hands dirty" a bit in your environment.

Abstractions (0)

Anonymous Coward | more than 11 years ago | (#5280995)

You have to deal with abstractions when you program. You can never get the "whole picture", because there's no place to stop. It's turtles all the way down. For most programming, I don't worry about the existence of memory as a large array, even though that's just an abstraction for transistors, etc. It's fun to program close to the hardware, but it's often not worth the extra effort.

There was a great Slashdot article/discussion on leaky abstractions. Search and you will find.

University vs. real world (1)

denladeside (535103) | more than 11 years ago | (#5280996)

The usual problem applies here. While the world evolves... university professors' teaching material usually doesn't.

bah (1, Interesting)

Anonymous Coward | more than 11 years ago | (#5280999)

We need new paradigms, and more best-processes. We need a design for system mentality and a design for integration production engine.

Or just realize that spaceflight is risky, and it's not always the great unknown that will bring us down.

I'm not advocating carelessness, just pointing out that these space machines we build tend to last 15 years between failures and are so complicated they make the MS Windows operating system look like Lincoln Logs. REAL engineers built that sucker, and they aren't infallible, but they are thorough.

I'd bet my life on their processes any day.

That's the difference between CS and CE. (4, Insightful)

Anonymous Coward | more than 11 years ago | (#5281004)

Computer science professors and courses are more concerned with the methods, ideas, and logic of computer programming and design. The idea is to create a totally abstract system, hardware or software, that can then be implemented on any system. This is the purpose of "black box" programming.

While I agree with you that programmers should understand the hardware they are writing for, any knowledge of that hardware biases their creation of a system to run on that hardware, further removing it from computer science's notion of total abstraction.

UI issues: left hand/right hand (5, Insightful)

Illserve (56215) | more than 11 years ago | (#5281006)

I recently installed the recent version of the accursed RealOne player to watch an rm file. I hate Real player more than can be described by words and it just seems to be getting worse.

So I pop it up to view the file, and what happens? I get the movie playing in a window on top of the Real advertising/browser thing. It spontaneously pops up a "help" balloon giving me a tip for how to use the browser window. The balloon is sitting RIGHT ON TOP OF THE GODDAMNED MOVIE IMAGE. It goes away after a few seconds of frantic clicking, but the point is clear, these programs are often a monstrous brew, created by too many chefs. They just throw in features, and there doesn't seem to be someone sitting at the top, deciding whether these features actually contribute to improving the final product, or just make it worse.

Then there's Office, which, by default, will turn 2-21-95 into 2/21/95. ????? I have to dig through numerous help pages to figure out what subpanel of the preferences menu will deactivate this. Worse, if I enter 23 hello on a new line in Word and hit enter, it auto-indents, adds a 24, and positions the cursor after it. !?!?!!?!?!?!?!!?
How many times I've had to fight this particular feature I can't tell you.

And it's certainly not just a closed source thing either; if anything, some open source GUI packages are even worse. Although, to be fair, I don't expect as much from open source stuff, because no one's getting paid. But when a program created by paid programmers is just badly done, I get infuriated at the incompetence, at the hours wasted taking a usable product and making it actually worse by throwing in garbage features.

It's been said a million times, but if we made cars the way we make software, no one would get anywhere.

itanium is going away (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5281019)

doo dah doo dah
itanium is going away []
doo dah doo dah day

The problem isn't black boxes it's grey edges (1)

joe_fish (6037) | more than 11 years ago | (#5281020)

I've worked on large systems where I've had a lot to do with many parts of the system, and not been able to hold the whole thing in my head all at the same time.

And as someone that's had a lot to do with the building of these systems, I have a better chance than most. New programmers would have no chance. We need black box systems to enable us to continue working.

The Real problem is that the black boxes we define don't have nice sharp edges, so when we put 2 boxes next to each other there are cracks. Cracks for the crackers to crawl through.
Joel Spolsky wrote about it and called it The Law of Leaky Abstractions []

Bad security? (1, Funny)

Anonymous Coward | more than 11 years ago | (#5281025)

You need to give your black box objects a coat of super-black paint... security by invisibility...

Of course (5, Insightful)

Scarblac (122480) | more than 11 years ago | (#5281031)

It is essential that every programmer in a big system only thinks about his own problem, and uses the other parts as a black box.

Say I want to use some library. Then it has a documented API, which explains how I can use it. I don't need to know more. For me as a programmer, that means:

  • Simplicity - it is a limit on what I need to understand.
  • Compatibility - if a new version comes out, which changes implementation details but leaves the API intact, programs that don't make assumptions about these details won't break.
  • Portability - if there is a new implementation of the same API by another vendor, I can (theoretically) just change to that implementation and nothing changes.

I'm certain that without these black boxes, no big software engineering project would be possible. The human mind can't keep track of everything in a whole system at once (except for some simple cases - like embedded systems, perhaps).

It is done sometimes - I believe Perl looks inside a FILE struct when reading/writing files on some platforms to get faster I/O than standard C, for example. But that's only as an optimization after coding the general case, and even then I don't believe it's a good idea.

For hardware, the story is much the same. Any speedups specific for the hardware are optimizations, and they should only be looked at when the program works, after profiling, when there's a speed problem, and the algorithm can't be improved.

Remember the rules of optimization: 1) Don't do it. 2) (for experts only) Don't do it yet.

Black boxes in software engineering are your friend.
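One concrete shape of that discipline, sketched as a C opaque handle (the names here are invented for illustration): callers program against the documented API only, so the implementation can change without breaking them.

```c
/* counter.h-style public API: callers only ever see the typedef and the
   functions below; the struct layout is the black box. */
#include <stdlib.h>

typedef struct Counter Counter;   /* opaque handle */

/* In a real project this definition would live in counter.c only, so
   callers physically cannot peek inside. Shown here for self-containment. */
struct Counter { int value; };

Counter *counter_new(void)
{
    Counter *c = malloc(sizeof *c);
    if (c)
        c->value = 0;
    return c;
}

void counter_add(Counter *c, int n)  { c->value += n; }
int  counter_get(const Counter *c)   { return c->value; }
void counter_free(Counter *c)        { free(c); }
```

If a later version stores the count differently, code written against this API keeps working, which is exactly the Compatibility bullet above.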

Welcome to API's and OOP (4, Insightful)

DakotaSandstone (638375) | more than 11 years ago | (#5281032)

I'm also an embedded systems engineer. Two huge concepts I got as an undergrad in CS were "APIs" and "object oriented programming." By their very nature, these things inspire black box thinking.

And heck, I don't know. I mean, is it great that I can now call malloc(1000000); and get a valid pointer that's just usable? Yeah probably. In DOS, I wasn't shielded from the memory manager as much, and to do something like that, I had to write my own EMS memory swapping code! That was a PITA, and kept me from the true task I was trying to solve.

So a modern 32-bit malloc() is a black box for me. Cool. It's empowered me very nicely.

However, something like WinSock has become a big black box for people too. Okay, great. So it's really easy, in 5 function calls, to open a socket across the internet and send data. But you've missed the nuances of security. So now your app is unsafe, because you weren't forced to know more about what's going on in the "black box."
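As a sketch of one such nuance (no real sockets here, the function and sizes are invented): a length field that arrives from outside the box has to be validated before it is trusted, or the convenient 5-call recipe ships a buffer overflow.

```c
#include <string.h>

#define BUF_SZ 16

/* Copy a "received" payload into a fixed buffer. The naive version would
   memcpy() using the sender-supplied len unchecked -- a classic overflow.
   This version rejects oversized payloads instead. */
int copy_payload(char *dst, const char *src, size_t len)
{
    if (len >= BUF_SZ)
        return -1;            /* refuse rather than overflow */
    memcpy(dst, src, len);
    dst[len] = '\0';
    return 0;
}
```

The point isn't this particular check; it's that nothing in the easy socket API forces you to think of it.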

Well, that's all I can really say in my post. Black boxes are a darn complex issue to talk about. Anyone who attempts to distill this down to a "yes" or "no" answer is probably missing a lot of the complexity at hand in the issue.

Very interesting question to put forth, though. Good topic.

As a 3rd Year CS Student... (2, Interesting)

calebp (543435) | more than 11 years ago | (#5281036)

I concur that there is a lot of this kind of naiveté out there, but on the other hand, there are always the few that will go above and beyond. While my schooling tends to focus on a more abstract approach with emphasis on OOP, I have also started working on embedded systems in my own time.

It seems apparent enough to me that any passionate CS student will not be satisfied with a mystical understanding of computer architecture, and will in turn educate themselves. I propose, then, that any kind of 'black-box' mentality is more a reflection of the students' drive than of their education.

hey man... (1)

LordYUK (552359) | more than 11 years ago | (#5281037)

why's it always got to be the "black box"? Huh??

Re:hey man... (1)

WetCat (558132) | more than 11 years ago | (#5281080)

Because magicians' black boxes, which usually contained some enigmas, were usually black.

Re:hey man... (1)

Mr. Bad Example (31092) | more than 11 years ago | (#5281152)

'cause The Man is always tryin' to keep a brother box down.

Oh like in the grand scheme of things ... (5, Funny)

nicodaemos (454358) | more than 11 years ago | (#5281043)

Okay the "big picture" college profs should be showing you is this one [] .

Not yet, anyways (2, Interesting)

captredballs (71364) | more than 11 years ago | (#5281044)

Is black box programming a pipe dream? I wouldn't go that far, as software engineering/compsci is a relatively new "science". At any rate, I know that I am very reliant on knowledge of the underlying platform that my code is running on. When a piece of software (especially one that I didn't write) doesn't work, I often resort to tools like truss/strace, lsof, netcat, /proc, etc... to help me determine what is going on "under the hood". I can figure out what ports, files, dlls, and logs the software is using in a matter of seconds, instead of resorting to a debugger or printf's.

I'm no superstar engineer, but I find this methodology (my window into the black box) so valuable that I'm often frustrated by colleagues who refuse to learn more about an OS/VM/interpreter and make use of it. It is also what most frustrates me about troubleshooting in windows.

While it's true that I don't know much about windows, I get the feeling that these kinds of observation tools that are so common on unix-ish machines aren't quite so prominently available on winderboxen. Sure, you can figure out a lot about a problem using MSDEV (what I remember from college, where VC++ wouldn't stop opening every time Netscape crashed), but it isn't available on ANY machine that I ever troubleshoot.

Hell, even when I'm programming java I use truss to figure out what the hell is wrong with my classpath ;-)

black box by necessity (1)

bob_jenkins (144606) | more than 11 years ago | (#5281050)

A black box is something you don't know anything about. You make it not a black box by learning something about it. Most of software development is spent learning about the system (reading documents, searching indexes, walking through code with a debugger). How much you can get done is determined by how little you can get away with learning before fixing a problem.

That's not universal, it might not be the case with the shuttle software, but it's true for a lot of software. It's definitely true for my job.

Programmers and the "Big Picture"?...? (1)

Glock27 (446276) | more than 11 years ago | (#5281061)

Back working on my undergrad (computer engineering) I remember getting frustrated at the comp-sci profs that insisted machines were simply 'black boxes' and the underlying hardware need not be a concern of the programmer.

I'm not sure if there was a lack of communication with your prof, but the concept should have been "SOFTware as black boxes". This is the concept of data hiding, which is a good thing. The cornerstones of software engineering are abstraction and encapsulation, and data hiding is a big part of encapsulation.

The hardware is (to an extent) a "black box" from the standpoint of any higher-level language, including C and Ada. That is the whole point of software portability, which is also a good thing. Both of those have been used for a tremendous number of embedded systems (particularly Ada, which is used for quite a lot of the space shuttle software). One must know that one's algorithms will execute deterministically in the required time, but knowing in detail how the data and instructions flow from memory through cache and processor is emphatically not required in 99% of cases.

Detailed knowledge of computer hardware is helpful to software engineers, but by no means essential. Talk to the hardware folks if you have a question. ;-)

By the way, don't forget another important axiom:

"Premature optimization is the root of all evil."


School Doesn't Prepare You For Real World Coding (2, Insightful)

Carnage4Life (106069) | more than 11 years ago | (#5281066)

There are many things computer science education does not teach the average student about programming. This is compounded by the fact that programming can vary significantly across areas of CS (i.e. networking vs. database implementation) and even within the same area (GUI programming on Windows vs. GUI programming on Apple computers).

When I was at GA Tech the administration prided themselves on creating students that could learn quickly about technologies thrown at them and had broad knowledge about various areas of CS. There was also more focus on learning how to program in general than specifics. This meant that there was no C++ taught in school even at the height of the language's popularity because its complexity got in the way of teaching people how to program.

Students were especially taught to learn how to think 'abstractly', which, especially with the advent of Java, meant not only ignoring how hardware works but also how things like memory management work as well. In the general case, one can't be faulted for doing this while teaching students. Most of my peers at the time were getting work at dotcoms doing Java servlets or similar web/database programming, so learning how using linked lists vs. arrays for a data structure affects the number of page faults the system makes was not something they would really have to concern themselves with, considering that things like the virtual machine and database server would affect their application more significantly than any code they wrote.

Unfortunately for the few people who ended up working on embedded systems where failure is a life or death situation (such as at shops like Medtronic [] ) this meant they sometimes would not have the background to work in those environments. However some would counter that the training they got in school would give them the aptitude to learn what they needed.

I believe the same applies for writing secure software. Few schools teach people how to write secure code, not even simple things like why to avoid C functions like gets() or strcpy(). However, I've seen such people get snapped into shape when exposed to secure programming practices [] .
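For instance, a minimal sketch of the kind of practice meant here (the helper name is invented): replacing the unbounded copy with a bounded one that always NUL-terminates.

```c
#include <stdio.h>
#include <string.h>

/* gets() has no bounds check at all, and strcpy() trusts the source to fit.
   snprintf() takes the destination size and always NUL-terminates within it,
   truncating instead of overflowing. */
void safe_copy(char *dst, size_t dstsz, const char *src)
{
    snprintf(dst, dstsz, "%s", src);
}
```

Whether truncation or outright rejection is the right policy depends on the application, but either beats silently writing past the end of the buffer.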

How big of a team (1)

briancnorton (586947) | more than 11 years ago | (#5281071)

If you are in a very small team, you obviously need to be aware and conscious of the system as a whole, but on a larger team, if everybody has a view of the whole project with their own vision, everybody goes in a different direction. It is better in this case to have each individual or group be concerned with following the specifications for their individual component, and having lead programmers/designers integrate as needed. I've never worked in embedded systems, so I can't tell you if that holds up.

My comp sci experience (1)

truthsearch (249536) | more than 11 years ago | (#5281089)

My undergrad studies for computer science included fundamental understanding of gates and boolean logic. We also studied some of the microcode that goes into processors. So we went from the level of the gates to simple chips to the basics of processors to assembly to operating systems to applications. It wasn't taught in that order from the ground up. Algorithms were studied at the same time as chips, but it worked out well. Anyone getting a comp sci degree from Pace U. in NY has at least a fundamental understanding of computers from the ground up. However I have a coworker (developer) with an electrical engineering degree. He has much better knowledge of the electronics from beginning to end and he's a great programmer.
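A toy illustration of that gates-up view: a one-bit full adder written with C's bitwise operators, the same building block those courses chain together into real adders.

```c
/* One-bit full adder from bare boolean ops:
   sum       = a XOR b XOR carry-in
   carry-out = majority(a, b, carry-in)
   Chaining 32 of these (ripple-carry) gives a word-width adder. */
typedef struct { int sum; int cout; } AdderOut;

AdderOut full_adder(int a, int b, int cin)
{
    AdderOut r;
    r.sum  = a ^ b ^ cin;
    r.cout = (a & b) | (a & cin) | (b & cin);
    return r;
}
```

Seeing that addition bottoms out in three gate expressions is exactly the "from the gates up" grounding the comment describes.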

I find knowing how things work from the bottom up makes me better at building on top. I find the most ignorant and least innovative developers to be those with only a high level understanding of how the underlying software and hardware works.

Abstraction (1)

NixterAg (198468) | more than 11 years ago | (#5281091)

Do Slashdot readers think that the theories used to teach (and learn) programming lead to programmers that tend to approach problems with a 'black box', or 'virtual machine' mentality without considering the entire system? That, in and of itself, would explain a lot of security issues, as well as things as simple as user interface nightmares. Comments?"
But isn't that exactly how we are able to use abstraction and make large, complex systems? A good programmer and engineer is naturally going to want to know how their piece of the system fits in with the overall machine, but usually it simply isn't practical.

The best case scenario will always be for each member of a development team to understand every nuance of the system and every detail of its interface with the underlying hardware. However, it simply isn't practical (and for some systems it might not even be possible).

Big Pictures? (0, Offtopic)

krystal_blade (188089) | more than 11 years ago | (#5281095)

Go to The Hun if you want "big" pictures.


Is it time for this link again? (1)

clovis (4684) | more than 11 years ago | (#5281109)

Disassembling the radio (1)

Tomy (34647) | more than 11 years ago | (#5281111)

I think part of what makes many of us go into an engineering career is the curiosity that requires that we have to have a look under the hood. I never was a very good Lisp programmer until I wrote my own interpreter in C. That gave me the knowledge to write more efficient Lisp code.

Every Java programmer should at least look at the source for the Java base classes, and ultimately should understand the VM. C++ programmers should at least read "Inside The C++ Object Model." C/C++ programmers should peek at the assembly their compiler creates. Python or Perl programmers that have a good understanding of the internals of their interpreters are going to write better code.

All these abstractions are there so you don't have to sweat details all the time. But this shouldn't be misconstrued as "never."

Ah, the mythical black box (1)

QuasiEvil (74356) | more than 11 years ago | (#5281115)

It depends on what field you're in. Are you writing a portable software component that is basically a business logic module, or are you writing code that is the whole system (embedded micros, etc.)? Software components, if they need to be portable, should be blackbox-esque.

Embedded stuff? Hell no. Software is there, usually, to facilitate the hardware. The smaller, tighter, and faster you can get the code, the cheaper the hardware to run it on, and the cheaper the overall design. That was one of the things that annoyed me most about college programming classes, etc. - the fact that everything was abstracted into a world where all hardware was equal and all code was perfectly modular.

I'm an embedded programmer. The hardware is mother, the hardware is father. Sure, usually I have at least a bootloader and/or possibly a separate kernel that abstracts the hardware just a bit. However, while abstractions are nifty and help prevent code duplication and coder error, the machine itself certainly doesn't need them and all they do is chew up processor cycles. That leads to needing bigger, more power hungry hardware to keep ahead of all the bloated code. The trick is balancing the two.

Sure, I can write C for a PIC with 64 bytes of memory and 2K of program space as well as I can write C for a 16-way Alpha server monstrosity, but the style of code I write will be massively different. There ain't no malloc, no filesystem, etc. on a PIC. You can't interface to an ODBC client to store data - you have to write the flash interface routines, check for errors, figure in wait states, etc. To manipulate certain registers, you have to execute very specific assembly in a very specific order, or you risk falling into bugs in certain revs of silicon.
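As a sketch of what stands in for malloc in that world (sizes and names invented, not from any real PIC toolchain), a fixed static pool is a common pattern: all storage is budgeted at build time, and exhaustion fails loudly instead of fragmenting.

```c
#include <stddef.h>

/* A tiny bump allocator over a static pool, in place of malloc() on a
   micro with no heap. Nothing is ever freed individually; the whole pool
   is the lifetime. */
#define POOL_SZ 64
static unsigned char pool[POOL_SZ];
static size_t pool_used;

void *pool_alloc(size_t n)
{
    if (pool_used + n > POOL_SZ)
        return NULL;          /* out of pool: fail, don't overflow */
    void *p = &pool[pool_used];
    pool_used += n;
    return p;
}
```

The trade-off is the one the comment names: less flexibility, but the memory footprint is known exactly before the code ever runs.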

I also tend to think that programming is moving bits around between registers. It's a fun way to flip transistors, essentially. Likewise, I've always thought that computer engineers should start at machine architecture, then move to assembly, then C, then C++, etc. Make them really think about what the abstraction each language buys you vs. what it costs in bloat and inefficiency (because no optimizing compiler is as good as a hard-working ASM guru).

I'm not against portability, but I am against bloat and inefficient coding in cases where it doesn't buy me (or future maintainers) anything.


Black-Box Programming is Essential (1)

cK-Gunslinger (443452) | more than 11 years ago | (#5281120)

We're taught black-box programming in school because of the simple fact that hardware changes, frequently. Abstraction is necessary, or we would all be writing in assembly.

But, of course, encapsulated programming must be designed that way. A top-down, system-wide approach to security, error-handling, IPC, etc. must be established in advance and trickle down to each component, even if it means that your software simply returns a -1 on a failure, but the higher piece in the hierarchy sees your -1 and the -1 of other parts and determines that a boolean value of FALSE is deserved, which it returns to its parent component, etc. Eventually, a high enough component will recognize the true problem and do whatever it needs.
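That -1-to-FALSE propagation might look like this in C (read_sensor and the layering are hypothetical, purely for illustration): each layer translates its children's failures into the vocabulary its own caller expects.

```c
#include <stdbool.h>

/* Low-level "black box": reports failure as -1, Unix-style.
   Pretend sensors 0..3 exist and return id*10. */
static int read_sensor(int id)
{
    return (id >= 0 && id < 4) ? id * 10 : -1;
}

/* Mid layer: folds the component results into one boolean for its caller,
   exactly the "your -1 becomes my FALSE" hand-off described above. */
bool read_all_sensors(int count, int *out)
{
    for (int i = 0; i < count; i++) {
        int v = read_sensor(i);
        if (v < 0)
            return false;     /* one -1 below becomes FALSE up here */
        out[i] = v;
    }
    return true;
}
```

The plan that matters is the agreement on these conventions before the boxes are built, not the conventions themselves.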

The point is, black-box programming works, but only if there is an underlying plan in effect. It's the five (or is it 6?) P's of a successful project in action!

An issue for humanity, not just programmers (1)

scotay (195240) | more than 11 years ago | (#5281125)

Do Slashdot readers think that the theories used to teach (and learn) programming lead to programmers that tend to approach problems with a 'black box', or 'virtual machine' mentality without considering the entire system?

Having a human brain leads programmers to approach problems with a 'black box' or 'virtual machine' mentality.

I don't think we were built for a natural 'big picture' view. We were built to understand our little piece of the African savanna from inside that box.

All the 'big picture' stuff is doable, but not as naturally. We will always feel a little more comfortable inside the box.

Humanity will always have to force itself to think outside the boxes we constantly make to aid our system of modeling reality through perception.

Part of the problem is the Real World (1)

GeekDork (194851) | more than 11 years ago | (#5281131)

In theory, a perfect system could be built from a lot of communicating "black boxes" without knowing the whole system. That's what the fun is all about. In the Real World, however, this approach is not possible, since there are too many broken black boxes and too many unsupported standards that make a secure approach impossible without knowing, or at least controlling (=choosing which black boxes to use), the whole system.

Unix (1)

kisrael (134664) | more than 11 years ago | (#5281138)

Hrmm. Some of the philosophies of Unix and its revolutionary system of pipes tend to emphasize individual components, each doing its job well. (Though Perl as the swiss army chainsaw with sometimes surprisingly better performance has something to say on that...)

I'm probably living in a dreamland, but I really think small teams, where people can realistically have a hand in all and therefore knowledge of all parts of the system, can do almost any software project. It seems to me that "mythical man month" scaling problems really start to attack productivity, even with medium size teams.

Paying attention to the *ENTIRE* task is important (0)

Anonymous Coward | more than 11 years ago | (#5281154)

Where I work, we have two software groups (in practice). One is an integration group, and they actually sit in the lab with the hardware. They all know how to reset the hardware, replace cards, even perform diagnostics on the system. Their code is almost invariably good, and their knowledge of the system is almost as formidable as the hardware designers'. Sadly, they are limited to writing BIST and self-test code, not application code.

The app guys, on the other hand, don't even sit in the same room as the hardware. They have to call someone to hit the bleedin' reset button, and most of them barely know how to power the equipment up. They code at an abstract level, and frequently miss nuances of the hardware that were explicitly designed in for the task at hand. Their code is *always* bug ridden, and when we have serious problems on the system bench, nine out of ten times we can point the finger at a software engineer.

So, yes - I can say from experience that software engineers could benefit from paying attention to the entire task, not just the code they are being paid to produce.

Not quite the issue (1)

dido (9125) | more than 11 years ago | (#5281159)

One side of the issue is that if you attempt to look at any sufficiently large and complex system it will overwhelm you with its complexity. The human mind can only deal with a certain amount of complexity at a time before overloading. That's the reason why object-oriented methodologies were invented, to attempt to chop up a large and complex problem into smaller and more manageable pieces, so you can deal with certain things as "black boxes" and move on to the bigger picture. Sort of like zooming in and out. A graphic artist would never think of working at a single zoom scale when editing a picture, she would zoom in for fine work and zoom out for an overall view. Treating things as black boxes is done so that you don't lose sight of the forest for the trees, not the other way around!

But of course, as my own experience in embedded systems development and electronics work has taught me as well, it does no good to simply leave things as black boxes. You also have to know how the black box works on the inside before you can go on to treat it as a black box. I had to learn the ins and outs of semiconductor and transistor physics before I learned how to use logic ICs, which have these components as their basic building blocks, so that I'd understand the limits and quirks of these devices. I think the big problem we have is that people are generally unfamiliar with what the many black boxes they use actually look like on the inside, so if their system winds up eventually tickling limitations or quirks (which, as the complexity of the system they're building grows, becomes more and more likely), they have no idea what the hell is going on or what to do about it. In other words, too much zooming out, not enough zooming in, so you get work which has too many rough edges and not enough fine detail.

Salon had a highly insightful article some years back about this very topic as it pertains to software engineering: "The Dumbing Down of Programming", by Ellen Ullman, Part One [] and Part Two [] . She talks about the way too much knowledge is disappearing into code, and the problems that causes.

The picture is not static (1)

uw_dwarf (611383) | more than 11 years ago | (#5281165)

"Black box" thinking provides the ability to encapsulate or modularise certain functions. You can test the black box to the specification (if the specification is good enough), and determine whether the black box will do its job in the specified context correctly.

Enter "reuse." This can change the context in which the original black box was required to operate. It may work, but it may be degraded in some way (for example, a data management system designed for a year-old HP rp7410 server having to be shoehorned into a five-year-old K460). Or the black box will have to be modified to deal with a new set of conditions (what was a legitimate error in the old context is OK in the new one, and the black box has to have additional input to tell it which context applies). Either of these can be managed.

Things get difficult when this black box is picked up without careful examination and embedded in something completely unknown to the original black box design team. The specification for the black box may not match this new context, and things will break. What's changed? The picture, not the black box. It is not the designers' fault that the black box doesn't fit the new context perfectly. But designers must be prepared to adjust them to new contexts if appropriate. And design managers must be aware of this possibility.

In short, the black box methodology is fine, but it is limited in that it does not recognize changing contexts. Everyone involved in the design process must understand this.

yes, but more a management issue (1)

drteknikal (67280) | more than 11 years ago | (#5281167)

My sense is that this does happen, but not usually because of any flaw in the programmers or engineers. More often, it's the result of management not giving more information than is necessary to complete the task.

It's important to take the whole team, as a group, through the big picture. Even if it is just a short overview meeting, there is significant value in making sure that everyone knows that their assignment is part of a larger whole, showing how the pieces are intended to come together, and giving everyone a context for their individual bits.

My experience is that being given all the additional info doesn't take too much time, doesn't overwhelm anyone, and produces far more usable results. Letting everyone work in a vacuum, the other extreme, tends to cause integration nightmares and lots of wild tangents that make sense ONLY in the context of one little bit while working against the overall goals.

some food for thought (1)

mrtroy (640746) | more than 11 years ago | (#5281168)

First off, I'm just wondering, after reading these posts, how many people thought you were discussing the flight data storage in airplanes referred to as "black boxes" *grin*
On a serious note, in my undergraduate studies in computer science, we were first taught how everything works with hardware, then how to hardwire program it, then microprogramming, and so on. By the time you progress to the next step you cannot forget about how things work and relative efficiency; for example, whenever I do anything now I am always reminded of that damned microprogramming.
I dunno, if you are taking a black box approach to programming, chances are that you aren't in a coding position where you need to take hardware into account. If you are in a situation where you can't/shouldn't take a black box approach, I would hope that you have had enough schooling and/or experience/knowledge to take a more advanced approach.
But if you are using... doesn't matter what you do :P