FirmWarez asks: "I'm an embedded systems engineer. I've designed and programmed industrial, medical, consumer, and aerospace gear, and I was engineering manager at a contract design house for a while. The recent thread regarding the probable encryption box of the Columbia brought to mind a long-standing question. Do Slashdot readers think that the theories used to teach (and learn) programming lead to programmers who tend to approach problems with a 'black box', or 'virtual machine', mentality without considering the entire system? That, in and of itself, would explain a lot of security issues, as well as things as simple as user interface nightmares. Comments?"
"Back working on my undergrad (computer engineering) I remember getting frustrated at the comp-sci profs that insisted machines were simply 'black boxes' and the underlying hardware need not be a concern of the programmer.
Of course, in embedded systems that's not the case. When developing code for a medical device, you've got to understand how the hardware responds to a software crash. A number of Slashdot readers dogmatically responded with 'security through obscurity' quotes about the shuttle's missing secret box. While that criticism may have some validity, it does not respect the needs of the entire system: in this case, the difficulty of maintaining keys and equipment across a huge network of military equipment, personnel, and installations."
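To make the submitter's point about hardware-aware crash handling concrete: one standard embedded idiom is the hardware watchdog timer, where firmware must periodically "kick" a peripheral counter or the hardware forces a reset. This is a minimal desktop-runnable sketch; the register and timeout are simulated stand-ins for a real memory-mapped watchdog peripheral, and the names (`watchdog_kick`, `watchdog_tick`, etc.) are illustrative, not from any particular part.

```c
#include <stdbool.h>

/* Simulated watchdog peripheral. On real hardware this would be a
 * memory-mapped counter that pulls the reset line when it expires;
 * here plain globals stand in so the control flow can be shown. */

#define WATCHDOG_TIMEOUT 3      /* ticks allowed between kicks */

int  watchdog_counter = 0;
bool reset_occurred   = false;

/* Firmware calls this from its main loop to prove it is still alive. */
void watchdog_kick(void) {
    watchdog_counter = 0;
}

/* On real hardware this decrement/expiry happens in the peripheral
 * itself; here a timer-tick function models it. */
void watchdog_tick(void) {
    if (++watchdog_counter > WATCHDOG_TIMEOUT) {
        reset_occurred = true;  /* hardware would reset the MCU here */
    }
}
```

A healthy main loop that kicks the watchdog each iteration never trips it; a hung loop stops kicking, the counter runs past the timeout, and the hardware recovers the device on its own. That recovery path exists regardless of what the software believed it was doing, which is exactly why a "black box" view of the machine is insufficient for a medical device.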