
Back To 'The Future of Programming'

Soulskill posted about a year ago | from the coding-at-88-mph dept.

Programming 214

theodp writes "Bret Victor's The Future of Programming (YouTube video; Vimeo version) should probably be required viewing this fall for all CS majors — and their professors. For his recent DBX Conference talk, Victor took attendees back to the year 1973, donning the uniform of an IBM systems engineer of the times, delivering his presentation on an overhead projector. The '60s and early '70s were a fertile time for CS ideas, reminds Victor, but even more importantly, it was a time of unfettered thinking, unconstrained by programming dogma, authority, and tradition. 'The most dangerous thought that you can have as a creative person is to think that you know what you're doing,' explains Victor. 'Because once you think you know what you're doing you stop looking around for other ways of doing things and you stop being able to see other ways of doing things. You become blind.' He concludes, 'I think you have to say: "We don't know what programming is. We don't know what computing is. We don't even know what a computer is." And once you truly understand that, and once you truly believe that, then you're free, and you can think anything.'"

Short version? (-1)

Anonymous Coward | about a year ago | (#44523469)

That's a 30-minute video. Isn't there a CliffsNotes version for the internet age?

Re:Short version? (3, Funny)

oodaloop (1229816) | about a year ago | (#44523567)

What, TFS wasn't short enough for you?

Re:Short version? (-1)

Anonymous Coward | about a year ago | (#44523627)

TFS length wasn't the issue. It was the fact that it said very little. Just sounded like pretentious philosophical BS.

Re:Short version? (4, Interesting)

Alwin Henseler (640539) | about a year ago | (#44523763)

You must be new here. That "pretentious philosophical BS" is like the spark in a fuel-and-oxygen filled chamber. It ignites into a heap of comments, and those comments are the actual story. Who needs an article when you can browse +5 funny / informative / interesting and -1 trolls?

As for the linked articles, that's just a cleverly disguised DDoS botnet setup. Some figured it out, but few seem to care that the /. botnet is still operating. Heck, I'm even contributing people-time to it (on top of CPU cycles).

Re:Short version? (2)

BreakBad (2955249) | about a year ago | (#44524093)

...who needs an article when you can browse +5 funny / informative / interesting and -1 trolls?

Here at /. I just look at the pictures. Playboy, now that's a different story.

Re:Short version? (1)

AK Marc (707885) | about a year ago | (#44524321)

So it was a TED talk?

Re:Short version? (1)

Java Pimp (98454) | about a year ago | (#44523673)

Summary of TFS: There is no spoon.

Re:Short version? (1)

Anonymous Coward | about a year ago | (#44525171)

Wait, I thought Programming is Ruby, Computing is in the Cloud, and a Computer is something you rent from Amazon! Who cares about the underlying 40-year-old code that seems to just work, or why it's there? I won't get any VC if I don't have a wacky logo and cloudy this and that.

All I know about 1973 .. (4, Interesting)

bAdministrator (815570) | about a year ago | (#44523901)

.. is that C was seen as a major setback by Frances E. Allen and others.

It [C] was a huge setback for--in my opinion--languages and compilers, and the ability to deliver performance, easily, to the user.

Source:
Frances E. Allen
ACM 2006 Conference
http://www.youtube.com/watch?v=NjoU-MjCws4 [youtube.com]

The context here is abstraction, and whether users (programmers) should be allowed to play with pointers directly. C (and later C++) allow it, which is a setback for optimization, because of the assumptions about the underlying machine that get baked into the code.

If you want to learn more about the ideas of the 1960s and 1970s, I highly recommend looking up talks by Alan C. Kay ("machine OOP" which is Smalltalk in a nutshell), Carl Hewitt (actor model), Dan Ingalls, Frances E. Allen (programming language abstractions and optimization), Barbara Liskov ("data OOP" which is C++ in a nutshell), and don't stop there.

Re:All I know about 1973 .. (1)

phantomfive (622387) | about a year ago | (#44524255)

I highly recommend looking up talks by Alan C. Kay ("machine OOP" which is Smalltalk in a nutshell),

Do you have a specific talk in mind here? This sounds fascinating.

Re:All I know about 1973 .. (5, Informative)

bAdministrator (815570) | about a year ago | (#44524389)

The thing to get here is that there are basically two kinds of OOP, so to speak.

Here's a short discussion that covers it:
https://news.ycombinator.com/item?id=2336444 [ycombinator.com]

In Alan Kay's land, objects are sub-computers that receive messages from other sub-computers. In Barbara Liskov's world, objects are abstract data with operators and a hidden representation.

Kay OOP is closely related to the actor model by Carl Hewitt and others.

Liskov had her own idea of OOP, and she was not aware of Smalltalk (Kay, Ingalls) at the time. She started work on her own language, CLU, at the same time Smalltalk was being developed.
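
As a loose illustration of the distinction (a sketch only, in C++ rather than Smalltalk or CLU, with made-up class names, and capturing neither Smalltalk's late binding nor the actor model's concurrency), a Liskov-style abstract data type exposes a fixed, typed interface over a hidden representation, while a Kay-style object behaves like a little computer that interprets messages at run time:

```cpp
#include <iostream>
#include <string>

// Liskov-style "data OOP": an abstract data type -- a hidden representation
// behind a fixed set of typed operations, checked at compile time.
class Counter {
public:
    void increment() { ++value_; }
    int  get() const { return value_; }
private:
    int value_ = 0;
};

// Kay-style "machine OOP": the object is a little sub-computer that
// interprets messages at run time; unknown messages are its problem to
// handle (a crude stand-in for Smalltalk's doesNotUnderstand:).
class MessageCounter {
public:
    int receive(const std::string& message) {
        if (message == "increment") return ++value_;
        if (message == "get")       return value_;
        return -1;  // "message not understood"
    }
private:
    int value_ = 0;
};

int main() {
    Counter c;
    c.increment();
    std::cout << c.get() << "\n";            // 1: resolved by the type system

    MessageCounter m;
    m.receive("increment");
    std::cout << m.receive("get") << "\n";   // 1: resolved by the object itself
    std::cout << m.receive("reset") << "\n"; // -1: message not understood
}
```

Neither half captures the concurrency in Kay's and Hewitt's picture, where each object conceptually runs as its own little machine.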

Re:All I know about 1973 .. (1)

lgw (121541) | about a year ago | (#44526219)

Sure, if you're doing high-level programming (and plenty were in the 70s, just as today), C is a bad tool. If you're writing an I/O driver and the hardware works through updates to specific memory addresses, well, you need to be aware of pointers.

I see the biggest failing of C itself as this notion of "int", where you don't know how many bits that is. If you're writing the kind of code that belongs in C, you have to know that, and endless 16-32 and 32-64 bit porting nightmares were the result. It wasn't until C99 that int32_t became standard.

The mistake was mistaking "you can write a program that will compile for all platforms" for "your program will do what you expect on all platforms for which it compiles". The latter being rather more useful.
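
A minimal sketch of the point, using C++'s <cstdint> (which inherits C99's fixed-width types); the difference between "int" and the explicitly sized types is visible directly:

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    // "int" is only guaranteed to be at least 16 bits wide; its actual
    // size depends on the platform and ABI, which is what made the
    // 16->32 and 32->64 bit ports painful.
    std::printf("sizeof(int)     = %zu bytes\n", sizeof(int));

    // The fixed-width types from C99's <stdint.h> (C++'s <cstdint>)
    // say exactly what they mean.
    std::printf("sizeof(int32_t) = %zu bytes\n", sizeof(std::int32_t));
    std::printf("sizeof(int64_t) = %zu bytes\n", sizeof(std::int64_t));
    return 0;
}
```

On a typical LP64 Linux box this prints 4, 4, and 8; on a 16-bit compiler the first line would say 2, which is exactly the portability trap being described.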

Re:Short version? (0)

Anonymous Coward | about a year ago | (#44524435)

you are exactly what is wrong with the so called internet age

Re:Short version? (0)

Anonymous Coward | about a year ago | (#44524661)

Moreover, that isn't the correct uniform for an IBM systems engineer of those days. He's missing the nonconductive tie-clip, the pocket protector, and the pinstripes. And an IBM systems engineer would always show up with a jacket, and take it off when they started working.

I knew it (3, Funny)

ArcadeMan (2766669) | about a year ago | (#44523547)

Every time some stupid colleagues of mine told me I was doing it wrong, I kept thinking they were close-minded idiots.

Turns out, I was right all along!

Open the blinds (0)

Anonymous Coward | about a year ago | (#44523577)

A beginning mind that is uncluttered by expectations. Nice concept; it works well in racing too.

Re:Open the blinds (1)

robot256 (1635039) | about a year ago | (#44523667)

This is why no one ever reads patents before violating them. *ducks*

Re:Open the blinds (0)

ArcadeMan (2766669) | about a year ago | (#44524923)

If patents didn't want to be violated they shouldn't wear short skirts, they're just asking for it. /duck

They always fall the first time. (0)

Anonymous Coward | about a year ago | (#44523619)

Right Trin?

Hmm (5, Insightful)

abroadwin (1273704) | about a year ago | (#44523635)

Yes and no, I think.

On the one hand, it is a good thing to prevent yourself from constrained thinking. I work with someone who thinks exclusively in design patterns; it leads to some solid code, in many cases, but it's also sometimes a detriment to his work (overcomplicated designs, patterns used for the sake of patterns).

Unlearning all we have figured out in computer science is silly, though. Use the patterns and knowledge we've spent years honing, but use them as tools and not as crutches. I think as long as you look at something and accurately determine that a known pattern/language/approach is a near-optimal way to solve it, that's a good application of that pattern/language/approach. If you're cramming a solution into a pattern, though, or only using a language because it's your hammer and everything looks like a nail to you, that's bad.

Re:Hmm (4, Insightful)

orthancstone (665890) | about a year ago | (#44524025)

Use the patterns and knowledge we've spent years honing, but use them as tools and not as crutches.

Having just watched this video a few hours ago (it sat in my queue for a few days; providence was seemingly on my side to watch it right before this story popped), I can say he argues against this very idea.

He mentions late in the talk how a generation of programmers learned very specific methods for programming, and in turn taught the next generation of programmers those methods. Because the teaching only involved known working methods and disregarded any outlying ideas, the next generation believes that all programming problems have been solved and therefore never challenges the status quo.

Much of his talk references the fact that many of the "new" ideas in computing were actually discussed and implemented in the early days of programming. Multiple core processing, visual tools and interactions, and higher level languages are not novel in any way; he's trying to point out that the earliest programmers had these ideas too, but we ignored or forgot them due to circumstances. For example, it is difficult to break out of the single processing pipeline mold when one company is dominating the CPU market by pushing out faster and faster units that excel at exactly that kind of processing.

While TFS hits on the point at hand (don't rest on your laurels), it is worth noting that the talk is trying to emphasize open-mindedness towards approaches to programming. While that kind of philosophical take is certainly a bit broad (most employers would rather you produce work than redesign every input system in the office), it is important that innovation still be emphasized. I would direct folks to look at the Etsy "Code as Craft" blog as an example of people who are taking varying approaches to solving problems by being creative and innovating instead of simply applying all the known "best practices" on the market.

I suppose that final comment better elaborates this talk in my mind: Don't rely on "best practices" as if they are the best solution to all programming problems.

Re:Hmm (0)

Anonymous Coward | about a year ago | (#44524289)

Sounds very similar to the problem of dogma in mainstream economics.

Re:Hmm (1)

Kal Zekdor (826142) | about a year ago | (#44524299)

Much of his talk references the fact that many of the "new" ideas in computing were actually discussed and implemented in the early days of programming. Multiple core processing, visual tools and interactions, and higher level languages are not novel in any way; he's trying to point out that the earliest programmers had these ideas too, but we ignored or forgot them due to circumstances. For example, it is difficult to break out of the single processing pipeline mold when one company is dominating the CPU market by pushing out faster and faster units that excel at exactly that kind of processing.

I can attest to this. The phrase "Everything old is new again." (Or "All of this has happened before, and all of this will happen again." for you BSG fans) is uttered so frequently in our office that we might as well emblazon it on the door. It's almost eerie how well some of the ideas from the mainframe era fit into the cloud computing ecosystem.

Re:Hmm (0)

Anonymous Coward | about a year ago | (#44525197)

One potential reason for this is that a high number of people never touched a mainframe, or even a "mini". They started with a C64 or PC, and got to know that kind of computing. Mainframes were big, expensive, and you in essence needed a degree to get to work on them.

Re:Hmm (1)

jythie (914043) | about a year ago | (#44525121)

As with so many things, it is a matter of balance. We now have what, 60 years or so of computer science under our collective belts, and there are a lot of good lessons learned in that time... but on the downside, most people only know (or choose to see) a subset of that knowledge and over-apply some particular way of doing things... then they get promoted, and whatever subculture within CS they like becomes the dogma for where they work.

Re:Hmm (2)

TsuruchiBrian (2731979) | about a year ago | (#44526091)

Designs are only complicated when they are unique. If I write my own LinkedHashMap to store 2 values, it is overcomplicated. If I just invoke a standard Java LinkedHashMap to store 2 values, then it's the same design, but since everyone knows what a Java LinkedHashMap does, it is simple. Also, it can be swapped out for a simple array with relative ease if the code is designed in a way that is maintainable.

Even if you are using design patterns, you should be leveraging not just the knowledge that other people have acquired, but also the APIs and routines available in libraries. This way, switching your program from using a simple design to a complicated design pattern is easy.

It shouldn't matter that you are using an overly fancy tool to solve a job that can be solved by a simple tool. Electrons are free. What isn't free is human time and effort. So just be sure not to design your application in a way that makes it hard to change.
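
The same point can be sketched in C++ (there is no exact LinkedHashMap equivalent, so std::unordered_map stands in, and the Settings class below is purely hypothetical): lean on a standard container everyone already knows, and keep it behind a small interface so swapping it for a plain array later only touches one place.

```cpp
#include <string>
#include <unordered_map>

// Hiding the storage choice behind a tiny interface: callers don't care
// whether the two settings live in a standard container or a plain array.
class Settings {
public:
    void set(const std::string& key, int value) { values_[key] = value; }
    int  get(const std::string& key) const { return values_.at(key); }
private:
    // A well-known standard container; swapping it for a two-element
    // array of key/value pairs later only touches this class.
    std::unordered_map<std::string, int> values_;
};

int main() {
    Settings s;
    s.set("width", 640);
    s.set("height", 480);
    return s.get("width") == 640 ? 0 : 1;
}
```

The point is the same as the Java one above: the container is boring and well known, and the decision to replace it stays local.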

Would you happen to be an InfoSys trainer? (5, Funny)

xxxJonBoyxxx (565205) | about a year ago | (#44523647)

>> We don't know what programming is. We don't know what computing is. We don't even know what a computer is.

Aha - they found the guy who trains InfoSys employees.

70s yeah right! (3, Funny)

rvw (755107) | about a year ago | (#44523649)

The future of programming, from the seventies, it's all hippie talk...

"We don't know what programming is. We don't know what computing is. We don't even know what a computer is." And once you truly understand that, and once you truly believe that, then you're free, and you can think anything.'"

Next thing we can throw our chairs out and sit on the carpet with long hair, smoke weed and drink beer....

Re:70s yeah right! (4, Insightful)

Zero__Kelvin (151819) | about a year ago | (#44523691)

"Next thing we can throw our chairs out and sit on the carpet with long hair, smoke weed and drink beer...."

If you aren't doing it that way already, then you're doing it wrong.

Re:70s yeah right! (2)

Michael Casavant (2876793) | about a year ago | (#44523731)

So... Steve Ballmer got part of it right? I mean, throwing chairs is his specialty, right?

Re:70s yeah right! (4, Interesting)

phantomfive (622387) | about a year ago | (#44523847)

The future of programming, from the seventies, it's all hippie talk...

What you don't understand is, in ~1980 with the minicomputer, Computer Engineering got set back decades. Programmers were programming with toggle switches, then stepped up to assembly, then started programming with higher level languages (like C). By the 90s objects started being used, which brought the programming world back to 1967 (Simula). Now mainstream languages are starting to get first-class functions. What a concept, where has that been heard before?

Pretty near every programming idea that you use daily was invented by the 80s. And there are plenty of good ideas that were invented back then that still don't get used much.

My two favorite underused (old) programming ideas:

Design by contract.
Literate programming.

If those two concepts caught on, the programming world would be 10 times better.
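
As a minimal sketch of the first of those ideas, design by contract can be approximated in C++ with plain assertions (Eiffel makes preconditions, postconditions, and invariants first-class language features; here they are only a convention, and the Account class is hypothetical):

```cpp
#include <cassert>

// A design-by-contract flavour in plain C++: preconditions, postconditions,
// and a class invariant expressed with assert().
class Account {
public:
    explicit Account(long initial) : balance_(initial) {
        assert(initial >= 0 && "precondition: no negative opening balance");
        assert(invariant());
    }

    void withdraw(long amount) {
        assert(amount > 0 && "precondition: positive amount");
        assert(amount <= balance_ && "precondition: sufficient funds");
        long old_balance = balance_;

        balance_ -= amount;

        assert(balance_ == old_balance - amount && "postcondition: exact debit");
        assert(invariant());
    }

    long balance() const { return balance_; }

private:
    bool invariant() const { return balance_ >= 0; }  // class invariant
    long balance_;
};

int main() {
    Account a(100);
    a.withdraw(40);
    return a.balance() == 60 ? 0 : 1;
}
```

Since assert() compiles out under NDEBUG, a production-grade version would use a dedicated contract macro or exceptions; the sketch only shows where the contracts live relative to the code they guard.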

Re:70s yeah right! (0)

Anonymous Coward | about a year ago | (#44524047)

I really gave literate programming a try.
It is nice for small programs.

Re:70s yeah right! (1)

DamonHD (794830) | about a year ago | (#44524193)

I did an entire thesis with Tangle and Weave, and I'm glad that I did, but I'm not convinced that a narrative exposition is any better than the more random-access style you get from a hierarchical directory layout with some decent (embedded and out-of-line) documentation and a viewer IDE.

Rgds

Damon

Re:70s yeah right! (4, Insightful)

DutchUncle (826473) | about a year ago | (#44524223)

In college in the 1970s, I had to read the Multics documents and von Neumann's publications. We're still reinventing things that some very clever people spent a lot of time thinking about - and solving - in the 1960s. It's great that we have the computer power and memory and graphics to just throw resources at things and make them work, but imagine how much we could make those resources achieve if we used them with the attitude those people had towards their *limited* resources. And we have exactly the same sets of bottlenecks and tradeoffs; we just move the balance around as the hardware changes. Old ideas often aren't *wrong*, they're just no longer appropriate - until the balance of tradeoffs comes around again, at which point those same ideas are right again, or at least useful as the basis for new improved ideas.

Re:70s yeah right! (1)

murdocj (543661) | about a year ago | (#44524947)

Speaking of setting programming back, the current push in languages to get rid of declaring types of variables and parameters has set us back a few decades. In languages like Ruby, you can't say ANYTHING about your code without executing it. You have no idea what type of objects methods receive or return, whether formal and actual parameters match, or whether types are compatible in expressions, etc. I actually like a lot of aspects of Ruby, but it seems like it's thrown away about 50 years of improvement in programming language development.

Re:70s yeah right! (1)

jbolden (176878) | about a year ago | (#44525297)

Ruby is actually rather strongly typed. Shell is far more like what you are describing.

Re:70s yeah right! (2)

murdocj (543661) | about a year ago | (#44525783)

Yes and no. It's true that objects have classes, but that's entirely malleable, and there's no way to look at a particular piece of Ruby code and have any idea what class an object has, unless you actually see it being created (yes, yes, even then you don't know because classes can be modified on the fly, but let's ignore that for the moment). Basically, I can't look at a method and do anything except guess what parameters it takes. Personally, I think that's a bad thing.

Re:70s yeah right! (3)

Shados (741919) | about a year ago | (#44525601)

I'm a static language guy myself, but it's important to keep in mind that different problems have different solutions.

Doing heavy image processing or transactional operations, number crunching, I/O with third party APIs, etc.? Yeah, static languages are probably better.

Doing prototyping, or UI intensive work? Most UI frameworks suck, but the ones designed for static languages generally suck more, because some stuff just can't be done (well), so they have to rely on data binding expressions, strings, etc., that are outside the control of the language. At least dynamic languages deal with those like they deal with everything else and have them as first class concepts.

Case in point: an arbitrary JSON string, in a dynamic language, can be converted to any standard object without needing to know what it will look like ahead of time. In a static language, you either need a predesigned contract, or you need a mess of a data structure full of strings that won't be statically checked, so you're back at square one. These types of use cases are horribly common in UI.
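
A rough sketch of the static-language side of that argument, assuming the third-party nlohmann/json library is available (the User struct, parse_user function, and field names are all made up): you either declare the shape up front, or you fall back to stringly typed lookups the compiler cannot check.

```cpp
#include <nlohmann/json.hpp>  // third-party single-header JSON library (assumed available)
#include <iostream>
#include <string>

// Option 1: the "predesigned contract" -- a shape you must know ahead of time.
struct User {
    std::string name;
    int age;
};

User parse_user(const nlohmann::json& j) {
    return User{j.at("name").get<std::string>(), j.at("age").get<int>()};
}

int main() {
    auto j = nlohmann::json::parse(R"({"name":"Ada","age":36})");

    // Option 2: the "data structure full of strings" -- a typo in "name"
    // here compiles fine and only fails at run time.
    std::cout << j["name"].get<std::string>() << "\n";

    User u = parse_user(j);
    std::cout << u.name << " is " << u.age << "\n";
    return 0;
}
```

The dynamic-language side needs neither the struct nor the .get<...>() calls, which is the poster's point; the trade-off is that nothing catches the typo before the code runs.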

Re:70s yeah right! (1)

SuricouRaven (1897204) | about a year ago | (#44525829)

I dabble in image processing algorithms. A lot of the things I write for my own use end up being a C program to do the serious number crunching, with a perl script for the interface. Perl does all the pretty work, then just calls upon the compiled executable when serious performance is required.

Re:70s yeah right! (1)

Greyfox (87712) | about a year ago | (#44526289)

You still need to know what your JSON string will look like at some point in order to use it. It's always (for at least as long as I've been programming, a bit over 2 decades) been a problem that programmers don't fully know or understand their requirements, so they try to keep their code as generic as possible. The problem with that is that at some point you're going to have to do actual work with that code, so you end up going through a labyrinth of libraries, none of which want to take the responsibility to actually do anything. All because they thought they might want to use that library for something else at some point. When design patterns guys/agile guys talk about "you aren't going to need it," this is what they're talking about. Code things you need right now and let the future sort itself out.

I was just talking to a guy today who's in the process of writing an entire fucking REST framework in C++ for an embedded system because his clients think they might want a fucking pony at some point in the future. Sure, they could just get some concrete requirements for what they need right now and design a system that will be smaller, faster and better than what he's building. Doesn't seem like anyone stopped to think about that.

Re:70s yeah right! (2)

jbolden (176878) | about a year ago | (#44525281)

Design by contract is my favorite way of handling interfaces. It really is a good idea.

Literate programming, though, I'm not sure I see much point to. There are cool examples like Mathematica notebooks, but in general even very good implementations like Perl's POD and Haskell's literate mode just don't seem to offer all that much over normal source code. API documentation just doesn't need to be that closely tied to the underlying source, and the source documentation just doesn't need to be literate.

As for your 1990s and objects, I also disagree. Objects were used for implicit parallelism and complex flow of control. No one had flows of control like a typical GUI to deal with in 1967. Event programming was a hard problem solved well.

As for PCs pushing back languages in general, I'd agree. Mobile is doing the same now: by forcing people to think about writing programs to minimize electrical usage, they are having to deal with low-level details again.

Re:70s yeah right! (0)

Anonymous Coward | about a year ago | (#44524473)

wait, what?

I always thought programming always involves weed and beer and the occasional throwing of chairs?!

my worldview is shattered...

Re:70s yeah right! (0)

Anonymous Coward | about a year ago | (#44524745)

I always thought programming always involves weed and beer and the occasional throwing of chairs?!

This literally describes my college senior design class.

Over-generalization error in line 4 (1)

Zero__Kelvin (151819) | about a year ago | (#44523655)

"'The most dangerous thought that you can have as a creative person is to think that you know what you're doing,' ... 'Because once you think you know what you're doing you stop looking around for other ways of doing things and you stop being able to see other ways of doing things. You become blind.' "

Unless of course you know you know what you are doing, because you also know to never stop looking for new ways of doing things.

Re:Over-generalization error in line 4 (1)

gweihir (88907) | about a year ago | (#44524035)

But if you know what you are doing, you still have a majority with no clue around you, in the worst case micro-managing you and destroying your productivity. I think the major difference between today and the early years of computing is that most people back then were smart, dedicated and wanted real understanding. Nowadays programmers are >90% morons or at best semi-competent.

Re:Over-generalization error in line 4 (1)

Zero__Kelvin (151819) | about a year ago | (#44524279)

I couldn't agree with you more. My estimate has traditionally been about 80% [slashdot.org], but I concede that I may be a bit of an optimist.

Re:Over-generalization error in line 4 (1)

gweihir (88907) | about a year ago | (#44524873)

Well, I did include the semi-competent, those that eventually do get there, with horrible code that is slow, unreliable, a resource-hog and a maintenance nightmare. Plain incompetent may indeed just be 80%. Or some current negative experiences may be coloring my view.

Re:Over-generalization error in line 4 (1)

AK Marc (707885) | about a year ago | (#44524395)

Unless of course you know you know what you are doing, because you also know to never stop looking for new ways of doing things.

If you have to look for a new way to do something, then you don't know the answer, so how can you know you know what you are doing when you know you don't know the answer? When you are 100% confident in the wrong answer, you know you know what you are doing (and are wrong). If *ever* you know you know what you are doing, you don't.

Re:Over-generalization error in line 4 (1)

Zero__Kelvin (151819) | about a year ago | (#44524967)

"If you have to look for a new way to do something, then you don't know the answer,"

I'm not a big enough moron to think that there is one answer that can be called the answer. Your mileage clearly varies.

" If *ever* you know you know what you are doing, you don't."

Slashdot really needs a -1 anti-insightful option.

Re:Over-generalization error in line 4 (0)

AK Marc (707885) | about a year ago | (#44525337)

Mods, for people too stupid to post.

Re:Over-generalization error in line 4 (0)

Zero__Kelvin (151819) | about a year ago | (#44525449)

"Mods, for people too stupid to post."

Clearly you and mods were made for each other ;-).

Epic facepalm (2)

girlintraining (1395911) | about a year ago | (#44523659)

The most dangerous thought that you can have as a creative person is to think that you know what you're doing,' explains Victor.

Yeah. I bet Vincent Van Gogh thought he was total shit at painting, didn't know anything about paint mixing, brushes, or any of that. Look, I know what you're trying to say, Victor, but what you actually said made my brain hurt.

However, exploring new things and remembering old things are two different things. You can be good at what you do and yet still have a spark of curiosity to you and want to expand what you know. These aren't mutually exclusive. To suggest people murder their own egos in order to call themselves creative is really, really, fucking stupid.

You can, in fact, take pride in what you do, and yet be humble enough to want to learn more. It happens all the time.. at least until you're promoted to management.

Re:Epic facepalm (1)

DNS-and-BIND (461968) | about a year ago | (#44523913)

Well, you have to realize that the whole point of this exercise is to draw a line between "creative people" and "the other". We creative people are the good ones...those others...gosh, they're capable of violence. We good people are the correct ones, and it is not evil to look down on the humble folk. After all, what kind of creative would call himself fearful of people who can't create so much as a scrapbook unless they're following an example from youtube posted by...a creative.

Look, Vincent van Gogh is a role model emulated by creatives worldwide. The fact is, there are way too many non-creatives and they are screwing up the planet. Just imagine how much better the world would be if every member of the Tea Party suddenly disappeared overnight. Oh, we can dream....

Re:Epic facepalm (1)

girlintraining (1395911) | about a year ago | (#44523983)

We creative people are the good ones...those others...gosh, they're capable of violence.

I don't see how the two are mutually exclusive. Oh, the creative ways I murder people in my fantasies!

After all, what kind of creative would call himself fearful of people who can't create so much as a scrapbook unless they're following an example from youtube posted by...a creative.

Depends. Are they armed with just a scrapbook and a laptop, or something more substantial?

The fact is, there are way too many non-creatives and they are screwing up the planet. Just imagine how much better the world would be if every member of the Tea Party suddenly disappeared overnight. Oh, we can dream....

A true creative doesn't want people dropping dead or disappearing... they want them doing something useful and productive so they don't have time to "screw up the planet."

Re:Epic facepalm (1)

Anonymous Coward | about a year ago | (#44524979)

A true creative person doesn't say things like "A true creative person does X," because that imposes a restriction on future interpretations of "a true creative person" and destroys creativity.

Re:Epic facepalm (2)

Kal Zekdor (826142) | about a year ago | (#44524357)

The most dangerous thought that you can have as a creative person is to think that you know what you're doing,' explains Victor.

Yeah. I bet Vincent Van Gogh thought he was total shit at painting, didn't know anything about paint mixing, brushes, or any of that.

Um... yes, actually. Van Gogh sold only one painting in his entire life, and he considered himself somewhat of a failure as a painter. He did not become famous until after his death.

Re:Epic facepalm (0)

Anonymous Coward | about a year ago | (#44525879)

shh. don't ruin the defense of his dogma. you'll scare him and he'll start barking "the right tool for the right job!"

Re:Epic facepalm (2)

Quila (201335) | about a year ago | (#44524521)

When I was doing design work, my mentor taught me the rules and told me to stay within them. After you've mastered the rules, learning the successes and mistakes of everybody before, then you can start breaking them as you explore new possibilities.

I am afraid this will convince people who know nothing yet to just go off in whatever direction they please, wasting massive amounts of time on things others already learned not to do, and subjecting others to more horrible code.

Re:Epic facepalm (1)

Trifthen (40989) | about a year ago | (#44525685)

That's guaranteed to happen. The only question is the extent. There's bound to be a few who say, "Hey! I don't have to know what I'm doing. That one guy said so!" In reality, we know different. Progress is made by learning from the mistakes of others. :)

Re:Epic facepalm (1)

AK Marc (707885) | about a year ago | (#44524801)

Yeah. I bet Vincent Van Gogh thought he was total shit at painting,

He probably did. He died a commercial failure. Reviews at the time were very critical of his work.

Time for real apprenticeships in tech and not year (4, Interesting)

Joe_Dragon (2206452) | about a year ago | (#44523661)

Time for real apprenticeships in tech and not years of theory?

Re:Time for real apprenticeships in tech and not y (0)

Anonymous Coward | about a year ago | (#44523959)

But how will universities make a profit that way?

Re:Time for real apprenticeships in tech and not y (1)

Anonymous Coward | about a year ago | (#44523965)

I think most companies would rather have the slavery of unpaid internships than actually have to train people.

can't have unpaid internships any more (1)

Joe_Dragon (2206452) | about a year ago | (#44524123)

And what about internships without the university part?

Re:Time for real apprenticeships in tech and not y (1)

gweihir (88907) | about a year ago | (#44524055)

No. Time for real theory coupled with real experience. Apprenticeships only work when the profession is ruled by real craftsmen. The programmers today rarely qualify, hence real apprenticeships would only make things worse.

Re:Time for real apprenticeships in tech and not y (1)

Shados (741919) | about a year ago | (#44525479)

Won't change much. Even the "real theory" is half-assed except in a select few colleges, usually (but not always) the high-end ones. Then the professors who are good at the theory are usually impossibly terrible at the engineering aspect but still pass on their words as laws.

It's really an awkward situation.

Yeah but (0)

Anonymous Coward | about a year ago | (#44523669)

When you adopt a Blue Sky attitude, it's hard to actually accomplish anything worthwhile.

Even Shakespeare, Bach, Mozart and Einstein labored within a highly established scholarly tradition.

computer (-1)

Anonymous Coward | about a year ago | (#44523685)

[blogspot.com]

Patents (5, Insightful)

Diss Champ (934796) | about a year ago | (#44523715)

One reason I had so many patents relatively early in my career is I wound up doing hardware design in a much different area than I had planned on in school. I did not know the normal way to do things. So I figured out ways to do things.
Sometimes I wound up doing stuff normally but it took longer, this was OK as a bit of a learning curve was expected (they hired me knowing I didn't know the area yet).
Sometimes I did things a bit less efficiently than ideal, though this was usually fixed in design reviews.
But sometimes I came up with something novel, and after checking with more experienced folks to make sure it was novel, patented it.

A decade later, I know a way to do pretty much everything I need to do, and get a lot fewer patents. But I finish my designs a lot faster :).

You need people who don't know that something isn't possible to advance the state of the art, but you also need people who know the lessons of the past to get things done quickly.

Those days are LONG gone. (1)

Anonymous Coward | about a year ago | (#44524177)

they hired me knowing I didn't know the area yet

Those days are long gone.

And it's interesting that you actually innovated while you were learning. That's something that corporate America should consider.

They won't.

Today is just the same old same old, but with different names and pseudo-innovation. Coupled with old farts pretty much being thrown out, the youngsters actually believe the rehash tech they're working with is "new" and "innovative" - and then they wonder why they get sued for patent infringement.

It's not necessarily the IP laws: you're just reinventing the wheel and calling it a "ground synergistic scalable momentum enabling device" and the guy who owns the wheel patent has an issue with that.

Not always - but usually.

"We don't know what programming is" (0)

Anonymous Coward | about a year ago | (#44523723)

"We don't know what programming is. We don't know what computing is. We don't even know what a computer is."

I'll try that at my next job interview; they're sure to hire me with that open-minded attitude.

We don't even know what a "job" is (1)

Latent Heat (558884) | about a year ago | (#44524393)

Yeah.

We don't even know what "employment" is, what a "salary" is, and what "benefits" are . . .

Wow man (3, Funny)

jackjumper (307961) | about a year ago | (#44523739)

I need some more bong hits to fully consider this

Become One with the WTF (1)

Trifthen (40989) | about a year ago | (#44523793)

'I think you have to say: "We don't know what programming is. We don't know what computing is. We don't even know what a computer is." And once you truly understand that, and once you truly believe that, then you're free, and you can think anything.'

I agree having an open mind is a good thing. There is, of course, taking things too far. Just throw away everything we've spent the last 40-50 years developing? Is there some magical aura we should tap into, and rock back and forth absorbing instead? Should we hum esoteric mantras under the enlightening influence of various chemical enhancements awaiting Computing Zen?

Someone watched The Matrix one too many times. There is no spoon!

Re:Become One with the WTF (1)

phantomfive (622387) | about a year ago | (#44524057)

We don't even know what a computer is.

Think of it like this. If you believe you already know what a computer is, then you are not likely to look for alternatives. If you're looking for alternatives, then you might come up with something interesting like this [hackaday.com]. If you just accept that super-scalar pipelines, the way Intel does them, are the best way, then you're not going to find a different, potentially better way of doing it.

Re:Become One with the WTF (1)

Trifthen (40989) | about a year ago | (#44524295)

Far from it. I seem to recall a researcher I read about over a decade ago who was designing a chip that worked more like a human neuron. Superscalar pipelines are just how Intel does instructions, and even they're trying to get away from them due to cache misses becoming more expensive as pipeline lengths increase. Having a talk on not being constrained by accepted dogma, and outright throwing away all known concepts, are completely different things.

The very fact that you and I can even have this conversation is because we know what those things mean. We know they've been tried. We know their limitations and strengths. We know there are alternatives. Having a strong grounding makes it possible to progress, even through occasional setbacks. Standing on the shoulders of giants, and all that. Throwing away everything you know and starting from scratch is very romantic, but it isn't very practical if you want to collaborate with others.

Re:Become One with the WTF (0)

Anonymous Coward | about a year ago | (#44525277)

That works well if applied on the proper level. If my job is to parse data in a defined format, I'm not going to get anywhere by saying 'What if we used a different architecture in our CPU?' I will get somewhere by saying 'What if we thought beyond SAX?'

How's that working out? (0)

Anonymous Coward | about a year ago | (#44523937)

> it was a time of unfettered thinking, unconstrained by programming dogma, authority, and tradition.

Uh-huh. And tell me why so many modern websites still store their password database in the clear instead of as salted hashes. Dogma, authority, and tradition are also known as "best practices."

Too simplistic (1)

gweihir (88907) | about a year ago | (#44523979)

We do know a number of things about programming. One is that it is hard and that it requires real understanding of what you intend to do. Another is that languages help only to a very, very limited degree. There is a reason much programming work is still done in C: once people know what they are doing, language becomes secondary, and often it is preferable if it does not help you much but does not stand in your way either. We also know that all the 4GL and 5GL hype was BS and that no language will ever make the (today prevalent) semi-competent or incompetent programmer into a productive, highly competent programmer. We know that OO done well (Eiffel, Python, some few others) is nice, but that it still requires real understanding from the people using it or it is worse than not having it. And we know that it is easy to get OO badly wrong (C++, Java, etc.), making it counter-productive. We also know that architecture and design can be and often are more critical than implementation. And we know that many "modern" languages cause more problems than they solve because of performance issues, code bloat, bad readability, etc. (Yes, Java, I am looking in your direction.) We also know that design patterns are nice in theory, but do not hold up in practice, mostly because most people calling themselves programmers have no clue about them, but also because patterns only solve part of the problem.

So, no, I don't think there is a problem with programming. I am however convinced that there is a problem with the majority of the people calling themselves "programmers" these days. A real eye opener is this: http://www.codinghorror.com/blog/2010/02/the-nonprogramming-programmer.html [codinghorror.com]
Even those regarded as "competent" often take months to accomplish what really competent people can do in days or even hours. I see this regularly.

"We" know this? Don't speak for the rest of us. (0)

Anonymous Coward | about a year ago | (#44525113)

Go ahead and write your massive S/W system in C. See how much longer it takes to write and validate than the equivalent in Java. You know, I too have been writing code since way back, and I do not miss for one second tracking down memory leaks, or the cause of a null pointer access that *crashed the entire program* instead of logging the null pointer exception and continuing, as Java would do. Lastly, if Java and C++ are so worthless, why is the majority of the S/W you run written in them? Obviously, it is getting the job done.

'Back to the Future' of Programming (1)

stillnotelf (1476907) | about a year ago | (#44524107)

I was intrigued until I noticed that where I put the quote marks, and where the quote marks actually were, were not the same place. So much for the "Mr. Perl Extreme-Fluxing Agile Capacitor."

better feel for stability in the 70s/80s (1)

a2wflc (705508) | about a year ago | (#44524141)

Most people I worked with in the 80s (and learned from in the 70s) had a good feel for concepts like "stable systems", "structural integrity", "load bearing weight", and other physical engineering concepts. Many from engineering degrees (most of them weren't CS grads like me), and a lot from playing with legos, erector sets, chemistry sets, building treehouses (and real houses). These concepts are just as important in software systems, but I can only think of a handful of people I've worked with over the last 20 years who had a feel for the stability of a system (physical or software) or an ability to find system weaknesses when a bug is found rather than fix a programming error.

That's very important for development time and quality. To go fast you need to know where it's important to go slow. You have to know what's important to get right at the start (structurally) so you can change requirements as needed and not risk breaking the system or requiring a lot of rewriting (or refactoring). Your framework should be a stable "frame" for the system (like a building or car), not a set of libraries you cobble together for speed of implementation. After deploy, "bugs" are easy to fix but system weaknesses are not.

On the other hand, a lot of things have improved. Tools, methods, and specializations allow a team to be comprised of some people who understand systems (architects, senior developers) and others who specialize in certain areas (html, db, communication protocols, builds, etc). And there are many more people available who are capable in specific areas so far more teams can exist doing many more applications. If we only had the same percentage of people writing software now as in the 70s & 80s and those people had the backgrounds developers had then, we'd be producing better software but orders of magnitude less of it.

Software reinvents the wheel (2)

trout007 (975317) | about a year ago | (#44524185)

I find it interesting that people in software think they are the first ones to ever design complicated things. It seems there are so many arguments over design styles and paths. All they need to do is look at what other engineering fields have done for the past 100+ years. It's pretty simple. When you are working on a small project where the cost of failure and rework is low, you can do it however you want. Try out new styles and push technology and techniques forward. When it comes to critical infrastructure, and projects where people will die or lose massive amounts of money, you have to stick with what works. This is where you need all of the management overhead of requirements, schedules, budgets, testing, verification, operation criteria, and the dozens of other products besides the "design".

I'm a mechanical and a software engineer. When I'm working on small projects with direct contact with the customers it's easy and very minimal documentation is needed. But as more people are involved the documentation required increases exponentially.

Re:Software reinvents the wheel (0)

Anonymous Coward | about a year ago | (#44525921)

people in software think they are the first ones to ever design complicated things.

This is because software is so accessible, solving the easy problems leads to wanting to solve the hard problems, and a lot of the "hard" problems are just as solvable now as they were in the 60s. It's a fun, self-directed ride to go solo through all of this.

People can embark on that journey and find success without ever being required to know history. It's still a shame, though. At least with exposure to concepts on the internet in general, and specifically on Wikipedia, the odds of somebody stumbling across old concepts & solutions are greater than they've ever been before.

This guy (0)

Anonymous Coward | about a year ago | (#44524205)

Don't have nothing on Douglas Engelbart

I agree with him in principle (0)

Anonymous Coward | about a year ago | (#44524257)

But how do you do that and still make a decent living? The glory days, at least for now, seem to have been replaced by corporate slavery.

"We don't even know what a computer is." (0)

Anonymous Coward | about a year ago | (#44524271)

And once you truly understand that, and once you truly believe that, then you're never going to get past a job interview ever again.

Mature fields have only marginal innovation (0)

Anonymous Coward | about a year ago | (#44524315)

Have you read a math paper recently? All the obvious, innovative, and interesting stuff has been done. The work is at the margins now, on really really abstruse topics that few people can figure out and which have no immediate relevance. The game-changing, huge breakthroughs are over in math.

Same way with computers. Innovation has all but stopped. The only thing left is change for the sake of change and reinventing the wheel. Software in general is mature. Lowering the cost of software is more important than doing it right. The only innovations are at the margins, if there is even that left. The only software which hasn't been written is software that would cost more to write than anyone would ever make from bringing it to market.

Seems bizarre that IBM, of all companies, would talk about innovation when they're the ones leading the race to the bottom to get software development as cheap as possible.

Re:Mature fields have only marginal innovation (0)

Anonymous Coward | about a year ago | (#44525507)

The game-changing, huge breakthroughs are over in math.

You couldn't be more wrong. For goodness sakes, we're not even sure about whether Mochizuki has proven the ABC conjecture or not - and everybody and their advisor is scrambling over themselves to advance the recent breakthroughs on the Twin Prime conjecture. Just because people are using terms like 'Endomorphic holomorphism functor space' and 'sparse fiber' and half the abstract concepts have the names of dead Frenchmen doesn't mean that the work being done doesn't yield interesting results.

High and low voltages (0)

Anonymous Coward | about a year ago | (#44524877)

It's all about 1s and 0s (or if you want to remove even that abstraction, high and low voltage states). Anything else is simply an argument about how far you personally prefer to abstract the concept. Nothing more.

Re:High and low voltages (1)

SuricouRaven (1897204) | about a year ago | (#44525913)

Ever heard of analog computers? They existed. There were a few made using tri-state logic too, and a lot of the early ones used base ten arithmetic in hardware via dekatron tubes.

The base is set. The abstractions are open (1)

Beeftopia (1846720) | about a year ago | (#44525431)

So, we know what the computer does. It's this: List of x86 instructions. [wikipedia.org] It executes those instructions. The device stores and executes instructions. [wikipedia.org]

We think in terms of programming languages. The language abstracts away the complexity of manually generating the instructions. Then we build APIs to abstract away even more. So we can program a ball bouncing across a screen in just a few lines rather than generating tens of thousands of instructions manually, because of abstraction built upon abstraction.

In hardware, they build more complex circuits and give us more instructions. Perhaps one day mathematicians will come up with yet a different device which can "do what we want", whatever that may be. The early computers were created to facilitate computation - calculation. [wikipedia.org] Then people came up with programming languages (and compilers) to generate the instructions automatically. The hardware was extended to do more things - draw a dot on a monitor, for example. People discovered they could represent abstract concepts in programming languages. Then it was off to the races.

I think all complex systems are built evolutionarily. From single-celled organisms to eventually man. And it's done by taking the basic building blocks and building more and more complex systems from those building blocks. The basic stored instruction computer is settled (right?). We're doing all sorts of weird and interesting things with it, from playing Tetris, to emailing cat videos, to modeling hurricanes. And of course porn. Looking at porn.

Where does it end? Well, man can manage a certain amount of complexity (not much). He uses tools (in this case computers) to leverage his ability to manage complexity, as the device can represent all sorts of abstract concepts in code.

Suggested Reading: Mythical Man Month (2)

kye4u (2686257) | about a year ago | (#44525525)

If you want some relevant history and insight on the struggles and triumphs of software engineering, I highly suggest reading the Mythical Man-Month.

What was surprising to me was the fact that something written in the 60's about software development is still very relevant today.

The engineers who worked on the IBM System/360 OS discovered software engineering through pure trial and error.

One of the classic insights from the book that I've seen companies (e.g., Microsoft) violate over and over is Brooks's Law. Brooks's Law states that "adding manpower to a late software project makes it later." It is incredible how we reinvent the wheel every day instead of taking the time to learn from the trials and mistakes of others.

Another surprising insight to me at the time was the following. Although the engineers were working on a very technical problem, the biggest challenges they had to overcome were social/people challenges.

The mess at the bottom (5, Insightful)

Animats (122034) | about a year ago | (#44525715)

A major problem we have in computing is the Mess at the Bottom. Some of the basic components of computing aren't very good, but are too deeply embedded to change.

  • C/C++ This is the big one. There are three basic issues in memory safety - "how big is it", "who can delete it", and "who has it locked" - and C helps with none of them. C++ tries to paper over the problem with templates, but the mold always comes through the wallpaper, in the form of raw pointers. This is why buffer overflow errors, and the security holes that come with them, are still a problem. (A short sketch of these three questions follows below.)

    The Pascal/Modula/Ada family of languages tried to address this. All the original Macintosh applications were in Pascal. Pascal was difficult to use as a systems programming language, and Modula didn't get it right until Modula 3, by which time it was too late.

  • UNIX and Linux. UNIX was designed for little machines. MULTICS was the big-machine OS, with hardware-supported security that actually worked. But it couldn't be crammed into a PDP-11. Worse, UNIX did not originally have much in the way of interprocess communication (pipes were originally files, not in-memory objects). Anything which needed multiple intercommunicating processes worked badly. (Sendmail is a legacy of that era.) The UNIX crowd didn't get locking right, and the Berkeley crowd was worse. (Did you know that lock files are not atomic on an NFS file system?) Threads came later, as an afterthought. Signals never worked very well. As a result, putting together a system of multiple programs still sucks.
  • DMA devices Mainframes had "channels". The end at the CPU talked to memory in a standard way, and devices at the other end talked to the channel. In the IBM world, channels worked with hardware memory protection, so devices couldn't blither all over memory. In the minicomputer and microcomputer world, there were "buses", with memory and devices on the same bus. Devices could write anywhere in memory. Devices and their drivers had to be trusted. So device drivers were usually put in the operating system kernel, where they could break the whole OS, blither all over memory, and open security holes. Most OS crashes stem from this problem. Amusingly, it's been a long time since memory and devices were on the same bus on anything bigger than an ARM CPU. But we still have a hardware architecture that allows devices to write anywhere in memory. This is a legacy from the PDP-11 and the original IBM PC.
  • Academic microkernel failure Microkernels appeared to be the right approach for security. But the big microkernel project of the 1980s, Mach, at CMU, started with BSD. Their approach was too slow, took too much code, and tried to get cute about avoiding copying by messing with the MMU. This gave microkernels a bad reputation. So now we have kernels with 15,000,000 lines of code. That's never going to stabilize. QNX gets this right, with a modest microkernel that does only message passing, CPU dispatching, and memory management. There's a modest performance penalty for extra copying. You usually get that back because the system overall is simpler. Linux still doesn't have a first-class interprocess communication system. (Attempts include System V IPC, CORBA, and D-bus. Plus various JSON hacks.)
  • Too much trusted software Application programs often run with all the privileges of the user running them, and more if they can get it. Most applications need far fewer privileges than they have. (But then they wouldn't be able to phone home to get new ads.) This results in a huge attackable surface. The phone people are trying to deal with this, but it's an uphill battle against "apps" which want too much power.
  • Lack of liability Software has become a huge industry without taking on the liability obligations of one. If software companies were held to the standards of auto companies, software would work a lot better. There are a few areas where software companies do take on liability. Avionics, of course. But another is gambling. Gambling software companies, the ones that serve casinos and lotteries, are usually held financially liable for their mistakes. One of the big ones, GTech, reports this costs them about 1% of their revenue. Gambling software thus tends to have redundancy and checking far beyond most other commercial software.

Those are some of the real reasons we have problems with software.
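
To make the first point above concrete, here is a small hedged sketch (C++14 or later; the function and class names are made up, and this is not the poster's prescription) of how the three memory-safety questions play out with raw pointers versus the standard-library answers:

```cpp
#include <cstring>
#include <memory>
#include <mutex>
#include <vector>

// "How big is it": a raw pointer carries no length, so the compiler cannot
// stop a caller from handing over a buffer that is too small.
void risky_copy(char* dst, const char* src) {
    std::strcpy(dst, src);  // overflows dst if the caller got the size wrong
}

// A standard container knows its own size; bounds-checked access is available.
void safer_copy(std::vector<char>& dst, const std::vector<char>& src) {
    dst = src;  // dst resizes itself; dst.at(i) would throw on a bad index
}

// "Who can delete it": unique_ptr encodes single ownership in the type,
// so double-free and use-after-free are much harder to write by accident.
std::unique_ptr<int> make_value() {
    return std::make_unique<int>(42);
}

// "Who has it locked": lock_guard ties the lock to a scope rather than
// to programmer discipline.
struct SharedCounter {
    void increment() {
        std::lock_guard<std::mutex> guard(m_);
        ++count_;
    }
    int count_ = 0;
    std::mutex m_;
};

int main() {
    char small[8];
    risky_copy(small, "hi");  // fine only because the caller sized it right
    std::vector<char> a, b{'h', 'i'};
    safer_copy(a, b);
    auto v = make_value();
    SharedCounter c;
    c.increment();
    return (*v == 42 && c.count_ == 1 && a.size() == 2) ? 0 : 1;
}
```

None of this fixes C itself, which is the poster's point; it only shows where the C++ standard library tries to answer the three questions after the fact.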

Lately, I've been worried about this as I watch what the Open Robotics Foundation is doing with their Robot Operating System. Until now, ROS has been used only on rather slow, wimpy mobile robots. But now they're putting it on Boston Dynamics' ATLAS, which looks like a Terminator [ieee.org] and is as strong as it looks. ROS is typical open source - i.e. much of it is half-debugged, the documentation is out of sync with the code, and the parts don't play together well. They're going to put this on a machine that's strong enough to kill and faster than humans. This may not end well.

Boston Dynamics themselves use QNX on their robots, but the DARPA Humanoid Challenge uses ROS.

Re:The mess at the bottom (3, Insightful)

SuricouRaven (1897204) | about a year ago | (#44526037)

The whole x86/64 architecture is a mess when you get deep enough. It suffers severely from a commitment to backwards compatibility - your shiny new i7 is still code-compatible with an 80386, you could install DOS on it quite happily. But the only way to fix this by now is a complete start-over redesign that reflects modern hardware abilities rather than trying to pretend you are still in the era of the z80. That just isn't commercially viable: It doesn't matter how super-awesome-fast your new computer is when no-one can run their software on it. Only a few companies have the engineering ability to pull it off, and they aren't going to invest tens of millions of dollars in something doomed to fail. The history of computing is littered with products that were technologically superior but commercially non-viable - just look at how we ended up with Windows 3.11 taking over the world when OS/2 was being promoted as the alternative.

The best bet might be if China decides they need to be fully independent from the 'Capitalist West' and design their own architecture. But more likely they'll just shamelessly rip off one of ARM or AMD's designs (easy enough to steal the masks for those - half their chips are made in China anyway) and slap a new logo on it.

Ok (0)

The Cat (19816) | about a year ago | (#44525837)

If you use the word "actually" when replying to someone who knows what they are talking about, you are an asshole.

Mod it down, crybabies.

Other than shared memory latency, pretty good. (1)

Baldrson (78598) | about a year ago | (#44525861)

This guy is a ray of light from the younger generation. He's avoided grappling with the hard problem of shared memory latency, but other than that, he's doing pretty well. You have to deal with shared memory latency [blogspot.com] to handle a wide range of modeling problems, not the least of which is real-time multicore ray tracing like this [youtube.com].

Entertaining, but... (3, Insightful)

msobkow (48369) | about a year ago | (#44526097)

It's an entertaining presentation, but I don't think it's anything nearly as insightful as the summary made it out to be.

The one thing I take away from his presentation is that old ideas are often more valuable in modern times now that we have the compute power to implement those ideas.

As a for-example, back in my university days (early-mid 1980s), there were some fascinating concepts explored for computer vision and recognition of objects against a static background. Back then it would take over 8 hours on a VAX-11/780 to identify a human by extrapolating a stick figure and paint a cross-hair on the torso. Yet nowadays we have those same concepts implemented in automatic recognition and targeting systems that do the analysis in real time, and with additional capabilities such as friend/foe identification.

No one who read about Alan Kay's work can fail to recognize where the design of the modern tablet computer really came from, despite the bleatings of patent holders that they "invented" anything of note in modern times.

So if there is one thing that I'd say students of programming should learn from this talk, it is this:

Learn from the history of computing

Whatever you think of as a novel or "new" idea has probably been conceptualized in the past, researched, and shelved because it was too expensive/complex to compute back then. Rather than spending your days coding your "new" idea and learning how not to do it through trial and error, spend a few of those days reading old research papers and theories relevant to the topic. Don't assume you're a creative genius; rather assume that some creative genius in the annals of computing history had similar ideas, but could never take them beyond the proof-of-concept phase due to limitations of the era.

In short: learn how to conceptualize and abstract your ideas instead of learning how to code them. "Teach" the machine to do the heavy lifting for you.
