
Interview with Jaron Lanier on "Phenotropic" Development

CowboyNeal posted more than 11 years ago | from the stuff-to-read dept.

Programming

Sky Lemon writes "An interview with Jaron Lanier on Sun's Java site discusses 'phenotropic' development versus our existing set of software paradigms. According to Jaron, the 'real difference between the current idea of software, which is protocol adherence, and the idea [he is] discussing, pattern recognition, has to do with the kinds of errors we're creating' and if 'we don't find a different way of thinking about and creating software, we will not be writing programs bigger than about 10 million lines of code no matter how fast our processors become.'"


264 comments

I claim... (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5157030)

This first post in the name of Spain!

Take that, Portugal!

Jaron Lanier is a patchouli-soaked technofaggot (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#5157076)

god he's never done *anything* except conjecture on the future of VR. why people listen to this dreadlock hippy queer, i will never know.

Re:Jaron Lanier is a patchouli-soaked technofaggot (0)

Anonymous Coward | more than 11 years ago | (#5157291)

It's the hair.

first post (-1, Troll)

Anonymous Coward | more than 11 years ago | (#5157035)

Oh, and yes, there will be more than 10 million lines of code... bugs will just run free. Look at Microsoft's patchwork.

Red Alertt! (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#5157036)

Dirty gay hippie alert! Its RMS v2! IEEEEEEEE!!!!!!!!!!

Re:Red Alertt! (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#5157045)

This is the most insightful post so far! What hell are you thinking, you crackpot moderator! RMS is clearly a fag!

Perhaps you are one of his peter puffers?

Re:Red Alertt! (-1, Flamebait)

Anonymous Coward | more than 11 years ago | (#5157087)

Peeeetar Pooofur be yuo!

Re:Red Alertt! (0)

Anonymous Coward | more than 11 years ago | (#5157059)

Please moderate this comment up. The author's clever reference to the high quality Internet Explorer ("IE") helps to demonstrate the philosophical differences between Communist RMS, and the competant corporation, Microsoft.

sp (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5157037)

second post hehe

10 million lines (5, Funny)

chuck (477) | more than 11 years ago | (#5157040)

we will not be writing programs bigger than about 10 million lines of code no matter how fast our processors become.
Thank God.

-Chuck

Re:10 million lines (0)

Anonymous Coward | more than 11 years ago | (#5157066)

Yep. And on a javasoft site (home of the people that can't get swing stable)

Re:10 million lines (3, Insightful)

chuck (477) | more than 11 years ago | (#5157111)

Seriously, has there ever been a need to write a program of 10 million lines? I rather believe that creating a number of small components that work well, and combining them in some intelligent way, is the way that you build large systems.

Now, the extent to which the pieces that you're building are called "programs," or whether the whole system is called "a program" is questionable.

I mean, I've worked on programs of 10 million bytes, and they've seemed to work okay. It would surprise me if 10 million lines is out of my reach using the methods that I'm familiar with.

-Chuck
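A minimal Java sketch of the parent's point (the names and pipeline are illustrative, not from the interview): small components that each do one job and can be tested alone, combined into a larger whole by composition rather than by writing one enormous routine.

    import java.util.List;
    import java.util.function.Function;

    public class Pipeline {
        // Each step is a small component that does one job and can be tested by itself.
        static Function<String, String> trim = String::trim;
        static Function<String, String> lower = s -> s.toLowerCase();
        static Function<String, List<String>> tokenize = s -> List.of(s.split("\\s+"));

        public static void main(String[] args) {
            // "Combining them in some intelligent way": composition instead of one big routine.
            Function<String, List<String>> normalize = trim.andThen(lower).andThen(tokenize);
            System.out.println(normalize.apply("  Ten Million Lines  ")); // [ten, million, lines]
        }
    }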

Re:10 million lines (0)

Anonymous Coward | more than 11 years ago | (#5157261)

> Big commercial software is probably already there.

(I'm the AC you responded to).

Of course it all depends on what you call a 'program'.

The Linux kernel is > 2M LOC.

If one considers that the Linux kernel is not a single program, then one can take a look at Mozilla, which is also > 2M. XFree86 is slightly under 2M; gcc and gdb are both around 1M.

10 million LOC is definitely not out of reach; big commercial software is probably already there. So the hippy from the article is probably wrong :-)

That being said, software engineering is about mastering complexity.

Your point about a componentized system is the first step in mastering that complexity.

But generally, complexity is layered, which means that instead of 10 million lines of code you get, say, 2 million lines used to build an abstraction (like a language), which you then use to build the components (say another 2 million lines) that make up the system. Such layers are stacked on each other (like microcode->assembly->C->SQL, or kernel->userland->libraries->apps). [Here I should insert a rant about meta-languages, in which you can both develop and use the abstraction. LISP is the example that comes to mind.]

At that point, if we want to measure the size of the Mozilla application I am typing this in, we need to take into account its own size, the size of GTK, the size of wmaker, the size of XFree86, the size of libc, and the size of the FreeBSD kernel. We are not at 10M LOC, but damn near.

So, in a word, I agree with you: 10M LOC is within our reach, and has been for some time.

Cheers,

--fred

Re:10 million lines (3, Insightful)

AxelTorvalds (544851) | more than 11 years ago | (#5157368)

Yes, there has been the need. Windows 2000 is well over 10 million lines. Now, is it a single program, or a system of programs, or what? Arguably there is a large amount of that code whose removal would make the system stop being Windows 2000; GDI, for example. LOC is a terrible metric.

There are other very large systems out there. LOC never factors in expressiveness, though. I know of multimillion-line 370 systems that were written in 370 assembler. I believe they could be much, much shorter if they were done in PLx or COBOL or Java or something else.

My thoughts exactly (1)

0x0d0a (568518) | more than 11 years ago | (#5157405)

Seriously, has there ever been a need to write a program of 10 million lines?

Exactly. I don't care how many lines of code there are in the kernel or glibc or glib or gtk or Xlib or SDL -- I happily use them without worrying about them. If you sum all of them up, you can probably get some insane LOC count...but it's modularized.

Re:10 million lines (0)

Anonymous Coward | more than 11 years ago | (#5157073)

Mozilla is greater than 10M lines, therefore it is crap.

QED

Re:10 million lines (0)

jackb_guppy (204733) | more than 11 years ago | (#5157077)

10 Thousand lines of code in one program - programmer needs to go back to school.

10 Million lines of code in one program - confirms the programmer is insane.

Furthermore: (1)

seanadams.com (463190) | more than 11 years ago | (#5157317)

we will not be writing programs bigger than about 10 million lines of code no matter how fast our processors become.

You can only drink 30 or 40 glasses of beer a day, no matter how rich you are. -- Colonel Adolphus Busch

There really is a hard limit to just about everything...

Re:10 million lines (4, Insightful)

sql*kitten (1359) | more than 11 years ago | (#5157361)

Thank God.

You're modded as funny, but what you said is insightful. The whole point of moving to ever higher levels of abstraction - from ASM to C to C++ (or CXX as we called it on VMS) to Java to <whatever comes next> - is that you can do more work with fewer lines of code. The fact that programs aren't getting any longer is utterly irrelevant to any experienced software engineer.

I don't think programs will get longer, since why would anyone adopt a language that makes their job harder? I bitch about Java's shortcomings constantly, but given the choice between Xt and Swing, I know where my bread's buttered. Or sockets programming in C versus the java.net classes. I'll even take JDBC over old-skool CTLib.

We have plenty of apps these days that in total are well over 10M lines, but you never have to worry about that because they're layered on top of each other. Someone else worries about the code of the OS, the code of the RDBMS engine, the code of GUI toolkit and so on.

In short, pay close attention when someone from Sun tries to tell you anything about software development - he's got some hardware to sell you, and you'll need it if you follow his advice!
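To put the parent's sockets example in code (a rough sketch only, and certainly not a full HTTP client): the java.net classes hide the address resolution, connect() calls and descriptor bookkeeping that C sockets code has to spell out, which is the "more work per line of code" point.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.Socket;

    public class TinyHttpGet {
        public static void main(String[] args) throws Exception {
            // The resolution, connection and cleanup that C spreads over dozens of lines.
            try (Socket s = new Socket("example.com", 80);
                 BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()))) {
                s.getOutputStream().write("GET / HTTP/1.0\r\nHost: example.com\r\n\r\n".getBytes());
                System.out.println(in.readLine()); // first response line, e.g. "HTTP/1.0 200 OK"
            }
        }
    }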

Just a thought (2, Informative)

Neophytus (642863) | more than 11 years ago | (#5157049)

The day a compiler makes programs based on what it thinks we want a program to do is the day that conventional computing goes out the window.

DUPE (0)

Anonymous Coward | more than 11 years ago | (#5157052)

This was posted earlier in the week on /..


In the future... (0)

Anonymous Coward | more than 11 years ago | (#5157054)

Computers will program themselves!

And IN SOVIET RUSSIA (-1)

Anonymous Coward | more than 11 years ago | (#5157143)

Programs write people! So does that mean The Matrix is a comentary on Socialism?

Re:In the future... (0, Offtopic)

ChrisTaylor2904 (553656) | more than 11 years ago | (#5157228)

And in Soviet Russia, it'll be the other way around...

What on earth is phenotropic? (-1, Offtopic)

Anonymous Coward | more than 11 years ago | (#5157063)

Full of it. (5, Insightful)

cmason (53054) | more than 11 years ago | (#5157065)

If you think about it, if you make a small change to a program, it can result in an enormous change in what the program does. If nature worked that way, the universe would crash all the time. Certainly there wouldn't be any evolution or life.

<cough>Bullshit.</cough>

This guy obviously knows nothing about biology. A single base change in DNA can result in mutations that cause death or spontaneous abortion. As little as a change in a single 'character' can be lethal. That's a pretty "small change" that results in a pretty big "crash."

I'm not sure if this invalidates his argument, but it certainly doesn't do much for his credibility.
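For the software half of the analogy, the claim being argued over ("a small change to a program can result in an enormous change in what the program does") is easy to demonstrate with a toy example; the code below is purely illustrative. Change the loop's "<" to "<=", a one-character edit, and the run dies with an ArrayIndexOutOfBoundsException instead of finishing quietly.

    public class OneCharacter {
        public static void main(String[] args) {
            int[] cells = new int[10];
            // A single-character edit ("<" -> "<=") turns this quiet loop into a crash.
            for (int i = 0; i < cells.length; i++) {
                cells[i] = i;
            }
            System.out.println("finished: " + cells[9]);
        }
    }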

Evolution and Core Dump (4, Insightful)

QEDog (610238) | more than 11 years ago | (#5157103)

That is only if you consider a single living being. I think he means that it is robust in the way an ecological balance is. If a small change happens in the DNA of one animal, it dies and is unable to reproduce, so the 'error' was confined and dealt with; it didn't explode into a blue screen. Evolution is a phenomenon of many living beings, not one. Even if a big change happens in a species, most of the time the system is robust enough to absorb it and settle into something that works. And, because of the evolutionary mechanism, only the good mutations, by definition, spread. Imagine a computer program where only the useful threads got resources allocated...


Re:Evolution and Core Dump (2, Interesting)

cmason (53054) | more than 11 years ago | (#5157180)

I can see this point. I think the analogy is faulty. If you liken DNA to a computer program, then consider a single organism to be a run of that program. A single organism can crash just like a run of a program can.

Now there are certainly programming methodologies modeled on evolution. But that's not what he's talking about. What he's talking about is using pattern recognition to reduce errors in computer programs, I assume by making them more tolerant of a range of inputs (although he doesn't say this). Evolution has nothing to do with pattern recognition, other than that both are stochastic processes. Evolution is tolerant of environmental pressure by being massively parallel (to borrow another CS metaphor). And even then it's sometimes overwhelmed (think ice age). His programs would be more tolerant of errors because they use better algorithms (namely pattern recognition).

I think it's a bullshit analogy. As I said before, I'm not sure if this analogy is key to his argument, but I don't give him a lot of cred for it.
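A rough sketch of the contrast being drawn (entirely hypothetical, and not Lanier's design): a protocol-style handler rejects anything that is not an exact match, while a pattern-style handler accepts the closest known command within some tolerance and degrades gracefully.

    import java.util.List;

    public class FuzzyCommand {
        // Strict "protocol adherence": one character off and the call fails outright.
        static String strict(String cmd) {
            if (cmd.equals("shutdown")) return "shutting down";
            throw new IllegalArgumentException("unknown command: " + cmd);
        }

        // "Pattern recognition": accept the nearest known command within a small tolerance.
        static String fuzzy(String cmd, List<String> known) {
            String best = known.get(0);
            int bestDist = distance(cmd, best);
            for (String k : known) {
                int d = distance(cmd, k);
                if (d < bestDist) { best = k; bestDist = d; }
            }
            return bestDist <= 2 ? "interpreting as: " + best : "no close match";
        }

        // Plain Levenshtein edit distance.
        static int distance(String a, String b) {
            int[][] dp = new int[a.length() + 1][b.length() + 1];
            for (int i = 0; i <= a.length(); i++) dp[i][0] = i;
            for (int j = 0; j <= b.length(); j++) dp[0][j] = j;
            for (int i = 1; i <= a.length(); i++)
                for (int j = 1; j <= b.length(); j++)
                    dp[i][j] = Math.min(Math.min(dp[i - 1][j] + 1, dp[i][j - 1] + 1),
                            dp[i - 1][j - 1] + (a.charAt(i - 1) == b.charAt(j - 1) ? 0 : 1));
            return dp[a.length()][b.length()];
        }

        public static void main(String[] args) {
            // strict("shutdwon") would throw; the fuzzy version still finds the intent.
            System.out.println(fuzzy("shutdwon", List.of("shutdown", "restart", "status")));
        }
    }

Whether that tolerance is a feature or exactly the kind of almost-right behaviour you don't want is, of course, the argument running through this thread.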

Re:Evolution and Core Dump (5, Interesting)

haystor (102186) | more than 11 years ago | (#5157232)

I think it's a pretty good analogy, but comparing it to biology leaves it a bit ambiguous as to what the metaphor is.

If you compare it to something like building a house or an office building, the analogy works. If you misplace one 2x4, it's very unlikely that anything will ever happen. Even with something as serious as doors, if you place one 6 inches to the left or right of where it's supposed to be, it usually works out okay. It always amazed me, once I started working in construction, how unscientific it was. I remember being told that the contractors don't need to know that a space is 9 feet 10 1/2 inches. Just tell them it's 10 feet and they'll cut it to fit.

One of the amazing things about AutoCAD versus the typical inexpensive CAD program is that it deals with imperfections. You can build with things that have a +/- to them and it will take that into account.

Overall, he definitely seems to be on the right track from what I've seen. On most of the projects I've been working on (J2EE stuff), it seems to be taken as fact that it's possible to gather all the requirements and implement them exactly, as if all of business could be boiled down to a simple set of rules.
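The "cut it to fit" point translates directly into code: compare against a target with an explicit +/- instead of demanding exact equality. A tiny hypothetical sketch, using the parent's own numbers:

    public class Tolerance {
        // A measurement that carries its own +/- instead of pretending to be exact.
        record Measured(double value, double plusMinus) {
            boolean fits(double target) {
                return Math.abs(value - target) <= plusMinus;
            }
        }

        public static void main(String[] args) {
            Measured opening = new Measured(118.5, 1.5); // 9 ft 10 1/2 in, cut "to fit"
            System.out.println(opening.fits(120.0));     // true: the nominal 10 feet is close enough
            System.out.println(opening.fits(110.0));     // false: off by more than the tolerance
        }
    }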

Re:Evolution and Core Dump (3, Insightful)

Anonymous Coward | more than 11 years ago | (#5157410)

IMO, if you liken DNA to a computer program, an individual is one instance of that code, or one process. That process can be killed without the entire system going kaput, which is what makes biological systems robust.

However, even though I think Lanier's observations are valid, they're not particularly groundbreaking. His "wire-bound" vs. "interface" argument is basically a minor revision of the old procedural vs. OO debate. The problems with coding in terms of objects and their interactions continue to be the same: it's never going to be the most efficient (in terms of information content) possible description of a problem, and it's hard work to write extra code for a level of robustness in the future when most developers are paid for performance in the present. I strongly believe that the roadblocks in the development of more robust software are not technical, but mostly economic.

Re:Full of it. (1)

JohnFluxx (413620) | more than 11 years ago | (#5157116)

I think his point is more: what if that one small DNA change caused the entire world to explode? One small creature dying isn't a "big crash".

Re:Full of it. (1)

zephc (225327) | more than 11 years ago | (#5157120)

This guy seems to have turned into a bullshit artist as of late... I respected his early VR work, but VR is the tech that never was, and I guess he's just killing time now?

Re:Full of it. (0)

Anonymous Coward | more than 11 years ago | (#5157151)

He's always been more of a sideshow freak than a real programmer/technologist. He's one of those people who looks great on the cover of Wired magazine talking about VR or telepresence.

Re:Full of it. (0)

Anonymous Coward | more than 11 years ago | (#5157128)

You just said what he said.

He said:

if you make a small change to a program, it can result in an enormous change

You said: ...a pretty "small change" that results in a pretty big "crash."

Re:Full of it. (1)

cmason (53054) | more than 11 years ago | (#5157189)

Yes, obviously you can't read. He said "evolution or life" doesn't crash. I was saying it does.

Re:Full of it. (0)

Anonymous Coward | more than 11 years ago | (#5157217)

Sure it does. We are here to prove it.

Re:Full of it. (3, Insightful)

Sri Lumpa (147664) | more than 11 years ago | (#5157174)

"This guy obviously knows nothing about biology. A single base change in DNA can result in mutations that cause death or spontaneous abortion. As little as a change in a single 'character' can be lethal. That's a pretty "small change" that results in a pretty big "crash.""

This means that nature has an excellent error-catching and correction system: rather than letting buggy code run and produce flawed results, it catches the worst cases and prevents them from running (spontaneous abortion), while code with fewer bugs (say, a congenital disease) has less chance to run (early death, less sexual attractiveness to mates...).

It is only with the advent of modern society and modern medicine that the evolutionary pressure has dropped enough to make this less relevant to humans. Maybe in the future, with genetic engineering, we will be able to correct congenital diseases in the womb.

Even beyond DNA, I am convinced that nature has a good recovery system, and that if humans were to disappear tomorrow, most of the damage we did to Earth would eventually be healed (but how long before we reach the point of no return?).

Now, if software could have a similar damage-control mechanism, and could keep functioning while invalidating the buggy code, that would be something.
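A minimal sketch of that damage-control idea (a hypothetical wrapper, not a real framework): run a component behind a guard that stops trusting it after repeated failures and serves a fallback, so the rest of the program keeps functioning while the buggy code is effectively invalidated.

    import java.util.function.Supplier;

    public class Quarantine<T> {
        private final Supplier<T> component;
        private final T fallback;
        private final int maxFailures;
        private int failures = 0;

        Quarantine(Supplier<T> component, T fallback, int maxFailures) {
            this.component = component;
            this.fallback = fallback;
            this.maxFailures = maxFailures;
        }

        // Call the component, but once it has failed too often, stop calling it
        // and serve the fallback; the caller never sees the crash.
        T call() {
            if (failures >= maxFailures) return fallback;
            try {
                return component.get();
            } catch (RuntimeException e) {
                failures++;
                return fallback;
            }
        }

        public static void main(String[] args) {
            Quarantine<String> q = new Quarantine<>(() -> {
                throw new IllegalStateException("buggy component");
            }, "cached answer", 3);
            for (int i = 0; i < 5; i++) System.out.println(q.call()); // never crashes the caller
        }
    }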

Re:Full of it. (1)

TheLink (130905) | more than 11 years ago | (#5157346)

Well, if your software stops working, most of the damage it did to Earth would eventually be healed too.

Re:Full of it. (1)

Cerberus9 (466562) | more than 11 years ago | (#5157357)

Now if software could have similar damage control mechanism

Exactly. What we need is a mechanism that will drive companies with buggy products into bankruptcy instead of allowing them to flourish and produce flawed results.

Thank You! (2, Interesting)

ike42 (596470) | more than 11 years ago | (#5157206)

Small perturbations often have disproportionately large consequences, as your DNA example illustrates. Paradoxically, as Lanier suggests, complex systems can also be amazingly fault tolerant. This is in fact the nature of complex, or chaotic, systems and, some say, of life. However, we cannot, in general, predict which sort of behavior a complex system is likely to exhibit. Lanier seems to miss this entirely. So while his ideas are intriguing, I don't think he has a good grasp of the real issues in designing "complex" software systems.

No, YOU'RE full of it (4, Informative)

Pentagram (40862) | more than 11 years ago | (#5157258)

Who modded this up? A single base change in DNA is almost never fatal. For a start, considerably more than 90% of the human genome is junk that has no expressive effect anyway (according to some theories it helps protect the rest of the genome). Even point mutations in coding sections of the DNA often do not significantly alter the shape of the protein they code for, and many proteins are coded for in several locations in the genome.

True, single base changes can have dramatic effects, but this is rare. As an example, the human genetic equipment is so fault-tolerant that humans can even be born with 3 copies of a chromosome and still survive (Down's syndrome).

You're right (2)

Flavio (12072) | more than 11 years ago | (#5157276)

And that's one of the advantages diploid organisms have: an organism heterozygous for even a deleterious mutation can still live normally.

So in a way this guy's right about nature's built-in error tolerance. But he still throws around words like "chaos" and "unpredictability" without really saying anything remarkable.

This is only part 1 of what will supposedly be a series, but to me it looks like nothing more than an artist's view of a computational problem. From a pragmatic perspective, it makes obvious observations and weak statements that merely cite impressive concepts (out of dynamical systems theory, computer science and biology), and it proposes no answers.

To wrap it up, he suggests the reader should question where Turing, Shannon and von Neumann were wrong. Well, guess what: these were all mathematicians and even though one may question why they studied particular topics, their mathematics isn't and never will be wrong because it's logically sound.

I'm not impressed.

Re:Full of it. (0)

Anonymous Coward | more than 11 years ago | (#5157349)

The case of DNA is a special one. (in fact, isn't DNA the physical equivalent of software? Is it any wonder that behavior parallels?) In most physical systems a small change does not result in a crash (ecological crash...biological crash...). In virtually all cases, moving the cheese a foot to the left will not cause the mouse to starve. Cutting off your finger will not kill you.

Tunnelvision is a superpower and a crippling weakness. Geeks have it in spades.

Re:Full of it. (2, Interesting)

raduf (307723) | more than 11 years ago | (#5157387)

Well, actually, changes in DNA often don't do anything bad, much less fatal. That's how evolution takes place, right? See how long/well a random DNA change can survive. Anyway, DNA and biological stuff in general is much more resistant (resilient?) to change than software. Much more. DNA has been around for a very long time and it hasn't crashed yet.

The point this guy makes, and that I totally agree with, is that programming can't stay the same forever. I mean, come on, we're practically programming in assembly. High-level, IDE'd and coloured and all that, but not a bit different fundamentally. Functional programming, for example, that's different. It probably sucks (I don't really know it) but it's different. It's been around for about 30 years too.

There has to be something that would let me create software without writing
for i=1 to n
for every piece of code I make. It's just... primitive.

And this guy is right about something else too. If nobody's looking for it, it's gonna take a lot longer to find it.
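For what it's worth, here is a small Java sketch of the kind of step up being asked for: the same work written with the hand-rolled index loop and then as a declarative pipeline where the iteration is the library's problem rather than the programmer's. (The numbers are arbitrary; this only illustrates the style difference.)

    import java.util.List;

    public class NoMoreForLoops {
        public static void main(String[] args) {
            List<Integer> sizes = List.of(3, 14, 159, 26, 5);

            // The "for i=1 to n" style written out by hand.
            int total = 0;
            for (int i = 0; i < sizes.size(); i++) {
                if (sizes.get(i) > 10) total += sizes.get(i);
            }
            System.out.println(total); // 199

            // The same intent stated declaratively; no explicit index anywhere.
            System.out.println(sizes.stream().filter(n -> n > 10).mapToInt(Integer::intValue).sum());
        }
    }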

I'm fairly sure you're wrong (1)

0x0d0a (568518) | more than 11 years ago | (#5157415)

This guy obviously knows nothing about biology. A single base change in DNA can result in mutations that cause death or spontaneous abortion. As little as a change in a single 'character' can be lethal. That's a pretty "small change" that results in a pretty big "crash."

I think most of the data in DNA requires multiple base pair changes to have a major impact -- I'm not a biologist, though. Otherwise, radiation from the Sun would mutate the bejeezus out of everyone all the time.

10 million lines (0, Flamebait)

the eric conspiracy (20178) | more than 11 years ago | (#5157083)

we will not be writing programs bigger than about 10 million lines of code no matter how fast our processors become

And how is this a problem?

Re:10 million lines (1)

Elwood P Dowd (16933) | more than 11 years ago | (#5157284)

>> we will not be writing programs bigger than about 10 million lines of code no matter how fast our processors become

> And how is this a problem?


You flamebaiter! How dare you post such vitriolic filth on slashdot! Where do you get off?!

10 million lines of bullpucky (5, Insightful)

aminorex (141494) | more than 11 years ago | (#5157089)

And when you link your 10 million line program with my 10 million line program, we've got a 20 million line program. This idea of an inherent limit to the complexity of programs using current methods is pure larksvomit, and if Jaron Lanier sells it, he's a snake oil hawker.

This is Jack's total lack of surprise -> :|

Re:10 million lines of bullpucky (1)

Elwood P Dowd (16933) | more than 11 years ago | (#5157303)

And when you link your 10 million line program with my 10 million line program, we've got a 20 million line program. This idea of an inherent limit to the complexity of programs using current methods is pure larksvomit, and if Jaron Lanier sells it, he's a snake oil hawker.

This is Jack's total lack of surprise -> :|

If this isn't a troll, my name isn't Elwood P. Dowd. And it isn't.

Big Programs (5, Insightful)

Afty0r (263037) | more than 11 years ago | (#5157093)

"if 'we don't find a different way of thinking about and creating software, we will not be writing programs bigger than about 10 million lines of code no matter how fast our processors become."

Fantastic! We'll all get down and program small, specific routines for processing data, each one doing its own job and doing it well. Those nasty, horrid standard protocols he refers to will allow all these small components to easily talk to each other - across architectures, networks, etc.

Oh wait, this is the way it already works. Is this guy, then, proposing that we learn a new way to program because our systems aren't monolithic enough? *sigh*


This is an interesting concept... (4, Interesting)

Anonymous Hack (637833) | more than 11 years ago | (#5157108)

...but I don't see how it's physically possible. It sounds like he's proposing that we restructure programming languages, or at least the fundamentals of programming in the languages we do know (which might as well mean creating a new language). This isn't a bad thing per se, but one example he talks about is this:

For example, if you want to describe the connection between a rock and the ground that the rock is resting on, as if it were information being sent on a wire, it's possible to do that, but it's not the best way. It's not an elegant way of doing it. If you look at nature at large, probably a better way to describe how things connect together is that there's a surface between any two things that displays patterns. At any given instant, it might be possible to recognize those patterns.

Am I stupid or something? He seems to be drawing together two completely unrelated things. Our computers, our CPUs, our ICs, at the end of the day they're just a bundle of very, very tiny on/off switches - pure binary logic. When we develop code for this environment, we have to develop according to those binary rules. We can't say "here's a rock", but we can say "turn on these switches and those switches such that they indicate we are pointing to a location in memory that represents a rock".

Maybe I'm missing his point, but I just don't understand how you can redefine programming, which is by definition a means of communicating with a predictable binary system (as opposed to a "probability-based system" or whatever quantum physicists like to call reality), to mean inputting some kind of "digitized" real-world pattern. It's bizarre.

Re:This is an interesting concept... (-1)

Anonymous Coward | more than 11 years ago | (#5157179)

He's just explaining OOP in a faggy, earthy, hippy way. He has to keep his job somehow. Anal sex can only get you so far in this world, until you have to shovel the BS with a pitch fork.


Just like any workplace.

Re:This is an interesting concept... (1)

moonbender (547943) | more than 11 years ago | (#5157393)

Well, you can describe most occurrences in nature with an extremely deterministic set of rules: the basic laws of physics everyone learns in school. It's not as if a random, "probability-based system" were necessary to understand, simulate or imitate the behaviour he described. Also note that the digital representations in a computer also physically exist in the real world (in whatever way you choose to store them) and are thus influenced by the same "random" effects - although I wouldn't blame any software bugs on quantum physics.

That said, I also have a hard time creating a connection between what he suggests and the example he gives. Maybe you need better drugs to grok it. ;)

Disclaimer: I'm not a physicist, and that likely shows. Heck, I'm not even a qualified computer scientist, yet.

Re:This is an interesting concept... (4, Insightful)

goombah99 (560566) | more than 11 years ago | (#5157417)

When I first started reading it I thought, well, this is cute but impractical. But I had a change of heart. What first bothered me was the idea that if a function is called with a list of args, the function should not just process the args but should instead look at the args as a pattern that it needs to respond to. First, this implies that every function has been 'trained', or has enough history of previous calls under its belt, that it's smart enough to figure out what you are asking for even if you ask for it a little wrong. Second, the amount of computational power needed to process every single function call as a pattern rather than as a simple function call is staggering.

Or is it? How does 'nature' do it? Well, the answer in nature is that everything is done in parallel at the finest level of detail. When a rock sits on a surface, every point on the rock is using its F=ma plus some electromagnetics to interact with the surface. Each point is not supervised, but the whole process is a parallel computation.

So although his ideas are of no use to a conventional system, maybe they will be of use 100 years from now when we have millions of parallel processors cheaply available (maybe not silicon). So one can't say this is just stupid on that basis.

Indeed, the opposite is true. If we are ever going to have mega-processor interaction, these interactions are going to have to be self-negotiating. It is quite likely that the requirements for self-negotiation will far outstrip those of implementing something the most efficient way possible as a coded algorithm. Spending 99% of your effort on pattern recognition on the inputs and 1% of your processor capability fulfilling the requested calculation may make total sense in a mega-scale processing environment. It might run 100x slower than straight code would, but it will actually work in a mega-scale system.

The next step is how to make the processor have a history so that it can actually recognize what to do. That's where the idea of recognizing protocols comes in. At first the system can be trained on specific protocols, which can then be generalized by the processor: supervised learning versus unsupervised.

Cellular systems in a multi-cellular organism mostly function analogously. They spend 99% of their effort just staying alive. Huge amounts of energy are expended trying to interpret patterns on their receptors; some energy is spent responding to those patterns. Signals are sent to other cells (chemically), but the signals don't tell the cell exactly what to do. Instead they just trigger pattern recognition on the receptors.

Thus it is not absurd to propose that 'functions' spend enormous effort on pattern recognition before giving some simple processing result. But for this to make sense you have to contextualize it in a mega-processor environment.
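A toy illustration of the "arguments as a pattern" idea (purely hypothetical, and nowhere near the trained, massively parallel system described above): instead of rejecting a call whose arguments don't match a signature exactly, the callee scores the incoming argument names against the request patterns it knows and routes the call to the closest one.

    import java.util.Map;
    import java.util.Set;

    public class PatternCall {
        // Request patterns this "function" has been trained to recognize.
        static final Map<String, Set<String>> KNOWN = Map.of(
                "drawRect", Set.of("x", "y", "width", "height"),
                "drawCircle", Set.of("x", "y", "radius"));

        // Pick the known pattern that overlaps the incoming argument names the most.
        static String recognize(Set<String> argNames) {
            String best = "unrecognized";
            int bestScore = 0;
            for (Map.Entry<String, Set<String>> e : KNOWN.entrySet()) {
                int score = 0;
                for (String name : argNames) if (e.getValue().contains(name)) score++;
                if (score > bestScore) { best = e.getKey(); bestScore = score; }
            }
            return best;
        }

        public static void main(String[] args) {
            // A caller that asks "a little wrong" (extra or misnamed arguments) still gets routed.
            System.out.println(recognize(Set.of("x", "y", "radius", "colour"))); // drawCircle
            System.out.println(recognize(Set.of("x", "y", "width", "h")));       // drawRect
        }
    }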


What about the Irish (2, Funny)

RodeoBoy (535456) | more than 11 years ago | (#5157122)

we will not be writing programs bigger than about 10 million lines of code no matter how fast our processors become.

Oh I am sure a group of say about 15 Irish kids could do it in a year.

Unfortunately... (2, Insightful)

holygoat (564732) | more than 11 years ago | (#5157129)

... this guy doesn't seem to really know what he's talking about.

As someone else has mentioned, life in many respects isn't robust - and where it is, it's not relevant.

For instance, genetic code is mutable - but try doing that with machine instructions. That's why Tierra's creatures are written in its own pseudo-machine language.

If there's one thing that I don't want happening in code, it's tiny errors causing almost-reasonable behaviour. Brittle code is code that's easy to debug.

What he really wants is lots of small, well-defined objects or procedures doing their one task well. If you decompose a design well enough, there's nothing to limit scalability.
Good design is the answer.

Re:Unfortunately... (0)

Anonymous Coward | more than 11 years ago | (#5157426)

Rubbish.

How many of you have experience coding 10 million lines of code? I know everyone around here likes to think they are master coders, but if you churn out more than 30 lines of debugged, documented code a day, you're doing well. Over a career, that's what, 250,000 lines of code - well over an order of magnitude less than what we're talking about.

Brittle code may be easier to debug but it's also inherently more vulnerable to crashing the bigger the program gets. Anything which reduces that brittleness may mean you lose some control but that may well be a worthwhile trade-off.

Read it couple days ago (2, Interesting)

f00zbll (526151) | more than 11 years ago | (#5157130)

Most of what was stated is "pie in the sky" idealism. Get real: it will take a long time for programming and software development to get to the point where they work elegantly the way he describes. I have no problem with reminding people, "hey, let's try to improve how software is developed." As if those of us in the trenches don't realize how much of a mess it is most of the time. We can't get from point A to point M without going through all the painful intermediate steps.

I seriously doubt nature came up with the elegant design of 4 base pairs overnight, so let's work hard at making things better without throwing a pile of dung in people's faces. After all, they are the ones who have to build the pieces to get to that point.

Lanier? (0)

Anonymous Coward | more than 11 years ago | (#5157133)

That bastard tried to kill President Sheridan, and then ran off!


"Robust" versus "goal-oriented" (4, Interesting)

Viking Coder (102287) | more than 11 years ago | (#5157161)

I used to like this guy.

The problem with the kind of system he's talking about is that the more robust you make it, the harder it is to change its behavior.

Take the cockroach, for instance. It is damned hard to train 100 of them to work together to open a pickle jar.

That's because a cockroach is extremely robust at being a cockroach, which has nothing to do with teaming up with 99 other cockroaches to open a pickle jar.

I don't believe nature had a design for each individual life form, other than to be robust. That doesn't give us any particular insight into how to design something that is both robust and meets a specific goal, which is the point of almost all software.

Once you get to the point where the specifications of each component are as exact as they need to be to meet a specific goal, you're lacking exactly the kind of robustness that he's describing.

What he's really saying is that entropy is easy to defeat. It's not. Perhaps there will be easier ways to communicate our goals to a computer in the future, but the individual components will still need to be extremely well thought out. Even if the difficulty of the language is part of what makes symbol exchange between a human and a computer hard, the fact that the human needs an exact understanding of the problem before they can codify it isn't going to change.

Re:"Robust" versus "goal-oriented" (2, Insightful)

Anonymous Coward | more than 11 years ago | (#5157287)

Exactly.

I don't want my computer to be fuzzy and robust. I want it to be precise and brittle. I don't want computers to become "life forms". The whole point of the computer is to solve problems, not to be a little organism that "mostly works". Life forms already exist.

That's what I hear all these "visionaries" talking about: they want to make computers operate like the human mind, but they miss the point that we ALREADY HAVE the human mind! We know how it can solve amazing problems quickly, but also how it can fail miserably. Why do we focus on this, and not on making computers simpler and more effective tools?

It's good to always question the design of our computers. The stored program concept, files, all that stuff is arbitrary. But let's not miss the point that computers are tools, assistance, calculators, etc... they aren't brains and shouldn't be.

I'll go you one better (1)

0x0d0a (568518) | more than 11 years ago | (#5157429)

You're certainly right that it's hard to change, but I don't even think we need to go that far. It's hard to *make* a system like this. Nature used brute force, a massive computer, and bazillions of years to do it, and didn't get to specify much about what came out the other end.

That tired old cockroaches & pickle jar exampl (0)

Anonymous Coward | more than 11 years ago | (#5157464)

*sigh*

I thought my life had moved beyond this.

That tired old training cockroaches to open a pickle jar example has just been beaten to death.
Let it rest Viking Coder, let it rest.

and another thing (3, Insightful)

Anonymous Hack (637833) | more than 11 years ago | (#5157164)

His comments don't seem to make any sense with regard to the way we, as humans, actually view The Real World either:

So, now, when you learn about computer science, you learn about the file as if it were an element of nature, like a photon. That's a dangerous mentality. Even if you really can't do anything about it, and you really can't practically write software without files right now, it's still important not to let your brain be bamboozled. You have to remember what's a human invention and what isn't.

Of course a file is a human invention, but it's also a concept without which NOTHING would work - not just computers. A "file" is just an abstraction of a blob, and i mean that both in the sense of a "blob" as in "a thing" and as in "Binary Large OBject". It's a piece of data that represents something. That's exactly the same thing as looking at a house and saying "that's a house" or looking at a car and saying "that's a car". It's just a way to categorize a bunch of photons/atoms/whatever into something we can readily communicate and understand. This is HUMAN, this is how we reason. If we "saw" the universe as a bazillion photons, we'd never get anything done, because we couldn't say "here is a car", we'd be describing each photon individually, which would take up millions of human lifetimes. It's a human limitation, and i don't see any way that we could just up and ignore it.

Don't get me wrong, i think what this guy is talking about is fascinating, but i also think it's got more of a place in some theoretical branch of math than in real life development.

Re:and another thing (1)

JohnFluxx (413620) | more than 11 years ago | (#5157204)

There are lots of places where we use files when we don't have to.

Take libraries for example - why are they in files? Why not put all the functions in a database? Fully indexed, and cross referenced. When you need new functions, just download them.

Same with programs. Why not just make every program a function? That would make it a lot easier to manipulate the output and input. (This is actually close to a project I've been working on for some time.)
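
Something like the rough Java sketch below gets at the idea; the names here (FunctionRegistry, register, lookup) are made up for illustration, and an in-memory map stands in for the real indexed database:

<ecode>
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Hypothetical sketch: a "library" as an indexed registry of functions rather
// than a file on disk. A real version would back this with a database and
// could fetch missing entries on demand.
public class FunctionRegistry {
    private final Map<String, Function<String, String>> functions = new HashMap<>();

    // Register a function under a searchable name.
    public void register(String name, Function<String, String> fn) {
        functions.put(name, fn);
    }

    // Look a function up by name instead of linking against a file.
    public Function<String, String> lookup(String name) {
        Function<String, String> fn = functions.get(name);
        if (fn == null) {
            throw new IllegalArgumentException("No such function: " + name);
        }
        return fn;
    }

    public static void main(String[] args) {
        FunctionRegistry registry = new FunctionRegistry();
        registry.register("upcase", s -> s.toUpperCase());

        // "Every program is a function": using one is just a lookup plus a call.
        System.out.println(registry.lookup("upcase").apply("hello"));
    }
}
</ecode>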

Re:and another thing (1)

Anonymous Hack (637833) | more than 11 years ago | (#5157278)

Take libraries for example - why are they in files? Why not put all the functions in a database? Fully indexed, and cross referenced. When you need new functions, just download them.

That's a red herring. How do we store the database? As a file. What is a file? An indexed, named set of blocks on a storage medium. But that wasn't my point. My point wasn't that we couldn't use a DB or some other way of accessing our data, my point was that the concept of a "file" is just a way of categorizing data. It's semantics - you could call a DB table a "file", you could call a single field in the DB a "file", it wouldn't make any difference. You still use the data in a "blob" format that arbitrarily represents something useful. I think what he was trying to say was that data shouldn't be stored in any format at all - that it should just exist randomly and in and of itself, and our program should at run-time determine what the random data is supposed to be and somehow use it in some way. "Just like nature". Except in real life we isolate connected atoms (in the sense of "small things", not physics/chemistry atoms) into arbitrary groupings as well. It's a conceptualization we are required to make in order to create, reason, and communicate.
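
A compressed way of putting the same point in code (my own sketch, nothing from the article; the map names are invented): whether the storage layer calls it a file or a database field, what the program hands around is a name bound to a blob of bytes.

<ecode>
import java.nio.charset.StandardCharsets;
import java.util.Map;

// Whether the storage layer calls it a "file" or a "field", the program ends
// up with the same thing: a name bound to an arbitrary blob of bytes.
public class NamedBlob {
    public static void main(String[] args) {
        byte[] blob = "some arbitrary data".getBytes(StandardCharsets.UTF_8);

        Map<String, byte[]> asFileSystem = Map.of("/home/user/notes.txt", blob);
        Map<String, byte[]> asDatabase   = Map.of("notes.body", blob);

        // Same blob, two naming conventions - the "file" is just the category we chose.
        System.out.println(asFileSystem.keySet() + " and " + asDatabase.keySet()
                + " both name " + blob.length + " bytes");
    }
}
</ecode>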

So many errors... (1)

holygoat (564732) | more than 11 years ago | (#5157165)

Just picking one...
The important thing to look at is how files became the standard. It just happened that UNIX had them, IBM mainframes had them, DOS had them, and then Windows. And then Macintosh came out with them. And with the Internet, because of the UNIX heritage, we ended up thinking in terms of moving files around and file- oriented protocols like FTP. And what happened is that the file just became a universal idea, even though it didn't start out as one.

Macintosh launched January 1984 (link [compsoc.net]).
Windows 1.0 released November 1985 (link [intelescope.net]).

Not to mention that what he's saying is waffle... where do they dig up these guys?

My favorite quotes (2, Insightful)

ckedge (192996) | more than 11 years ago | (#5157168)

Virtual reality-based applications will be needed in order to manage giant databases

"Phenotropic" is the catchword I'm proposing for this new kind of software.

Oh, those are good signs.

And we're at the point where computers can recognize similarities instead of perfect identities, which is essentially what pattern recognition is about. If we can move from perfection to similarity, then we can start to reexamine the way we build software. So instead of requiring protocol adherence in which each component has to be perfectly matched to other components down to the bit, we can begin to have similarity. Then a form of very graceful error tolerance, with a predictable overhead, becomes possible.

Phht, I want my computer to be more predictable, not less.

we need to create a new kind of software.

No, what we need is an economic model that doesn't include a bunch of pointy haired bosses forcing tons of idiot (and even good) developers to spew crap.

And we need consumers to up their standards, so that crap programs can't become popular because they look shiny or promise 100,000 features that people don't need. And we need to get rid of pointy-haired bosses that choose software because of all the wrong reasons.

In phenotropic computing, components of software would connect to each other through a gracefully error-tolerant means that's statistical and soft and fuzzy and based on pattern recognition in the way I've described.

Sounds like AI, and another great method of using 10,000 GHz CPUs to let people do simple tasks with software written by morons, instead of simply writing better code and firing the morons and putting them out of business.

what is "nature"? (1)

miro2 (222748) | more than 11 years ago | (#5157177)

Basically, the problem with his argument comes from his vague use of the word "nature." Sometimes nature behaves in such a way that little changes make a difference. Sometimes it doesn't. It all depends on what level you look at. The same goes for a computer. Change the voltage at a chip pin from 5V to 4.9V and it still behaves fine. Change the value of a key variable, and it crashes. Modify the input to a face recognizer slightly, and it will gracefully recover.

God I'm so tired of that guy (0)

Anonymous Coward | more than 11 years ago | (#5157185)

A while back it seemed like he was in every single issue of Wired.

Just goes to show you that genius doesn't guarantee the ability to produce worth.

Did he actually say anything? (1, Insightful)

Anonymous Coward | more than 11 years ago | (#5157187)

He thinks pattern recognition-based methods like neural networks and genetic optimization are the solution to the complexity of traditional software.

So do lots of naive people. HE has a fancy new word for it.

Yes, fuzzy computing has its place -- there are certain applications for which it's much better than traditional programming -- but it took so many millions of years for our brains to evolve to the point where we can use logic and language to solve and express problems. It's ridiculous to think we should throw that all away.

Phurst Poast!! (-1)

Anonymous Coward | more than 11 years ago | (#5157200)

YEE HA! I r0xor! Im so 3733+ ! Phurst poast! you all suxor. I roxor! Phurst poast!! Live with it!

Get that man a haircut, STAT! (0, Offtopic)

MonTemplar (174120) | more than 11 years ago | (#5157210)

Maybe his thinking is being affected by the 10 million miles of dreadlocks coming out of his head. :)

10 Million Lines +, Fine just don't write them all (1)

kingtonm (208158) | more than 11 years ago | (#5157219)

This seems to me like something we're already trying to do. If you look at the way we're trying to shift the programming paradigm now, we'll see some of the concepts Jaron mentions. The current approach of analysing the problem, abstracting it, and then solving it on a case-by-case basis is not wrong, per se. However, we try to write our code for re-use, but all these small pools of answers to problems reside inside small development groups.

The answer to all of this might be to combine the concept of generative programming with better pattern matching. Then companies could more intelligently apply existing, tested, debugged code to their own problems and the problems of others.

Jaron, lose the dreadlocks and get a job. (1, Insightful)

SmokeSerpent (106200) | more than 11 years ago | (#5157223)

Why do we need programs with more than 10 million lines of code?

Has anyone ever noticed that every time Jaron writes an essay or does an interview he tries to coin at least one new word? Dude's better suited to the philosophy department.

"It's like... chaos, man. And some funky patterns and shit. Dude, it's all PHENOTROPIC. Yeah..."

Ugh (0)

Anonymous Coward | more than 11 years ago | (#5157227)

Just looking at his picture and reading his pretentious bio, this guy annoys the crap out of me.

Just plain silly... (2, Insightful)

rmdyer (267137) | more than 11 years ago | (#5157241)

Obviously you "can" write mammoth programs with 1 billion lines of code without crashing. It's the kind of program you are writing that becomes critical.

Generally, serial programs are the simplest programs to write. Most serial programs control machines that do very repetitive work. It's possible to write serial programs very modularly so that each module has been checked to be bug free. Today's processors execute code so that the result is "serially" computed. Yes, the instructions are pipelined, but the result is that of a serial process.

Where we go wrong is when we start writing code that becomes non-serial: threads that execute at the same time, serial processes that look ahead or behind. Most OOP languages tend to obfuscate the complexities behind the code. Huge class libraries that depend on large numbers of hidden connections between other classes make programming a real pain.

Mr. Lanier might be right, but I doubt it. Seems to me that a line of code could be as simple as that of a machine language command, in which case we are already using high level compilers to spit out huge numbers of serial instructions. Does that count? I think it does. Scaling code comes only at the expense of time. Most people simply don't think about the future long enough.

My 2 cents.

Jaron Lanier On Software Design and Phenotropics (2, Informative)

rpiquepa (644694) | more than 11 years ago | (#5157246)

I wrote the following on Dec. 20, 2002 about phenotropics. Jaron Lanier is mostly known for being the guy behind the expression "virtual reality." For its special issue "Big [and Not So Big] Ideas For 2003 [cio.com]," CIO Magazine talked with him about a new concept -- at least for me -- phenotropics. "The thing I'm interested in now is a high-risk, speculative, fundamental new approach to computer science. I call it phenotropics," says the 42-year-old Lanier. By pheno, he means the physical appearance of something, and by tropics, he means interaction. Lanier's idea is to create a new way to tie two pieces of software together. He theorizes that two software objects should contact each other "like two objects in nature," instead of through specific modules or predetermined points of contact. Jaron Lanier also talks about software diversity to enhance security. Check this column [weblogs.com] for a summary or the original article [cio.com] for more details.

Jaron Lanier's virtual reality (1)

resident-crank (644692) | more than 11 years ago | (#5157248)

Jaron Lanier's understanding of the history of computing displays a profound ignorance of anything he finds inconvenient, not to mention of the sciences from which his latest HypeWord(tm) is taken.

When Virtual Reality first became au courant, I thought it was almost completely wacko, with the rare exception of a few genuine applications relating to telepresence in dangerous or remote environments. Now that we know he thinks software the size of a planet is a *good* idea, it is impossible for me to take him seriously for any purpose.

WHOOOOSH! (0)

Anonymous Coward | more than 11 years ago | (#5157252)

The sound of 10,000 slashdotters missing the point.

Car Commercial (1)

AltImage (626465) | more than 11 years ago | (#5157275)

Isn't that him in that Nissan car commercial (I think it's Nissan)? It shows a shot of him pretty quickly riding in the car with some mild electronic music playing over the top, but I'm pretty sure it's him. Can anyone back me up on this?

theory is very interesting (3, Interesting)

jdkane (588293) | more than 11 years ago | (#5157289)

If we don't find a different way of thinking about and creating software, we will not be writing programs bigger than about 10 million lines of code, no matter how fast our processors become.

I think if more people get turned onto pure component-based development, then the current object-oriented paradigm could carry us much further.

You have chaotic errors where all you can say is, "Boy, this was really screwed up, and I guess I need to go in and go through the whole thing and fix it." You don't have errors that are proportionate to the source of the error.

In a way you do. Right now it's spelled try { ... } catch (...) { ... } and throw -or- a COM interface -or- whatever other mechanism your language provides for introducing error handling at the component or exact source-area level and handling errors gracefully.
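
For instance (my own minimal sketch, not anything from the article - the port-parsing component is just a stand-in), ordinary exception handling already gives you errors that stay proportionate to their source:

<ecode>
// Minimal sketch of component-boundary error handling: one component's failure
// is contained at its interface instead of taking the whole system down.
public class ComponentBoundary {
    // A stand-in component that may fail on bad input.
    static int parsePort(String value) {
        return Integer.parseInt(value); // throws NumberFormatException on garbage
    }

    public static void main(String[] args) {
        String[] inputs = { "8080", "not-a-port", "443" };
        for (String input : inputs) {
            try {
                System.out.println("Using port " + parsePort(input));
            } catch (NumberFormatException e) {
                // The error stays proportionate to its source: one bad input is
                // reported and skipped, and the rest of the run continues.
                System.err.println("Ignoring bad port value: " + input);
            }
        }
    }
}
</ecode>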

"protocol adherence" is replaced by "pattern recognition" as a way of connecting components of software systems.

That would mean a whole new generation of viruses that thrive on the new software model. That would certainly stir things up a bit. Of course, any newly pioneered methodology is subject to many problems.

But I'm not putting down his theory, just commenting on it. The guy's obviously a deep thinker. We need people to push the envelope and challenge our current knowledge like that. Overall the theory is extremely interesting, although the practicality of it will have to be proven, as with all other new ideas.

Crap Artist (0, Flamebait)

mr.henry (618818) | more than 11 years ago | (#5157294)

This guy reminds me of Ira Einhorn [crimelibrary.com], the hippie guru who BS'd his way into lots of "adjunct appointments" like the ones Mr. Lanier's page says he has. Lots of companies thought he was some eccentric genius because he said freaky things and dressed weird.

Phenotropic programming = crypto fascist code (1)

Krapangor (533950) | more than 11 years ago | (#5157319)

These guys want to weed out all code which is not perfectly flawless and doesn't fulfill 100 percent of its objectives.
They want to use genetic meta-programming algorithms to create new code from the old, but only the "pure code" is allowed to be recoded, evolve, and grow. Code with even minor bugs is consigned to oblivion.
Besides the obvious fact that this "superior" code is an evolutionary cul-de-sac, it's also a crypto-fascist agenda which should not be tolerated in the open source movement. Imagine that the code were humans: people like RMS or Linus Torvalds would never have been bred, because their ancestors had long beards, never showered, or even wore glasses. This is morally flawed, even if it's just lines of C source. And note that Linux would never have reached its modern state either, having had lots of bugs in its infancy.

I just checked out his site... (1)

inode_buddha (576844) | more than 11 years ago | (#5157342)

and the full text of the interview. I'm starting to think he's onto something, given such newer areas of research as chaos theory [around.com] and complexity [santafe.edu]. For the uninformed, these are the folks who bring you such things as fractal generation and the "butterfly effect". (I purchased hardcopies a few years ago.) I hope he will correct me if I'm wrong, but I think Jaron's real question is: "At which point can we not use complex computations/computers to model the "real" world (FSVO $REAL)? If our computational mechanisms and models approach the complexity of the "real", how can we validate our results against a third party?" Just an idea.

not that bad... (4, Insightful)

nuffle (540687) | more than 11 years ago | (#5157345)

Give him a break. He's got some points. And at least he's thinking about real and genuine problems. I see a lot of people commenting with the 'who needs 10 million lines of code?' shtick. Sounds familiar [uct.ac.za].

We're going to need to do things in a decade or two that would require 10 million lines of code (measured by current languages), just as the things we do now would require 10 million lines of code in 1960's languages.

The new languages and techniques that we now have provide ways for us to reduce the apparent complexity of programs. This guy is just looking for a way to do that again. Certainly there is room to disagree with his techniques for accomplishing this, but it is shortsighted to deny the problem.

Malbolge is the language of the future? (1)

sam_handelman (519767) | more than 11 years ago | (#5157360)

Summary: In order to duplicate the features of a biological organism that Jaron Lanier finds desirable, you'd end up with a programming language that is maybe a little like Lisp, but a lot like Malbolge. [mines.edu] Malbolge, the programming language of the future, is a free download!

There's something about the way complexity builds up in nature so that if you have a small change, it results in sufficiently small results; it's possible to have incremental evolution.

Firstly, that simply isn't true at all; as someone who understands both computer programming and genetics (my degrees are in Biochemistry and Computer Science), I can say with confidence that this is all hogwash.

The same is true of most of what supposedly imports biological concepts into computing. Neural nets and genetic algorithms are very useful tools, and they have been inspired by what we see in nature, but in terms of how they really function, under what circumstances they function, what sorts of problems they are suited for solving - a neural net is nothing like a real nervous system.

As a biologist I put it this way - a neural net is a very poor model of a nervous system. Genetic algorithms are utterly dreadful models for natural selection.

So, in this (utterly stupid) comparison between computer source code and living genomes, Jaron Lanier asserts that a living organism is somehow fault tolerant while a program is not. Let me disassemble this assertion.

Firstly, a living organism is far larger than any single computer program, even Windows. Living organism == computer is far more apropos. The genome (analogous to source code) of a living organism runs up to the billions of bits; the proteome (the concentrations and structures of the proteins that do the actual work of the living organism) would map, even in a single-celled organism, to some vastly larger and more complex structure, terabytes of data at LEAST. You can say, "that's his point!" But this level of complexity is CONSTRUCTED FROM SMALLER PIECES; individual genes. We can duplicate the complexity of a living organism in a computer without duplicating the complexity of a living organism within a single program. If each program can be as complex as an individual gene (thousands of bytes? Easy!) and produce executable code as complex as an individual protein (this is actually harder, but I believe it is possible), then your program construct can mimic the level of complexity of a biological organism.

So, how IS it that all of this complexity (a human organism) is bug free, while a computer program is not?

Firstly, the human organism is NOT "bug free." There are all sorts of inputs (chemicals) that cause aberrant behavior of every sort. Bugs happen with some random frequency, anyway. Over time, even if nothing else did, the accumulated errors in your biological processes would kill you.

Secondly, to the extent that the human organism is, in some abstract sense, more fault tolerant than a computer program, recall that the human organism is NOT designed (warning: science in progress. All creationists are asked to leave the room.) BILLIONS OF YEARS of trial and error have gone into the selection of the protein sequences that give us such problem free use (or not!) every day of our lives. With a development cycle that long, even Windows could be bugfree.

Thirdly, there is another consequence to our having evolved (rather than having been designed) - inefficient use of memory. Most of the "junk DNA" probably serves some purpose, but brevity is barely a consideration at all (in a larger organism, such as you or I; in fast-replicating organisms, such as bacteria or yeast, there is far less genetic packaging). We are extremely tolerant of mutations in these regions of junk DNA - there are analogous regions in a computer's memory where substitutions are tolerated; bit flips in the picture of autumn leaves displayed as my desktop would not crash my machine - in fact, since this image is a bitmap, I wouldn't even NOTICE the changes. If we applied natural selection to our computer programs, some regions of highly fault-tolerant code might eventually evolve into something functional; my desktop picture might evolve into awk (okay, now I'm being silly).

In something which has been DESIGNED, you short-circuit all of that. The code of your computer program is not filler, pictures, or stuffing; it doesn't, it CAN'T, share the properties of these dispensable DNA sequences - it isn't dispensable! There are a number of single-nucleotide substitutions (equivalent to flipping a single bit) that will kill you stone dead! Your computer program is not less fault tolerant than the core sequences of the ribosome, the structure which you use to convert nucleic acid sequences (your genome) into protein sequences (proteome).

Now, it is true, there are other places in your DNA where a bitflip will alter some chemical constant in a non-fatal (possibly beneficial) fashion. Might we not duplicate this property in a programming language? A language with this property would be desirable if you wanted your computer program to evolve toward a certain function through a series of bitflips. Indeed, there are computer languages which have this property, to some degree or another. LISP does. Do you know what programming language really EXEMPLIFIES this property? Malbolge. [mines.edu]

Who wants to program in Malbolge? Raise your hands, kids! A protein does one job, instead of another, because it has affinity for one substrate/chemical, instead of another. In a computer, you'd duplicate this sort of thing by fiddling with constants, and not by changing the source code at all. Small, low order changes in these constants would have incremental effects on what your program actually did.

Malbolge duplicates this property very nicely.
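
To make that concrete with a toy sketch of my own (nothing here is from the article, and the threshold constant is invented): the matcher below does one "job" rather than another because of a tunable constant, and nudging that constant a little changes what it accepts only a little, whereas editing the logic itself is an all-or-nothing affair.

<ecode>
// Toy sketch: behavior tuned through a constant rather than through code changes.
// Small changes to AFFINITY_THRESHOLD shift what the matcher accepts gradually,
// much as small changes to a protein's binding affinity shift its function.
public class AffinityMatcher {
    // The tunable "chemical constant" of this component.
    static final double AFFINITY_THRESHOLD = 0.75;

    // Crude similarity: fraction of aligned positions where the strings agree.
    static double similarity(String a, String b) {
        int len = Math.min(a.length(), b.length());
        int matches = 0;
        for (int i = 0; i < len; i++) {
            if (a.charAt(i) == b.charAt(i)) matches++;
        }
        return (a.isEmpty() && b.isEmpty()) ? 0.0
                : (double) matches / Math.max(a.length(), b.length());
    }

    static boolean accepts(String pattern, String input) {
        return similarity(pattern, input) >= AFFINITY_THRESHOLD;
    }

    public static void main(String[] args) {
        // Lowering the threshold from 0.75 to 0.70 admits slightly more inputs;
        // it does not change the program's behavior wholesale.
        System.out.println(accepts("phenotropic", "phenotropik")); // true
        System.out.println(accepts("phenotropic", "protocol"));    // false
    }
}
</ecode>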

To me, this complacency about bugs is a dark cloud over all programming work.

Personally, I believe that this problem is fundamentally solved, and has been for some time. Heap on more degrees of abstraction. If I wanted to write a program that would take 1 billion lines of C-Code, I'd write a higher level language, and write in that, instead.

Reason for Bugs (1, Interesting)

Anonymous Coward | more than 11 years ago | (#5157383)

Would you buy a CPU which is full of bugs?
No? Why? Because it wouldn't be very usable?
Would you use software which contains bugs?
Yes? Why? Because it remains usable...

It is possible to write bugfree software,
but there's no need to for the average joe market.

universe uses just a few lines of code (1)

bigpat (158134) | more than 11 years ago | (#5157409)

The analogy to the universe is important. The whole approach to physical sciences has been to simplify the rules (analogous to the lines of code) to as simple equations as possible. So the universe can be described with just a few equations with the rest just being data.

I think Jaron is onto something, but his conclusions are off. I think current methodologies are just where we need to go. Keep the code as simple (small) as possible and put the data in some database or something.

Just think of Microsoft... our favorite whipping Mega Corp... when separate groups work on millions of lines of code and try to piece it all together, the result is not always ideal. But when the goal is simple code and simple protocols, we get better software.

Re:universe uses just a few lines of code (0)

Anonymous Coward | more than 11 years ago | (#5157434)

The analogy to the universe is important. The whole approach to physical sciences has been to simplify the rules (analogous to the lines of code) to as simple equations as possible. So the universe can be described with just a few equations with the rest just being data.


Ever wondered where the source of the universe-program is stored?

Programming evolution (2, Insightful)

ites (600337) | more than 11 years ago | (#5157414)

There seem to be only two ways to break the need to write code line by line. Either it evolves (and it is possible that evolved software will work exactly as this article suggests, living off fuzzy patterns rather than black/white choices). Or it is built, by humans, using deliberate construction techniques. You can't really mix these any more than you can grow a house or build a tree. We already use evolution, chaos theory, and natural selection to construct objects (animals, plants), so doing this for software is nothing radical.
But how about the other route? The answer lies in abstracting useful models, not just in repackaging code into objects. The entire Internet is built around one kind of abstract model - protocols - but there are many others to play with. Take a set of problems, abstract them into models that work at a human level, and make software that implements the models, if necessary by generating your million-line programs. It is no big deal - I routinely go this way, turning 50-line XML models into what must be at least 10m-line assembler programs.
Abstract complexity, generate code, and the only limits are those in your head.
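
As a toy illustration of that model-in, generated-code-out route (my own sketch, not my actual tooling; the field list stands in for the XML model), a tiny declarative model can expand into far more generated source than anyone would want to write by hand:

<ecode>
// Toy model-to-code generator: a handful of declarative field names expand into
// many lines of boilerplate source. The model stays small; the output does not.
public class TinyGenerator {
    public static void main(String[] args) {
        String[] model = { "name", "email", "createdAt" }; // the "50-line model", in miniature

        StringBuilder out = new StringBuilder("public class Customer {\n");
        for (String field : model) {
            String cap = Character.toUpperCase(field.charAt(0)) + field.substring(1);
            out.append("    private String ").append(field).append(";\n")
               .append("    public String get").append(cap).append("() { return ").append(field).append("; }\n")
               .append("    public void set").append(cap).append("(String v) { this.").append(field).append(" = v; }\n");
        }
        out.append("}\n");
        System.out.println(out);
    }
}
</ecode>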

Liability (1)

wytcld (179112) | more than 11 years ago | (#5157440)

A friend recently got a $90 plane fare to France. Business class. And got bumped up to First because it was oversold. There was a computer error, but the airline had to honor the offer. Much of that flight was sold at the $90 price.

Right now, if that error's there, and the sale goes through, you can in principle track it back to whose error it is. If it was in an agent's system rather than the airline's, for instance, the airline could recover from the agent. So in Jaron's tomorrow, when things are matched by patterns instead of precisely, and an error happens, is it the case that no one exactly is responsible? Would you want to do business with someone whose systems just approximately, sort of matched up with yours? If yours are running on rough approximation rather than exactitude too, can a determination ever be made of ownership of a particular error?

Maybe it will all balance out. Maybe large errors will become less frequent as Jaron claims. But small errors, in being tolerated, will become an overhead of corruption. Perhaps a predictable overhead, as he claims, but what's to keep it from inflating over time, as programming practices become laxer because nobody can really get blamed for the errors any more, because they'll be at best even more difficult to localize than now, and at worst totally impossible to pin down?

Right about one thing (1)

still_nfi (197732) | more than 11 years ago | (#5157460)

This is a thought that had been crossing my mind recently. I mean, look at the games industry. Look at the complexity of games as they have evolved over the last 10 years or so. The manpower to produce a commercial game now is exponentially more than it was then. Programmers, testers, artists, 3D designers, musicians, management....there is already a barrier to complexity there...cost, time to market, talent?

Complexity is definitely becoming a problem but arbitrarily picking 10 million as the magic barrier is a little naive. He discounts the evolutionary process of development. Tools, languages, libraries get richer every day (slowly).

IMHO, the solution lies in the development of testing tools. When you can write an automated test harness that can fix the bugs that it finds, then we are making progress!

Reasonable especially for interfaces (2, Insightful)

markk (35828) | more than 11 years ago | (#5157466)

Looking at this, I don't see anything revolutionary in what is proposed, just something hard to do. Of course we work hard at signal processing, pattern recognition, and clustering algorithms. They are used for everything from medical imaging, radar, and oil exploration to trying to automatically call balls and strikes. What I see being proposed here would be to look at the interfaces between - hmm, modules, for lack of a better term - in a similar way. If you want, it is a far-out extension of the idea of Object Request Brokers.

For example, a very large system would have a goal seeking component creating a plan, it would inquire as to what modules are available, look at the interfaces around and classify them (here is where the clustering and pattern recognition might help) and then choose ones which fit its plan. It would then check results to see if this worked closely enough to move it along its plan.

This implies a database of types of modules and effects, a lower-level standard protocol for inquiring and responding, and another means of checking results and recognizing similarity to a wanted state - a second place where the recognition and clustering algorithms would be useful. This is obviously not easy to do...
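
A very rough sketch of that registry-plus-classification step (the names and the tag-similarity scoring are mine, invented for illustration): modules advertise capability tags, and the planner picks whichever advertised interface is closest to what it wants, then checks the result afterwards.

<ecode>
import java.util.HashSet;
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;

// Hypothetical module registry: modules are described by capability tags, and a
// caller selects the closest match rather than demanding an exact interface.
public class ModuleRegistry {
    private final Map<String, Set<String>> modules = new LinkedHashMap<>();

    void register(String name, Set<String> tags) {
        modules.put(name, tags);
    }

    // Jaccard similarity between what the planner wants and what a module offers.
    static double similarity(Set<String> wanted, Set<String> offered) {
        Set<String> union = new HashSet<>(wanted);
        union.addAll(offered);
        Set<String> intersection = new HashSet<>(wanted);
        intersection.retainAll(offered);
        return union.isEmpty() ? 0.0 : (double) intersection.size() / union.size();
    }

    String bestMatch(Set<String> wanted) {
        String best = null;
        double bestScore = -1.0;
        for (Map.Entry<String, Set<String>> entry : modules.entrySet()) {
            double score = similarity(wanted, entry.getValue());
            if (score > bestScore) {
                bestScore = score;
                best = entry.getKey();
            }
        }
        return best; // the planner would still verify the result fits its plan
    }

    public static void main(String[] args) {
        ModuleRegistry registry = new ModuleRegistry();
        registry.register("image-filter", Set.of("image", "filter", "2d"));
        registry.register("radar-cluster", Set.of("signal", "cluster", "radar"));

        // Prints "image-filter" - the closest, not necessarily exact, interface.
        System.out.println(registry.bestMatch(Set.of("image", "blur", "filter")));
    }
}
</ecode>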

The novel "Night Sky Mine" by Melissa Scott comes to mind as an example of this taken way out. There is an ecology of programs that is viewed by "programmers" through a tool that re-inforces the metaphor of programs being living organisms "trained" or used by the programmers to get what they want. I cannot see this being generically useful - many times we really do want a "brittle" system. It is certainly a way to get to very complex systems for simulation or study or games!

Unfair Comparison (2)

xonos (218227) | more than 11 years ago | (#5157467)

i think comparing computer science to nature is a pretty unfair comparison. Nature is analog, computers are digital. You can make computers seem like they are fuzzy recognizers, but underneath it all it is a very strict set of rules. Also, fault tolerance for an entire ecosystem is very high, but for the individual it is very, very low. So if there is a small defect in my heart, the human race will continue without a hitch, but i won't and neither will my family... So putting fault tolerance into software might make an entire system stay up and running, but it might behave slightly differently every time it adjusts for an error. After a while it might behave completely differently from what it was designed to do, and IMHO that's bad.

The reason computers are so powerful and useful is their strict adherence to the rules, and the fact that they should be able to reproduce exactly repeatable results, no matter how often they are run.

Nature and computers play two completely different roles in our society, and trying to make one be like the other seems both illogical and detrimental.