No More Coding From Scratch?

Susan Elliott Sim asks: "In the science fiction novel, 'A Deepness in the Sky,' Vernor Vinge described a future where software is created by 'programmer archaeologists' who search archives for existing pieces of code, contextualize them, and combine them into new applications. So much complexity and automation has been built into code that it is simply infeasible to build from scratch. While this seems like the ultimate code reuse fantasy (or nightmare), we think it's starting to happen with the availability of Open Source software. We have observed a number of projects where software development is driven by the identification, selection, and combination of working software systems. More often than not, these constituent parts are Open Source software systems and typically not designed to be used as components. These parts are then made to interoperate through wrappers and glue code. We think this trend is a harbinger of things to come. What do you think? How prevalent is this approach to new software development? What do software developers think about their projects being used in such a way?"
  • Good Idea (Score:4, Funny)

    by fatman22 ( 574039 ) on Saturday November 04, 2006 @06:59PM (#16719677)
    Code recycling, and that word was chosen specifically for this case, is a time-honored and proven concept. One just has to look at Microsoft's products to see that.
That's why there are many times more Microsoft developers than there are for any other platform. MS has made programming insanely simple. DLLs and other forms of COM objects make code reuse very, very simple. I can make an application with a functional web browser embedded in it in under 30 seconds. Can't do that on any other platform that I'm aware of.
  • Oh my (Score:2, Insightful)

    by Lord Duran ( 834815 )
    If you think debugging is a pain NOW...
  • I predict that this will happen shortly after all composers simply re-arrange themes and phrases from previous musical pieces in order to create any new compositions and authors simply splice paragraphs from existing books to form new ones.
    • Comment removed based on user account deletion
      • It's truly disturbing to me how someone who put so much effort into writing a response could have completely failed to understand simple words like "all" and "any".
        • Comment removed based on user account deletion
          • by Curien ( 267780 )
            it's silly to compare an aesthetic field, such as music, with a functional field, such as programming

Music is not always artistic; it is often quite functional. I'd go so far as to say that most new music is more functional than artistic. Consider, for example, advertisement jingles. If for some reason you insist that music is especially artistic, substitute any other creative process: architecture, drawing, etc. These are all things that increasingly use existing material; but in all cases, the idea that or
          • by fuzzix ( 700457 )

            What I wanted to write further is that, in spite of the insistence of some Slashbots that coding is an artform, people who code for aesthetic reasons are a minority. Most of the time people just want to get their work done and cash their paycheck.

            You could say the same about professional musicians - they just want to play the session, get paid and smoke that joint so they can think of guitar notes that would irritate an executive kind of guy.

            Just because working in the medium is often as soulless as any oth

    • Re: (Score:3, Funny)

      by B3ryllium ( 571199 )
      It's called "Pop Music". Or, more specifically, "Nickelback". :)
It has already happened. Take Linux: there's hardly a program out there that doesn't use libc. It's almost impossible to find a GUI program that doesn't use libX11. When's the last time you implemented sin()?
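      To make the reuse point concrete, here is a small illustrative sketch (Python stands in for C here; the hand-rolled version is exactly the wheel nobody bothers to reinvent anymore, because the library call has been debugged for decades):

```python
import math

def naive_sin(x, terms=10):
    """Hand-rolled Taylor-series sine -- the kind of code nobody writes from scratch today."""
    result = 0.0
    for n in range(terms):
        result += (-1) ** n * x ** (2 * n + 1) / math.factorial(2 * n + 1)
    return result

# In practice everyone just reuses the library implementation (typically libm underneath):
print(math.sin(1.0))   # reused
print(naive_sin(1.0))  # close, but slower and less accurate at the extremes
```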
    • I predict that this will happen shortly after all composers simply re-arrange themes and phrases from previous musical pieces in order to create any new compositions

      Unfortunately, this already happens. The standard for a "substantial" portion under US copyright case law is so short that most melodies are likely already taken [slashdot.org]. Spider Robinson wrote a short story about such a situation [baen.com].

  • More often than not, these constituent parts are Open Source software systems and typically not designed to be used as components. These parts are then made to interoperate through wrappers and glue code.

    Isn't that the point of Free software? That it is flexible, that we have the source code so we *can* make it do just what we want, rather than be limited by the original authors?

    What's the point of coding from scratch if you don't have to?
    • by msobkow ( 48369 )

      Distributed systems are designed for component integration from day one.

      Has everyone been sleeping?

  • by technoextreme ( 885694 ) on Saturday November 04, 2006 @07:02PM (#16719723)
I certainly do not want to be developing the same common hardware implementation for an FPGA over and over and over again. Give me the code for something that works and let me get to the stuff that is unique to the project.
  • What do software developers think about their projects being used in such a way?
    Fine with me as long as they pay the licensing fees.
If a piece of open source software is used as a component often enough, people will turn it into a component (through an API, plugins, software packages, modules, etc.). This improves the overall design of the software and allows better code reuse, but more importantly reduces maintenance issues by allowing developers to upgrade components with relative ease. I see this as a "Good Thing(tm)".
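    A minimal sketch of what "turning it into a component" often amounts to in practice (Python and the Unix `wc` utility are used purely as an illustration; neither comes from the parent comment, and the file name is hypothetical):

```python
import subprocess

def word_count(path):
    """Thin wrapper that turns the stand-alone `wc` utility into a reusable 'component'.

    `wc` was never designed as a library, but a few lines of glue give it a stable
    programmatic interface that callers can depend on (assumes a Unix-like system).
    """
    out = subprocess.run(["wc", "-w", path], capture_output=True, text=True, check=True)
    return int(out.stdout.split()[0])

if __name__ == "__main__":
    print(word_count("README.txt"))  # hypothetical file, for illustration only
```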
Nice piece of science fiction, but the reality is there are orders of magnitude more things that can be done writing new pieces of code from scratch than just combining existing ones. People paying for the software tend to make sometimes ridiculous demands for customisation of everything from look and feel to the algorithms doing the actual work, in everything from transaction processing to science to graphics. Therefore new snippets will continually need to be coded.

    What we're talking about here is the reuse o
Hell, coders keep rewriting string-handling routines that exist aplenty.

      A lot of this may come from software license incompatibilities. Publishers of proprietary software don't want some copylefted library such as libiberty or readline "infecting" the entire project, which may contain valuable trade secrets or patented processes, with the requirements of the GNU GPL. Even in free software, it's impossible to use GPL and Apache license software together.

  • by bigtallmofo ( 695287 ) on Saturday November 04, 2006 @07:11PM (#16719795)
The promise of SOA says that you won't have to do this. If you believe in that promise, then anyone who develops projects in the future will create them in discrete elements that are accessible as a web service. If you want to start a new development project, just utilize those services that you need and ignore the ones you don't. Because the functionality is encapsulated (and therefore written, debugged and tested) within the service, you're good to go.

    I see application projects in the future acting like glue that holds many services like this together and makes them more than the sum of their parts.
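    For the code-minded, a deliberately simplified sketch of "application as glue over services" (the endpoints, payload shapes and field names are invented for illustration; nothing here is from an actual SOA deployment):

```python
import json
import urllib.request

# Hypothetical endpoints -- the URLs and JSON fields below are made up.
CUSTOMER_SVC = "https://services.example.com/customers/42"
ORDERS_SVC = "https://services.example.com/customers/42/orders"

def fetch(url):
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

# The "application" is little more than glue that composes existing services.
customer = fetch(CUSTOMER_SVC)
orders = fetch(ORDERS_SVC)
print(f"{customer['name']} has {len(orders)} open orders")
```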
    • Re: (Score:3, Informative)

      SOA does not require everything to be made into a web service to be useful and accessible. A service-oriented architecture is NOT tied to a specific technology and may be implemented using interoperability standards including RPC, DCOM, ORB or WSDL or anything else for that matter as long as the interface is defined and published. An application that wants to use the services just builds a contract with the service and off you go. The implementation of the interface is hidden so it could be in .NET, Java, C
    • Re: (Score:3, Interesting)

      by samael ( 12612 )
      The promise of SOA is, sadly, not terribly accurate.

Services change over time, new projects need extra bits of information, changed regulations mean that input that was allowed no longer is, etc. And you either end up with 300 different versions of each business service, so that older clients can still talk to their version of the service, or you have to update the clients whenever the server versions change.

      They're still useful - we use them a lot where I work, but they aren't a silver bullet.
  • ...though basically well written, is full of dumb ideas. This particular dumb idea comes from the notion that the development tools we use now are pretty much as sophisticated as they're ever going to get. That's right up there with the famous urban legend about the proposal to close the patent office.

    Vinge's problem is that he makes too much of the famous failures of AI, and has fallen in with the camp that believes that software will never be able to compete with wetware. That has yet to be proven, but

Your criticism is a bit... flat.

You know, he built a universe where technological progress just isn't infinite in the regions where humans live.

Also, your first statement is at best ambiguous. Do you mean the present tense as of the time of the book, or present-day technology?
For the latter: you realize they've got ramscoop space drives, live for 500 years, and can do just about anything you'd expect from a not-quite-far-future sci-fi environment.

And if you mean the former: actually try to understand the book. Technological progress
    • Vinge's problem is that he makes too much of the famous failures of AI, and has fallen in with the camp that believes that software will never be able to compete with wetware.
That doesn't sound much like the Vinge novels I've read. In fact, the companion novel to A Deepness In The Sky, A Fire Upon the Deep, operates from the assumption that true AI is godlike in comparison to any biological intelligence.
  • by Zepalesque ( 468881 ) on Saturday November 04, 2006 @07:13PM (#16719809)
As the developer of freely available software, I find the prospect of people using my code a mixed bag. Partly, I feel ownership of the code I write and am somewhat offended by the idea of people using it. However, as a paid engineer, I go through this at regular intervals; older projects get handed to others for support as I work on new components.

    On the other hand, I welcome the idea that my free code would be used by others - it is a flattering prospect, I suppose.

Others profit from this sort of re-use: COM, CORBA, JARs, etc...
    • The way I learned how to program in pretty much any language was by looking at other people's code, seeing how they did things, and duplicating those methods in my own. Eventually, I get good enough at a thing that I can come up with my own ideas, or that I remember how things are done without having to go look.
A key barrier to code reuse is that technology marches on. Many advances render older code either obsolete or, at best, cumbersome. A recent example is the addition of generics to .NET 2.0, which rendered vast tracts of our .NET 1.1 internal class libraries obsolete.
  • Few people write their own C compilers, libc, math libraries, HTTP protocol handlers and so on. If other/bigger components are available, of course they will be reused.
  • No way Jose. (Score:4, Insightful)

    by 140Mandak262Jamuna ( 970587 ) on Saturday November 04, 2006 @07:27PM (#16719935) Journal
Yeah, all the European Space Agency was trying to do was reuse the Ariane 4 code in Ariane 5. And the rocket blew up 40 seconds [umn.edu] after the launch. Why? Ariane 5 flies faster than Ariane 4 and hence has a larger lateral velocity. The main software thought the readings were too high and marked the lateral velocity sensors as "failed". All four of them. Then, without sensors, all the computers shut down. The vehicle blew up. But by that time some bean counter had already shown millions of francs in savings, claimed credit for specifying the Ada language for the flight control software, and collected his bonus.

Some basic tasks like file I/O or token processing and other minor things might be reused. But even then, porting something as simple as a string tokenizer written for a full-fledged virtual-memory OS like Unix/WinXP to a fixed-memory handheld device is highly non-trivial, especially if you want to handle multi-byte i18n char streams

    If the author sells what he was smoking while coming up with the article, he stands to make tons of money.
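    As a hedged illustration of the constraints the parent mentions (bounded memory, multi-byte streams), here is a sketch of a streaming tokenizer; Python is used for brevity, a real handheld port would likely be C, and the chunk size and file handling are invented for the example:

```python
import codecs

def stream_tokens(fileobj, chunk_size=4096):
    """Yield whitespace-separated tokens from a byte stream without slurping it all in.

    Sketch only: the incremental UTF-8 decoder keeps multi-byte characters intact
    across chunk boundaries, and memory stays bounded by the chunk size -- the sort
    of constraint a fixed-memory device imposes.
    """
    decoder = codecs.getincrementaldecoder("utf-8")()
    buffer = ""
    while True:
        chunk = fileobj.read(chunk_size)
        buffer += decoder.decode(chunk, final=not chunk)
        if chunk:
            parts = buffer.split()
            # The last piece may be cut mid-token; carry it into the next chunk
            # unless the buffer happens to end on whitespace.
            buffer = "" if (not parts or buffer[-1:].isspace()) else parts.pop()
            yield from parts
        else:
            yield from buffer.split()
            return

# Usage (hypothetical file, opened in binary mode):
# with open("input.txt", "rb") as f:
#     for token in stream_tokens(f):
#         ...
```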

    • Re: (Score:3, Insightful)

      all European Space Agency was trying to do was to use the Ariane 4 code in Ariane 5. And the rocket blew up 40 seconds after the launch. Why? Ariane 5 flies faster than Ariane 4 .. the main software thought the readings were too high and marked lateral velocity sensors as "failed"

      You claim that this rocket failure is due to software reuse. That just sounds wrong to me. I don't think that not starting from scratch is that relevant. I could more convincingly argue that the failure is due to the software no
      • by topham ( 32406 )

Without doing the research required (I'm lazy on the weekend, OK...), I believe the actual cause of the failure was that they loaded a table of values intended for Ariane 4 instead of the dataset for Ariane 5. This table of values could have been anything from temperature readings to timeout values, etc.

Obviously, with values appropriate to the OTHER rocket, an imminent failure occurred. Better checking of datasets would have prevented the problem.

        Software re-use wasn't the problem.
        • by mce ( 509 )
Actually, the whole situation was a lot more complex. There were hardware exceptions (not enough bits to represent data after conversion from floating point to integer), combined with deadly so-called efficiency optimisations (the trap for these hardware exceptions had been disabled), combined with a cascade of other problems in which code mis-interpreted the bits encoding the unexpected hardware exception as normal data.

On top of all that, the original conversion error occurred in a piece of code that had no
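          To make the failure mode concrete, here is a rough Python analogue of that float-to-integer conversion (purely illustrative; the real code was Ada, and the numbers below are invented):

```python
import struct

def to_int16(value):
    """Pack a sensor reading into a 16-bit signed integer, as the reused flight code did.

    struct.pack raises struct.error once the value no longer fits -- a rough software
    analogue of the unhandled operand exception described above.
    """
    return struct.pack(">h", int(value))

print(to_int16(1200.0))    # fine within the smaller flight envelope
print(to_int16(40000.0))   # raises struct.error: the faster trajectory exceeds 16 bits
```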
    • by Coryoth ( 254751 )
      It was indeed a reuse error. It wasn't, however, an unavoidable reuse error [eiffel.com]. Had the code included proper specification, rather than the specification being buried in a vast paper document, then reuse would have worked out fine.
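      Continuing the toy conversion sketch above (again an illustration, not the Eiffel example from the linked article), a contract-style precondition makes the buried assumption explicit:

```python
def to_int16_checked(value):
    """Same toy conversion, with the operating assumption stated as a precondition.

    Design-by-contract style: the range assumption that was buried in a paper document
    becomes a check that fails loudly -- and testably -- when the code is reused outside
    its original envelope.
    """
    assert -32768 <= value <= 32767, "input exceeds the assumed 16-bit range"
    return int(value)
```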
There are a lot of techniques available that can make software reuse possible. One of them is Test Driven Development [agiledata.org]. When every bit of your functionality is checked with automated tests, you can easily put your source code into another context, rerun all the tests, and if they pass it is a good indicator that it will work.

We also need a paradigm shift to concurrent programming [wikipedia.org]. This will, IMHO, inevitably happen, as processors become more and more multicore. The currently used threading model is
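      A minimal sketch of that test-driven safety net (the helper and its tests are invented for illustration): drop the code into a new context, rerun the suite, and a green run is your reuse indicator.

```python
import unittest

def clamp(value, lo, hi):
    """A small reusable helper; the tests below travel with it into any new context."""
    return max(lo, min(hi, value))

class ClampTests(unittest.TestCase):
    def test_within_range(self):
        self.assertEqual(clamp(5, 0, 10), 5)

    def test_below_range(self):
        self.assertEqual(clamp(-3, 0, 10), 0)

    def test_above_range(self):
        self.assertEqual(clamp(99, 0, 10), 10)

if __name__ == "__main__":
    unittest.main()  # rerun in the new context before trusting the reused code
```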
    • If the author sells what he was smoking while coming up with the article, he stands to make tons of money.

But you aren't familiar with the setting of the story. There will come a time when writing code is too complex for a single person, unless of course the compilers are so advanced that we basically let Strong AI write it for us. There will come a day when the average office application will have trillions of lines of code in it, and the operating system quite a large jump beyond that.

      Having to write f
      • Chicken and the Egg...the Strong AI is made up of VERY COMPLEX software which has been reused from other AIs.

Oh, and when you get a Strong AI, let me know. I've been waiting on one for over 20 years. I did AI work in the early '80s, and back then such a thing was "just around the next corner". That's been a LONG corner.
  • Your code will be assimilated.
The article is a bit of nonsense really. It ignores the fact that building software is a vertical problem anyway, where most often you pick the highest-level tool that gets the job done. You now have:

    Transistors
    Digital Logic
    Machine Code
    Compilers
    C Code Family
    Dynamic Languages / Visual Languages
    What next...

In 20 years' time nobody will be pissing around with C code or Java or Lisp (OK, maybe Lisp) except for historical/maintenance reasons. There will be new higher-level constructs leveraging streamlined minimal l
    • Re: (Score:3, Funny)

      by ResidntGeek ( 772730 )
In 20 years' time nobody will be pissing around with C code or Java or Lisp (OK, maybe Lisp) except for historical/maintenance reasons.
      That's right. Why, just think of 20 years ago. Today we program in C and C++, and beginners use BASIC. Not at all like people did it 20 years ago.
  • Vernor Vinge described a future where software is created by 'programmer archaeologists' who search archives for existing pieces of code, contextualize them, and combine them into new applications.

Yeah, that's the present for a lot of us (most of us, even?)

Unless you're working in R&D or making things that don't exist yet (a lot of game engines that are from scratch, a lot of open source projects, new protocol implementations, etc.), that's probably what you do 40 hours a week. I know a lot of people for exa

    • by XMyth ( 266414 )
Yeah... this article is stupid. Sounds like a manager heard about this concept called 'code re-use' and thought it could improve productivity and let them get more work done. I feel sorry for this one's subordinates. Though really, this kind of thing happens every day....
  • What?!? Why is this news? We have libraries - those are the big chunks of useful code - packaged and (hopefully) documented for re-use and widely distributed.

When you design new applications you look for libraries that do the bulk of the work - and the application becomes mostly 'glue' to hold the libraries together. Scripting languages are the epitome of this - where large piles of carefully written and optimised libraries make up for the overhead of interpreting the actual glue code.

    Dunno why anyone
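    A toy example of that "application as glue" shape (the file names and the column name are hypothetical; the standard-library modules do essentially all of the work):

```python
import csv
import json
import statistics

def summarize(csv_path, out_path):
    # Hypothetical input: a CSV file with a numeric "latency_ms" column.
    with open(csv_path, newline="") as f:
        latencies = [float(row["latency_ms"]) for row in csv.DictReader(f)]
    summary = {
        "count": len(latencies),
        "mean": statistics.mean(latencies),
        "p50": statistics.median(latencies),
    }
    # The application itself is just glue between csv, statistics and json.
    with open(out_path, "w") as f:
        json.dump(summary, f, indent=2)
```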
  • The Neverwinter Nights 2 Toolset was redone in .NET, but I checked the various components, and it at least used these:
    - RAD Game Tools' Bink library
    - RAD Game Tools' Granny library
    - RAD Game Tools' Miles Sound System
    - Crownwood Software's DotNetMagic 2005
    - D. Turner, R. Wilhelm, W. Lemberg -- FreeType font rasterizer
    - Glacial Components Software's GlacialTreeList
    - Mike Krueger's (of ICSharpCode) SharpZLib library
    - Divelements Limited's SandBar
    - Various libraries done by Sano
    - Various libraries done by Quant
With companies like EA forcing insanely short deadlines to get Generic Sports Game 2007 out by the holidays, it pretty much forces their code slaves to reuse mostly everything in its horrific state. If end-product quality matters, then "from scratch" programming will have a higher priority.
  • The notion of complete code-reuse and never coding anything from scratch is a very good idea in theory. The problem is that in order for it to work the components from which things are built have to be reliable and trustworthy, and such things are not always easy to find.

    I've been studying errors and defects in engineering (both computers and otherwise) for many years. I was raised by an accident investigator, so I have an innate understanding of why things fail and what people can do to avoid it. The core
  • You may not realize it, but as of today, virtually all new applications being developed re-use existing working software systems. For example, when you develop a new web application, the components that you re-use are: the OS, HTTP server, database server, scripting language (PHP...), etc. Of course nobody is re-developing everything from scratch. Very often only the higher level software layer (ie. your application) is developed partially or mostly from scratch, but everything else (ie. the internal or low
One of the problems is the necessity for glue code. There need to be standards for software libraries. You can have different libraries, but certain functions should always get you something that will work in the context. One should be able to take out any library from any piece of software and replace it with another library (assuming there is more than one for the same purpose).
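    A hedged sketch of what such a standard could look like in practice (this is the general idea, not an existing standard): callers program against a shared interface, so one library can be pulled out and another dropped in.

```python
import bz2
import zlib
from abc import ABC, abstractmethod

class Compressor(ABC):
    """A small agreed-upon interface; any library that honours it can be swapped in."""

    @abstractmethod
    def compress(self, data: bytes) -> bytes: ...

class ZlibCompressor(Compressor):
    def compress(self, data: bytes) -> bytes:
        return zlib.compress(data)

class Bz2Compressor(Compressor):
    def compress(self, data: bytes) -> bytes:
        return bz2.compress(data)

def archive(payload: bytes, backend: Compressor) -> bytes:
    # Callers depend only on the interface, so either backend drops in unchanged.
    return backend.compress(payload)

print(len(archive(b"some payload" * 100, ZlibCompressor())))
print(len(archive(b"some payload" * 100, Bz2Compressor())))
```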
This has been the PHB's wet dream since programming began. They see writing programs as assembly. It isn't assembly; it's design. You can't automate design.

    They always talk about making generic components that can just be "glued" together to create a functioning application. You can't. That "glue" is the business logic, and it's what your program does. It's what you are paying the developers to do.

    If you have good developers (skilled, not monkeys) they will create (or use) libraries to do most of the
  • Look at recently developed programming environments that have made it big: Java and C#, running on a virtual machine, using a large class library.

My first impression of Java was of "a small language with a big class library" - I don't mean that in a bad way; the relatively small number of syntactic elements in the actual language, and the handing off of things like, say, threading to classes, is a good thing.

But coding from scratch? What's that supposed to mean - typing ones and zeros at an OS-less motherboard? Workin
  • I haven't written a program longer than about 2,000 lines from scratch in more than ten years. Nobody I know has, either. It's all about adding another feature, or removing another bug, from giant ecosystems of software that grow like moss over the hardware.

    Makes me nostalgic for the days when I used to start with an idea and create the design from there. But even then, it was almost always the case that everything I built was built on top of, or in the context of, another software system. (well, the

  • OhGodPleaseNo (Score:3, Interesting)

    by cperciva ( 102828 ) on Saturday November 04, 2006 @08:18PM (#16720369) Homepage
    Code re-use is a great idea. Free software is a great idea. Taken together, and to an extreme, they can cause problems, particularly where security is concerned. What happens when someone finds a security flaw? How can you contact the people who are reusing your code if you have no idea who they are?

    To take a personal example, my delta compression code [daemonology.net], which I wrote for FreeBSD, is now being used by Apple and Mozilla to distribute updates; I've talked to their developers, and if I find a security flaw in this code (very unlikely, considering how simple it is), I'll be able to inform them and make sure they get the fix. On the other hand, I know developers from several Linux distributions have been looking at using my code, but I'm not sure if they're actually using it; and searching on Google reveals dozens of other people who are using (or at very least talking about using) this code.

    Putting together software by scavenging code from all over the Internet is like eating out of a garbage dump: Some of what you get is good and some of what you get is bad; but when there's a nation-wide recall of contaminated spinach, you'll have no idea if what you're eating is going to kill you.
    • by Jeremi ( 14640 )
      What happens when someone finds a security flaw? How can you contact the people who are reusing your code if you have no idea who they are?


      A mailing list or RSS feed is a good idea for an open-source project... anyone using/relying on the code should be strongly encouraged to subscribe to it. Of course, you can't force people to subscribe, but those that don't can't complain if they miss the big bug announcement.

economics... That is, until the hardware demands recoding, reuse is both financially and programmatically a generally sound philosophy. Of course GIGO still applies :)
It won't happen that way until there is a coherent established framework for functional testing of the "building block" code. It is GREAT to reuse code, and I've worked on a few projects that became GPL due to my incorporation of GPL code. However, most professional projects I've worked on require more on the reliability front so no time (time == $$money$$) is lost. Any reused/mined code incorporated into these projects has had to be so rigorously inspected and tested for function that when combined with
  • The problem with this concept is deceptively simple:

1. The most difficult part of programming is developing software that does a good job of handling unanticipated uses
    2. Most programmers are fine with something that just does the job they need it to do, and no more
    3. Relatively few useful software components can operate in isolation, the rest place requirements on the environment

    The consequence of this is that contrary to the theory that writing something once should be sufficient, the truth of the matter is that i
  • I think that we are not going to be "mining" for code so much as using smarter and smarter tools to develop the code in the first place. Historically, we used to code by flipping bits on the bare metal. Later we developed smarter tools called assemblers and loaders to flip the bits for us. Then we developed "high" level languages like Fortran, Pascal and C, and developed tools to compile our programs. Our tools became smarter. We don't write the code for GUIs anymore. We drag and drop widgets in a des
  • Often it is desirable to reuse someone else's library. But often it is still very valuable to provide the code yourself, to prevent another dependency on someone else's code, and also avoid something that might limit portability.

    I think it is the same cost/performance threshold that people use in everyday life, in making a decision to purchase or "do it yourself." Is using a third-party library worth the hassle of integrating and packaging it?

    For example, one developer wanted to add a default dependency o
  • We will add your binary and technological distinctiveness to our own. --the Borg
due to massive code reuse we now suffer from the "mindrot" disease that keeps us from re-inventing anything, even after we have long forgotten it.
  • That's what is currently happening....for instance, take VB. Each line of code you write isn't an individual assembly instruction--it's basically a prewritten piece of code....

    All this guy is doing is taking it to a higher level. It sounds exciting, but is really obvious :-P.

We already have reusable components; they're called libraries, and they keep getting better. But to think that we can ever easily take some piece of software that isn't written to act like a library and reuse it for something else is pure fantasy. If you have a library with a design specification and the code deviates from it, you have a bug. If you have a piece of software, it might work quite fine until suddenly somewhere in a mangled mess of code you have some implicit assumption somewhere (which may have been perfectly reasonable) th
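    A tiny invented example of the kind of implicit assumption being described: no crash, no error message, just a quietly wrong answer once the code leaves its original context.

```python
def pick_cheapest(products):
    """Originally written for a report where `products` was always pre-sorted by price.

    The assumption was perfectly reasonable in that context -- and invisible to anyone
    reusing the function later. (Invented example for illustration.)
    """
    return products[0]  # implicit assumption: the list is sorted ascending by price

# Original context: works.
print(pick_cheapest([("widget", 2.50), ("gadget", 9.99)]))

# Reused elsewhere with unsorted data: silently returns the wrong answer.
print(pick_cheapest([("gadget", 9.99), ("widget", 2.50)]))
```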
In certain situations this is useful and is already being used to an extent. But rabid use of it is generally bad. Reminds me of the hammer factory factory [joelonsoftware.com]
What about web services? Aren't they pretty cross-platform and language-independent?

    Isn't that where we want to be eventually?
  • More often than not, these constituent parts are Open Source software systems and typically not designed to be used as components. These parts are then made to interoperate through wrappers and glue code. We think this trend is a harbinger of things to come. What do you think?

    It is a harbinger all right. It is a harbinger of loads of work to come, straightening out that rotten stinking spaghetti. Hint: all you need to do to establish your reputation as a top open source coder is to dig into one of those
This just illustrates how little people understand software development. If blind managers were designing houses, I'm sure we would end up with the boss saying "here is a functional bathroom - can you patch it to this functional kitchen?" Well, maybe yes - and the toilet would be under the table.

    Finding analogies is difficult.

Why not walk into a roomful of stories written by a creative writing class and try to cobble them together into a best seller?

    Many years ago IBM was touting the line that with the new f
  • by arvindn ( 542080 )
    This is exactly how DNA works. Programmers have so far not had much success at this approach, one because our computers are puny, and two because our programming practices have been tailored for engineered code. But as hardware gets fast enough that most common tasks can be run at a one-millionth slowdown and still run fine, we will get to a point where we can write glue code that runs some other piece of code, throws away 99.9% of its computation, and only uses the rest, simply because of the value of hum
  • by Jerf ( 17166 ) on Saturday November 04, 2006 @09:46PM (#16720933) Journal
    Some context for people who didn't read the book... or didn't read it carefully enough.

    First, Vernor Vinge has a PhD in Computer Science. This obviously doesn't guarantee he can't be wrong, but to those commenters who said something like "these ideas are idiotic"... you've got an uphill battle to convince me that you're that much smarter than Vernor Vinge, especially as most of you saying that don't show me you understood what he was saying in the first place.

    Second, A Deepness In The Sky is set in his "zones of thought" universe. In this universe, the fundamental limits of computation vary depending on where you are physically in the galaxy. This is only faintly hinted at in A Deepness In The Sky, it is explicitly spelled out in A Fire Upon the Deep. This limit on computation may or may not be real. One of the effects of this limit on computation is that you can build a system larger than you can really handle, and eventually all such systems come apart in one way or another. This story is set thousands of years in the future and it is explicitly (albeit subtly) pointed out that the software running the ramscoop ships has direct continuity with modern software. (Qeng Ho computers use the UNIX epoch as the most fundamental form of timekeeping; apparently even the relativistic compensation is layered on top of that.) We are at the very, very beginning, where it is still feasible to burn an OS down entirely and start from scratch.... or is that really still feasible? (Perhaps Microsoft will soon find out.)

    Those of you posting that "we can always wrap it up in an API or whatever", I'd say two things: First, you get the Law of Leaky Abstractions working against you. The higher up the abstractions you go, the more problems you have. (Look it up if you don't know what that is.) The more sub-systems you make, the larger the state space of the total system becomes (exponentially), and the harder it is to know what's going on. It is entirely plausible that you eventually hit a wall beyond which you can't pass without being smarter, which, per the previous paragraph, you can't be.

    In other places in the galaxy, you can be smarter, and Vernor Vinge postulates the existence of stable societies on the order of thousands or tens of thousands of years or beyond, where the society actually outlasts the component species, because the software base that makes up the species does not exceed the ability of the relevant intelligence to understand it.

    Both cases (software might exceed intelligence, intelligence might grow with software) are extremely arguable, and I do not think he is advocating either one per se. (Leave that for his Singularity writings.) But you do him a disservice if you think he is not aware of the issues; he's extremely aware of the issues, to the point that he is the reason some of us are aware of the issues.

(Even this is a summary. In isolation, probably the best argument is that it is always possible to create a software system one cannot understand or control, but one person can be wise enough to avoid that situation. However, in A Deepness in the Sky Vernor Vinge explicitly talks about how in a societal situation, one can be forced by competitive pressures to over-integrate your system and make it vulnerable. "OK, but the government can be smart enough to realize that's going to happen and step in to stop it." First... smart government? But even if you overcome that objection, now your society faces death-by-surveillance and other automated law enforcement mechanisms, which since they can't be human-intelligent will fail. If you avoid that (and it is a big temptation), then you face the problem of anarchy. And remember that "governance" is anything that governs; even if the "formal government" doesn't regulate you to societal death, private corporations may do it. Anyhow, upshot, Vernor Vinge has done a lot of thinking on this topic, it shows in his books, it is not showing in the criticisms I've seen posted, and when it gets down to it he really has more questions than answers.)
I don't think we're close to the end of coding from scratch, at least not in the sense that coding from scratch will be replaced by grabbing existing F/OSS parts and combining them. And the reason is exactly because

    More often than not, these constituent parts are Open Source software systems and typically not designed to be used as components.

    Trying to glue things together that weren't meant to be used that way can sometimes be done, but it's rare that it works well and it's usually a major kludge. Until
Let's be honest here. How many people call themselves programmers and have no idea what a zero flag is or what the difference between Big and Little Endian is? The current level of "being a programmer" is to be able to hack together a few lines of code with pre-set functions that pretty much do the work for you. It's actually frightening to see graphics programmers who don't know how to normalize a vector because "there's a function for that".

    Fine, of course there is one, and most likely it is by heaps more o
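    For reference, here is roughly what that library call does under the hood (a plain-Python sketch, not any particular graphics library's implementation):

```python
import math

def normalize(v):
    """Scale a vector to unit length -- the few lines hiding behind 'there's a function for that'."""
    length = math.sqrt(sum(c * c for c in v))
    if length == 0:
        raise ValueError("cannot normalize the zero vector")
    return [c / length for c in v]

print(normalize([3.0, 4.0]))  # [0.6, 0.8]
```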
