The Future of Computing

webglee writes "What will the relationship between computing and science bring us over the next 15 years? That is the topic addressed by the web focus special of Nature magazine on the Future of Computing in Science. Amazingly, all the articles are free access, including a commentary by Vernor Vinge titled 2020 Computing: The creativity machine."
  • by JDSalinger ( 911918 ) * on Thursday March 23, 2006 @10:58AM (#14980245)
    It is easy to underestimate the speed at which technology is changing. Barring brick walls (insurmountable laws of physics), computing in 2020 should be absurdly different from that of today.
    According to Ray Kurzweil: "An analysis of the history of technology shows that technological change is exponential, contrary to the common-sense "intuitive linear" view. So we won't experience 100 years of progress in the 21st century -- it will be more like 20,000 years of progress (at today's rate). The "returns," such as chip speed and cost-effectiveness, also increase exponentially. There's even exponential growth in the rate of exponential growth. Within a few decades, machine intelligence will surpass human intelligence, leading to The Singularity -- technological change so rapid and profound it represents a rupture in the fabric of human history. The implications include the merger of biological and nonbiological intelligence, immortal software-based humans, and ultra-high levels of intelligence that expand outward in the universe at the speed of light."
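
    A rough way to make the "20,000 years" arithmetic concrete (this is an illustration under one common reading of Kurzweil's claim, not his published derivation): if the rate of progress doubles every decade, the progress accumulated over a century, measured in "year-2000 years", is

      \int_0^{100} 2^{t/10}\,dt \;=\; \frac{10}{\ln 2}\bigl(2^{10}-1\bigr) \;\approx\; 1.5\times 10^{4},

    which is the same order of magnitude as the quoted 20,000.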
    • by Anonymous Coward
      It is easy to underestimate the speed at which technology is changing. Barring brick walls (insurmountable laws of physics), computing in 2020 should be absurdly different from that of today.

      No kidding - by 2020 we should just be able to start playing Duke Nukem Forever in Windows Vista.
    • by AKAImBatman ( 238306 ) * <akaimbatman AT gmail DOT com> on Thursday March 23, 2006 @11:10AM (#14980321) Homepage Journal
      Remember how all the SciFi shows of the 60's thought that we'd be cruising the solar system (perhaps even the stars!) by the year 2000? The Jupiter II optimistically took off in 1999, and Star Trek contained several references to "Eugenics Wars" and "early space travellers" that were supposed to have happened by now.

      What do we actually have? The same space shuttle that's been flying since the late 70's, and updates to the same rockets that have existed throughout the history of the space program.

      Technology does progress at an exponential rate. The only problem is that the focus of technology moves. Computers have already gone through several booms of massive technology increase, and are now very stable creations. There's just as good a chance that they'll continue to update in a more linear fashion (a la automobiles) as there is that they'll experience exponential increases in technological sophistication. I personally find it more likely that technology will begin to focus on improving other areas, and allow computers to remain stable for the time being.

      So be careful not to severely overestimate while you're attempting to avoid underestimation.
      • by vertinox ( 846076 ) on Thursday March 23, 2006 @11:33AM (#14980490)
        Might I point out that more money is probably put into the cell phone, telecom, and computer industries than all the world's space programs combined.

        The reason we aren't seeing great advancements in our space and nuclear programs is that they are highly centralized and at the whim of a select few as to whether they get funding or not.

        However, when technology is decentralized -- as in, everyone can have a cell phone, broadband, and a computer within their means -- then those types of technology will advance at an accelerating rate. (I hope I don't sound like Kurzweil.)

        Not everyone can go to the moon... But most everyone in the western world can have an Xbox 360. That may not mean everyone is going to get one... But more than enough will to cause rampant R&D in that industry.

        Trust me... I'm shocked myself. I remember a time when we didn't have cell phones, computers with hard drives (I miss my old IBM PCjr), the internet, 4-7 channel TVs, and everything else that is happening now... And I'm only 27.

        Things are happening at an accelerating pace... Short of a world disaster or an economic depression like the 1930's, I doubt we will see a slowdown.
        • Trust me... I'm shocked myself. I remember a time when we didn't have cell phones, computers with hard drives (I miss my old IBM PCjr), the internet, 4-7 channel TVs, and everything else that is happening now... And I'm only 27.

          Damn, I'm 27 and now I feel old. When I first read your post I was like, I can remember dad having a cell phone for the longest time. If I remember correctly it was either a bag phone or mounted in his company vehicle. Now, my wife, mom, dad, and two brothers have one. Computers with H
          • Although I like the idea of exploring space and all the neat things that we could do up there, I'm glad we've said forget it. I'd rather have cheap entertainment and cell phones than have "a man" walk on the moon.

            To my mind this is very short-sighted. Perhaps it's appropriate that we have fallen back to regroup, but not going into space on a large scale is suicidal -- not on an individual basis, but for the species. The only question is the appropriate time frame. Perhaps it's appropriate that we stop a
            • Maybe we go through cycles. For a while we're content to sit and play with ourselves, er, our toys, then our children or grand children get bored and decide (or are forced) to do something important, then their children or grand children have a whole set of new toys to play with.


            • To my mind this is very short-sighted. Perhaps it's appropriate that we have fallen back to regroup, but not going into space on a large scale is suicidal -- not on an individual basis, but for the species. The only question is the appropriate time frame.


              The risk of an extinction event happening on Earth is pretty significant. The risk of it happening in the next 100 years is pretty damned slim. Probably the most significant likelihood of our own extinction is ourselves, a la holy wars, pollution, and globa
              • The risk of an extinction event happening on Earth is pretty significant. The risk of it happening in the next 100 years is pretty damned slim.

                While this is true, and most would agree with you, you have to also consider the consequence. The chance of it happening is slim, times something we really want to avoid. This makes it a lot more important. If we are eliminated - that is the end for man. All the progress we have made, the sacrifices, the research, the sex has all been for nothing! (OK, maybe th
            • To my mind this is very short-sighted. Perhaps it's appropriate that we have fallen back to regroup, but not going into space on a large scale is suicidal

              We aren't ready to go into space yet. I for one think just about the only reason a US man walked on the moon was the USSR. The US government couldn't have cared less except that they needed something they could do better than the USSR. I think that it was a mistake spending all that money for that purpose. Telecommunications, weather monitorin
        • Here are my predictions on what will be available in the year 2026. First, factories will be built here in the United States but most of the workers will live in India or China. They will use the internet to control robots in the factory. Apartment buildings will have examination rooms where one will go, and there will be robots controlled by a doctor in India or China who will be able to remotely do an exam even better than if the doctor was there in person. Every automobile will be part of a network wh
        • 4-7 channel TVs? Am I reading that wrong? Growing up our TV SUPPORTED even more than 7 channels (I think it was something like 15). Of course, we could only get two, one kind of fuzzy. Except on those rare nights when everything was perfect sometimes you could get the sound from the French channel.

          I've got a few hundred channels now though. :)
          • 4-7 channel TVs? Am I reading that wrong?

            Well, the first TV I remember as a kid had a dial that could pick up way more than that, but we could only pick up about 4 channels (7 on a good night) and everything else was static.

            By the time I was in middle school my parents had cable though... So a moot point.
      • by MrFlibbs ( 945469 ) on Thursday March 23, 2006 @11:36AM (#14980508)
        Indeed. One thing that's easily overlooked is that even though hardware performance has increased exponentially, software development has not. Those tasks that are compute-bound benefit directly from the exponential hardware growth, but other tasks do not.

        Software is hard -- perhaps fundamentally so. It cannot be written exponentially faster even with infinite hardware resources. Vast hardware improvements may support vast software possibilities, but writing that software is still a daunting task.
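
        One way to formalize that point (an Amdahl-style argument added here for illustration, not something from the original post): if a fraction p of the time to deliver a result is compute-bound and hardware speeds that part up by a factor s, while the remaining 1-p (design, coding, debugging) does not speed up, then the overall gain is

          \text{speedup} \;=\; \frac{1}{(1-p) + p/s} \;\longrightarrow\; \frac{1}{1-p} \quad \text{as } s \to \infty,

        so even infinite hardware resources leave you bounded by the part that is still human-limited.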
        • Software is hard -- perhaps fundamentally so.

          Yes, because computers can't design software. Humans are the ones who have to do it.

          I've said before that there are still algorithms that need to be designed - intelligent audio compression through sampling (if there's a piano, strip the necessary information and just store the notes and variations; if it's a voice, just store the vowels / consonants and pitch changes, of course, with the rest of the "noise" as high-fidelity info), sprite-based video compression (that's
        • Task: 'Write a web page'

          a) Download a trial of Dreamweaver and write a simple web page; test your page with Firefox.

          b) Pop in a DOS disk, fire up debug, and write a text editor to write your web page, then write a GUI and web browser in ASM to test your web page. Hand-write all your browser tests too; don't use the automated ones on the web.

          There's been a lot of change in the way people write software, ever used a punch card and waited a day or two to get your debug results back?
        • We can program exponentially (yes, exactly O(exp(x))) faster by real code reuse. Like what people say they do (looking for libraries and using them), not like what people really do (reusing only self-coded fragments, looking for libraries and rewriting them).

          Most of the problems with real code reuse are solved by free software, which is probably why there is now an emergent desire in the community to do that. Just look at the number of /. comments stressing that you SHOULD build a library and a CLI envelopes

      • Not all technology changes at the same rate. Microelectronics technology is far from mature and it's changing fast. Chemical rockets have become a mature technology. Just look at the two newest large space boosters, the Delta IV and the Atlas V. Mechanically there is nothing in there an engineer from the 1960s would not recognize. Same goes for passenger jet aircraft. Rapid changes in the first half of the 20th century and little change from the 1970's to present. I suspect that people in the 193
        • It's not that simple. New technologies do change rapidly for a brief while. There is no guarantee for just how long they will continue to change rapidly. Also, long stable technologies will sometimes be changed by external events, and begin changing quite rapidly.

          When I was in high school I tended to model this by a helix, with different technologies distributed around the circumference of the bounding circle, and the bright spot of rapid change climbing along the rising helix. This gave me a rough guid
      • Yes, but you'll notice that we already have 23rd-century communicators, tri-corders, data storage*, and display technology. As a sibling post points out, that's because these are easily commoditizable and there's lots of money to be made.

        I think this bolsters your theory that it's the focus that matters most. And focus is largely a matter of markets and profitability.

        * There was an episode in ST:TOS where they were plugging in 2.5-inch orange squares into a computer. I don't recall now what was on them. But
      • Knowledge appears to increase exponentially (actually, Kurzweil says double exponentially). That doesn't mean that individual technological solutions will advance the same way, particularly over short periods of time.

        We haven't seen a boom in space because we're lacking new propulsion (think of all those SF shows -- I don't think any of them had us riding around on chemical rockets) and we haven't really put the will into it.

        However, advances have been made. We have ion engines a la Star Wars now. People
        • We haven't seen a boom in space because we're lacking new propulsion

          This is a commonly repeated urban legend. The truth is that we have propulsion methods pouring out of our ears [wikipedia.org], many of which are far better choices for manned flight than ion engines.

          The biggest problem has been the $500,000,000 that gets sunk into every shuttle flight. It eats up the money that's useful for better spacecraft. The next biggest problem is the ISS. It eats up money without accomplishing its original goal. (To be a launching
          • I wouldn't say it's an urban legend. If you were building a spacecraft today and you had to go out and buy engines for it, what could you buy? Chemical or ion.

            The others you indicated range from will-be-off-the-shelf-tomorrow through never-tested-outside-a-lab to might-be-possible.

            Like I said, we're not cruising around the solar system Jetsons style because we've had a lapse in the engineering. Science and ideas have been advancing in the background regardless, as your link illustrates.
            • If you were building a spacecraft today and you had to go out and buy engines for it, what could you buy? Chemical or ion.

              Actually, you'd buy nothing since that is pretty much what's available on the open market. You might be able to subcontract Rocketdyne to build you engines based on an existing design, but that's about as close as you can get.

              The others you indicated range from will-be-off-the-shelf-tomorrow through never-tested-outside-a-lab to might-be-possible.

              Many of the engines have undergone extens
              • I'm sure the Russians would be happy to sell you something. Probably the Europeans too. Even Boeing (or whoever it is that actually owns the rights to their engines), if you paid them enough. Fission engines have been tested, but never in flight. That's one of the ones I'd put in the "may-be-available-tomorrow" category, or the "never-tested-outside-the-lab" one, if you count a rocket engine firmly anchored horizontally to the ground as in the lab (which I do).

                Regardless of any political/financial/socie
      • Computers have already gone through several booms of massive technology increase, and are now very stable creations.

        No, they're stagnant creations, and that's because there's a monopoly player acting as a sea-anchor to innovation. Wait until there's real competition in the software market and you'll see what computers can really do.

    • Comment removed based on user account deletion
      • You're getting caught up in the short term. Corporations in the US might be but a little blip in the curve; the growth of human knowledge will continue. That's not to say that the US will continue to be a leader. Kurzweil is looking at long-term trends, on the order of thousands of years. The 20 years it would take China to catch up to and overtake a Luddite United States is noise.
    • by Anonymous Coward
      Within a few decades, machine intelligence will surpass human intelligence, leading to The Singularity -- technological change so rapid and profound it represents a rupture in the fabric of human history.

      Hahaha. That shit is just too funny. "The Singularity" eh?

      Let's just ignore the last 50-odd years of AI research. The problem is Real Fucking Hard (tm) and throwing more hardware at it just isn't working (see: Combinatorial Explosion, NP Complete, etc.). Computers are very good at doing mechanical thin
      • Kurzweil's prediction of a singularity is based on extrapolation of a curve, not any particular technology prediction. He does that because people drool over stuff like that.

        Who says silicon is going to give us AI? Maybe it'll be biological. Nature has shown us that intelligence is possible and we're developing the tools to mess with that kind of thing.

        Think of all the things we take for granted today. How many of them do you think would be "Real Fucking Hard (tm)" for society a century ago, a millenniu
      • It will be software. It will require sufficient hardware support. It won't be easy.

        It will require many functionally specialized modules that will need to be created separately and merged together smoothly. It won't be easy.

        It's in process now. You just aren't noticing, because we are still in the early days, also most hardware isn't sufficiently powerful to support it. But there have been noticeable improvements in just the past year. Most places still find voice recognition systems to be too expensi
    • by Dr. GeneMachine ( 720233 ) on Thursday March 23, 2006 @11:20AM (#14980398)
      I actually don't buy into Kurzweil's singularity theory. I am not sure where he pulls that super-exponential growth figure from. Looking at past technological advances, I rather think that technological growth follows a succession of sigmoids. First you got a "buildup phase", followed by a very fast "breakthrough" phase, which slows down again, till the process settles on a plateau. Then there might be nothing for quite some time, till the next advancement phase sets in.

      Such a development model might very well go on for a long time, without reaching a Kurzweil-style singularity.
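
      A minimal way to write that "succession of sigmoids" picture down (the logistic form here is just one convenient choice, not something from the post): each advance i contributes a capability curve

        f_i(t) \;=\; \frac{L_i}{1 + e^{-k_i (t - t_i)}},

      and overall capability is the sum \sum_i f_i(t). Whether the envelope of that sum looks like Kurzweil's runaway exponential or keeps settling onto plateaus depends entirely on how the step sizes L_i and the gaps between the onset times t_i behave -- which is exactly the point of disagreement.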

      • First you got a "buildup phase", followed by a very fast "breakthrough" phase, which slows down again, till the process settles on a plateau.

        Where is the plateau? Even in the Dark Ages, technology still advanced (albeit more along the lines of masonry, shipbuilding, and arms and weapons manufacturing). A mounted armored knight in the 1200's was considerably better armed than a Roman legionary in 100 AD.

        But I suggest you read his book... I don't agree with everything he says or that a singularity will happen
      • "I am not sure where he pulls that super-exponential growth figure from."
        Ever heard of Moore's Law? The exponential growth of processing power that has been ongoing since the 60's. In his book "The Age of Spiritual Machines" Ray Kurzweil points out that this exponential growth can also be found in many more technological developments. He doesn't make this stuff up, it is pretty common knowledge. What Kurzweil does is point out that most of our prediction-of-the-future models are still based on industrial ag
        • I was at a talk earlier in the week by Bob Colwell (former Intel Fellow, leader of a few Pentium design teams). The focus of his talk was Moore's Law, and how it is no longer a useful guide. The important thing to remember about Moore's Law is that it makes economic, rather than technological predictions. It claims that the number of transistors it is economically feasible to put on a single IC doubles every n months (where n is somewhere between 12 and 24, depending on when you ask Gordon Moore).

          Acco

          • According to the materials people at Intel, they can keep up this level of progress for at least the next 10 years. This means (going by transistor count) it will be possible to make a 2000-core P6. The catch? It will draw 200KW

            I think you either misunderstood, he was exaggerating, or the materials people at Intel are completely full of shit. There's no way they've got materials or technology that can dissipate 200KW of heat off of a computer chip, no matter how much money they throw at the problem.
            • This was his point. They can put that many transistors on the chip, but powering and cooling them is going to be impossible. No one is going to want a laptop that drinks several gallons of liquid nitrogen a second just to stop the chip burning through the case, desk, and floor, and requires its own electricity sub station to power.
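
              A back-of-the-envelope sketch of the arithmetic being discussed (the per-core wattage below is an assumed ballpark, not a figure from the talk):

                import math

                # Rough arithmetic behind the "2000-core P6 drawing 200 kW" remark.
                # The per-core wattage is an assumed ballpark, not a number from the talk.
                cores = 2000                 # roughly 2^11 times one P6-class core
                watts_per_core = 100         # assumed ballpark for a P6-class core
                print(f"{cores} cores x {watts_per_core} W = {cores * watts_per_core / 1000:.0f} kW")

                # How long ~11 doublings take under Moore's law, for 12- and 24-month periods:
                doublings = math.log2(cores)
                for months in (12, 24):
                    print(f"{doublings:.1f} doublings at {months}-month doubling: "
                          f"{doublings * months / 12:.0f} years")

              So the "next 10 years" claim only lines up with the optimistic (12-month) end of the doubling range, and the power number is what makes it moot either way.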
      • First you got a "buildup phase", followed by a very fast "breakthrough" phase, which slows down again, till the process settles on a plateau.

        An alternative way of looking at it is, that first you invent something, and market it to rich people. Then, the focus doesn't move to inventing something else, it moves to making the existing thing cost effective so that it can be marketed to everybody else. Then, after that, the focus still doesn't move to inventing something else, it moves to refinement of

      • So suppose those sigmoids get steeper and steeper and the flat parts get shorter each iteration. They build off each other, of course, so the bottom of one is around the top of the previous. If you get a whole bunch of them together and fit a curve to them....

        For example, I did some research in history. How long did it take the atomic bomb to go from theory to use to insane arms race? Less than a decade. Okay, that's maybe a special case. How about tanks, or mechanized warfare in general? Know how lo
    • Responding to Kurzweil. Exponential growth is a mathematical concept. The rate of change of a quantity is proportional to the amount of the quantity. This presumes the ability to measure that quantity. How exactly does one measure "technology" or "progress" in sufficient detail to determine that they are increasing at an exponential rate? Sure, one can make qualitative observations. It seems that the more technology there is, the more quickly we can design new technology. But that's an awfully fuzzy
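
      For reference, the definition being invoked (standard calculus, not something from the article): a quantity x grows exponentially when

        \frac{dx}{dt} = kx \;\Longrightarrow\; x(t) = x_0 e^{kt},

      so the claim "technology grows exponentially" only has content once you pick a measurable x -- transistor count, cost per operation, and so on -- to stand in for "technology", which is the parent's point.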
    • It seems like technological feats happened much quicker in the past: the building of the Pyramids, the Great Wall of China, cross-continental railroads. We can't even build a secure OS, or a laptop that doesn't toast your lap.
      • Those things probably took longer to build than you've been alive. Certainly it took even longer to invent the techniques and then build them. I remember when laptops didn't exist at all, and I'm not even middle-aged.

        If you live in the New World then the Great Wall took longer to build than your country is old. If you live in Europe then your country may have sort of been around as long as the GW took to build.
    • I've been reading some of Kurzweil's articles for over 2 decades and 99% of the time I call bullshit. He always promises AI but he (or others) never comes up with anything close to his predictions. I have no idea where he gets his projections from, even if his (?) singularity theory kinda makes sense (once the computing power of a CPU gets above that of the human brain thanks to Moore's Law, all hell breaks loose). Just to say that putting 10 thousand pocket calculators next to each other doesn't make it an
    • All projections of the future are based on a continuation of middle-class life in the USA, Japan, and Europe as it is today only more so.

      Probably won't be that way. Oil is peaking. Which means that the easy oil is gone and what's left won't be easy to get to. Or easy to pump out. Or protect from pirates, terrorists, or religious fanatics.

      Plus...

      The world's population continues to explode. Billions more young people are becoming mature (15-20 years old) a
      • Good thing there are dreamers. Otherwise I'd probably be busy killing myself after just reading your post.

        Yeah, maybe the world is going to go to pot. People have been saying it for a long time, maybe they'll be right one of these days.

        There was a big hullabaloo a few hundred years ago about burning all the trees in England too. Why, when all the trees are gone, what are we going to use to heat our houses? Our entire society is built on wood and horses!

        We'll either overcome new challenges or we won't.
    • Whoa, hey! Kurzweil is definitely out there, huh? :-)

      Back here on Earth, I'm not so sure things are going to move along so quickly. For instance, a concept I would love to see developed in my lifetime is a wearable computer with:
      • AI
      • voice interface
      • persistent wireless connection to the Internet
      • lots of memory

      It would also come with (optional) things like:

      • HUD (glasses or contact lenses)
      • miniature cameras (possibly infrared)
      • GPS

      Think of what you could do with a device like this and how it would pr

  • Trends (Score:5, Insightful)

    by Red_Foreman ( 877991 ) on Thursday March 23, 2006 @11:07AM (#14980305)
    There are two distinct movements, and in 2020 we could see one trend finally win out over the other, for better or for worse.

    One trend is the Open Source movement, the other is the closed source / DRM movement.

    The way I see it, one of two things could happen: Computing becomes nearly free, due to lower and lower hardware costs and free operating systems, with entertainment at our fingertips, or... an extreme DRM lockdown where only "trusted" devices may connect and Linux becomes contraband.

    • Re:Trends (Score:3, Insightful)

      by hal2814 ( 725639 )
      Despite what you've read in the GPL3, open source and DRM are not mutually exclusive. Just because you can read the source code on how a DRM scheme works does not mean that you can bypass it. DRM also won't necessarily lead to the demise of Linux. There are too many Linux shops who are not going to be willing to switch server platforms over trusted computing measures to ever let that happen. I'm not the biggest fan of DRM but it's probably going to be here to stay and it's not going to lead to the end
      • As the AC pointed out, you're right, DRM isn't necessarily contrary to open source (although practically...) but trusted computing (a form of DRM) IS.

        I disagree with him that you can't implement open source DRM (I bet you can) but it's harder and not nearly as secure as trusted computing supported DRM.

        • "DRM isn't necessarily contrary to open source (although practically...) but trusted computing (a form of DRM) IS."

          You left out "in its currently proposed format" between "computing" and "(". I am well aware that the AC is right and in its current form, trusted computing would lock out derivative versions of a program but there's no reason it has to be that way.
          • So what sort of scheme would you use for trusted computing? I associate trusted computing with requiring that the hardware not be under your control. Agreed that could allow for some open source (say, you can write Java but it gets treated like an untrusted applet from the Internet) but it'll put quite a damper on what you can do.
            • "hardware not being under your control" != "trusted computing"

              If that's your definition then an OS with a protected mode is already engaged in trusted computing since it won't always let software touch hardware directly (especially kernel memory space). Last I checked, there are a few OSes that are open source and take advantage of protected modes.

              Ideally, for trusted computing to coexist with open source software, there must be a mechanism that allows you to derive trusted work. I don't claim to have the
              • You should ponder it a bit more, because the problem is unsolvable. If you let the owner (the one who bought it) of the machine decide what he'd like to run on it, you can't enforce DRM on it. And if you don't let him decide, he'll not be able to run his software, thus FOSS will go nowhere. But we may still have Linux, distributed by the machine manufacturers, or even Microsoft (talk about irony).

                Your point about protected mode is nonsense. Despite the kernel that the owner chose to use having protecte

              • You can always remove that OS and install one of your own though. A full trusted computing platform would not let you install a non-trusted OS.

                I'm not sure what you mean by "derive trusted work." If you mean assure a particular piece of media that it's running in a "trusted" environment, then I don't see how you can do that if you're going to allow me free access to the hardware.

                The media companies are worried about allowing devices to put high definition ANALOG data on a wire to your first generation HD
      • Yes, you are right. It is not because you can read the source that you can bypass the DRM; you can bypass the DRM because it is DRM.

        There is no secure DRM, unless you start enforcing it with hardware (TCPA). And that only moves the break point into the hardware arena, so that it is more expensive (very expensive) to break it, but still possible.

      • Just because you can read the source code on how a DRM scheme works does not mean that you can bypass it.
        That's right. It's a real, official ThoughtCrime.

        Welcome to 1984.

  • Vinge disappoints (Score:3, Insightful)

    by ObjetDart ( 700355 ) on Thursday March 23, 2006 @11:18AM (#14980377)
    Was anyone else as completely underwhelmed by Vinge's article as I was? For a man who has produced so many incredible, original visions in the past, he seems to be stuck in a bit of a rut these days, going on and on about ubiquitous computing. There wasn't a single idea in his article that I haven't heard many times before already, from him and others. It reads like something he cranked out in 10 minutes to meet some last minute deadline...
    • I felt exactly the same way.

      I am so used to being blown away by Vinge.

      He doesn't often write but when he does, he puts a lot of thought into it.

      He didn't put much thought into this article.
  • No high hopes (Score:5, Insightful)

    by hcdejong ( 561314 ) <hobbes@@@xmsnet...nl> on Thursday March 23, 2006 @11:20AM (#14980394)
    Compare the state of computing in 1990 with that of today. Yes, computers are immensely faster than they were 15 years ago, but have things changed on a fundamental level? Have computers become more *intelligent*, rather than just faster? I, for one, am disappointed.

    An example: handling contact and scheduling information. In 1993, Apple showed how it should be done with the Newton. 13 years on, the most popular application (Outlook) still doesn't have that level of functionality.

    Computers were supposed to make things easier for us. Instead, they all too often complicate things needlessly.

    Yes, thanks to better hardware, more tasks have become feasible to do on a computer. Video playback and massive networks like the internet are very nice.

    But while new functions are being added, existing software stagnates. Mac OS X is nice and robust, but UI improvements over Mac System 7 are tiny to nonexistent. Windows shows a similar lack of progress. Word processing is not fundamentally different from 1984.
    • This has more to do with the fact that people are becoming increasingly blase about the potential of computing. At the moment, I am actually undertaking a project to try and design a human/computer interface that is totally removed from what Engelbart came up with back in '68 - we're trying, essentially, to show people that thinking outside the box is the best way to improve the use of said box. Computers these days are capable of amazing things in 3-dimensional graphics, but we're still constrained by the 2
      • I really doubt that you can dramatically improve things by adding another geometric dimension. If you want to improve the interface, then the improvement will need to be based on the improved understanding (by the computer) of the interface. A 3-D interface is probably not going to be particularly useful. Various people keep trying to prove me wrong about this, but to my mind what that would do is increase the computational requirements on the user, with minimal to no improvement on the information throug
        • Agreed. People are fond of thinking 3D is the way to go but as yet our displays are very much 2D, as are our input devices. I have yet to see a GOOD, convenient THEORETICAL 3D display technology. Input is easier, but even so there are very few actual pieces of hardware and I haven't seen one of those that's the equivalent of a mouse.
    • Compare the state of programming with the 60s. Which programming languages are we using today? C++, Java, Perl, Python, etc...in other words, ALGOL, disguised in various forms.

      We can have no real progress until we have AI to which we can just describe what we want and have it understand. Only then can we make computers as clever as we imagine they can be.
    • 1984 (Score:3, Funny)

      by 2008 ( 900939 )
      Of course word processing hasn't changed since 1984. LaTeX and GNU Emacs were written in 1984... how could you improve on that?
  • Who is Vernor Vinge? (Score:2, Informative)

    by resonte ( 900899 )
    In case you wanted to know

    Vernor Vinge is a science fiction author who was the first to coin the term "singularity", and he uses the idea in some of his novels. Linkie: http://mindstalk.net/vinge/vinge-sing.html [mindstalk.net]

    If you would like to read one of his books I would suggest Across Realtime, which touches on this subject lightly. Although his other stories are somewhat less palatable for me (but I've only read three).

    Other authors who delve more deeply into singularity issues are Greg Egan (hard going, but defina

  • by Opportunist ( 166417 ) on Thursday March 23, 2006 @11:34AM (#14980495)
    Let's take a parallel in the space race of the 60s. Everyone expected the development to continue at the same pace it did during the 60s. I mean, face it, between 60 and 70, the technology changed from being able to lift some rather small mass into orbit (well, at least sometimes; most of the time it just went up in smoke) to bringing a 3-man craft including lander, car, and a lot more junk to our moon! People extrapolated. 60 to 70: Zero to moon. 70 to 80: Flight to moon -> Moon base. 80-90: Mars. 90-2000: Past the asteroid belt and prolly even more.

    Now, what people didn't take into consideration was that, with the race over, funding stopped. No more money for NASA, no more leaps in science.

    Same could happen to us and computers. Now, it is of course vastly different since there isn't only one customer (in the space race, the only customer was the feds, and when they don't want your stuff anymore, you're outta biz), but it all depends now on whether the "consumer base" for the computer market is willing to spend the money. There are SO many intertwined issues that influence the market and thus development that it's virtually impossible to predict what things will look like in 5 years, let alone to give an even remotely sensible prediction for 15 years.

    Too many factors play into it. Sure, you can extrapolate what COULD be, considering the technology we have now and the speed at which technology CAN evolve. Whether it does will depend highly on where our priorities lie. DRM: will it kill development, with fewer companies daring to get into the market, or will it increase development since DRM technology swallows up huge amounts of cycles? Legislation, patents, and copyright: how will the market react? Will we let it happen or will we refuse to play along? Are we descending into being consuming drones or will there be a revolt against the practice of abusive patents?

    Too many variables. Too many "what if"s.
  • Wrong focus (Score:2, Interesting)

    by jettoki ( 894493 )
    I'm not very concerned with progress in hardware. My 3-year-old computer runs pretty much anything just fine, and I expect it to continue doing so for a few years to come, at least. Right now, I'm severely disappointed by the lack of ideas in technology. There's only so far you can take word processing, e-mail, scheduling, etc. Enough with 'innovation' in those areas, already!

    What I'd really like to see is improved content creation tools. How about 3D scanners, so Joe Artmajor can easily scan his sculp
    • What I'd really like to see is improved content creation tools. How about 3D scanners, so Joe Artmajor can easily scan his sculptures into modelling programs?

      And I'm still waiting for the do-it-yourself anime rendering program :(

      Anyway, mod parent up. He's so right about this one.
    • I've got DIY plans for a 3D laser scanner. Chances are your art major has access to one, or he could run down to somebody's prototyping lab and use one.

      They're not common because, well, what would you use it for?
    • Also existing, but too expensive for normal end users, are the 3-D printers. Some of them will even allow you to specify which bits should be electrically conductive. They're limited, but they can turn out quite cute models. (One of the companies that uses them is a car company.)

      I think I even read that someone had printed out a working scale model V-8. Not sure I believe this, though. That seems rather unlikely.
  • I think a singularity is possible, by the definition of "a point beyond which we cannot hypothesize", because we cannot truly conceive of or understand that point. But will it necessarily be AI, or even computers, that create it? It's about as likely as extraterrestrial contact. Which is, you'll note, also a singularity.
    • I think you misunderstand.

      The singularitarians tend to focus on computers, because that's what's currently hot. This doesn't mean that they are committed to AI. Nanotech, both biological and otherwise, is also one form that's considered. Instantaneous communicators are also considered (though not frequently...that won't lead to a singularity until a civilization is spread far enough that light-speed delays become very significant). Little attention is paid to things that can't be predicted not because t
  • In the words of Professor Frink:

    "I predict that within 100 years computers will be twice as powerful, 10,000 times larger, and so expensive that only the five richest kings in Europe will own them."

  • Waste of time (Score:3, Interesting)

    by HangingChad ( 677530 ) on Thursday March 23, 2006 @12:50PM (#14981165) Homepage
    For 20 years I've been hearing about the future of computing and when the future gets to be the present it doesn't really look anything like the future that was previously described. So to me that whole line of speculation is just a waste of time.

    The truth is you don't know which technologies will take or why. Sometimes you think X should be popular, but it doesn't catch on until 10 years after you find it. Or something you blow off as insignificant comes out of nowhere to dominate a market.

    Although I have noticed one small arena that tends to be a good predictor of the wider market. If p0rn distributors pick it up, then you can almost bet it's going to be the next insanely great thing. I remember taking a training class for a streaming video server in Atlanta a few years ago. Half my classmates were from p0rn distributors. Which definitely made break time more interesting.

  • Consider that our basic approach to computer programming has not changed in over a century and a half. It all started when Lady Ada Lovelace wrote the first algorithm (or table of instructions) for Babbage's gear-driven analytical engine. Software construction has been based on the algorithm ever since. As a result, we are now struggling with hundreds of operating systems and programming languages and the ensuing unreliability and unmanageable complexity. It's a veritable tower of Babel. Computing will not
  • by Anonymous Coward
    Check out http://imagination-engines.com/ [imagination-engines.com], a US company founded by AI researcher Dr. Stephen Thaler. In summary, his systems are composed of paired neural nets in tandem, where the first is degraded/excited to produce 'novel ideas' (the 'dreamer') and the second is intended as a 'critic' of the first system's output, or a filter for 'useful' ideas.

    In real-life applications, it was used to invent a certain Oral-B toothbrush product.

    At one time the site's literature announced that 'invention number
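
    A toy sketch of the dreamer/critic pairing described above (a minimal illustration; the vector encoding and scoring function are made-up placeholders, not anything from Thaler's actual systems):

      import random

      def dreamer(base_design, noise=0.3, n_candidates=50):
          """Perturb a known design with noise to generate 'novel' candidates."""
          return [[x + random.gauss(0, noise) for x in base_design]
                  for _ in range(n_candidates)]

      def critic(candidate):
          """Score a candidate; a dummy objective standing in for a trained critic net."""
          return -sum((x - 1.0) ** 2 for x in candidate)

      base = [0.0, 0.0, 0.0]        # some existing, known-good design
      best = max(dreamer(base), key=critic)
      print("best candidate:", [round(x, 2) for x in best])

    The noisy generator plays the 'dreamer' role and the scoring pass plays the 'critic' role; in the real systems both halves are trained networks rather than hand-written functions.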

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...