
Does Moore's Law Help or Hinder the PC Industry? 191

An anonymous reader writes to mention that two analysts recently examined Moore's Law and its effect on the computer industry. "One of the things both men did agree on was that Moore's Law is, and has been, an undeniable driving force in the computer industry for close to four decades now. They also agreed that it is plagued by misunderstanding. 'Moore's Law is frequently misquoted, and frequently misrepresented,' noted Gammage. While most people believe it means that you double the speed and the power of processors every 18 to 24 months, that notion is in fact wrong, Gammage said. 'Moore's Law is all about the density...the density of those transistors, and not what we choose to do with it.'"
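Stated as a formula (the notation here is mine, not Gammage's, and the doubling period is the commonly quoted range rather than a precise figure): if N_0 is the transistor count that fits in a given die area today, the density reading of the law is roughly

    N(t) = N_0 \cdot 2^{\,t/T}, \qquad T \approx 18\text{--}24\ \text{months},

where t is elapsed time and T is the doubling period. Nothing in that expression says anything about clock speed; what gets done with the extra transistors is a separate choice.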
This discussion has been archived. No new comments can be posted.


  • Both (Score:5, Insightful)

    by DaAdder ( 124139 ) on Wednesday April 25, 2007 @12:46PM (#18872105) Homepage
    I suppose it does both.

    The drumbeat of progress pushes development to its limits, but at the same time hinders some forms of research, or real-world tests of computation theory, for all but the few chip makers currently dominating the market.
    • Re:Both (Score:4, Insightful)

      by ElectricRook ( 264648 ) on Wednesday April 25, 2007 @01:08PM (#18872411)

      We're also not paying US$800 for an 80387 math co-processor (which only did floating point), like a friend of mine did in the '80s. That would be about US$1,600 in today's dollars.

    • I'd have to say that it, on the whole, has driven the PC industry more than it's hindered it.

      A car, discounting accident or outright neglect*, can be expected to last in excess of ten years today. Sure, you'll have probably replaced a few parts by then, but it will be pretty much the same car.

      While you can get the same longevity from computers, a four year old machine is considered ancient and no longer capable of keeping up. Yes, there are still many decade old machines out there, but I'd guess them to b
  • by $RANDOMLUSER ( 804576 ) on Wednesday April 25, 2007 @12:48PM (#18872129)
    If only because it keeps us tied to the x86 instruction set. If we didn't have the luxury of increasing the transistor count by an order of magnitude every few years, we'd have to rely on better processor design.
    • Re: (Score:2, Interesting)

      by bronzey214 ( 997574 )
      Sure, being tied to the x86 architecture hurts, but it's nice to have a common baseline architecture and not have to learn different assembly languages, data flows, etc., for each new generation of computers.
      • Re: (Score:3, Insightful)

        Game designers do it all the time. Compiler writers do it all the time. For 99.5% of the programmers out there, the underlying architecture is a black box; they only use the capabilities of the high-level language they happen to be using. But the final performance and capabilities of the system as a whole depend on that underlying architecture, which has been a single-accumulator, short-on-registers, byzantine instruction set (must. take. deep. breaths...) anachronism for far too long.
        • Re: (Score:2, Insightful)

          And for 99.9% of users the underlying architecture is a black box. But for 100% of applications the underlying architecture is important, and if the application doesn't run then the user gets upset. It doesn't matter if the application only needs to be recompiled; even if the developers gave away free recompiles to people who had previously purchased the software, it would require users to know which architecture they have (already difficult for most people) and make sure they get the right one.

          Have you ever
      • A new architecture every fifteen years wouldn't be so bad; staying in this essentially i386 + some bolt-ons rut is getting tiresome. Look at the cool things Sun is doing with the new T2 chip because SPARC is somewhat less constipated than i386. Now imagine a chip designed from the ground up to be massively multi-core / SMP.
        • Now imagine a chip designed from the ground up to be massively multi-core / SMP
          Actually, I'd rather not. SMP embodies two concepts:
          1. Homogeneous cores.
          2. Uniform memory architecture.
          For maximum performance and scalability, you don't really want either of these.
        • Re: (Score:3, Interesting)

          by TheLink ( 130905 )
          So, any recent benchmarks of how the latest T2 stuff does vs. recent x86 machines in popular server apps like _real_world_ web servers and databases?

          AFAIK, it was slower than x86 the day it was launched, and when Intel's "Core 2" stuff came out it got crushed in performance/watt.
        • What could I do with a slow, 32-thread processor that has a single FPU?
          As long as most of the programs I use are not massively multithreaded, I would use three or four cores, leaving the other 28 unused.
          This Sun thing was a bright idea - too bad a dual-proc dual-core Opteron was equivalent in performance, while finishing individual tasks faster. (From what I remember, the Opteron ran some 4 tasks at a time, finishing them in tens of milliseconds, while the Sun
    • If only because it keeps us tied to the x86 instruction set. If we didn't have the luxury of increasing the transistor count by an order of magnitude every few years, we'd have to rely on better processor design.

      I'm just going to refer you to my comment made earlier today when discussing a "new, better" processor architecture. Because there's always someone who thinks we are somehow "hindered" by the fact that we can still run 30-year old software unmodified on new hardware.

      See here [slashdot.org].

      • Re: (Score:3, Informative)

        The miniscule number of registers everyone complains about is irrelevant

        Were it not for the opcode fetches to register-dance (because only certain registers can do certain things), or having to use memory to store intermediate results (because there aren't enough registers), or stack-based parameter passing (not enough registers), or, again, the single accumulator (more opcode fetches and more register dancing), you might have a point. But what you're suggesting (in the rest of your post) is that having 1

        • by G00F ( 241765 )
          10-year-old software that is worth running?

          Just some older games that I at least go back to once in a while and beat again.
          XCom UFO: Enemy Unknown/Terror From the Deep
          Baldur's Gate (and others in the family)
          Syndicate Plus
          Imperium Galactica
          Starcraft
          • OK, since you answered honestly and respectfully, I'll do the same. I'll ignore that all your examples are nostalgia games, and not "Apps", which require LESS performance than games do. In those days, the top-of-the-line gaming rig was what? A 486DX @50MHz or a P133? Running what? DirectX 5 or 6? I'll be generous, I'll give you a 500 MHz Pentium III running DirectX 7. Do you suppose a 3 GHz Prescott running DirectX 9 could emulate the hardware of 10 years ago at better performance than you had then? Never mi
      • Re: (Score:3, Interesting)

        by Doctor Memory ( 6336 )

        there's always someone who thinks we are somehow "hindered" by the fact that we can still run 30-year old software unmodified on new hardware.

        We are, because it's just a new implementation of a crappy architecture. Apple showed that it's quite feasible to run old software on new hardware, even new hardware that had almost nothing in common with the old hardware. Intel provides x86 compatibility on Itanium; there's no reason why we can't all move to a new processor and take our old software with us. It's just that nobody's coming out with any new processors for PC-class machines.

        I'd say the ability to run 30-year-old software unmodified on a m

        • Oh, really? So, I can run legacy Mac OS9 ("mac classic") apps on a new Intel-based Macintosh?

          cool! Oh wait, you can't.

          (Oh I know, you can run an emulator just like you can on Windows or Linux, but it's hardly the same)
      • But can we? (Score:3, Informative)

        by Rix ( 54095 )
        30-year-old software won't run on new operating systems, and 30-year-old software won't support modern hardware. You won't be getting USB support, and parallel/serial ports are quickly disappearing. Where would you find a modem with drivers for your old OS? Where would you find a dial-up ISP, let alone one that would support 1200 baud or whatever you'd be limited to?

        You're going to be *far* better off running 30 year old software under emulation, where these things can be faked.
    • Re: (Score:3, Insightful)

      by sgt scrub ( 869860 )

      we'd have to rely on better processor design.
      Not to mention we'd have to rely on better software design. The way Moore's Law affects software, by allowing it to bloat, is anti-technology.
    • by LWATCDR ( 28044 )
      It is really going to hurt when computers reach a stable point. Let's face it, toasters don't get much better every year. The reason is that they are good enough at making toast now that there isn't a lot of need for improvement. PCs are reaching that point. If you don't play games, why do you need a faster PC?
      Will it run Firefox faster? Make Quicken more useful? At some point Moore's law will push the price down to next to nothing.
      • I disagree that PCs are reaching a point of stability, but even if they were, it wouldn't matter. We'd just find new sorts of computers to build. Like smartphones, in-dash navigation systems, media players and such.
        • by LWATCDR ( 28044 )
          But those are all low power, low price, and soon to be low margin items.
          That is what will really kill the industry as we know it. When nobody needs to buy a new version of Office because Office or OpenOffice is good enough. Nobody needs a faster computer, just a cheaper, lower-margin computer. And nobody needs a better smart phone because they all do enough.
          If PCs are powerful enough each die shrink will just make them cheaper and cheaper until the cost of the case and power supply are the most expensive part
      • by shmlco ( 594907 )
        The next big step is video. Real-time rendering and support systems to make cutting your own home movie (or business product demo) as easy as it is now to do an iPhoto album. As-fast-as-you-can-transfer-it down-rezzing of movies and TV shows to your iPhone or iPod. Multi-way video conferencing aka iChat. Video is a ton-and-a-half of data, and an 8-core Mac Pro still can't do everything it needs to do in real time.

        Besides, I'm not going to be happy until we have Star Trek voice recognition and "secretary-level" AI
    • No, the real problem as I see it is not the flaws of processor design, but a software industry that can't keep up with it. Rather than getting elegant and refined pieces of software written, we're constantly struggling to keep up with all of the latest features. Once we get to a point where hardware growth has slowed to a crawl, only then will software truly come of age.
  • by Edward Ka-Spel ( 779129 ) on Wednesday April 25, 2007 @12:48PM (#18872135)
    It's not a law, it's an observation.
    • Re: (Score:2, Insightful)

      by Ngarrang ( 1023425 )
      Agreed. Why the industry chooses to measure itself against what some may consider just an anecdotal observation I will never understand.

      I suppose the laymen need something like it to rally around, though.

      Sure, double the number of transistors! But, did that do anything useful? Did you gain enough performance to offset the complexity you just created? In the drive to "keep up with Moore's Law", are we better off? Are the processors now "better", or simply faster to make up for how fat they have become?
      • by jhfry ( 829244 ) on Wednesday April 25, 2007 @01:03PM (#18872341)
        If you think that Intel or AMD double the number of transistors in an effort to keep up with Moore's Law, then you know nothing about business.

        No one does anything in an effort to prove Moore correct... they do it for their own benefit. Intel does it to stay ahead of their competition and continue to keep selling more processors. If they chose to stop adding transistors they could pretty much count on losing the race to AMD, and likely becoming obsolete in a very short time.

        I agree that more transistors != better... however it is indeed the easiest and least complex way to increase performance. Changing the architecture of the chip, negotiating with software developers to support it, etc., is far more complex than adding more transistors.
        • Well written, JHFry. Yes, my initial reply was a bit simplistic. The consumers ultimately dictate the increases we have seen.

          In light of this, is the media guilty of over-hyping the concept? In other words, would we even be beating this subject into the ground if it were not for reporters heralding the "defeat of Moore's Law for 10 more years!" or some other similar headline?
      • by TheRaven64 ( 641858 ) on Wednesday April 25, 2007 @01:42PM (#18872877) Journal
        It takes roughly 3-5 years to design a modern CPU. At the start of the process, you need to know how many transistors you will have to play with. If you guess too few, you can do some tricks like adding more cache, but you are likely to have a slower chip than you wanted. If you guess too many, you end up with a more expensive chip[1]. Moore's 'law' is a pretty good first-approximation guess of how many you will have at the end of the design process. A company that can't make this prediction accurately is not going to remain competitive for long.


        [1] There is no real upper bound on the number of transistors you can fit on a chip, just the number you can fit for a given investment.
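        As a rough illustration of that planning exercise (the starting budget, cadence, and design time below are made-up example numbers, not figures from this comment):

            // Sketch: estimate the transistor budget expected at tape-out under a
            // Moore's-law doubling cadence. All figures are illustrative.
            #include <cmath>
            #include <cstdio>

            int main() {
                const double budget_today      = 100e6; // assumed budget today: 100M transistors
                const double doubling_months   = 24.0;  // assumed Moore's-law cadence
                const double design_time_years = 4.0;   // middle of the 3-5 year design cycle

                double budget_at_tapeout =
                    budget_today * std::pow(2.0, design_time_years * 12.0 / doubling_months);

                std::printf("Plan for roughly %.0f million transistors at tape-out.\n",
                            budget_at_tapeout / 1e6);
                // Guess too few and the chip is slower than it could have been;
                // guess too many and the die (and therefore the chip) costs more.
                return 0;
            }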

        • And while Moore's Law is about transistor counts, what we're really interested in is what we can do with those transistors, and that generally means trying to get more performance. Performance has been growing exponentially along with transistor count, though slower since it's hard to turn a 1% increase in transistor count into a full 1% of performance (not to mention harder to quantify). I think 2 years is the commonly used time constant for the doubling of performance, vs 18 months for transistor counts
    • Re: (Score:2, Insightful)

      by vux984 ( 928602 )
      Mod parent up.
      Seriously.

      Moore's "law" doesn't mean squat. Its not like gravity. Its more like noticing that I've never had a car accident.

      Then, one day, I will, and the "Law of Magical Excellent Driving" that I've been asserting has been an invisible hand guiding my car around has been violated. Oh noes! How could this have happened?! How did this law which had protected my safety for all those years suddenly fail to apply? ...

      Yeah. Right.
      • GP is absolutely correct; one interesting question we might ask, however, is whether Moore's "Law"/Observation has actually "driven" development that wouldn't have existed otherwise. That is, has the mere existence of Moore's Law resulted in it growing legs at any stage and actually *driving* the changes it supposedly just observed?
      • Then, one day, I will, and the "Law of Magical Excellent Driving" that I've been asserting has been an invisible hand guiding my car around has been violated. Oh noes! How could this have happened?! How did this law which had protected my safety for all those years suddenly fail to apply? ...

        You can't ignore the fact that you would be a different driver today if your "Law of Magical Excellent Driving" had not upheld itself. If you were regularly involved in accidents, that would affect your opinions of buy
  • No significance. (Score:5, Insightful)

    by Frosty Piss ( 770223 ) on Wednesday April 25, 2007 @12:48PM (#18872139)
    "Moore's Law" is not a real law. In reality, it is not relevant at all. It's kind of a cute thing to mention, but when it gets down to the real world engineering, it has no significances.
    • by truthsearch ( 249536 ) on Wednesday April 25, 2007 @12:58PM (#18872273) Homepage Journal
      While you are correct, it has value as accurate foresight. So the question is, was it just an observation or did it become a self-fulfilling prophecy? If it was a self-fulfilling prophecy, then what other judgements can we make now that may drive technology in the future?
      • Moore's Law (in its original 1965 form) strikes me as an observation of what engineers are comfortable doing if there are no physical constraints. That is to say, there will be a new generation of semi fab equipment built every 18 months, and it will be able to image twice as many transistors on the same area of silicon.

        That's a 30% linear scale reduction, which is something that any engineer would be happy to pursue for the next version of their equipment.

        Ask them to make it 50% smaller scale in the next
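        (Spelling out the arithmetic behind that 30% figure, which is mine rather than the poster's: doubling the transistor count on a fixed area means each linear dimension shrinks by a factor of 1/sqrt(2).)

            \frac{\ell_{\text{new}}}{\ell_{\text{old}}} = \frac{1}{\sqrt{2}} \approx 0.707,
            \qquad 1 - \frac{1}{\sqrt{2}} \approx 0.29 \approx 30\%\ \text{linear scale reduction per generation.}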

    • I'd agree. The progress we've seen has been motivated primarily by people trying to make money. More specifically, competition for the giant sums of money being spent on faster computers. Not that there's anything wrong with that.

      On the level of individual engineers, I doubt there are many of them out there kept up at night worrying about how their progress relates to the general progress curve over the past couple decades. Simply put, we've all got better things to worry about.

      Moore's Law is more a descrip
    • by treeves ( 963993 )
      I disagree - Intel, for one, has made it into the thing that drives everything they do. It's not a natural law, by any means, but they act as though they HAVE to keep on the track that it predicts. It becomes kind of a self-fulfilling prophecy. I think they hate the idea of showing one of their founders, Gordon Moore, to be wrong, even if the prediction has already far outlived any expected lifetime. And because Intel is such a key player in the semiconductor industry, the ITRS [itrs.net] follows.
  • by titten ( 792394 ) on Wednesday April 25, 2007 @12:48PM (#18872149)
    Is Moore's Law helping or hindering the PC industry? I don't think it could hinder it... Do you think we'd have even more powerful computers without it? Or higher transistor density, if you like?
    • by paladinwannabe2 ( 889776 ) on Wednesday April 25, 2007 @01:00PM (#18872311)
      I've heard that companies plan, design, and release new processors based on Moore's Law. In other words, if it doesn't keep up with Moore's Law it's discarded; if it goes faster than Moore's Law, its release is delayed (giving them more time to fine-tune it and get their manufacturing lines ready). If this is the case, then it could be hindering development of new ways of processing (ones with a payoff that takes more than 3 years to develop), and we might even be able to beat Moore's Law rather than follow it. Of course, Moore's Law [wikipedia.org] is awesome enough as it is; I don't feel the need to complain about how it takes two whole years to double the effectiveness of my hardware.
  • Efficiency (Score:4, Interesting)

    by Nerdfest ( 867930 ) on Wednesday April 25, 2007 @12:51PM (#18872171)
    It certainly seems to have had an effect on people's attention to writing efficient code. Mind you, it is more expensive to write code than to throw more processor at things ...
    • by jimicus ( 737525 )
      However now that chips are going in the direction of multicore rather than ever higher clockspeeds, it means that development methodologies have to shift focus in order to take full advantage of it. Not every app has yet done so, not by a long way.

      Case in point: A business application my boss recently bought. Client/server app, with most of the intelligence in the client. They recommended at a minimum a Pentium 4 4GHz. Did such a thing even exist?
    • Re: (Score:3, Interesting)

      by Kjella ( 173770 )
      It certainly seems to have had an effect on people's attention to writing efficient code. Mind you, it is more expensive to write code than to throw more processor at things ...

      Well, you can have software that's feature-rich, stable, cheap, fast or resource efficient, pick any two (yes, you still only get two). Let faster processors handle speed and GB sticks of memory handle resource efficiency, and let coders concentrate on the other three. The margin between "this will be too slow it doesn't matter what we d
  • by jhfry ( 829244 ) on Wednesday April 25, 2007 @12:51PM (#18872179)
    Murphy's Law is more important to nerds than Moore's Law anyway. Where's the /. article about it?

    Murphy tells us that more bugs will be found on release day than any day previous. That your laptop will work fine until the very minute your presentation is scheduled to begin. And that backup generators are unnecessary unless you don't have them.

    Who cares about Moore's law... it's just prophecy from some Nostradamus wannabe.
  • by Red Flayer ( 890720 ) on Wednesday April 25, 2007 @12:52PM (#18872183) Journal

    While most people believe it means that you double the speed and the power of processors every 18 to 24 months, that notion is in fact wrong, Gammage said. "Moore's Law is all about the density...the density of those transistors, and not what we choose to do with it."
    Hmmm. Seems to me Gammage might have it backwards: the misunderstanding of Moore's Law by most people is due to the density... the density of those people.
    • Re: (Score:2, Funny)

      by hAckz0r ( 989977 )
      My own first version of Moore's Law states, in rule one, that the 'density' of the sales force is inversely proportional to the 'core size' (N) of the sales force times e^2 [e.g. 1/(N*e^2)]. That is the only "density measurement" worth paying attention to when buying any new computer equipment.


      My second law of 'density' states that the PR intelligence quotient is randomly modulated by Schroedinger's cat in the next room, and is only measurable when not actually listening to it.

      • Re: (Score:3, Funny)

        by Red Flayer ( 890720 )

        My second law of 'density' states that the PR intelligence quotient is randomly modulated by Schroedinger's cat in the next room, and is only measurable when not actually listening to it.
        Wow, you deserve a Nobel Prize. You've figured out how to directly measure a null value!
  • by eldavojohn ( 898314 ) * <eldavojohn@noSpAM.gmail.com> on Wednesday April 25, 2007 @12:54PM (#18872211) Journal
    I always viewed this as an observation or rule of thumb, not a law.

    Moore (or Mead for that matter) didn't get up one day and declare that the number of transistors on a square centimeter of space will double every 18 to 24 months. Nor did he prove in any way that it has always been this way and will always be this way.

    He made observations and these observations happen to have held true for a relatively long time in the world of computers. Does that make them a law? Definitely not! At some point, the duality that small particles suffer will either stop us dead in our tracks or (in the case of quantum computers) propel us forward much faster than ever thought.

    Why debate whether a well-made observation helps or hinders the industry when it's the industry doing it to itself?!
    • Why debate whether a well-made observation helps or hinders the industry when it's the industry doing it to itself?!

      Because it sells ad space on a web page which has been slashdotted. Duh. ;)
    • by oGMo ( 379 )

      Why debate whether a well-made observation helps or hinders the industry when it's the industry doing it to itself?!

      Because clearly, clearly, if this law is a problem, we should repeal it or, nay, make a NEW law that keeps the industry safe. Yes. Clearly we need government intervention here to keep the number of transistors over time to a safe minimum. Terrorists. This will protect the industry (terrorists!).

      Once they're done with this, they can pass a law to prevent gunpowder from igniting unless the u

    • Snippets like the one below are a recurring theme in the "Moore's law is bad" camp:

      Gammage said, "just to keep up, just to make sure that you're capable of supporting the software that's running within your environment."

      I don't understand this perspective - especially on the enterprise side. Did the applications you were running suddenly slow down because a new CPU came out? Then why lament the rate of progress?

      One valid argument is the frustration of having to upgrade hardware to get acceptable performance on

  • Better Summary (Score:5, Informative)

    by Palmyst ( 1065142 ) on Wednesday April 25, 2007 @12:54PM (#18872219)
    The core of their argument is that instead of actually delivering the same performance at lower prices, Moore's Law delivers more performance at the same prices. That is, you can buy Cray-1-level performance for $50, but you can't buy Apple I-level performance for $0.001. The second level of their argument is that this march of performance forces users to keep spending money to upgrade to the latest hardware, just to keep up with the software.
    • Isn't the fact that developers are exploiting the available processing power (putting pressure on hardware manufacturers to keep ahead) validation for the industry's focus on more performance at the same price?
      • That would be the case but for the fact that the most common CPU-hogging applications are the least efficient pieces of code on an average box. Word processors should not require anything close to the amount of memory and CPU speed that they do today. For example, Word will refuse to do grammar checking if you have less than 1GB of RAM. And look at Adobe Reader. That thing is several times slower than pretty much any other PDF reader, even the ones that come close to having all the features (even the ones t
    • You can get more than Apple I level performance for substantially less than $1.00. I don't know what the cheapest 8-bit microprocessor is going for these days, but you can buy flash-based microcontrollers with substantially more power than a 6502 for less than $0.50 each.

      It seems like the ultimate limiting factor is in packaging and testing - you'll be spending a certain amount for a fully-tested chip in a plastic package, no matter what the actual chip is. That price will have more to
  • Definitely Both (Score:2, Insightful)

    by john_is_war ( 310751 )
    With companies driving to increase transistor density by decreasing process size, the rate at which we can accurately apply these methods is slowing. With each decrease in process size, a lot of issues arise with power leakage. This is where multi-core processors come in. They are the future because of the speed cap of processors. And hopefully this will spur an improvement in microprocessor architecture.
  • The Real Story (Score:4, Insightful)

    by tomkost ( 944194 ) on Wednesday April 25, 2007 @12:57PM (#18872267)
    The real story is that Moore's Law describes the basic goal of the semiconductor industry. Perhaps there are better goals, but they tend to get swallowed up in the quest for smaller transistors. The other real story is Gates' Law: I will use up those extra transistors faster than you can create them; my hardware OEMs need a bloated OS that will drive new HW replacement cycles. I also seem to remember Moore's Law was often quoted as a doubling every year; now I see some saying 18-24 months, so I think the rule is in fact slowing down. We are pushing into the area where it takes a lot of effort and innovation to get a small increase in density. Even still, Moore's Law has always been a favorite of mine! Tom
    • It was always 18 months. Looking at transistor speed and density (number of X-istors per cm2), Moore observed that it looks like we are doubling the density, and doubling the speed every 18 months.
  • Lots of sloppy, inefficient software out there. (I did not say MSFT.) It gets "rescued" by faster, larger computers. I am not advocating the old days of assembly code, but there is room for better coding.
    • While this is true, it allows for more layers of code to exist and more problems to be solved. When the Mac first came out I thought "They have this super-fast Moto 68000 processor, but then they put an OS on it that slows things down to a crawl." And then the hardware caught up with it and saved the concept of a GUI.
  • Why? (Score:4, Interesting)

    by malsdavis ( 542216 ) on Wednesday April 25, 2007 @12:59PM (#18872289)
    Why do computers in general need to get any faster these days?

    Ten years ago I wouldn't have believed I would ever ask such a question, but I have been asking it recently as my retired parents are looking to buy a computer for the web, writing letters and emails. I've told them specifically "DO NOT BUY VISTA" (why on earth would anyone want that ugly memory-hog?), so I just can't think of a single reason why they need even one of the medium-spec machines.

    Personally, I like my games, so "the faster the better" will probably always be key. But for the vast majority of people what is the point of a high-spec machine?

    Surely a decent anti-spyware program is a much better choice.

    • by kebes ( 861706 )
      You know, I've also had this "do we really need faster computers?" thought more than once.

      Yet inevitably I eventually encounter a situation where my computer is having trouble keeping up, and I'm reminded that, yes, I would indeed like to have a faster computer, and I'd be willing to pay for it (up to some level, obviously).

      These "I want more speed" situations don't come up that frequently, but they do come up. And I can think of millions of ways that having arbitrarily more computing power could be put to
    Ten years ago I wouldn't have believed I would ever ask such a question, but I have been asking it recently as my retired parents are looking to buy a computer for the web, writing letters and emails. I've told them specifically "DO NOT BUY VISTA" (why on earth would anyone want that ugly memory-hog?), so I just can't think of a single reason why they need even one of the medium-spec machines.

      People have been asking this question since there have been PCs. The N-1th (and usually the N-2th) generation of PCs alw

      • This is my point: since Win 98 / 2000 I can't really think of any major usability increase.

        It seems to me that the upgrade cycle is an extremely artificial one with Microsoft and PC Vendors both working together to force people to upgrade when they really do not need or even particularly want to.

        Take Windows Vista: it is basically a clone of Windows XP (with a few freeware apps built in), yet somehow (I've yet to figure out why) it consumes so much memory that a new computer is virtually required for anyone
    • by jimicus ( 737525 )
      Plenty of people make that argument, and have been doing so for years.

      Yet even today I can point you at a few real business applications which could really benefit from more power. I have no doubt whatsoever that in a few years' time they'll be OK on anything you're likely to run them on, but another troop of applications will have come along which require more power.
    • Why do computers in general need to get any faster these days?

      Medicine, physics, engineering, and AI all benefit from increasing computer power. There are probably numerous other fields that benefit secondarily, but those are probably the most important. Protein folding and cellular simulation will ultimately do more for medicine than anything in previous history, probably the same with physics and engineering. Nanotechnology will require massive computing power to design and test.

      Computers would not g
  • Cost of fabs... (Score:3, Interesting)

    by kebes ( 861706 ) on Wednesday April 25, 2007 @01:02PM (#18872327) Journal

    "...Every 24 months, you're doubling the number of transistors, doubling the capacity," he said. "But if you think about the process you're going through--they're taking a wafer, they put some devices on it, they cut it up and sell it to you--the cost of doing that is not doubling every 18 to 24 months."
    Is he claiming that the cost of doing lithography on wafers doesn't increase? That's crazy talk! The cost of building and running fabs is in fact also growing exponentially. According to Rock's Law [wikipedia.org], the cost of building a chip-making plant doubles every four years, and is already into the multi-billion dollar range.

    In fact there's a lot of debate over whether Moore's Law will break down due to fundamental barriers in the physics, or whether we will first hit an economic wall: no bank will be willing (or able?) to fund the fantastically expensive construction of the new technologies.
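    A back-of-the-envelope way to see the tension between the two "laws" (the doubling periods below are the commonly quoted ones, used purely for illustration):

        // Sketch: Moore's-law transistor growth (doubling every ~2 years) vs.
        // Rock's-law fab cost growth (doubling every ~4 years). While both
        // cadences hold, fab cost per transistor keeps falling, even though the
        // absolute cost of each new fab keeps exploding.
        #include <cmath>
        #include <cstdio>

        int main() {
            const double transistor_doubling_years = 2.0; // assumed Moore cadence
            const double fab_cost_doubling_years   = 4.0; // assumed Rock cadence

            for (int year = 0; year <= 20; year += 4) {
                double transistors = std::pow(2.0, year / transistor_doubling_years);
                double fab_cost    = std::pow(2.0, year / fab_cost_doubling_years);
                std::printf("year %2d: transistors x%7.1f, fab cost x%5.1f, cost per transistor x%.3f\n",
                            year, transistors, fab_cost, fab_cost / transistors);
            }
            return 0;
        }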
    • Not to worry: the number of chips produced by a successful product line grows at a rate that more than covers the growing cost (sucks to be the ones making an Itanium, doesn't it, Intel?). The 486 is still a hot seller in the embedded industry.
    • Re: (Score:2, Funny)

      by Tablizer ( 95088 )
      The cost of building and running fabs is in fact also growing exponentially. According to Rock's Law ...

      Rock's Law??? Tablizer's Law: The number of tech "laws" doubles every 2 years.
             
  • Cue all the pedantic asshats who absolutely have to point out that Moore's Law really isn't a Law... it's an observation.
    • It's true - even if pedantic. And the difference is important - because a Law or a Theory has predictive power, and an observation doesn't.
  • Two good examples are the use of goto and global warming/cooling.
    1. When the issue of goto was first looked at back in the 60's, they found that there were massive problems due to using gotos INTO blocks. The study found that judicious use of gotos made perfect sense, but only out of blocks or within the same block. Now, we are taught that gotos are bad.
    2. Back in the 70's, a single report said that global cooling COULD happen. Other scientists showed that it was not happening and most likely could not. But
    • We do have GOTO now, except it is done much more nicely. What are exceptions, I ask? I am in a procedure some 25 levels deep from main() and I encounter an unrecoverable error. Back in Fortran 77 I would have written GOTO 9999 to report an error and quit. Now I throw a string "What? Triangle vertex is a null pointer?" and catch it at some level and handle the error. Far superior to a hard-coded target address to jump to. Functionally similar to GOTO. I would go so far as to venture, in every case where GOTO
      • I did not say that gotos are now replaceable. I just pointed out that the original report has been greatly misunderstood and misused. But yeah, for the most part, I agree that a throw is much better. As it is, I still see code that does setjmp/longjmp, which leaves LOADS of memory unfreed. But for real speed and base operations, C is the answer.
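        A minimal sketch of the exception-vs-GOTO point two comments up (the triangle-vertex message is borrowed from that comment; the function names are invented for the example):

            // An error raised deep in the call stack (one helper here stands in for
            // "25 levels deep") unwinds back to a handler near main(), which is what
            // "GOTO 9999" did in Fortran 77 error handling -- minus the hard-coded
            // jump target.
            #include <cstdio>
            #include <stdexcept>

            void check_vertex(const double* vertex) {
                if (vertex == nullptr)
                    throw std::runtime_error("What? Triangle vertex is a null pointer?");
            }

            void deep_computation() {
                // imagine this call sitting many stack frames below main()
                check_vertex(nullptr);
            }

            int main() {
                try {
                    deep_computation();
                } catch (const std::exception& e) {
                    std::fprintf(stderr, "error: %s\n", e.what());
                    return 1; // report the error and quit, as the GOTO 9999 block would have
                }
                return 0;
            }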
  • So has chip performance doubled every 18 months?

    I have tried to find out, but didn't get a clear enough answer from what is publicly available on the internet.
  • by Ant P. ( 974313 ) on Wednesday April 25, 2007 @01:09PM (#18872423)
    ...Like GHz or lines of code.

    Take the Itanic for example, or the P4, or WindowsME/Vista.
    • Moore's Law cannot be a crappy measurement, because it isn't a measurement at all. It is not even a measure (such as GHz or lines of code). It's an observation, and an extrapolation from that observation. The observation is simply true. One might argue its relevance, but that's it. The extrapolation is questionable, but seems to have worked up to now. It probably will stop working at some time in the future.
  • If you use Windows, each new installation (or daemonic possession) in your computer negates whatever gains you may have gotten through Moore's Law.

    Computers only 12 months old with a _gigabyte_(!) of RAM are not robust enough to run a full install of Vista with all the bells and whistles, for example.

    --
    BMO

    Yeah, mod me troll, bitch. I've got more karma than you've got mod points.
  • by Animats ( 122034 ) on Wednesday April 25, 2007 @01:23PM (#18872623) Homepage
    • Windows. The speed of Windows halves every two years.
    • Spyware, adware, and malware. Extra CPU power is needed to run all those spam engines in the background.
    • Spam filtering. Running Bayesian filters on all the incoming mail isn't cheap.
    • Virus detection. That gets harder all the time, since signature based detection stopped working.
    • Games. Gamers expect an ever-larger number of characters on-screen, all moving well. That really uses resources.

    Despite this, there have been complaints from the PC industry that Vista isn't enough of a resource hog to force people to buy new hardware.

    Computers have become cheaper. I once paid $6000 for a high-end PC to run Softimage|3D. The machine after that was $2000. The machine after that was $600.

  • It's just a law, for chrissakes. It doesn't help, it doesn't hinder. Under certain circumstances it holds, as every sane law would. Eventually someone will need to update it according to changes in the circumstances. Anyway, just a reminder: Moore was talking about the number of transistors, which has more or less been OK up to now. But the thing is, Moore didn't just invent a law out of thin air and the industry followed that rule [I hope you feel the stupidity in that], but he observed how the technology ev
  • by Applekid ( 993327 ) on Wednesday April 25, 2007 @01:30PM (#18872709)
    What we do with the transistors? Run software of course. Enter Wirth's Law [wikipedia.org]:

    "Software is decelerating faster than hardware is accelerating."
    • I call Bull. On my first PC, doing a print preview on WP 3 for DOS was a big, slow process. Then we did colour. Then we did sound. Then we did video. Then we did DVD-quality video with surround sound. Three things have required the extra power: media types, usability (Clippy!) and using half of your resources to keep the other half malware-free.
  • 186,000 miles per second. That's a law.
  • Marketers, execs, and pundits have stolen ownership of Moore's law. What none of them understand is the computer geek culture that the "law" was spawned from. Moore's law is in the same realm as Murphy's law, which is also not a law, but fun to invoke.

    However, the paid talking head pundits grab it and start talking about it and dissecting it and taking it literally. It's not a topic for geeks any more, it's not funny, and it's stupid to be discussing it in an article.

    I propose a real law. A legal law.
  • They submit "$module is too slow" bugs. I do nothing for 18 months and then resolve it saying, "it is fast enough now".
  • Moore's Law is a huge help for technical computing. Anytime you need to crunch numbers for something, it's a good bet that you'll have more processing power in next year's hardware. This gets us closer to really important breakthroughs in science and technology.

    It's a monster hindrance for mainstream computing. Having all this processing power available to you, coupled with cheap memory, means you can be as lazy as you want when you write software. I do systems integration work for a large company, and the
  • Another subtle problem is that it causes inflation to be underestimated, because the Bureau of Labor Statistics figures that a computer that is five times as "powerful" is worth more, even if you personally are not doing significantly more with it than with the computer you bought five years ago.
  • It contributes to the production and distribution of really bad code. Firefox with tens of millions of copies is a case in point. (Oh yes, they *claim* with version 3 they are going to consider performance). I'm still waiting for Firefox to run in the same memory footprint and as fast as Netscape 4.72 did. (Firefox will not start with less than ~55MB of memory under Linux.)

    When excessive amounts of memory and processor speeds allow you to release software which by any stretch of the imagination is "bloa
  • Increased transistor count can be used for higher volumes, greater performance, or bigger cache.

    Up until a few years ago, more performance and memory resulted in a distinct return on investment. Right now, most machines are "good enough" for present apps. I predict a shift to system on a chip designs driving small reasonably powerful systems like the OLPC.

    The problem is the industry adapting to this new model.
  • Just because the hardware gets faster and faster, and CPU buses, networks and graphics cards change and improve, people think that everything has changed and that the old-school rules don't apply. I'm thinking about team organizations and how one produces a product. There are lessons we learned a hundred years ago during the industrial revolution that I continually find people relearning today. About making changes at the last minute and quality assurance. Information technology allows some parts of managi

