
Does Moore's Law Help or Hinder the PC Industry?

ScuttleMonkey posted more than 6 years ago | from the start-counting-the-misquotes dept.

Technology 191

An anonymous reader writes to mention that two analysts recently examined Moore's Law and its effect on the computer industry. "One of the things both men did agree on was that Moore's Law is, and has been, an undeniable driving force in the computer industry for close to four decades now. They also agreed that it is plagued by misunderstanding. 'Moore's Law is frequently misquoted, and frequently misrepresented,' noted Gammage. While most people believe it means that you double the speed and the power of processors every 18 to 24 months, that notion is in fact wrong, Gammage said. 'Moore's Law is all about the density...the density of those transistors, and not what we choose to do with it.'"

191 comments

Both (5, Insightful)

DaAdder (124139) | more than 6 years ago | (#18872105)

I suppose it does both.

The drum beat of progress pushes development to its limits, but at the same time hinders some forms of research or real-world tests of computation theory, for all save the few chip makers currently dominating the market.

Re:Both (3, Insightful)

ElectricRook (264648) | more than 6 years ago | (#18872411)

We're also not paying US$800 for an 80387 math co-processor (which only did floating point), like a friend of mine did in the '80s. That would be about US$1,600 in today's dollars.
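
As a rough sanity check on that figure, a CPI-style adjustment can be sketched in a few lines of C++. The index values below are approximate (US CPI-U was roughly 108 in 1985 and roughly 202 in 2006) and are used only for illustration; they are not taken from the comment.

    #include <iostream>

    int main() {
        // Rough CPI-based adjustment; the index values are approximate and
        // only sanity-check the poster's "US$800 then ~= US$1,600 now".
        const double price_1985 = 800.0;
        const double cpi_1985   = 108.0;  // approximate US CPI-U, 1985
        const double cpi_2006   = 202.0;  // approximate US CPI-U, 2006
        const double adjusted   = price_1985 * (cpi_2006 / cpi_1985);
        std::cout << "US$" << price_1985 << " in 1985 is about US$"
                  << adjusted << " in 2006 dollars\n";   // ~US$1,496
        return 0;
    }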

Re:Both (0)

Anonymous Coward | more than 6 years ago | (#18873147)

You also have to think about how much of an impact that chip actually had on processing. It'd be equivalent to being able to convert your DVD to DivX in 20 seconds as opposed to an hour.

I'm gonna vote for hurts - big time (5, Insightful)

$RANDOMLUSER (804576) | more than 6 years ago | (#18872129)

If only because it keeps us tied to the x86 instruction set. If we didn't have the luxury of increasing the transistor count by an order of magnitude every few years, we'd have to rely on better processor design.

Re:I'm gonna vote for hurts - big time (2, Interesting)

bronzey214 (997574) | more than 6 years ago | (#18872223)

Sure, being tied to the x86 architecture hurts, but it's nice to have a pretty base standard as far as architectures go and not have to learn different assembly languages, data flows, etc. for each new generation of computers.

Re:I'm gonna vote for hurts - big time (2, Insightful)

$RANDOMLUSER (804576) | more than 6 years ago | (#18872381)

Game designers do it all the time. Compiler writers do it all the time. For 99.5% of the programmers out there, the underlying architecture is a black box; they only use the capabilities of the high-level language they happen to be using. But the final performance and capabilities of the system as a whole depend on that underlying architecture, which has been a single-accumulator, short-on-registers, byzantine instruction set (must. take. deep. breaths...) anachronism for far too long.

Re:I'm gonna vote for hurts - big time (2, Insightful)

Chosen Reject (842143) | more than 6 years ago | (#18872749)

And for 99.9% of users the underlying architecture is a black box. But for 100% of applications the underlying architecture is important, and if the application doesn't run then the user gets upset. It doesn't matter if the application only needs to be recompiled; even if the developers gave away free recompiles to people who had previously purchased the software, it would require users to know which architecture they have (already difficult for most people) and make sure they get the right one.

Have you ever seen people confused about which package to get for which Linux distro? They don't know if they should get the one for Fedora, Ubuntu, Knoppix, Gentoo or Debian, and then they have to decide i386, x86_64, ppc, or whatever else there is.

Yes, most developers would have no problem, and most users wouldn't care once everything was working, it's just getting things into a working state that would suck when underlying architectures are changing every few years.

Re:I'm gonna vote for hurts - big time (0)

Anonymous Coward | more than 6 years ago | (#18872943)

I'm sorry, but my mom is now using gentoo.

I had her upgrade evolution the other day. Do you think she cared that she has an amd64 processor?

Re:I'm gonna vote for hurts - big time (1)

rubycodez (864176) | more than 6 years ago | (#18872475)

A new architecture every fifteen years wouldn't be so bad; staying in this essentially i386-plus-some-bolt-ons rut is getting tiresome. Look at the cool things Sun is doing with the new T2 chip because SPARC is somewhat less constipated than i386. Now imagine a chip designed from the ground up to be massively multi-core / SMP.

Re:I'm gonna vote for hurts - big time (1)

TheRaven64 (641858) | more than 6 years ago | (#18872789)

Now imagine a chip designed from the ground up to be massively multi-core / SMP
Actually, I'd rather not. SMP embodies two concepts:
  1. Homogeneous cores.
  2. Uniform memory architecture.
For maximum performance and scalability, you don't really want either of these.

Re:I'm gonna vote for hurts - big time (2, Interesting)

TheLink (130905) | more than 6 years ago | (#18872973)

So any recent benchmarks of how the latest T2 stuff does vs recent x86 machines in popular server apps like _real_world_ webservers, databases?

AFAIK, it was slower than x86 the day it was launched, and when Intel's "Core 2" stuff came out it got crushed in performance/watt.

Instruction set != architecture (2, Insightful)

Criffer (842645) | more than 6 years ago | (#18872833)

If only because it keeps us tied to the x86 instruction set. If we didn't have the luxury of increasing the transistor count by an order of magnitude every few years, we'd have to rely on better processor design.

I'm just going to refer you to my comment made earlier today when discussing a "new, better" processor architecture. Because there's always someone who thinks we are somehow "hindered" by the fact that we can still run 30-year old software unmodified on new hardware.

See here [slashdot.org].

Re:Instruction set != architecture (2, Informative)

$RANDOMLUSER (804576) | more than 6 years ago | (#18873769)

The miniscule number of registers everyone complains about is irrelevant
Were it not for the opcode fetches to register-dance (because only certain registers can do certain things), or having to use memory to store intermediate results (because there aren't enough registers), or stack-based parameter passing (not enough registers), or, again, the single accumulator (more opcode fetches and more register dancing), you might have a point. But what you're suggesting (in the rest of your post) is that having 1000 horsepower on bicycle tires is the same as having 500 horsepower on real tires - and I can't agree.

...30-year old software unmodified...
Can you name any 30-year old software that is worth running unmodified? Hell, I'll give you a break. Can you name any 10-year old software that is worth running unmodified?

Re:I'm gonna vote for hurts - big time (2, Insightful)

sgt scrub (869860) | more than 6 years ago | (#18873003)

we'd have to rely on better processor design.
Not to mention we'd have to rely on better software design. The way Moore's Law affects software, by allowing it to bloat, is the anti-technology.

Moore's Observation (5, Insightful)

Edward Ka-Spel (779129) | more than 6 years ago | (#18872135)

It's not a law, it's an observation.

Re:Moore's Observation (1, Insightful)

Ngarrang (1023425) | more than 6 years ago | (#18872239)

Agreed. Why the industry chooses to measure itself against what some may consider just an anecdotal observation I will never understand.

I suppose the laymen need something like that to rally around, though.

Sure, double the number of transistors! But, did that do anything useful? Did you gain enough performance to offset the complexity you just created? In the drive to "keep up with Moore's Law", are we better off? Are the processors now "better", or simply faster to make up for how fat they have become?

Re:Moore's Observation (4, Insightful)

jhfry (829244) | more than 6 years ago | (#18872341)

If you think that Intel or AMD double the number of transistors in an effort to keep up with Moore's Law, then you know nothing about business.

No one does anything in an effort to prove Moore correct... they do it for their own benefit. Intel does it to stay ahead of their competition and continue to keep selling more processors. If they chose to stop adding transistors they could pretty much count on losing the race to AMD, and likely becoming obsolete in a very short time.

I agree that more transistors != better... however it is indeed the easiest way, and least complex, to increase performance. Changing the architecture of the chip, negotiating with software developers to support it, etc, is far more complex than adding more transistors.

Re:Moore's Observation (1)

Ngarrang (1023425) | more than 6 years ago | (#18872459)

Well written, JHFry. Yes, my initial reply was a bit simplistic. The consumers ultimately dictate the increases we have seen.

In light of this, is the media guilty of over-hyping the concept? In other words, would we even be beating this subject into the ground if it were not for reporters heralding the "defeat of Moore's Law for 10 more years!" or some other similar headline?

Re:Moore's Observation (4, Insightful)

TheRaven64 (641858) | more than 6 years ago | (#18872877)

It takes roughly 3-5 years to design a modern CPU. At the start of the process, you need to know how many transistors you will have to play with. If you guess too few, you can do some tricks like adding more cache, but you are likely to have a slower chip than you wanted. If you guess too many, you end up with a more expensive chip[1]. Moore's 'law' is a pretty good first-approximation guess of how many you will have at the end of the design process. A company that can't make this prediction accurately is not going to remain competitive for long.


[1] There is no real upper bound on the number of transistors you can fit on a chip, just on the number you can fit for a given investment.
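
A minimal sketch of that first-approximation budgeting, in C++. The starting count (~582 million, roughly a 2007 quad-core class part), the 2-year doubling period, and the 4-year design lead time are illustrative assumptions, not figures from the comment.

    #include <cmath>
    #include <iostream>

    // First-approximation transistor budget: start from a known shipping part
    // and assume the count doubles every `period` years.
    double projected_transistors(double today, double years_out, double period = 2.0) {
        return today * std::pow(2.0, years_out / period);
    }

    int main() {
        const double today     = 582e6;  // ~2007 quad-core class die (assumption)
        const double lead_time = 4.0;    // years from design start to shipping silicon
        std::cout << "Budget for a chip shipping in " << lead_time << " years: "
                  << projected_transistors(today, lead_time) / 1e6
                  << " million transistors\n";  // ~2328 million with these numbers
        return 0;
    }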

Re:Moore's Observation (1, Insightful)

vux984 (928602) | more than 6 years ago | (#18872353)

Mod parent up.
Seriously.

Moore's "law" doesn't mean squat. Its not like gravity. Its more like noticing that I've never had a car accident.

Then, one day, I will, and the "Law of Magical Excellent Driving" that I've been asserting has been an invisible hand guiding my car around has been violated. Oh noes! How could this have happened?! How did this law which had protected my safety for all those years suddenly fail to apply? ...

Yeah. Right.

Re:Moore's Observation (1)

VWJedi (972839) | more than 6 years ago | (#18872637)

Mod parent up.

Looks like it's happening already. Now we need someone to mod you up for your explanation.

A "scientific law" should be something that is universally true (everywhere in the universe, at any time past, present, or future) or very nearly so. Moore's "law" seems to be true on earth for the past few decades, but we have no basis for thinking it would be true anywhere else in the universe or that it will continue to hold true in the future.

Re:Moore's Observation (1)

maxwell demon (590494) | more than 6 years ago | (#18872859)

but we have no basis for thinking it would be true anywhere else in the universe or that it will continue to hold true in the future.

Even more, we have a basis for thinking it will not continue to hold true at some time in the future. That's because it's quite unlikely that we will ever have several transistors per atom.
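
A back-of-envelope C++ sketch of why the doubling has to stop long before that point. The transistor count, die size, and atom density below are order-of-magnitude assumptions, not measurements.

    #include <cmath>
    #include <iostream>

    int main() {
        // Back-of-envelope only: a ~2007 CPU die might hold ~1e9 transistors;
        // a silicon die of roughly 1 cm^2 x 0.5 mm contains on the order of
        // 2.5e21 atoms (silicon has ~5e22 atoms per cm^3).
        const double transistors_now = 1e9;
        const double atoms_in_die    = 2.5e21;
        const double doublings_left  = std::log2(atoms_in_die / transistors_now);
        std::cout << "Doublings until one transistor per atom: "
                  << doublings_left << "\n";                 // ~41
        std::cout << "At one doubling every 2 years: ~"
                  << 2.0 * doublings_left << " years\n";     // ~82
        return 0;
    }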

Has the existence of Moore's law changed anything? (1)

Dogtanian (588974) | more than 6 years ago | (#18872799)

GP is absolutely correct; one interesting question we might ask, however, is whether Moore's "Law"/Observation has actually "driven" development that wouldn't have existed otherwise. That is, has the mere existence of Moore's Law resulted in it growing legs at any stage and actually *driving* the changes it supposedly just observed?

Re:Moore's Observation (1)

noidentity (188756) | more than 6 years ago | (#18872897)

And even if it were a law, it would not be the actual cause of the increasing performance, just a simple abstraction of whatever the real causes are. Put another way, massive objects attracted each other before we came up with the "law of gravity".

It's a law (1)

Jotii (932365) | more than 6 years ago | (#18873621)

It's a law just as much as the law of supply and demand and the law of universal gravity are. Only mathematical laws are deductively proven.

No significances. (4, Insightful)

Frosty Piss (770223) | more than 6 years ago | (#18872139)

"Moore's Law" is not a real law. In reality, it is not relevant at all. It's kind of a cute thing to mention, but when it gets down to the real world engineering, it has no significances.

Re:No significances. (4, Insightful)

truthsearch (249536) | more than 6 years ago | (#18872273)

While you are correct, it has value as being accurate foresight. So the question is, was it just an observation or did it become a self-fulfilling prophecy? If it was a self-fulfilling prophecy then what other judgements can we make now that may drive technology in the future?

Re:No significances. (0)

Anonymous Coward | more than 6 years ago | (#18872773)

So the question is, was it just an observation or did it become a self-fulfilling prophecy?

Oh, it definitely became a self-fulfilling prophecy. I design CPUs for a living, and I can't tell you the number of times I stopped and thought to myself "waitasec, I've already used up my allotment of transistors, so I'd better not do that or I'll break Moore's Law!"

Seriously, WTF did you think the answer to that question would be?

Re:No significances. (1)

cowscows (103644) | more than 6 years ago | (#18872533)

I'd agree. The progress we've seen has been motivated primarily by people trying to make money. More specifically, competition for the giant sums of money being spent on faster computers. Not that there's anything wrong with that.

On the level of individual engineers, I doubt there are many of them out there kept up at night worrying about how their progress relates to the general progress curve over the past couple decades. Simply put, we've all got better things to worry about.

Moore's Law is more a description of a "symptom" of the progress, not a cause.

The definition doesnt matter, the effect does. (0)

plasmacutter (901737) | more than 6 years ago | (#18872143)

The "speed (capability) increase" is what matters in determining if moore's law is helpful to the computer industry.

the short and sweet.. it is.. it drives sales, provides greater resources allowing expansion of computer capability.

Answer to the question? (3, Interesting)

titten (792394) | more than 6 years ago | (#18872149)

If Moore's law is helping or hindering the PC industry? I don't think it could hinder it... Do you think we'd have even more powerful computers without it? Or higher transistor density, if you like?

It could be... (3, Funny)

paladinwannabe2 (889776) | more than 6 years ago | (#18872311)

I've heard that companies plan, design, and release new processors based on Moore's Law. In other words, if it doesn't keep up with Moore's Law it's discarded; if it goes faster than Moore's Law, its release is delayed (giving them more time to fine-tune it and get their manufacturing lines ready). If this is the case, then it could be hindering development in new ways of processing (that have a payoff that takes more than 3 years to develop) and we might even be able to beat Moore's Law rather than follow it. Of course, Moore's Law [wikipedia.org] is awesome enough as it is; I don't feel the need to complain about how it takes two whole years to double the effectiveness of my hardware.

Slownewsday (-1, Offtopic)

TheLink (130905) | more than 6 years ago | (#18872165)

Uh there must be some other more interesting story right?

Here's a more interesting question: (3, Funny)

spun (1352) | more than 6 years ago | (#18872255)

Does Cole's law help or hinder picnics?

Discuss.

Efficiency (3, Interesting)

Nerdfest (867930) | more than 6 years ago | (#18872171)

It certainly seems to have had an effect on people's attention to writing efficient code. Mind you, it is more expensive to write efficient code than to throw more processor at things ...

Re:Efficiency (1)

jimicus (737525) | more than 6 years ago | (#18873373)

However now that chips are going in the direction of multicore rather than ever higher clockspeeds, it means that development methodologies have to shift focus in order to take full advantage of it. Not every app has yet done so, not by a long way.

Case in point: A business application my boss recently bought. Client/server app, with most of the intelligence in the client. They recommended at a minimum a Pentium 4 4GHz. Did such a thing even exist?

Murphy's law... (3, Funny)

jhfry (829244) | more than 6 years ago | (#18872179)

is more important to nerds than Moore's law anyway. Where's the /. article about it?

Murphy tells us that more bugs will be found on release day than any day previous. That your laptop will work fine until the very minute your presentation is scheduled to begin. And that backup generators are unnecessary unless you don't have them.

Who cares about Moore's law... it's just prophecy from some Nostradamus wannabe.

density of transistors? (2, Funny)

Red Flayer (890720) | more than 6 years ago | (#18872183)

While most people believe it means that you double the speed and the power of processors every 18 to 24 months, that notion is in fact wrong, Gammage said. "Moore's Law is all about the density...the density of those transistors, and not what we choose to do with it."
Hmmm. Seems to me Gammage might have it backwards: the misunderstanding of Moore's Law by most people is due to the density... the density of those people.

Re:density of transistors? (2, Funny)

hAckz0r (989977) | more than 6 years ago | (#18873693)

My own first version of Moore's Law states, in rule one, that the 'density' of the sales force is inversely proportional to the 'core size' (N) of the sales force times e^2 [i.e. 1/(N*e^2)]. That is the only "density measurement" worth paying attention to when buying any new computer equipment.


My second law of 'density' states that the PR intelligence quotient is randomly modulated by Schroedinger's cat in the next room, and is only measurable when not actually listening to it.

In My Opinion, It Isn't a Law (5, Insightful)

eldavojohn (898314) | more than 6 years ago | (#18872211)

I always viewed this as an observation or rule of thumb, not a law.

Moore (or Mead for that matter) didn't get up one day and declare that the number of transistors on a square centimeter of space will double every 18 to 24 months. Nor did he prove in any way that it has always been this way and will always be this way.

He made observations and these observations happen to have held true for a relatively long time in the world of computers. Does that make them a law? Definitely not! At some point, the duality that small particles suffer will either stop us dead in our tracks or (in the case of quantum computers) propel us forward much faster than ever thought.

Why debate whether a well-made observation helps or hinders the industry when it's the industry doing it to itself?!

Re:In My Opinion, It Isn't a Law (0)

Anonymous Coward | more than 6 years ago | (#18872281)

Moore's Law Enforcement Agents have already been dispatched to beat you silly with slide rules.

Re:In My Opinion, It Isn't a Law (1)

Profane MuthaFucka (574406) | more than 6 years ago | (#18872313)

It's a perfectly fine law. A law is simply a mathematical relationship between a domain and a range. It's a mathematical function which describes observations.

Don't think that lawyers make mathematical laws, and judges enforce adherence of the universe to the laws.

Re:In My Opinion, It Isn't a Law (1)

truthsearch (249536) | more than 6 years ago | (#18872539)

Why debate whether a well-made observation helps or hinders the industry when it's the industry doing it to itself?!

Because it sells ad space on a web page which has been slashdotted. Duh. ;)

Re:In My Opinion, It Isn't a Law (1)

oGMo (379) | more than 6 years ago | (#18872837)

Why debate whether a well-made observation helps or hinders the industry when it's the industry doing it to itself?!

Because clearly, clearly, if this law is a problem, we should repeal it or, nay, make a NEW law that keeps the industry safe. Yes. Clearly we need government intervention here to keep the number of transistors over time to a safe minimum. Terrorists. This will protect the industry (terrorists!).

Once they're done with this, they can pass a law to prevent gunpowder from igniting unless the user has a license, and save a lot of lives!

two issues with the "it's bad" camp (1)

nobodyman (90587) | more than 6 years ago | (#18872911)

These snippets below are a recurring theme in the "Moore's law is bad" camp:

Gammage said, "just to keep up, just to make sure that you're capable of supporting the software that's running within your environment."
I don't understand this perspective - especially on the enterprise side. Did the applications you were running suddenly slow down because a new CPU came out? Then why lament the rate of progress?

One valid argument is the frustration of having to upgrade hardware to get acceptable performance on newer applications. I can empathize, but what's the alternative? You've got three choices, really:
  • Upgrade hardware to run newer software
  • Arbitrarily stifle software innovation so that you don't have to upgrade every 12 months
  • Increase development time so that developers can wring every ounce of efficiency or shave down a footprint*
In my opinion, the first option is the lesser of the three evils.

The problem is that there are often sustained periods where, despite cramming more transistors onto a chip, no particular incentive--that is no compelling innovation--exists in the industry to prompt people or companies to upgrade their equipment.

I agree. Still, let's extrapolate to some sort of conclusion. Do we tell hardware vendors to hold off until The Next Big Thing warrants better hardware? What if The Next Big Thing is out of the question on current hardware?

Software innovation prompts hardware innovation, and hardware innovation prompts software innovation.

Consider folding@home. Developers said, "hey, we've got these GPUs that would be perfect, let's use them." Now GPU clients are some of the top performers. Had we perfected 2D VGA video cards and said "well, that'll do," such an innovation would have been impossible.

*I'm sure that someone will reply with "Well this wouldn't be a problem if people didn't write such sloppy code!". Yes, sure sure. People wrote much better code back in the day and you had to walk 5 miles in the snow just to get to school and all that. Whatever. Even if all coders were brilliant, I would still prefer to have their brilliant minds focused on new features than on fitting code into hardware constraints.

Better Summary (5, Informative)

Palmyst (1065142) | more than 6 years ago | (#18872219)

The core of their argument is that instead of actually delivering same performance at lower prices, Moore's law delivers more performance at same prices. i.e. you can buy Cray-1 level performance for $50, but you can't buy Apple I level performance for $0.001. The second level of their argument is that this march of performance forces users to keep spending money to upgrade to the latest hardware, just to keep up with the software.

Re:Better Summary (1)

GodfatherofSoul (174979) | more than 6 years ago | (#18872757)

Isn't the fact that developers are exploiting the available processing power (putting pressure on hardware manufacturers to keep ahead) validation for the industry's focus on more performance at the same price?

Re:Better Summary (1)

644bd346996 (1012333) | more than 6 years ago | (#18873103)

That would be the case but for the fact that the most common CPU-hogging applications are the least efficient pieces of code on an average box. Word processors should not require anything close to the amount of memory and CPU speed that they do today. For example, Word will refuse to do grammar checking if you have less than 1GB of RAM. And look at Adobe Reader. That thing is several times slower than pretty much any other PDF reader, even the ones that come close to having all the features (even the ones that nobody uses).

The problem is that software has bloated such that the only way to get a responsive UI is to upgrade the hardware. None of the big software companies have any qualms about introducing noticeable but small latency into the UI. If instead they worked to guarantee that all simple operations completed imperceptibly quickly, the majority of users wouldn't need or have CPUs faster than 1GHz.

Definitely Both (2, Insightful)

john_is_war (310751) | more than 6 years ago | (#18872263)

With companies driving to increase transistor density by decreasing process size, the pace at which we can reliably apply these methods is slowing. With each decrease in process size, a lot of issues arise with power leakage. This is where multi-core processors come in. These are the future because of the speed cap of processors. And hopefully this will spur an improvement in microprocessor architecture.

The Real Story (4, Insightful)

tomkost (944194) | more than 6 years ago | (#18872267)

The real story is that Moore's law describes the basic goal of the semiconductor industry. Perhaps there are better goals, but they tend to get swallowed up in the quest for smaller transistors. The other real story is Gates' law: I will use up those extra transistors faster than you can create them; my hardware OEMs need a bloated OS that will drive new HW replacement cycles. I also seem to remember Moore's law was often quoted as a doubling every year; now I see some saying 18-24 months, so I think the rule is in fact slowing down. We are pushing into the area where it takes a lot of effort and innovation to get a small increase in density. Even still, Moore's law has always been a favorite of mine! Tom

Re:The Real Story (1)

ElectricRook (264648) | more than 6 years ago | (#18872611)

It was always 18 months. Looking at transistor speed and density (number of X-istors per cm²), Moore observed that we seemed to be doubling the density, and doubling the speed, every 18 months.

hides crappy software (1)

peter303 (12292) | more than 6 years ago | (#18872275)

Lots of sloppy, inefficient software out there. (I did not say MSFT.) It gets "rescued" by faster, larger computers. I am not advocating the old days of assembly code, but there is room for better coding.

Re:hides crappy software (1)

soft_guy (534437) | more than 6 years ago | (#18872419)

While this is true, it allows for more layers of code to exist and more problems to be solved. When the Mac first came out I thought "They have this super-fast Moto 68000 processor, but then they put an OS on it that slows things down to a crawl." And then the hardware caught up with it and saved the concept of a GUI.

Why? (3, Interesting)

malsdavis (542216) | more than 6 years ago | (#18872289)

Why do computers in general need to get any faster these days?

Ten years ago I wouldn't believe I would ever ask such a question but I have been asking it recently as my retired parents are looking to buy a computer for the web, writing letters and emails. I've told them specifically "DO NOT BUY VISTA" (why on earth would anyone want that ugly memory-hog?), so I just can't think of a single reason why they need even one of the medium-spec machines.

Personally, I like my games, so "the faster the better" will probably always be key. But for the vast majority of people what is the point of a high-spec machine?

Surely a decent anti-spyware program is a much better choice.

Re:Why? (1)

VWJedi (972839) | more than 6 years ago | (#18872951)

Why do computers in general need to get any faster these days?

For the same reason automakers don't build cars that last longer. If I could buy a (car / computer / OS) that would last me for decades, the vendor wouldn't see me (and my money) again for a looooong time. If nothing is ever obsolete, they can only sell you a new computer if yours breaks or gets stolen, or if your need expands (i.e. purchasing an additional computer so your child can have his / her own).

But I agree with your point. At this point, a computer that's a few years old running XP and semi-recent applications will be more than enough for the majority of the population. The problem comes when it breaks (physically or OS / software) or you need additional software. Where can you find parts and software for an "obsolete" machine?

Re:Why? (0)

Anonymous Coward | more than 6 years ago | (#18873077)

People were saying very similar things to this over a decade ago when chips were hitting 100Mhz. The fact is, that as processor power increases more uses are adopted. Look at slashdot - the fancy new javascript comments system takes quite a lot of processing power, browsing it on a Pentium 90 would suck! Same for youtube, you wouldn't get decent video playback. Just starting firefox (or any browser with a comparable featureset of tabs, pop-up blocker, good font handling including various character sets, assorted plugins and a live spellchecker) is fairly slow on a modern system, on an older one you could click the icon and go for coffee.

My parents watch DVDs, have a TV card allowing them to watch digital OTA TV, surf the web heavily including youtube, run quite hefty programs like photo-paint on hi-res photos... none of which happened in the P90 era because the processing power wasn't there. And when TV/youtube switches to hi-def, and they get a 20 megapixel camera instead of 4, then their current system will probably require upgrading! This is not to mention the other uses which no-one does today because we also don't have the processing power.

This is not to say that spending $1000 on a machine is good value for money, because it'll cost $400 a year later. Buying it then (if you need a new PC) certainly is value for money.

Re:Why? (1)

kebes (861706) | more than 6 years ago | (#18873365)

You know, I've also had this "do we really need faster computers?" thought more than once.

Yet inevitably I eventually encounter a situation where my computer is having trouble keeping up, and I'm reminded that, yes, I would indeed like to have a faster computer, and I'd be willing to pay for it (up to some level, obviously).

These "I want more speed" situations don't come up that frequently, but they do come up. And I can think of millions of ways that having arbitrarily more computing power could be put to use. For instance, there are many operations that take a second or two on a current computer (rendering something, refreshing a complicated spreadsheet and graph, doing some data lookup, file searching, etc.). If these operations were somehow 1000X faster, then you could imagine having a slider that the user can move around, and the given operation updates effectively in real-time. That has major useability advantages.

This is but one example, of course. I understand that many people only do web-browsing and email, but I nevertheless maintain that there is a whole world of usability and features that we have not yet touched, simply because it would be too processor-intensive to implement. (I'm sure even grandma would appreciate it if, after drag-and-dropping a 2GB video file into an email, it automatically transcoded it down to a 20MB video file, and it only took 0.1 second to do so.) Then again, maybe such advances are not "worth it"--but of course the same could be said of any advance in computer software/hardware.

The point is that if your computer eventually breaks, and you need a new one, would you rather get an identical one for $800, or one that is 100X faster for $800 ?

Re:Why? (1)

Mr. Underbridge (666784) | more than 6 years ago | (#18873375)

Ten years ago I wouldn't believe I would ever ask such a question but I have been asking it recently as my retired parents are looking to buy a computer for the web, writing letters and emails. I've told them specifically "DO NOT BUY VISTA" (why on earth would anyone want that ugly memory-hog?), so I just can't think of a single reason why they need even one of the medium-spec machines.

People have been asking this question since there have been PCs. The N-1th (and usually the N-2th) generation of PCs always handles mundane applications fine. This was true 10 years ago too.

The only real difference between now and 10 years ago has more to do with the fact that Win2K was so much better than 98, and nearly as good as XP, that effectively any computer that can run Win2K is modern enough for home use. In fact, I still have a mothballed vintage 2001 computer lying around running Win2K and Office 2K quite snappily, and there isn't a damned office-related task it can't do just fine. So pretty much any machine that could have been considered state of the art in 1999 is more than enough to do the things most people need to do.

For a while at least, new generations of computers saw new advances in the types of stuff that was available for home use. Word processors, better graphics, web browsing, etc. But for home use, we're doing pretty much the same crap we were 10 years ago. The only reason not to use a machine from 1997 is because it would be Win98, and a lot of security software no longer supports it.

Re:Why? (1)

jimicus (737525) | more than 6 years ago | (#18873433)

Plenty of people make that argument, and have been doing so for years.

Yet even today I can point you at a few real business applications which could really benefit from more power. I have no doubt whatsoever that in a few years' time, they'll be OK on anything you're likely to run them on, but another troop of applications will have come along which require more power.

To: John McCain, Senator +Informative (-1, Troll)

Anonymous Coward | more than 6 years ago | (#18872295)

Your "experience" as a pilot and prisoner-of-war DOES NOT make you a military expert.

My birth in a hospital DOES NOT make me a medical doctor.

Similarly, an empty CV IS NOT a qualification to be the Walking-Talking Doll President [whitehouse.org].

Patriotically,
K. Trout

Cost of fabs... (2, Interesting)

kebes (861706) | more than 6 years ago | (#18872327)

"...Every 24 months, you're doubling the number of transistors, doubling the capacity," he said. "But if you think about the process you're going through--they're taking a wafer, they put some devices on it, they cut it up and sell it to you--the cost of doing that is not doubling every 18 to 24 months."
Is he claiming that the cost of doing lithography on wafers doesn't increase? That's crazy talk! The cost of building and running fabs is in fact also growing exponentially. According to Rock's Law [wikipedia.org], the cost of building a chip-making plant doubles every four years, and is already into the multi-billion dollar range.

In fact there's a lot of debate about whether Moore's Law will break down due to fundamental barriers in the physics, or whether we will first hit an economic wall: no bank will be willing (or able?) to fund the fantastically expensive construction of the new technologies.
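
A small C++ sketch contrasting the two exponentials: if transistor counts double every 2 years and fab cost doubles every 4 (the commonly quoted periods, assumed here rather than taken from the article), the cost per transistor keeps falling even as the absolute capital outlay per fab explodes.

    #include <cmath>
    #include <iostream>

    int main() {
        const double transistor_doubling = 2.0;  // years (Moore's Law, density)
        const double fab_cost_doubling   = 4.0;  // years (Rock's Law, capital cost)
        for (int year = 0; year <= 12; year += 4) {
            double transistors = std::pow(2.0, year / transistor_doubling);
            double fab_cost    = std::pow(2.0, year / fab_cost_doubling);
            std::cout << "year " << year
                      << ": transistors x" << transistors
                      << ", fab cost x" << fab_cost
                      << ", cost per transistor x" << fab_cost / transistors << "\n";
        }
        // Cost per transistor scales as 2^(-t/4): still falling, but the capital
        // outlay per fab grows without bound, which is the economic wall the
        // parent comment is worried about.
        return 0;
    }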

Re:Cost of fabs... (1)

rubycodez (864176) | more than 6 years ago | (#18872605)

Not to worry, the number of chips produced by a successful product line grows at a rate that more than covers the growing cost (sucks to be the ones making the Itanium, doesn't it, Intel). The 486 is still a hot seller in the embedded industry.

Here come the pedants (2, Insightful)

Dara Hazeghi (1076823) | more than 6 years ago | (#18872329)

Cue all the pedantic asshats who absolutely have to point out that Moore's Law really isn't a Law... it's an observation.

Re:Here come the pedants (1, Insightful)

Anonymous Coward | more than 6 years ago | (#18873291)

Cue all the pedantic asshats who absolutely have to point out that Moore's Law really isn't a Law... it's an observation.

And cue the moderators who mod those folks up thereby encouraging pedantic asshat behavior on Slashdot.

misunderstanding is rampant everywhere (1)

WindBourne (631190) | more than 6 years ago | (#18872333)

Two good examples are the use of goto and global warming/cooling.
  1. When the issue of goto was first looked at back in the 60's, they found that the massive problems were due to using gotos INTO blocks. The study found that judicious use of gotos made perfect sense, but only out of blocks or within the same block (a short sketch of this pattern follows below). Now we are taught that gotos are bad.
  2. Back in the 70's, a single report suggested that a global cooling COULD happen. Other scientists showed that it was not happening and most likely could not. But the media (such as the National Enquirer, which was the sensationalist equivalent of Fox News today) got it and ran with it. Now we deal with other media pointing to that single item and holding it up as proof that science is bad.
The problems occur when people who do not understand the issue, or a report about it, try to use them.
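
A minimal C-style sketch (compiled as C++) of the 'judicious' goto described in point 1: a forward jump out of nested blocks to a single cleanup label, as opposed to a jump into a block. The file name, buffer size, and 'processing' step are hypothetical.

    #include <cstdio>
    #include <cstdlib>

    // The "acceptable" goto: a forward jump out of nested blocks to one
    // cleanup point. Jumping *into* a block (skipping its setup) is the
    // pattern the old studies blamed.
    int process_file(const char *path) {
        int rc = -1;
        char *buf = nullptr;
        FILE *fp = std::fopen(path, "rb");
        if (!fp)
            goto cleanup;                       // forward, out of the block

        buf = static_cast<char *>(std::malloc(4096));
        if (!buf)
            goto cleanup;                       // same direction, same cleanup

        if (std::fread(buf, 1, 4096, fp) > 0)
            rc = 0;                             // stand-in for real processing

    cleanup:
        std::free(buf);                         // free(nullptr) is a no-op
        if (fp)
            std::fclose(fp);
        return rc;
    }

    int main() {
        return process_file("example.dat") == 0 ? EXIT_SUCCESS : EXIT_FAILURE;
    }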

Re:misunderstanding is rampant everywhere (1)

140Mandak262Jamuna (970587) | more than 6 years ago | (#18873007)

We do have GOTO now, except it is done much more nicely. What are exceptions, I ask. I am in a procedure some 25 levels deep from main() and I encounter an unrecoverable error. Back in Fortran77 I would have written GOTO 9999 to report an error and quit. Now I throw a string "What? Triangle vertex is a null pointer?" and catch it at some level and handle the error. Far superior to a hard-coded target address to jump to. Functionally similar to GOTO. I would go so far as to venture that in every case where GOTO made sense in the 1970s, we have better structures and concepts now.
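
A minimal C++ sketch of the pattern described above, with the error detected deep in the call chain and handled near the top. A std::runtime_error is thrown here instead of a bare string, and the function names are hypothetical.

    #include <iostream>
    #include <stdexcept>

    // The error surfaces deep in the call chain and is handled near the top,
    // without threading error codes (or a GOTO target) through every level.
    void check_vertex(const double *vertex) {
        if (vertex == nullptr)
            throw std::runtime_error("What? Triangle vertex is a null pointer?");
    }

    void refine_mesh()    { check_vertex(nullptr); }  // "25 levels deep" in spirit
    void build_geometry() { refine_mesh(); }
    void run_simulation() { build_geometry(); }

    int main() {
        try {
            run_simulation();
        } catch (const std::exception &e) {
            // The modern equivalent of "GOTO 9999: report an error and quit".
            std::cerr << "error: " << e.what() << "\n";
            return 1;
        }
        return 0;
    }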

TO HELL WITH IT! (0)

Anonymous Coward | more than 6 years ago | (#18872337)

So much bullshit about moore's law... On an on people go, talking about how "we can't keep up with moore's law" well WHO THE FUCK CARES ABOUT THAT?? Stop talking about it like it's some holy edict passed down from the heavens that MUST be followed or there will be DIRE CONSEQUENCES! It doesn't matter! If they'd stop with their pissing contest to see "who can best keep up with moore's law" maybe we'd have some decent asynchronous chips by now.

It wasn't even supposed to be a law! It was just a guy noting the pattern in chip density! The person worst off is moore, because he has to have this bullshit repeated with his name on it all the time.

No point to this flame. I'm just sick of hearing about it.

Performance (1)

javilon (99157) | more than 6 years ago | (#18872347)

So has chip performance doubled every 18 months?

I have tried to find out, but didn't get a clear enough answer from what is publicly available on the internet.

Moore's law is bad for performance per watt (0)

Anonymous Coward | more than 6 years ago | (#18872349)

The pursuit of higher densities is causing power consumption to become a bigger problem than ever. As semiconductor manufacturers jump from 90nm to 65nm to 45nm technologies, static (or leakage) power consumption is hugely hurting semiconductor wafer yield, and it just gets worse.

Unfortunately the performance benefits with these new technology nodes are often not enough to justify the increased power.

Add to this the fact that smaller transistors are harder to print correctly, and you see yield margins taking a huge hit.

Of course, there should be progress, but preferably without the forced "deadline" imposed by Moore's law.

It all about money. (1)

rimcrazy (146022) | more than 6 years ago | (#18872355)

Now, more than ever, it has much more to do with $$$$$$$$$ than technology. The cost of a next-generation fab is between 3-5 Billion dollars. That's Billion with a capital B. A single mask set for your new design has typically doubled at every new design-rule node. We are currently way above $1 million for a mask set, going to $2-3 million. Economics are driving next-generation technology more than anything and will continue to do so.

Re:It all about money. (1)

VWJedi (972839) | more than 6 years ago | (#18872993)

The cost of a next generation fab is between 3-5 Billion dollars. That's Billion with a capital B.

Is that more or less than billion with a small b?

Re:It all about money. (1)

maxwell demon (590494) | more than 6 years ago | (#18873535)

Well, Billion starts with Bill, as in Gates, while billion starts with bill, as in restaurant. Now guess which is more money. :-)

Moore's Law is a heuristic, nothing more (1)

spikeham (324079) | more than 6 years ago | (#18872403)

Like Murphy's Law, Moore's Law is a heuristic rule of engineering. "In general, the computing power of a commercially available CPU doubles every 18 months." Trying to define this specifically in terms of number of transistors, MIPS, processor speed, etc. is silly. The specific technological advances that drive Moore's Law are diverse, driven both by incremental improvements in existing technologies, such as shrinking die sizes in chip fabs, and occasional leaps of innovation, like multi-core CPUs. Representing them as a smoothly increasing exponential function is a massive oversimplification for the benefit of laypersons and Wall Street.

** Check out free Windows OpenGL screensavers at http://www.mounthamill.com/ [mounthamill.com] **

Moore's Law is a crappy measurement (3, Insightful)

Ant P. (974313) | more than 6 years ago | (#18872423)

...Like GHz or lines of code.

Take the Itanic for example, or the P4, or WindowsME/Vista.

Obviously (1)

bmo (77928) | more than 6 years ago | (#18872433)

If you use Windows, each new installation (or daemonic possession) in your computer negates whatever gains you may have gotten through Moore's Law.

Computers only 12 months old with a _gigabyte_(!) of RAM are not robust enough to run a full install of Vista with all the bells and whistles, for example.

--
BMO

Yeah, mod me troll, bitch. I've got more karma than you've got mod points.

Bad code hinders the PC industry. (0)

Anonymous Coward | more than 6 years ago | (#18872503)

The problem is not Moore's Law. The problem is that as processor power increases, the tools to use that processor power get more and more bloated and useless.

Windows Vista on modern hardware boots no faster than 98 did on hardware that was modern for its era. The processors are clearly way more powerful today, but all the bloated code that comes along with these processors means applications are no faster than they were 10 years ago. After all, when you are under a deadline, it's way easier to write inefficient code and make it work by the brute processing force available to you than it is to write elegant code and debug it before you get fired for missing your deadlines.

Let's not forget the epidemic of bad programmers out there too. The ones who have no idea how PCs actually work, and want to code EVERYTHING in Java because it's the only language they managed to learn part of by going to night school. This sort of lack of knowledge does tend to foster the ability to jam a square peg into a round hole in the most creative of manners, but it also creates some of the worst code imaginable.

Back in the days of DOS, you could do with a 2 MB program and 1 MB of RAM what today takes a 200 MB program and 50 MB of RAM. THIS IS NOT PROGRESS.

How about neither? (1)

lbmouse (473316) | more than 6 years ago | (#18872553)

Help me out here, where is the correlation? I feel that Moore's Law affects the computer industry about as much as Rosie O'Donnell leaving "The View" does.

Help or hinder? (0)

Anonymous Coward | more than 6 years ago | (#18872613)

That would be a resounding "Yes!"

Why we need faster computers (2, Insightful)

Animats (122034) | more than 6 years ago | (#18872623)

  • Windows. The speed of Windows halves every two years.
  • Spyware, adware, and malware. Extra CPU power is needed to run all those spam engines in the background.
  • Spam filtering. Running Bayesian filters on all the incoming mail isn't cheap.
  • Virus detection. That gets harder all the time, since signature based detection stopped working.
  • Games. Gamers expect an ever-larger number of characters on-screen, all moving well. That really uses resources.

Despite this, there have been complaints from the PC industry that Vista isn't enough of a resource hog to force people to buy new hardware.

Computers have become cheaper. I once paid $6000 for a high-end PC to run Softimage|3D. The machine after that was $2000. The machine after that was $600.

Not a Law but Observation... (1)

mario_grgic (515333) | more than 6 years ago | (#18872633)

And yes, it is severely misused by people like Ray Kurzweil KurzweilAI.net [kurzweilai.net], who extend it and use it to predict that by 2020 (that's only 13 years from now!) computers will be thinking for us. I'd be happy if computers could run Vista smoothly by then instead.

It should be called Moore's Observation (1)

davidwr (791652) | more than 6 years ago | (#18872639)

Moore's law isn't a law. It's an observation.

If transistor density suddenly started doubling every day or only every decade, no "law" would be broken.

Asking "does it help or hinder the industry" isn't so much the right question.

The right question is:
Does the expectation that past results will predict future performance hurt or help the industry?

Maybe it helps by driving research to "meet or beat" expectations, creating a self-fulfilling prophecy.

Maybe it hurts by sending research money elsewhere on the grounds that "it's going to happen anyways."

It might also hurt by driving money to research into "meeting expectations" about future transistor density when those expectations become unrealistic and the money would be better spent on other areas of research such as heat dissipation or other approaches to the "size problem" besides transistor-density.

Designers Don't Follow Any Law (1)

A440Hz (1054614) | more than 6 years ago | (#18872673)

Moore's Law is a law like the Law of Gravity. Newton's formulation of the LoG didn't cause gravity to start operating. Newton simply made a mathematical expression of observed reality.

Designers aren't sitting around making sure their gate counts conform to Moore's Law. They design the fastest, highest-performance silicon they can.

I get really PO'd at references to Moore's Law being a determiner of things as opposed to an observer.

Stupid question (0)

Anonymous Coward | more than 6 years ago | (#18872681)

Moore's law doesn't drive anything.
Moore's law doesn't affect the computer industry, it is an effect of the computer industry.

help or hinder ? (1)

l3v1 (787564) | more than 6 years ago | (#18872689)

It's just a law, for chrissakes. It doesn't help, it doesn't hinder. Under certain circumstances it holds, as every sane law would. Eventually someone will need to update it according to changes in the circumstances. Anyway, just a reminder: Moore was talking about the number of transistors, which has more or less held up to now. But the thing is, Moore didn't just invent a law out of thin air that the industry then followed [I hope you feel the stupidity in that]; he observed how the technology evolves and said it would probably keep going down this road [i.e. the repeated approximate doubling in transistor count]. He was no fortune teller, and like everyone else, he couldn't foresee technological evolution. Thus, one day, inevitably, Moore's law will be history. Until then, please dump a bag of bricks on the head of everyone who asks whether such a law hinders or helps.
 

Simple answers from an old Guru (4, Interesting)

Applekid (993327) | more than 6 years ago | (#18872709)

What do we do with the transistors? Run software, of course. Enter Wirth's Law [wikipedia.org]:

"Software is decelerating faster than hardware is accelerating."

Re:Simple answers from an old Guru (0)

Anonymous Coward | more than 6 years ago | (#18872787)

I blame OOP and Design Patterns. Hence "Anonymous Coward" to avoid getting rocks thrown at me.

Moore's law co-opted by the suits (1)

hellfire (86129) | more than 6 years ago | (#18872827)

Marketers, execs, and pundits have stolen ownership of Moore's law. What none of them understand is the computer geek culture that the "law" was spawned from. Moore's law is in the same realm as Murphy's law, which is also not a law, but fun to invoke.

However, the paid talking head pundits grab it and start talking about it and dissecting it and taking it literally. It's not a topic for geeks any more, it's not funny, and it's stupid to be discussing it in an article.

I propose a real law. A legal law. A law that states if the editors post another stupid article about Moore's law, the entire slashdot community is allowed to line up and each get to spank all of the editors one by one.

Except for CowboyNeal... he might like that... his punishment is he has to watch.

Mainly hinder, but both! (1)

ErichTheRed (39327) | more than 6 years ago | (#18873121)

Moore's Law is a huge help for technical computing. Anytime you need to crunch numbers for something, it's a good bet that you'll have more processing power in next year's hardware. This gets us closer to really important breakthroughs in science and technology.

It's a monster hindrance for mainstream computing. Having all this processing power available to you, coupled with cheap memory, means you can be as lazy as you want when you write software. I do systems integration work for a large company, and the bloated, inefficient garbage that runs fine given enough hardware is mind-boggling. I may seem a little bitter, but it seems like apps written for internal use only survive due to pumped up memory specs. I'm not saying you need to do funky tricks to squeeze a program into 4K anymore, but at least optimize code so you're not doing crazy things like iterating through each row in a database table, etc.

And it has a bad effect on the CPI... (1)

dpbsmith (263124) | more than 6 years ago | (#18873349)

Another subtle problem is that it causes inflation to be underestimated, because the Bureau of Labor Statistics figures that a computer that is five times as "powerful" is worth more, even if you personally are not doing significantly more with it than with the computer you bought five years ago.
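
An illustrative (entirely made-up) example of the quality-adjustment effect being described, sketched in C++: if the sticker price stays the same while measured "power" quintuples, a hedonic index records a price decline even though the buyer's outlay does not change.

    #include <iostream>

    int main() {
        // Illustrative numbers only. Same $800 sticker price, judged 5x as
        // "powerful": a quality-adjusted index treats the price per unit of
        // computing as having fallen to $160 -- recorded as deflation even
        // though the buyer spends exactly what they spent before.
        const double sticker_price      = 800.0;
        const double quality_multiplier = 5.0;
        const double quality_adjusted   = sticker_price / quality_multiplier;
        std::cout << "Sticker price: $" << sticker_price
                  << ", quality-adjusted price: $" << quality_adjusted << "\n";
        return 0;
    }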

Hurts - big time (1)

bradbury (33372) | more than 6 years ago | (#18873367)

It contributes to the production and distribution of really bad code. Firefox, with tens of millions of copies, is a case in point. (Oh yes, they *claim* that with version 3 they are going to consider performance.) I'm still waiting for Firefox to run in the same memory footprint and as fast as Netscape 4.72 did. (Firefox will not start with less than ~55MB of memory under Linux.)

When excessive amounts of memory and processor speed allow you to release software which is, by any stretch of the imagination, "bloatware" (could you do the same job with significantly less memory and processor utilization? I strongly suspect so...), then the hardware capabilities are facilitating really "dumb" development processes.

It's like putting an AK-47 (with 300+ rounds) in the hands of people who are hardly qualified to operate pocket knives.

As Forrest was prone to observe, "Stupid is as stupid does." Abusing the CPU or memory capacity at ones disposal is not something I would want tacked onto my resume.

Wrong Question (1)

monopole (44023) | more than 6 years ago | (#18873495)

Increased transistor count can be used for higher volumes, greater performance, or bigger cache.

Up until a few years ago, more performance and memory resulted in a distinct return on investment. Right now, most machines are "good enough" for present apps. I predict a shift to system on a chip designs driving small reasonably powerful systems like the OLPC.

The problem is the industry adapting to this new model.