
End of Moore's Law in 10-15 years?

CmdrTaco posted more than 7 years ago | from the no-for-real-this-time dept.

Intel | 248 comments

javipas writes "In 1965 Gordon Moore — Intel's co-founder — predicted that the number of transistors on integrated circuits would double every two years. Moore's Law has been with us for over 40 years, but it seems that the limits of microelectronics are now not that far from us. Moore has predicted the end of his own law in 10 to 15 years, but he predicted that end before, and failed."


it's the law (5, Funny)

User 956 (568564) | more than 7 years ago | (#20667659)

Moore has predicted the end of his own law in 10 to 15 years, but he predicted that end before, and failed.

So then it seems with regards to his Law, Moore has fallen prey to Murphy.

Re:it's the law (5, Funny)

click2005 (921437) | more than 7 years ago | (#20667713)

What about the inverse of Moore's Law.. Every 2 years, the average IQ of all users on the internet halves.

Re:it's the law (1)

Mr. Sketch (111112) | more than 7 years ago | (#20667811)

Every 2 years, the average IQ of all users on the internet halves.
So true. I believe we have now coined the click2005 law.

Re:it's the law (0, Funny)

Anonymous Coward | more than 7 years ago | (#20668057)

I believe we have now coined the click2005 law

"we"? "we" didn't do shit, n00b.

Python (1, Offtopic)

goombah99 (560566) | more than 7 years ago | (#20668449)

In ten years, according to Moore's law, Python will be 32 times faster than it is now. Right now it's about 1000x slower than tuned C and 100x slower than unoptimized C code. So in ten years Python will still be slower than C running on today's computers. (Meanwhile, C will also be 32 times faster.)
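That back-of-the-envelope arithmetic can be sketched in Python (taking the poster's 2-year doubling and rough 100x slowdown figure as given, not as measured benchmarks):

```python
# Doublings over 10 years at one doubling per 2 years
years = 10
doubling_period = 2
speedup = 2 ** (years / doubling_period)  # 2^5 = 32

# Rough slowdown factor cited in the comment (an assumption, not a benchmark)
python_vs_plain_c = 100

# Python-in-10-years vs. unoptimized C on today's hardware
relative = speedup / python_vs_plain_c
print(relative)  # 0.32 -- still ~3x slower than today's unoptimized C
```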

That is not a rag on Python. Well, not a big one. Indeed, lots of things (word processors) don't need to be faster, so being 32x faster would enable Python to take over development in lots of areas we now use C for.

No, the point here is that if Python can't even see a future where it's faster than today's other languages before it becomes obsolete, it needs to go on a vision quest. Other than taking over distasteful roles that C refuses to do anymore, what's its point in life?

I think, like the thinking-in-python dude's rant said, Python needs to ask itself what high-level languages could be really good at that low-level languages will perpetually suck at. And that is multi-processing. Thread safety is possible in any language, but if you actually have to think about it while you are programming, then you have a problem. Too hard. If you modified C to be intrinsically thread safe, it would slow down dramatically. But if you modify an already-slow language for this, it's not going to make a big difference in speed. Thus, proportionally, high-level languages are poised to gain the most from multi-processing.

And Moore's law is going to vector into multi-processing in the future as a way to sustain itself.

Python should reinvent itself to be the multi-processing language.

Otherwise things like Fortress, which everyone scoffs at these days, are going to go to Charles Atlas school and be kicking sand in all your faces. (Fortress is written from the ground up to assume multi-processing by default: e.g., for-loops can always execute in any order, and the local variables are thread safe.)

Re:Python (1)

Anonymous Coward | more than 7 years ago | (#20668543)

man, why is it that crazy people always feel the need to just fucking ramble on for paragraph after paragraph?

Mod up (1)

Anonymous Coward | more than 7 years ago | (#20668615)

Post is not off topic.
Gains in CPU speed affect the utility of scripting languages more than compiled languages. Parent makes this connection in a humorous way.

Re:Python (1)

Surt (22457) | more than 7 years ago | (#20668657)

You're assuming that in 10 years they'll have made no optimizations to python? I mean, maybe, I don't much like that language and think it is probably a dead end, but even the currently less popular ruby is getting more than twice as fast at basically everything in the next version. Surely python programmers can get it at least 50% faster?

Re:Python (1)

goombah99 (560566) | more than 7 years ago | (#20668765)

I exaggerate slightly for the sake of being funny. But not much. Python is trying to optimize itself by adding more and more features and libraries for very special cases, allowing speed for those cases. I think that has limited potential. And worse, I fear that adding features which will need to be supported in future versions may slowly foreclose the right direction, which is to 1) permit implicit and explicit typing so that it can become a compiled language, and 2) create ways to do multi-processing.
So yes, Python can get faster, but not by much, I expect.

Personally I'm eyeballing Groovy right now, because it is a scripting language that mirrors a compiled, typed language that already has some degree of thread safety and behind-the-scenes memory management (GC) at a low level: Java. But I'm still using Python (and Perl!)

Re:it's the law (2, Insightful)

tomstdenis (446163) | more than 7 years ago | (#20668457)

But the IQ is the average ... so it can't halve. :-)

It's a law of economics (3, Insightful)

goombah99 (560566) | more than 7 years ago | (#20667843)

Moore's law is not about physics; it's about economics. Basically, the entire industry has built an economic engine that requires that growth pattern to sustain itself.

To put it another way, growth needs to be geometric, not additive. That is, things need to grow at x% per year, which leads to a constant doubling time. If they grew linearly (x += D), then as x grew, the proportional growth rate (1/x dx/dt) would shrink with time--or the doubling period would get longer and longer. Eventually it takes a lifetime before your computer is 2x more capable. Then it takes 2 lifetimes.
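The geometric-vs-additive point can be made concrete with a tiny sketch (the starting value and yearly increment are arbitrary):

```python
# Geometric growth keeps a constant doubling time; additive growth does not.
def doubling_time_additive(x, d):
    """Years for x to reach 2x when x grows by a fixed d per year."""
    return x / d

x, d = 100, 10  # hypothetical starting capability and fixed yearly increment
for _ in range(3):
    print(f"at x={x}, the next doubling takes {doubling_time_additive(x, d):.0f} years")
    x *= 2
# Each successive doubling takes twice as long -- exactly the stagnation
# the comment describes. Growth at a fixed x% per year instead gives a
# constant doubling time of log(2)/log(1 + x/100) years.
```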

Why would you ever upgrade at that point, except due to wear and tear? Things become commodities, and sales are based on price and other added value. So long to Intel's industry-domination model.

Moore's law is a limit, too. Namely, that very same growth engine will not invest twice as many research dollars to get a slightly faster doubling time. The fact that the rate has held steady tells you that this is so. Empirically, this growth rate is the sweet spot between creating innovation at the lowest cost and reaping a profit on it.

Indeed, the only surprising thing we've seen in the consumer market that seemed (superficially) to violate this was Apple's replacement of the iPod mini with the iPod nano shortly after its introduction. They could easily have milked it for longer. But there the driver was the competition they needed to stay ahead of.

Re:It's a law of economics (2, Insightful)

jcr (53032) | more than 7 years ago | (#20667883)

Moore's law is not about physics it's about economics.

Exactly. Whenever one process technology reaches its physical limits, we get a new one, because the new process makes money. X-ray lithography, chip stacking, 3D circuits, and eventually nanotech will all keep us on the Moore's law path probably for the rest of my life, at least.


Re:It's a law of economics (1)

polar red (215081) | more than 7 years ago | (#20668049)

so you're 75? 80?

Re:It's a law of economics (1)

jcr (53032) | more than 7 years ago | (#20668215)

No, I just feel that way once in a while. ;-)


Avast! (2)

ackthpt (218170) | more than 7 years ago | (#20668495)

Exactly. Whenever one process technology reaches its physical limits, we get a new one, because the new process makes money. X-ray lithography, chip stacking, 3D circuits, and eventually nanotech will all keep us on the Moore's law path probably for the rest of my life, at least.

Ye be forgettin' one thing, matey, they be makin' multiple cores now. Eventually we be lookin at distributed computing on an individual platform. Ye may be layin' claim to Moore's law applyin', but it be tenuous a claim at best. The paradigm be shiftin' away from the domain of Moore.

Re:It's a law of economics (2, Interesting)

Jeff DeMaagd (2015) | more than 7 years ago | (#20668051)

You are right, but that's also because the fabs get more expensive on each generation, I think each feature size shrink requires a fab that costs 50% more than the previous fab.

Re:It's a law of economics (1)

goombah99 (560566) | more than 7 years ago | (#20668181)

You are right, but that's also because the fabs get more expensive on each generation, I think each feature size shrink requires a fab that costs 50% more than the previous fab.
See my post below about the corollary for more discussion. But right now your point does not hold, simply because the size of the market is increasing and revenues are increasing too. Therefore 1) the cost per CPU cycle and the cost per unit of computation are falling despite the increasing cost of fabs, and 2) the cost of the fabs as a fraction of (growing) revenue is not yet increasing (I believe).

Corollary to Moore's law (4, Interesting)

goombah99 (560566) | more than 7 years ago | (#20668103)

If you accept the statement I just made about Moore's law being sustained by economics, then here's a corollary which makes an observable prediction.
Moore's law stays fixed because the industry invests enough research dollars--and not one dollar more--to keep it at that rate. Their entire economic model is built on this.

Therefore, if we ever do reach a point where we are simply running out of available physics and computer science (multiprocessing), then the first sign of this will be an increasing fraction of research dollars spent to sustain Moore's law.

Plot the industry's margin, smooth the curve, and you will be able to extrapolate to the point where the research dollars cross the profit line. Somewhere shortly before that is when Moore's law will end.
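A sketch of that extrapolation, with entirely hypothetical numbers for the R&D fraction and the profit line (the real inputs would come from industry financials):

```python
# Hypothetical series: R&D spend as a fraction of revenue (made-up numbers,
# purely to illustrate the extrapolation the comment proposes).
years = [2000, 2002, 2004, 2006]
rd_fraction = [0.12, 0.14, 0.17, 0.21]
profit_line = 0.40  # hypothetical fraction at which R&D eats the whole margin

# Ordinary least-squares line fit, no external libraries
n = len(years)
mx = sum(years) / n
my = sum(rd_fraction) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(years, rd_fraction))
         / sum((x - mx) ** 2 for x in years))
intercept = my - slope * mx

# Extrapolate to where the fitted line crosses the profit line
crossing_year = (profit_line - intercept) / slope
print(f"R&D fraction crosses the profit line around {crossing_year:.0f}")
```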

The only way that would not be true is if the nature of innovation changes from frequent small leaps to massive leaps spaced far apart.

Gordon Moore (1)

morgan_greywolf (835522) | more than 7 years ago | (#20667675)

Is Gordon Moore crying wolf again?

Actually, I don't think it will matter. In 10-15 years, the emphasis will shift away from traditional binary computing and towards quantum computing anyway, making Moore's law sorta moot.

Re:Gordon Moore (2, Insightful)

sexybomber (740588) | more than 7 years ago | (#20667757)

Actually, I don't think it will matter. In 10-15 years, the emphasis will shift away from traditional binary computing and towards quantum computing anyway, making Moore's law sorta moot.

And if quantum computing should herald The Singularity, then it's definitely moot, since no predictions (Moore's Law included) can be made about post-Singularity computing.

Re:Gordon Moore (2, Insightful)

plover (150551) | more than 7 years ago | (#20667767)

I don't think quantum computing will be the future for general-purpose computing, and certainly not in 10 years. I think you're nearly right in that the future will lie in parallel computing -- increasing the number of CPUs will be the path to higher throughputs (which coincidentally aligns nicely with Intel's goal: sell more CPUs.)

Either way, when Gordon Moore eventually dies he will still be overflowing the (long)money; variable.

Re:Gordon Moore (2, Insightful)

oliverthered (187439) | more than 7 years ago | (#20667773)

quantum computers aren't really general purpose machines and wouldn't be able to replace traditional CPUs for a lot of tasks.

Re:Gordon Moore (4, Interesting)

kebes (861706) | more than 7 years ago | (#20668069)

A realistic design for a quantum computer would probably have a classical CPU that does most of the work, with a quantum co-processor. Traditional things, like running the OS and dealing with hardware I/O, would probably still be classical. The quantum co-processor would be assigned computations by the CPU that can be accomplished much faster than on the classical CPU.

This abstraction would mean that most software wouldn't have to be written with any understanding of quantum computing: libraries and compilers would be designed to use CPU calls that launch the quantum co-processor, if available.

For many operations, the quantum CPU would not be needed. But for certain tasks, it would provide orders-of-magnitude speed boosts. If quantum co-processors became commonplace, we would see improvements in all kinds of parallel-processing tasks (matrix operations, simulations, graphics, maybe even search?).

Re:Gordon Moore (1)

SeekerDarksteel (896422) | more than 7 years ago | (#20668795)

I agree with what you're saying in general, but I think one thing requires clarification.

Quantum computing will not provide some sort of magic bullet for parallel calculations. Yes, you can do a lot of calculations in parallel, but you can't get the answers out, because the qubit(s) holding the superposition of all possible outputs decohere into only one possible output when you measure them. Quantum algorithms that actually do anything interesting are very narrowly focused and rely on complicated things like the quantum Fourier transform, or on manipulating the qubit(s) so that they're more likely to be measured as the answer you want.

Re:Gordon Moore (3, Insightful)

Anonymous Coward | more than 7 years ago | (#20667881)

One problem in these discussions is that different people use different definitions of "Moore's Law." Strictly the law is an observation about the increasing density of transistors (i.e. decreasing size of each transistor). However, as we all know, many people simply use the term "Moore's Law" loosely, referring to all exponential increases in computing power.

There is no doubt that we will reach a hard physical barrier beyond which we cannot shrink individual transistors any longer. This limit will be reached in a decade or two, and is probably what Moore is referring to: at our current scaling, we will hit atomic limits rather soon.

But that doesn't mean the exponential increases in computing power will end. There are many other things that may happen, such as figuring out ways to build microprocessors with transistors stacked in 3D (rather than a single 2D layer of transistors), which would increase the transistor count in our computers by orders of magnitude. Chip designs, layouts, and algorithms are other areas that may see improvement. Specialized and dynamically re-programmable chips may also provide further advances. Or perhaps, as you pointed out, quantum computers will become viable and mainstream.

There is no guarantee that these more exotic technologies will work out. Yet the microelectronics industry has surprised us time and again with its ability to overcome huge technical obstacles. Thus it seems at least possible that it will deliver technology that is very much up against the physical limits of what is achievable. And with regard to those physical limits, the hard wall that Moore is predicting in 10 years is only one aspect. There are many other ways for this technology to advance.

OT: Thin processors (0)

Anonymous Coward | more than 7 years ago | (#20668463)

My favorite MacGyver episode is where he visited a college. One student shouts, "I got my processor design down to 4 atoms thick!"

What made it so funny is that they made it clear that these were undergraduates. Don't forget that this was in the eighties, too.

Re:Gordon Moore (3, Interesting)

Daniel_Staal (609844) | more than 7 years ago | (#20668491)

Somewhere, once upon a time, I saw an article that took the opposite approach: they worked out what the absolute maximum transistor density was, and worked backward from that to when Moore's Law had to end. They figured one transistor per Planck unit, in a spherical computer. (Where the clock speed is proportional to the size of the sphere, governed by the speed of light.)

IIRC, it ended up something like 150 years in the future.

Re:Gordon Moore (1)

Daniel_Staal (609844) | more than 7 years ago | (#20668577)

Sorry, correction: The clock speed is inversely proportional to the size of the computer. (Smaller computer, faster cycles.)

Re:Gordon Moore (1)

Surt (22457) | more than 7 years ago | (#20668741)

There's hope for transistor density after atomic-level transistors, actually, because density is a measure of transistors per unit volume, not per unit area. We currently talk about density in terms of transistors per unit of chip surface area; we really need to be thinking about the density of transistors you can put in the volume currently occupied by a CPU plus heatsink.

There's probably another 30 doublings that could plausibly be accomplished before the true 3 dimensional density is no longer realistically improvable.

Re:Gordon Moore (2, Funny)

Velveeta_512 (1142553) | more than 7 years ago | (#20667953)

I was thinking the same thing, or if quantum hasn't been quite refined as a useful science yet, at the very least, nanocomputing should be advanced enough to the point that it can take the reins from microcomputing... That's what we like to do in this industry anyway, just change out prefixes when the technology hits a certain milestone... If you think the nano-itx motherboards are small, wait until you see the super-mega-ultra-nano-itx in 10 years, all components will be on separate cards, not quite risers, because they'll be connected via fiber-optic links directly into the bus a few microns away from the tri-tri-core CPU, the AMD Zelda-core 64000+ processor... And thanks to the miracle of nanocomputing's molecular-level manipulation, food replicators will be a USB device available from Newegg... However, thanks to the miracles of the evolving software industry to meet the ever-increasing needs of users, there won't be drivers available for the new food replicators on Linux, it takes Windows 2020 over an hour to produce 1 hardboiled egg that you could have made yourself in 5-10 minutes, because it's thrashing your solid state drive (because really people, 2 terabytes of RAM is the OS's recommended *minimum*), and it tastes more like a peanut butter sandwich because you don't have the latest Microsoft version of the driver for Windows 2020, because of course it makes the hardware incompatible with Windows 2020... Mac users will complain that they've had their iReplicator for 6 months already and stood in line for 12 hours to get one, and Jobs is slashing the price in half in anticipation of the release of the 2nd-generation iReplicator...

Re:Gordon Moore (1)

drmerope (771119) | more than 7 years ago | (#20668197)

I don't think it will happen this way. First, GP computing is just a small slice of the VLSI application space. It isn't obvious that all the other problems in the world will suddenly become quantum computing problems. I see two effects: first, for cost reasons, performance gains will not appear across all sections of the GP market -- e.g., home systems will begin to fall further behind top-end computing. Second, Moore's law will slow down before it "ends" at a physical limit. So if it predicts we have five more generations to go, those may smear out over 20 years, not 10.

Also: there are many other problems in silicon manufacturing not related to feature size (the primary subject of Moore's law). I.e., there are ways we could build faster (but not denser) devices than we do now. Thus, the density aspect may stall and emphasis could return to performance -- e.g., using photonic interconnects for long-distance signaling, lower Vt with less leakage, etc. The emphasis could also shift to cost. If the per-design mask costs go down, then these really advanced process technologies will be available to more low-volume, application-specific situations. That could potentially be a very big deal.

Law? (3, Insightful)

haystor (102186) | more than 7 years ago | (#20667687)

Can we stop calling a prediction a law?

Re:Law? (1)

UbuntuDupe (970646) | more than 7 years ago | (#20667729)

When a prediction is confirmed over a long enough time and with enough consistency, that makes it a law.

Although at this point, I think it only qualifies as somewhere between "hypothesis" and "theory" -- but I'm no expert.

Re:Law? (1)

Blinocac (169086) | more than 7 years ago | (#20667817)

It doesn't really make it a law, but that is what normally happens. The law of gravity, just a theory. The laws of physics, theories really.

A law is an observation (3, Informative)

benhocking (724439) | more than 7 years ago | (#20667947)

To simplify things a bit, a law is an observation, whereas a theory is an explanation. They are not the same thing, but you can have laws and theories dealing with the same subject matter.

Re:Law? (1)

mollymoo (202721) | more than 7 years ago | (#20668551)

I suggest you look up what "law" and "theory" mean in the context of science.

Re:Law? (1)

IndustrialComplex (975015) | more than 7 years ago | (#20667841)

Can it even be a theory? There are far too many variables dependent on human decisions to treat this as some natural law.

It seems more like "Moore's surprisingly accurate prediction and continued observation" than Moore's law, but until I can figure out a catchy acronym I think we are stuck with Moore's law.

Re:Law? (1)

Larry Lightbulb (781175) | more than 7 years ago | (#20667989)

"Moore's SAPACO" perhaps?

Re:Law? (0, Redundant)

Cutie Pi (588366) | more than 7 years ago | (#20667801)

Can we stop being so f'ing pedantic every time a story mentions Moore's Law?

Re:Law? (4, Funny)

plover (150551) | more than 7 years ago | (#20667975)

Can we stop being so f'ing pedantic every time a story mentions Moore's Law?

This is slashdot, so ...


Re:Law? (1)

nomadic (141991) | more than 7 years ago | (#20668257)

Can we stop being so f'ing pedantic every time a story mentions Moore's Law?

No, because that would be breaking Slashdot's Law.

Re: Law? (0)

Anonymous Coward | more than 7 years ago | (#20668065)

Not unless you want to throw out Boyle's Law, Charles' Law, the Law of Universal Gravitation, etc. Sorry, but there's a name for predictions supported by significant evidence but without positing a specific explanation. []

Then it is.. (0)

Anonymous Coward | more than 7 years ago | (#20668311)

.. the "prediction of gravity"?

Re:Law? (1)

Surt (22457) | more than 7 years ago | (#20668775)

I'm sorry, but "law" is completely correct, and calling it Moore's law complies with definition (1a!) from Webster's: []
1 a (1) : a binding custom or practice of a community

Not to worry... (5, Funny)

Marc Desrochers (606563) | more than 7 years ago | (#20667689)

It will be just in time for the arrival of cold fusion.

Moore's Second Law (4, Funny)

MyLongNickName (822545) | more than 7 years ago | (#20667691)

Moore's second law: "Moore's first law will only work for 10-15 more years".
Moore's third law: "Moore's second law applies from the time it is quoted not from when it was originally uttered".

Re:Moore's Second Law (1)

irtza (893217) | more than 7 years ago | (#20667899)

Wait, I thought the third law was "Moore's second law takes effect 10-15 years before Moore's first law no longer holds true". I hope I get this straight before the test.... there is going to be a test, right?

Re:Moore's Second Law (1)

rumblin'rabbit (711865) | more than 7 years ago | (#20668455)

I thought the second law was:

The number of experts predicting the end of the first law doubles every 18 months.

Things never change (0)

Anonymous Coward | more than 7 years ago | (#20667715)

It's funny how we hear the same predictions over and over: Moore's law done in 10 years, fusion power in 50 years, Iraq pull out in 18 months, hard AI in 10 years. The date never gets any closer.

Re:Things never change (1)

pipatron (966506) | more than 7 years ago | (#20667765)

Don't forget a much improved battery technology and flexible, outdoor-readable LCD in 3-5 years.

Moore's Law (1)

Poromenos1 (830658) | more than 7 years ago | (#20667725)

The complexity for minimum component costs has increased at a rate of roughly a factor of two per year ... Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer.

That's from Wikipedia. He actually said the count would double (i.e., per-component cost would halve) every year, not every two years.
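The 65,000 figure follows directly from yearly doubling; assuming a chip held roughly 64 components in 1965 (an assumption for illustration, not stated in the quote):

```python
# Moore's 1965 projection: components per chip double every year.
# Assuming roughly 64 components in 1965 (2^6; hypothetical baseline):
components_1965 = 64
projection_1975 = components_1965 * 2 ** (1975 - 1965)
print(projection_1975)  # 65536 -- i.e., the ~65,000 figure in the quote
```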

10 years ... (1)

jbeaupre (752124) | more than 7 years ago | (#20668789)

Pretty much what he is saying now. So a corollary might be that when Moore stops predicting, Moore's law only has 10 years left to run. Which means we'd all better hope he doesn't die any time soon.

Again? (4, Interesting)

dylan_- (1661) | more than 7 years ago | (#20667731)

There are always a few of these [] .

I do recall someone telling me that no CPU would ever run at more than 2GHz, as it would then start emitting microwave radiation...

Re:Again? (1)

JoelKatz (46478) | more than 7 years ago | (#20667923)

When I was in college, I learned all the reasons the features on a CPU couldn't be significantly less than 1 micron in size. I also learned that 20Kbps was about the theoretical upper limit for modems.

Re:Again? (3, Interesting)

krray (605395) | more than 7 years ago | (#20668135)

I do recall someone telling me that no CPU would ever run at more than 2GHz, as it would then start emitting microwave radiation...

I remember having/making a similar claim myself way back when -- the 486/33 and 486/66 were the hot systems of the day. I predicted they'd have a hard time getting above ~80 MHz because of FM radio interference/shielding problems. Boy, was I wrong... :*)

Today I predict "Moore's Law" will hold pretty true -- even in 10 or 15 years. IBM has been playing with using atoms as the gate/switch, which will make today's CPUs look like Model T's.

In the 90's they had []
Not too long ago they've done []
And recently it has been []

This will be a boon both for storage and for the chips themselves, IMHO (not to mention my stock :).

Re:Again? (1)

Dolda2000 (759023) | more than 7 years ago | (#20668487)

I do recall someone telling me that no CPU would ever run at more than 2GHz, as it would then start emitting microwave radiation...
Thinking of it, it would make sense. Really, how come they don't? Are they just sufficiently shielded?

Noore's Law (0)

Anonymous Coward | more than 7 years ago | (#20667735)

The number of people predicting the end of Moore's law will double every two years.

And I predict that in 10-15 years time.... (2, Insightful)

Anonymous Coward | more than 7 years ago | (#20667753)

We will start using a new technology without transistors, but which will exhibit a similar exponential gain so long as there is money to be made from it.

In fact, now I come to think of it, ALL human endeavour exhibits exponential growth so long as there is money to be made from it. Technology is just one field where it's true. Sex (Malthus), agriculture, you name it - humans do it exponentially!

I think I'll put that on my t-shirt, and call it 'Anonymous Coward's Law'.


Re:And I predict that in 10-15 years time.... (1)

Colin Smith (2679) | more than 7 years ago | (#20667937)

humans do it exponentially
Life does it exponentially... till it runs out of resources.


Moore did not come up with exponential growth (0)

Anonymous Coward | more than 7 years ago | (#20668025)

The parent poster seems to imply that warios exponential growth laws (Malthus, compound interest, etc.) are particular cases of Moore's law. WRONG! Exponential growth laws have been known for centuries, from the Middle Ages and even earlier. The law of compound interest was well known to Italian merchants in the Middle Ages (remember Fibonacci!). Malthus did not write any equations; the exponential law for population growth was introduced by Euler half a century before Malthus. Moore's law is no big deal, just a late example of exponential growth.

Re:Moore did not come up with exponential growth (0)

Anonymous Coward | more than 7 years ago | (#20668497)

I like to attribute the growth to Mario, but I guess Wario played a role too.

Moore's law isn't really a law (3, Insightful)

vlad_petric (94134) | more than 7 years ago | (#20667805)

... there's nothing fundamental about it. Instead, it's a self-fulfilling prophecy. The big players in the silicon world all use the "law" and its corollaries as their business plan. They'll likely discard a feature/product if it falls behind the curve in terms of speed. For the layperson, this "precision" may indeed create the appearance of an actual law, even though it's just an observation (similar to Malthus' "law")

Re:Moore's law isn't really a law (0)

Anonymous Coward | more than 7 years ago | (#20667931)

If I ate a nickel every time some Slashbot piped up about Moore's Law not really being a law, I'd be shitting $billions.

Re:Moore's law isn't really a law (1)

aadvancedGIR (959466) | more than 7 years ago | (#20668153)

- As someone already posted, a law IS an observation (law of gravitation: objects attract each other depending on their masses and their distance; theory of gravitation: something about gravitons, or something else -- no one is really sure what it is).
- Moore's law is just a particular case of a learning curve (any industry tends to regularly improve its production techniques as long as the improvement has a positive ROI to justify investing the money; whether the result is cheaper or better products is a choice left to marketing, not science).

I wonder (0, Redundant)

packetmon (977047) | more than 7 years ago | (#20667809)

Has Moore ever heard of Murphy? []

Since When... (1)

His Shadow (689816) | more than 7 years ago | (#20667827)

So many things in the tech world should die (PS/2, MBR), and yet we perennially have articles about a law that never was "dying". Cutting-edge journalism, all right.

10 years ... (1)

polar red (215081) | more than 7 years ago | (#20667839)

that means 2^5 ... that's 32 times more computing power on a single chip ... that's about 16 times the computing power I need.

Re:10 years ... (2, Funny)

mce (509) | more than 7 years ago | (#20668077)

Nobody will ever need more than twice the computing power he has available at the moment he makes this assessment.

Re:10 years ... (1, Funny)

Anonymous Coward | more than 7 years ago | (#20668099)

Why do you need twice the computing power but not more? Your current system is too slow for HiDef porn playback, but once that's fixed you have no other need?

Re:10 years ... (1)

kebes (861706) | more than 7 years ago | (#20668231)

Well, it's about 1/1000 of what I would "need."

Of course, by "need" I mean what I would like to have. People keep talking about computers being "fast enough," but every time I have to wait for something to finish (whether it's a Photoshop filter, compiling code, running an optimization, or waiting for a Slashdot page with hundreds of comments to load), that's a moment where I could have used a faster computer.

If my computer were 1000 times faster, then things that currently take minutes would finish "instantly." It's not just a matter of saving seconds here and there: having complex operations being effectively instantaneous changes one's workflow. (E.g. instead of trying different filter settings in Photoshop iteratively, it would be nice to have a live preview, even for the computationally-intensive filters; or, it would be nice to have complex multi-dimensional data-fitting finish so fast that you can try different models in realtime, rather than waiting hours for it to finish...)

All I'm saying is that there is indeed a use to having more and more computing power.

Best part about predicting own failure (3, Funny)

gurps_npc (621217) | more than 7 years ago | (#20667879)

Is that even if you are wrong, you are still right.

Wow, that Moore guy was so smart he outsmarted Moore.

Hafnium Breakthrough (5, Funny)

smitty97 (995791) | more than 7 years ago | (#20667891)

from tfa:

"We're not far from that," Moore said on Monday. "Before we had our Hafnium breakthrough, we were down to the point where we were at five molecular layers in the gate structure of the transistors. When you clearly can't go below that, you get into other types of problems," Moore said.
The law will continue; they just need breakthroughs with Quarternium, Eighthnium, etc.

And next year ... (4, Funny)

mshmgi (710435) | more than 7 years ago | (#20667893)

Next year, they'll tell us that Moore's Law will end in 5-7.5 years.

Re:And next year ... (1)

ttapper04 (955370) | more than 7 years ago | (#20668143)

I thought your comment was very funny. I would mod it if I had the points.

CPU speed already on the wane as consumer bait (4, Insightful)

dave_mcmillen (250780) | more than 7 years ago | (#20667903)

I have no idea if Moore's Law will really start to "fail" on a particular time scale (one of these times it's gotta be true, right?), but a related issue I find interesting is that CPU speeds no longer seem to be touted so heavily to computer buyers. Walk into a big electronics store and look at the desktop offerings: where they used to prominently feature how many GHz they had inside (and people vaguely felt that more of these mysterious GHz was better), the CPUs now carry code names and numbers that don't reflect CPU speed: check out this nifty X2, or the Turion 64, or ...

The new hook for consumers is the number of "cores", and once again most people have probably picked up the vague sense that having more of them inside means the computer is better. I've been told by people who might be in a position to know that it's not that they can't keep cranking up CPU speeds, but that the cost/benefit (profit-wise) stops making sense at some point because of the huge cost of implementing a new fab at a finer length scale, and we're pretty much at that point. So it makes sense that cores are the new GHz, and Moore's Law will have less and less direct impact on the end computer buyer from now on.

Maybe there's a Core Law to be formulated about how often the average number of processors per computer can be expected to double?

Re:CPU speed already on the wane as consumer bait (2, Interesting)

Cutie Pi (588366) | more than 7 years ago | (#20668339)

As the parent implied, Moore's Law will likely not end because of technological constraints but rather economic ones.

We reached a wall a few years ago in terms of transistor speed, mostly due to the thin gate oxides giving rise to significant leakage current, which translates into heat. The upcoming high-k metal gate technology mitigates but doesn't eliminate this problem. Thus, Intel and the like are putting those smaller transistors to work in redundant cores rather than faster, monolithic circuits. However, Moore's Law is still marching on, from 65nm, to very soon 45nm, and then 32nm and 22nm. Each technology node effectively halves the area requirement, or (more realistically) doubles the number of transistors that can fit in the same area. This translates into lower cost per transistor. 32nm technology is in the final stages of development, and 22nm is believed to be possible, although much more difficult. The real limit right now is the optical lithography process used to pattern the circuits. There is no high-volume solution available for the 16nm node. We can certainly make patterns this small (using electron beam lithography, for example), but it would be prohibitively expensive (each chip would take many hours to expose). There are alternative techniques such as SFIL, but the industry isn't really putting money into them. My belief is that the 2-year cycle we're on will end after 22nm, or possibly 16nm. Circuits will probably continue to get smaller, but at a slower pace, as development and technology costs become prohibitive. Even now, there is debate in the industry about whether to delay 22nm and instead do a 28nm "half node". If this were to occur across the entire industry, the 2-year Moore's Law as we know it would end.

BTW, "45nm" and "32nm" don't directly refer to the size of the transistors, but to the half-pitch being printed with lithography, with each node at roughly 70% of the previous one. Thus, the 45nm node has a 65nm half-pitch, which means the wires are spaced at 130nm. This spacing decreases to 90nm and 65nm for the 32 and 22nm nodes, respectively. The actual transistor size (channel length) can be much smaller than the current technology node designation.
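The 70%-per-node scaling the parent describes can be sketched numerically. The 65nm starting half-pitch and node labels are the comment's figures; this is just illustrative arithmetic, not process data:

```python
# Each lithography node shrinks the half-pitch to ~70% of the previous one,
# which roughly halves the area per transistor (0.7^2 ~= 0.49).
half_pitch = 65.0  # nm, the half-pitch quoted for the 45nm node
for node in ("45nm", "32nm", "22nm"):
    print(f"{node} node: half-pitch ~{half_pitch:.0f}nm, "
          f"wire spacing ~{2 * half_pitch:.0f}nm")
    half_pitch *= 0.7  # ~70% linear shrink per node
```

The 0.49 area factor is exactly why a ~0.7 linear factor per node keeps transistor count on a doubling cadence.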

NO NO NO NO NO!!!!! (2, Insightful)

Colin Smith (2679) | more than 7 years ago | (#20667905)

Moore's law will continue until THE SINGULARITY takes US ALL!!!!!!

Or at least, that's what the singularity nuts claim. Sorry people, there are limits on this planet.

Ob Simpsons quote (2, Funny)

Maximum Prophet (716608) | more than 7 years ago | (#20668121)

Homer: Lisa, in this house, we obey the laws of thermodynamics!!!

Details at 11 (0)

IorDMUX (870522) | more than 7 years ago | (#20667955)

In other breaking news, the Department of Homeland Security reminds everyone that the terrorism threat advisory level has been raised to 'orange'.

Moore's Law will expire (1)

CruddyBuddy (918901) | more than 7 years ago | (#20667977)

The end of Moore's Law is 10-15 years away

just like "true" AI.

cost per transistor (0)

Anonymous Coward | more than 7 years ago | (#20667987)

Transistor density may reach physical limitations after we shrink everything to atomic scales and construct 3-D wafer-bonded chips, but the original statement of Moore's law was about the cost per transistor. It is likely that long after we stop increasing transistor density, breakthroughs in fabrication processes such as molecular self-assembly will lead to a second-wind for the exponentially decreasing cost per transistor. The computing industry will experience exponential improvements in throughput-per-dollar long beyond the end of semiconductor scaling.

Depends... (1)

john_is_war (310751) | more than 7 years ago | (#20668013)

The true death of Moore's law will occur when we have mastered quantum transistors. Once we hit that level of technology, I highly doubt we can progress any more, or at least not at the same pace.

two variables (2, Insightful)

OwlofCreamCheese (645015) | more than 7 years ago | (#20668015)

He forgets his law has two variables: the number of transistors and the cost. The number of transistors might stop doubling, but there is still huge change to come with the transistor count fixed and the cost changing.

I mean, an 8-core chip would be an improvement right now, but so would a 4-core chip at half the price. Think about a world where an 80-core chip exists, and how it would change the world. It would do so again when it went from 999 dollars to 99 dollars to 9 dollars to 99 cents to 9 cents to 9 for a cent.

Not yet, (3, Insightful)

SharpFang (651121) | more than 7 years ago | (#20668031)

The law speaks about the number of transistors. Considering the current size of a typical CPU die (about 1cm x 1cm x 1mm) and assuming a "reasonable" maximum size of some 10cm x 10cm x 10cm, we have about 15 years of doubling the SIZE of the CPU (with some challenges like heat dissipation, but nothing nearly as difficult as increasing the density further), and that's without increasing the density any more. So even if the density reaches its limits, CPUs may simply grow in size for a good while.
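A back-of-the-envelope check of that figure, using the die sizes assumed in the comment (footprint area only; a sketch, not a roadmap):

```python
import math

current_area = 1.0 * 1.0   # cm^2, typical die footprint today (the comment's figure)
max_area = 10.0 * 10.0     # cm^2, the comment's "reasonable" ceiling

doublings = math.log2(max_area / current_area)  # area doublings available
years = 2 * doublings                           # one doubling per two years
print(f"~{doublings:.1f} doublings -> ~{years:.0f} years from die growth alone")
```

Footprint growth alone yields roughly 13 years of doublings; stacking into the third dimension (the 10cm height) would buy more still, which is presumably where the 15-year figure comes from.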

Re:Not yet, (1)

mce (509) | more than 7 years ago | (#20668147)

Sorry, but 10cmx10cmx10cm is not reasonable. Not only because you run into functional yield issues, but also because you can only fit so many dies of that size and form factor on a single wafer. And it ain't many. Manufacturing cost would skyrocket even if the yield were 100%, which it never will be.

Re:Not yet, (1)

tomstdenis (446163) | more than 7 years ago | (#20668195)

A typical processor is not 100mm^2; it's more like 140mm^2 (the Core 2 Duo is about 143mm^2, according to a quick Google search). Do you have any idea how expensive a 1000mm^2 die would be? That would cut yields down and also be really hard to package, not to mention that if you packed 1000mm^2 with 65nm transistors, the thing would consume 300 amps.

Sure I suppose it's possible to make a 1000mm^2 die, but it would cost $25,000 USD and probably either be a super-many-core or run really slow (think longest wire...).

Re:Not yet, (3, Insightful)

Chirs (87576) | more than 7 years ago | (#20668287)

You've neglected to consider power issues. The 10cm cube you mention is 10000x the volume of the "typical" current size you mention. Assuming power scales linearly with volume, that would require approximately 300KW of power just for the cpu. That works out to about 1250Amps of current at 240V.

Nobody wants to increase the size of cpus...defects scale more than linearly with area, so there is a strong incentive to keep the die area down. Also, as the physical size increases you run into other problems...power transfer, clock pulse transfer, etc.
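The current figure above checks out under its stated assumptions (a ~30W die scaled linearly by volume; both numbers are the comment's assumptions, not measurements):

```python
die_power = 30.0          # W, assumed power of today's ~1cm x 1cm x 1mm die
volume_factor = 10_000    # 10cm cube vs. the current die volume
mains_voltage = 240.0     # V

total_power = die_power * volume_factor   # naive linear scaling with volume
current = total_power / mains_voltage
print(f"~{total_power / 1000:.0f} kW, ~{current:.0f} A at {mains_voltage:.0f} V")
```

That is, ~300 kW and ~1250 A: far beyond anything a desktop power supply, or a household circuit, could deliver.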

My prediction (5, Funny)

Random832 (694525) | more than 7 years ago | (#20668073)

I predict the number of predictions of the end of Moore's Law will double every six months.

Obligatory Pedantry -- it's about what's cheap (2, Insightful)

Urban Garlic (447282) | more than 7 years ago | (#20668093)

Luckily, there are enough geeky pedants on slashdot to make up for the fact that the editors have actually messed up this totemic bit of geek lore.

Moore was/is a technology manager, and his law is a management law. It says the number of transistors that can be economically placed on an integrated circuit, i.e. the transistor density of the price/performance "sweet spot", will increase exponentially, doubling roughly every two years.

The original [] refers to "complexity for minimum component cost", which emphasizes the economic aspect of it even more strongly.

Moore's law has never been about what's possible, it's always been about what's cheap.


Shriver's Law (1)

jshriverWVU (810740) | more than 7 years ago | (#20668157)

Computing power will continue to grow in direct relation to the finite amount of knowledge we have regarding physics. With each advancement in our knowledge of particle physics, we become more apt to apply it to electronics in general.

Duke Nukem Forever... (1)

Notquitecajun (1073646) | more than 7 years ago | (#20668175)

Hmmm...that coincides with the release date of Duke Nukem. Give or take 3-20 years.

Nope, nope, and nope (2, Informative)

Ancient_Hacker (751168) | more than 7 years ago | (#20668227)

First of all you've misquoted Moore's law.

Secondly it's not so much a "law", as a consequence of how long it takes to amortize the cost of a fab plant.

Thirdly, it's tied to 2-D circuit layouts. If and when 3-D IC technology becomes practical, then all we need is a 2^(1/3) factor, or about a 21% linear shrink, per doubling, which is somewhat more maintainable for a few more generations.
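A quick check of that cube-root arithmetic:

```python
# To double transistor count in three dimensions, each linear dimension
# only needs to scale by 2^(1/3), i.e. features shrink to ~79% of their size.
density_factor = 2 ** (1 / 3)           # ~1.26x density per dimension
linear_shrink = 1 - 1 / density_factor  # ~21% linear shrink per doubling
print(f"linear shrink per doubling: ~{linear_shrink:.1%}")
```

Compare with 2-D scaling, where the same doubling needs a 2^(1/2) factor per dimension, i.e. a ~29% linear shrink.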

10 years is 5 more cycles (3, Insightful)

hattig (47930) | more than 7 years ago | (#20668241)

That's 32 times as many transistors... whereas today you can get 4 cores on a CPU in under 300mm^2, you'll be getting 128 cores in 2017 (simplistically; you'll get a variety of generic cores and application-specific cores, and per-core improvements will increase their size, so say 32 generic cores and 32 application-specific cores).

If it's 16 years, that's 256 times as many transistors. 256 generic cores and 256 application-specific cores in 2023? Let's not even imagine the per-core speeds! It's all pretty exciting, and I'm being conservative with the figures here.

Of course, applications will grow to utilise this stuff, but more and more tasks are getting to the point of 'fast enough', even despite the bloating efforts of their creators. Even if there is a 10 year hiatus in process improvements after 2024, it'll take some time for the applications to catch up apart from certain uses. If those uses are common enough, there will be hardware available for it instead. Of course if only Intel and IBM have fabs that can make these products, because the fabs cost $20b each...
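The doubling arithmetic behind those core counts, ignoring the per-core growth the comment allows for (the 4-core starting point and two-year cadence are the comment's assumptions):

```python
cores_2007 = 4  # cores per CPU today, per the comment

for years in (10, 16):
    doublings = years // 2               # one doubling every two years
    cores = cores_2007 * 2 ** doublings  # naive: all transistors go to cores
    print(f"+{years} years: {2 ** doublings}x transistors -> {cores} naive cores")
```

The naive counts (128 and 1024) come out higher than the comment's splits because it spends some of the transistor budget on making each core bigger.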

It's not a law it's an observation (2, Informative)

gelfling (6534) | more than 7 years ago | (#20668329)

It's not a law; it's simply an observation that, within Intel, that's more or less the rate of progress. As we saw with the P4 chip, the problem we bumped into was not Moore's Law but the laws of thermodynamics. So we found a good enough reason to go to multicore CPUs. Eventually, though, you do bump into Albert Einstein: in 1 billionth of a second, light travels about 1 foot, so for a switching period of 1 billionth of a second (1 GHz), the entire circuit length from end to end has to be less than one foot.
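The light-travel bound invoked above, as a sketch (one clock period at 1 GHz; real circuits hit this limit well before a full foot, since signals propagate slower than c in silicon and copper):

```python
c = 299_792_458.0      # m/s, speed of light in vacuum
period = 1e-9          # s, one cycle at 1 GHz

distance_m = c * period  # farthest any signal can possibly travel in one cycle
print(f"light travels ~{distance_m * 100:.1f} cm "
      f"(~{distance_m / 0.3048:.2f} ft) per cycle")
```

At 3 GHz the budget shrinks to about 10 cm per cycle, which is one reason signals are pipelined across multiple cycles rather than expected to cross the whole die in one.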

We're already there (1)

Animats (122034) | more than 7 years ago | (#20668405)

We're already near the end of Moore's Law. The problem is not feature size, it's getting rid of the heat. CPUs are already hitting heat and power limits, which is why CPU speeds stalled out around 3GHz.

Feature size alone matters for memory devices, and we can expect continued progress in memory density. Even for DRAM, getting rid of the heat is becoming a problem, so the future is with devices that don't require refresh cycles. We'll see progress in flash memories and static memory technologies.

Moore's meta-law (0, Redundant)

Weaselmancer (533834) | more than 7 years ago | (#20668413)

The number of predictions about the end of Moore's law will double every two years.

Simple Solution (1)

Nom du Keyboard (633989) | more than 7 years ago | (#20668479)

the number of transistors on integrated circuits would double every two years.

The solution is simple. Just make integrated circuit dies twice as big every two years.

End of Moore's law! (1)

Deadplant (212273) | more than 7 years ago | (#20668711)

The end of Moore's law! New solar panels with double efficiency! Flying cars now only 5 years away!!!!

Are these articles being generated by a script or what?