
45 Years Later, Does Moore's Law Still Hold True?

CmdrTaco posted more than 3 years ago | from the wish-i-had-a-law dept.


Velcroman1 writes "Intel has packed just shy of a billion transistors into the 216 square millimeters of silicon that compose its latest chip, each one far, far thinner than a sliver of human hair. But this mind-blowing feat of engineering doesn't really surprise us, right? After all, that's just Moore's Law in action, isn't it? In 1965, an article in "Electronics" magazine by Gordon Moore, the future co-founder of chip juggernaut Intel, predicted that computer processing power would double roughly every 18 months. Or maybe he said 12 months. Or was it 24 months? In fact, nowhere in the article did Moore spell out that famous declaration, nor does the word 'law' even appear in the article at all. Yet the idea has proved remarkably resilient over time, entering the zeitgeist and lodging like a stubborn computer virus you just can't eradicate. But does it hold true? Strangely, that seems to depend more than anything on whom you ask. 'Yes, it still matters, and yes we're still tracking it,' said Mark Bohr, Intel senior fellow and director of process architecture and integration. 'Semiconductor chips haven't actually tracked the progress predicted by Moore's law for many years,' said Tom Halfhill, the well-respected chip analyst with industry bible the Microprocessor Report."
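The competing doubling periods in the summary are easy to compare numerically. A minimal sketch (the 1965 starting count of 64 components is an assumed figure, purely for illustration):

```python
# Illustrative: project component counts from a 1965 baseline under
# the three doubling periods commonly attributed to Moore's Law.
def projected_count(base, years, doubling_months):
    """Components after `years` years, doubling every `doubling_months` months."""
    return base * 2 ** (years * 12 / doubling_months)

base_1965 = 64  # assumed starting count, for illustration only
for months in (12, 18, 24):
    count = projected_count(base_1965, 45, months)
    print(f"doubling every {months} months -> {count:.3g} components after 45 years")
```

Only the slower readings land anywhere near today's billion-transistor chips, which is part of why the "law" is quoted so loosely.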


214 comments


Moores law of first posts (-1, Offtopic)

seramar (655396) | more than 3 years ago | (#34757390)

doubling my first post count every 18 months

Re:Moores law of first posts (1)

alvinrod (889928) | more than 3 years ago | (#34757460)

I'd say you'd run out of stories in which to post first, but it seems as though /. follows a Moore's law of duplicate stories so you should be alright.

Re:Moores law of first posts (5, Insightful)

fusiongyro (55524) | more than 3 years ago | (#34757816)

I feel like I've been reading this article every six months for the last ten years.

Number of components, not computing power (2)

Daengbo (523424) | more than 3 years ago | (#34757394)

It was the number of components, not computing power, and the time-frame should be easy to figure out given the difference between 1965's number and the 65,000 predicted for 1975.
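Working backwards from those two data points (taking roughly 64 components in 1965, an assumed figure, against the 65,000 predicted for 1975):

```python
import math

# Back out the implied doubling period from the comment's two data points:
# roughly 64 components in 1965 and 65,000 predicted for 1975.
start, end, years = 64, 65_000, 10   # 64 is an assumed 1965 figure
doublings = math.log2(end / start)   # ~10 doublings over the decade
months_per_doubling = years * 12 / doublings
print(f"{doublings:.2f} doublings -> one every {months_per_doubling:.1f} months")
```

That comes out to roughly one doubling per year, matching the yearly rate Moore described in the original 1965 article.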

Re:Number of components, not computing power (3, Insightful)

MrEricSir (398214) | more than 3 years ago | (#34757424)

Adding components is easy. Making faster computers is not.

Re:Number of components, not computing power (0)

Anonymous Coward | more than 3 years ago | (#34757458)

Especially when the software retards are just eager to pile on bloat, abstraction, and whatever tickles their fancy this week. After all, they're not engineers, and they're not doing the heavy lifting.

Re:Number of components, not computing power (0)

Anonymous Coward | more than 3 years ago | (#34757600)

Yeah, god forbid the software actually uses the chip. It should sit there idle, like RAM.

Resources are for stockpiling.

Re:Number of components, not computing power (5, Insightful)

Yvan256 (722131) | more than 3 years ago | (#34757718)

You're either trolling or looking at it the wrong way.

More efficient software means we could probably run tomorrow's software with yesterday's hardware.

Instead, because of bloat, we're stuck running yesterday's software with tomorrow's hardware.

When put in the mobile context, it also means shorter battery life.

Re:Number of components, not computing power (5, Insightful)

icebike (68054) | more than 3 years ago | (#34758000)

Well said.

I'm often modded troll when I claim that every advancement in computer processing power has been absorbed by look and feel of the OS interface.

Recalculating the spreadsheet (or just about any other real work) seemingly takes just as long (short?) as ever.

I know it's not provably true, but it sure seems that way.

Re:Number of components, not computing power (4, Insightful)

Cid Highwind (9258) | more than 3 years ago | (#34758212)

The problem with that is there is no objective definition of software bloat. It's just slashdot shorthand for "spending time on stuff I personally don't find important".

Your "bloat" is another user's "better interface" or "better security" or "maintainable code".

Re:Number of components, not computing power (1)

rawler (1005089) | more than 3 years ago | (#34758524)

When put in the mobile context, it also means shorter battery life.

It also provides incentive for hardware makers to keep focusing on performance rather than other qualities. It's to the point of the hardware literally catching on fire, killing people.

Sure, if you sit with your laptop in your lap, it's smart to make sure it gets properly ventilated, but WHY SHOULD THE USER HAVE TO CARE? Had software systems remained efficient, hardware manufacturers would soon have had to differentiate themselves on other qualities users actually value, such as running cool and quiet, or case design.

Re:Number of components, not computing power (1)

bluefoxlucid (723572) | more than 3 years ago | (#34757724)

You mean like how back in the day you had 32 megs of RAM using Windows, with a 100MHz processor, and you could pile on a new program and the computer would swap 50 megs to disk, and tick along just fine mostly?

And nowadays we have 4 gigs of RAM, and the computer uses 500 megs of swap, and every time you alt-tab you have to wait 4-5 seconds for everything to load back into RAM as windows slowly get redrawn, and everything runs slow... but wait! Developers are piling more and more on, since with 4 gigs of RAM EVERY program can use 2-3 gigs of it, and now... yes! 6 gigs of swap, and a computer that barely runs at all with six 8-core processors!

Re:Number of components, not computing power (1)

rubycodez (864176) | more than 3 years ago | (#34757766)

Some of us might want to run more than one major application at once, or maybe use our RAM for our own data instead of hundreds of megabytes of bloatware.

Re:Number of components, not computing power (5, Funny)

jappleng (1805148) | more than 3 years ago | (#34757614)

We call those Java developers.

Re:Number of components, not computing power (-1)

Anonymous Coward | more than 3 years ago | (#34757656)

Troll

Re:Number of components, not computing power (0)

Anonymous Coward | more than 3 years ago | (#34757686)

No, it's the truth.

Re:Number of components, not computing power (3, Insightful)

BlueWaterBaboonFarm (1610709) | more than 3 years ago | (#34757730)

Since when can't you call someone a troll for telling you the truth? ~

Re:Number of components, not computing power (0)

Anonymous Coward | more than 3 years ago | (#34757828)

We call those Java developers.

Or Bjarne Stroustrup & C++0x.

Today's C++ ain't your Dad's C++!

Re:Number of components, not computing power (1)

MrEricSir (398214) | more than 3 years ago | (#34757810)

That's called "enterprise software."

Re:Number of components, not computing power (3, Insightful)

Joce640k (829181) | more than 3 years ago | (#34757822)

I remember worrying when they started making 16 and 20Mhz CPUs, I thought digital electronics wouldn't be very stable at those sort of clock speeds.

Re:Number of components, not computing power (1)

Anonymous Coward | more than 3 years ago | (#34758542)

Presumably you weren't familiar with Emitter Coupled Logic [wikipedia.org] (ECL, pronounced "Ek-el") during the same era. I remember looking at ECL signals on an oscilloscope and observing how "beautifully" (or "cleanly") they transitioned their logic states--fascinating, and it also inspired confidence that reliability could be achieved at very high switching speeds.

Re:Number of components, not computing power (0)

Anonymous Coward | more than 3 years ago | (#34757872)

Adding components is easy. Making faster computers is not.

Exactly. Quantum mechanics sets a pretty hard and fast limit on just how fast computers can actually get.

Re:Number of components, not computing power (1)

Jonner (189691) | more than 3 years ago | (#34758008)

It depends what you mean by "faster computer." Nobody expects clock speeds to advance much beyond the several GHz possible today. Therefore, more and more components are being devoted to parallel processing, such as multiple cores, pipelines, and processor threads.

It seems to me that chip designers like Intel, AMD and others are doing pretty well at getting more and more processing power out of each clock cycle, though I'd hesitate to call anything about chip design "easy." Writing software to take advantage of the increasing parallelism seems harder.

Re:Number of components, not computing power (1)

nabsltd (1313397) | more than 3 years ago | (#34758314)

Nobody expects clock speeds to advance much beyond the several GHz possible today.

With the Sandy Bridge chips overclocking about 20% faster at the same temperature, it will only take about 3 more iterations before we are nearing 10GHz.
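Compounding the claimed 20% gain per iteration shows where the 10GHz figure comes from (the 5GHz starting clock is an assumed overclock baseline, not a figure from the comment):

```python
# Compound the comment's claim: +20% clock speed per iteration,
# starting from an assumed 5 GHz overclock baseline.
clock_ghz = 5.0  # illustrative starting point
for iteration in range(1, 4):
    clock_ghz *= 1.2
    print(f"after iteration {iteration}: {clock_ghz:.2f} GHz")
```

Three iterations of compounding land around 8.6 GHz, i.e. "nearing 10GHz" only under fairly generous assumptions.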

Re:Number of components, not computing power (1)

Monkeedude1212 (1560403) | more than 3 years ago | (#34757450)

I thought it was that the number of transistors on a chip will double (or more, due to major breakthroughs) every 2 years, which means whatever they had 2 years ago would need to have been doubled. Which is why, when people asked if Intel would have 1 billion transistors on a 1-inch chip by 2010, they said "Already done it!"

Re:Number of components, not computing power (0)

Anonymous Coward | more than 3 years ago | (#34757534)

The exact law is that after a certain time period you get twice the number of components on the same piece of silicon FOR THE SAME PRICE. IC node scaling still follows this. About once every 18 months we move to a smaller process.
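A minimal sketch of this cost-focused reading, with made-up starting numbers purely for illustration:

```python
# Illustrative: if each process node delivers twice the transistors at
# the same chip cost, cost per transistor halves every node.
cost_per_chip = 100.0    # assumed constant cost (arbitrary units)
transistors = 1_000_000  # assumed starting count
for node in range(4):
    per_million = cost_per_chip / transistors * 1e6
    print(f"node {node}: {transistors:>10,} transistors, ${per_million:.2f} per million")
    transistors *= 2  # next node: same price, double the components
```

The economic point is that the same money buys exponentially more transistors, regardless of whether any single chip gets "faster."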

Re:Number of components, not computing power (1)

MaskedSlacker (911878) | more than 3 years ago | (#34758204)

There is no exact law.

Re:Number of components, not computing power (2)

NewWorldDan (899800) | more than 3 years ago | (#34757606)

Conveniently, the actual 1965 article is linked in the summary above. Specifically, it was about the cost-effectiveness of adding components to an integrated circuit. Circuits with few components aren't cost-effective to build, and circuits with more components have lower yields, making them not ideal either. At the time, the component count was doubling on a yearly basis, and Moore predicted that this would continue for the near term (5-10 years), but that the longer-term trend was unlikely. And so it was, with the time between component-count doublings increasing from 12 months to 18 to 24 to whatever it is today. I remember in the early 90s, processor performance was easily doubling every 2 years, and it certainly hasn't been that way the last 4-5 years.

Re:Number of components, not computing power (5, Insightful)

mangu (126918) | more than 3 years ago | (#34757752)

I remember in the early 90s, processor performance was easily doubling every 2 years, and it certainly hasn't been that way the last 4-5 years.

It was easier to measure then, because performance was directly related to clock rate. Now that clock has stopped going up, performance depends on parallel processing.

Then there's a catch, parallel processing depends on the software. Doubling clock rate will probably double the performance of almost any software that runs in the computer, doubling the number of cores not necessarily. Luckily, the most demanding tasks in computing are those that can be parallelized.

With the advent of the GPGPU the future looks bright for Moore's Law. I've recently run some benchmarks using Cuda to perform FFTs and compared it to the data I have from my old computers. In my case, at least, my current computer is above the curve predicted by applying Moore's Law to the computers I've had in the last 25 years.
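The caveat above, that doubling cores doesn't necessarily double performance, has a standard formalization in Amdahl's Law (not named in the comment, but it's the usual way to quantify it):

```python
# Amdahl's Law: the speedup from n cores when a fraction p of the work
# can be parallelized and the remaining (1 - p) must run serially.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for cores in (2, 4, 8, 16):
    # Even at 95% parallel, speedup falls well short of the core count.
    print(f"{cores:>2} cores: {amdahl_speedup(0.95, cores):.2f}x")
```

With a purely serial 5% of the work, 16 cores deliver barely 9x, which is why "doubling the number of cores" so often disappoints.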

Re:Number of components, not computing power (5, Interesting)

vux984 (928602) | more than 3 years ago | (#34757972)

It was easier to measure then, because performance was directly related to clock rate.

It was easier to measure then because real world performance was actually doubling and was apparent in most benchmarks.

Now that clock has stopped going up, performance depends on parallel processing.

Performance isn't doubling anymore. Cores are increasing, and the pipelines are being reworked, cache is increasing, but PERFORMANCE isn't doubling.

Then there's a catch, parallel processing depends on the software.

It depends on the task itself being parallelizable in the first place, and many many tasks aren't.

Luckily, the most demanding tasks in computing are those that can be parallelized.

Unfortunately it's the aggregate of a pile of small independent undemanding tasks that drags modern PCs to a crawl. And these aren't even bottlenecking the CPU itself... to be honest I don't know what the bottleneck is right now in some cases... I'll open up the task manager... CPU utilization will be comfortably low on all cores, hard drive lights are idle so it shouldn't be waiting on IO... and the progress bar is just sitting there... literally 20-30 seconds later things start happening again... WHAT THE HELL? What are the possible bottlenecks that cause this?

Re:Number of components, not computing power (2)

increment1 (1722312) | more than 3 years ago | (#34758046)

// TODO remove this
sleep(30);

Re:Number of components, not computing power (1)

Anonymous Coward | more than 3 years ago | (#34758138)

This depends on your definition of performance.

If you define it as "How fast does program X run" then no, it's not increasing.

If you define it as "How much work can the computer do in one second" then yes, it increases.

Also, you don't necessarily need a highly parallelizable task, you simply need a lot of tasks.

To your last question: some operating systems these days (I'm looking at you, Vista) try to make certain things "faster" by indexing. That means every time you copy/move/unzip files, it has to update the index. That also means that working with a lot of small files is much worse than one large one. This is basically why Vista's copies are so slow. The benefit of this (the usefulness of which is debatable) is that searching for files is much faster.

For me, I don't often even *do* file searches, so it's mainly a feature that slows things down and does nothing for me.

Re:Number of components, not computing power (2)

Confusador (1783468) | more than 3 years ago | (#34758310)

I know you said that it shouldn't be I/O, but I would still bet money that if you put an SSD in there you'd notice a dramatic improvement. (Although, you didn't mention RAM usage, but even then the SSD would help since it would speed up swap.)

Re:Number of components, not computing power (2)

vux984 (928602) | more than 3 years ago | (#34758562)

I know you said that it shouldn't be I/O, but I would still bet money that if you put an SSD in there you'd notice a dramatic improvement. (Although, you didn't mention RAM usage, but even then the SSD would help since it would speed up swap.)

However, when I observe PCs stall with no significant CPU activity and no disk activity... if it were thrashing RAM, there should be disk activity. No, those stalls have got to be something else.

Personally, though, yes, an SSD is my next upgrade, and I agree with you that I hope and expect to see a dramatic difference in load times, boot times and other io intensive stuff, but I don't think the stalls I'm complaining here about are related.

One source of stalls that I am aware of are networking related. I've seen lots of stuff choke badly trying to reach servers that aren't reachable, where it just stalls the entire UI of an app while it waits for some network query to timeout. That accounts for some of them, but I'm still head scratching why it stalls in a number of other cases where there shouldn't be any networking dependency/functionality.

Re:Number of components, not computing power (1)

nabsltd (1313397) | more than 3 years ago | (#34758504)

Performance isn't doubling anymore. Cores are increasing, and the pipelines are being reworked, cache is increasing, but PERFORMANCE isn't doubling.

It really is, if you have software that takes advantage of all those cores. If you have a single-threaded task, then you probably aren't seeing an increase in performance of that task, but you can now run that task plus something else at the same time.

I have been encoding audio to Dolby Digital recently, and the single-threaded compressor finished the job in about 1-2% of the length of the audio, so, 1 hour of audio took about a minute to encode. Although it has been available for a long time, I had not tried the Aften AC-3 encoder [sourceforge.net] before, but after discovering that it is multithreaded and can encode that same hour of audio in less than 5 seconds on an 8-core machine, I'm never using anything else.

There are many other examples like mine that show overall performance is increasing. Even games now benefit from more cores, although 4 is about the limit of increasing performance for most current titles.
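Taking the comment's rough timings at face value (about 60 seconds single-threaded versus about 5 seconds multithreaded for an hour of audio), the implied speedup is easy to check:

```python
# Rough check of the comment's numbers: ~60 s single-threaded vs. ~5 s
# multithreaded for one hour of audio on an 8-core machine.
single_s, multi_s, cores = 60.0, 5.0, 8  # the comment's approximate figures
speedup = single_s / multi_s
print(f"speedup: {speedup:.1f}x on {cores} cores")
```

A speedup larger than the core count suggests the new encoder's single-thread path is also faster than the old compressor's, not just that the work was parallelized.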

Re:Number of components, not computing power (1)

TheCarp (96830) | more than 3 years ago | (#34758210)

> It was easier to measure then, because performance was directly related to clock rate. Now that clock
> has stopped going up, performance depends on parallel processing.

Very true, but, is it still "Moore's Law" if you reformulate it to take new paradigms into account? When Einstein adopted the Lorentz Transformations to describe relative motion, nobody referred to those equations as "Newton's Laws".

It's splitting hairs, but I don't think it's all that useful to call it a "law" anyway. I always thought of it as more of a "rule of thumb". As such, it is still useful. If I have a workload that runs slow today, I know that that isn't a total killer, because, 2 years or so down the line, I can expect to have machines that can do the same workload faster (or take on a bigger one).

In those terms, "Moore's Law", as long as you don't try to get too specific with it, works OK. It's also good for explaining things to non-technical people when they want an idea of a comparison between their old machine and a brand new one... again, accuracy and precision are not important here, just a general ballpark.

For that, I think it's great, and I use it that way. However, that said... there is no reason to expect it won't change. Like, I fully expect that if some major national government decided to utterly pour resources into the problem, we would start seeing that doubling time climbing to 4 or 6 or even 8 years! :)

-Steve

Virus? (1, Insightful)

Anonymous Coward | more than 3 years ago | (#34757396)

lodging like a stubborn computer virus you just can't eradicate

Stop using Microsoft Windows.

A Better Question: (5, Insightful)

justin.r.s. (1959534) | more than 3 years ago | (#34757408)

45 Years Later, Does Moore's Law Still Matter?
Seriously, hardware is always getting faster. Why do we need a law that states this? Which is a more likely scenario for Intel: "Ok, we need to make our chips faster because of some ancient arbitrary rule of thumb for hardware speed.", or "Ok, we need to make our chips faster because if we don't, AMD will overtake us and we'll lose money."?

Re:A Better Question: (2)

cxbrx (737647) | more than 3 years ago | (#34757512)

Agreed. When watching a presentation, I have a corollary to Moore's Law, where if a slide mentions Moore's Law and has "the graph", then it is ok to ignore that slide and the following two slides because no new information will be transmitted. It is like a nicer (and temporary) version of Godwin's Law [wikipedia.org] .

Re:A Better Question: (4, Interesting)

kenrblan (1388237) | more than 3 years ago | (#34757516)

Well said. I was about to post the same question. The progress definitely matters, but the prediction is really not much more than an engineering goal at this point. That goal is secondary to the goal of remaining the market leader. Without intending to start a flame war, I wish the programming side of computing was as interested in making things smaller and faster in code. Sure, there are plenty of academically oriented people working on it, but in practice it seems that most large software vendors lean on the crutch of improved hardware rather than writing tight code that is well optimized. Examples include Adobe, Microsoft, et al.

Re:A Better Question: (4, Informative)

alvinrod (889928) | more than 3 years ago | (#34757594)

Funny you should say that, as there was a /. story [slashdot.org] not terribly long ago about how algorithm improvements have outpaced hardware gains.

The problem with products from Adobe and Microsoft is that the codebase is massive and it can be a pain to fix and optimize one part without breaking something else. Software vendors face the same need to be faster than the competition as Intel/AMD do. If Adobe and Microsoft don't bother, I think it speaks more to the lack of competition in some of their product areas than to simple laziness.

Re:A Better Question: (0)

Anonymous Coward | more than 3 years ago | (#34757698)

Poor software performance is a result of both lazy programmers and the fact that people worry about optimizing "too early". The fact is, you have to think about performance from the beginning, it can't be left to be done after the system has been designed and built. It's damn hard to optimize after the fact even though it doesn't look that way. This problem is compounded by the fact that "smart" and experienced people (teachers, even managers that used to be programmers) think you should save optimization for later.

Re:A Better Question: (2)

houghi (78078) | more than 3 years ago | (#34757704)

but the prediction is really not much more than an engineering goal at this point

No, it is a prediction, not a goal. There is a HUGE difference between the two. The worst that could happen (or the best) is that it becomes a self-fulfilling prophecy.
If they get there, they stop trying, as they have reached the prophecy.
If they do not get there, they will try harder to reach the prophecy.

Now the question is whether the self-fulfilling prophecy speeds up the process or slows it down in the long term.

Re:A Better Question: (2)

mangu (126918) | more than 3 years ago | (#34757838)

If they get there, they stop trying, as they have reached the prophecy.
If they do not get there, they will try harder to reach the prophecy.

Now the question is whether the self-fulfilling prophecy speeds up the process or slows it down in the long term.

Let's try it out:

-"Boss, I have this fantastic idea for a chip that will have ten times more components than the ones we have today".
-"No way! That would violate Moore's Law, make it just twice the number of components!"

No, I don't think Moore's Law is slowing down progress.

Re:A Better Question: (5, Insightful)

epte (949662) | more than 3 years ago | (#34757728)

My understanding was that the prediction was indeed important, for inter-business communication. Say, for example that a company purchases cpus from a vendor, for use in its product when it releases two years from now. The product development team will shoot for the expected specs on the cpus at that future date, so that the product will be current when it hits the market. Such predictability is very important for some.

Re:A Better Question: (1)

TheRaven64 (641858) | more than 3 years ago | (#34758510)

Actually, it's important for intra-business communication too. Intel takes around five years to bring a chip to market. First, marketing works out what kind of chip will be in demand in five years, and roughly how much they will be able to sell it for. They produce some approximate requirements and the design team starts work. In the meantime, the process team works on increasing the number of transistors that they can fit on a chip, improving yields, and so on.

At the end of the five years, they produce the chip that the design team came up with. The cost of production depends on the number of transistors that the design team used and the amount of progress that the process team has made in this time.

It is incredibly important for the design team to have an accurate prediction of their transistor budget early on. If they get this guess wrong, then they end up with a chip that's too expensive to produce, or one that uses fewer transistors than it could (for the target market segment) and underperforms.

Intel made bad predictions a couple of times. I had the opportunity to talk to the lead designer on the Pentium 4 team, which suffered from two bad predictions. The first was marketing not realising that power consumption was becoming an important factor, so fast-clock-at-all-costs was not the correct design goal. The second was the belief that they'd be able to ramp the clock speed up to 10GHz by the 'tock' cycle of the NetBurst architecture's lifecycle (they didn't even hit 4GHz).

Re:A Better Question: (1)

JustinOpinion (1246824) | more than 3 years ago | (#34757954)

Without intending to start a flame war, I wish the programming side of computing was as interested in making things smaller and faster in code.

I don't think it's as bad as all that. Believe me, I would love it if all the software I used were trimmed-down and brilliantly optimized. There is indeed quite a lot of software that is bloated and slow. But it really just comes down to value propositions: is it worth the effort (in programming time, testing, etc.)? For companies, it comes down to whether making the software faster will bring in more sales. For many products, it won't bring in new sales (as compared to adding some new feature), so they don't bother.

But in places where it does matter, there actually is some good competition. In browser rendering, for instance, the big players are all competing to improve performance (e.g. Mozilla [arewefastyet.com] ). Think even of something as horribly inefficient as Adobe Acrobat Reader... Its inefficiency has in fact led to the creation of lighter-weight alternatives (e.g. Sumatra [kowalczyk.info] or FoxIt [foxitsoftware.com] ). Another example is in graphics: there are all kinds of brilliant and powerful algorithms and optimizations working in modern software to make the slick graphics we now take for granted.

In an ideal world, every piece of software would be crafted to perfection, and would ship as a perfectly secure, extremely small chunk of code that runs blazingly fast because of the thousands of meticulous assembly-level optimizations that were performed. Reality falls short. But, on the other hand, our modern computers are really quite functional and fast. So I would say we should keep putting pressure on vendors to ship faster software, to the extent that we notice the slowness and it bothers us... but we should also acknowledge the real effort that is going into optimization all the time.

Re:A Better Question: (2)

RandCraw (1047302) | more than 3 years ago | (#34758268)

Fact is, software development has relied on exponential hardware speedup for the last 40 years, and that's why Moore's Law *is* still relevant.

If a global computer speed limit is nigh then mainstream computing will slowly decelerate. Why? 1) Perpetual increase of bloat in apps, OSes, and programming languages. 2) Ever more layers of security (e.g. data encryption and the verification & validation of function calls, data structures, and even instructions). 3) Increasing demands of interactivity (e.g. event polling of network & GUI & the explosion of sensors).

The care, feeding, and survival of all of this crap depends on a nonstop increase in hardware horsepower. If hardware no longer improves (regardless of whether that is due to transistor density, clock speed, or microarchitecture design), the effect is the same -- computers won't keep up with the rising demands on them, and they will slow down.

In fact, Moore's Law died the minute that multicore CPUs were announced as its heir apparent. Parallelism has always been the last ditch solution to hardware speedup. (Amdahl's Law is a harsh mistress). Parallel CPUs have been available forever and were mercifully avoidable as long as clock speeds multiplied. But when CPU Hz topped out at around 3 billion (around 2003), the hardware pros knew the party was over.

Moore's Law, RIP.

Re:A Better Question: (1)

avandesande (143899) | more than 3 years ago | (#34757638)

Yes, I think it does matter, because eventually the law will 'fail'.
I have no idea at what point that will come but it will certainly be an important inflection point for technology.

Re:A Better Question: (1)

flonker (526111) | more than 3 years ago | (#34757642)

The utility of Moore's Law is not that "hardware is always getting faster", but rather, it is a good rule of thumb for the specific rate of change.

You can also throw in "transistor count != speed", but that's been beaten to death already.

Re:A Better Question: (4, Informative)

hedwards (940851) | more than 3 years ago | (#34757746)

Don't be stupid. AMD did overtake Intel on a couple occasions and the response most recently was to bribe companies not to integrate AMD chips into their computers.

Re:A Better Question: (1)

HelloKitty2 (1585373) | more than 3 years ago | (#34757750)

It seems more reasonable business-wise to follow this rule than to create and release a 1000-core CPU today, and have no new products and tons of competitors tomorrow.

Re:A Better Question: (1)

rawler (1005089) | more than 3 years ago | (#34758554)

It does, since something needs to counter the increasing sluggishness of software.

http://en.wikipedia.org/wiki/Wirth's_law [wikipedia.org]

Again? (5, Funny)

Verdatum (1257828) | more than 3 years ago | (#34757490)

Is there some corollary to Moore's law regarding the frequency at which articles will be written commemorating the age of Moore's law and asking if it is relevant?

Re:Again? (0)

Anonymous Coward | more than 3 years ago | (#34757588)

That would be Moore's law's law.

Re:Again? (1)

skine (1524819) | more than 3 years ago | (#34757934)

And if it were studied by someone whose last name was Moore, it would be Moore's Moore's law's law.

Re:Again? (0)

Anonymous Coward | more than 3 years ago | (#34758076)

And if it were studied by someone whose last name was Moore, it would be Moore's Moore's law's law.

Too late,
Les

Re:Again? (2)

Galestar (1473827) | more than 3 years ago | (#34758186)

From the article:

It's also been so frequently misused that Halfhill was forced to define Moron's Law, which states that "the number of ignorant references to Moore's Law doubles every 12 months."

Re:Again? (2)

Scroatzilla (672804) | more than 3 years ago | (#34758394)

Cole'slaw.

Re:Again? (1)

TheRaven64 (641858) | more than 3 years ago | (#34758530)

There is also Moron's Law, which states that no one who links to Moore's paper is capable of correctly stating Moore's Law.

Well, if it didn't... (2)

pedantic bore (740196) | more than 3 years ago | (#34757492)

Well, if it didn't, then would we still be talking about it, forty-five years later?

Answers (2)

noidentity (188756) | more than 3 years ago | (#34757504)

No, yes, no, no, no.

The real problem: Access Speeds (1)

Chowderbags (847952) | more than 3 years ago | (#34757544)

The real problem is access speeds. Even if you had an arbitrarily powerful CPU, you'd still have to load in everything from memory, hard disk, or network sources (i.e. all very slow). Until these can keep pace with CPUs (SSDs are still expensive), it's pretty much just AMD and Intel having a pissing match. How often do you really max out your CPU cycles these days anyway?

Re:The real problem: Access Speeds (5, Funny)

Anonymous Coward | more than 3 years ago | (#34757568)

How often do you really max out your CPU cycles these days anyway?

Every-fuckin'-time Flash appears on a web page!

Re:The real problem: Access Speeds (1)

Pieroxy (222434) | more than 3 years ago | (#34757896)

+1

Re:The real problem: Access Speeds (2)

serviscope_minor (664417) | more than 3 years ago | (#34757644)

How often do you really max out your CPU cycles these days anyway?

All the time. Why?

Re:The real problem: Access Speeds (1)

alvinrod (889928) | more than 3 years ago | (#34757662)

If that continues to be a big problem, chip makers will just use the extra transistors to increase the cache sizes. That doesn't solve every problem, but if they're hitting a wall in terms of clock speed or core utilization, then having more cache can keep what is being used well fed.

The other side of this is that a general PC might not get much more powerful, but notebooks, tablets, and phones will be able to pack the same amount of power into smaller chips, resulting in reduced power consumption or increased capabilities until they run into the same wall, which would take about a decade.

Re:The real problem: Access Speeds (0)

Anonymous Coward | more than 3 years ago | (#34757692)

How often do you really max out your CPU cycles these days anyway?

If you're not, then you're wasting valuable resources that would be better spent on those less fortunate than you. In other words: why must you be so greedy and hog more power than you require?

Useful car analogy: why buy a car with 500hp if you only require 114?

P.S. I'm only kidding; FTW--consume! More! Er, Moore! Moore!

Re:The real problem: Access Speeds (1)

hedwards (940851) | more than 3 years ago | (#34757758)

Because it's cheaper than getting your penis permanently enlarged?

The real problem: Speed of Light (2)

mangu (126918) | more than 3 years ago | (#34757986)

Even if you had an arbitrarily powerful CPU, you'd still have to load in everything from memory, hard disk, or network sources (i.e. all very slow)

Considering that light only travels 30 cm per nanosecond in a vacuum, the maximum practical clock speed depends on how far your memory is. At a 3 GHz clock rate, a request for data from a chip that's just 5 cm away on the circuit board will have a latency longer than the clock period.

The only solution to this problem is increasing the on-chip cache. But that will depend on having software that manages the cache well, i.e. more complex algorithms. In that case, since you have to optimize the software anyhow, why not go to a parallel architecture?

I bet that in the future we will see chips with simpler (read RISC) architectures with more on-chip memory and special compilers designed to optimize tasks to minimize random memory access.
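The latency arithmetic above can be checked in a few lines. This sketch assumes signals travel at the vacuum speed of light; real board traces are slower, so it's a best case:

```python
# Best-case signal latency vs. clock period, assuming propagation at
# the vacuum speed of light (real PCB traces are roughly half that).
C = 30.0  # cm per nanosecond, speed of light in vacuum

def round_trip_ns(distance_cm):
    """Time for a request/response round trip over distance_cm."""
    return 2 * distance_cm / C

clock_ghz = 3.0
period_ns = 1.0 / clock_ghz       # ~0.333 ns per cycle at 3 GHz
latency_ns = round_trip_ns(5.0)   # memory 5 cm away on the board

# Even at light speed, the 5 cm round trip eats a full clock period.
print(period_ns, latency_ns, latency_ns >= period_ns)
```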

Re:The real problem: Speed of Light (1)

Fict (475) | more than 3 years ago | (#34758262)

yeah, RISC architecture is gonna change everything

Check the Wiki sources. (1)

0100010001010011 (652467) | more than 3 years ago | (#34757548)

Is it really that difficult?

The complexity for minimum component costs has increased at a rate of roughly a factor of two per year... Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer.[7]

Original article: "Cramming More Components onto Integrated Circuits"
Article 2: "Excerpts from A Conversation with Gordon Moore: Moore's Law" [intel.com]

It's linked in the summary, it's TFA (1)

grimJester (890090) | more than 3 years ago | (#34757888)

I realize no one reads TFA; not even the submitter did. I don't know which is more absurd, the submitter claiming the above is not in TFA or someone ignoring TFA, reading the wiki page, then linking TFA as the source of Moore's Law without realizing it's TFA!

Head explodes

fox news at work here (0)

Anonymous Coward | more than 3 years ago | (#34757560)

So an article on Fox News concludes that it doesn't really matter if it's true so long as people think it's ok by consensus. Par for the course.

Re:fox news at work here (1)

houghi (78078) | more than 3 years ago | (#34757736)

it doesn't really matter if it's true so long as people think it's ok by consensus

Is that Fox's new motto?

Re:fox news at work here (1)

Elros (735454) | more than 3 years ago | (#34757780)

Sorry, I thought that was NPR.

Re:fox news at work here (0)

Anonymous Coward | more than 3 years ago | (#34758380)

You thought not only wrong, but backwards.

Transistor count doubling every 2 years. (1)

RightSaidFred99 (874576) | more than 3 years ago | (#34757564)

There's no real question about what "Moore's Law" is, despite what the article would suggest. Originally, Moore said transistor count would double every year. Then, in 1975, he revised it to every two years.

It's obviously not a scientific law, but it is grounded in how circuit manufacturing processes evolve, and it has been a good rule of thumb, proving as accurate as can be expected while we continue to make chips in basically the same ways.

It's fairly easy to look this up [intel.com] , there's no need for a lame mainstream media article link.
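As a sanity check on the two-year doubling, here's a rough projection starting from the Intel 4004 (about 2,300 transistors in 1971 — the usual textbook reference point, not a figure from the article):

```python
# Projecting transistor counts under "doubling every two years",
# starting from the Intel 4004 (~2,300 transistors, 1971).
def projected_transistors(year, base_year=1971, base_count=2300,
                          doubling_years=2):
    doublings = (year - base_year) / doubling_years
    return base_count * 2 ** doublings

# 2010: ~1.7 billion -- the same order of magnitude as the
# ~1-billion-transistor chip in the summary, which is about as
# close as a 39-year extrapolation can be expected to get.
print(int(projected_transistors(2010)))
```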

/. doing its part (2)

deapbluesea (1842210) | more than 3 years ago | (#34757624)

It's also been so frequently misused that Halfhill was forced to define Moron's Law, which states that "the number of ignorant references to Moore's Law doubles every 12 months."

There are only 13 posts so far, and yet /. is still on track to meet this law. Great job everyone.

Ask a vague question, get a vague answer. (5, Insightful)

JustinOpinion (1246824) | more than 3 years ago | (#34757636)

Well, the problem here is that the question "Does Moore's Law Hold True?" is not very precise. It's easy to show both that the law doesn't hold and that it is still being followed today, depending on how tight your definitions are.

If you extrapolate from the date that Moore first made the prediction, using the transistor counts of the day and a particular scaling exponent ("doubling every two years"), then the extrapolated line, today, will not exactly match current transistor counts. So it fails.

But if you use the "Law" in its most general form, which is something like "computing power will increase exponentially with time," then yes, it's basically true. One of the problems with this, however, is that you can draw a straight line through a lot of datasets, and read off a growth exponent, once they're plotted in log-linear fashion. To know whether the data really is following an exponential, you need to do some more careful statistics and decide what you think the error bars are. Again, with sufficiently large error bars, our computing power is certainly increasing exponentially. On the other hand, if you do a careful fit you'll find the growth rate is not constant: it changes across different time periods (corresponding to breakthroughs and the subsequent maturation of technologies, for instance). So claiming that the history of computing fits a single exponent is an approximation, at best.

So you really need to be clear about what question you're asking. If the question is asking whether "Moore's Law" is really an incontrovertible law, then the answer is "no". If the question is whether it's been a pretty good predictor, then the answer is "yes" (depending on what you mean by "pretty good" of course). If the question is "Does industry still use some kind of assumption of exponential scaling in their roadmapping?" the answer is "yes" (just go look at the roadmaps). If the question is "Can this exponential scaling continue forever?" then the answer is "no" (there are fundamental limits to computation). If the question is "When will the microelectronics industry stop being able to deliver new computers with exponentially more power?" then the answer is "I don't know."
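The fitting point can be made concrete with a toy sketch: fit a straight line to log2(transistor count) versus year and read off the implied doubling time. The counts below are rough illustrative figures, not a real dataset, and restricting the fit to different sub-periods would generally give different slopes:

```python
# Toy illustration: estimate an implied doubling time by fitting
# log2(transistor count) vs. year. Counts are rough illustrative
# figures, not an exact dataset.
import math

data = [(1971, 2.3e3), (1978, 2.9e4), (1989, 1.2e6),
        (1999, 2.8e7), (2010, 1.0e9)]

xs = [year for year, _ in data]
ys = [math.log2(count) for _, count in data]

# Ordinary least-squares slope of log2(count) on year.
n = len(data)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))

doubling_time_years = 1 / slope
print(round(doubling_time_years, 2))  # roughly 2 years for this data
```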

It's not the law of gravity (1)

Opportunist (166417) | more than 3 years ago | (#34757688)

It's not a natural law. It's neither a law of physics nor one of biology. Heeding or ignoring it has no real meaning. And, bluntly, I doubt anyone but nerds that double as beancounters really care about Moore's "law".

Computers have to be fast enough to do the tasks they're supposed to solve. Software will grow to make use of it (or waste it on eye candy). Nobody but us cares about the rest.

Wait just one minute here! (3, Funny)

gman003 (1693318) | more than 3 years ago | (#34757788)

What the fthagn is this? A Fox News article on /.? And it's actually accurate, non-politicized reporting on a scientific matter?

Apparently, I have entered the Bizarro World. Or perhaps the Mirror Universe. I can't be dreaming, because I'm not surrounded by hot women in tiny outfits, but something is most definitely WRONG here, and I aim to find out what.

Re:Wait just one minute here! (1)

blueg3 (192743) | more than 3 years ago | (#34757852)

It's neither scientific nor accurate, but other than that, yes.

Re:Wait just one minute here! (2)

kenrblan (1388237) | more than 3 years ago | (#34757958)

This was probably the work of a single reporter who had a New Year's resolution to write a factually correct article without political bias. The writer has fulfilled the terms of the resolution, and will probably resume business as usual tomorrow.

Moore's law is not a law (3, Insightful)

ahodgkinson (662233) | more than 3 years ago | (#34757798)

Moore made an observation that processing power on microprocessor chips would double every 18 months, and later adjusted the observation to be a doubling every two years. There was no explanation of causality.

At best it is a self-fulfilling prophecy, as the 'law' is now used as a standard for judging the industry, which strives to keep up with the predictions.

Re:Moore's law is not a law (0)

Anonymous Coward | more than 3 years ago | (#34758264)

I believe he said that transistor density would double every 18 months (and then two years), not computing power...

Re:Moore's law is not a law (2)

dkleinsc (563838) | more than 3 years ago | (#34758338)

It should be pointed out that the various social observations that have often been termed 'Laws' are not always true. For instance, Parkinson's Law states that work expands to fill the time and resources available. It's usually true. But sometimes, it's not, because it's trying to describe something about a system that nobody's been able to fully explain, specifically how an organization / business / bureaucracy actually functions.

That doesn't make them useless, but it does mean you have to treat them as trends rather than absolutes.

Re:Moore's law is not a law (0)

Anonymous Coward | more than 3 years ago | (#34758464)

It is a statement based on observed patterns that preceded the statement and were extrapolated into the future.

In a nutshell... (0, Troll)

wcrowe (94389) | more than 3 years ago | (#34757826)

...Moore's law is fucking stupid.

There, I said it.

In all seriousness, this is not like some sort of law of physics or something. It is just bloody stupid to keep quoting it all the time.

BTW, I want to add that I don't think Gordon Moore is stupid, only that the myth of this "law" keeps being perpetuated.

Precise time to double (1)

jethr0211 (996156) | more than 3 years ago | (#34757860)

The graph in Moore's article clearly predicts a doubling of the number of components per chip every 13 months: nine "doublings" in 10 years.
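That rate is just the arithmetic of nine doublings spread over ten years:

```python
# Nine doublings over ten years -> implied doubling period in months.
doublings = 9
months = 10 * 12 / doublings
print(round(months, 1))  # ~13.3 months per doubling
```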

Maybe it's My Age... (1)

stewbacca (1033764) | more than 3 years ago | (#34757868)

...but I gave up caring about processor speed about 10 years ago.

magical thinking from FoxNews (1)

ganv (881057) | more than 3 years ago | (#34757926)

That is an amazingly shallow article on FoxNews.com. Maybe they should advertise the source in the summary blurb. They seem to believe that computers will magically keep getting better without any specific hardware improvements being required.

Just make a graph (1)

llZENll (545605) | more than 3 years ago | (#34757938)

I clicked wanting only one thing: a graph with three lines showing Moore's Law, transistor count, and the computing power of each processor.

pdh (0)

Anonymous Coward | more than 3 years ago | (#34757988)

First law of journalism: never write a headline question that can easily be answered with a yes or a no.

Is it true that computing power has exactly doubled for 45 years over any specific length of time? No. End of article.

No moore (1)

cjseealf (1594985) | more than 3 years ago | (#34758182)

Moore's law, more than a technological vision, is a business strategy powered by a technological vision. I just see corporate pigs (no offence to pigs) doing unjustified capitalism. Now we should have other priorities.

you keep using that word 'law' (1)

Tumbleweed (3706) | more than 3 years ago | (#34758246)

None of these are 'laws' that punish you for breaking them. Not Moore's, not Godwin's, etc. They are more 'generalizations' than anything else. Moore's, especially, could be more accurately termed an 'observation,' since that's what was going on at the time he made it. Everyone repeat after me: "Moore's Observation"

There we go.

Even "Moore's Average" would be more accurate.

Not who you ask, but what you ask... (3, Informative)

joeyblades (785896) | more than 3 years ago | (#34758418)

the future founder of chip juggernaut Intel, predicted that computer processing power would double roughly every 18 months. Or maybe he said 12 months

What Gordon Moore actually said was that complexity would double every year. Moore also tied cost into the prediction at the time, but cost doesn't actually scale well, so most people don't include it in modern interpretations of Moore's Law.

For circuit complexity, Moore's Law (with the 18 month amendment) seems to still hold true. However, we are fast approaching some physical limits that may cause the doubling period to increase.

Performance is commonly associated with Moore's Law (as you mention). However, performance is a function of clock speed, architecture, algorithms, and a host of other parameters, and it certainly does not follow Moore's Law... It never really has, even though people still like to think it does... or should...

memristors (0)

Anonymous Coward | more than 3 years ago | (#34758588)

Some people have thought Moore's law was soon to come to a close due to heat restrictions, but with the recent work on memristors they believe chip technology will continue to grow steadily for quite some time. I heard this from a professor of mine at the University of Alberta; take it for what you want.

CharlieMopps's Law (2)

Charliemopps (1157495) | more than 3 years ago | (#34758610)

CharlieMopps's Law(TM): The quantity of articles posted to Slashdot that mention Moore's Law will approximately double every time Intel or AMD come out with a new processor.