
The Gigahertz Race is Back On

Zonk posted more than 7 years ago | from the start-yer-engines-of-commerce dept.

AMD 217

An anonymous reader writes "When CPU manufacturers ran up against the power wall in their designs, they announced that 'the Gigahertz race is over; future products will run at slower clock speeds and gain performance through the use of multiple cores and other techniques that won't improve single-threaded application performance.' Well, it seems that the gigahertz race is back on — a CNET story talks about how AMD has boosted the speed of their new Opterons to 3GHz. Of course, the new chips also consume better than 20% more power than their last batch. 'The 2222 SE, for dual-processor systems, costs $873 in quantities of 1,000, according to the Web site, and the 8222 SE, for systems with four or eight processors, costs $2,149 for quantities of 1,000. For comparison, the 2.8GHz 2220 SE and 8220 SE cost $698 and $1,514 in that quantity. AMD spokesman Phil Hughes confirmed that the company has begun shipping the new chips. The company will officially launch the products Monday, he said.'"


More Power for What? (5, Funny)

pipingguy (566974) | more than 7 years ago | (#18822585)

This reminds me of the sign at the local breakfast shop (paraphrased): "Use coffee: do stupid things faster".

Yeah, this is cool, no doubt. How many users actually *use* how much power they already have? I use a lot, but it's mostly dependent on the graphics card.

Re:More Power for What? (2, Insightful)

Jartan (219704) | more than 7 years ago | (#18822669)

How many users actually *use* how much power they already have? I use a lot, but it's mostly dependent on the graphics card.


You're correct that people don't need this much power for their desktops but there are still plenty of uses for more speed in servers and for certain other applications.

Re:More Power for What? (5, Insightful)

Name Anonymous (850635) | more than 7 years ago | (#18822723)

How many users actually *use* how much power they already have? I use a lot, but it's mostly dependent on the graphics card.

You're correct that people don't need this much power for their desktops but there are still plenty of uses for more speed in servers and for certain other applications.
Actually I think the correct phrase is "most people don't need...", and even that may be inaccurate. Someone who does heavy video work can certainly chew up a lot of processing power. Heavy image work can use a lot of processing power in bursts.

Then there is the big fact that programmers these days are sloppy and waste resources. A machine that is faster than one needs today will only be adequate in 2 or 3 years, given upgrades to all the programs. (Am I being cynical? Maybe, but then again, maybe not.)

Re:More Power for What? (0)

pipingguy (566974) | more than 7 years ago | (#18822771)

Heavy image work can use a lot of processing power in bursts.

Now that you mention that, I see your point. I remember having multiple PaintShopPro files open at the same time and hearing the hard drives churning.

Re:More Power for What? (3, Informative)

kv9 (697238) | more than 7 years ago | (#18822901)

I remember having multiple PaintShopPro files open at the same time and hearing the hard drives churning.

perhaps you need more memory? a bajillion gigawurtz won't help.

Re:More Power for What? (3, Funny)

pipingguy (566974) | more than 7 years ago | (#18822957)

2GB of RAM ought to be enough for everyone.

Re:More Power for What? (1)

gameforge (965493) | more than 7 years ago | (#18823489)

I'd pop open a system monitor to be sure. Incessant disk activity is rarely provoked by simply having numerous windows open unless you're out of RAM. Maybe your Windows... has amnesia?

Re:More Power for What? (1, Insightful)

Anonymous Coward | more than 7 years ago | (#18822801)

Just thought I'd remind you all that the article mentions Opteron processors for a server platform.

Servers still do need more power. With virtualisation software allowing dozens of people to share one server to replace their desktop apps, we still have a long way to go.

This is only the beginning.

Re:More Power for What? (2, Insightful)

morcego (260031) | more than 7 years ago | (#18823905)

Then there is the big fact that programmers these days are sloppy and waste resources. A machine that is faster than one needs today will only be adequate in 2 or 3 years, given upgrades to all the programs. (Am I being cynical? Maybe, but then again, maybe not.)


You know, that is something that really pisses me off.

Yes, I know that many times it is not the programmers' fault, and they have to be sloppy to be able to meet that stupid deadline. But c'mon. Take a look at the system resources something like Beryl uses. Then take a look at what Vista's crappy 3D GUI uses.
And I'm pretty sure Beryl could be even more efficient (tho I'm not sure if it would be worth the effort).

2GB to RUN an OS? Tons of processing power (both CPU and GPU) to run a simple game? I can understand beefy CPU needs for things like video encoding and cryptographic processing, along with a few other things. But most apps simply SHOULD NOT need that many resources. IT IS JUST PLAIN STUPID.

Re:More Power for What? (1)

LiquidCoooled (634315) | more than 7 years ago | (#18822671)

Recently I have been forced to use a whole lot more just to run stupid Visual Studio .NET.
Even if I gave that thing its own cryogenically cooled 6000-core super processor, it would still feel slow.

Re:More Power for What? (4, Insightful)

Dwedit (232252) | more than 7 years ago | (#18822681)

One word: Flash.
Flash is ridiculously inefficient, and requires an extremely beefy machine to render real-time full-screen animation.

Re:More Power for What? (1)

bumptehjambox (886036) | more than 7 years ago | (#18822947)

Very correct. I'm playing fl0w [usc.edu] on an overclocked Opteron 148 and it bogs down pretty hard sometimes. I think more power is great; give me more MIPS and more memory bandwidth please (mostly for my 3D games.) It really is impossible to have enough, and contrary to what seem to be many people's beliefs, there will always be opportunities to use more power. I don't see why so many slashdotters hate on more powerful desktop processors. Come on, more power, MORE POWER... (those who remember the show "Home Improvement" know the sound that follows)

Re:More Power for What? (1)

AaronLawrence (600990) | more than 7 years ago | (#18823635)

Hm, nice game, but yeah, the performance is horrendous. All that CPU just to draw a few lines and circles in 2D. Once there's a little bit more going on in the game, it drops down to quite low framerates. Perhaps it's doing software transparency as well...

Re:More Power for What? (4, Informative)

nyctopterus (717502) | more than 7 years ago | (#18822699)

Anyone doing graphics, even hobbyists. Editing home movies with effects, for example, can use an almost unlimited amount of resources. As an artist working with a graphics tablet on large files in Photoshop, and with complex vector graphics, processors are nowhere near fast enough. I want everything now. I don't want to wait for the screen to redraw. I don't want to have to wait for filters. Bollocks to that. Give me a 64-core 24GHz machine and I'd find a way to slow it down.

Re:More Power for What? (1)

pipingguy (566974) | more than 7 years ago | (#18822747)

I do graphics, too, but maybe not what you do. How much of your wait time is video card-dependent? Do you know?

For engineering work, a CAD-dedicated card with 64MB blows away a 256MB consumer card quite easily based on my experience.

Of course, I'm talking about 3D performance, which you might not need. There are $4000 cards out there; who could possibly need to spend that much money?

Re:More Power for What? (1)

nyctopterus (717502) | more than 7 years ago | (#18822783)

I'm mostly 2D, though I do sometimes composite in 3D in After Effects (another thing it would be good to have like 2000x the power for). I don't know how much is graphics card-dependent. In Photoshop, I think not a lot. Vector graphics, maybe.

Re:More Power for What? (3, Informative)

muridae (966931) | more than 7 years ago | (#18823167)

Yeah, it really depends on what type of graphics work you do. Some ray-tracing might make more use of the FPU when rendering, but while you are modeling the object it uses the graphics card. CAD tends to have more trouble displaying the work in real time, so those specialty graphics cards work wonders. Something like After Effects might have to keep lots of frames of video in memory to apply an effect, and could be bottlenecked by the CPU or memory or even the bus between the two.

The only part of a computer that I can not picture graphics work using would be the sound card, and that's only because I haven't heard of anyone crazy enough to try it yet.

Re:More Power for What? (1)

drsmithy (35869) | more than 7 years ago | (#18823313)

Of course, I'm talking about 3D performance, which you might not need. There are $4000 cards out there, who could possibly need to spend that much money?

Those cards cost $4000 because that's what the market they are sold to will pay, not because they're doing anything a $100 card couldn't do with minor (if any) tweaking.

I get to see this kind of thing first hand in the radiology industry. You can buy "certified" displays for $thousands, or buy off-the-shelf hardware that meets all of the necessary standards for $hundreds.

For another textbook example, look at the storage industry - a "certified" 400G SATA drive for an IBM DS4xxx costs ~US$700, while an identical (except perhaps for some firmware changes) off-the-shelf drive costs ~US$100.

Re:More Power for What? (2, Interesting)

TheRaven64 (641858) | more than 7 years ago | (#18822949)

For simple video editing, disk and RAM are the bottlenecks. When it comes to effects (transitions, blending/warping etc), most of those will run on a one or two generation old GPU as pixel shader programs much faster than on any modern CPU.

The interesting thing about the CPU market now is that most of the workloads that really tax a general purpose CPU (and there aren't a huge number left) are the ones that perform very badly on a general purpose CPU. For home use, something like one of TI's ARM cores with an on-board DSP might well give better performance.

Re:More Power for What? (1)

Falladir (1026636) | more than 7 years ago | (#18823431)

It's not so much "even hobbyists" as "only hobbyists." Well, hobbyists and freelance professionals. For professionals working together, you could save a lot of money by doing all the heavy lifting through a thin client. Thin clients are great for anything that's going to take more than a few seconds, because it doesn't matter that you lose snappiness.

Re:More Power for What? (1)

nyctopterus (717502) | more than 7 years ago | (#18823697)

Err, no. All the design/post-production places I've worked at use 1 machine/1 designer (not to say there isn't a bunch of other arrangements out there). Snappiness is really important in graphics. Understand that I want to be able to do dozens of different operations a minute, some small, some large (as far as computational power is concerned), and I want them all to be snappy. The sort of thing would be, "oops no, rotate it back 5 degrees and try 7, okay that looks good, now let's try it without the blur" -- if that takes any appreciable time at all, the thing is too slow. I don't see how that is going to be solved by a thin client architecture.

Re:More Power for What? (5, Funny)

zaibazu (976612) | more than 7 years ago | (#18822715)

Displaying myspace profiles. The CPU load they produce is astonishing.

Re:More Power for What? (2, Interesting)

onedotzero (926558) | more than 7 years ago | (#18822809)

If my mod points hadn't run out yesterday, this would be +1 Insightful. The CSS hacks (primarily transparent overlays, which aren't handled too gracefully by Opera) and overloaded Flash content put a strain on my CPU (2.6GHz). It's incredible to see a browser struggle with these things.

Re:More Power for What? (2, Insightful)

matt me (850665) | more than 7 years ago | (#18823473)

Displaying myspace profiles. The CPU load they produce is astonishing.
Let me guess: you use Firefox.

Re:More Power for What? (0)

Anonymous Coward | more than 7 years ago | (#18822729)

OMG WTF?! I thought AMD was dead? lol

Re:More Power for What? (1)

Analein (1012793) | more than 7 years ago | (#18822761)

Porn? It's the interwebs, dude!

Re:More Power for What? (1)

suv4x4 (956391) | more than 7 years ago | (#18822845)

Yeah, this is cool, no doubt. How many users actually *use* how much power they already have? I use a lot, but it's mostly dependent on the graphics card.

I assume we're talking casual consumers and not pro users. Well, it's not really up to them; their software will bloat up to take whatever CPU "volume" there is, and then take all of it again in the next version.

At first glance, Photoshop 4 isn't THAT much simpler than Photoshop 10. But it's many times faster for all the basic operations both support. Wonder how that happened, no?

Windows itself now takes seconds to do what used to be a zero-time, instant action in the likes of Windows 95.

Re:More Power for What? (1)

pipingguy (566974) | more than 7 years ago | (#18822921)

AutoCAD is similar, just based on this one user's perception. R14 was very fast, probably the best ever (if we're talking Windows-based). Then Autodesk started to load on stuff and change things. Sure, ACAD fanboys will claim that this old fart failed to keep up, but it's not my job to play catch-up with the latest software fads. I design things -- I'm not a computer operator.

Re:More Power for What? (1)

Random Destruction (866027) | more than 7 years ago | (#18823645)

I dunno, I'd say multiple paper-space layouts is a pretty useful feature. I'd have a much longer list, but it's been over a year since I've used AutoCAD, and much longer since I've used R14.

Re:More Power for What? (2, Informative)

Stormwatch (703920) | more than 7 years ago | (#18822979)

Want to see bloat? Check the latest version of Nero Burning ROM. It installs loads of crap, hoses the whole system and makes it unstable, and takes hundreds of MB on the hard disk. Then some pirate trimmed the crap and made a "lite" version, which works MUCH better.

You'd be surprised (5, Insightful)

Moraelin (679338) | more than 7 years ago | (#18822905)

You'd be surprised how much more _can_ be made with a CPU.

E.g., sure, we like to use the stereotypical old mom as an example of someone who only sends emails to the kids and old friends. Unfortunately it's false. It was true in the 90's, but now digital cameras are everywhere and image manipulation software is very affordable. And so are the computers which can do it. You'd be surprised at the kind of heavy-duty image processing mom does on hundreds of pictures of squirrels and geese and whatever was in the park on that day.

And _video_ processing isn't too far out of reach either. It's a logical next step too: if you're taking pictures, why not short movies? Picture doing the same image processing on some thousands of frames in a movie instead of one still picture.

E.g., software development. Try building a large project on an old 800 MHz slot-A Athlon, with all optimizations on, and then tell me I don't need a faster CPU. Plus, nowadays IDEs aren't just dumb editors with a "compile" option in the menus any more. They compile and cross-reference classes all the time as you type.

E.g., games, since you mention the graphics card. Yeah, ok, at the moment most games are just a glorified graphics engine, and mostly just use the CPU to pump the triangles to the graphics card. Well that's a pretty poor model, and the novelty of graphics alone is wearing off fast.

How about physics? They're just coming into fashion, and fast. Yeah, we make do at the moment with piss-poor approximations, like Oblivion's bump-into-a-table-and-watch-plates-fly-off-supersonic engine. There's no reason we couldn't do better.

How about AI? Already in X2 and X3 (the space sim games) it doesn't only simulate the enemies around you, but also what happens in the sectors where your automated trade or patrol ships are. I want to see that in more games.

Or how about giving good AI to city/empire building games? Tropico already simulated up to 1000 little people in your city, going around their daily lives, making friends, satisfying their needs, etc. Not just doing a dumb loop, like in Pharaoh or Caesar 3, but genuinely trying to solve the problem of satisfying their biggest need at the moment: e.g., if they're hungry, they go buy food (trekking across the whole island if needed), if they're sick, they go to a doctor, etc. I'd like to see more of that, and more complex at that.

Or let's have that in RPGs, for that matter. Oblivion for example made a big fuss about how smart and realistic their AI is... and it wasn't. But the hype it generated does show that people care about that kind of thing. So how about having games with _big_ cities, not just 4-5 houses, but cities with 1000-2000 inhabitants, which are actually smart. Let's have not just a "fame" and "infamy" rating, let's have people who actually have a graph of acquaintances and friends, and actually gradually spread the rumours. (I.e., you're not just the guy with 2 points infamy, but it's a question of which of your bad deeds this particular NPC heard about.) Let's not have omniscient guards that teleport, but actually have witnesses calculate a path and run to inform the guards, and lead them to the crime. Etc.

Or how about procedurally generated content? The idea of creating whole cities, quests and whatnot procedurally isn't a new one, but unfortunately it tends to create boring repetition at the moment. (See Daggerfall or Morrowind.) How about an AI complex enough to generate reasonably interesting stuff. E.g., not just recombine blocks, but come up with a genuinely original fortress from the ground up, based on some constraints. E.g., how about generating whole story arcs? It's not impossible, it's just very hard.

And if you need to ask "why?", let's just say: non-linear stories. Currently if you want, for example, to play a light side and a dark side, someone has to code two different arcs, although most players will only see one or the other. If you add more points and ways you can branch the story (e.g., what if I want to side with the usurper instead of the king of city X?), the number of possibilities explodes exponentially. Literally exponentially. As little as 5 possible branch points and you already have 32 possible outcomes. Someone has to script all that and write the story, which is why you mostly see linear games instead of that. Or games which fake giving you choices, but nothing except the very last choice matters. (E.g., you can flip from dark side to light side right before the final battle, and none of your previous actions matter.) If the computer could automatically adjust, based on your actions, you could have truly non-linear games.
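
To put a number on that explosion, here's a throwaway sketch (Python, purely illustrative -- the branch labels are made up):

    from itertools import product

    def story_outcomes(n_branch_points):
        """Enumerate every distinct path through n independent binary branch points."""
        return list(product(("light", "dark"), repeat=n_branch_points))

    print(len(story_outcomes(5)))   # 32 story variants from just 5 choices
    print(len(story_outcomes(10)))  # 1024 -- hand-scripting each one quickly becomes hopeless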

As an added bonus, how about fitting that content to the character I'm currently playing, like a human GM would? Most games seem to assume I'll want to play a damage dealer. What if I wanted to play a healer? Will I be boned at the final one-on-one boss battle? Why can't the computer adapt it?

Etc.

So, as you can see, there's still _plenty_ of stuff we could use a CPU for. If we only had fast enough CPUs and the tools/algorithms.

Re:You'd be surprised (1)

pipingguy (566974) | more than 7 years ago | (#18822971)

I'm glad you mentioned Caesar. [codecad.com] Yeah, I know that's not what you were referring to.

Re:You'd be surprised (4, Funny)

toejam316 (1000986) | more than 7 years ago | (#18823047)

On your Oblivion point: it DOES have incredibly smart game AI. It's just that the sheer number-to-performance ratio means they have to be stupid. On a good computer (can't remember the specs) they set up 2 NPCs and tasked one to rake the lawn and another to clip the hedges. They did so; then they swapped the items needed, so the clipping person had a rake and the raking person had clippers. The raker ended up killing the clipper for the rake. Another example of Oblivion's AI: in a test, before you could make it to a quest guy who sold Skooma (a drug), the Skooma addicts had already killed him, ruining the game plot by not allowing you to progress. Seem dumb to you? Oh yeah, I'm not sure if the second example is actually what happened in-game with the AI beefed up, or just in a section with few NPCs.

Re:You'd be surprised (1)

MMC Monster (602931) | more than 7 years ago | (#18823081)

Getting back to photo manipulation and the hypothetical grandmom: what about using face recognition to correctly tag each person in each picture? Or scanning all pictures of a person and (based on the composite) getting rid of any redeye automatically, with the proper iris color? How about when a picture is taken, automatically taking 1000 photos within a second and using them to create a composite so there is no blurring or motion artifact?

Lots of things possible when the horsepower is there.

Re:You'd be surprised (1)

harry666t (1062422) | more than 7 years ago | (#18823107)

"Or how about procedurally generated content? The idea of creating whole cities, quests and whatnot procedurally isn't a new one, but unfortunately it tends to create boring repetition at the moment.(...)"

Do you *really* need a 666 GHz CPU with 1337 MB of cache to do that? I think it's a matter of creating such a "story engine", not of insufficient processor power.

BTW, why has nobody created one earlier? That would be cool stuff.

waitwaitwait, AFAIR Heroes 1 or 2 had automatically generated random maps, and playing them was actually fun. HOMM ftw.

Re:You'd be surprised (5, Funny)

cerberusss (660701) | more than 7 years ago | (#18823171)

nowadays IDEs aren't just dumb editors with a "compile" option in the menus any more. They compile and cross-reference classes all the time as you type.
I program in C, you insensitive clod!

Re:You'd be surprised (1)

smallfries (601545) | more than 7 years ago | (#18823309)

Thank you. That was a very rare post on Slashdot, with some genuine insight. You've nailed exactly what disappoints me in Oblivion. It is certainly a step up from Morrowind (pretty graphics, fixing the obvious game mechanics holes) but... it's not a very large step. When I walk into the Imperial City of Tamriel, which is so imposing I can see it from everywhere in the surrounding valley, it looks like a sparse little village with one or two people dotted around. The game sets up expectations but then it dashes them.

The new outdoors look with vegetation and trees looks awesome... at about 5fps. Turning the vegetation off gets an acceptable framerate, but the limitation here is not the grunt on the graphics card. It's the CPU not doing enough LOD calculation and billboard generation. This is exactly where adding some power to the brains (CPU) of the system would improve the result of applying the brawn (GPU).

The game that I really want to see somebody write is quite like you describe, it's Nethack with the Oblivion game engine. It's starting each game in a random city in a generated world that is consistent, and has a generated back story. Oh well, it's going to be a wet dream for a few more years at least... back to my thieves guild initiation test...

Re:More Power for What? (1)

eebra82 (907996) | more than 7 years ago | (#18823175)

How many users actually *use* how much power they already have?

I disagree with your thinking. It is not about using 100% of the 'power'. It is about how much power '100%' actually is. We all hit 100% during intensive operations, just for less time when the processor is more powerful.

Re:More Power for What? (1)

eMbry00s (952989) | more than 7 years ago | (#18823445)

Oh fie, now you gave me such a huge craving for coffee. ARGH.

Re:More Power for What? (1)

ocbwilg (259828) | more than 7 years ago | (#18823479)

Yeah, this is cool, no doubt. How many users actually *use* how much power they already have? I use a lot, but it's mostly dependent on the graphics card.

This article is discussing 2-way, 4-way, and 8-way Opteron CPUs for servers. I don't know about you, but with all the virtualization going on nowadays, more computing power in the same size box is a good thing. We can use all the power we can get.

Re:More Power for What? (0)

Anonymous Coward | more than 7 years ago | (#18823593)

Fractals. Those use all the power of whatever machine you do them on, and the faster and more processors you have, the faster you can render something.

Re:More Power for What? (1)

Dun Malg (230075) | more than 7 years ago | (#18823753)

Yeah, this is cool, no doubt. How many users actually *use* how much power they already have? I use a lot, but it's mostly dependent on the graphics card.
Time and again, Intel and AMD come out with new, faster processors, and every time some bozo like you feels the need to say the same stupid thing: "who needs this much computing power on the desktop?" Time and again, someone like me has to post the same damn reply:

You are not the market for this. The desktop is not the market for this. Games are not the end-all be-all of high-intensity computation. In a more general sense, progress just fucking progresses. Are you saying AMD and Intel should just market their 2GHz parts forever, as that's plenty for everyone?

Seriously, this is equivalent to someone developing a 200mph freight train, and you coming along questioning its value, as the only freight you ever move is a trunk full of groceries 1 mile from the store to home, and your car is more than sufficient.

Oh come on (4, Insightful)

Anonymous Coward | more than 7 years ago | (#18822605)

No sane person actually believed that the gigahertz race was over. But who cares about it anyway; it's just more power for slightly faster operation.

I muchly prefer a fanless processor.

Re:Oh come on (1, Funny)

Anonymous Coward | more than 7 years ago | (#18822879)

I muchly prefer a fanless processor.


So you're a fan of a fanless processor?

Re:Oh come on (1)

timeOday (582209) | more than 7 years ago | (#18822885)

The P4 hit 3 GHz, what, 4 years ago? For Opteron to hit 3GHz only now is just proof of how badly the quest for GHz has atrophied.

Had the wall not been hit, and GHz continued to increase as in the 90s, we'd be up to something like 20 GHz by now. So the truth of this story is the exact opposite of "The Gigahertz Race is Back On." RAM and HDD capacity and price have been relatively stagnant for the last few years, too. The only thing still growing by leaps and bounds is flash memory.

Re:Oh come on (4, Insightful)

Anonymous Coward | more than 7 years ago | (#18822975)

The P4 hit 3 GHz, what, 4 years ago? For Opteron to hit 3GHz only now

To understand how this is not a sign of slacking off by the chip designers, you have to understand that the P4 was able to run at high clock speeds only because it was designed to use a very long pipeline of small functional units. This design has proven to be inefficient because it causes too many pipeline stalls and because it requires a higher clock speed and higher power consumption to achieve the same performance. The more complicated functional units of chips with shorter pipelines cannot be clocked as fast, but they perform better at the achievable clock rates than the P4 did at higher clock rates. The last Gigahertz race was ended by a shift of architecture, not by "hitting a wall". Then came multicore designs, which further reduced the need and opportunity for higher clock rates (heat dissipation is somewhat of a "wall"). All this caused clock rates to grow much slower. Now that chip designers have found ways to control power consumption, increasing the clock rate is viable again, so the race is back on.
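
A toy model makes the tradeoff concrete. This is just a sketch with made-up numbers (not actual P4 or Opteron figures), charging a full pipeline flush for every mispredicted branch:

    # Rough model: effective instruction throughput for a deep, fast-clocked pipeline
    # versus a shallower, slower-clocked one. All figures below are illustrative only.
    def throughput(clock_ghz, base_ipc, pipeline_depth, mispredict_rate):
        """Instructions per second, with a flush of ~pipeline_depth cycles per mispredict."""
        cycles_per_instr = 1.0 / base_ipc + mispredict_rate * pipeline_depth
        return clock_ghz * 1e9 / cycles_per_instr

    deep    = throughput(clock_ghz=3.8, base_ipc=1.0, pipeline_depth=31, mispredict_rate=0.02)
    shallow = throughput(clock_ghz=2.6, base_ipc=1.3, pipeline_depth=12, mispredict_rate=0.02)
    print(f"deep pipeline:    {deep / 1e9:.2f} G instructions/s")
    print(f"shallow pipeline: {shallow / 1e9:.2f} G instructions/s")

With these (hypothetical) numbers, the shallower design wins despite the lower clock, which is essentially the architecture shift described above.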

Re:Oh come on (1)

timeOday (582209) | more than 7 years ago | (#18823147)

you have to understand that the P4 was able to run at high clock speeds only because it was designed to use a very long pipeline of small functional units... The last Gigahertz race was ended by a shift of architecture, not by "hitting a wall".
The NetBurst architecture was nothing but Intel's response to hitting the MHz wall. Intel wanted to continue ramping up MHz, which in the past had corresponded very well with overall performance and was thus important to consumers. But because they were starting to hit the wall, they couldn't pull off extra MHz without compromising the amount of work per cycle. Worst of all, the P4 didn't move the MHz wall back very far -- in production it never got much past 3.6 GHz or so.

The MHz wall is still standing strong. The fastest CPUs now have that same clock speed (±30%, in contrast to the 10,000% MHz increase of the previous decades).

Re:Oh come on (1, Insightful)

Anonymous Coward | more than 7 years ago | (#18823291)

You're seeing a standstill where things simply improved in a different way. It was easier and cheaper to improve the logical design than to further speed up the clock. That doesn't mean that clock speed increases are impossible. They have just not been the best choice economically for a while.

There are technological constraints to simply ramping up the clock rate, like the size of the clock domain, so the chip designer has to change the architecture to enable higher clock speeds. The Netburst architecture was one such design change which was exceptionally successful in enabling higher clock speeds. Unfortunately it wasn't as successful in the performance/watt department, so it hit the heat wall. The chip architecture determines the number of switching operations per clock cycle. In the absence of technology which reduces the power consumption per switching operation/transistor, the only way to reduce power consumption (and heat dissipation requirements) is to optimize the architecture. It is no coincidence that the last years have been dominated by power efficiency research (on the electrical level), because that is the wall that stands in the way of higher clock speeds right now.

Intel's next dual core CPUs will "overclock" if one core is idle. That's just a fancy way of admitting that they could run much faster if they could get rid of the resulting heat. The silicon is not at a gigahertz limit: the chip simply becomes too hot. Every improvement in that department is a step towards higher clock rates, not towards lower power chips.

misleading (1)

jasonhamilton (673330) | more than 7 years ago | (#18823071)

Having a really long pipeline with a high clock speed doesn't make your computer faster. It sounds better in marketing terms though -- which is why Intel eventually fell flat on its face when consumers found out the P4s were crap. If Intel's high-clockspeed CPUs also performed, we'd still see a MHz war, because Intel would still be focusing on it in their marketing. Right now it's just on the backburner.

Re:Oh come on (1)

dreamchaser (49529) | more than 7 years ago | (#18822977)

What I find interesting is the role reversal. Intel in its current generations is focused more on improving IPC. AMD seems to have hit the IPC wall and is instead focusing on clock speed increases. It's a reverse of what the situation was a few years ago, with AMD touting how much more elegant its architecture was, with its higher IPC than NetBurst, and Intel pushing high clock speed as its answer.

Re:Oh come on (1)

drerwk (695572) | more than 7 years ago | (#18823317)

It is over in the sense that I do not expect to see a 30GHz processor any time soon. Not too long ago I had to account for the fact that if I did not get my product out in nine months, the target computer would be twice as fast. In 1977 I had a 1MHz 8-bit PC. In 1990 I had a 25MHz 32-bit CPU. 1996, 200MHz; in 2002, 2GHz. Do you think we will see a factor of 10, to 30GHz, by 2011? Sorry. But I do expect the number of cores to double every 18 months. I expect an 80-core CPU in 5 years. I'm not sure that the software development tools will keep pace. I doubt that I'll be able to use the 80 cores unless it is through a library call.

Maybe there is still a point for bragging rights, but the physics says there is little point in trying for 30GHz for a CPU. And Intel is going with the physics, I assume AMD uses the same physics.

Re:Oh come on (0)

Anonymous Coward | more than 7 years ago | (#18823397)

I have a 75MHz Pentium I I'll send you. Give me a day to strip it; I've got a Postgres DB on it.

Re:Oh come on (1)

kestasjk (933987) | more than 7 years ago | (#18823483)

Today AMD reported huge losses due to increasing competition against rival Intel.

In other news, AMD has started to release overclocked processors which increase the speed at the expense of power consumption, but with no R&D cost at all. I now totally cannot remember what the first news piece was.

Gigahertz a bad thing? (5, Funny)

jibjibjib (889679) | more than 7 years ago | (#18822657)

I was kind of hoping the gigahertz race would end so Microsoft would have to stop making each version of Windows slower than the last.

Re:Gigahertz a bad thing? (-1, Troll)

EveryNickIsTaken (1054794) | more than 7 years ago | (#18822709)

Ooooooooooooooo, BURN. Cry me a fucking river.

Re:Gigahertz a bad thing? (3, Insightful)

TeknoHog (164938) | more than 7 years ago | (#18822751)

I was kind of hoping the gigahertz race would end so Microsoft would have to stop making each version of Windows slower than the last.

You're missing the whole point. CPU performance is increasing all the time, which allows Microsoft to continue making everything slower. However, the GHz race had little to do with performance; Intel pushed their Pentium 4 closer to 4 GHz, even though it performed worse than many competing CPUs running between 2 and 3 GHz. They probably did it because most consumers would only look at raw GHz instead of performance.

Re:Gigahertz a bad thing? (1)

pipingguy (566974) | more than 7 years ago | (#18822823)

CPU performance is increasing all the time, which allows Microsoft to continue making everything slower.

A-ha! Finally the truth comes out! The massive, worldwide adoption of computers is *actually* a global job creation program!

No, really, think about it.

Re:Gigahertz a bad thing? (1)

maxume (22995) | more than 7 years ago | (#18823271)

You can say that about anything that isn't strictly a necessity. Personally, I like having a machine wash my clothes and keep my food fresh.

Consumers don't care much any more (1)

zaibazu (976612) | more than 7 years ago | (#18822691)

Remember when the MHz number was the most important part of PC advertising? It did make a difference in the 90s, but now people just grab a box and it will be fast enough for their office work anyway.

Re:Consumers don't care much any more (1)

Jugalator (259273) | more than 7 years ago | (#18822927)

Yes, you're right, but for respectable gaming, you must have Windows Vista for the DirectX 10, and an SLI'd GeForce 8800GTX with drivers that Lucifer himself designed!

No, You Are Wrong (1)

mfh (56) | more than 7 years ago | (#18822697)

The common misconception that you should measure a CPU's power by its GHz is one we really need to put to rest forever. Clock speed is a bad deciding factor for going with a particular processor, or not. Differences between chips will never be captured by a single rating, so only fools go by chip ratings, IMHO. Customers should read as much as they can, look at final performance, and make a decision of whether or not to buy.

I would place a much higher reliability factor upon balancing the chip manufacturer's position in the market, their support and services, their overall reputation, their evil factor, and the overall performance of computers that rely on the chip.

A customer should decide on what is really important to them and it should really involve a lot more than which horse looks faster. Sometimes even the slowest damn horse wins the race because of Murphy's Law, which we all know applies to computers.

Re:No, You Are Wrong (1)

mcbiondi (879388) | more than 7 years ago | (#18822735)

When you compare a 3.0GHz Opteron to a 2.6GHz Opteron, you kinda know without doing much research that the 3.0GHz chip will be faster. When you are using the Opteron for industrial-level computations, these numbers can make a huge difference.

Re:No, You Are Wrong (1)

catxk (1086945) | more than 7 years ago | (#18822737)

Then again, in the same product line, GHz (and perhaps cache amount) is often the only thing that can decide how fast a chip is compared to another. This applies more broadly but to a lesser extent to chips that are in the same generation, albeit not the same product line, for example various Athlon XP incarnations.

Maybe, and just maybe, the trend we see is: create a new generation of chips, push them to their limit in terms of frequency, then when you hit the power consumption roof, come up with a new set of chips which are more power efficient, just to eventually push them through the roof as well. It happened with the Pentium 4, and it looks like something similar is now happening to the Opteron. Maybe.

Exactly (1)

mfh (56) | more than 7 years ago | (#18822829)

Then again, in the same product line, GHz (and perhaps cache amount) is often the only thing that can decide how fast a chip is compared to another.

You are using the GHz info as correctly as possible. There are other mitigating factors that are too ubiquitous for most to comprehend but at the end of the day, you probably won't notice them playing WOW.

In many cases, the true difference is here [google.ca] and here [google.ca]: to confirm, I'm not talking about the overall rates of stock trading, but rather the graph as Google displays what the stocks are doing. Typically, the more profitable a company is, the better their product offerings are overall. It's at least a good factor to consider, IMHO.

Re:Exactly (1)

maxume (22995) | more than 7 years ago | (#18823315)

The graphs in your links have essentially no relationship with each other (as they are much more reflective of that particular day of trading than they are of the companies in general). Here:

http://finance.yahoo.com/q/bc?t=5y&s=INTC&l=on&z=m&q=l&c=amd [yahoo.com]

or here:

http://finance.yahoo.com/q/bc?s=INTC&t=my&l=on&z=m&q=l&c=amd [yahoo.com]

Gives you a better picture of the relative performance of the companies (Intel spanks AMD over the very long term, but the last five years are more or less a wash).

Here:

http://finance.yahoo.com/q/ks?s=INTC [yahoo.com]
http://finance.yahoo.com/q/ks?s=amd [yahoo.com]

Gives a clear picture of the more immediate health of both companies. AMD is faltering badly; Intel is humming right along.

Re:No, You Are Wrong (0)

Anonymous Coward | more than 7 years ago | (#18822939)

I've found, for my needs and applications of CPUs, that cache size plus RAM is the deciding factor in performance, not just the bigger-is-better advertising sound bite.

Re:No, You Are Wrong (1)

ocbwilg (259828) | more than 7 years ago | (#18823521)

The common misconception that you should measure a CPU's power by its GHz is one we really need to put to rest forever. Clock speed is a bad deciding factor for going with a particular processor, or not. Differences between chips will never be captured by a single rating, so only fools go by chip ratings, IMHO. Customers should read as much as they can, look at final performance, and make a decision of whether or not to buy.

They should, but we're talking about consumers here, so they won't. Let's face it, most of them can barely figure out how to make the mouse go in Windows, let alone understand the differences in CPU architecture and their relative merits with regards to CPU performance. So because the majority of computer buyers and users don't know what they're buying, people go with the "bigger number = better computer" theory. And why shouldn't they? When we're comparing the amount of memory, hard disk space, or monitor size, bigger tends to be better. And when it comes to pricing, most people think that the more expensive computer (or parts) is better. And since a faster CPU (or PC with a faster CPU) is priced higher than a slower CPU (or PC with a slower CPU), most people would assume that the slower one is inferior in some way.

I would place a much higher reliability factor upon balancing the chip manufacturer's position in the market, their support and services, their overall reputation, their evil factor, and the overall performance of computers that rely on the chip.

How often have you had to call on a CPU manufacturer for support? I've been using and building computers since the early 80's, and I never once had to go to a CPU manufacturer for support.

NO NO NO NO NO (1)

JackMeyhoff (1070484) | more than 7 years ago | (#18822759)

I want LOWER POWER CONSUMPTION. This is done by scaling out with more cores. My laptop is going to MELT for fclucks sakes. This is why I buy Intel now instead of AMD for my mobile needs. They have BETTER power "management". Core wars please, not speed wars. SMARTER DESIGNS not BRUTE FORCE.

Re:NO NO NO NO NO (5, Insightful)

Soul-Burn666 (574119) | more than 7 years ago | (#18822819)

So take a top-end 3GHz model, underclock it, and reduce its voltage. You still get good performance, with lower power consumption.
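
Dynamic power scales roughly with C*V^2*f, so a modest underclock plus undervolt buys a lot. A back-of-the-envelope sketch (the voltages and clocks below are hypothetical, and static leakage is ignored):

    # Relative dynamic power, P ~ C * V^2 * f. Numbers are illustrative, not vendor specs.
    def relative_power(freq_ratio, voltage_ratio):
        """Dynamic power relative to stock for scaled frequency and core voltage."""
        return freq_ratio * voltage_ratio ** 2

    stock        = relative_power(1.0, 1.0)                 # e.g. 3.0 GHz at stock vcore
    underclocked = relative_power(2.4 / 3.0, 1.20 / 1.35)   # hypothetical 2.4 GHz at reduced vcore
    print(f"underclocked chip draws ~{underclocked / stock:.0%} of stock dynamic power")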

Re:NO NO NO NO NO (1)

embsysdev (719482) | more than 7 years ago | (#18823293)

Why was this modded "Funny"? It's essentially what AMD PowerNow does and it's how I was able to re-purpose an old PC as a silent router by under-clocking it and removing the fans from the CPU and PS.

Then again... (1)

xerent_sweden (1010825) | more than 7 years ago | (#18822827)

...software must be written to make use of all those cores. Flushing low-level cache all the time when moving a calculation from one core to another really slows things down unless the software really is optimized. So your wallet would be much better off with a single or maybe dual core anyhow. Like the new 8-core Mac Pro, which didn't show much improvement at all over the quad-core model. Of course, in the future, software will be written for many cores, but today isn't tomorrow just yet.

Re:Then again... (0, Flamebait)

JackMeyhoff (1070484) | more than 7 years ago | (#18823061)

Use the correct implementation and design tools. If you stick to C++, then, well, good luck with that in the future. LOL

Re:Then again... (0)

Anonymous Coward | more than 7 years ago | (#18823467)

Geez.. what kind of fucking retarded astroturf-troll are you..

Re:Then again... (1)

JackMeyhoff (1070484) | more than 7 years ago | (#18823561)

and what a fucking retarded blue collar worker you really are :)

Re:NO no NO no NO (2, Insightful)

Cinnamon Whirl (979637) | more than 7 years ago | (#18822855)

Don't do THAT please. This ISN'T a comic BOOK. ;)

Re:NO NO NO NO NO (1)

Ramble (940291) | more than 7 years ago | (#18823077)

So your laptop has 3GHz Opterons inside it?

Don't be so stupid, one of the reasons any semiconductor company raises clocks is to see if they can run something faster at the same voltage. This inevitably leads to faster processors at lower wattages or lower wattages overall.

Re:NO NO NO NO NO (1)

JackMeyhoff (1070484) | more than 7 years ago | (#18823117)

No, but the problem is "speed wars" in GPUs, and that's going to ramp up for the GPGPU battlefield that's coming. My laptops cost me about 6000 USD and they are top end, so yeah, I have "powerful" laptops. It's unfortunate that a workstation, mobile or stationary, takes more power than my fridge freezer.

Re:NO NO NO NO NO (1)

Ramble (940291) | more than 7 years ago | (#18823199)

With GPUs you don't have the large changes in the process that CPUs have but they are getting more efficient. The problem is that people think efficiency means low power; it doesn't. GPUs are having quantum leaps in speed while CPUs haven't seen that, so they're just using up more and more power.

Re:NO NO NO NO NO (0, Troll)

JackMeyhoff (1070484) | more than 7 years ago | (#18823379)

Explain to me why my 6000 USD mobile workstation is sucking 200 watts when I expect BETTER DESIGN that uses less power, especially when we are trying to REDUCE power usage. I can tell you why: BAD DESIGNERS who are all resting on their laurels, saying it's not a blue collar job and it's a "skilled job". Well, go show me some skill then: give me powerful mobile workstations that use LESS POWER, or talk to the hand.

Re:NO NO NO NO NO (2, Funny)

trentblase (717954) | more than 7 years ago | (#18823889)

Yeah, and while we're complaining, my $1.5 million Bugatti Veyron gets under 3mpg at full throttle when I expect BETTER DESIGN that uses LESS GAS when we're trying to REDUCE fossil fuel consumption. I can tell you why, BAD DESIGNERS who need to get off their asses and give me MORE HORSEPOWER with LESS FUEL or talk to the hand.

Re:NO NO NO NO NO (1)

JackMeyhoff (1070484) | more than 7 years ago | (#18823909)

Yeah those Ford engines sure do suck.

Re:NO NO NO NO NO (5, Informative)

level_headed_midwest (888889) | more than 7 years ago | (#18823647)

Low thermal dissipation is a much more prevalent theme for AMD than it is for Intel, especially outside of the notebook sector. Yes, Intel has some 1.06 and 1.20 GHz Core 2 Duo ULVs for laptops and those have a 10-watt or so thermal dissipation while AMD's lowest-TDP mobile chips rate in at 25 watts (Turion 64 MT/Sempron.) Intel also has the single-core Core Solo series at 1.06-1.33 GHz that dissipates 5.5 watts. However, those chips are very rarely seen in any notebooks larger than a 12" screen size. You'd be much more likely to see a 31-watt Core Duo or 34-watt Core 2 Duo sitting in an average laptop than a 10-watt C2D ULV. AMD's Turion X2s have similar TDPs, ranging from 31 to 35 watts. All of the processors have similar frequency and voltage scaling mechanisms and battery life is roughly similar.

For desktops, most of Intel's newer Core 2 Duo processors have an average thermal dissipation of 65 watts. The fastest Core 2 Duo, the 2.93 GHz X6800, has a 75-watt average TDP. The quad-core chips range from 105 watts for the 2.40 GHz Q6600 to the 130-watt QX6700. These chips have a very reduced version of the SpeedStep that Intel puts in its laptop chips. The lowest core speed of the 800 MHz FSB chips is 1.20 GHz and 1.6 GHz for the 1066 MHz FSB chips. AMD's current new desktop processors start from a maximum thermal dissipation of 35 watts for the single-core Sempron EE and go up to 45 watts for the Athlon 64 single-cores (Lima), 65 watts for the Athlon 64 X2 models from the 3600+ to the 5200+, 89 watts for the 5400+ and 5600+, and 125 watts for the X2 6000+ and FX-70 series. The AMD chips all clock down to 1 GHz at idle. AMD also rates the chips on their absolute maximum thermal dissipation rather than an average thermal dissipation like Intel does, so a 65 watt AMD chip will usually end up drawing less power than an Intel 65-watt chip. The AMD chips also draw significantly less power at idle due to their lower clock speed.

The scenario is much the same for servers. AMD has their High Efficiency line of dual-core chips that draw 68 watts, the normal line that draws 95 watts, and the SE line that draws 125 watts. Intel has a few low-voltage Xeons, but those are very uncommon and pretty much limited to blade server vendors. AMD sells its Opteron HEs through a wide range of vendors.

What race? (1)

suv4x4 (956391) | more than 7 years ago | (#18822869)

Gigahertz race is back on! AMD increases the clock rate of its chips to 3 GHz! Ok... race is over.

Slow news day? When people announced the GHz race was over, they didn't mean they would only ever decrease the clock rate, did they? Both Intel and AMD still bump the clock rate up on further developments of their models, but we should expect that we'll be seeing chips in the range of 1 GHz - 3.8 GHz and no higher than this.

There's no effin' GHz race.

Rev up, don't shift (1)

Opportunist (166417) | more than 7 years ago | (#18822907)

So again we're pushing down the throttle a hint more instead of shifting into next gear. I mean, ok, I'm not a hardware guru, but could it be that we might get more done with less speed if we managed to get things done more "intelligently" instead of simply "faster"?

How about a bit more than just 8 registers? Maybe a bit more "distributed computing" inside the machine, with more than just outsourcing the graphics to a GPU -- maybe a chip dedicated to memory or interrupt handling? I dunno, personally it feels like we're just adding more heat and trying to clock it faster, rather than trying to find new ways to speed up processing.

Maybe someone with more background in hardware design can enlighten me why the race for more cores and more Hertz. I mean, I can see the marketing aspect (after so many years, people would buy a 3 GHz processor "old style" rather than a "new school" 2GHz at the same price 'cause it "looks" faster), but is there actually a technological reason why we don't even consider looking at other ways to improve our speed?

Re:Rev up, don't shift (4, Informative)

DaleGlass (1068434) | more than 7 years ago | (#18823041)

How about a bit more than just 8 registers?

AMD64 has 16 registers

with more than just outsourcing the graphics to a GPU

AMD seems to be working on putting a GPU in the CPU.

maybe a chip dedicated to memory

Memory used to be managed by a dedicated chip -- the northbridge. But AMD moved it into the CPU because it was faster that way.

or interrupt handling

The APIC? But anyway, the slow part of interrupt handling is done in the OS kernel, which runs on the CPU. So I'm not sure how much a chip would help there.

Maybe someone with more background in hardware design can enlighten me why the race for more cores and more Hertz.


I'm not an expert, but my guess is that it's because computers are all-purpose devices. Specialized hardware can accelerate something like encryption or audio mixing, but there doesn't seem to be all that much of that sort of thing that's still worth accelerating. Most people don't need to encrypt the huge amounts of data that would make a dedicated accelerator make much of a difference. Notice also how almost nobody buys sound cards anymore, because you can just mix sound in software.

Re:Rev up, don't shift (1)

Opportunist (166417) | more than 7 years ago | (#18823709)

Notice also how almost nobody buys sound cards anymore, because you can just mix sound in software.

I guess hardcore gamers and audiophiles would disagree.

Dedicated soundcards have 2 advantages over letting the CPU handle the task: less CPU load (yes, that doesn't matter in normal applications, but since games stress the CPU to the limit anyway, you'll want that task in a dedicated device) and sound quality, which again is often due to "good" sound being actually quite a bit of load on whatever piece of hardware has to handle it. Many sound cards also offer additional quality-enhancing technologies and/or chips, for that matter.

Re:Rev up, don't shift (1)

DaleGlass (1068434) | more than 7 years ago | (#18823865)

Sound quality, maybe. CPU load, I'm not that sure.

Some time ago, Creative intimidated John Carmack into supporting EAX in Doom 3 [slashdot.org] . This is yet another reason why I don't use Creative hardware anymore. The SB Live cards causing disk corruption with VIA boards, drivers being unstable on SMP (this was before dual core) and them taking drivers off their site for some time are some others.

But anyway, why do you think Creative had to basically force John Carmack to support their EAX tech? Apparently because Doom 3, on modern hardware, can mix 5.1 audio faster than the sound card.

Seriously, mixing sounds isn't that big of a deal. I have here a book on game programming that describes the creation of a wolfenstein-3D-like game, and part of it explains how to mix sound in software. This was on a 386 CPU. Besides that, all a sound card gives you is extra reverb/etc effects, and computers don't seem to have much of a problem with doing that in real time anymore either.

Re:Rev up, don't shift (1)

tpwch (748980) | more than 7 years ago | (#18823319)

This is what I would like to see in new computers: chips dedicated to specific common algorithms that software could take advantage of.
This has been done on a small scale, but imagine if every computer had chips that did things like encryption, compression, and encoding/decoding of common movie/music/image formats, etc., faster than any program could do in software, without putting load on the CPU.

There is no limit to how far we could go. Maybe there could be a chip with common image manipulation algorithms for use by various photo editing software, maybe one with a few algorithms commonly used when compiling programs that compilers could take advantage of, random number generators, hashing algorithms, algorithms used by databases to find data fast. Who knows.

It's not going to happen any time soon, but I can still dream.

The market doesn't embrace higher gears (1)

baffled (1034554) | more than 7 years ago | (#18823801)

Ever heard of Itanium [wikipedia.org] ? It has a bit more than 8 registers:

The architecture implements 128 Integer registers, 128 Floating point registers, 64 1-bit predicates, and eight branch registers. The floating point registers are 82 bits long to preserve precision for intermediate results.
It has an Explicitly Parallel Instruction Computing (EPIC) [wikipedia.org] core:

The goal of EPIC was to increase the ability of microprocessors to execute software instructions in parallel, by using the compiler, rather than complex on-die circuitry, to identify and leverage opportunities for parallel execution. This would allow performance to be scaled more rapidly in future processor designs, without resorting to ever-higher clock frequencies, which have since become problematic due to associated power and cooling issues.

Back on? I say not.. (1)

Jugalator (259273) | more than 7 years ago | (#18822915)

"The Gigahertz Race is Back On"

No, I don't think so. It's just that AMD pushed the clock frequency for this CPU, but that works because it was just up to 3 GHz.

Watch me be right when they don't continue to push that generation to clock speeds 3x higher or so like they could in the past.

Misleading info (1)

suv4x4 (956391) | more than 7 years ago | (#18822933)

future products will run at slower clock speeds and gain performance through the use of multiple cores and other techniques that won't improve single-threaded application performance.

This is misleading. No one gave up improving the performance of single-threaded apps.

All new chips are striving to improve the performance of each core by packing more executed instructions per CPU cycle. This is achieved with better branch prediction, concurrent execution of instructions that are in principle serial (this is possible as long as they don't depend on each other), and fewer execution "stages", i.e. a more efficient architecture.

We'll see lots of speed improvements in each separate core, and we'll see it via smarter and smarter architecture that adapts to the code being executed.

The reason we've not seen this before is that it is a very complex task, and upping the GHz seemed easier.

Re:Misleading info (1)

battery111 (620778) | more than 7 years ago | (#18822989)

Am I the only one who finds this story, for lack of a better word, alarmist? I'm sure there is a better word, but I can't think of it at the moment. This article is all about AMD shipping a 3 GHz chip. 3 GHz chips are not exactly breaking news, and while it is a high for AMD, again, not news. Now, were AMD to push the envelope farther than Intel has previously gone, then I would find this story more interesting, but the fact that AMD is only now shipping 3 GHz processors I find quite ho-hum, and perhaps a desperate attempt to improve their financial situation, rather than any kind of real innovation.

Re:Misleading info (1)

suv4x4 (956391) | more than 7 years ago | (#18823105)

Am I the only one who finds this story, for lack of a better word, alarmist? I'm sure there's a better word, but I can't think of it at the moment. This article is all about AMD shipping a 3 GHz chip. 3 GHz chips are not exactly breaking news, and while it's a high for AMD, again, not news. Now, were AMD to push the envelope farther than Intel has previously gone, I would find this story more interesting, but the fact that AMD is only now shipping 3 GHz processors I find quite ho-hum, and perhaps a desperate attempt to improve their financial situation rather than any kind of real innovation.

Right.. right.. Or maybe they were just doing business as usual and a bunch of guys freaked out for no good reason? Don't get caught in the loop.

Set to improve, like everything else (1)

BlueParrot (965239) | more than 7 years ago | (#18823139)

People say they should focus on multiple cores and not push the clock frequency, and whatnot. Thing is, those are not mutually exclusive improvements. The only real problem if you manage to create faster chips is the power consumption, and consequently heat generation. Sure, you have to do some science to get it working in the first place, but is there actually a direct disadvantage to having chips run quicker, other than power consumption? It doesn't stop you from using multiple cores, or even multiple CPUs; heck, you could probably do something cunning where you run 10 minutes on one CPU, then as it heats up you switch over to another for 10 minutes and let the first cool down.

The limit to clock frequency is far from attained; the question is only one of economics. What I can see is a drop in clock frequency as they start stacking circuits in 3D, since it will be harder to get the heat out of them, and then the frequency will scale up again as innovative ways to cool the CPU develop.

The only place I can see the power consumption of a CPU being a big showstopper in itself (i.e. not due to limiting how densely you can pack it, etc.) is in laptops. In a desktop system you can get kilowatts out of the socket, and the CPU itself doesn't produce more heat than a few light bulbs. Basically the only thing keeping the frequency down is that there are currently ways to improve CPUs that have not previously been exploited (multiple cores, etc.). Once those stop giving a speed increase (you only have so many threads, etc.), the clock frequency will become important again.
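
A back-of-envelope sketch (mine, not the poster's) of why higher clocks are "only" a power problem: dynamic CPU power scales roughly as P = C · V² · f, and reaching a higher frequency usually also needs a slightly higher core voltage, so power grows faster than the clock does. The capacitance and voltage figures below are made up for illustration, not measured AMD numbers.

```c
#include <stdio.h>

/* Dynamic power of switching logic: P = C * V^2 * f. */
static double dyn_power(double c_farads, double volts, double hertz) {
    return c_farads * volts * volts * hertz;
}

int main(void) {
    double c = 1e-9;   /* effective switched capacitance, purely illustrative */

    double p_base = dyn_power(c, 1.30, 2.8e9);   /* hypothetical 2.8 GHz part */
    double p_fast = dyn_power(c, 1.35, 3.0e9);   /* hypothetical 3.0 GHz part */

    printf("clock increase: %.1f%%\n", (3.0 / 2.8 - 1.0) * 100.0);       /* ~7% */
    printf("power increase: %.1f%%\n", (p_fast / p_base - 1.0) * 100.0); /* ~15% */
    return 0;
}
```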

AMD is desperate (2, Insightful)

tru-hero (1074887) | more than 7 years ago | (#18823289)

This is a desperation move. AMD is back on their heels and their recovery plan is too far off in the future. In hopes of saving face they are pulling the only lever they have, clock speed.

Funny, Intel was chumped by AMD just like this a couple of years ago, why did AMD let themselves get tagged back? Intel woke up in a major way. Can AMD? Doesn't look too good...

Re:AMD is desperate (1)

Joe The Dragon (967727) | more than 7 years ago | (#18823583)

AMD is working on their new Barcelona chip, and it takes time for a NEW CPU to come out. Also, AM2 was forced on them because DDR1 was starting to go away at the time and they needed to move to DDR2, but AM2 boards will work with the new AM2+ CPUs, unlike Intel, where they keep the same socket but you still need new boards with the same chipset that support the newer vcore.

Re:AMD is desperate (1)

postmortem (906676) | more than 7 years ago | (#18823589)

Indeed. Sadly, a 3.0 GHz Athlon 64 X2 is only about as fast as a 2.4 GHz Core 2 Duo. Maybe better for 64-bit apps, which are still nowhere to be found.

However, Intel was not desperate when it clocked the Pentium 4 so high; it simply decided the marketing ploy was more important than performance.

It takes years to develop something like Core 2 Duo. AMD will of course manage to produce something faster per clock than C2D, but not today, when they need it.

Re:AMD is desperate (4, Interesting)

ocbwilg (259828) | more than 7 years ago | (#18823755)

This is a desperation move. AMD is back on their heels and their recovery plan is too far off in the future. In hopes of saving face they are pulling the only lever they have, clock speed.

Not so. AMD never said that they wouldn't increase clock speed on their CPUs. In fact, that's pretty much standard practice to get higher performance. So now their manufacturing process is capable of producing 3 GHz CPUs in sufficient volumes to sell, and they're selling them. As the process is refined there may be faster CPUs.

Intel does the same thing. As the manufacturing process is refined they are able to produce more and more CPUs at higher clock speeds. It's not a sign of anything other than business as usual.

Funny, Intel was chumped by AMD just like this a couple of years ago, why did AMD let themselves get tagged back? Intel woke up in a major way. Can AMD? Doesn't look too good...

AMD has more than just clock speed coming; Barcelona (aka K10) is supposed to be shipping in the next month or two. That's generally expected to take back the performance crown from Intel, and even if it doesn't, it should at least eliminate the performance gap. For purposes of historical reference, AMD pretty much bitchslapped Intel when they released the Athlon 64. It took Intel 4 years to finally catch up to AMD and pass them with the Core 2 architecture, and even today the Opterons are still higher performers on 4- and 8-processor systems. If Barcelona turns out to be as fast as or faster than Core 2 (and by all rights, it should be), then it will have taken them only 1 year to catch up. Conroe was "previewed" at Spring IDF in 2006, but didn't ship until several months later.

As for why it's taken AMD a year to catch up, it takes quite a long time to design, lay out, test, and debug a new CPU. Once all that is done, the manufacturing process has to be designed and tested too. Then the CPUs have to actually be produced, and once production has started it takes almost 2 months to go from silicon wafers to functioning CPUs. However, something to keep in mind is that Intel is a much, much larger company than AMD and that Intel runs several CPU design teams concurrently, while AMD doesn't. Intel has several times the number of designers, engineers, and fabs that AMD does. Because of their resources, Intel is able to completely scrap a CPU project and switch to something else if they need to. AMD can't, or at least not without seriously hurting the company. The fact that AMD is even competitive with Intel says quite a lot about the talent they have in-house.

The thing I find most interesting is that last year, when Intel was on the ropes, they offered the IDF preview to select web sites in order to generate buzz and FUD regarding Intel vs. AMD. And it worked, too, because for 3 months everybody was talking about how Intel was king again even though they still hadn't shipped any Conroe CPUs. This year they're doing the same thing with their new Penryn architecture, and they don't appear to be on the ropes. Why would you tip your hand early if you don't have to? That indicates to me that Intel is concerned about something, and I suspect that something is Barcelona.

Even more interesting is that none of the previews compare Conroe with Penryn at the same clock speed. Most of the benchmarks I have seen show a roughly 20% performance advantage for Penryn. But the Penryn CPU was running at about a 14% higher clock speed, a 25% higher FSB, and with 50% more L2 cache onboard. Now who's playing the gigahertz game? I suspect that if you overclocked a Conroe and its FSB to reach the same speeds, you would see little to no difference versus Penryn. Which means that Intel's response to the all-new Barcelona is going to be... you guessed it... run up the clock speed and slap on some cache. We're in for a bumpy ride.
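
A back-of-envelope check of that claim (my arithmetic, not the poster's): if Penryn scores roughly 20% higher but is also clocked roughly 14% higher, the per-clock gain works out to only about 5%, before even crediting the faster FSB and the larger cache.

```c
#include <stdio.h>

int main(void) {
    double perf_gain  = 1.20;   /* ~20% higher benchmark scores, as quoted */
    double clock_gain = 1.14;   /* ~14% higher clock speed, as quoted */

    double per_clock = perf_gain / clock_gain;   /* performance per cycle */
    printf("per-clock advantage: %.1f%%\n", (per_clock - 1.0) * 100.0);  /* ~5.3% */
    return 0;
}
```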

Bah! (1)

Lumpy (12016) | more than 7 years ago | (#18823967)

I just rebuilt my home server with a Via C7 processor and several Seagate laptop hard drives (gotta love 7200 RPM laptop drives).

I have plenty of power for ZoneMinder, web, UPnP server, MP3 playback server, SQL, etc... and I cut the power consumption by almost 80%!

Cripes, the Via processor runs at 2 GHz with no fan and handles CentOS quite happily. In fact it's far faster than the quad P-III it replaced, which was sucking up power like there was no tomorrow (corporate servers at home are a silly thing!).

I don't care about GHz, I care about power use. 1.8 GHz is more than enough for 90% of daily tasks; I want my laptop to last 7 hours and my power bill to be under $300.00 a month.

I'm still grinning ear to ear that I have a 2U server with a terabyte of storage that does everything the last one did but only has a 150-watt power supply in it. At normal operation my wattminder shows it using only 65 watts, and yes, that's with all 7 hard drives spinning. It drops down to 15 watts during idle times (well, as idle as you can get with ZoneMinder running).

It uses less power than my Crestron Pro2 home automation processor!

Hell, even my old outdated 3.0 GHz 32-bit P4 dinosaur plays Quake 4 at full settings just fine, as well as editing 1080i video.
