AMD's Dual-core Athlon 64 X2 reviewed
ChocolateJesus writes "Weeks after formally announcing its dual-core Athlon 64 X2 desktop processor, reviews are finally trickling out. The Tech Report's coverage tests two flavors of the Athlon 64 X2 against a whopping 17 competitors, including AMD's and Intel's fastest single- and dual-core offerings. They've even thrown in a handful of dual-processor systems (and dual-core, dual-processor systems) for good measure. Testing focuses on multi-threaded applications, and the X2s deliver remarkable performance. Perhaps even more impressive is the fact that, unlike Intel's dual-core Pentiums, AMD's X2s consume no more power than single-core chips." Looks like this story has come out of embargo - if you find more reviews, post them in the comments.
Cooling (Score:4, Interesting)
Re:Cooling (Score:5, Informative)
On top of that, A64 platforms are known for their low power consumption compared to Netburst based processors.
Re:Cooling (Score:5, Funny)
Re:Cooling (Score:2, Funny)
Re:Cooling (Score:3, Funny)
Re:Cooling (Score:5, Informative)
http://www.anandtech.com/cpuchipsets/showdoc.aspx
On that page they compare a 130nm single core Athlon to a 90nm dual core. Even under a full load, the 90nm dual core uses less power than the single core 130nm chip.
Re:Cooling (Score:3, Informative)
Other considerations factor in to determine the power consumption (total number of transistors, other elements, arrangement, etc.), but the smaller size drops the power level quite a bit beforehand.
Re:Cooling (Score:3, Interesting)
So, assuming they used the same system for all measurements and just swapped out the cpus, the relative differences are accurate. But you can not draw any conclusion about the absolute power requirements of the cpus based solely on Anandtech's review.
Maybe no one cares, but it would be ea
Cooler than the old AMD 130nm designs and Intel (Score:5, Informative)
Re:Cooling (Score:2)
Re:Cooling (Score:2, Informative)
I don't get how this can run on the same power level as the single core chips. Can someone explain on how this is possible?
It isn't.
Under load, the dual core system consumes about 25 watts more power than the single (178 watts vs. 154) -- and 25 watts is just less than what a single-core A64 consumes under load.
I think the poster was looking at the numbers for idling.
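To see why the at-the-wall numbers only support relative comparisons, here is a rough back-of-envelope sketch; the 75% PSU efficiency figure is an assumption typical of supplies of that era, not a number from any of the reviews:

```python
# Back-of-envelope: convert the review's at-the-wall readings into a rough
# CPU-level power delta. The PSU efficiency is an assumed, illustrative value.
wall_single = 154.0    # watts at the wall, single-core under load (from the review)
wall_dual = 178.0      # watts at the wall, dual-core under load
psu_efficiency = 0.75  # assumption, not measured

wall_delta = wall_dual - wall_single     # 24 W measured at the wall
cpu_delta = wall_delta * psu_efficiency  # ~18 W actually drawn past the PSU

print(f"Wall delta: {wall_delta:.0f} W, estimated CPU-side delta: {cpu_delta:.0f} W")
```

Even this crude estimate shows the second core adds far less than double the power draw.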
Re:Cooling (Score:3, Interesting)
It's hardly accurate to judge a CPU's performance based on a "power drawn at the wall" measurement.
Re:Cooling (Score:3, Informative)
Re:Cooling (Score:2)
Re:Cooling (Score:4, Insightful)
You would expect to see a less than 100% increase in power for a dual-core CPU because of the shared components that are not replicated:
- The X2 chips still have a single, 128-bit wide memory controller. Since the memory controller charges/discharges external bit lines going to DIMMs, they do burn quite a bit of power. This power consumption is not duplicated in the case of a dual core CPU.
- The X2 chips still have a single HyperTransport bus. The power consumption of this bus is approximately the same between a dual core and single core CPU.
However, power scaling due to these shared components would probably not explain how a dual core chip can burn only 20% more power. For both of the above cases, you could argue that one should expect to see higher utilization of the memory bus and the HyperTransport bus, so the exact power consumption contribution is not entirely clear.
One thing to note is that AMD Athlon 64 cores tend to burn much less power in the idle state than Intel chips. This is probably due to choices AMD made in both architecture and process. So the fundamental reason why AMD X2 chips have such minimal incremental power consumption over single-core chips is that one of the cores is typically underutilized most of the time and therefore burns much less power.
Anand's Take (Score:5, Informative)
Re:Anand's Take (Score:2)
Re:Anand's Take (Score:2)
What's that burnt intel smell? (Score:2, Interesting)
Re:What's that burnt intel smell? (Score:3, Insightful)
Yes.
Re:What's that burnt intel smell? (Score:3, Interesting)
Intel is obviously relying on fat vendors like Dell, but with performance like this and power consumption like that, buyers will be asking Dell what their problem is. When Dell finally cracks, you'll know Intel has spent too long fixating on its stock price rather than its products. It's a tough thing to recover from, too, and will call for a major shake-up.
Pity is, companies which go though this usually are considerably weaker. AMD looks good, but you h
I'll wait for the next version... (Score:5, Funny)
Sorry, it just had to be said.
Re:I'll wait for the next version... (Score:2)
They only come in pairs.
Also, after you've had it a while one of the pair refuses to go into "69" mode, even though it's still fine in standard mode. Something about being incompatible with a daughtercard.
Re:I'll wait for the next version... (Score:3, Funny)
Row, row row your boat, gently down... (Score:5, Informative)
No actually, they're going to be launched in June. The fact that this would be lost on the submitter was so obvious, I was able to prepare this message in advance and just paste it in.
These look to be amazing CPUs. After the initial Linpack-with-large-matrices benchmark, you have to go thirteen pages into the benchmarks at TechReport [techreport.com] to find one of note where the Intel solutions are able to score a win!
Don't Forget the [H] (Score:5, Informative)
Or you can jump right to their conclusions [hardocp.com].
Rollout process (Score:5, Interesting)
1. Announcement
2. Technical Preview (benchmarks Appear)
3. Launch (OEM Availability)
4. Ramp-up and Reseller Availability
They even give dates, if they can keep to those dates then we might actually have a product launch that doesn't antagonize the community with accusations of a 'paper launch'.
I'd like to see more companies be more upfront about this.
vs (Score:3, Insightful)
Re:vs (Score:5, Informative)
The best example of what you're looking for that I've found is at http://www23.tomshardware.com/index.html [tomshardware.com]
It's an interactive chart of all major processors available now, plus plenty that aren't yet available; it's a good way to compare what you have now against what an upgrade could do for you.
Re:vs (Score:2)
Wow... those are fast (Score:2)
This really is going to make me think twice about the need for separate CPUs. I really want to get my hands on one of these to test.
Does dual core == 2xProcessor or hybrid? (Score:3, Interesting)
I recall reading a
But does a vendor HAVE to make a dual core chip with two of the same processor? Perhaps gains could be made by using a less powerful, commodity chip core and pairing it with a top-of-the-line core.
Costs would be lower and they could sell more of this hybrid dual core because they would only need one top-of-the-line core.
Oh, you get what I am saying.
Re:Does dual core == 2xProcessor or hybrid? (Score:2, Informative)
Re:Does dual core == 2xProcessor or hybrid? (Score:2)
Which, needless to say, is probably pretty damn hard to do. So hard, that it'll never happen.
Re:Does dual core == 2xProcessor or hybrid? (Score:5, Informative)
But the less powerful core does not exist, so they'd have to design it. And the design cost is killer.
However, assuming unlimited design budget and schedule, there are some academic papers showing that heterogeneous cores are a good idea.
Re:Does dual core == 2xProcessor or hybrid? (Score:2)
Re:Does dual core == 2xProcessor or hybrid? (Score:4, Informative)
Re:Does dual core == 2xProcessor or hybrid? (Score:5, Interesting)
The only time when heterogeneous processors are really useful is when each is better than the others at a sub-set of tasks. Current PCs are usually a set of 3 different processors in a single box[1]. They have a reasonably fast general purpose CPU, and on the same die a simple vector processor (e.g. MMX, SSE, AltiVec), which has a different instruction set to the main processor and must be invoked explicitly. They also have a highly parallel large vector processor on a separate chip, which is usually used for graphics. No automatic scheduling is performed between these - it is up to the programmer to explicitly code for each one. Ideally, a heterogeneous processing environment would require code to be JIT compiled for each processor, and then moved between them depending on run-time profiling information.
[1] Yes, this is an oversimplification.
YeS! (Score:2, Funny)
Fast and INEXPENSIVE to run! (Score:3, Informative)
This is significant if you live in, say, Honolulu, where electricity is 14 cents/kWh, or on Kauai, where it's close to 22 cents/kWh.
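A quick estimate of what the extra draw costs at those rates, assuming the roughly 24 W load-time delta reported in the reviews and (unrealistically) 24/7 full load:

```python
# Rough annual running-cost difference for an extra ~24 W of load-time draw,
# at the Hawaii electricity rates mentioned above. Assumes constant full
# load, which overstates the real cost considerably.
extra_watts = 24                 # approximate at-the-wall delta from the reviews
hours_per_year = 24 * 365
extra_kwh = extra_watts * hours_per_year / 1000  # ~210 kWh/year

for place, rate in [("Honolulu", 0.14), ("Kauai", 0.22)]:
    print(f"{place}: ${extra_kwh * rate:.2f}/year")
```

Real duty cycles are far lower, so actual costs would be a fraction of this worst case.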
Windows Licenses (Score:2, Insightful)
No 'update' necessary (Score:2, Informative)
Re:No 'update' necessary (Score:5, Informative)
No, Microsoft's official licensing policy is one socket = one CPU. Therefore, a dual-socket Opteron motherboard with two dual-core chips would be licensed as a dual-CPU system, even though it has four separate cores.
I think your post was trying to get that idea across, but your statement of "one die = one CPU" is misleading to that effect.
What's odd about this is if you bought a dual-core, dual-CPU Xeon system supporting HyperThreading. If you opened up Task Manager you'd find eight CPU graphs. Not that you'd get anything near the performance of an eight-way system, though...
Microsoft's licensing is a bright spot when it comes to commercial software and multi-core CPU's. There are several firms still clinging to the "one core = one CPU" model, and dual core chips are going to immediately make such software very expensive.
I contacted Oracle a couple of weeks ago to clarify their position, and I was told then that dual-core chips would be considered a single CPU for the purposes of licensing. It seems that Microsoft's adherence to the "one socket = one CPU" idea is forcing its competitors into the same pricing model. Who woulda thunk Microsoft would actually be helpful in this situation?
RISC (Score:3, Interesting)
Re:RISC (Score:2)
It's not quite fair to compare a cpu that doesn't have any of that to a recent x86 that does.
Cuz you know what, an AMD64 can hold its own against Alpha just fine. And with CISC instructions it does so with less code space pollution.
lw $t0, blah       # load from memory ($0 is hardwired to zero on MIPS, so $t0)
add $t0, $t0, $t1  # work
sw $t0, blah       # store back
boring...
ADD [eax], ebx     ; one read-modify-write instruction
much more efficient
I mean something like an ARM processor.. you won't see that at 2Ghz anytime soon [e
Re:RISC (Score:2)
Re:RISC (Score:2, Insightful)
http://arstechnica.com/cpu/4q99/risc-cisc/rvc-1.h
MythTV? (Score:2)
Server's slow, but we have a mirror (Score:5, Informative)
Re:Server's slow, but we have a mirror (Score:4, Funny)
Re:Server's slow, but we have a mirror (Score:3, Funny)
Does anyone buy performance anymore? (Score:3, Interesting)
AMD might be turning out some pretty good products but they are not making any money [networkworld.com] selling them and it is only a matter of time before they have to fold their tent and leave the field to Intel.
Re:Does anyone buy performance anymore? (Score:4, Interesting)
Re:Mmm.... dual core. (Score:5, Funny)
Comment removed (Score:5, Funny)
Re:Redsigning your applications. (Score:5, Funny)
Re:Redsigning your applications. (Score:2)
duel-core
Wow, couldn't have reinforced that point any better. These cores don't fight each other; there are just two of them (dual, not duel).
-Jesse
Re:Redsigning your applications. (Score:3, Insightful)
Re:Redsigning your applications. (Score:3, Informative)
Re:Redsigning your applications. (Score:5, Funny)
Correct. Instead of executing the code in parallel, both cores will fight to the death for the privilege. Since only one core survives, you don't really get much benefit from duel-core processors.
Highlander! (Score:5, Funny)
It's about the interactivity (Score:5, Informative)
Re:It's about the interactivity (Score:3, Insightful)
Certain image processing apps (e.g. enblend, a panorama blending app) drive my single Athlon into a state where it is sluggish and hard to use (1.8 GHz Athlon, 768 MB RAM, Fedora Core 3).
So, while the GIMP can be compiled to do some multi-threaded stuff, the real boost is that my computer should still be usable for other things while it's off fixing my panoramas.
Since some of my panoramas take over 2 hours to fix, I'm looking into a faster system and will definitely be trying t
Re:It's about the interactivity (Score:3, Interesting)
Re:It's about the interactivity (Score:3, Interesting)
Don't really share your user experience. Being a college student at the time, I salivated over a dual system for years, and finally found the opportunity with the dual Pentium II-class Celeron motherboards by Abit. That brief window in history when you could have a full dual-processor system for under $250. It was dual 433's overclocked to 466. At the exact same time, I had an AMD K6-400 as my main machine. The dual ran on Red
Re:Redsigning your applications. (Score:5, Informative)
And a singlethreaded badly written application will be less prone to lock your computer, too, since the other apps will still be able to run from the second core.
The main issue is not the multithreading abilities of the applications, but the multithreading abilities of the OS itself. If the OS handles multithreading well, multicore (physical or virtual) will always give a slight to impressive improvement over single core.
Re:Redsigning your applications. (Score:2)
Re:Redsigning your applications. (Score:2)
Re:Redsigning your applications. (Score:2)
Re:Redsigning your applications. (Score:2)
You'll see that almost all running processes use threads. Only the tiniest systray apps may be singlethreaded. Apps like Internet Explorer use dozens of threads, and will render a page with many jpg's or flash applets faster than a comparable single CPU/single core system.
The 'not many apps use threads' myth keeps on being spread, but anyone can see for himself just how many
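The pattern being described -- a desktop app handing independent jobs to a pool of worker threads so a second core can pick them up -- can be sketched like this. The names and the "decode" stand-in are purely illustrative, not any real browser's internals:

```python
# A minimal worker-pool sketch: independent jobs queued for a handful of
# threads, which a multi-core CPU can run truly in parallel.
import threading
import queue

jobs = queue.Queue()
results = []
lock = threading.Lock()

def worker():
    while True:
        item = jobs.get()
        if item is None:          # sentinel: no more work
            return
        decoded = item.upper()    # stand-in for "decode one image"
        with lock:
            results.append(decoded)

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for name in ["header.jpg", "photo1.jpg", "photo2.jpg", "ad.swf"]:
    jobs.put(name)
for _ in threads:
    jobs.put(None)                # one sentinel per worker
for t in threads:
    t.join()

print(sorted(results))
```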
Re:Redsigning your applications. (Score:2)
If readers didn't get the above replies, duel and dual are two very different words. Dual is a word used for "two", duel is a word for a fight.
I think it is a chicken and egg thing, developers of performance intensive software
Check out Linux (Score:3, Informative)
A lot more common apps are multithreaded than people think. Nautilus, Firefox, OpenOffice, Gnome Terminal, and, um, the Gnome Weather Applet are all multithreaded.
Even if no apps on your system are multithreaded, if you're like the 99% of users who run multiple processes simultaneously, you'll still get an advantage. Your updating app runs on one core while your desktop runs on another, for example.
Re:Redsigning your applications. (Score:2)
I usually do that on my dual-cpu systems when I want max performance out of a known single-threaded app.
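On Linux, that trick -- pinning a known single-threaded app to one core so the other stays free -- can be done with taskset(1) or, as a sketch, from Python (os.sched_setaffinity is Linux-only, hence the guard):

```python
# Pin the current process to a single core. On non-Linux platforms the
# affinity calls don't exist, so fall back gracefully.
import os

if hasattr(os, "sched_setaffinity"):
    first_core = min(os.sched_getaffinity(0))  # pick any allowed core
    os.sched_setaffinity(0, {first_core})      # pin this process to it
    print("now pinned to core", first_core)
else:
    print("sched_setaffinity not available on this platform")
```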
Re:Redsigning your applications. (Score:2)
This is one of the sadder things about projects such as OpenSSI and Mosix. Only whole processes can be migrated to another node... not threads.
Re:Redsigning your applications. (Score:3, Interesting)
Well we realize it here, because it's BROUGHT UP every single time there's a mention of more than one processor running!! Yeesh. Heh.
On a lighter note: When these processors become more popular, multi-threaded apps will come. Besides, it's not like our machines aren't keeping up with apps today. Except for my 3D rendering, I don't have anything that would benefit from a faster proces
Re:Redsigning your applications. (Score:2)
1. If you do more than one thing at a time, you'll benefit--two single-threaded programs can run without getting in one another's way.
2. This might be a good time to brush up on your pure applicative language skills...don't they lend themselves to easier parallelism than imperative languages?
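On the second point: the reason side-effect-free ("applicative") code parallelizes easily is that each call depends only on its arguments, so a runtime is free to farm the calls out to different cores. A minimal sketch using a two-worker process pool from the standard library:

```python
# Because pure_f touches no shared state, mapping it over the inputs can be
# split across cores with no locking and no change in the result.
from multiprocessing import Pool

def pure_f(x):
    return x * x  # depends only on its argument

if __name__ == "__main__":
    with Pool(2) as pool:  # one worker per core on a dual-core chip
        print(pool.map(pure_f, range(8)))
```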
Re:Redsigning your applications. (Score:3, Interesting)
Most people who post it don't realize that your CPU is context switching dozens of times per second when idle in your OS already. Simply letting two cores handle different interrupts is a benefit for system responsiveness.
How often is your CPU wanting to do more than one thing at a time? All the time in an OS like Linux or Windows.
If you're running Linux, run vmstat and check the context switches per second.
If you install a second CPU, you may
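For the curious, vmstat gets that number from the kernel's running total in /proc/stat; sampling the counter twice gives switches per second. A Linux-only sketch:

```python
# Read the kernel's cumulative context-switch counter from /proc/stat.
# The "ctxt" line holds the total since boot; two samples a second apart
# give the switch rate vmstat reports.
import time

def ctxt():
    with open("/proc/stat") as f:
        for line in f:
            if line.startswith("ctxt"):
                return int(line.split()[1])
    return None

try:
    a = ctxt()
    time.sleep(1)
    b = ctxt()
    print(f"{b - a} context switches in the last second")
except FileNotFoundError:
    print("/proc/stat not available (not Linux)")
```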
Re:Redsigning your applications. (Score:5, Interesting)
Dual CPU systems tho are useless to the home users, it's for businesses and scientists with more computing need. Real enterprise applications are multithreaded.
Not so!
I was one of the lucky people to buy a cheap dual Celeron setup right after that hack was first discovered, and I can tell you that multiprocessors on the desktop rock. My old system was a dual Celeron 400, and while it couldn't compete with a modern system in terms of benchmark speed, it had my current 1400 MHz Celeron system beat bloody when it comes to interactivity and responsiveness -- that elusive "feel".
The price is steep now, but don't let arguments about application benchmarks dissuade you from trying out multicore when prices go down. The Anandtech review cited above has some really telling benchmarks about how well a dual system performs when loaded down with multiple tasks.
Unlike the unnoticeable 200 or 400 MHz incremental bumps you usually see with processors, dual core really brings something of value to the desktop user. Try it and you'll see.
Re:market for this? (Score:2, Insightful)
The fact of the matter is if you build it, they will come. I'd bet that it won't be more than a couple of years before you see a recommendation for 2 processors on games.
Re:market for this? (Score:2, Insightful)
Besides, in an industry where if you don't come out with something new frequently you die, it seems likely that it won't be too many years down the road before dual-core may be the only option for consumers in the m
Re:market for this? (Score:3, Funny)
Nothing sexier than a 16-slot blade server running dual-core Opterons. Equivalent of 32 CPUs in 5U of space.
MMMmmm...mmm...mmm...mmm...SEXY!
Re:market for this? (Score:3, Funny)
Adware&Virus: hardware makers win!! (Score:5, Interesting)
It's a sad case that as malware becomes more prevalent, hardware vendors win. Really, you can be productive with (for example) Win2K on a 1 GHz machine with 256 MB in an office. Now add the wait as every file is scanned on access for viruses (per corporate policy), and the machine somehow becomes "too slow."
Oh well. I guess it's time to put all productivity applications on a server & run them remotely. Again ;-(
Re:market for this? (Score:5, Insightful)
AMD on the other hand has always started out chips on the enthusiast / enterprise market because they simply don't have the fabrication capacity that Intel does. Thus they market first for the high end users and over time the processors find their way into the desktop market when they've been dated by yet another new, improved processor being marketed at the first group. Their whole revenue plan is based off of the 'rich' people niche (which includes many medium to large businesses). Based on their success, I'd say that they've done really well with this business model and continuing to do so would likely continue to work for them.
The common misconception latched onto by many processor reviews nowadays is that both AMD and Intel are producing processors for the desktop platform, when in reality their business goals for their processors are at opposite ends of the spectrum. Intel starts desktop side, AMD starts server side. It is only after both have matured to some degree (and software has caught up to both of them) that the processors can be meaningfully compared for the average joe user that just bought a new computer (or had one built for him).
Most people who go crazy over these new technologies either want them for pure bragging rights, or simply aren't aware of how little they will actually do for them... or both, in all likelihood.
Re:market for this? (Score:5, Informative)
s/always/recently/
Clearly, you've just not been around very long, or not paying attention, or have only short-term memory.
It's only been in recent years that AMD has bested Intel, performance-wise. For many, many years, AMD could release a new chip with similar performance, and then Intel would beat them with another new chip.
There's a long, long history of AMD selling their chips at approximately half the price. Certainly through all of the 90's (486, pentium 1/2/3), AMD chips were substantially cheaper than buying Intel.
During much of this time, AMD's chips also had a strong reputation for running very hot. Intel had a reputation for running cool and being easy to overclock. It was Intel that introduced the multiplier locks to prevent overclocking, which apparently became quite a problem outside the USA, where unscrupulous companies would sand down the tops of the chips (back then they were usually ceramic on top), print a faster speed, and resell them as such.
It wasn't even all that long ago that the infamous Celeron 300A, which was multiplier locked, could overclock to 450 MHz (then nearly the fastest chip they sold) by overclocking the front side bus by 50%. At the time, AMD's chips were far behind, and they were running hot with very little overclocking margin, just to try to close the substantial performance gap.
Even back in the early Pentium days, before AMD came out with a comparable chip, the 90 MHz Pentium appeared in a new, smaller-geometry process that made it run about as cool as the 486/66s.
Intel has indeed been in the lead, technologically, for a very long time... ever since AMD stopped licensing IP from Intel. For a bit of really ancient history: long ago, some large, well-known companies had a strong policy of never using any components that were not available from a second source. AMD's business model 20+ years ago was to license designs and be that second source.
Even a number of articles mention how the tables have turned recently, and speculate whether Intel will regain the honor of top performance.
I'm not affiliated with Intel, and in fact the PC I'm using to write this comment runs an AMD chip. When I upgrade, it'll probably be AMD again. Recently, AMD appears to have made some really smart architectural decisions that have put them in the lead, technology-wise.
But to believe such has always been the case, or even been a trend that's anything more than recent, is to ignore or be utterly ignorant of the very long history of Intel dominating the PC / x86 market with the best chips.
Re:market for this? (Score:3, Interesting)
Re:market for this? (Score:5, Insightful)
Longhorn (Score:5, Funny)
Re:market for this? (Score:4, Interesting)
Re:market for this? (Score:3, Informative)
The question is, other than gamers and graphic artists, who needs them? You have a point in that almost every other application that the average guy uses has been saturated in terms of (quite prolific) features for years. I really don't
Re:market for this? (Score:3, Insightful)
who is going to buy computers with these new ultra powerful dual core processors?
I will. I'm often running applications that take 100% CPU. Having another core around to make the system nice and responsive would be wonderful.
gamers don't need dual core
Right, and when video cards that supported an accelerated transform and lighting (i.e. the GeForce) came out, they didn't need that either since current games didn't support it. You can bet the next core of games will be multi-threaded.
everyday use
Re:market for this? (Score:2)
Right, and when video cards that supported an accelerated transform and lighting (i.e. the GeForce) came out, they didn't need that either since current games didn't support it. You can bet the next core of games will be multi-threaded.
True and false: in the tests, Doom 3 doesn't seem to benefit from the dual core, but Far Cry and UT do seem to benefit from it.
See: Gaming performance [techreport.com]
Re:market for this? (Score:5, Interesting)
Anyone involved in matrix math (circuit design, mechanical engineering, fluid dynamics, etcetera) would love to be able to do this on their desktop instead of shared time on an HPC. Or combine the computational power of an office full of these machines at night or over weekends for the really big jobs. What's not to like?
Any scientific organization that has been holding off on capital expenditures while waiting for a clear winner to emerge ((AMD vs. Intel) vs. (PPC vs. SPARC)) will have come that much closer to making a decision.
Intel's IA64 gambit has not panned out -- their marketing hype has brought down some of their competition (PA-RISC and MIPS), but it has not proven to be the market leader Intel would have hoped. But like a wildfire in the woods, Intel's IA64 has opened up competition for diversity and some new leadership.
Re:whitebox linux on dual core Opterons (Score:2, Troll)
Re:whitebox linux on dual core Opterons (Score:2)
Doesn't make his post less offtopic
Re:Funny from TFA: (Score:2)
-Jesse
For the gentoo users out there... (Score:2)
Re:Sounds nice, BUT.... (Score:5, Insightful)
The machine I'm typing this on (just a simple diskless workstation) currently has 75 different processes running. The server it's connected to has 145. With a dual core processor in either of them, the number of processes able to run simultaneously would be increased by 100%.
The idea of running just one application on your box went out more than 10 years ago. Wake up and smell the coffee.
(If nothing else, all those blasted Flash animations can run without chewing up CPU cycles I would rather use for something else.)
John
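For reference, the process counts quoted above can be reproduced on Linux by counting the numeric directories in /proc (each one is a process); this sketch falls back gracefully elsewhere:

```python
# Count running processes via /proc: every directory whose name is a number
# corresponds to one process ID. Non-Linux systems lack /proc, so guard.
import os

try:
    nprocs = sum(1 for d in os.listdir("/proc") if d.isdigit())
    print(f"{nprocs} processes running")
except FileNotFoundError:
    print("/proc not available (not Linux)")
```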
Re:Not enough comparisons ... (Score:3, Insightful)
Re:What would have been even more interesting... (Score:3, Insightful)
I'm not saying it's impossible to compare 2 different architectures, I'm just saying it's not practical to compare 1 part of 2 architectures and expect to