
Inside AMD's Phenom Architecture

samzenpus posted more than 7 years ago | from the behind-the-curtain dept.


An anonymous reader writes "InformationWeek has uncovered documentation that provides some details amid today's hype over AMD's announcement of its upcoming Phenom quad-core (previously code-named Agena). AMD's 10h architecture will be used in both the desktop Phenom and the Barcelona (Opteron) quads. The architecture supports wider floating-point units, can fully retire three long instructions per cycle, and has virtual machine optimizations. While the design is solid, Intel will still be first to market with 45nm quads (AMD's first will be 65nm). Do you think this architecture will help AMD regain the lead in its multicore battle with Intel?"


Begging the question (-1, Troll)

Dan Stephans II (693520) | more than 7 years ago | (#19114925)

Do you think this architecture will help AMD regain the lead in its multicore battle with Intel?"

Did they ever have the lead?

Re:Begging the question (4, Informative)

tomstdenis (446163) | more than 7 years ago | (#19115001)

In terms of market share, no. In terms of tech, yes. See Opteron vs. Intel P4 Xeon, for example.

Tom

Re:Begging the question (1)

OSSRocks (572015) | more than 7 years ago | (#19115029)

The P4 was never multi-core... multi-processor, yes... but multi-core, no...

Re:Begging the question (0)

Anonymous Coward | more than 7 years ago | (#19115131)

OSSRocks, meet the Pentium D - a P4 dual-core with a new marketing name.

Re:Begging the question (4, Informative)

Chris Burke (6130) | more than 7 years ago | (#19115137)

I introduce to you the Pentium D [wikipedia.org].

Re:Begging the question (1)

OSSRocks (572015) | more than 7 years ago | (#19115245)

Ahh yes, I forgot about that one. I didn't realize they were sold; I thought they were just engineering samples. I didn't bother building a new box until Core 2 was out... probably why, lol. Good catch!!

Re:Begging the question (1)

Chris Burke (6130) | more than 7 years ago | (#19115805)

Well it's not like anyone cared about the Pentium D, since it was pretty craptacular. Lessee, take a bandwidth-starved core design, slap two down in one package, downclock the system bus since the MCM has some of the same signal integrity issues as multi-socket, and you get... two cores that are starved for bandwidth even more. Yay.

So you were much better off waiting until Core 2 regardless, if you wanted an Intel dual core anyway.

Re:Begging the question (4, Funny)

gormanly (134067) | more than 7 years ago | (#19116247)

Craptacular indeed (great new word) - the only thing craptacularer was the Celeron D they had out at the same time, which despite the name was not dual-core. Very amusing, though, watching the 'tards with enough knowledge to be dangerous who wanted a cheap PC:

"That one's a 'D', that's got 2 processors, that makes the internet faster"

Re:Begging the question (1)

GreenEnvy22 (1046790) | more than 7 years ago | (#19117525)

I've been using craptacular for years, don't make me break out the DMCA!

Re:Begging the question (1)

laffer1 (701823) | more than 7 years ago | (#19116315)

Your technical analysis is correct, but the Pentium D is so cheap on Newegg that it replaced the Celeron line back in October. It's now a good entry-level dual-core chip. My Pentium D 805 outperforms a Dell Precision 650 with two 2GHz Xeon chips. As bad as the Pentium D is, there was still a little progress. Cooling it is another issue altogether.

Re:Begging the question (1)

Short Circuit (52384) | more than 7 years ago | (#19116199)

At one point, the Opteron was single-core. And it still beat the P4 Xeon.

Re:Begging the question (1)

Sorthum (123064) | more than 7 years ago | (#19115035)

The problem is, analysts don't look at technology so much as they do market share. The unfortunate battle between techies and beancounters rages on...

Re:Begging the question (3, Informative)

homer_ca (144738) | more than 7 years ago | (#19115103)

The Athlon X2 was superior to the Pentium D. It wasn't until Core 2 Duo that Intel took the lead in desktop CPUs.

Re:Begging the question (1)

drinkypoo (153816) | more than 7 years ago | (#19115957)

The original Core Duo led most benchmarks as well (I have a T2600 before me.) Core Duo is really an excellent design (made a nice change.)

Re:Begging the question (1)

qbwiz (87077) | more than 7 years ago | (#19116823)

Well, the Athlon X2 was probably a better desktop chip, but the Core Duo was a better laptop chip than anything put out by AMD.

Re:Begging the question (1)

Movi (1005625) | more than 7 years ago | (#19117571)

Funny thing is, everyone knew Core2 would beat X2, even before X2 came out. AMD must have crapped their pants when they saw the samples. (No, I'm no fanboy, but please search the archive, from bitnet for example, and see for yourself.) Core2 was the hammer for AMD like K8 was the "sledgehammer" for Intel back in the day.

Re:Begging the question (0)

Anonymous Coward | more than 7 years ago | (#19116243)

Only in benchmarks and anecdotal evidence. Apart from that, who's to say?

Mod parent up! (0)

Anonymous Coward | more than 7 years ago | (#19116757)

He used the phrase "begging the question" properly!

Answer: (-1, Troll)

drgonzo59 (747139) | more than 7 years ago | (#19114931)

Do you think this architecture will help AMD regain the lead in its multicore battle with Intel?"


No.


Next question please...

mid 1996 (0)

Anonymous Coward | more than 7 years ago | (#19114933)

Intel released the Core 2 Duo in 1996????

Whoa, I'm more behind than I thought.

What?! (3, Funny)

rumith (983060) | more than 7 years ago | (#19114951)

From the TFA:

However, the dual-core duel became, and remains a performance battle. AMD was widely perceived to have taken an initial lead. Intel was seen as recovering the advantage when its introduced its Core 2 Duo family in mid 1996.

Looks like it happened in a parallel universe.

Re:What?! (4, Funny)

LurkerXXX (667952) | more than 7 years ago | (#19114989)

You must have brushed your teeth in a quantum mirror this morning.

Re:What?! (2, Funny)

rumith (983060) | more than 7 years ago | (#19115705)

Yes I did, and not only is it quantum, it also runs Linux.

Re:What?! (1)

drinkypoo (153816) | more than 7 years ago | (#19115339)

Core 2 Duo is faster in almost every situation than any dual AMD. But when you get up to even 4 cores, bus contention can be a problem. At 8 cores, Intel is hopeless.

Hey Einstein (3, Informative)

p3d0 (42270) | more than 7 years ago | (#19115379)

Take another look. He's making fun of the date they mentioned (1996).

Re:Hey Einstein (1)

drinkypoo (153816) | more than 7 years ago | (#19115657)

Yes, I saw it in that comment and others. However, I have learned to never assume that someone is doing something clever on Slashdot. Usually one is disappointed.

Re:Hey Einstein (2, Funny)

Nexum (516661) | more than 7 years ago | (#19115927)

Queen Elizabeth II... is that you?

Re:Hey Einstein (4, Funny)

somersault (912633) | more than 7 years ago | (#19116355)

I'm guessing it's QE I, or she would have said "two is disappointed".

Re:What?! (0, Offtopic)

dAzED1 (33635) | more than 7 years ago | (#19117461)

your sig ignores the "Cold War" and a vast sea of other examples, btw.

Re:What?! (2, Funny)

doubleofive (982704) | more than 7 years ago | (#19115897)

In a universe where there existed a processor that could actually run Win95.

Re:What?! (1)

logic hack (800754) | more than 7 years ago | (#19117483)

I was thinking more along the lines of a universe where Win95 would actually run.

Re:What?! (-1, Troll)

Anonymous Coward | more than 7 years ago | (#19116045)

DOUBLE TEH YOU LEWSAR

Sorry what? (4, Insightful)

tomstdenis (446163) | more than 7 years ago | (#19114959)

I had a 2P dual-core Opteron 2.6GHz box as my workstation for several months. To be honest, I couldn't really find a legitimate use for it. And I was running Gentoo and doing a lot of my own OSS development [re: builds].

While I think quad-cores are important for the server rooms, I just don't see the business case for personal use. It'll just be more wasted energy. Now if you could fully shut off cores [not just gate them off] when they're idle, then yeah, hey, bring it on. But so long as they sit there wasting 20W per core or whatever at idle, it's just wasted power.

To get an idea of it, imagine turning on a CF lamp [in addition to the lighting you already have] and leaving it on 24/7. Doesn't that seem just silly? Well, that's what an idling core looks like. It's in addition to the existing processing power and just sits there wasting watts.

Tom

Re:Sorry what? (4, Insightful)

LurkerXXX (667952) | more than 7 years ago | (#19115043)

Certain apps get a big boost from quad cores, lots of others don't. Some of those apps aren't for servers. For example, if you happen to do a ton of video editing, a quad core might be a good choice. I'll agree with you for most of us it's silly on the desktop right now. That won't necessarily be true in a few years when they write a lot more apps that need and take advantage of multithreading.

Re:Sorry what? (2, Interesting)

CastrTroy (595695) | more than 7 years ago | (#19115471)

Do multiple cores really help things like video rendering that much? Usually multi-core also means a faster processor, so yes, it would help, but do you actually get better performance on 4x1GHz than you would on 1x4GHz? If not, then what you're actually looking for is a faster processor, not necessarily dual core. Servers need multiple cores because they are often fulfilling multiple requests at the same time. Desktops, on the other hand, are usually only doing one processor-intensive thing at a time, and therefore would probably not benefit as much as you might think from multiple processors/cores. That being said, it's a lot easier to get a 10GHz computer with 4x2.5GHz CPUs than it is to make a single 10GHz CPU.
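A rough way to frame the 4x1GHz vs. 1x4GHz question is Amdahl's law -- sketched here under the simplifying assumption that a fixed fraction p of the work parallelizes perfectly:

    speedup(n) = 1 / ((1 - p) + p/n)

For example, with p = 0.8, four 1GHz cores give a speedup of 1 / (0.2 + 0.8/4) = 2.5 over a single 1GHz core -- roughly a "2.5GHz equivalent" -- while a single 4GHz core would deliver the full 4x on the same workload.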

Re:Sorry what? (3, Insightful)

LurkerXXX (667952) | more than 7 years ago | (#19115745)

That being said, it's a lot easier to get a 10GHz computer with 4x2.5GHz CPUs than it is to make a single 10GHz CPU.

That's the entire answer right there.

Re:Sorry what? (2, Insightful)

Short Circuit (52384) | more than 7 years ago | (#19116361)

Well, there's the copious amounts of per-core cache. That helps. Then there's the fact that it's a hell of a lot cheaper to make four parts that run at 2GHz than one part that runs at 8GHz. (Like, it can't be done right now.)

Re:Sorry what? (3, Insightful)

somersault (912633) | more than 7 years ago | (#19116551)

Also, your computer tends to be doing quite a lot in the background (especially with lots of 3rd-party crapware/virus scanners/firewalls loaded onto it) rather than just running whatever app you currently want to be using. It's nice to be able to experience the full potential of one core in the app that you do want to use while leaving another core to handle background services, though I don't know if Windows automatically organises processor time that way, and I've never tried splitting my tasks over my 2 cores manually. I guess my system is nippier than my old single-core one, though you tend not to notice stuff that *isn't* there (ever got a shiny new graphics card and just been like "oh... everything's the same but without the slowdowns!"? It can be kinda anticlimactic!)

Re:Sorry what? (1)

tknd (979052) | more than 7 years ago | (#19117707)

Do multiple cores really help things like video rendering that much?

Yes, in fact, it helps a great deal. Applications that are processor-intensive, like video rendering and encoding, are being developed to utilize multiple processing units when available. For example x264, a new H.264 encoder, will run considerably faster with multiple threads than with just a single thread on a multi-CPU or multi-core machine. How CPU-intensive is x264? Very, especially with the deblocking filter and extra-high-quality settings turned on. On the same video sample, I can encode at 4-6fps with x264, while with Xvid I can easily break 30+fps even with all of Xvid's best settings turned on. And this is on an Athlon X2 3800 with both cores utilized for the encode. If I really want the optimum quality x264 will provide, I can bump the reference frames to some insane number (like 30) and lose a few precious fps during the encode. When you're only encoding in the 5-6fps range for a half-hour video with 30 frames per second, that is painful. The gain from adding more cores and more threads diminishes, however. For example, encoding using one core vs. encoding using two isn't a 2:1 ratio; usually the gain from adding the second core is somewhere between 70 and 90%. So there'd still be significant gains when going to four cores, but beyond that it may not be worth it. That may just be due to how x264 is written/designed, not necessarily the implementation of the multi-core hardware.
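(A quick sanity check on those numbers: a half-hour clip at 30 frames per second is 30 x 60 x 30 = 54,000 frames, and at 5fps that encode takes 54,000 / 5 = 10,800 seconds -- three hours.)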

But I agree that most desktop users probably don't need multiple cores beyond maybe two. Two cores are perfect for desktop users, since they allow them to run anti-virus in the background as well as all the other junk they don't understand. At the same time, that might be bad for them, because if they get malware they may not be aware that it is running and consuming resources, since they have plenty to begin with.

Re:Sorry what? (0)

Anonymous Coward | more than 7 years ago | (#19116765)

That won't necessarily be true in a few years when they write a lot more apps that need and take advantage of multithreading.

That won't necessarily be true in a few years when they write a lot more lousy apps that only multithreading can save.

See - it looks more like the truth.

Re:Sorry what? (0, Flamebait)

Joe The Dragon (967727) | more than 7 years ago | (#19115047)

Windows Vista needs a lot of CPU power, so: one core just for the OS, one for all the background apps, anti-virus, and drivers, leaving two cores for your apps/games.

Re:Sorry what? (0, Flamebait)

Turn-X Alphonse (789240) | more than 7 years ago | (#19115465)

Who the fuck is coding THAT poorly?

If anti-virus and background apps take an entire core to themselves, I think we need to rethink our entire industry. I mean, fuck me, that is a lot of power for something we shouldn't even need if we had good security in the OS by default.

Re:Sorry what? (1)

Splab (574204) | more than 7 years ago | (#19115569)

Future games will utilize more cores, one core for kinematics/animations, one for physics/collisions, one for AI and one to bind it all.
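As a minimal illustrative sketch only -- the thread split below mirrors the parent's list, but the function names and empty update bodies are hypothetical, not from any real engine -- the one-thread-per-subsystem idea might look like this in C with pthreads:

    #include <pthread.h>
    #include <stdio.h>

    /* Hypothetical stand-ins for a real engine's per-frame work. */
    static void *animation_thread(void *arg) { /* kinematics/animations */ return NULL; }
    static void *physics_thread(void *arg)   { /* physics/collisions   */ return NULL; }
    static void *ai_thread(void *arg)        { /* AI                   */ return NULL; }

    int main(void)
    {
        pthread_t anim, phys, ai;

        /* One thread per subsystem; the OS scheduler can place each on its own core. */
        pthread_create(&anim, NULL, animation_thread, NULL);
        pthread_create(&phys, NULL, physics_thread, NULL);
        pthread_create(&ai,   NULL, ai_thread, NULL);

        /* The "one to bind it all": the main thread joins the workers and
           would combine their results before the frame is rendered. */
        pthread_join(anim, NULL);
        pthread_join(phys, NULL);
        pthread_join(ai, NULL);

        puts("frame complete");
        return 0;
    }

(Build with gcc -pthread. A real engine would keep these threads alive and synchronize them once per frame rather than spawning and joining them each time.)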

Re:Sorry what? (1)

Cap'nPedro (987782) | more than 7 years ago | (#19115979)

I have an Athlon64 X2 4200+ running at 3GHz. With Windows Vista Ultimate, sitting at the desktop takes just 0-2% CPU usage.

I wouldn't call that very demanding.

Re:Sorry what? (3, Informative)

Applekid (993327) | more than 7 years ago | (#19115163)

According to a writeup on HardOCP back in September, the new design features the ability to pretty much halt cores on-die and save power [hardocp.com]. (Hit next a few times; I wish I could get my hands on the actual PowerPoint.)

Re:Sorry what? (1)

Iron Chef Unix (582472) | more than 7 years ago | (#19117561)

Wife: "Honey, my computer is frozen, can you come fix it?" Me: "It's not frozen, it's just the new architecture saving power."

Re:Sorry what? (4, Informative)

TheThiefMaster (992038) | more than 7 years ago | (#19115295)

My workstation is a core 2 quad, and a full debug build of our project takes 20 minutes, despite using a parallel compiler. On a single core it takes about an hour. You don't want to know how long the optimised build takes on one core.

So there are plenty of workstation uses for a quad core, but I agree that at the moment it's overkill for a home desktop.
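(Cross-checking against Amdahl's law from the note further up: a 3x speedup on four cores means 1 / ((1 - p) + p/4) = 3, which solves to p = 8/9, i.e. roughly 89% of the build parallelizes -- plausible for mostly independent compilation units.)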

less power (5, Insightful)

twistedcubic (577194) | more than 7 years ago | (#19115487)

Actually, I just got a 65W Athlon X2 4600+ from Newegg which uses less power than my current six-year-old Athlon XP 1800+. The motherboard I ordered (ECS w/ ATI 690G) is supposedly also energy-efficient. I guess I could save $60 by getting a single core, but almost all single-core Athlons are rated at more than 65W. Why buy a single core when it costs more long-term and is slower when multi-tasking?

Re:less power (1)

TheThiefMaster (992038) | more than 7 years ago | (#19115925)

Why's that a reply to me? I didn't mention power/heat.

Re:less power (1)

katani (1090285) | more than 7 years ago | (#19117641)

When AMD switched over to the 65nm process, they also switched a couple of their single-core Athlon 64s. Look for the "Lima" cores for Socket AM2. AFAIK, they have a Thermal Design Power (TDP) of 45 watts.

Re:Sorry what? (1)

Chirs (87576) | more than 7 years ago | (#19115611)

Heh...our full build takes about 6 hrs on a quad. Full kernel/rootfs built from scratch for 7 separate boards using 4 architectures.

Re:Sorry what? (1)

TheThiefMaster (992038) | more than 7 years ago | (#19115857)

You're obviously working on something bigger than mine, then.

Mods, pay attention (1)

p3d0 (42270) | more than 7 years ago | (#19115431)

Someone makes this same comment every time advances in CPU technology are mentioned.

Re:Sorry what? (1)

ArcherB (796902) | more than 7 years ago | (#19115541)

I had a 2P dual-core Opteron 2.6GHz box as my workstation for several months. To be honest, I couldn't really find a legitimate use for it. And I was running Gentoo and doing a lot of my own OSS development [re: builds].

While I don't write my own stuff, as someone who runs a Gentoo-based system (Sabayon) I do spend a lot of time compiling, and I see a significant improvement in compile times with dual cores. Do you not see the same improvements when you compile your own stuff? Are you using the right switches to take advantage of the dual cores?
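For reference, on Gentoo the parallel-build switch lives in /etc/make.conf; the -j value below is just an example (a common rule of thumb is number of cores + 1):

    # /etc/make.conf
    # Portage passes MAKEOPTS to make for every package it builds.
    MAKEOPTS="-j3"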

Re:Sorry what? (1)

tomstdenis (446163) | more than 7 years ago | (#19116115)

Yeah, it's faster, but how much time do you spend building software? Maybe 15 minutes every three or four days, at most.

Tom

Re:Sorry what? (4, Funny)

ArsonSmith (13997) | more than 7 years ago | (#19117713)

This is true, but as an EMACS user I find quad core to be a huge boost to my ability to edit text files at a respectable speed.

Re:Sorry what? (0)

Anonymous Coward | more than 7 years ago | (#19115717)

Dual/quad cores make sense if (like me) you have a bunch of VMs running, and I think this is increasingly the case for developers.

Re:Sorry what? (5, Informative)

rrhal (88665) | more than 7 years ago | (#19115733)

While I think quad-cores are important for the server rooms, I just don't see the business case for personal use. It'll just be more wasted energy. Now if you could fully shut off cores [not just gate them off] when they're idle, then yeah, hey, bring it on. But so long as they sit there wasting 20W per core or whatever at idle, it's just wasted power.

AMD's cool & quiet tech will shut down individual cores when you are not using them. I believe this is all new for the Barcelona. It idles down cores when you are not using them fully. It shuts off parts of cores that you aren't using (eg the FPU if you are only using integer instructions).

Variable voltages, variable MHz's (1)

mosel-saar-ruwer (732341) | more than 7 years ago | (#19116695)


AMD's cool & quiet tech will shut down individual cores when you are not using them. I believe this is all new for the Barcelona. It idles down cores when you are not using them fully. It shuts off parts of cores that you aren't using (eg the FPU if you are only using integer instructions).

According to the last picture [imageID=9] in the Image Gallery, different cores on the same chip can run at different voltages and different MHz's:

http://www.informationweek.com/galleries/showImage.jhtml?galleryID=30&imageID=9&articleID=199501467 [informationweek.com]

Re:Sorry what? (1)

jshriverWVU (810740) | more than 7 years ago | (#19116021)

I just don't see the business case for personal use.

Depends on the business. I can definitely see these being useful in the financial segment or in animation studios. But if you're comparing it to Boss X reading email and loading spreadsheets, then I agree it's overkill.

Uh... (2, Informative)

Per Abrahamsen (1397) | more than 7 years ago | (#19117573)

I had a 2P dual-core opteron 2.6GHz box as my workstation for several months. To be honest I couldn't really find a legitimate use for it. And I was running gentoo and doing a lot of my own OSS development [re: builds].


Uh, doesn't "make -j 3" gives you a good speedup? I'd imagine multi-core being great for development, at least for compiled languages.

Re:Sorry what? (1)

Hellkitten (574820) | more than 7 years ago | (#19117699)

I had a 2P dual-core opteron 2.6GHz box as my workstation for several months. To be honest I couldn't really find a legitimate use for it. And I was running gentoo and doing a lot of my own OSS development [re: builds].

man make

-j [jobs], --jobs[=jobs]
Specifies the number of jobs (commands) to run simultaneously. If
there is more than one -j option, the last one is effective. If
the -j option is given without an argument, make will not limit
the number of jobs that can run simultaneously.

Why the fuss over 45nm? (4, Insightful)

Smidge204 (605297) | more than 7 years ago | (#19114991)

Ultimately, it's performance that makes a successful product, not gigahertz or nanometers.

Sure, the 45nm process has great potential for better performance and higher efficiency, just like faster clock speeds had great potential - until AMD made a better architecture and achieved better performance at a lower clock speed than Intel's offerings at the time.

Let's wait and see how it really performs before passing judgement.
=Smidge=

Re:Why the fuss over 45nm? (5, Funny)

Anonymous Coward | more than 7 years ago | (#19115143)

So what you're saying: size doesn't matter?

Re:Why the fuss over 45nm? (1)

DeadCatX2 (950953) | more than 7 years ago | (#19117305)

Size only matters if it's too big or too small.

Re:Why the fuss over 45nm? (2, Interesting)

Eukariote (881204) | more than 7 years ago | (#19115299)

Indeed, let's wait for the benchmarks. I would like some more real-world and 64-bit benchmarks: most recent reviews seem to have studiously avoided those in favor of synthetic, 32-bit-only benchmarks that are not very representative and are easily skewed with processor-specific optimizations.

And I'm not sure going to a 45nm process will allow Intel to step back ahead. It seems process improvements have been yielding diminishing returns in performance-related areas. Transistor density will go up, though, so Intel can compensate by adding more cache. Also, AMD's process technology is a little more advanced than Intel's at the same feature size: Intel does not do Silicon on Insulator, dual stress liners, and a few other things.

Re:Why the fuss over 45nm? (1)

Belial6 (794905) | more than 7 years ago | (#19116555)

Exactly. If AMD can make a faster/cooler processor at 65nm than Intel can at 45nm, then AMD's is the better processor. This is particularly true for the long run, as Intel is closer to hitting the size wall than AMD is.

Support? (1, Interesting)

Sorthum (123064) | more than 7 years ago | (#19115009)

Quad core is all well and good, but are there really that many apps as of yet that can take advantage of it? TFA claims this is for servers and for desktops, and I'm not certain of its utility on the latter just yet...

Re:Support? (2, Insightful)

EvanED (569694) | more than 7 years ago | (#19115173)

MAKE -j6.

Mmmmmmmm....

(-j6 instead of -j4 in an effort to counter I/O latencies... Actually, that'd be an interesting benchmark: figure out what the optimum level of parallelism is. Too little and processors will be idle; too much and context switches become an issue.)
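That benchmark is easy to sketch as a shell loop -- assuming a project with a standard "clean" target; the -j values and output redirects here are just an example:

    #!/bin/bash
    # Rebuild from scratch at several -j levels and compare wall-clock times.
    for j in 1 2 4 6 8; do
        make clean >/dev/null
        echo "== make -j$j =="
        time make -j"$j" >/dev/null
    done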

Re:Support? (0)

EvanED (569694) | more than 7 years ago | (#19115257)

Not sure why I typed "make" in capitals...

Re:Support? (4, Informative)

Mr Z (6791) | more than 7 years ago | (#19116837)

Prevailing wisdom and personal experience suggest using "-j N+1" for N CPUs. I have a 4-CPU setup at home (dual dual-core Opterons). Here are approximate compile times for jzIntv + SDK-1600, [spatula-city.org] which altogether comprise about 80,000 lines of source:

  • -j4: 6.72 seconds
  • -j5: 6.55 seconds
  • -j6: 6.58 seconds
  • -j7: 6.59 seconds
  • -j8: 6.69 seconds

Now keep in mind, everything was in cache, so disk activity didn't factor in much at all. But, for a typical disk, I imagine the difference between N+1 and N+2 to be largely a wash. N+1 seems to be the sweet spot if the build isn't competing with anything else. Larger increments might make sense if the build is competing with other tasks (large background batch jobs) or highly latent disks (NFS, etc). But for a local build on a personal workstation? N+1.

--Joe

Re:Support? (1)

PitaBred (632671) | more than 7 years ago | (#19117637)

I've found kernel builds mirror those results on my dual 3.06GHz Xeon workstation; with hyperthreading enabled, -j5 gives me the best performance.

Re:Support? (1, Insightful)

Anonymous Coward | more than 7 years ago | (#19115193)

Don't you get it? Let's say you have four processor-hungry applications that aren't multi-threaded. Cool! One runs on each core...

We've had this for YEARS. Literally, 20 years.

Re:Support? (2, Informative)

homer_ca (144738) | more than 7 years ago | (#19115211)

Photo and video editing parallelize nicely; besides gaming, those are the only CPU-intensive processes most home computers will run. On the gaming side, most games don't run any better on quad core, but Supreme Commander [hardocp.com] is one of the few that do.

Re:Support? (1)

gravesb (967413) | more than 7 years ago | (#19115515)

With consoles becoming multi-core, won't the video game industry have to learn to write games that take that into account? Before, most of their audience was single-CPU computers (with a GPU, of course) and consoles. However, now that most computers are multi-core, as are consoles, it seems like they have to make better use of that power. Of course, it may take a few years before they figure out the best way to do so and apply it consistently.

Re:Support? (2, Interesting)

vertinox (846076) | more than 7 years ago | (#19115547)

Quad core is all well and good, but are there really that many apps as of yet that can take advantage of it?

Maya 3D

Or any other 3D rendering software where every CPU cycle is used to the last drop.

But other than that, I can't think of anything off the top of my head. Multiple cores are very important to these types of apps, though: it's the difference between 12 and 6 hours of waiting for the project to render, and people will go with the 6 hours.

Re:Support? (0)

Anonymous Coward | more than 7 years ago | (#19115853)

Right because OS kernels never schedule single process apps over multiple cores... oh wait.

Re:Support? (2, Insightful)

QuasiEvil (74356) | more than 7 years ago | (#19116257)

So I suppose whatever OS you're using only has one thread/process running at a time? I've never understood the argument that multi-core doesn't benefit the desktop user. As I look at my machine right now, I have two development environments going (one actually in debug), four browser windows, an email client, an IM client, various background junk (virus scanner, 802.1x client for the wireless), and of course the OS itself - XP. None of those needs a more powerful proc, but it's nice when they're all grabbing for CPU time that I have two cores for them to run on.

At home I'm often converting images from RAW in the background and doing postprocessing on them in the foreground. RAW->JPEG conversion is CPU intensive, and it's nice that it doesn't bring my system to its knees while doing it. I can continue about my work, while the converter is maxing out one of the cores in the background.

I've had dual procs since 1996, and would never, ever, ever go back. It's so nice to not have everything stall when a background job starts hogging the CPU.

Scalability, 64-bit, and FPU (3, Interesting)

Eukariote (881204) | more than 7 years ago | (#19115099)

When it comes to multi-processing scalability, AMD's Barcelona/10h/Phenom single-die four-core with HyperTransport inter-chip interconnects will do far better than the two-die four-core shared-bus Intel chips. Also, both the old and new AMD architectures will do relatively better on 64-bit code than the Intel Core 2 architecture: Intel's micro-op fusion does not work in 64-bit mode, and their 64-bit extensions are a relatively recent add-on to the old Core architecture. The FPU power of the new 10h architecture will be excellent as well. On the other hand, Intel chips will remain very competitive on integer code and cache-happy benchmarks, particularly when run in 32-bit mode. Also, the SSE4 extensions of the upcoming 45nm Intel chips will help for encoding/decoding and some rendering applications, provided the software has been properly optimized to take advantage of them.

Core 2 Duo was 1996? (0, Redundant)

stevedcc (1000313) | more than 7 years ago | (#19115215)

This article seems to be trying to rewrite history:

Intel was seen as recovering the advantage when its introduced its Core 2 Duo family in mid 1996.

Amazing, that: a 10-year lead on dual-core parts!

Re:Core 2 Duo was 1996? (0, Redundant)

hexed_2050 (841538) | more than 7 years ago | (#19115505)

From the Article:

However, the dual-core duel became, and remains a performance battle. AMD was widely perceived to have taken an initial lead. Intel was seen as recovering the advantage when its introduced its Core 2 Duo family in mid 1996.

Wow, dual cores in 1996! Let's review history...
1994: Macintoshes using the PowerPC start shipping.
1994: Intel introduces the 486DX4 clock-tripling microprocessor.
1995: IBM announces 1 million copies of OS/2.
1995: Windows 95 is released with no small fanfare; 1 million copies sold through retail in the first 4 days.

1996: Intel introduced its Core 2 Duo family!

1997: Intel announces 200-MHz Pentium MMX
1998: First FAT32 operating system MS windows 98 released

hmm... something is wrong here.
h

Whatever they do, (0)

Anonymous Coward | more than 7 years ago | (#19115425)

I hope AMD neglects to advertise this one too.

More details at HotHardware.com (0)

Anonymous Coward | more than 7 years ago | (#19115437)

HotHardware.com has more details [hothardware.com] on the AMD Phenom processor, including die map shots and other specifications as well.

Core 2 Duo? (3, Funny)

ratboy666 (104074) | more than 7 years ago | (#19115443)

1996? Wow, have *I* been misled. Mid 1996 is the vintage of my Dual Pentium Pro 200MHz, and I *really* thought that it was state-of-the-art.

Colour me disappointed...

Re:Core 2 Duo? (1, Redundant)

hexed_2050 (841538) | more than 7 years ago | (#19115559)

From the Article:

However, the dual-core duel became, and remains a performance battle. AMD was widely perceived to have taken an initial lead. Intel was seen as recovering the advantage when its introduced its Core 2 Duo family in mid 1996.

Wow, dual cores in 1996! Let's review history...

1994: Macintoshes using the PowerPC start shipping.
1994: Intel introduces the 486DX4 clock-tripling microprocessor.
1995: IBM announces 1 million copies of OS/2.
1995: Windows 95 is released with no small fanfare; 1 million copies sold through retail in the first 4 days.

1996: Intel introduced its Core 2 Duo family!

1997: Intel announces 200-MHz Pentium MMX
1998: First FAT32 operating system MS windows 98 released

hmm... something is wrong here.

h -- I think I'll be a geek this week

Re:Core 2 Duo? (1)

truthsolo (519347) | more than 7 years ago | (#19117335)

...the vintage of my DEC 21164A stomping the Pentium Pro on the SPEC charts. Still doubles as a nice space heater :)

AMD beats intel (1)

PermanentMarker (916408) | more than 7 years ago | (#19115503)

I think AMD beats Intel; as a cheap underdog brand, it kicked Intel off the top. Overall, though, I think single or complex duo/quad cores or more are a dead end.

To get performance, a complete redesign is required, not only of the main chips. Perhaps a tandem of ARM chips using three types of memory: CompactFlash besides fast RAM and a disk. And a different approach to the memory bus; perhaps clusters of small chips, each with some memory, that together divide tasks simultaneously. Spreading out load is often cheaper than tuning up load on a single (critical) part.

It's just a matter of time before we see this. Also, I think a cheaper base material should be found; perhaps the next chips will be made by small nano-fabs.

Watch the pretty assistant while.... (0, Flamebait)

Anna Merikin (529843) | more than 7 years ago | (#19115529)

Intel, a giant corp under continued antitrust oversight, was quite happy to let AMD appear to the US authorities to be a genuine competitor, as long as AMD's market share remained a relative sliver. Once AMD's CPUs achieved a much better price/performance ratio than Intel's, Intel moved to squash them like a deer tick in Olde Lyme, CT.

Does AMD stand a chance against Intel? Not unless they can make a profit out of having less than five per cent market share for the rest of eternity. Perhaps they can.

In any event, any competition is better than none. If it weren't for AMD's great processor architecture and 64-bit extensions, we doubtless would not have Core 2 Duos and quads at an affordable price point now.

But Intel's R+D PIZZA BUDGET must be larger than AMD's total worldwide cash flow.

(Full disclosure: I build my own boxen, and have since 1991. I have never used an Intel chip in any computer I've assembled. So I have no grudge against AMD; I used them because they performed very well for a low price. They still do.)

Re:Watch the pretty assistant while.... (2, Informative)

Eukariote (881204) | more than 7 years ago | (#19115879)

They have a good chance. For one, their market share is rather higher than you make it out to be: about 20% of the 80x86 market vs Intel's 80%. Also, the computer manufacturers have an interest in keeping the competition between Intel and AMD alive. Unless they behave irrationally, they will help AMD to fully break the monopoly.

But the main thing that is pending for AMD is the antitrust lawsuit. Assuming there will be a just judgment, which is not a given with the US justice system led by the likes of Alberto Gonzales, a multi-billion dollar compensation for anti-competitive practices will fall to AMD. They have enough debt financing to last until then.

Re:Watch the pretty assistant while.... (1)

0123456 (636235) | more than 7 years ago | (#19115973)

"So I have no grudge against AMD; I used them because they performed very well for a low price."

That's odd. When I built my current PC in 2003 I looked seriously at using an AMD CPU rather than Intel, and discovered that the 'equivalent' AMD CPU was not only slower than the Intel CPU, but more expensive too. And, unlike someone who's never used an Intel CPU in their PC, I have no aversion to using which ever one is better.

The simple fact is that AMD had a brief period where they were technically better than Intel, and then Intel took back the lead. Right now Intel chips are simply better than AMD's, and unless the next generation beats Intel, they're going to be in trouble.

Personally I hope they do recover, because the more competition we have in the CPU market the better, and some of the new features they've talked about do appear to have the potential for a useful performance boost. That said, I'm not buying any AMD stock.

That's ONE theory, I guess.... (1)

King_TJ (85913) | more than 7 years ago | (#19116227)

You might even be right, but my instinct says no.

Intel should have realized, from how the U.S. govt. treated Microsoft and others, that they weren't in NEED of someone like AMD cutting deeply into their sales for a while with truly competitive products.

I'd say your scenario would hold much more merit if the govt. had already broken up Microsoft into separate divisions or something....

The fact is, AMD has occasionally built a very comparable, yet cheaper alternative to Intel's offerings. (Remember the success of AMD's response to Intel's 486DX CPUs?) Whenever they pull this off, they do quite well, UNTIL Intel ups the ante with another huge R&D effort, and produces something "next gen" that AMD has to counter.

I don't think Intel is afraid of AMD "taking over the marketplace" in processors.... but I don't think they'd be wise to ignore them as irrelevant either (or only existing as a "straw man" for the sake of legal arguments). They certainly proved themselves much more of a competitor than, say, Cyrix.

AMD IS Doomed to Always Be a Follower Unless... (1, Offtopic)

MOBE2001 (263700) | more than 7 years ago | (#19116075)

It seems that AMD's research department is only concerned with beating Intel at its own game. This is foolish, IMO. AMD is doomed to always be a follower unless its engineers can come up with a revolutionary new CPU architecture based on a revolutionary software model. The new architecture must address the two biggest problems in the computer industry today: reliability and productivity. Unreliability puts an upper limit on how complex our software systems can be. As an example, we could conceivably be riding in self-driving vehicles right now, but safety and reliability concerns will not allow it. Why? Because there is something fundamentally wrong with software. Fortunately, a software model that solves these problems already exists. It is called the "non-algorithmic, synchronous, reactive" software model. That's what Project COSA [rebelscience.org] is about.

Re:AMD IS Doomed to Always Be a Follower Unless... (2, Interesting)

DrMrLordX (559371) | more than 7 years ago | (#19116571)

Exactly why is AMD a fool to be concerned with "beating Intel at its own game"? Even Intel tried coming out with a revolutionary new CPU architecture, and look where that got them. Itanic has been undermined by Intel's own Xeon processors. The market has spoken, and it wants x86. Not even Intel has been able to change that (yet).

A smaller firm operating on tighter margins like AMD could easily go belly-up trying to break out with a new CPU microarchitecture. At least Intel could afford all of Itanic's failures.

Re:AMD IS Doomed to Always Be a Follower Unless... (1)

Nasarius (593729) | more than 7 years ago | (#19117647)

Itanic has been undermined by Intel's own Xeon processors. The market has spoken, and it wants x86. Not even Intel has been able to change that (yet).

This probably has more to do with the fact that IA64 was garbage than any inherent attachment to x86. Microsoft even went to great lengths to support it, which is much more than you can say for SPARC or POWER. There's plenty of room, especially in the *nix server market, for processors unrelated to x86. With Linux or the BSDs, all you really need to do is send the specs and some reference hardware to a few key developers, contribute some code to GCC, and you'll be fully compatible in a matter of months. The Itanic went down solely because of its many flaws.

Re:AMD IS Doomed to Always Be a Follower Unless... (2, Informative)

Anonymous Coward | more than 7 years ago | (#19116947)

Wait... what? AMD is following Intel? Mind telling me how that is, exactly? IIRC, both Intel and AMD are using the 64-bit extensions that... guess who... AMD made on their Athlon 64 processors first. Also, AMD was first to move their processors away from a shared bus. The reason they say their processors are "true" dual or quad core is that their architecture was designed to scale better. Take a look at the multi-processor benchmarks compared to NetBurst, and even take a look at how much better AMD processors today scale with the number of cores compared to Intel's lineup.

Following? Hardly.

Re:AMD IS Doomed to Always Be a Follower Unless... (1)

MOBE2001 (263700) | more than 7 years ago | (#19117583)

Wait... what? AMD is following Intel? Mind telling me how that is, exactly?

They are following because they are barely making a profit, the last I heard. Why? Because they have to drastically cut prices to compete head-on with Intel. With a new architecture and a new market niche (mostly embedded systems and mission-critical systems), they would leave Intel in the dirt. The desktop market would follow soon afterwards, when the industry comes to its senses and realizes that it has been doing it wrong ever since Lady Ada wrote the first algorithm. The good thing is that AMD has the engineering resources and know-how to pull it off. One man's opinion, of course.

What's new? (1)

ghoul (157158) | more than 7 years ago | (#19116105)

What's new in the article? All this was announced at the conference in Germany in January!!! Why is Slashdot even posting this? The only interesting thing is that the hype has made AMD's share price go up 10% in the last two sessions :)

parroting press releases != journalism! (0)

Anonymous Coward | more than 7 years ago | (#19116185)

This is, again, US-style journalism: just parroting press releases. No depth, no investigation, no knowledge involved. These lazy writing parrots have become so f***ing annoying and a waste of my time.
Please post links to articles when the content includes some originality and knowledge; otherwise just link to the press releases themselves!

Yeah, but when can I buy quadcores from AMD? (1)

hirschma (187820) | more than 7 years ago | (#19116433)

They seem a bit slippery there. When will the Barcelona Opterons ship? Anyone know?

Re:Yeah, but when can I buy quadcores from AMD? (2, Informative)

DrMrLordX (559371) | more than 7 years ago | (#19116641)

You might be able to get them in Q4 2007 [wikipedia.org]. With launch dates of August 2007, we'll probably see the actual chips in retail channels by October. OEMs/builders should have products featuring the new Opterons much earlier.