
The Chip That Changed the World: AMD's 64-bit FX-51, Ten Years Later

Unknown Lamer posted 1 year,6 days | from the the-day-unix-workstations-died dept.


Dputiger writes "It's been a decade since AMD's Athlon 64 FX-51 debuted — and launched the 64-bit x86 extensions that power the desktop and laptop world today. After a year of being bludgeoned by the P4, AMD roared back with a vengeance, kicking off a brief golden age for its own products, and seizing significant market share in desktops and servers." Although the Opteron was around before, it cost a pretty penny. I'm not sure it's fair to say that the P4 was really bludgeoning the Athlon XP though (higher clock speeds, but NetBurst is everyone's favorite Intel microarchitecture to hate). Check out the Athlon 64 FX review roundup from 2003.


The old days (5, Insightful)

Thanshin (1188877) | 1 year,6 days | (#44949179)

Those were the good old days. How I miss when it took me one day at most to learn about all the options I had for building a gaming computer, with enough detail to make an informed decision about what bits and pieces to build it with.

Nowadays, just piercing the veil of lies, half-truths, false reports, and bought reviews makes the entire process incredibly tedious and frustrating.

Made me miss the old Slashdot (4, Insightful)

ElementOfDestruction (2024308) | 1 year,6 days | (#44949227)

It seems we've lost a lot of quality in the comment fields over the last 10 years. Expertise used to be modded up carefully; now opinion pieces get moderated up by whichever group happens to be awake at the time, and the real expertise is hidden at +2 or below.

Re:Made me miss the old Slashdot (5, Insightful)

Billly Gates (198444) | 1 year,6 days | (#44949407)

Yeah, like the hot grits down your pants, Natalie Portman naked and petrified, the GNAA trolls, penis-bird posts in ASCII, and of course Goatse. Who could forget? I mean LITERALLY, try to forget! One Goatse troll sat at +3 and drew 90 "MY EYES?!" replies, thanks to a moderator trying to be funny.

No, I don't miss those days; we tend to remember only the good ones.

Re:Made me miss the old Slashdot (1)

Beardo the Bearded (321478) | 1 year,6 days | (#44949711)


It was a nice break from the usual trolling. The recipe was legit, too.

Re:Made me miss the old Slashdot (0)

Anonymous Coward | 1 year,6 days | (#44949539)

It's Bush's fault.

Re:Made me miss the old Slashdot (3, Interesting)

iroll (717924) | 1 year,6 days | (#44949957)

You must have read different articles than I did, because 10 years ago it was "Micro$oft $hills," "Apple Fanboys," etc. You do know that this was the origin of "No wireless. Less space than a Nomad. Lame," right? And that was 2001.

Re:The old days (5, Insightful)

Dunbal (464142) | 1 year,6 days | (#44949277)

It's still pretty much common sense. You want a fast CPU, so skip the top-of-the-line $1,000 chip; take a step back or two and go for one selling in the $300-$500 range. Get a motherboard for that chip from someone you trust: ASUS, Gigabyte, etc. Again, never the $500 "gamer" board; take a step back and there are some really nice ones for $200 or so. Latest-generation graphics card, or top end from the last generation (assuming prices have come down), with plenty of memory on the card. A power supply that can feed the card what it needs and then some. Plenty of system RAM. An SSD. Water or air cooling suited to your CPU. And you're set! Shouldn't take a whole "day" to check those out; an hour or two would suffice.

Re:The old days (0)

Anonymous Coward | 1 year,6 days | (#44949567)

I can attest to this. My current computer (outdated, but still very fast) has DDR2 memory, a Q9550 Yorkfield Core 2 Quad, and an Nvidia GTX 260. None of it was high-end at the time, except the Yorkfield processor, which was brand new; even then it was somewhere in the $200-$400 range (can't remember if it was $200-$300 or $300-$400). My motherboard supported the fastest DDR2 RAM available and cost ~$170. The graphics card was in the $200-$300 range. At the time, the popular gamer benchmark was running Crysis, which this computer could do for hours on end without slowing down (because it took a long time to leak through 8GB of memory). This particular computer has never had a problem, ever; it destroys anything I throw at it. The only problem? Newer PC games are coming out that will use better graphics, and my card in particular doesn't do DirectX 11 very well. To upgrade I really need new RAM (DDR3), which means a new motherboard, which means I might as well buy a new computer outright.

I'm currently planning to buy the parts to build a new computer, but only because I like building computers and would like better graphics. I've specced out a computer for ~$2,500 (including 2 monitors and Windows 7 64-bit, not sure which edition just yet), but really my current computer has no problems at all. I'll probably sell it for $800-$1,500 to recoup some of the cost of the new one. I probably spent 8 hours sifting through Tom's Hardware reports on various parts; not necessarily so I could make an informed decision, but because I genuinely enjoy reading the articles comparing the latest hardware. I'm not buying the parts right now, probably Q1-Q2 next year, so I'll get to wait and see how Haswell pans out, and there will probably be slight modifications to what I've picked out so far.

Re:The old days (3, Interesting)

OolimPhon (1120895) | 1 year,6 days | (#44949641)

You claim to be a geek and you're contemplating getting rid of an old computer?

All my old computers ended up being used for something else. I only get rid of them when the architecture is so old that <OS of choice> won't run on it any more (or when the smoke comes out!). Device drivers are the things that limit usage to me.

Re:The old days (-1)

Anonymous Coward | 1 year,6 days | (#44950127)

shut up hipster

Re:The old days (2)

Bigbutt (65939) | 1 year,6 days | (#44950367)

I did that for a bit but when I got to seven computers sitting idle in the closet, I took them down to the electronics recycling bin. Heck, I'm even looking at my old Sun box and considering punting that one as well. That will leave me with 4 computers that are regularly in use plus the tablet and phone.


Re:The old days (1)

Anonymous Coward | 1 year,6 days | (#44950355)

$800 to $1,500 for your old computer?

What do you think you have in there?

And $2,500 for a new computer?? Damn, your target is way overpriced.

Re:The old days (1)

interkin3tic (1469267) | 1 year,6 days | (#44949869)

I'd need a day before I felt that I actually knew enough to start buying rather than just THINKING I knew enough.

Re:The old days (4, Informative)

Dagger2 (1177377) | 1 year,6 days | (#44949991)

And then you end up either with an i7 4770 which has a locked multiplier, or a 4770K which doesn't do VT-d. Then you realize that there is no Intel CPU that'll do both. So then you start looking at AMD, in the hope that they don't pull shit like that with their CPU models. And then you're way over your hour or two budget.

Re:The old days (4, Insightful)

ArcadeMan (2766669) | 1 year,6 days | (#44949291)

The good old days were the 286 era, when all you needed to know was the clock speed of the CPU, EGA was four times better than CGA, and SoundBlaster was AdLib compatible.

Of course, you had to deal with XMS and EMS memory settings, loading your mouse driver into high memory, and solving IRQ and DMA conflicts between your ISA add-on cards.

Screw that, the good old days are today. Take the iMac out of the box, plug it into the wall socket, and start using it right away.

Re:The old days (2)

SB9876 (723368) | 1 year,6 days | (#44950209)

I'll second that. You haven't known pain until you try to get Ultima 7 to run on a system with a Proaudio Spectrum 16 sound card.

Re:The old days (5, Funny)

yurtinus (1590157) | 1 year,6 days | (#44950365)

Y'know, I was enjoying reading all the little nuggets of wisdom (video cards that could use as much as 512MB of address space, $700 for 2GB of RAM). Then I was thinking "hey, the computer I had before this one was an Athlon 64, it wasn't *that* long ago!" Then I realized it was. Then I felt old. Now I'm crying.

Re:The old days (1)

Anonymous Coward | 1 year,6 days | (#44949293)

The only difference is that today you know you're being lied to. It's called growing up.

Re:The old days (1)

Nadaka (224565) | 1 year,6 days | (#44949359)

These days, aim for a price point of $1k with competitively priced components and you are almost certain to get a decent gaming rig. PC hardware is far ahead of the curve, thanks in part to the extreme production costs of high-quality graphics, and in part to console hardware holding back the standard quality settings for multi-platform releases. That will give you medium or better settings at 1080p on all current and foreseeable games.

Re:The old days (1)

h4rr4r (612664) | 1 year,6 days | (#44949495)

$1000? You can beat the consoles for less than $500.
If you keep your old case and dvd/bluray drive you can do even better. I tend to swap out MOBO, RAM, CPU and GPU in one shot every few years. I have not had to do this since I picked up my GTX465.

The consoles are holding gaming so far back there is no point in spending even $1000.

Re:The old days (0)

Anonymous Coward | 1 year,6 days | (#44949621)

Unless you want more out of your computer than gaming. Or if you're a PC modding enthusiast. Or if you don't want to upgrade every year, and want a machine that will last for a decade.

Re:The old days (1)

h4rr4r (612664) | 1 year,6 days | (#44949755)

My machine also runs KVM, and various database software. I have not upgraded since that video card, which is more than a year old.

Re:The old days (1)

Nadaka (224565) | 1 year,6 days | (#44949851)

$1000 includes all parts of a computer, including monitor. Reusing previously bought parts does not greatly reduce the actual cost of the computer as you lose the opportunity to use that other hardware for other purposes.

And that $1000 will likely play all current games at high settings, and have the ability to play foreseeable future games with at least medium settings.

A new generation of consoles is coming out, and the base quality of graphics will rise to meet the abilities of that new hardware, and that may go beyond where your $500 machine can keep up.

Re:The old days (1)

h4rr4r (612664) | 1 year,6 days | (#44950099)

How many monitors do you need?
If you had no other use for that hardware it sure does. SSDs, monitors, good cases those are expensive.

We have seen the new consoles already and no they will not exceed a $500 PC at this point.

Re:The old days (3, Informative)

asmkm22 (1902712) | 1 year,6 days | (#44949693)

Not really sure what you're smoking. It's much easier to put together a computer (including a gaming computer) these days than it was 10 years ago. We don't really have to worry if we need PC-133, PC-2700, DDR1, DDR2, etc.. There's no need to choose between AGP, PCI, or that new-fangled PCI-Express, much less whatever multiplier is involved. Hard drives are straight up SATA now, and it doesn't matter if you choose a disk or SSD type. The graphics cards themselves aren't even as important since the console cycle has pretty much bottlenecked as a result of developers focusing on those consoles first and foremost. We don't need to do much more than make sure the motherboard is either an Intel or AMD socket.

In fact, about the only real difficult decision you might need to make these days is finding a computer case that has enough room to use a modern video card.

Re:The old days (1)

bemymonkey (1244086) | 1 year,6 days | (#44949723)

To be honest, I'm loving the ease of putting together a decent system these days. I actually owned an Athlon64 based system back in the day (with an expensive high-end SLI nForce-based board), and that sucker was never completely stable. Same thing with the AthlonXP generation, and K7 (Athlon/Duron) beforehand...

These days, I just pick the Intel chip that fits my needs, buy the cheapest name-brand board that fits my needs, slap it together, and it's rock solid. Celerons, Pentiums, Core iX, whatever... hell, even overclocking 30-40% (I'm sitting next to a Sandy Bridge machine that's running at about 45% OC'd) only takes a few seconds and is rock solid.

From what I've heard, AMD's current chips are similarly stable, but I'm not really willing to risk it.

Happy Tuesday from The Golden Girls! (-1)

Anonymous Coward | 1 year,6 days | (#44949185)

Thank you for being a friend
Traveled down the road and back again
Your heart is true, you're a pal and a cosmonaut.

And if you threw a party
Invited everyone you knew
You would see the biggest gift would be from me
And the card attached would say, thank you for being a friend.

Re:Happy Tuesday from The Golden Girls! (-1)

Anonymous Coward | 1 year,6 days | (#44949451)

The word is "confidant", not "cosmonaut".

Before AMD committed suicide (1)

Anonymous Coward | 1 year,6 days | (#44949225)

They swooped in when Intel was being stupid, made the best chips in the world... then committed suicide and haven't built a competitive chip in 3 years. Sad times...

Re:Before AMD committed suicide (4, Insightful)

bill_mcgonigle (4333) | 1 year,6 days | (#44949289)

AMD is very competitive for many-core workloads. Getting an equivalent core count from Intel can cost as much as a second AMD system. AMD has gone wide, Intel has gone deep. Both have their applications.

Re:Before AMD committed suicide (3, Insightful)

Billly Gates (198444) | 1 year,6 days | (#44949465)

Tell that to Tom's Hardware and the others whose x87 benchmarks and games like Skyrim show an AMD 8-core being handed a smackdown by an i3.

No one believes in AMD anymore.

Re:Before AMD committed suicide (4, Insightful)

h4rr4r (612664) | 1 year,6 days | (#44949517)

For games sure, but there are lots of workloads that are not games.

Re:Before AMD committed suicide (2)

h4rr4r (612664) | 1 year,6 days | (#44949537)

and many games are not made by Bugthesda.

Re:Before AMD committed suicide (1)

Anaerin (905998) | 1 year,6 days | (#44949677)

And for those computations, at the desktop level, one Intel core is approximately as fast as two AMD cores. Intel has MUCH better IPC (instructions per clock) and better reordering and lookahead than AMD, and has since Intel introduced its "Core" microarchitecture. This is why a mid-range Intel part (say, a Core i5-4670K) can handily (and significantly) beat AMD's top-of-the-line desktop CPU (an FX-8350).

Re:Before AMD committed suicide (1)

h4rr4r (612664) | 1 year,6 days | (#44949717)

My desktop is used to run KVM; please tell me all about how a single-core Intel would be good enough. I shall wait.

Re:Before AMD committed suicide (1, Insightful)

Anaerin (905998) | 1 year,6 days | (#44949909)

It probably wouldn't. But a dual-core Intel processor would be as good as (or better than) a quad-core AMD. And a quad-core Intel would be as good as, or better than, an 8-core AMD. Especially with Intel's Hyperthreading enabling 2 cores-worth of processing to be handled on a single core.

Re:Before AMD committed suicide (0)

Anonymous Coward | 1 year,6 days | (#44949959)

I have a dual-core Pentium that regularly handles 8 Windows XP virtual machines at once doing various tasks, mainly browser testing, and it handles them pretty well; that's 4 per core. I don't have a newer AMD, but I had an old FX quad-core that struggled with 2 virtual machines. So in my book the battle for cores became irrelevant a while ago, and MHz too, because the Intel is 2.4GHz and the AMD was 2.8GHz.

Re:Before AMD committed suicide (1)

FreonTrip (694097) | 1 year,6 days | (#44950017)

It depends an awful lot on the workload, though. For gaming it's one-sided in Intel's favor to the tune of around 2/3 more work done per clock on average (sometimes more), but for video encoding with x264 the sheer core count makes it better than competitive with Intel unless you're willing to pay noticeably more. It's a behemoth for virtual machines, it plays video games well enough, and for scientific computation I really haven't found myself wanting. Granted, I'm an edge case...

Re:Before AMD committed suicide (2)

Anaerin (905998) | 1 year,6 days | (#44950247)

Perhaps you should read the second chart here. That's testing encoding with HandBrake (which is essentially a graphical frontend to x264). In that particular test, the i5-4670K wipes the floor with the comparably priced FX-8350, even without the former's huge overclocking potential.

Re:Before AMD committed suicide (1)

Anonymous Coward | 1 year,6 days | (#44950189)

This is why a mid-range Intel part (Say, an Intel Core i5-4670K) can handily (and significantly) beat AMD's top-of-the-line desktop CPU (An FX-8350)

Really? CPU Benchmarks says:

i5-4670K: 7,531
AMD FX-8350: 9,091

A comparable Intel chip would have to be closer to the i7-3820, not your i5. Perhaps your benchmarks are a little crappy?

Anyway, AMD is far more $$$ efficient for typical desktop. Yes, including any thermal envelope differences.

Re:Before AMD committed suicide (2)

0123456 (636235) | 1 year,6 days | (#44949573)

Tell that to Tom's Hardware and the others whose x87 benchmarks and games like Skyrim show an AMD 8-core being handed a smackdown by an i3.

True, it's horrible that review sites benchmark CPUs using the kind of programs people actually run on them.

I remember back when I bought my P4, the only thing a similarly priced Athlon XP really beat it on were x87-intensive games. Professional 3D apps using SSE were significantly faster on the P4, which is why I ended up buying it instead.

Re:Before AMD committed suicide (2)

bored (40072) | 1 year,6 days | (#44950261)

True, it's horrible that review sites benchmark CPUs using the kind of programs people actually run on them.

Yes and no. If you're a gamer, obviously having a CPU that the games are optimized for is a big win. But don't extrapolate general performance from a single benchmark, especially when one of the CPU vendors is providing "free" performance help for the game/application.

At this point, it's pretty clear that Intel is the correct choice for big-name games.

We will see if this changes over the next few years with the consoles being AMD-based. The game companies are going to be optimizing for those platforms; we will see if any of that carries over into game benchmarks for desktop machines. In many cases it's possible to get a 2x delta by optimizing for a particular CPU/platform at the expense of the general case.

As for SSE, AMD has always been a little behind (well, duh: they follow whatever Intel is doing, and it takes them a while to catch up), so if your application is built for the latest version of SSE and is gaining something from it, running on a chip without it hurts. We saw this recently with SSE 4.1/4a, where a couple of useful instructions on some code paths didn't exist on the AMD chips and hurt them on benchmarks using SSE4.

Toms = Intel PR (0, Troll)

charnov (183495) | 1 year,6 days | (#44949589)

Toms got outed years ago as being paid by Intel. If you want good, unbiased reviews of games and gaming hardware, go to HardOCP.

Re:Before AMD committed suicide (2)

bill_mcgonigle (4333) | 1 year,6 days | (#44949771)

Tell that to Tom's Hardware and the others whose x87 benchmarks and games like Skyrim show an AMD 8-core being handed a smackdown by an i3.

Awesome, you found a workload where deeper is better. Now go try costing out a cluster with hardware virtualization and ECC RAM to support several thousand SMP virtual machines and see what you come up with.

Re:Before AMD committed suicide (5, Insightful)

bored (40072) | 1 year,6 days | (#44950115)

The SPEC benchmarks tell a different story, and tend to be more representative, because each vendor does its best rather than Intel/Nvidia providing "free" performance-enhancement advice to game companies.

So, from my own experience, the AMD/Intel story is a lot closer than some of these benchmarks might lead you to believe, especially for server applications.

It's pretty easy with modern CPUs to make fairly small tweaks that favor one CPU or another. We have a bunch of microbenchmarks for our application, and things like memcpy performance can swing 2x-4x. Even the depth of loop unrolling matters: in one loop, the Intel may like a 2x unroll and the AMD a 4x. With each version tuned for its own platform, the bottom-line performance is often quite similar; but run the AMD-optimized build on the Intel, or the reverse, and suddenly one CPU appears to be trouncing the other.

zombie computing (1)

Anonymous Coward | 1 year,6 days | (#44949709)

And, of course, AMD is a much cheaper option to get ECC on the desktop. Which is less and less a luxury and more and more a necessity. Simply because the memory densities keep going up, yet the MTTF doesn't go down as much (if at all). Same problem with hard drives, actually, so RAID doesn't really cut it any longer. Need parity files on the disk itself as well as at least mirrored disks.

Still and all, this sort of reminiscing makes me long for Alpha and PA-RISC. ARM isn't quite there yet, and MIPS isn't available in the top-performing brackets any longer either. And POWER? Well, few people can take that much gouging.

Re:zombie computing (1)

bill_mcgonigle (4333) | 1 year,6 days | (#44949861)

Need parity files on the disk itself as well as at least mirrored disks.

You need ZFS. :) No, really, it checksums all the writes, which reflects the modern reality. I've got a machine at home in the basement that is effectively just ECC RAM and a bunch of disks (RAID-Z on that one I think, RAID-Z2 at work), to store our home data. It's still cheaper to do it in one spot and then run non-ECC hardware elsewhere, accessing the reliable data over gigabit.

On my laptop, where I have many fewer options, I've just got ZFS running on top of a single LUKS volume. But for the same reasons (and compression helps on the laptop).

Re:Before AMD committed suicide (2)

Nadaka (224565) | 1 year,6 days | (#44949415)

They are still competitive on the performance per $ scale, and provide cpu's adequate for almost all standard needs.

Just Replaced (2)

MightyYar (622222) | 1 year,6 days | (#44949235)

I only just replaced my Athlon 64 motherboard and processor this spring. It was a good product, but not quite up to running Windows 8 IMHO.

10 years later and applications are still 32bit. (4, Insightful)

Anonymous Coward | 1 year,6 days | (#44949299)

10 years later and we're still running games and applications that are 32bit that only use a single core.

Re:10 years later and applications are still 32bit (2)

alen (225700) | 1 year,6 days | (#44949475)

For gaming, the GPU took over most of the work, which is the way it should have happened.

For applications, most don't really need 2 cores; even running multiple apps at the same time, you don't really need 2 cores. I was playing MP3s on a computer in the 1990s with minimal CPU usage. There is no way you need to dedicate a whole core to music while surfing the internet, or some of the other idiotic use cases people make up.

Re:10 years later and applications are still 32bit (0)

Anonymous Coward | 1 year,6 days | (#44949553)

Then you must have not played Skyrim with 4k texture mods, crashing to desktop all the time because it reaches its magic max memory limit.

Re:10 years later and applications are still 32bit (1)

Anonymous Coward | 1 year,6 days | (#44949583)

You mean modifying a game outside of its intended scope makes it work in unexpected ways?

That's so weird.

No not really (1)

Sycraft-fu (314770) | 1 year,6 days | (#44949757)

On high end games, the CPU gets hit hard. AI, physics, etc, all need a lot of power. Battlefield 3 will hit pretty hard on a quad core CPU, while hitting hard on a high end GPU at the same time.

Re:10 years later and applications are still 32bit (1)

adolf (21054) | 1 year,6 days | (#44950295)

When I'm waiting for an application to do whatever that application is doing, and that application is only using one core, then yes, I really do need it to use more than one core.

To suggest otherwise is also to suggest that computers are fast enough, and that general-purpose computing is a solved problem.

I don't think we're anywhere near that point just yet.

Re:10 years later and applications are still 32bit (1)

ciderbrew (1860166) | 1 year,6 days | (#44949499)

Fair point, and the follow-up is: WHY?

Re:10 years later and applications are still 32bit (0)

Anonymous Coward | 1 year,6 days | (#44950095)

From the Steam hardware survey, around 10% of computers still run a 32-bit OS (Windows XP, 32-bit Windows Vista, ...), and development of current games started when that percentage was even higher.

The multicore situation is another problem: getting multi-threading right and bug-free is hard, and getting more than a marginal performance increase might require an almost complete redesign of engines that have been in development for over a decade (just a few keywords: shared mutable state, consistency, synchronisation overhead, deadlocks, visibility).

Re:10 years later and applications are still 32bit (1)

Waffle Iron (339739) | 1 year,6 days | (#44949653)

10 years later and we're still running games and applications that are 32bit that only use a single core.

At least 64-bit OSes are widespread now.

Almost ten years after the 80386 was introduced, most people were still running "OSes" which were little more than GUI shells running on 16-bit DOS.

Re:10 years later and applications are still 32bit (1)

yuhong (1378501) | 1 year,6 days | (#44950313)

In my blog article on the OS/2 2.0 fiasco, I mentioned that Caldera actually sued MS over the fact that Win9x was still based on DOS; OS/2 never depended on DOS.

Re:10 years later and applications are still 32bit (1)

UnknowingFool (672806) | 1 year,6 days | (#44949817)

Damnit people, do not give Adobe's Flash developers any ideas!!

Re:10 years later and applications are still 32bit (0)

Anonymous Coward | 1 year,6 days | (#44949835)

Trololol stop using Windows. Linux and Mac are generally 100% 64-bit for what most people are using.

Re:10 years later and applications are still 32bit (1)

timeOday (582209) | 1 year,6 days | (#44950249)

The need for computers that can run multiple programs concurrently with a total of more than 4GB of RAM is greater than the need for any single program to use multiple cores or more than 4GB of RAM.

Re:10 years later and applications are still 32bit (1)

Anonymous Coward | 1 year,6 days | (#44950275)

Not for much longer. DICE has already announced that some of their new Frostbite 2 games will be 64-bit only, due to memory requirements beyond 2 GB.

Great processor (1)

the_humeister (922869) | 1 year,6 days | (#44949305)

Too bad AMD was just resting on its laurels after that. Incidentally, in 2 more years you can start making your own Pentium Pro-compatible processor without violating any patents (assuming you're using the same patents that went into the Pentium Pro).

Error in 32/64 bit libraries. Please reinstall (3, Interesting)

ObsessiveMathsFreak (773371) | 1 year,6 days | (#44949327)

Wow. Ten years. And here I am still dealing with 64-bit incompatibility issues every six months or so.

Out of curiosity, how long did 16-bit library problems linger after the 32-bit move?

Re:Error in 32/64 bit libraries. Please reinstall (1)

osu-neko (2604) | 1 year,6 days | (#44949435)

Library problems? Were there ever 16-bit programs that were not statically linked to their libraries?

.dll16 in WINE seems to indicate so. (0)

Anonymous Coward | 1 year,6 days | (#44949565)

And I'm pretty sure 'dll hell' started in the windows 3.x and below era. Although it's possible I am wrong.

Re:Error in 32/64 bit libraries. Please reinstall (1)

0123456 (636235) | 1 year,6 days | (#44949779)

You don't really think that all those 16-bit Windows apps statically linked in every Windows library, do you?

Re:Error in 32/64 bit libraries. Please reinstall (1)

Waffle Iron (339739) | 1 year,6 days | (#44949879)

Library problems? Were there ever 16-bit programs that were not statically linked to their libraries?

Yes, and it was a largely manual process. I suggest you find an old 16-bit Windows programming textbook and learn about the hoops people used to have to jump through to implement dynamic linking. When software versions changed, then as now, what could possibly go wrong?

Re:Error in 32/64 bit libraries. Please reinstall (0)

Anonymous Coward | 1 year,6 days | (#44949523)

10 years on, and yesterday I discovered that Photoshop Elements 11/12 is 32-bit and limited to 3.2GB of RAM. Adobe's sterling work continues.

Re:Error in 32/64 bit libraries. Please reinstall (0)

Anonymous Coward | 1 year,6 days | (#44949593)

They barely existed.
Usually most applications were largely self-contained, so there were never any linking issues.
Writing shared libraries on 16 bit was difficult anyway and thus it was rarely done.

Re:Error in 32/64 bit libraries. Please reinstall (0)

Anonymous Coward | 1 year,6 days | (#44949625)

Until about 2 weeks ago. They seemed to clear up quite a bit then.

Re:Error in 32/64 bit libraries. Please reinstall (1)

sl4shd0rk (755837) | 1 year,6 days | (#44949645)

Wow. Ten years. And here I am still dealing with 64-bit incompatibility issues

10 years? Some of us are still waiting to reap the benefits of MMX extensions. Ha..

Re:Error in 32/64 bit libraries. Please reinstall (-1)

Anonymous Coward | 1 year,6 days | (#44950037)

That is because all Linux distributions still compile everything for the i386 for compatibility reasons.

Re:Error in 32/64 bit libraries. Please reinstall (0)

Anonymous Coward | 1 year,6 days | (#44950231)

There is zero truth in your statement.
I just don't know whether or not it's intentional.

Re:Error in 32/64 bit libraries. Please reinstall (1)

FreonTrip (694097) | 1 year,6 days | (#44950057)

And in AMD64 MMX (along with 3DNow!) is officially deprecated in favor of the SSE instructions. Like a twisting of the knife.

Re:Error in 32/64 bit libraries. Please reinstall (1)

KliX (164895) | 1 year,6 days | (#44949953)

They're still here.

16 to 32 was mostly incompatible (1)

erice (13380) | 1 year,6 days | (#44950371)

Wow. Ten years. And here I am still dealing with 64-bit incompatibility issues every six months or so.

Out of curiosity, how long did 16-bit library problems linger after the 32-bit move?

16 to 32 was a much more radical change: segmented to flat addressing. In the Unix line, this happened in the early 80s (late 70s?) when few systems were deployed.

In the Wintel line, it was also cooperative to preemptive multitasking. Very painful. Very manual. It took 10 years just to let go of 16-bit device drivers, and many were never ported.

Classic Mac, Amiga, and Atari ST had an easier time, since their "16-bit" systems were already 32-bit internally. Even then you had a few years of dealing with geniuses who stored data in the upper 8 bits of pointers, and some significant software (like AmigaBasic) was never ported.

Bludgeoned? (0)

Anonymous Coward | 1 year,6 days | (#44949439)

Someone is smoking the crack pipe again. That was the golden age of AMD. Intel had slower chips running hotter including the P4... Intel ate AMD dust until they did a complete redesign.

Still better IMHO (2, Insightful)

s.petry (762400) | 1 year,6 days | (#44949503)

AMD still makes a better chip for many FP-intensive applications, and its prices still beat Intel's to boot. Intel always made a big deal about clock speed, while AMD worked on actual performance. It is really a shame that people pay more attention to marketing than real performance.

Re:Still better IMHO (2)

Anaerin (905998) | 1 year,6 days | (#44949729)

Maybe you should look at the actual performance numbers. Intel is performing better than AMD, and at a cheaper price point. And unfortunately, I'm an AMD fan, running a hexcore Bulldozer here.

Re:Still better IMHO (1)

s.petry (762400) | 1 year,6 days | (#44950181)

Instead of relying on a web site whose testing depends on its funding, perform actual benchmarks yourself. I do, and see mixed results today. 5 years ago, for math-intensive apps, AMD was the hands-down winner. Apps like Muses, Nastran, Abaqus, Dyna, etc...

Re:Still better IMHO (1)

bemymonkey (1244086) | 1 year,6 days | (#44949799)

Aren't Intel's chips faster clock for clock right now? Not to mention much more efficient?

Re:Still better IMHO (0)

s.petry (762400) | 1 year,6 days | (#44950253)

If speed was all that mattered in a chip you would have a point. Speed is not all that matters; it never has been and never will be. The memory bus is much better on an AMD chip. FP instructions are faster and better on AMD chips. If it takes .01ns to get an instruction to memory on one chip because of bus length and .001ns on an AMD because of a shorter bus, whose chip is better?

Clock speed is a misleading metric, and go back 10 years and AMD was telling you that speed should not be the only rating on a chip. You probably forgot to read it, or forgot that you did read it.

It's like having a city with a highway around it, think Houston or DC. You can drive 65mph around the city on the freeway. To you, I'm going slow on a 35mph road. Who makes it from the south side of town to the north first? Not you, assuming I know a better route. That's why chip-speed arguments are a measure of ignorance.

Re:Still better IMHO (1)

bemymonkey (1244086) | 1 year,6 days | (#44950353)

Hmmm, I was actually using the word "faster" in a "get shit done more quickly" sense - not higher clock speeds. I.e. a 3GHz Haswell i5 being faster than a 3GHz Bulldozer (or whatever the latest generation is called) for a purely CPU-limited single-threaded workload. That's what I'm asking - is this not still the case?

The fact that I can crank an unlocked i5/i7 to 4.5-5.0 GHz without any issues whatsoever is just icing on the cake.

Bludgeoning, vengeance, kicking, seizing? (1)

Sir or Madman (2818071) | 1 year,6 days | (#44949541)

Must we be so violent? This is CPU sales not some barbaric conflict.

Comparison with current CPUs (1)

IYagami (136831) | 1 year,6 days | (#44949545)

I was hoping to find a current review of the processor against current CPUs....

However, in AnandTech bench you can compare an AMD Athlon X2 4450e (2.3GHz - 1MB L2) with current CPUs. If you compare it to an Intel Core i7 4770K (3.5GHz - 1MB L2 - 8MB L3, one of the best CPUs right now), you will find that the Intel CPU is between 3 times and 9 times faster. Most of the time it's about 6-7 times faster.


However, if you could compare an AMD FX-51 with a 66 MHz Pentium (the best CPU in September 1993), I think the difference would be far greater.

CPU progress is currently focused on efficiency and lower power. However, in the ARM field, you can still find progress in CPU performance.

Re:Comparison with current CPUs (1)

bjackson1 (953136) | 1 year,6 days | (#44950159)

The Athlon X2 4450e was released in April of 2008, so we are only looking at a 5-year difference, not 10 years. I think the more interesting comparison would be between the Athlon FX-51 and the new Apple A7 chip, given that they are the first 64-bit chips of their class.

P4 vs Athlon XP (0)

Anonymous Coward | 1 year,6 days | (#44949561)

I wouldn't say that the P4 was bludgeoning the Athlon XP series except maybe right at the verrry end of its career. While P4s clearly had the advantage when it came to integer math, Athlon XP's Barton-core processors had a clear edge when it came to floating point math, allowing them to outshine the P4 in gaming benchmarks. And the XPs cost a fraction of the price! For a gamer, the AthXP vs P4 debate was seen a bit differently.

Re:P4 vs Athlon XP (4, Interesting)

Dputiger (561114) | 1 year,6 days | (#44949753)

As the author of the article:

In 2000 - 2001, the Athlon / Athlon XP were far ahead of the P4. But from Jan 2002 to March 2003, Intel increased the P4's clock speed by 60% and introduced Hyper-Threading. SSE2 became more popular during the same time. As a result, the P4 was far ahead of the Athlon XP by the spring of 2003 in most content creation, business, and definitely 3D rendering workloads. Now it's true that an awful lot of benchmark shenanigans were going on at the same time, and the difference between the two cores was much smaller in high-end gaming. But if you wanted the best 'all around' CPU, the P4 Northwood + HT at 2.8 - 3.2GHz was the way to go. Northwoods were also good overclockers -- it was common to pick up a 2.4GHz P4 and clock it to 3 - 3.2GHz with HT.

Athlon 64 kicked off the process of changing that, but what really did the trick was 1) Prescott's slide backwards in IPC and thermals and 2) the introduction of dual-core. It really was a one-two punch -- Intel couldn't put two Pentium 4 3.8GHz chips on a die together, so the 820 Prescott was just 3.2GHz. AMD, meanwhile, *could* put a pair of 2.4GHz Athlon 64s on a single chip. Combine that with Prescott's terrible efficiency, and suddenly the Athlon 64 was hammering the P4 in every workload.

Re:P4 vs Athlon XP (1)

FreonTrip (694097) | 1 year,6 days | (#44950153)

The misleading thing about benchmarks is that they're generally prebaked - there's no chance for "surprise" physics interactions or various pipeline-stalling things that tended to trip up the Pentium 4. From personal experience I'll tell you that my old 2.8 GHz Pentium 4 generally didn't do as well as my Athlon XP 2400+ in Doom 3, Bioshock, or Unreal Tournament 3. The latter two should have been poster children for the Netburst chip by comparison. Also, the Pentium D 820 was a 2.8 GHz chip: it was the miserably hot 130W TDP 840 that ran at 3.2 GHz. But you're correct on the other counts - the higher IPC and integrated memory controller were both HUGE advantages over a latency-crippled, deeply pipelined architecture. The Pentium D was itself a flailing, mostly failed response to the surge in mindshare the Athlon 64 X2 created until the Core architecture could be prepared.

Re:P4 vs Athlon XP (2)

fuzzyfuzzyfungus (1223518) | 1 year,6 days | (#44950285)

Not that it mattered to the P4-era contest (where desktop OSes and workloads would remain 32-bit for quite some time to come, and RAM fairly expensive); but the A64 was a real smack in the face for IA64...

Intel has their grand, big-iron-class, future-of-enterprise-computing 64-bit architecture, then AMD pops up with "Hi guys, who wants a 64-bit CPU, fully backwards compatible with your 32-bit x86 code and pretty damn fast at that, for only slightly more than the price of a nice desktop CPU?"

Boom. Headshot. Game over man, game over.

The overclockable 2.4-2.8 Northwoods kept up on the 32-bit side at the time, and Intel has since swallowed their pride and put out some genuinely brutal 'EM64T' parts; but IA64 was buried beyond hope.

AMD was king of the hill, but... (4, Interesting)

Anonymous Coward | 1 year,6 days | (#44949769)

AMD, forgotten by most of you, purchased a CPU design company not long after it lost the right to clone Intel CPU designs. The people from this company gave AMD a world-beating x86 architecture that became the Athlon XP and then Athlon 64 (and the first true x86 dual core), thrashing Intel even though AMD was spending less than ONE-HUNDREDTH of Intel's R&D budget.

What happened? AMD top management sabotaged ALL future progress on new AMD CPUs, in order to maximise salaries, bonuses and pensions. A tiny clique of cynical self-serving scumbags ruined every advantage AMD had gained over Intel for more than 5 years afterwards. Eventually AMD replaced its top management, but by that time it was too late for the CPU. Obviously, AMD had far more success on the GPU side after buying ATI. (PS note that ATI had an identical rise to success, when that company also bought a GPU design team that became responsible for ALL of ATI's world-beating GPU designs. Neither AMD nor ATI initially had in-house talent good enough to produce first rate designs.)

Today, AMD is ALMOST back on track. Its Kaveri chip (2014) will be the most compelling part for all mains-powered PCs below high-end/serious gaming. In the mobile space, Intel seems likely to have the power-consumption advantage (for x86) across the next 1.5 years at least. However, even this is complicated by the fact that Nvidia is ARM, and AMD is following Nvidia, and is soon to combine its world-beating GPU with ARM CPU cores.

At this exact moment, AMD can only compete on price in the CPU market. Loaded, its chips use TWICE the power of Intel parts. In heavy gaming, average Intel i5 chips (4-core) usually wallop AMD's best 8-cores. In other heavy apps, AMD at best draws equal, but just as commonly lags Intel.

Where AMD currently exterminates Intel is with SoC designs. AMD won total control of the console market, providing the chips for Nintendo, Sony and Microsoft. Intel (and Nvidia) were literally NOT in the running for these contracts, having nothing usable to offer, even at higher prices or lower performance.

AMD is currently improving the 'bulldozer' CPU architecture once again for the Kaveri 4-core (+ massive integrated GPU and 256-bit bus) parts of 2014. There is every reason to think this new CPU design will be at rough parity with Intel's Sandybridge, in which case Intel will be in serious trouble in the mains-powered desktop market.

Intel is in a slow but fatal decline. Intel is currently selling its new 'atom' chips below cost (illegal, but Intel just swallows the court fines) in an attempt to take on ARM, but even though Intel's 'atom' chips are actually Sandybridge class, and have a process advantage, they are slaughtered by Apple's new A7 ARM chip found in the latest iPhones. The A7 uses the latest 64-bit ARM design, ARMv8, making the A7 an excellent point of comparison with the original Athlon 64 from years back.

Again, AMD is now x86 *and* ARM. AMD has two completely distinct and good x86 architectures ('stars-class' and 'bulldozer-class'). Intel is only x86, and now with the latest 'Atom' has only ONE x86 architecture in its worthwhile future lineup. Intel has other x86 architectures, but they are complete no-hopers like the original Atom family, the hilariously awful Larrabee family, and the putrid new micro-controller family. Only Intel's current sandybridge/ivybridge/haswell/new-atom architecture has any value.

Re:AMD was king of the hill, but... (1)

0123456 (636235) | 1 year,6 days | (#44949841)

AMD is not ARM. ARM is ARM. Anyone can buy an ARM license and start releasing ARM chips. AMD are producing ARM chips because they can't compete with Intel in the x86 market.

Nothing stops Intel releasing ARM chips, as they have in the past, except the margins would probably be awful compared to their x86 lineup.

*THE* chip that changed the world? (0)

Anonymous Coward | 1 year,6 days | (#44949971)

I'm sorry, that is the height of hyperbole. There are *MANY* chips that I would say had a bigger impact than AMD's introduction of x86-64.

It was a very big impact, for sure, but not "the" chip...

that first tyan dual cpu board (1)

rs79 (71822) | 1 year,6 days | (#44949979)

so i was already feeling stoked about finally getting around to finding a matched pair of the fastest cpus that I can put into this board that's been sitting in a box for SIX YEARS, they boot and now I read this. /me - does the peacock strut happy dance thing.

i was always a fan of AMD going back to the 8x300 bit slice stuff. they're clever boys.

Those bastards (5, Funny)

LodCrappo (705968) | 1 year,6 days | (#44950033)

Apple just released a 64bit processor, and now AMD is copying it TEN YEARS ago?!?

Can the industry please do something original and quit just following wherever Apple leads it?

Re:Those bastards (0)

tie_guy_matt (176397) | 1 year,6 days | (#44950223)

Is this intended to be a joke?

Apple doesn't make processors. These days macs use intel chips. Back in the day they used to use powerpc chips made by motorola and IBM. If you go into the way-way back machine they once used m68k chips from motorola. On the iphone and such they just use ARM cpus. Then if you go all the way back to the days when Woz was making computers in his garage you will see that they used MOS 6502 chips. But they have never really been in the cpu industry.

Re:Those bastards (1)

FreonTrip (694097) | 1 year,6 days | (#44950277)

First: yes, it's clearly a joke. How can you copy something done ten years earlier? :P

Second, Apple does make their own ARM CPUs these days. They design and build licensed ARM CPUs for their iOS devices, which include AppleTV, iPhones, iPads, and iPods, but for their Mac / OS X business they are still 100% Intel. Their latest design is starting to turn some heads.

64 bit - Really, what's the point? (1)

yayoubetcha (893774) | 1 year,6 days | (#44950193)

I am referring specifically to 64-bit data addressing, and more specifically to desktop apps as well as mobile.

Seriously, the argument goes: 32-bit can only access 4GB! We need more memory access today, therefore 64-bit data addressing is needed! Well, in 1996 I wrote a small boot-up kernel for testing superscalar nodes with PAE addressing on the Pentium Pro processor. In PAE mode, there is no longer a 4GB limit on physical memory, even with 32-bit addressing.

Today, I do not see any apps in general use that need or require a contiguous block of memory beyond 4GB (frankly, far far less). So when AMD came out with 64-bit x86, I yawned. I still do. I am running in 64-bit mode, and at times I am using more than 6GB of my 8GB of memory with some virtual machines open, but I still have no need for a contiguous block of memory beyond a few hundred megs (and that's an exaggeration). Again, in 32-bit mode with PAE, the sky's the limit, realistically, for desktop/mobile applications.

If you want to talk databases for the enterprise, then yes. If you want to talk about a few specific scientific purposes, then yes. But even high-def movie editing does not require a contiguous block of memory that cannot be handled with a 32-bit address space.

However, from a marketing perspective, well, we all know 64 is twice 32, so it's at least twice as good! I get more for my money!
