
AMD Launches New Processor Socket Despite Poor Economy

ScuttleMonkey posted more than 5 years ago | from the damn-the-torpedoes-full-speed-ahead dept.

arcticstoat writes to tell us that despite a poor economic climate, AMD is moving forward with a new processor socket launch, although they are trying to make it as upgrade-friendly as possible. "As you probably already know from the AM3 motherboards that have already been announced, AM3 is AMD's first foray into DDR3 memory support. As Phenom CPUs have integrated memory controllers, it's more accurate to say that it's the new range of Phenom II CPUs (see below) that are DDR3-compatible. However, the new DDR3-compatible Phenom II range is also compatible with DDR2 memory. As the new CPUs and the new AM3 socket are pin-compatible with the current AM2+ socket, you can put a new AM3-compatible CPU into an existing AM2+ motherboard. This means that you can upgrade your CPU now without needing to change your motherboard or buy pricey new DDR3 memory."
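The compatibility rules in that summary are easy to mix up, so here is a minimal sketch of them as code (the socket and memory names come from the summary; the helper functions themselves are purely illustrative):

    # A sketch of the upgrade rules described above; illustrative only.
    def cpu_fits(cpu_socket: str, board_socket: str) -> bool:
        """An AM3 CPU works in AM3 and AM2+ boards; an AM2+ CPU is AM2+-only."""
        compatible_boards = {
            "AM3": {"AM3", "AM2+"},  # pin-compatible with the older socket
            "AM2+": {"AM2+"},        # cannot go the other way (see comments below)
        }
        return board_socket in compatible_boards.get(cpu_socket, set())

    def memory_type(board_socket: str) -> str:
        """The board dictates the memory type; Phenom II's controller does both."""
        return "DDR3" if board_socket == "AM3" else "DDR2"

    # The upgrade path from the summary: a new AM3 CPU in an existing AM2+
    # board, keeping the DDR2 you already own.
    assert cpu_fits("AM3", "AM2+") and memory_type("AM2+") == "DDR2"
    assert not cpu_fits("AM2+", "AM3")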

215 comments

It's a myth (-1, Troll)

Anonymous Coward | more than 5 years ago | (#26790765)

I'm doing fine. The people I know who are getting fired are the crap programmers who cheated when they took classes with me, and didn't do their work in group projects. Screw 'em.

Re:It's a myth (0, Informative)

Anonymous Coward | more than 5 years ago | (#26791527)

And presumably all those auto industry and manufacturing workers getting fired were cheating when they took classes in high school, right? The 500,000+ jobs lost outside the computing industry and the major across-the-board corporate spending restrictions, they're a myth too.

It's not all about you.

Re:It's a myth (0, Insightful)

Anonymous Coward | more than 5 years ago | (#26791593)

Over 92% of the population still has a job. Think about that for a second. We are doing quite well. Stop listening to the mass media idiots.

Re:It's a myth (1, Offtopic)

Gerzel (240421) | more than 5 years ago | (#26791735)

Uhm.... no.

You are assuming that an 8% unemployment rate equals the actual rate of people who are unemployed. It does not.

First of all, that figure does not include retirees, children, and those who for whatever reason are not interested in having a job at all.

Then you'll want to count the people who are willing to work but don't have a job, or don't have enough of a job (sorry, flipping burgers does not feed a large family).

If you take all the people who don't have a job but would like one, you'll need to raise that figure to at least one and a half to two times what it is now, or more. (Note this is closer to how they measured in the Great Depression, though they also included all "eligible" workers and didn't include women.)

If you add in all the under-employed (those who have part-time work but need full-time, and those whose job, or two or three in many cases, doesn't cover their needs), then you are looking at a figure of at least double the reported unemployment rate.

In short, the percentage of the population that has a job is closer to 60 than to 92.

Re:It's a myth (1)

Endo13 (1000782) | more than 5 years ago | (#26791837)

It also does not include anyone who is no longer eligible for unemployment benefits, for any reason (such as being unemployed for longer than the benefit period plus extensions).

Re:It's a myth (1)

Toonol (1057698) | more than 5 years ago | (#26792329)

So, if we eliminate women from the tally, like during the Great Depression, will we have a -42% unemployment rate?

That's a joke, but seriously, it's folly to equate today's unemployment to the Great Depression's. It's like comparing a windy day to a hurricane.

Re:It's a myth (1)

Fozzyuw (950608) | more than 5 years ago | (#26792435)

(sorry, flipping burgers does not feed a large family)

Do people actually think it does?

Re:It's a myth (2, Funny)

Fozzyuw (950608) | more than 5 years ago | (#26792449)

Doh! Hit the wrong button. Forgot to add the punchline about... unless they're bringing those burgers home. =P

Re:It's a myth (3, Insightful)

The End Of Days (1243248) | more than 5 years ago | (#26791745)

And presumably all those auto industry and manufacturing workers getting fired were cheating when they took classes in high school, right?

No, they were busy getting drunk and ignoring the adults who told them they'd end up with shitty jobs later on in life if they didn't take school more seriously.

Good (1, Interesting)

Sangui (1128165) | more than 5 years ago | (#26790777)

Because DDR3 memory's latency is so high that it isn't even worth it. There's no speed increase.

Re:Good (5, Informative)

XanC (644172) | more than 5 years ago | (#26790853)

The latency is generally lower than DDR2's, measured in wall-clock time. The advertised latency appears worse only because of the faster clock.

Re:Good (5, Informative)

aliquis (678370) | more than 5 years ago | (#26791043)

That's bullshit. CL (in periods) * period length = latency, and since they are clocked higher the latency will probably be around the same; I won't calculate it for you.

And that latency is how long it takes before you actually start to read any bits, but as soon as you have started, each bit will come faster from the higher-clocked memory.
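To put numbers on that formula: wall-clock latency is CL (cycles) times the clock period. A quick sketch, using illustrative DDR2-800 CL5 and DDR3-1333 CL9 parts (these specific timings are examples, not figures from the thread):

    # latency = CL (in clock periods) * period length, per the comment above
    def cas_latency_ns(cl_cycles: int, data_rate_mt_s: float) -> float:
        io_clock_mhz = data_rate_mt_s / 2   # DDR transfers twice per I/O clock
        period_ns = 1000.0 / io_clock_mhz   # one clock period in nanoseconds
        return cl_cycles * period_ns

    print(cas_latency_ns(5, 800))    # DDR2-800  CL5 -> 12.5 ns
    print(cas_latency_ns(9, 1333))   # DDR3-1333 CL9 -> ~13.5 ns
    # Nearly the same wall-clock latency despite the bigger CL number.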

If you don't get a speed increase, it's because of one of:
1) Processor not fast enough to benefit from the additional bandwidth.
2) Cache system smart enough that it rarely needs the additional bandwidth.
3) Application not using memory in a fashion that benefits from the additional bandwidth.

Most likely the latter.

All higher-end graphics cards come with faster memory; it may not always be a huge deal, but it probably adds some benefit, and it would be rather stupid if it didn't.

AMD said earlier they would skip DDR2 and go directly to DDR3 because there was no benefit in actual use, but I guess they "had to" move when Intel was using DDR2, just because people see the numbers and wonder why one is bigger than the other. Though the first AM2 DDR2 chips showed no speed increase in benchmarks over the Socket 939 DDR chips.

Anyway, DDR3 is faster than DDR2; will you notice it? I have no idea.

Re:Good (0)

aliquis (678370) | more than 5 years ago | (#26791063)

... may I also add that the higher-clocked DDR2 memories are (mostly?) all out of spec too, while I'd assume at least some of the faster DDR3 memories are within spec; probably not the more expensive stuff from Corsair, OCZ and such.

Re:Good (5, Insightful)

EmagGeek (574360) | more than 5 years ago | (#26791457)

The 3-fold clocking scheme will only really help on interleaved burst reads. The memory cells don't charge the output buffers any faster just because you clock them at a higher rate. This is why the nCLK latencies scale with the number of folds in the DDR scheme. The only things that will make the cells charge faster are a) higher voltage, b) a smaller process, or c) a more conductive semiconductor chemistry that lowers resistances and increases currents on the wafer.

If you can have 3 banks of DDR3 interleaved by 1 clock then you can probably see some significant gains on sequential (aka burst) reads. In real life, this doesn't happen very much, especially in a multithreaded environment where almost all s/w is written using high-level foundation classes with very little machine optimization.
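A back-of-the-envelope sketch of that point, comparing time-to-first-word with the time for a full cache-line burst (the model is deliberately crude, and the DDR2-800 CL5 / DDR3-1333 CL9 parts are illustrative):

    # Crude model: pay the CAS latency once, then stream one transfer per
    # data-rate tick. Real controllers pipeline and interleave, so treat this
    # as a rough upper bound on what the faster clock buys you.
    def read_time_ns(cl: int, rate_mt_s: float, transfers: int) -> float:
        first_word_ns = cl * (2000.0 / rate_mt_s)    # cell access dominates
        burst_ns = transfers / (rate_mt_s / 1000.0)  # the streaming part
        return first_word_ns + burst_ns

    # One 8-transfer burst, i.e. a 64-byte line over a 64-bit bus:
    print(read_time_ns(5, 800, 8))    # DDR2-800  CL5: 12.5 + 10.0 = 22.5 ns
    print(read_time_ns(9, 1333, 8))   # DDR3-1333 CL9: 13.5 +  6.0 = 19.5 ns
    # The gap only widens on long sequential bursts; scattered reads stay
    # dominated by the (unchanged) cell access time.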

What's the point in waiting for markets to turn (4, Insightful)

von_rick (944421) | more than 5 years ago | (#26790815)

If your competitor has better market share and also a better line of processors, it would be suicide not to release a competitive product just because the economy is staggering. Withholding the technology while waiting for the economy to improve would make the gap between them and Intel even wider.

Re:What's the point in waiting for markets to turn (4, Insightful)

faloi (738831) | more than 5 years ago | (#26790887)

Not to mention that the money has already been spent on R&D. Spending the money on R&D, then sitting on it because the time isn't perfect is, as you mentioned, the best way to increase the gap. With the added bonus of being out cash on something that won't sell.

Re:What's the point in waiting for markets to turn (2, Interesting)

hannson (1369413) | more than 5 years ago | (#26790899)

What exactly is the gap between Intel's and AMD's CPUs?

(I'm not trolling or trying to start a flamewar, just curious)

Re:What's the point in waiting for markets to turn (5, Informative)

aliquis (678370) | more than 5 years ago | (#26791125)

AMD is competitive at the low and middle end as long as you don't overclock the mid-range CPUs.
(If you include the price of the motherboard and don't care about overclocking, a low- or mid-range AMD system will be cheaper.)

AMD doesn't have CPUs as high-end as Intel's, and the ones which come closest don't overclock as well or use as little power.

Then again, I'd say you shouldn't overclock anyway, and AMD chipsets have used less power, making the two rather comparable in a complete system.

Also, AMD used to have an advantage in memory bandwidth and when using multiple CPUs.

This information may be slightly outdated, but all of it is probably true. Intel may have caught up in memory bandwidth with their latest CPUs, since they have put the memory controller inside the CPU themselves too.

Re:What's the point in waiting for markets to turn (1)

Radhruin (875377) | more than 5 years ago | (#26792129)

Actually, while Intel CPUs have very low power draw, the current crop of Intel chipsets are comparatively power hungry [tomshardware.com]. When considering system power draw in its entirety, an AMD system will use less power.

Re:What's the point in waiting for markets to turn (2, Informative)

Anonymous Coward | more than 5 years ago | (#26792543)

That's not actually true. AMD gives realistic power-draw estimates, and real-world testing has shown that the AMD parts now use less power. One must also take into account that AMD has been integrating a significant part of the northbridge into the CPU die for some years now.

Re:What's the point in waiting for markets to turn (2, Insightful)

aliquis (678370) | more than 5 years ago | (#26791153)

For notebooks I have no idea how total system power usage looks; AMD's chipsets provide better integrated graphics than Intel's do, however. And I guess I would go for somewhat better (though still crappy) graphics over a somewhat faster / more power-efficient CPU (if Intel's really is).

AFAIK AMD doesn't have an alternative to the Atom; I may be wrong though.

Also, an Intel notebook with an Nvidia chipset may compare better to AMD.

Re:What's the point in waiting for markets to turn (1)

corychristison (951993) | more than 5 years ago | (#26791479)

For notebooks I have no idea how total system power usage looks; AMD's chipsets provide better integrated graphics than Intel's do, however. And I guess I would go for somewhat better (though still crappy) graphics over a somewhat faster / more power-efficient CPU (if Intel's really is).

In my experience Intel is dominating the notebook business. I prefer AMD, but the notebooks out there using AMD are either:
1) based on the Sempron (slowish but low-powered), or
2) based on the older X2 core (good performance but runs hot and sucks power).

AFAIK AMD doesn't have an alternative to the Atom; I may be wrong though.

AMD has the Geode LX and NX lines.
Geode LX [amd.com] is very low-powered; the highest clock speed I've seen is 566MHz.
Geode NX [amd.com] is targeted directly at the Atom, although I have yet to see any of these out in the wild.
I've only ever found a Geode in the wild clocked as high as 500MHz (see the ALIX boards [mini-box.com]).

Re:What's the point in waiting for markets to turn (5, Informative)

subsolar2 (147428) | more than 5 years ago | (#26791629)

AMD has the Geode LX and NX lines.
Geode LX [amd.com] is very low-powered; the highest clock speed I've seen is 566MHz.
Geode NX [amd.com] is targeted directly at the Atom, although I have yet to see any of these out in the wild.
I've only ever found a Geode in the wild clocked as high as 500MHz (see the ALIX boards [mini-box.com]).

Actually the Geode is a dead-end processor; AMD has already stated they are discontinuing it.

AMD recently announced a new processor, "Conesus", that is intended for netbooks and UMPCs.
http://gizmodo.com/5086703/amds-upcoming-conesus-netbook-chip-wont-stoop-to-mid-levels

Re:What's the point in waiting for markets to turn (5, Informative)

WEqR0lDRR6I (1452367) | more than 5 years ago | (#26791463)

This is probably something not many people care about, but... it's a hell of a lot easier and cheaper to find an Athlon64 motherboard that supports (and actually does) ECC memory. Think $50-$100 for an Athlon64 motherboard that does this, versus $200-$300+ (original Asus Maximus Formula, Asus P5E WS Pro) for a Core 2 motherboard (it has to have an X38 northbridge, unless you want to give up PCIe x16 with a server chipset). I don't think the currently released Core i7 processors with built-in memory controllers support ECC *at all*.

(PS to trolls: unbuffered ECC memory is only marginally more expensive than unbuffered non-ECC, though it usually has a small latency penalty. Registered/FB-DIMM ECC, on the other hand, is Quite Expensive.)

Re:What's the point in waiting for markets to turn (4, Interesting)

stuffman64 (208233) | more than 5 years ago | (#26792003)

Core i7 940 -> $564.99 + about $250 for mobo = $800+
Phenom II 940 -> $224 + about $150 for mobo = about $375

Core i7 needs DDR3; the Phenom II 940 runs DDR2 (note that the 940 is an AM2+ part, not AM3, so it doesn't support DDR3). DDR3 is somewhere around 50% more expensive than DDR2 (though falling).

For me, the fact that the i7 is only about 10-20% faster than the Phenom for more than twice the cost means it's simply not worth considering. Then again, I do most of my gaming on consoles.
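Working out the arithmetic in that comparison (platform prices from the comment above; the 15% speedup is just the midpoint of the quoted 10-20% range):

    i7_cost = 565 + 250      # CPU + motherboard, from the parent comment
    phenom_cost = 224 + 150
    speedup = 1.15           # midpoint of the quoted 10-20% advantage

    cost_ratio = i7_cost / phenom_cost   # ~2.18x the money
    value_ratio = speedup / cost_ratio   # ~0.53x the speed per dollar
    print(f"{cost_ratio:.2f}x cost, {speedup:.2f}x speed, "
          f"{value_ratio:.2f}x performance per dollar")
    # And this ignores the ~50% DDR3 premium the i7 platform also requires.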

Re:What's the point in waiting for markets to turn (1)

Cornflake917 (515940) | more than 5 years ago | (#26792113)

For me, the fact that the i7 is only about 10-20% faster than the Phenom for more than twice the cost means it's simply not worth considering. Then again, I do most of my gaming on consoles

Not to mention the fact that most games bottleneck at the graphics card, not the CPU, so that 10-20% faster CPU isn't guaranteed to give you a 10-20% increase in FPS. Phenom II is definitely the way to go if you want the best bang for your buck, imo.

Re:What's the point in waiting for markets to turn (2, Funny)

whoop (194) | more than 5 years ago | (#26792577)

But then you lose the "I spent $X,000 on my superawesomeness gaming rig" pissing contests when you play those games online...

Re:What's the point in waiting for markets to turn (1)

Kneo24 (688412) | more than 5 years ago | (#26792621)

It's not always about FPS gain. As a PC gamer I consider other aspects as well. For example, how well it can handle local hosting when I need it to (e.g. L4D, where my PC can handle local hosting far, far better than a good portion of the servers I've played on), or how it handles processor-intensive actions inside a game. Things along these lines are important too. Yeah, the GPU might be the bottleneck for FPS, but I don't expect my CPU to churn out graphics; I expect my CPU to do CPU-specific tasks. Besides, if the GPU is doing horrible things to your frame rate, you probably want to think about investing in something greater.

Re:What's the point in waiting for markets to turn (0)

Anonymous Coward | more than 5 years ago | (#26791299)

Magrathea stopped making luxury planets.
I think AMD would be wise to hold back on the luxury chips.

Re:What's the point in waiting for markets to turn (3, Insightful)

poetmatt (793785) | more than 5 years ago | (#26791579)

Add in the fact that this is cheaper for them to make than the previous version, and they have every reason to stay competitive.

Who writes this "poor economy" crap?

Many companies are doing just fine through this downturn; it's just the mental state of consumers that has changed, and probably not for the long run either, as consumers tend to have about the memory of a goldfish when it comes to taking corrective action financially.

We're just slowly deflating back to where we were before the hyperinflation of the last few years.

Re:What's the point in waiting for markets to turn (2, Funny)

synaptik (125) | more than 5 years ago | (#26792383)

it's just the mental state of consumers that has changed... We're just slowly deflating back to where we were before the hyperinflation of the last few years.

I reserve the right to remind you that you said that.

strange (2, Insightful)

Lord Ender (156273) | more than 5 years ago | (#26790865)

This means that you can upgrade your CPU now without needing to change your motherboard or buy pricey new DDR3 memory.

Other than starving CIS majors, who barely earn enough money from their university's computer lab to pay for ramen noodles, who does that? IT professionals just buy all the hardware together, because their time is worth more than their money, and everybody else just buys entire new computers. This could only appeal to a handful of small-budget kids.

Re:strange (5, Insightful)

mewshi_nya (1394329) | more than 5 years ago | (#26790889)

Or a lot of small-budget husbands :P

Re:strange (3, Funny)

Lord Ender (156273) | more than 5 years ago | (#26791081)

OK, I'm not married, but recently when I was at the computer store, I overheard this scrawny guy on the phone with his wife, begging her for permission to get the 2GB instead of the 1GB RAM upgrade. His whiny, pathetic, groveling demeanor over a $20 difference in price, and his futile attempts to explain to her why 2GB is better than 1GB, made me absolutely want to vomit. I vowed that day to either divorce or kill myself if I ever find myself being such a pathetic, spineless loser.

So my advice to the married chumps out there is to keep a separate bank account for discretionary purchases which your wives have neither control of nor access to. Life without self-respect (and gadgets) is not worth living.

Re:strange (4, Insightful)

Fulcrum of Evil (560260) | more than 5 years ago | (#26791229)

So my advice to the married chumps out there is to keep a separate bank account for discretionary purchases which your wives have neither control of nor access to. Life without self-respect (and gadgets) is not worth living.

Seconded. One of the best things you can do is establish the idea of a slush fund for both sides of the relationship; fighting over money is one of the more common reasons for divorce.

Re:strange (0)

Holistic Missile (976980) | more than 5 years ago | (#26792173)

Seconded. One of the best things you can do is establish the idea of a slush fund for both sides of the relationship; marriage is one of the more common reasons for divorce.

Fixed that for you. ;-)

Re:strange (3, Insightful)

Facegarden (967477) | more than 5 years ago | (#26791235)

So my advice to the married chumps out there is to keep a separate bank account for discretionary purchases which your wives have neither control of nor access to. Life without self-respect (and gadgets) is not worth living.

Or... marry someone who isn't a total shite and respects your interests.
-Taylor

Re:strange (3, Funny)

Chris Burke (6130) | more than 5 years ago | (#26791327)

Life without self-respect (and gadgets) is not worth living.

Dude! Way to totally reverse priorities. Did it occur to you that maybe she's the one with the high-paying job, and all that groveling got him better hardware than he could have bought if his was the only income?

Okay that probably wasn't the case, I'm just sayin', if I had to choose between self-respect and gadgets... "Honey, please?! I took out the trash last night and everything!"

Re:strange (1)

deraj123 (1225722) | more than 5 years ago | (#26791499)

I'm just sayin', if I had to choose between self-respect and gadgets... "Honey, please?! I took out the trash last night and everything!"

Well that's your decision. Just please have the decency to not do your groveling in public - it embarrasses the rest of us.

Re:strange (1)

dlevitan (132062) | more than 5 years ago | (#26791761)

Life without self-respect (and gadgets) is not worth living.

Dude! Way to totally reverse priorities. Did it occur to you that maybe she's the one with the high-paying job, and all that groveling got him better hardware than he could have bought if his was the only income?

Should it matter? In my opinion, if you're married, then it doesn't matter who makes the money, you're in it together. By that reasoning, should a parent who stays home with the children not be able to buy anything at all since they don't earn anything? Granted, I think that any major purchase or decision should be made together, and a $1k purchase is usually considered major, but one person shouldn't be begging the other for anything. That kind of relationship is not sustainable and not healthy for anyone.

Re:strange (2)

Chris Burke (6130) | more than 5 years ago | (#26792171)

Should it matter? In my opinion, if you're married, then it doesn't matter who makes the money, you're in it together. By that reasoning, should a parent who stays home with the children not be able to buy anything at all since they don't earn anything? Granted, I think that any major purchase or decision should be made together, and a $1k purchase is usually considered major, but one person shouldn't be begging the other for anything. That kind of relationship is not sustainable and not healthy for anyone.

To get all serious for a moment about what was supposed to be a joke (gadgets before self-esteem, come on, work with me here people): no matter how you think it should work ideally, marriage isn't that simple. The line between "not able to buy anything" and "major purchase" is going to fluctuate based on who is bringing home more of the bacon. Or one spouse thinks "you're in it together" means "you have to ask for everything" regardless of who is getting paid, especially if one thinks of themselves as the financially responsible one. It may not be healthy, but many things in real relationships aren't healthy, and somehow a lot of them end up sustainable anyway. That's why the married guy begging his wife to buy something, or to go fishing with the guys, and so on is sort of an archetype. Ideal? No. Real? You betcha.

Re:strange (0)

Anonymous Coward | more than 5 years ago | (#26791875)

Good boy!

You have been well trained.

Re:strange (1)

SanityInAnarchy (655584) | more than 5 years ago | (#26792513)

Dude! Way to totally reverse priorities. Did it occur to you that maybe she's the one with the high-paying job,

Irrelevant. Would she be groveling for permission to spend $20 extra on a nicer pair of shoes if he had the high-paying job? No; if she's already buying the shoes, she'll write the check, or swipe the card, and tell him later. Easier to get forgiveness than permission.

What's more, if it's a high-paying job, $20 is nothing.

if I had to choose between self-respect and gadgets...

Your choice.

I would certainly choose self-respect over marriage, though.

Re:strange (1)

Chris Burke (6130) | more than 5 years ago | (#26792537)

Irrelevant. Would she be groveling for permission to spend $20 extra on a nicer pair of shoes, if he had the high-paying job?

Ha! And you think those circumstances are the same?

I would certainly choose self-respect over marriage, though.

Yes that's pretty much the choice. ;)

Re:strange (1)

WildStreet (1362769) | more than 5 years ago | (#26791453)

Thank God for my eBay account and their electronic statements. Coupled with a Gmail account, and she is none the wiser. NO MA'AM.

Re:strange (0)

Anonymous Coward | more than 5 years ago | (#26791821)

I must date the wrong kind of women, because not once have I known one that was very fiscally responsible. In fact, I would kind of like that, because it would lower some of my own stress over trying to be responsible at all times.

When I was married we did the separate accounts thing too. It's a very good idea. We each deposited our checks into our own accounts. Of course, it kind of sucked because I never saw one penny she made and she did not pay the bills. Still, it was better than her using my account for whatever she spent all that money on.

Re:strange (4, Insightful)

onkelonkel (560274) | more than 5 years ago | (#26791895)

Works both ways. You come home with a new CPU and mobo and install them in the old case when she's not looking; she sneaks a new pair of shoes into the closet and tells you she bought them last Christmas on sale when she busts them out.

Or, you can be adults, and maybe agree on an amount of discretionary spending that doesn't require the other's approval.

Re:strange (1)

moderatorrater (1095745) | more than 5 years ago | (#26792041)

OT, but that's what my wife and I eventually came around to. We have our separate bank accounts so that we can make stupid purchases on our own and not worry the other spouse as much. I can't remember who said it, but the best fiscal advice for married people that I ever heard was that both members need to have money that they're not accountable for to the other spouse.

Re:strange (1)

AlXtreme (223728) | more than 5 years ago | (#26792263)

His whiny, pathetic, groveling demeanor over a $20 difference in price, and his futile attempts to explain to her why 2G is better than 1G, made me absolutely want to vomit. I'm not married, but I vowed that day to either divorce or kill myself if I ever find myself to be such a pathetic, spineless loser.

Hear, hear!

Listen, spineless losers: only grovel for 4GB. 2GB isn't enough. If you grovel, do it for something worthwhile.

Re:strange (2, Insightful)

Namlak (850746) | more than 5 years ago | (#26792431)

How do you know the guy isn't a total loser who's run up $20,000 in credit card debt with all the things he "had to have", essentially forcing his wife to become the authority that keeps him in check? I know of at least three other guys in this situation, two of whom are on their way to their *second* bankruptcies.

Re:strange (1)

Bill, Shooter of Bul (629286) | more than 5 years ago | (#26792619)

OK, I'm not married,

Wow, really? By your response, I never would have guessed.

It's your response that repulses me. A life lived only for oneself is not worth living. Not that I'm recommending suicide or marriage for you. But seriously, don't make the purchase of one gig of freaking RAM the definition of a life worth living. Gadgets suck compared to people.

Re:strange (1)

aliquis (678370) | more than 5 years ago | (#26791185)

Reminds me of my cousin. He works in a computer store, and his girlfriend wouldn't allow him to get a new computer because she thought he should spend the money on a trip for them (he was like 19 and she 18, so pretty pussy-whipped).
Anyway, he kept the case; problem solved ;). Or well, not the actual problem, but he found a workaround.

Re:strange (3, Insightful)

Darkness404 (1287218) | more than 5 years ago | (#26790925)

Gamers. Sure, most would rather go out and buy a totally new box, but if someone just wants to upgrade a CPU, AMD will let them do it. It may seem illogical for hardware vendors to target a small portion of the hardware-buying community, but both AMD and Intel are trying their best to get gamers' money.

Re:strange (1, Insightful)

Anonymous Coward | more than 5 years ago | (#26791061)

Might AMD want to sell fewer different types of CPUs? They can stop selling all their AM2 CPUs sooner without issues from OEMs, who perhaps transition more slowly, or might want to continue using DDR2 due to cost or latency, etc.

Not strange at all (4, Insightful)

Chris Burke (6130) | more than 5 years ago | (#26791051)

Other than starving CIS majors, who barely earn enough money from their university's computer lab to pay for ramen noodles, who does that? IT professionals just buy all the hardware together, because their time is worth more than their money, and everybody else just buys entire new computers. This could only appeal to a handful of small-budget kids.

If you don't think in terms of upgrading the processor of the computer sitting on your desk, but instead think of HP updating the processor in their line of AM2-based computers, then you should be able to see that the appeal is basically universal. This way the OEMs can offer refreshed versions of their lines without having to incur the extra expense of DDR3. Obviously they will also make a DDR3 AM3-based line, but the DDR2-based line will be cheaper.

Backward compatibility and in-place upgrades appeals to far more than a handful of poor hobbyists.

Re:strange (4, Interesting)

afidel (530433) | more than 5 years ago | (#26791073)

Actually, we bought drop-in CPU upgrades for our database server. When you have the time invested in the OS and application installs and QA, not to mention tons of RAM, it's a very cheap upgrade to just swap out CPUs if you are CPU-limited. Spending $5K or so to get 40% better performance out of, say, $300K in sunk cost is a no-brainer. Now, that's on the Opteron side, not the Phenom side, but again, if you do a lot of transcoding it's probably cheaper to buy a new CPU than to upgrade the whole rig.

Re:strange (-1)

moosesocks (264553) | more than 5 years ago | (#26791265)

Your situation sounds like a really strong case for virtualization or clustering....

Re:strange (4, Insightful)

Cozminsky (452030) | more than 5 years ago | (#26791925)

Virtualization doesn't help your performance if you're already using all of a particular resource. It has overheads that mean you're getting less out of your hardware in terms of raw performance. The fact that you can put 5 boxes that would otherwise be sitting idle on the same hardware is what makes virtualization attractive.

Re:strange (2, Informative)

silanea (1241518) | more than 5 years ago | (#26791959)

Exactly how does virtualisation magically add performance out of thin air? And exactly how does buying additional iron provide the same kind of performance increase per dollar spent that the parent mentioned?

Gosh, can we please have an automated "-2, Buzzword" mod whenever someone comes up with fancy terms where they just don't fit?

Re:strange (1)

moosesocks (264553) | more than 5 years ago | (#26792121)

Perhaps I should have been more clear.

Virtualization allows you to swap out the underlying hardware without reconfiguring the server's OS.

Granted, you still do need to set up a "host" operating system, although this tends to be a fairly trivial task.

Re:strange (2, Insightful)

afidel (530433) | more than 5 years ago | (#26792275)

It's still cheaper to buy $5K in processors than a new $40K+ server, and even with virtualization it's not like racking, configuring, and connecting a server to the SAN and network is free. Plus, with per-CPU-licensed applications where you are performance-constrained, it doesn't make a whole lot of sense to lose some percentage of performance when a license upgrade is even more expensive than the time invested. I'm not saying everyone looking for better performance should run out today and buy a drop-in CPU upgrade, but there has always been a market for such products, because there are always situations where it makes economic sense.

Re:strange (1)

sjames (1099) | more than 5 years ago | (#26792277)

Not necessarily. If you have a nice machine that could go another year in its current role if it were just a bit faster, why not drop in the new CPU? Getting an all-new machine takes time as well, moving the software and/or configs over.

Next year, when the budget is hopefully better, get a new system sans CPU and move the CPU over. Put the old one back in the old box and deploy it somewhere less demanding.

Doesn't matter. (2, Interesting)

Jeff DeMaagd (2015) | more than 5 years ago | (#26790873)

I mean the economy doesn't matter in terms of releasing a product update. If the work is done and ready to go, it's too late to worry about the economy; just ship it. Not only that, product development cycles on these products are long enough that they need to continually invest in R&D regardless of the economy; by the time a just-started project is done, the economy will have rebounded and be ready for new product.

If the world is switching to DDR3, that probably means having a new socket. As such, AMD needs to introduce the new socket when it's ready.

Be fair (1)

DavidR1991 (1047748) | more than 5 years ago | (#26790875)

Being fair to AMD, these processors/sockets do not just turn up in a week or so in a finished state; they go through multiple stages over months or years before we ever see them. So I doubt they planned on releasing this during an economic slump; such a slump may have looked very unlikely when the project was launched.

Re:Be fair (0)

Dunbal (464142) | more than 5 years ago | (#26791065)

Being fair to the customer, his money does not just turn up in his bank account ready to be spent. He has to give a lot of time and consideration to shelling out money for yet another computer upgrade.

Of course manufacturers love switching sockets, be it from ISA to EISA to PCI to AGP and back to PCI Express, etc. Don't get me started on memory, which has undergone countless socket evolutions. And CPUs. I remember back in the late 80s some company was advertising a motherboard that promised a "standard" CPU socket no matter what. Hah, what a joke.

Of course, there is the fact that new sockets are supposed to bring better things. However, I'm sure sockets could be designed to be backwards-compatible much more often than they really are; only the drive to maximize profit, by getting you to junk all the other stuff made obsolete by your new motherboard, seems to trump everything.

I remember reading about what a huge impact the concept of interchangeable parts had on the Industrial Revolution. And at the beginning there, for a while, it also applied to computers. But now the greedy "Intellectual Property" gods (hah, I credit them too much, they're mere lawyers) have pissed all over our wallets, demanding a new socket for every upgrade, obviously covered with reams of patents and copyrights. Guess who is paying for all of this?

I remember way back when BASIC used line numbers, we were told to write code in increments of 10, in case you wanted to add more code between your lines in the future. I am positive you can engineer a socket that leaves plenty of room for the future. But why would you want to do that, if you can drain your customers' bank accounts instead? Well, look at where capitalism, corporate ownership of government and pure greed have gotten us today. Yeah, the economy is in great shape. Keep it up. /rant

Re:Be fair (1)

Fulcrum of Evil (560260) | more than 5 years ago | (#26791305)

I remember reading about what a huge impact the concept of interchangeable parts had on the industrial revolution. And at the beginning there for a while, it also applied to computers.

It still does. Your PCI-E video card works in just about any computer that supports that socket.

I am positive you can engineer a socket that leaves plenty of room for the future. But why would you want to do that, if you can drain your customer's bank accounts?

Why would you do that? It'd be more expensive now, offer no advantages, and in three years, the new socket would be faster and come with updated IO choices.

Re:Be fair (1)

Dunbal (464142) | more than 5 years ago | (#26791685)

and in three years, the new socket would be faster and come with updated IO choices.

That's arguable. We're quickly approaching performance limits, diminishing returns are starting to show up, and processors are only getting marginally faster. Now the trick is to play with the cache and add cores to show a performance increase. I'm wondering whether in 3 years we'll be able to justify upgrading at all. If that's the case, I expect new sockets galore and radical design changes just to artificially fuel demand in a saturated market, with the occasional rare leap in technology. Just like all the "new and improved" products you see at the supermarket that really boast "new and improved packaging" to get your attention, and hopefully your dollar.

When I think of the tens of thousands of dollars I have spent on computers in my lifetime, because I'm a computer nerd and as a hobby I want to stay up to date, it really is impossible to justify this expense in terms of increased productivity. It just isn't there. And I argue that soon (if not already) profit, not engineering, will be the main factor deciding whether a board is updated and older sockets made obsolete. However, economists will argue that there is a hidden economic cost to be paid by society as a whole, in terms of waste and inefficiency, when this happens. So Intel or AMD or whoever gets richer, and we're left to fill in the holes or deal with the garbage and wasted time.

Re:Be fair (1)

Fulcrum of Evil (560260) | more than 5 years ago | (#26792043)

That's arguable. We're quickly approaching performance limits, diminishing returns are starting to show up, and processors are only getting marginally faster. Now the trick is to play with the cache, and add cores to show a performance increase.

Guess what? Adding more cores usually needs more pins for IO. AMD did something smart and reserved some space for another HT link (apparently), which is what you've been complaining about them not doing, so I'm confused why you're unhappy. Why would they have added the HT stuff 2 years ago if no one was going to use it? What happens when it's time to go past 40 address lines? We get a new socket, that's what.

Re:Be fair (1)

DavidR1991 (1047748) | more than 5 years ago | (#26791389)

My comment was addressing the fact that this has turned up during an economic slump; I was not commenting on the validity or usefulness of the upgrade itself. My point was, AMD (like most chip makers) doesn't have a magic crystal ball when they start their projects months or even years before launch; they probably did not foresee this new tech turning up during a crunch.

Excellent (1)

Jethro (14165) | more than 5 years ago | (#26790929)

This is great. I'm hurting for a new desktop and was planning on getting an AM2 CPU.

I still am, but knowing that AM3 was just around the corner, waiting to knock all the AM2/AM2+ prices down, has delayed my plans for a few weeks, and now there's finally an end in sight!

AM3 CPU in AM2+ motherboard: OK. Other way? No. (5, Informative)

amcdiarmid (856796) | more than 5 years ago | (#26790935)

You may be able to put an AM3 processor in an AM2+ motherboard, but The Register says that an AM2+ processor in an AM3 motherboard will not work. (http://www.reghardware.co.uk/2009/02/09/review_cpu_amd_phenom_ii_am3/page2.html)

To quote:
"makes life horribly confusing as the Phenom X4 920 and 925 and the X4 940 and 945 will be identical apart from the processor socket. This means that there is the possibility that some poor so-and-so will buy an AM2+ CPU and an AM3 motherboard when ne'er the twain shall meet."

Careful what you buy out there.

Re:AM3 CPU in AM2+ motherboard: OK. Other way? No. (2, Informative)

Anonymous Coward | more than 5 years ago | (#26791365)

IIRC the AM3 has fewer pins and is able to plug into an AM2+ socket, but AM2+ chips can't plug into an AM3 socket. So, if you buy the wrong one, you'll know as soon as you try to plug your AM2+ CPU into your AM3 motherboard...

Re:AM3 CPU in AM2+ motherboard: OK. Other way? No. (1)

Enleth (947766) | more than 5 years ago | (#26791733)

That's kind of logical. Sure, some poor so-and-so could in fact get into some trouble this way, but there's absolutely nothing AMD could do to fix that, save for putting an appropriate notice on the box. I mean, AM3 motherboards will use DDR3 memory (which is different from DDR2 even in terms of physical dimensions and pinout, so you can't put a DDR2 module in a DDR3 slot), but AM2 processors can't talk to DDR3 memory because they were not designed to, and AMD can't magically fix all the AM2 processors that have already been made so they can use DDR3. See the problem?

Re:AM3 CPU in AM2+ motherboard: OK. Other way? No. (0)

Anonymous Coward | more than 5 years ago | (#26792469)

Poor fuckass can either RTFM before embarking on a system-building exercise, or if they fuck up they can exchange.

Despite a poor economy? (5, Insightful)

Toonol (1057698) | more than 5 years ago | (#26791003)

This is the sort of thing that gets us out of a poor economy.

Re:Despite a poor economy? (1)

Janek Kozicki (722688) | more than 5 years ago | (#26791047)

And also due to the poor economy; otherwise they wouldn't support cheaper DDR2.

Re:Despite a poor economy? (4, Informative)

Chris Burke (6130) | more than 5 years ago | (#26791093)

And also due to the poor economy; otherwise they wouldn't support cheaper DDR2.

I guarantee you they would.

Even when the economy was good, there was a lot of downward pressure on the prices of computers. Mandating a switch to a more expensive memory tech before the market is ready is a sure way to have it backfire in your face. *cough* RAMBUS *cough* Ugh, that was some nasty phlegm.

Re:Despite a poor economy? (1)

Endo13 (1000782) | more than 5 years ago | (#26792253)

Yeah, speaking of which I had to tell a poor sot today that it's not worth upgrading his memory because it uses Rambus. Heck, for the cost of adding 512 MB he could buy a whole new computer. If they'd used DDR from the start, he could probably be rockin' 1GB RAM now, and be good to go for at least a few more months.

Re:Despite a poor economy? (0)

Anonymous Coward | more than 5 years ago | (#26792107)

I also don't recall any similar story on Slashdot when Intel released the Core i7, which requires a new socket type, complaining about the "poor economy".

arcticstoat is obviously just some anti-AMD troll.

Really a New Socket or Just Chipset?? (1)

BigAssRat (724675) | more than 5 years ago | (#26791087)

So is the socket physically different, or is it just the chipset that operates with the chip? It seems to me to be the latter. The AM3 chip can't have pins that the AM2+ socket lacks, or it wouldn't fit in the socket. Or does the AM3 have fewer pins?

Just sounds like a chipset upgrade to me.

expensive memory? (1)

Fulcrum of Evil (560260) | more than 5 years ago | (#26791159)

Seriously, unless you're building a DB server, memory hasn't been expensive for a while - 2G is $40-50. A new processor is $200+

Re:expensive memory? (1)

AmigaHeretic (991368) | more than 5 years ago | (#26791555)

Exactly. 4GB of DDR3 on Newegg right now is $99 (two 2GB sticks).

An Asus M4A79T Deluxe Socket AM3 / AMD 790FX / Quad CrossFireX motherboard is $219. You can probably find a cheaper AM2 board, but that's not a huge premium.

An AMD Phenom II X4 940 3.0GHz Black Edition quad-core is about $225. Versus the 'old' Phenom 9950, which is about $160, that's not bad either.

Still better than buying a prebuilt from BestBuy...

Inflationary summary? (4, Insightful)

TJ_Phazerhacki (520002) | more than 5 years ago | (#26791173)

Not that I expect any different from /. most days, but who cares if it's the middle of a recession? The R&D work on this has been in place for quite a while, and this is actually MORE attractive than an i7 platform right now because you don't need to move up to a new socket for the new chips; they are backwards-compatible.

"Despite a poor economic climate, farmers still harvest crops they planted last year...." - come on....

DDR3? (0)

Anonymous Coward | more than 5 years ago | (#26791313)

DDR is double data rate, right? It's been around for years now; why haven't we moved past double? Shouldn't we be on triple or quadruple data rate by now?

Re:DDR3? (1)

Dogtanian (588974) | more than 5 years ago | (#26791617)

As far as I'm aware, "double" data rate means that it can transfer data on the rising *and* falling edges of the square clock signal. That's why it hasn't gone higher than "double", I assume.
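To make that concrete, the effective rate is two transfers per I/O clock, and module bandwidth follows from the 64-bit bus width. A worked example with illustrative DDR3-1333 figures:

    io_clock_mhz = 667         # DDR3-1333 I/O clock (illustrative example)
    transfers_per_clock = 2    # one transfer per clock edge: the "double"
    bus_width_bytes = 8        # standard 64-bit DIMM data bus

    data_rate_mt_s = io_clock_mhz * transfers_per_clock   # ~1333 MT/s
    bandwidth_mb_s = data_rate_mt_s * bus_width_bytes     # ~10667 MB/s
    print(data_rate_mt_s, bandwidth_mb_s)  # basis of the PC3-10600-style rating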

Re:DDR3? (1)

giverson (532542) | more than 5 years ago | (#26792093)

Last I heard, the clock speed was too low for it to be competitive with the different forms of DDR. DDR simply scales better. And let's face it: DDR/2/3 has been more than adequate to keep CPUs fed with data.

Taxpayer funded infrastructure updates are coming (0)

Anonymous Coward | more than 5 years ago | (#26791319)

Everyone is also getting a new bus! But it may not stop at everyone's socket. We also get new ponies.

(The Colloquial) Moore's law is a cruel mistress (1)

Wrath0fb0b (302444) | more than 5 years ago | (#26791415)

Taking as gospel the non-technical* formulation of Moore's law (processing power doubles every 24 months), delaying your product even a few weeks puts you behind the performance curve. A 2-week delay comes in at a manageable 1.3%, but delay your product 8 weeks and you are already 5.5% behind your competitor**. In a market with margins in the low single digits, that's the difference between profit and loss.
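Those percentages follow directly from compounding the 24-month doubling; a quick sketch that reproduces the figures above:

    # Relative performance deficit after delaying a launch by `weeks`,
    # assuming capability doubles every 24 months (about 104 weeks).
    DOUBLING_WEEKS = 104.0

    def deficit(weeks: float) -> float:
        return 2 ** (weeks / DOUBLING_WEEKS) - 1

    print(f"{deficit(2):.1%}")   # ~1.3%, the 2-week figure
    print(f"{deficit(8):.1%}")   # ~5.5%, the 8-week figure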

Economy or not, you've got to release the product when the engineers say it's ready, or else it decays.

* Please don't flame me about transistor densities at optimal cost -- I'm simplifying!
** Yes, processors come out in discrete updates, not continuous updates. I'm only right on average.

Actual processor speed (1)

vlad_petric (94134) | more than 5 years ago | (#26791903)

Except that actual processor speed went off Moore's curve a while back ... While transistor densities have gone up (mostly) according to schedule, actual processor speed has not.

Your argument is good, and AFAIK processor makers use it to a certain extent; it's just that the percentages are a bit smaller.

DDR3 not worth it (1)

Nom du Keyboard (633989) | more than 5 years ago | (#26791741)

The early tests I've seen published indicate almost no improvement for DDR3 over DDR2 in an identical system. A bit lower latency is the only real improvement, and that's tiny. So what gives? How much should DDR3 improve over DDR2, except for the estimated 30% improvement in power requirements?

Economy Schmoconomy. (1, Informative)

Anonymous Coward | more than 5 years ago | (#26791827)

I, for one, welcome our new DDR3 overlords. Every time they dump something new on the market, prices on their older product lines go down. I just bought a 9850 for $135, about $40 cheaper than late last year, before the Phenom II line hit the streets. It's the fastest processor my motherboard officially supports, and it cost less than the dual-core it replaced.

I always intentionally build my gaming rigs a notch or two below bleeding edge because it's so much cheaper. But, if "they" were to stop pushing the edge forward, where would I be?

Despite Poor Economy (0)

Anonymous Coward | more than 5 years ago | (#26791949)

That's why YOU are not in charge of anything!!!
