Intel and AMD May Both Delay Next-Generation CPUs

Soulskill posted about 3 years ago | from the make-it-so dept.

MojoKid writes "AMD and Intel are both preparing to launch new CPU architectures between now and the end of the year, but rumors have surfaced that suggest the two companies may delay their product introductions, albeit for different reasons. Various unnamed PC manufacturers have apparently reported that Intel may push back the introduction of its Ivy Bridge processor from the end of 2011 to late Q1/early Q2 2012. Meanwhile, on the other side of the CPU pasture, there are rumors that AMD's Bulldozer might slip once again. Apparently AMD hasn't officially confirmed that it shipped its upcoming server-class Bulldozer products for revenue during August. This is possible, but seems somewhat unlikely. The CPU's anticipated launch date is close enough that the company should already know if it can launch the product."

BRING BACK THE K5 TEAM !! (0)

Anonymous Coward | about 3 years ago | (#37298810)

They got shit done !!

Re:BRING BACK THE K5 TEAM !! (1)

eeek (83889) | about 3 years ago | (#37298850)

Yes, but the K5 was a lot less complex. There's a lot more room for bugs in modern CPU designs.

Re:BRING BACK THE K5 TEAM !! (0)

Anonymous Coward | about 3 years ago | (#37298882)

K5 team was NextGen or whatever the hell that was. They were all fired for the really, really bad performance amid all the hype. Oh, yeah ... !!

Re:BRING BACK THE K5 TEAM !! (0)

Anonymous Coward | about 3 years ago | (#37299104)

No, the K5 was developed by AMD before they bought Nexgen. The K6 was based on the Nexgen 686 CPU.

Both the K5 and K6 series of CPUs were pretty good. They cost a lot less than the Intel CPUs of the time and offered better integer performance. The Intel CPUs had better FPUs, but 3DNow! made up for that somewhat.

Re:BRING BACK THE K5 TEAM !! (1)

Billly Gates (198444) | about 3 years ago | (#37298896)

I thought the k5 never saw the light of day? K6 was the first AMD success.

Bulldozer might give Intel a run for its money in the low-to-mid-end notebook and sub-notebook markets. Finally, an integrated GPU that is not 7 years behind dedicated video cards. That is a plus for PC games, and AMD's graphics are better than Intel's.

Re:BRING BACK THE K5 TEAM !! (0)

Anonymous Coward | about 3 years ago | (#37298924)

My god, man, get thee to a nunnery quick, and then ask the hoes about GooGleeGoo.

Re:BRING BACK THE K5 TEAM !! (1)

sortius_nod (1080919) | about 3 years ago | (#37300172)

A quick Google search would answer your questions. The K5 was released (it was a competitor to the Intel Pentium and the IBM/Cyrix 586); the K6 was up against the PII and suffered from major thermal problems. While it did give Intel a little bit of a worry as far as sales go, the K6/K6-II's weren't exactly powerful. It wasn't until the Athlon (K7) that Intel shat themselves.

Success is a very loose term for a processor with major problems.

Re:BRING BACK THE K5 TEAM !! (1)

Mindflux0 (2447336) | about 3 years ago | (#37300366)

I remember when someone I knew (who thought he was great with computers but actually didn't know what he was doing) decided to overclock his K6.
It ran so hot that it fused to his motherboard before booting into windows.

Re:BRING BACK THE K5 TEAM !! (1)

bigsexyjoe (581721) | about 3 years ago | (#37299906)

Yeah. What is Rusty Foster even doing these days?

Re:BRING BACK THE K5 TEAM !! (0)

Anonymous Coward | about 3 years ago | (#37300234)

At least the money he conned from K5 readers must have run out by now. Unless he managed to invest it profitably and is living off the proceeds.

Frist psot!! (-1)

Anonymous Coward | about 3 years ago | (#37298834)

OMG I actually did it!!

Re:Frist psot!! (0)

Anonymous Coward | about 3 years ago | (#37298854)

Everyone loves idiots.

Seems perfectly reasonable (1, Interesting)

Mensa Babe (675349) | about 3 years ago | (#37298840)

Here are some links that were missing in the story:
  1. Bulldozer [wikipedia.org]
  2. Ivy Bridge [wikipedia.org]

People seem to be surprised by the delay, and I have exactly the opposite reaction to that story. I remember when I was reviewing the first drafts of Bulldozer (or actually Piledriver, to be more specific), and I was surprised that the original date when it was planned to be released back then would have made it way ahead of the curve predicted by Gordon Moore. I was saying that it should be delayed some time so it would track the prediction more closely, and it has been delayed, though I will never know whether that was the reason. The point is that in this industry there is something called "too good, too soon," which is not always desirable. Nevertheless, I hope both Bulldozer and Ivy Bridge will be available soon, because they are both brilliant pieces of engineering.

Re:Seems perfectly reasonable (1)

Bengie (1121981) | about 3 years ago | (#37299192)

" would have made it way ahead of the curve predicted by Gordon Moore"

They are ahead of the curve. Intel is postponing because of their recent trigate tech. Intel could still release 22nm right now if they wanted, but it wouldn't be worth it with such a huge difference in power leakage.

Now, if you want to look at processing power, look at GPUs. They're already beyond doubling every 18 months and are expected to approach 2.5x-3x per 18 months in the next 2-3 years.
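
For context on what those scaling rates would imply year over year, here is a small back-of-the-envelope sketch in Python. It is purely illustrative: the 2x, 2.5x, and 3x-per-18-months figures are the ones claimed in the comment above, not measured data.

# Convert "a factor of F every 18 months" into an equivalent per-year factor.
def annual_factor(factor_per_18_months: float) -> float:
    """F every 18 months corresponds to F**(12/18) per year."""
    return factor_per_18_months ** (12 / 18)

for label, f in [("Moore-style 2x", 2.0), ("claimed 2.5x", 2.5), ("claimed 3x", 3.0)]:
    print(f"{label} per 18 months ~ {annual_factor(f):.2f}x per year")

# Approximate output:
#   Moore-style 2x per 18 months ~ 1.59x per year
#   claimed 2.5x per 18 months ~ 1.84x per year
#   claimed 3x per 18 months ~ 2.08x per year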

Re:Seems perfectly reasonable (1)

Redlazer (786403) | about 3 years ago | (#37299372)

Thank you so much for this response. Mod parent up.

Re:Seems perfectly reasonable (3, Interesting)

Skarecrow77 (1714214) | about 3 years ago | (#37300328)

I was surprised that the original date when it was planned to be released back then would have made it way ahead of the curve predicted by Gordon Moore. I was saying that it should be delayed some time so it would track the prediction more closely, and it has been delayed, though I will never know whether that was the reason.

So... If I'm reading you correctly... the reason that they should have delayed the hardware wasn't because of something like it being too expensive to produce for expected market, too difficult to produce in sufficient yields, or any other technical or business reasons that might exist, but because the number of transistors involved didn't match up to a prediction made 30-some-odd years ago?

You realize that prediction has only "come true" when you average the graph over a very long period, and there are significant statistical outliers (that represent significantly successful chips in their day) along that plot?

Wait, wait... you're trolling right? I admit, you got me!

Collusion (3, Interesting)

parlancex (1322105) | about 3 years ago | (#37298856)

There might be good reasons on both sides, but the tinfoil hatter in me believes this might have more to do with the fact that both companies might want to see a little more profit out of the R&D that went into the current generation of products before obsoleting them. The performance of the current generation is high enough that it is getting harder to introduce a new generation at a price point that could both recover R&D and provide reasonable value for the customer.

Ya right (4, Interesting)

Sycraft-fu (314770) | about 3 years ago | (#37298930)

The current situation is Intel is slaughtering AMD. AMD hasn't had an architecture update in a long, long time and it is hurting them. Clock for clock their current architecture is a bit behind the Core 2 series, which is now two full generations out of date. Their 6 core CPU does not keep up with Intel's 4 core i7-900 series CPU, even on apps that can actually use all 6 cores (which are rare). Then you take the i5/7-2000 series (Sandy Bridge) which are a good bit faster per clock than the old ones and there is just no comparison.

On top of that, Intel is a node ahead in terms of fabrication. All Sandy Bridge chips, and many older ones, are on 32nm. AMD is 45nm at best currently. Not only does that equal more performance but it equals lower heat for the performance, particularly for laptops. Then of course Intel is talking about Ivy Bridge, which is 22nm, another node ahead. Their 22nm plant is working and they've demonstrated test silicon so it will happen fairly soon.

The situation is not good for AMD. All they've got is the low end and that is getting squeezed hard by Intel too. They need a more efficient CPU and they need it badly. Delaying is not something they want to do, Bulldozer has been fraught with delays as it is. They've been talking about it for a long time, like since 2009, and delivered nothing.

They have every reason to want to get Bulldozer out as soon as possible and preferably before Ivy Bridge. Each generation that Intel releases that they don't have a response for just puts Intel that much farther ahead.

Now that said, Intel may well have decided to hold Ivy Bridge if AMD can't deliver Bulldozer because they don't need to. Sandy Bridge CPUs are just amazing performers, they don't need anything better on the market right now. However I can't imagine AMD colluding with Intel on this. They are not in a good situation.

Re:Ya right (1)

Gaygirlie (1657131) | about 3 years ago | (#37299036)

I hope AMD can get back on their feet soon, and hopefully will some day in the future offer some real competition to Intel. It would be generally good for consumers by lowering prices and pushing both companies to keep on innovating. But it doesn't look likely :/

I've always liked AMD's products, and the whole underdog-fighting-for-its-life thing is easy to sympathize with. It's just sad to see that they never seem to be able to properly compete with Intel; there's always something wrong or amiss.

Re:Ya right (1)

aztracker1 (702135) | about 3 years ago | (#37300792)

I think the Pentium 4 days really f'd over AMD in the OEM space; since the Core/Core 2, Intel has held the crown, though for most uses AMD's E-350 is a really nice offering. I think Bulldozer will have advantages in the low-to-mid end, while Intel may keep the raw CPU speed crown. I'm more interested in seeing an Nvidia Tegra 3 in laptop/desktop options myself. I think we've gotten so used to the bigger-better cycle that we've lost sight of the fact that 5+ year old tech is more than fast enough for anything most people want to do. I think we're really close to a general-purpose CPU nirvana now. At least in the consumer space.

Re:Ya right (3, Insightful)

laffer1 (701823) | about 3 years ago | (#37299038)

I don't believe the bit about 32nm is accurate. I just ordered a new laptop with an AMD A6-3400 CPU. This is a fusion based chip and is 32nm.

As far as the performance claims regarding 6-core AMD chips, I have to agree with that. However, the cost of an Intel chip is not worth it. My 6-core AMD upgrade saved me hundreds of dollars, and it still doubled my StarCraft 2 framerate over my Phenom 9600 X4.

Intel stuff is faster if you have the money. It's not fanboyism, just practical price/performance based on benchmarks.

Re:Ya right (2)

Billly Gates (198444) | about 3 years ago | (#37300056)

I bought a nice 6-core Phenom X6 1035T. It is underclocked to only 2.6 GHz, but for $450 I got 8 gigs of RAM and hardware virtualization (SVM) to run VMware. With the 6 cores and 8 gigs of RAM it rocks to have 3-4 VMs running for the price I paid.

With an ATI 5750 that came with it, games run reasonably well too. As soon as I upgrade the PSU I plan to flash the BIOS so I can clock my CPU to 3.2 GHz. Asus crippled it, but there are hacks to get around that.

For value, the AMD Phenom II is only 4-7% slower and well worth the price. Seriously, they are just about as fast. Intel pays people ... cough, Tom's Hardware ... cough ... to find benchmarks and then stretch the graphs to make it look like Sandy Bridge is 40% faster. It really is not.

Also, the integrated graphics on Bulldozer and Llamo give Intel a run for its money too. The Windows 8 tablet UI uses IE 10 for HTML 5 hardware acceleration. Graphics will be much more important.

Re:Ya right (1)

m.dillon (147925) | about 3 years ago | (#37300062)

Intel core-i5-2310 Sandy Bridge - $190.
AMD Phenom II X6 1090T Black - $170.

That's a $20 difference and BTW the i5 blows away the Phenom (any Phenom). You don't even need an i7.

Intel is able to price their CPUs at a bit of a premium over AMD, which is why Intel is rolling in money and AMD is not. But there's a good reason why Intel has that pricing power, and it's one word: "SandyBridge".

It is also true that the absolute highest-end unlocked Intel cpu is priced at a very serious premium... but if you are trying to compare roughly similar cpus there's nothing to compare that against.

The AMD Phenom (AM3 socket) series has one advantage over Intel for consumer cpus, and that is they all support ECC while Intel's consumer SandyBridge does not (caveat: you have to find an AMD mobo which supports ECC, not all of them do even though the socket format does). You have to move on to Intel's Xeon SandyBridge to get ECC, and there the pricing premium becomes significant. Very few people want the added cost of ECC (I seem to be the only one who really cares :-( ) for a consumer cpu. Intel clearly has pricing power here too.

-Matt

Re:Ya right (0)

Anonymous Coward | about 3 years ago | (#37300772)

http://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i5-2310+%40+2.90GHz [cpubenchmark.net]

Look at two things: one, the Phenom that is beating it, and two, the second graph that shows CPU mark per price. It certainly looks like the Phenom II isn't that easily thrashed, especially compared to the i5 you suggested.

Re:Ya right (1)

aztracker1 (702135) | about 3 years ago | (#37300830)

You also have to consider a few other things: integrated graphics options, especially in laptops, and/or motherboard options. This is where the pricing really favors AMD. For my mom, father, brother, sister, grandparents, and others, an E-350 based system is sufficient, and AMD's other integrated GPU options all exceed Intel's. This is why even my recent builds have gone about half and half, often in favor of AMD. Total system cost for a non-suck system (sub-$750) really favors AMD. I think where AMD needs to firm up to compete is in really low power and Linux GPU support.

some people actually use their computer (1)

dutchwhizzman (817898) | about 3 years ago | (#37300728)

for something more useful than Starcraft. I agree you don't need 6 cores and 4G of RAM to read your e-mail, but today the workload of the average server or desktop includes running virtual machines, virus scanners, full encryption, Flash websites, and whatnot. The laptop I was "given" 2 months ago has a brand new 4-core Intel, 4G of RAM, and Nvidia Quadro graphics, and it's too slow to run my normal workload of terminals, a browser, and VMs. Given that I'm a contractor, spending a little extra on a faster CPU would probably pay for itself in less than a week for my employer. Sure, if they'd stop mandating W7 on the desktop with full encryption and on-access virus scanning, the world would be a better place for me, but most likely not for the company.

For servers, almost everything is running on VMs now. More power per CPU is very welcome there, since you can run faster/more VMs per box. The less heat you produce, the more servers you can put in a data center. Given the cost for real estate at prime interconnect sites, it's profitable to go green, even if you're not a tree hugging hippie.

Re:Ya right (0)

Anonymous Coward | about 3 years ago | (#37299078)

Llano is a 32nm APU: a quad-core compute unit and a 6xxx-class Radeon chip in a $120, 100W package. If you need both, it's a great deal.

The situation is not good for AMD. All they've got is the low end and that is getting squeezed hard by Intel too. They need a more efficient CPU and they need it badly. Delaying is not something they want to do, Bulldozer has been fraught with delays as it is. They've been talking about it for a long time, like since 2009, and delivered nothing.

Agreed. The APU is great for the "regular" system, but they have nothing for the server market anymore. Not if the servers are loaded. The C32 and G34 socket processors are OK, but performance per watt is not superior to Intel's offerings. Performance per dollar is OK on the low end. Again, the low end.

Also, they have nothing in the performance market. If I want to spend no more than $100-$120 on a CPU, then AMD is a great choice. But if my workload is CPU limited, Intel is the only way to go.

On the other hand, the CPU market is kind of "good enough" as is. There just isn't that much that needs to be pushed onto the CPU anymore unless you have a specific workload you need the CPU for. GPU integration is far superior on the APU compared to Sandy Bridge, and Intel's solution is quite limiting. You can do most graphics things on the APU, but Sandy Bridge's built-in graphics cannot compete. Of course, if AMD is thinking this, they will remain on the low end.

Anyway, I'm hoping AMD can deliver a Bulldozer that can compete with Intel's offerings.

PS. Intel is delaying their next gen because they want to milk the current gen and announce it *after* AMD unveils Bulldozer.

Re:Ya right (1)

dbIII (701233) | about 3 years ago | (#37300478)

Agreed. The APU is great for the "regular" system, but they have nothing for the server market anymore.

It depends what you mean by server. The twelve core AMD chips beat anything Intel can sell you if you have tasks that are going to use as many cores as they can get. Stick four of them in a SuperMicro board with a bit of memory and you have something that will outperform a far more expensive four socket Intel Xeon based machine.

Re:Ya right (1)

Billly Gates (198444) | about 3 years ago | (#37299084)

The benchmarks I have seen only show a 5% decrease in performance. You won't notice that unless you are running simulations or doing something hardcore. It is not like it is a big drop.

Llamo, on the other hand, has much better graphics compared to an Intel Atom... hell, even a full-speed i5! Bulldozer will have even better graphics. If you just run IE 9/10, Flash, and Office, the AMD Llamo and Bulldozer will seem faster and less choppy. They can also run games like World of Warcraft as well. Sub notebooks suck and can do these things and run these games.

If you own a dedicated GPU, open IE 9 on a Google video and scroll with the up and down arrows. IE 9 is fluid and smooth compared to any other browser. If you have a crappy video card it won't seem any different than Chrome or Firefox.

Windows 8 will be hardware accelerated for video with IE 10 integration, and a better integrated GPU that is on par with a dedicated graphics card will be better for average users.

AMD has some nice products out now, and the next generation will be great for netbooks, notebooks, and teenagers who can finally play more than Angry Birds on their sub-$400 notebook.

Re:Ya right (1)

cyber-vandal (148830) | about 3 years ago | (#37299108)

Llano. AMD have been fucking up badly recently but even they wouldn't name their new chip Lamo.

Re:Ya right (1)

wagnerrp (1305589) | about 3 years ago | (#37299776)

Llamo, on the other hand, has much better graphics compared to an Intel Atom... hell, even a full-speed i5! Bulldozer will have even better graphics. If you just run IE 9/10, Flash, and Office, the AMD Llamo and Bulldozer will seem faster and less choppy. They can also run games like World of Warcraft as well. Sub notebooks suck and can do these things and run these games.

Llano has much better graphics than Intel's offerings, too good in fact. They stuffed in far more performance than has ever been seen on an embedded GPU, and bottlenecked it severely on the CPU's memory bus. It can't run full speed. It can't even run half speed without running into bandwidth limits. There is good reason why its discrete brethren are reaching into the triple digits. All they're doing is sucking down more power and real estate on silicon that can't be properly used.

Bulldozer won't change anything. The desktop systems get a modest increase in supported memory frequency, but now you're sharing that memory bus with a more powerful CPU. The workstation and server grade processors have double the memory channels, but people who buy those couldn't care less about onboard graphics. Those chips are either going into a server rack, where all you need is something that can drive a monitor, or workstations, where you're going to pair it with a proper discrete graphics card.

Re:Ya right (1)

Billly Gates (198444) | about 3 years ago | (#37299994)

It can share fine if it uses the same dedicated memory controller without doing an interrupt to the CPU or chipset each time it needs to access ram. That is what crippled the other integrated chipsets.

If it accesses the RAM via the CPU's memory controller, then the bandwidth is much higher. Bulldozer can do this even quicker with the same memory bus. It will probably get close to a dedicated card, since it does not have to wait for the CPU to finish accessing the RAM first; multiple channels are used, and you no longer have the issue of waiting on the chipset before it can access the RAM like the older integrated solutions.

Not quite as fast, but even if it performs as fast as an Nvidia 8600 GTS, that will be good for cheap solutions and laptops. My guess is Crysis and World of Warcraft mixed with Flash running 1080p will be fine. Crysis maybe at 1280x1024 instead of 1080p, but that is fine. One Slashdotter who owns a Llamo system said he can run Sims 3 and full-screen 1080p video surprisingly well, which shocked him. These are not cutting edge like a dedicated gaming desktop, but for a $450 laptop, not bad. Certainly with Windows 8, running HTML 5 applets with IE 10 hardware acceleration, it will easily best Intel's offerings in user experience on cheap devices.

Re:Ya right (1)

0123456 (636235) | about 3 years ago | (#37300744)

It can share fine if it uses the same dedicated memory controller without doing an interrupt to the CPU or chipset each time it needs to access ram. That is what crippled the other integrated chipsets.

No it's not. Older integrated GPUs were crippled by low core performance, not memory bandwidth (though, to be fair, they couldn't have high core performance because the memory bandwidth was so low).

Modern CPUs want a lot of memory bandwidth. Modern GPUs want a staggering amount of memory bandwidth (the GTX 580, for example, has around 200 gigabytes per second and even the 8600 GTS has 32 gigabytes per second, whereas a dual-channel DDR3-1600 system only has 25 gigabytes per second). Stick both of those on one chip and you're going to be starving both of them unless you have at least four memory channels.

Admittedly as games become more shader-heavy the demand for increased bandwidth may drop, but you're still way behind the capabilities of discrete cards.
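
The dual-channel DDR3 figure cited above can be checked with simple arithmetic. Here is a minimal Python sketch of that calculation (theoretical peak bandwidth only, ignoring protocol overhead; the discrete-card numbers are the ones quoted in the comment, not recomputed).

# Theoretical peak DDR bandwidth: transfers/sec * bytes per transfer * channels.
def ddr_bandwidth_gbs(transfers_per_sec: float, bus_width_bits: int, channels: int) -> float:
    """Peak bandwidth in GB/s (decimal), ignoring protocol overhead."""
    return transfers_per_sec * (bus_width_bits / 8) * channels / 1e9

# Dual-channel DDR3-1600: 1600 MT/s on a 64-bit bus, two channels.
print(f"Dual-channel DDR3-1600: ~{ddr_bandwidth_gbs(1600e6, 64, 2):.1f} GB/s")  # ~25.6 GB/s

# Figures cited in the comment for discrete cards, for comparison.
print("8600 GTS (as cited): ~32 GB/s")
print("GTX 580 (as cited):  ~200 GB/s")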

Re:Ya right (1)

malkavian (9512) | about 3 years ago | (#37299094)

People don't really care about clock for clock these days.. The kicker is in energy efficiency and cost for performance.
AMD do reasonably well in the energy efficiency and really well in the "Bang for Bucks" department. Yep, Intel currently outstrip them on the high end, but AMD have a lot of the mid to low range market, and still have a good showing in the server market. I wouldn't exactly call that 'getting slaughtered'..
Still, as you say, they do need to get newer architectures out the door to keep being competitive.

Re:Ya right (1)

0123456 (636235) | about 3 years ago | (#37299126)

AMD do reasonably well in the energy efficiency

Where? Every benchmark I've seen puts AMD well behind the i3 and i5 in performance/power and the idle consumption of the i3 and i5 isn't much worse than an Atom.

Re:Ya right (1)

hedwards (940851) | about 3 years ago | (#37299168)

To an extent yes, but Atom sucks, I mean seriously, Intel ought to have been too embarrassed to let that dog see the light of day. And yes, the Intel offerings do offer better battery life, but at a cost, the only ones I looked at were several hundred dollars more. Battery life is great, but with that much extra on the price tag you might as well just buy a couple extra batteries.

Re:Ya right (1)

symbolset (646467) | about 3 years ago | (#37299394)

Well, for sure I wouldn't have put the founder's name on the Moorestown product. That's just asking for trouble.

Re:Ya right (1)

wagnerrp (1305589) | about 3 years ago | (#37299794)

The Atom was a quick and dirty way for Intel to get into the tablet and netbook business as that market was taking off. Its only reason for existence is that it is an x86 processor, and thus can run Windows. Expect to see sales drop off rapidly when Windows 8 comes out with ARM support.

Re:Ya right (0)

Anonymous Coward | about 3 years ago | (#37300198)

Windows 8 on ARM was pretty much Microsoft admitting defeat on behalf of Intel by saying "Intel's products' battery-life-to-performance ratio is inadequate", probably the same thing Apple is saying to Intel as well.

With Motorola and IBM on the POWER/PPC architecture it was Price/Performance.

As for ARM, in the grand scheme of things, we might at some point see the x86/x86-64 phased out in favor of ARM if Intel can't come up with a way to compete on the performance/thermal scale. So far all their "low power" parts are simply clocked down versions of the same bin. Everyone knows the limitation of the x86-64 architecture, but so few people code directly in assembly that it doesn't matter, the C/C++/OBJC compilers just code it for the CPU in the machine, and we get performance penalties when the optimized ASM can't be used (see pretty much all emulators, including DOSBOX where x64 native mode is unusable due to lack of ASM cores.)

Remember how emulating a x86 on PPC was a piece of cake, but the reverse was often considered impossible, or too slow to be useful (see Rosetta.) This is why Microsoft switching to ARM is going to be something "new" , and hopefully they do exactly what Apple did with iOS and cut the damned OS to the bone with a stripped down UI otherwise we're just going to have yet another terrible piece of software from Microsoft.

Re:Ya right (1)

0123456 (636235) | about 3 years ago | (#37300268)

Remember how emulating a x86 on PPC was a piece of cake

As someone who's written x86 emulators, that comment would have destroyed my laptop if I had been drinking coffee at the time.

Re:Ya right (0)

Anonymous Coward | about 3 years ago | (#37299518)

That's a benchmark. Real life usage is likely to be very different.

Re:Ya right (2)

0123456 (636235) | about 3 years ago | (#37299632)

That's a benchmark. Real life usage is likely to be very different.

So the i5 uses less power than the best remotely comparable Phenom at idle, uses less power under 100% load, yet magically uses more power in between?

I guess it's possible, but not exactly likely.

Re:Ya right (1)

Anonymous Coward | about 3 years ago | (#37299118)

While I will have to agree that Intel is ahead of AMD by at least several steps, if you look at best bang for your buck gaming systems, AMD CPU's are still always the top pick. While I run an Intel CPU personally I build plenty of budget gaming systems with AMD and they still perform very well for a whole lot less money. Not only is the CPU cheaper, but the Motherboards are cheaper and the heatsinks are cheaper too because AM3 has been around forever now. A decent gaming system for $500 is a possibility thanks to AMD CPU's AND cheap AM3 motherboards.

Re:Ya right (1)

m.dillon (147925) | about 3 years ago | (#37300028)

I think this was true ~8 months ago but Intel mobos are priced about the same as AMD mobos these days... really ever since SandyBridge came out. There is so much chip integration now that the only real differentiation between mobos is added features and BIOS software.

Most of the costs involved in building a gaming system are unrelated to the cpu. I don't count built-in graphics as being decent, though you might, and I've fried enough systems with cheap PSUs that I don't buy cheap PSUs any more. So my concept of a decent gaming system is probably ~$100 to ~$150 more than yours.

At best whatever price advantage AMD might have goes away in $$ savings from the lower power consumption Intel systems have.

-Matt

Re:Ya right (1)

zixxt (1547061) | about 3 years ago | (#37299136)

The current situation is Intel is slaughtering AMD. AMD hasn't had an architecture update in a long, long time and it is hurting them. Clock for clock their current architecture is a bit behind the Core 2 series, which is now two full generations out of date. Their 6 core CPU does not keep up with Intel's 4 core i7-900 series CPU, even on apps that can actually use all 6 cores (which are rare). Then you take the i5/7-2000 series (Sandy Bridge) which are a good bit faster per clock than the old ones and there is just no comparison.

Intel only beats AMD with their most recent SandyBridge chips. If you go by the sites that get Intel kickbacks doing synthetic benches, then yeah, Intel wins; however, if you look for real-world benches not using crap like SuperPi or Cinebench, then AMD CPUs win some and lose some, and Intel is not an overall faster chip.

Re:Ya right (2)

0123456 (636235) | about 3 years ago | (#37299152)

Intel only beats AMD with their most recent SandyBridge chips

Intel has been beating AMD since the Core-2, only AMD fanboys claim otherwise. Mostly by saying 'but, but, if you run benchmarks at 3840x2160 then the CPU is irrelevant'. Well, duh.

but ... but ... (1)

unity100 (970058) | about 3 years ago | (#37299416)

'but, but, if you run benchmarks at 3840x2160 then the CPU is irrelevant'

It is. It is quite irrelevant. Anyone who is spending money to get good performance in the vicinity of the resolution you mention is either an extreme enthusiast, a hobbyist, or a moron.

Re:Ya right (2)

IorDMUX (870522) | about 3 years ago | (#37299150)

Well, the article basically said that there is no reason to believe that Bulldozer is delayed at all. I dunno why the title reads "Intel and AMD may both delay"

... wait. Yes, I do. To get readers.

From the article:

The CPU's anticipated launch date is already close enough that the company should already know if it can launch the product or not; waiting until now to announce a delay isn't something Wall Street would take kindly. Moreover, AMD has been fairly transparent about its launch dates and delays ever since the badly botched launch of the original K10-based Phenom processor back in 2007. Llano has been shipping for revenue for several months, and we're not aware of any 32nm production troubles at GlobalFoundries.

Re:Ya right (1)

Kjella (173770) | about 3 years ago | (#37299162)

On top of that, Intel is a node ahead in terms of fabrication. All Sandy Bridge chips, and many older ones, are on 32nm. AMD is 45nm at best currently.

No, the Llano chips are shipping and are 32nm SOI. However, only the low-power Bobcat cores are made on that process; they need the high-power Bulldozer cores to compete with Intel on performance. But yes, AMD still ships very many 45nm chips.

Re:Ya right (2)

Daniel Phillips (238627) | about 3 years ago | (#37299218)

The current situation is Intel is slaughtering AMD.

Then why do I only buy AMD (and ARM) these days? Frankly, Intel just seems to fib about their power envelope every generation and I do not, repeat, do not like to be surrounded by noisy computers. Currently running a quietized 4 way Phenom II box, very happy with it. I have not been happy with any intel box as a workstation for quite some time. Nothing beats the Pentium M in my aging Shuttle for a basically silent server (21 db @ 3 meters). Every other Intel box I have run recently requires stupid amounts of cooling. As far as servers go, Intel wins on minimum latency, so Intel owns the data centers of the financial industry, but AMD wins on mips/watt and mips/dollar, so AMD owns a disproportionate share of the throughput in the top 500 list. And AMD owns the space under my desk.

Re:Ya right (1)

0123456 (636235) | about 3 years ago | (#37299276)

Then why do I only buy AMD (and ARM) these days? Frankly, Intel just seems to fib about their power envelope every generation and I do not, repeat, do not like to be surrounded by noisy computers. Currently running a quietized 4 way Phenom II box, very happy with it.

You'd have been happier with an i3 or i5. I can just hear the fans on my i5 server when I stand with my ears a few inches away from it.

Re:Ya right (1)

Rich0 (548339) | about 3 years ago | (#37300032)

Not at the same price point. For what he'd spend on the i3/i5 he could probably have water cooling or something else that is ridiculous. That is the thing these kinds of comparisons always leave out - cost. I've been running AMD systems for a while now - I can upgrade a box every two years, when buying Intel would mean I'd be upgrading every four years. While in years 1-2 the Intel system would be somewhat faster, in years 3-4 the AMD system would be miles ahead. Plus I'm sinking less money into what is a quickly depreciating asset.

Re:Ya right (1)

Billly Gates (198444) | about 3 years ago | (#37300550)

I do not understand. I run an AMD too, and until last fall I was CPU agnostic. AMDs were more bang for the buck and I wanted a system with ATI graphics. Are you talking about upgrading more often with the money saved? Because in 3 to 4 years your system will be slow anyway, no matter if you paid $1,000 or $5,000.

Re:Ya right (1)

Daniel Phillips (238627) | about 3 years ago | (#37300638)

You'd have been happier with an i3 or i5. I can just hear the fans on my i5 server when I stand with my ears a few inches away from it.

Both Intel and AMD give the TDP of their four core parts (i5 for Intel, Phenom II for AMD) as 95 watts. The difference is that I believe AMD. And I wonder about your belief that an i5 box would be quieter than mine with similar components, or your definition of what a few inches is, or whether you have the stereo on when you post to Slashdot. I have had people tell me with great assurance that an Xbox 360 is quieter than a PS3, or that they can't hear their PS3 when watching a movie, both patently absurd with a moment's first-hand experience.

Re:Ya right (1)

0123456 (636235) | about 3 years ago | (#37300760)

Both Intel and AMD give the TDP of their four core parts (i5 for Intel, Phenom II for AMD) as 95 watts

And I've never seen the power consumption of the i5-2400 go much above 50W in benchmarks. So perhaps Intel are lying; the i5 seems to use about half as much power as they claim it does.

Perhaps rather than believing either company's numbers you should actually try measuring them?

Re:Ya right (0)

Anonymous Coward | about 3 years ago | (#37299368)

Then why do I only buy AMD (and ARM) these days?

Because you're an AMD fanboy?

Re:Ya right (1)

LordLimecat (1103839) | about 3 years ago | (#37299496)

There are worse things to be. AMD needs all the help they can get, and I really don't want the CPU market to become a one-horse race.

Re:Ya right (1)

NoNonAlphaCharsHere (2201864) | about 3 years ago | (#37299692)

x86 isn't "the CPU market". Less so every day.

Re:Ya right (0)

Anonymous Coward | about 3 years ago | (#37299800)

It's the desktop and server CPU market, though. Anything else in those areas is just a tiny niche.

Re:Ya right (1)

0123456 (636235) | about 3 years ago | (#37299934)

It's the desktop and server CPU market, though. Anything else in those areas is just a tiny niche.

To be fair, ARM is likely to eat into the low end of that market over the next few years. My Atom-based server/DVR was fast enough until we got OTA HD here and it became too slow for transcoding, and the Ion Xbmc box is plenty fast enough for video playback or general desktop usage; my i5 laptop spends most of its time at 1.2GHz, where it's probably not much faster than an Atom.

So if ARM can produce a chip at least that fast (if they haven't already) I think there's a chunk of the x86 market waiting for them.

Re:Ya right (2)

rdnetto (955205) | about 3 years ago | (#37300046)

There are Nvidia Tegra 2s being sold clocked at 1.2 GHz (dual core) right now. The Tegra 3 line will be quad core 1.5 GHz with 1.5 GB RAM. With Win8 supporting ARM, I can easily see ARM netbooks/laptops becoming commonplace within the next few years.

Re:Ya right (1)

the linux geek (799780) | about 3 years ago | (#37300132)

High-end RISC/mainframe platforms make up ~35-40% of the server market (source: both IDC and Gartner's numbers for Q1 and Q2) by revenue. High-end UNIX is staying flat and mainframe use is hugely increasing. You have no fucking idea what you're talking about.

Guess what: web servers aren't everything.

Re:Ya right (1)

Daniel Phillips (238627) | about 3 years ago | (#37300654)

Then why do I only buy AMD (and ARM) these days?

Because you're an AMD fanboy?

Or maybe because I get more for my money and I am pleased with the mips per watt performance?

Re:Ya right (1)

iotaborg (167569) | about 3 years ago | (#37299288)

Ivy Bridge does include Intel's answer to ARM offerings in the lower-power areas, which they need to come out with as soon as possible. The reasons for the delays may be about more than just AMD.

Re:Ya right (1)

CajunArson (465943) | about 3 years ago | (#37299572)

Ivy Bridge can't compete with ARM... but similarly, ARM can't compete with Ivy Bridge either. IB will have its biggest advantage in thinner, lighter notebooks (the MacBook Air and the new Ultrabooks being put out by lots of different vendors). The projected power envelope is about 17 watts, which is much, much higher than even the tablet-level ARM chips. At the same time, its performance will destroy anything that ARM will have in the next 5 years (the newest ARM chips coming out next year are just barely approaching the earlier Core 2 chips, and usually require more cores and multithreading to get there). In a notebook-sized device, the 17 watt power envelope is fine and should give very nice battery life in a lightweight system.

Intel's biggest competitor isn't AMD, it's ARM, or more accurately the various ARM licensees. Intel does have a big advantage in that it can put out one easy-to-use reference platform instead of the plethora of semi-compatible ARM implementations floating around right now. Also, the Atom has been a second-class citizen that hasn't been good enough to compete in ARM's market, but the noise coming out of Intel is that this is changing, and Intel may finally be getting serious about making a solid version of Atom that can at least compete well in tablets, and eventually work down to higher-end smartphones. I'm not sure Intel even wants to try to compete in the low-end phone market, since the profits there aren't all that great to begin with.

Re:Ya right (2)

YojimboJango (978350) | about 3 years ago | (#37299608)

No offense, but I'm typing this on a $350 15" Acer laptop with an E-350 (Zacate Fusion processor). I played Portal 2, start to finish, on this thing at medium settings. It gets about 6 hours of battery life out of light web browsing. Intel may be killing AMD on the low end, but based on my comparison to a $600 HP ProBook with an i3-2ksomething, it's indistinguishable at web browsing and word processing, and the i3 just fails any time you try to run a game.

Not saying that the SandyBridge i3 isn't a better number cruncher; it's just that for real-people usage AMD is curb-stomping Intel. I can only assume marketing is the reason Intel is selling anything under $900.

Re:Ya right (1)

dutchwhizzman (817898) | about 3 years ago | (#37300774)

You have a point, but you are comparing a business notebook at the inflated suggested retail price (big companies get big discounts) to a consumer grade machine. Try opening the laptop 2000 times, type 1000 hours on it and see which one is still more or less functional. A better comparison would be an HP pavilion. You can get something like a DM4 for $450 on the HP website right now, that compares to your Acer in specs and also has the AMD chipset. The cheapest I3 pavilion is another $100 more expensive. This suggests a retail price difference of about $100 for the AMD set and the I3 set. That is substantial, but less than the $250 or almost double what you came up with initially.

Re:Ya right (1)

aztracker1 (702135) | about 3 years ago | (#37300860)

Agreed, the E-350 is such a great value. I'm a bit sad that they're not selling closer to their release price points, but it really is a testament to how well they work.

Re:Ya right (4, Informative)

sjames (1099) | about 3 years ago | (#37299620)

Actually, in Opteron vs. Xeon, AMD is doing quite well. Clock speed only gets you so far if you're bottlenecked on memory bandwidth.

Re:Ya right (1)

LWATCDR (28044) | about 3 years ago | (#37299790)

"The current situation is Intel is slaughtering AMD"
Frankly in the consumer space ARM is slaughtering Intel. The truth is that to day 90 of all desktops have more than enough CPU power. The most intensive thing most computers do today is playback HD video. Sure the I7 SandyBridge is blindingly fast but most people don't need the speed or the price tag. The new A8 and I3s show where the future is going. Fast enough with good enough graphics and low price.

Re:Ya right (2)

m.dillon (147925) | about 3 years ago | (#37299970)

Basically you are right. AMD has nothing even remotely close to SandyBridge and Bulldozer won't get them there either. I've been a long-time AMD fan, and over the years AMD has saved me bundles of money with their socket compatibility.

But AMD has to make a socket switch now and there are way too few AM3+ mobos available. Not only that but the mobos that are available are wired for compatibility.. they will work with AM3+ cpus but they won't be able to make use of all the new performance capabilities. So right now jumping to whatever AMD comes out with next is going to require a mobo replacement, and there's no point buying any current AM3+ mobo to get it.

SandyBridge is 30% faster than AMD's fastest CPU (either the X4 running all cores accelerated or the X6). In addition, SandyBridge uses 30% less power at similar load levels (whole systems are running around ~40W at idle without having to sleep). Think about it. It's a HUGE advantage for Intel.

This isn't a benchmark... this is running DragonFlyBSD (basically a BSD), and Linux will have similar results, doing things like parallel gcc compiles and such. No benchmark fakery here. These are real loads. I have many high-end AMD systems and I also have an Intel i7-2600K system, and it runs rings around both my Phenom X6 Black and my newer X4 with all four cores running at top speed (which is actually faster than the X6 in most cases).

And whatever lead AMD had with overclockers before is gone now. People have been overclocking i7's to almost 5GHz with water cooling.

SandyBridge completely blows AMD away on raw memory bandwidth too. The performance is across the board.

So Intel definitely doesn't have to rush to come out with their next architecture. They have AMD by the throat.

I'm not sure why people think ARM will blow away Intel. ARM is a slow cpu. It doesn't come close to AMD or Intel in performance. It's a cpu for portable devices. ARM does have a major advantage in low power use and 'enough' cpu suds to run devices, and they are certainly taking market share away from desktops, but you won't be finding ARMs in high-end servers any time soon (or even ever). Intel has the best fabs in the world and regardless of what happens with their tit-for-tat with Apple they will be diving into the low power arena over the next few years anyway. They'll lose some share now, but they'll get it all back in a few years.

Right now though it isn't a big deal because Intel can charge a $100-$200 premium for their cpus over AMD, while AMD is forced to sell their cpus at firesale prices just to keep the pipeline going. SandyBridge is that good. For a server that premium takes less than 2 years in reduced power consumption to zero out AMD's price advantage. Intel is a major cash machine because of this. AMD is not. Big difference.

So AMD has lost the high-end cpu war. AMD still has a fighting chance in the integrated graphics arena for lower-end machines but remember Intel has a 2 year Fab advantage. Intel can destroy AMD in this arena too if they feel AMD is getting too much good press.

-Matt
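
For anyone who wants to reproduce the kind of "real load" comparison described above (timing parallel compiles rather than running synthetic benchmarks), a minimal sketch might look like the following. It assumes a Makefile-based source tree in the current directory and that make is on the PATH; the job counts are arbitrary examples, not values from the comment.

import subprocess
import time

def time_build(jobs: int) -> float:
    """Run a clean parallel build and return wall-clock time in seconds."""
    subprocess.run(["make", "clean"], check=True, capture_output=True)
    start = time.perf_counter()
    subprocess.run(["make", f"-j{jobs}"], check=True, capture_output=True)
    return time.perf_counter() - start

if __name__ == "__main__":
    # Run on each machine under test and compare scaling across core counts.
    for jobs in (1, 2, 4, 6, 8):
        print(f"make -j{jobs}: {time_build(jobs):.1f} s")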

Re:Ya right (1)

dbIII (701233) | about 3 years ago | (#37300450)

Even on the slower moving opteron development it appears your information is stuck in June 2009.

Re:Collusion (0)

Anonymous Coward | about 3 years ago | (#37300192)

There might be good reasons on both sides, but the tinfoil hatter in me believes this might have more to do with the fact that both companies might want to see a little more profit out of the R&D that went into the current generation of products before obsoleting them. The performance of the current generation is high enough that it is getting harder to introduce a new generation at a price point that could both recover R&D and provide reasonable value for the customer.

Agreed. Plus lame economy.

Re:Collusion (1)

kermidge (2221646) | about 3 years ago | (#37300322)

Interesting thought, but maybe it's not collusion - I suspect separate things going on.

AMD needs Bulldozer on the desktop (and servers) _now_ - the K10s have been milked. They're holding their own, sorta, on servers, for the nonce, while Intel can afford to slide for nigh a year on brand name alone, even if for no other reason.

[I admit to being prejudiced towards AMD ever since reading a lengthy article in Byte circa '91 on Intel, AMD, and Cyrix; AMD's engineering philosophy and realization impressed me, even in light of all the great research done at Intel. That said, I think one of the larger errors in computing was the 68k family getting sidelined; we've been saddled with a host of unhappy consequences by continuing with x86.]

Windows 8 (2)

Billly Gates (198444) | about 3 years ago | (#37298880)

Gee, those delays mean the brand new shiny chips will just happen to come out with Windows 8. Coincidence?

Not only can you finally ditch that aging Vista or XP machine, with shiny Windows 8 but now you can have a shiny new CPU too!

Re:Windows 8 (1)

creat3d (1489345) | about 3 years ago | (#37298922)

My Core2 and WinXP would like to have a word with your unnecessary upgrades.

Re:Windows 8 (0)

xMrFishx (1956084) | about 3 years ago | (#37299010)

As would my Penryn running Vista boxen too.

Re:Windows 8 (0)

Anonymous Coward | about 3 years ago | (#37299132)

Look, a Windows ME, too statement. Coincidence?

Re:Windows 8 (1)

xMrFishx (1956084) | about 3 years ago | (#37299170)

Windows what?

Re:Windows 8 (0)

Anonymous Coward | about 3 years ago | (#37299666)

boxen

Oh Slashdot, never change!

Re:Windows 8 (0)

Anonymous Coward | about 3 years ago | (#37299760)

Here [catb.org] you go junior, from when you were in diapers.

Re:Windows 8 (1)

Lanteran (1883836) | about 3 years ago | (#37299210)

As would my 486DX and debian with you.

Re:Windows 8 (0)

Anonymous Coward | about 3 years ago | (#37299880)

I'm quite happy with my Motorola 68000 CPU.

It's awesome, and it's really easy to work with. Sometimes I can even be faster than it!

OS? Who needs an OS? I work in assembly!

I would still be on my Z80, but there was a hot coffee incident. Very sad.

Re:Windows 8 (1)

rubycodez (864176) | about 3 years ago | (#37300090)

The bronze gears of my steam powered Babbage Engine just smell better when spilled coffee boils off of them. And none of that base 2 conversion crap fouling up my results, decimal in, decimal crunched, decimal out, bi-yatches. The firebox for the boiler takes coal, #2 bunker oil, tires (fuck you al gore), or anything else that burns. I use sheets of mica for punched cards, they're waterproof, fireproof, static electricity proof, magnet proof.

Re:Windows 8 (1)

Lanteran (1883836) | about 3 years ago | (#37300150)

Nice! Specs? How many gear-yards do you spin?

Re:Windows 8 (0)

Anonymous Coward | about 3 years ago | (#37300012)

As would my Commodore PET running OpenVMS.

Re:Windows 8 (0)

couchslug (175151) | about 3 years ago | (#37299172)

"Not only can you finally ditch that aging Vista or XP machine, with shiny Windows 8 but now you can have a shiny new CPU too!"

Your ideas intrigue me and I would like (everyone else) to subscribe to them.

Re:Windows 8 (1)

hedwards (940851) | about 3 years ago | (#37299182)

Possibly, but realistically most people would do fine with AMD's Fusion-core processors, the ones they've already released. There are legitimate reasons to have more power, but for the things that people typically do, it's more than enough power.

Re:Windows 8 (0)

Anonymous Coward | about 3 years ago | (#37299834)

"Vista"

Hahahahahahahahahahaha! Good one!

old news? (2)

Verunks (1000826) | about 3 years ago | (#37299054)

We have known that Ivy Bridge will be released in 2012 since April... http://www.maximumpc.com/files/u69/sandy_bridge-e_roadmap_updated.jpg [maximumpc.com]

Re:old news? (0)

Anonymous Coward | about 3 years ago | (#37300406)

Also, if you look closely at what TFS says, it's nothing more than "rumors have surfaced", "various unnamed manufacturers", and more "rumors".

So bullshit, bullshit and more bullshit.

I could just as well write an article saying that "Rumors have surfaced that monkeys are going to take over the world", that "various unnamed bananas have apparently been eaten" and that "there are also rumors that this is because MojoKid raped them in the ass so often that they now decided to retaliate and wipe out humanity". "MojoKid himself has not confirmed whether he likes to fuck monkeys up the ass or has AIDS yet." ;)

Maybe I should make up some "news too"...
Maybe someone already thought of that...
Oh... Maybe that's the reason for this FOX-level "article"...

Probably not relevant to Moore's Law (3, Interesting)

JoshuaZ (1134087) | about 3 years ago | (#37299080)

The most naive question to ask is whether this sort of delay is relevant to Moore's law and similar patterns. There are a variety of different forms of Moore's law. We've seen an apparent slowdown in the increase in clockspeed http://www.tomshardware.com/reviews/mother-cpu-charts-2005,1175.html [tomshardware.com] . The original version of Moore's Law was about the number of transistors on a single integrated circuit, and that's slowed down also. A lot of these metrics have slowed down.

But this isn't an example of that phenomenon. This appears to be due more to the usual economic hiccups and the lack of desire to release new chips during an economic downturn (although TFA does note that this is a change from Intel's normal approach to recessions). This is not by itself a useful data point, so there is no further need to panic.

On a related note, there's been a lot of improvement in the last few years simply from making algorithms more efficient. As was discussed on Slashdot last December http://science.slashdot.org/story/10/12/24/2327246/Progress-In-Algorithms-Beats-Moores-Law [slashdot.org] , by a variety of benchmarks linear programming has become 40 million times more efficient in the last fifteen years, and only a factor of 1000 or so is due to better machines, with a factor of about 40,000 attributable to better algorithms. So even if Moore's law is toast, the rate of effective progress is still very high. Overall, I'm not worried.
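
As a quick sanity check, the factors quoted in that comment multiply out as claimed. This short Python sketch uses only the numbers cited above; the doubling-time conversion is just for comparison with Moore's law.

import math

# Factors cited for linear programming speedups over roughly fifteen years.
hardware_factor = 1_000      # attributed to faster machines
algorithm_factor = 40_000    # attributed to better algorithms

total = hardware_factor * algorithm_factor
print(f"Combined speedup: {total:,}x")  # 40,000,000x -- the "40 million times" figure

# Equivalent average doubling time over 15 years.
years = 15
print(f"Average doubling time: {years / math.log2(total):.2f} years")  # ~0.59 years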

Re:Probably not relevant to Moore's Law (1)

dbIII (701233) | about 3 years ago | (#37300538)

most naive question to ask is whether this sort of delay is relevant to Moore's law

Does it have to be? Moore doesn't work at Intel anymore so they might have a new plan. It was going to have to hit a physical limit at some point anyway which Moore would have very clearly known when he proposed it in the first place.

Sandy Bridge E first (0)

Anonymous Coward | about 3 years ago | (#37299140)

While the consumer variants for SandyBridge-E are rumored to be delayed till Q4, this next chip to hit the 'high end' consumer sector--along with workstation/server variants--has yet to hit the market so of course Ivy is not going to meet the late 2011 launch projected 2-3 years ago.

From what's been 'leaked' or rumored already, SB-E will come on LGA-2011 and the X79 chipset for consumers, with two hexa-core variants and a quad-core variant, to replace X58 and its high PCIe lane count & triple memory bus. The TDP envelopes for these chips and the Xeons have been made public now too.

In fact the Workstation/Server market is already putting info out in the public about the upcoming Xeon motherboard models (specifically Supermicro & Tyan) and it's rumored that on Sept 7th Apple may also announce the upcoming Mac Pro update based on the same cpu's & Xeon chipsets. While all of that may be paper launch at first, it does appear the workstation/server parts may have higher availability sooner than the consumer parts as well (especially since there's been nothing for a while that wasn't oriented purely at low voltage/efficiency servers in 1u & 2u packages.)

Re:Sandy Bridge E first (0)

scalarscience (961494) | about 3 years ago | (#37299154)

Woops meant to post that under my account, didn't realize this machine wasn't logged in...wish I knew how to delete a posting made accidentally under AC status. Any mods able to check ip/post time?

Re:Sandy Bridge E first (0)

rubycodez (864176) | about 3 years ago | (#37299974)

if the slashdot developers weren't so hellbent on making this site a steaming pile of web 2 bloatware, they would instead put in useful features like the ability to log in while posting, like they had long ago. but instead it's going Ubuntu Unity, bazooka barfing the firefox 4 5 6 7 8 9, Gnome 3ing, KDE Krapwaring, lubing the Vista Aero Glass Ass....

Win (0)

Anonymous Coward | about 3 years ago | (#37299156)

But surely once their Bulldozer is out the door they will bulldoze those sandy bridges away. Just the fact that Intel keeps changing their sockets while any AM3+ motherboard will be able to support the next generation of CPUs alone is a good enough win in my book. As soon as the box is delivered I can have my computer upgraded in under 10 minutes and be good to go for at the very least another year.

Re:Win (1)

ifiwereasculptor (1870574) | about 3 years ago | (#37300692)

any AM3+ motherboard will be able to support the next generation of CPUs alone is a good enough win in my book. As soon as the box is delivered I have to have have my computer upgraded in under 10 minutes

FTFY

No, but seriously, we went from AM2 to AM2+ to AM3 to AM3+, so compatibility is not that great. I just got my AM3 box last month (because going Intel would cost me about $100 more and perform a tad slower in Premiere) and I don't know if I'll be able to upgrade to Bulldozer, since not all AM3 boards will work with AM3+ CPUs. An AM2+ Phenom II X4 920 owner will certainly have to buy a new mobo. Sure, it's a bit better than what Intel does, but you can only go one generation further and, frankly, I don't think it's worth it, since if you build your PC carefully, you won't really need a performance boost in such a short time.

x86 running out of steam? (0)

Anonymous Coward | about 3 years ago | (#37300130)

I really wonder how much profit there is anymore with these x86 processors. I personally have one fast machine for development that I tend to use remotely and a bunch of low power machines otherwise, including tablets I hold. I'm absolutely not interested in these burning infernos anymore.

It's time to take the whole PC down to sub 10W, and I believe there will be a push for this to be the mainstream within 5 years.

AMD! (2)

AncientFalcon (2309610) | about 3 years ago | (#37300788)

I will continue to buy AMD. I've compared my sub-$500 AMD rigs with comparable Intel rigs, and I don't see the point of spending 2 to 4 times the money for Intel over AMD when AMD does a fine job. My Phenom II 945 has served me well: runs cool, runs fast, and everything I put on it, it takes like a champ. I have yet to stress out the Phenom. I've run multiple games on it, and done audio and video work on it. The only "advanced" things I haven't done on it are CAD and seti@home. Why spend 2 or 3 times as much for the Intel, when all I'm buying is a name??? You Intel fanbois go ahead, spend your money and feed the giant, douchebags.