AMD FX-8350 Review: Does Piledriver Fix Bulldozer's Flaws?

Soulskill posted about 2 years ago | from the driving-piles-and-dozing-bulls dept.

An anonymous reader writes "AMD just officially took the wraps off Vishera, its next generation of FX processors. Vishera is Piledriver-based like the recently released Trinity APUs, and the successor to last year's Bulldozer CPU architecture. The octo-core flagship FX-8350 runs at 4.0 GHz and is listed for just $195. The 8350 is followed by the 3.5 GHz FX-8320 at $169. Hexa-core and quad-core parts are also launching, at $132 and $122, respectively. So how does Vishera stack up to Intel's lineup? The answer to that isn't so simple. The FX-8350 can't even beat Intel's previous-generation Core i5-2550K in single-threaded applications, yet it comes very close to matching the much more expensive ($330), current-gen Core i7-3770K in multi-threaded workloads. Vishera's weak point, however, is power efficiency. On average, the FX-8350 uses about 50 W more than the i7-3770K. Intel aside, the Piledriver-based FX-8350 is a whole lot better than last year's Bulldozer-based FX-8150, which debuted at $235. While some of this has to do with performance improvements, the fact that AMD is asking $40 less this time around certainly doesn't hurt either. At under $200, AMD finally gives the enthusiast builder something to think about, albeit on the low end." Reviews are available at plenty of other hardware sites, too. Pick your favorite: PC Perspective, Tech Report, Extreme Tech, Hot Hardware, AnandTech, and [H]ard|OCP.

How about idle?? (5, Interesting)

Anonymous Coward | about 2 years ago | (#41743569)

90+% of my CPU is idle time.

How much power does the new chip use at idle and how does that compare to Intel?

50W at the top end means about $25/yr if I were running it 24/7. But since a typical desktop is mostly idle, what is the power difference there??

And yes, I don't care about single-thread performance as much as I care about multithread performance. Single-thread performance has been good enough for the desktop for almost a decade, and the only CPU-intensive task I do is running those pesky `make -j X` commands. No, not emerging world or silly things like that ;)
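
(As a rough sanity check of that $25/yr figure -- a minimal Python sketch, assuming an electricity price of about $0.06/kWh, which is roughly what the estimate implies; at a more typical $0.12/kWh the figure about doubles:)

    # Rough annual cost of an extra 50 W running 24/7 (illustrative assumptions).
    extra_watts = 50
    hours_per_year = 24 * 365              # 8760 hours
    price_per_kwh = 0.06                   # assumed electricity price, $/kWh
    extra_kwh = extra_watts * hours_per_year / 1000
    print(f"{extra_kwh:.0f} kWh/yr -> ${extra_kwh * price_per_kwh:.2f}/yr")
    # 438 kWh/yr -> $26.28/yr; a mostly idle desktop narrows the gap considerably.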

Re:How about idle?? (5, Insightful)

Animal Farm Pig (1600047) | about 2 years ago | (#41743699)

I agree about multithreaded performance being important thing moving forward.

Regarding power consumption, anandtech review [anandtech.com] puts total system power consumption for Vishera tested at 12-13W more than Ivy Bridge. Scroll to bottom of page for chart. Bar and line graphs at top of page are misleading-- they put x axis at 50W, not 0W.

If you are concerned about power consumption, find 100W lightbulb in your house. Replace with CFL. You will have greater energy saving.

Re:How about idle?? (2, Informative)

ifiwereasculptor (1870574) | about 2 years ago | (#41743845)

So true. AMD isn't competitive energy-wise in any way anymore, but the desktop is probably the only place where it doesn't matter. When you're thinking mobile, saving energy is a priority. On huge server farms, little relative gains can mean a tangible difference in absolute numbers. But on desktops, the difference is about a light bulb at load, and hardly anything when idle. And with PSUs under 350W being increasingly harder to find and an FX-8350-based system only gobbling about 200W at its most intensive, it's probably "low enough" power consumption for most people.

Re:How about idle?? (0)

Anonymous Coward | about 2 years ago | (#41744045)

You know they just labelled all the 350w power supplies 500w, right?

And that they don't draw that much at idle?

Re:How about idle?? (3, Insightful)

Kjella (173770) | about 2 years ago | (#41744097)

I think the biggest factor for your home desktop is noise - it takes a lot more airflow to remove 125W of heat than 77W of heat. In Anandtech's tests he actually measures 195W versus 120W total system power consumption. Sure it might not matter much if you plan to put a noisy 200W+ graphics card or two in it, but for non-gamer use I'd say that's pretty significant.

Re:How about idle?? (1)

ifiwereasculptor (1870574) | about 2 years ago | (#41744337)

True. Especially for hotter cities. Here, my undervolted, underclocked 95W Athlon II already manages to annoy me with its 3500+RPM fan (it goes to about 5000RPM at full load and clock; it's madness). Maybe AMD should start putting coolers with 120mm fans on their boxed processors if they plan on selling such behemoths; that would provide more airflow and less noise at the same time. Fitting that on a motherboard would likely be a problem, though. Water cooling would be good, but it's much too expensive to make sense.

Re:How about idle?? (2)

negRo_slim (636783) | about 2 years ago | (#41744415)

Get an aftermarket cooler [bing.com] if CPU fan noise is really a concern.

Re:How about idle?? (1)

Anonymous Coward | about 2 years ago | (#41744765)

A Bing link, a Hotmail email address, and you misspelled Zalman?

Are you sure you're on the right site?

Re:How about idle?? (1)

PopeRatzo (965947) | about 2 years ago | (#41744555)

My undervolted, underclocked 95W Athlon II already manages to annoy me with its 3500+RPM fan

Get a better fan.

Re:How about idle?? (4, Informative)

negRo_slim (636783) | about 2 years ago | (#41744395)

I think the biggest factor for your home desktop is noise - it takes a lot more airflow to remove 125W of heat than 77W of heat.

Larger fans with slower rotational speed.

Re:How about idle?? (1)

amorsen (7485) | about 2 years ago | (#41744013)

If you are concerned about power consumption, find 100W lightbulb in your house. Replace with CFL. You will have greater energy saving.

Surely you already did that 3 years ago, so now you're preparing for the switch to LED?

Re:How about idle?? (3, Insightful)

Lumpy (12016) | about 2 years ago | (#41744093)

"Surely you already did that 3 years ago, so now you're preparing for the switch to LED?"

Why? So I can get less light output for the same watts used, but instead of spending $8.95 per bulb I get to spend $39.99?

LED is a joke for home lighting; only fools are buying it right now. CFL is still way more efficient.

Re:How about idle?? (0)

Anonymous Coward | about 2 years ago | (#41744155)

Fools and people with powerline networking. Some CFLs cause interference with powerline networks, whereas LEDs are fine. I've also had several CFLs release nasty substances while they run, turning my light fixture yellow and coating the bulb, along with burn marks around the base. Before you ask, they were GE bulbs.

I might be a fool to use LED, but I feel safer doing so. If you leave lights on for long periods, some CFLs aren't that safe.

Re:How about idle?? (0)

Anonymous Coward | about 2 years ago | (#41744205)

Fools and people with powerline networking. Some CFLs cause interference with powerline networks, whereas LEDs are fine. I've also had several CFLs release nasty substances while they run, turning my light fixture yellow and coating the bulb, along with burn marks around the base. Before you ask, they were GE bulbs.

I might be a fool to use LED, but I feel safer doing so. If you leave lights on for long periods, some CFLs aren't that safe.

Or, in the absence of a meddling government, you could just use a 50 cent incandescent with none of those problems.

Re:How about idle?? (1)

h4rr4r (612664) | about 2 years ago | (#41744243)

And spend more money than the cost of the bulb to keep it lit, and I get to replace them all once a year.

I loved CFLs, but LEDs are all I buy now. Why bother changing lightbulbs?

Re:How about idle?? (1)

cyberjock1980 (1131059) | about 2 years ago | (#41744499)

Same here. Used to buy CFLs, but switched to LEDs because they never burn out. Now I have a single circuit in my whole house with a single UPS (300W) for lighting. If I lose power, all the lights stay on for about 10 minutes. Very useful at times. I have yet to have an LED fail on me, and my whole house has been LED since 2009.

Do I think I'll make my money back someday? Maybe, maybe not. But I'm glad to have given money to a technology that needs to become more common and get cheaper, so why not invest in it? If everyone is buying LEDs for their homes in 5 years then I'm glad to have done my very meager but useful part to help make LEDs a daily reality.

Re:How about idle?? (2, Insightful)

afidel (530433) | about 2 years ago | (#41744721)

LEDs running on AC will fail for the same reason most CFLs fail: the ballast.

Re:How about idle?? (1)

Bryansix (761547) | about 2 years ago | (#41744457)

And then I would be charged at the highest tier for my electricity (I live in California, where electricity is expensive), and that would be compounded by the fact that incandescents produce massive amounts of heat, which would raise my electricity bill even further for cooling. Again, I live in California, where I need to use A/C 9 months out of the year. I'm glad I don't use incandescents.

Re:How about idle?? (1)

hoboroadie (1726896) | about 2 years ago | (#41744565)

We only use incandescent heaters for illumination in the studio. Stratocaster pickups hum plenty already.

Re:How about idle?? (1)

amorsen (7485) | about 2 years ago | (#41744181)

Why? so I can get less light output for the same watts used but instead of spending $8.95 per bulb I get to spend $39.99?

LED can come much closer to proper full-spectrum light than CFL ever will. CFL is just a stopgap technology we have to deal with until LED gets there.

Also, it is possible to make LED spotlights to handle that strange modern trend of building lots of spotlights into ceilings. CFL cannot do that.

Re:How about idle?? (1)

zenith1111 (1465261) | about 2 years ago | (#41744779)

That may be true, but all the good quality fluorescent bulbs I've bought lately have a color quality at least as good as the good quality LED ones and with very similar power efficiency.

I've found a place where LEDs are superior, though: we have a corridor with a motion sensor and 5 bulbs, and with dozens of daily starts, some CFLs don't even last 6 months there; others quickly get very dark near the heating filament. I replaced them with cheap LED bulbs and they are still going strong after one year.

Re:How about idle?? (1)

h4rr4r (612664) | about 2 years ago | (#41744267)

You do realize they last like 20 years, right?

Please explain why you think someone would be foolish to buy a longer lived and nearly as efficient device?

Re:How about idle?? (1)

squiggleslash (241428) | about 2 years ago | (#41744363)

I still have CFLs that I bought in 1998 working properly - that's when I moved to the US. The few CFLs I've had to replace have generally done so for external reasons (such as a bizarre one I bought that started to fall out of its base. I assume it wasn't glued together properly or something. Bizarre both in the sense that it did it, and that it still actually seemed to work OK.)

I'd be very surprised, at current prices, if LEDs actually represent value for money against CFLs if the quality and capabilities of both types of "bulb" are close enough to not be a consideration for the task at hand (like lighting a living room.)

Re:How about idle?? (2, Insightful)

LordLimecat (1103839) | about 2 years ago | (#41744569)

You live in an apartment and don't plan to be there for 20 years?

I imagine for a lot of people, dumping $40 into each light socket is a losing proposition, and a winner for the landlord (who I am sure would greatly appreciate the gift).

Re:How about idle?? (0)

Anonymous Coward | about 2 years ago | (#41744327)

I use nothing but LED lights. I replaced them all when I moved in, they've run for 16 hours a day for 2 years, and they're the same brightness as when I bought them. They beat CFLs hands down, and completely kill incandescent lights.

Re:How about idle?? (1)

Bryansix (761547) | about 2 years ago | (#41744423)

I'm fairly certain the light output per watt is basically the same for LEDs compared to CFLs. In fact, I looked it up http://cleantechnica.com/2011/09/01/led-vs-cfl-which-light-bulb-is-more-efficient/ [cleantechnica.com] and LEDs actually produce MORE lumens per watt.

Now, let's expand and point out something this article got wrong. They say you would only have to replace a CFL three times over the course of an LED's lifespan. In my experience, this is at least an order of magnitude on the low side. I have dimmers in all my sockets and the dimmable CFL lights will sometimes fail within a year to a year and a half. In that case, I'd be replacing 15-20 CFL lights per LED light I put in. Never mind that CFL has a serious issue with diminishing light quality over its lifespan. Yes, there are CFLs without these problems, and they are 50-100% more expensive than the normal ones.
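
(To put rough numbers on that -- a small illustrative calculation; the bulb prices come from upthread ($8.95 CFL, $39.99 LED) and the failure count is the parent's low-end estimate for dimmer sockets:)

    # Bulb cost over one LED lifetime on a dimmer socket (illustrative figures).
    led_price = 39.99        # $ per LED bulb (price quoted upthread)
    cfl_price = 8.95         # $ per CFL bulb (price quoted upthread)
    cfl_replacements = 15    # CFLs burned through per LED lifetime (parent's low estimate)
    print(f"CFLs: ${cfl_replacements * cfl_price:.2f} vs LED: ${led_price:.2f}")
    # CFLs: $134.25 vs LED: $39.99 -- on dimmer circuits the LED wins on bulb cost
    # alone, before counting any electricity difference.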

Re:How about idle?? (4, Insightful)

war4peace (1628283) | about 2 years ago | (#41744077)

I play games maybe 1h 30m a day on average. My 5-year-old dual-core E6750 overclocked to 3.2 GHz handles most of them gracefully, but there are some new releases that require more processing power. However, in choosing a new platform, I'm mostly looking at TDP -- not from a consumption perspective, but for heat dissipation. I hate having to use a noisy cooler.
My current CPU has a TDP of 65W and a Scythe Ninja 1 as its cooler, and the fan usually stays at 0% when the CPU is idling. While gaming, I can't tell whether it makes noise, because my GPU cooling system makes enough noise to cover it. And I'd like to keep it that way when I pick my new CPU.

You're saying the graphs are misleading. No, they're not, if one has half a brain. I'm now looking at the hard numbers, and the power consumption difference is about 100W. The i5 3570K draws about 98W, and Zambezi and Vishera (who the fuck names these things?) draw around 200W. If you stack TWO i5s on top of each other, they barely reach ONE AMD CPU's power consumption. Thanks, but things DO look bad for AMD. I'll just have to pass.

Re:How about idle?? (0)

Anonymous Coward | about 2 years ago | (#41744225)

total system power consumption for Vishera tested at 12-13W more than Ivy Bridge. Scroll to bottom of page for chart. Bar and line graphs at top of page are misleading-- they put x axis at 50W, not 0W.

Umm.. 12W difference is for an IDLE system. At load the difference is clearly closer to 80-100W. Maybe for a desktop this doesn't matter. But given the decline of desktops and consumers moving to laptops (and even more mobile devices), these results are downright TERRIBLE for AMD.

Re:How about idle?? (1)

Animal Farm Pig (1600047) | about 2 years ago | (#41744463)

Umm.. 12W difference is for an IDLE system. At load the difference is clearly closer to 80-100W.

No shit, Sherlock. Look at comment subject line.

Maybe for a desktop this doesn't matter. But given the decline of desktops and consumers moving to laptops (and even more mobile devices), these results are downright TERRIBLE for AMD.

If you think desktop chips (AMD or Intel) are used in laptops, you are an idiot.

Re:How about idle?? (0)

Anonymous Coward | about 2 years ago | (#41744301)

I have to agree. AMD needs to take what they have in a 135-watt part and make it 80 watts or less to be competitive; otherwise all these parts are only going to end up in corporate desktops, where they all go to sleep after work, and places where electricity is cheap (e.g. the Midwest).

These aren't going to end up in data center machines due to being way too hot/power-hungry. Even the APU A series are too hot to be used effectively in a desktop or laptop, and only become a 'win' in a few very specific use cases (e.g. netbooks/ultrabooks, media centers) where Intel's video is a complete joke.

Re:Wattage costs (1)

hoboroadie (1726896) | about 2 years ago | (#41744487)

More heat equals louder fans, and more dust on the vents.
The limousine tax imposed by our ISP overlords is a couple orders of magnitude more painful.

Re:How about idle?? (0)

Anonymous Coward | about 2 years ago | (#41744603)

Moron.

Re:How about idle?? (4, Informative)

ifiwereasculptor (1870574) | about 2 years ago | (#41743703)

Idle power seems pretty competitive with Intel's Core offerings. Anand found little difference and attributed it to their selection of a power-hungry motherboard.

Re:How about idle?? (2)

viperidaenz (2515578) | about 2 years ago | (#41743975)

It's not just 50W at full load; there is another graph showing that for the same workload the Intel system used 220.8 watt-hours while the AMD took 352.5 watt-hours. Not only did the Intel system use less power under load, it finished the work quicker.
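
(A quick way to read those numbers -- energy per completed job rather than instantaneous draw; a minimal sketch using only the watt-hour figures quoted above:)

    # Energy to finish the same workload (watt-hour figures from the review).
    intel_wh = 220.8
    amd_wh = 352.5
    print(f"AMD used {amd_wh / intel_wh:.2f}x the energy for the same job")
    # ~1.60x: the AMD chip draws more power *and* runs longer, so the
    # energy-per-task gap is bigger than the ~50 W load-power gap suggests.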

Re:How about idle?? (0)

jgrahn (181062) | about 2 years ago | (#41744673)

And yes, I don't care about single-thread performance as much as I care about multithread performance. Single-thread performance has been good enough for the desktop for almost a decade, and the only CPU-intensive task I do is running those pesky `make -j X` commands.

You don't mean multithread, exactly. Make isn't multithreaded; it uses real processes.

So they are not dead (1)

Krneki (1192201) | about 2 years ago | (#41743603)

Finally, something positive from AMD. While I'm not interested in this CPU since I'm a gamer, the lack of competition has kept Intel CPU prices stagnant. This new AMD CPU seems to have some strength in multi-threaded applications. But then again, a 2-year-old overclocked Intel i5 eats any game you throw at it, while the same can't be said for a similarly priced video card. So Intel is not all that evil (watching AMD's marketing troll campaign for Bulldozer, on the other hand, made me hate AMD).

Re:So they are not dead (1)

ifiwereasculptor (1870574) | about 2 years ago | (#41743759)

I'm a gamer too, and I'm actually interested, mainly because the FX-4300 now seems to be a fierce competitor to Intel's i3 while costing quite a bit less. The FX-8xxx still sucks, but this is a major improvement for AMD in the mainstream segment. They were losing to cheaper Pentiums with the FX-4100; it was embarrassing.

Re:So they are not dead (0)

Anonymous Coward | about 2 years ago | (#41743799)

Not dead, but if they're having to sell a chip with twice as many transistors as an i7 for two thirds of the price, they must be spraying blood all over the room.

Re:So they are not dead (1)

Synerg1y (2169962) | about 2 years ago | (#41743839)

I miss the FX series... we may never have gotten the i7 if not for that war between Intel and AMD, but AMD never answered...

Re:So they are not dead (1)

Kjella (173770) | about 2 years ago | (#41743947)

Not dead, but if they're having to sell a chip with twice as many transistors as an i7 for two thirds of the price, they must be spraying blood all over the room.

Twice as many transistors? No, it's 1.2B versus Ivy Bridge's 1.4B -- but the die size is almost double, 315mm^2 to 160mm^2. That is both because Intel is on a 22nm process and because they have higher transistor density; Intel's 32nm Sandy Bridge has 1.16B transistors but is only 216mm^2. I guess die size is one of the many things AMD didn't have the time or resources to optimize for, and yes, they're going to hurt selling these chips for $200 and down.
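
(The density point is easy to check -- a minimal sketch using only the transistor counts and die areas quoted above:)

    # Transistor density from the counts and die areas quoted above.
    chips = {
        "Vishera":      (1.2e9, 315),    # transistors, die area in mm^2
        "Ivy Bridge":   (1.4e9, 160),
        "Sandy Bridge": (1.16e9, 216),
    }
    for name, (transistors, area_mm2) in chips.items():
        print(f"{name}: {transistors / area_mm2 / 1e6:.1f} M transistors/mm^2")
    # Vishera ~3.8, Ivy Bridge ~8.8, Sandy Bridge ~5.4 -- Intel is ~40% denser on
    # the same 32nm node and more than twice as dense at 22nm, hence the huge die.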

Re:So they are not dead (2)

Synerg1y (2169962) | about 2 years ago | (#41743817)

You do realize that most of your processing occurs via your gpu in a game right...?

Re:So they are not dead (0)

Anonymous Coward | about 2 years ago | (#41743957)

You do realize that most of your processing occurs via your gpu in a game right...?

And?

The majority of modern games are quite bottlenecked by the CPU. Thank shit console ports and/or general shit programming, combined with the industry's complete fucking lack of understanding of how to properly use multiple cores.

Re:So they are not dead (1)

Synerg1y (2169962) | about 2 years ago | (#41744177)

Something like PhysX rendering gets kicked over to your CPU if your GPU can't do it... which at this point in time puts it in the bottom bracket of GPUs. But that's beside the point: even with crappy coding, CPUs are leagues and leagues ahead of any bad practices programmers may deploy. Read below for why multi-core.

Re:So they are not dead (0)

Anonymous Coward | about 2 years ago | (#41744341)

Only if you have a shit CPU, or you're playing a really bad port. Most games are designed around hardware specs that are six years old, so as to be compatible with consoles. Any relatively modern CPU, even my 1090T, can slam anything currently on the shelf at 16xAA with even a mediocre video card.

If you're buying the high-end i7s "because gaming hurr", then you're wasting your money -- your CPU isn't even breaking a sweat. Game engines have simply not caught up with even the last generation of CPUs/GPUs.

Re:So they are not dead (0)

Anonymous Coward | about 2 years ago | (#41744403)

Not only games. Google Chrome, Firefox, and others don't make use of threads in efficient ways, and these are the programs people actively use.

Where is our multi-threaded Javascript? Oh, Firefox doesn't do it: https://bugzilla.mozilla.org/show_bug.cgi?id=392073 , and neither does Chrome: http://www.chromium.org/developers/design-documents/multi-process-architecture

Why are we still using 1970s "multi-process pre-fork" models? They waste RAM, waste CPU, and waste power.

Re:So they are not dead (0)

Anonymous Coward | about 2 years ago | (#41743959)

Depends on the game.

Re:So they are not dead (1)

baka_toroi (1194359) | about 2 years ago | (#41744001)

No, that depends on the game. AI processing isn't done on a GPU. Of course, it's almost always better to improve your GPU instead of your CPU.

Re:So they are not dead (1)

Synerg1y (2169962) | about 2 years ago | (#41744147)

Most AI / game processing can be handled on a single-core 2 GHz processor, so why do we need multi-core for gaming? ... ... ...
Windows background processes -- yep, it's as lame as that.

Re:So they are not dead (0)

Anonymous Coward | about 2 years ago | (#41744217)

There are games coming to market now that are *very* CPU-dependent. A current example would be the Planetside 2 beta (which is also very dependent on the memory bus).

There's likely to be more coming along with the next 'next-gen' consoles to boot -- there's normally a period of a year or two after a new family of consoles launches where system requirements for the PC ports surge.

Re:So they are not dead (1)

0123456 (636235) | about 2 years ago | (#41744523)

Guild Wars 2 seems to be CPU-limited as well; no graphics setting change other than supersampling makes more than 1fps difference on my system.

Re:So they are not dead (0)

Anonymous Coward | about 2 years ago | (#41744339)

Too bad my 2.16GHz i7 can't feed data to my GPU fast enough to push my 6950 past 10% load in many games at 1080p ultra settings. Single-threaded performance still matters until deferred rendering becomes standard.

Re:So they are not dead (1)

Synerg1y (2169962) | about 2 years ago | (#41744611)

That's probably your GPU not being challenged; GPUs have exceeded game requirements in recent years by quite a wide margin (also, you failed to mention your CPU load)... There does seem to be a bunch of confusion on the issue, so here's a great link on the subject: http://www.thinkdigit.com/forum/hardware-q/154674-what-role-cpu-gpu-relation-gaming.html [thinkdigit.com]

Now, going back to my previous statement: unless you run a P4, you're probably fine on the CPU, and the GPU is what does the heavy lifting.

Re:So they are not dead (1)

Z00L00K (682162) | about 2 years ago | (#41743905)

It certainly will make things more interesting -- and an increased number of cores is the way to go today. Most applications and operating systems will benefit from multiple cores, even though some applications may benefit from additional tuning.

However, most bread-and-butter applications today run well on even the cheap processors on the market; it's only when you start to look at things like gaming and advanced processing that you will really benefit from a faster processor. Most computers will be a lot faster when you replace the ordinary hard disk with an SSD. And if you really want some computing power, you should look at using GPUs for computing. Sooner or later it's time to get rid of the x86 architecture and look at new ways to raise performance.

Re:So they are not dead (0)

Anonymous Coward | about 2 years ago | (#41744247)

My rig for the last year has been built on an AMD X6 1090T Thuban processor and is running great. I do a fair amount of gaming; recently I've been playing Skyrim and Borderlands 2. I don't quite understand what people have against AMD for gaming. All of these games run great, I don't have any issues with lag, and the system definitely does not feel slow. When playing multiplayer with my friends (who run a mix of Intel i5/i7 and AMD Thuban/Bulldozer systems), I am always one of the first people to load into a map.

My question is: can anyone give any reproducible demonstrations of what I am missing out on by using an AMD processor vs. an Intel when it comes to gaming? Sure, benchmarks can show that an i7 can have X number more points than the AMD, but what does that translate to in reality? Can anyone give an example where, if I put my rig up beside someone with an equivalent Intel system, I would get clearly "smoked" outside of a synthetic benchmark?

Re:So they are not dead (0)

Anonymous Coward | about 2 years ago | (#41744613)

No, they cannot, because there isn't a game on the market right now that can really stress even the 1090T, never mind the current-gen i5 or i7 series. Even if there were, you'd have a bottleneck at the PCIe bus, so what does it matter?

I also have the 1090T, and even running Skyrim with high-texture mods and the LBA patch, at maximum settings plus driver-level AA enhancements, my CPU never inched above 50% across all cores. My GPU was sweating bullets, sure, but even with active monitors running on my second screen, I never had a bit of lag. The whole rivalry BS is just that... games quite simply haven't caught up to the hardware, but neither Intel nor AMD wants you to figure that out -- they just want you to buy "the next big thing!"

Lowers barrier to entry (5, Interesting)

Gothmolly (148874) | about 2 years ago | (#41743633)

I put together an 8-way, 32GB machine (no local storage) for $400 to play with ESXi. Courtesy of the freebie VMware download and a reasonably priced 8-way machine, I can get into some pretty serious VM work without spending a ton of dough. I don't need massive performance for a test lab.

Re:Lowers barrier to entry (2, Interesting)

h4rr4r (612664) | about 2 years ago | (#41744289)

Get an SSD.
Local storage is a must for performance. iSCSI cannot hold a candle to local SSDs. In a lab you won't need to share the storage with multiple machines anyway.

Re:Lowers barrier to entry (2, Informative)

Anonymous Coward | about 2 years ago | (#41744557)

Local storage is a must for performance

This is hyperbole. If what you're doing is mostly CPU- or memory-intensive and requires very little disk activity, having fast local storage isn't going to help much, if at all.

Besides, apparently it isn't a must for the grandparent as he stated he doesn't "need massive performance for a test lab."

Don't get me wrong, using an SSD to provide storage for a handful of VMs is a great idea (massive read/write IOPs), but it isn't necessary.

Re:Lowers barrier to entry (1)

LordLimecat (1103839) | about 2 years ago | (#41744751)

Without iSCSI you can't really use shared storage, which means 90% of the features of ESXi can't be used. Kind of dampens the whole "for a lab" thing.

SSDs ARE quite sweet for VMs. I'd recommend setting up a VM that serves out a local SSD as iSCSI over an internal ESXi storage network -- that's actually how things were done during my VCP training. I believe they were using FreeNAS (MIGHT have been OpenFiler) to serve up iSCSI and NFS targets. It's a little buggy but sufficient for a lab.

Re:Lowers barrier to entry (4, Informative)

rrohbeck (944847) | about 2 years ago | (#41744307)

Same here. I built a Bulldozer machine for compiling projects in VMs last year and it works very nicely. If Intel had had a CPU with ECC memory and hardware virtualization support at a reasonable price I would probably have bought it, but I would have needed at least a $500 Xeon for that, with a more expensive motherboard, and I wouldn't be able to overclock it. For the same performance I have now I would probably have needed a $1k CPU.

Re:Lowers barrier to entry (-1)

Anonymous Coward | about 2 years ago | (#41744489)

wtf is an "8way" you fscking moron.

Re:Lowers barrier to entry (1)

Fackamato (913248) | about 2 years ago | (#41744561)

8-way would mean 8 sockets on the motherboard, like http://www.supermicro.com/products/system/5U/5086/SYS-5086B-TRF.cfm [supermicro.com] .

Re:Lowers barrier to entry (0)

Anonymous Coward | about 2 years ago | (#41744691)

8-way would mean 8 sockets on the motherboard

This is incorrect. N-way can apply to the number of sockets, cores, or threads in a system.

An 8-core AMD FX processor is considered to be an 8-way processor.
A 4-core Intel i7 processor could be considered either a 4-way or 8-way system, depending on whether you're looking at physical cores or total threads.
A SPARC T4 could be either an 8-way or 64-way system, once again depending on whether you're looking at cores or threads.
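
(A quick way to see the three counts on a Linux box -- a hedged sketch, assuming an x86 machine whose /proc/cpuinfo exposes the usual fields:)

    # Sockets, cores, and hardware threads as Linux reports them.
    import os
    import re

    with open("/proc/cpuinfo") as f:
        info = f.read()
    sockets = len(set(re.findall(r"^physical id\s*:\s*(\d+)", info, re.M)))
    cores_per_socket = int(re.search(r"^cpu cores\s*:\s*(\d+)", info, re.M).group(1))
    threads = os.cpu_count()
    print(f"{sockets} socket(s), {sockets * cores_per_socket} cores, {threads} threads")
    # A hyperthreaded quad-core i7 prints: 1 socket(s), 4 cores, 8 threads.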

Re:Lowers barrier to entry (1)

Fackamato (913248) | about 2 years ago | (#41744739)

I've always looked at it as number of CPUs (physical chips), number of cores (total, on all CPUs), number of threads (total cores, with HT or not etc).

-way was a socket thing for me, but perhaps it's not.

Here's the problem... (4, Informative)

CajunArson (465943) | about 2 years ago | (#41743777)

These chips "excel" at big, heavily threaded workloads. Which is to say that they can beat similarly priced Intel chips that are simply up-clocked laptop parts. Move up to a hyperthreaded 3770K (still a laptop part) and Vishera usually loses. Overclock that 3770K to be on par with the Vishera clocks while still using massively less power than Vishera, and the 3770K wins practically every benchmark.

Unfortunately, if you *really* care about those workloads (as in money is on the line) then Intel has the LGA-2011 parts that are in a completely different universe than Vishera, including using less total power and being much, much better at performance per watt to boot. I'm not even talking about the $1000 chips either; I'm talking about the sub-$300 3820 that wins nearly every multi-threaded benchmark, not to mention the $500 3930K that wins every one by a wide margin.

So if you want to play games (which is what 90% of people on Slashdot really care about): Intel is price-competitive with AMD and you'll have a lower-power system to boot. If you *really* care about heavily multithreaded workloads: Intel is price-competitive because the initial purchase price turns into a rounding error compared to the potential performance upside and long-term power savings you get with Intel.

Vishera is definitely better than Bulldozer, but AMD still has a long, long way to go in this space.

Re:Here's the problem... (1)

Synerg1y (2169962) | about 2 years ago | (#41743873)

GPUs are what determine game benchmark performance...

Re:Here's the problem... (2)

CajunArson (465943) | about 2 years ago | (#41744019)

GPUs are very important for games, which is why an Ivy Bridge with an on-die PCIe 3.0 controller is going to do better at keeping next-generation GPUs running full-tilt than the PCIe 2.0 controller on the Northbridge of an AM3+ motherboard.

Re:Here's the problem... (2, Insightful)

Anonymous Coward | about 2 years ago | (#41744027)

AMD has never been about pure performance. It's all bang for the buck. You can: buy an AMD system for much less than an Intel one, get a motherboard that has a lot more connectivity than the equivalent Intel board for less, AND get a true 2 x PCIe x16 (while Intel forces you to get an LGA2011 board, which is much costlier). The tradeoff? You'll get a machine with a CPU that performs maybe 10-20% worse in benchmarks than the Intel equivalent. But seriously, who cares in 2012? Most games are GPU-starved, so you're much better off spending the $50-70 that an Intel would have cost you on a better GPU. And most day-to-day operations can be sped up by getting a fast SSD and more RAM. Get used to it: the CPU is simply not the main bottleneck anymore.

HOWEVER, if you're an overclocker looking to shatter world benchmark records or a mad scientist who needs workstation-like horsepower AND you have the cash to spend on it, I'd say go for Intel. It'll be worth every penny.

Re:Here's the problem... (4, Insightful)

Anonymous Coward | about 2 years ago | (#41744049)

You are missing an important point when it comes to "money is on the line". No one in their right mind would use a desktop processor from Intel for anything critical. Why? All non-Xeon processors have been crippled to not support ECC memory. If money really is on the line, there is just no way that is acceptable.

AMD, on the other hand, does not cripple their CPUs at all. The whole Vishera lineup supports ECC memory, as did Bulldozer.
The Xeon equivalent of the 3820 is in a completely different price league.

So please, when you compare price and use cases make sure you fully understand which processors are the actual alternatives.

Re:Here's the problem... (1)

CajunArson (465943) | about 2 years ago | (#41744109)

Sure Vishera theoretically supports ECC memory, but you need a motherboard that takes ECC memory to tango... and those are a rare beast in the consumer space, meaning you really are looking at Opteron socketed motherboards and Opteron chips (which are nowhere near as cheap as Vishera). So there is no free lunch.

Re:Here's the problem... (0)

Anonymous Coward | about 2 years ago | (#41744261)

Here in Norway, in the most popular consumer-oriented web stores there are more ECC-enabled motherboards for AM3+ than not. If this is not true for your location, I send my sincere condolences.

Re:Here's the problem... (1)

Kjella (173770) | about 2 years ago | (#41744421)

You are missing an important point when it comes to "money is on the line". No one in their right mind would use a desktop processor from Intel for anything critical. Why? All non-Xeon processors have been crippled to not support ECC memory. If money really is on the line, there is just no way that is acceptable.

Well, if it's critical then I wouldn't want to use a desktop processor or system in any case; you should get a proper Opteron/Xeon server with all the validation, redundancy, and managed environment. As for the general employee, I've yet to see anyone working on a "workstation"-class machine. Unless they need the horsepower for CAD or something like that, my impression is that 99.9%, from the receptionist to the CEO, use regular non-ECC desktops/laptops to do their work, and I'm pretty sure that means money is on the line. Of course, it could be that we live in an insane world, but I think the risk of someone making a disastrous typo is equal to or greater than the risk of a bit flip.

Re:Here's the problem... (1)

Enderandrew (866215) | about 2 years ago | (#41744133)

When you talk about Intel being price competitive, it depends.

AMD clearly wins on budget gaming systems where the processors and motherboards are much cheaper, but Intel has the fastest high end systems out there right now.

Just a couple months ago I priced two builds with similar benchmark numbers on NewEgg, and the AMD budget gaming rig was around $800, and the Intel equivalent was around $1100.

Re:Here's the problem... (1)

rrohbeck (944847) | about 2 years ago | (#41744367)

Look at the Phoronix benchmarks. Vishera beats the 3770K in many benchmarks as long as you're running multithreaded code.
http://www.phoronix.com/scan.php?page=article&item=amd_fx8350_visherabdver2&num=1 [phoronix.com]
And do the math on power savings. Unless you're a folder, you'll need several years to recoup the additional cost of an Intel CPU.
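
(The payback arithmetic, as a rough sketch -- the price gap uses the summary's list prices ($330 i7-3770K vs. $195 FX-8350); the load-power gap, daily load hours, and electricity price are assumptions for illustration only:)

    # Years to recoup Intel's price premium through power savings (illustrative).
    price_gap = 330 - 195          # $ (summary list prices: i7-3770K vs. FX-8350)
    power_gap_w = 80               # assumed extra AMD draw under load, watts
    load_hours_per_day = 4         # assumed hours of heavy load per day
    price_per_kwh = 0.12           # assumed electricity price, $/kWh
    yearly_savings = power_gap_w * load_hours_per_day * 365 / 1000 * price_per_kwh
    print(f"${yearly_savings:.0f}/yr saved -> {price_gap / yearly_savings:.1f} years to break even")
    # ~$14/yr -> ~9.6 years with these assumptions; only heavy near-24/7 load
    # (folding, render farms) shortens that to something reasonable.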

Re:Here's the problem... (1)

CajunArson (465943) | about 2 years ago | (#41744549)

You have a very interesting definition of "many". I would say that the non-overclocked 8350 beats a 3770K in "a few" multi-threaded benchmarks, usually by a small margin, while still losing by much much larger margins in many other multi-threaded benchmarks (in fact the Phoronix article barely has any lightly-threaded benchmarks in the mix at all).

The Vishera OC'd to 4.6 GHz wins a few more benchmarks, but, as I said above, all you have to do is apply an easy OC to Ivy Bridge to get up to 4 GHz and Vishera's lead in the few benchmarks it excels at quickly disappears.

Shared FPU? (1)

Anonymous Coward | about 2 years ago | (#41743941)

Does it still share a floating point unit between two normal cores? For HPC users, that was the worst possible thing to hear about AMD's Bulldozer product. Almost everything in HPC needs good floating point performance and AMD didn't deliver, thus if you buy AMD you have to buy twice the number of cores just to get the FPU count up.

tl;dr version (5, Informative)

gman003 (1693318) | about 2 years ago | (#41743983)

New AMD processor, higher clocks than the last one but no massive improvements performance-wise. Still rocks at multi-threaded, integer-only workloads, still sucks at single-threaded or floating-point performance, still uses a huge amount of power. AMD is giving up on the high end: their top-end parts are priced against the i5 series, not the i7. Since Intel's overpricing stuff, they're still roughly competitive. Might be good for server stuff, maybe office desktops if they can get the power down, but not looking good for gaming. Overall mood seems to be "AMD isn't dead yet, but they've given up on first place".

There. Now you don't need to read TFAs.

Re:tl;dr version (1)

theswimmingbird (1746180) | about 2 years ago | (#41744245)

I built a new rig back in March, went with the Phenom II 975 Black Edition. Haven't been disappointed, as it's still a good bit faster than the Core2 P8400 in my laptop. I'm still holding out hope AMD comes out with a better gaming processor, though, so I don't have to change motherboards when I do decide to upgrade...

Nice CPU, high power usage (4, Informative)

coder111 (912060) | about 2 years ago | (#41743999)

I've read through some of the reviews. It looks like a nice CPU, with power usage a bit too high for my taste.

And please take benchmark results with a pinch of salt -- most of them are compiled with the Intel compiler and will show lower results on AMD CPUs simply because the Intel compiler disables a lot of optimizations on AMD CPUs.

I don't know of any site that runs Java application server, MySQL/PostgreSQL, Python/Perl/Ruby, Apache/PHP, or GCC/LLVM benchmarks under Linux. Video transcoding or gaming on Windows is really skewed and nowhere near what I do with my machine.

--Coder

Re:Nice CPU, high power usage (2)

ifiwereasculptor (1870574) | about 2 years ago | (#41744087)

I think Phoronix took Vishera through some GCC tests. They have another article about GCC optimizations for the Bulldozer architecture in general (it seems to improve some workloads by quite a bit).

Phoronix is great but I want more (4, Interesting)

coder111 (912060) | about 2 years ago | (#41744425)

I read Phoronix a lot these days. I find more technical news there than on Slashdot.

However, their benchmarks are often flawed. For example, they did a Linux scheduler benchmark recently which measured throughput (average this or that) and not latency/interactivity (response times), which was totally useless. Well, OK, you can consider it a test checking for throughput regressions in interactivity-oriented schedulers, but it did not measure interactivity at all.

And regarding their Vishera benchmark, they measured most of their standard stuff: mostly scientific calculation, video/audio encoding, image processing, and rendering. I very rarely do any of this.

The developer-related benchmarks they had were Linux kernel compilation times (Vishera won), and you might count OpenSSL as well. They didn't do PostgreSQL, and they didn't benchmark different programming languages, application servers, office applications, or anything else that would really interest me. I wish someone would measure NetBeans/Eclipse and other IDE performance.

And anyway, did you notice that AMD usually does much better in Phoronix reviews than on AnandTech/Tom's Hardware/whatever sites? That's because Phoronix doesn't use the Intel compiler or Windows, so the results are much less skewed.

--Coder

athlon mp baybeeee (0)

Anonymous Coward | about 2 years ago | (#41744025)

I could love AMD again if they allowed Opteron overclocking.

8 cores + poor memory performance (0)

Anonymous Coward | about 2 years ago | (#41744051)

What were they thinking? The more cores, the more memory bandwidth you need.
This is what AMD gets for outsourcing its engineering. (look up articles from 2007)

If anyone has doubts or thinks this is trolling, go see the benchmarks for yourself.

http://www.overclock3d.net/reviews/cpu_mainboard/amd_vishera_fx8350_piledriver_review/3

For linux... (5, Insightful)

ak3ldama (554026) | about 2 years ago | (#41744105)

Here is a set of benchmarks more centered on the Linux world, from Phoronix [phoronix.com], and thus a little less prone to Intel compiler discrimination. The results seem more realistic: better, worse, and similar to an i7 depending on the work; still hard on power usage; low purchase price.

Perfect for the 99% (2)

EmagGeek (574360) | about 2 years ago | (#41744129)

The new Trinity, and now these FX procs, are perfect for "the 99%" -- that is to say, for what 99% of people do with their machines: surf the web, check email, maybe do some photo editing or piece together home movies.

They're cheap, reasonably fast, and support all the latest extensions and optimizations. Plus, even for enthusiast prosumers who want to screw around with things like virtualization, you can get into the IOMMU space cheaply with AMD, which is nice for a platform like ESXi that has a mature PCI passthrough implementation.

I think the Trinity A-10 does really hit the sweet spot for most consumers. It has reasonably fast processing and reasonably fast GPU in a ~$130 package. If you're putting a box together on the cheap, it becomes quite compelling.

Re:Perfect for the 99% (2)

FreonTrip (694097) | about 2 years ago | (#41744249)

Don't underestimate these things for video encoding or playing around with scientific computing. They're great for embarrassingly parallel computing problems, and the price is very good.

Re:Perfect for the 99% (1)

0123456 (636235) | about 2 years ago | (#41744451)

The new Trinity, and now these FX Procs, are perfect for "the 99%," that is to say for what 99% of people do with their machines: surf the web, check email, maybe do some photo editing or piecing together home movies.

I did all those things on a Pentium-4.

I agree about AMD's low-end CPUs, but why would you want an 8-core CPU to do any of those things? Does an email program really need eight cores these days?

Re:Perfect for the 99% (5, Interesting)

Billly Gates (198444) | about 2 years ago | (#41744699)

I am typing this on a Phenom II 6-core system. It is quiet, 45 watts, and at the time (2010) it was only 10-15% slower than a Core i5. What did I get that the Intel Core i5 didn't?

- My whole system including the graphics card was $599! With an Asus motherboard, by the way, and one of their extended-warranty boards.
- A non-crippled BIOS where I can run virtualization extensions (most Intel mobos turn this off except on Core i7s)
- 45 watts
- My ATI 5750 works well with the chipset
- The AM3 socket can work with multiple CPUs after BIOS updates.

What the Core i5 has:
- It is made by Intel
- It is 15% faster
- The cost of the CPU alone is 2x the price, and I can pretty much include a motherboard as well if you are talking about going up to Core i7s.

A Core i7 system costs $1200 at the store. A Core i5 gaming system similarly specced costs $850 and does not include virtualization support to run VMware or VirtualBox.

The FX systems... bleh. I am not a fan. But for what I do, AMD offered a quieter, cheaper system that could run Linux VMs and is easier to upgrade. To me, my graphics card and hard drive are the bottlenecks; I would rather save money on the CPU. I was so hoping AMD would use this to offer great graphics for tablets and notebooks :-(

Re:Perfect for the 99% (2)

JDG1980 (2438906) | about 2 years ago | (#41744651)

The new Trinity, and now these FX Procs, are perfect for "the 99%," that is to say for what 99% of people do with their machines: surf the web, check email, maybe do some photo editing or piecing together home movies.

If you're building a "good enough" system for a non-technical user, why in the world would you even consider a Vishera FX CPU? It's expensive, power-hungry, and has a high TDP. And it doesn't even have integrated graphics, so you'd have to add the expense of a discrete graphics card.

For an inexperienced user who doesn't need much processing power, a Bobcat board will get the job done for cheap, and if that isn't quite good enough, the Pentium G630T is very competitively priced and (with its 35W TDP) very efficient. For enthusiasts, especially gamers, the Sandy/Ivy Bridge "K" series CPUs offer better performance, better efficiency, and more overclocking headroom, for only a little more cash. (In fact, if you have a Micro Center in your area, you can score an i5-2500K for $160, or an i5-3570K for $190.)

What's the intended audience for Vishera – people who do a lot of x264 encoding and don't want to step up to a server CPU or a Sandy Bridge-E? I'm having a hard time seeing a substantial audience for this thing.

Perfect for parents PC (1)

hibble (1824054) | about 2 years ago | (#41744175)

CPUs tend not to be the bottleneck these days; it's either the hard drive or a slow internet connection. At work I priced up some AMD classroom PCs, and in bulk we could have given each PC a 128GB SSD (all user work is kept on a SAN and the system image is about 60-80GB once MSIs have been deployed) with the savings from going AMD. But as the boss had never used AMD and reviews keep going on about how Intel is faster, he went for a mix of i5 and i7 Intel-based PCs with little RAM so as to keep the cost down, and as a result startup times are no better than on the old Intel Core Duo PCs they replaced. To top it off, the boss now has my demo AMD setup as 'the fastest PC' -- admittedly that's only due to the SSD, not the CPU -- but I do get his i7. How do I find an excuse to put a large RAID of SSDs in it?

Re:Perfect for parents PC (2)

0123456 (636235) | about 2 years ago | (#41744431)

CPUs tend not to be the bottleneck these days; it's either the hard drive or a slow internet connection.

AMD fanboys have been saying that for years as AMD lagged behind Intel's CPU performance. But then they keep telling people they should buy AMD 8-core CPUs instead of Intel 4-core, when, if they really believed what they were saying, they would be telling people to buy the cheapest dual-core available, or an Atom.

Seriously, why would you possibly think an 8-core CPU was 'perfect' for your parents' PC, no matter who makes it? If they're not CPU-limited, buy the cheapest CPU you can get.

Re:Perfect for parents PC (2, Funny)

Animal Farm Pig (1600047) | about 2 years ago | (#41744563)

For non-technical parents and other users, I actually like lots of cores and lots of memory-- more so than 'power users.'

Have you ever seen some people boot up their machines? It will take 5 minutes because of the sheer amount of crap installed.

It will be shit like two different anti-virus suites (the first one's subscription expired, and they installed a new one without uninstalling the old). There will be an update checker for every conceivable thing that's been installed. There's the preloader software that loads whatever crappy application it is into memory at boot time so that the application starts faster when you click on its associated file type. Then, there are all the various helpful toolbars for Internet Explorer that "enhance your internet experience" and deliver dollars to your inbox. Of course, the user will probably go through two or more inkjet printer/scanner/copier all-in-one devices that require the manufacturer's 300 MB "printing experience suite" to enhance the user experience by making pop-up windows when the ink is low and helpfully telling you where you can order more special photo printing paper. There are the two different desktop search tools that got installed when the user downloaded the kitten screensaver pack and accompanying mouse cursors. Then, there is the sidebar app showing the weather outside and what time it is in Fiji.

That's stuff I can think of off the top of my head. I have no idea how average users manage to cruft up their machines so much, but they do. Keep in mind also that they're probably not going to upgrade hardware for a few years, but they'll likely install iTunes 32, Internet Explorer 17, and Office 2016, which will have >1 GB memory footprints.

By having lots of cores, and lots of memory, you can give them a decent user experience even on a machine that they've bogged to shit.

Re:Perfect for parents PC (2)

0123456 (636235) | about 2 years ago | (#41744689)

Have you ever seen some people boot up their machines? It will take 5 minutes because of the sheer amount of crap installed.

So uninstall the crap or install an SSD. A faster CPU makes far less difference to boot time than an SSD, because boot time is mostly limited by the seek performance of the disk as it loads all that crap into RAM.

Finally (1)

Spiflicator (64611) | about 2 years ago | (#41744187)

a CPU for the low-end enthusiast. Quite the niche market.

Re:Finally (1)

kriebz (258828) | about 2 years ago | (#41744351)

Me and the $60 A4 I'm typing this on disagree. Replaced the socket 754 1st gen Athlon 64 that wasn't quite cutting it 6 years later. Plenty of enthusiasts want a cheap 2nd PC for playing with OSs, home server, etc. The Pentium E5200 I bought a couple of years ago (Intel's version of this niche) is not aging very well with no virt and DDR2.

Vishera Piledriver Trinity Bulldozer FX-8350 i5-25 (0)

Anonymous Coward | about 2 years ago | (#41744335)

As Hobbes predicted, language has become a complete impediment to understanding. Vishera Piledriver Trinity Bulldozer FX-8350 i5-2550K i7-3770K.

SATSQ (4, Informative)

JDG1980 (2438906) | about 2 years ago | (#41744533)

AMD FX-8350 Review: Does Piledriver Fix Bulldozer's Flaws?

No. It still guzzles power like crazy compared to Sandy/Ivy Bridge, and its single-threaded performance still sucks royally. (And that's still very important since many, many programs cannot and will not ever support full multithreading.)
