
AMD Llano APU Review - Slow CPU, Fast GPU

CmdrTaco posted more than 2 years ago | from the welcome-to-the-doldrums dept.

AMD

Vigile writes "Though we did see the fruits of AMD's Fusion labor in the form of the Brazos platform late in 2010, Llano is the first mainstream part to be released that combines traditional x86 CPU cores with Radeon-based SIMD arrays for a heterogeneous computing environment. The A-series of APUs reviewed over at PC Perspective starts with the A8-3850 that is a combination of a true quad-core processor and 400 shader processors similar to those found in AMD's Radeon HD 5000 series of GPUs. The good news for the first desktop APU is that the integrated graphics blows past the best Intel has to offer on the Sandy Bridge platform by a factor of 2-4x in terms of gaming. The bad news is the CPU performance: running at only 2.9 GHz the Phenom-based x86 portion often finds itself behind even the dual-core Intel Core i3-2100. On the bright side you can pick one up next month for only $135."

184 comments

This is important (-1, Offtopic)

For a Free Internet (1594621) | more than 2 years ago | (#36621850)

I elivce that tecnologi isthe futor of us every wgere, it is the thing that we use to bake better for us all the live. But bad people will the thing , to it and then the stop, so no. In concluson, the with to the tecnologiu the were me, I would it AND SO SHOULD YUO! because no matter what chips it is, we all love me.

Re:This is important (0)

Anonymous Coward | more than 2 years ago | (#36622018)

Come on little baby AI, practice speaking some more...

Re:This is important (0)

Anonymous Coward | more than 2 years ago | (#36622162)

You're a funny dwunk.

Now drink some water, go to bed, and sleep it off.

Who buys AMD? (-1)

Anonymous Coward | more than 2 years ago | (#36621852)

Unless you want bang for your buck: spend $100 more and you have a good high-end Intel that sweeps AMD away.

Re:Who buys AMD? (1)

Anonymous Coward | more than 2 years ago | (#36621932)

Nice how the marketing worked on you.

Re:Who buys AMD? (2, Informative)

hedwards (940851) | more than 2 years ago | (#36622096)

Well, people that don't want to reward Intel's illegal behavior, for starters. I recently got a Llano-based laptop and was shocked at how well the chip handles the things that I do on a day-to-day basis. Sure, there's no chance of playing The Witcher or DNF on it, but it handles casual gaming just fine, especially the older games that I tend to like to play.

In practice, the dual core is much more responsive than the celeron I was using a couple years back, even though it's a third slower than that older Intel chip.

It's not for those that want top speeds, but it was substantially less expensive than the Intel option. A $100 price difference is pretty significant these days in terms of the machines that most people use. And in practice, I'm not so sure that it is only a $100 price difference, since you then don't need to shell out for a graphics chip or the circuitry to make that work. I ended up spending several hundred dollars less than I would have for the Intel option. Personally, I'd rather spend the money upgrading the warranty or paying for a backup plan.

Re:Who buys AMD? (2)

durrr (1316311) | more than 2 years ago | (#36622144)

I think the whole point of APUs is not to be high-end, expensive, battleship-system components.
You see, the $230 device you suggest buying instead has no integrated graphics, and thus you'll want to add $100 or more for a matching decent GPU (or you can be a retard and enjoy integrated shit-tier graphics along with your high-end CPU).

Or you simply settle for a lower-mid-tier system, buy the Llano device from the above article, and end up with a $200 cheaper system.

Re:Who buys AMD? (1)

Tx (96709) | more than 2 years ago | (#36622312)

The Intel Sandy Bridge parts (which I assume the GP is referring to) do have integrated graphics, but as the article says, the point here is that the Llano graphics outperformed the Sandy Bridge integrated graphics by 2-4x. Enough to make the difference between entry-level 3D gaming and no 3D gaming.

Re:Who buys AMD? (1)

Tx (96709) | more than 2 years ago | (#36622382)

To be clear, the Sandy Bridge chipset has integrated graphics, not the CPU, but you can't have one without the other is the point.

Re:Who buys AMD? (1)

dc29A (636871) | more than 2 years ago | (#36623284)

To be clear, the Sandy Bridge chipset has integrated graphics, not the CPU, but you can't have one without the other is the point.

No. GPU is on the CPU, like Llano.

Re:Who buys AMD? (4, Insightful)

ByOhTek (1181381) | more than 2 years ago | (#36622244)

If you don't need that kind of performance, then that extra $100 is wasted.

My server currently runs on an AMD. For one, it was the lowest-power quad core I could find (45W). For two, at the time, it was cheaper than most Intel quad cores, and used less power than all but their lowest-end dual cores.

Then again, my gaming rig is an i7 and my notebook is a Core2 Duo.

So, to answer your question: when it is the right tool for the job.

Re:Who buys AMD? (1)

Sloppy (14984) | more than 2 years ago | (#36622776)

My server currently runs on an AMD. For one, it was the lowest energy using quad core I could find (45W).

FWIW I did the same thing. Athlon II 610e: part of 2010's awesomest series of server CPUs. But let's not kid ourselves: if you were building a server from scratch today (not late 2010), you wouldn't use Sandy Bridge? I sure as hell would.

I can see some niches where this Llano stuff fits, though. Not sure if any of these are on my upcoming computer menu, but I've got one particular box where if it suddenly vaporized, I might replace it with Llano. Might.

Re:Who buys AMD? (0)

dc29A (636871) | more than 2 years ago | (#36623394)

But let's not kid ourselves: if you were building a server from scratch today (not late 2010), you wouldn't use Sandy Bridge?

Just finished building my Linux server/workstation. I needed cores and cheap. Picked up a Phenom II X6 1090T for about $160, a motherboard for $100 (supports Bulldozer, has lots of SATA ports), and 8 GB of RAM for about $80; the rest of the pieces I already had. $340 total. For that, I can barely get a 2500K with some shitty motherboard, and the 2600K is about $320. And since I'll be doing mostly programming, running virtual machines and just normal PC use (browsing, videos, etc ...), I need as many physical cores as possible (even if IPC is low) versus fewer cores and more IPC. Also, this build allows me to upgrade the processor later on to an 8-core Bulldozer if I ever need it.

PS: My dual core gaming rig is getting a 2500K + Asus Maximus Gene IV upgrade. As others said it: Best tool for the job.

Re:Who buys AMD? (1)

Sloppy (14984) | more than 2 years ago | (#36623814)

And since I'll be doing mostly programming, running virtual machines and just normal PC use (browsing, videos, etc ...)

Your server doesn't smell like a server. ;-) But fair enough; my Sandy-Bridge-now-always-beats-AMD-on-servers position is pretty prejudiced to certain workloads. YMMV and all that.

Re:Who buys AMD? (1)

goarilla (908067) | more than 2 years ago | (#36623486)

Kinda depends on your expected server workload, no ?

Re:Who buys AMD? (1)

0123456 (636235) | more than 2 years ago | (#36623648)

Kinda depends on your expected server workload, no ?

Sandy Bridge is faster, has lower peak power consumption for a given performance level and lower idle power consumption. I can't really see any expected workload where AMD is a better choice unless you plan to have lots of CPUs in your system.

Perfect for Bitcoin mining! (0)

dingen (958134) | more than 2 years ago | (#36621860)

I'm curious to see the output of this chip when mining Bitcoins. Bitcoin output depends heavily on the number of shaders, and right now the Radeon 5000 series is the best value for your money, with a 5870 offering over 400 Mhash/s (which is a lot). CPU power, on the other hand, doesn't matter at all, so all in all this Llano chip sounds like the perfect candidate for a Bitcoin mining rig. With the current conditions, you'll probably earn your chip back in less than a month.
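For anyone who wants to sanity-check claims like this, the expected payout is simple arithmetic: on average one block is found every difficulty * 2^32 hashes. Below is a minimal Python sketch; the hashrate, difficulty and exchange-rate figures are illustrative assumptions, not numbers from the comment or the article.

    # Rough expected-revenue estimate for a GPU miner, mid-2011 style.
    # Hashrate, difficulty and price below are assumed values for illustration.

    def btc_per_day(hashrate_mhs, difficulty, block_reward=50.0):
        # On average, one block is found every difficulty * 2**32 hashes.
        hashes_per_day = hashrate_mhs * 1e6 * 86400
        return hashes_per_day / (difficulty * 2**32) * block_reward

    hashrate = 400.0        # MHash/s, roughly a Radeon HD 5870
    difficulty = 1_380_000  # assumed network difficulty
    price = 15.0            # assumed USD per BTC
    daily = btc_per_day(hashrate, difficulty)
    print(f"~{daily:.3f} BTC/day, ~${daily * price * 30:.0f}/month")

Whether that pays the card off in a month depends entirely on the difficulty and exchange rate you plug in, both of which were moving fast at the time.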

Re:Perfect for Bitcoin mining! (1)

Afforess (1310263) | more than 2 years ago | (#36621976)

I take it you live in your mom's basement, because Bitcoin mining is never profitable (anymore) due to electricity costs. I calculated it out a few months back: at least in Michigan, leaving a 500-watt computer on 24/7 for 30 days costs ~$35. Expect that to rise over time.
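The parent's ~$35 figure is easy to reproduce; a quick sketch, assuming roughly $0.10/kWh (the actual Michigan rate is not given in the comment):

    # Back-of-envelope electricity cost for a box running flat out 24/7.
    # The $/kWh rate is an assumption; plug in your own utility's rate.
    watts = 500         # draw at the wall, per the parent comment
    hours = 24 * 30     # one month
    rate = 0.10         # assumed USD per kWh

    kwh = watts / 1000 * hours
    print(f"{kwh:.0f} kWh/month -> ${kwh * rate:.2f}")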

Re:Perfect for Bitcoin mining! (1)

theantipop (803016) | more than 2 years ago | (#36622068)

What kind of computer are you running that draws 500W from the wall?

Re:Perfect for Bitcoin mining! (1)

Afforess (1310263) | more than 2 years ago | (#36622080)

Any computer that mines bitcoins all day will be using a full load.

Re:Perfect for Bitcoin mining! (2)

marcosdumay (620877) | more than 2 years ago | (#36622642)

My best computers peak under 300W. And they aren't old or slow (they aren't the fastest ones available either, just near them).

I'd understand if you have 2 or more GPUs...

Re:Perfect for Bitcoin mining! (1)

Anonymous Coward | more than 2 years ago | (#36623110)

I have a core i7 920 with a Nvidia GTX 560 GPU, 2 monitors, a 2.1 sound system, 3 hard drives and 1 external. All those are plugged into my UPS and I've never drawn more than about 350 watts of power under full load.

Re:Perfect for Bitcoin mining! (0)

Anonymous Coward | more than 2 years ago | (#36622430)

The APU has what, 65W TDP? That means it'll consume on average about 32.5W, amirite? On an ATX board booted from a USB flash drive with the useless hardware disabled in the CMOS setup utility, you're going to be nowhere near that 500W. Also, there's been massive BTC deflation recently.

Re:Perfect for Bitcoin mining! (0)

Anonymous Coward | more than 2 years ago | (#36622552)

If you're fully loading the CPU for days on end, it'll average at about the TDP... So you are wrong.

Re:Perfect for Bitcoin mining! (1)

dingen (958134) | more than 2 years ago | (#36622556)

How is paying $35/month for electricity not profitable? With two 5870's running 24/7, you can easily make a few hundred dollars a month, even with the current difficulty and exchange rate.

Re:Perfect for Bitcoin mining! (0)

Anonymous Coward | more than 2 years ago | (#36622848)

Over time, the exchange rate should settle just above the cost of creation.

If it costs $35 to make some bitcoins, then they are worth approximately... $35.

You would like to sell them for $100, but your neighbor may be willing to sell them for $50 or $40. No one would sell at or below $35. But with technology efficiency improvements, the value may continue to decrease slightly.

Re:Perfect for Bitcoin mining! (1)

Nursie (632944) | more than 2 years ago | (#36622950)

Well, unless the currency undergoes something of a collapse (a glut of btc on the market is very possible).

Re:Perfect for Bitcoin mining! (1)

medv4380 (1604309) | more than 2 years ago | (#36623210)

Two 5870s running at full load will be 350-400 watts each.

Add in the motherboard and other basics and you're talking 1000 watts constantly.

It ends up being closer to $70-80 a month.

Plus the cards become worthless: you're running them so hot they're probably going to die and won't be resellable either.

Re:Perfect for Bitcoin mining! (5, Informative)

ewhenn (647989) | more than 2 years ago | (#36623596)

Two 5870s running at full load will be 350-400 watts each.

Add in the motherboard and other basics and you're talking 1000 watts constantly.

Nice job pulling those numbers out of your ass.

Here's the real power consumption of a 5870 right off of AMD's spec sheets: http://www.amd.com/us/products/desktop/graphics/ati-radeon-hd-5000/hd-5870/Pages/ati-radeon-hd-5870-overview.aspx#2 [amd.com]

I'll pull the relevant part out for you: Maximum board power: 188 Watts

Assuming people who mine bitcoin use at least a decent power supply that is 80% efficient at a given load (realistically most decent ones are 82%+ in their optimal load range), you're going to be pulling 235 watts from the wall per card, max.

235 watts is way less than 350-400 watts, by a long shot.

The rest of the system isn't going to be pulling huge amounts of power, since nobody who is mining bitcoin for real cash does it on a CPU; they do it on GPUs, and the amount of power a motherboard, RAM, disk drive and CPU use while they aren't really working is pretty low, usually in the 30-60 watt range depending on your CPU, but nowhere near 200 watts of draw.
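The arithmetic behind the 235-watt figure, as a tiny sketch; the 188 W board power comes from AMD's spec sheet quoted above, and the 80% PSU efficiency is the parent's own assumption:

    # Wall draw = DC load on the PSU / PSU efficiency.
    board_power = 188       # watts, Radeon HD 5870 maximum board power (AMD spec)
    psu_efficiency = 0.80   # assumed efficiency at this load

    wall_draw = board_power / psu_efficiency
    print(f"~{wall_draw:.0f} W from the wall per card at full load")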

Re:Perfect for Bitcoin mining! (1)

Toonol (1057698) | more than 2 years ago | (#36623436)

It's much cheaper to simply buy bitcoins from other people than it is to mine them. Assuming the currency doesn't self-destruct soon, market forces will surely correct that price inequality... either by lowering the value of bitcoins back down to the creation cost, or by people abandoning creating them until a time that hardware speed brings the cost down to their value.

Re:Perfect for Bitcoin mining! (0)

Anonymous Coward | more than 2 years ago | (#36621988)

Not exactly.

The performance of the GPU is roughly similar to a Radeon HD 5550, which mines at 41 MHash/second. The computing power of this is nowhere in the same ballpark as the desktop 58xx series.

Given the current difficulty, that will net you approximately 40 cents per day. In a month that would get you a whopping $12.

Re:Perfect for Bitcoin mining! (0)

Anonymous Coward | more than 2 years ago | (#36622350)

-2 offtopic. Seriously, does someone have to post a bitcoin comment in any thread about hardware? Take it to your damn forums.

Re:Perfect for Bitcoin mining! (1)

N0Man74 (1620447) | more than 2 years ago | (#36623140)

Right, because some types of applications that are heavily hardware-dependent are more on-topic than others. It's fine to talk about servers, benchmarks, gaming, video encoding, and other topics that might have some relevance to hardware performance, but not bitcoins!

Seriously, I don't use bitcoins and don't care about them, yet some of the overreactions regarding them and the outcries of "Stop talking about bitcoin on Slashdot" are more annoying than the mentions of bitcoin themselves. Since when is application performance on specific hardware not relevant to hardware discussions?

Slower than an i3... (5, Interesting)

LordLimecat (1103839) | more than 2 years ago | (#36621924)

On newegg that core i3-2100 is retailing for $124; how do the graphics in the llano stack up against the i3's graphics? Might not be such a bad deal at all.

Article (or at least the material they got from AMD) indicates that graphics is precisely where it shines, so an i3-class CPU with nearly-discrete-class graphics, at an i3 pricetag, sounds quite compelling.

Re:Slower than an i3... (1)

h4rr4r (612664) | more than 2 years ago | (#36622038)

That is AMD's plan with this unit: same relative cost and performance as the i3, but a much better GPU.

Re:Slower than an i3... (4, Informative)

butalearner (1235200) | more than 2 years ago | (#36622160)

I did a little digging for those wondering: it does run Linux [phoronix.com], but only with the proprietary Catalyst driver at the moment. Might be interesting once the open source driver catches up (assuming AMD shares the required info).

Re:Slower than an i3... (2)

royallthefourth (1564389) | more than 2 years ago | (#36623320)

The open source driver won't catch up; the open source drivers have never even come near to the closed drivers in 3D performance. They're for people who want to always use the latest kernel without worrying about incompatibility.

Re:Slower than an i3... (1)

rbrausse (1319883) | more than 2 years ago | (#36622090)

how do the graphics in the llano stack up against the i3's graphics?

this is not only answered in TFS but even in TFT :)

and arguably your question is kind of senseless, as Intel's i3 is not a CPU/GPU combination but "only" a processor; though if you use your i3 with Intel on-board graphics, the AMD will run circles around it.

Re:Slower than an i3... (3, Informative)

LordLimecat (1103839) | more than 2 years ago | (#36622196)

One of the Sandy Bridge selling points was "our integrated graphics no longer suck, and are now semi-decent". And calling the Llano a CPU/GPU combo while not doing the same for Intel is kind of pointless; both have integrated graphics, and both have it as a selling point. Since the prices are comparable, "one gives me good graphics and the other sucks" isn't a hard choice to make.

Re:Slower than an i3... (1)

rbrausse (1319883) | more than 2 years ago | (#36622210)

Intel's i3 is not a CPU/GPU combination but "only" a processor

argh, call me stupid; like you, I read only half of TFS and ignored the Sandy Bridge sentence :/

Re:Slower than an i3... (2)

wagnerrp (1305589) | more than 2 years ago | (#36622336)

No. The i3 and i5 lines integrate the graphics core on the same package as the CPU. The only thing the board provides are video transmitters. Intel has not produced a chipset with graphics since the G45 and Core 2 line.

Re:Slower than an i3... (2)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#36622122)

IIRC, the contemporary i3s are Sandy Bridge parts (or older), and Intel's on-die graphics options come in a few tiers, depending on the tier of the CPU they are integrated with.

So, if, in fact, the Llano's graphics are "2-4x better than the best Sandy Bridge has to offer", they should crush the i3's IGP like a bug, and be a better gaming part generally unless a given game is atypically CPU-bound.

I suspect that AMD will have themselves a cheapskate (and/or space-constrained) hit, since their part would appear to be a natural winner for any system that wants to do GPU-bound stuff without an additional 80+ dollars worth of add-on board; but if you were planning on an add-on GPU anyway, the i3, or better, would start to look pretty good unless the motherboards are substantially more expensive.

Re:Slower than an i3... (1)

LordLimecat (1103839) | more than 2 years ago | (#36622216)

Ah, but if you get a discrete ATI card, it looks like the integrated graphics teams up with it in some kind of bizarre Crossfire setup, so the AMD processor would be even better than the i3. Good luck setting up dual-rendering between intel integrated and an nVidia or ATI card.

Re:Slower than an i3... (1)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#36622466)

That does help to seal the fate of the lower-end i3s as the budget CPU of choice only for the must_have_intel brigade (I'm guessing a lot of corporate typing boxes will be sold therewith); and it certainly won't help Nvidia's chances of selling lower-end expansion boards to AMD users. However, at the higher end, I suspect that, while nice, the asymmetric Crossfire won't matter much: in the battle between two ~$50-80 expansion boards, having a bit of help from the most competent integrated graphics yet will tip the balance; in the battle between two ~$250-500 expansion boards, the assistance of the onboard shaders will be worth maybe a one-model bump in performance. Since you simply cannot buy a Llano part with a faster-than-2.9GHz quad-core CPU, but you can buy incrementally better GPU performance right up until you bump into the limits of two of the highest-end cards presently available, anybody who wants more CPU power than that will simply have to go with an iSomething and buy one tier higher on the discrete GPU side.

Re:Slower than an i3... (2)

LordLimecat (1103839) | more than 2 years ago | (#36623338)

I've always heard people talk about how faster cards need a faster CPU, and that if you pair a 2.2GHz dual-core AMD with a high-end 6990 you will end up bottlenecking the card, but I've never really seen it quantified or explained. Surely the CPU isn't processing the data that the GPU spits out onto the DVI port, and we are well past the days of needing the CPU to intervene on RAM and HDD requests; a lot of the point of AGP and PCIe (IIRC) is that they do not require CPU intervention to access memory, since they have a direct link to the controller.

What I'm getting at is: why WOULDN'T the 2.9GHz Llano be sufficient for your Llano + 2x HD6990 combo?

Re:Slower than an i3... (1)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#36623504)

It depends on what you are using it for. As you say, the GPU does not directly lean on the CPU to any significant extent; but most people buying fancy GPUs (with the specific exception of people using them for entirely GPU-based compute tasks) are buying them to run applications that eat both considerable CPU time and considerable GPU time. If somebody is buying serious GPU power, this usually means they are running something, or cranking up their game's settings, or whatever it happens to be, in a way that will also place considerable demands on the CPU.

CPUs don't directly bottleneck GPUs; but for mixed CPU/GPU tasks (like pretty much any gaming, CAD, etc. application) there is often some degree of correlation between the power of the GPU needed for a given visual quality level, or model mesh complexity, or what have you, and the amount of CPU power needed. So, if somebody finds themselves with a GPU that can handle 1920x1080, super-high quality, with high-detail models, but finds that CPU load spikes to 100% and they get half the framerate expected, they speak of being "CPU bottlenecked".

The same is true, if less commonly whined about, in the reverse situation. If you fire up a game on somebody's badly-specced business box, the screaming CPU will just sit there, yawning politely, and making snarky comments about whether you are getting frames per second or seconds per frame. Aside from the modest demands of running the driver, the lousy GPU isn't literally bottlenecking the CPU; but the application performance you can expect to achieve is being bottlenecked by the GPU.
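A crude way to picture this is to treat per-frame CPU work and GPU work as overlapping, with the slower side setting the frame rate. The sketch below is a toy model with invented millisecond figures, not measurements from any review:

    # Toy model: per-frame CPU and GPU work overlap, so the frame rate
    # is set by whichever side takes longer. All numbers are invented.
    def fps(cpu_ms, gpu_ms):
        return 1000.0 / max(cpu_ms, gpu_ms)

    print(fps(cpu_ms=25.0, gpu_ms=10.0))  # ~40 fps, CPU-bound
    print(fps(cpu_ms=25.0, gpu_ms=5.0))   # a faster GPU changes nothing: still ~40 fps
    print(fps(cpu_ms=12.0, gpu_ms=10.0))  # cut the CPU time and the frame rate finally moves

In this picture a faster GPU buys nothing until the CPU side comes down, which is exactly the "CPU bottlenecked" situation described above.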

Re:Slower than an i3... (1)

goarilla (908067) | more than 2 years ago | (#36623602)

Drivers use/need the CPU to do some processing before the data gets sent to the graphics card.

Re:Slower than an i3... (1)

goarilla (908067) | more than 2 years ago | (#36623644)

Another thing to realise is that the original GPU (the GeForce) was marketed as a graphics card with T&L (a transform and lighting engine).
That offloaded the last two steps (steps 5 and 6) of the rendering process to the GPU, while shaders opened the door to offloading a lot more of it,
e.g. all six steps; but it's highly unlikely that all of the CPU's rendering responsibilities are now GPU-only.

Re:Slower than an i3... (0)

Anonymous Coward | more than 2 years ago | (#36623272)

If it's anything like the 880G chipset option, only certain cards will work with Hybrid CrossFireX. I'm not sure why this is the case. My 880G has a 4250 onboard but only pairs with a 5450 for Hybrid CrossFireX. My 5770 cannot utilize Hybrid CrossFireX

Re:Slower than an i3... (1)

AmiMoJo (196126) | more than 2 years ago | (#36622798)

I don't think the benchmarks are very helpful, because for most people the performance of this chip should be excellent. I have a dual-core hyperthreaded Atom-based server which is very responsive and usable, but which sucks going by raw CPU performance benchmarks. For desktop use you don't need that much CPU power, and in fact simply having more cores is a better bet, as it improves responsiveness massively.

AMD are expecting the GPU to do a lot of the heavy processing, like video decoding; we just need more software to take advantage of it. My understanding is that the architecture of these APUs makes it easier to get the GPU working on stuff, e.g. by not having to transfer data to a segmented GPU RAM area.

Re:Slower than an i3... (2)

hedwards (940851) | more than 2 years ago | (#36623728)

I'm watching the development of OpenCL fairly closely, because it's probably going to end up making or breaking Llano in the long run.
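For the curious, "getting the GPU working on stuff" through OpenCL looks roughly like the sketch below. It uses the pyopencl bindings and a generic vector-add kernel; nothing here is Llano-specific, and it assumes pyopencl plus a working OpenCL runtime (e.g. Catalyst) are installed.

    # Minimal OpenCL offload sketch with pyopencl: add two arrays on
    # whatever OpenCL device the driver exposes (ideally the GPU).
    import numpy as np
    import pyopencl as cl

    a = np.random.rand(1_000_000).astype(np.float32)
    b = np.random.rand(1_000_000).astype(np.float32)

    ctx = cl.create_some_context()
    queue = cl.CommandQueue(ctx)
    mf = cl.mem_flags

    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    prg = cl.Program(ctx, """
    __kernel void add(__global const float *a,
                      __global const float *b,
                      __global float *out) {
        int gid = get_global_id(0);
        out[gid] = a[gid] + b[gid];
    }
    """).build()

    prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)

    out = np.empty_like(a)
    cl.enqueue_copy(queue, out, out_buf)
    assert np.allclose(out, a + b)

The interesting part for Llano-style chips is the buffer traffic: on a discrete card those cl.Buffer copies cross PCIe, whereas an APU sharing system memory can make that transfer much cheaper.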

Pretty well sounds like (1)

Sycraft-fu (314770) | more than 2 years ago | (#36622134)

The i3 does not have the best graphics for the SB, the i7 does. They say it is 2x-4x what that is. Well, that means pretty reasonable lower-midrange graphics. Enough to play modern games, though probably not with all the eye candy.

That could make it worthwhile for budget systems. $135 for an all inclusive solution rather than $124 for a CPU and $50 on a video card.

Of course there are some downsides too, in that it is a weaker CPU and some games (Bad Company 2 and Rift come to mind) need better CPUs; and of course with a discrete GPU you could spend $80 instead of $50 and get one that far outperforms any integrated GPU, this one included.

Still, I can see the idea being appealing. If they can firm up their CPU performance a bit with Bulldozer (which isn't likely to be as fast as Sandy Bridge but will be faster) and maybe bring down the price a bit it is a good budget gaming alternative.

I know people who are interested in PC gaming, but put off by the cost and complexity of getting discrete GPUs. If AMD can sell them a cheap integrated solution, it may be a win.

Just have to see in the long run.

Re:Pretty well sounds like (1)

obarthelemy (160321) | more than 2 years ago | (#36622566)

Intel pre-emptively released an i3 with their top-of-the-line HD3000 graphics GPU a short while ago, so the i3 is on a par with the best Intel can offer, iGPU-wise. 2105 I think.

Re:Pretty well sounds like (1)

Sycraft-fu (314770) | more than 2 years ago | (#36622772)

Even so, the benchmarks show it well ahead of an HD3000. The chip has got some reasonable graphics. Equal to or in most cases better than a $50 card. That's not bad.

Re:Pretty well sounds like (3, Insightful)

Rockoon (1252108) | more than 2 years ago | (#36622888)

Bulldozers won't have on-die graphics like these Llano (Bobcat) CPUs until mid to late 2012 at the earliest.

What should be noted, and what isn't well understood, is that these "APUs" coming out from AMD are all Bobcat chips. Bobcat is a design directly targeting Intel's Atom market. The review here is for the king of the Bobcats, the high-powered variant weighing in at 100W peak, built on the 32nm process. The low-power Bobcats only have 80 stream processors (5.9W, 9W, and 18W variants) instead of the 400 stream processors (100W) this thing has, and are on the 40nm process.

All the Bobcat modules have only 2 ALUs and 2 FPUs, and only a single-channel memory controller, so it is no surprise that it has trouble competing with the i3s. What is surprising is that, nevertheless, it is competing with the i3s.

Re:Pretty well sounds like (1)

shizzle (686334) | more than 2 years ago | (#36623496)

Bulldozers won't have on-die graphics like these Llano (Bobcat) CPUs until mid to late 2012 at the earliest.

True.

What should be noted, and what isn't well understood, is that these "APUs" coming out from AMD are all Bobcat chips.

Not true. The E-series and C-series parts released in January (Ontario/Zacate) are Bobcat chips, built on TSMC 40nm process. The big deal with Llano (A-series) is that it's not Bobcat, it's an enhanced Phenom-derived core, built on GlobalFoundries new 32nm process. There is no such thing as a 32nm Bobcat at this point in time.

Re:Pretty well sounds like (1)

shizzle (686334) | more than 2 years ago | (#36623548)

Bulldozers won't have on-die graphics like these Llano (Bobcat) CPUs until mid to late 2012 at the earliest.

True.

Responded too quickly... the "Bulldozers won't have on-die graphics [...] until [...] 2012" part is true (this is the Trinity part, which was demo'd a couple of weeks ago, but it was announced that it won't be ready for production until 2012; I don't remember what was said about when in 2012). The "Llano (Bobcat) CPUs" part is not true: Llano is most definitely not Bobcat.

Re:Pretty well sounds like (1)

0123456 (636235) | more than 2 years ago | (#36623590)

The review here is for the king of the Bobcats, the high-powered variant weighing in at 100W peak, built on the 32nm process.
All the Bobcat modules have only 2 ALUs and 2 FPUs, and only a single-channel memory controller, so it is no surprise that it has trouble competing with the i3s. What is surprising is that, nevertheless, it is competing with the i3s.

It has twice as many cores and, from the numbers you give here, uses about 3x as much power. I'm not too surprised that it can compete with a cheaper chip in that case.

Re:Pretty well sounds like (0)

Anonymous Coward | more than 2 years ago | (#36623804)

I thought Trinity was slated for Q4 2011? Or are you incorporating the usual release slippage here?

BTW, these APUs are not Bobcat cores. The Bobcat cores are to be found in the Ontario line, branded as E-xxx. These are A-xxx and are based on the Propus cores currently found in the Athlon line. So yes, there is nothing new here on the CPU side, but they are not re-branded ULP devices either.

Re:Slower than an i3... (1)

obarthelemy (160321) | more than 2 years ago | (#36622544)

Globally, twice as fast. It's extremely memory-constrained though, so shell out at least for 1600MHz DDR3; 1866 is best.

Re:Slower than an i3... (1)

slashmydots (2189826) | more than 2 years ago | (#36623656)

To answer your question, the processing power isn't very close, but the graphics get around double the framerate on average in a game at 1024x768. That's around 40 FPS compared to 20 FPS on average; if you go any bigger on resolution, it fails. It got almost double the score in 3DMark benchmarks as well. So yeah, it supports DX11 unlike Intel, but it's not going to be pretty. That brings it into what other reviewers have called the "barely playable" area as far as modern gaming goes. I think anyone buying it would be playing Flash and Java games anyway, but what I really want to know is how it does with HD on Netflix, Hulu, and YouTube, and whether or not it can play Blu-ray decently, because that would be the only thing making it worth it. Other than that, it's some sort of obscure low-power, low-heat gaming-build chip for a build I've never heard of anyone wanting to make. If the price comes down, it could make a really nice Netflix/Blu-ray media box for your TV that can also play basic games, though.

Slow? (5, Insightful)

Anonymous Coward | more than 2 years ago | (#36621946)

This new AMD product specifically targets the budget user who games occasionally. It allows entry-level gaming for the price of a very cheap CPU + GPU, at a lower TDP. It's also a better solution than a CPU + discrete graphics because it already gives you entry-level gaming without taking up a PCI-E slot, and at the same time it allows asymmetrical CrossFire, so if you later add a discrete GPU you can see a benefit (in DX10 & DX11 titles).

This new APU from AMD shoots down any budget graphics Intel has to offer while giving you far more CPU power than anything Atom does.

At the end of the day, a Core i3 + HD3000 costs more and has higher idle power usage.

IMO the title should read: "Brilliant new budget gaming APU from AMD!"

Re:Slow? (3, Informative)

cshake (736412) | more than 2 years ago | (#36622910)

I know this article is about the desktop APUs, but as I've been running the C-50 Ontario on my netbook (Acer AO522-BZ897) for a few months now, I think I can share some real-world experience.

Overall: It's a dual-core netbook, and still gets 6 hours battery life if I'm writing code with the brightness down, a little less if I'm listening to music. It may be slower on the individual cores than a competitive Atom, but if your program is threaded it's great. I'm very happy with the performance. It replaced a Powerbook G4 (I know, different class altogether, but still), and in terms of % CPU used for common tasks it's far and away better. No more mp3/m4a decoder taking 10+% CPU for decent bitrate songs.
Real-world Performance: I can say that any downside I've seen is entirely due to bad software - I hear that in windows I could watch 720p on it, but right now with x64 linux and the beta multi-threaded flash player in the latest firefox or chromium I can't watch youtube videos at more than 480 before it starts to drop frames. Not a big deal for me though. Once the video drivers caught up with the chipset I can say that compositing and desktop effects work flawlessly, no lag whatsoever. I don't play games on it, being a netbook, except for the occasional flash thing (which sometimes lag, but again that's the flash plugin).
Hooking it up to a 1080p TV over HDMI and running at native resolution, playing standard definition (624x352) XviD files zoomed to fullscreen works flawlessly in VLC. 720p x264 almost works - it saturates a single core and drops a frame here and there, and will really hang if you try to bring up a semi-transparent control bar over the video, but again if the codec were multithreaded it would be perfectly fine. I suspect that VLC on linux isn't taking full advantage of the GPU here either, considering I'm running the open source radeon drivers and not the official binary. For those of you running with officially supported closed-source software, i.e. official drivers or windows, I suspect it might even play 1080p without a problem.
Let's remember that this APU is in the same power class as an Atom, and it's a netbook - impressive performance in my mind.

As my only direct comparison points are the G4 powerbook that it replaced and my Phenom X4 9950 desktop, it's (gasp) right in the middle, but comparing a netbook to a desktop built for CAD and gaming is stupid, and so is comparing it to something 5 years older.

My only gripe is that you can't set how much RAM the GPU side takes, so no matter what size stick I've got in there the system sees the total minus 256M. The upside is that you only need a total of one stick of RAM, but the downside is that when it comes with a 1GB stick, suddenly you're trying to run a windows 7 system (out of the box) with ~750MB, and that's asking for trouble. As I swapped out the HD and RAM before ever turning it on the first time and installed linux fresh, I can't say I've seen the slowdown, but it could be there.

For those who haven't seen it a thousand times, (1)

Samantha Wright (1324923) | more than 2 years ago | (#36621982)

Ladies and gentlemen, I remind you about how well-documented this sort of thing is: the wheel of reincarnation [catb.org]. Personally, I'm betting that hardware is now so disposable that we'll eventually get to having our machines in one hunk of silicon, and the wheel will stall.

Re:For those who haven't seen it a thousand times, (1)

ArcherB (796902) | more than 2 years ago | (#36622766)

Ladies and gentlemen, I remind you about how well-documented this sort of thing is: the wheel of reincarnation [catb.org]. Personally, I'm betting that hardware is now so disposable that we'll eventually get to having our machines in one hunk of silicon, and the wheel will stall.

Exactly. I'll bet it will be called a "Tablet".

Actually, I envision the day when all phones will have a compatible interface that will allow for keyboards, mice and monitors to be hooked up to them. You take your "phone" to work, plug it in, do work. Pull it out, browse the web on your way home and plug it into your dock at home where you play games or whatever it is you do with your current PC at home. You go and visit your buddy and want to show him some new whiz-bang-app you have, you plug your phone into his dock and use your pc/phone as you did at home and work.

When ARM processors become comparable to the current x86s we use in our PCs, the above scenario is not at all implausible. The Motorola Atrix is an example of this in action, but it is too slow and expensive to be practical. Once ARM is fast enough, and if phone manufacturers can agree on a dock standard and third parties get involved in making the docks, your phone will push your PC out of your home and office. And like you said, cell phone hardware is more or less disposable.

Re:For those who haven't seen it a thousand times, (0)

Anonymous Coward | more than 2 years ago | (#36623704)

Exactly. I'll bet it will be called a "Tablet"

I'm hoping it will be called floppy. You know, e-ink on a sheet of reprocessed corn. And what's that with plugging? Wireless connectivity ftw!

Tom's Hardware... (1)

Anonymous Coward | more than 2 years ago | (#36622014)

http://www.tomshardware.com/reviews/amd-a8-3850-llano,2975.html#xtor=RSS-182

Looks pretty solid for entry level stuff. Interesting to see the "Enhanced Bulldozer" design that also incorporates the GPU elements.

Where are the boards? (0)

Anonymous Coward | more than 2 years ago | (#36622030)

I can see CPUs being available now, but where are the boards to put them on?

Not slow CPU, laggy software (1)

Anonymous Coward | more than 2 years ago | (#36622034)

Figure in the redesign needed for software that expects GPU memory pipes when that memory is now more direct, and you can easily see where significant improvement is yet to be realized. AMD realizes that and seems to have purposely left Llano as a limited edition at this speed, meant for those who want to afford one strictly for optimization work. The known slowness helps identify potential optimization techniques. They already have one with a faster core, yet to be released. Hold out if fast is all you want, as Llano seems strictly for developers.

Not quite slow (2)

Eravnrekaree (467752) | more than 2 years ago | (#36622042)

To say it's slow is a little ridiculous. Compared to a 286? I know this is in comparison to other modern CPUs, but any modern CPU is pretty fast.

I wonder if AMD or Intel will ever manage to develop an x86 integrated chip for handheld devices. It would be pretty interesting to have binary compatibility between desktop and handheld devices.

Re:Not quite slow (1)

m50d (797211) | more than 2 years ago | (#36622286)

My netbook uses an AMD Geode. It's great - runs a "normal" windows XP and plays ten-year-old games perfectly. It's a 7", so not quite handheld yet, but not a million miles away from something like the original gameboy.

Re:Not quite slow (2)

LWATCDR (28044) | more than 2 years ago | (#36622650)

For most people CPU power is a non-issue. The truth is that most office PCs and home PCs are very overpowered for what they are doing. Honestly, most users would see the biggest improvement in performance if they put their money into more RAM and faster storage, as well as a half-decent GPU, rather than a faster CPU.
The APU idea really has so much merit that it just isn't funny. If AMD can get this pushed out, and if more software starts to take advantage of the GPU, you will see a big benefit. This isn't all that different from when Intel came out with MMX and AMD came out with the 3DNow! extensions for the CPU. At the time they were not used very often, but when they were, the difference in performance was huge. Now we have SSE in the CPUs and most software uses it. Throw in the extra real cores on the APU vs the i3 and, as the article pointed out, for programs that support threading the AMD APU tended to beat the more expensive i3. In graphics performance it was two to four times the speed.
What we have is a CPU that will do very well when running programs that are multi-threaded and can use the GPU units well. Folding@home would be a good one.
I would have loved to see some browser benchmarks using IE9 and Chrome as well, since they are using GPU acceleration. This APU could mark a really big potential change in how programmers write code. When I started a few decades ago, 64k was a lot of memory. Even when we got to 32-bit CPUs and a few megabytes of memory, we would avoid floating-point math as much as possible. Until the Pentium line you still had to deal with many computers that didn't have hardware FPU support; floating point was slow and we worked hard not to have to use it. Now we have things like SSE and floating point is nothing; in fact it is as fast as or faster than using integers. Today every programmer should be thinking of multiprocessing and how to use the GPU to solve problems besides graphics. This is just the first of the line and frankly I find it very interesting.
And no, we do not want x86 on our mobile devices. Software on mobile is very different from software on the desktop; binary compatibility is pretty much useless. In the mobile space we are already seeing the shift to multicore, and the integration of the GPU is already standard. SoCs like the Tegra 2, OMAP4, Apple A5, Snapdragon, and the next-generation Hummingbird are all multicore and all have a GPU. Plus, x86 has a long way to go to match the low power and heat that ARM offers. If anything, I would say x86 needs to start being concerned that it will be pushed from the market from below, just as the PDP-11 and VAX were. Multi-core mobile SoCs will be common in 6 months if not less, and they are progressing at a very fast rate. Intel may have completely missed that boat.
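As a concrete, if contrived, illustration of the "thinking of multiprocessing" the parent advocates, here is the standard-library way to spread an embarrassingly parallel job across a quad core in Python; the crunch() workload is just a placeholder, not anything from the article.

    # Spread a CPU-bound job across four cores with the standard library.
    from concurrent.futures import ProcessPoolExecutor

    def crunch(n):
        # Stand-in for real per-chunk work (encoding, folding, whatever).
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        chunks = [2_000_000] * 8
        with ProcessPoolExecutor(max_workers=4) as pool:
            results = list(pool.map(crunch, chunks))
        print(sum(results))

The GPU half of the advice is a separate step (OpenCL, as in the sketch further up); the point is simply that neither the extra cores nor the shader array help code that was written single-threaded.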

Re:Not quite slow (1)

Plainswind (2089218) | more than 2 years ago | (#36623826)

Why do so many nerds extrapolate the limits of their hardware requirements onto others? Trust me, general use of computers is far more diverse in hardware requirements than most nerds want to believe. Take, for example, all the hobby artists (photo, video, music), or, like my mother, people doing patterns for sewing, stitching, etc. My brother uses his computer to help him with his hobby of working on old boats, including doing CAD. Both need beefier hardware than I do when I code: I can just kick off a compile and go do some household chores, while they need beefy hardware to actively work on their tasks. Gaming is far more widespread now than it was 12 years ago, and contrary to the popular meme here on Slashdot, it's not just flash games. My mother loves the Settlers games, my sister loves The Sims. My dad is a fan of Trainz, which requires rather massive hardware, and it's not just about RAM and storage... He's building a scenario based on where he grew up, and how he remembered it. And he's no computer nerd either. Yet he needs more hardware than I do.

Re:Not quite slow (0)

Anonymous Coward | more than 2 years ago | (#36623074)

Slower than an i3, yes. But...it's waay faster than the fastest Atoms, which is the segment this is aimed at in the first place. I honestly wish the gamer crowd would QUIT trying to compare this stuff to the beasts they field for their enjoyment.

Re:Not quite slow (2)

0123456 (636235) | more than 2 years ago | (#36623710)

Slower than an i3, yes. But...it's waay faster than the fastest Atoms

And costs waay more and uses waay more power. Atoms are mostly being used for cheap, low power systems, and this chip fails on both counts.

Overclocking (1)

h4x0t (1245872) | more than 2 years ago | (#36622094)

What's the story on overclocking these things?

I'd imagine there's a whole new set of problems to overcome before they can be reliably tweaked by the end user.

Re:Overclocking (1)

Shatrat (855151) | more than 2 years ago | (#36622340)

Nobody really overclocks low-end hardware like this, but if you did at least you'd only need a single waterblock or heatsink.

You'd get much more bang for your buck spending the money you would have spent on cooling and spending it on a faster CPU+Discrete Graphics combo.
The only reason to overclock one of these would be shits and/or giggles.

Faulty Testing Methodology (2)

TPoise (799382) | more than 2 years ago | (#36622124)

The article does not test using Quick Sync technology for the video rendering portion. When this is turned on, an Intel HD3000 is 6 times faster at video encoding than a top-of-the-line Radeon (benchmarks here [tomshardware.com]). And some of the tests show the Core i7-970 as twice as slow as a Core i5?? Gotta call B.S. on that one. And what's the point of testing a dual-card setup (APU + Radeon) against a single Intel integrated graphics? We all know the HD3000 isn't for gaming; that's why you get a $65 Radeon to run your games. Most mid-range laptops come with some sort of discrete graphics card that rivals the GPU performance of the Llano. I waited around for Llano and was severely disappointed with the CPU results. Tom's Hardware and AnandTech reviewed it in depth and found the gaming performance was comparable to a mid-range discrete card, along with similar battery life and similar heat. However, cost is the only thing working in AMD's favor. I still don't see why somebody would buy a 4-year-old CPU architecture that will be EOL'd by the time Bulldozer comes out in a few months.

Re:Faulty Testing Methodology (0)

Anonymous Coward | more than 2 years ago | (#36623796)

That is only good for h.264 and not a very good encoder either.

A6 reviews, anyone? (1)

jensend (71114) | more than 2 years ago | (#36622412)

I'll be building a mini-itx system this summer, and I find the cheaper (and possibly cooler) versions of Llano more interesting. Since the GPU side of the chip is rather bandwidth-limited, I wonder whether the lower-clocked and/or lower shader count (320 instead of 400) versions of the chip might perform almost as well as the highest-end chip all the sites I've seen have tested. Anybody seen reviews of any of the rest of the lineup?

Re:A6 reviews, anyone? (1)

obarthelemy (160321) | more than 2 years ago | (#36622660)

The 65W versions are not out yet; I haven't even seen a single test anywhere, and I've been looking.

FYI, I couldn't wait and built a mini-ITX rig with Asus' E-350 board, and I'm fairly happy with it: dual screen, SD video on one, office stuff on the other, no real slowdowns, very quiet; no games newer than 4+ years old though. The challenge was finding a nice VESA-mountable mini-ITX case. Logicsupply.com has plenty (the M-350 or T3410 caught my eye; I bought both for funsies), or the elementQ is OK if you want a shoebox; I took that for a second small NAS/HTPC. My passive E-350 runs at 60C, so even a 65W Llano will need some cooling for sure. The M-350 accepts up to three 4cm fans, with the $4 extra fan support.

Re:A6 reviews, anyone? (1)

Billly Gates (198444) | more than 2 years ago | (#36623744)

Here you go [tomshardware.com].

I saw another benchmark that compared it to the Atom, as that is the closest in price range. As you can tell, it is not the fastest chip by any means, nor does it have the best GPU for a dedicated gaming box. However, look at the benchmarks for the price you pay: not so bad for older games. An Intel Atom with Nvidia ION can't even run half of these games.

The bandwidth limitations will be further reduced in future versions of Llano later this year, as the GPU will have its own memory controller and won't have the latency penalty of waiting on the CPU's, or something weird like that. The fact that the GPU is integrated already gets rid of some of the latency, and as you can see it does quite well for dirt cheap.

A quad-core @ 2.9Ghz isn't slow! (4, Insightful)

Anonymous Coward | more than 2 years ago | (#36622514)

It's just not. Maybe it's "slow" compared to the newest chip, but, if you want to pull that crap, the newest chips are "slow" compared to a new Cray.

If you're doing things on a regular basis that are CPU-intensive, then, sure, you need speed. But 99% of applications aren't even going to stress a quad core @ 3ghz.

Re:A quad-core @ 2.9Ghz isn't slow! (1)

hedwards (940851) | more than 2 years ago | (#36623808)

Indeed, I've got a dual core Zacate clocked at 1.6ghz, and I'm not having any performance problems, even when I unplug and start working cordless. Sure it can get hot and the battery life sucks when I turn it all the way up, but the entire laptop maxes out at about 25 watts.

In fact, the next time my folks are in need of a new computer, I'll probably recommend that they go with whatever equivalent is available at that time. Apart from gamers and people that regularly engage in computationally stressful tasks, most folks don't need any more power.

"only" 2.9GHz? (0)

Ant P. (974313) | more than 2 years ago | (#36622616)

That's faster than the Phenom II I'm using now, and still costs less even with a GPU built in. And unlike the equivalent Intel part I know I'll get basics like hardware virtualisation without having to read the 0.5-micron-high fine print.

Re:"only" 2.9GHz? (2)

TPoise (799382) | more than 2 years ago | (#36622748)

If you read TFA, you would have seen that the clock speed may be numerically higher, but the performance was slower than the Phenom II quad-core. And yes, the Core i3 (Sandy Bridge version) has hardware virtualization assist. http://ark.intel.com/VTList.aspx [intel.com] #deniedfud

Re:"only" 2.9GHz? (1)

Billly Gates (198444) | more than 2 years ago | (#36623676)

These processors are for tablets and netbooks running Windows 8, or for browsing the web with the GPU taking part of the load. This is the first value-oriented chip with decent graphics for people who want to play World of Warcraft and have a decent computing experience on YouTube for like $499 or even $399. Most users run Office, browse the web and play Angry Birds. Flash 10.3, Firefox 4, and IE9 or higher will run very well with Direct2D on these.

Before, these systems cost more like $899 because of the added cost of a video card. I have a Phenom II 6-core, but it is underclocked at only 2.6 GHz. It is pretty snappy, and I know it is not the fastest. But this is fine, and much needed in an age of graphics-heavy, AJAX-galore UI design and video on entry-level devices.

Loonix support? (-1)

Anonymous Coward | more than 2 years ago | (#36622704)

What is the loonix support like for this 'guano' 'APU'?

Windows 8 will fly on this (1)

Billly Gates (198444) | more than 2 years ago | (#36623580)

Hardware-accelerated browsing for both IE10 and Flash 10.3 will make up for the mediocre CPU, unlike the Atom CPU netbooks. This is perfect as an entry-level CPU for someone who just browses the web, plays Angry Birds, and runs Office (about 80% of users). This thing can run full 1080p HD video at 30 FPS easily with Flash 10.3 or higher.

However, if you are running Ubuntu 10.10 with Flash 10.0 and Firefox 3.6, you won't see any benefit, because those tasks are not offloaded to the GPU. Hopefully this will be fixed in future releases with better Flash and more modern web browsers.

But if the Metro interface with all its color and eye candy is the new norm, this moderately priced chip will do wonders by offloading work to its GPU, even if the benchmarks do not show it right away. The user experience will be better.

World of Warcraft can finally run on cheap integrated video now. :-)
