Slashdot: News for Nerds


AMD Launches New Richland APUs For the Desktop, Speeds Up To 4.4GHz

Unknown Lamer posted about a year ago | from the keeping-up-with-the-intels dept.

AMD

MojoKid writes "AMD recently unveiled a handful of mobile Elite A-Series APUs, formerly codenamed Richland. Those products built upon the company's existing Trinity-based products but offered additional power and frequency optimizations designed to enhance overall performance and increase battery life. Today AMD is launching a handful of new Richland APUs for desktops and small form factor PCs. The additional power and thermal headroom afforded by desktop form factors has allowed AMD to crank things up a few notches further on both the CPU and GPU sides. The highest-end parts feature quad CPU cores with 384 Radeon cores and 4MB of total cache. The top-end APUs have GPU cores clocked at 844MHz (a 44MHz increase over Trinity) with CPU core boost clocks that top out at a lofty 4.4GHz. In addition, AMD's top-end part, the A10-6800K, has been validated for use with DDR3-2133 memory. The rest of the APUs max out with an 1866MHz DDR3 memory interface." As with the last few APUs, the conclusion is that the new A10 chips beat Intel's Haswell graphics solidly, but lag a bit in CPU performance and power consumption.


153 comments

I beg your pardon (0, Interesting)

Anonymous Coward | about a year ago | (#43915537)

I thought I was computer literate, but this summary is so full of acronyms that I understand nada. Are we talking about discrete graphics cards here or what?

the only question that matters (1, Funny)

noh8rz10 (2716597) | about a year ago | (#43915611)

But will it run 4 instances of Crysis 3 at 1080p on a single 4K monitor?

Re:the only question that matters (0)

Anonymous Coward | about a year ago | (#43916047)

It will not.

Re:the only question that matters (0)

Anonymous Coward | about a year ago | (#43916573)

Will it blend?

Re:I beg your pardon (5, Informative)

dogbert_2001 (1309553) | about a year ago | (#43915657)

Accelerated processing unit. Basically a CPU with integrated graphics. Both AMD's and Intel's recent CPUs have been APUs.

Re:I beg your pardon (0, Troll)

shafty (81434) | about a year ago | (#43917083)

And after the APU is done processing, it says in a catchy Middle Eastern accent, "THANK YOU, COME AGAIN!"

Re:I beg your pardon (1)

X0563511 (793323) | about a year ago | (#43917449)

I was confused what Auxiliary Power Units had to do with this.

I think aviation/spaceflight has dibs on this acronym.

Re:I beg your pardon (4, Informative)

GrumpySteen (1250194) | about a year ago | (#43915697)

http://en.wikipedia.org/wiki/AMD_Accelerated_Processing_Unit [wikipedia.org]

APU is the only unusual acronym in the summary. It refers to a chip with both the CPU and graphics processor on the same die. It was previously called Fusion, but trademarks got in the way.

Re:I beg your pardon (1)

Spy Handler (822350) | about a year ago | (#43916781)

So if I buy this I won't need my Radeon 7850 video card anymore? Should I sell it on ebay now before resale value plummets?

Or is this APU just a slightly better version of motherboard integrated graphics that's been around for decades? Not fit to play 3-D games?

Re:I beg your pardon (1)

darkwing_bmf (178021) | about a year ago | (#43917063)

Thanks. I was wondering myself. For some reason all my brain could come up with was "analog processing unit" but I was pretty sure that wasn't right.

Re:I beg your pardon (4, Informative)

kcbnac (854015) | about a year ago | (#43917181)

Here is your Radeon HD7850: http://www.gpureview.com/Radeon-HD-7850-card-678.html [gpureview.com]

It has 1024 Shader Processors ("Radeon Cores" in the summary), and (stock) is clocked at 860MHz. The 8670D included in this new APU has 384 Shader Processors, and is clocked at 844MHz. So about 2/5ths of the computing power; presuming all other factors are equal.
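
That 2/5ths figure is just shader count times clock. A quick sketch of the arithmetic, assuming (as the parent does) that all other factors - memory bandwidth, ROPs, architecture - are equal, which they aren't quite:

```python
# Back-of-the-envelope GPU throughput comparison: shader processors x clock.
# This ignores memory bandwidth, ROP counts, and architectural differences,
# so treat it as a rough upper bound on relative performance.

def relative_throughput(shaders_a, clock_a_mhz, shaders_b, clock_b_mhz):
    """Throughput of part A as a fraction of part B, all else equal."""
    return (shaders_a * clock_a_mhz) / (shaders_b * clock_b_mhz)

# Radeon HD 8670D (in the new A10 APU) vs. a discrete Radeon HD 7850:
ratio = relative_throughput(384, 844, 1024, 860)
print(f"{ratio:.2f}")  # ~0.37, i.e. roughly 2/5ths of the 7850
```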

So while for high-end gaming, it won't quite cut it (Turning on most of the shiny and enabling it across 3 monitors with Eyefinity would make it beg) - it should be plenty powerful for light/medium gaming on a single monitor, or any light/moderate duties across multiple monitors with Eyefinity.

Re:I beg your pardon (0)

Anonymous Coward | about a year ago | (#43917213)

>So if I buy this I won't need my Radeon 7850 video card anymore?
I actually went from the A6-3650's IGP to a 7850, which has about 6x the performance.
One of the things they did on that generation was spend die area that would otherwise have gone to L3 cache on GPU silicon.

Right now their APU line shares DDR3 between the CPU & GPU, so there is only so much bandwidth the on-chip GPU can access.
That won't change until they release APUs for the PC market with specs similar to their PS4 SoC (shared GDDR5 memory with a 7850/7870-class GPU).

BTW, I played Crysis 3 on low settings at 1024x768 on that IGP, at below 20fps, during the four days the courier delayed my 7850.

Re:I beg your pardon (3, Informative)

Molochi (555357) | about a year ago | (#43917301)

If you feel the 7850 is needed then these will be too slow for you.

The GPU in the A10-5800 (the one currently on the shelves) is fairly accurately labeled a 6550d and requires settings to be turned down to Low@720p/1366x768 to get acceptable performance in a game like Battlefield3. The new APU is only incrementally more powerful and faster.

What these "APU" chips (which in my mind include Haswell chips) are obsoleting are the low-end budget cards with 64-bit GDDR5 and 128-bit DDR3 memory that get put in a lot of office desktops.

Re:I beg your pardon (1)

wolfemi1 (765089) | about a year ago | (#43917435)

http://en.wikipedia.org/wiki/AMD_Accelerated_Processing_Unit [wikipedia.org]

APU is the only unusual acronym in the summary. It refers to a chip with both the CPU and graphics processor on the same die. It was previously called Fusion, but trademarks got in the way.

Unfortunate, because it already stands for Auxiliary Power Unit in aerospace. But I think we've passed peak TLA.

Re:I beg your pardon (0)

Luckyo (1726890) | about a year ago | (#43915989)

Intel and AMD now have graphics hardware built into their CPUs.

AMD has traditionally had better integrated graphics, but worse CPUs, compared to Intel's similar offerings. This trend appears to continue in the next generation.

Look at all that speed (-1)

Anonymous Coward | about a year ago | (#43915571)

And they're still crippled by the FSB bottleneck.

Re:Look at all that speed (5, Informative)

JDG1980 (2438906) | about a year ago | (#43915961)

Huh? The front-side bus hasn't existed in years. AMD abolished it way back in 2003 when they moved the Athlon 64's memory controller on-die. Intel did the same thing with Nehalem in 2008.

Perhaps you just meant that there isn't enough memory bandwidth to use the GPU to its full potential in games? The good news is that AMD's upcoming Kaveri will have GDDR5 support [brightsideofnews.com], with a unified memory architecture similar to the new consoles.

Re:Look at all that speed (1)

afidel (530433) | about a year ago | (#43916467)

Yeah, but it's only 128-bit, and it's mutually exclusive with DDR3, so you cap out at 4GB even with yet-unreleased high-density GDDR chips. Not exactly useful for a general-purpose computer.

Re:Look at all that speed (1)

gman003 (1693318) | about a year ago | (#43916843)

128bit GDDR5 is still twice the bandwidth of 128bit DDR3 (quad-pumped, not double-pumped). And won't they have banked memory, so you can have more DIMMs than channels (just like you can have four DIMMs in a dual-channel DDR3 system)?
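
The bandwidth claim checks out arithmetically. A sketch with illustrative transfer rates (the actual Kaveri GDDR5 clocks weren't public, so the "same base clock" pairing here is an assumption):

```python
# Peak memory bandwidth = bus width in bytes x effective transfer rate.
# GDDR5 is quad-pumped where DDR3 is double-pumped, so at the same command
# clock GDDR5 moves twice as many transfers per second.

def peak_bandwidth_gbs(bus_bits, transfers_per_sec):
    return bus_bits / 8 * transfers_per_sec / 1e9

ddr3_2133 = peak_bandwidth_gbs(128, 2133e6)      # 128-bit dual-channel DDR3-2133
gddr5     = peak_bandwidth_gbs(128, 2 * 2133e6)  # GDDR5 at the same base clock
print(f"{ddr3_2133:.1f} GB/s vs {gddr5:.1f} GB/s")  # 34.1 GB/s vs 68.3 GB/s
```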

Re:Look at all that speed (1)

Blaskowicz (634489) | about a year ago | (#43917201)

With the GDDR5 version memory will be soldered onto the mobo, along with the APU. No DIMMs. Though it isn't clear to me if it maxes out at 4GB or 8GB - might well be 4GB which makes it useful for gaming but sucks if you want to do crazy hungry web browsing or something else on the side.

Only usable with Windows 8? (0)

Anonymous Coward | about a year ago | (#43915587)

Is Windows 8 the only (Microsoft) desktop OS that will be able to use PC's built with this?

Re:Only usable with Windows 8? (0)

Anonymous Coward | about a year ago | (#43915715)

Is Windows 8 the only (Microsoft) desktop OS that will be able to use PC's built with this?

Nope. They've announced EOE (Every Other Edition) support so it will run Windows ME, Vista and 8. Win, win, win!

CPU speed? (-1)

Anonymous Coward | about a year ago | (#43915597)

4.4 GHz? Oddly not mentioned in TFA....

Re:CPU speed? (1)

countach44 (790998) | about a year ago | (#43915631)

4.4 GHz? Oddly not mentioned in TFA....

See page 2 [hothardware.com]

Fascinating misues of adjectives there! (-1, Offtopic)

CajunArson (465943) | about a year ago | (#43915653)

AMD's marketing department wrote that summary!

Richland's GPU is at best about 20% faster than the intentionally-midrange HD-4600 GPU in Haswell. Add in any form of desktop GPU, including midrange models from 2011, and Haswell wins by a landslide.

On the CPU side, I recall seeing a delightfully hilarious graph where a 6800K overclocked to 5GHz had exactly half the score of a (stock-clocked) 4770K. Before we get to the usual "But AMD is cheap!" argument: when you take into account the $150 price of the 6800K and the $350 price of the 4770K, AMD only wins on price/performance if you intentionally buy the most expensive Haswell model available and intentionally don't overclock it, while also overclocking the crap out of the 6800K.
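
For what it's worth, even under the parent's own unsourced numbers the price/performance call is close. A sketch, treating the 4770K's score as 1.0 and the quoted prices as given:

```python
# Price/performance under the numbers quoted above (hearsay, not benchmarks):
# 6800K at half the multithreaded score of a 4770K, $150 vs. $350.

def perf_per_dollar(score, price_usd):
    return score / price_usd

amd_6800k   = perf_per_dollar(0.5, 150)  # ~0.0033 score units per dollar
intel_4770k = perf_per_dollar(1.0, 350)  # ~0.0029 score units per dollar
# The 6800K edges out the flagship on this metric - which is exactly the
# point above: AMD only "wins" against Intel's most expensive part.
```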

Kaveri could be a very interesting product. Richland is a placeholder meant to fight Haswell with numeric model number inflation.

Re:Fascinating misues of adjectives there! (1, Informative)

CajunArson (465943) | about a year ago | (#43915681)

P.S. --> the score in question from my previous post was for Cinebench 11.5, but there are many many others like it. And don't think that OpenCL holds any miracles for Trinity either, the 4600 is actually a better OpenCL part than it is a GPU.

Re:Fascinating misues of adjectives there! (5, Informative)

Baloroth (2370816) | about a year ago | (#43916017)

P.S. --> the score in question from my previous post was for Cinebench 11.5, but there are many many others like it. And don't think that OpenCL holds any miracles for Trinity either, the 4600 is actually a better OpenCL part than it is a GPU.

Really? Because the one OpenCL benchmark [hothardware.com] I can find in TFA pegs the new chips at 2.5 times faster than the 4600 that comes with the i5-4670k. I wouldn't consider a part that is less than half as fast to be "better." Maybe that's just me? Could be. Also, I wouldn't say "at best" 20% faster when several benchmarks peg it at 30% or more. The Enemy Territory: Quake Wars [hothardware.com] high-res benchmark, in particular, is... hilariously one sided (and since most people are going to be playing at high-res settings, it's a benchmark that actually matters). Actually, all the high-res gaming tests are, with the new chips often coming in close to twice the Haswell chips. In fact, the Cinebench 11.5 tests [hothardware.com] peg the Richland at 60% faster than the i5-4670k, so I'm not sure where the hell you got any of your numbers from.

Re:Fascinating misues of adjectives there! (2)

sexconker (1179573) | about a year ago | (#43916815)

It's the same story every generation.

CPUs: AMD wins in the $/performance category but loses in terms of pure performance. For 2 generations AMD won the pure performance crown as well.

Onboard (or on-die) GPUs: Intel's implementation will get you moderate FPS on games released in [PurchaseYear - 1] at a sub-native resolution. AMD's implementation will at least run medium settings around 30 fps at native resolution. AMD wins in the $/performance category AND the pure performance category.

Discrete GPUs: AMD wins in the $/performance category but loses in terms of pure performance. If you want the top of the line, you spend big money on two of Nvidia's top-end cards every year.

CPU performance has been good enough for the vast majority of tasks that it's taking a back seat for me. I still need it for video encoding, since the x264 kids don't want to do an OpenCL version.
If the DivX folks can get their HEVC encoder running on OpenCL (or if the x264 team does the same), then I'll see no reason to go with Intel in the near future. I'd rather spend the $ difference on more SSDs.

I'm not sure where the hell you got any of your numbers from.

It's CajunArson, he's a known fanboi/troll, and he loves to reply to himself with additional info to whore +1 Informative mods.

Re:Fascinating misues of adjectives there! (1)

pepty (1976012) | about a year ago | (#43917115)

It's the same story every generation.

CPUs:

CPUs: The low-price AMD units win the performance/purchase-price competition. Low-price Intel units win the performance/(purchase + operating costs) competition, at least if your computer is on a lot and you pay for your own electricity.
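
A rough total-cost-of-ownership sketch of that argument, with hypothetical prices, power draws, and electricity rates (none of these figures come from the article):

```python
# Purchase price plus electricity over the machine's life. All numbers here
# are illustrative assumptions, not measured figures.

def total_cost(price_usd, avg_watts, hours_per_day, years, usd_per_kwh=0.12):
    kwh = avg_watts / 1000 * hours_per_day * 365 * years
    return price_usd + kwh * usd_per_kwh

# A cheaper chip that draws more power vs. a pricier, more efficient one,
# both running 8 hours a day for 4 years:
budget_chip    = total_cost(130, 100, 8, 4)  # ~$270 total
efficient_chip = total_cost(190, 60, 8, 4)   # ~$274 total
# Heavy use narrows (or reverses) the sticker-price gap.
```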

Re:Fascinating misues of adjectives there! (1)

pepty (1976012) | about a year ago | (#43917145)

for desktops, that is.

Re:Fascinating misues of adjectives there! (5, Insightful)

Anonymous Coward | about a year ago | (#43915869)

Yeah you're right. Fuck AMD. Let's support Intel, the anti-competitive market-abusing cocksuckers who had to secretly pay off Michael Dell to use their chips. That's a company I want to support with my money.

Re:Fascinating misues of adjectives there! (0)

Anonymous Coward | about a year ago | (#43916359)

Where oh where are my mod points when I need them? Intel is a dirty, filthy company that plays dirty, filthy pool, debasing the whole idea of a free market and undermining the progress of CPUs in the process.

http://www.pcworld.com/article/184882/A_History_of_Intels_Antitrust_Woes.html [pcworld.com]

http://www.osnews.com/story/21468/Source_Intel_To_Be_Found_Guilty_of_Monopoly_Abuse [osnews.com]

http://abcnews.go.com/Business/story?id=7574976&page=1#.Ua94NkDrz4M [go.com]

and plays hardball against even the smallest of critics-

http://www.faceintel.com/kenwonintellost2.htm [faceintel.com]

All the while sucking as hard as any monopoly at the public teat:

http://www.faceintel.com/tax$subsidizeintel.htm [faceintel.com]

Intel is a dirty, disgusting company that debases the whole idea of a free market.

Re:Fascinating misues of adjectives there! (0)

Anonymous Coward | about a year ago | (#43916595)

Intel has the advantage that most of their fab plants are in the United States.

On the other hand, I'm not an American. So fuck you, Intel!

Re:Fascinating misues of adjectives there! (0)

Anonymous Coward | about a year ago | (#43916765)

Intel also spends more on R&D than AMD has revenue. Intel was the only chip company that didn't scale back R&D during the economic down-turn.

Re:Fascinating misues of adjectives there! (1)

spire3661 (1038968) | about a year ago | (#43916381)

In a world where heat and power are becoming REALLY important, AMD is lagging behind.

Re:Fascinating misues of adjectives there! (1)

cheesybagel (670288) | about a year ago | (#43916801)

Piledriver is not the only CPU core they have. The Jaguar core (to be used in the PS4 and XBox One) is low power and has better performance than the Intel Atom cores.

Re:Fascinating misues of adjectives there! (1)

gman003 (1693318) | about a year ago | (#43917141)

Considering the ARM Cortex-A15 has better performance than the Atom, that's not saying much. Intel's been pretty neglectful of the Atom line.

Re:Fascinating misues of adjectives there! (0)

Anonymous Coward | about a year ago | (#43916551)

I'm not going to reward AMD for turning out substandard products and their poor business practices.

Intel CPUs are faster. Full stop. The fastest AMD CPUs can't compete with mid-grade i3s. (And any graphics edge evaporates in the face of a 75 dollar video card.)

AMD is behind because they got some hot-shit CEO a few years back who decided the best way to make money was to "fire all the engineers." They're just cost centers, after all. AMD has been behind ever since.

Re:Fascinating misues of adjectives there! (1)

Anonymous Coward | about a year ago | (#43915887)

If I buy a $350 CPU and a discrete GPU, it will beat the hell out of a $150 CPU. Did I get your logic right?

Re:Fascinating misues of adjectives there! (2)

0123456 (636235) | about a year ago | (#43916197)

If I buy a $350 CPU and a discrete GPU, it will beat the hell out of a $150 CPU. Did I get your logic right?

I think the point is that regardless of whether you buy Intel or AMD, you'll still probably be playing new games on the lowest graphics settings available. If you actually bought a PC to play games, you'll be buying a discrete graphics card, so the on-chip GPU is just a waste of space unless the OS is able to switch back to it for the desktop to save power when you're not running games.

Re:Fascinating misues of adjectives there! (0)

Anonymous Coward | about a year ago | (#43916279)

Gee, if only AMD had a line of chips that didn't have the graphics unit in them...

Re:Fascinating misues of adjectives there! (1)

0123456 (636235) | about a year ago | (#43917169)

Gee, if only AMD had a line of chips that didn't have the graphics unit in them...

If only AMD had a line of chips which didn't have the graphics unit in them and were actually competitive with anything other than Intel's low-end parts....

Re:Fascinating misues of adjectives there! (0, Interesting)

Anonymous Coward | about a year ago | (#43915899)

Your comment about overclocking the Richland and not the Haswell is great and all, except for the fact that the Haswell chip is defective by design as far as thermal transfer is concerned. Haswell still uses the same heat spreader design as Ivy Bridge, so you have raw silicon - thermal paste - heat spreader - thermal paste - heatsink, instead of raw silicon - soldered heat spreader - thermal paste - heatsink.

What does this mean? It means that you CANNOT efficiently overclock desktop Haswell processors, and that Intel specifically designed it so that you can't. If anything, when comparing Richland and Haswell components this should be taken into consideration from the start.

Here is an article that discussed the problems facing Ivy Bridge, Haswell's predecessor:
http://www.tomshardware.com/news/ivy-bridge-overclocking-high-temp,15512.html

And here is what you have to do to get reasonable temperatures out of these chips:
http://www.youtube.com/watch?v=XXs0I5kuoX4

Who in their right mind would intentionally buy a cpu that uses thermal paste under the heatspreader?

Re:Fascinating misues of adjectives there! (1)

0123456 (636235) | about a year ago | (#43916253)

Who in their right mind would intentionally buy a cpu that uses thermal paste under the heatspreader?

The 99.9% of PC users who don't overclock them?

Re:Fascinating misues of adjectives there! (1)

Anonymous Coward | about a year ago | (#43916905)

Even if you're not overclocking, you can expect a shorter service life from the parts.

Funny thing about the overclockers - they're good at showing up design flaws that can affect the rest of us. The on-the-fly overclocking most modern CPUs do on their own now (Intel's 'turbo' is just a situationally triggered overclock) just makes it more relevant.

Re:Fascinating misues of adjectives there! (1)

sexconker (1179573) | about a year ago | (#43916961)

The whole heat spreader design is so stupid. Instead of a thin aluminum (or whatever) cap, why not make it a thick copper block with fins and fan mounting points, and attach that directly to the core right at the factory? You'd get much better results.

Yeah, then you have a larger and less flexible design that OEMs have to deal with. I say fuck em.

Re:Fascinating misues of adjectives there! (1)

Kongming (448396) | about a year ago | (#43917453)

My understanding is that while copper has higher contact heat conductance than aluminum, aluminum has higher conductance with air. Hence (in addition to how much lighter aluminum is), the large number of heatsinks that have a copper core and aluminum fins.

Re:Fascinating misues of adjectives there! (5, Interesting)

JDG1980 (2438906) | about a year ago | (#43916073)

Richland's GPU is at best about 20% faster than the intentionally-midrange HD-4600 GPU in Haswell. Add in any form of desktop GPU, including midrange models from 2011, and Haswell wins by a landslide.

Yes, if you buy a $250-$350 CPU and then add a $100 video card, it will outperform a $150 all-in-one unit. No shit.

On the CPU side, I recall seeing a delightfully hilarious graph where a 6800K overclocked to 5GHz had exactly half the score of a (stock-clocked) 4770K. Before we get to the usual "But AMD is cheap!" argument: when you take into account the $150 price of the 6800K and the $350 price of the 4770K, AMD only wins on price/performance if you intentionally buy the most expensive Haswell model available and intentionally don't overclock it, while also overclocking the crap out of the 6800K.

You're looking at this from an enthusiast perspective. But if I'm building a system for someone who mostly does web surfing, Office, and occasionally some light gaming like WoW and The Sims, then an AMD APU starts to look a lot better from a price/performance perspective. You assume that as long as the performance per dollar stays high, the buyer is willing to spend as much as necessary, but that's simply not true for most users. Probably 90% of users will never even hit the maximum limit of an A10-6800K, so for these people, Haswell is overkill.

Small form factor, etc (2)

phorm (591458) | about a year ago | (#43916951)

Also, consider things like smaller form factor cases, or even laptops. In many cases where space is a concern, a decent mobile APU is better than a CPU+GPU.

In other situations, well, good enough is good enough. I'm building a small luggable (basically a suitcase PC) for LAN parties, to replace the Shuttle I previously dragged around.
Some people show up with *huge* Antec cases and dual CPUs, capable of playing [latest shooter] at >1080p at super-high detail, and then we end up playing Starcraft 2, DOTA, and Left 4 Dead 2, possibly BF3... which worked just as well on an older dual/quad-core AMD with a cheap GPU.
An upgrade to an APU would be more than enough for most of our needs.

Re:Fascinating misues of adjectives there! (1)

Kjella (173770) | about a year ago | (#43916995)

You're looking at this from an enthusiast perspective. But if I'm building a system for someone who mostly does web surfing, Office, and occasionally some light gaming like WoW and The Sims, then an AMD APU starts to look a lot better from a price/performance perspective. You assume that as long as the performance per dollar stays high, the buyer is willing to spend as much as necessary, but that's simply not true for most users. Probably 90% of users will never even hit the maximum limit of an A10-6800K, so for these people, Haswell is overkill.

For gaming the graphics part is most important, but otherwise as a general rule light usage is poorly threaded and heavy usage well threaded. Often the "snappiness" of the computer is based on the performance of a single thread. So for the non-gamer I'd go with high single thread performance, for the gamer I'd suggest a discrete card but for the right level of casual gamer I guess an APU is what serves them best.
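
The "snappiness tracks single-thread speed" point is Amdahl's law in miniature. A sketch (the parallel fractions below are made-up illustrations, not measurements):

```python
# Amdahl's law: if only a fraction p of a task can run in parallel,
# the speedup on n cores is 1 / ((1 - p) + p / n).

def amdahl_speedup(p, cores):
    return 1 / ((1 - p) + p / cores)

light_usage = amdahl_speedup(0.2, 4)  # lightly threaded: ~1.18x on 4 cores
heavy_usage = amdahl_speedup(0.9, 4)  # well threaded:    ~3.08x on 4 cores
# For the lightly threaded case, per-core speed matters far more than
# core count - hence the advice to favor single-thread performance.
```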

Re:Fascinating misues of adjectives there! (0)

Anonymous Coward | about a year ago | (#43917055)

This is a little disingenuous. A $65 Pentium G2020 + a $50-60 Radeon 6670 will outperform a $130 A10-6800 in nearly everything (some highly threaded CPU workloads might slightly favor the 6800).

The space advantage of needing no PCIe card, on the other hand, is indeed unbeatable, and AMD should trumpet it more with ITX builds.

Re:Fascinating misues of adjectives there! (0)

Anonymous Coward | about a year ago | (#43916141)

Just to point out: "Add in any form of desktop GPU"... The point of the APUs is that you don't. These are "regular-user parts" meant for things like tablets, laptops, and budget computers.

Also, you claim yourself that Richland gets 20% better performance than midrange Haswell. Correct me if I'm wrong, but isn't midrange Haswell = Richland prices? Aren't you getting 20% better performance per dollar then?

Re:Fascinating misues of adjectives there! (4, Informative)

robthebloke (1308483) | about a year ago | (#43916155)

Richland's GPU is at best about 20% faster than the intentionally-midrange HD-4600 GPU in Haswell.

Yes, but what OpenGL features does the Haswell APU have compared to the full GL 4.3 found in the AMD version? How good are the Intel drivers? How many textures can I bind at once? What anti-aliasing modes does it support? What are the max number of shader varying/uniform attribs? How many shader instructions can I fit within my shaders? Back in 1999, comparing raw polygon speed may have meant something, but these days it's not really as interesting as the rest of the details....

Did it ever occur to you to look it up? (2)

Sycraft-fu (314770) | about a year ago | (#43916697)

Intel provides rather extensive technical documentation of all their products. http://www.intel.com/content/www/us/en/processors/core/CoreTechnicalResources.html [intel.com] is the page with basic datasheets (basic in this case meaning a couple hundred pages, their more detailed ones are a thousand). If you truly are as interested in the technical details as you pretend, then go look them up.

However if you are just throwing out technical shit in an attempt to deflect the argument then knock it off. Particularly since much of what you are asking for are the kind of the things that would be of concern for high end dedicated GPUs for particular applications, not for an integrated controller for general use.

For most people, what matters is how fast it is at running the programs they want to use, like games. All the other stuff is for, as Tam McGleish would say "Specy wanks who get excited about fuckin' GPU clock speeds and hardware tessellation and all that shite folk who are actually interested in playing games dunnie give a stuff about." It's all well and good, and matters for certain markets and applications, but those markets are generally not the ones using an integrated GPU. Most people just care how fast it runs their stuff.

Re:Did it ever occur to you to look it up? (1)

cheesybagel (670288) | about a year ago | (#43916835)

If we weren't interested in specs we would be using a tablet.

Re:Did it ever occur to you to look it up? (5, Informative)

robthebloke (1308483) | about a year ago | (#43917219)

Intel provides rather extensive technical documentation of all their products. http://www.intel.com/content/www/us/en/processors/core/CoreTechnicalResources.html [intel.com] is the page with basic datasheets (basic in this case meaning a couple hundred pages, their more detailed ones are a thousand). If you truly are as interested in the technical details as you pretend, then go look them up.

I've had a look through, but apart from saying "it has 20 execution units", it doesn't really mention any specific figures (for the actually useful information). It does however state that it's OpenGL4.0, which is a little disappointing (a step up from 3.2, but it's still lagging behind AMD & NVidia).

However if you are just throwing out technical shit in an attempt to deflect the argument then knock it off. Particularly since much of what you are asking for are the kind of the things that would be of concern for high end dedicated GPUs for particular applications, not for an integrated controller for general use.

Well, I'm a graphics engineer in the games industry by trade, so I guess you could say I have a passing interest. The things I am asking for, are things that can help improve the performance of the products I work on. Now you might not find this stuff particularly interesting, however I do. So as a very simple example, I have an order-independent-transparency pass to handle pixel perfect transparency. On the current integrated AMD GPU, I can basically pick between any number of algorithms to achieve this (weighted average, dual depth peeling, etc, etc). Now, which one I choose, is going to be largely affected by what GPU resources I need to use for other things, and this includes: memory, the max number of shader attribs, the max number of bindable texture units, etc; but in general, I have resources to spare, so I am free to pick and choose.
The problem with Intel APUs in the past is that whilst the last generation may have implemented OpenGL 3.2 to the letter, the max attrib counts and shader instruction limits were significantly lower than the AMD/Nvidia equivalents. This means you typically have to insert an Intel-only codepath, where you will either just rip out the nice stuff, or you'll end up using a much slower multipass technique. As a result, making frame-rate comparisons in any game is most likely to be meaningless (since there is a good chance they are running a simplified codepath for Intel).

It's all well and good, and matters for certain markets and applications, but those markets are generally not the ones using an integrated GPU. Most people just care how fast it runs their stuff.

Yes, and no. It's very true that most people just want their stuff to run quickly. However, to say that the legions of people out there running low-powered ultrabooks and cheap generic laptops don't care about this stuff is complete and total bullshit. You might imagine that all gamers have £3000 desktop rigs with all the trimmings, but the reality is in fact very different. If I can spend a few months optimising the graphics routines to run a game smoothly at 720p on an Intel APU, then the market sector into which we can sell our product has more or less tripled. Even if you don't go to the effort, you will probably be forced into making those optimisations anyway. Honestly, you would be surprised at just how many people ignore the minimum system requirements on a game, and simply assume their "i3 Dell laptop is brand new, so it should play the latest games". What are you going to do? Refund half of your sales? Or fix it? If you see sense, you'll fix it, and then most of your users will have the luxury of being able to ask how quickly it runs....

Re:Fascinating misues of adjectives there! (1)

WilyCoder (736280) | about a year ago | (#43916769)

They provide GL 4.3 support in the APU driver? Nice, I might buy one now.

Re:Fascinating misues of adjectives there! (1)

Cajun Hell (725246) | about a year ago | (#43916683)

Add in any form of desktop GPU, including midrange models from 2011, and Haswell wins by a landslide.

The point of both series of products is that you don't add anything - you use the on-CPU graphics. If you are using a graphics card instead of the integrated graphics, then neither Haswell nor Richland is of interest to you. You have a Sandy Bridge-E or a non-APU AMD equivalent, which you're using along with the graphics card.

Re:Fascinating misues of adjectives there! (1)

X0563511 (793323) | about a year ago | (#43917467)

Also, P.S.

APU is taken. [wikipedia.org] They need to come up with a different acronym. They can't toss the "well that's not computing!" card either, because it's still taken. [wikipedia.org]

Still a step behind Intel (0)

Anonymous Coward | about a year ago | (#43915673)

I applaud the advancement of integrated graphics but I think Intel will beat AMD in this generation.

Intel will soon be launching Haswell parts with significantly improved integrated graphics. GPUs need access to very fast memory, and that's not something that can be provided on memory modules. The signaling tolerances mean the chip and memory have to be soldered onto the motherboard. (E.g., I'm not talking about the socket 1550 parts they launched earlier this week. Those have similar performance to the last gen.)

Intel has soldered-on-motherboard desktop and laptop parts coming that have access to high-speed GDDR memory, and that alone will make them faster than any current AMD APU. Intel even has a high-end product that will have 128MB of very fast chip-on-package eDRAM.
Funnily enough, this is a similar solution to the AMD chip that will be in the Xbox One (chip-on-package eDRAM). When will AMD bring that tech to their PC part line?

Re:Still a step behind Intel (3, Informative)

hedwards (940851) | about a year ago | (#43915759)

Intel can get away with solder in components because they change the socket type so often that people are unlikely to be able to upgrade the processor anyways. AMD OTOH, has a tradition of not forcing you to do that every single time you upgrade.

Personally, I refuse to buy Intel parts, and quite frankly, the way I use my computer, I don't need the overpriced solutions that Intel is pushing.

Re:Still a step behind Intel (3, Insightful)

Trepidity (597) | about a year ago | (#43916101)

In practice, how often do people upgrade a CPU in the same mobo these days anyway? Even in server settings it's not that common; it's more common to buy a CPU/mobo package, and keep it until it's time to replace both.

Re:Still a step behind Intel (1)

Rinikusu (28164) | about a year ago | (#43916349)

Can't speak for everyone else, but I "upgraded" the APU in my HP laptop to an A8 from an A4 last year, and if the new FS sockets were still backwards compatible, I'd have upgraded again (and it still irritates me that they're not). The computer is fine... keyboard, LCD, drive, etc.

Re:Still a step behind Intel (1)

drinkypoo (153816) | about a year ago | (#43916523)

In practice, how often do people upgrade a CPU in the same mobo these days anyway?

Which people? I upgraded my X3 720 to an X6 1045T. Almost the same base clock, it overclocks itself to the same speed I was able to get out of my 720 when I am running few threads so I don't even lose single-thread performance, cost me $120 shipped and taxed... used. And my 720 cost me only $110 shipped and taxed, new. I did this because I could. And I went AM3 in the first place with the expectation that I'd be able to do this.

Re:Still a step behind Intel (2)

Kjella (173770) | about a year ago | (#43916855)

Well there's three things:
1) Ability to upgrade
2) Ability to mix/match motherboard/CPU
3) Replacement cost if it fails out of warranty

On the other hand, if you buy a new motherboard/CPU combo you have a working old machine to sell or re-purpose, while if you upgrade just the CPU, a low-end CPU with no motherboard will usually be a complete write-off. The BGA package is cheaper, which might offset the lack of choice, and most of the functionality is now on the processor or chipset anyway. The repair cost is pretty real, but where you could find a cheap motherboard or CPU to repair with in the past, now you'll be looking for a cheap combo instead.

Remember that PCs overall are seeing a slump, desktops have long been in decline, and non-OEM desktops are a small part of the desktop market. People are not whining about this on laptops, which by far outship desktops, nor on smartphones or tablets, so most of the market is already used to this being one piece of hardware. If anything, maybe you'll see a small revival of the expansion card market, where you get "just" the standard CPU/chipset features on the motherboard and the rest as extras on daughterboards. Overall, a lot of drama and not that much reality...

Re:Still a step behind Intel (1)

beltsbear (2489652) | about a year ago | (#43916921)

With Intel it is less likely to be a possibility. I have built low-priced machines for family and friends with AM3 dual-core CPUs; all of them are upgradable to six cores now (and some were already upgraded to four cores when those were cheap).

Re:Still a step behind Intel (1)

KGIII (973947) | about a year ago | (#43916969)

I am not sure but it looks like you're confusing what you do for what everybody else does. I really don't know but, well... I upgrade CPUs sometimes. It doesn't help that I have a number of fairly new PCs and am always tweaking and poking. But, yeah, I buy new CPUs, update RAM, upgrade GPUs, etc...

Re:Still a step behind Intel (0)

Anonymous Coward | about a year ago | (#43916153)

This used to be a nice thing about a decade ago but I really don't think it applies today. New CPUs are so fast that I don't get rid of them for a long time. By the time you want to upgrade, you really want a new motherboard too because other technologies have jumped ahead too. Furthermore, they keep pulling formerly motherboard-only features in to the CPU package itself.

AMD's solution isn't perfect either. You really have to check to make sure that your motherboard will support a new CPU. Even if it's compatible in theory, it may not be in practice. You nearly always need a BIOS update too. I've known people who have been bitten by this before.

Make sure you upgrade your BIOS before you swap! Also, what if you're building a new system and your board doesn't support the new CPU without a BIOS update, and you don't have an old CPU to boot the board with first?

solder in kills MB choice so you may not be able t (2)

Joe_Dragon (2206452) | about a year ago | (#43916405)

Soldered-in CPUs kill motherboard choice, so you may not be able to get a board with what you need.

Say you need a board with lots of slots but not so much in the CPU; sorry, the boards with lots of slots only come with the high-end CPUs.

Need a lot of CPU power but not all the overclocking features and other extras found on higher-end motherboards? No, we don't have mid-range or lower boards with the fast CPUs.

Re:Still a step behind Intel (0)

Anonymous Coward | about a year ago | (#43916889)

Intel can get away with solder in components because they change the socket type so often that people are unlikely to be able to upgrade the processor anyways. AMD OTOH, has a tradition of not forcing you to do that every single time you upgrade.

Personally, I refuse to buy Intel parts, and quite frankly, the way I use my computer, I don't need the overpriced solutions that Intel is pushing.

Ah yes, the AMD battlecry: "I don't have to upgrade my mobo every time I want a new CPU." It's the equivalent of Apple users crying that Macs can't get a virus.

That's true, you don't have to upgrade your mobo every time. But when you're using inferior CPUs, what does it really matter? For a decade now, everything new AMD puts out has been trounced by Intel's mid-range offerings, sometimes mid-range offerings from the previous year. This is another such case, since Intel has released its new line of CPUs that once again runs over AMD's new line. And sorry to say, the cost is about the same. It's not how many GHz you push, it's about the architecture. AMD has never learned this, and it goes back to the days when they released the 1GHz Athlon and touted its speed, yet it was still getting ruined by Intel with a slower-clocked CPU.

And it's great that you aren't upgrading your mobo every time, but what good does it do to have the latest CPU with a chipset that is 5 years behind it? If you're going to spend several hundred dollars on a new CPU, you might as well spend another 100 and get a new mobo with the newest tech in it. Otherwise you might as well buy an old CRT monitor and swap the screen with an LCD.

Whatever AMD offers, Intel has something for nearly the same cost that runs better, runs cooler, and is built like a bomb shelter.

Re:Still a step behind Intel (0)

Anonymous Coward | about a year ago | (#43917129)

That's not true and you know it.

AMD has, for most of that period, been the party that actually bothered to innovate. We're using their 64-bit architecture, not Intel's. AMD was the first to have real dual-core processors; Intel had to catch up, with inferior technology. AMD beat them to the punch with their APUs as well.

The only things that AMD doesn't seem to be able to do are compete at the high end and make enough chips to keep up with demand.

Re:Still a step behind Intel (1)

Kjella (173770) | about a year ago | (#43915907)

Haswell has launched. They could have made a socket for the R-series, but it couldn't have been the same socket as the other processors. The socket is LGA1150, not 1550. No Intel part uses GDDR; it's all eDRAM + system memory (DDR3). I guess we'll see when Intel releases their sub-$300 line; so far it's only been the top models on display. Personally I ordered an i7-4665T for a fanless build; it looks to pack an awful lot of power into a 35W TDP.

Re:Still a step behind Intel (1)

spire3661 (1038968) | about a year ago | (#43916429)

Where can you order Tray (T) processors?

Re:Still a step behind Intel (1)

JDG1980 (2438906) | about a year ago | (#43916027)

Funnily enough, this is a similar solution to the AMD chip that will be in the Xbox One (chip-on-package eDRAM). When will AMD bring that tech to their PC part line?

In the second half of 2013 [extremetech.com].

Re:Still a step behind Intel (0)

Anonymous Coward | about a year ago | (#43916339)

So you're claiming that AMD's current (as in, available now) offering are a step behind something Intel will come out with in the future?

Shocking.

Re:Still a step behind Intel (0)

Anonymous Coward | about a year ago | (#43916645)

I applaud the advancement of integrated graphics

Me too. I don't want to go back to the days where black graphics and white graphics had to drink from different water fountains.

I love my AMD (4, Interesting)

WOOFYGOOFY (1334993) | about a year ago | (#43915713)

Bulldozer 8150. It rocks the house. Still headroom for an 8350 without having to change platforms. Thanks, AMD! 189 bucks. Can't touch it for the price. Highly recommended.

Re:I love my AMD (2, Funny)

CajunArson (465943) | about a year ago | (#43915797)

Yeah! After waiting 4 months after Bulldozer launched to get that $189 price, and now waiting another 8 months after Piledriver launched to get the current $180 price, you got almost-as-good-as-Intel-in-a-couple-of-synthetic-benchmarks performance for the low low price of $369 in 2013!!!!

Those blubbering morons who bought the 2600K in 2011 for $350 are stuck with outdated crap that will finally be eclipsed when steamroller launches next year*! What a ripoff!

* Assuming that they spent extra for the K-series part and never bothered to overclock it for some strange reason that is...

Re:I love my AMD (-1)

Anonymous Coward | about a year ago | (#43915947)

You are such an Intel shill it's fucking pathetic. Are they paying you off too, like Dell? Fucking dork.

Re:I love my AMD (-1)

Anonymous Coward | about a year ago | (#43916151)

Someone has to pay for the INTEL INSIDE shit you see all over...

And the giant marketing campaign you see on tv, and in print everywhere...

Guess who that is... The end user.

Re:I love my AMD (5, Funny)

Anonymous Coward | about a year ago | (#43916377)

You are such an Intel shill it's fucking pathetic.

Is that what that was? I hope CajunArson isn't getting paid to shill, that post was so poorly written I honestly can't tell whether it's meant to be anti-Intel or anti-AMD. My money's on both: he's secretly using a VIA processor.

Re:I love my AMD (2)

cheesybagel (670288) | about a year ago | (#43916605)

I have the Piledriver and it is fast enough for my needs. Even in the area where it is supposedly weaker, FP computation, I can run real-time ray-tracing benchmarks at 1/3 to 1/5 the speed of a 1 Tflops GPU.

Re:I love my AMD (4, Insightful)

gman003 (1693318) | about a year ago | (#43915929)

Depends heavily on use, though.

Intel's been focusing on single-thread performance and power efficiency - Haswell basically did nothing for performance, giving a few percentage points of improvement, but dropped the power consumption down to the point that putting it in a tablet actually makes sense. Idle power was a particular focus.

AMD's been focused more on multi-threaded performance, cramming a ton of cores onto one chip. In some cases that works well, but in others they suffer heavily. They also focused on integer, not floating-point, performance. Sadly, even when playing to AMD's strengths, Intel's process node advantage (and compiler advantage, oftentimes) lets them at least keep pace.

I will agree that AMD has been much better at socket compatibility. My 2006 Intel motherboard is now three sockets out of date, while my similar-age AMD board would probably work with a current Bulldozer. And AMD's pricing, thankfully, reflects their performance. I might be getting one of the Richland chips for a low-cost SFF build I'm planning.

Re:I love my AMD (1)

baka_toroi (1194359) | about a year ago | (#43917473)

while my similar-age AMD board would probably work with a current Bulldozer.

Yeah, no. That whole "socket compatibility" thing is long gone in AMD. Socket FM1 was released in 2011, FM2 in 2012, probably FM3 this year.

Compare that with Intel's sockets 1156 (2009), 1155 (2011), 1150 (2013).

Sure, AM3 is better, let's see how it works out with their next AM3+ release.

Benchmarks vs. Business PCs (3, Interesting)

intermodal (534361) | about a year ago | (#43915761)

I'm trying to figure out right now whether office PCs will see the difference between AMD and Intel. It seems like as long as you install plenty of RAM, pretty much anything should handle a moderately multitasking business PC for at least a few years. I keep seeing posts of Intel vs AMD benchmarks, but even with the benchmarks being what they are, how much difference will a nontechnical end user really notice in an office environment? I run an AMD A8 quad core laptop at home, but it runs Linux and does just fine. I don't want to judge Windows performance based on my experience with Linux though.

Re:Benchmarks vs. Business PCs (-1, Troll)

0123456 (636235) | about a year ago | (#43916175)

As I see it:

If you care about performance, you buy Intel.
If you care about power consumption, you buy Intel.
If you're cheap and don't care about power consumption and want to play games on really low graphics settings, you buy AMD.

business users won't notice (2)

Chirs (87576) | about a year ago | (#43917233)

I have a 2yr old core i3 laptop that runs office apps just fine. It'll do high def streaming just fine too. "Regular" office stuff just isn't all that strenuous.

There are scenarios where you would see a difference, but they tend to be more technical users...video editing or transcoding, source code compilation, database indexing, numerical simulation, etc.

Choices, choices... (1)

Aerokii (1001189) | about a year ago | (#43915781)

Within about a month or so I'll be building a PC. Initially, when I started selecting parts, I was happy to go with an i5 from the last series, but now AMD and Intel have both released their new guns...

I'm not sure which to go with any more- still leaning towards Intel since I'll be getting a separate graphics card and I like their raw power, but at the same time, it's hard to beat the price on AMD.

Good thing I've still got a month to mull it over.

Re:Choices, choices... (1)

spire3661 (1038968) | about a year ago | (#43916451)

AMD's heat kills it for me. I would rather spend money on the processor than on extra cooling. With a stock cooler, Sandy and Ivy Bridge machines are almost silent.

Re:Choices, choices... (1)

Aerokii (1001189) | about a year ago | (#43916537)

Actually, one of the comments I read about the Ivy Bridge i5 I was looking at mentioned that the fan was less than reliable, which is one of the reasons I'm quite glad these came out when they did, since I'd love to read the reviews without (immediately) spending more on cooling.

You never had (CPU-related) cooling problems with your Intel machines? That would be quite helpful to know, actually!

Re:Choices, choices... (1)

spire3661 (1038968) | about a year ago | (#43916607)

I haven't had heat issues since I upgraded everything to Sandy Bridge or above. Currently I have an i5-2500K in a BitFenix Prodigy case, an i5-2450S in an Antec ISK 300-150, a Celeron 1610 (Ivy Bridge) in an Antec ISK-110, and an i5-2400 in a mid-tower case. All of them have stock cooling and work great.

Re:Choices, choices... (1)

Aerokii (1001189) | about a year ago | (#43916637)

Good to know, thank you! I think I'll still invest in a little extra cooling down the road for overclocking (especially if/when I decide to Crossfire), but considering how far over my original budget I've gone, if I can hold out on pushing my system for a while to save some money, I think I'll do just that.

+1 Helpful, good sir.

Re:Choices, choices... (1)

gradinaruvasile (2438470) | about a year ago | (#43916481)

I'm not sure which to go with any more- still leaning towards Intel since I'll be getting a separate graphics card and I like their raw power, but at the same time, it's hard to beat the price on AMD. Good thing I've still got a month to mull it over.

You can get a cheap quad-core AMD: the Athlon chips for the FM2 socket come with a disabled GPU and a low price (70 euros or maybe lower for a quad).

I have an A8-5500 and it's just perfect for my needs. I run Linux on it and so far it's flawless (even the maligned fglrx driver runs perfectly). The GPU in it handles everything I need, and it runs cool and quiet with its DEFAULT heatsink (which is small).

Anyway, I find all these benchmark wars a bit like pissing contests since, as PC sales also suggest, the current gen of anything (hell, even Core 2 Duos/Quads) is good enough for just about everything most people do on computers (even tablets are enough for some). Interestingly, Intel, after touting their superior CPUs and dismissing AMD's GPU-centric solutions, have come to follow their lead with improvements not so much in CPU as in graphics and power management. Also, those big GPU improvements will most likely end up in the top-tier i7s that nobody buys for their GPU.

Re:Choices, choices... (1)

Aerokii (1001189) | about a year ago | (#43916599)

The main thing is that I'm hoping to run games on this, so I need something with at least a bit of power. I know most of the cheaper parts out there can handle most anything that's put out these days, but I'm hoping to future-proof myself a little, or rather as much as is possible on my budget, since I'm getting this instead of any next-gen consoles. AMD's price is very enticing, and the out-of-the-box clock speeds look pretty impressive, but it seems like the major selling point of this CPU over Intel, aside from price, is graphics better than Intel's integrated.

But in the end... I basically agree with your last paragraph. It's all a pissing contest and either one will probably work just fine for me, but given my current lack of any good D&D campaigns, I'm jonesing for a bit of min-maxing. At this point I could probably decide with a coin toss and still be perfectly content. Either way, thanks for your input! It's given me plenty to consider.

Just the facts (1)

dicobalt (1536225) | about a year ago | (#43915933)

A10-6800K GPU cores: 384
Xbox One GPU cores: 768
PS4 GPU cores: 1152

Re:Just the facts (2)

spire3661 (1038968) | about a year ago | (#43916471)

The new consoles also have hUMA, which is a big step forward.

You're a console fanboy, aren't you? (1)

gman003 (1693318) | about a year ago | (#43917259)

Here, have some more facts:
Radeon 7870GE GPU Cores: 1280
Radeon 7950 GPU Cores: 1792
Radeon 7970GE GPU Cores: 2048
Radeon 7990 GPU Cores: 4096

Oh, and don't forget the clock speeds. The A10 and PS4 (and probably the Xb1) run at 800MHz, while many of the discrete cards run at 1GHz (only the 7950 runs lower, at 850MHz).
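Core counts alone don't tell the whole story, but as a rough back-of-the-envelope sketch (my own numbers, not from the thread; the Xbox One clock is an assumption), peak FP32 throughput for these Radeon-style parts scales as cores × clock × 2 flops per multiply-add:

```python
# Back-of-the-envelope peak FP32 throughput: each shader core can issue
# one multiply-add (2 flops) per clock, so peak GFLOPS ~= cores * GHz * 2.
# Real performance also hinges on memory bandwidth, which is where APUs
# sharing DDR3 with the CPU fall behind discrete cards with GDDR5.
def peak_gflops(cores: int, clock_ghz: float) -> float:
    return cores * clock_ghz * 2

parts = {
    "A10-6800K":     (384,  0.844),
    "Xbox One":      (768,  0.800),   # clock assumed, pre-release
    "PS4":           (1152, 0.800),
    "Radeon 7970GE": (2048, 1.000),
}
for name, (cores, ghz) in parts.items():
    print(f"{name}: ~{peak_gflops(cores, ghz):,.0f} GFLOPS")
```

By this crude measure the A10 sits around 650 GFLOPS against roughly 4 TFLOPS for a top discrete card, which matches the "fanboy" point above: the APUs compete with Intel's iGPUs, not with mid-range or high-end graphics cards.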

The important thing to note here (0)

Anonymous Coward | about a year ago | (#43916221)

...is how fast each family of integrated graphics is improving. This chip is a nice little linear bump in AMD's APU power. Intel's iGPU power is increasing exponentially, and their lithography advantage is about to widen even more with Broadwell; 16nm seems to have three times the transistors per die area of 28nm, which AMD will be moving to soon. Intel is still figuring out graphics and scaling them up, but if they throw in three times the transistors, and they could, I think it's fair to say Broadwell's iGPU will easily outclass this, as well as AMD's future offerings. Hell, I think the Intel Iris Pro 5200 will at least match it within a few months.
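The "three times the transistors" figure is at least consistent with idealized area scaling, where density goes as the inverse square of the feature size; a quick sanity check (idealized math only, real process nodes deviate from this):

```python
# Idealized transistor-density scaling: density goes as the inverse square
# of the feature size, so a 28nm -> 16nm shrink fits ~(28/16)^2 = 3.06x
# as many transistors in the same die area. Real process nodes deviate
# from this ideal, so treat it as a ballpark figure.
def density_gain(old_nm: float, new_nm: float) -> float:
    return (old_nm / new_nm) ** 2

print(f"28nm -> 16nm: {density_gain(28, 16):.2f}x density")  # 3.06x
print(f"22nm -> 14nm: {density_gain(22, 14):.2f}x density")  # Intel's Haswell -> Broadwell step
```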

Re:The important thing to note here (0)

Anonymous Coward | about a year ago | (#43916309)

The Iris Pro already likely beats the 6800K by a bit, but so far only laptop chips equipped with Iris Pro have been benchmarked, and the gap Intel opened over the 5800K was bigger than its gap over the 6800K. We'll see what happens when they decide to make a desktop version equipped with Crystalwell and Iris.

never heard of APU before (0)

Anonymous Coward | about a year ago | (#43917191)

I've heard of a CPU, GPU, and FPU, but not an APU. Had to search for APU: Accelerated Processing Unit.

What Is An APU? [Technology Explained], http://www.makeuseof.com/tag/apu-technology-explained/

Wonder what kind of heatsink the 4.4 gigahertz processor would need.
