
AMD's Kaveri APU Debuts With GCN-based Radeon Graphics

Soulskill posted about 9 months ago | from the onward-and-upward dept.

crookedvulture writes "AMD's next-generation Kaveri APU is now available, and the first reviews have hit the web. The chip combines updated Steamroller CPU cores with integrated graphics based on the latest Radeon graphics cards. It's also infused with a dedicated TrueAudio DSP, a faster memory interface, and several features that fall under AMD's Heterogeneous System Architecture for mixed-mode computing. As expected, the APU's graphics performance is excellent; even the entry level, $119 A8-6700 is capable of playing Battlefield 4 at 1080p with medium detail settings. But the powerful GPU doesn't always translate to superior performance in OpenCL-accelerated applications, where comparable Intel chips are very competitive. Intel still has an advantage in power efficiency and raw CPU performance, too. Kaveri's CPU cores are certainly an improvement over the previous generation of Richland chips, but they can't match the per-thread throughput of Intel's rival Haswell CPU. In the end, Kaveri's appeal largely rests on whether the integrated graphics are fast enough for your needs. Serious gamers are better off with discrete GPUs, but more casual players can benefit from the extra Radeon horsepower. Eventually, HSA-enabled applications may benefit, as well."

Integrated graphics is shit. (-1, Flamebait)

Anonymous Coward | about 9 months ago | (#45957313)

It can't even run Solitaire. Good luck trying to run any games with this CPU or Xbone or PS4.

Re:Integrated graphics is shit. (-1, Flamebait)

binarylarry (1338699) | about 9 months ago | (#45957371)

No, this is worse than integrated. These use AMD drivers.

What GCN stood for before Graphics Core Next (2)

tepples (727027) | about 9 months ago | (#45957405)

I had a game console with AMD GCN graphics (the "Flipper" GPU) back in 2001. I played Super Smash Bros. Melee on it.

Re:What GCN stood for before Graphics Core Next (1)

BLToday (1777712) | about 9 months ago | (#45957871)

I love that game. Basically the only reason I bought a Gamecube and 4 controllers.

Re:What GCN stood for before Graphics Core Next (1)

Anonymous Coward | about 9 months ago | (#45958231)

The legendary (for its time) Radeon 9700 Pro, the card that kept ATi from going the way of 3dFX and the other failures, was GCN-based, if by GCN you mean the Gamecube.

ArtX, a startup founded by SGI refugees, designed the architecture for that chip and then got bought out by ATi partway through the console's development. The architecture then became the R300, which hilariously outperformed not only the contemporary nVidia GeForce 4 series, but also nVidia's follow-up "GeForce FX". ATi held both the performance and price/performance crown for three solid years. If it hadn't been for that success, ATi would have gone broke in short order; the previous two generations of Radeons were unimpressive and the company was not doing particularly well financially.

No idea what happened to the team that was developing GPUs over there before the ArtX buyout.

Re: What GCN stood for before Graphics Core Next (0)

Anonymous Coward | about 9 months ago | (#45958789)

The teams were broken up; a bunch of them are at Apple, Nvidia, Qualcomm, Samsung, Synaptics... Some remain. It's really not like whole teams moved, just individual engineers, architects, managers...

Re:What GCN stood for before Graphics Core Next (-1)

Anonymous Coward | about 9 months ago | (#45962449)

Nintendo GameCube not GameCube Nintendo, you fucking moron. The correct initialism is NGC.

How about competition on price? (2)

SargentDU (1161355) | about 9 months ago | (#45957403)

The summary did not state what the prices are. Are they cheaper to buy than the Intel chips they are being compared with?

How about reading TFA? (0)

Anonymous Coward | about 9 months ago | (#45957565)

Prices are on the first page of the Anandtech review.

Re:How about competition on price? (0)

TheRealMindChild (743925) | about 9 months ago | (#45957567)

By at least $200. That doesn't include the difference in price between an AMD APU-socket motherboard and an Intel-socket motherboard.

Re:How about competition on price? (2, Insightful)

Anonymous Coward | about 9 months ago | (#45957597)

It's $200 cheaper than an i3 4330? That's pretty impressive given that the i3 is $130, are AMD going to refund me $70 for buying their CPU?

Re:How about competition on price? (1)

lister king of smeg (2481612) | about 9 months ago | (#45957841)

It's $200 cheaper than an i3 4330? That's pretty impressive given that the i3 is $130, are AMD going to refund me $70 for buying their CPU?

If so, I think I'm going to buy me enough CPUs to retire early.

Re:How about competition on price? (1, Troll)

higuita (129722) | about 9 months ago | (#45960519)

And the graphics card is free for you?

Re:How about competition on price? (0)

Anonymous Coward | about 9 months ago | (#45960725)

Getting a $70 graphics card will likely give you more performance than the APU.

Re:How about competition on price? (0)

Anonymous Coward | about 9 months ago | (#45960765)

That's the point of the processor line. They're better than getting a low-priced discrete GPU.

Re:How about competition on price? (0)

Anonymous Coward | about 9 months ago | (#45962479)

It's included in the CPU, idiot. The Core i3 4330 has built-in Intel HD Graphics 4600.

Re:How about competition on price? (3, Insightful)

s.petry (762400) | about 9 months ago | (#45959449)

The summary also spends a lot of time talking about how great Intel is. It makes sense that prices are not discussed because the submitter appears to be heavily biased, and price always favors AMD.

Re:How about competition on price? (4, Interesting)

hairyfeet (841228) | about 9 months ago | (#45962801)

Oh it wasn't just TFA, look above and below you and see how every single post that said anything positive about AMD was downmodded. Not just one, or two, EVERY SINGLE ONE. If that doesn't prove that the mod system is completely broken here? Then honestly I don't know what does.

But watch how quickly they burn this...AMD has the "bang for the buck" sewn up; nowhere in the Intel camp can you get a quad CPU with a graphics chip capable of BF4 for less than triple that, nowhere at all.

Oh and before the fanboys trot out any benchmarks? Might help you to know they are as rigged as "quack.exe" was back in the day, as Intel's compilers put out crippled code [theinquirer.net] that is 100% IMPOSSIBLE to disable, and guess what compiler is used by most if not all the major benchmark suites? You guessed it. Try running a real-world test with programs compiled with GCC or even AMD's compiler (as AMD doesn't "return the favor" and rig their compiler; in fact they hand out the code so you can see what it does for yourself) and you'll find nearly all the tests come within less than 20% of each other, and the only chip where they manage to pull away to a whole 30%? The top o' the line i7. A 300% price increase for less than 30% real-world performance difference...sorry, but the bang for the buck is still with big red.

Re:How about competition on price? (2, Insightful)

hairyfeet (841228) | about 9 months ago | (#45960185)

Actually they are usually on the order of HALF what the Intel chips cost; for example, an AMD quad will run you around $89-$99 whereas the entry quad from Intel runs right at $200. From the looks of it the quad A8-7600 is running at $120, which, for a quad with decent graphics performance? Is a steal.

This is why I still build and sell exclusively AMD units, as the "bang for the buck" just can't be beat. I have a friend that mainly plays older flight sims with a few mainstream titles, and when he comes by the shop next month to get a kit? If I can find this to put in a decent kit, I'll be happy to recommend it. After all, if it can play BF4 at 1080p it'll have no problem playing his games on his 720p set while still giving him plenty for his video streaming and office apps.

Of course the dirty little secret that neither AMD nor Intel want to talk about is that if your PC is less than 7 years old it's probably overpowered for what you do IF you are Joe and Jane Average. After all, 7 years ago I was selling Phenom I quads with 4GB of RAM and 400GB HDDs, and for Joe and Jane Average? That unit will spend most of its life idling because they simply can't come up with enough useful work to max out the cores. Heck, even we gamers don't have to upgrade like we used to; my two boys and I all play FPS games, yet our 4-year-old AMD X6s and the youngest's X4 have no problem playing the latest games when paired with an HD7750 or HD7790. Of course we aren't trying to play BF4 on 4K widescreens, but ya know what? Most desktops here are 1600x900, a few 1080p, and at those resolutions it plays the latest games just fine.

But for those people with the first gen Athlon X2s or even worse, the Pentium D like my friend has? I would have ZERO problem recommending this chip, it'll give you a quad CPU and decent graphics OOTB and you can always add a discrete down the line. A win/win in my book.

Re:How about competition on price? (2, Insightful)

guacamole (24270) | about 9 months ago | (#45961263)

Benchmarks show that for pure CPU-intensive tasks, the A10 APUs are roughly comparable to Haswell Core i3s (the entry-level ones, at least). The i3-4150 costs $130-140, and the last-generation A10-6800K has dropped to $130-140. The new A10-7850K is listed for $189 on Newegg. Considering this, the new A10-7850K is not very enticing at all. It's not even convincingly faster than the A10-6800K, with the current drivers at least. AMD hinted that the new A10-7850K's graphics performance would be on the level of a Radeon HD7730 or 7750 ($100-120 graphics cards), but looking at the results, it's not near that.

If you just play older games or no games at all, or if you will be buying a dedicated GPU, Core i3 and the quad-core A10-6800K seem like a good deal. If you game a lot, adding a dedicated GPU seems like the best way to go.

Re:How about competition on price? (1)

ArcadeMan (2766669) | about 9 months ago | (#45962393)

How about the Intel Pentium G3220? It's Haswell, socket 1150, low power and nearly half the price of the i3.

Re:How about competition on price? (2)

guacamole (24270) | about 9 months ago | (#45962427)

Yes. To add insult to injury, the G3220 is priced at $69 on Newegg right now. It's basically a slightly lower-clocked i3 without hyperthreading. If you don't play games or edit multimedia, then that's all you really need on an entry-level desktop. Add a $100 video card, and it will probably run games at a faster frame rate than AMD's $189 A10 Kaveri.

Re:How about competition on price? (0)

Anonymous Coward | about 9 months ago | (#45962529)

The G3220 is only $62 with free shipping on Amazon.

Capable of Playing - worthless statement (1)

locopuyo (1433631) | about 9 months ago | (#45957465)

Saying something is "capable of playing" a game at X settings is a completely pointless statement for almost all hardware because the ability to play is based almost solely on having enough memory to load all of the assets. They need to state the average frame rate it gets at whatever settings they are playing it on.
You could play BF4 on a 300 MHz processor as long as you have enough memory, it would just look like a slide show.

Re:Capable of Playing - worthless statement (4, Informative)

Baloroth (2370816) | about 9 months ago | (#45957657)

Most people, when they say "capable of playing", mean that it can actually be played on those settings, i.e. that the frame rate is high enough for the game to be considered playable. Generally, this means an average frame rate of ~30 and minimums of 20 or more (although that depends a bit on the reviewer; some people consider a frame rate of 30 totally unplayable, while personally I find anything above 20 still playable).
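
If you want to pin that down, the "average" and "minimum" numbers fall straight out of the per-frame times. A minimal sketch in C, with made-up frame times (nothing measured from Kaveri):

    #include <stddef.h>
    #include <stdio.h>

    /* Hypothetical per-frame render times in milliseconds. */
    static const double frame_ms[] = { 26.0, 30.0, 32.0, 45.0, 28.0, 33.0, 29.0, 38.0 };
    #define NFRAMES (sizeof(frame_ms) / sizeof(frame_ms[0]))

    int main(void) {
        double total_ms = 0.0, worst_ms = 0.0;
        for (size_t i = 0; i < NFRAMES; i++) {
            total_ms += frame_ms[i];
            if (frame_ms[i] > worst_ms)
                worst_ms = frame_ms[i];                /* slowest frame of the run */
        }
        double avg_fps = 1000.0 * NFRAMES / total_ms;  /* average frame rate */
        double min_fps = 1000.0 / worst_ms;            /* minimum frame rate */

        printf("average %.1f fps, minimum %.1f fps\n", avg_fps, min_fps);
        /* The rule of thumb above: average around 30+, minimum 20+. */
        printf("playable: %s\n", (avg_fps >= 30.0 && min_fps >= 20.0) ? "yes" : "no");
        return 0;
    }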

Re:Capable of Playing - worthless statement (1)

locopuyo (1433631) | about 9 months ago | (#45959663)

Except people who say "capable of playing" never mean 30 fps and are likely completely oblivious to how many frames per second the game is running at. The GPU in question definitely does not get an average of 30 fps on BF4 medium settings.

Re:Capable of Playing - worthless statement (0)

Anonymous Coward | about 9 months ago | (#45959935)

Depends on the game, of course. You see, I am building a small AMD box right now just to be able to play Rocksmith 2014 on a budget on my TV. RS2014 runs on my Intel laptop, but it gets way too hot and it's a hassle. A $40 AMD CPU on a cheap ITX board will do the trick.

Re:Capable of Playing - worthless statement (0)

Anonymous Coward | about 9 months ago | (#45960791)

All next-gen consoles are based on low-end versions of this technology. The PC versions are quite capable of playing many games. I build cheap gaming computers for people ($400-500) and they work great, especially if you are saving to add a video card later. They dominate in HTPC situations.

Re:Capable of Playing - worthless statement (1)

Confusador (1783468) | about 9 months ago | (#45962975)

Except that in this case "X settings" is 1080p30. It may be low quality otherwise, but it meets your requirement.

Looked into it for a friend's build (3, Interesting)

gman003 (1693318) | about 9 months ago | (#45957469)

I'm helping a friend with a custom, low-cost gaming machine. We'd looked into using an APU, and I looked into it again today when I saw this. The gaming performance just isn't there yet. They're fine for regular desktop use, but even the top-of-the-line one can't handle gaming.

The two things that could still be useful are GPGPU and dual graphics. Having an on-chip GPU just for compute purposes, especially with all the enhancements they've added, would be very useful if more things used GPU compute, but it just wasn't worth it for this build and this user. And they have spoken a bit about using both the integrated GPU and a discrete graphics card in tandem, similar to using two GPUs in Crossfire, but they haven't released the drivers for it, nor listed which cards will work, and the card they chose to demo it with was their bottom-end graphics card. Given all that, and that a similar CPU without the integrated graphics was about half the price, I couldn't justify getting one.

I am pretty impressed with how tightly they've integrated them, though. Much better than Intel's offerings. If they made one that had the graphics horsepower for gaming, I'd have used one.

Re:Looked into it for a friend's build (0)

Anonymous Coward | about 9 months ago | (#45957571)

You ain't got friends, you liar.

Re:Looked into it for a friend's build (0)

Anonymous Coward | about 9 months ago | (#45962501)

I think he means an imaginary friend, like "Tony", "Wilson" or "God".

Re:Looked into it for a friend's build (1)

viperidaenz (2515578) | about 9 months ago | (#45957653)

and the card they chose to demo it with was their bottom-end graphics card.

Probably because you wouldn't notice a difference if you paired a tiny integrated GPU with a powerful standalone one. The added overhead may even reduce performance.

Re:Looked into it for a friend's build (1)

jandrese (485) | about 9 months ago | (#45957695)

I was thinking the integrated GPU might be useful for PhysX calculations while the discrete GPU does the graphics.

Re:Looked into it for a friend's build (2)

FreonTrip (694097) | about 9 months ago | (#45957741)

That's not valid for AMD cards or IGPs because PhysX is Nvidia-only. If companies start using OpenCL to implement physics acceleration that could change.

Re:Looked into it for a friend's build (0)

Anonymous Coward | about 9 months ago | (#45957697)

The people that write the games always are writing for current- or next-gen discrete cards on the current titles. If you are waiting for an APU that will run current titles at full settings and keep you near the bleeding edge for a couple years, you'll be waiting forever. That's just not how things work.

Re:Looked into it for a friend's build (0)

Anonymous Coward | about 9 months ago | (#45958347)

No, they're writing for current-gen consoles, and the PC version is a port. Just wait until the new batch of consoles is about three years old and midrange PC hardware should run most games on high settings without a problem unless you're going for exotically high resolutions or user-created high-density texture mods or other stuff like that.

Re:Looked into it for a friend's build (1)

Bengie (1121981) | about 9 months ago | (#45958623)

The APU is more like what an FPU was. Are you claiming FPUs are useless?

Tech (1)

phorm (591458) | about 9 months ago | (#45957885)

The article also notes that a lot of the tech in these is new, so older games don't necessarily take advantage of it. It would be interesting to see how this looks a year from now.

Re:Tech (1)

0123456 (636235) | about 9 months ago | (#45958625)

The article also notes that a lot of the tech in these is new, so older games don't necessarily take advantage of it. It would be interesting to see how this looks a year from now.

Game developers optimize to run best on the fastest computers out there, not slow CPUs with slow integrated graphics. AMD would have to pay them to put effort into optimizing for these things.

Re:Tech (1)

SpankiMonki (3493987) | about 9 months ago | (#45960087)

I think GP might be referring to AMD's Mantle API. [wikipedia.org] Apparently Battlefield 4 [pcgamer.com] supports it.

Re:Tech (1)

phorm (591458) | about 9 months ago | (#45962797)

Good game developers make games that run at reasonable performance on the most machines possible. It sells a lot more games that way...

Re:Looked into it for a friend's build (1)

Anonymous Coward | about 9 months ago | (#45958273)

It sounds like you were actually trying to build a low-cost high-end gaming machine. That can't be done, it doesn't work like that. Look at what GPU and CPU performance AMD A-series gives you, assess if it's enough for you, and if it is then look at the price and pick up your jaw; this is the strength of the AMD A-series, good CPU and GPU performance at an amazing price in a single package.

If you want the best then you have to pay silly money for Intel and discrete graphics boards, that's just how it is.

Re:Looked into it for a friend's build (1)

Impy the Impiuos Imp (442658) | about 9 months ago | (#45958737)

I'm shocked GPUs, especially with all this integration, haven't taken over already. The whole reason Intel bought half the industry was that it became obvious a Pentium core could be tucked into a tiny corner of a 3D graphics chip, both speed- and transistor-count-wise, as GPU development, driven by the effectively unlimited appetite for cooler and more complex virtual worlds, would ever-more outstrip a general-purpose CPU.

Frankly, by now I was expecting a merged GPU/monster-FPGA-type design, with dynamic programming keeping the hardware full of things to do and a tiny CPU in the corner handling the mundane interface and housekeeping tasks that aren't worth the GPU or FPGA bits on a transistor-per-speedup basis.

Re:Looked into it for a friend's build (1)

Anonymous Coward | about 9 months ago | (#45958899)

GPU compute hasn't taken off because the APU still uses separate address spaces for the CPU and GPU bits. It is a real bitch to copy from the CPU to the GPU and then back to the CPU again. Yes, even on a single chip. Insanity!

A unified address space should fix the problem, but then memory protection needs to be handled properly. Apparently coming "Soon" from AMD, but not soon enough.
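
To make that concrete, here's roughly what the classic pre-HSA OpenCL flow looks like: a separate device buffer plus explicit copies in and out, even when CPU and GPU sit on the same die. A minimal sketch only; error checking is omitted and the sizes are arbitrary.

    /* Classic OpenCL 1.x style: data is copied into a device buffer and back,
       even on an APU where CPU and GPU share the same physical RAM. */
    #define CL_USE_DEPRECATED_OPENCL_1_2_APIS
    #include <CL/cl.h>
    #include <stdlib.h>

    int main(void) {
        const size_t n = 1 << 20;
        float *host = malloc(n * sizeof(float));
        for (size_t i = 0; i < n; i++) host[i] = (float)i;

        cl_platform_id plat; cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        /* Separate "device" allocation... */
        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE, n * sizeof(float), NULL, NULL);
        /* ...explicit CPU -> GPU copy... */
        clEnqueueWriteBuffer(q, buf, CL_TRUE, 0, n * sizeof(float), host, 0, NULL, NULL);

        /* (build and enqueue a kernel against 'buf' here) */

        /* ...and an explicit GPU -> CPU copy to get the results back. */
        clEnqueueReadBuffer(q, buf, CL_TRUE, 0, n * sizeof(float), host, 0, NULL, NULL);

        clReleaseMemObject(buf);
        clReleaseCommandQueue(q);
        clReleaseContext(ctx);
        free(host);
        return 0;
    }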

Re:WRONG (0)

Anonymous Coward | about 9 months ago | (#45960661)

RTFA again

Re:Looked into it for a friend's build (1)

Bengie (1121981) | about 9 months ago | (#45962025)

Kaveri supports protected memory and even preemptive multitasking.

Re:Looked into it for a friend's build (0)

Anonymous Coward | about 9 months ago | (#45958791)

...by the time HSA is supported by enough software to be worth it, Intel will have something similar. But, if he's not going to upgrade his motherboard in 3 years, AMD is probably a better choice.

Re:Looked into it for a friend's build (3, Informative)

theGreater (596196) | about 9 months ago | (#45958799)

And they have spoken a bit of using both the integrated GPU and a discrete graphics card in tandem, similar to using two GPUs in Crossfire, but they haven't released the drivers for it, nor listed which cards will work, and the card they chose to demo it with was their bottom-end graphics card.

That's not very truthy:
http://www.amd.com/us/products/technologies/dual-graphics/pages/dual-graphics.aspx#3 [amd.com]

Re:Looked into it for a friend's build (1)

gman003 (1693318) | about 9 months ago | (#45959023)

That link is not very truthy. Not only does it list just a single "recommended" card, rather than a list of any that are compatible, but it also has not been updated for these new GCN-based APUs. As noted in TFA (the Anandtech one, specifically), "AMD recommends testing dual graphics solutions with their 13.350 driver build, which is due out in February."

Re:Looked into it for a friend's build (1)

hairyfeet (841228) | about 9 months ago | (#45962867)

What games? Because I just built a kit using one of the APUs (this one [tigerdirect.com] if you want to check it out) and it's playing World Of Warplanes just fine. Also Bioshock I & II, TF2, Burnout Paradise, and a half a dozen more I can't recall at 3AM.

So unless you are trying to go 4K, which you shouldn't be doing with a "budget gaming rig", then I seriously doubt your friend will have a bit of trouble out of one of these APUs. And the nicest part? He can always go hybrid Crossfire or add a discrete card later, and with a price THAT low? He can add more RAM, maybe an SSD to go with the HDD, and still come in under budget. Give them another look, go type the name of the APU into YouTube and see what games real users are playing and what the framerates are; you'll probably be surprised.

One word of advice though, from someone that has an APU in his laptop...buy the fastest RAM that the board will take, because unlike a regular system, where it doesn't matter as much, with an APU it DOES matter. I went from 1066 to 1333 and it gave me a good 20% framerate increase, so get the fast RAM; it's only a couple bucks' difference anyway.

Looking forwards... (2)

serviscope_minor (664417) | about 9 months ago | (#45957527)

Really looking forward to the HSA benchmarks.

Nothing out there will really exercise these chips yet. All GPGPU code is written assuming huge latency between the CPU and GPU. With shared caches these things have nanosecond latency and should be able to bring the GPU to bear on a much wider class of algorithms.

Now it's almost always worth handing data to the GPU, since if it's in the L2 cache, it's there for the GPU as well.

It will take a while before people code to this though.

Re:Looking forwards... (1)

viperidaenz (2515578) | about 9 months ago | (#45957675)

So you mean kind of like what the Intel chips already do?

Re:Looking forwards... (0)

Bengie (1121981) | about 9 months ago | (#45958655)

Intel's IGP is not the same. It's true that the CPU and GPU also share the L4 cache, but it is not the same address space. This means data will be duplicated, as it needs to be copied from the CPU address space to the GPU address space, even if they share the same physical memory.

Re:Looking forwards... (1)

viperidaenz (2515578) | about 9 months ago | (#45959279)

Sort of like "Intel InstantAccess", which allows the CPU to directly access GPU memory space?

It was a driver limitation, not a hardware one.

Re:Looking forwards... (2, Insightful)

Bengie (1121981) | about 9 months ago | (#45960681)

Except "Intel InstantAccess" requires making system calls to allow the kernel to map GPU memory into user space. AMD's HSA requires nothing special at all. The GPU understands and honors protected mode, so you can arbitrarily pass pointers to and from the GPU with no system calls. You can even communicate between the GPU and CPU without system calls. AMD's HSA even lets the GPU work with virtual memory. "Intel InstantAccess" only works with data that is in memory; AMD can issue page faults and let the OS load from the page file.
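
For illustration, this is roughly how that looks through OpenCL 2.0's fine-grained shared virtual memory, which is the API face of what HSA-class hardware is supposed to back. A rough sketch assuming a 2.0-capable driver; error handling is skipped and the kernel is deliberately trivial:

    #include <CL/cl.h>
    #include <stdio.h>

    /* Trivial kernel: the GPU just follows a plain pointer handed to it by the host. */
    static const char *src =
        "__kernel void scale(__global float *data) {\n"
        "    size_t i = get_global_id(0);\n"
        "    data[i] *= 2.0f;\n"
        "}\n";

    int main(void) {
        const size_t n = 4096;
        cl_platform_id plat; cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueueWithProperties(ctx, dev, NULL, NULL);

        /* One allocation visible to both CPU and GPU: no WriteBuffer/ReadBuffer,
           no map/unmap per access. */
        float *data = clSVMAlloc(ctx, CL_MEM_READ_WRITE | CL_MEM_SVM_FINE_GRAIN_BUFFER,
                                 n * sizeof(float), 0);
        for (size_t i = 0; i < n; i++) data[i] = (float)i;   /* CPU writes directly */

        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "scale", NULL);
        clSetKernelArgSVMPointer(k, 0, data);                 /* pass the raw pointer */

        size_t gws = n;
        clEnqueueNDRangeKernel(q, k, 1, NULL, &gws, NULL, 0, NULL, NULL);
        clFinish(q);

        printf("data[10] = %f\n", data[10]);                  /* CPU reads the result directly */

        clSVMFree(ctx, data);
        clReleaseKernel(k); clReleaseProgram(prog);
        clReleaseCommandQueue(q); clReleaseContext(ctx);
        return 0;
    }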

Re:Looking forwards... (0, Flamebait)

viperidaenz (2515578) | about 9 months ago | (#45961117)

So you knew how it works and you just decided to spread lies in your previous post?

Re:Looking forwards... (1)

Bengie (1121981) | about 9 months ago | (#45962045)

Get back to me when Intel has something that requires no system calls and has a unified memory space, then they'll be comparable.

Re:Looking forwards... (1)

timeOday (582209) | about 9 months ago | (#45957763)

The Xbox One and PS4 have given us a preview of AMD's technology in this area, haven't they?

Re:Looking forwards... (1)

Joe_Dragon (2206452) | about 9 months ago | (#45958299)

the ps4 has high speed ram sheared for video and cpu.

Xbox has slower desktop DDR3 for video / cpu.

PS4 seems good, and the high-speed RAM makes it better than other on-board chips that use the slower desktop RAM.

Re:Looking forwards... (0)

Anonymous Coward | about 9 months ago | (#45961605)

why did they decide to shear it? did it look too sheepish?

Re:Looking forwards... (1)

godrik (1287354) | about 9 months ago | (#45959001)

I am a little bit skeptical about that. I am not really sure how much it will really change things. The use case actually seems very thin to me. You need a kernel which is compute intensive and where the data transfer from memory to the core is expensive, because if there is little data to transfer, then the overhead is small. I read some benchmarks from AMD and only a few kernels seemed to be in the sweet spot. On top of that, because of the memory architecture, I feel like raw memory-to-core bandwidth will be closer to what a CPU gets (50GB/s) rather than what a GPU gets (250GB/s).
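
For what it's worth, here is the back-of-the-envelope version of that argument in C, using the 50GB/s and 250GB/s figures above with a completely made-up kernel; the numbers are illustrative, not benchmarks:

    #include <stdio.h>

    int main(void) {
        /* Made-up kernel: how much data it touches and how much math it does. */
        const double bytes     = 1e9;     /* 1 GB streamed through the compute units */
        const double flops     = 10e9;    /* 10 GFLOP of arithmetic on that data      */
        const double gpu_flops = 700e9;   /* ~0.7 TFLOPS, a rough APU-class GPU peak  */
        const double cpu_bw    = 50e9;    /* ~50 GB/s: CPU-style memory bandwidth     */
        const double gpu_bw    = 250e9;   /* ~250 GB/s: discrete-GPU memory bandwidth */

        double t_compute = flops / gpu_flops;  /* time if compute were the only limit     */
        double t_mem_cpu = bytes / cpu_bw;     /* time just to stream the data at 50 GB/s */
        double t_mem_gpu = bytes / gpu_bw;     /* ...and at 250 GB/s                      */

        printf("compute:           %5.1f ms\n", t_compute * 1e3);
        printf("memory @  50 GB/s: %5.1f ms (%s)\n", t_mem_cpu * 1e3,
               t_mem_cpu > t_compute ? "bandwidth-bound" : "compute-bound");
        printf("memory @ 250 GB/s: %5.1f ms (%s)\n", t_mem_gpu * 1e3,
               t_mem_gpu > t_compute ? "bandwidth-bound" : "compute-bound");
        return 0;
    }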

Anyway, I'll probably buy one and give it to a student to play with.

Re:Looking forwards... (1)

serviscope_minor (664417) | about 9 months ago | (#45962967)

You need a kernel which is compute intensive and where the data transfer from memory to the core is expensive.

You're missing the cpu-gpu latency.

So, the integrated GPU will already work as well as any other GPGPU of similar specs. Nothing has got worse there.

There is a problem with writing GPU code in that some things work far better on the GPU and other things work far worse than on a CPU. At the moment, writing code which has a mixture of those is extremely hard.

Basically, this architecture will allow one to use the GPU much more like a low-latency FPU attached closely to the main processor.

Anyway, I'll probably buy one and give it to a student to play with.

Good plan. The best use for this is things which are currently too tricky to implement on a GPU.

Re:Looking forwards... (1)

LWATCDR (28044) | about 9 months ago | (#45959451)

That is going to be the kicker. Just like VLIW it will really depend on software tools and support. AMD is supporting a lot of FOSS projects that support OpenCL. Maybe AMD needs to throw some support to WebKit and Mozilla to support their GPU compute systems.

Disappointed (2)

edxwelch (600979) | about 9 months ago | (#45957575)

While the GPU is good, the Kaveri CPU is slightly slower than Richland in the benchmarks - after 4 years of waiting that's a big disappointment.

Re:Disappointed (0)

Anonymous Coward | about 9 months ago | (#45958201)

What benchmarks? AMD say it's about 20% faster, not 10% slower. To be off by 30 percentage points like that doesn't seem likely.

Re:Disappointed (0)

Anonymous Coward | about 9 months ago | (#45958419)

What I read was something like 10% faster per cycle, 10% slower clock = no performance gain over several years.

Re:Disappointed (0)

Anonymous Coward | about 9 months ago | (#45958403)

The benchmarks I've seen show that it is slightly faster than Richland at the same clock speed. AMD are releasing these at slightly lower clocks with roughly the same performance.

Re:Disappointed (1)

elwinc (663074) | about 9 months ago | (#45958471)

Anandtech points out that they chose a process with higher transistor density to go for greater IPC instead of high clock rates in the CPU. There's also an amusing comment in the review about how the Bulldozer CPU architecture "sure had a lot of low hanging fruit." In other words, why weren't most of these improvements included back in 2011?

Re:Disappointed (1)

hamster_nz (656572) | about 9 months ago | (#45958725)

In other words, why weren't most of these improvements included back in 2011?

Developing hardware is a lot different than developing software. With software you can go "oh, that now works, let's add this" or "oh, that didn't work out - how about we take that out". With hardware you can't, without going back to the start of the manufacturing process.

With hardware a large part of the exercise is risk management - adding one feature that you can't get production-ready will kill the entire product. So most projects pick just one or two key areas to develop, the ones that will make the biggest advance on the roadmap, and leave the others well alone. Verifying and validating an entire CPU design is just too much work.

This always leaves low hanging fruit to be picked off in future updates and design refreshes.

The COST difference should be mentioned. (4, Informative)

Anonymous Coward | about 9 months ago | (#45957613)

It's 1:2 AMD:Intel, at the kindest level.

It's 2:3 with Radeon:Nvidia.

APU Name (2)

BeTeK (2035870) | about 9 months ago | (#45957725)

In Finnish, "kaveri" means buddy. Quite a fitting name :)

Re:APU Name (1)

TeknoHog (164938) | about 9 months ago | (#45959367)

In Finnish, "kaveri" means buddy. Quite a fitting name :)

And "apu" means help or assistance, or auxiliary as a prefix. For example "apuprosessori" meaning co-processor.

Who fabs this? (1)

unixisc (2429386) | about 9 months ago | (#45957775)

Since AMD has gone fabless, who do they now use to manufacture these chips?

Re:Who fabs this? (1)

Anonymous Coward | about 9 months ago | (#45957847)

GloFo

Re:Who fabs this? (1)

Mashdar (876825) | about 9 months ago | (#45961253)

The same people as always. They spun off "GlobalFoundries", but are still using them. (They are contractually obligated to, for the foreseeable future.)

Embedded GPU Boom (2)

Salgat (1098063) | about 9 months ago | (#45958121)

It's very exciting seeing both AMD and Intel compete to push embedded GPUs. More and more of the computer is being pushed onto the CPU's package (SoC); one day we can expect to see RAM become embedded too, as a new level of cache that is more than sufficient even for gamers. The reason why discrete GPUs and other components will ultimately lose is latency. GPUs and CPUs will reach a point where the bottleneck between them hinders communication enough that embedded GPUs become a necessity. The same goes for RAM. One day we may even see hybrid CPU/GPUs, such that some cores are more general purpose while others are more special purpose. Ultimately we can thank our phones for helping drive this push, especially since phones are rapidly approaching the performance of desktop and laptop computers.

Re:Embedded GPU Boom (1)

0123456 (636235) | about 9 months ago | (#45958647)

Good luck getting a 100W CPU and 300W GPU into the same package.

Well, OK, stuffing them in there won't be too hard, but cooling it will be a bastard.

Re:Embedded GPU Boom (0)

Anonymous Coward | about 9 months ago | (#45959843)

You may be the only person on Earth that sees high power consumption as a desirable feature.

Re:Embedded GPU Boom (1)

0123456 (636235) | about 9 months ago | (#45961641)

You may be the only person on Earth that sees high power consumption as a desirable feature.

You do realise that high-end CPUs and high-end GPUs use a lot of power, yes? You do realize that putting both in a single package would use even more power, yes?

Oh, obviously not, since you've completely mis-read the point of my post.

Best choice for 4 out of 5 desktop users (4, Insightful)

Anonymous Coward | about 9 months ago | (#45958165)

Most of the people who decide they still need a full-sized desktop computer will be completely covered with one of the AMD A-series APUs, at a bargain price. Only the remaining 1 out of 5 users are power-users who need the highest CPU and/or GPU performance, and have to resort to expensive Intel CPUs and discrete graphics boards.

Re:Best choice for 4 out of 5 desktop users (1, Insightful)

guacamole (24270) | about 9 months ago | (#45960875)

Expensive Intel CPUs? Intel's Core i3 is pretty much equivalent to the AMD A10 in general-purpose CPU power. Right now, the i3-4130 is $129 on Newegg while the A10-7850K is $189. The only thing that the A10 has on the Core i3 is integrated graphics, but throw a $100 Radeon card into either of these systems, and it will run much faster than the integrated graphics on the A10. And don't forget the dual-core Haswell Pentium chips sold for under $100. A Pentium G3220 costs $69 on Newegg right now. Add a $100 Radeon HD7730, and it will still beat the A10 in games, while being roughly the same, or a little slower by an unnoticeable margin, for general-purpose computing.

Re:Best choice for 4 out of 5 desktop users (0)

Anonymous Coward | about 9 months ago | (#45962861)

Yes, but then you have twice the number of devices in your machine, twice the number of fans, twice the noise, twice the power requirement, at twice the cost. You need to compare APUs to APUs.

Re:Best choice for 4 out of 5 desktop users (1)

Confusador (1783468) | about 9 months ago | (#45963011)

Except the point is not to need the $100 graphics card, even if that means sacrificing some CPU performance that that segment doesn't need.

Sadly, a near total disaster for AMD (-1)

Anonymous Coward | about 9 months ago | (#45958361)

Kaveri marks the third distinct and new chip design in its so-called high-end APU family. However, from Llano (the first) through Richland and now Kaveri, AMD has made almost ZERO worthwhile improvements in the consumer space that matters.

The 'vastly' repaired BULLDOZER architecture proves to be as crap as ever, using insane numbers of transistors, and insane amounts of power, to run slower by far than Intel designs dating back to the beginning of Intel's new CPU architecture (Sandy Bridge). Meanwhile, AMD stomps Intel on the GPU side, BUT Intel's GPU performance is so putrid, much better than Intel = utter garbage.

Only at 45W has AMD made great strides. Problem is, desktop users either want MUCH cheaper solutions than Kaveri, or much more powerful gaming solutions. Kaveri is 'great' exactly where the market does NOT exist. Ironically, this matches Intel's current problem with its dreadful FinFET tech, where Intel gets great power saving at the wrong part of the curve, again around 45W. Neither AMD nor Intel sees significant performance improvements at much lower power usage, where their chips could find their way into SANE mobile designs.

AMD has 'interesting' hardware features in Kaveri that will NEVER see a public API/SDK. TrueAudio, for instance, requires that developers pay AMD a VERY expensive license to access the hardware. The same is true for Kaveri's hardware video encoder functions, and JPG encode/decode. The same looks as if it will be true for Mantle development.

So, when you think about Kaveri, forget about those 'great' hardware features, since AMD intends for them NEVER to be public. Sure, in the future you may be able to buy one or two games or apps from developers who paid to use these features- but open-source TRUEAUDIO apps, forget about it.

AMD wants $178-plus for the 'best' Kaveri; for this price you can choose among many much cheaper parts from AMD or Intel, AND afford a discrete GPU as well. Your CPU + discrete GPU will cost the same, use the same power, and run VASTLY faster. Pay a little bit more (say for a heavily discounted 7790 from AMD) and you have TrueAudio support as well.

Kaveri makes ZERO sense, from a price, performance or power usage POV, unless you want and NEED the 45W variant. So, the bottom line is this: is a 45W APU's modest power usage so important that it is the dominant reason for buying? Answer "yes" and you need a Kaveri. Answer "no" and Kaveri is a terrible choice (today) no matter what kind of PC you are building.

PS I am sickened- SICKENED- that AMD refused to build motherboards with the RAM soldered on, because then Kaveri could have utilised the same 256-bit GDDR5 solution found in the PS4, for no more money (save the cost of the RAM, obviously). A GDDR5 Kaveri would EXTERMINATE every competing Intel part. Kaveri with the same old s**ty 2x64 bit memory bus that has been crippling PC performance for well over a decade has Intel rolling on the floor with laughter.

Re:Sadly, a near total disaster for AMD (0)

Anonymous Coward | about 9 months ago | (#45958493)

You exaggerate so much we can see the Intel Inside logo bulging on your forehead.

Re:Sadly, a near total disaster for AMD (4, Insightful)

Bengie (1121981) | about 9 months ago | (#45958829)

Ohh yes. Let's solder memory right on, increasing board complexity and gaining almost no advantage. The APU is meant to be a mixture of a "good enough" GPU and a higher-performance compute unit for low-memory problems, of which there are a lot. As for open source, AMD is actively committing work to the Linux kernel for both the Mantle framework and better driver support. They are also working with Steam, because SteamOS is Linux, which means AMD needs decent Linux drivers if they plan to be used.

Yes, it is not a very good GPU when it comes to high-end graphics, because it has about 1/3rd the flops of a discrete GPU and it is memory-bandwidth starved for those workloads, but for non-graphics-related workloads, it's perfect. It is the first of something new. How many people pissed and moaned about FPUs when they came out? "Derp, there's no software that uses them, so they must be useless." You need to have the platform before you can have the developers. Once the next-gen consoles start taking off, expect games to be nearly directly ported and taking advantage of this new GPU paradigm.

Re:Sadly, a near total disaster for AMD (1)

Mashdar (876825) | about 9 months ago | (#45961213)

I've been hearing for years that AMD's Linux drivers are just around the corner. Still waiting...

Maybe SteamOS will get them off their butts, but for the time being my money is still going to nVidia.

AMD performance in Linux is 3-10 times slower than Windows in most games I've tried on Llano. I love my Llano laptop outside of gaming, but it pains me to still dual boot, whereas my desktop has been Windows-free for 6 years.

Radeon Drivers are getting better. (1)

coder111 (912060) | about 9 months ago | (#45963211)

Radeon performance increases by ~20% in each new Mesa release. I think it should be ~60-80% of Windows performance with Mesa 10 and Linux 3.12. It still doesn't support OpenGL 4.x, but it's getting quite good. Latest Mesa also has VDPAU video decoding acceleration (still a bit buggy), better power management, and PRIME switching between integrated/discrete GPUs also works. Unfortunately no Crossfire yet.

Given that I don't play latest and greatest games (there are plenty of good 3 year old games), performance is sufficient for me.

--Coder

Re: Sadly, a near total disaster for AMD (0)

Anonymous Coward | about 9 months ago | (#45962259)

Remember a year or two ago when AMD dismissed their entire Linux driver team? You keep on holding to that AMD Linux driver support myth.

Re:Sadly, a near total disaster for AMD (1)

UnknownSoldier (67820) | about 9 months ago | (#45959105)

> PS I am sickened- SICKENED- that AMD refused to build motherboards with the RAM soldered on, because then Kaveri could have utilised the same 256-bit GDDR5 solution found in the PS4, for no more money (save the cost of the RAM, obviously). A GDDR5 Kaveri would EXTERMINATE every competing Intel part.

Actually I'm also surprised AMD isn't doing that. Might be that the existing architecture of separate RAM is "good enough" and they don't want to pursue a tiny market where only console devs care about performance, and/or don't want to piss off OEM motherboard manufacturers.

Re:Sadly, a near total disaster for AMD (1)

spire3661 (1038968) | about 9 months ago | (#45959523)

I keep hoping that some OEM will commission a Kaveri/GDDR5 Steam Machine. Dell/Alienware might be able to get AMD to do it.

Re:Sadly, a near total disaster for AMD (0)

Anonymous Coward | about 9 months ago | (#45961121)

Kaveri doesn't have a GDDR5 controller. It doesn't make sense in your average computer (if you need better gfx performance then get a card with GDDR5 on it, for everything else RAM bandwidth isn't going to be the bottleneck) and the silicon needed for it can be put to better use (for instance, more GPU and CPU cores).

As we so often talk about the death of desktops (2)

wjcofkc (964165) | about 9 months ago | (#45959067)

I would say this discussion fits in well with the frequent discussions we have about the alleged impending death of the desktop computer. Consider the argument that PCs which were brand new around 2007/2008 are still so overpowered for most needs that this is the cause of declining PC sales. Now consider that brand-spanking-new AMD chips, which are much less expensive than Intel's, are practically supercomputers compared to chips from that era. If my 2008 PC is still really fast, but I want to upgrade anyway, why pay the Intel premium (outside of some ultra-demanding professional use) when I can save so much and still have a computer that will be faster than I need for years to come? That's AMD's advantage in this game. I currently have a quad-core 3GHz AMD system with the GPU disabled in favor of a low-cost NVIDIA card and it's great. I am waiting till next fall for the price to plummet on the current 8-core 4GHz AMD chips for my next upgrade. And even then it will be just for the hell of it, not need.

Re: As we so often talk about the death of desktop (0)

Anonymous Coward | about 9 months ago | (#45962301)

If it's AMD, it's not significantly faster now than in 2008. That's the problem with Kaveri. Same as Richland, just 15% better than Trinity! Maybe 5% more than Llano, which was SLOWER than Phenom II...which was Phenom done "right", which was maybe 10% over Athlon II...

You get the picture.

Already tested by Anandtech (0, Troll)

guacamole (24270) | about 9 months ago | (#45960825)

It's pretty sad reading, IMHO. The Kaveri APU does not seem even decidedly faster than the last-generation A10. The only bright spot is that the 65-watt TDP A8 APU is not that much slower than the 95-watt A10 APU.

Am I the only one who wants a *CPU*? (2)

xiando (770382) | about 9 months ago | (#45961477)

Still using a Phenom II X3 *CPU* and it's fast enough for my GNU/Linux system, so I see little reason to upgrade it - but if I decide to do so then I would very much like to buy a CPU, not an APU. Would it be so hard for AMD and Intel to offer actual CPUs again? Am I the only one who would like to buy one at some point? APUs are nice if you want a cheap system with alright graphics... but why do they force us to buy one even if all we want/need is a CPU?

Re:Am I the only one who wants a *CPU*? (1)

guacamole (24270) | about 9 months ago | (#45963035)

Well, at least Intel is not charging a huge premium for the integrated graphics. The Core i3-4150 is only $130 and the rest of the Core line uses the same basic GPU.

Re:Am I the only one who wants a *CPU*? (1)

loosescrews (1916996) | about 9 months ago | (#45963129)

No one is forcing you to buy a CPU with integrated graphics. Look at the high end solutions. This includes AMD's FX series and Intel's socket LGA 2011 platform. Both Intel and AMD know that people buying high-end CPUs will buy a discrete graphics card anyway, so there is no point wasting valuable die space on it.

Re:Am I the only one who wants a *CPU*? (0)

Anonymous Coward | about 9 months ago | (#45963251)

Even if the CPU has integrated graphics, it's not like you need to use it.

My i5-2500K uses its IGP to play movies via HDMI, because the HDMI port on my nVidia GPU died. That was after about a year, when I had an epiphany that my mobo has HDMI.

And when my graphics card died, I was really happy that I could simply plug in an HDMI cable and continue with my work without having to go buy a graphics card on impulse.

Intel offers CPUs that don't have integrated graphics; look up their CPU naming scheme. And I'm sure any AMD CPU of the latest gen is more than powerful enough for all your processing needs, extra graphics support or not. So buy cheap or wait until the prices drop.
