
AMD Trinity A10-4600M Processor Launched, Tested

Soulskill posted about 2 years ago | from the neo's-favorite-chip dept.


MojoKid writes "AMD lifted the veil on their new Trinity A-Series mobile processor architecture today. Trinity is reported to offer not only much-needed CPU performance enhancements in IPC (Instructions Per Cycle), but also more of AMD's strength in gaming and multimedia horsepower via an enhanced, second-generation integrated Radeon HD graphics engine. AMD's A10-4600M quad-core chip comprises 1.3B transistors, with a CPU base clock of 2.3GHz and Turbo Core speeds of up to 3.2GHz. The on-board Radeon HD 7660G graphics core comprises 384 Radeon Stream Processor cores clocked at 497MHz base and 686MHz Turbo. In the benchmarks, AMD's new Trinity A10 chip outpaces Intel's Ivy Bridge for gaming but can't hold a candle to it in standard compute workloads or video transcoding."
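A quick back-of-the-envelope on the clocks quoted above (just arithmetic on the stated base and Turbo frequencies):

```python
# Turbo headroom implied by the A10-4600M clocks quoted above.
cpu_base_ghz, cpu_turbo_ghz = 2.3, 3.2
gpu_base_mhz, gpu_turbo_mhz = 497, 686

cpu_headroom = (cpu_turbo_ghz / cpu_base_ghz - 1) * 100
gpu_headroom = (gpu_turbo_mhz / gpu_base_mhz - 1) * 100

# Both the CPU and the GPU get roughly the same relative Turbo boost.
print(f"CPU Turbo headroom: {cpu_headroom:.0f}%")  # ~39%
print(f"GPU Turbo headroom: {gpu_headroom:.0f}%")  # ~38%
```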

182 comments

That's ok, but (0)

Anonymous Coward | about 2 years ago | (#40009075)

Where's the FX-8170 [cpu-world.com]? I can't buy it if it's not produced.

Re:That's ok, but (0)

Anonymous Coward | about 2 years ago | (#40009101)

Broken link, I apologize.
It seems Google [google.com] can go to it just fine.

AMD is done and gone... (0)

Anonymous Coward | about 2 years ago | (#40009089)

They used to be able to beat Intel in the Athlon days. Now they are hopelessly far behind, and dumping huge, hot graphics cores into their chips is putting them further and further behind.
Focus on cheap compute with unlockable cores, AMD, not stupid graphics cores which do nothing for the CPU. A 16-core Phenom II at $100 will sell much better than this insane graphics + CPU crap.

Re:AMD is done and gone... (-1)

Anonymous Coward | about 2 years ago | (#40009489)

Exactly. These articles and benchmarks are a joke. The Intel CPUs are so far ahead, in performance and value, that I can't help but feel embarrassed for AMD.

Re:AMD is done and gone... (3, Insightful)

Mojo66 (1131579) | about 2 years ago | (#40009917)

Exactly. These articles and benchmarks are a joke. The Intel CPUs are so far ahead, in performance and value, that I can't help but feel embarrassed for AMD.

Without AMD you clueless retard would have to pay 5 times the price for an Intel CPU. You should thank them for providing competition instead of dissing their products.

Re:AMD is done and gone... (5, Insightful)

Anonymous Coward | about 2 years ago | (#40010205)

Appreciating competition is not mutually exclusive with being critical of the competition's quality.

Re:AMD is done and gone... (0)

Anonymous Coward | about 2 years ago | (#40010335)

And Intel drives down the price for AMD too.

I always stay a couple of years behind the cutting edge and buy older AMD chips that are being discounted. Then it's *really* cheap, and fast. (Compared to the AMD chips I'm replacing, anyway :)

Re:AMD is done and gone... (0)

Anonymous Coward | about 2 years ago | (#40010175)

I see. So a CPU designed to be a low power CPU + GPU combo, has failed terribly, because it can't compete with an i7 + Top-of-the-range Nvidia card, that consumes an order of magnitude more power? Terrible failure indeed.

Re:AMD is done and gone... (0)

Anonymous Coward | about 2 years ago | (#40011327)

Low power? Let's take a look at the best that both have to offer right now.

Intel Ivy Bridge Core i7 3770K 3.5GHz - TDP 77W
AMD Llano A8 3870K 3GHz - TDP 100W

Intel is beating AMD on performance, power and value. Sure, the i7 3770K costs three times as much as the A8 3870K, but it's also easily three times faster. You'll make up the difference quickly in both power savings and the amount of time it takes to get work done.
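Checking that claim with some back-of-the-envelope arithmetic (the street prices and electricity rate below are assumptions for illustration, not figures from the post):

```python
# Rough payback estimate for the i7 3770K's 23W TDP advantage.
# Assumed figures (hypothetical): i7 3770K ~$330, A8 3870K ~$110,
# electricity at $0.12/kWh, machine running 24/7 at full TDP.
price_gap_usd = 330 - 110
power_gap_w = 100 - 77

hours_per_year = 24 * 365
kwh_per_year = power_gap_w * hours_per_year / 1000.0
savings_per_year = kwh_per_year * 0.12

payback_years = price_gap_usd / savings_per_year
print(f"Annual power savings: ${savings_per_year:.2f}")
print(f"Years to recoup the price gap on power alone: {payback_years:.1f}")
```

Under those assumptions the power savings alone take years to close the price gap; the "time it takes to get work done" half of the argument has to carry most of the weight.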

Re:AMD is done and gone... (2)

thue (121682) | about 2 years ago | (#40011037)

> The Intel CPUs are so far ahead, in performance and value, that I can't help but feel embarrassed for AMD.

Not so. Intel's traditional CPU cores are faster, but AMD's integrated GPU is faster.

For AMD's pure-CPU parts, they seem competitively priced to me (i.e., cheap).

Re:AMD is done and gone... (4, Insightful)

fuzzyfuzzyfungus (1223518) | about 2 years ago | (#40009553)

They used to be able to beat Intel in the Athlon days. Now they are hopelessly far behind, and dumping huge, hot graphics cores into their chips is putting them further and further behind. Focus on cheap compute with unlockable cores, AMD, not stupid graphics cores which do nothing for the CPU. A 16-core Phenom II at $100 will sell much better than this insane graphics + CPU crap.

That is pretty much the exact opposite of a good plan for AMD (as much as I would like cheap compute...). Since Intel has a process advantage, and presently has a superior x86 compute core architecture, they can almost certainly beat AMD on production cost for chips of a given level of punch. Trying to compete on price with somebody kicking out chips a process node ahead of you just isn't a good plan. Unless they really fuck it up, or their yields tank horribly or similar, they'll be able to beat you on production cost every time. Intel has little to gain by cutting its own margins in order to chase AMD down a hole (since lower margins are bad, and killing AMD would mean becoming antitrust scrutiny case #1 for the indefinite future...); but there isn't any architectural barrier to their doing so.

Since Intel has comparatively worthless GPU designs, tacking GPUs onto CPU dice is a way for AMD to offer something that Intel cannot (and at a price lower than a discrete CPU + discrete GPU, without totally cutting their own throat), and also happens to go well with today's enthusiasm for laptops and all-in-ones. They have a second niche, much more directly focused on price, in compute-light, memory-heavy server applications (since you can populate your sockets with AMD CPUs for less, and the number of DIMMs you get is roughly proportional to the number of sockets you have active); but competing on price isn't good for your margins.

With an inferior process and a weaker x86 design, gunning directly for the compute performance crown would just be asking for a whupping from Intel.

Re:AMD is done and gone... (4, Interesting)

drinkypoo (153816) | about 2 years ago | (#40009811)

Speaking of all-in-ones, an all-in-one AMD chip would be a dandy basis for a games console. If not one from Microsoft (who has no particular need for x86) then it would perhaps be a good match for Valve. Public distaste for Sony is at an all-time high, but is it enough to unseat them? etc etc.

If I could have a 16-core Phenom II, though, that would be pretty awesome. I could drop it right into my current machine. I'd pay $100 for even eight cores, let alone sixteen.

Re:AMD is done and gone... (-1)

Anonymous Coward | about 2 years ago | (#40009891)

intel does NOT have worthless GPU designs you moron. Since ivy bridge intels GPU now equals anything from AMD.
and they are a process node ahead.
only cheap compute will save AMD.

Re:AMD is done and gone... (1)

Jeng (926980) | about 2 years ago | (#40010475)

Did you just seriously insinuate that Intel's GPUs are better than AMD's?

So Intel has a better GPU than this?
http://www.newegg.com/Product/Product.aspx?Item=N82E16814161399 [newegg.com]

I think you might have meant that Intel's best GPUs are starting to compete with AMD's bottom-rung GPUs.

Re:AMD is done and gone... (-1)

Anonymous Coward | about 2 years ago | (#40010629)

yes because we are comparing IGPs not discrete GPU cards you moron.
how the hell are you going to fit that card in a notebook ? duct tape it to the top ?
idiot.

Re:AMD is done and gone... (4, Funny)

Jeng (926980) | about 2 years ago | (#40010659)

Log the fuck in so I can be sure I'm talking to the same moron who posted "Since ivy bridge intels GPU now equals anything from AMD."

Re:AMD is done and gone... (-1)

nedlohs (1335013) | about 2 years ago | (#40011731)

The topic was integrated GPUs, which you must know since you brought it up when you wrote:

Since Intel has comparatively worthless GPU designs, tacking GPUs onto CPU dice is a way for AMD to offer something that Intel cannot (and at a price lower than a discrete CPU + discrete GPU, without totally cutting their own throat), and also happens to go well with today's enthusiasm for laptops and all-in-ones.

So the "any" is clearly restricted to the GPUs embedded in the CPUs not discrete GPUs in huge cards. Heck the "since ivy bridge" statement means you don't even need the context to work out what is being talked about.

But apparently admitting you are wrong is too hard for you, so moving the goal posts is the go-to play.

Of course you could also just argue that ivy bridge GPUs are worse than AMD's embedded GPUs if you think that, I guess that's harder than pretending the topic is something else.

Re:AMD is done and gone... (0)

Anonymous Coward | about 2 years ago | (#40011733)

He was clearly referring to integrated GPUs with that statement.

Re:AMD is done and gone... (1)

BlackSnake112 (912158) | about 2 years ago | (#40010747)

I thought we were talking about laptop graphics, not desktop graphics? What laptop can you put a 7970 into? The article is about a laptop CPU with built-in graphics. Many laptops do have that today; they are usually not gaming laptops.

Intel does have some good graphics for day-to-day 1080p movie watching. But for high-end gaming? No, they are lacking. For a low-power movie-watching DVR, the Intel graphics do work fine (the 4000 series and the later 3000 series, anyway). Before that, no; Intel needed work. They finally got there.

Personally, I prefer to have a separate video card in my home desktops. If it breaks, I can change it. If a video card built into the motherboard breaks, swapping out the motherboard is harder and takes longer than changing the video card. The less downtime the better; I usually keep a spare video card just for that reason.

Re:AMD is done and gone... (0)

Narishma (822073) | about 2 years ago | (#40010875)

No, but here we are talking about integrated GPUs, and in this category, Intel GPUs are competitive with AMD's.

Re:AMD is done and gone... (3, Informative)

Xeranar (2029624) | about 2 years ago | (#40011419)

They aren't competitive, though. You keep missing the point. Intel's Sandy Bridge/Ivy Bridge integrated GPUs basically do video playback on laptops at a suitable level. They cannot play any sort of games made within the last 2-3 years at anything beyond the most basic settings. The A-series processors, by comparison, can play the newest games at relatively low settings, and the new Trinity-based models can do it at reasonable settings. With the newest A10 laptops starting around $600 for a 17" laptop, that's quite competitive, since the first laptops with dedicated Nvidia/AMD graphics that can hold a candle to them start around $800-900. The small ultrabooks are going to be harder to justify using Intel when the A10 will do it all faster and just as thin. In other words, AMD has a serious contender in the mobile market for gaming and cost-effectiveness.

The problem remains that Intel holds the cards on mainstream OEMs and will continue to keep the A-series processors out of the big seller's hands because mobile is becoming their bread and butter.

Re:AMD is done and gone... (0)

Anonymous Coward | about 2 years ago | (#40010931)

It'd be nice if they were, for instance, capable of displaying even rudimentary OpenGL well, let alone without causing the battery on a laptop to take such an immediate nose dive (on a 9-cell, to boot). I really like primary-color tearing and an oven on my lap, not using 3D...

Re:AMD is done and gone... (3, Interesting)

obarthelemy (160321) | about 2 years ago | (#40009679)

Built myself a PC to play WoW 3 months ago. Went with the high-end Llano; no discrete graphics required. An Intel setup would have required a graphics card, a larger base (mini-ITX MB), and more money. For most users who are also *casual* gamers (not hard-core), AMD's CPU/GPU balance saves a graphics card while providing sufficient CPU power.

Re:AMD is done and gone... (1)

drinkypoo (153816) | about 2 years ago | (#40009869)

Even pretty old stuff is good; I have the last nVidia IGP without CUDA support, and it's good enough to do 1080p with XBMC. This story would be cooler if I could remember which IGP it is... 9400 or something. Of course that's nVidia and this is AMD, but I guess there's some hope the drivers will work, since they're pretty much betting the farm on this one.

Re:AMD is done and gone... (0)

Anonymous Coward | about 2 years ago | (#40011427)

They used to be able to beat Intel in the Athlon days. Now they are hopelessly far behind, and dumping huge, hot graphics cores into their chips is putting them further and further behind.
Focus on cheap compute with unlockable cores, AMD, not stupid graphics cores which do nothing for the CPU. A 16-core Phenom II at $100 will sell much better than this insane graphics + CPU crap.

No, it wouldn't. Few people need or want 16 CPU cores. In practice, for the vast majority of computer users, 14 of them would be idling almost all the time, doing nothing useful.

AMD is making the right choice given the constraints they operate under. They don't really have a hope of contending with Intel's CPU performance any more (whether single or multithreaded). So, they're focusing on the advantage they do have: GPU performance. They're trying to identify market niches which Intel isn't serving well because of Intel's traditionally bad GPUs, and going after them.

Unfortunately for AMD, it looks like as of next year with Haswell, that GPU advantage will be gone or perhaps even reversed, but right now chips like Trinity are definitely what AMD should be doing.

BTW, AMD does sell 16 core chips with no integrated graphics -- in rackmount servers, not desktops and notebooks. They aren't doing too well with that. Trying to make up for poor per-core performance figures by throwing lots of cores at customers means two things:

1. Most customers aren't interested because fewer cores for the same (or better) performance is almost always preferable in the real world. Intel gets to charge more for their competing server chips, and has a commanding lead in marketshare (even more so than on the desktop last I looked).

2. AMD has to ship a lot more silicon per CPU package to scale core counts way up, which, when combined with the reduced prices they can charge per CPU, makes their profit margins small compared to what Intel's getting.

Not seeing the point (-1)

Anonymous Coward | about 2 years ago | (#40009091)

While a 4600 core CPU is extremely innovative and *theoretically* powerful, programming a CPU like that efficiently is going to be extremely hard, and it's not as if current applications will make use of more than one core - perhaps two for more recent versions.

Re:Not seeing the point (0, Informative)

Anonymous Coward | about 2 years ago | (#40009121)

It's 4600 graphics cores, via ten pipelines. Generally most graphics engines (OpenGL, ActiveX) are easily parallelizable, and this will have a noticeable effect on many computer games.

Re:Not seeing the point (-1)

Anonymous Coward | about 2 years ago | (#40009153)

It's 384 graphics cores, which are basically not even full cores, just limited stream processors.

Re:Not seeing the point (0)

Anonymous Coward | about 2 years ago | (#40009197)

OK, I think I understand. We're talking:

384 graphics cores

4216 x86-64 CPU "cores" (actually hyperthreading across 10 "true" cores)

That's still pretty impressive in my book, whether they're half cores or not. Still, it's going to be a PITA to try to use that power.

Re:Not seeing the point (0)

Anonymous Coward | about 2 years ago | (#40009333)

where the hell are you getting 4216 from ? its got TWO "true" x86 cores and TWO threads.

Re:Not seeing the point (1)

Anonymous Coward | about 2 years ago | (#40009533)

Oh, I see where I went wrong, I completely pulled it out of my ass.

Hey! I did not, what are you talking about!

Yes you did, you dirty little harlot?

What did you just call me?!!

A dirty little harlot.

Ok, but you are still stupid.

What the fuck are you talking about and who are you?

I'm Anonymous Coward, who the hell are you?

You can't be Anonymous Coward cause I am.

No you aren't.

Yes I am.

No you aren't.

Yes I am.

No you aren't.

Yes I am.

No you aren't.

Yes I am.

No you aren't.

Yes I am.

No you aren't.

Yes I am.

No you aren't.

Yes I am.

Re:Not seeing the point (0)

Anonymous Coward | about 2 years ago | (#40009549)

APK, is that you?

Re:Not seeing the point (1, Flamebait)

Jeng (926980) | about 2 years ago | (#40010069)

You'll never know.

Or wait, does that mean I'll never know?

Hold on now, what the fuck is going on?

Hell if I know.

Who the fuck are you?

I already told you that, I'm Anonymous Coward.

And I already told you that you aren't, I'm Anonymous Coward.

Bullshit, there is no way you are Anonymous Coward because I am.

Wait, what?

Re:Not seeing the point (0)

Jeng (926980) | about 2 years ago | (#40010101)

ROFL!!!! oops.

Anyway, it's really annoying reading a thread of just ACs replying to each other. We have no idea who is who or who is making which argument. Just create a damn account already.

Re:Not seeing the point (0)

Anonymous Coward | about 2 years ago | (#40009527)

Please stop. You don't understand, and it would appear you're not going to.

Re:Not seeing the point (0)

Anonymous Coward | about 2 years ago | (#40009231)

Not exactly:

* There are ten ix86 64-bit cores.
* There are 96 GPU cores
* Hyperthreading is used to make the ix86 cores look like a little over 4200 regular cores.
* Something similar to hyperthreading is being used to make the GPU look like 384 cores.

Re:Not seeing the point (0)

Anonymous Coward | about 2 years ago | (#40010293)

* Something similar to hyperthreading is being used to make the GPU look like 384 cores.

The AC morons are out in force this evening! Hyperthreading my bottom. The word you are looking for is SIMD.
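For anyone following along: the distinction matters because SIMD means one instruction stream applied across many data lanes in lockstep, which is why 384 "stream processors" are not 384 independent cores. A toy sketch (the lane width here is made up for illustration):

```python
# SIMD in miniature: a single operation is applied to every lane at once.
# GPU stream processors behave much more like these lanes than like
# independent CPU cores, each running its own instruction stream.
LANES = 8  # illustrative lane width only, not any real GPU's

def simd_add(a, b):
    """One ADD instruction, executed across all lanes in lockstep."""
    assert len(a) == len(b) == LANES
    return [x + y for x, y in zip(a, b)]

a = [1, 2, 3, 4, 5, 6, 7, 8]
b = [10, 20, 30, 40, 50, 60, 70, 80]
print(simd_add(a, b))  # -> [11, 22, 33, 44, 55, 66, 77, 88]
```

Hyperthreading, by contrast, multiplexes two independent instruction streams onto one core; the lanes above have no independent streams at all.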

Re:Not seeing the point (1)

Anonymous Coward | about 2 years ago | (#40009275)

Reminds me of the AMD Tombstone, a weird ass 48 bit CPU they got all ready to make and then ditched at the last minute in the late nineties. AMD has a habit of making some very strange CPUs. Hopefully this one will see some success.

Re:Not seeing the point (2, Informative)

Anonymous Coward | about 2 years ago | (#40009321)

You mean AMD TwoStone, right?

Tombstone was the "joke" name people in AMD management gave it, for obvious reasons.

Re:Not seeing the point (0)

Anonymous Coward | about 2 years ago | (#40011579)

You have piqued my curiosity. When was this and are there any articles you know of out there? Googling shows essentially no mention of AMD TwoStone on the web, aside from this very thread.

But will it stand up against Intel? (3, Insightful)

sl4shd0rk (755837) | about 2 years ago | (#40009175)

That's really all that matters. I've always been an AMD fan, but if they can't put out the same performance for a lower or equal price, they're done.

Re:But will it stand up against Intel? (5, Insightful)

confused one (671304) | about 2 years ago | (#40009337)

And they are, as long as you understand that they are not trying to compete at the level of a Core i7. If you need that kind of x86 performance you have one choice, Intel, and you will pay their premium-tier pricing to get it... AMD stumbled with the release of the FX series; hopefully they will remain competitive as they move forward.

Re:But will it stand up against Intel? (2, Interesting)

Lumpy (12016) | about 2 years ago | (#40010199)

Sorry, but the 8-core FX kicks the crud out of a quad-core i7 at the same clock speed. I actually USE a PC for video editing, rendering, and 3D rendering, and the new 8-core machine with one FX processor is kicking the arse of the i7 machine.

Granted, I'm actually using multithreaded software, unlike most people, but saying that the i7 is the end-all of computing performance is not true.

Re:But will it stand up against Intel? (1)

BlackSnake112 (912158) | about 2 years ago | (#40010817)

What OS? From what I read, the Bulldozer line had issues that were to be addressed in Windows 8, so many on Windows 7 stayed away. If you're not on Windows, this is a non-issue. Besides, 8 real cores should beat the snot out of 4 real cores and 4 virtual cores every time.

Re:But will it stand up against Intel? (2)

gl4ss (559668) | about 2 years ago | (#40009349)

it beats intel(presumably more costly intel too) in gaming easily.

thanks to intels shitty gpu.

no surprises there, then.

Re:But will it stand up against Intel? (1)

ZeroSumHappiness (1710320) | about 2 years ago | (#40009419)

What about when I use a more powerful, discrete graphics card?

Re:But will it stand up against Intel? (2, Insightful)

Anonymous Coward | about 2 years ago | (#40009511)

good luck cramming that into a tablet or 9" laptop.

people under 30 don't use towers. tablets and notebooks. small notebooks.

Re:But will it stand up against Intel? (1)

Nutria (679911) | about 2 years ago | (#40010181)

people under 30 don't use towers.

They do when their employer points at a cubicle and says, "Sit there. Use that PC."

Employers would prefer GMA or discrete (2)

tepples (727027) | about 2 years ago | (#40010423)

An employer that provides a tower can go Intel. Most of the time, an Intel GMA (Graphics My Ass) is OK because the employer doesn't want the user playing 3D games on company time. In other cases, the employer provides a discrete card because it anticipates use for CAD, 3D graphic design, or video game development and testing.

Re:But will it stand up against Intel? (3, Interesting)

Lumpy (12016) | about 2 years ago | (#40010261)

"people under 30 who really dont do anything with their computers but websurf don't use towers. tablets and notebooks. small notebooks."

Fixed that for you. Every person I know under 30 that actually uses a computer has a tower. they need to do things like Render 3d GFX for static images or movies, high end photography, video production. even the CAD/CAM geeks have a tower.

I know plenty of under 30 professionals that actually use a computer to the point that they need a tower, It seems you don't, you might want to hang around smarter people.

Re:But will it stand up against Intel? (1)

Tenebrousedge (1226584) | about 2 years ago | (#40010991)

Web development, app development, graphics editing, audio editing. You have a circular argument -- or a No True Scotsman if you prefer that term.

The list of computing tasks for which a powerful desktop machine is necessary is vastly smaller than the list of computing tasks, as evidenced by hardware sales. This trend is increasing. At some point large computers will be both expensive and rare, and I for one won't mind that if it means the end of fixing desktops. Users can send their tablets back to the manufacturer.

No one who really uses a computer needs anything more than a low-level programming language and decade-old hardware. Any problem that can be solved by throwing more hardware at it is trivial -- and incidentally, if there's a job function involved in that, the hardware will eventually obviate it.

not me...though technically not under 30 (2)

Chirs (87576) | about 2 years ago | (#40011629)

I'm a professional software developer. I have an i5 laptop with built-in graphics, 8GB of memory, a couple of external displays, and a gigabit link to 2TB of NAS. Why would I need a tower?

I don't game much anymore, and when I do most of it is on my tablet anyway. My laptop is perfectly respectable for doing office work, compiling large amounts of code, doing photography work, and hobbyist CAD work in sketchup. It decodes high def video mostly in hardware with minimal overhead.

I have no desire for gaming-grade graphics in my laptop--I'd rather have an extra hour of battery life, thanks.

Re:But will it stand up against Intel? (0)

Anonymous Coward | about 2 years ago | (#40010995)

bullshit.

Re:But will it stand up against Intel? (1)

Nutria (679911) | about 2 years ago | (#40009613)

Or Linux, where ATI performance suffers compared to Nvidia

I've been exclusively AMD+Nvidia since the K6-2 & Riva TNT2 days, but my next mobo will be Intel.

Re:But will it stand up against Intel? (2)

Vancorps (746090) | about 2 years ago | (#40010987)

I'm confused how this is true; all my XBMC boxes are Linux, some with AMD and some with Nvidia graphics (referring mostly to the Atoms, so all the graphics are integrated). I have not seen any performance issues with the AMD drivers in Linux. These days both Nvidia and AMD support seem to be pretty good unless you have a top-end, latest-model discrete GPU.

Re:But will it stand up against Intel? (1)

Nutria (679911) | about 2 years ago | (#40011499)

All I've ever read is how buggy the ati/amd drivers are and how support lags severely for kernels and cards and 3D is really slow. OTOH, the nvidia driver always supports the current stable kernel and cards.

If I were making an xbmc box (which I wouldn't since the Linux-based Iomega 35045 is only $105) then I'd have an Atom CPU and a GeForce 210 and install the binary driver and vdpau libraries so that it will off-load video decoding.

Re:But will it stand up against Intel? (0)

Anonymous Coward | about 2 years ago | (#40009427)

it beats intel(presumably more costly intel too) in gaming easily.

thanks to intels shitty gpu.

no surprises there, then.

But you still need to remember that games are compiled with ICC, which ignores optimizations on AMD processors...

Re:But will it stand up against Intel? (0)

Anonymous Coward | about 2 years ago | (#40010499)

Not true. If you tell icc to compile code for the i7, it inserts code to test for an i7. Tell it to compile for the specific SIMD version, it compiles for that SIMD version (and works fine on AMD). This myth is perpetuated by people who haven't bothered to RTM.
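The mechanism being described can be sketched in miniature: dispatch on a reported feature flag rather than on a vendor string, which is exactly why feature-targeted code works the same on AMD. (This illustrates the general idea, not icc's actual dispatcher; the kernels below are hypothetical stand-ins.)

```python
# Feature-based dispatch in miniature: pick an implementation based on
# what the CPU reports it supports, not on who made it. Dispatching on
# a feature flag works identically on Intel and AMD parts.

def add_avx(a, b):      # stand-in for an AVX-optimized kernel
    return [x + y for x, y in zip(a, b)]

def add_scalar(a, b):   # stand-in for the generic fallback kernel
    out = []
    for x, y in zip(a, b):
        out.append(x + y)
    return out

def pick_impl(cpu_features):
    """Return the best implementation the CPU can run."""
    if "avx" in cpu_features:
        return add_avx
    return add_scalar

# A hypothetical AMD CPU reporting AVX support gets the fast path:
impl = pick_impl({"sse2", "avx"})
print(impl.__name__)               # -> add_avx
print(impl([1, 2, 3], [4, 5, 6]))  # -> [5, 7, 9]
```

A vendor-string check would instead branch on "GenuineIntel" before ever looking at the features, which is the behavior the myth is about.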

"outpaces Intel's Ivy Bridge for gaming"? (0)

timeOday (582209) | about 2 years ago | (#40009467)

it beats intel(presumably more costly intel too) in gaming easily.

No it doesn't. The summary says it does, then links to an article that says this:

After the 3DMark results, you might be wondering if Intel has finally caught up to AMD in terms of integrated graphics performance. The answer is... yes and no. Depending on the game, there are times where a fast Ivy Bridge CPU with HD 4000 will actually beat out Trinity; there are also times where Intel's IGP really struggles to keep pace... We found that across the same selection of 15 titles, Ivy Bridge and Llano actually ended up 'tied' - Intel led in some games, AMD in others, but on average the two IGPs offered similar performance.

AMD's integrated GPU advantage is gone.

Re:"outpaces Intel's Ivy Bridge for gaming"? (4, Interesting)

mrjatsun (543322) | about 2 years ago | (#40009569)

> Ivy Bridge and Llano actually ended up 'tied'

Yes, but Llano is the *old* AMD processor ;-) Check the reviews for performance of a HD 4000 vs a Trinity.

Re:"outpaces Intel's Ivy Bridge for gaming"? (3, Informative)

timeOday (582209) | about 2 years ago | (#40010143)

I shouldn't have quoted that second sentence about Llano, but the first sentence was specifically about Trinity. Here is the follow-on:

This chart and the next chart will thus show a similar average increase in performance for Trinity, but the details in specific games are going to be different. Starting with Ivy Bridge and HD 4000, as with our earlier game charts we see there are some titles where Intel leads (Batman and Skyrim), a couple ties (DiRT 3 and Mass Effect 2), and the remainder of the games are faster on Trinity. Mafia II is close to our 10 percent "tie" range but comes in just above that mark, as do Left 4 Dead 2 and Metro 2033. The biggest gap is Civilization V, where Intel's various IGPs have never managed good performance; Trinity is nearly twice as fast as Ivy Bridge in that title. Overall, it's a 20% lead for Trinity vs. quad-core Ivy Bridge.

So, AMD has the lead on average FPS, but it's now small enough that Intel wins in a few cases. AMD's integrated GPU is still a little better normally, but it's not a slam dunk any more.

Re:"outpaces Intel's Ivy Bridge for gaming"? (2)

edxwelch (600979) | about 2 years ago | (#40010481)

> So, AMD has the lead on average FPS, but it's now small enough that Intel wins in a few cases. AMD's integrated GPU is still a little better normally, but it's not a slam dunk any more.

It's curious that this is the case for mobile, but on the desktop the HD 4000 is beaten by Llano by a large margin:

http://www.anandtech.com/show/5771/the-intel-ivy-bridge-core-i7-3770k-review/15 [anandtech.com]

Re:"outpaces Intel's Ivy Bridge for gaming"? (2)

mrjatsun (543322) | about 2 years ago | (#40010793)

> AMD has the lead on average FPS, but it's now small enough that Intel wins in a few cases

Not really. Intel does win in a couple of cases and is close in some others, but most of those are older, CPU-bound games. For Civ 5, AMD is close to 100% faster, and a lot of the games I looked at were ~40% faster (e.g. StarCraft 2). E.g.:
        http://www.pcper.com/reviews/Mobile/AMD-Trinity-Mobile-Review-Trying-Cut-Ivy/Performance-Synthetic-3D-Real-World-Gaming [pcper.com]
        http://images.anandtech.com/doci/5831/trinity-vs-ivybridge-gaming-new.png [anandtech.com]

So: better gaming performance at a cheaper price. AMD has the better single-chip solution for games. If you want a discrete graphics card for games, better to go with Intel.

Re:"outpaces Intel's Ivy Bridge for gaming"? (1)

Anonymous Coward | about 2 years ago | (#40010079)

the Ivy Bridge (Asus N56VM) is $1200 (MSRP) to $1300 on some preorder site. pre launch marketing (heh) claimed that a Trinity lappy might be $600-700. who knows, tho.

Re:"outpaces Intel's Ivy Bridge for gaming"? (3, Insightful)

serviscope_minor (664417) | about 2 years ago | (#40010185)

AMD's integrated GPU advantage is gone.

That's also compared to the more expensive i7 part. There was no i5 or i3 comparison.

Re:But will it stand up against Intel? (2)

mrjatsun (543322) | about 2 years ago | (#40009433)

Assuming we're not including a discrete graphics card: if you want gaming performance, AMD wins. If you want video encoding or photo editing performance, Intel wins. For most people who have PCs, it doesn't matter, because the CPU and graphics are already fast enough for anything they're going to do.

Personally, I'm going with an Ivy Bridge, nVidia 680 GTX combo. If I was going for a single chip solution, I would probably go with AMD.

Re:But will it stand up against Intel? (4, Interesting)

asliarun (636603) | about 2 years ago | (#40009457)

That's really all that matters. I've always been and AMD fan but If they can't pull out the same performance for less or equal price, they're done.

IMO, Trinity is a truly compelling offering from AMD, after a long, long time. Yes, it trades lower CPU int/float performance for higher GPU performance when compared to Ivy Bridge, but this tradeoff makes it a very attractive choice for someone who wants a cheap to mid-priced laptop that gives you decent performance and decent battery life while still letting you play the latest bunch of games at low settings. It's hitting the sweet spot for laptops as far as I am concerned. I'm also fairly sure it will be priced about a hundred bucks cheaper than a comparable Ivy Bridge - that's how AMD has traditionally competed. Hats off to AMD for getting their CPU performance to somewhat competitive levels while still maintaining their GPU lead against the massively improved GPU of Ivy Bridge. All this while they're still at 32nm and Ivy Bridge is at 22nm.

Having said that, what I'm equally excited about is the hope that Intel will come up with Bay Trail, their 22nm Atom, which I strongly suspect will feature a graphics core similar to the one in Ivy Bridge. Intel has always led with performance and stability, not with power efficiency and price, so they need to create something that genuinely beats the ARM designs, at least in the tablet space if not in the cellphone space.

Re:But will it stand up against Intel? (3, Interesting)

Kjella (173770) | about 2 years ago | (#40009625)

Well, they'll sell them at the prices they sell at; it's not like a CPU ever has a negative margin. The question is whether that's good enough in the long run to keep making new designs and break even, particularly as Intel is making a ton of money on processors that AMD can't compete against. Their Ivy Bridge processors should cost about 75% of a Sandy Bridge but sell for 98% of the price. Intel now has huge margins because AMD can't keep the pressure up; it doesn't really help AMD to surrender the high end, because that only gives Intel a bigger war chest.

This launch is okay - it's all-around much better than Llano and keeps a fair pace with Intel, but it obviously tops out if you want CPU performance. What will be interesting to see is next year, when Intel will have both a completely new architecture for the Atom and be on their best process technology. Then I fear AMD may be facing the two-front war again, on both the high and low end. Right now the Atom is a little too gimped to actually threaten AMD's offerings. I expect Intel just wants AMD crippled, not killed, though, to avoid antitrust regulators, so I think they'll be around while Intel makes all the money.

Re:But will it stand up against Intel? (0)

Anonymous Coward | about 2 years ago | (#40009997)

Yeah, this is a tough one. There's obviously no pricing on THIS Trinity yet.

An XPS 13 with an i7 (the one that slaps the Trinity around WRT battery life) starts at $1500. The Z830 is $900. The V131 is $700, or $800 if it were in a dual-channel setup. And I couldn't find a price for the TimelineU M3 (I was mainly looking at ones that beat the Trinity in battery life). SUPPOSEDLY, some Trinitys might go for $600-700, but that was back when only marketing was releasing info on it.

ta3o (-1)

Anonymous Coward | about 2 years ago | (#40009283)

Of jBSD/OS. a irc.secsup.org or

Looks like Price/Performance win over Intel (4, Insightful)

Anonymous Coward | about 2 years ago | (#40009323)

I've seen a lot of reviews of various laptops that have missed the most important metric in this competition - Price!

What's been common in all reviews is that only the very top-end Intel "integrated" (no separate, discrete GPU) solutions have been competitive with the new Fusion products. We're talking mobile i7s. I don't know if you've priced laptops lately, but i7s are only found in expensive, high-end systems.

The Fusion APUs are nowhere near that expensive. Price-wise, they should be compared to i3s or "Pentium" mobile CPUs, where they will win quite handily!

It turns out that AMD's 'APU' solutions have been very popular with low-end device makers, and AMD sells them by the boatload. What's impressed me, however, is how much Intel has improved their GPU in Ivy Bridge. It was always garbage before, but now it's starting to be something you could call 'low end'.

Re:Looks like Price/Performance win over Intel (0)

Anonymous Coward | about 2 years ago | (#40009623)

Gaming laptops don't use integrated graphics. Compute laptops (which is the other use for an i7) usually don't need high-performance graphics.

So the 'APU' is aimed at the subset of people who want to play games badly on their laptop, or who can use the GPU for some compute tasks.

Re:Looks like Price/Performance win over Intel (0)

Anonymous Coward | about 2 years ago | (#40010401)

Gaming laptops don't use integrated graphics.

I just spent a few hours today banging my head against a brand-new Dell XPS laptop that features *both* an Nvidia GT 420 and an Intel HD. Biggest pain in the arse ever. If it saves anyone some time: once you've installed the Windows drivers and told it to run a given app with the Nvidia card, it will always report the Intel HD card no matter what (which royally screws up your wglGetProcAddress routines at app start-up). The trick is to set the apps up to use Nvidia, then install the drivers again. Then, and only then, will the Nvidia card be reported. So yes, gaming laptops do in fact use integrated graphics cards.....

Re:Looks like Price/Performance win over Intel (0)

Anonymous Coward | about 2 years ago | (#40011057)

Try "optimus" on an open source OS sometime....

Re:Looks like Price/Performance win over Intel (0)

Anonymous Coward | about 2 years ago | (#40011575)

Define "badly". I have an A8-3500M APU. I can play all the latest games on my laptop at very reasonable framerates (centered around 30), often at high settings, and I paid under $500 for this laptop. There are no Intel laptops available that come close to that performance without costing literally double.

In all honesty.... (1)

BLToday (1777712) | about 2 years ago | (#40009331)

I don't transcode, and my Excel sheets aren't that complicated. I suspect that most people are like me: we do basic work and play a game or two. I play TF2 on my laptop, a 3-year-old machine with a new SSD. Plays fine. I can't think of the last time I was truly CPU-limited. I've been GPU-limited since Crysis; I can't play that beyond the low detail level.

Re:In all honesty.... (1)

QuantumRiff (120817) | about 2 years ago | (#40009585)

I got my wife an Acer 10.6-inch thing, somewhere between a nettop and a laptop. She loves it. That little AMD 350 CPU pulls 9 watts of power, so the thing has great battery life (about 7 hours for our usage). It plays video fine, since it has a decent (not great) video chip built into the CPU. No heat, no loud fans that kick on all the time - she really digs it. Not bad for a $350 laptop at Costco.

Re:In all honesty.... (0)

Anonymous Coward | about 2 years ago | (#40009725)

I WANTED to like the AMD E-350 as my HTPC, but I had issues playing back some 720p movies with stupid x264 encode settings (pretty much the placebo preset). Also, it wasn't enough to do decent scaling like madVR or ffdshow Spline/Sinc on even 480p content.

Hella quiet, though. I removed the 60mm fan and put a 120mm above the giant CPU heatsink and nothing else, not even a rear exhaust fan.

Re:In all honesty.... (1)

triffid_98 (899609) | about 2 years ago | (#40010873)

I made an E-350 microATX HTPC also, and I'm quite happy with mine.

It will play pretty much anything I throw at it fine, it's nearly silent, and quite cheap to build (~$300). It can even play video while copying 2 streams (via HDHomeRun) since it's my DVR too.

The only thing that pisses me off, and is not really AMD's fault, is Netflix support. It runs on Silverlight and the E-350 really struggles with it.

The only 'fix' is to configure Netflix to send its lowest-quality (bit-rate) setting.

Re:In all honesty.... (1)

Vancorps (746090) | about 2 years ago | (#40011071)

Very strange, my XBMC rig on Ubuntu plays back full 1080p Blu-ray using the E-350. Sounds like you didn't have hardware acceleration enabled.

A mixed bag (3)

chrysrobyn (106763) | about 2 years ago | (#40009373)

From what I've read, on CPU tasks it's between an i3 and an i5. An i3 is "fast enough" for most general use, so I think that's pretty good. On GPU tasks, it's significantly faster than Intel's integrated chipsets, knocking on the door of respectable gaming performance if not walking into the room.

If you're doing CPU tasks, you really want the i7. If you're doing hard core gaming, you're also going to want the latest generation video card, even if it's an entry model. If your budget is less than $700 and you still want to play video games, Trinity is a good compromise. I think it's perfect for college students.

Re:A mixed bag (1)

Rinikusu (28164) | about 2 years ago | (#40009795)

Honestly, since ditching my desktop, I've been loving my A-series A8-based laptop (upgraded from an A4). I get respectable gaming performance, and it's perfectly fine for my music and media creation, although I will say that if I were a music and media pro I'd probably fork out the dollars for a real rig. It does everything I need it to do decently, the price was certainly right, and for anyone in the $500 laptop market who needs some graphics ability and isn't crunching a lot of numbers (i.e., most college students), this platform is pretty hard to beat. Honestly, if they could figure out how to shoehorn the A-series into the 10.6" netbook format that's currently inhabited by the E-450s, I'd line up to grab one.

Re:A mixed bag (1)

purpledinoz (573045) | about 2 years ago | (#40009819)

I'd trade GPU for CPU any day. My 5-year-old laptop's CPU is good enough most of the time.

Not using it (0)

Anonymous Coward | about 2 years ago | (#40010521)

The Only tasks you can really accomPlish with a 5 year old notebook is email and (light) web surfing. Like most peecee users, you don't really DO anything with the machine (ie photos, slideshows, movie creation). If you did, you'd realize how outdated and underpowered it really is.

Re:Not using it (0)

Anonymous Coward | about 2 years ago | (#40011029)

As long as you have an SSD you could do fine with a 500MHz Celeron.

Re:Not using it (1)

djlowe (41723) | about 2 years ago | (#40011481)

The Only tasks you can really accomPlish with a 5 year old notebook is email and (light) web surfing.

That's complete bullshit. I'm typing this on an IBM ThinkPad T42: http://support.lenovo.com/en_US/detail.page?LegacyDocID=MIGR-57838 [lenovo.com]

Over the years, I've upgraded it to 2GB RAM, bought Windows 7 Professional x86, installed a 320GB HD, and at this point it's pretty much "maxed out" from a hardware perspective [1]. As I type this, it's running Windows 7 Pro x86, playing music via Winamp, with Outlook, Opera, Ubuntu in a VM under VirtualBox (why? why not? *grin*), and 2 RDP sessions to other systems on my home network all open - and it's quick and responsive, perfectly suited for the things I use it for.

Like most peecee users, you don't really DO anything with the machine (ie photos, slideshows, movie creation).

I think I'm doing things with it, and doing them quite well, thanks very much. YOUR problem is that you think that the tasks that you list are the ONLY things worthy of a computer.

If you did, you'd realize how outdated and underpowered it really is.

Again, bullshit. Underpowered for the tasks that you think are important? Sure. Useless, as you imply? Not hardly.

Regards,

dj

Notes:

[1] Now, for those of you wondering why I'd spend money to upgrade an "obsolete" laptop like this: that's easy - I didn't pay for it. It was broken when I got it. I fixed it (because I can), then bought RAM for it when it was dirt cheap... and bought the HD the same way. My total cash investment was more than worth it to me. In return, I got a rock-solid laptop, quick and responsive, to keep in my garage, which isn't heated or cooled; despite that, it keeps running, day in, day out, and has given me access to my home network and the Internet [2] when I'm puttering about in the garage for years. I don't have to baby it; I don't even back it up, since I don't store anything on it that I'd miss if it stopped working - that's what my NAS and its backups are for. Now, if the price of the Pentium M 765 would finally drop to about $50 US, I'd buy one...

[2] I just replaced the Intel 2200 b/g WiFi NIC with a TP-Link Wireless N adapter. Had to patch the BIOS (via the commonly available No 1802 patch) to do so, but now I have Wireless N connectivity to my home network. WELL worth it, for the $ 19 US that I paid, plus a little time researching and tinkering.

[3] This note has no reference from above :)

wrong, Wrong, WRONG!!!!! (0)

Anonymous Coward | about 2 years ago | (#40011697)

8 years ago (that's right, shithead, EIGHT mutha fucking years) Electronic Musician did a test with a laptop to see how it could handle a real-world recording session. The laptop was able to record 6 players who were miked (including a full drum set), provide custom monitor feeds to each musician, and drive a couple of VSTis with usable latency at CD-quality bit rates.

That's with an 8-year-old laptop. Your post shows how full of shit you are, and you have NO IDEA how to utilize technology. My now-defunct 900MHz Athlon system that I assembled in 2000 would still handle HEAVY web surfing today quite easily.

Right Direction (0)

Anonymous Coward | about 2 years ago | (#40009403)

This time AMD can finally say they have better battery life than the equivalently power-rated Intel processor.

Transistor Count: Real or Marketing? (0)

Anonymous Coward | about 2 years ago | (#40009757)

Did AMD triple-check the transistor count this time before announcing this CPU?

Are the claimed 1.3B transistors in this CPU a number from AMD marketing, or are they really there and used in the CPU die?

Stop with the semantic overload of abbreviations!! (-1)

Anonymous Coward | about 2 years ago | (#40010201)

IPC is interprocess communication. Using IPC for an architecture metric will simply confuse any discussions about operating systems.

Re:Stop with the semantic overload of abbreviation (2)

Vancorps (746090) | about 2 years ago | (#40011097)

IPC is instructions per clock and is highly relevant to the discussion at hand.
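As a toy illustration (the counter values below are made up, not measured from Trinity), the architecture metric is just retired instructions divided by elapsed clock cycles:

```python
# Toy illustration of the IPC metric (instructions per clock).
# The counts are hypothetical; in practice they'd come from hardware
# performance counters (e.g. `perf stat` on Linux).

def ipc(instructions_retired: int, cycles: int) -> float:
    """Average number of instructions the core retires per clock cycle."""
    return instructions_retired / cycles

# A core retiring 4.6 billion instructions over 2.3 billion cycles
# averages an IPC of 2.0.
print(ipc(4_600_000_000, 2_300_000_000))  # 2.0
```

Higher IPC at the same clock speed means more work per second, which is why reviews compare it separately from raw GHz.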

Re:Stop with the semantic overload of abbreviation (0)

Anonymous Coward | about 2 years ago | (#40011181)

He clearly never attended a lecture about hardware architecture.

Gaming Laptop (1)

stilz2 (878265) | about 2 years ago | (#40010463)

What I find most impressive about AMD's APUs is that they've made basic gaming on sub-$400 laptops possible.

Re:Gaming Laptop (1)

Anonymous Coward | about 2 years ago | (#40010749)

At the $400 end, which is more likely: wanting speedy number crunching, or light gaming? I think these articles really miss the PRICE difference in FINISHED machines. $600 for AMD is going to get you a more rounded machine than Intel. The i7 is a great thing, but it's a $1300+ entry fee, and the i7 is barely represented at retail. A $600 Intel machine is an i5 with HD 3000 at best.

You'll notice the top Intel spots were taken by quad-core processors or discrete graphics cards... That's 50-100% more money than AMD.

HTPC (1)

corychristison (951993) | about 2 years ago | (#40010921)

So far I have seen no mention of it, but wouldn't this make a great HTPC platform? A very low-powered CPU with a tank of a GPU sounds great to me, especially when your box is idling.

Any thoughts from someone more knowledgeable? I'm still like 5 generations behind, running an AMD X2 5200+.

Re:HTPC (0)

Anonymous Coward | about 2 years ago | (#40011309)

I'm no expert, but this seems like overkill for regular HTPC use. I think AMD's Brazos platform is more than sufficient for HTPC.

Re:HTPC (4, Informative)

Rockoon (1252108) | about 2 years ago | (#40011367)

All of AMD's A-series processors make a great HTPC platform. It's been over a year now with Intel not offering any real competition in this segment once price is factored in. You can trivially get a full 65W A-series HTPC box up and running for under $150, with lots of headroom (that's the price I would quote to friends/coworkers, pocketing the difference as labor costs). The higher-end (100W) A-series parts are only necessary if you are gaming.

Some might say that Intel Atom solutions are price-competitive with the A-series, but the Atom solutions, just like AMD's low-powered E-series lineup, really only work well for HTPC as long as 100% of the video codecs you need are GPU-accelerated. If the Atom is good enough, then an E-series of the same price will be a bit better as well. It's hard to guarantee that every codec you'll use is GPU-accelerated, especially if you're on a Linux distro, so the E-series and Atoms are not really a solution I recommend.