
AMD Trinity APUs Stack Up Well To Intel's Core 3

timothy posted about a year and a half ago | from the those-ndas-must-have-weighed-a-ton! dept.

Barence writes "AMD's APUs combine processor and graphics core in the same chip. Its latest Trinity chips are more powerful than ever, thanks to current-generation Radeon graphics and the same processing cores as AMD's full-fat FX processors. They're designed to take down Intel's Core i3 chips, and the first application and gaming benchmarks are out. With a slight improvement in applications and much more so in games, they're a genuine alternative to the Core i3." MojoKid writes with Hot Hardware's review, which also says the new AMD systems "[look] solid in gaming and multimedia benchmarks," noting that "the CPU cores clock in at 3.8GHz / 4.2GHz for the A10-5800K and 3.6GHz / 3.9GHz for the A8-5600K, taking into account base and maximum turbo speeds, while the graphics cores scale up to 800MHz for the top A10 chip."

Wow (4, Funny)

binarylarry (1338699) | about a year and a half ago | (#41477411)

AMD is finally competitive with Intel's lowest end offerings again!

Yay!

Re:Wow (2)

i kan reed (749298) | about a year and a half ago | (#41477467)

The advantage of AMD chips recently has been avoiding Intel integrated graphics. I know you CAN get most Intel chips without it, but it was a real anti-selling point for me.

Re:Wow (4, Insightful)

h4rr4r (612664) | about a year and a half ago | (#41477523)

You know you can just not use it, right?
Why bother looking for a chip without it?

Heck, these days it is even usable and has good open drivers.

Re:Wow (1)

Anonymous Coward | about a year and a half ago | (#41477639)

Yes, you can actually disable it, but that requires actively doing so. By default it's on, takes over your GPU and degrades performance; Intel sticks it on every chip, and if you use Windows (what else would you game on?) it also auto-installs drivers. Highly annoying.

Re:Wow (1)

Anonymous Coward | about a year and a half ago | (#41477663)

"takes over your GPU"

What?

Maybe, I don't know, you don't plug your monitor into the video output on the motherboard, and you plug it into the one on the dedicated card instead?

Also, 10 seconds in the BIOS is not a problem. Any true computer build requires a stop at the BIOS anyway.

Have you ever actually built a computer before, or are you just doing this as a thought experiment?

Re:Wow (1)

h4rr4r (612664) | about a year and a half ago | (#41477705)

I game on my Linux box; I don't, however, select motherboards with video out. Solves that problem quite nicely.

Re:Wow (4, Insightful)

TheLink (130905) | about a year and a half ago | (#41477957)

Does it really degrade performance? I've had motherboards with Intel graphics, and I just plug an ATI/NVidia video card into them, install the drivers, and it seems to work. Then if the video card fails (which does happen) I have the Intel graphics to fall back on - so I can still use the PC for normal desktop stuff even if I can't play games that require higher graphics performance.

Re:Wow (1)

Joce640k (829181) | about a year and a half ago | (#41478821)

I still think the system used by PowerVR was the best. You have your desktop graphics, and the 3D is overlaid on top of it. The video card has no external connector. It makes your 3D card cheaper, and to some extent it frees you from buggy drivers (your desktop graphics are unaffected if the 3D card crashes). You could save power by turning it off when not needed. You can even pull the 3D card out if it dies and still be able to use the machine.

I think we'd all be better off these days if that sort of arrangement had become standard.

Re:Wow (1)

gman003 (1693318) | about a year and a half ago | (#41478623)

Weird, my current laptop has a processor with integrated graphics, but to my knowledge it was never used. Even when I was reinstalling and had no drivers for either my integrated or discrete graphics, it seemed to use the default VGA drivers on the NVidia card.

I'd accuse you of being an AMD shill, but you sound more like you just honestly don't know how things work.

PS: When I finally get around to installing Linux on this thing (any day now, I swear!), I'm actually planning to just use the Intel drivers, for better battery life. If I wanted to do any real gaming, I'd reboot in Windows, and apparently the NVidia blob under Linux doesn't support power management well. Sometimes it's good to have options.

Re:Wow (5, Insightful)

Skarecrow77 (1714214) | about a year and a half ago | (#41477619)

Ironic statement, since the main selling point of the chip being reviewed here is its integrated graphics.

Which I find just silly, really. These are fine chips for building a PC for your little cousin who surfs the web and maybe plays World of Warcraft. For any real build, integrated graphics, for all their advancements, still read like:
Intel: "Our new HD4000 graphics are nearly as fast as a mainstream card from 8 years ago!"
AMD: "HAH, our new chip's graphics cores are as fast as a mainstream card from 6 years ago! We're two years of obsolescence better!"

Even a $100 modern dedicated card will wallop either of these chips' solutions.

Re:Wow (2)

h4rr4r (612664) | about a year and a half ago | (#41477735)

For 90% of folks either of these is good enough.
I have played portal 2 on my macbook air using the Sandy Bridge graphics. It was fine.

Very few folks care about dedicated graphics cards these days.

I have one machine that has one; it cost $100, and that is it. I might buy another if Steam for Linux ever launches.

Re:Wow (2, Insightful)

binarylarry (1338699) | about a year and a half ago | (#41477819)

No one cares about dedicated graphics cards.... unless they play games.

I don't know what region you're from where all the gamers just use the onboard GPU that comes with their mobo.

Re:Wow (1)

h4rr4r (612664) | about a year and a half ago | (#41478005)

That is why I said 90%, the other 10% are the gamers.

Re:Wow (0)

Anonymous Coward | about a year and a half ago | (#41478517)

I would be surprised if only 10% of computer owners are gamers, considering that the video game industry is larger than the movie industry.

Re:Wow (1)

h4rr4r (612664) | about a year and a half ago | (#41478683)

The games industry includes Farmville and WoW, and lots more similar games that do not need a dedicated card.

How much of the games industry even needs a dedicated card? The video game industry includes consoles, smartphones and tablets as well.

Re:Wow (0)

Anonymous Coward | about a year and a half ago | (#41478105)

The NVIDIA 9400M integrated into my ION motherboard plays Warzone 2100 just fine. I game probably an hour a day on average, but still don't know why I'd care to spend $5 on a graphics card (assuming I had a slot to put it into, which I don't).

Re:Wow (0, Troll)

lightknight (213164) | about a year and a half ago | (#41478229)

Or use Windows, or possibly GNOME... or do OpenCL or OpenGL programming... or-

The list goes on. The fact that people are still selling craptacular integrated video chipsets in this day and age saddens me greatly. Guys, it's 2012... pony up for a dedicated video card with dedicated video RAM. Quit trying to save a buck or two on a component you really don't want to be cheap on.

Listening to the constant roar of bullsh*t over integrated video cards vs. dedicated video cards, and how 'it will only matter to a gamer', ranks up there with the mindless debates about whether a regular user 'needs' an aluminum or copper heatsink. The answer is yes to copper (unless you can get something better, like silver), and yes to a dedicated video card.

Do you know what video card a hard-core gamer is going to use? Whatever it is, it will be 2 or 3 of them, in CrossFire or whatever configuration. That's a gamer.

Re:Wow (3, Insightful)

tlhIngan (30335) | about a year and a half ago | (#41478557)

Or use Windows, or possibly GNOME... or do OpenCL or OpenGL programming... or-

The list goes on. The fact that people are still selling craptacular integrated video chipsets in this day and age saddens me greatly. Guys, it's 2012... pony up for a dedicated video card with dedicated video RAM. Quit trying to save a buck or two on a component you really don't want to be cheap on.

Well, I think you can do OpenCL on Intel HD3xxx/4xxx chips these days. At least Apple seems to: on the retina MBP, they have a custom shader to handle the scaling from the double-size framebuffers to native panel size (if you're running at the higher-than-half-size modes, e.g., 1920x1200), so that when you switch between GPUs you don't notice it happening like you would if you ran native.

As for why integrated graphics - easy - price. The customer sees $500 laptops, and they end up demanding cheap laptops. Think of all those /. arguments where "Apple is expensive! Their laptops start at $1000 when everyone else's start at $500!"

Hell, we call PCs (desktops and laptops) costing over $1000 "premium" nowadays. Expensive, even, when we're constantly inundated with sub-$500 laptops and PCs.

It's why netbooks died quickly after the launch of the iPad (no manufacturer wanted to build no-profit PCs, and tablets at $500 were far more profitable), and why you can get i7 laptops with integrated graphics and 1366x768 screens. It's why the $1000+ "ultrabooks" seem to be the ones everyone's dumping money into making (with high-res screens!), etc.

The race to the bottom has led manufacturers to focus on what everyone says they should look for in a PC - GHz (more is better) and GB (more is better, once for RAM and once for HDD). Which means stuff like graphics and screen resolution (two of the most expensive parts) get ignored and skimped on, because consumers don't care.

Hell, a fully tricked-out retina MBP costs under $4000, which only half a decade ago would've been considered normal for a high-end PC. These days that puts it basically in the top-end, "for 1%ers only" category.

Re:Wow (1, Informative)

TeXMaster (593524) | about a year and a half ago | (#41478805)

Or use Windows, or possibly GNOME... or do OpenCL or OpenGL programming... or-

The list goes on. The fact that people are still selling craptacular integrated video chipsets in this day and age saddens me greatly. Guys, it's 2012... pony up for a dedicated video card with dedicated video RAM. Quit trying to save a buck or two on a component you really don't want to be cheap on.

Well, I think you can do OpenCL on Intel HD3xxx/4xxx chips these days.

AFAIK, Intel HD3xxx is not OpenCL capable, and Intel HD4xxx is officially supported by Intel on Windows only (no Linux drivers). This is in sharp contrast with AMD, which has much better OpenCL support for everything they ship (CPUs, GPUs and APUs).
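A quick way to check claims like this on your own machine is to enumerate the OpenCL platforms and devices the installed drivers actually expose. The minimal C sketch below is my own illustration, not from the thread; if an Intel HD or AMD GPU is OpenCL-capable on your OS, it will show up in the output. Link with -lOpenCL.

    /* List every OpenCL platform and device the drivers expose.
     * Illustrative checker only; build with: cc list_cl.c -lOpenCL */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void)
    {
        cl_platform_id platforms[8];
        cl_uint nplat = 0;
        clGetPlatformIDs(8, platforms, &nplat);

        for (cl_uint p = 0; p < nplat; p++) {
            char name[256];
            clGetPlatformInfo(platforms[p], CL_PLATFORM_NAME,
                              sizeof(name), name, NULL);
            printf("Platform: %s\n", name);

            cl_device_id devs[8];
            cl_uint ndev = 0;   /* stays 0 if no devices are found */
            clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_ALL,
                           8, devs, &ndev);
            for (cl_uint d = 0; d < ndev; d++) {
                char dname[256];
                clGetDeviceInfo(devs[d], CL_DEVICE_NAME,
                                sizeof(dname), dname, NULL);
                printf("  Device: %s\n", dname);
            }
        }
        return 0;
    }

If the GPU is missing from the list while the CPU shows up, you have the situation described above: hardware that is OpenCL-capable on paper but not supported by the drivers on your platform.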

Re:Wow (1)

Jaktar (975138) | about a year and a half ago | (#41478815)

For my HTPC, integrated graphics is nice. Less power consumed overall, less space, less noise from extra fans.

Truly there are places where integrated works great, like work machines, kiosks, and library/research terminals. Those aren't really a small portion of the market.

Research (1)

microbox (704317) | about a year and a half ago | (#41478729)

No one cares about dedicated graphics cards.... unless they play games.

Or do cuda-enabled research.

Re:Wow (3, Interesting)

Skarecrow77 (1714214) | about a year and a half ago | (#41477945)

Pretty much until the Sandy Bridge era, integrated graphics were completely unusable for gaming, and they are still years behind dedicated cards.

Your statement that "for 90% of folks either of these is good enough" is true, but misleading. It is true that the extent of desktop/laptop gaming most people are interested in maxes out at Farmville (or whatever the new Facebook gaming trend is; I certainly don't pay attention), and they do their gaming on their phone, tablet or console.

These articles, however, are written for the community that builds its own PCs, or at the very least is quite picky about what is inside its machines. You don't read these articles unless you care about such things. From that perspective, for the majority of the target audience of TFA's links, the graphics performance of either brand is hardly good enough for any sort of main machine build.

Re:Wow (1)

RMingin (985478) | about a year and a half ago | (#41477999)

By using Portal 2, you have demonstrated only how very well-coded and well-optimized Portal 2 is. It runs fine on just about anything that can make triangles. Recently I watched Portal 2 running at high resolution and high apparent quality on a GeForce 8600 GT. Those are ancient, and weren't good even when they were new.

Re:Wow (1)

h4rr4r (612664) | about a year and a half ago | (#41478109)

So what was good when they were new?
I had one and it seemed up to the task for every game that was new at that time.

Re:Wow (1)

RMingin (985478) | about a year and a half ago | (#41478193)

I have Sandy Bridge in my laptop and desktop, and an Ivy Bridge desktop for a project, and I'm not disputing that their graphics are quite good, particularly for an integrated chip.

I just meant that proclaiming Portal 2 performance wasn't going to get the traction you were looking for.

Re:Wow (2)

h4rr4r (612664) | about a year and a half ago | (#41478225)

Why not?

Portal 2 is far more graphically intense than any game 90% of people are ever going to play on their PC. WoW and Farmville are more likely the targets for this, and integrated covers that fine.

Re:Wow (1)

FreonTrip (694097) | about a year and a half ago | (#41478783)

That was kind of an awkward transitional moment in GPU development, where the midrange parts were married to memory that couldn't do the cores justice, but the high-end parts are still half-decent for low-to-middling resolutions and detail settings today. The 8800GT and Radeon 2900XT can still get the job done (though the latter card will heat your house nearly as well as a hot Pentium D...). I had an 8600GTS until recently, and it wasn't half-bad, but also wasn't much more than an incremental step above a fast Geforce 7900 unless you cared deeply about its anisotropic filtering quality.

Re:Wow (1)

PRMan (959735) | about a year and a half ago | (#41477883)

The HD 4000 is probably better than you think. It plays most modern games adequately - not with every feature maxed on a 2560x1920 monitor, but better than any console.

Re:Wow (3, Interesting)

Skarecrow77 (1714214) | about a year and a half ago | (#41478159)

I don't doubt that it works. I have a previous generation of integrated Intel graphics (yes, I am aware of the advancements of the HD2000/3000/4000 series in comparison) on this laptop, and -can- game with the settings turned down... way down.

That said, I think my (somewhat cynical) "we are as good as a 6-year-old card!" comments are pretty appropriate. Tom's Hardware [tomshardware.com] ranks the HD4000 roughly on par with the NVIDIA 6800 Ultra (released in 2004) or the 8600GT (released in 2006).

The 8600GT was a fine midrange card, and can still run today's games, albeit at reduced resolution and details. If all you're looking for is the ability to run a game, period, these chips will work, but I can't really say they'd do much better than a console (the PS3 GPU is essentially an NVIDIA GeForce 7800, and the 360 GPU is similar, only with unified shaders), and again they don't hold a candle to even modest dedicated cards today.

In a laptop, I might be interested. On the desktop, which is what the chips being reviewed are for, I can't see much use for these things when it comes to gaming (which, again, is their big selling point right now). If you're building a desktop machine you expect to do any gaming on, and the extra $100 for, say, a GTS 450 or something like that is a budget breaker, maybe you should be saving up an extra month.

Re:Wow (2)

timeOday (582209) | about a year and a half ago | (#41478257)

That said, I think my (somewhat cynical) "we are as good as a 6-year-old card!" comments are pretty appropriate. Tom's Hardware ranks the HD4000 roughly on par with the NVIDIA 6800 Ultra (released in 2004) or the 8600GT (released in 2006).

The AMD Trinity in this review scored nearly double the HD4000.

Re:Wow (-1, Troll)

binarylarry (1338699) | about a year and a half ago | (#41478327)

Yeah but do you realize how flaky and buggy the use of that GPU will be?

I'd rather have a GPU that runs at half the speed but works 100% of the time.

Re:Wow (0)

Anonymous Coward | about a year and a half ago | (#41478769)

Is this a comment on AMD drivers in general, or are you being specific about AMD drivers on *nix? I've actually had good runs of late with CCC on Windows, but I will admit that AMD has missed the boat (NVidia too, it looks like) with regards to drivers for *nix desktops. Intel has done good stuff by releasing their *nix graphic drivers into the wild.

Re:Wow (2)

Bob the Super Hamste (1152367) | about a year and a half ago | (#41478847)

Hell, even a $50 card is massively better. I don't play games, but the performance improvements I have seen in GIS and cartography work from going from integrated or onboard graphics to a discrete ~$50 card have always been impressive.

Re:Wow (1, Flamebait)

Targon (17348) | about a year and a half ago | (#41477931)

Considering the piss-poor quality of Intel-based machines in the $500-and-under range, while AMD-based machines in that range do tend to have higher-quality components, you could compare buying an Intel-based machine to putting a Ferrari engine into a Yugo. Yeah, it may be faster, but the overall experience of owning it will be shorter and more prone to failure. Obviously, going to a higher-end Intel machine would result in a better experience, but at the low end, Intel-based machines have a much higher failure rate across all brands compared to AMD.

Re:Wow (1)

craigminah (1885846) | about a year and a half ago | (#41478145)

Intel's latest HDxxx graphics options aren't that bad. I have an HTPC with a Core i3 and HD2000 graphics, and it does 1080p fine, which is all I ask for, especially at 35W. Glad to see AMD improving, but your argument is odd.

Re:Wow (2)

i kan reed (749298) | about a year and a half ago | (#41478203)

All I can say is that I've been burned by good specs that somehow manage to lack critical (to me) graphics functions, like support for modern shader models.

Re:Wow (0)

Anonymous Coward | about a year and a half ago | (#41478333)

Actually, I only buy AMD processors, and not having Intel graphics is a downside for me, because of all the crap required to get 3D working in Linux. With Intel graphics (my work computer) it works automatically. The temptation to buy Intel processors is strong, but when I price them (including motherboards) I always change my mind.

graphics blows intel away and what better faster c (1)

Joe_Dragon (2206452) | about a year and a half ago | (#41477475)

graphics blows Intel away and what better faster cpu or slower cpu with much better video??

Re:graphics blows intel away and what better faste (-1, Flamebait)

binarylarry (1338699) | about a year and a half ago | (#41477507)

It's an ATI design, so it's horribly shitty even if it's fast.

Slow CPU + Shitty GPU = WINNING (AMD land)

Re:graphics blows intel away and what better faste (0)

Anonymous Coward | about a year and a half ago | (#41477521)

Someone has to pay for giant ad campaigns and blue morons jumping around on tv.

Check the mirror to see who that is.

Re:graphics blows intel away and what better faste (0)

Anonymous Coward | about a year and a half ago | (#41477509)

graphics blows Intel away and what better faster cpu or slower cpu with much better video??

You write at about a fourth-grade level.

Do something about it.

Re:Wow (0)

Anonymous Coward | about a year and a half ago | (#41477539)

They say they are comparing a Sandy Bridge i3, show Sandy Bridge results, but then talk about HD 2500 and HD 4000 graphics benchmarks. Why can they not flat-out state which Intel processor they're testing, or put the results in a table for comparison?

I call bullshit on the lot.

Re:Wow (2)

h4rr4r (612664) | about a year and a half ago | (#41477559)

Why even compare to Sandy Bridge at all?
At least compare to Ivy Bridge if you are going to try to fight i3s.

Re:Wow (1)

ddtmm (549094) | about a year and a half ago | (#41477665)

Doesn't say much for AMD when they're comparing to Intel's entry-level processors. And they lost me at integrated ATI graphics. I gave up on their shitty drivers many years ago. Doesn't matter how fast their GPUs are if the software side is full of bugs. They're the best example of "just get it out the door, we'll fix the bugs in the next version." But they never do...

Re:Wow (3, Insightful)

h4rr4r (612664) | about a year and a half ago | (#41477787)

Why would they not compare their new entry-level CPU to their competitor's entry-level CPU?

These CPUs are designed to be priced against the i3; of course they should be compared to the i3.

You do realize that an NVIDIA card will work just fine in a computer with an AMD or Intel CPU, right?

Re:Wow (5, Insightful)

characterZer0 (138196) | about a year and a half ago | (#41477849)

I gave up on ATI's drivers too and bought a new laptop with an nVidia card. The state of the drivers is so pathetic that the laptop will not even boot nine times out of ten unless I disable the discrete card and use the integrated Intel GPU because otherwise the Optimus screws everything up. I will take occasionally buggy ATI over completely non-functional nVidia next time.

Re:Wow (0)

Anonymous Coward | about a year and a half ago | (#41477569)

Your sarcasm aside, the A10 processor is designed to compete with the i3, and will be priced around the same point as the highest-cost i3.

If you're looking for performance, the new FX line (also based on the Trinity core) will be priced in line with the i5-i7 line, with the highest-end chip probably priced somewhere in the middle of the two.

Re:Wow (1)

binarylarry (1338699) | about a year and a half ago | (#41477615)

I'm not really an AMD hater, well aside from their shitty GPUs (I do 3d programming and ATI/AMD GPUs are the bane of our existence).

I used to buy them back in the Athlon X2 days and I'd love to see them become competitive with Intel again.

Re:Wow (1)

Skarecrow77 (1714214) | about a year and a half ago | (#41477703)

I used to buy them back in the Athlon X2 days

"back in the day?" Athlon X2? Wasn't that like a year or two ago?

My first AMD build was a K5 200MHz (OC'd to 225MHz!), and I was late to the party... my buddy had a 40MHz AMD 386 years beforehand.

Whippersnapper, get off my lawn!

Re:Wow (2)

poly_pusher (1004145) | about a year and a half ago | (#41478087)

I still look at Barcelona as one of the bigger technology fails of the past 10 years. AMD was gaining market share like crazy and Intel was struggling to stay performance-competitive; then Core and Core 2 came out. If only Barcelona had been what was promised. I still say AMD is amazing considering their considerably lower R&D budget.

As for AMD GPUs, I've had very unusual experiences with them. In some ways they can be extraordinarily powerful. I'm a 3D artist, and as an example, if you are tumbling around a dense model in a 3D viewport, AMD GPUs just chew it up compared to NVIDIA's offerings. However, the moment you start moving vertices around, performance drops substantially compared with NVIDIA cards.

I'd like to know more about what you find problematic with AMD GPUs. My understanding was that AMD has better, or more compliant, OpenGL drivers, while NVIDIA has all these custom extensions outside the specification which cause developers headaches. Are your issues specific to an API, e.g. DX11 vs. OpenGL? I love learning more about this stuff.

Re:Wow (4, Insightful)

pushing-robot (1037830) | about a year and a half ago | (#41477599)

Or, more accurately, AMD's integrated video is better than Intel's integrated video (seriously, that's all they tested!).
And these AMD chips still double the system power consumption [hothardware.com] over their Intel counterparts.

So if you're part of the subset of gamers who morally object to dedicated video cards but still enjoy noisy fans and high electricity bills, AMD has a product just for you! Woo!

Re:Wow (4, Insightful)

h4rr4r (612664) | about a year and a half ago | (#41477673)

You are actually bitching about less power than a light bulb used to use?

At worst it looks like ~60 watts more on the two higher-end units. How low-power is the monitor, if that constitutes doubling the power? I am betting "total system" in this little test ignores the monitor.

Oh noes, tens of dollars more per year in electricity! The HORRORS! However will I afford such an extravagance, which costs per year almost what two drinks at the bar cost.

If they are within 100 watts I would call it a wash and be far more interested in computing power per dollar of upfront cost. AMD has traditionally done very well on that test and only started failing it very recently.
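To put rough numbers on "tens of dollars" - my arithmetic with assumed rates, not figures from the review: a 60 W difference for four hours of gaming a day is 60 W x 4 h x 365 days = 87.6 kWh a year, which at a typical $0.12/kWh comes to about $10.50. Even running flat out 24/7 it is 526 kWh, or roughly $63 a year.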

Re:Wow (0)

Anonymous Coward | about a year and a half ago | (#41478053)

Computing power per $ upfront ?

I'd rather save money long-term while also saving lots of time by buying something with reliable driver support.

Re:Wow (1)

h4rr4r (612664) | about a year and a half ago | (#41478165)

When you are talking about saving $30 over the course of the life of the machine, you are never going to recoup that upfront cost.

The only machine I build is my gaming machine; AMD CPUs are fine, and I will be using an NVIDIA card no matter what CPU I get, so drivers are covered.

Re:Wow (-1)

Anonymous Coward | about a year and a half ago | (#41477877)

And these AMD chips still double the system power consumption [hothardware.com] over their Intel counterparts.

Nice article, except it's comparing apples and oranges.

The Intel Core i3 series, especially the 3x series which the article uses, are mobile processors, while the AMD A10-5x, A8-5x, etc. shown in the article are the desktop versions.

The actual mobile version of the AMD chip draws way fewer watts than the Intel, with a maximum of 35-40 watts under load.

Still, nice FUD.

Re:Wow (3, Interesting)

beelsebob (529313) | about a year and a half ago | (#41477709)

Except that none of the benchmarks actually cover CPU speed, because AMD have put all the reviewers under NDA until the chip is released. That rather suggests they haven't caught up; they're just showing off the better IGP, which no one playing games will use anyway, and which anyone not playing games won't give a shit about.

Re:Wow (3, Insightful)

fuzzyfuzzyfungus (1223518) | about a year and a half ago | (#41478189)

Except that none of the benchmarks actually cover CPU speed, because AMD have put all the reviewers under NDA until the chip is released. That rather suggests they haven't caught up; they're just showing off the better IGP, which no one playing games will use anyway, and which anyone not playing games won't give a shit about.

While I'm not hugely sanguine about AMD's prospects (unfortunately, it isn't going to be pretty if the world is divided between x86s priced like it's still 1995 and weedy ARM lockdown boxes, so it would be nice if AMD could survive and keep Intel in check), there is one factor that makes IGPs much more of a big deal than they used to be:

Laptops. Back in the day, when laptops were actually expensive, the bog-standard 'family computer from Best Buy, chosen by idiots and sold by morons on commission' would be a desktop of some flavor. Unless the system was terminally cheap and nasty and entirely lacked an AGP/PCIe slot, it didn't matter what IGP it had, because if little Timmy or Suzy decided they wanted to do some gaming, they'd just buy a graphics card and pop it in.

Now, it's increasingly likely that the family computer will be a laptop (or occasionally an all-in-one or other non-mini-tower) of equally unexciting quality but substantially lower upgradeability. If you want graphics, you either use what you bought or you buy a whole new computer (or just a console).

This makes the fact that some, but not all, IGPs can actually run reasonably contemporary games (especially at the shitty 1366x768 that a cheap laptop will almost certainly be displaying) much more important to some buyers, and to the PC market generally.

Re:Wow (1)

smi.james.th (1706780) | about a year and a half ago | (#41478343)

In fairness, i3 isn't Intel's lowest end offering, there's still Pentium and Celeron which are lower than that. It's kind of mid-range...

AMD has forbidden testers to write about cpuperfor (5, Interesting)

Laglorden (87845) | about a year and a half ago | (#41477579)

AMD has apparently forbidden testers to write about CPU performance.

In their NDA contract it's specified:

"In previewing x86 applications, without providing hard numbers until October [something], we are hoping that you will be able to convey what is most important to the end-user which is what the experience of using the system is like. As one of the foremost evaluators of technology, you are in a unique position to draw educated comparisons and conclusions based on real-world experience with the platform,"

and

"The topics which you must be held for the October [sometime], 2012 embargo lift are
        - Overclocking
        - Pricing
        - Non game benchmarks"

So the reviews coming out are only from sources that have decided to go along with those "guidelines". In other words, not complete; I would say extremely biased.

Re:AMD has forbidden testers to write about cpuper (0)

Anonymous Coward | about a year and a half ago | (#41477719)

So by examining the contents of the restriction list, that also tells us some things about this new AMD tech.

1) The advertised clock speed is already 115% of the safe usage speed.
2) It will cost about $2250
3) it is a very VERY specialized chip

Re:AMD has forbidden testers to write about cpuper (0)

Anonymous Coward | about a year and a half ago | (#41477833)

It's not [something] or [sometime], it's the 2nd; we've known that for 2 weeks...

Re:AMD has forbidden testers to write about cpuper (1)

h4rr4r (612664) | about a year and a half ago | (#41477873)

All prerelease info is like this; same with any reviewer who got the part for free.

What we really need is the Consumer Reports of computer hardware: buy only from normal vendors and carry no advertising.

Unfair benchmark publishing from AMD (5, Informative)

IYagami (136831) | about a year and a half ago | (#41477585)

AMD allowed websites to publish a preview of the benchmarks before the estimated date if they only focused on graphics performance. This is an unfair move by AMD.

Read http://techreport.com/blog/23638/amd-attempts-to-shape-review-content-with-staged-release-of-info [techreport.com] for more details

(maybe in a couple of weeks you will find that AMD Trinity APUs have abysmal x86 performance compared to Intel CPUs)

Disclaimer: I own a laptop with an AMD cpu inside

Re:Unfair benchmark publishing from AMD (3, Insightful)

Targon (17348) | about a year and a half ago | (#41477741)

In this day and age, CPU performance means less, and overall performance is the thing people look for. A quad-core 1.5GHz is easily enough for your average home user day to day, and at that point, GPU power for things like full-screen YouTube or Netflix videos becomes a bit more of a concern. We WILL have to wait and see what the performance numbers come in at, but a 10% bump in CPU performance over the last generation is expected from AMD.

Re:Unfair benchmark publishing from AMD (1)

PRMan (959735) | about a year and a half ago | (#41477973)

This. I built my wife a machine with an i3 but also with 8GB RAM and an SSD. It has Intel HD 4000 graphics. It screams. Unless you are ripping MP3s, editing video or compiling Chrome, your CPU is easily able to handle any task instantly anyway. And my wife plays Facebook-style games, which are smooth and fast on an Intel HD 4000 anyway.

Re:Unfair benchmark publishing from AMD (1)

fuzzyfuzzyfungus (1223518) | about a year and a half ago | (#41478311)

I am a trifle surprised that AMD is trying to stage-manage the CPU performance benchmarking (since everybody who cares already has an informed guess based on the last model, and in the absence of information pessimists are simply going to assume the part is bloody dire, so actual benchmarks could hardly make things worse); but it is lovely how it is practically impossible to buy a non-netbook with a CPU too weak for general purposes.

The big killer seems to be disk I/O (well, that and the gigantic bottleneck that is your ISP; but that isn't a computer part). CPUs are hard to go wrong with, and GPUs are punchy and fairly cheap unless you have a very, very high-resolution monitor; but the SSD that you really want is still a pretty expensive piece of gear, and there are enough horror stories of firmware issues and mystery death, even among the reputable brands, that it still has a bit of a wild-west flavor to it. Not nearly as bad as it was; but SSDs seem to be one of the few areas where an all-solid-state part (and not even some screaming 100+ watt cooling nightmare) has reliability alarmingly similar to its mechanical counterpart (despite the fact that HDDs sound like they shouldn't even work outside of a cleanroom full of engineers, much less slung in my laptop bag and bumped around all day)...

Re:Unfair benchmark publishing from AMD (0)

Anonymous Coward | about a year and a half ago | (#41477875)

5 days is a couple of weeks now? are there still 52 weeks in a year or did they change it to 120ish??

Re:Unfair benchmark publishing from AMD (1)

Anonymous Coward | about a year and a half ago | (#41477895)

Undoubtedly. Intel has been destroying AMD in CPU performance lately.

However, I don't think raw CPU performance is the bottleneck for your average computer user. For browsing, office work, and social networking, all but the very, very lowest-end modern processors are going to be just about overkill in terms of computational power.

GPUs, on the other hand, are a serious bottleneck when it comes to games. Even "casual" games now make heavy use of GPU rendering, because people like smooth motion and flashy graphics. Even "2D" playfields are often rendered as 3D polygons, because that gives your game objects depth and a dynamic appearance. There are a lot of visual effects that can pretty much only be realistically achieved in shader code.

For casual games, the GPU absolutely becomes the bottleneck. For the vast majority of computer users, the AMD solution clearly brings more value.

For high performance, or games? Want a dedicated GPU? Intel all the way. No questions.
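As a concrete illustration of the point above about effects that only work in shader code, here is a minimal sketch of my own (not from the thread): a GLSL fragment shader, written as the C string it would be before being handed to glShaderSource, that darkens pixels toward the screen edges. A per-pixel vignette like this is trivial on the GPU and impractical to do per frame on the CPU. Names like "scene" and "uv" are my own, supplied by a hypothetical host program.

    /* Illustrative vignette fragment shader, embedded as a C string. */
    static const char *vignette_frag =
        "#version 120\n"
        "uniform sampler2D scene;\n"   /* the rendered frame */
        "varying vec2 uv;\n"           /* 0..1 screen coords from the vertex shader */
        "void main() {\n"
        "    vec4 color = texture2D(scene, uv);\n"
        "    float d = distance(uv, vec2(0.5));\n"        /* distance from centre */
        "    gl_FragColor = color * (1.0 - 0.8 * d * d);\n" /* darken toward edges */
        "}\n";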

Re:Unfair benchmark publishing from AMD (1)

h4rr4r (612664) | about a year and a half ago | (#41478095)

For casual games, integrated video is now good enough; it has been since at least Sandy Bridge.

Hard drive speed is the biggest desktop bottleneck these days. Stick an SSD in any old desktop and watch what that does.
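If you want to see the gap for yourself, here is a minimal C sketch - my own, not from the thread - that times a sequential read of a large file and reports throughput. Drop the OS page cache first (e.g. echo 3 > /proc/sys/vm/drop_caches on Linux), or you'll be measuring RAM rather than the disk.

    /* Sequential read benchmark. Usage: ./readbench <large-file> */
    #include <stdio.h>
    #include <time.h>

    int main(int argc, char **argv)
    {
        if (argc < 2) { fprintf(stderr, "usage: %s <file>\n", argv[0]); return 1; }

        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror("fopen"); return 1; }

        static char buf[1 << 20];               /* 1 MiB read buffer */
        size_t total = 0, n;
        struct timespec t0, t1;

        clock_gettime(CLOCK_MONOTONIC, &t0);
        while ((n = fread(buf, 1, sizeof buf, f)) > 0)
            total += n;                         /* count bytes actually read */
        clock_gettime(CLOCK_MONOTONIC, &t1);
        fclose(f);

        double secs = (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
        printf("%zu bytes in %.2f s = %.1f MB/s\n",
               total, secs, total / secs / 1e6);
        return 0;
    }

A spinning disk will typically land around 100 MB/s sequential; a SATA SSD several times that, and the gap on random access is far larger, which is what makes the desktop feel faster.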

AMD has better multicore. (0)

Anonymous Coward | about a year and a half ago | (#41478271)

However, there is very little that makes good use of more than two or three cores, and therefore the higher clocks of the Intel line make it the winner.

Re:AMD has better multicore. (2)

h4rr4r (612664) | about a year and a half ago | (#41478595)

Multiple applications open at once cover the multicore need quite nicely. At this point the clocks on both are in good-enough territory; for me it would just come down to price.

Re:Unfair benchmark publishing from AMD (1)

h4rr4r (612664) | about a year and a half ago | (#41477939)

This is what everyone does.
Any test with early parts or free parts is rigged; don't trust them. Either the test is rigged or, very commonly, the part is.

This is not limited to computer parts; car reviews are often of cars specially set up for the reviewers. Lambo brings two cars to every review: one set up for going fast in a straight line and one for cornering work. If you dare mention this, or use the cars in ways they are not set up for and print it, you will never review another Lambo without buying one or borrowing one from a buyer.

Re:Unfair benchmark publishing from AMD (0)

Anonymous Coward | about a year and a half ago | (#41478057)

Given the status quo in reviews that benchmark Intel-sponsored code the majority of people will never use, this is very fair. More people will want to do some semi-serious gaming than will do heavy-duty 3D rendering, intensive video encoding passes and everything else reviewers like to use.

In reality, CPU performance doesn't matter anymore for normal users, and while AMD is clearly behind Intel in raw benchmarks, for normal tasks - or even for things like programming - a 6-year-old notebook processor is enough. Okay, I know everyone on /. thinks compiling Linux+userland the whole time is normal, but strangely enough that's not what common people do.

- Megol

Re:Unfair benchmark publishing from AMD (2)

Sloppy (14984) | about a year and a half ago | (#41478061)

Actually, that article says you have to focus on gaming performance, not graphics. So: bring on the Dwarf Fortress benchmarks!

Core 3? (1)

L4t3r4lu5 (1216702) | about a year and a half ago | (#41477611)

Is it the official name of the Haswell packages?

Un-fucking-believable.

Re:Core 3? (1)

Matimus (598096) | about a year and a half ago | (#41478693)

The article is talking about Ivy Bridge, which would be the 3rd-generation Core. The "i3" loosely represents the performance SKU, "i3" being the low end. Haswell will probably be marketed as the 4th-generation Core.

Power numbers still not good (1)

apcullen (2504324) | about a year and a half ago | (#41477765)

Still consuming 140-150 watts at peak load vs. Intel's ~90. Good to see the graphics numbers coming up, though.

The reason I highlight power is that integrated graphics' low power draw could be a huge advantage in a low-end laptop. As long as it doesn't kill battery life.

But does it run Linux worth a damn? (4, Interesting)

drinkypoo (153816) | about a year and a half ago | (#41477767)

But does it run linux worth a damn [phoronix.com]? Inquiring minds want to know. I got boned by buying an Athlon 64 L110/R690M machine for which proper Linux support was never forthcoming. Now I want to see power saving and the graphics driver work before I give AMD money for more empty promises about Linux support.

Re:But does it run Linux worth a damn? (1)

Ritz_Just_Ritz (883997) | about a year and a half ago | (#41478209)

I'm running Ubuntu 12.04 without incident on the A8-3870 (the previous Llano architecture). Ubuntu + XBMC in a small shoebox mini-ITX enclosure is working great as an inexpensive HTPC for my home.

Best,

Excellent question. (2)

Kludge (13653) | about a year and a half ago | (#41478481)

For the last few years I have only been buying Intel hardware because it just works out of the box with all Linux distros. Is this AMD thing going to work out of the box in Linux?
No, I'm not going to take time to download and install drivers. That crap is for M$ users. Yeah, yeah, I know Intel graphics are not the fastest thing out there. Save it for someone who cares. The Intel graphics are fast enough for the games that I write and play.

Fuck all (0)

Anonymous Coward | about a year and a half ago | (#41477821)

the posts about AMD's "shitty graphics". I can pay $450 for an HP Pavilion g6 with AMD/ATI graphics and play Skyrim on it at medium detail at 800x600. Well, that's a shit resolution, you say. But I didn't buy the laptop to play Skyrim; it's a bonus. It looks and runs fine, and Intel integrated graphics certainly can't run it at any resolution.

Re:Fuck all (0)

Anonymous Coward | about a year and a half ago | (#41478019)

That's a pretty shitty resolution, though.

Re:Fuck all (0)

Anonymous Coward | about a year and a half ago | (#41478397)

SVGA, state of the art circa 1987!

Time to come out and confess... (4, Interesting)

sinij (911942) | about a year and a half ago | (#41477963)

I admit, I am one of the last few ideologues in PC gaming. I would never consider an AMD graphics card due to shitty drivers, and I would never consider an Intel CPU due to socket shenanigans. Yes, I am actually one of the rare few people who upgrades CPUs and cares about socket backward compatibility.

My current gaming rig uses a Zambezi 8-core AMD CPU; still adequate, but it shows its age. I am disappointed AMD hasn't come up with an upgrade, but I can wait.

My last gaming rig lasted me over 4 years and going. I started with an Athlon X2 and ended with a Phenom II X4. It is still in use as a media PC, and still capable of gaming.

Maybe it is dumb luck, but every AMD chip I have had ran cool, overclocked well and lasted. Every Intel chip I owned didn't overclock well and had problems staying cool.

The problem with video reviews (0)

Anonymous Coward | about a year and a half ago | (#41478117)

I like how the Hot Hardware video comparison has drastically different results with the Photoshop test, and shows that the video playback program was splitting modules instead of putting a pair of cores into low power.

How good are AMD's video drivers now? (1)

Dwedit (232252) | about a year and a half ago | (#41478371)

I was just wondering if the quality of the video drivers has improved at all since ATI was rebranded as AMD.
ATI was notorious for how awful its video drivers were. My current laptop has a Mobility Radeon X1400. Whenever I play a video that uses overlay, there is about a 2% chance that it will hard-freeze the system. I don't think I've ever seen anything like that on an Intel or NVIDIA graphics product.
I also sometimes get system-stopping delays several seconds long when running 3D games. It seems to happen just before textures are created, as if it has something to do with the game trying to allocate video memory.
But anyway, have ATI's drivers gotten any better?

Re:How good are AMD's video drivers now? (1)

0xA (71424) | about a year and a half ago | (#41478667)

Better? Yes, but still not very good.

Re:How good are AMD's video drivers now? (0)

Anonymous Coward | about a year and a half ago | (#41478787)

I have a 5850 and can't remember the last time I've had any sort of crash or any problems on Windows (other than in Global Offensive, but everyone crashes in that game atm). On Linux using Mesa 8, no real problems; using the binary drivers can get annoying with updated kernels. Some people still report Linux issues, but I'm reasonably happy.

NVIDIA cards used to cook themselves until they die (possibly still do, because they like to push wattage limits), and they like to use proprietary software to control the market. Intel onboard almost always completely sucked (both hardware and software) until recently; now they're not top-notch but highly passable, with probably the best Linux support. So I'm not sure why people hold AMD's problems against them in perpetuity. They've all messed up, and they're all kinda decent at the moment.

Re:How good are AMD's video drivers now? (1)

jakobX (132504) | about a year and a half ago | (#41478845)

Can't say, really. I've had an r8500, r9700, x700, hd4870 and now an hd7850, and I've never had any problems with drivers that a simple upgrade to the latest version didn't fix. Maybe I'm just lucky, who knows.

I don't have much experience with NVIDIA cards. I only had (still have them) a 6600GT, a 6800-something, and an HTPC with an ION chipset. No major problems here either. I had an occasional driver crash with the 6600; the 6800 was in my work computer, so not exactly heavily used; and the HTPC needs a driver reinstall once monthly because HDMI audio just stops working. Not bad, IMO.

The Intel graphics in my current work computer are a bit crashy, though. It doesn't like Java applications for some reason. :)

I3's arent for gaming... (1)

Taelron (1046946) | about a year and a half ago | (#41478681)

The article states, "They're designed to take down Intel's Core i3 chips, and the first application and gaming benchmarks are out."
i3s are meant for basic desktop use and doing your homework, not a gaming rig. So they are saying: hey, our new chip is just as crappy at games as the i3... Brilliant marketing.

genuine alternative (0)

Anonymous Coward | about a year and a half ago | (#41478901)

Shouldn't that be "authentic alternative"?
