AMD Demos Llano Fusion APU, Radeon 6800 Series

timothy posted more than 3 years ago | from the onward-ever-onward dept.

MojoKid writes "At a press event for the impending launch of AMD's new Radeon HD 6870 and HD 6850 series graphics cards, the company took the opportunity to provide an early look at the first fully functional samples of their upcoming 'Llano' processor, or APU (Applications Processer Unit). For those unfamiliar with Llano, it's a 32nm 'Fusion' product that integrates CPU, GPU, and Northbridge functions on a single die. The chip is a low-power derivative of the company's current Phenom II architecture fused with a GPU, and it will target a wide range of operating environments at speeds of 3GHz or higher. Test systems showed the integrated GPU had no trouble running Alien vs. Predator at a moderate resolution with DirectX 11 features enabled. In terms of the Radeon 6800 series, board shots have been unveiled today, as well as scenes from AMD's upcoming tech demo, Mecha Warrior, showcasing the new graphics technology and advanced effects from the open source Bullet Physics library."

116 comments

Deceiving naming... (4, Informative)

TheKidWho (705796) | more than 3 years ago | (#33955106)

The 6870 actually has less performance than the 5870... Same goes for the 6850/5850... I don't really understand why they named them the way they did... Either way, a 6970 is supposed to be released in the near future to surpass the GTX480/5870.

Re:Deceiving naming... (1)

AHuxley (892839) | more than 3 years ago | (#33955212)

New 6xxx price points. They make the numbers flow from within the 6xxx range. The 5870 is now history; it's all about the 6xxx and its FPS graphs and costs.
If the 5870 is still better in price, frame rate, or power use, I am sure it will be noted.
The main thrust seems to be that a new mid-range (in price) 6xxx should be a bump towards the 5870's stats.
As for the top end, that will be fun :) http://vr-zone.com/articles/-rumour-ati-radeon-hd-6000-series-release-schedule-first-iteration-in-october/9688.html [vr-zone.com] has some projections.

Re:Deceiving naming... (1)

Narishma (822073) | more than 3 years ago | (#33955296)

I didn't understand any of what you said. This new scheme doesn't make much sense. Why didn't they just name these new cards 6770 and 6750 if that's the price range they're targeting? This will just confuse consumers and is something I would expect from nVidia or Intel, AMD are usually sensible in their naming conventions.

Re:Deceiving naming... (1)

AK Marc (707885) | more than 3 years ago | (#33955564)

They have the first digit as the generation and the next three as the location in that generation. Evidently, they decided that they needed to trim costs more than increase FPS, and adjusted the numbers accordingly for the last three digits. It might not be as good a plan when comparing against the previous generation, but it's a more accurate numbering looking forward. Not that I know what they have planned or why they really did it, but since you know they did it and assume they had some reason, you have a limited number of choices that make sense as the answer.

Re:Deceiving naming... (2, Insightful)

PopeRatzo (965947) | more than 3 years ago | (#33955718)

Evidently, they decided that they needed to trim costs more than increase FPS

That's a nice way of saying "give the consumer less".

Re:Deceiving naming... (0)

Anonymous Coward | more than 3 years ago | (#33955988)

Evidently, they decided that they needed to trim costs more than increase FPS

That's a nice way of saying "give the consumer less".

Yes, just like "affirmative action" is a nice way of saying "institutionalized racism and sexism", something we claim to be against and don't want to admit we're complete hypocrites about. Ah, the euphemism.

Re:Deceiving naming... (3, Insightful)

gman003 (1693318) | more than 3 years ago | (#33956302)

Giving the consumer less, but also charging them less. Since very, very few people actually needed the top-of-the-line model of recent cards, it makes some amount of sense.

Re:Deceiving naming... (0, Troll)

aliquis (678370) | more than 3 years ago | (#33956736)

But those people don't buy the top-of-the-line cards ... :D

Maybe it's nothing more complicated than that people already know the GTX 460 offers more, so if they want to compete, new model numbers is where it's at, unless they can release a better card =P

Though they should be used to it after getting owned by the TNT, TNT2, GeForce, GF2, GF3, GF4, GF6, GF7, GF8, GTX 200 (?), GTX 400, ..

The FX5000 series and maybe the 4870 (X2) being the exceptions?

But aren't their cards supposed to consume less electricity?

Poor AMD/ATI; when they finally had a lead, thanks to the Nvidia & 3DFX screw-up, they released shitty drivers instead to make up for it ;)

Re:Deceiving naming... (2, Insightful)

Joce640k (829181) | more than 3 years ago | (#33957314)

You forgot the 9700 era, ATI totally owned NVIDIA then.

And the current era, they totally own that.

Re:Deceiving naming... (1)

aliquis (678370) | more than 3 years ago | (#33957732)

I said the FX5000 was an exception, stupid.

Funny how you get moderated for being stupid, though.

And it wasn't so much ATI owning Nvidia as Nvidia (3Dfx) screwing up.

Re:Deceiving naming... (0)

Anonymous Coward | more than 3 years ago | (#33959444)

Bawwwwwwwwww.

Re:Deceiving naming... (4, Insightful)

Rockoon (1252108) | more than 3 years ago | (#33955846)

There is also the problem that the poster is measuring performance by some single metric (presumably FPS in some game), which doesn't necessarily mean much.

Many years ago I upgraded from a Voodoo 3 to a GeForce 4 Ti 4600, and for more than a few games that GF4 was slower in FPS than the Voodoo at first (but still more than fast enough for gaming).

This was at a time when games were almost strictly simple textured-polygon throwers, which was the Voodoo 3's only strength. As the use of multi-texturing became more prevalent (heavily used in terrain splatting.. [google.com] ), the advantages of the GF4 over the Voodoo became apparent: more scene detail became essentially free, whereas the Voodoo required many rendering passes to accomplish the same thing.

Now I'm not saying I know that this generation of AMD GPUs will see the same sort of future benefits as that GeForce 4 did, especially since DX10/DX11 really isn't having a rapid uptake, but there could easily be design choices here that favor DX11 features that just aren't being heavily used yet.

The question is not 'is the 6870 slower than the 5870?' in some specific benchmark. The question is which of these cards will provide a solid gaming platform for the most games. As with my experience, that Voodoo performed better than the GF4 for a while, but for the newest games the GF4 kept providing a good experience whereas the Voodoo became completely unacceptable.

Re:Deceiving naming... (3, Interesting)

fast turtle (1118037) | more than 3 years ago | (#33956064)

What I suspect AMD has done is add tessellation units to the chip. This will be evident when running the Heaven benchmark with tessellation enabled. Keep in mind that tessellation is one of the key changes between DX10.1 and DX11, and as you stated, this is forward-looking. Sure, the chip may be a bit slower currently, but I suspect that when running something that depends heavily on tessellation, there won't be any slowdowns.

The reason I'm aware of this is my Radeon 5650. It's a DX11 card with 512MB onboard, and when running the Heaven test there's a lot of visual improvement with tessellation on, even though the card struggles and drops to between 4 and 12 frames. With tessellation off, the card easily handles the test at a playable 45-60 frames.

You must have had a really crappy Geforce4 (1)

Klinky (636952) | more than 3 years ago | (#33957712)

I am wondering how your GeForce4 Ti 4600 got outclassed by a Voodoo3. The Voodoo3 was equaled or outclassed by the original GeForce256. Maybe your memory is fuzzy, but there would be some major issues if your Voodoo3 was faster than the GeForce4. Also, multi-texturing was a big deal around the TNT2 & Voodoo2/3 days; the GeForce3/4 were way past that stage with the introduction of shaders. By the time the GeForce4 came around, 3Dfx had already been dead for two years, after their Voodoo5 failed miserably against the GeForce2. In some cases, like going from a GeForce2 to a GeForce3/Radeon 8500, there wasn't much performance gain until shaders got involved. But I cannot see how a card three generations behind could have bested the GeForce4 even in lower-end non-shader games. The Voodoo3 would have been outclassed even from a raw clock rate, TMU, ROP standpoint.
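
A rough back-of-the-envelope version of that "raw clock rate, TMU, ROP standpoint" is easy to write down. A sketch in Python; the clock and pipeline figures below are commonly cited numbers, included as assumptions rather than verified datasheet specs:

# Theoretical pixel fill rate = core clock (MHz) x pixel pipelines, in Mpixels/s.
# The spec figures are commonly cited numbers, not verified from datasheets.
cards = {
    "Voodoo3 3000": (166, 2),      # ~166 MHz core, 2 pixel pipelines
    "GeForce4 Ti 4600": (300, 4),  # ~300 MHz core, 4 pixel pipelines
}
for name, (clock_mhz, pipes) in cards.items():
    print(f"{name}: {clock_mhz * pipes} Mpixels/s theoretical fill rate")
# Voodoo3 3000: 332 Mpixels/s
# GeForce4 Ti 4600: 1200 Mpixels/s

Even before shaders enter the picture, the newer card wins on raw throughput by well over 3x.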

I ultimately feel that the reason AMD is changing the model numbers is to save face, so it doesn't look like they're putting out a low-end part first. This way it looks like they're still pumping out a leading-edge flagship product. It's misleading, but that has never stopped the ill-informed consumer from continuing to be ill informed.

Re:You must have had a really crappy Geforce4 (1)

jefe7777 (411081) | more than 3 years ago | (#33959404)

flagship = nascar. what wins on sunday sells on monday. the flagship is benchmarked and touted on all the tech sites as a ferrari. then they sell the average consumer kias that are slower than last year's model. sure it's cheaper, but it's a hunk of shit, and going in the wrong direction, as everything else incrementally gets more computing intensive... no apologies for the bad car analogy.

Re:Deceiving naming... (1)

wintermute000 (928348) | more than 3 years ago | (#33957834)

yeah, good points. The issue is that gaming graphics are now undeniably console driven, and consoles don't do DX11/tessellation yet, so I suspect that featureset will be a bit niche. For every stalker/crysis type of envelope-pushing gfx-whore-heaven FPS (which I do oh so love) there will be 4-5 multiplatform releases that don't even push a 5770, let alone higher.

I'm considering an upgrade from my 5850 purely from a noise POV. I have the HIS iCooler V variant (stock, not OCed), and reviews say it's quiet, but under load it sounds like a 360 with a spinning disc at max volume. Runs like a dream, but dang, the noise is actually annoying; it's louder than my old 4850 crossfire!!!! and no, I'm not returning it due to noise alone lol

Re:Deceiving naming... (1)

Creepy (93888) | more than 3 years ago | (#33960276)

That, and only the Xbox 360 does DX9 (specifically 9.0c). The PS3 uses an OpenGL ES 2.0 library for the most part, with lots of hacks. If they drive the market, don't expect DX11 or later to be adopted until the next generation of consoles comes out.

I'm really excited about physics integration, as most of the stuff I've worked on recently has required passing physics information between hardware and software (in textures). For instance, cloth and hair both need physics to behave realistically.

Re:Deceiving naming... (2, Interesting)

hairyfeet (841228) | more than 3 years ago | (#33955614)

I have to agree. I frankly loved how easy it was to tell what was what with the AMD naming conventions. With Intel it is hard to tell what has virtualization support and what hasn't, and Nvidia had so many cards overlapping, with numbers all over the place, that I frankly can't tell you if a 9600GT beats a GT210 or the other way around. But with AMD it was easy. On the CPU side it was Sempron (almost pointless now), followed by Athlon, Phenom, and Phenom II. Those were followed by an x and the number of cores, and of course faster is better, easy peasy. On the GPU side you had the xx3x for bargain basement and integrated, the x5xx for those that only cared about HTPC or video acceleration (low-mid), the x6xx for mid to high-mid, and the x7xx and x8xx for the low high end to high end. Everyone had a niche, everyone had a price point, easy peasy.

Hopefully by the time February rolls around they will have this straightened out, as I'll be replacing my HD4650 when I add my liquid cooler for my CPU, and I'd really hate to play "guess which card is right" again. Meh, I figure I'll get one at the $100 price point anyway, but it would be nice to tell whether the 5xxx series or 6xxx series would be better at that price point. While I like to play FPS games, my screen is only 1600x900 and I'm not into the Crysis ePeen graphics; as long as I have a good framerate I'm happy. Any suggestions? Oh, and please don't say Nvidia, as I don't buy their stuff after bumpgate and the way they turn my apt into a space heater. I'm also not happy with their disabling PhysX on machines with any AMD GPUs; since mine is integrated I sure as hell ain't paying for crippleware. So which would be the better buy in the $110 price range? The 5xxx or 6xxx?

Re:Deceiving naming... (2, Interesting)

PopeRatzo (965947) | more than 3 years ago | (#33955854)

Hopefully by the time February rolls around they will have this straightened out, as I'll be replacing my HD4650

I'm about ready to replace my 4650, too. I've got a new HD monitor coming and figure that's as good a time as any to up the graphic power, though I won't be going water-cooled.

My problem with the numbering system is always the second digit. For example, is a 5830 better than a 5770 or 4870? Do I add up the 4 digits and compare the sums? Is the first digit the most important, or the second, or the third?

The way I usually end up deciding is by sorting all the cards at Newegg by price and seeing what card in the second-to-last series is best in the $100-130 range. Then I go to the recommended requirements for the game I want to play (again, I wait until the prices drop on Steam, so I'm just now buying games that came out last Christmas) and see if the new video card meets the requirements.

I might consider buying an nVidia card but the business with PhysX and their even more confusing model numbers puts me off.

Re:Deceiving naming... (2, Informative)

hairyfeet (841228) | more than 3 years ago | (#33955986)

The third digit is key within the same series, whereas the second tells you which series; higher is better on BOTH the second and third digits when comparing cards. For example, a 4670 is better than a 4650, but a 4870 beats both. The only time that isn't true is if the third digit is a 3, which equals bargain basement, for example a 4630. Basically, the easy way to remember is: 3 is low, 5 is low-mid, 6 is mid, 7 is high-mid, and 8 and 9 are high. That goes for the second AND the third digit.
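
That rule is mechanical enough to sketch in code. A minimal Python illustration, assuming the digit table above is accurate; the function name and return shape here are made up for the example:

def decode_radeon(model: int) -> dict:
    # 1st digit: generation (4xxx, 5xxx, ...). Newer generations add features.
    generation = model // 1000
    # 2nd and 3rd digits share the same low-to-high scale described above.
    segment = {3: "low", 5: "low-mid", 6: "mid", 7: "high-mid", 8: "high", 9: "high"}
    series = (model // 100) % 10
    tier = (model // 10) % 10
    return {"generation": generation,
            "series": segment.get(series, "?"),
            "tier": segment.get(tier, "?")}

# Within a generation, compare the 2nd digit first, then the 3rd:
print(decode_radeon(4650))  # {'generation': 4, 'series': 'mid', 'tier': 'low-mid'}
print(decode_radeon(4670))  # {'generation': 4, 'series': 'mid', 'tier': 'high-mid'}
print(decode_radeon(4870))  # {'generation': 4, 'series': 'high', 'tier': 'high-mid'}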

As for liquid cooling, you might want to check out this [tigerdirect.com], which a gamer friend turned me on to. He said it took less than 30 minutes to install and dropped his temp by a good 20 degrees under load, and at idle it is often at room temp. Hell, a decent air cooler will cost you more than that, and if you are gonna crack the case open anyway, why not?

Re:Deceiving naming... (2, Interesting)

tibman (623933) | more than 3 years ago | (#33960652)

I bought and use that exact water cooler on an AMD 965 (Phenom II X4 3.4GHz Black Edition). It works great and I highly recommend it. My only advice for anyone is to make sure your side panel doesn't have fans or protrusions in the back near your 120mm exhaust port. My case has a 180mm side fan that prevented the radiator (sandwiched between two 120mm fans) from being mounted inside the case. I dremeled out a slot so the coolant tubes could pass through the back (it's a closed coolant system, so you can't just dremel holes). Right now there is a 120mm fan inside, then the case wall, the radiator outside, then another 120mm fan. It's extremely quiet and I really enjoy it.

My case, if anyone is interested: http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=4034179&CatId=32 [tigerdirect.com]

Re:Deceiving naming... (1)

aliquis (678370) | more than 3 years ago | (#33956912)

For example, is a 5830 better than a 5770 or 4870?

Probably.

Stupid guesses:
58xx > 48xx from generation alone.
57xx is probably a more limited chip, or something else (different memory?) than 58xx.
xx30 is lower end than xx70 of the same chip.

Or something; Wikipedia will most likely tell.

Facts:
HD4870: 750/900 clock, 800:40:16 unified shaders, texture mapping units, render output units, 256 bit GDDR5.
HD5770: 850/1200 clock, 800:40:16, 128 bit GDDR5.
HD5830: 800/1000 clock, 1120:56:16, 256 bit GDDR5.

X7XX on both generations seems to be 128-bit memory.
X8XX: 256-bit.
48XX X2: 2x256-bit.
X9XX: 2x256-bit.

It's fairly obvious the 5830 was much better than the 5770 even though it's got a low clock rate (the 5870 got 850/1200, just like the 5770).
4870 vs 5770: not so obvious. Do they have the same chip too, or what? Guess the 4870 may be nicer. Just overclock it if it's not ;)
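
Those bus widths translate directly into bandwidth, which is worth a quick sanity check. A Python sketch, assuming the listed memory clocks are in MHz and that GDDR5 moves 4 transfers per clock per pin (which matches these cards' public specs):

def bandwidth_gb_s(mem_clock_mhz, bus_width_bits, transfers_per_clock=4):
    # bytes/second = clock (Hz) * transfers per clock * bus width (bytes)
    return mem_clock_mhz * 1e6 * transfers_per_clock * (bus_width_bits / 8) / 1e9

for name, clock_mhz, bus_bits in [("HD4870", 900, 256),
                                  ("HD5770", 1200, 128),
                                  ("HD5830", 1000, 256)]:
    print(f"{name}: {bandwidth_gb_s(clock_mhz, bus_bits):.1f} GB/s")
# HD4870: 115.2 GB/s
# HD5770: 76.8 GB/s
# HD5830: 128.0 GB/s

That gap is one concrete reason the 4870 vs 5770 call is "not so obvious": the older card has roughly 50% more memory bandwidth.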

Re:Deceiving naming... (1)

i.of.the.storm (907783) | more than 3 years ago | (#33957724)

I think the 5830 is actually just a little better than the 5770 in real-world performance, but it uses significantly more power. I think the 4870 is also pretty close to the 5770 in performance; it might actually be faster, in fact. But the 5770 is based on the 5800 series core, I think, just a little cut down.

Re:Deceiving naming... (1)

Cornelius the Great (555189) | more than 3 years ago | (#33960438)

To add to what you said, the 5800 series used more power since it was a much larger GPU. The 5830, 5850, and 5870 shared the same chip (Cypress). For the lower-end parts, the shading units, ROPs, and texture units were fused off to improve yields and fill in the large performance gap between the 5850 and 5770. Likewise, the 5770 GPU (Juniper) actually started as a smaller core (with less than half as many transistors as Cypress), with sections subsequently disabled for the 5750.

Re:Deceiving naming... (1)

aliquis (678370) | more than 3 years ago | (#33956816)

I frankly can't tell you if a 9600GT beats a GT210 or the other way around

I assume it does. What about a HD4250 vs HD 3650?

Hard to see the difference =P. Though yeah, LE, GT, GTX, GTS, ... may seem weird; 30, 50 and 70 are rather obvious.

Re:Deceiving naming... (1)

Renraku (518261) | more than 3 years ago | (#33955232)

Because shit dude the 6870 is 1000 better than the 5870! And it's going for a lot less! It's a GREAT deal.

In a Tablet, or Game Console (1)

sanman2 (928866) | more than 3 years ago | (#33955256)

I'd love to see this thing in an Android tablet, or a next-gen game console. I'm sure the pixels-per-watt are fabulous. I hope it shows up in the Wii-2, or in some super-powered tablet that will make PCs obsolete.

Re:In a Tablet, or Game Console (1)

Killall -9 Bash (622952) | more than 3 years ago | (#33955392)

PCs will be obsolete when I can have a chip implanted in my brain.

Which one? (0, Offtopic)

zooblethorpe (686757) | more than 3 years ago | (#33955472)

PCs will be obsolete when I can have a chip implanted in my brain.

Will you go for Ponch, or Jon?

Cheers,

Re:Which one? (0)

Anonymous Coward | more than 3 years ago | (#33955514)

no that was Sam on Cheers.

Re:In a Tablet, or Game Console (1)

aliquis (678370) | more than 3 years ago | (#33956936)

My brain will be obsolete when one can implant a chip in it.

Who's right? Just poking in an M68k, or "implanting" a 5.56mm NATO in there with no connections, doesn't count :D. Though the latter is sure to make my brain obsolete :D

Deceiving? (3, Insightful)

mykos (1627575) | more than 3 years ago | (#33955272)

I have a feeling that people who buy expensive pieces of hardware have tendency to do at least one web search or pop at least one question off at an internet forum about products before they buy. It's not like AMD is putting the or anything... [overclock.net]

Re:Deceiving? (3, Insightful)

mykos (1627575) | more than 3 years ago | (#33955288)

Wow, HTML fail on my part... what I meant to say is "It's not like AMD is putting the same chip with the same everything into four generations of parts or anything"

Re:Deceiving? (1)

aliquis (678370) | more than 3 years ago | (#33956952)

From the comments it looks like they said 5 cards, not 5 generations? 2 generations?

And they did the same (or for three?) with the low-end GF4s, didn't they?

That didn't mean ATI had better cards than the Tis, though.

Re:Deceiving? (1)

AvitarX (172628) | more than 3 years ago | (#33957272)

I think in the GF3-4 range there were a lot of GF4 MX cards.

These were essentially low-to-middle-end GF3 cards, and for some things my GF2 Ti outperformed them (with less CPU to boot).

Re:Deceiving? (1)

0123456 (636235) | more than 3 years ago | (#33957504)

I think in the GF3-4 range there were a lot of GF4 MX cards.

These were essentially low-to-middle-end GF3 cards, and for some things my GF2 Ti outperformed them (with less CPU to boot).

The 'Geforce 4 MX' was a Geforce 2 in drag, wasn't it?

Re:Deceiving? (1)

aliquis (678370) | more than 3 years ago | (#33957748)

That was the impression I had. Or at least that it wasn't better :)

Re:Deceiving naming... (0)

Anonymous Coward | more than 3 years ago | (#33955290)

GPU companies have always been known for having shitty names for everything they have ever made, ever.

GPU and audio hardware are the 2 worst areas when it comes to naming, with CPUs a close 2nd. (Don't even get me started on how awful audio is when it comes to "standards"... REALTEEEEEEEEK! All of my hatred could destroy galaxies.)

It's as if they like to over-complicate these things as some sort of sick joke.

Re:Deceiving naming... (0)

Anonymous Coward | more than 3 years ago | (#33955374)

Citation needed? Everywhere I read puts the 6870 at a ~30% improvement over the 5870.

Re:Deceiving naming... (1)

weirdcrashingnoises (1151951) | more than 3 years ago | (#33956190)

Check again; you were undoubtedly looking at 5770s, not 5870s. Then it is correct: 5770 -> 6870. Which, as noted, doesn't make a lot of sense when the 5870 performs better.

They should have really just started their base line at the 6660, obviously...

Re:Deceiving naming... (1)

youshotwhointhewhat (1613613) | more than 3 years ago | (#33955796)

I think AMD is going to have the 5770 and 5750 live on as their lower-midrange cards while these will become the upper-midrange cards. The top of the line (single die) cards will then be named 69XX. Not sure what they are going to do with the dual die cards (perhaps bring back the X2 naming convention).

It makes sense for AMD to hold off on updating their lower end offerings since consumers are less demanding at this price point.

Re:Deceiving naming... (1)

Draaglom (1556491) | more than 3 years ago | (#33956040)

This is because in the current generation, the top end ATI GPUs are the 58xx. With this coming generation they wanted the top end to be x9xx, so they have just incremented the second digit. 6870 is intended as a successor to the 5770.

Re:Deceiving naming... (1)

poetmatt (793785) | more than 3 years ago | (#33957174)

Low power but better manufacturing leaves it to be seen whether they can be overclocked better than the 5xxx series. TBD until people get some hands-on time, of course.

Re:Deceiving naming... (1)

Ecuador (740021) | more than 3 years ago | (#33958034)

The general idea is that the 5850/5870 will be phased out as these cheaper new cards are introduced, so the AMD mid-to-upper-range lineup will be 5750/5770, 6850/6870, 6950/6970 (next month, + 6990 later). So that does make sense, since the 6850 and 5750 will be out at the same time and the former is much faster (indicated by its x8xx vs x7xx) and also has the new features like HDMI 1.4 and 3D (indicated by its 6xxx vs 5xxx). But the 5850/5870 will not disappear overnight, so for a while those will be out too; AMD could have avoided any confusion by, e.g., naming the new cards 6830/6850 or something like that...

Also, what's up with the summary? APU = Applications Processer (sic) Unit? Spelling mistake AND acronym mistake in one? (It's supposed to be Accelerated Processing Unit.)

Re:Deceiving naming... (0)

Anonymous Coward | more than 3 years ago | (#33958338)

The 5xxx family was the outlier, since the mainstream-performance card is the 5770, as opposed to the 4850 and 3850 in previous generations.

Typo in summary, (-1, Offtopic)

Anonymous Coward | more than 3 years ago | (#33955312)

Typo in summary,

ATI makes Radeon. Not AMD.

Posting anonymous for obvious reasons.

Re:Typo in summary, (3, Informative)

Rod Beauvex (832040) | more than 3 years ago | (#33955326)

For what reasons? Other than being a pedantic hairsplitting douche that doesn't seem to want to accept that ATI is owned by AMD.

Re:Typo in summary, (1)

The Archon V2.0 (782634) | more than 3 years ago | (#33955352)

Typo in summary,

ATI makes Radeon. Not AMD.

Posting anonymous for obvious reasons.

Because otherwise everyone would laugh at you for not realizing AMD's ATI brand is on the way out? Don't worry, we'll do that anyway.

Re:Typo in summary, (5, Informative)

mabinogi (74033) | more than 3 years ago | (#33955384)

No, AMD makes Radeon and has done for years.

They've _branded_ them ATI since the buyout, but even that has changed now and future parts (which is what these are) will be AMD branded.

Mod Parent Up (Funny) or Down (Troll), Please (1, Offtopic)

billstewart (78916) | more than 3 years ago | (#33955390)

AMD buys ATI. ATI animates Lizard. Lizard bites Spock. Spock buys nVidia. nVidia dominates ATI.

Re:Mod Parent Up (Funny) or Down (Troll), Please (0)

Anonymous Coward | more than 3 years ago | (#33955632)

That's delicious. GP makes a well crafted troll, you are the only one of seven posters who isn't baited, and you get the downmod.

Re:Typo in summary, (0)

Anonymous Coward | more than 3 years ago | (#33955444)

Posting anonymous because you are an idiot, yeah that's pretty obvious.

AMD owns ATI, and AMD has also dropped the ATI name from the cards. The new cards will be marketed as AMD Radeons, not ATI.

Re:Typo in summary, (2, Informative)

FireXtol (1262832) | more than 3 years ago | (#33955478)

ATI is dead.[1] 1. Google: amd retire ati brand

Re:Typo in summary, (0)

Anonymous Coward | more than 3 years ago | (#33955488)

Posting anonymous for obvious reasons.

Embarrassment?

Re:Typo in summary, (1)

Miseph (979059) | more than 3 years ago | (#33956128)

Nope, trolling. Way to feed 'em.

Re:Typo in summary, (1)

anUnhandledException (1900222) | more than 3 years ago | (#33956258)

No, AMD makes Radeon. ATI was acquired. Accept this thing called reality. AMD kept the brand separate for a while (despite ATI not actually existing) because it was worth more as a separate brand. Times change. As AMD gets more and more into hybrid APUs, it makes less sense to have a separate fake company name on some (but not all) of its video cards.

The line between CPU & GPU will become very blurry over the next decade. AMD wants you to know they make it all: dedicated CPU, dedicated GPU, low-power APU, integrated graphics, medium-power APU coupled w/ a dedicated GPU, etc. All made by AMD, and all play nice together. I am sure you can see how that is more valuable from a branding standpoint. Given ATI has existed as a brand name only for nearly 3 years now, it makes sense to retire it.

If your old bank gets acquired by a new bank and eventually the name changes, do you still keep calling it by the old name a decade later just to be an ass and confuse people?

"This is a First Union, dammit. It was a First Union when I was born, and it will be a First Union when I die. I don't give a damn that the brand name changed 9 years ago."

Re:Typo in summary, (1)

uvajed_ekil (914487) | more than 3 years ago | (#33956524)

Typo in summary, ATI makes Radeon. Not AMD. Posting anonymous for obvious reasons.

Nice try, but for all intents and purposes, ATI does not even exist as a company, at least not as an independent one. It was bought by AMD some time ago, in the range of 3-5 years ago, I think. So anything sold under the ATI moniker is in fact made by AMD. AMD is killing off the ATI name and now beginning to label their graphics line strictly as AMD products.

Why did you post anonymously? If you had been right (you weren't), a correction would have been warranted. But anyway, if you are going to post such terse corrections, at least take credit for them, so we know who to blame. You are like a grammar nazi who corrects peeps but spelles wordz wrong with bad grammer and don't use no puncshuation but worse you know lol rolfmoa

Posting for obvious reasons.

Re:Typo in summary, (1)

Lord Kano (13027) | more than 3 years ago | (#33957216)

You must not have been paying attention when AMD bought ATI. You also must not have tried to download Radeon drivers and realized that you were at AMD's site.

LK

"Alien vs. Predator" Movie or Video Game? (1)

billstewart (78916) | more than 3 years ago | (#33955426)

I've got no idea how fast an "Alien vs. Predator" video game needs the graphics system to be, since I stopped caring once any modern hardware could play Nethack or Solitaire.

Can the hardware play 1080p video without needing a noisy fan? How low power is "low-power"?

Re:"Alien vs. Predator" Movie or Video Game? (1)

zcomuto (1700174) | more than 3 years ago | (#33955612)

I've got no idea how fast an "Alien vs. Predator" video game needs the graphics system to be, since I stopped caring once any modern hardware could play Nethack or Solitaire.

Can the hardware play 1080p video without needing a noisy fan? How low power is "low-power"?

By current standards, "low-power" would be just under the amount of energy required to power the sun, at least for a dual-card setup.

Re:"Alien vs. Predator" Movie or Video Game? (3, Informative)

Sycraft-fu (314770) | more than 3 years ago | (#33955640)

Without any fan? No, probably not. It is a desktop processor; this isn't an ultra-low-power component, it isn't an Atom. The idea AMD is going for here, and I think there's some merit to it, is a low-range desktop type of system: people who want something cheap but still want to play games. Intel's integrated chips don't work that well for that (though they've improved), so this is to try and fill that market.

If you want 1080p with no fan, just get a Blu-ray player. There's plenty of them that'll play media off the network and Internet (LG has good ones). But don't bitch that some people might want a computer that can play a game a little better than Nethack.

Re:"Alien vs. Predator" Movie or Video Game? (4, Insightful)

Chris Burke (6130) | more than 3 years ago | (#33956894)

The idea AMD is going for here, and I think there's some merit to it, is a low-range desktop type of system: people who want something cheap but still want to play games. Intel's integrated chips don't work that well for that (though they've improved), so this is to try and fill that market.

Think more mid-to-high-end laptops.

As mentioned in the summary, this is a low-power version of the Phenom II. Not an ultra-low power for consumer electronics or netbooks like Atom or AMD's Bobcat, but still solidly aimed at the mobile market. It provides all the power and cost advantages of a UMA solution plus gets rid of one of the system buses for more savings, while providing good-for-a-laptop graphics without having to significantly re-engineer the motherboard or cooling solution. This is still in theory; demonstrations of engineering samples are nice, but it'll be interesting once the reviewers get their hands on some.

Of course you're also right, since cost and power usage are relevant for desktops too. Just not as much, since you're not dealing with battery life, or the form factor that makes it difficult to work with discrete graphics. A single line of UMA-based motherboards with an optional PCIe graphics card can serve multiple markets with one design and acceptable margins.

Re:"Alien vs. Predator" Movie or Video Game? (0)

Anonymous Coward | more than 3 years ago | (#33957222)

Think more mid-to-high-end laptops.

No, think value desktops. Dell has been selling Vostros with integrated Intel GMA 4500HD for some time now. Fusion gives such machines decent 3D. Now they sell into the home in addition to the office.

With regard to Fusion generally; never, ever bet against integration. Intel, AMD, ST, Hynix, et al. make ICs. The I stands for Integrated. If AMD gets the jump on Intel by integrating high performance GPUs with high performance CPUs it won't be the first time. Intel doesn't even have a high performance GPU to integrate, and NVidia has no CPU. It's AMD's to lose at this point; all they have to do is execute.

Re:"Alien vs. Predator" Movie or Video Game? (1)

0123456 (636235) | more than 3 years ago | (#33957490)

If AMD gets the jump on Intel by integrating high performance GPUs with high performance CPUs it won't be the first time.

This is not a high-performance GPU; you really can't integrate a high-performance GPU with a high-performance CPU because you'd have to suck 400W of heat out of the monstrous combined chip that would produce _and_ it would then be crippled by the poor memory bandwidth anyway.

No-one is going to buy this chip if they want high-performance 3D, they'll buy a CPU and a discrete graphics card. It's merely providing somewhat better performance than other integrated graphics chipsets, allowing people buying cheap PCs to play a few games at low settings and probably to reduce power usage while playing HD video (since GPUs are typically better at that than CPUs).

Which kind of leaves me wondering what the point is; the primary market is people who want to play games but not enough to actually buy a graphics card which can do so. Maybe it would be beneficial in the laptop market where many systems can't really play any games at all.

Re:"Alien vs. Predator" Movie or Video Game? (2, Interesting)

Klinky (636952) | more than 3 years ago | (#33957806)

Firstly, this can save money. Integrating the GPU into the CPU can create a lower-cost part for an OEM than having to use two chips in separate packages. Second, this is a fusion between x86 & GPGPU/OpenCL. Once a critical mass of CPUs have an integrated GPU, you will probably see GPGPU tech really start to become integrated into programs that can take advantage of it. Suddenly your low-end budget-box CPU can encode & decode multiple HD streams from your camera or apply special effects in realtime. Your games can take advantage of the integrated GPU for physics, or the framerate will actually be playable compared to some of the other crap IGPs. Things like image/video/audio encoding/decoding/editing, gaming, compression & encryption can all benefit from GPGPU. This is basically the start of setting a GPU specification floor. The question is, will Intel/nVidia play along and implement quality OpenCL on their GPUs? I think nVidia will probably have to at some point, but Intel might be a holdout, as OpenCL, or anything not x86 that performs general-purpose instructions, probably looks like a threat to them.
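
For a concrete sense of what "playing along" with OpenCL means on the software side, here is a minimal device-enumeration sketch using the third-party pyopencl bindings (assuming pyopencl and a vendor OpenCL runtime are installed; whether an integrated GPU shows up in this list is exactly the vendor-support question raised above):

import pyopencl as cl

# List every OpenCL platform (vendor runtime) and the devices it exposes.
# An APU would ideally expose both its CPU cores and its integrated GPU.
for platform in cl.get_platforms():
    print(f"Platform: {platform.name} ({platform.vendor})")
    for device in platform.get_devices():
        kind = cl.device_type.to_string(device.type)  # e.g. CPU, GPU
        print(f"  {kind}: {device.name}, {device.max_compute_units} compute units")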

Re:"Alien vs. Predator" Movie or Video Game? (1)

Lonewolf666 (259450) | more than 3 years ago | (#33959442)

It might be a medium-performance GPU by today's standards, integrated with a medium-performance CPU.
Right now, AMD has (for instance) the following separate parts:
-the 5570 GPU, a midrange GPU with a 128-bit data bus; complete cards use at most 43 Watt (according to alternate.de)
-the Athlon II X4 610e (e for "energy efficient"), a "Propus" quad-core with 45 Watt TDP.
Put both on the same chip, assume some more improvement in power consumption, and you might get something like the Llano. Except maybe for the shared memory bandwidth, that chip would beat my current system, which is perfectly sufficient for games that are a few years old.

Re:"Alien vs. Predator" Movie or Video Game? (1)

Khyber (864651) | more than 3 years ago | (#33958252)

"Think more mid-to-high-end laptops."

No, those will come with their own MXM expansion slot for dedicated GPU.

Re:"Alien vs. Predator" Movie or Video Game? (1)

drinkypoo (153816) | more than 3 years ago | (#33959356)

Most of the mid-range laptops don't have MXM because it's a waste. The price differential of adding the slot is almost certainly more of a burden to the user than a replacement, and many MXM cards have heat sinks in nonstandard locations, or come in nonstandard sizes that only fit one range of laptops. This is supposedly less common in MXM3 but people were still using MXM1 when MXM3 had come out.

Low- to Mid-range laptops will get these chips, netbooks will continue struggling along with the slowest CPUs, and the big desktop replacement notebooks will continue to get MXM slots in case of a die bonding problem or similar which makes it cost effective to have them in there, although Tier 1 Vendors avoid actually admitting such problems. If every Toyota had a problem they'd have a recall, right? But if every HP has a problem (like everything which shipped with a Quadro FX1500M for a period of about a year!) then they deny, deny, deny and make you spend over 24 hours on the phone to get a replacement machine, and don't even bother to replace the video card in the machine you have. Which suggests that MXM is a big fucking waste of time and money in all cases.

Re:"Alien vs. Predator" Movie or Video Game? (1)

0123456 (636235) | more than 3 years ago | (#33957450)

If you want 1080p with no fan, just get a Blu-ray player. There's plenty of them that'll play media off the network and Internet (LG has good ones).

Ha... my LG Blu-ray player has been in three times for warranty repair, and now we're waiting for a replacement unit because they can't fix it, while my $90 unknown-Chinese-brand Blu-ray player from Walmart works perfectly and is multi-region out of the box.

And our Ion MythTV frontend does have a fan to keep it cool when playing 1080p, but it's inaudible from the sofa.

Re:"Alien vs. Predator" Movie or Video Game? (1)

qmaqdk (522323) | more than 3 years ago | (#33959074)

If you want 1080p with no fan, just get a Blu-ray player. There's plenty of them that'll play media off the network and Internet (LG has good ones). But don't bitch that some people might want a computer that can play a game a little better than Nethack.

And if you want a car that doesn't use gas, get a bicycle.

Re:"Alien vs. Predator" Movie or Video Game? (4, Interesting)

Ephemeriis (315124) | more than 3 years ago | (#33955930)

I've got no idea how fast an "Alien vs. Predator" video game needs the graphics system to be, since I stopped caring once any modern hardware could play Nethack or Solitaire.

AvP is a relatively modern game. Came out in the last year or so. It isn't mind-shatteringly amazing, but it looks pretty decent.

Traditionally, integrated graphics have done a lousy job with serious gaming on PCs. Basically any FPS has required a discrete 3D card.

If Joe Sixpack can go out and buy an off-the-shelf machine at Best Buy and play a game on it without having to upgrade the hardware, it'll be a huge step in the right direction.

But this chip doesn't look like it'll be replacing 3D cards for serious gamers anytime soon.

Can the hardware play 1080p video without needing a noisy fan? How low power is "low-power"?

It's a desktop chip, so I can't imagine it'll do anything without a fan. Although the integrated graphics mean you wouldn't need a separate graphics card with its own fan, so it should be at least a little quieter.

Re:"Alien vs. Predator" Movie or Video Game? (1)

Osgeld (1900440) | more than 3 years ago | (#33957068)

If Joe Sixpack can go out and buy an off-the-shelf machine at Best Buy and play a game on it without having to upgrade the hardware, it'll be a huge step in the right direction.

Well, they always could, but they are too ill-informed to make a good choice for their needs. They see a half dozen Intel stickers on the front of some demo box and instantly start rationalizing that Intel makes the best computer, just like they play their Nintendo tapes and wear their Nike Airs while blowing their nose in a Kleenex and using Clorox.

Personally, I think there ought to be a survey for the people who just don't know, can't understand, or are overwhelmed by the metric fuckton of TOTALLY meaningless number thingies at the store, to help them choose.

Do you game? BAM, 9 machines knocked out. What's your budget? BAM, down to 3.

How fkin hard does it really have to be for the average Joe Sixpack to get a computer they want? And is it any surprise that they don't like the things, cause every one they have bought is pretty much a lie?

Intel blah blah 1284372! EXTREME GRAPHICS = a Voodoo 2 with less RAM, with a giant sticker saying "gaming powerhouse" on the front.

Re:"Alien vs. Predator" Movie or Video Game? (1)

Ephemeriis (315124) | more than 3 years ago | (#33958820)

Well, they always could, but they are too ill-informed to make a good choice for their needs

The problem hasn't really been one of information.

Until fairly recently, your average off-the-shelf computer shipped with very crappy graphics. If you just ordered whatever was on Dell's website or grabbed something from Best Buy it would have enough integrated graphics to run Windows and not much else.

Sure, you can generally customize them with a video card of your choice... At least if you're ordering on-line... But even then the offerings weren't terribly impressive.

And there really is a limit to how much self-education you can expect your average consumer to do. Do you go out and research what kind of spark plugs are factory installed in your new car? I sure as hell don't.

Recently the situation has been changing. Windows now wants a halfway-decent 3D card. And integrated graphics are getting better. And gaming is becoming more mainstream, so more stock systems are coming with halfway-decent video cards. But, again, that's all a fairly recent change.

Re:"Alien vs. Predator" Movie or Video Game? (1)

Osgeld (1900440) | more than 3 years ago | (#33960382)

And there really is a limit to how much self-education you can expect your average consumer to do. Do you go out and research what kind of spark plugs are factory installed in your new car? I sure as hell don't.

No, just as I said earlier about the little survey: I look in the book hanging off the shelf to quickly decide what I need for a spark plug.

It's not about Joe Sixpack educating himself; it's about providing reasonable information in an easy-to-digest format.

Why do most computers come with a crap video card? Cause most of them say "great for gaming" on the front, are pushed out en masse as cake, and people eat it up, not having a clue either way.

(and yes, the cake is a lie)

Re:"Alien vs. Predator" Movie or Video Game? (2, Funny)

mjwx (966435) | more than 3 years ago | (#33958302)

AvP is a relatively modern game. Came out in the last year or so.

It worked, I have travelled back to the year 2000.

Re:"Alien vs. Predator" Movie or Video Game? (1)

Ephemeriis (315124) | more than 3 years ago | (#33958854)

AvP is a relatively modern game. Came out in the last year or so.

It worked, I have travelled back to the year 2000.

Hmmm...

Well, I assumed they were talking about the 2010 AvP game [wikipedia.org] . As that would make more sense (seeing as DX11 didn't even exist in 2000).

But I suppose it could be the 2000 AvP game [wikipedia.org] . In which case I'm less impressed.

Re:"Alien vs. Predator" Movie or Video Game? (1)

eknagy (1056622) | more than 3 years ago | (#33958952)

AvP is a relatively modern game. Came out in the last year or so.

Actually, there are more than a dozen AvP games according to Wikipedia:
http://en.wikipedia.org/wiki/List_of_Alien_and_Predator_games [wikipedia.org]

I liked the 1999 AvP a lot, and at first glance I did not understand why they'd showcase an 11-year-old game.

Now I know I will have to buy AvP 2010 with my next AMD laptop ;)

Re:"Alien vs. Predator" Movie or Video Game? (1)

Ephemeriis (315124) | more than 3 years ago | (#33959052)

Yeah. I just assumed they were referring to the 2010 version, as the earlier ones probably didn't feature DX11 graphics. But just saying "AvP" doesn't really clarify things much at this point.

The 2010 game is a mixed bag.

The marine campaign is a ton of fun. The predator campaign is fun, but doesn't make a whole lot of sense. The alien campaign was a disappointment.

The multiplayer can be fun, or frustrating, depending on the map and who you're playing as/against. Some of the maps seriously favor one race over the others, and it makes things very frustrating. If you're playing as the right race, you slaughter everyone. If you're playing as the wrong race you get slaughtered. And there isn't a whole hell of a lot you can do about it.

I played through the singleplayer storylines. Did some multiplayer. And then forgot about it. There's been some map pack DLC released... But the multiplayer just wasn't compelling enough to keep me coming back.

Re:"Alien vs. Predator" Movie or Video Game? (1)

AvitarX (172628) | more than 3 years ago | (#33956008)

You want Ontario or Zacate (the Bobcat-based APUs).

Both offer h.264 accelerated playback and are 9W or 18W.

I am seeing mixed info, actually, on what is available dual vs. single core at what wattage.

The 9W is definitely single core, faster than Atom, with a decent-ish GPU (accelerated video); the Zacate I think is 18W with a single core.

I imagine they will not quite double when going dual core (as the graphics part will not increase).

And they are supposed to have tech to completely shut down parts of the chip that aren't used, making a single-threaded workload use the equivalent power of a single core, and idle under 1 watt.

The Llano would be more power than either of them.

Re:"Alien vs. Predator" Movie or Video Game? (1)

gman003 (1693318) | more than 3 years ago | (#33956308)

Hell, the crappy Intel integrated GPUs can handle video pretty well. Even a low-end card can do 1080p.

As for low-power, if recent experience is any judge, the power usage will be low only in comparison to quadruple Pentium IVs. Some cards last gen were 300+ watts TDP.

Re:"Alien vs. Predator" Movie or Video Game? (1)

aliquis (678370) | more than 3 years ago | (#33956982)

Can the hardware play 1080p video without needing a noisy fan? How low power is "low-power"?

Shouldn't anything new?

Re:"Alien vs. Predator" Movie or Video Game? (1)

stms (1132653) | more than 3 years ago | (#33957230)

I've got no idea how fast an "Alien vs. Predator" video game needs the graphics system to be

Agreed. How many IOCs (Instances of Crysis) is that? Like .25?

Re:"Alien vs. Predator" Movie or Video Game? (1)

h4rm0ny (722443) | more than 3 years ago | (#33958390)

Can the hardware play 1080p video without needing a noisy fan? How low power is "low-power"?

1080p Blu-ray isn't that hard. If that's your primary interest, you don't need to worry about getting a fancy card that scores well in gaming tests; the two things (mostly) don't correlate. With AMD cards, you're looking for something that has UVD2 support. That's the clue that indicates a card can give you proper Blu-ray playback. And yes, that's plenty possible without a fan. For example, this oldish 4350 [awd-it.co.uk] should be fine for Blu-ray (it has UVD 2.2, the latest) and is fanless. But I'm certain you could find this cheaper second-hand.

Only a fool would say that a 747 airplane could .. (0, Troll)

Smidge207 (1278042) | more than 3 years ago | (#33955658)

come about by a tornado blowing through a refuse yard that contained the parts to the airplane, or a camera came about by blind chance. Only a fool would believe that a web site could create itself in cyberspace without the intervention of a programmer, even though all the ingredients exist there.

Yet many so-called intelligent people believe that the universe came about by pure chance. This of course has to include the galaxies, solar systems, suns, planets and the micro world of atoms, protons, electrons, quarks etc. Then we also have to consider the dimensions, 1, 2, 3, 4 and so on, plus all the natural laws, anti-matter, dark matter and the code that makes up the fabric of the physical realm. Now the human mind cannot even conceive of the scale of the universe, never mind thinking that such wonderful design came about by luck. To believe that all this, plus much more, came about without a God/designer/programmer is absolutely pathetic. This belief is worse than a fable or fairytale; it is the most extreme form of ignorance possible.

People who do not believe in God also believe that the wonderful and technological human eye was not designed by God; rather, such design came about because creatures without eyes had less chance of survival. Yet when we see a man-made camera we all know without a doubt that it was designed and built by someone. Did a camera come about by evolutionary processes because of the need for people to take photos? No, it was created by intelligence for that need. However, the truth is that the human eye and the human camera both use the same technology to turn light into an image that can be understood. If you believe that a camera is always created by a designer (and so you should), then why is it so hard to believe that there was a creator for the human eye? If you don't attribute the wonderful technology of the human eye to a Creator, then what you are really saying is that you believe nothing, or blind chance, did a much better job with this technology than intelligent beings. Humans, by possessing intellect, can not only increase the odds in their favor of such design coming about, but actually make it possible at all. So how can nothing create something better than so-called intelligent beings? To not believe in a creator is illogical, and this is but one small example of Man's incredible foolishness. It is not hard to imagine the countless more examples of man's foolishness that exist. The question is, are you wise enough to accept obvious truth, or do you want to be blinded to the obvious by reason of your intellectual pride?

If you saw 20 apples on the ground and positioned to form a perfect circle on flat ground, you would automatically assume that these apples were placed there by somebody. What if I told you that there was a hole in my bag of apples, and when they fell out they formed a perfect circle by chance or the more accurately the elements that were in process at the time, such as wind, gravity and the angle that the apples fell out of the bag created this circle of apples. Would you not say that I was mad if I told you that they just fell out that way. Then how is it that some say the complex universe came about by chance. (I think I am being very extremely generous in my examples in a statistical sense).

Such people who believe that the universe itself, formed the earth, other planets, stars and galaxies, in fact the whole universe including the atomic world may be fools, but they are also ignoring the fact that the ingredients had to come from somewhere in the first place, which only makes the odds much greater. The only alternative to chance is constructive design and enlightened people understand that where there is design, there is always a designer.

Re:Only a fool would say that a 747 airplane could (1)

TheGratefulNet (143330) | more than 3 years ago | (#33955856)

I know you are a troll and this is very much OT...

but to think that all things that exist had to have been conciously thought of is the foolish thing, my friend. look around: you think this is the best that a perfect being could have come up with?

your own argument ('how wonderful things are') is the argument I use against you. things are quite shitty 'down here' and there is no sign of anyone in control. a supreme being would probably not be an absentee ruler and yet that's exactly what your religion would have us believe.

"the best thing I could say about god is that he seems to be an underachiever"

have a nice day, fundie.

yeah but (0)

Anonymous Coward | more than 3 years ago | (#33955822)

Will it run Linux?

(old ATI joke)

Linux (0)

Anonymous Coward | more than 3 years ago | (#33957400)

No, that's not an old ATI joke. nVidia runs perfectly; ATI crashes. Nothing else matters if they cost about the same.

If AMD makes comparable, stable drivers for Linux for this new processor, then hell, I'll buy it. I'll probably buy 1000+ of these. But if they don't, well, then I'll buy 0, because I can't use the hardware. It's actually that simple.

Re:Linux (1)

Vectormatic (1759674) | more than 3 years ago | (#33958996)

My MSI laptop with ATI X200 (well, it says X1150 on the tin, but that is a rebadged X200) has been running Linux since 2007 (when I got it, actually). Compiz worked flawlessly, with the 3D cube nonsense and all, in the then-recent Ubuntu (don't remember if it was 7.04 or 7.10).

Yes, ATI drivers for Linux were hell back in 2004 with my 9600 Pro, but for the last few years it has all worked.

Physics and GPUs (1)

stimpleton (732392) | more than 3 years ago | (#33955902)

"...showcasing the new graphics technology and advanced effects from the open source Bullet Physics library"

Nvidia has their PhysX engine, and Intel announced they were acquiring Havok. Bullet is exciting for me. It was used in Grand Theft Auto 4, and in the movie Hancock.

So for me, reading AMD, ATI, Bullet in the same sentence is the interesting part.

Will apple use this new cpu with gpu build in? (1, Offtopic)

Joe The Dragon (967727) | more than 3 years ago | (#33955926)

Will Apple use this new CPU with the GPU built in?

They like thin and small, and Intel's new chip with a built-in GPU does not fit with Apple's tech, and Apple does not like putting add-on video chips in their low-end systems.

Re:Will apple use this new cpu with gpu build in? (0)

MBCook (132727) | more than 3 years ago | (#33956084)

I doubt it. Switching to AMD (especially for only part of their line) seems like it would have a lot of ancillary costs such as the R&D help I know Intel has given Apple. Apple stuck by Intel for years through their abysmal "GPUs" (I've got one, along with an nVidia, in my MacBook Pro). Intel's latest round of integrated GPUs is actually supposed to be pretty good, to the point that on lower end computers (like MacBooks) it may not be necessary to include even a low-end GPU.

Also, don't forget that right now AMD has the Phenom, which is a good chip, and Intel has their current Core line, which is an amazing line of chips. To go to AMD means sacrificing performance per watt on the CPU side.

Two years ago maybe it would have mattered. Today? Too little too late.

but the lack of OPEN CL in intel's chip is bad and (1)

Joe The Dragon (967727) | more than 3 years ago | (#33956118)

But the lack of OpenCL in Intel's chip is bad, and do you want a $700 desktop with Intel video?

A $1200 laptop? A $1500 laptop?

Also, the 320M they have now is better than what Intel's new chip can do.

Re:Will apple use this new cpu with gpu build in? (5, Interesting)

tyrione (134248) | more than 3 years ago | (#33956630)

I doubt it. Switching to AMD (especially for only part of their line) seems like it would have a lot of ancillary costs such as the R&D help I know Intel has given Apple. Apple stuck by Intel for years through their abysmal "GPUs" (I've got one, along with an nVidia, in my MacBook Pro). Intel's latest round of integrated GPUs is actually supposed to be pretty good, to the point that on lower end computers (like MacBooks) it may not be necessary to include even a low-end GPU.

Also, don't forget that right now AMD has the Phenom, which is a good chip, and Intel has their current Core line, which is an amazing line of chips. To go to AMD means sacrificing performance per watt on the CPU side.

Two years ago maybe it would have mattered. Today? Too little too late.

Being a former NeXT and Apple engineer, I can tell you unequivocally your thought is Bull Shit. Intel gave NeXT practically zero information for the NeXTStep port to Intel. Apple designs around Intel specs, and Intel helps as it would any other OEM. No special treatment.

Re:Will apple use this new cpu with gpu build in? (1)

forkazoo (138186) | more than 3 years ago | (#33958372)

Intel's latest round of integrated GPUs is actually supposed to be pretty good, to the point that on lower end computers (like MacBooks) it may not be necessary to include even a low-end GPU.

I've been hearing that Intel's latest graphics are finally pretty good for over a decade. At this point, they could release a graphics chip so amazing that each polygon gives me twenty dollars and a blowjob, and I'd still make a careful point of never using Intel graphics, no matter the cost. It's like the boy who cried wolf. Eventually it just doesn't matter if the story becomes true.

Slightly less severe, but still fresh in my mind: "ATi finally has decent drivers."

nVidia has "Great performance while using very little power this time." They are the graphics vendor that I come closest to trusting, but frankly they all make the same claims every year and it gets a little boring.

Useless resolution/performance measure (0)

Anonymous Coward | more than 3 years ago | (#33955950)

"Test systems showed the integrated GPU had no trouble running Alien vs. Predator at a moderate resolution with DirectX 11 features enabled."

What constitutes "moderate resolution", what does "no trouble" mean (15fps, 30fps, 60fps+?), and how does that performance compare to discrete CPU/GPU setups?

I have no idea what this claim really means other than the fact the system didn't crash when running a reasonably recent game. It says something, but numbers would be nice.

Oh, I see, they took the sentence almost verbatim from the first page of the cited Hot Hardware article, but the submitter didn't deem the resolution important enough to leave it in.

Answer: "a moderate 1024x768 resolution". Moderate it is. I look forward to additional details.

Re:Useless resolution/performance measure (3, Insightful)

mykos (1627575) | more than 3 years ago | (#33956122)

Getting more than 0 FPS at any resolution with those features enabled already puts it ahead of any integrated graphics solution on the market -- and they're doing it at super-low wattage. If it can run AvP that well, it could run anything from 2008 and earlier (save for Crysis) extremely well.

That's at least 90% of all the games ever released for the PC, on an integrated graphics processor. Pretty amazing if you ask me.

Re:Useless resolution/performance measure (1)

mykos (1627575) | more than 3 years ago | (#33956164)

Also, here's a video to help answer your question: http://www.youtube.com/watch?v=ed3InAJhh2k [youtube.com]

Re:Useless resolution/performance measure (0)

Anonymous Coward | more than 3 years ago | (#33958788)

Okay, that's more like it! Yeah, that doesn't look bad at all. Adequate for a lot of uses. Couple that with the low power and now I'm impressed.

Re:Useless resolution/performance measure (0)

Anonymous Coward | more than 3 years ago | (#33958520)

Wow, Alien vs Predator had DX11 support back in 1999? Way to be ahead of the curve.

The article got it wrong (5, Informative)

Suiggy (1544213) | more than 3 years ago | (#33956260)

APU doesn't stand for Applications Processing Unit; it's an acronym for Accelerated Processing Unit.

http://sites.amd.com/us/fusion/apu/Pages/apu.aspx [amd.com]

"The GPU, with its massively parallel computing architecture, is increasingly being leveraged by applications to assist in these tasks. AMD software partners today are taking advantage of the GPU to deliver better experiences to across an ever-wider set of content, paving the way to breakthrough experiences with the upcoming AMD Fusion Family of Accelerated Processing Units (APU)."
