
AMD's Hybrid Graphics Unveiled, Tested

Zonk posted more than 6 years ago | from the more-than-meets-the-eye dept.


ThinSkin writes "The combination of AMD's ATI graphics division and AMD's CPU division means that AMD often fights a two-front war, directly competing against Intel in the CPU business as well as Nvidia in graphics. AMD's Hybrid Graphics technology allows them to fight against both companies at the same time. Inserting an additional card works the same as CrossFire, which, like Nvidia's SLI, was only capable by having two discrete graphics cards installed on a motherboard. ExtremeTech has put the 780G chipset through a series of gaming and synthetic benchmarks to see just how beneficial this technology is. HotHardware has a similar rundown on the technology. The results indicate that Hybrid Graphics aren't yet ideal for the power-hungry gamer, as driver revisions need to be ironed out at this early stage, but performance looks promising."

90 comments

First post! (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#22641298)

I had to do this once.

They also need to test HyperFlash that is in sb7xx (1)

Joe The Dragon (967727) | more than 6 years ago | (#22641314)

They also need to test the HyperFlash that is in the SB7xx, and how many boards will use the two USB 1.1 ports for mouse and keyboard?

Who cares, it sucks (0, Flamebait)

Corwn of Amber (802933) | more than 6 years ago | (#22641598)

And it will suck forevermore. It's integrated graphics, thus it shares RAM with the CPU. Ergo, it will suck forever.

As long as ATI makes real graphics cards, they will be in the competition for perf. As soon as they stop, they're leaving nVidia with the monopoly on real good GPUs.

Re:Who cares, it sucks (4, Informative)

BTG9999 (847188) | more than 6 years ago | (#22641862)

If you would RTFA you would have read that it is possible for the motherboard to have dedicated RAM for the integrated video card, since AMD put a memory interface on the northbridge.

Re:Who cares, it sucks (1)

Corwn of Amber (802933) | more than 6 years ago | (#22642032)

No, it's system RAM. From TFA.

Re:Who cares, it sucks (4, Informative)

Anonymous Coward | more than 6 years ago | (#22642200)

The motherboard's BIOS lets you borrow 128, 256 or 512 MB of RAM from the system's RAM, to allocate it as video memory to the integrated GPU. For the first time ever, AMD is also equipping its integrated graphics chip with a separate memory interface. This allows motherboard makers and OEMs to provide dedicated graphics memory for the integrated chip directly on the board, if they find the GPU's performance unsatisfactory, or don't wish to use a shared-memory solution. In effect, this transforms the integrated on-chip graphics solution into a dedicated graphics card that just happens to reside in the northbridge.
Link [tomshardware.com]. You're right that it is currently limited due to the RAM-sharing, but you are wrong that it will necessarily suck forever. There's no telling yet how the dedicated memory channel will affect performance. Who knows? Perhaps it will move out of the realm of suck.

Re:Who cares, it sucks (1)

RiotingPacifist (1228016) | more than 6 years ago | (#22643030)

That will mean upgradable graphics cards without needing to buy a whole new card, which sounds ideal for non-gamers.

As they're developing both the GFX and CPU drivers, could they do away with the fixed RAM intervals so the graphics card can have any amount of RAM? Hell, could they allow dynamic configuration?

Could they do something clever and let 4 gigs of RAM be split, 3.x GB for the CPU and the rest for the GFX, or is that issue still impossible to address?

Re:Who cares, it sucks (2, Insightful)

PitaBred (632671) | more than 6 years ago | (#22643208)

Hell, forget non-gamers... what about mobile users? I've found that many times it's just the graphics chip in my laptops that's too slow. It'd be great to be able to just pop a new chip in there. Most notebooks don't have upgradeable graphics, and when they do, they still suck.

Re:Who cares, it sucks (0)

Anonymous Coward | more than 6 years ago | (#22644206)

Gee, upgradeable by just replacing the chip? Reminds me of the old Amigas [amigahistory.co.uk] !

Re:Who cares, it sucks (1)

DrSkwid (118965) | more than 6 years ago | (#22650236)

> That will mean upgradable graphics cards without needing to buy a whole new card, which sounds ideal for non-gamers.

In what way do you think having an upgradable graphics card is *any* use to someone not buying a machine on the strength of its graphics card?

What are these mythical people going to use the extra GFX ram for ?

Re:Who cares, it sucks (1)

RiotingPacifist (1228016) | more than 6 years ago | (#22650496)

In what way do you think having an upgradable graphics card is *any* use to someone not buying a machine on the strength of its graphics card?
That's exactly who would upgrade graphics cards. If you buy a machine on the strength of its graphics card, upgrading the RAM isn't going to be enough for you; you're likely to want the latest chipset that gives better performance. But if you buy a system and then a couple of years down the line you find it's a bit sluggish (say you start running Compiz / Vista effects), well, you could just stick a new stick of RAM in it.

What are these mythical people going to use the extra GFX ram for ?
Running desktop effects
Photo editing
Video editing
Playing games they couldn't play before (not on full graphics as true gamers do, but just managing to play with the extra RAM; you're still on an old chipset after all)

Re:Who cares, it sucks (1)

DrSkwid (118965) | more than 6 years ago | (#22653824)

I think it's a sales gimmick; the price of extra video RAM in a new build will hardly be noticed at something like $30 for 1GB.

Re:Who cares, it sucks (0)

Anonymous Coward | more than 6 years ago | (#22642298)

The motherboard can have both. I believe it can have up to 128 MB of dedicated RAM just for the graphics, and can then use up to 512 MB of system RAM too.

More good reviews (5, Informative)

Vigile (99919) | more than 6 years ago | (#22641356)

There are some other good looks at RS780 performance:

http://www.pcper.com/article.php?aid=527 [pcper.com] - looks at Hybrid CrossFire with several games in real world testing as well as GPU overclocking; also features the new AMD X2 4850e processor
http://www.techwarelabs.com/reviews/processors/780g-and-4850e/ [techwarelabs.com] - looks at both the chipset and CPU
http://techreport.com/articles.x/14261 [techreport.com] - good motherboard review
http://www.bit-tech.net/hardware/2008/03/04/amd_780g_integrated_graphics_chipset/1 [bit-tech.net] - tests HQV and HD audio systems

Past history (0)

Anonymous Coward | more than 6 years ago | (#22641780)

If we have to rely on ATI's driver writing abilities, I'm not going to have much confidence in it.

Unless I hear rave reviews, I'm sticking with nVidia, thank you very much. AMD's processors kick butt, however.

Personally, I've always wondered about the ATI purchase. Considering AMD's long history with nVidia, that company would have been a far better fit as far as creating total quality solutions goes.

Re:Past history (0, Flamebait)

FishWithAHammer (957772) | more than 6 years ago | (#22642640)

AMD's processors kick butt, however.

Um, what? AMD's processors are terrible these days. There's a reason they're absolutely bleeding money: they're being killed in all segments of the processor market by Intel.

Re:Past history (2, Informative)

edwdig (47888) | more than 6 years ago | (#22643076)

Um, what? AMD's processors are terrible these days. There's a reason they're absolutely bleeding money: they're being killed in all segments of the processor market by Intel.

They're not terrible, they're just not quite as good as Intel's at the moment.

Terrible is things like Via processors or Transmeta or the other junk you normally wouldn't even consider.

Re:Past history (1)

Wdomburg (141264) | more than 6 years ago | (#22643274)

Via isn't terrible; they're just aimed at a different segment. The current C7 chips are more akin to the upcoming Intel Atom chips (and in fact share very similar design characteristics).

Re:Past history (2, Insightful)

maz2331 (1104901) | more than 6 years ago | (#22646554)

I've noticed that AMD tends to leapfrog Intel in a really big way every few years, then Intel slowly catches up and maybe passes them for a while with evolutionary changes. Then AMD hits another "breakthrough" that blows Intel totally out of the water again for a couple of years.

AMD tends to be smaller, more agile, but slower at the evolutionary tweaks than Intel. Intel's sheer size gives them an edge on the drudgery of small performance and cost optimizations, but they are so big that the "outside the box" thinking needed to really innovate is lost in committee before AMD releases the product.

Right now, Intel has the upper position. Give it a year or two...

Re:Past history (3, Insightful)

twistedcubic (577194) | more than 6 years ago | (#22645710)


Um, what? AMD's processors are terrible these days.

Um, no. Last year I got an Athlon X2 4600+ (65 watts max) and it does everything I need, and the stock HSF is almost silent. I seriously doubt an Intel processor could do everything this processor does for me, for the same amount of money. And no, I can't overclock because I can't risk the math errors.

It's silly to compare the processors based on those commonly used benchmarks (Quake? WTF?). Even those artificial benchmarks which purport to demonstrate number crunching speed are not as useful as you might think. I could do just as well with an Intel processor, but it will cost me significantly more money to do so because the Intel motherboards and processors are more expensive. I suppose if I played games I would buy a really fast Intel processor, crank the voltage, run a really loud HSF to keep it cool, and curse AMD for not providing me with this wonderful opportunity. But alas, I don't.

Re:Past history (2, Informative)

Kjella (173770) | more than 6 years ago | (#22648720)

Ever since the Core processors came on the market, Intel has had power parity or better. Even the fastest Intel E8500 3.16 GHz operates with a TDP of 65 watts, while the regular 4600+ has a TDP of 89 watts unless you have the EE edition. Source: http://en.wikipedia.org/wiki/CPU_power_dissipation [wikipedia.org]. Note that TDP = Thermal Design Power and doesn't say much about how much it really draws, but in general you can see where it gets bumped up. For example, the E4300 1.80 GHz has the same TDP as the previously mentioned E8500 3.16 GHz, but you can be sure it draws a lot less than that while the E8500 is probably quite close.

Re:Past history (1)

h4rm0ny (722443) | more than 6 years ago | (#22647144)


Terrible? Not at all. AMD is still producing very good processors. They were also first up with proper quad-core chips. There's nothing wrong with AMD chips. They're very good. Likewise in the server market their quad core Opterons are excellent. I'm buying exclusively AMD at the moment.

Re:Past history (1)

FishWithAHammer (957772) | more than 6 years ago | (#22652924)

Terrible? Not at all. AMD is still producing very good processors. They were also first up with proper quad-core chips. There's nothing wrong with AMD chips. They're very good. Likewise in the server market their quad core Opterons are excellent. I'm buying exclusively AMD at the moment.

"Proper" quad-core chips? What kind of performance gains are those getting you over a C2Q? (The answer is "just about none.")

Opterons are the only place in the market where AMD is competitive. There is literally no reason to buy AMD for anything else at the moment.

Re:Past history (1)

Nicolay77 (258497) | more than 6 years ago | (#22652896)

Not really that terrible now.

They were terrible just before Intel launched the Core 2 Duo: they only cared about the most profitable segments, being too expensive at the high end, and that happened just because of limited production capacity, so they could not provide the market with a whole array of products from low end to ultra high end.

AMD was much better several years ago, with much more bang for the buck than Intel in the segments I usually buy, not the highest end, but still providing competitive value for the price.

Re:Past history (1)

FishWithAHammer (957772) | more than 6 years ago | (#22652990)

AMD was much better several years ago, with much more bang for the buck than Intel in the segments I usually buy, not the highest end, but still providing competitive value for the price.

So basically where the Core 2 Duos are kicking ass and taking names? Come on, I understand the fanboy attitude but there's nothing to recommend AMD at present outside of the server market. Intel's equal-or-better in power consumption and better in performance. They're not competitive for the price by any stretch.

Re:Past history (1)

Nicolay77 (258497) | more than 6 years ago | (#22654184)

I said: "Not really that terrible now". Their prices are far less inflated than in 2005-2006. Fact.

I also said they were much better several years ago. And several means like 1999. So there's another fact.

Intel is better now; they have been better since mid 2006, BUT I DIDN'T SAY IT. Is that such a huge problem?

It doesn't mean that I don't know about it, just that this was an AMD discussion so I thought it was not relevant to the point (that AMD lost their soul by getting too greedy in 2005, and never recovered it). I also don't believe that AMD will leapfrog Intel in the next 3 or 4 years. I can't predict the future, but the trend seems to be that they will lag in production capacity even more. In fact in my post I was trying to explain AMD's part in its own fall.

Now please stop calling me a fanboy unless you can prove it, with something like an argument, and also please stop being an Intel fanboy yourself.

Re:Past history (0)

PopeRatzo (965947) | more than 6 years ago | (#22642804)

Unless I hear rave reviews, I'm sticking with nVidia, thank you very much.
Wait, I thought it was nVidia that has the weak drivers for graphics cards, not ATI. Or is that just for Linux?

I've really got to stop reading the hardware reviews. I just get all worked up for nothing. I've learned that as long as I stay around $1500 to $2000 worth of computer, I get what I need. I guess I've grown out of needing to spend an extra thousand bucks just to get the system with the fastest processor or hottest benchmarks.

When I look at some of the ads in the freebie PC Gamer magazines that end up on my doorstep despite my never having asked for them, and I see the latest systems touted in terms of being able to become a "Fragging Behemoth" or a "Killing machine so awesome you'll need a license", I get the feeling that the market for the top level of hardware has left me behind. Or maybe it's the other way 'round.

I do like the looks of that Blackbird 2, though. Maybe when I get that free 1200 bucks from my pal George Bush I'll add it to my tax refund and see if I can set one a them up as a DAW (with fragging on the side). The liquid cooling might quiet things down in my project studio.

Re:Past history (1, Flamebait)

edwdig (47888) | more than 6 years ago | (#22643270)

Wait, I thought it was nVidia that has the weak drivers for graphics cards, not ATI. Or is that just for Linux?

Nvidia's primary advantage is their drivers. They've always been leaps and bounds above ATI's. They go back and forth on who has the better hardware. When ATI has the advantage in raw power, it's often canceled by the lower quality of the drivers.

Nvidia's Linux drivers are generally excellent, usually offering performance similar to the Windows drivers. There's a little variance from model to model and release to release, but it's close and the advantage can swing either way. However, there's a very small but rather vocal minority of users who have conflicts between Nvidia's drivers and something else in their system and who like to complain a lot about them.

Re:Past history (1)

DrSkwid (118965) | more than 6 years ago | (#22650280)

> there's a very small but rather vocal minority of users that have conflicts between Nvidia's drivers and something else in their system who like to complain a lot about them.

The sort that don't like having their machine rooted by their binary blob video driver

or maybe the sort that don't run Linux

There was once a time when people complained about lack of hardware documentation, please don't lose sight of Freedom Zero.

Re:Past history (1)

edwdig (47888) | more than 6 years ago | (#22652228)

The sort that don't like having their machine rooted by their binary blob video driver

Is that supposed to mean it's OK if your machine gets rooted due to an open source video driver?

or maybe the sort that don't run Linux

If they're complaining about Nvidia's Linux drivers not working on their system, then they're rather stupid, don't you think?

There was once a time when people complained about lack of hardware documentation, please don't lose sight of Freedom Zero.

Publicly available hardware documentation is a good thing to have; however, it has absolutely nothing to do with the quality of the drivers provided by Nvidia.

Re:Past history (1)

DrSkwid (118965) | more than 6 years ago | (#22653728)

> Is that supposed to mean its ok if your machine get rooted due to an open source video driver?

It is the preferred option over a binary blob, yes.

New blobs take considerably longer than source code mods.

Extremetech story an eye sore (0)

Anonymous Coward | more than 6 years ago | (#22641366)

I love how ExtremeTech loves to put like 30 words on a page. What an eyesore. The HH story linked there is much better.

Risky Submission (4, Funny)

imstanny (722685) | more than 6 years ago | (#22641412)

...but can it run Aero in Vista?

Re:Risky Submission (-1, Troll)

Anonymous Coward | more than 6 years ago | (#22641522)

Who cares...Windows ME 2, oopsy...I mean Vista, won't be around for long.

Re:Risky Submission (1)

corychristison (951993) | more than 6 years ago | (#22641566)

I had assumed that was the point.

To bring decent-end graphics to the mass consumer market to run Vista's Aero Glass & Friends as well as some games.

I feel this (in its current state) is more a shove at Intel's GMA graphics processors than anything.

Re:Risky Submission (2, Informative)

Guspaz (556486) | more than 6 years ago | (#22641594)

Previous gen onboard graphics (this new stuff is DX10) was capable of running Aero. The requirements for Aero aren't terribly demanding, far less than an actual game.

Re:Risky Submission (3, Interesting)

PrescriptionWarning (932687) | more than 6 years ago | (#22641804)

Aye, but there's a difference between minimum requirements and recommended requirements. Quality and response time are what you'll notice in Aero between a simple onboard accelerator and, say, a GeForce 5 series or higher.

Re:Risky Submission (3, Informative)

everphilski (877346) | more than 6 years ago | (#22642116)

Sure, but my $300 laptop has an onboard GeForce 6 series chip. Just got to avoid Intel graphics like the plague and you'll be fine.

Re:Risky Submission (1)

merreborn (853723) | more than 6 years ago | (#22644432)

Just got to avoid Intel graphics like the plague and you'll be fine.
The Intel GPU in my MacBook beats the hell out of the mobile Radeon GPUs that most comparably-priced laptops have (at least, as of a year ago -- I'm talking about the X200, etc.). Granted, most Intel GPUs are going to be weak, but they're not all bad.

Re:Risky Submission (1)

BlueParrot (965239) | more than 6 years ago | (#22647530)

Or... actively seek them out and install Ubuntu. These days it is actually possible to get a laptop with all hardware supported by open source drivers (well, except the BIOS I guess) and, depending on where you live, without the Microsoft tax.
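
As an illustration (not from TFA or the comment above), here is a minimal C sketch -- hypothetical filename gpu_driver.c, assuming a standard Linux sysfs layout -- that reports which kernel driver is bound to each display-class PCI device, roughly the same information "lspci -k" gives you. Seeing radeon, nouveau or i915 means an open source driver; fglrx or nvidia means a binary blob.

<ecode>
/* gpu_driver.c -- build with: gcc gpu_driver.c -o gpu_driver */
#include <stdio.h>
#include <string.h>
#include <dirent.h>
#include <unistd.h>
#include <limits.h>

int main(void)
{
    /* Every PCI device appears as a directory under this path. */
    DIR *bus = opendir("/sys/bus/pci/devices");
    if (!bus) { perror("opendir"); return 1; }

    struct dirent *de;
    while ((de = readdir(bus)) != NULL) {
        char path[PATH_MAX];
        unsigned int cls = 0;

        if (de->d_name[0] == '.')
            continue;

        /* The "class" file holds the PCI class code; display controllers are 0x03xxxx. */
        snprintf(path, sizeof path, "/sys/bus/pci/devices/%s/class", de->d_name);
        FILE *f = fopen(path, "r");
        if (!f)
            continue;
        if (fscanf(f, "%x", &cls) != 1) { fclose(f); continue; }
        fclose(f);
        if ((cls >> 16) != 0x03)
            continue;

        /* "driver" is a symlink to the kernel driver bound to this device, if any. */
        char drv[PATH_MAX];
        snprintf(path, sizeof path, "/sys/bus/pci/devices/%s/driver", de->d_name);
        ssize_t n = readlink(path, drv, sizeof drv - 1);
        if (n > 0) {
            drv[n] = '\0';
            const char *name = strrchr(drv, '/');
            printf("%s -> %s\n", de->d_name, name ? name + 1 : drv);
        } else {
            printf("%s -> no driver bound\n", de->d_name);
        }
    }
    closedir(bus);
    return 0;
}
</ecode>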

There's nothing wrong with Intel graphics (1)

Joce640k (829181) | more than 6 years ago | (#22648364)

My laptop has an Intel 945 chipset and it runs just fine. My friend has Vista on his and Aero runs no problem (and so it should, Aero is just image compositing, not vertex processing).

I bought my laptop just for compatibility testing (I write 3D graphics software) and the graphics have been very stable and surprisingly fast. Intel drivers have always been good. I'm still not sure there are working drivers for the latest ATI/NVIDIA cards (I've had an unusable ATI 2600 HD sitting on my desk for the last six months because there's no working driver for it; NVIDIA aren't all that much better).

The only problems are the lack of 3D antialiasing and the fact that vertex processing is done in software, though it isn't anywhere near as slow as I expected after all the trashing they get from script kiddies.

They're no good for gaming, sure... but there's absolutely nothing wrong with them for everybody else.
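
A second illustrative sketch (not the poster's actual test code; hypothetical filename gl_probe.c, assuming the freeglut and OpenGL development packages are installed): the first thing worth logging in this kind of compatibility testing is which vendor, renderer and GL version the context actually landed on, since that shows immediately whether you're on the Intel, ATI or NVIDIA driver or on a software fallback (those typically report a "GDI Generic" or Mesa software rasterizer renderer string).

<ecode>
/* gl_probe.c -- build with: gcc gl_probe.c -o gl_probe -lglut -lGL */
#include <stdio.h>
#include <GL/glut.h>

int main(int argc, char **argv)
{
    /* glGetString() only returns useful data once a GL context exists,
     * so create a throwaway window first. */
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGB | GLUT_SINGLE);
    glutCreateWindow("gl-probe");

    printf("GL_VENDOR   : %s\n", (const char *)glGetString(GL_VENDOR));
    printf("GL_RENDERER : %s\n", (const char *)glGetString(GL_RENDERER));
    printf("GL_VERSION  : %s\n", (const char *)glGetString(GL_VERSION));
    return 0;
}
</ecode>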

Re:Risky Submission (0)

Anonymous Coward | more than 6 years ago | (#22642894)

No, but it runs linux!

FTA: (0)

Anonymous Coward | more than 6 years ago | (#22641564)

...780G-based platform that idles under 80W and runs under full load at 155W. But then AMD adds an element much less common in the integrated world: great performance, regardless of whether you're executing threaded audio encoding software, the latest gaming titles, or even a simple file compression routine. Inclusion of AMD's full UVD gives the chipset real video decoding chops, too. http://www.hothardware.com/articles/AMD_780G_Chipset_and_Athlon_X2_4850e_Preview_/ [hothardware.com]

3-way SLI? (2, Insightful)

T-Bone-T (1048702) | more than 6 years ago | (#22641610)

If SLI can only do 2 cards, what was that when they did 3-way SLI a couple months ago?

Re:3-way SLI? (0)

Anonymous Coward | more than 6 years ago | (#22641782)

I got the torrent of that one. It was hot!

Re:3-way SLI? (0)

Anonymous Coward | more than 6 years ago | (#22643828)

It makes me sad to see parent modded down. :(

Re:3-way SLI? (1)

ChrisStrickler (1157941) | more than 6 years ago | (#22642138)

Re:3-way SLI? (1)

T-Bone-T (1048702) | more than 6 years ago | (#22642216)

It wasn't Crossfire. It was definitely 3-way SLI. According to the summary, that doesn't exist.

Re:3-way SLI? (1)

Gideon Fubar (833343) | more than 6 years ago | (#22642756)

here ya go: http://www.nvidia.com/object/io_1197375200475.html [nvidia.com]

Of course, the article doesn't explicitly say 2 cards are needed... It's referring to the fact that onboard graphics were previously replaced by PCI/AGP/PCI-E graphics cards, and that the two systems wouldn't work together at all.

I can't say for sure (I haven't read up on it properly yet), but what AMD appear to have done is make it possible to have an SLI-style system that is capable of using onboard graphics and PCI-E graphics at the same time.

Re:3-way SLI? (1)

theeddie55 (982783) | more than 6 years ago | (#22642376)

It doesn't say SLI can only do 2 cards; it says SLI needs 2 discrete cards, which doesn't mean it can't have more. There are a couple of motherboards due for release soon allowing 4-way SLI, though they need a modified case for 4 double-width cards and it doesn't leave space for anything else.

Re:3-way SLI? (1)

T-Bone-T (1048702) | more than 6 years ago | (#22642600)

Yes, it does say it can only do 2 cards:

Inserting an additional card works the same as CrossFire, which, like Nvidia's SLI, was only capable by having two discrete graphics cards installed on a motherboard.

Re:3-way SLI? (2, Interesting)

Gideon Fubar (833343) | more than 6 years ago | (#22642986)

No it doesn't... it says that this system can only be used by having two cards mounted on the motherboard. Necessary and sufficient conditions.

Perhaps, however, it would have been less confusing if they'd said "two (or more)". Note that they say "like Crossfire", which can certainly support more than 2 cards.

If you assume they're talking about the difference between two mounted cards and one mounted card working with onboard graphics, it makes a lot more sense.

Re:3-way SLI? (1)

T-Bone-T (1048702) | more than 6 years ago | (#22643748)

It is amazing what rereading will do for you. I read it as "capable of", not "capable by". My mistake.

Re:3-way SLI? (2, Informative)

djtachyon (975314) | more than 6 years ago | (#22642468)

Well I have a 3-Way SLI nVidia motherboard now. The only current chipsets to support it are the nVidia 680i/780i/790i chipsets. Only catch is that no PCIe2.0 nVidia cards support Triple-SLI. So you have to use either the nVidia GeForce 8800 GTX or Ultra. Not sure why TFA is vague on this.

Nobody's expecting it to game (-1, Troll)

Zantetsuken (935350) | more than 6 years ago | (#22641796)

Nobody with any kind of experience with computers and gaming who's in their right mind would expect any sort of onboard or hybrid GPU to excel at gaming, so why even mention it? What, because it gives the summary and article an extra bit of filler to make it look longer?

Nobody's expecting it to coprocess. (0)

Anonymous Coward | more than 6 years ago | (#22642398)

There's one benefit to it being there: as a known coprocessor for those doing CUDA.

Re:Nobody's expecting it to game (1)

ArsonSmith (13997) | more than 6 years ago | (#22643370)

Meanwhile in history, circa 1992:

Nobody with any kind of experience with computers and mathematics would, in their right mind, even expect any sort of integrated FPU to excel at floating point, so why even mention it? What, because it gives the summary and article an extra bit of filler to make it look longer?

how to identify a platform? (-1, Offtopic)

akshaykrsingh (1250828) | more than 6 years ago | (#22641800)

Well guys, I am from a different field and was going through all your conversation for a long time. How do you guys identify a platform? Say, for instance, let's take a web site designing site in Mumbai called www.HugeH.com. How do I identify the platforms used in it? http://www.hugeh.com/ [hugeh.com] plz help thanks akshay

Re:how to identify a platform? (0)

Anonymous Coward | more than 6 years ago | (#22642586)

PROTIP: you fail at hacking. and just about everything else.

Perhaps... (1)

Kyrubas (991784) | more than 6 years ago | (#22642124)

Perhaps this may be AMD/ATI's crack at the science rendering market that nVidia has locked down pretty well. If I remember right, AMD/ATI released the specs on some of their cards for this kind of work; maybe developing this is a logical step for them in gaining this part of the market, as well as a simple way to diversify their products by, counterintuitively in a way I suppose, combining two of their markets.

Fr1st stop (-1, Redundant)

Anonymous Coward | more than 6 years ago | (#22642186)

Future. The 4and Real problems that

Wrong article summary (5, Insightful)

archen (447353) | more than 6 years ago | (#22642410)

AMD is in competition with Intel
ATI is in competition with Nvidia
AMD + ATI is in competition with INTEL

Which video chipset manufacturer has the majority of the market? ATI? Nvidia? Matrox? No, Intel does. In fact Intel has more market share then ATI and Nvidia combined. I highly doubt the gamer market will be very high on the uptake of not being able to upgrade their video card. As such this must be aimed more at the integrated mainboard chipset market where Nvidia isn't even a very big player.

Re:Wrong article summary (2, Interesting)

MorpheousMarty (1094907) | more than 6 years ago | (#22643928)

I highly doubt the gamer market will be very high on the uptake of not being able to upgrade their video card.
You can add a video card of your choice, and you can even set it up in a hybrid CrossFire configuration with compatible cards, with good results [tomshardware.com]. As a gamer on a budget this definitely grabs my attention.

Re:Wrong article summary (1)

thedarknite (1031380) | more than 6 years ago | (#22644018)

IIRC Nvidia was one of the major suppliers of on-board graphics for AMD-based motherboards, and since AMD's purchase of ATI they are now focusing more on competing with Intel in that space.

Re:Wrong article summary (0)

Anonymous Coward | more than 6 years ago | (#22644802)

Which video chipset manufacturer has the majority of the market? ... Intel does. In fact Intel has more market share then ATI and Nvidia combined.

Yup, that's what majority means.

Another fun fact: Intel has more than 50% of the market! Betcha didn't know that.

Re:Wrong article summary (1)

Whiteox (919863) | more than 6 years ago | (#22645244)

AMD + ATI is in competition with INTEL
I doubt that, otherwise Intel would not be producing the X38 chips and their variants.
I just recently built a custom system with an X38 chipset optimized for ATI CrossFire with an Intel CPU socket.
LOL And I had to put in an Nvidia 8800GT!
Works really well, especially with 4GB of 1200MHz RAM :)

Re:Wrong article summary (1)

RzUpAnmsCwrds (262647) | more than 6 years ago | (#22645434)

AMD is in competition with Intel
ATI is in competition with Nvidia
AMD + ATI is in competition with INTEL


First, AMD and ATI are the same company, and the company is named AMD. ATI is a brand used for AMD's graphics solutions.
Second, AMD and NVIDIA are very much in competition, both in discrete graphics AND in core logic (chipsets).

In fact Intel has more market share then ATI and Nvidia combined.


Incorrect. AMD and NV both have around 28% of the market, Intel has about 40%, and the balance is controlled by VIA and other players.

Re:Wrong article summary (1)

DrSkwid (118965) | more than 6 years ago | (#22650336)

Please learn to discriminate between then and than. Thanks.

Re:Wrong article summary (1)

Nicolay77 (258497) | more than 6 years ago | (#22653136)

Actually, I would say 'relearn'.

This mistake was totally non-existent back in 2003. Don't know why it caught on.

hybrid graphics, or integrated graphics? (1)

chipace (671930) | more than 6 years ago | (#22642654)

I thought that hybrid graphics was the CPU and GPU on the same die. Integrated graphics is pretty much the norm, with predictable performance increases each year.

buzzwords beehaving badly (1)

cizoozic (1196001) | more than 6 years ago | (#22642868)

Actually, hybrid graphics merely supplement a traditional graphics engine with an electrical counterpart.

AWEsOME ?fp (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#22643266)

not going home to die. I will jam project. Today, as www.anti-slash.org FreeBSD's Already 4ware, *bSD [anti-slash.org] alike to reap we need to address that comprise

Beware onboard video for 1080p HTPC. (2, Interesting)

benow (671946) | more than 6 years ago | (#22643298)

I picked up an HTPC with onboard Nvidia graphics and while it's great for everything else, it has a hard time with 1080p. I just kind of assumed it'd be able to do fullscreen video at 1920x1080, but it is very choppy. Something to consider when looking for an HTPC. There must be reviews of onboard graphics out there...

Re:Beware onboard video for 1080p HTPC. (1)

wild_berry (448019) | more than 6 years ago | (#22643516)

The ATI Radeon HD 3xxx series video decode accelerator is in this on-board graphics chip, and will do MPEG-2/4, H.264 and VC-1 for you.

Re:Beware onboard video for 1080p HTPC. (2, Informative)

Pulzar (81031) | more than 6 years ago | (#22643912)

I just kind of assumed it'd be able to do fullscreen video at 1920x1080, but it is very choppy. Something to consider when looking for an HTPC.

That's the whole "point" of the AMD 780G -- it's the first one that can do it, and do it very well. It has built-in video decoders to handle even the most demanding Blu-ray discs. On top of that, it's actually able to play most new games, and pretty much all new DX10 games when you add a $50 video card and run them together.

So, yes -- beware of onboard video, but only before this one :).

Everyone can do everything perfect! (1)

definate (876684) | more than 6 years ago | (#22643958)

I understand that they are merging two similar things; however, this is not necessarily good for them. Sure, they have consolidated their products (to some extent), but this only puts a greater management burden on themselves. Do they err on the side of the processor, or the graphics processor? Which gets more attention and money?

Since they can't drop the original architecture just yet, I see them as now fighting on 3 fronts.

Someone should smack them with the Wealth of Nations; we divide labour around because nobody can do everything perfectly!

Dear AMD CEO (Hector Ruiz), (2, Insightful)

DraconPern (521756) | more than 6 years ago | (#22646788)

None of that hardware matters if the drivers suck. Please hire some good driver developers.

No future in it (2, Interesting)

tacocat (527354) | more than 6 years ago | (#22647514)

I was in a rather lengthy conversation last week about the future of gaming on computers. Conclusion is that games are not going to survive long on computers for the primary reason that they are far too costly to support. The natural development is to move into highly specialized hardware and better manage the video requirements.

Here's the core of the problem: The video card becomes the single most expensive piece of hardware in a workstation chassis. Within six months I am buying games that marginally run on the equipment and at the end of the year I'm pretty much out. Even at the time of purchase, some video games won't run on the hardware. And gaming is the only segment of the software industry that is pushing against this hardware limitation. Office products, web browsers, email applications do not require this heavy hardware.

There is an increasing movement from desktop to a more distributed/mobile environment of notebooks and central workstations that act as servers for print, file, proxy applications. Notebooks are not built with 100W video cards. But notebooks are what you get when you go to college.

With the advent of the PS3, Xbox 360 and Wii, there are specialized pieces of hardware that are intended for gaming and have fixed hardware capabilities. These are the new gaming environments that people are moving into. The issue now is for them to solve how to do MMORPGs and similar game constructs on this hardware platform. But by moving game development into this environment there is zero work they have to do in order to get hardware compatibility solved, like they do with computers. It's a fixed environment.

Re:No future in it (1)

mpeskett (1221084) | more than 6 years ago | (#22648682)

I look at it this way: I have a reasonable PC sat under the desk that does all the normal tasks plus some moderate gaming. To push that up to the level of a top-class games machine I'd need to get myself an expensive graphics card. To get a separate games machine, I'd need to get myself an expensive games console... which is more or less the same cost to me (depending on exactly which card and which console I hypothetically go for). But upgrading the PC still has the advantage, because I'd rather just have the one box that does everything; it means less clutter (especially with all the peripheral devices you get around a games console) and it means I don't have to bugger about to go from being on the internet to playing a game or back again. Other points in the PC's favour are that you don't have to worry about which type of PC to get or miss out on certain games because they only come out on one brand of PC, you can build your own and it'll be just as good as (or better than) one you buy, and the thing's upgradeable as far as your budget stretches instead of having to buy a whole new box of tricks every few years (and at that point worry about whether your old games will be supported properly by the new device).

Re:No future in it (1)

drsquare (530038) | more than 6 years ago | (#22666662)

To push that up to the level of a top-class games machine I'd need to get myself an expensive graphics card. To get a separate games machine, I'd need to get myself an expensive games console...
The problem with the PC is, to plug in that video card you also need a new PCI Express motherboard. Which means you also need a new processor and new RAM, and it's SATA so your old hard disk and DVD drive don't work, so you need to buy them again. Then you need a new power supply with all the newfangled connectors to get it all to work. Of course, the new graphics card is DVI only, so you need a new monitor. Then you need a special gaming mouse, and a controller, and speakers comparable to your living room TV. Before you know it you've spent the best part of a grand. Then you need Windows XP to run all the latest games, which is even more money spent.

Then a couple of years later when Doom 4 or whatever comes out, it's all obsolete and you get to start rebuying all over again.

Re:No future in it (1, Informative)

Anonymous Coward | more than 6 years ago | (#22651144)

Well, this has always been the case and PCs are still around. You can't play cutting-edge games without cutting-edge new technology. Don't mistake the move to consoles as something to do with graphics; gaming companies say it's more of a response to piracy.

Re:No future in it (1)

guruweaver (1008847) | more than 6 years ago | (#22654254)

You make valid points, and that scares me. I realize I'm a niche market, but I travel quite a bit for business, usually away 5 out of 7 days a week. I'm a gamer and use evening gaming as entertainment and stress relief. I do not want to have to haul a big, heavy console (Xbox 360/PS3) with me and have to fight to hook it up to the hotel TV just to game. The Wii is surprisingly portable, but even then, it's one more set of things (Wii, brick, controller, game disks, sensor bar, and RF modulator + cables (for TVs without RCA connectors)) to carry with me. I rely on my laptop for gaming. If PC gaming fades away, I hope the consoles get a lot more portable. Or maybe with decent remote abilities (RDP/VNC into the console at home and play from the hotel? There's a thought. :) ) Guru

Lock-in (1)

Yfrwlf (998822) | more than 6 years ago | (#22647904)

I wondered how long it would take them to find a way to truly lock in their products all the way to the graphics card. If this catches on, they finally have. Remember back when you could put any brand of processor in your motherboard? Then they got rid of that. Then they started releasing their own north bridges and spread rumors that if you used their brand of north bridge with their brand of graphics card it could run faster. Then, they tried to go even further and tell everyone that if you used their brand of north bridge, video card, and CPU all together with their "Spider" "platform" that it would have some kind of magical, proprietary speed increase, like you were supposed to be glad that they're taking steps to kill more standards and further control the market.

Sorry AMD, I believe in standards and in competition. Make me a CPU and a video card that can work in any computer so that we can get true benchmark comparisons of your products and have the flexibility to do things like easily put in a replacement if your friend's hardware device X dies suddenly.

driver revisions need ironing out? (0)

Anonymous Coward | more than 6 years ago | (#22650010)

Gee, those Catalyst drivers work so well right now, I wonder what the holdup could be?