
AMD Delivers DX11 Graphics Solution For Under $100

timothy posted more than 4 years ago | from the does-not-improve-c64-aztec dept.

Graphics

Vigile points out yesterday's launch of "the new AMD Radeon HD 5670, the first graphics card to bring DirectX 11 support to the sub-$100 market and offer next-generation features to almost any budget. The Redwood part (as it was codenamed) is nearly 3.5x smaller in die size than the first DX11 GPUs from AMD while still offering support for DirectCompute 5.0, Eyefinity multi-monitor gaming and of course DX11 features (like tessellation) in upcoming Windows gaming titles. Unfortunately, performance on the card is not revolutionary even for the $99 graphics market, though power consumption has been noticeably lowered while keeping the card well cooled in a single-slot design."


133 comments


Why? (2, Insightful)

Nemyst (1383049) | more than 4 years ago | (#30793830)

I'm sorry, I've seen this news go all around tech sites and... I don't get it. Yay, DX11. The biggest new features I could see about it were hardware tessellation and compute shaders. What, this requires a powerful GPU in the first place to be of any use? Something much, much better than this card? Oh...

Seriously, good for AMD, but I just don't see the point. Say it's a good card, say it has very low power consumption, but hyping DX11 when it has no particular benefit - especially at this price point - is absolutely useless.

And before anyone says I'm just bashing AMD, my computer has a 5850.

Re:Why? (5, Informative)

Lord Crc (151920) | more than 4 years ago | (#30793992)

I don't get it. Yay, DX11. The biggest new features I could see about it were hardware tessellation and compute shaders.

Compute shaders, or more generally GPGPU (via OpenCL as well as DX11) will open up a huge new market for GPUs. One midrange GPU can replace a small cluster of computers at a fraction of the cost. For example, using 2-3 GPUs in one box, people doing architectural visualization can get their results in minutes instead of days.
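To make that concrete, here's a minimal OpenCL sketch of the data-parallel pattern involved (an illustration under assumptions, not production code: it presumes an OpenCL 1.x runtime and headers are installed, picks the first GPU it finds, and omits all error checking). Every array element gets its own work-item, which is exactly the shape of workload that lets a midrange GPU stand in for a small cluster:

    /* Build with something like: cc vecscale.c -lOpenCL */
    #include <CL/cl.h>
    #include <stdio.h>

    static const char *src =
        "__kernel void scale(__global float *v, float k) {\n"
        "    size_t i = get_global_id(0);\n"
        "    v[i] *= k;\n"
        "}\n";

    int main(void)
    {
        enum { N = 1024 };
        float data[N];
        size_t n = N;
        for (int i = 0; i < N; i++) data[i] = (float)i;

        cl_platform_id plat; cl_device_id dev;
        clGetPlatformIDs(1, &plat, NULL);
        clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "scale", NULL);

        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                    sizeof(data), data, NULL);
        float factor = 2.0f;
        clSetKernelArg(k, 0, sizeof(buf), &buf);
        clSetKernelArg(k, 1, sizeof(factor), &factor);

        /* One work-item per element; the GPU runs them in parallel. */
        clEnqueueNDRangeKernel(q, k, 1, NULL, &n, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof(data), data, 0, NULL, NULL);

        printf("data[10] = %g\n", data[10]); /* prints 20 */
        return 0;
    }

The same kernel source runs unchanged whether the device has 80 shader units or 1600; that scaling is the whole GPGPU pitch.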

Re:Why? (1)

Lord Crc (151920) | more than 4 years ago | (#30794146)

I should add that the arch. viz. example was about compute shading in general. But even "puny" cards such as this one should give a nice boost to many GPGPU applications.

Re:Why? (3, Insightful)

Kjella (173770) | more than 4 years ago | (#30794408)

For example, using 2-3 GPUs in one box, people doing architectural visualization can get their results in minutes instead of days.

Yeah, and the point was that those people wouldn't be buying this card. Face it, a GPGPU isn't a general-purpose CPU; we have some companies that are already damn good at making those. This means you either need it or you don't, and if you do, you'll probably want a lot of it. Companies and research institutions certainly will have the money, and even if you are a poor hungry student you can probably afford to invest $200-300 in your education for an HD 5850, which has a ton of shaders compared to this. The only real purpose of this card is to phase in a chip built on a smaller process that'll be cheaper to produce. All they could have gained in performance they've instead cut in size.

Re:Why? (5, Insightful)

MrNaz (730548) | more than 4 years ago | (#30794618)

Face it, a GPGPU isn't a general-purpose CPU; we have some companies that are already damn good at making those.

Not quite accurate. While GPGPU != CPU, there are things that GPGPUs can do far better than CPUs, and those things are more common than you'd think.

The only real purpose of this card is to phase in a chip built on a smaller process that'll be cheaper to produce.

Even though I don't agree with you that that is the only reason, isn't making the same product, but cheaper, a worthy cause in and of itself?

I feel that you are being unduly dismissive.

Re:Why? (1)

i.of.the.storm (907783) | more than 4 years ago | (#30795480)

Not quite accurate. While GPGPU != CPU, there are things that GPGPUs can do far better than CPUs, and those things are more common than you'd think.

I agree completely, for example, video encoding is pretty common these days and can be GPU accelerated for massive gains in speed.

Re:Why? (1)

Lord Crc (151920) | more than 4 years ago | (#30795522)

Yeah, and the point was that those people wouldn't be buying this card.

True, I was thinking more generally about the GPGPU market. However, consider that if an HD 5870 speeds up a given task by 10-15x compared to a regular CPU, then this card could potentially give a 2-3x speed-up. For many it'll be easier and cheaper to get this card than a CPU that can do 2-3x.

Re:Why? (1)

AaronLawrence (600990) | more than 4 years ago | (#30795022)

A huge new market? More like a small but significant niche.

Re:Why? (1)

LBt1st (709520) | more than 4 years ago | (#30796540)

Well, nVidia released the GT 240 not long ago for $100. My guess is this is AMD's answer to that.
nVidia's card supports DirectX 10.1. If AMD can't make a card that outperforms it, they can at least have a bigger number on the box.

For developers, both of these cards are good news. It means anyone can afford a video card that can handle the latest features (even if it does them slowly). Devs can focus on making games instead of supporting legacy hardware or creating workarounds for people without feature x.
Users don't have to drop hundreds/thousands for a gaming rig anymore. Everybody wins!

Re:Why? (1, Insightful)

Anonymous Coward | more than 4 years ago | (#30794054)

I don't get it.

Of course you don't. This card is for people who have lower-resolution monitors (under 1440x900), since at lower resolutions it can run all modern games comfortably. About 50% of people still run at 1280x1024 or below, and for them this is a great graphics card. It gives good performance at a reasonable price, and has the latest features.

Re:Why? (5, Insightful)

Sycraft-fu (314770) | more than 4 years ago | (#30794188)

Well the things that may make DX11 interesting in general, not just to high end graphics:

1) Compute shaders. Those actually work on any card DX10 or higher using the DX11 APIs (just with lower versions of the shaders); see the first sketch below for how a program can query this. The reason these are useful, even on lower end cards, is that some things run drastically faster on a GPU, so even a low end one is better than the CPU. I don't have any good examples specific to compute shaders, but an older non-compute shader example would be HD video. You can play HD H.264 on a lower end CPU so long as you have a GPU that can handle the acceleration. Doesn't have to be a high end one either.

2) 64-bit precision. Former versions of DX required only 32-bit FP at most, since that is generally all you need for graphics (32 bits per channel, that is). However, there are other math workloads that need higher precision, and DX11 mandates 64-bit FP support; the second sketch below shows the kind of gap this closes. In the case of the 5000 series it works well too: 64-bit FP runs at half the speed of 32-bit FP, so slower, but still plenty quick to be useful.

3) Multithreaded rendering/GPU multitasking. DX11 offers much, much better support for having multiple programs talk to the GPU at the same time. The idea is to have it fully preemptively multi-task, just like the CPU. Have the thing be a general purpose resource that can be addressed by multiple programs with no impact.
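On point 1, a hedged sketch (Windows-only C with COBJMACROS, linked against d3d11.lib) of how an application can ask the DX11 runtime whether the installed card, even a DX10-class one, can run compute shaders at the downlevel cs_4_x profile:

    #define COBJMACROS
    #include <d3d11.h>
    #include <stdio.h>

    int main(void)
    {
        ID3D11Device *dev = NULL;
        D3D_FEATURE_LEVEL fl;
        /* Create a device on whatever hardware is installed. */
        if (FAILED(D3D11CreateDevice(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
                                     NULL, 0, D3D11_SDK_VERSION, &dev, &fl, NULL)))
            return 1;

        if (fl >= D3D_FEATURE_LEVEL_11_0) {
            puts("DX11-class card: full cs_5_0 compute shaders");
        } else {
            /* DX10-class card: ask whether compute works via shader model 4.x. */
            D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS o;
            ID3D11Device_CheckFeatureSupport(dev,
                D3D11_FEATURE_D3D10_X_HARDWARE_OPTIONS, &o, sizeof(o));
            puts(o.ComputeShaders_Plus_RawAndStructuredBuffers_Via_Shader_4_x
                     ? "downlevel cs_4_x compute shaders available"
                     : "no compute shader support");
        }
        ID3D11Device_Release(dev);
        return 0;
    }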
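And on point 2, a tiny CPU-side illustration of the precision gap that mandatory 64-bit floats close; the same effect bites any long-running accumulation (physics, finance, scientific codes) done on a GPU in 32-bit:

    #include <stdio.h>

    int main(void)
    {
        float  s32 = 0.0f;
        double s64 = 0.0;
        /* Add 0.1 ten million times; the exact answer is 1,000,000. */
        for (int i = 0; i < 10000000; i++) {
            s32 += 0.1f;  /* each add rounds to the nearest float; error compounds */
            s64 += 0.1;
        }
        printf("32-bit sum: %f\n", s32);  /* visibly wrong */
        printf("64-bit sum: %f\n", s64);  /* ~1000000.0 */
        return 0;
    }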

It's a worthwhile new API. Now I'm not saying "Oh, everyone needs a DX11 card!" If you have an older card and it works fine for you, great, stick with it. However, there is a point to wanting DX11 in all segments of the market. Hopefully we can start having GPUs be used for more than just games on the average system.

Also, it makes sense from ATi's point of view. Rather than maintaining separate designs for separate lines, unify everything. Their low end DX11 parts are the same thing as their high end DX11 parts, just less of it: fewer shaders, fewer ROPs, smaller memory controllers, etc. Makes sense to do that for a low end part, rather than a totally new design. Keeps your costs down, since most of the development cost was paid for by the high end parts.

In terms of hyping it? Well that's called marketing.

Re:Why? (1)

Nemyst (1383049) | more than 4 years ago | (#30794376)

I do see the points of the new features, I just don't feel like they're best at home in a cheaper card. Props to ATi for keeping their cards unified (unlike the huge mess of i7s using P55 and X58 or mobile GPUs lagging two generations behind but sharing the same name as their newest desktop counterpart), but I just think the angle at which they're marketing this is not the best for the market they're looking at... Unless they really think their biggest buyers will be people who only care about GPGPU and other such number-crunching functions.

Yes, compute shaders are good for H.264 acceleration, but we already have that on even cheaper cards so this comes out as good but not revolutionary. The higher precision mostly matters for GPGPU applications, doesn't it? I have yet to see a good application of it in graphics or hardware acceleration at least. Multithreading sounds nice on paper, but I wonder how many people will actually make use of it.

I'm not against the card having DX11. I just really don't see it as its best selling point.

Re:Why? (1)

Kjella (173770) | more than 4 years ago | (#30794548)

The reason these are useful, even on lower end cards, is that some things run drastically faster on a GPU, so even a low end one is better than the CPU. I don't have any good examples specific to compute shaders, but an older non-compute shader example would be HD video.

Except that for all intents and purposes, it has nothing to do with the GPU. It could just as well have been on a separate chip, like the Broadcom chip for the new Intel Atoms. It could have been on the CPU too, for that matter. Right now there's an awful lot of hype; the question is how much is practical reality. Some things are better solved, in fact generally best solved, by dedicated hardware like an HD decoder. How much falls between general purpose and dedicated hardware? Very good question.

Re:Why? (1, Informative)

Anonymous Coward | more than 4 years ago | (#30795542)


1) Compute shaders. Those actually work on any card DX10 or higher using the DX11 APIs (just with lower versions of the shaders). The reason these are useful, even on lower end cards, is that some things run drastically faster on a GPU, so even a low end one is better than the CPU. I don't have any good examples specific to compute shaders, but an older non-compute shader example would be HD video. You can play HD H.264 on a lower end CPU so long as you have a GPU that can handle the acceleration. Doesn't have to be a high end one either.

One of those problems is none other than solving large systems of linear equations, which covers a very wide range of specific problems: 3D visualization (i.e., what the GPU was designed to do), but also solving partial differential equations through techniques such as the finite-element method [wikipedia.org], which in turn covers structural analysis, thermal and fluid dynamics, electromagnetics, and even weather prediction and economic models. So you can pretty much do everything better with a GPGPU.
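To sketch why such solvers map so naturally onto shader hardware (plain C for illustration; on a GPU, each row update below would become one work-item): a Jacobi iteration for Ax = b updates every unknown from the previous iterate only, so a whole sweep is one data-parallel pass.

    #include <stdio.h>
    #define N 3

    /* One Jacobi sweep: x_new[i] reads only the old x, never x_new,
     * so all N row updates are independent and can run in parallel. */
    static void jacobi_sweep(const double A[N][N], const double b[N],
                             const double x[N], double x_new[N])
    {
        for (int i = 0; i < N; i++) {
            double s = b[i];
            for (int j = 0; j < N; j++)
                if (j != i)
                    s -= A[i][j] * x[j];
            x_new[i] = s / A[i][i];
        }
    }

    int main(void)
    {
        /* A small diagonally dominant system, so the iteration converges. */
        double A[N][N] = {{4, 1, 0}, {1, 4, 1}, {0, 1, 4}};
        double b[N] = {5, 6, 5};
        double x[N] = {0, 0, 0}, x_new[N];

        for (int it = 0; it < 50; it++) {
            jacobi_sweep(A, b, x, x_new);
            for (int i = 0; i < N; i++)
                x[i] = x_new[i];
        }
        printf("x = (%f, %f, %f)\n", x[0], x[1], x[2]); /* -> (1, 1, 1) */
        return 0;
    }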


2) 64-bit precision. Former versions of DX required only 32-bit FP at most, since that is generally all you need for graphics (32 bits per channel, that is). However, there are other math workloads that need higher precision, and DX11 mandates 64-bit FP support. In the case of the 5000 series it works well too: 64-bit FP runs at half the speed of 32-bit FP, so slower, but still plenty quick to be useful.

...which is extremely helpful for the applications stated above.


3) Multithreaded rendering/GPU multitasking. DX11 offers much, much better support for having multiple programs talk to the GPU at the same time. The idea is to have it fully preemptively multi-task, just like the CPU. Have the thing be a general purpose resource that can be addressed by multiple programs with no impact.

...which is what makes this whole technology relevant to begin with.

Re:Why? (1)

Mal-2 (675116) | more than 4 years ago | (#30796240)

Rather than maintaining separate designs for separate lines, unify everything. Their low end DX11 parts are the same thing as their high end DX11 parts, just less of it: fewer shaders, fewer ROPs, smaller memory controllers, etc.

This also allows them to pop a couple of fuses and re-purpose marginal would-have-been high end parts by blocking out the broken sections. They did this back in the 9500/9700 days; I don't see why they wouldn't want to do it now.

Mal-2

Re:Why? (1)

dunkelfalke (91624) | more than 4 years ago | (#30794194)

The same thing was said about DX10. And about the HD 4670.

Re:Why? (5, Insightful)

Anonymous Coward | more than 4 years ago | (#30794224)

The same thing was said about DX10. And about the HD 4670.

And about DX9 before that. And DX8 before that. And on and on. I'm amazed by how many people here don't seem to "get" that advances in technology are precisely how technology moves forward. I mean, it's really a pretty simple concept.

Re:Why? (2, Informative)

Cid Highwind (9258) | more than 4 years ago | (#30794758)

Having a DX9 GPU got you the Windows Aero effects, so there was at least a visible benefit to using the lowest end DX9 GPU over a (probably faster) DX8 part at the same price.

Re:Why? (2, Insightful)

Antiocheian (859870) | more than 4 years ago | (#30796988)

A new DirectX version is not technology moving forward. CUDA and PhysX are.

Re:Why? (1)

The_countess (813638) | more than 4 years ago | (#30797198)

CUDA and PhysX are proprietary follies from the past that are about to go extinct. DX11 compute shaders and/or OpenCL take over their jobs, but on a platform that supports all (graphics) hardware.

Re:Why? (4, Insightful)

BikeHelmet (1437881) | more than 4 years ago | (#30794510)

Google Earth across 6 monitors from a single $100 card? Seems like technology is heading in the right direction!

Re:Why? (2, Insightful)

Anonymous Coward | more than 4 years ago | (#30794558)

I'm sorry, I've seen this news go all around tech sites and... I don't get it. Yay, DX11. The biggest new features I could see about it were hardware tessellation and compute shaders. What, this requires a powerful GPU in the first place to be of any use? Something much, much better than this card? Oh....

Sounds like AMD wants to pull an "nVidia GeForce FX 5200" on the market to see what happens. The FX 5200 was a huge failure, hyped for DX9 Pixel Shader 2.0 features that it runs at a grand 1-3 fps. Don't get me started on its unbelievably poor GLSL support either... But hey, it IS "The way it's meant to be played", so can YOU even complain!?

Re:Why? (1)

anss123 (985305) | more than 4 years ago | (#30794728)

Sounds like AMD wants to pull an "nVidia GeForce FX 5200" on the market to see what happens. The FX 5200 was a huge failure, hyped for DX9 Pixel Shader 2.0 features that it runs at a grand 1-3 fps. Don't get me started on its unbelievably poor GLSL support either...

The 5200 wasn't a bad card as long as you kept away from those features. It was faster than the MX cards it replaced, and cheaper too, with the only drawback that some games ran like crap in their default configs. If you want true turds you should look at low end laptop chipsets, where we're talking sub-Voodoo 2 performance with a DX10 feature set and an inability to run DX7 games.

Re:Why? (1)

.tekrox (858002) | more than 4 years ago | (#30796470)

I resent that; my onboard GeForce 9400M runs many things quite well. Case in point: I can (and do) play Team Fortress 2 at 1920x1200 on medium settings with an almost constant 60fps...

Re:Why? (1)

mdwh2 (535323) | more than 4 years ago | (#30794676)

Helps with standardisation? I might be writing a game/application that doesn't need tonnes of graphics processing power to run, but it's still easier if I can simply write one DirectX 11 renderer, instead of having to write multiple renderers for people with low end cards that only support older APIs.
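That single-renderer approach is in fact what Direct3D 11's feature levels were designed for: you create one device, hand it a list of acceptable hardware tiers, and gate optional effects on whichever tier you actually got. A hedged sketch (Windows-only C, linked against d3d11.lib):

    #define COBJMACROS
    #include <d3d11.h>
    #include <stdio.h>

    int main(void)
    {
        /* Accept anything from DX11-class down to DX9-class hardware. */
        D3D_FEATURE_LEVEL wanted[] = {
            D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_10_1,
            D3D_FEATURE_LEVEL_10_0, D3D_FEATURE_LEVEL_9_3,
            D3D_FEATURE_LEVEL_9_1,
        };
        D3D_FEATURE_LEVEL got;
        ID3D11Device *dev = NULL;
        ID3D11DeviceContext *ctx = NULL;

        if (SUCCEEDED(D3D11CreateDevice(NULL, D3D_DRIVER_TYPE_HARDWARE, NULL, 0,
                                        wanted, sizeof(wanted) / sizeof(wanted[0]),
                                        D3D11_SDK_VERSION, &dev, &got, &ctx))) {
            /* One code path from here on; only extras like tessellation
             * or cs_5_0 need to be gated on 'got'. */
            printf("running at feature level 0x%X\n", got);
            ID3D11DeviceContext_Release(ctx);
            ID3D11Device_Release(dev);
        }
        return 0;
    }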

Re:Why? (1)

that this is not und (1026860) | more than 4 years ago | (#30795352)

You'd be better off writing for what everybody has on their machines currently. That's DirectX 10.

Don't be a victim of Microsoft's need for revenue from planned obsolescence. Code to DirectX 11 in a few years, if ever.

Re:Why? (1)

mdwh2 (535323) | more than 4 years ago | (#30795634)

Indeed yes, for now (actually DirectX 9, judging by the number still on XP).

But the point is that releasing low end cards now that run the latest DirectX means things will be easier in the future, and developers can start focusing on DirectX 11 alone that much sooner.

Re:Why? (1)

KibibyteBrain (1455987) | more than 4 years ago | (#30794926)

Part of the reason DX10 never really took off was that only the highest end graphics cards supported it for years, so software developers who used DX (far beyond just game writers) had to choose between supporting just hardware DX9 or both, to which the answer is pretty obvious. Given the limited benefit you get from one version to the next on something like DX, this is a very bad trend. So by saturating the whole market with DX11 capable cards, hopefully in a few years more apps will support DX beyond just 9, or even 8.

Re:Why? (0)

Anonymous Coward | more than 4 years ago | (#30796184)

That only the highest end hardware supported DX10 is a complete fabrication. DX10 never took off because it was Vista exclusive.

Re:Why? (1)

Hadlock (143607) | more than 4 years ago | (#30796916)

The real reason DX10 never took off is that nobody could tell the difference between DX9 and DX10 screenshots.

Re:Why? (1)

TJamieson (218336) | more than 4 years ago | (#30795434)

... hardware tessellation and compute shaders ...

Compute shaders for Shader Model 5.0, yes. However, starting with Catalyst 9.12 (December 2009), the HD 48xx cards have driver support for compute shaders on SM4.0. Regardless, AFAIK no one is using either presently. It would be interesting to see a new physics engine that ran on this; PhysX for compute shaders, I guess.

Re:Why? (1)

iamhassi (659463) | more than 4 years ago | (#30795614)

"Seriously, good for AMD, but I just don't see the point."

Not only that, but it's slower than the 8-month-old $99 ATI Radeon HD 4770 [bit-tech.net].

So if I bought a $99 ATI Radeon HD 4770 eight months ago, why would I spend $99 on a slower card now?

Re:Why? (1)

LuxMaker (996734) | more than 4 years ago | (#30795730)

Because with hardware tessellation the image will look more real, and with more compute shaders (64-bit) your F@H epeen will grow.

Re:Why? (1)

voidphoenix (710468) | more than 4 years ago | (#30796208)

Several (not-so-compelling) reasons: Eyefinity, DX11, lower power consumption. The 5670 would be good for budget HTPCs and for people with low-demand multi-monitor setups (like some of the people at work who run multiple spreadsheet-like apps concurrently). As well, the 4770 is scarce and it seems that many OEMs are discontinuing them. I'm thinking of grabbing another one for Crossfire before stocks run out at my supplier.

Re:Why? (1)

hairyfeet (841228) | more than 4 years ago | (#30796530)

I am also an AMD fan, switched last year, and I have to agree. The kicker for me is that the 4870, which they used in the tests to beat this new card by around 30%, is only $20 more. So why exactly would I buy this card, when for $20 more I can get 30% faster?

I rarely spend more than $100 on a GPU (currently using a 4650 1GB), but I just can't see the point of this card when it's so close in price to a much better card. Maybe if they priced it in the $55-75 range, then yeah, it would be a good buy. For that price I would be happy to get rid of my 4650 for a little speed boost. But at $100 there are just too many cards that, for a little bit more, give a whole lot more bang for the buck. So sorry AMD, I just don't get it.

Anonymous Coward (0)

Anonymous Coward | more than 4 years ago | (#30793844)

Great to see that Moore's Law still has some steam left for the GPU industry.

Re:Anonymous Coward (1)

tepples (727027) | more than 4 years ago | (#30794006)

With the 4 GHz CPU clock speed ceiling of the past few years, the conception of Moore's law has finally come back to resembling its original formulation based on density doublings, rather than the momentary distraction of clock speed doublings. I predict it'll be fairly easy to keep performance increasing in step with Moore's law for GPUs, especially because GPU workloads tend to be embarrassingly parallel [wikipedia.org] compared to CPU workloads.
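A quick sketch of what "embarrassingly parallel" means here (pthreads on a CPU for illustration; a GPU does the same thing with thousands of hardware threads): each output pixel depends only on its own coordinates, so there is nothing to synchronize, and throughput scales with however many execution units you throw at it.

    /* Build with something like: cc shade.c -lpthread */
    #include <pthread.h>
    #include <stdio.h>

    #define W 640
    #define H 480
    #define NTHREADS 4

    static unsigned char image[H][W];

    /* Threads take interleaved rows; no locks needed, since every
     * pixel is written exactly once, from its own (x, y) alone. */
    static void *shade_rows(void *arg)
    {
        long t = (long)arg;
        for (int y = (int)t; y < H; y += NTHREADS)
            for (int x = 0; x < W; x++)
                image[y][x] = (unsigned char)((x ^ y) & 0xFF); /* toy "shader" */
        return NULL;
    }

    int main(void)
    {
        pthread_t tid[NTHREADS];
        for (long t = 0; t < NTHREADS; t++)
            pthread_create(&tid[t], NULL, shade_rows, (void *)t);
        for (long t = 0; t < NTHREADS; t++)
            pthread_join(tid[t], NULL);
        printf("pixel(3,5) = %d\n", image[5][3]); /* 3 ^ 5 = 6 */
        return 0;
    }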

I don't really keep up with games... (1)

h4rm0ny (722443) | more than 4 years ago | (#30793872)

... so somebody tell me if we actually have any that can really take advantage of the latest greatest graphics cards, yet? Seems like the hardware is outpacing the software, isn't it?

Re:I don't really keep up with games... (1)

mvidutis (1258378) | more than 4 years ago | (#30793938)

Clearly you don't run nearly enough instances of Crysis at a time.

Re:I don't really keep up with games... (3, Informative)

TheKidWho (705796) | more than 4 years ago | (#30793950)

A lot of games will struggle significantly on this card. It's about as powerful as a 3870 from 2+ years ago.

Re:I don't really keep up with games... (1, Insightful)

Anonymous Coward | more than 4 years ago | (#30794036)

Which is still plenty powerful enough to run any game that also launches on the Xbox 360.

It also does it without having to buy a new PSU. The DX11 bits are just there to help cheap people (like myself) feel comfortable buying the card, knowing that it'll still play all the games that come out next year (even if poorly), since games that use DX11 are already starting to come out.

It's a good move from ATI, targeted at cheap gamers that are looking to breathe life into an older computer.

Re:I don't really keep up with games... (1)

dagamer34 (1012833) | more than 4 years ago | (#30794444)

The main reason I'd buy this card is as part of an HTPC with HD bitstreaming.

Re:I don't really keep up with games... (1, Informative)

Anonymous Coward | more than 4 years ago | (#30796754)

Captain Obvious told me that while the 3870 cost $200+ 2+ years ago, this "about as powerful as a 3870" card costs around $100, uses less power than the 3870, AND sits in the x6xx line.

Truth be told, you're comparing apples and oranges here. 2+ years ago the 3870 was among the fastest cards out there, and it should be compared to a 5870 now.

Of course (1)

Sycraft-fu (314770) | more than 4 years ago | (#30794550)

How could hardware not outpace software? I mean it is really hard to develop a game for what does not yet exist. The hardware has to come out, and in particular the API has to come out, then developers can develop for it. They do get engineering samples a little early but still.

In terms of DX11 support: yes, there are a couple of games that will use it. No, it isn't really very useful yet; said games run and look fine in DX9 mode.

Really, you don't buy new cards because you need their features right away. There are two major reasons to get a card with new features:

1) You want the highest end performance. As always, the newer stuff is faster than the older stuff. So if you are a performance junkie, you buy a high end DX11 card not because it is DX11, but because it is fast.

2) You need new hardware anyhow (maybe you are building a new system), so you might as well get current tech. That way, in 2 years, when things ARE using DX11, your card supports it and you don't have to upgrade unless you need better performance.

However, if you expect software to fully support new hardware at launch, well, you are dreaming. The only way that would be possible is for the graphics card makers to deliberately hold their cards back from the public, and they won't do that. Also, software companies often won't start supporting a feature until there is enough of a market for it. So it needs to be launched and get into the hands of the public; then it is worthwhile to develop for.

Re:Of course (1)

tepples (727027) | more than 4 years ago | (#30794978)

I mean it is really hard to develop a game for what does not yet exist. The hardware has to come out, and in particular the API has to come out, then developers can develop for it. They do get engineering samples a little early but still.

Do new video game consoles come onto the market with 0 games? No. Even the Nintendo 64 had three finished games at launch: Mario, Pilotwings, and Shogi. So at least some video game developers depend on engineering samples.

Re:Of course (1)

Sycraft-fu (314770) | more than 4 years ago | (#30795048)

Consoles are different. They are given to developers for longer periods of time, precisely because there need to be games out at launch. Graphics cards reach the public much faster, as people will buy them without any special titles, since they run older games better.

Also, console development these days can be done on specially modified PCs. You use PC hardware to simulate what'll be in the console, since the console chips come from the graphics companies' PC products.

Re:I don't really keep up with games... (0)

Anonymous Coward | more than 4 years ago | (#30794560)

Actually, a game called Shattered Horizon can fully use the latest graphics cards and takes advantage of them. It is an FPS set in space, meaning far more movement and mobility, and it requires much more graphical power than most games because of this; that's on top of some very nice graphics to begin with, which then have to be rotated and moved in three dimensions instead of two. Generally recommended requirements are in the 3000 series for ATI cards and the 8000 series for nVidia; this game recommends at least a GTX 260 or ATI 4870, and you must have DX10 or higher on top of a quad core processor. Even the 5870 has a good time with this game yet still has trouble maxing it out. Sure, it might be only one game, but at least some games are moving in the direction of pushing new hardware to higher limits.

Re:I don't really keep up with games... (1)

TheKidWho (705796) | more than 4 years ago | (#30796046)

Wrong, the 5670 can NOT run Shattered Horizon smoothly at high resolutions; in fact, I bet it has trouble at 1280x1024. Heck, my GTX 275 can barely run it at 1920x1200 maxed out, and only gets ~20-25fps. Also, Shattered Horizon is a DX10 game, not a DX11 game.

Re:I don't really keep up with games... (1)

Hadlock (143607) | more than 4 years ago | (#30796942)

The biggest problem for game makers is that people went from 17-19" 1280x1024 displays (about 1.3 megapixels) to 21-24" displays at 1680x1050 (about 1.8 megapixels). The old standard used to be 1024x768; for a long time it was 1280x1024 (a small step up). Now the standard (1680x1050) increased by about 35% seemingly overnight. A card (8600GT 512MB) that could push Valve's TF2 (a two year old game at this point) at 45-60fps at 1280x1024, no problem, with most of the settings at medium-high, now struggles to push the game at 30fps at low settings at 1680x1050. So while cards bumped up in capability, people are buying these cards to play their current games at their old speed/visual quality. We're going to have to wait another year or two for video cards that can push "tomorrow's" games at a modern resolution... for less than $150. ATI is heading in that direction more quickly than nVidia, but video card makers have yet to meet their market with a proper product.

Compiz is all I need. (0, Flamebait)

GNUALMAFUERTE (697061) | more than 4 years ago | (#30793986)

And even my cheap integrated Intel 945 can run it in full glory at 1920x1080.

About games ... Chess doesn't require OpenGL.

The thing I'm most worried about is how, in the last two years, everyone has accepted DirectShit. It's Micro$hit technology! It's not open, not cross-platform, and you all know it's meant to screw you up. This is IE all over again. We had a beautiful standard, called HTML. Micro$hit convinced people to use their stupid proprietary extensions, and in a few years we had destroyed the web. It took us YEARS to get back on track, destroy Explorer, and get the web to be standards compliant again. Now people are doing the same all over again, displacing OpenGL because it's "obsolete" and letting Micro$hit rule hardware production with DirectShit-compatible devices.

I hate gamers, and I hate the kind of people that talk about video cards all day. For fuck's sake, if you want to play games get a Famicom or that shitty new alternative, I believe it's called playstation or something.

Re:Compiz is all I need. (1)

tepples (727027) | more than 4 years ago | (#30794082)

GNUALMAFUERTE wrote:

If you want to play games get a Famicom or that [subpar] new alternative, I believe it's called playstation or something.

What if I want to play indie games or games with mods? Consoles generally aren't made for that.

Re:Compiz is all I need. (1)

dunkelfalke (91624) | more than 4 years ago | (#30794148)

For someone who doesn't care about 3D games, you seem to have quite strong emotions about DirectX and gamers.

Re:Compiz is all I need. (-1, Flamebait)

GNUALMAFUERTE (697061) | more than 4 years ago | (#30794272)

I don't care about games, but I do care about 3D. 3D was used WAY before there were any 3D games. And 3D is WAY more important than 3D games.

What I'm seeing is people defining the future of a technology as important as 3D rendering based on the most stupid application of that technology: Games.

But people are making even more stupid decisions based on games (for example, choosing Windorz over Unix because of game availability), so we are doomed anyway.

Re:Compiz is all I need. (-1, Flamebait)

Anonymous Coward | more than 4 years ago | (#30794338)

People chose Windows over Unix because Unix is uncomfortable to use and because most Unix users behave like dicks.
You are a prime example.

Re:Compiz is all I need. (1, Funny)

Anonymous Coward | more than 4 years ago | (#30796154)

Wow. +2 Insightful for an obviously inflammatory comment. GG /.

Re:Compiz is all I need. (2, Insightful)

DAldredge (2353) | more than 4 years ago | (#30794770)

If it wasn't for those games, your 3D accelerator would cost much more than it currently does.

Re:Compiz is all I need. (1)

that this is not und (1026860) | more than 4 years ago | (#30795370)

And likewise, if it wasn't for porn, your VHS tape deck would have cost much more than it did.

Yay porn. Yay gamers.

Re:Compiz is all I need. (-1, Flamebait)

Anonymous Coward | more than 4 years ago | (#30796542)

Fuck you, you little faggot shill bitch. How's Ballmer's cock taste?

Re:Compiz is all I need. (3, Funny)

ClosedSource (238333) | more than 4 years ago | (#30794780)

"We had a beautiful standard, called HTML. Micro$hit convinced people to use their stupid proprietary extensions, and in a few years we had destroyed the web."

Yes, XMLHttpRequest, which MS came up with and which made AJAX possible, is just another stupid extension. We should use only "beautiful" HTML.

Re:Compiz is all I need. (1)

that this is not und (1026860) | more than 4 years ago | (#30795386)

There's occasionally an exception that can be brought up, that gives Microsoft an excuse to exist.

Whats the point? (4, Informative)

Shanrak (1037504) | more than 4 years ago | (#30794020)

Tom's Hardware's review is here: http://www.tomshardware.com/reviews/radeon-hd-5670,2533.html [tomshardware.com] TL;DR: while it does support DX11, it's not powerful enough to really do much with it, barely keeping 30 FPS at 1680x1050.

Re:Whats the point? (1)

tepples (727027) | more than 4 years ago | (#30794030)

While it does support DX11, it's not powerful enough to really do much with it, barely keeping 30 FPS at 1680x1050.

As I see it, the point of a bargain DX11 card is being able to run games that require DX11 at a lower resolution such as 1024x600, rather than at 0x0 because the game fails to start without the required features present.

Re:Whats the point? (0)

Anonymous Coward | more than 4 years ago | (#30794156)

And if you are a video game designer flat-out requiring DX11 or bust, you are an idiot. The market is far too fragmented to make that a stipulation. You cannot even run DX11 (nor DX10) on Windows XP, still a large percentage of the market.

Re:Whats the point? (1)

tepples (727027) | more than 4 years ago | (#30794294)

Developers of launch titles for new consoles have the same problem. PlayStation 2 was "still a large percentage of the market" for years after the launch of PLAYSTATION 3.

Re:Whats the point? (0)

Anonymous Coward | more than 4 years ago | (#30794416)

No they don't; they know gamers will go for the PS3. There is a huge installed base of DX9/XP users who won't move past it for a long time; they may just go directly to Windows 8/DX12 instead.

Re:Whats the point? (4, Informative)

Anonymous Coward | more than 4 years ago | (#30794104)

I think your post is misleading. According to that article, the card gets a 46 FPS average in Call of Duty: Modern Warfare 2 at 1920x1200, highest settings -- and that's one of the more intensive games. I have no idea what numbers you're quoting.

Re:Whats the point? (1)

Shanrak (1037504) | more than 4 years ago | (#30795078)

MW2 does not have DX11; the numbers I'm quoting are for Dirt 2, which does. The whole selling point of this card, after all, is DX11 support, since there are much better cards at this price that do not support DX11. (Note I'm not an ATI basher; I have an HD 5850, which I bought for DX11 in Arkham Asylum and Dirt 2.)

Re:Whats the point? (1)

i.of.the.storm (907783) | more than 4 years ago | (#30796586)

I guess the 5670 has lower power consumption than the cards it replaces. As the owner of a Radeon 2900, I can appreciate what that would be useful for, like not sounding like a vacuum cleaner while playing games. But I don't know whether I'd want to upgrade to the 5670, since, as you said, it seems not to be significantly faster than a 3870, but it costs a fair bit more.

Re:Whats the point? (0)

Anonymous Coward | more than 4 years ago | (#30796212)

MW2 is definitely not one of the more intensive games. The opposite if anything.

Re:Whats the point? (1)

dunkelfalke (91624) | more than 4 years ago | (#30794124)

Tom's Hardware seized to be informative years ago; nowadays they are just an nVidia/Intel advertiser.

Re:Whats the point? (0)

Anonymous Coward | more than 4 years ago | (#30794972)

Seized or ceased? Seized is a decent analogy, just like a bearing deciding to stop working, or an engine giving up the ghost after someone vomited in the carburetor bowl, but ceased is likely what you meant.

Re:Whats the point? (1)

dunkelfalke (91624) | more than 4 years ago | (#30795090)

Thanks for the correction, ceased is the word I should have used.

Re:Whats the point? (1)

Shanrak (1037504) | more than 4 years ago | (#30795748)

Yet the majority of their monthly "best video card for the money" picks are ATI; I fail to see how they are nVidia advertisers.

Re:Whats the point? (4, Interesting)

YojimboJango (978350) | more than 4 years ago | (#30795818)

I'd like to point out something in that review. The only benchmarks where this card goes below a 30fps minimum are Crysis and Far Cry 2 at 1920x1200 running in DX9 mode (instead of DX10, where the card is more likely to shine). Also, they list the GeForce 9600 as getting a 40.7fps average while playing DIRT in DX11. The GeForce 9600 does not support DX11.

In DirectX 9 mode, the Radeon HD 5670 is once again keeping pace with the GeForce 9800 GT, delivering playable performance all the way to 1920x1200. However, once DirectX 11 features are enabled, the latest Radeon slows to a crawl. Even the powerful Radeon HD 5750 has difficulty keeping the minimum frame rate above 30 fps at 1680x1050.

They pretty much tell us that they're testing these cards using higher settings for the ATI parts. Also, the review's front page tells us that they've under-clocked all the cards before testing. Why would anyone take their reviews seriously after actually reading that?

Not that I'm an ATI fanboy here either; my current and last three video cards were all Nvidia (I was close to getting a 4850 about a year ago, but Newegg had a sweet sale on the GTX 260). It's just that this level of sleaze really pisses me off.

State of AMD for HTPC Use? (4, Insightful)

tji (74570) | more than 4 years ago | (#30794160)

I'm not a gamer, so the 3D features are not important to me. I am an HTPC user, and ATI has always been a non-factor in that realm. So, I haven't paid any attention to their releases for the last few years.

Has there been any change in video acceleration in Linux with AMD? Do they have any support for XvMC, VDPAU, or anything else usable in Linux?

Re:State of AMD for HTPC Use? (2, Informative)

Kjella (173770) | more than 4 years ago | (#30794726)

From what I understand, hardware acceleration is now somewhat usable with the Catalyst drivers (source [phoronix.com]). But for the open source drivers there is nothing: there are no specs for UVD, and even though it should be possible to implement shader-based acceleration, and the docs for that are out, no one has done it yet.

Re:State of AMD for HTPC Use? (2, Informative)

moosesocks (264553) | more than 4 years ago | (#30796410)

AFAIK, the open-source drivers are progressing at a breakneck pace, and hardware acceleration is very usable on some cards. One of the more recent kernel releases included a new driver, which is allegedly quite good.

Apologies for being unable to offer more specifics. The current state of affairs is rather confusing, although I'm fairly confident that we're very quickly progressing in the right direction.

Re:State of AMD for HTPC Use? (0)

Anonymous Coward | more than 4 years ago | (#30794754)

No, they still don't. They suck.

Re:State of AMD for HTPC Use? (1)

bfree (113420) | more than 4 years ago | (#30794874)

I've had no problem displaying BBC HD (1080i H.264) with a 780G and an X2 5050e (low-power dual core) using the free drivers from X.org (though non-free firmware is required for video acceleration and 3D). I wouldn't touch the closed source drivers from ATI or NVidia with yours, but I'd now regard the modern Intel or ATI solutions as just fine for undemanding users.

Re:State of AMD for HTPC Use? (1)

Jah-Wren Ryel (80510) | more than 4 years ago | (#30795468)

I am an HTPC user, and ATI has always been a non-factor in that realm.

Not in Windows. MPC-HC's hardware acceleration has worked better with ATI chips than with Nvidia until just recently. The biggest sticking point was that VC-1 bitstreaming (where you hand the entire bitstream to the GPU for decoding and display, rather than accelerating just parts of it, like the iDCT) didn't work on any Nvidia GPUs except the embedded ones. That did change with their most recent hardware release a couple of months ago, but ATI has had support for bitstreaming VC-1 in their regular cards for at least a year.

Yeah, I can provide you the same thing for FREE! (2, Funny)

Hurricane78 (562437) | more than 4 years ago | (#30794296)

It’s called a “software renderer”. ;)

Just like AMD, I did not say that it would actually render anything in real time, did I? :P

AMD -=- ATI (1)

jo42 (227475) | more than 4 years ago | (#30794392)

Anyone else still :%s/AMD/ATI/g when coming up on these stories?

Re:AMD -=- ATI (0)

Anonymous Coward | more than 4 years ago | (#30794562)

Anyone else still :%s/AMD/ATI/g when coming up on these stories?

No. I just visually replace AMD with ATI whenever I see these sort of stories.

Meanwhile, NVidia is renaming cards (3, Informative)

Eukariote (881204) | more than 4 years ago | (#30794426)

With NVidia unable to release something competitive and therefore conjuring a "new" 3xx series into being by renaming 2xx series cards [semiaccurate.com], the GTS 360M as well [semiaccurate.com], those with a clue will be buying ATI for the time being.

Sadly, the average consumer will only look at the higher number and is likely to be conned.

Re:Meanwhile, NVidia is renaming cards (1)

hansamurai (907719) | more than 4 years ago | (#30795210)

The average consumer isn't buying video cards, especially not "top of the line" ones. Whether Nvidia is doing this or not, I don't think it will have much effect on the market.

And it's not like you can compare model numbers of Nvidia cards to those of ATI's and figure things out; if they did that, everyone would just buy ATI anyway.

Re:Meanwhile, NVidia is renaming cards (1)

JackieBrown (987087) | more than 4 years ago | (#30796122)

That is annoying.

Whenever I think of switching to an Intel CPU, I give up because I cannot figure out how to compare them to AMD CPUs.

I am sure I would have the same problem if I were switching from Intel to AMD.

Re:Meanwhile, NVidia is renaming cards (1)

i.of.the.storm (907783) | more than 4 years ago | (#30796594)

The funniest part is that many of the 2xx series cards are just renamed 9000 series cards, and many of those are renamed or die-shrunk 8000 series cards. That said, Charlie Demerjian is hardly an unbiased source of reporting on nVidia, although I think he does have good reasons for his "grudge."

Look I don't mean to be a cynical bastard but,... (1)

AbRASiON (589899) | more than 4 years ago | (#30795460)

We consistently see new hardware like this pitched at people: "DX10 cards now as low as $150", or in this case, DX11 cards at the $100 price point.
Time and time again the game developers couldn't give a damn, and I don't blame them; they target the biggest possible audience.
I'll never forget the GeForce 3 announcement at one of the Apple Expos, of all things. Carmack was there and showed off some early Doom 3 footage; it was an absolute hype extravaganza. "Doom 3 will need a pixel shader card like the GF3!" So many people purchased one; the problem is, by the time Doom 3 came out, the GF3 was basically dead, and while it could do the graphics required, it wasn't too quick.

My point is, any new tech like DX11, while great for all of us, is never fast enough in its first implementations. You'll see in 18 months' time, though: the DX12 cards will be bloody fantastic at DX11 features. This is just how it is.
FWIW I have a DX10 ATI 4890 card; it's summer here in Australia, and even underclocked it still runs 99.9% of games flawlessly. I pretty much intend to completely skip the ATI 5xxx series and wait for the next ones; performance bumps aren't what they used to be.

Re:Look I don't mean to be a cynical bastard but,. (1)

mdwh2 (535323) | more than 4 years ago | (#30795674)

My point is, any new tech like DX11, while great for all of us, is never fast enough in its first implementations. You'll see in 18 months' time, though: the DX12 cards will be bloody fantastic at DX11 features. This is just how it is.

If that's true, you should be glad to get a DirectX 11 card, because it will be bloody fantastic at DirectX 10 features, which your current DirectX 10 card must surely not ever be fast enough at...

Re:Look I don't mean to be a cynical bastard but,. (1)

AbRASiON (589899) | more than 4 years ago | (#30795780)

Touché, absolutely touché! You're completely right.

Re:Look I don't mean to be a cynical bastard but,. (1)

starfire83 (923483) | more than 4 years ago | (#30795826)

FWIW I have a DX10 ATI 4890 card; it's summer here in Australia, and even underclocked it still runs 99.9% of games flawlessly. I pretty much intend to completely skip the ATI 5xxx series and wait for the next ones; performance bumps aren't what they used to be.

Except that the 5850 and 5870 cards are literally twice as powerful as the 4850 and 4870: two 4850/4870 cards in CrossFireX are equal in performance to one 5850/5870. That doesn't seem to be the case with the lower end models, from what reviews are showing. I don't know why ATI decided to make the top end cards twice as powerful as the previous generation but not the mid and low end ones; the low end cards seem to be slower, equal, or just barely better. It certainly shoots their effort to appeal to that market segment in the foot. People interested in that market will snap up the cheaper but equally powerful previous-generation cards until they're out of circulation and unavailable.

Re:Look I don't mean to be a cynical bastard but,. (1)

AbRASiON (589899) | more than 4 years ago | (#30796034)

The 5850? No. The 5870? Maybe, and even then, in 99.9% of games it's basically "here we see 90fps at 1920x1200 on the 4890, and we see 145 on the 5870!" Thing is, I'm hitting 90fps already at 1920x1200, and I (like very few people) have a 30" Apple display.

Not to say faster isn't better in the long run, of course, but on a $/speed ratio right now the 5xxx series just isn't cutting it; it's far too overpriced. And to think it was ATI who saved us from Nvidia doing the exact same thing when the GT series came out 18 months ago. One manufacturer falls behind, so jack up the prices :/ (understandable, I suppose).
The cheapest 5850 is about $280; the cheapest 4890 is about $190.
http://www.anandtech.com/video/showdoc.aspx?i=3650&p=12 [anandtech.com]
http://techreport.com/articles.x/17652/4 [techreport.com]

I'm just not seeing $90 of extra value here for the 5850, let alone the 5870.
Within literally 8 weeks of the first review of the next nVidia card you'll see these prices 30% lower across the board; it's not a smart time to buy right now.

Re:Look I don't mean to be a cynical bastard but,. (1)

starfire83 (923483) | more than 4 years ago | (#30796064)

The current prices have more to do with supply and demand than with price gouging by ATI. The same thing happened when the 4770 came out: you saw 4850 price drops because supply couldn't meet demand. As the 4000 series phases out, you'll definitely see the 5000 series come down. Remember when the 4000 series was around the same prices as the 5000 series is now, and the 3000 series was around where the 4000 series is? That's how it works. I won't be buying soon, since it's not in my budget, and yeah, my 4870 1GB is adequate for my gaming needs. I probably won't be upgrading until DX11 games are prevalent and I want to play them.

Who needs performance (0)

rsilvergun (571051) | more than 4 years ago | (#30795740)

What's the point of anything above 1280x720 when most people game on 22" or 24" monitors? Studies show that on a display that size anything above 720p is pointless. A bigger concern: you just don't need much to play what's coming out. I just got a 4760 and it'll play anything on the market at 720p with full details. I can turn on FSAA without a big hit to frame rate, but honestly, on my 22" Acer I can't tell when it's on anyway.

PC gaming is kind of dead right now. Practically everything's an Xbox port. The last big title that needed major hardware was Crysis, and that came out 2 years ago. Makes me wonder why AMD/Nvidia don't go the Rockstar Games route and commission a game that requires their hardware.

Re:Who needs performance (1)

Akira Kogami (1566305) | more than 4 years ago | (#30796126)

"Studies show on that size display anything above 720p is pointless." Which studies?

Re:Who needs performance (0)

Anonymous Coward | more than 4 years ago | (#30796644)

WTF?
Seriously, WTF?
The difference between native 1920x1080 and upscaled 1280x720 is fucking dead obvious.
Hell, even on a 10 year old 21" CRT 1920x1440 and 1280x960 are worlds apart.

Re:Who needs performance (1)

Hadlock (143607) | more than 4 years ago | (#30797056)

Anything above 720p at distances greater than 10 feet is useless, but most people sit 18-24" away from their displays. You can most definitely tell the difference between a 1440x900, a 1680x1050, and a 1920x1200 24" diagonal display at a 24" distance. You're correct that a 40" 1080p display for sports (i.e., general TV, not video games) in the living room is a waste of money, but for video games you will appreciate the 1080p (GUI, etc.). High resolutions for 22-27" desktop displays are very much wanted and very much useful.

Solution to what? (1)

seyyah (986027) | more than 4 years ago | (#30796306)

"Solution"...

Can't we get beyond this word?

ATI is awful (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#30796854)

ATI refuses to update the Linux proprietary drivers for my video card for newer kernels because my card, which is only a few years old, is, according to them, "legacy". Yet they still support it on Windows, albeit in a very limited (essentials-only) scope, despite it being legacy there too. So I don't care what AMD/ATI does; I am having no part in it. I don't know if AMD will lead to any change in this regard, but I'd sooner give up computers altogether than support anything connected to ATI at this point.

Boycotted.
