
Real Life DirectX 10 Performance

Zonk posted more than 7 years ago | from the it-ain't-pretty dept.

Windows 67

AnandTech has a look at the performance PC gamers can expect to see under Windows Vista with DirectX 10. Unfortunately, it isn't pretty. Despite the power of the new DX10-compliant graphics cards, the choices made in developing this technology have resulted in a significant gap between what is possible and what is actually obtainable from commercial PC hardware. What's worse, the article starts off by pointing out that much of the shiny effects exclusive to DX10 games would have been possible with DX9, had Microsoft been inclined to develop in that direction. From the article: "[Current] cards are just not powerful enough to enable widespread use of any features that reach beyond the capability of DirectX 9. Even our high-end hardware struggled to keep up in some cases, and the highest resolution we tested was 2.3 megapixels. Pushing the resolution up to 4 MP (with 30" display resolutions of 2560x1600) brings all of our cards to their knees. In short, we really need to see faster hardware before developers can start doing more impressive things with DirectX 10."
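A quick back-of-the-envelope check of the resolution figures quoted in the summary (Python used purely for illustration; the 1920x1200 pairing with "2.3 megapixels" is an assumption on my part, though it is the resolution a commenter below tests at):

```python
# Sanity check of the megapixel figures in the article summary:
# "2.3 megapixels" matches 1920x1200, and the "4 MP" 30" panel is 2560x1600.
def megapixels(width, height):
    """Pixel count of a display mode, in millions of pixels."""
    return width * height / 1e6

print(megapixels(1920, 1200))  # 2.304 -> the ~2.3 MP ceiling the article tested to
print(megapixels(2560, 1600))  # 4.096 -> the 30" panel that brings cards to their knees
```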


67 comments


bad karma so who cares (-1, Flamebait)

fregare (923563) | more than 7 years ago | (#19795117)

1st pot (pruposely mispelled)/

Poor PC gamers... (5, Funny)

Kevin143 (672873) | more than 7 years ago | (#19795119)

I feel so sorry that they can't run the latest games at 2560x1600.

Re:Poor PC gamers... (0)

Psiven (302490) | more than 7 years ago | (#19795235)

But it illustrates that if you can't run a game with paired down dx10 features in high res, real dx10 features will choke just as much in standard res.

Relic designed the dx10 version of Company of Heroes using specifications without hardware to test it on. The patch got delayed because they had to par it down so much.

Re:Poor PC gamers... (1)

Barny (103770) | more than 7 years ago | (#19796015)

Yup, not to mention of course that since SLI doesn't work with dx10 currently, the 8800 Ultra is the fastest card able to drive these games. Now, my GTX is no slouch, but it struggles on high-detail textures in dx10 mode at just 1920x1200 (note there are memory problems with ultra texture detail in Vista).

Compare this to being able to run ANY dx9 game at that res with all features on (except AA, of course; that's not really needed at that res) and fraps it at a decent frame rate to boot, and you see that DX10 hardware just isn't "there" yet.

Not really NV's or AMD's fault; they make the hardware to the best of their abilities. It's up to the game makers to keep their games from bloating (2G process barrier, anyone?).

Maybe we are throwing too much work at video cards. One of the big touted features of DX10 (and indeed CoH) is video-card-handled physics. Could NV (and indeed AMD) have been wrong in claiming that their current cards (or near-future ones, for that matter) could "take care of this as well"?

Re:Poor PC gamers... (1)

fractoid (1076465) | more than 7 years ago | (#19796931)

...paired...par...
The word you're looking for is 'pared' / 'pare'. Not a flame, just FYI. :)

Re:Poor PC gamers... (2, Interesting)

Mex (191941) | more than 7 years ago | (#19796059)

For the money it costs to set up a PC with Windows Vista and a DirectX 10 capable card, yes, I'd feel sorry too.

Re:Poor PC gamers... (2, Insightful)

Pharmboy (216950) | more than 7 years ago | (#19801467)

Then maybe developers will start focusing more on playability and less on eye candy? Anyone?

Re:Poor PC gamers... (2, Insightful)

xXBondsXx (895786) | more than 7 years ago | (#19801921)

I've heard this argument thousands of times (especially during arguments about Wii vs. Xbox 360 vs. PS3)

What people have to realize is that graphics and sound are PART OF THE GAMEPLAY EXPERIENCE. Imagine playing Halo without the soundtrack playing in the background, or riding across the field in Zelda:OoT without the theme music playing. Imagine playing Warcraft III with crappy 2D 600x400 graphics or playing Banjo Kazooie for the N64 in black and white and 3 polygons per model.

These things would ruin these games. It destroys the experience; you can't only rely on gameplay and you can't only rely on graphics. It's a mixture...

besides listen to the market. It's obvious that eye candy sells consistently

Re:Poor PC gamers... (0, Redundant)

Pharmboy (216950) | more than 7 years ago | (#19807575)

By today's standards, TFC has crappy graphics but great gameplay, which is why there are still a couple hundred servers still running it, after 10 years.

I am not talking about the gaming "experience", I am talking about gameplay: how fluid the controls are, how intuitive the action is, how the game can offer something new each time.

SimCity 3000 has marginal graphics compared to SimCity 4, but it has better gameplay.

That means ... (4, Insightful)

rrhal (88665) | more than 7 years ago | (#19795187)

... that people who bought DX10 cards so that in the future they will be able to play DX10 games when they come out have basically been sold a "Pig in a Poke". As it's currently constituted, DX10 pretty much only serves as a device to obsolete Windows XP in favor of Windows Vista.

Re:That means ... (2, Insightful)

zakeria (1031430) | more than 7 years ago | (#19795215)

And next we'll all find out that the new machines we all ran out and bought are also too slow to run Vista... oh wait, "we already know that"?

And yet ... (0, Insightful)

Anonymous Coward | more than 7 years ago | (#19795241)

... XP will still be preferred over Vista for years to come, until Microsoft pulls this same stunt enough:
Crippling a perfectly fine system to force people to 'upgrade' to a [insert complaint here]-encumbered, bloated mess vis-a-vis Vista.

Re:And yet ... (2, Interesting)

fractoid (1076465) | more than 7 years ago | (#19796971)

Personally, I'm grateful to them for making Vista so expensive in terms of upgrade price and hardware requirements. Without the added push I'd have stayed with Windows instead of switching to Ubuntu / Beryl (which looks much prettier than Aero, IMO). And without that push, I'd never have found out that it 'just works' at least as well as Windows does (at least for my hardware; maybe I was lucky), and can run WoW (my only Windows-specific app) through Wine, with almost no tweaking, at a higher frame rate than in Windows. Only been running it a day so far, but I can't see myself going back.

Re:That means ... (0)

Anonymous Coward | more than 7 years ago | (#19795391)

Personally I got my 8800 GTS because my old 6800 was a joke in any newer games. I got it knowing full well it would probably need to be replaced in 1 to 1.5 years to be able to play games that fully use DX10 at a reasonably high resolution plus filters. I would hope no one was getting first-gens expecting them to last forever.

Re:That means ... (5, Insightful)

ozphx (1061292) | more than 7 years ago | (#19795709)

Exactly.

DX10 doesn't have "performance". DX10 is an API. You can benchmark API quality by a great many things, but performance is fairly irrelevant when that performance is tied so much to the underlying hardware.

DX10 is a good API if, in a couple of years' time, the shader models match the industry direction and there isn't a whole bunch of GL_EXT_OBS_ASS_HATTERY_BUF_GAY_PRIMITIVE extensions to make things work. This is likely, considering the industry partnership arrangements MS has.

Anandtech can enjoy their cry that their hardware wasn't good enough to make the most of DX10. This is really a good thing for the API; it means that DX10 has some lifetime. A scarier headline would have been "Current Gen Cards Can Max Out What DX10 Is Capable Of". That would be the death of an API...

Re:That means ... (1, Troll)

sortius_nod (1080919) | more than 7 years ago | (#19796513)

I can't really see bloatware as "feature packed"... If M$ spent more time making clean, efficient code that wasn't almost designed to slow your machine down, maybe we'd have faith in their products.

I mean, come on: you shell out $5k for a computer, you expect it to be shit-hot. It would be in DX9, and the graphical difference from DX10 would be marginal. If we could compare XP to Vista performance you'd probably see where the issues lie: a bloated OS that is resource hungry vs a bloated OS that's less resource hungry. I know which would turn up better results.

This is impossible to have happen, as M$ seems to think we enjoy having to upgrade our OS to have games supported. About the only thing M$ has done well for gamers is the Xbox, and even the latest one is starting to show some massive cracks.

Re:That means ... (4, Interesting)

Alsee (515537) | more than 7 years ago | (#19798023)

DX10 doesnt have "performance". DX10 is an API.

DX10 is an API with a built-in performance penalty. The way it is designed puts all sorts of restrictions and limitations on how things are done. Why? In order to make it "DRM enhanced". Whether you are using DRM content or not, the video system is required to operate under DRM rules. It prohibits things like direct memory access, just in case you happen to have DRM video somewhere and you try to do a video capture. It also imposes a variety of overhead costs, like validating memory accesses to prevent you from reading or writing anyplace that could impact DRM security. It cripples functions or continuously re-validates function calls to ensure that they cannot be called in any manner that might be a threat to the DRM system.

You can benchmark API quality by a great many things, but performance is fairly irrelevant when that performance is tied so much to the undelying hardware.

Normally correct, but in this case the API deliberately hamstrings the hardware.

DX10 is a good API if in a couple of years time

Yes, faster hardware will speed things up. However that faster speed will still be slower than it would have been without DX10.

-

Re:That means ... (2, Interesting)

kamapuaa (555446) | more than 7 years ago | (#19798923)

But direct memory access doesn't make the video card operate faster; what are you talking about? A lot of DX9 video card drivers didn't even implement direct memory access. I love how, of your three examples, two are the same example, and the third is so vague as to possibly also be the same example. Cry about DMA all you want, but complaining about the DMA hit to video card speed is goofy.

Re:That means ... (1)

Alsee (515537) | more than 7 years ago | (#19806667)

Cry about DMA all you want

I am talking about *general* communication in both directions between the video card and the computer, not just DMA. Computer memory is DRM-secured against the video card, and video card memory is DRM-secured against the rest of the computer. Any functionality that could possibly threaten DRM security on either side is either prohibited entirely or castrated and loaded with constant checks and DRM-security validations. I probably should cite some specific technical details to back this up for fellow programmers, but to be honest it has been several months since I went diving through Microsoft technical documents on the system. However, one conclusion was clear and inescapable: the first priority in the design is DRM-style security; anything and everything takes a back seat to that. The video card is seen as a threat, as a piece of hardware with the potential to access DRM-sensitive data in the computer, and *ALL* software access to the video card is seen as a threat with the potential to access DRM-sensitive data in the video card.

-

Re:That means ... (2, Informative)

ozphx (1061292) | more than 7 years ago | (#19807591)

Pfft.

Any of the real DRM features provided by a TPM setup, such as bus-level encryption, are already in your modern chipset / video card, which can quite happily AES at full bus speed. Marking "protected pages" is no more overhead than the no-execute bit.

Like another poster in this thread mentioned: DX10 is lighter than DX9. They've stripped out most of the cap bits, for one; now a card either supports DX10 or it doesn't (none of this 'find the right texture format' BS, although admittedly I can't think of a single time a modern card didn't support what I wanted to use).

I actually like this brutal rationalization of the APIs that MS is doing. Killing hardware-accelerated audio made me happy; it gave me hope for the death of EAX and the associated 'playing games in a public toilet' feeling.

Re:That means ... (0)

Anonymous Coward | more than 7 years ago | (#19802427)

dx10 (the 3D graphics API and current topic) has better performance than dx9 due to much less communication overhead. The thing you want to bitch about is video codecs or some other off-topic component of Vista.

Microsoft is killing Windows gaming (0)

Anonymous Coward | more than 7 years ago | (#19805225)

DRM, Vista and Xbox-cannibalism are degrading the PC gaming experience to the point where it's almost not worth bothering with anymore.

Re:That means ... (1)

dosboot (973832) | more than 7 years ago | (#19809159)

This is exactly the kind of scary shit that keeps me from upgrading to Vista; I don't want to pay a premium just to get a crippled computer. It is ironic, since I just recently started taking an interest in DirectX programming (on an XP box) and I'm having lots of fun doing it, but if I had upgraded a few months ago I would probably be up to my ears in headaches right now.

Can you provide some sources to verify the claims and give more information?

Re:That means ... (1)

niteice (793961) | more than 7 years ago | (#19800253)

there isnt a whole bunch of GL_EXT_OBS_ASS_HATTERY_BUF_GAY_PRIMITIVE extensions to make things work
Actually, most extensions to OpenGL 1.x are now core features in 2.x. Upcoming revisions of the API (codenamed Mt. Evans and Longs Peak) will move to a more D3D10-like model.

Re:That means ... (1)

ThrasherTT (87841) | more than 7 years ago | (#19800551)

GL_EXT_OBS_ASS_HATTERY_BUF_GAY_PRIMITIVE

Thanks! I shot (thankfully) lukewarm coffee out of my nose! :-D

Re:That means ... (0)

Anonymous Coward | more than 7 years ago | (#19800749)

You forget that online sites are still in their "new OS bad, must make up stuff to get readers" mode right now. In about six months, we'll finally get some good technical articles about the real performance benefits of DX10. As a game dev, I look forward to making pure DX10 games instead of the DX9-plus-patch hacks that companies are passing off as DX10 games. I am surprised that this article came from Anandtech. I expect this kind of blind MS-bashing from other sites, but usually Anandtech has been above this shit. I guess they have to draw in readers by going with the usual crowd-pleasing material ("MS bad! Current OS good!").

Uhhhh (1)

Sycraft-fu (314770) | more than 7 years ago | (#19798891)

This is true basically no matter what the generation of graphics hardware. Graphics cards improve at a much greater rate than other hardware. You really can't buy hardware as a "future proofing" deal; whatever you bought, it'll be outdated fairly soon. As for these current games, I'm guessing it is a combination of bad support in games and drivers that aren't optimised for DX10. Regardless, when DX10 games start being mainstream (not for a while yet, I'm betting, given the number of XP systems and non-DX10 cards), these current cards will be fairly obsolete. Anyone who ever buys a graphics card with the thought that they are future proof is kidding only themselves.

The best strategy for graphics card purchases for gaming is to select the amount of money you can spend on roughly a yearly basis and go for that. Getting a $100 card each year is likely to serve you much better than getting a $400 card and then not upgrading for 4 years. Yes, it means you'll not be getting the latest, greatest and you won't have all the eye candy, but after a year or two you'll be well ahead of where you'd have been keeping an older, expensive card.

Kinda, but . . (3, Insightful)

vecctor (935163) | more than 7 years ago | (#19799381)

that people who bought DX10 cards so that in the future they will be able to play DX10 games when they come out have basically been sold a "Pig in a Poke".
You are correct IF that is the only reason they bought them.

But the fact is, anyone who bought an 8800 of any variety (the "dx10 cards") bought the fastest DX9 card on the market, for use with any game they wanted, at the time of purchase. It spanked the next card down, and didn't carry any more of a price premium than any other high-end card in the history of discrete graphics (indeed, it carried less of a premium if you looked at price/performance). It was a fast card "right then" regardless of DX10. They didn't sacrifice anything; the DX10 compatibility was just a value-added bonus.

Early Adoptor Syndrome (0)

Anonymous Coward | more than 7 years ago | (#19800663)

Early Adopters are always the ones who get screwed. It's kind of the badge of honor for them.

Look, everyone with half a brain, even the most burned out Lunix fanboi, understands that early hardware will never achieve the same level of speed and support as later revisions of the hardware.

Being an early adopter is just like being a Mac user: you are paying more for the concept than for practicality. If you want to do the same thing for far less money, there are far better options.

What if (1)

Snaller (147050) | more than 7 years ago | (#19811433)

You bought the 'DX10' card in order to have faster performance under XP? XP doesn't just become obsolete because Microsoft dictates that. I'm not going to change to Vista for the next decade.

Never upgrade too early (4, Insightful)

complete loony (663508) | more than 7 years ago | (#19795217)

If the HL2 / Doom3 generation of games taught us anything, it's this: don't believe the hype. Don't upgrade your computer for a game you don't have yet. By the time there's something interesting that requires you to upgrade, it will cost less to do so, and probably perform better.

Re:Never upgrade too early (2, Insightful)

MSRedfox (1043112) | more than 7 years ago | (#19795749)

That's so true, and it is always the case. I remember when DirectX 9 came out. The first gen cards were great at running old directX 8 games, but you had to turn the resolution way down to get even so-so frame rates with DirectX 9 titles. And now we've got cards that can pound the living hell out of DirectX 9 games. People have gotten spoiled with super high resolutions. It'll take a gen or two of graphics cards to really rock the DirectX 10 scene. It's nothing new, it happens every time. People need to stop making such a big deal about it. It isn't ATI or Nvidia failing to deliver, it's that the next gen games push things even harder.

Re:Never upgrade too early (2, Insightful)

suv4x4 (956391) | more than 7 years ago | (#19795937)

If the HL2 / Doom3 generation of games taught us anything, it's this: don't believe the hype. Don't upgrade your computer for a game you don't have yet. By the time there's something interesting that requires you to upgrade, it will cost less to do so, and probably perform better.

I've played both games on a GeForce 4 MX (the minimum supported card: no shaders, slower than a GeForce 3), and honestly it was playable, though not at very high settings.

Later on when I got a faster GeForce with a bazillion of pipelines and the latest shaders, I tried the games again. Yea, they looked better, some interesting effects here and there, but nothing major.

We don't really miss a lot by not having the latest card ever, and honestly, that resolution they tested at cracked me up. I'm sure they also maxed out the AA and Anisotropic filtering. Nerds.

Re:Never upgrade too early (1)

ben there... (946946) | more than 7 years ago | (#19797747)

Later on when I got a faster GeForce with a bazillion of pipelines and the latest shaders, I tried the games again. Yea, they looked better, some interesting effects here and there, but nothing major.

While we're giving anecdotal evidence... When I bought Oblivion I played it on an (unsupported) GeForce 4 Ti4600. I had to use the Oldblivion hack just to get it to run, disabling shaders and running at the lowest possible settings.

About a year later I played again with a C2D and a 7600 GT. It's like a completely different game. I finally knew what everyone was raving about, and it wasn't some pea-soup game that looked like Doom (the first one). It's infinitely more playable now, but that's just my opinion.

Re:Never upgrade too early (1)

suv4x4 (956391) | more than 7 years ago | (#19802005)

While I trust you about that, you had to HACK the game to make it run (and probably broke lots of things in the process). My card was the minimum supported card, but it was supported, no hacks. Even though it was a $30 low-end card, I stand by my words that it worked fine.

Re:Never upgrade too early (5, Insightful)

rhyder128k (1051042) | more than 7 years ago | (#19797143)

Don't knock it. There's always someone who's willing to be the early adopter to no advantage. That guy, and others like him, make things affordable for the rest of us. The early adopter is usually happy with the situation and so should we be.

Re:Never upgrade too early (1)

AbRASiON (589899) | more than 7 years ago | (#19798259)

So true, should be a +5 not +4.

Don't buy a quad core for $266 US on July 22nd after the price drops; by the time a game ACTUALLY needs or uses it, a quad core will be $80 US and faster.

Same with a $500/600/700 DX10 card. You want to play DX10 games fast? By the time the games come out we'll have the GF8900, not the 8800 (for example).

etc etc etc
Hell, Carmack demo'd Doom 3 on the GF3 with its amazing shaders! What actual cards ended up extensively using shaders and looking good / fast in Doom 3?
Hint: it wasn't the GF3, and hell, I think even the GF4 wasn't too good either...

Depends on why you are upgrading (1)

Sycraft-fu (314770) | more than 7 years ago | (#19798951)

For whatever else they are or are not, the 8800s are rocking gaming cards for DX 9 games. If you have a large flat panel, they'll do a good job of playing all the current games on it at high res and high detail. Will they do well for DirectX 10 games? Who knows; it's way too early to say, as we are only seeing the very first titles. There could be problems with the games, problems with the drivers, or both. Really, we won't know how well they do DX10 until later, when it is more mainstream. However, that's not really the reason to buy them. If you are buying a card now for games later, you are silly. That doesn't mean that there aren't games now for your card to chew on.

That's why I got an 8800, I wanted something better for the nice LCD I'd got. I got it before DirectX 10 was even a consideration as Vista wasn't out yet. I still don't really care. That it supports DX 10 is cool and all but that's not what I got it for. I got it for the DX 9 (and older) games that I have and it does an excellent job. That makes it worth the money to me, anything else is just gravy.

Same deal with all my past purchases. Usually they support new stuff that no games I'm playing use yet. That's not why I'm getting them, I'm not thinking to the future. I am getting them because in addition to that, they'll make the games I have now run really well, which is what it is all about.

2560x1600 is real life? (4, Insightful)

JF (18696) | more than 7 years ago | (#19795223)

Some interesting points in the article, but I'm unsure how running tests that are hyper bandwidth-bottlenecked is any indication of the performance of DX10 features.

"OMG I can't push 30498230894384023984 pixels/sec through my DX10 card, DX10 sucks."

Re:2560x1600 is real life? (1)

fbjon (692006) | more than 7 years ago | (#19798237)

Assuming 60 fps, your 823 248 725 x 617 436 544 screen intrigues me, and I would like to subscribe to your newsletter.
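The parent's figures actually do divide out: the claimed pixel rate at 60 fps matches that screen size to about one part in a billion. A quick check (the tolerance threshold is my own choice, not from the thread):

```python
# The GP's claimed fill rate, and the proposed joke screen dimensions.
claimed_pixels_per_sec = 30498230894384023984
width, height = 823248725, 617436544

pixels_per_frame = claimed_pixels_per_sec / 60  # ~5.08e17 pixels per frame at 60 fps
screen_pixels = width * height                  # ~5.08e17 as well

# The factorization is essentially exact (~1e-9 relative error).
relative_error = abs(screen_pixels - pixels_per_frame) / pixels_per_frame
print(relative_error)
```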

Summary so you don't have to RTFM: (0, Redundant)

gardyloo (512791) | more than 7 years ago | (#19795249)

It's just way too early, and there are many different factors behind what we are seeing here. As the dust settles and everyone gets fully optimized DirectX 10 drivers out the door with a wider variety of games, then we'll be happy to take a second look.

Shadowrun (2, Insightful)

Renraku (518261) | more than 7 years ago | (#19795333)

Shadowrun is a nice example. It can be played on Windows XP with a hack.

According to Microsoft, it's simply not possible, as the XP version is still under development. It comes as no big surprise that DX9 can do 90% of what DX10 can do, especially since DX10 is Vista-only. It's just another attempt to push an operating system that very few people want. I'm sure I'll end up with a copy of it in a few years, but very few people actually want it right now.

No developer outside of Microsoft in their right mind would make a Vista-only game right now. It would be like releasing some Virtual Boy games.

Re:Shadowrun (3, Informative)

jdwilso2 (90224) | more than 7 years ago | (#19795405)

Shadowrun is not DX10. It's just restricted to only run on Vista.

Re:Shadowrun (4, Funny)

IHSW (960644) | more than 7 years ago | (#19796623)

No developer outside of Microsoft in their right mind would make a Vista-only game right now. It would be like releasing some PlayStation 3 games.
Fixed.

10 FPS is not playable. (1)

Il128 (467312) | more than 7 years ago | (#19795703)

"The AMD Radeon HD 2900 XT clearly outperforms the GeForce 8800 GTS here. At the low end, none of our cards are playable under any option the Call of Juarez benchmark presents. While all the numbers shown here are with large shadow maps and high quality shadows, even without these features, the 2400 XT only posted about 10 fps at 1024x768. We didn't bother to test it against the rest of our cards because it just couldn't stack up." In this day and age, who wants to upgrade and go from 60 FPS down to 10 FPS on new hardware and new software?

At this point (2, Interesting)

Sycraft-fu (314770) | more than 7 years ago | (#19795897)

I don't think it is really useful to look at. DirectX 10 is brand new on the market, so who knows how well optimised everything is? The drivers for the cards could very well need work. If you were a graphics card company, what would you spend your time on: DirectX 9, which is what almost every game runs on, or DirectX 10, which there are maybe 3 game patches for? Also, the games themselves may need improvements. Just because they've ported to DirectX 10 doesn't mean they did a good job of it. Anyone remember the original Unreal Tournament? At its heart it was a Glide game, and it just never ran as well on GL or DirectX, particularly DirectX. UT2003 was DX at its heart and ran smoking fast. It was to the point that on good DX hardware UT2003 could run faster than its predecessor, despite higher visual detail.

At this point DirectX 10 is more or less just a plaything. Cards are out supporting it, since hardware is almost always ahead of software (it's harder to develop for something that doesn't exist), but it is brand new and few systems support it (only systems running Vista on the very newest graphics hardware). It is, at this point, a curiosity for the most part. It's not really useful to start talking about performance until there's been a good deal more time for people to work with it, including making games designed for it, not ported to it.

What's the future like? (2, Insightful)

A Friendly Troll (1017492) | more than 7 years ago | (#19796085)

Current top cards (2900 and 8800) already use a lot of power, something like 200W or even more. They require powerful cooling, but it seems that every new graphics card generation tends to use a lot more power than the previous one. It's likely that a better manufacturing process (45nm?) will lower the power consumption slightly, but that's probably going to be offset by higher clocks to get it to the same thermal envelope.

What's the future of the cards' successors like? How long before graphics cards are going to be moved outside the computer, to their specialized cases? Or do you think something like Conroe will happen in the GPU market (vastly lower power consumption than the P4/Tbird, better performance on the same clock speed)? Is that even possible with GPUs and the never-ending quest for framerate and visual effects?

Re:What's the future like? (1)

Bert64 (520050) | more than 7 years ago | (#19797529)

Well, for those of us who don't want to play the latest games at the highest resolutions (that is, the majority of people)...
Someone will come out with lower-powered budget versions.

Re:What's the future like? (1)

A Friendly Troll (1017492) | more than 7 years ago | (#19802583)

Yes, that is true, there are always low-power cards (almost completely useless for gaming, though).

But I'm more interested in what's going to happen at the high end and how that's going to work out. I can (unfortunately) imagine 400W "cards" for DirectX 11 :/

Re:What's the future like? (1)

Bert64 (520050) | more than 7 years ago | (#19812189)

Well, lower-powered cards are usually more than capable of playing older games.
And there are plenty of benefits to having older games nowadays. New games are often incredibly buggy and receive several updates over their lifespan. If you play older games, those updates have already happened, so your experience of the game won't be as buggy as the early adopters'.
Plus, there are more likely to be no-CD cracks available, so you don't have to deal with keeping physical media around all the time.
And finally, older games are often to be found in bargain buckets for a fraction of the cost of their newer brethren (assuming you actually buy them in the first place).
All of the above, plus the fact that older games are no less playable (and often more so) than newer ones.

Re:What's the future like? (0)

Anonymous Coward | more than 7 years ago | (#19798945)

That only applies to the high-end cards; I've never even met anyone who uses one of those. At the mid and low end, power consumption is still not that significant, and isn't on the same skyrocketing exponential. For instance, the 7600 cards use less power than the older 6600 cards.

Plus, people are moving to laptops anyway - being frugal is more of an issue here so many of them are very low-power.

DX10 performance will take time (4, Insightful)

NateE (247273) | more than 7 years ago | (#19796229)

The games that Anand benchmarked with were not written from the ground up for DirectX 10. Company of Heroes was DX9 until the developers were nice enough to release a patch. Some developers have said that good DX10 performance requires writing from the ground up for DX10. Since DX10 is so different from DX9, I don't find this difficult to believe.

As soon as NVidia releases certified drivers for doing SLI in Vista, the problem with driving 30" LCDs will disappear.

People are forgetting how many years it takes to create a new AAA game title, and the fact that game developers still have very little reason to be attracted to Vista, what with its small installed base and hardware requirements for consumers.

Re:DX10 performance will take time (2, Informative)

oliverthered (187439) | more than 7 years ago | (#19797797)

DirectX 10 isn't so different from DirectX 9; it's basically DirectX 9 without the fixed function pipeline. In other words, you have to use shaders for everything and can't rely on the driver doing even the most basic texturing and the like outside of shaders.

This makes the pipeline cleaner than that of DirectX 9 and is supposed to give a performance increase when you're dealing with vast numbers of objects.

They've also added geometry shaders, which may be useful for some games and can't be done in DirectX 9. I don't expect many games to make use of them for quite a while, so there's no reason for any game to be DirectX 10 only.
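As a rough sketch of what "shaders for everything" means in practice, here is a minimal D3D10-style HLSL pair. Even plain texturing, which DX9 fixed function could express through texture-stage state, has to be written as shader code under DX10. The register bindings and struct names below are illustrative assumptions, not taken from any particular game:

```hlsl
// Minimal DX10-style shaders: there is no fixed-function fallback,
// so even a straight sample-and-return must be shader code.
Texture2D    diffuseTex : register(t0);
SamplerState linearSamp : register(s0);

struct VSIn  { float4 pos : POSITION;    float2 uv : TEXCOORD0; };
struct VSOut { float4 pos : SV_Position; float2 uv : TEXCOORD0; };

VSOut VSMain(VSIn i)
{
    VSOut o;
    o.pos = i.pos;   // assume pre-transformed vertices for brevity
    o.uv  = i.uv;
    return o;
}

float4 PSMain(VSOut i) : SV_Target
{
    // Under DX9 this could be fixed-function texture-stage state;
    // under DX10 it must be written out explicitly like this.
    return diffuseTex.Sample(linearSamp, i.uv);
}
```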

The more shaders, the slower. Where is the news? (1)

siyavash (677724) | more than 7 years ago | (#19797279)

The more shaders and other effects you put on, the slower things get. 2+2 is faster than 2+2+2+2. I don't see what the big deal is. DX10 never claimed to be faster than DX9, just better looking. I mean, you can't use an old gfx card to run DX9 and expect it to run faster, so why expect current gfx cards to do DX10 faster?

Even though they might be able to handle all of DX10's new features, I doubt DX10 will ever be "as fast" as DX9 on the current generation of gfx cards.

Drawing one vector is faster than two. Just simple math.

Hardware virtualization (4, Informative)

brucmack (572780) | more than 7 years ago | (#19797589)

Personally, the most interesting feature of DX10 is the hardware virtualization, so programs can share the card. Should make it possible to play a game on one monitor while playing a movie on another, for example. Presumably these cards wouldn't have a problem with this...

Re:Hardware virtualization (1)

Emetophobe (878584) | more than 7 years ago | (#19799717)

Personally, the most interesting feature of DX10 is the hardware virtualization, so programs can share the card. Should make it possible to play a game on one monitor while playing a movie on another, for example. Presumably these cards wouldn't have a problem with this...


Decent graphics cards have been able to do this for ages. My Radeon X1900XT does this just fine while playing games in Windows XP; I play games at 1680x1050 with max settings and 4xAA/8xAF, and I still have plenty of "juice" left over to power videos on my second monitor. Even my friend, who has a slower PC than mine, watches movies on one monitor and plays games on the other with his GeForce 7800.

Cards from a year ago have been able to do this, without DirectX 10.

Re:Hardware virtualization (1)

IndieKid (1061106) | more than 7 years ago | (#19823993)

That's interesting, but I'd like to know whether you're watching videos that make use of the hardware acceleration features of your video card or not.

I can watch a video downloaded from the net in XVid on my HDTV whilst playing a game on my monitor, as I expect the game is handled by one core of my CPU and the video by the other. The graphics card isn't really doing much work for the video other than outputting the signal on one of the DVI ports (there's probably a separate chip for this per port, but I'm only guessing here). When I try and watch a DVD through my PC to take advantage of the hardware acceleration of my X1800XL and the upscaling to 720p at the same time as playing a game, the whole thing falls apart and the video doesn't play (usually the player just crashes and yes I've tried a few different ones). If I turn off the hardware acceleration everything is fine and I can watch the DVD whilst playing the game with no problems.

I'm not sure how DX10 would help here, but maybe if there was effectively two VMs running on my DX10 card, one for the video and one for the game, the VM for video could be assigned the minimum amount of resources necessary to do the hardware acceleration (some of these unified shaders, stream processors or whatever they're called). Hopefully, I'd get a smooth video and just lose a few fps in the game, effectively as if I was playing the game on a lesser model of the same card with less resources available.

Re:Hardware virtualization (1)

Tol Dantom (1114605) | more than 7 years ago | (#19801883)

I only hope that there is DRM virtualization so that your digital rights can be managed on both monitors simultaneously. Go progress!

Re:Hardware virtualization (1)

zolaar (764683) | more than 7 years ago | (#19809429)

Should make it possible to play a game on one monitor while playing a movie on another, for example

Amen, brotha. It's about freaking time!

If I had a dollar for every time I have been lurking through some part of town in Thief3 -- creeping through the shadows, pickpocketing the locals, trimming Hammerites' nosehairs with broadheads at 100 yards, jumping out of my skin at even the most barely audible footstep, the usual -- and thought to myself, "Hey, I am like totally in the mood for some Top Gun. I've lost that lovin' feelin, I tells ya. Goodness gracious, Maverick's great fiery balls demand my attention! But, fie! Cruel fates -- damn them all -- I cannot! Not, at least, without terminally disrupting my immersive computer game environment! Woe is me!"

Well consider Jester as good as dead, and tell my fence that I'll be needing those water arrows a little early today.

Now I'll be able to blackjack scores of nosy estate guards while re-engaging after a jetwash-induced flat spin caused me to temporarily abandon my wingman -- all without skipping a beat (or switching to guns -- only pansies would say they're too close for missles, Goose).

Thank you, DX10. You can be my wingman anytime.

Re:Hardware virtualization (1)

slaker (53818) | more than 7 years ago | (#19812593)

You win the internet, good sir.

Quitcher Bitchin (0)

Anonymous Coward | more than 7 years ago | (#19800361)

Since when has "developing for the future" been a bad thing? Don't you WANT DX10 to last?

Some people will complain about anything....

possible vs. obtainable? (2, Insightful)

Captain_Chaos (103843) | more than 7 years ago | (#19802277)

... a significant gap between what is possible and what is actually obtainable ...
What's the difference between "possible" and "actually obtainable"?

Re:possible vs. obtainable? (0)

Anonymous Coward | more than 7 years ago | (#19805801)

I think a better pair of words instead of "actually obtainable" would be "practically attainable".

It is theoretically possible to send a hundred thousand people to the moon and build a cool city there, but that is certainly not actually attainable for another few hundred years, at least.

What is "possible" is the maximum potential; what is "attainable" is what is within reach and can be expected to be achieved in the near future. If it takes 10 years to max out DX 10, it will be replaced by a better API relatively quickly.

DX 9 isn't even at its full potential and it has already been superseded.

Re:possible vs. obtainable? (1)

cswiger (63672) | more than 7 years ago | (#19807605)

What's the difference between "possible" and "actually obtainable"?

Good question. :-)

The former tends to mean what is doable in theory or in practice, but may not correspond to any realistic situation. For example, if you stuff nothing but data down the PCI (AGP, PCIe, whatever) bus, you get a rate which is about the bandwidth of the underlying bus. But in reality you have to do some setup and so forth (i.e., configure the target address and size for a DMA transfer) before blasting data bits, and other PCI devices also get some bus cycles, so in practice the actually obtainable data rate tends to be only about 90% of the max bandwidth of the bus.

This tends to be applicable to a wide range of things, from ethernet/networking load capacities, to disk I/O performance, to video card framerates, etc....
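The parent's ~90% figure can be sketched with some back-of-the-envelope arithmetic. The numbers below are illustrative assumptions, not measurements of any real bus, but they show why large transfers get close to peak while small ones don't:

```python
# Rough effective-bandwidth sketch (illustrative numbers, not measurements).
# Each transfer pays a fixed setup cost (DMA configuration etc.) before any
# payload moves, and the bus may be shared, so sustained throughput is
# always below the theoretical peak.

def effective_bandwidth(peak_gbps, transfer_bytes, setup_overhead_bytes,
                        bus_share=1.0):
    """Peak bandwidth scaled by payload efficiency and the device's bus share."""
    payload_fraction = transfer_bytes / (transfer_bytes + setup_overhead_bytes)
    return peak_gbps * payload_fraction * bus_share

peak = 4.0  # GB/s peak, an assumed round number for the era's bus

# Large transfers amortize the setup cost; small ones do not.
big   = effective_bandwidth(peak, 1 << 20, 4096)  # 1 MiB payload
small = effective_bandwidth(peak, 4096, 4096)     # 4 KiB payload

print(round(big / peak, 3))    # → 0.996 (near peak)
print(round(small / peak, 3))  # → 0.5   (half wasted on setup)
```

Add contention from other devices (`bus_share < 1.0`) and the sustained figure drops further, which is where rules of thumb like "about 90% of peak" come from.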

Harf. (2, Interesting)

stonecypher (118140) | more than 7 years ago | (#19802471)

The reason Microsoft couldn't reasonably do Aero under DirectX 9 has to do with baselines. One of the biggest advantages of DirectX 10 has less to do with what it is and more to do with what it isn't: old. Microsoft needed a way to do two things: 1) make sure that people weren't trying to run Aero on 386es, and 2) give non-technical people a simple way to tell whether or not their hardware was up to modern spec.

Does DirectX9 have all the capabilities needed to run something like Aero? Yes, but DirectX9 also runs on systems which would drag under the demands of something like Aero. Microsoft has a vested interest in preventing their new software from running on hardware which will struggle with Aero, because then there'll be a lot of people complaining about how (insert the bad side of slow Aero here.)

DirectX10 has a much higher minimum bar to entry. If your stuff is DirectX10 ready, it's almost certainly Aero ready. That's why they made the requirement - they didn't want old hardware making their shiny new product look like crap. (That it forces new hardware purchase, which gets OEMs and VARs to support the new OS, certainly helps.)

If you look at it from a business perspective at the same time that you look at it from a technical and an "oh god I have to deal with stupid users" perspective, you'll start to see why just using the DirectX name to set the new low watermark was actually a relatively simple way for Microsoft to flatten several problems at once.

Re:Harf. (1)

Curate (783077) | more than 7 years ago | (#19802903)

I hope you're not implying that you need a DX10 card to run Aero. Most cards released in the last few years will run Aero Glass just fine, with transparency and the works. It ran beautifully on my Radeon 9700 Pro which I bought in Dec 2002. That might actually be the earliest card that will run it. The basic requirements are DX9 driver support, pixel shader 2.0, and at least 64MB of memory on the card. http://www.howtogeek.com/howto/windows-vista/understanding-windows-vista-aero-glass-requirements/ [howtogeek.com]

let me see... (1)

BlueParrot (965239) | more than 7 years ago | (#19808111)

You can either use GL, which is supported on virtually every platform there is, or you can go with DX10, thus limiting your market to Vista only while simultaneously taking a performance penalty... Since nobody in their right mind would go for the latter option, I guess we can expect various Windows bugs which adversely affect OpenGL very soon... rolled out as critical security updates, of course...