DX11 Tested Against DX9 With Dirt 2 Demo

Soulskill posted more than 4 years ago | from the shadows-with-shadows dept.

Graphics

MojoKid writes "The PC demo for Codemasters' upcoming DirectX 11 racing title, Dirt 2, has just hit the web and is available for download. Dirt 2 is a highly anticipated racing sim that also happens to feature leading-edge graphics effects. In addition to a DirectX 9 code path, Dirt 2 also uses a number of DirectX 11 features, like hardware-tessellated dynamic water, an animated crowd, and dynamic cloth effects, along with DirectCompute 11-accelerated high-definition ambient occlusion (HADO), full floating-point high dynamic range (HDR) lighting, and full-screen-resolution post-processing. Performance-wise, DX11 didn't take as much of a toll as you'd expect this early in its adoption cycle." Bit-tech also took a look at the graphical differences, arriving at this conclusion: "You'd need a seriously keen eye and brown paper envelope full of cash from one of the creators of Dirt 2 to notice any real difference between textures in the two versions of DirectX."


ehh (5, Insightful)

Dyinobal (1427207) | more than 4 years ago | (#30307966)

I personally view DX11 the way I view Sony's push from DVD to Blu-ray. Sure, Blu-ray has some nice features, but I'm still enjoying my DVDs, and I don't really need uncompressed audio tracks for every language on my discs. Same thing with DX11: I've not even properly gotten set up with many DX10 games, and now they're pushing DX11 (well, "pushing" as in mostly tech demos), and there's barely any dust on my latest graphics card. I'll upgrade in a few years, perhaps when I see DX9 vanish, or at least become increasingly uncommon.

11 is the FUTURE (-1, Troll)

Anonymous Coward | more than 4 years ago | (#30307988)

Everything else is dead or dying. Stay away from the dead and dying my mama always told me.

Like this stupid new captcha crap

Re:11 is the FUTURE (2, Funny)

MrNaz (730548) | more than 4 years ago | (#30309050)

Is your mama Netcraft?

Re:ehh (0, Flamebait)

webheaded (997188) | more than 4 years ago | (#30307990)

You know, for this being a nerd site, a lot of you guys sure seem to go into the future kicking and screaming, don't you? There is no LIMIT to the amount of shit you'll complain about. It's a new version. They're not PUSHING it on anyone; it's just there. That's what new cards have. It isn't detrimental to your performance, and you can still play in DX9 if the higher version gives you shit fits.

Some things are worth complaining about. Not EVERYTHING is a conspiracy by some rich bastard forcing a product down our throats. Maybe the benefits ARE incremental, but who the hell is buying a new card JUST for DirectX 11? Seriously, what kind of moron is doing that? If I buy a new card, it's for an overall performance gain, not for some arbitrary new version of DirectX or whatever. That's just an added benefit.

Re:ehh (0)

Anonymous Coward | more than 4 years ago | (#30308048)

Sigh... I would have agreed with that sentiment if this article were on biotech or AI research or any of the other topics that prompt widespread paranoia and technophobia. Unfortunately, that isn't the case here.

The reason is simple: Microsoft controls the lifespan of both the old and new versions, and a new version means the plug gets pulled on the old ones (no longer selling copies of XP, anyone?). Now, MS can't kill old DX versions per se, since old software needs them to run, but it can make them a pain in the ass to use for new stuff (deliberately sabotaging the development tools, for example).

No, this isn't a conspiracy; MS does in fact stand to gain commercially by keeping everyone on the latest version of DX/Windows, since it then gets to use that as leverage against nVidia/ATI, especially when dealing with competing platforms in other product domains (I'm reminded of MS strong-arming nVidia into a price cut on the GPUs in the original Xbox by withholding the DX9 specs until nVidia caved).

Re:ehh (1)

LOLLinux (1682094) | more than 4 years ago | (#30308060)

Now, MS can't kill old DX versions per se, since old software needs them to run, but it can make them a pain in the ass to use for new stuff (deliberately sabotaging the development tools, for example).

Yes, because pissing off developers and having them not want to use your dev tools anymore is clearly how Microsoft has achieved the success that it has!

Re:ehh (3, Insightful)

Anonymous Coward | more than 4 years ago | (#30308082)

Where are they going to go?

Where are they going to go?

And you should seriously learn some computing history: MS has a long, healthy history of pissing off various segments of its developer base. Anyway, you're assuming the sabotage would be obvious, rather than just "bad documentation" or APIs that return values they aren't supposed to, things that can easily be written off as bugs (which are never fixed) or laziness.

Re:ehh (1, Insightful)

Anonymous Coward | more than 4 years ago | (#30309086)

The decision of which API to use is not being made by devs, but by the bean counters in marketing or by the publishers.

Re:ehh (1)

MrNaz (730548) | more than 4 years ago | (#30309060)

"No longer selling copies of XP, anyone?"

XP is almost 9 freakin' years old. What are you waiting for? Confirmation from Netcraft?

Re:ehh (4, Insightful)

Petrushka (815171) | more than 4 years ago | (#30308080)

Quoth Dyinobal:

Sure, Blu-ray has some nice features ... I don't really need ... I'll upgrade in a few years ...

Quoth webheaded:

... kicking and screaming ... There is no LIMIT to the amount of shit you'll complain about ... the higher version gives you shit fits ... what kind of moron ...

Compare and contrast: which of these two is complaining and having shit fits?

Re:ehh (0)

Anonymous Coward | more than 4 years ago | (#30308514)

The first twit!

Re:ehh (1, Funny)

Anonymous Coward | more than 4 years ago | (#30308652)

There was only one twit, so "first" is a bit redundant.

Re:ehh (0)

Anonymous Coward | more than 4 years ago | (#30308096)

LOL RAGE MORE

Re:ehh (3, Insightful)

Aceticon (140883) | more than 4 years ago | (#30308600)

I reckon that, being specialists in one technical area or another, many of us are actually knowledgeable enough about technology and technology companies to know that newer is not the same as better.

As such, the old hands amongst us feel bound by duty and ethics to inform the bright-eyed, young, and inexperienced of that.

Not that it makes any difference most of the time ...

Re:ehh (4, Funny)

White Flame (1074973) | more than 4 years ago | (#30308022)

But these go to 11!

Re:ehh (1, Insightful)

Dyinobal (1427207) | more than 4 years ago | (#30308046)

obligatory xkcd comic incoming?

Re:ehh (2, Funny)

smcn (87571) | more than 4 years ago | (#30308560)

Forget 11, BD's go to 1080!

Re:ehh (0)

Anonymous Coward | more than 4 years ago | (#30308810)

Give me $2,000 and I'll build you one that goes to 12!

Re:ehh (3, Funny)

LOLLinux (1682094) | more than 4 years ago | (#30308042)

I personally view DX11 the way I view Sony's push from DVD to Blu-ray. Sure, Blu-ray has some nice features, but I'm still enjoying my DVDs, and I don't really need uncompressed audio tracks for every language on my discs.

I still watch VHS tapes, you insensitive clod!

Re:ehh (2, Funny)

davester666 (731373) | more than 4 years ago | (#30308254)

Hah. LaserDiscs totally kick VHS tapes.

Re:ehh (1)

SimonTheSoundMan (1012395) | more than 4 years ago | (#30308786)

So does Betamax...

Re:ehh (3, Insightful)

ShooterNeo (555040) | more than 4 years ago | (#30308052)

Are you blind? It's one thing to compare DirectX 9 versus 11 video games, where either API lets you create highly detailed, high-performance graphics.

It's another to compare the gigantic difference in picture quality between 1080p/720p and craptacular 480p (at most).

The difference between high-def and standard is pretty darn immediate and obvious for new content, such as TV shows that were made using the right digital cameras. Film, not so much, because the darn cameras and lenses used in movies are often set to blur hard edges and details, and of course film is a craptacular 24fps.

Re:ehh (1)

X0563511 (793323) | more than 4 years ago | (#30308432)

You mean 23.976 fps.

Yea, NTSC is retarded in some ways.

Re:ehh (1, Informative)

mrmeval (662166) | more than 4 years ago | (#30308538)

Film is 24fps, NTSC is 23.976fps

Film can go higher in some formats. HDTV can be a variety of frame rates.

wrong (1)

FranTaylor (164577) | more than 4 years ago | (#30308776)

NTSC is essentially 30 Hz, intentionally chosen so that 60 Hz line noise would be stationary on the screen.

Re:wrong (1)

BrokenCube (896491) | more than 4 years ago | (#30308884)

Actually, all of you are right. Behold the oracle that is wiki [wikipedia.org]

Re:ehh (1, Informative)

Anonymous Coward | more than 4 years ago | (#30309036)

Uh, NTSC is not 23.976. It's 29.97 (a.k.a. 30).

http://en.wikipedia.org/wiki/Telecine#2:3_pulldown

Film converted to NTSC may be slowed to 23.976, but it's still broadcast at 29.97. No one ever "sees" NTSC running at 23.976.
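
For reference, all the numbers being argued over here fall out of the same NTSC color-timing adjustment. A quick back-of-the-envelope check (standard NTSC figures, not from the article):

    30 \times \tfrac{1000}{1001} \approx 29.97 \ \text{fps (NTSC color video)}
    24 \times \tfrac{1000}{1001} \approx 23.976 \ \text{fps (film slowed for telecine)}
    23.976 \times \tfrac{5}{4} = 29.97 \ \text{fps (2:3 pulldown: 4 film frames fill 10 fields, i.e. 5 frames)}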

Re:ehh (3, Informative)

Jeremy Erwin (2054) | more than 4 years ago | (#30308460)

On a DVD, if something's out of focus, it could be because of the cinematography, or it could be because the DVD doesn't have enough bits. On a Blu-ray, if something's out of focus, it's probably because the director of photography intended it to be out of focus. Water looks a bit more realistic. Animation looks a bit sharper.

On a smaller screen, these are all subtleties, and they don't jump out at the viewer unless edge enhancement is added, which tends to bother viewers with larger screens. Too much processing can also make skin look like plastic.

Re:ehh (2, Informative)

SimonTheSoundMan (1012395) | more than 4 years ago | (#30308838)

Blur hard edges with film lenses? It does depend on which lenses are used (this is the DP's choice), but most are incredibly sharp. Are you thinking of depth of field? Most HD cameras have 2/3 or 1/2 inch sensors, compared to film's 16 or 35 mm; the larger format means greater magnification, focal length, aperture, circle of confusion, etc.

Film cameras (35 mm, for example) can resolve detail far beyond 1080p; more like 6 to 12 thousand pixels horizontally can be scanned from the negatives with no worry about resolution.

Re:ehh (1)

Ma8thew (861741) | more than 4 years ago | (#30308876)

Has it not occurred to you that sometimes good enough is Good Enough? Standard definition is a high enough resolution that you can enjoy the content. Sure, high definition is better, but at the moment it sometimes costs twice as much for the HD version. The GP didn't say he couldn't tell the difference, just that he didn't see the need for the increase in quality.

Re:ehh (1, Insightful)

n3tcat (664243) | more than 4 years ago | (#30309204)

Are you blind? It's one thing to compare DirectX 9 versus 11 video games, where either API lets you create highly detailed, high-performance graphics.

It's another to compare the gigantic difference in picture quality between 1080p/720p and craptacular 480p (at most).

The difference between high-def and standard is pretty darn immediate and obvious for new content, such as TV shows that were made using the right digital cameras. Film, not so much, because the darn cameras and lenses used in movies are often set to blur hard edges and details, and of course film is a craptacular 24fps.

You must work for Sony, have stock in Sony, or have spent thousands of dollars on the equipment you're talking about.

Re:ehh (4, Insightful)

hairyfeet (841228) | more than 4 years ago | (#30308276)

I think the problem is that DX9 has gotten "good enough" for most folks, at least IMHO. There is only so much pretty you can look at while dodging gunfire with shit blowing up all around you. I haven't played a game in the last 4-5 years where I thought "they really need to add more pretty," because I've been too busy going "Holy crap! dodge dodge duck blast Shit! The bad guys are packing hefty and I'm packing wimpy!" See, for example, the first time I whipped around a corner, shot at a splicer, and hit a Big Daddy in the ass by mistake. When those big red eyes spun on me, all I needed was a sound bite of Daffy Duck [stuporduck.com] to make the moment perfect.

For me, pretty much everything after Far Cry 1 has been past the "good enough" level as far as graphics and bling go. Now if they would do better on stories and AI, I would be a happy camper, but sadly we haven't gotten much better on that front since Far Cry 1. IMHO it isn't so much the graphics that separate the okay from the good from the great, but decent story and AI. In Bioshock, FEAR, and L4D I was too busy playing the game to actually spend much time looking at the pretty. But the atmosphere, the AI (or the lack of it in too many games), the story: these things I notice.

So I have to agree that, while I am running Windows 7 HP, I just don't see the need to toss my 1GB ATI 4650. The games I play already look prettier than I can actually pay attention to while getting the living shit blasted outta me, and I haven't seen anything in DX11 that will make bad game companies come out with better AI (I'm looking at you, EA!) or better stories. So I will stick with DX9 until there are enough compelling games out there using DX11 to make the switch worth it.

And doesn't the X360 use DX9? Considering how many PC games are nothing but shitty X360 ports these days, DX11 will probably be waiting until the X720 before getting adopted. Oh well, that is what MSFT gets for killing MechWarrior and turning every game company it touches into an X360 company.

Re:ehh (1)

giuseppemag (1100721) | more than 4 years ago | (#30308322)

The Xbox 360 uses a modified version of DX9 featuring tessellator units. This means that the Xbox 360 is closer to DX11 than it is to DX9 or DX10...

Re:ehh (1, Funny)

Anonymous Coward | more than 4 years ago | (#30309024)

There is only so much pretty you can look at while dodging gunfire and shit blowing up all around you.

That's a hell of a road rally game!

Re:ehh (0)

Anonymous Coward | more than 4 years ago | (#30309046)

Nope, the 360 does not use DX9, although its API is fairly similar.
DX10 is just a cleaner version (API-wise) of DX9, with a couple of extra features, most notably geometry shaders.
DX11 adds compute shaders, and lets you share resources across multiple contexts without having to mess around with swap chains.

So yeah, you can wait until the Xbox 720 if you want, but it should be obvious to most people that DX11 is going to be adopted a hell of a lot quicker than DX10. Let's be honest: the only reason people are still targeting DX9 right now is that a large number of people are still running XP (having shunned Vista as a piece of crap). Windows 7 is looking like a far more popular OS than Vista, so I think it's only a matter of time before you see games that target DX9 and DX11 (who cares about Vista and DX10 anyway?).
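
For the curious, the compute path described above is only a few calls on the host side. A minimal sketch, assuming csoBytecode/csoSize hold HLSL bytecode already compiled for the cs_5_0 profile (error handling omitted):

    #include <d3d11.h>

    // Minimal sketch: create a D3D11 device, then bind and launch a compute
    // shader. csoBytecode/csoSize are assumed to be compiled cs_5_0 bytecode.
    void run_compute(const void* csoBytecode, SIZE_T csoSize)
    {
        ID3D11Device* device = nullptr;
        ID3D11DeviceContext* context = nullptr;
        D3D_FEATURE_LEVEL achieved;

        D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                          nullptr, 0, D3D11_SDK_VERSION,
                          &device, &achieved, &context);

        ID3D11ComputeShader* cs = nullptr;
        device->CreateComputeShader(csoBytecode, csoSize, nullptr, &cs);

        context->CSSetShader(cs, nullptr, 0);  // bind the compute stage
        context->Dispatch(64, 64, 1);          // launch 64x64x1 thread groups
    }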

Re:ehh (0)

Anonymous Coward | more than 4 years ago | (#30308442)

A new Direct3D API isn't really comparable to the transition from DVD to Blu-ray, though; your DVD player doesn't get too slow for the latest movies in a couple of years, and you don't upgrade to a newer model that costs about the same as what you paid for the old one.

When your current GPU gets too old for your tastes, you buy a new one. A new version of DirectX doesn't really throw off that cycle, despite how much AMD would love to convince you otherwise.

Re:ehh (1)

Opportunist (166417) | more than 4 years ago | (#30308604)

Personally, I see DX11 just like Blu-ray as well: what good are better graphics if all they do is show you more clearly how much the content sucks?

Movies without scripts don't get better just with more eye candy. Likewise, games with no replay value don't get more interesting with more particle effects.

Re:ehh (0)

Anonymous Coward | more than 4 years ago | (#30309166)

Why is this modded Interesting?

Some dinosaur doesn't like new stuff. Suggest staying away from Slashdot. We're interested in things like that around here.

HADO - High Ambient Definition Occlusion? Think u got your acronym rong.

DX11 compute exposes lots of cool features previously only available via OpenCL or nVidia-specific CUDA.

Please don't post cr@p.

OpenGL (2, Interesting)

some_guy_88 (1306769) | more than 4 years ago | (#30307974)

Not even sure I knew there was a DirectX 11... Does anyone know how OpenGL compares to Direct3D 11?

OpenGL Development (4, Informative)

bazald (886779) | more than 4 years ago | (#30308044)

Most of the "important" features of Direct3D 11 will be exposed immediately as OpenGL extensions.
The next version of OpenGL will officially support those features.
As usual, it will be a nightmare to take advantage of those features without requiring their presence. (GLEW and GLEE help only so much.)
If there are any features of Direct3D that would require architectural changes to OpenGL, they won't appear until the next major version, at the earliest. I'd be surprised if virtualization of texture memory were supported soon, but I'm not really an expert in these developments. (For all I know, it is already supported...)

In summary, OpenGL will remain competitive with Direct3D with the usual caveats.
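
To make "exposed immediately as OpenGL extensions" concrete: with GLEW, probing for the DX11-generation tessellation feature looks roughly like this (a sketch; assumes a GL context already exists):

    #include <GL/glew.h>
    #include <stdio.h>

    // Sketch: after creating a GL context, check for the ARB tessellation
    // extension before relying on DX11-class hardware tessellation.
    void probe_tessellation(void)
    {
        if (glewInit() != GLEW_OK)
            return;  // extension loader failed; assume nothing is available

        if (glewIsSupported("GL_ARB_tessellation_shader"))
            printf("Hardware tessellation exposed via extension.\n");
        else
            printf("Not available; use a non-tessellated fallback path.\n");
    }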

Re:OpenGL Development (0)

Anonymous Coward | more than 4 years ago | (#30308114)

Virtualization of texture memory is more of a driver issue than anything to actually do with OpenGL directly. See the work on TTM and GEM for details on how it's being handled in Linux.

Re:OpenGL (0)

LanceJZ (624688) | more than 4 years ago | (#30308176)

Just so you know, OpenGL was designed for 3D rendering applications, not real-time 3D rendering, hence its name. It does not have a direct path to the hardware; it's an in-between, a graphics-layer library, meant to make things easier for 3D application developers. To compare them is ludicrous.

Re:OpenGL (5, Informative)

QuoteMstr (55051) | more than 4 years ago | (#30308214)

This is the most ill-informed comment I've ever seen.

You don't have a "direct path" to the hardware on modern computers at all. After all, you're not filling DMAed command buffers and programming memory registers, and you don't want to be: the details would drive you to madness. That's what we have drivers for.

OpenGL and Direct3D are both abstraction layers for the hardware. Neither is intrinsically more "direct", but both were certainly designed for real-time 3D rendering (although OpenGL was initially more used for CAD applications than games).

Re:OpenGL (0)

Anonymous Coward | more than 4 years ago | (#30308620)

I have filled DMAed command buffers and programmed memory registers... although I was also described as autistic in my pursuit of hardware optimization :D

Re:OpenGL (0)

Anonymous Coward | more than 4 years ago | (#30308982)

This is the most ill-informed comment I've ever seen.

You must be new here.

Re:OpenGL (1)

FranTaylor (164577) | more than 4 years ago | (#30308244)

"to compare them is ludicrous."

Really?

You did it yourself. In fact you did it twice.

Re:OpenGL (1)

machine321 (458769) | more than 4 years ago | (#30309136)

He meant "Ludacris", now get back.

Re:OpenGL (0)

Anonymous Coward | more than 4 years ago | (#30308634)

You must be a filthy Windows user. Microsoft doesn't want you to know but [hushed voice] there's more than one flavor of Kool-Aid in the world now. There has been for quite a while.

Re:OpenGL (0)

Anonymous Coward | more than 4 years ago | (#30309106)

You are wrong.

OpenGL, like any library based on rasterization, was designed for real time. It was designed mainly for interactive (real-time) previews. In fact, it was based on the graphics library of the SGI workstations, which were well known for their real-time rendering capabilities thanks to hardware acceleration.

Re:OpenGL (3, Interesting)

noname444 (1182107) | more than 4 years ago | (#30308462)

Cards are lazily called "DX11" or "DX10", but the features are not DirectX-specific. The terms shader model or pixel shader version can be used to describe GPU hardware generations correctly and/or in an API-neutral fashion.

Since these are hardware features, they are available to any API that exposes them. OpenGL is usually implemented by the graphics driver, which is written by (or under contract to) the graphics card manufacturers, so they usually expose any new hardware features to OpenGL applications through extensions.

It's a shame that the Khronos Group isn't faster when it comes to including the extensions in the standard and upping the version number of OpenGL. I'd love to see an OpenGL release schedule synced with the shader models.

DX8 -> PS1.0 / PS1.1
DX11 -> PS5.0

For more information see:
http://en.wikipedia.org/wiki/Pixel_shader#Hardware [wikipedia.org]
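
Direct3D 11 itself makes this hardware-generation idea explicit: you negotiate a "feature level" at device creation, independent of which API features a game actually uses. A rough sketch (error handling omitted):

    #include <d3d11.h>

    // Sketch: ask the driver which hardware generation it supports.
    D3D_FEATURE_LEVEL pick_feature_level()
    {
        const D3D_FEATURE_LEVEL wanted[] = {
            D3D_FEATURE_LEVEL_11_0,  // "DX11-class" GPU -> Shader Model 5.0
            D3D_FEATURE_LEVEL_10_0,  // "DX10-class" GPU -> Shader Model 4.0
            D3D_FEATURE_LEVEL_9_1,   // "DX9-class" GPU
        };
        D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;
        ID3D11Device* dev = nullptr;
        ID3D11DeviceContext* ctx = nullptr;

        D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                          wanted, 3, D3D11_SDK_VERSION, &dev, &got, &ctx);
        return got;  // the highest level this hardware/driver pair supports
    }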

Power efficiency (2, Interesting)

afidel (530433) | more than 4 years ago | (#30307998)

It would be interesting to know whether the DX9 or the DX11 codepath has a significantly higher power requirement on DX11-capable hardware.

Re:Power efficiency (0, Flamebait)

ascendant (1116807) | more than 4 years ago | (#30308614)

Oh, god, why?

Why would you care? Why does it even matter, compared to GPU load vs. idle power consumption? Are you one of those terminally retarded morons who plays next-gen games on a laptop, with the graphics settings all the way up, on battery?

And, for a desktop, why would the power usage of DX11 vs. DX9 enter into your buying decisions at all? You already know the computer is going to suck tons of power while rendering. The only thing that matters is how much power it draws when it's not being used.

Re:Power efficiency (1, Insightful)

Anonymous Coward | more than 4 years ago | (#30308672)

More power with less juice would be nice, even on a desktop.

Insulting him will not make his argument invalid.

Re:Power efficiency (1)

0ld_d0g (923931) | more than 4 years ago | (#30309178)

Well, TBH, it's not completely irrational. You would need a bigger SMPS if you want to run a (DX11?) card with higher power consumption.


WTF (0)

Anonymous Coward | more than 4 years ago | (#30308098)

Performance-wise, DX11 didn't take its toll as much as you'd expect...

Apparently you were expecting more than a THIRTY PERCENT (30%) drop in frame rate??

Exactly wtf were your expectations?

I can see the difference between DX9 and DX10 (2, Insightful)

crazybit (918023) | more than 4 years ago | (#30308102)

while playing Crysis. I haven't seen DX11, but from what I've seen of DX9 vs DX10, the only way you couldn't tell the difference is if the game's graphics are poorly programmed. I am sure anyone who has seen Crysis superhigh on DX10 at 30+ fps could tell the difference.

Is it worth it? Well, that depends on how much the gamer values graphic quality, so it's really very subjective. But don't say there is no visible difference.

Re:I can see the difference between DX9 and DX10 (1)

sznupi (719324) | more than 4 years ago | (#30308242)

I haven't seen DX11, but from what I've seen of DX9 vs DX10, the only way you couldn't tell the difference is if the game's graphics are poorly programmed.

Why do you seem to exclude the possibility that the DX9 path is excellently programmed?...

Re:I can see the difference between DX9 and DX10 (0, Troll)

JustNiz (692889) | more than 4 years ago | (#30308278)

Uhh dude, it's a Microsoft product. That's why.

Re:I can see the difference between DX9 and DX10 (1)

Rip Dick (1207150) | more than 4 years ago | (#30308314)

I am sure anyone who has seen Crysis superhigh on DX10 at 30+ fps could tell the difference.

Exactly how high should one be?

Re:I can see the difference between DX9 and DX10 (2, Insightful)

Anonymous Coward | more than 4 years ago | (#30308368)

Ignorance is bliss, isn't it? The real difference between DX9 and DX10 in Crysis is barely noticeable. [extremetech.com]

How does it feel to be duped?

Re:I can see the difference between DX9 and DX10 (1)

ultral0rd (1595449) | more than 4 years ago | (#30308722)

The big advantage of DX11 is tessellation/displacement: real-time mesh displacement, turning normal maps into geometric detail. Here is DX10 vs DX11:

DX10: http://www.pcgameshardware.com/screenshots/original/2009/10/Tesselation_aus.jpg [pcgameshardware.com]
DX11: http://www.pcgameshardware.com/screenshots/original/2009/10/Tessellation_an.jpg [pcgameshardware.com]

Even in the screenshots of the car in the link mentioned in the article, you can see how the water reacts more to the forces applied to it. But yes, just like in the early days of anti-aliasing, this is going to completely rape PCs for a while.

Dx11 vs 9 (2, Insightful)

Anonymous Coward | more than 4 years ago | (#30308134)

Just for the record: "...notice any real difference between textures in the two versions of DirectX." DirectX has nothing to do with textures. (Textures are created by the artist and are bound by engine limitations.) The textures would not change unless the game specifically shipped higher-resolution textures, i.e. 4096 vs 2048, etc.

Now that that's over: the engine is the limiting factor in the benchmark. Remember how games became "DX10" when DX10 came out? That's not really using the framework to its full capacity. COH or Bioshock getting a DX10 update doesn't actually add that much, but compare DX9 Crysis to DX10 Crysis and you'll see a difference, because the engine was coded to use both frameworks fully (or Flight Simulator X).

Anyway, check out this video; it shows DX11 not constrained by the engine, and DX11 can actually tessellate normal maps: http://www.youtube.com/watch?v=PR40GwRtFyw&feature=player_embedded [youtube.com] (go to about 2:50)

What progress! (0, Flamebait)

FranTaylor (164577) | more than 4 years ago | (#30308144)

Yes, Microsoft hired away all of the greatest talent in 3D graphics many years ago and sequestered them in Redmond, working on hardware-tessellated dynamic water.

Future people will ask how the US squandered all of its great intellectual talent, and the only answer we will have is that we spent it designing hardware tessellators so that video gamers could have photorealistic water.

It is hard to imagine a more horrifying waste of resources.

Microsoft Research has blown through billions of dollars, and this is what we get for it?

Re:What progress! (1)

nedlohs (1335013) | more than 4 years ago | (#30308156)

Billions of dollars just aren't what they used to be.

Re:What progress! (1)

FranTaylor (164577) | more than 4 years ago | (#30308172)

Yeah, it's easy enough to pour it down the drain when you didn't even really earn it in the first place.

Re:What progress! (2, Interesting)

abigsmurf (919188) | more than 4 years ago | (#30308260)

Go watch the Heaven tech demo/benchmark, which makes heavy use of hardware tessellation, and then say Microsoft wasted their time. Hardware tessellation is going to be the next big thing (it's been around for a while, but this is the first time there's really been a universal standard for it).
A massive increase in the number of polygons you can use in models, for minimal cost (or even a performance bonus), is a "horrifying waste of resources"?

huh? (2, Insightful)

FranTaylor (164577) | more than 4 years ago | (#30308332)

"Hardware tesselation is going to be the next big thing (it's been around for a while but this is the first time there's really been a universal standard for it). "

Boy you are really living in some sort of Microsoft fantasy world.

You can't tell the difference between "Microsoft" and "universal".

Re:huh? (1)

abigsmurf (919188) | more than 4 years ago | (#30308576)

There hasn't been a universal standard for it. nVidia and ATI both required different implementations, and they didn't really push support for it. As very few companies want to write multiple renderers for different cards, everyone used bezier patches instead (which have a bigger performance hit and aren't capable of the complexity tessellation allows).

Tessellation allows for a massive increase in visual quality; if hardware tessellation could have been implemented easily in engines before DX11, you'd see it all over the place rather than just in professional modelling tools.
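
What that universal standard buys you in practice: DX11 adds fixed hull and domain stages that every DX11 part must implement, so a single code path covers all vendors. A rough host-side sketch, assuming hs/ds were compiled from hs_5_0/ds_5_0 HLSL:

    #include <d3d11.h>

    // Sketch: bind the DX11 tessellation stages. One code path works on any
    // DX11 GPU, unlike the old vendor-specific tessellation schemes.
    void bind_tessellation(ID3D11DeviceContext* ctx,
                           ID3D11HullShader* hs, ID3D11DomainShader* ds)
    {
        ctx->HSSetShader(hs, nullptr, 0);  // hull: chooses tessellation factors
        ctx->DSSetShader(ds, nullptr, 0);  // domain: places generated vertices
        // The input assembler must feed control-point patches, not triangles:
        ctx->IASetPrimitiveTopology(
            D3D11_PRIMITIVE_TOPOLOGY_3_CONTROL_POINT_PATCHLIST);
    }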

Re:huh? (1)

FranTaylor (164577) | more than 4 years ago | (#30308760)

"Tesselation allows for a massive increase in visual quality"

See this is exactly what I am talking about. "massive increase" in the one area of the product that really does not need increasing.

How much more real do things have to look? Does the game play better just because you can see the individual hair strands and blood drops of your victims?

Microsoft could have spent their billions inventing new ways to be productive, but instead they spent them on new ways to be NOT productive.

Re:huh? (0)

Anonymous Coward | more than 4 years ago | (#30309130)

There hasn't been a universal standard for it. nVidia and ATI both required different implementations, and they didn't really push support for it.

SGI added OpenGL evaluators and GLUnurbs back in the nineties. AFAIK, only one manufacturer ever implemented them in hardware, and SGI provided partial hardware support.

nVidia brought NV_evaluators to the table with the GeForce 3, and they were quietly dropped from the drivers at a later date.

ATI brought along PN triangles, which they added as a GL extension.

Then came geometry shaders (DX10/GL3), which created a universal standard for it. You can now apply any tessellation scheme you want.

Therefore, there has been a universal standard that no one has used for over a decade. DX10 has been around for a few years now and is a standard that ATI/nVidia/Intel support...

Tessellation allows for a massive increase in visual quality; if hardware tessellation could have been implemented easily in engines before DX11, you'd see it all over the place rather than just in professional modelling tools.

Tessellation in modelling tools is done in software (i.e. Maya, Max, XSI, et al.), typically either Catmull-Clark-esque subdivision or parametric surfaces (NURBS/Beziers et al.). Realistically, though, per-pixel shading has removed most of the need for tessellation. In fact, the only things tessellation will do for you in modern games are improving the silhouettes of in-game models and getting higher fidelity out of particle effects (e.g. cloth, water, etc.; these normally use quads instead of tris, which lend themselves to tessellation schemes). PN triangles are probably the best solution for standard triangle-based models, but sometimes the results aren't ideal, and it can take some additional setup and pre-processing to make them look good.

Re:huh? (-1, Troll)

Anonymous Coward | more than 4 years ago | (#30308662)

Blah blah blah you Slashdot nerds are so wordy. Here LMFTFY:

Filthy Windows user!

Re:What progress! (1)

X0563511 (793323) | more than 4 years ago | (#30308446)

Yeah, just like nPatches were going to be a big thing too.

I think driver support for those was dropped years ago?

Re:What progress! (0)

Anonymous Coward | more than 4 years ago | (#30309180)

It was a "major feature" of the GeForce 3 cards (nvEvaluators): it let you evaluate NURBS in hardware! The feature was dropped less than a year later. Reason: hardware tessellation of NURBS patches is a retarded idea.

Re:What progress! (1)

smallfries (601545) | more than 4 years ago | (#30308434)

Failing troll is full of fail?

So why would the output of a part of Microsoft other than Microsoft Research need to justify the entire budget of Microsoft Research? If I spend $1B to fund 100 projects, does each of them need to justify $1B of resources? I think there's a job in securitisation on Wall Street waiting for you.

Re:What progress! (1)

FranTaylor (164577) | more than 4 years ago | (#30309198)

Okay, wise guy, give me a list of all the innovations Microsoft Research has brought to market.

I think Clippy and Bing will be the only noteworthy items.

Re:What progress! (2, Funny)

Jesus_666 (702802) | more than 4 years ago | (#30308964)

Yeah, those graphics experts could have done something more worthwhile, like ending war or curing cancer!

Textures? wtf (0)

Anonymous Coward | more than 4 years ago | (#30308170)

Just for the record: "...notice any real difference between textures in the two versions of DirectX." DirectX has nothing to do with textures. (Textures are created by the artist and are bound by engine limitations.) The textures would not change unless the game specifically shipped higher-resolution textures, i.e. 4096 vs 2048, etc.

Now that that's over: the engine is the limiting factor in the benchmark. Remember how games became "DX10" when DX10 came out? That's not really using the framework to its full capacity. COH or Bioshock getting a DX10 update doesn't actually add that much, but compare DX9 Crysis to DX10 Crysis and you'll see a difference, because the engine was coded to use both frameworks fully (or Flight Simulator X).

Anyway, check out this video; it shows DX11 not constrained by the engine, and DX11 can actually tessellate normal maps: http://www.youtube.com/watch?v=PR40GwRtFyw&feature=player_embedded [youtube.com] (go to about 2:50)

wtf indeed! (1)

FranTaylor (164577) | more than 4 years ago | (#30308200)

"Direct X has nothing to do with textures"

Please tell me how the data for the textures gets from the disk to the screen. Exactly what software manipulates them? What software renders them? What software is responsible for how the textures look on the screen?

Quote from the article:

"DirectCompute 11 accelerated high definition ambient occlusion (HADO), full floating point high dynamic range (HDR) lighting, and full screen resolution post processing. "

Do you REALLY assert that these features have NO effect on the rendering of textures?

Re:wtf indeed! (0)

Anonymous Coward | more than 4 years ago | (#30308392)

That's not really a texture; it's a filter, really. You can't have bad textures, turn on a fancy filter, and expect the textures to change. It changes how they're rendered, but the underlying textures remain the same.

Re:wtf indeed! (1)

X0563511 (793323) | more than 4 years ago | (#30308464)

I do.

Textures are just that: image maps applied to geometry.

You're talking about shaders.

NOTICE the word "notice" (1)

FranTaylor (164577) | more than 4 years ago | (#30308540)

"notice any real difference between textures "

The word NOTICE speaks to their appearance on the screen, which is INDEED affected by shading.

HotHardware Test (4, Interesting)

DeadPixels (1391907) | more than 4 years ago | (#30308212)

From the HotHardware test:

The DirectX 11 performance numbers were recorded with the game set to its "Ultra" quality mode, while the DirectX 9 numbers were recorded with the game set to its "High" quality mode. ... As you can see, performance dropped off significantly in DirectX 11 mode.

Now, is it just me, or does that seem a little biased or inaccurate? Of course you're going to see lower performance when you set the graphics higher. Wouldn't it make much more sense (and be a fairer comparison) to compare the FPS with both cards set to either High or Ultra, instead of each at a different level?

Re:HotHardware Test (0)

Anonymous Coward | more than 4 years ago | (#30308246)

I suppose it really depends on what you want to see. If you want to see whether the DirectX 11 codepath is more efficient than the DirectX 9 one, then yes, you probably should be using the same settings. If you want to see whether the "extra" DirectX 11 effects cause a significant performance detriment, then I'd say no. This is, of course, operating under the assumption that "Ultra" mode is only available in the DirectX 11 renderer. If that assumption proves false, then you're absolutely right.

Re:HotHardware Test (0)

Anonymous Coward | more than 4 years ago | (#30309052)

I suppose it really depends on what you want to see. If you want to see whether the DirectX 11 codepath is more efficient than the DirectX 9 one, then yes, you probably should be using the same settings. If you want to see whether the "extra" DirectX 11 effects cause a significant performance detriment, then I'd say no. This is, of course, operating under the assumption that "Ultra" mode is only available in the DirectX 11 renderer. If that assumption proves false, then you're absolutely right.

This response applies to you, as well as to all the other posts that say the same thing in other words. From the fucking article:

To force DirectX 9 mode, head into the Documents folder in Windows, followed by My Games, and then into the Dirt 2 folder. There you will find a final folder titled Hardware Settings, which contains an XML file called hardware_settings_config.xml. Right-click the file and select Edit to open it in WordPad. Find the line that says "forcedx9=false" and change it to "forcedx9=true". Your game will now run in DirectX 9 mode.
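
So the whole switch comes down to one attribute in that file. The fragment might look something like this (only the forcedx9 flag is quoted in the article; the enclosing element names here are illustrative):

    <!-- hardware_settings_config.xml: only the forcedx9 flag is quoted in
         the article; the surrounding markup is a guess at the structure. -->
    <hardware_settings_config>
      <directx forcedx9="true" />
    </hardware_settings_config>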

Re:HotHardware Test (2, Informative)

psyph3r (785014) | more than 4 years ago | (#30308378)

DX9 systems can only go up to High; the Ultra mode is only for the latest DX hardware.

Re:HotHardware Test (0)

Anonymous Coward | more than 4 years ago | (#30308566)

Are you being purposely oblivious? OK, so since you agree that the DX11 card could be set to High just like the DX9 card, why not set them both to the same setting, if only for the test? Maybe run a second test and then show them performing asymmetrically, but first show us apples to apples, if only from the game's software-config standpoint.

Re:HotHardware Test (0)

Anonymous Coward | more than 4 years ago | (#30308480)

In my opinion,

Re:HotHardware Test (4, Informative)

darthflo (1095225) | more than 4 years ago | (#30308898)

If I'm not mistaken, High sets the game to use the highest-quality rendering it can get using only DirectX 9 features, while Ultra is the only setting that actually enables stuff specific to DirectX 11. The article doesn't mention there being two cards or different installs or anything, so they probably just ran the game twice on the same box, first with DirectX-9-style rendering (done through DirectX 11) and only then switched on DirectX 11's full visual splendor (Ultra quality).

Re:HotHardware Test (1)

L4t3r4lu5 (1216702) | more than 4 years ago | (#30308996)

I remember from Crysis that "High" was DX9 and "Ultra" was DX10. The "Ultra" setting may have nothing to do with increasing graphics quality above "High" on DX9, and may just enable the DX11 codepath.

The Difference (0)

rdnetto (955205) | more than 4 years ago | (#30308312)

Bit-tech also took a look at the graphical differences, arriving at this conclusion: "You'd need a seriously keen eye and brown paper envelope full of cash from one of the creators of Dirt 2 to notice any real difference between textures in the two versions of DirectX."

I've seen it, and the difference is like night and day! Well worth the extra $$$.
On a completely unrelated note, I have never seen so much green in one little brown package.

Hint to mods: +1 obscure movie reference

HADO (1)

Waccoon (1186667) | more than 4 years ago | (#30308384)

A shadow under a rock? Anything that has its own acronym should affect more than 0.5% of the pixels in a screenshot.

Bad summary (4, Insightful)

julesh (229690) | more than 4 years ago | (#30308416)

The summary picks out one point where the article states that graphics haven't improved, but the article goes on to discuss improvements in other areas. The pictures speak for themselves: the shadows are much more realistic, and so are the water effects. The textures were fine to start with; who cares if they improved?

Re:Bad summary (0)

Anonymous Coward | more than 4 years ago | (#30309058)

The summary picks out one point where the article states that graphics haven't improved, but the article goes on to discuss improvements in other areas. The pictures speak for themselves: the shadows are much more realistic, and so are the water effects. The textures were fine to start with; who cares if they improved?

It's a summary about something from Microsoft on Slashdot, what did you expect? :)

So it's not simply about higher resolution? (0)

Anonymous Coward | more than 4 years ago | (#30308590)

I understand (and to some extent agree with) the "good enough" argument for DX9. Many games look very realistic and are damn impressive.

From the screenshots in the article, I agree it takes a discerning eye to pick out the differences between DX9 and DX11. However, to me some things seem more natural and less CG-ish with DX11, especially the textures applied to the road (to keep it from looking flat). The DX9 screenshots, while they look just fine (quite good, actually), very much look computer-generated to me. The DX11 ones feel much more natural and realistic (and not realistic simply in the sense of being higher resolution). This makes me think that the "next battle" for graphics cards and DX isn't simply about higher resolution, but about recreating the naturalness (and randomness) that occurs in the real world. If that is indeed the case, DX11 seems to be in a very good position to tackle that kind of problem.

Sorry if that isn't worded very clearly; I'm up after only two hours of sleep.

Clean cars (1)

the_arrow (171557) | more than 4 years ago | (#30308848)

I took a look at the video in TFA, and for a game named "Dirt" the cars looked very clean... I mean, in the video the cars drive on a dirt track (with a nice dirt cloud behind them), with small pools of water, and the cars still look like they've just been through a thorough cleaning and waxing session. If they're spending so much extra power on more realistic flags, crowds, and water, which you have no time to really look at, why not use some of it to make the cars, which you do see quite a lot of, actually get dirty?

Dirt 2 (1)

RogueyWon (735973) | more than 4 years ago | (#30308868)

I've played Dirt 2 on the PS3, back when it was released a few months ago. I can see why the graphical improvements in the PC version might attract attention, but I have another question...

Does DX11 have any kind of feature that lets you take that complete and utter XTREME moron who does the voice-overs for the game and kill him slowly in imaginative ways? Any enjoyment in the game was killed for me by the XTREME SURFER DUDE RAD TO THE MAX guy screaming his head off every time I tried to do anything. Seriously, who actually likes that kind of thing? Who can even tolerate it, in a game that doesn't give you any option to turn it off?

I'll stick with Forza 3 for my racing-game goodness for the time being. Yes, it has the strange old bloke with the curious mid-Atlantic accent doing the voice work on the menus, but at least he's not being all XTREME, and you can shut him up if you want to.

It was all about build quality (0)

Anonymous Coward | more than 4 years ago | (#30309150)

I always thought the DX9 was better constructed than the DX11, although the lack of multi-timbral operation on the DX9 meant it was more limited.

http://www.vintagesynth.com/yamaha/dx9.php
http://www.vintagesynth.com/yamaha/dx11.php

The problem with using games to test APIs (1)

0ld_d0g (923931) | more than 4 years ago | (#30309202)

is that no game publisher would choose to make a game that looks drastically better on cards few people have (or worse, where the PC version ends up looking way better than the console one). It's bad business. (Not for GPU manufacturers, though!)

But OTOH, if you stick to demos made by MS, AMD/ATI, and NVIDIA, all you have are contrived scenarios that use everything DX11 offers that DX10 and DX9 don't. I would personally suspend judgement till more games are released with DX11 support.
