
AMD Demos DirectX 11-Capable ATI Graphics Card

timothy posted more than 5 years ago | from the indirectx-is-too-subtle dept.

Graphics 107

An anonymous reader writes "Today at a press conference in Taiwan, AMD demonstrated the world's first GPU capable of DirectX 11 technology. The demonstration shows the major improvements DirectX 11 offers over DirectX 10, and also what AMD has in store for a DirectX 11-capable ATI graphics card coming out before the end of 2009. AMD showed three primary features of DirectX 11: a tessellator, which allows for less blocky and more fluid, realistic detail; compute shaders, which allow for less restricted programming; and finally, how DX11 is better designed to take advantage of multiple CPU cores."

107 comments


Direct X11? (5, Funny)

TeknoHog (164938) | more than 5 years ago | (#28200107)

Or is Microsoft finally catching up with the unix world?

Re:Direct X11? (-1, Redundant)

Moryath (553296) | more than 5 years ago | (#28200533)

Nope. Getting DX11 functionality still requires you to infect your PC with the Vista Virus.

Re:Direct X11? (0, Troll)

jack2000 (1178961) | more than 5 years ago | (#28200877)

WRONG. You can install DX11 games and the DX11 libraries themselves on XP if you wanted to. The idea that you need Vista for DX11 is FUD straight from the mouth of the beast!

Re:Direct X11? (0)

Anonymous Coward | more than 5 years ago | (#28201095)

[Citation Needed]

I have a hard time believing that you can install and use DX11 on XP, but not DX10.

Re:Direct X11? (1, Interesting)

jack2000 (1178961) | more than 5 years ago | (#28202723)

WHAT, TROLL? Really, mods? DX10 can run on XP. Learn to use hacks: http://thepiratebay.org/search/dx10/0/7/0 [thepiratebay.org] So will DX11, once they get around to it. DX10/11's inability to run on XP is completely arbitrary...

Re:Direct X11? (1)

Krneki (1192201) | more than 5 years ago | (#28201303)

What nonsense are you talking about? Where did you get the funny idea that DX11 will run on Windows XP?

Will programmers be able to utilize? (3, Interesting)

gubers33 (1302099) | more than 5 years ago | (#28200127)

That is the real question. The PS3, for example, has amazing computing speed and a great graphics card, but game programmers have yet to utilize the system to its full potential. I'll be curious to see if the same occurs here.

Re:Will programmers be able to utilize? (1)

Darkness404 (1287218) | more than 5 years ago | (#28200263)

I think not, for some of the same reasons as the PS3: porting. If I use DirectX 9, it's relatively easy to port a game to the Xbox, and it will run on XP too. Same thing for the PS3: if they don't do all kinds of fancy stuff, and the game doesn't sell well as a PS3 exclusive, just change it up, add in a few extras, and release it as a 360 game. I don't know why any game developer, unless they were either MS-owned or heavily invested in by MS, would choose to use an incompatible DirectX version.

Re:Will programmers be able to utilize? (1)

TikiTDO (759782) | more than 5 years ago | (#28200267)

I remember seeing some articles where senior Sony execs essentially said the PS3 was made to be complex, so that is not really a good comparison. Microsoft has been pretty good about making DirectX easy to use. I imagine this release will continue with the trend.

Re:Will programmers be able to utilize? (1)

ifrag (984323) | more than 5 years ago | (#28207459)

I remember an interview with John Carmack where he said that developers would have to "sweat blood" to code for the PS3. He had somewhat more favorable things to say about developing on the Xbox. Now personally I don't really care, I have both systems and this is not fan-boy ranting or endorsement of either platform, but Sony really missed the mark by thinking programmers were going to be able to make the base engines use all those cores effectively. Yes, for graphics, going full out parallel has clearly been the way to go, but for the main engine, just about every game to date has had a very single threaded execution path. I can see a lot of potential for things like enhancing AI, but for the core logic, I have not seen things change much, even long after the introduction of multiple cores on PC's.

Re:Will programmers be able to utilize? (0)

Anonymous Coward | more than 5 years ago | (#28200377)

Considering almost all graphics processing is in the class of "Embarrassingly Parallel"... yes. They will be able to.

The question isn't CAN they, but rather WILL they? Will it hit enough of a market penetration that it becomes economically viable to target? I only ask because DX10 STILL hasn't hit that point...
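The "embarrassingly parallel" point can be sketched without any graphics API: each output pixel depends only on its own coordinates, never on a neighbour's result, so the work splits across workers trivially. A toy illustration in Python (the shade function and image size here are invented for illustration):

```python
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 64, 48

def shade(pixel):
    # Toy "pixel shader": the colour depends only on this pixel's own
    # coordinates, which is exactly what makes the workload
    # embarrassingly parallel.
    x, y = pixel
    return (x * 255 // (WIDTH - 1), y * 255 // (HEIGHT - 1), 128)

pixels = [(x, y) for y in range(HEIGHT) for x in range(WIDTH)]

# Each pixel is an independent task; any number of workers yields the same image.
with ThreadPoolExecutor(max_workers=4) as pool:
    image = list(pool.map(shade, pixels))

assert image == [shade(p) for p in pixels]  # parallel result matches serial
```

That is why GPUs can keep scaling the number of shader cores: the answer for any pixel never waits on the answer for another.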

Re:Will programmers be able to utilize? (1)

MikeBabcock (65886) | more than 5 years ago | (#28200599)

Unlike console gaming, very few graphically intensive PC games are designed to work at a specific quality with a specific frame rate for a specific consumer card. Rather, they're designed to be able to harness power from cards that don't exist at the time of development, and make good use of the features they know of at the time of development.

On a console, the system you design on is the system your users play on, and direct optimization for the platform is both necessary and worthwhile. In a PC gaming scenario, you can release a game that barely pulls 15fps on most people's systems (see Crysis) because hardware upgrades are or will become available to allow better performance for those who care to make the investment.

Re:Will programmers be able to utilize? (4, Interesting)

nobodylocalhost (1343981) | more than 5 years ago | (#28200649)

I think tessellation will be controllable on the driver side; in that case, you won't need to write specialized code in order to take advantage of it.
From what I understand, it is basically point-based curve matching using differential calculus, a fundamental change in the way models are rendered. So even for existing games, you would just need to turn on tessellation processing in your graphics card driver, and you should be able to take advantage of it, since it only changes the rendering method; the models themselves and other parameters should remain the same.

Re:Will programmers be able to utilize? (2, Informative)

Handlarn (911194) | more than 5 years ago | (#28202869)

That would produce some bad looking results, as the driver wouldn't know the difference between a model that is intentionally polygonal and one that is not.

Re:Will programmers be able to utilize? (1)

Creepy (93888) | more than 5 years ago | (#28209431)

Not necessarily - you could define an object using some form of CSG and have the tessellation tool create the actual geometry (which is pretty much what CAD software does today). It won't be as simple as the grandparent suggests, though - it will require some sort of primitive input unless it is designed to work with some sort of Level of Detail scheme (an area I do know Microsoft has patents in).

A very simple instance of CSG is a sphere, which can be defined with a point and a radius. You could then do something like a mathematical subtract of, say, a cube to make a cube-shaped hole in the sphere - that contains a mathematically polygonal structure and a curved surface, and could be entirely tessellated in hardware. I can do that today in DX10 hardware using the geometry shader, but the geometry shader tends to tessellate slowly and has a limited number of vertices it can emit (an nVidia 8800GTX can emit 1024, for instance - I haven't tried emitting on my ATI hardware because ATI has so far refused to enable it via extension in OpenGL, even though geometry shaders are supported in DirectX on the same hardware). I wonder if that's what they mean by "tessellation capable hardware" - yes it can be done, but without DX11 hardware it can't be done well. Incidentally, nVidia is supposed to show some DX11 hardware in June as well (but personally I won't care unless they release equivalent OpenGL extensions).

As for LoD schemes, currently one that is used in shaders takes a polygon and a texture of height and normal maps and basically does something similar to extrusion (but only on a visual level). Dynamic tessellation could actually extrude the texture detail as you get closer, which also makes soft shadows much easier to calculate than with pseudo-raytracing techniques (raytracing in general has a hard time with soft shadows due to typical use of point light sources, but hard shadows are easy).
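The dynamic-LoD idea in that last paragraph can be sketched in a few lines of Python. The height function and the distance-to-depth rule below are invented for illustration, not anyone's actual scheme; the point is only that subdivision depth (and thus vertex count) can follow camera distance, with new vertices displaced by the height data:

```python
import math

def height(x):
    # Hypothetical height map sampled along one axis (stands in for the
    # height/normal texture the post above mentions).
    return 0.1 * math.sin(4.0 * x)

def tessellate(x0, x1, depth):
    # Recursively split a segment and displace each new vertex by the
    # height map, turning flat geometry into actual extruded detail.
    if depth == 0:
        return [(x0, height(x0)), (x1, height(x1))]
    mid = 0.5 * (x0 + x1)
    left = tessellate(x0, mid, depth - 1)
    right = tessellate(mid, x1, depth - 1)
    return left + right[1:]  # drop the duplicated midpoint vertex

def lod_depth(camera_distance, max_depth=5):
    # Invented LoD rule: the closer the camera, the deeper the subdivision.
    return max(0, max_depth - int(camera_distance))

near = tessellate(0.0, 1.0, lod_depth(0.0))  # depth 5 -> 2**5 + 1 = 33 vertices
far = tessellate(0.0, 1.0, lod_depth(4.0))   # depth 1 -> 2**1 + 1 = 3 vertices
assert len(near) == 33 and len(far) == 3
```

A hardware tessellator does the equivalent over 2D patches instead of a 1D segment, but the payoff is the same: the stored model stays small while nearby geometry gains real displaced detail.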

Re:Will programmers -be willing- to utilize? (1)

Bearhouse (1034238) | more than 5 years ago | (#28200935)

You can utilise the admittedly good performance of the PS3, as witnessed by the number of eggheads who have hacked them into cheap supercomputing clusters (see /. posts ad nauseam).

I'd personally rephrase your comment more along the lines of "is it financially viable?"
Of course, the PS3 is a notorious horror to code for, but the other factor - market share - should be up there too.
Naturally, the two are related.

AMD & DirectX11 - sounds like a similar Pyrrhic victory...

Now if only OpenGL etc. had the same marketing hype.

Re:Will programmers be able to utilize? (0)

Anonymous Coward | more than 5 years ago | (#28205845)

The PS3 has a great graphics card?

I want some of what you're smoking!

Disclaimer: I am a games programmer and I do know what I'm talking about. The RSX was equivalent to a middle-of-the-range PC gfx card when the PS3 launched... probably because that's exactly what it is...

Oblig (1, Funny)

delta419 (1227406) | more than 5 years ago | (#28200145)

But will it run Vista?

Re:Oblig (0, Troll)

gubers33 (1302099) | more than 5 years ago | (#28200195)

If you run it with Vista, I foresee very little utilization and a lot of problems for you.

Re:Oblig (0)

Anonymous Coward | more than 5 years ago | (#28200205)

Oh man, if I could mod this +5, Funny, I would.

Re:Oblig (1)

incognito84 (903401) | more than 5 years ago | (#28204279)

and I guess there is no option for +5, Sarcastic?

DX11 ALREADY? (0)

Anonymous Coward | more than 5 years ago | (#28200157)

How about they work on their DX 10 performance first.

There aren't many games that work well with DX10 yet, why focus on DX11?

Re:DX11 ALREADY? (1)

fuzzyfuzzyfungus (1223518) | more than 5 years ago | (#28200197)

Because the future has higher profit margins than the past?

Re:DX11 ALREADY? (0)

Anonymous Coward | more than 5 years ago | (#28209295)

Where are mod points when you need them?

Re:DX11 ALREADY? (1)

Goldberg's Pants (139800) | more than 5 years ago | (#28200275)

The funny thing is, I'm running XP, so I'm on DX9, and I really saw no need for DX10. Just another graphics card upgrade that doesn't make the games any better.

Now if they could put out a card that improved gameplay and it only worked in Vista I'd be upgrading in a heartbeat!

Re:DX11 ALREADY? (1)

dunkelfalke (91624) | more than 5 years ago | (#28200307)

It was the same here, until I saw Stalker Clear Sky in DX10.1.
The day after, I had Vista installed.

Re:DX11 ALREADY? (1)

GMFTatsujin (239569) | more than 5 years ago | (#28200865)

I've got Stalker CS on my DX9 XP install. I'd like to see some comparisons to get an idea of what DX10 can do for a game like that.

Re:DX11 ALREADY? (1)

GMFTatsujin (239569) | more than 5 years ago | (#28200879)

Re:DX11 ALREADY? (1)

Maxo-Texas (864189) | more than 5 years ago | (#28201125)

Man.. this is like the DVD vs HD-DVD/Blu Ray thing.

I can barely tell a difference between DX9 and DX10. In some cases, the DX9 version seems easier to see, while the DX10 version is more realistic (but harder to see/much busier).

Of course, the only game I've played in the last year was Wii Sports so it doubly doesn't matter.

I guess it is hard for me to appreciate the differences since, in my lifetime, graphics have gone from the Apple IIe to the Vectrex to the nearly unplayable BattleMech (due to clipping of really big triangles) to Rise of the Triad (woof.. woof!) to Doom to Quake I to Everquest to Everquest 2/WoW. Once it hit the EQ/WoW/Call of Duty level, the last 2% isn't nearly as important to me.

Re:DX11 ALREADY? (1, Interesting)

dunkelfalke (91624) | more than 5 years ago | (#28201255)

That depends on how you play the game. In Stalker Clear Sky I really used to hike through an area after I had cleared it of enemies, because the graphics were just that beautiful.

Re:DX11 ALREADY? (1)

turing_m (1030530) | more than 5 years ago | (#28203429)

Apple IIe to Vectrex to the nearly unplayable BattleMech (due to clipping of really big triangles) to Rise of the Triad (woof.. woof!) to Doom to Quake I to Everquest to Everquest 2/WoW. Once it hit the EQ/WoW/Call of Duty level, the last 2% isn't nearly as important to me.

Exactly. If you come from a position of having devoted hundreds of hours to games such as Bubble Bobble, Pirates!, Defender of the Crown etc. in the C64 and XT days, the last 2% of graphical improvement is hardly detectable. Only recently was I able to play such classics as Deus Ex and System Shock 2, for example. The graphics were still brilliant compared to what I remember from my Quake I days.

The difference is the story and gameplay. A game with superior gameplay from yesteryear is still far more enjoyable than a crap game released yesterday with superior graphics. And there is such a catalog of great games with great gameplay and story from the last 15 years or so compared with my available free time that I'm still working my way through the better games ever created. Maybe one of these days I'll get to Bioshock or Half Life 2. (I like the single player FPS genre because once I have devoted my 20+ hours to the game, I don't have a pressing need to complete one more level, unlike online multiplayer where there is an infinite amount of novel engrossing gameplay.)

In this way, it's no different to movies. The top 100 on imdb (and in your favorite genres) will still be more enjoyable and engrossing than 99.999% of summer blockbusters with great special effects. The occasional genuinely great summer blockbuster with great special effects (e.g. Alien, Aliens, Terminator, Terminator II in its time) will eventually bubble up on those lists anyway. At which point I'll see them.

Re:DX11 ALREADY? (1)

gTsiros (205624) | more than 5 years ago | (#28205191)

Not all the areas in S:CS show differences.
The differences are not subtle, but they are sparse. /s.t.a.l.k.e.r. ubergeek

Re:DX11 ALREADY? (1)

lorenzo.boccaccia (1263310) | more than 5 years ago | (#28201425)

This is really all about hype. I remember having tessellation in Morrowind! It was called n-patches at the time, but the concept is the same.

Also, wasn't tessellation supposed to be one of the improvements of DirectX 10?
And meanwhile, Quake 3 did some tricks and got fully rounded surfaces.
The truth is that everything DirectX 10 offers was already doable using a video card supporting shader model 2.0;
back then it was only more difficult to achieve.

Just look at GPU-accelerated PhysX. It doesn't need DirectX 11 or DirectX 10 to work, and it does exactly what DirectX 11 is supposed to enable: generic shaders for programming logic on the GPU.

The only real advantage is that this revision will make every shader across each supporting board usable with double-precision floating point, allowing those in the know to work without having to compensate for each board's quirks - but this is something that has no value in video games.

There's a little demonstration here: http://www.tweakguides.com/Crysis_13.html [tweakguides.com], in the section "Enabling 'Very High' Settings Under Windows XP or Vista DX9". That has fully dynamic water, reflections, soft shadows and so on.

Re:DX11 ALREADY? (1)

lorenzo.boccaccia (1263310) | more than 5 years ago | (#28201509)

Before someone comes along to tell me that the article points out some unspecified differences between the DirectX 9 and 10 versions (as per the linked page): following the guide, the full description of those differences is:

"Again, there appear to be subtle shader differences that no screenshot can capture, such as the difference in motion blurring and depth of field, but it's nothing of any real significance, and can be recreated using appropriate advanced tweaks." (taken from page 5)

That is, motion blur and depth of field (other DX10 "exclusives") are there, just somewhat different, but fully working.

Re:DX11 ALREADY? (0)

Anonymous Coward | more than 5 years ago | (#28203407)

Tessellation has been in use since long before DX10 or Morrowind. All it does is remove or add vertices depending on the viewing distance. The further away, the less detail you will be able to discern, so rendering full-poly models would be silly.

Quake 3 did not do truly rounded surfaces. Q3 patch meshes use more vertices to simulate a curved surface, but they aren't curved. Look closely and you can easily see the individual planes that make up any non-brush, non-convex shape. By default, curve subdivisions are set to 4, meaning vertices won't be added to the curve unless the intersecting planes are a good distance from each other. You can set this as low as 1, and although it does look smoother, curves are still quite visibly made up of planes. The Quake 3 engine also performs tessellation on curved surfaces; play with r_lodcurveerror, which controls the distance at which it starts dropping/popping polys, for a demonstration of how it works.

I generally agree that DX10 was a small improvement over DX9, but it did add the ability to do nice things like volumetric clouds and fog/particles that can interact with the environment and clip properly against world geometry.

Re:DX11 ALREADY? (1)

lorenzo.boccaccia (1263310) | more than 5 years ago | (#28206259)

Wrong. What you are describing is LOD, not tessellation. Tessellation adds vertices not originally on the shape, based on mathematical properties of the shape, as in this example image: http://alex.vlachos.com/graphics/DX8_N-Patch.jpg [vlachos.com]

Volumetric fog has been around for ages; check this demo, which works even on a Radeon 9500: http://www.humus.name/index.php?page=3D&ID=70 [humus.name]

Just check the nVidia SDK 9.5 page for what could be done using DirectX 9: http://developer.download.nvidia.com/SDK/9.5/Samples/samples.html [nvidia.com]

DirectX 10 may have made all this stuff easier, but developers already had a way to do most of it long before.

Also look at this for an example of the patch meshes used for terrain in Q4: http://www.katsbits.com/htm/tutorials/doom_3_terrain_from_patches.htm [katsbits.com] - vertices are not removed, but added to the shape. But patch meshes are specific to Quake 4 and Doom 3; curved surfaces in Quake 3 are a different beast, and you should try them with a decent video card: those are Bezier meshes and don't have any fixed vertices - all of the vertices you see are generated at run time based on the limit you impose. Look at this for an explanation: http://www.cc.gatech.edu/classes/AY2002/cs4451_spring/groups/group3/index.html [gatech.edu]
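The Bezier point above can be shown in a few lines of Python. This is a single quadratic curve rather than the biquadratic patches Quake 3 actually uses, and the control points and subdivision counts are arbitrary examples; the principle is the same: only 3 control points are stored, and every rendered vertex is computed at run time from the subdivision limit you impose:

```python
def quadratic_bezier(p0, p1, p2, t):
    # Bernstein form of a quadratic Bezier curve (3 control points).
    u = 1.0 - t
    return tuple(u * u * a + 2.0 * u * t * b + t * t * c
                 for a, b, c in zip(p0, p1, p2))

def tessellate_curve(p0, p1, p2, subdivisions):
    # Every rendered vertex is computed at run time; only `subdivisions`
    # decides how many come out of the same 3 stored control points.
    return [quadratic_bezier(p0, p1, p2, i / subdivisions)
            for i in range(subdivisions + 1)]

ctrl = ((0.0, 0.0), (1.0, 2.0), (2.0, 0.0))  # a simple arch
coarse = tessellate_curve(*ctrl, 4)   # 5 vertices
fine = tessellate_curve(*ctrl, 16)    # 17 vertices from the same control points
assert len(coarse) == 5 and len(fine) == 17
assert quadratic_bezier(*ctrl, 0.5) == (1.0, 1.0)  # curve apex at t = 0.5
```

Raising the subdivision count smooths the curve without touching the stored data, which is exactly why the apparent "vertex count" of a Q3 curved surface depends on your settings, not the map file.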

Re:DX11 ALREADY? (1)

ejtttje (673126) | more than 5 years ago | (#28202979)

Meh, I agree with Maxo-Texas... most of those screenshots just look like a slight change in position or lighting. Aside from an obvious lens flare in the last shot (which, to me, spoils realism because my eyes don't produce lens flares like a movie camera...), if you showed those images without the labels, I don't think people could identify which was which.

There was a shot with wood planks, but that just looked like a higher resolution texture or bump map was being used, not really very exciting. Makes me wonder if they were just bumping up texture resolutions for the DX10 mode. *shrug*

This coming from a guy who gets annoyed with interlaced video and various other compression artifacts, so I am kinda picky.

I also wish people would ignore this proprietary DirectX crap in the first place and use OpenGL for portability. :-/

Re:DX11 ALREADY? (1)

harryandthehenderson (1559721) | more than 5 years ago | (#28200383)

Just another graphics card upgrade that doesn't make the games any better.

And that's any different than any other incrementing of DirectX, how? Did DirectX 9 provide you improved gameplay over DirectX 8? Doubtful. It was just another graphics card update that didn't make the games any better.

Re:DX11 ALREADY? (1)

sssssss27 (1117705) | more than 5 years ago | (#28201111)

DirectX 11 adds GPGPU support. This could allow games to be better on DirectX 11 instead of just being prettier.

Re:DX11 ALREADY? (1)

Kjella (173770) | more than 5 years ago | (#28200863)

DX11 is a superset of DX10, so there's no reason for Microsoft to wait. Basically it brings a few more interfaces but, most importantly, much better multi-threading performance that is all on the driver side. All DX10 games will run just fine under DX11, and the minor performance hit we saw with DX10 is again being made irrelevant by faster cards.

This one goes up to 11 (2, Funny)

nschubach (922175) | more than 5 years ago | (#28201233)

How about they work on their DX 10 performance first.

Because this one goes up to 11, so obviously it's better.

Linux drivers? (2, Insightful)

anjilslaire (968692) | more than 5 years ago | (#28200223)

Yes, but will it run with all of the bells and whistles (sans DirectX, of course) on Linux? Will they have solid drivers available on release?

Re:Linux drivers? (1)

harryandthehenderson (1559721) | more than 5 years ago | (#28200251)

When has ATI ever had solid Linux drivers?

Re:Linux drivers? (3, Insightful)

Darkness404 (1287218) | more than 5 years ago | (#28200317)

When has ATI had solid anything drivers? Even the Windows drivers cause BSoDs for no apparent reason.

Re:Linux drivers? (0)

Anonymous Coward | more than 5 years ago | (#28200991)

This, and I wish I had mod points for this. ATI cannot make stable drivers for Windows let alone Linux. This card means nothing because their drivers will always be too shitty to run anything. Call me when Nvidia makes a DX11 card and developers support it, until then this means nothing.

Re:Linux drivers? (1)

Blakey Rat (99501) | more than 5 years ago | (#28202285)

You're not kidding. What's truly amazing is that it's *utterly* random.

Sometimes I'm running WOW, playing a DVD, and have 5-6 browser windows open and my computer's solid as a rock for days. Sometimes, I've got nothing but a single browser window, and bam-- "Vista has detected your graphics driver has crashed."

The good news is that Vista can recover from it 9 times out of 10. Even without crashing WOW, which is pretty impressive.

Re:Linux drivers? (1)

cbhacking (979169) | more than 5 years ago | (#28202491)

I've heard this a lot, but I can't say it matches my experiences.
3 years on a laptop with Win2000 and a very old ATi chip (7xxx series): no video-caused crashes.
2 years on a laptop with WinXP and an old-ish ATi chip (9xxx series): no video-caused crashes.
6 months on a Vista laptop with (now outdated) Mobility 200M: no video-related crashes, EVEN WITH BETA DRIVERS (for comparison, nVidia's Vista drivers weren't what I would call release quality until more than 6 months after Vista's release. ATi was there over three months before RTM).
Granted, these aren't top-of-the-line desktop (or even the best of the laptop) lines, but honestly, where are these driver-caused crashes ATi supposedly has so much trouble with?

Re:Linux drivers? (1)

bloodhawk (813939) | more than 5 years ago | (#28202583)

When has ATI had solid anything drivers? Even the Windows drivers cause BSoDs for no apparent reason.

Very true. The only positive thing I have to say about ATI drivers is that they are a shitload better than Nvidia drivers. But that is like being smarter than the brain-damaged kid at the back of the bus.

Re:Linux drivers? (1)

Mycroft_VIII (572950) | more than 5 years ago | (#28206569)

I've had very few ATI driver crashes myself. I have ONE game that, every third patch or so, I can depend on to lock the drivers if I do things just right (a very specific sequence, easily avoided), and even then Vista resets them after a bit.
Nvidia, on the other hand - I've failed to get hardware that works long enough to comment on the drivers since they started making chipsets for something more advanced than plain old PCI.
4 different systems and 5 different cards (6xxx and 8xxx series mostly); the best result I ever got was a card that wouldn't hard-lock the system (power switch on the PSU, not the soft switch on the case!) as long as you didn't use the second monitor output, do anything 3D, or move the cursor too fast for it.
Their motherboard chipsets I do like, however; three of the systems I tried their graphics cards in were Nvidia-based mainboards.

Mycroft

FUD (1)

electrosoccertux (874415) | more than 5 years ago | (#28209171)

Their drivers are fine. That's the first thing AMD fixed after acquiring ATI.

On the contrary,

When has Nvidia had solid anything drivers? Even the Windows drivers cause BSoDs for no apparent reason.

See how easy that was?

Re:Linux drivers? (0)

moon3 (1530265) | more than 5 years ago | (#28200625)

That might also be the Linux community's fault. Linux has never provided a solid driver development kit. Look at Microsoft and their Windows WDK; I can only praise Microsoft for providing such superior tools and resources to driver developers. Support developers, give them the best tools, and they will follow.

Re:Linux drivers? (2, Insightful)

harryandthehenderson (1559721) | more than 5 years ago | (#28200635)

That might also be the Linux community's fault. Linux has never provided a solid driver development kit.

Doesn't seem to have stopped nVIDIA from making a pretty solid driver for Linux.

Re:Linux drivers? (5, Informative)

Kjella (173770) | more than 5 years ago | (#28201041)

Doesn't seem to have stopped nVIDIA from making a pretty solid driver for Linux.

nVidia basically overrode the lower third or so of X11 (it's a big function-pointer table) and wrote their own implementation; ATI did the same, with less success. AMD/Intel are now trying to invent a proper open source stack, with the graphics execution manager (GEM) for memory management, kernel mode setting (KMS) for flicker-free boots and more, a low-level state-tracking framework called Gallium3D to expose modern shaders, a better direct rendering interface (DRI2), redirected direct rendering (RDR) and various other improvements - but you're talking about things only 1-2 years old. nVidia has succeeded, yes, but for most intents and purposes they wrote the whole thing themselves. There's a reason it's a sore point for open source fanatics: it's not merely a blob addon, it basically ripped out a whole chunk of open source, said "not good enough" and replaced it with its own blob.

Re:Linux drivers? (1)

harryandthehenderson (1559721) | more than 5 years ago | (#28201105)

ATI did the same except with less success.

Which was entirely my point. That ATI is still to this day unable to release a half-decent driver for Linux is their own fault since nVIDIA was clearly able to do so.

Re:Linux drivers? (1)

Ultra64 (318705) | more than 5 years ago | (#28208869)

The driver quality has improved noticeably since they were purchased by AMD.
Also, the open source drivers are progressing nicely. http://xorg.freedesktop.org/wiki/RadeonFeature [freedesktop.org]

Re:Linux drivers? (0)

Anonymous Coward | more than 5 years ago | (#28207319)

Yes there will be drivers at launch.

ATI has had at-launch ASIC support from the initial launch of the HD4xxx family. This should continue with these new cards.

(I am not going to make any comment about general quality, but an aggressive OS community doesn't exactly help).

meh, we go through this every few years (2, Funny)

docbrody (1159409) | more than 5 years ago | (#28200259)

So what, another update to DirectX and another batch of video cards that support it. Or partially support some of the features, or 100% of all the key features but not some others. Or some variation on that. Blah blah blah.

Re:meh, we go through this every few years (0)

Anonymous Coward | more than 5 years ago | (#28207517)

Yup. We're getting old. Now get off my lawn.

Closer to the ultimate goal (2, Interesting)

Profane MuthaFucka (574406) | more than 5 years ago | (#28200265)

Realistic 3D CGI porn. Of course.

3D CGI Porn (2, Insightful)

Petersko (564140) | more than 5 years ago | (#28200711)

"Realistic 3D CGI porn. Of course."

I guess that's for people who find it's just too creepy to have actual porn actresses in their downloaded mpg's... watching them... laughing at them... judging them...

With CGI porn, the disconnect is complete! It has become a truly solitary masturbatory experience, the last vestiges of shared sexuality banished.

WOO.... hoo?

Re:3D CGI Porn (2)

Profane MuthaFucka (574406) | more than 5 years ago | (#28200833)

No, it's for people who are all like "yea baby, oh, touch yourself, yea more of that, NO NO DON'T LICK THE TITTY! who told you that licking your own titty is sexy? It's not, so stop that. God if I were the director, I would have slapped you for that. Now look what you did, you killed my boner."

With computer CGI porn, no actress will lick her own titties ever again.

Re:3D CGI Porn (1)

Kjella (173770) | more than 5 years ago | (#28201305)

Hmm, yeah: in addition to unrealistic depictions of what having sex is like and of standards of beauty, let's top that off with some mind-reading ability too. I wonder if any of that could make it hard to connect with real girls, who actually need to be pleasured, look average instead of like bombshells, and have minds of their own. No wonder RealDolls sell; if they could make them semi-intelligent sex robots too, they would be just the thing...

Re:3D CGI Porn (1)

NevermindPhreak (568683) | more than 5 years ago | (#28203921)

My girlfriend may not be able to read my mind, but she does know what I'm thinking when I tell her what I'm thinking. Porn lacks that ability. :-P

Whoops, this is slashdot! Replace "girlfriend" with "Fleshlight" or something.

Re:3D CGI Porn (0)

Anonymous Coward | more than 5 years ago | (#28205247)

Whoops, this is slashdot! Replace "girlfriend" with "Fleshlight" or something.

My Fleshlight may not be able to read my mind, but she does know what I'm thinking when I tell her what I'm thinking. Porn lacks that ability. :-P

Re:3D CGI Porn (1)

adolf (21054) | more than 5 years ago | (#28205817)

It's far simpler than that: It can even have a textmode interface:

Welcome to Pornmaker 9000. Please enter a number from 1 to 9 for each question:

1. Guys?
2. Girls?
3. Other?
4. All of the above?
5. Scat?
6. Tongue?
7. Latex?
8. Blonde?
9. Redhead?
10. Romantic?
11. Anal?
12. Fisting?
13. Anal fisting?
14. Animals?
15. Furries?
16. Tits?
17. POV?
18. Multicam?
19. Enema?
20. Shish?
21. Gag?
22. Tattoos?
23. Shower?
24. Golden shower?
25. Old?
26. Young?
27. Fat?
28. Siblings?
29. Twins?
30. Squirt?

[bzzt.] [qrrrr.] [wrrrrrrrrgh.]

Thank you for your preferences. Pornmaker 9000 has created your porn. Enjoy!

Re:3D CGI Porn (1)

geekoid (135745) | more than 5 years ago | (#28203207)

And they do what you want without having to pay them.
Well, anything someone on a screen could do and say.

Re:3D CGI Porn (1)

Chris Burke (6130) | more than 5 years ago | (#28209203)

With CGI porn, the disconnect is complete! It has become a truly solitary masturbatory experience, the last vestiges of shared sexuality banished.

Say what you will about it being impersonal. CGI is the only way I could afford to complete my masterpiece: Lord of the Cock Rings: The Battle of Purple Helm's Deep Penetration.

Re:Closer to the ultimate goal (1)

sharkey (16670) | more than 5 years ago | (#28201077)

You kids with your 'realism', and '3D' pr0n. Why, in my day, all we had was ASCII pr0n, running at 5 seconds per frame. And we liked it!

Re:Closer to the ultimate goal (0)

Anonymous Coward | more than 5 years ago | (#28204547)

Mod parent up!

So... (2, Insightful)

eexaa (1252378) | more than 5 years ago | (#28200293)

...so they are shipping real drivers with ATI cards? Great!

(In fact, I hope they finally do something about this. I've been forced to avoid ATI hardware for over 5 years now, just because of driver incompatibilities. It's just sad.)

Beowolf Cluster? (0)

Anonymous Coward | more than 5 years ago | (#28200337)

Golly! Just imagine the graphics output a beowolf cluster of these would produce.

Re:Beowolf Cluster? (1, Funny)

harryandthehenderson (1559721) | more than 5 years ago | (#28200389)

15fps on max settings with Crysis?

who cares, my computer OSs don't have direct-x (0, Redundant)

swschrad (312009) | more than 5 years ago | (#28200529)

and won't, either.

Re:who cares, my computer OSs don't have direct-x (1)

Clay Pigeon -TPF-VS- (624050) | more than 5 years ago | (#28200771)

Then why post whinging about it? Are you the same sort of person that complains about being able to find porn on bing after you disable the filters?

Re:who cares, my computer OSs don't have direct-x (1)

geekoid (135745) | more than 5 years ago | (#28203221)

And we salute you~

Jeez dude, no one cares and it doesn't belong in this thread.

Yet Another Feature... (2, Interesting)

ADRA (37398) | more than 5 years ago | (#28200531)

that isn't in XP, hence nobody cares. You'll have, what, the 30% market segment with Vista, and maybe 10% of that being regular gamers who will actually use this.

This will just add to the brokenness that Windows is turning the PC gaming platform into. Good Job!

PS: Before everyone jumps in to say that everyone will jump into Win7, I think you're mistaken. The only way Microsoft will kill XP for most existing users would be to introduce a critical bug that they choose not to fix. I played with Win7 for a few days and can safely say that it doesn't add anything I've ever wanted to use that a trivial Google search wouldn't find an as-good or better alternative for. And maybe it's just me, but pretty much every single UI 'enhancement' since circa Win2k has been a step backwards in terms of -my- productivity.

It's lucky that I'm Linux competent, since Fedora/Gnome makes practically everything I need easy and uncluttered. If the barrier to entry were a little lower, I could see a mass exodus coming as XP users take an honest look at what they -really- want to update to.

Re:Yet Another Feature... (1)

GMFTatsujin (239569) | more than 5 years ago | (#28200841)

Dude, that wasn't a PS. That was practically a P on its own merits.

Not an FP, but a P none the less. A troll P, at that.

Troll's pee!

(Sorry ... long day.)

Re:Yet Another Feature... (1)

The End Of Days (1243248) | more than 5 years ago | (#28201783)

I'm pretty sure that it's not just you who feels this way about Windows 7, but that's mainly because this site is full of people who hate Microsoft for the sake of hating Microsoft. Amongst the general population Windows 7 is gonna own. Personal opinion, to be sure, but historically I'm pretty good at judging hype on its own merits.

Re:Yet Another Feature... (0)

Anonymous Coward | more than 5 years ago | (#28201789)

Douchebags like you are the reason I won't ever use a Linux OS of any kind.

PS: I use and like Windows Vista. It doesn't just have a better UI but it's also faster than XP.

Re:Yet Another Feature... (0)

JCSoRocks (1142053) | more than 5 years ago | (#28202451)

The only reason XP has hung on so long is that it took so long for Vista to come out. Everyone was running XP and has continued to do so because Vista was expensive, it had problems until SP1 came out, and ancient XP machines couldn't run it. No one skipped from Win2k to Vista. Tons of people will be skipping from XP to 7. Just wait and see. New machines will have 7. Old machines will be replaced with machines running 7. Gamers running XP will switch to 7.

Re:Yet Another Feature... (1)

geekoid (135745) | more than 5 years ago | (#28203257)

Yes, the bug is called "Not having DirectX 11"

There will never be a mass exodus to Linux as long as corporations keep ties to an OS and developers write code to take advantage of specific items in that OS.

If applications were self contained like they should be, then the OS wouldn't matter nearly as much.

Re:Yet Another Feature... (1)

CrashNBrn (1143981) | more than 5 years ago | (#28208129)

Your frame of mind seems to be pretty common around here: "Nothing I can see" in Vista/Win7. I might even have agreed with that statement not all that long ago. Yet if you follow some of the tech notes related to Vista/Win7, i.e. the actual underpinnings of the system, not just the candy-coated UI, there's quite a lot of interesting stuff going on under the hood. Mark Russinovich's blogs since he was hired at Microsoft have been particularly enlightening, even more so than the stuff he used to post frequently on sysinternals.com.

What's particularly interesting is that he used to frequently point out how he would solve a security issue or track down a bug, usually with "Process Explorer" or one of its kin. He was also the first to discover the much-maligned rootkit. And now he's working directly FOR Microsoft, which means that instead of just happening to find problems in his own time that might be of interest to him or some of his old security-related clients, he now has a direct line to getting Windows drivers fixed, along with the quirks in "MS spaghetti code" that have caused their software to misbehave on so many occasions. Which in the long run can only be a good thing.

Exposure. (0)

Anonymous Coward | more than 5 years ago | (#28200601)

From what I'm reading, all DirectX 11.0 does is expose functionality that's already in place in most modern GPUs. So I don't see why we need another new GPU.

The end of the CPU/GPU divide (1, Interesting)

Anonymous Coward | more than 5 years ago | (#28200633)

From the article:

Lastly, DX11 is better designed to take advantage of multiple CPU cores. This should allow developers to offload some of the work on to the processors that are typically there not doing as much work, freeing up the GPU to do the more important processing and rendering.

Interesting turnaround. The original motivation for the GPU was to allow the CPU to offload expensive graphics computation to a dedicated processor. Now it appears that newer GPUs are allowed to offload their computation back to the CPU again.
This is further evidence that the CPU/GPU divide is being eliminated, and that there will likely be no such distinction among processors in the near future.

Re:The end of the CPU/GPU divide (1)

blahplusplus (757119) | more than 5 years ago | (#28204859)

"Now it appears that that newer GPUs are allowed to offload their computation back to the CPU again."

This is more an artifact of programmers not knowing how to utilize the extra cores of modern processors than it is really "offloading"; it's more like taking full advantage of the CPU. GPU cards and drivers have always shared the load between CPU and GPU. The fact of the matter is that with many-core CPUs, many programmers haven't learned to utilize them effectively.

Re:The end of the CPU/GPU divide (1)

Lord Crc (151920) | more than 5 years ago | (#28205913)

Interesting turnaround. The original motivation for the GPU was to allow the CPU to offload expensive graphics computation to a dedicated processor. Now it appears that that newer GPUs are allowed to offload their computation back to the CPU again.

No, that is not what's going on. The GPUs have become so fast at doing their jobs that the CPU can't feed them fast enough. That's where the new features in DX11 will help. It will make it possible to efficiently use multiple threads to feed the GPUs. This has been an issue in DX10 and earlier.
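The pattern being described, multiple threads preparing rendering work in parallel while one thread submits it, can be sketched in a few lines. This is a toy illustration in Python, not the actual Direct3D 11 API: in D3D11 terms the workers would record into deferred contexts and the main thread would call ExecuteCommandList on the immediate context, but every name below is made up for the sketch.

```python
import threading

def record_commands(scene_chunk):
    # Stand-in for recording into a deferred context: build a command
    # list on a worker thread, with no GPU submission happening yet.
    return [("draw", obj) for obj in scene_chunk]

def render_frame(scene, num_threads=4):
    # Split the scene; each worker records its own command list in parallel.
    chunks = [scene[i::num_threads] for i in range(num_threads)]
    results = [None] * num_threads

    def worker(idx):
        results[idx] = record_commands(chunks[idx])

    threads = [threading.Thread(target=worker, args=(i,))
               for i in range(num_threads)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    # Only this thread "submits" (ExecuteCommandList in D3D11 terms), so
    # submission order stays deterministic even though recording was parallel.
    submitted = []
    for cmd_list in results:
        submitted.extend(cmd_list)
    return submitted

frame = render_frame([f"mesh{i}" for i in range(8)], num_threads=2)
```

The point of the design is that the expensive part (recording) scales across cores, while the serial part (submission) stays on one thread, which is what DX10 and earlier effectively forced for everything.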

What happened... (0)

Anonymous Coward | more than 5 years ago | (#28200803)

What happened to having one universal standard with DirectX 9? Back in the days of DirectX 9, I don't remember Microsoft releasing a new version, requiring new hardware, every single year. And then, of course, they make all of their new games need the latest version of DirectX, so people need to go out and buy new hardware just to play a game. Most of these games are eventually cracked on torrent sites, so obviously it's not necessary to have these new versions.

The real question is... (3, Funny)

nycguy (892403) | more than 5 years ago | (#28200955)

Will Duke Nukem Forever [3drealms.com] wait to take advantage of DirectX 11?

Re:The real question is... (1)

Deltaspectre (796409) | more than 5 years ago | (#28204397)

No.

Re:The real question is... (2, Funny)

ichigo 2.0 (900288) | more than 5 years ago | (#28204791)

I thought 3drealms going bankrupt would end the DNF meme. Guess I was wrong.

Re:The real question is... (0)

Anonymous Coward | more than 5 years ago | (#28209327)

Well they've already done 100 re-writes of the game and engine. What's another one going to hurt...

Finally! (1)

Merc248 (1026032) | more than 5 years ago | (#28201023)

I now can play my favorite game of all time with decent performance: 3DMark

Useless, stupidly written information-free article (1)

White Flame (1074973) | more than 5 years ago | (#28201043)

Come on, this is the depth of comprehension that the author has about what tessellation is?

One of the technologies in DirectX 11 is something called tessellator.

Tessellator allows for more smoother, less blocky, and more organic looking objects in games. Anti-aliasing shouldn't be confused with this, as AA does a descent job at smoothing out sharp edges but tessellator actually makes it look more fluid and frankly much more realistic. Tessellator makes things look more "rounded" instead of chunky and blocky. Instead of having to trade off quality for performance, like in the past, developers can now have the most realistic scenes without a performance hit.

Tech Fragments is an appropriate name for the site, I guess, seeing as they can't even get the tense of the word right.
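For anyone who wants a less mangled explanation than TFA's: tessellation just means subdividing coarse geometry into many smaller triangles, so curved surfaces stop looking faceted. A toy midpoint-subdivision sketch, purely illustrative (the DX11 tessellator is a fixed-function hardware stage driven by hull and domain shaders, not CPU code like this):

```python
def midpoint(a, b):
    # Midpoint of two 3D points given as (x, y, z) tuples.
    return tuple((x + y) / 2.0 for x, y in zip(a, b))

def subdivide(tri):
    """Split one triangle into four by connecting its edge midpoints."""
    a, b, c = tri
    ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
    return [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]

def tessellate(tris, levels):
    # Each level multiplies the triangle count by 4.
    for _ in range(levels):
        tris = [t for tri in tris for t in subdivide(tri)]
    return tris

base = [((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0))]
fine = tessellate(base, 3)  # 1 triangle becomes 4**3 = 64 triangles
```

The win in DX11 is that this amplification happens on the GPU after the coarse mesh is uploaded, so a low-polygon model can be smoothed (e.g. with the new midpoints displaced toward a curved surface) without the CPU ever touching the extra vertices.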

Worst quote ever (1)

sys.stdout.write (1551563) | more than 5 years ago | (#28201327)

From TFA (in reference to the tesselator in DirectX 11):

Instead of having to trade off quality for performance, like in the past, developers can now have the most realistic scenes without a performance hit.

Yeah, I'm sure turning on tessellation won't cause any performance hit at all.

Tech Fragments has the most sensationalist writers ever.

Yes, but... (1)

Type44Q (1233630) | more than 5 years ago | (#28203289)

will it run Lin^H^H^HWindows XP? :P

Obligatory (0)

Legion303 (97901) | more than 5 years ago | (#28203365)

This one goes to 11.

Nvidia on the Run (1)

Nom du Keyboard (633989) | more than 5 years ago | (#28204337)

I, for one, am happy to see Nvidia on the run. I've seen what they will try to do ($649 for a GTX 280 card based on aging DDR3 memory technology) when they think that they rule the roost. Go ATI!

Re:Nvidia on the Run (0)

Anonymous Coward | more than 5 years ago | (#28205077)

Why does it matter what technology they use as long as it brings competitive performance?

Re:Nvidia on the Run (0)

Anonymous Coward | more than 5 years ago | (#28205471)

The most expensive GeForce card at the moment (excluding ones with water-blocks built on) is the GTX295, which can be had for $550. Further, nVidia achieves higher memory bandwidth with that "aging DDR3 memory technology" than ATI does with GDDR5.

Excuse Me But... (1)

Nom du Keyboard (633989) | more than 5 years ago | (#28204409)

Excuse me, but, didn't DirectX 10.1 also provide for a tessellator?

And isn't this the reason why there never was an Nvidia 10.1 card, but ATI ran it just fine?