
DirectX 'Getting In the Way' of PC Game Graphics, Says AMD

Soulskill posted more than 3 years ago | from the stairs-in-the-way-of-getting-to-the-basement dept.


Bit-tech recently spoke with Richard Huddy, worldwide developer relations manager of AMD's GPU division, about the lack of a great disparity between PC game graphics and console game graphics, despite the hardware gap. Quoting: "'We often have at least ten times as much horsepower as an Xbox 360 or a PS3 in a high-end graphics card, yet it's very clear that the games don't look ten times as good. To a significant extent, that's because, one way or another, for good reasons and bad - mostly good, DirectX is getting in the way.' Huddy says that one of the most common requests he gets from game developers is: 'Make the API go away.' 'I certainly hear this in my conversations with games developers,' he says, 'and I guess it was actually the primary appeal of Larrabee to developers – not the hardware, which was hot and slow and unimpressive, but the software – being able to have total control over the machine, which is what the very best games developers want. By giving you access to the hardware at the very low level, you give games developers a chance to innovate, and that's going to put pressure on Microsoft – no doubt at all.'"


323 comments

Yeah right (5, Interesting)

ggramm (2021034) | more than 3 years ago | (#35541046)

I worked for Microprose in the '90s. Back then we had direct access to the hardware, but the technology was limited. GFX power increased and new tricks came along. Nowadays it wouldn't be possible to do all that.

DirectX is the sole reason we have good games and graphics on PC. No one wants to reinvent the whole wheel and Microsoft works a lot with GPU manufacturers to come out with new technology.

DirectX is not the reason; it's the lazy developers who just port the game from consoles to PC. They don't spend the time to make a PC version that uses DirectX and the newest graphics cards to their fullest capability, so why on earth would they do that if you removed DirectX?

There is no DirectX on Linux, and just look at how laughable the situation is. Yeah, there's NetHack and some clone of Civilization 2 with worse graphics, but it's a far cry from the console and PC games that gamers actually play. It's a joke.
Microsoft has gone to great lengths to support PC gaming. We should all thank Microsoft that the situation is even this good. Who we should bitch at are the lazy developers and AMD, who has also been lagging behind. NVIDIA and Microsoft are basically doing all the innovation, and their hardware is miles ahead of AMD's. Microsoft, Intel and NVIDIA: all great companies with great products that are truly working for PC games.

Re:Yeah right (0)

Anonymous Coward | more than 3 years ago | (#35541094)

No one wants to reinvent the whole wheel

It's more like knowing how a wheel works than reinventing it. They don't have to reinvent the maths behind GC and other game-related stuff, only learn what others have already invented.

Now if "they" are too lazy to even bother learning how stuff works...

Re:Yeah right (3, Informative)

bmo (77928) | more than 3 years ago | (#35541098)

There is no DirectX on Linux, and just look at how laughable the situation is. Yeah, there's NetHack and some clone of Civilization 2 with worse graphics, but it's a far cry from the console and PC games that gamers actually play. It's a joke

Funny, Steam games run just fine.

--
BMO

Re:Yeah right (0)

Anonymous Coward | more than 3 years ago | (#35541108)

There is DirectX for Linux. It's called WineD3D.

Re:Yeah right (0)

Anonymous Coward | more than 3 years ago | (#35541142)

What are you, slow as a spastic in a magnet factory? He was pointing out that Steam games run just fine on Linux, regardless of DirectX.

Re:Yeah right (1)

Whiteox (919863) | more than 3 years ago | (#35541102)

You are right. But don't forget that DOS was just an OS, and it was the exe file, often assembly-coded with huge tables for graphics, collision detection and so on, that did all the work.

Re:Yeah right (5, Insightful)

SCPRedMage (838040) | more than 3 years ago | (#35541110)

There is no DirectX on Linux, and just look at how laughable the situation is. Yeah, there's NetHack and some clone of Civilization 2 with worse graphics, but it's a far cry from the console and PC games that gamers actually play. It's a joke.

Don't blame the lack of DirectX for the lack of games on Linux. OpenGL works just fine on it, as it does on Windows.

And Mac, much to the delight of the four people who want to play games under OS X.

As far as getting rid of graphics APIs goes, yeah, that's exactly what we need: to go back in time fifteen years and make devs write their games for every piece of graphics hardware under the sun. There's a damn good reason the industry started using them, and it's still as relevant today as it was back then.

Re:Yeah right (4, Informative)

perpenso (1613749) | more than 3 years ago | (#35541450)

And Mac, much to the delight of the four people who want to play games under OS X.

Last I heard you are about 5 orders of magnitude off with respect to Mac users playing World of Warcraft. :-)

Re:Yeah right (4, Insightful)

CharlyFoxtrot (1607527) | more than 3 years ago | (#35541608)

Don't blame the lack of DirectX for the lack of games on Linux. OpenGL works just fine on it, as it does on Windows.

And Mac, much to the delight of the four people who want to play games under OS X.

Don't forget iOS! Pretty popular gaming platform these days, and it supports OpenGL ES 2.0.

Re:Yeah right (1)

tepples (727027) | more than 3 years ago | (#35541636)

Don't blame the lack of DirectX for the lack of games on Linux. OpenGL works just fine on it, as it does on Windows.

Really? I've been told that the proprietary OpenGL drivers on Linux aren't that good quality, especially AMD's.

Re:Yeah right (0, Flamebait)

somersault (912633) | more than 3 years ago | (#35541112)

Another fucking MS shill comment from a brand-new user, surprise surprise. Do you really think we're so dumb here that spouting some ancient stereotypical BS, then using words like "innovation" and "truly" in the same post as Microsoft, is going to fool us?

You could have at least mentioned Tux Racer or Alien Arena if you didn't want your comment to look like it was written in the '90s... It isn't about Linux/Microsoft, or NVIDIA/AMD in this case; it's about Direct3D/OpenGL. Mac OS still has modern games despite not having DirectX. Likewise, so does the PS3.

Re:Yeah right (0, Flamebait)

Anonymous Coward | more than 3 years ago | (#35541252)

Re: "shill", I don't think that word means what you think it means. It's not a catch-all term for "anyone who doesn't hate Microsoft", you fucking typical Slashdot illiterates. Ah, but it's my fault really, expecting intelligence from a festering den of nerd leftoids.

Re:Yeah right (2, Insightful)

Anonymous Coward | more than 3 years ago | (#35541422)

As someone who has tried them and got pissed off multiple times at the APIs: yes, the APIs need to be much thinner.

Let's do a quick comparison of how stupidly inefficient game development is...
1. Xbox/Xbox 360 - DirectX, Managed C#
2. Wii/GameCube - OpenGL, C/C++
3. PS2/PS3 - OpenGL, C/C++
4. PC - DirectX, Managed and Unmanaged C, C++, C#, OpenGL
5. Mac - OpenGL, C/C++, ObjC
6. Linux - OpenGL, C/C++
7. Android - OpenGL, Java/Native C/C++ maybe
8. iOS - OpenGL, C/C++/ObjC
9. Windows Phone 7 - DirectX, Managed C#
10. All the other mobile phones and devices - Not DirectX

So the conventional wisdom is that having two different APIs (DirectX and OpenGL) complicates things, much as writing libraries in both C and C++ does.
Both APIs were written after the fact to replace existing proprietary 2D APIs, where everything was done on the CPU and the graphics card simply acted as a display buffer.

If we were to throw -everything- away and start from scratch, everyone should standardize on a unified C-like base that allows this close-to-metal programming. Instead of having million-line Makefiles to determine whether the required system libraries are installed at compile time, have one unified compiler that compiles the C-like language into an intermediate language, plus security/signing to verify it hasn't been altered. Then, on the target system, the system compiler would verify the code hasn't been altered, compile it, and cache it as native binaries that use the hardware close to the metal, with no intermediate APIs, libraries, or other overhead. The entire process would be quick, and as you upgrade your devices, the new system could make better use of new hardware, or the application could run in its original profile (avoiding the type of problem we have when old software stops working because the old shared libraries no longer exist and the new hardware is too fast).
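The ship-signed-IR-then-compile-on-target idea above can be sketched as a toy pipeline. Everything here is made up for illustration (a real system would use public-key code signing, not a shared secret, and a real native code generator rather than Python's `compile`):

```python
import hashlib
import hmac

SECRET = b"toy-vendor-key"  # hypothetical shared key, for illustration only

def package(ir_source: bytes) -> dict:
    """Developer side: bundle intermediate code with a signature."""
    sig = hmac.new(SECRET, ir_source, hashlib.sha256).hexdigest()
    return {"ir": ir_source, "sig": sig}

def install(pkg: dict):
    """Target side: verify the IR wasn't altered, then 'compile' it for
    this machine. Python's compile() stands in for native codegen; the
    result would be cached as the machine-specific binary."""
    expected = hmac.new(SECRET, pkg["ir"], hashlib.sha256).hexdigest()
    if not hmac.compare_digest(pkg["sig"], expected):
        raise ValueError("intermediate code was altered")
    return compile(pkg["ir"], "<ir>", "exec")

pkg = package(b"result = 6 * 7")   # shipped artifact: IR + signature
native = install(pkg)              # verified and compiled on the target
```

A tampered package (IR swapped, old signature) would fail verification in `install` before any code generation happens, which is the property the comment is after.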

Re:Yeah right (1)

dagamer34 (1012833) | more than 3 years ago | (#35541708)

That's why developers create middleware engines to abstract away most of the low-level stuff: because they don't want to be bothered with it. Ever heard of Unreal Engine 3?

Re:Yeah right (1)

icebraining (1313345) | more than 3 years ago | (#35541902)

The big titles for the Xbox 360 are developed in C/C++, not managed C#.

In fact, it's kind of obvious, considering the massive work it would require to port games between the Xbox 360 and the PS3.

Re:Yeah right (1)

Dunbal (464142) | more than 3 years ago | (#35541140)

Microprose was an awesome company and I had all of their games from Silent Service up to and including Falcon 4.0. It's too bad that the company got swallowed whole and recycled so many times.

I agree that developer laziness is behind many development problems and it's not just limited to DirectX. Look at that steaming pile of horseshit called Bink, which was very popular at one point despite being a festering abscess of sloppy code. Look at some current (cough Miles cough) sound drivers that cause popular (cough TotalWar) titles to crash or have weird sound bugs.

The old saying "if you want something done right, do it yourself" applies here. However, we have to look at the other side of the coin. With DirectX, Microsoft promised to provide the magical "universal API" that would conceal all hardware issues, providing a standard interface for developers while the OS took care of the problem of hardware drivers. Microsoft failed to deliver - once the OS shipped, keeping drivers up to date was no longer a priority; that became the responsibility of the hardware OEM.

Developers used to maintain their own in-house tools for the popular brands of sound/video cards (Select 1 for Sound Blaster, 2 for Ad Lib, etc) but now the list of hardware has grown so large that it's impossible to stay current, let alone try to innovate.

And finally hardware manufacturers have pursued a "trade secret" approach (until very recently) towards their hardware, releasing very little information to developers let alone the public.

All of these factors, and not just laziness on the part of developers, have contributed to the current situation where we are entirely dependent on Microsoft to provide the API that works.

I think that some huge developers (like, say, EA) could and should use their clout to improve the situation - "fix the following bugs in your drivers or we go with someone else" - with the more popular tools used in today's games. Unfortunately, it's usually the finance/marketing department that ends up having their way, and buggy games get shipped. They know that we suckers will still buy them, because every other game from everyone else is buggy too. Honestly, nowadays I am surprised to get a game that actually runs without first applying a 300+MB patch.

Re:Yeah right (2)

DJRumpy (1345787) | more than 3 years ago | (#35541230)

I agree with some of your points but disagree with a few:

MS was largely successful with DirectX: the goal of letting developers largely ignore the graphics hardware while targeting a standard API was met. For good or ill, it's driven game design on the dominant platform for years, and arguably kept OpenGL on the defensive.
MS may provide driver support with their OS because out-of-the-box support is to their benefit, but they have never been the best at driver support. They leave that to the GPU vendors, and rightly so; MS is often months or years out of date when it comes to drivers.
Last but not least, developers are not slaves to the MS API. They can always choose OpenGL, with the added benefit of portability to other platforms. They simply choose not to, whether that's down to cost, development time, or simple unfamiliarity with OpenGL as opposed to DirectX.

I think the lack of perceived difference between consoles and PCs these days has a lot to do with the lowest common denominator, which unfortunately happens to be an aging dedicated gaming console, and with developers and publishers lacking the money, time, and resources to fully stretch the legs of the newest hardware when they can target that lowest common denominator and get by with 'ok' on a PC.

Lowest common denominator is Intel's Graphics My (1)

tepples (727027) | more than 3 years ago | (#35541722)

I think a large part of the lack of perceived difference between consoles and PC's these days have a bit to do with the least common denominator, which unfortunately also happens to be an aging dedicated gaming console

As I understand it, the lowest common denominator console for "grown-up" consoles (Xbox 360) is far more powerful graphically than the lowest common denominator PC (any PC with integrated video). Half a GB of RAM and an AMD Radeon X1900 beat 2 GB of RAM and an Intel "Graphics My Ass".

Re:Lowest common denominator is Intel's Graphics M (0)

Anonymous Coward | more than 3 years ago | (#35541842)

You do realize the 360 has a 500MHz ATI Xenos? Hardly top of the line. It's a 2005 graphics card.

Re:Yeah right (0)

Anonymous Coward | more than 3 years ago | (#35541812)

As far as OpenGL on Windows, I would guess the developers choose not to for a couple of reasons. First, Microsoft did try to kill OpenGL support completely. They did back off of that, but they certainly had it in their plan to not ship it at all. Second, the state of OpenGL in the graphics drivers on Windows is abysmal. I work at a large corporation where we have folks who do geological modeling and whenever we get in a vertical app that uses OpenGL we cringe. The drivers (from both nVidia and ATI) just suck. They render wrong, crash machines, etc. You are always fighting some stupid bug with them. When the app uses DirectX it usually works OK. This isn't the folks writing the OpenGL app screwing up. It is the lack of development and testing effort on the part of the driver manufacturer. Anyway, it is truly to be avoided until ATI and nVidia decide to devote the resources to actually making it solid.

Re:Yeah right (2)

Tapewolf (1639955) | more than 3 years ago | (#35541144)

Am I the only one who remembers the demo scene? Pure DOS. No DirectX.

Stars, Wonders of the World (1995) [youtube.com] - (Contains brief cartoon nudity near start).

Re:Yeah right (1)

Tapewolf (1639955) | more than 3 years ago | (#35541174)

Addendum, for those too pressed for time to watch the entire 6:20 demo, the intro finishes at 1:11. Highlights include the face-through-the-wall at 3:13 and the hula-hoop scene at 4:35.

Re:Yeah right (2)

jedrek (79264) | more than 3 years ago | (#35541204)

I remember the demo scene. I remember having to use QEMM to get enough ram for the demos to run, then having them crash. I remember some demos working on my gfx card and not my friends', I remember having drivers for specific sound cards, etc.

Re:Yeah right (1)

sosume (680416) | more than 3 years ago | (#35541286)

No AI. Color cycling. Fractals. What you saw in the demo scene in the eighties is now available as a visualisation plugin for Media Player or Winamp. It looked impressive back then, but they were mere hacks pushing the metal to its fullest. Still, they all used the same similar tricks. I watched a few of those again some time ago and was not impressed anymore.

Re:Yeah right (1)

McTickles (1812316) | more than 3 years ago | (#35541240)

I miss those days when one coder could actually do what the heck they wanted with the hardware.

Re:Yeah right (1)

swalve (1980968) | more than 3 years ago | (#35541448)

How does an API change what you can do with the hardware?

Re:Yeah right (0)

Anonymous Coward | more than 3 years ago | (#35541634)

MS did, when they required card vendors to add tilt bits and other DRM into the hardware. Windows doesn't allow direct access to gfx or sound hardware in case you try to pirate stuff. Protected pathways and all that crap.

Re:Yeah right (-1, Offtopic)

anomaly256 (1243020) | more than 3 years ago | (#35541154)

Amen, mod parent up. Troll? wtf? what shill modded troll?

Re:Yeah right (3, Interesting)

Tapewolf (1639955) | more than 3 years ago | (#35541208)

Amen, mod parent up. Troll? wtf? what shill modded troll?

Well, I suspect the reason it is considered a troll is because it rewrites history and ignores the facts in order to support its conclusion.
Stuff like ignoring the thriving DOS games market prior to 1998 or so, when Windows finally took over, and brushing OpenGL and SDL under the carpet. I imagine that picking things like NetHack and FreeCiv as a snapshot of Linux gaming, when you had Wolfenstein 3D, Sauerbraten and various other 3D-accelerated games, was what pushed the moderators over the edge. I certainly wouldn't pick Solitaire as an example of what Windows gaming looks like, and I loathe Windows.

Re:Yeah right (1)

Jaktar (975138) | more than 3 years ago | (#35541862)

Never attribute to malicious intent that which can be attributed to ignorance.

If you're like most people out there, you haven't really given *nix a try.

I'm not a dev. I don't know why OpenGL is not as popular as DirectX. I only know that DirectX has all the games I currently play. I can only guess why that is.

Re:Yeah right (0)

Anonymous Coward | more than 3 years ago | (#35541622)

I worked for Spectrum Holobyte and later Microprose after the buyout and name change in the 90s, back when I lived in Alameda. Were you one of the idiot game testers by any chance?

Re:Yeah right (5, Interesting)

CastrTroy (595695) | more than 3 years ago | (#35541750)

Yes, things were so much better back in the day, when you had to have a very specific graphics card, or audio card, or joystick, or the game wouldn't work. Developers had to code for each piece of hardware individually. If you bought a 3dfx Voodoo card, there was a bunch of games you could play, and a bunch you couldn't. If you bought the Gravis Ultrasound, you were very much out of luck, because most stuff was coded for the Sound Blaster and a lot of stuff lacked support for your third-party sound card. Joystick support was a complete mess. Also, games don't look ten times as good because then they could only run on 1% of the machines, and that's not a big enough market. Sure, faster computers exist, but the computers most people own are probably about as powerful as a console, especially if you look at the graphics chip.

Unification? (5, Insightful)

paziek (1329929) | more than 3 years ago | (#35541070)

Aren't DirectX and OpenGL there so that a developer can write an application using DirectX 10 and have it work with any card capable of DirectX 10 that has enough memory? Are we gonna have "Works best in Internet Explorer 6" again, but for graphics cards? I still remember the whole 3dfx thing, and I didn't like it.

Re:Unification? (5, Insightful)

smallfries (601545) | more than 3 years ago | (#35541106)

The whole 3dfx era was horrific, and as someone has already pointed out below DirectX made a huge positive impact in PC gaming. The article describes a real problem though: if I want to hit 50fps then my rendering needs to execute in under 20ms. Performing 5k system calls to draw chunks of geometry means that each syscall needs to be less than 4us, or about 12000 cycles on a 3Ghz processor. That is not a lot of time to do all of the internal housekeeping that the API requires and talk to the hardware as well.
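The frame-budget arithmetic above works out as follows (a trivial sketch that just restates the numbers in the comment):

```python
# Draw-call budget for a 50 fps target, as described above.
target_fps = 50
frame_budget_s = 1.0 / target_fps           # 0.02 s = 20 ms per frame
draw_calls = 5_000                          # chunks of geometry per frame
per_call_s = frame_budget_s / draw_calls    # 4e-06 s = 4 us per draw call
cpu_hz = 3e9                                # 3 GHz processor
cycles_per_call = per_call_s * cpu_hz       # 12,000 cycles per call
print(round(cycles_per_call))               # 12000
```

Twelve thousand cycles sounds like a lot until you subtract the cost of the user/kernel transition, driver validation, and state housekeeping on every call.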

The solution is not to throw away the API. The interface does need to change drastically, but not to raw hardware access. More of the geometry management needs to move onto the card and that probably means that devs will need to write in some shader language. It's not really lower-level / rawer access to the hardware. It is more that shader languages are becoming standardised as a compilation target and the API is moving on to this new target.

Re:Unification? (3, Interesting)

JackDW (904211) | more than 3 years ago | (#35541192)

This is a very good point, the overhead of API calls can be a significant bottleneck.

I'd suggest that a good solution is to move applications to entirely managed code (e.g. C#), so that there is no need for any hardware-enforced barrier between the kernel and the applications (cf. Singularity [microsoft.com]). In the best case, you may end up with a situation in which a JIT compiler inlines parts of the kernel's graphics driver directly into the application code, effectively run-time specialising the application for the available hardware. We already see hints of this happening, for instance the use of LLVM bitcode in Apple's OpenGL stack [wikipedia.org].

Re:Unification? (1)

cpu6502 (1960974) | more than 3 years ago | (#35541202)

Since you seem knowledgeable:

How is this handled on the consoles? Do the programmers go direct to the hardware, as they did in the days of the N64 and PS1, or do the modern PowerPC-based consoles also have a DirectX-style interface?

Re:Unification? (1)

Sam Douglas (1106539) | more than 3 years ago | (#35541304)

I have no experience in console games development, but I suspect that modern console SDKs have graphics APIs available to them that are higher level than hardware access. I believe the Xbox consoles use a DirectX-like API.

Re:Unification? (0)

Anonymous Coward | more than 3 years ago | (#35541314)

Quite simple really. Each console has only one type of graphics card, so you write an API that is a light wrapper around direct calls to the GPU, and then you inline the API calls in the client code.

Re:Unification? (0)

Anonymous Coward | more than 3 years ago | (#35541374)

The Xbox 360 uses DirectX and I think the PS3 might use something resembling OpenGL. The consoles generally don't do as much work as desktop cards are able to do so the API bottlenecks are less of a factor.

Re:Unification? (1)

Anonymous Coward | more than 3 years ago | (#35541512)

Going off my old and rusty memory:
The Xbox 360 uses pretty much a DirectX 9 API with some changes (the sound API is, from what I remember, XAudio2-like).

The PS3 has OpenGL ES, but to get the speed out of the machine you need to write very PS3-specific code, which is a pain.

An API isn't the problem; it's how flexible it is. Most things are offloaded to the GPU, so you load the geometry, textures, etc. into GPU-accessible memory and use vertex/pixel shaders, the former for doing animation and whatnot wherever possible.

Re:Unification? (1)

tepples (727027) | more than 3 years ago | (#35541736)

The Xbox 360 is most often* programmed in C# using the XNA API, which is very much a managed counterpart to DirectX.

* I can explain what I mean by this.

Re:Unification? (-1)

Anonymous Coward | more than 3 years ago | (#35541880)

Libertarian thief makes everyday a new account. But, he still is just a retarded thief.

Re:Unification? (0)

Anonymous Coward | more than 3 years ago | (#35541276)

What makes things more difficult is that you cut most of the geometry (you do away with the hidden parts) before sending it to the video card, for faster drawing.

So yeah, you could do geometry transformation on the card, but then you'd have to load more geometry data.

Re:Unification? (1)

Twinbee (767046) | more than 3 years ago | (#35541562)

I just wish they gave us an easier way to access the gfx card's pixel buffer - you know, what ultimately comes out on the monitor. It's ridiculous how much code is needed to write a single pixel to a screen or window, especially if animation/video is involved.
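For comparison, this is roughly what "just give me the pixel buffer" looked like in the DOS era: treat video memory as a flat byte array and poke pixels into it. The sketch below uses a plain bytearray as a stand-in for a mapped 32-bpp framebuffer; the dimensions and BGRA byte ordering are illustrative assumptions, not any real API:

```python
import struct

WIDTH, HEIGHT, BPP = 640, 480, 4          # 32 bits per pixel
fb = bytearray(WIDTH * HEIGHT * BPP)      # stand-in for mapped video memory

def put_pixel(x: int, y: int, r: int, g: int, b: int) -> None:
    """Write one pixel directly into the 'framebuffer' (BGRA byte order)."""
    off = (y * WIDTH + x) * BPP
    fb[off:off + BPP] = struct.pack("4B", b, g, r, 0)

put_pixel(10, 20, 255, 0, 0)              # one red pixel at (10, 20)
```

The appeal is obvious (no API, no state objects, no swap chains); the cost, as other comments note, is that the OS can no longer compose windows, enforce isolation, or let the GPU accelerate anything.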

Re:Unification? (1)

tepples (727027) | more than 3 years ago | (#35541764)

They did. It was called DirectDraw. But if your game is running in a window, direct access to the frame buffer can violate the user's privacy. Microsoft deprecated DirectDraw in favor of Direct3D and later Direct2D because even an Intel GMA is substantially faster than software rendering. As for video, why can't you generate that into a texture and draw it as a quad?

Re:Unification? (0)

Anonymous Coward | more than 3 years ago | (#35541758)

Nobody should be using shader languages any more. You should be using DirectCompute or OpenCL as the best portable way to program the various GPUs.

Shader languages were just a stopgap until we could get something more general-purpose.

DirectX and DirectCompute go together for Windows-specific solutions. OpenGL and OpenCL go together for truly portable solutions (basically all platforms and GPUs).

Re:Unification? (0)

Anonymous Coward | more than 3 years ago | (#35541770)

Performing 5k system calls to draw chunks of geometry means that each syscall needs to be less than 4us, or about 12000 cycles on a 3Ghz processor. That is not a lot of time to do all of the internal housekeeping that the API requires and talk to the hardware as well.

What is this, the 1940s, when CPUs didn't pipeline instructions?

Ten times the tech != ten times the quality (1)

citoxE (1799926) | more than 3 years ago | (#35541074)

You can have as much technology as you want in a computer; in the end, graphics can only be so good. Maybe the fact that console graphics can rival PC graphics (supposedly) says more about the PC than it does about the console. Gaming PCs are still better than any console if you care about more than just how pretty your game looks on your monitor.

Re:Ten times the tech != ten times the quality (1)

MrHanky (141717) | more than 3 years ago | (#35541228)

Console graphics can't really rival PC graphics. Take a look at this comparison of PC vs Xbox vs PS3 in GTA4 [gamespot.com] . Then consider the fact that most modern gaming PCs (i.e. quad core with a midrange GPU or better) easily run the game at 1920x1080 @60 FPS, whereas the consoles use lower resolution and get choppier frame rates.

Re:Ten times the tech != ten times the quality (0)

Anonymous Coward | more than 3 years ago | (#35541292)

Console graphics can't really rival PC graphics.

For crappy ports. Check the exclusive native games for each console and you'd be surprised at how good they look. (PS3: GT5, Killzone 3, Uncharted 2, God of War 3, Heavy Rain...)

Re:Ten times the tech != ten times the quality (1)

Tridus (79566) | more than 3 years ago | (#35541604)

And none of those look as good as something like Crysis 2 that's got some decent PC optimizations.

The power comparison isn't even close, and that's if you just look at CPU/GPU speed & features and ignore stuff like RAM. The consoles are so memory starved that it's had a major impact on game design.

Re:Ten times the tech != ten times the quality (2)

Raenex (947668) | more than 3 years ago | (#35541430)

Console graphics can't really rival PC graphics. Take a look at this comparison of PC vs Xbox vs PS3 in GTA4 [gamespot.com].

I took a look. Totally underwhelmed at the differences. The problem is we have reached diminishing returns in graphics quality per hardware improvement.

I remember the jumps in each generation of PC and console graphics before 2000, and each one was huge and made the earlier generation look dated. When the PS3 and 360 came out, they were clearly better, but they weren't *that* much better. The same goes for today's PC graphics vs the aging consoles.

I don't see this situation changing until realtime ray tracing with realistic looking people (even the best today look like mannequins) comes about. That is, when you can't tell at a glance that you're looking at a game.

Re:Ten times the tech != ten times the quality (0)

Anonymous Coward | more than 3 years ago | (#35541446)

GTA IV is over three years old... you can't compare that game to the current level of graphics on the PC. Just look at Crysis 2.

My opinion on this quote is that he simply doesn't understand how game development works. He's ignorant; I don't know how AMD will deal with NVIDIA with this guy at the helm.

Re:Ten times the tech != ten times the quality (1)

click2005 (921437) | more than 3 years ago | (#35541666)

Crysis is a better example. The engine used in Crysis 2 (Cryengine3) has been dumbed down so it'll work on consoles.

Re:Ten times the tech != ten times the quality (1)

MrHanky (141717) | more than 3 years ago | (#35541800)

You're not looking. The leaves on the trees, for instance. The PC version is clearly more detailed. When looking at them in full resolution, there's really no comparison.

Re:Ten times the tech != ten times the quality (0)

Anonymous Coward | more than 3 years ago | (#35541236)

A large part of this effect is that artistic quality matters much more than technical quality, especially to normal (non-nerdy) people. I've had friends over who gasped about how beautiful an SNES RPG I played in an emulator was. And as the technical differences get smaller and less easily perceived, that effect will only get bigger.
It is already the case that a few 3D games from ten years ago look better than some of the most high-end games published now. For now it's only a few, but as the years pass, there will be more and more old games able to compete for longer and longer against the latest releases.
And I don't think that's a bad thing. It will force game developers to focus on other things that are much more important to me, e.g. story, artistic quality, or a sense of polish in the UI and game mechanics.

Credit (3, Insightful)

calzakk (1455889) | more than 3 years ago | (#35541080)

Before Windows 95 and DirectX there was MS-DOS. Let's at least give credit where credit's due; DirectX has had a huge positive influence on Windows and Xbox gaming.

Re:Credit (0)

Anonymous Coward | more than 3 years ago | (#35541234)

Which is bad for everything else. No explanation needed here.

Re:Credit (1)

Sam Douglas (1106539) | more than 3 years ago | (#35541388)

DirectX helped to standardise the feature set of graphics cards, as well as provide a partially hardware-independent programming model. Without that, things would be worse for developers (who would have to target many different hardware configurations and quirks) and for gamers (who benefit from cheaper hardware, a choice of manufacturer that doesn't affect which games can be played, and older cards that stay useful longer).

Developing games is still hard; DirectX doesn't make everything cushy, but it does limit the variance in device capabilities across generations.

Re:Credit (1)

Pharmboy (216950) | more than 3 years ago | (#35541362)

But isn't asking to have "more direct, low level access" to the hardware EXACTLY like asking for the DOS days again, in a way? That was the first thing I thought. In reality, it would allow for faster game experience and better utilization of the hardware. Of course, this makes programming games a freaking nightmare as there are a million possible combinations, which would mean fewer games in that mode.

I always thought a "dedicated game mode" for the OS would be interesting, where all other services are put to sleep and the system virtually reboots to a more plain-Jane state, with just networking and low-level services. (Yes, like the old DOS days, but in a more controlled and predictable way.) Of course, that would be a perfect vector for infecting a system... that, and MS doesn't use a microkernel design, which is what we'd likely be talking about.

A better explaination? (2)

mustPushCart (1871520) | more than 3 years ago | (#35541130)

I RTFA and I still didn't understand why the API is the bottleneck, why PC draw calls are one third of the draw calls possible on the consoles, and why going direct to the metal gives you an orders-of-magnitude performance boost once you account for both sets of hardware. Does DirectX reject the stream processors? Or what, exactly?

Hardware needs to change DX is obsolete. (5, Interesting)

goruka (1721094) | more than 3 years ago | (#35541158)

Disclaimer: I am a pro game developer, wrote a few engines for commercial games, etc. I know what this guy means and I'll try to explain it a bit better. The biggest problem with the DX model (which was inherited from GL) is the heavy dependency on the CPU to instruct the GPU what to do.
State changes and draw commands are all issued by the CPU, buffered, and then processed by the GPU. While this speeds up rendering considerably (the GPU is always a frame or two behind the CPU), it makes getting feedback from the GPU about the rendering state very limiting: since all the DX/GL commands are buffered, retrieving state or data means a flush/sync.
Everything from modern algorithms for occlusion estimation and global illumination to the overall reduction of state changes would benefit greatly if, for most tasks, the GPU could act by itself, running a user-made kernel that issues the commands and state changes instead of relying on DX. But for some reason this is not the direction GPUs are heading in, and it really doesn't make sense. Maybe Microsoft has something to do with it; since DirectX 9 became the standard for game development, the API has only become easier to program in versions 10 and 11, without any major changes.

Re:Hardware needs to change DX is obsolete. (0)

Anonymous Coward | more than 3 years ago | (#35541190)

So basically remove the CPU from being the middle-man between the game and the GPU?

Re:Hardware needs to change DX is obsolete. (1)

Anonymous Coward | more than 3 years ago | (#35541214)

It didn't inherit it from OpenGL, really. SGI started with OpenGL and realised that in order to really scale... you need the graphics card to hold all the data, and the CPU to give it higher-level instructions, not have the CPU telling the GPU to draw polygons.

Stuff like OpenSceneGraph are the grandchildren of all that work. Microsoft just never bothered.

Re:Hardware needs to change DX is obsolete. (2)

goruka (1721094) | more than 3 years ago | (#35541250)

I think the problem is not so much the CPU being the middle man as the CPU having to issue every draw/state-change call. Instancing is one of many examples of why the current model is wrong.
A for() loop drawing an object 5000 times is slow because there is a lot of CPU-GPU communication. Instancing fixes this but makes it less flexible (you can't change which arrays are drawn, or most of the state, between objects).

Re:Hardware needs to change DX is obsolete. (1)

Dayofswords (1548243) | more than 3 years ago | (#35541638)

(super newbie programmer, merely guessing)
So rather than having to go to the CPU ask what's next, you just give the GPU data, package of commands(in form of algorithms, like the for loop) finish doing those then come back to the CPU for the next set?

i assume right now it's cpu does for, gpu draws, cpu does for, gpu draws
and what you want is cpu gives a package of stuff, gpu executes and draws

basically, have the GPU do some of the thinking
maybe?

Re:Hardware needs to change DX is obsolete. (1)

Twinbee (767046) | more than 3 years ago | (#35541568)

I'm not so sure that it's not the direction GPUs are heading. For example, NVidia's future Maxwell chip that combines a CPU and GPU into one will have true multitasking, and maybe take the pressure off the CPU completely. Perhaps AMD's Fusion already supports that kind of thing? You're right though, it would be great to have DMA to the GPU.

Re:Hardware needs to change DX is obsolete. (4, Interesting)

Zevensoft (1784070) | more than 3 years ago | (#35541584)

I've programmed DS game engines as well as high-performance industrial OpenGL, and the frustrating thing about OpenGL (or DX; they're both just wrappers around NV or AMD) is the inability to send data in the other direction, i.e. from the GPU to the CPU, without killing performance. The DS didn't have that problem because the vertex processor was decoupled from the pixel processor, and you could even redirect outputs wherever you liked, as well as having full access to the 4-channel DMA controller! We would do occlusion culling on the vertex processor before animation, and also reduce polygon counts for the rasteriser.

Re:Hardware needs to change DX is obsolete. (3, Interesting)

NewWorldDan (899800) | more than 3 years ago | (#35541678)

I suspect one of the reasons for this is that Microsoft has taken the view, in the last 6-7 years, that the GPU can be used for accelerating and enhancing the desktop experience (Aero, IE9). Their other goal, to a certain extent, is cross-platform compatibility: making it possible to write casual games for Windows, phone, and Xbox.

Disclaimer: I wrote a game way back in 1994, directly interfacing with the VGA card, in straight x86 assembly. I was total bare metal 17 years ago. I haven't really kept up on game development much since then. However, I wrote a clone of it in XNA recently. It took me about 4 hours to replicate 9 months of work from 1994, and that includes the time to download, install, and learn XNA. My, how things have changed.

Slashdot does it again... (-1)

Anonymous Coward | more than 3 years ago | (#35541162)

One more anti-Microsoft article on Slashdot.... Surprised?

Try to remember the world before DirectX, if you can: games that would only run on a couple of cards, vendor-specific drivers, redundant low-level engines for every game, developer grief over detailed explanations of how to configure ini files just to get games to work. And I'm not even talking about the DOS days. Newer versions of OpenGL are already DOA for games when compared to DirectX. Can DirectX improve? Yes. Is DirectX the reason games are not making progress? No. DirectX is one of the main reasons game graphics made such a huge leap in quality over the past 10 years.

Re:Slashdot does it again... (1)

_Shad0w_ (127912) | more than 3 years ago | (#35541316)

It is possible for something which was innovative and liberating to become stale and restraining, you know.

Games Tied To Hardware? (2)

borrrden (2014802) | more than 3 years ago | (#35541166)

So are they implying that they'd rather develop a game for a very specific set of hardware? Seems like an awful business model to me. Two of the reasons console games look good on lower-spec hardware are that consoles are designed solely for gaming, and that their specs do not change throughout the life cycle of the device, so there is no need to develop for a broad base of hardware types. PC hardware, on the other hand, is constantly evolving, and multitasking is always going on. Scrap the API and develop directly for the hardware, and see what it gets you: a lot of angry customers once they upgrade their card and the game doesn't work anymore.

Re:Games Tied To Hardware? (2, Interesting)

Anonymous Coward | more than 3 years ago | (#35541506)

Nope. Right now the GPU-CPU situation looks like my boss dictating an email to his secretary - it probably wouldn't take as long if he just told her to inform the recipient he's going to be late. The developers want all possible API ops moved to the GPU where the CPU doesn't get in the way. They still want a standard API and most certainly don't want to develop straight for the metal.

Linux? (2, Insightful)

Anonymous Coward | more than 3 years ago | (#35541200)

Alright AMD. Make a game for Linux. That will give you the lower level access you want. Impress me :)

Re:Linux? (0)

Anonymous Coward | more than 3 years ago | (#35541440)

I want BF3 for Linux! That would be awesome. :)

Re:Linux? (0)

Anonymous Coward | more than 3 years ago | (#35541868)

You fail both in understanding what he's saying, and in somehow thinking AMD is a game development house.

Once again (2)

McTickles (1812316) | more than 3 years ago | (#35541212)

OpenGL is the way to go: no more porting headaches, thanks to very wide support across platforms.

I also find that OpenGL code tends to be more straightforward and cleaner.

Re:Once again (0)

Anonymous Coward | more than 3 years ago | (#35541258)

You are right, but the rubes mod'ing you are morons.

If true where are the.. (0)

Anonymous Coward | more than 3 years ago | (#35541220)

10x-better-than-a-console graphics demos on Linux for AMD GPUs?

Funny, John Carmack thinks just the opposite (1)

AlienIntelligence (1184493) | more than 3 years ago | (#35541246)

John Carmack is quoted as saying almost the exact opposite:
[ http://techreport.com/discussions.x/20580 [techreport.com] ]
[ http://www.bit-tech.net/news/gaming/2011/03/11/carmack-directx-better-opengl/1 [bit-tech.net] ]

Eight days ago
[ http://games.slashdot.org/story/11/03/11/1832205/Doom-Creator-Says-Direct3D-Is-Now-Better-Than-OpenGL [slashdot.org] ]

For the lazy clickers:
Speaking to bit-tech for a forthcoming Custom PC feature about the future of OpenGL in PC gaming, Carmack said 'I actually think that Direct3D is a rather better API today.' He also added that 'Microsoft had the courage to continue making significant incompatible changes to improve the API, while OpenGL has been held back by compatibility concerns. Direct3D handles multi-threading better, and newer versions manage state better.'

In case you're unfamiliar with the mighty Carmack, he co-founded id Software in 1990, and had a large part in programming Wolfenstein 3D and the original Doom and Quake games. Since then, id has rigidly stuck by OpenGL for both Doom III and Quake 4, while many other cutting-edge PC game developers have moved entirely over to Direct3D.

Well, I did say, almost.

-AI

Re:Funny, John Carmack thinks just the opposite (3, Insightful)

bigstrat2003 (1058574) | more than 3 years ago | (#35541410)

The two issues under discussion are different. TFA says that DirectX is holding back PC gaming, while Carmack says DirectX is better than OpenGL. Those two are not mutually exclusive.

Re:Funny, John Carmack thinks just the opposite (2, Informative)

Anonymous Coward | more than 3 years ago | (#35541436)

That's not true.

If you're going to pretend to be knowledgeable, then it's a good idea to at least read the article.

Carmack's talking about OpenGL vs DirectX. Arguably, DirectX is now a better API for writing games than OpenGL. I say arguably because I don't think it's a settled question; it is, however, one that's up for discussion, and it compares apples with apples.

This article, though, is about the model used by both DX and OpenGL, under which the CPU basically tells the GPU to draw each polygon (OK, it's a little more high-level than that, with shaders, VBOs etc., but essentially this is correct). On a console, the standard architecture means the developers hit the hardware directly and in essence make their own API, one that makes the most of the balance of power between the CPU and the GPU.

On the PC, despite the fact that PC GPU hardware (even the cheap stuff) is massively more powerful, the CPU is still making system calls through DirectX, effectively issuing very low-level commands to draw polygons. System calls are expensive, and this seriously limits the ability of PCs to put the humongous power of these GPUs to work drawing real-time scenes.

The article is calling for a different model from the one used by DX and OpenGL - one that solves this problem. See also: scene graphs.

Really? (1)

Phoshi (1857806) | more than 3 years ago | (#35541320)

That might make sense if PC graphics weren't 10x ahead of console graphics and yet we were maxing out our cards. We are not. A mid-range card handles even the most visually intensive games very well at above-console resolutions. Yes, we could get more power out of our cards; no, that is not the reason graphics are not improving.

Here's what's really getting in the way. (0)

Anonymous Coward | more than 3 years ago | (#35541472)

1. Whiny gamers who want it NOW NOW NOW
2. Impatient gamers who want it NOW NOW NOW
3. M$/$ony who pay dev teams to develop for consoles first, thereby stifling advancements for the PC. You can only port so well, guys.
4. Dev teams who decide to get paid by M$/$ony in order to make money up front, instead of more down the road.

Switch gears, people. DICE shouldn't be one of the only companies that seems to give a damn about doing it kind of right. All of you should give a damn about doing it completely right.

Console APIs vs PC APIs - an explanation (5, Interesting)

LordHavoc (1394093) | more than 3 years ago | (#35541474)

The way things work on consoles is approximately similar to Windows/Linux/Mac, except for these important distinctions:
1. the hardware is a known target, so the shader compilers and other components are carefully optimized for this hardware alone; they do not produce intermediate bytecode formats or make generic assumptions that must hold on all hardware.
2. the APIs allow injecting raw command buffers, which means that you do not have to use the API to deliver geometry in any way shape or form, the overhead goes away but the burden of producing a good command buffer falls on the application when they use these direct-to-hardware API calls.
3. the APIs have much lower overhead as they are not a middle-man on the way to the hardware, but an API implemented (if not designed) specifically for the hardware. For example Microsoft had the legendary Michael Abrash working on their console drivers.
4. the hardware memory layout and access bandwidth are known to the developers, and certain optimization techniques become possible, for example rendering to a framebuffer in system memory for software processing (on Xbox 360 this is done for certain effects; on PS3 it is heavily utilized for deferred shading, motion blur and other techniques that run faster on the Cell SPE units). In some cases this has other special implications, like the storage of sound effects in video memory on the PS3: the Cell SPE units have a separate memory path to video memory and can thus tap into this otherwise "unused" bandwidth for their sound mixing.
5. 3D stereo rendering is basic functionality on consoles.

The article is making the argument that we should be able to produce command buffers directly and insert them into the rendering stream (akin to OpenGL display-lists but new ones produced every frame instead of statically stored).

It is also making the argument that we should have explicit control over where our buffers are stored in memory (for instance rendering to system memory for software analysis techniques, like id Software's Megatexture technology, which analyzes each frame to determine which parts of the virtual texture need to be loaded).

There are more subtle aspects, such as knowing the exact hardware capabilities and designing for them, which are less of a "No API!" argument and more of a case of "Please optimize specifically for our cards!", which is a tough sell in the game industry.

AMD has already published much of the information that studios will need to make use of such functionality, for example the Radeon HD 6000 series shader microcode reference manual is public already.

Intel also has a track record of hardware specifications being public.

However NVIDIA is likely to require a non-disclosure agreement with each studio to unlock this kind of functionality, which prevents open discussion of techniques specific to their hardware.

Overall this may give AMD and Intel a substantial edge in the PC hardware market - because open discussion of graphics techniques is the backbone of the game industry.

On the fifth point, it is worth noting that NVIDIA GeForce drivers offer stereo rendering in Direct3D but not in OpenGL (despite OpenGL having had a stereo rendering API from the beginning); they reserve this feature for their Quadro series cards for purely marketing reasons, and this restriction prevents the use of stereo rendering in many OpenGL-based indie games, another case of consoles besting the PC in functionality for ridiculous reasons.

Re:Console APIs vs PC APIs - an explanation (0)

Anonymous Coward | more than 3 years ago | (#35541592)

Nvidia and AMD also artificially slow down double-precision shader performance purely for marketing, i.e. better sales. (Additionally, AMD even produces cheaper cards without double-precision support on the die.)
Full-blown double-precision cores are reserved for special, quite expensive professional (Quadro and FireGL) and GPGPU (Tesla; not sure about AMD) offerings. However, they will be screwed once games start depending on double-precision performance.

Yes, GPU usage should gradually move from the batched model to an SMP (shared-memory) model, where you basically run a multithreaded app in which a thread (group) runs on the GPU and communicates with the CPU part in a smarter, developer-written way rather than through batch buffering. Fusion should in theory be that, but for now it seems to be just an integrated GPU on the same die as the CPU.
As for the API: DirectX can still exist, but direct access should be possible via an instruction set, and/or an API that is basically a set of standard libraries executed in a virtualized environment on the GPU, analogous to how protected-mode programs run on an x86 CPU.
Oh yes, this really means that portions of the instruction set would have to be open, at least those used by the kernel to control the virtualization of "GPU user mode". The remaining secrets and differences between vendors can stay hidden in libraries.
Languages analogous to HLSL/GLSL can be used to make sure the user code runs everywhere, since you can always use the CPU to compile from high-level code to machine code instead of precompiling for every GPU architecture.

In 10 years the GPU and CPU will probably merge, so expect the kernel to have to control this heterogeneous CPU-GPU SMP model. If you like, Cell is a kind of predecessor of that; it just wasn't powerful enough to match a discrete graphics card.

Open hardware API (0)

h00manist (800926) | more than 3 years ago | (#35541526)

Perhaps then the GPU makers should be talking about implementing a common open spec for a hardware-level API.

Easy workaround (2)

WegianWarrior (649800) | more than 3 years ago | (#35541794)

Those of us who are old enough to remember a time before the GUI was the only show in town surely remember that "big" games almost always came with their own boot disk. Would it be so hard to go back to that, if the benefits were worth it? A DVD, or a flash drive, with a small Linux kernel, a library of drivers for the wide range of hardware out there and the game files - optimized for speed, with no loss of performance because a huge, bloated GUIed OS gets in your way. If the game developer uses an off-beat file system, it'll also prevent piracy!

Granted, it'll also bring back the bad old days of cursing up a storm because the latest game didn't support your Gravis Ultrasound, only the crappy SoundBlaster... and of course the game would have to include its own TCP/IP stack if you want multiplayer... and a few gigs of drivers for the various motherboards, graphics adapters and so on and so forth that the casual gamer may or may not have. But at least you wouldn't have to worry about a system put in place to simplify all that stuff getting in your way.

Get to the hardware? (2)

xtracto (837672) | more than 3 years ago | (#35541830)

By giving you access to the hardware at the very low level, you give games developers a chance to innovate

I am ready!

MOV DX, 03D4h   ; VGA CRTC index port
MOV AX, 06B00h  ; AH = value 6Bh, AL = CRTC register index 00h (Horizontal Total)
OUT DX, AX      ; word OUT: AL goes to port 3D4h, AH to data port 3D5h

Sounds like a good idea (1)

phanboy_iv (1006659) | more than 3 years ago | (#35541840)

'Oh hey! Let's start coding the graphics engines for our multi-million dollar games in a basic low-level chip-specific language! That'll let us squeeze the most out of that 5 year old GPU we have to use!'