
Khronos Releases OpenGL 4.2 Specification

Soulskill posted more than 2 years ago | from the onward-and-upward dept.

Graphics

jrepin tips news that the Khronos Group has announced the release of the OpenGL 4.2 specification. Some of the new functionality includes: "Enabling shaders with atomic counters and load/store/atomic read-modify-write operations to a single level of a texture (These capabilities can be combined, for example, to maintain a counter at each pixel in a buffer object for single-rendering-pass order-independent transparency); Capturing GPU-tessellated geometry and drawing multiple instances of the result of a transform feedback to enable complex objects to be efficiently repositioned and replicated; Modifying an arbitrary subset of a compressed texture, without having to re-download the whole texture to the GPU for significant performance improvements; Packing multiple 8 and 16 bit values into a single 32-bit value for efficient shader processing with significantly reduced memory storage and bandwidth, especially useful when transferring data between shader stages."


98 comments


Are they nuts? (-1, Redundant)

dev346 (2432780) | more than 2 years ago | (#37027226)

I see they added an optional extension to allow DirectX coexistence [aeonity.com], as far as using the same texture in both APIs, in the same application, at the same time...
Why?

Re:Are they nuts? (2, Informative)

wazzzup (172351) | more than 2 years ago | (#37027250)

PARENT LINK IS GOATSE

Re:Are they nuts? (2, Funny)

Anonymous Coward | more than 2 years ago | (#37027288)

It took me a good 30 seconds or so before I realized that I wasn't looking at some real-world DirectX code written in C++.

Re:Are they nuts? (1)

Loonacy (459630) | more than 2 years ago | (#37027712)

Yeah, it looks like C# to me.

Feature Bloat? (1)

pimpsoftcom (877143) | more than 2 years ago | (#37027276)

And the reason for all of this new stuff is, what exactly? Playing the devil's advocate: I can render tits just fine already, so why do I need this feature bloat as well?

From a practical perspective: the old standard is running fine on the games I love, and a new standard will just complicate things and force people to upgrade their hardware, since it looks like some of this stuff would be difficult at best to do in software alone and get good perf.

From a commercial perspective: great, now I have increased development costs as a result of a new standard that everybody will want to target just because it's new, even if my game doesn't need any of the new fluff.

From an open source perspective: I worry about the open source stack. It's already hard enough to develop games that matter for Linux, and impossible to get a dev house that isn't indie to take it seriously.

It looks to me like somebody (the standards body) is just, in the paraphrased words of a popular song about airplanes, "spec'n to stay relevant". Can somebody with more free time tell the rest of us the value this actually gives? Or is it so apparent, and I so old, that as the "old guy" I am just being bitter?

Re:Feature Bloat? (-1)

Anonymous Coward | more than 2 years ago | (#37027344)

You should read developer comments on OpenGL. Its API is far worse than DirectX's, and it has a ton of legacy features that people shouldn't be using anymore but that continue to be supported anyway.

OpenGL promised in the last version to cut away a lot of the features from really old versions, just like DirectX 10 did. This would have the disadvantage of breaking compatibility but the advantage of making it easier to support in the future and more efficient right now. Instead they maintained backward compatibility, which made it bloated and hard to use.

Not being a graphics programmer, I don't know how much of this is addressed in the newer version, but that would be the reason for all the changes! Just because it supports older games doesn't make it appropriate for future ones!

Re:Feature Bloat? (0)

pimpsoftcom (877143) | more than 2 years ago | (#37027468)

What you are failing to understand is that unless this new spec makes Minecraft run faster, most people will not care that the new spec was released. And adding this new spec to hardware increases costs, so most people will be against it on the basis that it costs them more money for the same thing. I'm not against progress by any means, but unless it's REAL PROGRESS and VALUE is ADDED, this sort of thing is worthless.

Re:Feature Bloat? (0, Flamebait)

Anonymous Coward | more than 2 years ago | (#37027656)

This isn't for gamers who pretend to talk graphics, like you; it's for developers like me, who are always happy to expand their toolkit.

The fact that you can't read the summary and grok what's new, and what could be done with it, shows where you're coming from. Keep up those l33t fraps, you fucking poseur.

Re:Feature Bloat? (0)

Anonymous Coward | more than 2 years ago | (#37027902)

Well said. I was just thinking that.

Re:Feature Bloat? (1)

pimpsoftcom (877143) | more than 2 years ago | (#37028020)

Ad hominem attacks and fallacious logic get you nowhere. I never said "new tools are bad". I asked "What are these new tools good for?" and "Why should I use them?"

Re:Feature Bloat? (0, Flamebait)

Anonymous Coward | more than 2 years ago | (#37027696)

What you are failing to understand is that unless this new spec makes Minecraft run faster, most people will not care that the new spec was released.

Here's the thing: You aren't most people. Minecraft? That game fucking sucks. I wouldn't even call it a game. More like a Second Life rip-off for people who thought Second Life wasn't nerdy enough and have an allergic reaction to texture filtering.

Re:Feature Bloat? (0)

Anonymous Coward | more than 2 years ago | (#37027698)

Most of the recent GL features have simply been added to simplify some things for the developer, or to expose features that already exist on the latest graphics cards. On the cards that support it, the latest core profile is pretty much in line with the current D3D equivalent (i.e. in practical terms, using either isn't much different anymore; it's just shaders, buffers, and some bits of state management for the most part).

The only drawback of OpenGL is what happens when you have to run on a machine with OpenGL 2.1 or older. At that point, legacy support becomes a bit painful when dealing with incoherent drivers from various vendors. The one upside, however, is that you can at least expose the latest GPU features on machines running XP (locked to D3D9, but GL 4.2 is still available) or Vista (locked to D3D10, but GL 4.2 is still available).

Swings and roundabouts. There are benefits to D3D. There are benefits to GL. Who cares? At least we have the luxury of having the choice, and that's all that's important.

Re:Feature Bloat? (1)

DudemanX (44606) | more than 2 years ago | (#37027830)

or Vista (locked to D3D10, but GL4.2 is still available).

D3D11 works just fine under Vista.

Re:Feature Bloat? (1)

Kjella (173770) | more than 2 years ago | (#37027720)

At this point DirectX is so important that they're not going to drop anything from hardware just because OpenGL does or doesn't support it. The only difference is whether there'll be an interface to use it on Linux or not, and the time you buy is only temporary. What do you do in five years, when developers ask for features every card sold in the last five years has? Will it be less work then, or just as much work, with you now five years behind the cutting edge? The answer is the latter; you're really asking for OpenGL to curl up and die in a few years.

Re:Feature Bloat? (1)

pimpsoftcom (877143) | more than 2 years ago | (#37027858)

The day OpenGL dies will be the day the music died. Not a good day, for open source or for gaming in general. The games that have defined the last few decades all started as OpenGL games, not DirectX games: Doom, Hexen, Halo (written originally for Apple systems), etc.

Re:Feature Bloat? (1)

arose (644256) | more than 2 years ago | (#37029976)

Neither Doom, Hexen, nor the original Halo used accelerated 3D. You're thinking of the later Quake titles and possibly Unreal.

Re:Feature Bloat? (1)

TheRaven64 (641858) | more than 2 years ago | (#37030498)

The original Halo came out long after Quake and most definitely did use 3D hardware; you're thinking of Marathon.

Re:Feature Bloat? (1)

arose (644256) | more than 2 years ago | (#37041442)

Yes, I realised I was blindly following him after hitting submit. Doh.

Re:Feature Bloat? (0)

Anonymous Coward | more than 2 years ago | (#37027854)

Are you in management? Your logic is equally bent and ill-evolved.

You're one of those who think moving a completion marker on a piece of paper is better than adding another 20% to a release to actually make it palatable.

Then it's release time and your project flops.

But hey, you moved your little marker and fulfilled your job at the company. I've met people like you.

Here's a tip: You should leave the real decisions to people with real skills and stop acting like some sort of definable "progress" is all that matters. That's very one-dimensional thinking and we're trying to get rid of that here in the modern age. Unless you know every possible implementation of every new feature right now, you have no fucking clue how much "progress" is being made. People far smarter than you found use in things others discarded many years ago.

Here's some structure for you... I slept through most of my schooling and dropped out of high school because it was a waste of fucking time. Within 2 years I was clearing $110,000 a year, full time, at fucking 18. Now I'm making double that, along with having my own products, developed from scratch, looking to clear 7-digit profits. I'm 23 now and one of the types your dumb ass would never hire. Fuck you.

Re:Feature Bloat? (1)

pimpsoftcom (877143) | more than 2 years ago | (#37028160)

No, not management; I'm a full-time dev and part-time entrepreneur. And you are making a lot of bad assumptions with your ad hominem attacks and fallacious logic. You are wrong about me; you and I have a lot in common, actually. Except I make more than you, have a better understanding of the target markets, and yes, I would probably hire you if you could prove your skills, because let's face it, school doesn't matter and what you learn does.

The lesson to be learned here is that value can be added in many ways; it's not a one-sided thing like you, ironically, seem to be suggesting. You sit here and get angry because somebody wanted more data on the value of this, and yet you dismiss the idea that the people who run companies and worry about this stuff are the same people hiring people like you, even as you allude to it as part of an ad hominem attack.

Re:Feature Bloat? (-1)

Anonymous Coward | more than 2 years ago | (#37027584)

It says much about slashdot that this relatively informative comment critical of an open standard is modded to -1 while a number of goatse trolls are still sitting comfortably at 0.

Re:Feature Bloat? (3, Insightful)

TheRaven64 (641858) | more than 2 years ago | (#37027936)

OpenGL promised in the last version to cut away a lot of the features from really old versions, just like DirectX 10 did. This would have the disadvantage of breaking compatibility but the advantage of making it easier to support in the future and more efficient right now. Instead they maintained backward compatibility, which made it bloated and hard to use.

You are aware that the complaints you are repeating were about OpenGL 3.0, were addressed in 3.1, and that the current topic is 4.2? 3.0 introduced a deprecation mechanism and deprecated a load of stuff. 3.1 removed it. In 4.x, all of the fixed-function stuff is gone completely.

Re:Feature Bloat? (1)

hairyfeet (841228) | more than 2 years ago | (#37028772)

You know, that is one of the things MSFT really should be given credit for, because DirectX "just works". It is even nicer in Win 7 (I don't know if it is equally true of Vista), as it is trivial to download and install DirectX 9 (a light edition comes installed, but for some older games you really need the full version) and it will live happily beside DirectX 11, only being called on by older games that don't support the newer DirectX features.

It makes it nice for me as I can fire up a copy of NOLF II and have it all render beautifully even on Win 7 X64 and then go straight from that to Just Cause II and have all the purty my HD4850 (DX10.1 card) can crank out, and nice for my customers as they just stick in the disc and it all "just works".

There are a lot of things you can bitch about MSFT for, but frankly I'm damned grateful for backwards compatibility. With BC I don't have to keep a bunch of old machines running various versions for old code; it all plays nicely on my latest and greatest while still letting me run the newest games and software.

As for TFA: from what I understand, the problem with Khronos and OpenGL is CAD. Just as ubergeeks here love to complain about the "cruft" in Windows (but I bet they'd scream their asses off if their mission-critical Win32 code wouldn't run on anything past XP), so too has Khronos been bending over backwards, except instead of gaming it's CAD that they keep the old code for. So I can see why programmers would want to just use DirectX if they are making games, as DX has from its inception been primarily a gaming API with extras for video, networking, audio, etc., while OpenGL seems to be pretty much centered around CAD right now; at least that is what I got from reading the stink here over the last release.

Maybe instead of bothering with OpenGL, game developers should just donate a little time or money to Wine or Cedega? They seem to be making some pretty good strides on the gaming front, and frankly gamers don't care what kind of API a game uses as long as it "just works". I have to give the Wine guys credit; they have been getting pretty good in that regard.

Re:Feature Bloat? (3, Insightful)

arth1 (260657) | more than 2 years ago | (#37029612)

You know, that is one of the things MSFT really should be given credit for, because DirectX "just works".

Is that why so many games using DirectX come bundled with all those DirectX updates and patches?

Checking a randomly chosen game I have installed here (Dragon Age Origins) shows no less than 74(!) cab files with DirectX updates and patches, from Feb2005_d3dx9_24_x86.cab to NOV2007_d3dx9_36.x86.cab

Yep, it "just works".

Re:Feature Bloat? (0)

Anonymous Coward | more than 2 years ago | (#37029732)

Moreover, there were some changes made that broke a *TON* of older Win32 games, especially in the DX5-7 era. Some DX functions, intentionally or not, were returning values that apps used, but those were then removed in later DX versions. As such, a number of apps have been broken because Microsoft DIDN'T keep backwards compatibility. Made all the sillier since they could just have hooked those 'new' versions of the functions into the update libs, like they've been doing with DX9+.

Re:Feature Bloat? (0)

Anonymous Coward | more than 2 years ago | (#37030212)

Is that why so many games using DirectX come bundled with all those DirectX updates and patches?

How exactly do you think you're going to use the newest version if you don't have the newest version you stupid dickhead?

Checking a randomly chosen game I have installed here (Dragon Age Origins) shows no less than 74(!) cab files with DirectX updates and patches, from Feb2005_d3dx9_24_x86.cab to NOV2007_d3dx9_36.x86.cab

Yep, it "just works".

So? Who gives a fuck how many files it has.

Re:Feature Bloat? (1)

arth1 (260657) | more than 2 years ago | (#37032386)

How exactly do you think you're going to use the newest version if you don't have the newest version you stupid dickhead?

The normal way would be to provide an installer that installs the latest version. The problem with DirectX is that there's so much breakage and lack of backward and forward compatibility that developers often have to pick just which updates to install.

So? Who gives a fuck how many files it has.

It's not the number of files, it's the amount of distinct patches needed.

Re:Feature Bloat? (1)

hairyfeet (841228) | more than 2 years ago | (#37030254)

Uhhh... seriously, WTF, dude. You DO know that DirectX IS FREE, right? Why in the hell would a game company NOT include the files just to make sure their customers could run the game? Just to be dicks? If their game has DirectX 9c features and you are running vanilla DirectX 9 (I have actually seen this on boxes that haven't been updated in ages), then putting the files on there is SMART. It is free, it cuts down on support calls, and it makes it easier on the customer. Where is the downside?

And to the poster who complained about broken DirectX 5-7 games? That wasn't MSFT, dude; you can lay that failfest squarely at the feet of NVIDIA. I have some games that old, such as MechWarrior II, and you know what? They run just fine in DOSBox on a Radeon card, even a newer card like my HD4850. It is NVIDIA that tosses backward compatibility pretty much constantly, so if you want to run the latest games AND the older ones, you are SOL if you buy Nvidia. That is why I only buy AMD/ATI after being a lifelong Intel+Nvidia man: after the bribery and compiler rigging on the Intel side, and bumpgate and dropping BC on the Nvidia side, those companies got to be too big of douches for my taste.

My customers are quite happy with their new ultra-affordable triples and quads along with the nice Radeon GPUs I install. I mean, when you can easily find the HD48xx chips, with that fat 256-bit pipe, for around $60? Hell, it's a steal! But I'd bet my last dollar the one complaining about the DX 5-7 loss of BC bought Nvidia and found out the hard way that Nvidia doesn't care about BC. Considering the insane temps the newer Nvidia chips are hitting, frankly I'd avoid them anyway.

But if you care about old gaming, DOSBox + Radeon is a winning combo, and the easiest way to get it is to go to GOG and buy their ready-made games. Nearly every game works 100% perfectly (DO NOT BUY the i76 package; that game is so tied to Win98 and CPU timings that all the shims in the world can't fix it!) and their DOSBox-based games all run like a charm on Win 7 x64. You just buy, download (with an insane backend, games are uber quick to download), install, and play. No DRM, no keys, no muss, no fuss. And if you are on a Radeon card, all will have 100% acceleration; if you are on Nvidia? Well, you may just have to settle for software rendering, since as I said, Nvidia doesn't do BC.

Re:Feature Bloat? (1)

geminidomino (614729) | more than 2 years ago | (#37031774)

Hey, Hairyfeet, speaking of old gaming...

You know of any solutions for playing games from that screwfest DX 5-7 period? I've been trying to get TIE Fighter 95 and XvT going; Win98 is too old for VirtualBox, and AFAIK DOSBox is for DOS-age games. Two of my favorite games of all time are stuck in the middle.

BTW, from our conversation last month (on the topic of old games) : My Dingoo A320 [dealextreme.com] finally showed up. The stock NES and GBA emulators would be pretty nice, but they seem to have some unreliability in dealing with SRAM. With Dingux installed, Gen/MD with picoDrive works nice, but Snes9x performance is still pretty limp (Tested with Super Metroid, Chrono Trigger, and FF III), as usual. Haven't messed with overclocking it. No network support. Movie player seems surprisingly capable wrt formats, but only provides basic controls (no seeking, e.g.). HTH.

Re:Feature Bloat? (1)

hairyfeet (841228) | more than 2 years ago | (#37033376)

Cool, thanks for the info. I think I'm gonna pick one of those up, as I was always more of a Genesis and NES than a SNES guy. I bet it is Mode 7 that causes the problems; emulating that is rather iffy.

As for TIE Fighter 95? No problemo, my friend; what you need is a handy little tool called the "Application Compatibility Toolkit", or you can just use the premade shims linked to here [wordpress.com]. I haven't tried it myself, as I really need to get another flightstick (I was stupid enough to lend mine to my nephew and the cat killed it; damned evil kitteh!), but they swear it works great.

I have found that usually, if DOSBox or VirtualBox won't cut it, then the AppCompat Toolkit has you covered. If you get the toolkit (I'm sure you can find it somewhere), you'll need to play with the settings; it is a trial-and-error kind of thing. But my nephew managed to get FF7 running on XP just by spending an hour trying settings with the toolkit, and everyone knows what a fiddly bitch THAT game was!

Anyway, as soon as this gig I have coming up in Oct is through (having to train a new drummer, ugh; the last one had a drinking problem, which when you are doing progressive blues and funk in a trio is NOT cool), I'll snatch me up one of those Dingoo players; sounds like just what I was looking for. Feel free to shoot me an email if you run into any more "nigglers", as I'm pretty good at finding ways to make old stuff run on new. You'd be surprised how many times that comes up. Hell, I even have a customer who won't touch a PC if his "Secret Weapons" copy won't run, LOL!

A final bit of advice, though: if you have plenty of older games you like (like I do), have you thought about getting a KVM and a second box? I know what you're about to say ("Don't have the room, too much mess, etc."), but there is actually an easy-peasy way around that.

Hit your local Craigslist or eBay and look up a "Compaq Deskpro En SFF 1GHz". This is a small desktop (about the size of a phonebook) that was designed to fit under a monitor. It is beige, but a can of high gloss and some masking tape fixes that (painted black it actually looks slick), and while it doesn't have AGP, what it DOES have is full-size PCI slots and full Win98 drivers. It is perfect for Win9x and DOS games; just slap in an old Radeon or GeForce card. Hell, you can pick up one of the old FX5200s, or my fav, the X1650 Pro, for dirt cheap. Both cards have Win98 drivers that are rock solid, and you can pick up a Trendnet 2-port KVM with sound for less than $30 at Newegg.

Anyway, with the average price of the card and the box, you are talking MAYBE $80 all told for a perfect classic gaming system. With that setup, a simple hotkey press gives you fully hardware-accelerated DOS and Win9x; the sound chip they come with has SB emulation, or if you want the real thing, an SB Live PCI card is like $5 at Surpluscomputers.com, and those have great Win9x support as well as the old DOS SB sound. Just use some twist ties to wrap up the cables and voila! You have a little black box that sits under your monitor and gives you all the classic gaming goodness you could want.

But if you run into any more nigglers, just feel free to shoot me an email. I am usually here at the shop working on boxes anyway, so I check my email while waiting for reinstalls to finish. When you live in a rural state, you'd be surprised how many folks become attached to games, so I end up spending a LOT of time figuring out how to make the old stuff run on the new. AppCompat is your friend, and the shims it makes fix a LOT of the old Win9x problems. The ONLY game I haven't been able to get up and running is the i76 collection, which is why I ended up finding the Compaq. That beast uses so many hacks to get the speed up on old machines that trying to get it running on a modern quad was hair-pulling! Anyway, I'm glad you found a sweet player you like; I'll definitely be grabbing one of those. Thanks again for the heads up.

Re:Feature Bloat? (1)

arth1 (260657) | more than 2 years ago | (#37032626)

Uhhh... seriously, WTF, dude. You DO know that DirectX IS FREE, right? Why in the hell would a game company NOT include the files just to make sure their customers could run the game?

You missed the point entirely. Why would they need to provide dozens of cherry-picked patches, instead of just, um, installing the latest version of DirectX?

The problem is that you can't - DirectX has so many compatibility problems that Microsoft has to provide the updates individually, so the end user (or developer) can pick and choose just which updates to install.

"Just works", indeed.

Re:Feature Bloat? (1)

hairyfeet (841228) | more than 2 years ago | (#37033612)

Uhhh, dude? Did you decompile those? Because I bet my last dollar they were NOT patches, but just plain old vanilla DX9c, simply the latest version. Hell, every game I have bought in the last 6 months automatically checks at install time and carries DirectX 9c (I hear DiRT comes with 11; I can say Just Cause II checks for DX10 or better), along with the Visual C# or C++ runtime, hell, I can't remember which.

But those aren't "cherry-picked patches"; those are just the latest updates. And there is a very good reason why you have the latest updates on disc: those updates add support for new features that come with the newer chips. It isn't like they went "Oh, DirectX 9 is done, let's start on 10!"; new features that the GPU manufacturers added got supported in later releases, like I believe 9c added support for more efficient buffering of textures.

But it is funny that you would try to bring up shims with DX, as OpenGL is nothing but shims all the way down. With DirectX, if you have 9c, you have 9c, and any card that is 9c will "just work". With OpenGL you have Nvidia-specific shims, ATI-specific shims, and Intel-specific shims, and from what I understand each is completely different and incompatible with the others.

I guess that is one of the nice things about being the 800-pound gorilla: MSFT can just make a call, get the designers on the line, and hash out what features the next DirectX will have. Frankly, I haven't seen a problem with DirectX in ages, not since XP SP2 as a matter of fact. It all "just works", and with games carrying the latest updates for Visual C and DX, I don't have to do anything, just run the installer. That's nice, and the way it should be done.

Oh, and if you run into a REALLY old game that doesn't work on the latest and greatest? The AppCompat Toolkit is your friend. With that and a little tweaking, one can get even Win95-era games up and running most of the time, well, as long as you aren't running an Nvidia card that doesn't support BC. There are plenty of things to bitch about when it comes to MSFT, but DX isn't one of them, friend; it "just works", and I'd take it over the "designed by committee" approach of OpenGL any day of the week.

Re:Feature Bloat? (0)

Anonymous Coward | more than 2 years ago | (#37036036)

Yeah, shipping software with all the required dependencies. Who the fuck thought of that idea? Much better to have to download them instead of putting them on the disc with 2 GB of free space.

Re:Feature Bloat? (1)

arth1 (260657) | more than 2 years ago | (#37037006)

It's not about space, it's about a claim of "just works".

As in "it just works if and only if you also install the November 2006 redist as well as the February 2007 one, because the 2009 final one doesn't have the compatible parts that are only in those two redists"?

This is the problem with shifting standards as you go, with the distributed code being authoritative. OpenGL doesn't have that problem; the libraries are made according to the specs, not the other way around.

Wine.apk (1)

Chibi Merrow (226057) | more than 2 years ago | (#37035812)

Maybe instead of bothering with OpenGL, game developers should just donate a little time or money to Wine or Cedega?

That's funny, Wine doesn't seem to be on the Android market....

You're missing the point. PC gaming is irrelevant. As it is, even dedicated portable gaming consoles seem to be becoming irrelevant. What is relevant are millions of small form factor devices, all of which use OpenGL and none of which run Windows. That is where gaming is going.

Re:Feature Bloat? (0)

Anonymous Coward | more than 2 years ago | (#37027512)

Doesn't the new version of OpenGL support all the stuff you're using without any changes to your code? I thought it did.

Re:Feature Bloat? (1)

pimpsoftcom (877143) | more than 2 years ago | (#37027664)

The problem is that it's going to increase costs, and the new features don't do anything useful that I'm aware of; at least nothing that makes it easier for me as a dev to code games people enjoy. It's just adding feature bloat.

Re:Feature Bloat? (1)

EsbenMoseHansen (731150) | more than 2 years ago | (#37030236)

The atomic stuff and the counter stuff does sound useful to me, though I have not had the pleasure of programming a game yet. Well, not in the last 25 years, anyway.

Re:Feature Bloat? (0)

TigerTime (626140) | more than 2 years ago | (#37027558)

Seriously? Taking coded features out of the software layer and putting them in the hardware layer can multiply the speed of an operation by an order of magnitude. OpenGL is far behind DirectX in that sense. DirectX is in many ways easier, and faster because of it. OpenGL needs to ditch some of the features it is holding onto for backwards compatibility. Anything older than 7 years should be on the chopping block if it isn't needed.

Re:Feature Bloat? (1)

pimpsoftcom (877143) | more than 2 years ago | (#37027650)

I generally agree that if it's not used it's bloat, but let's face it, we need to worry about the open source software stack as well. A big part of lock-in is video, after all. That is why open source game development is usually so stagnant, with few but always notable exceptions.

OpenGL Has Left DirectX In The Dust (5, Insightful)

Anonymous Coward | more than 2 years ago | (#37027770)

I don't know what delusional planet you are posting from, but here in the Real World, OpenGL left DirectX in the dust in both features and performance a long time ago.

Khronos is absolutely on fire with giving developers what they want as quickly as possible. OpenGL developers have access to the absolute bleeding-edge features of new graphics cards that people still stuck using DirectX have to wait around for Microsoft to get off their ass and implement.

It shouldn't be surprising that OpenGL has won the API war with Microsoft:

210 million OpenGL ES-based Android devices a year.

150 million OpenGL ES-based iOS devices a year.

Every Linux, Mac, and Windows machine.

The dying PC games market and the last-place Xbox 360 are the only places left in the world still using the dead-end DirectX API.

Re:OpenGL Has Left DirectX In The Dust (-1)

Anonymous Coward | more than 2 years ago | (#37027980)

Can't tell if you are very stupid or just a troll.

Re:OpenGL Has Left DirectX In The Dust (0)

Anonymous Coward | more than 2 years ago | (#37042122)

Can't tell if you are very stupid or just a troll.

Not surprising since it's quite obvious that you are indeed very, very stupid. Anyone can tell that.

Re:OpenGL Has Left DirectX In The Dust (0)

pimpsoftcom (877143) | more than 2 years ago | (#37027998)

Every big title I can think of targets Windows/DirectX. WoW, for example, doesn't have a Linux client; you have to run it under Wine. Wine is doing better and better, but the fact is, devs don't target Linux, so people have to emulate if they want to play. You are also making the false assumption that every one of these devices is used for gaming; that is not the case. However, I can say with good numbers backing me up that 99% of the time an Xbox will be used for gaming, even if I dislike the platform. Please don't weaken your arguments by making use of fallacious logic.

Re:OpenGL Has Left DirectX In The Dust (0)

Anonymous Coward | more than 2 years ago | (#37028224)

Your example is a bad one, as the WoW client on Windows also offers an OpenGL version (WoW.exe -opengl). Linux users running it in Wine are strongly encouraged to use the OpenGL version, as it obviously runs better.

Re:OpenGL Has Left DirectX In The Dust (1)

pimpsoftcom (877143) | more than 2 years ago | (#37028316)

Thank you.

Let's try this again:
I don't see Rift, Eve Online (who actually dropped support, but once had it, due to what I'm discussing) or other games being developed for Linux.

Let's confuse the issue more:
In fact, if I do a search for Linux gaming I get old games, bad games, unfinished games that were abandoned, and a very few clones of the classics done as a learning exercise... all in OpenGL.

And let's not forget to piss some people off:
Why? Because in general the so-called "pro game devs" only leech from the open source community by using it to learn, and in general they never give back by using the skills to make a game that will actually run on Linux well without emulation.

That work? ;)

Re:OpenGL Has Left DirectX In The Dust (0)

Anonymous Coward | more than 2 years ago | (#37029086)

Thank you.

Let's try this again:

I don't see Rift, Eve Online (who actually dropped support, but once had it, due to what I'm discussing) or other games being developed for Linux.

Let's confuse the issue more:

In fact, if I do a search for Linux gaming I get old games, bad games, unfinished games that were abandoned, and a very few clones of the classics done as a learning exercise... all in OpenGL.

And let's not forget to piss some people off:

Why? Because in general the so-called "pro game devs" only leech from the open source community by using it to learn, and in general they never give back by using the skills to make a game that will actually run on Linux well without emulation.

That work? ;)

id Tech 5 is OpenGL

It's a learning exercise of a brilliant mind that changed the whole gaming industry

OpenGL has a much bigger marketshare right now than DirectX

Re:OpenGL Has Left DirectX In The Dust (1)

dbIII (701233) | more than 2 years ago | (#37030480)

Meanwhile, the article is about OpenGL, not Linux. Do you really know that those other games are DirectX, given that you got it wrong about WoW?
Also add the Nintendo platforms to the OpenGL list. The professional 3D graphics and GIS programs running on MS Windows are using OpenGL as well.

Re:OpenGL Has Left DirectX In The Dust (0)

Anonymous Coward | more than 2 years ago | (#37032326)

or other games being developed for Linux.

Heroes of Newerth [heroesofnewerth.com] has been well received [youtube.com].

Re:OpenGL Has Left DirectX In The Dust (1)

LordLucless (582312) | more than 2 years ago | (#37029244)

Why the hell are we talking about Linux? While DirectX is Windows-only, OpenGL runs on pretty much every platform in common use. Before accusing others of fallacious logic, look to your own strawman.

Re:OpenGL Has Left DirectX In The Dust (0)

pimpsoftcom (877143) | more than 2 years ago | (#37029388)

I never said otherwise. What exactly are you accusing me of?

Re:OpenGL Has Left DirectX In The Dust (1)

LordLucless (582312) | more than 2 years ago | (#37030154)

The parent stated that OpenGL had more features and better performance than DirectX. You responded criticizing Linux gaming, and accusing the parent of making fallacious arguments. Either you're just totally off-topic and decided to randomly spurt about how linux gaming sucks in this thread, or you're making an implicit assertion that linux/windows parallels OpenGL/DirectX, which is false.

Re:OpenGL Has Left DirectX In The Dust (2)

gman003 (1693318) | more than 2 years ago | (#37029498)

Uh, quite a few Windows games use OpenGL, even the forthcoming Rage (so OpenGL on Windows is hardly "dead"). A lot of them (especially older ones) even offer both - I can set Half-Life or Unreal Tournament to use OpenGL, Direct3D, or even software rendering.

Most popular engines support both. UE3, used by about half the games on the market, uses OpenGL on the PS3, Wii, Mac, iPhone and Linux, and D3D on the XBox and Windows.

See, you're thinking too much about Windows VS Linux VS Mac, when the developers are thinking PS3 VS XBox VS Wii VS Windows. Coincidentally, half of those ONLY support OpenGL, while only one is pure Direct3D. Since every developer big enough that isn't owned by a console maker targets as many platforms as possible, most games end up having multiple renderers.

Re:OpenGL Has Left DirectX In The Dust (0)

Anonymous Coward | more than 2 years ago | (#37030578)

DirectX still has a few aces up its sleeve: Direct state access and multithreaded rendering.

Re:Feature Bloat? (2)

Ryvar (122400) | more than 2 years ago | (#37027828)

They did that already. As of OpenGL 3.1 the only non-deprecated rendering method is Vertex Buffer Objects. Link [opengl.org] .

There are a lot of things OpenGL could do to make itself more accessible: better-supported cross-platform utility libraries, three or four shortcut commands that set the various glEnable() states that 95% of new developers actually care about, streamlining the eyebrow-raising pile of mipmap generation options... and the entire process of setting up a vertex buffer object could be MASSIVELY simplified.

Honestly, what OpenGL needs isn't fewer features, but rather for the features most people want to use to be placed front and center with extremely simple, well-documented data formatting rules and optimized, efficient helper functions. Microsoft might have been Slashdot's Great Satan for a long time, but they do listen to the sort of developers they're hungry for, and DirectX is one of the better examples of that.

Re:Feature Bloat? (1)

TheRaven64 (641858) | more than 2 years ago | (#37028006)

Microsoft might have been Slashdot's Great Satan for a long time, but they do listen to the sort of developers they're hungry for, and DirectX is one of the better examples of that.

Well, it used to be, but they really screwed up by not supporting DirectX 10 on XP. If you use DirectX 10, you are limited to the operating systems with around 40% of the market. If you use DirectX 9, you get another 50%, but you're limited to old features. On the other hand, Intel, nVidia and AMD all support OpenGL - with all of the latest shader functionality exposed, either as part of the core standard or via extensions - on XP, Vista, and Windows 7. Oh, and you also get another 5-10% from OS X, if you care about that, and you get a relatively easy port to OpenGL ES for mobile devices too.

Re:Feature Bloat? (0)

Anonymous Coward | more than 2 years ago | (#37032196)

I never understood what you so clearly pointed out. This has been very clear for many, many years now.

Basically it boils down to, either the people who make these decisions at development houses are completely fucking stupid or someone is paying them large sums of money such that completely fucking stupid notions suddenly make lots of financial sense.

Let's see: I can target a tiny subset of users, not use the latest in graphics capabilities, and ignore a platform which provides the latest in hardware features, excellent performance, and makes cross-platform development a snap, or... Again, it's a WTF decision that only ever makes sense if you're an absolute fucking idiot or someone is paying you to ignore the vastly superior platform.

Re:Feature Bloat? (0)

Anonymous Coward | more than 2 years ago | (#37032046)

DirectX is faster, but not many times faster... DirectX is ahead, but not that much ahead. I work on a game engine that supports both DirectX and OpenGL 4.x, and on nVidia at least the performance difference is a few fps and feature parity is pretty much there. The older features don't really slow the runtime down as such; they just make it harder for coders to find the "fast path".

Re:Feature Bloat? (0)

Anonymous Coward | more than 2 years ago | (#37027754)

New features allow the same things to be done, but with better hardware support. That means it's faster. That means you need less hardware to do the job. That means the systems that can run your game are cheaper.

Re:Feature Bloat? (0)

Anonymous Coward | more than 2 years ago | (#37027784)

I can render tits just fine already, so why do I need this feature bloat as well? ... Or is it so apparent - and I so old - that as the "old guy" I am just being bitter?

The tits gave you away, old man.

But I'd like to see the source of those tits. Silicone has been used to make tits more beautiful for decades, yet there is no open-source tit renderer. Is there?
Please release the tits! (It's spring!)

Re:Feature Bloat? (2)

Ryvar (122400) | more than 2 years ago | (#37027970)

I'm a newbie at this stuff, but here goes:

"single-rendering-pass order-independent transparency" - let's say I have three translucent objects at roughly the same depth, with parts of one in front of and behind parts of the others (and maybe the same is true for objects B and C as well). Figuring out the correct draw order is absolute fucking murder, and there still isn't a generalized approach for anybody but the most advanced of the most advanced (like Dual depth peeling [nvidia.com] or making convex hulls out of all translucent geo in the scene). Core API support for dealing with this issue would be a godsend and is about 10 years overdue for ALL graphics APIs.

Neat fact: the PowerVR-based GPU used by the iPhone/iPad uses a tile-based rendering method in which (I am told) this problem generally doesn't arise.

"Capturing GPU-tessellated geometry and drawing multiple instances of the result of a transform feedback to enable complex objects to be efficiently repositioned and replicated;" Easier to quickly render massive crowds, forests, and procedural cities.

"Modifying an arbitrary subset of a compressed texture, without having to re-download the whole texture to the GPU for significant performance improvements;" Shaders not requiring four fucking separate mask textures all dancing on the head of a pin to pull off a simple effect? Yeah, I'll take that. Could probably also have some nice gains in procedural content variation.

"Packing multiple 8 and 16 bit values into a single 32-bit value for efficient shader processing with significantly reduced memory storage and bandwidth, especially useful when transferring data between shader stages." Massive performance gains for any sort of post-processing work, basically.

Re:Feature Bloat? (1)

pimpsoftcom (877143) | more than 2 years ago | (#37028052)

This was the sort of reply I wanted and asked for, but never got before your post (while being trolled and attacked for asking). Thank you.

Re:Feature Bloat? (1)

EsbenMoseHansen (731150) | more than 2 years ago | (#37030384)

In that case, I'd suggest phrasing the question in a less inflammatory way next time :) You did not come across as a nice guy who would like to know the concrete benefits and drawbacks.

Re:Feature Bloat? (1)

BlueParrot (965239) | more than 2 years ago | (#37031530)

Figuring out the correct draw order is absolute fucking murder, and there still isn't a generalized approach for anybody but the most advanced of the most advanced (like Dual depth peeling [nvidia.com] or making convex hulls out of all translucent geo in the scene).

Is this one of those things you would get practically automatically with ray-tracing? It seems to me that a z-buffer just isn't capable of adequately dealing with this kind of situation unless you actually want to sort the objects by depth for every single pixel.

Re:Feature Bloat? (1)

Assmasher (456699) | more than 2 years ago | (#37031560)

Just a note on OIT with regard to DirectX: since the fixed-function pipeline was obsoleted in favor of the programmable shader-based approach, API-implemented OIT has been orphaned as something of a strange hybrid between a crutch for people just getting into advanced topics and a luxury item.

DirectX is primarily focused on high-end games, and nowadays most games (it would seem) use some variation of deferred lighting. The type of deferred lighting you use would determine which of the many sorting approaches you would/could use to handle your non-opaque geometry (unless you can get away with a separate rendering path for your transparency handling - usually you'll have too many lights for this). This somewhat precludes adding a magic 'OIT-enabled' flag unless its usage was bound by serious restrictions.

This is why, imho, OIT isn't bundled into DirectX. Microsoft was talking about possibly doing it before DirectX 11 (in 2008, iirc?) but it seems to have fallen by the wayside, and I honestly think it is because virtually no one that DirectX is targeted at would use it.

Re:Feature Bloat? (0)

Anonymous Coward | more than 2 years ago | (#37030740)

As far as I can see, all the new features are essentially performance enhancements, i.e. certain things can be done faster on the same hardware. This won't affect existing games. If you are a game developer and your game is not using any of the new features, nothing changes; drivers will probably not drop support for the OpenGL version you are using anytime in the foreseeable future. If you were previously using these features as extensions (they all would have been prior to 4.2), you will probably have to do some string replacement.

For the open-source driver stack it means that there is a new OpenGL version to catch up with. That does present a problem, but one solved by putting more resources into the open-source driver stack rather than hobbling OpenGL so that it becomes even less competitive with DirectX than it already is.

Are they nuts? (-1, Flamebait)

dev347 (2432786) | more than 2 years ago | (#37027292)

I see they added optional extension to allow DirectX coexistence [aeonity.com] as far as using same texture by both apis in same time in same application....
Why?

Re:Are they nuts? (0)

Anonymous Coward | more than 2 years ago | (#37027330)

Warning, parent is goatse.cx troll.

Are they nuts? (-1, Offtopic)

dev348 (2432790) | more than 2 years ago | (#37027350)

I see they added optional extension to allow DirectX coexistence [aeonity.com] as far as using same texture by both apis in same time in same application...
Why?

Re:Are they nuts? (1)

pimpsoftcom (877143) | more than 2 years ago | (#37027382)

WARNING Parent is GOATSE troll!

Re:Are they nuts? (1)

Yvan256 (722131) | more than 2 years ago | (#37027758)

Not only that, but the exact same post from dev346, dev347 and dev348.

If only all those efforts went into something productive.

Re:Are they nuts? (1)

geminidomino (614729) | more than 2 years ago | (#37031838)

Looks like the MichaelKristopeit troll decided to move to a shorter username. :)

Re:Are they nuts? (1)

mrmeval (662166) | more than 2 years ago | (#37028498)

Slashdot should out anonymous posters that reach -1

In Soviet Russia (1)

Jorl17 (1716772) | more than 2 years ago | (#37027628)

Soviet Russia no longer exists. Were you expecting something else?

In Soviet Russia, something else expects you!

How much of the API is needed for HW accel? (2)

BlueParrot (965239) | more than 2 years ago | (#37027904)

Perhaps somebody in the know can enlighten me about this.

I see many fairly advanced features and functions in both the DX and OpenGL APIs, but I was under the impression that modern graphics cards are basically designed to do a few fairly primitive operations very well and in parallel. So basically, how much of these APIs actually deals with interfacing the graphics card and its hardware-accelerated features, and how much of it is more along the lines of just a standard library that contains frequently used graphics algorithms?

Maybe my view of how programming is done these days is a bit naive, but I've always sort of felt there was a difference between APIs that exist to let you use the hardware without mucking around with terribly low-level and platform-dependent stuff like interrupts, and, on the other hand, standard libraries that are pretty much things where the code would be more or less the same on most platforms, but you just don't want to write it all over again whenever you make a new program (things like a container class for C++).

My idea of what OpenGL and DirectX did was to let you access the features of the video card without having to worry about all the little differences between one card and another. So you could send the card a bunch of textures or something without having to rewrite the code for every card you wanted to run on.

Am I missing a lot here? Do the OpenGL and DirectX APIs also deal with a load of stuff that is just generally handy to have around when writing graphics programs?

Re:How much of the API is needed for HW accel? (5, Informative)

TheRaven64 (641858) | more than 2 years ago | (#37028100)

(Disclaimer: Simplifications follow.)

Originally, there was OpenGL, which provided the model of a graphics pipeline as a set of stages where different things (depth culling, occlusion, texturing, lighting) happened in a specific order, with some configurable bits. There was a reference implementation that implemented the entire pipeline in software. Graphics card vendors would take this and replace some parts with hardware. For example, the 3dfx Voodoo card did texturing in hardware, which sped things up a lot. The GeForce added transform and lighting, and the Radeon added clipping.

Gradually, the blocks in this pipeline stopped being fixed function units and became programmable. Initially, the texturing unit was programmable, so you could add effects by running small programs in the texturing phase (pixel shaders). Then the same thing happened for vertex calculations, and finally you got geometry shaders too.

Then the card manufacturers noticed that each stage in the pipeline was running quite similar programs. They introduced a unified shader model, and now cards just run a sequence of shader programs on the same execution units.

As to how specialised they are... it's debatable. A modern GPU is a Turing-complete processor. It can implement any algorithm. Some things, however, are very fast. For example, copying data between bits of memory in specific patterns that are common for graphics.

Modern graphics APIs are split into two parts. The shader language (GLSL or HLSL) is used to write the small programs that run on the graphics card and implement the various stages of the pipeline. The rest is responsible for things like passing data to the card (e.g. textures, geometry), setting up output buffers, and scheduling the shaders to run.

Re:How much of the API is needed for HW accel? (1)

Assmasher (456699) | more than 2 years ago | (#37031586)

The GeForce added transform and lighting

Nerdmode:on

Oh, how excited I was to get a GeFORCE256 card and talk about T&L in hardware in my home PC... Ironically I worked with an SGI RE2 about two feet away from me at the time and couldn't get as excited about it (have you ever worked with Irix? LOL.)

Nerdmode:off

Re:How much of the API is needed for HW accel? (1)

multi io (640409) | more than 2 years ago | (#37030820)

So basically, how much of these APIs actually deals with interfacing the graphics card and its hardware-accelerated features

Most of these APIs. OpenGL and D3D are basically meant to be thin, portable layers encapsulating the capabilities of (some generation of) the graphics hardware.

and how much of it is more along the lines of just a standard library that contains frequently used graphics algorithms?

Not much of it. You can use the GLU (GL Utilities) library for some software utility functions (basically just convenience or comfort stuff, no thick API layers). Even for very basic stuff like matrix multiplications you have to use 3rd-party libraries (if you need to do it on the CPU rather than in a shader on the GPU). The API implementations may provide software emulation of the features of newer graphics hardware (on the CPU) when running on older graphics cards, but if you're using graphics hardware that matches the GL version, there are hardly any thick abstraction layers or algorithms that run on the CPU.

The point is that, basically, modern graphics cards and OpenGL/D3D do NOT evolve independently of one another. Instead, there's a more or less lock-stepped development, i.e. graphics hardware comes out in "generations", where each new generation supports an additional set of hardware capabilities, which is supported by a matching major release of the graphics API. The generations are even named after DX API versions -- DX9, DX10, DX11. OpenGL major releases occur a bit more frequently and thus sometimes encapsulate new hardware functionality that's not yet supported by the latest DX version (hardware vendors can also provide their own GL extensions). So basically, graphics cards of the same generation from different vendors are quite similar feature-wise; the differences lie only in how the hardware implements the feature sets (and in basic characteristics like performance and amount of texture memory).

Re:How much of the API is needed for HW accel? (0)

Anonymous Coward | more than 2 years ago | (#37030890)

Some day we might be able to replace OpenGL with OpenCL+EGL

Are they nuts? (-1)

Anonymous Coward | more than 2 years ago | (#37027978)

I see they added optional extension to allow DirectX coexistence [aeonity.com] as far as using same texture in both apis in same application in same time....
Why?

Re:Are they nuts? (1)

PwnzerDragoon (2014464) | more than 2 years ago | (#37028294)

In case you missed his first post, this is a goatse troll.

OpenGL a thing of the post (0)

loufoque (1400831) | more than 2 years ago | (#37028026)

But DirectX is no better.

The future lies in directly programming the hardware with a classical programming language, building your own renderers in software, hopefully not limited by outdated polygon technology.

Re:OpenGL a thing of the post (2)

pimpsoftcom (877143) | more than 2 years ago | (#37028174)

Wait, I really hope you are not saying VOXELS are ready for prime time over the standard polygon model?

Re:OpenGL a thing of the post (2)

Ryvar (122400) | more than 2 years ago | (#37028320)

It's a troll, or loufoque is a bit detached from reality, but this does bring up an interesting point: a lot of what people are looking into these days in terms of rendering is voxels drawn using polygons. Minecraft? Basically those tiles are voxels being rendered as uniform convex hulls - it lends itself to some amazing efficiency.

This [youtube.com] is even more interesting from a technical perspective - stretching isosurfaces across voxel terrain to create a truly malleable world.

Re:OpenGL a thing of the post (1)

gman003 (1693318) | more than 2 years ago | (#37029512)

HAHAHAHA!

Wait, you're serious?

HAHAHAHAHAHA!

NO games programmer wants to get involved in bare hardware coding. That would require so much redundant code to be written, and testing would be an absolute nightmare. Even the vaunted Intel Larrabee design was going to have drivers and code so that it would appear to games as a regular OpenGL/DirectX card. You could write your own code, sure, but it would default to acting just like any other card (as far as the software can tell).

Re:OpenGL a thing of the post (1)

loufoque (1400831) | more than 2 years ago | (#37030378)

Ever heard of engines and other middleware?

Re:OpenGL a thing of the post (0)

Anonymous Coward | more than 2 years ago | (#37030494)

Ever heard of standards for that middleware?

Oh wait...

Re:OpenGL a thing of the post (1)

gman003 (1693318) | more than 2 years ago | (#37031400)

Even engine developers don't want to do that unless necessary for some really, really cool feature (realtime ray-tracing, maybe). It's just far too much work.

Are they nuts (-1)

Anonymous Coward | more than 2 years ago | (#37028028)

I see they added optional extension to allow DirectX coexistence [aeonity.com] as far as using same texture in both apis in same appIication in same time....
Why?

Re:Are they nuts (1)

pimpsoftcom (877143) | more than 2 years ago | (#37028182)

Warning: Parent is GOATSE Troll.

Re:Are they nuts (1)

PwnzerDragoon (2014464) | more than 2 years ago | (#37028302)

Another goatse warning, here.

Are they nuts? (-1)

Anonymous Coward | more than 2 years ago | (#37028424)

I see they added optional extension to allow DirectX coexistence [aeonity.com] as far as using same texture in both apis in same appIication in same time....
Why?

Re:Are they nuts? (1)

pimpsoftcom (877143) | more than 2 years ago | (#37028542)

Warning: Parent is GOATSE Troll..

Are they nuts?? (-1)

Anonymous Coward | more than 2 years ago | (#37028576)

I see they added optional extension to aIlow DirectX coexistence [aeonity.com] as far as using same texture in both apis in same appIication in same time....
Why?

Who owns OpenGL (1)

unixisc (2429386) | more than 2 years ago | (#37030648)

Who owns the OpenGL spec? Previously, it was SGI, but once they threw it open, isn't it up to a SIG, or something like that? Or is Khronos the sole owner now? If not, how do they release any OpenGL spec?

Another Patent Encumbered Standard (0)

Anonymous Coward | more than 2 years ago | (#37031112)

VRRP, philosophically, [openbsd.org]
must ipso facto standard be
But standard it
needs to be free
vis a vis
the IETF
you see?

But can VRRP
be said to be
or not to be
a standard, see,
when VRRP can not be free,
due to some Cisco patentry.. s/VRRP/O

OpenGL.... (1)

Windwraith (932426) | more than 2 years ago | (#37031326)

We are already at 4.2...
Wow, GL moves fast, but who cares, when you need to force your users to update their drivers so you can have the bare minimum "new" features.

Seriously, what's the rate of adoption? Perhaps it was never labeled properly but don't all default installs of GL support like 1.x? What drivers and cards provide support for GL 2.0+?
And most importantly, why should I bother developing in a newer version of GL if I don't know whether the user will be able to update to the right version to run a game of mine? Better to stick with the lower standards...
I find GL worse than trying to develop multiplatform with C99...

Are they nuts? (0)

Anonymous Coward | more than 2 years ago | (#37042356)

I see they added optional extension to aIlow DirectX coexistence [aeonity.com] as far as using same texture in both apis in same appIication in same time....
Why?
