
AMD Releases 3D Programming Documentation

kdawson posted more than 6 years ago | from the fosdem-fossdoc dept.

Programming

Michael Larabel writes "With the Free Open Source Developers' European Meeting (FOSDEM) starting today, where John Bridgman of AMD will be addressing the X.Org developers, AMD has this morning released their 3D programming documentation. This information covers not only the recent R500 series, but goes back in detail to the R300/400 series. This is another of AMD's open source documentation offerings, which began at the X Developer Summit 2007 with the release of 900 pages of basic documentation. Phoronix has a detailed analysis of what is being offered with today's information, as well as information on sample code being released soon. This information will allow open source 3D/OpenGL work to get underway with ATI's newer graphics cards."


94 comments


Makes me ask (1)

edsousa (1201831) | more than 6 years ago | (#22530652)

Would fglrx work at last? Or community devs will do all the work to have a decent driver?

Re:Makes me ask (3, Interesting)

bersl2 (689221) | more than 6 years ago | (#22530760)

fglrx is probably a technical and legal mess that can't be cleaned up with less effort than it would take to rewrite the drivers using good documentation.

Re:Makes me ask (2, Informative)

hr.wien (986516) | more than 6 years ago | (#22532540)

fglrx has seen massive improvement lately. It is supposed to be mostly in sync with the Windows Catalyst drivers these days. It's still a bit short of perfect, of course, but a lot better than it was.

Re:Makes me ask (1)

MoHaG (1002926) | more than 6 years ago | (#22533536)

fglrx has seen massive improvement lately. It is supposed to be mostly in sync with the Windows Catalyst drivers these days. It's still a bit short of perfect, of course, but a lot better than it was.

Yes, with version 8.455.2 it only hangs my system after 30 minutes of using Google Earth, not immediately... Quake III seems to run stable at least, just without working brightness controls...

I even had compiz-fusion running on a recent version and it was reasonably stable, with the complete lockups being totally predictable (After logging off for the second time...)

It might just have been bad luck, and I did not really have time to look up all the possible settings to try to stabilize the system, but one would think that such measures should not be necessary to use hardware you bought.

If anyone is wondering: I'm running a Radeon 9600 under Gentoo...

Re:Makes me ask (1)

Bert64 (520050) | more than 6 years ago | (#22533846)

I had major trouble getting a Radeon HD2400 working with MythTV... If it worked at all, it was laughably slow (~5 secs to redraw the menu).
Eventually I had to give up and get an nVidia card.

Re:Makes me ask (1)

Zencyde (850968) | more than 6 years ago | (#22534224)

Amen! I was running a Radeon 9800 Pro 128 MB before purchasing my Geforce 7600 GS 512 MB. The Radeon gave me nothing but trouble under Ubuntu! I managed to get it to work for a short while, but it messed up Compiz Fusion, and once I got Compiz Fusion to work, I lost the ability to render OpenGL. Now, I can't even render OpenGL with my new card under that installation. I'm running off of a 20 GB hard drive now and the card works flawlessly. : ) Let's just hope that nVidia will open up their specs, too.

Re:Makes me ask (1)

MrHanky (141717) | more than 6 years ago | (#22533956)

It's still garbage, really. Doesn't properly support XVideo, crashes all the time, etc. For a professionally developed video driver, the quality really is shocking.

Still... (2, Informative)

Junta (36770) | more than 6 years ago | (#22535152)

Comparing my R500 part with fglrx with an R300 part with the open source driver:
-With fglrx kernel module loaded, my laptop has not been able to suspend ever (using Ubuntu Gutsy)
-I have to do a goofy Virtual 1408x1050 resolution with fglrx to make 3D applications not look horribly corrupted. This is weird, but as long as I don't xrandr over to it, it's not a big deal, however..
-After doing the above trick, fglrx shows corruption in the lower right hand corner and hardware cursor if trying to do 3D apps at 1400x1050 (native resolution). Have to run at 1280x960 to prevent that corruption.
-All acceleration (3D and 2D) has a horrible diagonal tearing effect.

The *ONLY* net improvement in the interval you deem 'massive' is in frames per second. Though important, the top priority should be reliability.

Meanwhile, though much slower, the open source driver on the R300 part behaves quite in line with what I expect. I look *very* much forward to what the open source initiative ultimately yields. If AMD can cram the fglrx performance into binary blobs that leverage the open source layers for everything they get right, I would be ecstatic.

Re:Still... (1)

deek (22697) | more than 6 years ago | (#22540658)

fglrx shows corruption in the lower right hand corner and hardware cursor if trying to do 3D apps at 1400x1050 (native resolution). Have to run at 1280x960 to prevent that corruption.


I had the same issue at one stage. I had to put the following option in the fglrx Device section of xorg.conf :

Option "XAANoOffscreenPixmaps" "true"

Give it a try and see how it works for you.
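For context, a minimal fglrx Device section carrying that option might look like the following (the Identifier is illustrative; adjust it to match your own xorg.conf):

```
Section "Device"
    Identifier  "ATI Graphics"
    Driver      "fglrx"
    Option      "XAANoOffscreenPixmaps" "true"
EndSection
```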

Does it have info on Hybrid cross fire and other.. (0)

Joe The Dragon (967727) | more than 6 years ago | (#22530688)

Does it have info on Hybrid cross fire and other cross fire setups as well?

Re:Does it have info on Hybrid cross fire and othe (1)

somersault (912633) | more than 6 years ago | (#22530890)

dude, let the devs get some single card setups going first before asking about crossfire o_0

Re:Does it have info on Hybrid cross fire and othe (1)

Joe The Dragon (967727) | more than 6 years ago | (#22531436)

It may be easier to add later if they plan for it when they start, rather than hacking it in afterwards.

Re:Does it have info on Hybrid cross fire and othe (1, Informative)

Anonymous Coward | more than 6 years ago | (#22531166)

The R500 cards only have crossfire via the external cable.

Re:Does it have info on Hybrid cross fire and othe (1)

default luser (529332) | more than 6 years ago | (#22547352)

The R500 cards only have crossfire via the external cable.

Not entirely true. The RV570 (x1950 Pro) was the first internal Crossfire-capable GPU. But you're basically right, because every other card in the R500 range had the Crossfire glue logic external to the GPU die, and thus required special "Crossfire Edition" cards.

The RV570 got the internal Crossfire treatment because it was completely redesigned to (1) reduce power consumption and (2) create a cheap midrange card.

lame (-1, Flamebait)

HAVOCtheHedgehog (1235824) | more than 6 years ago | (#22530770)

It's a shame when a subpar company is forced to release its code so we can write our own working drivers because they can't/won't. Eff AMD and their wide variety of crapware.

sad but true (1)

sirmonkey (1056544) | more than 6 years ago | (#22530840)

sad but true. they had there chance years ago. but now. team green all the way.
i'd like to support amd/ati because of there friendlyness to oss but they just can cut it. alltho they are getting better. the R670's are _allright_ (crossfireX is there only hope) . and they should suck it up and release the 411 for that line and not vintage ones that are very quickly becomming very inadituquite.

Re:sad but true (0)

Anonymous Coward | more than 6 years ago | (#22532932)

I can't take the opinion of anyone seriously if they mix up their/there three times in less than a hundred words...

Re:sad but true (0)

Anonymous Coward | more than 6 years ago | (#22533880)

And wth is "inadituquite" ?

Re:sad but true (0)

Anonymous Coward | more than 6 years ago | (#22537408)

It's open-source spelling. The alphabet has been released under the GPL, so he's free to do what he wants with it. You take your closed-source, proprietary tactics back to Redmond, you fascist bully-boy.

Re:lame (1)

Metasquares (555685) | more than 6 years ago | (#22530844)

But it's even worse when they don't release the code.

Not lame at all. (3, Insightful)

Timothy Brownawell (627747) | more than 6 years ago | (#22531146)

It's actually quite nice when they tell us how to write our own drivers, so we're not dependent on them for needed maintenance (bug fixes, updates for newer kernels, etc). Companies can have all sorts of reasons to stop supporting a product, or to provide sub-par support, and being able to write our own drivers means that that isn't a problem.

Not fame at all. (0)

Anonymous Coward | more than 6 years ago | (#22533950)

There's just ONE problem with that argument. The Windows and Mac market. Just because the FOSS group is all gung ho about writing drivers for every piece of hardware known to man doesn't mean that the Windows and Mac side is. Nvidia and ATI will both still have to write binary drivers for them, and that means all the other arguments, e.g. support, legacy, etc., stay the same. All that open sourcing has done is reduce the noise level from a demographic that's demonstrated it can be noisier and harder to please than most [slashdot.org] (squeaky wheel and all that).

This is great and all... (-1, Redundant)

Anonymous Coward | more than 6 years ago | (#22530820)

but I'm holding out for 4D.

Re:This is great and all... (1)

moderatorrater (1095745) | more than 6 years ago | (#22533360)

We all know the equations don't work out well with 4 dimensions. If they're going to start adding more, I think they should go with 10 minimum, maybe 11.

Re:This is great and all... (1)

Zencyde (850968) | more than 6 years ago | (#22534244)

But the Superstring Theory [wikipedia.org] hasn't even been proven yet!

It was about time (1)

joaommp (685612) | more than 6 years ago | (#22530848)

I'm not a fan of open source. Or closed source. I'm purely agnostic in that field. I believe licenses should be chosen on a per-case basis. But hardware makers lose little by opening their software-accessible interface documentation. Actually, they have more to gain, because with open documentation other people can write drivers for software platforms where that piece of hardware has so far been unsupported. That may actually increase their market share.

Re:It was about time (1)

garett_spencley (193892) | more than 6 years ago | (#22530948)

Hardware can be just as competitive an industry as software and from the manufacturers point of view they stand a great deal to lose if their competitors get a hold of their trade secrets.

You might think that how the card works is something that is trivially reverse engineered, but that is not always the case. While I am not a hardware or graphics card expert, I suspect that a lot of chips, GPUs for example, likely have instruction sets and features that the manufacturers don't want their competitors to have access to, and which closed-source drivers can take advantage of without releasing those trade secrets to the public. I'm sure there are lots of better examples too.

I wish this weren't the case. As a Linux user I'm always wanting specs to be open so better drivers can be developed. I'm really happy to see AMD taking this step.

Re:It was about time (1)

joaommp (685612) | more than 6 years ago | (#22530976)

Well, but if you think of it, in the case of such markets, if the problem is the competitors, then having the documentation open or closed will not make a difference... the competitors can simply reverse engineer the driver... it may take longer, but we're talking about multi-billion dollar industries here. They HAVE the resources for that...

Re:It was about time (1)

EvanED (569694) | more than 6 years ago | (#22530978)

Not just that, but graphics drivers are a significant piece of tech in their own right these days.

Know about shader programs? Those are compiled by a JIT compiler in the driver, at runtime. If nVidia or ATi believes that they have a better compiler that implements some optimizations that the other's doesn't, that could make them very reluctant to release the code.
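As a loose analogy (not how any real driver is implemented), runtime compilation of a shader-like program can be sketched in Python: the "shader" arrives as source text, is compiled once, and the compiled form is then run per fragment. The shader expression and blend weights here are made up for illustration.

```python
# Toy analogy of a driver-side shader compiler: source text arrives at
# runtime, is compiled once, and the compiled form is evaluated per "pixel".
shader_src = "0.5 * r + 0.5 * g"  # hypothetical fragment "program"

code = compile(shader_src, "<shader>", "eval")  # the runtime compile step

def run_shader(pixels):
    # Evaluate the compiled program once per input fragment (r, g pair).
    return [eval(code, {"__builtins__": {}}, {"r": r, "g": g})
            for r, g in pixels]

print(run_shader([(1.0, 0.0), (0.0, 1.0)]))  # → [0.5, 0.5]
```

An optimizing compiler at this layer (GPU instruction scheduling, register allocation) is exactly the kind of code a vendor may want to keep private even when the hardware interface itself is documented.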

I would think that's more reason for specs.. (4, Interesting)

Junta (36770) | more than 6 years ago | (#22531116)

I see that as a reason not to open source the existing drivers, but not to preclude releasing the details needed by the open source community to produce an open driver with their own shader programs, which may be lower performance, but good enough for default operation for a lot of distributions.

I find an interesting perspective being hinted at by AMD in this context. That they approach a common open source layer at the low level, and plug in their proprietary 'good stuff' as a replacement for higher layer things. As an example, they feel their powerplay stuff isn't top secret, so putting it at a layer where everyone can bang on it and improve it is ideal for everyone. Same with things like display handling. AMD and nVidia both do bizarre things requiring proprietary tools to configure display hotplug, instead of the full xrandr feature set, which has grown to include display hot plug.

In general, there are *many* things AMD has historically gotten wrong in their drivers. Mostly with respect to power management, suspend, stability with arbitrary kernels/X servers. One thing they seem to do better relative to the open source community is good 3D performance if all the underlying stuff happens to line up. If they can outsource the basic, but potentially widely varying work to the community, it would do wonders if their driver architecture lets them leverage that. And by giving open source 3D developers a chance to create a full stack, it's the best of all worlds. I would be delighted to see the Open Source 3D stack surpass the proprietary stack, but wonder what patents stand in the way of that being permitted...

Re:It was about time (0)

Anonymous Coward | more than 6 years ago | (#22531246)

Interestingly enough, Apple didn't like either one's shader JIT compiler, so for Leopard they use a heavily modified llvm.

Re:It was about time (1)

niteice (793961) | more than 6 years ago | (#22531388)

llvm is only used for CPU fallbacks, not the entire shader engine.

Re:It was about time (1)

EsbenMoseHansen (731150) | more than 6 years ago | (#22536850)

Know about shader programs? Those are compiled by a JIT compiler in the driver, at runtime. If nVidia or ATi believes that they have a better compiler that implements some optimizations that the other's doesn't, that could make them very reluctant to release the code.

It's not really a JIT compiler (at least in OpenGL). Just a compiler. (Well, technically, they are free to implement a JIT compiler, but that would be silly when they have the opportunity to make a real compiler instead.)

It is a part of the driver, though. Compilers are, however, something that we have a lot of in the opensource world, so I have no fears there.

Re:It was about time (1)

mikael (484) | more than 6 years ago | (#22539310)

Anyone involved in the design of ASIC chips has access to chip grinders and electron microscopes, which, while normally used for quality-assurance purposes, can be used to examine competitors' architectures. See Chip art [wikipedia.org] at Wikipedia.

There are standard ways of communicating with hardware (memory-mapped registers, IO ports, DMA transfers), so there isn't much that isn't known already.

Although most of the optimisations in the use of 3D hardware seem to be related to memory mapping, caching, ordering and buffering, which are patented (the OpenGL extension registry lists the restrictions assigned to each extension). These are probably the reason that the drivers remain proprietary.
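The register-style communication mentioned above can be sketched abstractly. This is a hypothetical register file in Python, not any real GPU's layout; a real driver maps the device's physical registers into its address space rather than using a bytearray:

```python
import struct

class RegisterFile:
    """Toy stand-in for a memory-mapped register window (e.g. a PCI BAR)."""

    def __init__(self, size=256):
        self.mem = bytearray(size)  # pretend this is device memory

    def write32(self, offset, value):
        # Little-endian 32-bit register write at a byte offset.
        struct.pack_into("<I", self.mem, offset, value)

    def read32(self, offset):
        return struct.unpack_from("<I", self.mem, offset)[0]

# Hypothetical register layout: 0x00 = control, 0x04 = status.
regs = RegisterFile()
regs.write32(0x00, 0x1)        # set an imaginary "enable" bit
print(hex(regs.read32(0x00)))  # → 0x1
```

Documentation of this kind (register offsets, bit meanings, DMA command formats) is what the AMD release provides; the access pattern itself is standard, as the comment notes.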

Re:It was about time (1)

bluefoxlucid (723572) | more than 6 years ago | (#22576928)

You might think that how the card works is something that is trivially reverse engineered but that is not always the case.

People have written open-source nVidia drivers (i.e. at the GLX level, not the DRI level). This forms the basis of current work in the area; nobody's gotten around to it because nvdriver is good enough, aside from Ubuntu making you install it after first boot. The closed source drivers only put developer focus elsewhere.

Go play with IDA Pro, get a book called "Reversing: Secrets of Reverse Engineering" and jump on "Reverse Engineering Code with IDA Pro" when it comes out. "Rootkits: Subverting the Windows Kernel" was also a favorite of a colleague of mine; more peripheral knowledge in this area.

Hard Line (1)

maz2331 (1104901) | more than 6 years ago | (#22553916)

I'm a real hardliner on the belief that interfaces should not only be unprotectable, but should be required to be released in any and all cases, period, no exceptions.

Way to go AMD (4, Insightful)

schwaang (667808) | more than 6 years ago | (#22530866)

For ages, the FOSS community has said "just give us the specs for your graphics cards and we'll write the drivers". Well it looks like AMD is taking real steps in that direction, and I for one, say Thanks!

According to TFA, the small group at AMD that has spent time clearing the docs for legal issues is going to speak at FOSDEM [phoronix.com], and the maintainer of the open source driver for AMD/ATI graphics (RadeonHD) will be giving an update.

And thanks also to Intel for putting out their 3D graphics specs last month. These are good days for Linux.

Re:Way to go AMD (-1, Troll)

Anonymous Coward | more than 6 years ago | (#22531090)

Well, they get free development from eager developers and more respect from the OSS community. With AMD sort of floundering with an uncompetitive mainstream lineup across the board - CPUs and video cards - I have to wonder if they are just grasping at any hold they can get. Which is not necessarily a bad thing for us. Who cares about motives if it results in better 3D support for *nix.

Re:Way to go AMD (0)

Anonymous Coward | more than 6 years ago | (#22531176)

I wish there was a mod called -1, astroturfing.

AMD right now is the leader in the sub-200 dollar market segment (mainstream), until nvidia can actually start getting out its 9600gt in any real numbers.

Re:Way to go AMD (0)

Anonymous Coward | more than 6 years ago | (#22531356)

AMD didn't have to do this any more (or less) than Intel did. So in the end this is really about the arrival of Linux as a growing force in the marketplace.

(By way of slashdot flamebait, I'd say any other *nix is a beneficiary of what the critical mass of Linux has induced here.)

Re:Way to go AMD (1)

LWATCDR (28044) | more than 6 years ago | (#22531482)

Well, since I have had my doubts that the FOSS community really can write better drivers than AMD or NVidia, this will be interesting. Of course, this is one of those times when I will be very happy to be wrong.

Re:Way to go AMD (4, Insightful)

644bd346996 (1012333) | more than 6 years ago | (#22531684)

Depends on what you mean by better. There's no doubt that the open source drivers will be more stable and have better software compatibility than the proprietary stuff. The 3d performance will really only matter to the Linux gamers (a very small market, that), as the performance should definitely be more than enough for simpler things like compiz, etc.

You should take a look at the existing 3D drivers. The folks reverse-engineering the r300 series did a pretty good job (well enough for it to be the development platform for Xgl). And the open-source drivers also guarantee that the card will continue to work just as well with software written long after the demise of the company (e.g. the 3dfx drivers).

Re:Way to go AMD (2, Informative)

X0563511 (793323) | more than 6 years ago | (#22532050)

Gamers are not the only ones who like 3D acceleration.

Quickly and off the top of my head, here are two big ones:
1. Compiz/Fusion and the like is gaining popularity.
2. Some applications NEED good 3D or they crawl. See Blender for instance.

Of these, I would say gaming would be the least demanding - at least if my assumption that "stable is harder than fast" is correct.

Re:Way to go AMD (4, Insightful)

forkazoo (138186) | more than 6 years ago | (#22532342)

Gamers are not the only ones who like 3D acceleration.

Quickly and off the top of my head, here are two big ones:
1. Compiz/Fusion and the like is gaining popularity.
2. Some applications NEED good 3D or they crawl. See Blender for instance.

Of these, I would say gaming would be the least demanding - at least if my assumption that "stable is harder than fast" is correct.


Sure, Blender needs good OpenGL acceleration. But, nobody is going to be that concerned about getting an extra 1 fps in Blender. If proprietary drivers go twice as fast, or ten times as fast, then the open source devs would look like idiots. If the open source ones are ten percent slower, then 99% of people will be completely satisfied. Games are flashy, and they sell cards, and people will complain about getting killed by somebody with a faster machine because it couldn't possibly have anything to do with lack of skill. In Blender, you just need sufficient speed to work. If the guy next to you has an extra 2 fps, it doesn't make him appreciably more productive, and you certainly can't justify needing to display faster than the refresh rate of the monitor in Blender!

Re:Way to go AMD (1)

X0563511 (793323) | more than 6 years ago | (#22534592)

True, but you missed stability.

If Jimmie Joe Fragger crashes in a match, he gets mad and his team loses the round. That's it.

If Jimmie Joe Modeler crashes after tweaking a model for a time, there is no guarantee he can get it "just right" again - and that is lost productivity rather than just lost time.

And historically... (1)

Junta (36770) | more than 6 years ago | (#22535404)

The open source drivers have had the reliability advantage, so I'm guessing you agree with the perspective of the parent post?

Re:And historically... (1)

X0563511 (793323) | more than 6 years ago | (#22535908)

Well, I'm speaking from the perspective of using a crappy laptop with a crappy ATI chipset. Not even sure if the chipset is related to what they are releasing.

Very unstable, in my particular case.

Re:Way to go AMD (1)

Enleth (947766) | more than 6 years ago | (#22536208)

That's interesting - I've been doing some modeling in Blender on a SiS 661 with the free driver (that is, no DRI at all). The "preview" window was next to useless, but the workspace itself was pretty fast. I was able to model some relatively simple objects and animations without any problems, on a Celeron M 1.5GHz. I've even tried playing with some more complicated scenes downloaded from blender.org examples and tutorials and everything was still fine, if not extra-smooth.

Re:Way to go AMD (1)

X0563511 (793323) | more than 6 years ago | (#22538486)

Well, I'm not that advanced in Blender, but when doing character work and sculpting, wireframe doesn't cut it - I need shading.

Re:Way to go AMD (1)

LWATCDR (28044) | more than 6 years ago | (#22540660)

Better means stable, fast, and full featured.
Currently there are no drivers for modern cards that are not at least in some large part written by the company that produces them. The FOSS Intel driver is mainly written by Intel and actually has several parts of it obscured.
I still doubt that a FOSS driver based just on the documentation will be more stable than the proprietary driver. FOSS isn't magic. I have had stability issues with Firefox and F-Spot. Nothing terrible, but problems nonetheless.
Heck, I would love to buy a good ATI card with FOSS drivers. I need to upgrade my video card soon, so I am hopeful, but I don't believe in magic.

Re:Way to go AMD (0)

Anonymous Coward | more than 6 years ago | (#22531756)

I don't think this is/was necessarily about writing "better" drivers than AMD/nVidia could, just about writing drivers that take advantage of the hardware in one's computer, because nVidia/AMD weren't writing drivers for their hardware on Linux / other systems.

Re:Way to go AMD (1)

Chris Mattern (191822) | more than 6 years ago | (#22532216)

Will they write better drivers than the current commercial drivers for Windows? In terms of sheer performance, probably not. In terms of reliability, maybe. Will they write better drivers than the current Linux open-source drivers? Damn skippy. And as I use the open-source nv driver myself, that's a very good thing as far as I'm concerned.

Re:Way to go AMD (2, Informative)

Alcoholic Synonymous (990318) | more than 6 years ago | (#22533324)

"These are good days for Linux."

These are good days for Xorg, which isn't Linux. Everyone running X will benefit, not just Linux. Linux isn't the only non-Windows platform.

H.264 acceleration included? (2, Insightful)

pyite69 (463042) | more than 6 years ago | (#22530990)

Feature parity with Windows must be the goal if they want to beat NVidia. I hope we can get some sort of media acceleration beyond the stale old XVideo & XV-MC.

Re:H.264 acceleration included? (0)

Anonymous Coward | more than 6 years ago | (#22531700)

They have mentioned h.264 decoding as being one of the things they're opening up with the public release of 'performance library' code (whatever that really means), so maybe open-source developers will be able to pick up code from that and use the 3d card docs being released now to write hardware acceleration code using these new GPUs.

Unfortunately ATI's cards don't match up to what I need in other respects - MergedFB and bigmonitor (I think that's what the other multimonitor ATI project was called) don't work with Compiz and/or 3D acceleration - and I have become quite reliant on multimonitor support over the years.

If ati had something that worked as well as twinview there would be no question as to whose videocard I will buy next week when I upgrade my machine. As things stand now it will probably be another nvidia even though I really want to reward AMD/ati for taking this step forward :(

Re:H.264 acceleration included? (1)

tixxit (1107127) | more than 6 years ago | (#22541214)

AIGLX, Compiz, dual monitors (different resolutions), and fglrx work fine here.

Re:H.264 acceleration included? (0)

Anonymous Coward | more than 6 years ago | (#22532408)

Feature parity with Windows must be the goal if they want to beat NVidia. I hope we can get some sort of media acceleration beyond the stale old XVideo & XV-MC.
The open driver probably won't for one main reason: patents.

Anyway, an Athlon 64 3000+ with a Radeon 9200 can do h.264 at 720p without any issues (VLC, Xine, MPlayer, but Totem (GStreamer backend) chokes). I don't have any content in 1080p, so I don't know about that, but VLC can do 1080p with "MSMPEG4V2" without dropping frames.

Re:H.264 acceleration included? (1)

X0563511 (793323) | more than 6 years ago | (#22535950)

I'm sure someone, somewhere, outside the USA, will write something.

Since I don't give a crap about software patents, I will use it and be happy. Since I'm not the one who would be violating the patent, I don't think I will be in legal trouble (but in this case I don't really care).

Re:H.264 acceleration included? (4, Interesting)

Jah-Wren Ryel (80510) | more than 6 years ago | (#22533150)

I hope we can get some sort of media acceleration beyond the stale old XVideo & XV-MC.
You won't get it, and the reason is DRM.

ATI's cards that have h.264 acceleration (and all kinds of other good stuff like smart de-interlacing, all collectively branded as "UVD") are unlikely to ever have the specs for UVD disclosed, because they integrated the good stuff with the bad stuff (DRM) and are afraid that exposing how to use the good stuff in UVD will also expose how to circumvent the bad stuff on Microsoft Windows systems.

So, once again, those DRM apologists who say that DRM is purely optional - that if you don't want to use it, it won't hurt you - are proven wrong.

On the plus side, the next gen cards will have the DRM broken out into a separate part of the chip so that they can feel safe in publishing the specs for good video stuff while leaving the bad stuff locked away.

One of many such statements by ATI/AMD. [phoronix.com]

Yeeha!!!! (5, Interesting)

Anonymous Coward | more than 6 years ago | (#22531012)

I'm the owner of 5 boxes all with Nvidia graphic cards.
I've been using only Nvidia cards since 2000 because they had
the best 3D graphics card for my Linux box. I was willing to deal
with binary drivers because there was nothing else available to me
at my price range (loooow budget) for 3D graphics.

But.... over the years I would get burned every now and then
when
1) I would upgrade the kernel and then the X server would get borked
because the Nvidia kernel module didn't match the new kernel, or

2) Some funky memory leak in the binary Nvidia module would lock
up my box hard because of some damn NvAgp vs. Agpart setting or
some funky memory speed setting. Of course, this didn't happen with
every Nvidia driver so of course I wouldn't bother writing down
what it took to fix the problem.

Finally when I switched to Debian Linux in fall 2004 and had
beautiful apt-get/synaptic take care of all of my needs I thought
I was done ... until I found out that Nvidia doesn't time its
driver releases with kernel releases so if I wanted to upgrade
my kernel painlessly with apt-get/synaptic I would have to
wait for Nvidia to get off its damn rocking chair playing their
damn banjo and release a driver to go with the newer kernel.

The final straw for me was when all of my 5 nvidia cards were
now listed in the "legacy driver" section. Can you guess what
"legacy driver" means about Nvidia fixing their closed source
driver? Yeah, that's exactly the point.

That's when I started looking around for open source 3d drivers.
I know about Nouveau for Nvidia, but frankly I'm too pissed off
about Nvidia to consider them. Ati had a long history of treating
Linux customers like second class scum. Intel on the other hand
earned the "golden throne" by providing full open source for their
graphic chipsets. So now that I'm looking for getting a dual core
64 bit CPU + 3D graphics chipset, the only viable choice was Intel,
which I was happy to do business with.

Now that Ati has decided to come forth with 3D documentation I'm
willing to give an intel/ATi or AMD/Ati combo serious consideration.

Way to go ATI!!!!

Re:Yeeha!!!! (1)

Sharth (621005) | more than 6 years ago | (#22531108)

Or hell, maybe the kernel devs could make it easier to have binary modules stay compatible from version to version...

Re:Yeeha!!!! (1)

Timothy Brownawell (627747) | more than 6 years ago | (#22531198)

Or hell, maybe the kernel devs could make it easier to have binary modules stay compatible from version to version...
I expect that this will happen automagically once they find a set of interfaces that's actually good enough that it doesn't *need* to be changed to let other parts of the kernel improve...

Re:Yeeha!!!! (2, Interesting)

LWATCDR (28044) | more than 6 years ago | (#22531458)

That isn't the issue. The interfaces are pretty stable; otherwise you couldn't just recompile most drivers when a new kernel comes out. What is missing is a stable binary interface. I am all for a binary interface. The developers don't want one for what I feel are bad reasons. But they are the devs and they get to make that call, even if I don't like it.

Re:Yeeha!!!! (1)

xtracto (837672) | more than 6 years ago | (#22534004)

That isn't the issue. The interfaces are pretty stable; otherwise you couldn't just recompile most drivers when a new kernel comes out.

In one of the most recent Linus Torvalds interviews I read (it was featured here on Slashdot), Linus specifically stated that they also DO NOT guarantee a stable API (yes, not just the ABI, the API) for future versions of the kernel. He gave his reasons and I respect them, but I also think it is not fine. If you have no guarantee that your program will work over at least a five-year life span, then maybe you should choose another platform to invest your resources in. The good thing about Linux (that it is always improving and developing) is also a bad thing, given the instability of the API. Developing a commercial application on such a system will guarantee you *lots* of customer support demand (and in consequence, lots of unhappy customers).

Re:Yeeha!!!! (2, Informative)

Ryan Mallon (689481) | more than 6 years ago | (#22540506)

The internal kernel APIs are subject to change. Functions within the kernel for dealing with lists, interrupts, device drivers etc. can and have changed many times in the past. The API (i.e. the syscall interface) which is exposed to userspace is very stable, and in many cases pre-dates Linux itself.

Typically userspace application developers do not need to worry about changes to the kernel, since the userspace APIs are mostly stable. Drivers within the kernel usually do not need to worry either, since changes which break in-kernel code are generally not accepted. The only people the unstable kernel API really affects are those maintaining out-of-kernel drivers (whether they are binary or source).

Re:Yeeha!!!! (1, Insightful)

Anonymous Coward | more than 6 years ago | (#22531470)

Not to pick a nit, but not being "good enough" isn't the reason the kernel devs have decided not to commit to a stable binary API. It's so that they have total flexibility to use the latest greatest code.

The upside of the Linux way is rapid development, with a constant stream of new features.
The downside is that since every kernel update might break binary compatibility for a previously compiled driver, third-party drivers must be recompiled for every update.

It's definitely a trade-off, and one that more commercially oriented OSes like Solaris or Windows don't make: they commit to binary stability within major versions. Vendors love that because they can compile just one driver for XP and be done with it.

OTOH, the Linux kernel's policy *does* put pressure on third-party drivers to go open-source, like what is *finally* happening for graphics cards after all these years. So three cheers!

Re:Yeeha!!!! (1)

drinkypoo (153816) | more than 6 years ago | (#22531160)

It's interesting you should attack nvidia ('s regrettable lack of open source drivers) in a story with a link to phoronix, which noted a rumor January 10 that nvidia might open source their drivers too [phoronix.com] . Of course, we know what rumors are worth. I for one intend to get a new laptop shortly after the quad cores start rolling out and I plan to get whatever laptop will let me use no closed source drivers. If I can find one. :P

Re:Yeeha!!!! (1)

pipatron (966506) | more than 6 years ago | (#22531966)

Go for anything with an intel chipset, preferably a ThinkPad if they will be upgraded to quad core. Always fun to laugh at the closed-source-binary-driver-hell that is nvidia and ati.

Re:Yeeha!!!! (1)

Aladrin (926209) | more than 6 years ago | (#22532194)

While my Intel GMA 3100 has been -amazing- for compiz, it's been absolute crap for gaming... especially while running compiz. (You can't run OpenGL games and compiz at the same time on Intel; it can't handle it.) The drivers themselves are pretty amazingly solid.

But I've been wanting to game on this computer as well, and I miss my nVidia card. I was just about to break down and buy one... Maybe I'll just wait a while longer and see what happens with ATI's drivers. It would be -so- great to continue not having to deal with restricted drivers and yet have a powerful card.

Kudos to ATI for making good on their promise, and for supporting the open source community in general.

Re:Yeeha!!!! (0)

Anonymous Coward | more than 6 years ago | (#22532604)

FYI the compiz stuff is in the works and while the current dev code isn't stable, it's leaps and bounds faster than it was in 7.0.2/Xorg 1.4

So when the Mesa 7.1 release comes out, expect some significant if not spectacular results, like Quake 3 at 1280x800 getting 30-90 fps on the particular demo I ran, which had previously been showing around 30-40 fps PEAK with 7.0.2... a rather significant change, eh? Too bad it still has artifacting, so it's not ready for mainstream yet, never mind AIGLX support, which AFAIK is broken.

Re:Yeeha!!!! (1)

level_headed_midwest (888889) | more than 6 years ago | (#22532586)

The Intel IGPs may be fine for general desktop usage but anything that uses the IGP to do much more than draw the desktop is MUCH better served by a discrete GPU.

Re:Yeeha!!!! (1)

drinkypoo (153816) | more than 6 years ago | (#22600520)

I am well aware that intel has open source drivers. They are not especially good drivers (not especially bad ones either) and are less stable than nvidia or fglrx in my experience. But the point is well taken. However, I will not be buying a Thinkpad. The prices are astronomical. For the same money I can get an Apple laptop which is much nicer what with the EFI boot and far superior case design.

Re:Yeeha!!!! (1)

turgid (580780) | more than 6 years ago | (#22533986)

1) I would upgrade the kernel and then the X server would get borked because the Nvidia kernel module didn't match the new kernel

I too have been an nVidia customer for Linux since 1999 (I don't do Windows at all). The nVidia driver package has an option to compile a new interface to match your current kernel. It leads you through the options one at a time with yes/no prompts.

The only time I had trouble was when I did a major kernel upgrade and forgot to install the headers for the new kernel. If you're installing as part of a distro (Ubuntu) that should already be done for you when you install the OS. All you need to do is to remember to say yes to the option in the nvidia installer.
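As a sketch, the recovery described above looks something like this on a Debian/Ubuntu-style system. The package name is distro-specific and the installer filename is just a placeholder, not a real release:

```shell
# Install headers matching the running kernel so the vendor installer
# can compile its kernel interface (package name varies by distro).
KVER=$(uname -r)
echo "headers package needed: linux-headers-${KVER}"
# sudo apt-get install "linux-headers-${KVER}"
#
# Then re-run the vendor installer and answer yes when it offers to
# compile a new kernel interface (filename is a placeholder):
# sudo sh NVIDIA-Linux-x86-<version>.run
```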

I have 5 nvidia graphics cards going all the way back to 1999 and all of them still work. I don't play games nowadays, but for the likes of Stellarium they are very useful and very, very fast.

I bought an ATI card a few years ago and was extremely disappointed with the speed and the snowiness of the display. I'll be waiting a couple of years before I consider buying ATI again, when hopefully the drivers will be mature.

Re:Yeeha!!!! (1)

downix (84795) | more than 6 years ago | (#22534338)

You forgot XGI, which has had open documentation for its chips for well over a year now.

So is this all for any chip? (1)

Kjella (173770) | more than 6 years ago | (#22531324)

The 3D programming documentation today is 300 pages and is made up of a programming guide and register specifications. Among the areas covered in this 3D guide are the command processor, vertex shaders, fragment shaders, Hyper-Z, and the various 3D registers.
Maybe the tcore code contains more, but doesn't 300 pages sound small when the previous drops have been 900 pages or so? I'd be very happy if this really is all they need to make a competitive driver (i.e. no hidden features only the closed source driver can use).

Re:So is this all for any chip? (3, Informative)

TravisWatkins (746905) | more than 6 years ago | (#22531480)

The previous releases covered initializing the card, mode setting, 2D output, etc. That's a lot of stuff to cover. These docs are basically just on how to set up the 3D engine and feed it shaders.

Awesome! (0)

Anonymous Coward | more than 6 years ago | (#22531754)

I think this is all that 3D Realms needs to finish Duke Nukem Forever!

Re:Awesome! (1)

BronsCon (927697) | more than 6 years ago | (#22531818)

Ahh, I see! They're going to require a top-end GPU so they can emulate an older GPU on that. That way, everyone sees the game exactly the same, regardless of their hardware. They would need full specs for that, I suppose.

Radeon R600 series, here I come (0)

Anonymous Coward | more than 6 years ago | (#22532066)

Finally, I can buy a new computer and not have to use a PCI Radeon 9250. Although stable drivers probably won't be here until after the R800 comes out. Assuming AMD lasts that long.

But yes, HUZZAH. I intend to buy AMD for my next computer.

Next, make coreboot [coreboot.org] (i.e. LinuxBIOS) the default for the new AMD 8-series motherboard chipset. Hopefully that would endear them to supercomputing farms and open source enthusiasts alike.

One Moment While I Go Dance in the Streets (3, Funny)

darkonc (47285) | more than 6 years ago | (#22532072)

This is the end of the beginning.

Now that AMD/ATI has come over from the Dark Side, I expect that Nvidia and all of the other graphics chip manufacturers are going to be close behind. It may take them a year or two to work out the logistics, but they'll be here.

More and more people are moving over to Linux/BSD and Free/Open software, and letting yourself be locked out of a growing market is the kind of thing that CEOs and CTOs get fired for.

It used to be the case that manufacturers could peacefully close their eyes to the Open Source / Free communities and drink the Microsoft brand Kool-Aid, because all of their competitors were doing the same thing. Now, however, with one of the big guns having committed to solid support of the Open Source universe, their less responsive competitors have a massive flank left open that they are going to have to respond to.

One Moment While I Go blow hot air. (1, Funny)

Anonymous Coward | more than 6 years ago | (#22533904)

"More and more people are moving over to Linux/BSD Free/Open software, and letting yourself be locked out of a growing market is the kind of things that CEOs and CTO's get fired for."

Uh huh. And just how many CEO's and CTO's have been fired for using ATI or Nvidia's binary blob? I suspect the number's between zero and your imagination.

"This is the end of the beginning. "

The total amount of hardware being released with a binary blob, which is still growing, is still greater than the total that has open source drivers.

Re:One Moment While I Go blow hot air. (3, Interesting)

Junta (36770) | more than 6 years ago | (#22535512)

Uh huh. And just how many CEO's and CTO's have been fired for using ATI or Nvidia's binary blob? I suspect the number's between zero and your imagination.
He was suggesting AMD's or Intel's CEO, not 'client' companies. I doubt it would get to C*O level, but I could see leadership being shuffled out of responsibility if they failed, for example, to come up with the right strategy to get their GPUs sold into the HPC market for GPU computing while the competitor did. I.e., if someone takes the open source specs and designs a set of really kick-ass math libraries that cream anything done with nVidia's CUDA, that could lead to a lot of AMD GPUs being moved while nVidia rushes to catch up. I doubt anyone would be fired, though.

The total amount of hardware being released with a binary blob, which is still growing, is still greater than the total that has open source drivers.
Huh? I can count two families with binary blobs as the only option for full functionality: nVidia and AMD. This story hypothetically paves the way for the AMD half to go away, leaving only nVidia for now (rumor has it nVidia will follow suit). There exist some fakeraid cards that have binary-only drivers to use the same format as the firmware supports, but overwhelmingly this is skipped for pure software RAID. There exist a few wireless chipsets without Linux drivers at all, but ndiswrapper has brought over the Windows drivers, so I guess you could call those binary blobs. Even counting all that, you still have countless network adapters, graphics chips (current hardware is mostly Intel on that front), wireless adapters, storage controllers, audio devices, and USB devices which in no way require a binary blob. The binary blob portion of Linux support is a small minority.

Re:One Moment While I Go blow hot air. (1)

darkonc (47285) | more than 6 years ago | (#22569226)

In fact, if one chooses their graphics processor well, it's actually pretty rare to get a random box that has hardware requiring binary blobs for Linux functionality.

Too late (1, Funny)

Anonymous Coward | more than 6 years ago | (#22532150)

I ordered an Nvidia card yesterday.

Re:Too late (2, Interesting)

Solra Bizna (716281) | more than 6 years ago | (#22532984)

I've been lamenting for years that the R300 card in my G4 (now a G5, long story) would never get specs. I figured they'd start releasing only specs for R500 and up. So when I read this story, I LITERALLY jumped for joy. I'm so happy that I'm switching from nVidia to ATI in my next custom Linux box.

-:sigma.SB

Re:Too late (1)

MrHanky (141717) | more than 6 years ago | (#22533874)

But the free R300 driver is pretty good, at least far better than ATI's proprietary fglrx. Maybe not as fast for 3d, but much better for everything else, like video (I can no longer play back 720p x264 after upgrading to an RV570). Or is that driver, too, i386 only?

Re:Too late (1)

Bert64 (520050) | more than 6 years ago | (#22533878)

There's really no reason not to release specs for older cards; they've long been surpassed on the performance front, but these older chips are widely used in servers and embedded devices because they're cheap and still more than capable of doing the job.

What's left? (3, Insightful)

sudog (101964) | more than 6 years ago | (#22532552)

So what's left before the complete documentation sets are in our hands?

Re:What's left? (1)

z0M6 (1103593) | more than 6 years ago | (#22533754)

You won't get UVD because of DRM, but I think you are better off asking Bridgman on the phoronix forums.

What's left?-experience (0)

Anonymous Coward | more than 6 years ago | (#22534678)

Well, a couple of things. One, UVD isn't as big an issue because ATI used to use shaders to implement that feature before the latest series built it in. Now, I've noticed this documentation doesn't cover the latest chip, so FOSS will always play catch-up (and they'll never be free from proprietary hardware if ATI changes their mind). Also note the feature set revolves around DirectX, not OpenGL, so the NVIDIA/ATI/Microsoft trio will still be ahead of FOSS. And lastly, let's hope FOSS 2D/3D driver knowledge is up to the task.

Re:What's left?-experience (4, Interesting)

z0M6 (1103593) | more than 6 years ago | (#22535102)

Actually, r600 documentation is expected in a few months. That can hardly be called catching up compared to how it has been earlier.

Using the GPU to decode H.264 etc. is something I see as quite possible, but it is likely something we will have to implement ourselves (and something I think we are capable of).

Time to support ATI / AMD is NOW! (2, Insightful)

Anonymous Coward | more than 6 years ago | (#22533854)

Until now, I mostly bought and recommended AMD CPUs and Nvidia graphics cards.

I guess it's time to make it AMD / ATI now.

If they have released what we needed to get the drivers made, which is what we have always wanted, it's time we reciprocated by supporting them.

This will show other graphics companies *hint hint* that releasing the specs = good business.

Great! (1)

wcspxyx (120207) | more than 6 years ago | (#22536514)

Maybe now we can finally get a decent Windows driver!

Thank you ATI! (1)

stoanhart (876182) | more than 6 years ago | (#22540408)

You've followed through! My next video card purchase won't be for a while, so there's a good chance that free drivers will be available by then, and you've just got yourself a customer!

I know everyone was skeptical when this was announced some months ago. I thought, "well, it could happen." The silence on the issue lately made me think I had spoken too soon, and I was beginning to wonder where the specs were. Well, here they are.

Thank you ATI!