
AMD Releases Open-Source Radeon HD 7000 Driver

Soulskill posted more than 2 years ago | from the still-waiting-on-the-open-source-sand-wedge dept.


An anonymous reader writes "AMD has publicly released open-source driver code for the Radeon HD 7000 series 'Southern Islands' graphics cards for Linux users. This allows users of AMD's latest generation of Radeon graphics cards to use the open-source Linux driver rather than Catalyst, and there's also early support for AMD's next-generation Fusion APUs."
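A quick way to tell whether a card is actually running on the open-source stack or on Catalyst is to look at the OpenGL vendor/renderer strings. Below is a minimal Python sketch, assuming the common glxinfo utility (from mesa-utils or similar) is installed and an X session is running; the exact strings vary by driver version.

    import subprocess

    def opengl_strings():
        """Return the OpenGL vendor/renderer lines reported by glxinfo.
        The open-source radeon/Gallium stack typically reports a Gallium/Mesa
        renderer, while Catalyst (fglrx) reports ATI's proprietary strings.
        Assumes glxinfo is installed and a display is available."""
        out = subprocess.run(["glxinfo"], capture_output=True, text=True).stdout
        return [line.strip() for line in out.splitlines()
                if line.strip().startswith(("OpenGL vendor", "OpenGL renderer"))]

    if __name__ == "__main__":
        for line in opengl_strings():
            print(line)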


DRM in Linux? (0)

Anonymous Coward | more than 2 years ago | (#39423587)

and Trinity DRM support patches

Since when did Linux support DRM?

WOOOOOOOO!!!!!!! (0)

Anonymous Coward | more than 2 years ago | (#39423591)

YAY! 3D goodness!!!!!!

HA! HA! CAPTCHA "fusing"

Re:WOOOOOOOO!!!!!!! (3, Funny)

Skapare (16644) | more than 2 years ago | (#39423993)

I'm still waiting for 4D.

Re:WOOOOOOOO!!!!!!! (2)

Jappus (1177563) | more than 2 years ago | (#39425533)

I'm still waiting for 4D.

Ahh, so I figure you finally want to see that overhyped second frame.

Let me tell you, after you've seen one, you've pretty much seen them all. ;-)

Bitcoins (0)

Anonymous Coward | more than 2 years ago | (#39423605)

All that matters is what does this mean to all 5 bitcoin miners out there?

Re:Bitcoins (2)

geekd (14774) | more than 2 years ago | (#39423761)

Judging from the average hashrate, there are more than 5 of us. :)

Re:Bitcoins (1, Troll)

viperidaenz (2515578) | more than 2 years ago | (#39423969)

Thanks for wasting electricity in your pointless quest for worthless bits

Re:Bitcoins (2, Funny)

Dagger2 (1177377) | more than 2 years ago | (#39424103)

Thanks for completely misunderstanding Bitcoin.

Re:Bitcoins (0, Redundant)

viperidaenz (2515578) | more than 2 years ago | (#39424123)

I'm sorry, has it not got to the point where the power required to produce bitcoins costs more than anyone is willing to trade for them?
I don't know about you, but when I go to work, I come home richer than when I left.

Re:Bitcoins (1)

Dagger2 (1177377) | more than 2 years ago | (#39424203)

No. And that's not the point of mining anyway. The point of mining is to secure the network; any profit made from it is just the incentive. If your own electricity costs too much for there to be any incentive for you, then just buy the bitcoins outright, since doing that will be cheaper.

Do you also come out richer every time you buy something using Paypal?
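For anyone wondering what the "work" in mining actually is: a brute-force search for a hash below a difficulty target, and redoing that work for every past block is what makes rewriting history impractical. A toy Python sketch of the idea - not the real Bitcoin scheme, which double-SHA-256 hashes an 80-byte block header against a compact-encoded target:

    import hashlib

    def mine(header: bytes, difficulty_bits: int = 16):
        """Search for a nonce such that SHA-256(header || nonce) has at least
        `difficulty_bits` leading zero bits. Raising the difficulty means
        exponentially more hashing, which is the 'work' securing the chain."""
        target = 1 << (256 - difficulty_bits)
        nonce = 0
        while True:
            digest = hashlib.sha256(header + nonce.to_bytes(8, "little")).digest()
            if int.from_bytes(digest, "big") < target:
                return nonce, digest.hex()
            nonce += 1

    print(mine(b"toy block header"))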

Re:Bitcoins (2, Insightful)

TheRaven64 (641858) | more than 2 years ago | (#39425293)

Do you also come out richer every time you buy something using Paypal?

Well, yes, that's the point of trade. I exchange something (in this case money) for something that I perceive to have higher value.

Re:Bitcoins (1)

cas2000 (148703) | more than 2 years ago | (#39445411)

well then, if you're talking about perception of value then the people who mine bitcoins obviously perceive the value of what they are doing to be higher than the value of the electricity consumed.

BTW, lots of people pay good money for worthless shit that they perceive to have value (largely because they're either stupid or easily manipulated by marketing or both) so arguments based on perceived value aren't very convincing.

Re:Bitcoins (1)

TheRaven64 (641858) | more than 2 years ago | (#39449089)

well then, if you're talking about perception of value then the people who mine bitcoins obviously perceive the value of what they are doing to be higher than the value of the electricity consumed.

Except that BitCoins are a medium of exchange. They have no value other than the ability to exchange them for something else. It's like the number in your bank balance: you don't even have coins or notes that may become collectable. The value of the money in your bank account is defined by what it can buy for you, as is the value of BitCoins. Currently, however, I don't believe that there are any things that can be bought for BitCoins that can't be bought for less money than the cost of mining the BitCoins.

arguments based on perceived value aren't very convincing.

Since there is no absolute measure of value, perceived value is the only thing that we can talk about in this context. Value is inherently a subjective quality as it depends on your situation. The value of a cup of water to me is a lot less than the value of exactly the same cup to someone in the middle of a desert.

Re:Bitcoins (1)

Anonymous Coward | more than 2 years ago | (#39424833)

Bitcoin is silly, but how much electricity did you waste making this post and having thousands of people download it?

Re:Bitcoins (1)

viperidaenz (2515578) | more than 2 years ago | (#39435789)

Not to mention the 15 or so mod points people have used on it too....

Llano: 3.3? (3, Interesting)

whoever57 (658626) | more than 2 years ago | (#39423631)

I would be much more interested to know if Llano is fully supported in the 3.3 kernel. With 3.2, if KMS is enabled, the screen blanks as soon as the radeon module is loaded (even before X starts).
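A small sketch of how one might confirm whether radeon actually came up with KMS before the blanking hits; this assumes the module exposes its modeset parameter under the usual sysfs path. (The well-known workaround of booting with radeon.modeset=0 disables KMS entirely, at the cost of the accelerated open driver.)

    from pathlib import Path

    def radeon_kms_state():
        """Read the radeon 'modeset' module parameter from sysfs.
        Returns None if the module isn't loaded; otherwise the raw value:
        0 = KMS off, 1 = KMS on, -1 = auto-detect (usually ends up enabled).
        Assumes the parameter is exported at /sys/module/radeon/parameters/modeset."""
        p = Path("/sys/module/radeon/parameters/modeset")
        if not p.exists():
            return None
        return int(p.read_text().strip())

    if __name__ == "__main__":
        print("radeon modeset =", radeon_kms_state())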

Re:Llano: 3.3? (5, Informative)

wirelessduck (2581819) | more than 2 years ago | (#39423793)

There's still a problem with Llano VGA in Linux 3.3. These bug [launchpad.net] reports [freedesktop.org] seem to indicate that the problem lies with the DP-to-VGA bridge not working. As a workaround, you can use an HDMI/DVI connection instead of VGA.

Re:Llano: 3.3? (0)

Anonymous Coward | more than 2 years ago | (#39424557)

Sadly that doesn't help on a laptop :( Luckily the fglrx driver does not require KMS and works fine.

AMD needs to focus on OS (4, Insightful)

aztektum (170569) | more than 2 years ago | (#39423649)

AMD could help itself a great deal by focusing on open-source support. Intel does a pretty damn good job supporting open-source with drivers, but they lack top-end graphics hardware. nVidia provides a solid binary, but their *NIX support lags behind Windows.

If AMD becomes the #1 graphics hardware vendor on Linux, it could help even out their hot/cold CPU offerings.

Re:AMD needs to focus on OS (0)

Anonymous Coward | more than 2 years ago | (#39423707)

nVidia provides a solid binary, but their *NIX support lags behind Windows.

How you figure? Last I heard from them was that their Linux and Windows drivers practically have the same code base (minus DirectX of course). There are only a handful of small differences between their GL on Windows and their GL on Linux. What specifically is this lagging behind you speak of? Or are you just talking out of your ass?

Re:AMD needs to focus on OS (3, Informative)

master5o1 (1068594) | more than 2 years ago | (#39423743)

Lagging support such as nVidia Optimus not being supported on Linux platforms.

Re:AMD needs to focus on OS (0)

Anonymous Coward | more than 2 years ago | (#39423951)

What does Optimus have to do with GeForce and Tesla?

Supported hardware is supported. Unsupported hardware is not supported. Don't buy unsupported hardware. It is like buying a Skype phone then bitching it doesn't work with Asterisk or other SIP providers.

Re:AMD needs to focus on OS (1)

Alex Belits (437) | more than 2 years ago | (#39424363)

Optimus is THE ONLY CURRENTLY PRODUCED GRAPHICS ADAPTER that is not supported on Linux by its vendor and has no functional open source driver.

Re:AMD needs to focus on OS (0)

Anonymous Coward | more than 2 years ago | (#39426653)

So don't buy it. Why is that difficult? The fact is, NVIDIA's support is awesome and always has been, even while AMD was literally pointing, laughing, and mocking NVIDIA and their Linux users. And even to this day, NVIDIA's drivers are still superior to AMD's.

Get back to us when AMD has actually caught up with NVIDIA in what they do support.

I participated in an AMD market study many years ago - before AMD entered the Linux market. When I asked about support for Linux, I was almost laughed out of the room and was then mocked by the AMD representative when they thought I couldn't see or hear. That unprofessional behavior and view of Linux users in general has always been there - until they realized that NVIDIA had worked their asses off to actually create a new technical (and Linux) computing market. And it's not just that one guy. When AMD wasn't supporting Linux, they were basically giving us all the middle finger and laughing about it. The fact is, they've been very disingenuous toward Linux for a long, long time now.

Add to all that the fact that the merging of the AMD drivers (Win/Linux/Apple) has yet to be completed, unlike what they originally projected, and that says plenty. Linux is still treated as a red-headed stepchild. And you still see Linux-only bugs to such a degree that it's obvious they are not using the same code base for many things.

Get back to us when AMD comes close to supporting Linux users like NVIDIA always has. And no, I don't mean by way of disingenuous lies as is the current case. Until that happens, there is only one company that stands behind Linux and 3D - NVIDIA. Period.

Re:AMD needs to focus on OS (1)

Rutulian (171771) | more than 2 years ago | (#39433855)

I think you must mean ATI. AMD has generally been pretty supportive of Linux, which is why many Linux users were happy when AMD bought ATI. The fact that the AMD driver is still a bit behind just reflects the fact that AMD basically had to start over to develop an open source driver for their cards. AMD is in fact, I think, driving much of the Xorg development around better 3D support because of their drivers (things like KMS and Gallium).

Re:AMD needs to focus on OS (1)

Narishma (822073) | more than 2 years ago | (#39426689)

Optimus isn't a graphics adapter. It's just software in the drivers that switches between the IGP and the discrete GPU on the fly when needed. I thought the reason it wasn't available on Linux was that Xorg simply didn't support hot-swapping of GPUs.

Re:AMD needs to focus on OS (1)

Alex Belits (437) | more than 2 years ago | (#39448137)

I thought the reason it wasn't available on Linux was that Xorg simply didn't support hot-swapping of GPUs.

No. Nvidia itself supports power management on its GPUs, and there is no actual hot-swapping involved: the Nvidia GPU is simply turned on and off as needed, while the rest is handled by the always-running Intel GPU. There is even Bumblebee - an open source Optimus implementation that uses either the proprietary Nvidia driver or the open source Nouveau driver underneath - but without Nvidia being involved it only works on some devices. The only reason Optimus is not universally supported under Linux is Nvidia's refusal to support it.

Re:AMD needs to focus on OS (3, Interesting)

Asic Eng (193332) | more than 2 years ago | (#39424345)

Isn't that problem solved by the bumblebee project? [bumblebee-project.org]

Re:AMD needs to focus on OS (1)

Anonymous Coward | more than 2 years ago | (#39425157)

Disclaimer: my knowledge of this issue dates back 4 months.

Barely. It requires a special command (optirun) to be prepended to any application that requires 3D acceleration, and you don't benefit from the power-saving features Optimus is supposed to offer. As a result, my battery life is shorter on Linux (~1h30 as opposed to ~2h30).

I wanted to show off the smoothness of the Linux experience with a nice Compiz display; instead I had a hard time making basic 3D applications work. I think nVidia lost me over this purchase. Optimus is not advertised anywhere, vendors don't know shit about it, and this issue broke nVidia's good track record of Linux drivers.

Bumblebee saved my ass, indeed, but it is still not the best solution.
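For context, a minimal sketch of the wrapper pattern described above: Bumblebee's optirun is simply prefixed to the command that needs the discrete GPU. This assumes optirun is installed and on PATH; without it the command just runs on the integrated GPU.

    import shutil, subprocess, sys

    def run_on_discrete_gpu(cmd):
        """Run `cmd` (a list such as ["glxgears"]) under Bumblebee's optirun
        when it is available, so it uses the discrete NVIDIA GPU; otherwise
        fall back to running it directly on the integrated GPU."""
        if shutil.which("optirun"):
            cmd = ["optirun"] + cmd
        return subprocess.call(cmd)

    if __name__ == "__main__":
        sys.exit(run_on_discrete_gpu(sys.argv[1:] or ["glxinfo"]))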

Re:AMD needs to focus on OS (1)

Teun (17872) | more than 2 years ago | (#39429959)

It is possible to get Optimus-like power savings with Bumblebee, but you have to use rather experimental nVidia drivers to do so. On this Lenovo W520, powertop gives me around 10-11 watts using the Intel chip and 14-18 when using the nVidia one. The biggest issue is that you need to go nVidia when using a second screen, but that's the case in Windows too, and by that time you're likely on mains power anyway.

Re:AMD needs to focus on OS (2, Informative)

Anonymous Coward | more than 2 years ago | (#39424165)

nVidia's linux support has been solid for, like, a decade. Their blob works so damn well, and has worked well for so damn long, that even if the Magical Code Fairy came and blessed AMD/ATI with perfect *NIX modules, I would be hard-pressed to give up on nVidia. I'm all fidgety, waiting for the nearly-released 600-series cards, and you can totally bet those cards will have first-rate blob support, for us early adopters of the linux persuasion.
Buy a relatively new ATI card, and really, you just hope to Christ that at least the vesa driver works in X, so you can spend the next 10 or 15 hours trying to get the thing semi-functional.
Plug in the nVidia card, run the module installer, and start up X. It is that easy. ATI should supply fiery hoops & circus music with each boxed video card, just to make the installation experience more realistic. Giving open source coders more information to work with will help, but if they want something as awesome/trivial as nVidia's experience, they're going to need a whole new way of doing things. Either give the open source community *everything* it needs, or hire competent people to get the job done properly. It's been a 2-bit effort out of AMD for far too long. Honestly, I don't even consider them for Win machines any more, because I know that the card I choose today will eventually pass through a Lin machine, after another upgrade.

Re:AMD needs to focus on OS (0)

Anonymous Coward | more than 2 years ago | (#39424723)

I agree 100%. nVidia's Linux support is outstanding. Every card I own is based on an nVidia chipset.
No matter the model, if Linux installs, the nVidia blob installs and works.

Re:AMD needs to focus on OS (1)

webheaded (997188) | more than 2 years ago | (#39426509)

I wouldn't choose them for a Windows machine anyway because they make shitty drivers. You're doing yourself a favor... trust me. I've had a 4870x2 for a year or two now and I've never had such hell with drivers. Just imagine your video driver crashing on YOUTUBE. Or simply refusing to play Flash videos properly when hardware acceleration is enabled. Bask in the glory of losing all your Catalyst settings with every single upgrade, or sometimes even just from rebooting your computer.

Re:AMD needs to focus on OS (1)

fast turtle (1118037) | more than 2 years ago | (#39427403)

I've been using a 5670 for over 3 years and never had a problem with their drivers in Windows, as I prefer sticking with the stable version of Catalyst. Hell, if I could, I'd install the bare driver, but that's not the case any longer, and it's Catalyst that has given ATI a bad name, not the driver itself. I have never had a driver failure since I installed the card under Vista. Yes, Vista. When it was released the driver was stable and never crashed. It wasn't the fastest, but then, I'm not a gamer.

For all the Nvidia fanbois out there, remember the Vista debacle? It took Nvidia almost 18 months before their drivers were stable, and how many laptops did they burn up with faulty cards? Another question is what about their segmenting of cards - all those crippled cards sold as mainstream just to get people to upgrade. There were many 6100s available that were, and are, crap. The only reason they outperformed the onboard mobile versions was that they had dedicated memory.

I'm not a gamer or bitcoin miner so I don't give a damn about performance. Instead I buy my cards based on the lack of external power connectors. In other words, they have to work within the power limits of the PCIe bus itself instead of demanding a 1kW PSU just to supply the card. Yep, power demand is important to me, as my current desktop shows (off-the-shelf HP system from Walmart), because it's currently running that Radeon 5670 from the Vista system and doing it on a 250-watt PSU. The total system draw, even when testing with prime95, is a maximum of 200 watts. I've never seen it draw more, and it normally draws around 130-150 watts with the monitor plugged into the same battery backup (APC XS1300).

Re:AMD needs to focus on OS (1)

webheaded (997188) | more than 2 years ago | (#39428221)

It has become increasingly apparent that the 4xxx line was ATi's red-headed stepchild. Everyone with cards AFTER that line seems okay, but everyone with a card in that line has the exact same issues I do. It has, however, been bad enough that I don't think I'm ever going to buy an ATi card again. I'm not even kidding. You have no idea. I guess we'll see, but after all the bullshit I've had to deal with, regardless of whether it's the driver or Catalyst, I am VERY leery about buying another ATi card.

Re:AMD needs to focus on OS (1)

tyrione (134248) | more than 2 years ago | (#39530007)

I'll give it up tomorrow and when the replacement to Bulldozer arrives I am giving up Nvidia. AMD's OpenCL support is far richer and my applications are being designed to leverage OpenCL 1.1/2.0 for Engineering and Solid Modeling.

Re:AMD needs to focus on OS (1)

Anonymous Coward | more than 2 years ago | (#39428029)

nVidia provides a proprietary blob, only for x86.

FTFY.

With their fabs all sold... (0)

unixisc (2429386) | more than 2 years ago | (#39423655)

...methinks that the next logical step for them would be to have a FOSS CPU architecture - their AMD64 (sans all of Intel's 32-bit x86 IP) - and put it out there under an OSS license of their choice for any design house/fab to use, and just live off licensing fees. That way, they don't have to worry about operational details, which they suck @ anyway, and can focus solely on CPU design.

Re:With their fabs all sold... (1)

the_B0fh (208483) | more than 2 years ago | (#39423775)

And this will do what?

Re:With their fabs all sold... (0, Troll)

Anonymous Coward | more than 2 years ago | (#39423887)

Embrace the bankruptcy ... really quickly.

Re:With their fabs all sold... (1)

viperidaenz (2515578) | more than 2 years ago | (#39423979)

If they give away their CPU architecture for free under an open license, how do they charge royalties? Who is going to make an AMD64 CPU with no x86 support?

Re:With their fabs all sold... (2)

unixisc (2429386) | more than 2 years ago | (#39424411)

I wasn't suggesting that they give it away for $0 - I was suggesting that they make the architecture open source, say w/ a license that allows anybody to study the code but charges a certain amount if someone wants to use it for commercial purposes, i.e. to fab a CPU. In other words, they could charge $x to a licensee for signing up, and $y for every chip sold. The licensee then just has to do incremental work on the design in terms of aligning it w/ whichever fabs they're working w/, which won't involve AMD directly. The license could also have a mechanism to pull in any innovations made by the public, sorta the way the GPL does, but it doesn't have to have all, or even most, aspects of the GPL.

As for who would want to make such a chip, it would not be a bad idea to start. Let's say a company wanted to do such a CPU w/o 32-bitness, by the time such a chip was fabbed out and had the sort of yields that could support their market, the minimum RAM that would be available would be 4GB anyway. And as more apps become 64-bit, the lack of 32-bit support won't be a problem, just like the lack of 16-bit support isn't a problem.

Re:With their fabs all sold... (1)

anyanka (1953414) | more than 2 years ago | (#39425205)

I wasn't suggesting that they give it for $0 - I was suggesting that they make the architecture open source, say w/ a license that allows anybody to study the code, but charges a certain amount if someone wants to use for commercial purposes i.e. to fab a CPU.

Also known as *not* making it open source... Unless you're talking about living off the patent licenses.

Re:With their fabs all sold... (1)

unixisc (2429386) | more than 2 years ago | (#39425801)

That's precisely what I'm talking about - living off patent licenses. Also, open source doesn't mean that people are free to do whatever they like - like I mentioned, if they want to use it for commercial purposes, the normal terms & conditions involved in selling can apply. It would be something like the QPL, except that since the average person doesn't just go to a fab and hand them the models, the paid aspect of the commercial usage would be more acceptable than it was w/ Qt/KDE.

Re:With their fabs all sold... (0)

slew (2918) | more than 2 years ago | (#39427151)

Uhm, ARM doesn't "open-source" their architecture and it's pretty successful. The company that used to be called Sun did a community license similar to what you are suggesting with their Sparc core. Which do you think was more successful?

I always find it surprising how the open-source vultures/jackals come out to attack the weak and wounded with their "suggestions" on how to run their business into the ground. It's as if they actually want the weak companies to die so they can feed off the remains. Open source should be considered insurance against a business abandoning a product, not something that keeps a dying company alive just enough to be turned into a zombie patent-troll company... Only a scavenger wants that...

Re:With their fabs all sold... (1)

Rutulian (171771) | more than 2 years ago | (#39433887)

No, that would be "not making it free." Open source does not have to be free, unless you want to charge to view the source instead of just to use it.

Re:With their fabs all sold... (1)

ocularsinister (774024) | more than 2 years ago | (#39425773)

I think it would make more sense for them to license ARM and make a competitor for Tegra. I'd certainly be interested in a netbook/pad based around that technology.

Re:With their fabs all sold... (1)

viperidaenz (2515578) | more than 2 years ago | (#39430583)

I had a quick google of AMD64 and it is just a bunch of extensions to x86. It's pretty much just more registers, everything extended from 32 to 64 bits, and a new addressing mode. Mostly the same instructions. Good luck building an AMD64 CPU without an x86 license from Intel. Even if you could, good luck selling a CPU that can't run any software currently available.

/. -- phoronix -- (0)

Anonymous Coward | more than 2 years ago | (#39423665)

... --> fox "news"?

awesome! (1)

ThorGod (456163) | more than 2 years ago | (#39423719)

I look forward to hearing from actual users how well these drivers work.

They still have a non-free dependency; go /w Intel (2, Interesting)

Anonymous Coward | more than 2 years ago | (#39423739)

I was one of the first people to give AMD credit for making the decision to release specifications when ATI was bought. How naive I was. We shouldn't be supporting AMD until they come clean and release sufficient specifications for use on free operating systems. Intel remains the only company with graphics chipsets that are well supported on GNU/Linux and free operating systems. The ATI 3d acceleration is still dependent on non-free software. Only the 2d works on free systems. Right now they just release some parts. Don't let this fool you. You will still get stuck later should AMD go bankrupt, sell the division, leave the market, or simply decide to end support like all companies eventually do, etc.

Nope. I prefer having an all-around good system to one that is slightly "better". Non-free drivers and firmware affect other parts of the system and I won't subject myself to that punishment. I left Microsoft Windows for a reason. I will NOT be going back.

Re:They still have a non-free dependency; go /w In (3, Interesting)

dmitrygr (736758) | more than 2 years ago | (#39423821)

Don't get too excited. Some Intel chips use PowerVR, which has no OSS driver (from Intel or from anyone else); see here [imgtec.com]

Re:They still have a non-free dependency; go /w In (2)

viperidaenz (2515578) | more than 2 years ago | (#39423987)

Yes, like all the new Atom CPUs.

Re:They still have a non-free dependency; go /w In (1)

serviscope_minor (664417) | more than 2 years ago | (#39424693)

Some intel chips use PowerVR, which has no OSS driver

And no Windows driver either. Obviously, that's hyperbole. The chip does have Windows drivers and Linux drivers. The Windows ones are beyond terrible and the Linux ones were even worse.

This may have changed since I last looked, but I'll bet the Intel partners were furious at being given a dud with such awful drivers.

Re:They still have a non-free dependency; go /w In (1)

tlhIngan (30335) | more than 2 years ago | (#39428763)

Some intel chips use PowerVR, which has no OSS driver

And no Windows driver either. Obviously, that's hyperbole. The chip does have Windows drivers and Linux drivers. The Windows ones are beyond terrible and the Linux ones were even worse.

This may have changed since I last looked, but I'll bet the Intel partners were furious at being given a dud with such awful drivers.

PowerVR doesn't have any open-source drivers - the only ones you get are binary blobs.

Of course, "awful drivers" is interesting, considering an awful lot of smartphones are running the Linux kernel, and an awful lot of them have PowerVR chips powering them.

Yes I'm talking about Android... and PowerVR has been a staple in the ARM world for ages for decent 3D embedded graphics.

Then again, I suppose the big issue is how Intel adapted PowerVR for PCs - because those chips probably don't have PCIe interfaces. And perhaps the Windows driver was awful because they had to adapt the Windows CE one (which has significant differences in DirectX and Direct3D compared to desktop Windows). But as an embedded chip, it's pretty solid.

Re:They still have a non-free dependency; go /w In (1)

serviscope_minor (664417) | more than 2 years ago | (#39439093)

Of course, awful drivers is interesting, considering an awful lot of smartphones are running the Linux kernel, and an awful lot of them have PowerVR chips powering them.

Sure. I was referring to the Intel GMA500, which I had the misfortune of dealing with. It was in a super-ruggedized toughbook. We had both Windows and Linux versions. The graphics drivers were not good on either, with Linux being much worse.

Re:They still have a non-free dependency; go /w In (0)

Anonymous Coward | more than 2 years ago | (#39426499)

+1 just for your signature.

Re:They still have a non-free dependency; go /w In (2)

bzipitidoo (647217) | more than 2 years ago | (#39424007)

I'd switch to AMD permanently and buy a new AMD video card tomorrow if I was sure they're serious. I want decent 3D acceleration in the open source drivers for Linux. Neither Nvidia nor ATI ever delivered on this. The proprietary Catalyst driver is something like 5x the speed of the open source driver. Nvidia is even worse. That's totally unacceptable. Some years ago, ATI announced they were opening up, and I got ready to dump Nvidia. And then... it didn't happen.

Intel? What a joke! Their video performance is so horrid that they can't beat AMD on the dog slow open source driver no matter what driver they use. Until Intel improves dramatically, they're out of the picture.

So I'm not celebrating yet. Sounds like this may well be another empty gesture.

Re:They still have a non-free dependency; go /w In (5, Insightful)

Kjella (173770) | more than 2 years ago | (#39424279)

ATI announced they were opening up, and I got ready to dump Nvidia. And then... it didn't happen.

Actually, that's what did happen: they said they'd open up, and for the most part they have - the instruction set for "decent 3D acceleration" is out there. A decent CPU analogy is that they promised x86_64 specs and you expected GCC. It doesn't magically make a team that's 2-3% the size of the proprietary team 50 times as efficient; worse yet, the hardware radically changes from generation to generation, like now from VLIW to GCN, which basically means starting over. And it continues to expand with geometry shaders, tessellation, new display standards, new chips, etc., so it's a rapidly moving target.

For example, Mesa just got OpenGL 3.0 support last month; the standard was released back in 2008. That's not just the lack of a driver - there wasn't even an implementation to accelerate. Of course you could say that AMD should release their proprietary driver/OpenGL implementation, which would be nice indeed but isn't practical on so many levels, and certainly not something they promised. Your post is essentially why nVidia doesn't want to get involved with OSS: it's "Whaaaaaaa give us specs, we'll write the code" "Okay here's specs" "Whaaaaaaaa performance sucks, write the code too".

Re:They still have a non-free dependency; go /w In (1)

Cajun Hell (725246) | more than 2 years ago | (#39426663)

Of course you could say that AMD should release their proprietary driver/OpenGL implementation which would be nice indeed but isn't practical on so many levels and certainly not something they promised.

That's actually what they need to do if they want to stay relevant, because their competition has done it.

And by competition, I don't mean nVidia. I mean the truly monstrously huge behemoth, whose claws and fangs are already dripping with so much AMD blood, yet who is dismissed every time graphics hardware comes up, because their hardware is merely low-end integrated crap. You know who I mean.

Low-end integrated "crap" is what 95% of the world needs, and with each passing year that fraction increases a little, as people realize that "crap" keeps getting less and less crappy. That's where the real money is. That's even where AMD's own Llano is! About a month from now, people will see the "crap" bar has moved again, crossing yet more people's threshold for what isn't really crap, and God Help AMD a year from now.

"Whaaaaaaaa performance sucks, write the code too"

There's no "Whaa" anymore, because AMD's real competition is writing and releasing the Free code, so instead of crying, people can just buy their amazing hardware. (Amazing, if slightly limited on graphics, and "slightly limited" is itself so subjective!)

Re:They still have a non-free dependency; go /w In (1)

ak3ldama (554026) | more than 2 years ago | (#39427337)

Low-end integrated "crap" is what 95% of the world needs...That's even where AMD's own Llano is!...

Many people do agree with you which is why people are asking if the Llano support within the open source drivers is working yet. (Anecdotal...) I have a moderate 46xx series ATI card in my linux box which I can use happily with the open drivers. Performance isn't top notch but I have never needed 100% of its graphics capabilities. In my Llano laptop it is still all Windows 7. In the future I would be happy with another AMD APU system once the open driver support is better. The performance available with Llano is all I need to be paying for, spending more is wasteful.

Re:They still have a non-free dependency; go /w In (1)

Kjella (173770) | more than 2 years ago | (#39427393)

That's actually what they need to do if they want to stay relevant, because their competition has done it. (...) You know who I mean.

More like the other way around: today AMD and Intel both have OpenGL 3.0 support through Mesa, while AMD has OpenGL 4.2 through Catalyst/fglrx. If AMD went through the trouble of opening their implementation, then Intel would essentially get a free pass to it, not to mention an invaluable lesson in shader optimization tricks, and that'd benefit nVidia too. Even if it were possible, it'd not be in AMD's best interest.

Re:They still have a non-free dependency; go /w In (1)

epyT-R (613989) | more than 2 years ago | (#39436479)

I'm not sure about that.. the architectures are so different that any optimizations made to the shader compiler would be useless to other designs.. hell, even different generations of chips require different optimizations.

Re:They still have a non-free dependency; go /w In (1)

tyrione (134248) | more than 2 years ago | (#39530031)

I couldn't care less whether it's a binary blob or not. I care about full OpenCL 1.x-2.x / OpenGL 4.x stack support for Linux. Taint the kernel all they want. I want solid drivers that just work.

Re:They still have a non-free dependency; go /w In (5, Informative)

Daniel Phillips (238627) | more than 2 years ago | (#39424075)

The ATI 3d acceleration is still dependent on non-free software. Only the 2d works on free systems.

Complete nonsense. I am doing OpenGL development at this very moment using the fully open Radeon driver. Your post has too many inaccuracies to address. If it were possible to retract it, you probably should.

Re:They still have a non-free dependency; go /w In (1)

paskie (539112) | more than 2 years ago | (#39425287)

How can I use acceleration with the Radeon driver and without the appropriate firmware binary blob? I think the GP is taking issue with that.

Re:They still have a non-free dependency; go /w In (1)

MrHanky (141717) | more than 2 years ago | (#39425487)

While "true", the realistic alternative is the firmware blob residing in ROM on the graphics card.

Re:They still have a non-free dependency; go /w In (1)

Daniel Phillips (238627) | more than 2 years ago | (#39432957)

Exactly. I don't have the source code for the processor microcode either. I can live with it, provided the API exposed to the driver is sufficiently complete.

Re:They still have a non-free dependency; go /w In (0)

Anonymous Coward | more than 2 years ago | (#39426945)

Mod parent up, the firmware still remains proprietary. See e.g. http://packages.debian.org/squeeze/firmware-linux-nonfree [debian.org]

Be sure to read the copyright file which includes gems like "Binary redistribution", "All Rights Reserved", "No reverse engineering, decompilation, or disassembly of this Software is permitted"...

Doesn't sound too free to me.

Re:They still have a non-free dependency; go /w In (0)

Anonymous Coward | more than 2 years ago | (#39427547)

Yes, but doesn't *every* PC video card include at least one piece of *proprietary* firmware - the VGA BIOS?

oh? (1)

alienzed (732782) | more than 2 years ago | (#39423785)

consumers everywhere rejoice!

What the hell is a DCE6 display watermark? (1)

Jah-Wren Ryel (80510) | more than 2 years ago | (#39423807)

What the hell does "DCE6 display watermark support" mean?
I googled for it and didn't find anything useful.
It sounds ominously like cinavia [wikipedia.org] for video.

Re:What the hell is a DCE6 display watermark? (1)

Daniel Phillips (238627) | more than 2 years ago | (#39424179)

What the hell does "DCE6 display watermark support" mean?
I googled for it and didn't find anything useful.

Whoa, this one really flies below the Google radar. DCE is part of the recent Radeon architecture [botchco.com] - a programmable display controller that produces the low-level digital signals needed to drive a wide variety of display types. As for "watermark", I did not turn up much on it beyond the patch [spinics.net]. You tell me and we'll both know.

Re:What the hell is a DCE6 display watermark? (1)

Anonymous Coward | more than 2 years ago | (#39425735)

Watermark refers to empty/fill thresholds in the FIFOs between video memory and the display.
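To illustrate the idea: the scanout engine drains a line-buffer FIFO at the pixel clock, and the programmed watermark decides when a refill request goes out to video memory so the FIFO never runs dry. A toy Python model with made-up numbers; real hardware derives the thresholds from dot clock, memory latency and pixel depth:

    class DisplayFifo:
        """Toy model of a display FIFO with a fill-level watermark.
        Scanout drains pixels every tick; when the level falls below the
        watermark, a refill from video memory is requested. If a refill
        arrives too late, the FIFO underflows and the picture glitches."""

        def __init__(self, depth=64, watermark=24):
            self.depth, self.watermark, self.level = depth, watermark, depth

        def scanout(self, pixels):
            self.level -= pixels
            if self.level < 0:
                self.level = 0
                print("underflow: visible corruption/flicker")

        def maybe_refill(self):
            if self.level < self.watermark:
                self.level = self.depth   # pretend the memory request completes instantly

    fifo = DisplayFifo()
    for tick in range(8):
        fifo.scanout(pixels=10)
        fifo.maybe_refill()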

Re:What the hell is a DCE6 display watermark? (1)

Daniel Phillips (238627) | more than 2 years ago | (#39432997)

Watermark refers to empty/fill thresholds in the FIFOs between video memory and the display.

Hah, so it's made-in-linux terminology abuse. Somebody got confused between the correct "highwater mark" and the incorrect "high watermark".

Re:What the hell is a DCE6 display watermark? (2)

Kjella (173770) | more than 2 years ago | (#39424437)

Well, my understanding of the patch is sketchy, but DCE6 is just the display controller chip, and "watermark" in this context seems to be nothing but frame begin/end indicators or timings that depend on the number of pixels to draw, latency, display clock, etc., so you'll get a picture on screen and the display buffer is updated at the right time. It certainly has nothing to do with watermarks in the Cinavia sense.

Previous series? (2, Interesting)

Anonymous Coward | more than 2 years ago | (#39424807)

I am still waiting for a working 5700 series driver.

The closed driver contains lots of bugs and is unstable, the open driver lacks features and has bad fan control. In short, one pile of failure.

some remarks (0)

Anonymous Coward | more than 2 years ago | (#39425431)

ATI is notorious when one speaks about its proprietary drivers. I can even say that there is something wrong with their programming skills. The last 64-bit fglrx which really works with 2.6.37.6 is version 11.08. But why 2.6.37.6? Since 2.6.38 we have had the Linux power issue, and in general, all Fedoras after 14 have GNOME 3 (my wife and children refused to use it). Now I always recommend NVIDIA instead of ATI to my friends.

How convenient (2)

elwin_windleaf (643442) | more than 2 years ago | (#39426393)

I found myself in the market for a graphics card recently, and after the research and hassle of figuring out what has been released as open source, I decided to delay the decision by sticking with an older NVIDIA card I had kicking around.

Now that I know this series of AMD cards is supported with open source drivers, I'm much more comfortable running it in my Linux desktop than my old NVIDIA card, which requires their proprietary drivers.

Re:How convenient (0)

Anonymous Coward | more than 2 years ago | (#39427815)

I ditched my old NVIDIA GeForce 7600GT for a Radeon HD6770 just for the open source driver support and, quite frankly, I couldn't be happier with my decision:

- Gnome 3 works without any crashes, while previously with the nouveau drivers for NVIDIA it crashed like Windows 98 :)
- Open source 3D games like nexuiz, openarena, neverball, etc. are playable out of the box with the open source drivers.
- Movies play with vsync on, i.e. without the annoying "tearing" effect that I had under NVIDIA with nouveau.

If you want good and stable open source driver support _now_, I'd recommend buying a previous gen AMD card like the Radeon HD6770 that I have bought.

"next-Generation Fusion" (1)

bill_mcgonigle (4333) | more than 2 years ago | (#39428097)

I built a desktop system for myself last June with an AMD 'APU' in it. At the time people were talking August for ATI's open source reveal, so I put my old nVidia card in it. It's still there, obviously.

Assuming these parts went to fab before I could buy them, this puts ATI's lead time on open source drivers for new chips at about a year. That's probably 1/3 of the useful life of the part. Hopefully for the last 2/3 I'll be able to take advantage of the power savings.

Serious question: how do they test these chips? Are they really using Windows drivers in the development lab? That seems unnecessarily hard.

wtf do i get the actual code then? (0)

Anonymous Coward | more than 2 years ago | (#39433109)

Apologies for being a noob etc., but where do I actually get the code then? I can't find it anywhere.

I can only wonder why they did this (0)

Anonymous Coward | more than 2 years ago | (#39440985)

I can only wonder why they did this. However, I have an idea [cchtml.com] where to start. Saying that their Linux support is crappy would be an understatement. It is damn-near unusable. What I really don't get is why they fsck up their software when their hardware is so capable. Don't they understand that it's people at universities (mostly using Linux) who decide what compute cards to put into the next supercomputer?
