
AMD Open Sources Their Linux Video API

Roblimo posted more than 3 years ago | from the every-little-open-source-bit-helps dept.

AMD

An anonymous reader writes "AMD has open sourced X-Video Bitstream Acceleration, their API by which they expose the Universal Video Decoder 2 (UVD2) engine on their GPUs under Linux." They may be a little late with this move, and not everything you could wish for is now open source, but it's better than nothing.


64 comments


Okaaaaaay... (4, Insightful)

MrEricSir (398214) | more than 3 years ago | (#35322930)

The ATI drivers for Linux were never perfect, but they worked decently. But ATI/AMD would drop support for older chips that were still in use. The open source community never provided a shim to let these older drivers work with newer builds of X.

Does open sourcing the drivers really fix the compatibility problem? To me, not building a shim suggests a general lack of caring about ATI drivers. Do we really need the source to give a future to aging ATI/AMD chips?

Re:Okaaaaaay... (1)

kangsterizer (1698322) | more than 3 years ago | (#35323108)

Obviously it's not just about "open sourcing the drivers", otherwise we'd have an open-sourced Catalyst and all would be well.
The reason Catalyst works well and the open source drivers don't is that they're not the same code at all, and the Catalyst code is a lot more advanced.

Now they want their view of the standard to be implemented, so they open source it; but there are other standards already out there, so it feels a bit like "just throwing it around hoping it works".

Re:Okaaaaaay... (2)

WorBlux (1751716) | more than 3 years ago | (#35325938)

Not necessarily; the graphics industry is a hotbed of patent litigation waiting to happen. Open sourcing the complete driver would open up a lot of proof for attacks through the courts. Opening up any of it is a huge deal, and shows the continuing shift in manufacturers' willingness to work with the Linux community to provide the best possible experience on the hardware for any potential use. And it's not just ATI on Linux that's broken; it's the OpenGL support, which lags behind even in their Windows drivers.

Re:Okaaaaaay... (2)

PhobosK (2001566) | more than 3 years ago | (#35323262)

Well that is too late I think... Something like using an umbrella after the rain has stopped... :)

I've already had a very bad experience with a couple of laptops with older integrated ATI cards whose Linux support was dropped by ATI two years after they were produced...

So I have learned my lesson very well, and it is: NEVER buy anything even closely related to ATI (though it is now AMD :) )...

No one should make the same mistake twice, should they?

Re:Okaaaaaay... (1)

Bengie (1121981) | more than 3 years ago | (#35323948)

ATI has better drivers and better cards than nVidia for the Windows platform. Too bad they haven't invested much into Linux :*(

My $300 6950 would out-perform a similar $300 card from nVidia, consume less power and run A LOT cooler. Instead, I OC'd it 100 MHz and unlocked a bunch of shaders. Now it wipes the floor with any other $300 nVidia card. The system has been perfectly stable. My friends with nVidia, on the other hand, still have black-screen bugs from three generations ago on their new 470s. They say their next cards are going to be AMD/ATI after many loyal years with nVidia and empty promises of bug fixes.

meh, the grass is always greener on the other side.

Personally, I'm biased against nVidia because they market like Intel. They make a great product, but I loathe them for ethical reasons.

Re:Okaaaaaay... (1)

Smauler (915644) | more than 3 years ago | (#35325484)

My Gigabyte GTX460 is 100% stable and runs relatively cool. Under load it'll sometimes get up to 60-70 degrees (with pretty crappy case cooling ATM - my front fan has given up). The lowest I've ever seen it was 10 degrees. That was in December, when it was cold out, and I don't heat my house when I'm at work. My PC is on 24/7 though... I was surprised it was running so cold - the CPU was at about 15 degrees IIRC. I'm guessing ambient temperature must have been about 5 degrees.

I had a look at all the options when I bought it, and bang for buck the GTX460 was the best card for me. I usually buy just below cutting edge.

Re:Okaaaaaay... (1)

Bengie (1121981) | more than 3 years ago | (#35325620)

Try an Antec 900 for a case. About $100, but hard to beat for its price. My video card went from 90°C to 40°C, but I too had a crappy case.

Other than the lack of tool-less setup, it is an excellent case.

Re:Okaaaaaay... (4, Informative)

slash.duncan (1103465) | more than 3 years ago | (#35323404)

Well, there are the proprietary drivers, which AMD/ATI does what it wants with (dropping support for old chips, etc.), and there are the native xorg/kernel/mesa/drm and now KMS drivers, which are open. The open drivers support hardware at least as far back as the Mach64 and ATI Rage, and while I never used those specific drivers, after I realized what a bad idea the servantware drivers were, based on my experience with the nVidia card I had when I first switched to Linux, I've stuck with the native Radeon drivers. In fact, I was still using a Radeon 9200 (r2xx chip series) until about 14 months ago, when I upgraded to a Radeon hd4650 (r7xx chip series), so I /know/ how well the freedomware support lasts. =:^)

And why would the free/libre and open source (FLOSS) folks build a shim for the servantware driver? The kernel specifically does NOT maintain a stable internal ABI (the external/userland interface is a different story; they go to great lengths to keep that stable), and if anyone's building proprietary drivers on it, it's up to them to maintain their shim between the open and closed stuff as necessary. Rather, the FLOSS folks maintain their native FLOSS drivers.

And while for the leading edge it's arguable that the servantware drivers are better performing and for some months may in fact be the only choice, by the time ATI's dropping driver support, the freedomware drivers tend to be quite stable and mature (altho there was a gap in the r3xx-r5xx time frame after ATI quit cooperating, before AMD bought them and started cooperating with the FLOSS folks again, part of the reason I stuck with the r2xx series so long, but those series are well covered now).

So this /is/ good news, as it should allow the freedomware drivers to better support hardware video accel once the new information is merged in.

Re:Okaaaaaay... (0)

Anonymous Coward | more than 3 years ago | (#35323662)

servantware

freedomware

Yep... time to quit reading Slashdot.

Re:Okaaaaaay... (0)

Anonymous Coward | more than 3 years ago | (#35324014)

lol

Re:Okaaaaaay... (0)

Anonymous Coward | more than 3 years ago | (#35326300)

It's always been like that here. You can always spot the loonies because they use words like "GNU/Linux" and "FLOSS". Nobody else uses these words.

See you over at Linux Hater's Blog!

Re:Okaaaaaay... (2)

next_ghost (1868792) | more than 3 years ago | (#35324368)

X.org and Linux kernel developers don't care about any closed source software. When somebody chooses to release software as closed source, they decide that nobody else can update it. Why should open source developers make their life easier by restricting the pace of development of their own software? Open source developers didn't force them to release the software as closed source. Open source software, on the other hand, can easily be updated by anybody to keep up with the pace of upstream development.

I own 3 generations of ATI hardware (Mach64/Rage 3D Pro, R200/Radeon 8500, M56/Mobility Radeon X1600) and in general, the closed source driver implements more hardware features but, on the other hand, the open source driver is MUCH more stable. ATI is my graphics card brand of choice, but I'd rather get Intel than deal with a closed source driver again.

Re:Okaaaaaay... (0)

MrEricSir (398214) | more than 3 years ago | (#35327374)

This kind of "us vs. them" thinking is a failure both in politics and in software.

If we're more concerned with the licenses than whether or not our computers work, then we've failed as programmers and become lawyers.

Re:Okaaaaaay... (1)

silanea (1241518) | more than 3 years ago | (#35329446)

I disagree. Open Source is not just about the Ubuntu image you can download today, it is about how we create and use software 20 years from today. It took a long time to get hardware vendors to hand out specs or show genuine interest in delivering Free drivers. And here we are, with at least two big players (AMD and Intel) pledging their support and another (NVIDIA) at least playing along somewhat.

Re:Okaaaaaay... (1)

MrEricSir (398214) | more than 3 years ago | (#35333400)

20 years ago, FLOSS advocates were saying the exact same thing.

And yet, my computer's graphics chip STILL doesn't work. I'm sick of the excuses.

Re:Okaaaaaay... (1)

next_ghost (1868792) | more than 3 years ago | (#35336672)

And yet, my computer's graphics chip STILL doesn't work. I'm sick of the excuses.

And which one is that? Because right now, the R300g driver (supporting R300-R500 chips) is about to pass Catalyst with a loud whoosh in 3D performance. It passed Catalyst with a much louder whoosh in 2D performance and stability ages ago. The R600c driver also makes Catalyst eat dust in 2D performance, and 3D works fine on older cards (HD5xxx support is still weak because ATI released specs less than 6 months ago).

Re:Okaaaaaay... (2)

next_ghost (1868792) | more than 3 years ago | (#35329962)

There is no "us vs. them" in this case. There are two software packages, one open source, the other proprietary. Why should developers of the open source package cripple their own software just to keep the proprietary one working? Developers of the propiretary one made the decision to prevent everybody else from contributing fixes and updates. If you're dissatisfied with results when they can't or don't want to keep up with changes in related open source packages, blame the proprietary developers for making wrong decisions.

Re:Okaaaaaay... (1)

glitchvern (468940) | more than 3 years ago | (#35335708)

The ATI drivers for Linux were never perfect, but they worked decently. But ATI/AMD would drop support for older chips that were still in use. The open source community never provided a shim to let these older drivers work with newer builds of X.

Does open sourcing the drivers really fix the compatibility problem? To me, not building a shim suggests a general lack of caring about ATI drivers. Do we really need the source to give a future to aging ATI/AMD chips?

As of January 19, Phoronix puts the average speed of the latest available open-source driver at roughly 70% of the speed of the Catalyst driver before pre-R600 support was discontinued in early 2009. [phoronix.com] This uses composite results from the ATI Radeon X1800XL, Radeon X1800XT, and X1950PRO graphics cards benchmarked on Nexuiz, Warsow, OpenArena, World of Padman, and Urban Terror. These cards use the R300g driver. Newer cards using the R600g driver (cards with HD in the name) are not currently anywhere near these results.

A bit of history:

A long time ago, documents describing the specifications of graphics cards were generally available under NDA to XFree86 developers. Then Nvidia started releasing binary-only drivers. ATI eventually followed suit. The last series with docs available from this era is ATI's R200. The R300, released in 2002, did not have docs released for it. Following the lack of docs, driver development stagnates.

April 6, 2004 XFree86/X.org fork.
After Keith Packard was kicked out of the XFree86 core group and XFree86 switched to a non-GPL-compatible license, a fork ensues. XFree86's project leadership had been basically hostile to developers and had held back growth of the developer base and improvements in the graphics stack for literally years. Following the fork, a renaissance in X server development begins.

July 24, 2006 AMD acquires ATI.
Speculation about open drivers begins.

May 10, 2007 Red Hat Summit
AMD's Henri Richard says something about improving the open source drivers. Speculation becomes a flood of rumors.

September 06, 2007 ATI/AMD's New Open-Source Strategy Explained [phoronix.com]
AMD announces plans to contribute specification documents and code to the open source drivers. By this time successive X.org releases have seen:

  • Removal of XIE, PEX and libxml
  • Window translucency, XDamage, Distributed Multihead X, XFixes, Composite
  • EXA, major source code refactoring. Switch to autotools build system instead of Imake
  • EXA enhancements, KDrive integrated, AIGLX
  • Removal of LBX and the built-in keyboard driver, X-ACE, XCB, autoconfig improvements
  • Input hotplug, output hotplug (RandR 1.2), DTrace probes, PCI domain support.

September 11, 2007 XDS2007 Program [x.org]
The "softpipe" talk by Keith Whitwell of Tungsten Graphics is the earliest reference I can find to Gallium3D. References to Gallium3D show up on Tungsten Graphics website at approximately the same time according to internet archive. Apparently Tungsten Graphics released a softpipe driver (gallium driver for cpu) at this time, along with a "proof of concept" i915 driver.

September 12, 2007 AMD Releases 900+ Pages Of GPU Specs [phoronix.com]
RV630 Register Reference Guide and M56 Register Reference Guide.

January 04, 2008 AMD Releases Additional R600 GPU Programming Documentation [phoronix.com]
M76 and RS690 register guides weighing in at 458 and 422 pages respectively. They contain LVTMA and I2C information not found in previous docs. LVTMA is the second digital output block on the ATI R500/600 series and can handle TMDS and LVDS for DVI/HDMI and LCD panels, respectively.

February 22, 2008 AMD Releases 3D Programming Documentation [phoronix.com]
300 pages. This 3D programming documentation covers the R500 series and even goes back with information on the R300/400 series as well. Document contains programming guide and register specifications. Among the areas covered in this 3D guide are the command processor, vertex shaders, fragment shaders, Hyper-Z, and the various 3D registers. Previous documents have just covered R500/600 card support with mode-setting, LVTMA, TMDS, i2c, and other basic but critical elements.

February 28, 2008 AMD Updates Its 3D Programming Guide [phoronix.com]
Covers more vertex program formats than the v1.1 draft, and is four pages longer.

March 14, 2008 AMD Releases R300 3D Register Guide [phoronix.com]
Just 99 pages long but covers registers for color buffer, fog, geometry assembly, graphics backend, rasterization, clipping, setup unit, texture, fragment shaders, vertex, and Z-Buffer.

The R300 open-source support had largely been reverse-engineered and built upon the R200 open-source support, which came from documentation ATI had released to open-source developers under Non-Disclosure Agreements several years ago. The Radeon R300 series consists of such graphics cards as the Radeon 9500, 9800, X300, X550, and X600 -- both AGP and PCI Express parts.

March 19, 2008 AMD Releases Production Microcode For All Radeon GPUs [phoronix.com]
This is the microcode found in the fglrx driver and it covers the Radeon R100 to R600 product families. This microcode dump can be found in the Mesa/DRM git tree in shared-core/radeon_cp.c. This file is made up of the microcode (arrays made up of hex) for the R100, R200, R300, R420, RS600, RS690, R520, R600, RV610, and RV620. Microcode is low-level instructions for the graphics processor.

April 01, 2008 AMD Releases Revised R500 Document [phoronix.com]
Version 1.3 includes expanded coverage of the Command Processor (CP) found on the R500 graphics processors.

June 11, 2008 AMD Releases R600 GPU Documentation [phoronix.com]
This ISA (Instruction Set Architecture) documentation covers the unified shader block found on the Radeon HD 2000/3000 series and newer. This PDF document is 342 pages long and does go into detail surrounding R600 vertex and geometry shaders.

July 25, 2008 AMD Releases New AtomBIOS Parser [phoronix.com]
The new AtomBIOS parser should be small and clean enough to go into the kernel for kernel mode-setting. It is not going to replace the parser in X; that would be work for no gain.

December 29, 2008 AMD Releases Open-Source R600/700 3D Code [phoronix.com]
Code drop. Documentation was to be released as well, but wasn't fully sanitized before the holidays. The code includes a working DRM, working EXA acceleration, an initial X-Video implementation and the working r600_demo program. It covers the R600 and R700 series.

January 26, 2009 AMD Releases R600/700 3D Documentation [phoronix.com]
The R600 3D register guide is 166 pages long and covers R600 shader instructions, R700 shader instructions, shader textures, and various other registers needed to program a 3D graphics driver.

March 29, 2009 AMD Releases R700 Instruction Set Architecture [phoronix.com]
The new R700 ISA documentation contains 392 pages' worth of information on various topics, particularly the stream processing abilities of the R700 family.

April 18, 2009 AMD Pushes Out New R600/700 3D Code [phoronix.com]
This code will allow open-source 3D acceleration on the Radeon HD 2000, 3000, and 4000 series of graphics cards.

May 07, 2009 AMD Releases R600/700 Programming Guide [phoronix.com]
This 43-page document, entitled "Radeon R6xx/R7xx Acceleration", provides a basic overview of the ASIC architecture and a small programming guide. It also covers the packet definitions and information concerning synchronization and cache flushing for these newest graphics processors. Explained in detail are the second-generation superscalar unified shader architecture, technical changes between the R600 and R670, technical changes between the R670 and R700 series, the R600/700 3D pipeline, and various other topics that excite graphics driver developers.

November 09, 2009 ATI R300 Gallium3D DRI Support Is "Done" [phoronix.com]
The DRI state tracker report now states "done". First R300 Gallium3d component to reach complete status. R600g driver still considered extremely early in development.

December 22, 2009 AMD Publishes Evergreen Shader Documents [phoronix.com]
362 pages. A shader instruction set documentation for the R800 "Evergreen" graphics processors. Titled "Evergreen Family Instruction Set Architecture - Instructions and Microcode."

February 01, 2010 Open-Source ATI Evergreen Support Arrives [phoronix.com]
User-space mode-setting support for the Radeon HD 5000 series GPUs. No 3d or even 2d acceleration. No kernel mode setting.

February 09, 2010 There's Evergreen KMS Support & More To Test [phoronix.com]
Includes new I2C code that supports the hardware I2C engines found on Radeon graphics cards and exposes it to user-space, a PLL algorithm rework, DRM power management support, basic Evergreen "R800" KMS support, and various other fixes and new additions. Still no 3d or even 2d acceleration.

April 08, 2010 Open-Source ATI Evergreen Acceleration Builds Up [phoronix.com]
New kernel DRM code that adds support for the command processor, interrupts, and graphics initialization on the Evergreen ASICs. New microcode. Support for power tables. Still no 3d or even 2d acceleration, but the foundation for acceleration is now laid down.

July 22, 2010 Intel's Preparing To Push Its New GLSL Compiler Into Mesa [phoronix.com]
New GL shading language compiler for Mesa dubbed "GLSL2"

August 17, 2010 Intel's GLSL2 Branch Is Merged To Mesa Master [phoronix.com]
The goal is to support current and future versions of the GL Shading Language required for OpenGL 3.x/4.x support; the current target is GLSL 1.3. It is also faster than the previous compiler and addresses tons of bugs. This compiler will be used by all drivers, not just Intel's. It adds 85,000 lines to the Mesa code-base.

August 20, 2010 Open-Source 2D, 3D For ATI Radeon HD 5000 Series GPUs [phoronix.com]
2D EXA, X-Video, and OpenGL acceleration. 332 days after the first Evergreen graphics cards were released, the public finally has the first open-source hardware-acceleration support for ATI Radeon HD 5400/5500/5600/5700/5800/5900 series ASICs. This commit ends up adding over 12,000 lines of code to the R600 DRI driver. It may not be close to the same speed and feature parity as the drivers that support earlier product generations -- even the previous-generation Radeon HD 4000 series, which has quite decent open-source 3D support within its classic Mesa driver. This is a first-run OpenGL driver. Following this, the Evergreen register documentation that is released to the public will also be updated. 332 days may seem like a long time between product release and 2D/3D acceleration support, but it is actually a record low: roughly ~31 months from launch to GL2 for 6xx, ~19 months for 7xx, and less than 12 months for Evergreen. New ASICs should see even less time until 2D/3D acceleration support.

November 22, 2010 Open-Source AMD Fusion Driver For Ontario Released [phoronix.com]
This initial open-source support should be at around the same level of support and capabilities as where the open-source Radeon HD 5000 "Evergreen" support is right now. This includes user-space mode-setting, kernel mode-setting, 2D EXA, X-Video, and 3D/OpenGL support. This support comes nearly at the time of product release.

Since 2007 the radeon driver has seen the switch from the memory manager being in X to being in the kernel ("GEM and/or TTM"), the switch from User Mode Setting to Kernel Mode Setting, the switch from "classic" Mesa to "Gallium" Mesa, and the switch from DRI to DRI2. The R300g and R600g drivers that have all these changes in them have not really hit distros yet. Distros have been shipping the R300c and R600c drivers, which have some of these changes. The R300g is now more or less at feature parity with the R300c, has better performance, and should be hitting distros soon. As stated at the beginning of this post it is now running at ~70% of the speed of the Catalyst driver before pre-R600 support was discontinued in early 2009. R600g development lags behind the older, more mature R300g. The R600g only became capable of running glxgears in July 2010 but is now more or less at feature parity with the R600c. All development is now focused on the R300g/R600g drivers. Over the past 3.5 years almost all the re-architecting groundwork for radeon in Mesa/X/DRM/DRI has been completed, and specifications for the R400, R500, R600, R700, and Evergreen have been released and support for them incorporated into the drivers. Developers have been focused on implementing things correctly, and only now are they beginning to do optimization work and see the payoff for the previous 3.5 years of work.

The remaining architectural groundwork is to see Mesa's OpenGL support go from 2.1 to 3.0 to 3.1 to 3.2 to 3.3 to 4.0 to 4.1. A majority of the various GL extensions for 3.0, 3.1, 3.2, and 3.3 are done. [freedesktop.org] The GL shader compiler, on the other hand, is going to need a ton of work. The number of intermediate representations to get from a shader program to actual byte code running on the GPU is fairly large, and GLSL 1.3 is not yet supported.

Re:Okaaaaaay... (1)

pigeon768 (589860) | more than 3 years ago | (#35359366)

The fglrx drivers were terrible. They were ludicrously unstable. From what I understand, they eventually got better, but they would crash the system (ie, straight to POST, not just X11) on a regular basis for years.

Writing a compatibility layer for old drivers is a very tricky business. Specifically, it's the business of the writers of the old drivers. Only they know what arcane deprecated functionality their software uses, not the writers of the interface.

Open sourcing the API to the drivers did fix the problem. The open source drivers [freedesktop.org] support ATI hardware all the way back to the first generation radeon.

this is good news (4, Interesting)

bmalia (583394) | more than 3 years ago | (#35322938)

I have always purchased nVidia cards solely because I knew that they provided Linux drivers. Lately though, the drivers don't seem to work quite right. Might be getting to be about time for me to give ATI a go.

Re:this is good news (1)

Urza9814 (883915) | more than 3 years ago | (#35323874)

I've used nothing but ATI and nothing but Linux for 6+ years now and I've never had any issues. I am of course using the proprietary ATI drivers though. And I never buy the latest, top of the line video cards.

Re:this is good news (0)

Anonymous Coward | more than 3 years ago | (#35325560)

They do work well if you're using a distro like Ubuntu that controls the versions of the drivers and kernels. If you use a constantly up-to-date rolling release like Arch, it's not worth the hassle to keep constantly checking for compatibility with new software. It's horrible to download a new kernel, reboot, and then find out your drivers don't work.

I don't play any games on my computer, so I can use the open source drivers without worrying. It would be nice if they got faster for those times I do want 3D, though.

Re:this is good news (2)

janwedekind (778872) | more than 3 years ago | (#35324010)

If you don't need cutting edge graphics, give Intel Graphics a go. The drivers are free software -> distributors are permitted to integrate them properly -> installation is a breeze.

Re:this is good news (0)

bmcage (785177) | more than 3 years ago | (#35324160)

If you don't need cutting edge graphics, give Intel Graphics a go. The drivers are free software -> distributors are permitted to integrate them properly -> installation is a breeze.

Unfortunately, everybody actually needs cutting edge graphics ...

Re:this is good news (1)

Mr. Slippery (47854) | more than 3 years ago | (#35324412)

Unfortunately, everybody actually needs cutting edge graphics ...

For what?

Re:this is good news (1)

bmcage (785177) | more than 3 years ago | (#35324850)

You don't use those nice desktop effects, I presume. Did you take a look at how Apple promotes its upcoming Lion? My wife's laptop with Intel is a joke compared to mine. I need good graphics for work. Of course, it might be the open source drivers that suck.

Re:this is good news (2)

Knuckles (8964) | more than 3 years ago | (#35325254)

Intel's 3D is plenty capable of desktop effects and stuff like Google Earth. Compiz runs perfectly. When the GP wrote "cutting edge graphics" he was talking about stuff like Crysis 2 and maybe professional 3D use. Few people actually need that.

Re:this is good news (1)

janwedekind (778872) | more than 3 years ago | (#35326002)

My laptop has an Intel Mobile Series 4 graphics card. KDE compositing rarely drops below 25 frames/second.

Re:this is good news (0)

BitZtream (692029) | more than 3 years ago | (#35326210)

Wow, that must suck. Your GUI alone eats so much CPU power that you can't maintain a good frame rate? I can't imagine trying to get anything done on a machine with GUI lag anymore, let alone a GUI which maxes out a CPU which I'd prefer to be using myself.

Re:this is good news (1)

tepples (727027) | more than 3 years ago | (#35327048)

KDE compositing rarely drops below 25 frames/second.

If it drops below 60, it's not keeping pace with your monitor.

Re:this is good news (0)

Anonymous Coward | more than 3 years ago | (#35325228)

In the last 10 years, even now, whenever someone tends to have problems in X with something not working or crashing, it is generally Intel or AMD video chips. All I can say is nVidia has worked exceptionally well for me over the last 10 years. I also have Intel graphics, but those (used to?) crash in the 3D driver.

So yes, you can go and test ATI. Please let us know how it turns out, especially if you use 3D acceleration.

Yet Another API (2)

arivanov (12034) | more than 3 years ago | (#35322940)

Sigh... That makes what? 4 or 5 different APIs.

Original XvMC
Via XvMC VLD extension
Nvidia - three options - legacy, their bitstream and using CUDA
Intel

Sigh... Can't we just get along and agree on a single standard?

Re:Yet Another API (2)

MrEricSir (398214) | more than 3 years ago | (#35322996)

Fractured API standards are the standard in the open source world. Just look at A/V APIs, web rendering APIs, KDE vs. GTK, etc.

As long as they can work together programmatically, it's not necessarily a bad thing to have different APIs.

Re:Yet Another API (2, Insightful)

BitZtream (692029) | more than 3 years ago | (#35326252)

And this is something that most people in the OSS world utterly fail to grasp (it has nothing to do with OSS in general; OSS just allows it to happen more easily).

The bad thing with multiple APIs that all do essentially the same thing is that they give 'choice'. I realize that most OSS users, and indeed most techies, LOVE choice; the rest of the world doesn't. Or rather, it's not so much that they don't like choice, it's that they aren't educated enough about the choices to make them effectively.

GTK vs Qt / GNOME vs KDE is a great example; here it's not the user's choice that's the problem, it's the developers'. Some devs use KDE, some use GNOME, some use their own toolkit, some use X primitives directly. Combine that all together on a desktop and you get one big ugly fucking mess where everything works slightly differently, and the user just ends up frustrated because they don't spend their entire lives having a circle jerk to discuss which GUI toolkit should rule them all.

Multiple choices are NOT ALWAYS A GOOD THING, especially when you don't have the domain-specific knowledge to make the choice, or when someone else who knows nothing about you or your needs is making the choice for you.

The Linux desktop is an example of why choice is not always a good thing.

I know, what I just said is complete blasphemy here, but it's true.

Re:Yet Another API (0)

Anonymous Coward | more than 3 years ago | (#35327540)

On the other hand you fail to see there is no choice.

Look at GTK vs Qt that you noted. Are they really different? Nope, both use X11 with a standard window manager and implement various freedesktop specs. In fact, almost all backend systems are standardized, with user-facing systems popping up left and right as various programmers tinker with their pet projects and as various desktops have NIH syndrome. The reason for the lack of a real video acceleration API on X is the general lack of X design development, combined with the failed XvMC that was obsolete almost as soon as it was released (of note, the more general and older Xv is still in use), giving the impression that creating such an extension was pointless.

People may scream about too much choice and how it stops progress, but in reality most serious choices are already made by higher-up systems standardizing on existing standards and systems. The real problem is when these higher-up systems develop new systems but don't spin them off as standards for other projects to use. This leads to true duplication of effort, and is largely seen in the audio area: various sound systems for each desktop (needlessly), various backends. By comparison the video space is fairly clean: a few codec implementations (not really duplicates), some players using these codecs and shims to use binary codecs, and then desktop-specific systems arising to provide a unified API for using these codecs, which does duplicate work.

So in conclusion: the problem is not really too much choice, but a general lack of development due to no industry standards (after all, open source is about implementing commodities), combined with NIH and the need for "portability" preventing various desktop projects from trying to standardize a new shared, but possibly OS-specific, system, especially when it comes to audio.

Re:Yet Another API (0)

Anonymous Coward | more than 3 years ago | (#35328456)

That's why going Microsoft is for the best. No point in having Apple; even though it looks pretty, it conflicts with what everyone else is using.

Re:Yet Another API (0)

Anonymous Coward | more than 3 years ago | (#35329524)

no one uses X primitives directly anymore...

Re:Yet Another API (1)

cardpuncher (713057) | more than 3 years ago | (#35323072)

And, since it requires the Catalyst driver, I assume it's tied to X (like VDPAU), which means that integrating it into something like DirectFB isn't going to be possible. As far as I can tell the APIs are not only different in detail but different in the way they are abstracted, which means it's quite difficult to have them "work together" in any meaningful way.

Re:Yet Another API (2)

Chuckles08 (1277062) | more than 3 years ago | (#35323128)

This does seem to be a recurring theme in the open source world. On the one hand, it's great to have/try lots of approaches but we need a more effective way of elevating the most successful to the top. Seems like connecting social media more closely with these types of projects would enable discussion and opinions to act as a catalyst for promoting effective solutions.

Re:Yet Another API (1)

Kjella (173770) | more than 3 years ago | (#35323194)

Well XvMC will never do more than MPEG2, so it's not suited for much of anything.

As far as modern codecs go, nVidia has VDPAU, Intel has VA API and ATI has XvBA. Why everyone needs to reinvent the wheel I don't know, but there it is. I figure eventually someone will write the right wrappers so apps only need to deal with one API.

Re:Yet Another API (4, Informative)

u17 (1730558) | more than 3 years ago | (#35323270)

I figure eventually someone will write the right wrappers so apps only need to deal with one API.

VA-API is the wrapper that you speak of. It has multiple backends [freedesktop.org], including backends for Intel cards, VDPAU and XvBA.
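For anyone wondering what "one API, many backends" looks like from the application side, here is a minimal sketch in C (error handling trimmed, assuming libva and its X11 glue are installed; the file name used below is just an example) that opens a VA-API display and asks whichever backend happens to be loaded which decode profiles it exposes:

    /* Minimal sketch: probe decode profiles through VA-API.
     * The same client code runs on whichever backend is installed
     * (Intel's driver, the VDPAU bridge, or the XvBA bridge). */
    #include <stdio.h>
    #include <stdlib.h>
    #include <X11/Xlib.h>
    #include <va/va.h>
    #include <va/va_x11.h>

    int main(void)
    {
        Display *x11 = XOpenDisplay(NULL);
        if (!x11)
            return 1;

        VADisplay va = vaGetDisplay(x11);
        int major = 0, minor = 0;
        if (vaInitialize(va, &major, &minor) != VA_STATUS_SUCCESS)
            return 1;
        printf("VA-API %d.%d via %s\n", major, minor, vaQueryVendorString(va));

        /* Ask the backend which codec profiles the hardware exposes. */
        int num = vaMaxNumProfiles(va);
        VAProfile *profiles = malloc(num * sizeof(*profiles));
        vaQueryConfigProfiles(va, profiles, &num);
        for (int i = 0; i < num; i++)
            if (profiles[i] == VAProfileH264High)
                printf("H.264 High profile decode is available\n");

        free(profiles);
        vaTerminate(va);
        XCloseDisplay(x11);
        return 0;
    }

Build it with something like gcc -o vaprobe vaprobe.c -lva -lva-x11 -lX11; whether the H.264 profile actually shows up depends entirely on which backend and hardware the machine has.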

Re:Yet Another API (1)

simcop2387 (703011) | more than 3 years ago | (#35325968)

In fact, up until this release it was only ever possible to use XvBA through VA-API. Now there's a possibility of that changing, but I doubt it. Instead I believe this will result in a more stable XvBA backend for VA-API, so that it'll end up easier to use.

Re:Yet Another API (1)

Xua (249955) | more than 3 years ago | (#35323302)

Actually Intel's VA-API has backends that use VDPAU and something from fglrx. I am not sure these backends are well tested, but in theory an application that uses VA-API can use acceleration provided by all three major graphics hardware vendors. In addition to decoding, VA-API can be used to accelerate encoding and post-processing filters.

Re:Yet Another API (2)

Ant P. (974313) | more than 3 years ago | (#35323256)

VA-API is the only standard that makes sense to implement [freedesktop.org], unless you like limiting your apps to nVidia/ATI users only, or like writing three times as much code.

Re:Yet Another API (0)

Anonymous Coward | more than 3 years ago | (#35325318)

It's a new one called Gallium.

VDPAU is already an open standard (3, Informative)

CajunArson (465943) | more than 3 years ago | (#35323018)

Nvidia's VDPAU is already an open standard that other video drivers can implement in Linux for video acceleration, so I'm not sure what this buys us. VDPAU as implemented by Nvidia is also about the only video acceleration standard that isn't totally broken and that can accelerate videos beyond MPEG-2 as well.
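For comparison, here is a rough VDPAU sketch in C (error handling trimmed again, assuming libvdpau's X11 header and library are available) that opens a device and asks whether the driver advertises H.264 High profile decoding; beyond device creation, every VDPAU entry point is fetched through get_proc_address rather than linked directly:

    /* Minimal sketch: query H.264 decode capability through VDPAU. */
    #include <stdio.h>
    #include <stdint.h>
    #include <X11/Xlib.h>
    #include <vdpau/vdpau.h>
    #include <vdpau/vdpau_x11.h>

    int main(void)
    {
        Display *x11 = XOpenDisplay(NULL);
        if (!x11)
            return 1;

        VdpDevice device;
        VdpGetProcAddress *get_proc;
        if (vdp_device_create_x11(x11, DefaultScreen(x11), &device, &get_proc) != VDP_STATUS_OK)
            return 1;

        /* VDPAU hands out its entry points through get_proc_address. */
        VdpDecoderQueryCapabilities *query_caps;
        get_proc(device, VDP_FUNC_ID_DECODER_QUERY_CAPABILITIES, (void **)&query_caps);

        VdpBool supported = VDP_FALSE;
        uint32_t max_level, max_macroblocks, max_width, max_height;
        query_caps(device, VDP_DECODER_PROFILE_H264_HIGH, &supported,
                   &max_level, &max_macroblocks, &max_width, &max_height);
        printf("H.264 High decode: %s (up to %ux%u)\n",
               supported ? "yes" : "no",
               (unsigned)max_width, (unsigned)max_height);

        XCloseDisplay(x11);
        return 0;
    }

Link against libvdpau and libX11. On non-NVIDIA hardware this only does something useful if a VDPAU backend exists for the driver in use, which is exactly the fragmentation the thread is complaining about.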

Re:VDPAU is already an open standard (2, Insightful)

Anonymous Coward | more than 3 years ago | (#35323098)

Independence from binary-blob drivers is what it buys us.

Re:VDPAU is already an open standard (1)

Anonymous Coward | more than 3 years ago | (#35323126)

It's not true that VDPAU is the only working standard for Linux. My notebook has an Intel Arrandale GPU, which works with Intel's open-source video driver and decodes HD H.264 through VA-API just fine.

Re:VDPAU is already an open standard (1)

drinkypoo (153816) | more than 3 years ago | (#35323480)

VA-API is an open standard, what's wrong with it? Allegedly both ATI and nVidia cards can serve as backends for it (nVidia through VDPAU, ATI through whatever wacky stuff they use). And of course Intel provided it, so it works with Intel.

nVidia has the only accelerated OpenGL pipeline that works worth a crap on any platform. Now THAT is interesting.

Re:VDPAU is already an open standard (1)

3.1415926535 (243140) | more than 3 years ago | (#35333536)

VA-API is an open standard, what's wrong with it?

Its display functionality is totally inadequate.

Not open sourced (5, Informative)

Kjella (173770) | more than 3 years ago | (#35323130)

This headline is wildly misleading. They've now documented their equivalent of nVidia's VDPAU blob, but it's only available when you run the closed source Catalyst driver. TFA says so quite clearly.

Before anyone starts wondering, this won't do much good for those hoping to see AMD's UVD2 engine supported by the open-source Radeon graphics drivers.

Re:Not open sourced (1)

Anonymous Coward | more than 3 years ago | (#35323224)

It's doubly misleading, because API stands for Application Programming Interface, and an interface in the context of computer programming means headers/protocols, which by definition aren't compiled and don't have any source. Hence you can't "open-source" an interface.

What's the reality? What is usable for MythTV? (0)

Anonymous Coward | more than 3 years ago | (#35324334)

ATI has had video acceleration since ~1996, but never anything usable for Linux. As a long time MythTV user, I gave up on them a long time ago. What's the AMD/ATI reality today? Do they have usable APIs, have any apps supported them?

I tried the Intel video accel stuff, which showed promise as fully open source. It wasn't ready for prime time when I tried it ~2 years ago. Is that usable for HD playback in anything now?

Nvidia, while being closed source, has at least had options for Linux users. XvMC mostly worked, although not without its problems. VDPAU is excellent. The little Atom/Ion platforms with VDPAU are ideal for HTPC use.

Re:What's the reality? What is usable for MythTV? (2)

jedidiah (1196) | more than 3 years ago | (#35325050)

You can use what actually works while you wait for some academic or aesthetic ideal.

Some like to whine about how there are too many APIs around but the actual coders just take care of business. At least the Free Software coders do. That is why the libre tools for Linux are so much better at using this sort of stuff than what proprietary software exists for Linux.

If nvidia is no longer the only game in town then that can only be a good thing.

Intel is left alone (0)

Anonymous Coward | more than 3 years ago | (#35324418)

At last AMD/ATI has shown some interest in Linux... :)

Now only if (1)

tabrnaker (741668) | more than 3 years ago | (#35325330)

they would release their internal hardware-accelerated build of FFmpeg

Re:Now only if (0)

Anonymous Coward | more than 3 years ago | (#35326888)

Probably not possible. It contains 3rd-party IP they can't release.

Re:Now only if (1)

tabrnaker (741668) | more than 3 years ago | (#35327222)

Who cares about the code. I'd settle for a binary.

Nvidia (1)

ruinevil (852677) | more than 3 years ago | (#35325834)

I feel that nVidia uses the same drivers for all operating systems. The core doesn't change; it just has a wrapper to interface with X/DirectX/Quartz. They only update the core significantly once in a while, to the point where it can't interface with older cards. That's why they occasionally have huge issues.

Is this really needed? (1)

TeknoHog (164938) | more than 3 years ago | (#35326340)

Mplayer-uau (basically mplayer with full multithreading) plays 1080p H.264 on an Atom D510 without any hardware decoding. I have given up on GPU video decoding on Linux, since software works so well even on fanless processors.

Re:Is this really needed? (0)

Anonymous Coward | more than 3 years ago | (#35327684)

At what bitrate? Even the fairly old nVidia 9400 can do 1080p H.264 at Blu-ray bitrates, whereas an Atom has gotta be working all-out at gigabyte-per-hour bitrates. While I do think it's inevitable that we'll eventually get to the point where ~10 watt CPUs will be up to the job of obsoleting hardware decoders, that's several years off.

Re:Is this really needed? (1)

godrik (1287354) | more than 3 years ago | (#35327716)

In my experience, Atom processors do not play high-resolution video very well. But even if they did, having a version that uses the GPU would be a significant improvement. It would free the CPU to potentially do something else, including downclocking, which could improve energy efficiency significantly.

On my PDA (Nokia N810), I used to decompress audio using a software lib. When I switched to a lib that uses the internal DSP, my battery life increased by 300%.

Re:Is this really needed? (1)

TeknoHog (164938) | more than 3 years ago | (#35329502)

In my experience, Atom processors do not play high-resolution video very well. But even if they did, having a version that uses the GPU would be a significant improvement. It would free the CPU to potentially do something else, including downclocking, which could improve energy efficiency significantly.

Good point, but the power consumption of a Radeon is not exactly zero, even at idle. If you need to add a discrete GPU to shave off a few CPU watts, I believe the overall consumption increases.

On my PDA (Nokia N810), I used to decompress audio using a software lib. When I switched to a lib that uses the internal DSP, my battery life increased by 300%.

Another good point. Unfortunately, this AMD announcement does not help much in the mobile space.

There is also the general point that hardware acceleration is lagging behind new codec development. Software is much more flexible, even when "hardware" means new drivers/firmware for a general-purpose DSP. In this day and age it would be much more interesting to see codecs written in OpenCL, for example.

Re:Is this really needed? (0)

Anonymous Coward | more than 3 years ago | (#35328686)

It plays 720p H.264 on a single-core 1.6 GHz Atom as well, on top of a single-channel DDR2 memory bus, with minimal post-decoding hardware assist for scaling, YUV->RGB conversion and so forth via Xv.

Atoms derive quite significant advantages from software that's multithreaded. I suppose this means their single-thread performance kinda sucks, which is true compared to something like a 1.33 GHz mobile Core i3.
