
AMD Open-Sources Video Encode Engine

timothy posted about 7 months ago | from the drag-out-the-home-movies dept.

Open Source 34

An anonymous reader writes "The latest feature added to AMD's open-source Radeon DRM graphics driver is VCE video encoding, via a large code drop that happened this morning. With a patched kernel and Mesa, the Radeon driver on their latest-generation hardware can now provide low-latency H.264 video encoding on the GPU rather than the CPU. The support is still being tuned and currently covers only the VCE2 engine, but the patches can be found on the mailing list until they land in the trunk in the coming months."

34 comments

Holy shit (0)

Anonymous Coward | about 7 months ago | (#46150173)

It's almost like this has been on Intel's CPUs for years...

Re:Holy shit (1)

jedidiah (1196) | about 7 months ago | (#46150317)

> It's almost like this has been on Intel's CPUs for years...

Oh really?

So what build of ffmpeg on Linux supports this?

Having some weaker-than-everyone-else's GPU feature and actually having libre support for that feature are two entirely different things.

Re:Holy shit (1)

kry73n (2742191) | about 7 months ago | (#46150639)

FFmpeg (upstream SVN tree >= 2010/01/18 / version 0.6.x and onwards)

for reference: http://www.freedesktop.org/wik... [freedesktop.org]

Re:Holy shit (3, Informative)

Kjella (173770) | about 7 months ago | (#46150843)

They have decoding support, but at least as recently as Google Summer of Code 2013 [multimedia.cx] they didn't have hardware encoding support. That seems to be the fault of the ffmpeg project, though; encoding was added to the VA API in June 2009. Lack of interest?
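
For anyone wondering what "encoding support in the VA API" means from the application side: the sketch below (not from the article, and with an assumed DRM render node path) simply asks the VA-API driver whether it advertises an H.264 encode entrypoint. If VAEntrypointEncSlice isn't listed, there is nothing for ffmpeg or anyone else to hook into.

    /* Minimal sketch: does the VA-API driver expose an H.264 encode entrypoint?
     * Assumes libva and libva-drm; the render node path is an assumption.
     * Build roughly as: gcc check_enc.c -lva -lva-drm */
    #include <fcntl.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>
    #include <va/va.h>
    #include <va/va_drm.h>

    int main(void)
    {
        int fd = open("/dev/dri/renderD128", O_RDWR);
        if (fd < 0) { perror("open"); return 1; }

        VADisplay dpy = vaGetDisplayDRM(fd);
        int major, minor;
        if (vaInitialize(dpy, &major, &minor) != VA_STATUS_SUCCESS) {
            fprintf(stderr, "vaInitialize failed\n");
            return 1;
        }

        int num = vaMaxNumEntrypoints(dpy);
        VAEntrypoint *ep = malloc(num * sizeof(*ep));
        if (vaQueryConfigEntrypoints(dpy, VAProfileH264High, ep, &num) == VA_STATUS_SUCCESS) {
            for (int i = 0; i < num; i++)
                if (ep[i] == VAEntrypointEncSlice)
                    printf("H.264 encode entrypoint available\n");
        }

        free(ep);
        vaTerminate(dpy);
        close(fd);
        return 0;
    }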

Re:Holy shit (1)

wagnerrp (1305589) | about 7 months ago | (#46152293)

Encoding was added to the VAAPI interface, but was never supported by Intel hardware. There's not much sense implementing a protocol when there's no hardware to interface with. You may be looking for the Intel Media SDK [intel.com] , which wasn't made publicly available until the middle of last year.

Re:Holy shit (1)

UnknowingFool (672806) | about 7 months ago | (#46153383)

Encoding was added to the VAAPI interface, but was never supported by Intel hardware. There's not much sense implementing a protocol when there's no hardware to interface with. You may be looking for the Intel Media SDK [intel.com], which wasn't made publicly available until the middle of last year.

Intel says otherwise [01.org]: Hardware encoding is supported with Intel HD 2000 and newer (Sandy Bridge)

aaaaand (0)

Anonymous Coward | about 7 months ago | (#46155063)

If you read the ffmpeg mailing list, they soundly fried the Intel hardware developers for implementing instructions that they felt were useless. They threw a bit of a hissy fit because Intel didn't ask them what they needed and instead surprised them with a bunch of functions that they said weren't going to speed things up.

Rightly or wrongly, that appears to be what happened. I researched this not too long after having gotten a Sandy Bridge CPU in hopes that I'd be able to speed up the BD encoding I was doing. No such luck! Decoding yes, encoding NOPE! Sadly, video cards don't seem to offer the flexibility that is desired either, and tests have shown that software encoding has higher quality than that done by video cards. I'd LOVE to see ffmpeg do something with this...

Re:Holy shit (2)

UnknowingFool (672806) | about 7 months ago | (#46150825)

That's not what he's saying and you know it. He's staying on the topic of this article, which is that Intel has provided open-source H.264 encoding (VAAPI) for years while AMD is only now releasing code. You could do a little research [01.org]. As for ffmpeg, it doesn't support VAAPI, but that is a choice of the ffmpeg developers not to include it; however, through the external libx264 library, you can get ffmpeg to encode H.264. This has nothing to do with Intel, as they released the code. That's like complaining that it's the Khronos Group's fault if PowerVR decided not to support OpenCL on their new GPUs.

Re:Holy shit (0)

Anonymous Coward | about 7 months ago | (#46152911)

https://github.com/lu-zero/libav/tree/qsvenc-simple using the Intel Media SDK, hitting libav11 in a few months.

Did you mean Gpu? GPU is four times as fast. (2)

raymorris (2726007) | about 7 months ago | (#46150395)

You said Intel CPU. Did you mean Intel GPU?
GPU encoding is about four times as fast as CPU. Of course that depends on which GPU is being compared to which CPU.

Re:Did you mean Gpu? GPU is four times as fast. (2)

PhrostyMcByte (589271) | about 7 months ago | (#46150905)

I'm sure he means Intel's Quick Sync hardware codecs, which are integrated on Intel's CPUs and do not use the integrated GPU.

My understanding of AMD's VCE is that it is also a fully separate codec which does not use any GPU compute power, though they do have optimized paths to copy the framebuffer into VCE for low-latency screen capture.

Re:Did you mean Gpu? GPU is four times as fast. (1)

UnknowingFool (672806) | about 7 months ago | (#46151099)

No, I think he means VAAPI, which allows hardware H.264 encoding on Intel GPUs and has been open source for years. The limitation was that not all older hardware supported this particular encoding.

interesting, hardware video chip on the CPU (1)

raymorris (2726007) | about 7 months ago | (#46152101)

That's interesting. For anyone who, like me, wasn't familiar with Quick Sync, it seems to be a dedicated video codec chip on the CPU. In testing, it was much faster than CPU or even GPU encoding, but at lower picture quality per megabyte.

And thus the measurement was nonsense (0)

Anonymous Coward | about 7 months ago | (#46152931)

> it was much faster than CPU or even GPU encoding, but at lower picture quality per megabyte.

That is like saying that car A used much more fuel while driving much faster than car B.
Guess what, I have the fastest video encoder of all:
cat /dev/zero > videofile
Sure, the quality sucks, but you won't find any faster!

not exactly. Can real time vs can't (2)

raymorris (2726007) | about 7 months ago | (#46153233)

I don't think it's quite nonsense. A motorcycle has lower latency than a truck (you can get there faster), but a truck has higher throughput (it can deliver 1000 boxes quicker). That's a useful distinction.

Quick Sync especially can easily encode IN REAL TIME, so it's useful for DVRs, etc. (think instant replay). An unassisted CPU will struggle with real-time encoding. Being able to encode even multiple streams in real time is better than not being able to.

Re:interesting, hardware video chip on the CPU (1)

hattig (47930) | about 7 months ago | (#46160429)

Yes, this is the sort of task-oriented dedicated function block for video decode and encode that has been popular in GPUs, ARM SoCs and now x86 "APUs" for quite some time.

Useless for high-quality encoding, but great for standard consumer uses: quick encoding and transcoding of all those phone videos.

The PS4 probably uses VCE for its TwitchTV integration, for example.

Re:Did you mean Gpu? GPU is four times as fast. (0)

Anonymous Coward | about 7 months ago | (#46151949)

GPU encoding is about four times as fast as CPU.

It's also about six times worse, so that's not surprising.

binary driver (0)

Anonymous Coward | about 7 months ago | (#46150297)

From the mailing list, it appears you still need to link this all to a closed source binary...

Re:binary driver (1)

fuzzyfuzzyfungus (1223518) | about 7 months ago | (#46150383)

Unless the binary mentioned is different from the various others in the directory, it's firmware that the driver needs to load onto the card on startup (which has been the case for Radeons for some time now) rather than a binary driver component that runs within the host OS.

Distribution will be a hassle, as always; but it's not architecturally much different from just adding a chunk of flash to the card and storing the firmware there instead.

Re:binary driver (3, Informative)

Kjella (173770) | about 7 months ago | (#46150741)

From the mailing list, it appears you still need to link this all to a closed source binary...

No, it's firmware/microcode. The driver sends it to the GPU at boot as a blob, and it lives inside the card, hidden from everything. The alternative would be to have an EEPROM and a firmware flashing utility; it'd still be there and closed source, but it wouldn't be in the driver. It's not really part of the programming model; it's hardware initialization/configuration/tweaks to make the hardware work correctly according to the model.
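
To make the firmware/microcode distinction concrete, here is a rough, generic sketch (not the actual radeon code; the filename and the upload helper are placeholders) of how a Linux driver pulls in such a blob at initialization through the kernel's firmware loader and hands it to the hardware:

    /* Schematic fragment of a driver loading a microcode blob at init.
     * The firmware name and upload_to_gpu() are placeholders, not the
     * real radeon implementation. */
    #include <linux/device.h>
    #include <linux/firmware.h>

    static int example_load_vce_firmware(struct device *dev)
    {
        const struct firmware *fw;
        int err;

        /* Looks the blob up under /lib/firmware at runtime. */
        err = request_firmware(&fw, "radeon/EXAMPLE_vce.bin", dev);
        if (err)
            return err;

        /* Hardware-specific step: copy fw->data (fw->size bytes) into the
         * engine and start it. After this the blob lives on the card; none
         * of it executes on the host CPU. */
        /* upload_to_gpu(fw->data, fw->size); */

        release_firmware(fw);
        return 0;
    }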

Nice work AMD (0)

Anonymous Coward | about 7 months ago | (#46150357)

Now if only we actually had support for a codec that isn't patent-encumbered and is actually included as part of standard releases.

Re:Nice work AMD (0)

Anonymous Coward | about 7 months ago | (#46150473)

I guess they have to go with H.264 since it's the major standard. The backwards people who make the decisions for the major software packages, and the users of that software, are reluctant to change, so until then we're stuck with what's popular, for better or worse.

Re:Nice work AMD (1)

symbolset (646467) | about 7 months ago | (#46151923)

It is still a step backward, a last-ditch attempt to rescue the patent-encumbered codec before it becomes extinct. They should let it die, for the good of progress. Who wants a codec backed by a group that sued a mom over patent licensing for publishing a birthday video online?

Quality vs Speed (2)

deimios666 (1040904) | about 7 months ago | (#46150677)

While I applaud AMD for their initiative, there have been tests that show a drop in quality for GPU-encoded H.264 vs. a CPU/software solution.
For details check out: http://www.behardware.com/articles/828-27/h-264-encoding-cpu-vs-gpu-nvidia-cuda-amd-stream-intel-mediasdk-and-x264.html [behardware.com] and http://www.tomshardware.com/reviews/video-transcoding-amd-app-nvidia-cuda-intel-quicksync,2839-13.html [tomshardware.com]

Re:Quality vs Speed (1)

K. S. Kyosuke (729550) | about 7 months ago | (#46150811)

Use HSA, then. If the program running on the GPU won't give identical results, it's a hardware bug.

Re:Quality vs Speed (1)

KingMotley (944240) | about 7 months ago | (#46150979)

Not a hardware bug. The program running on the GPU isn't the same as the one running on the CPU.

Re:Quality vs Speed (0)

Anonymous Coward | about 7 months ago | (#46151405)

That's why he said to use HSA, which fully supports C/C++. Just recompile your code for HSA, then let it run on the GPU. You may need to refactor a bit of code to make sure the data is structured and accessed in a way the GPU likes it, but it should give identical output.

Sure, if you want to ruin it (0)

Anonymous Coward | about 7 months ago | (#46152985)

Yeah, great idea, if your goal is to create the slowest encoder ever. But then you could just use the reference encoder; it is already unusably slow, no need to go coding.

Re:Quality vs Speed (1)

gl4ss (559668) | about 7 months ago | (#46151195)

It's not the same program.

That's the thing... these new codecs don't specify exactly how you should encode.

Re:Quality vs Speed (1)

K. S. Kyosuke (729550) | about 7 months ago | (#46151723)

Indeed, but if source code compiled for the CPU and the same source code compiled for HSA don't give the same results, it's a hardware bug.

Re:Quality vs Speed (1)

tlhIngan (30335) | about 7 months ago | (#46153473)

that's the thing... these new codecs, they don't specify exactly how you should encode.

That's the point. Because codec quality is highly dependent on the tables you use, which is the main selling point of the codecs. In other words, the quality of the final output is strictly determined by the quality of the coder.

The decoder rarely adds quality loss itself - it just reconstructs the signal based on what it is given and few decoders actually have a say in quality decisions.

The coder part though is where the magic happens. It's where all the quality decisions are made. And it's how two coders can compete with each other - perhaps one encodes with higher quality, while another encodes faster.

It's why you use, say, LAME to encode MP3s over some commercial codec - LAME's encoding model has been refined over the years to be so good it can be indistinguishable from the source. Even x264 has pretty good quality.

It's how the coders differentiate themselves on the market. There is no canonical encoding of a signal - there is so much flexibility that the encoder has a bunch of choices it can make that affect the final output.
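
As a concrete illustration of where those encoder-side decisions surface, here is a minimal sketch against the x264 library API (the resolution, preset and CRF values below are arbitrary examples, not anything from this thread). The preset and rate-control knobs are exactly the quality/speed trade-offs the spec leaves to the encoder:

    /* Minimal sketch: the quality/speed decisions live in encoder settings.
     * Arbitrary example values; build roughly as: gcc enc.c -lx264 */
    #include <stdio.h>
    #include <x264.h>

    int main(void)
    {
        x264_param_t param;

        /* "slow" trades encode time for quality; "ultrafast" does the opposite. */
        if (x264_param_default_preset(&param, "slow", NULL) < 0)
            return 1;

        param.i_width  = 1280;
        param.i_height = 720;
        param.i_csp    = X264_CSP_I420;

        /* CRF rate control: lower values mean higher quality and bigger files. */
        param.rc.i_rc_method   = X264_RC_CRF;
        param.rc.f_rf_constant = 20;

        x264_param_apply_profile(&param, "high");

        x264_t *enc = x264_encoder_open(&param);
        if (!enc)
            return 1;

        printf("encoder opened: preset slow, CRF %.0f\n", param.rc.f_rf_constant);
        x264_encoder_close(enc);
        return 0;
    }

None of these choices exist on the decoder side; a compliant decoder just reconstructs whatever bitstream these settings produced.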

You really are a moron (0)

Anonymous Coward | about 7 months ago | (#46151837)

Who ever thought hardware H264 encoding matched the quality of a decent x264 encode? Exactly no-one.

People want access to hardware video encoding for various reasons - and quality is not one of them:
- because they've paid for the hardware, and should be able to access it
- because hardware encoding reduces the workload on the CPU
- because hardware encoding may be 'good enough' for many use cases
- because hardware encoding can be vastly faster on computers lacking the most expensive Intel i7 processors
- because hardware encoding allows low-latency streaming of game output video to remote wireless devices, allowing AAA PC games to be played on Android gaming tablets (tablets with integrated physical joypad controls)

Again, no-one wants their Bluray rips to have been created on AMD (or Intel or Nvidia) hardware encode units. But we don't need you to point out this glaringly obvious fact.

Codecs as far as the eye can see.... (4, Interesting)

ThatsDrDangerToYou (3480047) | about 7 months ago | (#46150703)

If I never see another video codec in my career it will be too soon.

Video decoder verification was the most tedious task I have ever been assigned. Just sayin.

Well that took forever- give AMD no praise (0)

Anonymous Coward | about 7 months ago | (#46151617)

Years ago, ATI famously made multi-media versions of their graphics cards, designed for dual output, where the second output would likely be a TV set. Users needed, understandably, high-quality video decode on the TV output, and this meant access to the video-decode hardware and the resize video filters. At first, ATI's drivers did exactly this, but then one day the new drivers REMOVED all hardware acceleration from the secondary output.

AMD lied about the situation - said it was a 'bug' and promised to put back the functionality. They never did (remember, I'm talking about then, not now). So, what happened?

Microsoft happened. Microsoft determined that ATI was encouraging dual simultaneous use of a single Windows license. One child could be doing homework on the PC, while at the same time the PC could be playing a movie on the TV set for the rest of the family. Nothing illegal about this situation, but Microsoft didn't like it. So Microsoft told ATI to cripple their drivers - an order ATI most certainly did NOT have to follow, but ATI (AMD) always puts their paying customers LAST, behind the interests of corporate requests.

Roll forward to a few years back. AMD refused to allow users to have general access to the hardware video-decode path of ATI graphics hardware. So, if for instance you are doing video editing, you are forced to decode the video streams in software. AMD's excuse for this sickening refusal to allow people access to hardware features they paid for? The lobby groups for film and TV producers claimed that such restrictions would help fight 'piracy'. This despite the fact that AMD already supported the DRM-protected video decode path, designed to prevent access to the raw frame data of DRM-protected commercial video.

Intel and Nvidia BOTH allowed users of their hardware to access video decode functionality for editing and the like. AMD was the ONLY company to refuse access.

It is the EXACT same story with AMD's video encode hardware, AMD's JPEG encode/decode hardware, and AMD's TrueAudio hardware. AMD is the WORST company in the world for respecting the rights of its paying customers. AMD managers love to wine and dine with suits from other companies, and treat these suits as the ONLY voice worth listening to. And these suits have every excuse in the world to request that AMD deny its paying customers open access to the hardware they own.

Today, AMD's biggest buzz is the Mantle API. Despite the fact that this API supports AMD GPU hardware that has been out for years, AMD has stated it has ZERO plans to open the API to the general programming community. At best, AMD has said it MAY consider providing an open SDK at the end of this year. And what does this mean in practice? That ONLY if unbearable commercial pressure falls on AMD (unfavourable market competition) will AMD provide its customers with low-level access to their own hardware.
