Slashdot: News for Nerds

Next-Gen Video Encoding: x265 Tackles HEVC/H.265

timothy posted 1 year,4 days | from the relocating-the-bottlenecks dept.


An anonymous reader writes "Late last night, MulticoreWare released an early alpha build of the x265 library. x265 is intended to be the open source counterpart to the recently released HEVC/H.265 standard, which was approved back in January, much in the same way that x264 is used for H.264 today. Tom's Hardware put x265 through a series of CPU benchmarks and then compared x265 to x264. While x265 is more taxing in terms of CPU utilization, it affords higher quality at any given bit rate, or the same quality at a lower bit rate than x264." (Reader Dputiger points out a comparison at ExtremeTech, too.)



pr0n (-1, Offtopic)

Anonymous Coward | 1 year,4 days | (#44362659)

I can't wait to watch ugly women urinate on each other in unprecedented detail!

When do we get hardware? (0)

Anonymous Coward | 1 year,4 days | (#44362725)

I want to play back a 4k MKV of Ben Hur on my phone! When?!?!

Is there any possibility that the PS4 or Xbox One will have playback capability?

Roku?
Apple TV?

Re:When do we get hardware? (0)

Anonymous Coward | 1 year,4 days | (#44363543)

I'm sure an Android phone will have it first.

And it will be stuttery as hell.

With no chance of upgrades.

Re:When do we get hardware? (0)

Anonymous Coward | 1 year,3 days | (#44368953)

> With no chance of upgrades.

Unless you have a custom ROM.

This is great news! (5, Interesting)

ShooterNeo (555040) | 1 year,4 days | (#44362731)

25-35% less file size for the same quality is an incredible advance. Obviously the task of improving compression algorithms is going to get enormously harder as file sizes shrink and the data approaches higher entropy. I'm in fact amazed that an advance this big is even possible; apparently x264 is nowhere near the theoretical limits for (lossy) video compression.

Re:This is great news! (1)

Anonymous Coward | 1 year,4 days | (#44362909)

Storage isn't a problem, it's the cheapest part of the equation. Energy consumption is the biggest technical challenge due to the global domination of mobile devices and the current limitations in energy storage.

Re:This is great news! (0)

Anonymous Coward | 1 year,4 days | (#44362997)

The OP never stated storage was a problem. I guess he had assumed that everyone could see the elephant in the room: bandwidth.

Re:This is great news! (4, Insightful)

gl4ss (559668) | 1 year,4 days | (#44363003)

Storage isn't a problem, it's the cheapest part of the equation. Energy consumption is the biggest technical challenge due to the global domination of mobile devices and the current limitations in energy storage.

_transferring_ is still very much an issue though.

Re:This is great news! (3, Insightful)

jedidiah (1196) | 1 year,4 days | (#44363027)

Storage isn't even the problem when it comes to file size, network bandwidth is. The generally poor quality of broadband and even cable ultimately relates to the size of the file. Network performance and bandwidth caps are the real choke point.

Streams get over compressed to the point that even an aggressively transcoded DVD beats the snot out of them in terms of quality. Forget about a raw BluRay stream.

Re:This is great news! (2)

tttonyyy (726776) | 1 year,4 days | (#44366457)

Ultimately video over IP (which sounds like a bad plan to start off with) is all about the connection. Modern broadcasters use adaptive streaming: the same video is encoded at a variety of bitrates and resolutions and made available to playback clients. The client assesses live buffer fill and dynamically decides between low bitrate/low quality and high bitrate/high quality depending on how the link is performing, fetching small or large files off the server as appropriate. It works very well, and the user is left completely unaware that it's happening.
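The buffer-driven switching described above can be sketched in a few lines. The bitrate ladder, thresholds, and headroom factor below are invented for illustration, not taken from any real player:

```python
# Hypothetical sketch of client-side adaptive bitrate selection: pick the
# next segment's variant from buffer fullness and measured throughput.
# The ladder and thresholds are illustrative only.

# (bitrate in kbit/s, resolution) pairs, lowest to highest
LADDER = [(400, "426x240"), (1200, "854x480"),
          (3500, "1280x720"), (6000, "1920x1080")]

def choose_variant(buffer_seconds, measured_kbps):
    """Return the highest ladder entry the link can sustain, backing off
    to the lowest rung when the buffer is close to underrun."""
    if buffer_seconds < 5:           # about to stall: grab the cheapest segment
        return LADDER[0]
    safe_kbps = measured_kbps * 0.8  # leave 20% headroom for throughput jitter
    best = LADDER[0]
    for rung in LADDER:
        if rung[0] <= safe_kbps:
            best = rung
    return best
```

A client running this after every downloaded segment naturally ratchets quality up on a good link and drops straight to the cheapest rendition when playback is about to stall.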

Curiously enough the H264 standard was very forward thinking in this respect and there are lots of clever ways to dynamically control streaming - none of which anyone uses as it's complicated to implement compared to just encoding the same thing at different bitrates.

H265/HEVC is the logical progression in computational complexity vs compression efficiency - definitely here to stay in the video compression industry.

Re:This is great news! (1)

jedidiah (1196) | 1 year,3 days | (#44369391)

It isn't even really here yet.

You know what's really here to stay? MPEG2. Stuff lingers because content is available in that format. Until content is actually available in H.265, fannish pronouncements are all premature.

Re:This is great news! (0)

Anonymous Coward | 1 year,4 days | (#44363079)

Routing the data between server and client has an energy cost too, especially when that involves both long paths and transmitting over wifi or mobile phone networks. And the cited difference is for encoding, not decoding; decoding is what will be done hundreds of times for each encode, and is therefore the more important of the two.

Re:This is great news! (3, Insightful)

NFN_NLN (633283) | 1 year,4 days | (#44363091)

Storage isn't a problem, it's the cheapest part of the equation. Energy consumption is the biggest technical challenge due to the global domination of mobile devices and the current limitations in energy storage.

The summary talks about an increase in CPU usage. If they use dedicated H.265 chips in the future (much like they use H.264 chips now), can they not optimize the hardware to minimize CPU and power use? I'm just wondering from the perspective of mobile/phone users if H.265 is going to dominate or will H.264 still be the standard for mobile.

Re:This is great news! (1)

marcello_dl (667940) | 1 year,4 days | (#44363591)

I think there will be a race to adopt this and to flood the market with h265 material; the newer machines will crunch through those videos while the older ones will suffer. Big push for updating those toys.

Personally I am using xvid4 for my stuff; at high bitrates it is not much worse than h264, and I have fewer patent concerns. I wish somebody came up with cams sporting Dirac or Theora encoding. Most source material is now compressed in mpeg2/4/10 formats, and transcoding makes those codecs perform worse.

Re:This is great news! (1)

evilviper (135110) | 1 year,4 days | (#44366355)

the newer machines will crunch through those videos while the older one will suffer. Big push for updating those toys.

It took YEARS after the release of H.264 before ANY CPUs could do realtime playback of high-def videos. That's why there was such a big push for H.264 decoding in GPUs, but that came years later.

Hardware Decode (5, Insightful)

LordCrank (74800) | 1 year,4 days | (#44364267)

If it's anything like H.264/x264 then I expect to have the hardware to decode H.265/x265 in my laptop about 2 years after movies and tv shows are being distributed in this format, but 2 years before there are any linux drivers for the hardware decoders.

Re:Hardware Decode (1)

jedidiah (1196) | 1 year,4 days | (#44365947)

By the time I had any need of a hardware h264 decoder in Linux, the drivers were readily available and used by all of the relevant software.

The notable slacker here was Adobe. These are the losers that whine about clanlib while the "hobbyists" just take care of business.

The community and (at least a subset of) the hardware vendors were well prepared in ample time.

Current performance of current entry level CPUs may already make the point moot.

Re:Hardware Decode (1)

LordCrank (74800) | 1 year,4 days | (#44366063)

I had a laptop that would barely be able to do hdtv, and couldn't handle 720p content without significant stuttering. I believe the issue was that I had done enough research to know its graphics card supported decoding, but not enough to know that the particular brand making the graphics card had a long history of terrible linux support.

As far as whether hardware decoding is even necessary, it's something that I'd look at as an issue for set top boxes/low power media centers. It seems XBMC has generally been able to get linux drivers for these released fairly quickly in recent years, since the hardware manufacturers are aware that a significant portion of their potential customers are looking for a device to run XBMC.

Re:Hardware Decode (2)

evilviper (135110) | 1 year,4 days | (#44366337)

By the time I had any need of a hardware h264 decoder in Linux, the drivers were readily available and used by all of the relevant software.

So? You were a VERY LATE ADOPTER of H.264, which doesn't remotely represent the rest of the world. H.264 has been around since 2003, and people have been struggling to watch H.264 videos in realtime on top-of-the-line CPUs from the start, and failing miserably for years. It took several years to get it decoded by GPUs even on Windows, and longer still before Linux could do it with VDPAU or other frameworks. GP was absolutely correct about how it progressed.

Re:Hardware Decode (1)

pantaril (1624521) | 1 year,3 days | (#44369775)

By the time I had any need of a hardware h264 decoder in Linux, the drivers were readily available and used by all of the relevant software.

Hardware accelerated video is still a huge problem on linux if you don't own one of the few supported brands of graphics cards (nvidia VDPAU with proprietary drivers, and intel HD graphics VAAPI open source drivers). Recently I wanted to purchase an odroid-u2 [archlinuxarm.org] ARM computer with a Mali-400 graphics accelerator, but the driver support is not there, and unfortunately it's not an exception.

Re:Hardware Decode (0)

twohorse (682282) | 1 year,4 days | (#44367257)

Only a few years ago, the only consumer versions of Linux were in routers and TVs. With the proliferation of mobile Linux media devices, expect a Linux driver before a Windows one.

Re:This is great news! (1)

evilviper (135110) | 1 year,4 days | (#44366073)

It's very hard to break away from the lowest-common-denominator. Just look at MP3.

Smartphones didn't have a common standard video format before H.264 came along. H.265 is going to require much more computation, which will be possible in a few years, but I would fully expect H.264 to stay entrenched for a long, long time to come, and H.265 to be an also-ran technology that some high-end phones will start supporting, and us techies will rant about, while nobody else really cares...

Besides that, Google is now the major force in the mobile phone market, and they just released their new VP9 codec that they claim surpasses HEVC/H.265. If they throw their weight around just a little bit, they can mandate all new Android phones must support WebM with VP9 and Opus. And if they go further and threaten to drop H.264 videos from YouTube, I'm sure even Apple won't be able to fight them.

Re:This is great news! (1)

complete loony (663508) | 1 year,3 days | (#44368759)

The biggest compression gains for video involve improving motion compensation, which may raise the decoder's memory requirements, and entropy coding. Entropy coding is implicitly single threaded, and the complexity of this process will impose a lower bound on the clock speed of your decoder, while the rest of the decoding process can be done in parallel with lower powered circuits.

Re:This is great news! (1)

ultranova (717540) | 1 year,3 days | (#44370217)

Entropy coding is implicitly single threaded, and the complexity of this process will impose a lower bound on the clock speed of your decoder. While the rest of the decoding process can be done in parallel with lower powered circuits.

Well, how about SHA-1 encoding? Each frame is an unpacked bitmap image, giving them a fixed size in bits. The encoder generates every frame which hashes to the same SHA-1 as the target, sorts them by their contents, and notes the ordinal of the target. The final packet consists of the SHA-1 and the ordinal, and the stream header contains the framesize. And the decoder simply repeats the process, picking the frame indicated by the SHA-1 and the ordinal.

This scheme should parallelize nicely, and can be further sped up easily by using a rainbow table of all SHA-1 hashes.

Re:This is great news! (3, Insightful)

dj245 (732906) | 1 year,4 days | (#44363213)

Storage isn't a problem, it's the cheapest part of the equation. Energy consumption is the biggest technical challenge due to the global domination of mobile devices and the current limitations in energy storage.

But there are bandwidth limitations on many devices. Limitations which are generally fine now for 1080p, but could be a problem with "something better", or with multiple streams of "something better".

Plus, this article deals with the compression part of the video encoding. Most media is decompressed many many times, but only compressed once. It is reasonable to assume that decompression will be more taxing with x265 compared to x264, but that isn't a part of this article. How much more CPU is required for decompressing x265 compared to x264? That isn't so clear at the moment, and since the code isn't finalized, results today may have no bearing on tomorrow anyhow.

Re:This is great news! (0)

Anonymous Coward | 1 year,4 days | (#44363895)

Poor me, I don't even have a computer that can do high bitrate x264 yet :(

Re:This is great news! (1)

jedidiah (1196) | 1 year,4 days | (#44365957)

You can get one at Frys for $200 now.

Alternatively, you could get yourself a really old and dirt cheap video card to upgrade your current machine.

Re:This is great news! (1)

Nemyst (1383049) | 1 year,4 days | (#44363309)

As others have said, bandwidth is an extremely important element. The thing to remember is that for the specific task of video compression and decompression, hardware vendors aren't shy about integrating specialized hardware that can run the algorithm at a much higher speed and at a much lower energy cost than general purpose processors. While it's too early to say whether H.265 will take more CPU power to compute (this is, after all, an alpha implementation), even if it were, it probably wouldn't matter that much once hardware implementations start to appear.

Re:This is great news! (1)

DarkXale (1771414) | 1 year,4 days | (#44363927)

The power cost for transferring a single bit has not really changed over the years (if anything, it's increased). The power cost of decoding a set of bits, however, constantly decreases.

It's always been more power efficient to employ more complex compression (more CPU work) over transferring more bits. Even for non-specialized CPUs this is true, never mind when you have hardware decoders.

Re:This is great news! (1)

Bengie (1121981) | 1 year,3 days | (#44370621)

Its always been more power efficient to employ more complex compression (more CPU work) over transferring more bits.

I agree. Wireless eats a lot of power, and newer wireless devices shut down and go idle much more quickly, with even quicker-idling wireless in the pipeline.

Re:This is great news! (0)

Anonymous Coward | 1 year,4 days | (#44364701)

There's no free lunch. Both it and VP9 accomplish the greater compression by smacking the hell out of your CPU. You don't want to be running this on your laptop battery or older desktop without support from hardware acceleration. And even then it's going to chew through the power.

Re:This is great news! (2)

evilviper (135110) | 1 year,4 days | (#44365979)

25-35% less file size for the same quality is an incredible advance.

No, it really isn't.

First off, we're just talking about PSNR per bitrate, which is pretty meaningless. They even say in TFA that they could have used the "--psnr" option to increase the PSNR, at the expense of video that looks like crap. At least SSIM would have been less easy to fool than PSNR. At the end of the day, there's nothing better than a subjective expert human visual comparison.
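For reference, PSNR is just a log-scaled mean-squared-error statistic. A minimal sketch (pure Python, treating 8-bit grayscale frames as flat lists of pixel values) makes concrete what those charts measure, and why a metric that only sums squared per-pixel differences can be tuned for while the output looks visually worse:

```python
# Minimal PSNR computation for two 8-bit frames given as flat pixel lists.
# PSNR is blind to WHERE errors land, which is why "--psnr"-style tuning
# can raise the number while hurting perceived quality.
import math

def psnr(ref, test, peak=255):
    """Peak signal-to-noise ratio in dB; higher nominally means closer."""
    assert len(ref) == len(test)
    mse = sum((a - b) ** 2 for a, b in zip(ref, test)) / len(ref)
    if mse == 0:
        return float("inf")          # identical frames
    return 10 * math.log10(peak ** 2 / mse)
```

Two distortions with identical MSE score identical PSNR even if one is invisible film grain and the other is a blocky smear on a face, which is the commenter's point about preferring SSIM or expert visual comparison.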

Secondly, look at a few of those charts. The specific encoder you use makes a HELL of a lot more difference than the codec/format. Other H.264 encoders don't come close to x264. If x265 doesn't get the same kind of open source development boost, x264 will continue to improve, and probably outperform the newer format, as proprietary codec developers just haven't shown themselves willing or able to do a good job of perceptual encoding, yet that's where the bulk of our non-pirated content comes from... And those pirate rips aren't exactly encoded by experts, so they tend to be crap that can be easily surpassed by MPEG-2 encodings.

And finally, Google claims their patent-free VP9 codec, which was recently finalized, will outperform H.265/HEVC. Even if they're over-promising, I'm sure it'll be close enough. And if we get open source developers working on and optimizing VP9/libvpx, while shunning HEVC, then the patent-free codec could fly past the patent-encumbered one easily enough. And for the first time in history, we'll have non-proprietary audio and video codecs that can outperform the expensive and patented kind, laying to rest the contentious HTML5 video debate, back when Theora really didn't have a snowball's chance in hell against H.264.

Let's just hope Google is actually committed this time, and will convert more than 4% of their Youtube catalog to WebM in the next 4 years... This time using the new and improved VP9 and Opus, instead of just using it as a threat to get cheap licensing for the proprietary codecs.

Re:This is great news! (2)

mister2au (1707664) | 1 year,4 days | (#44367799)

The specific encoder you use makes a HELL of a lot more difference than the codec/format. Other H.264 encoders don't come close to x264. If x265 doesn't get the same kind of open source development boost, x264 will continue to improve, and probably outperform the newer format, as proprietary codec developers just haven't shown themselves willing or able to do a good job of perceptual encoding, yet that's where the bulk of our non-pirated content comes from...

Wow ... just wow!

I'll assume by "proprietary codec developers" you actually mean developers who have a closed source implementation of the encoder. In which case, the statement may not be completely unreasonable, i.e. closed-source encoders have not kept pace with this open-source implementation due to less comprehensive (and less costly) implementation of the encoding tools.

The rest is complete rubbish though ...

So what if the bulk of non-pirated content comes from closed source encoders ... what does that have to do with x264 or x265 or H.265 development? Is this just a shot at closed-source software, given you seem to be VERY pro-open source?

The specific encoder you use makes a HELL of a lot more difference than the codec/format. Other H.264 encoders don't come close to x264. If ... blah blah blah .. x264 will continue to improve, and probably outperform the newer format

Do you understand that H.265 is essentially a superset of H.264? You know, as in H.264 + extensions? How is x264 going to out-perform x265 given it is a subset of the latter?

Do you understand that x264 uses a superset of the encoder algorithms implemented by other encoders? So what you attribute is primarily due to codec and then codec parameters and then software optimisations.

You could have said "I hate standards .. I love open source ... I hope H.265 fails" ... much quicker and doesn't need any pretence of understanding video codecs.

Re:This is great news! (1)

evilviper (135110) | 1 year,3 days | (#44368121)

So what if the bulk of non-pirated comes from closed source encoders ... what does that have to do x264 or x265 or H.265 development?

It means the actual delivered content won't be any better than what we've got now. Quite possibly worse. It's quite a common phenomenon when new lossy formats come out. You'd know that if you had a clue about the subject.

Do you understand that H.265 is essentially a superset of H.264? You know, as in H.264 + extensions

You could say that all modern video codecs are essentially supersets of H.261... But that doesn't mean you can bolt on feature XYZ to the previous encoder and get your 50% improvement. Either x265 will get a lot of development effort and will become a very good codec, or it won't, and other formats with more development effort will outperform it. These are fickle issues that are hard to predict. In any case, it's nothing so brainlessly simple as "THIS NEW FORMAT IS X% BETTER THAN THE OLD ONE!!!"

So what you attribute is primarily due to codec and then codec parameters and then software optimisations.

Some ASP codecs outperform some AVC codecs. The newer format doesn't magically offer improvements.

You could have said "I hate standards .. I love open source ... I hope H.265 fails" ... much quicker and doesn't need any pretence of understanding video codecs.

You could have said "I'm a shill for a MPEG-LA patent holder" much quicker than playing dumb and twisting things around to try and make a straw-man.

Re:This is great news! (1)

davester666 (731373) | 1 year,3 days | (#44367967)

Um, the theoretical limit for lossy video compression is 1 bit. You can store any video with just that single bit if you are willing to put up with some rather obvious artifacts.

I guess, technically, it's just a single artifact that you have to put up with.

Reference encoder with some small tweaks (2)

PhrostyMcByte (589271) | 1 year,4 days | (#44362747)

This is basically just the HEVC reference encoder right now. Expect it to be slow and inefficient.

Re:Reference encoder with some small tweaks (5, Informative)

StreamingEagle (1571901) | 1 year,4 days | (#44362983)

The HM reference encoder takes roughly 40 seconds to encode one frame of 1080P video on a dual Xeon (16 core) server. x265 can encode 1080P at roughly 11 frames per second today. The project is still early in development, and there are many features (lookahead, B-frames, rate control, etc) and efficiency/performance optimizations left to be done, but we are making good progress. I would encourage you to try it before reaching any conclusions.

Re:Reference encoder with some small tweaks (0)

Anonymous Coward | 1 year,4 days | (#44363099)

Not sure about HEVC, but with other ITU reference code, you can get an order of magnitude or two improvement simply by stripping out the code that counts all the operations, and wraps all the operations for overflow protection (which is not needed in most cases).

Re:Reference encoder with some small tweaks (2)

timeOday (582209) | 1 year,4 days | (#44363649)

How amenable is H.265 to vector operations? Nobody would expect a 3d game to run at an acceptable rate without a GPU nowadays, and video encoding may fall in the same category. Serial processing hasn't been getting significantly faster for years now.

Re:Reference encoder with some small tweaks (0)

Anonymous Coward | 1 year,4 days | (#44367667)

TFA said that x265 was designed to take advantage of multicore CPUs to encode different sections of the video in parallel. Great for offline transcoding, but doesn't help with realtime. No word on vector processor (e.g. GPU) acceleration. IIRC, past efforts to use common GPUs for video encoding haven't been particularly successful.

Re:Reference encoder with some small tweaks (1)

Bengie (1121981) | 1 year,3 days | (#44370693)

past efforts to use common GPU's for video encoding haven't been particularly successful.

GPUs accelerate ate synchronous parallel code, but video encoding tends to have a mix of non-synchronous and serial code paths, making 2Tflop GPUs barely faster than a 100Gflop CPU.

Re:Reference encoder with some small tweaks (1)

Anon E. Muss (808473) | 1 year,4 days | (#44367725)

Just curious... Have you tested x265 scalability on any big iron? (e.g. eight socket Xeon server)

no patent clarification yet, though (4, Informative)

Trepidity (597) | 1 year,4 days | (#44362773)

Not even just that it's almost certainly covered by a pile of patents, but unlike H.264, there isn't any clarity yet about which ones, and what the licensing terms will be like. Will the categories of royalty-free use granted to H.264 codecs also be applied to H.265? Nobody seems to know. MPEG-LA hasn't issued an update since June 2012 [mpegla.com] , at which point they were still at the stage of calling for patent-holders to submit claims.

Re:no patent clarification yet, though (1)

StreamingEagle (1571901) | 1 year,4 days | (#44363033)

This is a software implementation. Commercial companies must license necessary patents separately, just as they do when they implement MPEG-2, H.264, etc.

Re:no patent clarification yet, though (1)

Trepidity (597) | 1 year,4 days | (#44363287)

At the moment, even noncommercial users who download the software and use it to encode videos are in a murky situation, since patents don't by default have an exception for noncommercial use; simply encoding personal videos via a patented method constitutes "practicing" the invention. The H.264 license specifically gives a royalty-free patent grant for noncommercial use (as well as a few other types of use), which clears up that case. What would be helpful is if MPEG-LA came out and said whether it plans to do that with H.265 also.

Re:no patent clarification yet, though (1)

Kjella (173770) | 1 year,4 days | (#44363673)

What would be helpful is if MPEG-LA came out and said whether it plans to do that with H.265 also.

Don't forget that MPEG LA isn't the one holding the patents, it's just a mouthpiece for all the companies involved that deals with collecting and distributing royalty fees. Until they've got deals signed with every patent holder, they can't really say anything on behalf of all of them. That said, if they want any H.265 adoption to take place the terms should probably not be significantly worse than H.264...

Re:no patent clarification yet, though (1)

tlhIngan (30335) | 1 year,4 days | (#44364353)

This is a software implementation. Commercial companies must license necessary patents separately, just as they do when they implement MPEG-2, H.264, etc.

No, companies doing MPEG-2, h.264, etc just buy a blanket patent license from MPEG-LA - a set fee schedule basically means you pay like 25 cents per device and you've licensed all the patents.

Problem is, h.265's patent pool is still being created and rates negotiated (and how much out of that fee everyone gets).

So it's similar to implementing say a cellphone, where you must strike separate deals with everyone in order to license all the patents. But in a year or two, it'll probably be a patent pool which makes life easier.

Of course, it also shows how much marketing (yes, marketing) needs to go into VP9 - h.264 is too established for VP8 to be of any threat. But the next-gen codec field is wide open - will it be h.265? Or VP9? Or something else? (All it would take is Google to heavily promote and advertise VP9, and grease enough people so that MPEG-5 can include VP9 as a codec in part of the standard).

And yes, the opportunity to strike is NOW.

Re:no patent clarification yet, though (1)

evilviper (135110) | 1 year,4 days | (#44366301)

Of course, it also shows how much marketing (yes, marketing) needs to go into VP9 - h.264 is too established for VP8 to be of any threat. But the next-gen codec field is wide open - will it be h.265? Or VP9? Or something else? (All it would take is Google to heavily promote and advertise VP9, and grease enough people so that MPEG-5 can include VP9 as a codec in part of the standard).

Google doesn't need the MPEG to get VP9 into widespread use. They've got their video and audio codecs fully developed (VP9/Opus), a container format (MKV/WebM), encoding and decoding software, client platforms they control (Android, Chrome, Chromium, Firefox, etc.), and the biggest internet video service (YouTube).

If they bite the bullet, and require Android phones to include VP9/Opus support, then convert YouTube to VP9/Opus and drop Flash and H.264, their format will own the internet in no time at all. Even Apple and Microsoft (staunch defender of H.264 patent-holders) will have no choice but to support the format on all their devices, lest they take away their user's YouTube viewing privileges, and face incredible backlash.


Re:no patent clarification yet, though (0)

Anonymous Coward | 1 year,3 days | (#44368535)

If they bite the bullet, and require Android phones to include VP9/Opus support, then convert YouTube to VP9/Opus and drop Flash and H.264, their format will own the internet in no time at all. Even Apple and Microsoft (staunch defender of H.264 patent-holders) will have no choice but to support the format on all their devices, lest they take away their user's YouTube viewing privileges, and face incredible backlash.

Requiring support in Android is a good plan; dropping H.264 is dumb. You overestimate Youtube. Youtube is widely "hated" by its users, who keep coming back for more (in the same way that people hate Windows but refuse to use anything else), so it's Google that would have to deal with the backlash here.

A better plan is to just not offer HD in H.264: you get 320p unless you have VP9, which serves the 1080p. Google really needs to push hard on this though: require that new Android devices support it, get it into Chrome and Firefox, and get a plugin for IE if possible. The transition period between H.264 and H.265 will be rather short (hardware design cycles take some time for the chips to be designed and manufactured; by the time the first H.265 devices are on the market it will be too late, as a wall of cheap decoder chips will be flooding supplier inventories).
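The gating idea in that comment reduces to a trivial server-side lookup. The codec names and the rendition table below are hypothetical, purely to show the shape of the policy:

```python
# Hypothetical sketch of "no HD unless you support VP9": the server keeps
# a rendition table per codec and picks based on what the client reports.
VARIANTS = {
    "h264": ["320p"],                   # legacy clients: SD only
    "vp9":  ["320p", "720p", "1080p"],  # VP9-capable clients get HD
}

def available_renditions(client_codecs):
    """client_codecs: set of codec names the player advertises."""
    if "vp9" in client_codecs:
        return VARIANTS["vp9"]
    return VARIANTS["h264"]
```

The carrot-and-stick here is that H.264 playback never breaks outright, but HD becomes the reward for shipping VP9 support.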

Decode performance (2)

Anonymous Coward | 1 year,4 days | (#44362789)

Has anyone found any information on decode performance? Apparently "H.265 is known for taking more horsepower to encode and decode than H.264" but I only see numbers provided for encode performance. It seems like decode performance is more important in most cases.

Re:Decode performance (1)

khelms (772692) | 1 year,4 days | (#44363489)

Why did the parent get voted down? This seems like a reasonable question to me. I sometimes have trouble getting smooth playback of H264 on my Linux systems due to the GPU driver not being as advanced as the Windows version. It sounds like H265 will require even more powerful graphics processing. Will I need a newer, more powerful video card? Will I be able to play this format on Linux at all, or will I have to wait a couple of years till the driver support for hardware acceleration catches up?

Re:Decode performance (2)

StreamingEagle (1571901) | 1 year,4 days | (#44363603)

x265 is an HEVC encoder implementation only. To answer the general question, H.265 is more compute intensive to decode than H.264, but the compute requirements for HD decoding are not unreasonable. Software decoding of HD HEVC is possible on a dual-core ARM system, and x86 systems will not have a problem. Of course, as with any video codec, hardware manufacturers will implement hardware decoders. Some have been announced. Expect more announcements in the coming months.

Re:Decode performance (1)

mister2au (1707664) | 1 year,3 days | (#44368067)

About 25-50% higher according to most estimates. There aren't any optimised decoders to really compare real-world performance though.

A shame that they/he 'stole' the x265 name,.. (1)

Selur (2745445) | 1 year,4 days | (#44362861)

even though none of the main x264 developers is involved with the project. -> lame marketing stunt

Re:A shame that they/he 'stole' the x265 name,.. (1)

StreamingEagle (1571901) | 1 year,4 days | (#44362953)

Not true.

Re:A shame that they/he 'stole' the x265 name,.. (1)

Selur (2745445) | 1 year,4 days | (#44363509)

What? The author committed some patches to x264 quite a while back; the author is not a main contributor to x264; and the x264 team is not involved in x265 development.

Re:A shame that they/he 'stole' the x265 name,.. (5, Informative)

StreamingEagle (1571901) | 1 year,4 days | (#44363665)

This project is not a surprise to any of the x264 developers - we have been in discussions with them for many weeks, and we have an agreement which allows us to utilize x264 code in x265. The x264 developers haven't had a chance to make contributions yet, as we just opened the project up to participation by the open source development community. We welcome their participation, and will do everything we can to enable and encourage it.

Re:A shame that they/he 'stole' the x265 name,.. (5, Insightful)

TopSpin (753) | 1 year,4 days | (#44364203)

we have an agreement which allows us to utilize x264 code in x265

You don't need an 'agreement' to use x264 code because x264 is licensed [videolan.org] under the terms of the GNU GPL v2.0. What, exactly, is this agreement supposed to permit?

Dual licensing (4, Informative)

tepples (727027) | 1 year,4 days | (#44365401)

What, exactly, is this agreement [to use x264 code in x265] supposed to permit?

Dual licensing permits the x264 maintainer to dual license the x264 code to clients unwilling to accept the GPL. The agreement permits the x265 maintainer to do the same with pieces of x265 that were borrowed from x264.

Re:Dual licensing (1)

jedidiah (1196) | 1 year,4 days | (#44365987)

When the likes of Sony seem perfectly willing to use Free Software, you really have to wonder if this is a real problem or just nonsense perpetrated by a vanishingly small but highly noisy minority.

Copyleft vs. permissive licensing (1)

tepples (727027) | 1 year,4 days | (#44366081)

When the likes of Sony seem perfectly willing to use Free Software

Sony uses permissively licensed free software, such as *BSD operating systems. The X-series MPEG-4 video encoders (Xvid, x264, and x265), on the other hand, are copylefted. Console makers in general ban copylefted software on their platforms [slashdot.org] because allowing game publishers to distribute "Installation Information" (GPLv3) or "scripts to control [...] installation" (GPLv2) would allow other publishers to circumvent licensing.

Mod parent up (1)

amaurea (2900163) | 1 year,4 days | (#44363587)

I'm also annoyed about the name, and we may now end up in a confusing situation with two independent x265 programs if the x264 developers eventually start working on a h265 encoder too.

Re:Mod parent up (0)

Anonymous Coward | 1 year,4 days | (#44363905)

I'm also annoyed about the name, and we may now end up in a confusing situation with two independent x265 programs if the x264 developers eventually start working on a h265 encoder too.

Or they just merge the two projects together.

Re:A shame that they/he 'stole' the x265 name,.. (0)

Anonymous Coward | 1 year,4 days | (#44364055)

That's hypocrisy to the extreme. Where did you think the x264 name came from, Einstein?

x265, x264, Xvid, DivX, DIVX (1)

tepples (727027) | 1 year,4 days | (#44365427)

Just a guess, but the "x" in x264 comes from it being the H.264 counterpart to Xvid, which is DivX spelled backward, and DivX is named after an experimental time-limited counterpart to DVD-Video that Circuit City was pushing.

Re:x265, x264, Xvid, DivX, DIVX (0)

Anonymous Coward | 1 year,4 days | (#44367373)

Divx was a video codec back in the day (2000-ish)... it sold out (to CC, as you noted) and someone forked it to make xvid. Re the x264 vs h264... h264 is the video standard. x264 is the implementation of an encoder for that standard.

Re:x265, x264, Xvid, DivX, DIVX (1)

GuB-42 (2483988) | 1 year,3 days | (#44372451)

In fact there are 3 DivXes:
- "DIVX": the time-limited DVD alternative that never took off; not a video codec.
- "DivX ;-)": a hacked version of Microsoft's MPEG4-ASP codec that works with AVIs
- "DivX": a completely rewritten and proprietary MPEG4-ASP codec that is compatible with "DivX ;-)", with many improvements
- "XviD": an open-source alternative to "DivX", also written from scratch

"lacks B-frame support" (0)

Anonymous Coward | 1 year,4 days | (#44362923)

That's like half of the job.

Re:"lacks B-frame support" (1)

Selur (2745445) | 1 year,4 days | (#44363387)

according to https://bitbucket.org/multicoreware/x265/wiki/Home [bitbucket.org] there should be some sort of b-frames support: "With -b 1 you activate a hack which enables a "random access" GOP structure which uses B frames."

Re:"lacks B-frame support" (2)

maxwell demon (590494) | 1 year,4 days | (#44363641)

"With -b 1 you activate a hack which enables a "random access" GOP structure which uses B frames."

Will they also have a version with Democrat structure?

Re:"lacks B-frame support" (0)

Anonymous Coward | 1 year,4 days | (#44363825)

"With -b 1 you activate a hack which enables a "random access" GOP structure which uses B frames."

Will they also have a version with Democrat structure?

Since "-b 1" or "be one" is not consistent with the Democrat structure, the option to use the Democrat structure will be instead a collection of several hundred conflicting options that appear to yield more choice and an improvement, but instead will function identically with the "-b 1" option.

Re:"lacks B-frame support" (1)

Anonymous Coward | 1 year,4 days | (#44364301)

Will they also have a version with Democrat structure?

That version will be horribly inefficient, since the GOP version only cares about 1% of the CPU.

Companies (0)

Anonymous Coward | 1 year,4 days | (#44363785)

Developer: So we've finally made our h264 mp4 HTML5 applets and site work for every device out there using customized and conditioned JavaScript.
MulticoreWare: h265 soon out!
Developer: Awwww crap =(

If you already support WebM and MP4 (1)

tepples (727027) | 1 year,4 days | (#44365443)

Ideally, you added multi-format support to your HTML5 applets when VP8 came out, and plugging in support for VP9 and H.265 won't be too hard.

NOT INTERESTED! (0)

Anonymous Coward | 1 year,4 days | (#44364311)

Sorry, but I have no interest. Until I can avoid the MPEG-LA and subsequent media cartels in every way, shape and form, from hardware to production, and commercial distribution by my own terms, I want nothing to do with it.

For any of you asking why? Go read up on WHO you have to pay if you want to sell a video you've created to distribute commercially.

Mobile video the key (1)

TheSync (5291) | 1 year,4 days | (#44364465)

The question is whether HEVC will be of use on mobile devices - that is where an increasing amount of video viewing is being done, and the area where bandwidth is most in demand.

Existing smartphones have hardware support for H.264 - it is unclear how soon they will have hardware support for HEVC, which of course is even more computationally intensive.

Dedicated AVC hardware or programmable DSP? (1)

tepples (727027) | 1 year,4 days | (#44365505)

Does "hardware support for H.264" refer to dedicated silicon that does nothing but AVC decoding, or does it refer to implementing AVC's inner loops in a programmable digital signal processor such as a GPU? If the latter, it may be reprogrammable for HEVC. In fact, with HEVC increasing opportunities for parallelism, it might be easier to bring the GPU in.

Re:Mobile video the key (1)

mister2au (1707664) | 1 year,3 days | (#44368063)

Happy to be proven wrong, but I'd imagine fairly soon.

H.265 is a superset of H.264, so much of the same hardware could probably be used. As for complexity, it seems H.265 is around 25-50% more complex for DECODING so certainly not unreasonable for current mobile devices.

The current Samsung Galaxy S4 advertises both HEVC support and Full HD (1080p) playback - not sure if it offers both together, however.

What about VP9? (0)

Anonymous Coward | 1 year,4 days | (#44364489)

Seriously... anyone else hoping it takes over this time around? License/royalty free and *supposedly* within 1% of the quality for filesize? I hope my next WDTV supports it. ... guess this means I'll have to re-acquire my tv collection.

And time for Google to interrupt...... (0)

Anonymous Coward | 1 year,4 days | (#44364491)

And its now about time for Google to jump in and try and push their crap again!!

h264 good enough? (1, Interesting)

Blaskowicz (634489) | 1 year,4 days | (#44364913)

The thing is, h264 is maybe too entrenched: it took many years to get many millions of devices supporting it, and that gives you a big install base that people don't necessarily want to replace (not everyone is the middle-class American with a "drawer full of smartphones"). And even then, many old PCs aren't quite up to the task yet - the kind that barely manage YouTube 480p or even 360p.

MP3 files are still commonly used, even though they're clearly inferior and switching to AAC or OGG is a comparatively much simpler problem. I still even watch a lot of Xvid. I can imagine the market will stick with h264 for a long time, like Windows XP has stuck around and still does. Even if h265 is adopted, content providers and hosts will have to dual-encode everything to both h264 and h265, increasing storage and encoding hardware costs.

Re:h264 good enough? (0)

Anonymous Coward | 1 year,4 days | (#44365211)

You must be new here. Technology evolves.

Re:h264 good enough? (0)

Anonymous Coward | 1 year,4 days | (#44365397)

h265 will gain traction as 4K displays start showing up in more and more average consumer households.

There will be a natural financial incentive for e.g. Netflix to sell "premium" 4K streams, and for the movie biz. to launch yet another disc format (I think it will be successful - there are a heck of a lot more households that can afford large-screen TVs and blu-ray players than there are households with access to 20 MBit+ broadband).

Next optical disc format (1)

tepples (727027) | 1 year,4 days | (#44365549)

The thing is h264 is maybe too entrenched, it took many years to have many millions of devices supporting it

The same was true of MPEG-2 hardware (DVD players) and MPEG-4 ASP hardware (DVD players with DivX). It took Blu-ray and smartphones to get AVC hardware into users' hands. People replace smartphones regularly. And just as Blu-ray Disc with AVC replaced DVD-Video with MPEG-2, whatever optical disc format replaces Blu-ray Disc is likely to include HEVC as an option for 4K video.

Re:Next optical disc format (1)

jedidiah (1196) | 1 year,4 days | (#44366061)

We don't even seem to have a standard media format for 4K video yet. While 25% improvement is nice, it really isn't that spectacular when you consider the likely escalation of stream size.

25% smaller versus 400% larger.

Re:Next optical disc format (1)

mister2au (1707664) | 1 year,3 days | (#44367947)

Agreed, but 4x the pixels does not equate to 4x the bitrate for a given codec - typically more like 2x.

So maybe something like:

4K H.265 = 75% × 200% × 1080p H.264 = 150% of 1080p H.264

So not as bad as you'd think, and a lot of current-format stuff is still MPEG-2 as well, so there's even more scope for improvement!
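The parent's estimate can be sketched as a quick back-of-the-envelope calculation (the 75% and 2x figures are the commenter's rough estimates, not measurements, and the 8000 kbps starting point is a made-up example):

```python
# Rough bitrate estimate for 4K HEVC vs. 1080p H.264.
# Assumptions (from the comment above, not measured):
#   - HEVC needs ~75% of the H.264 bitrate for the same quality
#   - 4x the pixels (1080p -> 4K) needs only ~2x the bitrate,
#     since compression efficiency improves with resolution

h264_1080p_kbps = 8000           # hypothetical 1080p H.264 stream

hevc_efficiency = 0.75           # HEVC bitrate relative to H.264
resolution_scaling = 2.0         # 4x pixels ~= 2x bitrate

hevc_4k_kbps = h264_1080p_kbps * hevc_efficiency * resolution_scaling
print(hevc_4k_kbps)              # 12000.0 -> 150% of the 1080p H.264 rate
```

So a 4K HEVC stream would need roughly 1.5x the bitrate of today's 1080p H.264 stream, rather than 4x.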

Re:Next optical disc format (1)

evilviper (135110) | 1 year,4 days | (#44366241)

And just as Blu-ray Disc with AVC replaced DVD-Video with MPEG-2, whatever optical disc format replaces Blu-ray Disc is likely to include HEVC as an option for 4K video.

Blu-ray hasn't exactly replaced DVDs... so it's probably a better analogy than you think. We're still stuck with MPEG-2 all over the place.

And I don't expect to see anything better than Blu-ray for another generation. Blu-ray only has a market because the FCC forced the HDTV upgrade, creating a substantial market for 1080i TVs, then early adopters driving the price down for the rest of the market.

The electronics industry DESPERATELY wants to push for everyone to buy a new, more expensive TV once again, but their efforts are falling flat. 3D TVs, 4K TVs, etc. It's foolish and pointless, and it won't happen until the FCC mandates a newer, incompatible, higher resolution video format, which just isn't going to happen for another 50 years, like NTSC TV broadcasts before it. At that point, yeah, the latest video codecs will get used. Until then, there's no compelling reason to spend that much money for a minor upgrade.

Re:Next optical disc format (1)

toejam13 (958243) | 1 year,4 days | (#44367467)

It's foolish and pointless, and it won't happen until the FCC mandates a newer, incompatible, higher resolution video format, which just isn't going to happen for another 50 years, like NTSC TV broadcasts before it.

The Advanced Television Systems Committee (ATSC) added support for H.264 compression and 1080p resolutions to the A/72 standard in 2008. My understanding is that FCC approval is pending.

The ATSC is also accepting proposals for a new 2160p broadcast standard that will not be backwards compatible with the current standard. I would expect that standard to be in place for terrestrial broadcasts within the decade. Cable operators will probably adopt it much sooner.

50 years? Technology doesn't move that slow anymore.

Re:Next optical disc format (1)

evilviper (135110) | 1 year,3 days | (#44368007)

The Advanced Television Systems Committee (ATSC) added support for H.264 compression and 1080p resolutions to the A/72 standard in 2008. My understanding is that FCC approval is pending.

ATSC writes all kinds of crazy standards that will NEVER be used. ATSC for satellite, cable, mobile, and handheld? Nobody uses it. Cable is QAM and satellite is universally DVB-S or DVB-S2. M/H is a joke that everyone just ignores.

The FCC will never approve any incompatible changes to OTA broadcast TV for decades to come. The installed base of hundreds of millions of TVs would be far too costly to replace.

The conversion to HDTV took close to two decades, several acts of congress, massive spending from broadcasters and consumers, and more. And it only went through because after half a century, our broadcast TV standards were woefully out of date and it was apparent that improvements were needed. It's simply not going to happen again in the foreseeable future, and it's ludicrous to claim otherwise.

50 years? Technology doesn't move that slow anymore.

"Technology" never moved that slow, but broadcast standards certainly did, and still do. Feel free to buy your 4K 3-D TV, but good luck finding content for it, as nobody else will, just as they didn't choose highdef or flat-screen displays until the FCC mandated it as the future of broadcast TV.

Re:Next optical disc format (1)

Blaskowicz (634489) | 1 year,3 days | (#44368449)

A nitpick: I think that any optical format that replaces Blu-ray disc would still be a Blu-ray disc, such as a 100GB triple-layer version with a higher supported sustained reading speed. That will piss off consumers, though; maybe you need a scheme (possibly with four or five layers) where the first layer or two can be read by a regular Blu-ray player and contains a usual h264 movie, with the h265 version on the bottom layers... Is that possible at all?

Re:Next optical disc format (1)

jedidiah (1196) | 1 year,3 days | (#44370227)

The same is still true of MPEG2 hardware.

Every household in America has at least one expensive MPEG2 decoder (if not many) that likely won't be replaced until the issue is forced.

Other non-h264 devices continue to linger in the marketplace as they continue to function and are able to generate new content. This tends to create conflict with crippled devices that are limited to a particular subset of h264.

Even these force-fed h264 devices come into conflict with other h264 content that assumes more robust h264 decoding is available. The format is so complex that you would be hard-pressed to create any "easy" UI to address it all.

So even pure h264 by itself is a bit of a quagmire. This is an issue that certain people seem to go out of their way to ignore or deny.

I hope they consider Opus for audio (1)

Khopesh (112447) | 1 year,4 days | (#44365543)

Ogg Opus [wikipedia.org] (open, royalty-free, not patent-encumbered audio) beats the pants off of HE-AAC [wikipedia.org] (which, in turn, is superior to everything else at pretty much every level). Opus also streams better, capable of dealing with extreme low-latency demands associated with real-time uses like VoIP.

It is so common to see people talking about tweaking x264 to improve quality and compression, but there is a point where you're better off optimizing the other pieces; AC3 passthru is laughable contrasted to 6-channel vorbis [wikipedia.org] (which I use in place of HE-AAC due to not having access to a quality AAC encoder on Linux). I'm still waiting for opus support in matroska [wikipedia.org] (which is in progress [xiph.org] ) or something to supplant matroska as the prevailing file container.

There's also the patent question; will this be intentionally patent-encumbered the way MPEG standards tend to go (in which case they'll certainly connect it to HE-AAC), or will this be a somewhat more open standard (which lends nicely to Opus)?

Re:I hope they consider Opus for audio (3, Informative)

evilviper (135110) | 1 year,4 days | (#44366167)

Ogg Opus (open, royalty-free, not patent-encumbered audio) beats the pants off of HE-AAC (which, in turn, is superior to everything else at pretty much every level).

Wow! So much wrong in just a single sentence...

Opus is an IETF developed codec, based on CELT from Xiph.org, and Silk from Skype/Microsoft.

HE-AAC certainly isn't "superior" at "every level". It excels at very low bitrate encoding that sounds SOMEWHAT like the original. As you start increasing the bitrate (e.g. 96k), low-complexity AAC easily surpasses HE-AAC. And as you go to higher bitrates still (e.g. 160k), temporal-domain codecs can outperform any frequency-domain codecs, so Musepack will beat the pants off AAC, and even Opus.

Still, low bitrate lossy audio quality is important, so Opus is a good choice for streaming audio and video. That's why Google chose it for their latest revision of WebM, along with their new VP9 codec that they claim outperforms HEVC.

I seriously doubt the MPEG / MPEG-LA organizations, and their members, will consider using a patent-free audio codec along with their heavily patent-encumbered video codec. Their business model is patents, and they'll chose an expensive and inferior option over a free one, any day. I'd expect HE-AACv2 to be the best you can count on for the foreseeable future.

Re:I hope they consider Opus for audio (1)

Khopesh (112447) | 1 year,4 days | (#44367011)

Ogg Opus (open, royalty-free, not patent-encumbered audio) beats the pants off of HE-AAC (which, in turn, is superior to everything else at pretty much every level).

Wow! So much wrong in just a single sentence...

Opus is an IETF developed codec, based on CELT from Xiph.org, and Silk from Skype/Microsoft.

Ah, I thought Opus was under Xiph's Ogg umbrella. Xiph certainly hosts the new CELT+Silk project called Opus. Ogg is (afaict) the only container used by Opus (.opus is an Ogg file container). Noting those things, if you can refer to Vorbis as "Ogg Vorbis" then why can't you refer to Opus as "Ogg Opus?" I knew this was informal (though it is somewhat common) and did it mostly to concisely demonstrate its connections to Xiph (which most people know as "the ogg [vorbis] guys").

HE-AAC certainly isn't "superior" at "every level". It excels at very low bitrate encoding that sounds SOMEWHAT like the original. As you start increasing the bitrate (eg 96k), low-complexity AAC easily surpasses HE-AAC. And as you go to higher bitrates still (eg. 160k), temporal domain codecs can outperform any frequency-domain codecs, so Musepack will beat the pants of AAC, and even Opus.

You again caught me generalizing. Perhaps I was mistaken, but I thought AAC encoders capable of HE-AAC would automatically select which AAC codec to use based on the compression level and this is what I was referring to. I also should have said "roughly equal or better than" rather than "superior to" in that statement.

Regarding Musepack (MPC), that is an obscure format that was generally on par with Vorbis back in the day (but Vorbis has been improved a few times since then while Musepack has not). I'm gauging most of the surviving "next-gen mp3" codecs as largely equivalent at the 128+kbps level these days (though my personal preference, biased in part towards Freedom (and Linux compatibility), is for vorbis).

We don't seem to be getting quality listening tests for higher bitrates any more. Everybody is obsessed with super-low bitrates (so much so that they don't even include an mp3 baseline against which to gauge rough equivalence). I found a 64 kbps comparison [xiph.org] which shows Opus narrowly beating Apple's HE-AAC for the lead. I'd love to see a thorough and up-to-date comparison of the major contenders at the 96, 128, and 160 kbps levels that also includes the lossless version as a baseline.

Still, low bitrate lossy audio quality is important, so Opus is a good choice for streaming audio and video. That's why Google chose it for their latest revision of WebM, along with their new VP9 codec that they claim outperforms HEVC.

That's odd, since WebM uses Matroska, which doesn't yet support Opus (though that's in the works). We'll see if Google can successfully make MPEG-LA obsolete. For that, we'd also have to compare VP9 to H.265 (Google noted VP9 was ~7% behind H.265 in Q4 2011), or perhaps wait for Daala [xiph.org] or some Dirac [wikipedia.org] -like wavelet codec to come out of left field (though wavelet compression has large hurdles [multimedia.cx], it is closer to how our eyes actually see).

I seriously doubt the MPEG / MPEG-LA organizations, and their members, will consider using a patent-free audio codec along with their heavily patent-encumbered video codec. Their business model is patents, and they'll chose an expensive and inferior option over a free one, any day. I'd expect HE-AACv2 to be the best you can count on for the foreseeable future.

Agreed. Hopefully that won't stop Xiph and Google from stealing the show.

Re:I hope they consider Opus for audio (1)

evilviper (135110) | 1 year,4 days | (#44367295)

Regarding Musepack (MPC), that is an obscure format that was generally on par with Vorbis back in the day (but Vorbis has been improved a few times since then while Musepack has not).

I'll re-iterate my first comment:

"temporal domain codecs can outperform any frequency-domain codecs" (at high bitrates)

Vorbis is a frequency-domain codec. Musepack necessarily beats it at high bitrates, every time. Don't like Musepack? Okay. Pick another temporal-domain codec, and it'll beat Vorbis, Opus, AAC, and any other frequency-domain codec you can come up with. It's not a debate; no listening tests are required. It's provable. I believe the oldest one still around is MPEG-1 Layer II, which again will always necessarily beat AAC, Opus, Vorbis, etc., at high bitrates.

https://en.wikipedia.org/wiki/MPEG-1#Quality [wikipedia.org]

We don't seem to be getting quality listening tests for higher bitrates any more. Everybody is obsessed with super-low bitrate

That's because the earliest standard codecs like MP2 came extremely close to the theoretical limits of Perceptual Entropy. Musepack did even better, and is probably right at the limit. You can't possibly do any better. The only place there is much room for improvement is in non-transparent audio, ie. low bitrates that don't need to sound perfectly like the original.

(Google noted VP9 was ~7% behind h.265 in Q4 2011)

They've more recently claimed VP9 beats HEVC across the board: "we've produced a codec that shows video quality that is slightly better than HEVC (H.265)"

http://blog.webmproject.org/2013/07/vp9-lands-in-chrome-dev-channel.html [webmproject.org]

or perhaps wait for Daala or some Dirac-like wavelet codec to come out of left field

Xiph.org has a shoddy history with video codecs. They royally screwed up with VP3. When released as open source, it outperformed all the standard codecs available at the time, but they sat on it for a decade, and finally released something that wasn't any better than VP3 long after H.264 had gained a solid grip and could easily outperform it. VP9 has some hope of getting somewhere, in large part because Xiph doesn't have their hands in it.

At this point I'd throw wavelets in the bucket with fractal compression, and all the other theoretical methods that never worked-out in reality. Things we later find out weren't really efficient or perhaps even theoretically possible.

Re:I hope they consider Opus for audio (0)

Anonymous Coward | 1 year,3 days | (#44368795)

> VP9 has some hope of getting somewhere, in large part because Xiph
> doesn't have their hands in it.

fuck off useless troll

Advantage Over VP9? (1)

theweatherelectric (2007596) | 1 year,4 days | (#44366667)

VP9 produces video about the same size and quality as H.265 (Google I/O talk on VP9 [youtube.com] , though they of course weren't using x265 to compare), VP9 support is already in Chrome [webmproject.org] (with Firefox and Opera likely to follow soon) and the reference VP9 implementation is BSD-licensed. What's the advantage of H.265 over VP9 and what does x265 in particular offer over this new version of WebM (VP9+Opus)?

Re:Advantage Over VP9? (0)

Anonymous Coward | 1 year,3 days | (#44368001)

The biggest advantage is it isn't a google product so it won't suck ass.

Re:Advantage Over VP9? (0)

Anonymous Coward | 1 year,3 days | (#44368429)

I wasn't particularly impressed by VP9. To me, it didn't look significantly better than H.264.
