Firefox 33 Integrates Cisco's OpenH264

Unknown Lamer posted about 3 months ago | from the monty-does-it-better dept.

NotInHere (3654617) writes As promised, version 33 of the Firefox browser will fetch the OpenH264 module from Cisco, which enables Firefox to decode and encode H.264 video, both for the <video> tag and for WebRTC, where the codec choice has been the subject of a long-running codec war. The module won't be a traditional NPAPI plugin, but a so-called Gecko Media Plugin (GMP), Mozilla's answer to the disliked Pepper API. Firefox had no cross-platform support for H.264 before. Note that only the particular copy of the implementation built and blessed by Cisco is licensed to use the H.264 patents.

Trusting a binary from Cisco (2, Interesting)

Anonymous Coward | about 3 months ago | (#47514955)

Even though the codec source code is available, it is compiled by Cisco and provided to Mozilla. Something in me doesn't 100% trust that Cisco won't use this as an opportunity to put hidden spyware on everyone's computers. The US gov't can force American companies to secretly implement spyware, right?

Re:Trusting a binary from Cisco (5, Insightful)

ledow (319597) | about 3 months ago | (#47514971)

But with access to the source code, it's easily possible to verify that the binary supplied corresponds to the source.

That's how we know that TrueCrypt has no "binary" backdoors - we just try different combinations of compiling, noting the differences, until we find the one that Cisco used. If we never find the exact combination, the differences between a "known good" compile of the original source and the final binary make the amount of code to blind-check almost negligible in comparison.

It's when people DON'T provide source that you should be suspicious, or when building their source gets you nowhere close to their binary.
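As a rough sketch of the kind of spot check being described (the file names are hypothetical, and it assumes you have already rebuilt the module yourself from the published source with a matching toolchain), a hash comparison in Python is about this much work:

    import hashlib

    def sha256(path):
        """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Hypothetical paths: the module you built yourself vs. the one Cisco ships.
    ours = sha256("openh264-local-build.so")
    theirs = sha256("openh264-from-cisco.so")

    if ours == theirs:
        print("bit-for-bit identical build")
    else:
        print("builds differ; inspect the differences before trusting the blob")

If the hashes match you are done; if they don't, you are into the reproducible-builds territory discussed further down the thread.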

Re:Trusting a binary from Cisco (1)

Anonymous Coward | about 3 months ago | (#47515025)

I wonder when we'll see the results of this test on Slashdot, and if it will be re-tested whenever they push out an auto-update. It might be good for a year or two until they've earned our trust, and then they spike it.

Re:Trusting a binary from Cisco (0)

Anonymous Coward | about 3 months ago | (#47515105)

It's when people DON'T provide source that you should be suspicious, or when building their source gets you nowhere close to their binary.

Reading does not mean understanding [xcott.com]. It takes more effort to hide malice in an open source project, but it is still very possible. It becomes geometrically easier to hide hostile behavior as the size of the source code increases, or even if there is simply no planning to the structure of the system (like OpenSSL).

Re:Trusting a binary from Cisco (3, Insightful)

wonkey_monkey (2592601) | about 3 months ago | (#47515189)

But with access to the source code, it's easily possible to verify that the binary supplied corresponds to the source.

Is it that easy? My understanding was that you'd at least have to have identical versions of the compilation tools to have any hope of coming close to a bit-for-bit match on the binary.

Re:Trusting a binary from Cisco (4, Insightful)

Kardos (1348077) | about 3 months ago | (#47515263)

Seems like a problem with a simple solution: Cisco needs to publish their build procedure.

Re:Trusting a binary from Cisco (1)

tokizr (1984172) | about 3 months ago | (#47515273)

I don't know exactly how this verification is usually done, but I would assume it would involve a more relaxed search, checking for instance that the same system calls which are implicit in the source are in the provided binary (and only those), or that the symbol table matches what is expected, and not a direct byte-to-byte comparison.

But I could be wrong so I hope someone else with experience in this area can enlighten us further.

Re:Trusting a binary from Cisco (5, Informative)

Wrath0fb0b (302444) | about 3 months ago | (#47515705)

No. In fact it's absurdly difficult to reliably create reproducible builds [debian.org]. Debian has been working on this since at least 2009 (afaict) and has been plowing through issues, but you still can't get an identical kernel [debian.org] as the .deb. Heck, it was 8 weeks just for the Tor browser [debian.org].

It's not just the compilation tools, it's the entire build environment that needs to be homogenized. All kinds of components will insert uname/hostname and paths into the binary, filesystems list the contents of a directory in undefined order, timestamps and permissions are embedded into tarballs and documentation, and a different locale produces other weirdness.

tl;dr: it's much harder than just installing an identical version of clang and hitting make.

[ And, as an aside, this goes back decades. The infrastructure around builds was never designed with reproducibility as a design goal. We are basically retrofitting this new requirement on decades of legacy code that never even considered that we would want such a thing ... ]
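To make one of those nondeterminism sources concrete, here is a minimal Python sketch (directory and file names are hypothetical) of normalizing just the tarball step: pinning timestamps, owners and listing order so they stop leaking into the artifact. A real reproducible build has to do the equivalent for every layer of the toolchain, which is why it takes years.

    import os
    import tarfile

    def deterministic_tar(src_dir, out_path):
        """Pack src_dir so mtimes, uids and listing order don't vary per machine.
        Plain tar only: a gzip layer would embed its own timestamp on top."""
        def scrub(info):
            info.mtime = 0                  # drop build-time timestamps
            info.uid = info.gid = 0         # drop local user/group ids
            info.uname = info.gname = ""
            return info

        with tarfile.open(out_path, "w") as tar:
            for root, dirs, files in os.walk(src_dir):
                dirs.sort()                 # directory listing order is undefined; pin it
                for name in sorted(files):
                    path = os.path.join(root, name)
                    tar.add(path, arcname=os.path.relpath(path, src_dir),
                            recursive=False, filter=scrub)

    deterministic_tar("build-output", "release.tar")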

Re:Trusting a binary from Cisco (1)

vux984 (928602) | about 3 months ago | (#47516331)

No. In fact it's absurdly difficult to reliably create reproducible builds.

Yes and no. Yes, absolutely, it's absurdly difficult to create identical binaries, for the reasons you mentioned.

But you can reasonably get close enough to make manual inspection of the differences practical. And, as you said, the differences are usually file paths, hostnames, timestamps, etc., so one can identify them as benign pretty easily.

That's not good enough for general build reproducibility, but for one-off code-to-binary verification of key pieces it's reasonable.
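For that kind of one-off verification, a small sketch like this (hypothetical file names) is usually enough: it reports each differing byte range with a little surrounding context, so a human can eyeball whether the difference is just an embedded path, hostname or timestamp.

    def diff_regions(path_a, path_b, context=16):
        """Print the byte ranges where two same-length binaries differ."""
        with open(path_a, "rb") as fa, open(path_b, "rb") as fb:
            a, b = fa.read(), fb.read()
        if len(a) != len(b):
            print("lengths differ: %d vs %d bytes" % (len(a), len(b)))
            return
        start = None
        for i, (x, y) in enumerate(zip(a, b)):
            if x != y and start is None:
                start = i                    # a new differing region begins
            elif x == y and start is not None:
                lo = max(0, start - context)
                print("bytes %d-%d: %r vs %r"
                      % (start, i - 1, a[lo:i + context], b[lo:i + context]))
                start = None
        if start is not None:
            print("bytes %d-%d differ (to end of file)" % (start, len(a) - 1))

    diff_regions("openh264-local-build.so", "openh264-from-cisco.so")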

Re:Trusting a binary from Cisco (1)

Alain Williams (2972) | about 3 months ago | (#47515069)

Who modded this troll? It should be modded insightful. Are the NSA operatives getting in quick these days?

Re:Trusting a binary from Cisco (1)

some old guy (674482) | about 3 months ago | (#47515469)

They sure seem to be, or at least groveling apologists if not operatives.

Re:Trusting a binary from Cisco (2)

Anonymous Coward | about 3 months ago | (#47515551)

Are the NSA operatives getting in quick these days?

Operation Frist Post

Re:Trusting a binary from Cisco (5, Informative)

Actually, I do RTFA (1058596) | about 3 months ago | (#47515133)

Cisco heard your concerns and has responded: Development and maintenance will be overseen by a board from industry and the open source community.

Re:Trusting a binary from Cisco (0)

93 Escort Wagon (326346) | about 3 months ago | (#47515159)

Even though the codec source code is available, it is compiled by Cisco and provided to Mozilla. Something in me doesn't 100% trust that Cisco won't use this as an opportunity to put hidden spyware on everyone's computers.

I don't believe "everyone" is using Firefox these days - quite the opposite. So most of us aren't going to lose any sleep over this possibility.

Re:Trusting a binary from Cisco (5, Funny)

ArcadeMan (2766669) | about 3 months ago | (#47515179)

That's why I know I'm safe. I use OS X, which is a closed-source OS. And since it's closed, the government doesn't have access to it.

I love the smell of bad logic in the morning.

Re:Trusting a binary from Cisco (0)

Anonymous Coward | about 3 months ago | (#47515487)

NSA may not be smart because libertarians use linux/bsd!

Re:Trusting a binary from Cisco (5, Interesting)

Anonymous Coward | about 3 months ago | (#47515925)

Not only will it be your choice to accept the binary, but Mozilla also shares those concerns. That's why they're sandboxing the CDM plugins to limit their access and ability to do anything except what they advertise. We'll have the choice to trust Mozilla's work, disable it, or partake in an effort to confirm that it's as legit as we want, so I honestly fail to see any major issue here.

Re:Trusting a binary from Cisco (3, Insightful)

smash (1351) | about 3 months ago | (#47516075)

Why the fuck would they bother, when they can just do that to all of the backbone routers you use?

Licensed (1)

Anonymous Coward | about 3 months ago | (#47514969)

Not for long...

so-called (1)

Anonymous Coward | about 3 months ago | (#47514973)

That is not how you use this word.

Great (3, Insightful)

Burz (138833) | about 3 months ago | (#47514975)

I always wanted a backdoor in my browser.

Re:Great (1)

Anonymous Coward | about 3 months ago | (#47515375)

You have always had one. [wikipedia.org]

Re:Great (1)

smash (1351) | about 3 months ago | (#47516109)

If you think Cisco needs to backdoor your browser to own all your shit, you are tragically naive.

Re:Great (1)

bill_mcgonigle (4333) | about 3 months ago | (#47516457)

I always wanted a backdoor in my browser.

I really did try searching for how this plugin retrieval works but must not have used the right search terms.

To stay license compliant *AND* safe, Mozilla should sign the modules as they become available, and Firefox should only download them if both Mozilla's and Cisco's signatures verify.

That being done, there's very little difference between Mozilla shipping the code to you as part of a Firefox update and having the browser fetch it afterwards.

But if Mozilla is _only_ trusting Cisco's signature, then, yeah, wow, holy cow, back a truck into it.

Links welcome.
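What the poster is describing amounts to requiring two independent signatures over the same blob before installing it. A minimal sketch, assuming RSA keys, PKCS#1 v1.5 signatures and the third-party `cryptography` package (none of which is specified by Mozilla's actual scheme):

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric import padding

    def accept_module(blob, signatures, pem_keys):
        """Accept the downloaded module only if every publisher's signature verifies."""
        for pem, sig in zip(pem_keys, signatures):
            key = serialization.load_pem_public_key(pem)
            try:
                key.verify(sig, blob, padding.PKCS1v15(), hashes.SHA256())
            except InvalidSignature:
                return False
        return True

    # Hypothetical usage: require BOTH Mozilla's and Cisco's signatures to check out.
    # ok = accept_module(module_bytes, [sig_mozilla, sig_cisco], [mozilla_pem, cisco_pem])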

So Kind of open? (0)

robstout (2873439) | about 3 months ago | (#47515019)

Kind of open, I guess?

Re:So Kind of open? (5, Informative)

Actually, I do RTFA (1058596) | about 3 months ago | (#47515075)

The source is open: you can read it, you can compile it and compare binaries, etc.

In fact, it is BSD licensed.

But that only covers the copyright. The patent is not open (nor owned by Cisco), and seems to prevent derivative works.

Cisco paid the fees to use the patent in this one application, and open-sourced it to the world. Seems like a great solution, security-wise, and clever legally.

And, it becomes just more BSD code when the patent expires in... what, a decade? Or if the new Supreme Court ruling is found to invalidate the patent.

Re:So Kind of open? (3, Interesting)

alci63 (1856480) | about 3 months ago | (#47515357)

So, this is a software-only patent... so it's not legal in Europe (or is it?). Some Linux distro might consider integrating this code directly, compiling it instead of letting FF grab a blob from Cisco. Maybe distribute it in a special repository that users would activate where it's legal... Notice that VideoLAN, for example, does play H.264 (AVC), and does not license anything...

Patent upgrade treadmill (4, Insightful)

tepples (727027) | about 3 months ago | (#47515413)

And, it becomes just more BSD code when the patent expires in... what, a decade?

A decade from now, most major web video streams will be in H.265 (HEVC), and H.266 will be the Next Big Thing(tm). By the time the patents on one codec have run out, bandwidth constraints cause providers of non-free media to switch to a new freshly patented codec. Users end up stuck on a treadmill, from H.261 to MPEG-1 to MPEG-2 to H.263 family (Sorenson Spark, DivX, Xvid) to H.264 (AVC) and so on.

Re:Patent upgrade treadmill (3, Insightful)

petermgreen (876956) | about 3 months ago | (#47515611)

Maybe, maybe not. Once a format is deemed "good enough" it can stick around for a long time. See MP3, JPEG, PNG, etc. Furthermore, bandwidth prices have dropped through the floor in recent years.

Re:Patent upgrade treadmill (3, Insightful)

tepples (727027) | about 3 months ago | (#47515741)

Once a format is deemed "good enough" it can stick around for a long time.

True, if it is impractical to deploy a new codec in the field alongside the existing codecs, a first mover will win. This is why U.S. OTA digital television is stuck on DVD/SVCD era codecs, but some countries whose digital transition happened later use H.264.

Furthermore bandwidth prices have dropped through the floor in recent years

Long haul yes, last mile no. Satellite and cellular ISPs tend to charge on the order of $10 per GB. Even wired home ISPs such as Comcast and Verizon have been practicing "congestion by choice", refusing to peer with L3.

Re:Patent upgrade treadmill (1)

Anonymous Coward | about 3 months ago | (#47515711)

Bandwidth and codec efficiency are not the key issues. The reason H.264 has won out over all other codecs is that it had enough industry support to get a hardware implementation into practically all mobile devices. Unless this hardware support goes away, H.264 will continue to be a viable alternative, even after its patents expire.

There's only so much compression possible before you can no longer handle source material with inherently high entropy. Diminishing returns with increased coding complexity, plus the availability of a (by then) free "good enough" alternative, significantly reduce the attractiveness of newer standards. Is a compression factor of 2 compared to a free codec worth the license trouble and the additional development?

What matters is that the existing standard doesn't empty the battery where a new codec would. With hardware support for H.264 on every device and spotty support for any newer codec, that should limit the licensing shenanigans anyone can pull with a newer standard. For comparison, see MP3 vs AAC.

YouTube never implemented Theora (1)

tepples (727027) | about 3 months ago | (#47515757)

Is a compression factor of 2 compared to a free codec worth the license trouble and the additional development?

Yes. This is why YouTube never implemented Theora, waiting until VP8 (which roughly compares to AVC baseline) before adding any free codecs.

What matters is that the existing standard doesn't empty the battery where a new codec wouldn't.

Why wouldn't the new codec be GPGPU-accelerated too?

Re:YouTube never implemented Theora (0)

Anonymous Coward | about 3 months ago | (#47515985)

We're not comparing two unaccelerated codecs with similar licenses. All else equal, the more efficient codec is of course preferable. But if you compare hardware-accelerated H.264, which will by then be free, to a new codec which compresses twice as well but requires patent licenses and doesn't (yet) have ubiquitous hardware support, the new codec doesn't look attractive. Bandwidth is not that scarce.

GPGPU acceleration is much less power efficient than dedicated codec hardware. Dedicated codec hardware is at least partly codec specific. Newer codecs typically include the building blocks of older codecs, but not vice versa. Even if a codec doesn't need additional hardware blocks and could theoretically be implemented by updating DSP firmware, manufacturers don't do this, because the licensing costs were not included in the price of the sold hardware. In consequence, a new codec necessarily requires at least two years for competitive market penetration (because phones are replaced on about a two year schedule).

Re:YouTube never implemented Theora (1)

tepples (727027) | about 3 months ago | (#47516267)

Bandwidth is not that scarce.

It is when you're trying to send 2160p video to subscribers behind local monopoly ISPs that routinely practice metering, congestion by choice, or both.

Re:Patent upgrade treadmill (1)

Actually, I do RTFA (1058596) | about 3 months ago | (#47516249)

By the time the patents on one codec have run out, bandwidth constraints cause providers of non-free media to switch to a new freshly patented codec

That seems silly.

Bandwidth is one of those commodities (like processor cycles) that gets cheaper as time marches on. Bandwidth now is easily a couple of orders of magnitude higher than a decade ago (and moving towards gigabit), and that was several orders of magnitude higher than the decade before that.

Further, it's a cost center now. If you could halve Netflix's bandwidth costs, you'd be quite wealthy.

The real limits are a) on the decoding side, how much processor power, RAM, etc. it takes to create an image, and b) the quality of the decompressed video, especially against theoretical limits.

1 Gbit per second vs. 5 GB per month (1)

tepples (727027) | about 3 months ago | (#47516275)

Bandwidth now is easily a couple of orders of magnitude higher than a decade ago (and moving towards gigabit)

Gigabit per second just means you blow through a month of last-mile data in 40 seconds.
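The arithmetic behind that claim, assuming the 5 GB monthly cap from the title:

    cap_bits = 5 * 10**9 * 8      # a 5 GB monthly cap, expressed in bits
    link_bps = 1 * 10**9          # a 1 Gbit/s last-mile link
    print(cap_bits / link_bps)    # 40.0 -> the whole month's cap is gone in 40 seconds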

archival H.264 (0)

Anonymous Coward | about 3 months ago | (#47516389)

And, it becomes just more BSD code when the patent expires in... what, a decade?

A decade from now, most major web video streams will be in H.265 (HEVC), and H.266 will be the Next Big Thing(tm).

True, but if you save all your files in H.264, you are guaranteed an archival data format that can be read by software that won't suddenly stop working.

We have to remember to preserve the past in addition to planning for the future.

Archiving your own or someone else's? (1)

tepples (727027) | about 3 months ago | (#47516487)

True, but if you save all your files in H.264, you are guaranteed an archival data format that can be read by software that won't suddenly stop working.

If you are archiving a video that you produced, what's the big advantage of H.264 over VP8? VP8 is rate-distortion comparable to H.264 baseline, and VP8 is free today. An archival copy needs to be read by software, not necessarily read by specialized hardware in a battery-constrained device.

If you are archiving a video that someone else produced, most streaming video providers have a policy of implementing technical measures to prevent just that, backed by national anticircumvention legislation.

Re:So Kind of open? (0)

Anonymous Coward | about 3 months ago | (#47515521)

There is the issue of what MPEG-LA intends to do with their grip on things in 2016, correct? How is this implementation not affected?

Is anyone left to care? (4, Insightful)

Anonymous Coward | about 3 months ago | (#47515039)

They've already destroyed FF and changed it from a browser with its own identity into Chrome's obsessed former friend who mimics her every move and style and is planning to kill her and assume her identity some day.

Honestly, there's nothing left to call Firefox now. If I want a browser like Chrome, I'll run Chrome. If I want a browser like Firefox, then I have to use an old one or a fork.

Stop punching your users in the face, and give them back the control they had over their browser.

Re:Is anyone left to care? (1)

roca (43122) | about 3 months ago | (#47515095)

You have control. As the article says:
> Users will have options to activate or deactivate it

Re:Is anyone left to care? (1)

Anonymous Coward | about 3 months ago | (#47515175)

You have control. As the article says:
> Users will have options to activate or deactivate it

I may have control over this plugin, but I don't have control over my whole browsing experience the way that I did 8 versions ago.

Re:Is anyone left to care? (1)

Anonymous Coward | about 3 months ago | (#47515599)

I don't have control over my whole browsing experience the way that I did 8 versions ago.

That would be last Monday's version, right ?

Re:Is anyone left to care? (2)

stoploss (2842505) | about 3 months ago | (#47515601)

I may have control over this plugin, but I don't have control over my whole browsing experience the way that I did 8 versions ago.

AKA "last month". Mozilla really lost the community's goodwill with that move. There was no compelling rationale to support FF after that. Their insistence on using a single-process model really destabilizes their browser, for example. Every release seems to remove functionality or force you to change the way you use the browser in ways you don't want. It's like they hired Gnome 3/Unity/Windows Metro program managers and asked them how best to fuck up their main product.

Thanks to this change to their support model I relegated FF to rare use when I need to check to confirm if another browser is being flaky or if the site itself is to blame.

Re:Is anyone left to care? (-1, Flamebait)

Anonymous Coward | about 3 months ago | (#47516345)

You're still living 4 years in the past, aren't you? Firefox has never insisted on sticking with the single-process model; they just had a lot of code to fix to move to one. So they fixed that code while multi-threading the browser as much as possible, and as of half a year ago they resumed work on a multi-process model - hell, you can even try it in the File menu if you're so inclined (new e10s window).

The fact is that people like you guys just don't want to install another addon or keep up to date on things. To you, if Mozilla isn't doing everything for you as quickly as you want, then they're not worth considering. But funny enough, I don't see you guys doing anything except ignorantly bitching like a bunch of princesses with a pea under their pile of mattresses. I'd say you should just stop using Firefox entirely if you're just going to insist that it's terrible because it doesn't fondle your balls precisely how you want it to.

Re:Is anyone left to care? (1)

93 Escort Wagon (326346) | about 3 months ago | (#47515183)

You have control. As the article says:
> Users will have options to activate or deactivate it

It sounds like the person to whom you're replying deactivated Firefox quite some time ago.

Re:Is anyone left to care? (0)

Anonymous Coward | about 3 months ago | (#47515153)

If you can't manage to read and see that you don't have to use this, well, stay on chrome and enjoy the sound of your whiny voice.

Version number confusion (4, Informative)

Anonymous Coward | about 3 months ago | (#47515087)

(reads summary)

Hum, Interesting...firefox 33 integrates, mumble, mumble...wait, something's not right with this picture.

(Scrolls back a few lines on the RSS feed)

Firefox 31 Released [slashdot.org]

Aha! I knew it. Latest version is 31! Must be a typo...

(One angry RTFA later)

Oh, hang on... They are referring to the yet-unreleased, future version of Firefox. With no indication whatsoever of that fact in the summary, even though a (stable?) version of Firefox was just recently released, as highlighted on this very same website less than 24 hours ago.

...

Would it have killed anyone to point this out somewhere? You know, for those of us at home who don't keep up with Firefox's versioning madness?

Re:Version number confusion (1)

ArcadeMan (2766669) | about 3 months ago | (#47515161)

You don't have to keep track of Firefox versions. By this time next month they'll announce Firefox 40, with H.265 support.

Re:Version number confusion (1)

Cro Magnon (467622) | about 3 months ago | (#47515391)

Version 31 was yesterday's version. 33 will be tomorrow's version.

Re:Version number confusion (0)

Anonymous Coward | about 3 months ago | (#47516481)

No, version 33 is scheduled for 3 months from now.

Re:Version number confusion (1)

Em Adespoton (792954) | about 3 months ago | (#47516377)

What makes it even more confusing is that last night, I was reading about the new features being added to Firefox 32 which is currently available to testers.

I've heard of sprints, but this is getting a bit silly.

At fucking last (1, Interesting)

ArcadeMan (2766669) | about 3 months ago | (#47515127)

Can we finally use the tag with H.264 files and just forget about the rest?

Re:At fucking last (1, Informative)

ArcadeMan (2766669) | about 3 months ago | (#47515149)

Always really preview before clicking submit.

Can we finally use the the <video> tag with H.264 files and just forget about the rest?

Re:At fucking last (2)

93 Escort Wagon (326346) | about 3 months ago | (#47515301)

Always really preview before clicking submit.

Can we finally use the the <video> tag with H.264 files and just forget about the rest?

No, since Firefox is currently limiting the use of this plugin to WebRTC - which basically means it's not available for anything actual users want to do, such as watch html5 video.

Re:At fucking last (1)

ArcadeMan (2766669) | about 3 months ago | (#47515323)

Another stupid idea by the Firefox team, then.

Re:At fucking last (2)

Blaskowicz (634489) | about 3 months ago | (#47515473)

The article mentions Youtube, without giving any specifics. Seems they're shipping the plugin greyed out, disabled etc., and then WebRTC stuff will work (has anyone ever used that?), and then maybe you'll be able to use html5 video in some future version, maybe.

Setting the politics aside, and even whether or not they intend to provide html5 video support, it feels better to do a staged release like that. I sure would want the kinks, bugs, networking and security issues worked out before it is unleashed on millions of unsuspecting users.
Well, people will get that h264 support for WebRTC even though they have no idea what the f that means, but as no one uses WebRTC it wouldn't be as drastic as Youtube support.

Re:At fucking last (0)

Anonymous Coward | about 3 months ago | (#47515609)

No, because I will disable it.

Re:At fucking last (1)

ArcadeMan (2766669) | about 3 months ago | (#47516185)

No video for you.

NEXT!

bad for standards (5, Insightful)

l2718 (514756) | about 3 months ago | (#47515129)

Mozilla capitulating on the <video> tag has serious implications for web standards. By including patent-encumbered code in the browser they pull the rug out from under those in the www foundation who argue for free web standards. Yes, some websites wanted to use H.264 for video encoding, but Mozilla shouldn't have abetted them.

Re:bad for standards (2)

emblemparade (774653) | about 3 months ago | (#47515225)

This has nothing to do with the <video> tag itself, which does not specify codecs. Yes, this is still a compromise, but many of us have been compromising for years on various aspects of freedom and openness. Choose your battles carefully and you can win the war: Mozilla has already achieved so much for the open web, and I'm confident the upward slope will continue.

Re:bad for standards (2, Interesting)

ArcadeMan (2766669) | about 3 months ago | (#47515249)

I'm all for open standards and fewer patents, but H.264 video and H.264 decoding hardware have been used everywhere for almost a decade now. Even if something free and open-source had been able to replace it, we're on the verge of switching to H.265, which is about twice as good as H.264.

Re:bad for standards (1)

Anonymous Coward | about 3 months ago | (#47515369)

The battle now is to prevent H.265 from becoming a new de-facto standard, so the future does not get stuck with yet another patent encumbered format.

Xiph is working on Daala and Google on VP9, both attempting to replace H.265 -- with Daala, at the moment, looking like the most promising one: something that can not only compete with H.265 but perform better.

Re:bad for standards (2, Insightful)

smash (1351) | about 3 months ago | (#47516125)

If the open source world releases something (unencumbered with the GPL - i.e., BSD licensed) with encoding and decoding tools that actually work as well as or better than the closed alternative, in a timely manner, then I'm sure people will use it.

It will never happen. Get used to it. There is far, far less complex stuff in the free desktop that has been broken for the past 20 years and still not fixed.

Re:bad for standards (2)

Blaskowicz (634489) | about 3 months ago | (#47515677)

I'm sure the transition to H265 will be at least a decade long (do unreleased AMD and Intel CPUs even support it? I think not). H264 will stay for a long time. Even MP3 has been outdated for like 10+ years but still is massively used.

Re:bad for standards (2)

Dr.Dubious DDQ (11968) | about 3 months ago | (#47515291)

It also still doesn't give anyone permission to generate their own h.264 video files (outside of webrtc "video-chatting" inside the browser) legally without paying someone a patent "poll-tax" for permission, so this is still "consume-only".

I'm also under the impression that there are, absurdly, potential patent-license issues with the .mp4 file format that h.264 video is most often stored in.

Finally, of course, unless the usual obstructionists Apple and Microsoft ever implement Opus codec support, this also doesn't give you the legal ability to include sound (mp3 or aac, typically, for h.264 videos) with the video. Hope everybody likes silent movies...

Re:bad for standards (5, Informative)

tlhIngan (30335) | about 3 months ago | (#47516105)

It also still doesn't give anyone permission to generate their own h.264 video files (outside of webrtc "video-chatting" inside the browser) legally without paying someone a patent "poll-tax" for permission, so this is still "consume-only".
I'm also under the impression that there are, absurdly, potential patent-license issues with the .mp4 file format that h.264 video is most often stored in.

Finally, of course, unless the usual obstructionists Apple and Microsoft ever implement Opus codec support, this also doesn't give you the legal ability to include sound (mp3 or aac, typically, for h.264 videos) with the video. Hope everybody likes silent movies...

If you have a camcorder, the license to create h.264 is present as part of the camcorder. This includes phones and everything else people submit to YouTube, for example.

The only constraint is that if you post content online, you cannot take payment on the content itself - i.e., you can put it online, you can put ads around it, but you cannot force someone to pay to view that content (commercial activity). So those videos on YouTube where you have to pay in order to view them come under a different license.

As for the MP4 format being patented - it was put under RAND terms by Apple ages ago (MP4 is a subset of the QuickTime MOV format). That's if Apple is asserting any patents on the format at all. But since people mass-license the h.264 patents through the MPEG-LA, any patents Apple has on MP4 are included in the license fee you pay to create or display the content.

Sound is licensed under a separate agreement - MP3 or AAC. Again, your typical MPEG-LA license for h.264 will probably include use licenses for AAC (most typical format) so you can have a soundtrack.

If not, there's always PCM as well - handled by the format just fine.

In an imperfect world... (4, Insightful)

westlake (615356) | about 3 months ago | (#47515497)

Yes, some websites wanted to use H.264 for video encoding, but Mozilla shouldn't have abetted them.

H.264 is here.

HEVC not far down the road.

The geek sees everything in terms of the "open" web.

But there is more to digital video than video distribution through the web.

Which is why the mainstream commercial codecs dominate here.

Which is why hardware and software support for these codecs is baked into the smartphone, tablet, PC, graphics card, HDTV, video game console, Blu-ray player, the prosumer HD camcorder, medical and industrial video systems, and so on, endlessly.

Re:bad for standards (1)

diamondmagic (877411) | about 3 months ago | (#47515723)

Code implementing software patents can still be Free/Open Source Software. I mean, isn't that what x264 and VLC are? The un-FOSS-like restriction is one enforced by the government and patent trolls, not the software project.

Just because one country makes it illegal doesn't mean you should, or even have to, spread that restriction all around the world.

Mozilla isn't even offering people the option to enable h.264 in some alternative fashion (maybe a user could provide it themselves, maybe Firefox searches the OS or hardware for an h.264 implementation) - which they could legally do - no, they're just saying "Haha, screw you".

Re:bad for standards (1)

Anonymous Coward | about 3 months ago | (#47516231)

Actually, if you are running a recent version of Firefox on a Linux system and you have gstreamer and the gstreamer ffmpeg plugin installed, then you can watch h.264 videos with no problem as Firefox supports gstreamer by default on Linux systems. The question now is whether Mozilla will change that default in order to give Cisco's implementation preference over what is already supported.

Re:bad for standards (0)

Anonymous Coward | about 3 months ago | (#47516491)

They also recently decided to include DRM (EME), the interface has big lumpy icons instead of simple text menus (I believe you can change that but I don't have a recent version anymore), they included their own pdf viewer (ridiculous), a simple way to disable javascript is now hidden from regular users (a common choice many users make), etc, etc, etc.
I've switched to wget.

bad idea written all over it (0)

Anonymous Coward | about 3 months ago | (#47515155)

Nothing should come direct from Cisco. We shouldn't be reliant on patent encumbered formats. It's that simple. No matter what the payoff is there is something fundamentally wrong about it. We're giving up control of our systems and that is just wrong. I shouldn't need a license to tinker with the code.

If this system is to be put in place despite this outcry, anything to be downloaded from Cisco directly should be compiled and signed by Mozilla for security reasons, and the code should also be compilable such that it can be matched against the Cisco binary. Firefox should then check this executable upon downloading it.
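The check being asked for here is essentially hash pinning: the browser vendor ships the expected digest and the browser refuses anything that doesn't match. A minimal sketch (the URL and digest are placeholders, not Mozilla's real endpoints):

    import hashlib
    import urllib.request

    MODULE_URL = "https://example.invalid/openh264.zip"   # placeholder URL
    EXPECTED_SHA256 = "0" * 64                            # placeholder pinned digest

    def fetch_verified(url, expected):
        """Download the module and refuse to install it unless its hash matches."""
        blob = urllib.request.urlopen(url).read()
        digest = hashlib.sha256(blob).hexdigest()
        if digest != expected:
            raise RuntimeError("module hash mismatch: got %s" % digest)
        return blob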

Re:bad idea written all over it (1)

ArcadeMan (2766669) | about 3 months ago | (#47515313)

Aren't there patents on GIF and JPEG? And the web is full of images in those two formats anyway.

Re:bad idea written all over it (1)

Anonymous Coward | about 3 months ago | (#47515627)

GIF patents (the LZW-compression part of GIF) expired on 20 June 2003 (Source [wikipedia.org]). JPEG only had a patent troll go after it. (Source [wikipedia.org])

So what's the best way to do video on the web? (2)

RogueWarrior65 (678876) | about 3 months ago | (#47515241)

Serious question: What's the best way to handle video on the web given a few requirements? First, the content needs to be hosted on the same site as the website. Why? Because sites like Youtube and Vimeo have control over it. They can unilaterally decide to take something down. They will also present related video. For someone trying to market product, you shouldn't make it easy for a prospective customer to find your competitors. Second, the video has to work on both Macs and PCs. Third, the video has to work on Internet Explorer as early as v.8 because too many users don't know any better.

Re:So what's the best way to do video on the web? (0)

jonwil (467024) | about 3 months ago | (#47515745)

If you genuinely need to support Intercrap Exploder then your best option is Flash.

Re:So what's the best way to do video on the web? (1)

Anonymous Coward | about 3 months ago | (#47515783)

Serious question: What's the best way to handle video on the web given a few requirements? First, the content needs to be hosted on the same site as the website. Why? Because sites like Youtube and Vimeo have control over it. They can unilaterally decide to take something down. They will also present related video. For someone trying to market product, you shouldn't make it easy for a prospective customer to find your competitors. Second, the video has to work on both Macs and PCs. Third, the video has to work on Internet Explorer as early as v.8 because too many users don't know any better.

This sounds like a valid question for an 'ask Slashdot' article. You'll get more answers that way than you will here in a comment that might get modded 'Off topic'.

Re:So what's the best way to do video on the web? (1)

smash (1351) | about 3 months ago | (#47516143)

Video tag. For the IE8 users, give them alt text.

mpeg4, with link too if embedded (1)

raymorris (2726007) | about 3 months ago | (#47516207)

Virtually all of the popular file formats for video are essentially containers that have mpeg4 video inside. Therefore, essentially any player can play mpeg4. The difference is which package files they can open, so just use a plain .mpg file rather than a proprietary package like .wmv.

If you want to embed the video that's fine, but also provide a link to the mpeg file itself. A plain link to a mpg file is like a plain link to an html page - it will work for anyone.

cave in (1)

mnt (1796310) | about 3 months ago | (#47515283)

so mozilla gave up to make the web a better place?

Re:cave in (1)

smash (1351) | about 3 months ago | (#47516151)

so mozilla gave up, to make the web a better place?

Fixed.

Latest version (2)

rossdee (243626) | about 3 months ago | (#47515361)

So that's what's gonna be in FF33, which is 2 versions from now.

FF31 has just been released, AFAIK.

So what's new (or broken) in FF31 - should I upgrade from FF30?

Re:Latest version (0)

Anonymous Coward | about 3 months ago | (#47515751)

I upgraded from firefox 30.0+build1-0ubuntu0.12.04.3 to firefox 31.0+build1-0ubuntu0.12.04.1 this morning when it became available.
I tried playing some videos, and I immediately noticed horrible audio stuttering problems.

If version 32 doesn't come out quick enough, does anyone know where I can get a copy of the DEBs for firefox 30.0+build1-0ubuntu0.12.04.3?
I forgot to copy them before I upgraded, and it sucks that Ubuntu doesn't keep them around on their servers.
One of these days I'll learn my lesson.

Re:Latest version (1)

Derek Pomery (2028) | about 3 months ago | (#47515927)

Haven't noticed that personally. Have you tried killing off pulseaudio? I occasionally get weirdass audio problems with pulse that get fixed by that.

I guess a full on logging out and logging back in might do the trick too.

Re:Latest version (0)

Anonymous Coward | about 3 months ago | (#47515781)

Non-ESR versions get end-of-lifed very soon after a new release. Don't expect security fixes.

Re:Latest version (1)

jenningsthecat (1525947) | about 3 months ago | (#47515811)

FF31 has just been released AFAIK

So whats new (or broken) in FF31 - should I upgrade from FF30 ?

Unless you like Australis, you may want to 'upgrade' to Pale Moon 24.

Re:Latest version (1)

Derek Pomery (2028) | about 3 months ago | (#47516245)

Re:Latest version (1)

ravenlord_hun (2715033) | about 3 months ago | (#47516285)

That's a very good solution... right until the plugin stops working because a future update broke it. Then you are left with a chrome-copycat.

Re:Latest version (1)

Derek Pomery (2028) | about 3 months ago | (#47516321)

Welp, then downgrade or whatever can be an option if-when that happens. For now, I'd prefer using the addon over dropping back that far.
Hell, there's always ESR to drag that window out even further if indeed the addon gets abandoned.
But given how many people (me included) are annoyed with Australis, I expect the addon will have a reasonable shelf life.

Re:Latest version (0)

Anonymous Coward | about 3 months ago | (#47516247)

pcxFirefox 28 is probably a better choice for that; it's compiled with ICC with support for SSE3, too.

ActiveX again. (3, Interesting)

mar.kolya (2448710) | about 3 months ago | (#47515439)

So, at least on Linux, this 'thing' doesn't come packaged with the browser. Instead the browser DOWNLOADS this crap from the net. ActiveX, anyone?

Very-very-very disappointing. Looks like Mozilla have forgotten what their mission was behind all those gay-rights fights.

Re:ActiveX again. (1)

Anonymous Coward | about 3 months ago | (#47515963)

If you'd bothered to read up on it, you'd know better than to jerk your knee like this. Not only is it an option, not only are you able to block it, not only are they working hard to make it easier to verify yourself which code has been compiled into a given binary, but they're also working to limit what these plugins truly do - going so far as to sandbox them entirely from access to your system. Stop trying to make false equivalencies. Next you'll be telling me that the standard NPAPI is ActiveX, and a clearly safer and superior solution to this.

Australis killed Firefox (1)

EzInKy (115248) | about 3 months ago | (#47515489)

Much as I hate to admit it, what I see is what I get. Kill Australis and save Firefox!

Re:Australis killed Firefox (0)

Anonymous Coward | about 3 months ago | (#47516293)

Here's an idea: why not do something instead of bitching and moaning? Let Mozilla work on what they want to - a core browser we can extend by our own whims. Create a skin of your own, or use someone else's hard work like Pale Moon's. Mozilla is not there to cater to everyone's precise whims, and that mentality is what led to its actual downfall when Chrome came out. If the best you can do is badmouth Firefox, then all you're doing is killing Firefox. Who wants to use a browser that even the fans hate? And when Firefox finally dies, so too will go all the third-party browsers based on Gecko, and then you'll have something to REALLY bitch and moan about. Use your brain sometime.

Blindly trusting Cisco? (0)

Anonymous Coward | about 3 months ago | (#47515719)

So, I remember back when this was originally announced that there was a large pushback against Firefox fetching a binary blob compiled by Cisco, as there was no way to know if it was built from the published OpenH264 source code. These concerns were met by Mozilla and Cisco with hand-waving assurances that they would work on reproducible builds. Mike Perry from Mozilla had this to say:
"In fact, if there is any component of Firefox that should have reproducible builds as a hard requirement, this seems like candidate 0."

So, here we are in 2014, Mozilla is implementing the binary blob download, and... where are the reproducible builds?

Looking at the issue tracker, I found this:
https://github.com/cisco/openh264/issues/893
"Supporting replicatable builds is still on our to-do list."

That's great, guys. So the "hard requirement" is no longer there, I guess. And we should all just shut up and trust the benevolent Cisco.

Re:Blindly trusting Cisco? (2)

smash (1351) | about 3 months ago | (#47516175)

If you don't trust Cisco you better get off the internet. Seriously, if you're worried about this, a binary blob running in your web browser is the least of your problems. There's a very good chance that the network hardware at your ISP is Cisco. If it's not, it will likely be Juniper.

Flash support (1)

Eravnrekaree (467752) | about 3 months ago | (#47515871)

I know some will mock this, but there is a heck of a lot of Flash content out there, and Firefox really should work with Adobe on an unloadable plugin for getting an up-to-date Flash player on all platforms. There is really far too much Flash content out there to ignore this need. Make it something that can be disabled, and unloaded as a plugin, sure. If you don't want it, you won't have to have it loaded, so it keeps everyone happy. I think that getting Ogg support and other open codecs into the browser will help us transition away from Flash over time, which is something I support, especially with Google going Flash-free on most of its new YouTube content, for instance. But there is a lot of existing Flash content out there and it's not going away any time soon, so we need the capability to view it.

As will Flash moving to HTML5 instead of a plugin (1)

raymorris (2726007) | about 3 months ago | (#47516291)

>I think that getting Ogg support into the browser and other open codecs will help us transition away from the Flash over time,

Also, Flash CC, the authoring tool, can now output HTML5 rather than SWF, so all the existing Flash projects can be recompiled to no longer require the plugin. Support isn't 100% yet, but that's the direction Adobe is going. The programming language within Flash has always been a dialect of JavaScript/ECMAScript, so it is pretty simple for Adobe to start using the browser's JavaScript engine instead of one provided by Flash. Other than a cross-browser JavaScript engine, the other thing provided by the plugin is a graphics API. Now that there's the canvas element, there's no need for the plugin.

Hurry up (1)

bhlowe (1803290) | about 3 months ago | (#47516085)

This split between supported formats on various browsers is ridiculous. Embed it into the next Firefox so that video tags support H.264. Make it something you can disable if you're paranoid. There will be plenty of time to examine it and make sure there isn't a back door (which would be a stupid thing for Cisco to attempt!).