
Google Submits VP8 Draft To the IETF

Soulskill posted more than 3 years ago | from the i-could-have-had-a-vp8 dept.


An anonymous reader writes "Google has submitted an Internet Draft covering the bitstream format and decoding of VP8 video to the Internet Engineering Task Force. CNET's Stephen Shankland writes, 'Google representatives published the "VP8 Data Format and Decoding Guide" at the IETF earlier this month, but that doesn't signal standardization, the company said in a statement. The document details the VP8 bitstream — the actual sequence of bytes into which video is encoded. "We submitted the VP8 bitstream reference as an IETF Independent RFC [request for comments] to create a canonical public reference for the document," Google said. "This is independent from a standards track." The IETF document could help allay one concern VP8 critics have raised: that VP8 is defined not by documentation of the bitstream but rather by the source code of the software Google released to implement VP8. But the IETF document still plays a subordinate role to that source code.'"

156 comments

Reminds me of Sony (1)

commodore64_love (1445365) | more than 3 years ago | (#34964684)

Everybody in the world is using DVD, which naturally evolved into HD-DVD, and along comes Sony and a few other companies with "Blu-ray," which they claimed was superior. All it did was set up confusion for consumers and a format war.
Thankfully that "war" was short and didn't have negative consequences.
Hopefully this one will be too, so we can have a clear standard rather than a divided MPEG vs. VP8 internet.

Re:Reminds me of Sony (1)

theaveng (1243528) | more than 3 years ago | (#34964716)

Not the same thing.

But I understand your point about wanting to avoid another VHS versus Beta or HD-DVD versus Blu-ray format confusion. It would have a negative impact on consumers, especially those who barely know how to use computers ("How do I make the window fill the whole screen?"), or those wondering why they can't play a VP8-encoded video on their iGadget, since it only supports MPEG formats.

Venue choice? (3, Interesting)

Compaqt (1758360) | more than 3 years ago | (#34964740)

Is there any significance to the fact that Google chose IETF instead of ISO (where MPEG-LA and M$ submitted H.264 and OOXML)?

Re:Venue choice? (1)

happymellon (927696) | more than 3 years ago | (#34964794)

OK, I just posted a similar question below, and checking out Wikipedia seems to imply that they work with ISO:

The Internet Engineering Task Force (IETF) develops and promotes Internet standards, cooperating closely with the W3C and ISO/IEC standards bodies and dealing in particular with standards of the TCP/IP and Internet protocol suite.

So is this meant to bypass the ISO standardisation process and get it put straight into an ISO standard? That's a pretty imaginative way to play the game; kudos to them.

In a way, they did submit it to ISO, just to a small board, but one that controls everything that Google cares about.

Re:Venue choice? (0)

Anonymous Coward | more than 3 years ago | (#34964842)

Well, the summary says it is only an RFC and is not on a "standards track," which means Google still isn't ready (they probably don't have the documentation finished yet) to make it an open standard.

Re:Venue choice? (3, Interesting)

msauve (701917) | more than 3 years ago | (#34964974)

It takes some time before an RFC can become a proposed IETF standard.

A Proposed Standard specification is generally stable, has resolved known design choices, is believed to be well-understood, has received significant community review, and appears to enjoy enough community interest to be considered valuable.

- RFC 2026 [rfc-editor.org]

There aren't that many actual IETF standards. The standards process isn't even a standard. HTTP is only a draft standard. RFC 1918 (which defines the 10.0.0.0/8 - 172.16.0.0/12 - 192.168.0.0/16 private IP addresses) is only a proposed standard, yet was published in 1996, and is in universal use.
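The RFC 1918 containment check described above is mechanical; a minimal Python sketch using the standard-library `ipaddress` module (the helper name is illustrative, but the three ranges are exactly the ones RFC 1918 defines):

```python
import ipaddress

# The three private address blocks defined by RFC 1918.
RFC1918_NETS = [
    ipaddress.ip_network("10.0.0.0/8"),
    ipaddress.ip_network("172.16.0.0/12"),
    ipaddress.ip_network("192.168.0.0/16"),
]

def is_rfc1918(addr: str) -> bool:
    """Return True if addr falls in any RFC 1918 private range."""
    ip = ipaddress.ip_address(addr)
    return any(ip in net for net in RFC1918_NETS)

print(is_rfc1918("192.168.1.10"))  # True
print(is_rfc1918("8.8.8.8"))       # False
```

Note that 172.16.0.0/12 covers 172.16.x.x through 172.31.x.x only; 172.32.0.1 is public space.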

Re:Venue choice? (2)

dirkx (540136) | more than 3 years ago | (#34966382)

And there is also the expectation from the original article that "the IETF document still plays a subordinate role to that source code." However, one interesting expectation (or requirement) of IETF standards is at least one, and ideally two, independent implementations based on the written spec. That would alleviate the concern that the VP8 source code is too leading, as one would then have an example of an independently derived code base which took the written spec as its lead; secondly, it would give the community a very fair idea of any residual IPR issues, whether Google's or others'. Thanks, Dw.

Re:Venue choice? (0)

Anonymous Coward | more than 3 years ago | (#34965916)

The Internet was pretty much built on RFCs submitted along with reference implementations. From that standpoint, Google is in line with history.

Re:Venue choice? (5, Informative)

arivanov (12034) | more than 3 years ago | (#34964800)

Yes, there is. Read up on the history of MSFT XML going through ISO. It says all there is to be said about ISO certification.

IETF may have its own politics (same as any standards body). However, out of all standards bodies it is the one which is probably the least corrupt.

Re:Venue choice? (5, Informative)

msauve (701917) | more than 3 years ago | (#34964824)

Well, let's see. The ISO has OSI-IP, IDRP, IS-IS, CMIP, X.400, X.500, etc. The IETF has TCP/IP, BGP, OSPF, SNMP, SMTP, LDAP, etc.

I think it's pretty clear why they created an IETF RFC.

IS-IS (0)

Anonymous Coward | more than 3 years ago | (#34965044)

Well, let's see. The ISO has OSI-IP, IDRP, IS-IS, CMIP, X.400, X.500, etc. The IETF has TCP/IP, BGP, OSPF, SNMP, SMTP, LDAP, etc.

I think it's pretty clear why they created an IETF RFC.

It should be noted that IS-IS is the protocol used for TRILL/routing bridges. TRILL is being done under the auspices of IETF (see RFC 5556.)

Re:Venue choice? (4, Informative)

TheRaven64 (641858) | more than 3 years ago | (#34964926)

IETF RFCs are just that - requests for comments. Anyone can publish one. The IETF assigns them a number, and they are public, but that's all. The IETF does not necessarily endorse them; it just publishes them so that they can get feedback.

Within the set of RFCs there are some that are designated 'standards track'. These are ones that will eventually become IETF-endorsed standards. Most Internet-related standards are defined by a set of standards-track RFCs. These have a number of requirements, such as being free to implement (no known lurking patents) and having two existing, interoperable, independent implementations.

In contrast, some are informational RFCs, which basically just document existing practice. A company often releases one of these to let everyone else know what they are doing. It's basically a central location for publishing documentation.

Unlike a submission to ISO, this is not a request for standardisation, it's just a slightly more formal way of publishing documentation than popping it up on your own web server.

It's worth noting that publishing an informational RFC is sometimes the first step towards getting something adopted on the standards track. If I were in charge at Google, I would invite the IETF to form a video encoding working group and take control of the evolution of WebM.

Re:Venue choice? (1)

hedwards (940851) | more than 3 years ago | (#34966782)

A submission to ISO definitely isn't a request for standardization. I think we settled that with that whole debacle over MS' entry into the open document niche. They said themselves that they prefer to allow the market to decide. Which is just a fancy way of saying that they aren't a standards organization.

A standards organization that allows competing standards to battle it out is completely worthless in that respect. They're supposed to pick winners and losers; otherwise you don't get an interoperable standard.

Re:Venue choice? (4, Informative)

jpmorgan (517966) | more than 3 years ago | (#34965130)

The IETF is the correct body for something like this, not ISO.

ISO is a standards body, and the function of a standards body in every other industry is to take multiple incompatible implementations of a concept, figure out the best of each, and combine them into a single common standard that everybody can support. Politics are an inherent part of it, since the entity whose current products are closest to the eventual standard stands to do well financially. Look at how OpenGL is developed for an example of a proper standardization process. Companies implement the standard, then add extensions to provide new features and give themselves a competitive advantage. Then at the next standards meeting, OpenGL is enhanced to a common base by taking these extensions and making them part of the next version of the standard.

But for some bizarre reason, software types view standardization as just a giant design process (except design by committee, an extremely political committee). If HTML and CSS were to follow normal standardization procedures, for example, Firefox, Opera, Chrome, Safari and even IE would be free to extend HTML however they want, and then every couple of years the best extensions from all would be combined and rolled into the next version of HTML.

The IETF is the correct body for VP8, because VP8 doesn't need standardization. There are no multiple competing implementations that need to be brought into alignment. It exists, it works, fait accompli. This is the process by which most successful Internet protocols were created. Maybe in the future when people have new ideas about how VP8 can be enhanced, it'll need a standardization process. But for the time being, all we need are the details, published openly and clearly, so anybody can implement it.

Standardization is about evolution, not intelligent design.

Re:Venue choice? (1)

mlingojones (919531) | more than 3 years ago | (#34965192)

If HTML and CSS were to follow normal standardization procedures, for example, Firefox, Opera, Chrome, Safari and even IE would be free to extend HTML however they want, and then every couple of years the best extensions from all would be combined and rolled into the next version of HTML.

That's pretty much what happens with HTML and CSS. The canvas element in HTML5 and the transform property in CSS3 were initially created and implemented by Apple in WebKit, and later adopted by the W3C.

Re:Venue choice? (2)

arose (644256) | more than 3 years ago | (#34966116)

True, but neither of those (and video came from Opera, no?) generally had incompatible implementations that needed to be reconciled. (Not that reconciliation isn't possible, but just dropping one implementation in favor of another is not unheard of, like IndexedDB vs. Web SQL.) They just tend to get tweaked (or not) and adopted as-is if they work well. I think that is the point jpmorgan was trying to make: ISO generally relies on working groups that try to make their stuff work together, while the IETF generally tends to document existing stuff so that other people can adopt it as-is.

Let us be honest about H.264 (2)

westlake (615356) | more than 3 years ago | (#34965534)

Is there any significance to the fact that Google chose IETF instead of ISO (where MPEG-LA and M$ submitted H.264 and OOXML)?

Let us be honest about H.264. Where it comes from and how it is used.

H.264/MPEG-4 AVC is a block-oriented motion-compensation-based codec standard developed by the ITU-T Video Coding Experts Group (VCEG) together with the ISO/IEC Moving Picture Experts Group (MPEG). It was the product of a partnership effort known as the Joint Video Team (JVT). The ITU-T H.264 standard and the ISO/IEC MPEG-4 AVC standard (formally, ISO/IEC 14496-10 - MPEG-4 Part 10, Advanced Video Coding) are jointly maintained so that they have identical technical content. H.264/MPEG-4 AVC [wikipedia.org]

VCEG was preceded in the ITU-T (which was called the CCITT at the time) by the "Specialists Group on Coding for Visual Telephony" chaired by Sakae Okubo (NTT) which developed H.261. The first meeting of this group was held Dec. 11-14, 1984 in Tokyo, Japan. In 1994, Richard Shaphorst (Delta Information Systems) took over new video codec development in ITU-T with the launch of the project for developing H.324. Schaphorst appointed Karel Rijkse (KPN Research) to chair the development of the H.263 codec standard as part of that project. In 1996, Schaphorst then appointed Gary Sullivan (PictureTel, since 1999 Microsoft) to launch the subsequent "H.263+" enhancement project, which was completed in 1998. In 1998, Sullivan was made rapporteur (chairman) of the question (group) for video coding in the ITU-T that is now called VCEG. After the H.263+ project, the group then completed an "H.263++" effort, produced H.263 Appendix III and H.263 Annex X, and launched the "H.26L" project with a call for proposals issued in January 1998 and a first draft design adopted in August 1999. In 2000, Thomas Wiegand (Fraunhofer HHI) was appointed as an associated rapporteur (vice-chairman) of VCEG. Sullivan and Wiegand led the H.26L project as it progressed to eventually become the H.264 standard after formation of a Joint Video Team (JVT) with MPEG for the completion of the work in 2003. (In MPEG, the H.264 standard is known as MPEG-4 part 10.) Since 2003, VCEG and the JVT have developed several substantial extensions of H.264, produced H.271, and conducted exploration work toward the potential creation of a future new "H.265". In January 2010, the Joint Collaborative Team on Video Coding (JCT-VC) was created as a group of video coding experts from ITU-T Study Group 16 (VCEG) and ISO/IEC JTC 1/SC 29/WG 11 (MPEG) to develop a new generation video coding standard.

In July 2006, the video coding work of the ITU-T led by VCEG was voted as the most influential area of the standardization work of the CCITT and ITU-T in their 50-year history. The image coding work that is now in the domain of VCEG was also highly ranked in the voting, placing third overall.

The organization now known as VCEG has standardized (and is responsible for the maintenance of) the following video compression formats and ancillary standards:

H.120: the first digital video coding standard. v1 (1984) featured conditional replenishment, differential PCM, scalar quantization, variable-length coding and a switch for quincunx sampling. v2 (1988) added motion compensation and background prediction. This standard was little-used and no codecs exist.
H.261: was the first practical digital video coding standard (late 1990). This design was a pioneering effort, and all subsequent international video coding standards have been based closely on its design.
H.262: it is identical in content to the video part of the ISO/IEC MPEG-2 standard (ISO/IEC 13818-2). This standard was developed in a joint partnership between VCEG and MPEG, and thus it became published as a standard of both organizations. ITU-T Recommendation H.262 and ISO/IEC 13818-2 were developed and published as "common text" international standards. As a result, the two documents are completely identical in all aspects.
H.263: was developed as an evolutionary improvement based on experience from H.261, and the MPEG-1 and MPEG-2 standards. Its first version was completed in 1995 and provided a suitable replacement for H.261 at all bitrates.
H.263v2: also known as H.263+ or as the 1998 version of H.263, is the informal name of the second edition of the H.263 international video coding standard. It retains the entire technical content of the original version of the standard, but enhances H.263 capabilities by adding several annexes which substantially improve encoding efficiency and provide other capabilities (such as enhanced robustness against data loss in the transmission channel). The H.263+ project was completed in late 1997 or early 1998, and was then followed by an "H.263++" project that added a few more enhancements in late 2000.
H.264: Advanced Video Coding (AVC) is the newest entry in the series of international video coding standards. It is currently the most powerful and state-of-the-art standard, and was developed by a Joint Video Team (JVT) consisting of experts from ITU-T's Video Coding Experts Group (VCEG) and ISO/IEC's Moving Picture Experts Group (MPEG) created in 2001. The ITU-T H.264 standard and the ISO/IEC MPEG-4 Part 10 standard (formally, ISO/IEC 14496-10) are technically identical. The final drafting work on the first version of the standard was completed in May 2003. As has been the case with past standards, its design provides the most current balance between the coding efficiency, implementation complexity, and cost based on state of VLSI design technology (CPUs, DSPs, ASICs, FPGAs, etc).
H.264.1: Conformance testing for H.264
H.264.2: Reference software for H.264
H.265: Not yet developed; expected 2012 or later. [HEVC]
H.271: Video back channel messages for conveyance of status information and requests from a video receiver to a video sender
For further information about the image coding work now in the domain of VCEG, see the Joint Photographic Experts Group article. Video Coding Experts Group [wikipedia.org]

The full list of MPEG LA licensors of H.264 technology: AVC/H.264 Licensors [mpegla.com] None are spelled or abbreviated here with a "$" sign except Microsoft - even though the list is dominated by global industrial cartels the size of Mitsubishi.

Where playing corporate hardball is not unknown.

All but Microsoft are represented here by their familiar, established, corporate logos. No Borg icons. No stained glass windows. It helps clear your head before posting.

For a short list of H.264 applications, badly in need of an update, and mostly excluding streaming video services like Netflix, CCTV and the corporate intra-net: List of video services using H.264/MPEG-4 AVC [wikipedia.org]

Source code is fine! (5, Insightful)

multipartmixed (163409) | more than 3 years ago | (#34964758)

Well, you know, as long as it's not terrible code.

Once upon a time, the RFC for IP and the BSD code base (that *everyone*) used differed in some subtle way. W. Richard Stevens was the first guy to notice, years after both were written.

Guess what happened? They changed the standard.

Re:Source code is fine! (-1)

Anonymous Coward | more than 3 years ago | (#34964928)

Nothing personal, multipart/mixed, but that's the fourth misplaced parenthesis I've seen today, and I'm about to start a revolution over it.

For anyone else considering using parentheses (or having already used them, in an editable context): Please remember that the sentence should remain grammatically correct when you remove the parenthetical; if it's got a word missing or an extra word, that probably means YOU PUT A PARENTHESIS IN THE WRONG FUCKING PLACE!

Re:Source code is fine! (0)

Anonymous Coward | more than 3 years ago | (#34965058)

Nothing personal, but I continue to see incorrect capitalization after colons, and I'm about to start a revolution over it.

The word following a colon shouldn't be capitalized, as if the clause were its own sentence, unless the word would be capitalized if it weren't the first word of the sentence (such as a proper noun). If a word is capitalized after a colon, it probably means YOU CAPITALIZED THE WRONG FUCKING WORD!

But, seriously, is it worth being pedantic over such small details? I mean, come on. Misplaced (or missing) commas can change the interpretation of sentences, and those mistakes happen all of the time. It was completely obvious what the parent was conveying, which is more than I can say for many tweets and texts that I see. So, pat yourself on the back for having a firm grasp on grammar, but for minor mistakes that don't impact the reader's understanding of what's written... let it go.

Re:Source code is fine! (0)

Anonymous Coward | more than 3 years ago | (#34965272)

It made me chuckle that, on a thread regarding standardization, you're so quick to chastise someone asking that the rules of a standard (i.e. written English) be followed. Do you also disregard minor details of technical standards when it suits you?

Re:Source code is fine! (1)

Anonymous Coward | more than 3 years ago | (#34965612)

Interestingly, yes.

But, first, English is less a standard than a moving target. For example, the MLA standard for citing references changes every year. American English didn't always place periods inside the quotations either; it evolved. Today, the phrase "begs the question" is usually accepted as a replacement for "raises the question," instead of referring to a specific fallacy. If English is to be considered a standard at all, it's one that's constantly changing. But let's move on to answering your question more directly.

Here's an example where I might disregard the rules of standard English. To wipe your hard drive clean, enter the command "dd if=/dev/zero of=/dev/sda". Technically, having the period outside the quotation marks is incorrect; however, it clarifies that the period should not be entered by the user. Here, a rule for English gets in the way of what I want to communicate, so I ignore it.

Let's do something more technical. Take the original TFTP specification. It introduces a condition where packet looping could occur between server and client (an issue addressed in a later revision). Should astute coders have disregarded minor details in the standard to address that oversight if it suits them? Yes. Yes, they should.

Standards, including the written rules for English, are there for a reason. The reasons for standardizing English are the same as for TFTP, HTTP, or any other protocol: if everyone agrees on the rules, communication becomes easier. However, when those standards get in the way of effective communication, they should be ignored. Still other times, portions of standards are outdated and it no longer makes sense to follow them, since doing so does nothing to enhance communication. For example, paragraph indents are increasingly avoided, having been largely replaced by double-spacing between paragraphs; the indent that alerts the reader a new paragraph has begun is no longer necessary.

Sorry, let's get back to my point. The point of standardization in English is communication. The grammatical error by multipart/mixed did nothing to detract from the point he was making. Therefore, calling him out (unless m/m happens to be your English student) was unnecessary, since the meaning was clear. Just as unnecessary was my complaint about capitalization; correct grammar wouldn't have done anything to enhance the meaning of what was written.

By the way, the basis for your question is covered by several fallacies.

The first is the generalization fallacy, claiming that two situations are highly similar, when they aren't. Is it so easy to assume that breaking the rules of one standard automatically means that it's okay to break the rules of others? You could have written "Do you also disregard the societal standards surrounding murder when it suits you?" English standards and technical standards are very different things.

Similarly, and for much the same rationale, it's a slippery slope fallacy. There's an assumption that disregarding one standard leads to disregarding another, more important standard.

The last fallacy that jumped out at me is the "begging the question" fallacy. There is the subtle premise that disregarding minor details of technical standards is a bad thing.

Anyway, I've been working all morning, so I appreciate the break. :)

Re:Source code is fine! (0)

Anonymous Coward | more than 3 years ago | (#34965654)

Written English? Standard? Who are you kidding?

Re:Source code is fine! (0)

Anonymous Coward | more than 3 years ago | (#34965288)

But, seriously, is it worth being pedantic over such small details?

I don't think it really counts as being pedantic if it isn't over small details.

Re:Source code is fine! (0)

Anonymous Coward | more than 3 years ago | (#34965646)

http://www.answers.com/topic/pedantic [answers.com] disagrees with you. Being pedantic doesn't require that details be small, it refers to (usually overly strict) adherence to formal rules.

Re:Source code is fine! (1)

iluvcapra (782887) | more than 3 years ago | (#34965674)

You need a standard to define what's a valid and invalid file. Having only the source code would allow people (including Google) to create WebM encoders which are nominally compatible with the open-source decoder but "extend" the format by inserting new data in a nonstandard way, which might then be available only to the (perhaps paying) users of their platform. Thus fragmentation of WebM, perhaps eventually leading to a situation where only one company's reader and writer are usable in most practical circumstances.

Re:Source code is fine! (1)

multipartmixed (163409) | more than 3 years ago | (#34966970)

Source code is nothing more than a rigorous specification. In fact, it is so rigorous that you can write a translator which automatically translates the specification into a binary you can run on your platform.

No specification will stop a vendor from writing proprietary extensions. If they publish updated source code, then whoever wants to can catch up.

Also, isn't the VP8 source code freely usable?
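The difference between "spec as code" and "spec as document" shows up at the byte level. As a toy illustration of the kind of fact a written bitstream document pins down: WebM uses the Matroska/EBML container, whose files begin with a fixed four-byte EBML magic number. A Python sketch (the function name is illustrative; a real probe would also read the EBML DocType element to distinguish WebM from generic Matroska):

```python
# WebM is a Matroska/EBML container; every EBML document starts with
# this four-byte magic number identifying the EBML header element.
EBML_MAGIC = b"\x1a\x45\xdf\xa3"

def looks_like_webm(path: str) -> bool:
    """Crude check: does the file begin with the EBML magic?

    This only identifies an EBML/Matroska container, not the codec
    inside it; checking the DocType element is left out for brevity.
    """
    with open(path, "rb") as f:
        return f.read(4) == EBML_MAGIC
```

A written spec makes a fact like this normative; with only source code, it is an implementation detail you must reverse-engineer.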

So what does this mean? (1)

happymellon (927696) | more than 3 years ago | (#34964768)

Google could have just released documentation providing the specification, so how does the IETF help? So that they can call it IETF.4628 (or however IETF standards are named)? Or are they looking to make it an internet standard, rather than just a video standard like H.264?

WebM will never catch on (0, Troll)

javacowboy (222023) | more than 3 years ago | (#34964864)

Why?

1) WebM/VP8 probably infringes on several MPEG-LA patents (I don't agree with software patents, but U.S. courts do).
2) Google has not offered to indemnify anybody who uses WebM.
3) Mobile hardware has H.264 compatibility built in; not so for WebM.
4) The media companies have encoded their content in H.264; they can't be bothered to re-encode it in WebM.

This is the same reason that Linux won't catch on on the desktop (arguably, the only reason). Media companies (RIAA, MPAA, and game publishers) will never support these open formats (OGG Vorbis, OGG Theora, WebM). The developers of the closed formats (MP3, H.264, GIF) will insist on getting paid one way or the other, which means their formats can't be natively supported by an open source OS.

Re:WebM will never catch on (1)

mrnobo1024 (464702) | more than 3 years ago | (#34964912)

1) WebM/VP8 probably infringe on several MPEG-LA patents (I don't agree with software patents, but U.S. courts do)
2) Google has not offered to indemnify anybody who uses WebM.

H.264 infringes on a patent I own. When adoption is sufficient, I will sue everyone who uses it (MPEG-LA doesn't indemnify its users against outside patent claims either).

(sure, I'm probably lying, but can you prove it?)

Re:WebM will never catch on (1)

javacowboy (222023) | more than 3 years ago | (#34964970)

H.264 infringes on a patent I own. When adoption is sufficient, I will sue everyone who uses it (MPEG-LA doesn't indemnify its users against outside patent claims either).

How many lawyers do you have, how much do you pay them, and how good are they?

Re:WebM will never catch on (1)

rtfa-troll (1340807) | more than 3 years ago | (#34966336)

How many lawyers do you have, how much do you pay them, and how good are they?

Hi; I'm mrnobo1024's partner in this. Our IP budget for 2011 is about 10 MUSD, but for 2012 we have dedicated 85 billion dollars. That may seem like quite a bit, but you should know a) that the dollar is expected to fall to about 10% of its current value and b) we also have a bunch of suits lined up against companies using MS Windows. This still means that we have 80% of the top lawyers in IP working directly for us and will have well over a billion 2010 dollars to go after the MPEG-LA licensee list.

We even have a deal with Google to take over their VP2 patents (which cover H.264 and the MPEG-LA doesn't have access to; look it up).

(for a truth value which demonstrates complete partnering synergies with the truth value of mrnobo1024's post)

Re:WebM will never catch on (1)

mlingojones (919531) | more than 3 years ago | (#34965140)

H.264 infringes on a patent I own. When adoption is sufficient, I will sue everyone who uses it (MPEG-LA doesn't indemnify its users against outside patent claims either).

(sure, I'm probably lying, but can you prove it?)

That's ridiculous. You can use that to argue against the adoption of anything. "HTML infringes on a patent I own, don't use it or I'll sue you!" "Keyboards infringe on a patent I own, don't use them or I'll sue you!" "You can't prove me wrong, so I win!"

The burden of proof is on you.

Re:WebM will never catch on (2)

TheRaven64 (641858) | more than 3 years ago | (#34965276)

So apply the same standard to the MPEG-LA. The only definitive statement they've actually made is that they are putting together a pool of patents covering VP8 (not that they actually have any) and that they will later be charging for licenses for this pool. This somehow gets repeated as 'VP8 infringes a load of H.264 patents'.

Re:WebM will never catch on (1)

mlingojones (919531) | more than 3 years ago | (#34965360)

I have. There have been independent analyses that call the patent status of VP8 into question.

http://arstechnica.com/open-source/news/2010/05/google-support-aside-webm-carries-patent-risks-from-mpeg-la.ars [arstechnica.com]

Re:WebM will never catch on (0)

Anonymous Coward | more than 3 years ago | (#34965408)

I would not describe Dark Shikari's rantings (he does not have any patent expertise) as independent analysis.

Re:WebM will never catch on (3, Informative)

DJRumpy (1345787) | more than 3 years ago | (#34965748)

Interesting read, and it brings up two points which need repeating, specifically as to how VP8 does its intra-frame prediction.

From the linked article, Jason Garrett-Glaser (one of the developers of x264) had this to say:

One specific characteristic of the codec that Garrett-Glaser considers particularly prone to patent risks is its handling of a feature called intra prediction. He accuses On2 of cribbing the technology from H.264.

"VP8's intra prediction is basically ripped off wholesale from H.264," he wrote. "This is a patent time-bomb waiting to happen. H.264's spatial intra prediction is covered in patents and I don't think that On2 will be able to just get away with changing the rounding in the prediction modes."

The other interesting point is the fact that the more a company discusses specific patents, the more they legally expose themselves to potential 'willful' violation in patent claims.

Unfortunately, it could be difficult for Google to provide any unambiguous assurances about VP8's patent status. Under US patent law, companies can be forced to pay triple damages for "willful" infringement—cases in which it can be demonstrated that the company was previously aware of a patent that it infringed. As such, publicly discussing specific patents can dramatically increase a company's exposure to liability.

This is probably why no one is saying much of anything until everyone is ready to lay their cards on the table.

Re:WebM will never catch on (0)

Anonymous Coward | more than 3 years ago | (#34966502)

Whoosh... That was the whole point of mrnobo1024's comment. Accusing WebM of probably infringing patents is useless -- both codecs/containers have a risk of patent problems, and trying to paint that as a flaw in only one of them is dishonest. The only thing we can do is try to compare the amount of risk each of these options carries.

Re:WebM will never catch on (1)

Draek (916851) | more than 3 years ago | (#34966562)

That's ridiculous. You can use that to argue against the adoption of anything.

Exactly! Now you see the value of the patent system for companies such as Microsoft and Apple.

Linux gaining too much ground on the corporate arena? "the Linux kernel may or may not infringe on over 200 of our patents, and we may or may not decide to sue everyone that uses it". VP8 threatening their plan for dominance of the online video market? "VP8 may or may not infringe on our patents, and we may or may not decide to sue people over it, eventually".

And the best part? given how corrupt the system is, approving patents left and right regardless of how vague they are or how much prior art there may be, chances are they're actually right and you are infringing on *something* of theirs.

God were those bribes well spent.

Re:WebM will never catch on (1)

Hope Thelps (322083) | more than 3 years ago | (#34964938)

WebM/VP8 probably infringe on several MPEG-LA patents

Which parts of WebM/VP8 do you think probably infringe on which patents, and why?

Re:WebM will never catch on (1)

javacowboy (222023) | more than 3 years ago | (#34964976)

I'm not privy to the technical details, but the U.S. patent office hands out patents to pretty much anybody who asks for them, and I'm willing to bet MPEG-LA has way more patents than Google has.

Re:WebM will never catch on (1)

Hope Thelps (322083) | more than 3 years ago | (#34965088)

So it's just a subset of 'anything you do probably infringes on someone's patent, so don't even wait for them to sue, don't ask for details, just fold to every potential bully'? I guess that's one approach to life. Hope you have fun with it.

Re:WebM will never catch on (0)

Anonymous Coward | more than 3 years ago | (#34965090)

So you just heard it somewhere but you can't actually back it up.

Re:WebM will never catch on (2)

TheRaven64 (641858) | more than 3 years ago | (#34965284)

I'm not privy to the technical details

Why not? VP8 is public and has two independent implementations (libavcodec and libvpx) for you to look at. All patents are public. All H.264 patents are listed clearly on the MPEG-LA's website. It's easy to validate any claim of patent infringement. So, unless you are just spreading FUD, point to the patent and point to the part of VP8 that it infringes.

Re:WebM will never catch on (0)

Anonymous Coward | more than 3 years ago | (#34965506)

I don't know what it is about codecs that brings out complete morons on online boards. It's like they feel comfortable talking about something they haven't even done the basic research on. Programmers rarely do this lest they become publicly humiliated.

Maybe they share a common fault with audiophiles.

Re:WebM will never catch on (0)

Anonymous Coward | more than 3 years ago | (#34965110)

Which parts of WebM/VP8 do you think probably infringe on which patents, and why?

The one called "Simulation of moving pictures using a stream of static images".
If it hasn't been filed yet, I call dibs...</sarcasm>

Re:WebM will never catch on (2)

mlingojones (919531) | more than 3 years ago | (#34965238)

Here you go [arstechnica.com].

From the article:

"VP8's intra prediction is basically ripped off wholesale from H.264," he wrote. "This is a patent time-bomb waiting to happen. H.264's spatial intra prediction is covered in patents and I don't think that On2 will be able to just get away with changing the rounding in the prediction modes."

Re:WebM will never catch on (0)

Anonymous Coward | more than 3 years ago | (#34964962)

1.) Have people forgotten the whole SCO debacle already? I'm not taking MPEG-LA (or anyone with a vested interest in it) at their word that their competitor violates "some patents", not without reference to which patents and where in the code.
2.) This is just point one plus something completely unrealistic.
3.) Oh my, it must have been an awkward time when mobile hardware had H.264 acceleration but nobody used it. Either that or you've got "which comes first" backwards. It's also a good thing that Google has absolutely no pull in the mobile market.
4.) Yes, media companies are well known for sticking to one format indefinitely. And also Google certainly isn't one of the largest servers of video media on the internet.

Re:WebM will never catch on (1)

javacowboy (222023) | more than 3 years ago | (#34965030)

1) SCO was about non-existent copyright claims. MPEG-LA has real patents on video formats and compression.
2) Google hasn't offered to indemnify anybody, as MPEG-LA has. That shows Google has little to no confidence in WebM's patent standing.
3) What are you talking about? Why go through the trouble of encoding to WebM when H.264 is the path of least resistance?
4) YouTube doesn't serve full movies or tv shows. It only serves 10 minute video clips. What's your point exactly?

Re:WebM will never catch on (2)

Hope Thelps (322083) | more than 3 years ago | (#34965148)

Google hasn't offered to indemnify anybody, as MPEG-LA has.

Who exactly are you claiming that MPEG-LA have offered to indemnify, and what do you claim that they have offered to indemnify them against?

From their FAQ [mpegla.com]:

Q: Are all AVC essential patents included?
A: No assurance is or can be made that the License includes every essential patent. The purpose of the License is to offer a convenient licensing alternative to everyone on the same terms and to include as much essential intellectual property as possible for their convenience. Participation in the License is voluntary on the part of essential patent holders, however.

Re:WebM will never catch on (1)

Anonymous Coward | more than 3 years ago | (#34965012)

3) Mobile hardware has H.264 compatibility built-in, not so for WebM,

Will this argument DIAF, please. There are basically NO H.264 decoder ASICs in general mobile use, they're all DSPs with an H.264 codec implemented in software. Since WebM/VP8 is so similar to H.264, it would take quite some effort to create an evil DSP that can decode H.264 but can't be made to decode VP8 with comparable features/resolution/bitrate, and I assure you the general-purpose DSP in your smartphone will handle it just fine.

Re:WebM will never catch on (5, Informative)

TheRaven64 (641858) | more than 3 years ago | (#34965168)

VP8 probably infringe on several MPEG-LA patents

Does it? The MPEG-LA has not produced any patents that it infringes, On2 presumably checked the (easy-to-find) list of MPEG-LA patents before shipping VP8, and the MPEG-LA is currently asking people to come forward with patents that cover VP8 - not something it would need to do if it already had a large pool of them.

Google has not offered to indemnify anybody who uses WebM

The MPEG-LA does not offer indemnity either. This was demonstrated quite well a couple of months ago when MPEG-LA licensees were sued for patent infringement over H.264.

Mobile hardware has H.264 compatibility built-in, not so for WebM

Most 'H.264 hardware' is really a DSP with a few things like [I]DCT in hardware. This same hardware can be used for VP8 (it's typically already used for MPEG-2 and MPEG-4 ASP).

The media companies have encoded their content in H.264, they can't be bothered to re-encode it to WebM

YouTube is owned by Google, and they're going to be making everything WebM soon. I wouldn't be surprised if they only made the low-quality versions H.264 in the future and required WebM for the higher-quality encodings. This would let them keep iPhone users happy (low quality encoding isn't such a problem on a tiny screen), while forcing desktop users to install a WebM plugin.

Re:WebM will never catch on (1)

mlingojones (919531) | more than 3 years ago | (#34965320)

YouTube is owned by Google, and they're going to be making everything WebM soon. I wouldn't be surprised if they only made the low-quality versions H.264 in the future and required WebM for the higher-quality encodings. This would let them keep iPhone users happy (low quality encoding isn't such a problem on a tiny screen), while forcing desktop users to install a WebM plugin.

Three things wrong/silly that I can see with that statement.

  • Few iPhone users use the YouTube website, as there is a native YouTube app preinstalled on each device.
  • Plugins? Argh! That's what we're trying to get away from in the first place!
  • That wouldn't do anything for desktop users anyway, unless they get rid of Flash as well. And as Flash can be used to wrap H.264 video, the path of least resistance to content providers is to just continue to serve H.264 wrapped in Flash, rather than re-encode everything in another, widely unsupported format.

Re:WebM will never catch on (0)

Anonymous Coward | more than 3 years ago | (#34965666)

Few iPhone users use the YouTube website, as there is a native YouTube app preinstalled on each device.

So? If you mean the YouTube app won't work, then it's up to the implementor to provide the necessary WebM decoder.

Plugins? Argh! That's what we're trying to get away from in the first place!

IE and Safari would require a WebM plugin. Other browsers won't. H.264 has far worse support. IE is the only Windows browser with it. IE users stuck on Windows XP lack it. Linux users are doomed without Flash, which they don't want to use anyway.

Are there even any well-known plugins which provide H.264 for Opera, Firefox, and Chromium?

At least WebM will have an officially sponsored and painless option for installing.

That wouldn't do anything for desktop users anyway, unless they get rid of Flash as well. And as Flash can be used to wrap H.264 video, the path of least resistance to content providers is to just continue to serve H.264 wrapped in Flash, rather than re-encode everything in another, widely unsupported format.

What does that have to do with YouTube?

Re:WebM will never catch on (1)

bill_mcgonigle (4333) | more than 3 years ago | (#34965758)

Few iPhone users use the YouTube website, as there is a native YouTube app preinstalled on each device.

What does that have to do with how Google stratifies their encodings?

Re:WebM will never catch on (0)

Anonymous Coward | more than 3 years ago | (#34965774)

The path of least resistance is going to be WebM + Flash. When Flash 11 is released with support for WebM, all browsers will instantly support WebM if they have current Flash (which most people do).

Re:WebM will never catch on (1)

Draek (916851) | more than 3 years ago | (#34966612)

Well, a plugin would only be required for IE users, and only until the sheer momentum of Chrome, Mozilla and Opera forces them to implement it anyway. They don't have much to gain from H.264, so it's unlikely they'd continue the holy war to the end, unlike Apple.

Re:WebM will never catch on (1)

Tacvek (948259) | more than 3 years ago | (#34966854)

IE would not need a browser plugin. All it would need is the proper DirectShow/Media Foundation filters and codecs, and it would simply support WebM (this would also cause Windows Media Player, and many other Windows apps, to support WebM). Likewise, Safari would merely need the proper QuickTime codecs for WebM to just work.

Graphics Card Manufacturers (1)

bill_mcgonigle (4333) | more than 3 years ago | (#34965786)

Most 'H.264 hardware' is really a DSP with a few things like [I]DCT in hardware. This same hardware can used for VP8 (it's typically already used for MPEG-2 and MPEG-4 ASP).

At what level is the h.264 decoding provided by the graphics card manufacturers? Do you get a show_h264(x1,y1,x2,y2,&stream_buffer) function or are only those hardware transforms exposed and you have to ship your own decoder?

I get that the DSP can handle WebM's essential bits, but how much buy-in is required from the graphics card manufacturers to make existing products thus capable?

Re:Graphics Card Manufacturers (1)

Tacvek (948259) | more than 3 years ago | (#34966914)

To the best of my knowledge, full-blown PCs generally don't have DSPs for that sort of thing, but simply use GPGPU techniques, which could be done even without GPU manufacturer support, though I believe for some common codecs they do provide special support.

However, for mobile products, the DSP is often completely separate from the 3D graphics chip, and is often built into the CPU. (Well, not built into the ARM core, but still on the same piece of silicon.)

Sample code for accelerating common codecs is often supplied, which could be modified to support WebM too by somebody familiar with Codecs and DSPs.

Re:WebM will never catch on (0)

Anonymous Coward | more than 3 years ago | (#34966428)

Does it? The MPEG-LA has not produced any patents that it infringes, On2 presumably checked the (easy-to-find) list of MPEG-LA patents before shipping VP8, and the MPEG-LA is currently asking people to come forward with patents that cover VP8 - not something it would need to do if it already had a large pool of them.

Welcome to the US legal system. You never say which patents something infringes on before you sue, since that just allows them to stop infringing. You never look for which patents an invention might infringe on, because that triples the damage money.

Plus, there's no point in suing Google over WebM yet, since Google has a patent library of their own and this just invites a patent war between Google and everyone else. Best instead to wait and sue anyone who licenses WebM and who won't be able to start a patent war. Like, say, you.

So YouTube can probably get away with using WebM. Firefox? No. FFmpeg? No. You? No.

WebM is a ticking timebomb, one that's best to stay away from.

Re:WebM will never catch on (0)

Anonymous Coward | more than 3 years ago | (#34966866)

If Firefox gets sued over WebM patents, I have a feeling the litigating party's lawyers will start mysteriously dying.

Re:WebM will never catch on (1)

javacowboy (222023) | more than 3 years ago | (#34966634)

Does it? The MPEG-LA has not produced any patents that it infringes, On2 presumably checked the (easy-to-find) list of MPEG-LA patents before shipping VP8, and the MPEG-LA is currently asking people to come forward with patents that cover VP8 - not something it would need to do if it already had a large pool of them.

Why give Google the opportunity to work around those patents? Also, Sun took years to open source Java, yet Google open sourced VP8 in months, indicating to me that Google was sloppy and didn't do their due diligence.

The MPEG-LA does not offer indemnity either. This was demonstrated quite well a couple of months ago when MPEG-LA licensees were sued for patent infringement over H.264.

MPEG-LA indemnifies users for the patents they own, not for patents outside their patent pool, which is way more than Google is offering to do.

Most 'H.264 hardware' is really a DSP with a few things like [I]DCT in hardware. This same hardware can used for VP8 (it's typically already used for MPEG-2 and MPEG-4 ASP).

More hoops to jump through.

YouTube is owned by Google, and they're going to be making everything WebM soon. I wouldn't be surprised if they only made the low-quality versions H.264 in the future and required WebM for the higher-quality encodings. This would let them keep iPhone users happy (low quality encoding isn't such a problem on a tiny screen), while forcing desktop users to install a WebM plugin.

I'm going to get modded down as a troll again for saying this (even though every one of my posts on this topic has been sincere, and labelled "troll" by reactionary Slashdotters), but Google doesn't own any of the content outside of users' home videos. The RIAA, MPAA, and game studios produce most of the content that people are interested in, and that comprises more than 10-minute clips, and they're not going to re-encode in WebM. Even Apple couldn't bring those companies to their knees, so what makes you think Google will?

Re:WebM will never catch on (3, Insightful)

TheRaven64 (641858) | more than 3 years ago | (#34966784)

Also, Sun took years to open source Java, yet Google open sourced VP8 in months, indicating to me that Google was sloppy and didn't do their due diligence.

Do you have any idea how meaningless that comparison is? The problem with open sourcing Java was that some parts were owned by Sun, some were licensed from third parties, and the licensed parts had to be either relicensed or replaced. In contrast, On2 was already shipping VP8 and had been working on it - specifically working around patents to produce it - since before Google bought them.

It's also worth pointing out that, not only are you comparing completely unrelated things, you are comparing completely unrelated sizes of things. The Java code is a couple of orders of magnitude bigger than the VP8 code.

MPEG-LA indemnifies users for the patents they own, not for patents outside their patent pool, which is way more than Google is offering to do.

Uh, what? You don't actually know the difference between indemnifying and licensing, do you? MPEG-LA and Google both offer licenses to their patents. MPEG-LA has a complex fee scale; Google provides you with a 'perpetual, worldwide, non-exclusive, no-charge, royalty-free, irrevocable' license (quoted from the patent license itself) to all of the patents that they own, or will acquire in the future, related to VP8.

Neither indemnifies you against damages from infringing on third-party patents.

More hoops to jump through.

Yes, shockingly, you actually do need to write some code to support new features. Not much (the libavcodec implementation is about 1,400 lines of code, reusing existing decoder building blocks), but slightly more than none.

I'm going to get modded down as a troll again for saying this (even though every one of my posts on this topic has been sincere, and labelled "troll" by reactionary Slashdotters), but Google doesn't own any of the content outside of users' home videos. The RIAA, MPAA, and gave studios produce most of the content that people are interested in, and that comprises more than 10 minute clips, and they're not going to re-encode in WebM. Even Apple couldn't bring those companies to their knees so what makes you think Google will?

You've not visited YouTube recently, have you? They stream TV shows [youtube.com] and movies [youtube.com] (most of them only in the USA, currently), with the consent and cooperation of the studios that own them. Google provides all of the infrastructure for this, including the choice of format.

Re:WebM will never catch on (0)

Anonymous Coward | more than 3 years ago | (#34966996)

The RIAA, MPAA, and game studios produce most of the content that people are interested in, and that comprises more than 10-minute clips

If that were true, YouTube wouldn't be popular at all. The majority of the videos viewed on YouTube are made by people. That was the founding principle of YouTube.

but Google doesn't own any of the content outside of users' home videos

and they're not going to re-encode in WebM.

You're an idiot. Google owns no videos on YouTube: the copyright holder retains ownership. The uploader signs an agreement with Google allowing Google to use that video for any purposes. That includes re-encoding the video to another format or resolution. Re-encoding probably doesn't even constitute a derivative work since copyright is about whether something looks the same, not whether it's a byte-for-byte copy.

If Microsoft was doing this (0)

js3 (319268) | more than 3 years ago | (#34964930)

If microsoft was doing what google is attempting to do we would all be screaming bloody murder

Re:If Microsoft was doing this (4, Insightful)

drinkypoo (153816) | more than 3 years ago | (#34965092)

If microsoft was doing what google is attempting to do we would all be screaming bloody murder

Google produces open source, makes Linux software, gives away free web services, and doesn't care if you block ads, which would be trivial to detect and act upon given Google's architecture. Microsoft has been convicted of abuse of their monopoly position. It is utterly unreasonable to treat Google the same as convicted criminal Microsoft.

Re:If Microsoft was doing this (1)

Rockoon (1252108) | more than 3 years ago | (#34965254)

Are you saying that if Microsoft was doing this, then it could not be tolerated?

I mean, that's what it sounds like you are saying: that Google can do this, and it's awesome, but it cannot be tolerated if we do a 's/Google/Microsoft/'?

Re:If Microsoft was doing this (0)

Anonymous Coward | more than 3 years ago | (#34965556)

No, he's saying that Microsoft is nowhere close to Google in terms of how they do business. In fact, Microsoft is part of the MPEG-LA, which is backing the patent-encumbered, royalty-based codec that would make money licensing supposedly "open standards", instead of the open source codec Google is backing, which would be free for everyone involved.

Back in my day people would be screaming bloody murder about what Microsoft is doing (like we did with IE) and support the people pushing for open standards. People around here used to care about this kind of stuff.

We don't have to go through the same crap again if people would just stop thinking about the cheap plastic with planned obsolescence in their pockets and think long term for a few seconds. This is about a future standard, and we don't want to saddle an entire generation with this crap again like Microsoft did with IE.

Open standards should be open!

Re:If Microsoft was doing this (1)

Rockoon (1252108) | more than 3 years ago | (#34965662)

The GP pointed out that if Microsoft was doing this, people would be "screaming bloody murder"

So far, the comments (such as yours) seem to be about why Microsoft is evil, and not about why this move from Google should not be considered bad.

If Microsoft did this very thing, would you be rejoicing the move or would you be screaming bloody murder?

He's been asked and hasn't answered. Now you have been asked.

Re:If Microsoft was doing this (0)

Anonymous Coward | more than 3 years ago | (#34967100)

Remember when Microsoft granted free Windows licenses to Russian non-profits to stop police crackdown? That was universally praised on /., despite how hated Microsoft is and how they were endorsing Windows.

So yeah, Microsoft is capable of good. Too bad it's rare when they do it.

Re:If Microsoft was doing this (1)

Anonymous Coward | more than 3 years ago | (#34965716)

No. He's saying that Microsoft has never done this. I wish Microsoft were doing this, because they would have a lot more weight than Google does (though Google would undoubtedly support Microsoft in that occasion).

Microsoft not being evil --- the impossible dream (0)

Anonymous Coward | more than 3 years ago | (#34966536)

You're making a strawman argument that if Microsoft had done the same thing as Google we would still not trust them.

It's a strawman argument because Microsoft DOES NOT do the same things as Google. Have you not kept track of the whole story behind OOXML and ECMA (which is an *INDUSTRY* standard, most certainly PATENT ENCUMBERED) and ISO? Have you not kept track of the licensing restrictions on implementing .NET by anyone other than MICROSOFT? What about Silverlight? Microsoft has a TRACK RECORD of screwing everyone else. They are always looking out for #1, and that's them. After watching their game play out for 20 years, they've convinced me of one thing: they believe life is a zero-sum game and they're going to do everything they can to win.

If you're not a shill, or a troll, and are honestly ignorant of the whole situation, then do yourself and everyone else a favor and do some research on Microsoft. Start with Wikipedia. Look up the history on Internet Explorer/Spyglass, CPU licensing, banning of multiboot, DR-DOS, "Lotus won't run", Windows vs. OS/2 Warp, the original author of QDOS and how he was screwed, Paul Allen and how he was screwed, Corel and how they were screwed, AOL and frickin' stupid desktop icons and how they were screwed, the antitrust lawsuit and Mr. Gates' wonderful performance on video, the whole tie-in of Internet Explorer into the operating system, the doctored videos claiming Internet Explorer couldn't be removed, the attempt to pervert Java and screw Sun first and then the whole world as a result, the whole screw-Netscape thing, the games they played with funding SCO, the Halloween documents, screwing the Taiwanese vendors so they wouldn't put Linux on the netbooks; it's a long litany of screw, screw, screw everyone else.

And after you've done your proper research, please don't ever attempt to compare Google with Microsoft in terms of Evil. It's insulting and offensive.

Re:If Microsoft was doing this (0)

Anonymous Coward | more than 3 years ago | (#34966948)

Interesting thought process: "As long as it's not MSFT." Idiots. Wrongdoing is wrongdoing, circumvention is circumvention (understand the words?). This is not everyone against MSFT. We should look at what is right and proper. Oh, and by the way, when was the last time Adobe, Google or the god Apple were told what to put into their products, or that they have to give up code to others and not receive in kind? You got it: we're talking about MSFT. And what are the evil empires of computing? But this is why I read /. -- just like I watch MSNBC and Fox. And yes, if Microsoft was doing this there would be some vociferous outcry about their circumvention of the 'system'. Hypocrites all!

Re:If Microsoft was doing this (0)

Anonymous Coward | more than 3 years ago | (#34965184)

Even if Microsoft wanted to make a specification public, they would have to write one first.

Re:If Microsoft was doing this (5, Interesting)

Burpmaster (598437) | more than 3 years ago | (#34965246)

If microsoft was doing what google is attempting to do we would all be screaming bloody murder

What, you mean producing a standard that actually matches the implementation and irrevocably granting free use of the necessary patents to everyone? How do you know how people would respond? Microsoft has never done that. They've done the exact opposite, though...

Re:If Microsoft was doing this (3)

gdshaw (1015745) | more than 3 years ago | (#34965596)

If microsoft was doing what google is attempting to do we would all be screaming bloody murder

What Google is doing is far from ideal technically, but they have given us reasonable grounds to believe that their intentions are honourable: code that we can use freely, and a patent grant with no strings attached.

The technical shortcomings can be forgiven in view of the need to challenge H.264 quickly, and the need to work around patents held by others. I wish we had a codec with the technical qualities of H.264 and the legal qualities of VP8, but we don't. H.264 is irrelevant to me if I can't use it for legal or economic reasons.

When Microsoft has done something similar (like .NET, OOXML or ActiveX) there have usually been details in the fine print that either tie the technology to other Microsoft products or make it legally dangerous to use. What they have done in the past is not comparable to what Google is doing now.

Even if Microsoft were to reform their behaviour completely, they would quite rightly be scrutinised very closely because of their past misdeeds.

Re:If Microsoft was doing this (1)

Xtifr (1323) | more than 3 years ago | (#34966072)

If microsoft was doing what google is attempting to do we would all be screaming bloody murder

When Microsoft releases their software for Linux under terms that allow it to be included in Debian, we'll be able to judge the truth of this claim. I'm not sure whether you're right or wrong, but I think hell is likely to freeze over before we'll ever find out! :)

Microsoft's history of backstabbing their partners and doing everything they can to lock in their users makes them a very different proposition from Google. I suspect that at least, some people would be shouting "it's a trap" if Microsoft did something similar. But there's a solid foundation for that fear with MS, and I'm not seeing any similar foundations for similar fears with Google. I may not trust them (Google) much, but I don't see any benefit to them to attempting to manipulate this format the way MS likely would (embrace, extend, extinguish). Nor has Google been running around shouting about various vague unspecified patents that the FLOSS community has supposedly been violating. Just about the opposite, in the case of VP8. Frankly, there are lots and lots of reasons for treating/viewing Google (or almost anyone else) differently from Microsoft. So, I have to conclude that you're either much more paranoid than I, or a troll.

What might be more interesting to contemplate is: what would happen if Novell or Oracle tried something similar. I suspect that would lead to a fair amount of justified paranoia, and shouts of "it's a trap"--but nowhere near as many as if it were Microsoft--and a fair amount of heated debate on the topic. However, Novell is mostly mistrusted because of their ties to MS, so really, the interesting case, IMO, would be Oracle.

For now, though, I'm just happy that there's finally at least one video format I can legally run on my machine without using the bloody flash player, which has been my only way to play videos at all up till now. I'm not too worried about Google's current control, because it depends on the good will of the community, which is perfectly able to take over and fork in extremis--see Xorg/Xfree86, EGCS/GCC, and arguably, MariaDB/MySQL and OOo/LIbreOffice.

Re:If Microsoft was doing this (0)

Anonymous Coward | more than 3 years ago | (#34966456)

It is not just that we would scream bloody murder - we have. http://developers.slashdot.org/article.pl?sid=07/08/02/1648200 [slashdot.org] JPEG XR - something actually progressive, and better than even the related WebP that Google is now trying to push. The best of those comments are about how they don't yet see what is evil about it.

Great, another religious nut (0)

Anonymous Coward | more than 3 years ago | (#34966478)

It's easy to say unfalsifiable stuff like that, since Microsoft never has and never will do anything like this.

While you people make your purely-based-on-faith claims that people would scream bloody murder if Microsoft got friendlier toward software interoperability, meanwhile here in the land of science, progress marches on.

Hey Google? You want to win this war? (5, Interesting)

Anonymous Coward | more than 3 years ago | (#34965286)

Take some of that immense R&D budget that you have and put a team of programmers on the task of getting VP8 encode/decode acceleration via OpenCL/CUDA.
The x264 team is sitting back and saying it can't be done, meanwhile a university has already posted the code for a modified x264 that uses the GPU to accelerate the pyramid search. The race is already started.

If x264 is further improved for GPU support and this makes it into FFmpeg, then the race is over...

Re:Hey Google? You want to win this war? (0)

Anonymous Coward | more than 3 years ago | (#34965768)

That's one of the few things I dislike about Google. They have a ton of great ideas, but they never seem to dedicate enough resources (or put their full weight behind them) to see things fully realized.

Google is like the ADD megacorp of software.

Re:Hey Google? You want to win this war? (1)

oiron (697563) | more than 3 years ago | (#34967300)

I doubt that's going to be particularly difficult for third-party devs to do - the libavcodec version is only 1400 lines. I'm sure there's enough reusable code for decoder blocks floating around to assemble an OpenCL version of it...

boolean entropy coder (1)

hey (83763) | more than 3 years ago | (#34965376)

"essentially the entire VP8 data stream is encoded using a boolean entropy coder."
Well, duh.

Re:boolean entropy coder (1)

Rockoon (1252108) | more than 3 years ago | (#34965610)

An entropy encoder encodes symbols taken from an alphabet of symbols.

A boolean coder is the special case with an alphabet of exactly 2 symbols, which is not required... or even common.

Most compressors use an alphabet of 256 symbols.

Why do people that don't know data compression make ignorant comments about data compression?

Re:boolean entropy coder (0)

Anonymous Coward | more than 3 years ago | (#34967020)

Why do people that don't know data compression make ignorant comments about data compression?

Well what other sort of comments could people who don't know data compression make about data compression?

Smart move, Google... (1)

Qubit (100461) | more than 3 years ago | (#34965508)

Creating an RFC was a very smart move for Google.

First off, remember that when WebM burst onto the scene, Google made it pretty clear that they didn't want to monkey around with improving the VP8 codec. Sure, maybe it could be improved, but they basically said that they just wanted to leave it as-is and have people start using the darn thing. The benefit of having it in use out in the wild outweighed any delays.

So, by submitting VP8 to the IETF as an RFC, they're not (necessarily) revisiting the question of making improvements or tweaks to VP8, but they are making progress on standardizing the codec.

Second, after the stunts with OOXML in ISO, if they actually want everyone to use VP8 in WebM for all of their web video needs, they might just want to avoid those jokers. I don't know too much about the W3C, OASIS, ECMA, or any of those other standards bodies, but I'm not sure any of them would be an appropriate place to standardize a codec anyhow.

Third, IPR (Intellectual/Imaginary Property Rights). The IETF has an RFC [ietf.org] (of course it's an RFC!) about IPR, including a description of who needs to disclose information about possible IPR and when.

Here's an excerpt:

6.1. Who Must Make an IPR Disclosure?

6.1.1. A Contributor's IPR in his or her Contribution

      Any Contributor who reasonably and personally knows of IPR meeting
      the conditions of Section 6.6 which the Contributor believes Covers
      or may ultimately Cover his or her Contribution, or which the
      Contributor reasonably and personally knows his or her employer or
      sponsor may assert against Implementing Technologies based on such
      Contribution, must make a disclosure in accordance with this Section
      6.

      This requirement specifically includes Contributions that are made by
      any means including electronic or spoken comments, unless the latter
      are rejected from consideration before a disclosure could reasonably
      be submitted. An IPR discloser is requested to withdraw a previous
      disclosure if a revised Contribution negates the previous IPR
      disclosure, or to amend a previous disclosure if a revised
      Contribution substantially alters the previous disclosure.

      Contributors must disclose IPR meeting the description in this
      section; there are no exceptions to this rule.

6.1.2. An IETF Participant's IPR in Contributions by Others

      Any individual participating in an IETF discussion who reasonably and
      personally knows of IPR meeting the conditions of Section 6.6 which
      the individual believes Covers or may ultimately Cover a Contribution
      made by another person, or which such IETF participant reasonably and
      personally knows his or her employer or sponsor may assert against
      Implementing Technologies based on such Contribution, must make a
      disclosure in accordance with this Section 6.

I don't think that Google necessarily expects to see the MPEG-LA boys show up in force to describe all of the patents they have that they think read on VP8. Yes, it's too bad. But I don't think it's too unreasonable to believe that anyone who gets involved in a discussion about this RFC will need to disclose any IPR they know about. If Google or someone else can get employees from some of the primary codec-patent-owning companies to be at all involved in the discussion, this could force them to tip their hand regarding patents (or risk legal problems later for failing to disclose them at this time).

Overall, not much work for Google with the possibility of some big payoffs later. Very slick.

Re:Smart move, Google... (1)

arose (644256) | more than 3 years ago | (#34965950)

ECMA is known to rubber-stamp stuff coming from Microsoft (and probably others as well); OOXML went to ECMA first.

Re:Smart move, Google... (1)

larry bagina (561269) | more than 3 years ago | (#34966364)

  1. MPEG-LA (or any other patent holder) only needs to disclose if it participates or contributes.
  2. A jury in Texas, not the IETF, will decide whether there is patent infringement.