Google Funds Ogg Theora For Mobile

Soulskill posted more than 4 years ago | from the swinging-for-the-fences dept.

An anonymous reader writes "Google has decided to fund the development of Theora optimized for ARM processors. The article on the Open Source at Google blog notes the importance of having a universal baseline video codec for the Web: 'What is clear though, is that we need a baseline to work from — one standard format that (if all else fails) everything can fall back to. This doesn't need to be the most complex format, or the most advertised format, or even the format with the most companies involved in its creation. All it needs to do is to be available, everywhere. The codec in the frame for this is Ogg Theora, a spin off of the VP3 codec released into the wild by On2 a couple of years ago.'"

weed fp (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#31799380)

fp on weed

Dirac (3, Interesting)

xZgf6xHx2uhoAj9D (1160707) | more than 4 years ago | (#31799388)

This is awesome! Not to detract from it, but why is there so much more love for Theora than for Dirac?

Re:Dirac (3, Informative)

EdZ (755139) | more than 4 years ago | (#31799406)

Because Theora is much further along in development than Dirac?

Re:Dirac (1)

sznupi (719324) | more than 4 years ago | (#31799866)

Only partially true. Setting aside the surrounding infrastructure that isn't present in the PC world, Dirac the codec does seem to be ready, "production" ready even. The BBC apparently uses it for internal needs and transmission.

Re:Dirac (1, Interesting)

Anonymous Coward | more than 4 years ago | (#31800142)

The BBC uses Dirac mainly for lossless video production.

This is different from the typical web usage of the Theora codec.

There is room for both, and they will most likely help each other over time.

Re:Dirac (0)

Anonymous Coward | more than 4 years ago | (#31799416)

I want it to have a steam engine!

Re:Dirac (3, Informative)

kg8484 (1755554) | more than 4 years ago | (#31799474)

I think a big reason is that the Xiph project has a few other successful codecs developed in-house. Besides Vorbis, their MP3 alternative, Speex [speex.org] and FLAC [sourceforge.net] are "under the Xiph.org banner" [wikipedia.org]. This allows them to promote Theora more. Also, Dirac was released in 2008 vs. Theora's 2004, so Theora has had four more years to build a following.

Re:Dirac (3, Informative)

Anonymous Coward | more than 4 years ago | (#31799494)

> why is there so much more love for Theora than for Dirac?

In order to play Flash video, or Silverlight video, browsers need a plugin.

Theora/HTML5 video plays in Firefox, Opera and Google Chrome without any plugin; IE alone requires one.

(You can download that plugin for IE from here: http://code.google.com/chrome/chromeframe/ )

Ogg Vorbis, Speex, Theora and FLAC files can play on Windows and Linux platforms.

(Linux support is out-of-the-box, and you can get the support for Windows from here: http://www.xiph.org/dshow/ )

This means that Theora is supported on most desktops, laptops and netbooks. Say 90% or more.

There are 300,000 Theora videos on openvideo.dailymotion.com.

Theora is the video codec for Wikipedia:

http://videoonwikipedia.org/

All of this means that Theora is infinitely better-supported, right now, today, than is Dirac.
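
To give an idea of how little markup is involved, here is a minimal sketch (the file names are made up):

      <video controls width="640" height="360">
        <!-- Ogg Theora+Vorbis source; plays natively in Firefox, Opera and Chrome -->
        <source src="clip.ogv" type='video/ogg; codecs="theora, vorbis"'>
        <!-- Shown only by browsers without HTML5 video support -->
        Your browser can't play HTML5 video; <a href="clip.ogv">download the clip</a> instead.
      </video>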

Re:Dirac (1)

tepples (727027) | more than 4 years ago | (#31800000)

In order to play Flash video, or Silverlight video, browsers need a plugin.

That's less of a disadvantage if PC makers install the plug-in on new PCs. This appears to be the case at least for Flash Player.

Theora/HTML5 video can play in Firefox, Opera, Google Chrome (without any plugin) and IE (this browser alone requires a plugin).

Doesn't Safari need the XiphQT plug-in [xiph.org] too?

Re:Dirac (5, Informative)

TheRaven64 (641858) | more than 4 years ago | (#31799552)

CPU load. Theora is based on VP3, which is old. It was open sourced in 2004, but VP3 first shipped in 2000. Back in 2000, I had a 450MHz K6-2, and a lot of people I knew had slower machines. Now, a typical handheld is faster than that machine. Theora, like VP3, relies a lot on postprocessing passes for quality. This has the advantage that you can just not bother on slower machines, and get a slightly worse picture but with a lower CPU requirement.

Dirac, in contrast, needs at least a 2GHz CPU to play back. It's patent free and looks great, but the CPU load is huge. There have been efforts to offload a lot of it onto the GPU, which is nice for the desktop but doesn't help older machines and handhelds (except the latest generation). The BBC is working with vendors to get Dirac implemented in hardware, but it won't be ubiquitous for quite a few years.

Dirac also doesn't perform as well as Theora at low bitrates. This is very important for web streaming. Dirac is great for situations where bandwidth and CPU power are plentiful, but Theora makes more sense as a lowest common denominator solution. Ideally, you'd see both supported; Dirac for high quality, Theora for fallback.
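
A page can even probe for codec support from script and pick a stream accordingly; a rough sketch using the HTML5 canPlayType API (the MIME strings are the conventional ones for Theora and H.264; clip.mp4 and clip.ogv are placeholders):

      var v = document.createElement('video');
      // canPlayType returns "probably", "maybe" or the empty string
      var theora = v.canPlayType('video/ogg; codecs="theora, vorbis"');
      var h264 = v.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
      // Prefer the higher-quality stream where supported; fall back to Theora
      var src = h264 ? 'clip.mp4' : (theora ? 'clip.ogv' : null);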

Re:Dirac (0)

Anonymous Coward | more than 4 years ago | (#31799554)

Because with Dirac it's impossible to determine simultaneously certain pairs of physical properties, like bitrate and image quality, with any great degree of accuracy or certainty.

Re:Dirac (1)

David Gerard (12369) | more than 4 years ago | (#31799590)

Because the version of Dirac that gives significant advantage is nowhere near usable. I understand it's progressing nicely, though.

Re:Dirac (2, Interesting)

Anonymous Coward | more than 4 years ago | (#31799698)

There are rumors that a number of the major companies, except for Adobe, are moving towards an agreement on Dirac in Ogg containers as the new standard, at least for higher-resolution/bandwidth content. From what I understand, Google is also in favor of Dirac but wants Theora as the fallback codec for mobile devices/phones with less bandwidth and CPU resources, at least for now. Both are great codecs, and it looks like in the long run Dirac may become the standard codec for HD content.

Beyond awesome! (4, Insightful)

KingSkippus (799657) | more than 4 years ago | (#31799714)

This is beyond awesome; it's a game-changer. Google is one of those rare companies that singularly has the power to move markets, and it is revolutionary to see it do so in favor of consumers as it has here. I understand the reasons why it has preferred H.264 over Theora, but it is really nice to see that it also understands the reasons why we should prefer an open format instead. It's especially nice that, in an age of companies wanting to lock everything down and be the gatekeeper to everything, a major player in technology is pushing yet again to open things up.

Sometimes I think that Google is about the only company that "gets it." They understand that more people using the Internet translates to more money in their pocket. Even if those people are not using Google's services directly, they are increasing the market such that collectively, it has more opportunity, which in turn translates into more $$$. They seem to not really care if other people are making more money as well, which really separates them in my mind from other companies, who are of the "it is not enough that I succeed, but everyone else must fail" mentality.

Anyway, back to the topic at hand: one argument I've seen people regurgitate for why H.264 is the right way to go is that it is supported in hardware. Congratulations to Google on working to negate that argument.

Re:Beyond awesome! (0)

koolfy (1213316) | more than 4 years ago | (#31799888)

I understand the reasons why it has preferred H.264 over Theora, but it is really nice to see that it also understands the reasons why we should be preferring an open format instead.

Am I the only one seeing the contradiction here? Really?

Sometimes I think that Google is about the only company that "gets it."

Sometimes I think that Google just didn't "get it" in the first place when choosing H.264 for youtube.

Maybe it was nothing more than a Microsoft-ish tactic to push Google Chrome over Firefox, Opera, Chromium etc.
You know, one of those "if your product isn't used enough, make an everyday platform or technology incompatible with its competitors" moves.

Promotion by incompatibility is the reason I use Linux and look for open standards and technologies, and I don't think Google's contradictory position is helping those.
You see, as long as you use H.264 in real life, promoting Vorbis/Theora is nothing more than hypocrisy and good publicity.

Anyway, there is no way I will forgive Google anytime soon. [flickr.com]

H.264 uses half the bitrate of Theora (2, Insightful)

tepples (727027) | more than 4 years ago | (#31800022)

Sometimes I think that Google just didn't "get it" in the first place when choosing H.264 for youtube.

YouTube started out on Sorenson H.263 because Flash Player supported that out of the box. When iPhone and new versions of Flash Player started to support H.264, YouTube reencoded uploaded videos in the new format. It was a happy accident that Chrome and Safari supported the same codec for the HTML5 <video> element. Now that platforms stuck on Flash 7 (namely Wii) have upgraded to a version with H.264, YouTube appears not to do H.263 anymore. Theora is somewhere between H.263 and H.264 in quality, roughly on par with MPEG-4 part 2 codecs such as DivX and Xvid, but H.264 still uses half the bitrate of Theora for the same perceived quality.

Re:H.264 uses half the bitrate of Theora (0)

Anonymous Coward | more than 4 years ago | (#31800232)

Except that quality is not something that matters on YouTube.

Re:H.264 uses half the bitrate of Theora (2, Informative)

dingen (958134) | more than 4 years ago | (#31800306)

But bandwidth does concern them. With H.264, videos can use half the bandwidth of Theora and look much the same.

Re:Beyond awesome! (1)

Nikker (749551) | more than 4 years ago | (#31800126)

Google's decision to go for H.264 likely had a lot to do with the iPhone/iPod Touch. Apple wanted streaming video and wanted H.264; Google wants mind share and market penetration. Most users have no idea how the content is encoded, but they do know they can go to YouTube to get it. Now that people are lining up at Google's doorstep, the encoding no longer really matters; it is trivial for Google to encode the source in any format they want, so they keep the ball in their court. I would guess Google's goal is to have every device compatible with their service, and they will implement any codec to make that possible. With Theora they now have an ace up their sleeve: if any device cannot afford to license a hardware H.264 encoder/decoder, hardware manufacturers can produce a Theora-based design instead. With little or no licensing cost involved, many companies may choose to implement it as a catch-all solution; it may even benefit content providers, whose licensing costs are zero as well. If and when this becomes popular, Theora will take over from formats like DivX/Xvid in standalone players and other devices. In the end, Google's decision was beneficial both to consumers and to Google itself, since it retained mind share as well as control over that aspect of the market. That is the smarter long-term strategy. If Google had chosen to alienate Apple and not host H.264 content, Apple would likely have been forced to seek another content provider, or become one itself, in which case it would have chosen one it could easily control, distorting mobile media streaming distribution much the way it handles app development on its platform.

So don't be a hater. As long as there is a one-to-many relationship between the content provider and hardware devices, the one will want to keep as many as possible, and the many will seek to connect to the one by the cheapest and easiest method available. If Theora implemented in hardware becomes a commodity item, it will take over many aspects of distribution.

it was likely because of hardware support (1)

YesIAmAScript (886271) | more than 4 years ago | (#31800334)

Yes, the iPhone and iPod Touch mattered. But if Google had chosen Theora and not H.264 (not sure why it's an either/or, but you presupposed this), then YouTube would be a bit player in the mobile market right now, because no mobile device could play it efficiently; there is no Theora support in mobile chips right now.

YouTube's competitors were already supporting H.264 and thus they could work on mobile devices, and Google could have lost the mobile market space to them if they didn't move to cover this weakness.

To me it's strange to think mobile players will move to adding Theora hardware support just as a "backup plan". Transistors aren't free. There are a lot of codecs they already don't support that would bring far more perceived value to the customer before they'd add Theora.

Re:Beyond awesome! (1, Interesting)

jo_ham (604554) | more than 4 years ago | (#31800260)

H.264 is an open standard, so the fox is not crying.

Re:Beyond awesome! (1)

koolfy (1213316) | more than 4 years ago | (#31800308)

How is it open?

Re:Beyond awesome! (2, Informative)

jo_ham (604554) | more than 4 years ago | (#31800536)

It's an open standard. This is well known.

"The ITU-T H.264 standard and the ISO/IEC MPEG-4 AVC standard (formally, ISO/IEC 14496-10 - MPEG-4 Part 10, Advanced Video Coding) are jointly maintained so that they have identical technical content."

Just because it is patented doesn't mean it's not open.

It is the opposite side of the coin from something like WMV, which is proprietary.

Re:Beyond awesome! (2, Insightful)

Randle_Revar (229304) | more than 4 years ago | (#31800494)

It hardly matters if the specs are published, if you can't implement them without paying for patent licenses.

Re:Beyond awesome! (1)

jo_ham (604554) | more than 4 years ago | (#31800614)

That depends. It matters a great deal, I think. It may be worth paying for the licence, and you have the option if you want it, unlike a closed format where your only option is reverse engineering. For systems like GSM, fully patented but open standards are in use that pay royalties to the companies that originally developed them.

It's not always bad if the result is an open, but patented standard.

We can continue to push for royalty free standards and fully OSS-friendly codecs, but dismissing the middle ground (as Mozilla is trying to do by being overly stubborn with H.264) is not helping. I understand their reluctance and their stand, but sometimes you have to compromise for the benefit of all (binary drivers for GPUs in Linux come to mind as another example - not ideal, but small steps, and beneficial results for both parties in the meantime).

OGG newbie question (1)

commodore64_love (1445365) | more than 4 years ago | (#31799736)

I've always used the MPEG-4 codecs, both for audio (AAC+SBR) and video (AVC/H.264), since they can provide quality equal to MP3 or MPEG-2 at half the bitrate.

How does OGG compare to MPEG4?

Re:OGG newbie question (4, Informative)

nxtw (866177) | more than 4 years ago | (#31799874)

How does OGG compare to MPEG4?

Theora is perhaps better than H.263 and MPEG-2 (from the mid 90s), but does not come close to H.264/MPEG-4 AVC or VC-1. (The frozen Theora bitstream format is lacking many features found in H.264 and VC-1.) Results might be similar to H.263+/MPEG-4 ASP.

The Ogg container also has some documented flaws [hardwarebug.org] .

Note that many sites perform misleading or flawed comparisons of the two; for example, they might compare the result of YouTube's H.264 encoder (which optimizes for encoding speed) working from a lossy source against a locally run Theora encode made from a lossless source.

Since OS X 10.6 and Windows 7 come with H.264 decoding, and Windows 7 supports H.264 hardware decoding with compatible hardware from any source, I recommend sticking with H.264. (OS X 10.6's H.264 hardware decoding support appears to be limited to videos played in QuickTime X from MPEG4 or QuickTime container files on systems with nVidia 9400M GPUs or newer, even though Macs with capable GPUs started appearing in 2007.)

Re:OGG newbie question (2, Insightful)

Randle_Revar (229304) | more than 4 years ago | (#31800510)

Ogg may indeed be less than ideal, but that article exaggerates its problems.

Paging Chris DiBona (4, Interesting)

David Gerard (12369) | more than 4 years ago | (#31799392)

Chris DiBona of the Google open source group claimed [whatwg.org] that "If [youtube] were to switch to theora and maintain even a semblance of the current youtube quality it would take up most available bandwidth across the Internet."

This was shown to be false [xiph.org] .

Mr DiBona then mysteriously vanished without trace.

Could he please manifest and either (a) support his claims or (b) concede his error?

Thanks ever so much.

Re:Paging Chris DiBona (-1, Troll)

Anonymous Coward | more than 4 years ago | (#31799422)

DiBona is a fat idiot, end of story.

WHATWG: The worst thing to happen to the Web. (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#31799478)

A decade from now, we'll look back and see WHATWG as the worst thing to have happened to the Web in years. Much worse than IE6, even.

HTML5 is shit, no matter how you look at it. From a technical standpoint, it's rubbish. Many of its new elements have gone out of their way to bring back the combination of presentation and content that we've tried to get rid of for over 15 years now. Others, like canvas, encourage JavaScript to be used more than it ever should be. Furthermore, the audio and video playback will end up as the next-generation marquee or blink element; annoying, misused and hated by all.

HTML5 has been nothing but a toilet for companies like Google, Apple, Mozilla (yes, they're a commercial entity as much as any other these days), Adobe and Microsoft to defecate in. Instead of putting an emphasis on quality technology and a good user experience, they're going to bring a web littered with terrible JavaScript-based "apps", audio and video support that'll end up being used mainly for annoying advertisements, mixed presentation and content thanks to tags like header and footer, along with numerous other problems.

What's worst of all, though, is that XHTML, XForms and other sensible standards are being discarded for something so much worse. Then again, this is exactly what we should expect from web developers. They're not interested in quality or technical superiority, but merely throwing out some shit and calling it a day.

Re:WHATWG: The worst thing to happen to the Web. (2, Interesting)

Mystra_x64 (1108487) | more than 4 years ago | (#31799556)

XML serialization of HTML is still there. XForms... I never heard of it working in any browser without third-party plugins.

So, please, describe what's rubbish in HTML. Those new elements are _needed_ anyway. It's better to have them than to reimplement them every time you need them.

I don't understand what your problem with audio and video is either. They are here anyway via Flash. You can disable Flash, and you can disable audio/video if you really want to. So what is your problem?

Re:WHATWG: The worst thing to happen to the Web. (2, Insightful)

Nadaka (224565) | more than 4 years ago | (#31799634)

Canvas is not needed. You can create dynamic, animated graphics using the existing SVG standard.

And yes, html5 brings back the integration of style and content.

It is defined to maintain backwards compatibility by keeping some elements that are counter to the philosophy of HTML, and yet it fails to preserve the definition and presence of those elements. It is half-assed even at meeting its stated goals.

The HTML5 spec does not specify a single DOM structure, unlike XHTML2; this means that IE is going to continue to require hackish workarounds for cross-platform JS.

Html5 may not be total crap compared to html4, but compared to the competing and now defunct standard xhtml2? It is utter irredeemable crap.

Re:WHATWG: The worst thing to happen to the Web. (1)

Mystra_x64 (1108487) | more than 4 years ago | (#31799728)

I'm not going to mention canvas here as I'm not interested in it ATM.

It is defined to maintain backwards compatibility by keeping some elements

Like B and I? I don't see any major problem with that. And they have been somewhat redefined. XHTML2 could not care less about backward compatibility and now it's dead.

The HTML5 spec does not specify a single DOM structure, unlike XHTML2; this means that IE is going to continue to require hackish workarounds for cross-platform JS.

I don't quite understand what you mean here. Could you please be more specific?

but compared to the competing and now defunct standard xhtml2? It is utter irredeemable crap.

Please name a few areas where XHTML2 was the best-thing-since-you-know-what, such that HTML5 is "utter irredeemable crap". XHTML2 had some nice things in it, but nothing good enough to sacrifice everything for. I see more of a problem in the fact that CSS is still not up to the task in some areas.

Re:WHATWG: The worst thing to happen to the Web. (1)

Nadaka (224565) | more than 4 years ago | (#31800170)

1: XForms are a huge improvement over traditional HTML forms, with control over view, validation and data.

2: The standardization on XML Events and the DOM. This is a huge issue.
If you have ever done any serious work on AJAX web apps,
you will know that IE uses a different DOM structure than everything else.
This means that you have to:
A: know all the idiosyncratic differences and how to identify and code around them.
B: test everything with ridiculous thoroughness and apply hackish patches to get things working.
C: use a bloated standardization library (Prototype, jQuery, Dojo, etc).

3: XHTML2 is a host language, allowing you to embed other XML language elements within it.
You can include MathML, SVG, etc. with the appropriate namespaces.

4: Separation of semantics from styling.
HTML5 has predefined CSS styles and tags used to define semantic meaning in a haphazard fashion.
XHTML2 uses the role attribute to define semantic meaning in a single, clear and consistent way without interfering with document structure markup or styling.

5: Real errors. Invalid documents are invalid and produce an error, instead of the browser randomly attempting to guess the correct fix for bad code. This makes developing consistent and valid documents easier, as the author can catch invalid documents during development.

IE: The worst thing to happen to the Web. (1)

tepples (727027) | more than 4 years ago | (#31800068)

Canvas is not needed. You can create dynamic, animated graphics using the existing SVG standard.

But can you let the user do the creating? How would one write a photo editor or pixel art editor with SVG and no <canvas>? And how well does SVG handle sprite graphics in the style of 8-bit or 16-bit consoles?

And yes, html5 brings back the integration of style and content.

It was still there in transitional XHTML 1.

Html5 spec does not specify a single DOM structure

What exactly do you mean by this? If the HTML5 standard cites the DOM Events spec [w3.org] , then it supports addEventListener and the like. Microsoft made a specific choice not to support DOM Events, a W3C Recommendation published in 2000, and therefore not support HTML5.

but compared to the competing and now defunct standard xhtml2?

Which web browser ever supported XHTML 2 without using XSLT to turn it into XHTML 1? Heck, IE was even late to support XHTML 1.

Re:IE: The worst thing to happen to the Web. (3, Insightful)

Nadaka (224565) | more than 4 years ago | (#31800554)

How does SVG handle sprite graphics? Far better than canvas does. To move a sprite, you transform its position; with a canvas you have to re-composite the image. The sprite itself can be a traditional bitmapped image if desired.

Pixel art editing is somewhat possible. Canvas can generate a bitmap of the output, but SVG cannot without an external converter. As you add pixels (really rectangles in the DOM of the SVG), you dramatically explode the size of the DOM tree, causing performance issues. With good partitioning algorithms this can be partially mitigated, by combining adjacent like pixels into single DOM objects.
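
To illustrate the sprite point, a sketch (sprite.png is a placeholder, and this assumes a browser that renders inline SVG): the sprite is placed once, and each frame changes a single transform attribute, so the browser, not our script, does the recompositing.

      <svg xmlns="http://www.w3.org/2000/svg"
           xmlns:xlink="http://www.w3.org/1999/xlink" width="320" height="240">
        <image id="sprite" width="16" height="16" xlink:href="sprite.png"/>
      </svg>
      <script>
        var sprite = document.getElementById('sprite');
        var x = 0;
        setInterval(function () {
          x = (x + 2) % 320; // move two units per tick, wrapping at the edge
          sprite.setAttribute('transform', 'translate(' + x + ',100)');
        }, 16);
      </script>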

No web browser ever supported XHTML2, but then it was only after the XHTML2 spec was shelved that browsers started to roll out any significant support for HTML5 either.

Re:IE: The worst thing to happen to the Web. (1)

tepples (727027) | more than 4 years ago | (#31800622)

To move a sprite, you can transform its position, with a canvas you have to re-composite the image.

But if you transform a sprite's position, the web browser has to re-composite the image anyway. The advantage of <canvas> is that it lets script take PNG screenshots of a composited image and pass them around.

As you add pixels (really rectangles in the DOM of the svg) you dramatically explode the size of the DOM tree causing performance issues. With good partitioning algorithms, this can be partially mitigated by combining adjacent like pixels into single DOM objects.

Or you can just do it the easy way with a <canvas>.
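
A toy pixel editor really is only a few lines; a sketch (the canvas id and cell size are arbitrary):

      <canvas id="editor" width="256" height="256"></canvas>
      <script>
        var canvas = document.getElementById('editor');
        var ctx = canvas.getContext('2d');
        var CELL = 16; // each logical pixel is a 16x16 block
        canvas.onclick = function (e) {
          // Map the click onto the pixel grid and fill that cell
          var rect = canvas.getBoundingClientRect();
          ctx.fillStyle = '#000';
          ctx.fillRect(Math.floor((e.clientX - rect.left) / CELL) * CELL,
                       Math.floor((e.clientY - rect.top) / CELL) * CELL,
                       CELL, CELL);
        };
        // canvas.toDataURL('image/png') then hands you the PNG screenshot mentioned above
      </script>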

Re:WHATWG: The worst thing to happen to the Web. (4, Insightful)

Anonymous Coward | more than 4 years ago | (#31799566)

Meh, getting 503s trying to log in. Sorry for the A/C Post.

XHTML was interesting and lovely, and no one gave a shit. Ideology loses to practicality in almost every case until ideology is reformed to conform to reality.

I think you'll find that if you look at HTML5, there's not a lot of presentational bits in it. Most of that is still reserved for CSS.

You'll also find that the cases where things are defined at least give the web a unified way to handle the real web pages that exist *today*. Right now, a new browser would have to reverse engineer what Chrome, FF, IE and friends did in order to know how to render the web. HTML5 at least acknowledges the reality that exists.

You note that JS is being used to do things it shouldn't. On what grounds? Who are you to tell what should and shouldn't be done with a language and in a given environment? The practical fact is that folks *are* doing amazing things with JS. If you don't like the language, that's your problem. If you don't want it on your computer, don't use those websites. JS *does* lots of things today, and there's no reason to limit it artificially. You want something better out there? Come up with a solution and push it.

Your final comment notes that web developers aren't interested in quality and technical superiority. You're right. Why should they be? What they care about is getting a product out. You're asking them to solve problems that they don't have.

Tks,
Jeff Bailey
(an employee of Google, not speaking for Google at all)

Re:WHATWG: The worst thing to happen to the Web. (0, Troll)

Anonymous Coward | more than 4 years ago | (#31799708)

Hi, Jeff.

XHTML was ignored because it is sensible. It is easy to parse, and hence easy to generate, easy to manipulate, and easy to validate. In most other computing fields, these would be seen as benefits. I think it's due to a collective stupidity and ignorance that web developers haven't bothered to make better use of a technology that would vastly improve their lives.

There is no need for elements like "header" and "footer" in HTML5. The exact same functionality is better represented as traditional divs or spans with a class specified. End of story. Anyone who supports the "header" and "footer" elements, among several others, supports content mixed with presentation. It's a regression.

JavaScript is a scripting language, Jeff. I shouldn't have to explain this to you. It is okay to use it for writing a single-line onclick handler. It does not, however, offer the language constructs to develop anything beyond that. We have far too many ignorant web developers who think that JavaScript is a good language, but that's only because they're totally ignorant of everything else. Use C, C++, Python, Ruby, Perl, C#, OCaml, Haskell, Scheme or Common Lisp for even a week, and you'll immediately see how fucked up JavaScript is, and how pathetic of a language it is for development of code that exceeds two or three lines in length.

What are some of these "amazing things" that have been done with JavaScript? Tell me, Jeff. Tell me. It sure as fuck isn't GMail. Thunderbird, mutt and even goddamn Outlook are still more pleasant to use than GMail's web interface. It isn't Facebook, because that site is as slow as molasses, yet still doesn't do anything interesting with JavaScript. Is it those JavaScript re-implementations of video games from the 1970s, the ones that run slower than the originals did? Sorry, Jeff, nobody has done anything unique with JavaScript. That's why most web "apps" are pure shit compared to their desktop equivalents from the early 1990s.

Your final comment notes that web developers aren't interested in quality and technical superiority. You're right. Why should they? What they care about is getting a product out.

This, Jeff, is why web "apps" are so shitty, and will continue to be shitty. You guys, even at "respected" companies like Google, are all about "getting product out". Great, you've "shipped your product". That doesn't change the fact that it's shit, and basically unusable. But what the fuck, you've "shipped". That's all that matters, right? Actually, no. You guys are basically the same as Indian offshore developers, who shovel out one piece of shit after another. You guys are a disgrace to software development.

Re:WHATWG: The worst thing to happen to the Web. (3, Interesting)

Graymalkin (13732) | more than 4 years ago | (#31799792)

XHTML is easy to generate, manipulate, and validate? Have you ever written software that tried to handle XHTML? It's as complex as writing an XML handler, which is not trivial to do properly. Things like tag attributes add a whole extra layer of complexity to getting a machine to actually understand the document. Your contention that HTML5 is regressing with respect to mixing presentation and content is ignorant and borderline stupid. It makes me wonder if you've even read the spec. HTML5 eliminates presentation tags like center, tt, and font. It adds tags that make it easier for user agents to determine the context of different parts of a document.

For instance the header, footer, and article tags let the UA figure out in a search which parts of the document they ought to pay more attention to. Search engines can focus on text inside article tags and ignore text matches in the footer or nav tags for instance. Screen readers don't need to try to parse pages based on tag attributes like they have to with HTML4/XHTML. A screen reader can know that it doesn't need to bother reading the contents of the footer or it can more easily provide a verbal menu based on the sections of the document.
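
A sketch of the structure in question; under HTML4, every one of these would have been a div with a site-specific class name:

      <body>
        <header><h1>Example Site</h1></header>
        <nav><a href="/">Home</a> <a href="/archive">Archive</a></nav>
        <article>
          <h2>Post title</h2>
          <p>The text a search engine or screen reader should focus on.</p>
        </article>
        <footer>Copyright and legal boilerplate a screen reader can skip.</footer>
      </body>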

Re:WHATWG: The worst thing to happen to the Web. (0)

Anonymous Coward | more than 4 years ago | (#31799896)

Are you serious? Are you seriously saying that you find parsing XML difficult? Perhaps you should find another field to work in, if that's the case. Handling XML (and, by extension, XHTML) is a trivial task.

Yes, I have worked on systems that store millions of complex documents in XML and XHTML. I've worked on content management systems where end-users edited those documents. Do you know what we did? We forced the documents to validate before we persisted them, and that made them easy to parse. It took the average user about 10 minutes to figure out how to write a well-formed and valid document.

XHTML5 (1)

tepples (727027) | more than 4 years ago | (#31800124)

We forced the documents to validate before we persisted them

Which is still possible with HTML5. It has two surface forms, XML and a pseudo-SGML, which parse to the same DOM. The user can enter XHTML5, and you can still validate that. But the advantage of HTML5 is that its pseudo-SGML parser is more clearly specified, so even tag soup translates to a well-defined DOM. If the user enters pseudo-SGML, you can parse that into a DOM and then serialize it back to XHTML5.
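
You can sketch that round trip in a browser today: innerHTML invokes the tag-soup parser, and XMLSerializer writes the resulting DOM back out as well-formed XML:

      var soup = '<p class=intro>Hello <b>world';  // unclosed tags, unquoted attribute
      var div = document.createElement('div');
      div.innerHTML = soup;                        // parsed into a well-defined DOM
      var xml = new XMLSerializer().serializeToString(div);
      // xml now holds well-formed markup along the lines of
      // <div><p class="intro">Hello <b>world</b></p></div>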

Re:WHATWG: The worst thing to happen to the Web. (1)

init100 (915886) | more than 4 years ago | (#31800210)

Are you seriously saying that you find parsing XML difficult? ... Handling XML (and, by extension, XHTML) is a trivial task.

Depends on your point of view. Using an existing XML parser in an application can be pretty easy. Writing a fully compliant XML parser is far from simple. If it were so simple, why are most XML libraries fairly large and complicated pieces of software?

It is pretty simple to write a non-validating parser for a limited subset of XML, but once you include things such as namespace support, XPath support, not to mention validation by DTD, Relax NG and/or XML Schema, the parser suddenly becomes very complex.

Re:WHATWG: The worst thing to happen to the Web. (0)

Anonymous Coward | more than 4 years ago | (#31799890)

someone piss in your corn flakes this morning?

By the sounds of it, the only thing that'll solve your problem is canceling your internet.

Re:WHATWG: The worst thing to happen to the Web. (3, Insightful)

msclrhd (1211086) | more than 4 years ago | (#31799956)

Tags like header and footer denote semantics which are part of the content (content denotes what is displayed, not how it is displayed). They don't say "the footer should be in a 10pt font" -- that is up to the CSS. They (and the other layout elements) denote the semantics of what is currently being done in an ad-hoc way. They allow things like search engines to identify relevant information (e.g. ignore the footer sections).

HTML5 is looking to be a great standard. Not perfect by any means, but it is a good step forward (giant leap?) in the right direction. Having a defined way of processing HTML5 and having an XML variant (XHTML) unified to the same DOM makes it easier to choose how you want to write/generate your HTML content.

There were some nice ideas in XHTML2, but it didn't pan out. That does not mean that some of those ideas cannot be integrated into HTML in the future like section has been.

It is also good to see Google seeking to improve video support.

Gradually, HTML5 support will improve, as will support for CSS3 as these standards get finalised. Also, audio and video support will stabilise as well. These, with all the advances in support for MathML, SVG, SMIL and other standards as well as performance improvements for JavaScript and hardware-accelerated page rendering mean that the web is only growing in strength.

As for JavaScript, it is just a scripting language -- you can do anything with it and hook it to anything. You do know that the "fetch more comments" feature of slashdot uses javascript? You do know that thunderbird and firefox make use of javascript for binding their UI together?

Header and footer; JavaScript deployment advantage (1)

tepples (727027) | more than 4 years ago | (#31800094)

There is no need for elements like "header" and "footer" in HTML5. The exact same functionality is better represented as traditional divs or spans with a class specified. End of story.

So how are you going to get thousands of web sites to use the same class= for a header or footer so that the user can apply a user stylesheet to every site's header and footer?

Use C, C++, Python, Ruby, Perl, C#, OCaml, Haskell, Scheme or Common Lisp for even a week, and you'll immediately see how fucked up JavaScript is, and how pathetic of a language it is for development of code that exceeds two or three lines in length.

It's interesting that you mention Scheme and Common Lisp. The common opinion on the web is that JavaScript has Lisp semantics with C syntax. In fact, I'd wager that if M-expressions had ever been properly implemented in Lisp, they would look a lot like JavaScript. Another advantage of JavaScript is that end users might not have privileges to install an application written in "C, C++, Python, Ruby, Perl, C#, OCaml, Haskell, Scheme or Common Lisp".

Is it those JavaScript re-implementations of video games from the 1970s, the ones that run slower than the originals did?

The originals don't run at all if you don't have permission to install them on a given PC. JavaScript has the advantage that it's (at least supposed to be) sandboxed, so computer owners are more likely to let guests use applications written in it.

Re:WHATWG: The worst thing to happen to the Web. (0)

Anonymous Coward | more than 4 years ago | (#31800098)

Bitter much?

I would say you are way off the mark. What's been unique with JavaScript? Let's start with AJAX. Neither C, C++, Python, Ruby, Perl, C#, OCaml, Haskell, Scheme, nor Lisp has advanced the state of the web like AJAX has. I can't access or run Thunderbird/mutt/Outlook on my iPhone, or use them from my work desktop, but Gmail's web interface works EVERYWHERE.

Maybe those things aren't important to YOU, but they are to me. JavaScript may not be the best programming language in the world, but it has some very real advantages that most others don't. It's no one else's fault that you simply can't see the advantages and disadvantages of differing technologies, and apparently have a blinding hatred towards specific ones.

Re:WHATWG: The worst thing to happen to the Web. (0)

Anonymous Coward | more than 4 years ago | (#31800150)

XHTML was ignored because it is sensible. It is easy to parse, and hence easy to generate, easy to manipulate, and easy to validate.

Sorry, but this is just bullshit. As someone who used XHTML extensively and liked the flexibility it offered me: XHTML was killed by one thing and one thing only, and that is IE's complete and utter refusal to support it, combined with IE's extremely high market share during the period when adoption was a practical option. If XHTML could even have fallen back to something useful in IE, it might have had a chance, but that did not really work in practice either. As a result, the web developer community moved on to technologies where they could provide more useful web apps, but where they could support IE with the same page and a bunch of hacks.

What are some of these "amazing things" that have been done with JavaScript? Tell me, Jeff. Tell me. It sure as fuck isn't GMail. Thunderbird, mutt and even goddamn Outlook are still more pleasant to use than GMail's web interface.

Sigh. Except you're missing the critical element that Gmail will work on your Windows PC, Mac, iPhone, Android, Wii, or whatever else you throw at it, using whatever browser a particular user happens to have, even the abysmally retro IE. All the other e-mail programs you mention require a separate version for each and every platform. The web application is the workaround for the complete failure of cross-platform computing technologies, largely killed by Microsoft.

Re:WHATWG: The worst thing to happen to the Web. (1)

init100 (915886) | more than 4 years ago | (#31800196)

Anyone who supports the "header" and "footer" elements, among several others, supports content mixed with presentation. It's a regression.

Very wrong. Header and footer elements denote document structure, nothing else. Of course they will have default styles, but that can be overridden like everything else. Actually header and footer elements are much more sensible than using divs with classes or ids. A header is specified to be used for certain parts of a document, and can be correctly interpreted by software such as screen readers and braille displays. How do you do that with divs? The id/class is an arbitrary string, not something that such software can rely on.

It is okay to use it for writing a single-line onclick handler. It does not, however, offer the language constructs to develop anything beyond that.

Who are you to assert what we can and cannot do with it?

We have far too many ignorant web developers who think that JavaScript is a good language

I can agree that much JavaScript code is pretty hackish, but you can develop structured code with it. Sure, I'd prefer a conventional object-oriented language to the prototype-based language JavaScript is, but that's really just because I'm more familiar with the former. The more you use it, however, the better you get at thinking in prototypes instead of superclasses.

Use C, C++, Python, Ruby, Perl, C#, OCaml, Haskell, Scheme or Common Lisp for even a week, and you'll immediately see how fucked up JavaScript is, and how pathetic of a language it is for development of code that exceeds two or three lines in length.

I don't agree. Like every language, it has its strengths and its weaknesses, but it's not as if other languages don't have strengths and weaknesses of their own. JavaScript surely has its share of quirks, but so does every other language out there. Care to explain what is so immensely shitty, pathetic and fucked up about JavaScript?

Re:WHATWG: The worst thing to happen to the Web. (2, Interesting)

TheRaven64 (641858) | more than 4 years ago | (#31799580)

Many of its new elements have gone out of their way to bring back the combination of presentation and content that we've tried to get rid of for over 15 years now.

Absolutely not true. The new tags are for things like articles, sections, and so on. They provide more semantic information, not less. The XHTML 2 approach removed all of these as redundant, because you can implement them with class attributes. The problem with this is that one site will use <div class="article">, another will use <div class="post">, a third will use <div class="blog">, and this makes it very difficult for the browser to render them in a consistent way, and for other user agents to know that they represent articles. In contrast, HTML5 pages will use the <article> tag.

Others, like canvas, encourage JavaScript to be used more than it ever should be. Furthermore, the audio and video playback will end up as the next-generation marquee or blink element; annoying, misused and hated by all.

They don't allow you to do anything that you can't do in Flash already. Flash is often abused, but in some cases it's used very effectively. I'd rather have an open standard than a proprietary system. Things like Web Socket are also very useful, allowing you to keep a connection to the server open and incrementally fetch data without polling. Something like Slashdot could use this to insert posts into an open page whenever someone posts them, rather than fetching them in a blob when you hit 'more,' for example.
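
A sketch of what that could look like (the endpoint is made up, and the Web Socket API is still being finalised):

      var ws = new WebSocket('ws://slashdot.example/comments');
      ws.onmessage = function (event) {
        // The server pushes each new post as it happens; no polling involved
        var item = document.createElement('div');
        item.innerHTML = event.data;
        document.getElementById('comments').appendChild(item);
      };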

What's worst of all, though, is that XHTML, XForms and other sensible standards are being discarded for something so much worse.

XHTML is not being discarded. XHTML 2 is. I like XHTML 2 a lot, and if I were creating the web now as a new system, I'd want something like XHTML 2. Unfortunately, this is not the current situation. XHTML 2 is a great standard for designing document formats, but it doesn't in any way reflect how people are building web sites today, let alone tomorrow. If every browser supported XHTML 2 tomorrow, I doubt you'd see more than a handful of sites using it in a year's time. In contrast, people are already using bits of [X]HTML 5, because they're actually useful.

XHTML 2 made the same mistake the W3C did with HTML 4 and XHTML 1. The spec was written before the implementation. With HTML 5, every feature has to have a well-defined use case and must have two independent implementations before it goes into the final spec.

I've written in more detail about HTML 5 in two [informit.com] articles [informit.com] . I don't agree with everything in the spec, but it's a lot better than HTML 4 + Flash.

Re:WHATWG: The worst thing to happen to the Web. (1, Interesting)

Anonymous Coward | more than 4 years ago | (#31799810)

David, aside from two blogs using the same shitty WordPress themes, when have you ever seen two sites that look exactly alike? It's very, very rare. And besides, if the <article> tag is being used to control rendering, that makes it a presentational element no different from <b>, <i>, tables, and other crap like that which we've tried to get rid of.

In reality, do you know what's going to happen with the <article> element? In order to make it render properly, people will have to specify a class or style, and fix the rendering using CSS. There's really no beneficial difference between <article class="..."> and <div class="...">. Most sensible people will just use divs, since they're supported by just about every browser still in use today.

Web Sockets and crap like that are nothing more than pathetic hacks to work around the web platform being a steaming pile of shit 95% of the time. Like previous hacks, such as JavaScript, we've seen that they introduce huge security flaws, all for comparatively little gain. As for your Slashdot example, they could obtain the same effect by just using JavaScript's setTimeout function to make an AJAX request and grab any comments since the last check, as sketched below. There's no need for persistent connections or any stupidity like that.
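
Something like this (the URL, the lastCommentId/appendComments helpers and the five-second interval are all made up):

      function poll() {
        var xhr = new XMLHttpRequest();
        xhr.open('GET', '/comments?after=' + lastCommentId, true);
        xhr.onreadystatechange = function () {
          if (xhr.readyState !== 4) return;
          if (xhr.status === 200) appendComments(xhr.responseText);
          setTimeout(poll, 5000); // ask again in five seconds, new comments or not
        };
        xhr.send(null);
      }
      setTimeout(poll, 5000);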

Also, HTML5 is not a specification. A specification is defined before the implementation, not after it. It's difficult to even call it a "standard", with vendors still arguing over video codecs and shit like that. I personally prefer to call it a "failure".

Re:WHATWG: The worst thing to happen to the Web. (2, Insightful)

msclrhd (1211086) | more than 4 years ago | (#31799988)

The article, section, header, footer and aside tags don't carry any presentation information (except that section/section/h1 is similar to using h2). An HTML5 browser only needs the following presentation logic, done via CSS:
      article, section, header, footer, aside { display: block; }

Anything fancier is done by CSS, which means you can have a single CSS theme file (WordPress, ZenGarden, whatever) that is used by *any* website that uses HTML5 markup.

Re:WHATWG: The worst thing to happen to the Web. (1)

grumbel (592662) | more than 4 years ago | (#31800206)

In reality, do you know what's going to happen with the <article> element? In order to make it render properly, people will have to specify a class or style, and fix the rendering using CSS.

article, nav, section, header, footer and friends are about markup, not about rendering. In terms of normal rendering it makes no difference whether you use <article> or <div class="article">; in terms of markup, however, it makes a huge difference. One of the core problems with the Web today, for me, is that there is simply no way to tell the browser what is the actual content and what is just a navigation bar. This in turn makes some web pages pretty much unusable on some devices (Wikipedia on a PSP, for example). If the navigation and article content carried proper markup, you could have a button in your browser to simply hide the navigation; without the markup, the browser has no way to tell where the navigation ends and the content starts.

Now of course in practice things might turn out differently; pages might not use proper markup, as it would make it too easy to skip advertisements and such. But on the other side you have pages like Wikipedia, where it could really be a useful addition.

That said, I am not holding my breath. <link rel="next"> and friends have been in HTML for well over a decade and can be extremely useful in some use cases (a book in HTML, for example), yet proper support in most browsers is still missing, or at least well enough hidden that a normal end user would never find it.
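
For reference, the markup in question is just a couple of lines in the document head (the file names are made up):

      <link rel="prev" href="chapter1.html">
      <link rel="next" href="chapter3.html">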

Re:WHATWG: The worst thing to happen to the Web. (1)

TheRaven64 (641858) | more than 4 years ago | (#31800612)

In reality, do you know what's going to happen with the <article> element? In order to make it render properly, people will have to specify a class or style, and fix the rendering using CSS. There's really no beneficial difference between <article class="..."> and <div class="...">. Most sensible people will just use divs, since they're supported by just about every browser still in use today.

That's precisely the point. They will be presented in different ways (like I said, they're not presentation tags), but the browser will know that they are articles, not something else. If it's on an eInk device, for example, it may decide to split a big page on article breaks. It may display a list of articles in a side view. Anything parsing the HTML for some purpose other than immediate display will know that these are articles, and not just some arbitrary level of detail in a hierarchy.

You seem to be arguing both against presentation markup and against semantic markup. There is a big difference between using an article tag and using a div with a class. As I said in my original post, the former is uniform across sites, while the latter is not. It's the same as the heading tags that have been in HTML from the start. You could replace these with div tags and heading1 (for example) classes, but it's useful for things parsing the HTML to know that something is a heading, not just that it's a generic bit of text.

And you're completely ignoring the benefit of user CSS. This is a feature that has been in browsers for a long time and allows you to specify CSS that overrides the site's own styling. With a richer set of semantic elements, you can define CSS attributes that are applied to every article, every caption, every section, and so on. You don't have to maintain a massive list of class names for every site that you might visit.
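
A user stylesheet then needs nothing but the element names; a sketch:

      /* user.css: applies to every HTML5 site, with no per-site class lists */
      article { max-width: 40em; font-size: 120%; }
      nav, aside, footer { display: none; } /* hide the chrome, keep the content */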

Since you like XHTML 2 so much, perhaps you should refresh your memory as to what the XHTML 2 solution is to this problem. If you honestly think that maintaining a huge network of ontology maps is better than defining a few new tags, then I hope I never use a system that you've designed.

Re:WHATWG: The worst thing to happen to the Web. (0, Redundant)

Lorien_the_first_one (1178397) | more than 4 years ago | (#31799840)

Ok, I'm not a web developer, but I follow the news and try things out on my browser, like the acid3 tests. I don't know the difference between the various markup languages. What I do know is that I want a consistent experience across the web. To help this along, I try to use a browser that supports open standards so that web developers get the feedback from my browser.

When I read a well reasoned debate, even on Slashdot (and it does happen), it's encouraging. I would much prefer that to the drama of a flame war. This comment is not just directed to TheRaven64, but to everyone who can participate in the discussion with reasoning, facts and references.

Since I can't know everything about web design, I try to use discussions like this as a chance to become better informed.

Thank you.

Re:WHATWG: The worst thing to happen to the Web. (0)

Anonymous Coward | more than 4 years ago | (#31799770)

What does this have to do with Chris DiBona?

Re:WHATWG: The worst thing to happen to the Web. (1)

commodore64_love (1445365) | more than 4 years ago | (#31799774)

>>> http://people.xiph.org/~greg/video/ytcompare/comparison.html [xiph.org]

I don't see any difference between the MPEG-4 codecs (H.264+AAC) and the Ogg codecs (Theora+Vorbis). The two images look identical. It's too bad the author did not provide larger images.

Re:WHATWG: The worst thing to happen to the Web. (1)

Kneo24 (688412) | more than 4 years ago | (#31800234)

Ogg Theora+Vorbis provided a clearer image at the lower compression level; it's most noticeable in the upper right-hand corner. At higher compression it's hard to tell a difference, if one even exists at all.

Re:Paging Chris DiBona (1)

koolfy (1213316) | more than 4 years ago | (#31799572)

I think the difference here is that this is aimed at the mobile phone (or netbook) market.
AFAIK, Vorbis/Theora does fine with low- and medium-quality video (up to and including 360p, I think, but I'm not sure) but has more problems with file size and bandwidth usage for high-quality videos.

So what I understand is that they promote Vorbis/Theora for "low-end" video streaming and prefer H.264 for "high-end" videos.

I'm really not sure about that; it's just the result of my tiny experiments with converting H.264 content to Ogg content and streaming it with HTML5 for my own private use. Feel free to correct me if you have more solid experience in that field.

(And yes, as a Chromium user, I hate Google for shipping their HTML5 videos in H.264, and I use TinyOgg [tinyogg.com] links every time I can.)

Re:Paging Chris DiBona (0)

Anonymous Coward | more than 4 years ago | (#31799740)

I'm really not sure about that, it's just the result of my tiny experimentations with converting h264 content to ogg content and streaming it with html5 for my own private use. Feel free to correct me if you have a more solid experience in that field.

Converting from one lossy format to another can NEVER produce superior output, and in 99.9% of cases will suffer a quality loss; that's in the nature of lossy formats.

It's like trying to get a 320kbps MP3 out of a 192kbps MP3: the data simply doesn't exist, and can't be magicked into existence.

Had you started with an Ogg file and converted it to H.264, the Ogg would have looked better. The real test would be to take Ogg and H.264 files converted from a MUCH higher-quality source, preferably in more than one format, and then compare; the test is only perfectly valid if you can encode raw, lossless video to both Ogg and H.264.

Re:Paging Chris DiBona (1)

pantherace (165052) | more than 4 years ago | (#31799864)

Actually, it might sound/look better to some. The reason is that particular formats introduce particular errors/noise, which are sometimes mitigated by another format. If you have enough experience (I did at one point), you can tell which audio standard is being used simply by listening. Nowadays, since I don't listen to as much audio in as many formats or care as much about it, the only ones I can still tell apart are WMA and sometimes low-bitrate MP3. As I recall, that may be due to where WMA chops off frequencies.

There's also the whole "vinyl sounds better" thing, which is something similar (the noise introduced by the vinyl is something some people like).

You just got it (1)

Snaller (147050) | more than 4 years ago | (#31799576)

They picked this shitty format, so be happy and shut up.

Re:You just got it (0)

Anonymous Coward | more than 4 years ago | (#31800576)

lol u mad

Re:Paging Chris DiBona (1)

PhrostyMcByte (589271) | more than 4 years ago | (#31799614)

While I disagree wholeheartedly with Chris's statement, I also think that Greg's comparison was not a very good one. The only thing he compared was a computer-generated, low-motion, pristine, lossless source. How many of those have you seen on YouTube? Where is the noisy, poorly-lit video of some kid complaining about his life? Where's the shaky video someone shot on their cell phone? Where's the re-re-re-encoded video from people who re-uploaded the same video other people uploaded? Where are the TV captures and music videos? No, it was not representative at all.

Re:Paging Chris DiBona (1)

John Hasler (414242) | more than 4 years ago | (#31799782)

> No, it was not representative at all.

How representative of the stuff that actually gets large numbers of hits are your examples? Inefficient transmission of a noisy, poorly-lit video of some kid complaining about his life is unimportant if it only gets downloaded nine times.

Re:Paging Chris DiBona (0)

Anonymous Coward | more than 4 years ago | (#31799826)

How representative of the stuff that actually gets large numbers of hits are your examples? Inefficient transmission of a noisy, poorly-lit video of some kid complaining about his life is unimportant if it only gets downloaded nine times.

Right, but it's not just 1 kid in 1 video; it's millions of kids in millions of videos. If each gets downloaded nine times, that's over 9000 million times!

The demerits of the beach (1)

tepples (727027) | more than 4 years ago | (#31800396)

Inefficient transmission of a noisy, poorly-lit video of some kid complaining about his life is unimportant if it only gets downloaded nine times.

Nine? It's over nine thousand. [youtube.com]

Re:Paging Chris DiBona (4, Insightful)

Graymalkin (13732) | more than 4 years ago | (#31800030)

The Xiph group's rebuttal page does nothing to show Chris DiBona's contention was false. As I have said before, through either ignorance or malice the Xiph guys dropped the ball on their comparison.

1. Their larger Theora video has an audio track at about 64kbps. The H.264 video from YouTube has a 128kbps audio track (the numbers are rough since they're VBR tracks). This means that for every second of video, the Theora file has an extra 64kbps to throw at the video track. While 64kbps might not sound like much, that's 13% of the file's total bitrate, which gives the Theora video a 13% data rate advantage over YouTube's (see the first sketch after this list). Every objective test I've ever seen has gauged AAC and Vorbis to have roughly equivalent audio quality at the same bitrate. If they want to make an actual comparison, they need to use a 128kbps Vorbis audio track.

2. The Ogg file format really sucks for streaming over the internet. The Ogg container tries to be too general a format when it's only being used to represent time-based media. FFMPEG developer Mans has a lot to say [hardwarebug.org] about the container format. Thanks to the sample and chunk tables in the MPEG-4 format, seeks are really efficient over the network, since the header gives you an index to all of the samples in the file. A single HTTP request or file seek is enough to jump to a particular time in the file, even if the full file hasn't been downloaded yet (see the second sketch after this list). For services like YouTube and Vimeo, especially in the context of mobile connections, Ogg's inefficiency is a real detriment.

3. MPEG-4 files with H.264/AAC tracks can be handled by the Flash plug-in as well as natively in browsers. YouTube, Vimeo, and others can encode a single version of a file and serve it up to older browsers using Flash and to newer browsers using the HTML5 video tag. If Ogg is added as an option, that is another step in your decision tree. For individual requests this extra logic might be trivial, but when you're handling millions of requests per hour it really adds up.
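To put rough numbers on point 1, here is a back-of-the-envelope sketch; the total bitrate is an assumed round figure, not a measured one:

    # Approximate audio bitrates from the two files (both are VBR).
    theora_audio_kbps = 64
    h264_audio_kbps = 128
    total_kbps = 500  # assumed total file bitrate

    # The difference is freed up for Theora's video track.
    extra_video_kbps = h264_audio_kbps - theora_audio_kbps  # 64 kbps
    advantage = 100.0 * extra_video_kbps / total_kbps       # ~13%
    print("extra %d kbps for video, ~%.0f%% of the file"
          % (extra_video_kbps, advantage))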
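And to illustrate the single-request seek described in point 2, a minimal sketch; the URL is hypothetical, the byte offset stands in for a real lookup in the MP4 sample tables, and it assumes the server honors HTTP Range requests:

    import urllib.request

    URL = "http://example.com/video.mp4"  # hypothetical

    def fetch_range(url, start, end):
        # One HTTP request for one byte range.
        req = urllib.request.Request(
            url, headers={"Range": "bytes=%d-%d" % (start, end)})
        with urllib.request.urlopen(req) as resp:
            return resp.read()

    # A web-optimized MP4 keeps its 'moov' box (the sample/chunk index)
    # at the front of the file, so one request fetches the whole index.
    index = fetch_range(URL, 0, 64 * 1024 - 1)

    # A real player would parse the stco/stsz tables in 'index' to map
    # a playback time to a byte offset; this value is a made-up stand-in.
    offset = 1500000

    # After that, every seek is a single ranged request that lands
    # directly on the target sample.
    data = fetch_range(URL, offset, offset + 256 * 1024 - 1)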

I'm not defending any hyperbole Chris DiBona was spouting about the internet grinding to a halt, but Ogg and Theora are simply not optimal for a "baseline" media format. Its only real feature is that it is open source and doesn't require a license. This isn't the most useful feature in today's world, because all of the mobile devices that would be served Theora files already have licenses for MPEG-4. Tens to hundreds of millions of phones already support MPEG-4. They're using MPEG-4 to send video over MMS and e-mail and for watching video on the web. Theora wouldn't improve any of those experiences.

Re:Paging Chris DiBona (1)

Randle_Revar (229304) | more than 4 years ago | (#31800534)

>The Ogg file format really sucks for streaming over the internet.
No, it is rather good for streaming. It is actually a bit weaker for downloaded or progressively downloaded (YouTube-style) content, but not horrible even there.

>It's only real feature is the fact it is open source and doesn't require a license.
It may not be optimal, but it is good enough, and so the fact that it is Free is enough.

Re:Paging Chris DiBona (2, Interesting)

chrisd (1457) | more than 4 years ago | (#31800384)

Sorry, but Theora is still not as high quality as later codecs. That hasn't changed. But I was very happy to fund this work out of my group.

Once again (1, Insightful)

Anonymous Coward | more than 4 years ago | (#31799408)

The technically inferior format is set to become the ubiquitously available option because the better option is entangled in non-technical problems.

Re:Once again (0, Redundant)

pv2b (231846) | more than 4 years ago | (#31799496)

Non-technical problems, such as H.264 requiring patent licenses.

Patents are specifically intended to restrict usage of technology (to those who are inclined to pay for it).

So - a royalty-free product which produces comparable (if slightly inferior) results *should* become the ubiquitously available option. It is as it should be. :-)

I very much doubt, however, that Apple and Microsoft will include Theora in their web browsers or in the iPhone. I think it is much more likely that the patent-encumbered option is set to become more ubiquitous than the free option, due to corporate politics. (After all, neither Apple's nor Microsoft's products support Theora or Vorbis out of the box now.)

Re:Once again (1, Informative)

Anonymous Coward | more than 4 years ago | (#31799672)

Non-technical problems, such as H.264 requiring patent licenses.

Patents are specifically intended to restrict usage of technology (to those who are inclined to pay for it).

So - a royalty-free product which produces comparable (if slightly inferior) results *should* become the ubiquitously available option. It is as it should be. :-)

I very much doubt, however, that Apple and Microsoft will include Theora in their web browsers or in the iPhone. I think it is much more likely that the patent-encumbered option is set to become more ubiquitous than the free option, due to corporate politics. (After all, neither Apple's nor Microsoft's products support Theora or Vorbis out of the box now.)

Neither Apple's nor Microsoft's products support Flash out of the box either, yet Flash is fairly ubiquitous right now.

http://www.w3.org/Consortium/Patent-Policy-20040205/
"The goal of this policy is to assure that Recommendations produced under this policy can be implemented on a Royalty-Free (RF) basis."

Therefore, Theora is the only codec suitable for use as the web video codec.

The easiest way to get support for Theora video on browser clients is to install Firefox or Google Chrome. Almost half of the desktops/laptops/netbooks in use now have already done that anyway (Firefox has 40% worldwide, and Google Chrome about 7%).

It is very easy to add Theora support throughout Windows media: http://www.xiph.org/dshow/

There are plugins for IE that (possibly in conjunction with the Directshow filters) will enable Theora support in IE browsers.

You might be able to get Theora supported on the iPhone via this submitted app:
http://www.opera.com/press/releases/2010/03/23_3/

That about covers it, one would think.

Re:Once again (3, Informative)

TheRaven64 (641858) | more than 4 years ago | (#31799830)

Neither Apple's or Microsoft's products support Flash out of the box either, yet Flash is fairly ubiquitous right now.

Really? The last two Macs I've bought have come with Flash preinstalled. Not sure about Windows, but someone mentioned a few days ago here that their new Windows machine had Flash preinstalled, although it's not clear whether this was done by MS or the OEM.

Re:Once again (1)

melikamp (631205) | more than 4 years ago | (#31800116)

Fuck them. My phone runs mplayer.

strange brew that's also good for you (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#31799442)

That would be Kombucha.

Slashdot fails (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#31799476)

Error 503 Service Unavailable

Service Unavailable

Guru Meditation:

XID: 01010101

Varnish

Can't login, can't see my user page.

I'd prefer a CUDA accelerated encoder/decoder (1)

Anonymous Coward | more than 4 years ago | (#31799508)

why optimize for the low end, when the high end also needs a boost? ;)
After all, Theora is not covered by most graphics cards' built-in video decoders.

Christian

Re:I'd prefer a CUDA accelerated encoder/decoder (1)

pv2b (231846) | more than 4 years ago | (#31799536)

The one does not exclude the other. There's nothing stopping you from doing what Google did - funding development of a CUDA-accelerated Theora codec, if you feel so inclined (and if you have the money). :-)

I guess Google thought Theora on the desktop was "good enough" and wanted to focus on ubiquity rather than perfection - for now.

Re:I'd prefer a CUDA accelerated encoder/decoder (1)

TheRaven64 (641858) | more than 4 years ago | (#31799592)

Because the high end doesn't need a boost. VP3 was designed to run on 300MHz Pentium-class CPUs. On a modern 2GHz system, it uses an insignificant amount of CPU power, even at high bitrates. No one cares about getting a program that uses 10% of a CPU to use 7% instead. Getting one that uses 90% to use 50%, however, can be a big win. Getting one that uses 110% (i.e. can't run at quite realtime speeds) to use only 90% is an even bigger subjective improvement.

Re:I'd prefer a CUDA accelerated encoder/decoder (1)

sznupi (719324) | more than 4 years ago | (#31799818)

I'd guess it's mostly about the 90%->50% scenario; current ARMs are quite powerful, certainly enough for video at resolutions that make sense on the devices likely to play them. But for those devices, battery is probably the most limiting factor nowadays, and ARM has great ways of conserving it, given the chance - i.e., when not under constant high CPU usage.

Re:I'd prefer a CUDA accelerated encoder/decoder (0, Flamebait)

Moldiver (1343577) | more than 4 years ago | (#31799876)

It doesn't? Theora encoding on my quad-core Xeon is roughly 5 times slower than H.264 encoding of the same source video. I call that shitty performance, and considering that the image quality was much worse at roughly the same size, I don't care if Theora rots to death somewhere.

nothing more disenchanting than foss 'community' (0)

Anonymous Coward | more than 4 years ago | (#31799548)

still, hats off to those who have remained focused/selfless. nobody ever sees their efforts, as anything worthwhile becomes assimilated with little/no fanfare/recognition/compensation to the people who actually accomplish the task at hand.

If Google was serious... (2, Insightful)

MojoRilla (591502) | more than 4 years ago | (#31799676)

If Google was serious, they would release VP8 as open source, and open source the patents. They did just buy On2 [wikipedia.org] . Why support a codec that was state of the art in 2000?

Re:If Google was serious... (1, Informative)

Anonymous Coward | more than 4 years ago | (#31799794)

Why support a codec that was state of the art in 2000?

You mean people should stop supporting things like mp3 just because you personally think it's too old?

Re:If Google was serious... (1)

nxtw (866177) | more than 4 years ago | (#31799998)

You mean people should stop supporting things like mp3 just because you personally think it's too old?

Not only because it's old; because it's old and there are newer codes that are better. This is why, for example, Apple uses AAC, even though they could have just used MP3 in a DRM container (back when they still applied DRM to downloads).

New video codes have brought clearly perceptible improvements, which is why we've seen MPEG-2, MPEG-4 ASP, and MPEG-4 Part 10 AVC/H.264 within the past 15 years (and H.265 coming in a few years too).
On the audio side, MP3 (and Dolby Digital/AC3) still see much use because they are both 'good enough' for what they are used for, but that doesn't mean we should avoid all progress.

Re:If Google was serious... (1)

SpinyNorman (33776) | more than 4 years ago | (#31800310)

FYI... It's "codec", not codes. It's an abbreviation of (en)coder-decoder.

Re:If Google was serious... (1)

nxtw (866177) | more than 4 years ago | (#31800542)

FYI... It's "codec", not codes. It's an abbreviation of (en)coder-decoder.

No. codec/encoder/decoder refers to a specific implementation or implementations. "code" here refers to the encoding format itself.

We don't want to go back to codec hell... (3, Interesting)

ducomputergeek (595742) | more than 4 years ago | (#31799916)

Theora lost because it wasn't as good as H.264 and it's still not as good as H.264 bit for bit. The open source world supports it not because it's better, but because it's the only "open source friendly" option. Sorry, but the fact that it fits an ideology doesn't mean much to the part of the world that uses the product. It's like suggesting that a professional 3D/video shop use Blender instead of Maya, or Cinelerra instead of Final Cut Pro or Avid. The professionals are going to take a look at it for a while and go, "Nice toy, now I've got to get back to work."

If the open source world wants Theora to succeed, it's going to have to produce something that's better than H.264, end of story. Until then, the people working in video are going to continue using H.264 because it's everywhere and is currently the best mainstream codec available.

I worked in video production from the late 90's through about 2005. H.264 was a godsend: we finally had a single codec that was adopted by pretty much all recording hardware and editing software. Before that, it was codec hell. Nobody I talk to in the industry - and I still have a lot of friends who work everywhere from their basements to large production shops - has any interest in embracing Theora or anything else. They only want to support one codec that works everywhere, and that's H.264. Even if it costs them a little bit of money, because whatever it costs them is likely cheaper than the headache of having to support multiple formats.

Now, if Theora or some other patent-free format gets to the point where it can offer at least the same (really, it has to be BETTER than H.264 in features and quality), only then will the production houses be interested in switching. And by better, I mean offering at least the same quality as H.264 at a lower bit rate.

Re:We don't want to go back to codec hell... (1, Insightful)

MrHanky (141717) | more than 4 years ago | (#31799968)

Quality is not the reason why Theora lost to H.264, just like quality wasn't the reason why Vorbis lost to mp3.

Re:We don't want to go back to codec hell... (0, Troll)

diamondsw (685967) | more than 4 years ago | (#31800626)

This is what gets modded "insightful" - a single statement with no backing or even basis in reality? Just vague FUD?

It's all in the name... (-1, Troll)

Anonymous Coward | more than 4 years ago | (#31800174)

Theora lost because it wasn't as good as H.264 and it's still not as good as H.264 bit for bit.

I rather think they lost because they have one of the worst product names in history. Superficial as it is, I would feel stupid installing Ogg on my system, and it would be a cold day in hell before I start recommending Ogg to my friends & acquaintances. They'd probably think I was having a stroke if I did:

"Bob, you really should get Ooohggggggg." I would say.
"I can't understand you Jim, are you pretending to be a dense cave man or are you having a stroke?" Bob would respond.
"No, Ooohgggggg is a container format!" I would protest
"That's it, I'm calling an ambulance" Jim would riposte

And so it would go. So, in order to avoid looking like a pantomiming idiot and/or someone suffering from a cerebral haemorrhage, I stick to H.264.

Re:We don't want to go back to codec hell... (1)

mqduck (232646) | more than 4 years ago | (#31800348)

Now, if Theora or some other patent free format gets to the point where it can offer ...

That brings up a question I've had in my mind for a while. I don't know how codecs/formats work, but can someone tell me whether the Theora format can be improved to the point that it rivals H.264 while still being the Theora format? Or at some point does it become necessary to call it a new format? And if so, what effect would a new, better Theora-derived format have if the world, hypothetically, had standardized on Theora?

Also, how much of a difference does the quality of the encoder used to create Theora videos make? I recall when LAME first came along and was so good that it could double the quality of a low- or mid-bitrate MP3 file compared to the old options.

Re:We don't want to go back to codec hell... (1)

Arker (91948) | more than 4 years ago | (#31800648)

I don't know how codecs/formats work, but can someone tell me whether the Theora format can be improved to the point that it rivals H.264 while still being the Theora format?

Don't get sucked in by the groupthink. Theora already rivals H.264 - in most real applications it's highly unlikely anyone would ever notice the difference.

And yes, encoder and decoder development is at least as important as the underlying algorithm.

Arm is a start, how about the DSP (0)

Anonymous Coward | more than 4 years ago | (#31799970)

That TI's mated to their ARM CPUs? TI DaVinci [ti.com]

theorarm (1)

yupa (751893) | more than 4 years ago | (#31800218)

Did you notice that the author of the blog entry is a developer of theorarm? His point of view is not necessarily the same as Google's...

more codec support (1)

YesIAmAScript (886271) | more than 4 years ago | (#31800268)

Where space and power matter most (pocketable devices), I'm just not entranced by support for more codecs that aren't efficient.

Some day it'll be reasonable for the device in your pocket to play video in any format you find it in. But for now, I think I'd rather the effort were concentrated on maxing out the efficiency (bits and power) of the codecs that are already in wide use.
