
BBC Lowers HDTV Bitrate; Users Notice

timothy posted more than 4 years ago | from the totally-amazing dept.

Media 412

aws910 writes "According to an article on the BBC website, BBC HD lowered the bitrate of their broadcasts by almost 50% and are surprised that users noticed. From the article: 'The replacement encoders work at a bitrate of 9.7Mbps (megabits per second), while their predecessors worked at 16Mbps, the standard for other broadcasters.' The BBC claims 'We did extensive testing on the new encoders which showed that they could produce pictures at the same or even better quality than the old encoders ...' I got a good laugh off of this, but is it really possible to get better quality from a lower bitrate?"


412 comments

Focus group... (2, Insightful)

gandhi_2 (1108023) | more than 4 years ago | (#30476142)

...of blind retards.

At any resolution: (0)

Anonymous Coward | more than 4 years ago | (#30476164)

Benny Hill is awesome! Especially when he runs around really fast.

Re:Focus group... (5, Funny)

Psx29 (538840) | more than 4 years ago | (#30476256)

FTA: ""Even my wife can see a reduction in picture quality and she's got cataracts," wrote one. "

Re:Focus group... (-1, Troll)

Anonymous Coward | more than 4 years ago | (#30476526)

As a 49 yo grandmother, a feminist, and having had a long career as a C programmer, I find that offensive. Would they have said his father couldn't see it? This is just another racist characterization of women being incompetent with technology.

Re:Focus group... (4, Funny)

dyingtolive (1393037) | more than 4 years ago | (#30476694)

Actually, deary, it's sexist, not racist. I swear it's not making me take you less seriously though.

Re:Focus group... (5, Informative)

Anonymous Coward | more than 4 years ago | (#30476712)

1) The alleged wife in the quote is purported to have cataracts. Cataracts typically reduce visual acuity due to the cloudiness they impart to the lens of the eye. How does a reduction of visual acuity translate to "just another racist characterization of women being incompetent with technology"?

2) If the quote had been ""Even my husband can see a reduction in picture quality and he's got cataracts," wrote one." would you have bothered to make your little rant post?

P.S. The term you were looking for is "sexist" not "racist".

Re:Focus group... (-1, Troll)

Anonymous Coward | more than 4 years ago | (#30476820)

1) The alleged wife in the quote is purported to have cataracts. Cataracts typically reduce visual acuity due to the cloudiness they impart to the lens of the eye. How does a reduction of visual acuity translate to "just another racist characterization of women being incompetent with technology"?

2) If the quote had been ""Even my husband can see a reduction in picture quality and he's got cataracts," wrote one." would you have bothered to make your little rant post?

P.S. The term you were looking for is "sexist" not "racist".

What's the difference between a nigger and a pile of shit? Eventually the shit will turn white and stop stinking.

What do you say to Mike Tyson with no arms? Nigger, nigger, nigger!

How was breakdancing invented? From niggers trying to steal hubcaps from moving cars.

How do you babysit a nigger kid? Wet his lips and stick them to the wall. How do you get him back down? Teach him to say "motherfucker."

Are those jokes sexist?

Re:Focus group... (2, Insightful)

Anonymous Coward | more than 4 years ago | (#30476814)

As a 43 year old father of a special needs child I find your comment retarded (And I deplore the use of the word). The salient point was that the wife of the commenter in the article has cataracts, not that she is female. If the comment had been "Even my husband can see a reduction in picture quality and he's got cataracts" it would have been no less salient.

Re:Focus group... (2, Insightful)

brkello (642429) | more than 4 years ago | (#30476920)

What a weird post. Would you find it less offensive if you weren't a C programmer? It may be a stereotype, but it is there for a reason. This hits home with my mom who says she can't tell the difference between standard def and high def television. Does that mean all women can't? Nope. But it was an amusing quote...loosen up. Stop looking for things to be offended about.

Re:Focus group... (0)

Anonymous Coward | more than 4 years ago | (#30476942)

As a 49 yo grandmother, a feminist, and having had a long career as a C programmer, I find that offensive. Would they have said his father couldn't see it?

Probably, if his father had cataracts and complained to him. Now if he'd said "Even a woman could see the difference" I would see reason for complaint. I think the emphasis here is on the cataracts.

This is just another racist characterization of women being incompetent with technology.

Hold on a sec- now women are a separate race? Really?

I'm just glad that no one made a 'racist' characterization of overly sensitive feminists who fly off the handle at the slightest perceived insult.

Re:Focus group... (0, Flamebait)

the_hellspawn (908071) | more than 4 years ago | (#30477008)

So what does HD TV have to do with being a grandmother? Nothing, for starters. So what if you are 49 y/o; I am 34 y/o, and does anyone care? No! So you're still just a programmer at 49? I would have been a little more impressed if it read "...during my long career as a programmer, now manager of my department of C/C++ programmers..." See, you show no signs of improvement, and C/C++ are basically the same. You have it that you can only fit the C language in your minimal-capacity skull. I do have to say this as my final word: I truly feel sorry for your husband/boyfriend/boytoy for getting stuck with a dumb arse like yourself, who must declare her position on a topic and still call it by an incorrect term. Silly troll, Slashdot is for winners.

Re:Focus group... (0, Flamebait)

BubbaDave (1352535) | more than 4 years ago | (#30477032)

As a 49 yo grandmother, a feminist, and having had a long career as a C programmer, I find that offensive. Would they have said his father couldn't see it? This is just another racist characterization of women being incompetent with technology.

To know whether or not we should listen to you, we have to know- are you hot?

Dave

Re:Focus group... (0)

Anonymous Coward | more than 4 years ago | (#30477062)

Perhaps the fact that you completely failed to understand the joke on every level and that you use the phrase "just another" is proof that you are quite sexist yourself.

Re:Focus group... (5, Funny)

Anonymous Coward | more than 4 years ago | (#30476590)

FTA: ""Even my wife can see a reduction in picture quality and she's got cataracts," wrote one. "

They must have a pretty big screen if she can see that difference from the kitchen.

Re:Focus group... (1, Funny)

Anonymous Coward | more than 4 years ago | (#30477072)

The screen is in the kitchen dipshit.

And I'm doing the cooking.

Re:Focus group... (0)

Anonymous Coward | more than 4 years ago | (#30477088)

I would mod this up but it is already 5...

Re:Focus group... (5, Informative)

jasonwc (939262) | more than 4 years ago | (#30476580)

Yes, it IS possible to get higher picture quality out of a lower bitrate, but not with all else equal. For example, at the same bitrate you can get higher quality with CPU-intensive settings at H.264 Level 5.1 than you can at Level 4.1 (what Blu-rays/HD DVDs use). You're giving up CPU cycles in decoding for lower video size. This is why x264 can produce near-transparent encodes of Blu-ray movies at about half the size: x264 uses much more demanding settings.

x264 at 20 Mbit with high-quality settings is far more demanding to decode than a 40 Mbit H.264 stream from a Blu-ray.
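
To make the trade-off concrete, here is a minimal sketch of encoding the same clip twice at the same bitrate with very different CPU budgets. This is an illustration, not the BBC's setup; it assumes an ffmpeg build with libx264 on the PATH, and the filenames are hypothetical:

    import subprocess

    SRC = "clip.y4m"  # hypothetical source clip

    # Same target bitrate, two very different CPU budgets: "placebo"
    # spends far more cycles searching for an efficient encode than
    # "ultrafast", so quality at 9.7 Mbps differs accordingly.
    for preset, out in [("ultrafast", "cheap.mkv"), ("placebo", "expensive.mkv")]:
        subprocess.run(
            ["ffmpeg", "-y", "-i", SRC,
             "-c:v", "libx264", "-preset", preset,
             "-b:v", "9700k", out],
            check=True,
        )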

Re:Focus group... (5, Interesting)

postbigbang (761081) | more than 4 years ago | (#30476684)

In the US, Comcast uses codex compression to squeeze HD on their cable systems. When people get to see native resolution at the TV store, then get the Comcast version when they plug in their shiny new HD TV, they wonder WTF? That the beeb would put their foot on the garden hose and expect no one to notice is ludicrous.

I wish the FCC would get involved in the US to force cable companies to limit the number of channels supported and broadcast them in the highest sustainable resolution-- or tell their users the truth about what's happening and why. Maybe we can start to get rid of the excess junk channels.

Re:Focus group... (3, Interesting)

kimvette (919543) | more than 4 years ago | (#30476832)

Adding to that, Comcast's programming is 720p, with much of it upscaled. The Blu-ray source you see at the stores is often 1080p, or at least 1080i. You're comparing rotten wormy apples to nice juicy oranges, where Comcast's feeds are the rotten wormy apples.

Re:Focus group... (0)

Anonymous Coward | more than 4 years ago | (#30477048)

What exactly is "codex compression" and what do you mean by "native resolution"?

Seriously, learn WTF you're talking about before using kompleecated teknical turms...

They suck at math too (4, Insightful)

Locke2005 (849178) | more than 4 years ago | (#30476146)

They also lowered their math standards. From 16Mbps to 9.7Mbps is a 40% reduction, not "almost 50%".
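
For what it's worth, the arithmetic is a one-liner (Python):

    old, new = 16.0, 9.7          # Mbps
    print(1 - new / old)          # 0.39375, i.e. ~39%: nearer 40% than 50%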

Re:They suck at math too (3, Informative)

gandhi_2 (1108023) | more than 4 years ago | (#30476274)

Technically speaking, they suck at "maths".

Re:They suck at math too (0)

Anonymous Coward | more than 4 years ago | (#30476436)

Maybe in your part of the world.

Re:They suck at math too (5, Funny)

zach_the_lizard (1317619) | more than 4 years ago | (#30476490)

Technically speaking, they suck at "maths".

We Amurkins don't recognize no commie "maths." We want our math to grow up as individuals

Re:They suck at math too (1)

Zedrick (764028) | more than 4 years ago | (#30476848)

How could this possibly be modded "troll"? He's pointing out something obvious, FFS! Please mod him up, but above all I'd really like to see an explanation from the person who slapped the -1 on the parent (as AC, of course).

Yes (2, Informative)

Anonymous Coward | more than 4 years ago | (#30476148)

Yes, if more time and more passes are spent encoding the video, a lower bitrate CAN result in higher quality video. However, this does not appear to be the case in this instance.
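
As a concrete illustration, two-pass encoding with ffmpeg/libx264 looks like this (a minimal sketch; the filenames are hypothetical and the null sink assumes a Unix-like system):

    import subprocess

    SRC = "clip.y4m"  # hypothetical source

    common = ["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264", "-b:v", "9700k"]

    # Pass 1 analyses the whole video and writes a stats file; pass 2
    # reads it back and spends the bit budget where it matters most.
    subprocess.run(common + ["-pass", "1", "-an", "-f", "null", "/dev/null"],
                   check=True)
    subprocess.run(common + ["-pass", "2", "two_pass.mkv"], check=True)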

Re:Yes (0)

Anonymous Coward | more than 4 years ago | (#30476198)

Or if BBC went to the future to bring back to our time a superior codec that uses a lower bit rate but produces superior image.

Re:Yes (2, Insightful)

Bakkster (1529253) | more than 4 years ago | (#30476232)

In other words, lower bitrate can be better, but only if you compare to shitty and inefficient compression.

Re:Yes (4, Insightful)

Fantom42 (174630) | more than 4 years ago | (#30476598)

In other words, lower bitrate can be better, but only if you compare to shitty and inefficient compression.

And by this you mean compression that is state of the art two minutes ago, vs. today. Seriously, this field is moving pretty fast, and what you call shitty and inefficient was not long ago the best people could do. A few years ago when I was messing with the x264-svn libraries, stuff would get tweaked daily.

Not to mention there are other factors at play with regards to compression. A well-engineered system isn't necessarily going to go for the maximum compression rate for video quality. One has to look at other limitations, such as the decoding hardware, the method by which the video is being delivered, and even the viewing devices on the receiving end.

What is disheartening about the article is that it looks like the BBC are just in denial mode, and not really taking the complaints seriously. "Standard viewing equipment"? Seriously, what exactly are they getting at with that comment? On top of that it looks like they are trying to blame the quality of the source material, which certainly muddies the picture; but the customers who are complaining would have been used to those variations in quality before the change, and would not all suddenly notice them at the same time this equipment was rolled out.

I have respect for them sticking to their guns, but not when they are doing it with such lame excuses. Then again, the BBC spokesperson and reporter may not be the most tech-savvy individuals, and it's likely some of the message here is lost in translation. Lossy codec indeed.

Re:Yes (1)

Albanach (527650) | more than 4 years ago | (#30476398)

It would also be quite remarkable to see better quality compared to any other modern encoding while reducing bitrate by 50%.

Better quality from a lower bitrate? (-1, Redundant)

Anonymous Coward | more than 4 years ago | (#30476154)

is it really possible to get better quality from a lower bitrate?

Yes. Next question.

Yes (4, Informative)

Unoriginal Nick (620805) | more than 4 years ago | (#30476170)

I got a good laugh off of this, but is it really possible to get better quality from a lower bitrate?

Sure, if you also switch to a better codec, such as using H.264 instead of MPEG-2. However, I don't think that's what's happening in this case.

Re:Yes (1, Informative)

Anonymous Coward | more than 4 years ago | (#30476252)

Yes, and double yes: even within the same codec. There is a LOT that an encoder can do to improve the quality of video (especially with more advanced codecs).

Just check out the differences between, say, x264 and Apple's encoder. Both are H.264, but x264 blows Apple's encoder clean out of the water.

9.7Mbps is still a pretty large bandwidth. An encoder like x264 could do quite a bit with that bandwidth. At 16Mbps, though, almost any encoder from MPEG-2 up could produce some pretty clean-looking pictures.

Re:Yes (4, Funny)

EsJay (879629) | more than 4 years ago | (#30476802)

Just check out the differences between say, x264 and apples encoder...

Aren't you comparing x264 to oranges?

Re:Yes (0)

Anonymous Coward | more than 4 years ago | (#30476264)

Umm, H.264 doesn't compress anywhere near the rate of MPEG. That would be why H.264 is much better quality (not to mention its amazing frame flow, as I call it).
But you are forgetting the multiple past Slashdot articles on how people prefer the sound of an MP3 to CDA. That situation is a WTF moment, but at least it isn't true with video (yet).

Re:Yes (1)

jojo80 (99781) | more than 4 years ago | (#30476492)

You do know that H.264 is also known as MPEG4? And at the same bitrate, H.264 does produce better pictures than MPEG2.

Re:Yes (1)

Zebra_X (13249) | more than 4 years ago | (#30476658)

H.264 = MPEG-4 Part 10 (http://en.wikipedia.org/wiki/H.264/MPEG-4_AVC)

If you mean MPEG-2, H.264 was designed as a replacement for this technology amongst others.

H.264 and VC-1 are currently the most efficient methods in terms of bandwidth to transmit video.

Re:Yes (5, Insightful)

natehoy (1608657) | more than 4 years ago | (#30476368)

You can also get better compression within the same codec by specifying a more sophisticated compression method, since many codecs support various levels of compression.

Generally, "better" compression (fitting a higher resolution and/or framerate into a smaller size) requires a lot more power to encode and often some more power to decode. You can use less bitrate to get a quality signal there, but you need "smarter" coders and decoders at the respective ends of the transmission. So the BBC may have upgraded their compression engine to something that can do "better" compression, thereby fitting the same resolution and framerate into a 40% smaller stream. But their customers' television sets might not have the horsepower to decode it at full quality.

That could easily explain why the BBC's testing went so well but their consumers (with varying brands of TV sets probably mostly tested for British use with the old compression) can't keep up and render an inferior picture.

It's also possible that, by compressing the video stream into a denser compression method, signal loss is having a greater effect than it did with the old compression method. The viewers may be seeing artifacts that are the decoder's attempts to fill in the blanks. The old compression method might have allowed a certain amount of redundancy or error correction that the new one lacks, and the loss of part of the signal has a more visible effect on the new one.
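
If the installed base of decoders is the constraint, the usual answer is to pin the stream to a profile and level the weakest boxes can handle. A hedged sketch (hypothetical filenames; assumes ffmpeg with libx264):

    import subprocess

    # Constrain the encode to High Profile, Level 4.0: the encoder then
    # avoids any coding tools the level forbids, so older, weaker
    # decoders are guaranteed to keep up.
    subprocess.run(
        ["ffmpeg", "-y", "-i", "clip.y4m",
         "-c:v", "libx264", "-profile:v", "high", "-level", "4.0",
         "-b:v", "9700k", "compatible.mkv"],
        check=True,
    )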

Re:Yes (2, Insightful)

HateBreeder (656491) | more than 4 years ago | (#30476548)

> can't keep up and render an inferior picture.

It's not like there's half-way here. This is the digital age - if it can't keep up you won't see anything!

Re:Yes (2, Informative)

GasparGMSwordsman (753396) | more than 4 years ago | (#30476778)

Despite what many people claim, when you get a garbled digital signal, most systems will give you garbled artifacts on the screen. This is because in any broadcast situation you expect a certain percentage of interference and have to design to deal with it. If a TV only ever displayed images when the signal was perfect, you would be amazed at how often your display would blank out.

On my TV at home I have changed the settings to turn off the "blue screen/bad signal screen". The TV does its best to figure out what it is receiving. I still can get a signal loss if there is enough interference, but for the most part I just get warped images and garbled sound if something happens. (I have a very nice HD tv with tuner built in.) I am at the very edge of two stations in my area and on both of those I have to fiddle with the antenna to have them come in clear. (Plus my cat moves the antenna onto the floor pretty often...)

Re:Yes (1)

Fantom42 (174630) | more than 4 years ago | (#30476382)

I got a good laugh off of this, but is it really possible to get better quality from a lower bitrate?

Sure, if you also switch to a better codec, such as using H.264 instead of MPEG-2. However, I don't think that's what's happening in this case

Just to amplify what has been said here a few times: yes, it is possible, and not only by changing codecs. H.264 supports many optional features that are not implemented in all decoders, and these features can have an effect on quality. Use of a variable bitrate instead of a constant bitrate can also increase quality and decrease bandwidth needs, at the cost of requiring some bursting capability or buffering to accommodate the variation. Also, there are tricks that can be played with dark and light tone masking to increase the compressibility of a video stream, and any number of other preprocessing tricks that people use.

The reason I bring this up is that it might be that they tweaked the parameters to achieve a lower bitrate and thought they weren't sacrificing quality, but that different viewing devices, or different people, were able to notice a change in quality that the study group they used was not. Point being, these guys may not have been as stupid as people are making them out to be; they may just have designed a poor test. Video compression, and judging output quality, is really complicated and depends on a lot of factors. It could be they just missed some of those factors when they tweaked the encoding algorithm.
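
The variable-versus-constant-bitrate distinction above is literally a flag swap in most encoders. A minimal sketch, assuming ffmpeg with libx264 and hypothetical filenames:

    import subprocess

    SRC = "clip.y4m"  # hypothetical source

    # Constant bitrate: predictable channel cost, quality varies by scene.
    subprocess.run(
        ["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264",
         "-b:v", "9700k", "-maxrate", "9700k", "-bufsize", "9700k",
         "cbr.mkv"], check=True)

    # Constant quality (CRF): bitrate floats with scene complexity, so
    # the receiver needs buffering to absorb the bursts.
    subprocess.run(
        ["ffmpeg", "-y", "-i", SRC, "-c:v", "libx264",
         "-crf", "20", "vbr.mkv"], check=True)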

Re:Yes (1)

Speare (84249) | more than 4 years ago | (#30476520)

In addition, if your higher bitrate is clogging the net and stalling every few seconds, and the lower bitrate actually allows the audio and/or video to be played in real time, then many people would say that is an improvement in quality, although not necessarily on a frame-by-frame basis.

So they starting to act like comcast cable with th (0, Informative)

Anonymous Coward | more than 4 years ago | (#30476176)

So they're starting to act like Comcast cable with their compressed HD.

Re:So they starting to act like comcast cable with (2, Informative)

Jackie_Chan_Fan (730745) | more than 4 years ago | (#30476638)

I'll toss FiOS under the bus too. Verizon's HD varies greatly. I'm not sure if it's the channel companies themselves or Verizon doing it...

Either way, I hate watching fast-motion movies or TV shows where the bitrate is too low.

Try watching "How It's Made" on Discovery HD and watch how compressed things look as fast-moving manufactured parts pass through machinery.

Same for HBO films etc.

Of course it is possible (2, Informative)

godrik (1287354) | more than 4 years ago | (#30476190)

but is it really possible to get better quality from a lower bitrate?

If you are changing the compression algorithm, of course it is possible. In H.264 there are a lot of coding options that many encoders don't use but that the decoder will still recognize.

Yes, of course (5, Informative)

TheRaven64 (641858) | more than 4 years ago | (#30476196)

Any lossy compression works by throwing away bits of the picture that the viewer might not notice. You can lower the bitrate with better psychovisual and psychoacoustic models. You're still throwing away more information, but you're doing it in a way that the user is less likely to notice. This takes more CPU time on the compressor, a more optimised encoder, or a better algorithm.
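
A toy illustration of "throwing away what the viewer won't notice": coarser quantization of high-frequency transform coefficients, the JPEG/MPEG idea in miniature. This is a sketch of the principle, not the BBC's encoder; it assumes NumPy and SciPy:

    import numpy as np
    from scipy.fft import dctn, idctn  # type-II DCT, the JPEG/MPEG workhorse

    block = np.random.default_rng(0).random((8, 8))  # stand-in 8x8 luma block

    # Quantize high frequencies more coarsely: the eye is least sensitive
    # there, so that is where a psychovisual model spends the fewest bits.
    u, v = np.meshgrid(np.arange(8), np.arange(8), indexing="ij")
    step = (1.0 + u + v) / 8.0        # crude stand-in for a real quant matrix

    coeffs = dctn(block, norm="ortho")
    quantized = np.round(coeffs / step) * step   # information discarded here
    approx = idctn(quantized, norm="ortho")

    print(np.abs(block - approx).max())  # small error, far fewer distinct symbols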

Re:Yes, of course (3, Interesting)

Locke2005 (849178) | more than 4 years ago | (#30476346)

Fractal encoding works well in that you can zoom way in on the fractal without noticing obvious compression artifacts. However, there is no straightforward algorithm for doing the compression; as far as I know, you have to brute-force every possibility to get optimal encoding -- not something you can effectively do in real time. But if you've got several days before the segment airs in which to encode it, you should be able to get better quality out of far fewer bits.

Re:Yes, of course (3, Informative)

TheRaven64 (641858) | more than 4 years ago | (#30476604)

Yup, fractal encoding is pretty impressive. I played with it a bit on a 386, when it took about ten minutes to compress a 320x240 image. I've not heard of any newer algorithms that improve matters much. More interesting is topological compression, which has most of the same advantages as storing a vector image (resolution independence) and a raster image (can come from a sampled source). You can extend these to video by modelling the video data as a 3D volume, rather than as a sequence of frames. The topological changes in the time dimension are usually quite gradual, and it's easy to trade spatial and temporal resolution. The really nice thing about this approach is that it's resolution independent in three dimensions, not just two, so it's easy to generate a signal that matches the display's refresh rate.

Re:Yes, of course (4, Informative)

Andy Dodd (701) | more than 4 years ago | (#30476408)

LAME was a pretty good example of this for MP3 - Eventually it was able to achieve (somewhat) better quality at (somewhat) lower bitrates than the reference encoders.

Vorbis, similarly, had the AoTUV tuning - This provided significant rate/distortion tradeoff improvements compared to a "vanilla" encoder, without changing the decoder.

However, a 40% reduction in bitrate with an increase in quality is very difficult unless the original encoder was CRAP. (Which is actually a definite possibility for a realtime hardware encoder.) Also, it's far more likely to have such improvements with H.264 or MPEG-4 ASP, not nearly as likely with MPEG-2, which had a far less flexible encoding scheme.

Re:Yes, of course (2, Informative)

TheRaven64 (641858) | more than 4 years ago | (#30476750)

Which is actually a definite possibility for a realtime hardware encoder.

Not just a realtime hardware encoder, but likely a first-generation encoder. Most compression standards are now designed with some headroom. When AAC was first introduced, Dolby provided two encoders, a consumer-grade and a professional encoder. The consumer-grade one was only slightly better than MP3, but ran much faster. The pro encoder was a lot slower but the quality was noticeably better. More recent encoders produce even better quality. A 40% decrease in bitrate is about what I'd expect going from a single-pass to a two-pass H.264 encoder, and it's entirely possible that a newer single-pass encoder can do the same sort of thing just by using a longer window now that RAM is a lot cheaper.

Also, it's far more likely to have such improvements with H.264 or MPEG-4 ASP, not nearly as likely with MPEG-2, which had a far less flexible encoding scheme

BBC HD uses H.264. It's rebroadcast after transcoding to MPEG-2 if you have Virgin Media cable, because their decoder boxes, unlike the FreeView boxes, can't handle H.264.

Re:Yes, of course (2, Funny)

trb (8509) | more than 4 years ago | (#30477034)

Lossless compression also works by throwing away data. In a simple case, if you have a still image and you store it in a file that's 1000x1000 pixels and 24 bits deep with 8 bits each of red, green, and blue, you store that uncompressed in 3 megabytes. 24 bits of color isn't infinite, it's a palette of 16.77 million colors. And you're not saving every micron of the image. You are dicing the image into 1000x1000. If you are taking a picture of a scene that's 10 meters by 10 meters, you are stuffing a square 10x10 mm into each pixel. And also, your recorded image isn't perfect anyway, it's not perfectly focused and color reproduction is not exact. Information is lost.

If you use those same 3 megabytes to store a lossy JPEG, you can store a lot more detail in the same file space - at typical compression rates, it might be 5-10 times more detail. I am often puzzled by folks who hate lossy and love lossless, because lossless isn't truly lossless, and lossy is smarter about what it chooses to lose.

I understand that the process of uncompressing and recompressing a lossy image is a lossy process, but we're not talking about multiple recompressions here, we're talking about one cycle. This is true for both broadcast video and for playing back your personal music. And especially if you listen with earbuds, it's silly to worry about audio compression loss, since the earbuds are very lossy. I know that this is a discussion of video, but the same rules apply.
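
The storage arithmetic above checks out (trivial Python; the ~10:1 JPEG ratio is the "typical" figure assumed in the comment, not a measured one):

    pixels = 1000 * 1000
    bytes_per_pixel = 24 // 8            # 8 bits each of R, G, B
    print(pixels * bytes_per_pixel)      # 3,000,000 bytes = 3 MB uncompressed

    # At a typical ~10:1 JPEG ratio, the same 3 MB holds roughly
    # ten times the pixel detail.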

Yes, you can (-1, Flamebait)

Anonymous Coward | more than 4 years ago | (#30476200)

Losslessly compressed 48kHz audio has a much lower bitrate than PCM audio, with identical fidelity.

Older MP3 compression algorithms sounded worse at 256kbit than more modern ones do at 128k or lower.

Look at the size/quality of MPEG2 vs DivX, h264, etc.

The submitter is, like most of slashdot, technically clueless.

of course (0)

Anonymous Coward | more than 4 years ago | (#30476208)

I got a good laugh off of this, but is it really possible to get better quality from a lower bitrate?

Ever looked at mpeg2 vs h264? That's not what happened here, but your question gets a huge Yes anyway.

Bitrate vs. Quality (5, Insightful)

jandrese (485) | more than 4 years ago | (#30476210)

It's not impossible to get better results out of lower bitrates, but you have to pay the penalty elsewhere, typically in encode/decode complexity.

If your decode hardware is fixed (it's generic HDTV hardware), then there is much less room for improvement, and half the bitrate is an enormous drop. It's no surprise that the BBC viewers complained.

Re:Bitrate vs. Quality (1)

evilviper (135110) | more than 4 years ago | (#30477058)

If your decode hardware is fixed (it's generic HDTV hardware), then there is much less room for improvement, and half the bitrate is an enormous drop.

You're assuming they started with a halfway decent encoder to begin with. The difference between a good encoder and a crappy one is vastly more than 50%.

Added complexity need not be involved (though it certainly can help). A better quantization table, for instance, wouldn't be any slower, and the reduced bitrate would speed up encoding/decoding.

More relevant, though, is the comparison function. I find libavcodec's SAD (Sum of Absolute Differences) far better than the traditional DCT-based comparison, as it doesn't discolor blocks, and it speeds up video encoding by almost an order of magnitude versus the latter.
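
For reference, the SAD cost function being praised is tiny; here is a toy NumPy sketch (libavcodec's real version is hand-written SIMD):

    import numpy as np

    def sad(a: np.ndarray, b: np.ndarray) -> int:
        # Sum of Absolute Differences between two same-sized pixel blocks:
        # the cheap cost function motion estimation runs in its inner loop.
        return int(np.abs(a.astype(np.int32) - b.astype(np.int32)).sum())

    rng = np.random.default_rng(0)
    cur = rng.integers(0, 256, (16, 16))              # block being encoded
    candidates = [rng.integers(0, 256, (16, 16)) for _ in range(4)]

    # Pick the reference block that matches most closely.
    best = min(candidates, key=lambda ref: sad(cur, ref))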

Summary rounding error (4, Informative)

w0mprat (1317953) | more than 4 years ago | (#30476212)

Nitpick: So 39% is "almost 50%"?? I would have called that "almost 40%". Then again that is a /. summary.

Re:Summary rounding error (1, Interesting)

Andy Dodd (701) | more than 4 years ago | (#30476424)

Depends on your definition of "almost" and what precision you're rounding to.

Re:Summary rounding error (0)

Anonymous Coward | more than 4 years ago | (#30476640)

It depends on what your definition of "is" is.

Re:Summary rounding error (2, Informative)

DragonWriter (970822) | more than 4 years ago | (#30476564)

Nitpick: So 39% is "almost 50%"??

Yes, it's almost 50% -- if you are, e.g., relating it to the nearest 25%. (Rounding it to the nearest 25%, it would be just plain 50%, not "almost 50%".)

I would have called that "almost 40%".

It's also almost 40% -- if you are, e.g., relating it to the nearest 10% (or 5% or 2%). And, in fact, 6.3/16 is also "almost 39.5%" if you are relating it to the nearest 0.5%, and "just over 39%" if you are relating it to the nearest 1%.

"Almost" means you are giving an approximation (and the direction the value differs from the approximation), not an exact figure. There are an infinite number of possible approximations for any given exact value. That something could be described as "almost 40%" does not mean it cannot also be described as "almost 50%" without any "rounding error", since "almost" does not specify the precision of the approximation being used.

Their new algorithm? (4, Funny)

seven of five (578993) | more than 4 years ago | (#30476222)

They just remove the naughty bits.

Re:Their new algorithm? (0)

Anonymous Coward | more than 4 years ago | (#30476968)

Ahh, the good ol’ days... removing the nipples from the prons to save precious bandwidth over a dial-up connection...

It is absolutely possible (5, Informative)

PhrostyMcByte (589271) | more than 4 years ago | (#30476224)

Bitrate is only part of the equation -- the H.264 spec allows for a number of different ways to compress video, and it's up to the encoder to find out which is best for your video. Even in the same encoder, you can tweak dozens of settings in ways that dramatically change output quality -- usually a trade off between time and size.

x264 has beaten every commercial encoder out there -- in some cases, on a level that would indeed render higher quality with half the bitrate.

Re:It is absolutely possible (1)

Kjella (173770) | more than 4 years ago | (#30476476)

x264 has beat every commercial encoder out there -- in some cases, on a level that would indeed render higher quality with half the bitrate.

Last I checked, x264 was just on par with or slightly below some commercial encoders with a standard profile. But x264 tends to have a bunch of OCD users who don't quit until they've found just the right grain settings and tweaks for a given show or movie, which is usually what gives it the edge.

Re:It is absolutely possible (5, Interesting)

Silverlancer (786390) | more than 4 years ago | (#30476574)

The main change in the past year has been the psy optimizations that were added; before the psy optimizations, x264 was roughly on par with Mainconcept, one of the better commercial encoders. The psy optimizations--adaptive quantization and psy-RD (both on by default)--put x264 way over the top. Recently, the new MB-tree algorithm (also on by default) has boosted quality quite a bit as well. The main catch with psy optimizations is that they're nearly impossible to measure mathematically, and in fact, unless you disable them, they will make the "mathematical" measures of quality (mean squared error/PSNR) much worse.

It's always nice when free software solutions trash the commercial alternatives.
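
For the record, the "mathematical" quality measure that psy optimizations deliberately sacrifice is a one-liner (NumPy sketch; x264's --tune psnr switch exists precisely to disable psy features for this kind of benchmarking):

    import numpy as np

    def psnr(reference: np.ndarray, encoded: np.ndarray, peak: float = 255.0) -> float:
        # Peak signal-to-noise ratio in dB, computed from mean squared error.
        # Psy-tuned encodes can look better to humans yet score worse here.
        mse = np.mean((reference.astype(np.float64) - encoded.astype(np.float64)) ** 2)
        return float("inf") if mse == 0 else 10 * np.log10(peak ** 2 / mse)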

of course it's POSSIBLE... (1)

jamescford (205756) | more than 4 years ago | (#30476250)

Compression is just the discarding of irrelevant or less relevant information. With images or video, that means keeping the perceptually meaningful content and discarding the rest. An improvement might come about if the encoder was removing irrelevant variations (noise), or smoothing out unnecessary details away from perceptually salient objects (making them easier to see).

It's pretty hard to make an image encoder that maintains the important perceptual qualities of every possible image, so IF their encoder is good, maybe they just didn't test it on the whole range of stuff they eventually used it on.

Sure! (0)

Anonymous Coward | more than 4 years ago | (#30476262)

is it really possible to get better quality from a lower bitrate?

It is, if your original encoders sucked...

less is more (0)

Anonymous Coward | more than 4 years ago | (#30476266)

"We did extensive testing on the new encoders which showed that they could produce pictures at the same or even better quality than the old encoders"

I find this hard to believe, especially as there are already complaints in the iPlayer forum.

noise (1)

methano (519830) | more than 4 years ago | (#30476270)

Lower bit rates can reduce noise if it's of the high frequency, snap, crackle, pop variety. You get less information but it's more soothing. Some people prefer lower quality to higher quality because the high frequency stuff is annoying. One of the nice advantages of getting older is that they can really scrimp on quality and you can no longer tell the difference.

It is possible, but I don't think they did. (1, Informative)

Anonymous Coward | more than 4 years ago | (#30476278)

I'm far from an expert, but my understanding is that to a limited extent, you can make a trade-off between the bitrate and encoding/decoding time. H.264/MPEG-4 AVC is superior to older codecs, generally having both better visual quality and a lower bitrate, but it requires much more time to encode and requires more powerful hardware to decode the stream.

But my very loose understanding is that all they did was lower the bitrate and maybe conducted a test to see if some random idiots could tell the difference with ideal samples.

Crap HD Quality (3, Interesting)

TooTechy (191509) | more than 4 years ago | (#30476288)

Try watching a football game here in the US and you will see what crap quality can be. The turf turns into squares of blur when the camera moves, then returns to blades of grass when the picture is stationary. As soon as you spot it you will hate it. If you don't see it then OK for you.

I used to have a friend who could spot the two little circles in the top right of a movie in the theater telling the projectionist to change the reel. Once he saw them the movies were never the same again.

Re:Crap HD Quality (0)

Anonymous Coward | more than 4 years ago | (#30476388)

yep, ive seen fight club too

Re:Crap HD Quality (0)

Anonymous Coward | more than 4 years ago | (#30476516)

Epic lulz anonymous

Re:Crap HD Quality (0)

Anonymous Coward | more than 4 years ago | (#30476562)

But they're only circles for matted prints. If it's anamorphic, the circles will be stretched into ellipses. Irrelevant, perhaps, as you can tell from the aspect ratio (2.35:1 is anamorphic, 1.85:1 is matted), but it certainly provides conclusive verification.

digital movie theaters don't have them and they ha (0)

Anonymous Coward | more than 4 years ago | (#30476668)

Digital movie theaters don't have them, and they have better video there as well.

Re:Crap HD Quality (4, Informative)

Brett Buck (811747) | more than 4 years ago | (#30476672)

I think you might want to talk to your cable company on that one. I know the effect you are seeing (it's by far the worst on local public TV since they crammed 7 sub-channels into the same carrier), but network TV coverage of football in my area is pretty pristine for the most part. OTA is even better, but cable is still awfully good.

Of course, by "talk to your cable company", I mean "do nothing", because talking to the cable company is a complete waste of time.

Brett

It depends on the material (3, Informative)

Locke2005 (849178) | more than 4 years ago | (#30476290)

If you're watching a soap opera, you only need to see a few frames per week to follow the story. If you are watching a live sports event with a lot of action, most people will notice every dropped frame and compression artifact (I've noticed myself while watching the Olympics via satellite feed.) Methinks they did the testing on a relatively static video. Video compression works by (among other methods) creating a key frame, then sending diffs off that key frame for several frames. If every frame is completely different, compression does not work well.
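
A toy sketch of the keyframe-plus-diffs idea (NumPy; real codecs use motion-compensated prediction and entropy coding, not raw subtraction):

    import numpy as np

    rng = np.random.default_rng(0)
    frames = [rng.integers(0, 256, (4, 4), dtype=np.int16) for _ in range(3)]

    # Store frame 0 whole, then only per-pixel changes for later frames.
    keyframe = frames[0]
    diffs = [frames[i] - frames[i - 1] for i in range(1, len(frames))]

    # Decoder side: accumulate the diffs on top of the keyframe.
    rebuilt = keyframe.copy()
    for d in diffs:
        rebuilt += d
    assert np.array_equal(rebuilt, frames[-1])

    # Static scenes make the diffs mostly zero, which codes almost for
    # free; if every frame differs, the diffs are as big as the frames.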

Still a way to go... (2, Informative)

TheRaven64 (641858) | more than 4 years ago | (#30476330)

For reference, the BBC HD content on iPlayer is 3.5Mb/s for 720p (no higher quality available). 9.7Mb/s is less than three times as much, so it probably won't be long before the streaming and broadcast signals are the same quality.

iPlayer appears to use H.264 (2, Insightful)

tepples (727027) | more than 4 years ago | (#30476654)

But then iPlayer appears to use H.264, which allows for more efficient encoding than the MPEG-2 codec used for digital TV broadcasts.

Re:iPlayer appears to use H.264 (3, Informative)

TheRaven64 (641858) | more than 4 years ago | (#30476766)

BBC HD also uses H.264 for terrestrial and satellite broadcasts. It's only if you have Virgin Media cable that you get the stream transcoded to MPEG-2.

BBC evil Jedi (1, Funny)

jdgeorge (18767) | more than 4 years ago | (#30476348)

BBC accountant: We provide the $ame or better picture quality with half the bitrate! Just think of the $aving$!

BBC IT decision maker: I $ee what you're $aying.... The$e picture$ look $uper!

Public: This looks like crap.

BBC rep: (waves hand) The$e aren't the compre$$ion artifact$ you're looking for. We can go about our bu$ine$$. There are no complaint$.

Depends on the codec (1, Redundant)

StikyPad (445176) | more than 4 years ago | (#30476428)

If they've switched from MPEG-2 to MPEG-4, then yes, you can get equal or better quality at a lower bitrate.

Test video (4, Funny)

bugs2squash (1132591) | more than 4 years ago | (#30476430)

was the featureless black-screen video set to John Cage's 4'33". Results were far better at the lower bitrate. The absolute darkness was less blurry.

Quite a bit left out (1)

sunking2 (521698) | more than 4 years ago | (#30476502)

The article talks about bitrate, which implies no change in codec, so it remains MPEG-2. I'm assuming the BBC is available OTA, so unless they want everyone with an HD-ready TV to have to get a new receiver box, they can't just change to H.264, etc. So in this context, the answer is no: using the same codec at a reduced bitrate can't produce better quality than the source. However, that assumes you are comparing to the original source.

Take, for example, a standard DVD player, which plays an MPEG-2 file at 480i. The resulting image on a very large 60+" HDTV may appear blocky in some situations. A good DVD player will be able to interpolate and massage things around so that the resulting 1080i image on your screen does indeed look better, although the image itself may be quite different from the original pure source. So you have a perceived better frame image, although it may be quite a bit changed from the source.

Re:Quite a bit left out (1)

Rising Ape (1620461) | more than 4 years ago | (#30476582)

BBC HD is a separate channel from normal BBC broadcasts, and uses H.264 rather than MPEG2.

The standard definition channels remain MPEG2 for compatibility.

Re:Quite a bit left out (1)

sunking2 (521698) | more than 4 years ago | (#30476906)

Are you sure about that? At least according to Wikipedia they have two formats, and OTA is still MPEG-2. The fact that they may be doing both raises some other questions, like whether OTA is seeing this as well. Cutting back on the OTA bitrate makes zero sense, so maybe this is only talking about their sat/cable distribution?

Sure! (0)

Anonymous Coward | more than 4 years ago | (#30476540)

Just look at carbon... carbon alone without any compression isn't quite nice...
But COMPRESS it and you have something much higher in quality...... ;)

Ummm .... (1)

gstoddart (321705) | more than 4 years ago | (#30476560)

In her blog post, Ms Nagler said that the service was created to be at its best for "typical viewing set ups" and that user groups with standard equipment were happy with the service.

Is she saying that they've optimized their HD for people without HD screens, or am I just confused?

It really does sound like they're trying to sell something which isn't HD, but gets sold as if it is. Strange. "Now, to serve you better, we are open fewer hours."

Cheers

A better model beats higher bitrate every time (3, Insightful)

Edgewize (262271) | more than 4 years ago | (#30476578)

Lossy compression formats depend on an understanding of human perception. Nobody has a perfect model of the human brain, and nobody has a perfect algorithm for deciding what data to keep and what data to throw away.

If you have a better model of human perception than your competitors, then your encoder will yield higher quality output. If you spend 50% of your bits on things that nobody will notice, and I spend 25% of my bits on things that nobody will notice, then my 650kbps stream is going to look better than your 900kbps stream.

LAME did not win out as the MP3 encoder of choice just because it is free. It won out because its psychoacoustic model yields better-sounding MP3s at 128kbps than its competitors managed at 160kbps or even 192kbps.
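
The bit-allocation arithmetic above checks out (trivial Python, using the comment's own hypothetical percentages):

    yours = 900 * (1 - 0.50)   # 450 kbps left for detail people notice
    mine = 650 * (1 - 0.25)    # 487.5 kbps left for detail people notice
    print(mine > yours)        # True: the lower-bitrate stream wins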

Almost 50% (1)

sexconker (1179573) | more than 4 years ago | (#30476786)

16 -> 9.7?
Try almost 40%.

But yes, still a terrible thing to do.

The current encoders they use are trash.

Dark scene? ENJOY YOUR MAGENTA!
Fast scene? I HOPE YOU LIKE GREEN.

None too good on this side of the pond either (1)

renger (1607815) | more than 4 years ago | (#30476988)

The pay HD movie channels have terrible encoding, for the most part. HBO HD, SHO HD and so on exhibit significant coding artifacts during high-motion scenes. A notable exception appears to be HDNet Movies: they can faithfully reproduce all manner of complex and fast-changing content; it would be nice if the well-funded big boys followed suit.

Speculation is that the big-name networks utilize bandwidth-constrained HD feeds intentionally. The majority of their last-mile distribution partners (DBS satellite and terrestrial) are capacity limited. Not much use in sending a 16Mbps MPEG-2 HD signal to Comcast if they recompress and statmux multiple channels together into an over-committed modulator. The FiOS guys have stated that they will not recompress any feeds they receive; they promise to deliver the full bandwidth that they get from their suppliers. HDNet Movies looks very clean. Wish the big movie guys would provide FiOS with higher-fidelity HD feeds to deliver.

spectrum (1)

DaveGod (703167) | more than 4 years ago | (#30477090)

I suspect the move is connected with Freeview (a non-profit organisation, of which BBC is a member, that runs the free digital broadcasts) and the digital switchover. The BBC are probably thinking that if they cut their HD bitrate there can be more HD channels (and I assume more people would be able to get a good-enough signal). Or it just costs less.
