
When is 720p Not 720p?

Hemos posted more than 9 years ago | from the tricks-of-the-eye dept.

Television 399

Henning Hoffmann writes "HDBlog has an interesting entry about many home theater displays.
Home theater displays around the resolution of 720p (most DLP, LCD, and LCOS displays) must convert 1080i material to their native resolution for display. No surprise there. But many displays do this by discarding half of the 1080i HD signal, effectively giving 720p viewers an SD signal - not watching HD at all! "


Reminds me of Sound Blaster (4, Insightful)

Anonymous Coward | more than 9 years ago | (#12407538)

This sounds like the visual version of what Creative Labs has been doing for YEARS with their Sound Blaster audio cards. With most other cards, if you want to record at a sample rate of 44.1 kHz, you record at 44.1 kHz, but even with the newer Sound Blaster cards it must be resampled to 48 kHz first.

It doesn't matter if you are sampling up or down, resampling is bad; your best bet is to find a device without it, or, if it is necessary like in this case, the one that does the best conversions.

If I bought one of these displays I would be pretty pissed, but I doubt there is much that can be done about it; if you COULD do something, then companies like Creative Labs would be out of business.

Re:Reminds me of Sound Blaster (5, Informative)

Shkuey (609361) | more than 9 years ago | (#12407752)

Actually this is an issue of giving people what they want. In this case an HDTV that isn't a thousand bucks more expensive and doesn't have a video processing delay.

The first incorrect thing in the /. post is that this is somehow standard definition. It's not; 540 lines is more than 480. Not only that, but they process 1920 pixels of horizontal resolution (scaled down to 1280 for a 720p display), which is quite a bit more than 640.

Anyone who is serious about getting the absolute most out of their display will have an external scaler and a device to delay the audio. Frankly as digital display technologies take more of a foothold in the market I'm hoping these interlaced resolutions will become far less common.

When I first read the headlines I thought they would perhaps talk about 1024x768 plasmas with rectangular pixels being marketed as 720p. That kind of thing is far more blasphemous in my opinion.

So in summary of TFA: 720p is not 720p when it's 1080i.

Re:Reminds me of Sound Blaster (0)

Molly Lipton (865392) | more than 9 years ago | (#12407756)

This is always how things go. Companies always look for ways to do just a little less to increase profits. The only cure is an informed consumer.

On the other hand, in the case of HDTV, we have a government mandated standard that manufacturers must meet by, I believe, January 2006 (so real soon!). It is no surprise that when the government interferes with Free Market activity this way, that problems like this crop up. Some manufacturers simply aren't prepared for the digital revolution, but it's do or die, so they start cutting corners.

It would be ideal if we just left the various corporations involved to create their own open standards and prepare for them in their own good time, rather than using Statist intervention to push technology along. Ultimately, the legal issues surrounding the government mandated crossover will probably be a hindrance to whatever new technology succeeds HD.

It's time to get the government out of the technology business!

Re:Reminds me of Sound Blaster (3, Informative)

GutBomb (541585) | more than 9 years ago | (#12408092)

the government mandate is for DIGITAL broadcast. it has nothing to do with HDTV. there are digital over-the-air broadcasts of SD content as well, and the government mandate says that all TVs made after January 2006 must be capable of receiving digital ATSC over-air signals, and that all over-the-air broadcast networks start broadcasting digital ATSC signals.

Re:Reminds me of Sound Blaster (1)

Zemplar (764598) | more than 9 years ago | (#12407812)

Vote with your money and encourage everyone you know to do the same!

I generally dread every product made by "Creative" Labs as well, so I simply don't buy anything they make. Turtle Beach's Santa Cruz was the edge, but now I use M-Audio's Revolution 7.1 in new box builds due to the Santa Cruz being out of production.

Then again, if audio isn't important, the onboard sound isn't much worse than anything made by CL IMHO!

Re:Reminds me of Sound Blaster (1)

springbox (853816) | more than 9 years ago | (#12408069)

Then again, if audio isn't important, the onboard sound isn't much worse than anything made by CL IMHO!

Damn right. I ejected my Sound Blaster Live! 5.1 because its faulty WDM drivers were giving me major issues in Windows XP. It was really more than I ever needed. My integrated Realtek audio codec does everything I want and does it right.

Re:Reminds me of Sound Blaster (0)

Anonymous Coward | more than 9 years ago | (#12408096)

I agree, Sound Blaster products are not as good as they make out to be. Okay for gaming systems but rubbish for anything relating to hifi audio.

This is probably way OTT, but in the quest for decent sound out of a PC I have made an external USB 'sound card' which uses a Burr-Brown DAC. The advantage is that digital audio is output directly from the PC without any interference.

I've heard stories of Sound Blaster cards' digital output containing hiss when converted back to analogue - WTF!!!

Anyway have a look here [hepso.dna.fi] for instructions on how to make the external USB Dac.

HTPC wins again (0)

Anonymous Coward | more than 9 years ago | (#12407539)

Use an HTPC and an RGB or DVI input for the display. The HTPC will use the video card to process the signal. Programs like VideoLAN have modes to properly format the video. Case solved?

Same thought (2, Insightful)

SuperKendall (25149) | more than 9 years ago | (#12407645)

I had the same line of thought: if you just use an HTPC and a monitor that can display data from it (like a projector), then you are all set - as long as whatever feeds HDTV into your HTPC for display is properly doing the conversion. That would be interesting to know: how are current HDTV cards for PCs doing any scaling? I guess they just dump the feed to disk and then it's up to the players, which hopefully use the great horsepower of a PC to scale properly...

When is 720p not 720p? (0, Offtopic)

FlyByPC (841016) | more than 9 years ago | (#12407542)

...When converted from pounds Sterling to Euros?

FP? (-1, Offtopic)

Supernoma (794214) | more than 9 years ago | (#12407544)

Nothing for me to see here?

720p or not 720p... (-1)

Anonymous Coward | more than 9 years ago | (#12407548)

...that is the question.

...this really sucks. (-1)

Anonymous Coward | more than 9 years ago | (#12407557)

good thing I now know about this :)

First!! (-1)

Anonymous Coward | more than 9 years ago | (#12407562)

First Post!

What does Sony and others have to say about that? (1)

alexandreracine (859693) | more than 9 years ago | (#12407572)

Come on giants! I am waiting for your answers!

Re:What does Sony and others have to say about tha (1)

xerid (235598) | more than 9 years ago | (#12407664)

What's sad is that the engineers for these companies are keeping quiet. What ever happened to having ethics for your profession?

Three letters... (1)

Gruneun (261463) | more than 9 years ago | (#12407710)

NDA

Re:What does Sony and others have to say about tha (0)

Anonymous Coward | more than 9 years ago | (#12407953)

If one were to do so, you'd create bad press about your company first, and the industry in general second. That's the fastest way to unemployment.

It's there (5, Funny)

Anonymous Coward | more than 9 years ago | (#12407585)

No surprise there. But many displays do this by discarding half of the 1080i HD signal, effectively giving 720p viewers an SD signal - not watching HD at all!

The HD signal's still there... you just have to learn how to read between the lines.

For the inevitable /.ing (5, Informative)

Anonymous Coward | more than 9 years ago | (#12407587)

Here is tfa for you...

When is 720p not 720p?

Tom Norton, in his coverage of the Home Entertainment expo, brought something up that I was unaware of.

720p displays show native 720p signals directly, of course. They also upconvert SD signals (like DVD) up to 720p for display. And 720p displays must convert incoming 1080i signals to 720p before they can be displayed. No surprise there, this makes sense. But, Silicon Optix claims that most manufacturers do the 1080i conversion just by taking one 540 line field from each 1080i frame (which is composed of two 540 line fields) and scaling that one field up to 720p, ignoring the other field. Reason being, it takes a lot less processing power to do this than to convert the image to 1080p and scale that, which would use all the information in the original signal to derive the 720p signal. If you have a display like this, it means that you're watching 540 lines of resolution upconverted to 720p. This is not HD, just like watching a DVD upconverted to 720p is not HD. Sure, you'll get the full width of the 1080i resolution, but you're only getting half the height. While this is better than DVD, it's not HD in my mind. (Aside: Tom Norton mentions this in his review of the Screenplay 777 projector.)

If this is indeed the case, most people with 720p (or similar) projectors (and most DLP, LCD, and LCOS home theater projectors are exactly that) are not seeing what their displays are capable of. They're not, technically, even watching HD. This is crazy! How can this be? Why haven't we heard of this before? How are manufacturers getting away with it?

Over-reacting? Well, if you're an owner of a 720p (or any similar resolution) projector you're either gonna be really upset by this or you're just gonna be laissez-faire about it because there's nothing you can do and you're enjoying your projector just fine, thank you. But me, I don't even own any such projector and I'm a little ticked. But I guess I should really wait for evidence of how properly-done conversion looks in comparison before making any snap judgements. I'm sure that the people selling HQV (a processor chip that does it the RIGHT way) will set something up.

To me, this is a serious issue. Comments are welcome.

from: http://www.hdblog.net/ [hdblog.net]
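The field-dropping shortcut Silicon Optix describes is easy to sketch. A toy numpy illustration (the function name is mine, and nearest-neighbor scaling stands in for whatever filter a real chip uses):

```python
import numpy as np

def cheap_1080i_to_720p(frame_1080):
    """The shortcut described above: keep only one 540-line field
    of the interlaced frame, then scale it up to 720 lines,
    ignoring the other field entirely."""
    field = frame_1080[0::2]               # one field: 540 of the 1080 lines
    idx = np.arange(720) * 540 // 720      # nearest-neighbor 540 -> 720 map
    return field[idx]

# toy "image": row i holds the value i, so we can see which rows survive
frame = np.arange(1080).repeat(4).reshape(1080, 4)
out = cheap_1080i_to_720p(frame)
print(out.shape)  # (720, 4) -- but built from only the even source rows
```

Every value in the output comes from an even-numbered source row: the odd field never contributes, which is exactly the "540 lines upconverted to 720p" complaint.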

Re:For the inevitable /.ing (4, Informative)

Overzeetop (214511) | more than 9 years ago | (#12408002)

That's all well and good, but I'm afraid I tend to agree with them. If content providers want to "do it right" they should ditch the 1950's interlacing and get with the 1980s.

He's leaving one step out. 1080i is 540 lines scanned 60 times per second, offset by half a vertical pitch. 720p is 720 lines scanned 30 times per second.

To try and take two fields which are not occurring at the same instant, stitch them together, remove the motion artifacts, resample, and then display is just plain silly. And fraught with errors, as you are expecting a computer to determine which parts of the motion (over 1/60 of a second) to keep and which to throw away.

If you wanted high fidelity, you'd spend the money for a 1080p60 system. Then it wouldn't matter. Except that you would complain about the quality, because each frame you see was upsampled from only 540 lines of resolution.

It all comes back to the fact that the FCC let the industry choose this "18 formats is good" spec.

Personally, I'm in favor of an Olympic standard mayonnaise, but... no... wait... awww hell, I give up.

Resampling (5, Insightful)

FlyByPC (841016) | more than 9 years ago | (#12407592)

There's got to be a fairly straightforward formula relating inherent resolution loss when performing any noninteger upsampling, or any downsampling. Any other change in resolution must necessarily degrade the signal, yes? (Except perhaps if a clever algorithm could losslessly encode the original data in a 1.5x-upsampled version, without distorting it.)

The clever algorithm is a "Fourier transform" (5, Informative)

roystgnr (4015) | more than 9 years ago | (#12407764)

And when you use it to upsample data, it is a lossless encoding that doesn't degrade the signal (unless you deliberately throw away data - discrete Fourier transforms are also used in lossy encoders).

It's not a distortion-free transform, since high frequency signals (e.g. sharp edges) in the original image get interpreted as smooth changes and can get blurred between multiple pixels in an upsampled signal. But then again, that's exactly the sort of thing that happens when you digitize a picture in the first place - if you have a sharp black/white edge that passes through the middle of a pixel, the most accurate thing you can do is make that pixel gray.
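The parent's claim can be checked in a few lines of numpy: zero-padding the spectrum upsamples a periodic, band-limited signal while leaving the original samples intact (a toy example with names of my choosing; real video scalers use polyphase filters rather than a full FFT):

```python
import numpy as np

def fft_upsample(x, factor):
    """Upsample a periodic, band-limited signal by inserting zeros
    at the high frequencies of its spectrum."""
    n = len(x)
    m = n * factor
    X = np.fft.fft(x)
    # keep low frequencies at both ends, pad zeros in the middle
    Xpad = np.concatenate([X[:n // 2], np.zeros(m - n), X[n // 2:]])
    return np.fft.ifft(Xpad).real * factor

x = np.cos(2 * np.pi * np.arange(8) / 8)   # one cycle, 8 samples
y = fft_upsample(x, 2)                     # now 16 samples
print(np.allclose(y[::2], x))              # True: original samples survive
```

The new in-between samples are the band-limited interpolation of the originals, which is the "lossless encoding" sense meant above (the toy signal has no energy at the Nyquist bin, sidestepping that edge case).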

Re:Resampling: Imagine a 1-pixel-wide line (4, Informative)

G4from128k (686170) | more than 9 years ago | (#12407862)

There's got to be a fairly straightforward formula relating inherent resolution loss when performing any noninteger upsampling, or any downsampling.

It's a bit messy. Imagine a 1080i image with a 1-pixel-wide sloping black line that is nearly horizontal on a white background. If you throw out half the data, you create an image with a dashed line. Gaps in the line occur where the slanting line cut across the rows that were discarded. If you upsample from 540 to 720, you will find that the remaining dashes become fattened non-uniformly. In places where a row in the 720-row image falls directly on top of a row in the 540-row image, the line will be thin and dark. In places where a row in the 720-row image falls midway between rows in the 540-row image, the line will be wide and less dark. The end result is that the thin, uniform black line is converted to a dashed line of varying thickness and darkness -- not pretty.

Even if you resample directly from 1080 to 720, you still run into problems where the 720-row image pixels fall between the 1080-row pixels. At best, you can use higher-order interpolation (e.g. cubic) to try to fit a curve through the original data and estimate what was in the middle of the pixels so they can be shifted halfway over. But the result will never look like an image that was taken with a 720-row camera in the first place.
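The "pixels fall between pixels" point is easy to see numerically; a minimal sketch (linear interpolation here standing in for the higher-order cubic mentioned above):

```python
import numpy as np

# Resampling 1080 rows down to 720: each of the 720 output rows lands at
# a fractional position between the 1080 source rows, so its value must
# be interpolated rather than copied.
rows = np.arange(1080, dtype=float)        # source row positions
src = np.arange(1080, dtype=float)         # stand-in for one column of pixels
pos = np.linspace(0, 1079, 720)            # where the 720 output rows sample
dst = np.interp(pos, rows, src)            # linear; real scalers use cubic+

# almost every output row falls strictly between two source rows,
# so almost every output pixel is an estimate, not an original value
between = int(np.sum(pos != np.round(pos)))
print(between, "of 720 rows are interpolated")
```

Only the first and last output rows line up exactly with source rows; everything else is reconstructed, which is why the resampled image can never match a native 720-row capture.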

Re:Resampling (0)

Anonymous Coward | more than 9 years ago | (#12407961)

The problem is not so much the interpolation from 1080 lines to 720 lines. There are excellent algorithms which easily produce better results than dropping every other line. The problem is that 1080 is 1080i and 720 is 720p, where i means interlaced and p means progressive. Every other line of a 1080i signal is from a different time (1/60 second later than the other lines). You can't just downsample a full 1080i frame which is composed by weaving the two fields together. You have to "deinterlace" first: create a progressive frame by compensating for motion. Only then can you downsample without horrible artifacts. Good deinterlacers are expensive, even for SD resolutions (look for "Faroudja"). The interpolation is child's play compared to deinterlacing.
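The combing problem from weaving two fields shot at different times can be shown with a toy example (all names and values here are made up for illustration):

```python
import numpy as np

# Two 540-line fields captured 1/60 s apart: a bright object has moved
# between them.  Weaving them into one 1080-line frame interleaves two
# moments in time -- the artifact deinterlacing has to undo.
even = np.zeros((540, 8)); even[:, 2] = 1.0   # object at column 2 at time t
odd  = np.zeros((540, 8)); odd[:, 5] = 1.0    # object at column 5 at t + 1/60

woven = np.empty((1080, 8))
woven[0::2] = even    # even lines from the first field
woven[1::2] = odd     # odd lines from the second field

# adjacent lines now disagree about where the object is ("combing"):
print(woven[0].argmax(), woven[1].argmax())   # 2 5
```

Downsampling this woven frame blends the two positions together, which is why the parent says you must deinterlace (with motion compensation) before scaling.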

Which Models? (5, Interesting)

goldspider (445116) | more than 9 years ago | (#12407595)

Is there any way of telling which sets do this? This is certainly something I'd like to know before I dropped that kind of cash.

Re:Which Models? (5, Interesting)

David Leppik (158017) | more than 9 years ago | (#12407965)

Is there any way of telling which sets do this? This is certainly something I'd like to know before I dropped that kind of cash.
Yes. Go to a showroom and look at the displays. If you see some that have greater vertical resolution than the non-HD models, there you go. If you can't see a difference, then it doesn't make a difference.

If there is a difference you can't see but could learn to see, don't learn; it will not bring you joy, it will only make you miserable or annoying. Long ago I learned to see the FFT distortion in JPEG and MPEG images. Has it made me happy? No. I end up making the JPEGs on my website bigger than everyone else's so I won't see wrinkles on people's faces that are apparently invisible to everyone else. And I can't stand to watch satellite television on a big screen TV because of the annoying compression artifacts.

Bottom line (0)

Anonymous Coward | more than 9 years ago | (#12407601)

720p is not 720p when its not 720p. Got it?

Re:Bottom line (0)

Anonymous Coward | more than 9 years ago | (#12407815)

When is a contraction not a contraction?

When it's missing an apostrophe.

Got it?

misty (0, Offtopic)

MetalliQaZ (539913) | more than 9 years ago | (#12407606)

Oh, I think I'm getting a little misty over here. It's just so sad... all those 720p owners, watching TV at slightly less quality than HD, it's just so... heartbreaking.

In other words: cry me a river.

-d

Re:misty (0, Flamebait)

tomstdenis (446163) | more than 9 years ago | (#12407629)

More so...

HAHAHAHAHAHAHA

Congrats for buying into expensive "entertainment" only to be duped by the lying thieves that are the home appliance manufuckturers.

I got my NTSC decoder and I'm damn well happy.

Tom

Re:misty (1, Insightful)

Anonymous Coward | more than 9 years ago | (#12407638)

Yeah I mean how dare they complain, so what if they spent their hard earned cash expecting to get a device with the specs that were claimed. Shame on them!

(what the hell kind of attitude is that)

Re:misty (0)

Anonymous Coward | more than 9 years ago | (#12407737)

I think at least half of the humor is that most of the "victims" were schmucks with more money than brains, who wouldn't ever have noticed the problem were it not for someone else pointing it out.

Really, if you're buying this stuff just to make a statement about how hip and financially-endowed you are, are you even getting ripped off if it doesn't meet tech specs? I mean, it still functions as a prosthetic cock, right?

Re:misty (3, Insightful)

eyegor (148503) | more than 9 years ago | (#12407868)

There has been a similar issue for years with audio amplifier specs.

Mfgrs usually tout their amps as having "200 watts of pulsing music power," which usually means 100 watts per channel peak. In reality it's more like 70.7 watts/channel RMS (assuming they're not still lying).

Workaround is to use an HTPC... (5, Informative)

Rectal Prolapse (32159) | more than 9 years ago | (#12407620)

A Home Theater PC with good-quality parts, drivers, and decoders will preserve the 1080i signal - it will combine the 1080i field pair into a single 1080p frame, and then downconvert (i.e. downscale) to 720p.

As a reference, my Athlon XP running at 2.4 GHz (approximately equivalent to an Athlon XP 3400+) with a GeForce 6800GT and TheaterTek 2.1 software will have little trouble achieving this, assuming the 1080i source isn't glitchy itself.

Alternative is to use the NVIDIA DVD Decoder version 1.0.0.67 ($20 US after 30 day trial) and ZoomPlayer 4.5 beta ($20 beta or nagware) for similar results.

TheaterTek is roughly $70 US and includes updated NVIDIA DVD decoders - too bad NVIDIA hasn't updated their official DVD decoders with the bugfixes that are present in the TheaterTek package.

Forgot to mention HDTV tuner card... (1)

Rectal Prolapse (32159) | more than 9 years ago | (#12407680)

You will also need a decent HDTV tuner card - but, I don't know much about them. http://www.avsforum.com/ [avsforum.com] is the place to go if you need info on that.

Unfortunately, since I live in Calgary, Canada, HDTV service is very sparse...I basically download HDTV 1080i content from the internet - usually trailers or free NASA HD content.

Re:Workaround is to use an HTPC... (0)

Anonymous Coward | more than 9 years ago | (#12407711)

Amazing. So the alternative to using a piece of hardware that bones a display stream is to...

Buy a different piece of hardware that handles it properly.

Genius!

You got it! (1)

Rectal Prolapse (32159) | more than 9 years ago | (#12407803)

:)

Basically, you want to bypass your projector's scaler. If the source hardware always passes 720p to your native 720p device (preferably using DVI, or if you know your display won't rescale 720p to 720p - yes, some do this horrid practice!), then you will avoid the problem.

Re:Workaround is to use an HTPC... (1)

Ian Peon (232360) | more than 9 years ago | (#12407773)

I don't care what you do with the signal, DVDs only have SD information encoded on them (480i). So, snaps to you if you're watching HDTV via your HTPC, but now why the hell are you sending everything 1080i? Do you have a 1080i native display? I'll bet you don't - cause they're still pretty damned expensive.

OK, then let's look at your DVD signal path. 480i converted to 1080i, then sent to your display that converts it to 720p?? Two resolution conversions - and the article states that the second one may only give you 540p. No matter how you slice it, it is far better to simply give your display its native resolution whenever possible - set your clever HTPC to output 720p. Now you're doing 480i converted to 720p - and that's it!

Re:Workaround is to use an HTPC... (1)

Rectal Prolapse (32159) | more than 9 years ago | (#12407852)

Yep, that's what I meant to say! I wasn't too clear...

If your display is 720p, set your HTPC to output 720p. That's it - the decoder and videocard's builtin scaler will take care of the rest!

Just hope that the decoder/videocard's deinterlacer is up to snuff...

1080i streams... (1)

Rectal Prolapse (32159) | more than 9 years ago | (#12408056)

I should also mention that TheaterTek, NVIDIA DVD Decoders, ZoomPlayer, etc. are not limited to SD resolution DVDs - they can play recorded 1080i content encoded in mpeg2 transport or program stream files.

If you have Microsoft's Media Center Edition 2005, you can specify the NVIDIA DVD Decoder (or other competent mpeg2 decoder, such as Elecard's or WinDVD's) for all mpeg2 content including HDTV.

Re:Workaround is to use an HTPC... (0)

Anonymous Coward | more than 9 years ago | (#12408095)

The grandparent failed to point out that the DVD player is just the product that provides the MPEG2 decoder. Recent decoders are capable of decoding HDTV MPEG2 streams, so they're not just for watching DVDs anymore. A PC with a good video card can decode a 1080i live TV stream, do all the deinterlacing and downsampling in high quality and provide the display with a 720p signal, so the crappy scaler in the display is never used.

Re:Workaround is to use an HTPC... (0)

UnrefinedLayman (185512) | more than 9 years ago | (#12408121)

Do you have a 1080i native display? I'll bet you don't - cause they're still pretty damned expensive.
I do -- it came with my sub-$900 Centrino laptop with a Radeon X300. Displays 1080i with room to spare, no less.

Televisions may be another story, but there are cheap 1080i displays to be had.

combining field pairs... (0)

Anonymous Coward | more than 9 years ago | (#12407992)

If you combine the two 1080i fields into a single 1080p frame, you now have only 30 frames per second. 720p is 60 frames per second. Also, you'll have massive interlacing problems, since the two fields you combined were from different points of time (1/60th second apart) and you're showing them simultaneously.

No, the only proper way to do it is to convert each 1080i field on its own into a 720p frame. It's not really tough, any computer flat panel has the required filtering circuitry inside, and I know my TV does it, thank you. I didn't buy a junky TV.
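A per-field conversion like the parent describes might look like this (a hedged numpy sketch: the function name is mine, and plain linear interpolation stands in for a real scaler's filtering circuitry):

```python
import numpy as np

def field_to_720p(field):
    """Convert one 540-line field directly into a 720-line frame by
    interpolating the new row positions.  Since each of the 60 fields
    per second yields one frame, the 720p output stays at 60 fps."""
    src_rows = np.arange(540, dtype=float)
    dst_rows = np.linspace(0, 539, 720)
    out = np.empty((720, field.shape[1]))
    for c in range(field.shape[1]):        # interpolate each column
        out[:, c] = np.interp(dst_rows, src_rows, field[:, c])
    return out

field = np.random.rand(540, 4)             # one toy 540-line field
frame = field_to_720p(field)
print(frame.shape)  # (720, 4)
```

Because each field is handled on its own, no two moments in time are ever mixed, avoiding the combing problem of weaving fields together.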

When is 720p Not 720p? (5, Insightful)

t_allardyce (48447) | more than 9 years ago | (#12407630)

When the un-washed masses can't actually tell the difference (they can't even see DCT blocking) and you can get away with selling this crap to them..

Somewhat offtopic, but still.. (1)

trezor (555230) | more than 9 years ago | (#12407791)

Which can also be the only explanation for why anyone would try to encode HD content on today's DVDs.

I can't imagine anything other than the DCT artifacts being even more dominant than on today's DVDs (or DVD rips).

DCT Blocking (0)

Anonymous Coward | more than 9 years ago | (#12407832)

And you say that you can see the DCT blocking on live video? Not freeze frame, but actual motion going on, with no uncorrected errors in the input stream?

On the other hand, if you freeze a complex image, especially ones with edges abutting flat fields, then I don't expect they're too hard to pick out.
That's the trade-off in compression. If you have a lot of action, you need to turn up the compression, which causes more artifacts, but they are hidden by the motion. You can see the artifacts if you stop the motion, but the format wasn't designed for stopped images.

And if the transport layer breaks down, say if you're at the end of a satellite link in a heavy storm, then it's pretty hard to miss the DCT blocks.

Re:When is 720p Not 720p? (0)

Anonymous Coward | more than 9 years ago | (#12408012)

When the un-washed masses can't actually tell the difference (they can't even see DCT blocking) and you can get away with selling this crap to them..

Hey, it worked for selling 32-bit color video cards to Joe Average; why stop there?

Re:When is 720p Not 720p? (1)

AaronGTurner (731883) | more than 9 years ago | (#12408027)

In a fairly recent demo of HDTV one of the manufacturers demonstrated 1080i on a 720p display and even the washed elite of the press largely failed to notice. So what chance do the unwashed masses have?

If you can't see the problem, is there a problem? (3, Informative)

... James ... (33917) | more than 9 years ago | (#12407637)

I have a 720p projector paired with a 110" screen. Both 720p and 1080i material look fantastic. Maybe the supposed degradation would be visible side-by-side with a native-resolution projector, but I certainly wouldn't worry about it based on what I've been watching.

Well, a little worse, actually... (3, Informative)

AtariDatacenter (31657) | more than 9 years ago | (#12407647)

The article says: Sure, you'll get the full width of the 1080i resolution, but you're only getting half the height.

Except your 720p display will hopefully have a horizontal resolution of 1280. 1080i video has a horizontal resolution of 1920. So you're keeping half of the vertical (1080 lines down to 540) and you're keeping 2/3rds of the horizontal (1920 down to 1280).

Ouch.

Re:Well, a little worse, actually... (1)

yubyub (173486) | more than 9 years ago | (#12407720)

Not really. Current camera technology can't provide full detail at 1920 pixels yet. We'll be there at some point, but not today.

Especially after you factor in the encoding/decoding and the relatively heavy compression most HD signals encounter at the encoding side, you get nowhere near 1920 pixels.

Re:Well, a little worse, actually... (1)

AtariDatacenter (31657) | more than 9 years ago | (#12407822)

Point taken, and I don't know who your provider is, or how high-end of a display you have, but I see a marked difference in resolution between 720p and 1080i video. I don't think it comes from the extra 360 vertical lines.

Re:Well, a little worse, actually... (3, Informative)

Ironsides (739422) | more than 9 years ago | (#12407882)

Current camera technology can't provide full detail at 1920 pixels yet

Hi, I just got back from the NAB show in Las Vegas last week. The vendors had HD cams that would film and record 1920x1080i. That "some point" is today.

Consider the source too! (3, Informative)

Rectal Prolapse (32159) | more than 9 years ago | (#12407755)

From my lurking on HDTV enthusiast sites, sometimes the broadcaster will take DVD content (480i) and upconvert to 1080i! It's a terrible practice.

And in other instances, the broadcaster will not use the full resolution - what looks like 1920x1080i may actually be an upconvert of 1280x1080i, 1440x1080i, or 1280x720! And then there is the overcompression - taking a 20 Mbit/sec mpeg2 stream and cutting the bitrate in half - compression artifacts galore.

It is sad when HDTV programming available in North America can be WORSE than the DVD!

You never know what you get in US (and other?) HDTV broadcasts. My understanding is that only the Japanese use minimal mpeg2 compression - I saw snippets of Contact (with Japanese subtitles) in full 1920x1080i at the maximum 20 Mbit/sec bitrate - and it was glorious!

Re:Consider the source too! (1)

TheSync (5291) | more than 9 years ago | (#12408008)

The truth is that a broadcast quality SD source (say from DigiBeta) is still of much higher quality than what you find on an analog NTSC signal (NTSC has a limited color space and its own resolution limits).

So most broadcast SD material upconverted to HD resolution still looks better than had the material been broadcast in analog NTSC.

Good Ol' CRT (5, Insightful)

goldspider (445116) | more than 9 years ago | (#12407651)

When I upgrade to an HD idiot box, I plan on sticking with tried-and-true CRT. IMHO, you can't beat the picture quality/price, and I have yet to hear a compelling reason to fork out thousands of dollars for the trendier offerings.

Re:Good Ol' CRT (0)

Anonymous Coward | more than 9 years ago | (#12407797)

I've got a HDTV that's a CRT--and yes, I don't think you can beat the price/quality. Plus, for video games you don't have to worry about screen burn-in as much as you would with some of the newer technologies.

Re:Good Ol' CRT (1)

Rosco P. Coltrane (209368) | more than 9 years ago | (#12407876)

I have yet to hear a compelling reason to fork out thousands of dollars for the trendier offerings.

The trendier offerings sell to first adopters and very rich people: those in the first group get their kicks inviting friends over to hear them go "ooh... ahh... wow," not really out of the better quality, and the second group just doesn't care about the price.

When the early adopters are done early-adopting, then it gets affordable for people with regular lives, like you and me.

Re:Good Ol' CRT (0)

Anonymous Coward | more than 9 years ago | (#12407879)

Let's see...
CRTs can't be that big. Go too big and it's usually projection, not CRT. Why? Because CRTs can't refresh a huge area fast enough. Why? Because they are an older technology being forced to do stuff it wasn't meant to do; they don't work well with fast refresh rates and large areas, and if they do, they don't work well for long.

Re:Good Ol' CRT (3, Insightful)

GatorMan (70959) | more than 9 years ago | (#12407899)

You must live on the first floor, and I also assume you don't relocate often. Try getting a 215 lb. 36" CRT HDTV up a couple flights of stairs. Even with the home delivery options you're lucky if the thugs don't damage your set, your building, or both.

Re:Good Ol' CRT (1)

AtariDatacenter (31657) | more than 9 years ago | (#12407921)

The reason to fork over thousands of dollars for trendier offerings? Picture size.

The reason Plasma/LCD are on the market wasn't because they were selected for image quality. Quite a ways from it. Instead, they were able to scale to sizes greater than 40".

Tube based HDTVs are only manufactured to about 36" today. And they are an awesome value for what you get... they've got an excellent image quality. The problem is that to fully resolve a 1080i image with your eye, you've got to be sitting pretty close to a tube of that size.

The ideal would be a full 1080i screen that is quite large, so you can sit a ways back and still get the full resolution. But a plasma or LCD of that size at 1080i/1080p? You're talking serious cash.

If you're willing to sit a little closer to your TV (say, 6-7' from a 34" tube), then CRTs are an excellent value. The Sony KD34XBR960 and KD34XS955 are the recognized reference standard when it comes to direct-view picture quality.

Re:Good Ol' CRT (3, Insightful)

The-Bus (138060) | more than 9 years ago | (#12408074)

Well, not sure what you mean by CRT but I would say the compelling reason is size. CRTs can only be so big. If you want to go bigger, you can go with what I call the "MTV Cribs" TVs, plasma/LCD, etc. or you can go with a quality RPTV or a projector. I have yet to see a plasma or LCD that has a better quality picture than a decent RPTV.

its really too bad (2, Insightful)

bassgoonist (876907) | more than 9 years ago | (#12407663)

Early adopters often get slapped in the face. I've been thinking about buying an hdtv for a long time. I'm really glad I read this before I bought one.

Re:its really too bad (1)

Mignon (34109) | more than 9 years ago | (#12408007)

I've been thinking about buying an hdtv for a long time.

Then I guess it's a little late to consider yourself an early adopter ;)

This is what you get..... (3, Informative)

nathanmace (839928) | more than 9 years ago | (#12407667)

This is what you get when you buy a "major" appliance without doing your research first. I know if I were planning on dropping anywhere from $700 to over $1,000 on something, I would be sure to find out everything about it so I could make an informed decision. Anyone who didn't do that got what they deserved.

That said, I'm sure there are a lot of people out there who "don't care". It works for them, and that is all they care about.

Re:This is what you get..... (2, Insightful)

Tiroth (95112) | more than 9 years ago | (#12407996)

The problem is, how do you do the research? The audio/video publications out there have not even come close to adopting a standard set of measurements that would quantify the performance of processors that need to perform complex tasks like scaling, 3:2 pulldown, etc. The results from different chipsets are all over the map (chroma key errors, cheats, lame algorithms), and it's rare to be able to get any information at all on new products. You just have to wait 6 months until someone that actually knows what they are doing throws a review up on the web.

Re:This is what you get..... (4, Insightful)

Zed2K (313037) | more than 9 years ago | (#12408016)

Kind of hard to "do your research" when you can't find out how the conversion takes place or can't understand how it all works. This is not a consumers fault kind of thing. This kind of information is not made known unless people ask the question. Who would have thought to ask a question like this?

Re:This is what you get..... (1)

hackstraw (262471) | more than 9 years ago | (#12408089)

This is what you get when you buy a "major" appliance without doing your research first.

Yeah, that's what I thought too, except that my research is usually about 2x better than the QA of most companies today. I've found that it's best to pay a premium or do without, and when paying a premium, to have some integrator or store stand behind the shitty QA of the products. Even then it's a pain in the ass, but it certainly beats buying something from a company like Newegg and paying a 15% restocking fee for them to restock something they should never have stocked.

What chips or products? (1)

bert33 (655799) | more than 9 years ago | (#12407677)

It would be nice to know which chips or products use this type of conversion. Currently I have my Motorola STB convert everything to 720p. This setup did seem to produce a better image than letting my Panasonic LCD projection set handle the conversion.

Should have bought a 1080i screen then! (5, Insightful)

node 3 (115640) | more than 9 years ago | (#12407678)

If the broadcast is 1080i, and your display isn't 1080i, I don't think it's logical to assume the quality of the downsampled video will be equivalent to a true 720p broadcast.

When I get around to buying a HD television (not any time soon, I do all my televisioning on my computer), it will be a true 1080i (are there 1080p televisions?) display so I'll know I'm getting the full potential of HD.

Unless I'm strapped for cash, of course, in which case I'll just suck it up and know my 720p won't be the best thing for watching 1080i content on.

On the plus side, it's important to get the facts out there for the consumer, who will likely (though not logically) assume they're getting more than they really are.

Re:Should have bought a 1080i screen then! (2, Informative)

Zed2K (313037) | more than 9 years ago | (#12407986)

There are 1080p DLP tv's coming this year from samsung, panasonic, mitsubishi.

Personally I'm getting a samsung 6168 model.

Re:Should have bought a 1080i screen then! (2, Informative)

chrisbolt (11273) | more than 9 years ago | (#12408003)

A 'true 1080i' display will not be able to display 720p as well as a 'true 1080p' display can. The best solution would be a true 1080p display, which isn't available yet, but will be in the next year or two.

Re:Should have bought a 1080i screen then! (1)

Physician (861339) | more than 9 years ago | (#12408065)

Yes there are 1080p televisions. The smallest and hence cheapest is the Sharp Aquos LC-45GD6U.

Hey, Bloggers... (5, Insightful)

Gruneun (261463) | more than 9 years ago | (#12407681)

Why not submit a link to the original article, rather than a link to your blog, which consists only of a link to the original article?

Otherwise, people might assume this is a shameless attempt to draw traffic to your site.

Exacerbate or mitigate? (1)

Covener (32114) | more than 9 years ago | (#12407699)

Which set-top (cable/satellite) receivers are doing that same stingy conversion when you've told them to output 720p?

Anyone with any real regard for picture quality, and whose equipment leaves them the choice, has probably evaluated it under both configurations anyway.

Which is the reason... (1)

Lord Apathy (584315) | more than 9 years ago | (#12407758)

This is the reason I bought a 52" rear-screen projection set rather than LCD/plasma and whatever. That, and it was 3000 bucks cheaper and had a better picture.

spatial vs temporal resolution (5, Informative)

Phearless Phred (67414) | more than 9 years ago | (#12407763)

Ok, here's the skinny. 1080i is 1920x1080 @ 59.94 fields/second, meaning at any one instant in time you're looking at a 1920x540 image made up of every other line of the picture (the odd lines, if you will). Then, ~1/60th of a second later, you see the even lines. 720p is 1280x720 @ 60 FRAMES per second, meaning at any given instant you're looking at EVERY line of the image, not just the odd or even ones. If you were to take all 1080 lines from the original signal, they wouldn't really map properly to a single 720-line frame, because half the data would be from ~1/60th of a second later. Scaling the fields up is really the best way to go, at least for material that was shot interlaced.
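The parent's field-versus-frame timing can be sketched in a few lines of Python (a toy model; the function names and hardcoded rasters are purely illustrative, not from any real decoder):

```python
# Toy model of what is actually on screen at one instant in 1080i vs 720p.
FIELD_RATE_1080I = 59.94   # interlaced fields per second
FRAME_RATE_720P = 60.0     # full progressive frames per second

def lines_in_1080i_field(field_index):
    """One 1080i field carries only the odd OR even lines (540 of 1080)."""
    start = field_index % 2               # fields alternate parity
    return list(range(start, 1080, 2))

def lines_in_720p_frame(frame_index):
    """A 720p frame carries all 720 lines sampled at the same instant."""
    return list(range(720))
```

Two adjacent 1080i fields describe two different instants, so weaving them into one 1080-line frame mixes samples captured ~1/60 s apart, which is exactly why the lines "wouldn't really map properly" on moving material.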

Re:spatial vs temporal resolution (3, Insightful)

pe1chl (90186) | more than 9 years ago | (#12407851)

But this is true for SD display on a double-scan TV as well.
The "digital feature box" in the TV is supposed to combine the two fields into one single frame. This is usually referred to as "motion compensation" or some other nifty marketing term.
This is what separates the cheap from the expensive TVs.

Re:spatial vs temporal resolution (4, Insightful)

chrisbolt (11273) | more than 9 years ago | (#12408101)

The proper way to do it is to deinterlace the 1920x540 image into a 1920x1080 image by interpolating the missing pixels based on what surrounds the empty spaces, and based on previous fields. Then resize the 1920x1080 image to 1280x720. This is the same way a progressive DVD player turns 480i video into 480p video, minus the resizing.
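The intra-field interpolation step described above can be sketched like this (a minimal sketch under stated assumptions: rows are plain Python lists, only spatial averaging is shown, and a real deinterlacer would also consult previous fields for static areas):

```python
def deinterlace_field(field, parity):
    """Expand one field into a full-height frame by filling the missing
    lines with the average of the lines above and below.

    field  -- list of rows (each row a list of luma samples)
    parity -- 0 if the field carries the even (top) lines, 1 for odd
    """
    h = len(field)
    frame = [None] * (h * 2)
    for i, row in enumerate(field):           # place the known lines
        frame[2 * i + parity] = row
    for y in range(1 - parity, h * 2, 2):     # interpolate the gaps
        above = frame[y - 1] if y - 1 >= 0 else frame[y + 1]
        below = frame[y + 1] if y + 1 < h * 2 else frame[y - 1]
        frame[y] = [(a + b) // 2 for a, b in zip(above, below)]
    return frame
```

Applied to a 540-row 1080i field this yields 1080 rows, which would then be resized to 720 — the same interpolate-then-resize order the parent describes for progressive DVD players.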

ED displays (2, Interesting)

justforaday (560408) | more than 9 years ago | (#12407783)

I haven't had a chance to read through the full article/blog/whatever yet (I'll do that at lunch), but this sounds like something I noticed over the weekend while browsing the Best Buy site. Companies are now producing ED-compatible TVs. They list all sorts of compatible display modes (1080i, 720p, 480p, etc), but then mention that they downscale them for display on the TV. Is this just some way of offering half-assed support to unsuspecting consumers?

Re:ED displays (1)

justforaday (560408) | more than 9 years ago | (#12408085)

Having read TFA, I see that it's not even the same thing at all. I still think that these ED displays are a load of crap though...

It's not the TV. (0)

Anonymous Coward | more than 9 years ago | (#12407816)

To those of you screaming "I'm glad I didn't yet buy an HDTV... Foolish early adopters..."

1. It's not the TV's problem. It's the processor feeding the TV. Most HDTVs don't have the processor/scaler on board. This is done by your CABLE BOX, SAT RECEIVER, or [upscaling] DVD player.

2. 1080i... Who cares? 1080p is where it's at... If you're on the fence about buying into HDTV, then wait until you can buy a 1080p capable unit.

3. Stick with CRT? Why pay more for old technology? Some of the newer LCoS, DLP, and LCD solutions can draw a BETTER picture for less money and certainly less maintenance (convergence!??? NEVER will I deal with THAT BS again. Screw tubes!).

AudioAdvice.com (0)

Anonymous Coward | more than 9 years ago | (#12407835)

I work for these guys and though I won't see a dime if you spend it I've never known a group of people who know more about setting up Audio Equipment. They will take the time to show you what to do, and even set it up for you. This way you don't have to worry about getting scammed.

See, if you're going to invest a few thousand dollars in ANY home system, do it right the first time. Call people who know what they're doing, who will come out, offer warranties on their work, and set it up for you.

But hey this is a shameless plug for a company that I love working for. (Located in Raleigh, NC)

Not as Alarming as it Sounds (0)

Anonymous Coward | more than 9 years ago | (#12407850)

While this loss of resolution sounds disastrous, it's rarely a big deal in reality. This problem only occurs when connecting a 1080i source to a 720p display (and even then, no one but Silicon Optix seems to know how many displays actually use this downconversion shortcut). But when would you connect a 1080i source to a 720p display? Every HD source I've ever used, from over-the-air tuners to satellite receivers to upconverting DVD players to HTPCs, has a selectable output format. Set them all to output 720p and you take the processing out of the hands of the display. Problem solved.

Re:Not as Alarming as it Sounds (2, Insightful)

wiredlogic (135348) | more than 9 years ago | (#12408110)

Every HD source I've ever used, from over-the-air tuner to satellite receiver to upconverting DVD player to HTPCs, have a selectable output on their side

If you receive an OTA broadcast or cable signal in 1080i then you don't have any control over the video source. Since the broadcasters are split between 720p and 1080i this is a real issue.

only HD is HD (2, Interesting)

Anonymous Coward | more than 9 years ago | (#12407855)

Manufacturers had better refrain from selling non-HD-capable displays as HD displays. This is clearly false advertising, and there have been several successful lawsuits lately in which people who bought into this got their money back.

The bigger trick is finding a 1080i broadcast (1)

syntap (242090) | more than 9 years ago | (#12407891)

When stations, at least in the DC area, broadcast a 720p signal it's considered a big deal. Most broadcast 480p to utilize existing bandwidth for broadcast of two or more stations.

I think this guy is missing the point (3, Insightful)

chrisbolt (11273) | more than 9 years ago | (#12407924)

720p displays show native 720p signals directly, of course. They also upconvert SD signals (like DVD) up to 720p for display. And 720p displays must convert incoming 1080i signals to 720p before they can be displayed. No surprise there, this makes sense. But, Silicon Optix claims that most manufacturers do the 1080i conversion just by taking one 540 line field from each 1080i frame (which is composed of two 540 line fields) and scaling that one field up to 720p, ignoring the other field.
Then later on, he says "Tom Norton mentions this in his review of the Screenplay 777 projector [ultimateavmag.com] ." The problem is, in the Screenplay 777 article, it's explained a bit differently:
For conversion to the native 720p resolution of the projector, 1080i material undergoes an interesting process. Consider each 1080i frame as two 540p fields. Each of these 540p fields is converted to the 720p native resolution of the 777. These upscaled fields are then displayed sequentially. One could, I suppose, make the argument that this potentially reduces or eliminates the temporal (motion) problems inherent in 1080i material. One could also make the argument that it eliminates any spatial (static) resolution advantages that 1080i has over 720p--or indeed over even native 720p sources.
So he's off: instead of taking one 540-line field from each 1080i frame and discarding the other, it's taking both 540-line fields from each frame and scaling each one up. Not as good as deinterlacing to 1080p and then resizing that to 720p, but not as bad as the guy makes it sound. Not to mention most rear-projection CRTs will display 1080i but can't display 720p... what happens there?

Depends on the conversion (1)

Ironsides (739422) | more than 9 years ago | (#12407945)

720p operates at 60 frames (60 full frames) per second. 1080i operates at 60 fields (30 full frames) per second.

If they convert each individual 1080i field (1920x540) to a 720p frame (1280x720), then they are not tossing any fields (which seems to be the alleged problem). So if they are converting 1080i@60 fields to 720p@60 frames, there is no problem here. If, however, they are converting to 720p@30 frames, then they are tossing half the fields from the 1080i signal and we have a problem. It all depends on how the conversion is done.

Solution: use a real 720p signal! (1)

Kosi (589267) | more than 9 years ago | (#12407988)

Who on earth would prefer the lousy flickering interlaced picture just because of the higher resolution?

It's a big enough shame that this crap found its way into the HDTV specs, but WTF does anyone still use it?!?

Re:Solution: use a real 720p signal! (1)

teshuvah (831969) | more than 9 years ago | (#12408038)

Because a 1080i signal is far superior to a 720p signal. Not to mention the issues all the 720p sets have like rainbowing, screen door effect, etc.

1080i RP CRT > *

Not news - Buy a scaler. (4, Informative)

GoRK (10018) | more than 9 years ago | (#12408018)

This is not particularly news. Some "blogger" discovers something because he never bothered to ask, and screams that the sky is falling. I'm kind of sick of this "news" reporting. Incidentally, this same issue affects owners of most plasma and LCD TVs with native resolutions below 1920x1080 too, depending on whether you look at it as a problem or not.

Anyway, it's fairly well known that the internal scalers in many devices suck. That is why there is a market for good external scalers. If you are paranoid about watching a lot of 1080i on your 720p projector or LCD TV or Plasma, go buy a scaler. They cost about $1000 but will improve scaled display a lot.

At least if you have an external scaler you will have some options for how you convert 1080i to 720p. The article makes it sound like splitting the fields is a huge sin -- and it is if you discard one field per frame (half-field deinterlacing), but it's perfectly acceptable to scale EACH 540-line field to a separate 720-line frame and double the frame rate. This is called bob deinterlacing and is often the best way to convert 1080i video to lower resolutions. If you are watching a 1080i upconvert of a film or something, though, you can have the scaler do full-field deinterlacing and inverse telecine for you and see a nice 720p/24fps picture. Scalers also generally have internal audio delays for various types of audio feeds, so you won't have to worry about A/V sync issues either.

If you have any questions about how your device does this, you should try to find out before you buy it. Most devices don't publish how they do it, though, so your only option may be to derive it -- and that will not be an easy job.
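Bob deinterlacing as described above can be sketched in a few lines (a toy model: frames are lists of rows and the "scaler" is nearest-neighbour; a real scaler uses proper filtering, and the names here are illustrative):

```python
def bob_1080i_to_720p(interlaced_frames, out_lines=720):
    """Bob deinterlace: each field of an interlaced frame becomes its own
    progressive output frame, doubling the frame rate. No field is discarded.
    """
    out = []
    for frame in interlaced_frames:   # frame: list of rows (1080 for real 1080i)
        top = frame[0::2]             # the top (even-line) field
        bottom = frame[1::2]          # the bottom (odd-line) field
        out.append(scale_lines(top, out_lines))
        out.append(scale_lines(bottom, out_lines))
    return out

def scale_lines(rows, target):
    """Nearest-neighbour vertical resize -- stand-in for a real scaling filter."""
    src = len(rows)
    return [rows[y * src // target] for y in range(target)]
```

Each 1080i frame yields two 720-line output frames, so 30 interlaced frames/s become 60 progressive frames/s, unlike the half-field shortcut, which would scale only `top` and throw `bottom` away.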

Sony Rear Projection TV (1)

SEGV (1677) | more than 9 years ago | (#12408029)

Tell me about it. I got a 46" Sony widescreen rear projection TV, supposedly HD-ready.

It will accept a 720p signal, but DOWNCONVERTS it to 480p for display. This is only noted as a footnote in the user manual.

What use is that? We've been suckered.

gee am I surprised (1, Insightful)

lexcyber (133454) | more than 9 years ago | (#12408035)

As usual when we talk quality, in any discussion it all boils down to: You get what you pay for!

Buy crap equipment and you will get crap.

More Like When is HD Not HD (3, Informative)

Myriad (89793) | more than 9 years ago | (#12408036)

While this is an interesting issue, and one I hadn't heard of before, it's only one of many mines in the field of HD.

If you're looking to get into HD there are a *lot* of little quirks to take into account, such as:
- Officially there are two HD resolutions, 720p and 1080i
- Most HDTVs are natively capable of only *one* of these resolutions. So you have to choose, 720p OR 1080i, in most cases. If you want one that can do both, check very carefully... forget DLP- or LCD-based devices (fixed set of pixels, so fixed resolution); CRT only.
- Many HDTVs will *not* convert from one format to another. They accept only their native resolution.
- Different networks broadcast using one standard or the other. For example, CBS uses 1080i and ABC 720p, IIRC. Fox is way behind in HDTV support.
- Most HDTV receivers can handle either a 720p or 1080i signal and will convert as required for your TV's native resolution.
- Some TV providers only support one format, regardless of the source material. E.g., in Canada Starchoice only broadcasts in 1080i. Any 720p content they have, they upconvert to 1080i before broadcasting. It's impossible to receive a native 720p signal from them.
- The Xbox supports both HDTV modes... but very few HD games actually use 1080i (Dragon's Lair being one). Most are 720p. So if this is important to you, you'll possibly want a native-720p TV: most receivers do not have HD inputs that would let you upconvert a 720p game to a 1080i signal for the TV. (The new Xbox will have more HD content than the current one, but it's a good bet the titles will be mostly 720p.)
- Most projectors and plasmas are *not* HDTV. They are EDTV (enhanced definition) or some such. Check the specs carefully.
- Most projectors are 1024x768. This means your HD signal of 1920x1080 (1080i) or 1280x720 (720p) is being heavily rescaled horizontally! Few projectors have a true HD native resolution.

So there you go... lots of fun things to take into account!

Blockwars [blockwars.com] : free, multiplayer, head to head Tetris like game

Yes but... (0)

Anonymous Coward | more than 9 years ago | (#12408053)

But if I only bought my HDTV to show off in front of my mates, it wouldn't really matter, would it?

there are not many 720p displays to begin with. (2, Informative)

Mark19960 (539856) | more than 9 years ago | (#12408102)

I work on this stuff every day, all day.
Only cheap projectors or displays max out at 720p,
and I don't see many of those anyhow.
But yes, on those displays the signal is downconverted by chopping out half of it.

However, these displays are not popular anyhow.
Some of the most popular displays still can't display HD natively, but they can display either XGA or SXGA with no problem (which gets pretty close to HD at this point).

Don't buy a cheap projector, and you won't get a cheap display. You get what you pay for.

It's still digital (1)

TheNinjaroach (878876) | more than 9 years ago | (#12408117)

I'm sitting around wondering why this is a surprise to most people. I thought it was common knowledge that the processing power required to do HDTV the right way was lacking. Even though you're losing some resolution with these displays, it's still digital. That's very important as far as picture quality goes: none of those fuzzy lines to the left and right of menu text. It's also progressive, which helps a lot - I can see flicker very easily (if I'm on a CRT with
It's definitely the price you pay for being an early adopter, but I would still argue that the picture you are getting is much better than the NTSC we've been used to for decades.