
1080p, Human Vision, and Reality

CmdrTaco posted about 7 years ago | from the even-one-eyed-pirates-like-hdtv dept.


An anonymous reader writes "'1080p provides the sharpest, most lifelike picture possible.' '1080p combines high resolution with a high frame rate, so you see more detail from second to second.' This marketing copy is largely accurate. 1080p can be significantly better than 1080i, 720p, 480p or 480i. But (there's always a "but") there are qualifications. The most obvious: does this performance improvement manifest under real-world viewing conditions? After all, one can purchase 200mph speed-rated tires for a Toyota Prius®. Expectations of a real performance improvement based on such an investment will likely go unfulfilled, however! In the consumer electronics world we have to ask a similar question: I can buy 1080p gear, but will I see the difference? The answer is a bit more ambiguous."



Article Summary (5, Informative)

Jaguar777 (189036) | about 7 years ago | (#18674793)

If you do the math you come to the conclusion that the human eye can't distinguish between 720p and 1080p when viewing a 50" screen from 8' away. However, 1080p can be very useful for much larger screen sizes, and is handy to have when viewing 1080i content.
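The arithmetic behind this claim can be sketched quickly. A rough check, assuming the common rule of thumb that 20/20 vision resolves about one pixel per arc-minute, and a 16:9 screen:

```python
import math

def pixels_resolvable(diagonal_in, distance_in, aspect=16/9, arcmin_per_pixel=1.0):
    """Rough count of horizontal pixels a 20/20 eye can resolve on a screen."""
    # Screen width from the diagonal and aspect ratio
    width = diagonal_in * aspect / math.hypot(aspect, 1)
    # Horizontal angle the screen subtends, in arc-minutes
    angle_arcmin = math.degrees(2 * math.atan(width / (2 * distance_in))) * 60
    return angle_arcmin / arcmin_per_pixel

# 50" 16:9 screen viewed from 8 feet (96 inches): roughly 1500 resolvable
# columns, i.e. more than 720p's 1280 but short of 1080p's 1920
print(round(pixels_resolvable(50, 96)))
```

By this estimate a 50" screen at 8 feet sits between the two formats, which is why the 720p/1080p difference is marginal at that distance and grows with screen size or proximity.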

I'm sorry, I was told there would be no math (4, Funny)

elrous0 (869638) | about 7 years ago | (#18674957)

In other words, your mother was wrong. You're better off sitting CLOSER to the TV.

Mom might have been right.... (2, Informative)

Radon360 (951529) | about 7 years ago | (#18675069)

...depending on how old you are. I think the concern was associated more with X-ray emissions from CRT televisions, and older ones at that (prior to the introduction of the Radiation Control for Health and Safety Act of 1968 [fda.gov]). I would venture to say that most of us on this site are too young to have been plopped in front of a TV that old for large amounts of time.

Re:Mom might have been right.... (4, Interesting)

Hoi Polloi (522990) | about 7 years ago | (#18675399)

I assume this is why TV tubes are made with leaded glass: to absorb the soft X-rays being generated. This is also why tossing out a TV tube improperly is a pollution no-no.

It isn't that simple. (5, Informative)

fyngyrz (762201) | about 7 years ago | (#18675041)

According to the linked text, the "average" person can see 2 pixels at about 2 minutes of arc, and has a field of view of 100 degrees. There are 30 sets of 2 minutes of arc in one degree, and one hundred of those in the field of view, so we get: 2 * 30 * 100, or about 6000 pixel acuity overall.

1080p is at most 1920 pixels horizontally and 1080 vertically. So horizontally, where the 100-degree figure is accurate, 1920 is only about a third of your 6000-pixel ability to see detail; there is no question the display is well within your acuity, and the answer to the question in the summary is yes, it is worth it.

Vertically, let's assume (though it isn't true) that only having one eye-width available cuts your vision's arc in half (it doesn't, but roll with me here). That would mean that instead of 6000-pixel acuity, you're down to 3000. 1080p is 1080 pixels vertically. In this case, you'd again be at about 1/3 of your visual acuity, and again the answer is yes, it is worth it. Coming back to reality, where your vertical field of view is actually greater than 50 degrees, your acuity is higher and it is even more worth it.

Aside from these general numbers that TFA throws around (without making any conclusions), the human eye doesn't have uniform acuity across the field of view. You see more near the center of your cone of vision, and you perceive more there as well. Things out towards the edges are less well perceived. Doubt me? Put a hand up (or have a friend do it) at the edge of your vision - stare straight ahead, with the hand at the extreme edge of what you can see at the side. Try and count the number of fingers for a few tries. You'll likely find you can't (it can be done, but it takes some practice - in martial arts, my school trains with these same exercises for years so that we develop and maintain a bit more ability to figure out what is going on at the edges of our vision.) But the point is, at the edges, you certainly aren't seeing with the same acuity or perception that you are at the center focus of your vision.

So the resolution across the screen isn't really benefiting your perception - the closer to the edge you go, the more degraded your perception is, though the pixel spacing remains constant. However - and I think this is the key - you can look anywhere, that is, place the center of your vision, anywhere on the display, and be rewarded with an image that is well within the ability of your eyes and mind to resolve well.

There are some color-based caveats to this. Your eye sees better in brightness than it does in color, and it sees some colors better than others (green is considerably better resolved than blue, for instance). These differences in perception make TFA's blanket statement that your acuity is 2 pixels per 2 minutes of arc more than a little bit of hand-waving. Still, the finest detail in the HD signal (and normal video, for that matter) is carried in the brightness information, and that is indeed where your highest acuity is, so technically we're still in the same general ballpark: the color information is less dense, and that corresponds to your lesser acuity in color.

There is a simple and relatively easy to access test that you can do yourself. Go find an LCD computer monitor in the 17 inch or larger range that has a native resolution of 1280x1024. That's pretty standard for a few years back, should be easy to do. Verify that the computer attached to it is running in the same resolution. This is about 1/2 HD across, and 1 HD vertically. Look at it. Any trouble seeing the finest details? Of course not. Now go find a computer monitor that is closer to HD, or exactly HD. You might have to go to a dealer, but you can find them. Again, make sure that the computer is set to use this resolution. Now we're talking about HD. Can you see the finest details? I can - and easily. I suspect you can too, because my visual acuity is nothing special. But do the test, if you doubt that HD offers detail that is useful to your perceptions.

Finally, note that your distance to the display has a lot to do with how much you're going to see. The smaller the display, or conversely, the further you are from it, the smaller the number of degrees of your vision the display will take up, and consequently, your visual acuity relative to the display drops. For instance, if the display width takes up 1/2 your horizontal field of view, you're down to 3000 pixels acuity from 6000; if the display takes up 1/3 of your horizontal field of view, you're down to 2000 (which is very close to HD) and so that's about the minimum you can go and expect to get the benefit of "all the dots" at the focal point of your vision. On the other hand, more of the display is closer to the center of your vision, so generally speaking, you'll see more detail overall than you will if the display takes up your entire field of view.
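The scaling in the last paragraph can be put into a few lines (a toy model using the parent's own round figures of a 100-degree field and 6000-pixel acuity):

```python
# Toy model using the figures above: a 100-degree horizontal field of view
# resolved at roughly 6000 "pixels" (60 per degree).
ACUITY_PIXELS = 6000

def resolvable_on_display(fraction_of_fov):
    """Pixels the eye can resolve across a display spanning this fraction of view."""
    return ACUITY_PIXELS * fraction_of_fov

for frac in (1.0, 1/2, 1/3):
    print(f"display fills {frac:.0%} of view -> ~{resolvable_on_display(frac):.0f} pixels")
# At about 1/3 of the field you are down to ~2000, just above 1080p's 1920 columns.
```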

Even though one would think that the latter arrangement - 1/3 or 1/2 the field of view - would be optimum, I find that I prefer the display to take up my entire field of view. I look around the image as my interest draws me, and I see more as I go. In the meantime, the rest of the image sort of surrounds what is going on, much like the real world does. YMMV, of course. But that's half (or 1/3) the fun. :-)

Re:It isn't that simple. (3, Insightful)

maxume (22995) | about 7 years ago | (#18675277)

So the op says "the human eye can't distinguish between 720p and 1080p when viewing a 50" screen from 8' away" and then you go on and on and on to come to the conclusion that it ends up mattering how big the screen is and how close you sit to it, essentially because the human eye is limited to hd resolution or so when a screen is taking up 1/3 of your field of view. Nice work.

Re:It isn't that simple. (4, Insightful)

InsaneProcessor (869563) | about 7 years ago | (#18675823)

This is just another reason why I still use an old standard TV. Until the dust settles, I ain't spending one thin dime on HD.

Re:It isn't that simple. (4, Informative)

Paladin128 (203968) | about 7 years ago | (#18675321)

Though you are correct that human acuity falls off toward the edges of the visual field, some people actually look around the screen a bit. I'm setting up my basement as a home theater, and I'll have a 6'-wide screen that I'll be sitting about 9' away from. My eyes tend to wander around the screen, so sharpness at the edges does matter.

Hey, that's unfair (5, Funny)

jimicus (737525) | about 7 years ago | (#18675347)

You're commenting on something which it sounds like you might actually be qualified to comment on! What are you doing on /. ?

Nyquist (0)

Anonymous Coward | about 7 years ago | (#18675585)

Something you've missed is that the very finest in visual quality comes not from being able to resolve the individual pixels but from having several pixels unresolved for each "pixel" in your vision. The point where your eye and the display have equal resolution is just the point at which you start getting diminishing returns for adding extra pixels.

Re:It isn't that simple. (0)

Anonymous Coward | about 7 years ago | (#18675815)

You're right, it isn't that simple. You talk about the horizontal field of view of the human eyes being "6000 pixels". Based on this you say "there is no question that 1080p is about ..." and you follow up with "the answer to the question in the summary is, yes". Well, I hope you are wearing some sweet HD VR goggles. I'm pretty sure my 48" home theater system does not take up 100 degrees of my field of view. In fact, at a distance of 8 feet my TV would have to be almost 20 feet wide to take up 100 degrees of viewing arc.

However, some quick trig shows that a 3.5-foot-wide TV at 8 feet spans about 25 degrees, or roughly 1,500 of the 6,000 pixels evenly distributed across a 100-degree viewing arc. So while I think you oversimplified your complexification of the OP's oversimplification, I guess I have to agree with your conclusion!

Re:It isn't that--(numbers for easy comparison) (1)

Falladir (1026636) | about 7 years ago | (#18675817)

TA refers to a 44"-wide television at 96" (eight feet).

arctan(22/96) ≈ 12 degrees of the 100 degrees that we've assumed. So the proposed display setup uses about 24%, or 1,440 of the 6,000 horizontal pixels that the parent calculated. This is consistent with TA's assertion that 720p is closer to the actual resolving power of the eye.

The assumption of a 50" screen at 96" viewing distance is fair, but you only have to sit two feet closer to see all the pixels in a 1080p display.
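The trig in this subthread is easy to check directly (same assumed figures: a 100-degree field resolved at 6000 pixels):

```python
import math

ACUITY_PIXELS = 6000
FIELD_DEG = 100

def eye_pixels_across(width_in, distance_in):
    """How many of the eye's ~6000 horizontal 'pixels' a screen of this width occupies."""
    half_angle_deg = math.degrees(math.atan((width_in / 2) / distance_in))
    return ACUITY_PIXELS * (2 * half_angle_deg) / FIELD_DEG

# 44"-wide screen at 96": about 1550 eye-pixels; the 1,440 figure above comes
# from rounding the half-angle down to 12 degrees before doubling.
print(round(eye_pixels_across(44, 96)))
```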

(I hope I get rich so I can buy fancy TV equipment some day =D)

Re:Article Summary (1)

metamatic (202216) | about 7 years ago | (#18675045)

Another factor is upscaler quality. I have a DVD player with a Faroudja upscaler, and DVDs played with it look pretty much indistinguishable from HDTV on my set. That is, a well encoded DVD movie looks about as good as (say) an HD episode of CSI.

That's why I'm in no hurry to get Blu-ray or HD-DVD. I'll wait for one of 'em to win (and for someone to start selling region-free players if Blu-ray wins).

Re:Article Summary (1)

Anonymous Brave Guy (457657) | about 7 years ago | (#18675263)

Yep, I'm holding off HD-DVD and Blu-ray for the same reason (amongst other, more political ones). I've only got a cheapo DVD player, but my TV is a nice Loewe box that upscales very well (though it has only 720p resolution — no-one was selling 1080any for home TVs at that stage). The first DVD I played on it was Revenge of the Sith, and I was pretty much blown away by the appearance of the opening battle scenes. I've seen some HD demos, and short of watching them side-by-side, I would be hard-pressed to spot the difference. It only really tells in things like crowd scenes or wide-view nature shots, where there isn't enough detail in the SD input for upscaling to look sharp.

Re:Article Summary (5, Insightful)

nospmiS remoH (714998) | about 7 years ago | (#18675439)

I think the bigger issue is that the majority of HD content out there sucks. Taking crappy video and re-encoding it to 1080p will not make it look better. Sure, it's "full HD" now, but it can still look like crap. I have seen many 720p videos that look WAY better than some 1080p videos simply because the source content was recorded with better equipment and encoded well. TNT-HD is the worst network of all for this. Much of their HD simulcast stuff is the exact same show just scaled up, often stretched with a terrible fish-eye effect. It is sad how much bandwidth is being wasted on this "HD" crap (don't even get me started on DirecTV's 'HDLite'). [/rant]

Re:Article Summary (1)

CastrTroy (595695) | about 7 years ago | (#18675799)

Does anybody else have problems with their cable company over-compressing the digital cable? I pay a lot of money for cable, and in the last few years the quality has degraded while they try to stuff more SD channels, HD channels, OnDemand channels, VoIP, and Internet over that same line. I'm not going to upgrade to HDTV until I can be confident that I'm actually getting really good-looking television and that the quality won't degrade as they try to put more content on the tubes. Has anybody else noticed the compression artifacts? Or am I just looking too closely? BTW, I'm on Rogers in Ottawa.

Re:Article Summary (2, Informative)

ivan256 (17499) | about 7 years ago | (#18675697)

Do you own stock in Faroudja or something?

Great. You can't tell. Here's a cookie.

The rest of us can tell the difference between well encoded 1080i content and upscaled 480p content. I'm very sorry for you that you can't.

(And I still think that your real problem is that your television does a crappy job of downscaling 1080i to 720p, and that's why you mistakenly believe your upscaled DVDs look just as good.)

Re:Article Summary (1)

lokedhs (672255) | about 7 years ago | (#18675757)

Well, that's because your HD broadcast is so heavily compressed that the advantage becomes minimal.

Try connecting a BR player to your HD TV and watch a good quality BR movie. Then you'll see a huge difference.

Re:Article Summary (0)

Anonymous Coward | about 7 years ago | (#18675349)

Their math does assume 20/20 vision... There are plenty of people out there, either naturally or through corrective lenses, who have vision significantly better than 20/20.

Re:Article Summary (1)

nsayer (86181) | about 7 years ago | (#18675589)

So once again, the /. summary leaves out the most important bits - you can't talk about resolution without also mentioning screen size and distance from the eyes.

We have a 50" 720p set 6 feet away from us, and for us, it's ideal. A larger set would overwhelm the room, and we wouldn't really want to move any closer or further away. But with that set-up, the difference between HD and SD programming is both obvious and striking. Even mundane stuff - like comparing WPT broadcasts to the NBC National Heads-up championship (the former in SD, the latter in HD) - you can see more of the nuances in their expressions better.

But, like the article says, I am dubious that spending more for a 1080p set would have made any further difference for us. Perhaps with a 60" screen, if we really strained to see; probably with a 72" screen... but we wouldn't want anything that big in that room.

NO (-1, Offtopic)

Anonymous Coward | about 7 years ago | (#18674809)

At least that's what my Magic 8-Ball said.

1080p content (5, Insightful)

Orange Crush (934731) | about 7 years ago | (#18674813)

There's still not much available in the wild that does 1080p justice right now anyway. Horribly compressed 1080p looks every bit as awful as horribly compressed 1080i/720p.

Read the opposite opinion from Secrets (4, Interesting)

Shawn Parr (712602) | about 7 years ago | (#18675093)

I recently saw an article posted by Secrets of Home Theatre, very well known for their DVD benchmark process and articles.

The article is here [hometheaterhifi.com] .

They show numerous examples of how the processing involved can indeed lead to a better image on 1080p sets. Mind you, it is not just the resolution: 480-line material being processed and scaled can look better on a 1080p screen than on a 720p (or, more likely, 768p) screen. It is a very interesting read, although if you are already conversant in scaling and video processing some of it can be very basic. I count that as a feature, though, as most non-technical people should be able to read it and come away with the information it presents.

Definitely interesting as a counterpoint.

Re:1080p content (1)

jonTu (839883) | about 7 years ago | (#18675531)

This is a really, really good point, and seems to be the big unspoken truth in the HDTV industry right now. Any video editor who has worked with the 720 or 1080 standards will tell you that even a high-end computer is incapable of playing back an HD stream uncompressed, typically because it simply can't stream data off the hard drive fast enough. In fact, in practice most computers can't play back an uncompressed "480" (née NTSC D1) stream. There most certainly isn't a RAID 0 array in the typical TiVo, let alone the bandwidth to actually download this scale of content, so the solution cable providers and DVR manufacturers use is lossy compression.
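The bandwidth numbers behind this point are easy to work out (back-of-envelope only, assuming 8 bits per channel and no chroma subsampling; real capture formats subsample chroma and need somewhat less):

```python
def uncompressed_mbps(width, height, fps, bits_per_pixel=24):
    """Raw (uncompressed) video data rate in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

formats = {
    "480i (NTSC D1)": (720, 480, 30),    # 60 fields/s = 30 full frames/s
    "720p60":         (1280, 720, 60),
    "1080i60":        (1920, 1080, 30),
    "1080p60":        (1920, 1080, 60),
}
for name, (w, h, fps) in formats.items():
    print(f"{name}: {uncompressed_mbps(w, h, fps):,.0f} Mbit/s")
```

Even 480i works out to roughly 250 Mbit/s raw, and 1080p60 to about 3 Gbit/s, against an ATSC broadcast channel of about 19 Mbit/s; hence the heavy lossy compression.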

People Are Blind (5, Insightful)

CheeseburgerBrown (553703) | about 7 years ago | (#18674859)

Considering that many people can't distinguish between a high-definition picture and a standard-definition picture warped to fit their HD screen, this question seems largely academic.

Re:People Are Blind (1)

bleh-of-the-huns (17740) | about 7 years ago | (#18674919)

I can definitely tell the difference. I leave SD broadcasts with the side and top/bottom bars, otherwise it looks crappy. I also disabled my TiVo's upscale-to-1080i function and left shows in their native format, otherwise it gets blocky. Although right now we are doing renovations in the TV room, so the giant 56" 1080p Samsung is sitting 4 feet from the futon in my spare bedroom... that does not help the viewing either :P

Re:People Are Blind (0)

Anonymous Coward | about 7 years ago | (#18674953)

Many people aren't looking at a properly set-up TV or the right content. The difference in Discovery HD-type shows or an NFL broadcast in HD is like night and day.

Re:People Are Blind (1)

ryanov (193048) | about 7 years ago | (#18675005)

Even the Simpsons on FOX HD looks very, very different. Colors are better and edges are crisper. Does it look better than the Simpsons on a standard TV? I'm not really sure, but standard definition on an HDTV is noticeably bad.

Re:People Are Blind (3, Funny)

Hoi Polloi (522990) | about 7 years ago | (#18675459)

The problem with watching the Simpsons in HD is that you can see all of Marge's wrinkles and Homer's old acne scars through the makeup. You can also see that Bart has some early facial hair coming in.

Re:People Are Blind (1)

mykdavies (1369) | about 7 years ago | (#18675565)

I suspect what the GP poster really meant was "Consider[ing] many people can't distinguish between a 16:9 picture and a 4:3 picture warped to fit their 16:9 screen, this question seems largely academic", which is a much more fundamental distinction.

People aren't blind, just ignorant (0)

Anonymous Coward | about 7 years ago | (#18675195)

Once someone actually sees the difference, it's remarkably obvious. My father wasn't really buying into the HD stuff (despite seeing it in stores), until I showed him the SD feed of a football game (Ohio vs. Michigan, I think) and then the HD feed. Switching between them, the difference was instantly obvious. It's much harder for someone to see something at the store and then compare it to something they have at home.

Re:People aren't blind, just ignorant (1)

shaitand (626655) | about 7 years ago | (#18675563)

If the difference is so small that you have to look at them side by side then who cares? Nobody has to prove to me that HD presents a clearer picture. They have to prove to me that I can't enjoy a SD picture without being distracted by blur and pixelation.

You don't need the best image man can possibly produce to watch a football game. All you need is an image that lets you see the game and that doesn't distract you from the game.

Re:People Are Blind (5, Insightful)

Luke (7869) | about 7 years ago | (#18675355)

Exactly. I wonder how many people claiming they can see a difference between 1080i and 1080p happily listen to compressed audio with earbud headphones.

People Are Blind AND DEAF (0)

Anonymous Coward | about 7 years ago | (#18675441)

People swear that an audio sampling frequency of 192kHz yields more fidelity than 96kHz. CD audio is 44.1kHz because the Nyquist theorem says we must sample at just over twice the highest frequency we wish to represent; for CD audio that's 20Hz to 20kHz, generally accepted as the range of human hearing. A 44.1kHz sample rate thus already covers the audible range for most people, and at 96kHz we exceed the frequency response of human hearing, recording microphones, and reproduction systems. Yet there are still those who claim to hear a difference at 192kHz.
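The sampling arithmetic here is one line of Nyquist (note: 44.1kHz is the CD rate, while 48kHz and its multiples 96kHz and 192kHz are the common studio rates assumed below):

```python
def nyquist_limit(sample_rate_hz):
    """Highest frequency a given sample rate can faithfully represent (Nyquist)."""
    return sample_rate_hz / 2

for rate_hz in (44_100, 48_000, 96_000, 192_000):
    print(f"{rate_hz / 1000:g} kHz sampling -> {nyquist_limit(rate_hz) / 1000:g} kHz ceiling")
# Human hearing tops out near 20 kHz, so 44.1 kHz already covers the audible
# band; 96 kHz and 192 kHz raise a ceiling nobody can hear past.
```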

The same is true of HD, people claim they see an improvement on their 40" home cinema when the physics say that it's not possible.

It's similar to a religious debate, the true believers on one side and scientific fact on the other. Very amusing [wikipedia.org]

It depends what you watch (2, Informative)

Anonymous Brave Guy (457657) | about 7 years ago | (#18675465)

Consider many people can't distinguish between a high definition picture and a standard definition picture warped to fit their HD screen, this question seems largely academic.

That's because, given a good upscaler, you can't distinguish much difference between DVD quality (which is most people's benchmark of what their SD TV can do) and 720p (which is what most HDTVs show). If by "standard definition" you're talking about crappy, digitally compressed TV channels at lower resolutions, then sure, there's a difference there, though I do wonder how much of the perceived improvement is due simply to using less lossy compression, rather than to genuine resolution improvement.

Even looking at DVD vs. HD, you can see the difference in things like crowd scenes, detailed nature shots, or sports where the players are filmed from way back so you can see the field as well — basically anything where there isn't enough detail in the source material for any upscaler to work with. However, for most things I watch at least, that doesn't apply. There basically isn't much difference in face shots, action scenes set in a street/building and filmed from fairly close in, or most CGI and special effects.

Does anyone even broadcast 1080p.... (3, Insightful)

bleh-of-the-huns (17740) | about 7 years ago | (#18674867)

Last I checked, other than HD-DVD/BR players and normal DVD players that upscale to 1080p, there are no cable or satellite sources that broadcast in anything other than 720, so it's kind of a moot point. I have heard rumours that Verizon FiOS TV will have a few 1080p channels in a few months, but nothing substantial... and last I checked, their boxes do not do 1080p (I could be wrong about the boxes statement though).

I have a series3 tivo though, which only supports up to 1080i :(

Re:Does anyone even broadcast 1080p.... (2, Insightful)

Cauchy (61097) | about 7 years ago | (#18674943)

Seems to me, the PS3 is pushing 1080p-capable devices into millions of homes (sales issues aside). Many games being released are at 1080p. I just ordered my first Blu-ray disc (BBC's Planet Earth series). I think that is something worth seeing at 1080p.

Re:Does anyone even broadcast 1080p.... (1)

Anonymous Brave Guy (457657) | about 7 years ago | (#18675643)

Seems to me, the PS3 is pushing 1080P capable devices into millions of homes (sales issues aside).

Then you just need a TV that can display it. :-)

Perhaps it's different in the US, but certainly here in the UK, 1080 is still exceptional, and almost all HDTVs are really just displaying 720p. Even the serious brands you get from speciality shops have only started supplying 1080-capable units very recently (months, not years) and they cost a fortune even by geek standards. So while I'm sure Planet Earth will look great in HD, I doubt that even I as a TV-enjoying geek will have a box that can display it any time for the next few years. Given the prices involved, I rather doubt support from games consoles is going to change that very much.

Re:Does anyone even broadcast 1080p.... (4, Interesting)

Enderandrew (866215) | about 7 years ago | (#18675057)

My wife and I were looking at TVs, and we walked past some gorgeous 52" LCDs that support 1080p, and I told her this is what I wanted.

Then she walked past a smaller 32" LCD that only supported 720p/1080i, and she said, "This picture looks so much better, and the TV is $1000 less! Why?"

I casually explained that the expensive TV was tuned to a normal SD broadcast, while the cheaper TV was tuned to ESPN HD. She looked again and realized that even the most expensive TV isn't going to look all that great when it's getting a crappy signal.

I still want a nice LCD that supports 1080p, but I'm not pushing for it immediately; not until I can afford a PS3 and a nice stable of Blu-ray movies to go along with it.

720p looks IMMENSELY better than 480i, or any crappy upscaled images my fancy DVD player and digital cable box can put out. I have yet to see a nice, natural 1080p image myself, but I'm willing to bet I will be able to tell the difference.

If anyone recalls, there were people who insisted that you couldn't really tell the difference between a progressive and interlaced picture.

Re:Does anyone even broadcast 1080p.... (5, Informative)

Zontar_Thing_From_Ve (949321) | about 7 years ago | (#18675067)

Last I checked, other than HD-DVD/BR players and normal DVD players that upscale to 1080p, there are no cable or satellite sources that broadcast in anything other than 720, so it's kind of a moot point. I have heard rumours that Verizon FiOS TV will have a few 1080p channels in a few months, but nothing substantial... and last I checked, their boxes do not do 1080p (I could be wrong about the boxes statement though).

Wow, this is wrong. Since you mentioned Verizon, you must live in the USA. NBC and CBS both broadcast in 1080i right now. Discovery HD and Universal HD do too. Those come to mind fairly quickly. I'm sure there are others. By the way, I wouldn't hold my breath about 1080p TV broadcasts. The ATSC definition for high def TV used in the USA doesn't support it at this time because the bandwidth requirements to do this are enormous.

Re:Does anyone even broadcast 1080p.... (2, Informative)

JFMulder (59706) | about 7 years ago | (#18675101)

Mod parent up. I was about to post the same thing: the ATSC standard for broadcasting doesn't permit 1080p signals.

Re:Does anyone even broadcast 1080p.... (0)

Anonymous Coward | about 7 years ago | (#18675307)

You left out INHD, which also broadcasts in 1080i. They broadcast a number of baseball games, and the difference between 1080i and 720p for a baseball game is quite noticeable.

Stop spreading misinformation (1, Interesting)

Anonymous Coward | about 7 years ago | (#18675351)

Yes, ATSC supports 1080p, and has from the beginning. You idiots who think 1080p has to mean 60p are muddying the waters. Don't you watch DVDs and think they're just fine? They're 24p, doofus. If you can do 60i in 1080, then you can do up to 30p with identical bandwidth. Just read info on the standard instead of the bullshit the 720p proponents say.

Wikipedia article on ATSC [wikipedia.org]
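The poster's frame-rate argument reduces to raw pixel throughput (a sketch; actual MPEG-2 bit rates also depend on content and encoder):

```python
def pixels_per_second(width, height, rate, interlaced=False):
    """Raw pixel throughput; an interlaced stream sends half a frame per field."""
    factor = 0.5 if interlaced else 1.0
    return width * height * rate * factor

# 1080i at 60 fields/s carries exactly the same raw pixel rate as 1080p at
# 30 frames/s, and more than film-rate 1080p24; this is the basis of the
# claim that a channel able to carry 1080i60 can carry 1080p24/30.
assert pixels_per_second(1920, 1080, 60, interlaced=True) == pixels_per_second(1920, 1080, 30)
assert pixels_per_second(1920, 1080, 24) < pixels_per_second(1920, 1080, 60, interlaced=True)
print("1080i60 raw pixel rate:", pixels_per_second(1920, 1080, 60, interlaced=True))
```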

Re:Does anyone even broadcast 1080p.... (1, Interesting)

Anonymous Coward | about 7 years ago | (#18675193)

And I'd like to thank you for using the word "moot" correctly in a sentence. Every time someone writes "its a mute point" I throw up a little bit. Not the kind of throw-up where you actually vomit everything out of your stomach, but the kind where you kinda puke a little bit in your mouth.

Re:Does anyone even broadcast 1080p.... (0)

Anonymous Coward | about 7 years ago | (#18675257)

DirecTV uses 1080i and 720p.

Sports are better in 720p for the faster frame rate, nature shows are better in 1080i since camera movement is much slower and the faster frame rate is rarely needed.

The Discovery HD channel is 1080i. However, I have watched shows on this channel with fast panning of the camera, and noticed some jerkiness in the motion.

Same with HBO HD, also 1080i. Action movies make me sick.

Could I tell the difference between 720p and 1080p? Maybe, I'm likely much more sensitive to visual detail than the average person, but true, it's not a huge jump in detail. Still I drool at the thought of 1080p.

Re:Does anyone even broadcast 1080p.... (1)

ben there... (946946) | about 7 years ago | (#18675729)

BBC's Sky broadcasts some movies in 1080p. They actually use H.264 within a transport stream, which looks pretty good.

Here is an example of one of their movies I have: Domino [imageshack.us] , an OAR broadcast, so its actual res is 1920x800 or so. It looks much better than DVD to me. I also have the DVD around here somewhere, but no caps from it to compare to.

IMAX (3, Funny)

timster (32400) | about 7 years ago | (#18674897)

I, for one, will not be happy until I have an IMAX theater in my home. That requires way, WAY more resolution than 1080p. And you can see the difference for sure.

Re:IMAX... stills (1)

temojen (678985) | about 7 years ago | (#18675287)

I used to get teased about using outdated technology by members of our local photo club who shoot crop-factor digitals and project digitally, until I brought in my 6x6 projector and put some images up on the screen.

Re:IMAX... stills (1)

mstahl (701501) | about 7 years ago | (#18675545)

6x6cm is 120/220 sized film. IMAX is actually 70x48.5mm [wikipedia.org] . So each frame is about as large as from my large-format 5x7 camera.

Nitpicking aside, though, I've used the same trick to get digital advocates to stfu. One frame of 6x6 at, say, 100 ASA (if you consider each grain of silver halide to be a "pixel") works out to hundreds of megapixels.

Re:IMAX (4, Funny)

Hoi Polloi (522990) | about 7 years ago | (#18675527)

I'm trying out this thing called "outdoors". 3D video, extreme HD, millions of colors, and it is free! The reviews say it is incredibly lifelike.

The difference is when you get close (2, Funny)

The Media Mechanic (1084283) | about 7 years ago | (#18674933)

If you lean into your honey for a kiss, she doesn't get all pixellated when you get close to her face.

When you press your face up against your HDTV panel, you should be able to tell the difference between 1080p and reality.

If you can't tell the difference between the two, then you might want to get your eyes checked.

Your target audience... (4, Funny)

sczimme (603413) | about 7 years ago | (#18675009)

If you lean into your honey for a kiss, she doesn't get all pixellated when you get close to her face.

Consider your target audience...

Re:The difference is when you get close (1)

skorch (906936) | about 7 years ago | (#18675027)

If you're leaning in to kiss your honey on a tv screen, I think you have more serious problems than just your eyes.

Re:The difference is when you get close (5, Funny)

Rosco P. Coltrane (209368) | about 7 years ago | (#18675051)

If you lean into your honey for a kiss, she doesn't get all pixellated when you get close to her face.

Dude, this is Slashdot. When a Slashdot reader leans into his honey for a kiss, she *does* get pixellated...

Analogy (5, Funny)

Chabo (880571) | about 7 years ago | (#18674973)

After all, one can purchase 200mph speed-rated tires for a Toyota Prius®. Expectations of a real performance improvement based on such an investment will likely go unfulfilled, however!

But it does mean that the performance of the car won't be limited by the tires... ;)

Re:Analogy (1)

multipartmixed (163409) | about 7 years ago | (#18675179)

> But it does mean that the performance of the car won't be limited by the tires... ;)

Depends how you define performance. High-speed tires tend to have harder rubber and/or shallower tread depth.

True, but... (1)

sczimme (603413) | about 7 years ago | (#18675213)

But it does mean that the performance of the car won't be limited by the tires... ;)

Very true, but I believe there is an expectation that the delivery and display of signal(s) will continue to improve so that the capabilities of the new gear can be realized; we don't have the same expectation of the highway infrastructure, at least in the US. (We don't have enough physical or visionary room for wholesale upgrades.)

The resolution of current televisions will eventually become a limitation. The Prius will likely never use the full capacity of 200mph tires.

PS Dear Toyota - please prove me wrong. :-)

Re:Analogy (1)

smchris (464899) | about 7 years ago | (#18675759)

Enough with the Prius bashing. We all know electric motors have great torque. I believe the Prius computer intentionally reins that in some for the boring commuter experience. I got my wife a "My Prius accelerates faster than your SUV" bumper sticker but she doesn't have what it takes to put it on. And they maneuver just fine in 70+ mph freeway traffic.

Everyone's real-world conditions are different (3, Interesting)

davidwr (791652) | about 7 years ago | (#18674987)

My "real-world" conditions may be a 50" TV seen from 8' away.

Another person may watch the same 50" set from 4' away.

Your kids may watch it from 1' away just to annoy you.

2 arc-minutes of angle is different in each of these conditions.

Don't forget: You may be watching it on a TV that has a zoom feature. You need all the pixels you can get when zooming in.
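For what it's worth, the angular size of a pixel row at each of those distances can be computed directly. A minimal sketch (50-inch 16:9 panel assumed, small-angle approximation):

```python
import math

def pixel_arcmin(distance_ft, diag_in=50.0, rows=1080):
    """Angular height of one pixel row, in arc-minutes."""
    height_in = diag_in * 9 / math.hypot(16, 9)   # screen height from diagonal
    pixel_in = height_in / rows                   # height of one pixel row
    angle_rad = pixel_in / (distance_ft * 12.0)   # small-angle approximation
    return math.degrees(angle_rad) * 60.0

for d in (8, 4, 1):
    print(f"{d} ft: {pixel_arcmin(d):.2f} arc-min per pixel row")
```

At 8', each 1080p row subtends roughly 0.8 arc-minutes, under the ~1 arc-minute 20/20 acuity limit; at 4' it's about 1.6, comfortably resolvable. That's where the "can't tell 720p from 1080p at 50"/8'" math comes from.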

Re:Everyone's real-world conditions are different (1)

PFI_Optix (936301) | about 7 years ago | (#18675059)

I just have to ask...why zoom? Did the director not get a close enough view of Amber's naughty bits for you?

Seriously...I just don't see a lot of value in a TV that can zoom.

Re:Everyone's real-world conditions are different (2, Insightful)

Rosco P. Coltrane (209368) | about 7 years ago | (#18675137)

My opinion is that this hi-res frenzy that's been going on for years is pure marketing bullshit. The truth is:

1- Only a small minority of consumers have 50" TVs.
2- Only a small subset of the aforementioned minority watches 50" TVs up close.
3- What do you watch on TV that requires high resolution? Most TV programs are crap, and if they display text in the image (the toughest kind of feature to display at low resolution), it's big enough that it never matters anyway.

High resolution is a solution in search of a problem. The best proof is, nobody in the 25-or-so years I've been hearing about HDTV coming "real soon now" is really clamoring for a better image quality. Most people are happy enough with TV the way it is. That's the reality.

Re:Everyone's real-world conditions are different (1)

kevin_conaway (585204) | about 7 years ago | (#18675259)

High resolution is a solution in search of a problem. The best proof is, nobody in the 25-or-so years I've been hearing about HDTV coming "real soon now" is really clamoring for a better image quality. Most people are happy enough with TV the way it is. That's the reality.

Spoken like someone who hasn't watched sports in high def vs standard def. Believe me, people are clamoring for new TVs if only to watch football.

Re:Everyone's real-world conditions are different (1)

billdar (595311) | about 7 years ago | (#18675679)

or hockey. The contrast of dark jerseys on white ice (let alone the puck!) moving fast across the screen is murder on any compression/upscaler.

Re:Everyone's real-world conditions are different (0)

Anonymous Coward | about 7 years ago | (#18675801)

Does the score change with higher resolution?

You obviously don't have a set then (1)

brunes69 (86786) | about 7 years ago | (#18675543)

Honestly, I don't know anyone (including myself) who has ever gotten an HD set and then later said "this was not worth the switch".

You can't go around blasting your mouth off about stuff you have not tried. Until you have actually SEEN THE DAY TO DAY DIFFERENCE in shows like CSI and Lost when broadcast in HD vs. non-HD, you're just spouting bullshit.

I won't even go into the difference it makes to have Dolby Digital 5.1 surround on these shows.

It is a totally different viewing experience. It makes you barely even bother watching shows on the SD channels anymore because they are so much worse.

Re:You obviously don't have a set then (0)

Anonymous Coward | about 7 years ago | (#18675707)

Then you put some SD content on the new uber-TV of doom, and my god it is shit.

Re:Everyone's real-world conditions are different (1)

TheFlyingGoat (161967) | about 7 years ago | (#18675583)

Everyone I know who has seen a high def broadcast for the first time has been impressed, or at least been able to see the difference between HD and SD. You make it seem like people can't tell the difference or don't care... not true.

1: About 2/3 of my friends right now have at least 1 HDTV in their house. Mine is a 48" and 2 friends have 36"-42" TVs, but they view them from a little closer than I do. I'm middle class, so I'm sure my friends tend to have more HDTVs than lower-class people, but I'd say we're all pretty average. Large TVs are far more popular than this "small minority" you refer to.

2: I can tell the difference between 480p and 1080i on my 48" TV from about 9 feet away. My wife can too, which is saying much more. We can also both tell the difference on my friend's 36" HDTV from about 6 feet away. It really doesn't take sitting 3-4 feet away to be able to tell the difference.

3: Requires high resolution? Nothing. Looks better in high resolution? Sports (you can actually see the golf ball, baseball, and hockey puck, although I watch mostly NFL). Your comment about TV shows being crap doesn't hold weight here, since that's 100% your opinion and a large majority of people out there would disagree. There are actually a great many good shows on right now: House, Heroes, Battlestar Galactica, Deadliest Catch, Prison Break, etc. Dresden Files is also decent, although it falls short of the books, which is to be expected. As for viewing text, think video game consoles.

I also think that HDTV isn't necessarily required, but it sure makes TV nice to look at. People that claim that there's no market for it, or that it doesn't look better than SD are just being ignorant. I don't mind people that choose not to own a huge high def TV themselves, but at least be realistic about the technology and numbers.

Sidenote: Not only is the picture clearer in HD, the colors are more vibrant and the contrast is higher (probably due to more color information). The result means that even if you can't see the additional pixel information, the image can still look better.

Re:Everyone's real-world conditions are different (1)

Doc Lazarus (1081525) | about 7 years ago | (#18675621)

Plus everybody forgets that the majority of all TV programming isn't anywhere close to HD. If you want to watch 'Heroes,' you can see a clear picture. If you want something like 'I Love Lucy,' you're going to watch a blocky mess. My parents had a cheap HDTV, and all they got for most shows was blocky garbage, since most programming doesn't have that advantage. You don't see many people dealing with that problem.

Re:Everyone's real-world conditions are different (1)

LWATCDR (28044) | about 7 years ago | (#18675829)

The shame is that you could see I Love Lucy in HD. The older TV shows, I think, were shot on film; those could be remastered in HD. Shows that came later and were shot on videotape are the ones stuck in SDTV.

What I really want is TCM in HD. I love old movies, but those won't be coming to HD anytime soon.

Re:Everyone's real-world conditions are different (1, Interesting)

Anonymous Coward | about 7 years ago | (#18675791)

What about big TVs?

Real world example, with slightly lower resolutions. My TV is 92" - a 720p projector and screen. I sit about 11.5 feet back from the screen. I upgraded from a 480p projector a few months ago. The old projector displayed an HD signal quite well, but downscaled the resolution. Still, it looked good. On certain scenes - usually with large patches of light, solid color - I was able to clearly see the pixel structure. On other scenes, it gave a grainy appearance, but the pixels weren't as clearly delineated. Sitting any closer (which might be necessary with other people watching), highlighted these problems.

With the new projector, HD content is usually better. Older movies, or movies deliberately made with an older, or grainier, appearance, are usually about the same, no real improvement. More recent movies, sharper images, etc. are much nicer. TV shot on film and broadcast in HD is better, sharper, more depth. TV shot with HD cameras - sports, documentaries, etc., or even the evening news on one local channel - is far, far superior. Much sharper, clearer, far more detail, and with depth approaching 3D at times. Much less visible pixel structure. I can sit 3-4 feet closer without any problems (meaning, we can pull up some extra chairs and more people can watch). But even so, it's not as sharp and detailed as 1080p TVs and HD-DVD/Blu-Ray content that I've seen in stores. OK, so that's carefully chosen content and tweaked monitors for maximum "wow" factor in the stores, and standing next to the monitor is MUCH closer than actual viewing distance, but still, there's a layer of detail that's simply not there at home. There's graininess on some content that shouldn't be there. And viewing distance is compromised - as good as my picture at home is, if I stand four feet behind my couch, the picture looks even better.

From my experience, I doubt that on a 50" screen, going up to 1080p will make much difference. But with a big screen, yeah, it will. I can see the limitations of my setup, and I know, based on what I see and the difference from my previous, lower resolution projector, that taking the next step up will be a significant improvement on some types of content. The same kind of changes I've already experienced, but more so. Not on older movies, not on some TV. But on newer movies, and on TV shot on HD, I will definitely see the extra detail, it will give more punch to the image at normal viewing distance, it will allow for closer seating, and, with a short-throw projector, would even allow for a larger screen. Oh, and 1080i vs 1080p? On a 50", who cares? At 92", yeah, it matters.

I can't afford 1080p right now. But when a 1080p DLP projector with a short throw and (hopefully) lens-shifting becomes available at an affordable price, I'll be watching on a new 110" - maybe even 120", a full TEN FEET - and chuckling at those who argue over its value for a "big" 50" plasma.

WTF (-1, Offtopic)

Anonymous Coward | about 7 years ago | (#18675007)

What The Fuck(TM) is up with the ® symbol? Are you writing for some marketing brochure?

1080p? (3, Funny)

ObiWanStevobi (1030352) | about 7 years ago | (#18675017)

You're still on that? I'm on 3240z, it's higher def than real life.

Re:1080p? (-1, Troll)

Anonymous Coward | about 7 years ago | (#18675079)

It wouldn't be higher def than real life; it would be higher def than your eyes can handle, dumbass.

Re:1080p? Mine is diffraction-limited! (0)

Anonymous Coward | about 7 years ago | (#18675335)

1/8 wave or bust!

Only noticeable improvements matter (0)

Anonymous Coward | about 7 years ago | (#18675033)

Consumers should really only care about noticeable improvements in displays. This means that when you're watching the screen, the resolution is not hindering you from viewing the important details in the image, such as someone's face. Clouds, for example, may not matter. When it comes to video games, text on the screen such as your health will be more important. For example, have you ever played a game on an old TV set and you can't even read the text? It's times like that where you can really see the difference. In many situations though, getting that better resolution display just isn't going to matter.

Here's my real world... (1)

maillemaker (924053) | about 7 years ago | (#18675039)

1997 vintage RCA CRT TV.

Re:Here's my real world... (1)

Malc (1751) | about 7 years ago | (#18675171)

1997? Vintage? Good lord that's not old. TVs should last more than 20 years. Vintage would be something older than that.

Re:Here's my real world... (1)

jandrese (485) | about 7 years ago | (#18675525)

Yeah, the only reason I upgraded around 2001 or so was because I wanted component and s-video in on the TV instead of just composite (which was the norm back in the 90s). The biggest killer with those older TVs is that they often only support Coax input, which is terrible for anything that is not a TV antenna or a cable from your cable/satellite company.

Re:Here's my real world... (1)

Alioth (221270) | about 7 years ago | (#18675255)

1993 vintage Sony Trinitron TV here.

The thing is the Trinitron TV still looks much better than any LCD or plasma standard def TV, or high def TV showing upscaled standard definition. HD signals on an HDTV look better, but most of the HDTV content isn't particularly interesting to me. There's simply no point me changing it until HD is ubiquitous. The picture on the Sony is as good as you get for standard def, the colours are all still vibrant. (I also don't watch enough TV to really warrant replacing it any time soon, either).

stretched widescreen (0)

Anonymous Coward | about 7 years ago | (#18675157)

Seeing as everybody I know watches 4:3 content stretched on their 16:9 screen, I think most people wouldn't care.

And to be honest, nothing on TV is worth broadcasting in HD. It doesn't really add anything.

OK, perhaps it's worthwhile for a nude scene, but other than that, I don't watch telly thinking "I wish this had better resolution". I actually think "this program is crap".

not the most obvious "qualification" (1)

way2trivial (601132) | about 7 years ago | (#18675167)

"'1080p provides the sharpest, most lifelike picture possible.'"

The use of "qualification" in the summary means an exception to the claim. I feel it necessary to qualify the last word ("possible") pretty strongly, as the biggest qualification of that statement:
http://www.google.com/search?hl=en&safe=off&q=QFHD [google.com] is quite possible, and it does exceed 1080p.

compression (0)

Anonymous Coward | about 7 years ago | (#18675211)

I'd rather see an improvement in the compression technology used for HD signals. Ever see an HD demo video with fast motion? Sometimes the parts of the screen containing fast motion get really blocky (especially if the colors are relatively dark). It looks crappy, but most people don't notice. There is a rollercoaster demo they use in the stores that I notice blocky patches in every time I see it.

Better use of the bandwidth ... (0)

Anonymous Coward | about 7 years ago | (#18675251)

A much better use of the bandwidth (and cost) would be higher frame rates and a much larger dynamic range. Panning shots on a large screen look awful, as do dark parts of an otherwise bright scene.

Re:Better use of the bandwidth ... (0)

Anonymous Coward | about 7 years ago | (#18675779)

A much better use of the bandwidth (and cost) would be higher frame rates and a much larger dynamic range. Panning shots on a large screen look awful, as do dark parts of an otherwise bright scene.
Those are artifacts from lossy compression, something the HD marketing blitz would prefer you didn't know about ;-) I find compression artifacts irritating yet I've only ever been irritated by lack of resolution on web videos.

If you use it as a monitor, HELL YEAH... (1)

nweaver (113078) | about 7 years ago | (#18675269)

If you use your HDTV as a computer monitor, definitely.

One of the nice things about the Mac Mini is that it will drive a 1080p signal right out of the box: just hook up a DVI cable or a DVI->HDMI cable to that shiny HDTV and go to town.

More info on 720p/WXGA (5, Informative)

StandardCell (589682) | about 7 years ago | (#18675275)

Having worked in the high-end DTV and image processing space, our rule of thumb was that the vast majority of people will not distinguish between 1080p and WXGA/720p at normal viewing distances for up to around a 37"-40" screen UNLESS you have native 1920x1080 computer output. It only costs about $50 more to add 1080p capability to the same size glass, but even that is too expensive for many people because of some of the other implications (i.e. more of and more expensive SDRAM for the scaler/deinterlacer especially for PiP, more expensive interfaces like 1080p-capable HDMI and 1080p-capable analog component ADCs, etc.). These few dollars are not just a few dollars in an industry where panel prices are dropping 30% per year. Designers of these "low-end" DTVs are looking to squeeze pennies out of every design. For this reason alone, it'll be quite a while before you see a "budget" 1080p panel in a 26"-40" screen size.

At some point, panel prices will stabilize, but most people won't require this either way. And, as I mentioned, very few sources will output 1080p anyway. The ones I know of: Xbox360/PS3, HD-DVD, Blu-Ray and PCs. All broadcast infrastructure is capable of 10-bit 4:2:2 YCbCr color sampled 1920x1080, but even that is overkill and does not go out over broadcast infrastructure (i.e. ATSC broadcasts are max 1080i today). The other thing to distinguish is the frame rate. When most people talk about 1080p, they often are implying 1080p at 60 frames per second. Most Hollywood movies are actually 1080p but at 24fps which can be carried using 1080i bandwidths and using pulldown. And you don't want to change the frame rate of these movies anyway because it's a waste of bandwidth and, if you frame rate convert it using motion compensated techniques, you lose the suspension of reality that low frame rates give you. The TV's deinterlacer needs to know how to deal with pulldown (aka "film mode") but most new DTVs can do this fairly well.

In other words, other than video games and the odd nature documentary that you might have a next-gen optical disc for on a screen size greater than 40" and for the best eyes in that case, 1080p is mostly a waste of time. I'm glad the article pointed this stuff out.

More important things to look for in a display: color bit depth (10-bit or greater) with full 10-bit processing throughout the pipeline, good motion adaptive deinterlacing tuned for both high-motion and low-motion scenes, good scaling with properly-selected coefficients, good color management, MPEG block and mosquito artifact reduction, and good off-axis viewing angle both horizontally and vertically. I'll gladly take a WXGA display with these features over the 1080p crap that's foisted on people without them.

If you're out buying a DTV, get a hold of the Silicon Optix HQV DVD v1.4 or the Faroudja Sage DVDs and force the "salesperson" to play the DVD using component inputs to the DTV. They have material that we constantly used to benchmark quality, and that will help you filter out many of the issues people still have with their new displays.
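The "film mode" pulldown mentioned above (24 fps film carried in a 60-field interlaced stream) can be illustrated with a toy sketch. This is hypothetical illustration code, not any actual deinterlacer chip's implementation:

```python
def three_two_pulldown(frames):
    """Expand 24 fps film frames into 60i fields via the classic 3:2 cadence.

    Each pair of film frames yields 5 fields (3 from the first, 2 from the
    second), so 24 frames/s -> 60 fields/s. Fields alternate top/bottom.
    """
    fields = []
    parity = 0  # 0 = top field, 1 = bottom field
    for i, frame in enumerate(frames):
        for _ in range(3 if i % 2 == 0 else 2):
            fields.append((frame, "top" if parity == 0 else "bottom"))
            parity ^= 1
    return fields

# 4 film frames -> 10 fields: the A-A-A-B-B-C-C-C-D-D cadence
print([f for f, _ in three_two_pulldown(["A", "B", "C", "D"])])
```

A deinterlacer that detects this cadence can drop the duplicate fields and reassemble the original progressive frames, which is why 1080p/24 film content can survive a 1080i pipe intact.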

Re:More info on 720p/WXGA (0)

Anonymous Coward | about 7 years ago | (#18675735)

Insignia NS42LCD. Full HD panel at 1500 CDN$.

"Color bit depth (10-bit or greater)." What's the point? Most LCD computer monitors are 18-bit, never mind a TV!

Content (4, Insightful)

BigDumbAnimal (532071) | about 7 years ago | (#18675289)

This has bugged me for awhile.

Many TV manufacturers have been pushing 1080p. They have even shown images of sports and TV shows to show off their TVs' great picture. However, the fact is that it is very unlikely that anyone will be watching any sports in 1080p in the near future in the US. Television content producers have spent millions upgrading to HD gear that will only support 1080i at the most, with 720p as the top progressive-scan resolution. They are not likely to change again to go from 1080i to 1080p to benefit the few folks with TVs and receivers that support 1080p. As others have pointed out, 1080p isn't even supported by the HD broadcast standard.

The only sports you will see in 1080p will be some crappy sports movie on Blu-ray.

Just do this... (1, Funny)

BigGar' (411008) | about 7 years ago | (#18675291)

Splash $0.02 worth of bleach in your eyes. You'll be more than happy with the old NTSC standard after that.

Viewing distance calculator (5, Informative)

Thaelon (250687) | about 7 years ago | (#18675359)

Here [carltonbale.com] is a viewing distance calculator (in Excel) you can use to figure out way more about home theater setups than you'll ever really need.

It has viewing distances for user selectable monitor/TV/projector resolutions & sizes, seating distances, optimal viewing distances, seating heights(?!), THX viability(?!) etc. It's well researched and cited.

No I'm not affiliated with it, I just found it and liked it.

Perhaps true, but technically iffy (4, Informative)

redelm (54142) | about 7 years ago | (#18675437)

The photo standard for human visual acuity is 10 line-pairs per mm at normal still-picture viewing distance (about one meter), i.e. 0.1 mil per line pair. But 20:20 vision is only 0.3 mil (1 minute of arc). A 50" diagonal 16:9 screen is 24.5" vertical. 1080 lines gives 0.58 mm each. At 8' range this is 0.24 mil: within 20:20, but not within photo standards.

Of course, we are looking at moving pictures, which have different, more subjective requirements. A lot depends on content and "immersion". Many people watch these horribly small LCDs (portable and aircraft) with often only 240 lines. Judged on picture quality, they're extremely poor. Yet people still watch, so the content must be compelling enough to overlook the technical flaws. I personally sometimes experience the reverse effect at HiDef: the details start to distract from the content!
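The parent's arithmetic checks out. A quick verification sketch (taking 1 mil as 1 milliradian, with the small-angle approximation):

```python
import math

height_in = 50 * 9 / math.hypot(16, 9)    # 50" 16:9 diagonal -> ~24.5" tall
line_mm = height_in * 25.4 / 1080         # one scan line: ~0.58 mm
distance_mm = 8 * 12 * 25.4               # 8 feet in mm
angle_mil = line_mm / distance_mm * 1000  # milliradians per line

print(f"{line_mm:.2f} mm/line, {angle_mil:.2f} mil")
# ~0.58 mm and ~0.24 mil: finer than the ~0.3 mil (1 arc-min) 20:20 limit,
# but coarser than the ~0.1 mil photographic standard.
```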

What about the real resolution? (0)

Anonymous Coward | about 7 years ago | (#18675515)

Most "HDTV"s out there are only 1366x768 panels. I've also seen 1024x768. Then you have the odd 1920x1200 monitor, and the occasional 1440x960. So what's the point of discussing 1080i vs 1080p if you can't even display it properly unless you get a so-called FullHD panel? It's all gonna get resized otherwise.

Sigh... (1, Interesting)

Anonymous Coward | about 7 years ago | (#18675721)

Putting 200mph tires on a Toyota Prius is downright stupid. Putting them on my Porsche 928 is not quite as stupid.

The issue with 1080p is not as clear-cut, though. There are a LOT of factors you have to take into consideration. 1080p has a higher resolution and a higher frame rate, period. Is this better or not? That will SERIOUSLY depend on what media you are watching on the 1080p set, and moreover, on whether you are using a very sharp LCD/plasma display or not. (My guess is the answer is "yes" for most people.)

One big factor is that these 1080p screens are not very well adept for use with analog material. They can take very precise digital data very well, but converting analog material to digital screens will leave very irritating artifacts. That is, pixels jumping back and forth, which REALLY show up badly in slow scenes or those scenes with next to no motion.

Analog SD TVs are very good at overcoming these problems. The [i]nterlace and analog construct allows a lot of this to be fuzzed, enough that you don't notice it. In that sense, it's not fair to compare 1080p with a tube SD picture. 1080p can NOT fuzz an image; it is not capable of doing so without jumping a pixel, which is noticeable. Software can do the fuzzing, but that will degrade the image.

By definition, Analog can "fuzz" what Digital can't. And most people will notice. Will they CARE is an entirely different issue. I tend to have very good hearing and can tell a GREAT difference between a CD and an mp3/AAC/OGG song. This does not require high-end equipment. I can tell the difference between an Apple Lossless file, and an AAC file from the exact same song on the exact same iPod. But the question is, do I care? The answer is no.

Either way, I think that 1080p is too LOW of a resolution, as long as the screen is digital. You can actually tell the crappy picture, which wouldn't bother you at 1/4 the resolution if it were an analog tube.

But seriously, it's a matter of the source medium. Play a DVD on an SD TV (tube) set, and then compare it to S-VHS, and then VHS. You can STILL see the difference, and this article is bickering about the same thing. Been there, done that. 8mm film is awesome, even though it has lower resolution than most digital camcorders these days. A 50-year-old Soviet-made 16mm camera shoot will run circles around 1080p with modern film. So what!?