
Ultra HDTV on Display for the First Time

Zonk posted about 8 years ago | from the completely-useful dept.


fdiskne1 writes "According to a story by the BBC, the successor to HDTV is already out there. The resolution? 7680 x 4320 pixels. Despite the 'wow' factor, the only screens capable of using Ultra High Definition Television are large movie screens, and no television channel has the bandwidth needed for this image. Some experts, in fact, say the technology is only a novelty. Until the rest of the necessary technology catches up, the only foreseen use for Ultra HDTV is in movie theatres and museum video archives." From the article: "Dr. Masaru Kanazawa, one of NHK's senior research engineers, helped develop the technology. He told the BBC News website: 'When we designed HDTV 40 years ago our target was to make people feel like they were watching the real object. Our target now is to make people feel that they are in the scene.' As well as the higher picture resolution, the Ultra HD standard incorporates an advanced version of surround sound that uses 24 loudspeakers. "


314 comments


Goddamnit... (5, Funny)

Cyno01 (573917) | about 8 years ago | (#16113939)

And I just bought an HDTV last week.

Re:Goddamnit... (1, Insightful)

Anonymous Coward | about 8 years ago | (#16113974)

Well, given how long it took HDTV to get out there, you should have 30-40 years of use before it goes obsolete.

Re:Goddamnit... (0)

Anonymous Coward | about 8 years ago | (#16113991)

Hmmmm... I wonder what Jenna Jameson looks like in THAT resolution.

Re:Goddamnit... (0)

Anonymous Coward | about 8 years ago | (#16114097)

You really want to see all the zits and injection marks on the porn slut when some turd plows her ass - you'll get to see the shit flakes on the cock, too. You'd like that, too?

Re:Goddamnit... (2, Interesting)

drinkypoo (153816) | about 8 years ago | (#16114182)

I just want to make a serious comment here; this is precisely what happened with video, and now with digital video. 8mm film, when transferred to video, lost a lot of information, and that had the effect of smoothing out blemishes. Shooting direct to video meant a lot less was lost, and you saw a lot more pimples. Now digital video has brought us another level of nastiness, because splotchiness in an image is even more pronounced when you've got artifacting going on - we have MPEG-compressed DV, which is then decompressed and processed, and recompressed with MPEG2, at a different bitrate (and probably in a substantially different format). So at once you get the clarity of DV and the splotchiness of recompressed MPEG, and every pimple, blackhead, scar, and abscess one's had since birth stands out in living color.

Re:Goddamnit... (1)

archen (447353) | about 8 years ago | (#16114024)

I think some of these tech companies might find it in their best interest to "cool it". If they keep pushing out fringe technology all the time for something that has traditionally been stable over decades, they're likely to see consumer burnout, where the consumer no longer cares about the latest technology and may be loath to ever upgrade, since none of their current equipment works well with the newer stuff.

Backlash? There's a cycle for this stuff (4, Insightful)

postbigbang (761081) | about 8 years ago | (#16114080)

First, it's pre-announced. Then there's a lag between the neat-idea exposure and mass-market reality. It took about ten years for HDTV of the dull 1080i type to become affordable (if you consider just under $1K affordable -- and it will drop further soon).

Digital photography was pre-announced. Looked great, even at megapixel rates. Kodak scoffed, so did Fuji. Both hedged their bets, and it's a great thing they did or they'd be in Chapter 7. It took about the same time from pre-announcement to mass-market approval. Now you can go to Brookstone and get a 640x320 matchbox-sized camera for $50, and digital 'disposables' are arriving.

"Cool it" is anti-consumption. Do we need television AT ALL? That's a question still to be answered. I'm all in favor of advancing technology, especially if it feeds the poor and gives quality of life a boost. While an UltraHD TV might have only speculative value, it pushes the boundary, and that's what humanity is all about.

So fie on your 'fringe' technology. PCs were 'fringe' when I was soldering together and wire-wrapping motherboards in the pre-IBM and pre-Kaypro days. What we did, goofy as it sounds, is the reason you can post on /. to begin with.

Re:Backlash? There's a cycle for this stuff (5, Funny)

mcmonkey (96054) | about 8 years ago | (#16114130)

So fie on your 'fringe' technology. PCs were 'fringe' when I was soldering together and wire-wrapping motherboards in the pre-IBM and pre-Kaypro days. What we did, goofy as it sounds, is the reason you can post on /. to begin with.

Can I get off your lawn now?

k thanx

And...uh...replacing IMAX? (1, Insightful)

Anonymous Coward | about 8 years ago | (#16113944)

Title says it all...

Replacing IMAX? (2, Informative)

Richard Kirk (535523) | about 8 years ago | (#16114374)

I saw it at IBC last week. They had a camera on the top of the RAI showing live shots of Amsterdam as well as stuff from disc, all at 60 Hz. It looked pretty good. It wasn't like looking out of a window - though it might have done if the screen had been window-sized. The screen was big like an IMAX screen, and you could let your eye wander around it in the same way. I felt there was some sharpening and colour-processing nonsense going on. I guess the total contrast was something like 2000:1, so you will need a high dynamic range version before the highlights and shadows look quite convincing. However, getting 2000:1 is pretty impressive - a lot of the scattered light in projectors comes from the pixel edges, and you must have a lot more of those. All in all, it was pretty sweeeet. They also had a 4K LCD display outside the theatre, and that looked good too.

I was told the downlink for the live camera was sending 52 Gbits/sec, which doesn't quite match the figures others were coming up with. The data might have been 16 bits per channel. The camera was about a foot cube, which is pretty good, as a blimped IMAX camera is the size of a small car.

I don't know where the figure of not being ready for 25 years comes from. The project never had a manufacturing timetable. I would imagine that if there were demand, it could be ready a lot earlier.

Does it replace IMAX? I am not sure. I would like to see it show footage scanned from the original "North of Superior" footage. I have seen a strike from the original negative of that, and I remember the image being so impressive that you felt the tilt when the aeroplane cornered: you believed your eyes over your inner ear. It would be interesting to know if this rig could do the same.

Oh good! (1)

LunaticTippy (872397) | about 8 years ago | (#16113954)

An excuse to not upgrade to Blu-Ray or HD-DVD!

Now I won't have to lose geek credibility when I say SD is "good enough."

The device (4, Funny)

also-rr (980579) | about 8 years ago | (#16113958)

Also requires blood to be sampled, and only one life form to be detected in the room, before it allows you to play your DNA-protected version of "Star Wars IV - Remix 92 - The Jedi Beat The Terrorists (2020 release)".

Re:The device (2, Insightful)

iainl (136759) | about 8 years ago | (#16114121)

Mind you, in the '77 release, the Jedi _are_ the terrorists. Lucas seems to have got something of a bone to pick with Bush, judging from the heavy-handed subtext of the prequels, too.

Re:The device (1)

drinkypoo (153816) | about 8 years ago | (#16114211)

Mind you, in the '77 release, the Jedi _are_ the terrorists. Lucas seems to have got something of a bone to pick with Bush, judging from the heavy-handed subtext of the prequels, too.

The outline of the story of nine movies was written before any of them were shot. Lucas picked the middle of the story partly because he felt it was the only portion that could be carried off successfully with the technology of the day, and partly because he felt it would be the most palatable.

This isn't to say that the whole story was written, but the basic facts are, in theory, more or less in the same places. Except for that midichlorian bullshit, I hope.

Re:The device (1)

Golias (176380) | about 8 years ago | (#16114390)

The outline of the story of nine movies was written before any of them were shot. This is a myth based on some comments Lucas made during interviews regarding Empire.

The truth is, he wrote a long movie which started in the middle to feel like a Saturday serial, and upon realizing it was too long to shoot, he took the first third of his idea and created Star Wars.

Empire and Jedi did not follow the remaining script ideas he had written to the letter. For example, the scenes with the Ewoks were originally conceived as a planet of Wookiees, and the idea of Leia and Luke being siblings was never part of the original script.

The Prequels were cut from whole cloth. He did not have that part of the story written at all before he began the original trilogy.

Oh, and the claim that he was drawing from Joseph Campbell's mythology concepts: Also after-the-fact revisionist bullshit. He stole far more from Kurosawa, Asimov, Herbert, and Kirby than he ever did from "The Hero With A Thousand Faces."

At least, for the first trilogy. For the prequels, he laid on that "Power of Myth" crap so thick there was hardly any room for a story (which is just as well, since the story stinks on ice).

Great... (0, Redundant)

Comatosis (798554) | about 8 years ago | (#16113961)

That's all we need in the future: $700,000 TVs! Seriously people, if you want REAL, then go OUTSIDE. That is true reality: you smell, taste, and see it all, with unlimited resolution.

Re:Great... (2, Informative)

maxwell demon (590494) | about 8 years ago | (#16114090)

I'm sure the MPAA is already working up something to restrict this. After all, why would you think you have a right to get all those experiences for free? :-)

BTW, it's not true that you get it with unlimited resolution. There are several limits to the resolution you get. First is the wavelength of light: red light has a wavelength of about 700 nm, so you can't resolve red detail any finer than that. Violet light is about 400 nm, so you get nearly twice the resolution there, but it's still limited.

The second limit is in your eyes. You simply don't get more "pixels" than your retina provides. So even the light-wavelength limit is actually purely theoretical. Note that you cannot offset this by going arbitrarily close, because below some minimal distance your eyes won't focus any more.

Re:Great... (3, Informative)

drinkypoo (153816) | about 8 years ago | (#16114233)

The second limit is in your eyes. You simply don't get more "pixels" than your retina provides.

This is pure nonsense, because our brain doesn't work in pixels. It works in concepts, and what you think you're seeing is actually constructed in your brain from a combination of what your optic nerve feeds to your brain, and what you remember about seeing similar things before. YOU DO NOT PERCEIVE REALITY. You perceive your brain's model of reality. This is the most important thing to remember about your senses, and most people have never heard it or are all too willing to forget and pretend that yes, they are directly connected to reality.

Do some research on saccades [wikipedia.org] ... but here's the meaty part of the wikipedia page:

Humans and other animals do not look at a scene in a steady way. Instead, the eyes move around, locating interesting parts of the scene and building up a mental 'map' corresponding to the scene. One reason for saccades of the human eye is that only the central part of the retina, the fovea, has a high concentration of color-sensitive photoreceptor cells called cone cells. The rest of the retina is mainly made up of monochrome photoreceptor cells called rod cells, which are especially good for motion detection. Consequently, the fovea makes up the high-resolution central part of the human retina.

By moving the eye so that small parts of a scene can be sensed with greater resolution, body resources can be used more efficiently. If an entire scene were viewed in high resolution, the diameter of the optic nerve would need to be larger than the diameter of the eyeball itself. Subsequent processing of such a high-resolution image would require a brain many times larger than its current size.

In other words, you have no idea what you're talking about.

Re:Great... (5, Funny)

darkitecture (627408) | about 8 years ago | (#16114107)

Seriously people, if you want REAL, then go OUTSIDE. That is true reality: you smell, taste, and see it all, with unlimited resolution.

No, see you're missing the point. I don't want REAL LIFE. I want LIFELIKE. Because let's face it, no matter what happens in real life, I doubt I'm ever gonna have the opportunity to bend Elisha Cuthbert over the closest piece of furniture and give her the worst 30 seconds of her life.

But if we can make screens mimic reality, then we're one step closer to every twisted geek's fantasy - the Holodeck. And I guarantee you, Holodeck-Elisha is more open to experimentation. One just has to hope that Real-Holographic-Simulated-Evil-Lincoln doesn't spring to life and go on a rampage, wrecking the ambience.

Re:Great... (1)

slipangle (859826) | about 8 years ago | (#16114339)

The Holodeck will be the last invention of mankind. Once you go in, you'll never come out. I know I wouldn't. Alien archaeologists will find our remains in these little rooms all over the planet. Eventually they'll break down, but by then nobody will be left who knows how to fix them. I can't wait. -- "Doesn't own HDTV"

Re:Great... (1)

chrismcdirty (677039) | about 8 years ago | (#16114385)

You can settle for your Holodeck-Elisha. But I won't be happy until I get my Elisha-Cuthbert-Bot.

Dude... (0)

Anonymous Coward | about 8 years ago | (#16114416)

that's rape.

Anyone have a video link of the demonstration? (5, Funny)

phpWebber (693379) | about 8 years ago | (#16113967)

I want to see if it looks better than my computer monitor resolution.

Re:Anyone have a video link of the demonstration? (1)

Flibz (716178) | about 8 years ago | (#16114044)

Mod parent funny!

640 (1, Funny)

gEvil (beta) (945888) | about 8 years ago | (#16113968)

640 (by 480) ought to be enough for anyone...

Re:640 (0)

jimstapleton (999106) | about 8 years ago | (#16114010)

OK Bill...

hrmph! (2, Funny)

B5_geek (638928) | about 8 years ago | (#16113978)

The inventors were overheard saying: "Big deal. IT'S THE CONTENT, STUPID!"

At least books will always have a higher (mental) resolution, it's to bad nobody reads anymore.

Re:hrmph! (1)

shawn(at)fsu (447153) | about 8 years ago | (#16114028)

Not to sound like a troll, but please give references to studies or book sales showing that book reading as a whole is going down...

Re:hrmph! (3, Funny)

Gospodin (547743) | about 8 years ago | (#16114168)

I would but I couldn't find any videos about it.

Re:hrmph! (4, Funny)

gEvil (beta) (945888) | about 8 years ago | (#16114135)

...it's to bad nobody reads anymore.

It's a shame that writing skills are on the decline, too.

Re:hrmph! (1)

Control Group (105494) | about 8 years ago | (#16114292)

Wish I had seen this before I posted. You, sir, are deserving of a +1 Funny.

Ahead of their time (1)

neonprimetime (528653) | about 8 years ago | (#16113979)

However, it is unlikely to be available to the public for at least 25 years.

Whoa, /. posted a story that is ahead of its time!

Re:Ahead of their time (1, Offtopic)

Malc (1751) | about 8 years ago | (#16114141)

Oh gosh: that means 25 years of dup stories.

Re:Ahead of their time (1)

AcidLacedPenguiN (835552) | about 8 years ago | (#16114169)

well they only did it to lock in 25 years worth of dupes.

The final resolution jump? (5, Interesting)

w33t (978574) | about 8 years ago | (#16113982)

That's quite the resolution.

I wonder, can the human eye even see such high resolution; does it even matter at that point? I mean,

According to this page [72.14.205.104] it would appear that each human eye is a 15 megapixel camera.

If my maths are correctish then 7680 x 4320 is 33 million pixels.

So then, the question is - does this mean that by adding both eyes together, at best humans have 30 megapixel resolution vision?

Could this be considered "full human" resolution?
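A quick back-of-envelope check of those numbers (a toy Python sketch; the per-eye figure is just the linked page's estimate, not a hard fact):

<ecode>
# Comparing the UHD pixel count to the ~15 MP-per-eye estimate cited above.
uhd_pixels = 7680 * 4320
eye_pixels = 15_000_000                   # the linked page's estimate
print(uhd_pixels)                         # 33177600, i.e. ~33 megapixels
print(round(uhd_pixels / eye_pixels, 1))  # ~2.2x a single eye's estimate
</ecode>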

Re:The final resolution jump? (1)

brunascle (994197) | about 8 years ago | (#16114034)

I was thinking the same thing.

Not only that, but to fully utilize the entire resolution, it would have to completely fill your field of vision. That would probably require strapping the screen to your face.

Re:The final resolution jump? (4, Interesting)

KillerBob (217953) | about 8 years ago | (#16114195)

The pixel density is higher than the eyes can see, unless it's taking up your full field of vision. But the other thing to keep in mind is that your eyes are essentially two cameras working in parallel. We subconsciously interpolate the information they're sending to create depth, but we also subconsciously interpolate the data to increase the resolution (and sharpen the image). Pick something in your room, take off your glasses if you wear them. It's relatively in focus, depending on how bad your prescription is. Now... close one eye, then the other. Notice that with both eyes open, the focus is better than it is with one eye closed, and it doesn't matter which eye is closed for that effect. Even if you're like me where one eye is near-sighted and the other is far-sighted. (My right is -0.50, my left is +0.25)

I don't know the exact numbers, but we'll use the figure of 15 megapixels per eye... just because a single eye is 15MP doesn't mean that both eyes working in tandem give you 30MP. In astronomy, you can drastically increase the resolution of a picture you're taking by taking a dozen pictures spread out over a large area. If they're taken at the same time, you can interpolate the missing data and produce a *really* high-resolution picture. I'd be surprised if we aren't subconsciously doing the same thing with our eyes.
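The astronomy trick described above -- combining several coarse, offset exposures into one finer image -- can be sketched in a few lines of Python (a toy 1D example with exact, known sub-pixel shifts; real dithering has to estimate the shifts and cope with noise):

<ecode>
import numpy as np

# Shift-and-add reconstruction: four coarse samplings of one signal,
# each offset by a known sub-pixel shift, interleave back onto the fine grid.
fine = np.sin(np.linspace(0, 20 * np.pi, 400))  # the "scene" no camera sees

factor = 4                                       # each exposure is 4x coarser
frames = {s: fine[s::factor] for s in range(factor)}

recon = np.empty_like(fine)
for shift, frame in frames.items():
    recon[shift::factor] = frame                 # slot samples at their offset

print(np.allclose(recon, fine))                  # True: detail fully recovered
</ecode>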

Re:The final resolution jump? (0)

Anonymous Coward | about 8 years ago | (#16114076)

yes

fta:

a maximum estimate of approximately 15 million variable-resolution pixels per eye. Assuming 60 Hz stereo display with a depth complexity of 6, it was estimated a rendering rate of approximately ten billion triangles per second is sufficient to saturate the human visual system.
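For what it's worth, the quoted estimate checks out arithmetically, assuming roughly one triangle per pixel per depth layer (a quick Python check):

<ecode>
# 15M pixels/eye x 2 eyes (stereo) x 60 Hz x depth complexity 6
print(15e6 * 2 * 60 * 6)  # 10800000000.0 -- about ten billion triangles/second
</ecode>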

Re:The final resolution jump? (0)

Anonymous Coward | about 8 years ago | (#16114111)

Isn't the issue one of scale? The article mentions movie theatres. 15 million pixels may look great on an 88-square-foot display, but what about when a screen becomes the size of my house? What if you wanted to go bigger?

Re:The final resolution jump? (1)

Hahnsoo (976162) | about 8 years ago | (#16114128)

The human brain does a fair amount of interpolating and "interlacing" to the images received by the optic nerve. This is evident in the "blind spot" that we all have. So while the actual visual field may be finite, the resolving power of the post-processed image can be much higher. One can argue that the resolution offered by the human eye is not additive, but redundant (both eyes see virtually the same content). One can also argue that it is synergistic (allows for depth perception).

Re:The final resolution jump? (1)

starfishsystems (834319) | about 8 years ago | (#16114134)

I wonder, can the human eye even see such high resolution

Not all at once, of course. The point of having higher resolution is so that when you attend to one part of the scene, you won't perceive a reduction in image quality.

Re:The final resolution jump? (1)

The New Stan Price (909151) | about 8 years ago | (#16114138)

Megapixels aren't the only factor; the size of the display plays into the equation too. I would think that some of the resolution is there so that the picture looks good even on a big screen, where the eye must pan around to see the whole scene. That said, my printer does 600dpi, which equates to a 12.8 x 7.2 inch picture at 7680 x 4320. This would make for a fairly small monitor. Does your eye notice a difference between 300dpi and 600dpi? Mine does. Why do we not have a problem with our printers being this high def, but we cannot imagine our displays being this high def?
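The print-size arithmetic above, spelled out (a quick Python check; pixel counts from the article):

<ecode>
# Physical size of a 7680x4320 frame on paper at a given dot density.
for dpi in (300, 600):
    print(dpi, "dpi:", 7680 / dpi, "x", 4320 / dpi, "inches")
# 300 dpi: 25.6 x 14.4 inches
# 600 dpi: 12.8 x 7.2 inches (the size quoted above)
</ecode>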

Re:The final resolution jump? (1)

uab21 (951482) | about 8 years ago | (#16114189)

Why do we not have a problem with our printers being this high def, but we cannot imagine our displays being this high def?

...because most of us don't try reading a printed page from across the room. The page is closer and probably fills more of the field of vision than the TV in front of the couch does for most of us, so the effective level of detail for the print is different.

Re:The final resolution jump? (1)

llZENll (545605) | about 8 years ago | (#16114193)

"Why do we not have a problem with our printers being this high def, but we cannot imagine our displays being this high def?"

A very simple reason: distance. When was the last time you looked at your photo album from 10-50 feet away? Or how about watching your 50" TV or 30' cinema screen from 2 feet away?

Re:The final resolution jump? (1)

drinkypoo (153816) | about 8 years ago | (#16114250)

IBM had 200 dpi LCD screens at least five years ago, although AFAIK they never commercialized them. My IBM Thinkpad A21p has a 1600x1200 resolution 15" display: 133dpi.
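The 133dpi figure follows directly from the panel geometry (a quick Python check):

<ecode>
import math

# Pixel density of a 1600x1200 panel with a 15-inch diagonal.
ppi = math.hypot(1600, 1200) / 15  # diagonal pixels / diagonal inches
print(round(ppi))                  # 133
</ecode>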

Re:The final resolution jump? (1)

kennygraham (894697) | about 8 years ago | (#16114165)

This would be the case if the display were only being viewed by one person, and the display knew where that person was looking on the screen. Human eyes can only see much detail in the center of their gaze. A "full human resolution" monitor of this sort would have to display that level of detail across the entire screen.

Re:The final resolution jump? (2, Insightful)

interiot (50685) | about 8 years ago | (#16114166)

If you had a display that wraps completely around you, e.g. "surround vision", then you certainly couldn't look at the entire display at one time, so it would be reasonable to have media that carried more data than the human eye can see.

Re:The final resolution jump? (1)

DittoBox (978894) | about 8 years ago | (#16114324)

If that's true, then why can I notice stark differences between 150, 300 and 600 dpi images? It has to do with pixel density. That's why signage is 50-150 dpi: it's readable from a distance. Small print work like books, magazines, business cards and the like is 300-600 dpi, because it's seen much closer. In order to have a 12pt font size and still be readable, you need high pixel/dot density. That's a little oversimplified, so read this: http://en.wikipedia.org/wiki/Dots_per_inch [wikipedia.org]

Re:The final resolution jump? (1)

Rik Sweeney (471717) | about 8 years ago | (#16114336)

That is true, but by that time scientists will have increased the resolution of the human eye to 66 megapixels.

Rather like the way scientists increased the speed of light in Futurama.

The answer is Moire (0)

Anonymous Coward | about 8 years ago | (#16114386)

The reason we will almost always need more resolution is the Moiré effect [wikipedia.org].

It doesn't matter where the human eye's resolution tops out; as long as your display is made of discrete square dots, you will experience Moiré shimmering in certain scenes. Grass, screen doors, wallpaper on home-improvement shows... because a television program or movie is made by moving a camera across a scene, eventually you're bound to hit a pattern that's a problem for the resolution you have.

In printing, where resolutions of thousands of DPI are available, they *still* have to watch out very carefully for Moiré. Higher resolutions make it easier to avoid, but they don't entirely eliminate the problem.

As long as that shimmering is there, a scene will not be truly and convincingly immersive. You don't have to know about Moiré to spot that something is off.
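You can watch this happen numerically: render a stripe pattern finer than the display can represent and the energy shows up at a bogus low frequency (a toy Python sketch of the aliasing behind Moiré):

<ecode>
import numpy as np

# 97 stripe cycles rendered onto an 80-pixel row: above Nyquist (80/2 = 40),
# so the fine stripes alias into a coarse 17-cycle shimmer (|97 - 80| = 17).
n_pixels, cycles = 80, 97
x = np.arange(n_pixels) / n_pixels
row = np.cos(2 * np.pi * cycles * x)

print(np.abs(np.fft.rfft(row)).argmax())  # 17 -- not 97
</ecode>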

25 years sounds about right (2, Insightful)

windowpain (211052) | about 8 years ago | (#16114011)

The article says we might start to see these UHDTV sets in about 25 years. Although SDTV can be said to have started in the 1920s or '30s, practically speaking it's about 55 or so years old as the transition to high definition picks up steam. (2006 will be the first year more high-definition sets than standard-definition sets are sold in the US.) With the rate of technological change and Moore's law, it seems reasonable to me that the next generation will arrive in about half the time SDTV lasted.

Re:25 years sounds about right (2, Informative)

flappinbooger (574405) | about 8 years ago | (#16114040)

--insert obligatory slashdot reply here saying moore's law doesn't have anything to do with tv resolution--

Re:25 years sounds about right (1)

tlhIngan (30335) | about 8 years ago | (#16114086)

It is about right - they've been talking about HDTV since the '80s. It's just that only around now has technology reached the point where HDTV is practical. (Wasn't the original HDTV rollout years something like 1997, 2000, 2003, and so on until technology became cheap and available?)

TVs are getting cheap (what you paid for a "big screen" TV back in the 1980s would get you a nice HDTV these days, IIRC), content is starting to become available (as equipment etc. becomes much cheaper), bandwidth usage remains similar to what exists, and most importantly, processing power and storage have fallen in price. Imagine trying to record a 720p stream in the 80s and 90s using existing technology. Or even trying to edit it.

Of course, the Futurama joke about SDTVs was dated extremely quickly... (the one where Amy shows off her obscene tattoo that, because of SDTV, ends up being blurred.)

Re:25 years sounds about right (5, Informative)

Catbeller (118204) | about 8 years ago | (#16114417)

"It's just that only around now has technology reached the point where HDTV is practical. (Wasn't the original HDTV rollout years something like 1997, 2000, 2003, and so on until technology became cheap and available?)"

I'm probably the only one here who is 1) old enough to remember, and 2) was actually paying attention to the HDTV fiasco from 1985 onwards.

Analog HDTV was rolled out in Japan in the 1980's. A bit stung, the American television manufacturers and the networks hammered together a proposal to broadcast 1080p in the following way: standard def over the usual VHF channels, while the HD component would be broadcast over unused channels. Thus, Channel 2 CBS would go out as normal, while an HDTV set would take that signal and add information broadcast over channel, say, 3. All analog. All broadcast. The rollout would have been around 1990 or so.

A funny thing happened: digital video. The broadcasters saw what digital compression could do for them. Why use just one channel, with all that bandwidth, when we can now use the same two channels and broadcast four programs simultaneously? We promise that sometimes we'll broadcast in HD; just, most of the time, we'd like to make more money with four low-def channels. And they demanded, and got, 1080i, to halve the signal and thereby enable more channels on the side.

And their wish was granted. These were the years of no-regulation, after all. The issue of public ownership of the airwaves was going bye-bye, and the government would like to auction off those frequencies anyway, which leads us to

Cable. Since so much programming was going over cable, the Gov decided that public regulation of public airwaves was silly and undermining competition. So long Fairness Doctrine, so long limits on corporate ownership and monopoly control. And so additionally, why force public airwaves to go digital when cable could deliver it so much better than they?

And network TV didn't really want to pay to upgrade, either, so that slowed it down a lot. Delay after delay...

THEN the kicker. The "content owners" saw that in the digital age they had a chance to lock down signals and force people to pay each time they accessed their "property". They wanted taping to go away as well -- they hated VCRs and almost killed the tech in 1984. They could win this one, and so was born the Broadcast Flag, a digital lock on transmissions that controlled the use of the program. Cue a big delay as HDMI, HDCP and all the other locks were developed and approved by the "content" industry.

Now... it's the 21st century, almost 20 years late, and we've got crappy 1080i signals going over the air, infomercials clogging all those channels we can access for free, and we can't record the standard 1080i signal.

Remember, the public airwaves are supposed to belong to we the people, and the broadcasters and producers are supposed to dance to our tune. Somehow they are now the masters, and we those begging for mercy.

Finally! (1)

growse (928427) | about 8 years ago | (#16114016)

I'll be able to hook this up to a server and run my console apps at a *really* high resolution.

Typical (3, Funny)

Yahweh Doesn't Exist (906833) | about 8 years ago | (#16114021)

you wait 40 years to upgrade and a week later you're obsolete.

what I hate about TV is how the specs are so hardware-dependent. all kinds of numbers and letters and if it differs by 1 character your thousands of dollars might have been wasted.

imo it should be more like computers: you basically have a processor that determines your data processing and a display device that determines your viewable resolution. almost everything else is software and thus improvements are continuous and ongoing. it's a much better model than upgrading every couple of decades, with a half-decade period when your TV is too good for the signal.

once TV is based on more internet-like digital technologies this will hopefully happen.

Re:Typical (1)

maxwell demon (590494) | about 8 years ago | (#16114209)

once TV is based on more internet-like digital technologies this will hopefully happen.

And then we will have movies starting with the message "This movie will be enjoyed most on a 1024x768 TV." And if your TV has a higher resolution, the movie will be shown in a small rectangle in the middle.

At least that would be the analog to many of today's web pages.

Re:Typical (1)

Yahweh Doesn't Exist (906833) | about 8 years ago | (#16114269)

no, that's exactly how TV works.

with computers you can handle any source your CPU is up to and can always click full screen.

and you have source options; ever watched a trailer from Apple? you get the choice of small, medium, large and HD res.

with TVs it's either HD or not. and HD versions are often completely different channels. it's lame, just like those "+1 hour channels" are lame - if you had decent scaling/shifting abilities in the first place you could do 10x as much 10x as easily.

Pity (2, Insightful)

suv4x4 (956391) | about 8 years ago | (#16114030)

Pity that money and time are spent on increasing the specs of something that is already in abundance.

As technology matures there's a race for bigger, faster, and finer. But this race is not eternal: in a few years the sweet spot is hit and people are no longer interested in higher resolutions.

With TV resolution, this sweet spot is already somewhere between DVD and EDTV, way below 1080p. So yeah, don't expect "technology to catch up" in that respect, as the summary suggests, since no one cares for it to catch up in this way.

Re:Pity (1)

Ctrl-Z (28806) | about 8 years ago | (#16114199)

Correct me if I'm wrong, but isn't DVD the same resolution as EDTV, 480p? The wikipedia article on enhanced-definition television [wikipedia.org] says that DVD is at the lower end because it isn't capable of a 60 Hz frame rate (480p60). So you're basically saying that 480p is the "sweet spot". I beg to differ.

40 years ago!? (2, Interesting)

Buddy_DoQ (922706) | about 8 years ago | (#16114043)

"When we designed HDTV 40 years ago..."

Whoa! 40 years ago!? Amazing! Crazy how long it took to go public/mainstream. I guess it's one thing to design something and quite another to build upon it.

Re:40 years ago!? (0)

Anonymous Coward | about 8 years ago | (#16114214)

That's misleading. He's talking about the Japanese version of HDTV, which wasn't digital and would have taken the bandwidth of six standard TV channels to transmit one channel. Needless to say, that version was never deployed anywhere else, though they did try to sell it to the US as an HDTV system. Its inefficiency is what spurred the development of real HDTV.

Too much? (1)

milo_a_wagner (1002274) | about 8 years ago | (#16114053)

The focus of this article seems to be on the domestic/consumer future of this technology. I don't doubt that U-HDTV is an impressive and immersive experience, but is there really a place for this in our homes? There is a limit to the level of detail we can see on an average-sized (say, 28") television across the room. The size of television screen we could purchase is somewhat limited by the size of living room we can afford, after all - and HDTV sets large enough for the viewer to appreciate the quality of picture they display tend to dwarf all but the largest of rooms. The article suggests that U-HDTV might be available to consumers in around 25 years, but, even then, will we want or need it?

Re:Too much? (1)

curmudgeous (710771) | about 8 years ago | (#16114266)

This is getting close to what I like to refer to as immersive video. My ideal setup would have one wall (or a major portion of one) covered by a video screen with good enough resolution that you can't count the pixels from ~5 ft away. Have this screen tied into an ambiance/environmental feed that is time-synched to my location. Get up in the morning and feel like going to the beach? Just select the beach channel, complete with surf, wind, and sunbathers. Likewise for rain forest, arctic tundra, etc. For the more people-oriented types, we could even have shopping mall or city street feeds. Not quite a holodeck, but it would be good enough for me.

Re:Too much? (1)

milo_a_wagner (1002274) | about 8 years ago | (#16114364)

Sounds terrifying. I'm not sure this kind of 'immersive' experience will be a reality for a long time yet, anyhow. And talk about prohibitive expense! Even Star Trek didn't have holodecks (or similar) in living quarters!

Jimmy Buffett had something to say about this (0, Offtopic)

thisnow1 (882441) | about 8 years ago | (#16114055)

Legal problems gettin' thick and hazy
Look at the people gettin' rich and crazy
Locked up in mansions on the top of the hill
Someone needs to tell them 'bout overkill

Overkill, overkill
Such a megalo modern problematic ill
Climb too fast and shove too hard
You'll be pushin' up the daisies in the old boneyard

Ah uh
Ah uh
Ah uh
Ah uh

I went to find the truth in the himalayas
Bundled up half-frozen munchin' milky way-uhs
Found a shaman in a diaper with a poppy pot
When I asked if he was cold he said I just think hot

Overkill, overkill
Such a megalo modern problematic ill
Climb too fast and shove too hard
You'll be pushin' up the daisies in some old boneyard

Ah uh
Ah uh

Wmr:
Out in hollywood the paper money rolls
They feed their egos instead of their souls
A million here, a million there
A mindless corporate dance
Gettin' paid for fuckin' off in the south of france

They don't do the shows
But they act like the stars
They fly around in g-4's and suck on big cigars
It ain't about the talent
It ain't about the skill
It's all about the silly stupid horseshit deal!

Overkill, overkill
Such a megalo modern problematic ill
Climb too fast and shove too hard
You'll be pushin' up the daisies in the old boneyard

[ horn/pan break ]

I got no corporate gig
I got no guru (ah uh)
I don't own ocean front in honolulu (ah uh)
You write the big checks
But I pay your bills
Now someone's got to tell you 'bout overkill

Overkill, overkill
Such a megalo modern problematic ill
Climb too fast and shove too hard
You'll be pushin' up the daisies in some old boneyard

Overkill, overkill
Such a megalo modern problematic ill
Climb too fast and shove too hard
You'll be pushin' up the daisies in some old boneyard

Computer monitor (1)

polar red (215081) | about 8 years ago | (#16114056)

I want that to use as my second monitor. Good enough for editing text-files on.

Demo (1, Redundant)

haxrox (1003032) | about 8 years ago | (#16114065)

Are there any videos of Ultra HDTV in action? I'd love to see how it looks on my monitor.

Pron (1)

audioguy16 (1003029) | about 8 years ago | (#16114066)

How will my pron collection look on this screen, I wonder.

bandwidth (2, Insightful)

Raleel (30913) | about 8 years ago | (#16114067)

So, did I do my math right?
width * height * bytes per pixel * frames per second gives bytes per second; /1024 gives KB, /1024 gives MB, /1024 gives GB:
7680 * 4320 * 3 * 25 / 1024 / 1024 / 1024 = 2.3174 gigabytes per second

That's quite a chunk for streaming video. Of course, there will be compression techniques and other tricks, but it's still pretty impressive.
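The math checks out; here it is as a reusable Python snippet (3 bytes/pixel and 25 fps as above, no compression assumed):

<ecode>
# Uncompressed video data rate, ignoring container and compression overhead.
def raw_rate_gib(width, height, bytes_per_pixel, fps):
    return width * height * bytes_per_pixel * fps / 1024 ** 3

print(round(raw_rate_gib(7680, 4320, 3, 25), 4))  # 2.3174 GiB/s
</ecode>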

Re:bandwidth (2, Informative)

hattig (47930) | about 8 years ago | (#16114397)

I expect that video will be 5 bytes per pixel by the time this comes out - already the latest version of the HDMI specification allows for 36 bits per pixel, which would require 5 bytes.

So 7680 x 4320 x 5 at 60fps = 9.3GB/s.

Another comment said that this was 25 years away, although I wouldn't be surprised if it were only 15, the way things are progressing. 9.3GB/s is offered on even low-end graphics cards these days, but the bandwidth problem is between the player and the display; i.e., the HDMI-equivalent specification of the time will have to carry that much bandwidth.

HDMI 1.3 currently carries over 1GB/s on its interlink, so that's probably not a worry either.
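The same arithmetic for this comment's 5-byte, 60 fps case (quick Python check):

<ecode>
# 7680 x 4320 x 5 bytes x 60 fps, in GiB/s
print(round(7680 * 4320 * 5 * 60 / 1024 ** 3, 2))  # 9.27, i.e. ~9.3GB/s
</ecode>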

"ultra" high def has been around for a while... (1)

cyanics (168644) | about 8 years ago | (#16114083)

Actually, we don't even use the full spec size of HD now. The real spec is 1920x1080. We haven't even been using anything close to that. "Ultra", huh? What's next? "Super-Duper"? "Magnum"? "Mega-uber"?

Re:"ultra" high def has been around for a while... (2, Funny)

Kawolski (939414) | about 8 years ago | (#16114285)

They should've called it "Very HD" and saved "Ultra HD" for the next one.

Moon movies (1)

meckardt (113120) | about 8 years ago | (#16114094)

This format may also be useful for showing the often missing moon landing movies.

Excellent (1)

Hijacked Public (999535) | about 8 years ago | (#16114099)

'When we designed HDTV 40 years ago our target was to make people feel like they were watching the real object. Our target now is to make people feel that they are in the scene.'

Best TV related comment I have read in a great while.

Don't actually go outside and interact with other people; sit at home and your TV will make you think you are interacting with other people.

Someone needs to make this guy listen to She Watch Channel Zero.

Yay!!!! (1)

aonaran (15651) | about 8 years ago | (#16114103)

Now I just have to wait for it to come down in price and I'll finally be able to have an HDTV LCD that displays ALL 3 common resolutions without doing funky scaling tricks:
1080i/p pixels = 4x4 block of real physical pixels, 720p pixels = 6x6 block of physical pixels, 480i/p pixels = 9x9 block of physical pixels.
1080i PiP is still 1080i at 1/4 of the screen - no downscaling.

THIS, folks, is what I've been waiting for ever since HDTV was announced.

If only I could afford to be an early adopter on this technology.
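The integer scale factors work out as claimed, assuming 16:9 frames throughout (a quick Python check; the 854x480 width for widescreen SD is an assumption):

<ecode>
# Whole-number pixel-block sizes when mapping common formats onto 7680x4320.
formats = {"1080i/p": (1920, 1080), "720p": (1280, 720), "480i/p": (854, 480)}
for name, (w, h) in formats.items():
    print(name, round(7680 / w), "x", round(4320 / h))
# 1080i/p 4 x 4, 720p 6 x 6, 480i/p 9 x 9
</ecode>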

Re:Yay!!!! (1)

dhartshorn (456906) | about 8 years ago | (#16114262)

No foolin! And four 1080 Picture in Picture sources at once. With 24 speakers, we can have four separate 5.1 audio streams. Too bad we only have two eyes & two ears.

If the new CPU is multicore, the new TV is multisource. Just think of Arnold's wall TV in Total Recall.

When you get to the parts you step through (you know the ones, you pr0n addict!), blow up the source to full screen.

Ooooh flat! (1)

lymond01 (314120) | about 8 years ago | (#16114112)

Say it with me: HOLODECK

Give me gyroscopes and holograms! It doesn't matter if that Klingon bat'leth is UHD resolution...all you'll see is a low-res blur before your very high res intestines spill out before you!

(Of course, those aren't really your intestines, but this holodeck goes for intensity in imagery.)

Re:Ooooh flat! (1)

Control Group (105494) | about 8 years ago | (#16114261)

(Of course, those aren't really your intestines, but this holodeck goes for intensity in imagery.)

Since the safety protocols have no doubt broken/been bypassed/been shut off/been overridden by a rogue AI inside the holodeck program itself, those are, in fact, really your intestines.

Sony vs Microsoft (3, Funny)

mcai8rw2 (923718) | about 8 years ago | (#16114114)

Haha! That resolution sounds flippin' great... but I can see it now:

"Ken Kutaragis' head announces that the "playstation 14" ships WITHOUT the foot wide ultra-ultra-ultra-HDMI cable."


Meanwhile, CMDR Taco [deceased] writes on how playstations "neural implant connect-kinetic extremity dongle [N.I.C.K.E.D]...was 'actually just a rehash of the Wiiiiiiiiiiiiiiiiiiis controller.

That's 31 Megapixels! Camera optics ready? (1)

Dr. Spork (142693) | about 8 years ago | (#16114123)

They finally have some still cameras that can usefully photograph at such huge megapixel counts, but getting a video camera to shoot more than 20 frames per second at them is a different story. Besides, it's been my experience, and the experience of many, that the real bottleneck in digital cameras is no longer the pixel count but the optics themselves. By which I mean that pictures at 30Mpix will not look better than pictures at 7Mpix, even with upmarket still-camera optics these days.

What's more, while all the electronics in digital cameras are quickly improving, the optics are not. The lenses of an expensive camera from the 80's are ground and set every bit as accurately as the lenses on new cameras. This is just not an area with much room for drastic improvements. But if 31Mpix is ever going to pay off, there had better be some drastic improvements in the optics. There isn't much use for the extra pixels if all they show is blur.

Re:That's 31 Megapixels! Camera optics ready? (1)

Control Group (105494) | about 8 years ago | (#16114240)

It's not the precision of the optics that matters*, it's the size. You could support 31 megapixels if you were willing to put a big enough aperture on the front of the camera. Look at the lens of an HD TV camera sometime - it's much wider than the lens on an SD TV camera. The extra information is captured by widening the light intake, rather than perfecting the light intake for a given width.

*This statement is to be taken strictly in the context of improving on current optics for higher resolutions. Obviously, the precision of the optics is the most important single factor in the technical quality of a photograph from a reasonable camera; but, as you pointed out, the precision of the optics is pretty much plateaued.
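One way to put a number on "it's the size": the diffraction limit. A minimal sketch, assuming green light, an f/2.8 aperture, and a 36x24mm full-frame sensor (all illustrative values, not any specific camera's spec):

<ecode>
# Airy disk diameter ~ 2.44 * wavelength * f-number caps the smallest
# resolvable spot; sensor area / spot area bounds the useful pixel count.
wavelength_mm = 550e-6                 # green light, 550 nm
f_number = 2.8                         # assumed aperture
spot_mm = 2.44 * wavelength_mm * f_number

sensor_w_mm, sensor_h_mm = 36.0, 24.0  # full-frame sensor (assumed)
useful_px = (sensor_w_mm / spot_mm) * (sensor_h_mm / spot_mm)
print(round(useful_px / 1e6))          # ~61 megapixels before diffraction wins
</ecode>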

Also... (2, Funny)

Kirin Fenrir (1001780) | about 8 years ago | (#16114124)

The $3000 version of the PS4 is built specifically for Ultra HDTV! Pre-order now!

But why? (1)

140Mandak262Jamuna (970587) | about 8 years ago | (#16114183)

Ramping up pixel count is like Detroit building bigger and bigger engines: 380 cubic inches, baby, vrrooom vrrooom. I hope they improve the dynamic range of these screens instead, which is a pathetic 1000:1 for most. The human eye has about 1,000,000:1. Also, why not some real depth perception too?

I guess I'm just iggernant (1)

Control Group (105494) | about 8 years ago | (#16114206)

I'm serious when I say this, so bear that in mind as you snicker... but: what's the challenge here? What's the innovation? If you're not worried about how you're going to fit it into an existing transmission medium (that is, they obviously aren't worried about sending it OTA on a TV channel), then what's the challenge in designing a higher-resolution spec?

How is this different than me defining a video spec that operates at 1048576 x 589824 pixels x 120 fps, non-interlaced? Is it just that they spent the money to have custom hardware designed to meet their UHD spec?

(As I alluded to at the beginning, I suspect I'm just ignorant of what actually goes into developing this sort of thing - informative replies will get a cookie. Never mind that it comes from slashdot.org, I promise that's my cookie you're getting.)

Re:I guess I'm just iggernant (0)

Anonymous Coward | about 8 years ago | (#16114297)

*snicker*

WHUXGA (7680 x 4800 pixels) (2, Informative)

mrcgran (1002503) | about 8 years ago | (#16114228)

Just wait a few more years for WHUXGA...

From http://en.wikipedia.org/wiki/HUXGA [wikipedia.org]
WHUXGA - 7680×4800 - 16:10 - 37M pixels

WHUXGA, an abbreviation for Wide Hex[adecatuple] Ultra Extended Graphics Array, is a display standard that can support a resolution of up to 7680 x 4800 pixels, assuming a 16:10 aspect ratio. The name comes from the fact that it has sixteen (hexadecatuple) times as many pixels as a WUXGA display. As of 2005, one would need 12 such displays to render certain single-shot digital pictures, for instance a 14836 x 20072 pixel image created by a Betterlight Super 10K-2.

Long ways from human eye resolution (1)

192939495969798999 (58312) | about 8 years ago | (#16114232)

Don't worry, we're still a long ways from human-eye resolution. The eye can detect a single photon, which means until the display has the control to know exactly how many photons it's emitting (not coming soon), it still won't be *quite* realistic. Well, that and the fact that it's a flat screen and we have binocular vision.

Re:Long ways from human eye resolution (2, Insightful)

Cougem (734635) | about 8 years ago | (#16114379)

Yes, the human eye's rod cells can detect single photons, but they are slow, colour-insensitive, and of relatively low density at the fovea (the part of the eye we usually use to fixate on objects).
Still, it's completely irrelevant. Yes, our eye may be able to sense very small amounts of light, but that's nothing to do with resolution; the eye must be able to pinpoint the location where the photon landed, and that is limited by the 6 million or so cones we have, and a lot of parallel/serial processing.

It took 40 years, that shows (1)

TheCouchPotatoFamine (628797) | about 8 years ago | (#16114239)

It took 40 years; that shows how much we really need it, I suppose.

"Now, books in 3pt font! Rush to Best Buy for yours today!"

Format Wars 20x6 (1)

liak12345 (967676) | about 8 years ago | (#16114277)

I predict that when this TV becomes a household staple in 20 years, the Battleship Mauve-Ray discs will totally beat out the QD-DVD (quantum density) discs in terms of movie quality, and all you QD fanbois can cry more, noobs.

The Law of Diminishing Returns (1)

Hootenanny (966459) | about 8 years ago | (#16114286)

This is where the law of diminishing returns kicks in...

Many people, when shown a 60" screen at a reasonable viewing distance, can't tell the difference between 720p and 1080p. The added resolution of UHDTV would only be of benefit on a large movie screen from a close viewing distance. But movies implemented the ideal screen resolution decades ago... It's called film.

ONLY theaters? (1)

soft_guy (534437) | about 8 years ago | (#16114302)

I think it is completely reasonable to have this kind of technology in movie theaters. The whole concept of a movie theater is that it is an expensive experience that cannot be replicated at home. If you have HD-DVD at home plus a large-format HD display or a projector, then what is the point of going to the movies? I think it is OK for theaters to invest in a technology that makes the answer to that question something other than "none".

Film (2, Informative)

GWBasic (900357) | about 8 years ago | (#16114319)

Ultra-HDTV's resolution is comparable to 35mm and 70mm film. This will probably be what's adopted when digital projection becomes mainstream in theaters.

Triptychs (1)

Doc Ruby (173196) | about 8 years ago | (#16114345)

I don't know why we don't already just use 3 screens across at 1600x1200 (UXGA) all the time. We look at the middle screen for detail, and the side screens fill our peripheral vision. It seems like mounting a bezel on the screens that pushes out past their frames to join in a seam is a lot easier than making a really big panel. And the lower detail demand for the side screens could mean the driving boards don't need to triple the UXGA performance.

I'd expect this kind of rig to already be standard for gamers, whose peripheral vision is essential for survival. And therefore fairly cheap for the rest of us who just want desktops and movies that aren't like chasing a carrot on a stick.

It _is_ useful for normal screens... (1)

Bromskloss (750445) | about 8 years ago | (#16114349)

...in the same way that high-resolution still images are -- they might not fit in their entirety on the screen, but they let you zoom in on details. If your processor is fast enough to keep up with all the data at all, that is.

Please, please, please! (1)

Cybert4 (994278) | about 8 years ago | (#16114395)

Do not make another interlaced standard!

Seen it, awesome. (3, Informative)

JFMulder (59706) | about 8 years ago | (#16114414)

I've seen this at NAB this year in Vegas. It's awesome. The sound system has 9 speakers on the upper layer surrounding the crowd, 10 middle speakers around, and 3 lower speakers right in front, with two LFEs. It actually uses two projectors IIRC, one for chrominance and one for luminance. They showed a bunch of footage filmed for the occasion. Since it came from Japan, it involved a lot of soccer games, Japanese landscapes and... Ultra High Def sumo fat wiggling. At the end, they showed real-time footage from a tower on top of the convention center. It was pretty cool, though you could see some noticeable compression artifacts in some places.