
High Dynamic Range Monitors

kdawson posted more than 7 years ago | from the details-details dept.


An anonymous reader writes, "We are seeing more and more about high dynamic range (HDR) images, where the photographer brackets the exposures and then combines the images to increase the dynamic range of the photo. The next step is going to be monitors that can display the wider dynamic range these images offer, as well as being more true-to-life, as they come closer to matching the capabilities of the ol' Mark I eyeball. The company that seems to be furthest along with this is Brightside Technologies. Here is a detailed review of the Brightside tech." With a price tag of $49K for a 37" monitor (with a contrast ratio of 200K to 1), HDR isn't exactly ready for the living room yet.
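
For the curious, the "combines the images" step is conceptually simple. Here is a minimal Python sketch of the usual weighted merge, assuming already-linearized frames and known exposure times (real tools such as Photoshop's Merge to HDR also recover the camera's response curve, which is skipped here):

    import numpy as np

    def merge_exposures(images, exposure_times):
        """images: list of float arrays scaled to 0..1; exposure_times: seconds per shot."""
        num = np.zeros_like(images[0])
        den = np.zeros_like(images[0])
        for img, t in zip(images, exposure_times):
            # "Hat" weight: trust mid-tones, distrust clipped shadows and highlights.
            w = 1.0 - np.abs(2.0 * img - 1.0)
            num += w * img / t               # each frame's estimate of relative scene radiance
            den += w
        return num / np.maximum(den, 1e-6)   # HDR radiance map, unbounded above 1.0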


131 comments

Medical Imaging (4, Insightful)

BWJones (18351) | more than 7 years ago | (#16415377)

Of course, one of the other principal arenas where monitors like this are valuable is medical imaging. One of the serious shortcomings in the migration of radiology to digital formats is the reduced quality of the images as compared to film. The dynamic range of film is simply so much greater than what can be achieved with standard CRTs or LCD monitors that there is a real danger of missing very subtle changes in X-rays, for example. While it's true that image processing can make up for some differences, digital still can't quite compete with film for many purposes, including data density.

Re:Medical Imaging (0)

Anonymous Coward | more than 7 years ago | (#16415665)

Last time I got an x-ray it was digital, but they printed it out.

I beg to differ. (5, Informative)

purduephotog (218304) | more than 7 years ago | (#16415691)

Mammography has gone completely digital. Why? Because the quality of the imagery is light-years better than what you can get from film. Couple that with rapid processing from a laser scanner, throw in algorithms that contrast-enhance areas of nearly neutral density, and you have a recipe for catching growths that would otherwise escape detection.

A good radiologist could detect subtle differences of about 80% that of a standard person. I'd give you the exact quote, but it's been a while since I looked at the data; suffice to say I was impressed at the level (in controlled lighting conditions) they were able to see in film.

A good medical display is a peeled LCD (all the color filters have been chemically removed from the surface), typically with a brighter backlight and an extra polarizer to knock the Lmin down even further. This gives you better dynamic range that can be adjusted far faster than film: want to zoom in? No problem, touch and zoom; with film you'd grab a loupe (or crane your head closer). Digital wins hands down.

Yes, if you digitize a negative you have a data density that can't be reached very easily (I used to estimate this for a job, for large quantities of imagery at high quality ratios: 2 micron spot sizes). But frankly a lot of that information is useless; you don't need to know what isn't relevant.

The most important aspect of digital imaging is a proper viewing environment, something no one seems to get. Reduce the lighting of the area to 0.5 fc and remove any sources of glare off the monitor. Wear dark clothing. Have wall-wash lighting of about 3-9 fc. Have surfaces neutral gray. Ceiling black.

Digital definitely competes with film in many markets for medical X-ray; mammography was just the easiest to choose because it has undergone such a radical change in such a short time period.

I should note I used to work for Eastman Kodak and worked with other individuals on these digital products (specifically, algorithms)... but I'm not biased because of that. Just the simple truth: from the raw data I've seen, I'll feel happy and safe knowing my wife gets a digital mammogram every year.

Re:I beg to differ. (1)

mattkime (8466) | more than 7 years ago | (#16415803)

Further, digital imaging often requires less radiation exposure.

I had an interesting conversation with my dentist about the new digital X-ray equipment.

Re:I beg to differ. (1)

Amouth (879122) | more than 7 years ago | (#16417155)

I thought you were going to say you stayed at a Holiday Inn Express

Re:I beg to differ. (1)

pz (113803) | more than 7 years ago | (#16416603)

You also forgot to add that exact film processing conditions (concentrations, age of the reagents, temperatures, humidity, etc.) can affect the image tremendously in often unreproducible ways. Digital imaging reduces many of these uncontrolled variables.

Re:I beg to differ. (3, Informative)

BWJones (18351) | more than 7 years ago | (#16416979)

I think that you are missing the point of my argument. I was supporting the use and implementation of HDR monitors because of some of the current limitations of digital radiology. All of the things that are done to medical quality LCDs and digital enhancement are an attempt to narrow the difference in image quality between film and computer display and HDR monitors will help this out considerably.

I am not arguing against digital radiology, rather I am all for it because of the inherent benefits (less rads, less time, less film processing variability, more convenient, etc....etc....etc...), but the reality is that digital radiology is still not all it could be. You said it yourself in that a well trained radiologist can detect about 80% of the differences present in digital representation. Well..... 20% is still a lot of potential misses on diagnoses.

The reason that digital has been so successful is not necessarily its inherent superiority in image quality. Rather, it has been successful because it is cheaper and more convenient, especially given the trend away from traditional medical records management.

As to the density of information, I routinely take film images of electron microscopy captures and digitize them because of the convenience, and that is working on the nanoscale range. I am throwing information away by the conversion, but it is more convenient for all of the reasons we have already talked about.

Re:I beg to differ. (0)

Anonymous Coward | more than 7 years ago | (#16417043)

You like other people touching your wife's tits and then staring at them?

Re:I beg to differ. (2, Funny)

Anonymous Coward | more than 7 years ago | (#16417115)

I like you getting ass cancer because you wouldn't let the doctor stick his finger in your pooper even better.

Re:I beg to differ. (1, Insightful)

Silverstrike (170889) | more than 7 years ago | (#16417229)

For the answer on digital vs. film.

How many silver halide molecules can you fit into any given area?

Now, how many pixels can you fit into that same area?

Exactly.

Re:I beg to differ. (1)

Firehed (942385) | more than 7 years ago | (#16417503)

It's not that simple. Not by a long shot. For one, you're comparing a negative to a display. Extrapolate those molecules to what it looks like when you're seeing the enlarged print and you're getting close. Then consider innumerable other variables, such as film speed. Then consider how you can increase the physical size of the CCD in your new camera - it's not quite as simple to replace a decades-old film standard.

Prosumer-level equipment is already comparable to 35mm film, and will probably exceed it entirely within a year or two. Of course it's a fairly subjective thing, but having done some work developing prints, I can say that the quality difference is fast becoming negligible and the various other benefits make it fully worthwhile.

Re:I beg to differ. (0)

Anonymous Coward | more than 7 years ago | (#16417649)

Digital mammography is the best. Tongues work good too.

Re:I beg to differ. (1)

icepick72 (834363) | more than 7 years ago | (#16418629)

Just skimmed over your post quickly, this is what I got from it:
Mammography: the surface, No problem- touch, hands, proper viewing environment,

You are indeed taunting the teenage /.ers here.

Re:Medical Imaging (2, Informative)

ketamine-bp (586203) | more than 7 years ago | (#16416245)

I believe that in the interpretation of X-rays (chest or abdomen), most disease states/patterns are pretty obvious and do not require anything more than a careful eye on a 1000x1000 image of 8-bit grays to interpret. As for skeletal X-rays, you can usually see the lesion, or it is simply not there.

For CT and MRI, however, the best thing about using a computer to read the study rather than reading it on printed film is that you can actually adjust the window (from the bone window to the soft-tissue window, etc.), distinguishing adipose-containing nodules from nodules composed of 'real' soft tissue, and so on. And THAT doesn't take a very high resolution or a high dynamic range image either. And don't tell me you want to put every window into one image so we don't have to adjust it... it would be much more difficult to read than ye olde window adjustment...

Re:Medical Imaging (1)

dfghjk (711126) | more than 7 years ago | (#16417383)

"The dynamic range of film is simply so much greater than can be achieved with standard CRTs or LCD monitors..."

Nonsense. You can't compare a recording device (film) with a playback device (monitor). Digital sensors (the equivalent of film) can have just as great a dynamic range as film and are inherently more linear. Monitors have greater dynamic range than print output. Slide film is better than print.

There are already specialized displays for medical imaging.

Re:Medical Imaging (1)

BWJones (18351) | more than 7 years ago | (#16417763)

Nonsense. ......... slide film is better than print.

Hello? Say what?

Just for your edification though, it is generally accepted that, given current technology, film still has about 1.5 to 3 stops more dynamic range than digital. Clicky clicky [anandtech.com] for just one reference out of many.

Re:Medical Imaging (1)

dfghjk (711126) | more than 7 years ago | (#16418107)

Please. If you're going to talk photography, at least have the decency to quote a photography site. Anandtech is no expert, and the link you provided had no data to back up its incorrect claim.

Print has a maximum dynamic range of about 5.5 to 6 stops unless it uses a special process. Film itself can go up to about 11 stops although typical films offer less than 10. Furthermore, film is not linear and the resultant output is typically compressed to 8.5 or less. Dmax for slide film can be as great as 3.6 or 3.7 versus about 2.0 for print. Here's a quick link that is somewhat more useful than yours: http://www.marginalsoftware.com/Scanner/density_range.htm [marginalsoftware.com] . You can find real information on the subject if you are willing to look. Here's another: http://photo.net/learn/drange/ [photo.net]

Regarding digital imaging, current digital SLRs offer dynamic range of over 10 stops, surpassing any available color film. For medical imaging work, imagers could be made with even better dynamic range. Don't know if they bother.

Finally, any monitor easily exceeds print for dynamic range. After all, print has a contrast ratio of maybe 50:1. 6.5 to 7 stops is common from monitors.
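
To reconcile the units being traded in this thread (density, contrast ratios, stops), the conversions are just logarithms. A quick illustrative Python snippet using the figures quoted above:

    import math

    def stops_from_ratio(ratio):       # a 50:1 print -> ~5.6 stops
        return math.log2(ratio)

    def stops_from_dmax(dmax):         # slide film at Dmax 3.6 -> ~12 stops
        return math.log2(10 ** dmax)

    print(stops_from_ratio(50))        # ~5.64
    print(stops_from_dmax(3.6))        # ~11.96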


Who's not ready? (1)

MLopat (848735) | more than 7 years ago | (#16415417)

Maybe you're not ready for one in your living room, but I'm looking for the order form! Who wouldn't want this? And if you think the price is outrageous, consider how expensive LCD TVs were 7-10 years ago.

Re:Who's not ready? (0)

Anonymous Coward | more than 7 years ago | (#16415579)

Me. I have no need for this crap. Maybe I don't understand the whole point of an HDR TV/monitor, but I'm thinking I would much rather have a camera that can effectively do HDR.

My reasoning: every pixel on the screen is displayed separately, while the camera's meter reads the WHOLE image, not separate light and dark spots, and adjusts accordingly.

Am I completely wrong?

Re:Who's not ready? (1)

kidgenius (704962) | more than 7 years ago | (#16415805)

Good idea....but wouldn't you want to view that image in all its HDR glory? If you took a photo in HDR and you can't view it the same way you took it, it's not doing much good for you.

Re:Who's not ready? (1)

Frenchy_2001 (659163) | more than 7 years ago | (#16416179)

You are missing the point that your eye cannot even see such a high contrast all at once.
The eye has a very effective contrast control mechanism (the pupil), but it works over time, as the eye scans the scene. So our dynamic range at any instant is small, but because that range is continually adjusted to the subject we observe, the image we end up recording uses a much higher range.

The idea of the GP, which is currently under study by camera manufacturers, is to let the camera do that range adjustment as it scans the image. You can then render the resulting image on a normal monitor or print it out.

I've seen some images recreated with those algorithms and they are stunning.

The display described here is the brute force approach. It will work, but it is a lot of effort for a result that can be reached more easily.
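
One common software approximation of that "adjust as you scan" idea (a deliberately crude sketch, not necessarily what any particular camera maker does) is local adaptation: divide each pixel by a heavily blurred copy of the image, so each region is exposed relative to its own neighbourhood rather than the global average. Real operators work hard to suppress the halos this naive version produces.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def local_adapt(luminance, sigma=30.0):
        """luminance: 2-D array of HDR scene values -> values roughly centred on 1.0."""
        neighbourhood = gaussian_filter(luminance, sigma) + 1e-6   # local average brightness
        return luminance / neighbourhood                           # expose each area locally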

Re:Who's not ready? (1)

ketamine-bp (586203) | more than 7 years ago | (#16416291)

More dynamic range means easier post-processing and no need to bracket exposures. The dynamic range of digital sensors and film is very often much worse than that of our eyes, and that makes photos unrealistic, for one reason or two. By increasing the dynamic range, even if we downsample the bit depth, we'll see a more realistic image.

Re:Who's not ready? (0)

Anonymous Coward | more than 7 years ago | (#16415625)

I will take my anger out on you rich people by owning your boxen and blinding you with your super bright displays.

Old tech (1)

lisaparratt (752068) | more than 7 years ago | (#16415419)

Weren't there SGI monitors once upon a time that had an extra gun for rendering brighter highlights?

Re:Old tech (1)

dorfmann (1010467) | more than 7 years ago | (#16415551)

I was at Siggraph in 1997, and saw an SGI monitor that was amazing - I had to do a double-take, it looked so real. I don't know if it was an early HD monitor or what, but damn, I wish I had one!

Re:Old tech (0)

Anonymous Coward | more than 7 years ago | (#16418317)

All the (CRT) SGI monitors I've ever seen have just been Sony Trinitron, often identical to the Sun ones, except for the logo.
If it was an LCD (I'm not sure when they released those) it could have been a 1600SW which is special ;)

The really interesting thing is that SGI monitors (CRT/LCD) all seem to have relatively low contrast ratios (like 400:1), whereas elsewhere there seems to be a race to get the highest contrast ratio possible. SGI machines also seem to have a pretty high gamma (I think 1.7, against 1.2 for the PC).

I'd guess the quality of the pictures you saw had less to do with the monitor (unless it was a 1600SW), and more to do with the output capabilities of SGI machines, where 48-bit RGBA (36-bit colour) is standard on all but the lowest-end machines.

Re:Old tech (1)

Jeremy Erwin (2054) | more than 7 years ago | (#16419351)

According to ColorSync, the Mac Standard Gamma is 1.8, PC Standard is 2.2.

Won't a monitor with genuine HDR hurt your eyes? (0)

Anonymous Coward | more than 7 years ago | (#16415457)

I've used laptops that were a bit painful for extended use at maximum brightness, as is the DS Lite.

I appreciate that you can also make the blacks blacker, but surely there is a perceptual limit there which we are fairly near already.... especially since even a hint of ambient-light glare on the screen will wash out the differences at the dark end of the spectrum.

Re:Won't a monitor with genuine HDR hurt your eyes (0)

Anonymous Coward | more than 7 years ago | (#16415645)

Even the brightest displays (with the exception of direct laser-into-the-eye projection systems) don't come close to the intensity of natural light. Sunlight is up to 200 times brighter than an office with typical lighting. Try using a backlit (i.e. non-reflectively lit) notebook display in bright sunlight and you'll see what I mean: the maximum brightness of the notebook display is so low that your eye can't resolve the difference between the white-vs-sun and black-vs-sun contrasts - IOW you can't read the display.

Re:Won't a monitor with genuine HDR hurt your eyes (0)

Anonymous Coward | more than 7 years ago | (#16415771)

Yes, and my point is that a display as bright as direct sunlight would be unpleasant to use - do I really want to use sunglasses when I'm watching a documentary about deserts? More importantly, do I want to give advertisers the chance to use borderline painful flashes of light to attract my attention?

I do agree that bright displays are needed for outdoor use on sunlit days, at least for situations where you can't just turn the display at 90 degrees to the light.

Laser displays (1)

slobber (685169) | more than 7 years ago | (#16415483)

This seems like a good candidate for high dynamic range if it is not vaporware:

http://hardware.slashdot.org/article.pl?sid=06/10/11/0214254 [slashdot.org]

Re:Laser displays (1)

bwogowly (1012947) | more than 7 years ago | (#16418943)

Yeah, I'll go with the laser TV. I've known about Brightside for a few weeks now and realized that it is not needed at all. I don't know how bright my 27" mid-2004 SD CRT gets, but I experience the bloom effect (images so bright there appears a kind of glare around the image) with it. TV sets are bright enough; it's the dynamic range (what Brightside calls contrast ratio) that's important.

This laser TV has an infinite contrast ratio (contrast ratio: the brightest part of the TV divided by the darkest point) because the lasers at the subpixels can turn off completely, which means black, no light at all. I've read a comment about SED (Surface-conduction Electron-emitter Display) saying that it's like making the leap from standard def to high def all over again. You've got to ask, why? The same is said about Brightside, and if that's the only difference, once we pin down what that difference is, then you don't need the brightness that Brightside offers, but the contrast ratio. Contrast ratio is the figurative leap from low def to high def all over again.

So, if a set has, say, over a 100,000:1 contrast ratio you should be OK. I'm not sure whether the SED's contrast ratio is measured passively or actively (whether they measured the different brightnesses from a white screen to a "black" screen, or whether they were measured while displayed at the same time, like a checkerboard pattern, which gives a more real-world ratio). Either way, I'm going with the laser TV, no doubt there. Brightside has a 200,000:1 contrast ratio and painful brightness, and the laser TV has an infinite contrast ratio and lots of colors!

It's tres cool (3, Interesting)

PhantomHarlock (189617) | more than 7 years ago | (#16415525)

I've been seeing these at Siggraph for years. They do look very nice. You basically need a very bright light source (not hard) that doesn't generate too much heat (a little harder) and a way to modulate that light over a very large range (harder). It would be fun to have a converter for DSLR RAW images to display in HDR, or the usual bracketed ones.

The examples they usually use are things like light streaming through stained glass in a church, where normally you'd either only see the stained glass properly exposed, or the rest of the room, but not both. It does work to very good effect in those instances, and heightens the "window into the world" effect that high resolution displays have. If this were to be combined with 2X HD resolution 60P motion video (about 4,000 pixels across) it would kick serious ass as the next 'Imax' lifelike motion picture display.

Oddly enough, the captcha for the post reply screen right now is 'aperture'.

Re:It's tres cool (2, Informative)

squidfrog (765515) | more than 7 years ago | (#16417135)

Using dcraw [cybercom.net] and the Radiance (HDR) file format [lbl.gov] , it should be trivial to convert any digicam or SLR's raw image to an HDR.

For manually-captured bracketed images, there's AHDRIC [uh.edu] (disclaimer: I wrote this). As long as the EXIF info is intact and the only thing that changes between shots is the shutter speed, this should do the trick. A related tool (AHDRIA) lets you capture HDRs automatically by controlling a digicam via USB (Canon digicams only, sorry). This process can take 20-120 seconds, depending on the quality required.
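
As an aside, the reason Radiance (.hdr) files hold HDR data compactly is the RGBE pixel encoding: three 8-bit mantissas sharing one 8-bit exponent. A minimal Python sketch of that encoding (illustrative only, not code from dcraw or AHDRIC):

    import math

    def float_to_rgbe(r, g, b):
        v = max(r, g, b)
        if v < 1e-32:
            return (0, 0, 0, 0)
        mantissa, exponent = math.frexp(v)      # v == mantissa * 2**exponent, 0.5 <= mantissa < 1
        scale = mantissa * 256.0 / v
        return (int(r * scale), int(g * scale), int(b * scale), exponent + 128)

    print(float_to_rgbe(1.0, 0.25, 0.0))        # (128, 32, 0, 129)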

Re:It's tres cool (1)

Apotsy (84148) | more than 7 years ago | (#16418271)

If this were to be combined with 2X HD resolution 60P motion video (about 4,000 pixels across) it would kick serious ass as the next 'Imax' lifelike motion picture display.

It could also breathe new life into older motion pictures. A lot of movies shot during the 20th century look quite feeble on home video with today's display tech, but would look incredible if scanned and stored in a format that preserved the full dynamic range of the image. There is a tremendous amount of HDR info locked in Hollywood's vaults already. It would be wonderful for people to have a way to view it at home. An HDR home video format would probably be much more successful than the failing HD formats.

HDR rules (1, Interesting)

Anonymous Coward | more than 7 years ago | (#16415533)

If it takes an expensive display system before people stop going nuts over excessively tonemapped HDR images, then so be it. It's still going to be different from viewing the real scene because bright highlights and dark shadows will be much closer together on a relatively small screen, so our eyes won't be able to adapt as easily. A nicely tonemapped picture, perhaps combined with a slightly higher dynamic range than on today's displays, will beat "1:1" recreations any day.

Monitors? .. What about input? (4, Interesting)

UnknownSoldier (67820) | more than 7 years ago | (#16415557)

Since "true" HDR consumera camera's don't exist (anyone know?), it can be faked [flickr.com] , quite convincingly, I might add.
i.e.
"It's a feature in Photoshop CS2 or Photomatix or FDRTools."

Even black and white can support HDR. This is a great B&W example [flickr.com] of why 8-bit greyscale just doesn't cut it.

--
"The difference between Religion and Philosophy, is that one is put into practise"

Re:Monitors? .. What about input? (3, Insightful)

spoco2 (322835) | more than 7 years ago | (#16415709)

The thing to me about the 'HDR' images produced by that technique is that they look far more 'unreal' than normal photos. They have this 'hyperreal' effect that reminds me of postcards from the, erm... I guess 1940s/1950s, that had some hand retouching done to them, or a foil look.

They just, to me, look a little silly, and that's a result of having an image with more information in it than the medium it's displayed on can handle.

Now, with a display that can ACTUALLY display the full spectrum of a HDR image. THAT I'm interested in.

Why is this story only being posted now though? It's from last year!

Re:Monitors? .. What about input? (2, Informative)

Hijacked Public (999535) | more than 7 years ago | (#16415855)

HDR images are not at their best on a computer monitor, they look much better in print. Side by side a 3 stop HDR digital print generally looks better than a single exposure.

Re:Monitors? .. What about input? (1)

dfghjk (711126) | more than 7 years ago | (#16417423)

Conventional printing offers lower dynamic range than any computer monitor.

Re:Monitors? .. What about input? (1)

Hijacked Public (999535) | more than 7 years ago | (#16415785)

DSLRs are limited to 12 or 14 bits. Merge to HDR in Photoshop CS2 will do as many images as you care to take.

I have never seen a huge advantage in color prints but B&W, even with HDR, can't quite produce the results film can.

Re:Monitors? .. What about input? (1)

Rui del-Negro (531098) | more than 7 years ago | (#16417857)

> Since "true" HDR consumera camera's don't exist [...]

Depends on what you mean by HDR.

For some people, HDR simply means "a very high dynamic range" (compared to competing products, or the "normal" standards). That's the case with these monitors.

For other people, HDR means a dynamic range that is greater than that of your output medium. By this definition, an "HDR" monitor can't qualify, although a monitor can certainly be compatible with "HDR input", if both it and the graphics card support it.

And for other people HDR means virtually unconstrained, floating-point representation of light properties (including negative light, for example). Again, a monitor can't really represent this, but it can accept an input format that supports an approximation of this model. Think of it as "extreme self-calibration".

Most modern digital still cameras capture at least 20% more information than they store in JPEG format. In other words, their raw format is "HDR" (second definition above, not the third one) when compared to their JPEG output. When you load a raw file into an application, you'll typically have the option to respect the original exposure (and basically discard those extra 20%) or lower the exposure so that all the information is compressed into the "visible" range.

Photoshop's support for floating-point HDR is in its infancy (they don't even fully support LDR 16-bit images). If you want to play around with HDR, a better choice is Paul Debevec's [debevec.org] HDRShop [hdrshop.com] . In fact, Paul's site has long been one of the most important resources for digital HDR development and research (and here I include the third definition above, which is what "HDR" normally means when applied to CGI).

Unfortunately "HDR" has become a marketing buzzword, which means that in most cases it's used incorrectly, simply to mean "higher precision" or "with light blooming effects"...

RMN
~~~

Re:Monitors? .. What about input? (1)

metalhed77 (250273) | more than 7 years ago | (#16418563)

Well, they do and they don't. My 20D gets about 5 stops of dynamic range, which looks to be about what this thing outputs. It's not at all rare, in fact, for my 20D's dynamic range to exceed that of the scene, if the scene is relatively flat.

A lot of the HDR images made with Photoshop have more dynamic range than the human eye does. The human eye can trick you into thinking the contrary, however, because of how quickly it adapts as you change focus.

Re:Monitors? .. What about input? (1)

tincho_uy (566438) | more than 7 years ago | (#16420199)

Fuji's S3 Pro (and the upcoming S5 Pro) does have an extended DR feature that works quite well in many cases. It's far from perfect, but it does improve the results. Basically, it uses adjacent photosites divided into two sets to capture at different intensities (I'm not really sure of the technical aspects, but I guess each set works at a different "ISO value").

Of course this implies a loss of resolution, since you are using 2 captured pixels to create 1 pixel in the final image.

Just my 2 cents

Re:Monitors? .. What about input? (1)

tripler6 (878443) | more than 7 years ago | (#16420317)

Yeah, I think it's using something like a 12MP sensor but only outputting a 6MP image - that's the S5, IIRC. I have no idea how it's getting better dynamic range. I'm not sure though; I use a D200 myself. Normally people don't fool around with HDR unless they like taking landscapes - it's pretty good for those. I'm a sports guy, so... whatever, I could care less.

Good news bad news (1)

edwardpickman (965122) | more than 7 years ago | (#16415559)

I was pretty excited until I saw the $49K price tag. That really killed the ole puppy. At $5K I'd be highly interested, but $49K is about $44K out of my budget range. Strictly for ultra-high-enders.

Re:Good news bad news (1)

UnknownSoldier (67820) | more than 7 years ago | (#16415613)

I hear ya! Just like plasma (displays) when they first came out. Had to wait a while for the price to drop from $50k to hit below $5k, but it was worth it.

It's interesting that in "graphics", resolution is being pursued first, instead of the bit-depth issue, when the latter is just as important.

Cheers

Re:Good news bad news (2, Insightful)

edwardpickman (965122) | more than 7 years ago | (#16415743)

Excellent point. Truth be told, I'd much rather see color depth approached first. They've gotten better, but for film-level work none of them display full color resolution. Frustrating that the software will handle 48-bit (three channels of 16), but the monitors won't. It mostly becomes an issue when you are working with a lot of gradient images, skies and such; you still get some pixelation that isn't in the actual image file. Then again, if you're doing TV, who cares. They call it NTSC, never the same color, for a reason.

Re:Good news bad news (1)

Sparohok (318277) | more than 7 years ago | (#16416927)

Truth be told I'd much rather see the color depth approached first.

Color depth is HDR. They are one and the same. A monitor which can faithfully display 48 bit color is an HDR monitor by definition.

Re:Good news bad news (0)

Anonymous Coward | more than 7 years ago | (#16417289)

Actually it isn't that simple. HDR means "high dynamic range", which is mostly a fancy way of saying "high contrast". In a sunlit forest scene there are extremely high contrasts: the brightest spot is several thousand times brighter than the darkest spot. You can capture that scene in an 8-bit picture: simply assign 0 to the darkest spot and 255 to the brightest spot and spread the in-between values according to your favorite gamma curve. That is an HDR picture with low intensity resolution.

In the spatial dimension it's the equivalent of taking a picture with a low-resolution camera or a high-resolution camera: both pictures show the whole scene, only in different amounts of detail. The HDR picture with low intensity resolution captures the whole dynamic range of the scene, but an HDR picture with high intensity resolution (16 or 32 bits per channel) captures finer nuances of brightness (and color).

8-bit intensity resolution is sufficient for displaying pictures without banding on LDR displays. To really make use of an HDR display, one also needs to improve the intensity resolution, but not every monitor which can faithfully display 48-bit color is an HDR display, because high intensity resolution does not imply high dynamic range (a contrast of several thousand to one when measured with a realistic black level).
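
A tiny Python sketch of the distinction drawn above, with made-up luminance values: the whole span of a roughly 5000:1 scene fits into 8 bits, it just gets coarse steps.

    import numpy as np

    def encode_scene(luminance, gamma=2.2, bits=8):
        """luminance: array of scene values in cd/m^2 (any positive span)."""
        lo, hi = luminance.min(), luminance.max()
        norm = (luminance - lo) / (hi - lo)               # map the whole dynamic range to 0..1
        levels = 2 ** bits - 1
        return np.round(levels * norm ** (1.0 / gamma)).astype(int)

    scene = np.array([0.8, 40.0, 400.0, 4000.0])          # ~5000:1, sunlit-forest-ish span
    print(encode_scene(scene))                            # [  0  31  89 255]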

Re:Good news bad news (1)

Sparohok (318277) | more than 7 years ago | (#16418885)

In theory, you have a good point. In practice, I don't think it is particularly significant.

The original purpose of gamma correction is to quantize intensity in a way that is perceived as linear. Linear perceived intensity is very important. It results in the best information transfer in a given number of bits. It is also a bedrock assumption of most digital signal processing methods. For example, if you apply a typical smoothing filter to a nonlinear image you will change the overall intensity of the image in unintended ways.

Using a gamma curve which is too far away from linear intensity will cause all sorts of problems and in practice is seldom done. When it is done, I imagine it is a poor compromise chosen for lack of better alternatives. It's the exception that proves the rule.
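
A two-pixel Python example of the smoothing-filter point above, assuming a 2.2 gamma: averaging the gamma-encoded values and then decoding gives a visibly darker result than averaging the actual light.

    import numpy as np

    gamma = 2.2
    linear = np.array([1.0, 0.2159])            # white and a mid-grey, in linear light
    encoded = linear ** (1 / gamma)             # what a gamma-encoded image actually stores

    mean_linear  = linear.mean()                # ~0.61 -- the physically correct average
    mean_encoded = encoded.mean() ** gamma      # ~0.53 -- what filtering in gamma space yields
    print(mean_linear, mean_encoded)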

Re:Good news bad news (1)

4D6963 (933028) | more than 7 years ago | (#16419885)

Color depth is HDR

I wouldn't say that. Mainly because a high dynamic range basically means that your range goes outside the traditional 0.0-1.0 range, whereas color depth is mainly about staying within the 0.0-1.0 range, but with better precision. That said, you can still use a greater color depth to display HDR.

It's as if you had a 24-bit sound card: you could turn your 16-bit sounds into 24-bit sounds at 1/256th the level, crank up the volume of your speakers so that they sound normal (with the normal amount of noise), and then go deaf when a 24-bit sound makes use of the upper 8 bits.

Re:Good news bad news (2, Insightful)

Sparohok (318277) | more than 7 years ago | (#16420299)

There is no such thing as a 0.0 - 1.0 visual range. The human visual system is floating point, pretty much literally. You have your exponent, which is how well adapted your eyes are to the light, how dilated your pupil is, etc. You have your mantissa, which is the relative intensity within your current visual field. Physiologically, we have about 28 bits of exponent and about 10 bits of mantissa. So, proper HDR is floating point. But we're not quite there yet.

In both audio and video, this whole idea of quantizing a 0.0 - 1.0 interval is a compromise wrought by insufficient numerical resolution. It has nothing to do with physics or perception or anything else. Once you realize that, you should also realize that the idea of "going outside" the 0.0 - 1.0 range is absurd. You don't go outside the range, you expand the range so as to better approximate the incredible human senses. As long as we're using fixed point image formats and digital video standards, there will always be a range, and we'll always be inside the 0.0 - 1.0 range, and it will always be a compromise.

Audio professionals have worked out their terminology far better than graphics guys have. Audio guys talk about dB, decibels. The reference point is 0dB, which is as loud as your amp will go. When you add more bits, you're adding more quiet, not more loud. If you want more loud you buy a bigger amp. Each additional bit gives you about 6dB more quiet, and you'd better hope your equipment has a low enough noise floor that you can hear all that fresh new quiet.

So what are you saying? What's the difference between HDR and 48 bit color? To use an analogy to audio, you seem to be saying that HDR is about more loud, and 48 bit color is about more quiet. But as you go on to point out, they're really just the same thing. No matter what, you've got a clipping level (the maximum luminance of your output device), you've got a noise floor (the minimum luminance of your output device), and hopefully you've got enough quantization levels in between for perceptual linearity. That's why HDR and color depth are joined at the hip. You can't get one without the other. There is no meaningful distinction.
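
For reference, the arithmetic behind the audio analogy (standard figures, not specific to any particular display): each bit of linear quantization is worth about 6 dB, i.e. one doubling, i.e. one photographic stop.

    import math

    db_per_bit = 20 * math.log10(2)             # ~6.02 dB per bit of linear quantization
    print(db_per_bit)
    print(16 * db_per_bit, 24 * db_per_bit)     # ~96 dB for 16-bit audio, ~144 dB for 24-bit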

Re:Good news bad news (1)

4D6963 (933028) | more than 7 years ago | (#16419851)

Frustrating that the software will handle 48-bit (three channels of 16), but the monitors won't

That may be frustrating, but there's still a way to make things look better if you've got 48-bit images to display on a 24-bit monitor: dynamic random dithering.
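
A minimal Python sketch of that idea, assuming 16 bits per channel in and 8 out, with fresh noise every frame so the quantization error averages out over time:

    import numpy as np

    def dither_to_8bit(frame16, rng=np.random.default_rng()):
        """frame16: uint16 image array -> uint8, trading banding for fine noise."""
        noise = rng.random(frame16.shape)               # 0..1 of one 8-bit step, per pixel
        scaled = frame16.astype(np.float64) / 257.0     # 0..65535 -> 0..255
        return np.clip(np.floor(scaled + noise), 0, 255).astype(np.uint8)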

Re:Good news bad news (1)

Sparohok (318277) | more than 7 years ago | (#16417061)

That's short-sighted. This technology seems highly amenable to economies of scale and low-cost mass production. I'm sure there are challenges, particularly efficiency and cooling, but those are really only a problem if you need high brightness. I could see a product based on this technology costing only 20% or 30% more than equivalent LDR displays within a year or two, predicated only on market penetration.

Other HDR technologies I've seen involve far higher barriers to low-cost production. Laser projection is a great example: stunning imagery, but inherently high cost for a long time to come.

The real barrier to this technology is not cost but video standards. Most digital video standards carry only 8 bits per channel.

That's SOOO 2005... (1)

ChronosWS (706209) | more than 7 years ago | (#16415587)

BrightSide DR37-P HDR display

Published: 4th October 2005 by Geoff Richards

Um, yeah.

saw one at siggraph last year (1)

mrloafbot (739993) | more than 7 years ago | (#16415589)

I saw an HDR display from Brightside Tech either last year or the year before at Siggraph, a conference on computer graphics. I thought it was such a cool idea until I saw it in person. They had a few images up and one was of a sunset; it was like looking directly at a sunset, pretty amazing but ULTRA bright. So bright that I didn't want to look at it and my eyes couldn't adjust to it. Why buy a display so bright that you have to wear sunglasses to look at it?

Re:saw one at siggraph last year (1)

jonTu (839883) | more than 7 years ago | (#16416331)

Yep, and honestly, that's all this is: a bright-as-hell monitor. Which may be great, and may possess a "high dynamic range", but it doesn't display what we call HDR. HDR images are a series of exposures merged into one image such that a tonal range beyond the perception of a normal camera is captured. This can be a still or a still sequence (i.e. video). If you're a digital compositor, this is tremendously useful, as you can take HDR images and tweak the exposure to match the lighting of the rest of the scene. That's why Industrial Light and Magic invented the OpenEXR format for HDR images: so they could tweak exposure for their 3D renders to composite them into scenes shot with live actors (and hence lighting that can't be tweaked so much in post).

But a super-bright monitor labeled "HDR" is just a marketing gimmick. Take a look at the demo image at the bottom of TFA; it's a great example. If that were HDR, you'd be able to see the detail of the mountains, because it would at once be displaying the sky underexposed relative to the scene's average exposure value (so you see the cloud details) and the mountains overexposed (so you see the details of the peaks, instead of just silhouettes). This would display just fine on a conventional monitor, albeit in a much more life-like manner on a super-bright one.

Yes, it's true that humans can perceive a much greater luminance range than most displays can provide, but that's no reason to get sloppy with our terminology. HDR is a tool to streamline workflow for digital compositors and for photographers to create images impossible in a single exposure with conventional camera equipment. It's not a display technology.

Re:saw one at siggraph last year (0)

Anonymous Coward | more than 7 years ago | (#16416817)

You have some reading up to do. HDR means high dynamic range, so 'may possess a "high dynamic range" but it doesn't display what we call HDR' doesn't make sense. What you call HDR is one specific kludge which is often used to create HDR pictures, but certainly not the definition of HDR. The usefulness of HDR images goes far beyond tweaking exposure in post processing. HDR is also extremely important when you need to render blur and reflections. (And you can tonemap the hell out of HDR images and make DIGG users drool all over their keyboard.) The image in TFA is a picture of a HDR display, shown as an 8-bit JPG on your presumably non-HDR display, so obviously you can't see the intensity resolution and contrast of the HDR display.

That's a bit backwards (1)

tweakt (325224) | more than 7 years ago | (#16420419)


You are correct in your description of what is being labeled "HDR" currently. However, you actually have it a bit backwards. HDR is really the gimmick here. It's a trick, a way of approximating reality.

The term HDR is misleading. It's more accurate to describe it as a technique that uses dynamic range compression: taking a real-life scene with a large dynamic range and compressing it into the limited range available on a monitor or in print. You are not increasing dynamic range, you are merely creating the perception of it. An image of the sun will not be displayed any brighter than what your monitor is capable of producing.

These new monitors are capable of generating genuinely high dynamic range. They produce blinding bright highlights while still rendering deep, dark shadow details, but you can't assess this quality without viewing it directly. They're subject to the same "have to see it to believe it" problem as HDTV.

Lightsabers (1)

catbutt (469582) | more than 7 years ago | (#16415609)

I always figured if you really looked at a real lightsaber, instead of being, say, pink in the middle [wikipedia.org] , they'd really look intense burn-your-retinas red, like looking at the little red lights on the underside of your mouse.

I'm not sure I'd pay $49,000 for a tv just for that purpose, but it's the best one I could think of. I'd certainly pay that much for a lightsaber though.

Re:Lightsabers (1)

Twisted64 (837490) | more than 7 years ago | (#16416757)

I haven't had a close look at that image, but I think lightsabres are white in the middle, and the corona is the only colourful thing about them. Only found that out a week ago, when I started fiddling with some photos of my own :)

Re:Lightsabers (1)

catbutt (469582) | more than 7 years ago | (#16420113)

I think you are wrong. You can tell by the way they look when they move quickly (bright, rich color smeared across the image), as well as from the light they shine on things.

They basically look the way neon lights look when photographed....washed out where they are brightest. Not the way they look when you see them in real life.

Contrast actually goes down to 0 (1)

CliffSpradlin (243679) | more than 7 years ago | (#16415637)

If you read their site, they explain that the black level actually goes to 0, because a pixel on their screen can have 0 brightness.

Apparently this breaks the industry equation for deriving contrast (divide by zero), so they had to bump it up to something like 0.1 or 0.01.

Pretty awesome technology.

Re:Contrast actually goes down to 0 (0)

Anonymous Coward | more than 7 years ago | (#16415769)

That's marketing bullshit. Contrast is a ratio: brightness of the brighter pixel or image divided by the brightness of the darker pixel or image. Nothing in this world is truly black (unless they packed a black hole into their display). A contrast of 0 is meaningless: Contrast=1 means no contrast, same brightness. Contrasts are typically given in ratio form normalized to 1 for the darker pixel. A maximum contrast of 400:1 means that a white pixel on this display is 400 times as bright as a black pixel. (Note that contrast is not synonymous with intensity resolution: An HDR display can still be a 24bit display, just with brighter white pixels.)

Re:Contrast actually goes down to 0 (2, Insightful)

SEMW (967629) | more than 7 years ago | (#16415971)

Don't be an idiot. You don't have to have "a black hole in the display" for a pixel to have effectively zero brightness; you just have to have it not generate any light (excluding blackbody radiation, which is negligible in the visible spectrum at room temperature). One of the photos in TFA is of the monitor displaying a black screen in a dark room; you can't tell it from the surroundings. The pixels can be individually switched off completely (actually, that's not strictly true: a group of a few pixels can be switched off), giving a contrast ratio of (max brightness)/0 -- hence the divide-by-zero error, as the grandparent said.

Re:Contrast actually goes down to 0 (0)

Anonymous Coward | more than 7 years ago | (#16416189)

One of the photos in TFA is of the monitor displaying a black screen in a dark room; you can't tell it from the surroundings.

I can take an entirely black picture of the room I'm in now. That doesn't mean the picture on the wall has infinite contrast. If the room isn't entirely dark (and a room with an HDR display in it isn't perfectly dark if you're not watching only renditions of /dev/zero), then the reflectiveness of the display surface limits the contrast, unless it's a black hole, as I mentioned before. Like I said. Black isn't. Claiming zero brightness for off pixels is marketing bullshit.

Re:Contrast actually goes down to 0 (2, Insightful)

SEMW (967629) | more than 7 years ago | (#16416457)

> If the room isn't entirely dark (and a room with an HDR display in it isn't perfectly dark if you're not watching only renditions of /dev/zero), then the reflectiveness of the display surface limits the contrast, unless it's a black hole, as I mentioned before

Nope. The specified contrast is the ratio of EMITTED, RADIATED light from a bright pixel to EMITTED, RADIATED light from a dark pixel. Certainly, ambient light will reduce the effective contrast in reality, but the definition of specified contrast ratio assumes no ambient or reflected light. Obviously. How could it be otherwise? The contrast ratio would be meaningless unless you specified everything from the amount of ambient light to the colour of the walls along with it.

Re:Contrast actually goes down to 0 (0)

Anonymous Coward | more than 7 years ago | (#16416639)

If you're happy with a completely useless metric, then of course infinite contrast can make sense to you. Other people live in the real world where surface reflectivity causes brighter "black" pixels than residual emission. Those people recognize "infinite contrast" as the marketing bullshit that it is.

Re:Contrast actually goes down to 0 (1)

4D6963 (933028) | more than 7 years ago | (#16419903)

Nothing in this world is truly black

If you don't let any photons out, then it's truly black. No need to invoke black holes.

Re:Contrast actually goes down to 0 (1)

Eccles (932) | more than 7 years ago | (#16416529)

Nigel Tufnel: It's like, how much more black could this be? and the answer is none. None more black.

What about SED? (1)

ScislaC (827506) | more than 7 years ago | (#16415641)

I was under the impression that SED monitors were set to be the next big thing starting in the next couple of years, and that they're also boasting very high contrast ratios with very low power consumption. And given that they're supposed to be mass-produced by comparison, hopefully the price will be significantly lower.

Re:What about SED? (1)

some_hoser (656003) | more than 7 years ago | (#16419425)

The problem is that all they're doing to get their high contrast ratios is making the darks very dark. The max brightness is pitiful, less than a standard LCD screen.

They made up the 200k figure... (3, Informative)

pla (258480) | more than 7 years ago | (#16415701)

The BrightSide DR37-R EDR display theoretically has an infinite contrast ratio. How? Because it can turn individual LED backlights off completely (see How It Works), it has a black luminance of zero. When you divide any brightness value by this zero black value, you get infinity.

It goes from 0 to 4000cd/m^2. Their comparison model, the LVM-37w1, goes from 0.55 to 550cd/m^2.

So this toy gets as close to true black as you can get - "off", thus constrained by the ambient light level. For white, they manage 4000cd/m^2, or comparable to fairly bright interior lighting.


Consider me impressed, but realistically, this only amounts to roughly an 8x brightness improvement over the best normal displays, with true black thrown in as a perk (they suspiciously don't mention the next lowest intensity, no doubt because it puts you back in the realm of a contrast ratio of only a few thousand).

Re:They made up the 200k figure... (2, Insightful)

kidgenius (704962) | more than 7 years ago | (#16415867)

You missed what they said. If you kept reading just another sentence or two, you would've understood.

Using the "next lowest intensity" as you described gets them to the 200k figure, not only a few thousand. The perfect black, "off", gets you to infinity.

Re:They made up the 200k figure... (2, Funny)

pla (258480) | more than 7 years ago | (#16415943)

If you kept reading just another sentence or two, you would've understood.

D'oh! My bad... I must have glazed over for that part, because I seriously didn't notice it. But yeah, I suppose that pretty much negates the bulk of my point.

Re:They made up the 200k figure... (1)

ketamine-bp (586203) | more than 7 years ago | (#16416139)

How good the dynamic range is depends on the monitor as well as the ambient environment you are going to use it in. The material the screen is made of also counts. If your room has anything that is light-emitting, it kind of defeats the purpose of turning pixels all the way off.

This is over a year old (1)

Sarusa (104047) | more than 7 years ago | (#16415717)

I know Slashdot always runs behind Digg by a few days or even a week or two, but this is ridiculous.

Does it create projection-type movie images? (1)

Etcetera (14711) | more than 7 years ago | (#16415759)

We've all seen the scenes in movies (WarGames, Sneakers, 2001, etc...) where someone is looking at a monitor and we see the reflection of the image on the screen projected out onto their faces.

Question: is the image showing up like that purely a function of the brightness of what the person is looking at? IOW, would an HDR monitor have the effect of "projecting" the image out as if one were staring into an overhead?

Someone mentioned above that pictures/video of stained glass windows were often used as demos of HDR tech. Did it have this sort of effect in the demo room?

Re:Does it create projection-type movie images? (1)

dottyslashdottydot (1008859) | more than 7 years ago | (#16416009)

To project an image on someone's face, you need to focus it somehow. Think of a movie projector, or the overhead you mentioned. Since there are no lenses involved in this screen, it would be impossible to focus the image from the monitor onto someone's face, or any surface for that matter. All you'd get with this monitor is a diffuse glow, just brighter. This is a silly 'effect' in movies: to create it, they would have to use some sort of projector, and the actors would have blinding lights in their eyes... not so good for trying to read the screen!

Re:Does it create projection-type movie images? (0)

Anonymous Coward | more than 7 years ago | (#16416011)

You get bloom around the bright parts of the image, but not a projected image, per se. You need polarizing and focusing to make a projector, not just bright light. It does light up the room, but not in a focused fashion.

Re:Does it create projection-type movie images? (2, Insightful)

SEMW (967629) | more than 7 years ago | (#16416089)

No. Think about it: unless you're really pressing your nose right up to the screen, for a monitor to display a reflection of the image on the face of whoever's looking at it, it would have to radiate at a single angle (probably perpendicular) only. You wouldn't be able to see the whole screen, only a few pixels per eye at any one time. Ever stood in front of a projector screen and looked at the projector? Like that. It would be utterly useless as a monitor.

N.B. if you have something like the left side of the screen one colour and the right side a different one, you may well be able to see that by looking at your face, but that's more due to the fact that your face isn't flat; the left side slopes backwards from centre to edge, and vice versa for the right side. You certainly wouldn't be able to see detail.

Too much (1)

Dougthebug (625695) | more than 7 years ago | (#16415897)

I'm all for making monitors with better contrast, but the BrightSide solution is a little silly.

4000 cd/m^2 is their model's peak luminance. The nice thing about a standard 300 cd/m^2 monitor is that I can stare at a picture of the sun for as long as I want without blinding myself. I'm not sure I would want to do that with one of these... not because it's enough to blind you or anything, but it could cause your pupils to constrict, so when you turn it off everything would seem really dark.

What about the video card and xmission path? (0)

Anonymous Coward | more than 7 years ago | (#16416359)

Nice, but I wonder why no one has commented on the fact that the video card is still outputting, at best, plain old integer 32-bit RGBA. What would be nice is a video card that outputs FP32 or better, combined with an HDR monitor that accepts FP32 or better input. Right now the technology is "reconstructing" the dynamic range... but this is a lossy process, and "true" HDR with a full HDR path from rendering to transmission to display will look orders of magnitude better, I'd wager.

probably not worth it (1)

oohshiny (998054) | more than 7 years ago | (#16416503)

These kinds of monitors are probably not worth it. For the purposes of mammalian vision, high dynamic range is a nuisance that needs to be gotten rid of, and that's exactly what the human eye is doing. You still notice that a high dynamic range is present, but you don't really perceive it.

A little more dynamic range than what your average LCD monitor has would be nice, but aiming for reproducing anything resembling the full dynamic range of natural scenes is a waste of time and money.

Re:probably not worth it (1)

dfghjk (711126) | more than 7 years ago | (#16417459)

Exactly! The purpose of HDR photography is to capture all the dynamic range of the scene, not to display it in all its eye-blasting glory. Part of the HDR technique is to selectively compress the dynamic range to achieve interesting images.

And why exactly is this better than.. (1)

Brane2 (608748) | more than 7 years ago | (#16416509)

...having a sandwich of two identical LCD panels, glued one on top of the other and driven with essentially the same signal?

One would probably need a somewhat stronger backlight, and maybe a special mask between the panes so that light from one pixel on one LCD could only pass through the same position on the other panel...

Sweet!!!! (3, Funny)

MoxFulder (159829) | more than 7 years ago | (#16416559)

I can't wait till this goes mainstream. Then I'll be able to watch a video of a solar eclipse and actually get blinded by the image. Coool.

Local contrast? (0)

Anonymous Coward | more than 7 years ago | (#16416819)

If you RTFA, they're using a 45x31 array of independent backlights. I wonder what contrast ratio you'd get for a checkerboard with each square only a few px on a side - since they claim they're using the same liquid crystals as normal displays, I guess the local contrast is still limited to two orders of magnitude like everything else. Now, this probably isn't as much of a problem, because two orders of magnitude is a lot for two points right next to each other.... I wonder if there are any weird effects when panning images with relatively high local contrast as different parts of the image move over the boundaries between the backlights.
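
A rough Python model of the dual-modulation scheme under discussion (the 45x31 grid is from TFA; the blur width, the roughly 400:1 LCD contrast, and the 4000 cd/m^2 peak are assumptions for illustration). Within a single backlight zone, local contrast is limited by the LCD alone:

    import numpy as np
    from scipy.ndimage import zoom, gaussian_filter

    def effective_luminance(led_levels, lcd_transmittance, peak=4000.0):
        """led_levels: 45x31 array in 0..1; lcd_transmittance: full-res array in 1/400..1."""
        h, w = lcd_transmittance.shape
        factors = (h / led_levels.shape[0], w / led_levels.shape[1])
        backlight = zoom(led_levels, factors, order=1)     # upsample the coarse LED grid
        backlight = gaussian_filter(backlight, sigma=8)    # light spreads between zones
        return peak * backlight * lcd_transmittance        # cd/m^2 at each screen pixel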

Re:Local contrast? (0)

Anonymous Coward | more than 7 years ago | (#16416909)

I bet a checkerboard that's out of phase with the backlight array would look really odd.

omg (0)

Anonymous Coward | more than 7 years ago | (#16416845)

hdr pron, omg!

power consumption woes (1)

madmethods (729671) | more than 7 years ago | (#16417169)

Haven't seen anybody comment on this yet -- if you dig through the actual specs you'll see one reason why this technology hasn't already taken off. The power consumption of the display is 1680 watts. You basically wouldn't want to put anything else on a household circuit with it.

-G

Re:power consumption woes (1)

CreateWindowEx (630955) | more than 7 years ago | (#16417773)

The end of the article mentioned some Moore's law equivalent for LEDs that would likely reduce the power consumption and heat output to reasonable levels by the time this technology became ready for consumer applications. However, if they really meant it was doubling light output per watt every 12-18 months, at some point that would become a perpetual motion machine, so I'm not sure I really understood this point. IANAEE.

Re:power consumption woes (1)

windsurfer619 (958212) | more than 7 years ago | (#16418141)

Theoretically, LEDs can become 100% efficient at converting electricity into light. Perhaps they meant Moore's Law until the physical limit?

Polaroid XR film (4, Interesting)

dpbsmith (263124) | more than 7 years ago | (#16417211)

I can't seem to find a reference to it online... I'd appreciate one if someone has one... but circa 1960 the Polaroid company developed a film for recording nuclear tests, which was similar to three-layer color film except that the three layers, instead of being sensitized to different colors, were given emulsions with widely different sensitivities. The fastest emulsion was similar to Kodak Royal-X Pan, ISO 1600, and the slowest was similar to Kodak Microfile and, if I recall correctly, had an ISO speed of something like 0.1.

The result was to extend the useful dynamic range of the film by a factor of 10000 or so--more than a dozen additional f-stops of latitude, or extra Ansel zones, if you like.

The film was processed in regular Kodacolor chemistry (IIRC), each layer coming out a different color. In color, the result was a "false color" image displaying a huge dynamic range of light intensity; or, it could be printed as black-and-white using different filters to select different intensity ranges.

In effect, the photographer was automatically bracketing every shot by a dozen F-stops, in a single shutter click.

It was an incredibly neat hack. I wonder whatever became of it?

isn't that actually LOW dynamic range? (1)

stigmerger (989244) | more than 7 years ago | (#16417655)

> We are seeing more and more about high dynamic range (HDR) images,
> where the photographer brackets the exposures and then combines
> the images to increase the dynamic range of the photo.

So instead of an image that goes from black to white, you have an image that goes from dark grey to light grey. Now you can see all the stuff that would've been hard to make out in a single photo. I think what has happened is that you've *decreased* the dynamic range in the photo (or, more properly speaking, you've used the fixed range better).

If a new camera provides pixels in the range [0,10] instead of the range [0,1], you can still have the same problem. It's where you map the values that come through the lens that makes the photo legible or not.

Or maybe I'm missing something.
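
That is essentially what a global tone-mapping operator does. A minimal Python sketch of one common curve (Reinhard's L/(1+L)), which squeezes an unbounded scene range into the display's 0..1 while spending most of the output codes on the mid-tones:

    import numpy as np

    def tonemap(radiance, key=0.18):
        """radiance: HDR luminance array (any positive span) -> display values in 0..1."""
        log_avg = np.exp(np.mean(np.log(radiance + 1e-6)))   # scene's log-average brightness
        scaled = key * radiance / log_avg                    # pin the average near mid-grey
        return scaled / (1.0 + scaled)                       # soft-clip the highlights

    print(tonemap(np.array([0.01, 1.0, 100.0, 10000.0])))    # million-to-one scene -> 0..1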

Black level (0)

Anonymous Coward | more than 7 years ago | (#16417933)

I'm more excited about the really low black level than the super-high brightness. Those pictures show just how shitty the black level of LCD displays is. With this new technology LCDs will finally be able to compete with CRT black levels. Brightside should produce models with normal peak luminance (conveniently avoiding the cooling problems) for home applications. I just wonder how they avoid significant halos with that low-resolution LED panel lighting up the high-resolution LCD panel.

shameless plug (1)

Anubis333 (103791) | more than 7 years ago | (#16420499)

Crysis [crysis-thegame.com] was shown on a BrightSide monitor at SIGGRAPH. These monitors hook up to commercial off-the-shelf hardware. The price should drop dramatically in the future when better production processes are in place, much like any new product. ;)

(as long as the software supports HDR/their DLL)