
Are We At the Limit of Screen Resolution Improvements?

timothy posted about a year ago | from the eentsy-weentsy dept.

Displays 414

itwbennett writes "A pair of decisions by Motorola and Ubuntu to settle for 'good enough' when it comes to screen resolution for the Ubuntu Edge and the Moto X raises the question: Have we reached the limit of resolution improvements that people with average vision can actually notice?" Phone vs. laptop vs. big wall-mounted monitor seems an important distinction; the 10-foot view really is different.


I have a hard time (4, Funny)

vikingpower (768921) | about a year ago | (#44447397)

reading TFA...

Re:I have a hard time (0)

ColdWetDog (752185) | about a year ago | (#44447581)

You click on the link.

You're welcome.

Re:I have a hard time (1)

AndyAndyAndyAndy (967043) | about a year ago | (#44447895)

You probably just need more pixel density.

Re:I have a hard time (0)

Anonymous Coward | about a year ago | (#44447973)

Public school really has gone downhill...

Re:I have a hard time (1)

MouseTheLuckyDog (2752443) | about a year ago | (#44448071)

Is it in cursive?

already passing it (5, Insightful)

tverbeek (457094) | about a year ago | (#44447401)

We're already past the level where I can benefit from higher resolution on phones. I'm over 40 and already have reading glasses, but I'd need to get special phone-only glasses to see any more detail or smaller type.

Re:already passing it (2)

Anonymous Coward | about a year ago | (#44447605)

I feel your pain. I can no longer do any glasses-free browsing on my smartphone without a lot of squinting and resulting headache. I fear that increasing resolution will just tempt younger developers (who have yet to encounter the joys of presbyopia) to design things in even smaller fonts.

Re:already passing it (1)

Nadaka (224565) | about a year ago | (#44447833)

you can use a 2 finger "stretch" gesture to zoom in.

Re:already passing it (1)

Anonymous Coward | about a year ago | (#44447915)

you can use a 2 finger "stretch" gesture to zoom in.

On websites, sure. But rarely in native applications.

Re:already passing it (1)

graphius (907855) | about a year ago | (#44447945)

you can use a 2 finger "stretch" gesture to zoom in.

Only works for some things and some sites...

Re:already passing it (1)

Anonymous Coward | about a year ago | (#44447899)

We aren't past the limit of screen resolution improvements, but we definitely are seeing diminishing returns.

Computer monitors and TVs have not caught up to phones in terms of pixel density, and antialiasing is still needed even on the highest-density screens.

However, for most applications the limitation now is less screen resolution and more content size (streaming 1080p video is impractical in a large portion of the US, so even a 1080p monitor is wasted for applications like watching Netflix streaming), and, as the parent pointed out, many people can't even perceive the difference between current high-density and slightly lower-density screens due to imperfect eyesight.

The value of screen densities higher than 300ppi is debatable, as is the value of going even that high for large format multi-viewer displays (like TVs).

Re:already passing it (2)

malignant_minded (884324) | about a year ago | (#44448005)

You shouldn't have to. What we should be/are concentrating on is better reflow and text-to-speech. Higher resolution should be a benefit as text becomes less blocky, making shape recognition easier. Just because resolutions are higher doesn't mean you should have smaller text if you don't want it. With so many different-sized devices, you should be able to load and manipulate content on demand. So if you don't want images because of connection or space constraints, that's your choice. Images should also be vector-based whenever possible. Currently I have 20/10 in one eye and 20/15 in the other at 33; I am hoping to hold onto this as long as possible, but it will eventually decrease. That is life. Content should be able to handle all cases as the person desires. If I can only see 1-inch icons, that should be my choice, and my phone should have desktops with 4 icons.

Re:already passing it (0)

Anonymous Coward | about a year ago | (#44448165)

what difference does a higher resolution truly make? more time with your family? less eye strain? solution to world hunger..? or just marketing gimmicks to push upgrades and stock prices?

Not until Anti-Aliasing isn't a thing (5, Insightful)

earlzdotnet (2788729) | about a year ago | (#44447405)

We've reached this point with some devices, but a screen isn't a high enough resolution until Anti-Aliasing isn't needed in any form.

Re:Not until Anti-Aliasing isn't a thing (1)

Tynin (634655) | about a year ago | (#44447513)

We've reached this point with some devices, but a screen isn't a high enough resolution until Anti-Aliasing isn't needed in any form.

Came here to say the same thing. I'm looking forward to the new 4K monitors finally starting to come out, which may spell the end for AA.

Re:Not until Anti-Aliasing isn't a thing (3, Insightful)

Seumas (6865) | about a year ago | (#44447703)

I'm really excited for 4k monitors, but it's going to be a while before really high-quality ones that are great for work (color accuracy and reproduction, no weird problems exhausting your eyes like a lot of gaming-specific monitors) as well as great for gaming (responsive, negligible lag/input-delay/ghosting) are available. Even longer before they are around $3,000 (which is about the price at which I'd pull the trigger on at least one of them).

Hopefully, by the time those exist, GPUs will exist that can fully utilize a 4k display on a single GPU.

As for home theaters? I don't think we'll see much 4k content for a very long time. I bought my first 50" 1080p HDTV in 2001, but it seems like most of the population is only now finally moving to HDTV in 2013 (and most of those are still the people who say things like "I don't know why we need HDTV -- standard television is as good as it needs to get and I can't tell any different!"). There will be a huge chicken-and-egg problem for the next decade. Plus, since most of the content will start to be delivered over the network, there will have to be significant improvements in speeds and data caps in this country. We can't even count on true 1080p digital distribution, yet.

Consoles will not make use of 4k this generation, so that is out of the question for the next decade, too. Yeah, the PS4 and XBOX ONE both support 4k, but I doubt that's going to be true 4k. It'll be upscaled. I just don't see how these dinky little consoles with only a few gigs of memory available will be able to push enough bits around for native 4k.
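To put rough numbers on the delivery problem raised above, here is a back-of-envelope sketch in Python (the compressed-delivery range in the final comment is a typical published ballpark, not a measurement):

def raw_mbps(width, height, bits_per_pixel=24, fps=30):
    # Uncompressed video bitrate in megabits per second.
    return width * height * bits_per_pixel * fps / 1e6

print(f"1080p30 raw: {raw_mbps(1920, 1080):7.0f} Mbps")
print(f"4K30 raw:    {raw_mbps(3840, 2160):7.0f} Mbps")
# Streamed delivery typically runs around 5-25 Mbps, i.e. hundreds of
# times smaller, so codecs and bandwidth gate 4K more than panels do.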

Re:Not until Anti-Aliasing isn't a thing (2)

Shadow of Eternity (795165) | about a year ago | (#44448103)

You're not going to get the response times you want until we go back to electron/phosphor tech instead of physically moving pixels. I get a new Trinitron off eBay periodically, because even the fastest "gaming" screens these days are still so slow compared to a CRT that I can see the blur just from moving around in-game, like a smeared oil painting.

Re:Not until Anti-Aliasing isn't a thing (1)

gl4ss (559668) | about a year ago | (#44447521)

yeah.

the mentioned devices do it for parts sourcing and money reasons - and because they don't want to go higher density than what's available in the default configs of the OS they're shipping with (Android; yes, the Edge ships with Android... or might ship. but they do state that it will ship with Android and then later with Ubuntu Touch).

that's the usual line anyways: what's cheap enough is good enough - for now. and that "for now" part is what companies like to skip in their shitty materials.

and it's really not just anti-aliasing, but getting the resolution high enough to fool the eyes that it's not a screen at all but a window you're looking through.

Re:Not until Anti-Aliasing isn't a thing (1, Informative)

malzfreund (1729864) | about a year ago | (#44447627)

You'll always need anti-aliasing. Even if we had 1000 dpi monitors.

Re:Not until Anti-Aliasing isn't a thing (1)

malzfreund (1729864) | about a year ago | (#44447667)

can't seem to edit my previous post. antialiasing has nothing to do with resolution.

Re:Not until Anti-Aliasing isn't a thing (4, Informative)

gl4ss (559668) | about a year ago | (#44447887)

can't seem to edit my previous post. antialiasing has nothing to do with resolution.

antialiasing and font edge smoothing as it is understood when people speak of antialiasing has pretty much everything to do with resolution.

if you can't see the individual pixels, and need say a group of 10x10 pixels to see a point on the screen, it becomes meaningless to do any subpixel effects of any kind on those 100 pixels that make up the smallest unit you can actually see.

and slashdot doesn't have an edit functionality btw.

Re:Not until Anti-Aliasing isn't a thing (0)

Anonymous Coward | about a year ago | (#44448053)

antialiasing and font edge smoothing as it is understood when people speak of antialiasing has pretty much everything to do with resolution. if you can't see the individual pixels, and need say a group of 10x10 pixels to see a point on the screen, it becomes meaningless to do any subpixel effects of any kind on those 100 pixels that make up the smallest unit you can actually see. and slashdot doesn't have an edit functionality btw.

I have trouble reading your writing, but there's no resolution where aliasing isn't an issue. The statement

We've reached this point with some devices, but a screen isn't a high enough resolution until Anti-Aliasing isn't needed in any form.

is false. Higher resolution will eliminate some artifacts which people call aliasing, but not all of them.

Re:Not until Anti-Aliasing isn't a thing (0)

Anonymous Coward | about a year ago | (#44447841)

It's funny you mention this. Really, one of the reasons to have a high-res screen is so AA actually works properly. (The pixels are small enough that your vision will interpret the interpolation as smooth curves.) On low-res screens you have to resort to subpixel rendering. And it is an abomination that makes blurry, poor-contrast text with color artifacts.

ClearType (the Microsoft brand of subpixel AA) is so awful that it gives me headaches. It makes text noticeably worse if you have good vision. (If turning ClearType on helps, you need glasses.) I have to resort to registry hacks to turn off AA and replace the system font with Arial in order for a computer to be usable.

FYI, Apple uses greyscale AA on "retina" screens. (Yes, Retina is just their brand name for displays above a certain PPI.)

I suppose you are right, though, drive the pixels small enough and there will be no need for any AA. A dream for another decade.

Re:Not until Anti-Aliasing isn't a thing (1)

Anonymous Coward | about a year ago | (#44447877)

Antialiasing is faster, uses less memory and yields better results than realistic increases in screen resolution. On a screen with pixels too small to see individually, antialiasing isn't needed to remove jagged edges (because they're already too smooth to see), but it ensures correct tone / density. To create the same visual quality as antialiasing with 8 bit depth, you would need 256 times more pixels, at a prohibitive memory and CPU cost. There is absolutely no point in making displays with a resolution much higher than the human eye can resolve from the smallest distance that the display is going to be regularly viewed from.
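A minimal Python sketch of the tone point above: antialiasing stores each pixel's edge coverage as one 8-bit value, while brute-force supersampling has to take many subsamples per pixel to approximate the same number; 16x16 = 256 subsamples per pixel matches the 256x figure. The edge position here is an arbitrary illustrative value.

def coverage_supersampled(edge_x, n):
    # Fraction of a unit pixel left of a vertical edge at x = edge_x,
    # estimated from an n x n grid of subsamples (only x matters here).
    hits = sum(1 for i in range(n) for j in range(n)
               if (i + 0.5) / n < edge_x)
    return hits / (n * n)

edge_x = 0.37  # exact fraction of the pixel covered by the shape
for n in (2, 4, 16):
    print(f"{n:2}x{n:<2} subsamples: {coverage_supersampled(edge_x, n):.4f}"
          f" (exact {edge_x})")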

Re:Not until Anti-Aliasing isn't a thing (0)

Anonymous Coward | about a year ago | (#44447901)

That's an absurd statement to make, since the 'jaggies' that anti-aliasing deals with can be many pixels wide. You'd need something like 10x the pixel density the eye can perceive to completely get rid of the need for anti-aliasing.
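For intuition on multi-pixel jaggies, a toy sketch in Python (the slopes are arbitrary examples): a naively rasterized line of slope m produces horizontal stair steps roughly 1/m pixels long, so a nearly horizontal edge aliases across many pixels no matter how dense the screen.

# Stair-step ("jaggie") length of a naively rasterized line: a line of
# slope m runs about 1/m columns before it steps up one row.
for slope in (1.0, 0.5, 0.1, 0.02):
    step_len = round(1 / slope)
    print(f"slope {slope:5.2f} -> stair steps about {step_len:3d} px long")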

No (4, Interesting)

wangmaster (760932) | about a year ago | (#44447409)

Come back and talk to me again when the average laptop and desktop screen hits high density PPI :)

Re:No (0)

Anonymous Coward | about a year ago | (#44447441)

Why does the laptop need to be average? Your argument is akin to "have we reached the moon? No! Come back to me when the average man has stood on the moon".

There exist laptops (notably the Retina MacBook Pro and the Chromebook Pixel) which have resolutions high enough.

Re:No (3, Insightful)

wangmaster (760932) | about a year ago | (#44447613)

The average smartphone has a 720p screen with a pixel density well above 200 now. In the context of this discussion, why shouldn't an average panel that is generally within 12-24 inches of your face (desktop or laptop) have the same requirements?

Sure, there exist laptops today that do. But those laptops don't provide you with a lot of choice (both are walled gardens, yeah yeah yeah, I know you can install other things on them etc etc etc, but that's not the point here).

That said, I know this is coming. We're seeing more and more high resolution ultrabooks/laptops. So when I say come back and talk to me again, it's very likely by the end of the year :).

Re:No (1)

NJRoadfan (1254248) | about a year ago | (#44447621)

What, isn't 1366x768 good enough for everybody? Ugh.

Re:No (4, Insightful)

Andrio (2580551) | about a year ago | (#44447729)

Phones? Yes (There's not much benefit going past 1280 * 800 )

Tablets? Getting there (Nexus 7 at 1080p, Nexus 10 at 2560 * 1600)

Monitors? NO! Let me put it like this. Most monitors sit somewhere between the previously mentioned phone and tablet resolutions, despite being 2-5 times the size.

Re:No (0)

Nerdfest (867930) | about a year ago | (#44447881)

Chromebook Pixel is getting there.

Re:No (0)

Anonymous Coward | about a year ago | (#44447905)

640dpi ought to be enough for anybody.

no (2, Insightful)

iggymanz (596061) | about a year ago | (#44447417)

I have rather poor vision, having to use different lenses for reading, computer, and distance... and I can still see the difference between 1080i and 4K monitors; a person with 20/20 should be able to benefit from even higher resolution (and I suspect even higher contrast ratios).

We know from testing that a significant part of the female population would notice a higher-bit color space too.

Re:no (0)

Anonymous Coward | about a year ago | (#44447663)

This is specifically discussed in the article...thanks for reminding me why one should never read the posts here

Re:no (1)

iggymanz (596061) | about a year ago | (#44447889)

no, it is not. the article only addresses phone and laptop distances and resolutions and discusses the debate there.

maybe you've just reminded us why ACs should stay below the viewing threshold

Re:no (1)

wangmaster (760932) | about a year ago | (#44447959)

In AC's defense:
Phone vs. laptop vs. big wall-mounted monitor seems an important distinction; the 10-foot view really is different.
That was in the slashdot article itself (not the linked article).

I chose to make my post because I thought it needed to be explicitly answered :)

Re:no (1)

azav (469988) | about a year ago | (#44447757)

1080i?

There are no 1080i monitors. The i stands for interlaced, which means each frame carries only every other line of the image; the skipped lines are filled in by the next field.

The monitors are p, which stands for progressive, as in progressive scan from top to bottom. Today this distinction isn't really meaningful on non-CRT displays, since it was CRTs that drew the image with scanlines.

FYI, a 1080p display should be 1920 x 1080 square pixels.

Re:no (1)

iggymanz (596061) | about a year ago | (#44447809)

yes, I mistyped

as an aside, my "p" monitor can go into "i" mode though

Re:no (1)

sjames (1099) | about a year ago | (#44447867)

But many people with less than perfect but better than dismal vision will tend to use their phones with uncorrected vision so they don't have to get out their reading glasses on the move.

Re:no (1)

iggymanz (596061) | about a year ago | (#44447965)

well, some of us old farts use the reading portion of our reading/distance bifocals, which we change to monitor glasses when we get to work. so at work the phone is blurry. soon we'll get heads-up displays in lightweight glasses that actually zoom and focus, but don't look like dork-ware (Google Glass, etc.).

Digital Movie Projection... and "Average People" (3, Interesting)

jellomizer (103300) | about a year ago | (#44447419)

If you build for the average person, you are doomed to fail. Because 1/2 of the population is above average. Also there are the finer details that a person doesn't fully recognize. The average person cannot tell the difference between 720p and 1080p. However, if you have them side by side (with colors/contrast/brightness matching) they will see a difference.

Re:Digital Movie Projection... and "Average People (4, Informative)

Anonymous Coward | about a year ago | (#44447715)

Because 1/2 of the population is above average.

Half the population is above (or below) the median.

Re:Digital Movie Projection... and "Average People (1)

Anonymous Coward | about a year ago | (#44447733)

Because 1/2 of the population is above average.

AVERAGES DO NOT WORK THAT WAY! GOOD NIGHT!

Re:Digital Movie Projection... and "Average People (0)

Seumas (6865) | about a year ago | (#44447735)

I'm no mathematologist, but I think half of the population is above the median; not the average. :P

Re:Digital Movie Projection... and "Average People (0)

steelfood (895457) | about a year ago | (#44448023)

No, with a normal distribution, half the population is going to be above and the other half below, no matter what average you take. That's assuming a normal distribution. I don't know what distribution visual acuity of the industrialized world's population actually is. I'd hazard a guess that it's skewed to the lower end with the higher end quickly diminishing.

Now, instead of the mean or even the median, the average that best fits in this case is probably the mode.

Re:Digital Movie Projection... and "Average People (1)

DFurno2003 (739807) | about a year ago | (#44447835)

"The average person cannot tell the difference between 720p and 1080p" Where's you pull this one out of?

Re:Digital Movie Projection... and "Average People (0)

Anonymous Coward | about a year ago | (#44447855)

Because 1/2 of the population is above average..

Averages don't work that way.

Re:Digital Movie Projection... and "Average People (0, Informative)

Anonymous Coward | about a year ago | (#44447865)

Because 1/2 of the population is above average.

Basic stats fail.

Half the population is above the *median*, by definition.
Half the population is only above the average/mean for a given characteristic if that characteristic has an exactly symmetric distribution, such as the normal distribution.

Example:
IQs 89, 90, 91, 130
Average is 100, 3 below average and 1 above average.
Median is 90.5 (average of the two 'middle values'). 2 below median, 2 above.

Re:Digital Movie Projection... and "Average People (4, Insightful)

bill_mcgonigle (4333) | about a year ago | (#44448095)

Basic stats fail.

I can't believe there are five posts on here that declare 'average' to be 'mean' and then go on to criticize the GP's lack of statistical knowledge.

I think the very first thing on the very first day of my first statistics class was a discussion of mean, median, and mode, and how all three are referred to as 'average' in common parlance, depending on context.

Re:Digital Movie Projection... and "Average People (1)

medv4380 (1604309) | about a year ago | (#44447873)

You're assuming a normal distribution. If it is a logarithmic distribution, which I'd put more money on, then you're wrong. Only a small number of people can see better than average, 20/20. Many see far worse, and some, like myself for a time, have vision like 20/15. It doesn't last, and "half" the population isn't anywhere near it.

Re:Digital Movie Projection... and "Average People (1)

bill_mcgonigle (4333) | about a year ago | (#44448139)

have vision like 20/15. It doesn't last, and "half" the population isn't any where near it.

hrm, I've been contact-lens corrected to 20/15 for the past 28 years.

Re:Digital Movie Projection... and "Average People (1)

Cid Highwind (9258) | about a year ago | (#44448145)

Wait, we're talking about digital movie projection, as in machines that will be used to show "Transformers 7: Incomprehensible Jump-Cut Explosiongasm!", and you're worried about it being a commercial failure because too many people are above average?

...

(Oh god, when did I get so old?)

NO (-1)

Anonymous Coward | about a year ago | (#44447437)

FUCK NO

Yes (0)

Anonymous Coward | about a year ago | (#44447443)

All of the PPI chasers trying to beat Apple on spec numbers are irrelevant. I guess people on forums need numbers to argue superiority with, but... personally, once I could no longer see pixels under normal use, I didn't care.

900 dpi (1)

jbolden (176878) | about a year ago | (#44447445)

I remember someone did a test of this when Steve Jobs came out with the "retina" claim. For a young child holding a phone at arm's distance, 900 ppi was really "retina" resolution. I think we are likely one doubling short of retina resolution on our higher-resolution devices. 20 megapixels for a laptop and 5 megapixels for a phone is probably genuinely the limit.

Right now our hardware isn't fast enough to handle that much resolution, so it is still a balancing act.
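As a sketch of that arithmetic in Python (the acuity figure and screen sizes are assumptions chosen for illustration, not the original test's parameters):

import math

def retina_ppi(distance_in, acuity_arcmin):
    # PPI at which one pixel subtends the given visual angle at the
    # given viewing distance.
    pixel_in = distance_in * math.tan(math.radians(acuity_arcmin / 60))
    return 1 / pixel_in

for name, w, h, dist in (("phone, 4.4x2.5 in at 12 in", 4.4, 2.5, 12),
                         ("laptop, 11.3x7.1 in at 20 in", 11.3, 7.1, 20)):
    ppi = retina_ppi(dist, 0.4)   # 0.4 arcmin: sharp young eyes
    mp = (w * ppi) * (h * ppi) / 1e6
    print(f"{name}: ~{ppi:.0f} ppi -> ~{mp:.0f} MP")

With those assumptions the phone lands around 6 megapixels and the laptop around 15, the same ballpark as the figures above.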

Re:900 dpi (4, Interesting)

Trepidity (597) | about a year ago | (#44447583)

It's a bit complex, because the retina doesn't really have a static resolution: it integrates information from constant movements, responds nonlinearly to different patterns of photon impacts, and has different sensitivities across different parts. You could put a ballpark number on it, but it's difficult to really sort out what the "resolution of the retina" is.

To quote a paper:

Many would say that new display technologies, 9 megapixel panels and projectors for example, are coming ever closer to “matching the resolution of the human eye”, but how does one measure this, and in what areas are current displays and rendering techniques still lacking? [...] The resolution perceived by the eye involves both spatial and temporal derivatives of the scene; even if the image is not moving, the eye is (“drifts”), but previous attempts to characterize the resolution requirements of the human eye generally have not taken this into account. Thus our photon model explicitly simulates the image effects of drifts via motion blur techniques; we believe that this effect when combined with the spatial derivatives of receptive fields is a necessary component of building a deep quantitative model of the eye’s ability to perceive resolution in display devices.

Pretty interesting stuff, from a project that tried to build a photon-accurate model of the human eye [michaelfrankdeering.com] .

yeah? (0)

Anonymous Coward | about a year ago | (#44447453)

This will all be a silly prehistoric consideration a thousand years from now, when those reading this are using atomic-resolution displays.

Hype (0)

Anonymous Coward | about a year ago | (#44447467)

Honestly, I think beyond a certain point the resolution increases are all hype. I have decent vision, and I was in Best Buy and looked at a retina iPad right next to a non-retina iPad. I literally could not tell the difference unless I stuck my nose up to the glass. It's all just a marketing gimmick.

Re:Hype (1)

AvitarX (172628) | about a year ago | (#44447645)

I can immediately tell the difference, it's quite stark IMO.

I do not know that going past the retina iPad would have payback for me.

Yes. (1)

Anonymous Coward | about a year ago | (#44447475)

This was studied back in the beginning of the HD era. Most people cannot physically discern the difference between 720p and 1080p resolution on their TVs, viewed at normal distances. At the time I think they used 42" as the average panel size; that has probably increased somewhat, but the point stands.

Now we're putting 2 megapixels inside 5 inches? There was benefit to getting close to 300 ppi (i.e. "retina" resolution); that's basically National Geographic-level print resolution. But new phones that are pushing over 450 ppi are wasting loads of battery life by pushing around way more pixels than are at all necessary.

Roughly 720p on a handheld is all that form factor will ever need, until we start engineering human eyes to work better. Tablets can go a bit higher, but anything more than the Nexus 10's resolution is a waste.

Re:Yes. (2)

bobbied (2522392) | about a year ago | (#44448067)

But you always have the "My screen resolution is better than yours" crowd that will fall for the device with the better specs in droves so you can bet device makers will be designing and building resolutions that you and I can't ever hope to see.

But one should be careful to note that the issue is pixels per inch and not overall resolution here. 720p might be overkill on a 2" screen, but it might be way too low for the latest movie theater screen. Even at the best PPI you can see, the next frontier will be refresh rates (although going much past 120 FPS is totally overkill...).

Personally I really *hate* watching Blu-ray movies in full resolution. Usually the material just looks cheesy to me: you can see the boundaries of the CGI sequences, makeup smudges on the actors, obvious shortcuts in the set construction, and all kinds of things that just are not right. It actually makes it more difficult for me to suspend disbelief long enough to enjoy the movie. Of course, being an old projectionist from years ago makes me sensitive to vestiges of bad editing, splices, reel changes and cue marks, which also distract me.

Re:Yes. (2)

bobbied (2522392) | about a year ago | (#44448073)

My point being, even if you can see it, having more resolution is not necessarily a good thing.

Definitely (0)

Anonymous Coward | about a year ago | (#44447483)

Seriously, I think we've even already passed the sweet spot, into useless (i.e. wasteful) overkill. I wish I could buy a lower-res (but otherwise the same) smartphone, and reap the difference as longer-lasting battery (and also increased performance, though I doubt I would be able to perceive that, outside of looking at tables full of benchmark times). I know it's subjective and we all have our different thresholds, but if we aren't there yet, then surely almost everyone would agree that we're at least close to being there, right?

Re:Definitely (1)

topologicalanomaly47 (1226068) | about a year ago | (#44447909)

Not to mention that they still add 16GB of (sometimes unexpandable) storage to those devices, basically forcing you to watch low-resolution/bitrate content on a high-end screen.

Depends on screen size & typical viewing distance (1)

AlexOsadzinski (221254) | about a year ago | (#44447505)

Many technologies have already caught up with human physiology limits. The best example, I think, is audio: it's relatively trivial, given current mainstream CPU capability, storage size and bandwidth, and network bandwidth, to exceed the aural capabilities of most humans. 192kHz 24-bit audio, or DSD streams, exceeds the hearing limits of most people, although there are still intangibles between that and traditional analog sources that some people can, or think that they can, hear.

Video and stills are the next frontier. Many (MANY) people can't tell the difference between 480i and 1080p video on a typical TV at typical viewing distances. Why? If they have 20/20 vision, it's a brain thing.

1080p on, say, an iPad Mini Retina (not yet announced or shipping, of course) will exceed the resolving power of most people's eyes and brains at normal viewing distances. 1080p on a 10-foot home theater screen shows pixels for some people at normal viewing distances. 4K does not. But, if you've ever seen 8K, "something" makes it pop much more than 4K, and it's very close to looking through a window at something. The illusion is shattered if the POV changes at all, e.g. a camera pan. But, for a static camera, 8K is very convincing. So, while 1080p may be "good enough", 4K is a step function upwards for large-screen TVs and home theater projector applications, and 8K may approach the limits of human vision. We won't know until someone tries 16K and we see if there's an intangible difference.

Printers (2)

Princeofcups (150855) | about a year ago | (#44447509)

Didn't laser printers show us that 300dpi is still a bit jaggy, and 600dpi is perfectly smooth at arm's length? When screen resolution is around 400dpi then we are probably done.

Re:Printers (1)

Cassini2 (956052) | about a year ago | (#44447709)

The article quotes researchers delivering numbers between about 240 dpi and 477 dpi. When 300 dpi laser printers were popular, I remember being able to spot the dots. However, I had to try. Since then 600+ dpi laser printers have taken over the market, and I can't easily spot the dots with the newer high-resolution laser printers.

As such, the observations from both the print and the display researchers are consistent. Somewhere between 200 and 400 dpi the technology becomes "good enough" for many people. Somewhere between 400 and 600 dpi, the technology becomes "good enough" for almost everyone.

Re:Printers (1)

jbolden (176878) | about a year ago | (#44447857)

My guess is you can easily see the difference between 600 dpi and 2400 dpi print, especially for a photo. Print something on your 600 dpi printer that came from a fashion magazine. Resolution is worse on screens than on paper, but no, the cutoff isn't where you think it is.

Re:Printers (0)

Anonymous Coward | about a year ago | (#44447983)

Reminds me of a guy at my office who constantly claimed he could see the dots on the laser printers in our office that were near 2000 DPI. Well, I had to do a 1 page report but my office computer was getting fixed and my wife had the home computer with her, so I just typed it up on my typewriter (an electronic daisy wheel type) at home. At the morning meeting, our boss was talking about performance of the department and the guy tried to throw me under the bus by telling my boss that I left early (when it was my boss who told me to just go home since I couldn't work there, but he didn't know that). He even says that I must have falsified the report I just handed my boss because there was no way I could have written it the day before. I informed them that I did it at home and he retorts with "no that was done by the printers here, I can tell." I just dipped my finger in my water glass and smeared one of the letters. Should have seen his face, it was almost as funny as the fact that multiple times that day and the rest of the week, he repeatedly tried smearing laser prints in the same way.

Re:Printers (1)

Anonymous Coward | about a year ago | (#44447979)

600dpi is pretty nice, but you can still easily see the difference to 1200dpi when using fonts like Computer Modern that use hairlines tapering into bends.

Re:Printers (1)

jonbryce (703250) | about a year ago | (#44448047)

Printers are 300/600 dpi in 2-bit colour, or 2-bit mono. Displays have at least 6-bit colour, and usually 8-bit.

Printers and resolution (4, Interesting)

MasterOfGoingFaster (922862) | about a year ago | (#44448157)

Didn't laser printers show us that 300dpi is still a bit jaggy, and 600dpi is perfectly smooth at arm's length? When screen resolution is around 400dpi then we are probably done.

300dpi didn't cut it for dithered images - 600dpi was close, but not quite enough. The winner was the 1200dpi laser printers.

When you have a grayscale image you want to print on a single-color device, you use dithering to create the illusion of gray shades. A 1-to-1 mapping of pixels to printer dots gives you 2 colors - black and white. Photos look horrible. Double the printer resolution so you have a 2x2 dot array for each pixel and you have 16 possible shades. Double it again for a 4x4 dot array per pixel and you have 256 possible shades. So if you want a 300 pixel-per-inch gray scale image to look good, you need a printer resolution of 1200dpi.

Now, all this changes for RGB displays, since each pixel can be from 16 to 256 shades each. But less depth per pixel might be compensated for by smaller pixels and a higher density.

I remember in the early days of computer graphics, it was believed that 24-bit color (8 bits each for the red, green and blue channels) was the pinnacle. But once 24-bit color became widely available, we discovered it wasn't enough. When edited in Photoshop, a 24-bit image would often show banding in the sky, due to rounding errors in the math involved. When Adobe added 48-bit color (16 bits per RGB channel), the rounding errors became much less visible. Today cameras capture 8, 12, 14 or 16 bits per RGB channel, and using HDR software we get 96-bit color.

My point is we have a history of thinking we know where the limit is, but when the technology arrives, we discover we need a little bit more....
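A minimal Python sketch of the dithering mechanism described above, using a 4x4 ordered (Bayer) threshold matrix; the sample gray levels are arbitrary illustrative picks:

BAYER4 = [[ 0,  8,  2, 10],
          [12,  4, 14,  6],
          [ 3, 11,  1,  9],
          [15,  7, 13,  5]]

def dither_cell(gray):
    # Threshold one 4x4 cell against the Bayer matrix; an n x n binary
    # cell can render n*n + 1 distinct tones (here 17: 0..16 dots on),
    # which is why printer dpi must far exceed the image's ppi.
    return [["#" if gray > (BAYER4[y][x] + 0.5) / 16 else "."
             for x in range(4)] for y in range(4)]

for g in (0.2, 0.5, 0.8):
    print(f"gray={g}:")
    for row in dither_cell(g):
        print("  " + "".join(row))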

Re:Printers (1)

bobbied (2522392) | about a year ago | (#44448175)

Yea, but they will just start on refresh rates then.. Followed by 3D?

It is not close to the end.. ;)

Fontguy (1)

Anonymous Coward | about a year ago | (#44447517)

Until print serif fonts can be read without getting headaches: No.

We've fixed resolution... (2)

maroberts (15852) | about a year ago | (#44447535)

how about sorting out readability in bright sunlight and battery life (without losing the gains in the other factors)?

Sure (0)

lesincompetent (2836253) | about a year ago | (#44447537)

And 640k is more memory than anyone will ever need on a computer.

Re:Sure (1)

Jason Levine (196982) | about a year ago | (#44447885)

That became a laughable statement because more memory (as well as faster processors and other advances) allowed for computing applications that we couldn't foresee at the time, but the limitation on displays isn't "what are we using it for" but "what can the human eye see." Perhaps we'll wind up implanting devices in our eyes to increase our eyes' resolution limit (and somehow get around the fact that our brains might not be able to deal with ultra-HD reality), but short of that there's a hard limit on what the average person can see. Put a 600dpi tablet screen and a 1200dpi tablet screen in front of 100 people and 99 won't be able to see the difference.

Holograms (1)

Arthur B. (806360) | about a year ago | (#44447567)

If you want to make high resolution true holograms, you'll need to square the pixel density (intuitively, each pixel has to contain an entire high resolution image which would correspond to the hologram as seen through a pinhole placed on this pixel).

Bring on the 1M PPI displays.
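As a back-of-envelope reading of "square the pixel density" (Python; the base densities are hypothetical):

# If each hologram pixel must itself encode a full pinhole view, the
# linear sample density roughly squares relative to a 2D display.
for base_ppi in (300, 1000):
    print(f"{base_ppi} ppi 2D display -> ~{base_ppi ** 2:,} ppi light field")

A 1000 ppi base display squared is the 1M PPI figure above.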

Re:Holograms (1)

iggymanz (596061) | about a year ago | (#44447637)

that's overengineering. you only need to provide a position-dependent view for each eyeball in the room. so number of viewers x 2 x 2D pixel resolution.

Re:Holograms (1)

Arthur B. (806360) | about a year ago | (#44447673)

I agree, it gets you 98% there, but an eyeball isn't a pinhole either.

Re:Holograms (2)

iggymanz (596061) | about a year ago | (#44447793)

the rods and cones of the eye are on a surface; we only need concern ourselves with paths that terminate on that surface, and they can originate from a surface

Re:Holograms (1)

bobbied (2522392) | about a year ago | (#44448161)

It's pretty darn close. Although, I think your point should be that this assumes there is only ONE observation point and ONE observer. Often this is not true, because there may be multiple people. Also, a standard 3D representation using 2 x 2D images assumes a fixed distance between the eyes, while the actual distance varies between people and with the orientation of their head. Plenty of room for innovation here.

seems like... (3, Informative)

buddyglass (925859) | about a year ago | (#44447577)

It's a matter of PPI and typical viewing distance. Phones are often held about a foot from your face. Computer monitors are usually two or three feet away from your face. TVs are significantly further away. Greater distance = eye is more tolerant of lower PPI. That's why the iPhone 5 is ~326 PPI, a Macbook Pro with Retina is ~220 PPI, an Apple 27" Thunderbolt Display is ~110 PPI and a 65" 1080p TV is ~35 PPI.
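Those figures track a simple rule of thumb: the PPI at which one pixel subtends one arcminute (a common 20/20 threshold) is about 3438 divided by the viewing distance in inches. A Python sketch (the viewing distances are the rough figures from the comment above):

import math

ARCMIN = math.radians(1 / 60)
for device, dist_in, actual in (("iPhone 5", 12, 326),
                                ("Retina MacBook Pro", 20, 220),
                                ("27in Thunderbolt Display", 28, 110),
                                ("65in 1080p TV", 96, 35)):
    # PPI at which one pixel spans one arcminute at this distance.
    needed = 1 / (dist_in * math.tan(ARCMIN))
    print(f"{device:24} at {dist_in:3} in: need ~{needed:3.0f} ppi,"
          f" panel has {actual}")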

100Hz screens (1)

Reliable Windmill (2932227) | about a year ago | (#44447585)

What I'm really waiting for is 100Hz screens without ghosting, and perhaps a yellow pixel element as well for higher color definition. The "60Hz" screens we have today, with their slow pixel response, are more like 16Hz screens.

Anyone who has seen 50 or 60Hz progressive TV, often used in newscasts and soap operas for some reason, knows just how alive and vivid everything looks compared to the 24-25 or 30 FPS usually seen on TV, on the Internet, and in movies.

Re:100Hz screens (0)

Anonymous Coward | about a year ago | (#44448027)

What I'm really waiting for is 100Hz screens without ghosting, and perhaps a yellow pixel element as well for higher color definition.

Yellow does not make a whole lot of sense for additive color composition. And since the total area is limited, any extension of the gamut will cost contrast and saturation.

Smallest pixel (2)

jones_supa (887896) | about a year ago | (#44447631)

What is the size of the smallest pixel that can currently be made using LCD technology?

Re:Smallest pixel (0)

Anonymous Coward | about a year ago | (#44447821)

In production it's around 39 microns.

What I'd rather have on my phone (1)

DigitAl56K (805623) | about a year ago | (#44447683)

For my phone, screen resolution is good enough(tm). Screen power consumption is in drastic need of improvement. It's consistently the biggest drain on the battery.

Almost there... (1)

gatzke (2977) | about a year ago | (#44447685)

If we are getting 1080p on 5" phones you hold 10" from your eyes, I want similar resolution on my 30" desktop that I sit 20" from.

Maybe my math is wrong, but 2x the distance should require 1/2 the pixel density. But 6x the size would be something around 6000x3000 on my desktop, I think. I am happy with 2560x1600, but it could use 4x the pixels, I guess.

I am happy with 52" 1080p in my den at 8' but 4k would be better...

I have been craving more pixels since I found I could make my 486/33 run some games in XGA mode, getting 1024x768 amazing pixels.
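A quick check of that math in Python (assuming a 16:10 aspect ratio for the 30" panel):

import math

# Scale the phone's pixel density to desktop viewing distance: at 2x
# the distance, half the ppi gives the same angular pixel size.
phone_ppi = math.hypot(1920, 1080) / 5   # 1080p on a 5 in diagonal
desk_ppi = phone_ppi * (10 / 20)         # 10 in phone vs 20 in desktop
w = 30 * 16 / math.hypot(16, 10)         # 30 in 16:10 panel width (in)
h = 30 * 10 / math.hypot(16, 10)
print(f"phone {phone_ppi:.0f} ppi -> desktop needs {desk_ppi:.0f} ppi")
print(f"30 in panel: ~{w * desk_ppi:.0f} x {h * desk_ppi:.0f}")

That lands around 5600x3500, consistent with the "around 6000x3000" estimate above.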

Higher Still (0)

Anonymous Coward | about a year ago | (#44447749)

As a Rift owner I can say for certain that, at least for VR applications, we will need much higher resolutions than we currently have in small form factors (and frankly I'd want this in larger form factors as well).

I frankly think we need to get to about 4K on a 5-7 inch screen before I can't visually tell the difference from real life in the VR context. So no, we don't need it on phones, but we will elsewhere.

Hasn't stopped manufacturers (3, Interesting)

HangingChad (677530) | about a year ago | (#44447791)

Have we reached the limit of resolution improvements that people with average vision can actually notice?

Hasn't really slowed the push toward 4K in video production. While it's sometimes handy to have the frame real estate in production, it takes up a crapton more space, requires more power to edit and it's mostly useless to consumers. Even theater projection systems can't resolve much over 2K.

But if the industry doesn't go to 4K, then who will buy new cameras, editing software and storage hardware? And consumers might never upgrade their "old" HDTVs. Think of the children!

Re:Hasn't stopped manufacturers (1)

bill_mcgonigle (4333) | about a year ago | (#44448057)

requires more power to edit and it's mostly useless to consumers.

who cares about consumers? Give me a 16K video sensor and then I can zoom or re-crop the video in post and still deliver a nice 4k product. It's simply a matter of the cost of hardware.

Except when I move close... (1)

canadiannomad (1745008) | about a year ago | (#44447847)

I agree "at normal viewing distances" I don't have perfect vision, but when I want to see a detail, guess what I do? I zoom in, and move closer. This is where high resolution on those devices becomes important. Not at the standard "laboratory condition" distances, but when I want to inspect something closer.
Am I abnormal in this?

Human eye (3, Informative)

the eric conspiracy (20178) | about a year ago | (#44447853)

Wikipedia says:

Angular resolution: about 4 arcminutes, or approximately 0.07°

Field of view (FOV): simultaneous visual perception in an area of about 160° × 175°

So that's about 2200 x 2400 if the screen is at the correct distance. Further away and you need less resolution. Closer and you won't see the whole image.
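The arithmetic behind that estimate, as a Python sketch using the quoted figures:

# Pixels needed to tile the quoted field of view at the quoted angular
# resolution: pixels = FOV in arcminutes / resolution in arcminutes.
ACUITY_ARCMIN = 4
for axis, fov_deg in (("one axis", 160), ("the other", 175)):
    px = fov_deg * 60 / ACUITY_ARCMIN
    print(f"{axis}: {fov_deg} deg -> {px:.0f} px")

That gives roughly 2400 x 2600, the same order as the estimate above.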

VR (1)

Guspaz (556486) | about a year ago | (#44447927)

Devices like the Oculus Rift need resolution to go way higher. I once calculated (perhaps completely incorrectly) that an 11K display was the threshold of "retina" for the Rift, although I'd imagine 8K would be close enough. This is a 5-7" display we're talking about here.
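A rough version of that calculation in Python (the per-eye FOV and the one-arcminute threshold are assumptions; the Rift's optics complicate both):

# Angular-resolution budget for a VR headset: pixels so that one pixel
# covers about one arcminute across the lens-magnified field of view.
FOV_DEG = 110               # assumed horizontal FOV per eye
px_per_eye = FOV_DEG * 60   # one arcminute per pixel = 60 px/degree
print(f"per eye: ~{px_per_eye:,} px across")
print(f"both eyes on one panel: ~{2 * px_per_eye:,} px across")

That is about 13,000 pixels across a single shared panel, in the same neighborhood as the 11K estimate.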

No (0)

Anonymous Coward | about a year ago | (#44447961)

Higher resolution makes text easier to read; as you focus in on small portions of a screen, the higher pixel density is crucial. Despite that, I would personally prefer engineers focus on making screens refresh faster (i.e. higher framerates). I want everything that reaches my eye to be a minimum of 120fps, and that includes 3D with 120fps per eye. Second to that, I want more accurate detail in shadows. Shadows tend to be washed out or blacked out, with a complete lack of fine detail.

No - Capability of the human visual system (0)

Anonymous Coward | about a year ago | (#44447963)

http://www.itcexperts.net/library/Capability%20of%20the%20human%20visual%20system.pdf

um no (1)

slashmydots (2189826) | about a year ago | (#44448021)

Everyone at my work over 40 can't see a damn thing, so their 1920x1080 monitors are set to 1280x800. That awful scaling ratio results in blurry crap which is almost as difficult to read. If the resolution were 10x higher, the blur would be far less noticeable, because each scaled pixel would be spread across many more physical pixels. So no, they should keep increasing it.

Multiple Screens? (1)

LifesABeach (234436) | about a year ago | (#44448025)

Multiple screens can turn data into information when applied.

Wut? (0)

Anonymous Coward | about a year ago | (#44448063)

A large portion of the monitors on the market today have crappier resolution than monitors on the market 15 years ago. All these stupid 1080p monitors have pixel sizes in the 0.25-0.30mm range. My Sony Trinitron CRT was 0.22 back in 1997.
