

Nano-Pixels Hold Potential For Screens Far Denser Than Today's Best

timothy posted about 7 months ago | from the enhance-enhance-enhance dept.

Displays 129

Zothecula (1870348) writes "The Retina displays featured on Apple's iPhone 4 and 5 models pack a pixel density of 326 ppi, with individual pixels measuring 78 micrometers. That might seem plenty good enough given the average human eye is unable to differentiate between the individual pixels, but scientists in the UK have now developed technology that could lead to extremely high-resolution displays that put such pixel densities to shame."
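The summary's two numbers are consistent with each other: pixel pitch and pixel density are reciprocals (25.4 mm per inch). A quick sanity check in Python:

```python
# Pixel pitch and pixel density (ppi) are reciprocals: 1 inch = 25400 micrometers.
pitch_um = 78                 # iPhone 4/5 pixel size from the summary
ppi = 25400 / pitch_um
print(round(ppi))             # 326
```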


What's the point? (-1, Redundant)

Etz Haim (856183) | about 7 months ago | (#47441589)

If the average human eye can't tell the slightest difference, what's the point of making displays that dense?

Re:What's the point? (5, Informative)

scsirob (246572) | about 7 months ago | (#47441593)

The human eye is limited to certain pixel densities at certain distances. Technology such as this can create QHD displays in Google Glass applications where the pixels are much closer to the eye. In fact, it may be possible to implant this inside the eye and have augmented reality without p*ssing off the people around you.

Re:What's the point? (4, Interesting)

fuzzyfuzzyfungus (1223518) | about 7 months ago | (#47441669)

Less exciting; but sufficiently dense pixels might also make subpixel defects less obnoxious, even if the actual resolution requirements are low enough that multiple physical pixels are driven as a single logical pixel to reduce computational costs or display link bandwidth. And more acceptable defects means fewer scrapped panels.

Re:What's the point? (0)

Anonymous Coward | about 7 months ago | (#47441935)

Smaller pixels will unnecessarily drive up cost. Devices with normally sized pixels will be cheaper.

Re:What's the point? (3, Insightful)

smitty_one_each (243267) | about 7 months ago | (#47442045)

I mean, really: Emacs looks great in character mode and 80 columns. Why all this other faffing about?

Re:What's the point? (0)

Anonymous Coward | about 7 months ago | (#47441777)

There will always be some use for higher density display technology, even if of interest to a small number of people who would be targeting things other than the human eye. There is plenty of near-field microscopy work that could be done using even pixel sizes much smaller than the wavelength of light being used. From a distance you couldn't distinguish pixels much smaller than a wavelength. But if something of interest were right on top of the light emitter, closer than a wavelength of light away, you can selectively illuminate parts of the subject and distinguish that from a distance.

Re: What's the point? (1)

jrumney (197329) | about 7 months ago | (#47442009)

I can understand the benefit of higher resolution capture capability to microscopic applications, but displays? Do you look at your display through a microscope?

Re: What's the point? (1)

EvolutionInAction (2623513) | about 7 months ago | (#47442189)

Read. He says that if you can have a light emitting grid below the object of interest you could do some neat tricks with illumination.

Of course, if you could actually get pixels much smaller than a wavelength, the big application would be true holography. You aren't drawing images at that point, you're drawing interference patterns.

Re: What's the point? (1)

Pinky's Brain (1158667) | about 7 months ago | (#47442611)

You don't even need to get much below the wavelength; half a wavelength is already enough for a 180-degree viewing angle.

That said, materials which can do this are hardly new ... OASLMs were first used for holographic displays approximately two decades ago, AFAIK.
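The half-wavelength claim can be checked with the first-order grating equation, sin(theta) = lambda / period, where a fringe period needs two pixels to be represented (Nyquist). A rough sketch, with green light assumed as the illustrative wavelength:

```python
import math

wavelength = 550e-9            # green light, meters (assumed for illustration)
pitch = wavelength / 2         # half-wavelength pixel pitch
fringe_period = 2 * pitch      # finest fringe two pixels can represent
sin_theta = wavelength / fringe_period   # grating equation, first order
theta = math.degrees(math.asin(min(sin_theta, 1.0)))
print(2 * theta)               # 180.0 -> diffraction spans the full hemisphere
```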

Re: What's the point? (0)

Anonymous Coward | about 7 months ago | (#47442199)

Depends what you mean by "microscopic applications", I reckon. Some of my colleagues are using conventional digital projector displays (i.e. the micro-mirror assembly) to deliver patterned light on very small specimens. The size of the display is irrelevant, since they reduce it optically with lenses. A higher resolution display, with smaller pixels, wouldn't be helpful for this microscopy application.

Re: What's the point? (0)

Anonymous Coward | about 7 months ago | (#47442755)

If you are going to reduce the image with lenses, then you are going to hit limits somewhere near the wavelength size.

Re: What's the point? (0)

Anonymous Coward | about 7 months ago | (#47442237)

Obviously the goal is realism, otherwise we would have stuck with 320x240. If you want true realism, you need a pixel size approaching the Planck length, if such a thing exists. Anything larger is just saying "it's good enough" and NOT a good representation of reality, regardless of the receptor density of your retina. Light is never a parallel projection directly into your eye. There are all sorts of interactions and interference that occur in the light before it reaches your eye. THAT is what the goal is, not simply ejecting photons at a density that matches the density of your retina. For most people, "good enough" is probably good enough, but something tells me that we won't stop there.

Re: What's the point? (1)

davester666 (731373) | about 7 months ago | (#47443481)

You need these high-res displays so that you can enhance video like they do on TV, where you get a grainy 640x480 video feed of a car several blocks away and you zoom in and read the license plate. It's not possible without using a high-res display, preferably one that is semi-transparent, like the ones on CSI Miami.

Re: What's the point? (1)

shadowrat (1069614) | about 7 months ago | (#47443757)

I can understand the benefit of higher resolution capture capability to microscopic applications, but displays? Do you look at your display through a microscope?

No, but do you look at your kitchen table or your hand through a microscope? The resolution of the world is very, very small, and it contributes to the appearance of items on a macro (human) scale. The engineer in me gets the argument that 1080p already fulfills and surpasses the use case of reading text, but I'd be interested to see what kind of magic can be pulled off when we have enough pixels to really mimic the way light is scattered off of microscopic surfaces.

Re:What's the point? (2)

AmiMoJo (196126) | about 7 months ago | (#47441783)

There is a limit, but it is way above the 326 PPI of a "retina" display. You only have to compare such a display to other phones with higher PPIs (pretty much any medium to high-end model made in the last couple of years) to see that.

Re:What's the point? (0)

Anonymous Coward | about 7 months ago | (#47441981)

Not for my eyesight. But you're such a big Android fanboy it wouldn't matter. Keep telling your weaboo self whatever it takes.

Re:What's the point? (0)

Anonymous Coward | about 7 months ago | (#47442183)

Except even the iPhone 6 is rumored to have a 416 PPI. I was going to say what a fool you'd look like if and when this is proven true, but we're way past that already.

Re:What's the point? (0)

Fuzi719 (1107665) | about 7 months ago | (#47442347)

Except even the iPhone 6 is rumored to have a 416 PPI. I was going to say what a fool you'd look like if and when this is proven true, but we're way past that already.

Which is still less than the 469 PPI of my year-old HTC One (M7). "Retina" is so cute. :-P

Re:What's the point? (2)

stjobe (78285) | about 7 months ago | (#47442641)

Aye. My Nexus 5 has a 1080x1920, 445 PPI display. Although I didn't know that until just a minute ago when I looked it up; it's not something they make a big deal of in their marketing.

iPhone 5 only has 326 PPI you say? And they brag about the iPhone 6 getting a 416 PPI display?

I'll never understand marketing...

Re:What's the point? (1)

Anonymous Coward | about 7 months ago | (#47441951)

You will still piss off people around you, they are only less likely to detect it. Big difference.

Re:What's the point? (0)

Anonymous Coward | about 7 months ago | (#47442429)

Technology such as this can create QHD displays in Google Glass applications where the pixels are much closer to the eye.

Sure, when/if they figure out how to change the pixel values without an atomic force microscope.
As it is now, they don't have a display technology, only a super-dense printing technology.
It's only pixels; adding lines and logic to drive them will inevitably and strongly reduce the resolution.

Re:What's the point? (1)

Anonymous Coward | about 7 months ago | (#47442555)

The contentious point in Google Glass is the camera in front, not the display.

Re:What's the point? (1)

rsilvergun (571051) | about 7 months ago | (#47443509)

it may be possible to implant this inside the eye and have augmented reality without p*ssing off the people around you.

Wait, are you suggesting there are _other_ uses for augmented reality? Sir, you've just blown my mind.

Re:What's the point? (1)

Z00L00K (682162) | about 6 months ago | (#47443869)

So next generation of Oculus Rift can get better image quality.

Other applications may include lighter-weight devices for disabled people as well.

A higher density also means better images at short distance between eye and screen (you may want to add some optics to relieve eye stress though).

Re:What's the point? (2)

FireFury03 (653718) | about 7 months ago | (#47441597)

If the average human eye can't tell the slightest difference, what's the point of making displays that dense?

I would guess there may be applications for things like VR/AR headsets, where you're using a very small screen to cover a large field of vision.

However, I more or less thought the same thing about Apple's retina displays - I can see some restricted uses, but for the general case I don't notice the pixels on my non-retina phone so I'm not sure why I'd want to waste the battery power moving even more pixels around.

Re:What's the point? (1)

Guspaz (556486) | about 7 months ago | (#47441717)

VR and AR are something, as are things like the Google Glass, but another one is EVFs (electronic viewfinders), which typically use microdisplays.

Then again, these microdisplays already feature pixels FAR smaller than what they're claiming these new "nanopixels" are. The article is kind of confusing. They seem to be claiming pixel sizes a bit less than half what an iPhone has, but there are already smartphones out there with pixel densities almost double the iPhone's (like those phones with 1440p displays), and microdisplays go many *times* denser than that, so... what's new here exactly?

Re:What's the point? (1)

Anonymous Coward | about 7 months ago | (#47442565)

I think you missed three orders of magnitude there - 30 nm vs 78 microns.

Re:What's the point? (1)

Guspaz (556486) | about 7 months ago | (#47443599)

Yep, you're right.

Re:What's the point? (1)

ilsaloving (1534307) | about 7 months ago | (#47442231)

I don't know if you read books or anything on your devices, but I've found reading on an iPad Air to be *significantly* better than on my previous devices. Less strain to read, and I can make the text smaller without it getting blurry.

I didn't see the point in high density displays either, until I took the same PDF on an older and a newer device side by side. The difference is striking.

Of course, if you don't use your device for such things, then I agree, the higher density doesn't grant you much.

Re:What's the point? (1)

FireFury03 (653718) | about 7 months ago | (#47442625)

I don't know if you read books or anything on your devices, but I've found that reading on an iPad Air to be *significantly* better than my previous devices.

I don't own a tablet - I use a desktop machine for every day work, a laptop around the house and an Android smartphone. I wouldn't really want to read books on my smartphone except in an emergency - screen's too small to be comfortable. And I don't want a bigger smart phone because then it wouldn't be convenient to carry around and I honestly can't think how a higher resolution display would make my phone better.

On the other hand, my wife does have a tablet... She occasionally reads books on it, but it mostly gets used for facebook, web surfing, photo browsing, etc. My experience of using it for reading books isn't great - if I want to sit in the garden in the sun I find the screen too reflective, and if I want to sit in bed at night then a backlit screen is really glaring.

I think, if I were going to buy a device to be an ebook reader, I would have to buy an epaper device to be really comfortable with it, and epaper is a bit too limited to use the device for non-book uses. So since I can't get a device that would be a reasonable all-rounder then I'm not likely to buy one soon. The perfect tablet for me would probably be one that has an LCD display on one side and an ePaper display on the other so I could just turn it over to choose which display was most suitable for the current situation - no one makes such a thing.

In truth, the prevalence of DRM on ebooks is likely to keep me from being especially interested in buying an ebook reader. Whilst I do consider tablets to be quite "shiny" and nice for surfing the web on, when I look at what I'd use it for honestly, I really don't think I'd get a lot of use out of it so there's not a lot of point in me buying one.

Re:What's the point? (1)

ilsaloving (1534307) | about 6 months ago | (#47443883)

That is entirely true. And I had spent a long time looking at ePaper devices. Unfortunately, the devices I was looking at turned out to be far more expensive than comparable tablets, and with ePaper you were locked into only reading books. (Or view web pages, email, etc, assuming they even provided that functionality). I didn't want to be restricted to just books, so I went with the tablet instead.

Some company in India actually came up with a design similar to what you describe. I forget the name now, but they called the display a Qi Display. The last time I looked into it, there were issues with quality, so I stopped paying attention.

Re:What's the point? (1)

Blaskowicz (634489) | about 6 months ago | (#47444591)


I for one would simply like a high-res monochrome LCD (or greyscale, if monochrome implies 1-bit).
It was prevalent in the 80s and 90s, works unlit, is usable outdoors, and gives you much longer battery life to boot. I wouldn't give a damn about black and white if I had a long-lived, always-usable device. Hell, a 1989 Game Boy is still a better gaming device than a smartphone, and I tried to read a book on one (I read the first chapter before getting bored with it; a blocky font with very few pixels per character is not exactly ideal).

Re:What's the point? (1)

Megol (3135005) | about 7 months ago | (#47442845)

In my experience things like contrast, backlight quality and _the_lack_of_glare_ are more important for readability than pixel density.
A high enough pixel density combined with good anti-aliasing and subpixel precise rendering makes a huge difference though - not that I'd call the existing solutions good (unless I've missed something).

Re:What's the point? (3, Interesting)

Anonymous Coward | about 7 months ago | (#47441607)

Small point: If they keep making the pixels smaller, holographic displays could be possible.

Re:What's the point? (1)

MatthiasF (1853064) | about 7 months ago | (#47443615)

This is an important point. Today, most screens are designed with only 2D in mind, meaning the light is sent out omnidirectionally.

If pixel density increases past what can be seen by the human eye, manufacturers could develop directional 3D displays using polarized films. These would be similar to today's planar holographs: as you move your head, you would see a different version of the image.

This would be a huge advancement in display technology, and science fiction concepts such as holodecks could become possible, with numerous people walking around a boxed display and seeing accurate 3D.

Re:What's the point? (1)

K. S. Kyosuke (729550) | about 7 months ago | (#47441615)

Virtual reality, perhaps? Combine this with an eye tracker to render portions of the screen selectively, since doing the whole screen at full resolution would be prohibitive. You might have to combine this with real-time region updates. Sounds like an interesting problem...

Re:What's the point? (1)

towermac (752159) | about 7 months ago | (#47442677)

Very interesting.

The eye tracker thing is interesting; only update what I'm looking at. Unless there are idle cycles, then go ahead and update the rest of it.

I was thinking of a fractal-style display environment, where OOP and inheritance are taken to ridiculous levels. Each item to be updated is possibly the child of a child of some child region, although I was thinking more of in-game objects instead of predefined areas. Like all the mice, or grass waving, or whatever.

Assuming we are stuck with a single master thread in games for the time being, you're going to run out of time long before you've updated just the changes in a cycle, even if the changes were few. So this fractal-structured graphics thing updates the highest parents first, then drops a level and runs their child commands, which might be the basic outlines of leaves or other terrain, with the details, broken into however many levels of detail the programmer saw fit, encoded within. The GPU may, or may not, get to those children in time; if it does not, you would see blurry graphics, at least for that frame.

On the next update, the GPU gets its new changes, and still has a list from the prior cycle that didn't get drawn. Some of those get drawn on this new cycle, and eventually the GPU catches up. If it doesn't catch up, then that is the best that game will run with the graphics card you have. Instead of game programmers having predefined fallback modes (like a poor-to-good graphics slider), they throw everything they want at the GPU, and it takes care of rendering what it can. And unless your GPU is just really bad, you will be able to see something besides big color splotches, and hopefully play the game with degraded quality.

But you never again miss a frame, and if you stop and stare, the detail will fill itself in on your glorious VR holographic screen with no additional help from the programmers. Then, you'll drop a grand or more on a video card.

Re:What's the point? (0)

Anonymous Coward | about 7 months ago | (#47441627)

Nobody said the eye could not tell the difference, only that it couldn't differentiate between the individual pixels at that density.

Re: What's the point? (0)

Anonymous Coward | about 7 months ago | (#47441659)

Is that not the same thing?

Re: What's the point? (0)

Anonymous Coward | about 7 months ago | (#47441749)


What's the point? (1)

Anonymous Coward | about 7 months ago | (#47441653)

Try drawing a non-aliased almost horizontal black line on white background on a retina display at normal viewing distance; you won't have any problems noticing the staircasing.

Re:What's the point? (1)

Anonymous Coward | about 7 months ago | (#47441687)

Even if you can't tell the individual pixels apart, you always want anti-aliasing. Aliasing is technically a frequency domain thing. No matter how high the sample rate (the spatial resolution in the case of displays) is, aliasing will always be visible if you sample frequencies that are higher than half the sample rate. The correct solution is to use a display resolution just high enough that individual pixels can't be distinguished and apply anti-aliasing. The point of higher display resolutions is not to do away with anti-aliasing!
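The folding the comment above describes is easy to demonstrate numerically: a component at 0.7x the sample rate produces exactly the same samples as one folded down to 0.3x. A minimal one-dimensional sketch (the 0.7/0.3 frequencies are illustrative):

```python
import math

fs = 1.0                 # sample rate (pixels per unit, by analogy with a display)
f_high = 0.7 * fs        # above the Nyquist limit of 0.5 * fs
f_alias = fs - f_high    # where the energy folds down to: 0.3 * fs

for n in range(16):
    high = math.sin(2 * math.pi * f_high * n / fs)
    folded = -math.sin(2 * math.pi * f_alias * n / fs)
    assert abs(high - folded) < 1e-9   # the sampled values are identical
print("0.7*fs aliases to 0.3*fs")
```

Since the samples are indistinguishable, no amount of post-hoc processing can separate the two frequencies: the filtering has to happen before sampling, which is what anti-aliasing does.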

Re:What's the point? (1)

demon driver (1046738) | about 7 months ago | (#47441715)

Is that why more and more camera manufacturers, while sensor resolution becomes higher and higher, find anti-aliasing filters unnecessary?

Re:What's the point? (1)

Anonymous Coward | about 7 months ago | (#47441743)

Not having an explicit anti-aliasing filter is tacit admission that there is an implicit anti-aliasing filter somewhere else. In other words, the optics are shit.

Re:What's the point? (0)

Anonymous Coward | about 7 months ago | (#47442093)

Either that or the optics are designed to blur/filter the signal enough to avoid aliasing.

Re:What's the point? (1)

Blaskowicz (634489) | about 6 months ago | (#47444629)

Is that why I've looked at some shots on the web that were very high res and very noisy?
Maybe you have to do the anti-aliasing / proper reconstruction in the RAW importing software.

Re:What's the point? (1)

Lussarn (105276) | about 7 months ago | (#47441679)

If the average human eye can't tell the slightest difference, what's the point of making displays that dense?

The whole retina thing is just a marketing ploy. Perhaps some people want to hold the phone closer than what Steve decided was the optimum range. There is no denying text is sharper, and you need to zoom less, when you have better-than-retina resolution.

In any case, I'm not average.

Re:What's the point? (0)

Anonymous Coward | about 7 months ago | (#47442563)

You may not be, but the average person is.

Re:What's the point? (-1)

Anonymous Coward | about 7 months ago | (#47441689)

the point is ... niggers!

Re:What's the point? (0)

Anonymous Coward | about 7 months ago | (#47441785)

What is a "Retinal Display"?
Well, Dr. Phil Plait has a dissertation on the subject - his PhD is in optics, and he designed a camera in the Hubble Telescope.
http://blogs.discovermagazine.... [discovermagazine.com]
The critical number is 5730, used "as a scale factor; multiply an object’s size by that, and, if your vision is perfect (OOOooooo, foreshadowing!) you get how far away you can see it as more than a dot."
So 78 microns × 5730 ≈ 0.45 m, which is about the usual distance for holding your phone.
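The arithmetic checks out; note that 5730 is (very nearly) the cotangent of 0.6 arcminutes, i.e. the factor corresponds to roughly 0.6-arcminute acuity for "perfect vision":

```python
import math

scale = 5730                       # Plait's scale factor for perfect vision
pixel_m = 78e-6                    # iPhone pixel size, meters
print(round(pixel_m * scale, 2))   # 0.45 -> meters, about arm's length

# Where the 5730 comes from: 1 / tan(0.6 arcminute) ~ 5730
acuity = math.radians(0.6 / 60)
print(round(1 / math.tan(acuity))) # 5730
```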

Re:What's the point? (2)

Carewolf (581105) | about 7 months ago | (#47441795)

The human eye CAN tell the difference. What it can't do is distinguish individual pixels, just like you normally can't see individual frames when a movie or game runs faster than 24fps. If your eyesight was so poor that you couldn't see better than 300dpi at one meter, you would not be allowed to get a driver's license in most countries. Road signs are designed to be read by the minimum allowed vision at a certain distance; that means you must be able to read half-meter-high letters at 1km, which requires 1/5cm resolution, which is the same as 600dpi at 1m, and that is the minimum.

Re:What's the point? (4, Informative)

dfghjk (711126) | about 7 months ago | (#47442605)

20/20 vision is defined as 1 arc minute of resolving power. It is rare for anyone to achieve resolving power more than twice that.

1 arc minute translates to 87 dpi at 1 meter, although I have no idea why you mix inches and meters here. It is 95 dpi at 3 feet; 100 dpi is the commonly used number. People with 20/10 vision can resolve 190 dpi at 3 feet, 175 dpi at 1 meter.

No one living sees better than 300 dpi at 1 meter, so it is not likely to be the standard in ANY country, much less "most". 600 dpi for road sign legibility is even more absurd.

At 1km, 20/20 vision can resolve a "dot" about 29cm in size. That's 3.5 dots per METER. 1/2 meter letters would not be legible. 20/40 vision, a common driving standard, would be closer to 2 dots per meter, or the feature size you are quoting.

See http://www.safetysign.com/cont... [safetysign.com]

A road sign that should be legible at 1km should have a minimum letter size of 1.1 meters, not 0.5 meters.

2 dots per meter at 1km is 2 dots per mm, 50 dpi, at 1m not 600 dpi. In order to resolve text at that size someone would need 250 dpi of acuity which no one has.

Carewolf, everything you said was wrong. You may need a new calculator.
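The figures in this comment can be reproduced in a few lines; the only input is the definition of 20/20 vision as one arcminute of resolving power:

```python
import math

ARCMIN = math.radians(1 / 60)          # one arc minute in radians

def feature_size(distance_m, arcmin=1.0):
    """Smallest feature (meters) resolvable at a distance, for a given acuity."""
    return distance_m * math.tan(arcmin * ARCMIN)

def useful_dpi(distance_m, arcmin=1.0):
    """Highest pixel density the eye can exploit at that viewing distance."""
    return 0.0254 / feature_size(distance_m, arcmin)

print(round(useful_dpi(1.0)))             # 87 dpi at 1 m for 20/20
print(round(useful_dpi(3 * 0.3048)))      # 95 dpi at 3 ft
print(round(feature_size(1000.0) * 100))  # 29 cm "dot" at 1 km
```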

Re:What's the point? (1)

Bengie (1121981) | about 7 months ago | (#47443727)

The human eye can see individual frames in one aspect. It has been known for quite a while that humans can recognize a single frame at over 300fps. They injected a single frame that was in high contrast to the current video, and not only were people able to recognize when it occurred, but they were able to recognize the simple shape being displayed.

Humans don't see motion as "frames", but our visual system is great at picking out something when it's drastically different. The only reason we don't see "frames" past a certain fps is because it gets "close enough" that the brain kicks in and perceives it as a smooth motion.

Perception is a tricky thing. Do not mix up what we see with what we perceive.

Re:What's the point? (1)

fnj (64210) | about 6 months ago | (#47444333)

300 dpi at one meter? Are you high? NO ONE can come anywhere close to that. You fail basic reality.

FYI, 20-20 vision resolves roughly 16 dpi at 6 meters, and you don't even need 20-20 to drive.

Re:What's the point? (0)

Anonymous Coward | about 7 months ago | (#47442019)

Secret messages in the phone screens, communicated via scanners. Perfect for spies, terrorists, members of drug distribution chains and children.

Re:What's the point? (1)

TomGreenhaw (929233) | about 7 months ago | (#47442389)

How about a super-high-resolution watch or phone paired with glasses that magnify the image to look like a big screen?

Re:What's the point? (1)

MillionthMonkey (240664) | about 7 months ago | (#47442569)

If the average human eye can't tell the slightest difference, what's the point of making displays that dense?

Maybe eagles want to watch TV too.

Re:What's the point? (1)

ArcadeMan (2766669) | about 6 months ago | (#47444471)

I think both AMD and nVidia are pushing display manufacturers toward higher-DPI panels.

where is the news? (0)

Anonymous Coward | about 7 months ago | (#47441633)

pico and femto pixels hold the same potential

Tiny Projectors (1)

sdack (601542) | about 7 months ago | (#47441663)

This should be very interesting for making tiny projectors.

Re:Tiny Projectors (1)

kamapuaa (555446) | about 7 months ago | (#47441773)

You'd think, but microprojectors/picoprojectors haven't really advanced over the past five years.

Re:Tiny Projectors (1)

synaptic (4599) | about 6 months ago | (#47444149)

This might allow for very high-resolution interference fringes for holographic displays.

Hardware Struggles Now Though (1)

ButchDeLoria (2772751) | about 7 months ago | (#47441693)

At this point, we're making consumer grade hardware strain to drive 4K monitors. Pixel density doesn't matter if the device can't easily run at that resolution.

Re:Hardware Struggles Now Though (2)

CastrTroy (595695) | about 7 months ago | (#47441831)

This. When I was shopping for a tablet last Christmas, there were a lot of reviews saying that the 2048x1536 tablets were slower than their predecessors at many tasks, even though the processor was faster, because it took so much computation just to run the screen. For a 10-inch tablet, 1080p seems to be good enough. And trying to cram more pixels in there just for the sake of it, at the expense of battery life and framerates, seems to be a bad idea.

Re:Hardware Struggles Now Though (0)

Anonymous Coward | about 7 months ago | (#47443149)

Eventually, we'll reach the point where it's not practical to throw huge bitmaps around and we'll have to draw screens using vectors or other procedural methods.

Re:Hardware Struggles Now Though (1)

Blaskowicz (634489) | about 6 months ago | (#47444719)

We're soon going to see displays using DisplayPort compression. Analogous to texture compression, small blocks are compressed, but it is done in real time with dedicated hardware, with a supposedly very good algorithm. The goal is to enable power savings on mobile devices (including laptops) by reducing the insanely high bitrates for transmission between the GPU or SoC and the display. It will also allow a PC with DisplayPort 1.3 to output to an 8K display, even though the bandwidth (increased from the current standard) would be too small to do that uncompressed.

Nice try (1)

Anonymous Coward | about 7 months ago | (#47441731)

But the LG G3 is already down to 47-micrometer pixels. And it's mostly about battery life.

Finally, (0)

Anonymous Coward | about 7 months ago | (#47441739)

Finally, our genetically engineered descendants will have displays that won't have noticeable pixels.

VR Headsets for Eagles (5, Funny)

Warren Owen (3744573) | about 7 months ago | (#47441787)

At last we will be able to make VR Headsets for Eagles

Re:VR Headsets for Eagles (0)

Anonymous Coward | about 7 months ago | (#47442227)

I didn't know Henley/Fry et al were into VR...cool!

I can really see the difference... not. (0)

Anonymous Coward | about 7 months ago | (#47441801)

"the average human eye is unable to differentiate between the individual pixels"

How can they tell if it's working?

Pixel master race (1)

jones_supa (887896) | about 7 months ago | (#47441851)

Pixel master race.

There are better than Apple's (2)

erroneus (253617) | about 7 months ago | (#47441857)

Why do they mention that and fail to mention devices with even higher-density displays? My Nexus 5 has a 445 ppi display.

I find it annoying that, despite the existence of common devices which are "better", the "best" is still considered to be Apple's. Nothing like a product endorsement which wasn't [likely] even paid for. At the very least, they should have included the trademark sign to indicate they were making a commercial reference in their endorsement. (They did at least capitalize "retina" in "retina display"... that's not quite the same thing, and it kind of makes it worse.)

Re:There are better than Apple's (1, Funny)

drinkypoo (153816) | about 7 months ago | (#47441891)

I find it annoying that despite the existence of common devices which are "better" that the "best" is still considered to be Apple's.

Congratulations, you have just lived down to your nickname, and it has led you to whine about Apple's popularity — the only reason why everything is compared to Apple.

Oh good grief (1)

NoNonAlphaCharsHere (2201864) | about 7 months ago | (#47441865)

They're drawing pictures with AN ATOMIC FORCE MICROSCOPE, and we're discussing it like it's going to be on the next generation of smart phones.
This technology is at the "hey, look at the shadow of this Maltese cross created by the cathode rays!" [wikipedia.org] stage.

Re:Oh good grief (1)

NormalVisual (565491) | about 7 months ago | (#47443035)

From the *very next* sentence in TFA: "They then found that the "nano-pixels" could be switched on and off electronically, creating colored dots that could be used as the basis for an extremely high-resolution display."

Does it matter? (0)

Anonymous Coward | about 7 months ago | (#47441877)

How many fall for such marketing ploys as retina displays or 4K resolutions? I suppose the same people who cannot watch an HD movie but must have Blu-ray or 3D TV. The problem I have with this technology is that it's always marketed as being needed because it's better. I have looked at 4K TVs and retina displays from the standpoint of real-world content, not the special demo content the manufacturers of these products use. Take streaming content from Netflix or Amazon or even Google Play and you're already limited to HD at best, and really it's compressed HD to start with. Unless you're a Blu-ray fanatic to boot, high-density pixel displays won't help you. I also do not see internet bandwidth improving enough for all these streaming services to move to content in even higher definition than 1080p. Even satellite and cable TV have bandwidth limitations; they do not have the pipes or satellites to do high-bandwidth content on all channels. Even going as far back as content production, producers have to juggle using anything more than 720p, as their costs go up to produce higher-resolution content.
Just like sports gave up on 3D production as being too expensive, without enough viewers who even had the equipment to view it. The idea of higher-pixel-density TVs and monitors is just another way to up margins on products. It's like giving you a 700HP car and then governing it to 55 MPH.

Re:Does it matter? (0)

Anonymous Coward | about 7 months ago | (#47441969)

Cars will never catch on. I tried the real-world approach of straddling one and giving it a kick... it went nowhere!

If/when real-world content is optimized for the improvements, you will notice the difference.

Re:Does it matter? (0)

Anonymous Coward | about 7 months ago | (#47443199)

"If I’d asked people what they want, they would have said higher-resolution horses."

BadBIOS is real (0)

Anonymous Coward | about 7 months ago | (#47441939)

I've taken a standard PC, the freeware Audacity, and manually generated both Morse code and binary data in a simple .wav file using 20–22 kHz "sound", with some fade in/fade out to clean up clicks. When played you can't hear it (the dog goes nuts, though). I then used my iPhone and a free sound-spectrum-analyzer app to monitor the inaudible frequencies.

A partition-type virus combined with modem-type software (modified to use inaudible sound) could easily perform communication between PCs.
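The parent's experiment is easy to reproduce without Audacity. Here's a minimal sketch in Python (standard library only); the 21 kHz carrier, 50 ms symbol length, and on-off keying scheme are illustrative choices, not details from the comment:

```python
import math
import struct
import wave

SAMPLE_RATE = 48000   # must exceed 2 x 22 kHz to represent the carrier
CARRIER_HZ = 21000    # inaudible to most adults, within typical sound-device range

def tone(duration_s, amplitude=0.5, fade_s=0.005):
    """One near-ultrasonic burst, with a linear fade in/out to avoid clicks."""
    n = int(SAMPLE_RATE * duration_s)
    fade_n = int(SAMPLE_RATE * fade_s)
    samples = []
    for i in range(n):
        env = min(1.0, i / fade_n, (n - 1 - i) / fade_n) if fade_n else 1.0
        s = amplitude * env * math.sin(2 * math.pi * CARRIER_HZ * i / SAMPLE_RATE)
        samples.append(int(s * 32767))
    return samples

def encode_bits(bits):
    """On-off keying: a 50 ms burst for 1, 50 ms of silence for 0."""
    out = []
    for b in bits:
        out += tone(0.05) if b else [0] * int(SAMPLE_RATE * 0.05)
        out += [0] * int(SAMPLE_RATE * 0.01)  # guard gap between symbols
    return out

samples = encode_bits([1, 0, 1, 1])
with wave.open("ultrasonic.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)  # 16-bit signed
    w.setframerate(SAMPLE_RATE)
    w.writeframes(struct.pack("<%dh" % len(samples), *samples))
```

Playing the result through normal speakers and watching a spectrum analyzer on a phone should show the bursts at 21 kHz with nothing audible.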

BadBIOS is real, blame the audiophiles (0)

Anonymous Coward | about 7 months ago | (#47442881)

Maybe if they didn't insist on unrealistically high sampling frequencies, to the point where every integrated sound device is running at several times the human sound frequency limit, we wouldn't have this problem.

1.2 arcminute per line pair (3, Interesting)

Anonymous Coward | about 7 months ago | (#47442039)

The human visual system is good for at most a resolution of around 1.2 arcminute per line pair. That's an outstanding eye, with outstanding conditions. Granted, looking at a light source like an iPhone screen is in general what I would call excellent conditions, except in the shadow detail areas. If they go OLED, even that will improve.

But the bottom line is, do the math. It's pretty simple geometry. If you exceed what the human visual system can perceive, all you're doing is making marketing hype.

Same thing applies to movie theaters -- where the hype is now 4k. Even Sony admits unless you are sitting in the first few rows of the theater, 4k is overkill and 2k is plenty. If you like the back row, a 4k projection won't give you any improvement over a 720p HDTV signal.

I'm just sayin'... Do the math.
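The math the parent is pointing at really is simple geometry. A sketch (Python) using the 1.2 arcmin-per-line-pair figure cited above, so 0.6 arcmin per pixel; the viewing distances are illustrative:

```python
import math

ARCMIN_PER_LINE_PAIR = 1.2  # best-case acuity limit cited above

def max_useful_ppi(viewing_distance_in):
    """PPI beyond which even an outstanding eye can't resolve pixels.

    One line pair = 2 pixels, so each pixel subtends 0.6 arcminute.
    """
    pixel_angle = math.radians(ARCMIN_PER_LINE_PAIR / 2 / 60)
    pixel_size_in = viewing_distance_in * math.tan(pixel_angle)
    return 1 / pixel_size_in

print(round(max_useful_ppi(12)))   # phone held at ~12 in -> 477
print(round(max_useful_ppi(120)))  # TV viewed at ~10 ft -> 48
```

Which is why a ~326 ppi phone is already near the limit at arm's length, while a living-room TV saturates the eye at well under 100 ppi.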

Re:1.2 arcminute per line pair (4, Funny)

SpankiMonki (3493987) | about 7 months ago | (#47442305)

I know, right? Pixels schmixels. What really matters is dynamic contrast ratio. If these new screens don't have at least 10,000,000,000,000,000:1, I'm just not interested.

1.2 arcminute per line pair (0)

Anonymous Coward | about 7 months ago | (#47442937)

For computer monitors, though, 4k and even 8k will be completely viable. Sadly, by that point consumers will have soured on UHDTV, because the exact same problems exist with 4k at the theatre as with 4k in the home - people sit too far back from their screens. And since television standards dictate mainstream computer monitor technology, UHD monitors will be around for only a short period of time.

Witness the industry's insistence on standardizing on 1366x768 and 1920x1080 for everything - anything at a higher resolution is simply not sold in US retail outlets. If I go to Best Buy or Micro Center, I'll probably find a 27" 1080p monitor sitting there. *Maybe* the super-pricey Apple Thunderbolt Display, which is 27" 1440p, since they also sell Apple products. But if I go online, I can buy a 28" 2160p monitor, from multiple vendors, for about 60-70% of the price, displaying literally quadruple the number of pixels of the (roughly) same-size monitor you can get at retail. The only caveat is that it's TN, but all the stuff at retail is bound to be TN anyway, for the same reason it's 1080p at 27 inches.

So, yes, 4k is a marketing fad - but, for the sake of your desktop, pray that people are fooled into thinking it's worth paying money for, or you'll be stuck with 1366x768 laptop screens until the end of time.

For any particular reason? What use? (1)

AbRASiON (589899) | about 7 months ago | (#47442091)

I already can't see the pixels even up ultra close on an iPhone 5, and I have difficulty on my Samsung Galaxy S3, and both of those displays are only "fairly good" by mid-2014 phone standards, which now run to over 500 ppi.

You want to impress me? Get OLED happening everywhere. I've done the reading and I understand the tech: the colour range, refresh rate, and incredibly black blacks are awesome.

Also, 2D/3D graphics processors are going to melt pushing this many pixels sooner or later :/

Re:For any particular reason? What use? (1)

camperdave (969942) | about 6 months ago | (#47444431)

The problem with OLED is that it can't compete with sunlight, so you can't see your display outdoors. We need full-colour e-ink or something similar: a reflective/refractive or absorptive technology rather than an emissive one.

I can see why! (1)

sd4f (1891894) | about 7 months ago | (#47442167)

It's good that they're working on this, getting better pixel densities will no doubt have applications somewhere (such as VR, google glass type hardware), but really, I don't want to start seeing 4k phone displays.

Monitors? (1)

Lawrence_Bird (67278) | about 7 months ago | (#47442181)

So.. you mean I might live to see a 35" monitor with over 300 dpi? (ok, I'll settle for a doubling of the current 100dpi).

Ideal PPI (2)

Twinbee (767046) | about 7 months ago | (#47442265)

So... if you paint a white, single-pixel-wide, 15-degree line without any anti-aliasing onto a black background, what does the PPI need to be so you don't notice any jaggies?

300? 600? 1200? 2400 or more?

Re:Ideal PPI (1)

Pinky's Brain (1158667) | about 7 months ago | (#47442645)

Whatever PPI is necessary to make it invisible.

Re:Ideal PPI (1)

Twinbee (767046) | about 7 months ago | (#47442661)

I think that's a very good point, and something I considered too. I'm not sure that would work for ever-increasing brightnesses though.

Doesn't seem to work (1)

LongearedBat (1665481) | about 7 months ago | (#47442547)

On my screen, the sample pictures they show in the article look just as pixellated as any other picture.

Where's the middle ground of usability? (1)

udoschuermann (158146) | about 7 months ago | (#47443115)

I'd be plenty happy if I could buy a 24" desktop monitor with 2560×1600 pixels (125 DPI).

Back in 2004 (10 years ago!) I had a Sager laptop with a 135 DPI (1600×1200) display. That was an awesome display, but it seems we've made no progress since: it's either barely 100 DPI on the desktop or 400+ DPI on a tiny mobile phone. Why can't we get 150 or 200 DPI on the desktop? Am I really the only one who cares?

Re:Where's the middle ground of usability? (0)

Anonymous Coward | about 6 months ago | (#47444011)

> I'd be plenty happy if I could buy a 24" desktop monitor with 2560×1600 pixels (125 DPI).

Here you go: http://www.amazon.com/gp/product/B005JN9310/ [amazon.com]

Re:Where's the middle ground of usability? (0)

Anonymous Coward | about 6 months ago | (#47444097)

Oops... my mistake. That's 1920x1200. I was fooled by its inclusion in this review [squidoo.com] of best 2560x1600 displays.

Re:Where's the middle ground of usability? (0)

Anonymous Coward | about 6 months ago | (#47444217)

This is good news, though:

NEC Display Solutions Introduces 24-Inch UHD Display [techpowerup.com]

Can I have an indigo pixel? (3, Interesting)

jfengel (409917) | about 7 months ago | (#47443133)

One possibility would be improving the color range, even if the resolution isn't improved. Rather than cramming in three phosphors per pixel, perhaps we could have four, or more. There's a considerable chunk of the color space not well represented by RGB.

I don't know how much of a difference it would make to TV viewers or gamers, but I know that artists would be grateful for a better color range. The conversion from RGB to CMYK is always a bit of a crapshoot; things that look great on your screen don't look as good when they come back from the printers, and there's a whole range of stuff it doesn't occur to you to try because you can't see it.

I could even imagine that it might be handy for medical imaging and other applications where you want to cram as much information onto the screen as possible: more pixels may not improve things but more colors might. Though more pixels could achieve that as well: it would be nice to be able to zoom in by bringing your face closer to the screen without simply seeing bigger pixels. Head motion is kinaesthetically appealing: you can move in and out without losing your sense of overall place.

Sharp already makes a TV with four subpixels, adding a yellow one (which is especially helpful for skin tones). I think it would be neat to be able to produce true indigo, violet, and cyan. If this lets you add more phosphors without costing resolution, it might not be a killer app, but it could be a desirable thing.
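One way to put a number on a fourth primary: compare the area of the chromaticity polygon spanned by the standard sRGB primaries with and without an extra cyan primary. A sketch (Python); the sRGB coordinates are real, but the cyan point is made up for illustration, roughly near the 490 nm spectral locus:

```python
def polygon_area(pts):
    """Shoelace formula: area of a convex gamut polygon in CIE xy."""
    n = len(pts)
    return abs(sum(pts[i][0] * pts[(i + 1) % n][1]
                   - pts[(i + 1) % n][0] * pts[i][1]
                   for i in range(n))) / 2

# sRGB primaries in CIE 1931 xy chromaticity
R, G, B = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06)
# hypothetical fourth primary near spectral cyan (~490 nm)
CYAN = (0.05, 0.30)

rgb_gamut = polygon_area([R, G, B])
rgbc_gamut = polygon_area([R, G, CYAN, B])  # vertices in hull order
print(rgbc_gamut / rgb_gamut)  # ~1.40 with these coordinates
```

So with that (hypothetical) cyan, the reproducible chromaticity area grows by roughly 40%: crudely, the chunk of blue-greens that three-primary RGB can't reach. Sharp's yellow sits much closer to the existing R-G edge, which is why its gain is subtler.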

Image size and camera (1)

angel'o'sphere (80593) | about 7 months ago | (#47443569)

I wonder how big images to be displayed on such a screen would be, or more importantly, what camera you would need to support such resolutions.

Only 326 ppi huh? (1)

Grismar (840501) | about 6 months ago | (#47443945)

You could have mentioned a bunch of non-Apple phones available right now, with far higher ppi than those two Apple devices - without fancy future "in 5 years"-tech. And I'm not talking obscure brands either. But I guess that was kinda the whole point right? A small advertisement with a tech article hardly anyone on here will read.

Can be used for true 3D display (1)

Prune (557140) | about 6 months ago | (#47444783)

We currently do have auto-stereoscopic displays (no glasses), but they only account for stereopsis, not accommodation (the eye focusing at different distances). In current 3D displays, the 3D cue of stereopsis conflicts with the accommodation cue of a flat plane, so the 3D effect is significantly diminished (and can even cause discomfort or headaches). With an ultra-high-pixel-density display as the base, lightfield displays become practical, and they can reproduce both stereopsis and a different focal depth per image element. Current prototypes I've seen at SIGGRAPH have been very low resolution, because you need a patch of 2D pixels under each microlens (lightfield displays are based on a microlens array with multiple pixels under each lens). I imagine a 1920x1080 microlens array with 32x32 pixels under each microlens. If the display were also high-dynamic-range and had an extended color gamut, it would be the ultimate visual equivalent of a window into other worlds.
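The pixel budget that configuration implies is easy to tally, and it shows why nano-pixel densities would matter here. A sketch (Python); the 24-inch panel size is an illustrative assumption, the 1080p-lens-array and 32x32-per-lens figures are the parent's:

```python
import math

# From the comment above: a 1080p microlens array, 32x32 pixels per lens
LENSES_X, LENSES_Y = 1920, 1080
SUB = 32

native_x = LENSES_X * SUB        # 61,440 native pixels across
native_y = LENSES_Y * SUB        # 34,560 native pixels down
total = native_x * native_y      # ~2.1 billion pixels

def required_ppi(diagonal_in, aspect=(16, 9)):
    """Pixel density needed to fit the native resolution on a given panel."""
    aw, ah = aspect
    width_in = diagonal_in * aw / math.hypot(aw, ah)
    return native_x / width_in

print(f"{total:,} pixels")
print(round(required_ppi(24)))   # ppi on a 24-inch 16:9 panel -> 2937
```

Roughly 3,000 ppi on a desktop-sized panel, which is far beyond current manufacturing but within the range the nano-pixel work in TFA is aiming at.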