
High End Graphics Cards Tested At 4K Resolutions

Soulskill posted about a year and a half ago | from the scale-it-until-they-catch-fire dept.

Graphics 201

Vigile writes "One of the drawbacks to high end graphics has been the lack of low cost and massively-available displays with a resolution higher than 1920x1080. Yes, 25x16/25x14 panels are coming down in price, but it might be the influx of 4K monitors that makes a splash. PC Perspective purchased a 4K TV for under $1500 recently and set to benchmarking high end graphics cards from AMD and NVIDIA at 3840x2160. For under $500, the Radeon HD 7970 provided the best experience, though the GTX Titan was the most powerful single GPU option. At the $1000 price point the GeForce GTX 690 appears to be the card to beat with AMD's continuing problems on CrossFire scaling. PC Perspective has also included YouTube and downloadable 4K video files (~100 mbps) as well as screenshots, in addition to a full suite of benchmarks."


CENSORSHIP BY SLASHDOT... apk (-1)

Anonymous Coward | about a year and a half ago | (#43596339)

* Breaking news: corrupt Slashdot administrators have modified the site's "lameness filter" to censor true and useful HOSTS file information. Slashdot admins collude with criminals to prevent you from learning about HOSTS files. Visit here [slashdot.org] for information Slashdot does not want you to see.

* Older news: corrupt Slashdot administration attempted to ban me for blowing the whistle on their illegal activities, while not banning the criminal who stalks, harasses, and impersonates me. Whistleblower abuse is a federal felony. Lunatic Slashdot admins have been owned by me in so many tech debates over the past decade that they conspire with criminals to effetely & vainly *try* to "hide" my posts and censor me. Jealousy at its finest.

=> Lawsuits and criminal prosecution against Slashdot are now inevitable. Moderation + posting records will be sequestered and anyone acting against me will be dealt with permanently.

Previous notice:

A corrupt slashdot luser has penetrated the moderation system to downmod all my posts while impersonating me.

Nearly 330++ times that I know of @ this point for all of March/April 2013 so far, & others here have told you to stop - take the hint, lunatic (leave slashdot)...

Sorry folks - but whoever the nutjob is that's attempting to impersonate me, & upset the rest of you as well, has SERIOUS mental issues, no questions asked! I must've gotten the better of him + seriously "gotten his goat" in doing so in a technical debate & his "geek angst" @ losing to me has him doing the:

---

A.) $10,000 challenges, ala (where the imposter actually TRACKED + LISTED the # of times he's done this no less, & where I get the 330 or so times I noted above) -> http://it.slashdot.org/comments.pl?sid=3585795&cid=43285307 [slashdot.org]

&/or

B.) Reposting OLD + possibly altered models - (this I haven't checked on as to altering the veracity of the info. being changed) of posts of mine from the past here

---

(Albeit massively repeatedly thru all threads on /. this March/April 2013 nearly in its entirety thusfar).

* Personally, I'm surprised the moderation staff here hasn't just "blocked out" his network range yet honestly!

(They know it's NOT the same as my own as well, especially after THIS post of mine, which they CAN see the IP range I am coming out of to compare with the ac spamming troll doing the above...).

APK

P.S.=> Again/Stressing it: NO guys - it is NOT me doing it, as I wouldn't waste that much time on such trivial b.s. like a kid might...

Plus, I only post where hosts file usage is on topic or appropriate for a solution & certainly NOT IN EVERY POST ON SLASHDOT (like the nutcase trying to "impersonate me" is doing for nearly all of March/April now, & 330++ times that I know of @ least)... apk

P.S.=> here [slashdot.org] is CORRECT host file information just to piss off the insane lunatic troll.

--


CENSORED BY SLASHDOT [slashdot.org]

Making up for small penis with MASSIVE post! ..apk (-1, Troll)

Anonymous Coward | about a year and a half ago | (#43596587)

*Breaking News: Corrupt slashdot lusers infiltrate the moderation system and downmod my posts and tell me to stop waving around my "E-peen" when I use my copy-pasta about my 2gb HOSTS file, running in Ring 0! Now they have conscripted the help of the slashdot admin staff to modify the lameness filter to block my endless shit smeared screeds that I relentlessly pull out of my ass to amuse the children, forcing me to LINK to previous screed postings instead!

Previous News: Someone named Jeremiah Cornelius keeps making fun of my *totally useful* posts about hosts files, and makes fun of me personally and insists that I am a troll! It's totally unfair and makes my panties get all sweaty and knotted! I simply can't understand why anyone would NOT want to read the shit splattered screed of text I forcefully ram down slashdot's comments section dozens of times daily! It's informative and useful, honest! Slashdot admins just can't handle the amount of truthiness it contains! It's totally a conspiracy!

Previous news: Today marks the 500th day I have refused to take my lithium script. My friend the pink unicorn tells me to add my mental health provider to my phone company's block list, and to be his best friend forever. I like him a lot, because he told me about hosts files, and how to use petrified hot grits the right way. He says that all those companies on the internet are out to get me, and that I need to add them to my hosts file. I don't know how I ever managed to live without his helpful advice! I will share it with everyone, everywhere from now on!

Re:Making up for small penis with MASSIVE post! .. (-1)

Anonymous Coward | about a year and a half ago | (#43596631)

The part about the panties made me physically nauseous.

Jeremiah Cornelius: Grow up (-1)

Anonymous Coward | about a year and a half ago | (#43597145)

You're embarrassing yourself, Jeremiah Cornelius http://slashdot.org/comments.pl?sid=3581857&cid=43276741 [slashdot.org] since you posted that using your registered username by mistake (instead of your usual anonymous coward submissions by the hundreds over the past 2-3 months on Slashdot), giving away that it's you spamming these forums almost constantly, just as you have in the post I just replied to.

Re:Jeremiah Cornelius: Grow up (0)

Anonymous Coward | about a year and a half ago | (#43597363)

You fail it, Paul. Your skill is not enough.

Now where's the cheap monitors? (4, Insightful)

Kjella (173770) | about a year and a half ago | (#43596465)

I've done the distance/size check; I don't need a UHDTV from where I'm sitting. There's no content for it anyway. But I would like a 27-30" 3840x2160 monitor for my computer.
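The distance/size check boils down to simple trigonometry, assuming the common one-arcminute-per-pixel rule of thumb for 20/20 vision (a rough sketch; the function name and example distances are illustrative):

import math

def max_useful_ppi(viewing_distance_inches, arcmin_per_pixel=1.0):
    # Density beyond which a viewer with ~20/20 vision can no longer
    # resolve individual pixels at the given distance.
    pixel_angle = math.radians(arcmin_per_pixel / 60.0)
    pixel_size_inches = viewing_distance_inches * math.tan(pixel_angle)
    return 1.0 / pixel_size_inches

# Couch ~8 feet from a TV vs. a desk ~2 feet from a monitor
for distance in (96, 24):
    print(f"{distance} in away: ~{max_useful_ppi(distance):.0f} ppi is the most you can resolve")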

Re:Now where's the cheap monitors? (2, Informative)

Skapare (16644) | about a year and a half ago | (#43596865)

3840 is not 4k. 4096 is 4k.

Re:Now where's the cheap monitors? (3, Informative)

SOOPRcow (1279010) | about a year and a half ago | (#43596903)

4K Ultra high definition television is 3840 × 2160, which is, as I'm sure you can figure out, double the horizontal and vertical resolution (four times the pixels) of current HD content. That said, I will agree that calling it "4K Ultra HD" is kind of misleading :) See http://en.wikipedia.org/wiki/4K_resolution [wikipedia.org]
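The arithmetic behind "double the resolution" (a quick sketch: UHD doubles each dimension of 1080p, i.e. four times the pixels, while DCI 4K is slightly wider):

base = 1920 * 1080
for name, w, h in [("1080p (Full HD)", 1920, 1080),
                   ("UHD '4K'", 3840, 2160),
                   ("DCI 4K", 4096, 2160)]:
    px = w * h
    print(f"{name:15s} {w}x{h} = {px:>10,d} px ({px / base:.2f}x 1080p)")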

Re:Now where's the cheap monitors? (1)

gman003 (1693318) | about a year and a half ago | (#43597191)

Yes - which is why I make a point to call it 2160p, not 4K.

Re: Now where's the cheap monitors? (0)

Anonymous Coward | about a year and a half ago | (#43597437)

Let it go, man, we lost this one back when a megabyte didn't mean 1,000,000 bytes and we let it slide. 4K rolls off the tongue so nicely, and says it all in two syllables, which is important if we're gonna flog this spectacular resolution to Joe Sixpack.

Re:Now where's the cheap monitors? (5, Informative)

PhrostyMcByte (589271) | about a year and a half ago | (#43596899)

4K/8K will sell UHDTV. But the best benefit, a gem rarely mentioned: it features a hugely increased gamut [wikipedia.org] and 10 or 12-bit (10-bit mandatory minimum) component depth. The image will look more life-like than any of the common TVs available today, and it won't be relegated to photographers and graph designers: it'll be standard.

Re: Now where's the cheap monitors? (5, Insightful)

crdotson (224356) | about a year and a half ago | (#43596997)

Thank you. Most people just don't seem to understand that monitors aren't done until you can't tell the difference between a monitor and a window! It's the "1920x1080 should be enough for anybody" mentality. You'd think people would learn after a while.

Re: Now where's the cheap monitors? (1)

DigiShaman (671371) | about a year and a half ago | (#43597337)

If these monitors can reproduce the life-likeness of the printing process used by Peter Lik, then I'm sold!

Re: Now where's the cheap monitors? (1)

Anonymous Coward | about a year and a half ago | (#43597559)

This resolution drought we've been in for a decade really pisses me off. I was running 2048x1280 15 years ago on a Sony Trinitron monitor with 0.22mm dot pitch. Why the fuck are all monitors today still shittier than what we had over 15 years ago?

Re: Now where's the cheap monitors? (0)

fredgiblet (1063752) | about a year and a half ago | (#43597621)

Because most people can't tell and/or don't care. There's not much money in catering to the resolution queens.

Re:Now where's the cheap monitors? (4, Informative)

dfghjk (711126) | about a year and a half ago | (#43597259)

"The image will look more life-like than any of the common TVs available today..."

Not because of the wide gamut it won't. Having the gamut on your output device doesn't mean you have it on your input device. Content won't exist that uses it so it WILL be "relegated to photographers and graph (sic) designers", standard or not. The value is suspect and the cost is mandatory extra bit depth leading to higher data rates.

The side effect of wide gamut displays displaying common content in non-color managed environments is that it looks worse, not better. This is television we are talking about, not Photoshop. Today's HD content won't look the least bit better on a wide gamut display, it could only look worse.

Re:Now where's the cheap monitors? (5, Informative)

JanneM (7445) | about a year and a half ago | (#43597423)

It's different for different parts of the business of course, but the graphic designers I know personally (through a family member) don't care about monitor gamut or colour fidelity at all. Sounds odd, perhaps, but there's good reason for it.

Most graphic design is not for the web, but for physical objects. Anything you see around you that's printed or patterned - kitchen utensils, tools, and household objects; clothes and textile prints; books, calendars, pamphlets; not to mention the cardboard and plastic boxes it all came in - has been designed by a graphic designer. And it's all printed using different kinds of processes, on different materials, with different kinds of inks and dyes.

A monitor, any monitor, simply can't show you what the finished design will look like, since it can't replicate the effect of the particular ink and material combination you're going to use. So they don't even try. Instead they do the design on the computer, but choose the precise colour and material combination by Pantone patches. We've got shelves of sample binders at home, with all kinds of colour and material combinations for reference. As an added bonus you can take samples into different light conditions and see what they look like there.

The finished design is usually sent off as a set of monochrome layers, with an accompanying specification on what Pantone colour each layer should use. They do make a colour version of it too, but that's just to give the client a rough idea of what the design will look like.

Re:Now where's the cheap monitors? (0)

Anonymous Coward | about a year and a half ago | (#43597327)

I wish the parent had linked to a graph of the color spaces in question. For comparison, here's an image depicting the sRGB color space [wikimedia.org] that's currently used, including in HDTV -- and here's one for the Rec. 2020 color space [wikimedia.org] used by UHDTV, plotted on the CIE 1931 diagram.

Re:Now where's the cheap monitors? (1)

mjwx (966435) | about a year and a half ago | (#43597349)

4K/8K will sell UHDTV. But the best benefit, a gem rarely mentioned: it features a hugely increased gamut [wikipedia.org] and 10 or 12-bit (10-bit mandatory minimum) component depth. The image will look more life-like than any of the common TVs available today, and it won't be relegated to photographers and graph designers: it'll be standard.

In theory, yes.

In practice people can't tell the difference between 6-bit and 10-bit colour. Besides this, most people won't be able or willing to configure or manage colour on their TV set properly. Most people can't even be bothered to set their monitor to the proper resolution.

It's the same with DVD and Blu-ray: most people can't tell the difference. They only think they can because they know it's Blu-ray. I can easily convince people an upscaled DVD is Blu-ray simply by telling them it's Blu-ray. They think I'm lying when I show them a plain DVD at the end of it.

A nice improvement in theory, but much like 1080p or 3D, in practice it'll be nothing more than a gimmick to keep TV prices high.

However, all the resolution and gamut improvements in the world won't improve the crappy reality TV that gets served up on it.

Re:Now where's the cheap monitors? (1)

will.perdikakis (1074743) | about a year and a half ago | (#43597443)

In practice people can't tell the difference between 6-bit and 10-bit colour.

That is unfair.

Rec. 2020 not only increases the bit depth (finer quantization) of the colour encoding, it also increases the area covered in the CIE 1931 diagram. In other words, it is not just the same color gamut with less quantization; it is a much larger color space entirely.

Comparison:
Rec 2020 UHDTV: http://upload.wikimedia.org/wikipedia/commons/b/b6/CIExy1931_Rec_2020.svg [wikimedia.org]

Rec 709 HDTV: http://upload.wikimedia.org/wikipedia/commons/e/ef/CIExy1931_Rec_709.svg [wikimedia.org]
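To put a rough number on the difference, here is a sketch using the published CIE xy primaries; triangle area in xy space is only a crude proxy for perceived gamut, so take the ratio with a grain of salt:

def triangle_area(p1, p2, p3):
    # Shoelace formula for the gamut triangle in CIE 1931 xy space
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    return abs(x1*(y2 - y3) + x2*(y3 - y1) + x3*(y1 - y2)) / 2

rec709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # R, G, B primaries
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

ratio = triangle_area(*rec2020) / triangle_area(*rec709)
print(f"Rec. 2020 xy gamut area is ~{ratio:.2f}x that of Rec. 709")  # ~1.9x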

Re:Now where's the cheap monitors? (1)

mjwx (966435) | about a year and a half ago | (#43597489)

In practice people can't tell the difference between 6-bit and 10-bit colour.

That is unfair.

It's not unfair at all.

Most people won't be able to tell. The graphs you linked to show the measurable difference, but that's using a device to measure it; we're talking about people here. People are terrible at measuring things.

In a blind test, most people won't be able to tell the difference.
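Whether or not viewers can see it, the quantization difference itself is easy to state (per-channel levels, ignoring dithering and gamma; a quick sketch):

for bits in (6, 8, 10, 12):
    levels = 2 ** bits
    step = 100.0 / (levels - 1)  # smallest step as % of full scale
    print(f"{bits:2d}-bit: {levels:5d} levels per channel, ~{step:.3f}% per step")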

Re:Now where's the cheap monitors? (3, Funny)

jelizondo (183861) | about a year and a half ago | (#43597693)

In a blind test, most people won't be able to tell the difference.

You insensitive clod! They are blind! Of course they can't tell the difference!

Jees!

(ducks)

Re:Now where's the cheap monitors? (2, Interesting)

Anonymous Coward | about a year and a half ago | (#43596977)

A 9.7" retina display costs $55 off the shelf for horsiest. That's about $40 BoM or lower. Quintuple/sextuple that for a 23" 4K, add a thick margin, and you end up in the $300-350 range, without taking into account how production will scale to make it all cheaper.

As for your 27-30" 3840x2160 desire, it's actually quite easily doable now since it's really not that dense when you consider stuff like 5" devices having 1920x1080.

I would imagine a small OEM could make an order for these right from an existing Chinese line as long as the orders are in the thousands. I don't even think over 20,000 will be necessary, considering it's really just taking the same piece and *not* cutting it up as much. Maybe an Alienware/Nvidia co-venture to produce console killers with their own x86 console targeting these yet-to-be-seen screens... Oh, we can dream...

Re:Now where's the cheap monitors? (1)

Anonymous Coward | about a year and a half ago | (#43597059)

lol meant *hobbyists* not *horsiest* :D
This spell-check is freaking awesome :P

Re:Now where's the cheap monitors? (1)

scubamage (727538) | about a year and a half ago | (#43597351)

Eh. I could see using this for a PC, but honestly, I drive all of my media through an HTPC, and I'll be darned if I'm going to have to buy something that can fit a full-height video card just to watch videos, plus the video card. My 51" 1080p plasma display at 10 feet looks crystal clear, with no discernible pixels. Maybe in 5-10 years, once the life has been zapped out of this plasma, I'll think about it. But until it is commodity hardware, no thanks. By all means though, other folks feel free. I'm more interested in the graphics card updates that will have to be in place to drive the performance required for gaming on these things.

4k for games? (1)

aleator (869538) | about a year and a half ago | (#43596489)

Does it matter that much if you play on a 4K or 2K screen? The game's graphics aren't distinguishing between single pixels anyway, and the textures are not optimised for 4K. If you played 2K side by side with 4K (setting aside the GPU power), would you notice the difference? 4K makes a significant difference for photography and video!

Re:4k for games? (0)

TheRealMindChild (743925) | about a year and a half ago | (#43596545)

Does it matter if you watch a DVD or Blu-Ray?

Re:4k for games? (1)

aleator (869538) | about a year and a half ago | (#43596605)

A DVD has smaller frames inside compared to a Blu-ray, whereas the textures and 3D structures a game ships with are the same... no game will show you a different 3D object with more detail simply because you are at a higher resolution.

Re:4k for games? (1)

wierd_w (1375923) | about a year and a half ago | (#43596695)

Unless the game supports community content, like say, skyrim.

Then you drop in a high poly, high res pack, and play at 4k native all day long.

Re:4k for games? (1)

PopeRatzo (965947) | about a year and a half ago | (#43596889)

Unless the game supports community content, like say, skyrim.

Skyrim is a pretty rare example. Most games are not modded to the moon the way Skyrim has been.

Most games will look the same on 2k or 4k.

Re:4k for games? (1)

epyT-R (613989) | about a year and a half ago | (#43596815)

Uh, wait, what? No. Bitmapped data (like an MPEG stream) will not show you more detail if simply scaled up. The detail has to be there in the first place. Some TVs and players can 'add' sharper edges and gradients, as well as add intermediate frames (temporal/motion smoothing) with filters, which are running at the higher resolution, but this is fake data added after the fact. Some people like this and some don't.

However, rendering native 3D graphics at higher resolutions reduces edge aliasing and tightens perspective correction filtering (anisotropy) on surfaces not parallel and/or very distant to the view frustum. Yes, lower resolution textures will cause the latter benefit to roll off faster as the display resolution increases, but the benefits are still there.
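One way to see the cost side of this: shading work scales roughly with the number of samples, so native 4K is in the same ballpark as 1080p with 4x supersampling (a crude proxy that ignores geometry, bandwidth and cache effects; the function is illustrative):

def relative_shading_cost(width, height, ssaa=1, base=(1920, 1080)):
    # Pixels (or supersamples) shaded per frame, relative to plain 1080p
    return (width * height * ssaa) / (base[0] * base[1])

print(relative_shading_cost(1920, 1080, ssaa=4))  # 1080p + 4x SSAA -> 4.0
print(relative_shading_cost(3840, 2160))          # native 4K, no AA -> 4.0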

Re:4k for games? (1)

hairyfeet (841228) | about a year and a half ago | (#43596945)

But how many people can actually see that when all the shit be blowing up all over the screen? I used to always have to go for the biggest GPU I could possibly afford, but then one day it was pointed out to me: "All you are doing is letting those around you enjoy a little more eye candy, you are too busy trying not to die to notice," and he was right. When I am focusing on actually playing a game, as long as it stays above 30FPS and doesn't have obvious graphics pop-in, I can't pay attention to bling; I'm too busy playing.

As far as 4K goes for gaming, I don't see it becoming mainstream until you can get sub-$200 4K screens like you can the 24-inch and sometimes 32-inch 1080p screens now, because at least with my gamer customers, they have waaay too many other parts they want to get, so changing screens? Not really high on their list. Can't say as I blame 'em; I want to replace my aging HD4850 with an HD7770, add a caching SSD to speed game load times, heck, even getting a nicer headset has a higher priority than replacing my 20 inch 16x9 screen does. I will probably end up keeping it until most games won't play at less than 1080p or the screen dies, one of the two.

So I have a feeling that for the foreseeable future 4K is gonna be strictly a videophile hobby, which means the content just won't be there, kinda like how the OEMs pushed like hell for 3D TV and that has bombed pretty massively: nobody cares because the content isn't there, and the content isn't there because nobody cares. A classic chicken-and-egg kinda deal.

Re:4k for games? (1)

fredgiblet (1063752) | about a year and a half ago | (#43597129)

Agreed. And the reason this is being pushed now is probably BECAUSE 3D is bombing. The content companies need to push something new to get people to upgrade their TVs, 3D didn't work so now they're going for 4K. Whatever, when I bought my TV I chose the 720 model because it was going to be used for Xbox and DVDs, I doubt I'll be getting a 4K for a decade or so.

Re:4k for games? (1)

Anonymous Coward | about a year and a half ago | (#43596669)

" the textures are not optimised for 4k"

While I generally agree with you that 4K is overkill, I don't agree for the reasons you stated.

The textures don't have to be available in the full resolution in order to take advantage of that resolution (in theory). That is because texels are not 1:1 with screen pixels. They can be many:1, because the object on which they are mapped can be in the distance. A 1024-pixel-wide texture might be mapped to (say) 20 pixels of screen space.

That being said, 4K does seem fairly absurd for most gaming purposes.
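A sketch of the many-to-one texel mapping described above, using the usual mip-level rule of thumb (details vary by GPU and filtering mode; the function name is illustrative):

import math

def mip_level(texels_per_pixel):
    # Level 0 when one texel covers one screen pixel, +1 for every doubling
    return max(0.0, math.log2(texels_per_pixel))

# A 1024-texel-wide texture squeezed into ~20 pixels of screen space
print(mip_level(1024 / 20))  # ~5.7, sampled between mip levels 5 and 6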

Re:4k for games? (1)

hairyfeet (841228) | about a year and a half ago | (#43596691)

Until the monitor manufacturers can crank those puppies out for the same price they can crank out 1080p screens for, it's largely gonna be moot; it'll be one of those niches, like LaserDisc back in the day, that only a few videophiles cared about.

Until then the sweet spot seems to be 32-42 inch 1080p screens, at least that is what I'm seeing in the shop from both the gamers and those setting up home theaters, with the gamers preferring 32s and the home theaters going 40 to 42. Makes me feel like a dinosaur going home to do everything on a 20 inch 16x9 but I'm less than a foot from the screen so I figure right now a 32 inch would be overkill and I'd rather spend that money on a caching SSD and a new GPU.

Re:4k for games? (1)

bmxeroh (1694004) | about a year and a half ago | (#43597111)

I'm with you on price: until these things come way down, you will see these sets pretty much never. That being said, any cable we pull through the wall supports 4K, just in case someone wants to upgrade some day. I find it curious that you mention 40-42 inch sets; generally everything we put on the wall is 50" or larger, with folks moving to projection if they want to go really, really big (100"+ big). Usually the 40-inch sets are in the bedroom.

<shameless_plug> The site in my sig has some photos of some of the large sets we've mounted. They look fantastic. And we're awesome. Seriously. Like we completely blow every other Central Ohio installer out of the water on price and customer service. It's what we do. </shameless_plug>

Re:4k for games? (4, Insightful)

Keith Mickunas (460655) | about a year and a half ago | (#43596785)

A 4K 50" display 4' or 5' away would give you a pretty damn immersive experience. Wouldn't that be nice?

I'm sitting with my eyes about 3' from my 27" 2560x1440 display with about 108ppi. I can make out some pixels as it is in the text. I'm not wearing my glasses, so that helps some. If this was a 4K 27" display, that would be 163ppi. That's a 50% increase right there.

Wasn't that long ago that running 1280x1024 on a 17" LCD was pretty damn nice, and that was 94ppi. So for a decade we've barely improved when it comes to density. Hell, a 24" 16:10 display that so many people love so much has the same density as a 17" LCD.
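Those density figures follow directly from the diagonal pixel count; a quick sketch (small differences from the quoted numbers come down to rounding and viewable area):

import math

def ppi(width_px, height_px, diagonal_inches):
    return math.hypot(width_px, height_px) / diagonal_inches

for name, w, h, d in [("27in 2560x1440", 2560, 1440, 27),
                      ("27in 3840x2160", 3840, 2160, 27),
                      ("17in 1280x1024", 1280, 1024, 17)]:
    print(f"{name}: ~{ppi(w, h, d):.0f} ppi")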

Of course my very first PC games ran in CGA, and I thought VGA was a huge step up. But at no time have I ever thought to myself "Nope, more wouldn't be better". Not when it comes to graphics, RAM, harddrive size, etc. Give me more and I'll use it.

Re:4k for games? (1)

fuzzyfuzzyfungus (1223518) | about a year and a half ago | (#43596835)

Does it matter that much if you play on a 4K or 2K screen? The game's graphics aren't distinguishing between single pixels anyway, and the textures are not optimised for 4K. If you played 2K side by side with 4K (setting aside the GPU power), would you notice the difference? 4K makes a significant difference for photography and video!

Support varies by engine; but one (reasonably) common thing that people do with games that weren't originally designed with high-resolution widescreens in mind is mess with the field of view. Some games react badly, with all sorts of distortion effects; but it can also create a nice 'peripheral vision' sense that the game originally lacked.
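The field-of-view adjustment mentioned here is usually the "Hor+" scheme: hold the vertical FOV fixed and widen the horizontal FOV with the aspect ratio (a sketch under that assumption; engines differ in which axis they actually lock):

import math

def horizontal_fov(vertical_fov_deg, aspect_ratio):
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect_ratio))

for name, aspect in [("4:3", 4/3), ("16:9", 16/9), ("3x 16:9 surround", 3 * 16/9)]:
    print(f"{name:18s} -> {horizontal_fov(60, aspect):6.1f} deg horizontal")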

This would also be engine-dependent, in terms of how much it can be tweaked; but it isn't uncommon for (even comparatively low resolution) games to have decent texture assets so that the world doesn't turn to lego when you walk up and talk to somebody/bump into a tree; but it swaps them out for lower-poly and lower resolution models when you are further away in order to save power, and eventually ceases drawing them altogether. If the engine allows it, and you've got the power, just bumping all the sliders to "Fuck yeah, I've got more shader units than some minor deities, MAXIMUM DETAIL AND DRAW DISTANCE!" will be an improvement.

For games that have lots of floating HUD elements, buttons, sidebars, etc(RTS, some of the more cluttered RPGs) it's always handy to have more room around the edges while still having a high resolution view of the action.

Re:4k for games? (0)

Anonymous Coward | about a year and a half ago | (#43597177)

No one will ever need more than 640K anyway.

Re:4k for games? (1)

Nyder (754090) | about a year and a half ago | (#43597235)

Does it matter that much if you play on a 4K or 2K screen? The game's graphics aren't distinguishing between single pixels anyway, and the textures are not optimised for 4K. If you played 2K side by side with 4K (setting aside the GPU power), would you notice the difference? 4K makes a significant difference for photography and video!

What does that matter when the game devs program for the consoles and port it to the PC? Sure, maybe on the PC you can go a bit higher resolution, but crappy textures still look like crappy textures.

No (5, Insightful)

bhcompy (1877290) | about a year and a half ago | (#43596507)

One of the drawbacks to high end graphics has been the lack of low cost and massively-available displays with a resolution higher than 1920x1080.

Really? You've never heard of the Dell U2410? Fuck 16:9

Re:No (1)

Keith Mickunas (460655) | about a year and a half ago | (#43596537)

1920x1200 displays had been around long before the Dell U2410, so it's silly they ignored this. But would you really reject a 2560x1440 display because it's 16:9? How about a 4K display? That's just silly.

People need to get over this 16:9 vs. 16:10 garbage. What matters is the number of pixels. Once you get past 1200 lines or thereabouts, it's all gravy. I'm happily using two 16:9 displays, a pair of Dell U2711, and I'm well pleased with that. The extra cost to get an additional 160 lines from a 16:10 30" screen just isn't worth it.

Re:No (0)

Anonymous Coward | about a year and a half ago | (#43596567)

But would you really reject a 2560x1440 display because it's 16:9?

There are 2560x1600 monitors, so, yes.

Re:No (2, Insightful)

Keith Mickunas (460655) | about a year and a half ago | (#43596633)

Why? Why does 16:10 make a difference at that resolution? I mentioned the 2560x1600 displays, but you know what, they cost hundreds more and they have lower pixel density. The premium for 160 pixels is 30% or more; hell, with Dell on Amazon right now it's 50% more.

What exactly are people doing that requires 16:10? I've used 'em, I like 'em, but I'll take 2560x1440 over 1920x1200 any day of the week. Likewise I'll take 3840x2160 over 2560x1600.

If the premium for 16:10 was in the neighborhood of 10-15% for the same pixel density, then yes, it's worthwhile. Otherwise, what's the big deal?

Re:No (2)

Rakarra (112805) | about a year and a half ago | (#43596881)

What exactly are people doing that requires 16:10?

There's really only one advantage that 16:9 has over 16:10, and that's smaller black borders (or no borders at all) for widescreen video content. Otherwise, the vertical real estate is very nice to have, and I've found 2560x1600 (which I've used for the last 5 years) somehow really hits the sweet spot between vertical size and widescreen.

Re:No (2)

Skapare (16644) | about a year and a half ago | (#43596893)

16:10 tends to work out better for office work. Sure, the higher res makes it less important. But it's the physical size that makes it less important ... depending on how much space you have to push the display back.

But once you get up close to the holy grail of true 4K, which is 4096, why even bother with 3840? Digital cinema is shot at 4096 (and up). 3840 should be boycotted or even banned.

Re:No (2)

Keith Mickunas (460655) | about a year and a half ago | (#43596933)

I wish they'd be honest about those resolutions. It is annoying that 3840 is being advertised as 4k. Clearly it's not. Sure it's double 1080p, but 1080p ain't 2k, so 3840 shouldn't be called 4k.

It's also annoying that they put 1080p and 2k in the graph, then just labeled this new display 4k. Dammit, so close, they acknowledged 2k as a valid format, but ignored real 4k.

Re:No (1)

epyT-R (613989) | about a year and a half ago | (#43596975)

because then you're wasting visible space. 16:10 takes advantage of more vertical space in your field of view.

Re:No (1)

ewibble (1655195) | about a year and a half ago | (#43596987)

Yes it does. Resolution isn't everything; area (real screen size) is also important. A 16:10 screen has 5% more area than a 16:9 monitor of the same diagonal size, and a 4:3 has 12% more.
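Those percentages check out; a quick sketch of the same-diagonal area comparison:

import math

def area_for_diagonal(aspect_w, aspect_h, diagonal=1.0):
    scale = diagonal / math.hypot(aspect_w, aspect_h)
    return (aspect_w * scale) * (aspect_h * scale)

base = area_for_diagonal(16, 9)
for name, w, h in [("16:10", 16, 10), ("4:3", 4, 3)]:
    extra = (area_for_diagonal(w, h) / base - 1) * 100
    print(f"{name}: ~{extra:.0f}% more area than 16:9 at the same diagonal")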

16:9 is cheaper than 16:10 to make (1)

aleator (869538) | about a year and a half ago | (#43596589)

16:10 was the standard before the industry decided that 16:9 is actually cheaper to produce http://mybroadband.co.za/news/hardware/17621-widescreen-monitors-where-did-1920x1200-go.html [mybroadband.co.za] http://www.displaysearch.com/cps/rde/xchg/SID-0A424DE8-28DF6E59/displaysearch/hs.xsl/070108_16by9_PR.asp [displaysearch.com]

Re:16:9 is cheaper than 16:10 to make (1)

SolitaryMan (538416) | about a year and a half ago | (#43596667)

Yep, it is pretty easy: making the ratio the same as in TVs lets them save money through economies of scale.

Re:No (0)

Anonymous Coward | about a year and a half ago | (#43596615)

But would you really reject a 2560x1440 display because it's 16:9?

No, but I'd still reject a 2560x1440 display if I had to pay for it, because it's an expensive overkill when all you want is something comparable to an ordinary everyday WUXGA display from a decade ago. But that's what you need to go past 1200 lines and into your gravy territory, because 1920x1080 killed 1920x1200.

Fuck 16:9.

Re:No (2)

Keith Mickunas (460655) | about a year and a half ago | (#43596685)

I heard at one time that 16:10 came out of the video editing industry. Basically they were working on 16:9 video, so they had displays with extra space at the bottom for controls. These displays then were adapted to the higher end computer market. However once 16:9 displays were being manufactured in large quantities for consumer TVs, I imagine that drove the price down for manufacturing 16:9 computer monitors. I'm fairly certain the decision to use 1920x1080 in the TV industry had nothing to do with computer displays.

But what does the ratio really matter? Isn't it about the pixels? I used a Samsung 24" 1920x1200 for a long time, and I loved it. But I'd seen higher pixel counts, and I wanted it. So when I had the spare cash last year the decision for me was 1 30" display or spend a slight bit more and go for 2 27" displays. I don't care about the ratio, I got plenty of height now. That's what matters.

Re:No (1)

fredgiblet (1063752) | about a year and a half ago | (#43596795)

Ratio does matter: more vertical space reduces scrolling in documents and web pages, gives more space for content creation that isn't widescreen-formatted itself (like making square or portrait-oriented art), and is beneficial for some games (most non-first-person games). The only real benefit that I can think of for 16:9 over 16:10 is no letterboxing, and I'd gladly trade that for the benefits of 16:10 (or even 4:3; I can watch movies on my TV if the letterboxing is going to be that big of a deal).

Re:No (1)

EvanED (569694) | about a year and a half ago | (#43596907)

The only real benefit that I can think of for 16:9 over 16:10 is no letterboxing

What I really like about wide screen is that it makes it a lot more helpful to have two windows open side by side, e.g. code and a command prompt or code and documentation. You can do that with 4:3 if you have enough horizontal pixels, but my feeling is that narrower than about 1600 things start to go downhill quickly. (I am usually pretty comfortable with my 1680-wide display.)

E.g., the monitor I'm writing this on (sort of -- more in a sec) is 1680x1050 (granted, this is 16:10), or 1.764 megapixels. A 4:3 display with the same number of pixels would be 1534x1150. If I had to choose between those two resolutions, I'd actually rather have the 1680 widescreen.

Re:No (1)

EvanED (569694) | about a year and a half ago | (#43596911)

Oh look at my wonderful reading comprehension skills. You're comparing 16:9 to 16:10, not widescreen to 4:3.

Carry on, and ignore what I said completely. :-)

Re:No (1)

Keith Mickunas (460655) | about a year and a half ago | (#43596909)

You're half right. More vertical space is great, but the ratio doesn't matter. For the work you are talking about what matters is the height.

There's a funky Dell out now that has an even wider aspect ratio; it's around 2.33:1, I believe. Now I'd like that, if it had more pixels. It's 2560x1080. That's a detriment. But imagine if that was more along the lines of 3350x1440. Would you still complain that was too wide? You could have three documents, web pages, whatever up, side by side, and still have a lot of vertical space.

Overall resolution and pixel density are what's important. Far more so than ratio. Hell, if height is the only thing that matters to you then get the biggest monitor you can afford and turn it on its side.

Frankly I want both, height and width. But if the cost for a wee bit more height is substantial, well as long as I have enough vertical resolution I'm not going to get too caught up on the ratio. 1440 is great for me. Would I like 1600? Sure. Would I rather spend that $300+ on something else? You bet.

Re:No (1)

fredgiblet (1063752) | about a year and a half ago | (#43597073)

But what if I'm reading ONE website instead of three? It's true that if the screen is tall enough the ratio doesn't matter, but getting that height costs more horizontal desk space that I shouldn't have to give up and in a world where both ratios were widely produced getting the height from a 16:9 display would cost more. Also if I'm gaming the higher resolution can just as easily be a detriment, requiring more or higher-end video cards to get the other quality options up.

Resolution and pixel density are very important, but for the purposes of our discussion they're similar anyway, most consumer-grade monitors are in the 100 ppi range and will probably stay there for the foreseeable future, and the overall resolution of a 16:10 display is usually higher than a 16:9 display.

Height isn't the only thing that matters, but given similar resolution and ppi height starts moving up the list. I'd love for more monitors to have the pivot option to switch to portrait mode when I'm surfing then swap it to landscape for gaming and movie watching, but that's usually a significant premium as well.

Re:No (1)

bhcompy (1877290) | about a year and a half ago | (#43597053)

Yea, no letter boxing because on a 16:9 screen you don't have the real estate. The size of the picture is the same, it's just that you have a black banner because you have more screen to work with.

Re:No (0)

Anonymous Coward | about a year and a half ago | (#43597025)

16:10 approximates the golden ratio.

Additionally, your web site is throwing a 404.

Re:No (1)

fredgiblet (1063752) | about a year and a half ago | (#43596643)

Would I reject it? No. But I'd look to trade it for a 16:10 monitor ASAP. Some games (particularly strategy games) are better with taller screens.

Re:No (1)

epyT-R (613989) | about a year and a half ago | (#43596951)

Those are your preferences.. I think anything over 24" is useless for desktop use as it requires too much neck panning and eyeball rolling. It's not just the number of pixels that matters, it's how many can be crammed into your visual range at a time. I'd like to see a 3840x2400 panel in 23-24", 120hz or better, no input lag/ghosting, and deep color support. Of course, this is unobtanium along with the gpu to drive it well, but everyone has different priorities.

Re:No (0)

Anonymous Coward | about a year and a half ago | (#43597089)

I think anything over 24" is useless for desktop use as it requires too much neck panning and eyeball rolling

Depending on what you're doing, neck panning and eyeball rolling can still be way faster and easier than scrolling.

For instance, I've been occasionally working on a little side project which is editing together a bunch of clips from a game soundtrack to try to create a single coherent track. And on my 23" 1080p monitor, a lot of the time it is absolutely maddening to work on. If I have the monitor in its normal orientation, I can't fit all of the tracks on the screen at a comfortable size. If I turn it to portrait mode, it doesn't show enough span of time.

I'm going to be taking an actual job in a few months (finally ending my time as a student) and I honestly won't be at all surprised if I wind up buying some 30", 2560x1600 monitor, to a non-trivial extent on account of this project. (Though also because it'd be awesome. :-))

(OK, my monitor is not quite 24", but it's close enough for this argument. 23" doesn't seem to merely be "not quite good enough" but more... "not even nearly good enough".)

Part of the above is pixel size, but that's by no means all of it. I can resize the tracks vertically so that they all fit on screen, for instance... but then they're too small. Not just too low resolution, but actually too small, physically. So if I want to do something where I need to see a bunch of tracks at once, I have to go and resize them all down. Then if I want to do something that requires a good view of the wave form, I have to expand a couple back out. It's a pretty maddening process. (It would be made much less maddening if Audacity had sort of "preset views" so I could hit a couple buttons and fit all of them to screen, then hit another couple buttons and expand them back out; but that's not a solution so much as a workaround.) A super-high res display that's still in the 23-24" range wouldn't help a ton with that.

The other thing is that I actually have two monitors; I have a 23" 1080p one in landscape mode, and next to it I have a 1680x1050 22" monitor in portrait orientation. Together they form a field which is far larger than that 20" monitor. I don't really find the looking around very annoying at all.

Finally, I'll also say that this all depends of course on how far from your face you keep your monitor. Mine are probably a little more than 2 feet away, and I wish they were a bit more except that my desk isn't deep enough. If you keep them closer, then yes, a smaller screen would fill your view more quickly.

Re:No (2)

mjwx (966435) | about a year and a half ago | (#43597409)

But would you really reject a 2560x1440 display because it's 16:9?

I'd happily pay a few dollars more for a 2560x1600 display because it is 16:10. 16:10 displays are superior to 16:9 for almost all computing purposes. For games it gives me a taller FOV; for work it's exactly 2 A4 pages side by side and gives me a taller view (yes, an extra 4 cm really does make a difference when working on a large spreadsheet, config file or script); with video editing you can have the tools on screen without overlaying them on the video.

16:9 monitors are cheap. I generally recommend them if you don't do anything on your computer that requires a taller FOV (i.e. most users). If all you do is web browsing and watching cat videos, this difference means nothing; if you do serious work then it does.

People need to get over this 16:9 vs. 16:10 garbage.

My laptop is 16:9; my desktops (home and work) are 16:10.

This is mainly due to availability: it's easy to get cheap 16:10 monitors, hard to get 16:10 laptops.

Re:No (0)

Anonymous Coward | about a year and a half ago | (#43596569)

Fuck almost 400 dollars for a monitor.

Re:No (2)

ebh (116526) | about a year and a half ago | (#43596781)

I just took a 21" CRT to the recycling place. In 1995, it cost about $2200 new. In 2001, my employer gave it to me as scrap when our building was closed and they decided that a lot of that stuff was cheaper to give away than to move to some warehouse across the country. (Plus it was a tiny bit of good will that the local management could show the laid-off employees when the Big Guys were being callous pricks kicking us to the curb while we were still going to 9/11 funerals.)

ObTopic: $400 is an expensive monitor these days, but it wasn't that long ago that $400 wouldn't buy you a usable SVGA monitor.

Re:No (1)

Keith Mickunas (460655) | about a year and a half ago | (#43596853)

I bought a Gateway open-box 21" monitor back in the late 90's for about $1k. I think it could do 1600x1200, but it wasn't real solid at that. That was one heavy beast to move around. I got rid of it some time ago, don't remember how. I got a Dell 19" Trinitron from one employer in 2000. That was sweet, although it had those two strange horizontal lines, but other than that the image was solid. Eventually its color started to go though. I also bought a 17" NEC monitor in '95 or '96 for just under $700 on sale at Best Buy. Checked Computer Shopper and saw that was about $300 off list. Ah, those were the days.

Of course those CRTs didn't have the viewable size they were listed at. So that 21" was probably under 20". My LCDs are all within a fraction of an inch of what they claim. And of course they weigh a fraction of what those old CRTs did.

Re:No (2)

EvanED (569694) | about a year and a half ago | (#43596985)

I got a Dell 19" Trinitron from one employer in 2000. That was sweet, although it had those two strange horizontal lines,

That was true of all Trinitron monitors. Here is what Wikipedia says [wikipedia.org] about them:

Even small changes in the alignment of the grille over the phosphors can cause the coloring to shift. Since the wires are thin, small bumps can cause the wires to shift alignment if they are not held in place. Monitors using this technology have one or more thin tungsten wires running horizontally across the grille to prevent this. Screens 15" and below have one wire located about two thirds of the way down the screen, while monitors greater than 15" have 2 wires at the one-third and two-thirds positions. These wires are less apparent or completely obscured on standard definition sets due to larger scan lines of the video being displayed. On computer monitors, where the lines are much closer together, the wires are often visible. This is a minor drawback of the Trinitron standard which is not shared by shadow mask CRTs.

Re:No (0)

Anonymous Coward | about a year and a half ago | (#43596573)

One of the drawbacks to high end graphics has been the lack of low cost and massively-available displays with a resolution higher than 1920x1080.

Here's a $550 display that gives you an entire 230,400 extra pixels for only double the price!!!11onetwo

I think you missed something there.

Re:No (0)

Anonymous Coward | about a year and a half ago | (#43596595)

While technically correct it almost seems like you went out of your way to miss the point. Oh well, welcome to Slashdot.

Re:No (0)

Anonymous Coward | about a year and a half ago | (#43597203)

Samsung S24A450BW-1 here. 16:10 goodness. I'll wait on 4k till I can get 3840x2400.

Inevitable, mr... (0)

Anonymous Coward | about a year and a half ago | (#43596515)

If the displays are cheap enough to make for the consumer market it's just a matter of time for PC. After all, 1080 pretty much killed 1200 ;)

On a side note, I wonder how much work would be needed to get current cards rendering 4k Surround/Eyefinity. I also wonder if it even matters with everything* being a console port these days...

*most. :)

Re:Inevitable, mr... (1)

realityimpaired (1668397) | about a year and a half ago | (#43597085)

On a side note, I wonder how much work would be needed to get current cards rendering 4k Surround/Eyefinity.

Buy the monitors and cables, and hook them up? My 6970 has 2 DisplayPort outputs, each of which can support up to 4 monitors with the correct cable/splitter. 4K would only take two monitors on each, and the 2-way splitters are fairly easy to get your hands on. I don't even need the splitters, as I also have 2 DVI outputs on the card, so I can drive two monitors with DP, and two with DVI (and I have never seen a monitor that supports DP and doesn't support DVI).

The card can easily handle that resolution, as long as I turn the anti-aliasing down. I can leave everything else on max, and just set the AA to 2X or off (which I usually do anyway). Just that there isn't a lot of point with most games: the UI isn't designed to be split across multiple physical displays. This is why, even though I have multiple monitors connected, I play in windowed mode on a single 24" 1920x1080 display, and keep browser/chat/everything else on the other display, rather than bothering with eyefinity.

Meh (0, Interesting)

Anonymous Coward | about a year and a half ago | (#43596519)

For as far as I sit from my monitor, and as good as my eyesight really is, this is just overkill.

The whole resolution thing is starting to look like a Monster Cable dick-wagging contest. Does it matter? Yeah. Does it matter to a human with normal eyesight? Probably not... It reminds me of when video card makers used to brag about 16.7 million colors. Too bad their users could only see about 100k of them.

Re:Meh (1)

olsmeister (1488789) | about a year and a half ago | (#43596535)

Would I give up a kidney to get 4k in my house? No. Do I want to see other people spend a ton of their money on it to move the technology forward and bring down the cost so maybe someday I can afford it? Hell yeah!

Re:Meh (1)

Keith Mickunas (460655) | about a year and a half ago | (#43596579)

That's really the important thing. We've been stuck in a rut with display sizes for a long, long time. It's time to move pixel density forward. The 27" displays that have been on the scene for the past 2 years or so are great, but so far the price hasn't dropped a great deal (disregarding the generic Korean Dell/Apple rejects).

Once 4K TV production ramps up that should lead to more higher density monitors at reasonable prices. Sadly I have to admit that it really seems like Apple was the company that pushed forward into higher density displays for smaller devices. Fortunately other companies picked up on that pretty quickly. Once people get used to those kind of displays on their tablets and phones, they're going to want something similar on their desktops and laptops.

Re:Meh (1)

CODiNE (27417) | about a year and a half ago | (#43596701)

Oh yeah, I love using HiDPI mode on my 27" iMac to turn a 2560x1440 display into a virtual 1280x720 screen with twice the detail. This lets me sit way back, 3-5 feet or more, and have a nice readable picture. Helps avoid eye strain, and it's really nice how crisp everything is. Of course 1280x720 limits usable screen space and I occasionally have to switch it back, but I really do prefer to use it whenever possible.

It's sort of the opposite of what a true retina iMac would do for me though.

Re:Meh (1)

EvanED (569694) | about a year and a half ago | (#43596709)

Sadly I have to admit that it really seems like Apple was the company that pushed forward into higher density displays for smaller devices.

Maybe they did it more successfully, and I'll also definitely grant them the MacBook Retina. But IMO they were late to the high-res game in the smartphone market. By the time the iPhone 4 was released in June 2010 with its 326ppi screen, there were multiple phones with reasonably high resolution on the market. I have a Nokia N900, released in Nov 2009 with a 267ppi (800x480) screen; one of my friends actually bought one of the Neo Freerunners, released in July 2008 (almost 2 years before the iPhone's retina display) with a 286ppi 480x640 screen. And those are just the two I know about. (Disclaimer: my friend's opinion was that the Freerunner actually didn't make a very good phone; it seemed like more of an experiment with their open hardware design than anything.)

Yes, the iPhone 4's display is a little higher resolution than those, but compared to the 3GS's 163ppi, they get you most of the way to the retina.

Re:Meh (5, Insightful)

ChrisMaple (607946) | about a year and a half ago | (#43596553)

For people who do technical work with a computer, the ability to have several high definition windows open at once is a tremendous benefit. Integrated circuit design, programming, CAD graphics, etc.

Re:Meh (-1)

Anonymous Coward | about a year and a half ago | (#43596661)

You smart mouthing me, cunt? You saying I don't do anything technical with my computer?
 
You're a faggot.

Re:Meh (-1)

Anonymous Coward | about a year and a half ago | (#43596841)

You smart mouthing me, cunt? You saying I don't do anything technical with my computer?

You're a faggot.

*im* saying that you are either a troll, or an ignorant stupid idiot who cant comprehend simple english or otherwise understand the intent behind a persons words.

Re:Meh (1)

fredgiblet (1063752) | about a year and a half ago | (#43596711)

True enough, but you're talking about a niche. There's niche uses for tons of technology that has no place on the average consumers desk (Like hex-core procs with hyper-threading, $1k video cards and motherboards with 10 SATA ports).

Niche? (0)

Anonymous Coward | about a year and a half ago | (#43596935)

Almost every person sitting at a computer in offices everywhere would benefit. If this is a niche, it's one hell of a big one!

Re:Niche? (0)

Anonymous Coward | about a year and a half ago | (#43597003)

No, they wouldn't benefit; for what most people do, there is no discernible benefit to a higher resolution.

Re:Niche? (1)

fredgiblet (1063752) | about a year and a half ago | (#43597081)

I would bet most people wouldn't notice the difference unless they were coming from a crappy monitor to begin with. Technically it's superior and they would benefit, but the RoI would be poor.

Re:Meh (1)

vlueboy (1799360) | about a year and a half ago | (#43596765)

For people who do technical work with a computer, the ability to have several high definition windows open at once is a tremendous benefit. Integrated circuit design, programming, CAD graphics, etc.

Did you forget "reading slashdot clearly at *any* zoom level?" ;)

Re:Meh (1)

fredgiblet (1063752) | about a year and a half ago | (#43596681)

It really is. I see people talking about how anything lower than 400 ppi is unacceptable on their phones and I just shake my head. I'm sure there are people who can legitimately tell the difference, but the vast majority of people aren't going to be able to. Meanwhile upgrading to 2560x1440 (or better x1600) may make my games a little prettier, but it's also going to require a third video card and won't do anything for the internet or the DVDs (or even Blu-rays) that I watch. When I get around to upgrading I'll probably end up getting 1920x1200 and leaving it at that for a good long while.

Re:Meh (0)

Anonymous Coward | about a year and a half ago | (#43597055)

Spoken like someone that's still using a 1280x1024 display.

Try 1920x1080 or more sometime, on a monitor no smaller than 26".

Then tell me you don't see a difference.

Necessary, no, but it's damn cool, bro.

Re:Meh (1)

gman003 (1693318) | about a year and a half ago | (#43597317)

No. Just... no.

If I could have my perfect setup, I'd have a 32" 4096x2560 main monitor, with two 27" 2560x1600 monitors to each side. And running each at 144Hz, with full AdobeRGB coverage (or better), while we're at it.

I just bought a 1440p display, and it is hands-down the best single computer component I have ever bought. Better than getting an SSD. Better than any new processor, or new video card, or new sound card, or new RAM.

True, I'm probably never going to watch video at that resolution. And it's likewise overkill for gaming. But you know what? Sometimes I feel like being a productive member of society, and that helps immensely. Dozens of open windows? No problem. Coding? Turn it to portrait mode and I can fit 150 lines on one screen, at standard font sizes.

And eyesight has nothing to do with it - I'm nearsighted as hell, and I would still like as many pixels as possible.

When will we get REAL 4k displays ... (2)

Skapare (16644) | about a year and a half ago | (#43596761)

... like 4096x1728 (digital cinema size plus a few more pixels to make it mathematically right)? Feel free to make the actual LCD pixels a bit smaller so it can all fit in a decent size (not over 80cm, please). Hell, I'd be happy even with 2048x1280 for now so I can avoid the border bumping on 1920x1200.

Well there is a niche... (2)

flayzernax (1060680) | about a year and a half ago | (#43596947)

As I've seen pictures of people's massive 6-monitor setups...

Though as someone who's been a gamer since Duke Nukem... and the Ultima games... I don't see what all the hype is about. The colors ought to be much nicer on a 4K display, but I know I won't be spending money on one until they're dirt cheap or I get one as a gift (which means they'll be dirt cheap by then).

Then again, you can make a pretty game that gets pretty boring pretty fast =) I've played some hideous monstrosities with the worst interfaces known to man just because the actual game was fun.

Re:Well there is a niche... (1)

fredgiblet (1063752) | about a year and a half ago | (#43597267)

I remember seeing a setup someone made with 21 monitors to play Falcon 4.0 (IIRC). They had 180 degrees of monitors horizontally, one below the desk and several above in the middle area.

Re:Well there is a niche... (1)

mjwx (966435) | about a year and a half ago | (#43597465)

As ive seen pictures of peoples massive 6 monitor setups...

There are a few games where a multi-monitor setup is really good. Flight sims in particular where you want 1 or even 2 monitors to your side to display the side windows, maybe one above or one for the instrumentation.

In fact if you're learning to fly, a multi-monitor setup with HOTAS is a godsend.

But so few games actually support multi-monitor setups. So for the most part they are just e-peen extensions.

3D TV??? (1)

Jane Q. Public (1010737) | about a year and a half ago | (#43597201)

Please put spaces around your "/".

25x16/25x14 is 3 dimensions.

Resolution? WHY? (1)

will.perdikakis (1074743) | about a year and a half ago | (#43597367)

Honestly, resolution is the LAST thing that we need updated in the display frontier. Work on improving contrast ratios, widening the color gamut to the full capability of the human eye (obviously, the content must match here), reducing pixel response time (LCD), reducing input lag, and eliminating gamma, lighting, and contrast variance throughout the panel. I would take a 1080p display with a 10% improvement in contrast ratio over a 4K TV any day.

Re:Resolution? WHY? (0)

Anonymous Coward | about a year and a half ago | (#43597611)

I would take a 1080p display with a 10% improvement in contrast ratio over a 4K TV any day.

I wouldn't. I have a 1680x1050 panel at home. Why would I move to a 1920x1080 panel? That's 30 more pixels vertically. Since most of the time I'm coding, I have 2-3 code windows tiled horizontally. 1680 pixels is more than enough for slightly more than 80 columns per terminal and a usable font size.

A 1080p monitor gives me nothing except the ability to watch high-def video without scaling. Since I don't do that on my PC anyway, I don't see the point. Now, if you could give me a proper 1920x1200 monitor, that would be a few more vertical lines, and that is better.

Re:Resolution? WHY? (1)

will.perdikakis (1074743) | about a year and a half ago | (#43597653)

I was strictly talking about picture quality, which will no doubt be the marketing angle of 4K.

Your complaint is actually one of aspect ratio, and since 1080p and 4K UHD are both 16:9, I do not see how 4K would be an improvement for you on the same size panel; you still need to be able to read the text.

On a side note, have you considered rotating your monitor? I have a 1680 x 1050 in portrait mode that is great for academic/professional production.

problem for usa-ers (1)

iggymanz (596061) | about a year and a half ago | (#43597389)

Feeding a 4K screen is beyond the bandwidth of the pathetic "broadband" most of us can get.
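For a sense of scale, here is a rough sketch of the numbers (8-bit RGB, no chroma subsampling, before any compression), compared with the ~100 Mbps sample files mentioned in the summary:

def raw_video_mbps(width, height, fps, bits_per_pixel=24):
    # Uncompressed bandwidth in megabits per second
    return width * height * fps * bits_per_pixel / 1e6

raw = raw_video_mbps(3840, 2160, 60)
print(f"Uncompressed 4K60: ~{raw / 1000:.1f} Gbit/s")
print(f"Compression needed to hit ~100 Mbit/s: ~{raw / 100:.0f}:1")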

So for only $2500 (1)

Osgeld (1900440) | about a year and a half ago | (#43597493)

I can play games optimized for 2005-era Xbox 360 hardware in 4K, when at 1280x1024 I can already see the 128x128 textures clear as day?

What's my motivation here?

~100 mbps (0)

Anonymous Coward | about a year and a half ago | (#43597641)

How long will it take when you are downloading from YouTube at 100 millibits per second? That's 1 bit every 10 seconds. Why would anyone want to download anything that slowly? Couldn't you get a 300 baud modem? It's 3000 times as fast!
