
GPUs Keep Getting Faster, But Your Eyes Can't Tell

timothy posted about 9 months ago | from the at-least-my-eyes-can't dept.


itwbennett writes "This brings to mind an earlier Slashdot discussion about whether we've hit the limit on screen resolution improvements on handheld devices. But this time, the question revolves around ever-faster graphics processing units (GPUs) and the resolution limits of desktop monitors. ITworld's Andy Patrizio frames the problem like this: 'Desktop monitors (I'm not talking laptops except for the high-end laptops) tend to vary in size from 20 to 24 inches for mainstream/standard monitors, and 27 to 30 inches for the high end. One thing they all have in common is the resolution. They have pretty much standardized on 1920x1080. That's because 1920x1080 is the resolution for HDTV, and it fits 20 to 24-inch monitors well. Here's the thing: at that resolution, these new GPUs are so powerful you get no major, appreciable gain over the older generation.' Or as Chris Angelini, editorial director for Tom's Hardware Guide, put it, 'The current high-end of GPUs gives you as much as you'd need for an enjoyable experience. Beyond that and it's not like you will get nothing, it's just that you will notice less benefit.'"


There are other applications (1)

RunFatBoy.net (960072) | about 9 months ago | (#45294621)

Aren't there other areas of science that a faster GPU benefits, namely structural biology and the modeling of proteins?

-- Jim
Your website could be better. Getting weekly feedback [weeklyfeedback.com] is a good starting point.

Re:There are other applications (1)

K. S. Kyosuke (729550) | about 9 months ago | (#45294857)

And perhaps natural language processing, to put forward my pet peeve.

Re:There are other applications (4, Informative)

exomondo (1725132) | about 9 months ago | (#45294867)

These are often marketed as GPGPU products, nVidia's Tesla for example, rather than taking a bunch of Geforces and putting them together.

Re:There are other applications (1)

MrHanky (141717) | about 9 months ago | (#45294875)

I'm not sure I'd call a 27" monitor an area of science, but it does benefit from today's faster GPUs.

Re:There are other applications (1)

Anonymous Coward | about 9 months ago | (#45294955)

Let's not even mention three- and five-panel full HD or 1920x1200 setups, three-panel 2560x1440 (yay for cheap Korean IPS), or that 60Hz 3840x2160 displays are dropping in price quickly...

Re:There are other applications (5, Informative)

Anonymous Coward | about 9 months ago | (#45294927)

Aren't there other areas of science that a faster GPU benefits, namely structural biology and the modeling of proteins?

Even ignoring that, the guy is a fucking idiot.
He seems to be confused about the function of a GPU: it is doing far more than simply pushing pixels onto the screen. Wake up, buddy, this isn't a VGA card from the '90s. A modern GPU does a holy shitload of processing and post-processing on the rendered scene before it ever gets around to showing the results to the user. Seriously, man, there's a reason why our games don't look as awesomely smooth and detailed and complex as a big-budget animated film: to get that level of detail, yes, on that SAME resolution display, you need a farm of servers crunching the scene data for hours or days. Until I can get that level of quality out of my desktop GPU, there will always be room for VERY noticeable performance improvement.
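
To put rough numbers on that gap between offline film rendering and a real-time frame budget, a quick Python sketch (the hours-per-frame farm time is an illustrative assumption, not a measurement):

    # Offline film rendering vs. a real-time frame budget, very roughly.
    # The 2-hour-per-frame farm time is an assumed figure for illustration.

    realtime_budget_s = 1 / 60        # 60 fps target: ~16.7 ms per frame
    offline_frame_s = 2 * 60 * 60     # assume ~2 hours of render-farm time per film frame

    ratio = offline_frame_s / realtime_budget_s
    print(f"real-time budget: {realtime_budget_s * 1000:.1f} ms per frame")
    print(f"offline rendering spends roughly {ratio:,.0f}x more time per frame")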

Re:There are other applications (4, Insightful)

Score Whore (32328) | about 9 months ago | (#45294961)

Not to mention that the world hasn't standardized on 1920x1080. I've got half a dozen computers and tablets, and the only one that is 1080p is the Surface Pro. The MacBook Pro with Retina Display is 2880x1800. Both of my 27" monitors are 2560x1440. I don't have any idea what this dipshit is thinking, but his assumptions are completely wrong.

Re:There are other applications (4, Funny)

TechyImmigrant (175943) | about 9 months ago | (#45294985)

I wouldn't let a 1920x1080 monitor grace my cheap Ikea desk.

Re:There are other applications (2)

Anonymous Coward | about 9 months ago | (#45295107)

That's why I have my old 1920x1200 panel on an arm off the wall. TECHNICALLY it isn't touching the desk.

Re:There are other applications (2)

acid_andy (534219) | about 9 months ago | (#45295195)

16:10 FTW!

Re:There are other applications (2)

icebike (68054) | about 9 months ago | (#45295231)

I agree, the guy must live in the ghetto, using recycled year 2000 equipment.
For his crapstation maybe he wouldn't get any benefit from a faster GPU.

Monitors are getting bigger all the time, and the real estate is welcome when editing mountains of text (as is a monitor that can swivel to portrait). 1920x1080 is not sufficient for a large monitor. I don't have any that are that limited. People aren't limited to one monitor either.

Re:There are other applications (1)

Nyder (754090) | about 9 months ago | (#45295199)

Aren't there other areas of science that a faster GPU benefits, namely structural biology and the modeling of proteins?

-- Jim
Your website could be better. Getting weekly feedback [weeklyfeedback.com] is a good starting point.

Benefits the Bitcoin Miner malware makers...

Re:There are other applications (2)

aussie.virologist (1429001) | about 9 months ago | (#45295279)

Aren't there other areas of science that a faster GPU benefits, namely structural biology and the modeling of proteins?

Absolutely, I run complete atomistic molecular dynamics simulations of viruses that cause disease in humans (enterovirus simulations around the 3-4 million atom mark). Five years ago I had to use a supercomputer to model 1/12 of a virus particle, which barely scraped into the nanosecond range. I'm now able to run complete virus simulations on my desktop computer (Tesla C2070 and Quadro 5000), where I get 0.1ns/day, or on my 2U rack (4x Tesla M2090) with 2 viruses running simultaneously at almost 0.2ns/day. That's using the last generation of nVidia cards (Fermi); I should in theory be able to almost double that with the new Kepler cards. I will be VERY interested to see how the next Maxwell architecture pans out in the future. I can see a time in the not too distant future when I can model multiple instances of virus-drug interactions on-site here in the lab and get results overnight that I can compare with our "wet lab" results. I use NAMD for the simulations, which works well with the CUDA cards.
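
Taking the throughput figures above at face value, a quick Python sketch of what they mean in wall-clock terms (the Kepler line simply doubles the Fermi number, as hoped above; it is not a benchmark):

    # Days of wall-clock time to accumulate a 100 ns trajectory at each quoted rate.

    target_ns = 100
    rates_ns_per_day = {
        "Fermi desktop (C2070 + Quadro 5000)": 0.1,
        "2U rack, 4x Tesla M2090":             0.2,
        "Kepler desktop (assumed ~2x Fermi)":  0.2,
    }

    for setup, rate in rates_ns_per_day.items():
        days = target_ns / rate
        print(f"{setup}: {days:,.0f} days (~{days / 365:.1f} years)")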

Let's not forget (5, Insightful)

geekoid (135745) | about 9 months ago | (#45294623)

they need to handle more stuff happening on the screen.

Re:Let's not forget (5, Insightful)

infogulch (1838658) | about 9 months ago | (#45294815)

Exactly, the games themselves have been pared down to fewer objects because our older cards couldn't handle it. Now there are new cards and people expect games that can use that horsepower to be available instantly? Sounds unreasonable to me.

When your graphics card can handle 3x 4K monitors at 120Hz and 3D while playing a game with fully destructible and interactable environments (not this weak-ass pre-scripted 'destruction' they're hyping in BF4 & Ghosts) the size of New York City without breaking a sweat, the bank, or the power bill, THEN you can talk about the overabundance of GPU horsepower.

Re:Let's not forget (1)

Anonymous Coward | about 9 months ago | (#45294847)

and 3D

Ew, no.

Re:Let's not forget (2)

infogulch (1838658) | about 9 months ago | (#45294941)

Well, I just wanted an excuse for my 120Hz requirement.

VR (0)

Anonymous Coward | about 9 months ago | (#45294643)

What about things like Oculus and rendering with 2 separate cameras?

Now (3, Interesting)

Zeroblitzt (871307) | about 9 months ago | (#45294655)

Make it draw less power!

Re:Now (1)

Anonymous Coward | about 9 months ago | (#45294721)

They are, slowly. I just replaced my AMD HD4890 with an HD7950, and my power consumption under normal load dropped by about 50 watts.

Re:Now (0)

Anonymous Coward | about 9 months ago | (#45294737)

They have been. You can run two of them in CrossFire on a sub-500-watt PSU, whereas back in, say, 2006 you'd have been insane to attempt that.

Re:Now (4, Interesting)

afidel (530433) | about 9 months ago | (#45294763)

They are. You can get very playable framerates at 1080p using a nearly passively cooled card (the next shrink will probably make it possible with a completely passive card). Hell, my new gaming rig draws under 100W while playing most games; my previous rig used over 100W just for the graphics card.

Re:Now (1)

dgatwood (11270) | about 9 months ago | (#45295309)

Make it draw less power!

In a modern architecture with proper power management, increasing the speed of the chip often does exactly that. If you graph power consumption over time, the total power consumption is the area under the line. Thus, if you make a chip that takes twice as much instantaneous power while it is active, but can do the work in a third as much time, then as long as you properly power down that extra hardware while it is idle, you're using two-thirds as much power when averaged out over time.

This assumes that the idle power of the faster chip doesn't increase so much that you make up the difference during long periods of inactivity, of course. Reducing idle power is almost invariably an important goal. :-)
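
A minimal Python sketch of that race-to-idle arithmetic, using the comment's 2x-power / 3x-speed numbers and an assumed small idle draw:

    # Energy is the area under the power-vs-time curve. Compare a slow chip with
    # a faster one that draws twice the active power but finishes in a third of
    # the time, then idles (at an assumed low power) for the rest of the window.

    def energy_joules(active_w, active_s, idle_w, total_s):
        return active_w * active_s + idle_w * (total_s - active_s)

    total_s = 3.0   # window in which the work must be done
    idle_w = 0.5    # assumed idle power for both chips, in watts

    slow = energy_joules(active_w=10, active_s=3.0, idle_w=idle_w, total_s=total_s)
    fast = energy_joules(active_w=20, active_s=1.0, idle_w=idle_w, total_s=total_s)

    print(f"slow chip: {slow:.1f} J, fast chip: {fast:.1f} J")
    print(f"fast/slow energy ratio: {fast / slow:.2f}")  # approaches 2/3 as idle power -> 0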

we will need those phat GPUs: (2, Insightful)

Anonymous Coward | about 9 months ago | (#45294661)

-Multimonitor gaming
-3D gaming (120 Hz refresh rate or higher)
-4K gaming

keep em coming, and keep em affordable!

Re: we will need those phat GPUs: (-1)

Anonymous Coward | about 9 months ago | (#45295059)

Fake 3D is for idiots. Quit wasting your money on stupid fake shit and save it for holographic 3D.

Err, wha? (4, Insightful)

Anonymous Coward | about 9 months ago | (#45294663)

One thing they all have in common is the resolution.

So 2560x1440 and 2560x1600 27"s only exist in my imagination?

Re:Err, wha? (1)

halfEvilTech (1171369) | about 9 months ago | (#45294899)

One thing they all have in common is the resolution.

So 2560x1600 27"s only exist in my imagination?

um yes... those would be the 30" models...

Re:Err, wha? (2, Informative)

Anonymous Coward | about 9 months ago | (#45295001)

...which also don't exist according to TFA and TFS:

Desktop monitors (I'm not talking laptops except for the high-end laptops) tend to vary in size from 20 to 24 inches for mainstream/standard monitors, and 27 to 30 inches for the high end. One thing they all have in common is the resolution. They have pretty much standardized on 1920x1080. That's because 1920x1080 is the resolution for HDTV, and it fits 20 to 24-inch monitors well.

Re:Err, wha? (0)

petteyg359 (1847514) | about 9 months ago | (#45295113)

I prefer my 2560x1600 screens in 10" form factor [samsung.com] . 27"+ needs to be at least 3840x2160.

Re:Err, wha? (0)

Anonymous Coward | about 9 months ago | (#45294957)

Yes, take your pills

Your eyes... (5, Insightful)

blahplusplus (757119) | about 9 months ago | (#45294665)

... can certainly tell. The more onscreen objects there are, the more slowdown there is. This is why I like sites like HardOCP that look at MIN and MAX framerates during a gameplay session. No one cares that a basic non-interactive timedemo gets hundreds of frames per second; what matters is the framerate floor while actually playing the game.
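
A small Python sketch of why the floor matters more than the average (the frame times are made-up illustrative values, not measurements):

    # Average fps can look fine while a few long frames cause visible stutter.

    frame_times_ms = [8, 9, 8, 10, 9, 8, 55, 9, 8, 60, 9, 8]  # illustrative only

    avg_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)
    floor_fps = 1000 / max(frame_times_ms)

    print(f"average fps: {avg_fps:.0f}")        # looks comfortably close to 60
    print(f"framerate floor: {floor_fps:.0f}")  # the hitch you actually feel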

We have a winner (0)

Anonymous Coward | about 9 months ago | (#45294765)

More polygons, for a) better graphics, b) easier graphics creation (from 3D capture), and c) fewer tricks required to keep frame rates up. We're not horribly far from being able to just 3D-capture the real world to create game worlds. This will also allow really interesting special effects that persist for hours instead of seconds.

Re:Your eyes... (1)

intermodal (534361) | about 9 months ago | (#45294811)

Additionally, it's not just about what people are playing now. It's about what people are playing next year and the year after. My two-year-old laptop plays a mean game of Half-Life or Team Fortress Classic. Put me on a Team Fortress 2 map with 20 players, explosions, and flying missiles? 15ms ping rates can't save me from lag at that point; I've got to drop from the server because it's unplayable. Granted, it's a laptop, but TF2 came out years ago.

Re:Your eyes... (5, Insightful)

MatthiasF (1853064) | about 9 months ago | (#45294883)

And don't forget about refresh rates. A 60Hz refresh rate might be the standard, but motion looks a lot better at 120Hz on better monitors.

A higher refresh rate requires a more powerful GPU, no matter the resolution.
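
Rough pixel-throughput arithmetic behind that point, ignoring per-pixel shading cost (a Python sketch):

    # Raw pixels per second and per-frame time budget at common combinations.

    setups = [
        ("1920x1080 @  60 Hz", 1920, 1080,  60),
        ("1920x1080 @ 120 Hz", 1920, 1080, 120),
        ("2560x1440 @ 120 Hz", 2560, 1440, 120),
    ]

    for name, w, h, hz in setups:
        mpix_per_s = w * h * hz / 1e6
        budget_ms = 1000 / hz
        print(f"{name}: {mpix_per_s:5.0f} Mpix/s, {budget_ms:.1f} ms per frame")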

Re:Your eyes... (1)

fuzzyfuzzyfungus (1223518) | about 9 months ago | (#45294973)

"The more onscreen objects there are the more slowdown there is." Even when the framerates are fully in order, that one's a kicker: How did the developers ensure that framerates would be adequate on consoles, and on average PCs? By keeping the amount of stuff on screen down. And so we have pop-in, RPGs where a 'city' has maybe 100 people (spread across multiple areas with lots of clutter to occlude sightlines, and various other deviations from either the realistic or the epic, depending on what the occasion demands.

Arguably (as with physics acceleration for destructible environments) that's the more difficult chicken-and-egg problem: if it's just a matter of how pretty things are, I can make it work on weak hardware, and if you have a nice GPU you can crank up the resolution, anti-alias your little heart out, and set all the draw distance sliders to maximum.

If, however, it simply isn't possible to cater to people I need as customers when my environment has 500-NPC armies clashing, or when castles can be knocked down one stone at a time with realistic friction/leverage/impact, I can't just 'dumb down' for weaker systems; I'd actually need to rebalance the game, since the weak-system version might never have you facing more than 20 NPCs and can't have any puzzles/requirements that involve destructible environments (or I need a whole separate set of game assets with scripted quest-destructibles that just have hit points and 'damaged' textures). It's a different game.

This isn't to say that the effort to support all levels of prettiness is zero; I've no doubt that it isn't. But unless a contemporary rendering engine is downright broken, it should at least be possible for enthusiast gamers to render at substantially higher resolutions, with more AA, longer draw distances, higher-poly models, and higher-res textures at greater distances, without any changes to the core game. The same is not true of changes that require power but are also integral to gameplay. If some people are seeing 'real' backgrounds (with actual NPCs and scenery doing their thing, and the laws of perspective applied) and other people are getting skyboxes and a few low-detail mountains, you can't let either party interact at great distances, or you'll risk changing the game.

Totally wrong (5, Informative)

brennz (715237) | about 9 months ago | (#45294671)

In cutting edge games, FPS still suffers even at low resolutions.

Many users are moving to multi-monitor setups to increase their viewable area, and even cutting-edge graphics cards cannot handle gaming across three 1920x1080 displays in taxing games or applications (e.g. Crysis).

1080p is dildos (0)

SpaceManFlip (2720507) | about 9 months ago | (#45294687)

1920x1080 is the resolution that DUMBASSES have settled upon. Ha ha. My cell phone can do 4k already BOOYAAA!

OK that was just joke-trolling. But I don't agree that we should settle for 1080p regardless. I found myself a $200 deal on a Dell Ultrasharp U2410 for my big system, where the important graphics happen. It does 1920x1200 and it's very nice. I would surely rather have a 27" monitor with the 1600p or whatnot resolution, but money has to be spent on practical things sometimes.

This also sounds like a previous Slashdot discussion about GPUs...

Re:1080p is dildos (1)

silas_moeckel (234313) | about 9 months ago | (#45294829)

If you're dropping $300-$1k on a high-end video card, why would you be driving a $150 1080p monitor off it? 2560x1600 monitors are $350-ish with decent IPS panels.

I'm running three 32-inch 2560x1600 panels on my primary desktop and still want more pixels.

Re:1080p is dildos (1)

Lohrno (670867) | about 9 months ago | (#45294873)

I would like my 2560x1600 display to update at more than 60Hz if possible...

Re: 1080p is dildos (1)

UnknownSoldier (67820) | about 9 months ago | (#45295337)

1080p @ 120 Hz with V-Sync OFF for multiplayer FPS is the gold standard on my GTX Titan.

Seriously? (5, Insightful)

fragfoo (2018548) | about 9 months ago | (#45294693)

For games, GPUs have to process 3D geometry, lighting, shadows, etc. The number of pixels is not the only factor. This is so lame.

Re:Seriously? (1)

Andy Dodd (701) | about 9 months ago | (#45294839)

Yup. GPU power these days isn't about final pixel fill rates; we've had more than enough of that for a while (although keep in mind that many GPUs render at 4x the screen resolution or more to support antialiasing). It's about geometry and effects. Most of the focus in new GPU designs is on improving shader throughput.

Yeah, monitor resolutions aren't changing much - but more GPU horsepower means that you can render a given scene in far more detail.

Think of it this way: even GPUs from the early 2000s had no problem rendering Quake 3 on a 1920x1200 screen. However, good luck rendering something as detailed as Crysis on GPU/CPU combos released even five years after the likes of the GeForce 2 or GeForce 3.
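
A quick Python sketch of the fill-versus-shading point: supersampled anti-aliasing multiplies the shaded samples, and per-sample shader cost multiplies on top of that (the ops-per-sample figures are illustrative assumptions):

    # Shaded samples per second for a 1920x1200 display at 60 Hz, with 4x supersampling.

    w, h, hz = 1920, 1200, 60
    base_samples_per_s = w * h * hz                 # ~138 million samples/s
    ssaa4_samples_per_s = base_samples_per_s * 4    # 4x supersampling quadruples the work

    for label, ops_per_sample in [("simple shader, ~50 ops/sample", 50),
                                  ("heavy shader, ~500 ops/sample", 500)]:
        gops = ssaa4_samples_per_s * ops_per_sample / 1e9
        print(f"{label}: ~{gops:.0f} billion shader ops/s at 4x SSAA")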

Re:Seriously? (1)

ddt (14627) | about 9 months ago | (#45295275)

Agreed. The assertion is totally ridiculous. Smells like Slashdot is getting played by someone who wants to convince the world to buy underpowered GPUs.

Oculus Rift at... (2)

Alejux (2800513) | about 9 months ago | (#45294697)

8K resolution, 120hz. Nuff said.

Nonsense (0)

Anonymous Coward | about 9 months ago | (#45294699)

If you're spending the kind of $$$ to get high-end GPUs, you're also spending the $$$ for 30" monitors with 2560 x 1600 resolution or more.

Dell has a mind-blowing 31.5" monitor coming out [dell.com] that has a resolution of 3840 x 2160.

Re:Nonsense (0)

Anonymous Coward | about 9 months ago | (#45294799)

How exactly is a dell-branded Sharp PN-K321 mind-blowing?

Re:Nonsense (1)

bobbied (2522392) | about 9 months ago | (#45295007)

Price?

Re:Nonsense (1)

TechyImmigrant (175943) | about 9 months ago | (#45295009)

Oooh. Shiny!

Re:Nonsense (1)

Soft Cosmic Rusk (1211950) | about 9 months ago | (#45295187)

So that's lower resolution than the 12 year old, 22.2" IBM T220? Yeah, it's mind-blowing all right. Where did it all go so horribly wrong?

Oh I can tell.... (0)

Anonymous Coward | about 9 months ago | (#45294705)

Without a good GPU this Green Lantern costume [sears.com] looks like an Ace Frehley costume!

Silly (0)

Anonymous Coward | about 9 months ago | (#45294717)

It's not just about drawing a flat image, it's also dealing with all sorts of other stuff (multiple objects, camera angles, etc..).

Throw in there multiple monitors and you can definitely push even a high end card to the point of lag.

Assumptions (5, Interesting)

RogWilco (2467114) | about 9 months ago | (#45294727)

That statement makes the rash assumption that GPUs will somehow continue to grow in speed and complexity while everything around them remains static. What about stereoscopic displays, which would double the number of pixels to be rendered for the equivalent of a 2D image? What about HMDs like the forthcoming Oculus Rift, which over time will need to keep pushing the boundaries of higher-resolution displays? Who on earth thinks the display industry has decided, "welp, that's it, we've hit 1080p, we can all go home now, there's nothing left to do"? 1080p on a 24-inch display is nowhere close to the maximum PPI we can perceive at a normal desktop viewing distance, so why is that the boundary? Why are 24" displays the end? Yes, improving technology has diminishing returns. That's nothing groundbreaking, and using that to somehow suggest that we have peaked in terms of usable GPU performance is just downright silly.
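
Back-of-the-envelope support for the PPI point, assuming roughly 20/20 acuity (about one arcminute per pixel) and a typical desktop viewing distance of around 60 cm (a Python sketch):

    import math

    # Pixel density a ~1-arcminute eye could still resolve at 60 cm,
    # compared with an actual 24" 16:9 1080p panel.

    viewing_distance_in = 60 / 2.54                    # assumed ~60 cm viewing distance
    one_arcmin = math.radians(1 / 60)
    resolvable_ppi = 1 / (viewing_distance_in * math.tan(one_arcmin))

    panel_width_in = 24 * 16 / math.hypot(16, 9)       # width of a 24" 16:9 panel
    actual_ppi = 1920 / panel_width_in

    print(f"resolvable at ~60 cm: ~{resolvable_ppi:.0f} ppi")
    print(f"24-inch 1080p panel:  ~{actual_ppi:.0f} ppi")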

Re:Assumptions (1)

timeOday (582209) | about 9 months ago | (#45294893)

Unfortunately desktop display resolutions have been stagnant for nearly 10 years now. (The Apple 30" was released June 2004).

I was disappointed that the new MacBook Pro has Thunderbolt 2 but does not support 4K displays. I have a Dell 30" on my old MacBook Pro and was looking forward to an upgrade, finally, but no.

Re:Assumptions (0)

Anonymous Coward | about 9 months ago | (#45295257)

Unfortunately desktop display resolutions have been stagnant for nearly 10 years now. (The Apple 30" was released June 2004).

I was disappointed that the new MacBook Pro has Thunderbolt 2 but does not support 4K displays. I have a Dell 30" on my old MacBook Pro and was looking forward to an upgrade, finally, but no.

The 15" model does support 4096 x 2160, but only at video (24Hz) refresh speeds. (I'm sure you could use it for regular computing as well, but it might seem a bit sluggish.)

Re:Assumptions (1)

timeOday (582209) | about 9 months ago | (#45295347)

I am delighted to hear that! I work with maps a lot - they aren't dynamic but they love detail.

Re:Assumptions (0)

Anonymous Coward | about 9 months ago | (#45295387)

> stagnant for nearly 10 years now.

Not quite. My 1988 20" Sony Trinitron that came with my Sun SPARCstation has a vertical resolution of 1,200 pixels. Resolution has been stagnant for almost 25 years rather than only the 10 that you claim.

No news here (2)

JustNiz (692889) | about 9 months ago | (#45294739)

If you're talking 2D desktop-type computing (surfing, emails, writing documents etc) the point of this article has already been true for at least a decade.

If you're talking 3D hardware rendering (most usually gaming), there is no such thing as enough GPU power, as it's also about consistently achieving the highest framerates your monitor can handle while having every eye-candy setting maxed out on the latest AAA games, which are mostly developed to get the most out of current and next-generation hardware. It's a moving goalpost on purpose.

Re:No news here (0)

Anonymous Coward | about 9 months ago | (#45295311)

...as its also about consistently achieving the highest framerates your monitor can handle...

Except that the article starts off by saying "...but your eyes can't tell." Kinda missed that one, huh?

Not true at all (-1)

Anonymous Coward | about 9 months ago | (#45294741)

My eyes can tell the difference between 60 and 120hz, and my ears can really hear the difference when I'm using Monster brand cables vs regular digital output. Just because your puny human senses can't tell the difference doesn't mean us superior beings should be held back!

That's an easy question to settle (1)

Hentes (2461350) | about 9 months ago | (#45294745)

Get some volunteers, let them play on a machine with an old GPU and a machine with a new one. If they can tell which is which, then apparently our eyes can see the difference. I'd be curious to see the result.

Re:That's an easy question to settle (1)

uCallHimDrJ0NES (2546640) | about 9 months ago | (#45294791)

But then what would we fight about here?

Sure! (1)

Greyfox (87712) | about 9 months ago | (#45295253)

The game we've chosen for this test is Dwarf Fortress [bay12games.com]

DSP (2)

IdeaMan (216340) | about 9 months ago | (#45294753)

A GPU is no longer just a Graphics Processing Unit; it's a general-purpose DSP usable for tasks with simple logic that must be done in a massively parallel fashion. No, I'm actually not talking about mining bitcoins or specialty stuff, I'm talking about things like physics engines.
On the other hand, they are still WAY behind the curve measured by the "My screen isn't 4xAA RAYTRACED yet" crowd.

Re:DSP (1)

Anaerin (905998) | about 9 months ago | (#45295027)

Maybe, but there are some enterprising people working on it [softlab.ntua.gr]

It comes down to what you are actually drawing... (0)

Anonymous Coward | about 9 months ago | (#45294755)

Even if the framerates or the resolution are OK, it still comes down to the quality of imagery being produced on screen, so saying that GPUs do not need to get faster is like saying we don't want better visual quality from now on. Games still do not look like feature-film VFX, and as long as they don't, faster graphics are warranted. Just my 2 cents.

Good (2)

The Cat (19816) | about 9 months ago | (#45294773)

Now maybe we can have gameplay and originality again.

Author's poor interpretation of performance (2, Interesting)

Anonymous Coward | about 9 months ago | (#45294775)

"There is considerable debate over what is the limit of the human eye when it comes to frame rate; some say 24, others say 30,"

That's what is studied and discussed as the lower limit for tricking people into perceiving motion. I believe there are other studies that used pilots as test subjects, who could spot an object shown for between 1/270 and 1/300 of a second. In addition, there's another study suggesting that our brain (and perhaps eyes) can be trained by watching movies/TV to relax and accept frame rates as low as 24 as fluid. Different careers can have an impact, as we are exposed to different things visually.

Additionally, frame latency can continue to be driven down (with diminishing returns) with higher-performing cards even if the frame rate stays constant.

Re:Author's poor interpretation of performance (5, Informative)

Anaerin (905998) | about 9 months ago | (#45295067)

And it depends on what part of the eye you're talking about. The cones (the detail-oriented, color-sensitive parts of the eye) respond at around 30Hz. The rods (the black-and-white but higher-light-sensitivity, faster-responding parts) respond at around 70Hz. This is why CRT monitors were recommended to be set at 72Hz or higher to avoid eyestrain: at 60Hz the cones couldn't see the flickering of the display, but the rods could, and the disparity caused headaches (you could also see the effect if you looked at a 60Hz monitor with your peripheral vision - it appears to shimmer).

Re:Author's poor interpretation of performance (0)

cstec (521534) | about 9 months ago | (#45295211)

"There is considerable debate over what is the limit of the human eye when it comes to frame rate; some say 24, others say 30,"

This! Apparently the author is Gen Y (Z?) and didn't live through the painful hell that was 60Hz monitors, which are well below the human "framerate". 75Hz generally fixed the brutal flashing for most people, but that was just for still images. For serious gaming, 100Hz was finally getting there.

What's sad is how this impacts the dumbing down and slowing down of gaming over time. (Or is it the other way around?) Today's popular shooters are like bullet time compared to old titles, not worth discussing. Going backward from a more real gaming era, Unreal Tournament 3 was slower than UT 2K4/2K3, which was slower than UT, and so on through Quake 3, Quake 2, Quake, and finally Doom. Ah, the glorious Doom! I shudder to think what this console generation would do in a deathmatch with typical Doom players - with no powerups, no invul, no way to assuage their egos except to have skillz and not suck. Good luck with that.

Of course the article also claims 1920x1080 is a standard. For a TV set, perhaps, or someone being cheap, but that's not a monitor resolution - 1920x1200 is a PC monitor resolution.

Triple Monitors... (0)

Anonymous Coward | about 9 months ago | (#45294797)

I'm running 3x24" monitors with a resolution of 5760x1080. You damn well bet I need a lot of processing power for that...

Not true. (1)

decaheximal (566400) | about 9 months ago | (#45294809)

I'm not excited about the "next" generation of cards because they'll be able to maintain a solid framerate at higher resolutions (I haven't been for almost ten years); I'm excited because they'll contain more and faster programmable shader units. That's where the magic sauce happens, and the more shader power you have, the more awesome stuff you can do. And as other people have pointed out, they're incredibly useful for a wide range of applications outside of pure graphics processing.

my eyes can (1)

epyT-R (613989) | about 9 months ago | (#45294837)

I like high framerates and can see the difference, and there are other ways to spend the bandwidth and processing time, like color depth. 24-bit is still quite limiting compared to the 'real life' color gamut. Of course, to be of benefit, we need displays capable of a 'real life' color gamut, and normalizing even a 30-bit depth on today's monitors is pointless.

Another place the GPU is (ab)used is antialiasing and post-process effects, which many like. I dislike antialiasing because the slightly blurrier image causes me eye strain. I'd rather live with the jaggies and have a higher framerate. Same with the blur and bloom effects now abused by modern titles. Enough already. It's not 'realistic' in the slightest to have all that blown-out color in post-processing. Example titles include Battlefield 4, BioShock Infinite, Batman: Arkham Origins, and Codemasters' racing games, though there are plenty of others.
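
Quick numbers behind the color-depth point (a Python sketch):

    # Per-channel steps and total colors at 24-bit vs. 30-bit color.

    for name, bits_per_channel in [("24-bit (8 bpc)", 8), ("30-bit (10 bpc)", 10)]:
        steps = 2 ** bits_per_channel
        print(f"{name}: {steps} steps per channel, {steps ** 3:,} total colors")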

C'mon, who let this crap get posted? (1)

atlasdropperofworlds (888683) | about 9 months ago | (#45294841)

Wow, how can something so stupid get chosen as a post? Seriously. Even at 1080p, even the high-end GPUs fall below 60fps on the most demanding games out there. People who buy high-end GPUs often do so to pair them with three 1080p monitors, a 1440p monitor, or even a 1600p monitor. In fact, these people need to buy 2 to 4 of these top-end GPUs to drive that many pixels and triangles.

Re:C'mon, who let this crap get posted? (0)

Anonymous Coward | about 9 months ago | (#45294931)

This is one of those rare magical moments where the entire slashdot community, which on most days can't agree on anything as a matter of principle, stands united in the belief that the author is a complete idiot.

Re:C'mon, who let this crap get posted? (0)

Anonymous Coward | about 9 months ago | (#45295237)

Hey, don't blame me. I meta-moderated thumbs down on this one..

640K (0)

Anonymous Coward | about 9 months ago | (#45294869)

640K ought to be enough for anybody.

(Bill Gates actually never said that, in fact it was an IBM PC limitation.)

Re:640K (1)

TechyImmigrant (175943) | about 9 months ago | (#45295141)

640kHz is a really fast frame rate.

Monitor vs a Window (0)

Anonymous Coward | about 9 months ago | (#45294903)

Until you can no longer tell the difference between a monitor and looking out a window, I say keep working.

Not just about pixels... (1)

Lab Rat Jason (2495638) | about 9 months ago | (#45294923)

It's not just about resolution and frames per second... it's about color depth, shading complexity, depth of field, reflection, iridescence, and Phong shading. There are TONS of other dimensions that could be included in games that can benefit from a faster GPU. The parent post is somewhat naive... games should play like Hollywood movies at 60fps before we even talk about slowing down.
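
For reference, a minimal Python/NumPy sketch of the classic Phong lighting term mentioned above (single light, no attenuation; the coefficients are arbitrary illustrative values):

    import numpy as np

    def normalize(v):
        return v / np.linalg.norm(v)

    def phong(normal, light_dir, view_dir, ka=0.1, kd=0.7, ks=0.2, shininess=32):
        """Classic Phong: ambient + diffuse (N.L) + specular (R.V)^shininess."""
        n, l, v = normalize(normal), normalize(light_dir), normalize(view_dir)
        n_dot_l = max(np.dot(n, l), 0.0)
        r = 2 * np.dot(n, l) * n - l          # light direction reflected about the normal
        spec = max(np.dot(r, v), 0.0) ** shininess if n_dot_l > 0 else 0.0
        return ka + kd * n_dot_l + ks * spec

    print(phong(np.array([0.0, 0.0, 1.0]),    # surface normal
                np.array([0.0, 1.0, 1.0]),    # direction toward the light
                np.array([0.0, 0.0, 1.0])))   # direction toward the viewer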

Re:Not just about pixels... (1)

epyT-R (613989) | about 9 months ago | (#45294981)

only 60? pff, 120 please..

It's A Dumb "Standard" (2, Interesting)

Jane Q. Public (1010737) | about 9 months ago | (#45294933)

There is absolutely no reason to have 1080p as a "standard" max resolution. Five years ago I got a nice Princeton 24", 1920 x 1200 monitor at a good price. And I expected resolution to keep going up from there, as it always had before. Imagine my surprise when 1920 x 1200 monitors became harder to find, as manufacturers settled on the lower "standard" of 1920 x 1080 and seemed to be refusing to budge.

It's great and all that a 1080p monitor will handle 1080p video. BUT... when it does, there is no room for video controls, or anything else, because it's in "full screen" mode, which has limitations. I can play the same video on my monitor, using VLC, and still have room for the controls and other information, always on-screen.

Now certain forces seem to want us to "upgrade" to 4K, which uses an outrageous amount of memory and hard drive space, requires super-high-bandwidth cables, and is more resolution than the eye can discern anyway unless the screen is absolutely huge AND around 10 feet away.
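
Rough uncompressed-bandwidth numbers behind the "super high bandwidth cables" point (24-bit color, 60 Hz, blanking overhead ignored; a Python sketch):

    # Uncompressed video bandwidth at 60 Hz, 24 bits per pixel.

    for name, w, h in [("1080p ", 1920, 1080), ("4K UHD", 3840, 2160)]:
        gbit_per_s = w * h * 24 * 60 / 1e9
        print(f"{name}: {gbit_per_s:.1f} Gbit/s uncompressed at 60 Hz")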

Whatever happened to the gradual, step-wise progress we used to see? I would not in the least mind having a 26" or 27", 2560 x 1440 monitor on my desk. That should have been the next reasonable step up in monitor resolution... but try to find one from a major manufacturer! There are some on eBay, mostly from no-names, and most of them are far more expensive than they should be. They should be cheaper than my 24" monitor from 5 years ago. But they aren't. Everything else in the computer field is still getting better and cheaper at the same time. But not monitors. Why?

Re:It's A Dumb "Standard" (0)

Anonymous Coward | about 9 months ago | (#45295023)

+1

4:3 monitors should be standard, not 16:9

Re:It's A Dumb "Standard" (3, Informative)

nojayuk (567177) | about 9 months ago | (#45295043)

Is Dell enough of a major manufacturer for you? I just got a replacement 27" Dell 2560x1440 monitor delivered today after a big electricity spike blew out my previous Dell 27" monitor a few days ago.

Sure it costs more than piddly little HD-resolution monitors but I'm looking at nearly twice the number of pixels as HD, it's an IPS panel, high-gamut and with a lot of useful extra functionality (a USB 3.0 hub, for example). Well worth the £550 I dropped on it.

If you are willing to compromise and really want a 24" 1920x1200 monitor Dell make them too. The 2412M is affordable, the U2413 has a higher gamut at a higher price. Your choice.

Re:It's A Dumb "Standard" (1)

Anonymous Coward | about 9 months ago | (#45295207)

Go to Newegg and look at LCD monitors. Filter for 24 inch and then look under resolutions. You'll find 164 under 1080p and 33 under 1920 x 1200. None of the other resolutions are even available for purchase. Even if you don't filter, you're going to still have 1080p as the vast majority of what's offered, and it's been that way for a long time. Unless this 4K and 8K con works out, I don't anticipate your 2560x1440 catching fire. Sincerely AC

Re:It's A Dumb "Standard" (0)

Anonymous Coward | about 9 months ago | (#45295081)

1080p isn't a gaming standard. Any PC gamer can tell you that. 1080p is a video format resolution, get that? VIDEO! We also have 2160p and 4K with various refresh rates. They are for motion video. Retard PCers like yourself can run whatever you like on any screen you like, if you have the money, which I doubt you have. The fact you haven't had at least one 30" monitor for 5 years says you need to get a better education or a proper job. My 11 year old has 2x30" screens, one of which he paid for out of his own earnings saved from his lawn cutting services to neighbors. What's your excuse?

Two Words: "Volume" and "Suppliers" (1)

HaeMaker (221642) | about 9 months ago | (#45295091)

If there is a widely accepted standard, there is a guaranteed customer base. 4K is the next logical plateau since it is gaining traction as the next broadcast standard (although NHK is pushing for 8K!).

There are only two suppliers of flat panels, Samsung and Sharp (and Sharp isn't looking good financially). If you ask them to make a common, but not wildly common resolution, it will cost much more.

In two years, you will be able to buy a 4K monitor for the same price as a 1080p screen, and all new video cards will support it.

Re:Two Words: "Volume" and "Suppliers" (0)

Anonymous Coward | about 9 months ago | (#45295403)

3D was the next broadcast standard, and what happened to that again?

Re:It's A Dumb "Standard" (0)

Anonymous Coward | about 9 months ago | (#45295097)

Actually, pixels are still very noticeable on HD computer screens, especially if you are playing a computer game. I think you meant to say it is more resolution than the eye can handle unless the screen is absolutely huge and also *6 inches* from your face, because the further away the screen, the less noticeable the resolution. iPad resolutions are really the only screens that come close to giving a no-pixel feel, but they also fail because they are literally 1 foot from your face. I really can't believe we haven't gone to 4K resolution yet, but the barrier seems to be the cost. Mark my words, Apple will come out with an expensive 4K monitor, and then everyone else will follow suit.

Re:It's A Dumb "Standard" (0)

Anonymous Coward | about 9 months ago | (#45295137)

Apple, Dell, HP and Samsung aren't major manufacturers?

Re:It's A Dumb "Standard" (1)

sribe (304414) | about 9 months ago | (#45295293)

I would not in the least mind having a 26" or 27", 2560 x 1440 monitor on my desk. That should have been the next reasonable step up in monitor resolution... but try to find one from a major manufacturer!

For whatever reason, they seem to have been coming and going for the past year or two. If you don't find them this month, check back next month and you'll find several...

Re:It's A Dumb "Standard" (1)

Ichijo (607641) | about 9 months ago | (#45295325)

[4k] is more resolution than the eye can discern...

Yes, that's the whole idea behind retina displays.

Re:It's A Dumb "Standard" (0)

Anonymous Coward | about 9 months ago | (#45295385)

There are Korean monitors with 3840x2160 for $700 and 2560x1440 for $300. Look harder on Google and Amazon.

Console Ports. (1)

Kaenneth (82978) | about 9 months ago | (#45294995)

Most games are written for the Lowest Common Denominator, that is, Game Consoles.

Hopefully PC games will be 'allowed' to improve when the next generation of console becomes standard.

Re:Console Ports. (1)

0123456 (636235) | about 9 months ago | (#45295099)

Hopefully PC games will be 'allowed' to improve when the next generation of console becomes standard.

Except the next generation of consoles are basically low to mid-range gaming PCs.

Yes, I can see it... (2)

HaeMaker (221642) | about 9 months ago | (#45294999)

Until I can look at a video game on my screen and a live-action TV show and not be able to tell the difference, there is room for improvement. Perhaps the improvement needs to come from the game developers, but there is still room, and I do not believe we have hit the pinnacle of GPU performance.

By the way, 4K will replace 1080p very soon, so the article is doubly moot.

I game on Haswell (0)

Anonymous Coward | about 9 months ago | (#45295061)

When I tell people I game on Haswell, they say "nice CPU, but I asked what GPU you use." I just stare at them. They finally get what I'm saying, and don't believe me. But it's true, and AFAICT it's just fine, and way faster than the discrete-GPU machine I used to have. It all comes down to just how many years old that machine was. And if it was 12, then YES, Haswell beats it, and seems plenty fast for my 10-year-old games.

GPU's are not just for Graphics anymore (1)

bobbied (2522392) | about 9 months ago | (#45295065)

Really? GPUs are being used more and more for more than just graphics processing. Many interesting parallel-processing problems are being offloaded to GPUs, where hundreds of cores crunch the numbers much faster than your main CPU can. See http://www.nvidia.com/object/cuda_home_new.html [nvidia.com] for one such set of libraries for Nvidia cards.

So WHO CARES if you cannot see the difference in what gets displayed. There is a LOT more going on.
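
A minimal GPGPU sketch in Python using Numba's CUDA support (a convenience choice for illustration, not the CUDA toolkit linked above; assumes an NVIDIA GPU with the numba and numpy packages installed):

    import numpy as np
    from numba import cuda

    @cuda.jit
    def saxpy(a, x, y, out):
        """out[i] = a * x[i] + y[i], computed with one GPU thread per element."""
        i = cuda.grid(1)
        if i < x.shape[0]:
            out[i] = a * x[i] + y[i]

    n = 1_000_000
    x = np.random.rand(n).astype(np.float32)
    y = np.random.rand(n).astype(np.float32)

    d_x, d_y = cuda.to_device(x), cuda.to_device(y)   # copy inputs to the GPU
    d_out = cuda.device_array_like(x)

    threads = 256
    blocks = (n + threads - 1) // threads
    saxpy[blocks, threads](np.float32(2.0), d_x, d_y, d_out)

    assert np.allclose(d_out.copy_to_host(), 2.0 * x + y)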

More resolution! (1)

GrahamCox (741991) | about 9 months ago | (#45295077)

1080p on a 27" screen? Those are some pretty big pixels! Actually my 27" iMac has a far higher resolution than that, though it still looks a little fuzzy after using my "retina" laptop screen. I hope to see these 300-ish ppi values reach 27" screens sometime soon, and that's what these GPUs will need to be fast for. Unlike a TV set, which is viewed from a distance, a monitor is used much closer, so higher resolution is a very obvious benefit.
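
Rough numbers for what a ~300 ppi 27-inch panel would involve, compared with 1080p (a Python sketch):

    import math

    # Pixel counts for a 27" 16:9 panel at ~300 ppi vs. plain 1080p.

    diag_in, ppi = 27, 300
    aspect = math.hypot(16, 9)
    w_px = round(diag_in * 16 / aspect * ppi)
    h_px = round(diag_in * 9 / aspect * ppi)

    print(f"~{w_px} x {h_px} = {w_px * h_px / 1e6:.1f} Mpix")
    print(f"that's ~{w_px * h_px / (1920 * 1080):.1f}x the pixels of a 1080p panel")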

Only applies to CURRENT gen games (0)

Anonymous Coward | about 9 months ago | (#45295185)

Most current generation PC games are less than stellar ports from the Xbox360 and PS3. As such, even a modern low-end GPU will likely have enough power to run the game well at less extreme resolutions.

The console versions have heavily constrained textures and model face-counts, and use shaders in a modest way. New assets are rarely created for the PC version, so the PC version is merely showing console assets at a higher resolution, with higher (or more consistent) frame-rates, and better (often MUCH better) anti-aliasing methods.

This season's batch of next-gen games for the (dreadful) Xbone and the AMD 2014 HSA/hUMA-architecture-based PS4 are quite grotty ports of the current PC versions, and thus unremarkable compared with even a PC running a lesser mid-end card. What matters is what happens from 2014 onwards.

If the putrid Xbone has any success, cross-platform AAA developers will likely constrain their games to what the very weak hardware of the Xbone can handle, and this essentially means games that never look better than PC games from 2012 running on lesser graphics cards. This could create problems for future PC owners looking for better gaming experiences at the common resolution (1080p) mentioned in the article, given that by the end of next year, almost any PC GPU would be overkill at that rez for 2012-style titles.

However, if the Xbone fails (as seems very likely), and the vastly superior PS4 becomes the console of choice, AAA games developers will have enough power to start to make big use of a whole batch of new methods that make the visuals much better. As this happens, current PC hardware will once again be seriously tested, especially if PC owners desire to activate the better forms of anti-aliasing that the PS4 will not be able to use.

Or, possibly, now that the PC, Xbone and PS4 are all forms of PC, games developers will treat the three platforms as variations of ONE PC version, crafting the best graphic experience possible and then LOWERING the settings for the console versions until they play at adequate speed. This would unleash the ability of developers to utilise any amount of potential GPU power, but on the understanding that many heavy-weight rendering options would have to be switched down or off for the console versions. The problem here is that currently, console sales are considered vastly more important than PC sales, and so spending large amounts of money developing features that can ONLY be appreciated on quite powerful PCs would run counter to good financial planning - UNLESS the PC version was seen as creating a clear 'HALO' effect (as has happened with The Witcher IP and the Battlefield IP).

Today, the state of the art in games rendering means that SANE PC gamers (those that don't blow up low-rez games on monitors with massive resolutions) can still happily get by with high-mid-end cards from years ago, or low-mid-end cards from today. If you don't care about being future-proof (with AMD's Mantle support on the GCN architecture found in both consoles and AMD's current PC GPU parts), you can buy a card like Nvidia's 2GB 650 Ti Boost (yes, that 'Boost' bit matters as a designation) for around 150 dollars, and it will give a great experience at 1080p.

For $300 you can get AMD's 7970 with three free good games, or for a bit more, Nvidia's 770 with three far better games, and have so much surplus power at 1080P, you can probably be happily using this card in two years time (and even when it needs retiring, it will make a great card in a secondary system).

However, if you are upgrading your PC in light of the current situation, think seriously about getting a card with MORE than 2GB (which really means AMD) and a card with GCN (which means ONLY AMD). Current 1080p games are already using almost 2GB of GPU memory, and given that the PS4 has 8GB of memory that can theoretically be used by its GPU, it is likely the best 1080p rendering methods will start to use close to 3GB. To-the-metal games on the Xbone and PS4 will be GCN, of course, and many will have 'Mantle' GCN versions on the PC.
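
A rough Python sketch of where per-frame GPU memory might go at 1080p; the render-target count, MSAA level, and texture budget are illustrative assumptions, not measurements of any particular game:

    # Back-of-the-envelope VRAM budget for a 1080p deferred renderer.

    w, h = 1920, 1080
    bytes_per_sample = 4            # RGBA8 per render-target sample

    gbuffer_targets = 5             # assumed: albedo, normals, depth, material, etc.
    msaa = 4                        # assumed 4x multisampled targets
    render_targets_mb = w * h * bytes_per_sample * gbuffer_targets * msaa / 2**20

    texture_budget_mb = 1500        # assumed streamed-texture/geometry working set

    total_gb = (render_targets_mb + texture_budget_mb) / 1024
    print(f"multisampled G-buffer: ~{render_targets_mb:.0f} MB")
    print(f"plus textures/geometry budget: ~{texture_budget_mb} MB -> ~{total_gb:.1f} GB")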

yeah but.... (0)

Anonymous Coward | about 9 months ago | (#45295221)

I run 3 monitors, with my main being 2560x1600. An OC'd GeForce GTX 680 can ~barely~ keep up at that resolution. Now the next thing on the scene is 4K gaming. What's 4K? 3840x2160. That's a lot of fucking pixels, lemme tell you.

Oh, and the difference between 2560x1600 and 1920x1080? Fucking amazing. Like double the pixels.
