NVIDIA's G-Sync Is VSync Designed For LCDs (not CRTs)

Soulskill posted 1 year,13 days | from the g-is-the-new-v dept.

Graphics 139

Phopojijo writes "A monitor redraws itself from top to bottom because of how the electron guns in CRT monitors used to operate. VSync was created to align each completed frame from the video card with the start of a monitor draw; without it, a break (a horizontal tear) between two time-slices of animation can appear partway through the monitor's draw. Pixels on LCD monitors have no physical need to wait for the lines above them to be drawn, but they still do. G-Sync is NVIDIA's technology for making monitor refresh rates variable: the monitor times each draw to whenever the GPU finishes rendering. A scene that takes 40ms to render gets a smooth 'framerate' of 25 FPS instead of being forced into some fraction of 60 FPS." NVIDIA also announced support for driving three 4K displays at the same time, for a combined resolution of 11520×2160.


But.. (0, Redundant)

TeknoHog (164938) | 1 year,13 days | (#45169345)

will the pixels wait for the first post?

Re:But.. (0)

Phopojijo (1603961) | 1 year,13 days | (#45169373)

Fir..! Dangit : ( Guess not.

Basic math (-1)

Anonymous Coward | 1 year,13 days | (#45169365)

Read your own damn source article, you moron. 4k is 3840×2160. Where the fuck did you get 11520 from?

Re:Basic math (0)

Anonymous Coward | 1 year,13 days | (#45169395)

Read your own damn source article, you commenting moron. Sorry, I failed miserably.

Re:Basic math (3)

Tuidjy (321055) | 1 year,13 days | (#45169425)

11520 = 3 x 3840

If _you_ had read the damn article, you would have noticed that the resolution is for THREE 4K monitors, side by side.

I'm not saying it's a graceful turn of phrase, or particularly clear, but most people would have been able to tell where he got it from...

Re:Basic math (0)

Anonymous Coward | 1 year,13 days | (#45169599)

Yup, I saw that as soon as I read the number.

3 x 75" 4K TVs mounted in the bedroom in front of the bed with a wireless keyboard and mouse. /drool

How many years, though, till a graphics card can drive that resolution with all the goodies turned on?

Re:Basic math (1)

Delusion_ (56114) | 1 year,13 days | (#45171329)

A tangent, but frankly, given the choice between 4K monitors I couldn't afford and a return to widespread availability of a 16:10 option at 1920x1200, I'd take the latter. 16:9 is less ideal to me.

Re:Basic math (0)

Anonymous Coward | 1 year,13 days | (#45170169)

11520?

"the resolution is for THREE 4K monitors, side to side"

Nah, everybody knows that it's really TEN Sun SPARC monitors side by side.

Are there even that many left... (0)

Anonymous Coward | 1 year,13 days | (#45170567)

In the real world anymore? :D

Actually for that matter, how many of those SGI LCDs with the proprietary display connector still exist?

Re:Basic math (0)

Anonymous Coward | 1 year,13 days | (#45170717)

Just like the summary says.

Re:Basic math (-1, Troll)

Anonymous Coward | 1 year,13 days | (#45169537)

Hahaha, disregard that! I Suck cocks!

Re:Basic math (0)

Anonymous Coward | 1 year,13 days | (#45169815)

It's right there in the fucking summary you moron:
"NVIDIA also announced support for three 4k displays at the same time."

In English (2, Interesting)

girlintraining (1395911) | 1 year,13 days | (#45169409)

Okay, can someone who isn't wrapped up in market-speak tell us what the practical benefit is here? The fact is that graphics cards are still designed around the concept of a frame; the rendering pipeline is based on that. 'VSync' doesn't have any meaning anymore; LCD monitors just ignore it and bitblt the next frame directly to the display without any delay. So this "G-Sync" sounds to me like just a way to throttle the pipeline of the graphics card so it delivers a consistent FPS... which is something we've been able to do since DirectX 9.

So what, then, is the tangible benefit realized? Because I smell marketing, not technology, in this PR.

Re:In English (4, Informative)

Reliable Windmill (2932227) | 1 year,13 days | (#45169475)

For starters, it reduces memory contention, because the display data doesn't have to be read and sent over the wire 60 times a second while the next frame is being rendered. Theoretically, if nothing is happening on the screen, such as an idle desktop, the display pipeline won't consume several hundred megabytes per second of bandwidth just to show a still image.
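
As a rough sanity check on that figure, here is a minimal sketch of the scan-out arithmetic (the 1080p, 32-bits-per-pixel, 60 Hz numbers are assumptions for illustration, not anything from NVIDIA's material):

    #include <cstdio>

    int main() {
        // Assumed panel parameters: 1080p, 32 bits per pixel, 60 Hz scan-out.
        const double width  = 1920.0;
        const double height = 1080.0;
        const double bytes_per_pixel = 4.0;   // e.g. XRGB
        const double refresh_hz = 60.0;

        const double bytes_per_frame  = width * height * bytes_per_pixel;
        const double bytes_per_second = bytes_per_frame * refresh_hz;

        // Prints roughly 8.3 MB per frame and just under 500 MB/s of
        // scan-out traffic, even when the image never changes.
        std::printf("per frame : %.1f MB\n", bytes_per_frame / 1e6);
        std::printf("per second: %.1f MB/s\n", bytes_per_second / 1e6);
        return 0;
    }

That "just under 500 MB/s" figure is where the "several hundred megabytes" claim comes from; the exact number scales with resolution, bit depth and refresh rate.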

Re:In English (0)

Anonymous Coward | 1 year,13 days | (#45169571)

I'm confused about the extent to which this would work. Are you saying that if you left your desktop idle, and there was nothing moving on screen, the monitor wouldn't update and would just show a single static frame until something moved?

Re:In English (0)

Anonymous Coward | 1 year,13 days | (#45169779)

It could also be done like that, if the OS is smart enough to know that nothing on the screen needs updating. But in practice this will actually be the opposite: it will put out as high a framerate as it is able to render.

Re:In English (1)

SuricouRaven (1897204) | 1 year,13 days | (#45169833)

I don't know if it will be able to do that, but if it can then there might be power savings too.

Re:In English (2)

ericloewe (2129490) | 1 year,13 days | (#45170319)

That's not quite the angle they're going for, but there are such solutions, involving a special "no refresh" signal (I assume) and an LCD controller with a framebuffer that is used to refresh the panel if there is no change, allowing the GPU to be idled.

Re:In English (2)

VortexCortex (1117377) | 1 year,13 days | (#45170671)

So, it's 1970's era Double Buffering?

Re:In English (3, Informative)

LazyBoot (756150) | 1 year,13 days | (#45169479)

VSync still tends to add noticeable input lag even in games today. And some games still have issues with tearing even on LCD screens. So I'm guessing this is what they are trying to fix.

Re:In English (1)

Anonymous Coward | 1 year,13 days | (#45169625)

Yeah, I was going to say that I've got all sorts of games that have tearing on side-to-side scrolling scenes. Whether it's a problem with my panel, the videocard, or the game, I'd like it to stop (and usually enabling vsync in the game stops it).

Re:In English (1)

nospam007 (722110) | 1 year,13 days | (#45170455)

"Vsync still tend to add noticeable input lag even in games today. And some games still have issues with tearing even on lcd screens. So I'm guessing this is what they are trying to fix."

I still have tearing on 2 of my machines (Vista and 7)

Switching to an Aero theme fixes it, even though I hate those.

Re:In English (1)

Anonymous Coward | 1 year,13 days | (#45169517)

No more screen tearing: http://en.wikipedia.org/wiki/Screen_tearing

Re:In English (2)

CastrTroy (595695) | 1 year,13 days | (#45169547)

The only advantage I see is that you see the image as soon as possible after it's done rendering. If your graphics card finishes rendering the scene but your monitor refreshed 1 ms before that, you have to wait another ~16 ms (assuming 60 Hz) to see that frame, because that's the next time the screen refreshes. If you can make the refresh of the monitor exactly match the rate at which frames are produced, there's minimal lag between the frame being rendered and the frame being shown to the person playing the game.

Re:In English (2)

ultrasawblade (2105922) | 1 year,13 days | (#45169637)

Scenes with few changes from frame to frame could run at a much higher framerate.

I think this opens up interesting possibilities.

Of course, I think the physical response time of the display will be a bottleneck, capping the rate at a maximum below what the GPU can pump out.

It can also save power: if the GPU is producing frames at less than 60 Hz, that's less power it needs to spend doing it.

Re:In English (2)

dow (7718) | 1 year,13 days | (#45170205)

Indeed. I have a 144 Hz screen, and I noticed as soon as I went from 60 Hz to 144 Hz that even when the frame rate was below 60 fps, it was still smoother than before. It was obvious that it would be smoother above 60 fps, but this surprised me. Thinking about it, I came to the same conclusion.

I have also been wondering what the picture would look like at a high refresh rate when the graphics card cannot render a full frame for every refresh: what if it only rendered half the pixels every update? Not like an old interlaced picture, but half the odd-numbered pixels on one line and half the even-numbered ones on the next line. Would it look blurred or odd? Would the way the eye works adapt to this better than rendering full frames at half the framerate?

I was also wondering whether such a display method would be useful for pre-rendered, low-frame-rate applications such as video playback? (This is where a +1 Informative comment says 'Yes, they've been doing it for years already...' heh)

Re:In English (0)

Anonymous Coward | 1 year,13 days | (#45170679)

The other significant advantage is that game developers would no longer have to guess how long each frame is going to take to generate.

When generating a frame, you want to start as late as possible, because that gives you the maximum amount of user input to work into the game state for the frame being displayed. Because of that, what you do as a game developer is guess how long the frame will take to produce, observe when the next vsync will come, subtract, add some padding, and start rendering then.

The bad news is that if your frame takes longer than expected to generate, you drop frames when you didn't have to. If it takes less time than expected, you get more input lag than you ideally could have. G-Sync gives you the best of both worlds: you just start generating a new frame when you feel like it, and it appears on the screen as soon as you're done, lowering input lag, eliminating unnecessary frame drops, and avoiding tearing.

Re: In English (1)

Anonymous Coward | 1 year,13 days | (#45169583)

VSync's problem is that it is dictated by your monitor at a fixed rate, while graphics card frame rates are variable. This causes either stutter or tearing, depending on whether you wait for VSync before drawing.

This solution instead is controlled by the video card, so it will never tear, and the monitor will update when told to rather than at a fixed time. No more tearing and no more performance loss to deal with it... also no need to triple buffer, which will help reduce input lag (among other things).

Re:In English (5, Informative)

The MAZZTer (911996) | 1 year,13 days | (#45169619)

No marketspeak here, but if you're not familiar with the technical details you might be a bit lost. First of all, in order to understand the solution, you need to identify the problem.

The problem is that, currently, refresh rates are STATIC. For example, if I set mine to 60Hz, the screen redraws at 60fps. If I keep vsync disabled to allow my gfx card to push out frames as fast as it can, my screen still only draws 60fps, and screen "tearing" can result as the screen redraws in the middle of the gfx card pushing out a new frame (so I see half-and-half of two frames).

As described, let's say my gfx card is pushing out 25fps. Currently the optimal strategy is to keep vsync off, even though this can result in screen "tearing", because at low fps an even bigger problem emerges with vsync on (even though the tearing is fixed).

Every time my gfx card finishes a frame with vsync on, it waits to ensure it will not be writing to the screen buffer while the screen is updating. Since it waits, the screen only draws complete frames. So at 60Hz the screen updates at 1/60 second intervals, and the gfx card renders at 1/25 second intervals. At the start of a frame the gfx card renders... the screen redraws twice in the meantime, and then the gfx card has to wait for the third refresh before it can swap. Since it is waiting instead of rendering, I am now rendering at 20fps (I lose 2 of every 3 refresh opportunities) instead of the 25fps I could manage. If I disable vsync, I get tearing, but 25fps.

This "G-Sync" claims to solve that issue by making refresh rates DYNAMIC. So if my gfx card renders at 25fps, the screen will refresh at that rate. It will be synchronized. No tearing, and no gfx card waiting to draw.

Re:In English (2)

AvitarX (172628) | 1 year,13 days | (#45169889)

I think it's worse than 20 FPS, even: some frames will last longer than others, causing jerky motion. It won't be a smooth 20 FPS, it will be 20-ish, with some frames lasting extra refreshes compared to others.

Re:In English (0)

Anonymous Coward | 1 year,13 days | (#45170017)

It sounds like it would be a variable frame rate and introduce jitter based on rendering time.
On the other hand, the display update will be synchronized to the rendering instead of to whenever the next VSYNC happens to be (so a delay anywhere from 0 ms up to one full refresh interval).

The LCD is like dynamic memory: it still needs a redraw at some minimum refresh rate or you won't have a proper picture.
Not sure how the monitor firmware/hardware is going to handle this. This scheme would only work for LCDs with a fully digital interface and not the ones that digitize VGA signals, as it would mess up PLL timings.

Re:In English (1)

Anonymous Coward | 1 year,13 days | (#45171173)

This scheme would only work for LCDs with a fully digital interface and not the ones that digitize VGA signals, as it would mess up PLL timings.

Indeed, from the material I've read it's DisplayPort only (DP is a high speed packet interface). As you say, HDMI/DVI work more or less like digitized VGA.

Also, addressing another issue you mentioned, the photos of the monitor-side controller show that it has several DRAMs. This memory is almost certainly used to store the last frame so the monitor can refresh itself if/when the host refresh interval drops too low to keep a stable image on the panel.

Re:In English (2)

Guspaz (556486) | 1 year,13 days | (#45170333)

G-Sync enforces a 30Hz minimum refresh rate (the monitor will never wait longer than 33ms between refreshes; and on the 144Hz demo monitors it will never wait less than about 7ms), so your example wouldn't work, but apart from that, yeah :)

Re:In English (2)

Chemisor (97276) | 1 year,13 days | (#45171507)

Currently the optimal strategy is to keep vsync off, even though this can result in screen "tearing"

No, the optimal strategy is to keep vsync on and throttle your redraws exactly to it. To make it work you must set up an event loop and a phase-lock timer (because just calling glFinish to wait for vsync will keep you in a pointless busywait all the time). Unfortunately, game programmers these days are often too lazy to do this and simply ignore vsync altogether. While this may result in smoother animation, it also heats up my very expensive video card, shortening its life for no reason. Thankfully, there are tools on Windows (but sadly, not on Linux) to force vsync on any such insensitive games, at which point the video card cools down to 40 degrees or so, illustrating just how much power they are wasting.
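
A minimal sketch of that throttling idea (timing only, with an assumed fixed 60 Hz period and a stand-in render function; a real implementation would phase-lock to the driver's reported vblank rather than to a free-running timer):

    #include <chrono>
    #include <thread>
    #include <cstdio>

    // Stand-in for the real work: in a game this would be the draw calls
    // followed by a buffer swap.
    static void render_frame(int n) {
        std::printf("rendered frame %d\n", n);
    }

    int main() {
        using clock = std::chrono::steady_clock;
        const auto period = std::chrono::microseconds(16667);  // assumed 60 Hz

        auto next_tick = clock::now() + period;
        for (int frame = 0; frame < 5; ++frame) {
            render_frame(frame);
            // Sleep instead of spinning, so the CPU/GPU can idle between frames.
            std::this_thread::sleep_until(next_tick);
            next_tick += period;
        }
        return 0;
    }

The point is the sleep: the loop is paced to the refresh period without burning cycles in a busy-wait, which is where the power (and heat) savings come from.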

Re:In English (1)

Anonymous Coward | 1 year,13 days | (#45169653)

Slashdot: where anything you don't understand must be bullshit.

Basically, you know how you think monitors work? They actually don't. But with g-sync, they do.

Re: In English (0)

Anonymous Coward | 1 year,13 days | (#45170265)

Ding ding ding...!!! Thanks, AC, was about to post the same thing, but liked your phrasing better.

Re:In English (0)

Anonymous Coward | 1 year,13 days | (#45170449)

For the poster in question, you can replace the words "monitors" and "G-Sync" with any other appropriate word pair and the statement will hold true.

Re:In English (5, Informative)

jones_supa (887896) | 1 year,13 days | (#45169707)

LCD monitors absolutely do not ignore VSync. Remember that the primary function of the VSync signal is to tell the monitor (CRT or LCD) where the picture begins; there's also HSync to break the picture into scanlines. VSync always takes a certain amount of time, during which the monitor "takes a breath" (a CRT also moves the gun back to the top). That is the perfect moment for the GPU to quickly swap its framebuffers in video memory: the "scratch" draw buffer is promoted to the final output image, and the GPU can begin drawing the next frame in the background. The completed image is then sent to the monitor in the normal picture signal when the monitor gets back to work drawing a frame. If the buffers are swapped while the monitor is in the middle of drawing a frame, halves of two different frames get shown together, which leads to the video artifact called "tearing".

If we are good citizens and swap buffers only during the VSync period, we get a nice tear-free (typically 60fps) image. However, if it takes more than one frame time (about 16ms) to draw the next picture, we have to wait for the following VSync, and that means we slide all the way down to a 30fps frame rate. If the game runs fast at some moments but slower at others, the bouncing between 60fps and 30fps (or even 15fps) makes for an annoying jerky effect. NVIDIA's G-Sync tries to solve this problem by making the frame time dynamic.
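
A toy model of that swap-during-VSync rule (a sketch only: a four-"pixel" framebuffer and a pretend vblank point, not a real driver):

    #include <array>
    #include <cstdio>
    #include <utility>

    int main() {
        // Two tiny "framebuffers": one being scanned out, one being drawn into.
        std::array<int, 4> buf_a{}, buf_b{};
        auto* front = &buf_a;   // what the monitor is currently reading
        auto* back  = &buf_b;   // what the GPU is currently drawing into

        for (int frame = 1; frame <= 3; ++frame) {
            // GPU renders the next image into the back buffer.
            back->fill(frame);

            // ... the monitor scans out *front* here; touching it mid-scan = tearing ...

            // At the vblank ("take a breath") moment it is safe to flip.
            std::swap(front, back);
            std::printf("vblank: now scanning out frame %d\n", (*front)[0]);
        }
        return 0;
    }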

Re:In English (5, Interesting)

Anonymous Coward | 1 year,13 days | (#45170479)

Well I don't want to be "that guy", but I am "that guy". The real reason for vsync in the days of CRTs is to give time for the energy in the vertical deflection coils to swap around. There is a tremendous amount of apparent power (current out of phase with voltage) circulating in a CRT's deflection coils.

Simply "shorting out" that power results in tremendous waste. They used to do it that way early on, they quickly went to dumping that current into a capacitor so they could dump it right back into the coil on the next cycle. That takes time.

An electron beam has little mass and can easily be put anywhere at all very quickly on the face of a CRT. It's just that the magnetic deflection used in TVs is optimized for sweeping at one rate one way. On CRT oscilloscopes they used electrostatic deflection and you could, in theory, have the electron beam sweep as fast "up" as "left to right".

So why didn't they use electrostatic deflection in TVs? The forces generated by an electrostatic deflection system are much smaller than with a magnetic system; you'd need a CRT a few feet deep to get the same size picture.

Ta dah! The wonders of autism!

Re:In English (1)

Anonymous Coward | 1 year,13 days | (#45169763)

So this "G-sync" sounds to me like just a way to throttle the pipeline of the graphics card so it delivers a consistent FPS...

Actually, it's the inverse: with G-Sync, the monitor retrace tracks the instantaneous FPS delivered by the game. That way there is no stutter (or tearing) as a result of quantizing the display scans to a predetermined, arbitrary frame rate.

Re:In English (3, Informative)

Nemyst (1383049) | 1 year,13 days | (#45169921)

Actually, it's the reverse. Instead of forcing the GPU to wait for the screen's refresh rate (as is the case with V-sync), potentially causing some pretty bad frame drops, G-sync makes the monitor wait for the GPU's frames. Whenever the GPU outputs a frame, the monitor refreshes with that frame. If a frame takes longer, the screen keeps the old frame shown in the meantime.

Remember, V-sync forces the GPU to wait for the full frame's duration, regardless of how long it's taken to render the frame. If the GPU renders the frame in 3ms but V-sync is at 10ms per frame, the GPU waits around for 7ms. On the flip side, if the GPU takes 11ms, it has "missed" a frame (lag) and still has to wait 9ms until it can start drawing the next one. G-sync is supposed to make it so that as soon as the GPU is done rendering a frame, it pushes it to the monitor, and as soon as the monitor can refresh the display to show that new image, it will.

In theory, this could effectively give the visual quality of V-sync (no screen tearing) with a speed similar to straight rendering without V-sync.

Re:In English (1)

ericloewe (2129490) | 1 year,13 days | (#45170381)

You sync the panel's refresh rate to the application's.
Say a frame gets delayed (40 fps instead of 60 fps, for instance). Traditionally, the monitor is blissfully ignorant of that fact and just refreshes whatever it is given when the time comes.
Nvidia's solution is to have the GPU signal the LCD's controller, telling it when to refresh. This allows the monitor to refresh when the frame is done rendering, instead of at a fixed point in time.

It's essentially a method for refreshing the panel on cue, keeping everything in sync. This avoids tearing (an incomplete frame in the buffer) and lag (with VSync on, a frame that takes longer to render is forced to wait until the next refresh time; here the monitor waits for the GPU).

Re:In English (1)

Anonymous Coward | 1 year,13 days | (#45170659)

This technology will do 3 things:

1) Reduce input lag to the lowest possible delay due to each frame being displayed immediately on the screen. With standard fixed refresh rate displays, there is almost always a delay between a frame being generated and being displayed on the screen and the delay is not constant.

2) Remove the need for vsync to eliminate screen tearing. Since the monitor's refresh cycles are controlled by the GPU, it can be guaranteed to avoid tearing without requiring the GPU to render frames ahead and wait for the monitor's refresh cycle.

3) Make games run visually smoother. With a standard 60 Hz display, you can only achieve perfectly smooth motion at framerates that divide evenly into 60 (60 fps, 30 fps, 20 fps, etc.). With this new technology, since the display refresh is controlled by the GPU, you will be able to achieve smooth motion at any framerate. The elimination of screen tearing will also remove the visual stutter that tearing causes. With any luck this will work for video playback as well, smoothing out playback of 24 and 25 fps video on 60 Hz displays, but it's impossible to say at this point.

Now I'm one of the lucky people who's had very little trouble with input lag, despite using vsync + triple buffering (the current best solution for motion smoothness) in a myriad of games across 5 computers over more than 10 years. I don't expect a huge improvement for myself, but this sounds like a godsend for others who have to choose between screen tearing and input lag.

Re:In English (0)

Anonymous Coward | 1 year,13 days | (#45170745)

I hope not to see lines while watching video on Linux. Currently I see frame-division lines in the middle, as if a frame shows up while the previous one wasn't finished. Very annoying when there is a lot of motion in the video. It could be because I have two monitors with different frame rates and the card gets confused about which one to synchronize with. There are some switches in the config, but in practice they don't help. :-\

Re:In English (1)

Anonymous Coward | 1 year,13 days | (#45171141)

Oh, slashdot. Yet another ignorant "girlintraining" post modded up to 4+ interesting/informative/etc. for no discernible reason.

LCDs do not ignore vsync. They have never ignored vsync. How on earth did you get the idea that they ignored vsync? Same comment re: "bitblt the next frame directly to the display without any delay". The "next frame" isn't even in the display, you buffoon. It's either not computed yet, or sitting in buffers on the video card. The display can't magically pull those bits out of memory it's not directly connected to, it has to wait for the video card's refresh controller to send them over the wire.

Whether the display is a CRT or a LCD, it can only display the data it's given, at the rate it is fed. In the present refresh model the sequence looks like this:

1. Video card sends a vertical blanking synchronization signal, or the moral equivalent of same (in 100% digital standards like DisplayPort the "sync signal" is just a special packet or something)
2. Video card sends 1 line's worth of horizontal pixel data.
3. Video card sends a horizontal blanking synchronization signal, or the moral equivalent.
4. Goto #2 unless all lines for the frame have been sent.
5. Goto #1.

And what I haven't described in the above is that there are blank periods inserted to keep both the horizontal and vertical synchronization periods constant. With standards like DVI and HDMI, this involves just sending zeroes or something (DVI/HDMI are pretty much like VGA with digital RGB channels instead of analog); with DisplayPort you just delay the next packet. Either way, considered as a whole, frames arrive at the monitor with a fixed frequency. And note that this frequency is generated by the video card -- it queries the monitor to find out what range of horizontal and vertical frequencies the monitor supports, so it's up to the video card to generate data at a rate the display can accept, but the clock source is on the card.
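
A worked sketch of how that fixed frequency falls out of the link timing, using the standard 1080p60 numbers (1920x1080 active, 2200x1125 total once blanking is included, 148.5 MHz pixel clock):

    #include <cstdio>

    int main() {
        // Standard CEA timing for 1080p60: the active area is 1920x1080, but the
        // link actually carries 2200x1125 "pixels" per frame once horizontal and
        // vertical blanking are included.
        const double pixel_clock_hz = 148.5e6;
        const double total_h = 2200.0;   // active 1920 + horizontal blanking
        const double total_v = 1125.0;   // active 1080 + vertical blanking

        const double line_rate_hz  = pixel_clock_hz / total_h;   // ~67.5 kHz
        const double frame_rate_hz = line_rate_hz / total_v;     // ~60 Hz

        std::printf("line rate : %.1f kHz\n", line_rate_hz / 1e3);
        std::printf("frame rate: %.2f Hz\n", frame_rate_hz);
        return 0;
    }

Fix the pixel clock and the blanking geometry and the frame rate is fixed too, which is exactly the constraint G-Sync relaxes.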

G-Sync is very much like the above model, except:

* DisplayPort only, probably because it's not a natural extension to other standards. DisplayPort is a much more modern packetized protocol which isn't as tied to VGA backwards compatibility.
* No more fixed frequency of frame arrival. Software on the host computer determines when to start sending a new frame.
* Requires a frame buffer in the monitor since if no new frames have arrived in a while the LCD will need to do a repeat refresh of the last frame from local memory to avoid visual glitches (contrary to popular belief, images on LCDs do not persist forever if they are not refreshed)

The advantage:

Traditional display refresh requires a refresh event every N milliseconds whether there's truly a new content frame ready or not (e.g. once every 16.67ms at 60Hz). If you're displaying content whose frame rate exactly matches the display, or the display runs at an exact multiple of the content's frame rate, great! You get perfectly smooth animation because everything happens at fixed integral multiples of the frame time. But say you're playing a video game and that video game can't keep up with 60 Hz refresh, dropping to 53 fps. Now you have two choices:

1. Synchronize the game's output to the vertical retrace. If you're taking longer than 1 monitor frame time to render a game frame, output one game frame for every two monitor frames. So you drop straight from 60 Hz to 30 Hz, and from 30 to 20 if you can't do it all inside of two monitor frame times. The average might still be higher than 30 if the situation is that many frames take less than 16.67ms to render but some take 25 or whatever, but in that case you'll be running at 60 Hz much of the time with occasional tiny glitches to 30 Hz. This looks smoother than 30 Hz all the time, but tends to have a noticeable choppiness to it.

2. Decouple completely from the monitor's vertical retrace. Now the game may update the frame buffer even while it's partway through being spooled out to the video card. However, this leads to a graphical artifact known as "horizontal tearing": because one displayed frame may contain data from two distinct game-rendered frames, it's easy to have a visible discontinuity. Also this still doesn't really look as smooth as it ought to.

G-Sync is like having V-Sync on all the time (one monitor frame is always one game frame), but the fact that the game doesn't have to lock to a fixed vertical retrace frequency means that you no longer have to target a fixed frame rate (like 60 fps) to get smooth animation.

Re:In English (4, Funny)

Hsien-Ko (1090623) | 1 year,13 days | (#45171965)

It seems strange that they didn't call it nSync....

Finally (5, Interesting)

Reliable Windmill (2932227) | 1 year,13 days | (#45169431)

Now we just wait until they finally figure out how to employ a smarter protocol than sending the whole frame buffer over the wire when only a tiny part of the screen has changed. It would do wonders for APUs and other systems with shared memory.

Re:Finally (0)

Anonymous Coward | 1 year,13 days | (#45169541)

I'm sure you're watching a black screen while the GPU is rendering the next frame... This is just an improvement on vsync problems when the framerate drops below the sync rate.

Re:Finally (2)

CastrTroy (595695) | 1 year,13 days | (#45169639)

On one hand, I see where this could be a good idea. On the other hand, I kind of like dumb displays. Stuff like displays and speakers should really just display/play the signal sent to them. They should be as simple as possible, because they are expensive, and this allows them to last for longer and be cheaper. If TVs were smart, I would probably have to upgrade my TV every time they came up with a new video encoding standard. Luckily, TVs just understand a raw signal, and I can much more easily upgrade my computer to interpret the encoded videos than I can upgrade my TV.

Re:Finally (1)

lgw (121541) | 1 year,13 days | (#45169691)

A huge part of good remote desktop protocols is just that! It keeps the bandwidth down. If your graphics card could know "for free" that all changes were in a given rectangle (and I bet it often could), that doesn't even sound hard to do.
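
A sketch of the bookkeeping such a protocol would do (the DirtyRect type here is hypothetical; real remote-desktop protocols track lists of regions rather than a single bounding box):

    #include <algorithm>
    #include <cstdio>

    // Hypothetical dirty-region tracker: remembers the bounding box of
    // everything drawn since the last update was sent.
    struct DirtyRect {
        int x0 = 0, y0 = 0, x1 = 0, y1 = 0;
        bool any = false;

        void mark(int x, int y, int w, int h) {
            if (!any) { x0 = x; y0 = y; x1 = x + w; y1 = y + h; any = true; return; }
            x0 = std::min(x0, x);     y0 = std::min(y0, y);
            x1 = std::max(x1, x + w); y1 = std::max(y1, y + h);
        }
    };

    int main() {
        DirtyRect dirty;
        dirty.mark(100, 200, 50, 20);   // e.g. a blinking cursor
        dirty.mark(110, 205, 300, 16);  // e.g. a line of text

        if (dirty.any)
            std::printf("send only %dx%d pixels instead of a whole 1920x1080 frame\n",
                        dirty.x1 - dirty.x0, dirty.y1 - dirty.y0);
        return 0;
    }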

Re:Finally (1)

Richy_T (111409) | 1 year,13 days | (#45169755)

This was my thought. But typically, where it would make a difference, the whole screen is probably changing anyway. I can still see some advantage to that though.

I guess ultimately, the GPU(s) should be in the monitor and the PCIe bus would be the connection. It appears there's no defined cable length so it would probably require a standards update.

Re: Finally (0)

Anonymous Coward | 1 year,13 days | (#45169857)

Thunderbolt, aka LightPeak, is a transport that can do a PCIe connection over a 2m+ cable (as well as carry a regular DisplayPort signal)

Re: Finally (1)

Richy_T (111409) | 1 year,13 days | (#45169947)

Sweet. The name rings a bell but I don't think I paid attention.

Re: Finally (1)

petermgreen (876956) | 1 year,13 days | (#45170911)

AIUI, first-generation Thunderbolt is basically equivalent to PCIe 2.0 x4, while second-generation Thunderbolt is basically equivalent to PCIe 3.0 x4. AFAICT that is tolerable but suboptimal for running an external GPU.

Re:Finally (0)

Anonymous Coward | 1 year,13 days | (#45169873)

For FPSes, when you rotate all the way around, or for action movies where the camera moves quickly, all of the screen is updated. Presumably, this is also the case when the graphics system is maximally stressed. What good will your protocol do at that point? Are you concerned about reducing close-to-idle power consumption or something?

Carmack on VR Latency (1)

tepples (727027) | 1 year,13 days | (#45169989)

For FPSes, when you rotate all the way around, or for action movies where the camera moves quickly, all of the screen is updated.

Then make "scroll rectangle" one of the primitives in the screen difference protocol. If the camera turns, scroll the data in the frame buffer at the same speed that the camera turns. Sure, there'll be artifacts near the HUD, but overall, that should provide the illusion of less latency [slashdot.org] . MPEG-4 ASP (e.g. DivX, Xvid) uses this technique under the name "global motion compensation", but ultimately, the concept dates back to motion vectors way back in the H.261 era.

Re:Carmack on VR Latency (2)

Rockoon (1252108) | 1 year,13 days | (#45170385)

Well there goes the power savings...

So why are you considering doing this again?

The fact is that the bandwidth between video card and monitor must be enough to handle the worst-case scenario or else it's not fit for purpose, and the hardware cost difference between fully utilizing this link and greatly under-utilizing it is very, very small. There are power savings if you can under-utilize the link without sacrifice, but...

Meanwhile there are large up-front costs associated with performing real-time video compression techniques, and the needed computation itself also uses a lot of power. So there would be large power costs in actually successfully under-utilizing the link using video compression techniques.

So no, not really a smart idea. The reason nVidia (who I am currently not a fan of) is proposing this G-Sync idea is that the full technical capabilities of LCD hardware aren't currently being exploited, because of the old paradigm of frame buffers being sent at constant, fixed intervals. It would be very low cost to add logic to LCD displays to sit and wait for new frames rather than what they currently do.

It's a win-win-win because...
o) the video card only has to transmit new frames, rather than constantly transmitting frames even when they haven't changed.
o) The LCD can spring into action updating pixels as soon as the video card has a new frame ready, rather than at fixed intervals.
o) In low frame-rate situations, there are power savings on the link, in the video card, and in the monitor (although the monitor's power savings would be very minor, as the backlight uses 99% of the power already).

I imagine this sort of thing would be adopted in laptops long before desktops.

Re:Carmack on VR Latency (0)

Anonymous Coward | 1 year,13 days | (#45171361)

The OP might be a little naive, but imagine that rather than reducing shared memory bandwidth, we're trying to scale up the display resolution.

Wouldn't it be nice if the display was "retina" resolution with tens of times as many pixels in every dimension, and the system could send high resolution updates to areas that then persist statically on their own, for technical tasks, text, etc. At the same time, it could send an mpeg4 or similar motion compressed video stream to be decompressed, scaled, and composited in the display itself.

Let my whole wall display news, or art, or whatever, with a full motion video rectangle embedded at a convenient location. But I can move that rectangle around or scale it up or down depending on where I am sitting, without requiring a higher bandwidth from the rendering system to the wall display.

What goes around comes around... let's go back to smart display servers detached from our application servers.

Re:Finally (1)

Reliable Windmill (2932227) | 1 year,13 days | (#45170537)

In _that_ particular case we're back to square one, where we are now, but in all other cases there are power and memory-bandwidth savings. Not all usage cases are full-screen FPS games, and typically those are not the users who are concerned with memory bandwidth and power usage.

Re:Finally (1)

ultranova (717540) | 1 year,13 days | (#45171977)

In _that_ particular case we're back to square one, where we are now, but in all other cases there are power and memory-bandwidth savings.

Would there? The GPU actually requires more memory bandwidth, since it needs to retrieve the previous frame for a pixel-for-pixel comparison. And both the encoding and decoding require circuitry, which needs power - probably more power than just sending the raw frame over a 1-meter link in the first place. That's worth remembering: we aren't talking about a trans-Atlantic cable here.

Re:Finally (1)

tlhIngan (30335) | 1 year,13 days | (#45170233)

Now we just wait until they finally figure out how to employ a smarter protocol than sending the whole frame buffer over the wire when only a tiny part of the screen has changed. It would do wonders for APUs and other systems with shared memory.

They exist - they're called "smart" LCD displays and are typically used in embedded devices. The display maintains its own framebuffer, and the host's LCD controller sends partial updates as needed and then shuts down. It saves some power and offloads a lot of the logic to the screen.

Of course, there's no standard for them, and each LCD display has its own command set and connection protocol.

Re:Finally (0)

Anonymous Coward | 1 year,13 days | (#45170489)

Why would that make any difference? In an APU setup the "wire" is not the limiting factor.... It's the APU itself.

Re:Finally (1)

Reliable Windmill (2932227) | 1 year,13 days | (#45170607)

My point was not concerning what type of connection you have to the screen, but that you can save memory bandwidth and power by only sending to the screen what needs to be updated.

Re:Finally (1)

Solandri (704621) | 1 year,13 days | (#45171503)

Now we just wait until they finally figure out how to employ a smarter protocol than sending the whole frame buffer over the wire when only a tiny part of the screen has changed.

Wouldn't that depend on whether it's faster to just send the entire framebuffer over the wire, or to do a pixel-by-pixel compare between the current framebuffer and the previous one to figure out which parts have changed?

This sort of streaming compression makes sense when bandwidth is limited, like back when people used dialup to access the Internet. Compressing photos on the fly back then could speed up web browsing. But HDMI 1.2 has sufficient bandwidth to transmit 1080p @ 60 Hz, 1.3 roughly doubles that, and 1.4 can transmit 3840x2160 @ 30 Hz. The latency your compression method adds would be acceptable if you're trying to do something like view a computer's screen on your tablet over wifi via remote desktop. But for a monitor connected to a computer by an HDMI cable, it's presently unnecessary.

Makes no sense. (0, Informative)

Anonymous Coward | 1 year,13 days | (#45169481)

MMMMkay.... even if LCDs don't have an explicit refresh rate, the interface standards DO.

On the wire, be it VGA, DVI, or HDMI, they all, without exception, must conform to timing standards. That includes a pixel clock for the digital transports, hsync/vsync signals for VGA, and always fixed sizes for the blanking and active regions. This maps to a fixed frame rate which the input hardware of the monitor synchronizes to. You can't just arbitrarily decide to send frames at variable rates - it doesn't work that way.

NVidia would have to change the whole industry for this - it could work, but we're talking about new interface standards here... I expect to see an NVidia-branded LCD which supports this, which of course will cost 4 times as much as a standard LCD.

Re: Makes no sense. (0)

Anonymous Coward | 1 year,13 days | (#45169615)

and they already have buy-in from multiple monitor manufacturers. So yes, adoption will be slow, like any newfangled whiz-banger. If it works well, people will use it.

Re:Makes no sense. (2)

Richy_T (111409) | 1 year,13 days | (#45169791)

According to TFA, that's exactly what you'll be seeing.

This technology will be available soon on Kepler-based GeForce graphics cards but will require a monitor with support for G-Sync; not just any display will work. The first launch monitor is a variation on the very popular 144 Hz ASUS VG248QE 1920x1080 display

Re:Makes no sense. (1)

Richy_T (111409) | 1 year,13 days | (#45169803)

And

UPDATE 2: ASUS has announced the G-Sync enabled version of the VG248QE will be priced at $399.

So you're not far off there, either.

Re:Makes no sense. (1)

Forbo (3035827) | 1 year,13 days | (#45169897)

Considering the original version is priced at $280, that would put it closer to being 1.4x the cost of the original.

Re:Makes no sense. (1)

Richy_T (111409) | 1 year,13 days | (#45169935)

That's probably fair :) Though it's a little pricey by today's standards to begin with.

Re:Makes no sense. (0)

Anonymous Coward | 1 year,13 days | (#45169929)

That's a pretty standard price for a 120 Hz monitor, though.

Re:Makes no sense. (1)

khellendros1984 (792761) | 1 year,13 days | (#45170229)

Well, the original VG248QE seems to be selling on Newegg for about $280. So they're talking about a retail price roughly 1.5x as much as the hardware they're basing it on.

Re:Makes no sense. (1)

Tailhook (98486) | 1 year,13 days | (#45170081)

NVidia would have to change the whole industry for this

We can't have one of the largest purveyors of video hardware influencing display standards now, can we?

NVidia isn't some startup. They put GPUs into millions of devices; desktops, laptops, tablets, consoles, phones, etc. When they offer a new technique for syncing video the world is going to have a look. That doesn't mean it must be accepted, but it won't be dismissed out-of-hand.

Besides, given an advanced bus like DisplayPort I suspect this might amount to a simple video-chip-to-display negotiation with a transparent fallback. DisplayPort devices can be Ethernet peers, among other things; they can coordinate anything they wish. So promoting a display connection to a new syncing technique should be transparent and non-disruptive for all hardware, past and future, without some brand new interface standard.

Turns out that's exactly what is happening. From Anandtech [anandtech.com] :

Meanwhile we do have limited information on the interface itself; G-Sync is designed to work over DisplayPort (since it’s packet based), with NVIDIA manipulating the timing of the v-blank signal to indicate a refresh. Importantly, this indicates that NVIDIA may not be significantly modifying the DisplayPort protocol, which at least cracks open the door to other implementations on the source/video card side.

This will require Nvidia gear inside the monitor (1)

Dr. Spork (142693) | 1 year,13 days | (#45169667)

I would feel pretty good about this if it were being proposed as some sort of standard, but from the blurb, it looks like a single-vendor lock-in situation. You will need an Nvidia graphics card to make it work, but your monitor will also need an Nvidia circuit board to regulate the framerate. The only value of this kind of variable framerate technology is for gaming. This means that the needed circuitry will appear only in monitors that are meant specifically for gamers. This means that they will be segmented off from the larger LCD market, and probably priced for "the gamer who has everything". But then again, this kind of gamer can just buy some fancy 60Hz monitors and a graphics card that doesn't tear frames because it has enough power. The latter course is probably cheaper. I know that lots of PC gamers now use big LCD televisions as their desktop monitors, or multi-monitor setups. Somehow I don't see these people upgrading their monitors so that they can look decent even at lower framerates. They would just buy the sort of graphics card that doesn't give them lower framerates.

Re:This will require Nvidia gear inside the monito (1)

Nemyst (1383049) | 1 year,13 days | (#45169955)

Interestingly, Nvidia will be providing the G-sync chips by themselves [anandtech.com] , allowing people to mod their monitor to install the chip on them. I'm not sure just how compatible this would be, but it might allow you to upgrade your existing monitors with G-sync support or get someone to do it for you, depending on your capabilities and willingness to risk your monitor.

Re:This will require Nvidia gear inside the monito (1)

petermgreen (876956) | 1 year,13 days | (#45171033)

I don't see anything about them selling chips to end users, just stuff about them selling upgrade modules. I guess each module will be specific to one make/model of monitor and will require cooperation of the monitor manufacturer to produce.

When did PC+TV take off? (1)

tepples (727027) | 1 year,13 days | (#45170039)

I know that lots of PC gamers now use big LCD televisions as their desktop monitors

When did this come to be the case? A few years ago, people were telling me that almost nobody does that [slashdot.org] .

Re:When did PC+TV take off? (1)

PhrstBrn (751463) | 1 year,13 days | (#45170311)

You're confusing HTPCs and using panels designed as TVs for computer monitors. We're talking about people who stick a 32" monitor (or larger) on the wall in front of their desk in their office, vs putting a computer under the TV in your living room. While the components are the same, the ergonomics are different.

Re:When did PC+TV take off? (1)

Stormwatch (703920) | 1 year,13 days | (#45170793)

A few years ago

An important detail there. Back then, as I recall, HD TVs (1280×720), or even lower res, were very common. While that's okay for watching movies or TV shows from the couch, that's awful for a large screen sitting within arm's reach. And even now, most TVs are only Full HD (1920x1080), no matter the size, while computer monitors often go higher; 27" monitors at WQHD (2560x1440) are getting quite popular, I heard.

Re: When did PC+TV take off? (0)

Anonymous Coward | 1 year,13 days | (#45171219)

When the LCD panels were all made in 16:9 1080p anyway, the only

Will it work with game consoles? (0)

Anonymous Coward | 1 year,13 days | (#45169679)

What I want to know is whether this tech works with game consoles too. Does the monitor have to be paired with an Nvidia GPU only? Input lag is a rather big deal in console games on LCD TVs. If this helps eliminate or reduce input lag on monitors, that would be great.

Re: Will it work with game consoles? (0)

Anonymous Coward | 1 year,13 days | (#45169885)

No, it needs specific hardware. Also, when you can have a gaming PC why would you ever want a glorified netbook with a laptop video card glued to it?

Re: Will it work with game consoles? (1)

tepples (727027) | 1 year,13 days | (#45170093)

Also, when you can have a gaming PC why would you ever want a glorified netbook with a laptop video card glued to it?

Because if you have more than one gamer in the household, you don't always want to have to buy two to four gaming PCs and two to four copies of each game. One console, one copy of each game, and two to four controllers are cheaper, even with console maker markup on the games. Even though console games are somewhat less likely to support same-screen multiplayer than they used to, I'm under the impression that console games are still more likely to support it than PC games. (And no, same-screen doesn't necessarily mean split-screen, especially for things like beat-em-ups, non-first-person shooters, and fighting games.)

G-String (0)

Anonymous Coward | 1 year,13 days | (#45169745)

Hope it comes with a sexy bra...

VSync still needed for LCDs (0)

Anonymous Coward | 1 year,13 days | (#45169951)

While there is no electron gun sweeping the screen left to right, top to bottom, the memory holding the frame buffer is still scanned and sent to the display in the same way, and you should not write to it during that period.

Variable rate vsync (1)

tepples (727027) | 1 year,13 days | (#45170113)

G-Sync is still vsync, just at a variable rate that matches the rate that new pictures are available.

Seems like a good idea (1)

grmoc (57943) | 1 year,13 days | (#45170001)

G-sync (i.e. sync originated by the graphics card) seems like a good idea.
It:
    allows single or multiple graphics cards within a computer to emulate genlock for multiple monitors, so that the refresh rates and refresh times of those monitors interact properly
    allows frame rendering and output to be synchronized, reducing display lag, which is important for gamers and realtime applications
    allows a graphics card to select the highest possible framerate (possibly under 60 Hz) when displaying higher resolutions (e.g. 4K or 8K) on cables/interfaces that don't allow for a full 60 Hz bitrate

Good stuff.

Re:Seems like a good idea (1)

petermgreen (876956) | 1 year,13 days | (#45171107)

i.e. sync originated by the graphics card

Sync has always been originated by the graphics card so no special assistance from the monitor would be needed to lock the framerates and timings of multiple monitors together.

The problem is that traditionally monitors don't just use the sync signals to sync the start of a frame/line, they also use them as part of the process of working out what geometry and timings the graphics card is sending. Furthermore some monitors will only "lock" successfully if the timings are roughly what they expect. So you can't use weird-ass framerates and you can't vary the framerate dynamically without causing glitches.

DisplayPort, unlike other display interfaces, is packet based, so AFAICT it shouldn't suffer this limitation, but I guess for compatibility reasons they don't want to do anything weird unless they know the monitor at the other end will support it.

11520 ? (1)

Adm.Wiggin (759767) | 1 year,13 days | (#45170291)

It's over 9000! [youtube.com]

(Oblig.)

They do not need to wait but they do. (0)

Anonymous Coward | 1 year,13 days | (#45170403)

Pixels on LCD monitors do not need to wait for above lines of pixels to be drawn, but they do.

Perhaps it's the lack of sleep but I can't understand what this sentence is saying.

Re:They do not need to wait but they do. (0)

Anonymous Coward | 1 year,13 days | (#45171197)

I don't need to reply to dumb posts, but I do it anyway.

Why does 25fps on a computer game seem slow? (0)

mark-t (151149) | 1 year,13 days | (#45170779)

... When movies are shown at 24fps and the motion still seems fluid?

Re:Why does 25fps on a computer game seem slow? (1)

cheetah_spottycat (106624) | 1 year,13 days | (#45170897)

Because skilled directors and camera operators have learned, over the last 100 years of movie-making history, which kinds of camera movements work, and they painstakingly avoid those which don't work at low framerates.

motion blur (1)

Chirs (87576) | 1 year,13 days | (#45170947)

If you freeze a movie frame shot at 24fps you'll see that moving objects are blurry. And in a fast pan it still looks anything but fluid.

Requires New Monitors Too (1)

locopuyo (1433631) | 1 year,13 days | (#45171045)

Something the summary fails to point out is this will not work with existing LCD monitors. The monitors will have to have special hardware that supports G-Sync.

Standard LCD monitors and TVs update the pixels the same way old CRTs do. They start from the top and update line by line until they reach the bottom.

It is actually a little surprising they haven't done something like this for phone and laptop screens yet. The only thing that stopped them from doing it with the first LCDs was compatibility with existing video signals.

CRTs? (1)

Delusion_ (56114) | 1 year,13 days | (#45171303)

Given how few CRT monitors there are in the wild (let alone on those computers that are running new hardware), I'm not sure why the CRT vs LCD distinction was noteworthy.

Re:CRTs? (1)

TyFoN (12980) | 1 year,13 days | (#45171505)

I have friends who play CS on CRT monitors for the higher refresh rates you get on them.
I guess those are the people most interested in this :)

Re:CRTs? (1)

Delusion_ (56114) | 1 year,13 days | (#45171585)

Ouch. I don't envy them. Of course, ditching 21" CRTs for 27" LCDs didn't really save me any physical desktop space. Clutter multiplies to fit its container.
