
Standards Group Adds Adaptive-Sync To DisplayPort

Unknown Lamer posted about 5 months ago | from the variable-framerate-considered-alright dept.

Displays 82

MojoKid (1002251) writes "Over the past nine months, we've seen the beginnings of a revolution in how video games are displayed. First, Nvidia demoed G-Sync, its proprietary technology for ensuring smooth frame delivery. Then AMD demoed its own free standard, dubbed FreeSync, that showed a similar technology. Now, VESA (the Video Electronics Standards Association) has announced support for "Adaptive Sync" as an addition to DisplayPort. The new capability will debut with DisplayPort 1.2a. The goal of these technologies is to synchronize output from the GPU and the display to ensure smooth delivery. When this doesn't happen, the display will either stutter due to a mismatch of frames (if V-Sync is enabled) or visibly tear (if V-Sync is disabled). Adaptive Sync is the capability that will allow a DisplayPort 1.2a-compatible monitor and video card to perform FreeSync without needing the expensive ASIC that characterizes G-Sync. You'll still need a DP 1.2a cable, monitor, and video card (DP 1.2a monitors are expected to ship by year's end). Unlike G-Sync, however, a DP 1.2a monitor shouldn't cost any additional money: the updated ASICs being developed by various vendors will bake the capability in by default."


frist psot (-1)

Anonymous Coward | about 5 months ago | (#46987347)

frist psotddccccc

It's a great idea (2)

Bryan Ischo (893) | about 5 months ago | (#46987367)

I have to wonder why the idea of adaptive vsync wasn't thought of earlier or implemented into display standards earlier. It just seems like such an obvious idea once you've heard of it. Surely someone else in the graphics/display industry must have had the idea before NVidia?

I can't think of any downsides to having this technology; it's pure upside as far as I can tell. Although, I guess I could imagine that there could be some technical downsides, depending upon how displays are typically implemented. For an LCD, I can imagine that knowing the frequency ahead of time allows the LCD panel to perhaps "pipeline" some of its operation, allowing faster grey-to-grey transitions. For example, if the display knows that the next frame is going to come at exactly X milliseconds in the future, then perhaps it could start transitioning all pixels to grey at time X - N, where N is the average time it takes for pixels to transition to grey, and then when the frame is received, it could then transition all pixels from grey to the next frame pixel colors faster. With adaptive vsync, the display would not be able to do this; it would have to start the transition from frame M pixel values to frame M + 1 pixel values only as soon as frame M + 1 becomes available.

Not being able to play grey-to-grey optimization games is I guess a possible downside of adaptive vsync; but I suspect it's a pretty small downside. Aside from gamers who want to see "the next frame" with the smallest latency possible, I don't know that anyone is really going to care much about that potential downside.
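A toy sketch of that timing idea in Python (the frame period X and grey-transition time N below are invented numbers, not anything from a real panel): with a fixed refresh the panel could in principle start the grey transition early, whereas with a variable frame time it can only react once the frame actually arrives.

```python
# Toy sketch of the grey pre-transition idea above, with invented numbers.
X_MS = 16.67   # known frame period on a fixed-rate display (the "X" above)
N_MS = 4.0     # assumed average pixel-to-grey transition time (the "N" above)

def grey_start_fixed():
    """Fixed rate: the panel can begin the grey transition N ms before the frame is due."""
    return X_MS - N_MS

def grey_start_adaptive(frame_arrival_ms):
    """Adaptive sync: arrival time is unknown in advance, so work starts only on arrival."""
    return frame_arrival_ms

print(grey_start_fixed())          # 12.67 ms -> head start before the next frame
print(grey_start_adaptive(20.0))   # 20.0 ms  -> no head start possible
```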

Re:It's a great idea (5, Interesting)

wonkey_monkey (2592601) | about 5 months ago | (#46987461)

For example, if the display knows that the next frame is going to come at exactly X milliseconds in the future, then perhaps it could start transitioning all pixels to grey at time X - N, where N is the average time it takes for pixels to transition to grey, and then when the frame is received, it could then transition all pixels from grey to the next frame pixel colors faster.

What's the reason for transitioning to grey? Is it to minimise the likely "distance" (time) to the new colour?

Won't most pixels, most of the time, remain a similar colour in the next frame? I don't understand the ins and outs, but wouldn't you lose as much as, if not more than, you'd gain?

Re:It's a great idea (-1)

Anonymous Coward | about 5 months ago | (#46987907)

Bryan doesn't know what he's talking about. Ignore.

Re: grey (0)

Anonymous Coward | about 5 months ago | (#46988251)

Skipping grey would require enough frame buffer to hold 2 frames. If they blank to grey however, they can just reuse a single frame buffer.

Re: grey (2)

kimvette (919543) | about 5 months ago | (#46989277)

There goes the great contrast ratio of monitors. Just as we're mourning the loss of vertical resolution thanks to the economics of reusing 16:9 television panels, we'll be mourning the good old days of nice dark blacks and well-saturated colors if they were to completely grey out the screen between each frame. Thanks but no thanks.

Re:It's a great idea (1)

StripedCow (776465) | about 5 months ago | (#46988897)

I guess his idea is that you can transition from one frame to the next *while* the frame is building as on a CRT.
Given two colors A and B, you could write the resulting color as
C = A*(1-alpha) + B*alpha
As alpha goes from 0 to 1, the color of the pixel would go from C=A to C=B.
This is then applied to the scanline of the CRT, where alpha is increased from 0 to 1 as the scanline progresses.
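A minimal sketch of that blend in Python (toy pixel values and scanline width, just to show the idea): as alpha runs from 0 to 1 along the line, each pixel moves from the old frame's colour A to the new frame's colour B.

```python
# Minimal sketch of the blend described above: C = A*(1-alpha) + B*alpha per channel.

def blend(a, b, alpha):
    """Linearly interpolate between colour tuples a and b."""
    return tuple(ca * (1 - alpha) + cb * alpha for ca, cb in zip(a, b))

old_pixel = (255, 0, 0)    # red in frame M
new_pixel = (0, 0, 255)    # blue in frame M+1
width = 5                  # toy scanline width

for x in range(width):
    alpha = x / (width - 1)          # alpha increases as the scan progresses
    print(x, blend(old_pixel, new_pixel, alpha))
```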

Re:It's a great idea (0)

Anonymous Coward | about 5 months ago | (#46988937)

If you knew the future and what color is gonna show there before it's delivered to you

Re:It's a great idea (1)

wonkey_monkey (2592601) | about 5 months ago | (#46988957)

You patent it, I'll rustle up some investors, and before we know it we'll be sitting onna beach, earning twenny percent.

Re:It's a great idea (2)

chihowa (366380) | about 5 months ago | (#46988915)

There's no reason to go through grey between every frame. As you say, most pixels will remain the same color, and making every pixel grey at the V-sync frequency will just make the whole display strobe and look washed out.

Grey-to-grey is just an easy thing to test for benchmarking displays. You don't actually do that in normal operation.

Re:It's a great idea (1)

ZorglubZ (3530445) | about 5 months ago | (#46993253)

I was under the impression that it used to be measured as B-W-B (or W-B-W), and only when the quality of displays increased did they switch to measuring GtG (since it was the lower number).

Re:It's a great idea (0)

Anonymous Coward | about 5 months ago | (#46987507)

Aside from gamers who want to see "the next frame" with the smallest latency possible, I don't know that anyone is really going to care much about that potential downside.

Nope, gamers want the lowest latency possible to a visible difference, not to the correct output. It doesn't matter whether the pixel reaches its final value faster; you just want to see where the change happens.
Going for gray first only reduces the maximum time it takes to reach the final state, and it will reduce color shifting during large transitions between different colors.
Possibly the faster transition could be undesirable when doing color balancing during video editing, but I highly doubt that, and during video editing tearing and/or stuttering is a much bigger problem anyway.

Re:It's a great idea (3, Informative)

michelcolman (1208008) | about 5 months ago | (#46987869)

I have to wonder why the idea of adaptive vsync wasn't thought of earlier or implemented into display standards earlier. It just seems like such an obvious idea once you've heard of it. Surely someone else in the graphics/display industry must have had the idea before NVidia?

It's just a vicious compatibility circle.

CRTs have a fixed frame rate for technical reasons.
Therefore graphics cards have a fixed frame rate to support CRTs
Therefore LCD displays have a fixed frame rate to support graphics cards
Therefore graphics cards continue to have a fixed frame rate
etc...

New stuff has to remain compatible with old stuff, so nobody even thinks of breaking the circle. Until now, fortunately.

Re:It's a great idea (1)

Blaskowicz (634489) | about 5 months ago | (#46990329)

What's more, even sending the picture as a set of scanlines is no longer needed. With recent versions of eDP (embedded DisplayPort) and DP you have the option to send it as little square chunks, send only the chunks that have changed, and even use compression (for extremely high resolutions or for power-saving purposes).

Re:It's a great idea (1)

ultranova (717540) | about 5 months ago | (#46988167)

Not being able to play grey-to-grey optimization games is I guess a possible downside of adaptive vsync; but I suspect it's a pretty small downside. Aside from gamers who want to see "the next frame" with the smallest latency possible, I don't know that anyone is really going to care much about that potential downside.

Especially since it's actually an upside to most people: the gray-to-gray "optimization" introduces flicker, at least based on your description.

Re:It's a great idea (2)

neilo_1701D (2765337) | about 5 months ago | (#46988705)

The Amiga did this sort of stuff when it first came out. You could create a Copper (the display coprocessor) list that was synced to the vscan quite easily; "beam-synced blitting" I think was the name. Basically, you built your copper list so the screen writes were always just behind the video beam so you could have flicker-free drawing.

Re:It's a great idea (5, Informative)

Immerman (2627577) | about 5 months ago | (#46989127)

That's clever in that they presumably accomplished it without a back buffer, back when RAM was expensive, but basically you're describing the vsync-based rendering that has been the standard for decades: wait until the screen starts updating (the vsync), then start working on the next frame to maximize rendering time. It's nothing like G-Sync/FreeSync/Adaptive Sync, though - you still have the issue that if your screen updates at 60FPS you have exactly 1/60 of a second (~16.67ms) to render each frame.

Adaptive sync means that if you finish rendering an easy frame in only 14ms, the screen can display it immediately instead of waiting an extra 2.67ms for the next scheduled refresh. Even more importantly, if a complex frame takes 20ms to render, you don't miss the refresh and have to wait an extra 13.3ms for the next scheduled one, wasting almost an entire frame - instead the screen can hold off on refreshing until the rendering is complete.

TLDR: Adaptive sync means that if you enter a graphically intensive area that you can only render at 50fps, then your monitor will automatically refresh at 50fps, instead of continuing to try to refresh at 60fps and having to spend every other frame waiting for the rendering to finish, for an effective framerate of only 30fps (or possibly a jittery 40fps with double-buffering: update, update, wait; update, update, wait; ...).
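A quick Python toy using the numbers from this comment (the 14ms and 20ms render times are just the examples above): it shows how long a finished frame sits idle before it can appear, on a fixed 60Hz refresh versus an adaptive one.

```python
import math

# Toy numbers from the comment above: a 60 Hz panel refreshes every ~16.67 ms.
REFRESH_MS = 1000 / 60

def wait_for_display(render_ms, adaptive):
    """Extra milliseconds between 'frame is ready' and 'frame is on screen'."""
    if adaptive:
        return 0.0                      # adaptive sync: panel refreshes on demand
    next_tick = math.ceil(render_ms / REFRESH_MS) * REFRESH_MS
    return next_tick - render_ms        # fixed refresh: wait for the next tick

for render_ms in (14, 20):
    print(render_ms, "ms frame waits",
          round(wait_for_display(render_ms, False), 2), "ms at fixed 60 Hz vs",
          wait_for_display(render_ms, True), "ms with adaptive sync")
```

Running it reproduces the 2.67ms and 13.33ms waits described above.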

Re:It's a great idea (1)

neilo_1701D (2765337) | about 5 months ago | (#46989371)

Oh wow; thanks for that explanation!

+1 informative if I could mod in this thread now.

Wow (1)

KingMotley (944240) | about 5 months ago | (#46991797)

Probably the best explanation of the problem and how adaptive sync works that I've seen. Well done!

Re:It's a great idea (1)

aix tom (902140) | about 5 months ago | (#46991647)

I remember that one, that was a neat feature. It also meant that you were able to split the screen horizontally, having different resolutions and colour depths for the "top part" and "bottom part" of the CRT.

Re:It's a great idea (4, Insightful)

pla (258480) | about 5 months ago | (#46989211)

I have to wonder why the idea of adaptive vsync wasn't thought of earlier or implemented into display standards earlier.

I have to wonder why we still use the concept of sync and porch and blanking interval and even frames, etc at all, when we all now run pixel-addressable digital displays rather than a magnetically confined analog electron beam physically sweeping over a surface.

"Tearing" results from the display updating halfway through a complete refresh. Why the hell do displays still do complete refreshes? No need whatsoever to update anything but the small subset of pixels that have changed. And no need whatsoever to do that in some blessed-from-on-high linear scan pattern from left-to-right top-to-bottom manner, either.

How about if the next gen of video hardware stops pretending it still needs to support CRTs, and we can all move on from caring about metrics like "refresh rate" that haven't meant a damned thing in over a decade?

Re:It's a great idea (0)

Anonymous Coward | about 5 months ago | (#46997361)

I have to wonder why we still use the concept of sync and porch and blanking interval and even frames, etc at all, when we all now run pixel-addressable digital displays rather than a magnetically confined analog electron beam physically sweeping over a surface.

While sync and porch still occur in DVI, they are not used for display alignment as they were in VGA. Instead the rising edge of the display enable signal marks the start of a scanline, and the pixel clock encoded in the pixel data controls the rate of scanout. They exist solely for compatibility with DVI-VGA converters and the defunct digital CRTs that only existed for a couple of years.

"Tearing" results from the display updating halfway through a complete refresh. Why the hell do displays still do complete refreshes? No need whatsoever to update anything but the small subset of pixels that have changed. And no need whatsoever to do that in some blessed-from-on-high linear scan pattern from left-to-right top-to-bottom manner, either.

The panels are charge-coupled devices; it's not possible to refresh a single pixel without refreshing the entire line.

DisplayPort is a packet architecture, and so could support partial refresh if existing scanout hardware were capable of it. DisplayPort makes the existence of hsync/vsync optional, and most eDP panels (notebook displays, etc.) already support panel self-refresh for power saving. The panel itself must still be refreshed or the pixels would fade over time, but this is done by the scanout controller integrated in the display, rather than requiring high-power transmitters in the GPU to transfer the data to the display again. Self-refresh is supported in DisplayPort as an optional feature, but I am unaware of any displays that implement the function. It would, after all, require that the display have RAM to hold the frame for the purposes of self-refresh, and the gain in energy efficiency is hardly worth the added cost for mains-powered devices.

but (-1)

Anonymous Coward | about 5 months ago | (#46987381)

How will this affect climate change? Is it carbon neutral?

I didn't realise they didn't already did that. (1)

serviscope_minor (664417) | about 5 months ago | (#46987397)

I didn't realise this wasn't already a thing. I mean, it made sense with CRTs, since they had an analog PLL for syncing the line sweep to the end-of-line markers. I'm sort of surprised that with digital ones they went to the effort of syncing frame display when there wasn't any data in the input line.

Re:I didn't realise they didn't already did that. (1)

PhunkySchtuff (208108) | about 5 months ago | (#46987439)

I haven't RTFA, but from what I understand of it, it's not syncing the output from the graphics card to the vertical blanking interval on the monitor, it's the other way around. It's running the monitor at a variable frame rate so that if you're running at (say) 60Hz refresh and the next frame takes 1/60th second + a tiny bit, the monitor can hold off painting the new frame until the data is there to paint it, rather than waiting for 2/60th second before displaying an updated frame. Or, if the next frame is ready early, and the monitor can do so, it can paint the new frame early - so the monitor isn't running at 60Hz, it's running in sync with the output of the graphics card.

Re:I didn't realise they didn't already did that. (4, Informative)

gigaherz (2653757) | about 5 months ago | (#46987595)

The protocol used for digital signaling is internally surprisingly similar in concept to the analog equivalent. The idea of "adaptive" sync is that instead of starting a new frame after a fixed exact period, it can be "or later". There's no other technology involved other than allowing a frame to come late.

Re:I didn't realise they didn't already did that. (1)

Immerman (2627577) | about 5 months ago | (#46989141)

Actually from TFA it looks like "or sooner" is also an option, though presumably there's still some monitor-specific minimum.

Re:I didn't realise they didn't already did that. (1)

serviscope_minor (664417) | about 5 months ago | (#46987629)

Yeah, I understand---and I didn't realise they did that.

I mean, a VBI made sense for analog. For digital, I assumed they just dumped the data to the flat panel when they'd either got a frame load from the input source, or when that happens and an additional signal arrives. Seems interesting that they've apparently gone to the effort of running an internal oscillator in the monitor so it dumps it at a fixed rate regardless.

Re:I didn't realise they didn't already did that. (1)

PhunkySchtuff (208108) | about 5 months ago | (#46987677)

Yeah, I always found it strange that even a purely digital flat panel monitor still "emulates" a vertical refresh interval signal...

Re:I didn't realise they didn't already did that. (2)

50000BTU_barbecue (588132) | about 5 months ago | (#46988431)

What's "purely digital" about a LCD? You can have analog VGA inputs, which are digitized in the monitor, then sent over some ridiculously fast serial interface to column driver ICs on the glass... to be converted back to the analog voltages needed to control the LCD shutters.

Guess what? Your LCD monitor has thousands of D/A converters in it!

So for example, a relatively cheap 1680x1050 monitor (like mine) requires 1680x3 = 5040 columns to be driven in the actual glass. Each pixel has RGB, right? Well, those voltages have to come from somewhere!

www.intechopen.com/download/pdf/11273

Column drivers are the most amazing things I've seen in a while. They are bare dies about 2 x 11 mm with hundreds of pins, attached directly to the flex PCB that drives the glass. Each IC contains hundreds of digital-to-analog converters and opamps! It's crazy! There are usually 10 per panel, so each IC drives about 500 lines. You should see the flex PCBs; the traces are so fine you need a magnifying glass to resolve them.

http://oi59.tinypic.com/whmc74... [tinypic.com]

This is as close as I can get this morning. Yes, those traces are so fine they just look like a green patch.

I'd say that means an LCD monitor is more analog than digital, but that's just me.

So what's so strange about a serial device needing synchronization signals anyways?
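The back-of-envelope arithmetic from this comment, as a quick Python check (the count of 10 driver ICs is the figure cited above):

```python
# Numbers from the comment above: a 1680x1050 panel needs one analog column
# line per sub-pixel, spread across ~10 driver ICs.

width, height = 1680, 1050
subpixels_per_pixel = 3              # R, G, B
driver_ics = 10                      # typical count cited above

columns = width * subpixels_per_pixel
print(columns)                       # 5040 column lines on the glass
print(columns / driver_ics)          # ~504 DAC/op-amp channels per driver IC
```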

Re:I didn't realise they didn't already did that. (2)

chihowa (366380) | about 5 months ago | (#46988999)

The pixel addressing of a modern graphics system (GPU to LCD) is purely digital, which is what he meant by "purely digital". Of course there are analog components in the displays, but the signal path is digital.

It seems very inefficient to dump whole frames to the panel at a fixed (or even variable) interval. Why not just change individual pixels only when they are damaged?

Re:I didn't realise they didn't already did that. (2)

50000BTU_barbecue (588132) | about 5 months ago | (#46989155)

Why does it matter if you dump whole frames to the LCD? It's not like the cable is miles long or that transmitting a signal takes so much power compared to the backlight.

You'd just be adding a lot of complexity to arbitrarily refresh a bunch of pixels.

Oh and suddenly programmers are worried about *efficiency*? I doubt it! You'd just be adding complexity to the monitor. Right now a monitor is a 2 dimensional serial to parallel converter. It does the job just fine.

And I'd dispute your assertion that pixels are addressed on an LCD. If you're not using it at its native resolution, what are you addressing? That's purely a concept on the computer side. It's actually the TCON in the monitor that does any "addressing", and it doesn't do anything fancier than a shift register. It's not like you can go back on the line and say "oops, I wanted that pixel to be purple, not yellow, so please address it". By that time, it's too late. Next line!

Re:I didn't realise they didn't already did that. (1)

batkiwi (137781) | about 5 months ago | (#46994607)

Because 99% of the time you care about an entire composited frame, not individual pixels changing.

Re:I didn't realise they didn't already did that. (1)

PhunkySchtuff (208108) | about 5 months ago | (#46994903)

What's "purely digital" about a LCD? For a start, there's nothing in this article talking about VGA. I'm talking about DisplayPort (as is the linked article) which has a signal path from the GPU to the monitor (and if you want to be pedantic about it, the DisplayPort interface on the rear of the monitor) that is purely digital. However, if you really want to take it to it's illogical extreme, even the digital signalling used by DisplayPort is, at it's heart, analogue voltages travelling down a bunch of copper wires.

Either way, the signal path, the communications channel, that still has things like a vertical blanking interval and runs between the GPU and the electronics in the monitor is purely digital.

Re:I didn't realise they didn't already did that. (0)

Anonymous Coward | about 5 months ago | (#46990311)

They want to use as few transistors on the display as possible. It's not about emulating an analog monitor. It's about not having the monitor cost extra, because random access isn't worth the cost.

Re:I didn't realise they didn't already did that. (1)

Charliemopps (1157495) | about 5 months ago | (#46988261)

Yea, I remember having an argument about this with one of those guys who has to buy everything that's new in home entertainment. He had his brand new $800 15" LCD monitor and was telling me how it didn't have a refresh rate, only CRTs did. "That's silly, why would an LCD have a refresh rate?!?" he lamented. Because that's how CRTs worked for the past 30 years, so that's how they designed LCDs. "If it isn't broke, don't fix it" is every engineer's mantra. It's when marketing shows up and tells you they need a reason to get people to buy their stuff rather than someone else's that the engineers start thinking "Well, we could do something about that legacy sync thing..."

There will likely be no visible difference between the two for 99% of the population... just like with most new display options. I suspect Nvidia introduced this as a way to be different, and AMD realized it would be pretty easy to do basically for free and released its open version to neutralize any market advantage Nvidia would have because of it. You also don't want your competitor to set a closed standard you'll eventually have to pay them for.

Re:I didn't realise they didn't already did that. (5, Informative)

50000BTU_barbecue (588132) | about 5 months ago | (#46988667)

It's tragic to hear the kind of nonsense people tell themselves. It's like a cyclist buying a car and saying "that's silly, why would a car have a speed?"

It's the same thing, dingus!

A monitor is just a high-speed serial device. Stuff comes in at some rate. The only reason CRTs had such tight timing requirements was the humongous amount of reactive power flowing in the deflection coils. You can just short them out, but then all that reactive power becomes real (waste) heat. Lots of it. So people didn't do that.

Remember how old Multisync monitors used to click relays as they shifted to different horizontal frequencies? That was the monitor swapping in different capacitors to create the LC tank with the deflection coils. So they could swap the power around between the coil and the cap instead of dissipating it.

But that meant you better be ready to send me those pixels when I'm ready! I can't wait!

There is no such large power being bounced around inside an LCD, it's really just thousands of analog voltages being sent to a glass panel. It can wait a bit, the picture won't fade that quickly. Eventually the capacitor that is formed by the LCD shutter will leak, but that takes time.

Re:I didn't realise they didn't already did that. (0)

Anonymous Coward | about 5 months ago | (#46997397)

There is no such large power being bounced around inside an LCD, it's really just thousands of analog voltages being sent to a glass panel. It can wait a bit, the picture won't fade that quickly. Eventually the capacitor that is formed by the LCD shutter will leak, but that takes time.

No, but there are 2560*1600*3*2 = 24M tiny capacitors in the form of the floating gates in the TFT array of my monitor. Adjustments still have to be made to the drive current depending on the refresh rate to ensure that the same net charge is deposited on all those little capacitors.

That's a somewhat easier task than tank balancing, but still requires some engineering to ensure the color isn't off when the monitor switches refresh rate.

I have worked with LCDs without intelligent scanout generators; they seemed to take about 2 seconds after power loss to go full grey. You could probably drop the refresh rate to 15Hz with only minor degradation. If you drop it to 1Hz, you're certainly going to notice.

What most embedded monitors use now is eDP, which supports panel self-refresh. This doesn't mean that the panel is not refreshed, but rather the video data no longer needs to be transferred from the GPU to the scanout generator continuously. Instead the scanout generator has embedded RAM which buffers the last image and periodically refreshes the display (I would assume at 15Hz or so) while the eDP link is shut down.

Adaptive Sync seems to be just the same thing applied to non-embedded DisplayPort.

Re:I didn't realise they didn't already did that. (1)

50000BTU_barbecue (588132) | about 5 months ago | (#46999243)

Yes, from what I understand, if there's DC on a shutter for too long it chemically degrades the liquid crystal itself. So the TCON NEEDS to refresh the screen no matter what, so it can run its inversion scheme.

Anyways you seem to know a thing or two about LCDs!

Re:I didn't realise they didn't already did that. (0)

Anonymous Coward | about 5 months ago | (#47005583)

Well, I didn't know that.

Re:I didn't realise they didn't already did that. (1)

Bryan Ischo (893) | about 5 months ago | (#46991029)

Read Anandtech's review of the technology. It makes games much, much more playable. It eliminates tearing and stuttering. It's a real thing, not just some hype. Sorry that you've become so jaded to new tech that you think everything is hype, but seriously, this is real.

By the end of the year? (1)

Kokuyo (549451) | about 5 months ago | (#46987411)

So it's gonna hit stores earlier than G-Sync?

Yeah, I'm a bit frustrated... does it show? ;)

Re:By the end of the year? (1)

wonkey_monkey (2592601) | about 5 months ago | (#46987471)

So it's gonna hit stores earlier than G-Sync?

Yes, but only compatible stores will be opening their doors early enough for you to buy it first.

Re:By the end of the year? (0)

Anonymous Coward | about 5 months ago | (#46987535)

No, that's old school. We have doors that sync with the current flow of customers now.

Re:By the end of the year? (1)

Immerman (2627577) | about 5 months ago | (#46989263)

I would think that's likely. After all, G-Sync promised to add $150 to the price tag of a $200 monitor for a feature that would really only appeal to hard-core gamers (so add some extra $$$ for limited-production-run overhead) - that's a recipe for a niche product of limited appeal to manufacturers. Adaptive Sync promises to offer basically the same features for no added production cost, which makes for far easier integration into the production line - just put it in everything; there's no downside.

Useful (1)

arnero (3539079) | about 5 months ago | (#46987435)

CRTs do not flicker if they are refreshed at least every 10 ms. There may be a problem with intensity fluctuations. LCD panels do not wash out if they are refreshed at least every 30 ms? Since the desktop runs in 3D today, this is useful for everybody. You could even watch movies at 48 fps, PAL at 50 fps, NTSC at 59.997 fps fullscreen. No more triple buffering! Maybe we could even get variable timing for the horizontal refresh in order to calculate post-processing on the fly, like on the Game Boy.
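A rough Python illustration of the playback point (assuming an ideal 24fps film and a fixed 60Hz panel): on a fixed refresh the frames fall onto the familiar uneven 3:2 cadence, while an adaptive-sync panel could simply refresh once per film frame.

```python
import math

# On a fixed 60 Hz panel, 24 fps film frames land on an uneven 3:2 cadence;
# an adaptive-sync panel could refresh every ~41.7 ms instead.

REFRESH_MS = 1000 / 60    # fixed-panel refresh period
FILM_MS = 1000 / 24       # ideal spacing of film frames

for n in range(5):
    ideal = n * FILM_MS
    # next available 60 Hz tick at or after the ideal time (rounding guards float noise)
    fixed = math.ceil(round(ideal / REFRESH_MS, 6)) * REFRESH_MS
    print(f"frame {n}: ideal {ideal:6.1f} ms, fixed 60 Hz {fixed:6.1f} ms")
```

The printed intervals on the fixed grid alternate between ~50ms and ~33ms, which is the pulldown judder you notice in panning shots.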

Re:Useful (1)

wonkey_monkey (2592601) | about 5 months ago | (#46987449)

NTSC at 59.997 fps

It's 59.94 fps, you insensitive clod!

Re: Useful (1)

Anonymous Coward | about 5 months ago | (#46987481)

Actually, it's precisely 60000/1001 fields per second :-)

Re:Useful (1)

cheater512 (783349) | about 5 months ago | (#46987569)

LCDs refresh the entire screen at once, no scanning.
They also have their own frame buffer. You could update it once a second if you wanted and it would work fine.

Re:Useful (1)

Anonymous Coward | about 5 months ago | (#46987673)

LCDs refresh the entire screen at once, no scanning.

This is not true:
http://www.youtube.com/watch?v=nCHgmCxGEzY

Re:Useful (0)

Anonymous Coward | about 5 months ago | (#46989717)

Those are some amazing claims. Care to cite a source? Your ass doesn't count.

Re:Useful (0)

Anonymous Coward | about 5 months ago | (#46992281)

Oh jesus, I just visited your website... it really fits with your post: full of crap!

CRTs always flicker (1)

Immerman (2627577) | about 5 months ago | (#46989605)

Actually, CRTs flicker regardless of refresh rate; it's just that if they flicker fast enough our eyes don't register it visually (but may still experience added eye strain). If you film a CRT with a high-speed camera you can actually see the electron gun racing across the screen, and the first-row phosphors stop glowing long before the beam reaches the bottom rows. Here's an example of a CRT and an LCD side by side, where you can see the effect clearly: https://www.youtube.com/watch?... [youtube.com]

I love the idea of video playback at native speeds, I hadn't thought of that application. It could also be great for those speed-reading apps that flash words on the screen - you could gradually increase speed across the spectrum, instead of getting those last huge steps of 900wpm->1200->1800->3600 on a 60Hz screen.
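Those steps fall straight out of the fixed grid: on a 60Hz screen each word must be shown for a whole number of refreshes, so the reachable speeds are 3600/n words per minute. A quick Python check:

```python
# Each word is shown for a whole number of 60 Hz refreshes, so the
# achievable speeds are 60*60/n words per minute.
REFRESH_HZ = 60

available_wpm = [REFRESH_HZ * 60 / frames for frames in range(1, 6)]
print(available_wpm)   # [3600.0, 1800.0, 1200.0, 900.0, 720.0] -- huge jumps at the top end
```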

Re:Useful (0)

Anonymous Coward | about 5 months ago | (#46997431)

You could even watch movies at 48 fps, PAL at 50 fps, NTSC at 59.997 fps fullscreen.

Already possible. Download Media Player Classic; it supports dynamic pixel clock timing adjustment. It has always been possible to make small adjustments to the pixel clock and sync timing to servo the video refresh rate to exactly match the audio sample rate. What Adaptive Sync allows is sudden discontinuous jumps in field rate, which would cause a standard monitor or TV to lose sync, at which point it blackens the screen until it detects a stable clock. You can surely do the same thing on Linux/X11, but you need to have/hack a graphics driver which supports the Xvidtune extension.

My Panasonic plasma TV will accept down to about 45fps as 50fps before it loses sync, so watching films at 48fps is possible, though my TV supports scan-doubling and a 1080p24 mode, as I would guess most televisions do now, so there's little point in 48fps when most films are 24fps (The Hobbit excepted).

I also have an Altera Cyclone V GX devkit, with a 19" Lenovo IPS screen hooked up over DVI running pattern-generator logic. Through experimenting with the PLL, it will sync at arbitrary field rates from about 40fps to 78fps, and even report the correct field rate in the OSD, though at one point it was reporting 640x480 as 1280x480 (turns out I had the wrong pixel clock divider programmed into the HDMI transmitter). What does happen is that it loses sync if the pixel clock is swept by more than about 1-2%.

What about normal TVs (1)

Anonymous Coward | about 5 months ago | (#46987519)

You know, for us filthy casual console players?

Re:What about normal TVs (1)

N3x)( (1722680) | about 5 months ago | (#46987623)

Well, you'd have to wait for a new xbone/ps4/wiiuu/toaster to come to market in about 10 years or so. By then I expect a TV, a monitor, and a PC to be pretty much the same device.

Re:What about normal TVs (1)

Immerman (2627577) | about 5 months ago | (#46989783)

Why on Earth would you want to integrate your computing power (which obeys Moore's Law and will hence be obsolete by the time you finish reading this), with your large expensive TV/monitor which is extremely unlikely to see substantial improvements over the course of a decade?

I mean, sure, I could spend several hundred bucks to upgrade my 40" LCD to a thinner LED backlit model - but aside from a bit of a reduction in weight and power consumption, what would my money buy me? It's lasted through 3 system upgrades and is still going strong - if it were integrated with the PC I'd only have been able to afford *maybe* two upgrades for the same total price.

Re: What about normal TVs (1)

N3x)( (1722680) | about 5 months ago | (#46990503)

I'm not saying it's the right way to go. I'm just saying that TVs get smarter, PCs get integrated into monitors, and monitors become larger and larger. I'm just hoping for open standards enabling a sort of local cloud, so that me having a nice PC makes games run better on my TV.

Re:What about normal TVs (1)

Sockatume (732728) | about 5 months ago | (#46987741)

You can wait for a revision of the consoles that adds Displayport (unlikely), or a new version of HDMI that supports this (more likely) and a version of the consoles that supports that new HDMI (contingent but plausible), but in either case you'd have to shell out for a new TV, which seems to be a rather fundamental obstruction.

I'd be shocked if new portables (inc. phones and tablets) weren't using this within a few years though.

Re:What about normal TVs (1)

N3x)( (1722680) | about 5 months ago | (#46987991)

I was under the impression that many handhelds and laptops already have these kinda systems for power-saving reasons. It just hasn't been part of the standard as these devices all have proprietary interfaces between graphics hardware and screen. The demo of this that AMD showed ran on standard laptops.

Re:What about normal TVs (0)

Anonymous Coward | about 5 months ago | (#46997493)

HDMI is very unlikely to support this.

HDMI uses a TMDS transport very similar to DVI's (the difference is that HDMI transmits data blocks during the blank period over the TMDS link, while DVI does not), with each pixel transmitted in real time with an embedded clock (8b/10b encoding), whereas DisplayPort transmits packets containing subimages, which are buffered for a (very) small period prior to display. Inside the scanout generator in the display, that clock is recovered by locking the incoming TMDS clock with a locally generated PLL clock, so large discontinuous jumps in the pixel clock will cause loss of synchronisation as the scanout generator adjusts its PLL and waits for lock. Adaptive Sync works for DisplayPort because the data clock is independent of the sync rate, and there is no pixel clock (as far as DisplayPort is concerned, anyway).

It might be possible to skew the timing by radical adjustment of the blank periods while maintaining the same pixel clock. I have an FPGA board with an HDMI port here, so I can in fact test what happens in this scenario, at least with the displays I have at hand. I wouldn't expect this to be supported universally.

I suspect what will happen is the display will detect the shifting horizontal and vertical sync period and reset the scanout generator until it is stable again. I'll get back to you if I end up testing it.
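For what it's worth, a rough Python calculation of that blank-period idea (assuming the common 1080p60 timing of a 148.5MHz pixel clock with a 2200x1125 total raster as the starting point): holding the pixel clock constant and stretching the vertical blanking lowers the refresh rate.

```python
# With the pixel clock held constant, stretching the vertical blanking
# lowers the refresh rate: refresh = pixel_clock / (h_total * v_total).
# Figures below are the standard 1080p60 CEA timing (assumed for illustration).

PIXEL_CLOCK_HZ = 148.5e6      # typical 1080p60 pixel clock
H_TOTAL = 2200                # active 1920 + horizontal blanking
V_TOTAL_NOMINAL = 1125        # active 1080 + vertical blanking

def refresh_hz(v_total):
    return PIXEL_CLOCK_HZ / (H_TOTAL * v_total)

print(round(refresh_hz(V_TOTAL_NOMINAL), 2))   # ~60.0 Hz
print(round(refresh_hz(1350), 2))              # stretched vblank -> ~50.0 Hz
```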

vsync (-1)

Anonymous Coward | about 5 months ago | (#46987857)

"What's vsync?" -- X11 developers

About bloody time... (2)

Dan Askme (2895283) | about 5 months ago | (#46988281)

This tech has been a long time overdue, nothing else to say.

DisplayPort? (1)

sabbede (2678435) | about 5 months ago | (#46988655)

The only place I have EVER seen a DisplayPort is on my laptop.

My video cards are DVI and HDMI, monitors are DVI and VGA...

Re:DisplayPort? (0)

Anonymous Coward | about 5 months ago | (#46989009)

I see them on monitors and video cards all the time. My PC here at work is driving two LCDs over display ports. The PC has two display port outputs and one VGA (that isn't used).

Re:DisplayPort? (1)

Blaskowicz (634489) | about 5 months ago | (#46990687)

It's still not to be found on most low-end and even midrange hardware. On motherboards, it's a feature you have to hunt for, especially for use with Intel graphics (and to have a couple of them, that's only found on very rare mobos with Thunderbolt). Even among graphics cards, the newly released GTX 750 and 750 Ti don't have it, save for a model from one specific vendor that is not sold in Europe as far as I know. The cheaper ones tend to have an HDMI/VGA/DVI triad.

Re:DisplayPort? (1)

Blaskowicz (634489) | about 5 months ago | (#46990753)

Well, I found the card on Amazon. It still has one DisplayPort rather than two. I'd be happy if they crammed two mini-DP, one HDMI, and one VGA onto a single row (and included a mini-DP to single-link DVI adapter in the package).

Re:DisplayPort? (1)

Immerman (2627577) | about 5 months ago | (#46989853)

Look around for it - you might be surprised at the number of new monitors and video cards that support DisplayPort. My guess as to why it's not yet everywhere is that some of its primary features are support for high resolutions and daisy-chaining displays - and most people don't use multiple monitors or own a screen with a resolution greater than a piddly 1920x1080.

Re:DisplayPort? (1)

Blaskowicz (634489) | about 5 months ago | (#46990851)

And dual-link DVI covers some of the bandwidth need, plus VGA optionally piggy-backed on the connector.
I hope the VESA Adaptive-Sync feature is successful (and that it'll work on Linux, why not).

By the way, I would like a great 1600x900 monitor with a high refresh rate, great blacks, and good viewing angles... "piddly", but it would not be too big and would still allow comfortable use of a maximized browser.

Re:DisplayPort? (1)

Immerman (2627577) | about 5 months ago | (#46991879)

Nothing wrong with a maximized browser on a 13" 1920x1080 screen. The problem is that for larger screens, 1080 lines make for a horribly low dpi at arm's-length use. Heck, even at 13" that's only about 170dpi - barely newsprint quality. Once upon a time you could get decent-resolution monitors at a decent price, but thanks to the LCD monitor market piggybacking on the TV market, those have all but disappeared except for niche products at a substantial markup.

As for dual-link DVI, that can handle a maximum of 2560×1600 at 60 Hz - twice as many pixels as 1920x1080, but only half of those needed for a 4K screen (which is looking like it will be the only mass-market upgrade from 1080p). Meanwhile, DisplayPort 1.3 is targeting 7680×4320@60Hz (4x the pixels of 4K), optionally spread across multiple daisy-chained screens or used for 4K stereoscopic rendering, and the format allows far greater flexibility in color resolution than DVI does (6 to 16 bits per color channel, compared to DVI's 8).
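The pixel-count arithmetic behind that comparison, as a quick Python sanity check (the mode names are just labels):

```python
# Pixel counts for the resolutions discussed above, relative to 1080p.
modes = {
    "1080p (1920x1080)":             1920 * 1080,
    "dual-link DVI max (2560x1600)": 2560 * 1600,
    "4K UHD (3840x2160)":            3840 * 2160,
    "8K UHD (7680x4320)":            7680 * 4320,
}

base = modes["1080p (1920x1080)"]
for name, px in modes.items():
    print(f"{name}: {px:,} pixels ({px / base:.1f}x 1080p)")
```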

Re:DisplayPort? (1)

locopuyo (1433631) | about 5 months ago | (#46990997)

Only higher-end video cards and monitors have DisplayPort.

Re:DisplayPort? (1)

sabbede (2678435) | about 5 months ago | (#47007803)

Well I'll be darned...

I suppose my lack of familiarity is due only to the shortage of my disposable income!

V-Sync zees ees a gud idea. (0)

Anonymous Coward | about 5 months ago | (#46988671)

The only thing that belong N'Sync is dirty dish. -Karl Malone

Why does this require a cable change? (1)

MobyDisk (75490) | about 5 months ago | (#46989657)

I see 2 changes involved here:
Computer: If I don't have a new frame to send yet, don't re-send the current frame.
Monitor: If the sender doesn't send a frame, don't rescan. Just leave the image there.

I see why this is a change to the communications protocol. But why does this require a new cable? And why would the cable require a chip in it?

Re:Why does this require a cable change? (0)

Anonymous Coward | about 5 months ago | (#46993331)

Monitor: If the sender doesn't send a frame, don't rescan. Just leave the image there.

Um, "there" where?

Re:Why does this require a cable change? (1)

MobyDisk (75490) | about 5 months ago | (#46995569)

It depends on the display.

With CRTs, you would need a framebuffer. But they are obsolete so that doesn't matter.
Plasma displays use PWM to modulate pixel color, so they already have a framebuffer. So they just need to keep doing what they are already doing.
LCDs are stable: the crystals don't change until a voltage is applied. The voltage is latched in anyway, though, so they have a built-in framebuffer of sorts.

Please correct me if I am wrong on any of these. Regardless of what tech the display uses, just adding a framebuffer answers the "where" question.

Re:Why does this require a cable change? (1)

Anonymous Coward | about 5 months ago | (#46996583)

Nope, pretty much correct with one exception.
LCDs (at least current TN, *VA and IPS) need a minimum refresh rate of a few Hz, as the individual transistors lose state, much like DRAM.

Re:Why does this require a cable change? (0)

Anonymous Coward | about 5 months ago | (#46997577)

LCDs are stable for ~100ms to ~1s - plenty to implement Adaptive Sync, but not enough to pause the refresh entirely.

The problem with just adjusting the sync rate for DVI and HDMI is that the pixel clock must run at a fixed rate, because the PLL in the display has to lock to the pixel clock of the TMDS interface. You can adjust the blank periods to consume more time between traces, but in practice a lot of displays use the blank period to guess what resolution and refresh rate they are running at, and if the blank period suddenly changes, they lose sync and display a black screen.

With DisplayPort the picture is sent in packets over a fixed (actually 1 of 4) clock rate, over 1 to 4 lanes, so the refresh rate is completely detached from the rate on the wire. I can only assume the reason that regular DP 1.0 displays can't support adaptive sync is the fixed timing specifications in the overlying logical protocol, which are probably there to support manufacturers who switch from DVI to DP by the expedient of throwing an interposer between the port and the scanout generator, and for DP-DVI and DP-HDMI adaptors, which require standard timing to function correctly.

I can and do servo the field rate of my TV using Media Player Classic's built-in support for such, but this just adjusts the field rate by 1~2% to create AV sync. Sudden discontinuous jumps in sync rate will cause the display to lose sync.

As an EE, the growing trend in modern electronics to use special cables and connectors for each application does disturb me, especially as everything is approaching the similitude of high-speed serial interfaces, where the only defining characteristic at the analog signalling level is the symbol rate. I work with FPGAs, and you can get a Cyclone V GX to talk to virtually anything over its SERDES, assuming it's below the 3.125Gbps maximum rate. Likewise you can get a Stratix V to talk to anything, assuming it's below the 56Gbps maximum rate supported by THOSE devices. Granted, the FPGAs have considerable latitude in supported signalling modes and symbol encoding, but it ought to be possible for electronics to converge on a set of speed grades with uniform modes, connectors and cables.

I don't think present commercial forces would allow it. Silicon vendors are all about "a chip for this, a chip for that", surprise, surprise, because they want to sell a lot of chips.

What blows my mind is what analog chip vendors, Analog Devices in particular, are able to accomplish with equalisation; I've seen 1080p HDMI over 30m of Cat5 cable. So yeah, you could probably just plug your monitor in with Cat5 cable if it weren't all about selling cables, chips and connectors. 10GBASE-T can accomplish better, but that is more than just equalisation; it is a much more complicated encoding scheme.

Re:Why does this require a cable change? (1)

MobyDisk (75490) | about 5 months ago | (#46999197)

Thanks for the info. I'm still trying to comprehend it all.

I am surprised that they are syncing the displays to the wire rate. I see how that causes the problem. The whole "blanking" thing confuses the heck out of me. I understand how it applied to CRTs, but are we really still using that? I thought resolution and timing were negotiated via DDC? Why would a display rely on analog artifacts to guess timing when it is right there in the digital signal?

FreeSync FreeFileSync ? (1)

BrendaEM (871664) | about 5 months ago | (#46989719)

I hope our government isn't so stupid as to let them trademark FreeSync, as FreeFileSync has been around for years.

BTW, if there is any question, I hereby trademark FreeFileSync(TM) and give all rights to the open source file synchronization program: http://sourceforge.net/project... [sourceforge.net]

Re:FreeSync FreeFileSync ? (1)

Blaskowicz (634489) | about 5 months ago | (#46990879)

I hereby trademark OpenFileSync and NetFileSync, to confuse your users gratuitously.

Good news on standard; bad on time (1)

Sally Bowls (2952007) | about 5 months ago | (#46992815)

I want a new rig. After reading all the praise of G-Sync, it seemed like a reasonable cost-benefit trade-off to wait for it, since the monitors were coming out in early 2014. Except not really. So now we have a new standard coming "year end", which probably means March or April. I guess I'll build a new rig with a very good graphics card this summer and then do something about a new monitor at "year end." If it really is a new standard out this year, I can see a lack of enthusiasm for an Nvidia-only monitor that costs an extra couple of hundred dollars and only makes sense for a few months.