
First AMD FreeSync Capable Gaming Displays and Drivers Launched, Tested 63

MojoKid writes Soon after NVIDIA unveiled its G-SYNC technology, AMD announced that it would pursue an open standard, dubbed FreeSync, leveraging technologies already available in the DisplayPort specification to offer adaptive refresh rates to users of some discrete Radeon GPUs and AMD APUs. AMD's goal with FreeSync was to introduce a technology that offered end-user benefits similar to NVIDIA's G-SYNC, that didn't require monitor manufacturers to employ any proprietary add-ons, and that could be adopted by any GPU maker. Today, AMD released its first FreeSync-capable set of drivers, and this first look at the sleek ultra-widescreen LG 34UM67 showcases some of the benefits; the monitor is based on an IPS panel with a native resolution of 2560x1080 and a max refresh rate of 75Hz. To fully appreciate how adaptive refresh rate technologies work, it's best to experience them in person. In short, the GPU scans a frame out to the monitor, where it's drawn on-screen, and the monitor doesn't refresh until the frame is done drawing. As soon as a frame is done, the monitor updates again as quickly as it can with the next frame, in lockstep with the GPU. This completely eliminates tearing and jitter issues that are common in PC gaming. Technologies like NVIDIA G-SYNC and AMD FreeSync aren't a panacea for every PC gaming anomaly, but they do enhance the experience, offering worthwhile improvements in image quality and reduced eye strain.
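
The lockstep behavior described above is easier to see with numbers. The following is a minimal Python sketch (not AMD's or NVIDIA's actual driver code); the per-frame render times and the 40-75Hz adaptive window are assumed values for illustration only:

    import math

    FIXED_REFRESH_HZ = 60.0
    VSYNC_INTERVAL_MS = 1000.0 / FIXED_REFRESH_HZ   # ~16.67 ms per refresh tick
    ADAPTIVE_MIN_HZ, ADAPTIVE_MAX_HZ = 40.0, 75.0   # hypothetical panel range

    render_times_ms = [16.4, 16.0, 18.9, 15.2, 17.5]  # made-up per-frame GPU work

    def fixed_refresh_present(render_ms):
        # With vsync on a fixed-rate panel, a finished frame waits for the next refresh tick.
        ticks = math.ceil(render_ms / VSYNC_INTERVAL_MS)
        return ticks * VSYNC_INTERVAL_MS

    def adaptive_present(render_ms):
        # With adaptive refresh, the panel updates as soon as the frame is done,
        # clamped to the refresh window it supports.
        min_interval = 1000.0 / ADAPTIVE_MAX_HZ   # can't refresh faster than 75 Hz
        max_interval = 1000.0 / ADAPTIVE_MIN_HZ   # below 40 Hz the panel repeats frames
        return min(max(render_ms, min_interval), max_interval)

    for ms in render_times_ms:
        print(f"render {ms:4.1f} ms -> fixed refresh shows it at {fixed_refresh_present(ms):5.2f} ms, "
              f"adaptive at {adaptive_present(ms):5.2f} ms")

The 18.9 ms frame shows the difference: on the fixed 60Hz panel it misses a refresh and waits until the 33.33 ms tick (a visible stutter), while the adaptive panel simply scans it out at 18.9 ms.
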
This discussion has been archived. No new comments can be posted.

Comments Filter:
  • Comment removed (Score:5, Insightful)

    by account_deleted ( 4530225 ) on Thursday March 19, 2015 @02:56PM (#49293935)
    Comment removed based on user account deletion
    • by Anonymous Coward

      It's bullshit. NVIDIA should work with AMD and the other manufacturers on this, not against them.

      The mistake you're making is to assume that in a free market what's best for consumers is also best for corporations.

    • It seems like the laptop version of G-Sync is using the same protocol as FreeSync (i.e. it doesn't require any special hardware).
      http://www.extremetech.com/ext... [extremetech.com]
      So, maybe somebody could hack Nvidia's driver to make it compatible with FreeSync monitors?

  • I'll never leave my mum's basement.
  • by davydagger ( 2566757 ) on Thursday March 19, 2015 @03:27PM (#49294141)

    Now, if AMD linux drivers could really not suck, that would be awesome.

    Because their drivers are crappy. Their FOSS driver is crappy and their proprietary driver is crappy. They are really putting the cart before the horse here. What they really need to do is just a massive bug hunt with their drivers. Right now they are lacking.

    Oh, and it's hurting sales, because people won't buy AMD cards when they're known to be buggy. Even after they fix the drivers, it's going to take a long time for the cards to be seen as reliable.

  • Why still 1080? (Score:4, Interesting)

    by BobSutan ( 467781 ) on Thursday March 19, 2015 @03:42PM (#49294263)

    I've got a 24" monitor that's 10 yeas old and it's native resolution is1900x1200. Why the regression in recent years back to 1080? You'd think monitors today would have continued advancing. Sure, give them 1080 capability, but still they should have a much higher native resolution by now.

    • Economy of scale. The HDTV standard settled on 1080p. That was worse than the 1200p that was getting quite commonplace at the time, but close enough that manufacturers could justify consolidating their product ranges into mostly making 1080p for everybody, thus reducing their operating costs. Price of 1080p went down, and the price of 1200p was raised as manufacturers' inclination to supply them dwindled, causing a resultant reduction in demand, and so 1080p became standard. It's a pity because 1900x1200 r

    • Because that would defeat the point of an ultra-wide monitor. 1200 vertical pixels would make it less wide. There would be none of the "advantages" you get from 16:10 over 16:9 by using 21:10 instead of 21:9. And 1920x1080 content would scale terribly on 21:10 in full screen.
    • Dell U2412M, U2413, U2415 are all 24" monitors with 1920x1200 screens.

      Or you can jump up to 27" 2560x1440, 30" 2560x1600 or even 34" 3440x1440
      Or you can go to a 4K screen or even a 5K one.

    • It's hard to call it a regression when it was driven by popularity.

      Also isn't your comment about a year too late to be relevant? Right now there are more QWXGA and QHD screens on the market than ever, let alone the incoming 4k screens. And they are affordable too!

      • Also isn't your comment about a year too late to be relevant?

        Indeed. Even Walmart is offering greater than 1080p choices now, for both monitors and TVs.

        1920x1200 is no longer something to brag about... adjust accordingly lest you start to sound like an old person. :P

    • 4K TV is happening ... hang onto your hat and wait for the 4K 60Hz 4:4:4 panels later this year. Almost there.

      Of course I'm buying the first 50" 8K display I can get my hands on for less than $2K. After 30 years of upgrading displays, I think I will be done.
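
Several replies above compare 16:10, 16:9, and ultra-wide panels; the rough arithmetic behind those ratios, using the resolutions mentioned in the thread (a throwaway Python sketch, not from the article):

    resolutions = {
        "2560x1080 (LG 34UM67)": (2560, 1080),
        "2560x1200 (hypothetical 21:10-ish panel)": (2560, 1200),
        "1920x1200 (WUXGA)": (1920, 1200),
        "1920x1080 (Full HD)": (1920, 1080),
    }

    for name, (w, h) in resolutions.items():
        ratio = w / h
        print(f"{name}: {ratio:.2f}:1 (about {9 * ratio:.1f}:9)")

    # Scaling 1920x1080 content to fill a 2560x1080 panel needs a 2560/1920 = 1.33x
    # horizontal stretch, or 320-pixel pillarbox bars on each side, which is why
    # full-screen 16:9 content fits awkwardly on an ultra-wide display.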

  • Tearing (Score:1, Interesting)

    by Morlenden ( 108782 )

    "This completely eliminates tearing and jitter issues that are common in PC gaming."

    Adaptive sync should fix tearing but it won't do much for jitter. That has to be fixed in the game program. Jitter occurs when frames, each representing a point in time, are displayed at different times than the ones they represent. A game program must try to advance the simulation time for each frame an amount that matches the time that will elapse before the frame is displayed, but it can be difficult to know what the simu

    • Actually it does a great job eliminating jitter.

      Jitter is usually caused by small fluctuations in frame rates which happens in pretty much every game. With GSYNC the monitor displays the frame as soon as it is rendered, so if frame one takes 16.42 ms to render and the next frame takes 16.01 ms to render the motion will still be smooth. With a normal monitor you would get a slight jitter.

      The only jitters GSYNC can't fix are in poorly programmed engines, typically seen in bad console ports.

      I've bee
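
Both replies above hinge on how closely the game's simulation step matches the time a frame actually spends on screen. A minimal game-loop sketch of that timing problem (plain Python with made-up render costs, not any engine's real code):

    import time

    def simulate(state, dt):
        # Advance the game world by dt seconds (placeholder for real physics/AI).
        state["t"] += dt
        return state

    render_costs = [0.0164, 0.0160, 0.0189, 0.0152, 0.0175]  # made-up GPU time per frame, seconds

    state = {"t": 0.0}
    predicted_dt = 1 / 60                # initial guess at time-on-screen per frame
    last_present = time.perf_counter()

    for cost in render_costs:
        state = simulate(state, predicted_dt)   # advance by the EXPECTED display time
        time.sleep(cost)                        # stand-in for rendering the frame

        now = time.perf_counter()
        actual_dt = now - last_present
        last_present = now

        # With adaptive sync the panel refreshes as soon as the frame is done, so
        # actual_dt tracks the render cost and is a good predictor for the next
        # frame. On a fixed 60Hz panel the true on-screen time snaps to multiples
        # of ~16.7 ms, so the prediction is wrong whenever rendering straddles a
        # refresh boundary -- that mismatch is the jitter discussed above.
        print(f"predicted {predicted_dt * 1000:5.2f} ms, measured {actual_dt * 1000:5.2f} ms")
        predicted_dt = actual_dt
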
  • It's nice to use that for games, but... can this monitor also work with 24fps films?
    • Many monitors already support 24000/1001Hz refresh rates.

      But yes, playing video with the FreeSync technology is going to be a possibility. For the sub-R290 AMD cards it will actually be the only supported mode (i.e. no FreeSync gaming support).
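
For reference, the arithmetic behind that 24000/1001Hz figure (a quick Python check, nothing vendor-specific):

    film_rate = 24000 / 1001           # ~23.976 Hz, the NTSC-compatible film cadence
    frame_time_ms = 1000 / film_rate   # ~41.71 ms per film frame
    pulldown = 60 / film_rate          # ~2.5 refreshes per film frame on a fixed 60Hz
                                       # panel, hence uneven 3:2 pulldown judder

    print(f"{film_rate:.3f} Hz, {frame_time_ms:.2f} ms per frame, "
          f"{pulldown:.2f} refreshes per frame at 60 Hz")

An adaptive-refresh display can, in principle, hold each frame for its full ~41.71 ms (or an exact multiple of it), avoiding that uneven cadence.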

  • Most "scaler" chip manufacturers support AMD's FreeSync already. So gg nVidia...

  • Any word on when we'll get a flatpanel that isn't like watching an oil painting smear around in realtime?

"Protozoa are small, and bacteria are small, but viruses are smaller than the both put together."

Working...