Security / Graphics / Hardware

NVIDIA Releases Fix For Dangerous Display Driver Exploit

wiredmikey writes "NVIDIA on Saturday quietly released a driver update (version 310.90) that fixes a recently uncovered security vulnerability in the NVIDIA Display Driver service (nvvsvc.exe). The vulnerability was disclosed on Christmas Day by Peter Winter-Smith, a researcher from the U.K. According to Rapid7's HD Moore, the vulnerability allows a remote attacker with a valid domain account to gain super-user access to any desktop or laptop running the vulnerable service, and allows a local attacker (or rogue user) with a low-privileged account to gain super-user access to their own system. In addition to the security fix, driver version 310.90 addresses other bugs and brings performance increases in several games and applications for a number of GPUs, including the GeForce 400/500/600 Series."
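
For anyone who wants to confirm that a given machine already has the patched driver, here is a minimal sketch (mine, not NVIDIA's or the submitter's) that asks the nvidia-smi utility bundled with recent NVIDIA driver packages for the installed version and compares it against 310.90. The use of nvidia-smi, the numeric comparison, and the messages are illustrative assumptions; Device Manager or the NVIDIA Control Panel will report the same number.

```typescript
// Hypothetical version check, not an official NVIDIA tool flow: nvidia-smi
// ships with recent driver packages and can report the installed version.
import { execFile } from "child_process";

const PATCHED = 310.90; // the release that fixes the nvvsvc.exe vulnerability

execFile(
  "nvidia-smi",
  ["--query-gpu=driver_version", "--format=csv,noheader"],
  (err, stdout) => {
    if (err) {
      console.error("Could not run nvidia-smi; is an NVIDIA driver installed?");
      return;
    }
    const installed = parseFloat(stdout.trim()); // e.g. "310.90" -> 310.9
    if (Number.isNaN(installed)) {
      console.error(`Unexpected nvidia-smi output: ${stdout.trim()}`);
    } else if (installed < PATCHED) {
      // Per the discussion below, only the 310.xx branch was affected, so an
      // older branch is not necessarily vulnerable just because it predates 310.90.
      console.warn(`Driver ${stdout.trim()} predates the 310.90 fix.`);
    } else {
      console.log(`Driver ${stdout.trim()} is at or past the patched release.`);
    }
  }
);
```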

  • Looks like they're now dropping support for the Geforce 7-series cards. Bummer, I have a 7800GT and it's still pretty quick.
    • Re: (Score:3, Informative)

      by Anonymous Coward

      Really? Try this page.
      http://www.geforce.com/drivers/results/49740 [geforce.com]
      Still plenty of support for the 7 series.

    • by Sycraft-fu ( 314770 ) on Sunday January 06, 2013 @03:34AM (#42493513)

      The card is about 7.5 years old, so it is reasonable that they cease supporting it with new drivers. You can still get drivers for it (they have drivers for OSes up to and including Windows 8); they just aren't keeping support for it in the newer unified drivers.

      Sounds pretty reasonable to me. They gave you over 7 years of driver updates. It is fairly unrealistic to assume that they'll continue with new support forever, particularly given that there is little reason to: the 7 series can't do WDDM 1.1 or 1.2, it can't handle DirectX 10, 10.1, 11 or 11.1, it can't do CUDA, DirectCompute or OpenCL. There is just little left to implement for it.

      If you wish to continue using the card, no problem (though be aware that the Intel HD 4000 GPU found in Ivy Bridge processors is likely to be faster, and certainly has far more features): just use the 306 series drivers. The card will keep working fine with those.

      If the security issue is what you are worried about, it looks like it only affected the 310 drivers, so no issues there.

      • by Billly Gates ( 198444 ) on Sunday January 06, 2013 @03:51AM (#42493599) Journal

        Nvidia and ATI have great cheap $49 cards if you want Aero, and they can easily cream the gaming 7800 series. No need to get a new system.

        If it is on XP, you have a lot more security issues than this card, though.

        • If it is on XP, you have a lot more security issues than this card, though.

          Such as?

          I have a few XP systems that are still getting regular automatic patches and updates from Microsoft. You seem pretty confident that all XP systems are vulnerable, so you must be aware of something specific. Care to share?

      • Yes, it's reasonable, but that doesn't mean I like it. I won't gracefully give up my right to complain on the Internet.

        Frankly, it's Linux kernel compatibility I'm most concerned about. If Fedora 18 comes out next week with an updated kernel that breaks compatibility with the current 7-series driver, what are the chances it's going to get fixed?

        On the other hand, things are moving along in the Nouveau open source driver [slashdot.org], so there are alternatives.

        • I got a 1GB 610GT for $12 on Black Friday. It's in my system right now because the new fan for my 240GT (still better than the budget cards, anyway) has only just arrived. It's a gimped 520 IIRC and still better than your 7800. Its specs are almost as good as my 240GT (which was a spectacularly good core when the card was new, in spite of it being based on a core which was already old when it was released) but it has 1/4 the fill rate so 1920x1200 is murder.

          I do miss the days when the nVidia driver went all

      • " it can't do CUDA, DirectCompute or OpenCL"

        Nouveau is feature-complete for 2D and 3D rendering. In addition, without OpenCL, CUDA, or the bells and whistles of new hardware, it just might work better for this guy's application.

        Nouveau renders the world wide web just fine, and it works all the way back to NV04.

        You're also not going to be playing the latest games on your old video card anyway.
        • by Anonymous Coward

          NVC0 family [400 series]
          All sorts of fun. Feature-wise it isn't too different but the architecture has changed a lot.
          These cards are generally working with the latest kernel and Mesa but may still have power management issues. It is recommended to use the Linux 3.1 kernel or newer (or a backported driver from this kernel).

          Oh yeah, it sounds perfect!

    • by Elbart ( 1233584 )
      They announced they'd stop supporting the 6- and 7-series a few months back. You should read the release notes once in a while.
  • Dangerous? Nope. (Score:5, Interesting)

    by lemur3 ( 997863 ) on Sunday January 06, 2013 @02:37AM (#42493325)

    Not like a CRT catching fire...

    I remember hooking up an old CRT to the wrong video card.. one with way too high a resolution for that screen..

    A while later, hooked up to the correct video card, I noticed a bit of smoke coming out from where the dials were.. removed the case.. plugged it in again to see if it was OK.. and it burst into 3-foot-high flames.

    Thankfully a fire extinguisher was about 3 feet away... Mom would have been awfully mad if I had burned down the house... scared the bejeezus out of me... the burnt electrical smell was horrendous..

    (bonus: it was a fancy no mess extinguisher)

    lesson learned.

    • by earlzdotnet ( 2788729 ) on Sunday January 06, 2013 @02:45AM (#42493341)
      The more I learn about the past of computing, the more I'm convinced they only ever considered one failure mode: catastrophic.
    • by Anonymous Coward

      It appears as though you have found a use for is_computer_on_fire().

      http://www.tycomsystems.com/beos/BeBook/The%20Kernel%20Kit/System.html

      Good old BeOS. Man, I miss that operating system (though Haiku fills that gap nicely), but more so the radical hardware that came with it (the BeBox).

    • I've *NEVER* heard of a single instance of a refresh rate or too high a scanning frequency causing monitor failure. Seems like a trivial thing for a monitor manufacturer to fix. Would you sell a product that shot out fire if someone clicked a slider setting too high?

      • by Kjella ( 173770 )

        I think I experienced that. It was in the '80s, an old IBM PC, and somehow the machine froze - not just the screen; it didn't respond to input or anything - but I left it running in the hope it'd recover. Suddenly there was a rather loud bang as a capacitor blew and released its magic smoke. Of course it could be coincidence, but I suspect it was the computer crashing and sending a very bad signal to the screen. This was the age where you'd check machine and screen compatibility before plugging it in, probably

      • I've *NEVER* heard of a single instance of a refresh rate or too high of a scanning frequency causing monitor failure.

        It's not the kind of thing that is likely to crop up, is it? The only monitors that would even try to sync to a bad frequency were ancient multisyncs; modern ones are smart enough to detect a signal out of range. And you'd pretty much have to have some bad hardware for it to happen.

      • by ais523 ( 1172701 )
        If you overclock a monitor, you can expect results similar to overclocking a CPU. (In general, it's going to overheat.) Because it's rather easier to do by mistake than it is with a CPU, monitors tend to check for it nowadays and cut out intentionally.
      • by ATMD ( 986401 )

        Would you sell a product that shot out fire if someone clicked a slider setting too high?

        I have one of those, I use it for sticking bits of metal together.

    • by mrmeval ( 662166 )

      NEC had one glorious little monitor, a pretty thing and expensive, but if it were plugged into the wrong card it would most of the time shear the picture tube off; at best it would damage the phosphor. This would happen right at the part of the yoke closest to the face. Couple an odd reaction by the vertical circuit to a higher vertical frequency with an increase in horizontal frequency, resulting in a very high boost in high voltage unchecked by an inadequate x-ray protect, and you had in effect an electron cutting

    • by ATMD ( 986401 )

      Nice screen saver.

    • "mom would have been awfully mad if i had burned down the house"

      ... until she realized that it finally got you out of her basement?

  • by GodfatherofSoul ( 174979 ) on Sunday January 06, 2013 @03:24AM (#42493473)

    The days of trying to manually screen each update your system needs are over. Too many components are vulnerable and the turnaround time for an exploit is too short.

  • by Billly Gates ( 198444 ) on Sunday January 06, 2013 @03:45AM (#42493583) Journal

    Do we as geeks and IT professionals need to worry about this?

    First it was the OS that got you owned. Then, when Linux, Mac OS X, and NT/XP came along, it was about IE. IE 5.5 and 6 were instant targets. Then, as that died off, it was Flash, Java, and ODF add-ons.

    Are video drivers next? Which component never gets updated? The video drivers. Which one has its own CPU and RAM, and is never checked by AV? The video card. A reflash would be a nightmare.

    • by DrXym ( 126579 ) on Sunday January 06, 2013 @07:11AM (#42494311)

      Do we as geeks and IT professionals need to worry about this?

      Absolutely. WebGL allows any random website to tap your hardware through the browser. WebGL is essentially OpenGL ES 2.0, give or take a few APIs, and is supported by just about every modern browser except IE. Some enable WebGL by default on suitable hardware; some have it disabled by default. When it is enabled, a page has carte blanche to abuse the chipset six ways to Sunday. The only protection afforded by browsers is that the driver has to implement a GL extension called GL_EXT_robustness, which says the driver promises, fingers crossed, to be really good about checking and recovering from errors.

      ActiveX had something similar called the "safe for scripting" bit. IE wouldn't load a control unless it said it was safe, and look what happened there. While there are fewer graphics drivers than ActiveX controls, it's easy to imagine a driver version claiming it's robust when in fact it isn't, and it's easy to imagine a malicious site using that fact to break a lot of machines. I assume browsers could implement a whitelist of "good" drivers and update the list in addition to checking for the extension, but it's obviously imperfect and offers additional browser exploits where none existed before.
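
To make the attack surface concrete, here is a minimal page-side sketch (not DrXym's code; the names are illustrative) of how little a site has to do to reach the GPU through WebGL. The robustness extension is negotiated between the browser and the driver, so from script all a page can do is feature-detect a context and react when that context is lost.

```typescript
// Illustration only: any page that can create a canvas can ask for a WebGL
// context. GL_EXT_robustness / GL_ARB_robustness is checked by the browser
// internally; script only sees whether a context is granted and later lost.
const canvas = document.createElement("canvas");
const gl = canvas.getContext("webgl");

if (!gl) {
  console.log("WebGL is disabled or unsupported; the GPU is out of reach here.");
} else {
  console.log("GPU exposed to this page. Extensions:", gl.getSupportedExtensions());
  canvas.addEventListener("webglcontextlost", (event) => {
    event.preventDefault(); // let the browser try to restore the context
    console.warn("Context lost: the driver or GPU reset underneath this page.");
  });
}
```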

  • Hey NVidia, how about fixing your drivers so they don't stop providing a signal on Windows 8 machines after an hour or so? You should have tested your drivers before release.
    • " how about fixing your drivers so that it will stop quit providing a signal on windows 8 machines after an hour or so"

      You make that plus sound like a drawback.

      • by Cammi ( 1956130 )
        If a driver does not work, it is not a plus. For instance ... you would not be on a computer ...
        • "If a driver does not work, it is not a plus. For instance ... you would not be on a computer ..."

          ... and by logical inference, if a driver for Windows 8 does not work, I would not be using a Windows 8 computer. Ergo, it is not a drawback; it is a major plus. Please try to follow along with the rest of the class.

  • Excuse me if this is a dumb question, but why is the display driver exposed to the network at all?

    • No network access as far as I can see:

      the vulnerability allows a remote attacker with a valid domain account to gain super-user access

      You need an account on the machine to log into it first.
