
NVIDIA Announces GeForce GTX 560M and GT 520MX Mobile GPUs

timothy posted more than 2 years ago | from the spittin'-pixels dept.


MojoKid writes "NVIDIA just took the wraps off a couple of new mobile GPUs at Computex and announced a slew of notebook designs that will feature the new chips. The new GeForce GTX 560M and GT 520MX will be arriving very soon in notebooks from Asus, Alienware, Clevo, Toshiba, MSI, Samsung and others. The GeForce GT 520MX is an entry-level DirectX 11 GPU designed for thin, light, highly mobile platforms. It sports 48 CUDA cores with a 900MHz graphics clock, 1800MHz shader clock, and 900MHz memory clock. Decidedly more powerful, the GeForce GTX 560M is outfitted with 192 CUDA cores and clocks in at 775MHz, with 1559MHz shaders and 1250MHz GDDR5 memory."
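Neither spec sheet quotes FLOPS, but a back-of-envelope estimate follows from the core counts and shader clocks in the summary (assuming, as with other Fermi-generation parts, one fused multiply-add, i.e. two FLOPs, per CUDA core per shader clock; that per-core figure is an assumption, not from the article):

```python
# Rough peak single-precision throughput from the quoted specs.
# Assumes 2 FLOPs (one FMA) per CUDA core per shader clock.
def peak_gflops(cuda_cores, shader_clock_mhz):
    return cuda_cores * shader_clock_mhz * 2 / 1000.0  # MFLOPS -> GFLOPS

gt_520mx = peak_gflops(48, 1800)    # entry-level part
gtx_560m = peak_gflops(192, 1559)   # the bigger chip
print(f"GT 520MX: {gt_520mx:.1f} GFLOPS, GTX 560M: {gtx_560m:.1f} GFLOPS")
```

Which puts the 560M at roughly 3.5x the raw shader throughput of the 520MX.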

Sandybridge (1)

Anonymous Coward | more than 2 years ago | (#36285292)

How does this compare to Intel and AMD's graphics integrated CPUs?

Re:Sandybridge (1, Interesting)

Anonymous Coward | more than 2 years ago | (#36285314)

How does this compare to Intel and AMD's graphics integrated CPUs?

The 560M is way higher end than integrated CPUs. It's on par with desktop performance. Without benchmarks yet, I'd guess that it's probably similar to the GTX 560 non-mobile version.

Re:Sandybridge (2)

beelsebob (529313) | more than 2 years ago | (#36285378)

Nowhere near the desktop part – it has half the number of cores. Still, it'll compete handily with some lesser desktop cards.
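As a sanity check on the "half the desktop part" claim, a crude cores-times-clock ratio works (the desktop GTX 560 Ti figures of 384 cores and a 1645 MHz shader clock used here are assumed, not from the article):

```python
# Crude relative-throughput estimate: CUDA cores x shader clock.
# Desktop GTX 560 Ti figures (384 cores, 1645 MHz) are assumptions.
def throughput_ratio(cores_a, clock_mhz_a, cores_b, clock_mhz_b):
    return (cores_a * clock_mhz_a) / (cores_b * clock_mhz_b)

ratio = throughput_ratio(192, 1559, 384, 1645)  # GTX 560M vs GTX 560 Ti
print(f"~{ratio:.0%} of the desktop part's raw shader throughput")
```

That comes out just under half, consistent with the core-count argument above (and it ignores memory bandwidth, which favors the desktop card further).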

Re:Sandybridge (0)

Anonymous Coward | more than 2 years ago | (#36285382)

Don't feed the trolls!

This should go in the Wii2, it's perfect. (3, Interesting)

elucido (870205) | more than 2 years ago | (#36285406)

Perfect design, made for mobile machines, cheap, powerful, fast, sleek.

Re:This should go in the Wii2, it's perfect. (1)

IronSight (1925612) | more than 2 years ago | (#36285960)

I'm going to guess that Nintendo sticks with ATI/AMD since they have for the last two systems, but we shall see. I do know that Nintendo doesn't adhere to razor-and-blades marketing (selling consoles cheap to make it back in software sales), so they will have to use the cheapest bidder to keep their console price under control so people can afford it.

Re:Sandybridge (2)

hairyfeet (841228) | more than 2 years ago | (#36286332)

I'd say the bigger question is: how long will Nvidia be able to stay afloat? First you have the dirty dealing by Intel (why they haven't been busted for antitrust I'll never know, as between AMD and Nvidia they caused billions in damage to the market, even worse than MSFT in the 90s IMHO), which slaughtered Nvidia's Intel chipset division and put it out of business. And now there is Intel licensing PowerVR for Atom [vr-zone.com], which I'm betting will do to ION what cutting off access to the bus did to their Intel desktop chipset sales.

And on the other side you have AMD, which frankly doesn't need Nvidia: they have excellent Radeon GPUs, both discrete and as APUs, with the new Bobcat and Bulldozer chips. Nvidia is still trying to make a little money with their desktop chipsets for AMD, but since they aren't making any new ones, all they have left is the bottom-of-the-barrel sub-$45 market, and from what I've seen most AMD guys (myself included) buy the Radeon chipsets to go with them, so not much money on that side of the aisle. Finally, you have the fact that while Nvidia designs the monster chips first and then figures out ways to cripple them for the smaller markets, AMD switched to simply designing for the midrange and using dual GPUs with HT links for the high end, which is the obviously cheaper way to go.

So that leaves phones, HPCs, and discrete GPUs, which while decent markets are nothing like the size of the chipset division and as the APUs get better and better OEMs will be less and less likely to use discrete for anything but the gamer laptops, a teeny tiny niche.

Frankly I'll be amazed if Nvidia is around in 5 years, I really will. Personally I think the moves by Intel are designed to slowly bleed Nvidia to make a takeover less expensive. Let's face it, Intel has always sucked when it comes to GPUs, and having Nvidia to integrate the way AMD did with ATI would give them some serious graphical muscle (although again, why nobody has screamed antitrust over the way Intel has been behaving I'll never know). I can't see the markets Nvidia is currently in bringing in enough cash to pay for the massive R&D that keeping up with AMD costs, and from the looks of it the new APUs are gonna end up "good enough" for everyone but hardcore gamers, further hurting their bottom line.

So frankly I just don't see how discrete chips like the ones in TFA are gonna keep them afloat long term. Discrete chips cost money the OEMs don't have to spend, with virtually all the chips from Atom on up coming with a GPU on die; those that buy gamer notebooks are a tiny niche of the overall market; and the vast majority of discrete cards go to the sub-$150 market, which favors the AMD "build the MOR chips" model over the Nvidia "high end first" model. I just don't see how they are supposed to survive long term. CUDA is nice, but I don't see it being enough to keep them above water, especially if Intel has eyes on buying them out down the road and keeps up their douchebag behavior.

So how many think Nvidia will be here in 5 years, or will they end up a footnote like Voodoo? In a twist of irony, if it turns out I'm right, Voodoo itself was bought by Nvidia after being slowly bled to death by changes in the market. While I switched to AMD only for me and my customers after the Bumpgate mess, I'd hate to see Nvidia disappear, especially since it would ultimately be over Intel douchebaggery.

Re:Sandybridge (2)

Hal_Porter (817932) | more than 2 years ago | (#36287420)

I think NVidia should do an x64 chip. The patents on x86 have mostly expired now, and AMD have said they will license the x64 extension to anyone - they've already done it for Transmeta and Via.

An Atom class x64 chip would mean they could do a combined CPU/GPU.

The other option would be to buy Via, who've already got an x86 licence - or even just team up with them to put NVidia GPUs on the same die as Via CPUs, which would be an interesting combination actually. You could scale performance from Intel Atom to AMD Bobcat levels. It should be possible to get HD video acceleration pretty easily, and that's something Intel Atom based systems seem to struggle with.

In fact it's a shame Intel won't play nice with NVidia. Like Via, Intel make some excellent CPUs but horrid GPUs; NVidia make excellent GPUs but lack an x86/x64 design, which is kind of an issue in the netbook market. I can see tablets being ARM, but legacy x86 applications will run like ass on ARM via emulation, and that is where all the money is.

Re:Sandybridge (1)

hairyfeet (841228) | more than 2 years ago | (#36292100)

I too have been wondering WTF Nvidia is thinking not buying Via. The new Nano chips are dual-core x64 OoO with excellent on-chip crypto, which tied in with Nvidia GPU tech would make a kick-ass low-power server able to do strong encryption, as well as a damned good notebook and netbook setup; and of course the Via embedded market would be a given, for things like car PCs and industrial usage.

I just don't see Nvidia staying afloat long term if they completely give up on the x86/x64 market, which is what it appears they are doing. Discrete GPUs simply don't have that big a market, especially in mobile where power trumps all, and by buying rights to PowerVR Intel has just killed the ION platform dead. If you look at the numbers, while you can't game for shit on it, the PowerVR chips accelerate all of the major and minor HD formats (similar to the Broadcom HD chip) while using barely 2W of power. By integrating PowerVR, Intel has killed a major source of ION sales (the fact that Atom can't do 1080p video on its own), and as I said, AMD's integrated Radeon does all the HD video you could want.

So that just leaves discrete GPUs and high-end mobile. As I said, gaming, which I frankly love on my PC, is still a very small niche compared to the overall market, and high-end mobile is frankly dominated by the iPhone; the rest usually just get whatever nice handset is offered with the contract, and with carriers wanting to maximize profits above all, I see the Broadcom and PowerVR chips taking a good chunk of that business. Finally, as I pointed out, the Nvidia "top first" model is inherently more expensive than the AMD "middle chips only" model, while still giving AMD the ability to scale. I have also found that AMD chips are MUCH better on power usage, as they can scale the stream processors, whereas Nvidia seems to have an "all or nothing" approach that, while good for benchmarks, makes for lousy power consumption.

In the end, without being bought by Intel or buying Via and getting into APUs, I just don't see them surviving long term. The mobile market is a fickle one, where what is "hot" can change in a heartbeat, and with the huge x86 desktop and mobile markets being more and more closed off from them, I just don't see the niches they have left keeping them afloat. The last numbers I saw had them going through quarters of bleeding badly, followed by brief respites when the next GPU hit. Again this gives an advantage to AMD, who not only use a cheaper design model but own the sub-$200 CPU market, which helps their bottom line. I used to love Nvidia, but honestly in 5 years I see them going like Voodoo before them: crushed by a change in tech, just as Voodoo was killed when everyone quit using the Glide API for DirectX and hardware T&L.

Re:Sandybridge (1)

Nursie (632944) | more than 2 years ago | (#36293732)

CUDA and OpenCL are a niche, but one that's increasing in importance for scientific computing. We're already seeing nVidia components in the top 10 supercomputers list.

However I doubt very much they're going anywhere in the next five years. The capabilities of the GPU are on the cusp of being exploited in ordinary (i.e. not super) computing and can only grow.

They're attacking the high end market with Tesla/Fermi, the mainstream and mobile sectors are well catered to with cutting edge GPUs, ION addresses netbooks, and Tegra is just starting to come into play in smaller systems. They face competition in all these areas (except perhaps Fermi/Tesla) so it's not plain sailing, but they're also the technology leader even where they're not the market leader.

I'm no fanboy, I've had GFX cards from all sorts of different vendors over the years. I think not only would the computing world be worse off without nVidia, I think they're going to stay relevant for a long time yet.

Until someone comes up with the next quantum leap in GPU/Vector technology.

Obligatory (2)

dutchwhizzman (817898) | more than 2 years ago | (#36285338)

But will it run DNF?

Re:Obligatory (1)

Anonymous Coward | more than 2 years ago | (#36285874)

I should hope so; DNF is running on Unreal Engine 3, and my 1GB GTX 260 mobile can run those games at 1080p at 60fps. In fact that card can play all of them except Metro 2033 at full resolution with all of the pretty effects. I could be wrong, but any computer built in the last 5 years with a dedicated video card should be able to run any game that is also on a console. The most you would have to do with an nvidia laptop/mobile dedicated GPU is turn down anti-aliasing and shut off vsync; I did notice some games have slight frame rate issues without doing that. Though I think this will all depend on *if* Gearbox decides to use heavy DirectX 11 features. Since they are porting to 5-year-old consoles, they might just use DirectX 9/10 features.

Cross Platform Support (2)

pinkushun (1467193) | more than 2 years ago | (#36285422)

"Intel does provide development drivers for Intel graphics to the open source community."

+1 :D

http://www.intel.com/support/graphics/sb/cs-010512.htm?wapkw=(linux) [intel.com]

Re:Cross Platform Support (2)

morgan_greywolf (835522) | more than 2 years ago | (#36285572)

What, exactly, does this have to do with the article? AMD/ATI also provides drivers for the open source community.

The thing is, of the three big graphics vendors, only NVIDIA offers reasonably complete OpenGL support, which makes the other vendors' cards non-starters for us CAD users. I run SketchUp on Wine [blogspot.com]

Re:Cross Platform Support (2)

BeanThere (28381) | more than 2 years ago | (#36285996)

I don't know, I can't begin to imagine what open source drivers for new video cards could possibly have to do with an article about those new video cards on a website full of open source users. Very mysterious.

On a tangential note, my desktop machine is NVIDIA but my laptop GPU is ATI, and CUDA on the ATI is broken and was just a waste of money: AMD/ATI's website just sends me to the laptop manufacturer's website to supposedly get the drivers, but the drivers on HP's website that they point me to just plain have zero CUDA support. Surely between HP and AMD/ATI, one of the two has a responsibility to make the product they advertise and sell actually work? I've never had problems like that with NVIDIA drivers, so in future I'll stick with them.

Re:Cross Platform Support (2)

baka_toroi (1194359) | more than 2 years ago | (#36286152)

I can't tell if you're being sarcastic or not, but CUDA is a proprietary standard that only works on Nvidia GPUs.

Re:Cross Platform Support (1)

BeanThere (28381) | more than 2 years ago | (#36286350)

Sorry, I meant OpenCL, I was typing in a hurry and momentarily confused. The card is a Mobility FireGL V5700. Most recently I was trying to get rpcminer-opencl working for bitcoin mining, to no avail. The AMD site just tells me I must download the drivers from HP. I've downloaded and installed the drivers from HP's site and they don't work.
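For what it's worth, the GPU only accelerates the hashing itself; the operation a miner like rpcminer offloads is just a double SHA-256 over the 80-byte block header, compared against the difficulty target. A toy stdlib-Python sketch of that core (orders of magnitude too slow for actual mining, and the little-endian target comparison is my reading of the protocol, not taken from rpcminer's source):

```python
import hashlib

def block_hash(header: bytes) -> bytes:
    """Bitcoin's proof-of-work hash: SHA-256 applied twice."""
    return hashlib.sha256(hashlib.sha256(header).digest()).digest()

def meets_target(header: bytes, target: int) -> bool:
    # The 256-bit hash is interpreted little-endian for the comparison.
    return int.from_bytes(block_hash(header), "little") < target

demo = block_hash(b"\x00" * 80)  # a dummy 80-byte header
print(demo.hex())
```

A real miner just iterates a nonce field inside the header until `meets_target` is true, which is exactly the embarrassingly parallel loop GPUs are good at.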

Re:Cross Platform Support (1)

BeanThere (28381) | more than 2 years ago | (#36286362)

I mean, they do work in the sense that my display card is working, but GPU OpenCL apps do not work.

Re:Cross Platform Support (2)

WaroDaBeast (1211048) | more than 2 years ago | (#36287394)

What's your laptop's model name? Also, what OS are you running on that machine? The FireGL V5700 being based on the HD 3650M, it could be possible to mod the drivers for them to work on your computer.

Re:Cross Platform Support (1)

PitaBred (632671) | more than 2 years ago | (#36288658)

That's funny. I have poclbm running on my Mobility Radeon 5830, and both my desktop's Radeon 6950's (unlocked to 6970's and overclocked).

Perhaps you need to install the OpenCL dev kit or something? Because if the consumer level parts can run the miners, the pro level cards should have no problem. Try downloading the drivers from AMD and not HP: http://support.amd.com/us/gpudownload/windows/Pages/radeonmob_win7-64.aspx [amd.com]

I don't know for sure, though.
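One stdlib-only way to check whether any OpenCL implementation is actually registered on a Linux box: the ICD loader conventionally discovers vendor drivers through files in /etc/OpenCL/vendors, so if nothing there names your vendor's library, apps will report no OpenCL devices no matter what drivers are installed. A diagnostic sketch (the directory path is the conventional one and may differ per distro):

```python
import glob
import os

def installed_icds(vendor_dir="/etc/OpenCL/vendors"):
    """Map each registered OpenCL ICD file to the driver library it names."""
    icds = {}
    for path in glob.glob(os.path.join(vendor_dir, "*.icd")):
        with open(path) as f:
            icds[os.path.basename(path)] = f.read().strip()
    return icds

print(installed_icds() or "no OpenCL ICDs registered; apps will see no devices")
```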

Re:Cross Platform Support (1)

BeanThere (28381) | more than 2 years ago | (#36304094)

Hmm, thanks, have tried installing both the Stream SDK and the AMD driver, still doesn't work. GPU caps viewer recognizes the GPU but says I have no OpenCL GPU devices.

Re:Cross Platform Support (1)

Nursie (632944) | more than 2 years ago | (#36289054)

OpenCL on nVidia seems broken too. At least under linux.

I can get pure OpenCL stuff working, but the moment I try to use the CL/GL Interop stuff it just stops working, this despite all the right capabilities on the card.

Makes it less useful for me as I wanted to use CL to draw GL textures. Maybe I should try CUDA instead, they're supposed to be similar.

Re:Cross Platform Support (0)

Anonymous Coward | more than 2 years ago | (#36286170)

Umm, CUDA is an Nvidia proprietary standard, i.e. there is no AMD/ATI hardware that supports CUDA unless someone writes their own wrapper/driver. AMD supports OpenCL.

Re:Cross Platform Support (1)

UnknowingFool (672806) | more than 2 years ago | (#36285904)

ATI, Nvidia, and Intel provide open source drivers for Linux. However, they don't all provide the same level of support. For example, Intel and ATI support for VAAPI is not as strong as NVidia's [mythtv.org].

Re:Cross Platform Support (0)

Anonymous Coward | more than 2 years ago | (#36287878)

Nvidia provides good drivers for Linux but they are *not* open source; there are reverse-engineered open source ones, but these are very much inferior to the closed ones. Neither the open nor the closed drivers cover their dual-graphics setups (power saving vs. high power when needed) for laptops in any mainstream-usable way, although the open source drivers may at least be able to drive the screen with them soon.

AMD provides reasonable closed drivers for Linux and officially supports the open source efforts. The closed drivers are not as reliable as Nvidia's but are much better than they used to be. Although there is a serious delay between hardware release and good open source drivers, it is dropping, with initial support coming quickly, and the drivers for the older chips now exceed the performance of the closed drivers even on Windows. They are, in my probably biased assessment, the best bet for now.

Intel provides open drivers and historically has been the best, but their recent use of a licensed graphics core has KO'd their open efforts for all chips using the new core; do not expect the situation to improve fast.

VAAPI and OpenCL are latecomers compared to OpenGL support. Both are being built into the open stack in such a way that they will be easy to make available across most of the newer drivers as soon as they graduate from the experimental implementations, but this will take at least half a year from the looks of things, and then only via a PPA or some equivalent.

Does it fry? (1)

Carewolf (581105) | more than 2 years ago | (#36285432)

Is it guaranteed to fry my laptop like the last mobile NVidia chipset I bought? (140NVS)

Re:Does it fry? (1)

WaroDaBeast (1211048) | more than 2 years ago | (#36285460)

I've just had a quick look at the nVidia GPU comparison page over at wikipedia, and the NVS 140M sports the infamous G86M chip, which was known to be faulty. You remember all the flak nVidia got due to the 8400M/8600M fiasco, don't you?

Re:Does it fry? (1)

Carewolf (581105) | more than 2 years ago | (#36285504)

Of course I do; some of us were unfortunate enough to have that piece of shit hardwired into our laptops. Being a ThinkPad user, I wasn't even in the category of users who were offered the insulting replacement laptop.

I think NVidia needs to be reminded of this for a long long time.

Re:Does it fry? (1)

rhook (943951) | more than 2 years ago | (#36305714)

No, you were in the category of users who were offered a free mainboard replacement, even if your warranty had expired. IIRC Lenovo is still repairing systems that had one of these defective chips, free of charge.

Re:Does it fry? (1)

Carewolf (581105) | more than 2 years ago | (#36307162)

No, the program officially stopped a month before my laptop died, and even though it was extended the extension stopped the week before.

Re:Does it fry? (1)

xMrFishx (1956084) | more than 2 years ago | (#36285514)

Ah, still having that problem, are they? Every G92 I can lay my eyes on from that generation has died too, in every friend's machine that's had one. My own 2007 MBP had the chip fail as well; luckily Apple honoured their fit-for-purpose recall and replaced it for free. Glad I swapped to ATI for the desktop after my G80 GTX started to breathe its last last year, though I think that one is suffering memory death rather than chip death, and unlike all the G92s I've seen it still works (barely).

Re:Does it fry? (1)

iamhassi (659463) | more than 2 years ago | (#36292720)

I've just had a quick look at the nVidia GPU comparison page over at wikipedia, and the NVS 140M sports the infamous G86M chip, which was known to be faulty. You remember all the flak nVidia got due to the 8400M/8600M fiasco, don't you?

8400M/8600M fiasco [wikipedia.org]:
"Some chips of the GeForce 8 series (concretely those from the G84 and G86 series) may suffer from an overheating problem. NVIDIA states this issue should not affect many chips,[37] whereas others assert that all of the chips in these series are potentially affected.[37] NVIDIA CEO Jen-Hsun Huang and CFO Marvin Burkett were involved in a lawsuit filed on September 9, 2008 alleging that their knowledge of the flaw, and their intent to hide it, resulted in NVIDIA losing 31% on the stock markets.[38] The reason for the high failure rate was because of improper selection of the underfill material for the chip. Underfill materials are a type of glue that keeps the silicon die firmly attached to the packaging material, which is where the connection to the actual pins takes place. On the affected chips, the working temperature of the underfill material was too low for the task and allowed the chip to move slightly if temperature was raised above a certain level, weakening the solder joints by which the die is attached. This eventually leads to a catastrophic failure, although the way the chip fails is quite random."

I will never buy another NVIDIA powered laptop after this [slashdot.org]:
"As part of a December 2010 settlement agreement, NVIDIA agreed to provide all owners of laptops containing a defective NVIDIA GPU with a laptop of similar kind and value. In February, NVIDIA announced that a $279 single-core Compaq CQ56 would be provided as a replacement to all laptops — from $2500 dual-core tablet PCs to $2000 17" entertainment notebooks. "

That's the most f***** up thing I've ever heard. That's like Porsche replacing my car with an Escort when the engine explodes. Any positive stories about NVIDIA should be banned from Slashdot after that.

(Almost) nothing new here, move along (2)

WaroDaBeast (1211048) | more than 2 years ago | (#36285476)

Bah, the GTX 560M is just a refresh of the GTX 460M. It sports the GF116 chip instead of the GF106 and has got higher clocks, but that's all. *shrugs*

Re:(Almost) nothing new here, move along (0)

Anonymous Coward | more than 2 years ago | (#36288356)

Isn't that basically what the desktop GTX 560 is compared to the GTX 460 - the architecture revision from GF106 to GF116?

Re:(Almost) nothing new here, move along (1)

WaroDaBeast (1211048) | more than 2 years ago | (#36290708)

Technically, it went from GF104 to GF114, but yeah, it got pretty much the same treatment: newer revision with SPs, TMUs and ROPs remaining the same, plus higher clocks.

Makes me wanna say: Whoop-de-fucking-doo. :|

Re:(Almost) nothing new here, move along (1)

SpazmodeusG (1334705) | more than 2 years ago | (#36292900)

Yeah, there really isn't much difference between the 400 and 500 series. I think this list shows it best - Link [wikipedia.org]

E.g. the GT 530 is the GT 430. Not just like it; it is the same card. Further down the list, in the mobile series, the 540M is the 435M just clocked a bit higher, etc. It's pretty much the same for the entire 500 series range: internally the same as the 400 series, with minor improvements if any.

What's interesting is that often the same 500 series card has a higher sub-model number than the equivalent 400 series edition. So you might think the 540M would be miles ahead of the 435M, but in reality there's not much difference at all.

Re:(Almost) nothing new here, move along (1)

WaroDaBeast (1211048) | more than 2 years ago | (#36293684)

Now, now, don't tell me you've never heard of nVidia's "The Way It's Meant To Be Renamed." ;)

truth coming out all over the wwworld, inescapable (0)

Anonymous Coward | more than 2 years ago | (#36285486)

then there's us. pretending any futher could render us as a margin error in history.

Re:truth coming out all over the wwworld, inescapa (0)

amn108 (1231606) | more than 2 years ago | (#36285602)

Not bad robot, but you have much to learn of human ways ;-)

Mobile? (1)

Lord Lode (1290856) | more than 2 years ago | (#36285548)

How come every time /. reports something on new NVidia cards, it's about crappy mobile versions? I'm interested in true computing power!

Re:Mobile? (1)

Hadlock (143607) | more than 2 years ago | (#36285926)

I wouldn't mind a direct comparison to their desktop counterparts, either. It's always so hard to equate the mobile versions to a more oft-benchmarked desktop version.

Nice HotHardware advertisement (0)

Anonymous Coward | more than 2 years ago | (#36285566)

MojoKid has made 10 submissions from HotHardware.com in the last 14 days. How is this one any more notable than every other old-graphics-hardware-in-mobile-form release?

oh my (1)

amn108 (1231606) | more than 2 years ago | (#36285578)

It seems to me that the sheer power of this mobile thing overshadows the performance of my entire Centrino laptop :/

Native NVidia support on Linux (0)

Anonymous Coward | more than 2 years ago | (#36285776)

I am getting a bit tired of dancing with Nouveau every time I set up a CUDA box.

Re:Native NVidia support on Linux (1)

IronSight (1925612) | more than 2 years ago | (#36285920)

That's your distro trying to shove the free driver down your throat. I personally use Sabayon since it has many flavors, and it automatically installs the restricted nvidia driver for you on a fresh install, saving a ton of hassle. I'm sure there are other distributions that do not push the free driver, but I am unable to think of any offhand.
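For distros that do push nouveau, the usual manual workaround is to blacklist the free driver so the proprietary one can bind (the file path below is the conventional modprobe location; exact details vary by distro):

```
# /etc/modprobe.d/blacklist-nouveau.conf
blacklist nouveau
options nouveau modeset=0
```

After adding it you typically need to regenerate the initramfs and reboot before installing NVIDIA's binary driver.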

Bravo (1)

MonsterTrimble (1205334) | more than 2 years ago | (#36286342)

Now that we have that out of the way, how about you guys fix the bugs in the nvidia-96 driver for Linux? You know, the one that calls for xorg wrong?

No 'Optimus' support in Linux (3, Interesting)

Behemoth (4137) | more than 2 years ago | (#36286458)

Be wary on the Linux side of the 'Optimus' technology. I didn't do due diligence and impetuously ordered a new laptop from Dell with an nVidia card (GT 525M). Turns out that there was no way in the Dell laptop to turn it off, and Linux couldn't see the nVidia card, just the intermediating Intel card. The ‘automatic graphics switching’ is done in software only under Win7. End result - no OpenGL under Linux. End-end result, I sent it back.

There is a project to get Optimus working on Linux (https://github.com/MrMEEE/bumblebee) but I really don't have time, and the switching has to be done manually at the moment.

Re:No 'Optimus' support in Linux (1)

Anonymous Coward | more than 2 years ago | (#36287576)

Be wary on the Linux side of the 'Optimus' technology. I didn't do due diligence and impetuously ordered a new laptop from Dell with an nVidia card (GT 525M). Turns out that there was no way in the Dell laptop to turn it off, and Linux couldn't see the nVidia card, just the intermediating Intel card. The ‘automatic graphics switching’ is done in software only under Win7. End result - no OpenGL under Linux. End-end result, I sent it back.

There is a project to get Optimus working on Linux (https://github.com/MrMEEE/bumblebee) but I really don't have time, and the switching has to be done manually at the moment.

In our Latitude R6420 series of machines, you can disable Optimus via a BIOS switch and the system will forget the Intel card exists. Not sure if you really had a hard-on to use Optimus (which we certainly don't), but there is a switch to turn it off.

Re:No 'Optimus' support in Linux (1)

Behemoth (4137) | more than 2 years ago | (#36288598)

I'll take a look at it. I have no use for Optimus - I just need a laptop with basic CUDA support. The model I unwittingly got was an Inspiron, and there didn't seem any way to disable Optimus. I'd love to be wrong (no RMA until after the holidays so it's not out the door yet) but I saw no options in the BIOS. I'm used to not being able to use the bleeding edge features in Linux, I'm just not used to being shut out completely, at least by nVidia products.

Thanks m(__)m

Why does nVidia even bother? (0)

Anonymous Coward | more than 2 years ago | (#36289058)

By limiting the memory bus to 64 bits, they're effectively destroying any performance these cards may have attained. Last time I tried one of these crippled things I was barely able to run the ORIGINAL Unreal Tournament, Counter-Strike, or Serious Sam: The First Encounter at 20-30fps at native display resolution. Which is fucking ridiculous, considering this is 2011. Come on, nVidia, stop crippling your damn chips!
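The 64-bit complaint is easy to quantify, since peak memory bandwidth is just bus width times effective transfer rate. A quick comparison (double-pumped DDR3 on the low-end part and quad-pumped GDDR5 on a 192-bit GTX 560M are assumed here; the bus widths are not from the article):

```python
def bandwidth_gbs(bus_bits, mem_clock_mhz, transfers_per_clock):
    """Peak memory bandwidth in GB/s: bytes per transfer x transfer rate."""
    return bus_bits / 8 * mem_clock_mhz * transfers_per_clock / 1000.0

low_end = bandwidth_gbs(64, 900, 2)     # 64-bit DDR3 @ 900 MHz (assumed)
gtx_560m = bandwidth_gbs(192, 1250, 4)  # 192-bit GDDR5 @ 1250 MHz (assumed bus)
print(f"{low_end:.1f} GB/s vs {gtx_560m:.1f} GB/s")
```

Under those assumptions the narrow-bus part has well under an eighth of the bandwidth, which is why the crippled chips choke at native resolution regardless of shader count.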

Wait! (0)

Anonymous Coward | more than 2 years ago | (#36290570)

Didn't Slashdot tell me back in 1999 that the GeForce 2 was the only card I'd ever need?

Another mediocre mobile graphics chipset. (0)

Anonymous Coward | more than 2 years ago | (#36292290)

In a decade-long line of mediocre mobile graphics chipsets. Probably overheats in 15 minutes too!
