
NVIDIA Shows Off "Optimus" Switchable Graphics For Notebooks

Soulskill posted more than 4 years ago | from the that's-some-prime-namespace dept.


Vigile writes "Transformers jokes aside, NVIDIA's newest technology offering hopes to radically change the way notebook computers are built and how customers use them. The promise of both extended battery life and high-performance mobile computing has seemed like a pipe dream, and even the most recent updates to 'switchable graphics' left much to be desired in terms of the user experience. Having both an integrated and a discrete graphics chip in your notebook does little good if you never switch between the two. Optimus allows the system to seamlessly and instantly change between the IGP and a discrete NVIDIA GPU based on the task being run, including games, GPU encoding, or Flash video playback. Using new software and hardware technology, notebooks with Optimus can power the GPU on and pass control to it in roughly 300ms, and power both the GPU and its PCIe lanes completely off when not in use. This can be done without being forced to reboot or even close your applications, making it a hands-off solution for the customer."
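
For readers who want a concrete mental model, here is a rough, hypothetical sketch of what profile-driven routing amounts to; the class, function, and profile names below are invented for illustration and are not NVIDIA's actual API:

    # Hypothetical sketch of profile-driven GPU routing; names and the profile
    # list are made up for illustration, not NVIDIA's actual implementation.
    GPU_WORTHY = {"game.exe", "mediaencoder.exe", "flash_video.exe"}   # assumed per-app profiles

    class OptimusLikeRouter:
        def __init__(self):
            self.discrete_on = False

        def route(self, app_name):
            """Pick a renderer for the launching application."""
            if app_name in GPU_WORTHY:
                if not self.discrete_on:
                    self.power_up_discrete()    # reportedly ~300ms in Optimus
                return "discrete GPU"
            if self.discrete_on and not self.active_gpu_clients():
                self.power_down_discrete()      # GPU and its PCIe lanes fully off
            return "IGP"                        # integrated graphics keeps driving the display

        def power_up_discrete(self):
            self.discrete_on = True

        def power_down_discrete(self):
            self.discrete_on = False

        def active_gpu_clients(self):
            return False    # placeholder; a real driver tracks outstanding GPU contexts

    router = OptimusLikeRouter()
    print(router.route("game.exe"))     # -> discrete GPU
    print(router.route("notepad.exe"))  # -> IGP (discrete powered back down)

The point is that the decision happens per application launch, so the user never has to flip a switch, log out, or reboot.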


102 comments


VOODOO (5, Funny)

Hodr (219920) | more than 4 years ago | (#31074738)

I knew if I just held off upgrading my Orchid Righteous 3D (Voodoo 1) card long enough, discrete 3D cards would become relevant again. You guys with your fancy Banshee cards can suck it.

Re:VOODOO (1)

JustNiz (692889) | more than 4 years ago | (#31074832)

oh happy memories.
I'm sure you're finding the massive 640x480 resolution just as awesome as I do.

Re:VOODOO (3, Funny)

Tetsujin (103070) | more than 4 years ago | (#31076570)

oh happy memories.
I'm sure you're finding the massive 640x480 resolution just as awesome as I do.

Well, you're clearly not aware of the nature of the sham that pervades high-resolution graphics.

For instance, graphics hardware manufacturers will happily tell you that a resolution like 1920 x 1080 has nearly seven times as many pixels as 640 x 480. But what they don't tell you is that all of these pixels are a whole hell of a lot smaller than the ones on your good old VGA monitor! With my monitor, I may not have a lot of pixels, but I'm damn sure I'm getting my money's worth out of every single one!

Re:VOODOO (1)

mog007 (677810) | more than 4 years ago | (#31077514)

Exactly, that's why I only have a resolution of 9x9 on my 25 inch screen.

Re:VOODOO (1)

Tetsujin (103070) | more than 4 years ago | (#31077874)

Exactly, that's why I only have a resolution of 9x9 on my 25 inch screen.

Really, that's all you need to play a rousing game of "dot" [youtube.com] ...

Re:VOODOO (0)

Anonymous Coward | more than 4 years ago | (#31079846)

I know I'm offtopic here, but why is youtube turning into a complete pile of shit lately? I have half again of the video cached per the line at the bottom of the screen, yet it keeps pausing..... grrrr.

Re:VOODOO (1)

Tetsujin (103070) | more than 4 years ago | (#31082402)

Yeah, I was getting that, too. Really annoying...

Re:VOODOO (1)

Pojut (1027544) | more than 4 years ago | (#31074872)

Pfft. Everyone knows the Monster3D was where it was at! [wikipedia.org]

Re:VOODOO (1)

Hodr (219920) | more than 4 years ago | (#31074984)

Same card (both used the reference 3DFX Voodoo 1 chipset), but the Orchid card came out first ;)

Re:VOODOO (0)

Anonymous Coward | more than 4 years ago | (#31075168)

I'm glad I held out for the Pure3D. The extra 2 MB of ram will really help with larger texture sizes for years to come.

Re:VOODOO (1)

Pojut (1027544) | more than 4 years ago | (#31075260)

Ah yes, I forgot about that one! That was from Canopus, right?

Boring (1)

toastar (573882) | more than 4 years ago | (#31075624)

Please wake me when a company has brought the GPU on die.

Re:Boring (2, Insightful)

Khyber (864651) | more than 4 years ago | (#31076920)

Why? So when my GPU fucks itself I have to buy a whole new cpu/GPU combo?

No thanks, I'll stick with discrete individual parts - makes repairs and upgrades so much easier.

Re:Boring (1)

spire3661 (1038968) | more than 4 years ago | (#31078888)

In a CPU/GPU package the odds of JUST the GPU failing are pretty small. Plus you save a lot of money/space/power by not having to make a whole separate PCB for the GPU.

Re:Boring (0)

Anonymous Coward | more than 4 years ago | (#31079536)

Which is -exactly- why they want the GPU on-die... Why let people buy -one- upgrade when you can force them to get two!

Re:Boring (1)

petermgreen (876956) | more than 4 years ago | (#31098558)

Well, Intel can put graphics and processor in the same package without putting them on the same die. Indeed, they have already done so with their latest dual-core chips (the current-gen quad-core chips don't have any support for shared-memory graphics at all).

The thing is, Intel has failed to make decent graphics solutions (they are getting better but are not yet up to the standard of even the integrated graphics in nvidia chipsets, let alone dedicated graphics cards), and nvidia hasn't even tried to make x86 processors, which is a bit of a problem for making a decent single-chip solution.

Re:VOODOO (1)

spire3661 (1038968) | more than 4 years ago | (#31078854)

Voodoo2 had pass-through too. Voodoo 2 SLI + S3 ViRGE VGA with pass-throughs = wiring nightmare in the back of the case.

Re:VOODOO (0)

Anonymous Coward | more than 4 years ago | (#31083720)

BAH, 3dfx people.

I'm sure Rendition will come back and prove to you all that vQuake is better than GLQuake.
I will just keep holding on to my Verite V1000 a tiny bit longer.

Crap (1, Funny)

Anonymous Coward | more than 4 years ago | (#31074780)

Transformers jokes aside

This article is ruined for me :(

Re:Crap (1)

Vigile (99919) | more than 4 years ago | (#31075248)

In the pcper.com video on the first page there is at least some homage to the Transformers that starts right around the 2:30 mark...

http://www.pcper.com/article.php?aid=868 [pcper.com]

I like my desktop. (1)

tjstork (137384) | more than 4 years ago | (#31074806)

I guess when they have dual CPU notebooks with full size keyboards and 21" displays, I might be more interested in them. But I'd also want solid state hard drives and hdmi cables to wire them to the TV...

these guys are close...

http://hothardware.com/News/Eurocom_launches_QuadCore_XEON_Based_Notebook_/ [hothardware.com]

But oddly, I would like to have an SSI EEB desktop case, that lies flat, like old PCs used to...

Re:I like my desktop. (3, Insightful)

Monkeedude1212 (1560403) | more than 4 years ago | (#31074900)

What all the cool kids are doing is dropping cases altogether. That's right, nothing looks more badass than your motherboard lying on the desk with silicon chips sticking up in the air, with a giant fan overhead to help keep things cool and circulated. Your friends will be so jealous of all the blinking lights.

As for Optimus, I think it's a great idea. This change can come to desktops as much as it has to notebooks, if there is enough demand for such a product.

Think about it: you had to factor in the power supply when you bought that new graphics card, so imagine how much power it's actually eating up. Imagine if your desktop didn't have to draw that much power when it didn't need to.

Re:I like my desktop. (1)

Ethanol-fueled (1125189) | more than 4 years ago | (#31075696)

Thats right, nothing looks more badass than your motherboard laying on the desk with silicon chips sticking up in the air

That technique is also helpful for troubleshooting and verifying laptops before putting them back together, because reassembling one only to find it still doesn't work is much more of a hassle. And you don't even need a big fan as long as all the motherboard fans are attached; just make sure everything is laid flat on an ESD mat or other protective surface.

Re:I like my desktop. (2, Funny)

datapharmer (1099455) | more than 4 years ago | (#31076168)

NO NO NO! You can't do that! You'll fry the thing with static electricity, coffee, and dust. Every true nerd knows you've got to brown-bag it. That's right, a quick trip to the feed supply for a burlap sack and you are working with some serious overclocking potential. It is breathable, protects from dust and light spills, and you can hatch chicks on the heat sink...

Re:I like my desktop. (1)

MrNemesis (587188) | more than 4 years ago | (#31089134)

What with nVidia's unceremonious exit from the IGP market thanks to Intel's licensing, and the introduction of every-Intel-chip-comes-with-a-GPU, tech like this is a shrewd, and pretty essential, move by nVidia in order to remain relevant in the middle tiers. If people, whether on a laptop or desktop, can get the power savings of an Intel IGP with the ability to fall back on to a decent GPU, they'll claw back a good deal of marketshare from "prosumers" and the like. Conversely, ATI has made incredible improvements in the idle power savings of their GPUs, so it remains to be seen if process technology makes the software complexity worth it.

The only question is whether they'll stick at it in terms of driver support. IIRC their hybrid power initiative lasted only for a couple of card revs a year or two ago. Didn't RTFA but suspect this'll only work on Win7 where the WDDM allows for two different GPU HALs whereas Vista did not.

Re:I like my desktop. (1)

Jeremy Erwin (2054) | more than 4 years ago | (#31074992)

A twelve pound notebook? Sounds like a niche product.

Re:I like my desktop. (1)

Anne Thwacks (531696) | more than 4 years ago | (#31075684)

A twelve pound notebook?

By my estimation, GBP 12 = NGN 3,000 - I hope it's as good as OLPC!

Re:I like my desktop. (1)

mhajicek (1582795) | more than 4 years ago | (#31076312)

Re:I like my desktop. (1)

Jeremy Erwin (2054) | more than 4 years ago | (#31076874)

Yes, but that computer wasn't competing with lighter devices. If you wanted an IBM PC that could be easily moved, the Portable PC was (or was close to) your only option. Now we have laptops, netbooks, tablets, PDAs and so on. If you were a field engineer or scientist and had, say, an ADC PCI card or a general-purpose GPU that you needed to use, there are options. Niche options.

But the lighter a computer is, the more often it will be carried around. Even eight pounds can be a burden, presenting the user with a choice between having it around, and leaving the heavy bag behind.

I know that many textbooks are heavier than eight pounds. But ask yourself whether you instinctively carry around a few volumes of The Art of Computer Programming on the slim chance that you'll be able to enjoy them over lunch.

Re:I like my desktop. (5, Funny)

vlm (69642) | more than 4 years ago | (#31075274)

I guess when they have dual CPU notebooks with full size keyboards and 21" displays, I might be more interested in them. But I'd also want solid state hard drives and hdmi cables to wire them to the TV...

But... But ... But ... Marketing told me you guys wanted postage-stamp-size touch-sensitive screens, batteries that last two hours, and 3-second e-ink refresh rates. And it's gotta use a cloud, whatever that is. And an app store, gotta have an app store. I guess you must be wrong.

Re:I like my desktop. (1)

tjstork (137384) | more than 4 years ago | (#31075780)

I guess you must be wrong.

Here's the crazy part. I don't even care about battery life or even having a battery. I just want something I can plug in wherever.

Re:I like my desktop. (2, Funny)

Chris Burke (6130) | more than 4 years ago | (#31076164)

Here's the crazy part. I don't even care about battery life or even having a battery. I just want something I can plug in wherever.

Sounds good, as long as we don't let the folks in the adult novelty department get word of it.

Re:I like my desktop. (1)

maxume (22995) | more than 4 years ago | (#31075572)

You can plug keyboards and displays into a notebook.

HDMI is currently only somewhat available, and SSDs are a tough trade-off if you are concerned about the amount of drive space (without an external drive).

Dual CPUs no, but multiple cores yes.

And they cost more.

Still, the number of people with needs that are not met by an $800 laptop is shrinking pretty fast.

Re:I like my desktop. (1)

PitaBred (632671) | more than 4 years ago | (#31075584)

I like my desktop too. But I can't carry it on the plane with me, it's a pain in the ass to haul to a friend's house when we want to do some LAN play, and I can't bring it to the other places I go so I still have somewhere to offload photos and such.

Desktops are great as long as you never leave your house, or never need or want a computer when you do so.

Re:I like my desktop. (0)

Anonymous Coward | more than 4 years ago | (#31077396)

21"? Lol I like my 27" IPS desktop. 21" is practically laptop sized. Gimme a break.

And with quadcore i5 laptops out shortly?

no transformers jokes? (4, Funny)

DJCouchyCouch (622482) | more than 4 years ago | (#31074858)

But that's my Prime form of entertainment!

Re:no transformers jokes? (0, Offtopic)

Moheeheeko (1682914) | more than 4 years ago | (#31074898)

*rimshot*

Re:no transformers jokes? (4, Funny)

twentynine (984768) | more than 4 years ago | (#31075952)

You might wanna Jazz up that joke a little bit...

Re:no transformers jokes? (1)

Snyper1000 (987002) | more than 4 years ago | (#31078010)

Or at least composite some other jokes in ;)

Re:no transformers jokes? Bah! (1)

Tetsujin (103070) | more than 4 years ago | (#31076728)

Yes, we can talk about hardware without making a bunch of stupid jokes about its name*.

One of the great features of the Optimus chipset is its pipelining architecture, called the "Convoy". With this system a number of pending GPU tasks can be stored into containers, and the GPU hardware will process them quickly, moving the data to its destination, transforming it as necessary, etc. But the hardware apparently kept dying on them during the demonstration: they were able to get it up and running again each time, but it happened at least three times.

Rumors are already spreading about the planned successor technology, known simply as "Ultra". It will basically be a beefed-up version of "Optimus"... Though there are rumors it won't be quite as flexible.

AMD is working on their own competing product, called "Hot Rod" - it really hasn't gained much of a following so far, though. I've also heard about something called "Ironhide" - apparently it's designed to provide a GPU for processing functions on headless systems...

(* Doesn't necessarily mean we will...)

What a relief (5, Funny)

s2theg (1185203) | more than 4 years ago | (#31074862)

"Optimus can power on and pass control to the GPU in a matter of 300ms"

That's good. I'm tired of finishing before my video player can render the first frame.

Re:What a relief (1)

ArsonSmith (13997) | more than 4 years ago | (#31075226)

"Optimus can transform and roll out"

too

Re:What a relief (2, Informative)

Bluesman (104513) | more than 4 years ago | (#31081208)

Getting older will help your stamina too.

MacBook Pros (1, Informative)

Stele (9443) | more than 4 years ago | (#31074886)

I believe the latest model MacBook Pros have been doing this for at least a year.

something like it on linux (5, Informative)

je ne sais quoi (987177) | more than 4 years ago | (#31075020)

For years, the proprietary NVIDIA drivers for Linux have had a feature called PowerMizer that changes the performance of the GPU based on what the PC needs. E.g., under normal conditions the GPU is underclocked, but when you open an OpenGL window or run a game, the GPU bumps up to full speed. In principle it sounds like a great idea, but it was really annoying to wait around for what seemed like at least a year for NVIDIA to get it to run well enough with a compositing manager like Compiz. For a long time, things like highlighting text in Firefox and then dragging it led to flickering of the screen, and the new KDE, which has compositing built right in, didn't work well either. During that period we had to do things like fool the GPU into running full tilt all the time, because NVIDIA didn't give us an option to switch PowerMizer off until AFTER they fixed the problems with it.
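
(For context, "fooling the GPU into running full tilt" generally meant pinning the performance level through the driver's registry keys in xorg.conf. The exact key names and values varied by driver version; the snippet below is recalled from community posts of that era and should be treated as an unverified example rather than a recommendation:)

    Section "Device"
        Identifier "nvidia-card"
        Driver     "nvidia"
        # Reportedly pins PowerMizer at its highest performance level on both AC
        # and battery; value recalled from forum posts and may not match your driver.
        Option "RegistryDwords" "PerfLevelSrc=0x2222"
    EndSection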

Re:something like it on linux (1)

Dragoniz3r (992309) | more than 4 years ago | (#31078864)

That's a little different from swapping between a low-power "integrated" GPU and a high-performance discrete GPU, which is what I read the article as saying this technology does.

Re:something like it on linux (1)

lab16 (416283) | more than 4 years ago | (#31081040)

"For a long time, things like highlighting text in firefox and then dragging it led to flickering of the screen, "

I had a monitor that would flicker whenever I opened certain windows, but only while Compiz was enabled. It didn't seem to flicker if the window was too small, or for anything other than large windows with Compiz enabled, and it seemed to be due to the "beam-up" animation that was displayed whenever new windows were opened. After it first flickered when opening a window, it would not flicker while opening subsequent windows, until about a minute later, when opening large windows would cause the flicker again. I never thought the root cause could be my video card, thinking instead that it was the monitor, so I didn't check what PowerMizer was doing at the time. That monitor died sooner rather than later and I replaced it with the same model (HannsG HG281D), which does not have this problem, so maybe it was a combination of PowerMizer and the monitor?

Re:something like it on linux (1)

Hurricane78 (562437) | more than 4 years ago | (#31082568)

Weird. I used an embedded nVidia chip (7050PV) last year, and I never had those problems. Did you forget the following options in your xorg.conf?

Option "AddARGBGLXVisuals" "true"
Option "UseEvents" "false" # This option must be either undeclared or false, in order to avoid periodic short-term freezes on beryl and other OpenGL intensive programs

Re:MacBook Pros (3, Informative)

Vigile (99919) | more than 4 years ago | (#31075178)

Nope, not really. I have one of those, and the video in the PCPer article shows the process on a MacBook Pro. You have to change a setting in the control panel and then log out of the system to change GPU modes.

Re:MacBook Pros (1)

99BottlesOfBeerInMyF (813746) | more than 4 years ago | (#31075964)

I have one of those and the video on the PCPer article shows the process on a MacBook Pro.

I read the PCPer article, but I don't recall any videos showing OS X. Are you referring to the process of switching graphics modes under Windows or under OS X?

Re:MacBook Pros (1)

Stele (9443) | more than 4 years ago | (#31075984)

Ah, well the MBP solution is not nearly as cool as I thought it was then.

Re:MacBook Pros (1)

moonbender (547943) | more than 4 years ago | (#31077246)

That's such an awkward solution that I was always amazed it was allowed to appear in an Apple product. Or any other product for that matter.

Re:MacBook Pros (1)

Khyber (864651) | more than 4 years ago | (#31077014)

Nope, they cannot, just an effect of OSX's security permissions and driver model. Rebooting is required.

Besides, what's the point of having dual GPUs if you can't use both simultaneously for really heavy data processing?

Oh, that's right - Intel IGP, nVidia Discrete - you couldn't SLI it anyways without some serious hardware and software workarounds.

Re:MacBook Pros (2, Interesting)

Belisar (473474) | more than 4 years ago | (#31079424)

The new thing seems to be that you can actually switch between the onboard and 'real' GPU on the fly and fast while everything is running.

The previous laptops with switchable graphics, such as my Sony Vaio, which had a GeForce and an Intel chip, did have to at least reboot the graphics system (on OS X) or reboot the whole computer (Windows) in order to go to the power-saving mode.

In my experience, I usually was too lazy / didn't want to close my work and kept using the good GPU all the time. The only times I'd work up the enthusiasm to actually switch over were before a flight or something, when I knew I wouldn't need the power.

Hey if it extends battery life... (2, Interesting)

planckscale (579258) | more than 4 years ago | (#31074896)

...I'm all for it. But by how much will it extend the battery life? And when they say it will "drastically" change the notebook market, I doubt that; netbook folks won't care about 3D and desktop-replacement folks don't care if their machine is plugged in. Maybe in a smaller segment of mobile gamers this will make a difference.

Re:Hey if it extends battery life... (1)

itof500 (239202) | more than 4 years ago | (#31075562)

It is actually pretty nice to have the long battery life during the work/meeting day, and then plug it in and boost the graphics in the hotel room to participate in the guild raid that night.

Re:Hey if it extends battery life... (1)

Monkeedude1212 (1560403) | more than 4 years ago | (#31076518)

Ah, but you can use hardware acceleration in your desktop environment, and you might not always want it on. Playing video, running something like Photoshop - there's a bunch of stuff that uses the video card that isn't a video game. Just FYI. So if you are sitting there browsing Slashdot for an hour, it can switch to the integrated low-power one, but as soon as you boot up Media Player or something, it can switch to your full-blown power monster.

Re:Hey if it extends battery life... (1)

oKtosiTe (793555) | more than 4 years ago | (#31078812)

...I'm all for it. But by how much will it extend the battery life? And when they say it will "drastically" change the notebook market, I doubt that; netbook folks won't care about 3D and desktop-replacement folks don't care if their machine is plugged in. Maybe in a smaller segment of mobile gamers this will make a difference.

I'm one of the "netbook folks", and the prospect of being able to play video, or even basic accelerated games, without running out of juice in less than half the regular time sounds great to me.

Re:Hey if it extends battery life... (1)

VoltageX (845249) | more than 4 years ago | (#31080382)

Actually, with my newest desktop using 350W under full load I could use this.

HybridSLI? (0)

Anonymous Coward | more than 4 years ago | (#31074926)

Doesn't Nvidia already have something like this in their HybridSLI technology? I remember reading about it in the manual for my last motherboard, but haven't ever used it since I don't have a discrete card in that machine. Is this the same thing, just applied to laptops?

Or a rebranding to create buzz?

Re:HybridSLI? (3, Informative)

Vigile (99919) | more than 4 years ago | (#31075210)

Read the article at pcper.com - it talks about the current versions of switchable graphics and how the new Optimus differs.

It's not a cosmetic change.

Can't they make a 'smarter' GPU? (2, Interesting)

JSBiff (87824) | more than 4 years ago | (#31074936)

I would have thought that, instead of switching between a 'low power' video chip and a 'high power' GPU, they would have concentrated on just making the NVIDIA graphics cards use less power when not doing things like rendering 3D graphics or decoding video. I mean, mobile CPUs have some smarts built into them to allow them to vary how much power they consume; can't they do that with GPUs?

Re:Can't they make a 'smarter' GPU? (1)

MBCook (132727) | more than 4 years ago | (#31075634)

Not entirely. While you can do that, the chip in a laptop still has some real limits. You can't give off more than X watts of heat, because the laptop just can't dissipate it.

But if the GPU used for high-intensity activities (such as games) is external to the laptop, you can have it give off 150 watts of heat, because the external enclosure can provide the necessary cooling capacity.

I'd love something like this. I have my MacBook Pro which I really like, but don't do much in the way of 3D. I'd love to be able to plug in a good external card once in a while to use for gaming sessions, and let the internal GPU be lower powered. There are others with enthusiast/desktop replacement laptops this would be very good for as well.

Re:Can't they make a 'smarter' GPU? (1)

Neil Hodges (960909) | more than 4 years ago | (#31075848)

Sounds like a new style of docking station with a high-end GPU built-in would be a good direction to go in.

Re:Can't they make a 'smarter' GPU? (0)

Anonymous Coward | more than 4 years ago | (#31076706)

If only this existed. Oh wait, it does:

http://www.amd.com/us/products/technologies/ati-xgp/Pages/ati-xgp.aspx

Re:Can't they make a 'smarter' GPU? (1)

Khyber (864651) | more than 4 years ago | (#31077094)

And it's a piece of shit, too.

"providing 4 GBytes/s bandwidth to support ATI Radeon graphics cards to enable you to run the most demanding graphics applications."

Sorry, my crap onboard 8600GS tears that apart at 22.4GB/s

Needs moar lanes.

Re:Can't they make a 'smarter' GPU? (0)

Anonymous Coward | more than 4 years ago | (#31077746)

Transfer from gpu to graphics ram or from gpu to cpu and normal ram?

There is a difference. For many applications PCIE 4x can be enough and that's what that external port can deliver. PCIE 16x would be 16GBytes/s.

Re:Can't they make a 'smarter' GPU? (1)

Score Whore (32328) | more than 4 years ago | (#31078814)

That 22.4 GB/s is the bandwidth between your GPU and your video card's RAM. It's not the bandwidth between your system and your video card. That bandwidth is a 16 lane PCIe bus. Which, and this might be a surprise to you, is 4 GB/s (250 MB/s per lane * 16 lanes.)
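
(For anyone following the numbers in this sub-thread, here is a quick sketch of the per-lane math. Figures are per direction after line-encoding overhead; which generation and lane count any particular card actually uses is whatever its spec sheet says, not what's assumed here:)

    # Approximate per-direction PCIe host bandwidth vs. lane count, ignoring
    # protocol overhead beyond line encoding. The 22.4 GB/s figure mentioned
    # above is GPU-to-VRAM bandwidth, which is a separate bus entirely.
    per_lane_mb_s = {"PCIe 1.x": 250, "PCIe 2.0": 500, "PCIe 3.0": 985}
    for gen, rate in per_lane_mb_s.items():
        for lanes in (4, 8, 16):
            print(f"{gen} x{lanes}: {rate * lanes / 1000:.1f} GB/s")
    # e.g. PCIe 1.x x16 = 4.0 GB/s and PCIe 2.0 x16 = 8.0 GB/s per direction.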

Re:Can't they make a 'smarter' GPU? (1)

Khyber (864651) | more than 4 years ago | (#31080362)

Incorrect

PCI Express 2.0 is what my card runs on - 16 lanes, 500MB/s.

That's 16GB/s for me. And it's still bottlenecking my GPU.

Re:Can't they make a 'smarter' GPU? (1)

Score Whore (32328) | more than 4 years ago | (#31080624)

So you're on a 2.0 bus. And no, the 8 GB/s, full duplex, still has nothing to do with "bottlenecking your GPU." Your GPU's memory interface is completely separate from its host interface. They have nothing to do with each other.

Re:Can't they make a 'smarter' GPU? (1)

Khyber (864651) | more than 4 years ago | (#31082336)

I have a new PCI-E 3.0 test desktop board as well, with the same GPU onboard (8600GS,) and half the memory (512 on my laptop versus 256 on the desktop.) Same games run better on the desktop board.

That's the same chip, same fab process, same clock speed and power consumption. Different bus, the one on the faster bus has half the memory, and the memory otherwise is the same (GDDR3.)

I suggest rethinking your statement.

Re:Can't they make a 'smarter' GPU? (0)

Anonymous Coward | more than 4 years ago | (#31078744)

Heh.. I remember looking into getting a PCMCIA - PCI adaptor and connecting a voodoo card to it once ^_^

Re:Can't they make a 'smarter' GPU? (2, Interesting)

billcopc (196330) | more than 4 years ago | (#31076890)

The problem with GPU throttling is that it's far more visible (pun intended). If your CPU is rapidly switching between 3.0GHz and, say, 1.2GHz, you probably won't notice at all, but if your game or video app has uneven framerates or the dreaded micro-stutter, you will feel the overwhelming urge to smash your laptop against the nearest brick wall.

GPUs typically have two power modes: power-saving (idle), and full-blast (gaming). Your device drivers kick it into high-power mode whenever you launch a 3D app, so the stutter of switching speeds happens before any animation takes place, and it stays that way until you exit the game. This is representative of typical GPU usage: you're either using it to the max, or not at all. I don't know anyone who runs their games at lower quality settings just to "save power on the GPU", you'll push the flashiest pixels your hardware can handle.

What would be quite appreciated is if the high-end GPUs had a true low-power mode that shuts off all the excess pwnage, but that's just my bias. I tend to buy the fastest GPU I can afford and stick it out for a few years until it starts bothering me.

My latest acquisition, the GTX 295, is a power hog. Even when sitting idle at the desktop, my PC chugs a hearty 400 watts to do nothing, roughly 300W to the two GPUs and the remainder for the CPU and motherboard. While gaming, this number swells to around 800W; again, 3/4 of that goes to the GPUs. I'm fine with the 800W active consumption; it's the idle power draw that bothers me, because I only game for an hour or two a night, 3-4 nights a week. If I replace those two GPUs with a low-end card, my 2D performance is unaffected yet power usage drops to a much cozier 100W. Why the big GPUs need 200 more watts to do absolutely nothing defies even the most usurious logic.

Given the greater number of high-end desktop GPUs versus laptop GPUs, I think they should figure out how to shut down parts of the desktop GPU when not in use, rather than investing in some never-gonna-sell IGP+GPU trickery. The $25 drop on my monthly hydro bill would more than justify the expense of a higher-efficiency device. Hell, that's enough to buy the latest GPU every year!
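
(A quick back-of-the-envelope check on that $25 figure, assuming the machine idles around the clock; the electricity rate below is my own assumption, not something stated in the post:)

    # Rough monthly cost of ~200 W of extra idle draw (the "200 more watts" above).
    extra_idle_watts = 200
    hours_per_month = 24 * 30
    rate_per_kwh = 0.17                    # assumed rate in $/kWh; varies widely by region
    kwh = extra_idle_watts * hours_per_month / 1000.0
    print(f"{kwh:.0f} kWh/month -> ${kwh * rate_per_kwh:.2f}")   # 144 kWh -> about $24.50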

Re:Can't they make a 'smarter' GPU? (1)

spire3661 (1038968) | more than 4 years ago | (#31078962)

Sometimes while playing WoW I'll lower my settings so my GPU (ATI 4850) fan doesn't spin up. For some reason, the frost path that Skadi lays down in Utgarde really taxes my GPU, more than anything else in the game. Once that first pass comes, my GPU is LOUD. For a while the frost was bugging out and would remain long enough for Skadi to make a second pass, and I thought my GPU was gonna pop.

Great (1)

Joshua Fan (1733100) | more than 4 years ago | (#31075022)

Now give this to me in a 10" notebook.

Re:Great (1)

Vigile (99919) | more than 4 years ago | (#31075230)

From what I am told that is coming sooner than you might think. Expect to see something by April!

Linux hybrid graphics (2, Informative)

Anonymous Coward | more than 4 years ago | (#31075216)

The current progress of Linux hybrid graphics. [blogspot.com]

There has been a lot of progress in this area the past few weeks. Wonder if this will let NVIDIA switch GPUs without restarting X.

Linux support (1)

Ltap (1572175) | more than 4 years ago | (#31075232)

I see a lot of hacking that will be necessary to make something like this work. It doesn't seem like something that would automate easily unless it used some kind of profiles system.

Re:Linux support (1)

Vigile (99919) | more than 4 years ago | (#31075320)

Uhh....yah. It does use a profiling system.

That is detailed in the article. :)

correct me if im wrong.. (1)

Moheeheeko (1682914) | more than 4 years ago | (#31075238)

..but don't ATI cards ALREADY do this if you set up the CCC right?

Re:correct me if im wrong.. (1)

asdf7890 (1518587) | more than 4 years ago | (#31075626)

Yep. Mine halves the core and memory clocks when full 3D power is not required (or at least it claims to). Though it still sees the need to keep its cooling fans going at minimum speed during normal operation, so either this doesn't save as much power as I think it should or the fans don't actually have an "off" state (or my case's airflow is insufficient, though I don't believe that to be the case, as nothing gets ridiculously hot under prolonged heavy load).

About time, if it works as advertised. (2, Informative)

Happy Nuclear Death (911893) | more than 4 years ago | (#31075370)

I have suffered from one of the multiple-display-device solutions, in the form of an Alienware M15X, so Optimus sounds like a huge step forward.

While in theory it was nice to have both a battery-friendly Intel GMA and a reasonably powerful Nvidia GeForce card in one (relatively) portable package, in reality it was lousy. As suggested by TFA, you had to reboot to switch between them, whether running Windows XP or Vista. That would have been bad enough, but wait, there's more!

This effectively meant that I could never switch, because we mere users were not permitted to authorize UAC prompts or do "admin" things under XP. Yes, you needed administrator-level access to switch between display devices. I don't know why; maybe because it involved changes to startup files. Huge software limitation there, as well as a shortcoming of our boneheaded IT rules.

But you really shouldn't have to reboot to switch devices.

Re:About time, if it works as advertised. (1)

asdf7890 (1518587) | more than 4 years ago | (#31075768)

But you really shouldn't have to reboot to switch devices.

Video devices have not previously needed to be "hot swappable", so unlike many other device classes, the driver model for graphics hardware probably doesn't allow for devices to be turned on and off. In the case of your Intel/nVidia combo, I'm guessing that the BIOS enabled one or the other at boot and relied on Windows to detect this and switch drivers on the next start. While Windows can work with two distinct graphics cards of different types, they both need to be running from boot until shutdown. I'm guessing that this is implemented in such a way that Windows doesn't actually see a device change but instead just sees a monitor change (if that), so this will only work with nVidia chips for both low-power and high-power use (so Windows can use the same driver for each, and the driver+chipset sort out sending work to the right place without any help from the rest of the OS).

Re:About time, if it works as advertised. (1)

billcopc (196330) | more than 4 years ago | (#31076946)

The real solution to this problem is to reduce the base power consumption of the GeForce. Dual-GPU switching is a kludge, nothing more. A crutch for an inefficient GPU.

Re:About time, if it works as advertised. (1)

drinkypoo (153816) | more than 4 years ago | (#31077190)

The thing I find amusing is that I've never had any problem with Powermizer. Even my "unsupported" GTS240 does it fine. I haven't used it under Windows because XP won't install on my Gigabyte motherboard (their response is that it works for them) but I've had no problems under Linux. Previously, I had other cards with powermizer, including laptops with 3700FX and before that 1500FX Quadro chips, and it worked fine on them, too. So I got a low-power video card (the GTS 240) and I'm rocking out with a 460W power supply instead of one of these kilowatt jams. It runs everything I throw at it without fault so long as I used a driver that doesn't even recognize the type of card :)

How fitting (1)

Wiarumas (919682) | more than 4 years ago | (#31075426)

So, on one hand you have a powerful graphics notebook when it's primed (aka Optimus Prime). And on the other hand, you can turn it off and it becomes a cab-over semi truck.

what happened to good hardware design? (1)

robmv (855035) | more than 4 years ago | (#31075516)

I am a ThinkPad T500 owner with switchable Intel/ATI graphics, and it is a nice feature even though I need to reboot and change the mode in the BIOS to use one or the other chipset on Linux (I have not tried the recent X server restart experiments). I use the Intel IGP more than 95% of the time, but I still consider this software switching a horrible hack. Why not design efficient chips (ATI/NVIDIA) able to power down parts of themselves when the advanced features aren't being used?

This is like pairing the most power-hungry Intel chip with an Intel Atom and building software to switch between them when needed. No: you add power-management features to the processor and use only one.

Re:what happened to good hardware design? (1)

Neil Hodges (960909) | more than 4 years ago | (#31075592)

This is like pairing the most power-hungry Intel chip with an Intel Atom and building software to switch between them when needed...

Shh, don't give them any new ideas.

MAC (-1, Troll)

Anonymous Coward | more than 4 years ago | (#31076090)

has.been.doing.this.since.there.latest.mac-book.pro.WHY,is,NVIDIA.being.praised...my.space-bar.is.broken

first 4ost (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#31076432)

nned your help! future. Even

YUO FAiL IT (-1, Troll)

Anonymous Coward | more than 4 years ago | (#31079522)

practical purposes to th3 crowd in To yOu by Penisbird

I read articles about e-pcie years ago. (1)

AbRASiON (589899) | more than 4 years ago | (#31080364)

Where is it? An external PCI Express slot on the laptop - some kind of high-end, many-pinned plug going to an external, powered "3D brick". Nothing eventuated :/

A Michael Bay associated brand? Must get. (1)

UrduBlake (1544847) | more than 4 years ago | (#31081896)

This is a total no brainer. A definite buy. What else can a transfan ask for? Kudos, Nvidia. 50 cents per post. $)

5870s drop to 27-35W when not gaming already (1)

mykos (1627575) | more than 4 years ago | (#31082370)

ATI already has high-powered GPUs like the 5870 that drop to 27-35W when not gaming (which is probably not too far off the power consumption of these GPUs) without having to switch to another GPU. I guess this switching thing is probably designed to compete power-wise with ATI.

Re:5870s drop to 27-35W when not gaming already (1)

lintux (125434) | more than 4 years ago | (#31083548)

27W. Wow. That's only about four times as much as my whole laptop (with Intel graphics). Definitely very low-power! :-P

I sure hope they have a mobile version of that chip...

Coming to your Linux box... (1)

DaVince21 (1342819) | more than 4 years ago | (#31095028)

...in 2014!
