
Killer Mobile Graphics — NVIDIA's GeForce 8800M

kdawson posted more than 6 years ago | from the confluence-of-many-streams dept.

Graphics 89

MojoKid writes "Today NVIDIA unveiled the much-anticipated GeForce 8800M series of mobile graphics processors. The GeForce 8800M is powered by the new G92M GPU which is built on a 65nm manufacturing process and shares a lineage with the desktop-bound G92 GPU on which NVIDIA built their GeForce 8800 GT. The 8800M series will come in two flavors, a GTX and a GTS, with different configurations of stream processors, 64 for the GTS model and 96 for the high-end GTX."


Effects on Battery Life? (1, Insightful)

SlashdotOgre (739181) | more than 6 years ago | (#21417099)

Nice 3D graphics are great, but if I'm buying a mobile PC I'm more concerned about battery life. The article mentions "PowerMizer 7.0 Technology," but the lack of numbers backing it up is concerning.

To be fair, they're probably targeting 17" "laptops" anyway, and at that size I guess it's fair to say battery life shouldn't be a primary concern (heck, if you're carrying it around, I'd be more worried about getting a hernia).

Re:Effects on Battery Life? (3, Informative)

gerf (532474) | more than 6 years ago | (#21417131)

Well, it's supposedly more miserly than the 512MB 8600M GT, which I have. With a 1.6 GHz C2D, 2GB RAM, XP Pro, and an 85 WHr battery, I can get over 5 hours of battery life, which I think is dang good considering that the machine can play games quite well. That, of course, is not while gaming, only web browsing and using IM.

Of course, this will vary from laptop to laptop, YMMV.
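The arithmetic behind those figures is easy to check. A minimal sketch, where the 3x gaming-draw multiplier is purely an assumption for illustration, not a measurement:

```python
# Back-of-the-envelope battery-life estimate from the numbers in the post.
battery_wh = 85          # 85 WHr battery (from the post)
runtime_h = 5            # observed light-use runtime (from the post)
avg_draw_w = battery_wh / runtime_h
print(f"Average system draw: {avg_draw_w:.0f} W")   # 17 W

# If gaming roughly triples the draw (assumed multiplier, not measured),
# runtime shrinks proportionally:
gaming_draw_w = avg_draw_w * 3
print(f"Estimated gaming runtime: {battery_wh / gaming_draw_w:.1f} h")  # 1.7 h
```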

Re:Effects on Battery Life? (5, Insightful)

futuresheep (531366) | more than 6 years ago | (#21417163)

If you're buying a laptop because it has this graphics chip, battery life is secondary to frames per second. The people buying these laptops buy them because they're suited to playing games anywhere while plugged in, not for traveling and off-site work.

Re:Effects on Battery Life? (1)

kestasjk (933987) | more than 6 years ago | (#21418871)

If you're buying a laptop because it has this graphics chip, battery life is secondary to frames per second. The people buying these laptops buy them because they're suited to playing games anywhere while plugged in, not for traveling and off-site work.
Also lots of people buy laptops as desktop replacements these days, just because they take up less space.

Re:Effects on Battery Life? (1)

bdjacobson (1094909) | more than 6 years ago | (#21433059)

If you're buying a laptop because it has this graphics chip, battery life is secondary to frames per second. The people buying these laptops buy them because they're suited to playing games anywhere while plugged in, not for traveling and off-site work.
I'd like to second this. I thought I would be chiefly concerned about battery life, so I paid extra for the 12-cell battery, which, once I undervolted my CPU, gave me 6.5 hours of laptop time.

Later I realized it's only about once a year that I actually make use of that time. All other times I have an outlet nearby.

Note that the 12-cell weighs about as much as the 6-cell plus the charger.

And I was stuck with a laptop that played WoW at 8 fps on the lowest settings.

Now I would go for a compact-ish laptop (and try to keep it light), but I would definitely go for some graphics horsepower, because the times when I'm at a friend's house over Christmas and we want to frag are far more frequent than the very few times I'm not near a power outlet.

Re:Effects on Battery Life? (4, Informative)

Zymergy (803632) | more than 6 years ago | (#21417191)

PowerMizer Mobile Technology page: http://www.nvidia.com/object/feature_powermizer.html [nvidia.com]

Maybe the NVIDIA Technical Brief will yield some answers: http://www.nvidia.com/object/IO_26269.html [nvidia.com] (Warning, spawns a PDF)

PowerMizer 7.0 Power Management Techniques:
Use of leading edge chip process
CPU load balancing
Intelligent GPU utilization management
Revolutionary performance-per-watt design
PCI Express power management
Aggressive clock scaling
Dedicated power management circuits
Display brightness management
Adaptive performance algorithms

CPU Offload Example (from NVIDIA's Technical Brief)
Figures 3 and 4 (see PDF) show CPU utilization when running a Blu-ray H.264 HD movie using the CPU and GPU, respectively. You can see that with GPU video playback, 30% fewer CPU cycles are used. This dramatic reduction in CPU usage means less power is consumed by the processor, so system power consumption is reduced, resulting in longer battery life.
Note: Testing was conducted on an Intel Centrino-based platform with a 2 GHz Core 2 Duo processor and a GeForce 8600M GS, running InterVideo WinDVD 8 playing a Casino Royale H.264 Blu-ray disc.
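To put the 30% utilization drop in perspective, here is a crude linear model of what it could mean for system power. The 35 W CPU TDP, the 20 W rest-of-system figure, and the 80%/50% load levels are illustrative assumptions, not numbers from NVIDIA's brief:

```python
cpu_tdp_w = 35.0            # assumed mobile CPU power at full load
rest_of_system_w = 20.0     # assumed display, disk, chipset, etc.

def system_power(cpu_util):
    """Crude linear model: CPU power scales with utilization."""
    return rest_of_system_w + cpu_tdp_w * cpu_util

cpu_decode = system_power(0.80)   # assumed load for CPU-only H.264 playback
gpu_decode = system_power(0.50)   # 30 points lower utilization with GPU offload
print(f"CPU decode: {cpu_decode:.0f} W, GPU decode: {gpu_decode:.0f} W")
print(f"Saving before counting the GPU's own draw: {cpu_decode - gpu_decode:.1f} W")
```

Under these assumptions the CPU-side saving is about 10 W; how much of that survives depends on how much extra power the GPU itself burns while decoding.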

PDF? (0)

Anonymous Coward | more than 6 years ago | (#21420771)

Warning, spawns a PDF

What's wrong with PDF?

Re:PDF? (1)

PitaBred (632671) | more than 6 years ago | (#21421997)

It's big, requires a plugin, and takes some time to initialize. A PDF isn't like a web page, where you already have the rendering engine loaded and running, so it's just courteous to give people a heads-up that they might be clicking on something that will disrupt their normal workflow.

Power usage not a win (1)

dj245 (732906) | more than 6 years ago | (#21422235)

I can agree that power usage by the CPU is cut, since its utilization is not as high. But since the GPU is doing all this decoding, power usage by the GPU will increase. If the GPU is only as efficient at decoding as the CPU, then power consumption is a wash and there are no savings.

The real benefit here is the ability to use the slowest, lowest-powered chip you can find in a media center and still be able to decode HD.
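The parent's "wash" argument in numbers. All wattages here are invented for illustration; the point is the subtraction, not the values:

```python
# Offload only saves power if the GPU decodes more efficiently than the CPU.
cpu_decode_extra_w = 12.0   # assumed extra CPU draw when decoding in software
gpu_decode_extra_w = 12.0   # assumed extra GPU draw when decoding in hardware

net_saving_w = cpu_decode_extra_w - gpu_decode_extra_w
print(f"Net saving: {net_saving_w} W")  # 0.0 W -- a wash when efficiencies match

# A fixed-function decoder that needs, say, 4 W changes the picture:
print(f"With a 4 W decoder: {cpu_decode_extra_w - 4.0} W saved")  # 8.0 W saved
```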

Re:Effects on Battery Life? (1, Interesting)

Hektor_Troy (262592) | more than 6 years ago | (#21417429)

Nice 3D graphics are great, but if I'm buying a mobile PC I'm more concerned about my battery life.
Then why even bother reading articles about high-end graphics cards? Were you hoping that somehow nVidia managed to build magical transistors that use less power the more of them there are? Everyone knows (or should know) that high-end graphics cards use a lot more energy than low-end ones, partly because they run at much higher speeds, partly because they have a lot more transistors. This one in particular has 754 million transistors; compare that to a quad-core Core 2, like the Q6600, which has 582 million.

More transistors == more power needed
Higher speeds == more power needed
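A hedged sketch of the rule of thumb behind both lines: dynamic power scales roughly as P = a * C * V^2 * f, where the switched capacitance C grows with transistor count. The scaling constant and the clock/voltage figures below are illustrative only:

```python
def dynamic_power(transistors_m, voltage, freq_mhz, k=1e-7):
    """Toy model: power proportional to transistor count * V^2 * frequency."""
    return k * transistors_m * voltage**2 * freq_mhz

# Transistor counts from the post; clocks and voltages are assumptions.
gpu = dynamic_power(754, 1.1, 500)
cpu = dynamic_power(582, 1.2, 2400)
print(f"GPU: {gpu:.3f} (arbitrary units), CPU: {cpu:.3f}")

# The two claims above, as properties of the model:
# doubling transistors doubles power; doubling voltage quadruples it.
```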

heck if you're carrying it around I'd be more worried about getting a hernia
If you cannot manage to carry a 17" laptop around with you without getting a hernia, you have bigger problems than battery life. My current one weighs around 8 lbs. Sure, it's heavier than a 12" model for obvious reasons, but really: a hernia from 8 lbs? How the women in your family ever survived pregnancy, with the lower back and hernia problems, is a mystery to science.

Hell, HP's 20" luggable is 15 lbs - again hardly something that'll give you much of a workout.

Why do these types of posts always show up?

High-end graphics? Hah, I can settle for an automatic Etch A Sketch! No one needs more than that.
Big laptops? Anything bigger than 5" is too big! No one will EVER have a use for it.
Low battery life? I need 72 hours of continuous battery life while running simulations at Blue Gene speeds. Anything less than that is useless.


Seriously, what's the point? The parent post is about as insightful as an observation that liquid water is wet. If you didn't know it beforehand, you'd have to be an idiot.

Re:Effects on Battery Life? (3, Informative)

m94mni (541438) | more than 6 years ago | (#21417515)

Actually, one important part of newer PowerMizer designs (>3.0, maybe) is that parts of the GPU are *turned off* when not in use. Other parts run at decreased voltage.

That effectively decreases the number of active processors and of course saves a *lot* of watts.

Re:Effects on Battery Life? (3, Interesting)

PopeRatzo (965947) | more than 6 years ago | (#21419021)

Actually, one important part of newer PowerMizer designs (>3.0, maybe) is that parts of the GPU are *turned off* when not in use. Other parts run at decreased voltage.
Sounds good. Why don't they put one of these into a desktop for me? My machine is used for digital music and video editing and post-production. It would be nice to have a powerful video card that doesn't need too much noisy refrigeration. As it is, I've got a constant fight with heat and noise. I've built a quiet enclosure, but it gets warm in there. I've tried liquid cooling, but it's also a little noisy. Right now, I've got liquid cooling inside the enclosure, and cables as long as practical. I'm planning a closet for my gear, but I've got to wait until my kid finishes school and moves out first. I'd rather live with the noise and keep her around.

Actually, as time goes on I do less and less live recording, and when I do, I just use one of these extra-cool portable digital recorders and go somewhere quiet. When you work in a home studio, you make adjustments.

But I still perk up whenever I hear about "decreased voltage". I'm all for saving watts.

ATI's new desktop graphics cards do this (2, Informative)

recoiledsnake (879048) | more than 6 years ago | (#21420359)

From http://www.anandtech.com/video/showdoc.aspx?i=3151&p=2 [anandtech.com] :

As for PowerPlay, which is usually found in mobile GPUs, AMD has opted to include broader power management support in their desktop GPUs as well. While they aren't able to wholly turn off parts of the chip, clock gating is used, as well as dynamic adjustment of core and memory clock speed and voltages. The command buffer is monitored to determine when power saving features need to be applied. This means that when applications need the power of the GPU it will run at full speed, but when less is going on (or even when something is CPU limited) we should see power, noise, and heat characteristics improve. One of the cool side effects of PowerPlay is that clock speeds are no longer determined by application state. On previous hardware, 3D clock speeds were only enabled when a fullscreen 3D application started. This meant that GPU computing software (like Folding@home) only ran at 2D clock speeds. Since these programs will no doubt fill the command queue, they will now get full performance from the GPU.
Hopefully, Nvidia will follow their lead.
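The mechanism described in the quote can be sketched in a few lines: watch how full the command buffer is and pick a clock/voltage state accordingly. The thresholds and the clock table are invented for illustration; real drivers do this in firmware, not Python:

```python
# (MHz, volts) performance states, lowest to highest -- assumed values.
CLOCK_STATES = [(200, 0.9), (500, 1.0), (750, 1.1)]

def pick_state(queue_depth, queue_capacity):
    """Deeper command queue -> higher clock/voltage state."""
    load = queue_depth / queue_capacity
    if load > 0.75:
        return CLOCK_STATES[2]   # full 3D clocks
    if load > 0.25:
        return CLOCK_STATES[1]   # intermediate state
    return CLOCK_STATES[0]       # idle / 2D clocks

# A GPGPU app like Folding@home keeps the queue full, so under this scheme
# it gets 3D clocks instead of being stuck at 2D speeds:
print(pick_state(queue_depth=95, queue_capacity=100))  # (750, 1.1)
```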

Re:Effects on Battery Life? (1)

Kamots (321174) | more than 6 years ago | (#21420467)

"It would be nice to have a powerful video card that doesn't need too much noisy refrigeration."

Powerful video card? Try an 8800 GT. Want it quiet? Try swapping out the cooler for an Accelero S1 passive heatsink. It will run significantly cooler than the stock active cooling solution... and if you want to get fancy (and really drop your temps), attach a low-speed 120mm (i.e., nice and quiet) fan to it.

If you pay attention to noise when designing your system, you should be fine. I've got a completely air-cooled system, and my current project is figuring out what to do about the noise my optical drives make, as they're by far the loudest components. (Well... when I've got a disc spun up, at least.)

Things to look at when designing your system:

1) Case... the most important choice! Rather than building a quiet enclosure, look at purchasing a case with noise-dampening qualities. Take a look at Antec's P182 as an example (what I'm using). It has noise-dampening panels and vibration-isolation mounts for your drives, and is designed to use low-speed 120mm (meaning quiet) fans. It's also designed to have good airflow properties. Yes, it's more than a $30 aluminum sheet-metal case, but it also isn't going to rattle like one.

2) Power supply... your power supply has a built-in cooling solution that, unless you're going the watercooling route, you're really not going to be able to modify (if it's made to use a high-speed 60mm fan, good luck fitting a low-speed 120 in there...). However, if you keep noise in mind when selecting your PSU, you can find quiet units (even passively cooled ones, if you're willing to pay for it). Personally I find the HX520 quiet enough (it has a 120mm intake fan, so the supply itself serves as a buffer between the fan and the outside of your case).

3) Fans! Use large low-speed fans for moving air through your case. When you're looking to buy a fan, the two important numbers are CFM and dB. Where you draw the airflow/noise balance is up to you; personally I aim for the highest CFM rating I can get while staying under 20dB. (Coolermaster sells a 120mm fan that moves 44 CFM of air at a bit under 20dB...) The three 120mm fans that ship with the P182, when set on low, are also below 20dB.

4) Examine the cooling solutions on your CPU and GPU (and possibly other components if they have active cooling; for example, the northbridge on select motherboards). I've found the stock Intel cooler to be pretty quiet on the C2Ds; however, there are passive solutions. For your GPU, again, there are passive options, although the stock solutions aren't that bad.

5) And... where I'm stuck: find quiet drives. I have noticed a significant difference in how loud my two optical drives are, and I'm sure I could find quieter drives than what I have. Same with hard drives. Currently, these are the loudest components in my case.

Alternatively, you could go the watercooling route; if you do, it's all about selecting a quiet pump and having a large radiator (large enough to cool passively, or simply large enough to use large, low-speed fans). Water cooling should allow you a nearly silent setup. It would also let you go nuts and build a quiet enclosure around an already quiet case with just a passive radiator outside.
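The fan-selection rule in step 3 is easy to automate: filter by an airflow floor and a noise ceiling, then take the quietest survivor. The catalog entries below are made-up examples, not real SKUs (the 44 CFM / ~19 dB entry roughly matches the Coolermaster figure cited above):

```python
fans = [
    {"name": "120mm slow",   "cfm": 44, "db": 19},
    {"name": "120mm medium", "cfm": 70, "db": 28},
    {"name": "92mm fast",    "cfm": 50, "db": 34},
]

def quietest_adequate(fans, min_cfm, max_db):
    """Return the quietest fan meeting the airflow floor and noise ceiling."""
    candidates = [f for f in fans if f["cfm"] >= min_cfm and f["db"] <= max_db]
    return min(candidates, key=lambda f: f["db"]) if candidates else None

print(quietest_adequate(fans, min_cfm=40, max_db=20))  # picks the 44 CFM fan
```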

Re:Effects on Battery Life? (1)

PitaBred (632671) | more than 6 years ago | (#21422129)

I don't know about Linux, but apparently Nero has a DriveSpeed application that will allow you to limit the speed of your CD-ROM drives. Just running them at 16x or 32x instead of 48x or whatever will significantly reduce your optical drive noise. I just tend to rip CD and DVD images to the hard drive, so I don't worry about it one way or the other ;)

Re:Effects on Battery Life? (1)

PopeRatzo (965947) | more than 6 years ago | (#21442363)

I just tend to rip CD and DVD images to the hard drive, so I don't worry about it one way or the other ;)
Right. I wouldn't run a CD or DVD drive when I'm trying to record audio or mix; they're way too noisy. More and more, since I started using Alcohol and Daemon Tools, the only time I use a CD or DVD is when I first take it out of the box. After that, it's on my shelf.

Re:Effects on Battery Life? (0, Offtopic)

flayzernax (1060680) | more than 6 years ago | (#21417851)

Haha, I don't have moderation points, but I say mod the parent up! They are right on.

Re:Effects on Battery Life? (1)

Christophotron (812632) | more than 6 years ago | (#21417435)

I love my 17" notebook, but I agree that it is a bit large and power-hungry. I only carry it around when I know I will use it, and for the off-days I always have my trusty HTC Kaiser. Now as far as graphics are concerned, I am a bit jealous of these new cards. I hope to hear news of the 8800M being offered as an MXM card so that I could (potentially) swap out my Go7600 for an 8800M. Come on NVIDIA, that's the whole point of having modular graphics cards!

Re:Effects on Battery Life? (1)

xouumalperxe (815707) | more than 6 years ago | (#21421531)

I see no reason whatsoever why this card should draw significantly more battery power in plain old 2D mode for normal "on the move" usage than "lesser" chips. Perhaps it even draws less than some comparable chips, seeing as it's built on a 65nm process. And if you're playing 3D games, I really doubt battery life is your biggest concern.

Yet for all the processing power in the world (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#21417101)

You'll still have a hard time finding games that are more playable than the classics from C64, NES, and whatnot. Now you can beat the graphics of even the best Amiga with a phone! Still, the games are horrid and boring.
Of course, it makes for cool UIs, but it'll be a while before these chips will be common in end-user handsets.

Alienware already has two 8800M GTX models (5, Informative)

Zymergy (803632) | more than 6 years ago | (#21417113)

It appears Alienware will be using the GeForce 8800M GTX in their "m15x" and "m17x" models:
http://www.alienware.com/intro_pages/m17x_m15x.aspx [alienware.com]
NVIDIA GeForce 8800M Link: http://www.nvidia.com/object/geforce_8M.html [nvidia.com]

Re:Alienware already has two 8800M GTX models (3, Insightful)

snl2587 (1177409) | more than 6 years ago | (#21417155)

That is, of course, if you really want to pay for Alienware.

Re:Alienware already has two 8800M GTX models (1)

rucs_hack (784150) | more than 6 years ago | (#21417439)

That is, of course, if you really want to pay for Alienware.

I might, if I had money to burn on a gaming rig, and people were going to see it. They do have that 'look at me, I'm well off' sheen.

Re:Alienware already has two 8800M GTX models (1)

grasshoppa (657393) | more than 6 years ago | (#21417513)

I might, if I had money to burn on a gaming rig, and people were going to see it. They do have that 'look at me, I'm well off' sheen.

In much the same way Monster cables have that "I have plenty of money and no problem parting with it" air about them. Which is a very strong attractor for certain kinds of chicks.

Re:Alienware already has two 8800M GTX models (1)

DeusExCalamus (1146781) | more than 6 years ago | (#21417697)

No worries, they'll be repelled by his geek aura easily enough.

Re:Alienware already has two 8800M GTX models (1)

mgblst (80109) | more than 6 years ago | (#21418601)

In much the same way Monster cables have that, "I have plenty of money and have no problems parting with it" sort of way. Which is a very strong attractor for certain kinds of chicks.


Which is kind of ironic, since those chicks will have no idea what a Monster cable or an Alienware laptop looks like. They generally go by how shiny your car is.

Re:Alienware already has two 8800M GTX models (0)

Anonymous Coward | more than 6 years ago | (#21417561)

I might, if I had money to burn on a gaming rig, and people were going to see it
Well, by the looks of it, I don't think many people could miss it. That thing looks like it's more than an inch thick. Notebook my ass; I've seen encyclopedias that are less bulky than that.

Re:Alienware already has two 8800M GTX models (1)

dbcad7 (771464) | more than 6 years ago | (#21420765)

Well, those are nice drawings of a laptop... are they actually manufactured yet?

iMac (2, Informative)

tsa (15680) | more than 6 years ago | (#21417195)

Let's hope Apple puts this card in the next iMac instead of the crappy ATI card they put in it now.

Unlikely. (2, Interesting)

SanityInAnarchy (655584) | more than 6 years ago | (#21417273)

Last I heard, Steve Jobs had some issues with nVidia, thus you get ATI for all new macs, end of story. Unless someone else comes along -- Intel, maybe?

Could be completely unfounded rumor, so take with a grain of salt, but it does sound like the Apple we know and love [to hate].

By the way -- this is why I love to be a PC/Linux user. I can buy whatever hardware I want, I'm not bound by the moods of His Holy Turtleneckness. The disadvantage is, it has to be something with a Linux driver, but in a pinch, I can write my own. But really, not that much doesn't have a Linux driver these days.

Re:Unlikely. (5, Informative)

Anonymous Coward | more than 6 years ago | (#21417921)

Actually, you got it the wrong way around.

Apple was going 100% ATI, but then ATI leaked news of the contract to the press and Jobs was furious. He really HATES secrets getting out (I've no idea why; it seems to be industry-standard practice. But if you ever happen to enter an NDA with Apple, you'd better honour it!)

Anyway, Apple pulled the contract and shifted every Mac they could to nVidia. However, for some reason they didn't shift the iMac despite shifting everything else. I have a vague suspicion it is to force Apple developers to always code in a GPU-independent way (basically to keep nVidia honest), but as an iMac owner, it is very annoying.

Re:Unlikely. (-1)

Anonymous Coward | more than 6 years ago | (#21417957)

BS.

The newest Mac Pro and MacBook Pro revisions switched to nVidia.

Re:Unlikely. (2, Informative)

UserChrisCanter4 (464072) | more than 6 years ago | (#21419389)

Intel graphics are already in the MacBook (non-Pro) and the Mac Mini; the low end of Apple's products hasn't had a dedicated GPU/VRAM setup since the PowerPC days. In fact, ATI is in fewer product lines than any other vendor, since ATI is now only present in the iMac line and as a BTO option on the Mac Pro.

You have to go pretty far back in Apple's product line to find a point where there wasn't a pretty even mixture of video card options available.

Re:Unlikely. (1)

petermgreen (876956) | more than 6 years ago | (#21426235)

there are fewer product lines with ATI chips than there are any others, since ATI is now only present in the iMac line of products and as a BTO option on the Mac Pro.
Nvidia gets the MacBook Pro and the default choice for the Mac Pro -- 2 lines
Intel gets the Mini and the MacBook -- 2 lines
ATI gets the iMac, the Xserve, and a choice for the Mac Pro -- 3 lines

so ATI wins on number of product lines.

But in terms of total units shipped to Apple, I suspect Intel is the winner, followed by nVidia.

Re:Unlikely. (1, Insightful)

Anonymous Coward | more than 6 years ago | (#21421881)

...in a pinch, I can write my own.
 
That's awesome! So... how many drivers have you written? Or how many drivers do Linux users write, on average?
 
Users don't write drivers. The Macintosh presents a complete system, which Just Works without someone having to worry about what component is used and what is not used.

Just works... (1)

SanityInAnarchy (655584) | more than 6 years ago | (#21430593)

...right up until it Just Doesn't.

Like, say, when you plug in a peripheral that doesn't work.

Dude, I know. I had a Powerbook. I know what the Mac Experience is, and I know why it's attractive.

I also know that the second you want some non-standard hardware, or, really, non-standard anything, it's a crapshoot. And as a Linux user, when I say hardware support is a "crapshoot", I mean really, really fucking bad.

But as you may not realize, I actually am a software developer -- so if I really can't find anyone else to write the driver, I can write it myself. As a user, if you really can't find a driver, you can pay me to write one.

Re:Unlikely. (1)

Calibax (151875) | more than 6 years ago | (#21422457)

I would guess this is unfounded. After all, nVidia graphics are in the latest MacBook Pro models as well as being the standard product in the Mac Pro. In fact, the Mac Pro will support up to 4 graphics adapters, provided they are nVidia cards, as only a single ATI card is supported. nVidia also supplies workstation-class video cards for the Mac Pro.

Apple isn't stupid. They remember what happened back in the late 1990s, when their only graphics vendor was ATI, which could completely dictate what graphics solutions shipped in Apple products. Now Apple is careful to parcel out its graphics work among ATI, Intel, and nVidia and will never let any one vendor get into a dominating position: give each enough to keep them interested, but don't give any one of them enough to affect your business plan. Of course this also applies to other components, and is why there may (eventually) be AMD chipsets/processors in Apple products.

Re:iMac (1, Insightful)

Vskye (9079) | more than 6 years ago | (#21417427)

I'd agree here. Personally, I'd love a MacBook but I will never buy one until it comes with at least 256MB of dedicated video ram. Are ya reading this Steve? And yes, Nvidia would be a better option.

Re:iMac (1)

heinousjay (683506) | more than 6 years ago | (#21417941)

No, Steve isn't reading this. Steve doesn't care about the six laptops he'd sell worldwide with that configuration.

Re:iMac (0)

Anonymous Coward | more than 6 years ago | (#21418005)

Won't happen. That's the MacBook Pro market segment. MacBooks just went to the X3100, and there they will stay for some time to come. The next Mac Mini revision will also feature the X3100. We will see this before the end of January.

Re:iMac (1)

Gr8Apes (679165) | more than 6 years ago | (#21419739)

Buy a MacBook Pro. Refurbs are getting pretty darn inexpensive, especially one generation back.

Re:iMac (2, Informative)

TheMidnight (1055796) | more than 6 years ago | (#21417461)

Well, since we're talking about laptops and mobile graphics, I feel the need to point out that my new MacBook Pro has an nVidia 8600 GT in it. Apple has provided nVidia chips in the MacBook Pro line for a few months now. You can get 128 MB or 256 MB, depending on whether you buy the 15" or 17" model.

Re:iMac (1)

tsa (15680) | more than 6 years ago | (#21417487)

Really? That's strange. I have the older MBP, a 15" model with the ATI Radeon X1600 card, and 256 MB.

Re:iMac (2, Informative)

Vskye (9079) | more than 6 years ago | (#21417535)

Well, since we're talking about laptops and mobile graphics, I feel the need to point out that my new MacBook Pro has an nVidia 8600 GT in it. Apple has provided nVidia chips in the MacBook Pro line for a few months now. You can get 128 MB or 256 MB, depending on whether you buy the 15" or 17" model.

Yep, that might be true... but to get 256MB of graphics memory you have to spend $2,499 US (MacBook Pro 15"). That's just crazy. I'm sorry, but I'll just get an iMac, buy a cheap PC laptop, and toss Linux on it. Personally, I'd love a MacBook, but the bang for the buck just isn't justified (spec-wise, and the Pro version is just insane price-wise).

Re:iMac (0)

ravenspear (756059) | more than 6 years ago | (#21417525)

Won't happen. Apple never uses the top graphics cards; they always pick from the bottom or a few generations back.

For example, their current pro portable chip is the 8600M, not the fastest one they could have gone with, and they even underclocked it below its standard speed.

Re:iMac (0)

Anonymous Coward | more than 6 years ago | (#21417549)

It's a mobile GPU, man. Apple would have a better chance putting an 8800 GTX/GTS/GT in iMacs.

Re:iMac (1)

644bd346996 (1012333) | more than 6 years ago | (#21421927)

The iMac uses mobile chips so that it can stay thin and quiet. Apple wouldn't put a power-hungry desktop GPU in the iMac.

Re:iMac (0)

Anonymous Coward | more than 6 years ago | (#21440963)

Don't feel like registering just to make this comment. All I want to say is that these are mobile GPUs. That's the whole point of this thread.

Passively cooled desktop cards? (4, Interesting)

Kris_J (10111) | more than 6 years ago | (#21417329)

I need to upgrade an old PC that's built to be quiet, thus doesn't have a fan on the video card. Anyone know if these chips could be used to make a passively cooled desktop video card, and if they're likely to be?

Re:Passively cooled desktop cards? (5, Informative)

vipz (1179205) | more than 6 years ago | (#21417449)

I believe the 8800GT on the desktop side of things uses the same G92 chip. Sparkle has already announced a passively cooled version of that: Press Release [sparkle.com.tw] Pictures of a passively cooled Gainward card have also been floating around the net.

Re:Passively cooled desktop cards? (1)

BiggerIsBetter (682164) | more than 6 years ago | (#21417497)

Anyone know if these chips could be used to make a passively cooled desktop video card, and if they're likely to be?
Dunno about these chips yet, but Asus has their Silent Magic line of passively cooled cards. Their EN8500GT SILENT MAGIC/HTP/512M [asus.co.nz] looks like a reasonable option.

Re:Passively cooled desktop cards? (1)

lucas teh geek (714343) | more than 6 years ago | (#21417523)

Passively cooled desktop cards are not particularly uncommon (so long as you aren't after a top-of-the-line uber-gaming rig). For example, in the 30 seconds I spent looking at an online store I found this [centrecom.com.au] and this [centrecom.com.au], both of which are 8600 GT series cards. There didn't seem to be any 8800 series cards, but it wouldn't surprise me to see them later on, once even faster 8xxx series cards come out (8900? I don't know if it's in the works).

Re:Passively cooled desktop cards? (1)

giorgiofr (887762) | more than 6 years ago | (#21417867)

I have a fanless 7600 GT by Gigabyte; it's totally silent, doesn't run hot, and works well in all games. It's smallish and doesn't require additional PCIe power connectors. HTH

Eh, there are silent cards already (1)

SmallFurryCreature (593017) | more than 6 years ago | (#21417963)

Passive cards are nothing new, and you can get them with regular desktop GPUs. Just shop around a bit. I got one; granted, I keep it extra cool with a very large (but slow-moving) fan on the case blowing across it, but that is just to be safe. It has enough metal strapped to it to cool with the normal airflow in your case.

Wanting a laptop GPU in your desktop is just silly, unless you want to consume less power. Passive cooling has been around for a long time and exists for the CPU as well. Just make sure you buy a spacious case; those heatsinks are HUGE.
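Why passive cards still need some case airflow comes down to thermal resistance: chip temperature is roughly ambient plus power times the heatsink's K/W rating. The power figure and both theta values below are assumptions for illustration, not measurements of any particular heatsink:

```python
def chip_temp(ambient_c, power_w, theta_k_per_w):
    """Steady-state estimate: T_chip = T_ambient + P * theta."""
    return ambient_c + power_w * theta_k_per_w

# A large passive heatsink in still air vs. with gentle case airflow
# (theta values assumed; airflow roughly halves the effective resistance):
still_air = chip_temp(ambient_c=35, power_w=60, theta_k_per_w=1.2)   # 107 C
with_flow = chip_temp(ambient_c=35, power_w=60, theta_k_per_w=0.6)   # 71 C
print(f"Still air: {still_air:.0f} C, with case airflow: {with_flow:.0f} C")
```

Under these assumptions the same heatsink runs 36 degrees cooler with a little airflow, which is why "passive" cards still list case-ventilation requirements.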

Re:Eh, there are silent cards already (1)

lucas teh geek (714343) | more than 6 years ago | (#21418383)

The biggest problem with a silent PC (no CPU/GPU fans, at least) is the amount of noise that HDDs generate and the fact that there's not much you can do about them. I've got a quiet case with rubber grommets that the HDDs sit on, but it doesn't help with the whine. Admittedly I've got some pretty old drives, so it would probably be a bit quieter if they were replaced. I'm looking forward to the day I can afford a decent-sized solid state drive to boot off, with a NAS device in a cupboard somewhere mounted via gigabit Ethernet.

Re:Eh, there are silent cards already (1)

Junta (36770) | more than 6 years ago | (#21418969)

My experience is that some of the newer drives are pretty decent on noise. But getting away from local disks isn't hard at all, even with pretty run-of-the-mill motherboards, at least not with Linux. You could set up a box with gobs of storage somewhere and put Linux on it (CentOS 5.1 might be a good choice) with an iSCSI software target (CentOS 5.1 ought to have that included, since RHEL 5.1 did; otherwise Google for iet). CentOS/RHEL support a fairly normal install to an iSCSI target via the software initiator. Since most desktop systems don't support iSCSI boot, you'll have to access the iSCSI target elsewhere post-install and set up your network boot server to serve up that kernel and initrd, or use a USB flash disk as /boot and not worry about network booting at all.

On the Windows side, I'm not sure if MS has any way at all to install and boot appropriately for iSCSI without iSCSI firmware/hardware available. In fact, the trend seems to be that most of the fubar firmware implementations exist mainly because Windows needs them. The commonly known example is the significant number of 'fakeraid' cards that satisfy BIOS calls until the 'real' driver loads, which contains the vendor's chosen software RAID implementation. Windows needs that to boot because it can't bootstrap flexibly, while Linux can have a normal or RAID-1 /boot and, other than that small piece, use the standard software RAID implementation.
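For what it's worth, the target side of a setup like that is only a few lines of config with iet. A minimal sketch, assuming iet's ietd.conf format; the target name, backing-file path, and credentials here are made up for illustration:

```
# /etc/ietd.conf on the storage box (iet / iSCSI Enterprise Target)
# Hypothetical IQN and backing file -- adjust for your own setup.
Target iqn.2007-11.example.com:storage.desktop1
    # Export a disk image file as LUN 0; a raw block device
    # (e.g. Path=/dev/vg0/desktop1,Type=blockio) works too.
    Lun 0 Path=/srv/iscsi/desktop1.img,Type=fileio
    # Optional CHAP credentials the initiator must present.
    IncomingUser diskless secretpassword12
```

The initiator box then just points its software iSCSI initiator (or network-boot initrd) at that IQN.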

Re:Eh, there are silent cards already (1)

Mattsson (105422) | more than 6 years ago | (#21421603)

Put in some new, fairly quiet drives, then put them in something like a Quiet Drive enclosure [scythe-eu.com].
Much better than simply putting the drives on rubber grommets.
If you also combine this with a low-noise or passively-cooled PSU, have all remaining fans rubber-mounted, and put some noise-absorbing material on the inside of your chassis, you'll have a nearly silent PC.
   

Re:Passively cooled desktop cards? (0)

Anonymous Coward | more than 6 years ago | (#21419461)

I have a passively cooled 7500gt card installed, and I can tell you that this is little more than a marketing gig. High-powered 3D chips (*and* their memory) just generate a lot of heat, especially when run at full throttle with Quake 4 or something. You'll soon find out that, while the card has a heatsink and no fan, you still have to provide top-notch airflow through your computer case or the chip overheats more or less constantly. So instead of attaching the fan(s) to the card, you have to put them elsewhere in your case; the cards just roll the problem over to you. Hahahaha.

I also wonder why people still want these kinds of chips in laptops, where space is so constrained that good airflow can hardly be provided, and fans are so small that they have to whizz away at insanely high speeds to cool today's chips. That's why the industry invented so-called cooling pads, which have extra fans to cool the laptop from the *outside*. Taking a step back and looking at this with a cool (!) eye clearly shows that customers are being fooled, and news like this is just marketing hype, because apparently there are enough people who still fall for this shit.

Take some advice: if you want quiet, long-lived hardware, go with a humble, old-school 2D chip. It saves you a lot of trouble and frustration.

Re:Passively cooled desktop cards? (0)

Anonymous Coward | more than 6 years ago | (#21427437)

Either you live somewhere stupidly hot (40C+ days, every day), you bought a cheap-ass card that didn't actually have any thought put into how a passive heatsink might work, or you're full of shit.

I've got a passively cooled 7600GT, which is overclocked ffs, and I've never had it overheat. Not once. No crashes. No texture corruption. Not a thing. It has no more airflow than what's provided by the Sonata II PSU and a 120mm rear fan, neither of which is audible when the PC is on the floor. And this is in Australia, not some fucking pussy country where it snows; we like it hot. Yesterday was 37C.

Re:Passively cooled desktop cards? (1)

ffflala (793437) | more than 6 years ago | (#21426271)

If you are willing to go with a slightly lesser 8600GTS model, here's a passively-cooled card that I put into my quiet machine last month. FWIW, I get decent frame rates on Oblivion (running on WINE) w/ high quality settings. I hesitate to post a product link/shill for a company, but then again you did ask.

My plan is to wait until these drop below $100, then get another for a silent SLI configuration.

MSI NX8600GTS-T2D256EZ HD GeForce 8600GTS 256MB 128-bit GDDR3 PCI Express x16 HDCP Ready SLI Supported Video Card

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127287 [newegg.com]

Somewhat unrelated but... (1)

blankoboy (719577) | more than 6 years ago | (#21417639)

when are we going to see the PS3's RSX shrink from 90nm down to 65nm? I imagine it's already taped out and ready to go.

Wait... (3, Funny)

Vampyre_Dark (630787) | more than 6 years ago | (#21418307)

Wait ten days... 8800GTSE
Wait ten days... 8800GTSE2 with pack in game
Wait ten days... 8800GTSE OVERCLOCKED EXTREME EDITION
Wait ten days... 8850LE
Wait ten days... 8850GTS
Wait six months... driver update makes 8850GTS 25% slower.
Wait ten days... 9800GT
Wait three months... driver update makes 8850GTS 25% slower.
Wait ten days... 9850GTS
Wait three months... driver update makes 9800GT 25% slower.

This is the song that never ends!
Yes it goes on and on my friends...

GTSE? (0)

Anonymous Coward | more than 6 years ago | (#21430175)

You forgot the 8800GoaTSE

do want. (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#21418557)

do want. want. want. want.

oblig (0, Redundant)

ohsmeguk (1048214) | more than 6 years ago | (#21418643)

But does it run Linux?

will it work on exisitng laptops (1)

r4rushabh (1191689) | more than 6 years ago | (#21418903)

I have a Dell E1505; will this card work on my laptop?

Re:will it work on exisitng laptops (1)

gnuman99 (746007) | more than 6 years ago | (#21419713)

lol, this is not a card. You can't "upgrade" your laptop - it is not a desktop.

Geez. I think someone got lost on the intertubes here.

Re:will it work on exisitng laptops (2, Informative)

lonesome_coder (1166023) | more than 6 years ago | (#21420375)

Actually, these cards are replaceable. The module is very easily removed and uses a standard MXM PCI-e interface so it can be swapped out.

Don't bring flames if you don't know what you are talking about.

Re:will it work on exisitng laptops (0)

Anonymous Coward | more than 6 years ago | (#21421553)

Not quite. You CAN technically upgrade laptops with these--assuming the laptop supports standard "replaceable" MXM cards and the manufacturer will swap them out (or you can find them on eBay occasionally). In the parent's case at least, the E1505 doesn't have MXM, so no dice.

Re:will it work on exisitng laptops (1)

beavioso (853680) | more than 6 years ago | (#21424129)

Actually, some people can swap out video cards in laptops. I know that a Dell 9300 (a 17" model) can be upgraded from an ATI X300 (middle of the road then) to an Nvidia 6800 Go (high end). Some found that a 7800 GTX Go could fit in the 9300 too. Sadly, because of BIOS and physical layout issues, that's where it ended for the 9300, but Dell's high-end computers, unless they have integrated graphics, can usually swap video cards at least once.

So, in conclusion, I think this will probably fit in Dell's XPS M1730, which just came out with the Nvidia 8700M graphics card. FYI, you can also get the M1730 with two 8700M cards in SLI, so it might be able to house these in SLI as well.

Re:will it work on exisitng laptops (0)

Anonymous Coward | more than 6 years ago | (#21428555)

Maybe you can answer my questions. I have the 8400M GS (128MB version) in my HP DV6500t laptop. Is it swappable with this new 8800? Or is the interface different? Are there potential heat problems or driver difficulties? Thanks. My card is working swell now, but in a year or so (when this drops in price a tad bit) I'd like to know my options.

Re:will it work on exisitng laptops (1)

beavioso (853680) | more than 6 years ago | (#21429253)

I don't know, really. I stumbled upon notebookforums dot com and learned about the 9300 upgrade path by accident. They should have a forum for HP notebooks; maybe someone there can answer your question.

Re:will it work on exisitng laptops (1)

Khyber (864651) | more than 6 years ago | (#21425239)

Someone needs to work in a laptop repair depot, sometime.

Can these be bought without a laptop? (2, Interesting)

lonesome_coder (1166023) | more than 6 years ago | (#21419093)

Just wondering...I have an Alienware m7700 that could really use a new card to put some life into it. Does anyone know if there is a place to just buy these cards without having to go through Dell, Alienware or some other company of that sort? Any linkage would be appreciated.

Yes, I was dumb enough to buy a lappy from Alienware...definitely catches attention in the airport, though.

Re:Can these be bought without a laptop? (1)

MTgeekMAN (700406) | more than 6 years ago | (#21419761)

Actually, I am wondering about this as well. I bought a laptop earlier this year... not the greatest video card for gaming, but it is replaceable. It would be nice to run games smoothly at the laptop screen's native res (1920x1200).

Thanks!

Re:Can these be bought without a laptop? (1)

Karl the Pagan (815039) | more than 6 years ago | (#21426025)

Yes, but not for Dell/Alienware laptops.

Dell & Alienware use a proprietary form factor which is not generally available elsewhere.

I already paid 300% of what I should have on a GPU upgrade from Dell. Not only was the process thoroughly frustrating and overpriced, but my laptop just barely gets by with the approved Dell upgrade.

I was looking in this thread for vendor recommendations, but sadly I don't see any. So here are mine: (DISCLAIMER: I currently own a Dell. I am not a professional tech writer and do not have resources available to actually try out these products)

Sager Notebook - http://www.sagernotebook.com/default.php [sagernotebook.com]
Clevo is the name I see tossed around most often, and Sager seems to be a good reseller. They let slip some 8800M talk a while back but don't have an 8800M laptop ready to go yet. I think they might make a good platform for an SLI laptop gaming rig.

VoodooPC - http://www.voodoopc.com/system/Notebook.aspx [voodoopc.com]
I read some celebrity interview and this is what they used. Checking out the specs they look good, if overpriced.

Re:Can these be bought without a laptop? (1)

lonesome_coder (1166023) | more than 6 years ago | (#21426647)

Thanks for the links, I will take a look when I get home.

Your comment about the Alienware laptops isn't entirely true... my m7700 is a Clevo with a different top on it and uses MXM PCI-e cards. Buying them through Alienware is absolutely crazy, however. Before I voided my warranty by flashing my own BIOS with one from their support site (go figure), I had them send me a replacement card, as mine had gone on the fritz. The price for my outdated card (7800 GTX Go) was absurd (in the area of $1200 USD for just the card). Thankfully, they replaced it after a nice, calm, and collected conversation with my script-read... I mean, support representative.

Re:Can these be bought without a laptop? (1)

Karl the Pagan (815039) | more than 6 years ago | (#21427489)

Yea, the Clevo-based Alienwares are the exception, but you're not exactly going back to Dell for their support, eh?

Re:Can these be bought without a laptop? (1)

Karl the Pagan (815039) | more than 6 years ago | (#21427521)

err, I see - existing Clevo user. Check out other resellers ;)

Re:Can these be bought without a laptop? (1)

Chili-71 (768964) | more than 6 years ago | (#21434611)

Typically, and I see no reason Alienware would be different, the graphics engine (GPU) is integrated into the motherboard on laptops. Good luck desoldering it and soldering in a new one.

Re:Can these be bought without a laptop? (0)

Anonymous Coward | more than 6 years ago | (#21436649)

Some models use MXM (Mobile PCI Express Module) sockets, which allow upgrading; the problem is, this is very seldom advertised as a feature -- which it is! -- or even mentioned in the specs. (On purpose?) And even then it can be a bitch to figure out how to get the custom-built heatpipe cooler off...

I wonder.. (1)

deftones_325 (1159693) | more than 6 years ago | (#21421281)

..if it will come with a sponge to wipe up the puddle of molten Alienware components.

Xvmc (1)

bh_doc (930270) | more than 6 years ago | (#21430729)

When NVIDIA finally implements XvMC support in the 8 series [nvnews.net], I might start giving a shit about their products again. But I suspect that by the time that happens, the open-source ATI drivers might be a real alternative. So I probably won't give a shit then, either. /angry at nvidia
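For cards and driver versions that do support XvMC, wiring it up on Linux is usually just a one-line config file naming the backend library. A sketch, assuming the conventional XvMCConfig location and NVIDIA's usual library name (check your driver's README for the exact filename on your install):

```
# /etc/X11/XvMCConfig -- names the XvMC backend library that
# XvMC-aware apps (e.g. MPEG-2 players) should load.
libXvMCNVIDIA_dynamic.so.1
```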

Desktop Replacement (1)

partowel (469956) | more than 6 years ago | (#21431241)

I don't get Top of the Line laptops for battery life.

Battery Life? I use it as a regular PC.

Also for LAN parties.

Batteries in laptops have always sucked, imo.

2 hours? 5 hours? 8 hours?

Whatever. I like my solar powered calculator that can keep going and going.

The best graphics chip will NEVER win on battery life. That's a given.

Graphics need POWER. It's very simple. Not rocket science.

Really bad part about laptops : They are EASY to steal.

Desktops : REALLY hard to steal. Big and Bulky has advantages.

Conclusion : This chip is for desktop replacement laptops. Not the battery powered gaming system.

If you want portable gaming, get a Nintendo DS. I hear that's good.
