
NVIDIA Launches Five New Mobile GPUs

ScuttleMonkey posted more than 5 years ago | from the too-many-options dept.


Engadget is reporting that NVIDIA has released five new mobile GPUs to fill some imagined gap in the 200M series lineup. These new chips supposedly double the performance and halve the power consumption of the older chips, but still no word on why they think we need eight different GPU options. "The cards are SLI, HybridPower, CUDA, Windows 7 and DirectX 10.1 compatible, and all support PhysX other than the low-end G210M. Of course, with integrated graphics like the 9400M starting to obviate discrete graphics in the mid range -- even including Apple's latest low-end 15-inch MacBook Pro -- we're not sure what we'll do with eight different GPU options, but we suppose NVIDIA's yet-to-be-announced price sheet for these cards will make it all clear in time."

67 comments

If you want any real info, read the comments (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#28337625)

One random guy puts more info in a few sentences than the entire Engadget paragraph.

Re:If you want any real info, read the comments (-1, Troll)

Anonymous Coward | more than 5 years ago | (#28337867)

Linux just isn't ready for the netbook yet. It may be ready for the web servers that you nerds use to distribute your TRON fanzines and personal Dungeons and Dragons web-sights across the world wide web, but the average computer user isn't going to spend months learning how to use a CLI and then hours compiling packages so that they can get a workable graphic interface to check their mail with, especially not when they already have a Windows XP netbook that does its job perfectly well and is backed by a major corporation, as opposed to Linux which is only supported by a few unemployed nerds living in their mother's basement somewhere. The last thing I want is a level 5 dwarf (haha) providing me my OS.

Re:If you want any real info, read the comments (0)

Anonymous Coward | more than 5 years ago | (#28341735)

Linux just isn't ready for the toaster yet. It may be ready for the waffle irons that you nerds use to distribute your waffle fanzines and personal mapple suryp pitchers across the family wide table, but the average kitchen user isn't going to spend months learning how to use a dial and then hours installing upgrades so that they can get a workable interface to check their time with, especially not when they already have a Windows XP toaster that does its job perfectly well and is backed by a major corporation, as opposed to Linux which is only supported by a few unemployed nerds living in their mother's basement somewhere. The last thing I want is a level 5 dwarf (haha) providing me my breakfast.

Re:If you want any real info, read the comments (1, Offtopic)

justinlee37 (993373) | more than 5 years ago | (#28342061)

I realize I'm just posting in a troll thread, which is exactly what the troll wants, but I just couldn't resist ... it's like nerd honey ...

There aren't any racial levels in "Dwarf." You could have a level 5 fighter who happens to be a dwarf. There are racial levels for other races though, usually monstrous ones. You could have, for example, a level 5 doppelganger.

Although personally I would rather make my doppelganger a 10th level assassin, 10th level rogue. Oh yeah ... *drools*

P.S. I don't use Linux. I play Crysis on my 64-bit Windows Vista Ultimate (TM) machine. I need it to run my dual-core GPU (the Radeon HD3870 X2) and to access my 8gb of system RAM. Oh and my quad-core CPU doesn't really require 64-bit operating, but it's still pretty nifty. Anyway, I can't really bite on the Linux stuff because I just don't care. Maybe a Linux fanboy could help feed the troll?

Re:If you want any real info, read the comments (1)

Omniscient Lurker (1504701) | more than 5 years ago | (#28343851)

How do you know what game the troll's hypothetical nerd (in the guise of a dwarf) is playing? Sure, they reference Dungeons and Dragons, but trolls aren't known for maintaining references throughout a post.

Re:If you want any real info, read the comments (1)

Petrushka (815171) | more than 5 years ago | (#28344743)

There aren't any racial levels in "Dwarf."

Not to defend the GP, but I take it you never played Basic [wikipedia.org] , eh? Those are the rules that an awful lot of people started off with, in the days when AD&D -- a.k.a. 1st edition -- was a bit beyond the financial means of many teenagers.

Re:If you want any real info, read the comments (0, Offtopic)

justinlee37 (993373) | more than 5 years ago | (#28408413)

I never played Basic. I realize that "elf" was a class in Basic but I didn't realize that "dwarf" was as well. Man, that version was screwy. I once heard that the game listed a common domesticated cat as having better statistics than a peasant.

Finally (4, Interesting)

Yvan256 (722131) | more than 5 years ago | (#28337635)

Finally, news about low-power GPUs with decent capabilities.

I'm sure hardcore gamers prefer bleeding edge hardware news, but for the rest of us, heat dissipation and power requirements are beginning to be a nuisance more than anything else. I'm sure 99% of computer users would be fine with a dual-core Atom CPU and one of those new GPUs.

Re:Finally (3, Informative)

Anonymous Coward | more than 5 years ago | (#28337819)

Finally, news about low-power GPUs with decent capabilities.

I'm sure hardcore gamers prefer bleeding edge hardware news, but for the rest of us, heat dissipation and power requirements are beginning to be a nuisance more than anything else. I'm sure 99% of computer users would be fine with a dual-core Atom CPU and one of those new GPUs.

I have a duel core atom, and it sucks for flash. It's really sad that you can have the best video solution in the world paired with these and video ends up being the thing that suffers the most.
Once we get HTML5 and video on the web migrates to a non-CPU-based video system, that will be true, though.

Re:Finally (0)

Anonymous Coward | more than 5 years ago | (#28337929)

Everything but an i7 or a C2Q sucks for flash.

That's why we have firefox plugins (0)

Anonymous Coward | more than 5 years ago | (#28338657)

I don't have a problem with flash (I don't have flash installed either), because if it is something I want to view, like the short youtube vid I finished watching a few minutes ago, I use one of the many download-and-convert plugins, then watch it from my drive, where it plays quite well in VLC on a very old and modest system.

And sites that use it for normal HTML replacement? Fuck 'em. If they can't do some links and text and images without resorting to flash... there's a big internet out there, I just skip their stupid and bloated website. I never liked "coded to be best viewed under IE!" bullshit sites, and I don't have to start liking "coded to only work with Flash!" websites either.

Re:Finally (0)

Anonymous Coward | more than 5 years ago | (#28338083)

You are assuming that the video decoders for your HTML-5 compliant browser will be capable of taking advantage of GPU-assist. Why doesn't flash do it already?

Re:Finally (1)

Freetardo Jones (1574733) | more than 5 years ago | (#28338183)

Why doesn't flash do it already?

It's already been planned. From here [nvidia.com] :

NVIDIA, the inventor of the GPU, and Adobe Systems Incorporated announced that they are collaborating as part of the Open Screen Project to optimize and enable Adobe® Flash® Player, a key component of the Adobe Flash Platform, to leverage GPU video and graphics acceleration on a wide range of mobile Internet devices, including netbooks, tablets, mobile phones and other on-the-go media devices.

Re:Finally (1)

Chyeld (713439) | more than 5 years ago | (#28340611)

NVIDIA, the inventor of the GPU

<onomatopoeia>SNERK</onomatopoeia>!

Say what now?

"I have balls of steel"

Yes NVIDIA, yes you do.

Never ascribe to malice ... (1)

Zero__Kelvin (151819) | more than 5 years ago | (#28341137)

Maybe the NVIDIA writer should have written that NVIDIA invented the General Purpose GPU [wikipedia.org]? From the wiki it seems like they might have been pioneers there. As for the statement as written, I don't think anyone could rightly assume they meant that they invented the term Graphics Processing Unit or the idea of having an (at least logically) separate graphics processor in general, so you almost have to assume they meant to use the term GPGPU. Don't forget the Telephone [wikipedia.org] effect and the fact that the person writing the article is probably not an engineer at all. Making easily explained and readily understood mistakes isn't the same as being arrogant.

Re:Never ascribe to malice ... (1)

N Monkey (313423) | more than 5 years ago | (#28345361)

Maybe the NVIDIA writer should have written that NVIDIA invented the General Purpose GPU [wikipedia.org] ? From the wiki it seems like they might have been pioneers there.

I still don't think it's a valid claim. Try reading Myer & Sutherland's On the Design of Display Processors [stanford.edu], or Levinthal and Porter's Chap - a SIMD graphics processor [acm.org]. There were also the TI graphics chips, e.g. the 34010 [wikipedia.org]. It happens all the time. IIRC many of the SGI machines were done with programmable hardware, but I guess that wasn't exposed to the end user.

Re:Never ascribe to malice ... (1)

Zero__Kelvin (151819) | more than 5 years ago | (#28355943)

Programmable hardware isn't the same as a General Purpose GPU. Maybe you knew that, but it appears from your post that you may not know it. In any case, they may not be pioneers, but I suspect that they are. SGI obviously is a contender as well. I think we can all agree that the statement was blatantly false, but whether it was arrogance, ignorance, miscommunication, or error is unknown, which is really the point I was trying to make. I guess I'll add that it seems unlikely they would intentionally try to convince the world that they invented the concept. At least I hope so, since to do so would violate an M$ business practice patent ;-)

Apologies if I was unclear ...

Re:Finally (1)

jedidiah (1196) | more than 5 years ago | (#28338781)

> You are assuming that the video decoders for your HTML-5 compliant browser will be capable of taking advantage of GPU-assist. Why doesn't flash do it already?

Yes, that's DECODERS as opposed to a singular decoder owned by a company that may or may not care about your requirements.

There are varying levels of GPU-assist even on Linux. So it's clearly NOT going to be a problem if anyone who is interested in dealing with the problem decides to take a crack at it. It will inevitably lead to a better situation than one where Adobe (or even Microsoft) is the only one in a position to "fix things".

Re:Finally (2, Interesting)

Arthur Grumbine (1086397) | more than 5 years ago | (#28339069)

I have a duel core atom, and it sucks for flash

Probably cuz it's tired from fighting in one-on-one combat with the GPU all the time. I recommend getting an Atom that works with its GPU [tomshardware.com] .

Re:Finally (1)

MojoStan (776183) | more than 5 years ago | (#28343145)

I have a duel core atom, and it sucks for flash

Probably cuz it's tired from fighting in one-on-one combat with the GPU all the time. I recommend getting an Atom that works with its GPU [tomshardware.com] .

Your link says nothing about the GPU in the Ion chipset (GeForce 9300) helping Flash video in any way (it doesn't). Yes, we all know Ion's GPU accelerates the codecs used in Blu-ray (H.264, VC-1, MPEG-2), but the Atom has to do all the work when it comes to Flash (and it sucks).

Here's a much better link that explains how the Atom (single and dual core) does with Flash on the Ion platform at different resolutions: Zotac's Ion: The Follow Up - Watching Flash Video on the Ion [anandtech.com]

Summary: single-core Atom on Ion is insufficient for playing Hulu video at 480p in its default window (not full-screen). At full screen, even a dual-core Atom-on-Ion is insufficient for playing 480p Hulu video.

OTOH, Atom-on-Ion works surprisingly well with Blu-ray [anandtech.com]. Pretty impressive for such a low-power, fanless system. It would seem like the perfect HTPC platform if Flash playback didn't suck.

Re:Finally (1)

BlackSnake112 (912158) | more than 5 years ago | (#28341791)

I can stream HD video (movie trailers from Apple's site) on my netbook. There is a dual-core Atom? Where? It looks like dual core because of hyper-threading, but it is still a single-core CPU. Well, the 1.6GHz one in the netbooks that I have seen is, anyway. The trailers streamed and played fine under Linux (Ubuntu 8.10 and Ubuntu Remix), OS X 10.5, XP Pro, and Win 7. I needed to install the proper player to view them, but once installed, no issues.

The flash websites (which drive me crazy in a bad way) have no issues that I have seen. The people that were loading up that flash stuff (badger, badger, badger, badger, ... is stuck in my head forever now) had no issues watching it. Is there some different flash that you are talking about?

Flash has more requirements than HD 720p? The small screen limited me to 720p. The higher resolutions did stream, but I only saw part of the video. The rest was off the screen. The 8.9 inch screen on the netbook is not a 1080p screen.

Re:Finally (1)

lorenzo.boccaccia (1263310) | more than 5 years ago | (#28345101)

The Atom 330, the one which is not in netbooks.

Re:Finally (2, Interesting)

MBGMorden (803437) | more than 5 years ago | (#28338363)

I think there's some misunderstanding between "hardcore gamers" and the people who the Atom CPU is viable for. The Atom is a wonderfully efficient chip, and I'll concede that it's probably good enough for most "mundane" computing tasks. However, it's not good for ANY level of traditional (and by traditional I mean something that uses some level of 3D acceleration) PC gaming. I'd also question its usefulness for things like video encoding. That's not a high-end or odd application anymore. My mother (who is FAR from a technophile) is looking into video encoding and editing now after having gotten a digital video camera last Christmas. The bare reality is the Atom is a SLOW chip. We've come to the realization lately that we can and do get useful work done on slow chips, but I have to be given a good situational reason to saddle myself with one.

Overall I think that there is some room for compromise between "bleeding edge" and "so efficient it hurts", but the Atom isn't it. It's a great mobile chip for netbooks, where the difference between a 5W chip and a 15W chip is incredible (because even though both are virtually unnoticeable blips on a power bill, when running wireless everyone notices the extra battery life). For standard usage there are better choices. Low-power/laptop versions of standard offerings such as the Core 2 Duo I think have a better future there.

Re:Finally (4, Interesting)

Hurricane78 (562437) | more than 5 years ago | (#28338715)

The Atom is a wonderfully efficient chip

No, it's not. It's a wonderfully feature-less chip, with everything possible off-loaded into the northbridge, which is why the NB looks like the real CPU when you look at the board.

If you want wonderful efficiency, look at those new smartbooks that were shown in a recent /. article. They take 1-2 watts, and play full HD and hardware-accelerated Flash.
I'd rather stack 10 of those than buy one Atom chip (with the same power usage).

I just wish someone would offer bare-bones ARM modules that you could take as many as you wanted of and stick them together to form a desktop computer, maybe even with a special module that you could take out as a smartbook. Throw in some GPUs, and maybe an SPU (sound), or whatever you like.
Of course Windows would -- as usual -- just choke and die, but Windows and smartbooks do not fit anyway (yet). It's all Linux in its many forms (including Android).

I, for one, would love to have a desktop system that is essentially a more tightly integrated blade rack with a fast backbone bus.

Re:Finally (1)

i.of.the.storm (907783) | more than 5 years ago | (#28338871)

I was going to post exactly what you said, but you seem to have covered all the bases. It bears repeating, though: Atom is not a wonderfully efficient chip at all, it just consumes less power compared to other x86 chips. I would guess that its performance per watt is actually lower than other recent x86 CPUs', and definitely much lower than an ARM CPU's. I think it's much more feasible to go from a sensible, low-power architecture like ARM and try to increase performance than to take a complex, power-hungry architecture like x86 and try to reduce power consumption. I wouldn't be surprised if ARM chips managed to catch up and surpass Atom in both performance and ultra-low power consumption in the next few years.

Re:Finally (1)

hairyfeet (841228) | more than 5 years ago | (#28343119)

What I want to know is: what the hell was wrong with the Celeron/Sempron? Yeah, it didn't get the ultra-low wattage of the Atom (which you pointed out is kinda BS thanks to everything being offloaded to the northbridge), but unlike the Atom the Celeron/Sempron IS capable of doing just about all of the everyday tasks that most folks use PCs for.

Hell, my oldest is playing Left4Dead on a 3.06GHz Celeron while I finish up this 3.6GHz P4 refurb for him, and it plays just fine. The only thing I saw wrong with the Celeron/Sempron was those that would try to cheap out and put the desktop chips into sub-basement laptops, which of course made for 30-minute batteries and lap melters. But having gotten to work with a few Celeron mobile laptops here at the shop, for what most folks do with their machines they would be fine. And for Granny desktops the Celeron/Sempron made a good general-purpose desktop.

I think the only reason Intel pushed out the Atom was as a way to get rid of the old 945G chipsets I'm sure they had a warehouse full of, while selling a chip that I'm betting costs them maybe 1/4th the cost of a Celeron, if that. But since the Celeron/Sempron was a way for Intel and AMD to sell the chips that came out with less-than-perfect caches, I have to wonder if the Atom will end up hurting Intel's bottom line. After all, if Intel got a bad batch of Core 2 all they had to do was blow the cache and voila! Celerons. But if everyone jumps on the Atom cheapo craze, then what is Intel gonna do with those blown Cores if nobody uses Celerons in the low end anymore?

Personally I would take a Celeron/Sempron over an Atom any day. I have a couple of Celerons and even the 1.1GHz one has enough performance to make a decent nettop. I have a feeling a lot of these Atom-based machines are gonna end up in somebody's closet or on eBay cheap when folks realize exactly HOW underpowered those things really are.

Re:Finally (1)

tikram (1262046) | more than 5 years ago | (#28340823)

This is your lucky day! [beagleboard.org]

Re:Finally (1)

jones_supa (887896) | more than 5 years ago | (#28345489)

No, it's not. It's a wonderfully feature-less chip, with everything possible off-loaded into the northbridge. Which is why the NB looks like the real CPU, when you look at the board.

Are you sure it isn't only about the NB not being as power-efficient yet? I wonder if there is anything more "off-loaded" than with any other CPU.

Re:Finally (1)

matmota (238500) | more than 5 years ago | (#28353189)

You can get a MIPS-based desktop system with 72 processors that consumes 300 watts, from SiCortex. They call it their Deskside Development System [sicortex.com] for their bigger parallel computers, and they say it does have a fast backbone bus.

It does run Linux, but at $23,695.00 (48 GB RAM) it's not, I suspect, what you were asking for. I would also like some cheap barebones I could just go on populating with CPUs as I wanted.

The GP might like SGI's Molecule [gizmodo.com] better though, it being Atom-based: 5000 chips, that's 10000 cores, in 3U size. But this one is only a concept computer.

Re:Finally (1)

renoX (11677) | more than 5 years ago | (#28370751)

>>The Atom is a wonderfully efficient chip
> No, it's not. It's a wonderfully feature-less chip, with everything possible off-loaded into the northbridge.

Well, this depends on what you compare an Atom to: compared to many other x86 chips it doesn't off-load anything more to the northbridge.
Compared to an SoC or an ARM, sure.

Re:Finally (1)

notskynet (1397301) | more than 5 years ago | (#28380927)

Throw in a Burroughs Large Systems B5000 type CPU, HyperTransport backplane, Radix tree/string optimized GPGPU, possibly with basic dataflow capabilities for dynamic transcompilation (i386, anyone?), storage and networking, hide it all under the LLVA interface, embed Linux in the motherboard firmware, and Intel and IBM are gonna take lessons from you.

Re:Finally (1)

Kjella (173770) | more than 5 years ago | (#28339925)

Atom + nVidia ION does full 1080p decoding and is capable of running a 3D desktop with any wiggly effects you might want. That covers a lot of ground in my book. Gaming and video editing are at the opposite end of the scale for me; having an HD cam, it's one of the things that really gives the machine a workout. But it's rather specific: either you've got it or you don't. If you don't, and most people I know only have digicams, then Atom will do you just fine. For gaming, go dual-core + fast GPU; for video encoding, go quad-core and forget the GPU, since the GPU encoding/transcoding tools I've seen so far have had too many limitations. And of course if you've got plenty of money almost anything is possible...

Re:Finally (1)

ACMENEWSLLC (940904) | more than 5 years ago | (#28338519)

As an owner of an Asus EEE BOX 206 with an ATI HD video card, I can only agree that it would suit the needs of most users if Adobe would get up off their butt and build decent GPU offloading capabilities into Flash.

The EEE has an Atom and draws 19W max. It plays DVDs just fine. Not being able to stream YouTube or Hulu really sucks, though.

I question whether producing 8 different chipsets is as cost effective as perhaps producing three. The more of a single chip you can produce, the cheaper manufacturing becomes, right?

Re:Finally (1)

Pulzar (81031) | more than 5 years ago | (#28339323)

I question whether producing 8 different chipsets is as cost effective as perhaps producing three. The more of a single chip you can produce, the cheaper manufacturing becomes, right?

We're still talking about two (maybe three) different ASICs, all packaged/fused into different products. Having multiple packages still costs some money, but being able to hit the sweet spot of every market segment is worth it.

Re:Finally (2, Interesting)

AmiMoJo (196126) | more than 5 years ago | (#28338891)

Let's just hope they fixed the manufacturing problems that are still dogging them.

I work fixing PCs for businesses and the public, and we have seen over 120 HP laptops with nVidia chipsets that have failed in the past six months. Usual symptoms are no video output (but otherwise boots), the wifi card dropping out, or the machine just being completely dead and not POSTing.

HP will do anything to get out of fixing the problem, which they won't even admit exists on most affected models. There is a website (http://www.hplies.com/ [hplies.com] ) organising people in the US, but so far nothing similar for the UK.

Re:Finally (0)

Anonymous Coward | more than 5 years ago | (#28339037)

As one of the affected owners of said laptops (in South Africa), I don't care what either company brings out. Not buying their crap ever again.

Never again: nVidia or HP (0)

Anonymous Coward | more than 5 years ago | (#28340411)

I will NEVER buy another nVidia or HP product. In my opinion, they cannot be trusted.

Re:Finally (1)

Bitmanhome (254112) | more than 5 years ago | (#28343551)

Getting off topic, but I just got an HP replaced for that reason (dead nVidia chip). (I'm an nVidia snob, which is why that lappy had one of their chips to begin with.) If you have a bad HP, take the advice at that site [hplies.com] , and get a case manager. Using regular support, we had to send it in 3 times to get a working (though down-specced) machine. But once we got a case manager, they sent a new machine.

Re:Finally (1)

AmiMoJo (196126) | more than 5 years ago | (#28350047)

We managed to get three machines fixed by HP.

1. Took 8 months, went back and forth several times because they initially installed the wrong motherboard (similar spec but lacking an HDMI port). It was a US model and that seemed to confuse them. So much for a worldwide warranty.

2. Was fixed the second time it went to them, but for some inexplicable reason came back with a cracked copy of Vista installed on the HDD. Luckily we imaged the drive before sending it off, so we were able to restore it. No idea how or why it happened; there was a perfectly good Vista COA on the base.

3. Took three attempts to get fixed, but then failed again one month outside the warranty period. HP didn't want to know after that. Apparently they are just replacing defective boards with more defective ones.

low powered -- but better than standard integrated (1)

popo (107611) | more than 5 years ago | (#28337667)

Yes, these are nothing special in the big picture. But the price point could be extremely low for all we know. I'll bet this is an effort to put Nvidia chipsets in an entire generation of netbooks -- from which Nvidia has been excluded in favor of integrated graphics.

These are actually a new architecture of sorts (4, Interesting)

Vigile (99919) | more than 5 years ago | (#28337891)

This piece has more commentary on the release as opposed to regurgitating specs: http://www.pcper.com/article.php?aid=732 [pcper.com]

It looks like this new architecture is going to be quite different from its desktop counterpart.

Suicidal NVIDIA GPUs (4, Interesting)

madnis (1300099) | more than 5 years ago | (#28337971)

So has NVIDIA fixed their bump-material problem, or can I expect one of these GPUs to croak after 6 months like my laptop's 8400M did?

Re:Suicidal NVIDIA GPUs (2, Interesting)

ledow (319597) | more than 5 years ago | (#28338085)

Oh, wow. Thanks. I've never heard of this and just had my new laptop repaired with what appears to be an identical problem.

It was a Clevo with a 9300M on it and the symptoms sound exactly the same - 6 months in, the graphics started playing up to the point that the computer just hung if you touched the keyboard or moved it in any way, always with graphical corruption, and sometimes Linux/Windows would just carry on regardless, but with corrupt graphics. Sometimes there'd be a kernel panic or freeze, but the graphics were the main culprit all the time.

I've just had the mainboard replaced - let's hope that they replaced it with one of the "newer" designs.

Re:Suicidal NVIDIA GPUs (2, Interesting)

Hurricane78 (562437) | more than 5 years ago | (#28338951)

Well... they already killed themselves with their naming scheme changes. Re-labeling things so that you are pretty much guaranteed to feel ripped off when buying one of their cards, because it is just the same old shit with a new name, does not exactly make them trustworthy, or make me want to buy anything from them.

Unfortunately, ATi's current generation is completely incompatible with Linux (not compatible with current kernel interfaces [>=2.6.29], massive numbers of things that make it crash, composite and Xinerama blocking each other, needs band-aids here and helper tools there just to get it working for a short time, extremely crappy video rendering [imagine HDR going wild]), so they are the only real choice. :/

Re:Suicidal NVIDIA GPUs (2, Informative)

i.of.the.storm (907783) | more than 5 years ago | (#28338973)

This is a very important issue for anyone looking into nVidia chips, and I for one won't buy one until it is ascertained whether this issue has been fixed. Apparently the problem was even in a chip as low-performance as the GeForce Go 6150 IGP, which is pathetic. An IGP should never have overheating problems; what the hell, nVidia.

the one (2, Funny)

gellern (1045842) | more than 5 years ago | (#28337991)

8 GPUs to rule them all and in the darkness bind them! I guess their strategy in the current market is: can't convince them? Confuse them!

Re:the one (1)

Hurricane78 (562437) | more than 5 years ago | (#28338773)

and in the darkness bind them

Are you saying they will come bundled with Doom 4?
Better hope they integrate all eight at the same time. Or you might end up with your keyboard LEDs having a higher resolution (and being brighter anyway).

Sucky Summary, could have held the whole FA... (3, Informative)

BobMcD (601576) | more than 5 years ago | (#28338147)

NVIDIA is filling in what it presumes to be holes in its next-generation GPU lineup, adding the 40nm G210M, GT 230M, GT 240M and GTS 250M, with GDDR3 memory ranging from 512MB to 1GB, to its existing GTX 280M, GTX 260M and GTS 160M laptop graphics cards. Apparently the new cards sport "double the performance" and "half the power consumption" over the last generation of discrete GPUs they're replacing. The cards are SLI, HybridPower, CUDA, Windows 7 and DirectX 10.1 compatible, and all support PhysX other than the low-end G210M. Of course, with integrated graphics like the 9400M starting to obviate discrete graphics in the mid range -- even including Apple's latest low-end 15-inch MacBook Pro -- we're not sure what we'll do with eight different GPU options, but we suppose NVIDIA's yet-to-be-announced price sheet for these cards will make it all clear in time.

Look at the words changed:

[what it presumes to be holes] becomes [some imagined gap]

[Apparently the new cards sport "double the performance" and "half the power consumption"] becomes [These new chips supposedly double the performance and halve the power consumption]

[we're not sure what we'll do with eight different GPU options] becomes [still no word on why they think we need eight different GPU options]

and [but we suppose NVIDIA's yet-to-be-announced price sheet for these cards will make it all clear in time] gets completely omitted...

WTF?

Re:Sucky Summary, could have held the whole FA... (1)

Colonel Korn (1258968) | more than 5 years ago | (#28338395)

Also, what's up with calling the 9400M midrange in the same article in which the faster 210M is called low end? And why is an article that mentions 4 new GPUs labeled as introducing 5 new GPUs in the title?

It wouldn't be Slashdot... (0)

Anonymous Coward | more than 5 years ago | (#28338723)

...without some hilariously misplaced righteous indignation. The only appealing part of Slashdot is laughing at self-righteous nerd rage.

For those confused about the codenames... (4, Informative)

slyn (1111419) | more than 5 years ago | (#28338241)

So I was looking around after seeing this earlier to try and make sense of which older-generation codenames match the newer-generation codenames, and found this: http://www.nvidia.com/object/geforce_m_series.html [nvidia.com] (scroll down).

Basically it goes GTX > GTS > GT > GS > G

The old 9400/8400 line has become the 210/110
The old 9600/8600 line has become the 230/130
The old 9800/8800 GT/GS has become the 250/150
And the old 9800/8800 GTX/GTS has become the 280

There are a few other cards that fall in the middle of categories, but that seems to be the basic gist of it as far as I can tell.

Here's another useful resource for comparing mobile GPUs: http://www.notebookcheck.net/Comparison-of-Graphic-Cards.130.0.html [notebookcheck.net]
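
If it helps to see that as data, here's a purely illustrative Python sketch that restates the prefix ordering and the old-to-new mapping above as a lookup table. The names and pairs just repeat what this comment says, nothing official from NVIDIA:

# Illustrative sketch only: restates the prefix ordering and the
# old-to-new renaming described in the comment above.
PREFIX_ORDER = ["GTX", "GTS", "GT", "GS", "G"]  # fastest tier first

OLD_TO_NEW = {
    "9400/8400": "210/110",
    "9600/8600": "230/130",
    "9800/8800 GT/GS": "250/150",
    "9800/8800 GTX/GTS": "280",
}

def tier(model):
    """Rank a 200M-series model name by its prefix; lower index = faster tier."""
    for i, prefix in enumerate(PREFIX_ORDER):
        if model.startswith(prefix + " "):
            return i
    return len(PREFIX_ORDER)  # unknown prefixes sort last

if __name__ == "__main__":
    new_models = ["G 210M", "GT 230M", "GT 240M", "GTS 250M", "GTX 280M"]
    print(sorted(new_models, key=tier))  # fastest tier first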

Re:For those confused about the codenames... (1)

Jeremy Erwin (2054) | more than 5 years ago | (#28339557)

I wonder if these codes promote fanboyism. You've learnt the code, you know the lingo, you buy the card that you know. Accepting that the other side might just have something better this iteration would require turning in your secret decoder ring.

Re:For those confused about the codenames... (1)

hairyfeet (841228) | more than 5 years ago | (#28345753)

I don't think it is being a fanboy as much as being confused as hell because you know one scheme and not the other, kinda like the old days before MHz ratings, when you weren't sure if that 486 from company X (be it AMD or Cyrix) would be equal to an Intel 486 or what. It is even worse when you try to jump ship with GPUs, as there are just so damned many cards out there and so varied a price point.

I know when I decided to jump from Intel+Nvidia (because until all the "bad bump" chips have cleared the channel I'm steering clear of them for me and my customers) to AMD+ATI, I finally just went to the ATI forums and said "throw an old greybeard a break here" and was told to get pretty much any Phenom (bigger is better, of course) and anything in the 4xxx series, where again bigger numbers are better. I am quite happy now with my Kuma 7550, which is whisper quiet (my customers really love that as well), and my new HD4650 plays videos and games nice and smooth. But I can say that with all the numbering schemes it was a PITA trying to figure out Nvidia xxxx = ATI yyyy after being on Intel+Nvidia since the days of the GeForce2. My last AMD was a Barton core and I hadn't had an ATI since the RageII, so needless to say I was out of the loop. So it wasn't that I was a fanboy, it was that I already knew at a glance what to expect by looking at the Intel and Nvidia numbering schemes.

Re:For those confused about the codenames... (1)

MojoStan (776183) | more than 5 years ago | (#28343303)

The old 9400/8400 line has become the 210/110
The old 9600/8600 line has become the 230/130
The old 9800/8800 GT/GS has become the 250/150
And The old 9800/8800 GTX/GTS has become the 280

You mean the GTX 280M is not based on the desktop GTX 280, but the previous-generation 9800/8800? Death to NVIDIA!

I'm kidding, of course, but this is a long-time pet peeve of mine. The GeForce4 MX was based on GeForce2 technology. The Radeon 8000 was not a DirectX 8/OpenGL 1.4 GPU like the rest of the 8000-series. This shit continues today with these NVIDIA mobile GPUs.

Competition is so cool... (3, Insightful)

rickb928 (945187) | more than 5 years ago | (#28338309)

- Intel threatening an all-in-one smartphone chipset

- ARM showing up everywhere, netbooks coming soon, with hopes of big battery life gains and HD playback

- Microsoft feeling left out of the smart- market. (I know, insert favorite pun here)

- Android liking its chances in the netbook market

- AMD looking at netbooks for growth

It's wonderful. I may yet get a netbook with 8+ hrs battery life, touchscreen, and I can settle for a Bluetooth headset profile connection to my smartphone in my pocket.

Now, gimme the 8" screen that folds out to 8"x14", and a swiveling keyboard. Woot. And that 700MHz thingie that is supposed to make broadband ubiquitous... For under $300, and less than $40/mo for the Interwebs.

I'll buy it.

Re:Competition is so cool... (0)

Anonymous Coward | more than 5 years ago | (#28340293)

Competition is also cool for the 6-year-olds who put your gadgets together so that their parents will not starve this week.

Re:Competition is so cool... (1)

rickb928 (945187) | more than 5 years ago | (#28354285)

And equally cool for the 45-year-old dad planning on paying his daughter's tuition this fall right after he finds a new job. Nope, it's not fair. It was never intended to be. Trying to make it fair is a constant struggle.

Memo from NVidia CEO (4, Funny)

sootman (158191) | more than 5 years ago | (#28338463)

Fuck Everything, We're Doing 5 GPUs [theonion.com]

Would someone tell me how this happened? We were the fucking vanguard of graphics cards in this country. The GeForce was the card to own. Then the other guy came out with a three-GPU card... Well, fuck it. We're going to five GPUs.

Here's the report from Engineering. Someone put it in the bathroom: I want to wipe my ass with it. They don't tell me what to invent--I tell them. And I'm telling them to stick two more GPUs in there. I don't care how. Make the GPUs so thin they're invisible. Put some on the bracket. I don't care if they have to cram the fifth one in perpendicular to the other four, just do it!

I know what you're thinking now: What'll people say? Mew mew mew. Oh, no, what will people say?! Grow the fuck up. When you're on top, people talk. That's the price you pay for being on top. Which NVidia is, always has been, and forever shall be, Amen, five GPUs, sweet Jesus in heaven.

(Hey, Slashcode, why won't you format <i> or <em> inside <blockquote>?)

Re:Memo from NVidia CEO (1)

gringer (252588) | more than 5 years ago | (#28343411)

(Hey, Slashcode, why won't you format <i> or <em> inside <blockquote>?)

Because you should be using <quote> instead, which does support that formatting.

How is 8 a lot? (1)

wjh31 (1372867) | more than 5 years ago | (#28338809)

we're not sure what we'll do with eight different GPU options

Yeah, because there are hardly any options in the desktop market...

DX 10.1?? (1)

bakedpatato (1254274) | more than 5 years ago | (#28338953)

Hmm... looks like nvidia is finally hopping on the 10.1 train by committing their mobile lines to it... after all that they did to ATi. Now all I want is a laptop with one of these suckers to ship.

Laptop graphics cards - help needed (1)

maroberts (15852) | more than 5 years ago | (#28339117)

I apologise for being a little behind the curve here, but can someone tell me if laptop graphics cards have standard sizes and interfaces nowadays, preferably with useful links? I always thought laptops were more of a custom build than your everyday PC.

Re:Laptop graphics cards - help needed (1)

Narishma (822073) | more than 5 years ago | (#28339361)

They call them cards but that's just to differentiate them from the integrated GPUs. I don't think you can actually replace them.

Re:Laptop graphics cards - help needed (2, Informative)

Atriqus (826899) | more than 5 years ago | (#28340235)

Sure, here's a link that'll send you in the right direction.

MXM [wikipedia.org]

Bleh, boring (1)

Lord Lode (1290856) | more than 5 years ago | (#28339211)

Yawn, mobile GPUs. Where are the days of the huge graphics revolutions, like when the GeForce first came out?

If they aren't OpenCL compliant, ... (1)

tyrione (134248) | more than 5 years ago | (#28341355)

then I'll pass. The upcoming ATi updates are rolling in OpenCL, which allows us to have cross-compilation even if Nvidia thinks everyone's gung-ho about CUDA.

OpenCL available on all current nvidia products (1, Informative)

Anonymous Coward | more than 5 years ago | (#28341577)

If they aren't OpenCL compliant, ... then I'll pass. The upcoming ATi updates are rolling in OpenCL, which allows us to have cross-compilation even if Nvidia thinks everyone's gung-ho about CUDA.

AFAIK, Nvidia released OpenCL drivers that run on top of the Nvidia CUDA runtime:
http://www.nvidia.com/object/cuda_opencl.html [nvidia.com]

Since all recent Nvidia chips are CUDA-enabled, they are by default also OpenCL-enabled.
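
For what it's worth, one way to check that on a given machine is to just enumerate whatever the installed driver exposes. A minimal sketch, assuming the third-party pyopencl bindings and a vendor OpenCL implementation (such as NVIDIA's CUDA-based driver) are installed; neither is part of the linked page:

# Minimal sketch: list the OpenCL platforms/devices the installed driver exposes.
# Assumes the third-party pyopencl package and a vendor OpenCL implementation
# (e.g. NVIDIA's CUDA-hosted driver) are present.
import pyopencl as cl

for platform in cl.get_platforms():
    print(platform.name, "-", platform.version)
    for device in platform.get_devices():
        print("   device:", device.name)

If an NVIDIA platform shows up in the output, the CUDA-hosted OpenCL stack described above is in place.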
