
Nvidia CEO "Not Afraid" of CPU-GPU Hybrids

ScuttleMonkey posted more than 6 years ago | from the you-don't-scare-me dept.

Graphics | 228 comments

J. Dzhugashvili writes "Is Nvidia worried about the advent of both CPUs with graphics processor cores and Larrabee, Intel's future discrete graphics processor? Judging by the tone adopted by Nvidia's CEO during a financial analyst conference yesterday, not quite. Huang believes CPU-GPU hybrids will be no different from (and just as slow as) today's integrated graphics chipsets, and he thinks people will still pay for faster Nvidia GPUs. Regarding Larrabee, Huang says Nvidia is going to 'open a can of whoop-ass' on Intel, and that Intel's strategy of reinventing the wheel by ignoring years of graphics architecture R&D is fundamentally flawed. Nvidia also has some new hotness in the pipeline, such as its APX 2500 system-on-a-chip for handhelds and a new platform for VIA processors."


Not scared... no kidding? (1)

OMNIpotusCOM (1230884) | more than 6 years ago | (#23040376)

It's easy to not be afraid when you have NO COMPETITION! I realize that wasn't the point of the article, but there were some stories on here about Creative [slashdot.org] and how they have sucked since they bought up the competition, and it would suck if that happened (more than it already has) to NVidia.

Re:Not scared... no kidding? (4, Insightful)

Yvan256 (722131) | more than 6 years ago | (#23040470)

No competition? What? Did ATI die or something?

Yes I know they got bought by AMD, but they still exist and they still make GPUs AFAIK.

And if your argument is that nVidia is better than ATI, let me remind you that ATI/nVidia and intel/AMD keep leapfrogging each other every few years.

Re:Not scared... no kidding? (3, Informative)

Anonymous Coward | more than 6 years ago | (#23040820)

ATI/AMD hasn't been competitive with NVIDIA for two product cycles. That doesn't look likely to change in the near future, either; ATI/AMD's next generation GPU architecture isn't looking so hot.

AMD is in a world of hurt right now, with Intel consistently maintaining a lead over them in the CPU segment, and NVIDIA maintaining a lead over them in the GPU segment. They're doing some interesting, synergistic things between the CPU and GPU sides, but who knows if that'll pan out. Meanwhile, they're being forced to compete on price alone, which is never a position you want to be in.

(The driver quality situation hasn't exactly helped them any, either, although I'm looking forward to good things post-acquisition, especially now that open source drivers are becoming a reality.)

Re:Not scared... no kidding? (3, Insightful)

nuzak (959558) | more than 6 years ago | (#23041016)

> ATI/AMD hasn't been competitive with NVIDIA for two product cycles

Competitive enough anyway. Long as I'm still on AGP, I'm still getting ATI cards (nVidia's agp offerings have classically been highly crippled beyond just running on AGP). But sure, I'm a niche, and truth be told, my next system will probably have nVidia.

But gamer video cards aren't everything, and I daresay not even the majority. If you have a flatscreen TV, chances are good it's got ATI parts in it. Then there's laptops and integrated video, nothing to sneeze at.

Re:Not scared... no kidding? (1)

aliquis (678370) | more than 6 years ago | (#23041966)

I thought Nvidia's 8xxxM series was better than the ATI HD 2xxx ones as well.

Re:Not scared... no kidding? (2, Insightful)

Z34107 (925136) | more than 6 years ago | (#23041690)

Quite true. With the 8500 and 8600 models, and now the 9500, nVidia trounces AMD even on budget cards.

But, nVidia got pummeled prior to their acquisition of Yahoo!^H^H^H^H^H^H Voodoo, and the two were quite neck and neck for a long time. So it's more of "the tables have turned (again)" rather than "they have no competition."

Until AMD completely quits making higher-end video cards, nVidia will have to keep on doing something to stay competitive. Same thing with Firefox - I don't think IE8 would have looked any different than IE5 without something biting at their heels-slash-completely surpassing them.

Re:Not scared... no kidding? (1)

aliquis (678370) | more than 6 years ago | (#23041980)

Because Nvidia focused on gaming consoles during that time instead. Big deal; I'm sure they won't let it happen again.

Re:Not scared... no kidding? (1)

Z34107 (925136) | more than 6 years ago | (#23042148)

I don't know what they were smoking, but they pissed off Microsoft, too, and got left out of developing DirectX 8 IIRC. Because they didn't have their hands on the next version of DirectX, they were way behind the ball when the SDK proper was released.

But, I'm thrilled with the hardware they produce. And as long as AMD stays no more than one generation behind them, they won't be able to rest on their laurels, either.

Re:Not scared... no kidding? (1)

aliquis (678370) | more than 6 years ago | (#23041920)

Or to be more correct, ATI hasn't been competitive except during the FX5 series, which was more or less 3dfx cards and not Nvidia cards anyway.

Re:Not scared... no kidding? (1)

Ravadill (589248) | more than 6 years ago | (#23041990)

ATI's 9xxx series (9600 and up) beat NVidia's (underpowered and overheating) FX series chips by a massive amount. Definitely more recent than 3dfx.

Re:Not scared... no kidding? (1)

OMNIpotusCOM (1230884) | more than 6 years ago | (#23041002)

Fair enough, but it seems like AMD is getting less and less press, and ATI is less and less desirable, and Intel is less and less interested in making graphic cards. I have never heard anyone say they were dying to see the new Radeon. If ATI were doing well I don't think they could have been purchased by AMD... but I could be wrong. Just shootin from the hip, my friend, just shootin from the hip =)

Re:Not scared... no kidding? (1)

socz (1057222) | more than 6 years ago | (#23041048)

But ATI sucks :P From personal experience, all the ATI video cards I've had over the years have sucked. They've been very limited in what I can do. The most up-to-date video card I have is only an Nvidia 6600GT. But before that I ran pretty much only ATI (because I didn't know any better) and always hated life. Since then I've recommended Nvidia to everyone, with no complaints whatsoever.

Now I'm looking at one of the newer Nvidia cards for an HTPC. Hopefully AMD can fix the ATI issues and make them a realistic performance option for people, but I'm likely to never buy them again.

Re:Not scared... no kidding? (1)

aliquis (678370) | more than 6 years ago | (#23041906)

No they don't.

Since the Voodoo 2, Nvidia has made the best-performing video cards almost the whole time, except when they let the 3dfx guys make the FX5 series. TNT, TNT2, GeForce (especially), GeForce2, GeForce3, and GeForce4 all had them on top. The FX5 was shit and the Radeon 9xxx was better, but Nvidia caught up again in the next generation. So yes, Nvidia lost one generation because they used other developers, but it's not like the picture changes all the time. The X1950 was a nice card for the price, though.

And considering the shitty ATI drivers during the 9xxx era, the very shitty Linux drivers, and the nonexistent BSD and Solaris drivers, Nvidia wins even more.

My friend, who invests a lot in his gaming rig because he has a job but no life (and plays WoW), got some ATI cards now, though, because the latest Nvidia cards didn't work in SLI on non-Nvidia motherboards, which seems kind of retarded. So he just told the store that they didn't work in his system, returned them, and got ATI's best stuff instead (some dual-GPU card in SLI configuration).

Intel? (4, Funny)

icydog (923695) | more than 6 years ago | (#23040380)

Did I hear that correctly? NVidia is going to beat Intel in the GPU department? What a breaking development!

In other news, Aston Martin makes better cars than Hyundai!

Ray tracing for the win (5, Informative)

symbolset (646467) | more than 6 years ago | (#23040638)

Ray vs. raster. The reason we have so much tech invested in raster is that processing power wasn't sufficient to do ray tracing. If it had been, we'd never have started down the raster branch of development, because it just doesn't work as well. The results are not as realistic with raster: shadows don't look right, you can't do CSG, and you get edge effects. There are a thousand workarounds for things like reflections of reflections, lens effects, and audio reflections. Raster is a hack, and when we have the CPU power to do real-time ray-traced rendering, raster composition will go away.

Raster was a way to make some fairly believable (if cartoonish) video games. They still require some deliberate suspension of disbelief. Only with raytracing do you get the surreal Live-or-memorex feeling of not being able to tell a rendered scene from a photo, except for the fact that the realistic scene depicts something that might be physically impossible.
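
For readers who haven't seen one, the heart of a ray tracer is just an intersection test repeated per pixel, and recursively for shadows and reflections, which is where the cost explodes. Here is a minimal, illustrative ray-sphere test in C++17; the names are invented for this sketch and not taken from any real engine:

    #include <cmath>
    #include <cstdio>
    #include <optional>

    struct Vec3 { double x, y, z; };

    Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    double dot(Vec3 a, Vec3 b)     { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // Distance along the ray to the nearest sphere hit, if any.
    // A renderer runs this (against every object) for every pixel, then
    // spawns secondary rays for shadows and reflections.
    std::optional<double> hitSphere(Vec3 origin, Vec3 dir, Vec3 center, double radius) {
        Vec3 oc = origin - center;
        double a = dot(dir, dir);
        double b = 2.0 * dot(oc, dir);
        double c = dot(oc, oc) - radius * radius;
        double disc = b * b - 4 * a * c;
        if (disc < 0) return std::nullopt;            // ray misses the sphere
        double t = (-b - std::sqrt(disc)) / (2 * a);  // nearest intersection
        if (t < 0) return std::nullopt;               // sphere is behind the ray
        return t;
    }

    int main() {
        auto t = hitSphere({0, 0, 0}, {0, 0, 1}, {0, 0, 5}, 1.0);
        if (t) std::printf("hit at distance %.2f\n", *t);   // prints 4.00
    }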

Re:Ray tracing for the win (2, Interesting)

caerwyn (38056) | more than 6 years ago | (#23041144)

This is true to some extent, but raster will never completely go away; there are situations where raster is completely appropriate.

For instance, modern GUIs often use the 3d hardware to handle window transforms, blending and placement. These are fundamentally polygonal objects for which triangle transformation and rasterization is a perfectly appropriate tool and ray tracing would be silly.

The current polygon model will never vanish completely, even if high-end graphics eventually go to ray tracing instead.

Re:Ray tracing for the win (1)

steelfood (895457) | more than 6 years ago | (#23041692)

If realism is the goal, global illumination techniques, of which ray tracing and ray casting are a part, would be your best bet. Yes, rasters have their place, but it's a small place in the grand scheme of things.

All bets are off if the intention is not photorealism. Some hybrid of the two may be best depending on the situation.

Re:Ray tracing for the win (1)

kalirion (728907) | more than 6 years ago | (#23041154)

Only with raytracing do you get the surreal Live-or-memorex feeling of not being able to tell a rendered scene from a photo, except for the fact that the realistic scene depicts something that might be physically impossible.

Photos, maybe. But we're still a loooooong way off for real time video when you consider that it is still relatively easy to tell CGI from live action in the highest budget prerendered movies. At close-ups anyway.

Re:Ray tracing for the win (2, Interesting)

wattrlz (1162603) | more than 6 years ago | (#23041654)

There must be a conspiracy behind that. There's no way big-budget studios with seven- and eight-figure budgets and virtually limitless CPU cycles at their disposal could be releasing big-screen features that are regularly shown up by video games and decade-old TV movies. Maybe it has something to do with greenscreening to meld the CGI with live-action characters, perhaps it's some sort of nostalgia, or maybe they think the general public just isn't ready to see movie-length photo-realistic features, but there's no way digital animation hasn't progressed in the past ten or twenty years.

Re:Ray tracing for the win (3, Insightful)

mrchaotica (681592) | more than 6 years ago | (#23042270)

Perhaps the limitation is in the ability of the humans to model the scene rather than the ability of the computer to render it.

Re:Ray tracing for the win (2, Insightful)

koko775 (617640) | more than 6 years ago | (#23041262)

Even raytracing needs hacks like radiosity.

I don't buy the 'raytracing is so much better than raster' argument. I do agree that it makes it algorithmically simpler to create near-photorealistic renders, but that doesn't mean that raster's only redeeming quality is that it's less burdensome for simpler scenes.

Re:Ray tracing for the win (1, Insightful)

Anne Thwacks (531696) | more than 6 years ago | (#23041302)

The real reason we have raster is that more computers spend more hours rendering Word docs than rendering game images, by at least a factor of 100,000.

Worse than that, people like me would be quite happy using our 4MB ISA graphics cards, if some sod hadn't gone and invented PCI.

In fact, 3/4 of all computer users would probably be happy using text mode and printing in 10-pitch Courier if it wasn't for the noise those damned daisy-wheel printers made.

NVidia are about to get shafted, and, as someone who cannot get his NVidia card to work properly in FreeBSD, or Win2k, I say "good riddance". (It works with Ubuntu 7.10 if anyone actually cares)

Re:Ray tracing for the win (1, Interesting)

Anonymous Coward | more than 6 years ago | (#23041684)

Worse than that, people like me would be quite happy using our 4MB ISA graphics cards, if some sod hadn't gone and invented PCI.

Are you sure about that?


• 4MB is not enough to store 1280x1024 at 32bpp. I also believe that extra video card memory can be used in 2D to store extra bitmaps.
• ISA also has a bandwidth of under 4 MB/s, which is not enough for 320x240 16bpp 30fps video. (The arithmetic for both points is sketched below.)
• If you want to talk about those old graphics cards, try turning off all 2D acceleration and see how smooth moving windows and scrolling are. That's why they did window outlines.
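
A quick back-of-the-envelope check of those two numbers (using the figures above; treat the ~4 MB/s ISA rate as the commenter's estimate):

    #include <cstdio>

    int main() {
        // Framebuffer for 1280x1024 at 32 bits (4 bytes) per pixel
        double fb_mb = 1280.0 * 1024 * 4 / (1024 * 1024);    // 5.0 MB -- more than a 4MB card holds

        // Uncompressed 320x240 video at 16bpp (2 bytes/pixel), 30 frames per second
        double video_mb_per_s = 320.0 * 240 * 2 * 30 / 1e6;  // ~4.6 MB/s -- above the quoted ~4 MB/s ISA limit

        std::printf("1280x1024x32bpp framebuffer: %.1f MB\n", fb_mb);
        std::printf("320x240x16bpp @ 30fps:       %.1f MB/s\n", video_mb_per_s);
        return 0;
    }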

Re:Ray tracing for the win (2, Interesting)

ObsessiveMathsFreak (773371) | more than 6 years ago | (#23041602)

The results are not as realistic with raster. Shadows don't look right.
As John Carmack mentioned in a recent interview, this is in fact a bonus, for shadows as well as other things.

The fact is that "artificial" raster shadows, lighting and reflections typically look more impressive than the "more realistic" results of ray tracing. This alone explains why raster will maintain its dominance, and why ray tracing will not catch on.

Re:Ray tracing for the win (1)

Jorophose (1062218) | more than 6 years ago | (#23041750)

Raster was a way to make some fairly believable (if cartoonish) video games. They still require some deliberate suspension-of-disbelief. Only with raytracing do you get the surreal Live-or-memorex feeling of not being able to tell a rendered scene from a photo

I think that was the whole appeal. I don't feel like playing a game where everything is completely realistic. I'd much rather play Mario games, where everyone is cartoonish in the sense that you know it's not a person.

Humans feel unsafe around human-like artificial creations. Look up the uncanny valley.

Re:Intel? (1)

wattrlz (1162603) | more than 6 years ago | (#23040698)

... In other news, Aston Martin makes better cars than Hyundai!
In light of the often facetious nature of any sentence containing the words "British engineering", the comparison of Aston Martin's reputation for reliability with Hyundai's, and the comparison of their current parent companies' reputations and stock prices... My word! That is news, indeed!

Re:Intel? (1)

Sciros (986030) | more than 6 years ago | (#23040760)

Aston Martin's privately owned. Bought from Ford by rich Kuwaitis for $850 million or something.

Can't say that's necessarily a good thing, but I guess Ford wanted the money.

And yeah, Hyundais are better built than Astons. But Astons are better in many other regards of course.

Re:Intel? (2, Insightful)

TheRaven64 (641858) | more than 6 years ago | (#23041312)

nVidia beating Intel in the GPU market would indeed be news. Intel currently have something like 40% of the GPU market, while nVidia is closer to 30%. Reading the quote from nVidia, I hear echoes of the same thing that the management at SGI said just before a few of their employees left, founded nVidia, and destroyed the premium workstation graphics market by delivering almost as good consumer hardware for a small fraction of the price.

nVidia should be very careful that they don't make the same mistake as Creative. Twenty years ago, if you wanted sound from a PC, you bought a Soundblaster. Ten years ago, if you wanted good sound in games, you bought a Soundblaster (or, if you had more taste, a card that did A3D), and it would offload the expensive computations from the CPU and give you a better gaming experience. Now, who buys discrete sound cards? The positional audio calculations are so cheap by today's standards that you can do them all on the CPU and barely notice.

CPU and GPU intergation. (1, Interesting)

Anonymous Coward | more than 6 years ago | (#23040450)

CPU and GPU integration is quite a logical progression of technology. There are things for which the GPU is not optimal, and the same goes for the CPU. It seems that when combined, they prove successful.

As a side note, maybe we'll see an Nvidia GPU-based Folding@home client some day, but at least ATI's latest GPUs have a new client to play with:
http://folding.typepad.com/news/2008/04/gpu2-open-beta.html [typepad.com]

Re:CPU and GPU intergation. (2, Insightful)

Anonymous Coward | more than 6 years ago | (#23040976)

CPU and GPU integration is quite a logical progression of technology. There are things for which the GPU is not optimal, and the same goes for the CPU. It seems that when combined, they prove successful.
Let's examine this statement:

"Bus and plane integration is quite a logical progression of technology. There are things for which the plane is not optimal, and the same goes for the bus. It seems that when combined, they prove successful. So let's put wings on a bus."

Now, I think there are plenty of good reasons why CPU/GPU integration is a good idea (as well as a few good reasons why it's not), but there's nothing logical about the statement you made. Just because a CPU does something well and a GPU does something different well, it doesn't necessarily follow that slapping them together is a better idea than having them be discrete components.

The key insight is that the modern CPU and the modern GPU are starting to converge in a lot of areas of functionality. The main difference is that CPUs are optimized for serial processing of at most a few threads of arbitrarily complex software, while GPUs are optimized for massively parallel processing of large numbers of pixels using similar, fairly simple programs (shaders).
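
To make that serial-vs-parallel distinction concrete, here is a toy C++ sketch (C++17; the parallel version may need a parallel-STL backend such as TBB when built with GCC). The function and buffer names are invented for illustration; it shows the shape of the workload, not how a real driver or GPU works:

    #include <algorithm>
    #include <cstdint>
    #include <execution>
    #include <vector>

    // A "shader": one small function applied independently to every pixel.
    // No pixel depends on any other, which is exactly the kind of work
    // a GPU's many simple execution units are built for.
    std::uint8_t brighten(std::uint8_t p) {
        return static_cast<std::uint8_t>(std::min(255, p + 40));
    }

    int main() {
        std::vector<std::uint8_t> frame(1280 * 1024, 100);   // one grayscale frame

        // CPU-style: a single thread walks the pixels in order.
        for (auto& p : frame) p = brighten(p);

        // GPU-style (approximated): the same kernel is fanned out across
        // many execution units at once.
        std::for_each(std::execution::par_unseq, frame.begin(), frame.end(),
                      [](std::uint8_t& p) { p = brighten(p); });
        return 0;
    }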

Now, the logic core needed to perform these two tasks is highly specific, which is why we have separate CPUs and GPUs to begin with. But there's a lot to be gained by integrating the two more closely. You can share memory interfaces, for example, and perhaps more relevantly for the high-end graphics segment, you can tightly couple CPU and GPU operations across a bus that's going to be a hundred times faster than anything PCI Express can provide, and with latency to die for.

In short, I agree with your basic point, but I don't think you made a very good case for it.

Re:CPU and GPU intergation. (1)

mrchaotica (681592) | more than 6 years ago | (#23042314)

Now, the logic core needed to perform these two tasks is highly specific, which is why we have separate CPUs and GPUs to begin with. But there's a lot to be gained by integrating the two more closely. You can share memory interfaces, for example, and perhaps more relevantly for the high-end graphics segment, you can tightly couple CPU and GPU operations across a bus that's going to be a hundred times faster than anything PCI Express can provide, and with latency to die for.

That's why I think AMD/ATI, rather than either Intel or NVidia, is on the right track: I'm really looking forward to being able to put an AMD CPU and an ATI GPU on either end of a Hypertransport bus.

Re:CPU and GPU intergation. (1)

sexconker (1179573) | more than 6 years ago | (#23041610)

?
For decades we've been moving things away from the CPU to separate, dedicated chipsets.

Now we want to consolidate things back into the CPU?

Why? Because we have more cores and developers can't be arsed to program for them?

Re:CPU and GPU intergation. (1)

hobbit (5915) | more than 6 years ago | (#23042106)


Novel processing units tend to be on separate chipsets before they are integrated. Take for instance floating point units. No longer did you have to simulate non-integer values in your own code: you just bought an FPU and plugged it in. Similarly, we don't have to write our own 3D renderers any more, and eventually Graphical Processing Units will be integrated into the same casing as the CPU, just as Floating Point Units were before them.

Re:CPU and GPU intergation. (1)

sexconker (1179573) | more than 6 years ago | (#23042244)

I remember when we all loved that video decoding was being offloaded to GPUs, or that real modems had become cheap enough that we could get away from "soft" modems.

I remember well the fiasco of Creative's audio cards that were nothing more than some ports and a driver that ran on your cpu.

There are certain things that are better left to dedicated chipsets.
Due to the architectural differences between CPUs and GPUs, I'm going to say that, at least for the next 5 years, GPUs will perform better if they're not on the CPU.

Re:CPU and GPU intergation. (1)

billcopc (196330) | more than 6 years ago | (#23042178)

Yes and no.

A combined CPU and GPU would allow cost savings, which is good for a large portion of the market who simply want a cheap piece of junk that can run Excel and surf porn.

On the reverse, having both items fused together means you can't upgrade one without tossing the other. You might be perfectly happy with the GPU, but want a faster CPU for the real work you do with that machine. I'd bet my 8800GTS that Intel/AMD will plan their product lines in such a way that you can't get exactly what you want - you'll either have to get more GPU than you need, to match the CPU side, or vice versa. No more lopsided configs.

Personally, I would much rather stick with the current solutions involving motherboard-integrated video. G31 and NF7050 are pretty decent for what they are, and there's nothing currently stopping me from slapping a Q6600 quad-core on a cheap G31 for a budget compute node. A combined CPU+GPU won't let me do that.

Multi Core GPUs (2, Interesting)

alterami (267758) | more than 6 years ago | (#23040490)

What AMD should really try to do is start combining their cpu technology and their graphics technology and make some multi core GPUs. They might be better positioned to do this than Intel or Nvidia.

Re:Multi Core GPUs (2, Insightful)

Wesley Felter (138342) | more than 6 years ago | (#23040618)

Modern GPUs already have 8-16 cores.

Re:Multi Core GPUs (1)

PhrostyMcByte (589271) | more than 6 years ago | (#23040796)

That is basically what GPUs do already. You'll notice modern ones have something like 128 "stream processors".

Re:Multi Core GPUs (1)

makomk (752139) | more than 6 years ago | (#23041460)

Yep, and I think they're essentially dumbed-down general CPU cores; apparently they don't even support vector operations and are scalar-only.

Re:Multi Core GPUs (1)

graphicsguy (710710) | more than 6 years ago | (#23042094)

apparently they don't even support vector operations and are scalar-only.
Yes, that's true. The reason is that it's hard to keep those vector units fully utilized. You get better utilization from more scalar units than from fewer vector ones (for the same area of silicon).
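
As a toy model of that utilization argument (invented numbers, purely illustrative; real GPUs schedule work very differently):

    #include <cstdio>

    int main() {
        const long ops = 1000000;            // vec3 operations to issue

        // (a) One 4-wide vector ALU: each vec3 op occupies a full cycle
        //     of the unit, but only 3 of its 4 lanes do useful work.
        long vec_lane_cycles  = ops * 4;
        long vec_useful_lanes = ops * 3;

        // (b) Four scalar ALUs: the same 3*ops scalar operations can be
        //     packed densely, assuming enough independent threads exist.
        long scalar_lane_cycles = ((ops * 3 + 3) / 4) * 4;  // round up to whole cycles
        long scalar_useful      = ops * 3;

        std::printf("4-wide vector ALU utilization: %.1f%%\n",
                    100.0 * vec_useful_lanes / vec_lane_cycles);
        std::printf("4 scalar ALUs utilization:     %.1f%%\n",
                    100.0 * scalar_useful / scalar_lane_cycles);
        return 0;
    }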

Re:Multi Core GPUs (1)

Zebra_X (13249) | more than 6 years ago | (#23042134)

Lol... Just like 64-bit and multicore, AMD was talking about Fusion way before anyone else was. Intel has "stolen" yet another idea from AMD. Unfortunately, the reality is that AMD doesn't have the capital to refresh its production lines as often as Intel, and I think to some extent it lacks the human capital to execute on the big ideas.

Re:Multi Core GPUs (1)

billcopc (196330) | more than 6 years ago | (#23042202)

What CPU technology? Their "native" quad core is slower than Intel's "bastard" double-double.

AMD had the lead for a brief while, and they pushed Intel to slash their prices at long last, but now AMD is back where it started, back where it belongs in the budget segment. It's not a bad place to be, now that the GHz race is over and 99.44% of the world is completely sated with entry-level processors.

Let's Face It (2, Insightful)

DigitalisAkujin (846133) | more than 6 years ago | (#23040494)

Until Intel can show us Crysis in all its GPU-raping glory running on its chipset at 1600x1200 with all settings on Ultra High, Nvidia and ATI will still be the kings of high-end graphics. Then again, if all Intel wants to do is create a substandard alternative to those high-end cards just to run Vista Aero and *nix Beryl, then they have already succeeded.

NOTHING to do with existing games. (4, Informative)

SanityInAnarchy (655584) | more than 6 years ago | (#23040654)

Until Intel can show us Crysis

If Intel is right, there won't be much of an effect on existing games.

Intel is focusing on raytracers, something Crytek has specifically said that they will not do. Therefore, both Crysis and any sequels won't really see any improvement from Intel's approach.

If Intel is right, what we are talking about is the Crysis-killer -- a game that looks and plays much better than Crysis (and maybe with a plot that doesn't completely suck [penny-arcade.com] ), and only on Intel hardware, not on nVidia.

Oh, and Beryl has been killed and merged. It's just Compiz now, and Compiz Fusion if you need more.

Re:NOTHING to do with existing games. (1)

Snuz (1271620) | more than 6 years ago | (#23041160)

Penny Arcade can go die in a fire, Crysis was great in both the gameplay and graphics depts.

Re:NOTHING to do with existing games. (1)

geekboy642 (799087) | more than 6 years ago | (#23042082)

If you had been reading, the parent post referred specifically to the story. Gameplay and graphics have absolutely nothing to do with the story line.

Re:NOTHING to do with existing games. (1)

Colonel Korn (1258968) | more than 6 years ago | (#23041718)

When Intel has a ray tracing Crysis-killing demo, we'll be in 2013 playing things 10-100x more complex/faster on raster hardware.

Re:Let's Face It (2, Interesting)

LurkerXXX (667952) | more than 6 years ago | (#23041214)

Intel has open specs on their integrated video hardware, so open-source folks can write their own stable drivers.

ATI and Nvidia do not. I know who I'm rooting for to come up with good hardware...

Re:Let's Face It (2, Informative)

hr.wien (986516) | more than 6 years ago | (#23041848)

ATI have open specs. At least for a lot of their hardware. They are releasing more and more documentation as it gets cleaned up and cleared by legal. Open Source ATI drivers are coming on in leaps and bounds as a result.

Re:Let's Face It (1)

Alioth (221270) | more than 6 years ago | (#23042268)

The thing is, nvidia's statement feels spookily like "famous last words".

I'm sure DEC engineers pooh-poohed Intel back in the early 90s, when the DEC Alpha blew away anything Intel made by a factor of four. But a few short years later, Chipzilla had drawn level. Now Alpha processors aren't even made and DEC is long deceased.

Nvidia ought not to rest on its laurels like DEC did, or Intel will crush them.

Interesting comments in the call (1)

Fallen Kell (165468) | more than 6 years ago | (#23040516)

Some of the comments made were very interesting. He really slammed someone I take to be either an Intel rep or an Intel associate. The best was when that rep/associate/whoever criticized Nvidia about their driver issues in Vista, and the slam-dunk response, which I paraphrase: "If we [Nvidia] only had to support the same product/application that Intel has [Office 2003] for the last 5 years, then we probably wouldn't have as many driver issues either. But since we have new products/applications that our drivers need to support coming out every day, it makes things a little more complicated."

Did anyone expect him to surrender? (4, Insightful)

WoTG (610710) | more than 6 years ago | (#23040590)

IMHO, Nvidia is stuck as the odd-man out. When integrated chipsets and GPU-CPU hybrids can easily handle full-HD playback, the market for discrete GPUs falls and falls some more. Sure, discrete will always be faster, just like a Porsche is faster than a Toyota, but who makes more money (by a mile)?

Is Creative still around? Last I heard, they were making MP3 players...

Re:Did anyone expect him to surrender? (1)

twotailakitsune (1229480) | more than 6 years ago | (#23041130)

ATI/AMD has a chipset that does this: AMD 780. The 780G is real good. Can get like 10 FPS in Crysis.

Re:Did anyone expect him to surrender? (1)

bmajik (96670) | more than 6 years ago | (#23041492)

just like a Porsche is faster than a Toyota, but who makes more money


Porsche, actually, if you're referring to profit.

Porsche is one of the most profitable automakers. If you've ever looked at a Porsche options sheet, it will become clear why this is the case. They also have brilliant/lucky financial people.

http://www.bloomberg.com/apps/news?pid=20601087&sid=aYvaIoPRz4Vg&refer=home [bloomberg.com]

Re:Did anyone expect him to surrender? (1)

WoTG (610710) | more than 6 years ago | (#23041676)

Wow, point taken. I guess that wasn't the best example! I knew I should have said Ferrari.

I wonder what portion of Porsche's profit comes from Volkswagen?

Re:Did anyone expect him to surrender? (0)

Anonymous Coward | more than 6 years ago | (#23042326)

just like a Porsche is faster than a Toyota, but who makes more money
Porsche, actually, if you're referring to profit

Wait, what? Is this some joke? Porsche made $1.97 billion last year as you pointed out. Meanwhile, Toyota made $13.93 billion [wikipedia.org] .

Re:Did anyone expect him to surrender? (1)

Miseph (979059) | more than 6 years ago | (#23041500)

Pretty decent ones, too. I managed to pick up a 1GB Zen Stone for under $10, and the only complaint I have is that it won't play Ogg and there isn't a Rockbox port out there for it (yet). Since I dislike iTunes anyway (it has too many "features" that serve only to piss me off, and any UI that treats me like a 5-year-old just sets me on edge), it pretty much does everything that I want an MP3 player to do.

Plus, as an added bonus, I don't have to pretend that I'm hip or trendy while I listen to it; if people think I'm cool, I want it to be because I am, not because I paid double for a piece of small electronics with lowercase "i"s.

Re:Did anyone expect him to surrender? (0)

Anonymous Coward | more than 6 years ago | (#23041772)

Toyota makes 1 of the top 10 fastest production cars in the world.

I'm fairly certain Porsche is on the list too, but you can't compare a Camry and a 911.

Try a 911 and a Supra. You might be surprised, and for about 1/5 the cost.

He should be afraid (4, Interesting)

Yvan256 (722131) | more than 6 years ago | (#23040608)

I, for one, don't want a GPU which requires 25W+ in standby mode.

My Mac mini has a maximum load of 110W. That's the Core 2 Duo CPU, the integrated GMA950, 3GB of RAM, a 2.5" drive and a DVD burner, not to mention FireWire 400 and four USB 2.0 ports under maximum load (the FW400 port being 8W alone).

Granted, the GMA950 sucks compared to nVidia's current offerings, but do they have any plans for low-power GPUs? I'm pretty sure the whole company can't survive on revenue from FPS-crazed gamers alone.

They should start thinking about asking Intel to integrate their (current) laptop GPUs into Intel CPUs.

Re:He should be afraid (4, Informative)

forsey (1136633) | more than 6 years ago | (#23040818)

Actually, nVidia is working on a new technology called HybridPower, which involves a computer with both an onboard and a discrete graphics card, where the low-power onboard card is used most of the time (when you are just in your OS environment of choice), but when you need the power (for stuff like games), the discrete card boots up.

Re:He should be afraid (1)

Wesley Felter (138342) | more than 6 years ago | (#23041100)

Nvidia already makes IGPs that are pretty low power; they don't even need fans.

For ultimate low power, there's the future VIA/Nvidia hookup: http://www.dailytech.com/NVIDIA%20Promises%20Powerful%20Sub45%20Processing%20Platform%20to%20Counter%20Intel/article11452.htm [dailytech.com]

Sigh (2, Informative)

Sycraft-fu (314770) | more than 6 years ago | (#23041298)

Of COURSE they do; in fact, they already HAVE low-power offerings. I'm not sure why people seem to think the 8800 is the only card nVidia makes. nVidia is quite adept at taking their technology and scaling it down: just reduce the clock speed, cut off shader units and such, and there you go. In the 8 series they have an 8400. I don't know what its power draw is, but it doesn't have any extra power connectors, so it is under 75 watts peak by definition (that's all a PCIe slot can supply). They have even lower-power cards in other lines, and integrated on the motherboard.

So they already HAVE low-power GPUs. However, you can't have both low power and super high performance. If you want something that performs like an 8800, well, you need an 8800.

Re:Sigh (1)

Yvan256 (722131) | more than 6 years ago | (#23041536)

Since you seem to know nVidia's lineup, what would you recommend for a fan-less PCI or AGP card good enough to run Final Fantasy XI in 1280x1024?

And before you say "FF XI is old, any current card will do", let me remind you that it's a MMORPG with more than a few dozen players on the screen at once (in Jeuno, for example), and my target is to have around 20 FPS in worst-case scenario.

I'm asking for PCI because I might try to dump my current AMD Athlon 2600+/KT6 Delta box (which is huge) with a fanless mini-ITX/Core Solo system (if possible). Mind you, I only run Win98SE with 512MB/1GB because FF XI runs fine on that.

Also keep in mind that I'm currently using an ATI Radeon 9600XT and it can barely hold 15 FPS in crowded areas, in 640x480.

The PC is just a toy (4, Insightful)

klapaucjusz (1167407) | more than 6 years ago | (#23040626)

If I understand them right, they're claiming that integrated graphics and CPU/GPU hybrids are just a toy, and that you want discrete graphics if you're serious. Ken Olsen famously said that "the PC is just a toy". When did you last use a "real" computer?

Re:The PC is just a toy (1)

Alioth (221270) | more than 6 years ago | (#23042316)

And of course they and game developers slam Intel's product... but 99.9% of PCs are never used to play games.

Nvidia's statement sounds like famous last words, too. I think their laurels are getting pressed too flat from resting on them. Just as Intel's CPUs eventually caught up and overtook the Alpha, the same might happen with their graphics chipsets.

Can't we all just get along? (1)

scubamage (727538) | more than 6 years ago | (#23040670)

So, if you have a hybrid chip, why not put it on a motherboard with a slot for a nice nvidia card. Then you'll get all the raytracing goodness from intel, plus the joyful rasterbation of nvidia's finest offerings. The word "coprocessor" exists for a reason. Or am I missing something here?

Re:Can't we all just get along? (1)

Wesley Felter (138342) | more than 6 years ago | (#23040966)

Or am I missing something here?
Yeah, you're missing some money from your wallet. Most people won't waste their money on two different GPUs, just like they won't buy PPUs or Killer NICs.

Translation: "nVidia needs a better top manager." (2, Funny)

Futurepower(R) (558542) | more than 6 years ago | (#23040734)

"Huang says Nvidia is going to 'open a can of whoop-ass' on Intel..."

This is a VERY SERIOUS problem for the entire world. There are apparently no people available who have both technical understanding and social sophistication.

Huang is obviously ethnic Chinese. It is likely he is imitating something he heard in a movie or TV show. He certainly did not realize that only ignorant angry people use that phrase.

Translating, that phrase, and the boasting in general, says to me: "Huang must be fired. nVidia needs a better top manager."

Re:Translation: "nVidia needs a better top manager (1)

MaDMvD (1148691) | more than 6 years ago | (#23040822)

While I agree that Mr. Huang's statement was a little overboard, and definitely unprofessional, I, my good sir, think you are uptight. Furthermore, if you had actually RTFA (I know, you are probably unfamiliar w/ the term), you would have realized that he never directly mentions Intel in his statement. Feast your eyes on the glory of an article snippet: Huang summed up Nvidia's position on Larrabee in one sentence: "We're gonna open a can of whoop-ass [on Intel]." Now, notice the [on Intel]. Thank you.

Okay, here is more detail about why he is foolish. (3, Interesting)

Futurepower(R) (558542) | more than 6 years ago | (#23042154)

The problem is not that Nvidia CEO Jen-Hsun Huang made one stupid statement. The problem is that he said many foolish things, indicating that he is not a good CEO. Here are some:

Quote from the article: "Nvidia CEO Jen-Hsun Huang was quite vocal on those fronts, arguing hybrid chips that mix microprocessor and graphics processor cores will be no different from systems that include Intel or AMD integrated graphics today."

My opinion: There would be no need for all the talk if there were no chance of competition. Everyone knows there will be new competition from Intel's Larrabee and from AMD/ATI. Everyone knows that "no different" is a lie. Lying exposes the Nvidia CEO as a weak man.

"... he explained that Nvidia is continuously reinventing itself and that it will be two architectural refreshes beyond the current generation of chips before Larrabee launches."

The entire issue is that Intel+Larrabee and AMD+ATI will make Nvidia irrelevant for most users. The GPU will be on the motherboard. Nvidia will sell only to gamers who are willing to pay extra, a lot extra.

"Huang also raised the prospect of application and API-level compatibility problems with Larrabee. Intel has said Larrabee will support the DirectX 10 and OpenGL application programming interfaces just like current AMD and Nvidia GPUs, but Huang seemed dubious Intel could deliver on that front."

Intel, in this case, is Intel and Microsoft working together. Both are poorly managed companies in many ways, but they are both managed well enough to ensure that the Microsoft product works with the Intel hardware. Sure, it is an easy guess that Microsoft will release several buggy versions, because Microsoft has a history of treating its customers as though they were beta testers, but eventually everything will work correctly.

'[NVidia VP] Tamasi went on to shoot down Intel's emphasis on ray tracing, which the chipmaker has called "the future for games." '

Ray tracing is certainly the future for games, there is no question about that. The question is when, because the processor power required is huge. It's my guess, but an easy guess, that Mr. Tamasi is lying; he is apparently trying to take advantage of the ignorance of financial analists.

"Additionally, Tamasi believes rasterization is inherently more scalable than ray tracing. He said running a ray tracer on a cell phone is "hard to conceive."

This is apparently another attempt to confuse the financial analyists, who often have only a pretend interest in technical things. Anyone understanding the statement knows it is nonsense. No one is suggesting that there will be ray-tracing on cell phones. My opinion is that this is another lie.

"We're gonna be highly focused on bringing a great experience to people who care about it," he explained, adding that Nvidia hardware simply isn't for everyone."

That was a foolish thing to say. That's the whole issue! In the future, Nvidia's sales will drop because "Nvidia hardware simply isn't for everyone." Most computers will not have separate video adapters, whereas they did before. Only powerful game machines will need to buy from Nvidia.

'Huang added, "I would build CPUs if I could change the world [in doing so]." ' Later in the article, it says, "Nvidia is readying a platform to accompany VIA's next-generation Isaiah processor, which should fight it out with Intel's Atom in the low-cost notebook and desktop arena"

Translation: Before, every desktop computer needed a video adapter, which came from a company different than the CPU maker, a company like Nvidia. Now, the video adapters will be mostly supplied by CPU makers. In response, Nvidia will start making low-end CPUs. It is questionable whether Nvidia can compete with Intel and AMD making any kind of CPU.

Typo corrections (1)

Futurepower(R) (558542) | more than 6 years ago | (#23042214)

Analysts, the word is analysts.

Typing too fast.

Re:Translation: "nVidia needs a better top manager (0)

Anonymous Coward | more than 6 years ago | (#23040836)

What it says to me is "Huang speaks the language of their demographic". Sorry, but "can of whoop-ass" may be coarse, but it's not redneck-speak. It's at worst a little bit dated.

I see you didn't say anything about his actual leadership or management style. The crappy year of 2008 aside, where everyone's been suffering, I'd say NVDA's stock price alone vindicates him.

Re:Translation: "nVidia needs a better top manager (1)

avandesande (143899) | more than 6 years ago | (#23040918)

The statement itself is pretty stupid. Is NVIDIA going to design a better CPU with an onboard GPU than Intel?

Re:Translation: "nVidia needs a better top manager (1)

Nullav (1053766) | more than 6 years ago | (#23042010)

Or just design a dedicated GPU that blows Intel's offering out of the water. That's what a video card manufacturer does, after all.

Re:Translation: "nVidia needs a better top manager (1)

Lord Ender (156273) | more than 6 years ago | (#23040980)

only ignorant angry people use that phrase.
Only ignorant angry people make such generalizations.

Re:Translation: "nVidia needs a better top manager (2, Insightful)

ozbird (127571) | more than 6 years ago | (#23041396)

This is a VERY SERIOUS problem for the entire world. There are apparently no people available who have both technical understanding and social sophistication.

Maybe he was out of chairs?

Re:Translation: "nVidia needs a better top manager (2, Informative)

nuzak (959558) | more than 6 years ago | (#23041618)

Huang is obviously ethnic Chinese. It is likely he is imitating something he heard in a movie or TV show.

Yeah, them slanty-eyed furriners just can't speak English right, can they?

Huang is over 40 years old and has lived in the US since he was a child. Idiot.

So when AMD dies (1)

tuaris (955470) | more than 6 years ago | (#23040828)

When AMD finally dies, what will my ATI stock certificates be worth?

ouch (2, Informative)

Lord Ender (156273) | more than 6 years ago | (#23040840)

NVDA was down 7% in the stock market today. As an Nvidia shareholder, that hurts!

If you don't believe Intel will ever compete with Nvidia, now is probably a good time to buy. NVDA has a forward P/E of 14. That's a "value stock" price for a leading tech company... you don't get opportunities like that often. NVDA also has no debt on the books, so the credit crunch does not directly affect them.

I think AMD has a better plan (4, Interesting)

scumdamn (82357) | more than 6 years ago | (#23040850)

Intel is and always has been CPU-centric. That's all they ever seem to focus on, because it's what they do best. Nvidia is focusing 100% on GPUs because it's what they do best. AMD seems to have it right with their combination of the two (by necessity), because they're focusing on a mix between the two. I'm seriously stoked about the 780G chipset they rolled out this month, because it's an integrated chipset that doesn't suck and actually speeds up an ATI video card if you add the right one. Granted, AMD isn't the fastest when it comes to either graphics or processors, but at least they have a platform with a chipset, CPU, and graphics that work together. Chipsets have needed to be a bit more powerful for a long-ass time.

Re:I think AMD has a better plan (1)

IdeaMan (216340) | more than 6 years ago | (#23041146)

You could be right, unless either NVidia licenses their GPU IP to Intel, or is bought by Intel.

They created a small integrated ARM based processor (picture in article), so I take that as an indication of how they're trying to do the integration game. The question then becomes how high up the CPU ladder are they aiming?

Re:I think AMD has a better plan (1)

opec (755488) | more than 6 years ago | (#23041548)

Ass time?

Re:I think AMD has a better plan (1)

Kjella (173770) | more than 6 years ago | (#23042090)

Only if you think the non-serious gamer market is a big hit. It's been a long time since I heard anyone complain about GPU performance except in relation to a game. Graphics cards run at the resolution you want, and people play a lot of, well... not GPU-intensive games, or rather non-GPU games entirely, games you could run on highly inefficient platforms like Flash and still do alright. And for the games that do care about GPU performance, more is usually better. Sure, it's an integrated chipset, but I've yet to see anyone get substantial benefits from being integrated. Most of all, it's a cutthroat market, with nVidia, Intel and AMD/ATI all making integrated chipsets. In the end, there's no point if the "integrated chipset" is just the GPU with some I/O glued on, with no more relation to the CPU than it has today. Show me the real benefit of this integration; if there isn't one, it's mere aggregation and can easily be done by Intel/nVidia as discrete components.

Re:I think AMD has a better plan (0)

Anonymous Coward | more than 6 years ago | (#23042208)

Presumably it is mostly cost savings at this point, avoiding the cost of lots of memory chips.

Nvidia, Think of the... (0)

Anonymous Coward | more than 6 years ago | (#23040978)

Nvidia may not be scared, but I am. I don't want some goatse interfering with my porn data processing. :-(

Can of Whoop Ass?? (2, Interesting)

TomRC (231027) | more than 6 years ago | (#23041108)

Granted, NVidia is way out ahead in graphics performance, but generally you can tell someone is getting nervous when they start in with the belligerent bragging.

The risk for NVidia isn't that Intel will surpass them, or even necessarily approach their best performance. The risk is that Intel might start catching up, cutting (further) into NVidia's market share.
AMD's acquisition of ATI seems to imply that they see tight integration of graphics as at least cheaper for a given level of performance, or higher performance for a given price. Apply that same reasoning to Intel, since they certainly aren't likely to let AMD have that advantage all to themselves.

Now try to apply that logic to NVidia - what are they going to do, merge with a distant-last-place x86 maker?

Intel graphics suck... (0, Redundant)

Doug52392 (1094585) | more than 6 years ago | (#23041226)

Intel "graphics cards" are on most regular PCs, and they can't run shit. They're integrated, so if you try to run a game, the computer takes some RAM and makes it graphics RAM, which slows everything down...

Re:Intel graphics suck... (2, Insightful)

Nullav (1053766) | more than 6 years ago | (#23041516)

Next you're going to tell me the sky is blue or that too much water can kill me. Onboard video isn't meant to be shiny, just to serve a basic need: being able to see what the hell you're doing. Rather than dismissing Intel because they (and many other board manufacturers) provide a bare-bones video solution, I'm interested in seeing what they'll pop out when they're actually trying.

By the way, onboard video uses about as much RAM as a browser will use (and about as much as Win98 needs to boot in, but I digress), which is a drop in the bucket with 1GB sticks being so cheap now. If 8-32MB of RAM is that much of a problem for you, you have more problems than poor video.

Re:Intel graphics suck... (1)

klapaucjusz (1167407) | more than 6 years ago | (#23041626)

if you try to run a game, the computer takes some RAM and makes it graphics RAM, which slows everything down...

Look at it the other way: whenever you're not running a game, "the computer" takes some Video RAM and gives it back to the system...

NVidia may just add a CPU. (1)

Animats (122034) | more than 6 years ago | (#23041278)

NVidia may just put a CPU or two in their graphics chips. They already have more transistors in the GPU than Intel has in many of their CPUs. They could license a CPU design from AMD. A CPU design today is a file of Verilog, so integrating that onto a part with a GPU isn't that big a deal.

Re:NVidia may just add a CPU. (0)

Anonymous Coward | more than 6 years ago | (#23041736)

A CPU design today is a file of Verilog, so integrating that onto a part with a GPU isn't that big a deal.
lol

I don't want integrated graphics (0)

Anonymous Coward | more than 6 years ago | (#23041292)

I find that graphics cards burn out 2-3x as fast as CPUs/motherboards, and discrete graphics also allow the graphics capability of the system to be upgraded for $100-200. Integrated graphics would mean the CPU/GPU (which would be much more expensive) would need to be swapped out to upgrade a system, and also that these more expensive chips would either burn out more often, or underperform in order to extend chip life. I'm sure Vista would also consider a CPU/GPU upgrade a 'relicense-worthy' event, though a single graphics card upgrade probably is not.

The folks who are concerned about high-speed graphics are the most likely to want to upgrade their graphics more often, and integrating graphics onto the CPU just doesn't make sense for that market. Some type of higher-speed bus than AGP would be a better way to go.

Re:I don't want integrated graphics (1)

Nullav (1053766) | more than 6 years ago | (#23042218)

'Much more expensive'? I see the prices actually going down. Has adding three processor cores and boatloads of cache over the past four years raised prices much? Also, this would mean a GPU would no longer be a specialized component and be produced in higher volume as part of every desktop CPU.

Upgrades? Just grab a new processor, tear up your thumbs removing the cheapshit heatsink you grabbed off eBay for $2, apply some thermal grease that absolutely refuses to come off your finger, replace the heatsink and you're done. (Well, it's cheaper, at least.) You could perform two of the most significant upgrades for $100-200, rather than $50-100 for the CPU and $100-200 for the GPU.

Just like the FPU (5, Interesting)

spitzak (4019) | more than 6 years ago | (#23041360)

Once upon a time, floating point was done on a separate chip. You could buy a cheaper "non-professional" machine that emulated the FPU in software and ran slower. You could also upgrade your machine by adding the FPU chip.

Such FPUs do not exist today.

I think Nvidia should be worried about this.

Re:Just like the FPU (1, Insightful)

Anonymous Coward | more than 6 years ago | (#23041852)

That's true, but FPUs didn't require 100+ GByte/sec dedicated memory systems, analog and digital video output circuitry or any of the other things that GPUs have other than FPUs... You're absolutely right that there's a possibility of things going like they did for the x87 and Weitek floating point chips, but it's also possible that having the GPU as an independent device will continue to make sense for a variety of other reasons going forward.

Re:Just like the FPU (1)

Kjella (173770) | more than 6 years ago | (#23042192)

The FPU is a sparring partner; the GPU is a pipeline. Except for a few special uses, the GPU doesn't need to talk back to the CPU. The FPU (that's floating point unit, doing non-integer math) very often returned a result that the CPU needed back. The result is that it makes sense to put the CPU and FPU very close together; the same doesn't hold true for the GPU. It's also substantially more difficult to cool one 100W chip than two 50W chips. Hell, even on a single chip, Intel and nVidia could draw up "this is my part, this is your part, and these are the PCIe on-chip interconnects", but it still wouldn't mean they cooperate any more closely.

The problem... (2, Interesting)

AdamReyher (862525) | more than 6 years ago | (#23041370)

...is that Nvidia is saying that Intel is ignoring years of GPU development. Umm, wait. Isn't a GPU basically a mini-computer/CPU by itself that exclusively handles graphics calculations? By making this statement, I think they've forgotten who Intel is. Intel has more than enough experience in the field to go off on their own and make GPUs. Is it something to be scared of? Probably not, because as he correctly points out, a dedicated GPU will be more powerful. However, it's not something that can be ignored. We'll just have to wait and see.

Publish your programming manuals! (1)

GNUPublicLicense (1242094) | more than 6 years ago | (#23041384)

DAMN IT! We want open drivers!

CPU+GPU is mostly a cost-cutting measure (3, Interesting)

billcopc (196330) | more than 6 years ago | (#23042100)

Having the GPU built into the CPU is primarily a cost-cutting measure. Take one low-end CPU, add one low-end GPU, and you have a single-chip solution that consumes a bit less power than separate components.

Nobody expects the CPU+GPU to yield gaming performance worth a damn, because the two big companies that are looking into this amalgam both have underperforming graphics technology. Do they both make excellent budget solutions? Yes, they certainly do, but for those who crave extreme speed, the only option is NVidia.

That said, not everyone plays shooters. Back in my retail days, I'd say I moved 50 times more bottom-end GPUs than top-end ones. Those Radeon 9250s were $29.99 piles of alien poop, but cheap poop is enough for the average norm. The only people who spent more than $100 on a video card were teenagers and comic book guys (and of course, my awesome self).