
The Future According To nVidia

kdawson posted more than 6 years ago | from the when-all-you've-got-is-a-hammer dept.

Graphics 132

NerdMaster writes "Last week nVidia held their Spring 2008 Editor's Day, where they presented their forthcoming series of graphics processing units. While the folks at Hardware Secrets couldn't reveal details of the new chips, they posted some ideas about what nVidia sees as the future of computing: basically more GPGPU usage, with the system CPU losing importance, and the co-existence of ray tracing and rasterization on future video cards and in games. In other words, the 'can of whoop-ass' nVidia has promised to open on Intel."
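
For readers new to the term, "GPGPU" simply means running ordinary, non-graphics computation on the graphics chip through a compute API such as nVidia's CUDA. A minimal, hypothetical sketch of what that looks like in code follows; the kernel, sizes and values are purely illustrative and not anything from nVidia's presentation.

    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    // SAXPY (y = a*x + y) is the usual "hello world" of GPGPU:
    // each GPU thread handles exactly one array element.
    __global__ void saxpy(int n, float a, const float *x, float *y) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) y[i] = a * x[i] + y[i];
    }

    int main() {
        const int n = 1 << 20;
        size_t bytes = n * sizeof(float);
        float *hx = (float*)malloc(bytes), *hy = (float*)malloc(bytes);
        for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

        float *dx, *dy;
        cudaMalloc((void**)&dx, bytes);
        cudaMalloc((void**)&dy, bytes);
        cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

        // 4096 blocks of 256 threads: one thread per element.
        saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);

        cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
        printf("y[0] = %f (expected 5.0)\n", hy[0]);

        cudaFree(dx); cudaFree(dy); free(hx); free(hy);
        return 0;
    }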


Who will have the better Linux driver support? (5, Informative)

pembo13 (770295) | more than 6 years ago | (#23542553)

That's my main influence when I purchase video cards.

Re:Who will have the better Linux driver support? (4, Insightful)

Rosco P. Coltrane (209368) | more than 6 years ago | (#23542997)

I fail to see how this is redundant. I too choose video cards based on how well they are supported under Linux. Or rather, I choose the ones with the least shitty support. Any Linux user who's ever tried to use an OpenGL app more complex than glxgears knows the pain, so I reckon Linux (or any OS other than Windows, I suppose) support isn't a trivial, or a fanboy, issue.

So no, the post isn't redundant, because this issue isn't yet solved (not to mention, how can a first post be redundant?).

Re:Who will have the better Linux driver support? (0, Redundant)

nebulus4 (799015) | more than 6 years ago | (#23543249)

(...) not to mention, how can a first post be redundant?
You must be new here.

Re:Who will have the better Linux driver support? (0)

Anonymous Coward | more than 6 years ago | (#23546303)

Finally!! A moderator who really understands what redundant [reference.com] really means.

Re:Who will have the better Linux driver support? (0)

Anonymous Coward | more than 6 years ago | (#23544123)

Sorry, I guess there wasn't a "Who Cares" moderator option.

Re:Who will have the better Linux driver support? (2, Insightful)

morcego (260031) | more than 6 years ago | (#23544845)

so I reckon Linux (or any OS other than Windows I suppose) support isn't a trivial


Considering how many problems I have always seen, I would say that even on Windows it is anything but trivial.

Video drivers suck. On whatever platform you choose.

Re:Who will have the better Linux driver support? (1)

Khyber (864651) | more than 6 years ago | (#23545357)

First posts are redundant because moderators are too fucking STUPID and have little understanding of the language they supposedly speak.

Re:Who will have the better Linux driver support? (5, Insightful)

darthflo (1095225) | more than 6 years ago | (#23543007)

nVidia will probably continue their controversial blob model (i.e. you get a binary object plus the source to a kernel module that, with the help of said object, works as a driver). Purists rage against it because it's against freedom and so on; pragmatists tend to like the full 3D acceleration that comes with it.
Intel is going the Open Source road, trying to be as open as possible. Unfortunately, from a performance PoV their hardware sucks. Their products are intended as consumer-level, chipset-integrated solutions and, considering that, work nicely. Don't try any 3D games, though.
ATi opened a lot of specs, so community-developed and completely open drivers are on the horizon. Unfortunately the horizon is quite far away and the movement towards it is similar to a kid on a tricycle. The situation is prone to improve, though. Performance-wise, ATi may be a good choice if you'd like to play the occasional game, but they don't really compare to nVidia (which is unlikely to change soon).
In the end, I'm going to stick with nVidia for the near future, using Intel wherever low energy consumption is strongly desired (e.g. notebooks and similar). ATi just ain't my cup of tea -- I wouldn't be putting a red card in a Windows box either -- but my preference for nVintel is just that: a preference. Go with whatever suits you best.

Re:Who will have the better Linux driver support? (1)

PeterKraus (1244558) | more than 6 years ago | (#23545181)

ATI/AMD proprietary drivers are getting better and better. The performance of the FireGL series is a bit better than on Windows, actually.

Re:Who will have the better Linux driver support? (1)

darthflo (1095225) | more than 6 years ago | (#23545657)

Nice, didn't know that. I'm planning to wait till about Q3/2009 for a new performance rig, but if ATi manages to catch up to nVidia's performance 'till then, I may just opt for the really open option. Thanks.

Re:Who will have the better Linux driver support? (2, Interesting)

Anonymous Coward | more than 6 years ago | (#23545573)

Purists rage against it because it's against freedom and-so-on, pragmatists tend to like the full 3D acceleration that comes with it.
Bullshit.
Closed drivers suck for pragmatic reasons.
Just because YOU haven't paid the price yet doesn't mean it isn't true.

I bought two top-end nvidia cards (spent $350+ each on them) only to find out that because my monitors don't send EDID information, their binary-blob drivers wouldn't work. The problem was that my monitors required dual-link DVI, and even though these top-of-the-line cards had dual-link transceivers built into the chip (i.e. every single card of that generation had dual-link transceivers; it wasn't unique to one vendor's model), nvidia's brain-dead binary-blob drivers would assume that their own transceivers were single-link if they failed to get EDID information, and nothing you could do in the config files would convince them otherwise. It even printed it out in the driver syslog messages: "initializing single-link dvi transmitter..."

I reported the bug to nvidia, I even fucking proved it out by buying an EDID generator (Gefen dvi detective) to force an EDID down the wire to the nvidia cards.
But Nvidia's support for linux is informal. They don't officially support linux. You heard me, it's "best level of effort", where the engineers assigned to the work are shit-for-brains who just ignore problems that they can't grasp, rather than an official support program with bug reports and escalation.

So, I ended up spending an extra $150 (plus DAYS of my time figuring out the problem) all of which could have been reduced to about 1 day's worth of effort if the source was available for me to fix myself.

Oddly enough, RMS had a similar problem once upon a time - a closed source printer driver was buggy as hell, and the printer manufacturer refused to either fix the bugs or send him the source so he could fix it himself. Amazing how little things have changed since then, despite all the hype about "Open Source."

At least ATI has committed to fully Free drivers for their next gen cards, due in a month or so. (The current cards aren't fully Free because the DRM hardware is entangled with the video decode/playback acceleration hardware, so they won't release a driver that supports accelerated video playback for fear it will reveal the cracks in their DRM. The next gen cards separate the DRM hardware from everything else, so we can just ignore it.)

Re:Who will have the better Linux driver support? (1)

darthflo (1095225) | more than 6 years ago | (#23545985)

You're obviously right about the difficulty of fixing problems, where the openness of Open Source really comes to play. It certainly sucks to know that only a single bit needed to be flipped to make something work that doesn't; and not getting any kind of support from the manufacturer sucks even worse.
Though, in defense of nVidia, your problem does seem rather unique. Until very recently it was my understanding that most any screen made in the past decade ought to provide an EDID -- the standard's fifteen bloody years old. While I understand your aggravation with nVidia, I'd also point a finger the other way, at whoever would build a high-end screen without a working EDID implementation. Single-link DVI is sufficient for 1080p60, so we're probably looking at screens over $2k a pop.

Anyway, while Open Source (with an equal feature set) is always better than closed solutions, I even more strongly prefer a working solution over an incomplete or non-working one. I'd love to switch to a free (speech) driver, but in my experience, none of them works even remotely as well as the kernel-tainting blob.

EDID often doesn't make it through KVMs (0)

Anonymous Coward | more than 6 years ago | (#23546205)

The parent's situation may be somewhat unique, but there's a very much more common situation that hits the same problem, namely EDID blocking or mangling by KVMs. The handling of EDID is patchy at best on a lot of KVMs, even on new ones, so the graphics card can easily reach a wrong conclusion. While it's OK to make assumptions as a default, the settings need to be user-selectable to get around this problem.

Re:Who will have the better Linux driver support? (0)

Anonymous Coward | more than 6 years ago | (#23547411)

in defense of nVidia your problem does seem rather unique.
Which is kind of my point. "Works for me" is not a legitimate form of support.
There will always be those users who have systems outside of the mainstream.
Many would argue that linux itself is still outside of the mainstream.

PS, check nvidia's support for X-RandR 1.2 - the standard has been out for over a year.
Almost all of the open source video drivers support it. Nvidia doesn't.

Re:Who will have the better Linux driver support? (1)

drinkypoo (153816) | more than 6 years ago | (#23545773)

This is precisely how I feel. ATI can't write a driver to save their life (I hear they're getting better, but most drivers get better over their lifetime; I want to know when it's good and you don't need special workarounds to make things like compiz not explode) and the OSS drivers are still not where they need to be, and won't be for a long time. If there are good OSS drivers for ATI before nVidia's drivers go OSS, I'll go ATI. Until then, I'm in nVidia land. Only my servers need nothing better than intel, though.

Re:Who will have the better Linux driver support? (1)

Vexorian (959249) | more than 6 years ago | (#23547339)

I love how the Linux world has become a place in which ignoring the long term results of a decision is called "being pragmatic".

Re:Who will have the better Linux driver support? (1)

papabob (1211684) | more than 6 years ago | (#23543569)

Not only that. The graphics card is one of the few components whose internals we don't have any clue about (yes, we've heard about its shaders, the pipelines, etc., but only in PowerPoints released by the company). Memory? If I'm interested I can find info about DDR voltage, timing diagrams and everything else. Processor? Intel can send you a 400-page printed copy of their manual. BIOS? Hard disk? Same. Sure, if you use their drivers everything will work smoothly, but those drivers are given as a plus, not as "the only way to get our product running".

Some big company should realize that if nvidia/ati shut down their business, there will be no way to "revive" their old workstations, and nobody will release patches for upcoming bugs... some big company or some govt department, of course.

Re:Who will have the better Linux driver support? (5, Funny)

Hal_Porter (817932) | more than 6 years ago | (#23544651)

That's my main influence when I purchase video cards.
Hi!

I'm the CEO of NVidia and I spend all day reading slashdot. Despite that I hadn't noticed that Linux was popular until I read your post.

I'll tell the driver developers to start fixing the drivers now.

Thanks for the heads up

Jen-Hsun Huang
CEO, NVidia Inc

Fullscreen TV output? (1)

antdude (79039) | more than 6 years ago | (#23545365)

For me, fullscreen TV output. There's none in the 8xxx series, and it's broken in older cards' drivers. See this as an example [nvidia.com] .

They *so* don't care... (0)

Anonymous Coward | more than 6 years ago | (#23546311)

There's no market for high-end video cards in linux except for a few CAD tools, and those people don't care at all about open source drivers. So I'm afraid you're a demographic that just doesn't matter to them...

Not nVidia. (3, Insightful)

SanityInAnarchy (655584) | more than 6 years ago | (#23547799)

Possibly Intel, possibly ATI.

But nVidia is the last to publish specs, or any sort of source code. ATI and Intel already do one of the two for pretty much all of their cards.

So, in the long run, nVidia loses. It's possible they'll change in the future, but when you can actually convert a geForce to a Quadro with a soft mod, I very much doubt it'll be anytime soon.

Yawn (5, Insightful)

gd23ka (324741) | more than 6 years ago | (#23542561)

The future according to Sun or IBM.. faster CPUs. The future according to Nvidia... more GPUs .. the future according to Seagate.. exabytes and petabytes, the future according to Minute Maid.. , the future according to Blue Bonnet .. lower cholesterol, the future according to ATT "more bars in more places", the future according to ...

Another paid for article. Yawn.

Re:Yawn (1)

somersault (912633) | more than 6 years ago | (#23542595)

It doesn't have to be paid for; it's just a report on what nVidia is saying, not a claim that that's definitely what will happen. After reading the summary I thought exactly what you're thinking, though. The main CPU may lose a little 'importance' when it comes to games and physics simulations, but it's not going away anytime soon..

Re:Yawn (1)

ta bu shi da yu (687699) | more than 6 years ago | (#23542829)

I don't think that games are the main drivers of computing. I think that business apps are. nVidia might think that they are going to rule computing, but they won't. They'll be a dominant player, but they should remember S3 graphics and Vesa. You're only as good as your last product, and given that they aren't particularly open, they might lose their market at any time.

Re:Yawn (2, Informative)

Scoth (879800) | more than 6 years ago | (#23543885)

Business apps might be the drivers of the most sales, but I tend to think games are the drivers of "progress". There are very few business apps that need more than NT4 on a decent sized screen with a fast enough processor to run Office. I think even Office 2007's minimum requirements say something about a 500MHz processor. Heck, a large number of companies could probably get away with Windows 3.1, Word 1.1A, Eudora Light for e-mail, and maybe some sort of spreadsheet/accounting software. You really don't need a dual core with 2GB of RAM and Vista Ultimate to send e-mail, write letters, track expenses, and surf the web a bit.

As for nVidia, I'm still split on whether the graphics card is going to end up being dominant, or whether we're going to end up with something like 16 or 32 general purpose cores with a dynamically allocated number dedicated to graphics. I tend to think that, as things are now, highly specialized dedicated graphics cards aren't going anywhere, but I've been surprised before.

Re:Yawn (1)

hairyfeet (841228) | more than 6 years ago | (#23543797)

I think the point they are trying to drive home, which is true IMHO, is that the CPU just isn't going to be giving you the big boosts anymore. As someone who remembers saving up so I could make the jump from 400MHz to 1.1GHz, I can say from first-hand experience that they are right. Once everyone has gone dual and quad core, adding another 400-600MHz to your CPU really isn't going to do much, whereas jumping a couple of generations on the GPU will give you a pretty big boost.

The place where I can see the CPU making big strides is in the embedded space. Recently a customer at the local college has had me researching a couple of computers to use as brains in a rocket and a remote robot, and the amount of CPU power I have been finding that will fit into such tiny spaces is just unreal. Slightly OT, but does anyone know where I can find a micro that has at least one USB port and preferably runs Linux? I have to fit the CPU into a 4in diameter rocket, and so far most of the ones I'm finding require daughter boards that won't fit. I'm figuring this [netallen.com] for the robot, as it fits the space and has plenty of ports for the cameras and spectrometer. Anyway, thanks in advance for any suggestions, and as usual this is my $0.02, YMMV.

Re:Yawn (1)

vlm (69642) | more than 6 years ago | (#23543955)

Slightly OT,but does anyone know where I can find a micro that has at least one USB and preferably runs Linux? I have to fit the CPU into a 4in diameter rocket and so far most of the ones I'm finding require daughter boards that won't fit.
http://www.gumstix.com/ [gumstix.com]

or

http://gumstix.com/waysmalls.html [gumstix.com]

As they say "linux computers that fit in the palm of your hand"

I believe the verdex boards are 2cm by 8cm.

Price is about the same as desktop gear, figure you'll drop about $250 on a basic working system.

Re:Yawn (1)

somersault (912633) | more than 6 years ago | (#23544045)

I agree overall. I used to think that upgrading my GPU would be the most important thing back when I had my 1GHz Athlon, though eventually, when that machine fried, I found out that a faster CPU enabled me to get the most out of my GPU. It's back to the stage again where any 2GHz dual core CPU should be fast enough for anyone with the current generation of games and apps, but I'm sure they'll find some more interesting uses for CPU power in the next few years. The way things are going, maybe everything will keep converging until all processors are highly parallelisable units equally capable of amazing graphics and general calculations.. and including all of the traditional motherboard chipset on the die as well :p

Re:Yawn (5, Funny)

Anonymous Coward | more than 6 years ago | (#23542649)

The future according to Goatse.. P1st fR0st!
The future according to anonymous coward.. more trolling, offtopic, flamebait-ness, with the odd insightful or funny.
The future according to Ballmer.. inflated Vista sales (it's his job, damnit!).
The future according to Microsoft shill 59329.. "I hate Microsoft as much as the next guy, but Vista really is t3h w1n! Go and buy it now!"
The future according to Stallman.. Hurd.

BTW, the promo video for HURD is going to feature Stallman as a Gangsta rapper and feature the phrase: "HURD up to ma Niggaz."

Re:Yawn (0)

Anonymous Coward | more than 6 years ago | (#23542949)

Yeah, so?

In the future after the future, we're going to see Apple buying HURD and rebranding it to iHURD.

Then we'll be seeing Steve Jobs featured as a Gangsta rapper, featuring the phrase: "iHURD you niggaz."

Re:Yawn (-1, Flamebait)

Anonymous Coward | more than 6 years ago | (#23543369)

Really good humor, but the use of the word "niggaz" really took the funny right off of the gag and moved it along the spectrum between insensitive and outright race-baiting.

Re:Yawn (0)

Anonymous Coward | more than 6 years ago | (#23543575)

Thank you for your cogent analysis.

Re:Yawn (0)

Anonymous Coward | more than 6 years ago | (#23543923)

Really good humor, but the use of the word "niggaz" really took the funny right off of the gag and moved it along the spectrum between insensitive and outright race-baiting.
Ohh, Lighten up Francis

Re:Yawn (0)

Anonymous Coward | more than 6 years ago | (#23544009)

You are, of course, assuming the writer of that comment was white (as it is part of the "gangsta rappa" culture to throw that word around rather a lot).
If the writer was in fact black, then it couldn't be race-baiting (baiting him/herself?)..
Which just goes to show exactly how insensitive and racist this kneejerk "Ohhh... He said a naughty word.." attitude really is..
Personally, I think the joke was funny, irrespective of the colour and creed of the poster.. It fit the genre (s)he was aiming at perfectly.

Re:Yawn (0)

Anonymous Coward | more than 6 years ago | (#23544157)

You cretin. It's people like you that the term 'loony left' was invented for (and I say that as a proud reader of the Guardian). You are giving the left a bad name by taking political correctness too far.

Look, the humour is like this scene from Shaun of the Dead [youtube.com] . It's the situation and Ed's delivery that make it funny, and has bugger all to do with making fun of black people. In fact, the aim is to make fun of white people trying to emulate 'black culture.'

Stallman saying 'HURD up to ma Niggaz' is funny on three levels:

  1. because of who is saying it (a fat white bloke);
  2. the reference to the attempts by Microsoft [gizmodo.com] to come up with 'hip' raps and be 'down with the kids' and the idea that Stallman might try to emulate that;
  3. it's a shit pun, that purposefully sounds like something a marketing department would come up with;

Wow, some of you Americans just don't get subtle humour. Kudos to those who do however. ;)

/me prepares to be modded off-topic (I never understood that one).

Re:Yawn (0, Offtopic)

Hal_Porter (817932) | more than 6 years ago | (#23544911)

You cretin. It's people like you that the term 'loony left' was invented for (and I say that as a proud reader of the Guardian). You are giving the left a bad name by taking political correctness too far.

Look, the humour is like this scene from Shaun of the Dead [youtube.com] . It's the situation and Ed's delivery that make it funny, and has bugger all to do with making fun of black people. In fact, the aim is to make fun of white people trying to emulate 'black culture.'

Stallman saying 'HURD up to ma Niggaz' is funny on three levels:

  1. because of who is saying it (a fat white bloke);
  2. the reference to the attempts by Microsoft [gizmodo.com] to come up with 'hip' raps and be 'down with the kids' and the idea that Stallman might try to emulate that;
  3. it's a shit pun, that purposefully sounds like something a marketing department would come up with;

Wow, some of you Americans just don't get subtle humour. Kudos to those who do however. ;)

/me prepares to be modded off-topic (I never understood that one).

Shaun of the Dead is racist, sexist and heteronormative though. Plus there are no major minority or disabled characters. Unless you count the zombies as a disabled minority. But if you do that, then you can finally see how offensive the film really is.

Essentially it's two white men battling to defend their culture against the 'inferior' culture of a disabled minority. But it's notable that the 'inferior' culture seems much more representative of modern Britain. There are clearly female zombies, gay zombies and zombies of colour. Perhaps that is what is so threatening to the two 'heroes' of the film. Perhaps the gay zombies remind them of the homosexual undertones to their own 'friendship'.

Re:Yawn (0, Offtopic)

Dishevel (1105119) | more than 6 years ago | (#23544527)

Wow. I am still kinda hoping that sooner or later something will come along in the universe that makes it difficult for pansy ass PC bitches like you to breed so we can finally watch the PC's who are constantly fucking up our society with their "You hurt my feelers!" shit go away.

Re:Yawn (0)

Anonymous Coward | more than 6 years ago | (#23544567)

"HURD up to ma Niggaz."

Ouch how embarrassing. You must have not gotten the memo but "Niggaz" is no longer pc ... the hip and socially acceptable term is "Niggahz"

Re:Yawn (5, Funny)

Anonymous Coward | more than 6 years ago | (#23542715)

The future according to the past...the present.

Zen koan. (2, Funny)

Hankapobe (1290722) | more than 6 years ago | (#23543225)

The future according to the past...the present.

Ahhhh. You are a Zen Master. Please, teach us more!

Re:Yawn (3, Insightful)

darthflo (1095225) | more than 6 years ago | (#23543035)

Three things:
- None of the futures you mentioned contradicts any of the others. Quite obviously Blue Bonnet won't predict the future of the storage market, and Minute Maid won't be the first company to know about new processes in CPU manufacturing.
- What's the future according to Minute Maid anyway? Really, I'm intrigued!
- Did you notice the interesting parallel between the future according to ATT and where the American government seems to be steering? More bars in more places (and as many people behind them as possible(?))? What a strange coincidence...

Re:Yawn (4, Funny)

ceoyoyo (59147) | more than 6 years ago | (#23544107)

Now what's wrong with more bars with more bartenders behind them so you get your drinks faster? Really, all this criticism of the American government when they're really trying to do something quite noble. :P

Re:Yawn (1)

darthflo (1095225) | more than 6 years ago | (#23544137)

Well played, sir.

Re:Yawn (0)

Anonymous Coward | more than 6 years ago | (#23543065)

the future according to [Alice] Blue Bonnet
...is that she marries Johnny Fedora.

Re:Yawn (1)

mikael (484) | more than 6 years ago | (#23543539)

The future according to Sun or IBM - faster CPUs using more cores per processor.

The future according to Nvidia - faster GPUs using more stream processors.

Re:Cell Proc? (1)

notdotcom.com (1021409) | more than 6 years ago | (#23543679)

Where does IBM/Sony's Cell processor fall in the CPU/GPU battle? IBM most certainly plans to use it in PCs as a CPU, but wasn't most of the initial development focused on making it a better GPUing CPU?

Re:Cell Proc? (1)

Hal_Porter (817932) | more than 6 years ago | (#23545011)

The Cell would suck in a PC. There's an underpowered, in-order PowerPC core and a bunch of SPEs. But SPEs are for signal processing, not general computation. They only have 256K of memory, smaller than the cache inside a desktop CPU. They don't have access to main memory or an MMU. Even if you added an MMU and paged from main memory to the SPEs, it still wouldn't help. The bus would be saturated by page misses and the SPEs would spend all their time waiting.

Even in a games console it's probably hard to keep all the SPEs usefully employed.

In the Year 2000 ... (1)

j3tt (859525) | more than 6 years ago | (#23543811)

Conan: "... It's time, once again, to look into the future."

Guest: "The future, Conan?"

Conan: "That's right, Let's look to the future, all the way to the year 2000!"

and then ... La Bamba's high falsetto ... "In the Year 2000"

"In the Year 2000"

More GPGPU = More Parallelism (1)

StCredZero (169093) | more than 6 years ago | (#23547003)

In a way, nVidia's message is the same as that of the Cell chip. There will be more and more use of parallelism, with the CPU (or a particular CPU on symmetric multi-core systems) acting as a kind of foreman for a troop of processors working in parallel.

Not all of the uses for the gobs of cheap parallel processing power are apparent yet. But people will find cool things to use it for, just as they found cool things to use home computers for. In a way, we are now going through a home supercomputing revolution.
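
A small sketch of the "foreman" pattern described above, written in CUDA since that is the GPGPU route nVidia is pushing (the kernel and the "other work" below are made-up placeholders, not anything from the article). Kernel launches are asynchronous, so the CPU hands off the bulk data-parallel work and keeps doing its own serial work while the GPU crunches.

    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    // Each GPU "worker" thread squares one element of the array.
    __global__ void square(float *data, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) data[i] = data[i] * data[i];
    }

    int main() {
        const int n = 1 << 20;
        size_t bytes = n * sizeof(float);
        float *h = (float*)malloc(bytes);
        for (int i = 0; i < n; ++i) h[i] = (float)i;

        float *d;
        cudaMalloc((void**)&d, bytes);
        cudaMemcpy(d, h, bytes, cudaMemcpyHostToDevice);

        // The CPU acts as the foreman: it hands the data-parallel work to the GPU...
        square<<<(n + 255) / 256, 256>>>(d, n);

        // ...and, because the launch is asynchronous, it is free to do its own
        // serial, branchy work while the troop of GPU threads crunches numbers.
        double other_work = 0.0;
        for (int i = 0; i < 1000; ++i) other_work += i;

        // The blocking copy back implicitly waits for the kernel to finish.
        cudaMemcpy(h, d, bytes, cudaMemcpyDeviceToHost);
        printf("h[3] = %.0f (expected 9), foreman's side work = %.0f\n", h[3], other_work);

        cudaFree(d); free(h);
        return 0;
    }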

Price / Performance works for me (4, Informative)

Rog7 (182880) | more than 6 years ago | (#23542565)

I'm all for it.

The more competition the better.

Anyone who worries too much about the cost a good GPU adds to the price of a PC doesn't remember what it was like when Intel was the only serious player in the CPU market.

This kind of future, to me, spells higher bang for the buck.

Really? (1)

Linker3000 (626634) | more than 6 years ago | (#23542567)

"the 'can of whoop-ass' nVidia has promised to open on Intel."

Yep, I'm sure the Intel Devs have all taken a sabbatical.

Re:Really? (3, Informative)

NMerriam (15122) | more than 6 years ago | (#23542677)

Yep, I'm sure the Intel Devs have all taken a sabbatical.


The ones that work on GPUs? I'm not sure they ever even showed up for their first day of work.

Re:Really? (0)

Anonymous Coward | more than 6 years ago | (#23542895)

no, they are just taking a few days off on account of their badly whooped asses still being sore.

Surprise, Surprise... (4, Insightful)

allcar (1111567) | more than 6 years ago | (#23542575)

The leading manufacturer of GPUs wants GPUs to become ever more important.

Re:Surprise, Surprise... (2, Interesting)

Anonymous Coward | more than 6 years ago | (#23543627)

I am a bit skeptical. If AMD's experimentation with combining the CPU and GPU bears fruit, it might actually mean the end for traditional GPUs. nVidia doesn't have a CPU that can compete with AMD and Intel, so I think nVidia is the one in trouble here. But I suppose nVidia has to keep up appearances to keep its stock from plummeting.

Re:Surprise, Surprise... (1)

Malekin (1079147) | more than 6 years ago | (#23544007)

If AMD's experimentation with combining the CPU and GPU bears fruit, it might actually mean the end for traditional GPUs.
Until the wheel of reinvention turns once again and people start realising that they can find performance gains by splitting the graphics processing load out onto special hardware.

Re:Surprise, Surprise... (2, Interesting)

pdusen (1146399) | more than 6 years ago | (#23545399)

Right. The special hardware being separate graphics-optimized cores, in this case.

Re:Surprise, Surprise... (1)

chthon (580889) | more than 6 years ago | (#23545487)

This answer is very interesting, because I seem to remember that MMX was introduced because Philips planned to create specialty co-processor boards (around '96/'97) to off-load multimedia tasks, so that sound processing would take fewer CPU cycles and to introduce video processing. Intel did not like this idea and added MMX just to cut off such things.

Re:Surprise, Surprise... (1)

canuck57 (662392) | more than 6 years ago | (#23547839)

I am a bit skeptical. If AMD's experimentation with combining the CPU and GPU bears fruit, it might actually mean the end for traditional GPUs. nVidia doesn't have a CPU that can compete with AMD and Intel, so I think nVidia is the one in trouble here. But I suppose nVidia has to keep up appearances to keep its stock from plummeting.

I would concur with that. But I'd add that nVidia is also missing an OS and applications. While that is an extension of not having a CPU binary-compatible with Intel and AMD, nVidia is totally absent here.

My guess is that Intel has the weakest video, is perhaps talking to nVidia, and nVidia is trying to pump up its value. Meanwhile, AMD is trying to see how best to integrate the CPU and GPU. That is, for nVidia this is about politics and price.

Some problems today's GPUs have: they run too hot, draw too much power and cost too much. GPU makers need to work on these. Only gamers are going to put 100 watts into gaming cards; the rest of us want fanless, low power and low cost. nVidia seems to be ignoring this, and it is going to come back as a backlash.

Sounds like BS (1, Flamebait)

gweihir (88907) | more than 6 years ago | (#23542589)

And nVidia has been spouting a lot of this lately. Is the company in trouble, with the top executives now trying to avoid that impression by constantly talking about how bright the future is for their company? Quite possible, I would say.

As to the claim that the GPU will replace the CPU: not likely. This is just the co-processor idea in disguise. Eventually this idea will fade again, except for some very specific tasks. A lot of things cannot be done efficiently on a GPU. I have to say I find the idea of the GPU actually becoming less important, because of more available general-purpose CPUs, a lot more convincing. And it has the advantage of far simpler hardware. nVidia GPUs especially have been riddled with problems for some time now; CPUs (either AMD or Intel) have not.

Re:Sounds like BS (1)

dreamchaser (49529) | more than 6 years ago | (#23542619)

I agree. In TFA they refer to the GPU taking over more and more, specifically in gaming applications, but even then, the more you free up the main CPU to do other things like AI, the better games will hopefully get. Recall Weitek, who made a much faster (albeit single-precision) math coprocessor than Intel's own 80387: it would be like Weitek saying "We foresee that you'll need that 386 far less in the future."

If nVidia or any other GPU manufacturer tries to get too generalized, they run the risk of being good at lots of stuff but not the best at any one thing, as well.

Re:Sounds like BS (0)

Anonymous Coward | more than 6 years ago | (#23542857)

I think they're scared shitless by Intel's more serious plans for the GPU market, starting with the Nehalem architecture next year.

Considering how much cash (and how many talented chip designers) Intel has, they should be. I hope they can do more than just talk, or we could end up with very little competition in 5-10 years, for both GPUs and CPUs (which may or may not be the same thing then).

Re:Sounds like BS (1)

smallfries (601545) | more than 6 years ago | (#23544321)

But nVidia didn't claim that the GPU would replace the CPU. They even went to some lengths to deny it in the article. The version that the poster linked to was devoid of details; a better description is available at the Inquirer [theinquirer.net], who weren't under NDA.

The highlight of the press conference seems to be the censored part revealing that nVidia will be fab'ing ARM-11s in the near future, in direct competition with the Intel Atom. Looks like they're not planning to go down without a fight...

Competing (3, Insightful)

Yetihehe (971185) | more than 6 years ago | (#23542593)

FTFA:

basically more GPGPU usage (i.e. the use of the graphics chip to process regular programs) and the co-existence of "competing" technologies like ray tracing and rasterization
Hmm, they aren't really competing technologies. Ray tracing CAN be an extension of rasterization; some RT algorithms even use some form of rasterization for visibility testing... But if nVidia doesn't embrace RT, they risk falling to second place (no, not going extinct, as you can do RT on nVidia cards today, but it would be better with a native API and better hardware support).
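
To the parent's point that RT already runs on nVidia cards via the compute path, here is a minimal, hypothetical sketch of per-pixel ray tracing written as a CUDA kernel. The hard-coded camera, sphere and output file name are made up for illustration; a real ray tracer would add acceleration structures, materials and secondary rays.

    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    // One GPU thread per pixel: cast a primary ray and shade a single hard-coded sphere.
    __global__ void trace(unsigned char *img, int w, int h) {
        int x = blockIdx.x * blockDim.x + threadIdx.x;
        int y = blockIdx.y * blockDim.y + threadIdx.y;
        if (x >= w || y >= h) return;

        // Camera at the origin looking down -z; unit sphere centred at (0, 0, -3).
        float dx = 2.0f * x / w - 1.0f, dy = 2.0f * y / h - 1.0f, dz = -1.0f;
        float len = sqrtf(dx * dx + dy * dy + dz * dz);
        dx /= len; dy /= len; dz /= len;

        // Ray-sphere intersection: solve t^2 + b*t + c = 0, with oc = origin - centre.
        float ocx = 0.0f, ocy = 0.0f, ocz = 3.0f, r = 1.0f;
        float b = 2.0f * (dx * ocx + dy * ocy + dz * ocz);
        float c = ocx * ocx + ocy * ocy + ocz * ocz - r * r;
        float disc = b * b - 4.0f * c;

        unsigned char shade = 0;
        if (disc > 0.0f) {
            float t = (-b - sqrtf(disc)) * 0.5f;           // nearest hit distance
            if (t > 0.0f) shade = (unsigned char)(255.0f * fminf(1.0f, 2.0f / t));
        }
        img[y * w + x] = shade;                            // greyscale depth-shaded hit
    }

    int main() {
        const int w = 512, h = 512;
        unsigned char *d_img, *h_img = (unsigned char*)malloc(w * h);
        cudaMalloc((void**)&d_img, w * h);

        dim3 block(16, 16), grid((w + 15) / 16, (h + 15) / 16);
        trace<<<grid, block>>>(d_img, w, h);
        cudaMemcpy(h_img, d_img, w * h, cudaMemcpyDeviceToHost);

        FILE *f = fopen("sphere.pgm", "wb");               // simple binary PGM output
        fprintf(f, "P5\n%d %d\n255\n", w, h);
        fwrite(h_img, 1, (size_t)w * h, f);
        fclose(f);

        cudaFree(d_img); free(h_img);
        return 0;
    }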

Well API isn't their department (3, Interesting)

Sycraft-fu (314770) | more than 6 years ago | (#23542661)

nVidia doesn't do the APIs for their cards. They have no proprietary API; their native APIs are DirectX and OpenGL. In fact, the advances in those APIs, more specifically DirectX, often determine the features they work on. The graphics card companies have a dialogue with MS on these matters.

This could be an area that OpenGL takes the lead in, as DirectX is still rasterization based for now. However it seems that while DirectX leads the hardware (the new DX software comes out usually about the time the hardware companies have hardware to run it) OpenGL trails it rather badly. 3.0 was supposed to be out by now, but they are dragging their feet badly and have no date when it'll be final.

I imagine that if MS wants raytracing in DirectX, nVidia will support it. For the most part, if MS makes it part of the DirectX spec, hardware companies work to support that in hardware since DirectX is the major force in games. Until then I doubt they'll go out of their way. No reason to add a bunch of hardware to do something if the major APIs don't support it. Very few developers are going to implement something that requires special coding to do, especially if it works on only one brand of card.

I remember back when Matrox added bump mapping to their cards. There were very few (like two) titles that used it, because it wasn't a standard thing. It didn't start getting used until later, when all cards supported it as a consequence of having shaders that could do it and it was part of the APIs.

Re:Well API isn't their department (1)

nawcom (941663) | more than 6 years ago | (#23542681)

You probably know a bit more about OpenGL development than I do. How many people are involved with its development? What companies? OpenGL is one open source project I wish were ahead of the times like some other areas of computer development, but unfortunately it isn't.

Re:Well API isn't their department (1)

LingNoi (1066278) | more than 6 years ago | (#23542775)

openGL is an open source project?

Re:Well API isn't their department (1, Funny)

Anonymous Coward | more than 6 years ago | (#23543119)

>openGL is an open source project?
Well, DUH! It has the word "open" in it, doesn't it?

Re:Well API isn't their department (0)

Anonymous Coward | more than 6 years ago | (#23543355)

Microsoft Office Open XML aka ISO 29500. I rest my case.

Re:Well API isn't their department (4, Informative)

ardor (673957) | more than 6 years ago | (#23542933)

nVidia doesn't do the APIs for their cards.
Wrong. [nvidia.com]
GPGPU absolutely demands specialized APIs - forget D3D and OGL for it. These two don't even guarantee any floating point precision, which is no big deal for games, but deadly for GPGPU tasks.
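
A small sketch of why a dedicated compute API matters: things like on-chip shared memory, explicit thread synchronisation and plain IEEE single-precision arithmetic are exposed directly in CUDA but aren't addressable through a D3D9/OpenGL-era graphics pipeline. The kernel, block size and data below are illustrative only.

    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    // Per-block sum reduction using on-chip shared memory and barrier synchronisation --
    // exactly the kind of general-purpose construct a graphics API doesn't expose.
    __global__ void block_sum(const float *in, float *out, int n) {
        __shared__ float cache[256];
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        cache[threadIdx.x] = (i < n) ? in[i] : 0.0f;
        __syncthreads();

        // Tree reduction within the block.
        for (int stride = blockDim.x / 2; stride > 0; stride >>= 1) {
            if (threadIdx.x < stride)
                cache[threadIdx.x] += cache[threadIdx.x + stride];
            __syncthreads();
        }
        if (threadIdx.x == 0) out[blockIdx.x] = cache[0];
    }

    int main() {
        const int n = 1 << 20, threads = 256, blocks = (n + threads - 1) / threads;
        float *h_in = (float*)malloc(n * sizeof(float));
        float *h_out = (float*)malloc(blocks * sizeof(float));
        for (int i = 0; i < n; ++i) h_in[i] = 1.0f;

        float *d_in, *d_out;
        cudaMalloc((void**)&d_in, n * sizeof(float));
        cudaMalloc((void**)&d_out, blocks * sizeof(float));
        cudaMemcpy(d_in, h_in, n * sizeof(float), cudaMemcpyHostToDevice);

        block_sum<<<blocks, threads>>>(d_in, d_out, n);
        cudaMemcpy(h_out, d_out, blocks * sizeof(float), cudaMemcpyDeviceToHost);

        double total = 0.0;                      // the CPU finishes off the partial sums
        for (int b = 0; b < blocks; ++b) total += h_out[b];
        printf("sum = %.0f (expected %d)\n", total, n);

        cudaFree(d_in); cudaFree(d_out); free(h_in); free(h_out);
        return 0;
    }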

Will they become platform supplier? (4, Interesting)

sznupi (719324) | more than 6 years ago | (#23542621)

I was wondering about this... Now that nVidia wants the CPU to lose its importance _and_ they have started to cooperate with Via on chipsets for Via CPUs (which perhaps aren't the fastest... but I've heard the latest Isaiah core is quite capable), will we see some kind of merger?

That would be great! (1)

Hankapobe (1290722) | more than 6 years ago | (#23543243)

I was wondering about this... Now that nVidia wants the CPU to lose its importance _and_ they have started to cooperate with Via on chipsets for Via CPUs (which perhaps aren't the fastest... but I've heard the latest Isaiah core is quite capable), will we see some kind of merger?

Wouldn't that be great! It's about time that graphics processing, IO, and other things are sent to their own processors. Anyway, wasn't that done before - Amiga?

Re:That would be great! (1)

sznupi (719324) | more than 6 years ago | (#23545789)

Uhmmm...I was thinking more about what AMD and Intel are doing when it comes to owning all components for their platform...

Re:Will they become platform supplier? (1)

Hal_Porter (817932) | more than 6 years ago | (#23545045)

Via Isaiah isn't fast enough for games. And NVidia target gamers. So no, unless Via are about to announce an uber x86 implementation.

Re:Will they become platform supplier? (1)

sznupi (719324) | more than 6 years ago | (#23545747)

Nvidia doesn't target solely gamers, otherwise the GF 6100/6200/7300/7600/8200/8300/8500 wouldn't be available at the moment. Add to that that possibly the most profitable market segment now consists of cheap laptops, where it's certainly better to have a full platform available to OEMs.

Also, those early tests
http://techreport.com/discussions.x/14584 [techreport.com]
suggest that Isaiah, when it comes to performance per clock, is finally comparable with AMD/Intel. Who knows what we'll see later...

PS. Games are _the_ only thing (and not that many people play them...) for which you'd currently need a faster CPU than the cheapest one... plus it's still better to have a slower CPU and faster GPU than the other way around.

Re:Will they become platform supplier? (1)

Hal_Porter (817932) | more than 6 years ago | (#23546601)

In a very cheap laptop a la OLPC you'd be better off with Intel integrated graphics.

In fact I think even that is overkill - you could add a framebuffer, hardware cursor and a blitter to the core chipset and steal some system RAM for the actual video memory. Negligible die area and low power consumption.

Re:Will they become platform supplier? (1)

sznupi (719324) | more than 6 years ago | (#23546805)

I'm thinking more about the cheap 15.4" "desktop replacement" laptops with pathetic/"doesn't matter" battery life that, from what I see everywhere, dominate sales

(nvm that I absolutely hate them - I prefer something more portable, but economy of scale doesn't work to my advantage)

patentdead randoidian megasloths going DOWn (-1, Troll)

Anonymous Coward | more than 6 years ago | (#23542909)

the whole crooked greed/fear/ego based system is badly infactdead.

recipe for success; "If my people, which are called by my name, shall humble themselves, and pray, and seek my face, and turn from their wicked ways; then will I hear from heaven, and will forgive their sin, and will heal their land."

GPU is great if you want 128 processors at 50 MHz (0)

Anonymous Coward | more than 6 years ago | (#23542975)

The GPU is a massively parallel computing machine. The cores are 50 MHz. This is fine if you can parallelize your workload, as you easily can for video, but for general CPU needs you are executing very slowly if you use only one 50 MHz core, as you likely will.
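
Whatever the exact clock figures, the underlying point -- that a single GPU thread is very slow and all the speed comes from running thousands of them -- is easy to see with a sketch like the following (a hypothetical comparison in CUDA; the kernel and sizes are made up):

    #include <cstdio>
    #include <cuda_runtime.h>

    // Grid-stride loop: the same kernel works whether we launch 1 thread or 65,536.
    __global__ void scale(float *data, int n, float factor) {
        for (int i = blockIdx.x * blockDim.x + threadIdx.x; i < n;
             i += blockDim.x * gridDim.x)
            data[i] *= factor;
    }

    // Time one launch configuration with CUDA events.
    static float time_launch(float *d, int n, int blocks, int threads) {
        cudaEvent_t start, stop;
        cudaEventCreate(&start);
        cudaEventCreate(&stop);
        cudaEventRecord(start);
        scale<<<blocks, threads>>>(d, n, 1.0001f);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);
        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        cudaEventDestroy(start);
        cudaEventDestroy(stop);
        return ms;
    }

    int main() {
        const int n = 1 << 22;
        float *d;
        cudaMalloc((void**)&d, n * sizeof(float));
        cudaMemset(d, 0, n * sizeof(float));

        // One lone GPU thread crawls through all 4M elements in serial...
        printf("1 thread  : %8.2f ms\n", time_launch(d, n, 1, 1));
        // ...while a full grid chews through the same work in parallel.
        printf("full grid : %8.2f ms\n", time_launch(d, n, 256, 256));

        cudaFree(d);
        return 0;
    }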

Raytracing (1)

Lord Lode (1290856) | more than 6 years ago | (#23543047)

Hardware-accelerated ray tracing could also be interesting for speeding up non-realtime rendering, such as for making movies!

Re:Raytracing (1)

urbanriot (924981) | more than 6 years ago | (#23544161)

Wasn't Nvidia (and many others on Slashdot) bashing Intel for researching and announcing the inclusion of ray tracing support in future chipsets?

nVidia's split personality (4, Interesting)

argent (18001) | more than 6 years ago | (#23544671)

Nvidia's chief scientist, David Kirk, is really down on raytracing and particularly on dedicated raytracing hardware.

http://scarydevil.com/~peter/io/raytracing-vs-rasterization.html [scarydevil.com]

However... Dr Philipp Slusallek, who demonstrated how even a really slow FPGA implementation of raytracing hardware could kick general purpose processors (whether CPU or GPGPU) butts in 2005, has been working as a "Visiting Professor" at nVidia since October 2007.

They're still playing their cards close to their chest.

Re:nVidia's split personality (2, Informative)

flabbergast (620919) | more than 6 years ago | (#23546441)

And the ray tracing group at Utah, Pete Shirley and Steve Parker, now both work for nVidia as well.

MY - YES - MY can of WOOPASS, Tesla has arrived! (1)

itsybitsy (149808) | more than 6 years ago | (#23543167)

It arrived Friday. Wow. 430 to 512 BILLION FLOATING POINT OPERATIONS PER SECOND! Need I say any more? Yummy, now I get to play with it!

Future? How about the present? (1)

hazee (728152) | more than 6 years ago | (#23543229)

I made sure that my current machine had an nVidia graphics chip, so that I could play with stuff like CUDA. But my machine also runs Vista and, some 18 months after its release, there still isn't a stable version of CUDA for Vista. Plus, seeing as my machine's a laptop, I doubt that even the beta drivers available from nVidia would install, seeing as how they're prone to playing silly buggers when it comes to laptop chip support.

So nVidia, instead of spouting off about how great the future's going to be, how about supporting some of the people who are trying to run your hardware in the present, eh?

Re:Future? How about the present? (1)

Ant P. (974313) | more than 6 years ago | (#23543871)

This is the thing that's wrong with nVidia. They're so obsessed with having the fastest hardware tomorrow that they fail to notice people leaving them in droves today for better hardware (power consumption/drivers/open specs).

It's a bit strange that they'd support CUDA on Linux but not Vista, though.

so basically... (1)

smash (1351) | more than 6 years ago | (#23543773)

... the pc architecture is going to be catching up to where the Amiga and various consoles have been for the past 2-3 decades (in terms of basic high level design ideas)?

:)

Funny how things work, isn't it.

nVidia is on the wrong path of computing (1)

cjjjer (530715) | more than 6 years ago | (#23543917)

Maybe instead of worrying about the next generation of GPUs, they should be more concerned with fixing their shitty driver support for Vista. I have a laptop that came with Vista, but the driver support is so useless and buggy I had to downgrade to XP. The control panel options under XP vs. Vista are night and day.

Because of all the problems I have had with it I doubt I will ever buy another laptop/motherboard with a nVidia product on it.

It's hard to come up with the latest and greatest when nobody will buy your products ... Dumbass ...

Haven't we seen this all before? (1)

Spencer60 (1295971) | more than 6 years ago | (#23543921)

Back in the day, if you ran 'math-intensive' software it would look for an 8087 math co-processor and load special code libraries in Lotus 123 to speed up calc performance. Once Intel had the chip real estate to spare, though, this special-purpose chip got subsumed into the CPU. As Intel keeps driving the transistor count up, they will be perfectly capable of embedding a full-featured 'streams' processor into their CPUs. It won't happen right away, but it solves the issue of different code libraries (a software company's nightmare) and will be good enough for 80% of the folks out there.

High-end graphics will always need a high-performance, dedicated solution, but the market will be smaller (which I think is nVidia's first worry). Add to this some gaming companies' recent threats to abandon the PC as a game platform due to piracy concerns, and their market could get much smaller. Hence all the bluster about GPUs taking over for the CPU. Historically, it's always been the CPU that takes over special-purpose functions, not the other way around (at least in the Intel space). No one else has the cash and the facilities (same thing, really) to drive the transistor count up enough to do this trick.

Re:Haven't we seen this all before? (1)

m50d (797211) | more than 6 years ago | (#23545111)

Historicaly, it's always been the CPU that takes over special-purpose functions, not the other way around (at least in the Intel space).

It has indeed, and there's certainly grounds for caution, but there's a chance that this time it's different. There are different silicon processes involved in making a fast vector processor (as one needs for GPUs) compared to making a CPU, so putting them together isn't simply a matter of finding enough space in the package. Couple this with the fact that CPUs are already short of space on the motherboard for the fast link they need to main memory, and GPUs need an even faster one to their memory, and suddenly combining CPU and GPU seems a downright bad idea.

Of course, in the long term the whole system's going to be a single chip, but I think it doesn't make sense to move the GPU in with the CPU until you've already moved main memory, which is still some years off. (Transmeta tried it about a decade ago, and the technology wasn't up to it; maybe it's time someone gave it another go?)

The merging of CPU and GPU (1)

Lida Tang (1296025) | more than 6 years ago | (#23544755)

Intel obviously sees the threat from the GPU makers, but their attempts at breaking into the GPU market haven't been very successful.

Their next-generation effort is called Larrabee [theinquirer.net], which uses multiple x86 cores linked with a ring bus.

It actually reminds me of the PS3's SPU setup, but Intel is using the GPU functionality as a wedge into the GPU market instead of pushing it for general computation. Still, since standard C code will work on it, you can rewrite the entire stack to be a physics co-processor or a fold@home client.

Ultimately, I see the CPU/GPU separation disappearing, the two merging into one chip, much like FPU and sound card functionality did.

The Future of Computers: The GPFPGA? (1)

FurtiveGlancer (1274746) | more than 6 years ago | (#23544939)

Why not employ numerous Field Programmable Gate Arrays (FPGAs) instead of a CPU? You could program one or more FPGAs to be optimized to execute each of the functions the software needs. Need more FLOPS? Program for that. Need scalar computation? Program for that. Seven FPGAs running one way and nine running another. At some point, FPGAs may completely replace the CPU as we know it today. The HPC community is already looking at this possibility for some types of computations.

This has been done before. (1)

Tokerat (150341) | more than 6 years ago | (#23545077)

nVidia seems to be a little late to this game [wikipedia.org] . ;-)

Bigger and bigger (1)

phorm (591458) | more than 6 years ago | (#23545145)

Focus generally seems to be on "bigger" as opposed to "more efficient": add more cores, increase the frequency, etc.

Some other efforts focus on "trimmed down and more efficient" but then tend to fail in the power-output arena.

I was wondering how difficult it might be to make a motherboard or graphics card with multiple processors: one small one for general-purpose computing (basic surfing, word processing, 2D graphics or basic 3D), and a bigger one that could "kick in" when needed, like an overdrive engine, but otherwise unpowered and sleeping until needed.

Many CPUs already have power states, but generally these aren't as efficient as a CPU specifically designed for low power/efficiency. Same for GPUs. So how about cores or processors that come online on demand?

Re:Bigger and bigger (1)

IKnwThePiecesFt (693955) | more than 6 years ago | (#23546159)

This is essentially what's been proposed by NVidia and ATI as Hybrid SLI and, I believe, Hybrid Crossfire. You essentially have an integrated GeForce 6150 (or other power-efficient chipset graphics) and then a discrete 8800 (or other high end watt sucker). When you launch a game, the video switches over to the 8800, giving you high performance, but when you quit and go back to your desktop, the 8800 is powered off and the 6150 takes over.

Sounds like a good idea (1)

phorm (591458) | more than 6 years ago | (#23547191)

Actually, with most motherboards coming with onboard video (that is usually less powerful than the add-on GPU), this sounds like a really good idea. Of course, in this case you'd need a compatible card (onboard ATI + add-on ATI, or onboard nVidia + add-on nVidia). I wonder if it could be standardized so that the lower-power onboard GPU could be switched down and allow a passthrough for the add-on AGP card (or vice versa, since the add-on card is more likely to have extra ports such as DVI than the onboard video, which has limited space).

Scare and FUD (0)

Anonymous Coward | more than 6 years ago | (#23545379)

nVidia is scared by Intel's push. Intel is putting a lot of money into getting the brightest graphics heads out there.
Ray tracing is for marketing and managers only. Marketing has to make sure that all bases are covered, but marketing people don't really know much about actual technology.
Managers know more, but often not enough to realize that ray tracing by itself means nothing.
Possibly most graphics programmers are also confused about the feasibility of ray tracing in real time. To see so many so-called experts get dragged into this ray-tracing FUD is both sad and funny.

It's also sad to see nVidia getting into this silly game just to make sure that they aren't falling behind on the marketing side.

silly (1)

nguy (1207026) | more than 6 years ago | (#23545639)

Vector coprocessors, array processors, and all that have been around for ages. Maybe they'll finally catch on. If they do, you can bet that the manufacturer making them will not be a graphics card manufacturer. In fact, by definition, they won't be a graphics card manufacturer, since they will be making co-processors for non-graphics applications.

But I don't think they will catch on. It makes little sense for people to stick extra cards into their machines for computation. Instead, you'll probably see something more like Cell, a combination of CPU plus vectorized coprocessors on a single chip, and if you plug multiple of them into the motherboard, you get the right mix of CPU and coprocessor performance.

How About Some Backwards Compatibility? (1)

fyrie (604735) | more than 6 years ago | (#23545691)

Seriously.... Over the past few years Nvidia has shown me that they couldn't care less whether games that aren't brand new will run on their cards. Older games that use palettized textures and 16-bit function calls look horrible on the newer cards. This is something they could fix easily in software if they wanted to.

I call bullshit (1, Interesting)

Anonymous Coward | more than 6 years ago | (#23546887)

According to nVidia the dream gaming system will consist of quad nVidia GPU cores running on top of a nVidia chipset-equipped motherboard, with nVidia-certified "Enthusiast" system components. Meanwhile the company just will not work on LOWERING the power consumption of their graphics cards. Why do we need one-kilowatt power supplies? Because nVidia says so!

Fuck nVidia.