
Nvidia Discloses Details On Next-Gen Fermi GPU

samzenpus posted more than 5 years ago | from the getting-the-skinny dept.


EconolineCrush writes "The Tech Report has published the first details describing the architecture behind Nvidia's upcoming Fermi GPU. More than just a graphics processor, Fermi incorporates many enhancements targeted specifically at general-purpose computing, such as better support for double-precision math, improved internal scheduling and switching, and more robust tools for developers. Plus, you know, more cores. Some questions about the chip remain unanswered, but it's not expected to arrive until later this year or early next."


But does it... (4, Interesting)

popo (107611) | about 5 years ago | (#29600369)

... run Linux?

Re:But does it... (1, Insightful)

Anonymous Coward | about 5 years ago | (#29600413)

More importantly, does it run physx in a machine that also has a non-nvidia gpu?

Oh, wait. No, it doesn't [slashdot.org].

Re:But does it... (-1, Troll)

Anonymous Coward | about 5 years ago | (#29600457)

commodore64_love [slashdot.org] did my ass last night, bitch!

Re:But does it... (0)

Anonymous Coward | about 5 years ago | (#29600469)

That's more important than linux support? Funny, I didn't think anyone -- even gamers -- gave a shit about hardware accelerated physx.

Re:But does it... (-1, Troll)

Anonymous Coward | about 5 years ago | (#29600737)

No gamers give a shit about Linux either you dipshit.

Re:But does it... (0)

Anonymous Coward | about 5 years ago | (#29600845)

Most game servers run Linux?

Re:But does it... (0)

Anonymous Coward | about 5 years ago | (#29600863)

Which has no bearing whatsoever on what I said. Some people care about linux. No one cares about hardware accelerated physx.

Re:But does it... (-1, Troll)

Anonymous Coward | about 5 years ago | (#29600479)

Right, because it'll make SUCH a difference in anyone's life...

Grow up, and just use what works.

Re:But does it... (-1, Troll)

Anonymous Coward | about 5 years ago | (#29600495)

If anyone other than nVidia had a worthwhile graphics card, that might be a problem. Oh well, guess it's no big deal.

Re:But does it... (1, Insightful)

PopeRatzo (965947) | about 5 years ago | (#29600791)

More importantly, does it run physx in a machine that also has a non-nvidia gpu?

You understand that these GPUs are made by Nvidia, right? So how could they run something on a machine with a non-Nvidia GPU if the GPUs the article refers to are made by Nvidia?

What exactly were you trying to say? I'm not quite sure.

Re:But does it... (4, Insightful)

skarhand (1628475) | about 5 years ago | (#29600953)

You could have read the link... Theoretically, you could use an ATI card for graphics and a second Nvidia card just for PhysX. Well, not anymore. Nvidia disabled that possibility in the driver. So people with older Nvidia cards who choose to upgrade to the newest Radeon 5800 series will lose PhysX. That kind of business practice reminds me of a certain company from Redmond...

Re:But does it... (2, Insightful)

PopeRatzo (965947) | about 5 years ago | (#29600997)

That kind of business practice reminds me of a certain company from Redmond...

Actually, I can think of at least one other major computer manufacturer who makes products that nerf other manufacturers' products. I think they're located in Cupertino.

Re:But does it... (5, Funny)

Dishevel (1105119) | about 5 years ago | (#29601545)

Now you have done it. Don't you know enough not to wake the fanboys? Now I am gonna have to hear how Jobs is a genius for not allowing people to do things that he believes are unimportant with their hardware. That nobody actually needs any more control than Steve gives them. It's not that I agree with them. I just do not want to hear it. Please just let them have their fanboy dreams in peace. Please!

Re:But does it... (1)

mwvdlee (775178) | about 5 years ago | (#29602099)

IBM isn't located in Cupertino, right? Neither is Sun. Which are you talking about?

(before some pedantic know-it-all kid starts acting up; yes, I do know which company he was talking about, as do all of us).

Re:But does it... (5, Interesting)

Anonymous Coward | about 5 years ago | (#29600961)

Some motherboards have more than one PCI Express slot. Some even come with GPUs built onto the motherboard. In either case, it is entirely conceivable that there may be a GPU present other than the one attached to the display. Then there's the Hydra 200 (look on Anandtech, I'm too lazy to find the link) - a chipset which evenly distributes processing power among multiple GPUs from any vendor to render a scene or do GPGPU computing.

Nvidia just released new drivers which explicitly disable PhysX acceleration in the presence of a GPU from another manufacturer. For the above stated reasons, this is evil.

Re:But does it... (1)

PopeRatzo (965947) | about 5 years ago | (#29600979)

Then there's the Hydra 200 (look on Anandtech, I'm too lazy to find the link) - a chipset which evenly distributes processing power among multiple GPUs from any vendor to render a scene

Really? That's cool.

Nvidia just released new drivers which explicitly disable PhysX acceleration in the presence of a GPU from another manufacturer.

That sucks. Do you know if there are other hardware physics solutions on the market besides PhysX? I know of a couple of software solutions, but don't know of any other hardware ones.

Thanks for explaining the OP for me, AC.

Re:But does it... (0)

Anonymous Coward | about 5 years ago | (#29600969)

I'm a bit confused. Do they mean you need Nvidia chips on the motherboard or do they make a separate card just for physx? Or is it a software issue that only runs on Nvidia hardware?

Re:But does it... (1)

portalcake625 (1488239) | about 5 years ago | (#29601713)

You don't need Nvidia chips on the mobo; PhysX runs on the card itself, not on a separate chip.
TFA says that if you have 1 ATI card, for example, in your 2000-GPU array with 1999 Nvidias, you'll still have no PhysX.

Re:But does it... (-1, Troll)

Anonymous Coward | about 5 years ago | (#29600421)

No. Next question?

Re:But does it... (0)

Anonymous Coward | about 5 years ago | (#29600567)

and, more importantly, if you imagine a beowulf cluster of these, will it run Linux to let you write a message to /. saying frist psot, and
2...
3...profit!

But it won't have an RPU... (1)

argent (18001) | about 5 years ago | (#29600385)

They could fit one of Philipp Slusallek's ray-trace processing units in the corner of the chip and never notice the cost in silicon.

AWESOME (1, Funny)

Icegryphon (715550) | about 5 years ago | (#29600393)

such as better support for double-precision math

BEST NEWS EVAR!

Re:AWESOME (5, Insightful)

ArchMageZeratuL (1276832) | about 5 years ago | (#29600529)

To the best of my knowledge, double-precision floating point operations are actually pretty important for some scientific applications of GPUs, and as such this is significant for those using GPUs as supercomputers.

Re:AWESOME (0)

Anonymous Coward | about 5 years ago | (#29600705)

Correct.

The main reason DP has no real support in GPUs these days is that it is so much slower and only select groups, such as the scientific community, really need it. GPGPU may use this, but I don't see it becoming used in games.

Re:AWESOME (2, Insightful)

Korin43 (881732) | about 5 years ago | (#29600813)

It could also be useful in raytracing. The official reason POV-Ray hasn't been able to use video cards is that they don't have the required precision. That reasoning probably predates CUDA, but "better support" sounds helpful.

Re:AWESOME (1)

jpmorgan (517966) | about 5 years ago | (#29600831)

NVIDIA has had double precision support since the GT200 line. Performance was poor compared to single precision, but it was there.

Re:AWESOME (1)

Penguinoflight (517245) | about 5 years ago | (#29601913)

Do you mean the gtx280 line, or the re-badged (again) 8000 series?

Re:AWESOME (1)

PC and Sony Fanboy (1248258) | about 5 years ago | (#29601571)

better support

Not only is it "better support", but it now has "Awesome Street Cred" and "Totally amazing user experience", thanks to a magical technology licensed from ________ (pick your favorite scapegoat) famous for lock-in and proprietary design!

Can also be useful in graphics (2, Insightful)

Sycraft-fu (314770) | about 5 years ago | (#29600971)

It depends on what you are doing, but when you get something that involves a lot of successive operations, even 32-bit FP can end up not being enough precision. You get truncation errors and those add up to visible artifacts. This could also become more true as displays start to take higher precision input and even more true if we start getting high dynamic range displays (like something that can do ultra-bright when asked) that themselves take floating point data.
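To make the accumulation point concrete, here is a minimal host-side sketch (plain C, hypothetical numbers): it adds the same small step ten million times in single and in double precision. The exact answer is 1,000,000; the float total drifts visibly while the double stays essentially exact.

    #include <stdio.h>

    int main(void)
    {
        /* Add a small step ten million times; once the running sum dwarfs
           the step, a float drops low-order bits on every addition. */
        const int steps = 10000000;
        float  sum_f = 0.0f;
        double sum_d = 0.0;

        for (int i = 0; i < steps; i++) {
            sum_f += 0.1f;
            sum_d += 0.1;
        }

        /* Exact answer is 1,000,000; the single-precision sum is visibly off. */
        printf("float : %f\n", sum_f);
        printf("double: %f\n", sum_d);
        return 0;
    }

The same mechanism, applied over long chains of shader or compositing operations, is what turns into the visible artifacts described above.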

Re:AWESOME (1)

AHuxley (892839) | about 5 years ago | (#29601327)

Then you can dream of a big home made rendering device
http://helmer2.sfe.se/ [helmer2.sfe.se]

Honestly, at this point... (0)

Shikaku (1129753) | about 5 years ago | (#29600417)

Why bother buying a motherboard, CPU and case? Maybe the case to store it in, but graphics cards are so powerful that you could make a full-fledged computer with just one.

Re:Honestly, at this point... (1)

stocke2 (600251) | about 5 years ago | (#29600497)

only for certain operations

Re:Honestly, at this point... (1)

whatajoke (1625715) | about 5 years ago | (#29600501)

What are you trying to say? And why did you get modded insightful for an inane comment?

Re:Honestly, at this point... (3, Funny)

Shikaku (1129753) | about 5 years ago | (#29600645)

Yes.

Re:Honestly, at this point... (0, Troll)

whatajoke (1625715) | about 5 years ago | (#29601403)

Wow. Good to see the mods finding this thread so novel and very relevant to the original topic. Maybe I should now run along with some meme from here [wikipedia.org].

If only I could beat the mods with a clue stick.... Even reddit with its uber-simple mod system does better than this. Either slashdot should do better, or maybe just kick out the moderators and introduce reddit style modding.

Re:Honestly, at this point... (0)

Anonymous Coward | about 5 years ago | (#29601449)

with a UID > 1.6m you clearly know how the mods function.

Re:Honestly, at this point... (1)

PC and Sony Fanboy (1248258) | about 5 years ago | (#29601579)

with a UID > 1.6m you clearly know how the mods function.

Yeah, but he's been lurking since day 1. Or was it day 2? It's so hard to remember when ... HEY YOU, get off my damn lawn!

Re:Honestly, at this point... (1)

whatajoke (1625715) | about 5 years ago | (#29601715)

Anonymous Coward posts used to actually get read by mods and brought to the attention of others. But lately I realised that Anonymous Coward posts were no longer being floated upwards by the mods. So I had to create an account just so my posts would have at least 1 point.

These days if you want to be modded up, ask the same questions that were asked when a similar article appeared earlier, or just roll out endless memes.

Maybe I am deluded about the times when redundant posts were really modded redundant.

Re:Honestly, at this point... (1)

Ethanol-fueled (1125189) | about 5 years ago | (#29600669)

Probably waiting for a future in which computers are just big FPGAs and SoCs defined entirely by their firmware, with peripheral jacks soldered to their pins.

Re:Honestly, at this point... (1)

blackraven14250 (902843) | about 5 years ago | (#29600799)

It's funny, because this kind of does happen: things keep getting integrated into the CPU. It's just that we keep adding new stuff outside the CPU, which keeps literally everything (except power regulation) from being one chip.

Re:Honestly, at this point... (0)

Anonymous Coward | about 5 years ago | (#29600517)

Yes, who cares about networking, disks, audio, or any peripherals.

Re:Honestly, at this point... (1)

Eddi3 (1046882) | about 5 years ago | (#29600523)

Completely OT, but your sig just made my day.

Re:Honestly, at this point... (0)

Anonymous Coward | about 5 years ago | (#29600547)

They are powerful for certain tasks, but they tend to handle lots of small tasks better than "monolithic" tasks. A web browser, for example, run on a graphics card, would be pretty painful - particularly in the case of the HTML parser and javascript and flash engines. Your music player would probably be fine if you don't like FLAC (if I remember, it's mostly integer math, rather than FP math like most other compression codecs). Compilers and text editors would be painful at best... Your games (if you play them) will be hit or miss, excelling in some areas, sucking in others.

I'm sure you get the idea.

How did this get modded "insightful"? (3, Interesting)

Joce640k (829181) | about 5 years ago | (#29600655)

"Ignorant" would be a better rating - there's a lot of compute power but it's in the middle of a very different architecture to an x86 CPU. Not usable for running an OS.

Re:How did this get modded "insightful"? (1)

springbox (853816) | about 5 years ago | (#29601321)

You accidentally the "ing."

Re:How did this get modded "insightful"? (1)

PC and Sony Fanboy (1248258) | about 5 years ago | (#29601585)

but it's in the middle of a very different architecture to an x86 CPU

OMG! It's not x86? It's USELESS!!!!!....

Re:Honestly, at this point... (2, Insightful)

jhulst (1121275) | about 5 years ago | (#29600671)

Sure, they have lots of power, but only when used for parallel tasks. Each individual core is considerably slower than a normal CPU core and much more limited in what it can do.

Re:Honestly, at this point... (1)

Lord Crc (151920) | about 5 years ago | (#29600721)

Why bother buying a computer motherboard, cpu and case? Maybe the case to store it in, but you could make a full fledged computer with just a graphics card they are so powerful.

GPUs vs CPUs is a bit like having 5000 highly trained monkeys vs 5 highly trained people. If your task is easy enough for the GPU, it'll do it blazingly fast. On the other hand, for some tasks the CPU is still the better option.

Re:Honestly, at this point... (1)

mwvdlee (775178) | about 5 years ago | (#29602157)

It's more like comparing a single William Shakespeare with 5000 monkeys that each memorized a small part of Hamlet; the monkeys will be able to write Hamlet in a few seconds, but only Shakespeare is able to write anything other than Hamlet.

Anybody else got any contrived analogies? Something with cars perhaps?

Re:Honestly, at this point... (1)

symbolset (646467) | about 5 years ago | (#29601129)

It's called Larrabee. Coming soon.

linux users can suck horse cock (-1, Flamebait)

Anonymous Coward | about 5 years ago | (#29600453)

Admit you are losers.

Re:linux users can suck horse cock (1)

PC and Sony Fanboy (1248258) | about 5 years ago | (#29601597)

AC posters suck

Fixed that for ya.

I'd have posted AC, but the refresh time is too long.

Re:linux users can suck horse cock (0)

Anonymous Coward | about 5 years ago | (#29602151)

You do realise that horse cock runs the latest kernel, right?

This word "detail"... (2, Funny)

Joce640k (829181) | about 5 years ago | (#29600641)

...I'm not sure it means what you think it means.

So... (1, Interesting)

fuzzyfuzzyfungus (1223518) | about 5 years ago | (#29600709)

Will they also be announcing support for an underfill material that doesn't cause the chip to die after a fairly short period of normal use? And, if they do, will they be lying about it?

Re:So... (2, Funny)

PC and Sony Fanboy (1248258) | about 5 years ago | (#29601607)

OF COURSE, who do you think they are? Apple, Sony, HP and Microsoft?

Nvidia just makes the cards. It isn't their fault if they're not installed, cooled or properly read bedtime stories after use.

Games before hardware (4, Insightful)

DigiShaman (671371) | about 5 years ago | (#29600711)

Back in the day up till the year 2000, I used to upgrade my PC four times a year. The point was to always improve multi-tasking and obtain faster frame rates with higher detail in games that I already have. Since then however, the hardware has always been "good enough" for general computing and playing even the latest/popular games. The only time I'm compelled to upgrade my computer (mainly the video card) is if there's a game out that I love.

Honestly, the only game I'm looking forward to is Diablo3. Even then, my nVidia 8800GT card should be more than sufficient. If not, it would be games like these that will send me over to Newegg to make a purchase. Given the lack of games compounded with hardware that's already decent in the market, I'm willing to bet it's got Intel, AMD, and nVidia scared. Who really wants/needs bleeding-edge technology anymore? Am I wrong in thinking the desire for better video card technology has plateaued in the last few years?

Re:Games before hardware (1)

cjfs (1253208) | about 5 years ago | (#29600797)

Who really wants/needs bleeding-edge technology anymore?

Numbers...must...go...higher... [berkeley.edu]

Re:Games before hardware (1)

Ethanol-fueled (1125189) | about 5 years ago | (#29600809)

Honestly, the only game I'm looking forward to is Diablo3. Even then, my nVidia 8800GT card should be more than sufficient. If not, it would be games like these that will send me over to Newegg to make a purchase.

Recent games are just re-hashes of stale material (or otherwise lacking in storytelling) with emphasis on eye candy to make people continue to buy the latest graphics hardware. The true "good enough"-ers probably don't give a shit about Space Marines and Zombies 10.

Re:Games before hardware (1)

AHuxley (892839) | about 5 years ago | (#29601343)

Perfect for the 640p xbox and ps3 generation.
Real gamers still dream of more than Sony or MS allows you to have ;)

Re:Games before hardware (5, Interesting)

jpmorgan (517966) | about 5 years ago | (#29600815)

Notice the features being marketed: concurrent CUDA kernels, high performance IEEE double-precision floating point performance, multi-level caching and expanded shared memory, high performance atomic global memory operations. NVIDIA doesn't care about you anymore. Excepting a small hardcore, gamers are either playing graphically trivial MMOs (*cough*WoW*cough*) or have moved to consoles.

They won't want to sell you this chip for a hundred bucks, they want to sell it to the HPC world for a couple thousand bucks (or more... some of NVIDIA's current Tesla products are 5 figures). The only gamers they're really interested in these days are on mobile platforms, using Tegra.
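For anyone wondering what an "atomic global memory operation" buys you in practice, here is a minimal CUDA kernel sketch (a made-up histogram, host setup omitted, not something from the article): thousands of threads bump shared counters without stomping on each other's updates.

    // Hypothetical use of a global-memory atomic: many threads update the
    // same 256 bins concurrently without losing increments.
    __global__ void histogram(const unsigned char *data, int n, unsigned int *bins)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            atomicAdd(&bins[data[i]], 1u);   // safe concurrent read-modify-write
    }

Making operations like that (and double precision) faster is the sort of thing that matters to the HPC crowd far more than to gamers, which is why the feature list above reads like an HPC datasheet.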

Re:Games before hardware (1)

VocationalZero (1306233) | about 5 years ago | (#29601393)

Notice the features being marketed: concurrent CUDA kernels, high performance IEEE double-precision floating point performance, multi-level caching and expanded shared memory, high performance atomic global memory operations.

You say this like it is somehow a bad thing.

NVIDIA doesn't care about you anymore.

Are you always this melodramatic, or maybe you work for ATI or something?

Excepting a small hardcore, gamers are either playing graphically trivial MMOs (*cough*WoW*cough*) or have moved to consoles.

Really? So I now qualify as "hardcore" because I casually play games like Unreal 3 and Arkham Asylum? You are so very wrong. The PC game market has taken a hit along with everything else (except short selling) because of the economy, but it is still a billion-dollar industry that's really only going to get bigger.

Re:Games before hardware (0)

Anonymous Coward | about 5 years ago | (#29601757)

They won't want to sell you this chip for a hundred bucks, they want to sell it to the HPC world for a couple thousand bucks (or more... some of NVIDIA's current Tesla products are 5 figures). The only gamers they're really interested in these days are on mobile platforms, using Tegra.

Ah, it looks like we've gone full circle on workstation cards. You used to have to pay a small fortune for graphics rendering, then consumer GPUs brought prices down for the masses, and now we're back to insanely expensive accelerator cards for the scientific market again...

Re:Games before hardware (1)

mwvdlee (775178) | about 5 years ago | (#29602173)

NVIDIA doesn't care about you anymore.

They never did.
Nor does any other company.
Companies care about your money.
Name me one company that actually cares more about you than your money.

Re:Games before hardware (1)

icebraining (1313345) | about 5 years ago | (#29600853)

You do realise your card is less than two years old? In the year 2000 they were releasing the GeForce 2 Ultra; I bet you can't play Doom 3 on it.

Graphics are still very far from "realistic", and until then graphics cards will continue to evolve; the 8800GT may seem more than enough for now, but it won't seem that way a few years from now. Unless, of course, OnLive takes over the PC gaming market.

Re:Games before hardware (1)

LordBullGod (1602191) | about 5 years ago | (#29600867)

I have to agree with you. It seems to me that the power of GPUs has gotten better over the last few years, but for what (from a gamer's view)? I have not found too many games (that is what I use CPU/GPU power for anyway) that really need the amount of power that is offered right now. Yes, ok, for the nice cinematic movies and cut scenes, but REAL game play? I still run a crossfire rig using two-year-old cards (2900XTs), and there has not been too much to challenge it lately. Yea, yea, Crysis makes it work, but it SHOULD. Where is the programming in games to support the GPU power that is available? No I don't program, no I don't develop, but really, where is the game play? I had also heard a lot about physics being supported directly on the GPU... bla bla bla, I don't see much of that either. Great, the cards support it, so where is the programming other than in tech demos? I'm with you Dig, I used to upgrade all the time (2x/yr), but right now, I just don't see the need. Are the manufacturers using us PC folks to test what will and won't work for consoles? I feel that way, and I won't play into it. I'm good for now, but upgrades for me are now 1-2 years apart, not twice a year. Just my view. -BG-

Re:Games before hardware (5, Interesting)

Pulzar (81031) | about 5 years ago | (#29600893)

People have been saying that forever now. I think only the first 2 generations of 3D cards were greeted by universal enthusiasm, while everything else since has had a "who needs that power to run game X" crowd. The truth is, yes, you can run a lot of games with old cards, but you can run them better with newer cards. So, it's just a matter of preference when it comes to the usual gaming.

AMD/ATI is at least doing something fun with all this new power. Since you can run the latest games in 5400x2000 resolutions with high frame rates, why not hook up three monitors to one Radeon 58xx card and play it like this [amd.com]? That wasn't something you could do with an older card.

Similarly, using some of the new video converter apps that make use of the GPU can cut transcoding down from many hours to one hour or less... you can convert your Blu-ray movie to a portable video format much more easily and quickly. Again, something you couldn't do with an old card, and something that was only somewhat useful in the previous generation.

In summary, I think the *need* for more power is less pressing than it used to be, but there's still more and more you can do with new cards.

Re:Games before hardware (1)

VoltageX (845249) | about 5 years ago | (#29601731)

I haven't seen one GPGPU video encoder that doesn't suck - yet.
Even the one NVIDIA is touting, "Badaboom", has a Fisher-Price interface and produces some of the worst-looking H.264 video I've ever seen.

Re:Games before hardware (1)

Nemyst (1383049) | about 5 years ago | (#29601003)

The problem is consoles: with releases showing up on a vast array of systems with wildly different capabilities and most games now coming out on consoles first or at the same time as on the PC, it would make no sense for developers to create a game which would be too complex/heavy to be run on a substantial portion of machines (read: Xbox 360s and PS3s, not even counting the Wii). Thus, games get stale as "old" hardware doesn't get upgraded.

This generation is noticeably different in that consoles now have similar capabilities to PCs and there is no longer that differentiation between PC games and console games, whereas with earlier generations you had a lot of "PC-only" and "console-only" games which were specifically tailored for each medium.

Re:Games before hardware (0, Troll)

AHuxley (892839) | about 5 years ago | (#29601375)

Young coders are too lazy and brainwashed by MS and Sony to think anymore.
You finally have the bandwidth and CPU and GPU to do something and you're stuck dreaming at 640p.
Hack the cards with Linux and dream big. Take computing back from the DRM-laden, locked-down junk that MS and Sony code down to.
You have the OS, now get some graphics freedom too.

Re:Games before hardware (3, Funny)

PC and Sony Fanboy (1248258) | about 5 years ago | (#29601665)

Young coders are too lazy and brainwashed by MS and Sony to think anymore. You finally have the bandwidth and CPU and GPU to do something and you're stuck dreaming at 640p. Hack the cards with Linux and dream big. Take computing back from the DRM-laden, locked-down junk that MS and Sony code down to. You have the OS, now get some graphics freedom too.

I hope you don't contribute to the trunk; your code is 2x as long as it should be, and only half as effective!

Re:Games before hardware (3, Insightful)

Heir Of The Mess (939658) | about 5 years ago | (#29601361)

Since then however, the hardware has always been "good enough"

That's because most games are now being written for consoles and then ported to PC, so the graphics requirements are based on what's in an Xbox 360. Unfortunately consoles are on something like a 5-year cycle. People are now buying a game console + a cheap PC for their other stuff for cheaper than the ol' gaming rig. Makes sense in a way.

Re:Games before hardware (1)

plague911 (1292006) | about 5 years ago | (#29601465)

Yes and no. There is a danger coming to Intel, Nvidia and AMD/ATI within the next 20 years. Computing tech of all kinds will be old hat... for the average consumer. The human eye can only see so many pixels; the human nervous system only has a certain response time. Individuals who only use a computer for day-to-day activities like web/email and a movie or two probably have hit, or are just about to hit, the point where they don't care any more. There are however those of us who still just "like" computers and the cool new tech. So far there is still a big enough change between generations to make it interesting. For example, I can't wait to be able to build a system with an "affordable" SSD within the next year, and I think it would be way sweet to have a 6-core HT (so 12 effective cores) chip so my simulations get done way quick (I'm a college student so I do simulations in MATLAB on my own computer).

The most interesting thing I see coming out of this, however, is Nvidia taking on Intel in the supercomputing field. I honestly see Nvidia getting pushed out of the market if they stay where they are. Intel is trying to get into the GPU market, and both AMD and Intel have more resources than Nvidia.

Re:Games before hardware (1)

PC and Sony Fanboy (1248258) | about 5 years ago | (#29601669)

There is a danger coming to Intel, Nvidia and AMD/ATI within the next 20 years.

NEWS FLASH... When you're 2x as old as you are right now, the world will have changed.

Get off my lawn.

Re:Games before hardware (1)

hairyfeet (841228) | about 5 years ago | (#29601535)

You're not wrong. Most of my customers just rave about how "blazing fast" the new dual and quad AMDs I've been building them are, and I sure as hell ain't been building with the top o' the line chips or GPUs. For most of the tasks folks do with their computers, the chips have gotten well past "good enough" for a couple of years now.

That is why, after being a lifelong Intel+Nvidia guy, I have switched over to AMD builds, even going so far as to eat my own dogfood and build an AMD 7550 dual box for myself. Even in high def these new dual and quad chips paired with the new onboard GPUs give just an incredible bang for the buck, and I was playing Far Cry and Bioshock on the 780V board for a couple of months before breaking down and buying a cheap card for the thing. We truly live in great times as far as processing power goes, when I can build my customers really nice dual cores for less than $550 and quads for less than $750 and still make a nice profit.

And having support for all the major codecs "out of the box" on today's IGPs (DivX, MPEG-2/4, H.263/264) just adds the wonderful icing to the delicious cake. Hell, as I set up a customer's wireless network the other day, he had to call his family in just to show them how the new quad I built him could do LotR in high def while running his browser, messenger, and email all at the same time with nary a hiccup. I just had to smirk at all the "ooohs" and "aaahs" he was getting as he showed off his new PC. And of course that was just the low-end AMD quad paired with an Nvidia IGP (an 8400 I believe), and I have a few customers that, like I did, are gaming on their AMD IGP until they get around to having me pick them up a card, and the framerates are quite smooth when paired with 4GB of decent RAM.

So I'd have to say you hit the nail firmly on the head. While there are still guys out there that are into the whole epeen "my box is teh elite!" stuff, even the lowest of the low end now seems to have surpassed "good enough" by a pretty good margin for what most folks are doing with them. That is one of the reasons I switched to AMD, as the lower price allows me to give them more RAM and HDD, which IMHO matters more today than whether or not you have the latest and greatest in GPU or CPU.

Re:Games before hardware (1)

PC and Sony Fanboy (1248258) | about 5 years ago | (#29601685)

even going so far as to eat my own dogfood

You do realize this implies you work for ATI/AMD? If not, perhaps you should do a quick check on what dogfood means.

According to your article, you're obviously price sensitive, so why would you *ever* pay full price for a product (from your competitor)?

Clearly you need to inform yourself before attempting to inform others.

Re:Games before hardware (1)

sexconker (1179573) | about 5 years ago | (#29602169)

By the time Diablo 3 comes out, your 8800 GT will be dead.
They're all defective.
http://www.theinquirer.net/inquirer/news/1038400/nvidia-g92s-g94-reportedly [theinquirer.net]

I'm with you, though.
As long as my games run acceptably, fuck upgrading.
Eye candy is nice, but fun games are nicer. With more money I can afford more games, or, you know, save it.

With hot new GPUs I play the same old shit with better textures, and then I look around and see 1, maybe 2 games worth buying. Typically, these are the games that play great on older hardware too.

Hindsight is 20/20, and I think I would have done well if I had done the following:

GeForce 2 --> 9800 Pro (ATi) --> 8800 Ultra/GTX

Sorry AMD and Nvidia, unless there's a damned good game I NEED to play immediately, I'll be waiting til you bring out the 6870 (ATi) / 485 (Nvidia). They're coming out in what, another year? I can wait.

GPU to network (1)

soldack (48581) | about 5 years ago | (#29600805)

I wonder when a GPU will be able to directly access a network of some sort. Right now, you would need glue code on the CPU to link multiple GPUs in different systems together. I imagine that some HPC applications would run quite well with 100 GPUs spread over 25-50 machines with a QDR InfiniBand link between them.
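Until GPUs can talk to the network themselves, the CPU-side glue described here looks roughly like this sketch (CUDA plus MPI; the buffer names and halo-exchange framing are made up for illustration): the host copies results off the local GPU, ships them over the interconnect, and copies the neighbor's data back onto the device.

    #include <mpi.h>
    #include <cuda_runtime.h>

    /* Hypothetical boundary exchange between two nodes, one GPU each.
       d_field is a device buffer; h_halo is a host staging buffer. */
    void exchange_halo(float *d_field, float *h_halo, int halo_count,
                       int neighbor_rank)
    {
        size_t bytes = halo_count * sizeof(float);

        /* GPU -> host: the GPU can't reach the NIC, so stage through the CPU. */
        cudaMemcpy(h_halo, d_field, bytes, cudaMemcpyDeviceToHost);

        /* Host -> host over InfiniBand (or whatever MPI runs on). */
        MPI_Sendrecv_replace(h_halo, halo_count, MPI_FLOAT,
                             neighbor_rank, 0,
                             neighbor_rank, 0,
                             MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        /* Host -> GPU: push the neighbor's data back onto the device. */
        cudaMemcpy(d_field, h_halo, bytes, cudaMemcpyHostToDevice);
    }

Every hop through host memory costs latency and PCIe bandwidth, which is exactly why a GPU that could reach the network directly would be interesting.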

Re:GPU to network (1)

ceoyoyo (59147) | about 5 years ago | (#29601177)

Probably not, without major architectural changes. Currently when you want something to run fast on a GPU you really want to have an algorithm where you load everything up and then just let it run.

You could potentially improve that by cutting out the CPU, PCI, etc., but then you're not really talking about a graphics card anymore and you might as well just market it as a stream processor or a competitor to Cell blades.

Re:GPU to network (1)

symbolset (646467) | about 5 years ago | (#29601439)

Intel has moved the northbridge and the GPU onto the processor. I understand the southbridge is next. The keyword is "SOC" or System on a Chip. They're not inventing this here -- others have done it long ago. The southbridge is the part that talks to the physical port controller (PHY) for the network. PHYs will remain discrete components because of their radically different power requirements - until we go optical. But Intel hasn't promised us multiple 10Gbps optical ports on consumer equipment until next year, and 100Gbps per port could be 10 years out.

Larrabee is supposed to be a massively parallel cache-coherent x86 architecture with 24/32/48 cores on one chip.

But of course for serious work you could put four of them in one 2U system and use your fancy InfiniBand to interconnect your 192-core nodes. You'll need some insane power and cooling to fill a rack with 22 of those, but at 172TFlops per rack (if the thing hits 2GHz with 32 cores), maybe it's worth it: you're in the top 20 supercomputers in the world, in one rack. I'd probably go with a shared PCIe bus myself, though, and tie together 8-packs of servers in a tiered arrangement, but I'd put five in each server that wasn't a PCIe hub. I'm cheap that way - I'll go asymmetrical to save a few bucks. On the upside, there's space in there for 172TB of local SAS storage as well. Come to think of it, I might try and fit in some SSD.

The Larrabee add-in cards had better come equipped with external DC power connectors and external water or compressed air connectors so they can fit in a single-wide slot. There's no way the PSUs or thermal solutions in servers are going to handle those Watts - unless it comes in significantly under the expected TDP.
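For what it's worth, the 172TFlops figure is roughly consistent with the public Larrabee description, assuming the 16-wide vector unit with fused multiply-add from Intel's Larrabee paper (that width is an assumption here, not something stated above):

    32 cores x 16 lanes x 2 flops (FMA) x 2 GHz ~= 2.0 TFLOPS per chip
    4 chips per 2U node x 22 nodes per rack      =  88 chips per rack
    88 chips x ~2.0 TFLOPS                      ~= 180 TFLOPS peak per rack

which lands in the same ballpark, give or take the exact clock and node count.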

Intel had this 10 years ago. (0)

Anonymous Coward | about 5 years ago | (#29601513)

Intel themselves did it over 10 years ago, but due to the design team developing it for RDRAM, the project got canned when it turned out Rambus memory wouldn't be available in significant quantity and at a price point that would work for the SoC ideal, which was small, cheap, and fast.

If they hadn't screwed it up, we could've had this level of integration 10 years ago, and perhaps Intel would've gotten off their sorry butts and produced some higher-end integrated graphics, rather than relegating their 8xx/GMA series graphics hardware to the annals of low-end hardware history.

Re:Intel had this 10 years ago. (1)

symbolset (646467) | about 5 years ago | (#29602061)

Intel was buying optical processor technologies a quarter century ago. I don't see any threats to their dominance in the next quarter century. They're not dumb. They could launch a 100GHz photonic processor if they needed to. They just don't need that advancement yet. I can't think of a better reason to buy AMD processors than that: Intel is not going to give us the good stuff until their dominance is threatened.

Another article here (3, Informative)

Vigile (99919) | about 5 years ago | (#29600837)

http://www.pcper.com/article.php?aid=789 [pcper.com]

Just for a second glance.

nice die shot (1)

skarhand (1628475) | about 5 years ago | (#29601063)

As opposed to TFA, this article includes a nice die shot, for those that care.

More than just graphics (5, Informative)

mathimus1863 (1120437) | about 5 years ago | (#29600855)

I work at a physics lab, and demand for these newer NVIDIA cards is exploding due to general-purpose GPU programming. With a little bit of creativity and experience, many computational problems can be parallelized and then run on the multiple GPU cores with fantastic speedup. In our case, we got a simulation from 2s/frame to 12ms/frame. It's not trivial though, and the guy in our group who got good at it... he found himself on 7 different projects simultaneously as everyone was craving this technology. He eventually left b/c of the stress. Now everyone and their mother either wants to learn how to do GPGPU, or recruit someone who does. This is why I bought NVIDIA stock (and it has doubled since I bought it).

But this technology isn't straightforward. Someone asked why not replace your CPU with it? Well for one, GPUs didn't use to be able to do ANY floating-point or double-precision calculations. You couldn't even program calculations directly -- you had to figure out how to represent your problem as texel and polygon operations so that you could trick your GPU into doing non-graphics calculations for you. With each new card released, NVIDIA is making strides to accommodate those who want GPGPU, and for everyone I know those advances can't come fast enough.
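For readers who have only seen the old texel-and-polygon trick, here is a minimal CUDA sketch of the modern approach (the kernel, array size and update rule are made up for illustration, not the lab's actual simulation): each GPU thread updates one array element, so the per-frame loop a CPU would walk serially runs across thousands of threads at once.

    #include <cuda_runtime.h>

    // Hypothetical per-element update: one thread advances one entry of the state.
    __global__ void step(float *pos, const float *vel, float dt, int n)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            pos[i] += vel[i] * dt;   // every element handled in parallel
    }

    int main(void)
    {
        const int n = 1 << 20;                      // made-up problem size
        const size_t bytes = n * sizeof(float);

        float *pos, *vel;                           // device buffers
        cudaMalloc((void **)&pos, bytes);
        cudaMalloc((void **)&vel, bytes);
        cudaMemset(pos, 0, bytes);
        cudaMemset(vel, 0, bytes);

        int threads = 256;
        int blocks  = (n + threads - 1) / threads;
        step<<<blocks, threads>>>(pos, vel, 0.016f, n);   // one "frame" of work
        cudaDeviceSynchronize();

        cudaFree(pos);
        cudaFree(vel);
        return 0;
    }

Compiled with nvcc, that is roughly the shape of a CUDA program; how big a speedup you actually see depends on how well your problem maps onto the one-thread-per-element pattern.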

Re:More than just graphics (0)

Anonymous Coward | about 5 years ago | (#29601099)

CUDA supports C and C++. You don't need to represent your problem as texel or polygon operations to "trick" the GPU.

Re:More than just graphics (1)

mathimus1863 (1120437) | about 5 years ago | (#29601191)

You don't need to "trick" the GPU anymore, but it is still lacking tons of functionality. I remember a coworker complaining that he got some double-precision operations working, but everything crashed if he used any for-loops. Additionally, there are different bit-representations for variables on CPU vs GPU which can complicate things to hell when trying to get the CPU and GPU to cooperate with each other. Yes, CUDA is like C or C++, but it's not the same, yet. Hence, new releases like this get people excited.

Personally, I've found it to require too much time-investment to learn how to do GPGPU programming, but new cards with updated functionality help flatten the learning curve.

Re:More than just graphics (1)

six11 (579) | about 5 years ago | (#29601249)

I'm curious to know how you go about writing code for GPUs. I've been thrown into a project recently that involves programming multicore architectures, so I've been reading about StreamIt (from MIT). It looks really interesting. But they don't mention GPUs in particular (just multicores), probably because the current batch of GPUs don't have a lot of candy that CPUs have (like floats).

Re:More than just graphics (2, Informative)

Anonymous Coward | about 5 years ago | (#29601611)

Start looking at OpenCL as soon as possible if you want to learn GPGPU; CUDA is nice, but OpenCL is portable between vendors and hardware types :)

Re:More than just graphics (1)

LucidLion (1145739) | about 5 years ago | (#29601633)

If you have a compatible nVidia graphics card (anything within the last 3 years), you should look into CUDA (http://www.nvidia.com/object/cuda_get.html). It's got a steep learning curve at first, but it's quite awesome. OpenCL is another option designed to be more cross-platform. Either way, you basically write some code in C (well...C with a few extensions and limitations) and it executes in parallel on the GPU. The current batch of nVidia GPUs support floats with both single precision and double precision, although there is a significant performance penalty with double precision for most applications. With the announcement nVidia made today, though, there will soon be new cards with much less of a performance penalty for double precision.

This is the easy way to go (1)

raftpeople (844215) | about 5 years ago | (#29601765)

I had my code up and running quickly. It takes a little more time to re-arrange the algorithm and data to get optimal efficiency and to learn about the strengths and weaknesses of the computing model, but still relatively easy for any competent C coder.

Re:More than just graphics (1)

Mad Merlin (837387) | about 5 years ago | (#29601945)

GPUs (of the past) are basically just massively parallel floating point units. I think the OP meant to say that they lacked integer operations and double precision floats in the beginning.

Re:More than just graphics (1)

Belisar (473474) | about 5 years ago | (#29602227)

Actually, GPUs started out doing all rendering in fixed point arithmetic, i.e. the equivalent to integers. That worked fine for rasterizing and shading for quite a while.

Then, they started doing limited-precision floating point support (with some 16-bit 'half' floats and other non-standard-conforming weirdness). Only later did they actually go on to support full IEEE floats. The current (as in, can buy them now) generation added IEEE double for both manufacturers, but of course performance is about an eighth or so of single precision.

Re:More than just graphics (1)

drgould (24404) | about 5 years ago | (#29601659)

But this technology isn't straightforward. Someone asked why not replace your CPU with it?

Because GPUs have no memory management unit, they can't run any modern multitasking operating system worth beans; Windows, Linux, Mac OS, whatever.

And why should they? That's not what they're designed for.

Embedded x86? (1, Interesting)

Doc Ruby (173196) | about 5 years ago | (#29600973)

What I'd like to see is nVidia embed a decent x86 CPU (maybe like a P4/2.4GHz) right on the die with their superfast graphics chips. I'd like a media PC which isn't processing apps so much as it's processing media streams: pic-in-pic, DVR, audio. Flip the script on the fat Intel CPUs with "integrated" graphics, for the media apps that really need the DSP more than the ALU/CLU.

Gimme a $200 PC that can do 1080p HD while DVRing another channel/download, and Intel and AMD will get a real shakeup.

Re:Embedded x86? (1)

deathguppie (768263) | about 5 years ago | (#29601047)

All I know is I want to be able to move, normal map, and texture 10M Triangles in real time. When they can do that.. I'll buy beer for everyone!

Re:Embedded x86? (1)

Sponge Bath (413667) | about 5 years ago | (#29601093)

Your proposal sounds similar to the IBM/Sony Cell architecture: one general-purpose processor core with a collection of math-crunching cores. The enhanced double-precision FP in this latest Nvidia chip also mirrors the progression of Cell, from the original Cell processor to the PowerXCell 8i.

Re:Embedded x86? (1)

Doc Ruby (173196) | about 5 years ago | (#29601199)

Indeed, I have two (original/80GB) PS3s exclusively for running Linux on their Cells. The development of that niche has been very disappointing, especially SPU apps. And now Sony has discontinued "OtherOS" on new PS3s (they claim they won't close it on legacy models with new firmware, but who knows).

I'd love to see a $200 Linux PC with a Cell, even if it had "only" 2-6 (or 7, like the PS3) SPUs, and no hypervisor, but maybe a separate GPU (or at least a framebuffer/RAMDAC, but why not a $40 nVidia GPU?). The Cell shouldn't be so exotic and rare, and it certainly shouldn't cost $5000 for a workstation just to get 8 SPUs when it's really the bus that's the magic, 2-6 SPUs are so fast, and such a CPU should cost under $80.

Re:Embedded x86? (2, Informative)

Anonymous Coward | about 5 years ago | (#29602141)

What I'd like to see is nVidia embed a decent x86 CPU,

They did, it's called Tegra. Except it's not using the x86 hog, but the way more efficient ARM architecture.

JYou insensitive clod.. (-1, Offtopic)

Anonymous Coward | about 5 years ago | (#29601389)

tur8ed over to yet Problem; a few ContributEd code Of progress.