
Nvidia 480-Core Graphics Card Approaches 2 Teraflops

ScuttleMonkey posted more than 5 years ago | from the hotter-than-a-thousand-suns dept.

An anonymous reader writes "At CES, Nvidia has announced a graphics card with 480 cores that can crank up performance to reach close to 2 teraflops. The company's GTX 295 graphics cards has two GPUs with 240 cores each that can execute graphics and other computing tasks like video processing. The card delivers 1.788 teraflops of performance, which Nvidia claims is the fastest single graphics card in the market."
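The 1.788-teraflop figure can be reproduced from the commonly quoted GTX 295 specs. A quick sanity check (the 1,242 MHz shader clock and the 3-FLOPs-per-clock counting convention, dual-issue multiply-add plus multiply, are the usual published numbers, not something stated in the summary):

```python
# Sanity check on the "1.788 teraflops" claim for the dual-GPU GTX 295.
# Assumed specs (not in the summary itself): 240 shader cores per GPU,
# a 1242 MHz shader clock, and 3 single-precision FLOPs per core per
# clock (dual-issue multiply-add plus multiply).

def peak_gflops(cores, shader_clock_mhz, flops_per_clock):
    """Theoretical peak throughput of one GPU, in GFLOPS."""
    return cores * shader_clock_mhz * flops_per_clock / 1000.0

per_gpu = peak_gflops(cores=240, shader_clock_mhz=1242, flops_per_clock=3)
card_total = 2 * per_gpu  # two GPUs on one GTX 295 board

print(f"{per_gpu:.2f} GFLOPS per GPU, {card_total / 1000:.3f} TFLOPS per card")
```

That lands on 894.24 GFLOPS per GPU and 1.788 TFLOPS for the card, matching Nvidia's number.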

261 comments

Tell me how big it is. (1, Insightful)

unity100 (970058) | more than 5 years ago | (#26391601)

It's not a problem to implement 52342525113 cores. The problem is doing it within the cost, size, and power draw that an acceptably priced gamer PC case can accommodate.

So far, Nvidia is failing in that respect.

Re:Tell me how big it is. (0, Informative)

Anonymous Coward | more than 5 years ago | (#26391847)

It's 12" long and 6" around and it's going to go straight up your ass.

Re:Tell me how big it is. (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26392085)

A giant customized Starbucks in Cupertino California where lattes and no soy skim macchiatos are given out free to all employees. The background music involves a playlist of Nora Jones, David Matthews, John Mayer, and Bono on loop from an Ipod docked somewhere in the Apple/Starbucks facility. Hours are long but morale is surprising high as developers, hardware and software, are given 30 minute breaks to masturbate to the new itunes interface.

All developers sit at cafe type tables with a Mac Book Pro while their lord and master Steve Jobs stands deskless in his predictable attire of a turtleneck and jeans. In fact, this is the preferred (mandatory) dress code at Apple. Jobs walks around to each and every department, separated by latte and vegan preferences, and checks on the performance and efficiency of his developers. At any given point in the day one may see Mr Jobs yelling at a programmer for not implementing a button in the perfect shade of corn flower blue (#6495ED) and immediately sends him to the apple punitive chamber, consisting of a HP Compaq running Vista Basic.

There are 2 software development departments and 2 hardware development sections in Apple. For software there is the Apple core team, Apple Open Source team. In hardware there is the Apple systems and management team and the iDevice team. Since the OSX kernel consists of a BSD darwin kernel there is no real need for low level programmers and as such the entirety of the Apple core team consists of UI designers and photoshop junkies. All software churned out from the core team is designed in a program strikingly similar to Visual Studio's form designer but with Cocoa Objective C generated instead. The 16 hour day (Jobs demands 16 hour days since he himself never sleeps) of a core dev involves lining up the right shade of chrome with the latest photoshopped graphite button and maintaining the correct color scheme, not an easy job at all.

The Apple open source team involves a little bit more coding, which is mandated to be done in TextEdit or the option of a $80 third party mac text editor. The Apple open source team doesn't actually create much code but searches the internet for interesting BSD licensed software and modifies it as it's own through obfuscation and conversion to objective C. Many of the items a mac user sees comes from the open source world stamped by apple such as the ability to play music taken from 67 different originally linux based players, CD burning, and the overall ability to click a mouse. Apple's legal department has no qualms about this practice and has assured many that since most of the code is BSD and if any is GPLed many Linux hippies should be grateful that Apple fostered WebKit by using KHTML and adding some Gecko bloat. Perhaps one of the most important items that the open source team has done to date is use parts of the FreeBSD to keep the kernel up to date.

There's not much to say about the Apple systems and management team. I suppose they can be classified in to desktop and laptop systems. Because hardware work is beneath Apple in general and thought of being only worthy of Windows Users and as such can be found working on these beauties in the starbucks bathroom. Desktops are currently made by buying dell machines and putting them in Lian Li cases, where the majority of the costs goes to buying titanium Apple emblems to paste on the sides. Laptops consists of the rebranding of only the most silver and black Sony Viaos but talk has been going around about rebranding Asus EeePCs for a new Apple netbook but you didn't hear that from me, for fear of my life.

The iDevice team's job is to develop for the ipod, iphone, itouch, and many other portable electronics apple may release in the future. Their jobs are very interconnected with the open source team as well as the core dev team. Using firmware from random samsung devices and giving it an OSX skin the ipod stands as a shining example that infringement only applies to greasy file sharers and that the music player remains the best in market. The 16-24 hour day of an iDevice dev consists of partly implementing Zune security flaws and creating new novel ways to make the click wheel sounds, while a majority of the time is spend just playing with (read testing) products future and past.

And there you have it, the mystery of Apple unveiled. Also on a side note to those who are worried about Job's health. He is fine but is trying a new diet consisting of Soy Nuts and Anger.

Re:Tell me how big it is. (0)

Anonymous Coward | more than 5 years ago | (#26392233)

Trolls like this, after being cut and pasted repeatedly, only have some replay value when they are a first post. Otherwise it just looks pathetic. Try harder.

Re:Tell me how big it is. (-1, Troll)

Anonymous Coward | more than 5 years ago | (#26392317)

i use mac with my brother sometimes... its very cool... my brother is 30 years old hes pretty smart... he has 45 iq its the same as heis shoe size.. pretu good considaring 100 is full.... mac is cool but visat is beter... i am takeru on msn... bcz when i play halo for the second time i knew what was going too happen befor eit happend... so im takeru... its pretty cooll... is anyone else here mac... thatwould be prety cooll... sonic is cool... i dont like tails though bcz hes sonics girlfrend... i want2 be sonics girlfrend.... sonic is so fast and handsome its increddibnle... sometimes... mac... together... my mom and dad are brother and sister... its prety cool i gess... i herd its prety normal in america.... they love eachother like a father and daugher... theyr so cute together... together... sometimes... mac... my brother is in wheel chair... but hes cool because hes smart... yea... the boy in the basements said he isnt smart and he say bad thing about my dad... but its no mater... he is chained up... in basement... together... vista... yea... maybe... mac is pretty cool bcz they are like copmuters... and the y hav leaf powers btu in mac their in the sfrari... and im there too because im takeru... together... sometimes... i hear screaming from basement... dosnt mater... the boy there is happey.... yea...

Re:Tell me how big it is. (1)

Surt (22457) | more than 5 years ago | (#26391883)

"The card fits into any normal PCI Express slot."

Re:Tell me how big it is. (-1, Troll)

Anonymous Coward | more than 5 years ago | (#26391903)

The average nigger dick fits into any normal white woman's vagina. That's to say nothing about the size of the nigger that's (unfortunately) attached to it.

Re:Tell me how big it is. (0)

Anonymous Coward | more than 5 years ago | (#26391973)

So there was actually little technical thought put into the PCI-E standard, rather it was conceived as a subtle social commentary. Nice.

Re:Tell me how big it is. (-1, Flamebait)

Anonymous Coward | more than 5 years ago | (#26392041)

1.21 niggawatts? 1.21 niggawatts!
There is only one thing that can generate 1.21 niggawatts...A nigger.
Unfortunately you don't know when or where one will steal your bike.

Re:Tell me how big it is. (1, Insightful)

BloodyIron (939359) | more than 5 years ago | (#26392207)

I like how you think you're a big enough person to post a racist comment, yet not big enough to log in.

Grow up, and welcome to the real world (READ: Anyone can be president)

Re:Tell me how big it is. (-1, Troll)

Anonymous Coward | more than 5 years ago | (#26392313)

Hi nigger! Welcome to America. The cotton fields are to your left.

Re:Tell me how big it is. (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26392105)

...and the only people who would put a black dick inside are the ones with low self-esteem.

Re:Tell me how big it is. (1)

unity100 (970058) | more than 5 years ago | (#26392131)

Yeah, like the 280s.

It fits into a normal PCI-E slot and meets ATX standards, but you need a shovel to shove it into the case. It doesn't leave an inch of space to put anything else in.

Re:Tell me how big it is. (3, Informative)

BloodyIron (939359) | more than 5 years ago | (#26392251)

The specs are very specific (lol, get it?).

I take it you haven't seen full-length graphics cards yet? 280s, 8800 GTXs, GX2s, etc., aren't full-length cards, but they're close.

These are full length cards: http://management.cadalyst.com/cadman/Review/AMD-ATI-FireGL-V8600-and-FireGL-V8650-Graphics-Car/ArticleStandard/Article/detail/526886?contextCategoryId=6631 [cadalyst.com]

You can tell the difference by them not only being longer, but having that retention connector at the end (right side of the pictures) which helps steady the card.

Re:Tell me how big it is. (0, Troll)

Cheeze (12756) | more than 5 years ago | (#26392071)

I don't think the purpose of a card like that is gaming. Sure, games will work very nicely with it, but paying several thousand dollars for a video card like this when your game is only going to use 10% of the available power is VERY wasteful.

Re:Tell me how big it is. (2, Informative)

Anonymous Coward | more than 5 years ago | (#26392169)

What the fuck are you smoking? It's a $500 card.

Re:Tell me how big it is. (1)

BloodyIron (939359) | more than 5 years ago | (#26392173)

If you do your research, you'll see that:

a) It's based on an improved 260 core design (not 280 as indicated by the memory bit width), so it is for gaming (and work)
b) It's MSRP'd at $500, and currently available for $500ish ( http://www.newegg.com/Product/Product.aspx?Item=N82E16814130439 [newegg.com] )
c) What is your source that it'll only use 10% of the available resources?

I find your conclusions laughable at best.

Re:Tell me how big it is. (1)

Zymergy (803632) | more than 5 years ago | (#26392321)

Actually, I kinda have to like disagree with you on that...
Current PC games are utilizing these latest generation cards NOW...

I use the predecessor to this card (Nvidia's GTX280 GPU with its 240 'cores') to play the latest FPS games at 1920x1200 and it runs a Folding@Home GPU CUDA client whenever it is not gaming...
If I had one of these new $500 GTX295s I could run my games even faster, or assign one of the GPUs with its 240 'cores' to physics processing (A/K/A Nvidia PhysX, F/K/A Ageia PhysX) while the other GPU serves as the primary rendering GPU. (And of course I would have BOTH of the GTX295's GPUs using their 240 'cores' each for twice the F@H contributions [or whatever distributed computing task I choose to donate some cycles to...].)
(Since I have an all-electric home, why not use my PC to heat part of it in the wintertime instead of wasting that electricity heating air solely with my all-electric HVAC unit...)

Imagine... (0)

zach_the_lizard (1317619) | more than 5 years ago | (#26391613)

...a Beowulf cluster of these! In all seriousness, I'm waiting for the latest and greatest supercomputers to have huge GPU farms.

Re:Imagine... (1)

HTH NE1 (675604) | more than 5 years ago | (#26391765)

I'm waiting for the latest and greatest supercomputers to have huge GPU farms.

Just wait until they perfect rapid fabrication and live expansion. GPU farming is the future, fabricating additional cores on demand.

Re:Imagine... (1)

Joce640k (829181) | more than 5 years ago | (#26392203)

Not gonna happen.

There's a lot of flops, sure, but they're arranged in a long pipeline where the only input is "texture map" and the only output is "frame buffer". That's not much good for general purpose processing.

Oh, and they're only single precision, which wipes out another big chunk of possibilities.
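The single-precision point is worth illustrating, since it is the main thing that kept cards of this era out of a lot of numerical work. A minimal stdlib-only sketch (float32 rounding is emulated via `struct`, since this is plain Python rather than GPU code):

```python
# Why single precision matters: summing a million small terms loses
# visible accuracy in float32 while float64 stays essentially exact.
# struct round-trips a Python double through 32 bits to emulate float32.
import struct

def to_f32(x):
    """Round a Python float (double) to the nearest IEEE-754 single."""
    return struct.unpack('f', struct.pack('f', x))[0]

n = 1_000_000
term = 0.1
term32 = to_f32(term)

f64_sum = 0.0
f32_sum = 0.0
for _ in range(n):
    f64_sum += term
    f32_sum = to_f32(f32_sum + term32)  # round after every add

print(f"float64 sum: {f64_sum:.4f}")  # within a whisker of 100000
print(f"float32 sum: {f32_sum:.4f}")  # off by hundreds
```

The float64 accumulator stays within a rounding error of 100,000; the float32 one drifts off by hundreds, which is exactly the kind of error that ruled these GPUs out for many scientific workloads.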

Re:Imagine... (1)

kkwst2 (992504) | more than 5 years ago | (#26392371)

Hmm? The GP said Beowulf cluster. Where in that did you read general purpose computing?

There are many HPC problems that you can solve adequately with single precision.

But will it run Crysis?... (5, Interesting)

TibbonZero (571809) | more than 5 years ago | (#26391615)

No, seriously... can anything run it at full options yet?

Re:But will it run Crysis?... (0, Flamebait)

sexconker (1179573) | more than 5 years ago | (#26391641)

Crysis (and Warhead) are HORRIBLY unoptimized and inefficient. Buggy as hell to boot.

I hate how people use it as a benchmark.

I can make a shiny game that runs super slow too.
Doesn't mean you should give it any weight as a benchmark.

Re:But will it run Crysis?... (4, Insightful)

pieisgood (841871) | more than 5 years ago | (#26391795)

Can you really make a game that looks as good as Crysis? Seriously, do you have any idea what went into making it? Something tells me that you have no idea whatsoever.

Re:But will it run Crysis?... (-1, Troll)

Anonymous Coward | more than 5 years ago | (#26392067)

What went into making it, was quite obviously not enough programming talent. Mod me troll if you like, but there it is.

Re:But will it run Crysis?... (1)

Ethanol-fueled (1125189) | more than 5 years ago | (#26392151)

Could it be that Crysis' developers did have the programming talent but they had to rush it out the door before they could optimize it and fix all the bugs?

Marketing vs. Engineering (0)

Anonymous Coward | more than 5 years ago | (#26392177)

Crysis was a brilliant marketing success built on top of very average engineering.

A whole lot of technical buzzwords and silly tech demos. But the game completely falls on its face in actual in-game graphics anywhere that isn't entirely covered in jungle. The most inane were the 'amazing' real-life comparisons, where they would take a lot of high-res scanned textures and carefully find the perfect spot to position the camera. Cute, but technically a complete yawn.

Not to mention the game itself was mind-numbingly dull, even by the fairly low standards of the FPS genre.

Re:But will it run Crysis?... (5, Insightful)

bertok (226922) | more than 5 years ago | (#26392323)

Well, I do know what goes into a game like Crysis, being a 3D game programmer and all. Those programmers were very, very good, believe me. Some of the stuff they pulled off is just amazing.

The reason Crysis is slow is the artistic direction. Outdoor environments full of plants and shadows with a huge viewing distance are very hard to implement in a 3D engine. I mean really fucking hard. Making a game like that playable at all means choosing between two scraggly trees on a flat green carpet that pretends to be grass, OR an enormous amount of research into optimization techniques that are very hard and time-consuming to implement. The Crysis engine is pretty much the state of the art in optimization. And these guys managed to squeeze in fantastic shader effects on top of that, depth of field, and even some basic radiosity shadowing for the characters!!! That's just insane.

Most reviewers and players with the right hardware thought the game looked amazing, way better than its peers at the time, or even now. I thought the effects (especially in the spaceship) looked better than most Sci-Fi movies, which is a stunning achievement for a 3D game running on a $500 video card. I upgraded my PC just to play the game, and I thought it was worth it. Lots of people did too:

http://www.penny-arcade.com/comic/2007/10/15/ [penny-arcade.com]

Take your head out of your ass and stop belittling other people's achievements until you have some of your own to compare it to, OK?

Re:But will it run Crysis?... (0)

Anonymous Coward | more than 5 years ago | (#26392219)

Crysis was a huge failure lol

Re:But will it run Crysis?... (1)

jaguth (1067484) | more than 5 years ago | (#26392363)

Um, I can think of about 20 games from 2008 that looked as good as, if not better than, Crysis, and that also run faster.

We Have A Winnner! (0)

Anonymous Coward | more than 5 years ago | (#26392007)

Crytek has done a fantastic job creating Id/John Carmack type fanboyism for their game.

With the sorry state PC gaming is in, with more and more developers leaving or focusing on the console market, Crysis has been latched onto as some sort of holy messiah, the one thing that still justifies their $3,000-4,000 'rig'.

Lots of silly tech demos that have nothing to do with the actual in-game graphics, and carefully staged Crysis-vs-real-life comparisons, have created a fanatical fanboy worship of the game/engine/company.

Too bad the actual in-game graphics are incredibly unimpressive, even more so when the screen isn't completely covered in foliage.

Re:But will it run Crysis?... (1)

JCSoRocks (1142053) | more than 5 years ago | (#26392097)

How so? I've played through all of Crysis and I never ran into a single bug. It also continues to be the most beautiful game I've ever played and I wasn't even able to max the settings (although I did get the DX10 God Rays, etc).

I'm tired of random rants about how Crysis sucks just because it's graphically demanding. They made an incredible game that has continued to take advantage of new hardware. Most games are the opposite: they're coded backwards so that most people with existing hardware can max them out, and they look no better on a machine now than they did 2 years ago. Crysis is different... it'll only get prettier (to a point, obviously)!

Yes (2, Informative)

jgtg32a (1173373) | more than 5 years ago | (#26391811)

I can run Crysis/Warhead at 30fps maxed out at 720p on a single 4850.

The problem with video card reviews is that they don't bother testing anything lower than 1920x1080, which has 2.25x the pixels of 720p.

Crysis takes a lot to run, but it has already been tamed as long as you aren't running at 2560x1600 or some other absurd resolution.
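The "2.25x" figure above is just the ratio of pixels per frame, which is the quantity that (roughly) sets per-frame shading work:

```python
# The 2.25x figure: 1920x1080 has 2.25 times the pixels of 1280x720.
def pixels(width, height):
    return width * height

ratio = pixels(1920, 1080) / pixels(1280, 720)
print(ratio)  # 2.25
```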

Re:Yes (1)

Fulcrum of Evil (560260) | more than 5 years ago | (#26391875)

Of course they test at 1920x1200 - that's how you can stress the card. It also avoids the problem of getting ridiculous framerates because you tested on a reasonable resolution.

Re:Yes (2, Insightful)

jgtg32a (1173373) | more than 5 years ago | (#26391941)

A video card test needs to show a consumer the capabilities of the card so they can decide if the card is for them. If what you said were true, then they would only do one test at 1920x1600 and be done with it. The lowest resolution I've ever seen in a review is 1920x1080. Not everyone has a monitor that runs that high.

Re:Yes (0)

Anonymous Coward | more than 5 years ago | (#26392005)

this is obvious, but if the card can run a game at 1920x1080, then it can run it at lower resolutions

Re:Yes (0)

Anonymous Coward | more than 5 years ago | (#26392127)

this is obvious, but if the card can run a game at 1920x1080, then it can run it at lower resolutions

Yes, but knowing that Card X gives a whopping 9fps at 1920x1080 really doesn't give those of us with lower resolution monitors any idea whatsoever how well it will perform at our desired resolutions. All we know is that it should do better than 9fps. But how much better? Who knows?
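One rough way to answer the "how much better?" question: if the card is purely fill-rate bound, frame rate scales inversely with pixels drawn. That scaling assumption is mine (real games also hit CPU, geometry, and memory limits), so treat this as an optimistic first guess rather than a prediction:

```python
# Hypothetical fps estimator under the ASSUMPTION that performance is
# purely fill-rate bound, i.e. fps scales inversely with pixels drawn.
# Real games mix CPU, geometry, and memory limits, so this is an
# upper bound, not a benchmark substitute.

def estimate_fps(measured_fps, measured_w, measured_h, target_w, target_h):
    return measured_fps * (measured_w * measured_h) / (target_w * target_h)

# The hypothetical 9 fps at 1920x1080 from the comment, rescaled to 720p:
print(estimate_fps(9.0, 1920, 1080, 1280, 720))  # 20.25
```

So a 9 fps result at 1920x1080 suggests at most around 20 fps at 720p, and likely less in practice.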

Re:Yes (2, Insightful)

Surt (22457) | more than 5 years ago | (#26392147)

1920x1200 is the most preferred resolution because it is the native resolution of most of the 24" panels. If you don't play at native resolution, you get to experience glorious scaling artifacts. Glorious, glorious scaling artifacts.

Re:Yes (1)

poetmatt (793785) | more than 5 years ago | (#26392287)

I was thinking about this. I'm picking up a 6GB RAM/4870x2/i7 920 setup and kept thinking: "Why not just run 1680x1050 dual monitors with like 16xAA?"

The thing about being able to run the current generation of games at 2560x1600 is that it also ensures there isn't a chance in hell you'll be able to a year later with the same setup, as games will be too demanding, and lowering the resolution while preserving the aspect ratio probably makes everything look like crap. Not to mention how disappointing that would be.

Car analogy time: it'd be like owning a Ferrari to drive on an open rural road, and then a year later you move to the city and can only drive it in stop-and-go traffic.

Better to try and fail than to never try (0)

Anonymous Coward | more than 5 years ago | (#26391617)

No, wait. Better to not try then there's no fear of failing. Right!

Power Requirement (5, Funny)

sexconker (1179573) | more than 5 years ago | (#26391619)

1.21 Jiggawatts

Re:Power Requirement (1, Troll)

Conspiracy_Of_Doves (236787) | more than 5 years ago | (#26391671)

What the hell is a Jiggawatt?

Re:Power Requirement (0)

Anonymous Coward | more than 5 years ago | (#26391747)

It's one billion watts (units of power). Same as one jigabyte is one billion bytes. For example, I have a 640 jigabyte hard-drive in the computer I'm using right now. I also have a 2 jigabyte iPod.

Re:Power Requirement (1)

Adriax (746043) | more than 5 years ago | (#26392325)

It doesn't even take an entire jigabyte of my HD to store my favorite movie, Jodzilla vs Jamera.

Re:Power Requirement (2, Informative)

Anonymous Coward | more than 5 years ago | (#26391761)

It's an allusion to Back to the Future, where Marty goes back to the '50s and tells the doctor that the DeLorean time machine requires "one point twenty-one gigawatts" to make the leap. Back in 1985 the SI prefix "giga" wasn't well known, so presumably the actors or directors arbitrarily, or by following French-language convention, decided to pronounce giga with a soft g, hence the line "1.21 jiggawatts", which sounds a little out of place in 2009.

Re:Power Requirement (2, Interesting)

Anonymous Coward | more than 5 years ago | (#26391865)

Actually, the soft-g ("j") pronunciation is correct, and illiterate computer types abominated it with a hard g. "Back to the Future" wasn't wrong; we are.

Re:Power Requirement (1)

JCSoRocks (1142053) | more than 5 years ago | (#26392121)

Citation? I'm curious more than accusatory. It drives me crazy when people say "jif" rather than "gif"; it sounds like they're talking about peanut butter. I've always used a hard "g" for both gif and giga-whatever. I'd be interested in knowing if I was wrong all along, though...

Re:Power Requirement (1)

Conspiracy_Of_Doves (236787) | more than 5 years ago | (#26392365)

WHOOOSH

And what does Marty say after Doc keeps repeating '1.21 Jigowatts' over and over after hearing himself say it on the video tape?

Re:Power Requirement (1)

morgan_greywolf (835522) | more than 5 years ago | (#26391803)

What the hell is a Jiggawatt?

Don't be ridiculous!

It's a unit of power used in measuring flux capacitors!

Now, if you'll excuse me, I have to go back to 1955.

Re:Power Requirement (1)

phagstrom (451510) | more than 5 years ago | (#26391743)

Great, so now I need Mr. Fusion or a bolt of lightning to get my computer running. I wonder if the flux capacitor is an optional extra?

Contest... (5, Funny)

Anonymous Coward | more than 5 years ago | (#26391647)

Yet again, Nvidia showed ATI that it, indeed, has the biggest penis.

Re:Contest... (5, Funny)

pla (258480) | more than 5 years ago | (#26391939)

Yet again, Nvidia showed ATI that it, indeed, has the biggest penis.

Not quite - They proved they have the biggest number of penises... Making for some interesting crossover potential into the Hentai gaming market.

/ Wonders what "ultra realistic" means as regards H - "Wow, the fur on her tail looks almost real, and her breasts look like actual porcelain!"

Great... (5, Insightful)

pwolf (1016201) | more than 5 years ago | (#26391679)

That's just great and all, but when can I get a video card that doesn't take up half my case and melt down after 6 months of use? Not to mention one that doesn't cost an arm and a leg.

Re:Great... (5, Funny)

clarkn0va (807617) | more than 5 years ago | (#26391785)

when can I get a video card that doesn't take up half my case and melts down after 6 months of use? Not to mention, doesn't cost an arm and a leg.

2006?

Re:Great... (1)

slaker (53818) | more than 5 years ago | (#26391807)

No kidding. I have a BFG Tek 8800GTX that's been replaced five times since I got it. My game system used to be an overclocked affair with several hard drives, but over time I've reduced it to a 700W Corsair PSU, an un-overclockable Intel branded motherboard, one hard disk and stock Crucial RAM, thinking maybe my setup was killing the card... all in an enormous Antec P180 case, which has dedicated cooling for the graphics slot and multiple 120mm fans.

Fucking thing died again a couple weeks ago. Even when it's working "right" I can feel a spot on the side of my case that's got to be 50 F hotter than the same area three inches above or below that spot.

Re:Great... (1)

Fweeky (41046) | more than 5 years ago | (#26392049)

I bought an ATI when my 8800GTS 512 died; I didn't want to play the lottery as to whether the replacement would have the same manufacturing defects [tgdaily.com] or not.

nVidia are going to have to do something pretty special to attract me back after that; putting two of their power hungry barely-fabricatable huge monolithic GPUs on a single card just isn't it.

Re:Great... (1)

StikyPad (445176) | more than 5 years ago | (#26392137)

The problem seems to be that many video cards ship with inadequate cooling systems. At least that's been my experience. Back in the day, custom cooling solutions were pretty much reserved for those doing serious overclocking. Now cooling requirements have gone up, but manufacturers generally use the bare minimum, such that the GPU doesn't overheat as soon as it's powered up, and nothing more.

I've only got a 7900GTX, but after having it replaced once, and then getting more jaggies, slowdown, and stuttering, I decided to check the core temperature. I don't remember the number, but it was quite high. I shelled out for a Zalman VF1000 [newegg.com], and installation was fairly straightforward. After a brief heart-stopping moment when my PC wouldn't boot (I had neglected to reconnect the PCI-E 6-pin power connector), I got the system powered up and found that all of my issues had been resolved.

Note also that if you live in a dusty environment (especially if you're a smoker), your fan/heatsink will need to be cleaned regularly. Dust is a good insulator, and will wreak havoc with your cooling if not removed.

Re:Great... (1)

spire3661 (1038968) | more than 5 years ago | (#26392297)

My 8800 GTX burnt out too. I replaced it with a 4850 that was on sale for $150, figuring I'd throw it in my media center PC when I got my 8800 back. The 4850 works so well I haven't bothered to send in the 8800. I play on a 24" Dell (1920x1200) and the 4850 runs most everything VERY well at that res.

Right now (3, Informative)

Sycraft-fu (314770) | more than 5 years ago | (#26391861)

One of the benefits of the technology war is that it produces good midrange and low end technology as well. This is particularly true in the case of graphics cards since they are so parallel. They more or less just lop off some of the execution units and maybe slow down the clock and you get a budget card.

Whatever your budget is, there's probably a good card available at that level. Now will it be as fast as the GTX 295? Of course not. However they'll be as fast as they can be at that price/power consumption point.

Don't bitch because some people need/want high-end cards. Enjoy the fact that they help subsidize you getting good, cheap midrange cards.

If you want serious suggestions, tell me your budget range and what you want to do and I'll recommend some cards.

Re:Great... (1)

morgan_greywolf (835522) | more than 5 years ago | (#26391887)

I don't know about adding that auto-meltdown feature. Sounds like a product liability issue.

It's a feature (1)

Wrexs0ul (515885) | more than 5 years ago | (#26392253)

Didn't you know the second power connection to your GPU is actually for the oven/space heater function? So it's actually a feature!

Nvidia realized long ago that to maximize play-time they needed a way for users to cook and stay warm near their PCs.

I've made some mean eggs on my case, recipe came from the included Nvidia cook-book.

-Matt

Re:Great... (1)

evanbd (210358) | more than 5 years ago | (#26391917)

Perhaps you should simply buy one of the less expensive cards out there? Of course the highest performing card available uses lots of power and costs a lot. Get something less powerful.

So how'd you solve it? (1)

Wrexs0ul (515885) | more than 5 years ago | (#26392047)

No kidding! I just ran into my first Nvidia heat-o'-death situation too.

Anyone know of an after-market part to draw air directly over your PCIe cards? This is a problem that's right now solved by the turning-my-graphics-card-into-a-jet-engine solution. It works, but if there's a quieter answer that keeps the graphics power I'd be happy to hear it.

Here's the skinny:

The 790i comes with 3 PCIe slots, so I thought I'd try SLI with two new cards, plus an older one (in the middle, thanks to the bridge) for second monitor/TV duty. The poor middle card just doesn't stand a chance against two 260s; it's like an oven with both elements on.

I've been using RivaTuner [guru3d.com] to adjust fan speed and watch temperatures. The outer cards run OK (44 deg C), but even at max fan speed the middle card idles at 61, and at normal speed it will die if anything taxes it for more than a few minutes.

Better Question (1)

KalvinB (205500) | more than 5 years ago | (#26392077)

When *won't* you be able to get a video card that takes up less than half your case and doesn't require its own power supply?

Right now you can still get a high powered graphics card for less than $50 with a small or no fan. But those cards are 2 year old technology. These days all the latest and greatest are essentially a PC within a PC and I doubt the power and cooling requirements will go down with time.

So in 5 years these ridiculously large cards will cost $50, but they'll still be ridiculously large.

10 years ago, graphics cards and graphics requirements for games were going neck and neck. Now it seems that graphics cards are outpacing what games actually demand, so you can go with a cheaper card and still get very good quality rendering.

Re:Better Question (1)

UncleTogie (1004853) | more than 5 years ago | (#26392161)

Right now you can still get a high powered graphics card for less than $50 with a small or no fan.

Define "high powered", please...

480 core? (5, Interesting)

Anonymous Coward | more than 5 years ago | (#26391715)

Color me doubtful, but I suspect it's 480 stream processors, which isn't anywhere NEAR the same thing as the "cores" on a CPU, or even the core of the GPU.

Why has the press suddenly started to call stream processors "cores"? Marketing?
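Whatever you call them, the claimed figure does check out as peak stream-processor throughput. A back-of-the-envelope sketch, assuming a ~1242 MHz shader clock and 3 single-precision FLOPs per stream processor per cycle (a multiply-add plus an extra multiply); neither number is from the article, both are common specs quoted for the GTX 295:

```python
# Rough check of the 1.788 TFLOPS claim from assumed GTX 295 specs.
stream_processors = 2 * 240        # two GPUs, 240 stream processors each
shader_clock_hz = 1.242e9          # assumed ~1242 MHz shader clock
flops_per_sp_per_cycle = 3         # assumed: multiply-add + multiply

peak_flops = stream_processors * shader_clock_hz * flops_per_sp_per_cycle
print(peak_flops / 1e12)           # ~1.788 TFLOPS
```

So "1.788 teraflops" is a theoretical peak derived by multiplying everything out, not a measured benchmark.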

Re:480 core? (2, Insightful)

Chabo (880571) | more than 5 years ago | (#26391925)

Maybe because GPGPU is coming soon, and the GPU makers want people to think of them as individual cores? So... partly marketing, I guess.

Sounds good but.. (0)

ph1nn (588305) | more than 5 years ago | (#26391767)

Doesn't the PS3's Cell processor do more than this, and it came out years ago?

Re:Sounds good but.. (5, Informative)

jgtg32a (1173373) | more than 5 years ago | (#26391853)

218 GFlops

http://www.realworldtech.com/page.cfm?ArticleID=RWT072405191325&p=2

A single 8800 kills the Cell and the PS3's video processor combined.

Re:Sounds good but.. (1)

JCSoRocks (1142053) | more than 5 years ago | (#26392145)

Which is why PC gaming will always be better than console gaming. *ducks beneath flamewar*. Seriously though, my PS3 is for BluRays and my 360 is for streaming NetFlix movies and playing Rock Band. That's it. If I want to play a real game... it's on the PC.

geez... (1)

tscheez (71929) | more than 5 years ago | (#26391863)

How much processing power does Duke Nukem Forever need?! That's like supercomputer performance from 10 years ago.

Yes, but... (0, Redundant)

tgetzoya (827201) | more than 5 years ago | (#26391915)

Will this be the first card to run Windows Aero at a decent speed?

Re:Yes, but... (1)

vux984 (928602) | more than 5 years ago | (#26392087)

Will this be the first card to run Windows Aero at a decent speed?

No. It won't be the first.

Of course, that's because the first cards to run Windows Aero at a decent speed were made several years ago.

Why do we bother... (4, Funny)

Thelasko (1196535) | more than 5 years ago | (#26391991)

with CPUs anymore? I'm just going to fill a case with graphics cards and call it a day.

Re:Why do we bother... (2, Informative)

Anonymous Coward | more than 5 years ago | (#26392139)

Because this card can only do 1.788 tera-multiply-adds per second. Try instead to have it build a parse tree, then run transformation algorithms on it (chasing pointers all over the place) and so on, like you would while compiling code, and this thing will make the Atom look great.

CPUs are optimized for general computing, GPUs are optimized for stream-oriented numeric computing. Both have their uses, and the ideal is probably a combination of both, as is currently done.
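The distinction can be sketched with two toy functions (illustrative only, not benchmarking code): one is a uniform multiply-add over an array, the kind of embarrassingly parallel work GPUs eat for breakfast; the other is a serial pointer chase where each step depends on the previous one, which is exactly what GPUs are bad at.

```python
# Stream-oriented work: the same multiply-add applied to every element.
# A GPU can run thousands of these in parallel with no branching.
def saxpy(a, xs, ys):
    return [a * x + y for x, y in zip(xs, ys)]

# Pointer-chasing work: each lookup depends on the previous result,
# so there is no parallelism to exploit -- CPU territory.
def chase(links, start, steps):
    node = start
    for _ in range(steps):
        node = links[node]
    return node

print(saxpy(2.0, [1.0, 2.0], [10.0, 20.0]))  # [12.0, 24.0]
print(chase({0: 3, 3: 1, 1: 0}, 0, 3))       # 0
```

Parsing and compiling look much more like the second function than the first, which is why a GPU won't replace your CPU for that kind of workload.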

Re:Why do we bother... (0)

Anonymous Coward | more than 5 years ago | (#26392341)

That's closer to possible than you might think. GPUs can now be used for a variety of tasks that CPUs usually handle. They are particularly well suited for math intensive simulation and data processing applications, for example.
http://en.wikipedia.org/wiki/GPGPU


Is that on Doubles or Floats? (1)

StaticEngine (135635) | more than 5 years ago | (#26392227)

Because their Tesla boards post nearly a TFLOP of performance for single precision computing, but only about 78 GFLOPS for double precision.

Next Gen Consoles (1)

Joseph Hayes (982018) | more than 5 years ago | (#26392301)

Just think of what the next generation of consoles will have. Microsoft will learn from their mistake (hopefully) and allow for better heat dissipation. And there is no telling what Sony will come up with to try and secure their share in the market. Anyway... these are all hopes of course. My point being, the next gen consoles should deliver some mind blowing experiences.

*sigh* (5, Funny)

CynicalTyler (986549) | more than 5 years ago | (#26392335)

Can someone please post the link to a how-to guide for convincing your wife/girlfriend of the necessity of owning a graphics card with dual 240-core GPUs? Or, if you are a girl who acknowledges said necessity without a fight, please post a link to your Facebook profile. Thank you in advance.