
AMD Next-Gen Graphics May Slip To End of 2013

Unknown Lamer posted about a year ago | from the what-is-the-world-coming-to dept.


MojoKid writes "AMD has yet to make an official statement on this topic, but several unofficial remarks and leaks point in the same direction. Contrary to rumor, there won't be a new GCN 2.0 GPU out this spring to head up the Radeon HD 8000 family. This breaks with a pattern AMD has followed for nearly six years. AMD recently refreshed its mobile product lines with HD 8000M hardware, replacing some old 40nm parts with new 28nm GPUs based on GCN (Graphics Core Next). On the desktop, it's a different story. AMD is already shipping 'Radeon HD 8000' cards to OEMs, but these cards are based on HD 7000 cores with new model numbers. RAM, TDP, core counts, and architectural features are all identical to the HD 7000 lineup. GPU rebadges are nothing new, but this is the first time in at least six years that AMD has rebadged the top end of a product line. Obviously any delay in a cutthroat market against Nvidia is a suboptimal situation, but consider the problem from AMD's point of view. We know AMD built the GPU inside the Wii U. It's also widely rumored to have designed the CPU and GPU for the Xbox Durango, and possibly both of those components for the PS4 as well. It's possible, if not likely, that the company has opted to focus on the technologies most vital to its survival over the next 12 months." Maybe the Free GNU/Linux drivers will be ready at launch after all.

76 comments

C'mon, folks. (0)

Shaman (1148) | about a year ago | (#42867355)

Do we really need more powerful GPUs? What we need is a better way of displaying graphics and a better toolkit to do it.

Whatever happened to the Unlimited Detail guys?

Re:C'mon, folks. (1)

Anonymous Coward | about a year ago | (#42867631)

Whatever happened to the Unlimited Detail guys?

They seem to have bailed on the gaming side of things. The whole concept has problems when you consider things like animation, opacity, reflections, multiple varying light sources, shadows, etc. I'm not saying they couldn't solve them, but everything they demoed was the sort of stuff we can already do -- just look on YouTube for voxel renderers -- and they omitted all the tricky things. Moreover, their explanation of how it works means reflections, opacity, and shadows don't even work in their paradigm. Some of those challenges they claim to have solved (animation in particular), but with no details or demos, just "we have animation."

I won't say they haven't done it, or say they are definitely snake oil salesmen... but the 'trust us, we've done it' approach, when their explanation suggests otherwise, doesn't bode well for them.
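For context, the kind of voxel renderer "we can already do" is well documented. Below is a minimal sketch of the classic Amanatides & Woo grid traversal (the 1987 paper most hobby voxel raycasters build on): a dense grid marched one cell at a time until a solid voxel is hit. All names are illustrative; this is nothing from Euclideon or any shipping engine.

    #include <cmath>
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    struct Vec3 { double x, y, z; };

    struct Grid {
        int nx, ny, nz;
        std::vector<char> solid;                    // nx*ny*nz occupancy flags
        bool at(int x, int y, int z) const {
            return solid[(std::size_t)(z * ny + y) * nx + x] != 0;
        }
        bool inBounds(int x, int y, int z) const {
            return x >= 0 && x < nx && y >= 0 && y < ny && z >= 0 && z < nz;
        }
    };

    // March a ray (origin o, direction d) through unit-sized cells and report
    // the first solid voxel hit, if any.
    bool raycast(const Grid& g, Vec3 o, Vec3 d, int& hx, int& hy, int& hz) {
        const double INF = 1e30;
        int x = (int)std::floor(o.x), y = (int)std::floor(o.y), z = (int)std::floor(o.z);
        int sx = d.x >= 0 ? 1 : -1, sy = d.y >= 0 ? 1 : -1, sz = d.z >= 0 ? 1 : -1;
        // tDelta: distance along the ray between successive cell boundaries per axis.
        double tdx = d.x != 0 ? std::fabs(1.0 / d.x) : INF;
        double tdy = d.y != 0 ? std::fabs(1.0 / d.y) : INF;
        double tdz = d.z != 0 ? std::fabs(1.0 / d.z) : INF;
        // tMax: distance along the ray to the first boundary crossing per axis.
        double tmx = d.x != 0 ? (sx > 0 ? x + 1 - o.x : o.x - x) * tdx : INF;
        double tmy = d.y != 0 ? (sy > 0 ? y + 1 - o.y : o.y - y) * tdy : INF;
        double tmz = d.z != 0 ? (sz > 0 ? z + 1 - o.z : o.z - z) * tdz : INF;
        while (g.inBounds(x, y, z)) {
            if (g.at(x, y, z)) { hx = x; hy = y; hz = z; return true; }
            if (tmx <= tmy && tmx <= tmz) { x += sx; tmx += tdx; }  // step along x
            else if (tmy <= tmz)          { y += sy; tmy += tdy; }  // step along y
            else                          { z += sz; tmz += tdz; }  // step along z
        }
        return false;  // left the grid without hitting anything
    }

    int main() {
        Grid g{8, 8, 8, std::vector<char>(8 * 8 * 8, 0)};
        g.solid[(std::size_t)(4 * 8 + 4) * 8 + 4] = 1;   // one voxel at (4,4,4)
        Vec3 o{0.5, 0.5, 0.5}, d{0.577, 0.577, 0.577};   // look down the diagonal
        int hx, hy, hz;
        if (raycast(g, o, d, hx, hy, hz))
            std::printf("hit voxel (%d,%d,%d)\n", hx, hy, hz);
    }

Note how the grid encodes geometry in world space: that's exactly why animation is the hard part, since a moving character means rewriting or transforming voxel data every frame.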

Re:C'mon, folks. (0)

Anonymous Coward | about a year ago | (#42867633)

Nope. More power = moar polygons = better gaming. Dontcha know?

I mean, fuck art direction. Totally.

Re:C'mon, folks. (1)

0111 1110 (518466) | about a year ago | (#42867793)

Personally I like both more polygons and good art direction. Maybe some talented artists as well.

Re:C'mon, folks. (1)

tepples (727027) | about a year ago | (#42867839)

Personally I like both more polygons and good art direction. Maybe some talented artists as well.

Are you willing to pay two or three times more per copy for such a game?

Re:C'mon, folks. (1)

0111 1110 (518466) | about a year ago | (#42868163)

Are you willing to pay two or three times more per copy for such a game?

Yes. I'll even pony up for a new video card if I like the game enough.

Re:C'mon, folks. (0)

Anonymous Coward | about a year ago | (#42867659)

Whatever happened to the Unlimited Detail guys?

...Steve, er, um, Bruce, is that you?

Re:C'mon, folks. (0)

Anonymous Coward | about a year ago | (#42868161)

Do we really need more powerful GPUs? What we need is a better way of displaying graphics and a better toolkit to do it.

Whatever happened to the Unlimited Detail guys?

They realized they'd never be able to compete with Minecraft.

http://imgur.com/r/gaming/sEB8hO2

Re:C'mon, folks. (1)

Taco Cowboy (5327) | about a year ago | (#42868949)

Do we really need more powerful GPUs? What we need is a better way of displaying graphics and a better toolkit to do it.

Whatever happened to the Unlimited Detail guys?

In a way, we do need a more powerful GPU, but not the way they are doing it.

Simply adding shader units or ramping up the GHz no longer does the job.

A total overhaul of the GPU mindset must take place, but it takes much more than the hardware guys (AMD/nVidia); it also takes a paradigm shift among graphics programmers to push for real change.

Re:C'mon, folks. (1)

hairyfeet (841228) | about a year ago | (#42872271)

Honestly, from what I've seen, just like with CPUs it's now about doing what we were already doing but with less heat and power usage. Look at what an HD 4850 used and then compare it to an HD 7770, which according to sites like HWCompare is just a little ahead of the HD 4850, and you'll see it's a pretty big drop in power used to get the same performance.

But as far as graphics goes? I think this is pretty much it, guys; it's not gonna get much better. Hell, it costs nearly 100 million dollars to do a triple-A game with today's graphics. More graphics, more physics, all of that is more work, which will drive the price up, and I don't see that happening. THQ is gone, EA is on the selling block, and Activision's parent company is looking at restructuring. It simply costs too much to focus everything on the graphics like we did in the old days, folks, so I'm betting we have pretty much reached a plateau, and new consoles won't change that.

My guess is the future is gonna be a combination of episodic gaming, DLCs, and a mix of big devs and small, because graphics won't be the sole selling point anymore. Games like Torchlight II and Legend of Grimrock have shown that games without bleeding-edge graphics can still sell well, and I wouldn't be surprised if games like the Borderlands series make nearly as much off DLC sales as they do off the game itself, so to help lower costs I can easily see games changing to more of an episodic and DLC model.

Finally, as for AMD: if the rumors are true, they are looking at 3 out of the 4 consoles using their chips in some manner (the lone holdout being the rumored Steambox, but until we get some hard data we can't be sure of that), so I can see them not having the capacity to do a new chip rollout at the same time they are having to supply all these consoles. I'm just glad to see them get the work, as we need competition, and having 3 out of the 4 consoles ought to give them a regular source of income they can count on while they work on new chips.

The Truth About AMD (-1, Troll)

Anonymous Coward | about a year ago | (#42867425)

AMD has been working for years to destroy the privacy of ordinary citizens-- for profit.

When asked, executives from AMD have been unable to explain why employees at their headquarters keep altering the Wikipedia page for the attempted assassination of Pope John Paul II.

Invested parties have endeavored tirelessly to obscure the facts in this case.

Last August, multiple news agencies reported on joint defense exercises between Afghanistan and China. What they didn't report is that North Korea ALSO was there - with next-generation warship technology developed secretly by AMD for $3.8 billion.

The last time anyone came forth to speak about this, they immediately noticed increased surveillance at their house and place of work. Suspicious, right?

For years, the government has been using industrial accidents and the attempted assassination of Pope John Paul II as excuses for increasing restrictions on the use of aspartame. But we all know that's just an excuse. Ordinary people should be allowed to keep as much aspartame as they want for personal use.

Given these facts, the future does not look good.

If you don't act on this, who will? And who will suffer while you remain indecisive?

Re:The Truth About AMD (0)

Anonymous Coward | about a year ago | (#42869855)

When asked, executives from AMD have been unable to explain why employees at their headquarters keep altering the Wikipedia page for the attempted assassination of Pope John Paul II.

Cool, I must have distant relatives in the AMD headquarters, embarrassed about the event! John Paul II did forgive him for his mental instabilities, though.

The delay of the HD 8000 is a positive thing when you consider the arrival dates of the Kaveri APU. This way the customer is able to pair the APU with current technology.

No thanks (0)

jvillain (546827) | about a year ago | (#42867607)

I have a Southern Islands card that will likely never have a usable open source graphics driver, so I am never buying AMD again. I can get way better video from Intel integrated graphics and those nice Intel open source drivers than I can from a 6-core AMD proc with an SI card. I am done with AMD.

Re:No thanks (0)

Anonymous Coward | about a year ago | (#42867663)

It's true. If you prefer Linux and want full support and performance for your GPU, Intel is pretty much the only ticket.

Re:No thanks (2)

ifiwereasculptor (1870574) | about a year ago | (#42867765)

What do you mean, "never"? It's already usable for 2D. 3D will probably take a while longer, but it's still a very recent card by open driver development standards. Support will probably only get better with time, and I'm hoping that talk on the Phoronix forums about syncing the development of the open drivers with Catalyst for the 8xxx or 9xxx cards will bring us better support.

While I agree with you that right now Intel is the only way to go if you're dead set on using open drivers, making future purchase plans based on one bad experience will probably only cripple your choices in the future. Remember that Sandy Bridge failed to deliver drivers in a timely manner too (it took a couple of months, IIRC, for Linux support - if I had gotten mad back then and decided I would never use Intel graphics again, I'd either have to swallow my unenlightened words or follow them up with suboptimal behavior).

Re:No thanks (1)

drinkypoo (153816) | about a year ago | (#42869841)

What do you mean, "never"? It's already usable for 2D. 3D will probably take a while longer,

"Never" means "certainly not while the card still runs modern software".

but it's still a very recent card by open driver development standards.

But not by any reasonable, objective standard.

Re:No thanks (1)

GigaplexNZ (1233886) | about a year ago | (#42870765)

What do you mean, "never"? It's already usable for 2D. 3D will probably take a while longer, but it's still a very recent card by open driver development standards.

I was under the impression that 3D support was required before 2D on the SI cards, due to them relying on Glamor.

Re:No thanks (0)

Anonymous Coward | about a year ago | (#42868467)

Gallium3D 3D acceleration already works on Southern Islands; build from Git and stop FUDding.

This may not be so bad... (1)

Type44Q (1233630) | about a year ago | (#42867623)

...if it means those of us with Radeon HD 5000 through 8000 series GPUs get a little more life before AMD arbitrarily labels them "legacy" so that it can stop paying its engineers to develop drivers for them (like it recently did with the HD 2000 through 4000 series)... :p

Re:This may not be so bad... (0)

ifiwereasculptor (1870574) | about a year ago | (#42867795)

Yeah. I'm the fearful owner of an HD 5xxx. If we can expect only about five years of support, we're fucked. GPUs should be supported for about ten years, minimum. Especially now, when pretty much any discrete card from the past decade is sufficient for compositing. If they did like Nvidia and released updated legacy drivers whenever Xorg needed them, I wouldn't be pissed. (Having said that, Nvidia refuses to release a fix for the FX and 6xxx lines under Gnome 3/Cinnamon/Unity, which is disconcerting.)

Re:This may not be so bad... (1)

GigaplexNZ (1233886) | about a year ago | (#42870879)

I'm not convinced that paying $50 for a graphics card should qualify you for 10 years of active driver support. Anybody using a card that old doesn't care about performance, which means it's likely to be a low-end model in the first place. AMD has enough financial issues at the moment; funding driver development so users don't need to pay for an upgrade is not in its best interests.

Re:This may not be so bad... (1)

FreonTrip (694097) | about a year ago | (#42870963)

Only five years of support is shitty, no question. Out of curiosity, what are the GeForce FX and GeForce 6 cards doing wrong in Gnome 3? The GeForce 6 cards were just bumped to legacy support last month, but the FXes have been in limbo for a lot longer... and Nvidia's failure to ever release a driver better than beta quality for NT 6 was pretty fucking irritating.

Re:This may not be so bad... (1)

ifiwereasculptor (1870574) | about a year ago | (#42873881)

GeForce FX, 6000 series, and 7000 integrated chipsets draw all kinds of multicolored garbage in GTK3 DEs with open drivers, and with closed drivers they draw garbage and either hang or are unusably slow (not being hyperbolic - I mean actually taking minutes to draw a window). Nvidia acknowledged the issue but stated it's not their problem; support for legacy is only for Xorg ABI changes. Nouveau, on the other hand, is understaffed, receives no official help from them, and has been going through a rewrite, so no progress has been made there either. In fact, things have gotten worse. Look at this bug's report date:

https://bugzilla.redhat.com/show_bug.cgi?id=745202 [redhat.com]

Re:This may not be so bad... (1)

FreonTrip (694097) | about a year ago | (#42889543)

Yeesh. That's almost as heartbreaking as finding an ancient box I'd built for college two years ago, booting it up, installing Slackware, and discovering that no one ever bothered to fix the driver for its Rendition Verite 2200 to support 2D acceleration. At this point I wouldn't hold my breath for a fix; I'd just switch to Xfce and swear oaths of vengeance. It's probably more productive than waiting for this to get fixed on either side of the driver support pool.

Re:This may not be so bad... (0)

Type44Q (1233630) | about a year ago | (#42868423)

Wow... am I really getting modded-down by AMD shills??!

Re:This may not be so bad... (0)

Type44Q (1233630) | about a year ago | (#42868433)

Of course, I could've been unfairly modded-up by nVidia and Intel shills... just the luck of the draw, I guess. :p

Re:This may not be so bad... (1)

Anonymous Coward | about a year ago | (#42868743)

How do you know it was shills? It could have been fanboys.

Re:This may not be so bad... (0)

Anonymous Coward | about a year ago | (#42868787)

Wow... am I really getting modded-down by AMD shills??!

Why is it that whenever some elitist on /. has his opinion modded down, he blames 'shills'? As though AMD is actually interested in your drivel and is paying people to silence you. Did you consider that the label of legacy isn't arbitrary and is based on a pre-determined lifetime? And that they aren't going to continue supporting old technology forever? And that your suggestions to the contrary are moronic and baseless? And why would 'shills' or fanboys mod down your post anyway? You really believe you're that controversial, do you?

Re:This may not be so bad... (1)

Type44Q (1233630) | about a year ago | (#42875321)

Did you consider that the label of legacy isn't arbitrary and is based on a pre-determined lifetime? And that they aren't going to continue supporting old technology forever?

Did you consider that you might not have a clue what you're talking about? Radeon HD 4000 GPUs were shipping in brand new machines until very recently...

Re:This may not be so bad... (1)

drinkypoo (153816) | about a year ago | (#42869847)

Wow... am I really getting modded-down by AMD shills??!

They may not be shills, they may just be fanboys. Happens to me when I tell the truth about AMD/ATI, too.

Re:This may not be so bad... (1)

exomondo (1725132) | about a year ago | (#42876239)

Wow... am I really getting modded-down by AMD shills??!

Meanwhile, at AMD headquarters:
Peterson: Sir, sir!
Rory Read: What is it Peterson?
Peterson: It's terrible! There is a guy...a free thinking radical! On slashdot, he is suggesting we...
Rory Read: We what!?
Peterson: ...we support our products a little longer.
Rory Read: Oh my god! Quickly Peterson, hire some people to get onto this 'slash dot', you must find a way to suppress this person! We need to devote resources to silencing such an opinion!

Re:This may not be so bad... (1)

hairyfeet (841228) | about a year ago | (#42872341)

Would you rather they just kept upping the numbers on the old drivers? They have already squeezed every last drop of performance they are gonna get out of that hardware. Any bugs that show up will still get fixed, but putting out a constant stream of drivers that are really just the same driver with new version numbers makes no sense.

So I honestly don't see what the problem is. I'm gaming with an HD 4850 now and it's fine: games look good, the driver is stable. What more could they do when they have already got it as optimized as it can possibly get on 4+ year old hardware?

Re:This may not be so bad... (1)

Type44Q (1233630) | about a year ago | (#42875297)

I'm gaming with an HD 4850 now and it's fine...

Apparently you haven't tried using the HD 4850 with any kernels higher than 3.4...

Re:This may not be so bad... (1)

hairyfeet (841228) | about a year ago | (#42877119)

You should blame Linus Torvalds for that, NOT AMD. I can take the driver it came with, even though that driver is 4+ years old, and run it right this second on Windows XP/Vista/7, and it runs just fine. Do you HONESTLY think that Linus fucking Torvalds is smarter than the OS developers for BSD, Solaris, Windows, OSX, iOS, ChromeOS, Android, AND even OS/2? Because thanks to stable driver ABIs they can ALL keep older drivers; YOU CAN NOT.

So I'm sorry you have chosen an OS controlled by religious zealots, I truly am. And before you say they are NOT zealots, just read the manifesto put out by the one kernel dev who went on the record about driver ABIs; he even writes "I hope all non free drivers break often!"... I'm sorry, but that is a zealot, when he would rather users have a broken OS than a functional one if it doesn't follow his belief system. And ironically, the one card manufacturer everyone says to buy because their cards work, Nvidia, has NEVER supported free drivers LOL!

So to steal a line from RMS, I'll be glad when Torvalds is gone; maybe then Linux users won't have to deal with such a piss-poor mess of a driver system. Until then, I'm sorry, you deserve what you get. If the free drivers don't work, write a nasty letter to Torvalds; he is the one you should lay the blame upon.

Re:This may not be so bad... (1)

Type44Q (1233630) | about a year ago | (#42878229)

Correct me if I'm wrong, but I was under the impression that this issue isn't directly related to kernels 3.5 and above per se, but rather that AMD is refusing to compile their legacy driver for X.Org 1.13 (which is perhaps indirectly related to the kernel version). Maybe someone can shed additional light on the subject...?

Re:This may not be so bad... (1)

hairyfeet (841228) | about a year ago | (#42887765)

The POINT is that they SHOULD NOT HAVE TO, because the drivers should work fine. They work fine on Windows, they work fine on Mac (for the cards made for Mac, of course), but the ONLY place they do NOT work is on Linux.

At the end of the day, a company should NOT have to constantly recompile its drivers because Linus Torvalds is an egotistical douchebag who THINKS he is smarter than every other OS developer on the planet. The simple fact is he is NOT smarter, and driver ABIs are there FOR A REASON: so you don't have to recompile every time somebody gets a wild hair up their ass and changes something!

Re:This may not be so bad... (1)

ifiwereasculptor (1870574) | about a year ago | (#42891897)

To be completely fair, that's not a just comparison. Windows should be compared to a distribution, not to the Linux kernel. Windows $version is a fixed release, with well-defined ABIs. The same is true for any stable version of Debian or CentOS. What happens is that Linux is in constant development and the myriad distros advance too fast. If we all ran RHEL, we would have absolutely no problem with Xorg ABI changes and drivers. But developers build for the latest libraries, thus the distros have to keep fairly current, and the end result is the mess we find ourselves in. That's why the push for good open drivers is important -- closed source is OK, but it hardly ever keeps pace with free software, resulting in woe for the users. A distribution's work is exactly filtering the chaos and presenting us with harmonious packages, which is easier said than done. Also, remember Vista faced difficulties -- IIRC, because there were almost no drivers for it and the few that existed were mostly crap. Microsoft learned, though, and hasn't broken the driver ABI since.

Though your complaint makes sense, I'm positive there's a practical reason why Xorg and kernel developers often break driver ABIs. I know Xorg 1.11 broke the driver ABI to correct a bug; I don't know about the more recent instances of breakage. If it's a question of organizational culture, as you pose, then Wayland might sidestep it. We can hope, at least.
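To make the "driver ABIs are there for a reason" point concrete, here is a hypothetical sketch (not Linux, X.Org, or Windows code; every name is made up) of what a stable driver contract amounts to: a frozen, versioned table of entry points that the host only ever appends to, so a binary built years ago still lines up.

    #include <cstdint>
    #include <cstdio>

    extern "C" {

    enum { HOST_ABI_MAJOR = 1, HOST_ABI_MINOR = 2 };

    struct DriverOps {
        std::uint16_t abi_major;                 // bumped only on incompatible changes
        std::uint16_t abi_minor;                 // bumped when fields are appended
        int  (*probe)(std::uint32_t device_id);
        void (*set_mode)(int w, int h, int hz);
        // New callbacks may only be appended below; existing offsets never move.
    };

    bool register_driver(const DriverOps* ops) {
        if (ops->abi_major != HOST_ABI_MAJOR) {  // only the major must match
            std::fprintf(stderr, "driver ABI %u.x vs host %u.x: refusing to load\n",
                         (unsigned)ops->abi_major, (unsigned)HOST_ABI_MAJOR);
            return false;
        }
        return true;                             // older minors still load: that's the promise
    }

    } // extern "C"

    // A toy driver "built years ago" against ABI 1.0 still registers today.
    static int probe(std::uint32_t) { return 0; }
    static void set_mode(int, int, int) {}
    int main() {
        DriverOps old_driver{1, 0, probe, set_mode};
        return register_driver(&old_driver) ? 0 : 1;
    }

The Linux kernel deliberately refuses to promise a table like this to out-of-tree modules, and Xorg periodically renumbers its video ABI; that policy difference is what the two sides of this thread are really arguing about.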

Re:This may not be so bad... (1)

Type44Q (1233630) | about a year ago | (#42892135)

they work fine on Windows

What the fuck are you smoking?!

Re:This may not be so bad... (1)

Type44Q (1233630) | about a year ago | (#42892169)

Okay, admittedly Radeon HD 2000 through 4000 drivers work great in Windows (aside from the fact that they require .NET for full functionality), but apparently you haven't been paying attention to the recent fiasco involving their more recent GPUs under Windows.

Anyway, you've made it perfectly clear that you haven't actually looked into the HD 2000-4000/Linux/X.Org issue; it's about time you bowed out, stage left, on the subject...

Wait a week (3)

rrhal (88665) | about a year ago | (#42867647)

AMD announced today that they would have a message clarifying this. Apparently these rumors are not all true.

Re:Wait a week (2)

nschubach (922175) | about a year ago | (#42868749)

I'm just gonna put it on the table...

GTA5 delayed until Sept 17...
Rayman Legends delayed until Sept 17...
Feb 20 Sony PS4 announcement, AMD chips, scaled up production (for Sept release?)

And to top off the wishlist category:
Valve will release a console in 2013... PS4 will be a Steam "Premium" unit.
. /end_wild_speculation

Bummer (0)

Anonymous Coward | about a year ago | (#42867683)

Looks like NVIDIA will continue to be able to charge a ridiculous premium for their 680M because the AMD equivalent performs all over the place in games that the drivers haven't been specifically written for.

Might be better for profits (1)

Subject-17 (2790647) | about a year ago | (#42867781)

So, what I'm hearing is that AMD will be releasing its new line of video cards right around Christmas season, when a lot of people get new systems anyway? I've never understood why nVidia and ATI release their first cards around spring. Sure, get the bugs out early I guess, and there's got to be a bunch of young kids who have summer jobs willing to put all their profit towards a new gaming rig, but I still find it hard to believe that it isn't more profitable to just release the cards around October-ish, maybe even in September so you can still cash in on all the kids who just finished up their summer jobs.

If they really do get that boost in sales from the new console generation, and take this extra time to put forth more powerful competition towards nVidia, this may actually turn things around for AMD. Now, if they would finally release some decent Linux drivers, I might be sold.

Re:Might be better for profits (4, Interesting)

Tr3vin (1220548) | about a year ago | (#42867919)

Typically the supplies dry up pretty quickly during a launch. By selling in the spring, they can go through that initial shortage while they ramp up production, and then not miss out on sales during the holiday season.

failure to fact check (1)

Mashdar (876825) | about a year ago | (#42867929)

AMD has definitively said that it will not be releasing 8000 series GPUs this quarter, and possibly not even this year.... No need for "several unofficial remarks"....

Sigh...... (0)

TheRealQuestor (1750940) | about a year ago | (#42867949)

I was really looking forward to selling my HD 6990 with a waterblock for enough to offset the cost of a new [or a couple of new] 8xxx series card(s), but now it doesn't look so promising. Dangit, dagnabbit, GRRRR.... cry..... I was GOING to eBay it for about 550 bucks for the combo [well worth it], which means a new one would have only cost me about 300 bucks or so, minus the water block. Now it's going to be about a hundred less, and that really does suck.

Re:Sigh...... (1)

Dorianny (1847922) | about a year ago | (#42868371)

I have been looking to get a second-hand graphics card (either a 7970 or a GTX 680) on eBay, and I can tell you that cards with waterblocks typically sell for less than the ones with stock air cooling. There are not a whole lot of people with full-loop watercooling in the first place, and they are a bunch that typically wants the latest and greatest in hardware. The only ones that would be interested in your card are a handful of people looking to add a second 6990 who already have one with the same waterblock. There is one with an XSPC block on eBay at $400 with less than a day to go. It has no bids.

Re:Sigh...... (0)

Omestes (471991) | about a year ago | (#42868711)

Don't take this as an attack; I'm just curious why you actually need an 8000 series card, and why you need water cooling on your present card.

I have a single, stock-cooled, non-OC 5770, and can run pretty much every game on maximum settings (or rather, any game that doesn't choke on AMD GPUs). Why would you need much more, unless you're using your GPU for calculation, or mining bitcoins or something? I used to be a big graphics bleeding-edger, but thanks to everything being tied to ancient console hardware, I pretty much stopped caring. Early in the next console generation I'm sure I'll grab a decent 6000 or 7000, or maybe hop to Nvidia, since I'm getting a bit sick of bad support and game bugs, though it would pain me; I've been using ATI and AMD exclusively for over a decade (actually since 3DFX crapped out; something about underdogs, and not being Intel).

After "hardcore" PC games died off, it was about not having to upgrade for years, but even then, now my aging 5000 series is perfectly fine. The only motivation I can think of is just pure masochistic (I mean it in a nice way) geekery. Because I can.

Re:Sigh...... (1)

Dorianny (1847922) | about a year ago | (#42868877)

I have 2 rigs, one with an OC'd 7870 and one with an OC'd 6970, and neither one of them can run the newest games with full AA at 30+ fps. Unless you are gaming at 800x600 or consider 10fps a playable frame rate, there is no way your 5770 can run "pretty much every game on maximum settings".

Re:Sigh...... (1, Flamebait)

Khyber (864651) | about a year ago | (#42869189)

AA and AF are shit things to concern yourself with.

With those off, every game I play can be maxed out on everything else on my GTX 460 at 60+ FPS. Hell, I can almost reach that on my old 9800GTX+.

And on a 32" 1080p monitor, sitting 5 feet away, using a GPU with a huge chunk of RAM, you don't need to worry about AA or AF. You're not seeing jaggies unless the models suck.

Re:Sigh...... (0)

Anonymous Coward | about a year ago | (#42869251)

For some reason my 7770 got set to automatic high at 1080p for Far Cry 3 and actually ran quite well at a solid 30+.
For some reason it sets Metro 2033 to high at 1080p too; I can't understand why these games autodetect this, yet it was right, playable framerates. And that's for a 100 dollar budget card. He's prolly running 1600x900 on that 5770, which is still a pretty standard res, and prolly medium settings; still darn pretty on most games.

Re:Sigh...... (1)

Omestes (471991) | about a year ago | (#42872537)

... full AA at 30+ fps

That might be it. I keep AA down a notch, since it is the feature with the highest requirements for the smallest effect. I honestly can't tell the difference (in game) between all the new alphabet-soup AAs and the bog-standard AA. I've come to the conclusion that they are largely a marketing thing. Though most of the time I can use whatever FXAA DMAA PPAA WTFBBQAA they have. And generally autodetect throws me into max, at least for the games I play. Perhaps I've saved as well because I don't just do "max"; I turn off things that I find annoying (Bloom. Oh lord. And in some games post-processing, since I hate glowing textures; GW2 is the worst with this).

But I figure, at max settings, with 6x AA, and with bloom or any other horrible lighting effects turned off (not for performance, just because they are hideous), running at 30-40fps is just fine. Granted, I'm not a big FPS guy, or a competitive gamer, so framerate for me is just aesthetics. I don't care if it is above 30 in most games, and as long as it is around 60 in FPSes I'm fine. Anything above 60 is a bit silly on a 60Hz monitor.

Re:Sigh...... (1)

TheRealQuestor (1750940) | about a year ago | (#42869425)

Don't take this as an attack; I'm curious why you actually need an 8000 series card, and why you need water cooling on your present card

I don't take anything as an attack on this site. I really don't care what people think, say, or do :P But the reason I want to sell it is not so much lack of performance, as it is still a really fast card, but worth and age. I've had this one for well over a year and a half now, and one of the games I play hates it, SWTOR, and by hates it I mean HATES it [oh, it still gets 100+ FPS with everything maxed, but it is anything BUT stable :(]. I play all games at my primary monitor's native resolution of 1920x1200 [and sometimes I Eyefinity the 3, but it kinda makes me sick :(] and I still get 100+ FPS in most everything I throw at it. Well, Far Cry 3 kind of hurts when I max everything [about 60ish FPS]. It really is a good card for its age. But that is the thing: I RARELY keep a card for more than a year, and have sold my last 5 or 6 on eBay when it was time to buy a new one [and yes, I always keep the box/stock cooler, etc.]

I like the x990 series because it's a dual card on a single board. It takes one slot and uses one set of power inputs. I've had them since they came out.
I did dual cards for a while, but prefer a single-slot solution.
The reason I water cool is threefold:
1. Because I can
2. Because of the racket the dustbuster fans make
3. Because under load this sucker will push over 200F EASY on the stock air cooler

On water, under 100% load, it maxes out at about 150F, which is about what the stock cooler did at idle, and at idle it rarely goes above 100F.
Pretty much the same reason I have water cooled my CPUs since the days of the AMD MP 1733 [I have water cooled every CPU I have had since then, and I have had quite a few :)]
I just like water cooling and overclocking and fast stuff.

I "used" to keep parts about 6 months then sell them and buy new and back years ago it served me well as the shit was x times faster about every 6 months, now, not so much, so I tend to skip a gen and go with the 2nd release from what I have now. This has been slowing down more and more too and that is fine by me as it costs me much less, and I get much more use out of the parts before they hit ebay lol. I'm still rocking my 1.5 year old i7 2600K @ 5Ghz on a Sabertooth P67. I keep waiting for something to come out so that I can upgrade those TOO but alas, nothing has come out that really smokes what I have now, so no need to upgrade yet. Though I did sell the 16 Gigs of DDR3 1600 ram last month and upgraded to 32 Gigs DDR3 1833 ram and it really helps my virtual machines a lot. Almost as much as tossing them on SSDs [ok not nearly as much, but still it helps :P]

Re:Sigh...... (0)

Anonymous Coward | about a year ago | (#42871283)

Make sure you include the stock air cooler... that will make it more likely to sell at the price you want.

Do they have any engineers left? (2, Informative)

LordNimon (85072) | about a year ago | (#42868373)

I live in Austin. The only thing AMD is known for around here is layoffs. I'm surprised they have any engineers left to work on their products. Why anyone would work for them is a mystery to me.

Re:Do they have any engineers left? (1)

Tagged_84 (1144281) | about a year ago | (#42868683)

Perhaps engineers who wish to get a job would work for them? Those that understand AMD isn't firing people for the lulz?

Re:Do they have any engineers left? (1)

drinkypoo (153816) | about a year ago | (#42869835)

Perhaps engineers who wish to get a job would work for them? Those that understand AMD isn't firing people for the lulz?

Well no, AMD is firing people for the lulz. They hired 'em on the same basis. This is not your father's AMD.

Re:Do they have any engineers left? (0)

Anonymous Coward | about a year ago | (#42874361)

Alas, Intel is nearly the only company not to downsize engineering during the bottom of the economic cycle. AMD shows very well what happens when you cut engineering during lean times between two product generations: instead of having extra money to capitalize on during the next product cycle, you may never have a next product cycle. AMD had a great engineering team it hired away from DEC while Intel was buying them (Intel didn't need more engineers, so it was laying the DEC engineers off), the team that made AMD's Hammer processors, but AMD downsized them and I doubt it can ever rebuild the team. I fear AMD's time in the daylight is permanently over.

Re:Do they have any engineers left? (1)

DarthVain (724186) | about a year ago | (#42883711)

Not sure how they re-organized themselves; however, AMD *was* a CPU maker, not a GPU maker. They bought out the Canadian company ATI, which was nVidia's only real competitor, and eventually rebranded the whole thing as AMD. ATI makes the GPUs. So unless AMD is new to Austin, or they have combined production across locations, likely they are not one and the same. From what I understand, ATI was a pretty cool company.

Waiting for the process shrink (1)

Anonymous Coward | about a year ago | (#42869051)

AMD uses TSMC for its stand-alone GPUs, as does Nvidia. TSMC has been having the greatest difficulty making these very complex chips. Meanwhile, other foundries, like GF, are making great strides in chip technology.

Nvidia and AMD have the choice of going for another round of parts on the same process at TSMC, with only modest improvements at best, or waiting for a 'shrink'. Neither AMD nor Nvidia feels much market pressure at this time, since their high-end parts are already way too powerful for all the current computer games. Both companies also harvest semi-working dies and use them for graphics cards of lower performance. These 'harvested' parts already function like new chips in the market.

The new consoles hitting the market from Sony and Microsoft around autumn will change the situation. While the consoles have GPU hardware significantly slower than the best PC products from AMD/Nvidia, console game companies will at last unleash a new generation of very advanced games, providing an incentive once again to own a powerful gaming PC.

It should be noted that both new consoles have a massive 8GB of RAM, and the trend for future games is open-world: massive, seamless environments. Open-world games are very amenable to render-quality 'sliders', allowing the owners of the most powerful hardware to view the same game in much improved quality. The new consoles mean the PC will never again have AAA exclusive titles, but the ported games will be from two platforms that are both very PC-like in design and ambition.

Powerful PC GPU hardware will set far render distances, high textures, better shadows and lighting, higher framerates, and larger resolutions. These better settings will suck up all the surplus GPU power AMD and Nvidia can offer the games over the next 4+ years, until stagnation hits again.

No more distant clip planes and popups (1)

Anonymous Coward | about a year ago | (#42869383)

Powerful PC GPU hardware will set far render distances

That approach is old hat now. Modern games don't have far clip planes anymore, but render everything to "infinity". Objects just become less distinct with distance, same as in real life.

Guild Wars 2 is a typical example of an MMO with a modern rendering engine. You can stand on a high mountain pass and see everything to arbitrary distances, and objects don't suddenly "pop" into view as you approach like in the bad old days. The technology doesn't even need hot PC machinery -- even an old Core 2 Duo and a positively antique nVidia 9800GT give you a usable framerate.

Game graphics have really advanced a lot in recent years.

Re:No more distant clip planes and popups (0)

Anonymous Coward | about a year ago | (#42869481)

" Modern games don't have far clip planes anymore, but render everything to "infinity"" - oh yes they have. As long as they use OpenGL/DirectX or any other rasterizing API with a Z buffer. "Renderiing to infinity" can be achieved by rendering stuff in many slices, but that results in a performance hit. The fact that the clipping plane is far, doesn't mean that its in infinity... I don't see Blood Tide coast from Shiverpeaks...

Re:No more distant clip planes and popups (1)

Gr8Apes (679165) | about a year ago | (#42870499)

Perhaps that is because even in real life, "rendering" is capped at 17 miles or less on average? Significantly shorter than infinity. As long as the cut-off is beyond the "haze" plane, things will appear to slowly come into focus, and won't "pop".
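For what it's worth, the geometry behind a cap like that is a one-liner (the numbers below are mine, not the parent's):

    % Horizon distance for an observer at height h on a sphere of radius R
    % (Earth: R \approx 6371 km):
    d = \sqrt{(R+h)^2 - R^2} = \sqrt{2Rh + h^2} \approx \sqrt{2Rh}
    % h = 1.7 m (standing):    d \approx 4.7 km  \approx 3 miles
    % h = 60 m (a small hill): d \approx 27.7 km \approx 17 miles

So at eye level the hard geometric cutoff is nearer 3 miles; something like 17 miles corresponds to a bit of elevation, and in practice haze usually caps visibility before the geometry does, which is the parent's point.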

Re:No more distant clip planes and popups (0)

Anonymous Coward | about a year ago | (#42870965)

Go out at night.
Look up.
Focus on the small twinkly bits.
You are seeing things significantly farther away than 17 miles.
(alternately: hitch a ride up onto the ISS and look down)

Just because the atmosphere and curvature of your puny planet limits how far you can see does not imply that God's render-farm is incapable of generating huge sight lines. Man, he can run Crysis at 60fps even! ;-)

Re:No more distant clip planes and popups (1)

Gr8Apes (679165) | about a year ago | (#42871263)

Yes, but there's no real parallax to speak of, so that can be handled by an image or even a single raster pass. So, from a graphics perspective, this could be handled by 1980s technology, even with near field objects moving or appearing to move against the static background.
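That "single raster pass" background is what engines call a skybox: strip the translation out of the view matrix so the backdrop rotates with the camera but never shifts as the camera moves, which is exactly right for objects with no measurable parallax. A sketch with hypothetical types (Mat4 and the commented-out draw calls are stand-ins, not a real engine API):

    #include <cstdio>

    struct Mat4 { float m[16]; };   // column-major; translation lives in m[12..14]

    // Zero the translation, keep the rotation: the sky follows the camera's
    // orientation but ignores its position, so it can never parallax.
    Mat4 skyboxView(Mat4 view) {
        view.m[12] = view.m[13] = view.m[14] = 0.0f;
        return view;
    }

    void drawSkybox(const Mat4& view, const Mat4& proj) {
        Mat4 v = skyboxView(view);
        // In a real renderer: disable depth writes, draw a unit cube (or dome)
        // around the origin sampled with a cube map, re-enable depth writes:
        //   setDepthWrite(false); drawUnitCube(v, proj); setDepthWrite(true);
        (void)v; (void)proj;        // stubbed out; the matrix trick is the point
    }

    int main() { Mat4 v = {}, p = {}; drawSkybox(v, p); std::puts("sky pass done"); }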

Re:No more distant clip planes and popups (1)

WilyCoder (736280) | about a year ago | (#42871511)

We've had skybox/skydome in games for years now.

Can you see a plane at 33,000 feet from the ground? Not the vapor trails it leaves, but the plane itself? 33,000 feet is a lot less than 17 miles. Yeah....

Re:No more distant clip planes and popups (0)

Anonymous Coward | about a year ago | (#42871105)

I'm sure I can see the moon

Underwear Gnome Finds Missing Underpants ! (0)

Anonymous Coward | about a year ago | (#42869701)

Rumors don't *slip*, announced schedules slip.

1.) Publicly traded Company B starts a rumor that publicly traded Company A is going to release a product on date X.
Company A never said that at all, but hey, that's what rumors are -- no one knows who started them. *shrugs*

2.) When Company A does not release on date X, Company B goes on a stealth PR offensive that Company A is *slipping*. Guileless reporters trying to make this week's word count repeat the story about Company A *slipping*, which is picked up and repeated on social media....

3.) Company B ---> Profit.

The thing is, the above scenario is actually *illegal*, since it is against the law to spread false rumors about a company for the purpose of manipulating its stock price.

Now if only we had regulators who weren't caught in a revolving door somewhere....

AMD really truly no longer a player in the desktop (1)

Anonymous Coward | about a year ago | (#42871987)

AMD has also recently said it has no ability nor plans to compete with Intel on high-end desktop processors either. Its top-of-the-line FX-8350 is only modest competition for Intel's midrange.
