
First Radeon HD 8000M GPU Benchmarked

timothy posted about 2 years ago | from the some-good-some-bad dept.

AMD

J. Dzhugashvili writes "As Slashdot noted earlier this week, AMD has a new line of mid-range Radeon GPUs aimed at notebooks. The chips are based on the Graphics Core Next microarchitecture, and they're slated to show up in systems early next year. While the initial report was limited to specification details, the first review of the Radeon HD 8790M is now out, complete with benchmark data from the latest games. The 8790M is about 35% smaller than its 7690M predecessor but offers substantially better gaming performance across the board. Impressively, the new chip has similar power draw as the outgoing model under load, and its idle power consumption is slightly lower. Notebook makers should have no problems making the switch. However, it is worth noting that this new mobile GPU exhibits some of the same frame latency spikes observed on desktop Radeons, including in games that AMD itself has sponsored."
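The "frame latency spikes" mentioned in the summary come from looking at individual frame render times rather than average FPS: a chip can post a healthy average while still hitching visibly. A minimal sketch of that kind of analysis, with invented frame-time numbers purely for illustration:

```python
# Sketch of frame-time (frame latency) analysis in the style of
# frame-time benchmarking. The sample times below are made up.

def percentile(values, pct):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(values)
    rank = max(1, round(pct / 100.0 * len(ordered)))
    return ordered[rank - 1]

# Per-frame render times in milliseconds (invented for illustration).
frame_times_ms = [16, 17, 16, 18, 16, 45, 17, 16, 19, 16, 52, 17]

avg = sum(frame_times_ms) / len(frame_times_ms)
avg_fps = 1000.0 / avg
p99 = percentile(frame_times_ms, 99)
spikes = [t for t in frame_times_ms if t > 2 * avg]

print(f"average FPS: {avg_fps:.1f}")
print(f"99th-percentile frame time: {p99} ms")
print(f"spike frames (> 2x average): {len(spikes)}")
```

The average FPS looks fine here, but the 99th-percentile frame time and the spike count expose the stutter, which is exactly why reviewers report latency percentiles alongside FPS.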


68 comments


First benchmark of Radeon 8000M (0, Redundant)

gagol (583737) | about 2 years ago | (#42366411)

I seriously doubt the reviewer benchmarked the very first board out of the factory...

Re:First benchmark of Radeon 8000M (1)

jones_supa (887896) | about a year ago | (#42368639)

Indeed. The title should probably be "First Benchmark of the Radeon HD 8000M GPU".

Re:First benchmark of Radeon 8000M (1)

Anonymous Coward | about a year ago | (#42369065)

If we're being pedants, then obviously AMD has benchmarked more than a few units themselves.

Maybe First Public Benchmark.

Re:First benchmark of Radeon 8000M (0)

Anonymous Coward | about a year ago | (#42369267)

Reminder: Slashdot !== Journalism

Even AMD thinks AMD CPUs suck (3, Insightful)

Guspaz (556486) | about 2 years ago | (#42366431)

The subject might look like I'm trying to troll, but... I'm actually referring to TFA. AMD sent the TechReport reviewer a Gigabyte Z77 motherboard with an Intel i7-3770K processor. So it says on the first page of TFA.

AMD... sent an Intel processor... to review an AMD GPU...

Talk about lack of faith in your own products.

Re:Even AMD thinks AMD CPUs suck (3, Insightful)

zenlessyank (748553) | about 2 years ago | (#42366461)

I believe AMD licenses its technology to Intel since the Itanium sucked. Intel might know how to fabricate the 'engine' better, but AMD DESIGNS better engines. So AMD is getting paid whether it is an Intel or AMD proc.

Re:Even AMD thinks AMD CPUs suck (2, Insightful)

Anonymous Coward | about 2 years ago | (#42366501)

Actually they cross license.

Re:Even AMD thinks AMD CPUs suck (5, Informative)

cbhacking (979169) | about 2 years ago | (#42366781)

Depends on your definition of "suck". Price-for-price, AMD and Intel are fairly comparable right now (each one is better at some things, sometimes embarrassingly so, but in most cases they aren't far apart). However, Intel's line goes a lot higher than AMD's. A top-of-the-line AMD desktop processor is currently around $200 (less on sale, which isn't hard to find this time of year). A top-of-the-line Intel CPU will run you over $1000, and that's on sale. The 3770K isn't top of the line, but it is well over $300 on sale. Note that that's not including the cost of the motherboards either, which also seem to be higher for Intel chipsets.

To people who want the absolute best performance and money is no problem, Intel is the current king. Since the goal of the benchmarking is to test the graphics processor, they wanted to make sure that the performance wouldn't be CPU bottlenecked.

I'm saying this by way of giving you the benefit of the doubt, but since anybody who pays attention to current benchmarks and hardware prices knew it already, it really does look like you're trolling.

Re:Even AMD thinks AMD CPUs suck (0)

Anonymous Coward | about 2 years ago | (#42366867)

AMD chips always seem to have some irregularity. I recall having a 486 DX4/120 MHz, and some games did not play as smoothly as on the Intel 486 series. Athlons had heat issues. The Phenom X4 cannot even play some Hasbro-published games such as Asteroids and Q*bert; they worked flawlessly on an Intel Pentium, but on the Phenom X4 the framerate chokes at one point and then jerks to catch up again.

Re:Even AMD thinks AMD CPUs suck (3, Insightful)

Rockoon (1252108) | about 2 years ago | (#42366931)

Anyone who has spent some time listing their alternatives within any sort of normal budget margin knows that there will be a lot of AMD chips under consideration and very few Intel chips under consideration.

Re:Even AMD thinks AMD CPUs suck (1)

gagol (583737) | about a year ago | (#42368461)

Problem is, Joe Sixpack goes to a store where the advice he receives comes from a salesman paid partly on commission.

Re:Even AMD thinks AMD CPUs suck (2)

Guspaz (556486) | about a year ago | (#42367501)

I'm not trolling, and I've owned a few AMD CPUs in my day (four, I think? Does the Geode count?), I was just flabbergasted that AMD would send out Intel CPUs to review their GPUs with. I mean, eating your own dogfood is kind of a fundamental thing, and when you do something like this, it sends the message that your own products aren't good enough for the purpose. I chose an inflammatory title to highlight how ridiculous this is.

I would actually argue that AMD only holds an advantage at the extremely low-end, below the $40 pricepoint (you can get a dual-core sandy bridge for about that), and even then only if you don't care about power. I feel bad for AMD, because I owned a long string of AMD processors, and they were fantastic products. I started on the K6-2 and ended on the Athlon XP, and they were all great. With the Athlon 64, they really hit it out of the park, and had a fantastic run, and then... nothing. It didn't help that Intel had a fantastic chip with the Conroe, but AMD had flop after flop that wasn't even up to their own previous standards. They managed to get some of the worst problems under control to produce decent chips again, but by then they had fallen far enough behind Intel that they could only compete on price, and that approach was driving them bankrupt (hence the whole selling off Global Foundries).

I don't WANT Intel to dominate the CPU market, but AMD just isn't a credible competitor anymore. They haven't been able to compete on the performance front for years, their server chips have kept some of the hope alive but at this point need twice as many cores just to stay anywhere remotely performance competitive, their power efficiency hasn't been competitive for ages, and while their APU stuff has turned out some interesting products, they have a pretty limited market since they have better GPU performance and worse CPU performance than comparable Intel products, and Intel's GPU performance got "good enough" for the kinds of uses that you'd find those chips in anyhow.

In terms of competing on the ultra-low end prices, now ARM is starting to creep up in that market, and AMD is being sandwiched in-between. I think they're a more credible competitor to Intel at this point, but with Microsoft crippling Windows 8 on ARM by refusing to allow you to run Win32 apps not made by Microsoft, we're never going to see ARM competing for the desktop or laptop market (unless Windows 8 is a big enough flop itself). We might see ARM make inroads into the lower-end of the market, which could be enough to keep Intel on their toes, but that doesn't really leave much room for AMD...

CPUs take years to go from concept to market, and I really do hope that AMD has something fresh coming that puts them on a competitive playing field with Intel. Being a full process node behind Intel hurts, but the power savings we see from die shrinks on Intel's products aren't enough to make it completely impossible to compete from one process node behind. So I do hope that we can bring some competition to the x86 CPU market in the future, but at the current point in time, AMD doesn't really have anything worth buying over its Intel counterpart, and sending out Intel's products to review AMD's GPUs really isn't helping things.

Re:Even AMD thinks AMD CPUs suck (0)

Anonymous Coward | about a year ago | (#42368063)

I think it's the better CPU to bench the GPU with. AMD admitting they don't own the top end is refreshingly honest, and lets you have faith in them when they say they're delivering in the APU market. Unfortunately, most people probably agree with your viewpoint. =/

Re:Even AMD thinks AMD CPUs suck (1)

Guspaz (556486) | about a year ago | (#42371211)

Are they delivering in the APU market? The CPU performance is poor in their APU products, and the GPU performance is faster than Intel's, but not fast enough to matter. It's still too slow for gaming, and Intel's iGPUs are fast enough for office use.

If their APUs were matching Intel's power/performance, but still kicking their ass on the GPU side, then it might be something special. As it stands, the tradeoffs are just not worth it.

Re:Even AMD thinks AMD CPUs suck (1)

serviscope_minor (664417) | about a year ago | (#42368847)

I would actually argue that AMD only holds an advantage at the extremely low-end, below the $40 pricepoint (you can get a dual-core sandy bridge for about that),

Why? For multithreaded benchmarks, the latest AMD chips seem to slot in somewhere between the i5 and i7 (usually closer to the i7) and sometimes beat the i7 handily. For single-threaded stuff, they're at around 75% of the i5.

And about the same price as the i5, but support better features such as ECC memory.

As far as I can see, unless you're heavily power-constrained or very single-threaded, the AMD processors are very competitive. Many things now (compiling, compression, web browsers, transcoding, encryption, etc.) scale up to 8 threads easily.

Oh, and if you're relying on integrated graphics, then an Ax-based processor will absolutely hammer an ix-based processor.

If you want workstation/server like reliability with ECC, then the AMD processors are very much cheaper given that you need to shell out for the Xeon brand with Intel.

Re:Even AMD thinks AMD CPUs suck (1)

Guspaz (556486) | about a year ago | (#42371231)

Consumers need some light multithreading, but single-threaded performance is still king. ECC is not something consumers care about (enterprise does, sure, but not consumers).

Power matters less in desktops, so using more power to do the same thing is not as big a problem there. But in the mobile space, it's a big problem.

The use cases you're talking about aren't really what a typical consumer or even office machine is used for.

Re:Even AMD thinks AMD CPUs suck (0)

Anonymous Coward | about a year ago | (#42371885)

an Ax-based processor will absolutely hammer an ix-based processor

I see what you did there.

Re:Even AMD thinks AMD CPUs suck (1)

justthinkit (954982) | about a year ago | (#42369595)

A top-of-the-line AMD desktop processor is currently around $200 (less on sale, which isn't hard to find this time of year). A top-of-the-line Intel CPU will run you over $1000

A top-of-the-line Chevy Suburban is currently around $200. A top-of-the-line Dodge Challenger will run you over $1000.

Re:Even AMD thinks AMD CPUs suck (0)

Anonymous Coward | about a year ago | (#42370831)

Your post is so full of errors, it borders on comedic. Did you forget to adjust the prices, or do you know an insane dealer selling top of the line cars for $200 and $1000?

Your analogy fails again when you actually check into it and find that a new top-o-the-line Suburban is $60k, while a top-o-the-line Challenger is $45k.

Re:Even AMD thinks AMD CPUs suck (0)

Anonymous Coward | about 2 years ago | (#42378687)

"A top-of-the-line AMD desktop processor is currently around $200 (less on sale, which isn't hard to find this time of year). A top-of-the-line Intel CPU will run you over $1000

A top-of-the-line Chevy Suburban is currently around $200. A top-of-the-line Dodge Challenger will run you over $1000."

Haven't priced them lately, huh? I can get a new, averagely equipped Suburban considerably cheaper than I can get a Challenger. Looking at both vehicles maxed out with options, they become comparable; you won't find a stock Challenger priced at 5x the price of a Suburban, regardless of the options. Car analogies are fun, but in the future please pick one from the roads of reality.

Re:Even AMD thinks AMD CPUs suck (1)

Kjella (173770) | about a year ago | (#42372121)

Depends on your definition of "suck". Price-for-price, AMD and Intel are fairly comparable right now

We heard the same when AMD tried to sell their FX-8150 for $245 and the customers didn't agree; now their FX-8350 sells for $195. When you have to sell a considerably better processor for $50 less while your competitor's prices are practically unchanged, it's a good hint that the former wasn't particularly good value. I think AMD's reputation for always providing good value has gotten more than a little tarnished at the high end; sure, you can find a good sale, but that's usually inventory they can't move otherwise. And their official roadmap is the current Steamroller cores throughout 2013, so there must be a lot of customers sitting on AMD Phenom IIs waiting for an upgrade that won't come.

Re:Even AMD thinks AMD CPUs suck (1)

cbhacking (979169) | about 2 years ago | (#42374163)

Vishera is enough of a step up that I think there's still hope for AMD. I only own one AMD processor (and at least 3 Intel ones) but the 8350 looks good enough for its price point and I guess I have a certain degree of "support the underdog" here. Nobody wants Intel to be even more of a monopoly than they are right now. Granted that the 81xx series was a huge disappointment, but that doesn't mean the company is automatically doomed to never again be relevant. The original P4s from Intel were crap too...

Also, AMD has apparently hired some really top talent in microprocessor design, and it came late enough in the 83xx design process that the design was already set... but supposedly there's some real hope for the next generation or two.

Re:Even AMD thinks AMD CPUs suck (0)

Anonymous Coward | about 2 years ago | (#42366897)

AMD mobile GPUs end up in Intel laptops. Most AMD laptops have an integrated GPU, and their target audience is different. High-end gaming laptops right now just tend to be Intel.

Re:Even AMD thinks AMD CPUs suck (1)

Guspaz (556486) | about a year ago | (#42367505)

That's not the point. The point is that recommending your competitor's product over your own is never something a company should do, unless you're in full mea culpa mode like the Apple Maps fiasco.

In the marketing world, you're always supposed to at least PRETEND that your products are superior.

Re:Even AMD thinks AMD CPUs suck (0)

Anonymous Coward | about a year ago | (#42368067)

Marketing sucks. It drives away almost anyone aware of the practice.

Re:Even AMD thinks AMD CPUs suck (1)

amorsen (7485) | about a year ago | (#42368217)

In the marketing world, you're always supposed to at least PRETEND that your products are superior.

Only to be refuted by a bunch of reviewers all at once. Do you really believe that is the better outcome?

It would be a bit silly of AMD to let the CPU division sink the GPU division.

Re:Even AMD thinks AMD CPUs suck (1)

glsunder (241984) | about 2 years ago | (#42366995)

It really depends on what you're doing and what you're spending. If your task can use all 8 cores of a piledriver cpu, it's very competitive. I have to wonder if a large part of amd's problem is intel is at 220nm, while amd is still stuck at 320nm. It would take an incredible design to be competitive.

Re:Even AMD thinks AMD CPUs suck (1)

Guspaz (556486) | about a year ago | (#42367517)

I think the fact that AMD's server chips need twice as many cores just to keep up with Intel's parts is kind of indicative of the problem, and I really do hope that they have something competitive in the market. Intel's current products are great, but only because AMD kicked them in the ass with the Athlon 64 years ago. If they go too long without a real competitor in the higher end of the market (beyond where ARM can reach), they'll stagnate.

Re:Even AMD thinks AMD CPUs suck (0)

Anonymous Coward | about a year ago | (#42368211)

It only indicates that you're one of those people who don't really appreciate or understand computer science and business. If AMD can match Intel using less advanced fabs or simpler cores, by putting in more cores, and the result is cheaper... then WTF, why do you care?

They make design decisions that are way beyond "top trumps" fuck-heads like you.

Note: I'm not saying AMD is right. I'm just saying that you aren't even capable of understanding the trade offs when they are explained to you, let alone make those decisions yourself. So shut up.

Re:Even AMD thinks AMD CPUs suck (1)

Guspaz (556486) | about a year ago | (#42371197)

Sigh. I realize you're a troll, but I'll respond anyhow: getting half the single-threaded performance limits the usefulness of your CPUs. I haven't seen any reviews for piledriver-based Opteron chips (they're relatively new), but comparing the previous gen Opteron against the Sandy Bridge Xeons, "keep up" still means they're falling behind the top Xeon chips in multi-threaded performance, they're way behind in single-threaded performance, and they use a ton more power.

Re:Even AMD thinks AMD CPUs suck (0)

Anonymous Coward | about a year ago | (#42371595)

For. Doing. What.

You don't seem to understand that not everyone wants to run CoD on their ninja PC. Some people, like enterprise customers, might want to do something useful... where 8 cores is actually a performance plus.

But anyway.. thanks for proving with your response that you have no fucking idea... as the original reply mentioned.

Re:Even AMD thinks AMD CPUs suck (1)

cbhacking (979169) | about a year ago | (#42367625)

Nitpick: 22nm, 32nm. You may be thinking of Angstroms, which are 1/10 of a nanometer. Also, AMD has 28nm in their GPUs; I'm not sure why their CPUs are still using a 32nm process.

Re: Even AMD thinks AMD CPUs suck (0)

Anonymous Coward | about a year ago | (#42368781)

Probably because moving all your products to a new, relatively 'un-debugged' geometry at the same time is foolhardy.

Re:Even AMD thinks AMD CPUs suck (1)

glsunder (241984) | about a year ago | (#42370351)

yeah, you're right. I guess I'm too old.

Re:Even AMD thinks AMD CPUs suck (1)

Noishe (829350) | about a year ago | (#42367809)

Wouldn't AMD be targeting the 8000M for intel boards? If you're going to get an AMD cpu and integrated graphics, they want you to go for a Trinity solution.

Re:Even AMD thinks AMD CPUs suck (1)

Guspaz (556486) | about a year ago | (#42371201)

AMD's APUs are faster than Intel's iGPUs, but much slower than discrete chips. Why shouldn't you have an AMD CPU with a discrete GPU in the notebook market?

Re:Apples to Oranges comparisons suck (1)

Anonymous Coward | about a year ago | (#42368889)

If you want the reporters to do fancy graphs comparing performance with each different component, using an Intel chip would be the only way to go. Now the reporter can show the performance difference between the new AMD card, an Nvidia card, and Intel's HD4000 which may as well be shown as the baseline.

It also shows that AMD hasn't tinkered with their GPU architecture to favor their own CPUs over competitor's.

Disclaimer: I am using an AMD GPU on an Intel CPU system.

Re:Apples to Oranges comparisons suck (1)

Guspaz (556486) | about a year ago | (#42371171)

Benchmarks of the HD 4000 would have been useless: AMD sent a desktop chip, so its performance would not be representative of the mobile HD 4000's. AMD could just as easily have sent an AMD CPU for an apples-to-apples comparison.

Re:Even AMD thinks AMD CPUs suck (1)

lsatenstein (949458) | about 2 years ago | (#42376309)

The subject might look like I'm trying to troll, but... I'm actually referring to TFA. AMD sent the TechReport reviewer a Gigabyte Z77 motherboard with an Intel i7-3770K processor. So it says on the first page of TFA.

AMD... sent an Intel processor... to review an AMD GPU...

Talk about lack of faith in your own products.

===
AMD would surely have their products run on in-house stuff. They would also want to show that if you had an i7 or whatever Intel processor (Atom excluded, I guess), the GPU would run well too.

Re:Even AMD thinks AMD CPUs suck (0)

Anonymous Coward | about 2 years ago | (#42424667)

Dude, they're taking an Intel processor for checking AMD GPU performance because AMD also knows that Intel makes the fastest CPUs, and they want to show people how fast the AMD GPU is. People like you are always like a crow that, whenever it speaks, only sounds like a crow. So it's better to keep your mouth shut if you don't know anything, NOOB!!!!!

I guess this is how x86 will continue (0)

erroneus (253617) | about 2 years ago | (#42366479)

When people are looking to get better performance, they seek the processing power of other processors. Yes, I know "GPUs are optimized for [blah blah blah]," but in the end, they are still processors and are efficient at what they do. x86 is just not so efficient, but we've got all this legacy crap... and why? Because the software business liked to keep the sources to themselves, so we need to keep our x86 processors. If everything were under Linux and we wanted to move to a better-performing processor? Recompile the kernel, recompile the OS support, recompile the environment and shells, recompile the applications. There's a whole distro based on the idea of compiling everything from source, and it's still quite popular.

But we want more performance. Yeah.. better for games... but better for many other things too eh? Bitcoin mining? Decryption? x86 can do these things too but if the advantages of a better architecture were implemented in x86, it would break compatibility most likely.

Anyone remember back in the day when the x87 co-processor was the way to boost the performance of your machine? How the 486 was the combination of a 386 and 387? People lusted after the 486 when it was announced. They could do that again, I suppose, but it would be late... too late. And now Intel will suffer for its failure to keep up; the CPU will only be used for some stuff while the real processing goes on in graphics cards. Brilliant.

Re:I guess this is how x86 will continue (1)

adolf (21054) | about 2 years ago | (#42366861)

How the 486 was the combination of 386 and 387?

No. A 486 is not a combination of a 386 and a 387.

Re:I guess this is how x86 will continue (0)

Anonymous Coward | about 2 years ago | (#42367067)

A combination of a 386 and 387 would be a 386DX.

Re:I guess this is how x86 will continue (1)

erroneus (253617) | about a year ago | (#42367593)

Uhm no.

http://en.wikipedia.org/wiki/Intel_80386#The_i386SX_variant [wikipedia.org]

But the 486SX was not lusted after at its announcement; the 486 was. Yeah, sure, there were additional improvements. Is it really necessary to add all the details? Does omitting them invalidate my point?

john b wilcox/erroneus = expert on being FAT (0)

Anonymous Coward | about 2 years ago | (#42416251)

Erroneus/john b wilcox: When you eat, is your dish a wheelbarrow, your fork a pitchfork, and spoon a shovel or what http://slashdot.org/comments.pl?sid=3345911&cid=42414637 ? Does your bed use chevy truck coil springs and struts to hold your fat ass off the floor too? Hahahaha. No wonder you said this "Oh... to eat pizza again..." by erroneus (253617) on Saturday December 22, @05:20PM (#42371769) from http://slashdot.org/comments.pl?sid=3335159&cid=42371769 you disgustingly fat hog.

Re:I guess this is how x86 will continue (2)

ghinckley68 (590599) | about a year ago | (#42367643)

Sorry, you're wrong. The SX used a 16-bit data bus and the DX used a 32-bit data bus. No 386 had an FPU.

Re:I guess this is how x86 will continue (0)

Anonymous Coward | about 2 years ago | (#42366933)

That only held true for the 486DX, as the 486SX had either a defective or disabled FP unit.

Re:I guess this is how x86 will continue (0)

Anonymous Coward | about a year ago | (#42367821)

I used to own a Cyrix 486SLC/66. Man, was that ever a slow piece of shit.

Re:I guess this is how x86 will continue (2)

glsunder (241984) | about 2 years ago | (#42367023)

The 486 was the first x86 CPU that:
was pipelined
had on-chip cache (8 KB)
had a built-in FPU (387)

Basically, they took concepts that were being done in risc processors and used them in the x86 world.

Following up... the Pentium brought superscalar design and, IIRC, a pipelined FPU. The Pentium MMX brought integer SIMD. The Pentium Pro brought out-of-order design.


Joseph Stalin is dead (2)

Nimey (114278) | about 2 years ago | (#42366757)

You can't fool me, submitter!


Does it matter? (1)

Osgeld (1900440) | about 2 years ago | (#42367295)

My old GeForce 9600 GT and my slightly less old GTS 250 play every (shitty Xbox 360 port) game no problem, at higher resolution than any notebook provides, with little stress. I seriously doubt Facebook, or shit, even SolidWorks (which runs like butter on a mobile Intel chip), gives a shit.

Software has once again peaked, and stagnated for a half decade, while hardware is running nuts for no real reason

Re:Does it matter? (2)

tibman (623933) | about 2 years ago | (#42367437)

You need some new games : )

Re:Does it matter? (-1)

Anonymous Coward | about a year ago | (#42368055)

New games suck in terms of entertainment per buck. They might require a $200+ GPU and a $200+ CPU, they cost $50+ themselves, their DLC can cost $100+, and after all that they won't necessarily give any more entertainment than a free Flash-based game.

Re:Does it matter? (2)

tibman (623933) | about a year ago | (#42370643)

I've got three for you to try out.
1) Natural Selection 2 for $25 (with no deal) on Steam.
2) DayZ, a free mod for Arma II: Combined Operations (Arma 2 + expansion) for $30 (with no deal) on Steam.
3) A slightly older game called Metro 2033 for $20 on Steam.

You can find most of these games 50% off during sales, like the one going on right now. Though if you'd rather sit on the floor and play with a bit of string... have at it.

Re:Does it matter? (1)

Ambassador Kosh (18352) | about 2 years ago | (#42373579)

Get the humble bundle and dungeon defenders for $6. I have had and continue to have a lot of fun with that game.

Re:Does it matter? (1)

fyngyrz (762201) | about a year ago | (#42367815)

while hardware is running nuts for no real reason

I have project builds (in C) that take many minutes to complete on an 8-core, 3 GHz machine with many gigs of memory available. I have at least one application that consumes all eight cores just to run -- and yes, it's written efficiently. I have others that consume a core or two... and less would be better. I *do* multitask. As far as I'm concerned, neither software (C compiler and linker, in this case) nor hardware is anywhere *near* where I'd like it to be. Your assertion of "no reason" strikes me as ludicrous.

I rather suspect that software will be coming down the pike that can use far more than even I'd like to see... when AI gets here (and yes, I don't consider that to be in any doubt whatsoever, nor is there any significant indication we'll need anything more than high powered VN architecture to do it), "more, faster" will be of benefit no matter how much of it there is.

Expert systems -- not intelligent, but just expert -- can develop indefinitely from right where we are, and the faster they are, and the more data they can get at in a short period of time, the better they will be.

It seems just wantonly blind to say that there is "no real reason" for these improvements.

If all you're thinking about is games... then you're simply not looking at enough of the picture to suggest a decently thought out answer (and even so, I bet there are game designers all over the place who will tell you they'd like more power, more memory, more textures, higher level engines, etc.)

Re:Does it matter? (1)

drinkypoo (153816) | about a year ago | (#42368029)

while hardware is running nuts for no real reason

I have project builds (in C) that take many minutes to complete on an 8-core, 3 GHz machine with many gigs of memory available. I have at least one application that consumes all eight cores just to run -- and yes, it's written efficiently. I have others that consume a core or two... and less would be better. I *do* multitask. As far as I'm concerned, neither software (C compiler and linker, in this case) nor hardware is anywhere *near* where I'd like it to be. Your assertion of "no reason" strikes me as ludicrous.

It's too bad you couldn't use an example that includes a video card, since that's what we're talking about right now. People out there are buying video cards that consume more power than their entire computer system including the display, and for what? For most of them, nothing. GPGPU tools for the average user are essentially nonexistent, and you can only perceive so many FPS.

Re:Does it matter? (1)

fyngyrz (762201) | about a year ago | (#42369917)

I am using examples that include video cards (4 of them, in my case.) My OS uses video cards to accelerate mainline processes; furthermore, this is done through a standard system mechanism and is relatively easy to incorporate in a considerable range of code; end users see benefits commensurate with the graphics engines they have installed. The processors in graphics cards are specialized for certain types of operations that can be very useful, and in use will significantly outperform, general purpose CPU instructions.

It really doesn't matter (other than as a market force) if gamers are buying cards that can handle more display activity than they're throwing at them. What matters is if these cards (and CPUs, for that matter) are overpowered in all places they are utilized, and the answer to that is flat-out no.

Re:Does it matter? (1)

gmhowell (26755) | about 2 years ago | (#42373159)

I am using examples that include video cards (4 of them, in my case.) My OS uses video cards to accelerate mainline processes; furthermore, this is done through a standard system mechanism and is relatively easy to incorporate in a considerable range of code; end users see benefits commensurate with the graphics engines they have installed. The processors in graphics cards are specialized for certain types of operations that can be very useful, and in use will significantly outperform, general purpose CPU instructions.

It really doesn't matter (other than as a market force) if gamers are buying cards that can handle more display activity than they're throwing at them. What matters is if these cards (and CPUs, for that matter) are overpowered in all places they are utilized, and the answer to that is flat-out no.

I'm sensing that you use a BSD-derived OS...

Re:Does it matter? (1)

drinkypoo (153816) | about 2 years ago | (#42374775)

My OS uses video cards to accelerate mainline processes; furthermore, this is done through a standard system mechanism and is relatively easy to incorporate in a considerable range of code; end users see benefits commensurate with the graphics engines they have installed.

The problem is, it sucks, because you have to send data to the GPU to be processed. If you had more CPU, you wouldn't have to do that. GPGPU is the wrong answer to the problem; making a CPU that looks more like a GPU is a better one. GPGPU only exists because of a bunch of gamers who keep buying faster and faster parts. If they weren't doing that, people would spend that effort figuring out how to make CPUs faster instead. So instead of just getting faster CPUs that any program can use with a mere recompile, we get power segregated away in a GPU, which we have to use for graphics while we're doing graphics, and which we have to use additional libraries and drivers to use while we aren't. And if your GPU is too old (say, mine does 1.2 and many current projects require 2.0) or it's from the other brand, then it won't work at all right now, and even when everyone does support OpenCL there's going to be compatibility issues.

Re:Does it matter? (1)

Anonymous Coward | about a year ago | (#42370317)

Certainly an ATI 4870 can play all current games well at 1280x1024 if the visual settings are adjusted a little. By "adjusted" I don't mean removing the 'eye candy' -- just turning down or off some options that exist purely to burn off the obscene performance available with today's high-end cards.

Only those who play at much higher resolutions (largely silly, given the games derive from console version assets optimised for much lower resolutions) need much newer cards.

The situation for older GPU owners has actually improved over the last two years or so. Why? Because games developers have learnt to be vastly more efficient in their use of the GPU, as they seek to eke out even more performance from the 360 and PS3. Only when the new consoles are released (end of 2013) will the situation change. Even then, the new consoles are due to have quite modest GPU capability compared to the best discrete GPUs on sale at the time of release.

Put simply, save for the freaks who need to push every game setting to max no matter how little it improves the experience, PC gamers will be able to get away with using cards further and further behind the 'bleeding edge'.


radeon hd8000m (0)

Anonymous Coward | about a year ago | (#42367961)

Interesting. Let's hope the price will be affordable.

Re:radeon hd8000m (0)

Anonymous Coward | about a year ago | (#42368125)

Pretty sure the price will be free. Who the fuck charges for pricing?

AMD Lost their Way? (0)

Anonymous Coward | about a year ago | (#42369367)

This shows how much AMD has lost its ability to execute.
Over a year late, and still not in a laptop. If they could execute on their great vision, we would have seen this product a year ago, and today we'd be reading reviews of it in a laptop.

Where are you today AMD? We need competitive balance to Intel and Nvidia.

IT'S OVER 9000 (0)

Anonymous Coward | about 2 years ago | (#42374971)

Does this mean the desktop GPUs are going to be OVER 9000!?
