
AMD Plans 1,000-GPU Supercomputer For Games, Cloud

Soulskill posted more than 5 years ago | from the still-won't-run-crysis dept.


arcticstoat writes "AMD is planning to use over 1,000 Radeon HD 4870 GPUs to create a supercomputer capable of processing one petaflop, which the company says will make 'cloud' computing a reality. When it's built later this year, the Fusion Render Cloud will be available as an online powerhorse for a variety of people, from gamers to 3D animators. The company claims that it could 'deliver video games, PC applications and other graphically-intensive applications through the Internet "cloud" to virtually any type of mobile device with a web browser.' The idea is that the Fusion Render Cloud will do all the hard work, so all you need is a machine capable of playing back the results, saving battery life and the need for ever greater processing power. AMD also says that the supercomputer will 'enable remote real-time rendering of film and visual effects graphics on an unprecedented scale.' Meanwhile, game developers would be able to use the supercomputer to quickly develop games, and also 'serve up virtual world games with unlimited photo-realistic detail.' The supercomputer will be powered by OTOY software, which allows you to render 3D visuals in your browser via streaming, compressed online data."


Wow, streamed 3D games.. (0, Redundant)

Malawar (674186) | more than 5 years ago | (#26395289)

Awesome.

Apple Computer, The Homosexual's Favorite (0, Offtopic)

Anonymous Coward | more than 5 years ago | (#26395301)

It's funny that I had only recently been thinking and praying about these queerest of computers. It certainly is true that Apple computers are very popular amongst the homosexual communities, the fact that these computers are so popular indicates the depths to which our great nation has sunk to.

The Apple corporation logo is naturally an apple with a bite taken out of it. Is it not a coincidence that Eve tempted Adam with an Apple? The apple is a symbol of defiance against God, and was an obvious choice for a company whose primary objectives include the liberalisation of all media, and which activly finances the political party that hates God.

When I first saw an apple computer (called a Mac, after the popular fast food product) with it's "fruity" design, I had assumed that it was some kind of obsolete product aimed at latte sipping east-coast homosexual designers. This initial observation turned out to be only half true:

The apple computers are not as obsolete as their gaudy designs suggest - the Apple computer company, based in that Sodomite Central, Cupertino CA, have invested a great deal of money in keeping up with more mainstream American PC brands like Dell or IBM, however rather than compete on computing power, practicality or ease of use the Apple company prefer to emphasize "eye-candy". If you are the sort of person who loves nothing more than gazing for hours at an aquarium full of brightly colored fish, then the feeling of using an Apple desktop will be most familiar.

Note the oddly-shaped apple-mouse. Unike modern computer mice, the Apple product has only one button. This is because historically Apple computer failed to license the patent for including buttons on mice. Since most apple computers are used as children's toys, their homosexual owners have barely noticed this deficiency, they are too busy thinking about sodomy worry about their computer's obvious deficiencies.

Windows appear to swim around, distorting and melding into the "dock", with almost psychedellic fluidity. Parts of the desktop become inexplicably transparant, and then return to normal or else swirl into oblivion. Control over windows is achived not through familiar buttons (like Window's "X"), but candy colored blobs, which are designed to remind the user of "Extasy" tablets. I suspect that the Apple design team must have been doing more drugs than the average touring funk-band.

The Apple OSX platform is missing a large number of common and esential productivity tools commonly used on the Windows platform. For example the endearing BonziBuddy can only be found on Windows, and therefore will only run on a Mac that has been upgraded to Boot-Camp and Windows. I suspect that this is exactly what most Mac-owners will feel forced to do.

Naturally, the big question is, does the "alternative lifestyle" approach to computer design really pay-off for the people who count: The Users?

I think the answer is no. Having used computers all my life, I consider myself an expert in the day to day tasks of computing. The Microsoft Windows operating system makes installing, uninstalling, defragmenting, and removal of viruses and spyware trivially easy. It's a shame that the Apple company (who unbelievably are much praised for their interface design) had not thought to make these everyday tasks simpler.

As I have pointed out on a number of occasions both Linux and AppleMac fail to unclude a disk-defragmentor, a personal firewall, a standard method for installing or removing software or even a system repair utility. Microsoft introduced all of this in their epoch-making "Windows Me" edition. Linux users have had to get used to the lack of these essential productivity tools, however Linux is universally acknowledged as a cheap immitation of Windows. Mac on the other-hand is marketed as a full-price premium product.

Apple computers come preloaded with iTunes which only works with Apple's oddly-coloured iPod. The Apple Mac cannot run the more popular "Windows Media Player", and is therefore incompatible with Microsoft DRM or the wildly popular Microsoft Zune. This seems quite unfair to me, and is most probably an illegal monopoly.

Finally, we should also ask ourselves - is the Apple Mac good value for money? Superficially this may seem to be the case - Apple try to match price-points with Dell on a range of products, however the clues are in the small-print. All Dell products include the industry standard Windows Vista as standard. Dell ensure that each computer comes with an operating system, without which the computer could not function. Apple computers are still bundled with OSX, an attractive but aging operating system based on the very old UNIX, a technology developed by SCO group in the early 70s. This is the the same technology which Linux developers were recently accused of stealing.

Are apple aware of thier obvious limitations? We think they must be - A couple of years ago they released a product that most shrill-voiced liberal Apple pundits believed was impossible: It's called "Boot Camp" - a utility that upgrades any recent Apple computer to be compatible with the industry standard Microsoft Windows. Industry insiders now believe that this release heralds Apple Computer Corporation's exit from the software business. For once, I'd have to agree with Apple - this would be a sensible way to preserve shareholder value.

Apple computer make a big deal out of the claim that their absurdly lurid products are "Designed in California", however a close inspection reveals that just like Linux, they are made in the Republic of China. Christians and Patriots should rather invest in an IBM ThinkPad, which is both designed and made in the USA.

Customers should also consider the moral aspects of buying an Apple computer. One reason for the queerness of Apple's products is that the company's board of directors includes Albert Gore - yes, the same Al Gore whose doom and gloom environmental cassandra-complex is intended to distract America from it's real foes (the Islamofascists and Homosexuals). Apple has historically been a major backer of the Democratic party, and both Bill and Hillary Clinton, not to mention Osama Bin Laden are avid Apple Mac users.

The simple and sad fact is that if you buy a Mac or an iPod you are funding immorality. You are helping to finance the secularists who are ruining America.

Re:Apple Computer, The Homosexual's Favorite (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26395331)

You're too obvious, try harder.

Re:Apple Computer, The Homosexual's Favorite (0, Redundant)

Surt (22457) | more than 5 years ago | (#26395403)

Wow ... the effort that went into writing that boggles the mind. I mean, I feel bad wasting 5 seconds of my life typing up this reply.

Re:Apple Computer, The Homosexual's Favorite (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26395427)

Meh. The negrophobic essay was better.

Re:Apple Computer, The Homosexual's Favorite (-1)

Anonymous Coward | more than 5 years ago | (#26395449)

That coward just masterfully trolled and flamebaited (ahem) at least half a dozen issues, and most of his rant was completely false and utterly misinformed.

He deserves some sort of award, seriously.

Re:Apple Computer, The Homosexual's Favorite (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26395837)

You read it? Why?!

Re:Apple Computer, The Homosexual's Favorite (0, Troll)

hairyfeet (841228) | more than 5 years ago | (#26396105)

Yeah, give that little troll feller credit! I mean he put down that Mac users would "upgrade" to Windows for BonziBuddy! Now damn it THAT'S funny!

Re:Apple Computer, The Homosexual's Favorite (0, Redundant)

Surt (22457) | more than 5 years ago | (#26396321)

Ok metamods, nuke from orbit for the 'redundant' mod please.

Re:Apple Computer, The Homosexual's Favorite (-1, Offtopic)

Anonymous Coward | more than 5 years ago | (#26395485)

tl;dr

Seriously . . . ADD, k?

Re:Apple Computer, The Homosexual's Favorite (1, Informative)

Anonymous Coward | more than 5 years ago | (#26396029)

FYI - this troll was stolen directly from http://shelleytherepublican.com/ [shelleytherepublican.com] The whole site is a satire, like a raving mad blog version of Stephen Colbert.

Read that way, it's pretty funny.

Re:Apple Computer, The Homosexual's Favorite (0, Flamebait)

painehope (580569) | more than 5 years ago | (#26396047)

Hey, slick, I hate to break this one to you...but drugs, homosexuality, and computers have nothing that intrinsically links them together. They are variables (what I call "things" when I'm speaking in layman's terms) in a person's lifestyle (or you can view that as a set called P for person, or an object of the class Homo Sapiens).

Now, I personally enjoy the hell out of drugs, sex, and computers (preferably supercomputers, which is why I'm reading this damn article) - in which order depends on which day of the week it is (today's Friday, so it's drugs and computers; tomorrow's Saturday, so it's sex and drugs). The fact that I enjoy sex with women (preferably more than one, but I think the average /.er is happy if they can get one gal; word of advice to fellow geeks - date bisexual women...it's a lot of fun) doesn't necessarily mean I have anything against gays. To each their own.

And, for the record, I'm a dedicated ThinkPad user (despite the crappy ATI drivers that don't work worth a damn unless you're using a specific version of X) and a patriot (in the sense of Anti-Flag's song "Red, White, And Brainwashed" [azlyrics.com] : "If you don't fight to make things better, then you're the one betraying the country". Though they need to get off their high horse with the word "Aryan". No one is perfect I guess...).

You, my surprisingly lucid friend, should probably check in with your doctor. Schizophrenia is hard to treat, so we'll bear with you while you do that. Check back in when you have something sane to say.

Myself? I'm off to post my original comment. I was just blown away by the sheer psychotic doggedness it took to post what you did (as I had believed that to be solely my province).

Re:Apple Computer, The Homosexual's Favorite (0)

Anonymous Coward | more than 5 years ago | (#26396205)

YHBT. YHL. HAND.

Re:Wow, streamed 3D games.. (0)

Anonymous Coward | more than 5 years ago | (#26395351)

The company claims that it could 'deliver video games, PC applications and other graphically-intensive applications through the Internet "cloud" to virtually any type of mobile device with a web browser.'...

... with a sustained 1 Gbit/sec (1280*1024 * 24 bit * 30 frames/second) connection with a latency of less than 100 ms. Good luck with that.
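For the curious, the raw (uncompressed) bitrate arithmetic is easy to check; a quick sketch, where the resolutions and frame rates are just illustrative examples:

    # Uncompressed video bitrate = width * height * bits_per_pixel * frames_per_second
    def raw_bitrate_mbps(width, height, bpp=24, fps=30):
        return width * height * bpp * fps / 1e6

    print(raw_bitrate_mbps(1280, 1024))                # ~943.7 Mbit/s -- the ~1 Gbit/s figure above
    print(raw_bitrate_mbps(800, 600, bpp=8, fps=25))   # ~96 Mbit/s -- the uncompressed case discussed further down
    print(raw_bitrate_mbps(320, 240, fps=15))          # ~27.6 Mbit/s even for a small phone screen

Any practical version of this would obviously be compressed, which is the point several replies below get into.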

Re:Wow, streamed 3D games.. (1)

Anpheus (908711) | more than 5 years ago | (#26395611)

Compression is, after all, for losers.

Re:Wow, streamed 3D games.. (0)

Anonymous Coward | more than 5 years ago | (#26395841)

True, especially considering the lack of CPU power and memory to decompress anything on a cellular phone.

Re:Wow, streamed 3D games.. (1)

Anpheus (908711) | more than 5 years ago | (#26396237)

My cellphone has an OpenGL ES rendering engine, as do many of the new generation of smartphones.

Despite that, I'm willing to bet the problem with this cloud computing engine won't be the bandwidth, assuming they get that worked out, but the latency to the display. It's bad enough playing online and having lag issues. But now I have to wait for my screen to update?

Latency? (5, Insightful)

hax0r_this (1073148) | more than 5 years ago | (#26395385)

Try downloading a picture at your phone's native resolution (a screenshot of a 3d game taken on your phone would be ideal). It will take at least that long for a "game" to respond to your input on this system.

And I doubt that streaming a 3d rendering will really save much battery either considering all the network activity.

Re:Wow, streamed 3D games.. (1)

davester666 (731373) | more than 5 years ago | (#26395605)

Um, I need two of these setups, so I can finally play Crysis with my friend.

Re:Wow, streamed 3D games.. (4, Funny)

zippthorne (748122) | more than 5 years ago | (#26395695)

What does your friend use?

Hmm... (0)

Anonymous Coward | more than 5 years ago | (#26395309)

I remember an article being posted here about GPU accelerated cracking of WPA implemented with CUDA. This is why my favorite wireless encryption scheme remains: If you're able, use a cable.

Oh Yeah? Well..... (5, Funny)

Todd Fisher (680265) | more than 5 years ago | (#26395317)

Intel Plans 2,000-GPU Supercomputer For Games, Lightning

Re:Oh Yeah? Well..... (-1, Redundant)

Anonymous Coward | more than 5 years ago | (#26395377)

Intel Plans 2,000-GPU Supercomputer For Games, Lightning

I can finally run Crysis!!!

Re:Oh Yeah? Well..... (0, Flamebait)

novalogic (697144) | more than 5 years ago | (#26395949)

Not with Intel GPUs you won't.

Re:Oh Yeah? Well..... (3, Funny)

Anonymous Coward | more than 5 years ago | (#26395853)

2000 Intel GPUs?? Well, that's like a Radeon 3650, right?

Re:Oh Yeah? Well..... (0)

Anonymous Coward | more than 5 years ago | (#26396357)

Did I ever tell you I was struck by lightning 7 times? Once I was just minding my server farms...

Uhm, bandwidth? (5, Insightful)

Taibhsear (1286214) | more than 5 years ago | (#26395327)

Even if the "work" is offloaded to the cloud, won't you still need an assload of bandwidth on said devices for it to actually amount to anything? It's not like you're going to get PCI Express bandwidth over a DSL or cable internet connection.

Re:Uhm, bandwidth? (4, Insightful)

megaditto (982598) | more than 5 years ago | (#26395475)

We can already stream DVD-quality movies encoded at 1 Mbps or so, well within the current consumer "broadband" offerings. I'd assume that would be in the target range.

But even if you wanted for some reason to go uncompressed, then 8-bit 800x600 at 25 fps would still be less than 100 Mbps, which is not totally unreasonable.

I would imagine the latency would be a much bigger problem than bandwidth. If you ever used VNC you probably know what I mean.

Re:Uhm, bandwidth? (5, Informative)

Frenchman113 (893369) | more than 5 years ago | (#26395653)

We can already stream DVD-quality movies encoded at 1 Mbps or so, well within the current consumer "broadband" offerings.

No, we can't. Of course, if you've been fooled into thinking that scene crap is "DVD quality", then perhaps this holds true. Otherwise, you would realize that not even H.264 can deliver DVD quality video (720x480, no artifacts) in less than 1 Mbps.

Re:Uhm, bandwidth? (0)

Anonymous Coward | more than 5 years ago | (#26395905)

Technically, 192 Kbit/s MP3s cannot deliver CD quality either.

Videphile-quality cables. (5, Funny)

megaditto (982598) | more than 5 years ago | (#26395973)

Yes, you can. You need to use the correct ethernet cables with high-level tin alloy shielding and vibration elimination: http://www.usa.denon.com/ProductDetails/3429.asp [denon.com]

Re:Videphile-quality cables. (2, Funny)

Megatog615 (1019306) | more than 5 years ago | (#26396215)

You're sure that's not a Monster Cable rebrand?

Re:Videphile-quality cables. (1)

mcrbids (148650) | more than 5 years ago | (#26396697)

Holy fsck. $500 for a 5 foot long ETHERNET CABLE!?!!? For the "serious audiophile"?!?!?

(Um, hello? It's DIGITAL?!?!)

Goes to show, there really IS a sucker born every minute, but at these prices, they'd make out like bandits if they only made 1 sale/week...

Re:Uhm, bandwidth? (1)

painehope (580569) | more than 5 years ago | (#26396171)

Um, yes, everyone.

Bandwidth. That precious commodity.

Obviously, they're going off one or more of these assumptions/instances:

1) They have designed one hell of a compression algorithm. The OTOY site has between fuck-all and nothing on it, and the domain is relatively new (which doesn't say much - if some bright spark at AMD developed a mean compression algorithm that isn't overwhelmingly intensive, and s/he split off, then it would be new).
2) Mobile bandwidth will be making a fantastic leap at roughly the same time as this system is implemented (not an unreasonable assumption - I've done a little bit of work with long-range wireless - it was directed, not broadcast - and can tell you that it's coming faster than most people think, just not this year).
3) Wired, consumer-grade bandwidth will be making a similar leap. This falls under the "yeah, so what?" category. We all know it. Now if someone could explain to the telco execs and the general public why we should be planning/upgrading our infrastructure (both for this and in general - the last hurricane that hit the Texas coast left me with no power for about 3 goddamn weeks), the world would be a better place.
4) This will only be usable in certain hot-spots (like places with > N Mbps - wired or wireless).
5) This will only be usable with certain devices (like ones that have the software and hardware necessary to handle both the bandwidth and/or compression).
6) Someone's been putting liquid LSD-25 in the AMD/ATI water again. Hey, remember the K6-2? Anyone other than me want to shoot someone at IBM/Lenovo for picking ATI graphics cards for their ThinkPad laptops (which is the only brand of laptop that meets my standards for power and resilience, but those ATI drivers suck!)?

Now my guess is that more than one of the above is true. Which ones are true remains to be decided by a tripod of brilliance, avarice, and sheer bone-headed susceptibility to delusions.

I'm taking bets...

Re:Uhm, bandwidth? (0)

commodore64_love (1445365) | more than 5 years ago | (#26396875)

Compression has advanced quite a bit. My Netscape ISP squeezes text websites to just 5% of their original size, thereby increasing effective bandwidth by 20 times. If the "Cloud" implements a similar algorithm to handle the data, it could operate quite fast.
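That 20x figure is at least plausible for plain text. A minimal sketch with ordinary zlib (not whatever scheme the ISP proxy actually uses, which is unknown here) shows how compressible repetitive HTML is:

    import zlib

    # Highly repetitive HTML-ish text; real pages compress less dramatically.
    page = ("<tr><td class='comment'>Imagine a Beowulf cluster of these!</td></tr>\n" * 500).encode()
    packed = zlib.compress(page, 9)
    print(len(page), "->", len(packed), f"({len(packed) / len(page):.1%})")

Video frames are nowhere near that compressible, which is why the bandwidth objections elsewhere in the thread still stand.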

Re:Uhm, bandwidth? (1)

bmorency (1221186) | more than 5 years ago | (#26395719)

We can already stream DVD-quality movies encoded at 1 Mbps or so, well within the current consumer "broadband" offerings. I'd assume that would be in the target range.

I'm not exactly sure how this will work, but they said you have to offload the data to their servers. So if you are playing a game, wouldn't you have to upload all the data to their servers so they can process it? Consumer internet connections are fairly quick at downloading, but it seems that the upload speed is going to be a problem. My internet connection is 10 Mbps down but only about 700 kbps up. So that seems like it would be the problem.

Re:Uhm, bandwidth? (0)

Anonymous Coward | more than 5 years ago | (#26396109)

The main bandwidth cost will be downloading video after it renders the graphics. It shouldn't use too much more upload bandwidth than online games currently do. I suspect that 700kbps upload bandwidth is far more than enough.

Re:Uhm, bandwidth? (2, Insightful)

Wallslide (544078) | more than 5 years ago | (#26396249)

The idea is that the only thing you are uploading to the server is input, such as mouse/keyboard/voice information. The game logic and assets all reside on the server itself, and thus don't have to be uploaded by your machine. It's as if you were playing a game over a VNC connection.

One thing that is really cool about this technology is that it has the potential to eliminate cheating in games such as first person shooters. A lot of the cheating in the past happened because the game client running on a user's machine actually knows a lot more information than what is being shown to the user. If a user can get past those artificial barriers to the information with hacked graphics drivers to see through walls, or sound drivers to see the exact location of footsteps, then they have a huge advantage over a user who leaves those artificial "information limiters" in place. It turns out it's very difficult to limit the information sent from the server to a game client to exactly what the client needs at any given time.

Theoretically, if the only thing being received from a game server are pre-rendered images, then a user couldn't use that information to cheat with wallhacks or any other current cheats that I know of. The problem is that there would also be no way to do client-side prediction (which is why the extra information generally has to be sent in the first place), or to mitigate the lag that inevitably exists between nearly all servers and clients to this day.
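To put rough numbers on the "only inputs go upstream" point (every figure below is an assumption for illustration, not anything OTOY or AMD have published):

    # Upstream: small input events. Downstream: a compressed video stream.
    TICK_RATE = 60          # input samples per second (assumed)
    INPUT_EVENT_BYTES = 24  # timestamp + buttons + mouse delta, generously padded (assumed)
    VIDEO_KBPS = 5000       # a decent compressed stream at roughly SD/720p quality (assumed)

    upstream_kbps = TICK_RATE * INPUT_EVENT_BYTES * 8 / 1000
    print(f"upstream ~{upstream_kbps:.1f} kbit/s, downstream ~{VIDEO_KBPS} kbit/s")
    # ~11.5 kbit/s up -- trivial next to the 700 kbps upload figure mentioned above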

Re:Uhm, bandwidth? (0)

commodore64_love (1445365) | more than 5 years ago | (#26396885)

I think the whole idea sounds stupid. I sit down at my keyboard, I log in, and now I have access to a central supercomputer that does all the processing while my PC acts as a dumb terminal.

This AMD idea sounds like something from the 1970s. A step backwards.

Re:Uhm, bandwidth? (1)

sreid (650203) | more than 5 years ago | (#26395487)

More or less... I think a high-def movie can be streamed at under 300 kbps.

No, latency (3, Insightful)

alvinrod (889928) | more than 5 years ago | (#26395495)

The bandwidth is only a problem until we build bigger tubes. As much as we all like to bitch about internet here in the US, we're at least capable of increasing the bandwidth quite well. The real problem is dealing with the latency. With enough time and money we could easily push as much data as we could possibly want, but we can only push it so fast.

For some games it probably won't matter, but who'd want to use it for an FPS where regardless of how detailed your graphics are, even a tenth of a second lag is the difference between who lives and who dies? Until we can get around those limitations, I don't foresee the traditional setup changing much.

Re:No, latency (0)

Anonymous Coward | more than 5 years ago | (#26395563)

Bandwidth is not and never was really a problem; even if the best technology has to offer is a 1 Mbit line, you could run 1000 of them if you needed to.

Unfortunately nothing will get around the latency problem. The speed of light is a pretty harsh cap, and anything where latency doesn't matter is probably not going to be pushing graphics (think a turn-based Flash game or similar).
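The speed-of-light cap is easy to quantify; a back-of-the-envelope sketch with round numbers for fiber:

    # Light in fiber travels at roughly two-thirds of c, about 200 km per millisecond.
    KM_PER_MS_IN_FIBER = 200

    def min_round_trip_ms(distance_km):
        return 2 * distance_km / KM_PER_MS_IN_FIBER

    for km in (100, 1000, 5000):
        print(km, "km ->", min_round_trip_ms(km), "ms minimum round trip")
    # 1 ms, 10 ms, 50 ms -- before any routing, rendering or encoding delay is added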

Re:No, latency (1)

Anonymous Coward | more than 5 years ago | (#26395567)

Sure, games that deal with dead reckoning (e.g. FPS) aren't the first candidate for this, but it's perfect for deterministic peer-to-peer simulations (e.g. RTS)!

A typical RTS will only simulate at 8-12 Hz. Yes, expect 126-249 ms of lag! But you don't even notice.

Re:No, latency (1)

KDR_11k (778916) | more than 5 years ago | (#26396485)

You WILL notice the input lag though.

Re:No, latency (3, Interesting)

i.of.the.storm (907783) | more than 5 years ago | (#26396507)

For some people, maybe. But professional RTS gamers can have between 300-400 actions per minute, and some ridiculously good ones have 500, and if they had that much lag I wager they would notice. And of course, that's on top of the amount of time it takes for the supercomputer to generate the image.

Re:No, latency (1)

johnsonav (1098915) | more than 5 years ago | (#26395937)

For some games it probably won't matter, but who'd want to use it for an FPS where regardless of how detailed your graphics are, even a tenth of a second lag is the difference between who lives and who dies?

I might just be talking out of my ass here, but... If latency is your only bottleneck, and you have plenty of bandwidth and CPU on the server, wouldn't it be possible to deliver as many renderings as there are possible inputs, and only use whichever one corresponds to what the player actually does?

A simple example would be a game where, at any moment, the player could be moving up, down, left, or right. The server could generate four different views, one for each possible input. All four are delivered to the client. And the client's only job would be to determine the player's input, display the correct scene, and send notice of the players input to the server so the state of the game could be updated. Obviously, modern FPS have many more possible inputs, but the theory is the same. I don't think there would be any latency using this system.

For multi-player games, a similar setup could be used, only the server would have to create potential renderings for each player's inputs, in addition to the local client's inputs. So you could end up sending hundreds of frames down the wire only to use one. But the latency would be the same as in traditional multi-player games.

Re:No, latency (0)

Anonymous Coward | more than 5 years ago | (#26396123)

Unless both machines are burning atium at the same time, in which case the whole scheme becomes impractical.

Re:No, latency (1)

KDR_11k (778916) | more than 5 years ago | (#26396527)

A problem would be that the number of frames increases exponentially with the time you render ahead. 100ms lag on 60fps would mean something like (number of input options)^6 frames to render. With your four options that would be 4^6=4096 frames. You'd need a system that's more than 4096 times as powerful as the average user's computer times the number of users you have. At this point it's easier to just tell the user to buy his own damn hardware.
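The combinatorics can be reproduced directly, using the parent's own numbers (the 10-option case is just an extra illustration):

    # Speculatively rendering every possible input sequence grows exponentially.
    def speculative_frames(input_options, lag_ms, fps=60):
        frames_ahead = round(lag_ms / 1000 * fps)
        return input_options ** frames_ahead

    print(speculative_frames(4, 100))    # 4^6 = 4096 frames, as above
    print(speculative_frames(10, 100))   # 10^6 = 1,000,000 with a more realistic set of inputs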

Re:No, latency (1)

hairyfeet (841228) | more than 5 years ago | (#26396191)

While I agree that with regards to gaming they are probably blowing smoke, where I can see this thing being a boon is in the field of amateur filmmaking. Imagine the kind of effects all those future filmmakers could create with access to that kind of rendering power. If the rental price is reasonable, then I could see this possibly doing for films what digital gear has done for musicians. That is, of course, giving the power to the creators instead of the middlemen.

I would love to see what kinds of films we will get when artists don't need to get approved by some studio exec just to get their sci-fi/horror/action movie done. I just hope it pans out and is affordable enough. Hell, if it works well even the bigger names could find uses for it. Can you imagine what Joss Whedon could have cooked up for Buffy and Firefly if he wasn't always worried about the budget? With this kind of horsepower, anything the artist could imagine could be created.

Re:No, latency (1)

KDR_11k (778916) | more than 5 years ago | (#26396541)

I'm not sure the rendering hardware is the bottleneck for amateur movie CGI. It's more likely that they simply don't have the necessary artists to create the scenes in the first place. You need a large staff to do the things modern movies do in a reasonable time; hiring 20-30 professionals (probably even more) for the task tends to be a bit too expensive for amateur movie budgets.

Re:Uhm, bandwidth? (1)

martinw89 (1229324) | more than 5 years ago | (#26395519)

While I agree that this is odd with games, I definitely see the potential for 3d animators. It takes my home computers (note the plural) hours/days to render complex scenes, depending on the length of the scene. The advantage in computing power would greatly outweigh the bandwidth cost here, especially if you could just upload the job and wait for the result (instead of sending each frame to be rendered).

But I would imagine it would not take many people to bog this down.

Re:Uhm, bandwidth? (0)

Anonymous Coward | more than 5 years ago | (#26395525)

"...to virtually any type of mobile device with a web browser."

While it's going to be a while before we're doing 1080p, it makes sense for low resolutions.

Re:Uhm, bandwidth? (1)

khallow (566160) | more than 5 years ago | (#26395585)

Even if the "work" is offloaded to the cloud won't you still need an assload of bandwidth on said devices in order to actually amount to anything? It's not like you're going to get pci-express bandwidth capabilities over dsl or cable internet connection.

There are services that have low demands on the client and high demands on the server. For example, a game with a huge player population (like several hundred thousand). I think that Second Life or Eve Online would be examples of such games. The graphics aren't that demanding, even on older PCs, and they have huge player populations. So no, you wouldn't need to have a huge amount of bandwidth, but it's not going to be state-of-the-art graphics.

Re:Uhm, bandwidth? (1)

amirulbahr (1216502) | more than 5 years ago | (#26395703)

You don't need PCI-e bandwidth. All you are doing is transporting 2-dimensional video. We are already very good at doing that over moderate bandwidth connections.

1-2 Mbps will do standard definition video comfortably well.

Re:Uhm, bandwidth? (1)

Firehed (942385) | more than 5 years ago | (#26396561)

You think this is so you don't have to buy a new graphics card? The only reason companies would go for this is because it changes their games from a product to a service, so piracy goes away. Next-gen DRM, if you will (next-gen doesn't have to be worse, however; I avoid buying games due to how invasive the DRM is and know plenty of people who do the same, so the next generation of the stuff damn well better address that).

If it's implemented correctly it would still offer us advantages - play from any computer with a net connection (meaning proper gaming on Linux and Mac, as you'd need just a client to interface with their app, not the low-level DirectX APIs) would be enough for me. I see publishers taking a Zune-esque approach where you could get unlimited access to any of their games for some standard monthly fee (which makes a lot more sense with games than music anyways). Making stuff cheap, easy, and accessible goes a long way when the only downside to your competition is that it's illegal, especially when people don't seem to care if the numbers you can spot at ThePirateBay are anything to go by.

As for the implementation issues, I'll second the VNC analogy. Of course, it seems to be less responsive than either Microsoft or Apple's implementation of system-specific remote desktop software, but that comes down to encoding tricks (like only re-sending parts of the screen that actually changed, or a more procedural-type "draw a standard app window with this text at point X,Y" sharing rather than sending jpegs across the wires). It hardly matters for gaming though - I *have* tried gaming over VNC, ARD, and RDC, and even over wired gig-e you can't get a smooth picture, let alone something responsive. I think that's more of a CPU bandwidth issue than a network one, though. If you have the CPU/GPU horsepower, you could encode the game's output in x264 (or something of similar compression ratios, but much more response-friendly) in realtime and stream that over the network.

Re:Uhm, bandwidth? (1)

malv (882285) | more than 5 years ago | (#26396725)

Compress and stream it like a YouTube video.

Sponsored by Microsoft (-1, Flamebait)

Anonymous Coward | more than 5 years ago | (#26395333)

1000 GPUs ought to be enough to run Vista, right?

Good luck (5, Insightful)

4D6963 (933028) | more than 5 years ago | (#26395335)

"VNCing" games through the Internet and possibly a wireless network, and getting a decent enough latency and enough throughoutput to get a good image quality/FPS? Good luck with that, not saying it won't work, but if it does work satisfyingly and reliably it'll be an impressive feat.

Well I know StreamMyGame [streammygame.com] does it, but it's meant to be used locally, not over the internet + WiFi, right?

Re:Good luck (1)

erikina (1112587) | more than 5 years ago | (#26395591)

WiFi itself is enough to completely kill a gaming session. When I'm at home on my laptop, I like to remotely log in to my desktop. It allows me the horsepower of my desktop, along with access to all the files (read: pr0n).

It works flawlessly really, but the difference between Ethernet and WiFi is perceptible. As soon as you try gaming over it, it becomes unusable (for any action game at least). Even simple games like kasteroids or kbounce are not worth using (I get routine 1 second freezes). On the other hand, doing the same thing over Ethernet is perfect and you couldn't tell the difference.

In fact, connecting from university (where the latency is around 60-80ms) is a lot smoother than wifi (where the latency is

Re:Good luck (1)

erikina (1112587) | more than 5 years ago | (#26395625)

Replying to myself. The end of my post got truncated by html parser thingy. Slashdot has to be the bulletin board where you need to write &lt; to get a < sign..

The end of the post should have:

< 1ms). I assume it must be from packet loss, but it very well might be a bandwidth issue too.

Re:Good luck (1, Funny)

Anonymous Coward | more than 5 years ago | (#26395691)

Slashdot has to be the bulletin board

I'm assuming you used angel brackets to emphasize <only> as well :P

Re:Good luck (1)

TwistedSymmetry (1354405) | more than 5 years ago | (#26396057)

You sure it wasn't the lousy wi-fi connection? ;-)

Only 1.000? (3, Interesting)

Duncan3 (10537) | more than 5 years ago | (#26395343)

Folding@home is at 1.007 PFLOPS of just ATI GPUs :)

(which is an entirely different sort of "computer", but still)

Re:Only 1.000? (1)

i.of.the.storm (907783) | more than 5 years ago | (#26396695)

These days they've got nVidia ones too though. The nVidia ones get more PPD somehow, even though the ATI clients have been out longer and ATI GPUs seem to have higher theoretical output. And yay, I'm contributing to that major FLOPpage with my ATI GPU.

Latency? (0)

Anonymous Coward | more than 5 years ago | (#26395347)

I don't care if it's 10 petaflops, what's the latency going to be like?

Cloud?! TWO IN ONE DAY? (3, Funny)

morgan_greywolf (835522) | more than 5 years ago | (#26395349)

Attention, AMD Marketroids: Please kill yourselves. Now. Do it now.

*blink*

Yes. All of you.

What about latency in gaming? (3, Interesting)

WiiVault (1039946) | more than 5 years ago | (#26395373)

I'm all for cloud gaming - it would be great to not have to upgrade my GPU all the time to play new games - however, I wonder how this could be accomplished in a way where lag is so minimal as to not affect gameplay. It seems this would be especially hard if one were to play online games. Correct me if I'm wrong, but it would seem you would need to add the lag from the client to the cloud AND the lag from player to player (or server) in the multiplayer networking. That seems like too much lag for most FPSes, which I'm assuming are one genre that would gain the most from such a supercomputer.

Re:What about latency in gaming? (1)

gbarules2999 (1440265) | more than 5 years ago | (#26395521)

I agree. FPS and RTS games are where the graphics are really rather pretty and would benefit from this technology, but latency issues would crop up almost instantly. Imagine - you're not only streaming the game's images to your computer, but you're streaming data out of your computer in a multiplayer game. I suppose it could work for single-player games, but even then.

Re:What about latency in gaming? (1)

zippthorne (748122) | more than 5 years ago | (#26395737)

Maybe it could just bake occlusion maps and such and stream that out to you. That stuff can tolerate a little lag once in a while (things will just look weird, or fall back on something less realistic), requires a whole lotta processing per scene, and in say a mmorpg type environment, it only has to be done once for everybody.

Someone has to say it... (0, Redundant)

crashandburn66 (1290292) | more than 5 years ago | (#26395395)

Imagine a beowulf cluster of these!

All I Can Say Is (1, Redundant)

Keanu Reeves (1418607) | more than 5 years ago | (#26395397)

Whoa

online powerhorse? (4, Funny)

mihalis (28146) | more than 5 years ago | (#26395419)

the Fusion Render Cloud will be available as an online powerhorse

AMD also described NVIDIA's Quadroplex as more of an online My Little Pony.

How will this save money? (2, Insightful)

Lord Byron II (671689) | more than 5 years ago | (#26395437)

Instead of buying a $400 video card, now you're paying AMD to buy that video card for you, paying them for the management of that card, and paying your ISP for the bandwidth. The only way this works economically is if you only use your card 10% of the time; then AMD can utilize it at 100% and sell you just one-tenth of the total.

Of course, that's great for gamers, who will sporadically play throughout the day, but awful for movie studios who could probably keep a render farm at 100% anyway.
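The break-even arithmetic in the parent, sketched out (the card price and the 10% utilization are the parent's numbers; everything else is an assumption):

    CARD_PRICE = 400.0   # dollars, from the parent post
    UTILIZATION = 0.10   # fraction of the time a typical gamer actually loads the GPU

    users_per_card = 1 / UTILIZATION
    hardware_cost_per_user = CARD_PRICE / users_per_card
    print(users_per_card, hardware_cost_per_user)  # 10 users share one card, ~$40 of silicon each
    # ...before AMD adds power, cooling, bandwidth, management and margin on top.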

Re:How will this save money? (1)

moosesocks (264553) | more than 5 years ago | (#26396013)

Movie-grade CG tends to be rendered via raytracing, which, AFAIK, is an algorithm that is more suited to being run on a general-purpose CPU than on a GPU.

I'm sure part of the reason that nVIDIA and ATI have been working to develop alternative applications of their GPU technology is that their GPUs could potentially become unnecessary to gamers, should CPUs ever reach the speed where real-time raytracing is practical.

Obligatory (0)

Anonymous Coward | more than 5 years ago | (#26395453)

But can it run Crysis?

One Problem (5, Insightful)

Akir (878284) | more than 5 years ago | (#26395473)

They're going to have to write a driver that works before they get that to work.

FAIL (-1, Troll)

Anonymous Coward | more than 5 years ago | (#26395501)

EPIC FAIL... why?
Because in the end, when they tax the heck out of the net as Hollywood wants and we can't afford net connections, what fraking good is this going to be? I am already looking to get all the games I don't see as online, because that is all they are: a tax on you.
In the end you pay hundreds of dollars for stuff not worth a 100th of that. Also, who the hell is gonna be able to afford to play their crap?

Time to get the stupid greed out of the internet, people, and start waking up to HEY, what is affordable and useful.

IF they haven't realized, the economy is tanking REAL bad and it's only about halfway to the bottom yet; another trillion in bad loans is coming and the USA govt is powerless now.

for reals? (0, Redundant)

kaizokuace (1082079) | more than 5 years ago | (#26395547)

People are trying too hard just to play Crysis. wtf!

Re:for reals? (1)

Draconi (38078) | more than 5 years ago | (#26395745)

Don't you understand!? With this, my god, we could build an entire virtual world, out of fully interactive, fully physic'd, fully exploding *barrels*!

Re:for reals? (1)

kaizokuace (1082079) | more than 5 years ago | (#26395783)

do a barrel roll! in real-time! maybe even SPACE-TIME!!!

I know kung-fu. (2, Funny)

acedotcom (998378) | more than 5 years ago | (#26395553)

AMD also says that the supercomputer will 'enable remote real-time rendering of film and visual effects graphics on an unprecedented scale.' Meanwhile, game developers would be able to use the supercomputer to quickly develop games, and also 'serve up virtual world games with unlimited photo-realistic detail.'

They have this in the future. Don't they call it the Matrix?

I look forward to (5, Insightful)

sleeponthemic (1253494) | more than 5 years ago | (#26395557)

Playing Duke Nukem Forever @ 1900x1200 through the Fusion Render Cloud, occasionally reloading the latest results of the (fully operational) Super Hadron Collider on my Nintendo VR Goggles powered by a free energy device producing negative infinity carbon emissions.

Re:I look forward to (0, Redundant)

shawb (16347) | more than 5 years ago | (#26395725)

All that from your flying car, I assume?

Re:I look forward to (1)

Nemyst (1383049) | more than 5 years ago | (#26395867)

All that from your flying car, I assume?

Nah, I say from his grave, from the look of things.

Contest, the rematch... (2, Funny)

graymocker (753063) | more than 5 years ago | (#26395643)

A comment from the story earlier today about nVidia's new 2-teraflop multicore card:

Yet again, Nvidia showed ATI that it, indeed, has the biggest penis.

Hah! HAH! While nVidia dicks around with expansion cards measured in mere teraflops, AMD is building a SUPERCOMPUTER. That's a /peta/flop, nVidia! If you don't know what that is, here's a hint: take your teraflop. Then add three zeros to the end. BAM!

AMD's penis is now 500 times larger than nVidia's. It's math.

Re:Contest, the rematch... (4, Interesting)

Zephiris (788562) | more than 5 years ago | (#26395689)

Nvidia's GTX 295 was around 1.7 teraflops, I believe, while the (similarly priced) 4870X2 is 2.4. The 'mere' 295 supposedly beats the 4870X2 by 15% on average.
The difference is? Nvidia always has pretty good drivers. ATI struggles to let games take >50% advantage of even the lowly 3870 (as measured by the card's own performance counters)...let alone a 2.4 TFLOPS card...let alone a massive array of 4870s.

Plus, wouldn't a 1000-GPU 4870 cloud...only give each of some 1000 users a fractional percentage of one 4870, capped by latency and other overhead?

Or...are we talking about providing a larger number of mobile devices the equivalent capabilities and speed of 1999's Geforce 256?

Either way...I don't think it'll catch on, and will be a huge money sink for AMD when it needs to be fixing its processor and video card issues for the average, real consumers who are losing faith in AMD's ability to provide reasonable and usefully competitive products.

The Really Important Question... (0, Redundant)

CodeBuster (516420) | more than 5 years ago | (#26395669)

Is: will it be able to run Windows Vista?

I'd love to... (1)

Draconi (38078) | more than 5 years ago | (#26395731)

...build games for it - but how does this translate to serving up virtual world games with unlimited photorealistic detail?

Does it draw the perspective for every individual logged on player ahead of time, cache it, and somehow overcome bandwidth and latency concerns to deliver something in higher quality than a local GPU can do?

Or is this about the architecture of the virtual world itself - messaging, AI threads, triggers, events, decision making? It would have to be one incredible world that required more than what a rack of servers in a colo can admirably achieve today.

Now, as far as actual development goes, I can see how this would be an incredible tool. I'm just confused where the cloud becomes a gaming platform.

Re:I'd love to... (0)

Anonymous Coward | more than 5 years ago | (#26395789)

Photon/ray tracing from the light source to the "eyes"! Very realistic, and scales fairly well (more players = more light sinks, but only a few more photons/rays, unless they're using torches or something...well, in any case, you get to "reuse" most of your calculations), but it still has a hefty break-even point. ...and with per-frame JPEG2000 compression to decrease bandwidth requirements. ...

My mind is 20 years in the future. :(

And there's still that pesky speed-of-light problem that will make this infeasible for everyone but Japan and South Korea. ;b

But it would kill most inherent cheating/hacking vulnerabilities present in current MMOGs...

Whoa!!! (1)

coopaq (601975) | more than 5 years ago | (#26395791)

Streaming video games over the net from a server cloud?

Who let the marketing guys out of their cage on this one?

I mean... it will be faster than Intel's local 3D chips sure, but still... come on!

beowulf cluster (1)

moniker127 (1290002) | more than 5 years ago | (#26395825)

Imagine a Beow... oh... damn it, that's what they did.

Ah, the Big Iron versus micros war again.... (4, Interesting)

macraig (621737) | more than 5 years ago | (#26395887)

Figures. See, most people thought that war had been won long ago. Perhaps it was, but now the Big Iron camp has a new ally: Big Software, who REALLY wants to do away with one-time licenses and purchases and substitute the far more lucrative "Web apps" and the subscription licensing and fees that paradigm will allow. They want to re-brand software as "content" and they want consumers to willingly buy into that. Their latest sneaky flanking maneuver is what you know as Web apps, but the objective is the same.

If you say yes to either one, centralized computing or software subscriptions, you're actually saying yes to BOTH.

Nancy Reagan had the better advice: Just Say No... to both.

Has anyone considered the possibility (1)

HB.3 (794819) | more than 5 years ago | (#26395897)

that we could finally play Crysis on this?

I hope we'll show implementation in April (1)

JavaGenosse (1174861) | more than 5 years ago | (#26395901)

Our company is planning to present an Nvidia-based GPGPU solution at the Cloud Computing Conference 2009; stay tuned - http://www.cloudslam09.com/ [cloudslam09.com]. IMHO, AMD's idea is sound and timely from several points of view. Those who doubt will just lag behind, like SGI did.

Bullshit Stack Overflow (4, Insightful)

elysiuan (762931) | more than 5 years ago | (#26395907)

"serve up virtual world games with unlimited photo-realistic detail."

Considering that CGI effects in movie houses have only started approaching effects indistinguishable from reality within the last five or so years, this spikes my bullshit meter pretty high.

Factor in that Weta/ILM and the rest are using huge render farms for an extremely non-realtime render process, and my meter explodes.

Even if I take the claim at face value and postulate that it is possible to do this then I am forced to wonder about how many concurrent, real-time, 100% realistic scenes it can process at once.

Sounds like the marketing department wet their pants a bit early on this one.

another issue (1)

Voyager529 (1363959) | more than 5 years ago | (#26396045)

They say they want it to do games and 3D animation in a mobile web browser. Call me nuts, but Quake4iPhone takes a lot of skill and patience to control reliably...and now they want to try to get Unreal Tournament 5 in that environment? Heh. Almost as much fun as doing...basically anything in Maya or 3DS Max from a phone. "Desirable" is not quite the first adjective that comes to mind.

FLOP what? (1)

91degrees (207121) | more than 5 years ago | (#26396499)

1 petaflop = 10^15 floating-point operations.

So what happens after it completes those 10^15 floating-point operations? Or did the poster mean 1 petaFLOPS? The S stands for "second"; it's not a plural of FLOP!
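For what it's worth, the headline number roughly checks out against the HD 4870's theoretical single-precision peak, which is about 1.2 TFLOPS (a from-memory figure, so treat it as approximate):

    GPUS = 1000
    TFLOPS_PER_HD4870 = 1.2  # approximate single-precision peak (assumed)
    print(GPUS * TFLOPS_PER_HD4870 / 1000, "petaFLOPS theoretical peak")  # ~1.2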

Practical benefits (2, Interesting)

cee--be (1308795) | more than 5 years ago | (#26396547)

One practical use for this would be to run staggeringly complex physics calculations in real time. One example would be doing the necessary calculations to render a physically realistic sea with weather conditions into an animation. You could then send this to users in a sea MMO, for example. There are many other cool game-related things you could do with it, rather than wastefully rendering some uncanny-valley mobile phone game at 2 FPS.

The 1970s? Did I step into a time machine? (2, Insightful)

commodore64_love (1445365) | more than 5 years ago | (#26396855)

I sit down at my dumb terminal, I log in, and now I have access to a central supercomputer (via the network) that does all the processing.

This AMD idea sounds like something from the 1970s.
