
NVIDIA Targeting Real-Time Cloud Rendering

Soulskill posted more than 4 years ago | from the hey-if-onlive-and-gaikai-can-do-it dept.

Graphics 184

MojoKid writes "To date, the majority of cloud computing applications have emphasized storage, group collaboration, or the ability to share information and applications with large groups of people. So far, there's been no push to make GPU power available in a cloud computing environment — but that's something NVIDIA hopes to change. The company announced version 3.0 of its RealityServer today. The new revision sports hardware-level 3D acceleration, a new rendering engine (iray), and the ability to create 'images of photorealistic scenes at rates approaching an interactive gaming experience.' NVIDIA claims that the combination of RealityServer and its Tesla hardware can deliver those photorealistic scenes on your workstation or your cell phone, with no difference in speed or quality. Instead of relying on a client PC to handle the task of 3D rendering, NVIDIA wants to move the capability into the cloud, where the task of rendering an image or scene is handed off to a specialized Tesla server. Then that server performs the necessary calculations and fires back the finished product to the client."
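The architecture described above boils down to a thin client that ships only lightweight scene and camera parameters upstream and receives a finished frame back. The sketch below is a minimal illustration of that round trip; the endpoint URL, JSON request format, and scene identifier are all hypothetical assumptions, not the actual RealityServer API.

    # Minimal sketch of the client side of server-side rendering.
    # The endpoint and request format are hypothetical; this is NOT the
    # RealityServer protocol, it only illustrates the idea of sending a few
    # camera parameters up and getting finished pixels back.
    import json
    import urllib.request

    def request_frame(scene_id, camera, width=1280, height=720):
        """Ask the rendering service for one finished frame; return raw image bytes."""
        payload = json.dumps({
            "scene": scene_id,      # scene assets already live on the server
            "camera": camera,       # only a handful of floats travel upstream
            "width": width,
            "height": height,
        }).encode("utf-8")
        req = urllib.request.Request(
            "https://render.example.com/v1/render",   # hypothetical endpoint
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            return resp.read()      # e.g. a JPEG, decoded and displayed client-side

    # The client never touches geometry, textures, or shaders directly:
    frame = request_frame("kitchen_v3", {"position": [0, 1.6, 4], "look_at": [0, 1, 0]})

Whether this beats local rendering hinges entirely on server render time plus network transfer, which is exactly what the comments below argue about.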


184 comments


Ouch! (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#29823177)

This story caused me to pour hot grits down my pants. Natalie Portman wasn't involved.


Drawing a cloud in real-time (0, Flamebait)

tepples (727027) | more than 4 years ago | (#29823207)

I assume it has nothing to do with this video [youtube.com].

Re:Drawing a cloud in real-time (0)

Anonymous Coward | more than 4 years ago | (#29823333)

It has everything to do with NVidia's inability to deliver a solution that could be used in your home (or at least in a small one-man workshop or studio) at the quality standards they like to advertise. So instead of admitting defeat to other innovators (whether interactive GPU renderers such as V-Ray RT, which will happily run on any capable GPU, or finalRender's use of specialized Progeniq hardware, reportedly already in use on what is likely to be an absolute blockbuster movie in 2012), they just give it a different spin and start marketing it as a cloud solution.

After all, transferring gigabytes to terabytes of scene data, bitmaps, shaders, etc. is so efficient and convenient (Why yes, says ILM, please take what is basically the source material for the unnamed project we are doing for Spielberg; we totally trust you and your low-wage employees!)
Oh wait, no it is not, which is why (compared with the "we already have it in house, and when we're not using it we might rent it out" model) dedicated render farms out there are now struggling to survive.

In the future, perhaps, or if your bid comes packaged with a big fat pipe and some really heavy NDAs, but for now it seems they're getting ahead of the market... and themselves.

Re:Drawing a cloud in real-time (1, Funny)

Anonymous Coward | more than 4 years ago | (#29823349)

Idiot. "Real-time" means that you can draw clouds at the screen refresh rate, not every 5 minutes.

Why is rendering clouds so important? (0)

szo (7842) | more than 4 years ago | (#29823215)

oh, wait...

Re:Why is rendering clouds so important? (4, Funny)

suso (153703) | more than 4 years ago | (#29823521)

The gaming industry has been trying to jump start the flight simulator market again.

Stupidest idea ever (0)

Anonymous Coward | more than 4 years ago | (#29823233)

Clouds for render farms seem fine. In fact, you are just putting a fancy new cloudy name on the render farms that have been around forever.

But render farms for online gaming seem like the most ridiculous idea. Note the demo nVidia showed:

but the speed of the updates didn't remotely come close to "approaching an interactive gaming experience," unless said experience involved attempting to run Doom on your 16MHz 386 with the screen size set at maximum. Update times varied from 10-20 seconds, and that's a significant lag when discussing online usage patterns.

Re:Stupidest idea ever (0)

Anonymous Coward | more than 4 years ago | (#29823291)

Which makes me wonder what the point is here really. OnLive is in testing already. TFA doesn't compare to OL so I don't know why Nvidia's offering is so much better.

Incomparable to OnLive - different goals (4, Informative)

Animaether (411575) | more than 4 years ago | (#29823491)

NVidia's offering performs full scene raytracing/pathtracing, with effects ranging from reflections and refractions to global illumination and caustics all the way through to sub-surface scattering and participating media.

Some of these things can be done in proper realtime (say, at least, 30fps at 720p) on existing GPUs, but typically by using hacks that look 'good enough', but aren't actually correct. Which is fine for gaming (where refresh rates matter), but not fine for product visualization, architectural visualization or to go to an extreme.. materials and lighting analysis, where you don't care if it's not 30fps, but are more than happy to wait 10 seconds for something that used to take 15 minutes.

That said... if the cards keep getting faster, then eventually 30fps@720p will be possible and there's no reason, in the time in between, that games couldn't add the more fancy effects and have the GPGPU solutions take care of those on a 'cloud' platform.

Latency (4, Insightful)

gr8_phk (621180) | more than 4 years ago | (#29823739)

That said... if the cards keep getting faster, then eventually 30fps@720p will be possible and there's no reason, in the time in between, that games couldn't add the more fancy effects and have the GPGPU solutions take care of those on a 'cloud' platform.

There's one big reason - latency. 30 FPS is one frame every 33.333 ms. What's your ping time? Add the rendering time to that, and that's what your interactivity is going to look like. Remember that many games have ways of hiding the latency between client and server - in particular, they know the player's POV and the static environment, so those things can be handled very well.
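To put rough numbers on that, here is a back-of-the-envelope latency budget; the ping, render, and codec figures are assumed example values, not measurements of any real service.

    # Back-of-the-envelope latency budget for remote rendering at 30 FPS.
    # Every input below is an illustrative assumption, not a measurement.
    frame_budget_ms = 1000.0 / 30      # ~33.3 ms available per frame at 30 FPS

    ping_ms          = 40.0            # assumed round-trip time to the render server
    render_ms        = 20.0            # assumed server-side render time per frame
    encode_decode_ms = 10.0            # assumed video encode + client decode cost

    end_to_end_ms = ping_ms + render_ms + encode_decode_ms
    print(f"Frame budget: {frame_budget_ms:.1f} ms")
    print(f"End-to-end input-to-photon latency: {end_to_end_ms:.1f} ms "
          f"(~{end_to_end_ms / frame_budget_ms:.1f} frame budgets)")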

As someone else said, cloud rendering is fine for making movies. It's not viable for games. And besides, if a GPU can do this stuff in real time, why do we need to push it into the cloud? This sounds like OTOY all over again.

BTW, CPUs will be doing realtime ray tracing soon anyway - give me a bunch of Bulldozer cores and a frame buffer.

Re:Latency (1)

Sockatume (732728) | more than 4 years ago | (#29823883)

It'll be fine for people who're happy with low-grade graphics that existing hardware can do quickly enough for the latency to be the limit, or gameplay which is not in real-time. Unfortunately this is a market that probably won't see the point in signing up to the service in the first place, and could be just as easily served by a cheap local box.

Re:Latency (1)

natehoy (1608657) | more than 4 years ago | (#29823945)

What if the game server also had the GPU engine on board for all of the clients?

Current case: Game server tells client "you are at coordinates X,Y,Z within the game, you're facing this way, the following is happening in front of you, etc". Client takes all of that and renders a local copy of the game map to match what the server says.

New case: Game server sends game imagery as compressed streaming video and audio for each client, and receives back motion/command instructions from client. All of the calculations and rendering are done on the centralized server, clients are just looking at the equivalent of streaming video.

Sure, it would require a pretty high-bandwidth connection, but as long as the client could render fixed (2D) images quickly, the remote server would do all the tricky 3D rendering and just stream the results to the client. So you're a lot less hardware-dependent for each gaming client. This could be a huge win for LAN parties.
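As a rough sketch of that "new case", a server-side session loop might look something like the following. The simulation, renderer, and codec are hypothetical stubs standing in for real components; nothing here reflects any particular product.

    # Illustrative server-side loop for the streaming model described above: the
    # client sends only input, the server simulates, renders, and encodes, and
    # only compressed video travels back. All three stubs are hypothetical.
    import time

    def advance_simulation(state, inputs):
        return state                      # stub: a real game updates world state here

    def render_frame(state):
        return b"\x00" * (1280 * 720 * 3) # stub: raw RGB frame from a GPU renderer

    def encode_video(frame):
        return frame[:50_000]             # stub: a real codec (e.g. H.264) compresses here

    def serve_client(recv_input, send_frame, fps=30):
        """One client session: input up, compressed frames down, at a fixed tick rate."""
        frame_interval = 1.0 / fps
        state = {}
        while True:
            start = time.monotonic()
            inputs = recv_input()         # tiny upstream payload (keys, mouse, etc.)
            if inputs is None:
                break
            state = advance_simulation(state, inputs)
            send_frame(encode_video(render_frame(state)))
            time.sleep(max(0.0, frame_interval - (time.monotonic() - start)))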

I don't see this as absolutely revolutionary, but for certain things, it could be pretty significant. No loading custom maps to individual gaming stations, because the maps are all on the server. Gamers could sit on pretty low-end hardware (not even requiring a 3D rendering engine). The client software could be measured in kilobytes or maybe megabytes instead of gigabytes like it is today.

You just need some very heavy-duty hardware for a server, and some serious upstream capacity coming from that server. But gaming companies could practically give away (or maybe completely give away) the clients and charge a monthly fee for access to servers.

Re:Latency (1)

hey (83763) | more than 4 years ago | (#29824299)

Cool for small computers (iPod touch, iPhone, cell phone, netbooks) that don't have GPUs.

Re:Stupidest idea ever (1)

slim (1652) | more than 4 years ago | (#29824593)

Which makes me wonder what the point is here really. OnLive is in testing already. TFA doesn't compare to OL so I don't know why Nvidia's offering is so much better.

Nvidia and OnLive are partners: http://www.onlive.com/partners.html [onlive.com]

  These announcements are probably related to OnLive.

Re:Stupidest idea ever (1, Troll)

poetmatt (793785) | more than 4 years ago | (#29823533)

this is the "whoops" of cloud computing and why it doesn't work for these purposes. Render farms do what they do well, and so does distributed computing. Neither of these are cloud.

Can we please stop the marketing hype for everything cloud?

Re:Stupidest idea ever (0)

Anonymous Coward | more than 4 years ago | (#29823755)

I love how you got modded troll and this guy:

"by thisnamestoolong (1584383) on Wednesday October 21, @10:33AM (#29823275)
Please stop talking about "cloud" computing -- it is one of the dumbest buzzwords I have ever heard in my entire life -- not to mention the fact that it is a totally meaningless term."

Got modded insightful.

Re:Stupidest idea ever (2, Informative)

im_thatoneguy (819432) | more than 4 years ago | (#29823935)

I take it you don't have a render farm. If you're close to delivery, your render farm is probably completely occupied rendering final frames. If you are in the middle of a project, it's probably running at a quarter or half capacity. A render farm is often either overburdened or underburdened. That's a situation that's perfect for cloud computing. Instead of wasting thousands and thousands of dollars on idle machines, you simply pay for the time when you need processing power. And since most of the world won't be rendering simultaneously, a shared farm better distributes the investment. The only challenge now will be asset management and synchronizing a couple of GB of scene data back and forth.
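A toy comparison makes the burst-capacity argument concrete; every figure below is a made-up assumption, not vendor pricing or a real farm's numbers.

    # Toy comparison of owning an idle-prone render farm versus renting node-hours.
    # All numbers are illustrative assumptions.
    owned_nodes              = 100
    owned_cost_per_node_year = 3000.0    # assumed purchase + power + admin, per year
    average_utilization      = 0.30      # farm idles between crunch periods

    cloud_rate_per_node_hour = 1.00      # assumed pay-per-use rate
    node_hours_needed        = owned_nodes * 8760 * average_utilization

    owned_total = owned_nodes * owned_cost_per_node_year
    cloud_total = node_hours_needed * cloud_rate_per_node_hour

    print(f"Owned farm:  ${owned_total:,.0f}/year regardless of load")
    print(f"Pay-per-use: ${cloud_total:,.0f}/year for the same rendered node-hours")
    # Which option wins depends entirely on these assumptions; the point is only
    # that low average utilization is what makes renting attractive.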

Re:Stupidest idea ever (1, Interesting)

Sockatume (732728) | more than 4 years ago | (#29823563)

Well, the new name is supposed to be for the specific case of moving traditionally local computing tasks off to farms. Doing a movie on a remote render-farm is hardly cloud computing, but re-encoding your holiday video is.

Latency aside, my worry is that you're buying a gaming timeshare. It's cheaper to pay for the computing time you actually use, in principle. However online game communities depend on lots of people playing at the same time, which is exactly the sort of thing that would make online gaming uneconomical. Example:

Somebody's got to pay for the shedloads of hardware.

If you have six users, and their usage is distributed over the whole day so each is on for 4 hours with no overlap, then you only have to invest in one "virtual games PC" worth of hardware for those six users. You've got six paying customers for an investment in one games PC! Charge them each a quarter of the cost of an up-to-date games machine over a year, and there's your profit margin (you get back 1.5 times the cost of the hardware), and the value for the end user (they only have to pay 0.25 the cost of the hardware).

If you have six users, and four of them are online at the same time because they're in the US and Western Europe and playing against each other, then you need four "virtual games PCs" worth of hardware to handle that peak demand. The rest of the day, you have two users, sharing the four computers. So over the day you're bringing in an average of three users over four "virtual games PCs" that you've invested in. It's hard to find a way to make that profitable, except having off-peak discounts to try to smooth out the usage patterns.

I guess what I'm saying is that when it comes to gaming, computing power isn't fungible.

Re:Stupidest idea ever (1)

Sockatume (732728) | more than 4 years ago | (#29823717)

Hell, an average of less than 3 users. If those four are online for only four hours, and your system is only loaded with two users the other eight hours, then you're only getting an average of one and a third users over four computers. How do you get one and one third users to pay for four games machines?
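The arithmetic in these two comments is easy to reproduce; the schedule below simply encodes the example figures given above.

    # Average concurrent users versus provisioned capacity, using the figures
    # from the two comments above.
    user_hours = 4 * 4 + 2 * 8 + 0 * 12   # 4 users for 4 h, 2 users for 8 h, idle 12 h
    average_users = user_hours / 24.0      # -> 1.33 average concurrent users
    machines_provisioned = 4               # capacity must still cover the peak

    print(f"Average concurrent users: {average_users:.2f}")
    print(f"Machines provisioned for peak: {machines_provisioned}")
    print(f"Average utilization: {average_users / machines_provisioned:.0%}")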


No more!! (4, Insightful)

thisnamestoolong (1584383) | more than 4 years ago | (#29823275)

Please stop talking about "cloud" computing -- it is one of the dumbest buzzwords I have ever heard in my entire life -- not to mention the fact that it is a totally meaningless term.

Re:No more!! (5, Insightful)

Sockatume (732728) | more than 4 years ago | (#29823335)

It's got a very well-defined meaning: performing computing and storing data on an internet-connected server from an internet-connected client. It's a new term for, arguably, a very old thing, coined because the average end-user these days isn't familiar with the idea of doing their computing from a dumb terminal.

Re:No more!! (3, Insightful)

thisnamestoolong (1584383) | more than 4 years ago | (#29823393)

Umm... that was pretty much my whole point. It makes me want to claw out my own eyes when I hear these jack off tech companies talking about this new "cloud" computing phenomenon -- it is only a new (and exceptionally stupid) buzzword for something that we have been doing for a long, long time. It is not "cloud" computing -- it's just fucking regular old computing -- with CPU's and memory and HDD's and the like -- it just happens to be taking place somewhere else.

Re:No more!! (2, Insightful)

Sockatume (732728) | more than 4 years ago | (#29823653)

No, your whole point was that it's meaningless. Which we've established it isn't.

Your new argument is that the distinction between cloud computing and local computing is unimportant. Well, ask anyone who's had a computer-time grant on one of the monstrous IBM research clusters how they feel about the distinction between "fucking regular old computing" that "just happens to be taking place somewhere else" and going out and just buying their own hardware.

Re:No more!! (1)

commodore64_love (1445365) | more than 4 years ago | (#29824087)

If you have Comcast, Time-Warner, or Cox internet you don't have the old style 80s-era time-sharing, but you still have an allotment. About 250 gigs per month. Have fun watching youtube videos, or CBS.com tv shows, or netflix.com rentals, AND doing cloud computing at the same time. You'll have overage fees galore.
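How quickly game streaming would eat into such a cap is simple to estimate; the stream bitrate and play time below are assumptions for illustration, since no actual service is specified.

    # Rough estimate of how much of a 250 GB/month cap a streamed game would use.
    cap_gb_per_month = 250
    stream_mbps      = 5.0        # assumed average bitrate for a compressed 720p stream
    hours_per_day    = 2
    days_per_month   = 30

    gb_per_hour  = stream_mbps * 3600 / 8 / 1000   # Mbit/s -> GB per hour
    gb_per_month = gb_per_hour * hours_per_day * days_per_month

    print(f"{gb_per_hour:.2f} GB/hour, {gb_per_month:.0f} GB/month "
          f"({gb_per_month / cap_gb_per_month:.0%} of the cap)")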

Re:No more!! (1)

Sockatume (732728) | more than 4 years ago | (#29824265)

...okay?

Re:No more!! (1)

commodore64_love (1445365) | more than 4 years ago | (#29823993)

>>>"cloud" computing phenomenon -- it is only a new (and exceptionally stupid) buzzword for something that we have been doing for a long, long time
>>>

Well it's similar to when my local TV station started talking about "phantom power". i.e. When you leave your VCR or TV plugged-in, it uses about 5 watts of power. They act as if this is something new, but we engineers have known it as "parasitic" or standby power for a long long time.

And bell-bottom jeans. Today they call them "flares" or "flared" but it's still the same thing.

I guess marketers feel they have to dupe the customers, and the way to do that is by changing the name to something else. "Oh no! Watch out for that ghostly phantom power while you're cloud computing in your shiny new flared jeans!"

Re:No more!! (1)

Sockatume (732728) | more than 4 years ago | (#29824493)

I don't think anyone pushing to cut down standby power actually thought they'd discovered some shocking new phenomenon hitherto unreported to science. I've never heard it called "phantom power" at any rate: it must be unique to your region.

Re:No more!! (1)

dintech (998802) | more than 4 years ago | (#29824007)

Even better is Cloud 2.0 Computing which is done in actual clouds using standing stone circles and such.

+++ Divide By Cucumber Error. Please Reinstall Universe And Reboot +++

Re:No more!! (4, Insightful)

MobileTatsu-NJG (946591) | more than 4 years ago | (#29824097)

with CPU's and memory and HDD's and the like -- it just happens to be taking place somewhere else.

That's an important distinction.

Re:No more!! (1)

Cornflake917 (515940) | more than 4 years ago | (#29824331)

...when I hear these jack off tech companies

You mean like the company that developed the fleshlight? I'm surprised slashdot doesn't post more articles about this type of technology, considering the typical slashdotter. Jack off tech: it's the future, man!

Re:No more!! (1)

suso (153703) | more than 4 years ago | (#29823561)

It's got a very well-defined meaning:

No it does not. People continue to misuse what it means in everything from daily speech to presentations, manager proposals and articles. The solution to everything nowadays is "Put it in the cloud", but few people really understand what they are saying when they say that.

It's just like most other buzzwords of the past 10 years. People hear them and then think they are smart for repeating them to get what they want.

Re:No more!! (1)

Sockatume (732728) | more than 4 years ago | (#29823769)

By your criterion, "CPU" has no well-defined meaning. Bugger-all people who say that have a clue what it actually means. However that does not magically undefine it.

Re:No more!! (1)

commodore64_love (1445365) | more than 4 years ago | (#29824131)

People misuse the word "CPU" the way they abuse the word "cloud computing"? Really? I've not heard anyone saying they need to buy a new 1920x1080 CPU, or a new 10 gigabyte CPU for their machines.

Re:No more!! (2, Informative)

Sockatume (732728) | more than 4 years ago | (#29824417)

I would humbly suggest that the people who talk about a 1920 by 1080 anything are unlikely to misuse the term "cloud computing", either. The people who use "cloud computing" as a magic talisman without bothering to know what it means are the sort of people who start their "CPU" with the front-panel lock key and download internets from the email.

This is beside the point. It was argued that, because people use "cloud computing" without knowing what it means, the term has no meaning. This is simply an absurd statement.

Re:No more!! (1)

slim (1652) | more than 4 years ago | (#29824679)

People misuse the word "CPU" the way they abuse the word "cloud computing"? Really? I've not heard anyone saying they need to buy a new 1920x1080 CPU, or a new 10 gigabyte CPU for their machines.

Never heard someone refer to the entire desktop case and its contents as "the CPU"? "I plugged the monitor into the CPU, but nothing seems to be happening". I got that all the time when I was in IT support.

Re:No more!! (0)

Anonymous Coward | more than 4 years ago | (#29824203)

Hear hear!

Re:No more!! (1)

hairyfeet (841228) | more than 4 years ago | (#29824219)

The thing that bugs me about this "cloud computing" nonsense is that we already had a perfectly good and well-established term for this - thin clients. The reason they are now pushing cloud computing is that, for the most part, thin-client-everything went over like a lead balloon, so they cook up a new name for the same old crap hoping the buzzword bingo will overcome the problems that everyone had with thin clients, which of course it doesn't.

Latency, control over data, security risks, the possibility that your stuff will be gone tomorrow when the company hosting it goes tits up, bandwidth caps, etc. are all still there; they just want the buzzword bingo to make folks forget the fact that we have all been down this road before. Considering there are still plenty of places in the USA where I can't get decent broadband, and the desire of the ISPs to cap the hell out of us instead of actually investing in infrastructure, I think that 5 years from now this will be just another dotbomb buzzword chucked in the trashcan of history, just like "On the Internet!" was the buzzword bingo du jour in the late 90s.

Re:No more!! (1)

Sockatume (732728) | more than 4 years ago | (#29824343)

People continue to misuse what it means

It has to have a meaning for people to get the meaning wrong.

Re:No more!! (1)

akabigbro (257295) | more than 4 years ago | (#29824053)

I agree. Cloud computing is well defined. Yes, remote processing has been around for a long time, but now there is an interoperable standard that everyone can use (hint: not COM or CORBA): SOAP.

Re:No more!! (1)

ThatMegathronDude (1189203) | more than 4 years ago | (#29824555)

no, SOAP means processing bulky XML. screw that, do it in JSON.

Re:No more!! (2, Insightful)

slim (1652) | more than 4 years ago | (#29824449)

It's got a very well-defined meaning: performing computing and storing data on an internet-connected server from an internet-connected client.

I disagree. If it doesn't involve large server farms, in which the location of your data/process is arbitrary and ideally diffuse, then it's not cloud computing.

"Cloud" is a fairly good analogy for that.

Re:No more!! (1)

Sockatume (732728) | more than 4 years ago | (#29824617)

You're right, its meaning is even more specific and better-defined than I had laid out. In my haste I made it too general.

Re:No more!! (1)

jhfry (829244) | more than 4 years ago | (#29824645)

You missed something... I agree that the term is not clear and is commonly misused, but it does have a meaning.

A cloud service is one that is provided not by a single server, but by many servers. Additionally, to be considered a cloud service, it must be distributed geographically.

In the beginning computing was centralized, you would use a dumb terminal to access a mainframe system and all of your computing needs were centralized. Then with the PC, computing was distributed. Finally they centralized much of it again with web servers. Now with cloud computing they are actually distributing it again, however they are distributing the servers this time and presentation is still done on a relatively dumb device.

Cloud computing also creates a new networking model. Peer to peer and client-server have been around for ages... but the idea that the "server" can actually be many machines distributed all over the globe working together to provide the service that you perceive as the "server" in the client-server model is a pretty big shift.

Think back just a few years: there were almost no systems of servers that cooperatively worked together to provide a service that weren't within a single data center. Sure, there were redundant data centers, but the entirety of your data was stored within a single location. With Gmail, for example, your email is distributed across many datacenters. A few years ago the closest we came to distributed processing on a large scale was SETI@home; today they are introducing a product that will render a 3D image that would normally take tens of minutes in a second or so, by leveraging the processing power of thousands of computers around the world.

So "Cloud Computing" definitely has a meaning. Without it you would need to say "global scale distributing computing system".

Re:No more!! (2, Insightful)

Anonymous Coward | more than 4 years ago | (#29823413)

I thought they had peaked with the hype around AJAX. But you're right, computing publications have taken it to the next level with "cloud computing".

The people who hype "cloud computing" tend to be young and ignorant. Here is a perfect example of this. [roadtofailure.com]

Simply put, these young punks have a huge ego, but no knowledge of computing history. They don't realize that "cloud computing" is merely what we called "mainframes" back in the day. Their low-powered hand-held devices that'd supposedly benefit from the cloud really aren't different at all from the dumb terminals we hooked up to our mainframes.

Most enterprises moved away from the mainframe because it just wasn't as useful and efficient as individual desktop systems on each user's desk. Unfortunately, most of those fools pushing "cloud computing" these days were born well after we made that transition. They don't realize that they're just resurrecting problems that we dealt with in the early 1980s.

mod parent up!! (1)

A12m0v (1315511) | more than 4 years ago | (#29823569)

I would if I had mod points. Parent is spot on, we've transitioned away from "cloud computing" when we moved from mainframe terminals to desktop workstations, why would we want to go backwards?

Re:mod parent up!! (1)

Sockatume (732728) | more than 4 years ago | (#29823835)

Nothing for the end user that can't be accomplished through plain old offline data synchronisation, if you're patient. For the computer provider, it gives them control over your upgrade pattern and what features you have to pay for. You can hardly save money by not bothering to upgrade the CPU and RAM on the purely conceptual machine hundreds of miles away that you rent time from: if the cloud is getting its quarterly upgrade, it's happening, and you're paying for it.

Re:No more!! (2, Insightful)

Zumbs (1241138) | more than 4 years ago | (#29824017)

There are a number of important differences when comparing mainframe/workstation systems to the modern notion of cloud computing. One significant difference is distributing computational problems to a number of concurrent processes on (possibly) distant systems. Another is that where the mainframe was typically placed close to the workstations, the servers in the cloud can be located remotely. A third is that the workstations were often unable to function without access to the mainframe, whereas modern desktops are able to use the advantages of the mainframe/cloud as well as the advantages of an autonomous desktop.

Re:No more!! (0)

Anonymous Coward | more than 4 years ago | (#29824349)

What the fuck are you talking about? Seriously. You're not just wrong here and there, you're wrong on EVERY POINT YOU TRIED TO MAKE!

Do you know even the slightest thing about mainframes? Clearly, you don't. For if you did, you'd realize that even low-end mainframes consist of numerous independent processing units (and I'm not talking about the redundant ones, either). Some mainframe systems had processing units in different rooms of the same building, and I worked with one system that had processing units in three different buildings on one campus.

Some mainframes were placed close to the dumb terminals, but it was also very common for users to access a mainframe remotely. Fuck, this was often done with a 300 baud modem over a POTS connection. I remember setting up such systems in the 1970s so that offices in Miami could connect to the mainframes in New York.

You're not as wrong on your third point, but you're still wrong. Various DEC terminals, for instance, did have processing support built in. They often lacked storage, however, so work couldn't be saved.

Re:No more!! (1)

commodore64_love (1445365) | more than 4 years ago | (#29824305)

>>>these young punks have a huge ego, but no knowledge of computing history. They don't realize that "cloud computing" is merely what we called "mainframes" back in the day.
>>>

What I don't understand, even if these young'uns have no knowledge of history, is why they think cloud computing is a good idea. Why would they want to offload all the processing onto some distant central computer, when they have a quad-core CPU sitting right here in front of them? It makes no logical sense.

My own computer may "only" be a Pentium 4, but it's still about 12,000 times faster than the old 8-bit machine where I used to write book reports. If that ancient machine could handle the workload, then my current computer certainly can - there's no need to connect to some distant mainframe.

Re:No more!! (1)

Sockatume (732728) | more than 4 years ago | (#29824569)

It's a trade-off. For trivial tasks like word processing, the performance trade-off is worth the convenience benefit. For the home user's idea of a high-performance-computing task, such as gaming and video watching, the convenience benefit is negligible compared to the huge performance trade-off.

For real high-performance-computing tasks where purchasing a lot of computing resources for one project might not be justified, again there's a very large convenience benefit to just renting at a distance, which is why mainframes are still in use for scientific research. That's the only example I can think of where it's honestly about power, and it's not what you'd call "cloud computing", if only because that's a term created for home-user consumption.

Re:No more!! (1)

werfu (1487909) | more than 4 years ago | (#29823477)

Cloud computing is the combination of distributed computing using a cluster and network storage. It's nothing new, but it hadn't been harnessed to do general-purpose computing until recently. I think it happened because of our ability to now run VMs on a cluster seamlessly (switching workloads between servers) and treat the cluster as one unique entity, hence the cloud.

Re:No more!! (0)

Anonymous Coward | more than 4 years ago | (#29823735)

IBM mainframes have had virtualization capabilities like that since the 1960s! Haven't you kids ever heard of CP/CMS [wikipedia.org] and IBM's VM [wikipedia.org] line of mainframe operating systems?

I was at a conference two weeks ago, and stopped at a booth for a cloud computing company. I talked to one of their marketing reps (some kid in his early 20s) who was going on and on about how "innovative" their product was. So I had to politely inform him how their so-called "innovations" had been done a few years before his parents had even been born.

Not only that, but they actually worked back then! What IBM put together was rock solid. That's why we still see some vintage systems used today. Today's cloud computing "solutions" are a huge farce compared to what IBM put together decades ago.

Re:No more!! (1)

IBBoard (1128019) | more than 4 years ago | (#29823839)

Off-topic (as it is rated) for rendering on the cloud, but potentially on-topic for cloud in general. At the moment people want some degree of privacy of data, but "cloud" wants us to throw it to teh interwebz and process it there. Anyone care to guess how much easier it may become to get the data the OP wanted? ;)

Re:No more!! (1)

werfu (1487909) | more than 4 years ago | (#29824355)

Cloud is as private as anything else you would put into a web application: privacy is bound to your service provider's terms of use. Their terms of use are a contract between you and them. They are as bound to them as you are, except they have the right to change them at any time, and you have the right to refuse the changes and stop using their service. If they break their terms and your data ends up in someone else's hands for any reason, you can always go after your service provider and whoever caused the leak. Now if you fear that your info will end up in the government's hands, then you've got a bigger problem than worrying about the cloud.

Re:No more!! (1)

commodore64_love (1445365) | more than 4 years ago | (#29824357)

>>>It's nothing new, but it hadn't been harnessed to do general-purpose computing until recently.

That's odd. I seem to recall using my VAX terminal to "cloud compute" and do general computing (math problems) back in the 80s. Maybe you think that doesn't count for some reason?

Re:No more!! (0)

Anonymous Coward | more than 4 years ago | (#29823519)

Where's my privacy, oh no I lost my privacy, please help me find my privacy!
*Sob sob*

- Meaningless.com

Re:No more!! (1)

glop (181086) | more than 4 years ago | (#29823537)

I am not that sure actually. It's not very well defined and different people use it differently, sometimes with a marketing agenda.
But it also conveys some property quite clearly:
  - cloud computing is not precisely located and you don't really care
  - it's not happening in your home
  - it's everywhere or almost
  - it's out of your control (others may access it without your knowledge etc.)
  - it can disappear and be unavailable anytime (just like real clouds ;-)

The previous terms were not bad either, but the market made them more precise. For instance, there was Grid computing: it's always a cluster inside a company that provides centralized computing power for embarrassingly parallel problems. The original idea was that it would be like the electrical grid and you would just send your problems to the computing grid and they would go wherever there was an excess supply of computing power. Very cloud-like actually...

I don't think the term is useless and I actually think it's nice to change the buzzwords every so often...

 

Re:No more!! (1)

commodore64_love (1445365) | more than 4 years ago | (#29824559)

>>>it's nice to change the buzzwords every so often...

Bad is good. And good is bad. War is peace, and chocolate rations have been increased from 10 to 5.

How about instead of inventing words we just use the ones we have? Rather than "cloud" computing we could just call it internet-based computing, because that's what it is.

Re:No more!! (1)

PhrostyMcByte (589271) | more than 4 years ago | (#29823817)

"Cloud" is a pretty stupid name, one that bugs me almost as much as "AJAX", but it's hurt even more by being associated with two things at once.

The first is simply a client->server connection, or perhaps hosting your data online. This, I think, doesn't need a new name. The old names were working fine.

The second, and far more interesting, is for much more complex systems that are marking a move from managed server hosting to scalable application hosting. These guys design their systems from the ground up to scale your applications quickly, efficiently, and reliably across a pool of servers, doing it all in one place, hopefully at a lower cost than what it would take for you to do something similar on your own. I think this is a different enough approach that it needs a new name to differentiate it from what we're used to. This is also what TFA seems to mean when they say "cloud".

I'm currently evaluating these cloud services for my company -- the idea that I can simply focus on writing good code and let someone else worry about starting new servers when usage spikes, replacing ones that break, adding more storage for the database, etc... it is very tempting. The cons? They all use proprietary APIs that seem similar on the surface but in the end are different enough that you really need to specialize your code for their service -- if you ever want to move your app over to something else, it's not going to be simple.
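One common way to limit that lock-in is to keep the provider-specific API behind an interface you own; the sketch below uses two hypothetical in-memory providers purely to illustrate the pattern, not any real vendor SDK.

    # Sketch of insulating application code from provider-specific APIs so a
    # later migration stays cheap. Both provider classes are hypothetical
    # stand-ins for real vendor SDK wrappers.
    from abc import ABC, abstractmethod

    class BlobStore(ABC):
        """The only storage interface application code is allowed to see."""
        @abstractmethod
        def put(self, key: str, data: bytes) -> None: ...
        @abstractmethod
        def get(self, key: str) -> bytes: ...

    class ProviderABlobStore(BlobStore):
        def __init__(self):
            self._data = {}              # stand-in for vendor A's SDK calls
        def put(self, key, data): self._data[key] = data
        def get(self, key): return self._data[key]

    class ProviderBBlobStore(BlobStore):
        def __init__(self):
            self._data = {}              # stand-in for vendor B's SDK calls
        def put(self, key, data): self._data[key] = data
        def get(self, key): return self._data[key]

    def save_report(store: BlobStore, name: str, body: bytes) -> None:
        store.put(f"reports/{name}", body)   # app code never names a vendor

    # Switching providers becomes a one-line change at the composition root:
    save_report(ProviderABlobStore(), "q3.txt", b"numbers")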

Re:No more!! (0)

Anonymous Coward | more than 4 years ago | (#29824715)

Chance of cloudy interwebs, as a cold corporate front moves past the mid-west.
60% chance of raining microsoft .ODF and google sharks.
Watch out for the dry-online across texas, we're expecting some tesla-nados and severe thunder-lolcats.

Brought to you by the letter G for Greedy. (0)

TimeElf1 (781120) | more than 4 years ago | (#29823309)

And how much is this going to cost the end user? Just so we can have realistic clouds? No one looks at the clouds. How about realistic trees instead?

Bad idea! (1)

oo_HAWK_oo (1619801) | more than 4 years ago | (#29823313)

And then what happens when your kid fires up his bit torrent!?

Re:Bad idea! (2, Funny)

Yvan256 (722131) | more than 4 years ago | (#29823463)

The rendering of clouds in the cloud computing will stop.

With 31% chances of rain.

Question (0, Offtopic)

dorpus (636554) | more than 4 years ago | (#29823315)

For all the talk of "cloud computing", are there publicly available data sets from Google (or other companies)? I'm a graduate student interested in data mining of health outcomes data. My biggest challenge remains the fact that HIPAA and other patient privacy concerns make it very difficult to obtain health outcomes data; it's still a 1980s world where data are granted through official channels after extensive paperwork, or as a favor from people who trust me.

Re:Question (2, Insightful)

TheKidWho (705796) | more than 4 years ago | (#29823425)

Maybe those patients don't want you to know anything about themselves?

Re:Question (1)

commodore64_love (1445365) | more than 4 years ago | (#29824591)

Yes for example if my records are opened, it will be discovered my IQ is only 90, and I may lose my engineering job. Shhh.

Re:Question (1)

slim (1652) | more than 4 years ago | (#29824471)

OK, so you've been modded offtopic.

But I'm curious why you thought a question about publicly available datasets had anything to do with cloud computing? It doesn't look as if you were trolling. So what was it you were misunderstanding?

Re:Question (1)

Sockatume (732728) | more than 4 years ago | (#29824681)

You realise that those restrictions are the only reason that the data could be gathered in the first place, right? People won't allow their information to be disclosed at all unless there's some reassurance about what will be done with it. Maybe you should collaborate with somebody who can get access instead of trying to work around it, if only for your own good. It's not good for your career to be known as "the guy who stole all that private medical data and wrote a paper with it". The journals frown upon ethics violations.

I don't know... (1)

gijoel (628142) | more than 4 years ago | (#29823339)

The last thing I want to see while I'm playing an FPS is buffering.... 32%

Re:I don't know... (1)

im_thatoneguy (819432) | more than 4 years ago | (#29823855)

RealityServer is designed to deliver single frames for visualization, not interactive games. The acceleration structures would make it pretty much useless for a video game with lots of deforming meshes. The applications are things like an Ikea website where you can build your living room, place furniture, and see a photorealistic rendering of the outcome without the several minutes of waiting that would be required on a local machine without a render farm.
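(For a sense of what the client side of a "render in the cloud, ship back an image" service looks like, here is a minimal sketch. The endpoint URL, request fields, and response format are assumptions for illustration, not RealityServer's actual protocol.)

# Minimal sketch of a client asking a remote render service for one frame.
# The URL, request fields, and response format are hypothetical.
import json
import urllib.request

def request_frame(scene_id: str, camera: dict, resolution=(1280, 720)) -> bytes:
    """Send scene/camera state upstream, get a finished image back."""
    payload = json.dumps({
        "scene": scene_id,
        "camera": camera,                  # position, look-at, field of view
        "width": resolution[0],
        "height": resolution[1],
        "format": "jpeg",
    }).encode("utf-8")
    req = urllib.request.Request(
        "https://render.example.com/v1/frame",   # hypothetical endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()                 # finished JPEG bytes, ready to display

Rotating the sofa in that hypothetical room planner would just re-send the camera/scene delta; all the ray tracing happens server-side and only the finished image comes back.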

Pay to Play? (5, Insightful)

zcold (916632) | more than 4 years ago | (#29823401)

Awesome! So instead of buying a video card, I will now have an option to pay yet another monthly fee to play games? I'm so excited!

Re:Pay to Play? (1)

Sockatume (732728) | more than 4 years ago | (#29823977)

Think of it as paying for everyone else's video cards. On credit. Forever.

Re:Pay to Play? (0)

Anonymous Coward | more than 4 years ago | (#29824363)

Credit? That means it isn't my money! Sweet deal!

Re:Pay to Play? (0)

Anonymous Coward | more than 4 years ago | (#29824059)

If it's ten bucks a month, it'll be less than my investment in video cards.

Re:Pay to Play? (1)

Comatose51 (687974) | more than 4 years ago | (#29824605)

That's a bit of a knee-jerk reaction. What if the monthly fee were $1 a month? Instead of continuously upgrading every year or so, you pay $12 a year to have the latest and greatest without having to do any of the work yourself. It's also pretty attractive to game developers, because they can assume that all their customers are running on the same 3D rendering platform and be sure that everyone will have the same experience. Until we know the cost of the service, it seems premature to judge its usefulness. I know telcos have left a bad impression of paid services, but not all recurring fees are bad. Electricity and water are for the most part pretty reliable services that we pay for monthly.

Re:Pay to Play? (1)

zcold (916632) | more than 4 years ago | (#29824685)

Ha, a dollar a month... lolz. Sorry if I sound like a jerk, but I highly doubt we would ever see prices like that... keep dreaming of a perfect world. Business is business because it makes money... lots of money...

This really is pointless! (0)

Anonymous Coward | more than 4 years ago | (#29823479)

You want to hand off rendering a high-quality photorealistic image to an offsite server, then download the result - an ultra-high-quality image - and you think this is going to be faster than rendering locally?

For an onsite gigabit LAN, I'm willing to believe that. But most of us have small pipes connecting to the cloud. I bet it would be faster to render locally than to render remotely once upload and download time are included.
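(A crude back-of-envelope version of that comparison; every number below is an illustrative assumption, not a measurement.)

# Rough, illustrative numbers only -- not measurements of any real system.
remote_render_s   = 0.5    # assume the Tesla farm renders the frame in 0.5 s
image_size_mb     = 3.0    # assume a ~3 MB photorealistic image
downlink_mbps     = 10.0   # a typical consumer downstream link
uplink_overhead_s = 0.05   # scene/camera update upstream plus round-trip

transfer_s = image_size_mb * 8 / downlink_mbps
remote_total_s = remote_render_s + transfer_s + uplink_overhead_s

local_render_s = 120.0     # assume a minutes-long local render without a GPU farm

print(f"remote: {remote_total_s:.1f} s, local: {local_render_s:.0f} s")
# With these assumptions the cloud wins for heavyweight single frames,
# but the ~2.4 s of transfer alone already rules out interactive frame
# rates on a typical home connection.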

Return Of The Mainframe! (1)

jabjoe (1042100) | more than 4 years ago | (#29823575)

It's that time in the cycle where we talk about thin clients and the mainframe again. Own nothing, rent everything, submit to central control. You know what, just like the last few times, I'll pass.

YOUR RENDER FARM ESPLODE! (1)

Foktip (736679) | more than 4 years ago | (#29823603)

Um, aren't they discontinuing their high-end products because they overheat and explode?
So, like, are they gonna use ATI cards for this or something? LOL :P

So long render farm and thank for all the CG fish (1)

w3irdizum (1537733) | more than 4 years ago | (#29824041)

I manage and maintain a 200+ machine, dual quad-core VFX render farm at the moment... I'll go get my coat. In all seriousness, I've been expecting this for quite a while, ever since NVIDIA bought Mental Images. But I can't help remembering that Intel were the ones shouting about ray-tracing cards; NVIDIA was all about whatever works, even if it's a cheat... so I'll hold fire on looking for a new job till I see some real-world results. Buttons aren't toys.

End of the upgrade path? (1)

Picass0 (147474) | more than 4 years ago | (#29823663)

So if the GPU becomes a glorified web client, how will they keep soaking everyone for a (bi)yearly card upgrade? If all of the most complex tasks are handed off to a remote server, that's where the upgrades should be handled.

Also, if part of the secret sauce is being handled remotely, NVIDIA has no further excuse for keeping its Linux drivers closed.

Re:End of the upgrade path? (1)

RealErmine (621439) | more than 4 years ago | (#29823895)

So if the GPU becomes a glorified web client, how will they keep soaking everyone for a (bi)yearly card upgrade?

Oh, but there's a better revenue stream here: subscription fees.

Re:End of the upgrade path? (1)

Sockatume (732728) | more than 4 years ago | (#29824039)

Well, now you get soaked for the upgrade whether you want/need it or not. That's the rub.

Re:End of the upgrade path? (1)

imakemusic (1164993) | more than 4 years ago | (#29824175)

Let's see... I couldn't find any mention of price, but £5 (GBP) per month sounds like a reasonably low figure. My last graphics card was about £60, and I buy one roughly every two years.

£5 x 24 months = £120

That's twice as much, so they're twice as well off with me renting this tech. Plus I'm going to need some compatible hardware to receive it on...

Won't work in some areas (2, Insightful)

Yvan256 (722131) | more than 4 years ago | (#29823699)

There is still this thing called a "bandwidth quota", where you get overcharged to death if you go over it. As an example, say $40/month for 50GB, then $10 per additional GB.

And please no stupid "change ISP" comments, a lot of people aren't lucky enough to even have a choice of high-speed providers. It's either high-speed cable/DSL, or dial-up. Sometimes from the same ISP, even.

Re:Won't work in some areas (-1, Troll)

Anonymous Coward | more than 4 years ago | (#29823783)

Have you thought about changing ISP?

Re:Won't work in some areas (1)

torchdragon (816357) | more than 4 years ago | (#29823893)

Perhaps you could call my condo association and have them reverse their decision to disallow Verizon's deployment of FIOS.

You're right though, I suppose I should just be asking myself "Have you thought about changing residences?"

That makes a lot of sense just to play a video game, way more sense than going to NewEgg and buying a new video card for my computer.

Re:Won't work in some areas (1)

MozeeToby (1163751) | more than 4 years ago | (#29823957)

Fine by me. If NVIDIA thinks they can make this work, it'll be just one more industry supporting net neutrality. Maybe we should encourage more and more industries to implement high-bandwidth, questionably useful technologies. Eventually, the lobby money from the net neutrality group will be greater than the lobby money from the telcos/ISP group.

And so... (1)

inode_buddha (576844) | more than 4 years ago | (#29823753)

Now you can get tele-fragged by a n00b even faster, thereby enabling greater synergies for e-presence and brand recognition!

Latency? (1)

kalirion (728907) | more than 4 years ago | (#29823865)

So, even if I had the bandwidth to upload graphics data (geometry, textures, etc.) and download 1080p video in real time without any buffering, my 5 ms monitor would now have to deal with at least 30 ms of video latency?
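(A quick, illustrative latency budget; all figures below are assumptions, not measurements of any real service.)

# Illustrative per-frame latency budget for remote rendering (assumed values).
budget_ms = {
    "capture input + send upstream": 10,
    "server render (60 fps frame)":  17,
    "encode video frame":             5,
    "network transit (one way)":     15,
    "decode + display":               8,
}
total = sum(budget_ms.values())
print(f"end-to-end: ~{total} ms on top of the monitor's own response time")
# ~55 ms here; even optimistic numbers land well above the few milliseconds
# a local GPU adds between input and pixels.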

Great... (1)

Dorsai65 (804760) | more than 4 years ago | (#29823879)

A big-ass binary hairball to further clog the tubes.

How much additional traffic is this going to add to all the other interactive high-bandwidth stuff transiting the infrastructure?

Yuo fail It!! (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#29824135)

To the crowd in plainly Stat3s that

Bandwidth & Latency? (2, Insightful)

Nethemas the Great (909900) | more than 4 years ago | (#29824193)

Backbone and last-mile providers are already crying about filesharers overburdening the infrastructure, especially here in the U.S. ISPs in the U.S. typically devote well more than 95% of capacity to downstream traffic to try to cope. A modern graphics card works with bandwidth [wikipedia.org] measured in GB/s. There's no way a 50+ FPS 1080p (or better) video feed from a rendering farm could be supported for every console user. And while they don't need as high a resolution, mobile devices communicate over cellular networks whose capacity problems make the in-ground network's look petty. Even if all of that could be remedied, the latency involved in reaching even a same-city rendering farm would still make for a lackluster experience.
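(Rough numbers for the parent's point; the resolution, frame rate, and compression ratio below are assumptions, not measurements.)

# Back-of-envelope bitrate for a streamed 1080p/50 fps feed (illustrative).
width, height, fps = 1920, 1080, 50
bits_per_pixel = 24                        # uncompressed RGB

raw_bps = width * height * fps * bits_per_pixel
print(f"uncompressed: {raw_bps / 1e9:.1f} Gbit/s")     # ~2.5 Gbit/s

# Even with aggressive real-time compression (say 100:1, an assumption),
# that's ~25 Mbit/s per player -- sustained, downstream, for every
# concurrent user, which is exactly the capacity problem raised above.
compressed_bps = raw_bps / 100
print(f"assumed 100:1 compression: {compressed_bps / 1e6:.0f} Mbit/s")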

Problem with this (0)

Anonymous Coward | more than 4 years ago | (#29824217)

in regards to online gaming:

latency, latency...

latency.

OnLive similarity (1, Informative)

Anonymous Coward | more than 4 years ago | (#29824413)

Doing graphics work in real time is OnLive's department. I wonder what the patent status of this will be - OnLive filed a fair few, although I don't know which specific bits they cover. Should be interesting to see which company can deliver.

On your marks, get set....! (1)

Fishbulb (32296) | more than 4 years ago | (#29824705)

OK, it's time, everybody! Break those old Sun SPARCstation ELCs and SLCs out of storage!

Oh, wait, you don't have one? How about all those SunRays you've got in the garage?

No?

Right.
