
Cloud Gaming With Ray Tracing

Soulskill posted more than 3 years ago | from the shifting-burdens dept.


An anonymous reader writes "Since real-time ray tracing on a single desktop machine is still too slow for gaming, Intel has worked on a research project that puts the heavy workload up into the cloud, relying on multiple 32-core chips working together. That enables new special effects for games, like realistic reflections (e.g. on cars or the scope of a sniper rifle), glass simulations and multi-screen surveillance stations. The research paper also takes a closer look at various sources of latencies that become relevant in a cloud-based gaming approach."


83 comments


Bad Tech Journalism (4, Informative)

Sonny Yatsen (603655) | more than 3 years ago | (#35431144)

First Line of the Article:
"A new technology from Intel called ray tracing could bring lifelike images and improved 3D effects to games on tablets and other mobile devices."

GAH!

Re:Bad Tech Journalism (1)

devxo (1963088) | more than 3 years ago | (#35431248)

Yeah no shit, I programmed 2.5D ray tracing when I was like eight years old. I could model our home and other houses with it and have a few sprites in the "game", and it had a certain 3D feeling even though I knew nothing about 3D graphics. I think Doom was done similarly.

Re:Bad Tech Journalism (1)

sockman (133264) | more than 3 years ago | (#35431580)

No. Doom was ray casting, a much simpler technique.

Re:Bad Tech Journalism (1)

dmbasso (1052166) | more than 3 years ago | (#35432062)

That's why he said '2.5D ray tracing' (another way to say 'ray casting').

Re:Bad Tech Journalism (1)

amn108 (1231606) | more than 3 years ago | (#35432468)

Eh, I don't know about your (if any) experience with computer graphics, but as far as I know, and I did my fair share of CG programming, raycasting has nothing to do with the mentioned '2.5D'. DOOM was '2.5D' because of the peculiarities and limitations of its supposed 3-D engine, where level maps were stored in memory in such a way (BSP) as to make it impossible to put two floors on top of one another (if I am not mistaken), and it also couldn't handle sloped surfaces. That's why it is called 2.5D. Basically it was another revision of the Wolfenstein 3D engine they had created earlier.

Raycasting is simply emitting rays from the camera source through its image projection plane, as opposed to raytracing, where the rays are emitted from light sources and are 'caught' by a light-sensitive element such as an eye or a camera. Anyway, Wikipedia explains both concepts very aptly, for those curious.

Re:Bad Tech Journalism (0)

Anonymous Coward | more than 3 years ago | (#35432812)

Raycasting is simply emitting rays from the camera source through its image projection plane, as opposed to raytracing, where the rays are emitted from light sources and are 'caught' by a light-sensitive element such as an eye or a camera. Anyway, Wikipedia explains both concepts very aptly, for those curious.

But you got that backwards... If you've implemented raytracing you really should know that rays are emitted from the camera. =)

Remember the old... For each pixel in the image plane, compute the camera ray. Then compute the intersection of the camera ray with an object. Maybe compute reflection rays if you want reflections and refraction rays if you want refraction. Then, for each light source, compute rays from the intersection point and thereby compute the illumination of the pixel. And so on...
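
For reference, here is that loop as a minimal, self-contained sketch (illustrative C++ written for this thread, not code from any actual engine; the scene of one sphere and one point light, and every name in it, are made up for the example):

// Minimal backward ray tracer sketch: one sphere, one point light.
#include <cmath>
#include <cstdio>

struct Vec { double x, y, z; };
Vec operator-(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec operator+(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec operator*(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec norm(Vec a) { double l = std::sqrt(dot(a, a)); return a * (1.0 / l); }

// Ray/sphere intersection: returns distance t along the ray, or -1 on a miss.
double hitSphere(Vec orig, Vec dir, Vec center, double r) {
    Vec oc = orig - center;
    double b = dot(oc, dir);
    double c = dot(oc, oc) - r * r;
    double disc = b * b - c;
    if (disc < 0) return -1;
    double t = -b - std::sqrt(disc);
    return t > 1e-4 ? t : -1;
}

int main() {
    const int W = 64, H = 48;
    Vec cam{0, 0, 0}, sphere{0, 0, -5}, light{5, 5, 0};
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // 1. Compute the camera (primary) ray through this pixel.
            Vec dir = norm({(x - W / 2.0) / W, (H / 2.0 - y) / H, -1});
            double t = hitSphere(cam, dir, sphere, 1.0);
            char shade = ' ';
            if (t > 0) {
                // 2. Compute the hit point and its normal.
                Vec p = cam + dir * t;
                Vec n = norm(p - sphere);
                // 3. Ray toward the light; with only one object in the scene
                //    there is nothing to occlude it, so just take the
                //    Lambert (N.L) term as the pixel's illumination.
                Vec l = norm(light - p);
                double diff = dot(n, l);
                shade = diff > 0.5 ? '#' : (diff > 0.0 ? '+' : '.');
            }
            std::putchar(shade);
        }
        std::putchar('\n');
    }
}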

Re:Bad Tech Journalism (1)

amn108 (1231606) | more than 3 years ago | (#35433578)

It's not the direction that counts actually, it's the steps involved - raycasting does not compute any rays between scene objects themselves, only rays that cross a given projection plane and a source of light or a scene object. Raytracing adds interplay between scene objects, in the form of rays that are emitted in pretty much all directions from each point on any scene object, ideally even INSIDE it (light penetrates human skin, for example). This makes raycasting immensely faster than raytracing (which is why real-time raytracing is such a big deal, obviously), but it does not produce the image quality raytraced images are known for.

Obviously any fast and self-respecting raytracing OR raycasting algorithm does not actually emit anything, mathematically speaking - instead it immediately proceeds to finding the point where a ray crosses a scene object, and if it's a raytracer or a photon mapper it steps into a recursion where it 'emits' rays from that point again, with each ray producing an 'emitter' again, and so on and so on until you run out of stack space or abort the recursion with some trigger ;-)

Re:Bad Tech Journalism (1)

mug funky (910186) | more than 3 years ago | (#35436392)

you're thinking about global illumination.

illuminance on a surface is computed simply by the surface normal (or per-pixel interpolated normal) and its angle relative to all applicable light sources. no raytracing there.

backward ray tracing is shooting off a ray for every pixel in the camera's projection plane and checking for intersections (and optionally interactions with objects).

forward ray tracing is shooting off a bunch of rays from each and every light source and bouncing them around the scene. this is sometimes called "photon mapping".

typically, the situation you're thinking of is where both are used. the (quite low res) forward ray-trace gives the backward raytracer an idea about where the light is distributed, allowing it to conserve the number of rays it casts in subsequent "bounces" off/through objects. this greatly increases speed and reduces noise for the same number of rays.

Re:Bad Tech Journalism (1)

amn108 (1231606) | more than 3 years ago | (#35439588)

Yes, you're absolutely right. But I was simply talking about how raycasting stops where raytracing truly begins (forward or backward alike.)

Re:Bad Tech Journalism (0)

Anonymous Coward | more than 3 years ago | (#35434900)

And the reason that they were stored that way was so that a simple ray-casting algorithm could cheaply pull out a ceiling and floor height for each column in the display. If you'd spent two seconds doing a search instead of all that time showing your ignorance then the world would be a better and richer place. Everyone who has read your post is a little bit stupider than they were before. Edit: oh my god, you actually did the search and found the standard explanation on the wiki page. Words fail me.

Re:Bad Tech Journalism (1)

amn108 (1231606) | more than 3 years ago | (#35439608)

Well, at least I corrected the OP's notion that '2.5D raytracing is raycasting', which is completely without merit or sense. I was explaining why the Doom/Wolfenstein engines are called 2.5D - yes, it's because, among other things, Wolfenstein used the same wall height everywhere and Doom went a step further and carried the height in the map, something that also made it impossible to have two floors on top of one another. I don't see why you paint me clueless here. Yes, when I was experimenting with raytracing algorithms back in the day, I actually hadn't read a single book (much less an English one) about it - I came to it myself after we had our first physical optics class in high school. And I hadn't read Wikipedia prior to posting; I am not even sure WHICH article you claim I had read. On Doom? On raytracing?

Re:Bad Tech Journalism (0)

Anonymous Coward | more than 3 years ago | (#35439428)

Raycasting is simply emitting rays from the camera source through its image projection plane, as opposed to raytracing, where the rays are emitted from light sources and are 'caught' by a light-sensitive element such as an eye or a camera. Anyway, Wikipedia explains both concepts very aptly, for those curious.

Raytracing these days is done by tracing rays backwards from the camera, so your distinction is not very helpful in this regard. The primary difference between raytracing and raycasting is that the former attempts to trace rays as they interact with surfaces through reflections, etc., using physically accurate laws. The latter typically stops at whatever the rays initially hit, and is largely used to determine visibility.

Re:Bad Tech Journalism (1)

Zaphod The 42nd (1205578) | more than 3 years ago | (#35436924)

Doom was absolutely not done with ray casting. The scene is composed, piece by piece, by rendering on-the-fly approximations of the walls and ceilings based on a 2D level map. Nothing was truly 3D (so I guess that's why you're calling it 2.5D), but nothing used ray-tracing or ray-casting either. Maybe they threw a few rays around to find which walls they hit, but they didn't do raytracing or raycasting per-pixel to calculate color values. As soon as it knows which wall is where, it just does a fill on that whole polygon. Also, Doom starts in the back and works its way forward (painter's algorithm) and overlaps many different parts of the scene; it draws the whole skybox, then it draws a roof on top of it, so you can see the sky if there's a window. If it were using raytracing this would never happen.

Re:Bad Tech Journalism (0)

Anonymous Coward | more than 3 years ago | (#35439760)

Doom uses a raycasting algorithm that works in a 2D (xy) plane. The results are then displayed in pseudo-3D space. There's one ray per column of pixels. Where the ray hits a wall determines which vertical column of pixels from the wall texture is drawn there. Because you can only rotate the camera around one (z) axis, it's possible to do a simple vertically scaled scanline across the wall surface for perspective-correct texturing. The floors and ceilings ("flats") are rendered a bit differently - the scanlines are horizontal in screen space and arbitrarily angled in texture space. The optimizations needed to make this run on a 386 required that the flat textures all be the same size (64x64 pixels), and they couldn't be shifted around like wall textures.
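
As a concrete illustration of the one-ray-per-column idea, here is a tiny self-contained sketch over a tile map (illustrative C++; this is closer to a Wolfenstein-style grid raycaster than to Doom's actual BSP-based renderer, and the map, player position and constants are all made up for the example):

// One ray per screen column over a tiny tile map, Wolfenstein-style.
// Perspective is simplified (no fisheye correction); output is text only.
#include <cmath>
#include <cstdio>

const int MAP_W = 8, MAP_H = 8;
const char MAP[MAP_H][MAP_W + 1] = {
    "########",
    "#......#",
    "#..##..#",
    "#......#",
    "#......#",
    "#..#...#",
    "#......#",
    "########",
};

int main() {
    const int COLS = 60;                 // screen columns, i.e. 60 rays
    double px = 2.5, py = 2.5;           // player position (inside an open cell)
    double angle = 0.9;                  // view direction (radians)
    double fov = 1.0;                    // ~57 degree field of view

    for (int col = 0; col < COLS; ++col) {
        // Ray direction for this column.
        double a = angle - fov / 2 + fov * col / COLS;
        double dx = std::cos(a), dy = std::sin(a);

        // March the ray in small steps until it enters a wall cell.
        double dist = 0;
        while (dist < 16.0) {
            dist += 0.01;
            int cx = int(px + dx * dist);
            int cy = int(py + dy * dist);
            if (MAP[cy][cx] == '#') break;
        }
        // Farther hit => shorter wall column on screen.
        int height = int(10.0 / dist);
        std::printf("col %2d: dist %.2f wall height %d\n", col, dist, height);
    }
}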

Re:Bad Tech Journalism (1)

BisexualPuppy (914772) | more than 3 years ago | (#35432034)

This is raycasting, not raytracing.

When doing raycasting, you basically throw one ray per column (320x240 resolution -> 320 rays), and only primary rays are thrown, most of the time. Perspective is then added (the farther away the ray hits, the smaller the wall appears on screen). Quite fast, albeit very simple (no lighting, reflection, or such).
When doing raytracing, you throw one ray per pixel (that's the primary one) to know which object it hits, then another one for each light source (secondary ones, for shadows and color), then N others if needed (think reflections, etc.). Each of these secondary rays will then throw other rays, and so on. Computationally very intensive, even after good optimizations, but quite realistic.
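
To put rough numbers on that comparison (illustrative arithmetic only, assuming one light and a single reflection bounce per primary hit):

// Rough ray-count comparison at 320x240, as described above.
#include <cstdio>

int main() {
    const long cols = 320, rows = 240;
    long raycast_rays = cols;            // one primary ray per column
    long primary      = cols * rows;     // one primary ray per pixel
    long shadow       = primary;         // one shadow ray per hit, one light
    long reflection   = primary;         // one bounce, before it recurses further
    std::printf("raycasting : %ld rays/frame\n", raycast_rays);
    std::printf("raytracing : >= %ld rays/frame (primary + shadow + 1 bounce)\n",
                primary + shadow + reflection);
}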

Re:Bad Tech Journalism (1)

the_humeister (922869) | more than 3 years ago | (#35431758)

Even better:

The research paper also takes a closer look at various sources of latencies that become relevant in a cloud-based gaming approach.

How about those fucking usage caps???

Re:Bad Tech Journalism (1)

slim (1652) | more than 3 years ago | (#35431962)

How about those fucking usage caps???

They don't seem to have put Netflix out of business.

However, it's true that demand for higher bandwidth applications will drive a market for higher caps, uncapped contracts, and faster pipes.

Re:Bad Tech Journalism (1)

creat3d (1489345) | more than 3 years ago | (#35432840)

Don't forget bigger tubes!

Re:Bad Tech Journalism (1)

Stenchwarrior (1335051) | more than 3 years ago | (#35432158)

How about satellite users WITH usage caps? Not only do they have to deal with the cap, but also with 500ms latency. I know we're getting into appealing to the rural demographic, but farmers have money, too.

Re:Bad Tech Journalism (1)

bluefoxlucid (723572) | more than 3 years ago | (#35432230)

Didn't you see the word bullshit, spelled "Cloud," in the topic? Unless you see "Sephiroth" with it, you should immediately recognize the fact that nobody knows what they're talking about and they want to sell you garbage.

Re:Bad Tech Journalism (1)

somersault (912633) | more than 3 years ago | (#35432718)

To the cloud!

*rips out eyes/ears and smashes his head through some Windows*

Re:Bad Tech Journalism (1)

Joce640k (829181) | more than 3 years ago | (#35432842)

Intel has been trotting this story out every three months or so for as long as I can remember.

As memes go, "Intel shows fully raytraced game" is right up there with "Duke Nukem is nearly finished!" and "This year will be the year of the Linux desktop".

Re:Bad Tech Journalism (1)

wondershit (1231886) | more than 3 years ago | (#35434814)

For a while now I haven't opened links to *world sites like Computerworld, PCWorld or Infoworld. Keeps the blood pressure low. Too often they have horribly researched articles spread over as many pages as possible. It's unbearable how little content they provide with so many words. Comments on Slashdot seem to confirm this more often than not.

Re:Bad Tech Journalism (1)

scurvyj (1158787) | more than 3 years ago | (#35437950)

One of my final year 'side' projects at Uni was to write a ray-tracer and for extra points it did Diffuse Reflection.

I've seen a Java implementation, of all things, that can ray trace some simple specularly reflective spheres in real time.

The bit they never talk about is how the screen data is supposed to be pumped back to the viewer in time to render a full frame. Another bit they all neglect: how exactly is each node supposed to access the entire world database, which a ray tracer requires?

Is it just me or is IT journalism sinking even lower, just when we thought that was physically impossible?

F*ckin great idea (0)

Anonymous Coward | more than 3 years ago | (#35431296)

Spend $500 on a tablet.
Spend $5K/month to rent a cloud to play games...

Re:F*ckin great idea (1)

MozeeToby (1163751) | more than 3 years ago | (#35431482)

I wouldn't want my games up in Intel's cloud somewhere where I don't have any control and where I have to rely on my ISP to provide good latency. But it might be interesting to me to have a single powerful home server and then a couple laptops and a couple tablets that are basically IO devices and little else. Granted, you couldn't have everyone doing demanding, graphically intense games at all times, but a reasonably powerful desktop server should be more than capable of rendering 2x1080p laptop screens and 2x720p tablet screens, which is all most people have these days anyway (damn HDTV to hell for making people think 1080p is 'high resolution' for a PC monitor!). Of course, I'm not sure a wireless N network would keep up with the bandwidth requirements, but then you'd have the same exact problem (only much worse) trying to put those services on Intel servers in a datacenter somewhere.
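
To put a number on the bandwidth question raised at the end of that comment, here is the back-of-the-envelope arithmetic for the uncompressed case (illustrative only; streaming to those screens in practice would obviously be compressed, e.g. with H.264):

// Raw pixel bandwidth for 2x1080p + 2x720p screens at 60 fps, 24-bit colour.
#include <cstdio>

int main() {
    const long long bitsPerPixel = 24, fps = 60;
    const long long laptops = 2LL * 1920 * 1080;   // two 1080p screens
    const long long tablets = 2LL * 1280 * 720;    // two 720p screens
    long long bitsPerSecond = (laptops + tablets) * bitsPerPixel * fps;
    std::printf("raw video: %.2f Gbit/s\n", bitsPerSecond / 1e9);
    // Roughly 8.6 Gbit/s raw -- far beyond 802.11n, hence heavy compression
    // (and its encode/decode latency) is unavoidable for this kind of setup.
}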

Re:F*ckin great idea (1)

drinkypoo (153816) | more than 3 years ago | (#35432454)

Intel wants to find a way to remain relevant; if they can come up with some clustering secret sauce then there is a reason to continue to buy products based on their technology. The PC gaming market is in decline, the only space in computing that is growing rapidly is mobile computing, and corporations work on the model of boundless expansion. They need to conquer new markets to continue to exist. If Intel can get you to buy an Intel-based phone, desktop, and tablet because they will cluster, then it's a good strategy. Many technologies for clustering have been around for a very long time and it's long past time we saw some offerings on the consumer market. Next-generation wireless should offer pretty good throughput at close range, so if you put the AP where you spend most of your time it might be adequate for that kind of use.

Re:F*ckin great idea (1)

Joce640k (829181) | more than 3 years ago | (#35432926)

To me it sounds more like the Intel Larrabee division has moved to the 'cloud'. Apart from that it's just a repeat from 2006, 2007, 2008, 2009...etc.

Intel? (1)

fahlesr1 (1910982) | more than 3 years ago | (#35431324)

I was unaware Whitted worked for Intel. </sarcasm>

Re:Intel? (1)

N Monkey (313423) | more than 3 years ago | (#35432384)

I was unaware Whitted worked for Intel. </sarcasm>

Actually, someone else may have beaten him to it .. but now I can't find the paper that cites the earlier reference.. so this post is a bit pointless :-(

..typical... (1)

N Monkey (313423) | more than 3 years ago | (#35432536)

I was unaware Whitted worked for Intel. </sarcasm>

Actually, someone else may have beaten him to it .. but now I can't find the paper that cites the earlier reference.. so this post is a bit pointless :-(

Oh that's typical... I just found the paper. It's "Interactive Rendering with Coherent Ray Tracing" by Ingo Wald, Philipp Slusallek, Carsten Benthin, and Markus Wagner, published at Eurographics 2001.

Re:..typical... (1)

kbensema (1868742) | more than 3 years ago | (#35433130)

No. The only person who really 'beat' Turner Whitted to ray tracing was Appel, back in the 1960s. Turner Whitted formulated ray tracing in the 1980s. Ingo Wald's primary contribution to the field has been the development of packet-based ray tracing, which is the technique of exploiting operational coherence exhibited by a group of 'nearby' rays by tracing them together through the acceleration structure. It is especially effective when vector units on the CPU or GPU are used.
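
A toy illustration of the packet idea (illustrative C++ written for this thread: a scalar loop stands in for the SIMD lanes, a single sphere stands in for traversal of a real acceleration structure, and all type and function names are made up):

// Toy 4-ray packet intersected against one sphere. The scalar loop stands in
// for the SIMD lanes a real packet tracer would use.
#include <cmath>
#include <cstdio>

struct RayPacket {                 // four coherent rays traced together
    double ox[4], oy[4], oz[4];    // origins
    double dx[4], dy[4], dz[4];    // normalized directions
    double t[4];                   // closest hit distance per ray
    bool   hit[4];
};

void intersectSphere(RayPacket& p, double cx, double cy, double cz, double r) {
    for (int i = 0; i < 4; ++i) {  // one iteration per "SIMD lane"
        double ocx = p.ox[i] - cx, ocy = p.oy[i] - cy, ocz = p.oz[i] - cz;
        double b = ocx * p.dx[i] + ocy * p.dy[i] + ocz * p.dz[i];
        double c = ocx * ocx + ocy * ocy + ocz * ocz - r * r;
        double disc = b * b - c;
        if (disc < 0) continue;
        double t = -b - std::sqrt(disc);
        if (t > 1e-4 && t < p.t[i]) { p.t[i] = t; p.hit[i] = true; }
    }
}

int main() {
    // Four neighbouring primary rays (a 2x2 pixel block) looking down -z.
    RayPacket p{};
    for (int i = 0; i < 4; ++i) {
        p.ox[i] = p.oy[i] = p.oz[i] = 0.0;
        double dx = (i % 2) * 0.01, dy = (i / 2) * 0.01, dz = -1.0;
        double len = std::sqrt(dx * dx + dy * dy + dz * dz);
        p.dx[i] = dx / len; p.dy[i] = dy / len; p.dz[i] = dz / len;
        p.t[i] = 1e30; p.hit[i] = false;
    }
    intersectSphere(p, 0.0, 0.0, -5.0, 1.0);
    for (int i = 0; i < 4; ++i)
        std::printf("ray %d: %s t=%.3f\n", i, p.hit[i] ? "hit " : "miss", p.t[i]);
}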

Sigh... (1)

N Monkey (313423) | more than 3 years ago | (#35440162)

If you read the paper I mentioned, you will see it, in turn, cites "A. Appel. Some techniques for shading machine renderings of solids. SJCC, pages 27–45, 1968."

Ray Tracing? Is he a good opponent. (1)

Viol8 (599362) | more than 3 years ago | (#35431346)

Can't say I've ever heard of him though. I used to play against someone called Polly but she's gone now.

Re:Ray Tracing? Is he a good opponent. (0)

Anonymous Coward | more than 3 years ago | (#35432240)

*canned laughter*

New Technology? (2)

denshao2 (1515775) | more than 3 years ago | (#35431360)

"A new technology from Intel called ray tracing could bring lifelike images and improved 3D effects to games on tablets and other mobile devices." Ray tracing has been around a long time. Even ray tracing in the cloud isn't that new. NVidia has the RealityServer.

Re:New Technology? (2)

Bobfrankly1 (1043848) | more than 3 years ago | (#35431598)

Depending on how you define "Cloud", you might also look at 3D render farms that have been doing ray tracing for close to (maybe more than) 20 years.

Re:New Technology? (1)

CastrTroy (595695) | more than 3 years ago | (#35431666)

No, but raytracing the entire scene at framerates fast enough to play a game is something that is new.

Re:New Technology? (1)

Terrasque (796014) | more than 3 years ago | (#35433354)

http://www.youtube.com/watch?v=tCMo-bJQC8A [youtube.com]

Heaven 7 by Exceed.

64kb executable. Raytraced. Over 10 years old.

Oh, and while we're doing cool demos that use raytracing: http://www.youtube.com/watch?v=EK7jkVAvA_Y [youtube.com] (pouet link: http://www.pouet.net/prod.php?which=49856 [pouet.net] )

Re:New Technology? (1)

ConceptJunkie (24823) | more than 3 years ago | (#35438644)

Raytracing in 64kB 10 years ago is definitely cool, but they were raytracing in the early 80s.

So Intel, we finally get to see Larrabee eh? (1)

Mr Thinly Sliced (73041) | more than 3 years ago | (#35431536)

It's not that impressive, either.

On the topic of raytracing - one thing that still stands out to me from the images in the paper are the lack of proper occlusion and shadows.

Take a look at the shot of the close up of the car side - look under the front wheel and it just looks .... artificial.

Unless there's some magic sauce that can be sprinkled on this without adding a frame rate hit, this isn't really all that wow at all.

Re:So Intel, we finally get to see Larrabee eh? (0)

Anonymous Coward | more than 3 years ago | (#35432352)

Those images (especially the close up of the car side) were rendered to show off specific aspects of ray tracing. The close up of the side of the car, for example, was to demonstrate the reflection and refraction that ray tracing can handle.

There's no denying that if there were somehow a whole library of real time ray tracing features available and usable at this time, it would look amazing. The problem, and the point, is that it isn't possible now, but apparently Intel is working on that for us.

Re:So Intel, we finally get to see Larrabee eh? (1)

White Flame (1074973) | more than 3 years ago | (#35439850)

Take a look at the shot of the close up of the car side - look under the front wheel and it just looks .... artificial.

Unless there's some magic sauce that can be sprinkled on this

Sure. It's called radiosity [google.com] .

without adding a frame rate hit

...oh

Re:So Intel, we finally get to see Larrabee eh? (1)

grumbel (592662) | more than 3 years ago | (#35443082)

The core problem is that "raytracing" isn't a concrete thing or technology, it's a general purpose technique. It can do pretty much anything from a few ugly shiny spheres to photorealistic rendering, and rendering time might vary from fractions of a second to days. Just like good old polygonal graphics can do everything from basic stuff like Starfox to the latest Unreal Engine 3 tech demo or fully photorealistic movies. Without any clear footage of actual games it's really pointless to discuss the issue, especially considering that all the stuff that makes graphics actually look real (global illumination) is outside of basic raytracing and generally involves some kind of hacks and shortcuts to get realtime speed.

What's more horrific (2)

airfoobar (1853132) | more than 3 years ago | (#35431584)

That the article thinks "ray tracing" is a new Intel technology, or that it thinks "cloud" rendering is something that hasn't been around for 50 years?

Re:What's more horrific (0)

Anonymous Coward | more than 3 years ago | (#35433268)

This! FFS, we call it "cloud" and suddenly it's f'ing new. Damn, I'll be glad when this passes.

no meat about cloud computing? (1)

BigJClark (1226554) | more than 3 years ago | (#35431644)


I rtfa, and it's confusing. It started with talk of cloud computing on mobile devices (with no mention of how the constant speed bump of network lag is to be overcome) and then droned on about a new chip architecture.

Nothing to see here, moving along...

mobile devices? 3g/4g caps are too low to use (0)

Anonymous Coward | more than 3 years ago | (#35432156)

Mobile devices? 3G/4G caps are too low to use something like this, and even on cable you can hit the Comcast 250GB cap fast with this.

stupid article (1)

toxonix (1793960) | more than 3 years ago | (#35431654)

This is garbage. Mobile gaming, cloud computing, eh, rewriting Wolfenstein to add ray tracing in the cloud??? I can see why that might make a POC, but Wolfenstein's not even 3D! "We have a red car sitting at a courtyard, which has a very shiny reflective surface. That can be rendered very good." OK, not speaking Inglish isn't a crime. But the editors should catch this kind of thing. Not worth reading.

Fail (1)

Graham J - XVI (1076671) | more than 3 years ago | (#35431768)

"A new technology from Intel called ray tracing "

I stopped here.

Intel need to get out of graphics. (0)

Anonymous Coward | more than 3 years ago | (#35431898)

I'm a graphics programmer and researcher at a AAA games dev and I am bloody sick of Intel's crap.

They missed the boat with graphics accelerators, put out some awful hardware (that relied on software processors) that they illegally bundled into laptops, distorting the graphics industry for years. Then every year it's another ray tracing demo, usually jammed awkwardly into an id Tech engine.

Ray tracing has never been and never will be the future of realtime graphics. It's pointless and wasteful. Many people forget that Pixar's PhotoRealistic RenderMan software is a rasterizer and they only started using any raytracing when Cars came out. You just don't need it. No one cares about realistic reflection and refraction or caustics. Most of the people I have shown my work to (including other graphics programmers and artists) can't tell the difference between a simple noise distortion of the frame buffer and a refraction system that took 20 minutes to render.

I can create photoreal images on my netbook with a GPU that has 24 stream processors. I don't need a server farm.

Re:Intel need to get out of graphics. (1)

tepples (727027) | more than 3 years ago | (#35432072)

Then every year it's another ray tracing demo, usually jammed awkwardly into an id Tech engine.

What other engine do you recommend? What other major label releases engines of its five-year-old games under the GNU GPL?

I can create photoreal images on my netbook with a GPU that has 24 stream processors.

At what resolution and frame rate?

Re:Intel need to get out of graphics. (1)

grumbel (592662) | more than 3 years ago | (#35443276)

What other engine do you recommend? What other major label releases engines of its five-year-old games under the GNU GPL?

How about something new that they have written themselves? What's the point of demonstrating the supposedly next big thing in computer games with some obsolete five-year-old game? How are they ever going to get games written for raytracing if they can't even find somebody to put together a solid tech demo?

Programmer art (1)

tepples (727027) | more than 3 years ago | (#35444496)

How about something new that they have written themselves?

Intel is in the hardware business and possibly the driver business. Making parts of a video game other than code needs a different skill set; otherwise, you will likely end up with the phenomenon called "programmer art". It's far cheaper to start with a 5-year-old id game than to hire a producer and competent artists to come up with an original setting.

what about mods? (0)

Anonymous Coward | more than 3 years ago | (#35432002)

what about mods?

This sounds familiar (1)

drinkypoo (153816) | more than 3 years ago | (#35432144)

Isn't this what Sony promised us the Playstation 3 would do, and the supposed reason why they went with the "Cell" processor? Because every Sony device that had to do heavy graphics lifting would have one, and they would all cooperate to make your games better? And of course, this never came to pass, and Sony never really used Cells for anything else (ISTR there might have been a Cell-based Blu-ray player that wasn't the PS3, but maybe that was just a rumor).

Another silly cloud computing idea (1)

halo_2_rocks (805685) | more than 3 years ago | (#35432878)

Ah, cloud computing... pauses to laugh... OK, earth to the idiots at Intel: your network latency kills any benefits that could ever be imagined for this system. An average video card nowadays can push 80-100 Gbps, and higher end cards exceed 150 Gbps. Let's look at video cable speeds: HDMI pushes 10.2 Gbps, VGA 10.8 Gbps, DVI 9.9 Gbps. Now let's look at the typical home internet connection today: it averages 1.5-3 Mbps.

OK, let's do a thought experiment about how this stupid system would work. I need one frame rendered, and let's pretend I can send out a request to all the computing power I need to render it. Let's say the time it takes to ray trace the frame is essentially zero, since I can use the "cloud". Super!!! Now I need to put the frame together and send it back to my computer from the cloud. Uh oh!!! It's going to take a HUGE amount of time to get the final rendered frame back.

Here's the math. Let's pick a resolution of 1024 x 768 (yes, I know most of us run much higher than that) and we need 24 bits minimum of color. That's 18,874,368 bits. Using the typical internet connection, it will take 6-12 seconds to get each frame, assuming it takes no time to render it. Most games run at a minimum of 30 frames per second (50-60 frames per second is preferred). Now we have a system in which we get 1 frame every 6-12 seconds. That's a HUGE improvement, Intel. Thank you for that.
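
The arithmetic in that comment, spelled out (using its own assumptions: 1024x768, 24-bit colour, uncompressed, over a 1.5-3 Mbps link; illustrative only):

// One uncompressed 1024x768 frame over a 1.5-3 Mbps link.
#include <cstdio>

int main() {
    const double bitsPerFrame = 1024.0 * 768.0 * 24.0;    // 18,874,368 bits
    const double slowLink = 1.5e6, fastLink = 3e6;        // bits per second
    std::printf("frame size  : %.0f bits\n", bitsPerFrame);
    std::printf("at 3   Mbps : %.1f s per frame\n", bitsPerFrame / fastLink);
    std::printf("at 1.5 Mbps : %.1f s per frame\n", bitsPerFrame / slowLink);
    // ~6.3 s and ~12.6 s per uncompressed frame, i.e. nowhere near 30-60 fps,
    // which is why every real streaming service compresses the video stream.
}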

Re:Another silly cloud computing idea (0)

Anonymous Coward | more than 3 years ago | (#35433210)

That is uncompressed...

Also, you could do things like ray tracing in the background and then do simple bitmapping in the front end like we do now.

You could also do things like ray tracing everything past a certain distance but not nearby objects.

You are right, though, that with the latency it doesn't work. But for some select groups of games it could.

You could also have a lot of pre-rendered stuff sitting on the server, 'almost' ready to go. Then the client does the last step.

Re:Another silly cloud computing idea (1)

halo_2_rocks (805685) | more than 3 years ago | (#35433402)

Let's look at that for a moment. What compression is going to help this problem, or what would you suggest we use? For most real world scenes you are going to get very little benefit from compression. You might get savings of a few percent (and that would be some very excellent compression). The problem is that the pixels are usually very different (hence the need to ray trace it in the first place). After all, if the scene were all black, we wouldn't need ray tracing for it. I'm sorry, but I just don't see any way to get past the network bottleneck. Any way you pre-position or post-position or imposition this thing, it just doesn't work. You need a much faster network and that isn't going to happen. In 20 years, the typical internet connection may be 10 Mbps unless there is some huge public investment (I'm speaking within the US).

Re:Another silly cloud computing idea (1)

CastrTroy (595695) | more than 3 years ago | (#35433558)

Not sure what the problem is here, but you can stream 1080p video over a 10 Mbit internet connection. Basically, you are just constructing a video and playing the video back to the person. Properly compress the video with H.264, and there is no problem. Maybe you live in a part of the US where everybody has 1.5-3 Mbps connections, but where I live (Canada) 3 Mbps is actually the lowest anyone sells. And you can get 15 Mbps pretty cheap.

Re:Another silly cloud computing idea (1)

halo_2_rocks (805685) | more than 3 years ago | (#35433688)

Let's pretend you have a 15 Mbps connection (which I would say is a very atypical connection) and you get it down to 1 frame per second. It still doesn't matter, and this is an incredibly slow and stupid system. You need network bandwidth equivalent to the video cable spec hooked from your video card to your monitor for this system to work. As far as I know, that WILL never happen while any of us are alive. Also, please link this 1080p system that works on a 10 Mbps connection? As I pointed out already, HDMI (which is used for all known 1080p in the real universe) is a 10.2 Gbps cable spec (NOT 10 Mbps). The math doesn't work at 10 Mbps - sorry.

Re:Another silly cloud computing idea (1)

vadim_t (324782) | more than 3 years ago | (#35434124)

You're joking, right?

All you need is a 10Gbps LAN. It's expensive, but it can be had today. It's most likely doable over plain gigabit with compression or a reduced framerate.

Gigabit can already be had cheaply enough for home usage. 10G will get there eventually, certainly in a lot less than my remaining lifetime.

Re:Another silly cloud computing idea (1)

halo_2_rocks (805685) | more than 3 years ago | (#35435892)

We are talking about broadband and typical internet connections. I don't know where you get the idea that anyone can get a gigabit connection cheaply. Most broadband in the US is T1 or fractional-T3 speed, 1.5-3 Mbps. The price is roughly $30-60 per month for that from the cable providers. An OC-24, on the other hand, which is a 1.244 Gbps connection, will cost you ~$100,000 per month in the US. Seeing as how I don't have $100k per month to burn, I can't give you an exact figure. However, you are welcome to go price one and let me know if you can get one for less than that.

Re:Another silly cloud computing idea (1)

vadim_t (324782) | more than 3 years ago | (#35436560)

I can only see this working in LAN settings though.

The problem isn't with the video, it's the enormous CPU power required. Several very high end machines allocated to a single customer easily for hours, with all customers wanting to use it at about the same time, and at the price of a gaming service? I don't think it would work out at all.

This would be more useful for some sort of corporate/scientific visualization purpose maybe. For home usage I imagine video cards will get there fairly soon, especially if they make ones with acceleration for raytracing.

Re:Another silly cloud computing idea (0)

Anonymous Coward | more than 3 years ago | (#35434676)

What? Ever heard of video compression?

HDMI is 10Gbps, but Blu-Ray discs are 48 Mbit/s. HDMI has raw pixel data while Blu-Ray is compressed.

Lower the quality to 1/5th of Blu-ray's and it'll fit in 10 Mbps.
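
Worked out, the compression ratios that argument implies look roughly like this (using raw 1080p60 pixel data, about 3 Gbit/s, rather than the 10.2 Gbps HDMI link maximum, which also covers blanking intervals and audio):

// Raw 1080p60 pixel data versus Blu-ray and a 10 Mbps link.
#include <cstdio>

int main() {
    const double raw1080p60 = 1920.0 * 1080 * 24 * 60;     // ~3.0 Gbit/s
    const double bluray = 48e6, dsl = 10e6;                // bits per second
    std::printf("raw 1080p60         : %.2f Gbit/s\n", raw1080p60 / 1e9);
    std::printf("vs Blu-ray (48 Mbps): %.0f:1 compression\n", raw1080p60 / bluray);
    std::printf("vs 10 Mbps link     : %.0f:1 compression\n", raw1080p60 / dsl);
    // Roughly 62:1 for Blu-ray and 300:1 for the 10 Mbps case -- aggressive,
    // but in the range H.264 reaches at reduced quality.
}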

Re:Another silly cloud computing idea (1)

halo_2_rocks (805685) | more than 3 years ago | (#35436146)

I've heard of it, and I'm sick of the compression-as-magic-box argument (I've actually written compression routines myself). It's stupid. Stuffing 1080p remotely ray traced games down 10 Mbps lines (which nobody has, for the most part) is stupid. Compressing stuff isn't going to help things. Even if you could do it (and believe me, you can't), what are you going to get at most from it? 2 or 3 frames per second. WOW. I'm impressed. Most games I play nowadays already render just fine at 30-60 frames per second.

Re:Another silly cloud computing idea (1)

CastrTroy (595695) | more than 3 years ago | (#35434682)

That's because the HDMI cable carries uncompressed video + audio data. You would compress the data using H.264 to send it between the cloud and the client PC. If what you say were correct, then I wouldn't be able to play HD Netflix content on my home connection. Nor would I be able to play HD YouTube content. And I clearly can. Because they are sending encoded video. They aren't sending raw frames. The TV in your living room is a dumb box and can only interpret raw video/audio data, which is why it requires an HDMI cable with such high bandwidth.

Re:Another silly cloud computing idea (1)

halo_2_rocks (805685) | more than 3 years ago | (#35435774)

You are confused. HD Netflix is NOT 1080p (it is 720p, and sorry to break it to you, most of the titles are only recorded in NTSC - not even 720p). Streaming will NOT work for gaming for a number of reasons (lack of sameness between blocks, can't use buffering, etc.). And uncompressing images takes time too. Sorry, no free lunch, and this whole idea is just stupid. Much like cloud computing has been a stupid idea since IBM suggested it in the 1960s and is still a stupid idea today. If any of this were a good idea, we would all have terminals to IBM mainframes right now instead of the personal computers we have today (in their various forms).

Commercial solutions already available! (1)

Blaskowicz (634489) | more than 3 years ago | (#35438052)

H.264 doesn't work; you need a low latency codec. Computing the motion compensation between N keyframes means you're introducing N frames of latency.

So you need to transfer still images, encoded in MJPEG or something similar but more advanced. Is it possible?

Of course [microsoft.com] it is [virtualgl.org]! One solution was introduced recently with Windows SP1, the other one is open source and has been available for some years.

Doing it from the cloud (i.e. a fancy word for the internet) isn't so interesting; the technology sounds much more desirable on the company's LAN, and then on the home network. But it is still workable over the internet under the right bandwidth and latency conditions, i.e. you need a home connection that qualifies both for HDTV over DSL and for a good game of Counter-Strike. Good DSL may do; fiber would be much better. That's why it already exists, and is sold under the name "OnLive".

Re:Another silly cloud computing idea (1)

clgoh (106162) | more than 3 years ago | (#35437124)

You should take a look at onlive.com. It's exactly that kind of "cloud" gaming, and it exists right now.

Re:Another silly cloud computing idea (1)

halo_2_rocks (805685) | more than 3 years ago | (#35437354)

Now go look for real-life reviews of the service. I easily found dozens of user reviews like this one: "OnLive works if you're practically sitting at their server, but it's just not ready for mass market in any way shape or form." or "I doubt this service is really going to take off if it gets released to the public like this. No Wifi support and paying for a blurry video feed of a game isn't exactly 'fun'." And these aren't even high end games they are hosting. It's all about the math. I'm shocked such a stupid idea as this got funded. Somebody is about to lose a lot of money.

Re:Another silly cloud computing idea (1)

clgoh (106162) | more than 3 years ago | (#35439532)

Just signed up for a free trial to test it... Yeah it's blurry, but on a shitty DSL connection, the latency was better than I expected.

But it probably won't go anywhere.

uh huh (1)

Syberz (1170343) | more than 3 years ago | (#35433034)

When tablet/mobile data plans are no longer insanely expensive, and when broadband has no upload/download limits and decent speed, then you can start talking to me about rendering graphics in the "cloud".

missing the point (0)

Anonymous Coward | more than 3 years ago | (#35434252)

I think the whole idea of photorealistic ray tracing is wrong. Games are supposed to be different from reality. They all look the same now.

Re:missing the point (1)

Purity Of Essence (1007601) | more than 3 years ago | (#35439710)

I think the notion that games have to be any one thing is preposterous.

article about visual stuff w/ no images. (1)

cathector (972646) | more than 3 years ago | (#35436158)

How great would it be if /. automatically filtered stories that are about imagery but do not, in fact, have any images in them?

"cloud" makes sense? (1)

Onymous Coward (97719) | more than 3 years ago | (#35437068)

Is this an actual example of a good usage of the term "cloud"? In the sense of some computers out there somewhere doing stuff for you and you getting the results? Not long ago I heard about the company OnLive and their cloud-based gaming, where all the computing and rendering is done on their servers, you send your control inputs across the net to them and they send you back sound and video.

Played it not long ago myself and expected the lag to be bad, but it turned out it wasn't bad after all. You can sense it, especially doing certain things, but it doesn't get in the way. And I hear they have more latency cutting measures in store. Pretty neat stuff.

Cloud gaming opens up the possibility of leveraging more computing power per player, so I can see fancy effects like ray tracing being incorporated into cloud games.

Re:"cloud" makes sense? (1)

halo_2_rocks (805685) | more than 3 years ago | (#35437514)

Look at the real user reviews of this service on the web. It's going down hard. There are a few positive reviews but the majority are negative, for a simple reason: IT'S A STUPID IDEA. Rendering an NTSC image of a game on a server by creating a virtual session and then sending it to a user over a typical internet connection (btw, this thing needs a really high end internet connection to even work) makes no sense. This thing will be dead and buried by next year. Mark my words.

Re:"cloud" makes sense? (1)

Onymous Coward (97719) | more than 3 years ago | (#35446332)

NTSC? Those players are using thin pipes, huh?

Sounds like you haven't tried it. Give it a go, there's a free trial. It's easy to run. I'm curious to hear how you like it.

I've got a 20 Mbps connection and I'm probably close to one of their servers, so I haven't had network issues at all. You should only be running NTSC rates if you've got a 1.5 Mbps connection. That's well below average these days, isn't it?

Network speeds keep improving (google "bandwidth over time"). This stuff will only keep getting better. "... makes no sense" is like saying the Web makes no sense because you don't like what you're seeing with this new-fangled Mosaic thingy.

confirmed (0)

Anonymous Coward | more than 3 years ago | (#35439288)

Correct. Doom was the first computer game to use binary space partitioning (BSP). Wolfenstein 3D and Ultima Underworld used raycasting. Very clever raycasting in Ultima Underworld - it was far more advanced than Wolf 3D. It allowed sloping walls/floors and walls at arbitrary angles, you could swim under water, and you could pick up items about the place.
It even came out before Wolfenstein 3D.
Amazing graphics for the time. Totally underrated.

In number of images (1)

renoX (11677) | more than 3 years ago | (#35441724)

At 60Hz, one screen refresh is every 16ms, so the rendering takes 8 to 14 frames, with 5 frames caused by the network RTT.

Interesting.
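
Converted to milliseconds at a 60 Hz refresh (simple arithmetic on the frame counts quoted above):

// Latency figures above, expressed in milliseconds at 60 Hz.
#include <cstdio>

int main() {
    const double frameMs = 1000.0 / 60.0;                              // ~16.7 ms per refresh
    std::printf("8 frames               : %.0f ms\n",  8 * frameMs);   // ~133 ms
    std::printf("14 frames              : %.0f ms\n", 14 * frameMs);   // ~233 ms
    std::printf("5 frames of network RTT: %.0f ms\n",  5 * frameMs);   // ~83 ms
}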

Get your heads out of the cloud people (0)

Anonymous Coward | more than 3 years ago | (#35448128)

"Cloud" computing will not work, for these reasons:

1) ISPs will continue to throttle and cap bandwidth, especially since the internet infrastructure is owned by telcos who want to ensure people still get their content from expensive services like cable and cellular.
2) The infrastructure is not fast enough (nor will it be) to support intensive cloud operations like pushing 60fps of 1080p video along with sound and player controls.
3) Broadband is not ubiquitous; some people are still using dialup because they have no other choice.
4) Many people do not like "subscription" services. The "cloud" is not going to be free when it costs billions to implement.

I am not saying that there cannot be cloud-like services, but the idea that one day you'll just have a thin client that can do anything a desktop computer does today is a long, long, long way off, because of the piss-poor rollout of broadband technology and the general greed of telcos trying to control your access to any content.

I do not want my computing experience to be at the mercy of big telco; they already have way too much involvement in limiting my access to technology and content. The day they get 100% control of how I use a computer will be the day I find a remote tropical island and look for smoke monsters.
