
Nvidia's RealityServer to Offer Ubiquitous 3D Images

ScuttleMonkey posted more than 4 years ago | from the more-rain-from-the-cloud dept.

Graphics

WesternActor writes "ExtremeTech has an interview with a couple of the folks behind Nvidia's new RealityServer platform, which purports to make photorealistic 3D images available to anyone on any computing platform, even things like smartphones. The idea is that all the rendering happens 'in the cloud,' which allows for a much wider distribution of high-quality images. RealityServer isn't released until November 30, but it looks like it could be interesting. The article has photos and a video that show it in action."


82 comments

It takes chutzpah to use the term "RealityServer" (3, Funny)

jeffb (2.718) (1189693) | more than 4 years ago | (#30092284)

...for demoware.

Re:It takes chutzpah to use the term "RealityServe (5, Funny)

DeadDecoy (877617) | more than 4 years ago | (#30092306)

It could have been a CloudServer for vaporware.

Re:It takes chutzpah to use the term "RealityServe (1)

Yvan256 (722131) | more than 4 years ago | (#30095244)

Hey, don't rain on their parade.

Re:It takes chutzpah to use the term "RealityServe (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#30093056)

I'd like to see a 3D of this [wordpress.com].

Re:It takes chutzpah to use the term "RealityServe (1, Insightful)

Anonymous Coward | more than 4 years ago | (#30093758)

Any "new" technology that is marketed with the phrase "cloud computing" is starting to get a really bad reputation with software developers.

The "cloud" is the sort of idea that managers and other fucking morons like that think is totally great, but those of us who actually have to work with such systems know how shitty most of them are.

"Cloud computing" is this year's version last year's "web services", "SOA" and "SaaS". So many bold claims, but in the end nothing but a steaming pile of fecal matter pushed by the peckers in marketing.

I wonder what next year's stupidity is going to be. I wonder what radical claims the marketing fools will make, only to find out that what they propose is stupid, costly and inefficient. There's just so much anticipation!

Re:It takes chutzpah to use the term "RealityServe (0)

Anonymous Coward | more than 4 years ago | (#30096304)

...for demoware.

"RealityServer will be available starting November 30, 2009"

fp! (-1, Offtopic)

Anonymous Coward | more than 4 years ago | (#30092294)

:O

What about Data Transfer (5, Insightful)

Monkeedude1212 (1560403) | more than 4 years ago | (#30092366)

Aren't photorealistic images pretty big? If I want 30 frames per second, how am I ever going to push 30 photorealistic frames through the internet? I can hardly get 5 Mb/s from my ISP.

Re:What about Data Transfer (1)

Fulcrum of Evil (560260) | more than 4 years ago | (#30092374)

How big is video?

Re:What about Data Transfer (2, Insightful)

Monkeedude1212 (1560403) | more than 4 years ago | (#30092398)

Video's pretty big - but it's always compressed to the point that I wouldn't call it photorealistic.

Re:What about Data Transfer (1)

Fulcrum of Evil (560260) | more than 4 years ago | (#30092548)

The point here is that photorealistic stuff isn't any bigger than a photo.

Re:What about Data Transfer (3, Insightful)

im_thatoneguy (819432) | more than 4 years ago | (#30092620)

Really? You wouldn't describe Netflix HD as photorealistic? Even things... originally shot on film? With a camera?

Re:What about Data Transfer (2, Insightful)

LUH 3418 (1429407) | more than 4 years ago | (#30092838)

Still, you can get fairly decent video quality at 720p on YouTube nowadays, with connections that aren't so fast (mine is limited to 8 Mbps download). On a cellphone, you probably can't realistically get very fast speeds just yet (500 kbps?), but the screen is also much smaller. As connections get faster, approaches like this become more feasible.

Another way to see this is that Nvidia just wants to expand its market share. They are likely hoping that with something like this, they could sell expensive server equipment to game companies for you to play online games on, with the rendering done remotely. This would make it possible to play 3D games that are very CPU/GPU-expensive on any platform that can stream and render the video fast enough. Imagine playing WoW on your iPhone... They might just be able to sell this.

Re:What about Data Transfer (0)

Anonymous Coward | more than 4 years ago | (#30095234)

I don't care how fast it gets, and I don't care how much bandwidth we get in the future; it will not break the laws of physics. There will be too much latency for most games to be playable. Maybe some RPGs will work OK, but anything people buy big graphics cards for today (i.e. FPS games) will not work this way.

Re:What about Data Transfer (1)

Toonol (1057698) | more than 4 years ago | (#30093502)

DVD quality is 720x480. That's a much lower resolution than modern consoles and PCs put out, and yet a decent movie looks far more photorealistic than anything they can render. I think a decent game at 640x480, with good animation and realistic anti-aliasing, looks better and more realistic than a crisp HD image without them. By focusing so much on resolution, we've been putting the emphasis on the wrong thing.

Re:What about Data Transfer (2, Insightful)

cheesybagel (670288) | more than 4 years ago | (#30092466)

Compression. You know, MPEG-4 AVC.

Re:What about Data Transfer (0)

Anonymous Coward | more than 4 years ago | (#30095292)

MPEG-4 AVC is an example of asymmetric compression: it grants the user quick, seekable playback of a video stream at a relatively low CPU cost in return for very intensive ENcode times. To take advantage of this, each frame of each game being played by each client would need to be compressed in real time, which makes this stupid 'remote 3D render' idea even less financially viable than it is by itself.

Re:What about Data Transfer (2, Insightful)

lhoguin (1422973) | more than 4 years ago | (#30092506)

Many applications do not need 30 fps, though. For example, architectural design software could use this to render various shots of the designed house.

Re:What about Data Transfer (1)

pipingguy (566974) | more than 4 years ago | (#30096622)

Let's see it perform with something like this: http://tinyurl.com/ycm5uy5 [tinyurl.com] - a YouTube 3D refinery flythrough. Architectural stuff is tame compared to this. It's not as finely detailed as the interior of a microprocessor, but the real thing processes much more than electrons and has to accommodate humans walking through and managing it. It looks complicated to the uninitiated, but it's not really. Nifty, eh? Complex enough?

Re:What about Data Transfer (1)

RespekMyAthorati (798091) | more than 4 years ago | (#30107256)

That looked like shit. Is that the point?

Re:What about Data Transfer (2, Insightful)

VeNoM0619 (1058216) | more than 4 years ago | (#30092534)

RTFS

which purports to make photorealistic 3D images available to anyone on any computing platform, even things like smartphones. The idea is that all the rendering happens 'in the cloud,' which allows for a much wider distribution of high-quality images. RealityServer isn't released until November 30, but it looks like it could be interesting. The article has photos

Notice there is no emphasis on video or animation. This is for 3D images only. Or were you seriously hoping to play realistic 3D games on your phone?

Re:What about Data Transfer (1)

Monkeedude1212 (1560403) | more than 4 years ago | (#30092640)

Maybe not the phone - I can't imagine why anyone would really need high-quality photorealistic renderings on a phone. Once the image is rendered you can just put it on your phone and show people, if that's what you're going for. But there isn't exactly an engineering or architecture app for the iPhone, as far as I'm aware (don't hold me to that).

However, in my experience, the only time rendering is preferable to a picture is for entertainment purposes. Someone above mentioned this would be handy for engineers and designers, but that's a very small group of people to benefit from this. What was so slow and painful about the old way of dealing with wireframes and sprites and spending 30 minutes rendering your final product?

If there isn't going to be a push towards some form of animation, I don't see it going much of anywhere. There isn't enough need for high-performance rendering like this when people already achieve these results at a slower pace.

Rather - let me just ask - what will YOU use this for?

Re:What about Data Transfer (1)

geekoid (135745) | more than 4 years ago | (#30093134)

With this technique, it might be possible with a 4G connection.

Re:What about Data Transfer (1)

ChrisMaple (607946) | more than 4 years ago | (#30093198)

RTFS

RTFA. Animation of a dress worn by a model of a size specified by the user is given as an example.

Re:What about Data Transfer (1)

Dunbal (464142) | more than 4 years ago | (#30092590)

how am I ever going to push 30 Photorealistic Frames through the internet - I can hardly get 5 Mb/s from my ISP.

I'm far from being a computer programmer/expert.

But say you have a display at, for argument's sake, 1280x1024 pixels at 32 bits per pixel. That's 41.9 million bits per frame. Call it 42 Mbits. You want to do that at 30 frames per second? You're up to 1.26 Gb/s. Now raise your hand if you have a 2 Gb/s internet connection. OK, there will be some compression, and algorithms that mean you don't have to send the WHOLE screen every frame, but still...

I don't think this is meant to be used for games.
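
A quick back-of-the-envelope check of the parent's arithmetic (a Python sketch; the resolution, bit depth, and frame rate are the figures quoted above):

    # Uncompressed bandwidth for the display described above.
    width, height = 1280, 1024       # pixels
    bits_per_pixel = 32
    fps = 30

    bits_per_frame = width * height * bits_per_pixel
    bits_per_second = bits_per_frame * fps

    print(f"{bits_per_frame / 1e6:.1f} Mbit per frame")     # ~41.9 Mbit
    print(f"{bits_per_second / 1e9:.2f} Gbit/s at 30 fps")  # ~1.26 Gbit/s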

Re:What about Data Transfer (1)

Monkeedude1212 (1560403) | more than 4 years ago | (#30092718)

Then I beg you to come up with more than 5 practical applications.

Re:What about Data Transfer (3, Informative)

JobyOne (1578377) | more than 4 years ago | (#30092874)

How big is your screen?

That's the real question here. "Photorealistic" (a meaningless term in the context of transferring image data) on a smartphone screen is a whole lot smaller than on my full 1920x1280 desktop monitor.

"Photorealistic" will only ever be as high resolution as the screen you view it on.

Re:What about Data Transfer (1)

mikael (484) | more than 4 years ago | (#30094542)

You can use supersampled pixels to avoid jagged lines - for every pixel in the framebuffer, your raytracer might generate a grid of 8x8 or 16x16 rays, each with its own unique direction. This leads to smoother, blended edges on objects. It takes considerably more time, but helps improve the appearance of low-resolution images, especially on mobile phone screens, which may only be 640x480 or 320x200 (early VGA screen resolutions).
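
As a rough illustration of that grid supersampling (a minimal Python sketch; trace_ray is a hypothetical stand-in for whatever the ray tracer does per ray, not anything from RealityServer):

    # Average an n x n grid of jittered sub-pixel rays into one pixel.
    import random

    def render_pixel(px, py, trace_ray, n=8):
        total = 0.0
        for i in range(n):
            for j in range(n):
                # jitter each sample inside its sub-cell of the pixel
                x = px + (i + random.random()) / n
                y = py + (j + random.random()) / n
                total += trace_ray(x, y)  # intensity in [0, 1]
        return total / (n * n)            # box-filter average

    # usage with a dummy tracer that shades by horizontal position
    print(render_pixel(10, 20, lambda x, y: x % 1.0))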

Re:What about Data Transfer (1)

JobyOne (1578377) | more than 4 years ago | (#30191192)

That's a rendering trick that has zero impact on the final size of a rendered frame. I highly doubt they're sending raytracing data to smartphones.

A pixel is a pixel is a pixel, no matter how you go about generating it, and a screen can only display so many of them. The smartphone or whatever isn't generating these images; it doesn't give a crap about the raytracing methods behind them. It just downloads them and puts them on the screen.

Reminds me of people who take a 300x300px image into Photoshop and change the resolution to 300dpi in an attempt to make it "print ready." Congratulations! You've made it print-ready to be an inch wide.

Re:What about Data Transfer (1)

mikael (484) | more than 4 years ago | (#30230240)

The screen resolution of an iPhone is around 640x480. I would guess there are probably applications that allow larger images to be viewed with scroll and zoom functionality. What I meant is that the server is going to do the raytracing, and all it has to do is send an image back to the iPhone.

Re:What about Data Transfer (2, Informative)

vikstar (615372) | more than 4 years ago | (#30092888)

Forget the data transfers - they'll increase; it's the latency that's the problem. Games using this technology will be almost useless, especially action games. Currently you get practically 0 ms latency when you interact with a game, which is what makes it seem fast. If it's a multiplayer game, the only latency you get is from other people, and if they appear to go left 50 ms after they pressed the button, it makes no difference to you, since you don't know when they pressed the button. If you get 50 ms latency on your own controls, it's really, really visible, since we have great motion vision... like T-Rexes.

Re:What about Data Transfer (2, Interesting)

im_thatoneguy (819432) | more than 4 years ago | (#30094408)

Not all games. Many genres would work great, such as RTSes or RPGs like WoW or Baldur's Gate - any game where the interface could run locally on the portable's hardware while the server handles the rendering.

I imagine even a local 3D copy, hidden from the user, which handles all of the 3D mechanics like detecting unit selection. Since it's not being shaded and it only needs collision meshes, it would run fine on a cell phone. Then let the server render the well-shaded and lit views.

Re:What about Data Transfer (2, Insightful)

vikstar (615372) | more than 4 years ago | (#30094486)

Good point, I didn't think about it that way. More specifically, the server could, for example, render expensive global illumination and then send the textures to the client, which can use a simple GPU to apply them to local meshes.
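
A minimal sketch of that split, with made-up names and sizes (nothing here is Nvidia's actual API): the server bakes expensive global illumination into a lightmap, and the client only does a cheap per-texel multiply over its local textures.

    # Server bakes expensive GI into a lightmap; client does a cheap combine.
    import numpy as np

    def bake_lightmap_on_server(scene):
        # stand-in for hours of global-illumination work on the render farm
        return np.full((256, 256, 3), 0.8)

    def shade_on_client(albedo, lightmap):
        # trivial per-texel multiply; fine for a weak GPU or even a CPU
        return np.clip(albedo * lightmap, 0.0, 1.0)

    albedo = np.random.rand(256, 256, 3)   # texture already on the client
    final = shade_on_client(albedo, bake_lightmap_on_server(scene=None))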

Re:What about Data Transfer (0)

Anonymous Coward | more than 4 years ago | (#30095402)

Talk about a disaster waiting to happen. I can just imagine the glitchy, inconsistent performance, the dropped objects, the interface lag... oh, what fun. And don't forget all the fees they get to charge you: access, bandwidth use, time... might as well break out the Prodigy floppies and credit cards.

I think the real solution is to tell publishers they can't squeeze gaming into a one-way, television-like service.

Re:What about Data Transfer (2, Funny)

geekoid (135745) | more than 4 years ago | (#30093126)

Yes, no one could ever get you 30 frames a second; that's why we can't watch TV shows and movies online~

Re:What about Data Transfer (0)

Anonymous Coward | more than 4 years ago | (#30093258)

Actually, you don't send 30 full frames per second. More likely you send one frame that acts as a keyframe and then, 29 times, the difference to the next frame, which is much less data than a whole picture. That's how MPEG and the like work.
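
A toy version of that keyframe-plus-deltas idea in Python (grossly simplified next to real MPEG, which uses motion-compensated blocks and entropy coding, but it shows why the deltas are cheap):

    # Send one keyframe, then only per-pixel differences for later frames.
    import numpy as np

    def encode(frames):
        keyframe = frames[0]
        deltas = [b - a for a, b in zip(frames, frames[1:])]
        return keyframe, deltas  # deltas are mostly zeros -> compress well

    def decode(keyframe, deltas):
        frames = [keyframe]
        for d in deltas:
            frames.append(frames[-1] + d)
        return frames

    frames = [np.zeros((480, 272), dtype=np.int16) for _ in range(30)]
    key, deltas = encode(frames)
    assert np.array_equal(decode(key, deltas)[-1], frames[-1])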

Re:What about Data Transfer (1)

KibibyteBrain (1455987) | more than 4 years ago | (#30094608)

Because it would be cloud-based, they could merely send finished rasterized frames to cellphones (very little bandwidth), or preprocessed data to desktops/notebooks/things with a GPU, which could then assemble it. The usual problem with something like this is that you have to download far more data than you actually need in order to view one small subset of it. Now it can send you only the data you need, or all of the data in progressive chunks that you can start to view at full [apparent] quality right away, versus having to wait for all the information to download before your own machine starts assembling it.

Re:What about Data Transfer (1)

physburn (1095481) | more than 4 years ago | (#30094638)

The reality engine isn't for real-time gaming; it's for artists and game and CAD designers to see their scenes rendered in near real time. It makes a lot of sense to render on a remote server: most of the time the artist's computer is just a user interface for modelling, using very little CPU, and only on the few rendering occasions do you need the vast amount of CPU power a remote render farm can provide. Nvidia and mental images have picked a great application for the cloud here. Even cleverer of Nvidia is deciding to let third parties buy and host the render engines. No doubt the market will vastly overestimate the need for these engines, making Nvidia a lot of Tesla GPU sales and creating many cheap resources for rendering. I'm not sure what the payment system for using the render farms will be; it will probably vary from provider to provider, and settle down as cheap once there are enough render farms available.

Nvidia was also clever in buying mental ray: since its render plug-in fits most of the industry's main 3D packages, like Maya, 3ds Max and AutoCAD, the reality engine farms will already be usable with most of the common software.


Uh oh... (1)

XPeter (1429763) | more than 4 years ago | (#30092386)

Somebody tell 4Chan's /hr/ department, quick!

Re:Uh oh... (2, Funny)

Anonymous Coward | more than 4 years ago | (#30092632)

Not your personal army.

Stop saying "cloud" (5, Funny)

HockeyPuck (141947) | more than 4 years ago | (#30092388)

FTFA:

By moving ray tracing and many other high power graphics algorithms off the client and into the cloud, lightweight-computing platforms like netbooks and smartphones can display photorealistic images in real time.

Why not just say:

By moving ray tracing and many other high power graphics algorithms off the client and onto nvidia's servers, lightweight-computing platforms like netbooks and smartphones can display photorealistic images in real time.

I guess it's just not as cool...

I wonder if this would work for cooking?

By moving cutting, peeling, baking, frying and many other food preparation techniques off the dining room table and into the food cloud (kitchen), lightweight-eating platforms like TV trays and paper plates can be used to eat off of in real time.

Re:Stop saying "cloud" (5, Insightful)

HBI (604924) | more than 4 years ago | (#30092426)

For me, I just hate the marketing cocksuckers who come up with these terms. Some asshole saw too many Visio diagrams with a big cloud in the middle representing the intervening networks and decided that there are computers out there that will magically do what they want. Every time I hear the term 'cloud' I think 'botnet'. Because essentially, that's the only thing extant that resembles what they are proposing.

Re:Stop saying "cloud" (3, Insightful)

nurb432 (527695) | more than 4 years ago | (#30092658)

Don't worry, in 6 months we will have another buzz word we can hate and cloud will be history.

Re:Stop saying "cloud" (5, Funny)

Anonymous Coward | more than 4 years ago | (#30092692)

I'm all about the "river computing" system. You dump whatever crap you want in, and it's downstream's problem.

Re:Stop saying "cloud" (1)

JohnnyBGod (1088549) | more than 4 years ago | (#30096462)

Duh! Come on, man, everybody knows that the Internet is not something that you just dump something on!

Re:Stop saying "cloud" (1)

RespekMyAthorati (798091) | more than 4 years ago | (#30107334)

Unless there's a series of pipes to take it away...

I've just worked out what "the cloud" means (1)

Colin Smith (2679) | more than 4 years ago | (#30092946)

"Not my responsibility".

 

Re:Stop saying "cloud" (1)

ChrisMaple (607946) | more than 4 years ago | (#30093320)

I too was skeptical. But last night there was a presentation on cloud computing at MonadLUG, and re-rendering for a video service to insert new advertisements was given as an example. This is something that is being done NOW: a few dollars pays for 20 minutes of time on someone's "cloud" that would otherwise require the video service to buy a whole roomful of expensive multiprocessor computers.

Amazon and Rackspace and others are already offering cloud services. I don't like it - I think everyone should own all the processing power they need - but cloud computing is here, it's real, and it performs a valuable economic function.

Re:Stop saying "cloud" (1)

Fred_A (10934) | more than 4 years ago | (#30096664)

I think everyone should own all the processing power they need - but cloud computing is here, it's real, and it performs a valuable economic function.

Old news. It used to be called "server time". There are bits and pieces of "server time" billing left in most Unix or Unix-like systems (which could probably be brought back to life if need be). No need to bring meteorology into it.

"Sorry, your cloud computing operations have been cancelled because of an unexpected storm which washed away our reserve of zeroes"

Re:Stop saying "cloud" (1)

symbolset (646467) | more than 4 years ago | (#30094708)

Buzzwords can be fun. Next time you're scheduled for a sales presentation make up a bunch of cards with different sets of mixed buzzwords and give each attendee a card and a highlighter. The first person to get five buzzwords marked off should yell BINGO! and win a small prize for paying attention. It's called buzzword bingo. It works equally well whether you warn the presenters or not, since they can't help themselves. Some salespeople can't get past the first slide without "BINGO" ringing out.

Here's a nice starter list: Manage virtual ROI available secure cloud service protect tier integrate enterprise 2.0 TCO efficiency scale trust partner federate content core architect generation.
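
In that spirit, a throwaway Python script to deal the cards (the terms are the parent's starter list; 4x4 cards, since 21 words won't fill a 5x5 grid):

    # Deal buzzword bingo cards from the starter list above.
    import random

    BUZZWORDS = ("manage virtual ROI available secure cloud service protect "
                 "tier integrate enterprise-2.0 TCO efficiency scale trust "
                 "partner federate content core architect generation").split()

    def deal_card(size=4):
        words = random.sample(BUZZWORDS, size * size)
        return [words[i * size:(i + 1) * size] for i in range(size)]

    for row in deal_card():
        print(" | ".join(f"{w:<14}" for w in row))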

Re:Stop saying "cloud" (4, Insightful)

Spad (470073) | more than 4 years ago | (#30092446)

Shhhhh! You'll ruin the scam (of convincing uninformed people that an old idea is a new idea by renaming it).

Thin client -> fat client -> thin client -> fat client. *yawn*

This happens every time: things move away from the client for "performance" and "flexibility" and "scalability" reasons, then everyone realises it's a pain because of the lack of control or reliability, and by that point the client hardware has moved on to the point where it can do the job better anyway, so everyone moves back to it.

Re:Stop saying "cloud" (4, Funny)

thePowerOfGrayskull (905905) | more than 4 years ago | (#30092530)

Thin client -> fat client -> thin client -> fat client. *yawn*

We were forced to stop using the term "fat client" here at Big Bank; our end-users got offended when they heard the term - apparently they thought we were talking about the /users/ and not the systems... Instead, we must call it "thick client"* - which is odd, since if they interpret it the same way, it's just as insulting from another direction.

*go ahead, laugh, but it really happened!

Re:Stop saying "cloud" (2, Informative)

HockeyPuck (141947) | more than 4 years ago | (#30092600)

We were forced to stop using the term "fat client' here at Big Bank; our end-users got offended when they heard the term, apparently they thought we were talking about the /users/ and not the systems... Instead, we must call it "thick client"* -- which is odd, since if they interpret it the same way it's just as insulting from another direction.

You forgot how we used to refer to IDE devices as either a "master" or a "slave"... this wasn't back in the 50s either.

Re:Stop saying "cloud" (1)

F34nor (321515) | more than 4 years ago | (#30095580)

Just tell them that it's the BDSM release.

Re:Stop saying "cloud" (1)

Xtravar (725372) | more than 4 years ago | (#30092926)

We used to call "computers on wheels" COWs, except apparently a customer and/or customer's customer got very offended during implementation.

And yes, that really happened, too.

Re:Stop saying "cloud" (1)

symbolset (646467) | more than 4 years ago | (#30094742)

Thick client is also insensitive these days. You want to go with "fluffy client".

Just kidding... the least offensive term is, I believe, "rich client", though in a bank that could be confusing too.

Re:Stop saying "cloud" (1)

thePowerOfGrayskull (905905) | more than 4 years ago | (#30102528)

There's just no way to win... hefty? sizeable? hmmm, "standalone" might work but is also rather ambiguous in technical meaning...

lack of control (1)

nurb432 (527695) | more than 4 years ago | (#30092672)

That depends on which side you are on.

For the people hosting (or governments that want to butt in) there is plenty of control.

If you own the symbolset, you own the mindshare (1)

symbolset (646467) | more than 4 years ago | (#30095192)

IBM tried it when they went to OS/2. Suddenly a hard drive was a "Fixed disk" and a motherboard was a "Planar board".

It's a sad game but it's the only one there is. It's fun to watch megacorporations fight to the death over ownership of a word.

Re:Stop saying "cloud" (1)

Hurricane78 (562437) | more than 4 years ago | (#30092982)

Oh, and it's not real-time at all. It will *at least* have the lag of one ping round trip. Then add some milliseconds of rendering time and input/output on the device. On a mobile phone that can mean 1.5 seconds(!) of delay. Or even more.

It's real-time when it no longer sounds weird to press a key in a music game and hear the sound. That's below 10 ms for me, but something around 50 ms TOTAL for the average Joe.

Oh, and don't even think about winning a game against someone with real real-time rendering.

In Internet speak, this project is a... LAG FAIL!
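
The parent's point as arithmetic (every number below is an illustrative guess, not a measurement):

    # Rough input-to-photon budget for server-side rendering on a phone.
    budget_ms = {
        "network round trip (3G)":  300,  # guess; wired is more like 30-100
        "server render + encode":    50,
        "client decode + display":   50,
        "input sampling on device":  20,
    }
    total = sum(budget_ms.values())
    print(f"total lag: ~{total} ms")  # ~420 ms, far past the ~50 ms threshold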

Re:Stop saying "cloud" (1)

unjedai (966274) | more than 4 years ago | (#30093310)

My kitchen sometimes resembles a cloud when I allow my high powered processor to over-render my inputs.

Re:Stop saying "cloud" (0)

Anonymous Coward | more than 4 years ago | (#30094946)

Is this a really subtle "Blender" joke?

The real cloud, desktop cloud is there idling (0)

Anonymous Coward | more than 4 years ago | (#30096294)

The world's second most popular desktop OS comes with a technology that would allow a real desktop cloud to do similar things, but the company has to sell hardware - faster hardware every year - to stay alive.

So Xgrid technology stays exclusive to pro apps like Final Cut Pro.

Just imagine using the pooled CPUs of all .Mac (MobileMe) owners to do similar things. Of course, that would mean a cheap MacBook (non-Pro) owner having access to some amazing CPU power, meaning he won't upgrade to the latest MacBook Pro. This is a company that deliberately put Intel's junk GPU in instead of a similarly priced Nvidia integrated GPU just to force owners to upgrade needlessly.

Heh... I got more 'Reality' than I can handle... (0)

Anonymous Coward | more than 4 years ago | (#30092424)

How about some 'surrealism' here ?

Photo-realistic on smart phones! (1)

Enderandrew (866215) | more than 4 years ago | (#30092430)

That low-resolution BlackBerry in your pocket will suddenly be capable of producing high resolution images?

Uh-huh.

Nvidia also claims that simply by wiring money into their account, they can make you lose weight by doing nothing at all!

Re:Photo-realistic on smart phones! (1)

im_thatoneguy (819432) | more than 4 years ago | (#30092650)

The point is that the BlackBerry doesn't do any processing. It just streams the end result. Which is certainly doable, considering the Zune HD can play back 720p HD footage and it's not much bigger than a BlackBerry.

Re:Photo-realistic on smart phones! (1)

Enderandrew (866215) | more than 4 years ago | (#30092866)

I'm not talking about real-time processing (which cloud rendering can help with).

The new Zune HD is one of a few select devices that actually supports a decent resolution. It pisses me off because I can't use a Zune in Linux, and I won't buy a Zune, but it does have perhaps the nicest screen of any portable device on the market right now.

Most smart phones have low-resolution screens. You can't produce a photo-realistic image on a low-resolution screen, regardless of pushing rendering out to the cloud.

Re:Photo-realistic on smart phones! (1)

EvanED (569694) | more than 4 years ago | (#30093240)

...it does have perhaps the nicest screen of any portable device on the market right now. Most smart phones have low-resolution screens.

Really?! I'm looking at the specs now; by what I see, the Zune HD 32 has a 480x272 pixel screen.

There are quite a few smartphones out there with better than that. The Droid has 480x854; the HTC Hero has 320x480; the Nokia N900 has 800x480. Even the iPhone, which doesn't have stellar resolution, is 480x320.

Where have we seen this before? (1)

tds67 (670584) | more than 4 years ago | (#30092436)

Sounds like a redistribution of bit wealth to me. Those who can't afford the hardware shouldn't make the rest of us do their work. Get a job.

Paranoid I am (0, Offtopic)

Wardish (699865) | more than 4 years ago | (#30092500)

A few security questions:

Is there any attempt at encryption?

Considering that pretty much all internet traffic is copied, how hard would it be to watch someone's screen?

Is this processing limited to extreme graphics, or is that spreadsheet being watched too?

Yes, there are plenty more, but that's enough for now.

Does it serve up glasses too? (0, Flamebait)

Tiger4 (840741) | more than 4 years ago | (#30092502)

3D images need 3D glasses and special rendering too, don't they? Or is this just a photorealistic 2D image of a solid model? Or something else altogether? And high-quality images are usually pretty large files. Does it render extra bandwidth to carry the file across to my smart phone?

Re:Does it serve up glasses too? (2, Insightful)

Rockoon (1252108) | more than 4 years ago | (#30092656)

I believe the term you were looking for is "stereo images".

Anyways, this is just nVidia's attempt to come up with a market for its soon-to-be-irrelevant GPU business.

note: I actually LIKE nVidia video cards, but the writing is on the wall. AMD is going to be putting out a veritable monster with CPU + GPU on a single chip, and Intel is going to do something similar with Larrabee (more general-purpose, though).

nVidia can't compete without its own line of x64 chips, and they are just too far away from that capability right now.

Re:Does it serve up glasses too? (1)

Neil Hodges (960909) | more than 4 years ago | (#30092742)

Actually, Intel is putting its GMA into the CPU, not Larrabee. In no way will Intel GMA spell the end for discrete graphics cards, a category which includes Larrabee.

Re:Does it serve up glasses too? (1)

Rockoon (1252108) | more than 4 years ago | (#30092886)

Their roadmap for Larrabee is first an add-in card, then integration into motherboards, and finally integration onto the CPU itself.

AFAIK, GMA is never going to be integrated into the CPU. It's going to continue to be integrated into motherboards.

Re:Does it serve up glasses too? (1)

Tiger4 (840741) | more than 4 years ago | (#30094318)

Wasn't Larrabee the hapless assistant to the Chief of Control? Just a weird coincidence I hope.

Re:Does it serve up glasses too? (1)

an unsound mind (1419599) | more than 4 years ago | (#30093106)

And we haven't heard promises of technology X doing Y spelling the end of Z before?

I'll believe it when they have functional units past the prototype phase at a reasonable cost.

You may only disagree if you post while driving your flying car.

The article kinda misses the point (1)

fontkick (788075) | more than 4 years ago | (#30093138)

While the comments here are mostly negative, I can say this is a big leap ahead for rendering technology, mainly because the rendering occurs at the hardware level, on the Nvidia processors of a video card, instead of on the CPU via software rendering. They are calling this iray, and it's developed by mental images, not Nvidia. While video cards are currently great at rendering games in real time, they require a tremendous amount of shader programming and only do this sort of rendering within the context of a game, not within a CAD application. They are also limited in their ability to render GI, area (soft) shadows, and refraction/caustics. By passing the rendering from a CAD app to iray and on to the video card hardware, you have access to 200 parallel processors instead of the 2, 4, or 6 cores of a CPU. So in theory a 3ds Max/Maya scene that takes 5 hours (300 minutes) to render on a dual-core CPU will take only 3 minutes with your video card's processors. With RealityServer (and enough Nvidia cards all rendering the same frame), the 3 minutes drops to 3 seconds. Personally I'd settle for the 3 minutes and I'd be damned happy about it.
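
Spelling out the parent's arithmetic (a sketch that assumes, optimistically, perfectly linear scaling with processor count):

    # The parent's numbers, assuming linear scaling with processor count.
    cpu_minutes = 300        # 5 hours on a dual-core CPU
    cpu_cores = 2
    gpu_processors = 200     # one video card

    one_card_minutes = cpu_minutes * cpu_cores / gpu_processors
    print(f"one card: {one_card_minutes:.0f} min")          # 3 min

    cards = 60               # a farm large enough for ~3 seconds
    farm_seconds = one_card_minutes * 60 / cards
    print(f"{cards}-card farm: ~{farm_seconds:.0f} s for the same scene")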

Re:The article kinda misses the point (1)

ceoyoyo (59147) | more than 4 years ago | (#30093344)

It's not the leap you make it out to be.

Ray tracing has been done on video hardware for quite a while. It still takes a pile of shader programming. These things are programmed using CUDA, which is really just another layer on top of the shaders. The 200 parallel processors in a Tesla are really just a modified version of the X number of shaders on your video card. Yeah, the Tesla boxes are cool, but they're not a revolutionary change - people have been talking about GPU clusters for a long time.

The "cloud" delivery is even less remarkable. Years ago SGI was busy trying to sell servers that would do high-quality rendering and then send the result to other devices... like cell phones. It didn't really go anywhere. And SGI went bankrupt.

GAHHHH! (0)

Anonymous Coward | more than 4 years ago | (#30093248)

It's not happening in a fucking cloud, it's happening on a server!
Oh, well, that's a good idea, I guess. It was, when they came up with it about 30 years ago! Do these morons even know what an X SERVER IS?
Of course they do; they're just smarmy used-car salesmen, telling you that this special car they have for you is the only one with the "wheel" they just invented.

Games make no sense... (2, Interesting)

GameMaster (148118) | more than 4 years ago | (#30093306)

Wanna know what playing games on a system like this would be like? Go to your favorite video streaming site and change the player settings (if you can) to 0 caching. The end result is, approximately, what you'd get here. The internet is a very unstable place. The only reason online games work is that programmers have gotten really good at developing latency-hiding tricks, which all stop working when the video rendering is done by the server. And don't think this will just affect FPS games. Just because it doesn't make or break a game like WoW doesn't mean you'd want to put up with the stuttering gameplay. As far as I can see, the only kind of game this would be useful for is photo-realistic checkers.

Re:Games make no sense... (1)

sowth (748135) | more than 4 years ago | (#30100078)

What about Dragon's Lair or Space Ace? Or how about all the "games" out there which are mostly noninteractive cut scenes?

Hmmm... I see the big game studios may be moving back to those "choose your own adventure [wikipedia.org]" video titles of the late 1990s... except in 3D!!!! Mwahahaha! (**cue cheesy villain-is-victorious music**)

Though I did think the Star Trek: Klingon one was a bit cool...

SGI?! (1)

sp1nm0nkey (869235) | more than 4 years ago | (#30093752)

I smell an SGI naming convention, i.e. RealityEngine