Graphics Technology

Nvidia's RealityServer to Offer Ubiquitous 3D Images

WesternActor writes "ExtremeTech has an interview with a couple of the folks behind Nvidia's new RealityServer platform, which purports to make photorealistic 3D images available to anyone on any computing platform, even things like smartphones. The idea is that all the rendering happens 'in the cloud,' which allows for a much wider distribution of high-quality images. RealityServer won't be released until November 30, but it looks like it could be interesting. The article has photos and a video that show it in action."
This discussion has been archived. No new comments can be posted.


  • by jeffb (2.718) ( 1189693 ) on Friday November 13, 2009 @05:49PM (#30092284)

    ...for demoware.

    • by DeadDecoy ( 877617 ) on Friday November 13, 2009 @05:51PM (#30092306)
      It could have been a CloudServer for vaporware.
    • Re: (Score:1, Insightful)

      by Anonymous Coward

      Any "new" technology that is marketed with the phrase "cloud computing" is starting to get a really bad reputation with software developers.

      The "cloud" is the sort of idea that managers and other fucking morons like that think is totally great, but those of us who actually have to work with such systems know how shitty most of them are.

      "Cloud computing" is this year's version last year's "web services", "SOA" and "SaaS". So many bold claims, but in the end nothing but a steaming pile of fecal matter pushed

  • by Monkeedude1212 ( 1560403 ) on Friday November 13, 2009 @05:55PM (#30092366) Journal

    Aren't photorealistic images pretty big? If I want 30 frames per second, how am I ever going to push 30 photorealistic frames through the internet? I can hardly get 5 Mb/s from my ISP.

    • how big is video?
      • Re: (Score:3, Insightful)

        Video's pretty big, but it's always compressed to the point that I wouldn't call it photorealistic.

        • the point here is that photorealistic stuff isn't any bigger than a photo
        • by im_thatoneguy ( 819432 ) on Friday November 13, 2009 @06:23PM (#30092620)

          Really? You wouldn't describe Netflix HD as photorealistic? Even things... originally shot on film? With a camera?

        • Re: (Score:3, Insightful)

          by LUH 3418 ( 1429407 )
          Still, you can get fairly decent video quality at 720p on YouTube nowadays, with connections that aren't so fast (mine is limited to 8 Mbps download). On a cellphone you probably can't realistically get very fast speeds just yet (500 kbps?), but the screen is also much smaller. As connections get faster, approaches like this become more feasible.

          Another way to see this is that Nvidia just wants to expand its marketshare. They are likely hoping that with something like this, they could sell expensive serve
        • by Toonol ( 1057698 )
          DVD quality is 720x480. That's much lower resolution than modern consoles and PCs put out, and yet a decent movie looks far more photorealistic than anything they can render. I think a decent game at 640x480 res, with good animations and realistic aliasing, looks better and more realistic than a crisp HD image without those. By focusing so much on resolution, we've been putting emphasis on the wrong thing.
    • Re: (Score:3, Insightful)

      by cheesybagel ( 670288 )
      Compression. You know, MPEG-4 AVC.
    • Re: (Score:2, Insightful)

      by lhoguin ( 1422973 )

      Many applications don't need 30 fps, though. For example, home-design software for architects could use this to render various shots of the designed house.

      • Let's see it perform with something like this: http://tinyurl.com/ycm5uy5 [tinyurl.com] It's a YouTube 3D refinery flythrough; architectural stuff is tame compared to this. It's not as finely detailed as the interior of a microprocessor, but the actual thing processes much more than electrons and has to accommodate humans walking through and managing it. It looks complicated to the uninitiated, but it's not really. Nifty, eh? Complex enough?
    • Re: (Score:3, Insightful)

      by VeNoM0619 ( 1058216 )
      RTFS

      which purports to make photorealistic 3D images available to anyone on any computing platform, even things like smartphones. The idea is that all the rendering happens 'in the cloud,' which allows for a much wider distribution of high-quality images. RealityServer isn't released until November 30, but it looks like it could be interesting. The article has photos

      Notice there is no emphasis on video or animation. This is for 3D images only. Or were you seriously hoping to play realistic 3D games on your phone?

      • Maybe not the phone. I can't imagine why anyone would really need high-quality photorealistic renderings on their phone; once the image is rendered you can just put it on your phone and show people, if that's what you're going for. But there isn't exactly an engineering or architecture app for the iPhone, as far as I'm aware (don't hold me to that).

        However, in my experience, the only time rendering is preferable to a picture is for entertainment purposes. Though someone above mentioned this wou

      • by geekoid ( 135745 )

        With this technique, it might be possible with a 4G connection.

      • RTFS

        RTFA. Animation of a dress worn by a model of a size specified by the user is given as an example.

    • by Dunbal ( 464142 )

      how am I ever going to push 30 Photorealistic Frames through the internet - I can hardly get 5 Mb/s from my ISP.

      I'm far from being a computer programmer/expert.

      But say you have a display at, for argument's sake, 1280x1024 pixels at 32 bits per pixel. That's 41.9 million bits per frame. Call it 42 Mbits. You want to do that at 30 frames per second? You're up to 1.26 Gb/s. Now please raise your hand if you have a 2 Gb/s internet connection. OK, there will be some compres
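      A quick back-of-the-envelope check of that math, as a minimal Python sketch (the resolution, bit depth, and frame rate are just the illustrative numbers from the comment above, not anything Nvidia has published):

```python
# Rough sanity check of the uncompressed-bandwidth math above.
# All numbers are illustrative, taken from the comment.

def uncompressed_bitrate(width, height, bits_per_pixel, fps):
    """Bits per second needed to ship raw frames with no compression."""
    return width * height * bits_per_pixel * fps

raw = uncompressed_bitrate(1280, 1024, 32, 30)
print(f"Raw stream: {raw / 1e9:.2f} Gbit/s")              # ~1.26 Gbit/s

# How much compression would a 5 Mbit/s link need?
link = 5e6
print(f"Required compression ratio: {raw / link:.0f}:1")  # ~250:1
```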

    • by JobyOne ( 1578377 ) on Friday November 13, 2009 @06:50PM (#30092874) Homepage Journal

      How big is your screen?

      That's the real question here. "Photorealistic" (a meaningless term in the context of transferring image data) on a smartphone screen is a whole lot smaller than on my full 1920x1280 desktop monitor.

      "Photorealistic" will only ever be as high resolution as the screen you view it on.

      • by mikael ( 484 )

        You can use supersampled pixels to avoid jagged lines: for every pixel in the framebuffer, your raytracer might generate a grid of 8x8 or 16x16 rays, each with its own unique direction. This leads to smoother, blended edges on objects. It takes considerably more time, but it helps improve the appearance of low-resolution images, especially mobile phone screens that may only be 640x480 or 320x200 (early VGA-era resolutions).
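        As an illustration of the supersampling idea described above, here is a minimal Python sketch; trace_ray is a hypothetical stand-in for the actual ray tracer, and the jitter pattern is just one simple choice:

```python
# Minimal supersampling sketch: average an N x N grid of ray samples per pixel.
# trace_ray(u, v) is a hypothetical function returning an RGB triple.
import numpy as np

def render_supersampled(width, height, n, trace_ray):
    image = np.zeros((height, width, 3))
    for y in range(height):
        for x in range(width):
            samples = []
            for sy in range(n):
                for sx in range(n):
                    # Give each ray its own sub-pixel position.
                    u = (x + (sx + 0.5) / n) / width
                    v = (y + (sy + 0.5) / n) / height
                    samples.append(trace_ray(u, v))
            image[y, x] = np.mean(samples, axis=0)  # blend the n*n samples
    return image
```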

        • That's a rendering trick that has zero impact whatsoever on the final size of a rendered frame. I highly doubt they're sending raytracing data to smartphones.

          A pixel is a pixel is a pixel, no matter how you go about generating it, and a screen can only display so many of them. The smartphone or whatever isn't generating these images; it doesn't give a crap about the raytracing methods behind them. It just downloads them and puts them on the screen.

          Reminds me of people who take a 300x300px image into Photos

          • by mikael ( 484 )

            The screen resolution of an iPhone is around 640x480. I would guess there are probably applications that let larger images be viewed through scroll and zoom functionality. What I meant is that the server does the raytracing, and all it has to do is send an image back to the iPhone.

    • Re: (Score:3, Informative)

      by vikstar ( 615372 )

      Forget the data transfer rates; those will keep increasing. It's the latency that's the problem. Games using this technology will be almost useless, especially action games. Currently you get practically 0 ms of latency when you interact with a game, which is what makes it feel fast. If it's a multiplayer game, then the only latency you get is from other people, and if they appear to go left 50 ms later than when they pressed the button to go left, it doesn't make a difference for you, since you don't know when they pressed the
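      A rough input-to-photon budget makes the latency argument concrete. This is a minimal sketch; every number in it is an illustrative assumption, not a measured figure for RealityServer or any particular network:

```python
# Illustrative input-to-display latency budget for remote rendering.
# All numbers are assumptions made for the sake of the argument.
network_rtt_ms   = 50    # round trip to the render server
server_render_ms = 20    # time to render the frame on the server
encode_decode_ms = 15    # video encode on the server + decode on the device
display_ms       = 16    # one refresh interval at ~60 Hz

remote_total = network_rtt_ms + server_render_ms + encode_decode_ms + display_ms
local_total  = server_render_ms + display_ms   # the same work done on the device

print(f"Remote rendering: ~{remote_total} ms from input to photons")  # ~101 ms
print(f"Local rendering:  ~{local_total} ms")                         # ~36 ms
```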

      • Re: (Score:3, Interesting)

        Not all games. Many genres would work great, such as an RTS, or RPGs like WoW or Baldur's Gate, or any other game where the interface could run locally on the portable's hardware while the server handles the rendering.

        I imagine even keeping a local 3D copy, hidden from the user, that handles all of the 3D mechanics like detecting unit selection. Since it's not being shaded and only needs collision meshes, it would run fine on a cell phone. Then let the server render the well-shaded and lit v

        • Re: (Score:3, Insightful)

          by vikstar ( 615372 )

          Good point, I hadn't thought about it that way. More specifically, the server could, for example, render expensive global illumination and then send the textures to the client, which can use a simple GPU to apply the textures to local meshes.
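          One way to picture that split, as a minimal numpy sketch (the texture sizes, the multiply blend, and the function names are assumptions for illustration, not anything from the article):

```python
# Hybrid rendering sketch: the server bakes expensive lighting into a lightmap,
# the client cheaply combines it with its locally stored albedo texture.
import numpy as np

def server_bake_lightmap(height, width):
    # Stand-in for an expensive global-illumination pass on the server.
    return np.random.rand(height, width, 3).astype(np.float32)

def client_shade(albedo, lightmap):
    # The client only does a per-texel multiply, which any phone GPU can handle.
    return np.clip(albedo * lightmap, 0.0, 1.0)

albedo   = np.ones((256, 256, 3), dtype=np.float32) * 0.8   # local texture
lightmap = server_bake_lightmap(256, 256)                    # streamed from the server
shaded   = client_shade(albedo, lightmap)
```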

    • Re: (Score:3, Funny)

      by geekoid ( 135745 )

      Yes, no one could ever get you 30 frames a second; that's why we can't watch TV shows and movies online~

    • Because it would be cloud based, they could merely send the finished rasterized frames for cellphones (very little bandwidth), or preprocessed data for desktops/notebooks/things with a GPU, which could then assemble it. The usual problem with something like this is that you have to download far more data to your machine than you actually need in order to view one small subset of it. Now it can send you only the data you need, or all of the data in progressive chunks that you can start to
    • The reality engine isn't for real-time gaming; it's for artists and game and CAD designers to see their scenes rendered in near real time. It makes a lot of sense to do the render on a remote server: most of the time the artist's computer is just a user interface for modelling, using very little CPU, and only on the few rendering occasions do you need the vast amount of CPU power that a remote render farm can provide. Nvidia and Mental Images have picked a great application for the cloud here. Even cleverer for Nvidi
  • Somebody tell 4Chan's /hr/ department, quick!
    • Re: (Score:2, Funny)

      by Anonymous Coward
      Not your personal army.
  • by HockeyPuck ( 141947 ) on Friday November 13, 2009 @05:58PM (#30092388)

    FTFA:

    By moving ray tracing and many other high power graphics algorithms off the client and into the cloud, lightweight-computing platforms like netbooks and smartphones can display photorealistic images in real time.

    Why not just say:

    By moving ray tracing and many other high power graphics algorithms off the client and onto nvidia's servers, lightweight-computing platforms like netbooks and smartphones can display photorealistic images in real time.

    I guess it's just not as cool...

    I wonder if this would work for cooking?

    By moving cutting, peeling, baking, frying and many other food preparation techniques off the dining room table and into the food cloud (kitchen), lightweight-eating platforms like TV trays and paper plates can be used to eat off of in real time.
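
    Stripped of the buzzword, the client side of such a service reduces to a small request/response loop. The sketch below is purely hypothetical: the endpoint URL, JSON fields, and use of the requests library are assumptions, not RealityServer's actual API:

```python
# Hypothetical thin-client loop: send a scene/camera description, get back a
# finished frame. None of these names come from the actual RealityServer API.
import requests

RENDER_URL = "https://render.example.com/api/render"   # placeholder endpoint

def request_frame(scene_id, camera):
    payload = {"scene": scene_id, "camera": camera, "width": 480, "height": 320}
    resp = requests.post(RENDER_URL, json=payload, timeout=30)
    resp.raise_for_status()
    return resp.content    # e.g. a JPEG the device just decodes and displays

frame_bytes = request_frame("kitchen_v2", {"eye": [0, 1.6, 4], "look_at": [0, 1, 0]})
```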

    • Comment removed (Score:5, Insightful)

      by account_deleted ( 4530225 ) on Friday November 13, 2009 @06:01PM (#30092426)
      Comment removed based on user account deletion
      • by nurb432 ( 527695 ) on Friday November 13, 2009 @06:27PM (#30092658) Homepage Journal

        Don't worry, in 6 months we will have another buzzword to hate and "cloud" will be history.

      • "Not my responsibility".

         

      • I too was skeptical. But last night there was a presentation on cloud computing at Monadlug, and re-rendering video for a streaming service to insert new advertisements was given as an example. This is something being done NOW: a few dollars pays for 20 minutes of time on someone's "cloud" that would otherwise require the video service to buy a whole roomful of expensive multiprocessor computers.

        Amazon and Rackspace and others are already offering cloud services. I don't like it - I think everyone should

        • by Fred_A ( 10934 )

          I think everyone should own all the processing power they need - but cloud computing is here, it's real, and it performs a valuable economic function.

          Old news. It used to be called "server time". There are bits and pieces related to "server time" billing left in most Unix or Unix-like systems (which could probably be brought back to life if need be). No need to bring any meteorology into it.

          "Sorry, your cloud computing operations have been cancelled because of an unexpected storm which washed away our reserve of zeroes"

      • Buzzwords can be fun. Next time you're scheduled for a sales presentation make up a bunch of cards with different sets of mixed buzzwords and give each attendee a card and a highlighter. The first person to get five buzzwords marked off should yell BINGO! and win a small prize for paying attention. It's called buzzword bingo. It works equally well whether you warn the presenters or not, since they can't help themselves. Some salespeople can't get past the first slide without "BINGO" ringing out.

        Here's

    • by Spad ( 470073 ) <slashdot.spad@co@uk> on Friday November 13, 2009 @06:04PM (#30092446) Homepage

      Shhhhh! You'll ruin the scam (of convincing uninformed people that an old idea is a new idea by renaming it).

      Thin client -> fat client -> thin client -> fat client. *yawn*

      Every time this happens: things move away from the client for "performance" and "flexibility" and "scalability" reasons, then everyone realises it's a pain because of the lack of control or reliability, and by that point the client hardware has moved on to the point where it can do the job better anyway, so everyone moves back to it.

      • Thin client -> fat client -> thin client -> fat client. *yawn*

        We were forced to stop using the term "fat client" here at Big Bank; our end users got offended when they heard the term. Apparently they thought we were talking about the /users/ and not the systems... Instead, we must call it a "thick client"* -- which is odd, since if they interpret it the same way, it's just as insulting from another direction.

        *go ahead, laugh, but it really happened!

        • Re: (Score:3, Informative)

          by HockeyPuck ( 141947 )

          We were forced to stop using the term "fat client' here at Big Bank; our end-users got offended when they heard the term, apparently they thought we were talking about the /users/ and not the systems... Instead, we must call it "thick client"* -- which is odd, since if they interpret it the same way it's just as insulting from another direction.

          You forgot how we used to refer to IDE devices as either a "master" or a "slave"... this wasn't back in the 50s either.

        • by Xtravar ( 725372 )

          We used to call "computers on wheels" COWs, except apparently a customer and/or customer's customer got very offended during implementation.

          And yes, that really happened, too.

        • Thick client is also insensitive these days. You want to go with "fluffy client".

          Just kidding... the least offensive term is, I believe, "rich client", though in a bank that could be confusing too.

      • That depends on which side you are on.

        For the people hosting (or governments that want to butt in) there is plenty of control.

      • IBM tried it when they went to OS/2. Suddenly a hard drive was a "Fixed disk" and a motherboard was a "Planar board".

        It's a sad game but it's the only one there is. It's fun to watch megacorporations fight to the death over ownership of a word.

    • Oh, and it's not real-time at all. It will *at least* have the lag of one ping round trip. Then add some milliseconds of rendering time and input/output on the device. On a mobile phone that can mean 1.5 seconds(!) of delay. Or even more.

      It's real-time when it no longer sounds weird to press a key in a music game and hear the sound.
      That's below 10 ms for me, but somewhere around 50 ms TOTAL for the average Joe.

      Oh, and don't even think about winning a game against someone with real real-time rendering.

      I

    • by unjedai ( 966274 )
      My kitchen sometimes resembles a cloud when I allow my high powered processor to over-render my inputs.
  • That low-resolution BlackBerry in your pocket will suddenly be capable of producing high resolution images?

    Uh-huh.

    Nvidia also claims that simply by wiring money into their account, they can make you lose weight by doing nothing at all!

    • The point is that the BlackBerry doesn't do any processing. It just streams the end result, which is certainly doable considering the Zune HD can play back 720p HD footage and it's not much bigger than a BlackBerry.

      • I'm not talking about real-time processing (which cloud rendering can help with).

        The new Zune HD is one of a few select devices that actually supports a decent resolution. It pisses me off because I can't use a Zune in Linux, and I won't buy a Zune, but it does have perhaps the nicest screen of any portable device on the market right now.

        Most smart phones have low-resolution screens. You can't produce a photo-realistic image on a low-resolution screen, regardless of pushing rendering out to the cloud.

        • by EvanED ( 569694 )

          ...it does have perhaps the nicest screen of any portable device on the market right now. Most smart phones have low-resolution screens.

          Really?! I'm looking at the specs now; from what I see, the Zune HD 32 has a 480x272 pixel screen.

          There are quite a few smartphones out there better than that. The Droid has 480x854; the HTC Hero has 320x480; the Nokia N900 has 800x480. Even the iPhone, which doesn't have stellar resolution, is 480x320.

  • Paranoid I am (Score:1, Offtopic)

    by Wardish ( 699865 )

    A few security questions:

    Is there any attempt at encryption?

    Considering that pretty much all internet traffic is copied, how hard would it be to watch someone's screen?

    Is this processing limited to extreme graphics, or is that spreadsheet being watched too?

    Yes there are plenty more, but enough for now.

  • While the comments here are mostly negative, I can say this is a big leap ahead for rendering technology, mainly because the rendering is occurring at the hardware level, on the Nvidia processors on a video card, instead of on the CPU via software rendering. They are calling this iray, and it's developed by Mental Images, not Nvidia. While video cards are currently great at rendering games in real time, they require a tremendous amount of shader programming and only do this sort of rendering within the c

    • by ceoyoyo ( 59147 )

      It's not the leap you make it out to be.

      Ray tracing has been done on video hardware for quite a while. It still takes a pile of shader programming. These things are programmed using CUDA, which is really just another layer on top of a shader. The 200 parallel processors in a Tesla are really just a modified version of the X number of shaders on your video card. Yeah, the Tesla boxes are cool, but they're not a revolutionary change; people have been talking about GPU clusters for a long time.

      The "clou

  • by GameMaster ( 148118 ) on Friday November 13, 2009 @07:41PM (#30093306)

    Wanna know what playing games on a system like this would be like? Go to your favorite video streaming site and change the player settings (if you can) to 0 caching. The end result is, approximately, what you'd get here. The internet is a very unstable place. The only reason online games work is that programmers have gotten really good at latency-hiding tricks, all of which stop working when the video rendering is done by the server. And don't think this will only affect FPS games. Just because it doesn't make or break a game like WoW doesn't mean you'd want the stuttering gameplay you'd have to put up with. As far as I can see, the only kind of game this would be useful for is photo-realistic checkers.

    • by sowth ( 748135 ) *

      What about Dragon's Lair or Space Ace? Or how about all the "games" out there which are mostly noninteractive cut scenes?

      Hmmm... I see the big game studios may be moving back to those "choose your own adventure [wikipedia.org]" video titles of the late 1990s... except in 3D!!!! Mwahahaha! (**cue cheesy villain-is-victorious music**)

      Though I did think the Star Trek: Klingon one was a bit cool...

  • I smell an SGI naming convention, i.e. RealityEngine
