
Test-Driving NVIDIA's GRID GPU Cloud Computing Platform

Soulskill posted about 3 months ago | from the wonder-if-it'll-run-crysis dept.


MojoKid writes: "NVIDIA recently announced that it would offer a free 24-hour test drive of NVIDIA GRID to anyone who wanted to see what the technology could do, and it turns out to be pretty impressive. NVIDIA's GRID is a virtual GPU technology that allows for hardware acceleration in a virtual environment. It's designed to run in concert with products from Citrix, VMware, and Microsoft, and to address some of their weaknesses. The problem with many conventional virtual desktop infrastructure (VDI) products is that they're often either too slow for advanced graphics work or unable to handle 3D workloads at all. With GRID, NVIDIA claims it can offer a vGPU passthrough solution that lets remote users access a virtualized desktop environment built around a high-end CPU and GPU. The test systems the company is using for these 24-hour test drives all use a GRID K520: essentially two GK104 GPUs on a single PCB with 8GB of RAM. The test drive program is still in beta, the deployment range is considerable, and the test drives themselves are configured for a 1366x768 display at 30 FPS with a maximum available bandwidth cap of 10 Mbit/s."
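
As a back-of-envelope check on those test-drive numbers (my arithmetic, using only the figures quoted above plus an assumed 24-bit color depth), an uncompressed 1366x768 stream at 30 FPS would need roughly 755 Mbit/s, so the 10 Mbit/s cap implies the streaming encoder compresses on the order of 75:1:

    # Raw bitrate of the advertised test-drive stream vs. the 10 Mbit/s cap.
    # Width/height/FPS/cap come from the summary; 24-bit color is an assumption.
    width, height, fps, bpp = 1366, 768, 30, 24
    cap_mbit = 10

    raw_mbit = width * height * bpp * fps / 1e6
    print(f"raw stream:  {raw_mbit:.0f} Mbit/s")              # ~755 Mbit/s
    print(f"compression: ~{raw_mbit / cap_mbit:.0f}:1 needed to fit the cap")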


29 comments


FP(U) (0)

Anonymous Coward | about 3 months ago | (#47102123)

Ugly hack using GPU for parallel computing.

Re:FP(U) (0)

Anonymous Coward | about 3 months ago | (#47102147)

You can't argue with pure unadulterated speed.

Re:FP(U) (1)

binarylarry (1338699) | about 3 months ago | (#47103963)

But Keanu can.

Re:FP(U) (0)

Anonymous Coward | about 3 months ago | (#47102177)

2005 called, it wants your opinion back.

GPUs are now more capable for computing than for graphics.

Re:FP(U) (2)

ArcadeMan (2766669) | about 3 months ago | (#47102217)

So we're going to run code on GPUs and let CPUs render the graphics?

Re:FP(U) (1)

lgw (121541) | about 3 months ago | (#47102243)

It wouldn't surprise me at all. For some specialist workloads, GPUs are amazingly fast. But if you still need a GUI, why not use the CPU's built-in graphics to avoid putting any load on the GPU doing the real work?

Re:FP(U) (1)

ArcadeMan (2766669) | about 3 months ago | (#47102337)

I hope the future bottom-of-the-barrel Broadwell GPUs are as powerful as Haswell's Iris.

doomy doom (0)

Anonymous Coward | about 3 months ago | (#47102191)

But can it run doom?

Re:doomy doom (1)

epyT-R (613989) | about 3 months ago | (#47102205)

Yes it can: 320x200x8 at 35 FPS.

OpenStack support? (2)

buchner.johannes (1139593) | about 3 months ago | (#47102219)

The only approach we know of that has been successful for CUDA access from a KVM virtual machine is gVirtuS [uniparthenope.it].

https://wiki.openstack.org/wik... [openstack.org]
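
For anyone experimenting with a setup like this, a quick way to confirm that a KVM guest actually sees a CUDA device is to poke the CUDA driver API directly. This is a minimal sketch of my own (not from the gVirtuS or OpenStack docs), assuming libcuda.so is visible inside the guest:

    import ctypes

    # Minimal CUDA driver API probe: does this (virtual) machine see a GPU?
    # On some distros the library is named libcuda.so.1 instead.
    cuda = ctypes.CDLL("libcuda.so")
    assert cuda.cuInit(0) == 0, "cuInit failed -- no usable CUDA driver/device"

    count = ctypes.c_int()
    cuda.cuDeviceGetCount(ctypes.byref(count))

    name = ctypes.create_string_buffer(100)
    for dev in range(count.value):
        cuda.cuDeviceGetName(name, len(name), dev)
        print(f"device {dev}: {name.value.decode()}")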

Nice, but expensive (2, Informative)

BaronM (122102) | about 3 months ago | (#47102251)

This isn't particularly new. It's nice tech, but each ~$2000 K1 board supports 4 users. 4. The K2 board supports 2 'power users'. (ref: NVIDIA data sheet: http://www.nvidia.com/content/... [nvidia.com] )

If I cram 4 K1 boards in a server, I can now support 16 virtual desktops with 3D acceleration for an $8k delta over and above the other expenses of VDI.

Unless you ABSOLUTELY MUST have VDI for 3D workloads, I can't see how this makes sense.
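
To put a per-seat number on that (my arithmetic, using only the figures above): four ~$2,000 K1 boards serving 4 users each comes to about $500 per seat on top of everything else VDI already costs:

    # Per-seat cost of the GPU hardware alone, from the numbers quoted above.
    board_cost, users_per_board, boards = 2000, 4, 4
    seats = users_per_board * boards                 # 16 desktops
    delta = board_cost * boards                      # $8,000
    print(f"${delta} for {seats} seats = ${delta // seats}/seat over base VDI cost")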

Re:Nice, but expensive (1)

DigiShaman (671371) | about 3 months ago | (#47102343)

Outsourcing CAD (drafting) work, reviewed only by a select few managers. The savings would more than make up for the hardware. Just think about it: they don't have to pay them benefits. Hiring and firing is just a mouse-click away. NEXT!!!

Re:Nice, but expensive (2)

ledow (319597) | about 3 months ago | (#47102345)

It's the old thin-client problem, reinvented with virtual machines.

Of course you can run 16 computers/users off one VM host, but they will each get 1/16th of the machine's capabilities, on average.

Sometimes that scales - e.g. small businesses virtualising their half-idle servers. Sometimes it doesn't - e.g. most thin clients when you want to get towards 3D or anything intensive.

But for power users? It almost always doesn't. If you need that kind of power, you have to spend ridiculous amounts of money, or you have to keep the resources to one-or-more per user (i.e. give them a desktop with a decent 3D card).

It's what killed off OnLive. Sure, if we all played 2D Mario games, it would be fabulous. But we don't. And as soon as the demand on the hardware increases, you can't afford to share it when you could get twice as much power for the same money.

Re:Nice, but expensive (1)

Anonymous Coward | about 3 months ago | (#47102387)

This isn't particularly new. It's nice tech, but each ~$2000 K1 board supports 4 users. 4. The K2 board supports 2 'power users'. (ref: NVIDIA data sheet: http://www.nvidia.com/content/... [nvidia.com] )

If I cram 4 K1 boards in a server, I can now support 16 virtual desktops with 3D acceleration for an $8k delta over and above the other expenses of VDI.

Unless you ABSOLUTELY MUST have VDI for 3D workloads, I can't see how this makes sense.

Please read more than a single fucking PDF before speaking in absolutes.

http://www.nvidia.com/object/virtual-gpus.html

Re:Nice, but expensive (1)

BaronM (122102) | about 3 months ago | (#47102465)

Apparently I was mistaken.

I looked at this tech when it was actually new, around a year ago, and admittedly just pulled the datasheet today to double-check my recollection. I'm glad that the limitations are less severe than I thought.

OTOH, NVIDIA really ought to fix its datasheets.

Better now? (and profanity-free to boot!)

Re:Nice, but expensive (2)

Shados (741919) | about 3 months ago | (#47102471)

There are two big use cases for desktop virtualization. The common one is to run a ton of desktops off of very little hardware, on the theory that most people only read email and use MS Word all day anyway. Big cost savings.

The other is purely to have desktops centralized in a data center so that data center admins can deal with them instead of needing (as many) on-site tech monkeys.

I worked in companies where it was the latter. The users still needed 16+ GB of RAM, dedicated powerful hardware, etc., but now if something blew up, you didn't need to send someone to their desk to fix it, and you could just move the image to a different machine while the first one was being fixed.

This technology seems very well suited to the latter.

Re:Nice, but expensive (2)

fishybell (516991) | about 3 months ago | (#47102545)

Exactly this. We investigated moving our engineers (running Pro/E/Creo) and our drafters (running AutoCAD) to this setup because they're the only ones not running a VDI or VDI-like setup. We're a company well suited to this because we already have the entire setup minus the NVIDIA gear. We ended up skipping it (for now at least) because of two things: money and productivity. The cost of switching to a virtual environment was significantly more than the cost of getting them all new hardware. We might look at it again in another year or so when they're due for new hardware anyway, but for now it's $40k-plus that isn't needed or wanted by anyone other than IT. It makes our jobs easier, but it slightly reduces their productivity in exchange for slightly increasing ours. Even with 10+ drafters and 5+ engineers, that offset was enough to make us blink.
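
A rough way to frame that money-vs-productivity tradeoff (a hypothetical sketch; only the $40k figure and the ~15-person headcount come from the comment above, while the salary and slowdown are assumptions):

    # Hypothetical break-even sketch. Salary and slowdown are assumed values,
    # not figures from the comment.
    vdi_delta = 40_000                 # one-time cost from the comment
    staff = 15                         # 10+ drafters and 5+ engineers
    loaded_salary = 90_000             # assumption
    slowdown = 0.02                    # assumed 2% productivity loss

    yearly_cost = staff * loaded_salary * slowdown
    print(f"assumed productivity cost: ${yearly_cost:,.0f}/year "
          f"vs ${vdi_delta:,} hardware delta")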

Re:Nice, but expensive (0)

Anonymous Coward | about 3 months ago | (#47103343)

I'm working at a company where we seem to be slowly shifting engineers and drafters to VDI stations. The slight reduction in productivity you mention for drafters/engineers cannot be overstated. The substantial lag when drafting is extremely frustrating and makes for a rather miserable experience. Running benchmarks on the hardware doesn't accurately reflect the display lag. Even with extremely good benchmark results, I would take a mid-range local computer over the "blazing fast" VDI any day.

Now if only I could get management to listen.....

Re:Nice, but expensive (1)

Blaskowicz (634489) | about 3 months ago | (#47116593)

Will there really be lag? You can try the same tech they sell as a consumer product: game streaming to SteamOS. The Wii U is similar. With a low-input-lag LCD monitor and a gigabit network, I'm sure the latency would be rather low.

Then you have actual performance advantages from running on the server (except that a typical Xeon will run about 1 GHz slower, save for the really expensive ones). You can do something extravagant like 384GB of memory on the server, with data loaded from terabyte SSDs or a fast SAN instead of going over a slower network to your workstation.

With multiple 1440p or 4K displays, though, the results would have to be seen. At least GPU requirements are often overrated, and you get the Quadro drivers (if you actually need them in some way; no idea if there's additional pricing for that).
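
A rough latency budget shows why a LAN setup can feel fine while the same stream over a WAN feels laggy when drafting (all component values here are my assumptions for illustration, not measurements from the thread):

    # Hypothetical round-trip latency budget for a streamed desktop, in ms.
    budget = {
        "capture + encode": 10,
        "LAN round trip (gigabit)": 1,
        "decode + display": 15,
    }
    print(f"LAN total: ~{sum(budget.values())} ms")   # ~26 ms, barely noticeable

    # Swap the LAN hop for a WAN link and the budget more than doubles.
    del budget["LAN round trip (gigabit)"]
    budget["WAN round trip"] = 40
    print(f"WAN total: ~{sum(budget.values())} ms")   # ~65 ms, noticeable when drafting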

Doesn't VMware and Microsoft already support this? (1)

VTBlue (600055) | about 3 months ago | (#47102341)

What's the difference between this and Microsoft RemoteFX? I'm pretty sure Citrix also supports hardware 3D virtualization.

Re:Doesn't VMware and Microsoft already support th (1)

link-error (143838) | about 3 months ago | (#47102439)

VMware supports it, but I believe you have to assign the GPU to a single virtual instance. It isn't shared or dynamic.

Re:Doesn't VMware and Microsoft already support th (2)

WilyCoder (736280) | about 3 months ago | (#47102771)

RemoteFX blows. It only provides a Direct3D interface; there is no OpenGL support (which is what the majority of the scientific visualization community uses).

Re:Doesn't VMware and Microsoft already support th (0)

Anonymous Coward | about 3 months ago | (#47105879)

I think RemoteFX vGPU is aimed at a different target market than Citrix's vGPU or VMware's vSGA. RemoteFX vGPU is essentially a set of updates to the RDP protocol that allow basic 3D rendering via D3D API interception, for Aero desktops and basic 3D-accelerated workloads. It's very scalable, but still aimed at knowledge workers as opposed to power users of any sort. If a company already has some server infrastructure on Hyper-V, vGPU is just an easy way to leverage that existing infrastructure to provide relatively high-performance, pretty VDI. It's a lot cheaper than looking at Citrix, VMware, or Linux virtualisation solutions, especially in a Microsoft shop where the technicians' skillset is probably just Microsoft (a dying breed of technician!).

Also, please note that RemoteFX is more than just the vGPU feature. RFX has lots to offer even without vGPU, like USB redirection, or VoIP redirection for Lync users, and more.

Bad Scaling Problem Though (4, Insightful)

grilled-cheese (889107) | about 3 months ago | (#47102359)

We very recently went through adding GRID cards to our VMware View infrastructure. The GRID K1 and K2 cards are a tradeoff between more Kepler GPUs (K1) and more CUDA cores per GPU (K2), in addition to the quantity of RAM. VMware View can utilize a GRID card in either vSGA or vDGA mode (shared, or direct passthrough of a Kepler GPU).

From what I can discern, Dell only officially supports the GRID cards in their R720 server, and that particular chassis can only accept 2 GRID cards max. So you get your choice of 2, 4, 6, or 8 Kepler GPUs. If you're using vDGA mode, you're creating a direct VDI desktop allocation of that GPU with DirectPath I/O. While this means that one desktop is going to have great performance, it also means the GPU isn't available for anyone else, and you lose vMotion capability. If you run in vSGA mode, the per-machine performance isn't as good as vDGA, but more desktops can utilize the hardware.

I haven't found any good whitepapers yet describing how far you can stretch a GRID K1, but the rule of thumb I got from another company, who ran them through their benchmark lab, was around 25 desktops per K1 max. Assuming you've got a pair of them, that means you can run ~50 desktops at reduced performance compared to vDGA. The technology still appears young to me, but we decided to take a chance and see how far we could take it.
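
Using the rules of thumb in that comment (2 GRID cards max per R720, ~25 vSGA desktops per K1), server-level capacity planning is straightforward arithmetic; the fleet size below is a hypothetical input:

    import math

    # Capacity estimate from the rule-of-thumb numbers in the comment above.
    desktops_per_k1, cards_per_server = 25, 2
    per_server = desktops_per_k1 * cards_per_server   # ~50 vSGA desktops/server

    target = 200                                      # hypothetical fleet size
    print(f"~{per_server} desktops/server -> "
          f"{math.ceil(target / per_server)} servers for {target} desktops")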

Bad Scaling Problem Though (0)

Anonymous Coward | about 3 months ago | (#47105897)

What version of VMware View? Doesn't the vSGA scaling depend on which 'profile' you use?

See here:
http://www.nvidia.com/object/virtual-gpus.html

I hope to test View 6 soon.

I may be mistaken and this may be Citrix-only (I've used neither, FWIW; just vSphere with a video card passed through, with Splashtop as the remote protocol... and Synergy for relative mouse support... stupid Splashtop).

Re:Bad Scaling Problem Though (1)

grilled-cheese (889107) | about 3 months ago | (#47108873)

What version of VMware View? Doesn't the vSGA scaling depend on which 'profile' you use?

See here: http://www.nvidia.com/object/v... [nvidia.com]

That link is referencing a Citrix-only idea, but their general distinctions between user types are apt. As of View 5.3, there is no longer a lock-in to NVIDIA products, though to my knowledge nobody else has made anything yet. Intel and AMD are on the list to produce something sometime.

So where... (1)

pla (258480) | about 3 months ago | (#47103791)

So where do I download the optimized Bitcoin miner for these demos, and does anyone have a few thousand throwaway email addresses I can borrow for 24 hours?

Bring on cloud gaming! (1)

danielzip53 (1717992) | about 3 months ago | (#47106219)

Remote gaming! Bringing the likes of full rez Call of Duty to your pocket device ;)

been doing this off the Shield handheld for months (0)

Anonymous Coward | about 3 months ago | (#47113751)

Surprisingly, it works decently with games connected over a 4G Wi-Fi hotspot. I would have expected it not to work at all.
