
NVIDIA Unveils GRID Servers, Tegra 4 SoC and Project SHIELD Mobile Gaming Device

samzenpus posted about a year and a half ago | from the going-down-the-list dept.


MojoKid writes "NVIDIA made some bold moves at their CES 2013 press conference and announced a couple of potentially game-changing products. GeForce GRID is a cloud gaming solution. It allows PC game content to be run and rendered in the cloud and then streamed to any device that can run the GRID receiver utility, like a smart TV, tablet, or smartphone. The GeForce GRID server architecture combines NVIDIA-designed servers packed with GPUs with an NVIDIA-developed software and virtualization layer. A rack of 20 GRID servers was shown, powered by 240 GPUs, capable of 200 TFLOPS and roughly equivalent to the performance of 720 Xbox 360 consoles. The biggest news to come out of NVIDIA's press conference, however, had to do with Tegra 4. Not only was the next-gen SoC officially unveiled, but a new portable gaming device based on Tegra 4, dubbed Project SHIELD, was also demoed. NVIDIA's Tegra 4 builds upon the success of the Tegra 3 by incorporating updated ARM Cortex-A15-based CPU cores with 72 custom GeForce GPU cores, which offer up to 6x the performance of Tegra 3. The A15 cores used in Tegra 4 are up to 2.6x faster than the A9-class cores used in Tegra 3. As a companion to the Tegra 4, NVIDIA also took the wraps off of their new Icera i500 programmable 4G LTE modem processor. The Icera i500 features 8 custom, programmable processor cores and is approximately 40% smaller than many fixed-function modems. The biggest surprise to come out of NVIDIA's press conference was Project SHIELD, a Tegra 4-powered mobile gaming device running Android that's sure to put Microsoft, Sony, and Nintendo on high alert. Project SHIELD offers a pure Android experience without any skinning or other customizations, save for the SHIELD app environment, and can play any Android game. Project SHIELD can also stream PC games from a GeForce GTX-equipped PC. The device is shaped much like an Xbox 360 game controller, but features a 5", flip-out capacitive touch display with 720p resolution. The device can also stream to an HDTV via HDMI or a WiDi-like wireless dongle. In fact, CEO Jen-Hsun Huang showed Project SHIELD playing a 4K video on an LG 4K TV."
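As a quick sanity check on those rack numbers, here is a back-of-the-envelope sketch; the ~0.24 TFLOPS figure for the Xbox 360's Xenos GPU is an assumption based on commonly cited specs, not something from NVIDIA's announcement.

# Back-of-the-envelope check on the GRID rack claims above.
RACK_TFLOPS = 200.0
RACK_GPUS = 240
XBOX360_TFLOPS = 0.24  # assumed Xenos throughput, not from the announcement

per_gpu = RACK_TFLOPS / RACK_GPUS
xbox_equiv = RACK_TFLOPS / XBOX360_TFLOPS

print(f"Per-GPU throughput: {per_gpu:.2f} TFLOPS")  # ~0.83 TFLOPS per GPU
print(f"Xbox 360 equivalents: {xbox_equiv:.0f}")    # ~833; NVIDIA's "720" implies ~0.28 TFLOPS per console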


109 comments


Game controller with Display (3, Interesting)

mozumder (178398) | about a year and a half ago | (#42504461)

Would be an excellent part for the upcoming console generation.

I miss the old Dreamcast controller with its built-in LCD display.

A modern take on that would have a nice 5" touchscreen LCD with built-in GPU.

It would be expensive as hell of course...

Re:Game controller with Display (0, Flamebait)

Anonymous Coward | about a year and a half ago | (#42504605)

Not sure if you're cynical or just an idiot...

Wii U...

Re:Game controller with Display (1)

mozumder (178398) | about a year and a half ago | (#42504631)

The Wii U renders on the console, not on the controller.

Re:Game controller with Display (1)

mozumder (178398) | about a year and a half ago | (#42504731)

And when it renders on the console, it becomes GPU bound, limiting the number of controllers it can handle. (The Wii U only allows one controller with a display.)

The old Dreamcast was great at multiplayer because each controller would have its own display, letting each player pick plays and so on secretly from the others...

Re:Game controller with Display (1)

Anonymous Coward | about a year and a half ago | (#42504951)

Your virginity is astounding.

Re:Game controller with Display (0)

Anonymous Coward | about a year and a half ago | (#42505155)

Correction: the Wii U can do 2 controllers, but there's no game that has any use for that, nor will there be for quite some time.

Re:Game controller with Display (1)

kelemvor4 (1980226) | about a year and a half ago | (#42508315)

Correction: the Wii U can do 2 controllers, but there's no game that has any use for that, nor will there be for quite some time.

So no football titles are planned for it? Bummer. As GP mentioned, DC was great for those games because of the personal screen on it.

Not the first android game device, (1)

Eric Coleman (833730) | about a year and a half ago | (#42506475)

There are at least two other Android game devices. There is the Archos GamePad [wikipedia.org], which has already shipped in Europe and looks similar to a PSP in my opinion, and there is the not-yet-released Wikipad [wikipedia.org], which looks to be just a large tablet with a snap-on game controller.

And there is the Ouya, which was mentioned here on Slashdot recently.

I can't help but wonder if the android hardware game device market is about to get really crowded.

Re:Not the first android game device, (1)

Black LED (1957016) | about a year and a half ago | (#42516733)

There are a lot more than that. Check out shops like Willgoo and DealExtreme. They have many cheap, Android-based, handheld game systems.

This is a game changer that we have been dreaming (2)

elucido (870205) | about a year and a half ago | (#42504471)

This is the system of our wet dreams. For years we have talked about cloud gaming devices. And in theory, internet speeds are fast enough to make this work.

This is actually very interesting. How will Sony, Microsoft, and the consoles compete with this? Could this thing be used to bring back arcade gaming? I could see arcades coming back with something like this.

Re:This is a game changer that we have been dreami (1)

mozumder (178398) | about a year and a half ago | (#42504527)

And in theory internet speeds are fast enough to make this work.

Internet speeds aren't fast enough to make this work; the problem isn't just bandwidth, but also latency.

Local ping times are far too high to be usable. Even local Verizon FIOS has 25ms pings across the city, never mind across the country.

You need sub-3ms ping times.

Re:This is a game changer that we have been dreami (0)

Anonymous Coward | about a year and a half ago | (#42504749)

Is that true, though? I tried Gaikai while their trial was still around, and it was quite playable, and my ping was definitely above 3ms.

Re:This is a game changer that we have been dreami (1)

greenfruitsalad (2008354) | about a year and a half ago | (#42505257)

How can you have such a completely different experience from my own? I tried Gaikai at work during lunch breaks. Our offices are on top of a datacentre and we have a few 10Gbps direct links to LINX (london internet exchange). I don't think gaming connectivity gets any better than what I have there. Yet Gaikai SUCKED donkey balls every single time I tried it. I absolutely hated the latency and never played for more than 15 minutes before I got annoyed with it.

Re:This is a game changer that we have been dreami (1)

Joce640k (829181) | about a year and a half ago | (#42506447)

Yep, and remember that the (imaginary) market for this system would be people with tablets connected over 4G or home WiFi.

Nobody with a desktop PC would be simultaneously:
a) Rich enough to subsidize the hardware investment needed by the service providers to make this work (monthly fee...$50?)
and
b) Too poor to buy a $150 graphics card (which is what this hardware will equate to in six months' time).

Re:This is a game changer that we have been dreami (1)

serviscope_minor (664417) | about a year and a half ago | (#42504797)

Local ping times are far too high to be usable.

I remember seeing marketing materials from that cloud gaming company, the large one whose name I forget.

They had some lovely graphs about latency. They'd managed to get the rendering and compression latency way down (impressive, but probably possible).

They'd also apparently done the same to the network latency. Given that all this supposedly happened in the data centre, it smelled like a herd of bulls had recently wandered through.

Re:This is a game changer that we have been dreami (2)

citizenr (871508) | about a year and a half ago | (#42504985)

Local ping times are far too high to be usable.

I remember seeing marketing materials from that cloud gaming company, the large one whose name I forget.

They had some lovely graphs about latency. They'd managed to get the rendering and compression latency way down (impressive, but probably possible).

They'd also apparently done the same to the network latency. Given that all this supposedly happened in the data centre, it smelled like a herd of bulls had recently wandered through.

Onlive. Their offerings were so great that they went bankrupt.

Re:This is a game changer that we have been dreami (1)

serviscope_minor (664417) | about a year and a half ago | (#42505443)

Onlive. Their offerings were so great that they went bankrupt.

Ah yes. Thanks. That's exactly the bunch of scam artists I was referring to.

I say scam because they basically made shit up about the biggest problem, which was network latency. Oh, and they usually quoted the one-way latency rather than the round-trip ping time, which is what actually matters in gaming.

Re:This is a game changer that we have been dreami (1)

Buzer (809214) | about a year and a half ago | (#42505651)

Local ping times are far too high to be usable. Local Verizon FIOS has 25ms pings across the city, nevermind across the country.

I'm not sure what Verizon is doing wrong, but it's certainly something. I'm using cable in Finland (DNA Welho). Ping to the regional exchange point (FICIX) is roughly 7ms, ping to Rovaniemi (~1000km away) on a different operator is roughly 25ms, and ping to the first hop in Germany (Level3 Dusseldorf) is 44ms.

Re:This is a game changer that we have been dreami (0)

Anonymous Coward | about a year and a half ago | (#42509055)

You need sub 3ms ping times.

Have you given this even a few minutes of thought? A 3-frame latency is not unusual for a game. 30 fps is not unusual either. 30 fps means 33.333 ms per frame, so 3 frames is 100 ms. Then there's also the latency in your mouse and your screen. You absolutely do not need 3ms ping times even for competitive FPS, except maybe if you're at the international level. If a game can run at 60fps with 1 frame of latency, you're down to about 16 ms of game latency, so if your ping is 25 ms that's about 41 ms total, which puts you right in the same range. Where on earth did you get that 3ms number from?
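The poster's arithmetic, as a small sketch (the frame counts and the 25 ms ping are their example figures):

# Latency-budget check based on the figures above.
def frame_ms(fps: float) -> float:
    """Duration of one frame in milliseconds."""
    return 1000.0 / fps

local_30 = 3 * frame_ms(30)  # 3-frame pipeline at 30 fps: 100 ms
local_60 = 1 * frame_ms(60)  # 1-frame pipeline at 60 fps: ~16.7 ms
streamed = local_60 + 25.0   # add a 25 ms network round trip

print(f"30 fps, 3-frame pipeline: {local_30:.1f} ms")
print(f"60 fps, 1-frame pipeline: {local_60:.1f} ms")
print(f"60 fps + 25 ms ping:      {streamed:.1f} ms")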

as long as it's wifi only streaming 3g / 4g data c (1)

Joe_Dragon (2206452) | about a year and a half ago | (#42504575)

As long as it's WiFi-only. Streaming over 3G/4G, the data costs will run up fast, and don't even think of roaming: 1-2 hours of that can cost more than buying a high-end gaming laptop.

Re:as long as it's wifi only streaming 3g / 4g dat (1)

h4rr4r (612664) | about a year and a half ago | (#42505179)

3G/4G latency will make the device unplayable. No need to worry about running up the bill; you would be far too frustrated to play for more than a couple of minutes.

Re:This is a game changer that we have been dreami (0)

Anonymous Coward | about a year and a half ago | (#42504619)

Herpa derp. It wasn't GPU power that killed OnLive, it was everyone's crummy slow broadband.

How will Sony, Microsoft and the consoles compete with this?

By making stuff that isn't unresponsive junk.

Re:This is a game changer that we have been dreami (0)

Anonymous Coward | about a year and a half ago | (#42506013)

Yeah? Is that why OnLive games looked like shit? I tried a couple of their games and they looked like they were stuck on low-to-mid graphics settings.

Re:This is a game changer that we have been dreami (1)

alen (225700) | about a year and a half ago | (#42504689)

WTF is the point of streaming the entire game rather than having the cloud just take care of the back end?

NVIDIA is going down the toilet, with Apple and Qualcomm killing their business, so they had to make up this crazy idea to stay relevant.

Re:This is a game changer that we have been dreami (4, Insightful)

drinkypoo (153816) | about a year and a half ago | (#42504739)

This is the system of our wet dreams. For years we have talked about cloud gaming devices. And in theory internet speeds are fast enough to make this work.

Who is this "we"? I dream of having a handheld device that runs games. I want it to work where wireless coverage is spotty or nonexistent; without that, a portable device is worse than useless: it's an expensive rock I have to drag around.

Re:This is a game changer that we have been dreami (1)

Joce640k (829181) | about a year and a half ago | (#42505003)

Who is this "we"?

I think he means NVIDIA. NVIDIA have been hoping for ages that somebody will be stupid enough to actually try this.

Same thing with Intel when they were trying to find a buyer for their Larrabee chips - real-time raytracing for cloud games! Only a billion dollars of Intel chips and motherboards needed!!

Re:This is a game changer that we have been dreami (1)

h4rr4r (612664) | about a year and a half ago | (#42504969)

Unless the server is very close to you, the latency between button press and action is too high. I have tried OnLive and similar services, and they all suffer from this. For some game styles it is fine, but anything with fast action makes this an exercise in frustration. The speed of light is a real bitch.

Re:This is a game changer that we have been dreami (1)

deteknician (1634255) | about a year and a half ago | (#42505873)

Exactly. In Street Fighter 4, players are pretty much required to time button presses to 1 frame, which is about 16ms (the game runs at 60fps). This needs to be done many times during a round, and missing the 16ms window gets you killed. Games like this will not be cloud-friendly anytime soon.

Re:This is a game changer that we have been dreami (0)

Anonymous Coward | about a year and a half ago | (#42507493)

The server will be close to you: the system allows you to stream gameplay from your home PC, over your home network, to you sitting in the living room, so you don't get bitched at by the other half for leaving her alone all evening again.

Re:This is a game changer that we have been dreami (1)

bfandreas (603438) | about a year and a half ago | (#42505583)

This is a cloud of one.
It is connected to a GeForce 650 or better via HDMI. So that aspect is pretty gimmicky.
And IMHO, for your portable gaming needs you'd be better served by a full tablet plus a PS2 controller.

This is a tech demo and I do not expect it to go into production anytime soon. nVidia are trying to get more OEMs to buy their Tegra 4 SoC. And honestly, an Asus Transformer with this baby makes me actually pretty happy in the pants.

Tegra 4's party piece is that it can display different stuff via HDMI on a telly than what you see on the tablet. Add some basic controllers to the tablet and what you get is a Wii U. Only portable. And good. And pretty much open. The question is when Android will actually support this.
Tegra 3's party piece was that it could do that proprietary nVidia 3D thing. Only a couple of games support it. I tried it with Riptide attached to my monitor and it was pretty neat.
My great hope is that Tegra 4 is as good as the current-gen game consoles (but properly high-res). Probably not, since they haven't changed the GPU a lot. But quad-core A15 is pretty beefy. Tegra 3 tablets typically also have more RAM than a PS3 or Xbox 360.

The grapevine has it that nVidia was passed over by both Sony and Microsoft in favour of ATI for the next-gen consoles. So it stands to reason that nVidia will show much more interest in Android-based game consoles with their Tegra 3 SoC. If I were them I'd get in bed with the Ouya folks and design an Ouya 2 based on Tegra 4. Pronto.

Re:This is a game changer that we have been dreami (0)

Anonymous Coward | about a year and a half ago | (#42506043)

This is the system of our wet dreams. For years we have talked about cloud gaming devices. And in theory internet speeds are fast enough to make this work.

Nightmares, not wet dreams. Yes, we've talked about it for years -- about the hilarious rumors that people are seriously entertaining the idea, and what a dumb idea it is. And in theory, my internet speed is nearly 5% fast enough for it to be viable. Maybe in twenty years it'll be not-necessarily stupid, but it still won't be nearly as good for me, or as profitable for nVidia, as my buying chips to render locally.

Re:This is a game changer that we have been dreami (1)

Sockatume (732728) | about a year and a half ago | (#42506271)

These days home consoles are pretty close to general-purpose computing boxes, and could quite easily be used as clients for cloud-gaming services, just as they're currently used as clients for cloud-video services. Sony recently bought Gaikai, after all; they've got to be doing something with it. (Rumour has it, it's for their backwards compatibility...)

The 1980s called (4, Insightful)

PPH (736903) | about a year and a half ago | (#42504483)

They want their X Windows back.

Re:The 1980s called (2)

fredan (54788) | about a year and a half ago | (#42507891)

HEY!

First you wanted to be able to run a remote desktop.

Then users complained about how slow it was, and only 1% (of something) used that feature.

Now, as we are working on getting rid of it, you want that feature back before we've even gotten rid of it!?

Make up your mind!

Re:The 1980s called (1)

PPH (736903) | about a year and a half ago | (#42509999)

First you wanted to be able to run a remote desktop.

Then users complained about how slow it was, and only 1% (of something) used that feature.

Actually, over a decent LAN (an intranet, and now even with broadband around the world), running remote apps over X has worked reasonably well.

Back when I was at Boeing, I used to manage machines all over the Puget Sound region this way (20 to 30 miles between sites). And we had Windows NT with an X client front end, so people with *NIX systems could open a Windows desktop on the rare occasion that some manager e-mailed out a Word doc with macros* and you absolutely had to run Word to see it.

It was Microsoft's sales staff that kept telling us we didn't need to do that. A common approach for technologies that they are behind in.

*More often than not, the macro was an infection, not something the boss actually sent out. Most of my bosses couldn't change the fonts in MS Word, let alone write a macro.

Wow (1)

93 Escort Wagon (326346) | about a year and a half ago | (#42504533)

That is one long paragraph.

Re:Wow (1)

inamorty (1227366) | about a year and a half ago | (#42505309)

And many happy returns.

Re:Wow (1)

Scarred Intellect (1648867) | about a year and a half ago | (#42506091)

That's nothing, read Don Quixote.

Nothing new? (0)

Anonymous Coward | about a year and a half ago | (#42504541)

What makes GRID any better than OnLive? [wikipedia.org] Specifically, with regard to latency: is the lag between controller input and display reduced? Unless nVidia is prepared to upgrade everyone else's infrastructure, I don't see this taking off.

Re:Nothing new? (1)

Joce640k (829181) | about a year and a half ago | (#42505023)

NVIDIA is hoping somebody else will pay for that. Right after they buy $100 million of NVIDIA chips and find out the "latency isn't a problem!" thing was just a rigged demo.

Re:Nothing new? (0)

Anonymous Coward | about a year and a half ago | (#42507721)

For gaming, jitter is even worse than high latency.

Re:Nothing new? (1)

h4rr4r (612664) | about a year and a half ago | (#42505053)

Nothing.
Forget updating everyone else's infrastructure; even if it were all multimode fiber, this is a nonstarter due to the speed of light. Unless the server is very close, under 280 miles at the absolute maximum, you can forget about this. That would give you a 3ms latency on each button press, assuming there is no latency other than that induced by the speed of light.
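For what it's worth, the 280-mile figure assumes signals at the vacuum speed of light; light in optical fiber travels at roughly two thirds of that, which shrinks the radius further. A small sketch of the propagation-only bound (the 0.67 fiber factor is a rough assumption):

# Maximum server distance for a given round-trip budget, counting only
# propagation delay; real links add routing, queuing, and encoding delay.
C_VACUUM_KM_S = 299_792              # speed of light in vacuum, km/s
C_FIBER_KM_S = C_VACUUM_KM_S * 0.67  # rough figure for light in optical fiber
KM_PER_MILE = 1.609

def max_distance_miles(rtt_ms: float, speed_km_s: float) -> float:
    """Distance covered one way in half the round-trip time."""
    one_way_s = (rtt_ms / 1000.0) / 2.0
    return one_way_s * speed_km_s / KM_PER_MILE

print(f"3 ms RTT, vacuum: {max_distance_miles(3, C_VACUUM_KM_S):.0f} miles")  # ~280 miles
print(f"3 ms RTT, fiber:  {max_distance_miles(3, C_FIBER_KM_S):.0f} miles")   # ~187 miles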

Re:Nothing new? (1)

fuzzyfuzzyfungus (1223518) | about a year and a half ago | (#42505363)

What makes GRID any better than OnLive? [wikipedia.org] Specifically in regards to latency, is the lag reduced between controller input and display? Unless nVidia is prepared to upgrade everyone else's infrastructure, I don't see this taking off.

It doesn't do a thing to solve the (significant) 'even customers in the same city have shitty ping, and we can't usefully load-balance our datacenters because adding a cross-country fiber trip totally ruins things, so we have to provision for the peak getting-home-from-school/work-and-playing-games time but let most of it sit idle during the day' problems that helped doom OnLive.

It probably is much better placed than OnLive was to fix the "we basically need an entire computer, or a VM with dedicated hardware passthrough to an entire GPU card, to handle each customer instance" problem. Nvidia is in an overwhelmingly better position to get useful 'cloud' features: being able to carve up a large GPU and allocate resources to multiple low-demand instances, or to have a GPU that can dump video output to a virtual 'screen', with a hardware video encoder that passes the resulting video stream back to be sent over the network, rather than having to do a hardware capture at the DVI port or keep the CPU busy scraping the framebuffer...

So, they can't do much about latency or customer use patterns; but as the guys who make the GPU and write the drivers, they are certainly in a better position to allow efficient slicing up of GPU time and resources (along the lines of what contemporary VM setups can do with CPU and system RAM) than some 3rd-party outfit is.

GRID for nexgen MMOs (0)

Anonymous Coward | about a year and a half ago | (#42504589)

GRID is probably where MMO games will move. One thing sorely lacking in MMOs is a persistent environment that -everyone- sees the same. So much shitty consumer hardware is out there that you're stuck making games like FFXIV (which looks pretty) conform to the capabilities of 5-year-old game consoles. This may not work that well for action MMORPGs (where it's sorely needed) and would work better for turn-based combat (e.g. WoW), by sending everyone the same image but from their own "camera's" POV.

Plus it completely eliminates bots and cheating at the client end, so no more malware, crippleware, and spyware crap needs to run on the user's system.

Sadly, this is where it's needed, but I don't see companies wanting to move the whole "rendering backend" in-house anytime soon, since that's a huge increase in electricity costs. Nice try though.

also need data centers close to major citys (1)

Joe_Dragon (2206452) | about a year and a half ago | (#42504651)

You'd also need data centers close to major cities to keep ping times down, so you'd need more than one to cover the mainland USA.

AK and HI may need their own as well.

Re:also need data centers close to major citys (1)

h4rr4r (612664) | about a year and a half ago | (#42505169)

May?

They absolutely would have to. A state the size of AK would need several. HI is also pretty well spread out, but maybe it could be just a few, since the islands are in a line. Assuming no latency other than that incurred by the speed of light, and for a spherical cow, 280 miles is the maximum for a 3ms latency. That is being very generous, and it still looks like a pretty much impossible task, unless you only allow some ZIP codes to sign up for service to avoid having pissed-off customers.

Re:also need data centers close to major citys (1)

Joe_Dragon (2206452) | about a year and a half ago | (#42510945)

May, as in priced higher or not available in AK or HI.

Re:GRID for nexgen MMOs (1)

L4t3r4lu5 (1216702) | about a year and a half ago | (#42504877)

WoW isn't turn-based. Latency is just as important in WoW as it is in CoD or any other FPS game. If your latency is too high, you'll miss your GCD (activating an ability usually causes other abilities to be blocked for a brief time, typically between 1 and 1.5 seconds; they call it a Global CoolDown). It's the MMO equivalent of missing a sniper shot, or failing to lead a moving target properly. You'll also miss timed events, which can often be unforgiving (missing the roll timings on Spine of Deathwing, for instance). It basically makes all but casual play pointless.

Re:GRID for nexgen MMOs (1)

citizenr (871508) | about a year and a half ago | (#42505095)

EVERYTHING is turn-based. It's called the server tickrate. Just because it's fast enough for you not to feel it doesn't mean it's not there.

Re:GRID for nexgen MMOs (1)

L4t3r4lu5 (1216702) | about a year and a half ago | (#42516153)

In that case THE UNIVERSE is turn-based, with a tick every 10^-43 seconds.

Pedant.

SHIELD's screen directly over the buttons? (2)

holiggan (522846) | about a year and a half ago | (#42504609)

Hmm, from my experience with some laptops, it's usually not a good idea to have a screen directly over keys (or buttons). I've seen a couple of screens with "marks" from the keys, although this might be due to the quality or age of the laptops in question... Perhaps the SHIELD has the buttons below the level of the screen when closed?
I'm nitpicking, I know, but that was the first thing that crossed my mind while looking at the pictures :)

more stupidity (3, Insightful)

slashmydots (2189826) | about a year and a half ago | (#42504615)

I could have sworn there was another company that recently discovered that you cannot, in fact, stream 1920x1080 uncompressed over a standard internet connection, and went bankrupt. Now Nvidia is trying to do it? So a server can render it, great. So a PC can receive and render it, great. How does the data get between the two? In a heavily compressed format with a 50+ ms delay, that's how. Totally pointless.
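The uncompressed-1080p point checks out; a quick calculation (24-bit RGB at 60 fps assumed):

# Bandwidth needed for uncompressed 1080p60 video.
width, height = 1920, 1080
bytes_per_pixel = 3  # 24-bit RGB, no chroma subsampling
fps = 60

bits_per_second = width * height * bytes_per_pixel * 8 * fps
print(f"Uncompressed 1080p60: {bits_per_second / 1e9:.2f} Gbps")  # ~2.99 Gbps
# A heavily compressed h.264 stream at 10-20 Mbps is a 150-300x reduction,
# which is where the artifacts and the encode/decode delay come from.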

Re:more stupidity (0)

Anonymous Coward | about a year and a half ago | (#42504747)

It will probably stream OpenGL commands, not rendered images.

Re:more stupidity (1)

WillerZ (814133) | about a year and a half ago | (#42504867)

It will probably stream OpenGL commands, not rendered images.

Doubtful.

If you can stream one rendered image every 16ms, you can display 60fps. If you send OpenGL commands you will be better off on a lot of frames, but it will stutter every time you need to send a texture larger than a rendered frame (and most textures are larger than a rendered frame).

The game is to make the peak frame latency 16ms (or whatever your target is), not to reduce the average.
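Rough numbers behind that stutter argument; the bitrate, texture size, and link speed here are illustrative assumptions, not measurements:

# Compare shipping a compressed video frame vs. uploading a texture mid-stream.
frame_budget_ms = 1000.0 / 60                     # one frame at 60 fps: ~16.7 ms
stream_mbps = 15                                  # assumed h.264 bitrate
avg_frame_kb = stream_mbps * 1e6 / 8 / 60 / 1024  # ~31 KB per encoded frame

texture_bytes = 2048 * 2048 * 4                   # one uncompressed 2048x2048 RGBA texture
link_mbps = 50                                    # assumed downstream bandwidth
texture_ms = texture_bytes * 8 / (link_mbps * 1e6) * 1000

print(f"Average encoded frame: {avg_frame_kb:.0f} KB")
print(f"One 2048x2048 texture: {texture_bytes / 2**20:.0f} MB, "
      f"~{texture_ms:.0f} ms on a {link_mbps} Mbps link "
      f"(budget: {frame_budget_ms:.1f} ms per frame)")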

Re:more stupidity (1)

phorm (591458) | about a year and a half ago | (#42506485)

Wouldn't it make sense, then, to have some data (textures, etc.) locally cached, and just send the scene/model info?

Re:more stupidity (1)

blackraven14250 (902843) | about a year and a half ago | (#42506641)

Not if their intention is to run everything on their servers. Look at what's mentioned in the summary, and tell me with a straight face that rendering is going to be done on the client.

Re:more stupidity (1)

phorm (591458) | about a year and a half ago | (#42510611)

The post a few up suggested they might be streaming OpenGL commands. My suggestion was that it could possibly be done if the textures and similar large data blobs had a local cache.

Re:more stupidity (1)

walshy007 (906710) | about a year and a half ago | (#42516177)

Read further into it, and you effectively get three options for working with it.

1. Run your Android things on it, entirely locally, with a quality controller.

2. Use your LOCAL PC to stream stuff to it, which then feeds the TV.

3. Use the cloud.

Personally, I'm more interested in number one than anything else, and I'm disappointed that not a single thread on here has gone "this is a neat handheld console".

Yes, you can stream cloud stuff on it, big deal; it has better uses, so let's look at those.

Re:more stupidity (2)

Psyborgue (699890) | about a year and a half ago | (#42507515)

You can probably preload textures and some geometry. Some game engines do something similar now: you get low-res proxies to start with, and the engine updates textures as it can read them off the disk (or, in this case, the network). Second Life is a good example of this in action. You don't need to load the whole game at once, just what's visible, and you keep a cache of that locally. I don't see them needing a Tegra 4 in the handheld if all it's doing is decoding a video stream, so it makes sense that they would be doing something like this. If not, they're going to run into serious latency problems with FPS, driving, and similarly unforgiving games.
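A minimal sketch of that proxy-then-refine idea; the loader callables here are hypothetical stand-ins for a real engine's disk/network fetchers, not any actual API:

# Progressive texture loading with a local cache, as described above.
class ProgressiveTextureCache:
    def __init__(self, fetch_full_async, load_proxy):
        self._cache = {}                # texture id -> full-resolution texture
        self._fetch = fetch_full_async  # hypothetical: starts a background download
        self._proxy = load_proxy        # hypothetical: cheap low-res placeholder

    def get(self, tex_id):
        """Return the best version available right now, without blocking the frame."""
        if tex_id in self._cache:
            return self._cache[tex_id]
        # Not resident yet: request the full texture, draw a proxy meanwhile.
        self._fetch(tex_id, on_ready=lambda tex: self._cache.__setitem__(tex_id, tex))
        return self._proxy(tex_id)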

Re:more stupidity (1)

slashmydots (2189826) | about a year and a half ago | (#42515047)

It could easily be done, but I thought big downloads, tons of storage, and the need for lots of VRAM are what they're trying to avoid. I mean, these days a Pentium chip can render quite a bit, but I know OnLive or whatever it was called wanted to run on something closer to an Intel GMA adapter. That, or anything else HDCP-capable-ish.

Re:more stupidity (2)

gl4ss (559668) | about a year and a half ago | (#42505035)

It will probably stream OpenGL commands, not rendered images.

That could very easily be counterproductive to the task at hand: in modern games, the data needed to create the image could easily be much bigger that way than just streaming the images. And then you might just as well play the entire game on the handheld...

They're probably encoding on the GPU, though; hence the minimum GPU requirements on the PC.

Anyways, it's not that OnLive doesn't work. It works if you have a fast connection, though of course doing it locally would be better still. But an OnLive type of system is pretty hard to make profitable.

This is a niche Tegra showcase product, something to show off the system. The place where these would be at home would be airplane entertainment systems and the like.

Re:more stupidity (1)

Joce640k (829181) | about a year and a half ago | (#42505065)

I could have sworn there was another company that recently discovered that you cannot in fact stream 1920x1080 uncompressed over a standard internet connection and went bankrupt. Now Nvidia is trying to do it?

No, NVIDIA's just fishing for somebody who's stupid enough to go bankrupt from buying NVIDIA chips.

Re:more stupidity (1)

bfandreas (603438) | about a year and a half ago | (#42505405)

They are not streaming over the internet. They stream video/audio over HDMI, which is why that thing is tethered to your gaming PC by an HDMI cable. It's also why they only support one of their own newer graphics cards: presumably they send controller input back via HDMI as well. This is technologically pretty sound.

Nvidia has solved the old problem of not wanting to sit at your PC to play games on a 27" screen, preferring a 10" screen while sitting on the couch. Wait, what?
The HDMI streaming thing is indeed a bit gimmicky, and I wouldn't be too surprised if it was dropped entirely.

That thing is a prototype, so hopefully it will look less daft when it is released. An Android portable gaming device actually makes a lot of sense; as much as a Game Boy, PSP and so on. The question is whether it's good enough to replace gaming on a smartphone, which the target group probably already has; if not, this thing will float like a lead zeppelin. And I really doubt this is much more than a tech demo for their Tegra 4 SoC and a big fat snub to Sony and Microsoft for passing them over for their next-gen consoles.

They built it because they can and to frighten the natives.

On the upside, hopefully a lot more Android game devs will support controllers. A PS2 controller + Android tablet (+ TV) is actually a very nice setup for those of us who spend a substantial amount of time on the road.

Re:more stupidity (1)

drinkypoo (153816) | about a year and a half ago | (#42505603)

An Android portable gaming device actually makes a lot of sense; as much as a Game Boy, PSP and so on.

More, in fact, for a variety of reasons. But it has to be small and cheap, and nVidia will want a nice big screen. A small tablet (MID-sized) that can dock to a controller would make almost infinitely more sense, a la the Transformer series. I, however, think that Asus should do it, and nVidia should stick to making chips.

Re:more stupidity (1)

bfandreas (603438) | about a year and a half ago | (#42506097)

I don't think this is a serious foray by nVidia into that particular market. They are showcasing Tegra 4 to the masses, which, as you have pointed out, hopefully compels Asus to pick it up for the next-gen Transformer series.
Tegra 4 can show a different screen via HDMI than it does on the tablet. My Prime already is a nice business/gaming allrounder, and I'm just waiting for a proper justification to ditch it for a new shiny; one which hopefully doesn't crawl to a shameful halt while it does any kind of meaningful IO.

Dungeon Keeper 1 is too sluggish in DosBox :(
...also, whatever it takes for more Android apps to support a Bluetooth controller is fine by me.

depends on the game (3, Insightful)

Arakageeta (671142) | about a year and a half ago | (#42505779)

Maybe this isn't a solution for FPS games, but I would love to be able to play Civilization V from the cloud with all the graphical bells and whistles.

Re:depends on the game (0)

Anonymous Coward | about a year and a half ago | (#42509289)

Excuse me, a turn-based game has bells and whistles? I think you are doing it wrong.

Re:more stupidity (1)

Sockatume (732728) | about a year and a half ago | (#42506239)

nVidia has a lot of money in hardware video compression. Throwing their chips in with cloud gaming is a low-cost bet for a company that's going to be developing the technology for other applications.

Uses Kepler's h.264 encoder (1)

Namarrgon (105036) | about a year and a half ago | (#42516009)

The streaming video is of course compressed, by the required GeForce 650+ GPU, which has dedicated low-latency h.264 encoding hardware. The SHIELD unit supports 2x2 MIMO 802.11n, which should be more than capable.

nVidia haven't given any latency figures, but hands-on reports all indicate "no detectable lag" over local connections. Some mention visible encoding artefacts, so the bitrate used may not be very high.
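A rough headroom estimate for that link; the usable-throughput fraction and the stream bitrate are assumptions, since nVidia hasn't published figures:

# Headroom for game streaming over 2x2 802.11n.
phy_rate_mbps = 300    # peak PHY rate for 2x2 MIMO 802.11n
usable_fraction = 0.4  # assumed real-world throughput after MAC overhead and interference
stream_mbps = 15       # assumed game-stream bitrate

usable = phy_rate_mbps * usable_fraction
print(f"Usable throughput: ~{usable:.0f} Mbps, "
      f"headroom: ~{usable / stream_mbps:.0f}x the stream bitrate")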

I need realtime ray tracing. (1)

Impy the Impiuos Imp (442658) | about a year and a half ago | (#42504635)

Lemme know when you can stream a 4k render (as in 4096i) to my house with a 50ms latency and reaction time.

Re:I need realtime ray tracing. (1)

Colonel Korn (1258968) | about a year and a half ago | (#42505329)

Lemme know when you can stream a 4k render (as in 4096i) to my house with a 50ms latency and reaction time.

50 ms network latency can be unplayable. The local hardware, display, and controller will add another 50-75 ms of latency on top of that; they already do with consoles. If playing on a computer with a good monitor instead of a television, this might be cut down to 25 ms, in which case a 25-50 ms latency from the network might be acceptable overhead.

Additionally, connections don't maintain constant latency. A 50 ms latency will jump to 200 ms (or, in my ISP's case, 5 s) every few minutes. If you've ever played pre-QuakeWorld Quake 1, you know what that will feel like when streaming.
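A toy simulation of that spikiness; the distribution is invented purely to illustrate why the tail, not the median, is what a streamed game stalls on:

# Median vs. tail latency on a link with occasional spikes (illustrative numbers).
import random

random.seed(1)
samples = sorted(
    random.gauss(50, 5) if random.random() > 0.02 else random.uniform(200, 400)
    for _ in range(10_000)
)
p50 = samples[len(samples) // 2]
p99 = samples[int(len(samples) * 0.99)]
print(f"median: {p50:.0f} ms, 99th percentile: {p99:.0f} ms")
# The streamed game hitches on the 99th-percentile spikes, not the comfortable median.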

Re:I need realtime ray tracing. (1)

Psyborgue (699890) | about a year and a half ago | (#42507595)

Why raytracing? Go straight to real-time path tracing (inverse raytracing). Modern GPUs can do it with static scenes quite easily and efficiently, but there are issues where geometry needs to deform or move (that being said, some seem to have solved this problem). It's doable. In a generation or two we'll start seeing it in action, but it's going to be rough convincing everybody to give up scanline rendering and all the fakery they've gotten used to. If you can make something look better with scanline, why not do it? There are good fakes for almost everything, including bounce lighting, and physical accuracy isn't always desirable.

the best part of this, is the waiting (3, Insightful)

alen (225700) | about a year and a half ago | (#42504727)

Geeks love to wait for crap, and nvidia is the master of the paper launch.

Expect this to hit stores by August 31st, and in the meantime geeks will be creaming their shorts reading blog posts about how awesome this is and watching unboxing videos made by the PR guys.

Re:the best part of this, is the waiting (1)

gtirloni (1531285) | about a year and a half ago | (#42504953)

+1

Re:the best part of this, is the waiting (1)

Anonymous Coward | about a year and a half ago | (#42505011)

Fucking ATI shill!

Re:the best part of this, is the waiting (1)

citizenr (871508) | about a year and a half ago | (#42505119)

Exactly. I remember the exact same hype surrounding the previous Tegras.

Re:the best part of this, is the waiting (1)

alen (225700) | about a year and a half ago | (#42505211)

I've bought nvidia cards since the TNT2 in the 1990s, and they always pull this crap. One time they announced a card before they had even taped out the GPU at TSMC.

A 9-month wait for retail means the final product is not ready and they don't have a manufacturing deal set up yet. For all the hype around the Tegra 4, the LTE modem is not ready yet, so don't expect phones with it until next year at the earliest. By then the Apple A8 will be out.

Let's hope it is more efficient than Tegra 3 (1)

IYagami (136831) | about a year and a half ago | (#42504789)

According to a study of power efficiency in tablets with different CPUs (nVidia Tegra 3, Qualcomm Krait APQ8060A, Samsung Exynos Cortex A15) from anandtech.com ( http://www.anandtech.com/show/6536/arm-vs-x86-the-real-showdown [anandtech.com] and http://www.anandtech.com/show/6529/busting-the-x86-power-myth-indepth-clover-trail-power-analysis [anandtech.com] ), the nVidia Tegra 3 is less efficient than Intel's Clover Trail platform:
* Intel Clover Trail vs nVidia Tegra 3: "Ultimately I don't know that this data really changes what we already knew about Clover Trail: it is a more power efficient platform than NVIDIA's Tegra "
* Intel Clover Trail vs Qualcomm Krait: "We already know that Atom is faster than Krait, but from a power standpoint the two SoCs are extremely competitive. At the platform level Intel (at least in the Acer W510) generally leads in power efficiency. Note that this advantage could just as easily be due to display and other power advantages in the W510 itself and not necessarily indicative of an SoC advantage."
* Intel Clover Trail vs Samsung Exynos Cortex A15: "The Cortex A15 data is honestly the most intriguing. I'm not sure how the first A15 based smartphone SoCs will compare to Exynos 5 Dual in terms of power consumption, but at least based on the data here it looks like Cortex A15 is really in a league of its own when it comes to power consumption. Depending on the task that may not be an issue, but you still need a chassis that's capable of dissipating 1 - 4x the power of a present day smartphone SoC made by Qualcomm or Intel. Obviously for tablets the Cortex A15 can work just fine, but I am curious to see what will happen in a smartphone form factor"

Re:Let's hope it is more efficient than Tegra 3 (3, Informative)

hattig (47930) | about a year and a half ago | (#42507543)

NVIDIA say that Tegra 4 is 45% more power-efficient than Tegra 3. Some of that will be down to its 28nm process, versus the 40nm process Tegra 3 utilised. They also say it is 2.6x faster.

In addition, the AnandTech articles were all set up by Intel, so you need to take the results with a large pinch of salt.

It does raise some questions about Samsung's 32nm process, although a large amount of the power consumption of the Exynos 5250 could be the GPU rather than the CPU - the Exynos uses a very high-performance (for an SoC) GPU.

I wonder (0)

Anonymous Coward | about a year and a half ago | (#42504831)

I wonder whether the big game studios will really develop stuff for something like this. They were always complaining about piracy on the PSP and DS. They can now develop for the Vita and 3DS, which haven't been hacked (yet) but have a small userbase, or develop for Android, with a huge userbase but a guarantee that users will pirate.

720 Xbox 360? (1)

gtirloni (1531285) | about a year and a half ago | (#42504893)

I see what you did there [techradar.com] .

Tegra4 vs Intel Medfield (0)

Anonymous Coward | about a year and a half ago | (#42504911)

In a test performed by Anandtech, the Tegra 4 was compared to Intel's latest by turning off everything: the WiFi, the apps, the screen, the memory, the power, unplugging the battery and comparing the power drain. It was determined that BOTH processors achieved the same level of power consumption!

Anandtech says, "well, Intel have really shown they can level the playing field here, with no detectable difference between the power draw of both processors when they're doing bugger all. We could recommend either processor for the user who likes to do bugger all with their computers and just leaves them powered down".

Re:Tegra4 vs Intel Medfield (1)

Anonymous Coward | about a year and a half ago | (#42505073)

The test you are referring to was done with the Tegra 3.

Re:Tegra *3* vs Intel Medfield (1)

Anonymous Coward | about a year and a half ago | (#42507583)

An old 40nm design pitted against Intel's latest 32nm design, and the Intel design doesn't impress outright.
Was this the one done with Windows RT, which cannot make use of Tegra 3's low-power companion core? Yes, yes it was.

Why such a low resolution display? (1)

h4rr4r (612664) | about a year and a half ago | (#42504941)

With 1080p 5" Android phones being all the rage, why bother with 720p?

Re:Why such a low resolution display? (1)

gl4ss (559668) | about a year and a half ago | (#42505059)

With 1080p 5" Android phones being all the rage, why bother with 720p?

They cost more and are hard to source at the moment (one affecting the other).

Re:Why such a low resolution display? (1)

gtirloni (1531285) | about a year and a half ago | (#42505139)

I think "currently" is the key word here.

Re:Why such a low resolution display? (0)

Anonymous Coward | about a year and a half ago | (#42505403)

It's a gaming device. More FPS.

Re:Why such a low resolution display? (1)

Sockatume (732728) | about a year and a half ago | (#42506295)

Don't all the 1080p Android phones currently have terrible performance and battery life?

Re:Why such a low resolution display? (0)

Anonymous Coward | about a year and a half ago | (#42507381)

Because you can't tell the difference at that size.

Re:Why such a low resolution display? (1)

luther349 (645380) | about a year and a half ago | (#42516943)

No price has been set, but if it's in the sub-$200 range, you will know why.



Nvidia confusing people again. (4, Informative)

Anonymous Coward | about a year and a half ago | (#42506725)

Nvidia makes the mistake of thinking its audience might be informed. Big mistake.

At this conference, they showed a variety of services, which should be understood individually.

1) The proprietary GRID server rack system. The difference with Nvidia's solution is that the rack includes a butt-load of low-to-mid-end Nvidia graphics chips. These are a solution looking for a problem. Nvidia foolishly suggested cloud gaming (rendering on the server, with images sent via the internet to the clients), a model that has already crashed and burnt 1.5 times (OnLive went bust; Gaikai gained little traction and was sold). Worse, Nvidia's weak GPU solution is not good for processing the AAA games that people love on their PCs.

2) The launch of the Tegra 4, a high-end ARM SoC with a very high-end GPU (graphics).

3) The launch of a reference design for a mobile Android gaming device based on the Tegra 4.

4) The launch of a 'streaming' technology that allows PC games to be rendered on the desktop and then wirelessly transmitted to a hand-held Android device, allowing the tablet to 'run' even the most demanding PC games. Of course, Nvidia was saying that their service would be proprietary, requiring specific Nvidia graphics cards in the PC and a Tegra 4 mobile device.

The new Nintendo Wii U does the same thing, using AMD/ATI technology. Third-party apps already exist that allow you to hack current PC games and send their output to generic Android devices.

As one might imagine, the problem is simply one of real-time video encoding (for the game output) and then playing back the video stream on the Android device. Meanwhile, input is gathered on the Android device and transmitted back to the PC to 'control' the game. It is obvious that such software methods will work best when built into the drivers on the PC, and this is what Nvidia is offering (see the sketch at the end of this post).

5) The Tegra 4 is revealed to be extremely power-hungry when all its processing units are being thrashed. No surprise here. The new paradigm for high-end ARM chips is parts that can go from extremely low power usage all the way up to power profiles usually associated with notebooks - and I mean in the same chip. We are actually close to mains-powered desktop ARM parts (which will easily rival Intel on a performance-per-chip-cost basis). The market is demanding that high-end mobile parts achieve ever higher performance figures, regardless of the impact on battery life.

6) Nvidia showed various 'soft' smart-TV-like functions by using the Tegra 4 as input for a 4K TV. Here we see the growing logic of using an external Android device as the heart of a 'smart' TV, rather than relying on the dreadful proprietary hardware/software solutions the TV manufacturers build into the TV itself.

So, in conclusion, Nvidia was really just releasing its latest ARM SoC, the Tegra 4. This part will go up against a lot of competition, and it is a risky bet. Nvidia really needs the market to value Tegra's unique functions, and that really only means Nvidia's new GPU cores. Unfortunately for Nvidia, the power of their graphics can only be unleashed in mobile devices with very substantial battery capacity and proper cooling - nothing like your current cheap tablet ecosystem. If the Tegra 4 is placed into an ordinary tablet design or phone, the chip will have to be choked to a fraction of its potential, or else the battery will last less than an hour and the device will get very hot indeed.

The Tegra 4 may represent Nvidia giving up on the old Android mobile market, and focusing instead on applications that can provide more power, and dissipate more heat, like set-top boxes, and chunky hand-held gaming devices.
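Following up on point 4 above, here is a schematic of that capture-encode-send loop, with the input path running the other way. Every object used here (game, encoder, decoder, display, controller) is a hypothetical placeholder to show the shape of the data flow; this is not NVIDIA's actual API.

# Shape of a PC-to-handheld remote-play loop, as described in point 4.
import socket
import struct

def recv_exact(conn: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from a stream socket."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed")
        buf += chunk
    return buf

def host_frame(game, encoder, video_out, input_in):
    """One iteration on the PC: apply remote input, render, encode, send."""
    pad_state = recv_exact(input_in, 8)  # fixed-size controller packet
    game.apply_input(pad_state)
    frame = game.render_frame()          # rendered on the desktop GPU
    packet = encoder.encode(frame)       # low-latency hardware h.264
    video_out.sendall(struct.pack("!I", len(packet)) + packet)

def client_frame(decoder, display, controller, video_in, input_out):
    """One iteration on the handheld: send input, then receive, decode, display."""
    input_out.sendall(controller.poll_state())  # 8-byte pad state back to the PC
    (length,) = struct.unpack("!I", recv_exact(video_in, 4))
    display.show(decoder.decode(recv_exact(video_in, length)))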

Re:Nvidia confusing people again. (0)

Anonymous Coward | about a year and a half ago | (#42516451)

> The Tegra 4 is revealed to be extremely power-hungry when all its processing units are being thrashed. No surprise here.

Show me a benchmark. Man, the chip is brand new! There is no device to test it on, and still it is "revealed"? Only in your mind.

Re:Nvidia confusing people again. (1)

luther349 (645380) | about a year and a half ago | (#42516907)

They said it will do 6 hours of gaming and about 12 playing video. Maybe most of this device's size is down to the huge battery they put in.

If nvidia wants to get into consoles, they shud... (0)

Anonymous Coward | about a year and a half ago | (#42506899)

A Tegra 4 console is a waste of time. nvidia is one of the few companies that can make a big, powerful GPU. If nvidia wants to get into consoles, they should team up with Valve and make a 'traditional' desktop console. They will need a good CPU; Project Denver might suffice, but it might be better to turn to IBM. Valve would be the senior player in such a console, and that might be too much for nvidia's ego.

Re:If nvidia wants to get into consoles, they shud (1)

luther349 (645380) | about a year and a half ago | (#42516927)

It's not really a console; it's just Android with a built-in controller, so it's pretty much anything you can already do with a tablet. If anything, it's a smart move: even if sales are not good, they put very little investment into it, and it makes no difference if they quit making them down the road, because they are Android devices and will still get apps and games.

240 GPUs, 200TFLOP... (1)

Ed_1024 (744566) | about a year and a half ago | (#42508755)

...should be just about enough to run one instance of Crysis 4. If you've got an OC-192 internet connection, you might be able to play it in HD as well!

Gamma, Color Correction (0)

Anonymous Coward | about a year and a half ago | (#42509867)

Whether or not Nvidia releases support for color management on Tegra 3 will definitely influence confidence in their newer systems. These companies thrive on marketing developing technologies for profit, leaving basic features and documentation in the background.
