
OnLive Gaming Service Gets Lukewarm Approval

Soulskill posted more than 4 years ago | from the better-than-lukewarm-rejection-i-guess dept.

PC Games (Games) 198

Vigile writes "When the OnLive cloud-based gaming service was first announced back in March of 2009, it was met with equal parts excitement and controversy. While the idea of playing games on just about any kind of hardware, thanks to remote rendering and streaming video, was interesting, the larger issue remained: how did OnLive plan to solve the latency problem? With the closed beta currently underway, PC Perspective put the OnLive gaming service to the test by comparing the user experience of OnLive-based games against the same titles installed locally. The end result appears to be that while slower input-dependent games like Burnout: Paradise worked pretty well, games that require a fast twitch-based input scheme like UT3 did not."


Duuuuuh (0)

Anonymous Coward | more than 4 years ago | (#30857732)

Was this really a surprise to anyone who knows anything about the technicalities of time-critical mechanisms in games?

Re:Duuuuuh (0)

Anonymous Coward | more than 4 years ago | (#30857970)

If you can stick a rack of servers at most large ISPs, it will work. Even a hop or two away you can probably get 20ms round trip to a lot of residential broadband users, which is roughly equivalent to the input lag of a cheap LCD monitor.

It's completely doable technically, as long as you're not stupid enough to think that you need a few big datacenters. You need lots of small ones, everywhere.

Re:Duuuuuh (1)

fuzzyfuzzyfungus (1223518) | more than 4 years ago | (#30858064)

However, you are still dealing with 20ms plus the input lag of the cheap LCD monitor that the user will no doubt have...

Re:Duuuuuh (1)

sadness203 (1539377) | more than 4 years ago | (#30858138)

Well, people playing online games already have this latency issue, and a bunch of them already have cheap LCD monitors... I guess they can live with it.

Re:Duuuuuh (0)

Anonymous Coward | more than 4 years ago | (#30858330)

Most LCD monitors, including the cheap ones, have 4-5ms response times. 20ms is much more latency, and yet you'd be lucky to even get that. In most online shooter games that I play (UT, Q3), the ping is usually 60-150ms for what is considered a "good" rate. Servers that I can get 60ms to are all extremely close to me geographically.

To expect someone to have enough bandwidth to stream 1920x1200 at 60fps from a remote render farm with realistically 50-150ms response time is idiotic.

Re:Duuuuuh (5, Interesting)

Svartalf (2997) | more than 4 years ago | (#30858368)

Actually...it's doable technically with only a very, very small number of subscribers.

Latency and bandwidth will kill the whole thing.

You have to use peak values per customer in your figuring for it to even remotely work the way they portrayed this.

Given this:

1.5Mbits/s for the feed per user for SD experience with OnLive.

You can serve, roughly, as an absolute maximum:

30 users on a T3.
103 users on an OC-3.
404 users on an OC-12.
1658 users on an OC-48.

You can expect about $250-500k/mo in recurring costs on that OC-48. As another observation, you will likely need to serve only 2/3rds to 3/4ths of those numbers to keep the latency usable, because as you fill the pipe to capacity, traffic becomes subject to the congestion algorithms in the routers and machines at both ends of the pipe. Now, some will state that they'll place the stuff at the ISP's end of things... Then the ISP gets the joy of this same level of connectivity, and they're bitching about "freeloaders" and "bandwidth problems" right now.
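
For readers who want to check the figuring, here is a minimal Python sketch (an editorial illustration using standard raw line rates; it reproduces the list above to within rounding):

STREAM_MBPS = 1.5  # assumed per-user SD feed, as stated above

LINE_RATES_MBPS = {
    "T3": 44.736,
    "OC-3": 155.52,
    "OC-12": 622.08,
    "OC-48": 2488.32,
}

for link, rate_mbps in LINE_RATES_MBPS.items():
    absolute_max = int(rate_mbps / STREAM_MBPS)
    # Derate to ~2/3 of capacity, per the congestion argument above.
    usable = int(absolute_max * 2 / 3)
    print(f"{link}: {absolute_max} absolute max, ~{usable} usable")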

OnLive is snake oil trying to be sold to the game industry as a solution to their "control" problem. It's an alternate DRM play. And it can NEVER work in our lifetime. You can't field enough bandwidth cheaply enough to accomplish it.

Re:Duuuuuh (1)

slim (1652) | more than 4 years ago | (#30858488)

I think this is a well thought out post in general. However:

1.5Mbits/s for the feed per user for SD experience with OnLive.

TFA measured ~750kbps for 720p.

Maybe your 1.5Mbps was from OnLive's announcement of the peak bandwidth required?

Re:Duuuuuh (2, Funny)

IBBoard (1128019) | more than 4 years ago | (#30858526)

[OnLive] can NEVER work in our lifetime.

You say that, but I've been using a "remote generation of gaming images" system for years and there is basically no lag. Okay, the catalogue of games is a little limited, but the control and response is amazing. Distance? I'd say about three or four feet from the input and output devices to the box that generates my images. Definitely remote from the devices and definitely working over wires without latency issues.

So, it is already working, and I can't see why I'd want to change to this new one.

Re:Duuuuuh (1)

poetmatt (793785) | more than 4 years ago | (#30858650)

I'm so tired of people thinking it can be done. This thing has as much hype as the Apple tablet, and neither of them has anything to show for it. Both are horrible deliveries and over-promises, not that such a concept is foreign to Apple. OnLive, however, is new, and has been over-promising since day one. I pity anyone who's invested in OnLive at this point.

Yet another information-free summary... (5, Informative)

Anonymous Coward | more than 4 years ago | (#30857758)

The guy logged in using credentials 'borrowed' from an authorised beta tester, from more than twice the recommended distance from the server, acknowledged multiple high-latency (due to distance) notifications, and the best he could do was damn the service with faint praise.

Re:Yet another information-free summary... (-1, Flamebait)

juuri (7678) | more than 4 years ago | (#30857800)

Thanks OnLive employee, you sure got those videos down fast!

Re:Yet another information-free summary... (1, Funny)

Anonymous Coward | more than 4 years ago | (#30857814)

If you want to try to shoot the messenger, make sure you're close enough to the server. ;)

Re:Yet another information-free summary... (0)

Anonymous Coward | more than 4 years ago | (#30857810)

Do you work for OnLive? How else could you fail to acknowledge that for games that are highly dependent on reaction speed, even a few extra milliseconds of latency may add up to a laggy experience? The problem may be exacerbated by the reporter's distance from the server, sure. But for serious gamers it is common knowledge that remote playing will not ever be as quick as a LAN frag fest.

Re:Yet another information-free summary... (3, Interesting)

Trepidity (597) | more than 4 years ago | (#30857816)

But for serious gamers it is common knowledge that remote playing will not ever be as quick as a LAN frag fest.

Possibly true, but possibly also might not matter, if it's still quick enough. After all, playing on the internet isn't as quick as a "LAN frag fest", and yet the vast majority of gamers, even of twitch-heavy games, are playing on the internet, not on LANs.

Re:Yet another information-free summary... (0)

Anonymous Coward | more than 4 years ago | (#30857844)

There's a difference between sending signals which correspond to an action, and a full round trip of player input and video-rendering output.

Re:Yet another information-free summary... (5, Informative)

Anonymous Coward | more than 4 years ago | (#30857880)

". After all, playing on the internet isn't as quick as a "LAN frag fest", and yet the vast majority of gamers, even of twitch-heavy games, are playing on the internet, not on LANs."

With tons of client-side prediction and fakery trying very, very hard to hide the client-server lag.

With OnLive, you can't do that - it just sends some inputs and gets some video back.

I mean, this could work under optimal, super-fast network connections, but I'm pretty sure ensuring you have such a connection would be so expensive that this is a solution to a problem that doesn't exist: it is always cheaper to spend the money on client-side hardware instead. I'm sure stupid venture capitalists will keep pumping money into this, with idiotic projections of how a bazillion people will pay X dollars per month or hour or whatever, which will somehow cover those network infrastructure costs.

I doubt it will, and a few years from now OnLive will go bust, taking a big pile of money with it. But hey, you never know... can't do impossible stuff without trying.

Re:Yet another information-free summary... (1)

Kneo24 (688412) | more than 4 years ago | (#30857998)

Which, once again, may not matter to the average gamer. We're not talking about the hardcore gamer here; they already have and get what they want for serious gaming. The average gamer doesn't necessarily do this and is used to playing under suboptimal conditions anyway. Ever take a look at Steam's hardware survey? Most people have older hardware, not newer hardware. They're already experiencing plenty of computational lag on their end. This service may not be any different from that.

Re:Yet another information-free summary... (1)

icebraining (1313345) | more than 4 years ago | (#30858712)

Wrong. People with old hardware can reduce the quality to get good performance (I play Call of Duty 4 in 800x600, lowest quality), but there's no switch that can lower the RTT.

Re:Yet another information-free summary... (1)

tepples (727027) | more than 4 years ago | (#30858030)

With tons of client-side prediction

How well does client-side prediction work in an input-heavy game, like a fighting game, played between continents? You could be playing just fine, and then half a second later when the updates make it through, you're dead because your computer didn't predict that the other player would do a juggle-spike combo on you.

Re:Yet another information-free summary... (0)

Anonymous Coward | more than 4 years ago | (#30858440)

Not very well.

Street Fighter IV over Games for Windows LIVE would be a good example. It requires a fairly good connection to be playable.

Anyway, the main point of client-side prediction and "cheats" is to *hide* the fact that yes, any online gaming has unavoidable latency between your button press and something happening in the "definitive" version of the game world, which is usually modeled on a server. If your local side can show the action even before the server confirms it, it looks better; and if the client can somehow make things look plausible in cases of conflict, even better.

In normal client-server online play, the client is effectively running a very well-faked show of what is most likely going to happen on the server in the very near future (0.3-0.5 seconds, or 300-500ms), based on your inputs and recent data from other players, and then reaches for a bag of tricks when it guesses wrong. This is not an exact science, but you can pretty easily tell which games do this well and which don't. On the MMO side, just compare, say, World of Warcraft, which fakes things reasonably well (to a point; it isn't perfect in high-lag situations), with, say, Lord of the Rings Online (which suffered, at least early on, from noticeable stuttering and sliding when displaying the positions of other players; no idea if things have improved since I last played it).

OnLive cannot do any of this, so at all times you have input lag, and it will be annoying under anything except the most optimal conditions (sub-50ms). You wouldn't need much client-side trickery for normal online games today if you could guarantee that everyone has sub-50ms latency to the server.
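
To make the prediction mechanism concrete, here is a minimal editorial sketch in Python (the class, names, and numbers are illustrative, not OnLive's or any particular engine's): the client applies inputs immediately, remembers them, and replays whatever the server has not yet acknowledged when an authoritative update arrives.

SPEED = 5.0  # assumed movement units per input tick

class PredictingClient:
    def __init__(self):
        self.position = 0.0  # locally predicted position
        self.pending = []    # (sequence, move) inputs not yet acked

    def apply_input(self, seq, move):
        # Show the result immediately instead of waiting a full RTT.
        self.position += move * SPEED
        self.pending.append((seq, move))

    def on_server_state(self, acked_seq, server_position):
        # The server is authoritative, but its state is ~RTT stale.
        self.pending = [(s, m) for s, m in self.pending if s > acked_seq]
        # Rewind to the server's state, then replay unacked inputs.
        self.position = server_position
        for _, move in self.pending:
            self.position += move * SPEED

c = PredictingClient()
c.apply_input(1, +1.0)     # shown instantly: position 5.0
c.on_server_state(0, 0.0)  # stale server state; input 1 replayed, still 5.0

A thin client in the OnLive mold has no game state to rewind and replay; it only has video frames, so every input must wait out the full round trip.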

Re:Yet another information-free summary... (0)

Anonymous Coward | more than 4 years ago | (#30857882)

Nope, no connection to this at all; just that I comment so rarely on /. that there's no point having an account. I would have thought that the rest of your comment is blindingly obvious and didn't need to be stated. 'Serious' gamers are probably not the target market for this product, so again, no real need to mention it.

Re:Yet another information-free summary... (1)

tolan-b (230077) | more than 4 years ago | (#30857952)

OnLive have clearly said that they think the latency isn't too much for most people if it's lower than 80ms, and they've made a big deal about how far you can be from the server. So a reviewer dislikes it because the latency is too high, while using it in a way that OnLive said would make the latency too high? What a surprise!

Re:Yet another information-free summary... (1)

slim (1652) | more than 4 years ago | (#30857966)

OnLive knows all this. They set themselves a target of 80ms round-trip latency.

To achieve this, they set certain geographical limits. This journalist broke those limits. The software warned him about high latency. He observed high latency.

Note that some games are perceived as OK despite up to 200ms round-trip latency. GTA IV on the Xbox was measured to have 133-200ms latency [eurogamer.net]. Nobody cared because it's not a twitch game.

Re:Yet another information-free summary... (2, Insightful)

Rockoon (1252108) | more than 4 years ago | (#30858052)

We used to play twitch games all the time back when 56K modems (and thus, latency considerably higher than 80ms) were state of the art. Counter-Strike was born on 56K modems, for example.

The game has to be specifically written to deal with it, but it can be done.

Just the same, I think that this service will be a monumental failure. I just don't see how they will recover the costs, because they can't make it too expensive, or the end user can save money by buying a machine that doesn't require their service. Those that already have such a machine (mostly everybody) won't even consider paying the extra costs.

There is more to gaming than FPS... (0)

Anonymous Coward | more than 4 years ago | (#30858174)

I dunno. I had never even heard of this service before, but... I am a gamer, though I don't play that many FPS games. Mostly RTS, RPG, etc. In those, latency really isn't nearly that big of a problem. So, if I could play Dragon Age on a laptop with good graphics and without worrying about the damn thing crashing when rendering a highly detailed cutscene... Great!

Re:Yet another information-free summary... (1)

montyzooooma (853414) | more than 4 years ago | (#30858540)

I remember playing the original Quake beta demo (not Quakeworld) online and you had to learn to lead your shots so they'd land where you thought your opponent was going to be. Everything since Quakeworld has been gravy.

Re:Yet another information-free summary... (0)

Anonymous Coward | more than 4 years ago | (#30858612)

But everyone was in the same situation, so the playing field was even. Now, having that much lag combined with an instakill style of gaming means that the one with the shorter ping wins.

Re:Yet another information-free summary... (0)

Anonymous Coward | more than 4 years ago | (#30858750)

You're forgetting about client-side prediction, which can't be done if you're only sending results (the screen) as opposed to data to be evaluated by the client engine.

The prediction is how a crap connection can still be used to play a modern MMO or FPS reasonably well.

Valve has a very interesting article in the Half-Life SDK describing the prediction and latency-reduction techniques they used to make Counter-Strike playable.

Re:Yet another information-free summary... (0)

Anonymous Coward | more than 4 years ago | (#30857812)

What's that? The beta tester was a regular guy who didn't get special treatment and his experience was mediocre, thereby more accurately representing the actual product!?

Re:Yet another information-free summary... (3, Interesting)

slim (1652) | more than 4 years ago | (#30857850)

Even when it's a full product, you won't be allowed to sign up if you're not in a geographically suitable place.

It seems that the eventual plan is that it will dynamically assign your session to the closest datacentre. But for the time being, each beta tester's ID is assigned a datacentre at registration time, and that's the one that ID will use every time.

TFA explains that he borrowed the login credentials of a beta tester in another part of the country. Hence he wasn't using a nearby server, as he would have been if he were a real beta tester or, in the future, a paying customer.

It's pretty amazing it worked as well as it did, considering all that.

Re:Yet another information-free summary... (0)

tepples (727027) | more than 4 years ago | (#30858050)

Even when it's a full product, you won't be allowed to sign up if you're not in a geographically suitable place.

Then the product will have few customers because most people aren't willing to spend thousands of dollars/euros to move to "a geographically suitable place."

Re:Yet another information-free summary... (1)

slim (1652) | more than 4 years ago | (#30858096)

Have you noticed how people tend to cluster into populous areas?

My wild guess is they'll put the servers where the people are, rather than expect people to move.

Where what people are? (1)

tepples (727027) | more than 4 years ago | (#30858166)

My wild guess is they'll put the servers where the people are

Not everybody lives in New York or Los Angeles. There are 200,000 people in Fort Wayne, Indiana, including myself. What are the odds that I'll get coverage?

Re:Where what people are? (2, Insightful)

slim (1652) | more than 4 years ago | (#30858212)

Less than 200 miles from Chicago. You'll be fine.

In fact you're already on their coverage maps. I'd be astonished if they didn't expand from the three datacentres used for the Beta.

Re:Yet another information-free summary... (1)

MetalAngel (1659579) | more than 4 years ago | (#30858246)

Did you see their talk a few weeks ago? They will deploy datacenters in several locations in the US. Almost everyone (except a few places up in the north) will be able to play.

Re:Yet another information-free summary... (1)

Caue (909322) | more than 4 years ago | (#30857940)

Mediocre = average, so enough for most people. That's a damn good result if you are going for the average user (not the nerdy pro-gamer).

Re:Yet another information-free summary... (0)

Anonymous Coward | more than 4 years ago | (#30857840)

It's a stupid idea... it needs to be damned by damn...

Re:Yet another information-free summary... (1)

MetalAngel (1659579) | more than 4 years ago | (#30858186)

The article itself states that the writer got warnings that the latency was not good. The Slashdot bottom line ("The end result appears to be that while slower input-dependent games like Burnout: Paradise worked pretty well, games that require a fast twitch-based input scheme like UT3 did not.") is therefore complete and utter nonsense. Lots of lag -> of course UT3 does not work. Duh! It would have been another story if there hadn't been any warnings. It also would have been another story if this weren't a beta and 50% of the users got this warning. But as it stands, you can't conclude anything. Thinking before writing helps.

Re:Yet another information-free summary... (0)

Anonymous Coward | more than 4 years ago | (#30858188)

You seem to have a poor understanding of the phrases 'information free' and 'damn with faint praise', as I see *neither* of those apply to the summary (for the former) or the article (for the latter).

Sounds like a realistic test to me (2, Insightful)

Sycraft-fu (314770) | more than 4 years ago | (#30858504)

Because guess what? In the real world, people live all over. OnLive isn't going to be able to say "Just move closer to one of our data centers," at least not if they want to pitch themselves as the "cheaper than buying a graphics card" option. Sounds to me like they've been controlling who gets into the beta to try and create an overly rosy impression. This guy was a more realistic test: a person who doesn't happen to be near their few locations.

That's just the reality of this. If it is to work well, it can't only work well for a few people in a few locations.

Also, the more revealing part was just how bad things look, just how much the compression degrades the image quality. The difference between the local and remote screenshots is almost the difference between SD and HD. While the OnLive stuff is technically 720p, a ton of the detail is getting lost. That's just what you are going to get when you try to jam HD video into a 1mbps stream. Only problem is, that detracts from one of the supposed reasons to get the service. The lower the resolution and image quality, the lower-end the graphics card that could handle it on a local system. So OnLive isn't giving you the same experience as a $400-$600 graphics card; it is giving you maybe the same experience as a $50-100 graphics card. Well then, that makes it much less worthwhile.

So initial results seem to show that the doubters were right:

1) Latency will be an issue. If you don't happen to live near their datacenters, your latency may make playing difficult to impossible.

2) Quality will suffer. They don't have some magic voodoo compression that makes everything look perfect, their compression is like everyone else's and trying to do 720p @ 60fps equals a good deal of detail loss.

3) Even if you have a good net connection, if there are problems or congestion, the service will be unusable, meaning you can't play your games whenever you want.

Makes it not so attractive as they hyped it to be, especially against powerful $100 graphics cards (the low-mid range of graphics is great these days) and $200 game consoles.

Technically competent gamers can sum this up. (0, Redundant)

AbRASiON (589899) | more than 4 years ago | (#30857764)

With a single word.
"DUH!"

As expected (1)

mseeger (40923) | more than 4 years ago | (#30857778)

I think the results are as expected, or even slightly better than expected (at least from my viewpoint). It shows that something like OnLive will be workable in the future with slightly faster internet access.

My problems with OnLive are not related to the technical side. Even though I am mostly a casual gamer (at least since I gave up WoW) and I could profit from pay-per-hour, I am not sure I would like this. It would require a lot of trust on my side, which OnLive has yet to earn.

CU, Martin

Re:As expected (1)

bluesatin (1350681) | more than 4 years ago | (#30857954)

This is an issue with latency, not bandwidth; even if you had a dedicated 100Mb line but a 100ms ping, it would be unplayable.

I doubt future improvements will speed up the hardware that processes signals by a huge margin, and presumably they're not going to change the speed of light any time soon; the only real thing they can do is put the data centres closer to the end-user to improve performance.

Re:As expected (1)

slim (1652) | more than 4 years ago | (#30857996)

the only real thing they can do is put the data centres closer to the end-user to improve performance

They're doing that. But Perlman claims that they're also:
  - Developing smarter routing algorithms
  - Tuning at the IP packet level, to increase speed on domestic routers etc. (I guess this is largely about getting the MTU right, dynamically)

Re:As expected (1)

Svartalf (2997) | more than 4 years ago | (#30858602)

Heh... That's a used car salesman's pitch of things.

"Smarter" routing algorithms have to be applied to each and every router in the mix that might see the traffic for that to work. Do you see the ISP's ripping every Cisco and Juniper out to accomodate them?

"Tuning" at the "IP packet level"? Perlman said this?

There's a magic size that will bring bandwidth to its peak. With smaller packets you don't get as much through because of latencies, etc. With larger packets, you end up getting more and more bandwidth with diminishing returns and worse and worse latency. You can't "increase speed" on a domestic router by playing games with packet sizes, and the MTU isn't something that magically changes things like you think it does...

MTU stands for "Maximum Transmission Unit" and is the maximum amount of data that can be transferred in a single packet on a given media transport. You could get a bit of a speed boost by dinking with the MTU size on a dialup link, because there wasn't any max packet size and they almost always set the MTU rather low. 1500 is the max on most stuff these days because that's the atom for Ethernet and similar networking systems. Messing with the MTU can cause serious problems for stuff you're pushing across the wire when it's not within the max of the systems you're routing through. Pretty much everything is at 1500 over the connected Internet; you'll gain nothing by changing it upwards, and changing it downwards just fragments the hell out of your packets.

Re:As expected (2, Informative)

mseeger (40923) | more than 4 years ago | (#30858100)

Latency (for a line that's not overbooked) depends on bandwidth and packet size. With the same packet size, ten times the bandwidth reduces the latency by nearly a factor of ten (on a single line).

Overall latency is the sum of the latencies of all lines along the way, plus a bonus for each router. The bonus for the routers is not the issue. The number of hops can be influenced by a service provider like OnLive through peering agreements. Something OnLive cannot influence is the last mile to the customer. Usually 30-50% of the total latency happens here, so an increase in bandwidth will help there.

In my case I have a latency of about 25-30ms to the major hosting providers here in Germany (which is due to a fast line [6mbps + Fastpath]). The time is distributed as follows:

- 2ms (my home network)
- 12ms my DSL line
- 2ms my Provider
- 10ms Upstream Provider
- 1ms Hosting Provider

Even in my case, nearly 50% of the latency is created on the last mile. The packet travels Kiel -> Hamburg -> Hannover -> Duesseldorf -> Frankfurt. That amounts to perhaps 400 miles. 50% of the latency on 1% of the way seems to me a pretty conclusive argument that more bandwidth to the end user would reduce overall latency significantly.

CU, Martin

P.S. This all depends on the Bandwidth not being overbooked.....

Re:As expected (1)

bluesatin (1350681) | more than 4 years ago | (#30858336)

Latency (for a line that's not overbooked) depends on bandwidth and packet size. With the same packet size, ten times the bandwidth reduces the latency by nearly a factor of ten (on a single line).

I'm confused as to how this statement holds up. Currently I have a 16Mb or so line at home, with pings of about, say, 100ms to a server. If I were to upgrade to a 50Mb line, would my ping go down to 32ms?

Surely latency is how long the signal takes to get to a server and back; how would allowing more signals to go back and forth at once increase the speed at which they get to the server and back?

Car analogy:
If, say, the speed limit on a motorway was 70mph and there was no congestion on the road, why would adding extra lanes to the motorway increase how fast I get to my destination?

Re:As expected (3, Informative)

mseeger (40923) | more than 4 years ago | (#30858434)

Hi,

If, say, the speed limit on a motorway was 70mph and there was no congestion on the road, why would adding extra lanes to the motorway increase how fast I get to my destination?

You've got the car analogy wrong. A packet of 100 bytes is not like a single car; it consists of 800 cars (bits). So if you increase the number of lanes, more cars can travel at once. Each car still travels at the same speed, but by allowing more cars at the same time, the delivery (packet) distributed over those 800 cars gets delivered faster.

The time a packet takes to get transmitted is roughly: packetsize/bandwidth.

Say you have a 10mbps line and a 1000-byte packet. This will take 8000 bit / 10,000,000 bit/s = 0.0008 s, or 0.8ms (one way). So the round-trip latency through the line will be roughly 1.6ms. If you go to 100mbps ethernet, or even gigabit ethernet, the time goes down by a factor of ten at each step.
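
That arithmetic, as a quick editorial Python sketch:

def serialization_delay_ms(packet_bytes, bandwidth_bps):
    # One-way serialization delay: packet size over line rate.
    return packet_bytes * 8 / bandwidth_bps * 1000

print(serialization_delay_ms(1000, 10_000_000))   # 0.8 ms on 10 Mbit/s
print(serialization_delay_ms(1000, 100_000_000))  # 0.08 ms on 100 Mbit/s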

But there are some side effects. Sometimes packets are packed into larger packets to fill the line better; this will increase the latency. When the speed of the line is high, the time the OS needs to send/receive the packets has more influence on the latency. Also, latency may occur in your provider's network because the provider overbooks the service (selling access for more cars than the lanes allow, and therefore creating congestion).

To see whether your line is the chokepoint, use Traceroute [wikipedia.org] to see where the latency happens. If the latency already occurs close to you, a faster line may improve it. Also look for features from your provider such as "fastpath".

CU, Martin

P.S. This is a very short overview of the topic. A complete treatment would fill a book. BTW, the books have already been written: W. Richard Stevens: TCP/IP Illustrated [kohala.com].

Re:As expected (1)

wgoodman (1109297) | more than 4 years ago | (#30858520)

Electrical signals don't travel at the speed of light. Ethernet is generally ~0.5C. I'm not entirely sure what the speed in fiber is, but it is also a bit less than C. Just for giggles, I should point out that the sunlight you see is also not reaching you at C, since C is defined as the speed of light in a vacuum. I'm aware that for all intents and purposes it's easy enough to consider them all as traveling at C, since the difference in time is so minute over terrestrial distances. It just gets me when people start a point by saying that the signals are traveling at the speed of light.

Re:As expected (2, Informative)

mseeger (40923) | more than 4 years ago | (#30858614)

Quibbler :-) You wanted it....

For our example, 0.5C is sufficiently close to C to call it "speed of light" :-). As you point out, the "speed of light" is not the same as C. I can find materials where the speed of light is below 0.5C. So saying that the electric signal travels at the speed of light is correct, since I didn't mention any material I would be measuring the speed of light in...

Point, game and match :-)

CU, Martin

P.S. I have references to materials reducing the speed of light to 17m/s (38mph for you imperial bastards) without significant absorption. So even our cars go at the speed of light...

Ping isn't a good test (1)

Sycraft-fu (314770) | more than 4 years ago | (#30858530)

What you have to remember about ping is that it is more or less testing the minimum time. The payload of an ICMP packet is very small. With video data like this, you have more payload. So you not only have to count the transit time from the datacenter, but also how long that amount of data takes to transfer at your line speed.

For example, say each video frame is roughly 50 kilobits. If you had a line that was only 50kbps, then it would take a full second for you to receive a frame, even if the latency on that line was 1ms. You'd start to receive it almost immediately; however, it'd take a full second to get it all. Now if you had a 5000kbps line, you'd receive the data in 10ms. Much better.

So a service like this depends on both the speed of the connection and the latency. You'll note that he said it didn't work when the speed of his connection was around 1.5mbps. At first glance that should be fine: a 1mbps video stream, with plenty of headroom. The problem is that the time it takes to transfer the video data at that speed adds too much latency.

Thus to work well with this you need high bandwidth and low ping to their servers.
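
As an editorial sketch of that combined effect (frame size and line speeds taken from the example above):

def frame_arrival_ms(frame_kbits, line_kbps, transit_ms):
    # Effective delay before a full frame arrives: one-way transit
    # time plus the frame's transfer time at line speed.
    return transit_ms + frame_kbits / line_kbps * 1000

print(frame_arrival_ms(50, 50, 1))    # ~1001 ms: a full second per frame
print(frame_arrival_ms(50, 5000, 1))  # ~11 ms: usable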

Re:As expected (0)

Anonymous Coward | more than 4 years ago | (#30858464)

Latency (for a line that's not overbooked) depends on bandwidth and packet size. With the same packet size, ten times the bandwidth reduces the latency by nearly a factor of ten (on a single line).

I am pretty sure that the amount of bandwidth that you have has no real bearing on how much latency you get. My home connection is only a 1.5mbps DSL line, but I can ping servers from here in Los Angeles to San Francisco with similar response times to yours. I also occasionally use a 20mbps satellite connection and it has noticeably higher latency due to the nature of satellite transmission.

When it comes to latency, the bandwidth isn't what matters, the route used to the destination does.

Re:As expected (0)

Anonymous Coward | more than 4 years ago | (#30858134)

It might seem like getting a local data center to reduce latency is quite far-fetched right now. But technology naturally develops, and data centers are shrinking in both size and cost, so distributing them in the same manner as a fast-food franchise wouldn't be too hard to imagine in the coming decades. So there's definitely a future for this.

Re:As expected (1)

Svartalf (2997) | more than 4 years ago | (#30858514)

It's an issue of both.

With the ping being bad, it sucks for the end user.
With the sheer amount of bandwidth needed, there's no way the feed-ends could keep up with more than a couple hundred to a couple thousand users.

An OC-48 is only really able to handle about 1200 or so realistically.

If you overbook the bandwidth or server resources, you will degrade things accordingly.

Videos removed from YouTube already (2, Interesting)

l_bratch (865693) | more than 4 years ago | (#30857784)

The menu video seems to be available, but the in-game videos now give:

"This video is no longer available due to a copyright claim by OnLive, Inc..."

Slower input-dependent? BURNOUT?! (0)

Anonymous Coward | more than 4 years ago | (#30857820)

Are you kidding me? Of all the arcade racing games I've ever played in my life, this series ranks as one of the higher ones in terms of needing fast reflexes (thus, input delay would be a huge factor).

I doubt this will catch on... (1)

Ralz (1634999) | more than 4 years ago | (#30857852)

I don't really see this catching on if I'm honest.
People that play graphically intensive games online are likely to also play those games in single player, or other single-player games as well, and I doubt OnLive will be serving single-player games in this way. Even if they do, who is going to want to play a single-player game with all the lag of multiplayer? And most of these people will have higher-than-average-spec computers anyway, so playing games won't be an issue.
And games like WoW etc. aren't particularly demanding of hardware; any mid-spec or even most low-spec computers made in the last few years will be able to handle them no problem.

And what about people that want to play games over a LAN sometimes? Having the game installed locally on your machine is much better than having it stored miles away on someone else's server that you don't have any real access to.

Re:I doubt this will catch on... (1)

slim (1652) | more than 4 years ago | (#30857896)

People that play graphically intensive games online [...] And most of these people will have higher-than-average spec computers anyway

It's not intended for the people you're talking about. Those people already have a means to play high-end games; and they're probably stuck in their ways too.

OnLive is for people who currently don't play those games, because of the cost/effort. People who are curious to try Crysis, but not curious enough to buy a gaming graphics card. The aim is to remove the barriers that say 'Casual gamers must tolerate low end graphics'.

Re:I doubt this will catch on... (0)

Anonymous Coward | more than 4 years ago | (#30858004)

Also, the Crysis demo on the iPhone was impressive. On that device, bandwidth is also not a concern, as you're moving a lot fewer pixels. Still, 3G has too much latency, and you'll need a good wireless router.

Re:I doubt this will catch on... (1)

Ralz (1634999) | more than 4 years ago | (#30858066)

Well, I'd imagine that games such as Crysis would be almost unplayable (especially on 'Ultra' graphics settings, which I expect most people would want to use), as the amount of data needed to be sent from the server to your computer would be massive in comparison to the data transferred when playing a 'normal' game online, and I'm sure the lag would be huge.

Looking at the article, it says that it uses about 700-900kbps of download speed. I'd be lucky to sustain 450kbps for any length of time, and I don't know many people who can get a constant 900kbps (I live in a fairly large city in the UK). Plus there's latency on top of that, depending on how far you are from the server.

Re:I doubt this will catch on... (1)

slim (1652) | more than 4 years ago | (#30858194)

I'm not sure the graphical complexity of the game will have much bearing on bandwidth usage. I imagine that scenes from Quake III encoded with some video codec would be about the same size as scenes from Crysis on Ultra encoded with the same codec, which would be about the same size as scenes captured on an HD camcorder encoded with the same codec.

Crysis is *the* demo for this service. I've seen analysis that says they appear to be using a graphics level a notch down from 'Ultra' - but I suspect the motivation for that is to use less GPU resource, not to ease bandwidth usage or to take load off the encoder/decoder.

You *will* need the bandwidth. If you can't sustain 450kbps, you're either not paying for the 5Mb line OnLive will require, or you're not getting your money's worth and should be complaining.

OnLive is banking on typical consumer broadband speeds improving. No doubt plenty of other enterprises will be doing the same. Presumably as these services come online, consumer demand will cause a feedback loop that makes this happen.

It's not that long ago that streaming audio over the internet was unthinkable. Now we think nothing of streaming SD video (e.g. YouTube). 720p video streams OK for many people. Low-latency, bufferless 720p has been shown to be achievable, and if market forces work properly, it will become more common over time.

Re:I doubt this will catch on... (1)

Ralz (1634999) | more than 4 years ago | (#30858278)

If you can't sustain 450kbps, you're either not paying for the 5Mb line OnLive will require, or you're not getting your money's worth and should be complaining.

I'm paying for 'up to 8mb' but because my ISP is useless and we live more than a mile from our exchange, we have been told that we can't get anything above 4mb, which sucks major balls.

The main thing I use my internet connection for is watching iPlayer/YouTube and playing a few games (as well as the usual web browsing), in which case it's adequate, but when my housemates are torrenting or watching YouTube HD it has a noticeable effect on my connection.

Re:I doubt this will catch on... (1)

mister_playboy (1474163) | more than 4 years ago | (#30858126)

People who are curious to try Crysis, but not curious enough to buy a gaming graphics card. The aim is to remove the barriers that say 'Casual gamers must tolerate low end graphics'.

You can get very capable graphics cards for under $200 these days. How is that more expensive than this service will be?

Re:I doubt this will catch on... (1)

slim (1652) | more than 4 years ago | (#30858238)

You can get very capable graphics cards for under $200 these days. How is that more expensive than this service will be?

We don't know how much OnLive will cost. I would hope it compares well with $200. Or at least, the $200 would be spread over time - and as irrational as that is, it works for many people.

Perlman has talked about a subscription fee. I feel this is a mistake. I think they shouldn't put people off with any sense of buying into a commitment. Charge per game, or per hour. You might end up spending $200. But you wouldn't commit to $200 at the start.

But $200 for a graphics card isn't the whole story. Before you buy one, you have to grasp a load of technobabble. Then you have to fit it. Then you have to install drivers. If you're as unlucky as I usually am, you'll have stability problems with those drivers and more headaches.

These aren't major obstacles for a determined gamer. But again, this service isn't aimed at that kind of person. This lets people get the pretty pictures without knowing what DirectX is.

I just gotta saaaaaayy the waaaaayyy I feeeeeeel. (2, Insightful)

Anonymous Coward | more than 4 years ago | (#30857874)

What is it with all this 'cloud' stuff?

I've got half a terabyte of storage, a pretty good graphics card with shader support and a nippy CPU.

When there are raytracing cards with inbuilt physics, I'll enjoy a slightly more realistic gaming experience on my local machine, thanks.

Until then, I'll have to go with pretty realistic and the only significant cause of latency being my old neurons.

GOML.

Re:I just gotta saaaaaayy the waaaaayyy I feeeeeee (0, Troll)

rodrigoandrade (713371) | more than 4 years ago | (#30857930)

Mod parent up. If you can't keep up with the upgrade cycle required to play the latest PC games, buy a console or play older games.

What, you want to have your cake and eat it too?? Have you not learnt anything from the recession??

Re:I just gotta saaaaaayy the waaaaayyy I feeeeeee (2, Informative)

tepples (727027) | more than 4 years ago | (#30858076)

If you can't keep up with the upgrade cycle required to play the latest PC games, buy a console or play older games.

The problem with playing older games is that either the matchmaking servers end up switched off (e.g. DNAS Error -103 on PS2 games), or if not, the established players tend not to be friendly toward newbies (e.g. "gtfo n00b" on several classic first-person shooters).

Re:I just gotta saaaaaayy the waaaaayyy I feeeeeee (1)

IBBoard (1128019) | more than 4 years ago | (#30858248)

And the problem with consoles is that they're not as good as PCs. I've checked comparative screenshots and the PS3 can come close, but the XBox360 tends to look shoddy. PCs these days can pump out way more frames at higher resolutions than consoles (actually, I think they've always done that, but consoles are at least up at decent resolutions now), so I'd still rather have my PC.

Re:I just gotta saaaaaayy the waaaaayyy I feeeeeee (0)

Anonymous Coward | more than 4 years ago | (#30858550)

Just how many remotely rendered fps are you going to get, even just down the hall from the server?

What would I use remote rendering for?

Maybe if I were making the next Pixar movie and the laptop wasn't up to it, or I wanted to render huge datasets from MRI scans.

Gaming? Heroic idea but as retarded as a brainstem in a blender.

Re:I just gotta saaaaaayy the waaaaayyy I feeeeeee (1)

icebraining (1313345) | more than 4 years ago | (#30858770)

the established players tend not to be friendly toward newbies (e.g. "gtfo n00b" on several classic first-person shooters).

Only happened to me twice. You just choose a nice, friendly server and stick with it.

Correction: for "excitement and controversy" (3, Insightful)

Rogerborg (306625) | more than 4 years ago | (#30857890)

Read: "excitement (from clueless arts majors masquerading as tech journalists) and hilarity (from anyone with even a remote shred of knowledge of the technologies involved)".

Look, this tech may - may - be workable for SimWarConquer, but for anything that's reaction based? No. Not going to happen. There is no technobabble solution to latency, and anyone who tells you otherwise wants your credit card number.

Re:Correction: for "excitement and controversy" (3, Interesting)

slim (1652) | more than 4 years ago | (#30857902)

And yet this review - from a sceptic - says it pretty much works. While it's in beta. From a location that would have been excluded from the beta if he'd gone through proper channels.

Re:Correction: for "excitement and controversy" (2, Insightful)

Rogerborg (306625) | more than 4 years ago | (#30858326)

Mmm. Two out of three games were reported as playable, with noticeable latency compared to a local version, one being "right on the edge" of playability. The best experience came from a game that's largely about learning the track, which makes it not a reaction game per se.

Notice that this was while playing over a wired-to-the-wall connection (who still uses those?) and with a low 85ms ping to the server.

I'm also assuming that there was a certain degree of tolerance for a novel experience. Once people are actually paying for a metered service, how much latency and input wackiness are they going to be willing to tolerate? I'm thinking a lot less than for a free or flat rate subscription service.

It'll work tolerably well for some games, some of the time, barring server or transport snafus, ISPs "shaping" the traffic (coming to an ISP near you in 3... 2... 1...) or the service dying under the weight of its own success. Whether that'll be enough to support their business model is dubious at best, since the instant the quality drops, so does their revenue.

The ping should be noted (1)

Sycraft-fu (314770) | more than 4 years ago | (#30858634)

85ms is not a high ping at all. If OnLive considers that to be outside their acceptable range, they've got a nasty surprise coming when they try to open it to the public. A lot of people are going to have a ping that high or higher.

I've got a reasonably good connection here, business-class 12/1.5mbps cable. It represents a mid-to-high-end home connection. Minimum latency on the connection is in the realm of 25ms; that is about what it takes to get out past the CMTS and so on. For good sites, I'm usually in the 40-50ms range, provided my ISP has peering with their ISP and they are in a relatively nearby state. However, I easily see pings in the 80-100ms range to many locations, including a number of TF2 servers I like. 150ms+ isn't out of the question either; I've seen that to various places in the US that are a bit distant and don't have a very direct connection.

So even with an overall low-latency connection, you can easily find it up in the 80+ range. It gets worse for connections that aren't as good. When I had DSL, minimum ping was more like 50-60ms. It took much longer to get to the DSLAM and out than the cable connection does. Also, peering wasn't as good, so it was rare for me to find a site with a ping of less than 100ms.

This is just the reality of broadband. You don't get nice low latency all the time everywhere. It is slowly getting better, and I'm given to understand that FIOS is even better latency wise, but you can't look at the best and say "Oh well that'll work great!" You have to look at what most people will be dealing with, especially most people who won't spend a ton of money on Internet. After all if your business model is "This is so much cheaper than buying high end hardware," you can't very well then say "But you have to buy high end Internet."

Re:Correction: for "excitement and controversy" (1)

PriyanPhoenix (900509) | more than 4 years ago | (#30858008)

First off, broadly speaking, I agree with you. Latency is going to be too much of an issue for most people to jump on board without an (inevitable and arguably overdue) infrastructure upgrade. However there are two types of latency OnLive is dealing with. The first is the obvious one from transmitting data back and forth over the internet. The second is the actual video encoding process server-side. That is where OnLive seems to have come up with a novel "technobabble solution" that actually works. It is, in all honesty, probably where their value lies rather than the service they are trying to offer which is almost certainly before its time.

Re:Correction: for "excitement and controversy" (1)

slim (1652) | more than 4 years ago | (#30858060)

However there are two types of latency OnLive is dealing with.

Lots more than two - of varying severity. And they all get summed.
  - controller encoding button presses
  - controller to client transmission
  - client decodes controller protocol
  - encode and transmit control signals to server
  - decode controls and input to game
  - processing performed by the game itself
  - video encoding
  - transmit video to client
  - client decodes video
  - client transmits video to TV

Perlman claims that OnLive examined every step, because every ms counts. Obviously some steps are 100% beyond their control (the last one, for example). In one presentation he said that they couldn't support Wii controllers because the protocol introduced too much lag. The Wii doesn't care about that lag, because it doesn't have the Internet factor to accumulate on top of it.
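
As an editorial illustration of why "every ms counts", here is a purely hypothetical budget over those stages, summed in Python (OnLive never published per-stage figures; every number below is invented):

budget_ms = {
    "controller encode + transmit": 4,
    "client decodes controller": 1,
    "uplink to server": 20,
    "game processing (one frame)": 16,
    "video encode": 8,
    "downlink video": 25,
    "client video decode": 5,
    "client to TV (display lag)": 15,
}

print(sum(budget_ms.values()), "ms total")  # 94 ms: already past 80 ms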

I am indeed disappoint, Slashdot. (2, Funny)

EdZ (755139) | more than 4 years ago | (#30857918)

I came here expecting to see a belated "First!" post followed by a joke about lag.

First Post! (3, Funny)

mister_playboy (1474163) | more than 4 years ago | (#30858150)

There was even more latency than you expected, you insensitive clod!

OnLive employee moderating posts. (-1, Offtopic)

AbRASiON (589899) | more than 4 years ago | (#30857926)

Based on the posts in this topic and how many perfectly sensible posts are modded down, it seems pretty obvious to me that someone with some kind of interest in OnLive is moderating.
First time I've seen such distinct bias in moderation (both promotion and demotion of posts).
Look out lads, metamoderating's gonna getcha!

Re:OnLive employee moderating posts. (0)

Anonymous Coward | more than 4 years ago | (#30858270)

Yeah, I also spend my time examining all the comments in a story, checking scores to see suspected bias. That makes it two of us. Or actually, I was lying, there's only one: you. Fucking pathetic.

"Burnout Paradise" is slow?! (1)

Sockatume (732728) | more than 4 years ago | (#30857932)

I must admit, I've not actually played it, but if it's anything like the other Burnout games, millisecond reaction times are kind of important. It may be that he was having a hard time picking up on lag instinctively because of the analogue controls, but I doubt the reaction-time increase would stand up in serious play.

Looks Great (0)

Anonymous Coward | more than 4 years ago | (#30857984)

Even if the FPSes didn't play as well, the video of the game playing looks really impressive. I can understand the games playing better with a controller: it's going to produce smaller, more consistent movements, which I imagine would be a lot easier to track.

Is this going to appeal to the hardcore, twitchy fingered, give me super graphics or I'll puke gamer? Nope; but for the rest of us it looks pretty impressive.

I for one am looking forward to seeing what they do with this.

VNC like remote desktop client (similar to OnLive) (1)

doomy (7461) | more than 4 years ago | (#30858000)

While I've been mildly interested in OnLive, my biggest excitement over this was confirmation that a streamed remote-desktop session with really good responsiveness (say, LAN-like) could be had soon. I even started poking around for similar systems that actually stream the desktop as a 2mbps or similar video stream with interactivity, but alas, it seems like no one is working on this.

So, I'm open to suggestions: is there any existing remote-desktop server/client system that actually streams the desktop in the OnLive fashion, or is anyone working on one similar to this? (And I do not mean in the old VNC fashion.) I believe such a system is very feasible. Imagine being able to stream your desktop onto thin/mobile devices just as if you were at it; being able to play video (at least) would be so much better than the current remote-desktop offerings.

In a nutshell, I want this applied to remotely streamed desktops with full control, a la VNC, but delivered like OnLive.

Re:VNC like remote desktop client (similar to OnLi (0)

Anonymous Coward | more than 4 years ago | (#30858304)

Check out VMWare's PC over IP protocol and its implementations.
I get great interactivity over high latency and low bandwidth links. But then again, I use this for regular desktop stuff and sometimes watching videos (mostly flash), not hardcore gaming.

IMO, PCoIP is the best there is at the moment.

Re:VNC like remote desktop client (similar to OnLi (1)

British (51765) | more than 4 years ago | (#30858572)

I did this back in 1996. I was wardialing, and found somebody's open PcAnywhere connection. So I connected to it and attempted to play Solitaire over 33.6 dialup. Needless to say, it was a pain dragging those cards around.

concern about patches... hmpf (2, Insightful)

El_Muerte_TDS (592157) | more than 4 years ago | (#30858034)

Gamers would also no longer have to worry about patches and software updates to their gaming titles - one of those annoyances that PC gamers often cite on their way to moving to a console.

I recently bought a PS3 with some games. When I started it, I was welcomed with "You need to install the latest PS3 firmware now!". So I had to wait for it to install and reboot. Then I inserted a game and wanted to play, but I was welcomed with "Updates have been found for this game and need to be installed". Which is pretty much identical to the PC, except that there you often at least have the choice of whether to install the patch.

Re:concern about patches... hmpf (1)

fuzzyfuzzyfungus (1223518) | more than 4 years ago | (#30858242)

They really should just change the message to "These enhancements to your gaming experience are Exciting and Mandatory!".

Re:concern about patches... hmpf (1)

Sycraft-fu (314770) | more than 4 years ago | (#30858658)

Also, PC patching is becoming much easier in most cases. It is getting rather popular for games to have auto-updaters, or a button you can click that will check for and download what you need. And of course, if you buy a game from a digital service like Impulse or Steam, it will check automatically and keep your stuff up to date.

Really, the patching thing is becoming largely a non-issue. I don't see "no patches" as a major selling point for this.

80 KB/s (1)

Tei (520358) | more than 4 years ago | (#30858048)

Re:80 KB/s (1)

slim (1652) | more than 4 years ago | (#30858426)

You told us there would be a 480ms round trip. That would make it unusable.

This guy found it was pretty much OK, despite being further from the servers than recommended.

Re:80 KB/s (1)

Tei (520358) | more than 4 years ago | (#30858622)

That's your read?
Mine is that 480ms is "pretty much OK" for a console driving game, but horrible for a PC FPS.

The "Playing UT3 with OnLive" video shows how it feels: moving like a drunk.

Anyway, probably lots of games are still fun with 480ms. So yes, as another game system, it seems OnLive will work (for a subset of gamers).

Hope this doesn't start a move away from mouselook (0)

Anonymous Coward | more than 4 years ago | (#30858054)

Reading the original article, it seems that using an Xbox controller is much less sensitive to lag than mouse and keyboard. Presumably this is because the input device itself is less precise and less responsive. I love mouse-and-keyboard control, though, so I really, really hope this doesn't catch on and start a move away from good old mouselook.

Who didn't expect the "end result"? (0)

Anonymous Coward | more than 4 years ago | (#30858086)

The end result appears to be that while slower input-dependent games like Burnout: Paradise worked pretty well, games that require a fast twitch-based input scheme like UT3 did not."

It was going to be pretty damn obvious that this would be the case.
Unless they invented some sort of tachyonic backwards compatible networking layer, it wouldn't have worked.

The only other potentially manageable solution would be servers at exchanges, but that would cost them their entire body several times over.
It still won't make things *that* pretty though.

Well (-1)

ShooterNeo (555040) | more than 4 years ago | (#30858092)

I predict that cloud gaming services will utterly dominate all gaming. Within 5-10 years, virtually all new titles will be released exclusively for cloud gaming services, and will not be available at all otherwise. Consoles as we know them will become totally extinct: the next generation of consoles will be the last.

The reason is economics. It isn't hard to see the enormous forces that will push cloud gaming to domination.

1. All game publishers will be paid for every hour of game played. This increases their income, which will increase the supply of high quality games. This is because with cloud gaming, piracy is basically impossible, yet the forces that drive pirates are also mostly eliminated. You won't have to plunk down $60 to try out a game legally: you'll be able to play any game for $1 an hour or $40 or so a month.

2. Instead of three console platforms plus the PC, there will be just one platform: the PCs in the cloud gaming data centers. Most likely, the cloud gaming providers will soon release development kits: PCs with the exact same hardware and OS image that they run in their data centers. The PC is already the easiest platform to get a game running on, with the best dev tools. Now game publishers will be able to develop their games exclusively for this platform, with a fixed hardware base.

3. The overall cost of gaming will be lower. Instead of every gamer needing their own CPU + GPU, whether in a console box or a gaming PC, they'll just rent a portion of one from a central service. Even after taking their profits, cloud gaming providers will be able to offer gaming at a much lower annual cost than buying a gaming system and games.

4. The big criticisms in this article will be eliminated once there are more cloud gaming data centers located all over the United States, putting one close (in terms of network hops) to every ISP customer and reducing latency to 30ms or so, enough to eliminate perceived lag. Also, there are tricks with the mouse input that could eliminate the overshoot described in the article, such as syncing the mouse coordinates on the client with the server (a rough sketch of this idea follows the list). The other big criticism, inferior graphics quality, will be mostly eliminated with more bandwidth dedicated to the video stream. The author notes that the current beta client uses only about 1 megabit for the video; 5 megabits would greatly increase the image quality.

5. Next-generation games that blow your mind graphically will become practical. Right now, you can't develop and sell a game that requires cutting-edge hardware for photorealistic graphics: consoles are years behind, and you can't write a PC game that requires a new $2000 PC only a few gamers own at any given time. With cloud gaming, that becomes entirely practical: if you're willing to pay a little more per hour, you'll be able to enjoy Crysis 4 maxed out with smooth-as-glass, ultra-realistic graphics.

6. While one form of lag is introduced with cloud gaming, another is eliminated. Since each game client runs in a data center with excellent internet connectivity, latency BETWEEN clients in a multiplayer game will be virtually eliminated.

7. The tech-support nightmare of hardware-specific game problems is mostly eliminated. Games will also load far, far faster, because the cloud gaming service can simply switch your session to a PC that has already loaded the game.

8. Eventually, enough hardware to run a streaming client will be integrated into new TVs and Blu-ray players, so basically anyone with a TV and a spare USB mouse/keyboard or gamepad will be able to enjoy PC games in their full glory.

9. The resurgence of mouse-and-keyboard gamers will mean that PC game genres like the RTS will make a big comeback.

And lots more reasons.
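
As an aside on point 4, here is a minimal sketch of the mouse-sync trick (Python, purely illustrative; the class and method names are hypothetical, not from any real client). The client applies mouse deltas locally so the cursor feels instant, then reconciles with the server's authoritative position whenever an acknowledgement arrives:

    class PredictedCursor:
        def __init__(self):
            self.x, self.y = 0.0, 0.0   # locally predicted position
            self.pending = []           # (seq, dx, dy) not yet acknowledged
            self.seq = 0

        def local_move(self, dx, dy):
            # Apply the delta immediately and queue it for the server.
            self.seq += 1
            self.x += dx
            self.y += dy
            self.pending.append((self.seq, dx, dy))
            return (self.seq, dx, dy)   # message to send to the server

        def server_ack(self, acked_seq, server_x, server_y):
            # Snap to the server's authoritative position, then replay
            # any deltas the server has not processed yet.
            self.pending = [p for p in self.pending if p[0] > acked_seq]
            self.x, self.y = server_x, server_y
            for _, dx, dy in self.pending:
                self.x += dx
                self.y += dy

This is the same client-side prediction idea multiplayer shooters already use for player movement, applied to the pointer.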

The reasons slowing down cloud gaming?

              1. ISPs have to have a contract with the cloud gaming provider and use QoS to make sure the gaming packets are not delayed in any way. The FCC might stupidly call for "net neutrality" rules that would make this impossible.
              2. The speed of light and limited broadband penetration mean that some rural users will not be able to play games smoothly for years, and will be relegated to aging consoles.
              3. If your cloud gaming provider goes down, so do your plans to play games for the evening. This will probably happen less often than problems with local hardware, though, especially if you are a PC gamer.

Re:Well (1)

N1AK (864906) | more than 4 years ago | (#30858660)

1. ISPs have to have a contract with the cloud gaming provider and use QoS to make sure the gaming packets are not delayed in any way. The FCC might stupidly call for "net neutrality" rules that would make this impossible.

Net neutrality is about treating packets fairly regardless of whose software produces them. I don't see how stopping a major ISP from prioritising one cloud-gaming company's packets while disadvantaging its competitors' is a bad thing.

QoS isn't anti-net-neutrality; it's just prioritising data for uses that require low latency, which should be fine as long as it is done in a way that doesn't disadvantage individual service providers. Ergo, an internet provider would have no issue with the FCC or anyone else for giving VoIP priority over FTP, as long as it gives all VoIP traffic that priority.
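
To make the distinction concrete, here is a toy strict-priority scheduler (Python; the traffic classes and priority values are assumptions, not anything from a real router). Priority is assigned per traffic class, so every provider's low-latency traffic gets the same treatment:

    import heapq

    PRIORITY = {"voip": 0, "game-stream": 0, "web": 1, "ftp": 2}  # assumed classes

    class Scheduler:
        def __init__(self):
            self._q = []
            self._n = 0  # arrival counter keeps FIFO order within a class

        def enqueue(self, traffic_class, packet):
            self._n += 1
            heapq.heappush(self._q, (PRIORITY[traffic_class], self._n, packet))

        def dequeue(self):
            return heapq.heappop(self._q)[2] if self._q else None

    s = Scheduler()
    s.enqueue("ftp", "bulk-1")
    s.enqueue("game-stream", "frame-1")
    print(s.dequeue())  # -> frame-1, despite arriving later

Nothing in that priority table cares whose game stream it is, which is the neutrality-compatible version of QoS.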

Guess I won't be using it... (1)

argent (18001) | more than 4 years ago | (#30858104)

... because I couldn't even stream the videos without jitter. :)

I just don't see this working (5, Interesting)

jbb999 (758019) | more than 4 years ago | (#30858160)

The major problem isn't overall latency, it's little spikes of latency on an otherwise good line. A moment of 100ms lag on an otherwise good line doesn't matter for online games because of client prediction; at worst it's a tiny moment where the controls don't seem responsive. It's not a problem for normal video either, because players can buffer 250ms or 500ms or 1000ms of video without any trouble. But this service can't do any significant buffering, or the latency will be too much to play with, and even 100ms of sudden latency will cause the picture to lag, freeze, or jump.

It might only happen occasionally, but I suspect people won't put up with it. And they can't do anything about it, either: even if your ISP's lines and routers are only 10% loaded, there will be moments when much of that traffic arrives at once and gets queued in a router somewhere, just for a tiny time. Tiny little bursts of jitter like this are normal and expected, and to be honest I think they will be the downfall of this project, because there is no real way to deal with them. But I guess we'll see :)
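
A tiny simulation makes the point. Frames arrive every ~16.7ms at 60fps; give one frame an extra 100ms of network delay and compare a video-site-sized jitter buffer against the near-zero buffer a game stream can afford (all numbers illustrative):

    FRAME_MS = 1000 / 60  # ~16.7 ms between frames at 60 fps
    SPIKE_MS = 100        # one-off extra network delay on frame 30

    def stalled_frames(buffer_ms, n_frames=120):
        # Count frames that miss their playout deadline for a given buffer.
        missed = 0
        for i in range(n_frames):
            arrival = i * FRAME_MS + (SPIKE_MS if i == 30 else 0)
            deadline = i * FRAME_MS + buffer_ms
            if arrival > deadline:
                missed += 1
        return missed

    print("500 ms buffer (video site):", stalled_frames(500))  # -> 0
    print("10 ms buffer (game stream):", stalled_frames(10))   # -> 1

The spike vanishes inside a half-second buffer, but with only ~10ms of slack it shows up as a visible freeze or jump.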

1 MB/sec... (3, Interesting)

V50 (248015) | more than 4 years ago | (#30858218)

There are still large areas of North America stuck with either stone-age dial-up (in 20-freakin'-10) or slow, expensive satellite. Like mine (I cry myself to sleep over my 1200ms latency). This is absolutely a no-go there. Obviously.

Now, in better places, I'm sort of out of the loop. Whenever I've spent time in cities, either visiting my brother in Ottawa or living in London (Ontario, not the good one) for a few months at a time, it's been my experience that even connections that are supposed to reach 1 MB/s would be lucky to get that in practice, especially at peak times. Furthermore, the sheer number of lag spikes, connection hiccups, and moments when the internet craps out for no apparent reason makes it seem like you'd be dealing with one frustration after another. The number of times I see people get disconnected on World of Warcraft backs up my theory: staying connected at a constant 5 KB/s or so (for WoW) is difficult enough; doing the same at a (whopping?) 1 MB/s while keeping latency under 100ms would be hellish.

So is my experience with the Internet indicative of the general population, or have I just had the misfortune of terrible service? Can people really keep 1 MB/s sustained, without lag hiccups, disconnects, lost packets, etc., while staying under 100ms?
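
For what it's worth, the rough arithmetic behind that comparison (using the figures above, which are guesses rather than measurements) looks like this:

    wow_kBps = 5         # ~5 KB/s sustained for WoW
    stream_kBps = 1024   # the feared 1 MB/s for a game video stream

    print(f"the stream needs ~{stream_kBps // wow_kBps}x WoW's sustained rate")

    session_hours = 2
    gb = stream_kBps * 3600 * session_hours / (1024 * 1024)
    print(f"a {session_hours}-hour session moves ~{gb:.1f} GB")  # ~7 GB

That is a very different load from a game's control traffic, and it all has to arrive on time, every frame.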

Re:1 MB/sec... (2, Insightful)

Sycraft-fu (314770) | more than 4 years ago | (#30858702)

Well, yes, some people can. However, in general you need to live in an area with good access anyhow, and you are going to need a connection a good deal better than your minimum requirement. If you want to sustain 1 Mbps without a problem, you probably need a 10 Mbps line. The more headroom you've got, the easier it is to maintain what you need.

That is not to say it is problem-free; even good connections will drop sometimes or have problems. Also, of course, better connections tend to cost more money. Not a huge deal, but it does create something of a problem when you are marketing to the cheap crowd. The people who won't spend $100 on a graphics card or $200 on a console may also be the people who buy the bargain cable package, which is usually pretty slow.

So sure, it can be done. My connection is 12/1.5 Mbps and I almost always get my full speed since it is business class (I have never gotten less than about 80%). Thus, even if I drop a packet now and again, the requisite 1 Mbps is less than 10% of my line rate. What's more, my connection is pretty low latency: I see 40-50ms to many sites in nearby states on providers peered with mine.

However, as you probably guessed, this is not the cheap connection. I pay a good bit for quality Internet and I'm happy to do so. However, I'm also happy to buy a good GPU and have games run well on my system. I've no interest in their service, even though I've got the kind of connection you'd want for it.
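
A quick sanity check on that headroom argument, plugging in the numbers from this post:

    line_mbps = 12        # advertised downstream
    worst_fraction = 0.8  # "never less than about 80%" delivered
    stream_mbps = 1       # target sustained video rate

    worst_case = line_mbps * worst_fraction
    print(f"worst-case throughput: {worst_case:.1f} Mbps, "
          f"stream uses {stream_mbps / worst_case:.0%} of it")

Roughly 9.6 Mbps in the worst case leaves the 1 Mbps stream using about 10% of the line, which is the kind of margin you want.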

Until they overcome that little "physics" thing... (2, Interesting)

mykos (1627575) | more than 4 years ago | (#30858352)

This is an obvious pump-and-dump scheme, unless they have somehow unlocked technology previously unseen and unknown to mankind, and have done so for the purpose of playing video games.
