
Cloud-Based, Ray-Traced Games On Intel Tablets

timothy posted more than 3 years ago | from the heavy-lifting-done-elsewhere dept.


An anonymous reader writes "After Intel showed a ray-traced version of Wolfenstein last year, running cloud-based and streamed to a laptop, the company, which just recently announced its shift to the mobile market, now shows its research project also running on various x86 tablets with 5- to 10-inch screens. The heavy calculations are performed by a 'cloud' consisting of a single machine with a Knights Ferry card (32 cores) inside. The achieved frame rates are around 20-30 fps."


Cloud cloud cloud (4, Funny)

Anonymous Coward | more than 3 years ago | (#36376090)

I've got a CloudPad running CloudOS 0.99. It is freaking cloudtastic.

Re:Cloud cloud cloud (0)

Anonymous Coward | more than 3 years ago | (#36376214)

Bothered by everything labelled 'cloud'? Just replace the word with 'server' or 'server-side'. Articles will still make just as much sense.

Re:Cloud cloud cloud (1)

bberens (965711) | more than 3 years ago | (#36377496)

You just gave me a great idea for a Chrome/Firefox plug-in: universal replace. You set up key-value pairs, and every instance of [key] on every page you visit will be replaced with [value].
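For illustration, the core of that idea is just a dictionary-driven substitution pass over page text. Here is a minimal Python sketch of the logic only (the sample table and the whole-word matching are assumptions; a real extension would do this in the browser against the DOM):

    import re

    # Hypothetical key -> value table, as described above.
    REPLACEMENTS = {
        "cloud": "server",
        "synergy": "cooperation",
    }

    def universal_replace(text, table):
        """Replace every whole-word occurrence of each key with its value."""
        # Longer keys first, so they win over their own prefixes.
        keys = sorted(table, key=len, reverse=True)
        pattern = re.compile(r"\b(" + "|".join(map(re.escape, keys)) + r")\b", re.IGNORECASE)
        return pattern.sub(lambda m: table[m.group(1).lower()], text)

    print(universal_replace("Our cloud strategy leverages cloud synergy.", REPLACEMENTS))
    # -> Our server strategy leverages server cooperation.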

Re:Cloud cloud cloud (0)

Anonymous Coward | more than 3 years ago | (#36379396)

[key] [value] is [key]!

Re:Cloud cloud cloud (1)

AlienIntelligence (1184493) | more than 3 years ago | (#36384074)

You just gave me a great idea for a Chrome/Firefox plug-in: universal replace. You set up key-value pairs, and every instance of [key] on every page you visit will be replaced with [value].

It's existed for a very long time in the form of Proxomitron
http://www.google.com/search?sourceid=chrome&ie=UTF-8&q=proxomitron [google.com]

Best web filter there is... and it's browser-agnostic since it affects the stream at a proxy level.

I've set it up for a parent before where every single instance of "a word they don't want their child to see" was replaced with some irrelevant 'kid' word.

And it filters ads too, lol

-AI

Re:Cloud cloud cloud (1)

bennomatic (691188) | more than 3 years ago | (#36379182)

Well, normally I'm not bothered, but when your "cloud" consists of one server--even if it's multi-core--it's not a frickin' cloud!! I know that "cloud" has many meanings, but to my mind, if you can't power down one physical device and replace it with another without interrupting your service for all users, it's not a cloud.

Re:Cloud cloud cloud (1)

Joe Snipe (224958) | more than 3 years ago | (#36379906)

Sort of. Cloud is dumbed-down market-speak; it was a designed phrase that took the technobabble out of the presentation. In all seriousness, "cloud" is more closely aligned with phase 2 of the Gnomes' plan. [wikipedia.org]

Re:Cloud cloud cloud (1)

sortius_nod (1080919) | more than 3 years ago | (#36384912)

See, for me, "the cloud" just means "the internet" as per networking diagrams. I swear we've lost sight of what it means, "a network that we don't want/need to draw".

It's really gotten out of hand, to be honest; just whack "cloud" before something and it immediately sounds like you've got geek cred to idiots. From here on in I'm going to take "cloud" to mean "a network/technology that's too complex for the journalist to understand".

Re:Cloud cloud cloud (0)

Anonymous Coward | more than 3 years ago | (#36390088)

For me, the thing that distinguishes "cloud" from just "network" or "server" is the commodity.

If you don't care who provides the storage or service (and the choice is handled automatically) then it can legitimately be called a "cloud" app.

Re:Cloud cloud cloud (1)

Anonymous Coward | more than 3 years ago | (#36376274)

I have a SmurfPad running SmurfOS and it's just Smurfy. Heh.

Re:Cloud cloud cloud (1)

snookerhog (1835110) | more than 3 years ago | (#36376364)

smurf you.

Re:Cloud cloud cloud (0)

Anonymous Coward | more than 3 years ago | (#36376840)

Hodor hodor hodor

Re:Cloud cloud cloud (-1)

Anonymous Coward | more than 3 years ago | (#36376928)

So, let me ask you this. Will these Intel tablets be running Windows? If so, instant fail. Don Quixote would be embarrassed at the steady procession of doomed Windows devices streaming from the bowels of the Redmondian partners. Everybody else, we just laugh and return to our iPads and Honeycomb devices.

Re:Cloud cloud cloud (0)

Anonymous Coward | more than 3 years ago | (#36387382)

Yeah, seeing how Windows has failed miserably at gaining market share as an OS. I guess they will just have to "settle" for their measly 88% share while iOS enjoys its humongous 2.4% market share.

Re:Cloud cloud cloud (2)

mattcasters (67972) | more than 3 years ago | (#36380418)

CTO: Well, what've you got?

Vendor: Well, there's servers, cloud and clients; software, hardware, cloud and peripherals; networking and cloud; servers, software and cloud; clients, software networking and cloud; cloud, software, networking and cloud; cloud, peripherals, cloud, cloud, servers, and cloud; cloud, clients, cloud, cloud, software, cloud, networking, and cloud;

Consultants (starting to chant): Cloud, cloud cloud cloud ...

Vendor: ...cloud, cloud, cloud, software, and cloud; cloud, cloud, cloud, cloud, cloud, cloud, services, cloud, cloud, cloud...

Consultants (singing): Cloud! Lovely cloud! Lovely cloud!
...

Re:Cloud cloud cloud (1)

Ancantus (1926920) | more than 3 years ago | (#36382300)

You reminded me of a Penny Arcade comic.

Lunch [penny-arcade.com]

Network lag (1)

Lunaritian (2018246) | more than 3 years ago | (#36376092)

It may barely work with desktops (if you're close to the servers), but on mobile devices I'm quite sure it's unplayable in reality.

Re:Network lag (0)

Anonymous Coward | more than 3 years ago | (#36376322)

If that were true, OnLive would be DoA. People are using that service, so, for them at least, it's playable. A tech demo built especially for thin client use by the hardware manufacturer should be at least as responsive as that. It may not be good enough for the Counter Strike crowd but Angry Birds players will be blown away by it, and there's more of them.

Re:Network lag (1)

jojoba_oil (1071932) | more than 3 years ago | (#36377418)

Having first-hand experience with OnLive (got a code for Trine on OnLive with a Humble Bundle), I'd say that it's not that bad. It complains if you're over wireless, but any wired broadband connection should be fine. It's been quite a bit smoother than a VMware PC-over-IP setup I used last year, and the server running that junk was on the local network.

If you want to try it yourself, download the client and go to the "Arena" section to watch other people playing games. Now imagine that was you; it'd almost be easy to forget that the game isn't rendered locally.

Re:Network lag (1)

0123456 (636235) | more than 3 years ago | (#36377930)

Having first-hand experience with OnLive (got a code for Trine on OnLive with a Humble Bundle), I'd say that it's not that bad. It complains if you're over wireless, but any wired broadband connection should be fine.

Cool. So tablet users will have to plug in a LAN cable to play games...

Re:Network lag (1)

HaZardman27 (1521119) | more than 3 years ago | (#36379084)

go to the "Arena" section to watch other people playing games. Now imagine that was you; it'd almost be easy to forget that the game isn't rendered locally.

Except that because you're not in control of the avatars in the game you have no sense of the input lag. I tried out a few games on OnLive (I have a 30Mb downstream connection as well, so that shouldn't have factored into this), and for me the input lag was too great for anything but single-player games and perhaps slow-paced multi-player games. To be fair, I did not really spend any time playing around with configuration (not actually sure if they even give you the option), so perhaps I was not connecting to the most local servers available.

Re:Network lag (1)

pushing-robot (1037830) | more than 3 years ago | (#36376340)

I know, but it sure would be cool if you could play games on a low-power tablet while the actual processing is done by a server in your closet or living room.

I should pitch that idea to one of the big gaming companies, like Nintendo.

Re:Network lag (1)

MozeeToby (1163751) | more than 3 years ago | (#36376514)

That's what I don't understand: put the "cloud" server in a closet somewhere in my house, connect all the other machines up through a strong wireless-N network, and you might have something usable. I just can't imagine relying on your ISP's latency when it comes to graphics.

Re:Network lag (1)

im_thatoneguy (819432) | more than 3 years ago | (#36376870)

Maybe not for an FPS or racer but if it was a really cool Starcraft killer that would have interesting possibilities. Handle the 2D GUI with the tablet processor and then raytrace a photorealistic game engine.

You could also stream data and handle some of the primary ray data locally.

If you rendered the visible lighting and then used irradiance caching, the cloud could do the heavy lifting with GI and the tablet could do the kNN lookups.
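A purely illustrative Python sketch of that split, assuming the server streams down sparse (position, irradiance) samples and the client blends the k nearest ones per shading point (real irradiance caching also weighs in surface normals and error bounds):

    import math

    def knn_irradiance(point, samples, k=4):
        """Client-side lookup: blend the k nearest cached irradiance samples.

        samples: list of ((x, y, z), (r, g, b)) pairs streamed from the server.
        """
        nearest = sorted(samples, key=lambda s: math.dist(point, s[0]))[:k]
        weights = [1.0 / (math.dist(point, pos) + 1e-6) for pos, _ in nearest]
        total = sum(weights)
        return tuple(
            sum(w * color[i] for w, (_, color) in zip(weights, nearest)) / total
            for i in range(3)
        )

    # Tiny example cache: four samples around the origin.
    cache = [((0, 0, 0), (1.0, 0.9, 0.8)), ((1, 0, 0), (0.2, 0.2, 0.2)),
             ((0, 1, 0), (0.5, 0.5, 0.5)), ((0, 0, 1), (0.7, 0.6, 0.4))]
    print(knn_irradiance((0.25, 0.1, 0.1), cache, k=3))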

Re:Network lag (1)

truthsearch (249536) | more than 3 years ago | (#36376978)

In this particular case, it sounds like the server-side power required per user is very large. So I would imagine it's impractical at the moment regardless of networking issues.

Followed one of the links and read this... (4, Insightful)

RoverDaddy (869116) | more than 3 years ago | (#36376094)

The first product codenamed "Knights Corner" will target Intel's 22nm process and use Moore's Law to scale to more than 50 Intel cores.

Nonsense marketing babble. Moore's Law is predictive. You can't use it to MAKE anything happen.

Who says laws are predictive? (1)

steelfood (895457) | more than 3 years ago | (#36376378)

Just yesterday, I used the law of gravity to make myself fall down. So there!

Re:Who says laws are predictive? (0)

Anonymous Coward | more than 3 years ago | (#36376490)

I don't think that one is called Moore's law, actually.

Re:Who says laws are predictive? (3, Insightful)

Culture20 (968837) | more than 3 years ago | (#36376530)

Just yesterday, I used the law of gravity to make myself fall down. So there!

No you didn't. You used the law of gravity to predict that you would fall down. You utilized gravity to make it happen.

Re:Who says laws are predictive? (1)

NatasRevol (731260) | more than 3 years ago | (#36376638)

Is that a nerd-burn?

Re:Who says laws are predictive? (1)

hierophanta (1345511) | more than 3 years ago | (#36377380)

yes. slashdot blocks all other types of burns. those jerks

Re:Who says laws are predictive? (0)

Anonymous Coward | more than 3 years ago | (#36383396)

One cannot be burned by nerds. They are both not cool enough and not hot enough.

Re:Who says laws are predictive? (0)

Anonymous Coward | more than 3 years ago | (#36379586)

Wooooosh!

Re:Who says laws are predictive? (0)

Anonymous Coward | more than 3 years ago | (#36379600)

Today I used the law of gravity to lecture a troll.

Re:Who says laws are predictive? (0)

Anonymous Coward | more than 3 years ago | (#36384008)

Well in this case, we're going to get intel to utilize Gordon E Moore to make it happen.

Re:Followed one of the links and read this... (0)

Anonymous Coward | more than 3 years ago | (#36377912)

The first product codenamed "Knights Corner" will target Intel's 22nm process and use Moore's Law to scale to more than 50 Intel cores.

Nonsense marketing babble. Moore's Law is predictive. You can't use it to MAKE anything happen.

Not nonsense, it simply means the tech isn't developed yet.

Re:Followed one of the links and read this... (0)

Anonymous Coward | more than 3 years ago | (#36380170)

Not nonsense, it simply means the tech isn't developed yet.

It's poorly phrased. You don't "use" Moore's law - if anything, you just rely on it to allow you to scale to 50 cores at some point in the future.

Hey kids, run up your data bill while playing high (0)

Anonymous Coward | more than 3 years ago | (#36376176)

Hey kids, run up your data bill while playing high-end games anywhere you can get 3G or 4G.

Video (1)

Beardydog (716221) | more than 3 years ago | (#36376186)

or gtfo

Latency (2)

LikwidCirkel (1542097) | more than 3 years ago | (#36376276)

Who cares if it looks awesome if the latency sucks? I'd rather have SuperNES StarFox-quality graphics with no lag than ray-traced graphics with horrible latency. It can be reduced, but I don't yet believe it's possible to make it unnoticeable. I guess I'll believe it when I see it.

Re:Latency (1)

Uhyve (2143088) | more than 3 years ago | (#36376504)

I can barely play games with vsync enabled, never mind adding live streaming into the mix. I'm not saying "barely play" in a snobby way, either; I actually just suck at games when I get any extra input latency.

Re:Latency (1)

Stormwatch (703920) | more than 3 years ago | (#36376548)

And I can't accept vsync disabled; screen tearing is far more noticeable and annoying than a tiny bit of lag.

Re:Latency (1)

Uhyve (2143088) | more than 3 years ago | (#36376944)

Oh yeah, I didn't mean to deride anyone who uses vsync; I just literally have trouble playing games with vsync enabled. I wish I didn't, because screen tearing is annoying.

My point was really that if I have trouble with vsync enabled, I have no idea how I could possibly play cloud streamed games. Unfortunately, I've not even had the chance to try OnLive yet because I don't live in the US, so I dunno, maybe it'll be fine.

Re:Latency (1)

Ephemeriis (315124) | more than 3 years ago | (#36376670)

Who cares if it looks awesome if latency sucks. I'd rather have SuperNES StarFox quality graphics with no lag than ray-traced graphics with horrible latency.
It can be reduced, but I don't yet believe it's possible to make it unnoticeable. I guess I'll believe it when I see it.

Latency is an absolutely huge problem. It's a bigger problem, for me, than poor image quality.

I'll happily turn down the visuals if it makes my game more responsive. And nothing will make me throw up my hands in frustration faster than input lag.

Re:Latency (1)

nschubach (922175) | more than 3 years ago | (#36376764)

And nothing will make me throw up my hands in frustration faster than input lag.

Maybe if you quit throwing your hands up and kept them on the input there wouldn't be so much lag between when you want to do something and when it gets fed into the control. ;)

So? (1)

mazesc (1922428) | more than 3 years ago | (#36376290)

Will this be news every time a new device is targeted?

Re:So? (2)

Enderandrew (866215) | more than 3 years ago | (#36376350)

Intel's PR department keeps releasing press statements and journalists keep eating it up.

Input latency is a real issue. I'm not impressed that Intel can take a bank of servers to produce content for one client. The business model for that just frankly doesn't work yet, and even when the business model for that does work, input latency will remain.

Write a story when they solve input latency.

Re:So? (3, Informative)

Beardydog (716221) | more than 3 years ago | (#36376404)

Isn't it just one server with a 32-core chip?

I never thought OnLive would work, but it kinda, sorta does. Over my office connection.

Not my home connection, that's too slow. Or my friend's connection, which is fast enough, but suffers from occasional hiccups that break the stream... but at work, it works great, so this should work anywhere that a mobile device can receive and sustain a 5Mbps stream of data without going over their data cap in a month.
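For a rough sense of scale, here is the arithmetic on that 5Mbps figure (the monthly cap below is just a hypothetical example):

    # Back-of-envelope data usage for a sustained 5 Mbps game stream.
    stream_mbps = 5
    gb_per_hour = stream_mbps / 8 * 3600 / 1000    # ~2.25 GB per hour
    example_cap_gb = 250                           # hypothetical monthly cap
    print(f"{gb_per_hour:.2f} GB/hour -> cap hit after ~{example_cap_gb / gb_per_hour:.0f} hours")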

Re:So? (1)

Enderandrew (866215) | more than 3 years ago | (#36376604)

The last demonstration they did with Wolfenstein was 4 servers to produce the video stream for one client. Perhaps they're down to one server now. But even one server with a 32-core chip producing the video stream for a single client doesn't make financial sense yet.

Constant high-def video streaming doesn't work well in the new age of data caps. Input latency depends on how much overall WAN latency you have between each client and the servers. That will vary from person to person, but hardcore gamers do care about split-second timing. Adding input latency is counterproductive for gaming.

And since we have the Unreal Engine running on iPhones, multi-core mobile processors getting cheaper, etc., I think you'll find that by the time they can get the ray tracing processors cheap enough to make their streaming service work, most mobile devices will already have very fast/powerful processors.

Re:So? (1)

RobbieThe1st (1977364) | more than 3 years ago | (#36382638)

Well, why separate the client and the server? If you think about it, this could be a market for 32...1024 core desktops: raytraced games. I mean, currently most games don't need nearly so much CPU power, and Intel wants a market for their many-core chips. They do have a market server-side, but if they can get a client-side market too? Great!

Re:So? (0)

Anonymous Coward | more than 3 years ago | (#36383828)

32...1024 core desktops

How the hell would you fit that many cores on a single die? Even dual socket isn't going to get you over 64 cores, simply because the cache coherency protocol will shoot performance to shit: every core you add makes everything slower instead of faster, until it becomes virtually gridlocked trying to decide which CPU core owns each 64-byte row of memory.

The cores are also just too damn big. You could possibly make a cluster of ARM or maybe even Power cores that fit that many on a chip (without the cache coherency issues as well), but x86+Windows just is not going to die.

Re:So? (1)

whiteboy86 (1930018) | more than 3 years ago | (#36376970)

This "OnLive" like brute-force streaming solutions will die for a simple reason of increasing cheap computing power and also increasing display resolutions (and bandwidth), you can barely stream 1080p now over Wifi, but wait for the next year's new "retina" resolution displays, the bandwidth needed for streaming next-gen video and 3D is beyond the current network capabilities, it is an unfortunate (VC-cow funded) solution that eats your 20GB months allowance in a few minutes, clogs the network for your peers and is inferior latency wise.

Re:So? (1)

grumbel (592662) | more than 3 years ago | (#36385914)

This "OnLive" like brute-force streaming solutions will die for a simple reason of increasing cheap computing power and also increasing display resolutions (and bandwidth)

I doubt it. It might certainly change its target audience, as it doesn't seem to be all that good as a replacement for a gaming PC or a console right now. But it's hard to beat having access to a demo in a matter of seconds versus a matter of hours on the PC. With OnLive you could, in theory at least, flip through games like you flip through TV channels; you'll never get that ability when you have to download multiple GB before you can even see the title screen.

ray traced (1)

aahpandasrun (948239) | more than 3 years ago | (#36376326)

Wasn't Wolfenstein ray traced to begin with?

Re:ray traced (2)

Beardydog (716221) | more than 3 years ago | (#36376372)

Ray-cast to begin with. But I think Voxelstein 3D was raytraced before Intel got around to doing it.

Re:ray traced (1)

Anaerin (905998) | more than 3 years ago | (#36376374)

Not even close. Wolfenstein 3D was "Ray Casted".

Re:ray traced (2)

Toonol (1057698) | more than 3 years ago | (#36376460)

The algorithm used rays, but not in the sense that ray tracing does. Wolfenstein would fire one ray for each horizontal column on the screen, to see where it intersected with the wall. That would be 320 rays for the full screen, and was why the maps were effectively 2D. Ray tracing, of course, uses at least one ray per pixel.
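As a rough illustration of the difference, here is a toy Python ray caster in that spirit: one ray per screen column, stepped through a 2D grid map until it hits a wall (a naive fixed-step march, not id's optimized column renderer):

    import math

    # Toy Wolfenstein-style ray caster: one ray per screen column (320 total).
    # The distance to the wall sets the height of the slice drawn in that column.
    MAP = [
        "##########",
        "#........#",
        "#..##....#",
        "#........#",
        "##########",
    ]

    def cast(px, py, angle, step=0.01, max_dist=20.0):
        """Distance from (px, py) to the first wall along one ray."""
        d = 0.0
        while d < max_dist:
            x, y = px + math.cos(angle) * d, py + math.sin(angle) * d
            if MAP[int(y)][int(x)] == "#":
                return d
            d += step
        return max_dist

    def render(px, py, facing, fov=math.radians(60), width=320, height=200):
        """One wall-slice height per screen column: the whole '3D' frame."""
        heights = []
        for col in range(width):
            ray = facing - fov / 2 + fov * col / (width - 1)
            heights.append(min(height, int(height / max(cast(px, py, ray), 1e-6))))
        return heights

    print(render(2.5, 2.5, facing=0.0)[:8])  # first few column heights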

Re:ray traced (1)

BradleyUffner (103496) | more than 3 years ago | (#36377234)

Wolfenstein would fire one ray for each horizontal column on the screen

Aren't horizontal columns commonly called... rows?

Re:ray traced (1)

vux984 (928602) | more than 3 years ago | (#36379102)

It fired one ray for each pixel along the horizontal axis (i.e., 320 rays) to get the relevant rendering information for the entire vertical column.

Re:ray traced (1)

Graymalkin (13732) | more than 3 years ago | (#36376520)

Wolfenstein 3-D used ray casting [wikipedia.org] which is a bit different [permadi.com] than ray tracing.

Re:ray traced (0)

Anonymous Coward | more than 3 years ago | (#36381792)

Wasn't Wolfenstein ray traced to begin with?

I can assure you that Wolfenstein was *not* ray traced, nor even ray-cast, "to begin with"! [gamesdbase.com]

Anything is better with ! (1)

pep939 (1957678) | more than 3 years ago | (#36376418)

This story would have been "Bluetooth-Based Game blablabl" a couple of years ago.

Eheh (4, Interesting)

SmallFurryCreature (593017) | more than 3 years ago | (#36376446)

So, basically Intel is saying: fuck OnLive with its 100ms lag times! We can go for SECONDS! No, MINUTES even! Infinite lag! We can do it!

All you need is to run an old game on hardware that can easily run the real game with an insane data plan.

The bubble is indeed back. Remember all those ideas for sites that wanted 100kb avatar images on real-time-updating forums with 500kb Flash-animated signatures? When most people were happy with a 56k modem? Served from Sun hardware? Well, this is the same thing.

I am quite willing to accept that you can render a game beautifully on a server. I am quite willing to believe tablet can be a lot cheaper if they only got to show a movie. I am even willing to believe that response time over the internet can in theory be fast enough not to produce outlandish lag in ideal circumstances.

In real life?

  • ISPs, and especially mobile ISPs, have been trying to cut back on the amount of data consumed. Can you imagine their reaction to the demands of an UNBUFFERED HD video stream during peak hours on a MOBILE device? Most of them can barely stand you downloading a low-res YouTube video of a couple of minutes. How long is the average game level?
  • Latency. It is already bad enough on the wired internet, where with multiplayer your computer only has to transmit the data, not the actual screen itself. On 3G or even 4G? Forget about it. For web pages the latency can easily reach a minute. In a chess game that would be too slow.
  • Hardware. It just keeps on getting more powerful; my netbook can now play some of the older games just fine. I recently replaced my Linux desktop with an AMD CPU-on-motherboard setup because I realized that for desktop use and movie watching I don't need any more (the game machine is properly silly overpowered). Tablets are getting more powerful all the time; someone, for fun's sake, report how Wolfenstein runs on an Atom CPU.

It is interesting geek stuff, but the same thing that killed so many "great" ideas during the last bubble is still there. Bandwidth and latency are still not solved enough for this. We can now finally do the instant messaging people had dreamed up around 2000. 10 years later. (Check how long it has been since your mobile phone became truly capable of doing IM without endless waiting or insanely high prices.)

Another piece of proof? Slashdot itself. Take the new "ajax" method of posting. No feedback, endless lag, errors, missing posts. It is clear what they wanted, but the tech just ain't there yet. For the instant feel to work you need servers that can process a request in a handful of milliseconds, not seconds, Mr Slashdot.

Nice idea Mr Intel, now get out your mobile phone and make a post on slashdot. Then you will know how unrealistic your dream is.

There is a reason we carry crotch-burning CPUs and insane amounts of storage with us. Moore's law doesn't apply to the internet. AT&T won't allow it.

Re:Eheh (0)

Anonymous Coward | more than 3 years ago | (#36376518)

It would also be a great way to get back at Intel, if the untold hordes (+ rival companies) just kept running the most intense stuff 24/7; their servers would have to be designed for absolute peak capacity; no way in hell they can pay for all that power and maintenance (the hardware, they can, of course). I'd like to see them stop spouting weird crap ideas that are just stupid. Or, if they mean it, I'll buy one and make them eat their bandwidth + power.

Re:Eheh (1)

themightythor (673485) | more than 3 years ago | (#36376566)

Unfortunately, it's your bandwidth and power, too. Granted, I get your point about the scalability of this, but your attack ain't free.

Re:Eheh (0)

Anonymous Coward | more than 3 years ago | (#36376796)

Right, like you said the scalability works against them. I'm ok paying $5 in power if that means I'm ray-tracing out Quake 6 (ray-traced!) and in turn costing them $5*100 (physical maintenance, IT, database engineering, power, etc etc). And I'm sure the big NV wouldn't mind paying $5 MILLION dollars if it costs them $100 million in costs hahaaha (factor of 20 only for what they are suggesting, I think, is reasonable).

Re:Eheh (0)

Anonymous Coward | more than 3 years ago | (#36376832)

Another piece of proof? Slashdot itself. Take the new "ajax" method of posting. No feedback, endless lag, errors, missing posts. It is clear what they wanted, but the tech just ain't there yet. For the instant feel to work you need servers that can process a request in a handful of milliseconds, not seconds, Mr Slashdot.

Feedback? No, really, what?
Not sure what you mean by feedback, but I've seen some pretty direct feedback on Slashdot when posting, such as NOSHOUTING and the like.

I've not experienced lag, besides whatever happened to the server back there a couple days ago. Of course, there is Firefox... are you using FF?

Nor have I saw missing posts, besides the ones that are hidden for a reason. See the lovely little scroller at the top right of the page that allows you to hide and show posts.

Errors though, yeah, I will give you that. Slightly.
The older version was absolutely atrocious, each sub-domain having their own little bugs and such.
This version is significantly better than that mess. And the recent changes with less white-space even more so.

Re:Eheh (1)

sourcerror (1718066) | more than 3 years ago | (#36377136)

It often takes me more than 10 sec to post comments (and that's not the counter thingie; just network lag + server load).

Re:Eheh (0)

Anonymous Coward | more than 3 years ago | (#36377152)

Nor have I saw missing posts, besides the ones that are hidden for a reason.

I have submitted posts that are missing in nested mode. Not modded down or filtered, because I can set the threshold to -1 and the post is nowhere to be seen on any page.

Switch it to flat, no threads and PRESTO, there's the post.

Not to mention nested mode redundantly shows the exact same results on the first few pages once a thread reaches a great enough length...

Nested viewing on Slashcode is badly broken, and has been for years.

Re:Eheh (0)

Anonymous Coward | more than 3 years ago | (#36377462)

Nor have I saw missing posts,

You're killing me, Smalls.

besides the ones that are hidden for a reason. See the lovely little scroller at the top right of the page that allows you to hide and show posts.

That all sort of works; the problem I have is the "50 posts at a time" thing. Fine, limited resources, gotcha.

What would it take to get me the 50 *most recent* posts? Or, especially if I'm moderating, 50 unmoderated posts?

Re:Eheh (1)

0123456 (636235) | more than 3 years ago | (#36378026)

Not sure what you are meaning by feedback, but I've seen some pretty direct feedback on Slashdot when posting, such as NOSHOUTING and the like.

Usually when you enter a comment somewhere it's either accepted straight away or you get some kind of feedback to say that something is happening.

When you enter a comment on the NEW SUPER IMPROVED SLASHDOT, you sit there for twenty seconds wondering whether something is going to happen. Sure, there's a 'Working' icon at the bottom of the screen... BUT IT'S THERE ALL THE TIME. The bastard 'Working' icon is sitting there spinning away right now as I type this. What does it mean? What is it 'Working' to do? Does anyone know? Or is it just lying to me?

And why can't I get new comments without refreshing the entire page after I've clicked the 'new comments' button? Why does it go away if there are no more comments at the time yet not return when there are more comments?

Slashdot was a dozen times better back before it went all 'Web 2.0' 'Cloud' whatever. Now it's a horrible mess of unintuitive and/or stupid design choices.

Re:Eheh (0)

Anonymous Coward | more than 3 years ago | (#36384054)

You miss the point. They are not trying to convince you that this is a practical solution. What you see here is a demonstration of a concept.

Here's you at the circus:

"Well that's just great, but an elephant is a stupid way to get to work. My car is better, plus its trunk can hold more groceries. And what is with the juggling? By not juggling, I keep my hands free to do other tasks like change radio stations or sip my coffee."

Link me to the original articles please (0)

Anonymous Coward | more than 3 years ago | (#36376568)

How about linking me to the actual articles instead of linking me more to slashdot articles which in turn link to more slashdot articles?

I thought the Blackberry tablet was bad (1)

Jellodyne (1876378) | more than 3 years ago | (#36376724)

I thought it was bad that the Blackberry tablet requires your phone to get email and function correctly. This Intel tablet requires a 32 core 'cloud' machine? Am I going to need a co-location service provider for my backend to ensure I can play a tablet based fps? Or is Intel planning to provide unlimited cloud capacity for each low power (and low cost) tablet processor they sell?

over wireless lan? (1)

afaiktoit (831835) | more than 3 years ago | (#36377092)

If this could work over a wireless lan to a beefy desktop computer it might be feasible.

All this just makes me feel old (1)

tchernobog (752560) | more than 3 years ago | (#36377508)

They are streaming content to a device; in other words, the calculations happen on a central server and your device acts as a (somewhat) dumb terminal.

It's the mainframe world of Multics again, only 30 years later, much more complex, and servicing trivialities instead of business-critical apps.

The "cloud" has become a buzzword for many, but deep down it's just some central servers doing the grunt work, and you displaying that data. The reverse of a decentralized, democratic and transparent system; more control to the companies, less control to the user.

Re:All this just makes me feel old (0)

Anonymous Coward | more than 3 years ago | (#36391136)

They are streaming content to a device; in other words, the calculations happen on a central server and your device acts as a (somewhat) dumb terminal.
It's the mainframe world of Multics again, only 30 years later, much more complex, and servicing trivialities instead of business-critical apps.

Yeah, I was thinking the same thing - but then I thought about MMORPGs and cyberspace/metaverse, and this is exactly how it's going to have to work.

The user will have to transmit commands to the server/cloud and the host will have to calculate interaction information in real time and stream that back to the user.

It's sort of analogous to terminal/mainframe but the key point is there's no way to get around the need for both parties to possess interaction information.

We can already do that without exotic hardware (1)

j1m+5n0w (749199) | more than 3 years ago | (#36377754)

I'm a big fan of real-time ray tracing, but this doesn't sound all that exciting, considering that about three years ago I was able to play a real-time ray-traced game on a middle-of-the-road laptop. Resolution and framerate weren't great, but it was playable. The game I refer to is Outbound [igad.nhtv.nl], an open-source game based on the Arauna engine.

It's great that this is on Intel's radar, but whenever Intel demonstrates some new real-time ray-tracing demo that requires a cluster of machines, or some other kind of expensive, specialized hardware, I just think they've kind of missed the boat. We can already do that sort of thing on an ordinary desktop. (The linked site is down, so perhaps there's more to this announcement than the slashdot summary would lead me to believe.)

Great (0)

Anonymous Coward | more than 3 years ago | (#36377958)

So then you will have to buy a computer so you can purchase a game that you will need an internet subscription for so you can pay for cloud time to play your game online. Where do I sign up!? When can I expect cloud-based users to complete the trend and play my cloud-based game for me?

Stupid (0)

Anonymous Coward | more than 3 years ago | (#36378746)

Polygons are faster and produce amazing results. Graphics used to be very important, but now it's basically a "solved" problem. The Wii, with SD resolution, and terribly low-end graphics (in comparison to competition) proves the point. You only need a certain threshold of graphical capability before gameplay becomes the weakest link.

Re:Stupid (1)

RightSaidFred99 (874576) | more than 3 years ago | (#36379796)

Bull. The Wii is dead, nobody(*) wants one or plays the one they have. That's why Nintendo is hurrying out with a new console while the Xbox/PS3 have been around much longer and will continue to be around for a few more years.

Gameplay _and_ graphics matter.

* - Yes, I know there are a few weirdos out there who do still play their Wii.

Re:Stupid (1)

Have Brain Will Rent (1031664) | more than 3 years ago | (#36380444)

Ummm polygons are faster than what? Spheres? NURBs?

Perhaps you meant scanline/raster rendering is faster than ray tracing? Both use polygons though...

Data caps and streaming (0)

Anonymous Coward | more than 3 years ago | (#36378824)

Why hasn't anyone mentioned anything about data caps? OnLive may not have much of a market in Canada considering those data caps. If you have such a low data cap, I don't think you'll be playing much, if at all, especially if you're a subscriber to the media streaming services.

What about pre-rendering everything? (1)

slapout (93640) | more than 3 years ago | (#36380636)

Would it be feasible to pre-render every possible scene in the game and then just throw that up based on which direction the user moves? (I realize you'd then have to superimpose the bad guys over the screen, but that's what the original Wolf3D did anyway.) It would be a lot of computation up front, but then you wouldn't have to have the cloud computer constantly render new frames (possibly some that it's already rendered).

Or would that take so much time that it's not worth it?

Re:What about pre-rendering everything? (1)

j1m+5n0w (749199) | more than 3 years ago | (#36383150)

I don't think pre-rendering all possible images would be practical (except in a game like Myst where movement is confined); however, when we get into the realm of global illumination, it does make a certain amount of sense to do the ambient lighting computation (which is the same regardless of camera position) up front. In a sense, this is what modern games already do -- the lighting effects are essentially painted on the walls as textures. I think the area where there's room for improvement is that global illumination ought to respond to changes in the scene (opening a door or window, for instance), and these calculations could be made part of the game engine rather than as an artist's tool, so that procedurally-generated content could benefit from the same high-quality lighting effects as static scenes produced by level designers.

Re:What about pre-rendering everything? (1)

Eric(b0mb)Dennis (629047) | more than 3 years ago | (#36384012)

I vaguely remember playing a SWAT game for PC that pretty much did this... kind of.

Well, actually, it used real-life video in place of rendered graphics, the video changing based on which direction you went.

Re:What about pre-rendering everything? (1)

Thiez (1281866) | more than 3 years ago | (#36384064)

> Would it be feasible to pre-render every possible scene in the game and then just throw that up based on which direction the user moves?

No, of course it wouldn't be. Let's say we have a completely featureless map that is 100m by 100m, and we track coordinates in (integer) millimeters. This gives us 100,000 * 100,000 = 10 billion different points we can occupy. Wait, now we can't jump, and our face is stuck at some distance above the ground. We add being prone and crouching to our positions, and the ability to jump, and, while we're at it, it would be nice if we could see our body fly around when being hit by a grenade or something, so to be on the safe side we add height to our coordinates and set our maximum height to 10m. Now we have 10 billion * 10,000 = 100 trillion possible points. For the sake of the example we'll assume the data from a single point to be 1920 * 1200 * 6 pixels (looking in 6 directions: up, down, left, right, forward, and back; if you look forward-left-up we'll just combine the pictures somehow), so at 4 bytes per pixel that is about 52MB, but let's assume we can magically losslessly compress this down to 1MB. Congratulations, our image lookup table now contains roughly 100 exabytes of information.

Note that all the above numbers are rather conservative. Maps tend to be much bigger than 100 by 100 meters, and coordinates in games tend to be tracked much more precisely. Many games allow you to go much higher/lower than 10 meters, especially those where you can float about as a spectator (which increases the amount of points from where a player might look spectacularly). The magical compression is probably relatively reasonable as neighbouring coordinates will usually look very much alike and you can probably get away with storing the differences in many cases.

Anyway, now we have a service where every client queries a 1 exabyte database about 30 times per second. Even assuming that is feasible, now we have to superimpose the bad guys... and to do that we need to know what the environment looks like. We don't want to superimpose a bad guy over a tree if he's standing behind it, so we need the 3d model of the map so that we can decide which parts of the enemy to draw. This is starting to look a lot like rendering...

It seems your idea would be a complete disaster, and it's several orders of magnitude cheaper to render scenes only when they are required (while accepting the risk of sometimes rendering the same scene multiple times).
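For reference, a quick Python recheck of that back-of-envelope math (same assumptions as above: 1mm granularity, six 1920x1200 views per point, ~1MB per point after very generous compression):

    mm_per_side = 100 * 1000                # 100 m map edge in millimetres
    ground_points = mm_per_side ** 2        # 1e10 positions on the ground plane
    height_steps = 10 * 1000                # 10 m of vertical freedom in 1 mm steps
    points = ground_points * height_steps   # 1e14 possible camera positions

    raw_mb_per_point = 1920 * 1200 * 6 * 4 / 1024**2   # ~52.7 MB uncompressed
    total_bytes = points * 1024**2                     # 1 MB per point after compression

    print(f"raw per point: {raw_mb_per_point:.1f} MB")
    print(f"lookup table:  {total_bytes / 1024**6:.0f} EiB (order of 100 exabytes)")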

Re:What about pre-rendering everything? (1)

grumbel (592662) | more than 3 years ago | (#36386194)

Would it be feasible to pre-render every possible scene in the game and then just throw that up based on which direction the user moves?

Feasible? Not so much. Possible in theory, yes. The issue is that you need lots and lots of storage for completely free movement. Older CD-based games like Myst 3 did essentially that, but you couldn't move freely, just rotate; movement was restricted to a fixed position every few meters. There was an adventure game prototype a few months/years back (discussed on Slashdot) that moved that tech to the next level and allowed free movement on a single plane, but that still ended up looking kind of awkward, as you couldn't do things like ducking. It of course also couldn't do dynamic objects, which pretty much ruins it for modern action games.

This is not a bad idea (0)

Anonymous Coward | more than 3 years ago | (#36383988)

If only you could pre-play the game, then download the results. No latency worries at all.
