Slashdot: News for Nerds

WebGL Flaw Leaves GPU Exposed To Hackers

Soulskill posted more than 3 years ago | from the out-in-the-cold dept.

Security 120

recoiledsnake writes "Google spent a lot of time yesterday talking up WebGL, but UK security firm Context seems to think users should disable the feature because it poses a serious security threat, and the US Computer Emergency Readiness Team is encouraging people to heed that advice. According to Context, a malicious site could pass code directly to a computer's GPU and trigger a denial of service attack or simply crash the machine. Ne'er-do-wells could also use WebGL and the Canvas element to pull image data from another domain, which could then be used as part of a more elaborate attack. Khronos, the group that organizes the standard, responded by pointing out that there is an extension available to graphics card manufacturers that can detect and protect against DoS attacks, but it did little to satisfy Context — the firm argues that inherent flaws in the design of WebGL make it very difficult to secure."
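
To make the denial-of-service vector concrete, here is a hedged sketch (hypothetical code, not taken from Context's report): the page, not the user, decides how much work each pixel of a WebGL fragment shader does, so the per-draw GPU cost is attacker-controlled unless a watchdog such as the robustness extension Khronos mentions steps in.

```javascript
// Hypothetical illustration of the DoS vector -- not an actual exploit.
// A page controls the source of the fragment shaders it asks WebGL to
// compile, so it can make the per-pixel work arbitrarily large; one
// draw call with a huge loop bound can stall the GPU for long enough
// to hang the desktop.
function makeExpensiveShader(iterations) {
  return [
    'precision highp float;',
    'void main() {',
    '  float acc = 0.0;',
    // The loop bound is attacker-chosen; huge values monopolize the GPU.
    '  for (int i = 0; i < ' + iterations + '; i++) {',
    '    acc += sin(float(i));',
    '  }',
    '  gl_FragColor = vec4(acc);',
    '}',
  ].join('\n');
}

// In a browser, the page would then compile it and draw a fullscreen quad:
//   var gl = canvas.getContext('webgl');
//   var fs = gl.createShader(gl.FRAGMENT_SHADER);
//   gl.shaderSource(fs, makeExpensiveShader(100000000));
//   gl.compileShader(fs); // link into a program and draw
```

The detection-and-recovery extension Khronos points to exists precisely because nothing in the shader language itself bounds how long such a program runs.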

120 comments

dupe (3, Informative)

erroneus (253617) | more than 3 years ago | (#36119492)

dupe dupe dupe

Re:dupe (-1)

Anonymous Coward | more than 3 years ago | (#36119528)

dupe dupe dupe

dupe

Re:dupe (-1)

Anonymous Coward | more than 3 years ago | (#36119540)

dupe dupe dupe

dupe

Dupe dupe.

Re:dupe (3, Insightful)

Shotgun (30919) | more than 3 years ago | (#36119538)

Do you mean that the article is a dupe, or that Google is duplicating the mistake Microsoft made with ActiveX and the whole "it is so convenient to let anyone in the world do whatever they please on my computer" mentality?

Re:dupe (0)

oGMo (379) | more than 3 years ago | (#36119694)

Yes.

Re:dupe (1)

AmiMoJo (196126) | more than 3 years ago | (#36119820)

It seems incredible that Google would allow such a blatant security hole after making so much effort to sandbox and partition Chrome.

Re:dupe (3, Insightful)

Desler (1608317) | more than 3 years ago | (#36119926)

Don't worry, just like with the previous story they'll just claim it wasn't a flaw in Chrome (despite it bypassing the Chrome sandbox) and downplay it.

The whole thing... (3, Insightful)

fyngyrz (762201) | more than 3 years ago | (#36120368)

...is part of a serious cultural error being made: an impetus by hopeful marketers towards applications that run in/on the browser rather than in the user's machine. Both putting data "in the cloud" and running apps "from the cloud" are fraught with pitfalls; insightful users (a minority, as always) will resist this trend with traditional in-machine applications and fully local storage of data. The rest will suffer as corporations (continue to) misuse their data.

The key issue is: Putting your data in the hands of those you don't know is a uniformly bad idea. So is giving control of your computer's execution to those you don't know. There is no remedy for this kind of error, either -- once you hand your data over, you have lost control of it, and in turn, you have lost control over the consequences of random third parties misusing your information.

The good news is that we have a broad set of extremely powerful applications available to us that run well in the local environment. Word processors, spreadsheets, sound, image and video editors, music and video library engines, educational software and a whole host more are all very well populated with traditional applications, so for the thinking user, there is no need to "go to the cloud" for classic compute tasks. Instead, the net can be used for communications, both as its heritage dictates and as the most sensible domain fit, while personal data and execution permissions remain secure in and at the local environment.

To help protect yourself, I suggest beginning by disabling flash and scripting, and using only CSS/HTML in the web-facing interface. As a side benefit, surfing is much more pleasant without pop-overs, flash ads, and many other corporate infections of the network.

Neither Google nor any other corporation has your best interests in mind. Start from that understanding, and the world will make considerably more sense.

Re:The whole thing... (2, Insightful)

SanityInAnarchy (655584) | more than 3 years ago | (#36120700)

insightful users (a minority, as always) will resist this trend with traditional in-machine applications and fully local storage of data.

Let's hope those insightful users are also insightful enough to actually have backups.

The key issue is: Putting your data in the hands of those you don't know is a uniformly bad idea. So is giving control of your computer's execution to those you don't know.

And it's not possible to avoid both of these, sorry. In fact, it's not possible to avoid the latter at all.

The good news is that we have a broad set of extremely powerful applications available to us that run well in the local environment.

The bad news is that any local application has at least as much access as these web apps do.

To help protect yourself, I suggest beginning by disabling flash,

Thus "protecting" yourself from YouTube, FreeFillableForms (the only way to file US taxes online that I know of), etc.

scripting

Thus "protecting" yourself from things like Gmail, Google Instant Search, and... Do I really need to spell it out? I get the noscript mentality, but disabling these things entirely is both paranoid and backwards. It's a bit like unplugging your machine from the Internet -- sure, it's more secure, but it's also much less useful.

As a side benefit, surfing is much more pleasant without pop-overs, flash ads,

For which we have more specific approaches, like Adblock.

Neither Google nor any other corporation has your best interests in mind.

It does help when their interests are aligned with yours, however, and this usually isn't hard to determine. But I'm again at a loss as to how you get from that understanding to railing against applications in the browser. The browser application has only the data I allow it. The local application, unless I take great pains to sandbox it, has access to everything I have locally and to the network. I can understand wanting to be sure your data is stored safely, whatever "safely" means to you -- I care less about privacy and more about reliability, but you can tweak both knobs as you like. But given the choice between an application that runs locally and one which runs in the browser, which is already a sandbox, I'll choose the browser app every time, unless I have a compelling reason to trust the application, and sensitive data that I don't want to leave my machine -- and even in that case, it's possible to run some web apps offline and, at the user's discretion, give them more access than an arbitrary web app should have.

Re:The whole thing... (1)

countertrolling (1585477) | more than 3 years ago | (#36121164)

The browser application has only the data I allow it.

You really think you have that much control? I can assure you that you don't know if your entire disk has been uploaded or not..

Re:The whole thing... (1)

SanityInAnarchy (655584) | more than 3 years ago | (#36121326)

And how's that? Barring a vulnerability in the browser, that's not going to happen. Barring a massive conspiracy through a fairly large open source community, that vulnerability is not going to be deliberate.

Do I really need to spell this out?

Re:The whole thing... (1)

countertrolling (1585477) | more than 3 years ago | (#36121458)

Nothing to do with open source..

Look in the hardware, where the trade secrets hide out.. There is nothing to protect you from that.. In fact the law already protects them from your attempts at discovery... I shouldn't need to spell that out..

Re:The whole thing... (1)

SanityInAnarchy (655584) | more than 3 years ago | (#36122298)

Which again makes the browser completely irrelevant to this conversation.

If they've pwned my hardware, or my OS, or anything else that runs at Ring 0 or better, I'm fucked. Hell, even if it's not in Ring 0, if it just has root, I'm fucked. May as well give up and go home. I have no choice but to trust those who have that amount of control over my machine.

However, I don't have to trust more than that. For instance, so far as I know, Blackboard hasn't contributed to any of the hardware or root-level software on my machine. Thus, when I run Blackboard software in my browser, it's reasonably contained -- I am trusting them with no more than I have to. Which is still considerable, given that my grades do depend on their software functioning sanely, but they don't get my credit cards, SSN, email, etc.

Really, what was your point? I'm naive for trusting the bare minimum, instead of no one at all?

Re:The whole thing... (1)

fyngyrz (762201) | more than 3 years ago | (#36122618)

Let's hope those insightful users are also insightful enough to actually have backups.

Yes, let's. Of course, on the other hand, you have to hope Google hasn't lost your email or isn't down when you need it, don't you? As opposed to your trusty email program, which is right there all the time. And lets you do things like automatically assign identities to your replies, or delete your attachments, or view your email in non-prop fonts... and doesn't surf your email for keywords... and doesn't have employees that snoop on your email... and doesn't limit the amount of storage you have... all things that are decidedly unlike Google.

And it's not possible to avoid both of these, sorry. In fact, it's not possible to avoid the latter at all.

You can certainly avoid doing it over channels like the Internet, though, which is the point that you're trying to disregard. Nice try.

The bad news is that any local application has at least as much access as these web apps do.

The good news is you can have control over them, because they're not Internet apps, and further, can be kept off the net with tools like Little Snitch, virtual machines, simply not connecting the host hardware to the Internet at all, et al. Again, nice try, no cigar. Your data can't be emailed or otherwise snatched by malware if the machine that carries it isn't on the net. My main machine has no Internet connection at all; no one, including the USG, can steal my data without physical access. I surf on a nettop that has absolutely zero of interest on it. You could completely compromise it *or* my main machine via some installed program and you'd *still* have nothing. The one has no data of any use (and in fact often has a different OS, installed from scratch, on it from day to day... that's how ephemeral that thing's software and data environment is); the other has no network connection. So you see, it's a matter of managing your resources; there is no inevitability to data exposure. If your data is exposed, and you understand how things actually work -- it's your fault. Period.

Thus "protecting" yourself from YouTube, FreeFillableForms (the only way to file US taxes online that I know of), etc.

Yes, precisely. You don't have to deal with Youtube's censorious behavior, invasive ads, or the incredibly low-resolution mass of 99.99% drivel they call content. A win all around. As for taxes, no need to do those online, and not a good idea, either. See, you've been conditioned to think it's ok to put your financial information online. It's not. It never was. It never will be. That information should be between you and the other legitimate parties to the transaction(s), and no one else. Sending your tax information over the network is just plain careless and hugely risky. Your choice, though, of course. No one is stopping you from doing the wrong thing. Just keep pretending SSL protects you. That's exactly what they (corporations and the USG) want you to think.

Thus "protecting" yourself from things like Gmail,

Yes, exactly. Try Thunderbird instead. You'd be amazed at the power of a real email system. And you have the source code; if you want to ensure it's safe to run, you can. And all those problems I listed WRT Google? Not a problem with Thunderbird at all.

Google Instant Search

Yes, life was utterly horrifying and pointless and it was impossible to better yourself without that, wasn't it? lol

Do I really need to spell it out?

Your spelling isn't in question here -- it's your failure to understand security and your submission to conditioning that is at issue.

For which we have more specific approaches, like Adblock.

That's just a patch on a larger problem. Not a solution. Adblock didn't stop Google employees from surfing people's email, did it? Or scanning it for keywords? Doesn't keep systems in the packet path from intercepting your tax information, does it? Doesn't keep the USG from dropping a keylogger on your system when you file your taxes, does it? Adblock is there to pander to the naive, to apply a visual patch to a problem with much deeper roots. Your data is still at risk.

But I'm again at a loss to how you get from that understanding to railing against applications in the browser.

That's because that's not how it goes. It goes like this: My data is mine; I don't intend to "share" it, much less have it stolen; so I don't let Internet code execute in/on my machine. It isn't "corporations are bad, so don't execute"; it's "my data is mine, so don't execute." A side benefit is that corporations, who I know don't have my best interests in mind, are locked out of my machine except in the most passive ways: gifs, jpegs, html, css. That's plenty to communicate with me (and also, just fyi, it's plenty to allow services like email and video -- CGI does *not* have to execute on your machine). If someone really wants to run some code, they can run it on their own hardware and then show me the results, and that'll be just fine. A lot of people out there seem to have forgotten just how much can be done with server-side execution. Safely!
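
The server-side model described above can be sketched in a few lines (Node-style JavaScript; the function names are illustrative, not from any real framework): the operator's hardware does the computing, and the visitor receives only inert markup.

```javascript
// Minimal sketch of server-side rendering: all logic runs on the
// server's hardware; the client gets static HTML and never has to
// grant execution rights. Helper names here are hypothetical.
function escapeHtml(s) {
  return s
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;');
}

function renderGreeting(name) {
  // Computation and templating happen here; the browser only displays.
  return '<html><body><p>Hello, ' + escapeHtml(name) + '!</p></body></html>';
}
```

A real CGI program would read request parameters and write a string like this to stdout behind a web server, but the division of labor is the same: execute there, display here.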

But given the choice between an application that runs locally, and one which runs in the browser, which is already a sandbox

LOL. You are naive. Extremely so.

Re:The whole thing... (0)

Anonymous Coward | more than 3 years ago | (#36122720)

lol do you live in a bunker and wear aluminum foil hats?

Re:The whole thing... (1)

Joce640k (829181) | more than 3 years ago | (#36121702)

To help protect yourself, I suggest beginning by disabling flash, scripting and use only CSS/HTML in the web-facing interface.

How about you get hold of something that lets you whitelist which sites are allowed to use Flash and Javascript...? That way your computer will still be able to do something useful.

Re:The whole thing... (1)

fyngyrz (762201) | more than 3 years ago | (#36122124)


How about you get hold of something that lets you whitelist which sites are allowed to use Flash and Javascript...? That way your computer will still be able to do something useful.

If your computer can't do anything useful without flash and javascript, then frankly... well, never mind.

If you allow a site to operate those technologies in your execution space, you are allowing third -- and fourth, and fifth, and so on -- parties access to your hardware. Not a good idea. Your "trusted" site will likely sell some ad space to Malware Inc., and sooner rather than later, directly or indirectly. Then your data will be gone, while you're stuck being mad and adding the site to your blacklist (or taking it off your whitelist). But your data is AWOL; this isn't something you can fix after the fact. Even if there were a decent legal remedy -- which there isn't, at present. Or a just legal system to pursue it in -- which there also isn't, at present.

The right answer is not to let others execute programs on your computer. Just as simple as that. The current trend is going exactly the wrong way. If you want to be one of the horde with a chunk bitten out of their metaphorical rear, then that's your choice -- at least you've considered the issue, which is more than most people can say. So for you, there's no need for sympathy. For granny, who doesn't get it, some compassion is in order, not to mention some kicking against the traces in her stead.

Re:The whole thing... (1)

maxwell demon (590494) | more than 3 years ago | (#36122866)

The key issue is: Putting your data in the hands of those you don't know is a uniformly bad idea. So is giving control of your computer's execution to those you don't know. There is no remedy for this kind of error, either -- once you hand your data over, you have lost control of it, and in turn, you have lost control over the consequences of random third parties misusing your information.

Well, I have bad news for you. As soon as you start up your computer, even before the operating system starts up, your computer is running code from someone you don't know (unless you know all the people writing processor microcode, your computer's BIOS, and all sorts of firmware for devices in your computer). Then your computer reads and executes the master boot record, which most probably is written by someone you don't know, which then goes on to load an operating system, generally also written in most parts by people you don't know (and yes, that's true even if you run Linux or BSD). Then the operating system loads drivers from authors you don't know, running in ring 0, and starts programs from authors you don't know, running as root. And at some time, it starts a login process, probably also written by someone you don't know. That is, already before you even get to your user account, you rely on code written by thousands of people you don't know, executing in security-critical contexts. And any program you run from your user account will also execute such code.

Indeed, running software from the browser is in principle more secure than running it directly on the computer, because there's another barrier the application has to overcome: The security model of the browser. After it breaks that, it has exactly the same possibilities as normal software.

Now, I don't want to say that browser-based software isn't a security issue at all. It is, but for a completely different reason. The problem is not that you are executing code from someone you don't know; you do that also with software installed on your computer. The problem is rather that you don't know which code you actually execute. With installed programs, you know that the program you execute today is the same program you executed yesterday (unless you installed a new version, of course). So if you have convinced yourself that a program you installed is trustworthy, it remains trustworthy. Web applications OTOH are loaded anew each time you run them. If someone manages to manipulate them, or to redirect you elsewhere, then the code you execute is not the code you think you execute. It's the equivalent of mounting a file system from somewhere on the internet and directly running the executable found there; something nobody in their right mind would do.

Re:dupe (1)

LordLimecat (1103839) | more than 3 years ago | (#36121736)

The previous story was about how a plugin -- which by all counts is installed on >90% of machines -- had a flaw, and was exploitable on any of those machines. Google, in an attempt to ensure timely application of plugin updates, bundled that plugin with Chrome (which is one of the reasons I have started looking at MSI rollouts of Chrome -- if they're going to have flash no matter what I say, at least it will be up to date).

Fair enough to call the software package "Chrome" vulnerable, while technically the package "Firefox" is not; but in reality, users will not notice the difference. If you are watching YouTube (or Hulu) videos in a browser, you are susceptible to the attack. The exploit, AFAICT, was not in any Google code, but in the bundled Adobe code.

So while technically Vupen is right that it is a Chrome exploit, it is also more directly an Adobe exploit, and affects everyone using Adobe Flash. To make the story about Chrome, when most people running Flash are susceptible, is quite misleading, and I think Google was right to step up and point out that while it is a "Chrome package" exploit, it isn't really a Chrome-exclusive flaw, nor is it reflective of the code within Chrome itself.

Also, it's rather amusing to find Slashdot defending the claims of a company who makes statements like this...
"We will not help Google in finding the vulnerabilities... Nobody knows how we bypassed Google Chrome's sandbox except us and our customers, and any claim is a pure speculation."

Re:dupe (1)

Bengie (1121981) | more than 3 years ago | (#36119992)

"WebGL is implemented in Google Chrome[4], Mozilla Firefox 4 and in development releases of Safari and Opera."

Khronos Group is making WebGL. Blame them, not Google. Khronos Group also makes OpenGL. Google is just following standards and bragging about how they're implementing them first.

Re:dupe (1)

Anonymous Coward | more than 3 years ago | (#36120242)

You don't have to implement every standard, and you certainly don't have to implement Khronos Group standards.

In fact, a company has a responsibility not to implement standards that are ill thought-out. Standard does not, by itself, mean good.

Re:dupe (1)

LordLimecat (1103839) | more than 3 years ago | (#36121682)

I've never been precisely clear on why "passing instructions directly to the GPU" is different from YouTube / h.264 "passing data directly to be decoded by the GPU" or JIT'd JavaScript "being passed directly to the CPU". Surely it is possible to do such things securely? It's not like WebGL allows, for example, access to the GPU BIOS, or overclocking functions, or fan speed, and presumably they would limit the other things that can be done.

I mean, games like WoW display data on the graphics card that is pulled over the network; the fact that they have a trusted client that sanity-checks the incoming graphical data makes it safe (though I know it's more than that, the rendering is done locally).

I've heard a lot of back and forth about "but it allows shader-level access" (and I've never been super clear on shader vs. normal painting); can someone break down how it could be safe and how it could be unsafe?

Re:dupe (1)

hairyfeet (841228) | more than 3 years ago | (#36121838)

Ya know, I still ain't figured that one out yet. I mean, sure, MSFT wasn't the brightest bulb when they went "Hey, let's run everyone as admin!" with XP, but at least that could be partially explained by having no clue at the time how to fix it without borking backwards compatibility and having to get metric tons of Win9X third-party code to run on the WinNT arch.

But with ActiveX you are letting web sites have low level system access and the amount of data you could glean with the first versions of ActiveX was just nuts! What were they thinking? Did the security team take a vacation or what? I still don't see how a meeting discussing ActiveX could have taken place without someone pointing out "Uhhh...guys we are kinda the biggest target for script kiddies and bad guys. What would happen if they started using this?"

As for TFA, I STILL say sooner or later we are gonna have to start over. Just look up "JavaScript malware" to see how much nasty is already out there as it is, and now companies are trying to come out with stuff like WebGL because even after tying JavaScript into a pretzel trying to get more bling bling and speed, it STILL isn't enough. What we need is a new language, built from the bottom up for security, that can do CPU and GPU virtualization so the code doesn't need to know squat about what you have to run the site.

Because as we have seen with previous articles on bad guys jumping out of sandboxes, what we have now is a bunch of band aids on bullet wounds. Sandboxes, NoScript, ABP, all of it is band aids trying to stop the bleeding when the simple fact is JavaScript simply isn't designed for all the bling bling bs and security as well. So I say it is time to start fresh, what we need is something that will let the folks have their bling bling in a manner that won't bite them in the ass. It won't be easy, but big things never are. But imagine a world with no drive by malware, no ad carried infections, hell all that would be left would be good old fashioned social engineering. Sounds like a nice place to me.

Re:dupe (1)

kvvbassboy (2010962) | more than 3 years ago | (#36119550)

why so many dupes these days? :s

Re:dupe (0)

Anonymous Coward | more than 3 years ago | (#36120216)

Maybe it's because the people who would be involved in that process, or in meta-moderation, don't participate because it requires enabling scripting?

Re:dupe (1)

TheRaven64 (641858) | more than 3 years ago | (#36121454)

Or maybe it's because they removed the big 'mark as dupe' button from stories on the front page...

Re:dupe (1)

asifyoucare (302582) | more than 3 years ago | (#36119586)

trupe?

Aftershocks of the Facebook smear campaign. (0)

boeroboy (1501771) | more than 3 years ago | (#36119734)

Mark Zuckerberg paid for a threepeat. I hear every time WebGL compiles a shader, Orkut emails a thousand SSNs to China.

I asked about this at Google I/O! (4, Insightful)

MostAwesomeDude (980382) | more than 3 years ago | (#36119588)

http://www.youtube.com/watch?v=WgbK0ztUkDM&feature=player_detailpage#t=3195s [youtube.com] is the video. In short, I asked the NaCl guy whether they knew what they were doing by letting NaCl clients access GPUs directly. His response was that they were doing everything WebGL does to protect the system from malicious code. That's unfortunately not sufficient.

Second Life - Already A Proven Attack Vector (1, Interesting)

Anonymous Coward | more than 3 years ago | (#36120574)

In Second Life there are denial-of-service attacks on the GPU going on as we speak. People who gather at infohubs to voice chat are occasionally knocked offline by them. It can clear a room full of people in seconds. Basically, the attacker wears a spiked sculpty (a texture that turns into geometry) that is far more processing than the GPU can handle rendering. It looks like static on your screen, essentially; you lag like hell, and then you crash hard.

The only way to stop it is if you get lucky and smash the key combo for disabling Volumes, or already had it disabled; the combo is CTRL ALT SHIFT 9. Volumes are prims (geometry primitives), which include sculpties and also regular geometry (cubes, spheres, torus, etc.), and by disabling them, hair, buildings, floors, walls, jewelry, shoes, etc. all vanish. You're left looking at a pretty blank world with more or less naked avatars.

The people who commit these attacks are usually either just trolling for the lulz, or they have a chip on their shoulder because they've had their asses handed to them in an argument over mic in that hub. Which is often the case. I'd say it probably happens at least 2 or 3 times a day, if not more, in the korea1 infohub for example. Also, it's been reported that it can potentially damage your graphics card. If you can't stop it with the key combo fast enough, you're best off killing the application with Task Manager as soon as you notice it happening.

Re:I asked about this at Google I/O! (1)

gl4ss (559668) | more than 3 years ago | (#36120790)

well, that's the new Google engineers' card: blame others.

The whole idea of NaCl, though, is like going back to the Windows 3.11 era style (it's actually convenient if a document can have code in it, in a trusted environment; flash and such are half-way houses to that), really.

but who would use an OS without native binaries? accountants, maybe, for a short while. a really short while.

Re:I asked about this at Google I/O! (1)

CODiNE (27417) | more than 3 years ago | (#36121010)

Ahhh... well you see NaCl really means Sodium Chloride or common table salt.

Sodium or Chlorine by themselves are toxic, yet when combined they cancel out each other's effects and make our food yummier.

This is how Google plans to use native code execution and direct GPU access together. True, separately they are dangerous and leave a computer vulnerable to hackers. Yet, when combined, incredibly they cancel out each other's effects and make the internet yummier!

just one thing... (-1)

Anonymous Coward | more than 3 years ago | (#36119602)

HTML5 is sooooo the next thing- just a few questions

- then why haven't any large video sites chosen it for video?

- why does performance seem to be so, sluggish?

- why hasn't anything decent been made with it?

- why can't the people behind it seem to agree?

- why did a small minority of idiots expect us all to go way back to the days of "this website is best viewed with [insert favorite browser here]"?

wtf - HTML5 ain't gonna happen so quit bringing it up - just let it die.

Re:just one thing... (1)

mrnobo1024 (464702) | more than 3 years ago | (#36119636)

HTML5 is sooooo the next thing- just a few questions

- then why haven't any large video sites chosen it for video?

YouTube [youtube.com] isn't large enough for you?

Re:just one thing... (1)

Desler (1608317) | more than 3 years ago | (#36119848)

Sure, but you

Re:just one thing... (0)

Anonymous Coward | more than 3 years ago | (#36119884)

yeah that's a great opt in trial there -

still, have a look at http://apiblog.youtube.com/2010/06/flash-and-html5-tag.html if you want to get the bigger picture (no pun intended)

Re:just one thing... (1)

Desler (1608317) | more than 3 years ago | (#36119916)

Sure as long as you ignore the fact that the vast majority of the content on Youtube is still exclusively only served via flash.

Re:just one thing... (0)

Anonymous Coward | more than 3 years ago | (#36120310)

Is it? Everything I can recall for some time can be downloaded in h.264 (.mp4) using the Download Helper FF plugin. Then things can be watched in VLC. That's not exactly in the browser, but it's a workaround to Flash/scripting. So they do serve in something besides Flash.

I wish there was a setting to get Firefox 4 to view that natively. Is there an extension or some other trick? Maybe a browser ID change so it looks like an iPad or something? I know people want to support open codecs, I do too, but can't we have h.264 support that works too?

Re:just one thing... (1)

icebraining (1313345) | more than 3 years ago | (#36120496)

If you use Windows 7, there's an extension [interopera...ridges.com] to play H.264 videos on Firefox using the system codec.

Re:just one thing... (1)

Desler (1608317) | more than 3 years ago | (#36120608)

Is it?

Yes. There are still tons of videos that are only available in the old Sorenson video codec, served up through flash.

Re:just one thing... (1)

_0xd0ad (1974778) | more than 3 years ago | (#36119826)

This [craftymind.com] seems to be "so, sluggish" to you?

Re:just one thing... (0)

Anonymous Coward | more than 3 years ago | (#36120134)

no not sluggish - it just doesn't work! piss orf.

Re:just one thing... (1)

_0xd0ad (1974778) | more than 3 years ago | (#36120232)

How the fuck can you comment on HTML5's video performance when your shitty browser doesn't even support it?

YOU "piss orf".

Re:just one thing... (0)

Anonymous Coward | more than 3 years ago | (#36120414)

isn't that the point here, that this dead-loss hack of different technologies would've taken us back to the days of "this website is best viewed with [insert favorite browser here]"

people are not going to buy that, whether devs or end users, so just forget about it.

Re:just one thing... (1)

_0xd0ad (1974778) | more than 3 years ago | (#36120586)

No, the point is that if you're one of the outliers who still hasn't moved to a modern browser that supports basic HTML5 video...

Get a real browser.

It works in Firefox and Opera. I can't test it in Chrome.

Re:just one thing... (0)

Anonymous Coward | more than 3 years ago | (#36120388)

Yes.

I'm running a dual processor Xeon 2.4GHz with a Radeon HD3850, and it's very choppy and inconsistent. Once I click to blow it up, framerate drops to about 5.

Re:just one thing... (1)

JorDan Clock (664877) | more than 3 years ago | (#36120452)

And my E4500+HD5770 runs the video and the "blow up" completely smooth. So... I guess it's mainly a GPU thing?

Re:just one thing... (1)

_0xd0ad (1974778) | more than 3 years ago | (#36120514)

I'm running a dual-core 1.8GHz Pentium with whatever graphics card HP decided to slap in (the driver is identified as "Intel(R) Q965/Q963 Express Chipset Family") and it plays about as smoothly as I care to have any YouTube videos look.

Re:just one thing... (1)

TheRaven64 (641858) | more than 3 years ago | (#36121494)

Core 2 Duo here, 2.16GHz (mobile version, not desktop). CPU usage for the browser is at 86% playing the video, at 105% (i.e. one and a bit cores worth) when I explode it. No loss in framerate. Sounds like your browser is the problem...

Re:just one thing... (1)

GameboyRMH (1153867) | more than 3 years ago | (#36121704)

Runs at about 4fps on my N900, even when blowing up the video, so something's seriously wrong with your computer.

Re:just one thing... (0)

Anonymous Coward | more than 3 years ago | (#36119950)

- why hasn't anything decent been made with it?

If you haven't already, take a look at http://www.optimum7.com/css3-man/animation.html [optimum7.com]

It's pretty much cutting edge when it comes to combining HTML5, CSS3 and JS.

Re:just one thing... (0)

Anonymous Coward | more than 3 years ago | (#36120156)

1. There are a few, but as far as it goes Flash is still LESS resource intensive because it uses the GPU now

2. These are new implementations. Did you really think that day 1 releases would be optimized?

3. Many things have, but when IE has ~60% and only 10% or so of that is IE9, there are really only 2 engines writing new HTML(5) features: Gecko and Webkit. Trident is only putting in safe features because it has to do extended support, whereas Webkit browsers (excluding mobile) usually upgrade pretty quickly (i.e. they don't fragment). And Gecko's new upgrade system should put it on a really nice path too. So the true answer is: the spec isn't finished, and even for the parts that are, there aren't enough proper implementations to be able to use it effectively.

4. Because there are 4 major engines. Try getting five companies to agree on anything: Google, Apple, Microsoft, Mozilla, and Opera. There are other companies like Adobe with stakes in this too. They try to get what their userbase wants the most, while still being secure and not overwriting features. You don't break the web, and that's a hard thing to do.

5. Those are the people that most web devs are railing against, just like people who introduce only -webkit properties. It's all or nothing: -ms -moz -o -webkit. There's a battle in CSS going on against that. But what you're talking about are SPONSORED sites, as in Google or MS paid or ran a contest for people to do that; it's part of the submission, to get more people onto their browser. The people with the biggest browser call the shots.

HTML is happening, just it's slow to upgrade the world's largest platform. If you don't believe me, ask Brendan Eich. You can't regress, only upgrade, and anything that has regressed, all engines still need to support, otherwise the browser seems broken. And not all the API's are finished, and then they need to go through at least 2 implementations. It's a long process with a lot of companies involved, not just some like OMG WE PATCHED IT.

When will it end? (1)

Anonymous Coward | more than 3 years ago | (#36119604)

Isn't there a point at which keeping your equipment safe from intruders is such a hassle that it's no longer worth even having the equipment? This constant battle to defend your computer is tiresome. I can't imagine not having a computer, but this is exhausting.

Re:When will it end? (3, Funny)

pushing-robot (1037830) | more than 3 years ago | (#36119684)

Welcome to "Everyone Else"—we're happy to have you as a member! Here's your complimentary iPad.

Re:When will it end? (1)

icebraining (1313345) | more than 3 years ago | (#36120326)

The iPad is still vulnerable to flaws in the browser and apps - for example, the PDF exploits which enabled 1-click jailbreak (and could just as easily enable 1-click malware) - and to flaws in the app review process - which enabled a tethering app to be approved and installed by thousands before it was taken down.

Re:When will it end? (1)

pushing-robot (1037830) | more than 3 years ago | (#36121044)

Yeah, and a Honda Civic isn't immune to mechanical problems. But when it has them, they're somebody else's problem.

Re:When will it end? (1)

GameboyRMH (1153867) | more than 3 years ago | (#36121724)

You can fix your own Civic. A better example might have been a Maybach, a Nissan GTR or a McLaren F1.

Re:When will it end? (0)

Anonymous Coward | more than 3 years ago | (#36120520)

Except at one point, all you had to do was click on a PDF to crack its security. I would have enjoyed it if someone had put a malicious payload into that PDF *SINCE IT HAD ROOT ACCESS* and watched everyone squirm.

Not to mention apps can, without jb'ing, steal your contacts, your location for the past year (well, that's patched now), etc.

Incidentally, guess what? Your recommendation will still use OpenGL ES2.0 to implement WebGL -- just like "Everyone Else"! You will still experience the same problems that the article is pointing out should the malware target your OS of choice.

Man, I hate idiots.

"speculative at best..." (1)

jamienk (62492) | more than 3 years ago | (#36119714)

Re:"speculative at best..." (1)

Desler (1608317) | more than 3 years ago | (#36119870)

This just in: Guy who works for Mozilla downplays issues in a standard originating from Mozilla.

Re:"speculative at best..." (1)

JorDan Clock (664877) | more than 3 years ago | (#36120480)

Except WebGL comes from Khronos Group. Ya know, the same guys that brought us OpenGL. Mozilla is part of the founding body, but you can't say it originated from them.

Re:"speculative at best..." (1)

Desler (1608317) | more than 3 years ago | (#36120562)

WebGL grew out of the Canvas 3D experiments started by Vladimir Vukićević at Mozilla. Vukićević first demonstrated a Canvas 3D prototype in 2006. By the end of 2007, both Mozilla[6] and Opera[7] had made their own separate implementations.
In early 2009 Mozilla and Khronos started the WebGL Working Group.

Sorry, what eventually became WebGL originated from Mozilla and then later Mozilla and Khronos started the working group to standardize it. So yes, you can say it originated from them.

Re:"speculative at best..." (1)

idontgno (624372) | more than 3 years ago | (#36120578)

If my kid throws a rock that breaks your window, I probably won't get away with "Yeah, I'm a part of his founding body, but that rock didn't originate with me". It's technically true but effectively worthless as a disclaimer.

Re:"speculative at best..." (1)

Desler (1608317) | more than 3 years ago | (#36120626)

It's not even technically true since the Canvas 3D that became standardized as WebGL was originally created and implemented by Mozilla.

Re:"speculative at best..." (1)

LordLimecat (1103839) | more than 3 years ago | (#36121776)

Desler's statement was that the WebGL standard originates "from Mozilla". Statement is false.

Why do you feel the need to defend something so obviously wrong? No one's playing the blame game in this thread; they're just fighting gross misinformation.

Re:"speculative at best..." (1)

LO0G (606364) | more than 3 years ago | (#36121618)

That's... Interesting. Buffer overflow attacks were once considered "speculative at best".

Here's a question: What happens when you take drivers which were designed to run only local content (and thus have never been hardened against malicious content) and expose their entire API surface to the internet?

The answer is similar to the answer to the question: What happens when you take network services which were designed to run only intranet content (and thus have never been hardened against malicious content) and expose their entire API surface to the internet?

We know the answer to the second question: Sasser, Blaster, etc.

Why would the answer to the first question be any different?

Re:"speculative at best..." (0)

Anonymous Coward | more than 3 years ago | (#36121768)

That's... Interesting. Buffer overflow attacks were once considered "speculative at best".

Bullshit. Buffer overflow attacks have been a known problem since zero-delimited strings. Back when people were still programming in assembly, self-modifying code was the norm, not the exception.

Re:"speculative at best..." (1)

LO0G (606364) | more than 3 years ago | (#36121808)

Back in the 1990s when you reported a buffer overflow to a company, their usual answer was "So what, it's only a theoretical vulnerability, it can't be used to attack our product - all you can do is crash my app, you can't get reliable remote code execution".

In the intervening 10 years, most companies no longer feel that way.

To me, the "speculative at best..." comment seems disturbingly similar to the old complaints about buffer overflows.

Horrible Article (5, Interesting)

ace123 (758107) | more than 3 years ago | (#36119830)

This is nothing more than a scary article about the well-known risk of denial-of-service and using shaders to extract pixels from a remote image -- and the media is slurping it up, using sentences like "run arbitrary code on the GPU ... render an entire machine unusable". Ugh.

It's a completely over-hyped article about something the spec designers have known since day one. The article takes the fact that bad OpenGL drivers can crash a computer and calls it a security hole, something driver vendors are actively working to resolve in future cards.

I wasted my time reading that whole report a few days ago, and it basically said nothing that wasn't obvious and well-known. The only thing new is they are showing that there is no way to stop GPU code from extracting pixels from remote images embedded in a canvas, which is a real "security" hole, though there's not a whole lot of use for this.

Basically, the extent to which this *should* affect webgl is that they will disallow textures from remote sites -- in other words, it could add an extra annoying implementation step for collaborative spaces that could include models from multiple sites. Also, they might choose to add an Infobar to prevent arbitrary websites from crashing the computer or making it run slowly.

However, thanks to the media slurping this up and using words like "run arbitrary code on the GPU", "render an entire machine unusable", etc., people who read these articles and know nothing about the subject (i.e. idiots) will start to ask browser vendors to turn those features off. But to be honest, I hope people aren't this stupid and "FUD"-y articles like these are forgotten.

Also, the title is plain misleading -- a denial of service attack on buggy drivers should not be described as "Leaves GPU Exposed". A website can not in any way take advantage of crashing a user's computer, and browser vendors will quickly respond with a blacklist patch when they learn of the affected GPU.

If you disagree with anything I said, feel free to comment and I'll explain in more detail why what the article describes are not "security issues" in WebGL.
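
The cross-domain image issue the parent describes comes down to an origin check. This is a toy sketch (not actual browser code; the function name and structure are illustrative) of the kind of same-origin test a browser could apply before letting a page upload an image as a WebGL texture:

```javascript
// Toy illustration: decide whether an image may be used as a WebGL texture
// by comparing the page's origin to the image's origin. Real browsers later
// adopted CORS-based rules; this sketch only shows the same-origin idea.
function canUseAsTexture(pageUrl, imageUrl) {
  const page = new URL(pageUrl);
  const img = new URL(imageUrl, pageUrl); // resolve relative URLs against the page
  // Same origin means identical scheme, host, and port.
  return page.protocol === img.protocol &&
         page.hostname === img.hostname &&
         page.port === img.port;
}

// A cross-domain image would be rejected, closing off shader-based
// pixel extraction from other sites.
console.log(canUseAsTexture('http://example.com/page', 'http://example.com/cat.png')); // true
console.log(canUseAsTexture('http://example.com/page', 'http://other.org/secret.png')); // false
```

This is exactly the "extra annoying implementation step" the parent mentions: collaborative spaces pulling models and textures from multiple sites would need explicit opt-in from those sites.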

Re:Horrible Article (3, Insightful)

Anonymous Coward | more than 3 years ago | (#36119910)

"A website can not in any way take advantage of crashing a user's computer"

Except those crashes are usually caused by buffer overflows which eventually lead to a well-crafted attack that causes remote code execution.

Re:Horrible Article (2)

ace123 (758107) | more than 3 years ago | (#36120038)

You said usually. That's not true in 99% of cases -- if you read the article (which is unfortunately slashdotted), the specific crashes in this case are in locking up the GPU itself by taking too long to render frames. This means that the computer will reset due to its watchdog timer, not because any malicious code was executed on the CPU.

You are correct that the occasional graphics driver might be buggy, and that's why Mozilla has a whitelist of graphics manufacturers and cards that are actively patching their drivers, and a blacklist of any drivers which are out of date, incorrect, or may otherwise have bugs.

Not to mention a shader validator both in Chrome and Firefox (The ANGLE project) which will check for any malicious code. Yes, there may be bugs in any of these components, but this is defense in depth, and all of these components are going to be quickly patched and/or blacklisted if an exploit is found.
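
To be clear, ANGLE is a full GLSL ES compiler, not a pattern matcher; but the idea of static validation can be sketched in a few lines. This toy check (purely illustrative, not how ANGLE works internally) rejects shader source using constructs the WebGL profile forbids, such as loops whose trip count can't be bounded at compile time:

```javascript
// Toy illustration of static shader validation. GLSL ES for WebGL requires
// statically bounded loops and has no arbitrary jumps; a validator rejects
// source that can't be proven to terminate in bounded time.
function naiveShaderCheck(source) {
  const problems = [];
  if (/\bwhile\s*\(/.test(source)) {
    problems.push('while-loop: trip count not statically bounded');
  }
  if (/\bgoto\b/.test(source)) {
    problems.push('goto: reserved word, arbitrary jumps not allowed');
  }
  return problems;
}

const risky = 'void main() { while (true) { } }';
const tame  = 'void main() { for (int i = 0; i < 4; i++) { } }';
console.log(naiveShaderCheck(risky).length); // 1
console.log(naiveShaderCheck(tame).length);  // 0
```

A real validator parses the whole grammar, of course; the point is that rejection happens before any source ever reaches the driver's compiler.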

Re:Horrible Article (0)

Anonymous Coward | more than 3 years ago | (#36119928)

I agree.

I was looking for some example code that demonstrates this supposed DoS... not a wall of text saying "WebGL scary!!!". Display drivers can be buggy. That's not a web standards issue.

Re:Horrible Article (1)

Desler (1608317) | more than 3 years ago | (#36119984)

That's not a web standards issue.

Except for when that standard relies heavily on those very same video drivers just to work?

Re:Horrible Article (0)

Anonymous Coward | more than 3 years ago | (#36119986)

Depending on the type of crash, it could lead to code execution, or it could not. If you are absolutely sure about this, please elaborate on what type of crash this is. Just because there hasn't been a POC yet doesn't mean there will never be one; the recent triple-step dance required to get a payload to execute on Chrome should be a shining example of why certain crashes that are deemed useless are actually just starting points for the determined coder.

Re:Horrible Article (1)

ace123 (758107) | more than 3 years ago | (#36120128)

Read this article from a Mozilla dev:
http://blog.jprosevear.org/2011/05/13/webgl-security/ [jprosevear.org]

Note the last paragraph:

Nevertheless, claims of kernel level hardware access via WebGL are speculative at best since WebGL shaders run on the GPU and shader compilers run in user mode.

Therefore, unless something is horribly wrong in a particular graphics driver (and Mozilla/Google were careful which companies they whitelist in this regard), the worst case is a bug in the code compiler -- which is probably about as likely as a bug in any Javascript interpreter or Adobe Flash.

Also, they would need the "triple-step dance" for each different buggy driver, each of which would get patched in weeks at most, and each of which probably affects 0.1% of the market due to the diversity of hardware, OS and browser -- you have to consider payoff of exploit as well.

Re:Horrible Article (1)

gmueckl (950314) | more than 3 years ago | (#36120278)

Read this article from a Mozilla dev:
http://blog.jprosevear.org/2011/05/13/webgl-security/ [jprosevear.org]

Note the last paragraph:

Nevertheless, claims of kernel level hardware access via WebGL are speculative at best since WebGL shaders run on the GPU and shader compilers run in user mode.

Therefore, unless something is horribly wrong in a particular graphics driver (and Mozilla/Google were careful which companies they whitelist in this regard), the worst case is a bug in the code compiler -- which is probably about as likely as a bug in any Javascript interpreter or Adobe Flash.

You are certainly aware then that the shader compilers in the OpenGL drivers are among the buggiest compilers I've ever worked with. In my experience, a bug in any of these shader compilers is, at least at the moment, much, much more likely than one in the browser's JS interpreter.

Also, they would need the "triple-step dance" for each different buggy driver, each of which would get patched in weeks at most, and each of which probably affects 0.1% of the market due to the diversity of hardware, OS and browser -- you have to consider payoff of exploit as well.

Uhm, 0.1% of the market? Not a chance. Especially since the compilers are shared between different drivers. Again, I can't speak for ATI or Intel, but nVidia has basically exactly one driver for all their platforms and hardware variants (the current builds cover the 200 series and newer, at least, in a single driver). That's about 20 to 30 percent of the market with one single compiler vulnerability.

Note: I do not distinguish between kernel and user mode parts of the OpenGL implementation/driver here. From an OpenGL user's perspective, there's no difference.

Re:Horrible Article (0)

Anonymous Coward | more than 3 years ago | (#36120198)

Or as the OpenBSD guys say, "a bug is an exploit that hasn't been discovered yet."

Re:Horrible Article (3, Interesting)

Mysteray (713473) | more than 3 years ago | (#36120186)

I agree it's misleading to imply that there's a specific 'flaw' that leaves the GPU 'exposed'. That's the entire point of WebGL: to expose the GPU to web applications. Whether or not you think that's a good idea depends on where you fall on the security vs. functionality spectrum. It's an interesting discussion.

Look at it this way: GPUs are extremely complex hardware/software combination systems representing a huge attack surface. They're designed either for zero-cost (integrated graphics) or maximum game performance. Security has never been a big driver for this market. Newer graphics engines like WebGL allow the GPUs to be programmed with somewhat arbitrary code. These programs need lightning-fast parallel access to several different kinds of memory and the security model for this programming environment looks something like an afterthought.

Once again, the developers probably thought they didn't need to put security first since the primary use case was running trusted applications on single-user systems (e.g., games).

It's not uncommon to see crash bugs in GPU systems. They look a heck of a lot like the blue screens that used to plague MS Windows. There's no reason to think these bugs will be any less exploitable than those of Windows XP SP 0. We've seen this play out with Adobe Acrobat reader, Flash, and any number of other binary browser plugins. Hopefully the graphics developers are better, but their challenge is much harder too.

In short, all the ingredients of a recipe for disaster are present. It's probably only a matter of time before exploitable vulnerabilities surface. I don't think we should kill off WebGL altogether, but the right thing to do is to put the focus on its security.

Personally, I look forward to using it, but I'm going to turn it off by default. I'm counting on noscript to let me enable it selectively. This is just good practice anyway.
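
For anyone wanting to follow suit: in the Firefox of this era, WebGL can be switched off with the `webgl.disabled` preference in about:config, or with the equivalent line in a profile's user.js file (verify the pref name in about:config on your own build before relying on it):

```javascript
// Firefox user.js fragment: disable WebGL globally for this profile.
// Same effect as flipping webgl.disabled to true in about:config.
user_pref("webgl.disabled", true);
```

Chrome of the same era could be launched with the `--disable-webgl` command-line switch. Per-site control via NoScript, as described above, is finer-grained than either global setting.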

Re:Horrible Article (1)

firewrought (36952) | more than 3 years ago | (#36120682)

However, thanks to the media slurping this up and using words like "run arbitrary code on the GPU", "render an entire machine unusable", etc., people who read these articles and know nothing about the subject (i.e. idiots) will start to ask browser vendors to turn those features off.

The other response said it better, but you must disable WebGL if you want a secure browsing experience. It's going to take ten years for the manufacturers to get this right: you will still be reading about consequential WebGL vulnerabilities in 2021. By then, you will also see that the practicality of WebGL exploits has expanded beyond the seemingly useless list we know of currently.

I'd love to be wrong and discover that it's safe to run WebGL on even the most suspect of websites, but we've seen this flick a hundred times...

Re:Horrible Article (1)

anonymov (1768712) | more than 3 years ago | (#36120894)

Right, we've seen many security companies trying publicity stunts with half-assed theoretical threats.

That's quite a dangerous world we live in, with all those "specially constructed JPEG images that can execute arbitrary code" and all.

Right now the only plausible thing in there is DoS - which would be the fault of the drivers, not an inherent flaw in the standard.

Cross-domain image data access could be a security breach, but it won't "make it very difficult to secure".

Re:Horrible Article (1)

firewrought (36952) | more than 3 years ago | (#36122324)

That's quite dangerous world we live in, with all those "specially constructed JPEG images that can execute arbitrary code" and all.

Right now the only plausible thing in there is DoS - which would be fault of drivers, not inherent standard flaw.

Don't be so sure... it only took 14 days for the GDI+ JPEG exploit [infosecwriters.com] to go from being a "half-assed theoretic" threat to being an "actually attacking people's computers out there" threat. Security companies certainly try to scare up business--I won't disagree with you there--but theoretical threats are the larval form of actual threats.

I don't think you can just blame the driver manufacturers either. The exploits that will be found are inevitable given the history and realities of graphics driver development, and that makes it tremendously cavalier of web browser vendors to turn WebGL on by default.

Re:Horrible Article (1)

Lord_Jeremy (1612839) | more than 3 years ago | (#36120904)

Full disclosure, I know very little about WebGL other than a few things I've read and what's in this particular TFA. On the other hand, I consider myself quite knowledgable about computers in general. I'm wondering what the feasibility is of some sort of social engineering attack based around accessing the video buffers in the GPU and essentially snooping what other windows are open on your computer. Complex examples I thought of involve using OCR to determine information about the user that will convince them they are at a valid page (of whatever sort), though it could be as simple as scanning other open windows looking for credit card numbers. Every once in a while, you might get a malicious pop-up or a drive-by attack while you're paying for something on Amazon.

... or just some cheap publicity. (1)

anonymov (1768712) | more than 3 years ago | (#36121060)

I especially like this part:

Fundamentally, WebGL now allows full (Turing Complete) programs from the internet to reach the graphics driver and graphics hardware which operate in what is supposed to be the most protected part of the computer (Kernel Mode)

It elegantly scares ordinary people with sciency words and silently implies equality/correlation between "arbitrary (Turing Complete) programs sent for GPU execution via kernel-level drivers" and "arbitrary programs executed in the most protected part of the computer (Kernel Mode)".

By the way, aren't there moddable games out there that allow modders to use their own shaders?
I can't seem to remember, when was the last "a malicious shader in mod for game X exploits the vulnerability to raise privileges and install rootkit" incident, could anyone point me to one of those?

ffs (1)

Anonymous Coward | more than 3 years ago | (#36119834)

gpu shaders cannot access main memory. if there are driver errors that cause undesired side effects these should be fixed but the underlying architecture is sound.

can we please move beyond 2D graphics. go whine about real security problems. this is not one of them.

Re:ffs (1)

Desler (1608317) | more than 3 years ago | (#36119902)

go whine about real security problems. this is not one of them.

I seem to remember Microsoft telling us the same thing about ActiveX...

Re:ffs (1)

0123456 (636235) | more than 3 years ago | (#36120002)

gpu shaders cannot access main memory.

They can if you configure a texture buffer in main memory, e.g. by exploiting a driver bug to configure the buffer in the wrong place.

Though I agree that the warnings in this post and the previous one are overhyped.

Re:ffs (0)

Anonymous Coward | more than 3 years ago | (#36120022)

> gpu shaders cannot access main memory

Yes they can, at least on many machines. Integrated graphics, for example, often has no memory of its own and uses system memory for everything from textures to the framebuffer to the compiled shader. Even on dedicated adapters, often chunks of host memory are memory mapped across the bus.

Are you really confident that, say, render to texture (just for one example of many) is checking ALL offsets such that it cannot access other parts of the system memory? Can you promise that no SIMD scatter can be performed with offsets that it shouldn't?

To me this whole space seems like a rich attack surface. That being said, so is a bunch of other stuff like flash. In the end, it has to come down to common sense. If you run random shit from random sites, sooner or later you will have a problem.

Re:ffs (0)

Anonymous Coward | more than 3 years ago | (#36120400)

"gpu shaders cannot access main memory."

Can people stop modding crap like this up? Of course a GPU shader can access the main system memory. It's quite common for system memory to be mapped into the GPU's address space. Even more, some GPUs have NO MEMORY OF THEIR OWN. Where do you think they store textures - the ether?

For a tech site, there is an awful lot of ignorance about basic PC architecture around.

Re:ffs (0)

Anonymous Coward | more than 3 years ago | (#36121490)

gpu shaders cannot access main memory.

I'm using an IGP with Hypermemory, you insensitive clod!

DoS or Crash (0)

Anonymous Coward | more than 3 years ago | (#36119860)

Neither of these is a "serious security threat". If you lock up my GPU while I'm browsing the WWW, that's not a big deal. Annoying? Yes. Serious? No.

Talk to me when you can execute arbitrary code as the browser process user (or root).

Interesting commentary from a Firefox dev (0)

Anonymous Coward | more than 3 years ago | (#36119932)

http://blog.jprosevear.org/2011/05/13/webgl-security/

Re:Interesting commentary from a Firefox dev (1)

Desler (1608317) | more than 3 years ago | (#36119942)

Yes, because the place to get a neutral opinion is someone who works for the company who was the originator of WebGL. Gee, no conflict of interest there...

How About the Response, Slashdot? (3, Informative)

asa (33102) | more than 3 years ago | (#36119970)

Slashdot really should have published a link to this response from Mozilla http://blog.jprosevear.org/2011/05/13/webgl-security/ [jprosevear.org]

Re:How About the Response, Slashdot? (1)

Desler (1608317) | more than 3 years ago | (#36120678)

Yes, the most non-biased source of information is an employee of the company who originated the technology and was a founding member of the standardization group.

Been there, crashed X11 (2)

greed (112493) | more than 3 years ago | (#36120598)

I've already had a newer version of Firefox crash an older X11 display driver. Absolutely rock solid on Firefox 3.6 and down, and every other program I want to run. But the new GPU acceleration in Firefox? Could cause all of X11 to go away.

And Flash inside Firefox would pretty much guarantee a visit from the Coredump Gods. Fixed with a newer driver, but man is updating annoying.

Kinda sad that "web browsing" is the most intense job run on my work machine. I would have thought the massively parallel builds, or data simulation code, or something that actually pegs all the CPUs would be it. But no, it's the web.

it is all fear mongering (1)

Technomancer (51963) | more than 3 years ago | (#36121262)

First, WebGL sends shader source code to the browser and the code is compiled and executed in OpenGL. This is no different from running any other OpenGL program on your machine. The remote attacker cannot make the GPU execute arbitrary hardware instructions, only whatever source he sends.
The shaders pretty much execute in a sandbox (a shader on the GPU can only access the buffers bound to it: textures, vertex buffers, constant buffers, render targets, etc.). Access outside these buffers is not possible because the hardware enforces it (there is no way to even address outside a texture or render target). It is a little more complicated with compute shaders, which have somewhat more flexible addressing, but they still cannot access anything outside the global buffer (or OpenCL address space). It is like segment-based protection in CPUs.

The latest GPUs have an actual page table and VM, so on top of the protection from "segment"-based addressing, there is also VM/page-table-based protection which only allows a particular GPU context to access pages that have been allocated and mapped into its VM.

The only real problem is the possibility of a DoS attack, caused by the fact that GPUs are not preemptable. Therefore if you send some complicated geometry, or you write a shader that takes a very long time to execute (multiple nested loops + many pixels/vertices or compute threads), the draw can execute for a very long time. On Vista and later this will cause TDR and kill the offending process. It happens all the time if you develop games or GPU compute apps. The only way to disable the watchdog is with a registry setting. On XP the watchdogs are implemented in the kernel part of the graphics driver (ATI VPU Recover, and whatever nVidia has).
This DoS is a bit more of a problem on Linux, since it doesn't have good watchdogs; the DoS should not crash X, but it will definitely lock the UI. Also, if you bother to take a look at the shader docs for AMD http://www.x.org/docs/AMD/r600isa.pdf you will see that the instruction set does not allow for truly infinite shaders: there are no arbitrary jumps, the loops cannot run forever (max loop count is 2^31), and the flow control is only structured and easily verifiable. It is different for the nVidia ISA, which looks more like a regular CPU's and I think can do infinite loops.

On the upside, Linux DRM drivers in the kernel have pretty good command buffer parsers and validators, so it is hard for a user-space driver to access memory that doesn't belong to it. On Vista and later the user-space driver doesn't even know the GPU-side addresses of its allocations and sends every render buffer with an allocation and patch list which is resolved and patched by VidMM and the kernel mode driver; see the D3DKMTRender function etc. http://msdn.microsoft.com/en-us/library/ff547145%28v=vs.85%29.aspx

Still Much Ado About Nothing (2)

KewlPC (245768) | more than 3 years ago | (#36121364)

As with the previous article, this is much ado about nothing.

The GPU can only run "arbitrary code" in the loosest possible sense. What happens is that an OpenGL or WebGL application gives the shader source code to the driver, which then compiles it into the native GPU instructions. You *can* pre-compile your shaders in OpenGL ES 2.0, but even then it's just intermediary bytecode, and the bytecode is vendor-specific.

Furthermore, GLSL, the language used for OpenGL and WebGL shaders, is *very* domain-specific. It has no pointers, and no language support for accessing anything outside the GPU other than model geometry and texture data. *AND* it can only access the model geometry and texture data that the application has provided to it, and for GPUs that don't have any on-board VRAM, it's up to the *driver* to determine where in shared system memory the texture will be located.

And you can't get around using shaders on modern GPUs. Modern GPUs don't have a fixed function pipeline, it's not in the silicon at all. For apps that try to use the old OpenGL fixed function pipeline, the driver generates shaders that do what the fixed function pipeline *would* have done based on the current state. Drivers won't keep emulating the old fixed function pipeline forever, though.
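
The source-in, driver-compiles-it flow described above uses the standard WebGL calls; here it is wrapped in a small helper (the helper itself is my own sketch, but the `gl.*` calls are the real WebGL API; `gl` is a WebGLRenderingContext obtained from a canvas):

```javascript
// The page supplies GLSL *source* text; the driver compiles it into
// vendor-specific native GPU code. The page never handles GPU machine code.
function buildShader(gl, type, source) {
  const shader = gl.createShader(type);   // type: gl.VERTEX_SHADER or gl.FRAGMENT_SHADER
  gl.shaderSource(shader, source);        // hand source text to the driver...
  gl.compileShader(shader);               // ...which compiles it natively
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    const log = gl.getShaderInfoLog(shader);
    gl.deleteShader(shader);
    throw new Error('shader compile failed: ' + log);
  }
  return shader;
}
```

That driver-side compile step is exactly why the debate centers on driver quality: the shader compiler, not the web page, is the component handling untrusted input.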

Re:Still Much Ado About Nothing (1)

KewlPC (245768) | more than 3 years ago | (#36121516)

Also, Windows (the most likely target of any attack) has had the ability since Vista to restart the GPU if it hangs (which is the only real attack possible when it comes to shaders: use a shader so computationally intensive that the GPU becomes unresponsive). This isn't bulletproof, of course, but if Windows isn't able to restart the GPU after a few seconds of unresponsiveness, then that's a *Windows* bug.

All well and good, but? (-1)

Anonymous Coward | more than 3 years ago | (#36121886)

"... but it did little to satisfy Context — the firm argues that inherent flaws in the design of WebGL make it very difficult to secure."

So what do they have to say about the Windows OS?

It's been done before (0)

Anonymous Coward | more than 3 years ago | (#36122374)

Don't you remember how the hacking was done in Hackers?
