
WebGL Poses New Security Problems

CmdrTaco posted more than 2 years ago | from the but-this-time-its-in-three-dee dept.

Graphics

Julie188 writes "Researchers are warning that the WebGL standard undermines existing operating system security protections and offers up new attack surfaces. To enable rendering of demanding 3D animations, WebGL allows web sites to execute shader code directly on a system's graphics card. This can allow an attacker to exploit security vulnerabilities in the graphics card driver and even inject malicious code onto the system."
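For context, here is a minimal sketch of the mechanism the summary describes: any page that can run JavaScript can hand a GLSL source string straight to the graphics driver's shader compiler through the WebGL API. The canvas id and shader body below are illustrative, not taken from the researchers' proof of concept.

    // Sketch: how a page reaches the GPU driver through WebGL.
    var canvas = document.getElementById("demo-canvas"); // hypothetical element
    var gl = canvas.getContext("experimental-webgl");    // vendor-prefixed in 2011-era browsers

    var shader = gl.createShader(gl.FRAGMENT_SHADER);
    // This string is passed, more or less verbatim, to the graphics
    // driver's GLSL compiler -- the new attack surface in question.
    gl.shaderSource(shader,
        "precision mediump float;" +
        "void main() { gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); }");
    gl.compileShader(shader);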

178 comments

WebGL was always a bad idea (2, Insightful)

Anonymous Coward | more than 2 years ago | (#36086718)

Now that we finally have sandboxing in browsers, they want to let any website run code directly on your hardware. Insane! Just forget the WebGL stuff. Silverlight has direct support for XNA, which handles everything better and more safely anyway. Are we also supposed to write WebGL games with Notepad? At least XNA games can be written with a solid IDE like Visual Studio. Not only that, but the games also work on the Xbox 360 and mobile phones without major porting. What a developer's dream...

Leave my hardware alone and secure!

Re:WebGL was always a bad idea (1, Insightful)

Anonymous Coward | more than 2 years ago | (#36086856)

Silverlight has direct support for XNA
[snip]
a solid IDE like Visual Studio.
[snip]
Not only that, but the games also work on the Xbox 360 and mobile phones
[snip]
Leave my hardware alone and secure!

Translation: "WebGL is insecure! Use Microsoft products instead!"

Sorry, I need to stop laughing hysterically before I can post any more.

Re:WebGL was always a bad idea (1)

Anonymous Coward | more than 2 years ago | (#36086940)

It may be ironic, but he's right: even the MS solution is inherently more secure than WebGL. Security vulnerabilities have to break out of a sandbox, and a sandbox can be fixed much more easily than a straight passthrough to various graphics drivers.

Re:WebGL was always a bad idea (1)

Unoriginal_Nickname (1248894) | more than 2 years ago | (#36087356)

The main vector of attack in WebGL is through shaders. Silverlight doesn't support shaders (it only supports the Reach profile), which - for better or for worse - means it really is more secure.

Astroturf much? (1)

SanityInAnarchy (655584) | more than 2 years ago | (#36086868)

Now that we finally have sandboxing in browsers, they want to let any website run code directly on your hardware.

As opposed to what?

Keep in mind that sandboxing is a way to deal with websites having direct hardware access. That's how they can do NaCl [google.com] safely.

Silverlight has direct support for XNA, which handles everything better and more safely anyway.

What's "better" or "safer" about XNA? Never mind that nothing's stopping you from porting frameworks to this anyway -- having a lower-level standard means that, ultimately, you can have something like XNA, and others can have other things.

Never mind that XNA doesn't have a particularly good OpenGL implementation, and that Silverlight is barely an open standard as it is. Why on earth would we go from one proprietary plugin to another, when we can just use the native, open Web?

Are we also supposed to write WebGL games with notepad?

If you like. Or with TextMate, or with Eclipse, and you've got tools like Firebug and Chrome's Developer Tools.

Never mind that Chrome is actually cross-platform and free-as-in-beer, and Chromium is open source. Why should I have to boot Windows, let alone pay a massive licensing fee, in order to use a giant, clunky IDE just to build a web app?

I remember developing HD-DVD using Visual Studio. It was the best tool we had, which was really sad -- no way to make that work in Eclipse, and no good way to debug otherwise. Never mind that the whole thing would only work on XP -- not Vista or 7. Do you have any idea what a relief it was to start doing web development again, where I could have my choice of tools, OSes, and working environment, instead of being stuck on an admin account in XP?

Not only that, but the games also work on the Xbox 360 and mobile phones without major porting.

Know what else mobile phones have? The web. And it actually works natively, as opposed to attempts to get XNA working on Android and iOS, which are kludges running on top of Mono.

Re:Astroturf much? (0)

Anonymous Coward | more than 2 years ago | (#36087004)

Well, someone got their ass trolled.

Re:Astroturf much? (0)

Anonymous Coward | more than 2 years ago | (#36087246)

No, that's not how Native Client is done safely. That's how modern ActiveX is done "safely." Sandboxing in Native Client is a "we fucked up" last line of defense, for when everything has gone totally wrong.

Re:Astroturf much? (2)

Unoriginal_Nickname (1248894) | more than 2 years ago | (#36087250)

XNA doesn't have a particularly good OpenGL implementation? As someone with years of experience with XNA, Direct3D, OpenGL, and even some experience with WebGL, I feel ethically responsible for saying that you have absolutely no idea what you're talking about.

Re:Astroturf much? (0)

Anonymous Coward | more than 2 years ago | (#36088360)

XNA doesn't have any OpenGL implementation so far as I'm aware, so he's right.

Xbox 360 doesn't have the web (1)

tepples (727027) | more than 2 years ago | (#36087266)

Not only that, but the games also work on the Xbox 360 and mobile phones without major porting.

Know what else mobile phones have? The web.

Xbox 360 doesn't have the web. What do you recommend to get a game ported to a console?

Re:Xbox 360 doesn't have the web (1)

segin (883667) | more than 2 years ago | (#36088020)

Not only that, but the games also work on the Xbox 360 and mobile phones without major porting.

Know what else mobile phones have? The web.

Xbox 360 doesn't have the web. What do you recommend to get a game ported to a console?

PSN hack aside, the PLAYSTATION3 *does* have the web (WebGL support is another story). Without a PSN account, even! What say you now?

Re:Xbox 360 doesn't have the web (1)

tepples (727027) | more than 2 years ago | (#36088110)

the PLAYSTATION3 *does* have the web [...] What say you now?

I've heard it's rubbish [suite101.com], even compared to the Android browser that runs on less RAM. Does the PS3 web browser support even the 2D canvas? And how does it expose button presses on the controllers as events? Google failed me somehow.

Re:WebGL was always a bad idea (1)

armanox (826486) | more than 2 years ago | (#36086880)

But with Silverlight you'd lose a lot of platforms too - many phones, the PS3 and Wii, Linux, Mac, UNIX, etc.

Re:WebGL was always a bad idea (0)

Anonymous Coward | more than 2 years ago | (#36087042)

I think the Microsoft shill decided to start posting anonymously.

Re:WebGL was always a bad idea (1)

Anonymous Coward | more than 2 years ago | (#36087066)

Forgo iOS and Android? Relegate yourself to Microsoft's little vertical romper room if you want.

Forgo iOS and Android or forgo Xbox 360 (1)

tepples (727027) | more than 2 years ago | (#36087288)

If I use .NET, I forgo iOS and Android. If I use anything else, I forgo Xbox 360. So are people supposed to make one game exclusively for Xbox 360 and a completely unrelated game for iOS and Android?

Relegate yourself to Microsoft's little vertical romper room if you want.

If I ignore Xbox 360, I relegate myself to single-player and online multiplayer, as opposed to local multiplayer within a household. PCs can be connected to HDTVs and gamepads, but the majority of people appear unwilling to do so.

Really? (4, Funny)

internerdj (1319281) | more than 2 years ago | (#36086752)

I mean what could possibly be dangerous about allowing random websites to run hardware level code?

Re:Really? (1)

binarylarry (1338699) | more than 2 years ago | (#36086854)

GLSL and HLSL are about as hardware level as javascript.

Re:Really? (-1)

Anonymous Coward | more than 2 years ago | (#36086884)

That's the stupidest thing I have ever read on this website. Congratulations.

Re:Really? (1)

binarylarry (1338699) | more than 2 years ago | (#36087036)

Ever wonder why HLSL stands for High Level Shading Language?

You should ponder it for a while and get back to me.

Re:Really? (1)

spun (1352) | more than 2 years ago | (#36087176)

http://en.wikipedia.org/wiki/HLSL [wikipedia.org]

Hmm, sure as hell looks a lot more hardware-level than javascript to me. I've never heard of javascript execution depending on the hardware you have, but the features of this HLSL and GLSL stuff seem to be closely tied to the particular make and model of graphics card you own. For instance, the question "can Javascript do X?" can be answered without knowing what hardware it is running on. The question "can HLSL or GLSL do X?" cannot always be answered without knowing what hardware it is running on.

To make the example specific, let me ask you: can HLSL or GLSL do geometry shading? Answer yes or no without reference to hardware specifications.
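To make the hardware dependence concrete: a WebGL program has to interrogate the card at runtime, because the answers genuinely differ per GPU and driver. A sketch, assuming an already-created gl context:

    // These values vary by hardware; the spec guarantees only minimums.
    console.log(gl.getParameter(gl.MAX_TEXTURE_SIZE));           // e.g. 2048 on one card, 16384 on another
    console.log(gl.getParameter(gl.MAX_VERTEX_UNIFORM_VECTORS)); // likewise hardware-dependent
    console.log(gl.getSupportedExtensions());                    // the optional-feature list differs per driver

(WebGL itself exposes no geometry shaders at all; on the desktop, GLSL and HLSL expose them only where the hardware and driver support them.)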

Re:Really? (1)

tepples (727027) | more than 2 years ago | (#36087344)

To make the example specific, let me ask you: can HLSL or GLSL do geometry shading?

Can HTML5 do Canvas, JavaScript Web Workers, SVG filters, and XMLHttpRequest 2? That's environment-specific too. Just as one has to fall back to other methods when certain HTML-family technologies turn out unavailable, one has to fall back to other methods when geometry shaders turn out unavailable.

Re:Really? (1)

spun (1352) | more than 2 years ago | (#36087420)

Who asked anything about HTML5? Who said anything about "environment specific"? Your entire post is a non sequitur, relating to nothing that was actually under discussion. Let me refresh your memory: we were comparing HLSL and GLSL to javascript and asking which was more hardware-specific. Now, do you have anything to add that is relevant to the topic under discussion?

Reference software renderers (1)

tepples (727027) | more than 2 years ago | (#36087518)

Are you talking about theory or practice? As I understand it, in theory, HLSL and GLSL are not hardware-specific because they have reference software renderers that can run on a CPU, albeit with drastically reduced performance. Even in practice, I understand that Intel's GMA offloads vertex and geometry shading to the CPU, for example.

Re:Reference software renderers (1)

spun (1352) | more than 2 years ago | (#36087682)

Let me summarize the conversation so far. Someone said "GLSL and HLSL are about as hardware level as javascript." Someone else disagreed in no uncertain terms. I gave evidence that GLSL and HLSL were not as hardware level as javascript. You attempted to sidetrack the discussion into the realms of HTML5 and "environment" specificity. When I pointed out that that was off topic, you brought up some supposed "theory versus practice" debate, as if that would actually change the answer to the original question: Are GLSL and HLSL about as hardware level as javascript?

However, I have since determined that this question is itself a red herring, as the original poster appeared to be using "hardware level" when the concept he really meant to address was security: can these languages be used to directly manipulate hardware, bypassing normal operating system security measures? I think the answer to that is actually, "No. Not really any more than javascript."

So! I believe we've solved the dilemma in such a way that everyone can go home thinking they were right all along.

Re:Really? (0)

Anonymous Coward | more than 2 years ago | (#36087188)

Don't make it worse. Every post just convinces more people that the AC above was right in his assessment.

Re:Really? (0)

Anonymous Coward | more than 2 years ago | (#36086894)

s/javascript/c

Re:Really? (1)

binarylarry (1338699) | more than 2 years ago | (#36086960)

orly? Since when do you get pointers, direct memory access, etc.?

I must have missed that part of writing shaders when I write them! Maybe you could bang out a tutorial!

Usually your shader code is compiled into an intermediate format by the driver, and that is then JIT'd into something executable on the GPU. Even in the case of OpenGL ARB shaders, which look like assembly, those are just converted into yet another internal assembly language, optimized, and executed on the GPU.
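That driver-side compile step can be observed from the WebGL side. A hedged sketch, reusing a shader object created earlier; the WEBGL_debug_shaders extension is optional and far from universally exposed:

    gl.compileShader(shader);
    if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
        // The info log comes from the driver's own GLSL compiler,
        // not from the browser.
        console.log(gl.getShaderInfoLog(shader));
    }
    // Where available, this optional extension reveals the translated
    // source the browser actually hands to the driver:
    var dbg = gl.getExtension("WEBGL_debug_shaders");
    if (dbg) {
        console.log(dbg.getTranslatedShaderSource(shader));
    }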

Re:Really? (1)

Anonymous Coward | more than 2 years ago | (#36086980)

The shaders CAN cause problems. There is a reason why the open-source Linux GPU kernel drivers do command stream filtering and command parameter checking on the extremely hot path between everything else and the GPU. Mind you, no proprietary driver does that, because the performance hit is *quite* noticeable.

And on IOMMU-less systems (or where it is turned off or set to bypass, which is the vast majority), chances are high that any PCIe GPU can directly access all of the system RAM.

Do the math.

Re:Really? (0)

Anonymous Coward | more than 2 years ago | (#36087134)

And yet despite all that you can still use Javascript to execute arbitrary code!

Integer overflow in the NewIdArray function in Mozilla Firefox before 3.5.16 and 3.6.x before 3.6.13, and SeaMonkey before 2.0.11, allows remote attackers to execute arbitrary code via a JavaScript array with many elements.

still? (1)

_0xd0ad (1974778) | more than 2 years ago | (#36087164)

You keep using that word. I do not think it means what you think it means.

Mozilla Firefox before 3.5.16 and 3.6.x before 3.6.13, and SeaMonkey before 2.0.11

Re:Really? (1)

peragrin (659227) | more than 2 years ago | (#36087160)

Ask Adobe. Flash requires direct hardware support in order to function. It is why Flash performance on Windows is faster than on all other OSes combined.

Re:Really? (0)

Anonymous Coward | more than 2 years ago | (#36087310)

That might be because Windows has more users than all other OSes combined, and they bother to get Flash working properly (or, in the case of Apple machines, are able to).

Funny how that works!

Re:Really? (1)

Anonymous Coward | more than 2 years ago | (#36087618)

Well I sure hope it's better than all other OSs combined. If I'm on Windows, I don't want to wait for a video to render on OS X, then on Linux, before I watch it!

Re:Really? (0)

Anonymous Coward | more than 2 years ago | (#36087308)

It's ALL hardware level code, bro, and always was. The question is how many layers are there between the hardware and the scripting layer, and whether there are any leaks.

With record-setting speed and efficiency! (5, Funny)

jeffb (2.718) (1189693) | more than 2 years ago | (#36086770)

You can dedicate hundreds of threads to high-volume malware, while freeing up your CPU to maintain a smooth phishing experience!

Glad I'm not using Binary Blob drivers (3, Interesting)

xiando (770382) | more than 2 years ago | (#36086792)

An attack based on "exploit[ing] security vulnerabilities in the graphics card driver" seems less likely with the FOSS graphics drivers. I'm not saying they cannot be exploited, I'm just saying that this makes me feel somewhat safer than I would feel if I were using the closed Binary Blob drivers.

Re:Glad I'm not using Binary Blob drivers (3, Insightful)

MrEricSir (398214) | more than 2 years ago | (#36086838)

Do any FOSS drivers even support shaders?

Re:Glad I'm not using Binary Blob drivers (5, Funny)

Anonymous Coward | more than 2 years ago | (#36087046)

probably why he feels more secure!

Re:Glad I'm not using Binary Blob drivers (1)

binkzz (779594) | more than 2 years ago | (#36087254)

Do any FOSS drivers even support shaders?

No, OpenGL only recently added support for Phong shading. But one day!

Re:Glad I'm not using Binary Blob drivers (2)

m50d (797211) | more than 2 years ago | (#36087182)

Given the general quality level I'd have to disagree; the NVIDIA drivers work perfectly, even supporting e.g. S3 suspend, and most of the code is the same as their (quite widely tested) Windows driver. Whereas with the free Intel drivers I still get lots of random display corruption, and those are supposed to be on the good end of free drivers.

Info and thoughts (4, Informative)

TopSpin (753) | more than 2 years ago | (#36086796)

WebGL is a Javascript expression of OpenGL ES 2.0, the same OpenGL edition that appears on Apple's iOS and recent versions of Android. OpenGL ES 2.0 is essentially OpenGL 2.0 with the fixed function pipeline removed. This reduces the size of the API substantially.

Some may remember that the little ARM11-based computer that appeared [slashdot.org] last week supports OpenGL ES 2.0. OpenGL ES 2.0 is also the choice of the Wayland developers. There seems to be a big convergence happening around this particular edition of OpenGL, driven by embedded GPUs.

WebGL is manifested as a context of an HTML5 canvas element. Javascript has been extended with new data types that provide aligned, dense arrays for loading vertex attributes into the GL. WebGL allows vertex and fragment shader code to be loaded into the GL.

The end result is very high performance graphics driven by Javascript hosted in a browser. WebGL integrates with the browser in some convenient ways; texture data is loaded from Javascript image objects and CSS can apply 3D transforms, for example.
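A sketch of those two integration points; the buffer contents and image URL are illustrative:

    // Aligned, dense vertex data via the new typed arrays:
    var vertices = new Float32Array([
         0.0,  0.5,
        -0.5, -0.5,
         0.5, -0.5
    ]);
    var buf = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, buf);
    gl.bufferData(gl.ARRAY_BUFFER, vertices, gl.STATIC_DRAW);

    // Texture data loaded from an ordinary Javascript image object:
    var img = new Image();
    img.onload = function () {
        var tex = gl.createTexture();
        gl.bindTexture(gl.TEXTURE_2D, tex);
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, img);
    };
    img.src = "texture.png";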

WebGL has been supported in experimental form by WebKit and Mozilla since late 2010. Opera also supports WebGL. Microsoft is nowhere to be found.

Operating systems compromise security for the sake of GPUs. Obviously, exposing graphics subsystems to inevitably malicious code will get machines compromised. I think Google, Mozilla, et al. should adopt the 'no-script' paradigm for this stuff and require the operator to explicitly enable WebGL content case by case. The graphics subsystem will never prioritize security over performance so securing these code paths well enough for public exposure will never happen.

It would be nice if they gave this some thought before millions of people get owned and WebGL gets a huge black eye...

Re:Info and thoughts (0)

Anonymous Coward | more than 2 years ago | (#36086892)

Javascript is a language for use on the Web. You know, hypertext: text, links, and images.
There's no reason ever for this to have access to your GPU.
Are people clamoring to play processing-intensive 3D games in a web browser?

Re:Info and thoughts (0)

Anonymous Coward | more than 2 years ago | (#36086976)

The web should always be the same as it was in 1989! It was created perfect for all future uses, and all deviation is merely corruption!

Down with progress!

Who wants to play video games anyway!

That's not to say these security problems aren't real. They most certainly are. But to actually answer the question, "are people clamoring to play processing-intensive 3D games in a web browser?"

Yes. Yes they are. They play simple flash games now, and that sort of genre is likely to always exist, but the distinction between "browser games" and "outside the browser games" is not one that typical people want to uphold. There are technical reasons to make that distinction, but that's about it.

Re:Info and thoughts (1)

SanityInAnarchy (655584) | more than 2 years ago | (#36087012)

I am. Certainly, I'm clamoring to develop processing-intensive 3D games for web browsers.

Think of the success of free-to-play stuff online now. Imagine you could just click a link and jump into a demo: no downloading, no security risk (assuming this crap gets fixed), and you could try games out without the "download random crap" problem. Even once you decide to pay for it, compare this to approaches like Steam -- the instant gratification of having a game right now with the rest of it progressively downloaded as needed, versus waiting a few hours for it to download, and that's on my 100 mbit fiber connection.

You can actually see some of this effect by, say, buying Half-Life on Steam and then just starting to play -- it won't be long till you can actually start playing, and then it's just 30 seconds to load a level instead of half a second. The downside is you still need to download Steam, and they don't continue downloading in the background, or in any particularly smart way, so you pretty much always need to load the next level from the Internet.

It's not immediately obvious how much of an improvement this would be. Think about the difference between YouTube and BitTorrent. If it's a movie, I'd rather just download it; I can wait. Even on YouTube, it took a while before I saw it -- I would mostly download YouTube videos and play them in mplayer to avoid the Flash overhead, which would seriously lag and get out of sync on Linux, though it's better now. But compare that process to just following a link -- or, better yet, clicking a "play" button in the middle of a webpage. The dynamics are entirely different -- it's not just a matter of being faster or more convenient; you end up viewing content in a different way, and you end up with a community that simply would not work if it were built around downloading videos first and then playing them.

And, be honest, how many hours have you wasted on Flash games? What if those didn't suck?

Never mind that it would kick Flash's ass for the places where Flash actually sort of almost makes sense now. Google Street View? Do that with better performance and no plugins. Google Earth? No plugins, and someone could literally give you a hyperlink to a location.

Re:Info and thoughts (0)

Anonymous Coward | more than 2 years ago | (#36087136)

And now I know why so many douchebags are still developing for Flash.

Re:Info and thoughts (1)

0123456 (636235) | more than 2 years ago | (#36087272)

Even once you decide to pay for it, compare this to approaches like Steam -- the instant gratification of having a game right now with the rest of it progressively downloaded as needed, versus waiting a few hours for it to download, and that's on my 100 mbit fiber connection.

Kind of like Guild Wars, you mean? An installer that's a couple of megabytes and downloads the rest as required?

Re:Info and thoughts (0)

Anonymous Coward | more than 2 years ago | (#36087110)

Yes, clearly there's a market for them. You may not be in said market but it clearly exists. Not all Flash games are Farmville-like crap. Developers very much want an open platform to deliver their games and clearly there is a market for their games. There's no surprise that there is a push for this.

Ultimately, even if we solve the security issues, I can't say that this will be an overall positive thing for gaming, but I can say it will happen, because there's a huge desire on both sides of the equation.

JavaScript, GPU, and graphics (2)

tepples (727027) | more than 2 years ago | (#36087400)

Javascript is a language for use on the Web. You know, hypertext: text, links, and images.

Another word for "images" is "graphics".

There's no reason ever for this to have access to your GPU.

"GPU" stands for "graphics processing unit". You appear to claim that a language for use in a graphical environment shouldn't have access to capabilities exposed by a graphics processing unit. Can you clarify?

Are people clamoring to play processing-intensive 3D games in a web browser?

Yes. Otherwise, Adobe would not have added rudimentary 3D capability to Adobe Flash.

Re:JavaScript, GPU, and graphics (0)

Anonymous Coward | more than 2 years ago | (#36087584)

Yes. Browsers have been able to display images for 25 years or so. Your CPU needs no help with that.
A web browser is not a video game console; it's for rendering pages of hypertext.

Re:JavaScript, GPU, and graphics (2)

tepples (727027) | more than 2 years ago | (#36087688)

A web browser is not a video game console

Millions of FarmVille fans would disagree with your claim. Browser-based video games have been around since Flash and DHTML first came into use.

Re:Info and thoughts (0)

Anonymous Coward | more than 2 years ago | (#36086962)

I can't be the only one who is completely flabbergasted that the same folks who complain about the inherent insecurity of browser plugins that extend the browser to "do more" than it currently can are enthusiastically driving toward a world where the browser itself is the vehicle for those attack vectors, sans plugins.

Re:Info and thoughts (1)

SanityInAnarchy (655584) | more than 2 years ago | (#36087146)

If WebGL breaks, I can disable it, fix it in my own browser (it's open source), or switch to a different browser with an entirely different implementation. If it breaks because of my video driver, I can upgrade my video driver, or switch to a different one, even swap out my entire video card, or run it on top of a software GL implementation.

If Flash breaks, I can either disable it and wait till Adobe gets off their lazy asses and fixes it, or leave it enabled and thus be vulnerable.

Hey, I'm not even necessarily anti-plugin. If Adobe Reader breaks, I can switch to Chrome and its built-in PDF viewer, or I can switch to one of several excellent PDF viewers -- I like Okular on Linux. The reason this doesn't bother me is that PDF actually is an open standard, not just in theory (like Silverlight or Flash), but in practice, with several excellent competing implementations.

If Silverlight breaks, I can either disable it till Microsoft fixes it, or switch to Mono and Moonlight -- which is about as effective as avoiding Flash vulnerabilities by switching to Gnash. In other words, for the majority of websites requiring Silverlight (or Flash), this is effectively the same as disabling it anyway.

By contrast, if Java breaks, there's OpenJDK. Not as nice as JavaScript, where there are three or four good engines in separate browsers, but there's still at least the possibility of fixing it myself, and wasn't Apache building their own VM anyway?

Now, because we have so many different browsers with entirely different implementations of the actual core web technologies, when a new technology like WebGL is proposed, you either need one open source implementation with a liberal enough license for code to be shared between browsers, or you need a brand-new implementation for each browser, and the latter seems to be happening more often -- especially for stuff like Canvas. Because there are so many browsers, they are actually competing on who can implement the most features in the most standards-compliant way, so this stuff actually gets done.

By contrast, even with a very good open-source plugin, there's no guarantee that there's more than a single implementation, so there's a chance for a buggy implementation to become the de facto standard (if there's a disagreement between the Silverlight standard and MS Silverlight, which wins?), and there's a chance for a proprietary implementation to become a de facto standard whether or not there's an "open standard" to appeal to (how much of the Web is on Flash now?). In fact, Flash pretty much proves that it only takes one plugin to plunge us back into the days of IE6 in terms of one proprietary implementation -- security is impacted, but so is everything else.

Fix it in your own Gnash (1)

tepples (727027) | more than 2 years ago | (#36087430)

If Flash breaks, I can either disable it and wait till Adobe gets off their lazy asses and fixes it, or leave it enabled and thus be vulnerable.

Or switch to Gnash and Moonlight, and fix any deficiencies in your own copy the same way you claimed that you can "fix [WebGL] in your own browser". Ultimately, your complaint appears to be that you haven't had the opportunity to help Gnash and Moonlight become viable.

Re:Info and thoughts (1)

pak9rabid (1011935) | more than 2 years ago | (#36088068)

It would be nice if they gave this some thought before millions of people get owned and WebGL gets a huge black eye...

WebGL has been supported in experimental form by WebKit and Mozilla since late 2010. Opera also supports WebGL. Microsoft is nowhere to be found.

Well, that sounds like a good first step.

Re:Info and thoughts (1)

Lennie (16154) | more than 2 years ago | (#36088318)

I'm much more worried about the 'hardware acceleration' browser makers are using. WebGL is a very well-defined subset of OpenGL.

Anyway, I've seen many browsers, drivers, operating systems, and other parts of the stack fail on it.

I think it will take time for the driver, operating system, and browser makers to get it right.

You can also look at it differently: we've had exploits with images; do people disable images by default?

If you want to disable something, do it like HTML5 Geolocation and pop up a bar: "Do you want to share location information with this site?"
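For reference, the Geolocation model looks like this from the page's side; the browser interposes its permission bar before the success callback ever fires. (This is a sketch of the existing Geolocation API; no equivalent prompt exists for WebGL today.)

    // Calling the API is what triggers the browser's permission bar.
    navigator.geolocation.getCurrentPosition(
        function (pos) { console.log(pos.coords.latitude, pos.coords.longitude); },
        function (err) { console.log("Denied or failed: " + err.message); }
    );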

Gee (1)

TimeElf1 (781120) | more than 2 years ago | (#36086826)

Something that was just put out undermines the security of something else... I never saw that one coming.

And... (0)

Anonymous Coward | more than 2 years ago | (#36086852)

The only difference is this provides access to driver vulnerabilities. But those same unsuspecting users are likely to download random games and run them locally anyway.

The outcome seems surprisingly similar to Flash:
This vulnerability (CVE-2011-0609) could cause a crash and potentially allow an attacker to take control of the affected system.

Quake Live (3, Interesting)

mfh (56) | more than 2 years ago | (#36086902)

I raised this concern with Quake Live, but was quickly shut down by people. Nobody wants to listen to possible security holes in something they want to ram through at all costs. Forgive my tone if I'm a little annoyed hearing this. Sometimes you want to be wrong about something, but now that I have been proven correct, I'm annoyed with myself.

Re:Quake Live (2)

jonescb (1888008) | more than 2 years ago | (#36086994)

Quake Live isn't WebGL. Even if it were, Quake Live wouldn't be affected. This security concern is about a user visiting a web page that runs a malicious WebGL program, and Quake Live isn't malicious.

I don't get it (1, Insightful)

multi io (640409) | more than 2 years ago | (#36086908)

So they're saying that enabling shader code execution allows web sites to exploit hypothetical vulnerabilities in the graphics driver? How's that different from saying that enabling Javascript code execution allows web sites to exploit hypothetical vulnerabilities in the Javascript interpreter?

Re:I don't get it (2)

Lord Bitman (95493) | more than 2 years ago | (#36086986)

because up until now the response from graphics card manufacturers has been "security auditing? Open specifications? What, are you running arbitrary binaries on your PC and complaining when they take over your system?", whereas now they'll need to say "security auditing? Open specifications? What, are you running a web browser conceived of before 2010?"

Re:I don't get it (1)

Anonymous Coward | more than 2 years ago | (#36087130)

JS interpreter gets CPU protections, user mode execution, limited user access, and a restricted context (low integrity mode).

OTOH some shader languages are very simplistic, extremely trivial to secure.

Re:I don't get it (3, Insightful)

amorsen (7485) | more than 2 years ago | (#36087332)

So they're saying that enabling shader code execution allows web sites to exploit hypothetical vulnerabilities in the graphics driver?

They're not particularly hypothetical. Graphics driver code is such that games programmers carefully work around bugs in order to not crash anything. Imagine if every program running on the main CPU had to carefully avoid certain instruction sequences in order to not crash the system -- would you run a multi-user system on that?

Then again, that was how it was in the 80's on many time sharing systems...

Re:I don't get it (0)

Anonymous Coward | more than 2 years ago | (#36087732)

you've never done low level development or compiler design, have you?

It's called errata, learn it, love it, live it. "hmm, well, the problem is being caused by out of order execution in the PPC processor, but there's an errata on the instruction to disable out of order execution. Oh balls."

Not that that's an actual problem I ran into or anything. And that was on a chip brought to market just two years ago.

Re:I don't get it (2)

Peter Amstutz (501) | more than 2 years ago | (#36087340)

The key here is "attack surface". Having relatively uninhibited access to low level graphics APIs that were not previously assumed to be public means there are probably lots of bugs with security implications. I wouldn't be surprised if graphics drivers eschew error checking in order to gain performance, but now malicious programmers can use that to crash the browser or OS. Shader compilers are also quite complex, and may present opportunities for specially crafted invalid programs to overflow buffers or otherwise screw things up. Security has always always always taken a back seat to performance in the graphics world, and it may take a while for the driver writers to come around.

Re:I don't get it (2)

Bengie (1121981) | more than 2 years ago | (#36087350)

Your JS interpreter doesn't run at kernel level with full access to your low level hardware.

Re:I don't get it (2)

Dutch Gun (899105) | more than 2 years ago | (#36087414)

Because, like Adobe's formats before it (PDF, Flash), graphics card drivers were designed well before any serious thought was given to security issues. Even now, graphics drivers are 100% focused on speed, speed, speed, and security concerns are probably barely even on the radar. As GPUs move closer to general-purpose computing devices with true logic paths, this problem is only going to get worse. In other words, it's probably a softer target than a Javascript runtime environment, and that's saying something, given how many Java-based exploits there are.

One of the big problems of coders is that most of us don't give much thought to how technology can be abused. It either takes a security consultant or a lot of experience dealing with folks that actively attempt to exploit your products to get in the correct mindset for net-based development. I'm an MMO developer, and any new feature we come up with has to be prefaced with "how could this be abused or exploited?" Sort of frustrating, but that's the way it goes.

It's going to take a different inherent mentality when dealing with Internet technologies going forward in order to keep things actually secure, but honestly, I'm not really sure if or when that's actually going to happen. People are too dazzled by shiny new technology still, and just rush forward without seriously considering security.

Big deal (0)

Anonymous Coward | more than 2 years ago | (#36086958)

Browsers can statically analyze shader code if need be, even if that means only allowing a subset of instructions.
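A toy sketch of the idea -- reject any shader whose identifiers fall outside a whitelist. This illustrates the concept only; a real validator would be a full GLSL parser, not a regex:

    // Toy illustration only -- not how any shipping browser validates GLSL.
    var ALLOWED = { "void": 1, "main": 1, "float": 1, "precision": 1,
                    "mediump": 1, "vec2": 1, "vec3": 1, "vec4": 1,
                    "gl_FragColor": 1, "max": 1, "dot": 1 };

    function shaderLooksSafe(src) {
        var tokens = src.match(/[A-Za-z_][A-Za-z0-9_]*/g) || [];
        return tokens.every(function (t) {
            return ALLOWED.hasOwnProperty(t);
        });
    }

    shaderLooksSafe("void main() { gl_FragColor = vec4(1.0); }"); // true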

Example of GPU overload? (1)

DavidR1991 (1047748) | more than 2 years ago | (#36086988)

They got BSODs by 'overloading' GPUs. By doing what, continuous high-stress activity? In which case, surely they should be sufficiently ventilated, etc. Isn't that the real issue? Surely it's not possible to kill GPUs just by giving them a lot of work. Imagine if CPUs did that; we'd be pretty screwed by now.

If they don't mean high-stress/high-throughput activities, what do they actually mean -- overloading them with textures or something?

Re:Example of GPU overload? (1)

skids (119237) | more than 2 years ago | (#36087252)

No, GPUs do DMA bus mastering and other newer sorts of independent access to RAM and the bus address space at large. If the command stream is not filtered, or the hardware does not have provisions for contexts that restrict the GPU's memory access capabilities depending on what control channel is being used, a GPU can pretty much read and scribble all over the entire system RAM.

GPU manufacturers have been historically oblivious to this and haven't bothered to put much in the way of security features into the hardware. Filtering command streams on the CPU side has never been popular, as it kills performance. (I'm actually surprised to learn open source drivers are now starting to provide it as an option; I gave up on getting buy-in on that idea a decade ago when KGI failed to gain traction.)

Re:Example of GPU overload? (1)

tamyrlin (51) | more than 2 years ago | (#36087710)

Actually, at least early NVIDIA cards had pretty good hardware support for this as far as I understood it. I don't know the status of their current cards, but their early cards had hardware support for different contexts and would generate an error if a user tried to do something it was not allowed to do. (For example, rendering outside of the selected window.) So the cards could allow a user to send commands via DMA to the card (this is often called direct rendering) in a secure manner without any risk of privilege escalation.

Also, NVIDIA wasn't the first company to have this support. GPUs from SGI were designed around the concept of secure direct rendering a long time before PC level GPUs got popular. As far as I understood it, the original DRI developers for Linux were quite security conscious from the beginning as well.

Disclaimer: I haven't been involved in 3D driver development for quite some time now, so I don't know the current hardware/software status at all, but I would be surprised if NVIDIA has started to design hardware that does not allow for secure direct rendering. However, just because the hardware is secure doesn't mean that the driver is free of security related bugs...

Frankly, at least on Linux with open source drivers and/or NVIDIA hardware, I would not be very worried about unknown privilege escalation bugs related to being able to issue arbitrary DMA commands. However, I would be quite worried about bugs in the driver which would allow arbitrary native code execution as the current user. While not a problem for normal OpenGL applications (as they are already executing as native code), it is a huge problem for web applications, since this could be a way to escape from the javascript sandbox.

Rubbish (1)

Anonymous Coward | more than 2 years ago | (#36087008)

GPU shaders have no access to main memory, so where is the attack vector?

Re:Rubbish (1)

0123456 (636235) | more than 2 years ago | (#36087238)

GPU shaders have no access to main memory, so where is the attack vector?

GPUs have access to main memory, quite possibly the entire address space of the machine (I only briefly worked on Vista drivers so I'm not sure how they compare to XP). The shaders may not be able to access memory directly through a pointer, but if you can somehow exploit a driver bug to get a texture configured to use an arbitrary system memory address, then they could potentially access any memory on the machine.

Of course, that's physical memory, so you'd also need an exploit that can map logical to physical memory addresses in order to determine where to point the texture.

Re:Rubbish (1)

gmueckl (950314) | more than 2 years ago | (#36088386)

The truly dangerous attack vector is not any shader code. It's all the half-decent shader compilers out there that are part of the OpenGL runtime. GLSL shaders always need to be compiled from source at runtime by the driver, so the browser must pass a big piece of code/data to the driver's compiler -- a complex, insecure, and typically also annoyingly buggy piece of software. In other words: the compilers' fragile parsers are now directly exposed to remote websites. They were never intended to take untrusted input. I've been aware of this attack vector for at least 6 months now.

I can't wait for Native Client! (5, Insightful)

Anonymous Coward | more than 2 years ago | (#36087118)

Can anyone remind me why we're putting EVERYTHING in a web browser anyway?

Re:I can't wait for Native Client! (0)

Anonymous Coward | more than 2 years ago | (#36087370)

NaCl is to native what BASIC is to asm.

Re:I can't wait for Native Client! (1)

tepples (727027) | more than 2 years ago | (#36087472)

Can anyone remind me why we're putting EVERYTHING in a web browser anyway?

Because Native Client is specific to Google Chrome. What other instant, sandboxed application deployment method do you recommend before other companies' browsers begin to support Native Client?

Re:I can't wait for Native Client! (1)

perry64 (1324755) | more than 2 years ago | (#36087664)

Why? Because this is what happens when security Nazis or bad administration make it impossible for users to actually use their computers.

I work in the military training game area, and people in that area want to deliver content without having to install thick clients (or preferably, without installing ANY client), because getting anything approved and installed is almost impossible. Systems like the Navy Marine Corps Intranet (NMCI) don't allow users to do much besides e-mail and browsing; getting anything else installed generally takes months, and if it's specialty software, a minimum of years.

I've always said that if the security types had their way, we'd all be working on machines in a locked room without a network connection, as that is the only way to guarantee security. Security is important, but computers are designed to be USED -- any policy that makes that impossible leads to people trying to get around it. Thus, policy and security measures need to take both into account; the answer "Can't do it, sorry" doesn't cut it.

Re:I can't wait for Native Client! (0)

Anonymous Coward | more than 2 years ago | (#36087674)

AdSense. And why is it the reverse on smartphones, where web apps would often make more sense? AdMob.
We did give this a chance on the desktop, but everybody screamed ADWARE!!! So this is what you get :-)

Re:I can't wait for Native Client! (1)

VortexCortex (1117377) | more than 2 years ago | (#36088076)

Can anyone remind me why we're putting EVERYTHING in a web browser anyway?

Simple. Because we really don't give a fuck about web browsers.

All we really want / need is a cross platform widely distributed standardized* text & graphics display environment that can be manipulated via client side scripting, and can communicate with server side processes; And for "applications" or "services" created with such a system to be easily discoverable by our users.

* Yes, we need across the board standards conformance for stability and to reduce development costs, too bad we don't really have it yet.

We couldn't really give two shits if that's HTML in a browser with JavaScript, or any other technology -- it just so happens that browsers are ubiquitous. Ergo the success of: software repositories on *nix, app stores for mobile devices, the Internet + HTML/scripting & search.

IMHO, we should ditch JS for Lua -- it can be compiled or interpreted, and it has a much simpler design which takes less code (and is easier to secure as a result). It's easier to optimize, and its language constructs can be used to provide any feature that the JS language has. Other VM and scripting languages come to mind as well... actually, it's hard NOT to find a language that's better than JS*. (Hence Chrome <script type="python"> and other Native Client features -- though this is the wrong way to go; we need a broadly supported language with a standardized feature-set that isn't slow as frozen molasses.)

*inb4 brainfuck, et al.

Re:I can't wait for Native Client! (0)

Anonymous Coward | more than 2 years ago | (#36088198)

I agree. Why do we need anything more than we already have on websites? All I want to do is read text, and occasionally watch videos, that's it. This is just more bullshit that drags down your system and slows it to a crawl. Just imagine the '3D' adverts using WebGL - who needs it?

Everything old is new again, huh? (0)

Anonymous Coward | more than 2 years ago | (#36087194)

I remember the days when folks predicted JavaScript would expose surfers' computers to viruses.

Re:Everything old is new again, huh? (0)

Anonymous Coward | more than 2 years ago | (#36087488)

It did; but most of those problems were ironed out, such that some interpreters are now basically immune. The same trend would probably occur here.

wheres the directions on how to write exploit code (1)

elucido (870205) | more than 2 years ago | (#36087642)

and how can we have fun experimenting with this on our LANs? This would be perfect for pen testers to play with.

Perhaps now Mozilla will finally enable their whit (1)

Derek Pomery (2028) | more than 2 years ago | (#36087898)

Which they had already written for WebGL.

http://mxr.mozilla.org/mozilla-central/source/content/canvas/src/WebGLContextUtils.cpp#101 [mozilla.org]

At least then users have to click ok before trusting a site, in a similar fashion to location data.
And hopefully that check will be strongly worded.

Personally, I'm glad I run NoScript

Re:Perhaps now Mozilla will finally enable their w (1)

Derek Pomery (2028) | more than 2 years ago | (#36087932)

What's also funny is I remember discussing precisely this attack method a few weeks ago w/ WebGL folks. Thought at the time it'd be a bit slower than the end result, in terms of image extraction.

Also joked about combining it with a WebGL game to keep the user occupied, where their clicks on the "red ball" or "blue ball" would steal more pixels. Speed up the attack while keeping them occupied.

Not that big deal (2)

rasmusneckelmann (840111) | more than 2 years ago | (#36088024)

I doubt this is going to be a major cause of future security problems.

As far as I'm aware, WebGL only allows shaders to be specified in GLSL, which is a pretty high-level shading language. Obviously there's no such thing as pointers, and unlike something like javascript there's no interaction with complex objects. Shaders form a very clean and thin interface, basically just a bunch of floating-point vector operations. The only complex objects you're really going to interact with are various texture samplers.

It's easy to make a dangerous bug in a javascript interface to a complete HTML DOM object (or whatever else you can do in javascript these days); it's much harder to make a dangerous bug in a function that calculates a dot product. Sure, shaders are more complicated than that, but you get the drift.
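For a sense of how thin that interface is, here is a complete, valid fragment shader of the kind WebGL accepts, written as the Javascript string a page would pass to shaderSource. The variable names and lighting math are an arbitrary example:

    // No pointers, no objects -- just floating point vector math.
    var fragmentSource =
        "precision mediump float;\n" +
        "uniform vec3 u_lightDir;\n" +   // illustrative names
        "varying vec3 v_normal;\n" +
        "void main() {\n" +
        "    float b = max(dot(normalize(v_normal), u_lightDir), 0.0);\n" +
        "    gl_FragColor = vec4(vec3(b), 1.0);\n" +
        "}";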

This is FUD from traditional platform vendors (0)

vivarin (106778) | more than 2 years ago | (#36088352)

The natural progression of "web as platform" is to (safely) expose your hardware to web-based applications, for your benefit. Vendors who currently enjoy being the gatekeepers of various walled gardens (I'm lookin' at you, AppleSoft) are going to try to stall this progression as long as they possibly can, as it threatens their cash-cow businesses. I very much doubt the handful of web platform vendors integrating WebGL right now are going to allow it to run amok in the same way that ActiveX and Flash have over the last decade.
