
Microsoft Brands WebGL a 'Harmful' Technology

KewlPC Really?!?! (503 comments)

This BS *again*?!?!

GPU shaders != running code on the CPU.

WebGL allowing shader usage is pretty much a non-issue, security-wise. GLSL shaders are *extremely* limited in scope. They can't access anything besides model data and textures, and even then only the model data and textures provided to them by the host program. GLSL is very domain-specific and doesn't support pointers or any other way to access things outside the purview of the GPU.

Furthermore, they aren't pre-compiled (aside from some vendor-specific methods on *OpenGL ES*, and even those only compile to bytecode IIRC), so WebGL can at least attempt to do some shader validation. OpenGL and WebGL programs literally hand the GLSL source code to the driver, which is then responsible for compiling it. This actually turns out to be good for performance, since future compiler improvements in the driver can result in the same shader on the same hardware running faster. It also means WebGL could do validation on the shaders before handing them off to the driver, to keep an eye out for any obvious attempts to do something bad.
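To make the point concrete, here's a purely illustrative sketch of what a source-level pre-check could look like before a WebGL implementation hands GLSL off to the driver. This is *not* how any real browser does it (real implementations run a full parser/translator); the size limit and banned-token list are hypothetical:

```javascript
// Hypothetical source-level pre-check a WebGL implementation could run
// before passing GLSL source to the driver. Real implementations do full
// parsing and translation; this regex pass is only a sketch.
function prevalidateShaderSource(src) {
  // Reject absurdly large sources outright (limit is made up).
  if (src.length > 256 * 1024) {
    return { ok: false, reason: "source too large" };
  }
  // GLSL has no pointers or file inclusion anyway, but a paranoid
  // pre-pass can still reject suspicious tokens before compiling.
  const banned = [/#\s*include/, /\basm\b/, /\bgoto\b/];
  for (const pattern of banned) {
    if (pattern.test(src)) {
      return { ok: false, reason: "banned token: " + pattern };
    }
  }
  return { ok: true };
}

// A trivial GLSL ES fragment shader, as the string the host app would
// pass along. Note it can only read inputs the host app wired up
// (here, a texture sampler and an interpolated texture coordinate).
const fragSrc = `
precision mediump float;
uniform sampler2D uTex;
varying vec2 vTexCoord;
void main() {
  gl_FragColor = texture2D(uTex, vTexCoord);
}`;
```

Because the driver receives plain source text, this kind of inspection can happen at any point before the compile call.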

And when it comes to malicious shaders, only two attacks are really possible: try to crash the GPU by running a very intensive shader, or try to peek at other web pages via what seems to be an implementation flaw in WebGL/HTML5 Canvas.

The first attack can be easily avoided. In fact, it *shouldn't* be possible at all on Windows, which since Vista is supposed to detect a hung GPU and restart the driver; when it can't, that's a *Windows* bug.

The second is a little harder but, again, looks to be an *implementation* flaw, not a fundamental flaw in WebGL or shaders or anything like that.

Face facts: modern GPUs don't offer any of the old fixed-function pipeline anymore. It's nowhere to be found on the silicon; modern GPU drivers merely emulate it for old OpenGL programs. This means that if WebGL didn't have shader support, it would be completely useless.

more than 3 years ago

Lack of Technology Puts Star Wars Series On Hold

KewlPC Re:Not an accident (309 comments)

Raiders of the Lost Ark was good because of people *NOT* named George Lucas. Lucas came up with some good ideas, but also some really bad ones that Spielberg and others shot down.

Basically, once Lucas had the initial idea, it was turned into a good movie by Steven Spielberg, Lawrence Kasdan, and Harrison Ford.

more than 3 years ago

Lack of Technology Puts Star Wars Series On Hold

KewlPC Re:All I can say is (309 comments)

I think you're missing the point, which is that Lawrence Kasdan (the writer) and Irvin Kershner (the director) tried to lift ESB above being a one-dimensional movie, and succeeded for the most part. Real character development, good dialog, good acting & directing, etc.

Of course, if you're the kind of person who just wants to "turn their brain off" when they watch a movie, well...

more than 3 years ago

How Today's Tech Alienates the Elderly

KewlPC Article is wrong (453 comments)

The built-in alarm clock app in iOS works nothing like the article describes.

The use of a "+" button to mean "add [something]" is used throughout iOS. You don't use the "+" button to adjust an existing alarm, BTW. The alarm clock app initially has no alarms set, so you use the "+" button to add one (which then automatically takes you to a screen where you can set the alarm). If you want to change an alarm, you press the clearly labeled "Edit" button.

more than 3 years ago

WebGL Flaw Leaves GPU Exposed To Hackers

KewlPC Re:Still Much Ado About Nothing (120 comments)

Also, Windows (the most likely target of any attack) has had the ability since Vista to restart the GPU driver if the GPU hangs (which is the only real attack possible when it comes to shaders: use a shader so computationally intensive that the GPU becomes unresponsive). This isn't bulletproof, of course, but if Windows isn't able to restart the GPU after a few seconds of unresponsiveness, then that's a *Windows* bug.

more than 3 years ago

WebGL Flaw Leaves GPU Exposed To Hackers

KewlPC Still Much Ado About Nothing (120 comments)

As with the previous article, this is much ado about nothing.

The GPU can only run "arbitrary code" in the loosest possible sense. What happens is that an OpenGL or WebGL application gives the shader source code to the driver, which then compiles it into the native GPU instructions. You *can* pre-compile your shaders in OpenGL ES 2.0, but even then it's just intermediary bytecode, and the bytecode is vendor-specific.

Furthermore, GLSL, the language used for OpenGL and WebGL shaders, is *very* domain-specific. It has no pointers, and no language support for accessing anything outside the GPU other than model geometry and texture data. *AND* it can only access the model geometry and texture data that the application has provided to it, and for GPUs that don't have any on-board VRAM it's up to the *driver* to determine where in shared system memory the texture will be located.

And you can't get around using shaders on modern GPUs. Modern GPUs don't have a fixed-function pipeline; it's not in the silicon at all. For apps that try to use the old OpenGL fixed-function pipeline, the driver generates shaders that do what the fixed-function pipeline *would* have done based on the current state. Drivers won't keep emulating the old fixed-function pipeline forever, though.

more than 3 years ago

WebGL Poses New Security Problems

KewlPC Re:Much ado about nothing (178 comments)

IIRC OpenGL ES 2.0 can, but it's all vendor-specific. And it's still intermediary bytecode, not something that will execute directly on the hardware.

more than 3 years ago

Developer Blames Apple For Ruining eBook Business

KewlPC Re:Business 101 (660 comments)

Actually it was the US Gov't that insisted on a second supplier when they used Intel processors in the space shuttle.

more than 3 years ago

WebGL Poses New Security Problems

KewlPC Re:Much ado about nothing (178 comments)

Yes, something similar was mentioned in the article, and it *should* be fixed. But beyond that, shaders themselves don't expose anything particularly dangerous. GLSL, the language WebGL and OpenGL shaders are written in, doesn't have language features to access anything beyond the GPU. You can't access the user's hard disk from within a shader.

You can't get rid of shaders. Modern GPUs don't have a fixed-function pipeline; it's totally gone from the silicon. Instead, for apps that try to use the old fixed-function pipeline, the driver generates a shader that does what the fixed-function pipeline *would* have done given the current state. Sooner or later the drivers are going to stop even emulating it.

Which is part of the reason WebGL has shader support in the first place: it wouldn't do anyone any good if it was obsolete right out of the gate.

Shaders aren't the problem, crappy web browsers are.

more than 3 years ago

WebGL Poses New Security Problems

KewlPC Re:Much ado about nothing (178 comments)

Shaders themselves are pretty limited in scope, though. You can't really access anything beyond the GPU, textures, and model geometry.

GLSL (the language WebGL and OpenGL shaders are written in) doesn't have pointers and is most definitely NOT a general purpose language.

Even without shader support in WebGL you'd have the potential for intentionally bad model geometry crashing a really poorly written driver.

more than 3 years ago

WebGL Poses New Security Problems

KewlPC Much ado about nothing (178 comments)

For the most part this is a lot of security handwaving.

While the GPU itself can do DMA and whatnot, shaders don't have access to any of that. If a shader can access texture memory that hasn't been assigned to it *in certain browsers* then it sounds like a bug in the browser or the browser's WebGL plugin. Being able to "overload" the GPU and blue screen the computer sounds like Yet Another Windows Bug.

A shader isn't just some arbitrary binary blob that gets executed directly by the GPU. Even native programs can't do this. You provide the driver your shader source code, and the driver does the rest. It's intentionally a black-box process so that the driver can optimize the shader for the GPU and not force a specific instruction set or architecture onto GPU designers. Thus allowing the underlying GPU design to evolve, possibly radically or in unforeseen ways, without breaking compatibility.
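The hand-off described above really is just a few calls: `createShader`, `shaderSource`, `compileShader`, and a status check are the actual WebGL API. A minimal sketch of that step, written as a helper that takes any WebGL context:

```javascript
// The standard WebGL hand-off: the app gives the driver raw GLSL
// *source text*, and the driver's compiler does the rest as a black box.
// `gl` is a WebGLRenderingContext (or anything exposing the same API).
function compileShader(gl, type, source) {
  const shader = gl.createShader(type);     // e.g. gl.FRAGMENT_SHADER
  gl.shaderSource(shader, source);          // hand over the GLSL text
  gl.compileShader(shader);                 // driver compiles to native code
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    const log = gl.getShaderInfoLog(shader);
    gl.deleteShader(shader);
    throw new Error("shader compile failed: " + log);
  }
  return shader;
}
```

Nothing in that flow ever exposes GPU machine code to the application, which is exactly why the driver is free to target whatever silicon it likes.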

Furthermore, a shader can only access memory via the texture sampler units, which must be set up by the application. If the WebGL application (which is just JavaScript) can set things up to access texture memory it isn't supposed to be able to, the problem is with the WebGL and/or HTML5 implementation, not the concept of WebGL or the GPU driver.

more than 3 years ago

Why Does the US Cling To Imperial Measurements?

KewlPC Re:Arrogant Ignorance? (2288 comments)

I find it interesting that people who grew up using the metric system always try to use the yard as a comparable imperial unit, yet people who grew up using the imperial system don't use the yard to measure things that often. Few people in the US know how many yards are in a mile because nobody cares; it's like someone in a metric-using country measuring things in decameters.

The imperial system, for all its warts, has two advantages that metric does not:

1) Human-sized measurements. The foot is about the length of an average adult's foot. A yard is about the length of the average person's arm, and about the length of the average person's stride. Etc., etc., etc.

2) Evenly divisible units. A foot will evenly divide into halves, thirds, and fourths. A gallon will evenly divide into halves and fourths, ditto for the quart and cup. These are pretty common things to do, especially for things like basic home carpentry and cooking & baking, so it's nice that they divide up evenly.
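The divisibility point is easy to check arithmetically. A quick sketch comparing a base-12 unit (12 inches to the foot) against a plain base-10 division:

```javascript
// Which whole-number divisors split a unit evenly?
// 12 (inches per foot) vs. 10 (a decimal subdivision).
function evenDivisors(n) {
  const out = [];
  for (let d = 2; d <= n; d++) {
    if (n % d === 0) out.push(d);
  }
  return out;
}

console.log(evenDivisors(12)); // halves, thirds, fourths, sixths: [2,3,4,6,12]
console.log(evenDivisors(10)); // only halves and fifths: [2,5,10]
```

Twelve splits cleanly into halves, thirds, and fourths; ten only into halves and fifths, which is the whole argument in one line of arithmetic.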

Granted, converting between Imperial units isn't always easy, but the basics aren't that hard. Nobody actually uses rods & hogsheads to measure things. Most gripes about imperial units being too byzantine seem to come from people who grew up using the metric system, so metric is naturally what's more familiar and comfortable to them.

more than 3 years ago

The Importance of Portal

KewlPC Re:Steam is great! (222 comments)

Actually, no. You only need to validate the game *once*, right after installing it.

After that, you can play it just fine without an internet connection.

All Steam games work this way.

more than 6 years ago


KewlPC hasn't submitted any stories.



blah blah blah

KewlPC KewlPC writes  |  more than 11 years ago

"You only hear what you want to hear!"

"Why, yes, I would like to visit right now!"

With apologies to Matt Groening.
