
A popular social news site has been infected by a JavaScript exploit

violent.ed (656912) writes | about 5 years ago


violent.ed writes "Someone has figured out a way to exploit a mouseover JavaScript event on the popular social-news site Reddit. The JavaScript attacks the comments section: it makes the victim's web browser (Firefox 3.5.3 confirmed) resubmit the exploit code as a reply to every existing comment in the thread, causing not only severe server load but also locking up the affected user's browser."
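The site-side fix for this class of worm is plain output escaping: if comment bodies are HTML-escaped before being rendered, an injected onmouseover attribute arrives as inert text rather than a live handler. A minimal sketch; the escapeHtml helper and the sample payload are illustrative assumptions, not Reddit's actual code or the actual exploit:

```javascript
// Hypothetical helper: escape user-supplied text before inserting it
// into the page, so any markup in a comment renders as literal characters.
function escapeHtml(text) {
  const map = { "&": "&amp;", "<": "&lt;", ">": "&gt;", '"': "&quot;", "'": "&#39;" };
  return text.replace(/[&<>"']/g, (ch) => map[ch]);
}

// A worm-style comment body: rendered as raw HTML, the onmouseover
// handler would fire on hover and could re-post itself as a reply.
const payload = '<span onmouseover="repost()">hover me</span>';
const safe = escapeHtml(payload);
// "safe" contains no raw angle brackets, so no element or handler is created.
```

Escaping on output (rather than trying to filter "bad" input) is the robust choice here, because it neutralizes every attribute-injection variant at once.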


NoScript is useless (1)

RiotingPacifist (1228016) | about 5 years ago | (#29562073)

Unfortunately NoScript is built on a broken assumption: that you can identify attacking sites and trusted sites in advance. The attack code here was served by reddit itself (a site you have to allow in order to use reddit). The only way to protect against this sort of bug is a JavaScript filtering tool, such as controldescripts, that filters JavaScript requests by type and domain; with such a tool you could protect yourself much more effectively.

Using such tools, complex rulesets could do something like this:
mouseclick is submitting info -> allow
mouseover is requesting data -> allow
mouseover is submitting data -> request user confirmation
a JavaScript function is doing something weird -> request user confirmation
JavaScript is trying to use a known exploit -> deny and notify the user (as a workaround for 0-days, simply blocking the bad JS calls will protect users much faster than browsers usually get patched)
...etc.
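The ruleset above can be sketched as a small decision function. The event/action labels and the known-exploit check are hypothetical, since no tool like this (controldescripts included) actually exists:

```javascript
// Sketch of the proposed ruleset: classify each JavaScript-initiated
// request by its triggering event and what it does, then decide.
function decide(event, action, knownExploit = false) {
  // Known exploit signature: block the bad JS call outright, which can
  // protect users faster than waiting for a browser patch.
  if (knownExploit) return "deny";
  const rules = {
    "mouseclick:submit": "allow",
    "mouseover:request": "allow",
    "mouseover:submit": "confirm", // ask the user first
  };
  // Anything unrecognized is "doing something weird" -> ask the user.
  return rules[event + ":" + action] || "confirm";
}
```

Defaulting unknown combinations to "confirm" rather than "allow" is what makes the scheme fail safe against new attack patterns.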

You could also combine this with domain checking to keep lists of pages where you allow:
*no-JS (untrusted sites)
*simple-JS (google, youtube, etc.) [allows basic functionality but could still prevent tracking]
*complex-JS (facebook, etc.) [all the AJAX means simple-JS wouldn't work]
*all-JS [for when even the complex list of functions you allow just isn't enough]
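Those four tiers could be backed by a plain domain-to-policy table that defaults to no-JS for anything unlisted. The table below is an illustrative assumption, not a real configuration:

```javascript
// Hypothetical per-domain policy table following the four tiers above.
const policies = new Map([
  ["google.com",   "simple-JS"],
  ["youtube.com",  "simple-JS"],
  ["facebook.com", "complex-JS"],
]);

// Unknown domains are untrusted and get no JavaScript at all.
function tierFor(domain) {
  return policies.get(domain) || "no-JS";
}
```

A deny-by-default lookup like this is the same trust model NoScript uses for whole sites, just refined into graduated levels of allowed JavaScript.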

Such tools could also help the paranoid among us use websites that require JS, by disabling mouse tracking and the sending of data on non-click actions.
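That last idea can be approximated with a userscript: install capture-phase listeners that swallow hover and move events before any page handler (including an injected onmouseover worm) sees them, while leaving clicks untouched. A sketch, assuming a standard DOM document:

```javascript
// Stop hover/move events in the capture phase so they never reach page
// handlers; click events are not intercepted, so click-driven sites
// keep working.
function blockMouseTracking(doc) {
  for (const type of ["mouseover", "mouseout", "mousemove"]) {
    doc.addEventListener(type, (e) => e.stopImmediatePropagation(), true);
  }
}
```

The capture-phase listener on the document fires before target-phase handlers on page elements, which is why stopImmediatePropagation is enough to keep an inline onmouseover attribute from ever running.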

If I'm so clever, why am I not using such a tool myself? Because I'm not, and I know very little about JS, so my own attempts at writing rulesets were useless!
