
Mozilla Experiments With Site Security Policy

ScuttleMonkey posted more than 6 years ago | from the security-catching-up-with-web-2.0 dept.


An anonymous reader writes "Mozilla has opened comments for a new experimental browser security policy, dubbed Site Security Policy (SSP), designed to protect against XSS, CSRF, and malware-laced IFRAME attacks which infected over 1.5 million pages Web earlier this year. Security experts and developers are excited because SSP extends control over Web 2.0 applications that allow users to upload/include potentially harmful HTML/JavaScript, such as on iGoogle, eBay Auction Listings, Roxer Pages, Windows Live, and MySpace / Facebook Widgets. Banner ads from CDNs have had similar problems with JavaScript malware on social networks. The prototype Firefox SSP add-on aims to provide website owners with granular control over what the third-party content they include is allowed to do and where it's supposed to originate. No word yet on whether Internet Explorer or Opera will support the initiative."


AC Experiments with First Posting (-1, Offtopic)

Anonymous Coward | more than 6 years ago | (#23685915)

take it, bitches

Why not just include NoScript by default? (1, Insightful)

Anonymous Coward | more than 6 years ago | (#23685935)

It's what I use for XSS protection. NoScript's security can get kind of annoying sometimes, but it has been useful for some pages I've come across which have tried to load my user/pass data across an affiliate site.

Re:Why not just include NoScript by default? (5, Informative)

sec_login_test (656626) | more than 6 years ago | (#23686059)

NoScript is designed so the user can protect themselves. SSP is designed so that the website owner can protect users from other users, malicious widget developers, or perhaps unscrupulous CDNs.

Re:Why not just include NoScript by default? (3, Funny)

Anonymous Coward | more than 6 years ago | (#23686105)

unscrupulous Canadians?

Re:Why not just include NoScript by default? (2, Funny)

onkelonkel (560274) | more than 6 years ago | (#23686305)

No such thing. We're all totally scrupulous.

Re:Why not just include NoScript by default? (2, Interesting)

morgan_greywolf (835522) | more than 6 years ago | (#23686371)

No such thing. We're all totally scrupulous.
So George Washington was really a Canadian?

Re:Why not just include NoScript by default? (2, Funny)

rootofevil (188401) | more than 6 years ago | (#23686401)

No such thing. We're all totally scrupulous.
If you had ended that sentence with ', eh' or mentioned one or more of the following: ice, mooses, the Yukon, I would have believed you.

Re:Why not just include NoScript by default? (1)

PitaBred (632671) | more than 6 years ago | (#23687791)

Not hockey though?

Re:Why not just include NoScript by default? (0)

Anonymous Coward | more than 6 years ago | (#23688503)

Naw, he could be from Michigan if he mentioned hockey.

Re:Why not just include NoScript by default? (1)

onkelonkel (560274) | more than 6 years ago | (#23688517)

Sorry about that. I was going to say eh at the end but I had to fend off a wolverine that was trying to steal my backbacon......eh.

Re:Why not just include NoScript by default? (2, Interesting)

aceofspades1217 (1267996) | more than 6 years ago | (#23686347)

NoScript is designed so the user can protect themselves. SSP is designed so that the website owner can protect users from other users, malicious widget developers, or perhaps unscrupulous CDNs.

SSP would be great considering the huge number of successful attacks we've seen, from Google to Microsoft and beyond.

The only problem with SSP is that if Microsoft doesn't implement it, it will be shot down, because IE has 60+% market share. Without IE support it would be close to useless: you can't justify the work when it protects less than half of your users, and the rest, on IE, will still need some other form of protection.

It could skew already complicated cross-browser development.

Let's just hope Microsoft doesn't make their browser even less standards compliant... although I did hear that IE8 scored high on the Acid2 test. Guess they are making some headway.

Not a problem at all (1)

porneL (674499) | more than 6 years ago | (#23688283)

It would be silly to rely on SSP as the only form of protection. As an additional measure you can use it even if not 100% of browsers implement it - you're just lowering the risk/attack surface.

News headlines like "IE does not implement important security protocol that Firefox does" will get chairs moving fast in Redmond.

Re:Not a problem at all (1)

cheater512 (783349) | more than 6 years ago | (#23688613)

But the question is whether Microsoft will adopt the standard or make their own.

Hmm, let's look at the candidates... (1)

aceofspades1217 (1267996) | more than 6 years ago | (#23689843)

But the question is whether Microsoft will adopt the standard or make their own.

MSP [Microsoft Security Policy], OSS [Open Site Security] (OSS is totally on Microsoft's side), and MSS [Microsoft Site Security]

all sound like completely reasonable candidates. Or even

SSP [Silverlight Security Policy] if they really want to make it their own :P

Then they can apply for ISO standardization and release half-assed documentation... ring any bells?

Re:Not a problem at all (1)

jlarocco (851450) | more than 6 years ago | (#23691969)

This isn't a standard.

As much as I dislike Microsoft, sometimes I really wonder about the unthinking Microsoft bashing on Slashdot. Here you've more or less assumed Firefox's non-standard feature will become a standard. But what justification is there for that? IE doesn't have a similar feature in this case, but there are several areas where IE does things differently than Firefox or has features that aren't in Firefox, so why are those never added as standards? Why does the Firefox implementation get a free pass at standardization, while Microsoft's implementation gets demonized? If a specific implementation gets to set the standard, why Firefox? It's not the reference implementation. It's not the most standards-compliant browser. It's not even the best open source browser.

Re:Not a problem at all (1)

cheater512 (783349) | more than 6 years ago | (#23692227)

This is a niche which has not been filled yet, it's been developed openly, and it's not horribly broken.

Name a Microsoft 'standard' which meets those criteria.

Re:Not a problem at all (1)

riscthis (597073) | more than 6 years ago | (#23693543)

HttpOnly cookies [microsoft.com]

Re:Not a problem at all (1)

aceofspades1217 (1267996) | more than 6 years ago | (#23689881)

It was never meant to be a primary form of protection. As the documentation says, it's "another layer of protection." Basically it adds another layer, but obviously it's just extra; you still need to make your app secure without it. But it can't hurt. There are a ton of little security layers that can each be bypassed, like restricting which characters a form accepts: obviously an attacker can change the form, but the restriction makes an attack that much harder and doesn't really take anything away from a normal user.
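To give a concrete (hypothetical) example of one of those little layers, a server-side allowlist check might look something like this in Python; the pattern and names are made up:

    import re

    # Hypothetical defense-in-depth layer: accept a username only if it
    # matches a conservative allowlist. This does not replace output
    # escaping or a policy like SSP; it is just one more hurdle.
    USERNAME_RE = re.compile(r'^[A-Za-z0-9_.-]{1,32}$')

    def validate_username(value):
        return bool(USERNAME_RE.match(value))

    print(validate_username("alice_42"))                   # True
    print(validate_username("<script>alert(1)</script>"))  # False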

Re:Not a problem at all (0)

Anonymous Coward | more than 6 years ago | (#23691125)

The very next headline will be,
"MS to deploy superior security protocol to protect windows users in upcoming security patch"

Re:Why not just include NoScript by default? (1, Insightful)

Anonymous Coward | more than 6 years ago | (#23686981)

If a website owner is going to include links from a third party site, it absolutely SHOULD be his/her responsibility to validate each and every one of those sites, if necessary, in a granular fashion. NoScript is a band-aid for the failure of content owners to do this.

Who is responsible for creating the seamless browsing experience? Users shouldn't have to block scripts at cnn.com and then waste time scratching their head while looking at their noscript menu to try and figure out whether or not the content is to be trusted. Content providers should be validating all remote linking, whether it's ads, mashups, or whatever--ahead of time, or they don't deserve your traffic!

Re:Why not just include NoScript by default? (0)

Anonymous Coward | more than 6 years ago | (#23691745)

If a website owner is going to include links from a third party site, it absolutely SHOULD be his/her responsibility to validate each and every one of those sites, if necessary, in a granular fashion.

I do absolutely agree. But ...

NoScript is a band-aid for the failure of content owners to do this.

That's why they integrate so many scripts (their own and external) that are needed "to display the site correctly" that most users simply have to allow each and every script. While NoScript is great, many site owners don't like it and try to educate their visitors to allow everything. Of course, without proper validation.

Re:Why not just include NoScript by default? (1)

(Score.5, Interestin (865513) | more than 6 years ago | (#23692295)

NoScript is designed so the user can protect themselves. SSP is designed so that the website owner can protect users from other users, malicious widget developers, or perhaps unscrupulous CDNs.
So you have to trust the site that's being used to attack you not to attack you? Isn't this a variation of asking the drunk if he's drunk?

Anonymous Answers Itself (1, Funny)

Anonymous Coward | more than 6 years ago | (#23686135)

Q: Why not just include NoScript by default?

A: NoScript's security can get kind of annoying sometimes

Re:Why not just include NoScript by default? (1)

STrinity (723872) | more than 6 years ago | (#23688743)

Because most people just want to visit a site and view the cool scripted content without having to figure out which of the twenty scripts on the page will make the game work or the video play.

Even if you could explain to an average user what NoScript does, they'd probably just enable scripts for every page they visit without caring what the script does.

Re:Why not just include NoScript by default? (0)

Anonymous Coward | more than 6 years ago | (#23694963)

Most people do not care about cool scripted content.

Good news for Phorm victims? (2, Interesting)

RelaxedTension (914174) | more than 6 years ago | (#23686093)

This makes me wonder if it would be good for helping in situations like ISPs using Phorm or its ilk.

Re:Good news for Phorm victims? (2, Informative)

whitehatlurker (867714) | more than 6 years ago | (#23686577)

Not from what I understand of Phorm. You'd need to browse over encrypted connections so that the ISP-hosted proxy can't (literally) read your email. (Or your visits to /. or lolcatz or whatever.)

Their demo page surprised me. I didn't think that stuff would work. If the add-on stops that from being exploitable, it is a good thing, even if it doesn't prevent MITM attacks.

Re:Good news for Phorm victims? (1)

goofy183 (451746) | more than 6 years ago | (#23686941)

It could for a little while, though. Since they're already doing DPI to do that, all they have to do is add a few more headers to allow their advertisers' sites to be loaded.

Or even worse, it wouldn't surprise me if their DPI ad-injecting boxes just stripped all SSP headers, which of course would open their users up to more threats, but it isn't like they care much.

Maybe OT... probably not (1)

zappepcs (820751) | more than 6 years ago | (#23686207)

The first thing I thought: OH, IE catches up, Mozilla moves ahead. FF3 is awesomeness, and now better security development? Impressive. Yes, it's not client side, but it offers a way for site owners to add that extra bit of security... hmmm, Mr. Banker? Hello? Are you listening? I hope that something solid comes of this to offer better security for people in general and the Internet as a whole.

No, it won't stop all identity theft attacks, but it is an improvement.

Perhaps one day I'll get to use my pager/phone as second path authentication for the bank? I am hoping for it, but any improvement in the meantime is a good thing.

Re:Maybe OT... probably not (2, Interesting)

veganboyjosh (896761) | more than 6 years ago | (#23686363)

Perhaps one day I'll get to use my pager/phone as second path authentication for the bank? I am hoping for it, but any improvement in the meantime is a good thing.

My father-in-law had a keyfob LCD thing to which his employer would send him passwords when he wanted to use his take-home work laptop. When he turned it on, it sent a signal to the home base, and the keyfob would show a number. He used this number to log in. This way, if the laptop got stolen, it was effectively worthless.

Is this what you're talking about?
1. I go to login to my bank account.
2. The bank server sends a text message to my phone with a one time key/password.
3. I enter the password/key, so the bank has a better idea that I am who I say I am.

Or...
1. ID thief uses my login info to try to get into my online account.
2. The bank server sends a text message to my phone with a one time key/password.
3. I'm not logging in at the moment, so I call the bank to check in, etc. Interesting.
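A server-side sketch of that first flow, in Python (every name here is hypothetical, and send_sms() is just a stand-in for whatever SMS gateway the bank uses):

    import hmac, os, time

    # Hypothetical sketch of the SMS second-factor flow above.
    # pending maps user -> (one-time code, expiry timestamp).
    pending = {}

    def send_sms(phone, text):
        print("SMS to %s: %s" % (phone, text))  # stand-in for a real gateway

    def start_login(user, phone):
        code = "%06d" % (int.from_bytes(os.urandom(4), "big") % 1000000)
        pending[user] = (code, time.time() + 300)  # valid for 5 minutes
        send_sms(phone, "Your one-time login code is " + code)

    def finish_login(user, entered):
        code, expires = pending.pop(user, (None, 0))
        if code is None or time.time() > expires:
            return False
        # constant-time compare so the code can't be guessed via timing
        return hmac.compare_digest(code, entered)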

Re:Maybe OT... probably not (1)

tsa (15680) | more than 6 years ago | (#23686417)

My bank uses this system to confirm payments. It works very well and is easy to use.

Re:Maybe OT... probably not (4, Informative)

klui (457783) | more than 6 years ago | (#23687295)

What you've described is an RSA SecurID one-time password. It may appear that your father's fob communicated with the remote server, but the truth is each fob has a unique seed and a clock that create one-time passwords based on the value of the clock and an optional salt (PIN). The remote server that validates the OTP knows the seed of your father's fob and is able to authenticate each password it receives. After each valid authentication the passcode is discarded and cannot be used again (within limits of the passcode length). The system allows for clock skew between the fob and the validating server. Current system functionality may have changed, as this description is quite old.
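RSA's real algorithm is proprietary, but the general shape of a seed-plus-clock OTP (along the lines of the HOTP construction from RFC 4226 driven by a time counter) can be sketched in a few lines of Python; the seed value is of course made up:

    import hmac, hashlib, struct, time

    def one_time_password(seed, interval=60, digits=6):
        # Both the fob and the validating server derive the code from the
        # shared per-fob seed and the current time; the server also tries
        # neighboring counter values to tolerate clock skew.
        counter = int(time.time()) // interval
        mac = hmac.new(seed, struct.pack(">Q", counter), hashlib.sha1).digest()
        offset = mac[-1] & 0x0F                      # dynamic truncation
        code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
        return "%0*d" % (digits, code % 10 ** digits)

    print(one_time_password(b"per-fob-secret-seed"))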

Re:Maybe OT... probably not (1)

veganboyjosh (896761) | more than 6 years ago | (#23687403)

Interesting. Thanks for the explanation. Perhaps something similar could work for getting onto a bank website? Could such a keychain for authenticating logins be included along with the usual detritus one gets when signing up for a new bank account? On such a huge scale, how much do those things cost?

Re:Maybe OT... probably not (1)

Richy_T (111409) | more than 6 years ago | (#23687643)

The keyrings are fairly cheap (considering), but the server software is somewhat pricey (though not in terms of the kind of money banks should be willing to spend on such things).

Re:Maybe OT... probably not (0)

Anonymous Coward | more than 6 years ago | (#23691913)

It already does. [findarticles.com]

I don't know how much they cost, but it can't be too bad. I've worked for two huge companies that have handed them out to employees for logging in.

Re:Maybe OT... probably not (1)

chrispugh (1301243) | more than 6 years ago | (#23692005)

There is a system being trialled in the UK at the moment by Barclays bank. It's not the same as this system, but it does increase security massively. You're given a little chip-and-PIN machine, and when you want to access your bank account, you put your card into the machine, enter your PIN, and it gives you an access code.
Having seen it in action, it's a brilliant system. Unfortunately, I'm not a Barclays customer, so I haven't used it personally, but I'm waiting quite excitedly for my bank to implement a similar system.

Re:Maybe OT... probably not (1)

jrumney (197329) | more than 6 years ago | (#23692051)

It's well beyond a trial now. My girlfriend got hers from Barclays when they rolled it out properly late last year, and I got one recently from Nationwide. It's a bit annoying, as you need to carry the little calculator-sized machine around wherever you might want to use your internet or telephone banking. I think Barclays requires it for login; Nationwide only requires it for activities that result in money leaving your accounts, so you can at least still check your balance and transfer between accounts if you don't have it on you.

Re:Maybe OT... probably not (1)

mikiN (75494) | more than 6 years ago | (#23692879)

Strange that to many people here this seems to be new.

My bank has been sending me OTPs for some 12 years now, at first by regular mail, currently as a text message to my phone, to be used every time I want to make a transaction.
(New) usernames and passwords have to be collected in person at the bank by presenting them with the confirmation letter and proper ID.

Friends of mine who do internet banking with other banks have been using a personal card reader/PIN pad and a challenge/response system for at least 6 years now.

Re:Maybe OT... probably not (1)

bobbozzo (622815) | more than 6 years ago | (#23698783)

IIRC, the key (license) fees end up being over $100/user (per year?).

Re: RSA SecurID (1)

FurtiveGlancer (1274746) | more than 6 years ago | (#23692197)

FWIW, NSA was using this technology in the '80s.

Re:Maybe OT... probably not (0)

Anonymous Coward | more than 6 years ago | (#23690919)

PayPal has a security keychain that I use to get to my account. It's a little plastic thing with a button and an LCD screen that you can attach to your keyring. I log into PayPal and then they prompt for the security number. I'm not totally sure how it works -- if it's by time or a set algorithm or what -- but it makes me feel a whole lot safer.

Re:Maybe OT... probably not (0)

Anonymous Coward | more than 6 years ago | (#23686865)

Whereas I looked at the summary and thought if Microsoft had proposed this and implemented it in the IE8 beta we'd be complaining they were trying to take control of the internet again. Funny that.

Re:Maybe OT... probably not (1)

slash.duncan (1103465) | more than 6 years ago | (#23691035)

Bank of America has this now. I don't use it as I don't have a cell phone for them to text info to, but they have it as a second-path authentication option.

I had read about the European keyfob-type things and wondered when something like that would show up here. I guess BofA decided to take the cheaper-for-them route since many now have cell phones, but personally, I've simply never been able to justify the additional cost of a cell phone over (formerly) a landline, and the hurdle got higher recently when I switched to VoIP, so mobile connectivity isn't looking likely any time soon here. ($99/mo including text/data/voice, which I understand is T-Mobile's unlimited offer, looks interesting, but the available bandwidth on the data side isn't going to compare too well to my cable connection, so...)

But why trust site administrators? (1, Insightful)

Anonymous Coward | more than 6 years ago | (#23686227)

The prototype Firefox SSP add-on aims to provide website owners with granular control over what the third-party content they include is allowed to do and where it's supposed to originate.

But the real problem is still on the client end.

For example, I block google-analytics.com, because I don't want to be a part of whatever Google stores by means of "ga.js".

Perhaps I'm misunderstanding SSP, but the last thing I want my web browser to do is automatically "decide" that, for example, my trust in slashdot.org is to be automatically extended to Javascript hosted by google-analytics.com, solely because Slashdot's site administrators have chosen to trust google-analytics.com. (Or ad.doubleclick.net, which is now owned by Google...)

In my eyes, that sort of behavior constitutes adding a security hole, not plugging it.

To use an example that doesn't quite hit so close to home, I just about flipped when my bank added a lovely little "feature" on its login box that would cut down its support costs by enabling me to "chat with a live person". I'll turn on Javashit when I log in to my bank because I trust my bank, but I'll be damned if I'm going to let a wholly unrelated outfit like liveperson.com (a legitimate company, but they run the Internet equivalent of outsourced phone banks and chatbots) run their Javashit every time I log onto my bank, especially since I've never once felt the need to "chat with a live person" while trying to bank online.

Liveperson.com, much like Doubleclick and Google Analytics, got blocked in the HOSTS file and in the router. Those aren't options for non-technical users, but I fear that SSP may wind up enabling more holes/leaks than it closes. How much do you trust your site administrators? My bank's probably trustworthy. Slashdot's probably more interested in my data than Google is. But some random web forum whose administrator may not know his ass from a hole in the ground? The answer's still "no".
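(For reference, the HOSTS-file entries in question are just lines like these; the router then blocks the same names a second time:)

    127.0.0.1  www.google-analytics.com
    127.0.0.1  ad.doubleclick.net
    127.0.0.1  liveperson.com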

Re:But why trust site administrators? (2, Interesting)

profplump (309017) | more than 6 years ago | (#23687555)

I agree it would be useful to have better client-side protection, but I don't understand how this system could possibly make things worse.

Currently the options for limiting the scope of JavaScript are:
1. Turn JS off
2. Prevent certain files from loading (e.g. via /etc/hosts or the like)

This does not interfere with either of those, and adds:
3. Let site administrators explicitly list allowed scripts/domains and block all others.

You can still turn off JS, and you can still prevent certain files from loading. If you came up with some other system to let users decide what JS to run or not (like NoScript), that would still work too. This isn't a system to override your personal settings and force scripts to run; it's just another layer of protection applied before your personal settings, there to help people who can't or don't take the kind of additional protection steps you do.

Re:But why trust site administrators? (1)

POWRSURG (755318) | more than 6 years ago | (#23724647)

First off, you are correct: you don't understand SSP. It does not force-enable JavaScript on the client's end. It allows site developers to force-disable JavaScript that they have not verified. It is sort of like NoScript for site developers who don't have 100% control over the source of their site. Consider the case where a site is built using a multi-author CMS. The group could agree to only use scripts that they wrote and ones served from [favorite_script_site]. SSP would prevent clueless Sally from adding a script from [pretty_flowers_dripping_down_the_screen].

SSP also disallows inline scripting, so it prevents people from injecting content inside a form whose results are automatically inserted on the page. Yes, the site author's script should scrub the results clean before displaying them to the user, but enabling SSP on the page would reduce the possibility of the script running if it made it through the scrubbing process.
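To make the inline-script point concrete, here is a deliberately vulnerable toy page in Python. The X-SSP-Script-Source header name comes from the prototype add-on; the value syntax here is only a guess, since the draft was still open for comments:

    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.parse import urlparse, parse_qs

    class Echo(BaseHTTPRequestHandler):
        def do_GET(self):
            query = parse_qs(urlparse(self.path).query)
            # BUG (deliberate): user input is echoed without escaping, so
            # /?msg=<script>alert(1)</script> injects an inline script.
            msg = query.get("msg", ["hello"])[0]
            body = "<html><body><p>%s</p></body></html>" % msg
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            # Hypothetical policy: only external scripts from this host may
            # run; an SSP-aware browser also refuses *inline* scripts, so
            # the injected script above would not execute.
            self.send_header("X-SSP-Script-Source", "self")
            self.end_headers()
            self.wfile.write(body.encode("utf-8"))

    HTTPServer(("localhost", 8000), Echo).serve_forever()

An SSP-aware browser fetching this page would refuse the injected inline script even though the server failed to scrub it; a legacy browser would run it, which is why SSP is a safety net rather than a substitute for escaping.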

This all being said, I do not understand the mass hysteria I see from the /. crowd over Google Analytics. The purpose of GA is not the same as DoubleClick's. DoubleClick's purpose is to serve you ads. GA's purpose is to provide site authors with information about their visitors so that they can tailor their pages to their customers better. One can find out which sections of one's site are popular and redesign the site to make them easier to get to, while pushing the lesser-used sections off the home page. You can see how modified page content affects traffic over time. Think of your visiting habits as a vote for the way you'd like to see the site run. Denying GA from running is effectively throwing your vote away.

Don't malware attacks have signatures? (1)

blair1q (305137) | more than 6 years ago | (#23686233)

If the ISP isn't virus-scanning uploads, the ISP is inviting attacks.

If SSP is as transparent as SSL (and how could it not be, since it's only 4 letters away in the alphabet!) then it will work. If it takes user intervention, it will probably fail.

There's no reason to believe Opera or IE would not adopt it, if it works.

Re:Don't malware attacks have signatures? (0)

Anonymous Coward | more than 6 years ago | (#23686539)

pages Web (2, Funny)

HTH NE1 (675604) | more than 6 years ago | (#23686291)

which infected over 1.5 million pages Web earlier this year.
That reminds me: I need to update my page Web.

FF3 'Killer App' ? (2, Interesting)

apachetoolbox (456499) | more than 6 years ago | (#23686299)

This sounds like a great idea! Maybe this will be the killer app that pushes FF past IE.

FINALLY (4, Informative)

LeafOnTheWind (1066228) | more than 6 years ago | (#23686307)

As someone who has worked in web security, let me say that many of us have been begging for stricter control over security protocols for years. With all the AJAX going around, more and more sites are proving vulnerable to browsers that are just too friendly with the same-origin policy. If you check out the OWASP Top 10 [owasp.org], you'll see that a whole bunch of these attacks could be prevented by better browser security.

The best case would be a restructuring of JavaScript and the DOM as well, but I would be excited to see any increased security. After I used a reflected XSS attack last year to essentially gain control over a client's browser and all their cookies, I don't trust any web application.

Re:FINALLY (4, Insightful)

hedwards (940851) | more than 6 years ago | (#23686435)

This seems like something that would be very useful in this day and age. NoScript ends up being very annoying because a lot of sites will link in a large number of scripts from other servers. It's difficult a lot of the time because you really have to investigate to know what akamai.net does, for example, and then there's admdt.com and the like. The list is oftentimes 20 different sites, without a good way of restricting the permissions to just the current site.

I block everything by default, and rarely grant permanent permissions, but trying to figure out which one is causing the site to not work is a real pain in the ass. Really it's something that shouldn't be expected of a user. The JSON and XSS vulnerabilities really ought to be enough to convince at least financial institutions not to include that kind of crap on their sites.

But of course, this assumes that the system works, is well designed and protects against the things that developers really shouldn't be doing automatically. But I'll be giving it a whirl just to see if it helps at all.

Very confusing. (1)

140Mandak262Jamuna (970587) | more than 6 years ago | (#23686333)

A modded-up comment says it helps web site owners protect one user from another (malicious) user. But it is part of the browser. How does a browser help the server?

Re:Very confusing. (1)

The MAZZTer (911996) | more than 6 years ago | (#23686487)

If a website's code is changed by a malicious, outside party, it usually has to reference an outside server at some point. SSP will block access to the outside server because it will recognize the current website has no need to have you interacting with it.

At least that's how I'm understanding it.

Re:Very confusing. (5, Informative)

mikeazo (1303365) | more than 6 years ago | (#23686491)

Because the server specifies the site policy. The browser uses the site policy to know what to block. If I am running my site and have my policy set up correctly, then if an XSS is found in my site and an attacker tries to exploit it, hopefully my policy is defined well enough that a browser knows not to run the attacker's JavaScript. So the browser doesn't help the server; the server helps the browser know what it should run and what it shouldn't.
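For example, the prototype's header (the name comes from the add-on; the value syntax is only a guess at the draft format) might make a response look like:

    HTTP/1.1 200 OK
    Content-Type: text/html
    X-SSP-Script-Source: self www.google-analytics.com

i.e. run scripts served from my own host or from www.google-analytics.com, and nothing else.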

Re:Very confusing. (1)

goofy183 (451746) | more than 6 years ago | (#23686521)

The site owner includes additional HTTP headers in the response describing all of the 'good' JavaScript on the page, where it is coming from, what it is allowed to make requests to, etc...

If a site owner implements these headers, injected JavaScript won't be able to do much, since it will be in the sandbox specified by the site owner.

Re:Very confusing. (5, Informative)

pavon (30274) | more than 6 years ago | (#23686541)

Say you are a webmaster. You serve pages that you generate and trust them. However, some third parties would like to include content on your pages that is served from servers that you don't control. For example advertisements - these are almost always served from a different computer than the main webpage. Another example is embedded content from another site like a YouTube movie, or all these little panels that the social networking sites are starting to introduce. This gets worse when users themselves are allowed to put this sort of content on your site (say you run a forum or social networking site).

Because you don't control these web servers, the content they are serving could be replaced with malicious content, or just content that goes beyond what you gave them permission to do. Or a user could intentionally post malicious content.

This allows a webmaster to indicate what third party content is allowed on any page (if any), and what that third party content is allowed to do (text, images, animated images, javascript, plugins, etc). The web-browser then enforces the rules that the webmaster set.
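A minimal sketch of how a webmaster could attach such a policy server-side, written as Python WSGI middleware; the X-SSP-Script-Source name comes from the prototype add-on, while the value syntax and whitelisted domain are assumptions for illustration:

    # WSGI middleware sketch: stamp a site security policy onto every
    # response. Header name from the prototype add-on; value syntax
    # and the whitelisted domain are assumptions.
    def ssp_middleware(app, script_sources="self www.youtube.com"):
        def wrapped(environ, start_response):
            def sr(status, headers, exc_info=None):
                headers = list(headers) + [("X-SSP-Script-Source", script_sources)]
                return start_response(status, headers, exc_info)
            return app(environ, sr)
        return wrapped

    def page(environ, start_response):
        start_response("200 OK", [("Content-Type", "text/html")])
        return [b"<html><body>an embedded YouTube movie would go here</body></html>"]

    application = ssp_middleware(page)

    if __name__ == "__main__":
        from wsgiref.simple_server import make_server
        make_server("localhost", 8001, application).serve_forever()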

Re:Very confusing. (1)

dv8ed (697300) | more than 6 years ago | (#23687119)

This is great for security, but what does it break? There are a lot of useful things that could potentially get caught up in a policy like this (e.g., the del.icio.us bookmarklet). If this can knock down malware, great, but I'm a little wary of site owners taking control of which third-party scripts users are allowed to run on their own machines.

Whose Job Is It? - Security (1)

booleanoperator (1067746) | more than 6 years ago | (#23686443)

It's everyone's job... But shouldn't the webmasters be responsible for their content? If an ISP can deep-packet-filter, and can be forced to by law, why can't a website be forced to filter the content it displays, whether it comes from a CDN or other businesses? Pushing this off to the user is doomed to failure for the majority.

Re:Whose Job Is It? - Security (3, Insightful)

rgviza (1303161) | more than 6 years ago | (#23686817)

If you care about your machine and what happens to it, it's your job. If you just want to flail around in anger when your box becomes a big paperweight, leave it up to someone else.

It's the job of the authorities to lock up the crackers and other people that commit electronic crimes. It's your job to lock your front door.

At the end of the day, the authorities and your ISP can't do anything until the threat is known. By then the damage is done.

99.9% of stuff can be mitigated with NoScript, a $50 firewall, and a slight change in behavior (stop trying to steal music, porn, movies and software). If you can't take these steps, I'm genuinely amazed that you are capable of showering, starting your car and driving to work and functioning. Most people can, but won't. It sort of flows into that whole "not accepting responsibility for stuff I do" attitude that pervades modern society.

Typical transaction:
Dumbass (to self): Sweet I just got the entire collection of every rolling stones song ever made for free! 8)
>click
Computer: all of your files are belong to us. Send payment to uvebeenpwned@yahoo.com or you won't get your files back, MU HU HA HA.

Dumbass (to IT buddy): My computer is running slow and all my files got encrypted. I think I have spyware.
IT Buddy: You use p2p?
Dumbass: no, I have no idea what happened, I swear I dun't download... /snicker

-r

Re:Whose Job Is It? - Security (1)

booleanoperator (1067746) | more than 6 years ago | (#23686953)

While I completely agree with your assessment of the general user and the responsibility we should all take with everything we do, I do know, from an IT perspective, both from the side of the help desk and of the web application developer, that it is a lot more effective to block it on the website end, though not cheaply, than at the user end, exactly because of the societal and personal reasons you mention.

Additional Information (5, Informative)

mrkitty (584915) | more than 6 years ago | (#23686659)

This is something a lot of us in the industry have been writing about. Here's my rant from last October: Browser Security: I Want A Website Active Content Policy File Standard!
http://www.cgisecurity.com/2007/11/08 [cgisecurity.com]

Jeremiah Grossman's thoughts
http://jeremiahgrossman.blogspot.com/2008/06/site-security-policy-open-for-comments.html [blogspot.com]

it's just not sufficient protection (2, Funny)

fred fleenblat (463628) | more than 6 years ago | (#23687063)

In all likelihood it will be years before the enabling technology is in place to prevent the most vicious malware of all, the dreaded rickroll.

SSL + SSP = Safer Web Apps (4, Insightful)

Giorgio Maone (913745) | more than 6 years ago | (#23687659)

As I commented here [hackademix.net], SSL and SSP are orthogonal technologies whose correct and joint adoption should be required for any website performing sensitive transactions: the former ensuring integrity and, to a certain extent, identity; the latter defining and guarding application boundaries.

Those websites should encourage their users to adopt a SSP complaint browser, and complaint browsers should educate users to prefer SSP complaint sites with visual clues, just like we're already doing with EV-SSL (and for better reasons in this case, maybe).

On my side, I'm considering highlighting valid SSL + restrictive SSP websites as more reliable candidates for NoScript whitelisting.

Re:SSL + SSP = Safer Web Apps (1)

Just some bastard (1113513) | more than 6 years ago | (#23692885)

In theory SSP could be useful, with one huge caveat: HTTP response splitting vulns completely negate it. If a site's vulnerable to XSS, it's badly coded, and no SSP-style mechanism corrects that. I tried the browser extension yesterday, before this hit Slashdot...
  • Currently only supports X-SSP-Script-Source.
  • Regex-based when it should hook into the Mozilla parser sink.
  • Converts application/xhtml+xml to (IIRC) text/html+ssp.
  • Filtered inline script from an XHTML page, removing the opening script element but leaving the closing script element.
The current implementation is worse than useless, and what I'd really like to see is UI allowing users to set local SSP policy. If we can rely on admins to add SSP headers, we can rely on developers to use unobtrusive script and thus disable JavaScript entirely for secure browsing sessions -- yeah?

Re:SSL + SSP = Safer Web Apps (1)

BenoitRen (998927) | more than 6 years ago | (#23692891)

I want to browse the web, not complaints. :(

Google DoubleClick et al (1)

porneL (674499) | more than 6 years ago | (#23688425)

Advertisers may not like it. Currently they use scripts from multiple domains and dozens of CDNs.

SSP will require them to cut down the number of domains that need whitelisting (otherwise the SSP whitelist would look like AdBlock's database ;) and won't let them add new domains without getting publishers to change their SSP.

