
New Method of Tracking UIP Hits?

ScuttleMonkey posted more than 9 years ago | from the ip's-on-the-table dept.

The Internet 174

smurray writes "iMediaConnection has an interesting article on a new approach to web analysis. The author claims that he is describing 'new, cutting edge methodologies for identifying people, methodologies that -- at this point -- no web analytics product supports.' What's more interesting, the new technology doesn't seem to be privacy intrusive." Many companies seem unhappy with the accepted norms of tracking UIP results. Another approach to solving this problem was also previously covered on Slashdot.


Want efficiency? (0)

Anonymous Coward | more than 9 years ago | (#13377415)

Barcoded humans =)

Re:Want efficiency? (0, Offtopic)

JohnPerkins (243021) | more than 9 years ago | (#13377648)

I had a history teacher back when I was in jr. high school. She was barcoded by the Nazis at a concentration camp.

Re:Want efficiency? (0, Offtopic)

komeedipoeg (774107) | more than 9 years ago | (#13377860)

How was that possible when the barcode was invented in 1948 and WW2 ended in 1945? http://en.wikipedia.org/wiki/Barcode [wikipedia.org]

Re:Want efficiency? (1)

utnow (808790) | more than 9 years ago | (#13377940)

war secrets! lol. it was probably just something very similar in appearance to a barcode that didn't quite qualify as the same thing.

Re:Want efficiency? (0, Offtopic)

komeedipoeg (774107) | more than 9 years ago | (#13377973)

Hmm, how did they read it? Because today's barcode readers sometimes have difficulty reading barcodes from black-and-white print. Maybe they just wrote numbers, not barcodes? Or did the Germans make it harder for themselves and start using some kind of complicated coding? And one more question: when you were in jr. high, maybe it was around the time they released the Alien movie, where they marked prisoners with barcodes, and that teacher happened to be a fan of that movie :P sry bad english

uhm, what? (3, Funny)

Prophetic_Truth (822032) | more than 9 years ago | (#13377417)

new, cutting edge methodologies for identifying people....the new technology doesn't seem to be privacy intrusive

The Wookie defense in action!

Re:uhm, what? (2)

zxking (777919) | more than 9 years ago | (#13377487)

...the new technology doesn't seem to be privacy intrusive...

Give me a break. How can this be possible when the approach suggests using multiple tests rather than one, ranging from analyzing dated cookies and IP addresses to Flash Shared Objects?

Their approach seems to be common-sense. I believe most sites worth their salt do not use just one metric. Maybe if someone can get hold of the research paper and post it, then we can see if their implementation is really revolutionary. Another problem is that the guys, both the authors and the researchers, have not actually tested their methodology on a large-scale site.

Premature elation? Let us see the paper and decide.

You forgot JACU... (1)

ArsenneLupin (766289) | more than 9 years ago | (#13377625)

SCNR...

Re:uhm, what? (4, Insightful)

Shaper_pmp (825142) | more than 9 years ago | (#13378119)

"Their approach seems to be common-sense."

Their suggestion may be common-sense, but their approach borders on messianic:

"This article is going to ask you to make a paradigm shift... new, cutting edge methodologies... no web analytics product supports... a journey from first generation web analytics to second."

Followed by a lengthy paragraph on "paradigm shifts". In fact, the article takes three pages to basically say:

"In a nut-shell: To determine a web metric we should apply multiple tests, not just count one thing."

Here's a clue, Brandt Dainow - It's a common-sense way of counting visitors, not a new fucking religion.

The basic approach is to use a selection of criteria to assess visitor numbers - cookies first, then use different IPs/userAgents with close access-times to differentiate again, etc.

The good news is there are only three problems with this approach. The bad news is, that makes them effectively useless, or certainly not much more useful than the normal method of user-counting:

Problem 1
There is no information returned to a web server that isn't trivially gameable, and absolutely no way to tie any kind of computer access to a particular human:

"1. If the same cookie is present on multiple visits, it's the same person."

Non-techie friends are always wanting to buy things from Amazon as a one-off, so I let them use my account. Boom - that's up to twenty people represented by one cookie, right there.

"2. We next sort our visits by cookie ID and look at the cookie life spans. Different cookies that overlap in time are different users. In other words, one person can't have two cookies at the same time."

Except that I habitually leave my GMail account (for example) logged in both at work and at home. Many people I know use two or more "personal" computers, and don't bother logging out of their webmail between uses. That's a minimum of two cookies with overlapping timestamps right there, and only one person.

"3. This leaves us with sets of cookie IDs that could belong to the same person because they occur at different times, so we now look at IP addresses."

This isn't actually an operative step, or a test of any kind. It's just a numbered paragraph.

"4. We know some IP addresses cannot be shared by one person. These are the ones that would require a person to move faster than possible. If we have one IP address in New York, then one in Tokyo 60 minutes later, we know it can't be the same person because you can't get from New York to Tokyo in one hour."

FFS, has this guy ever touched a computer? For someone writing on technology he's pretty fucking out of touch. As an example, what about people who commonly telnet+lynx, VMWare or PCAnywhere, right across the world, hundreds of times in their workday? Sure, maybe most normal users don't (yet), but for some sites (eg, nerd-heavy sites like /.), it's likely enough to start skewing results.

"5. This leaves us with those IP addresses that can't be eliminated on the basis of geography. We now switch emphasis. Instead of looking for proof of difference, we now look for combinations which indicate it's the same person. These are IP addresses we know to be owned by the same ISP or company."

Except that one ISP can serve as many as hundreds of thousands of users. And proxy gateways often report one IP for all the users connected to them. For example, NTL reports one "gateway" IP for all the people in my town on cable-modems - that's thousands, minimum. So, we're looking at a potential error magnitude of 100-100,000. That's no better than the existing system for assessing unique visitors.

"6. We can refine this test by going back over the IP address/Cookie combination. We can look at all the IP addresses that a cookie had. Do we see one of those addresses used on a new cookie? Do both cookies have the same User Agent? If we get the same pool of IP addresses showing up on multiple cookies over time, with the same User Agent, this probably indicates the same person."

This is the first really useful piece of advice they've given, and forms the basis of their whole (trivial, obvious) approach. Why did it take 4 pages of hyperbole and gibberish to reach it?

"7. You can also throw Flash Shared Objects (FSO) into the mix. FSOs can't replace cookies, but if someone does support FSO you can use FSOs to record cookie IDs..."

Right, so all the weaknesses of cookies also apply to FSOs. I.e., they're no better than the current system.

Problem 2
As the article admits, the test doesn't give a single number - it gives a range of probabilities, and this is utterly useless in making business decisions.

Why do you think we've got the existing standard "unique visitor" methodology? Sure, we know it's inaccurate, but it gives a single number result that you can compare and contrast between sites, and a rough order of magnitude error. Try comparing probability-distributions between sites to assess which one's worth advertising on.

I don't agree with the impulse, but in business things are considered pretty useless unless they can be reduced to a single number - why else are executives and PHBs so fond of arbitrary scales and meaningless "scores"?

We've already demonstrated above that there's no reliable correlation between "people" and IPs/geographic sources/cookies/userAgents/etc. In the most extreme case, it's hypothetically possible that all your traffic for an entire day was the result of one clever automated botnet run by one single person, or that every hit with the same cookie was generated by a different actual person using the same account (yay for BugMeNot!).

Problem 3
Even if it's possible to experimentally weight these different unreliable factors to arrive at some kind of ideal, acceptably-accurate single value, it's a proprietary algorithm.

From TFA: "Magdalena and Thomas don't apply the same weight to each test, and they tell me their analysis of IP topology uses some smart technology they'd rather keep to themselves."

Obviously, if you want to use this magic method to assess site traffic between sites, all sites will have to use this algorithm to be comparable (apples to apples). They've clearly stated they're not planning on releasing the research as public-domain, so you're left with a choice - either every webmaster in the world buys their software, or we carry on using the open, free-to-implement algorithm we've been using up to now. Although it might give erroneous figures, the fact that everyone uses it still allows accurate trends (between sites and over time) to be extrapolated.

Oh yes, and they say the new algorithm reports roughly half the number of "unique users" that the standard algorithm does, on the same dataset. How many webmasters or SEO consultants are really going to voluntarily turn around and report to their bosses they're only doing half as well as they thought?

Basically, this is a non-event. It's a proprietary, potentially slightly more accurate way of measuring user figures, at the expense of those figures being of any use in the real world. And it's presented like the cure for cancer, by someone who's... oh yes, quelle surprise... the CEO of a web analytics firm. Bloody hyperbole-ridden BS artist [thinkmetrics.com] - there, that should stuff up his referrer logs if anyone follows the link >:-)

Re:uhm, what? (4, Insightful)

mwvdlee (775178) | more than 9 years ago | (#13377721)

Since their "cutting edge methodology" is basically all the previous methods cobbled together, how can it ever be LESS privacy intrusive than the methods it's made up of?

CPUID (4, Funny)

frinkacheese (790787) | more than 9 years ago | (#13377434)


Sending your PC's unique CPUID along with every HTTP request would be ideal for this. You could also group up websites and use this to track people across websites. It would be great for marketing and for law enforcement.

Oh, you all disabled your nice Intel CPUID? Why ever would you want to do that?

Re:CPUID (1)

WebCrapper (667046) | more than 9 years ago | (#13377444)

I don't even think there is a motherboard manufacturer that ships with the CPUID turned on anymore...

Re:CPUID (3, Interesting)

KillShill (877105) | more than 9 years ago | (#13377458)

Treacherous/Insidious Computing to the rescue.

no need for cpu id's when your entire system and its OS will generate a 128bit id for you. and give them out to "trusted" "partners".

remote attestation never sounded so good.

77340 Upsidedown St (0)

Anonymous Coward | more than 9 years ago | (#13377510)

I wonder what they will think when they start getting impossible bit patterns, like 7734 and 6027734 and 5773857734?

I wonder if they'll notice?

Hexadecimal would probably put the joke way past them.

Re:77340 Upsidedown St (1)

KillShill (877105) | more than 9 years ago | (#13377542)

that wouldn't be possible because on your crippled system you wouldn't have access to the data or network stream.

even if you managed to block the transfer, you just wouldn't be able to use that particular resource. and so the race begins to see if we meet our dystopian future or avert it... for a while until the "lobbyists" strike back.

Re:CPUID (1)

aussie_a (778472) | more than 9 years ago | (#13377680)

no need for cpu id's when your entire system and its OS will generate a 128bit id for you. and give them out to "trusted" "partners".

Which Linux distro does this? I'd like to avoid them.

Re:CPUID (1)

SolitaryMan (538416) | more than 9 years ago | (#13377492)

Sending your PCs unique CPUID along with every HTTP request would be ideal for this.

I understand that you are being ironic, but in fact, CPUID won't be a silver bullet either. These researchers are trying to count the number of different people visiting the site, not the number of different CPUs.

Re:CPUID (1)

frinkacheese (790787) | more than 9 years ago | (#13377520)


Indeed, but generally I would say that 1 person = 1 cpu, apart from shared cpus such as in schools, web cafes and such. But I guess that a combination of IP address and Browser information can pretty much do that already.

OK, so what is really needed is an RFID implant - take yer CPUID with you, then software can really be licensed to a PERSON rather than a processor. Pay Amazon every time you click(tm) on a link(tm).

Re:CPUID (3, Insightful)

aussie_a (778472) | more than 9 years ago | (#13377685)

Indeed, but generally I would say that 1 person = 1 cpu

Not really. I surf the internet at home and at school. I imagine I'm not alone. So I would be registered as two different people.

Indeed, but generally I would say that 1 person = 1 cpu, apart from shared cpus such as in schools, web cafes and such

You forgot "pretty much anyone who doesn't live alone and has a computer with internet access at home." Let's not forget that tiny percentage of people (I know, most slashdotters visit slashdot while avoiding work, but there are people out there who have families with more than one person using a single computer. It's crazy, I know).

Re:CPUID (1)

Ian.Waring (591380) | more than 9 years ago | (#13377727)

Sending your PCs unique CPUID along with every HTTP request would be ideal for this.

Nah. How long into the future do you reckon you'll have one CPU to one person? Anonymity is a deskside cluster...

Ian W.

Re:CPUID (1)

cheekyboy (598084) | more than 9 years ago | (#13378117)

You will never know for 100% sure, unless your site DISALLOWS all unauthenticated users, and shows zero content except a signup page.

So a rough guess of

UNIQUE USERS = (UNIQUE IPS - REAL UNIQUE LOGINS) /2

Will be spot on across the average of the whole planet for all 2 billion websites.

Mmmm gota luv statistics. Averages are your friend.

UIP? (4, Funny)

XanC (644172) | more than 9 years ago | (#13377436)

I tried to find out for myself, I really did. I can't figure out if any of these dictionary.com results apply. This is the complete list, and none of them seemed to fit. There's one kind of humorous one...

International Union of Private Wagons
Quimper, France - Pluguffan (Airport Code)
Ultimate Irrigation Potential
Uncovered Interest Parity
Undegraded Intake Protein
United International Pictures
Universidad Interamericana de Panamá
Unusual Interstitial Pneumonitis
Upgrade Improvement Program
Urinating In Public
User Interface Program
USIGS Interoperability Profile
Usual Interstitial Pneumonia of Liebow
Utilities Infrastructure Plan

Re:UIP? (1)

key134 (673907) | more than 9 years ago | (#13377504)

Unique IP? I think? Just a guess from context...

Re:UIP? (1)

JohnPerkins (243021) | more than 9 years ago | (#13377688)

International Union of Phlebology
Paraguayan Industrial Union
UCAR Intellectual Property
Unintended Pregnancies
Union Interparlementaire
Universal Immunization Program
University Interaction Program
Update In Progress
Urban Indicators Program
Utility Interface Panel...

...and that's enough for now. Bedtime for John.

Re:UIP? (2, Informative)

1u3hr (530656) | more than 9 years ago | (#13377707)

And strangely enough, this acronym isn't used in TFA at all. In fact, if the submitter did mean "Unique IP" that's not at all what the article is about (after all, that's trivial to record). They're looking for the number of unique individuals, and trying to deduce that from Cookies, IP, and other data.

Unique Individual? P???

Re:UIP? (1)

hattig (47930) | more than 9 years ago | (#13377731)

Unique Individual Porn-viewing-habits

Underwear Ingesting Parrot
Unified Identity Procedure

This is a new low for Slashdot. Not only is there an unexplained TLA in the article, no-one can actually work out what it stands for in the context of the story!

Re:UIP? (0)

Anonymous Coward | more than 9 years ago | (#13377764)

Sorry, but what is a TLA?

Re:UIP? (1)

Mostly a lurker (634878) | more than 9 years ago | (#13377728)

Could be "Unique Individual People" I suppose, but this is a classic example of the rule that all acronyms (other than those in universal use) should be explained on first use.

Re:UIP? (2, Funny)

Barsema (106323) | more than 9 years ago | (#13377772)

What you meant to say: all TLA not in TFA should be EFU ;-)

Re:UIP? (1)

karmatic (776420) | more than 9 years ago | (#13377757)

User Identification Persistence? Something that allows you to track users (a la a cookie), but is persistent in some way?

Step 4. . . (5, Insightful)

SpaceAdmiral (869318) | more than 9 years ago | (#13377438)

We know some IP addresses cannot be shared by one person. These are the ones that would require a person to move faster than possible. If we have one IP address in New York, then one in Tokyo 60 minutes later, we know it can't be the same person because you can't get from New York to Tokyo in one hour.

If my company had computers in New York and Tokyo, I could ssh between them in much less than 60 minutes. . .
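
The test itself is trivial to implement, for what it's worth -- which is part of why it proves so little. Here's a minimal sketch in Python, assuming you've already resolved each IP to a lat/long through some GeoIP database (that lookup isn't shown, and the speed threshold is arbitrary):

from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0
MAX_TRAVEL_KMH = 900.0  # roughly airliner speed; pick your own threshold

def distance_km(lat1, lon1, lat2, lon2):
    # great-circle distance via the haversine formula
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def could_be_same_person(visit_a, visit_b):
    # visit_* are (lat, lon, unix_time); returns False only when the
    # implied travel speed is physically impossible
    lat1, lon1, t1 = visit_a
    lat2, lon2, t2 = visit_b
    hours = abs(t2 - t1) / 3600.0
    if hours == 0:
        return distance_km(lat1, lon1, lat2, lon2) < 1.0
    return distance_km(lat1, lon1, lat2, lon2) / hours <= MAX_TRAVEL_KMH

# New York, then Tokyo one hour later -> "can't" be the same person
print(could_be_same_person((40.7, -74.0, 0), (35.7, 139.7, 3600)))  # False

Of course, the moment the "visit" is really an ssh session, VPN, or remote desktop on the other side of the world, the physical-speed assumption is meaningless.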

Re:Step 4. . . (0)

Anonymous Coward | more than 9 years ago | (#13377464)

I frequently do this from my office machines which are on a different ip block from my home cable modem. It helps when accessing sites like expedia that play games with prices that are offered. TFA's theories are nice, but like the other methodologies listed, they provide probabilities, not certainty. My goal is to be in the unpredictable bunch, if possible. Or did you know I was going to say that?

Re:Step 4. . . (1)

jrumney (197329) | more than 9 years ago | (#13378151)

I've encountered a site that played games with its prices before. When trying to book the lowest priced fares - the ones that entice you onto the site, every second mouse click resulted in a 500 error from the server. When trying to book a more expensive fare, everything went smoothly. The only way I could successfully book their low priced fares was to open two browser windows with the same session, and alternate between the steps for booking a high priced fare (aborting at the last minute) and the low priced one. Once I'd successfully booked the low priced fare, I ended up wondering whether I was going to turn up to the airport to discover that that flight did not actually exist.

Re:Step 4. . . (1)

antic (29198) | more than 9 years ago | (#13377475)


What percentage of people do you think do that?

Assumption is valid for 95% cases. (1)

mveloso (325617) | more than 9 years ago | (#13377522)

After some thought, I'd probably agree that step 4 is valid for the vast majority of web users.

The only way this might break is if a large number of people are sitting behind a proxy/cache. But if that is the case, they have fallbacks.

Re:Assumption is valid for 95% cases. (1)

hal9000(jr) (316943) | more than 9 years ago | (#13378024)

The only way this might break is if a large number of people are sitting behind a proxy/cache. Three letters--AOL.

Re:Step 4. . . (1)

CoolGopher (142933) | more than 9 years ago | (#13377560)

If my company had computers in New York and Tokyo, I could ssh between them in much less than 60 minutes. . .

The point is, most people wouldn't do that, and those who do wouldn't be numerous enough to skew the metrics too badly.

However, having said that, it is quite possible to have a network configured for high availability such that if you lose your local internet link, traffic gets routed via your internal network and out another internet link in another office. Frequently this office is in another country...

Would that be enough to stuff up the metrics? I have no idea, but it'd be worth considering before going all "oooooh" about this "paradigm shift" (was I the only one who missed what it was supposed to be? What's the news here really? Oh, wait, this is Slashdot...)

Re:Step 4. . . (1)

jrumney (197329) | more than 9 years ago | (#13378129)

However, having said that, it is quite possible to have a network configured for high availability such that if you lose your local internet link, traffic gets routed via your internal network and out another internet link in another office. Frequently this office is in another country...

Even more frequently your "internal" link to the remote office is via VPN over your local link, so this isn't really an option. Redundant local links are more likely.

Re:Step 4. . . (1)

lgftsa (617184) | more than 9 years ago | (#13377806)

A corporate WAN with multiple routes to the internet and load-balanced http proxies would do it, too.

Re:Step 4. . . (1)

cheekyboy (598084) | more than 9 years ago | (#13378137)

So if only 1% of users are like you, then we will take all hits from your 'case', divide them by 100, and add that to our unique users count. There you will be counted, but fairly. It's all about having two decision trees, counting both totals, and using a percentage of each for your final tally. A great, accurate result at a low resolution of time (think audio kHz; 1 hr = ~0.00028 Hz). So, just as a photo of Mars showing the whole planet vs. a photo of one tiny rock: a wider, lower-res view gives us a new 'global picture' that is accurate but very low in detail. Call it statistical anti-aliasing.

Field test (2, Funny)

enoraM (749327) | more than 9 years ago | (#13377441)

iMediaConnection starts a huge field test of tracking unique slashdot readers with their cutting edge technologies.

And uip is ? (0, Redundant)

core (3330) | more than 9 years ago | (#13377448)

Surely not urinating in public, although that would be important to track, too.

--
Atlantis, a runaway hit, ball matching game for mac: http://www.funpause.com/ [funpause.com]

I'm glad it isn't Rocket Science (3, Interesting)

elronxenu (117773) | more than 9 years ago | (#13377449)

He fails to consider the possibility of the same user using different browsers (and hence the same IP address, but different cookies, and a different browser identification string).

So you can use probabilistic means to identify unique visitors. That's not a paradigm shift, except for those whose paradigms are already very small.

Somehow I don't think this research is worthy of an NDA.

Re:I'm glad it isn't Rocket Science (2, Insightful)

Tony.Tang (164961) | more than 9 years ago | (#13377491)

Mod this parent up.

I don't mean to be a poo poo here, but this isn't as huge a deal as the author has made it sound (i.e. it certainly is not a "paradigm shift").

Instead, what we have here is an evolutionary suggestion in how we can track users more accurately. Kudos.

As with all solutions in CS, there are problems. As the parent has correctly observed, this doesn't solve the "multiple browsers, same user" problem (which is common -- you probably use a different computer at work than at home). I am not certain, but realistically this process may only solve the "this is the same browser" problem -- many users simply leave their credentials in place (i.e. stay logged in -- say, to /.).

Re:I'm glad it isn't Rocket Science (1)

fsterman (519061) | more than 9 years ago | (#13377566)

While I agree this is hardly a paradigm shift, I think the poster is grasping at straws with his/her example. How many people surf between two browsers? I switch browsers when FF can't handle something. I migrate to a new browser every time something compelling comes along. How many people switch browsers in the same month?

Computers, that might be a larger percentage. But even then more tests could be done. Message boards you distract yourself with at work that have a login system which sets an everlasting cookie with a unique ID would be trackable across locations. What percentage of all internet users have more than one computer they browse a variety of sites with?

Re:I'm glad it isn't Rocket Science (1)

byssebu (797117) | more than 9 years ago | (#13377686)

The way a user moves the mouse in the browser, together with statistical analysis of the keypresses, can serve as an anonymous fingerprint. If all browsers supported that, identifying users would work across many browsers, platforms and IPs. Of course I don't know if it's sufficiently unique between users...

Re:I'm glad it isn't Rocket Science (0)

Anonymous Coward | more than 9 years ago | (#13377697)

So you can use probabilistic means to identify unique visitors.

No, you can't. Person A visits a website, and finds nothing of interest on the home page, so he closes his browser. Person B does the same.

If they use the same ISP and are located in roughly the same region, then Person B will simply be downloading the resources that make up the front page from his ISP's shared cache.

What to do? Disable caching for your HTML pages? Great, you've slowed down your whole website for everyone, increased your bandwidth bill substantially, and added load to your server.

Put an embedded 1x1 image on the page, which is uncachable? Okay, so now your pages can be cached, so the performance won't be as bad, but now you aren't counting anybody who doesn't load images (e.g. Lynx users, smart dialup users, people running proxies that disable 1x1 images, etc).

Do the same, but with Javascript? Same problem with the images.

The bottom line is that the only way you can reasonably rely on a visitor showing up in your logs is to disable caching for your HTML, which has serious performance issues. And that doesn't even begin to address the problem of distinguishing between visitors, which is an even harder problem.
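
For reference, the 1x1-image trick mentioned above is about as simple as web tracking gets; here's a rough sketch of an uncacheable tracking-pixel endpoint using only Python's standard library (the port and log format are made up for illustration):

import time
from http.server import BaseHTTPRequestHandler, HTTPServer

# a 1x1 transparent GIF, the classic "web bug"
PIXEL = (b'GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00'
         b'\xff\xff\xff!\xf9\x04\x01\x00\x00\x00\x00'
         b',\x00\x00\x00\x00\x01\x00\x01\x00\x00\x02\x02D\x01\x00;')

class PixelHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # log whatever identifying bits the request carries
        print(time.time(), self.client_address[0],
              self.headers.get('User-Agent', '-'),
              self.headers.get('Cookie', '-'))
        self.send_response(200)
        self.send_header('Content-Type', 'image/gif')
        self.send_header('Content-Length', str(len(PIXEL)))
        # the whole point: forbid caching so every page view hits the log
        self.send_header('Cache-Control', 'no-store, no-cache, must-revalidate')
        self.send_header('Pragma', 'no-cache')
        self.send_header('Expires', '0')
        self.end_headers()
        self.wfile.write(PIXEL)

if __name__ == '__main__':
    HTTPServer(('', 8080), PixelHandler).serve_forever()

And, as noted, it still misses anyone who doesn't load images, so it only shifts the inaccuracy around rather than removing it.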

What's the new definition of privacy these days? (4, Insightful)

Anonymous Coward | more than 9 years ago | (#13377454)

"This way Flash can report to the system all the cookies a machine has held. In addition to identifying users, you can use this information to understand the cookie behavior of your flash users"

I'm not sure what the Flash is, but to me, scanning all the cookies your computer has had IS privacy intrusive.

Re:What's the new definition of privacy these days (1)

Krach42 (227798) | more than 9 years ago | (#13377593)

Not to mention a security flaw.

When you visit my site, you agree to download and run a Flash/ActiveX control that grabs all your cookies for slashdot.org, and then sends them to me, so that I can now present false credentials to slashdot.org to make it think that I have auto-login privileges.

Awesome design flaw there, but I highly doubt anyone is THAT stupid to put THAT big of a security flaw into a system.

Too much faith in humanity? (4, Informative)

Moraelin (679338) | more than 9 years ago | (#13377889)

"I highly doubt anyone is THAT stupid to put THAT big of a security flaw into a system."

Read the article, and the guy is proposing to build exactly that kind of a security flaw into the system.

Flash can use, basically, some local shared storage on your hard drive. This isn't really designed as cookie storage, and doesn't have even the meager safeguards that cookies have (e.g., being tied only to a domain). It's really a space that _any_ flash applet can read and write, and currently no one (with half a clue) puts any important data there.

This guy's idea? Basically, "I know, let's store cookies there, precisely _because_ any other flash applet, e.g., our own again from a different page, can read that back again."

Caveat: so can everyone else. I could make a simple flash game that grabs everything stored there, just as you described, and sends it back to me. Including, yes, your session id (so, yes, I can take over your session in any site you were logged in, including any e-commerce sites or your bank) and anything else they stored there.

Since it's used to track your movements through sites, depending on how cluelessly that's programmed, I may (or may not) also be able to gather all sorts of other information about you.

So in a nutshell his miracle solution is to build _exactly_ that kind of a vulnerability (not to mention privacy leak) into the system.

So, well, that's the problem with assuming that "no one could be THAT stupid". Invariably when I say that, someone kindly offers himself as living proof that I'm wrong. Someone CAN be that stupid.

Re:What's the new definition of privacy these days (1)

Finitepoint (132311) | more than 9 years ago | (#13377602)

"I'm not sure what the Flash is"

In this case I think the "Flash" being referred to is Macromedia's Flash plugin. He's not very clear though is he?

Re:What's the new definition of privacy these days (0)

Anonymous Coward | more than 9 years ago | (#13377679)

Macromedia Flash has a local shared object (LSO), which is similar to a cookie, but less well known.
I presume the proposed tactic is to set a cookie with an ID and add the same ID to an LSO. That lets you see what happens to your cookies over time, as long as the LSO doesn't get deleted.
Since there is no LSO management tool by default, LSOs have better lifetimes than cookies.
There is a Firefox extension [yardley.ca], however, that lets you view and delete those LSOs.
I expect this functionality will eventually become more widespread, giving LSOs the same kind of reliability as cookies.
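
On the server side, stitching those identifiers back together is just a union-by-shared-key exercise. A rough sketch (the record fields are invented), assuming each logged visit carries its current cookie ID plus whatever cookie IDs the LSO says the machine has held:

from collections import defaultdict

def merge_cookie_ids(visits):
    # visits: iterable of dicts like
    #   {'cookie_id': 'abc', 'lso_cookie_ids': ['abc', 'old1']}
    # Returns a mapping cookie_id -> canonical visitor id, grouping every
    # cookie ID that the Flash shared object ties together.
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path compression
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for v in visits:
        for old in v.get('lso_cookie_ids', []):
            union(v['cookie_id'], old)

    return {c: find(c) for c in parent}

visits = [
    {'cookie_id': 'abc', 'lso_cookie_ids': ['abc']},
    {'cookie_id': 'xyz', 'lso_cookie_ids': ['xyz', 'abc']},  # same machine, new cookie
]
groups = merge_cookie_ids(visits)
print(len(set(groups.values())))  # 1 distinct "visitor" instead of 2

Naturally, the moment the LSO is wiped along with the cookies, you're back where you started.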

Adjusting Macromedia Flash Settings (4, Informative)

buro9 (633210) | more than 9 years ago | (#13377839)

Macromedia have a page that allows you to modify what sites can do on your computer in regards to Flash:
http://www.macromedia.com/support/documentation/en/flashplayer/help/settings_manager02.html#118539 [macromedia.com]

beware of the tracking on that page too (0)

Anonymous Coward | more than 9 years ago | (#13377914)

right here [macromedia.com] from our "friends" at Omniture [ca.com]

so visiting a page where you can adjust your privacy settings will actually compromise your privacy;
now you can see why the privacy GUI is on Macromedia's site and not built into the player, but that's not surprising [clickz.com]

seems Flash is slowly becoming spyware, shame

Protect yourself from this new menace (1)

cortana (588495) | more than 9 years ago | (#13378090)

Wow, I knew that the Flash settings UI was badly designed, confusing and annoying to use. I didn't know that it set up tracking with partner sites as well.

Besides, what steps does Flash take to ensure that any old web site can't just reset these permissions, or exempt itself from the 'no local storage' policy you set?

Don't bother visiting Macromedia's site at all:

find / \( -name libflashplayer.so -o -name libflashplayer.xpt \) -exec rm {} \;

If you really can't live without it, try this instead:

chmod --recursive 500 ~/.macromedia

Re:Adjusting Macromedia Flash Settings (0)

Anonymous Coward | more than 9 years ago | (#13378142)

Even better, there is a one-liner command to completely manage Flash privacy settings under a Cygwin bash shell...

$ dd if=/dev/urandom bs=1024 count=808 \
> of=/cygdrive/c/Program$'\x20'Files/mozilla.org/Mozilla/Plugins/NPSWF32.dll


Just remember to adjust the path and filename for your installation version and O/S.

What's new about this? (1)

nitelord (824762) | more than 9 years ago | (#13377455)

What's so new about this? How is this news? Very little substance to the article, plus I've been using IPs, Cookies and Logins to track people for a long time.

Paradigm shift ? (2, Insightful)

l3v1 (787564) | more than 9 years ago | (#13377472)

No single test is perfectly reliable, so we have to apply multiple tests.

No kidding. This guy probably needs a wake up call.

We know some IP addresses cannot be shared by one person. These are the ones that would require a person to move faster than possible. If we have one IP address in New York, then one in Tokyo 60 minutes later, we know it can't be the same person because you can't get from New York to Tokyo in one hour.

Ok, so this is what is normally called really stupid argumentation. I don't say that it can't be accounted for, but stating such a thing is nothing more than plain stupidity. Has this guy ever heard about that Internet thing?

Flash can report to the system all the cookies a machine has held.

Uhmm, not a great argument to make people use it.

No one wants to know.

I don't think it's that they don't want to know. They just don't want to see a sudden ~50% drop in their user count from one day to the next. And it really doesn't matter if it's the truth or not. A drop is a drop.

Re:Paradigm shift ? (1)

hal9000(jr) (316943) | more than 9 years ago | (#13378066)

I don't think it's that they don't want to know. They just don't want to see a sudden ~50% drop in their user count from one day to the next. And it really doesn't matter if it's the truth or not. A drop is a drop.

Replace the word "they" with "companies that derive revenue from web traffic." This guy makes his money from selling analytics software so that companies can track the success of their web sites and, based on tracking, make modifications, sell advertising, marketing, and so forth.

Seeing a 30-50% drop in unique visitors would be bad even if it were true. And would you want to be the only site out there competing for advertising dollars that shows 30-50% less traffic, even if it is more honest? Hell no, you wouldn't. You wouldn't be able to convince the media buyers that you're right and your competition is wrong. For that matter, do you really want to go to your board and explain that your shiny new 6-figure website is really only serving half of the customers you thought it was serving?

Privacy (2)

JohnGrahamCumming (684871) | more than 9 years ago | (#13377476)

What's more interesting, the new technology doesn't seem to be privacy intrusive

The only mention of the word "privacy" on the linked web page is the term "Privacy Policy" at the bottom of the page.

John.

UIP = (0)

Anonymous Coward | more than 9 years ago | (#13377480)

Unique IP

Re:UIP = (1)

ZeroExistenZ (721849) | more than 9 years ago | (#13377705)

Thanks for that,
I was already wondering where the pictures were for "Uniquely Inserted Probes", as this article seems to be announced as such a big breakthrough.

What the fuck is "UIP hit"? (0)

Anonymous Coward | more than 9 years ago | (#13377481)

It would be great if submitters would add a sentence or two explaining the key acronym(s) in their article instead of assuming that everybody already knows about their pet interest. RTFA to find out? Why should I waste my time reading an article that might not be of any interest to me?

Re:What the fuck is "UIP hit"? (1)

cheekyboy (598084) | more than 9 years ago | (#13378149)

If they knew what HTML was, they could have used an http:// link. But DUH... get a clue. Get a "web pages for dummies" guide, man.

crap again. (4, Insightful)

gunix (547717) | more than 9 years ago | (#13377483)

From the article:

" We know some IP addresses cannot be shared by one person. These are the ones that would require a person to move faster than possible. If we have one IP address in New York, then one in Tokyo 60 minutes later, we know it can't be the same person because you can't get from New York to Tokyo in one hour."

Ever heard of ssh and similar tools to make that travel?
And they put this on slashdot. Ignorance, just pure ignorance...

Re:crap again. (1)

October_30th (531777) | more than 9 years ago | (#13377493)

The majority of people using computers will never use ssh in their lives. It's not the perfect solution, but it's not complete crap either.

Proof (1)

Tune (17738) | more than 9 years ago | (#13377742)

The article points to Magdalena Urbanska and Thomas Urbanski's original research paper, which "reveals" its validity through a "mathematical proof" (8 pages of formulas, so it must be true). Of course, anyone with a post-1990s knowledge of the thing called the "internet" would know that merely mentioning the existence of a research paper has been replaced by the thing called a "hyperlink".

Googling, I found little more than this link to ARF [thearf.org], an unknown organization boldly calling itself "The Research Authority" (with capitals, mind you).

No paper (at least not online) and no references to the institute or organization that Magdalena Urbanska and Thomas Urbanski might be affiliated with. Their daily lives seem to be spent as "researchers" - whatever that may mean.

It looks like a hoax. A sad hoax. Because why would anyone want to hoax a story this sad?

Re:Proof (1)

cortana (588495) | more than 9 years ago | (#13378058)

Making idiot PHBs of marketing companies burn their money investigating this new technology seems like a good idea to me.

Re:crap again. (1)

farnz (625056) | more than 9 years ago | (#13377816)

Not SSH, but web caches. If corporate insists that you browse via a cache setup that fails over from the New York link to the Tokyo link whenever the internal network conditions make it worthwhile, you'll merrily surf from all sorts of addresses.

Still doesn't help deleted cookies (5, Insightful)

mattso (578394) | more than 9 years ago | (#13377488)

They make some silly assumptions that I don't think work with users using proxy agents, but in the end it still boils down to the existence of cookies. Which would be ok, if the problem they are trying to solve wasn't that users are deleting and not storing cookies at all. They do mention using Flash to store cookies, which I suspect will have to be the next area users will have to start cleaning up. But just because cookies don't overlap in time and the IP address is the same doesn't mean it's the same person. With this system, a bunch of users who use the same browser, share an IP address, and always delete their cookies will look like one user. Vastly under counting. Which I don't think web sites are interested in. Vast over counting is profitable. Under counting, not so much.

In the end there is no way they can even mostly recognize repeat web site visitors if the VISITOR DOESN'T WANT THEM TO.

The big problem is stated at the top of the article:

"We need to identify unique users on the web. It's fundamental. We need to know how many people visit, what they read, for how long, how often they return, and at what frequency. These are the 'atoms' of our metrics. Without this knowledge we really can't do much."

If knowing who unique users are is that important they need to create a reason for the user to correctly identify themselves. Some form of incentive that makes it worth giving up an identification for.

Re:Still doesn't help deleted cookies (1)

fsterman (519061) | more than 9 years ago | (#13377591)

AFAIK (someone correct me as I don't have a test machine right here) these programs don't delete ALL cookies, they delete _ad_based_ cookies. So, say, /. and Amazon will still have their cookies while known ad companies' cookies will be gone.
The less effort it takes to make an account / log in, the less incentive is required. Please go through as few steps as possible, with log-in and account creation on the same page as a reply box. Having that reply box on the same page is nice too. Go and read up on GOMS: http://en.wikipedia.org/wiki/GOMS [wikipedia.org]

Re:Still doesn't help deleted cookies (1)

mattso (578394) | more than 9 years ago | (#13377729)

It wasn't clear to me on a first reading but the article is all about cookies being used by ad companies. It doesn't say that, but only these spyware identified cookies are seeing deletion rates as high as they quote. This article is really aimed at helping advertising networks that have been labeled as spyware and are being regularly deleted by the various anti-spyware apps. What they talk about doesn't really apply on a site level, where the deletion of cookies isn't so common. Of course the banner ads can't easily do more reliable user authentication. So they are totally out of luck basically. In which case combining a few different ways to try and count users might make sense. But in the long run if people don't like and don't want their advertising, they are going to lose. It doesn't matter how they count, if enough users block them and their sites they will go out of business.

We need to know how many people visit, what they r (1)

rednuhter (516649) | more than 9 years ago | (#13377724)

"We need to know how many people visit, what they read, for how long,"

Even before tabbed browsing I would often have well over 10 browser windows open; if they measure what I have open for the time the window is open, then they will get VERY skewed results.

Also I have two monitors so windows in the foreground do not always translate to windows I am focusing on. (also high resolution monitors could produce the same effect)

I wish them luck because they need it.

Tragically flawed (5, Insightful)

tangledweb (134818) | more than 9 years ago | (#13377490)

The article's "Sky is Falling" tone rests on a single factoid. "30 to 55% of users delete cookies" therefore current analytics products are out by "at least 30 percent, maybe more".

That is of course complete nonsense. Let's say we accept the author's assertion that different studies have given cookie deletion rates across that range. I can accept that a significant number of users might delete cookies at some point, but what percentage of normal, non-geek, non-tinfoil-hat-wearing users are deleting cookies between page requests to a single site in a single session? If it is 30%, then I will eat my hat.

Most cookie deletion among the general populace will be done automatically by anti-spyware software, and is not done in real time.

The author clearly knows that even the most primitive of tools also use other metrics to group page requests into sessions, so even if 30% of users were deleting cookies, it would not result in a 30% inaccuracy.

Of course "researchers propose a more complex heuristic that looks to be slightly more accurate than current practice" does not make as good a story as "paradigm shift" blah blah "blows out of the water" blah blah "We've been off by at least 30 percent, maybe more." blah blah.

Re:Tragically flawed (1)

Andy_R (114137) | more than 9 years ago | (#13377829)

If it's true that "30 to 55% of users delete cookies" and therefore current analytics products are out by "at least 30 percent, maybe more", then all they need to do is add 42.5% to their numbers and they'll be at most 12.5% out. Did I just shift a paradigm?

Re:Tragically flawed (2, Insightful)

Stephen (20676) | more than 9 years ago | (#13377931)

The article's "Sky is Falling" tone rests on a single factoid. "30 to 55% of users delete cookies" therefore current analytics products are out by "at least 30 percent, maybe more".

That is of course complete nonsense. [...] I can accept that a significant number of users might delete cookies at some point, but what percentage of [...] users are deleting cookies between page requests to a single site in a single session? If it is 30%, then I will eat my hat.

The author clearly knows that even the most primitive of tools also use other metrics to group page requests into sessions, so even if 30% of users were deleting cookies, it would not result in a 30% inaccuracy.

While I don't want to defend the article, you're missing one crucial point here. Grouping requests into sessions with "good enough" accuracy is easy even without cookies. What companies find cookies essential for is to measure latent conversion. For example: which Google ads best convert into sales, even if the sale doesn't happen for several days? For this sort of analysis, cookie deletion is a problem, and becomes a bigger problem the longer there typically is between lead and sale.
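
To make the latent-conversion point concrete: the usual approach is to tie each sale back to the most recent ad click from the same cookie within some attribution window, which only works if that cookie survives the gap. A rough sketch (window length and record shapes are arbitrary):

from datetime import datetime, timedelta

ATTRIBUTION_WINDOW = timedelta(days=30)

def attribute_sales(ad_clicks, sales):
    # ad_clicks: list of (cookie_id, campaign, time); sales: list of (cookie_id, time)
    # Returns {campaign: attributed sale count}. A deleted cookie breaks the join.
    counts = {}
    for cookie, sale_time in sales:
        candidates = [(t, campaign) for c, campaign, t in ad_clicks
                      if c == cookie and t <= sale_time <= t + ATTRIBUTION_WINDOW]
        if candidates:
            _, campaign = max(candidates)           # last click wins
            counts[campaign] = counts.get(campaign, 0) + 1
    return counts

clicks = [('c1', 'google_ad_A', datetime(2005, 8, 1)),
          ('c1', 'google_ad_B', datetime(2005, 8, 10))]
sales = [('c1', datetime(2005, 8, 15))]
print(attribute_sales(clicks, sales))   # {'google_ad_B': 1}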

Um, nope. Can't happen. (4, Insightful)

DroopyStonx (683090) | more than 9 years ago | (#13377505)

I develop web analytic software for a living.

There's only so much you can do to track users.

IP address, user agent, some javascript stuff for cookieless tracking.. the only real "unique" identifiers for any one visitor. It stops there.

Of course, using exploits in flash doesn't count, but supposedly this new method is "not intrusive."

I call BS because it simply can't happen.

If a user doesn't wanna be tracked, they won't be tracked. This story is just press, free advertisement, and hype for this particular company.

Re:Um, nope. Can't happen. (4, Funny)

rhizome (115711) | more than 9 years ago | (#13377734)

If a user doesn't wanna be tracked, they won't be tracked. This story is just press, free advertisement, and hype for this particular company.

Whoa, whoa...let's not fly off the handle here! We don't know that they didn't pay anything.

yeah well ... (0, Offtopic)

dancallaghan (890674) | more than 9 years ago | (#13377506)

Analyse this, bitch!

*slashdots his server*

Re:yeah well ... (1)

dancallaghan (890674) | more than 9 years ago | (#13377514)

Crap, I knew I'd forget something [thinkmetrics.com] ...

Sounds like voodoo to me... (1)

Black Art (3335) | more than 9 years ago | (#13377508)

Why do I have this feeling like this "cutting edge technology" involves the entrails of an animal and some form of divination?

God damn arms race (0)

Anonymous Coward | more than 9 years ago | (#13377540)

The problem with cookie deletion is not that it happens, but that we've been relying on a single method for identifying people.

I'm so happy that we have other ways of tracking people. I mean, whenever I clear out my cookies, I'm thinking to myself, "But now how will the Man track my online activities?" Now I can clear out cookies and once again feel safe with the knowledge that somewhere, somebody knows everything I do online.

MInUS 1, TROLL) (-1, Troll)

Anonymous Coward | more than 9 years ago | (#13377574)

to the crowd in achieve any of the or m4ke loud noises to any BSD project, real problems

Paradigm shift ?!? (5, Insightful)

rduke15 (721841) | more than 9 years ago | (#13377618)

When I read "paradigm shift" in the very first paragraph, my bullshit sensor sounded such a loud alarm that it was hard to continue reading...

Re:Paradigm shift ?!? (2, Informative)

fbg111 (529550) | more than 9 years ago | (#13377919)

And the fact that he actually felt the need to explain what a "paradigm shift" is to his audience - undoubtedly consisting of cynical techies - as if we'd never been (over)exposed to the concept before, quadrupled the BS meter. Honestly, was he born yesterday?

Oblig Dr. Evil Quote: [about his new "laser"] You see, I've turned the moon into what I like to call a... "Death Star".

Tropical Storm Jose TROLL (0)

Anonymous Coward | more than 9 years ago | (#13377624)

WTNT31 KNHC 230536
TCPAT1
BULLETIN
TROPICAL STORM JOSE INTERMEDIATE ADVISORY NUMBER 4A
NWS TPC/NATIONAL HURRICANE CENTER MIAMI FL
1 AM CDT TUE AUG 23 2005 ...CENTER OF JOSE MAKES LANDFALL ON THE COAST OF MEXICO...WAS
GETTING BETTER ORGANIZED AT LANDFALL...

A TROPICAL STORM WARNING REMAINS IN EFFECT FOR THE GULF COAST OF
MEXICO FROM VERACRUZ NORTHWARD TO CABO ROJO. THIS WARNING WILL
LIKELY BE DISCONTINUED LATER TODAY.

FOR STORM INFORMATION SPECIFIC TO YOUR AREA...INCLUDING POSSIBLE
INLAND WATCHES AND WARNINGS...PLEASE MONITOR PRODUCTS ISSUED
BY YOUR LOCAL WEATHER OFFICE.

DATA FROM THE MEXICAN RADAR AT ALVARADO INDICATE THAT THE CENTER OF
JOSE HAS MADE LANDFALL ON THE EASTERN COAST OF MEXICO.

AT 1 AM CDT...0600Z...THE CENTER OF TROPICAL STORM JOSE WAS LOCATED
NEAR LATITUDE 19.8 NORTH... LONGITUDE 96.8 WEST OR ABOUT 60
MILES... 95 KM... NORTHWEST OF VERACRUZ MEXICO AND ABOUT 90 MILES...
145 KM...SOUTH-SOUTHEAST OF TUXPAN MEXICO.

JOSE IS MOVING TOWARD THE WEST NEAR 9 MPH... 14 KM/HR... AND THIS
GENERAL MOTION IS EXPECTED TO CONTINUE DURING THE NEXT 24 HOURS.
ON THIS TRACK... THE CENTER OF JOSE SHOULD MOVE FARTHER INLAND INTO
THE MOUNTAINS OF EASTERN MEXICO TODAY.

MAXIMUM SUSTAINED WINDS ARE NEAR 50 MPH... 85 KM/HR...WITH HIGHER
GUSTS. JOSE SHOULD WEAKEN AS THE CENTER MOVES FARTHER INLAND. THE
ALVARADO RADAR INDICATED THAT JOSE WAS BECOMING BETTER ORGANIZED IN
THE LAST FEW HOURS BEFORE LANDFALL...AND THE MAXIMUM SUSTAINED
WINDS AT LANDFALL MAY HAVE BEEN HIGHER THAN 50 MPH.

TROPICAL STORM FORCE WINDS EXTEND OUTWARD UP TO 45 MILES... 75 KM
FROM THE CENTER.

ESTIMATED MINIMUM CENTRAL PRESSURE IS 1001 MB...29.56 INCHES.

RAINFALL ACCUMULATIONS OF 3 TO 5 INCHES...WITH ISOLATED HIGHER
AMOUNTS OF UP TO 10 INCHES OVER THE HIGHER ELEVATIONS...CAN BE
EXPECTED IN ASSOCIATION WITH JOSE. THESE RAINS COULD CAUSE
LIFE-THREATENING FLASH FLOODS AND MUD SLIDES.

REPEATING THE 1 AM CDT POSITION...19.8 N... 96.8 W. MOVEMENT
TOWARD...WEST NEAR 9 MPH. MAXIMUM SUSTAINED
WINDS... 50 MPH. MINIMUM CENTRAL PRESSURE...1001 MB.

THE NEXT ADVISORY WILL BE ISSUED BY THE NATIONAL
HURRICANE CENTER AT 4 AM CDT.

FORECASTER BEVEN

We've been using an improved method for years... (1)

maedls.at (663045) | more than 9 years ago | (#13377637)


and are already tired of explaining to customers why the unique visitor counts differ from their built-in log-file analysis.

See CheckEffect [checkeffect.at] for details.

UIP (0)

Anonymous Coward | more than 9 years ago | (#13377651)

I don't WANT to be tracked you insensitive cloud!

Some additional steps. (2, Interesting)

Saggi (462624) | more than 9 years ago | (#13377659)

The article spends a lot of time establishing that this is a paradigm shift, when it's actually not. I do believe their idea is good, but basically it's just applying a lot of "possible" user identifiers and merging them together to form a unified result.

Some of the identifiers they haven't used are linkage on the site. If one page links to another, it might be the same user, if the pages are called in sequence.

On top of links "time" might be applied. Some links are expected to be clicked fast, others after some reading on the page.

Some may argue that linkage is what you want to determine in the following analysis, and therefore can't be used to determine the user in advance, but this is not true. The determination of user uniqueness looks to see whether it was possible for the user to get from one page to another, while the analysis wants to determine whether they actually did.

Haven't we learned anything from the Simpsons? (1)

halcyon1234 (834388) | more than 9 years ago | (#13377662)

Excuse me, but "proactive" and "paradigm"? Aren't these just buzzwords that dumb people use to sound important?

I mean, seriously folks-- there is a reason why these things are mocked.

More than just cookies (3, Informative)

wranlon (540319) | more than 9 years ago | (#13377665)

ROI is mentioned, along with the 'atoms' of their metrics: page hit count, popular URL count, URL dwell time, and returning visitors. When these metrics are used to produce reports, how valuable are these reports in ascertaining how ROI is affected by said metrics? For example, getting a neat funnel report of the path people take through a site and where the traffic drops off offers insight into popular paths and locations where people bail out, but apart from listening for errors, there is no further insight into why a person bailed.

What seems to be missing is gathering insightful information into what transpires while someone is on a particular page. I'd like to know the general trends in behavior [whitefrost.com], not just the server requests. I've found it more useful to be able to see the interactions with the content than reporting where people enter, traverse, and exit a site.

Cutting edge? ha! (2, Insightful)

ZeroExistenZ (721849) | more than 9 years ago | (#13377693)

"If the same cookie is present on multiple visits, it's the same person. We next sort our visits by cookie ID"

Only after that do they seem to continue the analysis ("We know some IP addresses cannot be shared by one person. These are the ones that would require a person to move faster than possible", etc.)

Thus turning off or regularly removing cookies will render their bleeding cutting-edge technology useless? And how are cookies a 'breakthrough'? Their only alternative to this seems to be:
You can also throw Flash Shared Objects (FSO) into the mix. FSOs can't replace cookies, but if someone does support FSO you can use FSOs to record cookie IDs.

I don't know what the fuss is about.

This is just basic logic, which any decent programmer should be able to come up with, even the M$ certified ones.

The Meat of the Article (3, Informative)

RAMMS+EIN (578166) | more than 9 years ago | (#13377701)

For those who can't be bothered to read through all the buzzwords, here's the actual method used:

Each of these steps is applied in order:

      1. If the same cookie is present on multiple visits, it's the same person.

      2. We next sort our visits by cookie ID and look at the cookie life spans. Different cookies that overlap in time are different users. In other words, one person can't have two cookies at the same time.

      3. This leaves us with sets of cookie IDs that could belong to the same person because they occur at different times, so we now look at IP addresses.

      4. We know some IP addresses cannot be shared by one person. These are the ones that would require a person to move faster than possible. If we have one IP address in New York, then one in Tokyo 60 minutes later, we know it can't be the same person because you can't get from New York to Tokyo in one hour.

      5. This leaves us with those IP addresses that can't be eliminated on the basis of geography. We now switch emphasis. Instead of looking for proof of difference, we now look for combinations which indicate it's the same person. These are IP addresses we know to be owned by the same ISP or company.

      6. We can refine this test by going back over the IP address/Cookie combination. We can look at all the IP addresses that a cookie had. Do we see one of those addresses used on a new cookie? Do both cookies have the same User Agent? If we get the same pool of IP addresses showing up on multiple cookies over time, with the same User Agent, this probably indicates the same person.

      7. You can also throw Flash Shared Objects (FSO) into the mix. FSOs can't replace cookies, but if someone does support FSO you can use FSOs to record cookie IDs. This way Flash can report to the system all the cookies a machine has held. In addition to identifying users, you can use this information to understand the cookie behavior of your flash users and extrapolate to the rest of your visitor population.
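
And since it's really just bookkeeping, here's what a deliberately simplified version of steps 1, 2 and 6 looks like in Python: group hits by cookie, treat cookies whose lifespans overlap as different people, then merge non-overlapping cookies that share an IP/User-Agent pair. Steps 3-5 (geography and ISP pools) and 7 (FSOs) are left out, and none of this is the authors' actual weighting, just an illustration:

from collections import defaultdict

def estimate_unique_visitors(visits):
    # visits: list of dicts with 'cookie', 'ip', 'ua', 'time' (unix seconds)
    # Step 1: same cookie -> same person; record each cookie's lifespan and fingerprints
    by_cookie = defaultdict(list)
    for v in visits:
        by_cookie[v['cookie']].append(v)
    info = {}
    for cookie, vs in by_cookie.items():
        times = [v['time'] for v in vs]
        info[cookie] = {'start': min(times), 'end': max(times),
                        'keys': {(v['ip'], v['ua']) for v in vs}}

    def overlaps(a, b):
        # Step 2: cookies alive at the same time are different people
        return info[a]['start'] <= info[b]['end'] and info[b]['start'] <= info[a]['end']

    # Step 6: merge non-overlapping cookies that share an (IP, User-Agent) pair
    cookies = list(info)
    group = {c: c for c in cookies}

    def find(c):
        while group[c] != c:
            c = group[c]
        return c

    for i, a in enumerate(cookies):
        for b in cookies[i + 1:]:
            if not overlaps(a, b) and info[a]['keys'] & info[b]['keys']:
                group[find(a)] = find(b)

    return len({find(c) for c in cookies})

visits = [
    {'cookie': 'A', 'ip': '1.2.3.4', 'ua': 'FF', 'time': 100},
    {'cookie': 'B', 'ip': '1.2.3.4', 'ua': 'FF', 'time': 99999},  # later, same IP+UA -> merged with A
    {'cookie': 'C', 'ip': '5.6.7.8', 'ua': 'IE', 'time': 100},    # concurrent, different box -> separate
]
print(estimate_unique_visitors(visits))  # 2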

Typical web analysis junk (5, Insightful)

Sinner (3398) | more than 9 years ago | (#13377702)

About 20% of my time on my last job was spent doing web analysis. It drove me insane.

The problem is with the word "accurate". To management, "accurate statistics" means knowing exactly how many conscious human beings looked at the site during a given period. However, the computer cannot measure this. What it can measure, accurately, is the number of HTML requests during a given period.

You can use the latter number to estimate the former number. But because this estimate is affected by a multitude of factors like spiders, proxies, bugs, etc., management will say "these stats are clearly not accurate!". You can try to filter out the various "undesirable" requests, but the results you'll get will vary chaotically with the filters you use. The closer you get to "accurate" stats from the point of view of management, the further you'll be from "accurate" stats from a technical point of view.
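
To illustrate how arbitrary that filtering is, this is roughly what "cleaning" a log usually amounts to: a regex over the combined log format and a substring blacklist you made up yourself (both shown here as examples, not anyone's standard):

import re

# Apache combined log format; adjust to taste
LOG_RE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)[^"]*" \d+ \S+ "[^"]*" "([^"]*)"')
BOT_HINTS = ('bot', 'crawler', 'spider', 'slurp', 'wget', 'curl')

def human_requests(lines):
    # Yield (ip, path, user_agent) for requests that don't look like spiders.
    # "Look like" is the operative phrase -- this is exactly the arbitrary
    # filtering being complained about above.
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        ip, path, ua = m.groups()
        if any(hint in ua.lower() for hint in BOT_HINTS):
            continue
        yield ip, path, ua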

Makers of web analysis software and services address these problems by the simple technique of "lying". In fact, a whole industry has built up based on the shared delusion that we can accurately measure distinct users.

Which is where this article comes in. The author has discovered the shocking, shocking fact that the standard means of measuring distinct users are total bollocks. He's discovered that another technique produces dramatically different results. He's shocked, shocked, appalled in fact, that the makers of web analysis software are not interested in this new, highly computationally-intensive technique that spits out lower numbers.

My advice? Instead of doing costly probability analysis on your log files, just multiply your existing user counts by 0.7. The results will be just as meaningful and you can go home earlier.

Bull (0)

RAMMS+EIN (578166) | more than 9 years ago | (#13377722)

``The author claims that he is describing 'new, cutting edge methodologies for identifying people, methodologies that -- at this point -- no web analytics product supports.''

And when you read down to how these "new, cutting edge methodologies" actually work, it comes down to: plant cookies, if that doesn't tell you what you need to know, look at the IP address. Then take into account that different cookies and different IP addresses can still be the same user, if they occur at different times.

It's clever, but it didn't take a genius to think that up. Why is nobody doing it? Well, because it's too much work and still doesn't give you any guarantee that your conclusions are correct.

It's nice how TFA is presenting this as the best thing since sliced bread and Longhorn, though.

Why? (2, Insightful)

RAMMS+EIN (578166) | more than 9 years ago | (#13377736)

Somebody please explain to me: why would you go to all this trouble to get a close estimate of how many unique visitors your site draws?

I'm personally always more interested in how many pages get requested, and which ones. The first gives me an impression of how popular the site is*, the second tells me which pages people particularly like, so I can add more like that.

The only reason I see for really wanting to track people is if your site is actually an app that has state. In those cases, you have to use a more bullet-proof system than the one presented in TFA.

* Some people object that it counts many people who visit once, then never again; but I consider it a success that they got there in the first place - they were probably referred by someone, followed a link that someone made, or the page ranks high in a search engine.

Re:Why? (1)

hal9000(jr) (316943) | more than 9 years ago | (#13378101)

Somebody please explain to me: why would you go to all this trouble to get a close estimate of how many unique visitors your site draws?

Tracking the success of an advertising campaign. Ad buyers want to get their message out to as many eyeballs as often as possible for their target demographic. Picking sites in their demographic is easy and they know how to do that. Picking a site that drives enough unique traffic is much more difficult.

I would expect, though I don't know, that large media sites have their results or their analytic methods audited, similar to a financial audit. So if you can prove, using accepted practices, that your site is moving some number of unique visitors, and that number is higher than a similar site's, you will probably win the business, all other factors being equal.

Paradigm shift? (0)

Anonymous Coward | more than 9 years ago | (#13377755)

Wow! THAT's a paradigm shift if I've ever seen one!

One method to reliably track users... (1)

Vo0k (760020) | more than 9 years ago | (#13377845)

One single method that would reliably allow a site to track its users would be that each user needs to log in, and then needs the "session cookie" on each page they visit, and if they delete it, hard luck, log in again. This method is just a step away from another one: make the pages password-protected and give the password to nobody. Users tracked: 0. Pages visited: 0. Tracking reliability: 100%.

This is old hat (1)

dzfoo (772245) | more than 9 years ago | (#13377932)

This is not really "New Technology" or a "Paradigm Shift", or anything extraordinary. This is just another marketeer trying to start a "buzz".

I know plenty of software out there that performs multiple tests in order to establish uniqueness of visitors. Perhaps the current big-biz log-analyzing apps do not do it, but that doesn't mean nothing else does. There was a time when Real Programmers didn't trust cookies as the exclusive identifier. I even remember some popular log-analyzer Perl script that used to check for the following:

- First, Cookies
- Then IP Address (Whether it is known to be dynamic or static)
- Then compares the IP Addresses by IP pool (ISP)
- Then checks the time between requests, so that if requests from different IPs in the same IP pool, with the same User Agent, come in within a pre-determined time, they are considered the same visitor.
- Also checks the time between requests from the same IP address, so that if a certain pre-determined time has passed between requests, and the IP address is known to be dynamic, and the User Agent changed, then it is probably someone else.

I do not recall the exact details of the analysis, but it was something along the lines of the above. And there were many scripts like that one.

Comparing IP addresses geographically and in a time-sensitive manner (coupled with other potentially identifying criteria, such as Cookies, User Agent, screen resolution, etc.) has been known and done for years! In particular, these forms of unique visitor analysis were very popular during the days when you couldn't count on Cookies being supported by all browsers, or on savvy users accepting them -- you know, when dinosaurs roamed the data center; way before everybody decided to rely on Cookies as the end-all-be-all of session identification.

If they all forgot that using Cookies exclusively was never a very reliable solution, then that's their problem.

            -dZ.