
Cloud-Powered Facial Recognition Is Terrifying

Soulskill posted more than 2 years ago | from the you-are-who-google-says-you-are dept.

Cloud 286

oker sends this quote from The Atlantic: "With Carnegie Mellon's cloud-centric new mobile app, the process of matching a casual snapshot with a person's online identity takes less than a minute. Tools like PittPatt and other cloud-based facial recognition services rely on finding publicly available pictures of you online, whether it's a profile image for social networks like Facebook and Google Plus or something more official from a company website or a college athletic portrait. In their most recent round of facial recognition studies, researchers at Carnegie Mellon were able to not only match unidentified profile photos from a dating website (where the vast majority of users operate pseudonymously) with positively identified Facebook photos, but also match pedestrians on a North American college campus with their online identities. ... '[C]onceptually, the goal of Experiment 3 was to show that it is possible to start from an anonymous face in the street, and end up with very sensitive information about that person, in a process of data "accretion." In the context of our experiment, it is this blending of online and offline data — made possible by the convergence of face recognition, social networks, data mining, and cloud computing — that we refer to as augmented reality.'"


Google decided against this. (4, Interesting)

605dave (722736) | more than 2 years ago | (#37567174)

This is why Google shelved their version of this tech. The implications were too big.

Where Are the Recall Rates? (5, Insightful)

eldavojohn (898314) | more than 2 years ago | (#37567322)

This is why Google shelved their version of this tech. The implications were too big.

Having studied this in college and witnessed many failed implementations of it [slashdot.org] I casually ask: Where are the recall rates [wikipedia.org] (see also sensitivity and specificity [wikipedia.org] ) of these experiments?

Because when I read the articles, I found this instead of hard numbers:

Q. Are these results scalable?

The capabilities of automated face recognition *today* are still limited - but keep improving. Although our studies were completed in the "wild" (that is, with real social networks profiles data, and webcam shots taken in public, and so forth), they are nevertheless the output of a controlled (set of) experiment(s). The results of a controlled experiment do not necessarily translate to reality with the same level of accuracy. However, considering the technological trends in cloud computing, face recognition accuracy, and online self-disclosures, it is hard not to conclude that what today we presented as a proof-of-concept in our study, tomorrow may become as common as everyday's text-based search engine queries.

How you want to decide Google passed on continuing down this road is up to you. Frankly, I would surmise that the type I and type II errors [wikipedia.org] become woefully problematic when applied to an entire population. Facial recognition is not there yet, not until I see some hard numbers that convince me the error rate is low enough. Right now I bet if you were to snap pictures of 10,000 people, you would incorrectly classify at least 100 of them leading to wasted time, violated rights and wasted opportunity (depending on the misclassification).
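That back-of-the-envelope guess can be sketched in a few lines. This is a minimal illustration of the base-rate problem, using purely hypothetical numbers (not figures from the CMU study):

```python
# Illustrative only: why a "low" error rate still swamps a face matcher
# at population scale. All rates here are assumed for the sake of example.

def precision(sensitivity, false_positive_rate, base_rate):
    """Fraction of positive matches that are actually correct (PPV)."""
    tp = sensitivity * base_rate              # true positives per person scanned
    fp = false_positive_rate * (1 - base_rate)  # false positives per person scanned
    return tp / (tp + fp)

# Suppose the system catches 99% of true targets, misfires on 1% of
# everyone else, and 1 person in 10,000 is actually a target.
ppv = precision(sensitivity=0.99, false_positive_rate=0.01, base_rate=1e-4)
print(f"{ppv:.4f}")  # about 0.0098: ~99% of the people flagged are innocent
```

Even with a 1% false-positive rate, the rarity of real targets means nearly every match is wrong, which is exactly why the recall and specificity numbers matter.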

Re:Where Are the Recall Rates? (1)

pvt_medic (715692) | more than 2 years ago | (#37567504)

But the challenge is what people consider acceptable. You may have misclassified 100 people, but there are plenty of people out there who would argue that if you stop one terrorist or criminal, it may be worth the "inconvenience" endured by 1%.

Re:Where Are the Recall Rates? (2, Interesting)

Anonymous Coward | more than 2 years ago | (#37567798)

Depends what the inconvenience is. If it's a quick background check with no lasting effects (i.e. not being added to a do-not-fly list or terrorist watch list or your record, or subjecting you to public humiliation or arrest), then perhaps... If it's a 5 year vacation in Guantanamo without access to legal counsel, then no way--that would be a horrible perversion of justice!

Consider this question: Do only famous people have look-a-likes? Why would that be, especially since famous people often look non-average in some way? If they have many look-a-likes, then the rest of us certainly do. I think most people have met someone who says, "Are you so-and-so? You look just like them," or have had someone tell them that they saw someone the other day who looked just like them. In short, we ALL have many look-a-likes; they just don't seek us out since we're not famous, and thus we are unlikely to meet most of them, and vice versa.

So you have many large pools of facial synonyms, if you will, that all potentially result in false-positives with regard to each other or to one *really* unlucky member of the pool. If one of them happens to be a terrorist, then you're in for a world of trouble just because you happen to look like them.

So if we start applying this tech to the population at large, we had better be certain that the consequences of a false match WHEN IT HAPPENS are acceptable, legally, ethically, and morally, or we shouldn't do it at all, IMHO.

And that's not even addressing the privacy issues associated with correct identifications...

Just shows to go you.... (1)

cayenne8 (626475) | more than 2 years ago | (#37568092)

...that Python had it right LONG ago...

the importance of NOT being seen [youtube.com]

Re:Where Are the Recall Rates? (1)

peragrin (659227) | more than 2 years ago | (#37567936)

The problem is scale. 1% of people in the USA is 3 million false positives. With 100,000+ people flying every day, that is 1,000 false positives from 5,000 possible airports.

That would be 30,000 jobs just to track down those false positives.
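The scaling arithmetic above, spelled out (these are the round numbers assumed in the comment, not measured rates):

```python
# Spell out the parent's scaling argument. All inputs are the round
# numbers assumed upthread, not measured rates.
us_population = 300_000_000
daily_flyers = 100_000
false_positive_rate = 0.01  # the hypothetical 1% misclassification rate

nationwide = int(us_population * false_positive_rate)
per_day = int(daily_flyers * false_positive_rate)
print(nationwide)  # 3000000 false positives across the whole population
print(per_day)     # 1000 false alarms among a single day's flyers
```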

Re:Where Are the Recall Rates? (0)

Anonymous Coward | more than 2 years ago | (#37567506)

Right now I bet if you were to snap pictures of 10,000 people, you would incorrectly classify at least 100 of them leading to wasted time, violated rights and wasted opportunity

Yeah, but I bet that most of those 100 people will be guilty of nothing, and if they are not guilty they have nothing to fear! right?

Re:Where Are the Recall Rates? (1)

kent_eh (543303) | more than 2 years ago | (#37567882)

In a state which still practices execution, I wouldn't want to settle for any amount of error.

Also, it's not just the police/government mistaking someone's identity that is scary [nj.com]

98% Accurate! (4, Interesting)

bigtrike (904535) | more than 2 years ago | (#37567536)

You mean to tell me that 98% accuracy when trying to spot terrorists in airports isn't good enough? That's only 200,000 false positives per year for a typical airport.

False positives OK at airport? (4, Insightful)

drnb (2434720) | more than 2 years ago | (#37567772)

You mean to tell me that 98% accuracy when trying to spot terrorists in airports isn't good enough? That's only 200,000 false positives per year for a typical airport.

Perhaps the false positives at airports are OK? Rather than randomly choosing people for more attentive searches, and the occasional grandma to give the facade of fairness and not profiling, we could focus on the 2% who are higher probability. Of course 2% are unfairly inconvenienced, but isn't that better than 100% unfairly inconvenienced? Clearly a negative/negative decision.

Of course this is all academic and falls apart if the false negatives are at a non-trivial level.

Re:Where Are the Recall Rates? (2)

pinkwarhol (1913356) | more than 2 years ago | (#37567564)

Right now I bet if you were to snap pictures of 10,000 people, you would incorrectly classify at least 100 of them...

That's only a 1% error... is that supposed to make me feel more comfortable? Sounds like the technology works pretty well, pragmatically...
Anyway, sounds mildly-moderately threatening to general privacy. Who's paying for this?

FTFA, grants from:
National Science Foundation, grant # 0713361
US Army Research Office, contract # DAAD190210389

How much?

Re:Where Are the Recall Rates? (1)

ironjaw33 (1645357) | more than 2 years ago | (#37567840)

Right now I bet if you were to snap pictures of 10,000 people, you would incorrectly classify at least 100 of them... thats only a 1% error... is that supposed to make me feel more comfortable? Sounds like the technology works pretty well, pragmatically... Anyway, sounds mildly-moderately threatening to general privacy. Who's paying for this? FTFA, grants from: National Science Foundation, grant # 0713361 US Army Research Office, contract # DAAD190210389 How much?

We'll never get any better if we don't try. That's what these grants are for: improving the state of the art.

Re:Where Are the Recall Rates? (1)

pinkwarhol (1913356) | more than 2 years ago | (#37567958)

Acquisti (NSF grant) had almost $400,000 for his last project, also 'privacy' related.
http://www.nsf.gov/awardsearch/showAward.do?AwardNumber=0713361 [nsf.gov]
(the more current info about this grant isn't up on nsf.gov yet)

anyone know how to find army contracting info?

Re:Where Are the Recall Rates? (3, Interesting)

Joce640k (829181) | more than 2 years ago | (#37567870)

How you want to decide Google passed on continuing down this road is up to you. Frankly, I would surmise that the type I and type II errors [wikipedia.org] become woefully problematic when applied to an entire population.

I dunno. I bet if you combine the location of a photo with what Google knows about where you live/hang out the results would be pretty good.

Re:Where Are the Recall Rates? (0)

Anonymous Coward | more than 2 years ago | (#37568208)

You are so super smart.

Re:Google decided against this. (0)

Anonymous Coward | more than 2 years ago | (#37567490)

Actually... Google has acquired them... http://www.pittpatt.com/

Re:Google decided against this. (4, Funny)

rwa2 (4391) | more than 2 years ago | (#37567574)

This is why Google shelved their version of this tech. The implications were too big.

I don't know... I fed my pr0n directory to Picasa's face recognition, and the results were pretty awesome.

Re:Google decided against this. (4, Funny)

AliasMarlowe (1042386) | more than 2 years ago | (#37567686)

This is why Google shelved their version of this tech. The implications were too big.

I don't know... I fed my pr0n directory to Picasa's face recognition, and the results were pretty awesome.

You mean there are people with noses shaped like... that?

Re:Google decided against this. (0)

Anonymous Coward | more than 2 years ago | (#37568060)

I'm impressed it'd be able to recognize a face under all that...makeup.

Re:Google decided against this. (0)

Anonymous Coward | more than 2 years ago | (#37567756)

They shelved their version and then they bought PittPatt in July

huh? (0)

Anonymous Coward | more than 2 years ago | (#37568336)

"We are happy to announce that Pittsburgh Pattern Recognition has been acquired by Google!"
http://www.pittpatt.com/

We knew it was coming (3, Insightful)

wiggles (30088) | more than 2 years ago | (#37567218)

It was only a matter of time. This has been one of the most sought after anti-terrorism tools of the last 10 years. Imagine the security implications! I'd be shocked if NSA didn't already have a version of this operational 5 years ago.

Re:We knew it was coming (2)

DinDaddy (1168147) | more than 2 years ago | (#37567580)

Because terrorists all have facebook accounts? I would assume most of them have very little online presence, pictorially anyway.

Re:We knew it was coming (0)

Anonymous Coward | more than 2 years ago | (#37567744)

Or the government might have photos to compare against.

Re:We knew it was coming (2)

nschubach (922175) | more than 2 years ago | (#37567832)

Duh, of course they don't have Facebook. They have Terrorbook, and most of their faces are partially covered with handkerchiefs or some other items.

Facebook centric because it's academic research (3, Insightful)

drnb (2434720) | more than 2 years ago | (#37567928)

Because terrorists all have facebook accounts? I would assume most of them have very little online presence, pictorially anyway.

Oddly, whenever a new terrorist is discovered and remains at large, law enforcement and the mass media seem to be able to come up with a facial photo. Perhaps there are sources of photos other than Facebook, in particular sources available to government agencies: DMV photos, passport photos, school photos, team photos, etc.

The experiment is facebook centric because it is an academic project that needs to stick to info made public by the individual to avoid privacy issues.

Re:Facebook centric because it's academic research (1)

Anonymous Coward | more than 2 years ago | (#37568202)

I suspect you're seeing some selection bias. Generally, telling the population that there's a known terrorist on the loose isn't productive. The authorities only do it as a last resort and only when they have a photo, so that they can leverage the massed eyeballs of the public. That's why you'll never see stories about terrorists on the run minus a photo - not because they all have photos but because there's no point alarming the public unnecessarily.

Re:We knew it was coming (2)

Gideon Wells (1412675) | more than 2 years ago | (#37567844)

That is the cool but unnerving part of government tech. It is hard to tell how much is overestimated (like 2001's flights-to-the-moon style overestimation), how far they are genuinely ahead, and how much of the bleeding edge is released.

New York was revealed in the media recently to have the tech to track down everyone wearing a "red jacket" through their camera security systems.

Welcome to the world of tomorrow (2)

Coisiche (2000870) | more than 2 years ago | (#37567228)

I already don't like it.

Re:Welcome to the world of tomorrow (2)

Tuan121 (1715852) | more than 2 years ago | (#37567342)

I do.

Re:Welcome to the world of tomorrow (1)

MobileTatsu-NJG (946591) | more than 2 years ago | (#37567410)

Of course you don't, you watch a lot of sci-fi.

But Facebook... (4, Insightful)

gurps_npc (621217) | more than 2 years ago | (#37567240)

is not dangerous. There is no danger from posting all of the intimate details of your life, with pictures, and pictures of other people (often taken without their permission) using real names.

Look, I am not a paranoid man. I am perfectly willing to give out private and personal information - for a reasonable fee.

I give out private information to my bank all the time. In exchange, I get financial services.

Facebook offers - a) a blog, b) email, c) games, d) convenient log in

The first 3 are available for free elsewhere, the last is not worth much.

I'm not paranoid, I'm just not cheap. And Facebook is asking way way too much for the minimal services it provides.

Re:But Facebook... (3, Interesting)

killmenow (184444) | more than 2 years ago | (#37567320)

pictures of other people (often taken without their permission)

One of the reasons I have a facebook account is so I can untag photos others say are me.

Re:But Facebook... (1)

Anonymous Coward | more than 2 years ago | (#37567404)

pictures of other people (often taken without their permission)

One of the reasons I have a facebook account is so I can untag photos others say are me.

But they can't tag you if you don't have an account. They can write your name, but that is not internally or externally searchable. I think your strategy is opening you up to more search connections, by being searchable and for periods tagged.

Re:But Facebook... (1)

Culture20 (968837) | more than 2 years ago | (#37567946)

But they can't tag you if you don't have an account. They can write your name, but that is not internally or externally searchable By Ordinary Users. I think your strategy is opening you up to more search connections, by being searchable and for periods tagged.

FTFY. You can be sure FB has database entries for people that don't have accounts, and that their facial recognition program uses these tags. When they build up enough info on a person, they might start sending them email solicitations* like "We have this photo tagged of you. Please create an account to confirm/deny that this is you." I bet it's two years or so away.

*A lot of people use the "upload my addressbook to facebook" option. If they do it from their smartphone, it might scrape the contact photos too...

Re:But Facebook... (2)

omnichad (1198475) | more than 2 years ago | (#37567508)

That's a lot of work. Didn't you know you can change your privacy settings so that tagged photos of you aren't searchable by other people? http://www.facebook.com/help/?faq=267508226592992 [facebook.com]

Re:But Facebook... (2)

ironjaw33 (1645357) | more than 2 years ago | (#37567938)

pictures of other people (often taken without their permission)

One of the reasons I have a facebook account is so I can untag photos others say are me.

This is one of my arguments for maintaining a public presence on the internet: control over my image/likeness. When someone Googles my name, the first things they see are my professional webpage, personal webpage, and Facebook account. Anyone else with the same name is pushed to the second page of results. Anything not under my direct control is pushed to the bottom of the first page of results.

With a Facebook account and publicly available webpages, I am able to broadcast my side of the story and drown out any impostors/naysayers (if anything were to happen).

Re:But Facebook... (1)

Stubot (2439922) | more than 2 years ago | (#37567944)

What I like to do is any pictures of friends/relatives taken in front of crowds I tag myself as one of the 'faceless' crowd members in the background.

I prefer Red Alert 2's cloud tech. (1)

AdamJS (2466928) | more than 2 years ago | (#37567244)

It just seems more... electrifying.

public pics? (3, Insightful)

killmenow (184444) | more than 2 years ago | (#37567248)

This is why I always use a picture like this [imgur.com] for any online public pics.

Note that the pic in question (a) does not show a face clearly and (b) may or may not be me.

Re:public pics? (1)

Tuan121 (1715852) | more than 2 years ago | (#37567314)

You are awesome.

Re:public pics? (1)

killmenow (184444) | more than 2 years ago | (#37567682)

I know. But thanks anyway. It's always nice when others acknowledge it. ;-)

Re:public pics? (1)

Joce640k (829181) | more than 2 years ago | (#37567324)

Makes no difference on Facebook. While you're doing that your auntie/mom/friends are busy uploading and tagging hundreds of pictures of you.

Re:public pics? (1)

killmenow (184444) | more than 2 years ago | (#37567484)

Actually, they're not. Any time anyone tags a photo of me, I get a notification and can untag it. It almost never happens anyway. My folks/friends aren't big photo posters.

But this is one thing (among many) that bothers me about Facebook. If you don't have a Facebook account, you can't stop people from posting pics of you and filling out the "In this picture" stuff with your name, essentially tagging you without you getting a chance to remove the tag.

Re:public pics? (1)

Gideon Wells (1412675) | more than 2 years ago | (#37567880)

Then you find out Facebook still has a log that you were tagged, and they are granting back door access to certain governments/businesses to said logs.

Re:public pics? (1)

killmenow (184444) | more than 2 years ago | (#37567954)

Well, yeah. That's a good point. DAMN YOU FACEBOOK!

Re:public pics? (0)

Anonymous Coward | more than 2 years ago | (#37567540)

Makes no difference on Facebook. While you're doing that your auntie/mom/friends are busy uploading and tagging hundreds of pictures of you.

If you are a Facebook user you can untag yourself. If you are not a Facebook user, they can't tag you.

Re:public pics? (1)

Joce640k (829181) | more than 2 years ago | (#37567894)

If you are a Facebook user you can untag yourself. If you are not a Facebook users they can't tag you.

You think Facebook really throws that information away just because you clicked "untag"?

0 errors? (1)

w_dragon (1802458) | more than 2 years ago | (#37567254)

And what is the error rate when you get a few million people into the database? It's all well and good to say we can identify who someone is against a population of a few dozen, or a couple hundred, but give it all the people in New York City to churn through and I somehow doubt that your false identification rate will be 0.

Re:0 errors? (1)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#37567436)

So long as the expert witness tells the jury that the error rate is zero, conviction statistics will prove... And I'm talking prove here, the fancy kind with numbers and computers and shit, that the error rate is zero.

By the standards of some of the dodgier corners of forensics, this stuff will be downright impressive...

Jokes on them (0)

Anonymous Coward | more than 2 years ago | (#37567262)

My pics on a certain dating site don't show my face!

Face it (5, Insightful)

boristdog (133725) | more than 2 years ago | (#37567268)

The first real-world, publicly available use of this will be an app that lets you:

1. Take a picture of someone with your smart phone
2. Find naked pictures of this person online

BRB, heading to the local college campus...

Re:Face it (0)

Anonymous Coward | more than 2 years ago | (#37568008)

The first real-world, publicly available use of this will be an app that lets you:

1. Take a picture of someone with your smart phone
2. Find naked pictures of this person online

Brilliant, but I don't think that FaceIt is the best name for this app.

Also, the tricky part is going to be matching all the top-down angle pics with faces and cleavage to all the headless pics with the real nudity.

Software the future of computing (2)

pvt_medic (715692) | more than 2 years ago | (#37567282)

Think about how much raw power computers have today, and how for the most part we are just using it for word processing/email/internet/music/video. This is just an example of how to utilize that power. It's all about software now; this is just another example of how databases will continue to interact more and more. There are great possibilities for how this can be used (and horrible options as well), but think about medicine being able to identify a John Doe who is brought into the Emergency Department, or your home security system identifying who is knocking at the door. And of course, this technology is not new; it's just finally coming out for public usage.

Re:Software the future of computing (1)

mikael (484) | more than 2 years ago | (#37567846)

Out of curiosity, I found and tried running one of those old PC magazine performance test programs from the 1990s (SpeedPro or something similar) on a modern PC (3 GHz, dual-core Intel). Performance was 120,000 times faster than the original IBM XT, not taking into account the use of GPUs. Tests were doing things like random disk access, FFT transforms, memory operations.

Given that for some tasks a GPU is 100x faster than a CPU, and cloud computing puts together a grid of thousands of such PCs, that is an insane amount of computing power to apply to any calculation.

it's annoying (1)

ackthpt (218170) | more than 2 years ago | (#37567284)

when you get an anonymous email telling you you have a booger hanging on the end of a long nose hair

There was once a time that startrek predicted... (1)

Nadaka (224565) | more than 2 years ago | (#37567294)

There was once a time when Star Trek predicted future technology. CSI is now doing it. And it is far less benevolent than the cell phone and portable medical diagnostic devices.

Re:There was once a time that startrek predicted.. (0)

Anonymous Coward | more than 2 years ago | (#37567414)

Enhance. Enhance.... Enhance.

If we take the technology from CSI, we will be able to use our Motorola RAZRs to figure out the identity of anyone in any seat in a sports arena from across the stadium.

Imagine the possibilities!

Re:There was once a time that startrek predicted.. (1)

omnichad (1198475) | more than 2 years ago | (#37567542)

How about a reflection off a water droplet on a handrail at that stadium? I love this video: Red Dwarf CSI Spoof [youtube.com]

Sigh (4, Insightful)

IWantMoreSpamPlease (571972) | more than 2 years ago | (#37567300)

Time to start dressing like The Stig again.

Re:Sigh (1)

Plunky (929104) | more than 2 years ago | (#37567552)

Yeah, but then they will know.. you are the stig!

Re:Sigh (1)

nschubach (922175) | more than 2 years ago | (#37567922)

If there are 300 million Stigs... how will they know which one is the real one?

Re:Sigh (1)

TheSpoom (715771) | more than 2 years ago | (#37567646)

No worries, gait analysis will still get you.

Re:Sigh (0)

Anonymous Coward | more than 2 years ago | (#37567728)

Not if you adopt several Silly Walks.

Re:Sigh (1)

Howard Beale (92386) | more than 2 years ago | (#37567710)

Nah, it's time to start dressing like Walter Kovacs. You might know him as Rorschach.

Re:Sigh (1)

the_humeister (922869) | more than 2 years ago | (#37568084)

Ben Collins, is that you? Some say he has a full tattoo of his face on his face..

New hobby (1)

Blackajack (1856892) | more than 2 years ago | (#37567318)

Might be a good time to take up theatrical masking as a hobby?

Re:New hobby (2)

Nadaka (224565) | more than 2 years ago | (#37567408)

Not if you live in New York City. Wearing a mask is a crime.

Re:New hobby (1)

Thud457 (234763) | more than 2 years ago | (#37567738)

This is going to drive fashion in new directions. [cvdazzle.com]
Finally, a practical application.

I'm glad I have a clone ... (2)

oneiros27 (46144) | more than 2 years ago | (#37567330)

Of course, they just managed to link to *someone* ... did they then ask the person to confirm if they were correct?

I have a LinkedIn page, but without a picture. My twin brother, on the other hand, uses Facebook, while I don't. (I'm rather sensitive about my info being out there, after having a stalker during undergrad.) So, it's entirely possible that they would've gotten information from my face ... but unlikely that it'd have been my information.

In this case, the error might still lead them to me, as my brother would recognize me if they showed him the picture ... but how many other incorrect matches might there have been? Just getting *a* match is not the same as getting the *correct* match.

Re:I'm glad I have a clone ... (0)

Anonymous Coward | more than 2 years ago | (#37567910)

The stats on correct matches are in the 70% range. The program also provides options of possible matches if no match is confirmed. In under a minute. Granted, you have to be in the DB it is querying.

And (assuming you have one) they can correctly guess the first five digits of your social security number either 16 or 30% of the time (I forget the exact number).

masks (1)

Anonymous Coward | more than 2 years ago | (#37567334)

It would be AWESOME if this causes masks to become fashionable. How cool would that be?

Nothing to worry about! (2)

fuzzyfuzzyfungus (1223518) | more than 2 years ago | (#37567344)

As always, the completely innocent, not socially related to anybody not completely innocent, totally conformant with local and regional cultural and lifestyle standards, possessing enough money to not be of interest to debt collectors; but not so much as to be of interest to marketers, not being followed by any stalkers/vindictive exes/etc., people have Absolutely Nothing To Fear!

Fucking luddites. Go tighten your tinfoil hats.

Low cost workaround... (3)

condition-label-red (657497) | more than 2 years ago | (#37567362)

Burkas [wikipedia.org] for *EVERYONE* !!!

Re:Low cost workaround... (0)

Anonymous Coward | more than 2 years ago | (#37567688)

Would you like bombs with that?

Re:Low cost workaround... (2)

atisss (1661313) | more than 2 years ago | (#37567820)

Already forbidden in France / Switzerland

who killed privacy? (3, Insightful)

circletimessquare (444983) | more than 2 years ago | (#37567374)

you did

it's funny that the tech industry holds some of the most privacy-concerned individuals, yet all their dedication to their craft has done is provide the most privacy destroying entity ever to exist

privacy is dead as a doornail. just forget about the concept. really, you needn't bother about privacy anymore, it's a nonstarter in today's world. big brother? try little brother: every joe shmoe with a smart phone with a camera has more power than the NSA, KGB, MI6, MSS: those guys are amateur hour

i'm not saying it's wrong, i'm not saying it's right. i'm just saying it's the simple truth of the matter, right or wrong: privacy is dead. acceptance is your only option now. you simply can't fight this

and government didn't kill it, you paranoid schizophrenic goons

your technolust did

Re:who killed privacy? (0)

Anonymous Coward | more than 2 years ago | (#37567526)

Maybe you could make a movie about private zombies. That would be great.

Re:who killed privacy? (3, Insightful)

Kazymyr (190114) | more than 2 years ago | (#37567616)

it's funny that the tech industry holds some of the most privacy-concerned individuals (..)

That is only if you believe the all-caps paragraphs on all the EULAs and TOS you click through. Often the following paragraphs will contradict the bombastic declarations of commitment to privacy - on the same page.

Hysteria (2)

feenberg (201582) | more than 2 years ago | (#37567396)

They say the false accept rate is .001, or one in a thousand. That is, they can extract about 10 bits of information from a picture. From those 10 bits they claim to get the SSN? Or, if they have the picture of a person and need to identify them in a sample of a million people, they will get back 1000 possible matches.

The complaints about privacy seem greatly overblown. In essence they are saying that if you post a picture with your name, and then another picture without your name, someone with a million dollars of software might recognize the similarities. Of course they might without the computer too. This is just another in the long line of "security" scares which presume that items of public knowledge such as your appearance, name, DOB and SSN can be turned into secret passwords after 40 years of being public knowledge. The security experts should be spending their time convincing banks not to pretend an SSN is a secret, rather than enabling them by agitating for legislation to make it so.
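The "10 bits" figure follows directly from the stated false-accept rate; a quick sketch of that arithmetic (taking the .001 rate at face value):

```python
import math

# A false-accept rate (FAR) of 0.001 means a random non-match is accepted
# one time in a thousand, so one comparison conveys about log2(1000) bits
# of identifying information.
far = 0.001
bits = -math.log2(far)

# Searching a gallery of a million people, expect FAR * 1e6 spurious hits.
expected_false_matches = far * 1_000_000

print(round(bits, 1))               # ~10 bits per comparison
print(int(expected_false_matches))  # ~1000 possible matches to sift through
```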

Finally, a wake-up call on privacy policy? (2)

SirGarlon (845873) | more than 2 years ago | (#37567420)

The implications of this look big enough to concern even the apathetic, non-technical majority. Perhaps this will finally motivate the long-needed policy reform on privacy in the digital age.

Get a website. (1)

metrometro (1092237) | more than 2 years ago | (#37567430)

This isn't going away. The only real answer is to clog the information channels about you with what you actually want the world to know.

Does this pose a problem for, say, pseudonym online dating? Yep. Unless you're willing to drop the pseudonym and link out to your dating profile, alongside your work profile, your hobby blog. It's time to stop pretending that we can post to Facebook and compartmentalize it -- the service providers do not want to do this, and increasingly are unable to provide this even if we do.

Date screening (1)

ZeroExistenZ (721849) | more than 2 years ago | (#37567438)

So now I can trace my date to see whether she ever did a porno?

Re:Date screening (0)

Anonymous Coward | more than 2 years ago | (#37568022)

And this is bad because...?

Or are you screening them because you _want_ a date that's done a porno?

William Gibson gets it right again (0)

Anonymous Coward | more than 2 years ago | (#37567474)

Wasn't this tech used in a William Gibson book by an abusive boyfriend to find his ex who appeared in the background of a photo taken at a party in a different city? Anyone remember the book? It was a recent one. This will be automatic and available to everyone in a few years I figure.

Wait, this is new? (1)

MrLizard (95131) | more than 2 years ago | (#37567556)

They've been doing this on shows like CSI and NCIS for years. :) You mean... they were just making it up? Wow. My faith in Hollywood's technical advisers is shattered forever.

Re:Wait, this is new? (0)

Anonymous Coward | more than 2 years ago | (#37568174)

This is why I only watch 100% true stories like the unbiased kind you find in such films as Fahrenheit 911.

For example, this is dangerous for women (5, Interesting)

aestheticpriest (2002576) | more than 2 years ago | (#37567578)

I am a good looking female. When I was a waitress I had a stalker at my workplace. Because the schedule was posted in view-- not a clear view, but view enough for him to find an opportunity to read it without looking suspicious-- he consistently showed up during work hours and tried to follow me home. I didn't have a car, so I walked home alone in the middle of the night; I worked 3rd shift at a 24-hour diner. This might seem like a poor choice, but I desperately needed a job. With this technology a stranger could find out who I am through a picture of me taken with his cellphone. This is also dangerous for people in the sex industry who are already way more vulnerable to stalking than I was walking home from 3rds at a diner. I'm now doing amateur porn-- difficult to resist when it earns an unskilled laborer a grownup sized income for part time hours-- but my image is everywhere online.

Re:For example, this is dangerous for women (4, Funny)

HangingChad (677530) | more than 2 years ago | (#37567862)

I am a good looking female.

On Slashdot? Are you lost?

Re:For example, this is dangerous for women (1)

Bardwick (696376) | more than 2 years ago | (#37568004)

Get a concealed carry permit. At least then you have options you didn't have before.

Re:For example, this is dangerous for women (2, Informative)

icebraining (1313345) | more than 2 years ago | (#37568072)

Yes, now besides getting raped, she can be shot too!

Re:For example, this is dangerous for women (2)

cayenne8 (626475) | more than 2 years ago | (#37568244)

Yes, now besides getting raped, she can be shot too!

Well, that's the thing.

You do not carry a gun...unless you are prepared to use it when needed without flinching.

I, for one...have no compunction about unloading a magazine into someone that is threatening to do me bodily harm...ESPECIALLY if it is in my own home.

If you're not willing to pull that trigger, then no...don't carry a gun, it will likely end up being used on you (assuming of course they don't have one already).

I'm not sure why this is terrifying (0)

Anonymous Coward | more than 2 years ago | (#37567652)

This isn't the Brave New World everyone's always been afraid of. If you wanted to know someone's identity in the past (or present), you just asked around. You can find out almost anything about anyone who hasn't been entirely private their whole lives. It's been this way for millennia. Why else do "Have you seen this person?" posters work?

In this instance, you're just taking the same information and making it more easily accessible. Sure, anyone can see your public photos you've posted online by searching for you with image recognition. Except, they could've done that exact same thing before by asking someone who you are.

Granted, it's slightly more creepy now. But it's not like they'll be able to see your private pictures. You didn't share your *private* pictures publicly, did you? Oh, well in that case, yeah. You're screwed.

Combine this with video storage (1)

HangingChad (677530) | more than 2 years ago | (#37567816)

There are companies actually selling access to large stores of surveillance video. Imagine combining facial recognition software with the feeds from thousands of security cameras. You could do all kinds of scary things.

No, it's not. (1)

Tuan121 (1715852) | more than 2 years ago | (#37567822)

nt

We're F'ed (1)

Anonymous Coward | more than 2 years ago | (#37567876)

May as well embrace it.

Only the beginning (1)

trenobus (730756) | more than 2 years ago | (#37567956)

This is only the beginning of the end of privacy. It will not be too much longer before it will be possible to start with a picture and actually locate the person in real life. The general trend is for the real world to become increasingly accessible from the virtual (online) world as real-time data. The question is whether that data will be available to only a few privileged people or institutions, or available to everyone. In the former case, Big Brother (on steroids!) is the outcome. In the latter case, there is at least the possibility that new social norms will emerge, in which people afford each other some privacy in exchange for their own. When you may be watched while watching someone else (particularly the person you're watching), you may think twice about it.

It's Not A Problem (1)

glorybe (946151) | more than 2 years ago | (#37568048)

All that will happen in a truly transparent society is that people will take responsibility for their actions. The truth will set us free is more than a trivial statement. People of faith normally believe that God sees all of their actions and even their thoughts. A society in which life is transparent just might be wonderful. Crime would vanish. Cheating and lying would vanish. This technology is only a step along the path to a truly open society.

Scramble Suit (1)

Cryogenic Specter (702059) | more than 2 years ago | (#37568166)

Looks like the time for a Scramble Suit like in A Scanner Darkly. I so wish I had one of those.

Person of Interest (0)

Anonymous Coward | more than 2 years ago | (#37568194)

I really thought this article was going to be about the new show, "Person of Interest" with Michael Emerson and Jim Caviezel.
