
Google Spots Explicit Images of a Child In Man's Email, Tips Off Police

samzenpus posted about 3 months ago | from the do-not-pass-go dept.

Google

mrspoonsi writes with this story about a tip sent to police by Google after scanning a user's email. A Houston man has been arrested after Google sent a tip to the National Center for Missing and Exploited Children saying the man had explicit images of a child in his email, according to Houston police. The man was a registered sex offender, convicted of sexually assaulting a child in 1994, reports Tim Wetzel at KHOU Channel 11 News in Houston. "He was keeping it inside of his email. I can't see that information, I can't see that photo, but Google can," Detective David Nettles of the Houston Metro Internet Crimes Against Children Taskforce told Channel 11. After Google reportedly tipped off the National Center for Missing and Exploited Children, the Center alerted police, who used the information to get a warrant.


790 comments


Well at least they saved the children! (5, Insightful)

the_Bionic_lemming (446569) | about 3 months ago | (#47596577)

The great things Google can offer: 1984 saves the children!

(Yes it's good that pedophiles get hurt - But there is a very very bad precedent here...)

Re:Well at least they saved the children! (2, Insightful)

Anonymous Coward | about 3 months ago | (#47596605)

I'm not sure how this sets any sort of new precedent; oftentimes these people are caught because they grant other people access to their computer, and that computer happens to have child porn on it.

The bigger issue is how much effort Google is placing into searching people's accounts for child porn, and what assurances there are that the images being possessed are actually known about by the alleged offender.

Re:Well at least they saved the children! (4, Insightful)

Joe_Dragon (2206452) | about 3 months ago | (#47596629)

but if Google messed up the chain of evidence then he may get off.

Child porn is bad and this guy seems to be guilty, but someone needs to stand up for his rights and prove that the IP they have is going to the right place.

Re: Well at least they saved the children! (5, Insightful)

supersat (639745) | about 3 months ago | (#47596783)

Tips received from private companies or individuals are not subject to the same constitutional limits on evidence, provided they are not being paid by law enforcement. This is why CrimeStoppers exists.

Re: Well at least they saved the children! (4, Insightful)

Anonymous Coward | about 3 months ago | (#47596901)

Which is pretty nonsensical, and an end run around the constitution. If the police cover their tracks well enough, all they have to do is pay some people off to gather the evidence.

Re:Well at least they saved the children! (5, Insightful)

Anonymous Coward | about 3 months ago | (#47596793)

And that's the catch no one seems to be talking about. An influenced chain of evidence can break entire cases, simply because the police cannot prove beyond a reasonable doubt that the evidence was not tampered with/planted.

Someone at Google blew the whistle? If 'someone at Google' was able to look inside, how do we know they didn't put it inside in the first place? If you rent an apartment and your landlord has the master key, the police are going to have a VERY hard time convincing the court that you are the guilty party when the only reason they investigated you in the first place was because your landlord tipped off the police.

Re:Well at least they saved the children! (2, Insightful)

AK Marc (707885) | about 3 months ago | (#47596867)

You don't have to prove evidence wasn't tampered with. You just need to prove beyond a reasonable doubt that the convicted child molester was in possession of child porn he was attempting to distribute. Most trust Google far enough to demonstrate a picture in an email. Why wouldn't you convict if a server admin presented a file, with logs, timestamps, and permissions that demonstrate the owner, the creator, and the time at which that person had it?

Re:Well at least they saved the children! (-1, Troll)

Anonymous Coward | about 3 months ago | (#47596907)

Why wouldn't you convict

Because viewing child porn should not be illegal. I'd use jury nullification here.

Re:Well at least they saved the children! (2)

Nutria (679911) | about 3 months ago | (#47596801)

and prove that the IP they have is going to right place.

Eh? It's gmail. The need is to associate a gmail account with a person, not with an IP address.

And doing that depends mainly on how much PII the person tells Google when he registers the account.

Re:Well at least they saved the children! (5, Insightful)

MikeBabcock (65886) | about 3 months ago | (#47596611)

Agreed. Even good outcomes do not justify bad behaviour. We should not be happy that Google is perusing the content of our E-mail with anything but automated tools (for advertising, etc.)

Re:Well at least they saved the children! (3, Insightful)

Anonymous Coward | about 3 months ago | (#47596645)

Agreed. Even good outcomes do not justify bad behaviour. We should not be happy that Google is perusing the content of our E-mail with anything

FYP

Its all in the gmail terms of use ... (1)

Anonymous Coward | about 3 months ago | (#47596681)

Agreed. Even good outcomes do not justify bad behaviour. We should not be happy that Google is perusing the content of our E-mail with anything but automated tools (for advertising, etc.)

An automated tool probably flagged the image; hopefully it wasn't simply probable nudity but probable nudity combined with some other alert, maybe something in the body of the text. Humans probably only review flagged images. The system is working as Google has always intended; go read the terms of use. Working with local law enforcement when Google deems it appropriate or legally required probably falls under what you refer to as "etc".

Re: Its all in the gmail terms of use ... (1)

RickRussellTX (755670) | about 3 months ago | (#47596727)

I smell a lot of "probably" coming off of this post.

Re: Its all in the gmail terms of use ... (0)

Anonymous Coward | about 3 months ago | (#47596807)

I smell a lot of "probably" coming off of this post.

The funny thing is that the post is probably correct. The terms of service most likely make the attached images no more private than the email's text.

Re:Its all in the gmail terms of use ... (5, Informative)

Anonymous Coward | about 3 months ago | (#47596805)

An automated tool probably flagged the image; hopefully it wasn't simply probable nudity but probable nudity combined with some other alert, maybe something in the body of the text. Humans probably only review flagged images. The system is working as Google has always intended; go read the terms of use. Working with local law enforcement when Google deems it appropriate or legally required probably falls under what you refer to as "etc".

Read the full article. There's an agency ("National Center for Missing & Exploited Children") that provides hashes of known child porn images and videos to companies like Google. I don't think it's outside Google's purview to ensure files with hashes appearing on that list don't reside on their servers. Contrary to what the peanut gallery here has to say, Google aren't opening up individual mailboxes for a quick squiz. Not to mention that even if they aren't looking inside mailboxes for these images, they probably do scan messages traversing their network (i.e. incoming/outgoing) for files with known hashes.

Re:Well at least they saved the children! (2, Insightful)

scottbomb (1290580) | about 3 months ago | (#47596713)

Which is why I don't use Gmail, and I find it rather alarming just how many people have switched to Gmail. This is not to say Hotmail and Yahoo are any better at minding our privacy, but I don't use them anymore either, for the same reason.

Re:Well at least they saved the children! (5, Insightful)

Pepebuho (167300) | about 3 months ago | (#47596859)

Guess what: even if you are not using Gmail, chances are the people you communicate with regularly ARE using Gmail; therefore, some of your email still passes through Google's servers.

Cheer up!

Re:Well at least they saved the children! (5, Insightful)

Masked Coward (3773883) | about 3 months ago | (#47596623)

My thoughts exactly. It goes without saying that I feel no sympathy for a child molester. BUT....... oh, the abuse this could lead to. Remember, some people classify as "potential terrorists" those who cite the Constitution in online article comments.

Re:Well at least they saved the children! (0)

Anonymous Coward | about 3 months ago | (#47596631)

I sort of get the feeling there has to be more to the story than this. How can Google even do this? Someone manually looking at photos? Doubt it, and certainly hope not. Automated software to detect it? How the fuck do they even do that? Do they have some sort of secret contract with a government agency to develop that? I imagine they would have to, because in order for them to even develop and calibrate it, it seems they'd have to have a stockpile of test photos, including some of child porn, and thus they'd need immunity.

Re:Well at least they saved the children! (1)

Anonymous Coward | about 3 months ago | (#47596699)

One simple way would be to just look up a hash against known CP images.

Re:Well at least they saved the children! (3, Informative)

evilviper (135110) | about 3 months ago | (#47596703)

Automated software to detect it? How the fuck do they even do that?

You're kidding, right? Ever heard of Google Image Search or TinEye? You give it a URL, or upload a photo, and it'll give you a list of identical and highly-similar images...

From there, it's a no-brainer to feed the system with URLs of known pedo sites... either ones Google employees have identified, or those they've gotten law-enforcement requests to take down.

And even without the TinEye type system, it's still a no-brainer to checksum/hash all those images, and see if an exactly identical one shows up on your servers, somewhere, somehow.
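
For the curious, here is a minimal sketch of the exact-hash check described above, in Python. The digest set, the file name, and the choice of SHA-256 are assumptions for illustration only; the real hash lists distributed to providers and their format are not public.

    import hashlib

    # Hypothetical set of hex digests of known-bad files. In reality a provider
    # would load these from a list supplied by an organization such as NCMEC.
    KNOWN_BAD_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def sha256_of(path):
        """Return the SHA-256 hex digest of a file, read in chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 16), b""):
                h.update(chunk)
        return h.hexdigest()

    def attachment_is_flagged(path):
        """True if the attachment's digest matches a known-bad digest."""
        return sha256_of(path) in KNOWN_BAD_HASHES

    print(attachment_is_flagged("attachment.jpg"))  # placeholder file name

The limitation is also visible in the sketch: recompressing or resizing an image changes every byte and therefore the hash, which is why perceptual-matching schemes (discussed further down the thread) exist.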

Re:Well at least they saved the children! (4, Interesting)

hjf (703092) | about 3 months ago | (#47596749)

Not checksumming or hashing. It's called "feature extraction". I know about it. I made a video about a little piece of software I wrote, based on OpenCV, which is able to identify a picture I show it through my webcam among 20,000 pictures stored on my computer. It's the only video on my YouTube channel that actually has views.
Anyway, once you have the features, you can analyze an image and see if it contains any part of any of the images in your database. It doesn't matter if it's slightly blurred, partially covered, or rotated, and it doesn't matter if it takes up the whole screen or just a fraction. In my demo I show how my program recognizes Magic: The Gathering cards in my hand (which is much more difficult than recognizing poker cards).
Oh, and it does this at several matches per second on a Core 2 Duo class machine.
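
The parent doesn't post code, but feature-based matching along those lines can be sketched in a few lines of Python with OpenCV's ORB detector. The image paths and the distance threshold below are made up for illustration.

    import cv2

    # Load a query image and a reference image in grayscale (placeholder paths).
    query = cv2.imread("query.jpg", cv2.IMREAD_GRAYSCALE)
    reference = cv2.imread("reference.jpg", cv2.IMREAD_GRAYSCALE)

    # ORB extracts keypoints and binary descriptors that tolerate blur,
    # rotation, scaling, and partial occlusion reasonably well.
    orb = cv2.ORB_create(nfeatures=1000)
    kp_q, des_q = orb.detectAndCompute(query, None)
    kp_r, des_r = orb.detectAndCompute(reference, None)

    # Brute-force Hamming matching with cross-check keeps mutual best matches.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_q, des_r)

    # Count close matches; many of them suggests the reference appears in the query.
    good = [m for m in matches if m.distance < 40]
    print(len(good), "strong descriptor matches")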

Re:Well at least they saved the children! (0)

Anonymous Coward | about 3 months ago | (#47596753)

Automated software to detect it? How the fuck do they even do that?

Facial recognition software; it's fairly common nowadays. Some victims of child porn are missing children. Their parents entered their photos into national databases; think of the milk carton photos.

Re:Well at least they saved the children! (2)

scottbomb (1290580) | about 3 months ago | (#47596755)

This is somewhat over-simplified, but Google can also zero in on human faces in Street View in order to slightly blur them. It's all automated. I think it has something to do with scanning for skin tone hues and corresponding shapes. We recognize that an object we see is a human because of how they are put together: arms, legs, a torso, chest, head. Yes, all varying in sizes and hues, but not by much. Parental control engines scan images for what they consider to be excessive skin tones, especially when those tones are interrupted by other skin tones that make up things like nipples, pubic hair, etc. It's quite sophisticated indeed, but when Facebook can do facial recognition, the idea that Google can flag an image of something like a child with a dick in his mouth doesn't seem too far-fetched.
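
The face-blurring half of that, at least, is easy to approximate with off-the-shelf tools. Below is a rough sketch using the Haar cascade that ships with OpenCV; the file names are placeholders, and this is obviously nothing like Google's production pipeline.

    import cv2

    # Frontal-face Haar cascade bundled with OpenCV.
    cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    face_cascade = cv2.CascadeClassifier(cascade_path)

    img = cv2.imread("street_scene.jpg")  # placeholder path
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

    # Detect faces, then blur each detected region in place.
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        roi = img[y:y + h, x:x + w]
        img[y:y + h, x:x + w] = cv2.GaussianBlur(roi, (51, 51), 0)

    cv2.imwrite("street_scene_blurred.jpg", img)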

Re:Well at least they saved the children! (0)

Anonymous Coward | about 3 months ago | (#47596643)

Why? A child porn cross-reference likely found it, not a Google employee maliciously checking every single Gmail user's emails.

Re:Well at least they saved the children! (0)

Anonymous Coward | about 3 months ago | (#47596679)

That actually makes a lot of sense. When I first saw the title, I was somewhat concerned that it would be about somebody having a picture of his naked baby in his inbox. But that's not a potential risk if they're literally only checking for images that match known child porn.

Re: Well at least they saved the children! (0)

Anonymous Coward | about 3 months ago | (#47596745)

Great! Hopefully google w

Re:Well at least they saved the children! (0)

Anonymous Coward | about 3 months ago | (#47596857)

No, no big deal at all... what's your email address again? I have a few pictures I want to send you ;)

Re: Well at least they saved the children! (0)

Anonymous Coward | about 3 months ago | (#47596655)

Which part is surprising, google reading the email on their servers, or reporting obvious crimes?

Re: Well at least they saved the children! (4, Interesting)

felixrising (1135205) | about 3 months ago | (#47596825)

I highly doubt this is as nefarious as it seems on the surface. Chances are Google applies hashing to each image that passes through their servers in order to reduce duplication of stored files. Some files may have been flagged before as being child porn, and they set up some alerting when new emailed images match this pre-existing hash... no worse than an AV signature match... Note: I'm just guessing here, but there is no way Google has a team of people sitting there scanning every single email; it's all automated, and we have already given express permission for Google to do some content analysis on our emails. That is, after all, how they target advertising at us and turn a profit... Gmail isn't free!

Well at least they saved the children! (1)

Anonymous Coward | about 3 months ago | (#47596685)

I can't agree more. I think more people should be up in arms over what Google has just done. I'm also sure someone is going to point out that Google's terms of service have some type of clause stating that they can search your emails and do anything they want with them. I personally think Google has crossed the line. I'm not sure when Google went from being a corporation to being in law enforcement, but this really seems to me to be a very bad precedent.

This type of search, to me, is how the United States government has done an end run around the Fourth Amendment. All law enforcement has to do now is ask companies like Google, Facebook, Microsoft, or Yahoo to do searches and report back.

All I can say is "Fuck You Google!!"

Re:Well at least they saved the children! (5, Insightful)

gweihir (88907) | about 3 months ago | (#47596871)

Aehm, what children were saved here? The article does not mention anything about it, just about some illegal pixels.

Re:Well at least they saved the children! (0)

Anonymous Coward | about 3 months ago | (#47596887)

I'm not sure when Google went from being a corporation to being in law enforcement

They're not in law enforcement. That's why they informed the police instead of arresting the guy themselves.

What likely happened is that their automated system for matching ads to the images in your emails just flagged an image as potential CP. An employee then manually checked if it was a false-positive or not. After confirming that the image was in fact CP, he informed the police.

Are you suggesting that the Google employee shouldn't have informed the police? In some countries it's illegal to withhold evidence and not inform the police about a crime.

Re:Well at least they saved the children! (0)

Xenx (2211586) | about 3 months ago | (#47596903)

They're using automated software to scan against hashes of known child porn. They're not just up and viewing people's email here. Child porn is a serious enough problem, and there is no legal reason for you to have child porn. The situation would be different if the legality of the files themselves were questionable.

Re:Well at least they saved the children! (5, Insightful)

gweihir (88907) | about 3 months ago | (#47596851)

The "success" here is completely insignificant in comparison to the huge costs to society. That you even feel the need to qualify your statement just shows that the artificial demonization of this material in order to justify a surveillance state has worked very well. It seems that by now people have completely forgotten that the actual problem is children getting hurt, not pictures of it or teenagers "sexting" each other. For all we know this person has a picture of a nude teenager, which does not even qualify as pornography in most countries. There is a reason this material does not get shown to the public. With the strong focus on digital material, the police gets easy "successes", and can justify any and all surveillance, but does not actually prevent any child from getting hurt. While it is difficult to get information (what a surprise), it seems that most acts of child abuse do not actually end up documented on the Internet and that commercial production is basically non-existent, as following money-trails is very, very easy.

At the same time, the police-state and the fascism that universally follows it get more and more established.

Re:Well at least they saved the children! (0)

Anonymous Coward | about 3 months ago | (#47596905)

(Yes it's good that pedophiles get hurt - But there is a very very bad precedent here...)

So... If the guy turns out to be innocent -- is Google liable for damages?

Guy is an idiot. (-1)

Anonymous Coward | about 3 months ago | (#47596585)

If he's going to use any type of webmail to share illegal child porn, he should be using Tor to mask his identity.

Re:Guy is an idiot. (0)

Anonymous Coward | about 3 months ago | (#47596719)

If he's going to use any type of webmail to share illegal child porn, he should be using Tor to mask his identity.

Suddenly I could care less if Tor usage invites gov't scrutiny. Thank you for helping me get off the fence on Tor.

Re:Guy is an idiot. (0)

Anonymous Coward | about 3 months ago | (#47596879)

So before now you were pretty sure only good people could benefit from security? lol

I would have gotten first post... (1)

Anonymous Coward | about 3 months ago | (#47596591)

I would have gotten first post, but I needed to delete some emails...

Re:I would have gotten first post... (3, Funny)

Anonymous Coward | about 3 months ago | (#47596707)

That's really funny. The idea that you can delete things.

Re:I would have gotten first post... (1)

mark-t (151149) | about 3 months ago | (#47596795)

You can. It just takes some effort.

Riiiiiight (0)

Anonymous Coward | about 3 months ago | (#47596593)

We don't do warrantless searches of email, for real! We catch sex offenders, too!

Re: Riiiiiight (0)

Anonymous Coward | about 3 months ago | (#47596617)

Idiot. The police did get a warrant. Google, on the other hand, didn't need a warrant since they're not a police force.

Re: Riiiiiight (1)

Anonymous Coward | about 3 months ago | (#47596677)

Bet it was the NSA that tipped Google, which tipped Houston PD.

Yay for consistency (0)

Anonymous Coward | about 3 months ago | (#47596609)

We can do a lot to ensure it’s not available online — and that when people try to share this disgusting content they are caught and prosecuted.

Google should tell all of its subsidiaries this, too. Or maybe Google should just hand YouTube over to the police and be done with it. I'm sick of searching YouTube for IT-related stuff and being given results with boobs, penises and other disgusting things in the video thumbnails.

Re:Yay for consistency (1)

cheater512 (783349) | about 3 months ago | (#47596747)

You find your own anatomy disgusting? How do you live with yourself?

This is chilling (5, Insightful)

MasseKid (1294554) | about 3 months ago | (#47596625)

This is chilling, not for pedophiles (fuck them), but for the average citizen. While, I absolutely believe it's Google's job to report illegal activity they accidentally uncover to the police, it appears Google is actively searching your e-mails for things to forward to the police, and that's a chilling thought for free speech, freedom, and the prevention of abuse of power.

Re:This is chilling (0)

Anonymous Coward | about 3 months ago | (#47596659)

darkmail.info

Re:This is chilling (0)

Anonymous Coward | about 3 months ago | (#47596661)

this appears google is actively searching your e-mails for things to forward to the police

Hmm, I don't know. This is the first time I've heard of something like this from Google, so it could have been just an inquiry into a random technical problem, a Google employee suspicious of their neighbor, a Google employee who got a tip-off from his best friend, or anything, really.

Re:This is chilling (3, Interesting)

NoKaOi (1415755) | about 3 months ago | (#47596895)

Hmm, I don't know. This is the first time I've heard of something like this from Google, so it could have been just an inquiry into a random technical problem, a Google employee suspicious of their neighbor, a Google employee who got a tip-off from his best friend, or anything, really.

All of those scenarios just go to show that, contrary to what Google has claimed in the past, their employees can and do view emails even without a court order.

Re:This is chilling (5, Insightful)

MikeBabcock (65886) | about 3 months ago | (#47596671)

People seem to miss the opportunity for incredibly bad behaviour. What about if a company like Google starts reporting on who you want to vote for? There are a lot of reasons the post office doesn't open the mail -- and our electronic equivalents should respect that same privacy.

Re:This is chilling (2, Insightful)

Anonymous Coward | about 3 months ago | (#47596721)

"There are a lot of reasons the post office doesn't open the mail -- and our electronic equivalents should respect that same privacy."

For the postal analogy, an email is a postcard.

Encryption is an envelope.

Re:This is chilling (2)

rollingcalf (605357) | about 3 months ago | (#47596769)

No, your email account password is the envelope. Nobody should be accessing your email account without either a warrant or you giving them the password.

Of course, emails can be read without your password by employees of the email provider who have access to the relevant servers. But your letters can also be easily opened by postal service employees who get their hands on them... that doesn't mean you need to seal your letters in a titanium case welded shut (i.e., the equivalent of strong encryption) to have a reasonable expectation of privacy and protection by the 4th Amendment.

Re:This is chilling (1)

Anonymous Coward | about 3 months ago | (#47596797)

FWIW, in a dozen cities I've lived in, law enforcement has an office either in the same building as, or next to, the main post office.
And the alert can see a steady string of snail mail being delivered there in the morning.

Federal law prohibits the Post Office from opening mail.
It does not prohibit other government agencies from opening mail.
And the Postmaster General has publicly stated that they can, and do, as a matter of course, submit mail related to "suspicious addresses" to law enforcement, prior to delivery to said address.

Re:This is chilling (4, Insightful)

gweihir (88907) | about 3 months ago | (#47596761)

There is no "accidental" here. They either are systematically scanning all email or they had (again) some system administrator looking at private email without authorization. That is extremely troubling. That they found somebody possessing illegal digital goods is besides the point. A police state is characterized by universal surveillance and the eradication of all privacy. Sure, in a police state, more people doing illegal things are caught initially (but only then), but that is in no way desirable at this huge price.

Re:This is chilling (0)

Anonymous Coward | about 3 months ago | (#47596777)

In this case, RTFA already: the image hashed to a known pedo image. Google is almost obligated to report this because they only store a single copy of duplicated images (at least according to the article).

Re:This is chilling (0)

Anonymous Coward | about 3 months ago | (#47596791)

While comma. I, like that.

Re:This is chilling (1)

rossdee (243626) | about 3 months ago | (#47596863)

And don't forget that Gmail is used all over the world, and in other countries some things are against the law that are legal here, e.g. same-sex couples in Russia or Iran.

The moral of the story is, don't use Gmail if you want privacy.

Re:This is NOT chilling - RTFA (0)

Anonymous Coward | about 3 months ago | (#47596875)

Read the full article. There's an agency ("National Center for Missing & Exploited Children") that provides hashes of known child porn images and videos to companies like Google. I don't think it's outside Google's purview to ensure files with hashes appearing on that list don't reside on their servers. Contrary to what the peanut gallery here has to say, Google aren't opening up individual mailboxes for a quick squiz. Not to mention that even if they aren't looking inside mailboxes for these images, they probably do scan messages traversing their network (i.e. incoming/outgoing) for files with known hashes.

Stolen word for word from another AC at

http://tech.slashdot.org/comments.pl?sid=5487261&cid=47596805

Best secure email? (1)

Iamthecheese (1264298) | about 3 months ago | (#47596639)

I don't want ANYONE looking in my email, and I don't want to require my friends to have to set up security just to read emails from me. What's the best email service offering end-to-end encryption?

Re:Best secure email? (1)

Anonymous Coward | about 3 months ago | (#47596669)

End to end encryption that doesn't require your friends to set up security? Well if you find that then you've discovered the holy grail of secure email since nobody else has found it.

Re:Best secure email? (0)

Anonymous Coward | about 3 months ago | (#47596693)

Any mail service you like. Just use PGP.

Re: Best secure email? (1)

corychristison (951993) | about 3 months ago | (#47596695)

That's a tricky thing to do.

Email is inherently insecure by design. It was never meant for how it is used today.

The most common and fairly effective option I know of is to use PGP or GPG encryption. Some providers integrate it and make it easy to use, but it still is not seamless.

Another option would simply be to not use email. There are other secure communication means, typically centralized, and therefore anyone you want to communicate with will also need to use said service.
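
As one concrete illustration of the PGP/GPG route mentioned above, here is a minimal sketch using the third-party python-gnupg wrapper. It assumes GnuPG is installed and that the recipient's public key (the address is made up) is already in your keyring.

    import gnupg  # third-party wrapper around the gpg binary: pip install python-gnupg

    gpg = gnupg.GPG()  # uses the default GnuPG home directory and keyring

    body = "This message body would otherwise sit in plaintext on the provider's servers."

    # Encrypt to the recipient's public key; only their private key can decrypt it.
    result = gpg.encrypt(body, ["friend@example.org"])
    if not result.ok:
        raise RuntimeError("encryption failed: " + result.status)

    # The ASCII-armored ciphertext is what you paste into (or attach to) the email.
    print(str(result))

Note that the provider still sees the metadata (sender, recipient, subject, timestamps), just not the body or attachments.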

Re:Best secure email? (5, Insightful)

wisnoskij (1206448) | about 3 months ago | (#47596717)

Lay your own cable to all your friends' houses, then run your own encrypted email server.
Then learn to accept that the NSA installed a hardware backdoor in your router and is reading your emails (and now they are monitoring you for suspected terrorist activities), and that China installed one in your computer hardware and is doing the same.

Re:Best secure email? (3, Informative)

BradMajors (995624) | about 3 months ago | (#47596733)

Email your friend encrypted PDF files and tell him the PDF password over the telephone.
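
One way to script what the parent suggests, using the third-party pikepdf library; the file names and password are placeholders, and plenty of other tools can do the same job.

    import pikepdf  # third-party: pip install pikepdf

    # Placeholder password; in practice read it at runtime and share it out of band,
    # e.g. over the telephone as the parent suggests.
    password = "correct horse battery staple"

    with pikepdf.open("report.pdf") as pdf:
        pdf.save(
            "report-encrypted.pdf",
            encryption=pikepdf.Encryption(user=password, owner=password, R=6),  # R=6 selects AES-256
        )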

Re:Best secure email? (3, Informative)

suprcvic (684521) | about 3 months ago | (#47596741)

I use runbox. Secure email based out of Norway. https://runbox.com/why-runbox/... [runbox.com]

Re:Best secure email? (1)

50000BTU_barbecue (588132) | about 3 months ago | (#47596803)

One time pad.

Re:Best secure email? (1)

gweihir (88907) | about 3 months ago | (#47596897)

Do not trust any "service", do encryption locally on your system. Companies get coerced into looking at any and all content routinely. And leaking your keys is really, really easy.

Others?? (4, Interesting)

wisnoskij (1206448) | about 3 months ago | (#47596657)

How does Google do this for one person? If they suddenly started scanning images for this, you'd think they would uncover a few thousand people at a time. Are we supposed to believe that they specially targeted him, or that he is the only person to ever send naked pictures of children through Gmail?

Re:Others?? (1)

The Raven (30575) | about 3 months ago | (#47596849)

I choose to remain optimistic that it does NOT happen all the time, because they do not look at the contents of your email all the time. In other words, someone was diagnosing an algorithm (say, how to choose advertisements using the content of attached images), the images triggered the offensive filter, an engineer took a look, and he reported it.

Perhaps I am naive, but I simply think that Google does not do this frequently because I don't think they look at email frequently, or scan for naughty pics on purpose. As a sysadmin, I generally don't give a fuck what's in my users' email. I doubt they do either, except to advertise to it.

Amazing Technology (0)

Anonymous Coward | about 3 months ago | (#47596665)

I thought Google has said for years that it can't automatically identify copyrighted material and is therefore legally exempt from being required to block objectionable material. But now that it appears their algorithms can search email images and make the determination, then it proves Google is now capable of identifying pretty much anything, correct? Wow, this is going to open them up to a ton of liability!

Re:Amazing Technology (1)

Anonymous Coward | about 3 months ago | (#47596823)

I thought Google has said for years that it can't automatically identify copyrighted material and is therefore legally exempt from being required to block objectionable material. But now that it appears their algorithms can search email images and make the determination, then it proves Google is now capable of identifying pretty much anything, correct? Wow, this is going to open them up to a ton of liability!

Read the fucking article. NCMEC identifies the content, they give their list of hashes to Google.

Re:Amazing Technology (1)

mendax (114116) | about 3 months ago | (#47596835)

RTFA....

The Google rep said:

Since 2008, we’ve used 'hashing' technology to tag known child sexual abuse images, allowing us to identify duplicate images which may exist elsewhere. ...

We’re in the business of making information widely available, but there’s certain 'information' that should never be created or found. We can do a lot to ensure it’s not available online—and that when people try to share this disgusting content they are caught and prosecuted.

The U.S. Justice Department is almost certainly giving Google the MD5 tags of the images they have in their child pornography database, and those of new images that are discovered by law enforcement, and Google is using them to identify such images in the web pages they index and in e-mails, and reporting them to law enforcement. They do maintain one, you know.

Good riddance (5, Interesting)

penguinoid (724646) | about 3 months ago | (#47596675)

Both to the pedophile and to the illusion of privacy people had when using Gmail.

(They have an obligation to report child porn if they find it, but they don't have an obligation to look. My suspicion is Google is not happy about what happened.)

Re:Good riddance (0)

Anonymous Coward | about 3 months ago | (#47596785)

Slight correction: "... Both to the pedophile and to the illusion of privacy people had when using E-mail".

Unless you are encrypting your email, any email service or relay can view/store/scan your emails.

Re:Good riddance (4, Insightful)

LordLucless (582312) | about 3 months ago | (#47596787)

Yeah, I have absolutely no problem with this article. You don't want RandomCompany looking at your emails? Don't send your emails through RandomCompany servers.

Don't want your ISP looking at your emails? Encrypt your emails.

Don't have the ability to understand how to encrypt your emails and want someone to manage it for you because technology is all so hard but you still want to use it? Suck it up and learn, or pay someone to do it for you and stop whining about your own ignorance.

Re:Good riddance (1)

I'm New Around Here (1154723) | about 3 months ago | (#47596865)

Exactly.

My main email account is on Yahoo. Because I can log in from anywhere and read or send messages with it. I don't send illegal messages with it. One, that would be stupid. Two, I don't do that much illegal stuff.

So if someone is running a drug operation, he should not use Yahoo to do it.

Re:Good riddance (0)

Anonymous Coward | about 3 months ago | (#47596813)

I wouldn't be surprised if Google's board of directors quietly orders a few of their coders to figure out a way to prevent whatever situation occurred that required them to view the illegal subject matter, so that they don't have a repeat offense.

Re:Good riddance (5, Insightful)

Mitreya (579078) | about 3 months ago | (#47596847)

They have an obligation to report child porn if they find it, but they don't have an obligation to look.

Actually, naive me was thinking that they have an obligation NOT TO LOOK.
I also have a storage room rental -- does that mean the owner is allowed to do random checks for stolen goods? Just in case?

Could be the start of something. (0)

Anonymous Coward | about 3 months ago | (#47596715)

I can't help but think that this was a calculated move by Google. They could have had information on many other people this whole time, like people who talk about committing piracy in their conversations, etc. By choosing the first "tip" to the government to be about a registered sex offender, you can start the narrative off (in favour of "surveillance is good"), proving your point with a highly one-sided extreme case people will agree with. I just don't feel that this was done on a whim, like they just came across this person's email content and, without an agenda, decided to report him for justice.

Why are people surprised (0)

Anonymous Coward | about 3 months ago | (#47596723)

Google said they do this: if you have a Gmail account, Google owns it, scans it, bases advertising off it, and sells it to others after 7 years.

Criminals are lazy and so is everyone else... (1)

ClaudeVMS (637469) | about 3 months ago | (#47596729)

Encryption? I'm happy Google did what they did... If everyone sent bogus information to phishing emails then phishing would stop. If everyone encrypted their email then snooping would stop.

Are they *sure* they got the right guy? (3, Insightful)

imag0 (605684) | about 3 months ago | (#47596737)

Gmail allows for dot address matching. This is a *huge* problem that has never been addressed.

Apparently my first-letter, last-name Gmail address happens to be pretty popular. So popular that I receive emails intended for at least 5 other people in my inbox. One is in PA, another in Florida, still another in New Zealand... I could go on and on, but you get the idea. Apparently, this seems to happen a bit to people [google.com].

Sadly, Google has no fix for it, no way to get it to stop. Their support address and site are useless, imho.

I have since moved all of my email off to my own domain and mail services not controlled by Google. I still keep the account open and forwarding to my new email address, so I still get their email, too. I do what I can to minimize problems by auto-deleting everything that hits my inbox that's obviously not for me.

Stories like this scare the shit out of me because, at any time, if one of those people I happen to receive email for suddenly decides to go into full-creep mode, I could be put in prison for a very, very long time. Not for anything that I have done, but for how gmail has been setup to allow for this.

Re:Are they *sure* they got the right guy? (1)

nedlohs (1335013) | about 3 months ago | (#47596809)

People can send stuff to a non-Gmail address just as easily as to a Gmail address, so how exactly would that make any difference at all? (Well, aside from Google not going through your email and reporting objectionable material to the cops, of course...)

two persons? (1)

BradMajors (995624) | about 3 months ago | (#47596743)

Shouldn't two persons have been arrested, i.e. both the sender and the receiver of the emails?

Re:two persons? (0)

Anonymous Coward | about 3 months ago | (#47596767)

Great thinking! On a totally unrelated note, what's your email address?

Spam your enemy's Gmail account with CP! (0)

Anonymous Coward | about 3 months ago | (#47596757)

Taking the appropriate steps to hide your identity of course.

It's not just Google... (2)

supersat (639745) | about 3 months ago | (#47596759)

Microsoft has something called PhotoDNA which scours Bing, Outlook, etc. for child porn. I believe they also make it available to other companies. In fact, given the difficulty of getting images to train on, I wouldn't be surprised if Google was using Microsoft's PhotoDNA technology.
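
PhotoDNA itself is proprietary, but the general idea of a "robust" or perceptual hash that survives resizing and recompression can be sketched with the third-party ImageHash library. The file names and the distance threshold are made up; this illustrates the technique, not PhotoDNA itself.

    from PIL import Image   # pip install Pillow
    import imagehash        # pip install ImageHash

    # Perceptual hashes: unlike MD5/SHA, visually similar images (resized,
    # recompressed, lightly edited) produce hashes that are close together.
    h1 = imagehash.phash(Image.open("original.jpg"))
    h2 = imagehash.phash(Image.open("resized_copy.jpg"))

    # Subtracting two hashes gives their Hamming distance; small means similar.
    distance = h1 - h2
    print("Hamming distance:", distance)
    if distance <= 8:
        print("Probably the same picture.")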

Chilling not just for scanning email... (1)

Balthisar (649688) | about 3 months ago | (#47596763)

This implies *much* more than the simple scanning of email and image recognition. After all, is Google also reporting innocent pictures people take of their babies in, e.g., the bathtub to send to daddy while he's in China on a business trip? Or is it more likely that Google knew the guy was a sex offender and targeted the scanning of his email specifically?

Lets try this.... (4, Insightful)

wbr1 (2538558) | about 3 months ago | (#47596771)

I have no idea if this guy did this or not (innocent until proven right?). It looks like he did, but consider the following. Registered sex offenders in most states have to register their email address, sometimes even so much as providing the password.

With legal (or cracked) access to anyone's email account (sex offender or not), let's see how easy it is to plant evidence.

1. Access the account and add a folder or label (preferably hidden by being buried in the default sort order or under another folder).
2. Set filter with obscure rule to automatically route certain emails to said folder.
3. Send "illicit" or "evidentiary" messages that match said filter. These can be sent from self or whatever generated entity seems appropriate.
4. Access account again from various public IP addresses (or from target's own wifi). Read already read email, plus messages in target folder.
5. Remove filter. Have Google 'find' the evidence. Arrest wrongdoer.

This is not that far-fetched. The chain of evidence does not prove that the target is guilty, but can be made to look enough like it to convince a judge or jury. From the vantage of Google or a jury, it looks as though the subject sent or had sent, expected, and read the messages.
Just about anyone here could do this with the creds to an account - which in most situations are not terribly hard to garner.
Before you say you would notice the folder in your account, think of this: I have over 100 folders in my email account, some rarely opened, and never all visible on the screen. I wouldn't have noticed - but I may have enough knowledge to fight - a little anyway. How about a novice, when a folder named "Archived Messages" appears? Would he/she even think twice?

I did not RTFA, but I know Google uses their image search algos for blocking known child porn sites. It is not a hard step to run that against email messages. How about when the NSA/CIA/FBI tells Google (via an NSL) to scan all messages for X terms? How about when said terms are sent to and from hacked accounts as a matter of course?

It is important to realize that absolutely no unencrypted communication is private, but what about when forged open communications can make you a criminal?

Re:Lets try this.... (0)

Anonymous Coward | about 3 months ago | (#47596861)

Your insane paranoid story is interesting, but it raises the question--why?

Trust Google? (2)

mbone (558574) | about 3 months ago | (#47596819)

If they can do this for this cause, they can do this for any cause, or for no cause at all.

I can't say I am surprised.

Snooping or running hashes? (4, Interesting)

JThundley (631154) | about 3 months ago | (#47596833)

Were they really snooping around this guy's email for no reason or do they check your attachments against a list of hashes of known child porn?

Re:Snooping or running hashes? (2, Informative)

Anonymous Coward | about 3 months ago | (#47596891)

The article says specifically that they are comparing email attachments against a list of known hashes of child porn.

Terms of Service? (0)

forevermore (582201) | about 3 months ago | (#47596839)

One would assume that Google has the right to make sure you're complying with their terms of service, and if in that (presumably automated) scan they find illegal activity, is it not their prerogative to report it to the authorities? On the flip side, is this much different from your leaving a stash of cocaine on the back seat when you take your car in for service? Do you expect that the mechanic wouldn't report it to the cops?

Isn't snooping on someone's emails illegal? (0)

Anonymous Coward | about 3 months ago | (#47596841)

Also, how the fuck could anyone be dumb enough to use GMail for any kind of business purposes now that you know that they can simply poach competitive information right out of your communications, and not get arrested for it?

Brain surgery? (1)

Tablizer (95088) | about 3 months ago | (#47596881)

Why can't they just remove the sexual areas of pedophile brains rather than jail them for 20 years (as an option)? Often they are otherwise normal people who abide by the law, show up to work on time, and pay taxes. Their craving is specific enough to be relatively easy to "short circuit".

As a taxpayer, I figure it would probably be cheaper to snip around in their brain than to house them for 20 years.

So Like 30 Minutes Ago Google Finds Out ? (0)

Anonymous Coward | about 3 months ago | (#47596889)

A lot to this story and Google's interest are not adding up.

Sorry Google but you are as compromised as the "bait" you handed off.

Ka Ching Baby !

File Hashing - just like antivirus (0)

Anonymous Coward | about 3 months ago | (#47596893)

Summarized from the details:

Google has partnered with many anti-child-exploitation groups to see what it can do to help.

The way discussed here is by hashing known files. So, for example, whenever there's a huge child porn bust and gigs of files are discovered, Google hashes the content (think MD5, but maybe something image-specific in case it's resized). Now, if it ever encounters that exact hash from a known bad file, that registers as a hit. If there are thousands of hits on a single account, or lots of sending/receiving activity, that might be cause for alarm.

Same shit I did when people uploaded malware to the hosting servers I used to administer: whenever I found some malware I'd hash it with ClamAV, then whenever anything new was uploaded it'd get scanned.

Disclosure: I actually read the article. I don't work for anyone involved, nor give too many fucks.
