
Microsoft Tip Leads To Child Porn Arrest In Pennsylvania

timothy posted about 3 months ago | from the looking-looking-everywhere dept.


Shades of the recent arrest based on child porn images flagged by Google in an email, mrspoonsi writes: A tip-off from Microsoft has led to the arrest of a man in Pennsylvania who has been charged with receiving and sharing child abuse images. Microsoft flagged the matter after discovering that an image involving a young girl had allegedly been saved to the man's OneDrive cloud storage account. According to court documents, the man was subsequently detected trying to send two illegal pictures via one of Microsoft's live.com email accounts. Police arrested him on 31 July.


Which company is next in line? (4, Insightful)

mrspoonsi (2955715) | about 3 months ago | (#47617263)

Dropbox? Apple iCloud?

Re:Which company is next in line? (0)

Anonymous Coward | about 3 months ago | (#47617269)

First Google, now Microsoft, so next should be Apple. Or do you think Dropbox is bigger than any of these three companies?

Re:Which company is next in line? (4, Interesting)

NettiWelho (1147351) | about 3 months ago | (#47617459)

Which company is next in line?

What makes you think they have not been parallel-processed?

Microsoft's terms and conditions for its US users explicitly state that it has the right to deploy "automated technologies to detect child pornography or abusive behaviour that might harm the system, our customers, or others".

Now, is it my imagination or does that description cover something like: "Our employees have free access to everyone's files, so eventually all pics get viewed and tagged. Because think of the children. Terrorism, fire, brimstone and death!"?

TFA says it requires a 'fingerprint', i.e. already having whatever they're looking for archived...

Re:Which company is next in line? (1)

Pro923 (1447307) | about 3 months ago | (#47618037)

If they already have it archived, doesn't that make them guilty of possession of child porn?

Re:Which company is next in line? (0)

Anonymous Coward | about 3 months ago | (#47617781)

Perhaps Apple is the only one of these companies that doesn't have a vested interest in looking at your data while it sits on their servers ;)

Re:Which company is next in line? (1)

thieh (3654731) | about 3 months ago | (#47617321)

Or Amazon? After all, EC2 is perfect for your hidden service if you are using its other services.

Re:Which company is next in line? (1)

will_die (586523) | about 3 months ago | (#47617367)

Dropbox's usage license already allows them to do something like this, so chances are they already do.

Re:Which company is next in line? (2)

EvilJoker (192907) | about 3 months ago | (#47617679)

I'm more concerned about where the scans extend from here. It would be relatively trivial to include "scene release" pirated content in a similar hash group, and report it accordingly.

Even worse would be Dropbox, Google Drive, etc. starting to scan OUTSIDE of their own directories, or adding new ones without asking. The only thing really stopping this is a matter of volume - hashing that many files would slow down the system too much, and uploading the hashes would take too long. Neither of these is insurmountable.
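For what it's worth, the scan described above is mechanically simple. A minimal sketch, assuming a provider keeps a set of known-bad SHA-256 digests (the digest shown and the reporting step are hypothetical):

    import hashlib
    from pathlib import Path

    # Hypothetical set of flagged SHA-256 digests (hex). In the scenario the
    # parent worries about, a "scene release" list is just more entries.
    FLAGGED_HASHES = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def sha256_of(path: Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB chunks
                h.update(chunk)
        return h.hexdigest()

    def scan(root: Path):
        for p in root.rglob("*"):
            if p.is_file() and sha256_of(p) in FLAGGED_HASHES:
                yield p  # a real system would file a report here

And the volume objection is weaker than it sounds: a provider can hash each file once at upload time (many already do, for dedup), so widening the flag list costs almost nothing.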

Trust the Computer. The Computer is your friend. (1)

Gothmolly (148874) | about 3 months ago | (#47617275)

It's For The Children!

Re:Trust the Computer. The Computer is your friend (0, Flamebait)

bobbied (2522392) | about 3 months ago | (#47617561)

Hey, pedophiles have serious mental issues and deserve a special place in prison. The people who take the pictures are a special kind of vile, and the people who distribute this stuff enable the takers, and the nut cases that pay for it have a SERIOUS problem and need help and monitoring of their sickness. In most states it is *required* that you report the existence of any of this kind of stuff because of the harm that is caused to the children.

So in this case, it IS for the children and it's hard to argue with the logic.

Re:Trust the Computer. The Computer is your friend (2, Insightful)

Anonymous Coward | about 3 months ago | (#47617687)

So I take it you will allow me to move into your home and watch your every intimate movement? All in the name of making sure you aren't distributing child porn, of course.

Re:Trust the Computer. The Computer is your friend (2, Insightful)

preaction (1526109) | about 3 months ago | (#47617761)

No, and that is a ludicrous analogy.

It's more like this: if I go into someone else's home and leave my stuff there, and something legally dubious happens to be in my stuff in their house, I should not expect them to simply let it go (considering that a lot of legally dubious things come with "conspiracy" and "required to report" clauses).

Re:Trust the Computer. The Computer is your friend (2, Insightful)

Anonymous Coward | about 3 months ago | (#47617813)

Your comparison is perfect, assuming you want people searching through your stuff for legally dubious things. The big issue is that this searching could be expanded to catch other, less harmful files. What if they were searching for generic pornography, leaked government documents, or "backups" of programs/media? Surely that isn't something you'd want.

Re:Trust the Computer. The Computer is your friend (0)

Anonymous Coward | about 3 months ago | (#47617833)

I have no idea how your comment makes any sense in this case.

Try again...

Re:Trust the Computer. The Computer is your friend (5, Insightful)

jeIIomizer (3670945) | about 3 months ago | (#47617769)

Hey, pedophiles have serious mental issues and deserve a special place in prison .

A pedophile is nothing more than a person who is sexually attracted to prepubescent children. Not all pedophiles rape or even look at child porn, and not all child rapists are even necessarily pedophiles.

Also, why do they need a special place in prison? Why not 'normal' rapists, or murderers? Do they also get special places in prison? If not, then why single out this group? Because mentions of 'the children' cause your irrational brain to malfunction?

In most states it is *required* that you report the existence of any of this kind of stuff because of the harm that is caused to the children.

Voodoo is not real. Voodoo does not exist. Images will not harm people like voodoo dolls. Any 'harm' is caused by their own reaction, assuming that they even see it. But if the mere thought that an image of themselves could be out there is enough to make them emotionally unstable, then there is nothing that can be done for them, because censorship is - in practice - futile.

So in this case, it IS for the children and it's hard to argue with the logic.

No, it's easy, and that's because there is no logic; just a strong desire for more and more government control over what information is accessible to people.

Re:Trust the Computer. The Computer is your friend (-1)

Anonymous Coward | about 3 months ago | (#47617863)

blah blah. You're just commenting for the sake of commenting.
Yes, I understand this comment is a bit of the same.

Re:Trust the Computer. The Computer is your friend (5, Insightful)

dave562 (969951) | about 3 months ago | (#47617867)

The harm is in the production of the images in the first place, not in the viewing of them. The viewing supports the production. Or the production supports the viewing. I am not sure, given that I do not operate in those circles. From what I have read about it, the consensus seems to be that most kiddie porn is produced by family members abusing their younger relatives.

It can probably be argued that the people making the images would continue to make them even if they did not have an audience to share them with. Even so, there is still some social value in discouraging people from consuming the images. If people are interested in the images, that is a form of social acceptance for those who make the images.

It is bad enough that people have these demons that they struggle with. It is terrible that they abuse those who are too young to protect themselves and in most cases, do not even realize how wrong the activities are. The last thing that we need as a society is to encourage others to consume the evidence of that abuse.

Re:Trust the Computer. The Computer is your friend (2, Insightful)

jeIIomizer (3670945) | about 3 months ago | (#47617923)

The harm is in the production of the images in the first place

I agree 100%.

The viewing supports the production.

People's actions are their own. If the rapists rape, then it is their fault for raping, whether or not they're doing it for a profit or because they want others to see the videos or images. Going after people who merely look at the content is blaming them for other people's actions, and I don't condone that.

But even if that were true, I'm 100% opposed to government censorship, even if it keeps people 'safe.' So no such arguments will work on me.

The last thing that we need as a society is to encourage others to consume the evidence of that abuse.

The last thing we need is censorship.

Re:Trust the Computer. The Computer is your friend (0)

Anonymous Coward | about 3 months ago | (#47618073)

Would anyone post Instagram pictures if there was no-one there to 'like' them?

Re:Trust the Computer. The Computer is your friend (5, Insightful)

Mordok-DestroyerOfWo (1000167) | about 3 months ago | (#47617787)

Here's a novel idea...let's get the downloaders the mental help that they obviously need and save the torches and pitchforks for the ones that are taking the photos/videos.

Quick, sue /. as these numbers are illegal ! (1)

UnknownSoldier (67820) | about 3 months ago | (#47617285)

The demo 1x1 pixel has the color 0xF0B8A0 and contains child pr0n down-sampled.

What, you thought numbers being illegal was nonsense too?

Re:Quick, sue /. as these numbers are illegal ! (1)

sacrilicious (316896) | about 3 months ago | (#47617369)

I *knew* I recognized that shade of pink!

second of the row (0)

Anonymous Coward | about 3 months ago | (#47617287)

Should we expect a sudden outburst of "large IT company found a pedo" news? As a subtle way of making their scans acceptable?

In the clear? SRSLY? (4, Insightful)

Mal-2 (675116) | about 3 months ago | (#47617291)

Sweet Jesus, if you're going to send things in the clear, you have no idea who might be able to lay eyes on it. This goes for storing things locally -- people have been busted for stored files when they take a machine in for repair as well.

When in doubt, encrypt. When not in doubt, get in doubt.

Re:In the clear? SRSLY? (0)

Anonymous Coward | about 3 months ago | (#47617311)

^^^^must be a pedo^^^^

Re:In the clear? SRSLY? (0)

Anonymous Coward | about 3 months ago | (#47617329)

^^^^must be libel^^^^

Re:In the clear? SRSLY? (2)

Mal-2 (675116) | about 3 months ago | (#47617381)

Nope, just severely allergic to stupidity. Whether I agree with the law (some parts I do, some I don't), or indulge in that sort of material myself (which I don't) are both irrelevant -- if content you are distributing is likely to cause authorities to intervene if it is noticed, then encrypt that shit. Simple as that. If you are in the habit of moving such content, it's even better to get in the habit of encrypting EVERYTHING so as to obfuscate what is worth attacking and what is not.
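For anyone taking that advice, a minimal sketch of encrypt-before-upload using the Python cryptography package's Fernet recipe (file names are placeholders; key management, the genuinely hard part, is omitted):

    from cryptography.fernet import Fernet

    key = Fernet.generate_key()        # keep this anywhere EXCEPT the cloud
    f = Fernet(key)

    with open("file.bin", "rb") as fh:
        token = f.encrypt(fh.read())   # authenticated encryption of the bytes

    with open("file.bin.enc", "wb") as fh:
        fh.write(token)                # upload this instead of the plaintext

    # later, with the same key: f.decrypt(token) returns the original bytes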

Re:In the clear? SRSLY? (1)

thieh (3654731) | about 3 months ago | (#47617571)

On top of that, I am surprised that pedos didn't do anything AFTER Google caught someone. This once again validates the claim that we always underestimate the bounds of stupidity in people.

Re:In the clear? SRSLY? (1)

Anonymous Coward | about 3 months ago | (#47617751)

Police arrested him on 31 July. He did his deed earlier than that. The Google stuff was released in August.

Re:In the clear? SRSLY? (1)

Impy the Impiuos Imp (442658) | about 3 months ago | (#47617621)

A lot of online storage services know people store music and movies. So they check whether a file is a copy of one they already have, and if so, don't really store your copy, just a pointer to a master copy.

Google knows lots of child porn (and violent images, e.g. accidents) because they pay people to scan their crawler's found images, precisely so they can hide them in returned searches. They probably don't keep copies of child porn for legal reasons, but would keep some kind of electronic signature of them so they can be auto-recognized by their crawler software. Their humans would hence only need to check "new" pictures.

MS probably has something similar for Bing. Running all their email or cloud pictures (already hashed for space reduction as described) against it would trivially let them scan for known child porn. Easy forwarding.

This does make me nervous, because such a system could easily be abused in the future to track political issues, pictures, and text blocks people forward around. Of course the NSA probably does this already via normal internet server passthrough nodes. :(
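The storage trick described above is content addressing, and it shows why scanning piggybacks on dedup almost for free. A rough sketch, purely illustrative (the report() hook and in-memory store are hypothetical, not any provider's actual code):

    import hashlib

    store = {}       # digest -> blob: one "master copy" per unique file
    flagged = set()  # digests supplied by an outside agency

    def report(user, digest):
        print(f"flag: {user} uploaded {digest}")  # stand-in for a real report

    def put(user, blob: bytes) -> str:
        digest = hashlib.sha256(blob).hexdigest()
        if digest in flagged:
            report(user, digest)
        store.setdefault(digest, blob)  # dedup: store each unique blob once
        return digest                   # the user's "file" is just this pointer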

Re:In the clear? SRSLY? (1)

AmiMoJo (196126) | about 3 months ago | (#47617663)

We don't know the nature of these "child porn" files. Might have been fully clothed "jailbait" that the user didn't think was actually illegal, but which the police take a dim view of, for example. These days the police will claim "child porn" in their press releases when it's really more like teens sexting or something.

Re:In the clear? SRSLY? (0)

Anonymous Coward | about 3 months ago | (#47617825)

That doesn't really matter. If an encrypted file has hashX, then anyone who knows that that file contains 'bad' content can leak the hash to authorities. Boom, any file with that hash is now flagged. If you share around communities of criminals, there's probably no honor among them so the trick is to encrypt + not share.

I need therapy (1)

akozakie (633875) | about 3 months ago | (#47617305)

Seriously, I must not be normal. This is clearly "for the children", there's really nothing morally disputable about this particular case. So, why can't I see it as progress? Why am I worried that it was automatically spotted?

I need to get my s... straight. Think of the children. Think of the children. The system is good for me. The good guys have nothing to worry about.

No you do not (3, Insightful)

thieh (3654731) | about 3 months ago | (#47617563)

This is what studying ethics/morality feels like. And this isn't exactly progress, unless you count "progressing to a police state". Many things in life are conflicts between various fields of interest, and it is up to the philosophers/activists/lawyers/judges/lobbyists/legislature to figure them out.

Re:No you do not (1)

akozakie (633875) | about 3 months ago | (#47617991)

Nah, not "to a police state". The difficult part of observing a slippery slope is the admission that "yup, we're mostly there". Otherwise you'll just lie to yourself until you're not just at the end of the slope, but anchored, settled, and playing solitaire with a full family there.

That's the problem with highly emotional subjects (like pedophilia). You need to consciously limit your emotional response to them, otherwise you will accept as "lesser evil" things that are really a bigger problem.

Scale is what matters. A single child saved is a huge accomplishment? Hell yes. But even a thousand saved children is still just a tiny fraction of the children out there, while the lost freedom affects everyone - including all those children people think they think about. It's frighteningly difficult to disconnect from an emotional response like that, but it is really the only way to limit the effects of manipulation.

It all boils down to one question - do you trust your government (all levels) to use such abilities wisely and responsibly? Because if you do, this really is good news. If you have any doubt, however...

And that's the funny thing. The US was literally built on distrust towards the government. That's the general spirit of your entire constitution and most of the amendments. And yet you seem to be one of the countries with the lowest backlash to things like that.

Moral outrage is a great thing. It's so emotional, so detached from reason, so easy to steer... And so universal. Works with Muslims, Catholics, hell, even atheists... It just needs a bit of tuning towards the target audience and the standards it accepts. And "think of the children" is gold - it works practically everywhere, just adjust the wording.

Really? (0)

Anonymous Coward | about 3 months ago | (#47617339)

Microsoft has openly discussed its use of image-processing software to detect suspected paedophiles in the past...

Really now. And a false positive fucks over a person for life. Justice? Please! Not when it comes to child porn, drugs, and terrorism. Getting accused of those things is enough to ruin you.

There have been HUMANS who [dallasobserver.com] have [sun-sentinel.com] fucked up [reason.com] interpreting what child pornography is.

It's a Goddamn modern day witch hunt!

My parents have pictures of me that would probably have sent them to jail if they had taken them today - you know, naked baby in bathtub, running around naked, etc...

Well, this just tells me that the "Cloud" is untrustworthy, regardless of who the vendor is - obviously they are snooping into the contents.

Re:Really? (1)

pixelpusher220 (529617) | about 3 months ago | (#47617447)

My parents have pictures of me that would probably have sent them to jail if they had taken them today

fuuuuuuck, mine were god damned MOVIES!

Who born in the late 80s if not early... (0)

Anonymous Coward | about 3 months ago | (#47618009)

Doesn't have some of those pictures of them?

We have pictures of my siblings, cousins, etc running around naked, in diapers, bathing together, etc.

It used to just be considered an immature act to run around naked. Now it's considered child pornography, sexual assault, etc.

Does anybody remember when the new sex offender stuff went into effect in the US? I remember there was a local State College student (maybe more than one) who did the usual immature 'Streak the Graduation' stunt and, as I remember, was being charged with a sex offense of some form. I believe they finally got off with just parole and a stern warning, but I'm pretty sure everybody after that was getting nailed to the full extent of the law.

When being naked in a non-sexual but perhaps immature manner can get you labelled a sex offender for life, you know the system is fucked beyond repair.

Re:Really? (2)

mythosaz (572040) | about 3 months ago | (#47617641)

Microsoft (or Google) getting a hit on a flagged image (or on image processing) means that they turn over the results of that hit to LEO.

If LEO works to arrest you based on that information, then you're subject to the justice system like any other suspected criminal.

You can argue that the justice system might have an axe to grind against pedos, and you're probably right, but they're still afforded due process.

Witch hunts describe looking for things that aren't there - you know, witches. Sick fucks with pictures of exploited children are very, very real.

Re:Really? (1)

jeIIomizer (3670945) | about 3 months ago | (#47617815)

There is a reason that the Red Scare is described as a witch hunt, so saying that the term "witch hunt" can only be used for imaginary things is simply false.

Re:Really? (1)

Torodung (31985) | about 3 months ago | (#47617999)

I always thought of "witch hunt" not as referring to the actual pursuit of those engaged in witchcraft (which is not imaginary, btw, only the idea that it works is imaginary, IMHO), but rather as referring to the drive to utterly crucify the subject of the hunt. BURN THEM, do not treat them as a human being, subject them to cruel and unusual punishment.

YMMV.

I was wondering... (2)

thieh (3654731) | about 3 months ago | (#47617355)

Why aren't these guys encrypting their stuff? I would imagine extra care is to be taken if they think what they are doing can be morally objectionable... And then it hit me that the NSA works like that too. Always blow on the morally objectionable stuff.

Re:I was wondering... (2)

pixelpusher220 (529617) | about 3 months ago | (#47617455)

because most criminals are stupid...and thank god they are. The authorities are inept enough on their own.

Why wouldn't you think they are scanning? (3, Insightful)

Sonny Yatsen (603655) | about 3 months ago | (#47617357)

I don't understand the surprise people are experiencing from the revelation that Google and Microsoft scan the stuff you upload to their cloud storage systems.

You are literally giving them a copy of your files, and generally speaking, you also agreed to allow them to scan your stuff. Google Drive's terms of service explicitly state that your stuff will be scanned:

"Our automated systems analyze your content (including emails) to provide you personally relevant product features, such as customized search results, tailored advertising, and spam and malware detection. This analysis occurs as the content is sent, received, and when it is stored. "

Why would anyone reasonably think that their stuff is somehow private when it's in the cloud?

Re:Why wouldn't you think they are scanning? (1)

michaelmalak (91262) | about 3 months ago | (#47617427)

You are correct that automated scanning combined with reporting to the government is to be expected in today's political climate. However, you would be incorrect if you asserted that the founding fathers expected the asymmetry where the populace could not similarly examine Lois Lerner's e-mails.

Re:Why wouldn't you think they are scanning? (3, Insightful)

thieh (3654731) | about 3 months ago | (#47617441)

The problem usually comes down to the fact that "days in court" and "jail time" were never listed in the catalog of "personally relevant product features, such as customized search results, tailored advertising, and spam and malware detection".

Re:Why wouldn't you think they are scanning? (1)

Sonny Yatsen (603655) | about 3 months ago | (#47617579)

Yes, but the ToS also generally state that you won't misuse their services. For instance, Google Drive's ToS states:

"You may use our Services only as permitted by law, including applicable export and re-export control laws and regulations."

Using Google Drive for child porn obviously violates this clause of the ToS, and once that happens, you are at the mercy of the Cloud provider on the basis of you having agreed to the terms of the Terms of Service.

Re:Why wouldn't you think they are scanning? (1)

thieh (3654731) | about 3 months ago | (#47618049)

At the point you described, it becomes a question of what gives companies the authority to enforce laws. They didn't say they would scan for law enforcement purposes, and laws vary by place. So if they operate in a place that restricts free speech and start scanning, does that imply they should report everyone who violates the speech restrictions?

Re:Why wouldn't you think they are scanning? (5, Interesting)

RobinH (124750) | about 3 months ago | (#47617507)

My significant other deals with teenagers all the time in schools, and it's amazing how many of them get irate when parents/teachers/police start to question them about stuff they posted on Facebook. The content usually comes to light because one of their "friends" has shown the authorities the content, or in some cases the teen actually friends the teacher/police officer. Their typical response is, "that's my private Facebook page!"

Re:Why wouldn't you think they are scanning? (0)

Anonymous Coward | about 3 months ago | (#47617877)

I think I understand what you were going for, but to be fair.. nothing you said really implies that the teen is upset that the parent/teacher/cop knows the information. Only that they don't like being asked about it.

There are many subjects that I myself generally have no problem with sharing to the world at large, but try talking with me about it, and I'll just dismiss you, because I don't want to talk to you personally about it. I don't care that you know I got drunk and went wild last night, just.. please don't talk to me about it face-to-face.

Re:Why wouldn't you think they are scanning? (1)

nine-times (778537) | about 3 months ago | (#47617601)

Of course they're scanning it. I would have assumed that they're scanning it for viruses/malware, for the sake of deduplication, and to provide indexing so that I can search it. It's been very public that Google also scans your email in order to serve ads, with the assurance (whether it comforts you or not) that this is all done by machines and Google employees don't see your email.

However, searching email for the sake of reporting illegal activity to law enforcement is a bit concerning. It seems easy to say, "Well yes, but they're only looking for child pornography," and who wants to defend child pornographers? But it seems valid to me to worry, in this case, about slippery slopes. After all, who would defend terrorists? Who would defend murderers? Who would defend drug dealers? It wouldn't take much of a leap to expect that Google and Microsoft would scan your storage and email for other illegal activities as well.

To me, the only thing here that makes the slope a lot less slippery is that they're reportedly doing purely automated scans, comparing against a database of illegal images, as opposed to open-ended heuristics attempting to detect anything suspicious. Still, I don't find that completely satisfying. I could imagine China asking Google to report anyone storing/sending images of Tiananmen Square.

Re:Why wouldn't you think they are scanning? (1)

turp182 (1020263) | about 3 months ago | (#47617767)

And in turn the cloud services are storing very illegal images. It's just due diligence if you ask me.

I wonder how much staff they have to review this sort of thing (it would be a terrible job if you ask me, like watching the toilets in Southland Tales - which was awesome when combined with the comic book).

Re:Why wouldn't you think they are scanning? (1)

houghi (78078) | about 3 months ago | (#47617785)

Why would anyone reasonably think that their stuff is somehow private when it's in the cloud?

We live in a time where you can ask: why would anyone reasonably think that their stuff is somehow private?

What next? (1)

Anonymous Coward | about 3 months ago | (#47617359)

What other illegal activity will they be watching for?

Drip. Drip. Drip. (2)

runeghost (2509522) | about 3 months ago | (#47617375)

This is the sound of the panoptic, dystopian police state coming. Good luck everyone!

Hold on to your DVD backups (4, Insightful)

iamacat (583406) | about 3 months ago | (#47617389)

We all know what kind files they will scan for next. Because MPAA/RIAA are way more important than children!

Re:Hold on to your DVD backups (1)

mi (197448) | about 3 months ago | (#47617543)

Because MPAA/RIAA are way more important than children!

A remarkably stupid statement. And with an exclamation mark too!

Google and others are doing this for two reasons

  1. Genuine and sincere disapproval of child pornography, which remains one of the very few things, that are still considered wrong by (almost) everybody;
  2. Fear of bad publicity, which would surely ensue, when a CP-ring is discovered by other means later and the mail-providers get asked the uncomfortable questions over why they've tolerated it despite having the technology to do exactly, what they are doing now.

Neither consideration is applicable to the plight of content-creators. Unfortunately...

Re:Hold on to your DVD backups (0)

Anonymous Coward | about 3 months ago | (#47618051)

And Slashdot should report you to the grammar police for such criminally bad comma usage.

Glad to hear it (0)

Greg Heller (3031971) | about 3 months ago | (#47617409)

I hope they catch every person involved with exploiting children and string them up by their Buster Browns.

Re:Glad to hear it (0)

Anonymous Coward | about 3 months ago | (#47617495)

Typical "for the children" idiot. Out for blood, are we? Are you happy with government censorship?

Rather than going after people who receive or share images, why not go after the rapists?

Re:Glad to hear it (0)

Anonymous Coward | about 3 months ago | (#47617523)

These people [independent.co.uk] are also "involved". I suppose you support those convictions too?

Re:Glad to hear it (1)

jeIIomizer (3670945) | about 3 months ago | (#47617605)

Possessing images that authorities subjectively deem "disgusting" is illegal there? The US has "I know it when I see it." The fact that such censorship (based on opinions about subjective matters, no less) is allowed in any 'free' countries shows that free countries don't truly exist.

Re:Glad to hear it (1)

mrspoonsi (2955715) | about 3 months ago | (#47618031)

From the article: "possessing an extreme pornographic image likely to cause injury" - this could apply to many BDSM images. Seems odd, given that taking part in a Sunday sporting activity might 'likely cause an injury' - mountain biking, football, etc. - yet I do not see the police locking up those dare-devils.

Microsoft's child porn collection (5, Funny)

BradMajors (995624) | about 3 months ago | (#47617417)

In order to successfully perform these matches, Microsoft likely has one of the world's largest collection of child porn.

Re:Microsoft's child porn collection (0)

Anonymous Coward | about 3 months ago | (#47617477)

In order to successfully perform these matches, Microsoft likely has one of the world's largest collection of child porn.

Nope. Just the hashes.

Re:Microsoft's child porn collection (0)

Anonymous Coward | about 3 months ago | (#47617749)

Nope. Just the hashes.

Which is all well and good. From what I hear, people who actually have to look at the images to verify them end up having psych problems. When the agencies are doing it right, I think they rotate those agents through counseling on a regular basis. As soon as the image is recognized, hash it so nobody else has to look at it again; store the original bits, and if the computer does a bit-for-bit match on the image, that should be evidence enough without anybody having to look at it again.

Re:Microsoft's child porn collection (0)

Anonymous Coward | about 3 months ago | (#47617503)

More likely a large collection of hashes.

Re:Microsoft's child porn collection (1)

Anonymous Coward | about 3 months ago | (#47617595)

More likely a large collection of hashes.

so much drugs... send the cops to seize all that hash!

Re:Microsoft's child porn collection (1)

tlhIngan (30335) | about 3 months ago | (#47617577)

In order to successfully perform these matches, Microsoft likely has one of the world's largest collection of child porn.

Actually, no.

They get a big list of file hashes from the National Center for Exploited Children or something, and it's implemented as part of the file scan. All that happens is they check file hashes and if it matches, then they do more in-depth analysis (is it an image file? etc).

Which raises the question of general stupidity, since hashes are so trivially easy to change and it's extremely easy to obfuscate (just zip it up with a password).

People are lazy. Even ones who really know that what they do isn't really appreciated by the general population and really ought to try to cover their tracks... and don't.

Re:Microsoft's child porn collection (3, Informative)

godel_56 (1287256) | about 3 months ago | (#47617849)

In order to successfully perform these matches, Microsoft likely has one of the world's largest collection of child porn.

Actually, no.

They get a big list of file hashes from the National Center for Exploited Children or something, and it's implemented as part of the file scan. All that happens is they check file hashes and if it matches, then they do more in-depth analysis (is it an image file? etc).

Which raises the question of general stupidity, since hashes are so trivially easy to change and it's extremely easy to obfuscate (just zip it up with a password).

People are lazy. Even ones who really know that what they do isn't really appreciated by the general population and really ought to try to cover their tracks... and don't.

Nope, from TFA: they process the image to derive a signature which can survive things like resizing, changing resolution, etc. It's not just a simple hash.
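Microsoft's system is reportedly PhotoDNA, whose internals aren't public, but the idea of a resize-tolerant signature can be illustrated with the much simpler "average hash": shrink to 8x8 grayscale, threshold at the mean, and compare signatures by Hamming distance. A sketch using Pillow, to show the principle only:

    from PIL import Image

    def average_hash(path: str) -> int:
        img = Image.open(path).convert("L").resize((8, 8))  # tiny grayscale
        pixels = list(img.getdata())                        # 64 brightness values
        mean = sum(pixels) / 64.0
        bits = 0
        for px in pixels:
            bits = (bits << 1) | (px > mean)                # one bit per pixel
        return bits                                         # 64-bit signature

    def distance(a: int, b: int) -> int:
        return bin(a ^ b).count("1")  # small distance => likely the same image

A rescaled or recompressed copy usually lands within a few bits of the original, which is why the trivial evasions that defeat md5/sha1 matching don't defeat this kind of signature.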

Re:Microsoft's child porn collection (0)

Anonymous Coward | about 3 months ago | (#47617861)

So the NCEC has a large database of child porn?

If Hashes are what they are scanning... (1)

thieh (3654731) | about 3 months ago | (#47617957)

I would imagine a potential way of exploiting it would be to randomly modify one of the bits in one or more pixels of the image, and make enough copies of them that we are in hash-collision territory. Especially if you are passing these things through email as opposed to P2P, but I think it is doable in P2P as well.
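Minor correction: against a plain cryptographic hash you don't need collisions at all - flipping a single bit produces a completely different digest (the avalanche effect), so the altered copy simply never matches the list. A quick illustration:

    import hashlib

    data = bytearray(b"...stand-in for image bytes...")
    h1 = hashlib.sha256(data).hexdigest()

    data[0] ^= 0x01                    # flip one bit anywhere in the file
    h2 = hashlib.sha256(data).hexdigest()

    print(h1 == h2)                    # False: roughly half the digest bits change

That is exactly the weakness the perceptual signatures discussed above are meant to close.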

Re:Microsoft's child porn collection (1)

DrXym (126579) | about 3 months ago | (#47617611)

Not necessarily. The FBI could have supplied Google & Microsoft with a long list of md5/sha1 hashcodes for abuse images which they obtained in raids or forums, and these providers have programmed their systems to raise a flag whenever they get a hit. Then a human might go in to confirm the match, and from there it's just a matter of informing the police. It may well be there are other ways of "fingerprinting" an image that are more resilient than a hash code and still useful enough for matching pictures against a known set of bad ones.

Perhaps it will come out in the trial how the file was identified.

Anyway it's more proof (if any were needed) why it's an incredibly bad idea to use a cloud service to store anything illegal. At least encrypt the data. Better yet don't put it up there at all.

Re:Microsoft's child porn collection (0)

Anonymous Coward | about 3 months ago | (#47617805)

And even better yet, don't download, store and watch CP. It's that simple.

Re:Microsoft's child porn collection (0)

Anonymous Coward | about 3 months ago | (#47617613)

Well, someone at least has an enormous stash of "digital signatures"; typically some quango or other whose database gets used by multiple large parties. Also typically without much oversight at all as to what goes into the database, because nobody in their right mind wants to look at such images.

And that is the brilliant part. Maybe, soonish, the mere mention of a match will be enough for a conviction. Because "proven technology" and all that, don't you know. The pitch is so seductive. "Just trust us, citizen, we got this nasty icky problem licked like it was a spam message. We'll catch it, no sweat!"

And what's to stop *cough* certain agencies *cough* from slipping in "digital signatures" of state secrets? Now wouldn't that be a neat trick? Think about it: suppose they succeed in adding signatures. Then the "next Snowden" will be nothing of the sort, but will instead be quietly put away for life in maximum security, where the inmates do not look kindly on convicted child molesters.

Far-fetched? Eh, tell it to the NSA.

Re:Microsoft's child porn collection (0)

Anonymous Coward | about 3 months ago | (#47617723)

In order to successfully perform these matches, Microsoft likely has one of the world's largest collection of child porn.

That would be funny, if only by coincidence. I'm sure they are just matching up media hashes. Not sure how I feel about that. While I certainly don't condone trading or wanting media of non-consenting individuals (adults and kids) for sexual use, many images do not fall into that category for those taking them, but may be interpreted by others as such. I cannot say I approve of searching leased-out virtual "mailboxes" or "deposit boxes", but I cannot say I feel bad when those who do store such things are found by accident or laziness and punished.

How about this stance: if an individual stores the media and it never leaves the connection between their client and their server space, it shouldn't be searched, but if it gets transmitted to a third party, then I don't mind a hash comparison, which can also be used for cache purposes. So you can upload those childhood photos, but you cannot send them to another, at least as-is. I also hold the stance that a person cannot be charged for child porn of themselves if they transmit it. That is going to be a case, if it already wasn't. I think there might have been one about a 16-year-old girl sending her older boyfriend an image.

Phone company. (0)

Anonymous Coward | about 3 months ago | (#47617421)

This would be like if the phone company policed their network for illegal activity. It is creepy.

Re:Phone company. (1)

Russ1642 (1087959) | about 3 months ago | (#47617573)

This would be like a self-storage site that had a drug sniffing dog.

Re:Phone company. (0)

Anonymous Coward | about 3 months ago | (#47617913)

http://www.nbcbayarea.com/news/local/San-Jose-Police-Officer-Accused-of-Stashing-Marijuana-in-Storage-Unit-261875331.html

Re:Phone company. (1)

thieh (3654731) | about 3 months ago | (#47617995)

If you are a rich cartel that needs such spaces: before you use the service, sneak in, get a lot of cheap drugs, and powder-spray them all over the place so that the next time the dog goes by they have to search everything. Do this every day for a few weeks and people will stop responding to these things.

I could make a fortune (4, Funny)

GrumpySteen (1250194) | about 3 months ago | (#47617451)

If only I had a large enough collection of tinfoil hats to sell to all the posters freaking out over this.

Re:I could make a fortune (1)

Holammer (1217422) | about 3 months ago | (#47617681)

It only takes a minute to make your own. Besides, the ones you're selling are probably bogus ones developed by the NSA with backdoors for mind reading.

Re:I could make a fortune (0)

Anonymous Coward | about 3 months ago | (#47617775)

Selling 'revised' history books where all instances of corruption have been removed is also quite profitable, I hear. Can't have people being justifiably wary of authority figures, or getting the idea that authority figures aren't all perfect little angels who are immune from corruption and making mistakes.

Have you considered that some people just like having privacy and don't like having their communications read, automatically or not, just to keep people 'safe' from bogeymen? Of course, that's partly why you shouldn't use "the cloud" to begin with.

Why do people even use this garbage? (2)

jeIIomizer (3670945) | about 3 months ago | (#47617465)

This cloud crap is just trash. At least use encryption (not theirs) or something.

Plus, they caught someone with images that shouldn't be illegal to have to begin with. When is an actual rapist going to be arrested?

Re:Why do people even use this garbage? (0)

jratcliffe (208809) | about 3 months ago | (#47617585)

This cloud crap is just trash. At least use encryption (not theirs) or something.

Plus, they caught someone with images that shouldn't be illegal to have to begin with. When is an actual rapist going to be arrested?

Yeah, why are they picking on the poor fences, they're just receiving stolen property!

Re:Why do people even use this garbage? (2)

jeIIomizer (3670945) | about 3 months ago | (#47617691)

Does not compute. How is having copies of something in any way, shape, or form similar to receiving stolen goods? Someone loses their goods when they're stolen; that's the point.

This is just government censorship, and pointless government censorship at that; that's something everyone should oppose. None of your useless analogies will convince me otherwise.

Re:Why do people even use this garbage? (0, Troll)

jratcliffe (208809) | about 3 months ago | (#47617977)

1. Does not compute because you clearly missed the point. We punish people for receiving stolen goods, because doing so reduces the demand for stolen goods, and reduces the appeal of stealing them in the first place. We punish people for possessing and distributing child pornography because doing so reduces the demand for the creation of child pornography and the sexual abuse that entails.

2. Yes, it's government censorship. Yes, the right to free speech isn't absolute - incitement to riot, libel, etc.

3. Since nothing will convince you otherwise, there's no point in continuing this discussion - your beliefs are an article of faith, like those of a Young Earth Creationist.

Re:Why do people even use this garbage? (1)

jeIIomizer (3670945) | about 3 months ago | (#47618043)

We punish people for receiving stolen goods, because doing so reduces the demand for stolen goods, and reduces the appeal of stealing them in the first place.

So that was your point? I care about it because the stolen goods are actually someone else's property. People's actions are their own; if you steal something, that's your fault. If you rape someone, that's your fault. It doesn't matter if you think someone else wanted you to.

Also, I value fundamental freedoms over safety, so even if I bought into your point, I would still oppose you.

2. Yes, it's government censorship. Yes, the right to free speech isn't absolute - incitement to riot, libel, etc.

It should be. I oppose all government censorship.

3. Since nothing will convince you otherwise, there's no point in continuing this discussion - your beliefs are an article of faith, like those of a Young Earth Creationist.

So you're saying that all of your beliefs could be changed through a mere discussion? Could I convince you that 1 + 1 = 3? To expect me to change my fundamental belief system (that free speech should reign above all) is simply unrealistic, and it has nothing to do with creationism.

Re:Why do people even use this garbage? (1)

Russ1642 (1087959) | about 3 months ago | (#47617593)

Just like with drugs they go after the users, then they turn on the dealers, and then they turn on the producers. Gotta work your way up.

Re:Why do people even use this garbage? (1)

jeIIomizer (3670945) | about 3 months ago | (#47617699)

Just like with drugs they go after the users

That works so well!

Gotta work your way up.

Going after people looking at/sharing images is morally wrong, so it isn't even a viable option. And that's a non sequitur, anyway.

What is the law? (0)

Anonymous Coward | about 3 months ago | (#47617481)

If Microsoft is monitoring cloud storage for illegal activity, can they be sued as a conspirator if they do not report the activity?

Calm down (1)

David C Billen (3753981) | about 3 months ago | (#47617575)

You people will get your internet privacy as soon as that last pedophile racist bomb-plotting terrorist is rotting in prison.

Implying they cannot blackmail anyone (0)

Anonymous Coward | about 3 months ago | (#47617683)

And officials believe these notorious liars and criminals at Microsoft...

Obligatory Quote (0)

Anonymous Coward | about 3 months ago | (#47617719)

"There was of course no way of knowing whether you were being watched at any given moment. How often, or on what system, the Thought Police plugged in on any individual wire was guesswork. It was even conceivable that they watched everybody all the time. But at any rate they could plug in your wire whenever they wanted to. You had to live—did live, from habit that became instinct—in the assumption that every sound you made was overheard, and, except in darkness, every movement scrutinized."

you f41l it (-1)

Anonymous Coward | about 3 months ago | (#47617739)

though I have never The failure of operating systems, llok at your soft, market. Therefbore,

Hash collision in 3 2 1 ... (2)

wiredlogic (135348) | about 3 months ago | (#47617771)

One of these days a hash collision will happen on an innocuous file and the jackboots will ruin someone's life over it.
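For a cryptographic hash like SHA-256 that fear is quantifiable via the birthday bound, p ~ n^2 / 2^257 for n files, and it is effectively zero; the realistic false-positive risk lives in the fuzzy perceptual matching, which is tuned rather than proven. A back-of-envelope check (the file count is a made-up assumption):

    # Birthday bound: chance of ANY accidental collision among n hashed files
    n = 10**15                  # assume a quadrillion files scanned (generous)
    p = n * n / 2**257          # p ~ n^2 / 2^(d+1) for a d-bit hash, d = 256
    print(p)                    # ~4.3e-48: not where the jackboots come from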

Vague (0)

Anonymous Coward | about 3 months ago | (#47617887)

discovering that an image involving a young girl

I love the overreaction from everybody in this thread.

The image could be anything. Let's not forget how broad these laws are...

Oh wow, the commenters in here... (5, Insightful)

MindPrison (864299) | about 3 months ago | (#47617899)

...was actually much more interesting to read than the actual news. Where to start... let's see now:

- We have a member here who thinks Pedophilia is a disease and thinks Pedophilia equals abusing children:
He/she is one of the numerous clueless people out there who have NO idea whether this is actually a disease or just like Homosexuality. Arguing with such a person is completely futile, but they'll always be in numbers. It's kind of like voting for stupid. (Yes, that was a H2G2 reference.)
- We also have several members here who think Pedophiles should be arrested and behind bars just for being Pedophiles, never mind if they committed any crimes.
- We've got the usual anonymous coward zealots who think that if you don't have anything to hide, there is nothing to worry about.
Wanna bet who's next on tomorrow's "sick" list? It can't possibly be you, can it?
- We've got the next predictable bunch who immediately attack someone who defends the freedom of the individual, and call them Pedophiles, because they can't POSSIBLY be normal or straight if they defend Pedophiles, now can they?
(Who exactly defended whom now?) Never mind the actual facts, just as long as you get YOUR hidden agenda across.
- And then we have those who think that images of kids being exploited are okay, just as long as you bust the perps behind the images, and not the users.
(And who are the users now again? Sick Pedophiles, or nasty voyeuristic perverts who want to get a kick out of something unthinkable and illegal?) And where do we draw the line? Naked kids? Kids posing sexually - and how do you define that? Family photos available to all? Imagine the number of YouTube and ImageShack users you'd have to arrest, or at least suspect. Who do you trust today?

I'll let you in on a little secret of mine: for years I've been working undercover together with a police agent, a close friend of mine, to uncover several secret child-abuse rings in various countries - trust me when I say... this is the WORST JOB IN THE WORLD. I got into it because some family members of mine were abused, and I thought I'd use my skills for something good. Over time I learned that although we DID get a lot of these rings busted, we also ruined several families' lives and destroyed childhoods, because the law and common sense don't mix at all.

Everyone sees red when it comes to Child Abuse, and rightly so - but it is important... no... VITAL for progress that we somewhat keep our heads above water here and try to think rationally. It is NOT rational to point a finger at everyone who wants anonymity as a suspect of anything, it is NOT rational to call every Pedophile a CHILD ABUSER, and it is NOT rational to think that anyone whose opinion differs from the stupid masses must be in LEAGUE with whoever doesn't fit your OPINION today (e.g. those who want to HELP PEDOPHILES are NOT necessarily Pedophiles themselves, but a lot of the angry mob, especially in here, seem to think so).

I get upset by this, because I think of Mr. Alan Turing, who was just recently pardoned by the British for the grave injustice brought upon him just for having a sexual preference he might not even have ANY control over (we're not talking urges and restraint here, we're talking sexual PREFERENCES).

I do NOT want a society that becomes totalitarian, where every deviant of nature becomes a freak to be hung, burned, and ridiculed just for being different. I see YOUR mind as a private thing, just like your diary. What you THINK of or FANTASIZE about is YOUR BUSINESS ONLY, and NO ONE ELSE'S.
And there is nothing that gets me fired up more than someone using child abuse in every shape and form, fantasy or drawn, real or not, to excuse severe abuse of human rights and to pry into our daily lives with the law in hand... with a lot of supporters who mean well... but really have NO CLUE of the REAL danger they're putting themselves in by supporting this ludicrous development.

Wake up and smell the coffee, people!

Nude kids (1)

reikae (80981) | about 3 months ago | (#47617907)

What about pictures of one's baby or young kid nude - is it illegal to send such images to the kid's grandparents, for example? It can be very hard to tell from just a picture whether abuse took place or not. Sometimes it's clear, and I hope the victims get all the help they need. But in other cases, would these automated systems mark the images as "child abuse" and get the parents in trouble? The topic is so heated that even slight suspicions can lead to big problems.

What could possibly go wrong? (0)

Anonymous Coward | about 3 months ago | (#47617975)

This doesn't lend itself to abuse at all! Corporations have proven themselves so trustworthy after all.
