Google Aims To Cull Child Porn By Algorithm, Not Human Review

timothy posted about a year ago | from the safer-search dept.

AI 306

According to a story at VentureBeat, "Google is working on a new database of flagged images of child porn and abuse that can be shared with other search engines and child protection organizations. The database will help create systems that automatically eliminate that sort of content. ... If the database is used effectively, any flagged image in the database would not be searchable through participating search engines or web hosting providers. And maybe best of all, computers will automatically flag and remove these images without any human needing to see them." Here's the announcement.

306 comments

Google has the worlds largest cp collection (-1, Flamebait)

biy55 (2951825) | about a year ago | (#44027655)

With Google Images search, Google practically has the largest CP image collection on earth. Now, Google needs to see some of the images to remove CP. Does that make them guilty? Yes, I think it does.

Re:Google has the worlds largest cp collection (0)

Anonymous Coward | about a year ago | (#44027667)

So by that logic, police officers investigating possible CP crime are also guilty? I think not.

Re:Google has the worlds largest cp collection (2)

Trepidity (597) | about a year ago | (#44027801)

Police are legally allowed to possess contraband in the course of an investigation; private-sector entities aren't, absent some exception in the law permitting them to. For example, you can't keep a large collection of drugs for research purposes (e.g. training drug-detecting sensors) unless you apply for special permits.

What is the point of this? (5, Insightful)

Electricity Likes Me (1098643) | about a year ago | (#44027683)

What is the point of automatically removing child porn so it's not searchable?

The problem with child porn is real children are being really abused to make it.

Making it "not searchable" doesn't stop that. Arresting the people who are making it does.

Re:What is the point of this? (4, Insightful)

TheBlackMan (1458563) | about a year ago | (#44027701)

Exactly.

Also, I have been browsing the net for at least 12 years and I have NEVER found child porn by accident or otherwise. I am thinking that child porn can be found only on the "dark internet".

So that makes one wonder what Google's real motives are.

Re:What is the point of this? (5, Insightful)

ebno-10db (1459097) | about a year ago | (#44027935)

Realistically this is just a feel good effort. No one is going to seriously criticize Google for this, and they can say "we're doing our part". Not that their part really helps anything, but that's not Google's fault.

So that makes one wonder what Google's real motives are.

Good PR. I'm as cynical as the next person, but PR is often the only motive for these things. If they had a sinister motive, they'd just offer to help the NSA some more.

Re:What is the point of this? (1)

egcagrac0 (1410377) | about a year ago | (#44028021)

Ding, have a cookie.

"Think of the children!" is the perfect answer to "Hey, why are you handing all the data to the government investigators?"

Re:What is the point of this? (1)

ebno-10db (1459097) | about a year ago | (#44028127)

"Think of the children!" is the perfect answer to "Hey, why are you handing all the data to the government investigators?"

Why do they need to answer that question? You're assuming the proles have any power. Besides, in 21st century America, "terrorism" trumps even "think of the children".

Re:What is the point of this? (3, Informative)

cdrudge (68377) | about a year ago | (#44027937)

Also, I have been browsing the net for at least 12 years and I have NEVER found child porn by accident or otherwise. I am thinking that child porn can be found only on the "dark internet".

Unfortunately it is out there. In a previous life as an intern I received a computer from a retail store that needed "fixed" as the store manager put it. Figuring it had some malware on it I booted it up to see what the damage was. Almost as soon as the computer was started numerous browser sessions autostarted with some of the most vile websites you wouldn't want to imagine. It wasn't a picture or two of some amateur girlfriend that might have been a little too young. They had the appearance of professionally designed and maintained websites just like any other porn website, but just happened to have kids 13- instead of 18+. I just turned off the computer, went to my boss, explained briefly what I found and said I wasn't dealing with it.

That was 13+ years ago. I'm sure things have changed some since then, but I'm also not naive to think that child porn is just on the "dark internet" whatever that is.

Re:What is the point of this? (0)

Anonymous Coward | about a year ago | (#44028111)

The situation has improved insofar as the police have learned how to use a web spider, and any obvious websites are either honeypots or quickly raided. It gets difficult once the pages are hosted in some odd third-world countries, but honestly, few countries won't assist in getting the pages taken down. But as with warez, it was just forced further underground, onto freenet or Tor onion sites. If you hang out in the right circles, links or keywords get passed around. (Following them is not wise... they may be honeypots.) The problem with modern obfuscation / cryptography is that it gets hard to figure out who uploaded / hosts / created the material, and that makes police work hard.

Re:What is the point of this? (2)

ebno-10db (1459097) | about a year ago | (#44028331)

dark internet

FWIW, according to Wikipedia the term you're looking for is 'darknet' or 'deep web' [wikipedia.org] . I love clear terminology.

Re:What is the point of this? (1)

aevan (903814) | about a year ago | (#44028013)

I have. It was a lot more common in the 90s (you'd see it as banner ads on download sites for cracked software - and I'm talking porn, not just naked children like some Russian nudism thing), but it's still out there. Wouldn't call those sites 'dark internet' either, as an AltaVista search could pull them up.

More recently, remember that girl who did the 'my life sucks' video and then committed suicide? Uncensored autopsy pics got pulled up by Google - CP according to some definitions.

Still, totally agree with the grandparent post - stopping the search aspect is like arresting drug buyers on the street rather than going for the dealers or growers. It's a feel-good PR move, but it really isn't accomplishing anything. Suppose though, as a private entity, it's the limit of what Google *CAN* do - with all the real stuff being 'non-searchable dark internet' to begin with :P

Though..an algorithm. Maa..won't someone think of the flat loliesque porn stars? First Australia now Google... :D

Re: What is the point of this? (0)

Anonymous Coward | about a year ago | (#44028167)

I stumbled across some the other day and I wasn't using tor. Just because you haven't found it doesn't mean it isn't there.

Re:What is the point of this? (3, Informative)

Ardyvee (2447206) | about a year ago | (#44027709)

The summary is a bit incomplete. I suppose that if the algorithm finds something, it will warm law enforcement.

FTFA: "This will enable companies, law enforcement and charities to better collaborate on detecting and removing these images, and to take action against the criminals." "We can do a lot to ensure it’s not available online—and that when people try to share this disgusting content they are caught and prosecuted. "

Re:What is the point of this? (5, Funny)

Errol backfiring (1280012) | about a year ago | (#44027797)

if the algorithm finds something, it will warm law enforcement.

I may not always agree with law enforcement, but I do not think they are THAT corrupt.

Re:What is the point of this? (2)

Ardyvee (2447206) | about a year ago | (#44027833)

Woops, I meant warn, I MEANT WARN!

Re:What is the point of this? (1)

Anonymous Coward | about a year ago | (#44027717)

Eliminating demand will certainly decrease supply.

Re:What is the point of this? (0)

Anonymous Coward | about a year ago | (#44027735)

Making it more difficult to find may just be one portion of the strategy - no doubt the location of the images is reported to the relevant authorities, and then it's their job to take up the issue. Perhaps reducing access to the material will reduce the ability for people that search for it to find it, which may reduce the number of cases where the activity escalates to direct abuse. Maybe it'll increase the number of abuse cases as they're unable to relieve their desires and turn to local sources.

It's not like it's the easiest thing to combat, with darknets making it difficult to locate sources. Google is trying something, at least, so that should probably score them some political points. At the least, this approach reduces the number of workers they have who require serious therapy after viewing those images for manual filtering purposes; false positives could be a problem, but they're probably aware of that.

Re:What is the point of this? (2, Interesting)

rioki (1328185) | about a year ago | (#44028245)

Making it more difficult to find may just be one portion of the strategy - no doubt the location of the images is reported to the relevant authorities, and then it's their job to take up the issue. Perhaps reducing access to the material will reduce the ability for people that search for it to find it, which may reduce the number of cases where the activity escalates to direct abuse. Maybe it'll increase the number of abuse cases as they're unable to relieve their desires and turn to local sources.

Although I think that people abusing children should be outright shot, I have trouble following the logic of the above statement. There used to be an under-the-counter market for such material and big bucks could be made with it. (The internet basically killed that market, hopefully.) There was a real economic incentive to produce material, and demand encouraged production. But now, thanks to police work, there is little to no commercial trade in the material. Because the material is such a hot potato, people searching for and distributing it are forced to use strong anonymisation, and thus no economic transaction can occur. I think most CP created nowadays is distributed along the same lines as people uploading their private videos to porn sharing sites.

The service by Google is very useful for preventing services from hosting the stuff, since hosting it, even for a very short period of time, is always bad PR. What Google is building is basically PR, for Google and everybody who uses it.

Re:What is the point of this? (2, Interesting)

Anonymous Coward | about a year ago | (#44027757)

The problem with arresting the people who are making child porn is that it provides the government no excuse to monitor all internet traffic of innocent citizens.

Re:What is the point of this? (1)

Anonymous Coward | about a year ago | (#44027767)

Raising the level of effort to access child porn at all may prevent a certain amount of new customers for these networks.

At the same time, automatic detection could also be used to help law enforcement track down new content.

Re:What is the point of this? (5, Insightful)

ebno-10db (1459097) | about a year ago | (#44027779)

True, which means this isn't a solution. It is about as much as a search engine can do though, so it's to Google's credit.

Re:What is the point of this? (0, Interesting)

Anonymous Coward | about a year ago | (#44028101)

So Google are attempting to create a system to automatically disappear content from the Internet with no human supervision; not only that, but it also autonomously informs law enforcement of "child porn" found on these sites, and you think that's to their credit?

I'd hate to see what you think is a step too far.

Re:What is the point of this? (0)

Anonymous Coward | about a year ago | (#44027785)

The point of this is that Claire-Fucking-Perry won't shut the hell up and go away in the UK, despite the fact that her plans for porn firewalls have been defeated multiple times now.

Somehow she's finally bored David Cameron so close to death that he's agreed to haul all the major internet companies in the UK together today to give him something that he can shut her the fuck up with.

This is what Google is offering to shut her the fuck up with.

It won't work of course, because single-issue attention-whore politicians don't care about things like democracy; they just keep wittering on until they get their own way in the face of the will of the other 649 representatives she sits with and their constituents, or until they get kicked out at election time because even their own constituents want them to shut the fuck up. Here's hoping for the latter, that or a fatal car crash. Either is fine.

RTFA; it involves more than that. (2)

sirwired (27582) | about a year ago | (#44027945)

You are absolutely correct that this won't make child porn disappear. But from Google's standpoint, it will help keep their top-notch search engine (and other search engines) from being used to find it. In addition, it's more than making it "not searchable"; RTFA. This will also have "hooks" into law enforcement and ISPs.

Re:What is the point of this? (1)

Joce640k (829181) | about a year ago | (#44027975)

Making it "not searchable" doesn't stop that. Arresting the people who are making it does.

Nope, it makes it more valuable to the people who distribute it - no more pesky freeloaders!

(just like drugs, etc.)

Re:What is the point of this? (2, Insightful)

gweihir (88907) | about a year ago | (#44028033)

You are looking at this from the wrong angle. It is extremely likely that this is not about CP at all, but that Google wanted an effective image censorship system. They realized that this might be hard to sell to the public, so they settled on CP as an easy justification. They can even use it for that alone for the first few months (which will be almost invisible, as there cannot be a lot of CP accessible via Google, if there is anything at all...), then they can start to put in other pictures that are "undesirable", like pictures of political protests, police brutality, etc. And if anybody protests, they can just report them for searching for CP. When the life of the one protesting has been ruined, they can just blame it on a "technical problem".

Quite ingenious, if utterly evil.

Re:What is the point of this? (1)

Xest (935314) | about a year ago | (#44028209)

Yes, that's right: Google, the firm that publicises all DMCA requests it receives and flags up when it's been forced to censor search results by linking to the request on relevant searches; Google, that publishes the source code for many of its products; Google, that produces a regular transparency report stating as much as it can about what data it's been requested to hand over, what it's been requested to censor and so forth, has just arbitrarily decided one day that it wants to censor images. We don't know what images it'll censor after child porn or why (you didn't explain that in your crackpot conspiracy theory), but when it comes to images, they've just out and out decided to go against everything they've ever stood for.

Care to explain why they'd want to arbitrarily censor images? Care to explain what images you think they're desperate to censor? Care to explain why they'd be so keen to put themselves at a commercial disadvantage against existing and future competitors?

Alternatively, rather than mash your mind trying to answer those questions which you've obviously not padded out to make your conspiracy theory at least make some semblance of sense, care to simply consider that maybe you're just trolling Google and are full of shit?

They're doing this because doing something voluntarily in a half-arsed manner is easier than having it forced on you in a brutal, draconian and difficult-to-implement manner by government, which is the current alternative in the UK. They also just gave £1 million to the IWF, which is the UK's child porn censor. It's entirely about appeasing the politicians who are on yet another "think of the children" mission right now.

Re:What is the point of this? (0)

Anonymous Coward | about a year ago | (#44028125)

Um. It's called supply and demand.
If you eliminate the demand by making it unsearchable, then there is no point in producing it anymore... get it?

Re: What is the point of this? (0)

Anonymous Coward | about a year ago | (#44028227)

Except the people looking at it for free on the internet aren't the ones demanding the supply, get it?

Re:What is the point of this? (3, Insightful)

c (8461) | about a year ago | (#44028241)

What is the point of automatically removing child porn so it's not searchable?

Well, if it works to prevent people from seeing it unintentionally then it means the Google search engine provides more relevant search results. So that's a major improvement in Google's search engine.

If it's automatically identified and removed, then presumably Google would be able to purge ephemeral copies from their caches and whatnot, which is probably nice from a liability perspective.

It might help to reduce casual interest in the subject if it's not easily searchable.

I doubt it would prevent anyone actively trying to find it, and it certainly won't stop the kinds of people who would go to the lengths of producing it; at least, I can't imagine that fame through improved search engine results is a significant part of their motivation.

The question is what is the impact on the people who might make a transition from casual interest (if they could view it by searching) to actual production? If it helps prevent that, it's a win. On the other hand, if these people deal with frustrated urges by just going ahead and making their own, we'd have to call it a major failure.

Ideally, someone has actually done the research and determined that yes, blocking casual searches for child porn should amount to a net benefit.

In practice it wouldn't surprise me if it's a move to reduce the threat from Attorneys General who see child porn in Google's search results as an easy PR and courtroom win.

Re:What is the point of this? (1)

slashmydots (2189826) | about a year ago | (#44028267)

Exactly. How about they automatically find out who's behind the website and arrest them in their home country or shut down the hosting company hosting it?

Re:What is the point of this? (0)

Anonymous Coward | about a year ago | (#44028315)

That's a stupid approach. They are a search engine company, obviously they should go and arrest people, right? Right??
Having access to child porn fuels child porn, and creates child porn. It could make a significant difference.

Re:What is the point of this? (2)

J'raxis (248192) | about a year ago | (#44028325)

The point is to try to sell automated censorware to the public by saying it'll only be used against something "everyone" thinks ought to be censored. Once it's established, its scope will be expanded to cover all sorts of other materials.

I hope they really mean child (5, Interesting)

Mal-2 (675116) | about a year ago | (#44027689)

If they mean "all underage" and not just "blatantly children", good luck with that. There are no characteristics that will distinguish between 17 and 18, or even older. What is the software going to think of Kat Young, for example? What about models who are just small?

Also are they going to attempt to sort through drawings at all, considering they are legal in some jurisdictions and not others?

I sense false positives and angry models in Google's future.

Re:I hope they really mean child (1)

Ardyvee (2447206) | about a year ago | (#44027751)

Yes, this seems to be a difficulty they may face, and something I would like to see addressed. While I have seen things I would rather have not (and not necessarily related to this topic), I'm not entirely sure I would deem it illegal. Maybe I'm just biased by finding it as something that just comes with the Internet as part of the package. That and as long as it's not shoved in my face I don't really mind it existing*

*of course, child porn/abuse is illegal, so my personal view on it is kind of irrelevant and thus not part of that statement.

Re:I hope they really mean child (2)

Barny (103770) | about a year ago | (#44027783)

I for one can't wait until the Aussie government get in on this. Women with small breasts will be flagged and, as you said, drawings too.

It is a good effort, but the world is really becoming just a little too fucked up to start trying to stop things now.

Re:I hope they really mean child (0)

Anonymous Coward | about a year ago | (#44027821)

Well, it would include a DB of photos they have found, so photos that were part of some other collection but not of anyone underage would be flagged.

More alarming, though, is if they would really make all computers automatically censor whatever content they want...

Re:I hope they really mean child (1)

rioki (1328185) | about a year ago | (#44028309)

Hopefully it stores just some characteristics, such as file hashes, because a DB with the actual pictures, even if "not searchable", would probably be illegal, at least as the laws are currently written.

Re:I hope they really mean child (1)

gmack (197796) | about a year ago | (#44027823)

The summary is a bit off. The algorithm can't actually detect child porn; what this looks like is a system similar to the one YouTube uses, where one item gets reported and is then blocked globally, rather than having someone report each instance of the same image.

This, also misleading headline (0)

Anonymous Coward | about a year ago | (#44027849)

It's not that no human needs to see them. It's that 30 humans in different countries working on different and separate hash identification systems all don't need to see them.

Re:I hope they really mean child (4, Informative)

ebno-10db (1459097) | about a year ago | (#44027851)

This is about detecting known images (presumably even if altered a bit), not automatically detecting if a heretofore unseen image is CP. From the Google announcement:

Since 2008, we’ve used “hashing” technology to tag known child sexual abuse images, allowing us to identify duplicate images which may exist elsewhere. Each offending image in effect gets a unique ID that our computers can recognize without humans having to view them again . Recently, we’ve started working to incorporate encrypted “fingerprints” of child sexual abuse images into a cross-industry database. This will enable companies, law enforcement and charities to better collaborate on detecting and removing these images, and to take action against the criminals. Today we’ve also announced a $2 million Child Protection Technology Fund to encourage the development of ever more effective tools. [emphasis added]
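For readers curious how that kind of hash lookup works in practice, here is a minimal sketch in Python. It is purely illustrative: the announcement doesn't say which fingerprinting scheme Google uses, so the plain SHA-256 file hashes and the helper names below are assumptions, not Google's actual system.

    # Illustrative only: SHA-256 file hashes stand in for whatever
    # "encrypted fingerprints" the real cross-industry database uses.
    import hashlib

    def file_hash(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    def is_flagged(path, flagged_hashes):
        # flagged_hashes: a set of hex digests shared via the database
        return file_hash(path) in flagged_hashes

The point of such a scheme is that only the digests circulate: a participating host can check uploads against the shared set without anyone having to view the underlying images again.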

Re:I hope they really mean child (1)

Conspiracy_Of_Doves (236787) | about a year ago | (#44028247)

This is what I assumed.

Re:I hope they really mean child (1)

bWareiWare.co.uk (660144) | about a year ago | (#44027855)

As usual, by the time it made it to the Slashdot headline it was completely misreported. They are working on a DB to share hashes of known images. This only prevents them from having to review the image at each new URL; someone still has to have seen the image and added it to the DB (to be honest, they had better be confirming these flags at least sometimes, so that's two people, etc. etc.)

Re:I hope they really mean child (2)

wvmarle (1070040) | about a year ago | (#44027879)

This system is not "looking" at images. It is a database of hashes of known offending files, against which found content can be compared. Matching content will be filtered.

Of course this only works for known files (which are flagged by humans, I suppose, though that is not explicitly mentioned in TFA), and if a file is altered the hash changes. Though that doesn't happen too often: most people share content they find unaltered. And it doesn't work for new files, either. Those still need to be flagged - however, a lot can be done automatically there too, because if you find an unknown jpg on a site containing many known offending images, it's likely this unknown image is also offending.

Re:I hope they really mean child (0)

Anonymous Coward | about a year ago | (#44028153)

From the article:

"And maybe best of all, computers will automatically flag and remove these images without any human needing to see them."

So while the database is certainly one component it does suggest they are working on cutting out the human element for flagging content too.

Exactly - and how do you define underage? (5, Interesting)

Viol8 (599362) | about a year ago | (#44027931)

The age of consent in Spain is 14, in the UK 16, in the USA 18, so if there's a picture of a nude 15- or 17-year-old, which country gets to decide if it's legal?

While this may be a laudable effort I have the sneaking feeling the USA once again will be pushing its legal system and morality onto the rest of the world.

Re:Exactly - and how do you define underage? (1)

Nephandus (2953269) | about a year ago | (#44028257)

It varies by state, but 16 was the mode age of consent in the US. IIRC, SCOTUS ruled all the gender based AoCs unconstitutional, so I don't know if that stuck or whether they went with the high or low age (mostly 16 male vs 17 female). You can even marry at 16 with parental consent in my state. Not sure what that means for photographically documenting the honeymoon. The child porn issue was always nuts in the US. It's not about protecting anyone. It's just about killing the icky with scorched earth policy.

Re:Exactly - and how do you define underage? (0)

Anonymous Coward | about a year ago | (#44028273)

Legal to have sex with is one thing, legal to see pictures of is another. The limit for pictures is 18 years old in the EU.
Note that there is probably not a single Swede who has not violated the law by unintentionally seeing 'child porn' without realising it was child porn at the time...
That is really a crime here.

Not a bad idea, but... (0)

Anonymous Coward | about a year ago | (#44027695)

It would be a good thing to keep people from clicking on this sort of thing by accident ("accident?") and suffering an unfair legal penalty, or I suppose to keep aging rock stars like Pete Townshend from getting too excited and having a heart attack, but do people dealing in illegal porn really put up websites that are freely searchable? So, if anything, it comes off sounding more like a "hey, we're community minded!" sort of advertisement.

Re:Not a bad idea, but... (1)

benlwilson (983210) | about a year ago | (#44028305)

It would be a good thing to keep people from clicking on this sort of thing by accident ("accident?")

I'm not too sure about that part.
Consider a world where child porn exists but is totally hidden from everyone else's eyes.
If people don't know that something is going on, or how prevalent it is, then they're less likely to take or support any action to stop it.

Don't get me wrong, I'm not saying everyone should be exposed to CP; all I'm saying is that hiding it away may make the problem worse.

False positive (1)

Anonymous Coward | about a year ago | (#44027705)

How do we know it will only flag illegal content? What if iPhone gets flagged and only Android phones show up in searches, intentionally or not? This is just like when they came up with the idea to avoid searches for certain words. I instantly asked: "will this ban the party which writes on its homepage that it will increase the jail time for such offenses?"

Don't get me wrong. I think it's great if the intended pictures aren't available or, better yet, aren't made in the first place. However, my experience with auto-detection tells me that it always includes some false positives.

Re:False positive (1)

sjwt (161428) | about a year ago | (#44027723)

"What if iPhone gets flagged and only Android phones shows up in searches?"

Then the world will be a better place.

Which still doesn't explain (0)

Anonymous Coward | about a year ago | (#44027711)

How they're verifying the algorithm...

I support this (1)

Murdoch5 (1563847) | about a year ago | (#44027713)

This is a really good idea.

Keep That Stuff Off My Computer (0)

Anonymous Coward | about a year ago | (#44027721)

This is great, if it becomes integrated with browsers. I don't want clandestinely downloaded illegal images polluting my browser cache.

Kiss "Sailor Moon" goodbye on search engines (0)

Anonymous Coward | about a year ago | (#44027731)

And about 50% of all anime. The amount of deliberately child erotic content in most Japanese anime is truly frightening. Entire seasons of waif-like, huge breasted, schoolgirls and androgynously pretty schoolboys going "ooooohhhhh" and "aaahhhhhh" and prancing around to show off their busts and asses has always been a big factor in anime. Schoolgirl porn is a *big* market in Japan, and it sells a lot of anime to a lot of fans in different countries.

Re:Kiss "Sailor Moon" goodbye on search engines (0)

Anonymous Coward | about a year ago | (#44027759)

Those schoolgirls are pretty old and mature. I don't think they'll be threatened under this. I'm more worried they'll go after the loli stuff.

This is stupid. (0)

Anonymous Coward | about a year ago | (#44027763)

It is apparently a good thing that google is coming up with algorithms to cull stuff from search without anyone needing to see them, now? So we're not even checking whether it's actual pictures of child abuse and not, say, pictures of children playing at the beach?

Besides that, child porn is not the same as sexual child abuse: The former is images (or drawings) depicting the latter. That makes cracking down on child porn, cracking down on symptoms, not on the actual abuse. Personally I care far more about the actual harm than the pictures.

Especially since few people will voluntarily look at depictions of prepubescent child abuse, sexual or otherwise, unless that happens to be their kink.

And in that case, I'd rather have them in therapy so they keep their urges in check than have them in jail (working on their networks with the like-minded, since they have to be kept apart from the general criminal population) and eventually on the loose again.

That's quite apart from all the censorship issues that inevitably leak over into more and more censorship, regardless of the noble intentions whence it all started.

So this is really quite stupid, seemingly noble, but long-term quite futile and even carrying a lot of collateral damage. As such, again quite a good example of seeming to not be evil but turning out quite evil anyway.

They could be doing actual good instead of just seeming to do good. But that doesn't involve quite as much technology, and quite a lot of social work with very icky perverts.

Re:This is stupid. (1)

GameboyRMH (1153867) | about a year ago | (#44027985)

I don't think they're going to be using computer vision to try to identify child porn, they'll probably be using a database of hashes (either file hashes, or some kind of "image hash" that can identify pics even if they've been resized, recompressed, added a watermark etc) of known child porn. It's slightly helpful and has a vanishingly small chance of false positives.
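For the curious, a toy version of the "image hash" the parent describes (an average hash, computed with Pillow) might look roughly like this. The algorithm choice, function names and threshold idea are illustrative assumptions; production systems use far more robust perceptual fingerprints.

    # Toy average-hash: tolerant of resizing and recompression,
    # unlike an exact file hash.
    from PIL import Image

    def average_hash(path, hash_size=8):
        # Shrink to hash_size x hash_size greyscale, then set one bit per
        # pixel depending on whether it is brighter than the mean.
        img = Image.open(path).convert("L").resize((hash_size, hash_size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | (1 if p > mean else 0)
        return bits

    def hamming_distance(a, b):
        # Small distance suggests the same picture after minor edits.
        return bin(a ^ b).count("1")

Matching then becomes hamming_distance(h1, h2) <= some threshold rather than exact equality, which is what lets a resized, re-saved or watermarked copy still match a known image.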

I have a better idea.... (3, Insightful)

Semmi Zamunda (2897397) | about a year ago | (#44027765)

How about instead you compile a list of where these images are HOSTED.....and then DO SOMETHING about that? Notify local law enforcement of the images and give all garnered info about said images to them.

Re:I have a better idea.... (1)

ebno-10db (1459097) | about a year ago | (#44027881)

RTFA. They'll do that.

Re:I have a better idea.... (0)

Anonymous Coward | about a year ago | (#44028073)

The answer is third world countries, and their local law enforcement doesn't give a shit. The best Google can do is de-index them.

Not sure I agree 100% that this is a good idea.... (4, Insightful)

realsilly (186931) | about a year ago | (#44027819)

Let me be clear about this. I DO NOT condone child pornography at all; I find it foul and disgusting. But there is an over-reach that I think may go on here. If I purchase a server and I engage in a P2P network, then it is not Google's nor anyone else's business what I transmit. If the server is a public server or one owned by a company (such as Google), then I would agree they have every right to remove such foul content from their servers.

Yes, I would rather that the people who engage in this be stopped. But whenever programs like this are created they tend to start out being put to use with the best of intentions, but will likely be used for other, more nefarious purposes. If this algorithm is used to sniff out child pornography, it could be modified to sniff out information about a political party and quell it, or news that a government agency doesn't want people to know about.

With all that has recently come to light about the spying by the US Govt., can you really say with 100% certainty that this technology won't be abused for other purposes? I can't.

Again I DO NOT condone Child Pornography.

Re:Not sure I agree 100% that this is a good idea. (3, Interesting)

jbmartin6 (1232050) | about a year ago | (#44027889)

My reaction was something similar. I question the value of a search engine when it is no longer neutral. Now I will only see what Google has decided it is in my interest to see. This technology will be used in the future to skew political searches for example, or to favor one company's products over another's. (If it isn't already.) Now if they said 'we are using Google's search engine to catch child pornographers' I would say good for you please continue.

Re:Not sure I agree 100% that this is a good idea. (1)

wvmarle (1070040) | about a year ago | (#44027915)

Google and other search engines filter content already - like Google's "safe search" options to block images showing naked people from appearing in their image search. The technology exists, and "safe search" appears to actually analyse images to judge the content, while this child porn database only compares file hashes against known offending content.

The technology is there, it's not new, this is just a new application of it. And I have to say I'm quite confident that it's not being used for political purposes, partly because it's Google themselves that take the initiative, not the government.

Also, if it becomes known that Google actively filters certain political content or skews search results intentionally to push a political agenda, they may end up losing their #1 spot as a search engine really fast (especially if at the same time the competition, most notably Bing because that's the only one I know of with serious money behind it, finally gets their act together and provides a proper alternative).

And that's a reason to avoid a useful tool? (1)

sirwired (27582) | about a year ago | (#44027917)

This is certainly, unarguably, a useful tool that can be used in order to accomplish a worthy societal goal; I don't think our criteria for such things should be: "Well, it could be used for bad things, so we should stick our heads in the sand instead." No cars because they might be driven by bank robbers! No knives because they might be used to cut people instead of carrots! etc.

In any case, content recognition algorithms already exist and are already used for nefarious purposes. Why not use those tools towards a worthy end?

Re:Not sure I agree 100% that this is a good idea. (2)

GLMDesigns (2044134) | about a year ago | (#44028001)

The technology is out there. It will only get better (by a factor of a thousand or more) in the next decade or so. It can be used by governments for all sorts of purposes - so the solution is not to limit the technology (which can't be done) but to limit the government (which can be done).

This is getting out of hand (0)

Anonymous Coward | about a year ago | (#44027847)

I don't think we can stand still and watch as such censorship tightens around the search engines. It is not just child porn; there is strong censorship around movies and other "protected" material. When they start censoring "terrorism", and after that censoring legitimate political opposition in the name of anti-terrorism, it will be too late.
I think that Google and other filtered search engines should be replaced by peer-to-peer search engines, like yacy [yacy.net]. However, such search engines are not usable yet. For example, yacy is written in Java, so it is slow and resource-consuming. I'm looking forward to seeing some good solution that actually works; I would deploy it on my server immediately.

Re:This is getting out of hand (1)

gweihir (88907) | about a year ago | (#44027977)

It seems unlikely that there is much (or any) CP that can be found using a search engine (I have not tried, but others have), as everything findable with a search engine is easily reported to law enforcement and traceable back to the ones putting it there. This strongly indicates that Google wanted (or was coerced) to implement image censorship and is just using CP as an easy excuse that is plausible to the clueless. It is, of course, completely bogus. Once you look at the facts, it makes zero sense. And it is by far not the first attempt to justify a general censorship infrastructure with CP. The infamous German stop-signs come to mind (by now abolished as completely ineffectual for the stated purpose, but the amoral scum that established the law is still in office).

great (0)

Anonymous Coward | about a year ago | (#44027861)

Parents across the internet begin frantically removing topless pictures of their 4-year-old child from Facebook for fear of an FBI raid. We all know how they like flash-bangs.

Seems like this could be used for other things (2, Insightful)

usuallylost (2468686) | about a year ago | (#44027863)

Removing child pornography is a laudable goal.

We just have to realize that it won't stop at that. From what the article says, it seems like this technology could be used for any image. At the very least I expect we'll see general copyright enforcement from this. Worst case, we will see various regimes using this to suppress images they don't like. Oh, you have pictures of us slaughtering our opponents? Well, we'd better put those on the bad list.

brute force (1)

Spaham (634471) | about a year ago | (#44027867)

Do you think it could be possible to reconstruct those images by brute-force trying all combinations until you get a positive answer from the database ?
It would take time, but doesn't sound impossible...

Re:brute force (1)

gweihir (88907) | about a year ago | (#44027929)

Quite impossible. You are completely and utterly clueless about the difficulties involved _and_ you are clueless about your cluelessness. Look up the Dunning-Kruger effect. You are on the far left of the curves.

Re:brute force (1)

dhTardis (1326285) | about a year ago | (#44027943)

Let's see... even Wikipedia's example of a poor JPEG [wikipedia.org] is 1523 bytes, so (accounting for metadata) at least 2^10000 (10^3000) possible images. Divide by whatever images/second you like (a billion? a billion billion?) and it's still "more universe lifetimes than you can imagine". "Take time" and "impossible" are not, in this case, mutually exclusive.

Re:brute force (0)

Anonymous Coward | about a year ago | (#44028031)

Consider a 1 KiB JPEG: there are 2^8192 possible binary combinations in the search space. The smallest valid JPEG is 125 bytes, so let's be generous and exclude twice that amount. We now have 774 bytes (6192 bits) to work with. If each combination were a valid image, and we could generate and test one trillion images per second, it would still take roughly 3 * 10^1844 years to brute force. Since that's well beyond the expected date of the heat death of the known universe, I'd say calling this impossible is a fair assessment.
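For anyone who wants to check that arithmetic, a quick back-of-the-envelope calculation in Python (the trillion-guesses-per-second rate is the same hypothetical figure used above):

    # Brute-forcing 774 unknown bytes at 1e12 guesses per second.
    import math

    unknown_bits = 774 * 8                              # 6192 bits
    log10_combos = unknown_bits * math.log10(2)         # ~10^1864 candidates
    log10_seconds = log10_combos - 12                   # divide by 1e12 guesses/sec
    log10_years = log10_seconds - math.log10(3.156e7)   # seconds per year
    print("about 10^%d years" % round(log10_years))     # prints: about 10^1844 years

Either way the conclusion stands: the search space is so far beyond any conceivable compute budget that "take time" and "impossible" really do mean the same thing here.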

Just remove all children from the web (0)

Anonymous Coward | about a year ago | (#44027899)

I'm serious. Kids are too young to know their pictures will be there forever on the internet. They need to be old enough to understand the consequences.

I feel sorry for the programmer (0)

Anonymous Coward | about a year ago | (#44027901)

Who had to test the child porn detection algorithm.

Out of sight, out of mind (3, Insightful)

gweihir (88907) | about a year ago | (#44027913)

This will increase child abuse. As soon as it becomes invisible, perpetrators are completely free to do whatever they like, as the public will not be aware it is a problem. The reason is that it addresses the wrong problem. Distribution of CP is a minor issue. Creation of CP (and all the child abuse that is not documented or does not end up on the Internet) is the real problem. It seems politicians have become so focused on distribution of CP, that nothing is being done anymore to fight actual child abuse. After all, distribution of CP gives nice and easy convictions and to hell with the children themselves.

Re:Out of sight, out of mind (2)

ebno-10db (1459097) | about a year ago | (#44028029)

This will increase child abuse. As soon as it becomes invisible, perpetrators are completely free to do whatever they like, as the public will not be aware it is a problem.

This probably won't do squat to make it less visible, because it's already reasonably well hidden. Any poster of CP that's too dumb to keep it out of the range of search engines has probably already been caught. This is a feel good effort. I can't criticize it, but it won't have much effect one way or the other.

Fighting the good fight (1)

TQL (793194) | about a year ago | (#44027919)

I think we should be praising and supporting any organisation that is trying to protect innocent children from being subjected to this. We should not only lobby governments and organisations to do more to stop the practice and bring these people to justice, but also pray for the poor children that are at the centre of this.

Also, spare a thought for the poor Google employees who are going to have to test this algorithm. I sincerely hope that Google ensures that these people are given any support and counselling they might need.

Re:Fighting the good fight (1)

ebno-10db (1459097) | about a year ago | (#44028067)

As to your 1st paragraph, do you know anyone who disagrees? And if they did, they wouldn't say so.

As to your 2nd paragraph, there was a Slashdot story a while ago (can't find it now) about Google temps who were hired for just that kind of stuff. They were "released" after 6-12 months, and really did have psychological problems because of it. Of course no help or assistance of any kind was offered. What do you think this is, the 20th century?

Re:Fighting the good fight (1)

ark1 (873448) | about a year ago | (#44028173)

Google already has employees who, when notified, investigate and remove obscene content, which means they are already exposed to the worst of what the Internet has to offer. When CP is reported, there is a legal requirement to act and remove such content within 24 hours or so. If Google can pull off a decent solution, they will have a more proactive approach to dealing with this problem and will potentially save money, as less human intervention will be required.

...What could go wrong? (1)

bferrell (253291) | about a year ago | (#44027923)

sheesh

See the old story by C. M. Kornbluth called "The Marching Morons".

Next step? (0)

Anonymous Coward | about a year ago | (#44028027)

A goatse detection algorithm?

Re:Next step? (1)

Chrisq (894406) | about a year ago | (#44028131)

Next step

A goatse detection algorithm?

I think that's stretching things a bit.

What about the URL for this topic? (0)

Anonymous Coward | about a year ago | (#44028057)

Here's the URL of this page:
"http://search.slashdot.org/story/13/06/17/0545226/google-aims-to-cull-child-porn-by-algorithm-not-human-review"

Does that get filtered? Am I now on a watch list for having viewed a "child porn" webpage?

Google is now the largest child porn collector (1)

Anonymous Coward | about a year ago | (#44028119)

"It's research, I swear"

Rare (1)

Reliable Windmill (2932227) | about a year ago | (#44028147)

In my 15 years on the Internet I've seen A LOT of nude imagery, but never ONCE come across child pornography. I get the feeling it's extremely rare, and the people who do find it spend a lot of time actually digging it up. What has happened now that urges Google to find new ways of combating CP? Is there a sudden increase in the posting of CP on public, easily accessed and indexed adult web sites?

Corporate dystopia (0)

Anonymous Coward | about a year ago | (#44028159)

Google wants to know everything about us: who we are, what things we like, what we buy, what we read, where we go, who we talk to, who we email, and occasionally they will secretly share these riches of knowledge with the Secret Police. Now they will also protect us automatically from bad things; just lie back and think of England. Let the Googleplex take care of everything, why bother your mind? You should be shopping for bargains on the Web.

This company sends shivers down my spine. I beg all of you to think twice and then think again and again before you transact with this company or use its products. They will take us to a very bad place.

Misleading summary (1)

Chrisq (894406) | about a year ago | (#44028201)

The summary "By Algorithm, Not Human Review" implies that the algorithm is somehow evaluating pictures. In fact from TFA it is clear all it is doing is looking for copes of known existing images by hash-code. If it were examining images I would be worried about false positives, but as it just looks for know child porn I cannot see any down-side - this is a good move.

Impossible (1)

wisnoskij (1206448) | about a year ago | (#44028237)

Judges have already declared that porn is basically undefinable, and I disagree with them that you know it when you see it.
Added to this, you cannot tell the difference between a 15-year-old and a 21-year-old 100% of the time; sure, 95% of the time you would get it right with that big of a range, but not always.
And trying to tell the difference between 18 and 17 or 16 is more like a 50% chance of getting it right, regardless of whether you are a computer or a human being.

demand / supply (1)

beefoot (2250164) | about a year ago | (#44028255)

So Google is trying to control the supply of CP images? That itself will only increase the price of these images or, worse, the violation of these children. If Google seriously wants to help, they should help law enforcement track down these people and bring them to justice.

no profit here (1)

paiute (550198) | about a year ago | (#44028293)

1. Upload to Picasa picture of kids at birthday pool party holding balloon animals with long noses.
2. End up on floor being beaten by local SWAT team.
3. ??????
4. Prison

Unintended consequences? (1)

hawguy (1600213) | about a year ago | (#44028295)

If this system were 100% effective at preventing all known CP images from being searchable or even downloaded, then wouldn't that drive demand for brand new images to be created that don't trip the filters?
