
Google and Microsoft To Block Child-Abuse Search Terms

samzenpus posted about a year ago | from the thinking-of-the-children dept.


mrspoonsi writes "Leading search engine companies Google and Microsoft have agreed on measures to make it harder to find child abuse images online. As many as 100,000 search terms will now return no results for illegal material and will instead trigger warnings that child abuse imagery is illegal. The Google chairman said he hired a 200-strong team to work out a solution over the last three months. Google's previous set of measures, which displayed a warning to people attempting to search for illegal material, caused a 20 percent drop in illicit activity."
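The mechanism the summary describes (a list of blocked query terms that return no ordinary results and surface a warning instead) can be sketched roughly as follows. This is a toy illustration, not the engines' actual implementation; the term, warning text, and function names are all invented.

```python
# Hypothetical sketch of a query blocklist with a warning interstitial.
# Not Google's or Microsoft's real code; all names and data are invented.

BLOCKED_TERMS = {"blocked example query"}  # stand-in for the ~100,000 terms

WARNING = ("Child abuse imagery is illegal. "
           "Reporting and help resources would be shown here.")

def search(query, index):
    """Return (warning, results) for a query against a toy document list."""
    normalized = query.strip().lower()
    if normalized in BLOCKED_TERMS:
        # Blocked terms yield no ordinary results, only the warning.
        return WARNING, []
    return None, [doc for doc in index if normalized in doc.lower()]

warning, results = search("blocked example query", ["a page", "another page"])
```

The design choice the articles describe is exactly this split: ordinary queries go to the normal result path, while listed terms short-circuit to a warning page before any ranking happens.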


Well, it's something. (1, Funny)

AltGrendel (175092) | about a year ago | (#45453719)

I imagine that this will work until the child abusers find a way around it.

Re:Well, it's something. (5, Insightful)

Joining Yet Again (2992179) | about a year ago | (#45453777)

It's something to INCREASE abuse by:

1) Redirecting resources away from finding abusers;

2) Giving the impression that "something is being done already" so resources don't need to be reviewed;

3) Misidentifying abuse as something which is caused by the availability of images of abuse, when in fact almost all child sex abuse occurs within families or thanks to trusted acquaintances for various complex reasons which require careful analysis rather than knee-jerk political reactions.

Re:Well, it's something. (5, Insightful)

Pi1grim (1956208) | about a year ago | (#45453887)

Exactly. Unfortunately, this is the tactic of sweeping dirt under the rug: shutting your eyes and pretending it's not happening. I don't understand why the same people who would never believe that hiding crime reports stops crime are sure that if we remove all child abuse pictures from the internet, the problem will solve itself.

Re:Well, it's something. (-1)

Anonymous Coward | about a year ago | (#45454025)

The NYPD has been doing this for years under Mayor Bloomberg. They keep property values high for the Mayor's real estate developer buddies by pretending crime is going down: things that would have been felonies get downgraded and written up as misdemeanors, and misdemeanors get swept under the rug. Of course, the NYPD is also on a strict quota system for handing out tickets to raise revenue for the city, so if you smoke marijuana outside, ride a bike on the sidewalk, or even jaywalk (if it's the end of the month and the cop is desperate to make quota), expect a ticket.

Re:Well, it's something. (0)

Anonymous Coward | about a year ago | (#45454329)

Lots of police departments apparently fudge things to fiddle the stats. As for your statement that the NYPD is "on a strict quota system", can you cite proof, as well as proof that this is a problem, i.e. that they are ticketing things that aren't patently illegal in NYC? Smoking marijuana on the street is currently illegal: if you get a ticket for it, you don't cite the NYPD for doing something wrong. Get the laws changed. But don't blame the cops.

Re:Well, it's something. (2)

Joining Yet Again (2992179) | about a year ago | (#45454383)

It is an executive issue if the police are picking and choosing which laws to enforce and when.

The legislative matter needs to be addressed additionally, not instead.

Re:Well, it's something. (0, Troll)

Lumpy (12016) | about a year ago | (#45454435)

Ask ANY of the guys who are actually on the streets, or people who live in edge neighborhoods: crime is going up, and going up rapidly. 99% of what you hear from your local, state, or federal government is 100% BS meant simply to calm you down.

If crime rates are going down, then why is my local police force getting military-grade equipment and gear? Cripes, at the last sports event here they had M16 machine guns in the open and were wearing full military armor.

Re:Well, it's something. (2)

Anonymous Coward | about a year ago | (#45454491)

Because there are really dangerous people to arrest, like Jeremy Hammond

Re:Well, it's something. (3, Interesting)

ArsenneLupin (766289) | about a year ago | (#45454097)

Protecting the children is not the point of this. It's done to give the powers that be just another arrow in their quiver to crush the little man if he ever dares to fight against one of their corrupt construction projects, or if he ever dares to do his job too well researching who planted bombs against utility poles in the eighties. At least, that's what it is used for here in Luxembourg.

Re:Well, it's something. (4, Informative)

Xest (935314) | about a year ago | (#45454377)

Actually, I think it's done for no other reason than to shut Claire Perry and The Daily Mail, with their "Stop online porn" campaign, the fuck up - yes, that's a real thing.

Since she was elected this is the only issue she's focused on; if I were Dave Cameron I'd be pretty sick of hearing her harp on about things she doesn't understand by now too, and would probably do something useless and pointless just to get her off my back.

Not saying it'll work of course, and not defending it, but I can understand why someone would cave in to a multi-year barrage of whining from that silly cow.

Now we just need her to suffer the same fate as Jacqui Smith, the last MP who was as whiny and clueless as Claire Perry - she was caught charging her husband's porn to her expenses. Karma - it's great.

Re:Well, it's something. (1)

interkin3tic (1469267) | about a year ago | (#45454189)

4) If pedophiles are prevented from getting this stuff virtually online, they might turn to doing it themselves and actually molest children.

I have no idea how plausible that hypothesis is, but it might give some of those knee-jerk political reactions a second thought.

Re:Well, it's something. (3, Interesting)

JackieBrown (987087) | about a year ago | (#45454257)

I think the opposite is probably true. I know that watching women in pornographic videos increases my visualizing women in day-to-day interactions in similar roles.

One of the best things for my marriage was when we decided to quit watching these types of videos. It moved the focus of sex back to love instead of a sport.

Re:Well, it's something. (4, Interesting)

Joining Yet Again (2992179) | about a year ago | (#45454321)

You understand the difference between "visualising" and "raping", yes? Watching porn did not make you a rapist.

Re:Well, it's something. (1)

Jeff Flanagan (2981883) | about a year ago | (#45454413)

It doesn't sound like the porn was the problem. Your reaction to it wasn't typical.

Re:Well, it's something. (2, Insightful)

Anonymous Coward | about a year ago | (#45454433)

A healthy mind, even one with tendencies toward socially unaccepted thoughts or occasional actions, is still a healthy mind. It is much different from a mind that doesn't work correctly in the first place. Rape (of anyone) and abusing children and/or child porn don't stem from a healthy mind. The healthy mind will look back and say, "this is wrong, I have to stop this." The unhealthy mind will continue to harbor fantasies and eventually act on them. That's not to say they couldn't harbor fantasies for a very long time, but "not getting caught" leads to acting those fantasies out.

Re:Well, it's something. (1)

P-niiice (1703362) | about a year ago | (#45454399)

I see the point you're trying to make, but changing a couple of search sites' reactions to certain search terms really isn't going to have a detrimental effect. All it's going to do is prevent abusers and non-abusing consumers of the content from getting to the product, which is a good thing and needed to happen anyway. People making incorrect judgements on this sort of thing were misinformed regardless, and anyone who directs resources based on what Google and Microsoft do in their search engines is incompetent and probably wouldn't have their job for much longer.

Re:Well, it's something. (0)

Anonymous Coward | about a year ago | (#45454467)

2) Giving the impression that "something is being done already" so resources don't need to be reviewed;

AKA "Security Theater"

Re:Well, it's something. (5, Insightful)

gweihir (88907) | about a year ago | (#45453879)

Actually it does not do anything about child abuse. It just hides the problem. People that look at such images are a minor side-issue. The real issue is people that abuse children, and even there those that document their crimes in images or video seem to be a small minority.

I think this is designed (like so many efforts by law enforcement) to give the appearance of doing something really valuable, while it is likely rather meaningless in reality and may even be counter-productive. If this effort went into preventing children from being harmed in the first place, it might actually accomplish something. Instead they go for an easy, but fake, win.

Re:Well, it's something. (1)

Anonymous Coward | about a year ago | (#45453921)

If this effort went into preventing children from being harmed in the first place, it might actually accomplish something.

I agree that this is idiotic, but how could you prevent children from being harmed in the first place? That seems rather difficult, and the usual 'solutions' seem to involve violating people's rights or privacy.

Re:Well, it's something. (3, Insightful)

gweihir (88907) | about a year ago | (#45454089)

True, but there are leads you can follow up with traditional police work, e.g. trying to find the people who make and sell this kind of material. Focusing on those who search for it just diverts resources from that for cheap, bombastic, but meaningless headlines. Example: in one of these operations in Germany, 3,000 homes were searched. That made for grand headlines. Do you know how many people were actually charged? Fewer than 20! But the police got their headline and gave the impression of doing something. (So much for violating rights and privacy...)

I have the impression by now that they care far more about the appearance of doing something than actually doing something, because actually doing something worthwhile here is hard and gives far less impressive headlines. So they go the easy way, and all the abused children are just out of luck. That strikes me as incredibly unethical.

Re:Well, it's something. (1)

ArsenneLupin (766289) | about a year ago | (#45454127)

That seems rather difficult, and the usual 'solutions' seem to involve violating people's rights or privacy.

Just as if this witch-hunt against pictures wasn't invading people's privacy or violating their rights...

Re:Well, it's something. (1)

badfish99 (826052) | about a year ago | (#45454341)

I don't imagine it will do anything about child abuse, but if I wanted to look at pictures of it all day long, there are now 200 more jobs that have been created where I can do so without fear of being caught.

Re:Well, it's something. (0)

Anonymous Coward | about a year ago | (#45454529)

It's like forbidding alcohol.

Sure, drinking goes down, but sure as hell there will be many more brewing moonshine right at home.

Re:Well, it's something. (1)

methano (519830) | about a year ago | (#45454497)

It seems that there is no problem so bad, that the coders don't think it can be solved by writing more code.

Friendly request to non-Brits (5, Insightful)

Joining Yet Again (2992179) | about a year ago | (#45453723)

Please search for and compile the list of 100,000 terms.

Which will inevitably all:
- Have double meanings;
- Be likely to be used by victims of abuse who are looking for help;
- Be useful for legitimate research;
- Be searched for by people looking for news or discussion on censorship;
- End up with a lot of political hot topics thrown in.

Thanks!

Re:Friendly request to non-Brits (0)

Anonymous Coward | about a year ago | (#45453785)

From what I understood, the terms are only blocked when performing image searches, so a number of your concerns should already be addressed.

Re:Friendly request to non-Brits (5, Informative)

Joining Yet Again (2992179) | about a year ago | (#45453815)

That's not the impression the BBC article [bbc.co.uk] gives me. Indeed, it says:

Typing "child pornography" in to Google's search engine now brings up a set of search results that include warnings that child abuse imagery is illegal.

The first three links are all related to reporting disturbing images or seeking help if you think you or someone you know has a problem with child porn.

The first link is an advert that links to a Google statement about protecting children from sexual abuse. The next link directs you to the Internet Watch Foundation, where you can report criminal online content, and a link to Stop it Now advises users how they can get help and advice.

The remaining search results are mainly news stories from around the world reporting on child pornography.

So Google are now engaging in government-directed manipulation of search results covering the discussion of child sex abuse images.

Re:Friendly request to non-Brits (4, Funny)

Anonymous Coward | about a year ago | (#45453943)

The Catholic Church sighs with relief.

Re:Friendly request to non-Brits (4, Informative)

fatphil (181876) | about a year ago | (#45454219)

Sites that discuss contentious issues often get dragged down by the same net.

There was a Finnish site called lapsiporno.info (= "kiddie porn"), a freedom-of-speech advocate's site complaining about excessively wide (and unconstitutional) governmental blocking of things which weren't actually distribution of child pornography. His reward for his actions: being added to the blocked list himself.
http://www.effi.org/blog/kai-2008-02-18.html

But it's a small price to pay, because think of the chiiiiildren!

Re:Friendly request to non-Brits (3, Insightful)

AmiMoJo (196126) | about a year ago | (#45453791)

For example, there is a popular French singer who does a song called "Lolita", presumably after the novel. For that matter the novel itself is perfectly legitimate.

Anyway, what kind of idiot googles for child pornography? Really, how many users are that dumb?

Re:Friendly request to non-Brits (4, Funny)

Thanshin (1188877) | about a year ago | (#45453889)

Really, how many users are that dumb?

The answer to that question should be clear* to anyone who uses the word "users" and has over one month of professional experience.

*: In this context, the word "clear" is to be interpreted as "painfully obvious; crystalline as one of the axioms on which the universe stands; bright as the one truth against which all other truths are to be measured".

Re:Friendly request to non-Brits (1)

Anonymous Coward | about a year ago | (#45453937)

There's a restaurant on the Danforth in Toronto called "Lolita's Lust".

Good luck googling it now.

Re:Friendly request to non-Brits (3, Insightful)

gweihir (88907) | about a year ago | (#45453959)

I think nobody will be that dumb. But on the other hand, people may be dumb enough to think that some searches were for such material. That all this has very little to do with children actually being abused seems to escape them as well, because most of the messed-up people who abuse children will not document it, and the few who do will not put that material online where Google can find it.

This is designed to give the appearance of "doing something" about child abuse, while it really accomplishes nothing. It might be a test run for a censorship list, though.

Re:Friendly request to non-Brits (4, Insightful)

Millennium (2451) | about a year ago | (#45454407)

"Nobody will be that dumb" is one of the most dangerous bets a person can make, regardless of context. Someone will always be "that dumb".

Re:Friendly request to non-Brits (2)

drinkypoo (153816) | about a year ago | (#45454085)

Anyway, what kind of idiot googles for child pornography. Really, how many users are that dumb?

Obligatory You Must Be New Here.

Re:Friendly request to non-Brits (1)

Nemesisghost (1720424) | about a year ago | (#45453823)

It seems that the filter is less aimed at general filtering of all websites & more towards just those that host the illicit images. The idea I got was that using these terms to search for images would return no results, but a general web search would still have results.
And since this is done by a set of companies, one would hope that politics would not come into play in how the list of terms is managed. But in this day & age, I highly doubt it.

Re:Friendly request to non-Brits (0)

Anonymous Coward | about a year ago | (#45454357)

And since this is done by a set of companies, one would hope that politics would not come into play in how the list of terms is managed. But in this day & age, I highly doubt it.

That doubt is well founded. As a general rule, if something can be abused, it will be abused. The Patriot Act was supposed to protect us against the threat of Al Qaeda; its framework is now being used to enable mass surveillance of people who have nothing to do with Al Qaeda or even Islam in general (which in no way encourages terrorism any more than Christianity does). Knowing the mentality of the people who brought us the Patriot Act, this is only the first step on a long path towards mass censorship.

Re:Friendly request to non-Brits (1)

Your.Master (1088569) | about a year ago | (#45453871)

It didn't say that 100k terms returned no results at all. It said that 100k terms returned no child abuse results.

It looks like what they're doing is removing the sites from their index, not really screwing around with the algorithms (which is why it's possible for Bing and Google to share their work here).

As such, none of the things you mentioned are particularly relevant, because none of them would be removed. In fact, by removing child porn from the results, they would be promoted. You could argue that some things will be removed with the excuse that they are child porn, but honestly, they don't need the child-porn excuse to remove something and have you never find out about it, so that is kind of a null worry.

Another argument is that something can legitimately be "on the fence", e.g. something that's of legal age in place A but illegal in place B because place B has an age of consent of 30 years old or some such thing.

Another one is a site like the old GeoCities that happens to host some child porn in arm A, while arm B of the site hosts no child porn, and A and B are unaware of each other.

But those problems aren't really particular to child porn. They are search optimization problems in general.
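The distinction this comment draws (removing sites from the index rather than changing the ranking algorithms) can be sketched as below. This is a hypothetical illustration: the blocklist, URLs, and function are invented, but it shows why a shared URL list is what would let two engines reuse each other's work.

```python
# Hypothetical sketch of index-level removal: blocklisted sites are
# dropped before indexing, so every query misses them and no ranking
# algorithm changes. All names and URLs here are invented.

SHARED_URL_BLOCKLIST = {"http://bad.example/illegal"}

def build_index(crawled_pages):
    """Return an index containing only pages whose URL is not blocklisted."""
    return {url: text for url, text in crawled_pages.items()
            if url not in SHARED_URL_BLOCKLIST}

pages = {
    "http://bad.example/illegal": "illegal material",
    "http://ok.example/news": "news story discussing the policy",
}
index = build_index(pages)  # the blocked URL never enters the index
```

Because the filtering happens at index-build time, the blocklist itself is the only artifact that needs to be shared; each engine's query-time ranking stays untouched.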

Re:Friendly request to non-Brits (2)

Joining Yet Again (2992179) | about a year ago | (#45453975)

It didn't say that 100k terms returned no results at all. It said that 100k terms returned no child abuse results.

Wait... Google has EVER returned results with pictures of child sex abuse?!

I suppose I'm lucky in that for the past 15 years I've never accidentally entered the wrong terms, because I've never seen anything I'd regard as an image of child abuse.

Given this, my concern is what new things they are doing, particularly (see above) regarding manipulation of text results.

Re:Friendly request to non-Brits (3, Insightful)

Chrisq (894406) | about a year ago | (#45453953)

Please search for and compile the list of 100,000 terms.

Which will inevitably all:
- Have double meanings;
- Be likely to be used by victims of abuse who are looking for help;
- Be useful for legitimate research;
- Be searched for by people looking for news or discussion on censorship;
- End up with a lot of political hot topics thrown in.

Thanks!

Very true... for example, I was thinking about searching for how this technology works, but to do so would mean searching for dodgy things like "child abuse image filter".

Re:Friendly request to non-Brits (1)

Joining Yet Again (2992179) | about a year ago | (#45454039)

"child abuse image filter"

Oh my! You were probably... trying to bypass it. Pedo!

Your name vill also go on ze list.

(To think we do with sincerity what we once saw as so wrong that we once mocked it...)

Re:Friendly request to non-Brits (0)

CheezburgerBrown . (3417019) | about a year ago | (#45454295)

Please search for and compile the list of 100,000 terms. Which will inevitably all: - Have double meanings; - Be likely to be used by victims of abuse who are looking for help; - Be useful for legitimate research; - Be searched for by people looking for news or discussion on censorship; - End up with a lot of political hot topics thrown in. Thanks, Obama! -there, fixed that for you.

Re:Friendly request to non-Brits (1)

Xest (935314) | about a year ago | (#45454463)

Non-Brits won't be able to help. According to the article I read this morning, this is a global thing: Microsoft and Google are going to censor these terms right across the globe.

yes (0)

ILongForDarkness (1134931) | about a year ago | (#45453739)

because it is so hard for kiddie fiddlers to find porn on torrent clients.

Re:yes (0)

Anonymous Coward | about a year ago | (#45453763)

I'll take your word for it.

Re:yes (3, Insightful)

gweihir (88907) | about a year ago | (#45454019)

You realize there is a difference between people that harm children, and people that look at pictures of it, right? And that in order to protect children you have to find the first kind in time, and not the second one?

This whole action just gives the appearance of doing something valuable, while it is pretty meaningless for actually stopping abuse.

Re:yes (2, Insightful)

Anonymous Coward | about a year ago | (#45454417)

Children are harmed and continue to be harmed from the moment a photo is snapped of them; they have to go through life not knowing what has become of the pictures. This is why viewing such images is illegal and must be stopped: because it is indeed an ongoing form of abuse, and courts have ruled this way.

Re:yes (-1)

Anonymous Coward | about a year ago | (#45454269)

because it is so hard for kiddie fiddlers to find porn on torrent clients.

Well...YOU would know. Wouldn't you, you sick fuck.

Leave those kids alone already

Well, (3, Funny)

Zanadou (1043400) | about a year ago | (#45453771)

I guess children will have to search for abuse some other way, then.

Just the Start? (5, Interesting)

mrspoonsi (2955715) | about a year ago | (#45453781)

Fair enough, child abuse is universally against the law (unless there are a few countries without such laws on their statute books), but by the same token murder is illegal the whole world over, and I do not see Google bringing up an "illegal search" page if you were to type "how to murder someone". Perhaps it will one day...

Yesterday I was not allowed to take a single photograph of my daughter, who was in a dance competition, to quote: "in case it ends up on the internet". This memory (the dance competition) will be lost now, because it was not recorded. There was even an announcement: make sure all phones and iPads are kept in your pocket/bag. Something seems very wrong with this endless search for the boogeyman.

Re:Just the Start? (3, Interesting)

rioki (1328185) | about a year ago | (#45453831)

How fitting, the current quote:

Do you guys know what you're doing, or are you just hacking?

Re:Just the Start? (0)

Anonymous Coward | about a year ago | (#45454009)

Jeff Goldblum: "Can you really fly this thing?"
Will Smith: "Can you really do all that BS you just said?"

Re:Just the Start? (1)

Anonymous Coward | about a year ago | (#45453881)

Fair enough, child abuse is universally against the law (unless there are a few countries without such laws on their statute books), [...]

Universally? There are countries where it's legal to wed a girl, then less than a year later demand the bride price back because she died in labour... aged 11. So, no. Not by a long shot. That's just what prudish Westerners like to think.

Moreover, I'd not say "fair enough", because it equates "looking at pictures of $crime" with "committing $crime". So anyone who's looked at world press photos is now also guilty of war crimes, then? No? Why the double standard?

[...] something seems very wrong with this endless search for the boogeyman.

At least we're thinking of the children.

Re:Just the Start? (0)

Anonymous Coward | about a year ago | (#45454137)

Universally? There are countries where it's legal to wed a girl, then less than a year later demand the brideprice back because she died in labour... aged 11. So, no. Not by a long shot. That's just what prudish westerners like to think.

Moreover, I'd not say "fair enough", because it equates "looking at pictures of $crime" with "committing $crime". So, anyone who's looked at world press photos is now also guilty of war crimes then? No? Why the double standard?

While I think these blocks are the wrong way of addressing this problem, your comparison is a very poor one.

Unlike with child abuse images, people looking at world press photos don't create a market for war crimes.

Re:Just the Start? (0)

Anonymous Coward | about a year ago | (#45454195)

Unlike child abuse images, people looking at world press photos doesn't create a market for war crimes.

That's not even true for "child abuse images" either. Merely looking has little to no effect.

But really, people's actions are their own. The real problem is the actual rapists, as usual. Prosecuting people for looking at images is disgusting, but I'd expect no less from the "protect the children" people.

Like tears in rain (1)

Anonymous Coward | about a year ago | (#45453915)

A little dramatic, but I'm glad I didn't grow up with the internet the way it is. As it stands, most kids will have their lives plastered all over the internet with no say in what is put there.

Re:Just the Start? (2, Insightful)

Anonymous Coward | about a year ago | (#45453967)

They did that at my niece's dance competition, but guess what? They had photographs and recordings that I could buy at some ridiculous price.

Re:Just the Start? (1)

Chrisq (894406) | about a year ago | (#45453983)

Fair enough, child abuse is universally against the law (unless there are a few countries without such laws on their statue)

Read: most countries with sharia-based law [rawa.org]

Re:Just the Start? (1)

gweihir (88907) | about a year ago | (#45453991)

That sounds like people there are hysterical and have lost all rationality. Even if it ends up on the Internet, so what?

Re:Just the Start? (2)

mrspoonsi (2955715) | about a year ago | (#45454303)

Exactly. A picture of a child dancing in a leotard is not child abuse, except when it is found on a pedophile's computer; then it is classed as such. That creates the problem, because it stigmatizes normal images of children, and yes, I class a photo of a child wearing a swimming costume or dance costume as normal. Should I feel odd taking a photo of my child on a beach? A mother would not, but as a man I am open to suspicion.

Re:Just the Start? (3, Interesting)

drinkypoo (153816) | about a year ago | (#45454063)

Yesterday I was not allowed to take a single photograph of my daughter who was in a dance competition, to quote "in case it ends up on the internet". This memory (dance competition) will be lost now, because it was not recorded.

Are you keeping a scrapbook? One fun thing to do would be to put a MEMORY REDACTED card in it for every event you're not permitted to photograph for some bullshit reason. Hopefully in 40 years you'll be permitted to look at it and shake your head.

Re:Just the Start? (0)

Anonymous Coward | about a year ago | (#45454105)

I have to take issue with the idea that if you don't record something, you're going to lose it. If it was important to you, if it mattered, you will probably remember it. If not, a photograph probably isn't going to do much to help. I also think the camera just acts as a distraction that prevents you from getting into the moment. I've always been an advocate of putting down the camera and being in the moment.

Re:Just the Start? (1)

mrspoonsi (2955715) | about a year ago | (#45454155)

So a 4-year-old will remember this when looking back in 20 years? And I am sure I would not remember it in another 30 years; a photograph would jog the memory, though.

Re:Just the Start? (1)

havana9 (101033) | about a year ago | (#45454115)

There were 'official' photographers? I remember that at similar events the presence of a professional photographer implied that unofficial photographers weren't allowed, and one had to buy the official image. If I know in advance that there's a ban on phones, I'll buy a roll of black-and-white 35 mm film and take my old '70s mechanical camera with its giant, noisy flash lamp. Just for trolling.

Re:Just the Start? (1)

mrspoonsi (2955715) | about a year ago | (#45454207)

Yes there were, outside the dance hall at the entrance, taking photos on a white background. I can do that at home (standing next to a white wall); instead I would have liked a photo of the actual dancing, at the event, during the moment.

Re:Just the Start? (0)

Anonymous Coward | about a year ago | (#45454263)

Yesterday I was not allowed to take a single photograph of my daughter who was in a dance competition, to quote "in case it ends up on the internet". This memory (dance competition) will be lost now, because it was not recorded. There was even an announcement, make sure all Phones and iPads are kept in your pocket / bag, something seems very wrong with this endless search for the boogeyman.

The problem is that the focus has shifted. The focus used to be on the kids who were being hurt, which is what is horrible and should be where the focus lies.

This example where you can't even photograph your own kid at a dance contest is just absurd!

For the sake of discussion, let's say you recorded the dance competition and put it on the internet, and then some perv downloaded it and got turned on by it. Who was hurt? That's right: no one.

I'd much rather see the effort going into catching people who actually hurt kids. This is just the usual "look we are doing something about a problem", without actually doing anything at all.

Political correctness sucks in all shapes and forms!

Re:Just the Start? (0)

Anonymous Coward | about a year ago | (#45454309)

child abuse is universally against the law (unless there are a few countries without such laws on their statue)

Saudi Arabia.

Re:Just the Start? (5, Insightful)

Xest (935314) | about a year ago | (#45454511)

So join the PTA, or put a question to it, demanding the school answer why on earth it's preventing parents from saving memorable moments of their children's upbringing.

If no one questions it, this shit will keep propagating. I'd wager you're not the only parent pissed off about this, and given that the school wouldn't exist without the parents and their kids, it needs to be stamped out.

Google under “child abuse” umbrella is (0)

Anonymous Coward | about a year ago | (#45453849)

Why is this BS? If they really wanted to remove child abuse terms, it would take 50-100, maybe 200 terms; with 100,000 they are removing half of the bloody internet.

Consider that we use only 3,000-5,000 different words in everyday conversation. Yes, all the software titles and movie titles too; it will be advertisement-screened, paid-for content, same as in the yahoo.com news section, with all the "user" reviews produced by paid reviewers.
In the last few months I have noticed Google referring to 2-3-year-old reviews or cnet BS advertisements.

Are there any alternatives? DuckDuckGo or some other engines?

Re:Google under “child abuse” umbrella (1)

Buchenskjoll (762354) | about a year ago | (#45454289)

DuckDuckGo uses Bing data, so I don't think you'll have greater success using that.

Why that sounds useful!: (4, Insightful)

Hartree (191324) | about a year ago | (#45453859)

You could try to get a secret court order that Google wasn't allowed to talk about that made them add noted child pornography search terms like "Edward Snowden" to the list.

Well this is totally not going to be abused. (0)

Anonymous Coward | about a year ago | (#45453901)

Coming to any user-generated content site near you: tiny / encoded links that google for CP terms.
There are already plenty around the place, including some that embed many iframes of government websites with CP-related searches.

Intent is very different from "trolled into clicking things".
All this shit is going to do is cause considerable noise in efforts to actually find people who might need help.
Yes, help. How are people going to find out whether these things are illegal if you get blasted with CHILD PORN IS NASTY YOU SICKO on Google?
They'll probably just close the damn tab, never google for anything again, and be even more paranoid while possibly being abused.
Thanks Google.

Seeing that smug git's face on the news earlier pissed me off too. Claire Perry needs to be sacked.
All she is doing is hindering efforts to help people and hindering efforts to find those causing harm.
Censorship DOES NOT WORK.
Censorship FORCES PEOPLE TO GO AROUND those measures, sending them even deeper into the alleys of society that aren't capable of being monitored.
You fucktards are only making it harder to find abusers, and worse, possibly forcing casual pedos that look at images on random websites OUTSIDE.
Great work, you god damn geniuses. Do you even psychology? Do you even common sense? Oh wait, of course not, Claire just shouts at people until she wins.
I remember I was going to give her a chance — "eh, people blow things out of proportion, maybe she legit wants to solve the problem" — then I saw an interview with her on channel4 news and instantly knew what type she was: the shouting-louder-than-the-opposition-while-they-are-talking type, the "la la la children, pedo supporter" type. How the hell she got into that position of power is beyond me.
Even some die-hard supporters of her views that I know hate her. She is causing way more harm than good.
This shit isn't going to stop, it is only going to get worse.

Depressing job (2, Insightful)

dubdays (410710) | about a year ago | (#45453909)

I'm not going to comment on whether or not this is a good idea. However, I will say that 200 Google employees had to code and test this. That has to be one of the shittiest jobs I can think of. Maybe it could be rewarding in some way, but damn.

Re:Depressing job (0)

Anonymous Coward | about a year ago | (#45453949)

Meh. There's plenty of bad things happening to people every day. Including stuff like getting your family assassinated by a drone strike.

Re:Depressing job (1)

Anonymous Coward | about a year ago | (#45453989)

You mean much like this?

http://tech.slashdot.org/story/12/08/21/2028207/the-worst-job-at-google-a-year-of-watching-terrible-things-on-the-internet

Re:Depressing job (2)

akeeneye (1788292) | about a year ago | (#45454027)

I think a shittier job would be doing computer forensics. You end up having to see this stuff as well as testify about it in court. It would become part of your life, inescapable. I'd given some thought to going into forensics, but the thought of that deterred me; I don't think I could hack it. I've heard it said that there's a great personal reward in locking up the pervs, but it seems to me it would come at a great personal price. I wonder what the suicide rate is in the profession?

Re:Depressing job (0)

Anonymous Coward | about a year ago | (#45454125)

I imagine it's not THAT high. You probably become desensitized to it pretty quickly. I mean, look at 4chan. It churns out users desensitized to all manner of filth.

Better Plan? (0)

Anonymous Coward | about a year ago | (#45453917)

Instead of blocking the searches why don't they just forward the request to the State Police or FBI along with their IP addresses and any other pertinent details they can?

Re:Better Plan? (1)

Chrisq (894406) | about a year ago | (#45453995)

Instead of blocking the searches why don't they just forward the request to the State Police or FBI along with their IP addresses and any other pertinent details they can?

How do you know that they don't?

Helping the paedos cover their tracks (0)

Anonymous Coward | about a year ago | (#45453935)

And ensuring that only the dumb get caught.

Unintended consequences (1)

redelm (54142) | about a year ago | (#45453945)

How is keyword blocking going to help abuse victims find recovery resources? I thought most kiddie-pr0n was on the darknet.

Far more innocents will be hurt than the intended targets.

Re:Unintended consequences (-1)

Anonymous Coward | about a year ago | (#45454157)

If a child is too stupid to search for abuse resources without triggering the filter then they deserve to be raped.

It's a mad mad mad mad world (1)

fatphil (181876) | about a year ago | (#45453999)

Where
    http://www.simpsoncrazy.com/content/pictures/family/HomerStranglesBart1.gif
is blocked, while
    http://www.manowar-collection.de/Manowar1984Poster.jpg
is considered safe.

20 years (2)

zakeria (1031430) | about a year ago | (#45454017)

After using the Internet for 20 years, I have never come across images of child abuse. Not once!... From my understanding, 99% of this stuff is located on darknets.

Little Cuties (0)

Anonymous Coward | about a year ago | (#45454043)

This is going to tank my 'Little Cuties' line of clothing for Seniors.

won't they just use new words? (1)

Anonymous Coward | about a year ago | (#45454095)

Kinda like how people circumvent the Great Firewall of China by using homophones, even if they are complete nonsense.

Over-reach Erasure (1)

Tommi Morre (235789) | about a year ago | (#45454111)

Vitally important as it is to protect children from sexual predators (which of course includes the market for child pornography), I'm concerned that including general child abuse in this will both silence child abuse survivors and make it more difficult for abused children to ask for help or advice anonymously online. I don't think differences in English usage across the world should override this.

Real news is (0)

Anonymous Coward | about a year ago | (#45454147)

They allowed these searches until now.

NOT FAR ENOUGH! (1)

Charliemopps (1157495) | about a year ago | (#45454173)

If they can do this, why not block anti-American speech while we're at it? I mean, those people are terrorists right?

Re:NOT FAR ENOUGH! (1)

Anonymous Coward | about a year ago | (#45454385)

If they can do this, why not block anti-American speech while we're at it? I mean, those people are terrorists right?

That's what is known in the business as "feature creep" and it will happen before you know it.

All of the useful idiots who support this kind of thing swallow the "won't somebody PLEASE think of the children!!!" thing hook, line, and sinker and they gleefully throw their basic human rights out the window.

Feel good move, nothing more (1)

kaizendojo (956951) | about a year ago | (#45454179)

The people who frequent in this kind of material aren't searching for it on the open internet; they're using TOR networks and hidden FTP sites. The real solution is good investigative work, but that requires resources and effort.

Another Failure (0)

Anonymous Coward | about a year ago | (#45454255)

I'm glad they are trying something, but it won't work; it's been done before and failed.

The keywords simply changed and/or database sites were created with direct links. You would need a huge team of dedicated people working 24/7, actively searching for Cporn, just to keep up with the changes. This is very similar to tactics tried in the war on piracy, which have been dismissed as unwinnable.

As many people have said, this is only a half measure; Cporn is a side effect, not the cause.

100,000! (0)

Anonymous Coward | about a year ago | (#45454291)

How is it reasonable to block 100,000 search terms? It would be one thing to block images or videos of such abuse (still questionable, since it makes it harder to find the perpetrators), but it is another thing entirely to block all discussion around the issue. This is very bad for anyone who suffers from this, help groups, etc. If anyone told me today that Google had a "don't be evil" motto, I wouldn't believe it.
And a 20 percent drop in illicit activity? How was that measured? By Google searches? How many children stopped being abused because of that measure?

The next thing will be arresting everybody who searches for these unknown terms and charging people with attempted illegal searching.

Only 100,000? (0)

Anonymous Coward | about a year ago | (#45454395)

Either be 100% content-neutral like in the old days, or discard known-to-be-illegal pages from the index and report them to the authorities as soon as they are detected.

Unless the images themselves are stripped from the index for **any** search term, people can still find them.

Gonna get sued (1)

waspleg (316038) | about a year ago | (#45454441)

They're baiting the MPAA/RIAA by doing this. They're going to get sued by every agency that doesn't want something found, be it torrents or unpopular political views. Slippery slope, ready for action.

Stupid and shortsighted (1)

DrXym (126579) | about a year ago | (#45454443)

A vastly better idea would be to allow these search terms through, monitor which images / sites were subsequently clicked on, and then provide this information, along with IP logs, to the relevant law enforcement agencies. In other words, let these freaks hang themselves with their own rope. The most likely consequence of banning these terms is that child porn will be driven underground, onto Tor servers and so forth, where it is far more difficult to monitor.

1984... (1)

MrKaos (858439) | about a year ago | (#45454445)

Sex Crime!! Sex Crime!!

I predict that...^2 (1)

J'raxis (248192) | about a year ago | (#45454461)

Wow, two for two [slashdot.org] today, eh?

I predict that this will be about as successful as all other attempts to censor information have been. But don't let that stop you. At least you look like you're "doing something", just like the fool politicians in the other story, right?

Statistics (2)

Walterk (124748) | about a year ago | (#45454471)

This is likely to be hugely ineffectual, as the actual numbers point to a rather different typical abuser:

In the United States, approximately 15% to 25% of women and 5% to 15% of men were sexually abused when they were children.[33][34][35][36][37] Most sexual abuse offenders are acquainted with their victims; approximately 30% are relatives of the child, most often brothers, fathers, mothers, uncles or cousins; around 60% are other acquaintances such as friends of the family, babysitters, or neighbours; strangers are the offenders in approximately 10% of child sexual abuse cases.[33] In over one-third of cases, the perpetrator is also a minor.[38]

From: Wikipedia [wikipedia.org]

So what is this actually supposed to accomplish apart from censorship? What sort of "unsavoury" things are in this list of 100k search terms that are not even illegal? Snowden perhaps?

iffy (0)

Anonymous Coward | about a year ago | (#45454477)

100,000 terms. That seems like there's gonna be a whole lot of overlap.

As time goes by, I can't help but think Google is getting to be a government's wet dream. It's an incredible tool for obliquely shaping a population through the creation of a censored information bubble. Paranoia, huh. Just a rumination.

A fairly ignorant question on my part, but how much of a problem is CP, actually? What kinds of figures are we talking about? Because from the sound of it, tens to hundreds of millions of children are abused through CP each year.
