
DoJ Following Porn Blocker Advances?

Zonk posted more than 8 years ago | from the dirty-little-program dept.


GreedyCapitalist writes "A new filter called iShield is able to recognize porn images based on the content of the image (other filters look at URLs and text) and according to PC Magazine, it is very effective. The next generation will probably be even better -- which highlights the retarding effect regulation has on technological progress - if we relied solely on government to ban 'inappropriate' content from the web, we'd never know what solutions the market might come up with. Will the DOJ (which argues that porn filters don't work) take note of filtering innovation or continue its quest for censorship?"


265 comments


Reversal (4, Funny)

orangeguru (411012) | more than 8 years ago | (#14955803)

Since it's so good at identifying pr0n, I can't wait to get my first pornbot with that function to find me some more.

Re:Reversal (1, Redundant)

bhima (46039) | more than 8 years ago | (#14955826)

Man, if it's good enough to find the sorts I like with "actresses" I like... sign me up!

Re:Reversal (5, Funny)

Anonymous Coward | more than 8 years ago | (#14955833)

If you're having trouble finding porn on the Internet you're doing something wrong...

Re:Reversal (0)

Anonymous Coward | more than 8 years ago | (#14955949)

If you're having trouble finding porn on the Internet you're doing something wrong...

True. But it would be nice if the bot could filter the 'good' pr0n (the stuff I like) from the crap pr0n (the stuff I don't like).

Re:Reversal (2, Funny)

aichpvee (631243) | more than 8 years ago | (#14955978)

Maybe it could syndicate the porn via RSS after collecting it. And before anyone gets any ideas, PornCast(tm) is now officially a trademark of me. I will be submitting my patent on the iPorn Portable Pornography Player shortly.

Re:Reversal (1)

Kjella (173770) | more than 8 years ago | (#14956030)

I think the problem is that even if a filter is installed, if you're having trouble finding porn on the Internet you're still doing something wrong...

Re:Reversal (0)

Anonymous Coward | more than 8 years ago | (#14956011)

Keyboard protectors for staff: $200
Hand lotion for each desk: $150
Cubicle privacy shields: $800

Programmers getting paid to surf p0rn: Priceless

What Is The Story here? (5, Insightful)

Anonymous Coward | more than 8 years ago | (#14955808)

I see nothing in this article suggesting that the DOJ is about to do anything. This is just a review of a product that can block some images, which would be useful for some families.

I don't understand why this summary has to bring the government into this or speculate that they might do something. There's no evidence of impending censorship, no political issues at work here. It's just a review of a product. Why does Zonk continually try to troll politics on slashdot? He's becoming worse than Michael ever was.

Re:What Is The Story here? (1, Interesting)

Spacejock (727523) | more than 8 years ago | (#14955908)

I think they're just reasoning that when there's a market, private enterprise will always get in before government. Instead of demanding the government DO something about internet porn, parents can now spend a few bucks and do something themselves.
On a related topic, I'm still amazed that introducing a .xxx domain for porn is considered a violation of free speech/human rights/whatever. Speaking as someone who maintains the web filters for a local primary school: just get on with it so we can fence off that part of the web. Please. Right now the filters we're using are so restrictive they block a lot of useful sites. Yes, I whitelist them as required, but it's still a PITA.

Re:What Is The Story here? (2, Insightful)

quintesse (654840) | more than 8 years ago | (#14955927)

Oh sure.... private enterprises are sooo well known for being well-behaved and doing what is good for us and the whole of mankind. Down with government! Who needs them anyways?

You must be American

Re:What Is The Story here? (5, Insightful)

hcdejong (561314) | more than 8 years ago | (#14955995)

there are several problems with a .xxx domain:
- you'd have to get every country in the world to go along with this
- how would you decide if a site needs a .xxx domain? There are lots of edge cases. Would collegehumor.com qualify?
- you'd have to create an 'internet police' to enforce compliance

Re:What Is The Story here? (1)

Tim C (15259) | more than 8 years ago | (#14956012)

.xxx won't work, unless you can persuade every country with a TLD of their own to force their pornographers to move out of .co.uk, .co.jp, or whatever and into .xxx.

Re:What Is The Story here? (1)

bleppie (129980) | more than 8 years ago | (#14956018)

Bump the parent. I especially love this little unnecessary quote:

      "...highlights the retarding effect regulation has on technological progress"

I would moderate that article as a troll.

Re:What Is The Story here? (1)

gavri (663286) | more than 8 years ago | (#14956029)

The speculation about what the Government might do was from the original submitter, not Zonk. Maybe Zonk should have edited it though.

Screaming so loud we can't hear you anymore (5, Interesting)

BadAnalogyGuy (945258) | more than 8 years ago | (#14955809)

There is no mention of the DOJ anywhere in the articles you posted. [storyarts.org]

But according to the article, it works well and doesn't filter out health-related websites. It also doesn't work for black and white images, but the majority of online porn isn't b&w. Or so I've heard.

Re:Screaming so loud we can't hear you anymore (1)

fatduck (961824) | more than 8 years ago | (#14955815)

The real question is whether "the government that cried porn" will ever take a step back and wonder why they're trying to censor pornography to begin with?

Re:Screaming so loud we can't hear you anymore (1)

bulldogzerofive (947922) | more than 8 years ago | (#14955819)

There's no claim in the synopsis that the DoJ is mentioned in the article.

Re:Screaming so loud we can't hear you anymore (0)

Anonymous Coward | more than 8 years ago | (#14955820)

......or seen.

Re:Screaming so loud we can't hear you anymore (1)

leuk_he (194174) | more than 8 years ago | (#14955861)

But would you really believe all the claims of such a company? There are also companies that claimed they could identify music just by listening to it, but to this day no company has built a good p2p program with that in it.

Now there is a company that claims it can categorize porn (hmm, try explaining to your wife/virtual gf: I work for a company that categorizes porn 8) ). They would love to sell it to a government, and might sell it to some nazi filtering company (they always filter just good enough, but only 95% accurate). But would you believe them?

What about sepia? ;) (1)

TheScienceKid (611371) | more than 8 years ago | (#14955888)

*chortle*

hmm (4, Insightful)

DrSkwid (118965) | more than 8 years ago | (#14955811)

So does it filter out Rubens?

Would Michelangelo's David be filtered out?

How about anatomy/autopsy pictures?

I would RTFA but it is 404, perhaps my ISP filters out stories about filtering.

Re:hmm (2, Funny)

spiritraveller (641174) | more than 8 years ago | (#14955847)

At some point, the computer would have to decide what is arousing and what is not.

Could give HAL-9000 a whole new outlook.

Re:hmm (1)

quintesse (654840) | more than 8 years ago | (#14955942)

Well, it would have to be able to read minds, because there are things that would arouse one person and not another, and vice versa. I think Rubens is positively erotic ;-)

Re:hmm (4, Informative)

Jugalator (259273) | more than 8 years ago | (#14955853)

So does it filter out Rubens?
Would Michelangelo's David be filtered out?
How about anatomy/autopsy pictures?


This excerpt answers these pretty well:
A Google Images search on "breast self-examination" was correctly allowed. On a page dedicated to the artistic nudes of Alberto Vargas, it inexplicably decided to tile the text-only links menu with hundreds of tiny shield images; Guardware confirmed this is a bug.

So it's business as usual. If PC Mag's quick checks revealed innocent sites being blocked, I hope this never sees the light of day as anything with mandatory use anywhere. I think failing to spread information is worse than actually showing human intercourse. Yes, even if there's a vagina there. I hope the kids aren't traumatized for life if they stumble over such things and the dirtiness of our anatomy.

Oh, also watch out for the new Pumpk1n Pr0n:
And we found that some oddly innocent images -- in particular, "head shots" of pumpkins from last Halloween -- were blocked.

The article says IE would crash more with this tool in use too, but I'm not sure anyone would notice the difference from before and after. ;-)

I would RTFA but it is 404, perhaps my ISP filters out stories about filtering.

Just use the Mirrordot version [mirrordot.org] .

Re:hmm (0)

Anonymous Coward | more than 8 years ago | (#14955893)

I hope the kids aren't traumatized for life if they'd stumble over such things and the dirtiness of our anatomy.

Nothing dirty about our anatomy. Nor sex. Just the way it's often portrayed. I.e. kids seeing pictures of intercourse is not super inappropriate, but maybe we could save the pictures of people getting shit on 'til a bit later?

Re:hmm (3, Insightful)

geoff lane (93738) | more than 8 years ago | (#14955911)

I'm stunned that a bit of software can both read and understand the law and interpret it exactly as a real judge would.

Why isn't this amazing AI advance being reported?

Re:hmm (1)

swillden (191260) | more than 8 years ago | (#14955999)

I'm stunned that a bit of software can both read and understand the law and interpret it exactly as a real judge would.

I'm not at all stunned that a bit of software can interpret the rules exactly as some real judge would. There aren't many of them, fortunately, but there are some idiot judges out there.

our favorite (2, Funny)

gEvil (beta) (945888) | more than 8 years ago | (#14955813)

Does it work on everyone's favorite image, hello.jpg?

Re:our favorite (4, Funny)

ettlz (639203) | more than 8 years ago | (#14955873)

$ ishield hello.jpg
Segmentation fault (core dumped)

Inconceivable! (4, Funny)

rkcallaghan (858110) | more than 8 years ago | (#14955887)

Does it work on everyone's favorite image, hello.jpg?

A goatse reference that is helpful and useful? Inconceivable!

I would buy this software if it could filter me from seeing that ever again. (I jest, but only slightly)

~Rebecca

Re:Inconceivable! (0)

Anonymous Coward | more than 8 years ago | (#14955961)

The best way to remove Goatse from your mind is to find a different, more troubling image to replace it with -- just as you forget about a splinter if you break your arm :).

May I suggest MeatSpin [meatspin.com] ? Or Tubgirl [encycloped...matica.com] ?

Re:Inconceivable! (3, Funny)

ettlz (639203) | more than 8 years ago | (#14956017)

The best way to remove Goatse from your mind is to find a different, more troubling image to replace it with
Like this one [google.com] .

Re:Inconceivable! (0)

Anonymous Coward | more than 8 years ago | (#14956041)

Who'd ever think we'd be happy to see the Hoff!

Re:Inconceivable! (1)

ettlz (639203) | more than 8 years ago | (#14956056)

Who'd ever think we'd be happy to see the Hoff!
Speak for yourself, gangsta!

Re:Inconceivable! (5, Funny)

ettlz (639203) | more than 8 years ago | (#14956036)

I would buy this software if it could filter me from seeing that ever again.

Or you could train your mind, as I have. My occipital lobe no longer processes Goatse as it "should", and substitutes a fuzzy blur in its place. I literally cannot see it!

Unfortunately, this is not without side effects. For instance, it's no longer safe for me to drive through tunnels.

Which, in turn... (5, Insightful)

kahei (466208) | more than 8 years ago | (#14955814)

which highlights the retarding effect regulation has on technological progress

In other news, today I successfully opened a can of Diet Coke -- which highlights the retarding effect regulation has on quenching thirst. Man, if I'd waited for the government to open that can for me, I'd still be thirsty now!

If only there were a more effective way to highlight the retarding effect that obsessing over the complete works of Ayn Rand has on independent thought...

Re:Which, in turn... (2, Insightful)

cvmvision (245679) | more than 8 years ago | (#14955964)

"In other news, today I successfully opened a can of Diet Coke -- which highlights the retarding effect regulation has on quenching thirst. Man, if I'd waited for the government to open that can for me, I'd still be thirsty now!"

Yet many expect government to be that first line of defense against the "undesirable" and refuse to help themselves. Of course, after so many years of public "education", this shouldn't be a surprise.

I don;t get it. (2, Insightful)

freedom_india (780002) | more than 8 years ago | (#14955816)

I simply don't get it.

First we shout at the Govt. to get off our backs on this issue, and when they then fail to come up with any solutions (because we told them NOT to), we wham them for not guiding us or providing any solution.

What a load of cr*p !

On one hand we shout about the ineffectiveness of the Govt's first real action in decades to counteract this problem (via Yahoo, MSN and Google searches), and then we shout at them for NOT providing a solution at all.

You tie both my hands behind my back, then you blame me for not shooting at the thief !

It's the Slashdot Fallacy ... (5, Insightful)

rkcallaghan (858110) | more than 8 years ago | (#14955877)

I simply don't get it.

... and you fell for it.

First we shout the Govt. to get Off our backs on this issue, and when they actually fail to come up with any solutions (because we told them NOT to), we wham them for not guiding us/providing us with any solution.

You are failing to realize that the same person is not talking in both cases. Also, while Slashdot as a whole leans to the left, the same issue can have articles written by, and about, people on both sides. The only thing that is happening here is that someone thought a discussion about image-identification software and its future impact on us would be a good thread, and here we are.

You tie both my hands behind my back, then you blame me for not shooting at the thief!

The fallacy lies in missing that the tied-hands speaker is not the same speaker as the one doing the blaming.

Make more sense now?

~Rebecca

Re:It's the Slashdot Fallacy ... (1)

CptPicard (680154) | more than 8 years ago | (#14955993)

Actually it's not... the favourite tactic of right-wingers worldwide these days, in their quest to dismantle the public sphere, is to complain that it doesn't work, and play along just far enough to be allowed into decision-making, which lets them essentially sabotage whatever they are involved in. Then they get a nice good reason to complain even more that it doesn't work and therefore needs to be taken down.

Most modern left-wing policies of the western world work just fine, but they require broad commitment. You can't trust right-wingers in government because they are not in it bona fide.

Re:I don;t get it. (2, Insightful)

TapeCutter (624760) | more than 8 years ago | (#14955967)

"You tie both my hands behind my back, then you blame me for not shooting at the thief !"

You think it's a character flaw not to kill for property?

Re:I don;t get it. (0)

Anonymous Coward | more than 8 years ago | (#14956046)

I do.

Re:I don;t get it. (4, Informative)

Secrity (742221) | more than 8 years ago | (#14955991)

There are two separate and distinct "solutions" for people who have issues with porn. The first "solution" is government censorship of the internet. The second "solution" involves local filtering installed by the computer owner, and there are at least two flavors of this "solution". There are bastard situations where various non-federal governments (including libraries) own the computer or the network, which get REAL complicated. There are also situations where ISPs and networks censor access.

Government Censorship: There are wing nuts who want the US government to censor the internet, usually with cries of "think of the children" or "help fight terrorism". People who know how the internet works generally realize that this is a stupid "solution".

Local Filtering: There are several different ways that this can be done, and all of the currently available local filtering "solutions" have problems. TFA was about a new local filtering scheme, which COULD be better than the existing methods.

Local filtering vs. government censorship is, I think, where you see the contradiction. It really isn't a contradiction for people to say NO to government censorship (including local filtering in public libraries) and to also have some of the same people wanting the government to get involved in improving local filtering technologies.

If it wasn't for porn on the internet, war, gay marriage, and abortion, you couldn't get anybody to go to the polls.

I won't believe it... (0)

Anonymous Coward | more than 8 years ago | (#14955821)

...without any proof of concept.

And also, would it be possible for it to filter out everything but porn? Would make my surfing a lot more pleasant.

Backwards.. (4, Funny)

onion2k (203094) | more than 8 years ago | (#14955824)

Can I run it backwards and filter out everything that isn't porn? I'd find that more .. useful.

How so? (1)

Opportunist (166417) | more than 8 years ago | (#14955986)

Filtering out anything that isn't porn on the net is like trying to filter your Mailbox for anything that isn't spam.

In other words, something you could easily do by hand. Unless that hand is busy because of the porn, of course. :)

What A Sick Piece Of Software (0)

Anonymous Coward | more than 8 years ago | (#14955827)

I am disgusted by both the type of company that would come up with such a disturbing piece of software and by the people who desire to wield such tools.

You have to be seriously fucked in the head to want to use or force others to use something like this. There should be an automatic disqualification of the legal right to be a parent if you use this garbage.

windows only (1)

skynare (777361) | more than 8 years ago | (#14955828)

so, kids using other operating systems don't need protection.

Re:windows only (1)

bulldogzerofive (947922) | more than 8 years ago | (#14955836)

Kids using Linux probably know what knoppix is. Kids using Mac are just screwed.

False Positives (4, Insightful)

michaelhood (667393) | more than 8 years ago | (#14955831)

This thing will be ruined with false positives. Swimsuit photos, maybe pictures of animals (similar color tones), etc.

This won't go anywhere for a long time, until image recognition technology catches up.

Re:False Positives (1)

No Salvation (914727) | more than 8 years ago | (#14955846)

maybe pictures of animals (similar color tones)
Yeah, NAKED animals. Do YOU want naked alpacas exposing themselves to YOUR children? I think not.

Re:False Positives (1)

1u3hr (530656) | more than 8 years ago | (#14955850)

This thing will be ruined with false positives.

RTFA. "It didn't block department-store lingerie ads but covered up a few scantily clad models at the Victoria's Secret site. A Google Images search on "breast self-examination" was correctly allowed."

But they also say: "your tech-savvy teenager may attempt to evade this monitoring by terminating iShield....Without the password, you just can't turn it off." Right. Unless he reboots to a Knoppix CD, for instance. But basically if you don't want porn, this seems like a good way to manage that.

Re:False Positives (1)

Jugalator (259273) | more than 8 years ago | (#14955871)

RTFA. "It didn't block department-store lingerie ads but covered up a few scantily clad models at the Victoria's Secret site.

Since Victoria's Secret isn't a porn site, that's even more evidence right there that it doesn't work.

Re:False Positives (4, Interesting)

anthony_dipierro (543308) | more than 8 years ago | (#14955915)

This won't go anywhere for a long time, until image recognition technology catches up.

Even then, one person's "porn" is another's "art". Even a human can't correctly distinguish offensive vs. non-offensive content with all that much accuracy. (This is besides the fact that around the same time as image recognition technology catches up computers will have overtaken the world and we'll be following their rules rather than our own.)

why ? (0)

Anonymous Coward | more than 8 years ago | (#14955834)

why would anyone wanna block pr0n ?

Need programmers? (5, Funny)

lemonjus2 (939285) | more than 8 years ago | (#14955835)

I wonder how many hours the poor programmers worked in order to test this thing :)

Looking for porn that the filter can't handle...

What those meetings must have looked like.

Dev Meetings (3, Funny)

TiggertheMad (556308) | more than 8 years ago | (#14955883)

Manager: "All right team, looks like Joe has finally come up with a fast and fairly accurate algorithm to spot those dirty old pornographic images. We will need to test it a bit first, to see what the signal-to-noise ratio is. We will need some test groups, though."

"Yeah, sure bob, you can run the 'barely legal college girls' tests. Janet and Simone, you check the 'hot lesbian' batches. What? Sure Ramone, you can check the 'young gay studs' test. Now, who is going to run the 'goatse.cx' tests?"

"Guys....? Anyone?"

Errors abound (4, Interesting)

michaelhood (667393) | more than 8 years ago | (#14955839)

FTA: And we found that some oddly innocent images--in particular, "head shots" of pumpkins from last Halloween--were blocked. But overall, [it did a good job] of blocking the images you'd want blocked.

This thing won't be deployed en masse with problems like that... it quickly becomes uneconomical for admins to whitelist pictures of pumpkins.

Re:Errors abound (0)

Anonymous Coward | more than 8 years ago | (#14955859)

This is as dimwitted as saying spam filters won't be deployed en masse because of a few false positives.

And to your previous stupid comment, TFA said department store lingerie passed, but a few Victoria Secret items didn't. So swimsuits should pass, unless they're on the risque side of things. In which case, if you don't want junior seeing pr0n, you probably also don't want him seeing th0ng either!

Re:Errors abound (3, Insightful)

BeardsmoreA (951706) | more than 8 years ago | (#14955862)

But how many employees will come to their BOFH complaining that they couldn't look at their neighbours' Halloween photos? On their work machine? In work time? Irritating if you're the employee, but not likely to keep employers awake at night, I'd have thought. Let's be honest, 90% of most employees' work surfing is probably less than work related, and if you really do have a job that involves looking at pictures online a lot, you're probably a prime candidate for being whitelisted from the whole thing.

OTOH, for something like a home machine that you wanted to configure for keeping the kiddies safe, yes, this might not be a great solution yet.

Re:Errors abound (0)

Anonymous Coward | more than 8 years ago | (#14955916)

This thing won't be deployed en masse with problems like that.. it quickly becomes uneconomical for admins to be whitelisting pictures of pumpkins.

Oh, you don't put it in block mode, just have it set to report mode and then periodically review the content to see who you can fire for browsing porn. That's how we use a similar product at work. /totally against it, but I'm not the decision maker

Re:Errors abound (1)

LordKronos (470910) | more than 8 years ago | (#14955981)

I clicked the link in your sig. Just when you think you've seen it all...chainmail lingerie?

Encryption (1)

Threni (635302) | more than 8 years ago | (#14955844)

The DOJ is going to go crazy when they learn about encryption!

Marality and AI (1, Insightful)

PrinceAshitaka (562972) | more than 8 years ago | (#14955849)

So what they are really doing is trying to teach an AI morality? Does anybody know how they do this. What is the difference between a nipple and a cherry (the fruit) to a computer. In some point in the future will are goverment be able to make computeres see thier motrality and then tell them to go enforce it?

Re:Marality and AI (1, Informative)

PrinceAshitaka (562972) | more than 8 years ago | (#14955856)

somebody mod me down for not knowing how to spell, My post is so embarassing.

Re:Marality and AI (0)

Anonymous Coward | more than 8 years ago | (#14955935)

Do not be afraid of your mistakes.
The message got through regardless :)

Re:Marality and AI (0)

Anonymous Coward | more than 8 years ago | (#14956009)

That'd be embarrassing ;)

You know, if you don't point these things out, no one would have cared.

Re:Marality and AI (2, Insightful)

Jugalator (259273) | more than 8 years ago | (#14955858)

Does anybody know how they do this. What is the difference between a nipple and a cherry (the fruit) to a computer.

And more interestingly, what's the difference between a nipple in a nudist shot and one in porn?
Nudism wasn't illegal in any modern country I know.

There are plenty of even less grey-area cases like these that would be problematic, as mentioned by a poster above. Art, both paintings and photography, etc. If we simply forbid the human body for religious reasons or whatever, isn't that admitting Satan got what he wanted?

Re:Marality and AI (1)

Jugalator (259273) | more than 8 years ago | (#14955869)

Nudism wasn't illegal in any modern country I know.

Sorry, I should've written "modern and democratic country".
Those were the ones I was thinking of, not well-industrialized and "modern" dictatorships etc. ;-)

Re:Marality and AI (1)

dJOEK (66178) | more than 8 years ago | (#14955880)

Then again, if you're an employer there's not much difference between an employee watching pr0n and an employee watching nudists.
unless you're, say, a travel agency and offer trips to those resorts

I've seen something similar before (2, Informative)

NigelJohnstone (242811) | more than 8 years ago | (#14955854)

From 2001:

http://www.isp-planet.com/news/2001/messagelabs_011126.html [isp-planet.com]

"SkyScan AP uses Image Composition Software (ICA), which decomposes an image," White explained. "It runs 22,000 algorithms and in addition to skin tone textures, it can decipher porn through other features such as facial expressions."

In practice these tools are simply filtering by URL, then by colour gamut analysis.
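For what it's worth, the colour-gamut step the parent describes can be sketched in a few lines. The RGB skin-tone rule and the threshold below are invented for illustration, not taken from any of these products:

```c
#include <assert.h>
#include <stddef.h>

/* Hypothetical skin-tone heuristic: count pixels that fall in a crude
 * RGB "skin" range and flag the image when the ratio passes a
 * threshold. Real products layer URL lists and further analysis on
 * top of something like this. */
typedef struct { unsigned char r, g, b; } Pixel;

static int is_skin_tone(Pixel p) {
    /* Rough rule of thumb: red dominant, some green, clearly more
     * red than blue. */
    return p.r > 95 && p.g > 40 && p.b > 20 &&
           p.r > p.g && p.r > p.b && (p.r - p.b) > 15;
}

/* Returns 1 when more than `threshold` of the pixels look like skin. */
int looks_suspect(const Pixel *img, size_t n, double threshold) {
    size_t skin = 0;
    for (size_t i = 0; i < n; i++)
        if (is_skin_tone(img[i])) skin++;
    return n > 0 && (double)skin / (double)n > threshold;
}
```

Under a rule like this, anything in a warm orange-to-brown range can count as "skin" -- which is presumably how pumpkin head shots end up blocked.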

Re:I've seen something similar before (0)

Anonymous Coward | more than 8 years ago | (#14955865)

"It runs 22,000 algorithms and in addition to skin tone textures, it can decipher porn through other features such as facial expressions."

Aha - the Goatse Man is safe then. He is pulling a face of sorts though..

So much for animal rights (3, Funny)

alx5000 (896642) | more than 8 years ago | (#14955876)

Nobody seems to care about the 10^5 monkeys that check every image, thus making the filter work... Poor perverted animals...

This is not new... (1)

lennart78 (515598) | more than 8 years ago | (#14955879)

I remember Clearswift (or whatever they were called back in the day, or are now...) had a PornSweeper product years ago, which judged images based on skin tones. Things have probably gotten a bit more sophisticated nowadays, but the principle remains the same. And there will still be a lot of false positives. Plus, people will quickly find a way to overcome these kinds of filters.

All in all however, I'd rather see these kinds of initiatives than a governmental crusade against online porn, which would not only be doomed from the start, but also give a government too much control over the Internet, or the idea that they (should) have that kind of control.

Re:This is not new... (0)

Anonymous Coward | more than 8 years ago | (#14955963)

So it only works on porn with white people? Or naughty bits the color of white people's naughty bits? I guess we're going to see a lot of old National Geographic issues slipping through the filters.

What is porn? (3, Funny)

houghi (78078) | more than 8 years ago | (#14955881)

Is this cameltoe [bigfun.be] porn, or is it only porn if you get excited watching it?

Why didn't I think of this? (2, Funny)

Zapman (2662) | more than 8 years ago | (#14955903)

if (percent_pink_pixels(image) >= 70)
      flag_as_porn(image);

Step1: use silly algorithm
Step2: ...
Step3: PROFIT!

Dream job (1)

porneL (674499) | more than 8 years ago | (#14955925)

Being paid to surf pr0n at work... oh, wait.

Re:Why didn't I think of this? (0)

Anonymous Coward | more than 8 years ago | (#14956016)

What about blacks, asians?

My thoughts (1)

Antony-Kyre (807195) | more than 8 years ago | (#14955917)

One, I would hope that the issue of children viewing pornography is more of a state issue than a federal government issue. I don't believe it fits in with the general welfare clause. However, they can legislate all they want for the 10-mile federal district.

I don't want the federal government getting involved with this. This is just censorship towards minors. Minors should be able to view what they please, but parents should be the ones responsible for stopping them from viewing things they don't wish for their young ones to view.

The only thing the federal government should be helping with in regards with pornography is giving grants to state governments to help stop child pornography, and by that I mean pornography that actually has real life minors in it.

Moderate me down if you wish, I just had to voice my opinion.

One more thing... (1)

Antony-Kyre (807195) | more than 8 years ago | (#14955924)

I don't know how many popular browsers, if any, have this option, but what if there were a way to disable all image content? If all image content was disabled, wouldn't that solve all the problems of visual pornography?

what is the problem with .xxx domains (0)

Anonymous Coward | more than 8 years ago | (#14955918)

I don't know what the problem was with the .xxx domain idea.
If you want to, search Google with site:.xxx, and if you don't, then block .xxx!
Simple and effective.

Re:what is the problem with .xxx domains (2, Insightful)

cvmvision (245679) | more than 8 years ago | (#14955983)

Personally I'm offended by stupidity propagated on the Internet. I'd like to see a new top level domain .stupid for these domains. Google would be so much easier to use then.

Solution? (2, Insightful)

BoxedFlame (231097) | more than 8 years ago | (#14955919)

For there to be a solution, there has to be a problem. I don't see a problem, except moral panic and one group's willingness to impose its sense of morality on everybody else.

Hmm.. (2, Funny)

bigattichouse (527527) | more than 8 years ago | (#14955921)

So if I build a website for close-up shots of orchids, will it get banned?

My work so far... (4, Funny)

OverflowingBitBucket (464177) | more than 8 years ago | (#14955923)

I've developed a simple algorithm for checking web pages for pornographic content. It is roughly 98% accurate when fed a random page from the 'net. Here's the code so far:

#include <stdbool.h>

bool check_porn_content(const char *url)
{
    (void)url;   /* the URL is irrelevant */
    return true;
}

Any suggestions for further development, or licensing queries, please let me know.

Leave the Government out of this, thank you. (2, Insightful)

Krystlih (543841) | more than 8 years ago | (#14955932)

First of all, I believe it is NOT the government's job to tell us what we can and cannot see. Whatever the topic, the government should take no part in forcing its citizens to look at one thing rather than another. Secondly, we as adults and responsible human beings need to start taking responsibility for things instead of waiting for father government to step in and tell us how to think. It is YOUR responsibility as an adult to view what you want to, and if you come across something offensive, how hard is it to hit the 'back' button on your browser? If you have children, it is still YOUR responsibility to censor what you find offensive so your children do not run into it.

Ugh, the more we fall into this mentality of relying on our government, the more we let our freedoms and rights slip through our fingers. Please, people, start thinking for yourselves, and don't be afraid of public opinion or the government's.

As someone with some idea of all of this... (1)

NoMoreNicksLeft (516230) | more than 8 years ago | (#14955934)

It's pretty dumb. Every previous claim of this functionality has turned out to be skintone detection.

Machine vision can't even begin to start on this until you actually know what porn is, from an ontological point of view (a problem I've mostly nailed). Even then, the recognition algorithms for such won't be written for many years, and won't run on reasonable hardware for years after that. It's pretty dumb if you ask me.

Not many of you... (4, Interesting)

hdparm (575302) | more than 8 years ago | (#14955940)

...know about what happened [nzherald.co.nz] to Bryce Coad of Zombie Linux [zombie.net.nz] almost 4 years ago. Whether his explanation was in fact true, I don't know. But obviously, some people thought about this a long time ago.

Re:Not many of you... (1)

joe545 (871599) | more than 8 years ago | (#14955971)

The article just used the vague term "sexually exploited children" when referencing some of the "child porn" images that were found. Obviously, to test his filter he would need some photos of children in a state of undress (although photos where the child is being physically molested would surely not be needed). If it was for these "state of undress" photos that he was convicted, do the NZ authorities not realise that they are effectively barring the development of filters designed to protect children? It seems counter-productive to me.

Why? (4, Informative)

BenjyD (316700) | more than 8 years ago | (#14955944)

Your 6-year-old may mistype his favorite cartoon's URL and wind up at a porn site; a 16-year-old may reach the same site deliberately

Why should the sixteen year old be stopped from looking at porn? He's over the age of consent, what's wrong with letting him look at some naked women? He's probably thinking about sex all the time anyway, that's just what teenagers do.

would it filter breast cancer images? (1)

joe545 (871599) | more than 8 years ago | (#14955945)

How will this bot be able to distinguish between images that show women how to examine their breasts for possibly cancerous lumps and images that just show breasts? I can't see how this bot could ever be used on library/public computers if such valuable (and potentially life-saving) content is censored.

A History of Violence (4, Insightful)

digitaldc (879047) | more than 8 years ago | (#14955952)

Funny how they make very effective filters for pr0n, but violence is AOK.
You can bomb, shoot, maim every night on the nightly news, but God forbid you show a naked breast...people might be harmed!
There are hypocritical cultural 'norms' in the USA.

Nothing Earthshaking (1)

Eadwacer (722852) | more than 8 years ago | (#14955960)

A year or so ago, somebody claimed to be using artificial neural networks to do the same thing. As I recall, it turned out that all they were doing was running against the URLs. Having RTFA'd on this one, I see nothing there that would indicate this bunch is doing anything other than a slightly more sophisticated categorization by URL and then scanning the uncertain ones for skin tones. If you can live with a high false positive rate, this might be acceptable -- you might like it at home, but it probably couldn't be used by a library.

Patent Opportunity (1)

Coeurderoy (717228) | more than 8 years ago | (#14955976)

I'm patenting the following plug-in: take a photo, cut it into little pieces, change the skin tones to vivid green, make some simple random transformations, and add a button: "remove fig-leaf".

If you see this, the user just has to hit the remove-fig-leaf button; each element of the photo is unscrambled locally (with Java or maybe JavaScript), and the various vignettes are shown ordered and side-by-side for the enjoyment of people at work and other people who could potentially be minors in their jurisdiction, but clicked on the link saying they swear (a lot, hum, no I mean) that they are "of age".

I'm also patenting the process that enables an operator to distribute unlock codes for this "humanity progress generating" technology through an informal high-school network, in exchange for MSN tags and e-mail addresses that the operator can use later for absolutely legal purposes (at least legal in Transdniestria, the sovereign republic of Turkish Cyprus, and some interesting parts of Laos).

Unfortunately, I'm against business methods and software patents, so I guess I'll have to find another way to become rich beyond all dreams of avarice.

(excluding savings of course, that wouldn't be in the fun category :-))

                [ps]

--------------
This sig is a temporary place holder in case I ever decide that I need one.

How could this work... (1)

Hosiah (849792) | more than 8 years ago | (#14956014)

on the same Internet where a picture of slightly distorted text can defeat a script, like with a captcha?

Brilliant (1)

cyfer2000 (548592) | more than 8 years ago | (#14956022)

The machine has learned to watch pOrn; when will it learn to laugh at adult jokes?

Doesn't work (2, Informative)

laptop006 (37721) | more than 8 years ago | (#14956027)

Here's a great review of a previous generation of this kind of thing.

http://dansdata.com/pornsweeper.htm [dansdata.com]

There's been research on avenues like this for IR (1)

backslashdot (95548) | more than 8 years ago | (#14956039)

Off-topic, but related to software that can recognize content .. oh yeah, I'll tie it into law enforcement too..

Supposedly the next version of Mac OS will have at minimum OCR built in, meaning it will scan images for words and put them in a searchable index. Eventually (or maybe in the next release) it may do facial recognition too, or even scenery recognition (beach, house in background, night versus day, etc.). Maybe it can learn or be taught which house is whose, so even that can be automagically inferred. Scenery-recognition AI is very hard, though, so maybe it'll just be facial recognition.

Expect someday to be able to load the "top ten most wanted criminals" into your computer and see if they appear in pictures taken on your last vacation. (To implement this, image recognition would have to detect objects as faces and store intermediary data about each face in some searchable index format; otherwise searching would take too long if you have a huge number of files.) Google Images or a p2p app would work well with this. Seems like a nice idea, until all the false positives, I suppose. I sure as hell wouldn't want to be harassed or arrested "for the public good" because I look like someone.
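The "searchable index of face data" idea above can be sketched in a few lines. A toy linear-scan version (the descriptor vectors and distance threshold are hypothetical stand-ins for real face embeddings; production systems use learned embeddings and approximate nearest-neighbour structures):

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class FaceIndex:
    """Toy index mapping face descriptor vectors to labels
    via a linear scan over all stored entries."""

    def __init__(self):
        self.entries = []  # list of (descriptor, label)

    def add(self, descriptor, label):
        self.entries.append((list(descriptor), label))

    def query(self, descriptor, max_dist=0.5):
        # Return the closest stored label within max_dist, else None.
        best, best_d = None, max_dist
        for vec, label in self.entries:
            d = euclidean(descriptor, vec)
            if d <= best_d:
                best, best_d = label, d
        return best
```

The `max_dist` cutoff is exactly where the false-positive worry lives: set it loose and innocent lookalikes match; set it tight and the wanted face slips through.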
