
Software That Can Censor 'Sexual Images.' Or Not.

timothy posted more than 14 years ago | from the fill-your-hard-drive-with-flesh-colored-noise? dept.

Censorship 247

Halster writes: "Here's an interesting story on Newswire about censorware that detects excessive skintones in images, and implements blocking accordingly. What next?" What's next is (you'd hope) the realization that image-analysis heuristics are inherently limited, and not the best thing on which to pre-emptively base system-admin decisions. (michael: That story is a company press release. For a much better evaluation of how this software works, see this Wired exposé detailing the fraudulent nature of image-filtering "artificial intelligence," or an older review from Businessweek on Eyeguard.)


I'm on a Porn Collection Task Force (5)

David Wong (199703) | more than 14 years ago | (#990172)

"...Not only does eyeguard alert the network administrator, but it also disables the computer and takes a snapshot of the suspect image.."

My boss has installed this software, and is now forcing the entire office to surf for porn. These "snapshots" are sent directly to his hard drive, which is saving him the time of having to sift through thousands of non-porn pictures to get the ones he wants. Thanks to this software and the snapshot feature, my boss is able to accumulate pornographic images at 10X his previous efficiency.

Eye-T, Mr. Wilkerson thanks you.

If only it worked as well as they claim (1)

lildogie (54998) | more than 14 years ago | (#990173)

I would love to have a configurable discriminator for pornography.

If they can tell what the people in the picture are doing by analyzing the picture, I wouldn't have to sort through all of those GIFS myself...


Re:If I had this.... (1)

bolind (33496) | more than 14 years ago | (#990175)

[quickly, since I'm in a hurry]:

A buddy and I actually started "development" (we talked a lot) on "p0rnster", a webcrawler that would take links from the porn link-sites, look for certain keywords, follow X links deep, and save all JPEGs bigger than Y KB.

Never got around to it though.
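For what it's worth, the whole thing fits in a page of Python. A rough sketch of that crawler idea, with the actual fetching stubbed out -- the keyword list, the `MIN_JPEG_KB` cutoff, and all function names are made up for illustration, and the stdlib `html.parser` stands in for a real spider:

```python
# Sketch of the "p0rnster" idea: pull links out of a page, decide which
# link texts look worth following, and keep JPEG URLs over a size cutoff.
# Network fetching is deliberately left out of this sketch.
from html.parser import HTMLParser

KEYWORDS = {"gallery", "pics", "free"}   # hypothetical trigger words
MIN_JPEG_KB = 40                          # the "Y KB" cutoff from the comment

class LinkExtractor(HTMLParser):
    """Collect href targets from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

def should_follow(link_text):
    """Follow a link if its text contains any trigger keyword."""
    return any(k in link_text.lower() for k in KEYWORDS)

def wanted_jpegs(links, sizes_kb):
    """Keep .jpg/.jpeg links whose size exceeds the cutoff.
    sizes_kb maps url -> size in KB; a real crawler would learn the
    size with a HEAD request before downloading."""
    return [u for u in links
            if u.lower().endswith((".jpg", ".jpeg"))
            and sizes_kb.get(u, 0) >= MIN_JPEG_KB]
```

A real version would HEAD each candidate link to learn its size before committing to the download; that's what the `sizes_kb` map stands in for here.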

good grief (2)

john_locke (196789) | more than 14 years ago | (#990190)

Heaven forbid that our children see pornographic material at home rather than going out and having sex, getting STDs, and getting pregnant.

New internet and unaccountable businesses (3)

Tayknight (93940) | more than 14 years ago | (#990192)

I think this is one of the downfalls of the way many businesses conduct themselves on the internet. This is obviously a company that was able to drive up the hype on their product. Now they can keep saying "we're working on it," and most people will say "OK" and not really hold the company responsible. Imagine if a company that made a real physical object tried this. Cars that crashed or drove the wrong direction 90% of the time. Or a kitchen disposal that ground up your hand in addition to the kitchen waste. Consumers wouldn't allow such a product to remain. It's time we did the same for e-companies. Especially ones that claim to help children. None of these products work well. Just check out [] . Get the lowdown.

_How_ many neural networks???? (3)

Karellen (104380) | more than 14 years ago | (#990194)

From page 2...

"There are over 10 million neural networks involved in the thing," Beecher says.

Bloody hell! Imagine the computing power they've got to run 10 million neural _networks_.

...or maybe they mean 10 million neurons. Doh!

Re:Hrrrmmm (1)

pb (1020) | more than 14 years ago | (#990195)

How did you find out about my She-Hulk collection!!!

Oh well, you know this wouldn't affect Captain Kirk: he's got that fetish for blue women...

So where's the web filter that filters out stupid e-censorship ideas?

Oh. It filtered itself. Never mind...
pb Reply or e-mail; don't vaguely moderate [] .

Measure? Counter-measure! (2)

dillon_rinker (17944) | more than 14 years ago | (#990198)

Come see! All our girls wear pink body paint, guaranteed to beat your company's software! Don't let your parents, your library, or your HR department jerk you around! Exercise your constitutional right to exercise your arm!

Heuristics (2)

Dungeon Dweller (134014) | more than 14 years ago | (#990199)

I could imagine heuristics that do a true analysis of the image to determine whether it is pornographic, better than just looking at the colors, but wouldn't that be an issue in and of itself, since it would take up processor time? People always ask, "What would you do with that much processor?" Everything that we're not doing now. In the future, with faster processors, better analysis than this can be done, and instead of "Netnanny," parents will just load software that puts nice black blocks over everything their kids can't see. Real-time censorship, folks. And you wondered what a home supercomputer could be used for.

Re:Only for white-folk? - Of course!!!! (2)

RobertAG (176761) | more than 14 years ago | (#990202)

That is a good point. The article's use of the word "flesh" brought back images of the Crayola Crayon color of flesh, namely the pinkish skin tone prevalent among caucasians, that I used to use in kindergarten.

Given the fact that there are many different skin tones in the world, how is it supposed to distinguish? Are they so arrogant that they assume only images of blond-haired, blue-eyed people are being downloaded?

It would be simple to get around... (2)

farrellj (563) | more than 14 years ago | (#990204)

We would quickly see a round of green, purple and blue skin, with a lot of science-fiction-themed porn. It would be just like what started the whole Hackerspeak thing... back in BBS days, some software implemented the ability to filter out swear words, so we started doing things like substituting an * for some letters, like "sh*t" and "F*CK!". This later evolved to using numbers to sub for letters to further get around these "features". And thus, 3l33t and d00dz came into our vocabulary.

It would be interesting to see what other ways, besides coloured skin, people would use to get around an "excessive skintone" filter... a return to black and white pictures? Weirdly skewed colour maps? Use of "oil paint" filters to break up the skin-tone areas?

(who has been on line far too long...)

No Close-ups (1)

nagora (177841) | more than 14 years ago | (#990215)

So, any portrait-style picture will be blocked because of the high use of skin-tones?

What a load.


skin tones? (1)

jhittner (66567) | more than 14 years ago | (#990222)

My monitor is the same color as skin. Would it block Sony's web page?

hmm (5)

the_other_one (178565) | more than 14 years ago | (#990224)

What I want to see is an image filter that will filter out the clothing.

How will this distinguish things? (2)

linuxonceleron (87032) | more than 14 years ago | (#990227)

I can see how this product works by detecting skin tones, but how will it distinguish between, say, a face and a naked body? Unless it has the outlines of human genitalia programmed into it and actually spent the time tracing each proxied image to check for such outlines, this seems impossible. Also, what about dark-skinned individuals? How would this product be able to detect them? This seems like a good idea that would never work unless it had a massive amount of processing power behind it for tracing and comparing images.

Re:Software/Hardware filtering? :) (1)

el_chicano (36361) | more than 14 years ago | (#990228)

If there were a filtering proxy service you could use that would recolor the images, and a lens you could hold in front of the computer monitor to see the corrected color, you could avoid any color-based stuff like this :)

A Netscape plug-in would be just the ticket!

You think being a MIB is all voodoo mind control? You should see the paperwork!

Does this actually (1)

nachoman (87476) | more than 14 years ago | (#990229)

So, what does "excessive skin tone" really mean?
Is it a percentage of the picture with certain pixel colors? How would it be able to distinguish between a naked chick with a lot of background and a picture of someone's tattoo on their arm?

Now if they can design an algorithm that analyzes a picture to tell exactly what part of the body it is... then I'd be impressed. This just seems like another ploy to make money on something that doesn't really work that well.
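The percentage guess above is probably close to what these products actually do. A minimal sketch of that heuristic in Python -- the RGB rule of thumb and the 30% threshold are invented for illustration, not anything the vendor has published:

```python
# Crude "fraction of skin-toned pixels" classifier, as guessed at above.
def looks_like_skin(r, g, b):
    """Rule of thumb: reddish, more red than blue, not too dark.
    The exact constants here are illustrative, not tuned."""
    return r > 95 and g > 40 and b > 20 and r > b and (r - min(g, b)) > 15

def skin_fraction(pixels):
    """pixels: iterable of (r, g, b) tuples. Returns the fraction
    of pixels the rule above classifies as skin."""
    pixels = list(pixels)
    if not pixels:
        return 0.0
    hits = sum(1 for p in pixels if looks_like_skin(*p))
    return hits / len(pixels)

def flag_image(pixels, threshold=0.30):
    """Block when the skin fraction crosses the (arbitrary) threshold."""
    return skin_fraction(pixels) >= threshold
```

On this rule, a face close-up and a nude score exactly the same way -- which is the objection raised in the comment.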

A few problems... (1)

cheesethegreat (132893) | more than 14 years ago | (#990230)

What picture types will it protect against? How about movies and PDFs? What if site admins just adjust the color slightly, to fall outside the skin-color definition? This software would definitely need a lot of work.

Re:Oh, yea...this is a *great* idea.... (1)

shippo (166521) | more than 14 years ago | (#990231)

This would really fail in the UK.

We have counties named Essex and Sussex, and historically (though the names are still used) Middlesex and Wessex. Whoops!

Award this sysadmin the BOFH award - Baldrick Operator From Hell (I have a cunning plan!).

I worked somewhere where a commercial filter the clowns installed on our system stopped access to well-known pron sites such as

My favorite quote... (5)

A Big Gnu Thrush (12795) | more than 14 years ago | (#990232)

From the Wired article: "How do you tell the difference between a woman in a bikini in a sailboat which is not racy and a naked woman in a sailboat?" Touretzky asks. "The only difference is a couple of nipples and a patch of pubic hair. You're not going to be able to find that with a neural network."

Maybe our definition of obscenity is the problem.

Re:The different ways to circumvent this: (2)

Emil Brink (69213) | more than 14 years ago | (#990233)

Hm... Yeah, perhaps. But this quote from the first link:
Once installed on a single PC or across a network, the antiporn software known as eyeguard is activated each time an image is displayed
Makes me speculate that eyeguard actually hooks into the operating system itself, so that it sits somewhere in the code that displays bitmaps on screen (on Win32, that'd be inside GDI, right?). If so, then most of the above techniques won't work very well, unless you can "counter-hook" those API calls: eyeguard sees an altered version of the image, then calls what it thinks is the original OS entry point to display it, but actually ends up in your anti-blocker, which turns the image back to normal and displays it. I suspect that it would be fairly easy for eyeguard to protect itself from calling a "false" original entrypoint, though... Hm, this is pretty close to some serious cyberwarfare. The lengths some people go to to control each other... *Shrug* :( On the off chance that eyeguard does not work like the above, consider it a free business idea and start hacking. You might get rich! ;^)
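The hook/counter-hook tug-of-war is easier to see in miniature. Here's the same game played with plain Python functions standing in for OS entry points -- purely an analogy, since the real fight would be over GDI calls, and every name and the "skin" string test here are invented:

```python
# Toy model of hooking a "display a bitmap" entry point, then
# counter-hooking the hook. Strings stand in for images.
def display(image):
    """Stand-in for the OS routine that puts a bitmap on screen."""
    return f"shown:{image}"

def censor_hook(original):
    """eyeguard-style hook: inspect the image, block what 'looks like skin'."""
    def wrapper(image):
        if "skin" in image:
            return original("BLOCKED")
        return original(image)
    return wrapper

def counter_hook(original):
    """Anti-blocker: hand the censor a disguised image, undo the
    disguise after the censor has waved it through."""
    def wrapper(image):
        disguised = image.replace("skin", "s-k-i-n")  # slips past the test
        result = original(disguised)
        return result.replace("s-k-i-n", "skin")      # restore for the viewer
    return wrapper

# Install the censor first, then stack the counter-hook on top of it.
hooked = censor_hook(display)
countered = counter_hook(hooked)
```

The weakness noted above applies here too: if the censor keeps a private reference to the real `display` (the "false original entrypoint" defense), the counter-hook never gets a chance to run.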

"skin tones"? (1)

FascDot Killed My Pr (24021) | more than 14 years ago | (#990234)

Does anyone remember the Bloom County strip where Oliver puts a band-aid on his black skin and Milo says "convenient flesh tone"?

Which color(s) exactly does this software block? Human skin varies from near-white to near-black (esp. in a photograph).

Are they going to add new skin tones based on "popularity"? And if so, does that mean when I view so-called "inter-racial" pics, I'm going to see a black man apparently humping a blank space?

And of course, workarounds spring immediately to mind: Use gimp or photoshop scripts to automatically transform skin to purple and distribute a viewer that transforms back.

And let's not even get into the perfectly valid images this will block (like closeups of non-porn humans), medical sites (esp. dermatology), etc.

Compaq dropping MAILWorks?
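The purple-skin workaround amounts to any invertible color transform applied before publishing and undone by the viewer. A toy sketch, using a plain RGB channel rotation -- the particular rotation is arbitrary; any lossless bijection on colors would do:

```python
# Publisher-side transform: rotate channels so skin tones read as
# blue/purple to a color-based filter.
def to_purple(pixels):
    """(r, g, b) -> (b, r, g) for every pixel."""
    return [(b, r, g) for (r, g, b) in pixels]

# Viewer-side inverse: undo the rotation exactly.
def from_purple(pixels):
    """(b, r, g) -> (r, g, b): recover the original pixel."""
    return [(t1, t2, t0) for (t0, t1, t2) in pixels]
```

Because the transform is a bijection, no information is lost -- the filter would have to model every possible remapping to keep up.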

Re:No Close-ups (2)

wnissen (59924) | more than 14 years ago | (#990235)

I don't know how good their system is, but there is published research on this sort of thing. One of my former professors made a "naked people finder" that's based on finding cylinders in the picture and evaluating whether they are skin-toned, and whether they make a reasonable human body. An interesting aspect is that bikini pictures are out, but maybe pictures that don't show Caucasians are OK. For more info on finding naked people, see


Useful to alert admins, not to block (5)

Paul Johnson (33553) | more than 14 years ago | (#990259)

OK, it's pretty obvious that obscenity is in the mind of the beholder, not the computer. So computers can't spot this stuff.

But I can imagine a program which tracks the average flesh-tone score for pictures over time. If the moving average goes over a certain threshold, then a dialog box pops up on the sysadmin's screen telling him that Joe in cubicle 69 may be abusing company bandwidth, click here for a list of the suspicious URLs. Or, as it might be, it sends an email to Junior's father. The key point is that this stuff can work as part of a monitoring system that uses human judgement for the final bit, rather than as a blocking solution.

Companies do have a legitimate need to monitor this stuff. Quite apart from the abuse of company resources, companies who allow employees to download and view sexually explicit materials can find themselves on the wrong end of a big discrimination lawsuit.
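The monitoring idea above can be sketched in a few lines: keep a sliding window of per-image flesh-tone scores and alert only when the recent average crosses a threshold, so no single portrait or dermatology page trips it. The window size, threshold, and class name are all invented for illustration:

```python
# Alert-on-trend monitor: single images never trigger anything;
# a sustained run of high flesh-tone scores does.
from collections import deque

class FleshToneMonitor:
    def __init__(self, window=20, threshold=0.5):
        self.scores = deque(maxlen=window)  # (url, score) pairs, oldest dropped
        self.threshold = threshold

    def record(self, url, score):
        """score: flesh-tone fraction for one image, in [0, 1].
        Returns True when the windowed average crosses the threshold,
        i.e. when a human should take a look."""
        self.scores.append((url, score))
        avg = sum(s for _, s in self.scores) / len(self.scores)
        return avg >= self.threshold

    def suspicious_urls(self, cutoff=0.5):
        """The URL list for the sysadmin's dialog box."""
        return [u for u, s in self.scores if s >= cutoff]
```

The final judgement stays with a person; the score stream only decides when to interrupt them.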


Re:Oh, yea...this is a *great* idea.... (1)

gwicks (61255) | more than 14 years ago | (#990260)

Scunthorpe, North Linconshire.

Re:Ummmm... What If. (1)

carlos_benj (140796) | more than 14 years ago | (#990261)

Since the Venus is marble, that shouldn't be detected as skin tones.

Flawed, like all other filters, but... (1)

ParticleGirl (197721) | more than 14 years ago | (#990262)

Of course this filtering software is flawed; all filtering software made to date is flawed. What's the range of the skin tones they're searching for? If it's really a full range of skin tones, it's going to be barely distinguishable from a palette of earth tones. It won't differentiate between portraits and landscapes and kinky sex. Or, for that matter, between "art" and "porn" -- most people have different opinions about where one ends and the other begins.
On the other hand, this filtering software takes a shot of the "questionable" page and then has it screened by a human monitor. This is invaluable. Even subject to the whims of a human monitor, this is much better than a purely digital filter. The filtering technology is used to point the human monitor to spots he should be concerned with, not to make the final decisions. (Which is good, coming back to how flawed the technology can be.)
This may not be good, but at least it's a step in a productive direction. I'm not one for censorship, but if people demand a product, it should at least be an effective one. Too many people are gung-ho about filters because filters make them feel better or more in control; they often don't even think about how effective the filter may or may not be.

Measure, counter-measure. (1)

Linux Freak (18608) | more than 14 years ago | (#990263)

When will people realize that censoring measures do *NOT* work? If this thing proliferates to the point where online pr0n retailers are feeling it financially, you'll start seeing them offer encrypted pr0n images (the GIMP offers such a filter).

You'll then download this seemingly random gibberish image and decrypt it into an image of the luscious Ms. Portman in all her petrified glory! ;-)
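In miniature, the scheme is just a reversible scramble of the image bytes so the filter sees noise. A toy sketch using XOR with a derived keystream -- this is obfuscation rather than real cryptography, and it stands in for whatever cipher the GIMP filter actually uses:

```python
# XOR "encryption" of image bytes: the scrambled file looks like noise
# to any skin-tone analyzer; the viewer reverses it with the same key.
import hashlib

def keystream(key, length):
    """Derive a repeatable byte stream from a passphrase by hashing
    the passphrase with an incrementing counter."""
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(f"{key}:{counter}".encode()).digest()
        counter += 1
    return bytes(out[:length])

def scramble(data, key):
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

unscramble = scramble  # XOR with the same keystream is its own inverse
```

Since the scrambled bytes have no skin-tone statistics at all, a color-based filter has literally nothing to look at.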

What to fight (3)

sandler (9145) | more than 14 years ago | (#990264)

There will always be people who feel the need to "do something" about "evil" on the internet, and as long as there are, there will be people to buy this kind of software. As long as we keep pointing out why the software is broken, people will keep coming out with software that's supposed to be better. If we want to fight censorware, we need to argue about why censorware is wrong in general, not why this or that specific software is broken.

That having been said, I think the reason censorware is wrong is that any censorware, present or future, puts the decision of what is and is not appropriate for me or my kids into the hands of people who don't know me and don't share my values.

Open censorware (with open block lists) is a possible solution to this. This way, parents, who should be deciding what their children will see, can actually make real decisions, rather than have to abide by whatever decisions Mattel or whoever else makes for them.

It's just as wrong for a company to insist that my kids shouldn't see a certain site as it is for anti-censorware advocates to insist that my kids should be able to see anything. The right thing to do is to give parents the choice to make that call.

Re:Oh, yea...this is a *great* idea.... (3)

Paul Johnson (33553) | more than 14 years ago | (#990265)

Not to mention Scunthorpe.

See RISKS [] for details.


Re:How will this distinguish things? (3)

orpheus (14534) | more than 14 years ago | (#990266)

Since the article specifically refers to the Aussie situation (mandatory porn filtering by ISPs), here's what the final report of the Australian Government (National Office for the Information Economy) has to say about the weaknesses of this approach in their review of blocking technologies, entitled Access Prevention Techniques for Internet Content Filtering [] (Google cache):

The quest to detect pornography is often more concerned with images than text and getting computers to recognise a pornographic image is equally, if not more, difficult than the task of distinguishing between erotica and other literature. There have been efforts to characterise pornographic pictures based on the amount of 'flesh tone' in the images and on the poses struck by the subjects. Computers have difficulty distinguishing between art and pornography, say between a Rubens painting and a Penthouse centrefold, and this approach is not generally regarded as being effective. Video and other streaming media further complicate the filtering task by supplying a constant flow of images to be examined for undesirable content.

Furthermore, they complain:

This approach is affected by the same issue as profile filtering in that an image - or a fair percentage of an image - needs to be loaded before it can be analysed, during which time it may be displayed on a user's screen.

Of course, this second problem only applies to an Aussie-type ISP restriction. Geocities did this years ago (don't know if they still do): scanning their own HDDs (free user pages), deleting 'questionable graphics' (with or without human review) and waiting for the page authors to complain about any mistakes.

Interesting idea, bad implementation. (1)

billcopc (196330) | more than 14 years ago | (#990289)

While the concept of pseudo-intelligent software trying to figure out what's depicted in an image is very interesting, it simply has nothing to gain from neural networks.

Surely it is quite possible to figure out if a picture depicts genitalia in most cases (excluding the cases where even human judgement is defeated), but I think it would be best applied by scanning small round areas iteratively like a spotlight, and trying to follow "interesting" trails until they can be identified. Once the image has been mapped and subdivided into physical elements, a second phase of the recognition could verify if indeed this is a human we're looking at, by analyzing the overall size of the "object" and checking the proportions/alignment of the individual areas. This is just typical visual recognition, the hard part comes when you consider the infinite angles and positions that the subject can assume.

A quick example: suppose the subject is facing away from the viewpoint on all fours (yeah I know, typical).. how will the software figure out if there is depiction of "bush" or just dark underwear ?

Image quality would also be of concern.. a blurry image would probably confuse the recognition engine, and overlaid text/graphics would be a serious problem. Not to mention the extreme cpu load for the entire process, repeated N times for all the images on a page.

It would probably be cheaper and faster to hire a few hundred gumbies and have them check all images manually (no pun intended).
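The "spotlight" scan described above boils down to segmenting the image into connected regions before trying to identify them. Here's just that first phase, sketched as a flood fill over a grid of skin/not-skin pixels -- the grid representation and 4-connectivity are illustrative choices, and phase two (checking proportions and alignment) is the genuinely hard part left out:

```python
# Phase one of the recognition sketch: find connected regions of
# skin-classified pixels. Grid values: 1 = skin-toned, 0 = not.
def skin_regions(grid):
    """Return the sizes of 4-connected regions of 1s, largest first."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == 1 and not seen[r][c]:
                # Iterative flood fill starting at (r, c).
                stack, size = [(r, c)], 0
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and grid[ny][nx] == 1 and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(size)
    return sorted(sizes, reverse=True)
```

A second pass would then have to decide whether those blobs assemble into a plausible body -- exactly where the infinite-poses problem bites.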


The real dark side of this (5)

jabber (13196) | more than 14 years ago | (#990290)

Image recognition refined enough to filter porn will not be around for a VERY long time. I'm not that imaginative, and I can easily picture all sorts of "unnatural" positions which an automated system would have a hard time recognizing as porn. :)

It will take an AI with the understanding of what "porn" means, with an appreciation for the human body's full range of motion, and with the comprehension of the latest fetishes - else National Geographic and will find themselves filtered out of libraries and schools. After all, what is the difference between an image of a 'man riding a horse' and that of a 'man riding a horse'?

But the research being put into this sort of image recognition has an even seedier and more sinister side. It can/will filter based on LOGO. That's right.

Imagine Time-Warner/AOL being 'unable to connect' to sites which feature their competitors' logos. Imagine ISPs who show Reebok ad banners suddenly disabling links to pages that display the Nike "swoosh". Imagine your favorite web-site [] suddenly not letting you click through to any other site that does not proudly wear a "VA" on its 'sponsors' page.

And all this technology is being developed... (oh, say it with me) "In the name of the children!". BS - all the children I know would get a kick out of looking at porn, and are being damaged more by advertising than by sexual content.

Personally, I think we should assist in the development of this technology, and make sure that it only filters on Red Maple leaves on white backgrounds! Blame Canada!! Hooyah!

Skintones. (2)

Carthain (86046) | more than 14 years ago | (#990291)

Hmmm, now, when it's filtering and looking for those skintones, is it just looking for the skintones of 'white' people? If so, what about the skintones of other races?

On the other hand, if they do include skintones from all races, then that's a lot of colours they're filtering.

Re:Ummmm... What If. (1)

B-B (169492) | more than 14 years ago | (#990292)

The statue is white marble. (not flesh toned)

The sculptor of the Venus de Milo was not da Vinci.

Now, If you said the painting "Birth of Venus" by Botticelli, then you would have a point!


Finally, the dubdubdub gets a touch of class... (1)

Halloween Jack (182035) | more than 14 years ago | (#990293)

and white "art" photography sites get a huge boost.

Say, has Nerve [] had its IPO yet?

Every day a good laugh (2)

CaptainZapp (182233) | more than 14 years ago | (#990294)

This article was a hoot. I can really picture some slick talking marketing guy and his lines of reasoning:

Yeah, there's this independent testing lab verifying our nano-cool neuronal rocket-science algorithms, working with 99.8% reliability. The name? No, that's a secret... we can't have it getting out that the Rush Limbaugh Institute for Creative Certification is the independent lab...

Nah, I've used it myself 30 or 60 days ago. I could only get to the dirty pickies at the XXXsmutshop after disabling our super software...

Wot? An old version? Hey, we're into rocket science, advancing our secret algorithm on a daily basis. Since it's so advanced we don't need version control, therefore we don't have a version that actually worked anymore. But trust me, I'm in the DOTCOM business...

Shheeeesh, that guy must have been straight out from twisted tongue marketing academy...

hey, it's not all bad... (2)

1299709 (30310) | more than 14 years ago | (#990295)

BAIR frequently blocks ads on the websites of PC Magazine, Wired News,, and other news services.
If nothing else, it's an ad blocking proxy!

whatever (1)

Tyriphobe (28459) | more than 14 years ago | (#990296)

This is obviously just a load of... hype, to put it politely. If there actually is a heuristic engine looking through these pictures, it's working just as well as a coin toss in Wired's test cases (numeric URLs containing images). If anything, this is just PR to make their blocking program look high tech and junk. I'd say it probably just works like all the others - a big list of blacklisted sites and maybe a simple string parser.

All the AI hot air is just that - the only new thing about this is how they run it: it reroutes your traffic through their own server (see Wired article), so I imagine instead of going out and purchasing a program, you could just pay for monthly access to their service. A little more convenient, if you don't have the time to talk to your kids about anything important and just want to keep them ignorant.

My experience: (1)

acidrain (35064) | more than 14 years ago | (#990297)

I just took an image processing course and the professor was one of the leading researchers in the field of image compression. One of the technologies we covered was edge detection, and the use of edge extraction for categorising images.
Frankly, you can tell whether an image has lots of straight lines, find the distribution of the directions of the lines, the average direction, and the standard deviation from that average... This gives you a signature that can be used to find similar images and discern between natural scenes and scenes with artificial constructs. And of course I'm sure these folks are using more advanced statistical analysis than that... But, from what I've seen, it's chasing a cat with a blindfold on. As for the value of colour histograms? I think that point's been made.
Processing power? Yes. This is not realtime technology for the masses.
If these folks could do this at all well, don't you think the AI folks would have put something better than Aibo on the table? Ahh, those crazy VCs.
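The edge signature described above can be sketched with nothing fancier than finite differences: estimate a gradient direction at each pixel, then summarize the directions into a (mean, standard deviation) pair. Real systems would use Sobel or Canny operators and proper circular statistics; the magnitude cutoff here is arbitrary:

```python
# Crude edge-direction signature for a grayscale image.
import math

def gradient_directions(img):
    """img: 2D list of grayscale values. Returns the gradient direction
    (radians) at every interior pixel whose gradient isn't negligible."""
    dirs = []
    for y in range(1, len(img) - 1):
        for x in range(1, len(img[0]) - 1):
            gx = img[y][x + 1] - img[y][x - 1]   # horizontal difference
            gy = img[y + 1][x] - img[y - 1][x]   # vertical difference
            if gx * gx + gy * gy > 4:            # skip flat areas (arbitrary cutoff)
                dirs.append(math.atan2(gy, gx))
    return dirs

def signature(img):
    """(mean direction, standard deviation). A low deviation suggests the
    long straight edges of artificial scenes; natural scenes scatter."""
    dirs = gradient_directions(img)
    if not dirs:
        return (0.0, 0.0)
    mean = sum(dirs) / len(dirs)
    var = sum((d - mean) ** 2 for d in dirs) / len(dirs)
    return (mean, math.sqrt(var))
```

Two signatures can then be compared to find "similar" images, which is about as far as this kind of statistic gets you -- it says nothing about what the edges depict.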

Cut 'em off at the Cash Register (1)

Schnedt McWapt (195938) | more than 14 years ago | (#990298)

There's a simple solution for getting rid of most of the Porn on the 'net.

Get a legal precedent started that absolves people from responsibility for paying for access to porn websites. If credit card firms were allowed to simply refuse payment to porn vendors, they'd disappear in a matter of weeks.

Don't try to stop people from accessing the sites, just make it impossible for the site owners to collect from said people. There are very few non-pay sites on the web, and actual special interest groups like the BDSM people who hang out on various Usenet groups do have legitimate free speech issues to raise. Most of them would be glad to see the commercial website spammers eliminated from their newsfeeds.

Grep this. (1)

ktakki (64573) | more than 14 years ago | (#990299)

Wow. This is one time I'm actually glad that I have a thing for albinos.

p.s.: Not you, Nick. You lack the proper topology.

"In spite of everything, I still believe that people
are really good at heart." - Anne Frank

Re:Oh, yea...this is a *great* idea.... (1)

shippo (166521) | more than 14 years ago | (#990300)

Of football supporters discussing Arsenal.

Re:What a job! (1)

adpowers (153922) | more than 14 years ago | (#990301)

Imagine being the one that gets^H^H^Hhas to surf through all the images after they are tagged.

"Hmm, lets see. Yes this one is nudity, I will save you for later..."

ha! (2)

roman_mir (125474) | more than 14 years ago | (#990302)

Imagine if software like this was possible, when installed on a computer it could hook into your OS and every time you looked at Natalie Portman's lashes hooters it would put a red rectangle over the most important places on her body - her head and her feet!

Re:Only for white-folk? - Of course!!!! (1)

angelo (21182) | more than 14 years ago | (#990303)

Well, yes. It appears to be some sort of aesthetic given that white people are in porn. Perhaps a lot of it comes from America where the census puts whites at 70-80% of the population. Or perhaps idiots post messages on usenet that say "don't post ugly n*gger b*tches!" to intimidate the posting of only whites. There seems to be a problem here, but it's most likely that white women/men have higher opinions and half the magazines contain whites. Go figure.

What a useless bit of code! (3)

jd (1658) | more than 14 years ago | (#990304)

On the "Hello World" scale of uselessness (on which "Hello World" ranks a 5/10), this has to rate a 1.


  • Short of having libraries which support EVERY graphics image format (stills AND movies), it's useless. Say that it supports GIF and JPEG. Two popular formats now, but aging. (In the case of GIF87, aged, dead, buried, rotted and recycled as UNISYS demands for money.)
  • There's the aspect that others have pointed out, that "flesh-tones" vary between peoples, and not everything that falls into that range is flesh. (Pine would probably trigger it.)
  • Heuristics -ARE- a very good approach, for many things, but this isn't one of them. If it's not linearly separable, then neither heuristics NOR neural nets will be able to produce accurate results.
  • With the advent of alpha channels, what's in an image, anyway? If two images can be blended on-the-fly, then one image can be split randomly on-the-fly, in such a way as to make any one image appear incoherent, but the combined image as whatever you started off with.
  • Almost forgot. You not only need to support all image types, but also all compression schemes. No use being able to process GIFs, if the image you're fetching is also gzipped or bzipped.
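The alpha-channel point above can be pushed to its logical end: split an image into two noise-like shares that each defeat any per-image analysis, yet combine back losslessly. A toy sketch using XOR shares (the `seed` parameter is only there for reproducibility; a real splitter would use a proper random source):

```python
# Split image bytes into two shares, each statistically indistinguishable
# from noise on its own; XOR-ing them together restores the original.
import random

def split_image(data, seed=None):
    """Return two shares of `data`. Neither share alone reveals anything
    a color- or edge-based filter could latch onto."""
    rng = random.Random(seed)
    share1 = bytes(rng.randrange(256) for _ in data)          # pure noise
    share2 = bytes(a ^ b for a, b in zip(data, share1))       # data XOR noise
    return share1, share2

def combine(share1, share2):
    """Viewer side: XOR the shares back together."""
    return bytes(a ^ b for a, b in zip(share1, share2))
```

This is the same trick as two-share visual cryptography: each half really is uniformly random, so no amount of per-image cleverness can flag it.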

All in all, it comes back to what I've always said about these types of systems. Give someone the freedom to filter, ALL ON THEIR OWN, and they'll probably do so. Everyone, however "liberal", has something they just don't want to spend time on. And that's OK! That's GOOD! But the definition of OK cannot come from outside; it has to come from inside.

As for parents of kids, the same sort of thing applies. When you pick a meal to cook, do you select out of the cook book(s), or cook everything in them, all at once? You select! Ergo, being able to select out of a range of databases (eg: your own personal filters, the school's database, the databases built by the various clubs & societies the kids belong to, etc., ad nauseam) makes MUCH more sense than blindly following one database built around the fiction of one-size-fits-all.

Yes, it takes more time. But in the end, you will ALWAYS have a trade-off. The easy and generic routes are INVARIABLY harmful in the long term. You can become a cabbage-patch human, and live in Number 6's Village for all eternity, or you can put some effort in and live as a human being, instead. This doesn't mean being "rebellious" - if you rebel for the sake of defying what someone else says, your brain is as much a rotten cabbage as the obsessive conformist.

Getting back to this censorware, its market is that of the obsessive conformist, and the most vocal critics (in the media) are the obsessive rebels. It's a symbiotic relationship made in hell. The more extreme one group gets, the more it feeds the other. Don't you think the makers knew it would be controversial? Of course they did! They are counting on it! The more attention it gets, the more free advertising, the more money they make and the more brownie points they can give themselves.

The media critics are the same. Without products like this, there's nothing to vent about, and therefore no reason for anyone to read their articles, and therefore no reason for anyone to keep them employed. They don't want their "enemies" to go away, because they're the ones who justify the pay-cheque.

IMHO, whilst the Extreme Wing and the Press are "best of enemies", there's no place for sanity in the world. Who needs it, when you've a symbiotic, self-perpetuating feeding-frenzy?

Oh good, that means these types of sites are bad (1)

hardaker (32597) | more than 14 years ago | (#990305)

The bad:
  • Any science sites with close-ups of skin samples (dermatology, medical databases, e.g.).
  • head shots []

The good:

  • The really hairy people organization.
  • Bestiality web sites

I think slashdot needs a solid color gif of just a skin tone color for its new censorship icon.

Tripod has been doing this for some time... (2)

waldeaux (109942) | more than 14 years ago | (#990327)

I seem to recall a friend having his web pages turned off at tripod because they tripped the "nudity" sensors. Apparently they did a sweep looking for excessive fleshtones, then had someone look at all the images that were flagged.

Of course there are several ways to defeat the program:

  1. Don't be white. "Flesh tones" as used is insanely non-representative of the true range of flesh colors.
  2. Don't use color. Greyscale will be a LOT harder to deal with.
  3. Put a lot of face shots in. Lots of flesh tones but not anything pornographic.
  4. Retro-60's - purple on gold, etc... :-)

The best way to get rid of a stupid system is to think around it.
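The greyscale trick in particular is easy to see in code. Here's a minimal sketch, assuming a toy RGB skin-tone rule (the vendor's actual heuristic is unknown; this one is invented for illustration):

```python
# Hypothetical sketch: a crude "skin tone" test and how greyscale defeats it.
def is_skin(r, g, b):
    # Toy heuristic (an assumption, not the vendor's real rule):
    # reddish pixels with red > green > blue.
    return r > 95 and g > 40 and b > 20 and r > g > b and (r - b) > 15

def to_grey(pixels):
    # Standard luminance conversion; R == G == B afterwards,
    # so the r > g > b condition can never hold.
    return [(int(0.299*r + 0.587*g + 0.114*b),) * 3 for r, g, b in pixels]

pixels = [(224, 172, 105), (210, 160, 120), (40, 40, 200)]
print(sum(is_skin(*p) for p in pixels))            # 2 flagged in colour
print(sum(is_skin(*p) for p in to_grey(pixels)))   # 0 flagged in greyscale
```

Any colour-space heuristic that keys on the relationship between channels collapses the moment the channels are made equal.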

some nudity will slip through... (2)

LocalYokel (85558) | more than 14 years ago | (#990328)

Wow, this article is some serious trollbait -- I'll bite!

Naked and petrified will live on, such as my creation for the /. trolls:
Natalie Portman naked and petrified [] .

Someone else created:
Signal 11 naked and petrified []
and I wonder where they got the idea???

Young boys are always going to be able to get pictures of naked, not necessarily petrified women -- they're beautiful things, are they not? Didn't you ever stay up late to watch a "naughty" movie like Revenge of the Nerds or Porky's knowing that it would be an opportunity to catch a glimpse at some "hooters"? It's sad, but kinda funny, innit?


Best filtration software... (4)

ocelotbob (173602) | more than 14 years ago | (#990329)

I have the best solution to the Censorware problem, plus it'll make a lot of people very happy. First, in the rich, upper-crust neighborhoods, you advertise realtime filtration of Bad Stuff to protect the whelplings. Next, you advertise in and around colleges such things as "Make Money Viewing Porn". You pay these students about $6.00/hour. Now you put all these students in front of computer terminals, and hook them up to heart monitors. Any time someone subscribing to the service wants to view a page, it's first shown to one of the random college students. Now, if their heart rate rises once they see the page, you know that the page should be filtered.

Whats next? (1)

godofredo (198906) | more than 14 years ago | (#990330)

Lots of porno with red and blue body paint! JJ

Nothing new, just a new scam (5)

Old Man Kensey (5209) | more than 14 years ago | (#990331)

I remember speculation about filtering web content by checking the skin-tone levels of images as far back as 1996. At the time everybody more or less decided it was too impractical and gave up on the idea. The standard points about non-flesh fleshtoned objects and large amounts of non-pornographic flesh were made then too.

This is a particularly disgusting (to me at least) instance of the "for the children!" canard. Now instead of politicians using it to achieve their aims, which is bad enough, we've got a company using it to bilk panicked consumers out of their money.

And of course, just as with the quality of our politicians, we Americans have only ourselves to thank for this. If people weren't so damn gullible, companies like this would never sell a dime of product (of course in this case it's questionable whether what they have constitutes a "product", but the point stands...)

What's needed is people willing to stand up and say "Yes, damn it, I do support porn on the Internet, and the easy availability of information on things like bomb-making and lock-picking, and if you don't like the speech I support, TOUGH SHIT. You don't get to pick and choose. If you want free speech, you got it. If you don't want it, go start your own damn country and LEAVE MINE ALONE."

But what are the odds of that happening?

Heuristic analysis (5)

jabber (13196) | more than 14 years ago | (#990332)

All forms of naked women are to be filtered, except when their arms are missing, in which case it's Venus de Milo, and therefore a bona fide work of art.

Clears the way for amputee fetishes, I think. :)

Botticelli's Venus, the image of a naked woman coming out of the surf, that has been used as the box art for Adobe Illustrator (IIRC) would of course be flagged. She has nipples and a 'patch of hair', as do most nudes painted during that time period....

Hell, the Sistine Chapel ceiling is offensive, it shows Adam (naked youth) and God (old man) touching fingers.... There's a bunch of naked little boy cherubs flouncing around them to boot. What horrific kinkiness!!

Re:Nothing new, just a new scam (1)

Schnedt McWapt (195938) | more than 14 years ago | (#990333)

There is no guarantee that filters will be limited to 'jiggling tits' as you so eloquently put it. Look at CyberPatrol with its political agenda.

The 'content filtering' industry is still in its infancy. Believe me, there's commercial potential there, and as the industry grows, people will figure out who has a political agenda and who doesn't.

Put simply, places that pull a 'CyberPatrol' will lose customers.

Planned Enhancements for the Program (2)

StoryMan (130421) | more than 14 years ago | (#990334)

Last week I attended a meeting in Santa Barbara and ran into one of the dudes coding for this place.

One planned enhancement for the software is configurability for the amount of *exposed flesh* shown before the engine kicks in and blocks the image. The idea is to have 'sliders' -- client-side java applets, I'm told -- on an admin/config page which would allow for a specific percentage of (for example) nipple. Once the network identifies the presence of nipple, the position of the configuration sliders determines if this presence is, in fact, pornographic or not pornographic.

My question to the dude I met was how the program quantifies 'pornography' in the first place. If the neural networks are scanning for flesh, then they must have some sort of way to contextualize and quantify porn. (Since the 'I know porn when I see it' definition can't possibly work in a programmatic environment.)

His response was interesting: he claimed that while he couldn't explain exactly how it was accomplished, he mentioned that several state governments are looking to extrope's definitions of a 'porn' image in order to settle various state and local pornography cases throughout the country.

He explained that it will be possible to dump out the specific 'porn' settings -- set by the sliders on the config page -- and generate a long list of what, according to the admin, constitutes porn: 63.5% exposed nipple, more than 72% bare (suntanned but not pale) flesh, the absence of either a shirt or pants [but the presence of black {but not white} underwear], the presence of various objects in the room in which the photograph was taken (a smoking cigarette in an ashtray, for example; or a bottle of Dewar's scotch that looks as though it could have been imbibed by the photographic subject; one black high-heeled pump turned on its side, pointing away from the camera but [an important distinction] *toward* the bed), and so on.

The difficulty, I was told, was deriving an algorithm robust enough to exploit the neural networks but not tax them to their limit. (The employee told me that just a few hours earlier he had successfully implemented an algorithm to determine whether the clothing on the subject in question was purchased from JC Penney's or from Victoria's Secret.)

"It was tough," he explained. "Victoria's Secret uses significantly smaller weaves in their nylon undergarments (hence the higher price for lingerie from VS as opposed to JC Penney's). Try getting a program to recognize a bra from VS, and you've got the holy grail of censorware!"

Re:Oh, yea...this is a *great* idea.... (1)

Phroggy (441) | more than 14 years ago | (#990335)

At any rate, this sort of half-assed content filtering still doesn't replace mom or dad talking to Dick and Jane about the world and what's in it.

Would "Dick and Jane" be blocked as pornographic too?


Re:Ummmm... What If. (1)

SirStanley (95545) | more than 14 years ago | (#990336)

ahhh... too early in the morning. Am not thinking right... but you get the point.

Re:Detecting Skin tones (1)

TheNecromancer (179644) | more than 14 years ago | (#990337)

>>but there were a lot of things that looked like skin to it. Especially light colored woodwork.

How offensive! I think that erotic photos of trees mating should be censored!

Re:Only for white-folk? (1)

/ (33804) | more than 14 years ago | (#990338)

You get to satisfy your porn addiction in peace, it would seem, assuming we postulate the narrow view of racial segregation in erotic desire (as if...).

Re:Only for white-folk? - Of course!!!! (1)

mrzaph0d (25646) | more than 14 years ago | (#990339)

not to mention the fact that most of the "blondes" aren't really...

"Leave the gun, take the canoli."

Re:Oh, yea...this is a *great* idea.... (1)

Schnedt McWapt (195938) | more than 14 years ago | (#990340)

It's pretty humorous that, for a bunch of supposedly tech-savvy people, you fail to understand that regular expressions can easily tell whether a three- or four-letter pattern is bounded by whitespace or not.

Naw, you're just carrying on to be cute, you couldn't be that clueless and hang out on /.
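The whitespace-bounding point fits in a couple of lines; this is a generic sketch, not any particular filter's implementation:

```python
import re

# Match "sex" only as a standalone word: \b is a word boundary, so the
# pattern cannot fire inside "Essex" or "Middlesex".
pattern = re.compile(r'\bsex\b', re.IGNORECASE)

print(bool(pattern.search("visit Essex tourism board")))   # False
print(bool(pattern.search("Middlesex county records")))    # False
print(bool(pattern.search("sex education resources")))     # True
```

A plain substring match (`"sex" in url`), by contrast, hits all three strings, which is exactly the overblocking described elsewhere in this thread.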

Re:The different ways to circumvent this: (1)

mtphoto (179367) | more than 14 years ago | (#990341)

According to the Wired article, it's a proxy server setup for this software. It does the work on their servers (comparing it to their large database of images) and passes the "OK" images on to you.

Open Source (2)

BJH (11355) | more than 14 years ago | (#990342)

I quote:

Wired News tested BAIR by creating a Perl program to extract images randomly from an 87MB database of thousands of both pornographic and non-pornographic photographs. The program then assigned each of those images random numbers as file names.

...Do you suppose they could be convinced to open source that database? Quick, someone call ESR! ;)

Re:Detecting Skin tones (1)

mtphoto (179367) | more than 14 years ago | (#990343)

What would be more interesting is something that does analysis for gradients. I'm not sure how absolute this is, but in all the photography I've encountered (both studied and my own work), the gradient of tonality across skin is pretty unique. I don't mean absolute values that could be fooled by skin color, but the values of the colors relative to each other (due to the reflectance values of skin and the limb effect of light falloff). Though this would not be able to take into account porn with dramatic, artistic lighting. Then again, it's not that I really need or want effective blocking software, but it would be interesting to see what something along those lines would be capable of determining.
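A toy illustration of that gradient idea (my own speculative version with made-up sample values, not a published algorithm): skin under ordinary light tends to shade smoothly, so adjacent-pixel differences stay small, while textured surfaces jump around.

```python
# Largest brightness jump between neighbouring pixels along one scanline.
def max_step(values):
    return max(abs(b - a) for a, b in zip(values, values[1:]))

skin_row  = [180, 176, 171, 165, 158, 150]   # smooth falloff across a limb
woodgrain = [180, 120, 175, 115, 178, 118]   # abrupt, repeating texture

print(max_step(skin_row))    # 8
print(max_step(woodgrain))   # 63
```

Thresholding on a statistic like this could, in principle, separate smoothly shaded skin from skin-coloured woodwork, though as the comment notes, dramatic lighting would break the assumption.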

Re:The different ways to circumvent this: (1)

Emil Brink (69213) | more than 14 years ago | (#990344)

Ah. That's just me being stuck in the old mind set of doing things, I guess. Nowadays, it's all done by proxy. ;^) Hm... I still think "my" way of doing it is cool, though.

what's next ... (1)

rwa2 (4391) | more than 14 years ago | (#990347)

a proliferation of green martian alien pr0n

Only for white-folk? (5)

Anonymous Coward | more than 14 years ago | (#990348)

What about us black folk?

No, I am not trolling. This is seriously flawed. Not to mention stupid.

Can someone mirror the article? (3)

Anonymous Coward | more than 14 years ago | (#990350)

Only joking, but for those of us using corporate HTTP proxies where the sysadmins look for 'abuse' of the network, we're going to get some strange looks after visiting a page called Couldn't they have called the page something a bit more discreet?

Thank god it won't cover.... (1)

JSurguy (85240) | more than 14 years ago | (#990353)

my website showing hot venusian love goddesses.

Flesh tones = pornography? (1)

uk_greg (187765) | more than 14 years ago | (#990356)

I remember a news report from a few years ago that Brigham Young University had implemented similar technology to monitor hits on pornography sites. Whether that was true, or simply an urban legend, I have no idea.

Oh, yea...this is a *great* idea.... (2)

mat catastrophe (105256) | more than 14 years ago | (#990360)

It ranks right up there with sysadmins who add the string "sex" to the list of places you can't go.
This happened at a place I once worked. All of a sudden, we couldn't (for instance) look up the words:

etc, etc.
they also added "young," "adult," and other words to the list...

keep in mind that this wasn't looking just in the domain name for the string, but in the entire URL.
At any rate, this sort of half-assed content filtering still doesn't replace mom or dad talking to Dick and Jane about the world and what's in it.

Reminded... (1)

gavinhall (33) | more than 14 years ago | (#990362)

Posted by 11223:

...of a dilbert strip (I'm looking for the link) where Dilbert invents a device that detects naked bodies and tries it out on a teenager - wanders away and hears the kid's eyes popping (obviously because the device failed and he found it anyway). Besides, the porn industry is smart - they can defeat any kind of protection.

I agree with most of its decisions (1)

jedwards (135260) | more than 14 years ago | (#990369)

Many phallic trees : Yellowstone []

Naughty Areas : Snoopy []

Genitalia : Dogs []

More phalluses : vegetables []

I, for one, will use this software to protect myself from all this obscenity I hadn't noticed before.

I need an IPO for my better product. (1)

Spectre (1685) | more than 14 years ago | (#990370)

My filter blocks 100% of pornographic images, requires little power, and is compatible with all POTS modem connections. I am working on repackaging this hardware filter for DSL and cable modem users.

It consists of a 1 megaohm resistor placed inline with each of the signal wires...

Interested venture capitalists, please send checks made out to CASH, thank-you!

It has a slight overblocking problem, but we are working on this issue...

Even if it were 95% accurate... (1)

grahamsz (150076) | more than 14 years ago | (#990371)

Surely there is enough porn on the internet that 5% of it would be enough to satisfy even the most perverted of 15 year olds.

Also this seems to not make the distinction between nudity and porn. Certainly in America the two are interchangeable, but I've never seen it that way.

Re:Whats next? (1)

ocelotbob (173602) | more than 14 years ago | (#990372)

No, we'll just see more Kitty Porn. []

Re:Detecting Skin tones (1)

mpe (36238) | more than 14 years ago | (#990373)

An algorithm like this may be able to filter a lot of stuff off the web. But it will filter a lot of other stuff too. I can also think of 100 ways to fool it. The easiest being to put images through a color filter before posting them, or post them in black and white. Other people have pointed out that it will filter portraits and other shots of humans that aren't porn.

The specific product in question considers the Windows 3D text screensaver to be "skin tones" as well as sunsets.
The algorithms involved don't appear to be very good at actually recognising "skin tones". Also they are proprietary, rather than carrying the name of some noted AI researcher.

this will be unpopular but... (1)

multimed (189254) | more than 14 years ago | (#990374)

I vehemently agree that censorship is bad--always. But I think companies should be able to try and make products that filter out things--if there's enough parents who want software to filter out pornographic images, then companies will try and make them. The question is choice--as long as it is not forced on people (excluding kids) as long as they choose to use software that does this, I see no problem.

Of course the companies that are just playing PR games and the like (and this one sounds like one of them) are bad.

I guess I just believe that parents should be able to raise their kids the way they see fit, and should probably have some help in internet filtration if they want it.

Of course as someone else said, if the software is 95% or even 99% accurate, that 1% or 5% is still an awful lot of content.

Re:Nothing new, just a new scam (1)

sqlrob (173498) | more than 14 years ago | (#990375)

It's shocking that people like you not only refuse to accept the idea that there should be moral standards imposed on the net, you also seem to feel people have no right to decide what they want to filter out and block their children and themselves from content they find distasteful.

I refuse to accept moral standards imposed on the net. But should people be allowed to filter? If they want to, feel free. PROVIDED, they don't force it on anyone else.

There is no guarantee that filters will be limited to 'jiggling tits' as you so eloquently put it. Look at CyberPatrol with its political agenda.

A different way to detect..... (2)

blogan (84463) | more than 14 years ago | (#990376)

I saw software similar to this a while ago (sorry, can't remember the name or anything), but what it did to detect was to look for large "blobs" of a skin tone (white, black, tan, etc.) and then did some computations to determine what it was. It could "detect" a torso with legs and arms coming out even if they were crossed. So if someone had a bikini on, the blob wouldn't be continuous, and therefore not something naked. So if you add this algorithm to the other it might be pretty good.

Disclaimer: This information might be wrong, it was a while ago that I saw it.
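For what it's worth, the blob approach the parent half-remembers can be sketched with a plain flood fill over a binarised skin mask; the masks below are invented for illustration, not real image data:

```python
# Find connected "skin" regions in a binary mask via iterative flood fill.
def blob_sizes(mask):
    rows, cols = len(mask), len(mask[0])
    seen, sizes = set(), []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and (r, c) not in seen:
                stack, size = [(r, c)], 0
                seen.add((r, c))
                while stack:
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                sizes.append(size)
    return sorted(sizes, reverse=True)

nude     = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]   # one large connected blob
swimsuit = [[1, 1, 1], [0, 0, 0], [1, 1, 1]]   # clothing splits the skin

print(blob_sizes(nude))      # [9]
print(blob_sizes(swimsuit))  # [3, 3]
```

One big blob reads as "suspicious"; several smaller disconnected blobs read as clothed, which is the bikini distinction the comment describes.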

Re:hmm (3)

roman_mir (125474) | more than 14 years ago | (#990377)

These evil bastards must be stopped!

Re:hey, it's not all bad... (1)

generic-man (33649) | more than 14 years ago | (#990378)

If it'll get rid of those annoying X10 ads, then I'm all for it. They already use porn stars anyway, it's only a matter of time before they actually start using porno images.

Doesn't have to work well... (2)

Danse (1026) | more than 14 years ago | (#990379)

That's the beauty of this thing (from a rather draconian management point of view anyway), it doesn't have to do a very good job, it just has to work well enough to intimidate the employees enough that they don't dare visit any sites without a really good reason. Voilà! You've eliminated recreational use of the Internet at your company!

Re:The different ways to circumvent this: (1)

SomeOne2 (51129) | more than 14 years ago | (#990380)

on Win32, that'd be inside GDI, right
Impossible. If that were done (hooking BitBlt etc.), _every_ bitmap copy would have to be checked, which includes double-buffering etc. The system would be unbearably slow. So they probably only hook into IE.

Re:Tripod has been doing this for some time... (1)

BoneFlower (107640) | more than 14 years ago | (#990381)

Tripod's TOS does not allow you to post nudity on your site, and bans hyperlinking to outside sites done for the purpose of getting around their TOS content restrictions.

> then had someone look at all the images that were flagged.

Which shows that they realize the software is not perfect, and check the results against the best image recognition hardware available, the human eye.

They used the software as it should be used: as a tool to make their enforcement of their TOS more efficient, not as an automatic blocker/account tosser. As blocking software I'd say it's next to useless. But as part of a monitoring program where the humans make the final decisions, it can be useful.

Ummmm... What If. (1)

SirStanley (95545) | more than 14 years ago | (#990382)

Ok... Say I'm studying the Venus de Milo (spelling?). Is it gonna classify that as porn, considering it's a statue by that da Vinci guy of a topless woman with no arms? (I do not claim to know anything about this statue... that's from memory. If facts are wrong, boohoo, it's the point that matters.)

We Are A Nation of Loonies! (1)

LaNMaN2000 (173615) | more than 14 years ago | (#990384)

The lengths to which some people will go to "protect" their children from information that they are seeking is incredible. It is unfathomable to me that this is even necessary. Nude pictures do not just appear in the absence of any context! What type of pornographic web site would this image-recognition algorithm block that a simple text-parsing engine would not?

Maybe this is going to be used to protect children from the "flesh tones" that are part of the color scheme on a liberal web site. Go figure.

My New Comic Strip (2)

Dungeon Dweller (134014) | more than 14 years ago | (#990386)

I think that I am going to start a comic strip about pink little bunnies. Yes, they will be very large, flesh toned, round fat little bunnies, with bright pink bellybuttons, big round ears with sort of black hair in between, maybe a couple of mohawks. Red and blonde hair on a few. Big loving eyes, with red pupils.

Hrmm, this software probably works great already.

Next, I think that I'll do a photo spread on fields of wheat.

(Before you mod this down, read what I wrote, it's a joke about the heuristic).

Solutions (1)

paulproteus (112149) | more than 14 years ago | (#990388)

Well, there are a few ways to avoid this filtering technology.

One involves the pads about which we learned yesterday. Those pads could be everyone's ticket around broken filtering.

Or, better yet, a local proxy server using those pads. Encryption, however, really is the only way around it -- or disguising other filetypes as images.

After all, if it doesn't realize that .asdf is the new extension for JPGs, what can it do?

Censorship is inherently broken. Have a look at

If I had this.... (2)

Denor (89982) | more than 14 years ago | (#990390)

I can tell you one thing for certain: If I had a program that could go through the web and find pages that are almost certainly pornographic content, I wouldn't be censoring them ;)
Porn search engine, anyone?

New search engine: "Google Goggles!" (1)

ChiaBen (160517) | more than 14 years ago | (#990392)

Hey, a guy could make money from reversing this type of program to add flesh tones where there is an extraordinary LACK of skintones...:o)

Then you could set up a search engine to dig through images, and add these skin tones, and display the results!

I'm gonna be rich!


Re:Nothing new, just a new scam (1)

sqlrob (173498) | more than 14 years ago | (#990397)

I am not completely up to date on filtering software. Are there currently ANY that don't have an agenda? All of the ones I'm familiar with have the list encrypted or are server based, making it difficult to evaluate whether or not something is going on. And with DMCA, it would be illegal to reverse engineer these lists, making anybody who looked for an agenda a criminal.

The utter idiocy of this.. (1)

talks_to_birds (2488) | more than 14 years ago | (#990398)

...is absolutely mindblowing!

"Unlike most Internet filters, which search for keywords like 'sex', eyeguard checks for "excessive skin tones"."

What kind of skin tone? Caucasian? Negroid? Middle Eastern? Asian?

What if the image is black-and-white? What if it's been scanned and diddled-with too many times and the color balance is all off?

And excessive? What? What's going to happen to a site like or

(Hint: they're both swimwear...)

"Each time a suspect image is detected, the program will alert an office supervisor."

For Christ's sake, two things have to happen here:

  • the self-righteous minority of christian moral absolutists has got to lose its fear of the naked human body

  • and fucking idiot supervisors and managers have got to establish serious web-use policies for their staff and enforce them
In my experience it's the supervisors themselves, and their pet lackeys, who are doing a whole lot of inappropriate surfing while at work.


How many flaws in this stupid idea? (3)

Archeopteryx (4648) | more than 14 years ago | (#990401)

The sorts of images that would not be dealt with correctly:

1. People in swimsuits.
2. People doing nasty things, but wearing "fetish attire."
3. People doing nasty things with Members Of Other Species. (Animals, ICK!)
4. Wrestling (including Sumo).
5. Sunsets. (Some of them have a lot of "flesh tones" in them.)
6. Manipulated images with a slightly more blue color temperature.
7. Medical images.
8. Fine art.
9. Bodybuilding pictures. (see: swimsuits)

What an obvious, but still obviously stupid idea! I've been doing image analysis for over 20 years, and this idea did not deserve a moment's consideration, much less venture capital.

Software/Hardware filtering? :) (1)

Improv (2467) | more than 14 years ago | (#990409)

If there were a filtering proxy service you could use that would recolor the images, and a lens you could hold in front of the computer monitor to see the corrected color, you could avoid any color-based stuff like this :)

The different ways to circumvent this: (2)

(void*) (113680) | more than 14 years ago | (#990410)

  • color index remapping. Write a trivial piece of program that swaps color indexes around. Even a simple color inversion would do.
  • rename jpg files to something else.
  • FFT the images.
  • uuencode/decode into ASCII files.
  • encryption/steganography.
In other words, the software only catches stupid people viewing porn. It drives the smart people who view these things into inventing all sorts of technologically interesting stuff. Thanks!!
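The first item on that list is about two lines of code; a colour inversion is its own inverse, so the viewer just applies it a second time to get the original back, while the filter sees nothing skin-coloured in transit:

```python
# Invert each RGB channel; applying invert() twice restores the original.
def invert(pixels):
    return [(255 - r, 255 - g, 255 - b) for r, g, b in pixels]

original  = [(224, 172, 105), (198, 134, 66)]   # skin-ish tones
scrambled = invert(original)                     # blue-green, harmless to a filter

print(scrambled)                       # [(31, 83, 150), (57, 121, 189)]
print(invert(scrambled) == original)   # True
```
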

Detecting Skin tones (4)

DeadSea (69598) | more than 14 years ago | (#990413)

As part of a project for a multimedia class that I took as a Senior in college, we had to write software to count the number of people in an mpeg video. It was a very open ended project and we weren't expected to be able to get the right answer all the time.

The professor suggested that we start with skin tones. He pointed us to research that tried to pick out the parts of the spectrum considered "skin tone". There were some simple algorithms that were suggested. We did this and it worked decently well, but there were a lot of things that looked like skin to it. Especially light colored woodwork.

An algorithm like this may be able to filter a lot of stuff off the web. But it will filter a lot of other stuff too. I can also think of 100 ways to fool it. The easiest being to put images through a color filter before posting them, or post them in black and white. Other people have pointed out that it will filter portraits and other shots of humans that aren't porn.
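A minimal sketch of that kind of classifier, assuming a made-up RGB rule and threshold (the research the professor pointed to isn't specified here), showing exactly the woodwork false positive described above:

```python
# Toy skin-tone classifier: flag an image when too many pixels fall in a
# rough "skin" range. Both the rule and the 50% threshold are guesses.
def is_skin(r, g, b):
    return r > 95 and g > 40 and b > 20 and r > g > b

def flagged(pixels, threshold=0.5):
    ratio = sum(is_skin(*p) for p in pixels) / len(pixels)
    return ratio > threshold

portrait = [(224, 172, 105)] * 7 + [(30, 60, 120)] * 3   # 70% skin pixels
woodwork = [(205, 170, 125)] * 8 + [(90, 60, 30)] * 2    # light pine panelling

print(flagged(portrait))   # True
print(flagged(woodwork))   # True -- the false positive the comment describes
```
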

about time (1)

froz (69551) | more than 14 years ago | (#990416)

... can detect excessive skintones

so I can finally filter out images of zit-covered, pubescent 15 year-olds?

What can I say? (2)

anatoli (74215) | more than 14 years ago | (#990418)

Now censor this [] .