
DoJ Following Porn Blocker Advances? 265

GreedyCapitalist writes "A new filter called iShield is able to recognize porn images based on the content of the image (other filters look at URLs and text) and, according to PC Magazine, it is very effective. The next generation will probably be even better -- which highlights the retarding effect regulation has on technological progress. If we relied solely on government to ban 'inappropriate' content from the web, we'd never know what solutions the market might come up with. Will the DOJ (which argues that porn filters don't work) take note of filtering innovation or continue its quest for censorship?"
This discussion has been archived. No new comments can be posted.


Comments Filter:
  • Reversal (Score:5, Funny)

    by orangeguru ( 411012 ) on Monday March 20, 2006 @07:30AM (#14955803) Homepage
    Since it is so good at identifying pr0n, I can't wait to get my first pornbot with that function to find me some more.
    • Re:Reversal (Score:2, Redundant)

      by bhima ( 46039 )
      Man, if it's good enough to find the sorts I like with "actresses" I like... sign me up!
    • Re:Reversal (Score:5, Funny)

      by Anonymous Coward on Monday March 20, 2006 @07:42AM (#14955833)
      If you're having trouble finding porn on the Internet you're doing something wrong...
      • I think the problem is that even if a filter is installed, if you're having trouble finding porn on the Internet you're still doing something wrong...
      • I mean, I don't want fat chicks or gay porn or anything with animals, but I do want midgets, bungee-cords and lesbians!

        I welcome this new technology!
  • by Anonymous Coward on Monday March 20, 2006 @07:33AM (#14955808)
    I see nothing in this article suggesting that the DOJ is about to do anything. This is just a review of a product that can block some images, which would be useful for some families.

    I don't understand why this summary has to bring the government into this or speculate that they might do something. There's no evidence of impending censorship, no political issues at work here. It's just a review of a product. Why does Zonk continually try to troll politics on Slashdot? He's turning into something worse than Michael ever was.
    • I don't understand why this summary has to bring the government into this or speculate that they might do something. There's no evidence of impending censorship, no political issues at work here.

      Although I agree that this FP has little to do with government regulation (other than a sort of "proof of concept" for potentially effective porn blocking), we have plenty of proof that the current administration wants to censor the internet... The entire quest to make Google turn over search records on a legal f
      • When people start believing lines such as "We have protected civil liberties by extending the patriot act", they pose more of a threat to this country than Osama ever did.

        Bad news... the fools started swallowing things like that wholesale on 9/12
    • He's turning into something worse than Michael ever was.

      His assertion that regulation has some sort of "retarding effect" on "technological progress" is a bit naive. Making a machine smart enough to distinguish between porn, lingerie and breast exams is fairly remarkable. This is probably an extension of eigenface analysis applied to more general images. Machine vision, in other words.

      Porn driving technology, again? Perhaps. A few years from now we'll take it for granted that machines can identif
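
      If the eigenface guess above is right, the core operation would be projecting an image onto a few trained eigenvectors and thresholding the reconstruction error. A minimal C sketch of that idea; the dimensions and vectors here are dummies standing in for a real trained model, not anything iShield has documented:

      #include <stdio.h>

      #define DIM 4   /* pixels per (tiny) image; real eigenfaces use thousands */
      #define K   2   /* number of eigenvectors kept */

      /* Dot product of two DIM-length vectors. */
      static double dot(const double *a, const double *b) {
          double s = 0.0;
          for (int i = 0; i < DIM; i++) s += a[i] * b[i];
          return s;
      }

      /* Project an image onto K orthonormal eigenvectors, then measure how
         much of it the subspace fails to explain (reconstruction error). */
      static double recon_error(const double img[DIM], double eig[K][DIM]) {
          double recon[DIM] = {0};
          for (int k = 0; k < K; k++) {
              double c = dot(img, eig[k]);   /* projection coefficient */
              for (int i = 0; i < DIM; i++) recon[i] += c * eig[k][i];
          }
          double err = 0.0;
          for (int i = 0; i < DIM; i++) {
              double d = img[i] - recon[i];
              err += d * d;
          }
          return err;
      }

      int main(void) {
          /* Dummy orthonormal "eigenfaces"; a real system learns these via PCA. */
          double eig[K][DIM] = { {1, 0, 0, 0}, {0, 1, 0, 0} };
          double img[DIM]    = { 0.9, 0.1, 0.4, 0.0 };

          /* Low error: the image lies near the trained subspace; high error:
             it doesn't look like the training class at all. */
          printf("reconstruction error: %f\n", recon_error(img, eig));
          return 0;
      }

      A real classifier would compare that error (or the projection coefficients) against per-class statistics; this only shows the projection step.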
  • by BadAnalogyGuy ( 945258 ) <BadAnalogyGuy@gmail.com> on Monday March 20, 2006 @07:33AM (#14955809)
    There is no mention of the DOJ anywhere in the articles you posted. [storyarts.org]

    But according to the article, it works well and doesn't filter out health-related websites. It also doesn't work for black and white images, but the majority of online porn isn't b&w. Or so I've heard.
    • But would you really believe all the claims of such a company? There are also companies that claimed they could identify music just by listening to it, but to this day there is no company that makes a good p2p program with this info in it.

      Now there is a company that claims it can categorize porn (hmm, try explaining to your wife/virtual gf: I work for a company that categorizes porn 8) ). They would love to sell it to a government, and might sell it to some nazi filtering company (they always fi
    • That's because black & white porn is called "art".
  • hmm (Score:5, Insightful)

    by DrSkwid ( 118965 ) on Monday March 20, 2006 @07:35AM (#14955811) Journal
    So does it filter out Rubens?

    Would Michelangelo's David be filtered out?

    How about anatomy/autopsy pictures?

    I would RTFA but it is 404, perhaps my ISP filters out stories about filtering.

    • Re:hmm (Score:3, Funny)

      At some point, the computer would have to decide what is arousing and what is not.

      Could give HAL-9000 a whole new outlook.
    • Re:hmm (Score:5, Informative)

      by Jugalator ( 259273 ) on Monday March 20, 2006 @07:51AM (#14955853) Journal
      So does it filter out Rubens?
      Would Michelangelo's David be filtered out?
      How about anatomy/autopsy pictures?


      This excerpt answers these pretty well:
      A Google Images search on "breast self-examination" was correctly allowed. On a page dedicated to the artistic nudes of Alberto Vargas, it inexplicably decided to tile the text-only links menu with hundreds of tiny shield images; Guardware confirmed this is a bug.

      So it's business as usual. If PC Mag's quick checks revealed innocent sites being blocked, I hope this never sees the light of day as anything mandatory anywhere. I think failing to spread information is worse than actually showing human intercourse. Yes, even if there's a vagina there. I hope the kids aren't traumatized for life if they stumble over such things and the dirtiness of our anatomy.

      Oh, also watch out for the new Pumpk1n Pr0n:
      And we found that some oddly innocent images, in particular "head shots" of pumpkins from last Halloween, were blocked.

      The article says IE would crash more with this tool in use too, but I'm not sure anyone would notice the difference between before and after. ;-)

      I would RTFA but it is 404, perhaps my ISP filters out stories about filtering.

      Just use the Mirrordot version [mirrordot.org].
      • by x2A ( 858210 )
        "I hope the kids aren't traumatized for life if they'd stumble over such things and the dirtiness of our anatomy"

        Of course they're not; it's their parents who are... so for maximum effect, this filter should be used to stop such sites/images from being added to the browser history/cache... the parents will never find out, and the kids just think it's funny, problem solved!!!
      • by hey! ( 33014 )
        Hmmm.

        Reading between the lines of TFA, I'd say that it isn't just doing an analysis of the images, but must be tweaked to allow exceptions, possibly using some kind of analysis of the accompanying text. I doubt this would deter people who wanted to sneak their site around the filter, but it might help with the model problem of the accidental Google link targeted at young children.

        Unless the analysis is very crude, I don't think it would be hard to distinguish oil paintings from photos, which may be a bo
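
        If that guess is right, the combination could be as simple as letting a text-side exception list override the image score. A toy C sketch; the scoring function, the 0.5 threshold and the allowlist term are all invented for illustration, not taken from TFA:

        #include <stdbool.h>
        #include <stdio.h>
        #include <string.h>

        /* Invented stand-in for the real image classifier: 0.0 to 1.0. */
        static double image_porn_score(const char *image_id) {
            (void)image_id;
            return 0.8;  /* pretend the image analysis flagged this one */
        }

        /* Invented text-side exception list, e.g. medical terms on the page. */
        static bool page_text_allowlisted(const char *page_text) {
            return strstr(page_text, "self-examination") != NULL;
        }

        static bool should_block(const char *image_id, const char *page_text) {
            /* Let the accompanying text veto the image score. */
            if (page_text_allowlisted(page_text))
                return false;
            return image_porn_score(image_id) > 0.5;
        }

        int main(void) {
            printf("%d\n", should_block("img1", "breast self-examination guide"));
            printf("%d\n", should_block("img2", "some random gallery page"));
            return 0;
        }

        That would match the observed behaviour: the "breast self-examination" search survives even when the image-side score alone would have blocked it.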
      • by jtcm ( 452335 )
        And we found that some oddly innocent images, in particular "head shots" of pumpkins from last Halloween, were blocked.

        Could it be this pumpkin [goat.cx] that gets blocked? ...not so innocent after all!

    • Re:hmm (Score:4, Insightful)

      by geoff lane ( 93738 ) on Monday March 20, 2006 @08:16AM (#14955911)
      I'm stunned that a bit of software can both read and understand the law and interpret it exactly as a real judge would.

      Why isn't this amazing AI advance being reported?

      • I'm stunned that a bit of software can both read and understand the law and interpret it exactly as a real judge would.

        I'm not at all stunned that a bit of software can interpret the rules exactly as some real judge would. There aren't many of them, fortunately, but there are some idiot judges out there.

      • Re:hmm (Score:3, Insightful)

        by x2A ( 858210 )
        "can both read and understand the law and interpret it exactly as a real judge would"

        Erm, surely the filter is set up to filter based on the wishes of the person who installs/manages it, not legislation. It's not interpreting anything but the image.
    • I would RTFA but it is 404, perhaps my ISP filters out stories about filtering.

      It is more probable that it would filter out pages with 'porn' in the title

  • by gEvil (beta) ( 945888 ) on Monday March 20, 2006 @07:35AM (#14955813)
    Does it work on everyone's favorite image, hello.jpg?
  • Which, in turn... (Score:5, Insightful)

    by kahei ( 466208 ) on Monday March 20, 2006 @07:36AM (#14955814) Homepage
    which highlights the retarding effect regulation has on technological progress

    In other news, today I successfully opened a can of Diet Coke -- which highlights the retarding effect regulation has on quenching thirst. Man, if I'd waited for the government to open that can for me, I'd still be thirsty now!

    If only there were a more effective way to highlight the retarding effect that obsessing over the complete works of Ayn Rand has on independent thought...

    • "In other news, today I successfully opened a can of Diet Coke -- which highlights the retarding effect regulation has on quenching thirst. Man, if I'd waited for the government to open that can for me, I'd still be thirsty now!"

      Yet many expect government to be their first line of defense against the "undesirable" and refuse to help themselves. Of course, after so many years of public "education", this shouldn't be a surprise.
      • by badfish99 ( 826052 ) on Monday March 20, 2006 @09:16AM (#14956090)
        But surely the reason that people call for "the government to do something" is not that they want to be protected against porn themselves, but that they want laws put in place to force their own views on everyone else. It's not "I don't want to see this", it's "nobody should be allowed to see this, even if they want to".
  • I don't get it. (Score:3, Insightful)

    by freedom_india ( 780002 ) on Monday March 20, 2006 @07:36AM (#14955816) Homepage Journal
    I simply don't get it.

    First we shout at the Govt. to get off our backs on this issue, and then, when they fail to come up with any solutions (because we told them NOT to), we slam them for not guiding us or providing any solution.

    What a load of cr*p!

    On one hand we shout about the ineffectiveness of the Govt's first real action in decades to counteract this problem (the Yahoo, MSN and Google searches), and on the other we shout at them for NOT providing a solution at all.

    You tie both my hands behind my back, then you blame me for not shooting at the thief!

    • by rkcallaghan ( 858110 ) on Monday March 20, 2006 @08:01AM (#14955877)
      I simply don't get it.

      ... and you fell for it.

      First we shout at the Govt. to get off our backs on this issue, and then, when they fail to come up with any solutions (because we told them NOT to), we slam them for not guiding us or providing any solution.

      You are failing to realize that the same person is not talking in both cases. Also, while Slashdot as a whole leans to the left, the same issue can have articles written by, and about, people on both sides. The only thing happening here is that someone thought a discussion about image-identification software and its future impact on us would make a good thread, and here we are.

      You tie both my hands behind my back, then you blame me for not shooting at the thief!

      The fallacy lies in missing that the speaker whose hands are tied is not the same speaker as the one doing the blaming.

      Make more sense now?

      ~Rebecca
    • Re:I don't get it. (Score:3, Insightful)

      by TapeCutter ( 624760 )
      "You tie both my hands behind my back, then you blame me for not shooting at the thief !"

      You think it's a character flaw not to kill for property?
    • Re:I don;t get it. (Score:5, Informative)

      by Secrity ( 742221 ) on Monday March 20, 2006 @08:46AM (#14955991)
      There are two separate and distinct "solutions" for people who have issues with porn. The first "solution" is government censorship of the internet. The second "solution" involves local filtering installed by the computer owner, and there are at least two flavors of this "solution". There are bastard situations where various non-federal governments (including libraries) own the computer or the network, which get REAL complicated. There are also situations where ISPs and networks censor access.

      Government Censorship: There are wing nuts who want the US government to censor the internet, usually with cries of "think of the children" or "help fight terrorism". People who know how the internet works generally realize that this is a stupid "solution".

      Local Filtering: There are several different ways that this can be done, and all of the currently available local filtering "solutions" have problems. TFA was about a new local filtering scheme, which COULD be better than the existing methods.

      Local filtering vs. government censorship is, I think, where you see the contradiction. It really isn't a contradiction for people to say NO to government censorship (including local filtering in public libraries) and to also have some of the same people wanting the government to get involved in improving local filtering technologies.

      If it wasn't for porn on the internet, war, gay marriage, and abortion, you couldn't get anybody to go to the polls.
      • (Off-topic slightly)

        "There are wing nuts who want the US government to censor the internet"

        It's alright when it's done in the way they want it done, but I bet the same people would accuse China of being draconian for filtering.

        It's funny how people's views of censorship change depending on what they're talking about censoring, and that sexuality ranks so high on so many people's kill list :-/
      • The other solution of course is to embrace the porn...
    • I never asked the government to do crap about pornography. Sure, there are a few select groups out there who want it censored, but they make up a vast minority. Most people want our government to stay out of our way and out of our business. Our current government is very non-Republican in its strong desire to be Big Brother. Hopefully we will get a real Republican as president in 2008, not this neo-con, and revert to better days. Hell, I hope any Republican representative this November who is pro-
  • Backwards.. (Score:5, Funny)

    by onion2k ( 203094 ) on Monday March 20, 2006 @07:38AM (#14955824) Homepage
    Can I run it backwards and filter out everything that isn't porn? I'd find that more... useful.
    • Filtering out anything that isn't porn on the net is like trying to filter your mailbox for anything that isn't spam.

      In other words, something you could easily do by hand. Unless that hand is busy because of the porn, of course. :)
  • False Positives (Score:5, Insightful)

    by michaelhood ( 667393 ) on Monday March 20, 2006 @07:41AM (#14955831)
    This thing will be ruined by false positives. Swimsuit photos, maybe pictures of animals (similar color tones), etc.

    This won't go anywhere for a long time, until image recognition technology catches up.
    • This thing will be ruined by false positives.

      RTFA. "It didn't block department-store lingerie ads but covered up a few scantily clad models at the Victoria's Secret site. A Google Images search on "breast self-examination" was correctly allowed."

      But they also say: "your tech-savvy teenager may attempt to evade this monitoring by terminating iShield....Without the password, you just can't turn it off." Right. Unless he reboots to a Knoppix CD, for instance. But basically if you don't want porn, this see

      • RTFA. "It didn't block department-store lingerie ads but covered up a few scantily clad models at the Victoria's Secret site.

        Since Victoria's Secret isn't a porn site, that's even more evidence, right there, that it doesn't work.
    • Re:False Positives (Score:5, Interesting)

      by anthony_dipierro ( 543308 ) on Monday March 20, 2006 @08:17AM (#14955915) Journal

      This won't go anywhere for a long time, until image recognition technology catches up.

      Even then, one person's "porn" is another's "art". Even a human can't correctly distinguish offensive vs. non-offensive content with all that much accuracy. (This is besides the fact that around the same time as image recognition technology catches up computers will have overtaken the world and we'll be following their rules rather than our own.)

    • Swimsuit photos, maybe pictures of animals (similar color tones), etc.

      If they made this for video, it'd be interesting to see how many commercials, award ceremonies and various news shows would be labeled as soft porn.

      Can't say you could call it a bug either...

    • Indeed. But it seems to me that this is just one kind of false positive, anyway - filtering images that contain naked bodies (or body parts) even though they're not porn is one problem, but how do you make sure that images that don't contain naked body parts but still *are* porn are filtered? There's more of those than one might think.
  • by lemonjus2 ( 939285 ) on Monday March 20, 2006 @07:43AM (#14955835)
    I wonder how many hours the poor programmers worked in order to test this thing :)

    Looking for porn that the filter can't handle...

    What those meetings must have looked like.
    • by TiggertheMad ( 556308 ) on Monday March 20, 2006 @08:05AM (#14955883) Journal
      Manager: "All right team, looks like Joe has finally come up with a fast and fairly accurate algorithim to spot those dirty old pornographic images. We will need to test it a bit first, to see what the signal to noise ratio is. We will need some test groups, though."

      "Yeah, sure bob, you can run the 'barely legal college girls' tests. Janet and Simone, you check the 'hot lesbian' batches. What? Sure Ramone, you can check the 'young gay studs' test. Now, who is going to run the 'goatse.cx' tests?"

      "Guys....? Anyone?"
  • Errors abound (Score:5, Interesting)

    by michaelhood ( 667393 ) on Monday March 20, 2006 @07:45AM (#14955839)
    FTA: And we found that some oddly innocent images, in particular "head shots" of pumpkins from last Halloween, were blocked. But overall, [it did a good job] of blocking the images you'd want blocked.

    This thing won't be deployed en masse with problems like that... it quickly becomes uneconomical for admins to be whitelisting pictures of pumpkins.
    • Re:Errors abound (Score:3, Insightful)

      by BeardsmoreA ( 951706 )
      But how many employees will come to their BOFH complaining that they couldn't look at their neighbours' Halloween photos? On their work machine? In work time? Irritating if you're the employee, but not likely to keep employers awake at night, I'd have thought. Let's be honest, 90% of most employees' surfing at work is probably less than work related, and if you really do have a job that involves looking for pictures online a lot, you're probably a prime candidate for whitelisting from the whole thing.

      OTOH, For

      • "But how many employees will come to their BOFH complaining that they couldn't look at their neighbours halloween photos?"

        In the city I live in alone, there are 3400 jobs at 8 different companies that specifically market selling pumpkins (unless I'm getting confused with something I just imagined). There's a lot of money in it, and they wouldn't be happy having their images blocked.
  • So what they are really doing is trying to teach an AI morality? Does anybody know how they do this. What is the difference between a nipple and a cherry (the fruit) to a computer. In some point in the future will are goverment be able to make computeres see thier motrality and then tell them to go enforce it?
    • somebody mod me down for not knowing how to spell, My post is so embarassing.
    • Re:Marality and AI (Score:3, Insightful)

      by Jugalator ( 259273 )
      Does anybody know how they do this. What is the difference between a nipple and a cherry (the fruit) to a computer.

      And more interestingly, what's the difference between a nipple in a nudist shot and one that's not?
      Nudism wasn't illegal in any modern country I know.

      There are plenty of even less grey-area cases like these that would be problematic, as a poster mentioned above. Art, both paintings and photography, etc. If we simply forbid the human body for religious reasons and whatever, isn't that admitting S
      • Nudism wasn't illegal in any modern country I know.

        Sorry, I should've written "modern and democratic country".
        Those were the ones I was thinking of, not well-industrialized, "modern" dictatorships etc. ;-)
    • "In some point in the future will are goverment be able to make computeres see thier motrality and then tell them to go enforce it?"

      I hate to repeat myself, but I feel it's worth mentioning here anyway (and I will add to it): the filter is set up to filter based on the wishes of the person who installs/manages it, not legislation. It's not interpreting anything but the image, and it's not enforcing anything other than the wishes of the person who owns the computer. This is not a big-brother issue! Sure it
  • by NigelJohnstone ( 242811 ) on Monday March 20, 2006 @07:52AM (#14955854)
    From 2001:

    http://www.isp-planet.com/news/2001/messagelabs_011126.html [isp-planet.com]

    "SkyScan AP uses Image Composition Software (ICA), which decomposes an image," White explained. "It runs 22,000 algorithms and in addition to skin tone textures, it can decipher porn through other features such as facial expressions.""

    In practice these tools are simply filtering by URL, then by colour gamut analysis.
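
    "Colour gamut analysis" generally means classifying each pixel as skin-like by its RGB values and thresholding the skin fraction. A minimal C sketch using one commonly cited RGB skin rule; the exact thresholds and the 40% cutoff are illustrative guesses, not anything these vendors have published:

    #include <stdbool.h>
    #include <stdio.h>

    /* One commonly cited RGB skin heuristic; any such rule is approximate. */
    static bool is_skin(unsigned char r, unsigned char g, unsigned char b) {
        unsigned char hi = r > g ? (r > b ? r : b) : (g > b ? g : b);
        unsigned char lo = r < g ? (r < b ? r : b) : (g < b ? g : b);
        return r > 95 && g > 40 && b > 20 &&
               hi - lo > 15 &&
               r - g > 15 &&        /* implies r > g */
               r > b;
    }

    /* Flag an image when too many of its pixels look like skin. */
    static bool looks_like_porn(const unsigned char *rgb, int npixels) {
        int skin = 0;
        for (int i = 0; i < npixels; i++)
            if (is_skin(rgb[3*i], rgb[3*i + 1], rgb[3*i + 2]))
                skin++;
        return skin * 5 > npixels * 2;   /* illustrative 40% threshold */
    }

    int main(void) {
        /* Two skin-toned pixels and one dark one: 67% "skin", so flagged. */
        unsigned char img[] = { 220, 170, 140,  210, 160, 130,  30, 30, 30 };
        printf("blocked: %d\n", looks_like_porn(img, 3));
        return 0;
    }

    Note that pumpkin orange (say R=230, G=120, B=40) sails through a rule like this, which would explain the blocked Halloween "head shots" mentioned upthread.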

  • by alx5000 ( 896642 ) <alx5000&alx5000,net> on Monday March 20, 2006 @08:01AM (#14955876) Homepage
    Nobody seems to care about the 10^5 monkeys that check every image, thus making the filter work... Poor perverted animals...
  • I remember Clearswift (or whatever they were called in the day, or are now...) had a PornSweeper product years ago, which judged images based on skin tones. Things have probably gotten a bit more sophisticated nowadays, but the principle remains the same. And there will still be a lot of false positives. Plus, people will quickly find a way to overcome these kinds of filters.

    All in all, however, I'd rather see these kinds of initiatives than a governmental crusade against online porn, which would not only be
  • by account_deleted ( 4530225 ) on Monday March 20, 2006 @08:03AM (#14955881)
    Comment removed based on user account deletion
  • by Zapman ( 2662 ) on Monday March 20, 2006 @08:12AM (#14955903)
    if (percent_pink_pixels(image) >= 70)
        flag_as_porn(image);

    Step1: use silly algorithm
    Step2: ...
    Step3: PROFIT!
  • For one, I would hope that the issue of children viewing pornography is a state matter, not a federal government matter. I don't believe it fits in with the general welfare clause. However, they can legislate all they want for the 10-mile federal district.

    I don't want the federal government getting involved with this. This is just censorship towards minors. Minors should be able to view what they please, but parents should be the ones responsible for stopping them from viewing things they don't wish for their y
    • I don't know how many popular browsers, if any, have this option, but what if there were a way to disable all image content? If all image content were disabled, wouldn't that solve all the problems of visual pornography?
  • Solution? (Score:2, Insightful)

    by BoxedFlame ( 231097 )
    For there to be a solution, there has to be a problem. I don't see a problem except moral panic and one group's willingness to impose their sense of morality on everybody else.
  • Hmm.. (Score:3, Funny)

    by bigattichouse ( 527527 ) on Monday March 20, 2006 @08:18AM (#14955921) Homepage
    So if I build a website for close-up shots of orchids, will it get banned?
  • by OverflowingBitBucket ( 464177 ) on Monday March 20, 2006 @08:19AM (#14955923) Homepage Journal
    I've developed a simple algorithm for checking web pages for pornographic content. It is roughly 98% accurate when fed a random page from the 'net. Here's the code so far:

    #include <stdbool.h>

    bool check_porn_content(const char *url)
    {
        (void)url;      /* the URL itself is never inspected */
        return true;    /* 98% of random pages: close enough */
    }

    Any suggestions for further development, or licensing queries, please let me know.
  • First of all, I am of the belief that it is NOT the government's job to tell us what we can and cannot see. I do not care what it is; the government should take no part in forcing its citizens to look at one topic vs another. Secondly, we as adults and responsible human beings need to start taking responsibility for things and not wait for father government to step in and tell us how to think. It is YOUR responsibility as an adult to view what you want to, and if you come across something offensive how
  • It's pretty dumb. Every previous claim of this functionality has turned out to be skin-tone detection.

    Machine vision can't even begin to start on this until you actually know what porn is, from an ontological point of view (a problem I've mostly nailed). Even then, the recognition algorithms for such won't be written for many years, and won't run on reasonable hardware for years after that. It's pretty dumb if you ask me.
  • Not many of you... (Score:5, Interesting)

    by hdparm ( 575302 ) on Monday March 20, 2006 @08:26AM (#14955940) Homepage
    ...know about what happened [nzherald.co.nz] to Bryce Coad of Zombie Linux [zombie.net.nz], almost 4 years ago. Whether his explanation was in fact true, I don't know. But obviously, some people thought about this a long time ago.
  • Why? (Score:5, Informative)

    by BenjyD ( 316700 ) on Monday March 20, 2006 @08:27AM (#14955944)
    Your 6-year-old may mistype his favorite cartoon's URL and wind up at a porn site; a 16-year-old may reach the same site deliberately

    Why should the sixteen-year-old be stopped from looking at porn? He's over the age of consent; what's wrong with letting him look at some naked women? He's probably thinking about sex all the time anyway, that's just what teenagers do.
    • Re:Why? (Score:3, Insightful)

      by RPoet ( 20693 )
      Sixteen is not a universal age of consent. There are places that set that age higher, and places where it's lower. In either case it has nothing to do with the appropriateness of watching porn.
  • by digitaldc ( 879047 ) * on Monday March 20, 2006 @08:33AM (#14955952)
    Funny how they make very effective filters for pr0n, but violence is AOK.
    You can bomb, shoot, maim every night on the nightly news, but God forbid you show a naked breast...people might be harmed!
    There are hypocritical cultural 'norms' in the USA.
  • I'm patenting the following plug-in: take a photo, cut it into little pieces, change the skin tones to vivid green, make some simple random transformations, and add a button: "remove fig-leaf".

    If you see this, the user just has to hit the remove fig-leaf button; each element of the photo is unscrambled locally (with Java or maybe JavaScript), and the various vignettes are shown ordered and side by side for the enjoyment of people at work and other people that could potentially be minors in their jurisdiction, but
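
    The scrambling half of that scheme is simple enough: a seeded shuffle over tile indices, which anyone holding the seed can invert client-side. A toy C sketch of just the permutation and its inverse; actual tile pixels, the green recolouring, and the Java/JavaScript delivery are left out:

    #include <stdio.h>
    #include <stdlib.h>

    #define TILES 8

    /* Seeded Fisher-Yates shuffle: the "cut it in little pieces" step.
       perm[slot] = which original tile is displayed in that slot. */
    static void scramble(int perm[], int n, unsigned seed) {
        srand(seed);
        for (int i = 0; i < n; i++) perm[i] = i;
        for (int i = n - 1; i > 0; i--) {
            int j = rand() % (i + 1);
            int t = perm[i]; perm[i] = perm[j]; perm[j] = t;
        }
    }

    /* The "remove fig-leaf" step: invert the mapping so the viewer
       knows which slot to fetch each original tile from. */
    static void unscramble(const int perm[], int inv[], int n) {
        for (int s = 0; s < n; s++) inv[perm[s]] = s;
    }

    int main(void) {
        int perm[TILES], inv[TILES];
        scramble(perm, TILES, 1234u);   /* sender and viewer share the seed */
        unscramble(perm, inv, TILES);
        for (int t = 0; t < TILES; t++)
            printf("tile %d is hidden in slot %d\n", t, inv[t]);
        return 0;
    }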
  • on the same Internet where a picture of slightly distorted text can defeat a script, like with a captcha?
  • Doesn't work (Score:3, Informative)

    by laptop006 ( 37721 ) on Monday March 20, 2006 @08:59AM (#14956027) Homepage Journal
    Here's a great review of a previous generation of this kind of thing.

    http://dansdata.com/pornsweeper.htm [dansdata.com]
  • Off-topic, but related to software that can recognize content... oh yeah, I'll tie it into law enforcement too...

    Supposedly the next version of Mac OS will have, at minimum, OCR built in, meaning it will scan images for words and put them in a searchable index. Now, eventually (or maybe in the next release) it might do facial recognition too, or even scenery recognition (beach, house in background, night versus day, etc.). Maybe it can learn or be taught which house is whose, so even that can be automag
  • "When there are too many porn images on a page, even the Google logo is blocked!"

    That's a feature?

    http://www.pcmag.com/slideshow_viewer/0,1205,l=&s=26690&a=171720&po=6,00.asp [pcmag.com]
  • The article mentioned that shape-recognition technology is being employed. This is especially interesting because the abstract mathematics that underlie shape recognition are the same as the abstract mathematics that underlie another computational problem: decompilation.

    Machine-language instructions are vertices. High-level-language loops, functions and similar structures are more or less complex primitive shapes {squares, triangles, circles, ...}. The question "To what shape do these points belong?"
  • by account_deleted ( 4530225 ) on Monday March 20, 2006 @09:35AM (#14956163)
    Comment removed based on user account deletion
  • by FleaPlus ( 6935 ) on Monday March 20, 2006 @01:42PM (#14958082) Journal
    I'm not too familiar with more recent work, but there's a well-cited paper by Fleck, Forsyth & Bregler (1996) on using image analysis to determine whether or not there were naked people in an image. My inner juvenile always found the title kind of amusing, "Finding Naked People" [hmc.edu]. Fleck also has a web page [hmc.edu] with some descriptions.

    I'm not so sure about its applications, but it's certainly an interesting vision problem.
