Slashdot regular Bennett Haselton writes "The ACLU has targeted a group of Tennessee school districts for blocking websites categorized by a blocking company as 'LGBT.' I hope the ACLU wins, but it may create the mistaken impression that egregious overblocking of websites is easy to fix. On the contrary, the vast majority of errors are hard-coded into the products and cannot be fixed by unblocking a single category." Hit that tantalizingly titled 'Read More' link to read his essay.
The ACLU is threatening to sue a group of Tennessee School Districts for using blocking software that blocks sites categorized as "LGBT" — that is, sites themed around lesbian, gay, bisexual or transgender issues that would not be classified as pornographic. Some of the blocked sites include the Gay and Lesbian Alliance Against Defamation and the Human Rights Campaign.
Legally, the school districts' decision to block these sites seems fairly indefensible. The content being censored is political speech, not illegal to distribute to minors, and as the ACLU points out, by blocking these sites the school districts are engaging in "viewpoint discrimination," since the schools allow access to anti-gay sites like Americans For Truth About Homosexuality (which, ironically, features a disclaimer saying its content is not suitable for children). But you never can tell with judges. A judge in Utah once ruled in favor of a school that suspended a student for wearing a t-shirt with the word "Vegan." (Do you think the judge would have made the same ruling if the student's t-shirt had said "Christian"?)
However, while the ACLU would be right to bring this case, winning it could have an unintended side effect. By focusing on the fact that the "LGBT" category is enabled in these districts, the case sets up a contrast with districts that have not enabled the "LGBT" category, which could lead people to think that those districts are not blocking LGBT sites. This is not the case.
When a school district buys blocking software, the software comes with an encrypted list of websites sorted into different categories; categories like Pornography and Nudity are typically blocked, while categories like LGBT usually are not. If a site falls into one or more of the blocked categories, then attempts to access that site will be blocked (at least until some reprobates help you get around the filter). However, it's the blocking company that decides what to put on the list under each category. And even if only categories like "Pornography" are enabled, there are likely to be many non-pornographic sites categorized as "Pornography," and hence blocked wherever that category is turned on.
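The mechanics described above can be sketched in a few lines. This is an illustrative toy, not any vendor's actual code; the domain names and category assignments are hypothetical, and the point is simply that a district's only knob is which categories to enable, so a miscategorized site is blocked wherever its (wrong) category is turned on:

```python
# Toy model of vendor-supplied category blocking. The vendor controls the
# blacklist; the school only chooses which categories to enable.
BLACKLIST = {
    # domain: categories assigned by the blocking company (hypothetical data)
    "example-adult-site.com": {"Pornography"},
    "glaad.org": {"LGBT"},                     # non-pornographic advocacy site
    "example-jazz-orchestra.org": {"Pornography"},  # a miscategorization
}

def is_blocked(domain, enabled_categories):
    """A site is blocked if any of its vendor-assigned categories is enabled."""
    return bool(BLACKLIST.get(domain, set()) & enabled_categories)

# A district that enables only "Pornography" never sees the LGBT category,
# yet still blocks the wrongly categorized site:
enabled = {"Pornography"}
print(is_blocked("glaad.org", enabled))                 # False
print(is_blocked("example-jazz-orchestra.org", enabled))  # True
```

Note that disabling the "LGBT" category fixes only the first case; the second is baked into the vendor's list and is invisible to the district.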
When the ACLU of Washington sued the North Central Regional Library system for enabling blocking software for all patrons (including adults), they asked me to test the Fortinet Web filter that the library was using. I used a random sample of 100,000 .com and 100,000 .org domains and ran them through an automated script to find 536 .com domains and 207 .org domains that were blocked by Fortinet. Of those, about one out of every eight .com sites categorized as "Pornography" or "Adult Materials," and one out of every four .org sites blocked in those categories, was a site with content that could not possibly be considered "adult" — some of the sites blocked in these categories included the Dabar Worship Center, the immigrant-rights group Families for Freedom, and the Seattle Women's Jazz Orchestra. Extrapolating these ratios to the set of all .com and .org domains in existence, one could conclude that there were about 71,000 non-pornographic .com sites and 5,800 non-pornographic .org sites blocked by Fortinet as "Pornography" or "Adult Materials" — a number almost certain to grow into six figures when you add in all the sites outside of .com and .org. Years earlier, I had run similar tests for Cyber Patrol and SurfWatch (products which have since been discontinued) and found that an absolute majority of sites blocked by each program were actually non-pornographic, which translated into an estimate of hundreds of thousands of .com and .org sites wrongly classified as "porn."
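The extrapolation step above is simple proportional scaling, and it can be written out explicitly. The sample counts and error ratios below come from the text; the total-domain counts are placeholder values I've assumed for illustration (the real registry totals at the time were different, which is why these outputs won't exactly match the article's figures of 71,000 and 5,800):

```python
# Proportional extrapolation from a random sample to the full domain space.
sample_size = 100_000

# From the Fortinet test described in the text:
com_blocked = 536        # .com domains blocked, out of 100,000 sampled
org_blocked = 207        # .org domains blocked, out of 100,000 sampled
com_error_rate = 1 / 8   # fraction of blocked .com "porn" sites with no adult content
org_error_rate = 1 / 4   # same fraction for .org

# ASSUMED totals, for illustration only:
total_com = 80_000_000
total_org = 8_000_000

def estimated_wrongly_blocked(blocked_in_sample, error_rate, total_domains):
    # (blocked fraction of the sample) x (error rate) x (all domains in the TLD)
    return blocked_in_sample / sample_size * error_rate * total_domains

print(round(estimated_wrongly_blocked(com_blocked, com_error_rate, total_com)))  # 53600
print(round(estimated_wrongly_blocked(org_blocked, org_error_rate, total_org)))  # 4140
```

Whatever totals you plug in, the structure of the estimate is the same: even a sub-1% block rate, multiplied by a meaningful error rate and tens of millions of domains, yields tens of thousands of wrongly blocked sites.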
Only the blocking companies know for sure how such stupid mistakes end up on their lists, but the most widely accepted explanation is that they use machines to crawl the Web and guess which sites are pornographic, and add those sites to their blacklists without any human intervention. In their early years, the makers of SurfWatch and Cyber Patrol claimed that employees actually did review sites before adding them to their lists, but that claim became increasingly untenable as more and more reports came out of sites being blocked with no adult content on them.
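To see why fully automated classification produces exactly these kinds of mistakes, consider a deliberately naive keyword scorer. This is a hypothetical illustration, not any vendor's real algorithm: it counts "adult" vocabulary with no understanding of context, so innocuous pages that happen to use those words get flagged:

```python
# A naive, context-free keyword classifier (hypothetical) of the sort that
# produces false positives when run over the Web with no human review.
ADULT_KEYWORDS = {"xxx", "porn", "nude", "adult", "escort", "hardcore"}

def naive_porn_score(page_text):
    """Count how many distinct 'adult' keywords appear in the page text."""
    words = set(page_text.lower().split())
    return len(words & ADULT_KEYWORDS)

# A page about adult-education classes for hardcore jazz fans trips the
# filter at a threshold of 2, despite having no adult content:
innocuous = "evening adult education classes for hardcore jazz fans"
print(naive_porn_score(innocuous))  # 2
```

Real classifiers are more sophisticated than this, but any system that blacklists sites without human review will make context errors of the same general shape.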
Nobody has yet done a similar study for the ENA blocking program, but every blocking program that has ever been tested has had a non-trivial error rate that extrapolates to at least hundreds of thousands of non-pornographic websites being blocked under "Pornography" and similar categories. There is no reason to think that the ENA blocker is different; at the very least, if they claim that it is, then the burden of proof should be on them.
So, the ACLU will probably succeed in persuading the Tennessee Schools Cooperative to stop blocking the "LGBT" category, but that doesn't mean that LGBT sites — or any other category of non-pornographic sites — will no longer be blocked. A student who encounters a blocked LGBT site could request an override, but what if they don't want to "out" themselves as someone who was browsing an LGBT site? Is Tennessee the best place to be known as the "queer who wanted to get around the porn filter"? And there may not be an option of getting an override anyway. Some of the correspondents on Peacefire's mailing list for new proxy sites to get around blockers are teachers who aren't given a password to bypass the blocker on their school's computers.
Then of course — you know what's coming — there is the other, larger sense in which unblocking the LGBT category doesn't "fix the problem," which is that there would be no "problem" if we didn't think of teenagers as children instead of adults. You've probably already decided which side you're on in that debate, but consider it as a scientific question instead of a moral one. Do you think there is any objective evidence that teenagers, if they were given the same rights and responsibilities as adults, would behave differently from adults to a large degree — more differently than, say, men and women behave from each other? The trouble with the "evidence" we gather from personal interactions is that it's not truly objective — if someone believes that teenagers are immature and adults are not, they're likely to see and remember only the pieces of evidence that confirm that belief.

A true double-blind experiment might involve talking to someone through a computer terminal and rating the other person's "maturity" based only on their responses. That's a start, but the trouble with that experiment is that adults tend to know a larger set of words, so a participant might rate the other person as more "mature" because of their large vocabulary, even though having a large vocabulary is quite different from having mature thoughts or logical reasoning skills. A fairer test might be to take a non-native-English-speaking adult and a native-English-speaking young teenager who scored about the same on a test of English vocabulary, and see if participants could tell the difference in maturity between those two test subjects while talking to them through a computer terminal. I am not aware of any experiment along these lines that has been done, but this is the sort of evidence of differences between adults and minors that would be truly objective.
Most of the evidence in favor of the innate "adulthood" of teenagers is also anecdotal and not scientific, but it is compelling. As psychologist Robert Epstein has pointed out in The Case Against Adolescence, for thousands of years humans in their early teens were giving birth and raising children of their own. That obviously does not mean that that is a good idea in today's society, it just means that somewhere along the way, we must have lost sight of the level of responsibility that human teenagers are biologically capable of handling. If one of our Stone Age forebears could be brought back to life, he might eventually get used to the Web, but he'd probably always be amused by the idea of Web blockers for teenagers who are older than he was when he was raising his first child.