Slashdot: News for Nerds


312 comments

1st Spam (0, Informative)

Anonymous Coward | more than 12 years ago | (#3328908)

FS !!

Moderation - A Warning from History (-1)

ringbarer (545020) | more than 12 years ago | (#3328954)

Visitors to the website slashdot.org [slashdot.org] will by now surely have heard of the act of Moderation. This is where a contributor's post can be 'Moderated' either positively or negatively, depending on how the Moderator perceives the value of the post. There is a sliding scale of total moderation points, from -1 to 5, along with snappy summaries of the reason for moderation, such as "Funny", "Insightful", or the ever popular "Troll". An additional benefit offered to Moderators is the ability to ban a poster from contributing, by negatively moderating enough of his postings in a 24-hour period.
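As a rough sketch, the scoring scheme described above (a per-post score clamped to the -1..5 range, adjusted by labeled moderations) might be modeled as follows. All names, labels, and structure here are illustrative assumptions, not Slashdot's actual code:

```python
# Illustrative model of the moderation scheme described above.
# Names and labels are assumptions, not Slashdot's real implementation.

MIN_SCORE, MAX_SCORE = -1, 5
LABELS = {"Funny", "Insightful", "Informative", "Interesting", "Troll", "Flamebait"}

def moderate(score: int, delta: int, label: str) -> int:
    """Apply a single +1/-1 moderation, clamping to the allowed range."""
    if label not in LABELS:
        raise ValueError(f"unknown moderation label: {label}")
    if delta not in (-1, 1):
        raise ValueError("a moderation adjusts the score by exactly one point")
    return max(MIN_SCORE, min(MAX_SCORE, score + delta))
```

For example, a post already at 5 stays at 5 when moderated "Funny", and a post at -1 cannot sink further under "Troll" moderation; the clamp, not the label, bounds the score.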

In order to retain some level of fairness for the Slashdot population, the Slashdot Editors (adopting the role of 'Benevolent Dictators') have implemented a scheme whereby regular users of Slashdot, chosen essentially at random, are given the ability to act as Moderators.

This underlines an inherent flaw in the system. Psychological studies have shown that in any community, no matter how small, should a random sampling of people be given the slightest grasp of power, they will immediately abuse it. There is a primal, evolutionary desire in Man to place himself higher than his peers by whatever measurement he can muster. Slashdot Moderation provides the ideal means by which a man can prove himself more equal than others.

At the risk of invoking Godwin's Law at such an early point in my thesis, I have no choice but to compare Slashdot Moderation to the systematic genocide of the Jewish community in 1930's Germany.

A bold statement, I admit, and deliberately designed to shock, but I feel the statement is necessary. I shall now offer a more rational explanation, as well as a comparison of the parallels between Slashdot Culture, and the National Socialist regime.

First, some history. National Socialism did not spring up overnight. It grew from a feeling of national bitterness and resentment at the war reparations Germany was forced to make after World War One. Germany was a broken country, populated by desperate starving people. And to the desperate, an extreme ideology begins to seem like a rational choice.

The advent of new technology forces a paradigm shift in the way the beholders of that technology think. The Christianity Meme was made widespread by the invention of the Gutenberg press. And the rise of National Socialism was made popular by the invention of Cinema. Here was a new means to control the flow of information to the populace, one they were willing to listen to unquestioningly due to the 'novelty factor' of moving pictures. It is no coincidence that some of the best Cinematography of the early 20th Century came out of the National Socialist propaganda machine.

Why is this the case? It is yet another fault of man that a new means of distributing memes is perceived, due to the 'newness' of the medium, to have a greater 'validity' than older media. Those harnessing new inventions have the power to win control of the hearts and minds of others.

With the tools in place, who should the National Socialists target? Clearly, as a counterpoint to Man's desire to hold power over others, there is also a desire to resent the success of others. If someone is successful, they reduce the self-worth of their beholders. Although times were harsh in Germany in the prelude to World War II, there were still successful inhabitants of that country. Possessing shrewd business acumen as well as the contacts in other countries needed to maintain support in such a poverty stricken and broken land, who else should deserve the wrath of the populace more than the Jews?

Fast-forward to the latter quarter of the 20th Century. Computing technology is focused in niche markets, and limited to big successful companies like IBM and Microsoft. As the markets were limited, there were also limited opportunities for employment. This gave rise to a rising number of college dropouts, seething with resentment and unable to relate to society beyond the staccato clatter of keyboards and the pallid green glow of an 80x24 text display, and lacking the basic business skills (and a smart suit) needed to secure employment at one of these companies.

At this time, a new invention was beginning to take hold in College campuses throughout the world. The Internet. As with the Gutenberg press and Cinema beforehand, this new technology would grow to spread one of the most virulent memes of the modern age - Open Source Software, created as the antithesis of successful business practice.

So, the parallels between the birth of Anti-Semitic National Socialism and the birth of Open Source Software have been made. Of course, it is easy to claim that A=B without providing further logical evidence in support. So, the next task of my thesis is to provide further parallels, and bring this discourse back to the initial focus on Slashdot Moderation.

Slashdot was conceived, in its original 'Chips 'n' Dips' incarnation, as a vehemently anti-corporate Open Source website. Roughly 10-15 years down the line from the birth of Open Source, it has become saturated with propaganda, and now forms the centrepiece of the Open Source Development Network. An authority in its field, Slashdot's success is in no small part due to the ability of the editors to 'pick and choose' valid news articles submitted by users, and present the same old tired "Open Source Good / Closed Source Bad" rhetoric time and time again, dabbling with anti-copyright and the right of the 'common man' to remove an artist's ability to gain compensation for the work. In essence, this is similar to the 'paring down' of artistic worth in 1930's Germany. If no one is willing to contribute valid and vibrant art to the community, then all art shall become harsh and functional, possessing a certain intimidating aesthetic.

Which leads onto Open Source's shining achievement - Linux. This diatribe is not aimed towards Linux in particular, as it is a well-oiled, well-tuned machine. A technically adept Operating System, it is worthy of admiration by any rational man. The point of this thesis is not to attack the art produced by Open Source coders, which in itself is worthy, but to enlighten all as to the political processes behind the OSS movement.

By the same scale, it is hard to fault Mercedes for the technical excellence of the vehicles which were used by the National Socialist party. But the politics behind the party are what taint the image of Mercedes' vehicles of the era. The Swastika itself is a benign symbol, found to this day in such diverse locations as Pokemon cards, but is permanently tainted with the history of the acts made under its auspices. In the same way, companies switching to Open Source solutions will begin to regard the Penguin with the same trepidation as their profits fall.

It is worth noting at this point that IBM, previously one of the world's greatest companies, has begun reporting severe financial losses, no doubt due to its adoption of Open Source practices. This epoch-making event was NOT reported on Slashdot, even though articles were submitted.

And what of the other great company mentioned above? Microsoft, aka Micro$oft, Mickeysoft, Microshaft, Kro$oft, and many other derogatory and undeserved names. Throughout the previous 25 years, Microsoft has grown from strength to strength, again possessing shrewd business acumen as well as providing products that people want. This makes them the number one target for the OSS movement. Incapable of standing by their own merits, the OSS zealot would rather attack Microsoft as a priority than produce anything of worth for their community.

Slashdot Moderators, crazed with their limited new-found power, exhibit this behavior. It is a sad state of affairs that the majority of article moderations are negative. Where is the positive feedback and sense of social contribution? Nowhere to be found. Moderators are too focused on putting their peers down to make themselves appear superior, rather than doing the hard work and becoming better on their own terms.

As the National Socialists required a scapegoat, Slashdot Moderators require a constant stream of Postings to label '-1, Inferior'. Once a posting is reduced to the score of -1, it becomes invisible to the casual user. Again, this is a parallel to the Ghettoization of Germany upon the election of Hitler.

In essence this would not be so bad, were postings to be evaluated on their own terms. However, alongside the moderation of their postings, each user has a 'Karma' value, namely the sum of their worth to the Slashdot community. As a user's posts are moderated up or down, so their Karma fluctuates. As Karma becomes negative, a user's default posting score is reduced, until they are posting at a default of -1. Again, ghettoizing PEOPLE, not just their opinions.
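The Karma mechanism described in the last two paragraphs can be sketched roughly as follows. The specific thresholds and function names here are illustrative guesses, not Slashdot's actual values:

```python
# Rough, illustrative model of the Karma / default-score behavior described
# above. Thresholds and names are assumptions, not Slashdot's real values.

def apply_moderation(karma: int, delta: int) -> int:
    """Each up/down moderation of a user's post also adjusts their Karma."""
    return karma + delta

def default_score(karma: int) -> int:
    """As Karma goes negative, the user's default posting score falls to -1."""
    if karma < -10:          # assumed cutoff: deeply negative Karma
        return -1            # posts start below the casual reader's threshold
    if karma < 0:
        return 0             # mildly negative Karma: reduced default
    return 1                 # ordinary users post at a default of 1
```

The point the essay makes is visible in the model: `default_score` depends on the accumulated `karma`, not on the content of any individual post, so past moderation follows the person rather than the opinion.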

This ghettoization is reinforced by the often false belief that a negatively moderated post, and therefore the poster, is a "Troll". (Is it any wonder that such a name has been chosen to describe these people, invoking mental imagery of facial disfigurement and hooked noses?) As the Jews were accused of fraud, dishonesty and being subhuman animals, so too are Trolls accused of FUD, Crapflooding, and obfuscated goatse.cx links. Quite often, these 'undesirables' are capable of providing a valid insightful comment on a topic, but because it is in opposition to the Political dogma of Slashdot they are moderated back into their ghetto. The person becomes moderated, not their opinion.

This is just the thin end of the wedge. Although, as memes are transient, it is difficult to silence an opinion, it is trivial to silence a person. Upon the rise of National Socialism in Germany, the populace were motivated by propaganda into entering the Jewish Ghettos en masse with the sole purpose of causing as much damage as possible to Jewish businesses and residences. This parallels far too accurately with the Slashdot Editors' non-discouragement of the act of IP-banning. As mentioned above, this occurs when an individual user's postings are repeatedly moderated down in a short period. They then become incapable of posting any contributions themselves. In essence, they have been silenced, regardless of the worth of their postings.

Of course, the editors claim that Meta-Moderation is the panacea to solve this clear abuse of moderating privilege. But if a Meta Moderator is presented with a list of moderations that they disagree with, such as the targeted 'silencing' mentioned above, they cannot note them as such without in turn becoming an 'Undesirable' themselves, as too many Disagreements with the Moderation groupthink also result in loss of Karma.

Throughout all of this, the Editors have claimed a false level of detachment from the acts of moderation. In the same way, as the National Socialists gathered their power and began working on their Elite Political wing, the SS, they too remained detached from the civilians working in their name. Why? Because after inspiring the populace to such acts of violence through their propaganda, they could then claim that they were only giving the people what they wanted.

And then began the next stage of the atrocities. The Gestapo, Germany's secret police, were recruited from the best and the brightest of Germany's elite. As is the case now, the best and the brightest were often shunned and ostracized by society. In essence, the Gestapo were a tightly controlled 'Geek Army' of intelligent young men with a burning, seething resentment of normal society. The perfect psychological profile for the cause.

After all, give a normal man (with an active sex life) a gun and he will use it responsibly in self-defence. Give a geek a gun and he will behave according to his sociopathic logic and hatred of the world he arrogantly presumes to be distant from. Ask yourself why Slashdot flat-out justified the murder of innocents at Columbine. And then ask yourself why, even for a brief moment, you almost began to sympathize with the killers after Jon Katz's manipulative and pseudo-emotive Hellmouth articles.

How this relates to Slashdot is clear. The majority of Slashdot posters are Sociopathic OSS zealots, unable through lack of social finesse or personal hygiene to mate regularly. Sexually and emotionally frustrated and with grudges to bear, they are incapable, in their blinkered sense of self-righteousness, of accepting any opinion dissenting from the OSS cause. Now give these people the opportunity to Moderate these dissenting opinions. Of course they are going to want to silence them, by any means necessary.

Now, the Slashdot Editors have admitted to taking this silencing of opinion to the next stage, by moderating whole swathes of 'undesirable' posts negatively. And then permanently banning anyone who moderates said posts back up from moderating EVER again! The result of this new policy? The few Moderators with any sense of fairness and decency are removed from the moderation pool, leaving the power ENTIRELY in the hands of the zealots. Clearly, positive moderation is discouraged under this regime, which is a direct parallel with the way the National Socialists moved their own sympathisers into positions of power throughout Europe.

So how does this compare to the genocide performed at Auschwitz and its ilk? I would like at this point to explain that in NO way do I wish to belittle the horrors that were performed in the name of National Socialism. The six million innocents killed were a cry of anguish from which humanity may never recover. And a vast distance in time and scope from a few banned posters on some shitty "My Favourite Links - now with comments" website. But these stories need to be retold before the horror is lost forever.

For the only thing that we learn from history is that we never learn anything from history. Time and time again, the St. Vitus dance is played out, we make the same mistakes, and we perpetually fail to see the warning signs.

So, moderators, the next time you moderate a rational, insightful post down, maybe because you disagree with it or because it's posted by a 'Known Troll', just ask yourself this...

"Am I really contributing to the Slashdot Community, or selfishly destroying it?"

Re:1st Spam (0)

Anonymous Coward | more than 12 years ago | (#3329170)

I think the answer to spambots and the servers that run them is not blockage. To truly solve the problem, I think they should be hunted down and annihilated. Like a hacker group called Spam Busters! Yeah, that would get the job done. Those bastards are the reason I get phone calls at all hours of the night and day.. automated pieces of shit. Aight, I'm done ranting for now.

Not fp, but still a wide page! (-1)

Klerck (213193) | more than 12 years ago | (#3328910)

Can these pages get any wider?! Sources say no!


/kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / /roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / /uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /ooieiabdcdjsvbkeldfogjhiyeeejkagclmieooionoepdk / /abcdefmfighyiqxjklmonopqrosoyotuvwxoyqwertyuiov / /sdfghjklqewiuznmbjadzmcloeuirquakndsflksjdflkas / /fskdfasiewurznmcvweroiqewrnamdnzcvuowieramnfkas / /dfhzuxcihskjrnakjzkjcxbviusayrkajsfzxncvizudyri / /bakdnfbzkcvhgiuegriweramdnfzxlcvueirhamdnzkciue / /jranbsdmfzcowierandmfxzncbkjhfabsdifuweajzkxcuw / /erhasdfzxncvkjdfyiuzxcnvsikirkajeajsbdfkzxbuyef / /rahsdjbzcvxmnvcuweyriausdnfzxbcvkwueyrajnbvkjxg / /iwueyajdfkzxjcnbkeyriaushdfkjbzbuowrnasdkfbhuie / /asjmfnkkbyiurnakjsndfkzjbhiuwerajsknfkzbyhweiua / /dkfjbzkxvbjywekrjaskjnvzxjcweruiasdhfkzjxnsjkld / /fasoidfjalskdfasklhfxjdnmenrqoiuozxcopjgneaksjo / /nzxdkfajlsdfkljsdfoiasdfasndflzxkcvozixucoqweiu / /pwoeiruzxmncvoutyqwerizxnvmxmcnvoweurqmznxmbouw / /rmnzbkhuyrtjghanzxcvbkhgjweyriaudfbznbkweruyabz / /bcvnkdhityqhagsdfjglsieurakfsdnfbvfdsajkbiuyqwe / /kweorjasdknfbkjsdoifuzxbcmfgsltjewioahsdfnbzxcb / /heoiroaisjdfzbxckjksrhiuehadsfbzkxjcbhkeuryaksj / /fzbxcvkxlkcnvmndskfjwehaiursdfzjxnbjkdfhskdflas / /yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / 
/roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / /uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / /roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / /uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /erhasdfzxncvkjdfyiuzxcnvsikirkajeajsbdfkzxbuyef / /rahsdjbzcvxmnvcuweyriausdnfzxbcvkwueyrajnbvkjxg / /iwueyajdfkzxjcnbkeyriaushdfkjbzbuowrnasdkfbhuie / /asjmfnkkbyiurnakjsndfkzjbhiuwerajsknfkzbyhweiua / /dkfjbzkxvbjywekrjaskjnvzxjcweruiasdhfkzjxnsjkld / /fasoidfjalskdfasklhfxjdnmenrqoiuozxcopjgneaksjo / /nzxdkfajlsdfkljsdfoiasdfasndflzxkcvozixucoqweiu / /pwoeiruzxmncvoutyqwerizxnvmxmcnvoweurqmznxmbouw / /rmnzbkhuyrtjghanzxcvbkhgjweyriaudfbznbkweruyabz / /bcvnkdhityqhagsdfjglsieurakfsdnfbvfdsajkbiuyqwe / /kweorjasdknfbkjsdoifuzxbcmfgsltjewioahsdfnbzxcb / /heoiroaisjdfzbxckjksrhiuehadsfbzkxjcbhkeuryaksj / /fzbxcvkxlkcnvmndskfjwehaiursdfzjxnbjkdfhskdflas / 
/yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / /roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / /uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / /roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / /uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /ooieiabdcdjsvbkeldfogjhiyeeejkagclmieooionoepdk / /abcdefmfighyiqxjklmonopqrosoyotuvwxoyqwertyuiov / /sdfghjklqewiuznmbjadzmcloeuirquakndsflksjdflkas / /fskdfasiewurznmcvweroiqewrnamdnzcvuowieramnfkas / /dfhzuxcihskjrnakjzkjcxbviusayrkajsfzxncvizudyri / /bakdnfbzkcvhgiuegriweramdnfzxlcvueirhamdnzkciue / /jranbsdmfzcowierandmfxzncbkjhfabsdifuweajzkxcuw / /erhasdfzxncvkjdfyiuzxcnvsikirkajeajsbdfkzxbuyef / /rahsdjbzcvxmnvcuweyriausdnfzxbcvkwueyrajnbvkjxg / /iwueyajdfkzxjcnbkeyriaushdfkjbzbuowrnasdkfbhuie / /asjmfnkkbyiurnakjsndfkzjbhiuwerajsknfkzbyhweiua / 
/dkfjbzkxvbjywekrjaskjnvzxjcweruiasdhfkzjxnsjkld / /fasoidfjalskdfasklhfxjdnmenrqoiuozxcopjgneaksjo / /nzxdkfajlsdfkljsdfoiasdfasndflzxkcvozixucoqweiu / /pwoeiruzxmncvoutyqwerizxnvmxmcnvoweurqmznxmbouw / /rmnzbkhuyrtjghanzxcvbkhgjweyriaudfbznbkweruyabz / /bcvnkdhityqhagsdfjglsieurakfsdnfbvfdsajkbiuyqwe / /kweorjasdknfbkjsdoifuzxbcmfgsltjewioahsdfnbzxcb / /heoiroaisjdfzbxckjksrhiuehadsfbzkxjcbhkeuryaksj / /fzbxcvkxlkcnvmndskfjwehaiursdfzjxnbjkdfhskdflas / /yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / /roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / /uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / /roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / /uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /erhasdfzxncvkjdfyiuzxcnvsikirkajeajsbdfkzxbuyef / /rahsdjbzcvxmnvcuweyriausdnfzxbcvkwueyrajnbvkjxg / 
/iwueyajdfkzxjcnbkeyriaushdfkjbzbuowrnasdkfbhuie / /asjmfnkkbyiurnakjsndfkzjbhiuwerajsknfkzbyhweiua / /dkfjbzkxvbjywekrjaskjnvzxjcweruiasdhfkzjxnsjkld / /fasoidfjalskdfasklhfxjdnmenrqoiuozxcopjgneaksjo / /nzxdkfajlsdfkljsdfoiasdfasndflzxkcvozixucoqweiu / /pwoeiruzxmncvoutyqwerizxnvmxmcnvoweurqmznxmbouw / /rmnzbkhuyrtjghanzxcvbkhgjweyriaudfbznbkweruyabz / /bcvnkdhityqhagsdfjglsieurakfsdnfbvfdsajkbiuyqwe / /kweorjasdknfbkjsdoifuzxbcmfgsltjewioahsdfnbzxcb / /heoiroaisjdfzbxckjksrhiuehadsfbzkxjcbhkeuryaksj / /fzbxcvkxlkcnvmndskfjwehaiursdfzjxnbjkdfhskdflas / /yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / /roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / /uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / /roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / /uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / 
/ooieiabdcdjsvbkeldfogjhiyeeejkagclmieooionoepdk / /abcdefmfighyiqxjklmonopqrosoyotuvwxoyqwertyuiov / /sdfghjklqewiuznmbjadzmcloeuirquakndsflksjdflkas / /fskdfasiewurznmcvweroiqewrnamdnzcvuowieramnfkas / /dfhzuxcihskjrnakjzkjcxbviusayrkajsfzxncvizudyri / /bakdnfbzkcvhgiuegriweramdnfzxlcvueirhamdnzkciue / /jranbsdmfzcowierandmfxzncbkjhfabsdifuweajzkxcuw / /erhasdfzxncvkjdfyiuzxcnvsikirkajeajsbdfkzxbuyef / /rahsdjbzcvxmnvcuweyriausdnfzxbcvkwueyrajnbvkjxg / /iwueyajdfkzxjcnbkeyriaushdfkjbzbuowrnasdkfbhuie / /asjmfnkkbyiurnakjsndfkzjbhiuwerajsknfkzbyhweiua / /dkfjbzkxvbjywekrjaskjnvzxjcweruiasdhfkzjxnsjkld / /fasoidfjalskdfasklhfxjdnmenrqoiuozxcopjgneaksjo / /nzxdkfajlsdfkljsdfoiasdfasndflzxkcvozixucoqweiu / /pwoeiruzxmncvoutyqwerizxnvmxmcnvoweurqmznxmbouw / /rmnzbkhuyrtjghanzxcvbkhgjweyriaudfbznbkweruyabz / /bcvnkdhityqhagsdfjglsieurakfsdnfbvfdsajkbiuyqwe / /kweorjasdknfbkjsdoifuzxbcmfgsltjewioahsdfnbzxcb / /heoiroaisjdfzbxckjksrhiuehadsfbzkxjcbhkeuryaksj / /fzbxcvkxlkcnvmndskfjwehaiursdfzjxnbjkdfhskdflas / /yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / /roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / /uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / /roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / /uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / 
/kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /erhasdfzxncvkjdfyiuzxcnvsikirkajeajsbdfkzxbuyef / /rahsdjbzcvxmnvcuweyriausdnfzxbcvkwueyrajnbvkjxg / /iwueyajdfkzxjcnbkeyriaushdfkjbzbuowrnasdkfbhuie / /asjmfnkkbyiurnakjsndfkzjbhiuwerajsknfkzbyhweiua / /dkfjbzkxvbjywekrjaskjnvzxjcweruiasdhfkzjxnsjkld / /fasoidfjalskdfasklhfxjdnmenrqoiuozxcopjgneaksjo / /nzxdkfajlsdfkljsdfoiasdfasndflzxkcvozixucoqweiu / /pwoeiruzxmncvoutyqwerizxnvmxmcnvoweurqmznxmbouw / /rmnzbkhuyrtjghanzxcvbkhgjweyriaudfbznbkweruyabz / /bcvnkdhityqhagsdfjglsieurakfsdnfbvfdsajkbiuyqwe / /kweorjasdknfbkjsdoifuzxbcmfgsltjewioahsdfnbzxcb / /heoiroaisjdfzbxckjksrhiuehadsfbzkxjcbhkeuryaksj / /fzbxcvkxlkcnvmndskfjwehaiursdfzjxnbjkdfhskdflas / /yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / /roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / /uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / /roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / 
/uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / /roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / /uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /erhasdfzxncvkjdfyiuzxcnvsikirkajeajsbdfkzxbuyef / /rahsdjbzcvxmnvcuweyriausdnfzxbcvkwueyrajnbvkjxg / /iwueyajdfkzxjcnbkeyriaushdfkjbzbuowrnasdkfbhuie / /asjmfnkkbyiurnakjsndfkzjbhiuwerajsknfkzbyhweiua / /dkfjbzkxvbjywekrjaskjnvzxjcweruiasdhfkzjxnsjkld / /fasoidfjalskdfasklhfxjdnmenrqoiuozxcopjgneaksjo / /nzxdkfajlsdfkljsdfoiasdfasndflzxkcvozixucoqweiu / /pwoeiruzxmncvoutyqwerizxnvmxmcnvoweurqmznxmbouw / 
/rmnzbkhuyrtjghanzxcvbkhgjweyriaudfbznbkweruyabz / /bcvnkdhityqhagsdfjglsieurakfsdnfbvfdsajkbiuyqwe / /kweorjasdknfbkjsdoifuzxbcmfgsltjewioahsdfnbzxcb / /heoiroaisjdfzbxckjksrhiuehadsfbzkxjcbhkeuryaksj / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / /roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / /uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /erhasdfzxncvkjdfyiuzxcnvsikirkajeajsbdfkzxbuyef / /rahsdjbzcvxmnvcuweyriausdnfzxbcvkwueyrajnbvkjxg / /iwueyajdfkzxjcnbkeyriaushdfkjbzbuowrnasdkfbhuie / /asjmfnkkbyiurnakjsndfkzjbhiuwerajsknfkzbyhweiua / /dkfjbzkxvbjywekrjaskjnvzxjcweruiasdhfkzjxnsjkld / /fasoidfjalskdfasklhfxjdnmenrqoiuozxcopjgneaksjo / 
/nzxdkfajlsdfkljsdfoiasdfasndflzxkcvozixucoqweiu / /pwoeiruzxmncvoutyqwerizxnvmxmcnvoweurqmznxmbouw / /rmnzbkhuyrtjghanzxcvbkhgjweyriaudfbznbkweruyabz / /bcvnkdhityqhagsdfjglsieurakfsdnfbvfdsajkbiuyqwe / /kweorjasdknfbkjsdoifuzxbcmfgsltjewioahsdfnbzxcb / /heoiroaisjdfzbxckjksrhiuehadsfbzkxjcbhkeuryaksj / /heoiroaisjdfzbxckjksrhiuehadsfbzkxjcbhkeuryaksj / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / /roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / /uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /erhasdfzxncvkjdfyiuzxcnvsikirkajeajsbdfkzxbuyef / /rahsdjbzcvxmnvcuweyriausdnfzxbcvkwueyrajnbvkjxg / /iwueyajdfkzxjcnbkeyriaushdfkjbzbuowrnasdkfbhuie / 
/asjmfnkkbyiurnakjsndfkzjbhiuwerajsknfkzbyhweiua / /dkfjbzkxvbjywekrjaskjnvzxjcweruiasdhfkzjxnsjkld / /fasoidfjalskdfasklhfxjdnmenrqoiuozxcopjgneaksjo / /nzxdkfajlsdfkljsdfoiasdfasndflzxkcvozixucoqweiu / /pwoeiruzxmncvoutyqwerizxnvmxmcnvoweurqmznxmbouw / /rmnzbkhuyrtjghanzxcvbkhgjweyriaudfbznbkweruyabz / /bcvnkdhityqhagsdfjglsieurakfsdnfbvfdsajkbiuyqwe / /kweorjasdknfbkjsdoifuzxbcmfgsltjewioahsdfnbzxcb / /heoiroaisjdfzbxckjksrhiuehadsfbzkxjcbhkeuryaksj / /heoiroaisjdfzbxckjksrhiuehadsfbzkxjcbhkeuryaksj / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / /yroausdfzxmncvskeyiqozsjhfasdfoiwueranmcnzbkjhd / /ueafhksjfwheuirasdjhbzxiuewjhasmdnkfzxciurhaskj / /roiquwermcvkhiruhasdkjfnzxkjyeiuahsdbzxckjvopwe / /uqweuirjhvxzckjhweriuasydfoiqurnmxckvhweruiahdj / /znkxcvjhwierahsfzkxhhidufhsakjbzxjchiwueryqagsd / /kjhaksdfnbakwreyhaisknfjkzxbcvkoiqwueraskfzxcbk / /nlkwejrasoidjfxzlknvlkwjeroiasudflknzxlkbjeoiru / /slkdjfzxnmvkljdfawienzxveoriuaskdfjzxcmbnkseuri / /kfjlznxcvksjroeijasdklzjfowierqouasdhfzxncbkjhd / /jsdfljkweoriuasdfkjzxmcnvlkjdowuieraksdflkzxjbo / /werklasdnfmzxclkjewoijasdlfknzlkjwoeirqpweoiasd / /kjzxjvwperaksdjfxzweirjaslkdfzxnclvkjweroiasufd / /zxclkjeworijasdflknzlbkoiwuraksjflknxblkwjerois / /jfweknasdkfjzoxijkenraksjdfoizxjvlknwerlkajsdfo / 
/erhasdfzxncvkjdfyiuzxcnvsikirkajeajsbdfkzxbuyef / /rahsdjbzcvxmnvcuweyriausdnfzxbcvkwueyrajnbvkjxg / /iwueyajdfkzxjcnbkeyriaushdfkjbzbuowrnasdkfbhuie / /asjmfnkkbyiurnakjsndfkzjbhiuwerajsknfkzbyhweiua / /dkfjbzkxvbjywekrjaskjnvzxjcweruiasdhfkzjxnsjkld / /fasoidfjalskdfasklhfxjdnmenrqoiuozxcopjgneaksjo / /nzxdkfajlsdfkljsdfoiasdfasndflzxkcvozixucoqweiu / /pwoeiruzxmncvoutyqwerizxnvmxmcnvoweurqmznxmbouw / /rmnzbkhuyrtjghanzxcvbkhgjweyriaudfbznbkweruyabz / /bcvnkdhityqhagsdfjglsieurakfsdnfbvfdsajkbiuyqwe / /kweorjasdknfbkjsdoifuzxbcmfgsltjewioahsdfnbzxcb / /heoiroaisjdfzbxckjksrhiuehadsfbzkxjcbhkeuryaksj



Email me and tell me what you think of widening! [mailto]

Re:Not fp, but still a wide page! (-1, Offtopic)

blankmange (571591) | more than 12 years ago | (#3328916)

For shame: just because you can, doesn't mean you should.....

Re:Not fp, but still a wide page! (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#3329003)

I think I am getting the hang of this -- if I tell someone they are offtopic, I get modded down if I do not post it AC...

You are a total.... (-1, Flamebait)

Anonymous Coward | more than 12 years ago | (#3328917)

....wanker.

Re:Not fp, but still a wide page! (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#3328918)

Looks just fine in Opera.

I agree with this post. (-1)

Luke SkyTroller (564295) | more than 12 years ago | (#3328924)

Yup.

Re:Not fp, but still a wide page! (0, Offtopic)

aozilla (133143) | more than 12 years ago | (#3328933)

Looks fine in Mozilla 0.9.9, too...

Re:Not fp, but still a wide page! (1)

loply (571615) | more than 12 years ago | (#3328996)

Fine in Konqueror. Why, is it somehow broken in other browsers?

Re:Not fp, but still a wide page! (1, Troll)

aozilla (133143) | more than 12 years ago | (#3329011)

Those other browsers must suck

Re:Not fp, but still a wide page! (0)

Anonymous Coward | more than 12 years ago | (#3329174)

fine in windows ie5 / ie6 /ns3 and even webTV !

Re:Not fp, but still a wide page! (0)

Anonymous Coward | more than 12 years ago | (#3329177)

no, it's broken on ie6

Re:Not fp, but still a wide page! (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#3329053)

Have problems down here with IE 7.1

I just got a BSOD....

Re:Not fp, but still a wide page! (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#3329120)

-1?

why -1?
are you assholes?

Re:Not fp, but still a wide page! (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#3328946)

looks fine on IE6

Re:Not fp, but still a wide page! (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#3328948)

Thank God! I was beginning to think my pages would look anorexic forever!

Elements of good design I'd missed (4, Informative)

Dark Paladin (116525) | more than 12 years ago | (#3328925)

Looking at my Day Job and personal web site, other than the very cool technical achievement of the trap (I'll have to see if I can rewrite this for my Checkpoint FW system), there was one thing I learned about good design from this article:

Eliminate mailto - makes sense. You should have an http-based "send me a message" system - force a live person to type stuff in instead of letting a program pick out addresses.

Eliminating mailto alone would probably help with most of my spam problems (as I have my "contact me" address right on the first page).
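The http-based "send me a message" idea the parent describes can be sketched as a small form plus a handler for the urlencoded body a browser submits. Everything here (the field names, the `/contact` route) is illustrative, not code from the article:

```python
# Minimal sketch of replacing a mailto: link with an HTTP contact form.
# A harvester scraping the page finds no address; a live person fills in
# the form instead. Field names and the action URL are hypothetical.
from urllib.parse import parse_qs

CONTACT_FORM = """
<form method="post" action="/contact">
  <input name="from" placeholder="Your email">
  <textarea name="body"></textarea>
  <input type="submit" value="Send">
</form>
"""

def handle_contact_post(raw_body: str) -> dict:
    """Parse the application/x-www-form-urlencoded POST body."""
    fields = parse_qs(raw_body)
    return {
        "from": fields.get("from", [""])[0],
        "body": fields.get("body", [""])[0],
    }

# A browser submitting the form above sends something like:
msg = handle_contact_post("from=alice%40example.org&body=Hello")
```

The server-side script would then relay `msg` to the real mailbox, which never appears in the page source.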

Re:Elements of good design I'd missed (5, Interesting)

hagardtroll (562208) | more than 12 years ago | (#3328932)

I put my email address in a jpeg image. Haven't found a spambot yet that can decipher that.

Re:Elements of good design I'd missed (1)

DickPhallus (472621) | more than 12 years ago | (#3328939)

If that technique became commonplace, I'm sure optical character recognition software could be used... but for now you're safe.

Re:Elements of good design I'd missed (1, Insightful)

Anonymous Coward | more than 12 years ago | (#3328945)

I put my email address in a jpeg image. Haven't found a spambot yet that can decipher that.

But neither could blind internet users...

Re:Elements of good design I'd missed (2, Informative)

Dark Paladin (116525) | more than 12 years ago | (#3328964)

Good point - some sites (I think AOL did once) can get sued if you're a large enough business and don't make your site accessible to the blind. (Americans with Disabilities Act thing.)

Re:Elements of good design I'd missed (1)

cholokoy (265199) | more than 12 years ago | (#3329022)

Then put up an audio file, since I would think that blind people can still hear.

-----
Return the bells of Balangiga

Re:Elements of good design I'd missed (0)

Anonymous Coward | more than 12 years ago | (#3329057)

What? That is ridiculous. I'm sorry, but if they don't want to support the blind, why should they? Have you ever had to walk a blind person through anything over the phone? It takes 2-3x the amount of time, which costs $$$$. fuck em

Re:Elements of good design I'd missed (1)

blibbleblobble (526872) | more than 12 years ago | (#3329092)

I don't think blind people would be -that- interested in a skating club...

Of course, that's just an assumption

Re:Elements of good design I'd missed (2, Informative)

Permission Denied (551645) | more than 12 years ago | (#3329229)


I put my email address in a jpeg image. Haven't found a spambot yet that can decipher that.

But neither could blind internet users...


Add an alt tag that describes how to email you. E.g., "The first part of my email address is 'username' and the second part is 'host.com' - the two parts are separated by an '@' sign." I've been doing the jpeg thing for three years; works great.
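The alt-text workaround above is easy to generate: the image shows the address, and the alt attribute spells it out in words so screen readers can relay it while the literal `user@host` string never appears for a naive harvester to match. This is a sketch; the function and file names are made up:

```python
# Sketch of generating an <img> tag for an email-in-a-jpeg, with an alt
# attribute that describes the address for blind users. Illustrative only.
def email_img_tag(user: str, host: str, img_src: str) -> str:
    alt = (f"The first part of my email address is '{user}' and the "
           f"second part is '{host}'; the two are joined by an at-sign.")
    return f'<img src="{img_src}" alt="{alt}">'

tag = email_img_tag("username", "host.com", "email.jpg")
```

The key property is that the joined form of the address is never present in the markup, only its two halves in prose.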

Re:Elements of good design I'd missed (0)

Anonymous Coward | more than 12 years ago | (#3329013)

Excellent Idea!!! Will be using that in the near future....

Re:Elements of good design I'd missed (2)

jonbrewer (11894) | more than 12 years ago | (#3329098)

I've found a text file works as well. Spambots don't seem to bother loading "contact.txt".

Re:Elements of good design I'd missed (2)

British (51765) | more than 12 years ago | (#3329137)

AOL's personal ads (not that I visit them) do that already. They just use a GIF that looks like a regular string of text. Very clever. I'm assuming there's a module out there that can do this easily on the fly?

Re:problem with not giving an email address ... (2, Insightful)

wmoore (45078) | more than 12 years ago | (#3328975)


The only problem with the idea of using entirely HTTP-based "send me a message" systems is that some people, like myself, would much rather have an actual email address to use, instead of having to deal with 50 different layouts, 50 different configurations and 50 different methods of communicating with someone or a company. Every HTML-based contact system has its own quirks and problems; I'd rather just need to learn my email program's issues instead.

Re:problem with not giving an email address ... (2)

Luyseyal (3154) | more than 12 years ago | (#3329031)

But, if you send him the message once with your return address, he'll know you're for real and when he replies you can use your regular mailer.

$0.02USD,
-l

Re:Elements of good design I'd missed (3, Insightful)

carm$y$ (532675) | more than 12 years ago | (#3328982)

Eliminating mailto alone would probably help with most of my spam problems

You're 100% right. And fighting against spambots by relying on UserAgent is akin to... well.... security thru obscurity, albeit somehow in reverse.

What also looks strange is that he doesn't consider that one can get a link directly to a page on the n-th level: as human browsers don't usually download robots.txt either, sounds like he's gonna ban some poor guys who got a link from a friend...

/.ed (2, Funny)

Anonymous Coward | more than 12 years ago | (#3328926)

Looks like you should've written some code to handle an overload from slashdot too!

Re:/.ed (3, Funny)

HiQ (159108) | more than 12 years ago | (#3328944)

The dude fell in his own trap. :-D

Slashbot (3, Funny)

Ctrl-Z (28806) | more than 12 years ago | (#3328927)


"I have a truly marvelous demonstration of this proposition which this bandwidth is too narrow to transmit."

but can you ... (1)

filtrs (548248) | more than 12 years ago | (#3328929)

You can stop a SpamBot, but can you stop a /.'ing?

Okay... (1)

zaren (204877) | more than 12 years ago | (#3328931)

Well, his idea of removing "mailto:"s is an obvious one...

I dunno, most of this stuff sounds like common sense work for someone who's got a well-trafficked web site. The badhosts_loop looks like an interesting addition, though...

On the surface, it almost looks like this system could be built up to act like a SPEWS for web servers.

Aww, FSCK! [cafepress.com]

I see page widening is back (-1)

neal n bob (531011) | more than 12 years ago | (#3328938)

or would see it but Klerk is listed as foe. slash code + IE = teh suck

Block? Are you kidding? (5, Interesting)

Anonymous Coward | more than 12 years ago | (#3328940)

Why on Earth would you like to block a spambot? So it doesn't get any more useful addresses?
No way, man.
If you realize you're serving to a bot, go on serving. Each time the bot follows the "next page" link, you /give/ it a next page. With a nicely formatted word1word2num1num2@word1word2.com, where words and nums are random.
Give it thousands, millions of addresses this way.
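A minimal sketch of that idea in Java (to match the servlet posted elsewhere in this thread); the class name and word list are made up for illustration:

```java
import java.util.Random;

// Sketch of "feed the harvester garbage": emit addresses of the form
// word1word2num1num2@word1word2.com with random words and numbers, so
// every address the bot collects is unique and worthless.
public class FakeAddressGenerator {
    private static final String[] WORDS = { "blue", "fast", "iron", "moss", "zeta" }; // placeholder words
    private static final Random RNG = new Random();

    public static String randomAddress() {
        String w1 = WORDS[RNG.nextInt(WORDS.length)];
        String w2 = WORDS[RNG.nextInt(WORDS.length)];
        int n1 = RNG.nextInt(100);
        int n2 = RNG.nextInt(100);
        // Both the local part and the domain are random junk.
        return w1 + w2 + n1 + n2 + "@" + w1 + w2 + ".com";
    }

    public static void main(String[] args) {
        // A "next page" handler would print thousands of these.
        for (int i = 0; i < 5; i++) {
            System.out.println("<a href=\"mailto:" + randomAddress() + "\">contact</a>");
        }
    }
}
```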

Re:Block? Are you kidding? (1)

cholokoy (265199) | more than 12 years ago | (#3329010)

At first glance this might be a good idea, but this will be a resource burden on your system.

Not a good way to stop spammers.

------
Return the bells of Balangiga

Re:Block? Are you kidding? (3, Insightful)

BlueUnderwear (73957) | more than 12 years ago | (#3329060)

At first glance this might be a good idea, but this will be a resource burden on your system.

Add a couple of sleep(20); calls into the CGI script that generates the bot fodder. The bot will still stay busy waiting for your webserver's response, but your script will consume exactly zero resources.

For additional kicks, set up a DNS teergrube.

Re:Block? Are you kidding? (4, Interesting)

cperciva (102828) | more than 12 years ago | (#3329105)

Add a couple of sleep(20); calls into the CGI script that generates the bot fodder. The bot will still stay busy waiting for your webserver's response, but your script will consume exactly zero resources.

Zero resources, except for memory.

A much better solution would be to point the bot at a set of "servers" with IP addresses where you're running a stateless tarpit.

Re:Block? Are you kidding? (5, Interesting)

f3lix (234302) | more than 12 years ago | (#3329017)

This isn't such a good idea - for every random (non-existent) domain that you generate, a root DNS server will be queried when an email is sent to this address, which increases the load on the root servers, which is generally a bad thing. How about instead, returning pages with the email address abuse@domain-that-spambot-is-coming-from all over them...

Re:Block? Are you kidding? (0)

Anonymous Coward | more than 12 years ago | (#3329044)

I think the spammer will filter abuse@ ...

Re:Block? Are you kidding? (5, Funny)

BlueUnderwear (73957) | more than 12 years ago | (#3329095)

- for every random (non-existent) domain that you generate, a root DNS server will be queried when an email is sent to this address, which increases the load on the root servers, which is generally a bad thing.

Why is this a bad thing? They are owned by Verisign.

How about instead, returning pages with the email address abuse@domain-that-spambot-is-coming-from all over them...

This is also a good idea. In fact, I have a script which does a traceroute to the IP of the bot, and then looks up the admin contact using whois for the last couple of hops, and returns these. Oh, and for additional fun, throw in a couple of addresses of especially loved "friends"...

Re:Block? Are you kidding? (2)

liquidsin (398151) | more than 12 years ago | (#3329181)

I like that idea...look up the originating host, and make links back to abuse@, root@, webmaster@, and whatever else you can think of. Clog their mailservers. The problem is, it would be simple enough (if it's not already in place) to have your spam bot ignore addresses for your own domain.

Re:Block? Are you kidding? (3, Informative)

Ralp (541345) | more than 12 years ago | (#3329151)

Wpoison [monkeys.com] does this.

From the website: Wpoison is a free tool that can be used to help reduce the problem of bulk junk e-mail on the Internet in general, and at sites using Wpoison in particular.

It solves the problems of trapped spambots sucking up massive bandwidth/CPU time, as well as sparing legitimate spiders (say, google) from severe confusion.

Re:Block? Are you kidding? (3, Interesting)

gclef (96311) | more than 12 years ago | (#3329154)

Actually, I've done this w/a bot trap on my site at home. It's a perl script that generates a bunch of weird-sounding text w/some fake email addresses at the bottom and a bunch of database-query-looking links back to the original page.

The bots don't fall for it anymore. Some dorks in Washington state decided to make a couple requests a second to it once, but in the two years I've had it up, they're the only ones.

Re:Block? Are you kidding? (2)

Martin S. (98249) | more than 12 years ago | (#3329218)

Give it thousands, millions of addresses this way.

Liberally sprinkled with postmaster@127.0.0.1 and abuse@127.0.0.1.

Re:Block? Are you kidding? (5, Interesting)

boky (220626) | more than 12 years ago | (#3329243)

I agree. And, come on, how much technology do you need?

This is my solution to stopping spambots. It's Java servlet technology, and I am posting it here to prevent my company's site from being slashdotted. It does not prevent the spammer from harvesting emails; it just slows them down... a lot :) If everyone had a script like this, spambots would be unusable.

Feel free to use the code in any way you please (LGPL-like and stuff)

Put robots.txt in your root folder. Content:

User-agent: *
Disallow: /members/

Put StopSpammersServlet.java in WEB-INF/classes/com/parsek/util:

package com.parsek.util;
import java.io.File;
import java.io.StringWriter;
import javax.servlet.ServletContext;
import java.net.URL;
import java.util.Enumeration;
import java.lang.reflect.Array;
public class StopSpammersServlet extends javax.servlet.http.HttpServlet {
private static String[] names = { "root", "webmaster", "postmaster", "abuse", "abuse", "abuse", "bill", "john", "jane", "richard", "billy", "mike", "michelle", "george", "michael", "britney" };
private static String[] lasts = { "gates", "crystal", "fonda", "gere", "crystal", "scheffield", "douglas", "spears", "greene", "walker", "bush", "harisson" };
private String[] endns = new String[7];
private static long getNumberOfShashes(String path) {
int i = 1;
java.util.StringTokenizer st = new java.util.StringTokenizer(path, "/");
while(st.hasMoreTokens()) { i++; st.nextToken(); }
return(i);
}
// Respond to HTTP GET requests from browsers.
public void doGet (javax.servlet.http.HttpServletRequest request,
javax.servlet.http.HttpServletResponse response)
throws javax.servlet.ServletException, java.io.IOException {
// Set content type for HTML.
response.setContentType("text/html; charset=UTF-8");
// Output goes to the response PrintWriter.
java.io.PrintWriter out = response.getWriter();
try {
ServletContext servletContext = getServletContext();
endns[0] = "localhost";
endns[1] = "127.0.0.1";
endns[2] = "2130706433";
endns[3] = "fbi.gov";
endns[4] = "whitehouse.gov";
endns[5] = request.getRemoteAddr();
endns[6] = request.getRemoteHost();
String query = request.getQueryString();
String path = request.getPathInfo();
out.println("<html>");
out.println("<head>");
out.println("<title>Members area</title>");
out.println("</head>");
out.println("<body>");
out.println("<p>Hello random visitor. There is a big chance you are a robot collecting mail addresses and have no place being here.");
out.println("Therefore you will get some random generated email addresses and some random links to follow endlessly.</p>");
out.println("<p>Please be aware that your IP has been logged and will be reported to proper authorities if required.</p>");
out.println("<p>Also note that browsing through the tree will get slower and slower and gradually stop you from spidering other sites.</p>");
response.flushBuffer();
long sleepTime = (long) Math.pow(3, getNumberOfShashes(path));

do {
String name = names[ (int) (Math.random() * Array.getLength(names)) ];
String last = lasts[ (int) (Math.random() * Array.getLength(lasts)) ];
String endn = endns[ (int) (Math.random() * Array.getLength(endns)) ];
String email = "";

// NOTE: the original lines here were mangled by Slashdot's lameness
// filter; this is a plausible reconstruction that builds a random
// local part from the name/last arrays.
double a = Math.random() * 15;
if (a < 5) email = name;
else if (a < 10) email = name + "." + last;
else email = name + last + (int) (Math.random() * 100);
email = email + "@" + endn;

out.print("<a href=\"mailto:" + email + "\">" + email + "</a><br>");
response.flushBuffer();

Thread.sleep(sleepTime);

} while (Math.random() < 0.95); // loop bound reconstructed; original mangled by the lameness filter
out.print("<br>");
do {
int a = (int) (Math.random() * 1000);
out.print("<a href=\"" + a + "/\">" + a + "</a> ");
Thread.sleep(sleepTime);
response.flushBuffer();
} while (Math.random() < 0.95); // loop bound reconstructed; original mangled by the lameness filter
out.println("</body>");
out.println("</html>");

} catch (Exception e) {
// If an Exception occurs, return the error to the client.
out.write("<pre>");
out.write(e.getMessage());
e.printStackTrace(out);
out.write("</pre>");
}
// Close the PrintWriter.
out.close();
}
}

Put this in your WEB-INF/web.xml

<servlet>
<servlet-name>stopSpammers</servlet-name>
<servlet-class>com.parsek.util.StopSpammersServlet</servlet-class>
</servlet>
<servlet-mapping>
<servlet-name>stopSpammers</servlet-name>
<url-pattern>/members/*</url-pattern>
</servlet-mapping>

Here you go. No PHP, no Apache, no MySQL, no Perl, just one servlet container.

Ciao

http-referrer (2)

sofar (317980) | more than 12 years ago | (#3328955)


hmm, just a wild guess, but does this technique involve using the http-referrer to see if there are too many clients coming from just a particular address (which would obviously be a *bad* thingy), and subsequently block them too?

might explain why we can't see it no more :-(

I want it too!!! it seems to work pretty good!

Re:http-referrer (1)

cheekymonkey_68 (156096) | more than 12 years ago | (#3328997)

Wouldn't it block search engine bots like the Googlebot as well? If so, bang goes all your hard work on SEO...

For instance, when you get "Googlebot/2.1 (+http://www.googlebot.com/bot.html)" turning up in your logs?

Re:http-referrer (2, Interesting)

DutchSter (150891) | more than 12 years ago | (#3329066)

No. The point the author made was that good bots follow the 'robots.txt' standard. A versatile program like this can differentiate. If a robot comes in and plays by the rules on robots.txt, it's welcomed. OTOH, if one comes in and just starts grabbing at everything, it will quickly find itself blocked.

I believe the exact quote in regards to why robots.txt should still be used is: "Most bad spambots don't even check the robots.txt file, so this is mainly for protection of the good bots."

Another thing I find appealing is that on a large enough system the DB could be shared amongst several servers to provide common protection for all. I've always taken a don't put an address on the page approach, but it's cool to see someone looking at how these bots operate from a technical standpoint.

Some ISPs (like mine) have policies against SPAM that stipulate that in addition to not actually spamming people, using their resources to prepare/collect addresses to SPAM is just as bad. The advantage the database gives you is that you can track the most recent offenders. A quick lookup to who owns the address, with hard evidence of one of their subscribers abusing both your system, and their policy will, if nothing else, cause the cost of spamming to rise. The reason SPAM is so popular is because it is VERY cheap to do. Once its costs approach those of 'traditional' marketing, things might get a bit more selective rather than sending my three year old '1-3 inches in 6 weeks!','Stop paying for cable', or 'Get out of debt now!' messages. Hardly directed.

(Now I don't want anyone marketing to my three year old, but I know it will happen so I'd like to at least think they would be reasonable things, perhaps a bit relevant)

Re:http-referrer (0)

Anonymous Coward | more than 12 years ago | (#3329080)

No, he was talking about http-referer, not agent...

I'm sorry, am I a bot? (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#3328957)

I honestly can't get to:
http://www.neilgunton.com/spambot_trap/
or to
http://www.neilgunton.com/
I'm coming from a hungarian IP, but I don't have any problems with any other page. If somebody's seen the text, could you post it in reply to this?

How I track spammers using PHP (5, Interesting)

Elkman (198705) | more than 12 years ago | (#3328960)

I did something rather low-tech: I created a "Contact Us" page on my web server that has an automatically-generated address at the bottom. It says, "Note: The address spamtest.1018617636@example.com is not a valid contact address. It's just here to catch spammers." The number is actually the current UNIX timestamp, so I know exactly who grabbed this mail address and sent me mail.
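That timestamped trap address could be generated with something like this (a sketch; the class name is made up, and example.com stands in for the real domain):

```java
// Sketch of the timestamped trap address described above: embed the
// current UNIX time in a throwaway address, so any spam sent to it
// reveals exactly when the page was harvested.
public class TrapAddress {
    public static String trapAddress(long unixSeconds) {
        return "spamtest." + unixSeconds + "@example.com";
    }

    public static void main(String[] args) {
        long now = System.currentTimeMillis() / 1000L; // current UNIX timestamp
        System.out.println("Note: The address " + trapAddress(now)
                + " is not a valid contact address. It's just here to catch spammers.");
    }
}
```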

As it turns out, I really haven't received that much mail to this address. About the only mail I've ever received to it is someone from trafficmagnet.net, who tells me that I'm not listed on a few search engines and that I can pay them to have my site listed. I need to send her a nasty reply saying that I don't care about being listed on Bob's Pay-Per-Click Search Engine, and that if she had actually read the page, she would have noticed that she was sending mail to an invalid address. Besides, the web server is for my inline skate club and we don't have a $10/month budget to pay for search engine placement.

I think I've received more spam from my Usenet posting history, from my other web site, and from my WHOIS registrations than I've received from the skate club web site.

Huh? (-1, Troll)

Anonymous Coward | more than 12 years ago | (#3328961)

Why would you use MySQL? Is it because you are afraid to use a REAL database? MySQL sucks. Don't be afraid to spend a little bit of money, you no good freeloading hippie.

Re:Huh? (1)

loply (571615) | more than 12 years ago | (#3329026)

What's wrong with MySQL? It does everything the website claims it does.

Re:Huh? (0)

Anonymous Coward | more than 12 years ago | (#3329146)

Nothing, MySQL and mSQL are dope as shite! I've never used anything but. Don't get me wrong, Oracle is great for massive databases, I'm sure, but I don't want to pay 80k for an Oracle server that's so damn complicated I'd have to spend another 5-10k on schooling. It doesn't need to be that complicated; however, it justifies the 70-80k per year you have to spend on your developer. All in all, not worth it. If you can't do it with MySQL or mSQL, you're a poor programmer.

Hammered already.... (5, Funny)

cswiii (11061) | more than 12 years ago | (#3328965)

From the website:
The Problem: Spambots Ate My Website

s/Spambots/Slashdot/

mod_perl!!! I can hardly contain myself!! (1, Flamebait)

cscx (541332) | more than 12 years ago | (#3328970)

Hold back the excitement, people, it's another episode of story recycling. [slashdot.org]

This site is pretty handy, [evolt.org] now that I'm on the topic. Also make sure to check out RobotCop [robotcop.org] . Out for Apache now, coming soon for IIS and Zeus!

re: spidertrap (4, Interesting)

blibbleblobble (526872) | more than 12 years ago | (#3328972)

My PHP spider-trap [blibbleblobble.co.uk] - See an infinity of email addresses and links in action!

removing mailto: a bad solution (5, Interesting)

bluGill (862) | more than 12 years ago | (#3328978)

Removing mailto: links is a bad solution to the problem. It might be the only solution, but it is bad.

I hate the editor in my web browser. No spell check (and a quick read of this message will prove how disastrous that is for me), no good editing abilities, and other problems. By contrast, my email client has an excellent editor and a spell checker. Let me pull up a real mail client when I want to send email, please!

In addition, I want people to contact me, and not everyone is computer literate. I hang out in antique iron groups, I expect people there to be up on the latest in hot tube ignition technology, not computer technology. To many of them computers are just a tool, and they don't have time to learn all the tricks to make it work, they just learn enough to make it do what they want, and then ignore the rest. Clicking on a mailto: link is easy and does the right thing. Opening up a mail client, and typing in some address is error prone at best.

Removing mailto: links might be the only solution, but I hope not. So I make sure to regularly use spamcop [spamcop.org] .

Simple solution! (3)

Balinares (316703) | more than 12 years ago | (#3329090)

1) Put a link such as: mailto:dedicatedaddress@wherever.com?Subject= [Question] About your site (or whatever)
2) Trash any email sent to dedicatedaddress that doesn't have the [Question] tag in the subject.

Hope this helps.

Re:Simple solution! (3, Insightful)

c=sixty4 (35259) | more than 12 years ago | (#3329216)

  1. Put a link such as: mailto:dedicatedaddress@wherever.com?Subject= [Question] About your site (or whatever)
  2. Trash any email sent to dedicatedaddress that doesn't have the [Question] tag in the subject.
Congratulations. You just ensured you can't be emailed by anyone not running Internet Explorer.

A better solution: obfuscate the mailto: link (5, Insightful)

rsidd (6328) | more than 12 years ago | (#3329101)

Write some of your email address using HTML codes for the ASCII characters, like &#114; for "r".
(Yes, I've posted about this before [slashdot.org] , but it does work for me.) Browsers render it so users get the address they want, but spambots try to grab it from the raw HTML and get something meaningless.
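A sketch of that obfuscation in Java (the class name is made up; the encoding is the standard HTML numeric character reference, `&#NN;` where NN is the character's code point):

```java
// Numeric-entity obfuscation as described above: each character of the
// address becomes &#NN;, so browsers render it normally while a bot
// scraping raw HTML sees only entity soup.
public class MailtoObfuscator {
    public static String obfuscate(String address) {
        StringBuilder sb = new StringBuilder();
        for (char c : address.toCharArray()) {
            sb.append("&#").append((int) c).append(';');
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // "r" is ASCII 114, so it becomes &#114;
        System.out.println(obfuscate("r"));
        System.out.println("<a href=\"mailto:" + obfuscate("user@host.com") + "\">email me</a>");
    }
}
```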

Re:removing mailto: a bad solution (1)

ichimunki (194887) | more than 12 years ago | (#3329196)

I like SpamAssassin myself. It's pretty accurate in tagging spam. I agree, obscuring your address gets to be a pain. And it's usually not going to keep your more common addresses from getting passed around at some point. I get most of my spam as a result of shopping online, eBay, or just plain having a registered domain name.

As to web browsers, wouldn't it be great to have a plugin that could transform a text field into a mini word processor, complete with a limited-function spell-checker and minimal HTML-compliant formatting (for sites like Slashdot, where it would be nice not to have to compose HTML to do things like bold, italics or blockquotes)? You know, you select the word you want bold, hit the bold tool in the toolbar, and on submit the correct tags are added. No more forgetting to close tags!

Take this tool... (0, Troll)

SkyLeach (188871) | more than 12 years ago | (#3328981)

And install it in Hawaii. Those Samoans even eat that sh*t at nice restaurants!

Yuck [myhawaiionline.com]

Eww [brown.edu]

QUICHE!? [khon.com]

Cookbook!? [besspress.com]

Re:Take this tool... (-1, Troll)

linzeal (197905) | more than 12 years ago | (#3329076)

Dude you do not yet know someone that can prepare spam well for you then. I suggest trying to date chicks you pick up from food4less, waffle house, or checker auto parts for awhile. They will show you the light and you will continue in glory the rest of your days with said satisfaction of discovering the greatest meat by product known to man.

slashdotted (first part) (-1)

Anonymous Coward | more than 12 years ago | (#3328989)

Stopping Spambots: A Spambot Trap Using Linux, Apache, mod_perl, Perl, MySQL, ipchains and Embperl

Copyright 2002 by Neil Gunton. Last updated: April 11th 2002.

This document describes my experiences with spambots on my websites, and the techniques I have developed to stop them dead. I assume the reader has basic familiarity with Linux, Apache, mod_perl, Perl, MySQL and firewall rules using ipchains - each of these topics could fill a book, so I won't talk about installation or basic configuration. I will, however, provide full scripts and instructions on using these within the context of these tools. If you'd like some basic pointers on getting set up using these tools, then you could take a look at my short series of three Linux Network Howto articles.

Contents

* The Problem: Spambots Ate My Website
* Overview of the Spambot Trap
* Banishing 'mailto:'
* MySQL
* BlockAgent.pm
* ipchains
* badhosts_loop
* spambot_trap/ Directory
* robots.txt
* Your HTML Files
  o Embperl
* httpd.conf
* Monitoring
* Conclusions
  o Strengths
  o Weaknesses
  o Possible future enhancements

The Problem: Spambots Ate My Website

Spambot: (noun) - A software program that browses websites looking for email addresses, which it then "harvests" and collects into large lists. These lists are then either used directly for marketing purposes, or else sold, often in the form of CD-ROMs packed with millions of addresses. To add insult to injury, you may receive a spam email which is asking you to buy one of these lists yourself. Spambots (and spam) are a pestilence which needs to be stamped out wherever it is found.

I have a website, http://www.crazyguyonabike.com, which has bicycle tour journals, message boards and guestbooks. I started noticing around the end of 2001 that the site was getting hit a lot by spambots. You can spot this sort of activity by looking for very rapid surfing, strange request patterns, and non-browser User-Agents.

After looking at the server logs, I realized a couple of things: Firstly, the spambots came from many different IP addresses, so this precluded the simple option of adding the source IP to my firewall blocks list. Secondly, there seemed to be a common behavior between the bots - even if this was the first visit from a particular IP address (or even a particular network, so no chance of just being a different proxy) they would come straight into the middle of my website, at a specific page rather than the root. This means that the spambots obviously had some kind of database of pages, which had presumably been built up from previous visits, before I'd noticed the activity, and this database was being shared between a large number of different hosts, each of which was apparently running the same software.

Another distinctive behavior was that the spambots would follow only those links which had certain keywords which would seem promising if you're looking for email addresses: "guestbook", "journal", "message", "post" and so on. On each of the pages in my site there were many other links in the navbars, but only links with these keywords were being followed. Also, robots.txt was never even being read, let alone followed. Moreover, the bot would come in, scan pages rapidly for maybe a few seconds, and then stop for a while. So it was obviously making at least some attempt to circumvent blocks based on frequency/quantity of requests.

This was very annoying. For one thing, these things were picking off email addresses from my website (at that point, I was letting people who posted on my message boards decide for themselves whether they wanted their email addresses to be visible or not). But quite apart from that, it was taking up resources, and was just plain rude. I hate spam. I resent my webserver having to play host to people whose obvious goal is to cynically exploit the co-operative protocols of the internet to their own selfish, antisocial gain.

So, I decided to do something about it. The first thing I did was to look at the User-Agent fields which were being used by the bots. There were a variety, including variations on the following:

* DSurf15a 01
* PSurf15a VA
* SSurf15a 11
* DBrowse 1.4b
* PBrowse 1.4b
* UJTBYFWGYA (and other strings of random capital letters)

I searched the internet for references to these strings, but all I found was a slew of website statistics analysis logs. This meant that these particular spambots obviously got around. It was also discouraging, because there was no mention anywhere of what these things actually were. I was surprised that there seemed to be no discussion whatsoever of something that seemed to be pandemic. Then I found a couple of other websites with guestbooks that had actually been defiled by these spambots (if you follow these links and you don't see a lot of empty messages left by the above user agents, then that means the webmaster of the site has finally found a way to stop it, so good for them...):

* http://www.virtualglasgow.com/guestbook.html
* http://www.donotenter.com/guestbook/gbook.html

I reckon the spambots didn't really intend to leave empty messages. They just tend to want to follow links with the keyword 'post'. So if the guestbook posting form has no preview or confirmation page, then the spambot would leave a message simply by following this link! My guestbooks and message boards have a preview page, which is probably why I hadn't had any of this.

Anyway, I started thinking about what kind of program this thing was. First of all, it comes from all kinds of different IP addresses. I couldn't quite believe that this many different IP addresses were all intentionally using the same software, of which I could find absolutely no mention anywhere on the Web. This made me think it might be some kind of virus/trojan/worm or whatever that silently installed itself on people's computers, and then used the CPU and bandwidth to surf the Web without the owner being aware of it. I thought that if this was the case, then it must be sending the results somewhere - and if we could find out where, then we could go about shutting the operation down.

But I have had no luck at all in getting any help from the sysadmins at ISPs I have contacted. A typical exchange was the one with a guy at Cox internet, which was where a persistent offending IP address was sourced. He just couldn't be bothered, and eventually told me that spidering was not against the law, or their terms of service. I asked whether actions which were blatantly obviously geared toward the generation of spam were against their terms of use, but he never replied to that. I had no more luck anywhere else: Nobody had heard of this thing. I even sent an email to CERT, but no response.

So, I turned instead to thinking about how I could erase these pests from my life as much as possible. This document is about my quest to stop spambots (not just this one, but ALL spambots) from abusing my website. Hopefully it will be useful to you.

Overview of the Spambot Trap

There are three main parts to the technique which I outline here:

1. Banish visible email addresses from your websites altogether, or else obfuscate them so they can't be harvested. Examples of how to do this are given. This is your fail-safe, in case the spambots figure out a way around your other defences. Even if they manage to cruise your website on their very best behavior, they still should not be able to harvest email addresses!

2. Block known spambots: Certain User-Agents are just known to be bad, so there's no reason to let them come on your site at all. True, spambots could in theory spoof the User-Agent, but the simple reality is that a lot of them don't. We use an enhanced version of the BlockAgent.pm module from the O'Reilly mod_perl book. This extension adds offending IP addresses to a MySQL (or other relational) database, which is picked up by the third part of our cunning system...

3. Set a Spambot Trap, which blocks hosts based on behavior. We set a trap for spambots, which normal users with browsers and well-behaved spiders should not fall into. If the bot falls in the trap, then its IP address is quickly blocked from all further connections to the webserver. This works using a persistent, looping Perl script called badhosts_loop, which checks every few seconds for additions to a 'badhosts' database. This script then adds 'DENY' rules for each bad host to the ipchains firewall. Blocks have an expiry, which is initially set to one day. If a host falls in the trap again after the block expires, then that IP is blocked again - and the expiration time is doubled to 2 days. And so on. This algorithm ensures that the worst offenders get progressively more blocked, while one-time offenders don't stick around in our firewall rules eating up resources.

There are various components to the Spambot Trap, including the badhosts_loop Perl script, the BlockAgent.pm module, ipchains config, MySQL database, httpd.conf, robots.txt, and your HTML files. These are all covered in the sections below.

Banishing 'mailto:'

The first and most urgent thing you need to do is to get email addresses off your website altogether. This means, unfortunately, banishing the venerable mailto: link. It's a real shame that perfectly good mechanisms should be removed because of abuse, but that's just the way the world is these days. You need to be defensive, and assume that the spammers will try to take advantage of your resources as much as possible.

It's an arms race

The important thing that you need to realize is that no matter what blocks we put in place, this game is an arms race. Eventually the spambot writers will develop smarter bots which circumvent our techniques. Therefore you want to have a failsafe, which will prevent email addresses from getting into the hands of the spambot even if all else fails. The only real way to do that is to completely remove all email addresses from your website.
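The escalating block-expiry rule the article describes (one day for a first offence, doubling on each repeat) can be sketched as follows. This is my paraphrase in Java, not the author's badhosts_loop code, and the class name is made up:

```java
// Escalating block durations: first offence blocks for one day, and
// each repeat offence after the block expires doubles the duration.
// Worst offenders stay blocked longer; one-time offenders age out fast.
public class BlockExpiry {
    private static final long DAY_SECONDS = 24L * 60 * 60;

    // offence is 1-based: 1st offence -> 1 day, 2nd -> 2 days, 3rd -> 4 days...
    public static long blockSeconds(int offence) {
        return DAY_SECONDS << (offence - 1);
    }

    public static void main(String[] args) {
        for (int offence = 1; offence <= 4; offence++) {
            System.out.println("offence " + offence + ": block "
                    + (blockSeconds(offence) / DAY_SECONDS) + " day(s)");
        }
    }
}
```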

The WOrld is now safe. (2, Funny)

ksplatter (573000) | more than 12 years ago | (#3328999)

Thanks to: Spam Bots!!! More than meets the Eye :)

BAD MONKEYs (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#3329005)

i hate monkeys

sincerely the coward man

Re:BAD MONKEYs (-1, Offtopic)

Anonymous Coward | more than 12 years ago | (#3329019)

FOOL
how can you talk about monkeys in a time like this?

Re:BAD MONKEYs (-1, Flamebait)

Anonymous Coward | more than 12 years ago | (#3329038)

I LOVE THOSE BLOODY MONKEYS YOU BE QUIET OR ILL Rox0r you a new face! a monkey face that is!! your soooooo lame. monkeys rule and you should know it. only if there were less monkey haters in this world we wouldnt have this problem with morons getting pissed on the board talking bout sumthing tottaly irrealavent-------

-------````MonkeYBoY````-------

Similar to how the new ORBZ works? (4, Interesting)

Masem (1171) | more than 12 years ago | (#3329024)

After the Battle Creek incident with ORBZ, the maintainer changed the way it worked; instead of proactively checking for open relays, he now has a 'honeypot'-like system: a unique email address that isn't directly visible on the site but may still be harvested by a spam bot. Any server that sends email to that address is automatically added to The List. Mail server admins who believe they should not be on this list can argue their case to have their server removed.

Re:Similar to how the new ORBZ works? (1)

Slash Veteran (561542) | more than 12 years ago | (#3329062)

can argue their case to remove their server

Actually, there's no arguing involved. Just submit your IP, and you're removed -- until the next time your mailer sends mail to the trap address.

Re:Similar to how the new ORBZ works? (4, Interesting)

toupsie (88295) | more than 12 years ago | (#3329147)

he now has a 'honeypot' like system where a unique email address that isn't directly visible on the site but still may be harvested by a spam bot. Any server that sends email to that address is automatically added to

This is the same method I have been using for a while. I have an e-mail account called "cannedham" that I had posted on several web sites as a mailto: anchor on a 1x1 pixel graphic. Any e-mail sent to that address updates my Postfix header_checks file to protect the rest of my accounts. It works like a charm.

Now, let's fake the other end. (1)

iggly_iguana (36376) | more than 12 years ago | (#3329030)

This gives me an idea for a spam version of a roach motel (Spam gets in, but it never gets out).

I wonder what it would take to create an open relay server that would fool spammers into using it.

Ideas would be welcome. This could be just the revenge I've been looking for!!!

Sig: "That's not a duck!"

A tip (5, Informative)

anthony_dipierro (543308) | more than 12 years ago | (#3329047)

Here's a tip for those of you writing spambot traps... How about not blindly responding to the faked Return-Path address?

Now that should be illegal. You people whine about your 10 spams a day, try 10,000 from 2000 different email addresses. Idiot postmasters should be caught and jailed.

he suggests formmail, another spam tool (5, Informative)

nwc10 (306671) | more than 12 years ago | (#3329054)

Interestingly within the article he suggests hiding your e-mail addresses by making a feedback page. One of the programs that he suggests is formmail, and he links to Matt's original version.

formmail itself (even the most recent version) can still be abused by spammers to use your webserver as a bulk mail relay - see the advisory at [monkeys.com]
http://www.monkeys.com/anti-spam/formmail-advisory.pdf

It's a shame he didn't suggest the more robust formmail replacement at nms [sourceforge.net] which is maintained, and attempts to close all the known bugs and insecurities.

Speed (1)

egon (29680) | more than 12 years ago | (#3329059)

If he really wants to make the thing run faster, turn those varchars into regular chars. And index index index!

Suicidal (0, Redundant)

Captain Large Face (559804) | more than 12 years ago | (#3329064)

Wow, this guy slashdotted himself..

Stopping Spambots: A Spambot Trap

Using Linux, Apache, mod_perl, Perl, MySQL, ipchains and Embperl

Copyright 2002 by Neil Gunton

This document describes my experiences with spambots on my websites, and the techniques I have developed to stop them dead. I assume the reader has basic familiarity with Linux, Apache, mod_perl, Perl, MySQL and firewall rules using ipchains - each of these topics could fill a book, so I won't talk about installation or basic configuration. I will, however, provide full scripts and instructions on using these within the context of these tools. If you'd like some basic pointers on getting set up using these tools, then you could take a look at my short series of three Linux Network Howto articles.

Contents

  • The Problem: Spambots Ate My Website
  • Overview of the Spambot Trap
  • Banishing 'mailto:'
  • MySQL
  • BlockAgent.pm
  • ipchains
  • badhosts_loop
  • spambot_trap/ Directory
  • robots.txt
  • Your HTML Files
    • Embperl
  • httpd.conf
  • Monitoring
  • Conclusions
    • Strengths
    • Weaknesses
    • Possible future enhancements

The Problem: Spambots Ate My Website

Spambot: (noun) - A software program that browses websites looking for email addresses, which it then "harvests" and collects into large lists. These lists are then either used directly for marketing purposes, or else sold, often in the form of CD-ROMs packed with millions of addresses. To add insult to injury, you may receive a spam email which is asking you to buy one of these lists yourself. Spambots (and spam) are a pestilence which needs to be stamped out wherever it is found.

I have a website, http://www.crazyguyonabike.com, which has bicycle tour journals, message boards and guestbooks. I started noticing around the end of 2001 that the site was getting hit a lot by spambots. You can spot this sort of activity by looking for very rapid surfing, strange request patterns, and non-browser User-Agents.

Another distinctive behavior was that the spambots would follow only links containing keywords that look promising if you're after email addresses: "guestbook", "journal", "message", "post" and so on. On each page of my site there were many other links in the navbars, but only links with these keywords were being followed. Also, robots.txt was never even being read, let alone followed. Moreover, the bot would come in, scan pages rapidly for maybe a few seconds, and then stop for a while. So it was obviously making at least some attempt to circumvent blocks based on frequency/quantity of requests.

This was very annoying. For one thing, these things were picking off email addresses from my website (at that point, I was letting people who posted on my message boards decide for themselves whether they wanted their email addresses to be visible or not). But quite apart from that, it was taking up resources, and was just plain rude. I hate spam. I resent my webserver having to play host to people whose obvious goal is to cynically exploit the co-operative protocols of the internet to their own selfish, antisocial gain. So, I decided to do something about it.

The first thing I did was to look at the User-Agent fields which were being used by the bots. There were a variety, including variations on the following:

  • DSurf15a 01
  • PSurf15a VA
  • SSurf15a 11
  • DBrowse 1.4b
  • PBrowse 1.4b
  • UJTBYFWGYA (and other strings of random capital letters)

I searched the internet for references to these strings, but all I found was a slew of website statistics analysis logs. This meant that these particular spambots obviously got around. It was also discouraging, because there was no mention anywhere of what these things actually were. I was surprised that there seemed to be no discussion whatsoever of something that seemed to be pandemic. Then I found a couple of other websites with guestbooks that had actually been defiled by these spambots: (if you follow these links and you don't see a lot of empty messages left by the above user agents, then that means the webmaster of the site has finally found a way to stop it, so good for them...)

  • http://www.virtualglasgow.com/guestbook.html
  • http://www.donotenter.com/guestbook/gbook.html

I reckon the spambots didn't really intend to leave empty messages. They just tend to want to follow links with the keyword 'post'. So if the guestbook posting form has no preview or confirmation page, then the spambot would leave a message simply by following this link! My guestbooks and message boards have a preview page, which is probably why I hadn't had any of this.

Anyway, I started thinking about what kind of program this thing was. First of all, it comes from all kinds of different IP addresses. I couldn't quite believe that this many different IP addresses were all intentionally using the same software, of which I could find absolutely no mention anywhere on the Web. This made me think it might be some kind of virus/trojan/worm or whatever that silently installed itself on people's computers, and then used the CPU and bandwidth to surf the Web without the owner being aware of it.

I thought that if this was the case, then it must be sending the results somewhere - and if we could find out where, then we could go about shutting the operation down. But I have had no luck at all in getting any help from the sysadmins at the ISPs I have contacted. A typical exchange was the one with a guy at Cox internet, which was where a persistent offending IP address was sourced. He just couldn't be bothered, and eventually told me that spidering was not against the law, or their terms of service. I asked whether actions which were blatantly geared toward the generation of spam were against their terms of use, but he never replied to that.

I had no more luck anywhere else: Nobody had heard of this thing. I even sent an email to CERT, but no response. So, I turned instead to thinking about how I could erase these pests from my life as much as possible. This document is about my quest to stop spambots (not just this one, but ALL spambots) from abusing my website. Hopefully it will be useful to you.

Overview of the Spambot Trap

There are three main parts to the technique which I outline here:

  • Banish visible email addresses from your websites altogether, or else obfuscate them so they can't be harvested. Examples of how to do this are given. This is your fail-safe, in case the spambots figure out a way around your other defences. Even if they manage to cruise your website on their very best behavior, they still should not be able to harvest email addresses!
  • Block known spambots: Certain User-Agents are just known to be bad, so there's no reason to let them come on your site at all. True, spambots could in theory spoof the User-Agent, but the simple reality is that a lot of them don't. We use an enhanced version of the BlockAgent.pm module from the O'Reilly mod_perl book. This extension adds offending IP addresses to a MySQL (or other relational) database, which is picked up by the third part of our cunning system...
  • Set a Spambot Trap, which blocks hosts based on behavior. We set a trap for spambots, which normal users with browsers and well-behaved spiders should not fall into. If the bot falls in the trap, then its IP address is quickly blocked from all further connections to the webserver. This works using a persistent, looping Perl script called badhosts_loop, which checks every few seconds for additions to a 'badhosts' database. This script then adds 'DENY' rules for each bad host to the ipchains firewall. Blocks have an expiry, which is initially set to one day. If a host falls in the trap again after the block expires, then that IP is blocked again - and the expiration time is doubled to 2 days. And so on. This algorithm ensures that the worst offenders get progressively more blocked, while one-time offenders don't stick around in our firewall rules eating up resources.
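The doubling-expiry policy described above can be sketched as follows (a minimal illustration in Python; the author's actual implementation is in the Perl scripts covered later, and this function name is hypothetical):

```python
from datetime import datetime, timedelta

def next_block(prev_expire_days):
    """Compute the next block for an offending IP address.

    A first offense blocks for 1 day; each repeat offense after an
    expired block doubles the previous duration (1, 2, 4, 8, ... days).
    Returns (days, created, expiry).
    """
    days = 1 if prev_expire_days is None else prev_expire_days * 2
    created = datetime.now()
    return days, created, created + timedelta(days=days)
```

This keeps the firewall rule list short: one-time offenders expire after a day, while only persistent spambots accumulate long blocks.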

There are various components to the Spambot Trap, including the badhosts_loop Perl script, the BlockAgent.pm module, ipchains config, MySQL database, httpd.conf, robots.txt, and your HTML files. These are all covered in the sections below.

Banishing 'mailto:'

The first and most urgent thing you need to do is to get email addresses off your website altogether. This means, unfortunately, banishing the venerable mailto: link. It's a real shame that perfectly good mechanisms should be removed because of abuse, but that's just the way the world is these days. You need to be defensive, and assume that the spammers will try to take advantage of your resources as much as possible.

It's an arms race

The important thing that you need to realize is that no matter what blocks we put in place, this game is an arms race. Eventually the spambot writers will develop smarter bots which circumvent our techniques. Therefore you want to have a failsafe, which will prevent email addresses from getting into the hands of the spambot even if all else fails. The only real way to do that is to completely remove all email addresses from your website.

Contact forms

You should replace the mailto: links with links to a special form where people can type their name, email address and message. A CGI can then deliver the email, and your email address never has to be disclosed. There are a number of different mailer scripts out there - just be careful to check for vulnerabilities which could allow malicious users to use the form to send email to third parties (i.e. spam, ironically enough) using your server. The formmail script is popular, but an earlier version had such a vulnerability (since fixed). The Embperl package has a simple MailFormTo command to send an email from a form.

Since I have seen guestbooks out there which have been extensively defiled by spambots, I would add that you should have a preview screen on your contact forms. This will ensure that an email doesn't get fired off simply by a spambot following the 'post' or 'contact' link (which it will likely try to do).
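As a sketch of that idea, a contact form can submit to a preview step first, so that a single request on a 'post' or 'contact' link never sends mail by itself (the CGI path and field names here are hypothetical):

```html
<!-- Step 1: the form posts to a preview page, not straight to the mailer -->
<FORM METHOD="POST" ACTION="/cgi-bin/contact_preview.cgi">
Name: <INPUT TYPE="text" NAME="name"><BR>
Email: <INPUT TYPE="text" NAME="email"><BR>
Message: <TEXTAREA NAME="message"></TEXTAREA><BR>
<INPUT TYPE="SUBMIT" VALUE="Preview">
</FORM>
<!-- Step 2 (contact_preview.cgi): redisplay the message and require a
     second explicit "Send" POST before any email is actually delivered. -->
```

A spambot blindly following the link performs only a GET and never completes the two-step POST sequence, so nothing gets sent or posted.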

Alternatives to totally banishing mailto:

There are alternatives to completely removing email addresses, but they all depend on the stupidity of the spambot, and so could be compromised by a new generation of pest. These include:

  • Write out email addresses in a non-email format, e.g. instead of writing 'username@domain.com' you would write 'username at domain dot com', or something similar. It would only take some spambot with a little more intelligence to be able to scan these patterns and pick up "likely" addresses, so this strategy is a little risky. Any consistent method you choose to write out email addresses could in theory be analyzed and decoded by a savvy bot.
  • Add stuff to the email address to make it invalid, but so that a human could easily know what to do to make it work. An example of this is writing 'username@_NO_SPAM_domain.com'. You need to remove the "_NO_SPAM_" part to make the email address valid. You can have some kind of explanation to make it clear what people have to do to use the address. Personally, I don't like this - you're depending on a level of sophistication on the part of your users which is risky. In my experience, there are a lot of very 'novice' level users out there, who only know how to click on a link. They don't know how to edit an email address. Heck, I've had people come to my site by typing the URL into Google, rather than the 'Location' box of their browser. Also, people don't read instructions.
  • Make graphics images which contain the email address. Spambots usually don't download graphics, and even if they did, they probably couldn't decode the bits to get the text. However, they could do it in theory, since software for doing OCR (optical character recognition, getting text from scanned documents) has been around for a while. A downside to this approach is that the user has to manually copy down the email address, since it can't be cut'n'pasted. Also, you can't put a mailto: link on the image, otherwise you're back to square one. But you could put a link to a contact form, with an argument in the link telling your server internally what email address to use. For example, the link could say "contact.cgi?to=23", where '23' is some database key to the actual email address. But the downside here is that you still need to generate the image, which is a bit of a pain in the ass if you have a lot of them. You can do it automatically, if you're willing to put the work in and write the scripts. There are some very nice graphics generation packages out there on CPAN for Perl. Here's an example of an email address presented as an image:
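The first of these options (spelling the address out) is trivial to automate; a minimal sketch in Python for illustration:

```python
def obfuscate(address):
    """Rewrite 'user@domain.com' as 'user at domain dot com'.

    Note the caveat above: any consistent rewriting like this can be
    reversed by a smarter bot, so treat it as a speed bump, not a wall.
    """
    return address.replace("@", " at ").replace(".", " dot ")
```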

MySQL

Download badhosts MySQL database dump

We need to set up a MySQL database, where we store records of the hosts which are to be blocked. This doesn't have to be MySQL, but I use it because it's extremely fast, and very appropriate for this kind of application. You need to create a new database, called 'badhosts'. You then create a table, again called 'badhosts', with the following structure:

Field         Type                            Comment
ip_address    varchar(20) not null, indexed   The IP address of the host to be blocked
user_agent    varchar(255) not null           The HTTP User-Agent of the spambot, for reference
expire_days   int unsigned not null           How many days this block lasts; doubled every time a new block has to be created for a particular IP address
created       datetime not null               When this block was created
expiry        datetime not null, indexed      When this block expires
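The table above corresponds to roughly the following DDL (a sketch only; the author's downloadable dump file is authoritative):

```sql
CREATE TABLE badhosts (
  ip_address  VARCHAR(20)  NOT NULL,
  user_agent  VARCHAR(255) NOT NULL,
  expire_days INT UNSIGNED NOT NULL,
  created     DATETIME     NOT NULL,
  expiry      DATETIME     NOT NULL,
  INDEX (ip_address),
  INDEX (expiry)
);
```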

You could use the dump provided above to load directly into your database:

shell> mysqladmin create badhosts
shell> mysql badhosts < badhosts.dump

That's about it! The fields which are marked as 'indexed' are the only ones which need indexes, because they are searched on to see if a particular IP address has been previously blocked, and also to see which blocks should be removed because they've expired. If you have access privileges set on your MySQL databases, then you need to allow the Apache user (usually 'nobody') access. The other script that will require access is badhosts_loop, which runs as root.
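If you do have privileges enabled, a grant along these lines covers the webserver user (the user and host names here are assumptions; adjust them to your setup):

```sql
-- Allow the Apache user to read and write block records;
-- badhosts_loop runs as root and typically already has access.
GRANT SELECT, INSERT, UPDATE, DELETE ON badhosts.* TO 'nobody'@'localhost';
```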

Next, we look at the script that populates this database.

BlockAgent.pm

Download BlockAgent.pm

Download bad_agents.txt

The BlockAgent.pm Apache/mod_perl module is taken from the excellent book "Writing Apache Modules with Perl and C" by Lincoln Stein & Doug MacEachern (O'Reilly). This script basically acts as an Apache authentication module which checks the HTTP User-Agent header against a list of known bad agents. If there's a match, then a 403 'Forbidden' code is returned. The script compiles and caches a list of subroutines for doing the matches, and automatically detects when the 'bad_agents.txt' file has changed. I have found that it has no noticeable impact on the performance of the webserver. This script is useful in the case where you know for certain that a certain User-Agent is bad; there's no point in letting it go anywhere on your site, so it's a good first line of defense. We'll cover how to add this module to your website a little later, along with the rest of the configuration settings in the section on httpd.conf.

Of course, one of the first arguments you'll see with regard to this method of blocking spambots is that it's easy to circumvent, by simply passing in a User-Agent string which is identical to the major browsers out there. This is perfectly true, but don't ask me why the spambot writers haven't done this - maybe it's a question of pride or ego, they want to see their baby out there on record in Web server logs. I honestly don't know. The main point is that at present, the User-Agent header CAN be used very effectively to block most bad agents. But, I have added more features so that we can also block agents which look ok, but behave badly by going somewhere they shouldn't - the Spambot Trap. More on that soon.

You'll notice that the bad_agents.txt file which I have supplied here is very comprehensive. A good strategy here is probably to save the full version somewhere (perhaps as bad_agents.txt.all), and just keep the ones you actually encounter in the bad_agents.txt file. Then you keep the list shorter, and more relevant to what actually hits you. For example, my bad_agents.txt file currently has the following lines in it, because these are the spambots that I see most frequently:

  • [A-Z]+$
  • .Browse\s
  • .Eval
  • EO Browse
  • .Surf
  • Microsoft.URL
  • ^Mozilla\/3.0.+Indy Library
  • Zeus.*Webster

You'll notice from this that BlockAgent.pm is very flexible, being able to take full advantage of the excellent regular expression capabilities of Perl. This means you can capture a lot of different agents with just one line. For example, the very first line catches all the variations of the agent which passes in random strings of capital letters, e.g. FHASFJDDJKHG or UYTWHJVJ. The spambot obviously thinks it's being pretty smart by looking different each time, but by using an easily identifiable pattern, it shoots itself in the foot. Hah.
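You can check how these patterns behave against real User-Agent strings; Python's regex syntax is close enough to Perl's for the patterns above, so here is a small illustration (the helper function is hypothetical, not part of BlockAgent.pm):

```python
import re

# A few of the patterns from the bad_agents.txt example above
bad_patterns = [r"[A-Z]+$", r".Surf", r".Browse\s"]

def is_bad_agent(user_agent):
    """Return True if any bad-agent pattern matches, as BlockAgent.pm would."""
    return any(re.search(p, user_agent) for p in bad_patterns)
```

The random-capitals bots match `[A-Z]+$`, "DSurf15a 01" matches `.Surf`, and a normal browser string like "Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)" matches none of them.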

The original version of the BlockAgent.pm script is well explained in the O'Reilly book, but I've added an extra hook that checks to see whether the client is accessing any of the spambot trap directories. If it is, then we add an entry to the MySQL database (you could use another relational database if you want, as long as it's accessible from Perl DBI).

The first time an IP address is blocked, an expiry of one day is set. If the same host subsequently comes in and falls into the trap again, then the expiry time is doubled. And so on. This way, the block gets longer and longer, in proportion to how persistently the spambot revisits our website. Once the IP address is blocked, the spambot can't even connect to our web server, since we use 'Deny' in the ipchains rule. This means that no acknowledgement is given to any packets coming in from the badhost, and as far as they know, our server has just gone away. Hopefully, after this happens for long enough, our server will be taken off the spambot's "visit" list. Another nice little side-effect of this is that the spambot will probably have to wait for a while before giving up each connection attempt. Anything that makes them waste more time is ok by me!

BlockAgent.pm notifies the badhosts_loop script that something has happened by touching a file called /tmp/badhosts.new. The badhosts_loop script checks this file every few seconds; if it has changed, then it knows that a new record has been added to the database, and it needs to re-generate the blocks list.
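The change-detection half of that handshake can be sketched like this (a Python illustration; badhosts_loop does the equivalent in Perl by comparing stat() results):

```python
import os

def has_changed(path, last_mtime):
    """Compare the file's current mtime to the last value we saw.

    Returns (changed, new_mtime). A missing file counts as unchanged,
    since the touch file only exists once there has been activity.
    """
    try:
        mtime = os.stat(path).st_mtime
    except OSError:
        return False, last_mtime
    return mtime != last_mtime, mtime
```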

The BlockAgent.pm script is our alarm system. It's what tells us that something happened. In order to act on this information, we need to be able to add rules to the ipchains firewall. We'll cover this next.

ipchains

Download sample ipchains config file

The ipchains module (here's the HOWTO doc) is a very nice way of providing a good level of basic network security to your server. If you haven't already set it up (or its successor, iptables), then you really should. It's a very easy way to configure who can and cannot have access to your machine. A good resource for learning about this is "Building Linux and OpenBSD Firewalls", by Wes Sonnenreich and Tom Yates (Wiley). This is where I learned about ipchains, and it's on their excellent explanations and examples that I based my own config file. Another is "Linux Firewalls" by Ziegler (New Riders), which seems to have a more recent 2nd edition that covers iptables too.

The example ipchains config file given here is complete, but the bit which is most important to us is that we create a chain called 'blocks'. This is our own custom chain, which we can then add rules to. The badhosts_loop script will flush this chain and build it back up whenever a spambot falls in your trap. Once the spambot's IP address is on the blocks list, that host cannot connect to your server at all.
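Concretely, the custom chain is managed with commands along these lines (a sketch using the legacy ipchains syntax; the sample config file and badhosts_loop issue these for you, and the IP address is the one from the log example later in this document):

```shell
# Create the custom 'blocks' chain and route incoming traffic through it
ipchains -N blocks
ipchains -I input -j blocks

# What badhosts_loop does on each update: flush, then re-add DENY rules
ipchains -F blocks
ipchains -A blocks -s 63.148.99.247 -j DENY
```

DENY (as opposed to REJECT) drops packets silently, which is why blocked spambots see the server simply vanish.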

Remember to restart ipchains after you've changed the config file. Next, we'll look at the script that actually adds the firewall rules.

badhosts_loop

Download badhosts_loop script

You run this script in the background, as root. It has to be run as root, because only root has the ability to add rules to the firewall. The script spends most of its time sleeping. It wakes up every five seconds or so and does a quick check on /tmp/badhosts.new. If this file has been changed since the last time it looked, then it goes and re-generates the firewall blocks list with all the current (non-expired) blocks. If nothing else happens, then the script will automatically do this at least once a day, to ensure that blocks really do expire even if there is no new activity.

You should probably add the following line to your /etc/rc.local file (or equivalent), so that the script is automatically started up on reboot:

/path/to/badhosts_loop --loop &

This will start the script looping in the background. The script automatically checks to see if it is already running, by attempting to lock /var/lock/badhosts_loop.lock. If the file is already locked then the script will exit with an error message. If you want to just run the script once, without looping, then just omit the '--loop' option. This can be useful for testing.
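The single-instance check can be sketched with an exclusive, non-blocking lock (a Python illustration of the idea; the Perl script locks /var/lock/badhosts_loop.lock):

```python
import fcntl

def acquire_single_instance_lock(path):
    """Try to take an exclusive flock on the lock file.

    Returns the open file handle on success (keep it open for the
    lifetime of the process), or None if another instance holds it.
    """
    handle = open(path, "w")
    try:
        fcntl.flock(handle, fcntl.LOCK_EX | fcntl.LOCK_NB)
    except OSError:
        handle.close()
        return None
    return handle
```

The lock is released automatically when the process exits, so a crashed instance never blocks a restart.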

Logging is done to /var/log/badhosts_loop.log by default. Every time the script generates the blocks list, it writes a list of all the blocks to the log. This is a good place to monitor if you're interested in what hosts are being blocked. Here's an example of the log output:

EDITOR: SNIPPED

Thu Apr 11 16:09:07 2002: Flushing blocks chain: Generating blocks list:

Adding 63.148.99.247 (1) 2002-04-11 11:16:11 to 2002-04-12 11:16:11 Mozilla/4.0 (compatible; MSIE 5.01; Windows NT 5.0)

The log shows the IP address which is being added, then (in brackets) the number of days the block is effective for (doubling each time), then the start and end dates of this block, and finally the name of the User-Agent which committed the crime. This can be useful for quickly seeing whether you need to add a new one to the bad_agents.txt file.

This is a pretty stable script that should just sit there and chug quietly, not taking up much in the way of resources. Checking for a file being changed every five seconds is not a big deal in Unix, so you shouldn't even notice it.

Now you have to create the trap itself - the spambot_trap directory.

spambot_trap/ Directory

Download gzipped tarball of sample spambot_trap directory

View the sample directory

You can create this directory anywhere on your server. We will create an alias in httpd.conf to access it. I put mine in /www/spambot_trap/. The point is, this doesn't have to be a real directory under your webserver directory root. If you use the Alias directive, then multiple websites can access the same spambot_trap directory, potentially through different aliases. You can use the sample tarball as a starting point; it has subdirectories and links which the spambots I have seen find irresistible. You should create your own image file for the unblock_email.gif file, to have a valid email address of your own.

The spambot_trap and spambot_trap/guestbook/ directories are not used directly to spring the trap. This is because I wanted to have a warning level, a lead-in, where real users would be able to realize they are getting into dangerous waters and could then back out. You're going to be placing hard-to-click links on your web pages which lead into the real trap, and there's always a chance that a real user will accidentally click on one of these. So, some of the links will point into the warning level. I have made a GIF image which contains a warning text. Why an image? Mainly because spambots can't understand images, and I didn't want to give big clues like "WARNING!!! DO NOT ENTER" in plain text. So, the user sees the warning, the spambots don't. If the spambot proceeds into any of the subdirectories (email, contact, post, message), then the trap is sprung and the host is blocked.
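If you'd rather build the directory skeleton by hand than use the tarball, the layout described above is just nested directories (shown under /tmp so it can be run anywhere; the author keeps his in /www/spambot_trap):

```shell
TRAP=/tmp/spambot_trap
mkdir -p "$TRAP"/guestbook/email \
         "$TRAP"/guestbook/contact \
         "$TRAP"/guestbook/post \
         "$TRAP"/guestbook/message
```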

You also need to try to stop good spiders (e.g. google) from falling into the spambot trap and being blocked. To do this, we utilize the robots.txt file.

robots.txt

Download sample robots.txt

This should allow good robots (such as google) to surf your site without falling into the spambot trap. Most bad spambots don't even check the robots.txt file, so this is mainly for protection of the good bots.

You'll see that we list a bunch of directories under '/squirrel'. This could be anything; you'll set an alias later in httpd.conf. In fact, you may even want this to be dynamically generated (see later, under Embperl), so that you can quickly change the name of the spambot trap directory if the spambots adapt and start avoiding it. At present, a static setup should work just fine, however.
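A minimal static version looks like this, with 'squirrel' standing in for whatever alias you choose:

```
User-agent: *
Disallow: /squirrel/
Disallow: /squirrel/guestbook/
```

Strictly speaking, the first Disallow already covers every subdirectory for a compliant robot; listing deeper paths as the sample does is just belt and braces.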

Next, we need to look at the bait - links within your HTML files which lead the spambot into the trap.

Your HTML Files

Download sample HTML code

Download sample transparent 1 pixel image for hiding the trap

Here's an example of HTML with links into the spambot trap:

<HTML>
<BODY BGCOLOR="beige">
<A HREF="/squirrel/guestbook/message/"></A> <A HREF="/squirrel/guestbook/post/"><IMG SRC="/guestbook.gif" WIDTH=1 HEIGHT=1 BORDER=0></A>
Body of the page here
<TABLE WIDTH=100%> <TR>
<TD ALIGN=RIGHT> <A HREF="/squirrel/guestbook/"> <SMALL><FONT COLOR="beige">guestbook</FONT></SMALL> </A></TD>
</TR>
</TABLE>
</BODY>
</HTML>

Spambots tend to be stupid. You'd think they would check for empty links (which don't show up in a real browser), but they don't seem to. Sure, they may get smarter, but meantime you might as well pick the low hanging fruit. So, the very first thing in the body of your HTML should be an empty link which goes straight into the trap proper - not the warning level, but the actual trap itself. This is because there is no way for someone using a real browser to click on this link, and good spiders will ignore it anyway because it's in the robots.txt file.

We also use a one-pixel transparent GIF (a favorite web bug technique) to anchor a link to the trap, just in case the spambot is smart enough to avoid empty links. If we put this as the very first thing in the body, then it'll be pretty hard for a real user to click on, since it's only one pixel in size. But a spambot will quite happily go there!

Finally, there is an example of a non-graphic, text based link. This will be placed on the right side of the screen by the table, and the text will appear in the same color as the background (in this example, beige). The link does not go straight into the trap, but into the warning level, because with this one there is a bigger chance that real people could click on it accidentally. The link may be invisible, but it's still there, and someone could find it. So, they get to see a nice warning, and they should back off from there. But the spambot won't. By the way, we have the link going to /squirrel/guestbook/ rather than just /squirrel/ because some of the spambots seem to specifically follow links with certain keywords, e.g. 'guestbook', 'message', 'post', etc.

You can sprinkle these links all around your HTML files. I put them in every single one, since I use Embperl templates which make that sort of thing very easy.

Embperl

Download sample dynamic robots.txt using Embperl

Download sample dynamic HTML code using Embperl

The point of this is to make it easier to change the spambot trap directory without having to edit a whole bunch of files. We pass an environment variable to Perl from httpd.conf (see below), which says what the trap directory is called. We then use this in Embperl to substitute into the HTML and robots.txt files at request time. Thus if we wanted to change the name of the trap from 'squirrel' to 'badger', then we only need to change httpd.conf, restart apache, and we're done. All the links in the HTML are dynamic, as is robots.txt (see the samples above).
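As a sketch of that idea, an Embperl template can splice the environment variable into each trap link (Embperl evaluates [+ ... +] blocks as Perl and outputs the result; the variable comes from the PerlSetEnv line shown in the httpd.conf section):

```html
<!-- Renaming the trap now only means changing SPAMBOT_TRAP_DIR in httpd.conf -->
<A HREF="/[+ $ENV{SPAMBOT_TRAP_DIR} +]/guestbook/"></A>
```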

Now, we bring it all together in the Apache configuration file.

httpd.conf

Download sample httpd.conf directives

Download sample startup.pl script (used in httpd.conf)

You need to have mod_perl installed before you can use BlockAgent.pm. You should take a look at the sample given above, and integrate these directives into your own virtual hosts. The most important lines are:

Alias /squirrel /www/spambot_trap
PerlSetEnv SPAMBOT_TRAP_DIR squirrel

You should set the 'squirrel' name to whatever you'd like for your website; you'll then access the trap using a URL something like http://www.yourdomain.com/squirrel/guestbook/message. This will spring the trap. You also need to set up the BlockAgent.pm access handler:


<Location />
PerlAccessHandler Apache::BlockAgent
PerlSetVar BlockAgentFile /www/conf/bad_agents.txt
</Location>

This ensures that all accesses to your website will go through BlockAgent.pm first. You should choose your own location for the bad_agents.txt file.

Finally, you might want to install Embperl so that you can embed Perl into your HTML code (always executed on the server side, never seen on the client side):

# Set EmbPerl handler for main directory

# Handle HTML files with Embperl

<Files *.html>
SetHandler perl-script
PerlHandler HTML::Embperl
Options ExecCGI
</Files>

# Handle robots.txt with Embperl

<Files robots.txt>
SetHandler perl-script
PerlHandler HTML::Embperl
Options ExecCGI
</Files>

That about does it. You should now have the setup which will allow you to block spambots. You'll probably be interested in monitoring what happens...

Monitoring

Download sample script for monitoring web server logs

This simple script just tails the badhosts_loop log. You'll have fun (I do) watching something come onto your site, promptly fall into the trap, and then SPLAT. No more spambot. Heh heh heh.

Conclusions

This setup works pretty well for me at the moment. I've no doubt there are flaws in my design, but it seems stable and is "good enough" for the time being. If you can see any improvements then I'd love to hear about them. To finish up, here's a summary of the strengths and potential weaknesses of the Spambot Trap system.

Strengths

  • Does not rely exclusively on the HTTP User-Agent header, but at the same time allows us to block agents which we know to be bad.
  • Does not rely on the spambot abusing the robots.txt file. Many spambots don't even load it. But the robots.txt file will protect "good" robots from falling into the spambot trap. So, for example, googlebot will be just fine.
  • The blocks happen based on behavior, rather than trusting anything the spambot tells us about itself (e.g. User-Agent). Thus we don't rely on any prior knowledge of the spambots in order to block them; an entirely new one that we've never seen before will still fall in the trap and be duly blocked.
  • Once a spambot is blocked, then it cannot connect to your server again at all for the duration of the block. If it tries to connect, it won't even get a 'connection refused' error, because the firewall rule just quietly drops all the packets from the bad hosts. The ipchains firewall is very effective, and more efficient at blocking hosts than anything you could put together with Apache. So, you save on server resources. If you're wondering whether the block lists might get large, I have found that with the constant expiring of one day blocks, the active block list has never been more than about 20 IP addresses at a time, out of a list (so far) of 100 distinct hosts.
  • The blocks initially expire after one day. This means that one-off offenders are quickly removed from the firewall rules. On the other hand, repeat offenders get progressively longer and longer blocks (doubled each time). This means that the more abusive a host is, the more it will be blocked. It also means that if a bot is coming in from multiple IP addresses (through a proxy), then each of the individual IP addresses will probably not go on to be blocked for too long. Thus you won't be blocking everyone in AOL. On the other hand, if you continue to get hit from the same network, then it's obviously a source of trouble and should be blocked. If it's a major network like AOL, which you really don't want to block, then you need to take the IP addresses and times of the abuse, and send it to the sysadmin at the ISP concerned. There's really not a lot else you can do. I haven't seen this in reality, though. In my experience, the spambots come in from all sorts of different IP addresses, and the ones that are very persistent over time are mostly static IPs from DSL and small ranges of IPs from cable modems. These are the people with the always-on, high bandwidth capabilities which are needed for large scale email harvesting.
  • The system uses a relational database to manage the blocks, and so it is very scalable, and potentially you could share the database between multiple servers. If any one server gets a spambot, the offending IP address can automatically also be blocked at all the other servers. Also, the fact that we don't delete expired blocks means that we can keep track of the history of the blocks, and perhaps perform analyses which would lead to more permanent ipchains blocks of entire subnets, if desired.

Weaknesses

  • It would be possible for the spambots to get wise, and start following the robots.txt file rules. Then the spambot could in theory surf your entire site (or at least the bits allowed by robots.txt) without falling into the trap. However this also means that you can control where the spambot goes, which is the whole point of robots.txt. If you want, you can allow google into one part of the site, but exclude all others. Still, you should remove all email addresses from your site as the fail-safe.
  • It's possible that a spambot could come in through a proxy such as AOL, which means you'll be blocking multiple AOL IP addresses. This is not very nice, and I'm not sure what the solution is at the moment. All I can say is that it hasn't happened yet, and the worst offenders on my site all have static IPs. They seem to come in from cable and DSL connections mostly.
  • I don't know how feasible this would be, but it may be possible to conduct a "denial of service" type attack on your webserver by making many requests to the spambot trap directory from different IP addresses. I think, however, that you actually need to have those IP addresses (rather than spoofing them) in order to set up a real TCP connection with the web server. I don't know how likely this is, but it comes more under the "attack" category than spambots. If someone tries this on your site, then it's definitely something that can be pursued with legal means. It's no longer just a petty annoyance, but rather a hostile action which must be chased down. Also, the motivation is totally different - the spammers don't want to do this kind of thing. They just want their email addresses. The DDOS attacks are notoriously difficult to track, but I think in the couple of years that have passed since the first ones brought down Amazon and Yahoo!, there has been some progress made. Anyhow, I just wanted to bring the idea into the light of day. If anyone has any clues about it then I'd be glad to know.

Possible Future Enhancements

  • Spot large numbers of blocks occurring on a particular subnet, and automatically consolidate blocks into a single one which blocks the entire subnet (e.g. 128.123.31.0/24).
  • More interactive tools to allow removal of blocks.
  • Analysis tools which can tell us something about patterns of abuse from particular networks.

If you can think of any more potential problems (or unrecognised strengths!) then I'd be happy to hear about it. I'd also like to hear about any comments on this document.

Removing the Mailto: may not be the best plan.. (5, Interesting)

liquidsin (398151) | more than 12 years ago | (#3329065)

I've found that a lot of people just won't send email if there's not a link to facilitate it. I've become rather fond of using javascript to write the address to the page. Spambots read the source so they don't piece the address together but *most* browsers will still do it right. Just use something like:

<script>document.write("<A CLASS=\"link\" HREF=\"mailto:" + "myname" + String.fromCharCode(64) + "mydomain" + "\">email me</A>");</script>

Seems to work fine. Anyone know of any reason it shouldn't, or have any other way to keep down spam without totally removing the Mailto: ? I know this won't work with *every* browser, but it beats totally removing mail links. And I don't think spammers can get it without having a human actually look at the page...

Re:Removing the Mailto: may not be the best plan.. (1)

SuperCal (549671) | more than 12 years ago | (#3329156)

Awesome... great idea. It's so simple that I'm kicking myself for not thinking of it myself. It will keep working until everyone does it and spambot writers figure it out. Ironically, I'm changing all my websites' mailto tags to the JavaScript format... sorry

Re:Removing the Mailto: may not be the best plan.. (2)

bero-rh (98815) | more than 12 years ago | (#3329190)

This also makes it invisible to anyone who has disabled JavaScript, and to anyone using a browser that doesn't support JavaScript (lynx, links, etc.)

mirror (0, Redundant)

loraksus (171574) | more than 12 years ago | (#3329067)

looks like /. ate his website, not spambots :)

The Problem: Spambots Ate My Website
Spambot: (noun) - A software program that browses websites looking for email addresses, which it then "harvests" and collects into large lists. These lists are then either used directly for marketing purposes, or else sold, often in the form of CD-ROMs packed with millions of addresses. To add insult to injury, you may receive a spam email which is asking you to buy one of these lists yourself. Spambots (and spam) are a pestilence which needs to be stamped out wherever it is found.

I have a website, http://www.crazyguyonabike.com, which has bicycle tour journals, message boards and guestbooks. I started noticing around the end of 2001 that the site was getting hit a lot by spambots. You can spot this sort of activity by looking for very rapid surfing, strange request patterns, and non-browser User-Agents.

After looking at the server logs, I realized a couple of things: Firstly, the spambots came from many different IP addresses, so this precluded the simple option of adding the source IP to my firewall blocks list. Secondly, there seemed to be a common behavior between the bots - even if this was the first visit from a particular IP address (or even a particular network, so no chance of just being a different proxy) they would come straight into the middle of my website, at a specific page rather than the root. This means that the spambots obviously had some kind of database of pages, which had presumably been built up from previous visits, before I'd noticed the activity, and this database was being shared between a large number of different hosts, each of which was apparently running the same software.

Another distinctive behavior was that the spambots would follow only those links which had certain keywords which would seem promising if you're looking for email addresses: "guestbook", "journal", "message", "post" and so on. On each of the pages in my site there were many other links in the navbars, but only links with these keywords were being followed. Also, robots.txt was never even being read, let alone followed. Moreover, the bot would come in, scan pages rapidly for maybe a few seconds, and then stop for a while. So it was obviously making at least some attempt to circumvent blocks based on frequency/quantity of requests.

This was very annoying. For one thing, these things were picking off email addresses from my website (at that point, I was letting people who posted on my message boards decide for themselves whether they wanted their email addresses to be visible or not). But quite apart from that, it was taking up resources, and was just plain rude. I hate spam. I resent my webserver having to play host to people whose obvious goal is to cynically exploit the co-operative protocols of the internet to their own selfish, antisocial gain. So, I decided to do something about it.

The first thing I did was to look at the User-Agent fields which were being used by the bots. There were a variety, including variations on the following:

DSurf15a 01
PSurf15a VA
SSurf15a 11
DBrowse 1.4b
PBrowse 1.4b
UJTBYFWGYA (and other strings of random capital letters)
I searched the internet for references to these strings, but all I found was a slew of website statistics analysis logs. This meant that these particular spambots obviously got around. It was also discouraging, because there was no mention anywhere of what these things actually were. I was surprised that there seemed to be no discussion whatsoever of something that seemed to be pandemic. Then I found a couple of other websites with guestbooks that had actually been defiled by these spambots: (if you follow these links and you don't see a lot of empty messages left by the above user agents, then that means the webmaster of the site has finally found a way to stop it, so good for them...)

http://www.virtualglasgow.com/guestbook.html
http://www.donotenter.com/guestbook/gbook.html
I reckon the spambots didn't really intend to leave empty messages. They just tend to want to follow links with the keyword 'post'. So if the guestbook posting form has no preview or confirmation page, then the spambot would leave a message simply by following this link! My guestbooks and message boards have a preview page, which is probably why I hadn't had any of this.

Anyway, I started thinking about what kind of program this thing was. First of all, it comes from all kinds of different IP addresses. I couldn't quite believe that this many different IP addresses were all intentionally using the same software, of which I could find absolutely no mention anywhere on the Web. This made me think it might be some kind of virus/trojan/worm that silently installed itself on people's computers, and then used the CPU and bandwidth to surf the Web without the owner being aware of it.

I thought that if this was the case, then it must be sending the results somewhere - and if we could find out where, then we could go about shutting the operation down. But I have had no luck at all in getting any help from the sysadmins at ISPs I have contacted. A typical exchange was the one with a guy at Cox internet, which was where a persistent offending IP address was sourced. He just couldn't be bothered, and eventually told me that spidering was not against the law, or their terms of service. I asked whether actions which were blatantly geared toward the generation of spam were against their terms of use, but he never replied to that.

I had no more luck anywhere else: nobody had heard of this thing. I even sent an email to CERT, but got no response. So, I turned instead to thinking about how I could erase these pests from my life as much as possible. This document is about my quest to stop spambots (not just this one, but ALL spambots) from abusing my website. Hopefully it will be useful to you.

Overview of the Spambot Trap
There are three main parts to the technique which I outline here:

Banish visible email addresses from your websites altogether, or else obfuscate them so they can't be harvested. Examples of how to do this are given. This is your fail-safe, in case the spambots figure out a way around your other defences. Even if they manage to cruise your website on their very best behavior, they still should not be able to harvest email addresses!

Block known spambots: Certain User-Agents are just known to be bad, so there's no reason to let them come on your site at all. True, spambots could in theory spoof the User-Agent, but the simple reality is that a lot of them don't. We use an enhanced version of the BlockAgent.pm module from the O'Reilly mod_perl book. This extension adds offending IP addresses to a MySQL (or other relational) database, which is picked up by the third part of our cunning system...

Set a Spambot Trap, which blocks hosts based on behavior. We set a trap for spambots, which normal users with browsers and well-behaved spiders should not fall into. If the bot falls in the trap, then its IP address is quickly blocked from all further connections to the webserver.
This works using a persistent, looping Perl script called badhosts_loop, which checks every few seconds for additions to a 'badhosts' database. This script then adds 'DENY' rules for each bad host to the ipchains firewall. Blocks have an expiry, which is initially set to one day. If a host falls in the trap again after the block expires, then that IP is blocked again - and the expiration time is doubled to 2 days. And so on. This algorithm ensures that the worst offenders get progressively more blocked, while one-time offenders don't stick around in our firewall rules eating up resources.

There are various components to the Spambot Trap, including the badhosts_loop Perl script, the BlockAgent.pm module, ipchains config, MySQL database, httpd.conf, robots.txt, and your HTML files. These are all covered in the sections below.

Banishing 'mailto:'
The first and most urgent thing you need to do is to get email addresses off your website altogether. This means, unfortunately, banishing the venerable mailto: link. It's a real shame that perfectly good mechanisms should be removed because of abuse, but that's just the way the world is these days. You need to be defensive, and assume that the spammers will try to take advantage of your resources as much as possible.
It's an arms race
The important thing that you need to realize is that no matter what blocks we put in place, this game is an arms race. Eventually the spambot writers will develop smarter bots which circumvent our techniques. Therefore you want to have a failsafe, which will prevent email addresses from getting into the hands of the spambot even if all else fails. The only real way to do that is to completely remove all email addresses from your website.
Contact forms
You should replace the mailto: links with links to a special form where people can type their name, email address and message. A CGI can then deliver the email, and your email address never has to be disclosed. There are a number of different mailer scripts out there - just be careful to check for vulnerabilities which could allow malicious users to use the form to send email to third parties (i.e. spam, ironically enough) using your server. The formmail script is popular, but an earlier version had such a vulnerability (since fixed). The Embperl package has a simple MailFormTo command to send an email from a form.
Since I have seen guestbooks out there which have been extensively defiled by spambots, I would add that you should have a preview screen on your contact forms. This will ensure that an email doesn't get fired off simply by a spambot following the 'post' or 'contact' link (which it will likely try to do).

Alternatives to totally banishing mailto:
There are alternatives to completely removing email addresses, but they all depend on the stupidity of the spambot, and so could be compromised by a new generation of pest. These include:

Write out email addresses in a non-email format, e.g. instead of writing 'username@domain.com' you would write 'username at domain dot com', or something similar. It would only take some spambot with a little more intelligence to be able to scan these patterns and pick up "likely" addresses, so this strategy is a little risky. Any consistent method you choose to write out email addresses could in theory be analyzed and decoded by a savvy bot.
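As a throwaway sketch of this approach (illustrative Python, not part of the original setup; any consistent scheme like this can be reversed by a smarter bot):

```python
# Rewrite an address in a human-readable but non-harvestable form.
# A bot that learns the scheme can trivially undo it, which is exactly
# the risk discussed above.
def obfuscate(address):
    user, _, domain = address.partition("@")
    return "%s at %s" % (user, domain.replace(".", " dot "))

print(obfuscate("username@domain.com"))  # username at domain dot com
```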

Add stuff to the email address to make it invalid, but so that a human could easily know what to do to make it work. An example of this is writing 'username@_NO_SPAM_domain.com'. You need to remove the "_NO_SPAM_" part to make the email address valid. You can have some kind of explanation to make it clear what people have to do to use the address. Personally, I don't like this - you're depending on a level of sophistication on the part of your users which is risky. In my experience, there are a lot of very 'novice' level users out there, who only know how to click on a link. They don't know how to edit an email address. Heck, I've had people come to my site by typing the URL into Google, rather than the 'Location' box of their browser. Also, people don't read instructions.

Make graphics images which contain the email address. Spambots usually don't download graphics, and even if they did, they probably couldn't decode the bits to get the text. However, they could do it in theory, since software for doing OCR (optical character recognition, getting text from scanned documents) has been around for a while. A downside to this approach is that the user has to manually copy down the email address, since it can't be cut'n'pasted. Also, you can't put a mailto: link on the image, otherwise you're back to square one. But you could put a link to a contact form, with an argument in the link telling your server internally what email address to use. For example, the link could say "contact.cgi?to=23", where '23' is some database key to the actual email address. But the downside here is that you still need to generate the image, which is a bit of a pain in the ass if you have a lot of them. You can do it automatically, if you're willing to put the work in and write the scripts. There are some very nice graphics generation packages out there on CPAN for Perl. Here's an example of an email address presented as an image:

MySQL
Download badhosts MySQL database dump
We need to set up a MySQL database, where we store records of the hosts which are to be blocked. This doesn't have to be MySQL, but I use it because it's extremely fast, and very appropriate for this kind of application. You need to create a new database, called 'badhosts'. You then create a table, again called 'badhosts', with the following structure:

Field        Type                            Comment
ip_address   varchar(20), not null, indexed  The IP address of the host to be blocked
user_agent   varchar(255), not null          The HTTP User-Agent of the spambot, for reference
expire_days  int unsigned, not null          How many days this block lasts; doubled each time a new block is created for the same IP address
created      datetime, not null              When this block was created
expiry       datetime, not null, indexed     When this block expires

You could use the dump provided above to load directly into your database:

shell> mysqladmin create badhosts
shell> mysql badhosts < badhosts.dump

That's about it! The fields which are marked as 'indexed' are the only ones which need indexes, because they are searched on to see if a particular IP address has been previously blocked, and also to see which blocks should be removed because they've expired. If you have access privileges set on your MySQL databases, then you need to allow the Apache user (usually 'nobody') access. The other script that will require access is badhosts_loop, which runs as root.
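To make the schema and the two indexed lookups concrete, here is an illustrative sketch using Python's sqlite3 as a stand-in for MySQL (the column names are as above; the sample values come from the log excerpt later in this article):

```python
import sqlite3
from datetime import datetime, timedelta

# Stand-in for the MySQL 'badhosts' table, with the two indexes discussed above.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE badhosts (
    ip_address  VARCHAR(20)  NOT NULL,
    user_agent  VARCHAR(255) NOT NULL,
    expire_days INTEGER      NOT NULL,
    created     DATETIME     NOT NULL,
    expiry      DATETIME     NOT NULL)""")
db.execute("CREATE INDEX idx_ip ON badhosts (ip_address)")
db.execute("CREATE INDEX idx_expiry ON badhosts (expiry)")

now = datetime(2002, 4, 11, 16, 9, 7)
db.execute("INSERT INTO badhosts VALUES (?, ?, ?, ?, ?)",
           ("68.5.99.89", "DSurf15a 01", 8,
            now.isoformat(" "), (now + timedelta(days=8)).isoformat(" ")))

# Lookup 1: has this IP been blocked before (so expire_days can be doubled)?
prior = db.execute("SELECT MAX(expire_days) FROM badhosts WHERE ip_address = ?",
                   ("68.5.99.89",)).fetchone()[0]
# Lookup 2: which blocks are still active right now?
active = db.execute("SELECT ip_address FROM badhosts WHERE expiry > ?",
                    (now.isoformat(" "),)).fetchall()
print(prior, [ip for (ip,) in active])
```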
Next, we look at the script that populates this database.

BlockAgent.pm
Download BlockAgent.pm
Download bad_agents.txt
The BlockAgent.pm Apache/mod_perl module is taken from the excellent book "Writing Apache Modules with Perl and C" by Lincoln Stein & Doug MacEachern (O'Reilly). This script basically acts as an Apache access control handler which checks the HTTP User-Agent header against a list of known bad agents. If there's a match, then a 403 'Forbidden' code is returned. The script compiles and caches a list of subroutines for doing the matches, and automatically detects when the 'bad_agents.txt' file has changed. I have found that it has no noticeable impact on the performance of the webserver. This script is useful in the case where you know for certain that a certain User-Agent is bad; there's no point in letting it go anywhere on your site, so it's a good first line of defense. We'll cover how to add this module to your website a little later, along with the rest of the configuration settings in the section on httpd.conf.
Of course, one of the first arguments you'll see with regard to this method of blocking spambots is that it's easy to circumvent, by simply passing in a User-Agent string which is identical to the major browsers out there. This is perfectly true, but don't ask me why the spambot writers haven't done this - maybe it's a question of pride or ego, they want to see their baby out there on record in Web server logs. I honestly don't know. The main point is that at present, the User-Agent header CAN be used very effectively to block most bad agents. But, I have added more features so that we can also block agents which look ok, but behave badly by going somewhere they shouldn't - the Spambot Trap. More on that soon.

You'll notice that the bad_agents.txt file which I have supplied here is very comprehensive. A good strategy here is probably to save the full version somewhere (perhaps as bad_agents.txt.all), and just keep the ones you actually encounter in the bad_agents.txt file. Then you keep the list shorter, and more relevant to what actually hits you. For example, my bad_agents.txt file currently has the following lines in it, because these are the spambots that I see most frequently:

^[A-Z]+$
^.Browse\s
^.Eval
^EO Browse
^.Surf
^Microsoft.URL
^Mozilla\/3.0.+Indy Library
^Zeus.*Webster

You'll notice from this that BlockAgent.pm is very flexible, being able to take full advantage of the excellent regular expression capabilities of Perl. This means you can capture a lot of different agents with just one line. For example, the very first line catches all the variations of the agent which passes in random strings of capital letters, e.g. FHASFJDDJKHG or UYTWHJVJ. The spambot obviously thinks it's being pretty smart by looking different each time, but by using an easily identifiable pattern, it shoots itself in the foot. Hah.
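These patterns are Perl regular expressions, but for the ones shown here Python's re module behaves identically, so the matching logic can be sketched like this (illustrative only, not the actual BlockAgent.pm code):

```python
import re

# A few of the bad_agents.txt patterns quoted above.
patterns = [r"^[A-Z]+$", r"^.Browse\s", r"^.Surf", r"^Mozilla\/3.0.+Indy Library"]

def is_bad_agent(user_agent):
    # re.search with a leading ^ matches the same way Perl's =~ does here.
    return any(re.search(p, user_agent) for p in patterns)

for agent in ("UJTBYFWGYA", "DSurf15a 01", "DBrowse 1.4b",
              "Mozilla/4.0 (compatible; MSIE 5.5; Windows 98)"):
    print(agent, is_bad_agent(agent))
```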
The original version of the BlockAgent.pm script is well explained in the O'Reilly book, but I've added an extra hook that checks to see whether the client is accessing any of the spambot trap directories. If it is, then we add an entry to the MySQL database (you could use another relational database if you want, as long as it's accessible from Perl DBI).

The first time an IP address is blocked, an expiry of one day is set. If the same host subsequently comes in and falls into the trap again, then the expiry time is doubled. And so on. This way, the block gets longer and longer, in proportion to how persistently the spambot revisits our website. Once the IP address is blocked, the spambot can't even connect to our web server, since we use 'Deny' in the ipchains rule. This means that no acknowledgement is given to any packets coming in from the badhost, and as far as they know, our server has just gone away. Hopefully, after this happens for long enough, our server will be taken off the spambot's "visit" list. Another nice little side-effect of this is that the spambot will probably have to wait for a while before giving up each connection attempt. Anything that makes them waste more time is ok by me!
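The doubling schedule is simple enough to sketch in a few lines (illustrative Python, not the actual Perl):

```python
# First offense gets a one-day block; each repeat offense doubles the
# previous duration, so persistent bots are blocked for longer and longer.
def next_block_days(previous_days):
    return 1 if previous_days is None else previous_days * 2

days, history = None, []
for offense in range(4):
    days = next_block_days(days)
    history.append(days)
print(history)  # [1, 2, 4, 8]
```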

BlockAgent.pm notifies the badhosts_loop script that something has happened by touching a file called /tmp/badhosts.new. The badhosts_loop script checks this file every few seconds and if it has changed then it knows that a new record's been added to the database, and it needs to re-generate the blocks list.

The BlockAgent.pm script is our alarm system. It's what tells us that something happened. In order to act on this information, we need to be able to add rules to the ipchains firewall. We'll cover this next.

ipchains
Download sample ipchains config file
The ipchains module (here's the HOWTO doc) is a very nice way of providing a good level of basic network security to your server. If you haven't already set it up (or its successor, iptables), then you really should. It's a very easy way to configure who can and cannot have access to your machine. A good resource for learning about this is "Building Linux and OpenBSD Firewalls", by Wes Sonnenreich and Tom Yates (Wiley). This is where I learned about ipchains, and it's on their excellent explanations and examples that I based my own config file. Another is "Linux Firewalls" by Ziegler (New Riders), which has a more recent 2nd edition that covers iptables too.
The example ipchains config file given here is complete, but the bit which is most important to us is that we create a chain called 'blocks'. This is our own custom chain, which we can then add rules to. The badhosts_loop script will flush this chain and build it back up whenever a spambot falls in your trap. Once the spambot's IP address is on the blocks list, that host cannot connect to your server at all.

Remember to restart ipchains after you've changed the config file. Next, we'll look at the script that actually adds the firewall rules.

badhosts_loop
Download badhosts_loop script
You run this script in the background, as root. It has to be run as root, because only root has the ability to add rules to the firewall. The script spends most of its time sleeping. It wakes up every five seconds or so and does a quick check on /tmp/badhosts.new. If this file has been changed since the last time it looked, then it goes and re-generates the firewall blocks list with all the current (non-expired) blocks. If nothing else happens, then the script will automatically do this at least once a day, to ensure that blocks really do expire even if there is no new activity.
You should probably add the following line to your /etc/rc.local file (or equivalent), so that the script is automatically started up on reboot:

/path/to/badhosts_loop --loop &

This will start the script looping in the background. The script automatically checks to see if it is already running, by attempting to lock /var/lock/badhosts_loop.lock. If the file is already locked then the script will exit with an error message. If you want to just run the script once, without looping, then just omit the '--loop' option. This can be useful for testing.
Logging is done to /var/log/badhosts_loop.log by default. Every time the script generates the blocks list, it writes a list of all the blocks to the log. This is a good place to monitor if you're interested in what hosts are being blocked. Here's an example of the log output:

Thu Apr 11 16:09:07 2002:
Flushing blocks chain:
Generating blocks list:
Adding 68.5.99.89 (8) 2002-04-04 14:08:11 to 2002-04-12 14:08:11 DSurf15a 01
Adding 24.234.28.85 (8) 2002-04-07 10:43:42 to 2002-04-15 10:43:42 DBrowse 1.4b

The log shows the IP address which is being added, then (in brackets) the number of days the block is effective for (doubling each time), then the start and end dates of this block, and finally the name of the User-Agent which committed the crime. This can be useful for quickly seeing whether you need to add a new one to the bad_agents.txt file.
This is a pretty stable script that should just sit there and chug quietly, not taking up much in the way of resources. Checking for a file being changed every five seconds is not a big deal in Unix, so you shouldn't even notice it.
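The polling cycle described above might be sketched like this (an illustrative Python version; the real badhosts_loop is Perl, and the touch-file path is the one used by BlockAgent.pm):

```python
import os, time

TOUCH_FILE = "/tmp/badhosts.new"   # touched by BlockAgent.pm on each new block
DAY = 24 * 60 * 60

def changed(path, last_mtime):
    # Has the touch-file's mtime moved since we last looked?
    try:
        return os.stat(path).st_mtime != last_mtime
    except OSError:            # file not created yet
        return False

def poll_loop(regenerate, sleep_seconds=5, path=TOUCH_FILE):
    # Regenerate the firewall blocks list whenever the touch-file changes,
    # or at least once a day so that expired blocks really do get dropped.
    last_mtime, last_regen = None, 0
    while True:
        now = time.time()
        if changed(path, last_mtime) or now - last_regen >= DAY:
            regenerate()
            last_regen = now
            try:
                last_mtime = os.stat(path).st_mtime
            except OSError:
                pass
        time.sleep(sleep_seconds)
```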

Now you have to create the trap itself - the spambot_trap directory.

spambot_trap/ Directory
Download gzipped tarball of sample spambot_trap directory
View the sample directory
You can create this directory anywhere on your server. We will create an alias in httpd.conf to access it. I put mine in /www/spambot_trap/. The point is, this doesn't have to be a real directory under your webserver directory root. If you use the <Alias> directive, then multiple websites can access the same spambot_trap directory, potentially through different aliases. You can use the sample tarball as a starting point; it has subdirectories and links which the spambots I have seen find irresistible. You should create your own image file for the unblock_email.gif file, to have a valid email address of your own.
The spambot_trap and spambot_trap/guestbook/ directories are not used directly to spring the trap. This is because I wanted to have a warning level, a lead-in, where real users would be able to realize they are getting into dangerous waters and could then back out. You're going to be placing hard-to-click links on your web pages which lead into the real trap, and there's always a chance that a real user will accidentally click on one of these. So, some of the links will point into the warning level. I have made a GIF image which contains warning text. Why an image? Mainly because spambots can't understand images, and I didn't want to give big clues like "WARNING!!! DO NOT ENTER" in plain text. So, the user sees the warning, the spambots don't. If the spambot proceeds into any of the subdirectories (email, contact, post, message), then the trap is sprung and the host is blocked.

You also need to try to stop good spiders (e.g. google) from falling into the spambot trap and being blocked. To do this, we utilize the robots.txt file.

robots.txt
Download sample robots.txt
This should allow good robots (such as google) to surf your site without falling into the spambot trap. Most bad spambots don't even check the robots.txt file, so this is mainly for protection of the good bots.
You'll see that we list a bunch of directories under '/squirrel'. This could be anything; you'll set an alias later in httpd.conf. In fact, you may even want this to be dynamically generated (see later, under Embperl), so that you can quickly change the name of the spambot trap directory if the spambots adapt and start avoiding it. At present, a static setup should work just fine, however.
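If the sample download is unavailable, a robots.txt along the lines described would look something like this (the directory names are illustrative; '/squirrel' must match the alias you set in httpd.conf, and the subdirectory names follow the trap layout described above):

```
# Keep well-behaved robots out of the trap.
# Bad spambots rarely read this file at all.
User-agent: *
Disallow: /squirrel/
Disallow: /squirrel/guestbook/
Disallow: /squirrel/guestbook/post/
Disallow: /squirrel/guestbook/message/
```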

Next, we need to look at the bait - links within your HTML files which lead the spambot into the trap.

Your HTML Files
Download sample HTML code
Download sample transparent 1 pixel image for hiding the trap
Here's an example of HTML with links into the spambot trap:
<HTML>

<BODY BGCOLOR="beige">
<A HREF="/squirrel/guestbook/message/"></A>
<A HREF="/squirrel/guestbook/post/"><IMG SRC="/guestbook.gif" WIDTH=1 HEIGHT=1 BORDER=0></A>

Body of the page here

<TABLE WIDTH=100%>
<TR>
<TD ALIGN=RIGHT>
<A HREF="/squirrel/guestbook/">
<SMALL><FONT COLOR="beige">guestbook</FONT></SMALL>
</A>
</TD>
</TR>
</TABLE>

</BODY>

</HTML>

Spambots tend to be stupid. You'd think they would check for empty links (which don't show up in a real browser), but they don't seem to. Sure, they may get smarter, but in the meantime you might as well pick the low-hanging fruit. So, the very first thing in the body of your HTML should be an empty link which goes straight into the trap proper - not the warning level, but the actual trap itself. This is because there is no way for someone using a real browser to click on this link, and good spiders will ignore it anyway because it's disallowed in the robots.txt file.
We also use a one-pixel transparent GIF (a favorite web-bug technique) to anchor a link to the trap, just in case the spambot is smart enough to avoid empty links. If we put this as the very first thing in the body, then it'll be pretty hard for a real user to click on, since it's only one pixel in size. But a spambot will quite happily go there!

Finally, there is an example of a non-graphic, text based link. This will be placed on the right side of the screen by the table, and the text will appear in the same color as the background (in this example, beige). The link does not go straight into the trap, but into the warning level, because with this one there is a bigger chance that real people could click on it accidentally. The link may be invisible, but it's still there, and someone could find it. So, they get to see a nice warning, and they should back off from there. But the spambot won't. By the way, we have the link going to /squirrel/guestbook/ rather than just /squirrel/ because some of the spambots seem to specifically follow links with certain keywords, e.g. 'guestbook', 'message', 'post', etc.

You can sprinkle these links all around your HTML files. I put them in every single one, since I use Embperl templates which make that sort of thing very easy.

Embperl
Download sample dynamic robots.txt using Embperl
Download sample dynamic HTML code using Embperl
The point of this is to make it easier to change the spambot trap directory without having to edit a whole bunch of files. We pass an environment variable to Perl from httpd.conf (see below), which says what the trap directory is called. We then use this in Embperl to substitute into the HTML and robots.txt files at request time. Thus if we wanted to change the name of the trap from 'squirrel' to 'badger', then we only need to change httpd.conf, restart apache, and we're done. All the links in the HTML are dynamic, as is robots.txt (see the samples above).
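As a sketch of what a dynamic link looks like (assuming Embperl's [+ ... +] expression syntax, and that the PerlSetEnv variable is visible in %ENV at request time):

```html
<!-- The trap directory name is read from the environment set in
     httpd.conf, so renaming the trap means changing one PerlSetEnv
     line and restarting Apache. -->
<A HREF="/[+ $ENV{SPAMBOT_TRAP_DIR} +]/guestbook/message/"></A>
```

The same substitution works inside robots.txt once it is also handled by Embperl (see the httpd.conf section below).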
Now, we bring it all together in the Apache configuration file.

httpd.conf
Download sample httpd.conf directives
Download sample startup.pl script (used in httpd.conf)
You need to have mod_perl installed before you can use BlockAgent.pm. You should take a look at the sample given above, and integrate these directives into your own virtual hosts. The most important lines are:
Alias /squirrel /www/spambot_trap
PerlSetEnv SPAMBOT_TRAP_DIR squirrel

You should set the 'squirrel' name to whatever you'd like for your website; you'll then access the trap using a URL something like http://www.yourdomain.com/squirrel/guestbook/message. This will spring the trap. You also need to set up the BlockAgent.pm access handler:
<Location />
PerlAccessHandler Apache::BlockAgent
PerlSetVar BlockAgentFile /www/conf/bad_agents.txt
</Location>

This ensures that all accesses to your website will go through BlockAgent.pm first. You should choose your own location for the bad_agents.txt file.
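The article doesn't show the bad_agents.txt format; BlockAgent.pm (from the Writing Apache Modules with Perl and C examples) matches each line of the file against the incoming User-Agent header as a pattern. Treat the following as illustrative only; the exact pattern syntax depends on your version of BlockAgent.pm, and the agent names are taken from the log excerpt earlier:

```
# One bad User-Agent pattern per line (illustrative)
^DSurf
^DBrowse
```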
Finally, you might want to install Embperl so that you can embed Perl into your HTML code (always executed on the server side, never seen on the client side):

# Set EmbPerl handler for main directory
<Directory "/www/vhosts/www.yourdomain.com/htdocs/">

# Handle HTML files with Embperl
<FilesMatch ".*\.html$">
SetHandler perl-script
PerlHandler HTML::Embperl
Options ExecCGI
</FilesMatch>

# Handle robots.txt with Embperl
<FilesMatch "^robots\.txt$">
SetHandler perl-script
PerlHandler HTML::Embperl
Options ExecCGI
</FilesMatch>

</Directory>

That about does it. You should now have the setup which will allow you to block spambots. You'll probably be interested in monitoring what happens...
Monitoring
Download sample script for monitoring web server logs
This simple script just tails the badhosts_loop log. You'll have fun (I do) seeing what comes onto your site and promptly falls into the trap, and then SPLAT. No more spambot. Heh heh heh.
Conclusions
This setup works pretty well for me at the moment. I've no doubt there are flaws in my design, but it seems stable and is "good enough" for the time being. If you can see any improvements then I'd love to hear about them. To finish up, here's a summary of the strengths and potential weaknesses of the Spambot Trap system.
Strengths
Does not rely exclusively on the HTTP User-Agent header, but at the same time allows us to block agents which we know to be bad.

Does not rely on the spambot abusing the robots.txt file. Many spambots don't even load it. But the robots.txt file will protect "good" robots from falling into the spambot trap. So, for example, googlebot will be just fine.

The blocks happen based on behavior, rather than trusting anything the spambot tells us about itself (e.g. User-Agent). Thus we don't rely on any prior knowledge of the spambots in order to block them; an entirely new one that we've never seen before will still fall in the trap and be duly blocked.

Once a spambot is blocked, then it cannot connect to your server again at all for the duration of the block. If it tries to connect, it won't even get a 'connection refused' error, because the firewall rule just quietly drops all the packets from the bad hosts. The ipchains firewall is very effective, and more efficient at blocking hosts than anything you could put together with Apache. So, you save on server resources. If you're wondering whether the block lists might get large, I have found that with the constant expiring of one day blocks, the active block list has never been more than about 20 IP addresses at a time, out of a list (so far) of 100 distinct hosts.

The blocks initially expire after one day. This means that one-off offenders are quickly removed from the firewall rules. On the other hand, repeat offenders get progressively longer and longer blocks (doubled each time). This means that the more abusive a host is, the more it will be blocked. It also means that if a bot is coming in from multiple IP addresses (through a proxy), then each of the individual IP addresses will probably not go on to be blocked for too long. Thus you won't be blocking everyone in AOL. On the other hand, if you continue to get hit from the same network, then it's obviously a source of trouble and should be blocked. If it's a major network like AOL, which you really don't want to block, then you need to take the IP addresses and times of the abuse, and send it to the sysadmin at the ISP concerned. There's really not a lot else you can do. I haven't seen this in reality, though. In my experience, the spambots come in from all sorts of different IP addresses, and the ones that are very persistent over time are mostly static IPs from DSL and small ranges of IPs from cable modems. These are the people with the always-on, high bandwidth capabilities which are needed for large scale email harvesting.
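The doubling schedule is easy to state precisely. A Python sketch (the article's implementation is Perl plus SQL; this just illustrates the arithmetic):

```python
from datetime import datetime, timedelta

def block_duration_days(prior_blocks):
    """First offense: 1 day; each repeat doubles it (1, 2, 4, 8, ...)."""
    return 2 ** prior_blocks

def block_window(prior_blocks, start=None):
    """Return (start, end) datetimes for the next block of this host."""
    start = start or datetime.now()
    return start, start + timedelta(days=block_duration_days(prior_blocks))
```

A host on its fourth offense gets 2**3 = 8 days, which matches the "(8)" entries in the log excerpt earlier.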

The system uses a relational database to manage the blocks, so it is very scalable, and potentially you could share the database between multiple servers. If any one server gets a spambot, the offending IP address can automatically be blocked at all the other servers as well. Also, the fact that we don't delete expired blocks means that we can keep track of the history of the blocks, and perhaps perform analyses which would lead to more permanent ipchains blocks of entire subnets, if desired.
Weaknesses
It would be possible for the spambots to get wise, and start following the robots.txt file rules. Then the spambot could in theory surf your entire site (or at least the bits allowed by robots.txt) without falling into the trap. However this also means that you can control where the spambot goes, which is the whole point of robots.txt. If you want, you can allow google into one part of the site, but exclude all others. Still, you should remove all email addresses from your site as the fail-safe.

It's possible that a spambot could come in through a proxy such as AOL, which means you'll be blocking multiple AOL IP addresses. This is not very nice, and I'm not sure what the solution is at the moment. All I can say is that it hasn't happened yet, and the worst offenders on my site all have static IPs. They seem to come in from cable and DSL connections mostly.

I don't know how feasible this would be, but it may be possible to conduct a "denial of service" type attack on your webserver by making many requests to the spambot trap directory from different IP addresses. I think, however, that the attacker would actually need to control those IP addresses (rather than spoofing them) in order to set up a real TCP connection with the web server. I don't know how likely this is, but it comes more under the "attack" category than spambots. If someone tries this on your site, then it's definitely something that can be pursued through legal means. It's no longer just a petty annoyance, but rather a hostile action which must be chased down. Also, the motivation is totally different - the spammers don't want to do this kind of thing. They just want their email addresses. DDoS attacks are notoriously difficult to track, but I think in the couple of years that have passed since the first ones brought down Amazon and Yahoo!, there has been some progress made. Anyhow, I just wanted to bring the idea into the light of day. If anyone has any clues about it then I'd be glad to know.

Possible Future Enhancements
Spot large numbers of blocks occurring on a particular subnet, and automatically consolidate blocks into a single one which blocks the entire subnet (e.g. 128.123.31.0/24).

More interactive tools to allow removal of blocks.

Analysis tools which can tell us something about patterns of abuse from particular networks.
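The first of these enhancements is straightforward to sketch. Here is an illustrative Python fragment (not part of the article's code) that finds the /24 subnets with enough blocked hosts to justify a single consolidated block; the threshold is an arbitrary choice:

```python
import ipaddress
from collections import Counter

def consolidate_blocks(blocked_ips, threshold=4):
    """Return the /24 networks containing at least `threshold` blocked hosts."""
    per_net = Counter(
        ipaddress.ip_network(ip + "/24", strict=False) for ip in blocked_ips
    )
    return sorted(str(net) for net, count in per_net.items() if count >= threshold)
```

Each returned network (e.g. 128.123.31.0/24) could then replace its individual host rules with one subnet-wide firewall rule.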

If you can think of any more potential problems (or unrecognised strengths!) then I'd be happy to hear about them. I'd also welcome any comments on this document.

Similar setup without SQL requirements (4, Interesting)

bero-rh (98815) | more than 12 years ago | (#3329072)

My setup (catches some of the more commonly used spambots) uses mod_rewrite to send spammers to a trap.
Setup details at http://www.bero.org/NoSpam/isp.php [bero.org]

Pollute their database (1, Redundant)

Steev (5372) | more than 12 years ago | (#3329082)

I think a better idea was one that I heard a while back. This guy set up a script to constantly create new pages with randomly created garbage email addresses and links to new random pages with new random garbage email addresses, ad infinitum. Sure, you'll get a few more hits from the spambot, but it'll keep crawling your script-based hierarchy and keep polluting its database with email addresses that don't exist!

Another way to stop spambots (3, Funny)

PanBanger (465405) | more than 12 years ago | (#3329083)

Have your page linked on slashdot! Page gets slashdotted, problem solved.

Removing email addresses (2)

Mr_Silver (213637) | more than 12 years ago | (#3329086)

I used a very nifty bit of javascript which masks your mailto address. Provided the person has javascript on (and let's face it, nearly everyone who doesn't read /. does) then it works well.

You can generate the code for your own email address here [pgregg.com] or, if you want some source code, then you can find an implementation of it here [uk.net] .

linux? (0)

Anonymous Coward | more than 12 years ago | (#3329108)

How does your solution require linux? And why in gods name would you want to run a webserver using linux and mysql... do you just want a slow webserver or what?

Re:linux? (0)

Anonymous Coward | more than 12 years ago | (#3329188)

Better then Molasses OS (aka Nintendows) running IIS. Besides, why waste the money on that trash when you can have something worlds better (and faster) for free (that doesn't crash often)? The only other real option is Slowlaris and Oracle. Better yet Linux and Oracle, and even better yet BSDi (or xBSD) and Oracle. I guess what it comes down to is that if your not Toyota, CNN or Slashdot, why the hell do you need anything faster? This coward must be one of those damn gear heads that put the computing equivalent of a porche in for the job of a skateboard. "Look at my Athlon 1.4g box!!! Its soo much faster then yours!! I also have XP and now I'm quicker then a ray of light!!!" Etard, more code != faster, ever. Fixed code == faster, and never forget it!

my spambot trap (4, Informative)

romco (61131) | more than 12 years ago | (#3329109)

The page is already slashdotted. Here is a little script that traps bots (and others) that use your robots.txt to find directories to look through. It requires an .htaccess file with mod_rewrite turned on.

robots.txt
#################

User-agent: *

Disallow: /dont_go_here
Disallow: /images
Disallow: /cgi-bin

dont_go_here/index.php
############

<?php
$now = date("h:ia m/d/Y");
$IP = getenv("REMOTE_ADDR");
$host = getenv("REMOTE_HOST");
$your_email_address = "you@whatever";

$ban_code =
"\n".
'# '."$host banned $now\n".
'RewriteCond %{REMOTE_ADDR} ^'."$IP\n".
'RewriteRule ^.*$ denied.html [L]'."\n\n";

$fp = fopen ("/path/to/.htaccess", "a");
fwrite($fp, $ban_code);
fclose ($fp);

mail("$your_email_address", "Spambot Whacked!", "$host banned $now\n");

Other options.. (4, Informative)

primetyme (22415) | more than 12 years ago | (#3329163)

A pretty good article, but being able to install modules into Apache may not be the best situation for everyone who wants to stop Spambots..

Shameless plug, but I've got an ongoing series in the Apache section of /. that deals with easy ways that administrators *and* regular users can keep Spambots off their sites:
Stopping Spambots with Apache [slashdot.org]
and
Stopping Spambots II - The Admin Strikes Back [slashdot.org]

Just some more options and choices to help people out!

If I were a spambot author... (0)

Anonymous Coward | more than 12 years ago | (#3329185)

It would be possible for the spambots to get wise, and start following the robots.txt file rules. Then the spambot could in theory surf your entire site (or at least the bits allowed by robots.txt) without falling into the trap. However this also means that you can control where the spambot goes, which is the whole point of robots.txt. If you want, you can allow google into one part of the site, but exclude all others.

I'd read robots.txt and just go where google was allowed to go...

burp (1, Interesting)

Anonymous Coward | more than 12 years ago | (#3329187)

We had an Evil Harvestor Robot irritator on our web site back in 1996. It worked rather well. It didn't hit legitimate spiders by using an appropriate robot directive. It also gave the harvester a whole heap of nonsense addresses to add to its database.

None of that Perl nonsense, either. All in pure C on a BSD host, with a damn good attention to potential overflows. That was also the site which had my own custom MTA (I only knew sendmail, so it seemed a wise decision), demanded full W3C compliance (we would test it on about 10 platforms), and got used as evidence in the DoJ case against Microsoft.

Sigh, those were the days. Now, all I see is rehashing of old ideas. So, I view this news as 6 years old -- perhaps even a record for Slashdot?

using images is bad for people with text browsers (2, Insightful)

hsenag (56002) | more than 12 years ago | (#3329236)

If you use images for email addresses, what are people using text browsers supposed to do? Even worse is using them on the "warning" pages - someone with a text browser would have no idea what the image said and therefore nothing to stop them falling into the trap and getting firewalled.

And of course if he uses ALT text for the images, then he has the same problem he was trying to avoid, of creating something the spambots can read.

SAUCE (0)

Anonymous Coward | more than 12 years ago | (#3329247)

SAUCE [debian.org]