Spider bait page 

Home | Audio | Buy | Contact | Downloads | FAQ | Links | TOC | Videos

This page is set up as bait for spiders that crawl through the whole web site harvesting e-mail addresses and other data.  A file called robots.txt tells these spiders which directories they must not visit, and the directory this page is stored in is listed in robots.txt as one of the directories to avoid.  If a spider indexes this page anyway, it is permanently banned from accessing this web site, usually within 24 hours.
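For illustration, a robots.txt entry of the kind described above looks like the following.  The directory name here is only a placeholder; the actual directory used on this site is not named on this page.

    User-agent: *
    Disallow: /spiderbait/

The page does not say how the ban itself is carried out, but the "usually within 24 hours" wording suggests a periodic job.  Below is a minimal, hypothetical Python sketch of such a job: it scans a web server access log for any client that requested the bait directory and appends a deny rule for that address.  The file paths, log format, and directory name are assumptions for the sake of the example, not details taken from this site.

    #!/usr/bin/env python3
    # Hypothetical sketch of a daily ban job (e.g. run from cron).
    # Paths and the bait directory name are assumptions, not this site's actual setup.
    import re

    BAIT_DIR = "/spiderbait/"                     # assumed name of the disallowed directory
    ACCESS_LOG = "/var/log/httpd/access_log"      # common Apache log location (assumed)
    DENY_FILE = "/var/www/.htaccess-deny"         # file holding "Deny from" rules (assumed)

    # A combined-log-format line starts with the client address, followed by the request line.
    ip_pattern = re.compile(r'^(\S+) .* "(?:GET|HEAD|POST) ([^" ]+)')

    banned = set()
    with open(ACCESS_LOG) as log:
        for line in log:
            match = ip_pattern.match(line)
            # Record any client that fetched anything under the bait directory.
            if match and match.group(2).startswith(BAIT_DIR):
                banned.add(match.group(1))

    # Append deny rules for the offenders found in this pass.
    if banned:
        with open(DENY_FILE, "a") as deny:
            for ip in sorted(banned):
                deny.write("Deny from " + ip + "\n")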
