Perhaps we could enlist the aid of Google in this full-text indexing. Currently, http://everything2.com/robots.txt turns all spiders away.
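As a sketch of the idea (assuming we only want Googlebot back in, and that everyone else should stay blocked), the new robots.txt might look like this; per robots.txt convention, an empty Disallow line means "allow everything":

```
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
```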
The administration used to allow Google to spider the site (until E2 started lagging like a bitch). If we were to allow some robots, detect the names of known bots in the HTTP request's User-Agent field, and serve them a special theme (similar to the printable version, except with soft links and a sleep() call) that displays no nodelets or ads, we could let some spiders roam the site without incurring excessive load.
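Hand-waving the actual ecore internals, the bot check and throttled spider theme could be sketched like this (ALLOWED_BOTS, render_page, and the placeholder nodelet/ad markup are all made up for illustration):

```python
import time

# Hypothetical allow-list of spider User-Agent substrings.
ALLOWED_BOTS = ("googlebot",)

def is_allowed_bot(user_agent: str) -> bool:
    """Return True if the User-Agent names a spider we permit."""
    ua = user_agent.lower()
    return any(bot in ua for bot in ALLOWED_BOTS)

def render_page(node_html: str, user_agent: str,
                throttle_seconds: float = 1.0) -> str:
    """Serve spiders a stripped-down theme: node content only,
    no nodelets or ads, plus a sleep() so the crawl trickles in
    slowly instead of hammering the database."""
    if is_allowed_bot(user_agent):
        time.sleep(throttle_seconds)  # rate-limit each spider request
        return node_html              # bare content, soft links intact
    # Placeholder stand-ins for the full human-facing theme.
    return "<nodelets/>" + node_html + "<ads/>"
```

A request from Googlebot would then get the bare node text after a short delay, while ordinary browsers get the full theme untouched.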
<aside>Would it ease the load on the user dbtable if we deleted fled noders with no writeups?</aside>
© 2001 Damian Yerrick. Verbatim copying and redistribution are permitted.