SPIDERS
Computer robot programs, sometimes referred to as "crawlers," "knowledge-bots," or "knowbots," that are used by search engines to roam the World Wide Web, visit sites and databases, and keep the search engine's database of web pages up to date. They obtain new pages, update known pages, and delete obsolete ones. Their findings are then integrated into the "home" database.
Most large search engines operate several robots continuously. Even so, the Web is so enormous that it can take six months for spiders to cover it, so every search engine's index carries a certain degree of staleness, including dead links ("link rot"). For more information, read about search engines.
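To make that loop concrete, here is a minimal sketch of a breadth-first crawler in Python using only the standard library. The seed URL, the page limit, and the dictionary standing in for the "home" database are illustrative assumptions, not how any particular search engine actually works; a real spider would also honor robots.txt, throttle requests, and re-fetch known pages to update or delete them.

import urllib.request
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from collections import deque

class LinkExtractor(HTMLParser):
    """Collects href targets from <a> tags on a fetched page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Breadth-first crawl: fetch pages, harvest links, record findings."""
    frontier = deque([seed_url])  # pages waiting to be visited
    database = {}                 # the "home" database: url -> page text
    while frontier and len(database) < max_pages:
        url = frontier.popleft()
        if url in database:
            continue              # already visited; skip duplicates
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except Exception:
            continue              # unreachable page; a real spider would
                                  # mark it obsolete and drop it from the index
        database[url] = html      # integrate the new page into the database
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)  # resolve relative links
            if urlparse(absolute).scheme in ("http", "https"):
                frontier.append(absolute)  # queue newly discovered pages
    return database

if __name__ == "__main__":
    pages = crawl("https://example.com")
    print(f"Indexed {len(pages)} pages")

Running several of these loops in parallel, each working a different slice of the frontier, is roughly what "operating several robots at once" means, and it is why coverage still lags: the frontier grows far faster than any fleet of spiders can drain it.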
If you watch "Stargate," think replicators. That's about as good a physical analog as I can think of. They're mindless, brainless things that scuttle about the Web. Sort of like many posters here.
--------------------
"Things fall apart; the center cannot hold; mere anarchy is loosed upon the world,"
I'm Ragnar Floggurass, and I approve this message