Using extreme values with the RootNode specification mechanism, it is possible to build a Web ``robot''. We strongly discourage this. Robots are very inefficient: they place excessive load on network links and remote information servers, do not coordinate their gathering effort, and become less useful over time because they do not focus on a specific topic or community. The Harvest RootNode specification mechanism was designed to support gathering for topical collections, not to build robots.
NOTE: As of version 1.4 patchlevel 2, the Gatherer obeys the robots.txt convention.
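Under the robots.txt convention, a server administrator places a file named robots.txt at the server's document root to tell compliant crawlers which areas to avoid. As a purely illustrative sketch (the paths and agent name here are hypothetical, not part of Harvest), such a file might look like:

```
# Exclude all compliant robots from the CGI area,
# but allow everything else.
User-agent: *
Disallow: /cgi-bin/
```

A Gatherer that obeys this convention will skip the disallowed paths when enumerating URLs from a RootNode.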