standard for robot exclusion

(World-Wide Web) A proposal to try to prevent the havoc wreaked by many of the early World-Wide Web robots when they retrieved documents too rapidly or retrieved documents that had side effects (such as voting). The proposed standard for robot exclusion offers a solution to these problems in the form of a file called "robots.txt" placed in the document root of the web site.
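
For illustration, a robots.txt file lists which user agents may retrieve which paths, and a polite robot consults it before fetching a document. The following is a minimal sketch using Python's standard urllib.robotparser module; the site URL and the robot name "MyRobot" are placeholder values, not part of the standard itself.

    # Minimal sketch of a robot honouring a site's robots.txt
    # ("example.com" and "MyRobot" are hypothetical placeholders)
    import urllib.robotparser

    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("https://example.com/robots.txt")
    rp.read()   # fetch and parse the robots.txt file

    # can_fetch() applies the file's User-agent/Disallow rules to a URL
    if rp.can_fetch("MyRobot", "https://example.com/cgi-bin/vote"):
        print("allowed to retrieve the document")
    else:
        print("excluded by robots.txt")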

Despite the name, it was never adopted as a formal W3C standard; it became a de facto convention followed by well-behaved robots.