The Robots Exclusion Standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention to prevent cooperating web crawlers and other web robots from accessing all or part of a website that is otherwise publicly viewable. Robots are often used by search engines to categorize and archive websites, or by webmasters to proofread source code. The standard is unrelated to, but can be used in conjunction with, Sitemaps, a robot inclusion standard for websites. A robots.txt file on a website functions as a request that specified robots ignore specified files or directories when crawling the site. This might be, for example, out of a preference for privacy from search engine results, a belief that the content of the selected directories would be misleading or irrelevant to the categorization of the site as a whole, or a desire that an application operate only on certain data.
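As a sketch of how such a request looks and how a cooperating crawler honors it, the following uses Python's standard-library `urllib.robotparser` to parse a hypothetical robots.txt (the `/private/` path and `example.com` domain are illustrative, not from the original text):

```python
from urllib import robotparser

# Hypothetical robots.txt asking all crawlers ("*") to skip the
# /private/ directory while leaving the rest of the site open.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# A cooperating robot checks each URL before fetching it.
print(rp.can_fetch("*", "https://example.com/private/data.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

Note that this is purely advisory: the file expresses a request, and nothing prevents a non-cooperating robot from ignoring it.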