The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a convention
used by websites to communicate with web crawlers and other web robots. The standard specifies how to
inform a web robot about which areas of the website should not be processed or scanned.
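As a sketch of how a compliant crawler consults these rules, Python's standard-library `urllib.robotparser` can parse a robots.txt file and answer whether a given URL may be fetched. The file content and the `example.com` URLs below are hypothetical, chosen only for illustration; in practice the file is fetched from the site's own `/robots.txt` path.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; a real crawler would fetch this
# from https://<host>/robots.txt before crawling the site.
rules = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A compliant crawler checks each URL against the parsed rules.
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/public/page.html"))   # True
```

Note that the standard is purely advisory: it is up to each robot to honor the rules, and nothing in the protocol enforces them.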