Robots File: Disallow URLs and Sitemaps Autodiscovery

Easily create robots.txt files that contain disallow instructions and references to generated XML sitemaps for autodiscovery.

Robots.txt File Introduction

Ever since the early days of the internet and search engines, the robots.txt file has been the way website owners and their webmasters tell search engine crawlers like GoogleBot which pages and content should be ignored and left out of search results.

This was the situation for many years, until Google created the Google Sitemaps / XML Sitemaps protocol.

The new functionality was called Sitemaps Autodiscovery. It added instructions to the robots.txt file that make it possible to point search engines to your XML sitemaps, so they can discover all your XML sitemap files more easily.

Complete Robots.txt File Example

If you intend to manually create and/or edit robots.txt files, you can see a complete example here with disallowed URLs and XML Sitemaps Autodiscovery.

User-agent: *
Disallow: /home/feedback.php
Disallow: /home/link-us.php?
Disallow: /home/social-bookmark.php?
Sitemap: https://example.com/sitemap.xml


See below for more information.
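If you want to double-check your disallow rules before uploading, Python's standard-library robots.txt parser can test them. This is only an illustrative sketch: the rules are the ones from the example above, and the example.com URLs are placeholders.

```python
from urllib.robotparser import RobotFileParser

# Feed the robots.txt rules from the example above directly into the
# standard-library parser instead of fetching them over HTTP.
ROBOTS_TXT = """\
User-agent: *
Disallow: /home/feedback.php
Disallow: /home/link-us.php?
Disallow: /home/social-bookmark.php?
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The feedback page is blocked for all crawlers...
print(parser.can_fetch("*", "https://example.com/home/feedback.php"))  # False
# ...while an unlisted page remains crawlable.
print(parser.can_fetch("*", "https://example.com/home/index.php"))     # True
```

This catches typos in paths before search engines ever see the file.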

Robots Disallow URLs with Sitemapper "List" and "Analysis" Filters

If you want our sitemap builder to create your robots.txt file, you will need to read the help about configuring output and analysis filters.

Note: Only standard path filters are added to the robots.txt file, i.e. filters starting with a single colon ":".

[Screenshot: exclude path filters]

From the sitemap generator tooltip:
Text string matches: "mypics". Path relative to root: ":mypics/", subpaths only: ":mypics/*", regex search: "::mypics[0-9]*/"

Robots Text File and XML Sitemaps Autodiscovery

In 2007 it became possible to indirectly submit XML sitemaps to search engines by listing them in the robots.txt file. This concept is called XML sitemaps autodiscovery, and it is part of the XML sitemaps protocol. To add XML sitemaps autodiscovery to a robots.txt file, add a line with the fully qualified XML sitemap URL, e.g. Sitemap: https://example.com/sitemap.xml

Below are some complete examples of using XML sitemaps autodiscovery in a robots.txt file.

If you created multiple XML sitemap files covering different parts of your website:

User-agent: *
Sitemap: https://example.com/sitemap-1.xml
Sitemap: https://example.com/sitemap-2.xml

Or refer to the XML sitemap index file that links to all the XML sitemap files:

User-agent: *
Sitemap: https://example.com/sitemap-index.xml
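Crawlers that support autodiscovery read these Sitemap: lines back out of the robots.txt file. A minimal sketch of that, using Python's standard library (the example.com URLs are placeholders, not real sitemaps):

```python
from urllib.robotparser import RobotFileParser

# Parse a robots.txt that lists two sitemap files, as in the first
# autodiscovery example above.
ROBOTS_TXT = """\
User-agent: *
Sitemap: https://example.com/sitemap-1.xml
Sitemap: https://example.com/sitemap-2.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# site_maps() (Python 3.8+) returns the discovered sitemap URLs in order,
# or None if the file contains no Sitemap: lines.
print(parser.site_maps())
# ['https://example.com/sitemap-1.xml', 'https://example.com/sitemap-2.xml']
```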


Create Robots Text File with Sitemap Generator

Sitemap generator can create robots.txt files.

[Screenshot: robots text file disallow URLs option]

Generated robots.txt files are ready to be uploaded and used by search engines.
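In essence, what such a tool writes is a plain text file combining the excluded paths with the sitemap reference. A hypothetical sketch (not the sitemap generator's own code; the paths and example.com URL are illustrative placeholders):

```python
# Combine excluded paths and the sitemap reference into a robots.txt file.
disallowed_paths = ["/home/feedback.php", "/home/link-us.php?"]
sitemap_url = "https://example.com/sitemap.xml"  # placeholder URL

lines = ["User-agent: *"]
lines += [f"Disallow: {path}" for path in disallowed_paths]
lines.append(f"Sitemap: {sitemap_url}")

# robots.txt is plain text; write it with a trailing newline.
with open("robots.txt", "w", encoding="ascii") as f:
    f.write("\n".join(lines) + "\n")
```

The resulting file is ready to upload to the website root, where crawlers expect to find it.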

[Screenshot: sitemap generator robots file]

Robots File and Cross Submit Sitemaps for Multiple Websites

Originally, and for a long time afterwards, it was not possible to submit sitemaps for a website unless the sitemaps were hosted on the same domain as the website.

However, some search engines now support another way of managing sitemaps across multiple sites and domains. The requirement is that you verify ownership of all the websites in Google Search Console (or the equivalent tool for the search engine in question).

To learn more see:
  • Sitemaps protocol: Cross sitemaps submit and manage using robots.txt.
  • Google: More website verification methods than sitemaps protocol defines.
 © Copyright 1997-2024 Microsys
