Submit Your Sitemaps with robots.txt

A sitemap is a list of the content on your site. A sitemap.xml file makes it easier for search engines to discover and index your pages. If you have a small, simple website that is easy to navigate, you probably don’t need a sitemap, but it doesn’t hurt to have one.

I use Xenu, a free link-checking program, to create my sitemaps.
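If you would rather script this step, a basic sitemap.xml can be generated with a few lines of Python's standard library. This is only a sketch, assuming you already have a list of your page URLs; the example.com addresses below are placeholders, not real pages:

```python
# Sketch: build a minimal sitemap.xml from a list of page URLs.
# The URLs here are placeholders -- substitute your own pages.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    # <urlset> is the required root element in the sitemap protocol.
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page  # <loc> holds the page URL
    return ET.tostring(urlset, encoding="unicode")

if __name__ == "__main__":
    pages = ["https://www.example.com/",
             "https://www.example.com/about.html"]
    print(build_sitemap(pages))
```

Save the output as sitemap.xml in the root of your site, the same place the robots.txt file goes.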

Once you’ve created your sitemap, you need to let search engines know where it is. The proper way is to set up a webmaster account with each search engine, register your site, verify ownership, and then submit your sitemap. That is the most thorough approach, but there is an easier way: just add one line to your robots.txt file:

Sitemap: https://www.example.com/sitemap.xml

Replace the example domain with your own, and change the name of the sitemap.xml file if you use a different name.

Here is a simple robots.txt file example:

User-agent: *
Allow: /
Disallow: /cgi-bin/
Sitemap: https://www.example.com/sitemap.xml

This robots.txt file tells search engine spiders where to find the sitemap. It also says that all spiders are allowed to index the whole site except the /cgi-bin/ directory. If you don’t have a robots.txt file, just open any text editor, type in the lines above, and save the file as robots.txt in the root directory of your website.
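You can sanity-check a finished robots.txt with Python's standard-library parser. One caveat worth knowing: urllib.robotparser applies the first rule that matches a URL, so in the sketch below the Disallow line is listed before the blanket Allow (Google instead honors the most specific match, so either order works for Googlebot). The example.com domain is a placeholder:

```python
# Sanity-check robots.txt rules with Python's standard-library parser.
# Note: urllib.robotparser uses first-match semantics, so the more
# specific Disallow line is listed before the blanket Allow here.
import urllib.robotparser

robots_txt = """\
User-agent: *
Disallow: /cgi-bin/
Allow: /
Sitemap: https://www.example.com/sitemap.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("*", "https://www.example.com/index.html"))      # allowed
print(rp.can_fetch("*", "https://www.example.com/cgi-bin/run.pl"))  # blocked
print(rp.site_maps())  # sitemap URLs declared in the file (Python 3.8+)
```

This confirms the spiders can reach your pages, are kept out of /cgi-bin/, and can see the sitemap location.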
