Robots.txt Generator

Generator options:

Default - All Robots are:
Crawl-Delay:
Sitemap: (leave blank if you don't have one)
Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
Restricted Directories: The path is relative to the root and must include a trailing slash "/"

Now, create a 'robots.txt' file in your site's root directory, copy the generated text above, and paste it into that file.
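As an illustration, a generated file for a site that allows all robots, sets a crawl delay, and restricts one directory might look like this (the domain and the /cgi-bin/ path are placeholders, not output from any particular site):

```text
User-agent: *
Disallow: /cgi-bin/
Crawl-delay: 10

Sitemap: https://www.example.com/sitemap.xml
```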


About Robots.txt Generator

robots.txt is a plain text file placed in the root folder of your website to help search engines index your site more appropriately. Search engines such as Google use crawlers, also called robots or spiders, to review all the content on your website. The file contains directives, written in a strict syntax, that these robots read before crawling your pages.

This file can be used to tell search engines your website's engagement rules. Search engines check a website's robots.txt file regularly for crawling instructions; these instructions are called directives. If no robots.txt file is present, the search engine will crawl the entire website. robots.txt plays a critical role in a site's SEO because it tells search engines how best to crawl it.
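How a crawler consumes these directives can be sketched with Python's standard-library parser; the directives and URLs below are made-up examples, not real rules:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt with two directives for all user agents.
rules = """
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# The parser answers the question every crawler asks before fetching a page:
# am I allowed to request this URL?
print(parser.can_fetch("*", "https://www.example.com/private/page.html"))  # False
print(parser.can_fetch("*", "https://www.example.com/index.html"))         # True
```

A well-behaved crawler performs exactly this check before every request; a missing file is treated as permission to crawl everything, matching the behavior described above.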

Using this file, you can prevent duplicate content, block search engines from accessing parts of your website, and guide them to crawl the site more efficiently.
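For instance, a file that blocks a printer-friendly duplicate of each page, keeps crawlers out of an admin area, and points them to a sitemap might read as follows (the directory names and domain are placeholders):

```text
User-agent: *
Disallow: /print/
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```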