site auditing tools

Robots.txt Generator


Default - All Robots are:  
    
Crawl-Delay:
    
Sitemap: (leave blank if you don't have one) 
     
Search Robots: Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
   
Restricted Directories: The path is relative to the site root and must contain a trailing slash "/"
Now create a 'robots.txt' file in your site's root directory, copy the generated text above, and paste it into that file.
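For illustration, the generated text you paste might look like this (the crawl delay, sitemap URL, and restricted directory are hypothetical values, not output the tool is guaranteed to produce):

```
User-agent: *
Disallow: /admin/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```

Here all robots may crawl the site except the /admin/ directory, they are asked to wait 10 seconds between requests, and the sitemap location is advertised.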



About Robots.txt Generator

Robots.txt implements what is known as the robots exclusion protocol (or robots exclusion standard). It is a plain-text file with a .txt extension placed at the root of a website. If your site does not have one, you can create it with the tool above as part of your search engine optimization. When the file is absent, the protocol assumes that search engines are permitted to crawl every page of the website.

It is used to give instructions to the various search engine crawlers listed in the tool above under Search Robots (Google, Google Image, MSN Search, Yahoo, and so on), allowing or refusing (disallowing) them to crawl the website. By editing the file you control how crawlers access your site and thereby influence how it is indexed and ranked. The Robots.txt Generator makes it easy to produce a robots file for any search engine. If the "Default - All Robots are" option is set to "Refused", it will generate the file

User-agent: *
Disallow: /

With this file in place, not a single search engine is allowed to crawl your website. If instead you want only Google to crawl while all other robots are refused, allow the Google option in the tool, and the record for Googlebot will read

User-agent: Googlebot
Disallow:

An empty Disallow line grants full access, so this record lets Googlebot crawl everything; to refuse all other crawlers it must be combined with a catch-all record like the "User-agent: *" / "Disallow: /" file shown above.
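To see how crawlers interpret such a combined file, you can check it with Python's standard-library robots.txt parser. This is a minimal sketch: the rules string below is the hypothetical "only Google may crawl" file discussed above, and the URLs and user-agent names are illustrative.

```python
# Check robots.txt rules with Python's standard-library parser.
from urllib.robotparser import RobotFileParser

# Hypothetical combined file: Googlebot allowed, all others refused.
rules = """\
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot matches its own record (empty Disallow = full access);
# every other crawler falls under the catch-all block.
print(parser.can_fetch("Googlebot", "https://example.com/page.html"))  # True
print(parser.can_fetch("Bingbot", "https://example.com/page.html"))    # False
```

This is the same logic well-behaved crawlers apply before fetching a page, so it is a quick way to verify that a generated file does what you intend.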

Once the text is generated, create the robots.txt file and save it at the root of your website. Have a look at this gem among our SEO tools.