Robots.txt Generator

Generate a robots.txt file for your website. Control which pages search engines can and cannot crawl.

Quick Presets

✅ Allow All

Let all search engines crawl everything

🚫 Block All

Block all search engines from everything

🔒 Block Admin

Allow all but block admin and private areas

WordPress

Standard robots.txt for WordPress sites
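For reference, the first two presets differ by a single character: an empty Disallow rule permits everything, while `Disallow: /` blocks everything. These are illustrative files (each would be served on its own; the generator's exact output may vary):

```
# "Allow All" — one file
User-agent: *
Disallow:

# "Block All" — a separate file
User-agent: *
Disallow: /
```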

Sitemap URL (optional)
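If you provide a sitemap URL, the generator can append a `Sitemap` directive, which points crawlers at your XML sitemap. The URL below is a placeholder:

```
Sitemap: https://example.com/sitemap.xml
```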

What is robots.txt?

The robots.txt file is a plain-text file placed at the root of your website that tells search engine crawlers which parts of the site they may and may not crawl. It is part of the Robots Exclusion Protocol, a convention that well-behaved bots follow when crawling your site.

A properly configured robots.txt file helps search engines crawl your site efficiently by steering them away from unimportant pages like admin panels, login pages, and duplicate content. This can improve your site's crawl budget and help important pages get indexed faster.
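You can check how crawlers interpret a set of rules with Python's standard-library `urllib.robotparser`. The paths below are illustrative (a typical WordPress-style admin block):

```python
from urllib import robotparser

# A small illustrative robots.txt, fed directly to the stdlib parser.
# robotparser applies rules in file order (first match wins), so the
# Allow exception is listed before the broader Disallow here.
RULES = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

print(rp.can_fetch("*", "https://example.com/wp-admin/secret.php"))      # blocked
print(rp.can_fetch("*", "https://example.com/blog/my-post"))             # allowed (no rule matches)
print(rp.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # allowed by the exception
```

Note that Google resolves Allow/Disallow conflicts by longest matching rule rather than file order, so keep exceptions more specific than the rules they carve out of.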

Important Notes

robots.txt is a suggestion, not a security measure. Malicious bots may simply ignore it, so never rely on robots.txt to hide sensitive content; use authentication instead. Also note that blocking a page in robots.txt does not remove it from search results if other sites link to it.


Frequently Asked Questions

Where do I put robots.txt?

The robots.txt file must be placed in the root directory of your website at example.com/robots.txt. It will not work in subdirectories. Most hosting control panels let you create or edit it directly.
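A quick way to see "root only" in practice: crawlers derive the robots.txt location purely from the URL's scheme and host, discarding any path. A small sketch (the `robots_url` helper is hypothetical, built on the stdlib):

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(page_url: str) -> str:
    """Return the only location crawlers will check for robots.txt:
    the root of the URL's scheme + host (path and query are ignored)."""
    parts = urlsplit(page_url)
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("https://example.com/blog/post?page=2"))
print(robots_url("https://shop.example.com/cart"))
```

This also shows why each subdomain needs its own robots.txt: `shop.example.com` resolves to its own file, not the one on `example.com`.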

Does blocking in robots.txt remove pages from Google?

No. Blocking crawling prevents Google from reading the page's content, but if other sites link to the page, it may still appear in search results as a bare URL with no description. To remove a page from results entirely, use a noindex meta tag or the URL removal tool in Google Search Console.
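For reference, a noindex directive looks like this (placed in the page's HTML head):

```html
<!-- Tells crawlers to drop this page from their index -->
<meta name="robots" content="noindex">
```

One subtlety: the page must remain crawlable for this to work. If robots.txt blocks the URL, Google never fetches the page and never sees the noindex tag.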