robots.txt Generator
Create robots.txt files for search engine crawling rules
SEO & Config · Runs in your browser
Configuration
Output
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /
Sitemap: https://example.com/sitemap.xml
How to Use
1. Add user-agent rules (use * for all crawlers)
2. Specify paths to disallow or allow
3. Add your sitemap URL
4. Copy or download the generated robots.txt file
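Once you've downloaded the file, you can sanity-check the rules before deploying. As one way to do this, the sketch below uses Python's standard-library `urllib.robotparser` to parse the example output shown above and confirm which URLs a crawler may fetch:

```python
# Check robots.txt rules with Python's standard-library parser.
# The rules below mirror the example output shown above.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Crawlers may fetch the homepage but not anything under /admin/.
print(parser.can_fetch("*", "https://example.com/"))         # True
print(parser.can_fetch("*", "https://example.com/admin/x"))  # False
```

This is also a handy way to test how a specific crawler's user-agent string interacts with more complex rule sets.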
Frequently Asked Questions
What is robots.txt?
robots.txt is a file placed at the root of a website that tells search engine crawlers which pages or sections they can or cannot access.
Does robots.txt block indexing?
No. robots.txt only controls crawling, not indexing. Pages blocked by robots.txt can still appear in search results if other pages link to them. To keep a page out of search results, use a meta noindex tag (or an X-Robots-Tag HTTP header) instead.
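For reference, the standard noindex directive goes in the page's `<head>`:

```html
<!-- Prevent this page from appearing in search results -->
<meta name="robots" content="noindex">
```

Note that crawlers must be able to fetch the page to see this tag, so don't also disallow it in robots.txt.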
Where should I place robots.txt?
robots.txt must be placed at the root of your domain: https://example.com/robots.txt. It won't work in subdirectories.