SEO Tools
Robots.txt Generator
Your robots.txt file tells search engine crawlers which pages they can and cannot access. Getting this wrong can inadvertently block your site from Google. This builder lets you create a correct, valid robots.txt without writing any code.
1. Select which bots to configure (start with * for all)
2. Choose whether to allow all, block all, or use custom rules
3. Add specific paths to disallow if using custom mode
4. Enter your sitemap URL
5. Click Generate and download the file
6. Upload robots.txt to your website's root directory
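Following the steps above in custom mode, the generated file might look like the sketch below. The domain and disallowed paths are placeholders; substitute your own.

```
# Apply to all crawlers
User-agent: *
# Keep non-public paths out of the crawl (placeholders)
Disallow: /admin/
Disallow: /checkout/
Disallow: /cart/

# Point crawlers at your sitemap (placeholder URL)
Sitemap: https://yourdomain.com/sitemap.xml
```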
💡 Pro Tips & Best Practices
✓ Your robots.txt must be accessible at https://yourdomain.com/robots.txt, not in a subdirectory.
✓ Do not use robots.txt to hide sensitive pages; the file is public and readable by anyone.
✓ For most websites, allowing all crawlers and adding your sitemap URL is all you need.
✓ Block /admin/, /checkout/, /cart/ and other non-public paths to save crawl budget.
✓ After uploading, verify it works using Google Search Console's robots.txt tester.
✓ A Disallow rule in robots.txt does not remove pages that are already indexed; use a noindex meta tag for that.
✓ Never accidentally disallow your entire site (Disallow: /), which blocks all search crawlers.
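Google Search Console is the authoritative check, but you can also sanity-check your rules locally before uploading with Python's standard-library `urllib.robotparser`. The rules, domain, and paths below are placeholders, not a real site:

```python
from urllib.robotparser import RobotFileParser

# Placeholder robots.txt content: block /admin/ and /cart/ for all crawlers.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /cart/

Sitemap: https://yourdomain.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Public pages should be crawlable; blocked paths should not.
print(parser.can_fetch("*", "https://yourdomain.com/blog/post"))    # True
print(parser.can_fetch("*", "https://yourdomain.com/admin/login"))  # False
```

This catches the classic mistake from the last tip: if you had written `Disallow: /`, both calls would return False, flagging the problem before any crawler sees it.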