Robots.txt Generator
Create robots.txt files to control how search engine crawlers access your website. Free tool with common presets and custom rules.
Configure your crawl rules and generate your robots.txt file
Common Robots.txt Directives
Allow
Permit crawlers to access specific paths
Disallow
Block crawlers from specific paths
Sitemap
Point to your XML sitemap location
Crawl-delay
Set a delay, in seconds, between successive crawler requests (not all crawlers honor this; Google ignores it)
User-agent
Target specific crawlers/bots
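Combined in a single file, these directives might look like the following minimal sketch (the paths and sitemap URL are placeholders):

    User-agent: *
    Disallow: /admin/
    Allow: /admin/public/
    Crawl-delay: 10

    Sitemap: https://example.com/sitemap.xml

Major crawlers such as Google resolve conflicts between Allow and Disallow by the most specific (longest) matching rule, so here /admin/ is blocked but /admin/public/ remains crawlable.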
Private Areas
Block admin and login pages
Media Folders
Control access to media files
Search Pages
Block internal search results
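What each preset emits depends on the generator's settings, but as a rough sketch, the three presets above typically expand to groups of Disallow rules like these (every path here is an illustrative placeholder, not necessarily what this tool outputs):

    User-agent: *
    # Private Areas
    Disallow: /admin/
    Disallow: /login/
    # Media Folders
    Disallow: /media/
    Disallow: /uploads/
    # Search Pages
    Disallow: /search/

Lines starting with # are comments, which are valid in robots.txt and ignored by crawlers.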
Frequently Asked Questions
What is robots.txt?
Robots.txt is a text file that tells search engine crawlers which pages or sections of your website they can or cannot access.
Where should I put robots.txt?
The robots.txt file must be placed in the root directory of your website (e.g., https://example.com/robots.txt).
Can robots.txt block all crawlers?
Yes. 'User-agent: *' followed by 'Disallow: /' blocks all compliant crawlers from your entire site, but this is rarely recommended.
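Written out, those are two separate lines in the file:

    User-agent: *
    Disallow: /

Conversely, an empty Disallow: value permits everything, which is the standard way to explicitly allow full access.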
Is robots.txt mandatory?
No. If no robots.txt file exists, crawlers assume they may access the whole site. Adding one is still recommended for SEO, since it helps search engines crawl your site more efficiently.
Does robots.txt guarantee privacy?
No. Robots.txt is a suggestion, not a security measure: some bots ignore it, and a disallowed URL can still appear in search results if other pages link to it. Use proper authentication for truly private content.