Robots.txt Generator

Build a valid robots.txt in seconds. Fully client-side. Presets, live preview, validation, and a quick path tester.

1) Configure

Preset mode: Append lets you combine templates (e.g., block AI + allow major search).

Allow everything

Make all content crawlable by any bot.
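For reference, an allow-everything robots.txt is typically just:

```
User-agent: *
Allow: /
```

(An empty `Disallow:` line under `User-agent: *` has the same effect.)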

Block everything

Disallow all bots globally (useful for staging).
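The standard block-everything file:

```
User-agent: *
Disallow: /
```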

Only major search engines

In replace mode, block every other bot; in append mode, just add allow groups for the major search engines.
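In replace mode the output would follow this shape — the exact bot list is the preset's; Googlebot and Bingbot are shown here only as well-known examples:

```
User-agent: Googlebot
User-agent: Bingbot
Allow: /

User-agent: *
Disallow: /
```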

Block common AI/data crawlers

Add groups for known AI/data bots and disallow them.
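As a sketch of what this preset emits — the generator's actual crawler list may differ; GPTBot and CCBot are two widely known AI/data crawler tokens:

```
User-agent: GPTBot
User-agent: CCBot
Disallow: /
```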

SEO baseline

Allow all, but block noisy system paths (edit as needed).
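An illustrative baseline — the system paths here are placeholders, not the preset's definitive output; edit them to match your site:

```
User-agent: *
Allow: /
Disallow: /cgi-bin/
Disallow: /search
```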

Block sensitive folders

Keep admin/login/private paths out of indexes.
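For example (paths illustrative — adjust to your site's layout):

```
User-agent: *
Disallow: /admin/
Disallow: /login
Disallow: /private/
```

Note the trailing slash: `/admin/` matches only the folder's contents, while `/admin` would also match sibling paths like `/admin-tools` under prefix matching.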

Groups (User-agent specific rules)
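A group is one or more `User-agent` lines followed by the rules that apply to those bots; groups are separated by a blank line. For example:

```
User-agent: Googlebot
Disallow: /drafts/

User-agent: *
Disallow: /tmp/
```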

2) Preview & Test

Live preview

The output updates as you change settings above.
The path tester uses a pragmatic prefix match: the longest matching rule wins.
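That matching logic can be sketched as follows — an assumption about the tester's internals, ignoring the `*` wildcards and `$` anchors that full robots.txt matchers also handle:

```python
def is_allowed(path: str, rules: list[tuple[str, str]]) -> bool:
    """Pragmatic robots.txt path check: longest matching rule wins.

    rules: (directive, pattern) pairs from one user-agent group,
    e.g. [("Disallow", "/admin/"), ("Allow", "/admin/public/")].
    On equal pattern lengths, Allow wins (the RFC 9309 tie-break).
    This sketch does plain prefix matching only.
    """
    best_len, best_dir = -1, "Allow"  # no matching rule: allowed by default
    for directive, pattern in rules:
        if pattern and path.startswith(pattern):
            length = len(pattern)
            if length > best_len or (length == best_len and directive == "Allow"):
                best_len, best_dir = length, directive
    return best_dir == "Allow"
```

For example, with `Disallow: /admin/` and `Allow: /admin/public/`, the path `/admin/public/faq` is allowed because the 14-character Allow pattern outranks the 7-character Disallow.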

Reminder: robots.txt is advisory. It does not secure private content. Use authentication for sensitive areas.

Open-source acknowledgements