How to Block AI Bots on Squarespace
Squarespace limits what you can do with robots.txt — but there are still effective options depending on your plan. Here's what actually works.
Squarespace plan matters for AI bot blocking
Unlike WordPress or static sites, Squarespace tightly controls server configuration. The Crawlers panel and Code Injection are Business plan and above only. Personal plan users can't edit robots.txt or inject meta tags natively — but Cloudflare WAF works on any plan with a custom domain.
What Each Plan Can Do
| Plan | Crawlers panel | Code Injection | Cloudflare WAF | Verdict |
|---|---|---|---|---|
| Personal | No | No | Yes (custom domain) | Cloudflare WAF only |
| Business | Yes | Yes | Yes | Full control |
| Basic Commerce | Yes | Yes | Yes | Full control |
| Advanced Commerce | Yes | Yes | Yes | Full control |
Method 1: Crawlers Settings Panel
Squarespace's Crawlers panel lets you enter user agents to block. Each entry you add gets a Disallow: / rule in your robots.txt. It's the simplest native option — no code required.
1. Go to your Squarespace admin and click Marketing in the left sidebar.
2. Click SEO, then the Crawlers tab. If you don't see the Crawlers tab, you're on Personal plan — use Method 3 (Cloudflare) instead.
3. In the "Disallowed crawlers" field, add each AI bot user agent. Paste these into the Crawlers panel, one per line:

GPTBot
ChatGPT-User
OAI-SearchBot
ClaudeBot
anthropic-ai
Google-Extended
Bytespider
CCBot
PerplexityBot
meta-externalagent
Amazonbot
Applebot-Extended
xAI-Bot
DeepSeekBot
MistralBot
Diffbot
cohere-ai
AI2Bot
Ai2Bot-Dolma
YouBot
DuckAssistBot
omgili
omgilibot
webzio-extended
gemini-deep-research

4. Click Save.
5. Verify at yourdomain.com/robots.txt — each bot should have a Disallow: / rule.
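For reference, a robots.txt generated from those Crawlers entries looks roughly like this (abbreviated to three bots; Squarespace controls the exact formatting, so treat this as an illustration rather than the literal output):

```text
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /
```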
The Crawlers panel only writes Disallow: / rules for entire user agents. You can't write complex robots.txt directives (Allow, Crawl-delay, path-specific rules) or add a Sitemap directive. For AI bot blocking, that's sufficient.
Method 2: noai Meta Tags via Code Injection
The noai and noimageai meta tags tell AI crawlers not to use your content for training, even if they visit. Squarespace's Code Injection lets you add arbitrary HTML to your site header — perfect for this.
Global injection (all pages)
1. Go to Settings → Advanced → Code Injection.
2. In the Header field, paste:

<meta name="robots" content="noai, noimageai">

3. Click Save. The tag will be injected into every page's <head>.
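To confirm the injected tag actually renders, you can parse a page's HTML instead of eyeballing View Source. A minimal sketch using only the Python standard library — the HTML string here is a stand-in for a fetched page, not Squarespace's real markup:

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content attribute of every <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "meta" and attr_map.get("name") == "robots":
            self.robots_directives.append(attr_map.get("content", ""))

# Stand-in for a page head after Code Injection runs
page_html = '<head><meta name="robots" content="noai, noimageai"></head>'

finder = RobotsMetaFinder()
finder.feed(page_html)
print(finder.robots_directives)  # ['noai, noimageai']
```

In practice you would fetch the live page first and feed its body to the parser; the check itself is the same.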
Per-page injection (e.g. blog posts only)
1. Edit the individual page → click the gear icon → Advanced tab.
2. In Page Header Code Injection, paste the meta tag.
3. Save. Only that page gets the noai tag.
Method 3: Cloudflare WAF Proxy
This is the strongest option — and the only one available on Personal plan. Route your custom domain through Cloudflare's proxy, then use WAF Custom Rules to block AI bots at the network edge before requests reach Squarespace at all. Even Bytespider (which ignores robots.txt) gets blocked this way.
How to proxy Squarespace through Cloudflare
1. Add your custom domain to Cloudflare (free account). Do not transfer the domain — just change your nameservers to Cloudflare's.
2. In Cloudflare DNS, add a CNAME record pointing your domain to ext-cust.squarespace.com with the orange cloud (proxied) enabled.
3. In Squarespace, go to Domains → your domain → Advanced Settings and confirm the domain is still connected correctly through the proxy.
4. In Cloudflare, set the SSL/TLS encryption mode to Full (not Full (Strict)) to avoid certificate errors between Cloudflare and Squarespace.
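The DNS record from step 2, in Cloudflare's record-editor terms (the @ name assumes you're proxying the root domain; use www or another host if that's where your Squarespace site lives — Cloudflare flattens CNAMEs at the root automatically):

```text
Type:    CNAME
Name:    @
Target:  ext-cust.squarespace.com
Proxy:   Proxied (orange cloud)
TTL:     Auto
```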
Then add the WAF rule
1. Cloudflare Dashboard → your domain → Security → WAF → Custom Rules → Create rule.
2. Click Edit expression and paste:

(http.user_agent contains "GPTBot") or (http.user_agent contains "ClaudeBot") or (http.user_agent contains "anthropic-ai") or (http.user_agent contains "Google-Extended") or (http.user_agent contains "Bytespider") or (http.user_agent contains "CCBot") or (http.user_agent contains "PerplexityBot") or (http.user_agent contains "meta-externalagent") or (http.user_agent contains "DeepSeekBot") or (http.user_agent contains "MistralBot") or (http.user_agent contains "xAI-Bot") or (http.user_agent contains "Diffbot") or (http.user_agent contains "cohere-ai") or (http.user_agent contains "AI2Bot") or (http.user_agent contains "DuckAssistBot") or (http.user_agent contains "omgilibot") or (http.user_agent contains "webzio-extended") or (http.user_agent contains "gemini-deep-research")

3. Set the action to Block and deploy. AI bots receive a 403 Forbidden response — Squarespace never sees the request.
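The WAF expression is just a chain of substring checks against the User-Agent header. A quick Python mirror of that logic, useful for sanity-checking which user agents your list would catch (Cloudflare's contains operator is case-sensitive, as is Python's in, so the behavior should match; the marker list mirrors the expression above):

```python
# Substring markers taken from the WAF expression; both Cloudflare's
# "contains" and Python's "in" are case-sensitive substring matches.
AI_BOT_MARKERS = [
    "GPTBot", "ClaudeBot", "anthropic-ai", "Google-Extended",
    "Bytespider", "CCBot", "PerplexityBot", "meta-externalagent",
    "DeepSeekBot", "MistralBot", "xAI-Bot", "Diffbot", "cohere-ai",
    "AI2Bot", "DuckAssistBot", "omgilibot", "webzio-extended",
    "gemini-deep-research",
]

def is_blocked(user_agent: str) -> bool:
    """Return True if the WAF rule above would match this User-Agent."""
    return any(marker in user_agent for marker in AI_BOT_MARKERS)

print(is_blocked("Mozilla/5.0 (compatible; GPTBot/1.2)"))       # True
print(is_blocked("Mozilla/5.0 (compatible; Googlebot/2.1)"))    # False
```

Note that regular search crawlers like Googlebot and Bingbot contain none of these markers, which is why the rule leaves normal SEO traffic alone.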
On Personal plan? Cloudflare is your only option
Personal plan doesn't include the Crawlers panel or Code Injection. If you have a custom domain (not a .squarespace.com subdomain), you can proxy it through Cloudflare and use WAF rules as described above. This gives you stronger protection than the robots.txt method anyway — WAF blocks at the network layer, robots.txt is just a convention.
If you're on a .squarespace.com subdomain with no custom domain, you have no AI bot blocking options. Upgrading to Business or adding a custom domain is required.
Will This Affect My Squarespace SEO?
Safe to block
- ✓ No effect on Google Search rankings
- ✓ Googlebot, Bingbot untouched
- ✓ Google Image Search unaffected
- ✓ Social media crawlers unaffected
- ✓ Squarespace's own SEO tools unaffected
AI search tradeoff
- ⚠ OAI-SearchBot → no ChatGPT Search
- ⚠ PerplexityBot → no Perplexity results
- ⚠ DuckAssistBot → no DuckDuckGo AI
Only relevant if AI search traffic matters to you.
Verify Your Block Is Working
1. Check your robots.txt
Visit https://yourdomain.com/robots.txt. You should see Disallow rules for each bot you added in the Crawlers panel.
2. Google robots.txt Tester
Google Search Console → Settings → robots.txt report. Confirm the most recently fetched version of your robots.txt includes the Disallow rules. (Google retired the standalone robots.txt Tester; the report shows the file Google actually fetched.)
3. Check the noai meta tag is live
Right-click any page → View Page Source → search for "noai". You should see the meta tag in the <head> section.
4. Open Shadow scanner
The free scanner checks your robots.txt, noai tags, and AI readiness score in one scan.
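The manual checks above can also be scripted. A small sketch using Python's standard-library robots.txt parser, run here against an inline sample of the kind of file Squarespace generates — in practice you would fetch and feed in the contents of your live https://yourdomain.com/robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Abbreviated sample of a Squarespace-generated robots.txt;
# replace with your live file's contents when testing for real.
robots_txt = """\
User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

for bot in ("GPTBot", "ClaudeBot", "Googlebot"):
    status = "allowed" if parser.can_fetch(bot, "/") else "blocked"
    print(f"{bot}: {status}")
# GPTBot: blocked
# ClaudeBot: blocked
# Googlebot: allowed
```

Since the sample has no User-agent: * stanza, anything not explicitly listed (like Googlebot) stays allowed — which is exactly the SEO-safe behavior you want.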
Frequently Asked Questions
Can you edit robots.txt on Squarespace?
Partially. Business plan and above has a Crawlers panel (Marketing → SEO → Crawlers) where you enter user agents to block. Each gets a Disallow: / rule in your robots.txt. You can't write arbitrary robots.txt content — only user-agent blocking is supported. Personal plan has no robots.txt editing.
I'm on Personal plan — can I still block AI bots?
Yes, but only via Cloudflare. If you have a custom domain (not a .squarespace.com address), add it to Cloudflare as a proxy and use WAF Custom Rules to block AI bots by user agent. This is actually stronger than robots.txt — it blocks at the network layer and stops bots that ignore robots.txt.
How do I add a noai meta tag to Squarespace?
Business plan+: Settings → Advanced → Code Injection → Header, paste: <meta name="robots" content="noai, noimageai">. For individual pages: page settings → Advanced → Page Header Code Injection.
Will blocking AI bots affect my Squarespace SEO?
No. Blocking AI training bots (GPTBot, ClaudeBot, CCBot, Google-Extended, Bytespider) has zero effect on Googlebot or Bing. Your Squarespace SEO and Google rankings are completely unaffected. The only consideration is AI search visibility — blocking OAI-SearchBot removes you from ChatGPT Search, etc.
Does Squarespace have a setting to block all crawlers?
Squarespace has a 'Discourage search engines from indexing your site' option in Settings → Advanced → External API Keys (older versions) or Settings → SEO. This blocks ALL crawlers including Googlebot — which destroys your SEO. Do NOT use this to block AI bots. Use the Crawlers panel with specific user agents instead.
What if I switch Squarespace templates — do I lose my settings?
No. The Crawlers panel settings are tied to your Squarespace account, not your template. Switching templates does not affect robots.txt rules you added via the Crawlers panel. Code Injection settings are also template-independent when set globally.
Is your site protected from AI bots?
Run a free scan to check your robots.txt, meta tags, and overall AI readiness score.
Scan My Site Free →