
How to Block AI Bots on Nuxt.js

Nuxt 3 has a dedicated @nuxtjs/robots module, a routeRules API for HTTP headers, and server middleware for hard blocking — everything you need to lock out AI crawlers without leaving Vue.

Quick fix — create public/robots.txt

Place it at the same level as nuxt.config.ts; Nuxt serves it at /robots.txt automatically, no config needed.

User-agent: GPTBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /

All Methods

public/robots.txt (Recommended) · Easy · SSG + SSR · public/robots.txt

Nuxt serves everything in public/ as static assets at the root URL. A plain robots.txt here works in all Nuxt modes with no configuration. Plain text only: no front matter, no Vue syntax.

@nuxtjs/robots module · Easy · SSG + SSR · nuxt.config.ts (modules + robots config)

Official Nuxt module that generates robots.txt from your nuxt.config.ts. Supports environment-based rules, dynamic site URLs, and integration with @nuxtjs/sitemap. Install with npx nuxi module add robots, and remove public/robots.txt if you use it, since the two conflict.

useHead() global noai tag · Easy · SSG + SSR · app.vue or layouts/default.vue

Use Nuxt 3's useHead() composable in app.vue to inject <meta name="robots" content="noai, noimageai"> on every page; it can also be set via app.head in nuxt.config.ts. In app.vue the tag applies globally; in an individual page it applies only to that page.

routeRules headers · Easy · SSG + SSR · nuxt.config.ts (routeRules)

Set X-Robots-Tag: noai, noimageai as an HTTP header on all routes via Nuxt's routeRules. More authoritative than the HTML meta tag because it applies at the HTTP layer. Works in SSR (served by the Nitro server) and in SSG (written as static response headers on Netlify/Vercel).

server/middleware hard blocking · Intermediate · SSR only · server/middleware/botBlock.ts

Intercept all requests in Nuxt's Nitro server layer and return 403 for matched AI bot user agents. The most powerful Nuxt method: bots never reach Vue rendering. Requires running as a Node.js server and does not run in SSG (nuxi generate) mode.

Method 1: public/robots.txt

Nuxt serves every file in public/ as a static asset at the root URL. Create public/robots.txt in your project root (same level as nuxt.config.ts and app.vue).

User-agent: *
Allow: /

User-agent: GPTBot
Disallow: /

User-agent: ChatGPT-User
Disallow: /

User-agent: OAI-SearchBot
Disallow: /

User-agent: ClaudeBot
Disallow: /

User-agent: anthropic-ai
Disallow: /

User-agent: Google-Extended
Disallow: /

User-agent: Bytespider
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: PerplexityBot
Disallow: /

User-agent: meta-externalagent
Disallow: /

User-agent: Amazonbot
Disallow: /

User-agent: Applebot-Extended
Disallow: /

User-agent: xAI-Bot
Disallow: /

User-agent: DeepSeekBot
Disallow: /

User-agent: MistralBot
Disallow: /

User-agent: Diffbot
Disallow: /

User-agent: cohere-ai
Disallow: /

User-agent: AI2Bot
Disallow: /

User-agent: Ai2Bot-Dolma
Disallow: /

User-agent: YouBot
Disallow: /

User-agent: DuckAssistBot
Disallow: /

User-agent: omgili
Disallow: /

User-agent: omgilibot
Disallow: /

User-agent: webzio-extended
Disallow: /

User-agent: gemini-deep-research
Disallow: /

Method 2: @nuxtjs/robots Module

The official Nuxt robots module generates robots.txt from your nuxt.config.ts. Supports environment-based rules, dynamic site URLs, and integrates with @nuxtjs/sitemap.

# Install
npx nuxi module add robots

// nuxt.config.ts
export default defineNuxtConfig({
  modules: ['@nuxtjs/robots'],

  robots: {
    // Per-bot rules go in groups; the first group keeps standard
    // search crawlers fully allowed.
    groups: [
      {
        userAgent: '*',
        allow: '/',
      },
      {
        userAgent: ['GPTBot', 'ChatGPT-User', 'OAI-SearchBot'],
        disallow: '/',
      },
      {
        userAgent: ['ClaudeBot', 'anthropic-ai'],
        disallow: '/',
      },
      {
        userAgent: 'Google-Extended',
        disallow: '/',
      },
      {
        userAgent: 'Bytespider',
        disallow: '/',
      },
      {
        userAgent: 'CCBot',
        disallow: '/',
      },
      {
        userAgent: 'PerplexityBot',
        disallow: '/',
      },
      {
        userAgent: ['meta-externalagent', 'Diffbot', 'DeepSeekBot'],
        disallow: '/',
      },
      {
        userAgent: ['MistralBot', 'cohere-ai', 'AI2Bot', 'Ai2Bot-Dolma'],
        disallow: '/',
      },
      {
        userAgent: ['YouBot', 'DuckAssistBot', 'xAI-Bot', 'Applebot-Extended'],
        disallow: '/',
      },
      {
        userAgent: ['omgili', 'omgilibot', 'webzio-extended', 'gemini-deep-research', 'Amazonbot'],
        disallow: '/',
      },
    ],
  },
});

Conflict: If you use @nuxtjs/robots, remove public/robots.txt. Having both causes a conflict, and the module-generated file takes precedence.

Method 3: useHead() — Global noai Tag

Add the noai meta tag to every page using Nuxt 3's useHead() composable in app.vue, or declaratively in nuxt.config.ts.

Option A: app.vue with useHead()

<!-- app.vue -->
<script setup lang="ts">
useHead({
  meta: [
    { name: 'robots', content: 'noai, noimageai' },
  ],
});
</script>

<template>
  <NuxtLayout>
    <NuxtPage />
  </NuxtLayout>
</template>

Option B: nuxt.config.ts app.head (no composable needed)

// nuxt.config.ts
export default defineNuxtConfig({
  app: {
    head: {
      meta: [
        { name: 'robots', content: 'noai, noimageai' },
      ],
    },
  },
});

Method 4: routeRules — X-Robots-Tag Header

Set X-Robots-Tag: noai, noimageai as an HTTP response header on all routes via routeRules in nuxt.config.ts. This is more authoritative than the HTML meta tag.

// nuxt.config.ts
export default defineNuxtConfig({
  routeRules: {
    // Apply X-Robots-Tag to all routes
    '/**': {
      headers: {
        'X-Robots-Tag': 'noai, noimageai',
      },
    },
  },
});
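
routeRules keys are glob patterns, so the header does not have to be site-wide. A minimal sketch, assuming you only want to shield specific subtrees (the /blog and /images paths are illustrative):

```typescript
// nuxt.config.ts: scope X-Robots-Tag to parts of the site (paths are examples)
export default defineNuxtConfig({
  routeRules: {
    // Block AI text and image use on article pages
    '/blog/**': { headers: { 'X-Robots-Tag': 'noai, noimageai' } },
    // Block only image training on the asset route
    '/images/**': { headers: { 'X-Robots-Tag': 'noimageai' } },
  },
});
```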

Method 5: Server Middleware — Hard Blocking (SSR)

For SSR deployments — Nuxt server middleware runs in the Nitro server layer on every request before Vue rendering. Create server/middleware/botBlock.ts.

SSR only: Server middleware does not run when using nuxi generate (static mode). You need ssr: true (the default) and a deployment that actually runs the Nitro server: a Node.js host, or the SSR presets on Vercel and Netlify.

// server/middleware/botBlock.ts
import { defineEventHandler, getRequestHeader, setResponseStatus, send } from 'h3';

const BLOCKED_BOTS = /GPTBot|ChatGPT-User|OAI-SearchBot|ClaudeBot|anthropic-ai|Google-Extended|Bytespider|CCBot|PerplexityBot|meta-externalagent|Diffbot|DeepSeekBot|MistralBot|cohere-ai|AI2Bot|Ai2Bot-Dolma|YouBot|DuckAssistBot|omgili|omgilibot|webzio-extended|gemini-deep-research|xAI-Bot|Applebot-Extended|Amazonbot/i;

export default defineEventHandler((event) => {
  const ua = getRequestHeader(event, 'user-agent') ?? '';

  if (BLOCKED_BOTS.test(ua)) {
    setResponseStatus(event, 403);
    return send(event, 'Forbidden');
  }
});
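
The pattern can be sanity-checked outside Nuxt with plain Node before deploying. A standalone sketch (the alternation is trimmed for readability, and the sample user-agent strings are illustrative, not captured from real traffic):

```typescript
// Trimmed version of the middleware's user-agent pattern.
const BLOCKED_BOTS = /GPTBot|ChatGPT-User|ClaudeBot|anthropic-ai|CCBot|PerplexityBot|Bytespider/i;

// [user-agent string, should it be blocked?]
const samples: Array<[string, boolean]> = [
  ['Mozilla/5.0 (compatible; GPTBot/1.2; +https://openai.com/gptbot)', true],
  ['Mozilla/5.0 (compatible; ClaudeBot/1.0; +claudebot@anthropic.com)', true],
  ['Mozilla/5.0 (Windows NT 10.0; Win64; x64) Chrome/120.0 Safari/537.36', false],
];

for (const [ua, expected] of samples) {
  const blocked = BLOCKED_BOTS.test(ua);
  console.log(`${blocked === expected ? 'ok' : 'MISMATCH'}: ${ua}`);
}
```

The case-insensitive `i` flag matters here: some crawlers report lowercase tokens such as anthropic-ai and omgili.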

Full AI Bot Reference

All 25 AI bots covered by the robots.txt block list above:

GPTBot, ChatGPT-User, OAI-SearchBot, ClaudeBot, anthropic-ai, Google-Extended, Bytespider, CCBot, PerplexityBot, meta-externalagent, Amazonbot, Applebot-Extended, xAI-Bot, DeepSeekBot, MistralBot, Diffbot, cohere-ai, AI2Bot, Ai2Bot-Dolma, YouBot, DuckAssistBot, omgili, omgilibot, webzio-extended, gemini-deep-research

Frequently Asked Questions

Where do I put robots.txt in a Nuxt.js 3 site?

The simplest approach: place robots.txt in the public/ directory at your Nuxt project root. Nuxt serves everything in public/ as static assets at the root path, so public/robots.txt becomes yourdomain.com/robots.txt automatically. No configuration required — this works in both SSR (nuxi build) and SSG (nuxi generate) modes. Alternatively, use the @nuxtjs/robots module which generates robots.txt from your nuxt.config.ts configuration.

What is @nuxtjs/robots and do I need it?

@nuxtjs/robots is an official Nuxt module that generates your robots.txt from configuration in nuxt.config.ts. It's useful if you want to: reference your site URL dynamically, use different rules per environment (block everything on staging), integrate with @nuxtjs/sitemap for the Sitemap directive, or manage robots rules alongside your other Nuxt config. For simple static AI bot blocking, public/robots.txt is simpler and doesn't require an additional dependency.
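
A sketch of environment-based rules using plain process.env (the groups options mirror the module config shown above; the choice of NODE_ENV as the switch is an assumption to adapt to your deployment):

```typescript
// nuxt.config.ts: block everything on staging, only AI bots on production
const isProd = process.env.NODE_ENV === 'production';

export default defineNuxtConfig({
  modules: ['@nuxtjs/robots'],
  robots: {
    groups: isProd
      ? [
          { userAgent: '*', allow: '/' },
          { userAgent: ['GPTBot', 'ClaudeBot', 'CCBot'], disallow: '/' },
        ]
      : [{ userAgent: '*', disallow: '/' }], // staging: keep all crawlers out
  },
});
```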

How do I add a noai meta tag to every Nuxt page?

Three options in Nuxt 3: (1) app.vue with useHead() — add useHead({ meta: [{ name: 'robots', content: 'noai, noimageai' }] }) in your root app.vue component; (2) nuxt.config.ts app.head — add the meta tag in the global head configuration; (3) layouts/default.vue with useHead() — applies to all pages using the default layout. The nuxt.config.ts approach is cleanest as it keeps configuration centralised.

Does Nuxt server middleware work for blocking AI bots in SSG mode?

No. Nuxt server middleware (files in server/middleware/) only runs in SSR (server) mode — when Nuxt is running as a Node.js server. In SSG mode (nuxi generate), the site is pre-rendered to static HTML files and there is no server to run middleware. For SSG deployments, use robots.txt + noai meta tags, and rely on platform-level tools (Cloudflare WAF, Netlify Edge Functions) for runtime bot blocking.
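
For example, on Netlify a static Nuxt site can still hard-block at the edge. A hedged sketch of an Edge Function (the file lives outside Nuxt, e.g. netlify/edge-functions/block-ai.ts; the path pattern and bot list are assumptions to adapt):

```typescript
// Netlify Edge Function: runs before the pre-rendered static files are served.
const AI_BOTS = /GPTBot|ClaudeBot|anthropic-ai|CCBot|PerplexityBot|Bytespider/i;

const blockAiBots = async (request: Request): Promise<Response | undefined> => {
  const ua = request.headers.get('user-agent') ?? '';
  if (AI_BOTS.test(ua)) {
    return new Response('Forbidden', { status: 403 });
  }
  return undefined; // fall through to the static file
};

export default blockAiBots;

// Match every path on the site.
export const config = { path: '/*' };
```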

How does nuxt.config.ts routeRules work for bot blocking?

Nuxt 3's routeRules lets you set HTTP response headers on specific routes (or all routes) directly in nuxt.config.ts. Add routeRules: { '/**': { headers: { 'X-Robots-Tag': 'noai, noimageai' } } } to set the X-Robots-Tag header on every response. This is more authoritative than the HTML meta tag and works at the HTTP layer — bots that fetch pages without rendering JavaScript still see the header. routeRules works in both SSR and SSG (as static headers on generated files when deployed to Netlify or Vercel).

Will blocking AI bots affect Nuxt's built-in SEO or sitemap?

No. Blocking GPTBot, ClaudeBot, CCBot, and other AI training bots has no effect on Googlebot or Bingbot. The @nuxtjs/sitemap module continues generating sitemap.xml normally. Nuxt's built-in useSeoMeta(), useHead(), and meta tag handling are completely unaffected. Your search engine rankings and sitemap discovery remain unchanged.
