How to Block AI Bots on Micronaut (Java): Complete 2026 Guide
Micronaut is the JVM framework built for microservices — compile-time DI, fast startup, low memory. Its filter system is reactive: HttpServerFilter.doFilter() returns Publisher<MutableHttpResponse<?>>. Unlike Spring's synchronous doFilterInternal() and Quarkus's JAX-RS abortWith(), a Micronaut filter blocks a request by returning a reactive response directly.
Reactive return — not abortWith(), not return false
Micronaut filters are reactive pipelines. doFilter() returns a Publisher — to block, return Mono.just(HttpResponse.status(FORBIDDEN)). To continue, return chain.proceed(request). Mutable responses: MutableHttpResponse.header() mutates in place (unlike PSR-7's immutable withHeader()).
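That short-circuit can be illustrated with plain JDK types (a sketch: a Supplier stands in for the filter chain and a status string for MutableHttpResponse; the names here are hypothetical, with no Micronaut dependency):

```java
import java.util.function.Supplier;

public class FilterSketch {
    // Stand-in for doFilter: either return a ready-made "response",
    // or delegate to the rest of the pipeline.
    static String doFilter(String userAgent, Supplier<String> chain) {
        if (userAgent != null && userAgent.toLowerCase().contains("gptbot")) {
            return "403 Forbidden";   // block: the chain supplier is never invoked
        }
        return chain.get();           // continue: proceed down the pipeline
    }

    public static void main(String[] args) {
        System.out.println(doFilter("GPTBot/1.0", () -> "200 OK"));  // 403 Forbidden
        System.out.println(doFilter("Mozilla/5.0", () -> "200 OK")); // 200 OK
    }
}
```

The key property is the same as in the real filter: when the bot branch matches, the chain is never subscribed, so no downstream handler runs.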
Protection layers
Layer 1: robots.txt
Place robots.txt in src/main/resources/public/. Micronaut can serve static files from the classpath public/ directory, but static resource resolution is disabled by default and must be enabled in configuration. Note that server filters still run for routes resolved to static resources, which is why the filter below exempts /robots.txt:
# src/main/resources/public/robots.txt
User-agent: *
Allow: /

User-agent: GPTBot
User-agent: ClaudeBot
User-agent: anthropic-ai
User-agent: Google-Extended
User-agent: CCBot
User-agent: cohere-ai
User-agent: Bytespider
User-agent: Amazonbot
User-agent: PerplexityBot
User-agent: YouBot
User-agent: Diffbot
User-agent: DeepSeekBot
User-agent: MistralBot
User-agent: xAI-Bot
User-agent: AI2Bot
Disallow: /
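A minimal static-resources mapping that exposes the classpath public/ directory at the server root might look like this (a sketch against Micronaut 4.x configuration keys; adjust the mapping if you already serve other static assets):

```yaml
# src/main/resources/application.yml
micronaut:
  router:
    static-resources:
      default:
        enabled: true
        mapping: /**
        paths: classpath:public
```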
Micronaut: src/main/resources/public/robots.txt → /robots.txt
Quarkus: src/main/resources/META-INF/resources/robots.txt → /robots.txt
Spring Boot: src/main/resources/static/robots.txt → /robots.txt

Layers 2, 3 & 4: HttpServerFilter
Implement HttpServerFilter and annotate with @Filter("/**"). Use Reactor (micronaut-reactor) for the reactive pipeline:
// src/main/java/com/example/filter/AiBotFilter.java
package com.example.filter;
import io.micronaut.http.HttpRequest;
import io.micronaut.http.HttpResponse;
import io.micronaut.http.HttpStatus;
import io.micronaut.http.MutableHttpResponse;
import io.micronaut.http.annotation.Filter;
import io.micronaut.http.filter.HttpServerFilter;
import io.micronaut.http.filter.ServerFilterChain;
import org.reactivestreams.Publisher;
import reactor.core.publisher.Mono;
import jakarta.inject.Singleton;
import java.util.List;
import java.util.Set;
@Singleton
@Filter("/**")
public class AiBotFilter implements HttpServerFilter {
private static final List<String> AI_BOTS = List.of(
"gptbot", "chatgpt-user", "claudebot", "anthropic-ai",
"ccbot", "cohere-ai", "bytespider", "amazonbot",
"applebot-extended", "perplexitybot", "youbot", "diffbot",
"google-extended", "deepseekbot", "mistralbot", "xai-bot",
"ai2bot", "oai-searchbot", "duckassistbot"
);
private static final Set<String> EXEMPT_PATHS = Set.of(
"/robots.txt", "/sitemap.xml", "/favicon.ico"
);
@Override
public Publisher<MutableHttpResponse<?>> doFilter(
HttpRequest<?> request, ServerFilterChain chain) {
String path = request.getPath();
// Exempt paths always pass through
if (EXEMPT_PATHS.contains(path)) {
return addRobotsTag(chain.proceed(request));
}
String ua = request.getHeaders().findFirst("User-Agent").orElse("").toLowerCase();
for (String bot : AI_BOTS) {
if (ua.contains(bot)) {
// Block — chain.proceed() is never called
return Mono.just(
HttpResponse.<String>status(HttpStatus.FORBIDDEN)
.body("Forbidden: AI crawlers are not permitted.")
);
}
}
// Pass through — add X-Robots-Tag to response
return addRobotsTag(chain.proceed(request));
}
private Publisher<MutableHttpResponse<?>> addRobotsTag(
Publisher<MutableHttpResponse<?>> responsePublisher) {
return Mono.from(responsePublisher)
.doOnNext(res -> res.header("X-Robots-Tag", "noai, noimageai"));
}
@Override
public int getOrder() {
// Lower = runs first. Default is 0. Run before authentication filters.
return -100;
}
}

Micronaut's MutableHttpResponse.header() mutates the response in place. Unlike PSR-7 (which requires $response = $response->withHeader(...)), you do not need to capture a new object.

Path-scoped filtering
Use Ant-style path patterns in the @Filter annotation to scope the filter to specific route prefixes:
// Block only on /api/** routes
@Singleton
@Filter("/api/**")
public class ApiAiBotFilter implements HttpServerFilter { ... }
// Block on multiple prefixes
@Singleton
@Filter({"/api/**", "/webhook/**"})
public class AiBotFilter implements HttpServerFilter { ... }
// Scope to specific HTTP methods (GET and HEAD only)
@Singleton
@Filter(value = "/api/**", methods = {HttpMethod.GET, HttpMethod.HEAD})
public class AiBotFilter implements HttpServerFilter { ... }

Filter execution order
Override getOrder() to control execution order. Lower numbers run first:
@Override
public int getOrder() {
return -100; // runs before default (0) and most auth filters
}
// Note: HttpServerFilter already extends Ordered, so overriding
// getOrder() on the filter itself is all that is required; there is
// no separate interface to implement.

Kotlin variant
In Kotlin, the same interface works with Reactor's Kotlin extensions:
// Kotlin
@Singleton
@Filter("/**")
class AiBotFilter : HttpServerFilter {
override fun doFilter(
request: HttpRequest<*>,
chain: ServerFilterChain
): Publisher<MutableHttpResponse<*>> {
val ua = request.headers["User-Agent"]?.lowercase() ?: ""
return if (AI_BOTS.any { it in ua }) {
Mono.just(
HttpResponse.status<String>(HttpStatus.FORBIDDEN)
.body("Forbidden: AI crawlers are not permitted.")
)
} else {
Mono.from(chain.proceed(request))
.doOnNext { it.header("X-Robots-Tag", "noai, noimageai") }
}
}
override fun getOrder(): Int = -100
}

Micronaut vs Quarkus vs Spring Boot — Java comparison
Micronaut — reactive Publisher return
// doFilter returns Publisher<MutableHttpResponse<?>>
if (isAiBot(ua))
return Mono.just(HttpResponse.status(HttpStatus.FORBIDDEN));
return Mono.from(chain.proceed(request))
.doOnNext(res -> res.header("X-Robots-Tag", "noai, noimageai"));

Quarkus RESTEasy Reactive — return Response or null
@ServerRequestFilter(preMatching = true)
public Response filterRequest(ContainerRequestContext ctx) {
if (isAiBot(ctx.getHeaderString("User-Agent")))
return Response.status(403).build();
return null; // continue
}

Spring Boot — synchronous doFilterInternal
@Override
protected void doFilterInternal(HttpServletRequest req,
HttpServletResponse res, FilterChain chain)
throws ServletException, IOException {
if (isAiBot(req.getHeader("User-Agent"))) {
res.sendError(403, "Forbidden"); return;
}
chain.doFilter(req, res);
}

Testing
Use @MicronautTest with the injected HttpClient:
// src/test/java/com/example/filter/AiBotFilterTest.java
package com.example.filter;
import io.micronaut.http.HttpRequest;
import io.micronaut.http.HttpStatus;
import io.micronaut.http.client.HttpClient;
import io.micronaut.http.client.annotation.Client;
import io.micronaut.http.client.exceptions.HttpClientResponseException;
import io.micronaut.test.extensions.junit5.annotation.MicronautTest;
import jakarta.inject.Inject;
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.*;
@MicronautTest
class AiBotFilterTest {
@Inject
@Client("/")
HttpClient client;
@Test
void testBlocksAiBot() {
HttpClientResponseException exception = assertThrows(
HttpClientResponseException.class,
() -> client.toBlocking().exchange(
HttpRequest.GET("/api/articles")
.header("User-Agent", "GPTBot/1.0")
)
);
assertEquals(HttpStatus.FORBIDDEN, exception.getStatus());
}
@Test
void testAllowsBrowser() {
var response = client.toBlocking().exchange(
HttpRequest.GET("/api/articles")
.header("User-Agent", "Mozilla/5.0 (compatible)")
);
assertEquals(HttpStatus.OK, response.getStatus());
assertEquals("noai, noimageai", response.header("X-Robots-Tag"));
}
}

AI bot User-Agent strings (2026)
Use .toLowerCase() before .contains() — Java String.contains() is case-sensitive.
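The matching logic can be factored into a small standalone helper (a sketch; isAiBot and the trimmed bot list are illustrative, and Locale.ROOT guards against locale-sensitive lowercasing such as the Turkish dotless i):

```java
import java.util.List;
import java.util.Locale;

public class BotMatcher {
    // Trimmed, illustrative list; the full list appears in the filter above.
    static final List<String> AI_BOTS = List.of(
            "gptbot", "claudebot", "ccbot", "bytespider", "perplexitybot");

    // Case-insensitive substring match; null-safe for requests with no UA header.
    static boolean isAiBot(String userAgent) {
        if (userAgent == null) return false;
        String ua = userAgent.toLowerCase(Locale.ROOT);
        return AI_BOTS.stream().anyMatch(ua::contains);
    }

    public static void main(String[] args) {
        System.out.println(isAiBot("Mozilla/5.0 (compatible; GPTBot/1.0)")); // true
        System.out.println(isAiBot("Mozilla/5.0 (Windows NT 10.0; Win64)")); // false
    }
}
```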