As digital analytics experts at FELD M, we've observed a significant shift in how users consume online content. It's no longer just humans browsing your pages and products: AI agents now act on behalf of humans, searching for information, evaluating products, and making decisions. To analyze your data effectively and understand what humans need versus what AI systems require, you need to track this AI bot traffic and distinguish it from human traffic.
FELD M web analytics experts Eric Böhme and Elisabeth Knapp presented on this topic in person at our last FELD M analytics meetup: Adobe edition in November 2025. If the topic is of interest to you, consider taking a look at our upcoming event prior to the Adobe Summit 2026, The Road to Summit, where we'll share similar insights.
AI traffic is growing at an unprecedented rate, and this growth is having a measurable impact on traditional metrics. Research has shown that click rates are declining as AI summaries become more prevalent in search results. In other words, when Google displays an AI-generated summary in search results, users are less likely to click through to individual websites.
Google refers to these as "higher quality clicks," suggesting that users who do click through are more engaged and intentional. However, this shift means that understanding the full picture of how your content is being consumed by humans and AI is crucial for sound decision-making.
By tracking AI traffic separately, you can:
Optimize your content strategy: Understand which content attracts AI crawlers and how to structure information for AI consumption
Maintain data accuracy: Separate bot traffic from human behavior to ensure your analytics reflect genuine user engagement
Prepare for the future: As AI search and AI agents become more prevalent, early adopters will have a competitive advantage
Understand consumption patterns: Identify how AI systems interpret and use your content
Not all AI traffic is created equal. There are three primary categories of AI bots you'll encounter:
The first category is training crawlers. These bots systematically scan websites to gather data for training large language models (LLMs). They're building the knowledge base that powers AI systems, and they typically:
Visit pages systematically and comprehensively
Focus on text-heavy, informational content
May revisit pages periodically to capture updates
The second category is search engine crawlers. These bots crawl content across the web in order to index it for inclusion in search results on engines like Google and Bing.
The third category is agentic, or RAG, crawlers. These perform live retrieval-augmented generation: they embed and index your content, compare it against a user's prompt to determine whether it's relevant, and use it to provide a tailored answer. These bots therefore:
Respond to immediate information requests
May visit pages more selectively based on queries
Are directly connected to user interactions with AI systems
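As a conceptual sketch of that retrieval step (not any vendor's actual pipeline), the snippet below compares a prompt vector against page vectors with cosine similarity and keeps the pages that clear a relevance threshold. The embed() function is a deliberately crude bag-of-words stand-in for a real embedding model, and the threshold is an arbitrary assumption:

```python
from math import sqrt

def embed(text, vocab):
    """Bag-of-words vector over a shared vocabulary (stand-in for a real
    embedding model)."""
    words = text.lower().split()
    return [float(words.count(w)) for w in vocab]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = sqrt(sum(x * x for x in a)), sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(prompt, pages, threshold=0.3):
    """Return URLs of pages similar enough to the prompt to ground an answer."""
    vocab = sorted({w for text in pages.values() for w in text.lower().split()}
                   | set(prompt.lower().split()))
    p = embed(prompt, vocab)
    return [url for url, text in pages.items()
            if cosine(p, embed(text, vocab)) >= threshold]

pages = {
    "/pricing": "subscription cost and pricing plans",
    "/blog/ai-traffic": "detecting ai bot traffic in adobe analytics",
}
print(retrieve("how much does a subscription cost", pages))  # → ['/pricing']
```

The point is the shape of the process, not the scoring: real systems use learned dense embeddings and far more sophisticated ranking, but the retrieve-then-answer flow is the same.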
Understanding the difference between these three types helps you optimize your content strategy and technical implementation.
As AI becomes the primary interface for information discovery, traditional SEO is evolving into what some call "GEO" (Generative Engine Optimization). The goal shifts from ranking in search results to being cited and referenced by AI systems.
It’s important to remember, though, that likely only a small percentage of your total traffic is coming from LLMs right now. We’d recommend doing an audit of your content so that you have a full picture before making any concrete decisions to keep or delete certain content.
As we’ll explain below, many of the key principles of GEO are in fact SEO principles wearing an attention-grabbing new hat. Content that succeeds in both arenas should deliver well-structured, readable information that directly answers the user’s question and provides value through experience, expertise, authoritativeness and trustworthiness (E-E-A-T).
The more you optimize for AI visibility, the more bot traffic you'll receive. This creates a positive feedback loop, but it requires a different approach to content creation and technical implementation.
Key aspects include:
Structured data: Ensure your content is well-organized and uses semantic HTML
Clear, authoritative information: AI systems prioritize reliable sources
Up-to-date content: Fresh information is more likely to be referenced
Technical accessibility: Make sure bots can easily crawl and understand your content
Now that we understand why AI traffic matters, let's explore practical methods for identifying and tracking it in Adobe Analytics and Customer Journey Analytics (CJA).
The most straightforward method for detecting AI traffic relies on three key data points:
Many AI bots will include identifiable referrer information that indicates they're coming from AI platforms or search engines. Look for referrers that contain:
Known AI platform domains
RAG system identifiers
LLM service endpoints
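As a sketch of the referrer check, the function below matches a hit's referrer host against a short list of AI platform domains. The list is illustrative, not exhaustive; you'd maintain your own based on what actually shows up in your reports:

```python
from urllib.parse import urlparse

# Illustrative AI platform referrer domains; keep this list under review.
AI_REFERRER_DOMAINS = {
    "chatgpt.com",
    "perplexity.ai",
    "gemini.google.com",
    "copilot.microsoft.com",
}

def is_ai_referrer(referrer):
    """Return True if the referrer hostname matches a known AI platform domain."""
    host = urlparse(referrer).hostname or ""
    return any(host == d or host.endswith("." + d) for d in AI_REFERRER_DOMAINS)

print(is_ai_referrer("https://chatgpt.com/c/abc123"))   # → True
print(is_ai_referrer("https://www.google.com/search"))  # → False
```

In Adobe Analytics the equivalent is a classification rule or segment on the Referrer / Referring Domain dimension rather than custom code, but the matching logic is the same.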
AI bots often declare themselves in the User Agent string. Common patterns you can look out for include:
"Bot" or "crawler" in the User Agent
Specific AI service identifiers (e.g., GPTBot, ClaudeBot)
LLM platform names
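A minimal sketch of User Agent matching along these lines, with a handful of example patterns. Real bot User Agents change frequently, so treat this list as a starting point rather than a canonical registry:

```python
import re

# Example patterns only; order matters, so specific bots come before the
# generic fallback.
AI_UA_PATTERNS = [
    r"GPTBot",              # OpenAI's training crawler
    r"ClaudeBot",           # Anthropic's crawler
    r"PerplexityBot",
    r"bot|crawler|spider",  # generic fallback for self-declared bots
]

def classify_user_agent(ua):
    """Return the first matching pattern, or None for a likely human."""
    for pattern in AI_UA_PATTERNS:
        if re.search(pattern, ua, re.IGNORECASE):
            return pattern
    return None

ua = "Mozilla/5.0; compatible; GPTBot/1.2; +https://openai.com/gptbot"
print(classify_user_agent(ua))  # → GPTBot
```

Note that the generic fallback only catches bots that declare themselves; evasive bots spoofing a browser User Agent need the behavioral methods described below.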
You can also implement custom tracking parameters to explicitly identify AI-driven traffic. This requires coordination with your development team but provides the most reliable identification method. Here’s an implementation example:
Add a URL parameter like ?source=ai for AI-referred traffic
Create a custom dimension in Adobe Analytics to capture this parameter
Build segments based on this dimension
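As a rough sketch of step 1, here is how the ?source=ai convention above could be read out. In production this would usually live in Adobe Launch / Tags as JavaScript feeding an eVar, but the parsing logic is the same:

```python
from urllib.parse import urlparse, parse_qs

def ai_source_dimension(url):
    """Extract the ?source= parameter so it can be mapped to a custom
    dimension (e.g. an eVar) in Adobe Analytics."""
    params = parse_qs(urlparse(url).query)
    return params.get("source", [None])[0]

print(ai_source_dimension("https://example.com/pricing?source=ai"))  # → ai
print(ai_source_dimension("https://example.com/pricing"))            # → None
```

The parameter name and value are conventions you agree on with whoever emits the AI-referred links, which is why this method needs coordination with your development team.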
For more sophisticated detection, create specialized segments in Adobe Analytics that identify bot-like behavior patterns:
Unusually fast page views: Bots often navigate much faster than humans
Non-standard session patterns: Lack of typical user hesitation or exploration
No interaction with dynamic elements: Missing clicks on buttons, forms, or interactive content
Straight-path navigation: Linear movement through content without typical user wandering
Missing events: Absence of scroll events, time-on-page metrics, or engagement indicators
Combine multiple behavioral indicators for higher accuracy
Test and refine thresholds based on your specific site patterns
Monitor false positives to avoid excluding legitimate users
Consider creating both "likely bot" and "definitely bot" segments
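As an illustration of combining these behavioral indicators, the sketch below scores a session on a few of the signals listed above and maps the score to the "likely bot" / "definitely bot" tiers. The signal names, weights, and thresholds are all assumptions you would tune against your own traffic patterns:

```python
def bot_score(session):
    """Score a session on bot-like signals; higher means more bot-like."""
    score = 0
    if session.get("avg_seconds_per_page", 999) < 2:  # unusually fast page views
        score += 2
    if not session.get("clicked_interactive", False):  # no clicks on dynamic elements
        score += 1
    if not session.get("scroll_events", 0):            # missing engagement events
        score += 1
    if session.get("linear_path", False):              # straight-path navigation
        score += 1
    return score

def classify_session(session):
    """Map the score to the two-tier segmentation described above."""
    score = bot_score(session)
    if score >= 4:
        return "definitely bot"
    if score >= 2:
        return "likely bot"
    return "human"

crawler = {"avg_seconds_per_page": 0.4, "clicked_interactive": False,
           "scroll_events": 0, "linear_path": True}
reader = {"avg_seconds_per_page": 45, "clicked_interactive": True,
          "scroll_events": 12, "linear_path": False}
print(classify_session(crawler))  # → definitely bot
print(classify_session(reader))   # → human
```

In Adobe Analytics you would express the same idea declaratively, as stacked segment conditions rather than code, but thinking of it as a score makes the threshold tuning and false-positive monitoring explicit.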
For organizations with a significant amount of AI traffic or complex needs, professional bot detection tools provide the most comprehensive solution:
Specialized vendors offer sophisticated bot detection that goes beyond basic user agent analysis, including features like:
Machine learning-based detection
Real-time identification and filtering
Detailed reporting on bot types and behavior
Integration with Adobe Analytics
These tools typically use multiple signals simultaneously, such as:
IP address patterns
Behavioral fingerprinting
Device and browser consistency checks
Network-level indicators
The following four factors can help you decide whether third-party tools should be your next step:
Your site receives substantial traffic (1M+ pageviews/month)
Bot traffic is significantly impacting your analytics accuracy
You need to protect your site against malicious bots (not just identify crawlers)
Compliance or security requirements demand precise identification
The way content is discovered and consumed online is undergoing a huge shift. While it remains uncertain whether AI search will completely replace traditional search, being prepared for this shift is essential.
By implementing proper AI traffic detection in Adobe Analytics, you can:
Maintain accurate analytics that reflect real human behavior
Optimize your content strategy for both human and AI audiences
Stay ahead of the curve
Make data-informed decisions about how to position your content
The environment is constantly changing, and early adopters who understand and adapt to AI traffic patterns will have a significant competitive advantage. Start with the basic detection methods, refine your approach based on your findings, and consider advanced solutions as your needs grow.
To learn more about AI traffic detection and optimization, we recommend these resources:
Need help implementing AI traffic detection in your Adobe Analytics setup? You can get in touch with us for support here. Our team of digital analytics experts can help you develop a comprehensive strategy for understanding and optimizing your AI traffic.