How to detect AI traffic in Adobe Analytics: A practical guide | FELD M

Written by Eric Böhme, Elisabeth Knapp | Mar 4, 2026 11:17:08 AM

As digital analytics experts at FELD M, we've observed a significant shift in how users are consuming online content. It’s no longer just humans browsing your pages and products. AI agents are now acting on behalf of humans, searching for information, evaluating products, and making decisions. To analyze your data effectively and understand what humans need versus what AI systems require, you need to detect AI bot traffic and distinguish it from human traffic.

FELD M web analytics experts Eric Böhme and Elisabeth Knapp presented on this topic in person at our last FELD M analytics meetup: Adobe edition in November 2025. If the topic is of interest to you, consider taking a look at our upcoming event prior to the Adobe Summit 2026, The Road to Summit, where we'll share similar insights.

AI traffic detection matters because the landscape of web traffic is changing

AI traffic is growing at an unprecedented rate, and this growth is having a measurable impact on traditional metrics. Research has shown that click rates are declining as AI summaries become more prevalent in search results. In other words, when Google displays an AI-generated summary in search results, users are less likely to click through to individual websites.

Google refers to these as "higher quality clicks," suggesting that users who do click through are more engaged and intentional. However, this shift means that understanding the full picture of how your content is being consumed by humans and AI is crucial for sound decision-making.

 

Why is tracking AI traffic separately important?

By tracking AI traffic separately, you can:

  • Optimize your content strategy: Understand which content attracts AI crawlers and how to structure information for AI consumption

  • Maintain data accuracy: Separate bot traffic from human behavior to ensure your analytics reflect genuine user engagement

  • Prepare for the future: As AI search and AI agents become more prevalent, early adopters will have a competitive advantage

  • Understand consumption patterns: Identify how AI systems interpret and use your content

Understanding the types of AI traffic

Not all AI traffic is created equal. There are three primary categories of AI bots you'll encounter:

 

1. Research: Collects data for model training

These bots systematically scan websites to gather data for training large language models (LLMs). They're building the knowledge base that powers AI systems, and they typically:

  • Visit pages systematically and comprehensively

  • Focus on text-heavy, informational content

  • May revisit pages periodically to capture updates

2. Search: Collects data for search engines

These bots crawl content across the web to index it for inclusion in search results on search engines like Google and Bing.

 

3. Answer: Real-time agentic crawlers

These agentic crawlers perform live retrieval-augmented generation: they embed or index page content, compare it against a user's prompt to determine whether it's relevant, and provide a tailored answer. These bots therefore:

  • Respond to immediate information requests

  • May visit pages more selectively based on queries

  • Are directly connected to user interactions with AI systems

Understanding the difference between these three types helps you optimize your content strategy and technical implementation.
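As a starting point, the three types above can be recognized by the User-Agent tokens that many AI vendors publicly document. The following sketch maps a few well-known tokens to the categories described here; the list and its category assignments are assumptions based on vendor documentation, not an exhaustive or authoritative registry, so verify each token against the vendor's own crawler docs before relying on it.

```python
# Illustrative mapping of publicly documented AI crawler User-Agent tokens
# to the three traffic types described above. Assumption: this list is a
# starter set, not exhaustive -- check each vendor's documentation.
AI_BOT_CATEGORIES = {
    "GPTBot": "research",          # OpenAI model-training crawler
    "ClaudeBot": "research",       # Anthropic model-training crawler
    "OAI-SearchBot": "search",     # OpenAI search indexing
    "PerplexityBot": "search",     # Perplexity search indexing
    "ChatGPT-User": "answer",      # live retrieval on behalf of a user
    "Perplexity-User": "answer",   # live retrieval on behalf of a user
}

def classify_ai_bot(user_agent: str):
    """Return 'research', 'search', or 'answer' for a known AI bot, else None."""
    ua = user_agent.lower()
    for token, category in AI_BOT_CATEGORIES.items():
        if token.lower() in ua:
            return category
    return None
```

A classification dimension like this can feed a classification rule set or a processing rule in Adobe Analytics, so each hit carries its bot category from the moment it's collected.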

 

How to stay visible in the AI era

As AI becomes the primary interface for information discovery, traditional SEO is evolving into what some call "GEO" (Generative Engine Optimization). The goal shifts from ranking in search results to being cited and referenced by AI systems.

It’s important to remember, though, that only a small percentage of your total traffic is likely coming from LLMs right now. We’d recommend auditing your content first, so that you have a full picture before making any concrete decisions about which content to keep or remove.

As we’ll explain below, many of the key principles of GEO are in fact SEO principles wearing an attention-grabbing new hat. Content that succeeds in both arenas should deliver well-structured, readable information that directly answers the user’s question and provides value through experience, expertise, authoritativeness and trustworthiness (E-E-A-T).

 

Adapting your strategy

The more you optimize for AI visibility, the more bot traffic you'll receive. This creates a positive feedback loop, but it requires a different approach to content creation and technical implementation. 

Key aspects include:

  • Structured data: Ensure your content is well-organized and uses semantic HTML

  • Clear, authoritative information: AI systems prioritize reliable sources

  • Up-to-date content: Fresh information is more likely to be referenced

  • Technical accessibility: Make sure bots can easily crawl and understand your content

Methods for detecting AI traffic in Adobe Analytics

Now that we understand why AI traffic matters, let's explore practical methods for identifying and tracking it in Adobe Analytics and Customer Journey Analytics (CJA).

 

The basic approach

The most straightforward method for detecting AI traffic relies on three key data points:

 

1. Referrer analysis

Many AI bots will include identifiable referrer information that indicates they're coming from AI platforms or search engines. Look for referrers that contain:

  • Known AI platform domains

  • RAG system identifiers

  • LLM service endpoints
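A minimal referrer check can be sketched as follows. The domain list is a hypothetical starter set; extend it as new AI platforms show up in your own referrer reports.

```python
from urllib.parse import urlparse

# Assumption: a non-exhaustive starter list of AI platform referrer domains.
AI_REFERRER_DOMAINS = {
    "chatgpt.com",
    "chat.openai.com",
    "perplexity.ai",
    "gemini.google.com",
    "copilot.microsoft.com",
}

def is_ai_referrer(referrer: str) -> bool:
    """True if the referrer host matches, or is a subdomain of, a known AI platform."""
    host = urlparse(referrer).netloc.lower()
    return any(host == d or host.endswith("." + d) for d in AI_REFERRER_DOMAINS)
```

In Adobe Analytics, the equivalent logic is a segment or classification on the Referrer (or Referring Domain) dimension using the same domain list.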

2. User agent strings

AI bots often declare themselves in the User Agent string. Common patterns you can look out for include:

  • "Bot" or "crawler" in the User Agent

  • Specific AI service identifiers (e.g., GPTBot, ClaudeBot)

  • LLM platform names
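The patterns above can be combined into a single case-insensitive check. The specific service identifiers included here are assumptions drawn from publicly documented crawlers; the generic `bot|crawler|spider` terms already catch self-declaring bots such as GPTBot.

```python
import re

# Generic bot terms plus AI service identifiers that do not contain "bot".
# Assumption: identifier list is illustrative, not exhaustive.
BOT_PATTERN = re.compile(
    r"bot|crawler|spider|ChatGPT-User|Claude-User|Perplexity-User",
    re.IGNORECASE,
)

def looks_like_bot(user_agent: str) -> bool:
    """True if the User-Agent string matches a known bot pattern."""
    return bool(BOT_PATTERN.search(user_agent))
```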

3. Custom parameters

You can also implement custom tracking parameters to explicitly identify AI-driven traffic. This requires coordination with your development team but provides the most reliable identification method. Here’s an implementation example:

  • Add a URL parameter like ?source=ai for AI-referred traffic

  • Create a custom dimension in Adobe Analytics to capture this parameter

  • Build segments based on this dimension
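The parameter check itself is trivial; the sketch below shows the server- or tag-side logic for the hypothetical `?source=ai` parameter. In Adobe Analytics you would typically capture this with a query-string data element mapped to a custom eVar rather than custom code.

```python
from urllib.parse import urlparse, parse_qs

def ai_source_from_url(url: str):
    """Return 'ai' if the hypothetical ?source=ai parameter is present, else None."""
    params = parse_qs(urlparse(url).query)
    value = params.get("source", [None])[0]
    return value if value == "ai" else None
```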

Advanced segmentation

For more sophisticated detection, create specialized segments in Adobe Analytics that identify bot-like behavior patterns:

 

Behavioral indicators
  • Unusually fast page views: Bots often navigate much faster than humans

  • Non-standard session patterns: Lack of typical user hesitation or exploration

  • No interaction with dynamic elements: Missing clicks on buttons, forms, or interactive content

  • Straight-path navigation: Linear movement through content without typical user wandering

  • Missing events: Absence of scroll events, time-on-page metrics, or engagement indicators

How to create effective segments
  1. Combine multiple behavioral indicators for higher accuracy

  2. Test and refine thresholds based on your specific site patterns

  3. Monitor false positives to avoid excluding legitimate users

  4. Consider creating both "likely bot" and "definitely bot" segments
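The indicators and the two-tier segmentation can be sketched as a simple rule-based score over session-level data. All thresholds, weights, and field names here are illustrative assumptions; tune them against your own site's traffic patterns, exactly as step 2 above recommends.

```python
# Assumption: session dicts carry pre-aggregated metrics with these
# hypothetical field names; thresholds and weights are illustrative.
def bot_score(session: dict) -> int:
    """Combine behavioral indicators into a rough bot-likelihood score."""
    score = 0
    if session.get("avg_seconds_per_page", 60) < 2:   # unusually fast page views
        score += 2
    if session.get("click_events", 0) == 0:           # no interaction with dynamic elements
        score += 1
    if session.get("scroll_events", 0) == 0:          # missing engagement events
        score += 1
    if session.get("linear_path", False):             # straight-path navigation
        score += 1
    return score

def segment(session: dict) -> str:
    """Map the score onto 'definitely bot' / 'likely bot' / 'human' tiers."""
    score = bot_score(session)
    if score >= 4:
        return "definitely bot"
    if score >= 2:
        return "likely bot"
    return "human"
```

Running human-labelled sessions through `segment` is a quick way to monitor false positives (step 3) before applying the equivalent segment definitions in Adobe Analytics.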

 

Professional tools can help in more complex environments

For organizations with a significant amount of AI traffic or complex needs, professional bot detection tools provide the most comprehensive solution:

 

Third-party solutions

Specialized vendors offer sophisticated bot detection that goes beyond basic user agent analysis, including features like:

  • Machine learning-based detection

  • Real-time identification and filtering

  • Detailed reporting on bot types and behavior

  • Integration with Adobe Analytics

These tools typically use multiple signals simultaneously, such as:

  • IP address patterns

  • Behavioral fingerprinting

  • Device and browser consistency checks

  • Network-level indicators

Do I need professional tools?

The following four factors can help you decide whether looking at third-party tools should be your next step:

  1. Your site receives substantial traffic (1M+ pageviews/month)

  2. Bot traffic is significantly impacting your analytics accuracy

  3. You need to protect your site against malicious bots (not just identify crawlers)

  4. Compliance or security requirements demand precise identification

 

Conclusion

The way content is discovered and consumed online is undergoing a huge shift. While it remains uncertain whether AI search will completely replace traditional search, being prepared for this shift is essential.

By implementing proper AI traffic detection in Adobe Analytics, you can:

  1. Maintain accurate analytics that reflect real human behavior

  2. Optimize your content strategy for both human and AI audiences

  3. Stay ahead of the curve

  4. Make data-informed decisions about how to position your content

The environment is constantly changing, and early adopters who understand and adapt to AI traffic patterns will have a significant competitive advantage. Start with the basic detection methods, refine your approach based on your findings, and consider advanced solutions as your needs grow.

 

Additional resources

To learn more about AI traffic detection and optimization, we recommend these resources:

  • Bots & Analytics II: From Filtering Out Bots to Filtering In Humans - A comprehensive look at evolving bot detection strategies
  • Tracking and Analyzing LLM and AI-Generated Traffic in Adobe - Official Adobe guidance on AI traffic tracking
  • Adobe Analytics: AI Traffic Technote - Technical documentation from Adobe
  • Measuring AI Referral Traffic in Web Analytics - Practical implementation guide
  • AI Crawler Playbook 2025 - How to recognize AI bots and secure their traffic (German)

Need help implementing AI traffic detection in your Adobe Analytics setup? You can get in touch with us for support here. Our team of digital analytics experts can help you develop a comprehensive strategy for understanding and optimizing your AI traffic.