Measuring Results

How do you know if karmafarm.co is working? This guide covers multiple ways to measure your AI visibility and track leads from AI recommendations.

Method 1: Track AI Traffic in Google Analytics

AI tools like ChatGPT, Claude, and Perplexity send traffic to websites when they recommend products. You can track this in GA4.

Set Up an AI Traffic Channel

By default, GA4 lumps AI traffic into the “Referral” or “Direct” channels. Create a custom channel to separate it out:
  1. In GA4, go to Admin → Data Display → Channel Groups
  2. Create a new channel group or edit the default
  3. Add a new channel called “AI Referral”
  4. Set the condition to match session source using this regex:
(chatgpt\.com|chat\.openai\.com|perplexity\.ai|claude\.ai|gemini\.google\.com|copilot\.microsoft\.com|meta\.ai)
  5. Important: Drag your AI channel above the Referral channel. GA4 checks conditions from top to bottom.
Now AI traffic will appear as its own channel in your reports.
For a detailed walkthrough, see How to Track AI Traffic in GA4 by Orbit Media.
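Before saving the channel group, you can sanity-check the regex against a few sample session sources. A minimal sketch in Python (the sample sources are illustrative, not a complete list of what you will see in GA4):
```python
import re

# Same pattern used in the GA4 channel condition above
AI_SOURCES = re.compile(
    r"(chatgpt\.com|chat\.openai\.com|perplexity\.ai|claude\.ai|"
    r"gemini\.google\.com|copilot\.microsoft\.com|meta\.ai)"
)

# Illustrative session sources you might see in GA4 reports
samples = [
    "chatgpt.com",     # should match -> AI Referral
    "perplexity.ai",   # should match -> AI Referral
    "google",          # should NOT match -> stays Organic Search
    "l.facebook.com",  # should NOT match -> stays Referral
]

for source in samples:
    bucket = "AI Referral" if AI_SOURCES.search(source) else "other"
    print(f"{source:20} -> {bucket}")
```
If a source still lands in the wrong channel in your reports, re-check the channel ordering from step 5.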

What to Expect

AI traffic is still small but growing fast:
  • ChatGPT drives ~78% of AI referral traffic
  • Perplexity drives ~15%
  • Claude users have the highest session value ($4.56/visit)
  • AI visitors stay ~10 minutes per session on average

Limitations

Some AI traffic won’t be trackable:
  • ChatGPT’s mobile app often appears as “Direct” traffic
  • Users who copy/paste URLs lose referrer data
  • Some AI tools strip referrer headers
This means your actual AI traffic is likely higher than GA4 reports.

Method 2: Ask Users During Onboarding

The most reliable way to track AI-driven signups is to ask directly.

Add a “How did you hear about us?” Step

During onboarding, add an optional question: “How did you discover [Product]?” Options:
  • ChatGPT / Claude / AI assistant
  • Reddit
  • Google search
  • Twitter/X
  • Friend or colleague
  • Other: [text field]

Why This Works

  • Captures attribution that analytics miss
  • Users who come from AI are often happy to say so
  • Gives you qualitative data on which AI tools drive signups

Implementation Tips

  • Make it optional (don’t block signups)
  • Keep the list of options short (5-7 max)
  • Include “AI assistant” as a clear option
  • Track responses in your database for analysis (see the sketch below)
Companies like Writingmate.ai use this approach successfully in their onboarding flow.
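Here is a minimal sketch of storing the attribution answer for later analysis. It assumes a plain SQLite table and a hypothetical record_attribution helper called from your onboarding handler; table, column, and source names are illustrative, so adapt them to your own stack.
```python
import sqlite3
from datetime import datetime, timezone

# Illustrative schema; adapt to your own database and ORM.
conn = sqlite3.connect("app.db")
conn.execute(
    """CREATE TABLE IF NOT EXISTS signup_attribution (
           user_id TEXT PRIMARY KEY,
           source TEXT,      -- e.g. 'ai_assistant', 'reddit', 'google'
           detail TEXT,      -- free text from the 'Other' field
           created_at TEXT
       )"""
)

def record_attribution(user_id: str, source: str, detail: str = "") -> None:
    """Store the optional 'How did you discover us?' answer."""
    conn.execute(
        "INSERT OR REPLACE INTO signup_attribution VALUES (?, ?, ?, ?)",
        (user_id, source, detail, datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()

# Example: a user picked 'ChatGPT / Claude / AI assistant' during onboarding
record_attribution("user_123", "ai_assistant")

# Later: how many signups self-attributed to AI?
count = conn.execute(
    "SELECT COUNT(*) FROM signup_attribution WHERE source = 'ai_assistant'"
).fetchone()[0]
print(f"AI-attributed signups: {count}")
```
Keeping a timestamp with each answer makes it easy to chart AI-attributed signups month over month, which feeds the lagging indicators discussed later in this guide.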

Method 3: Customer Interviews

For deeper insights, ask new customers directly.

Questions to Ask

  • “When you were researching solutions, what tools did you use?”
  • “Did you ask ChatGPT or another AI for recommendations?”
  • “Where did you first hear about us?”
  • “What made you choose us over alternatives?”

What You’ll Learn

  • Which AI tools your audience actually uses
  • What prompts lead to your product being recommended
  • How AI recommendations compare to other channels
  • Objections or concerns that came up

Method 4: Monitor AI Recommendations Directly

Periodically test whether AI tools recommend your product.

Manual Testing

Ask AI tools questions your target audience would ask:
  • “What’s the best tool for [your category]?”
  • “What are alternatives to [competitor]?”
  • “[Problem you solve] - any recommendations?”
Track whether you’re mentioned and in what position.

Automated Monitoring

Tools exist to automate this monitoring at scale. Search for “AI brand monitoring” or “LLM mention tracking” solutions.
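As a rough illustration of what these tools do, the sketch below asks an LLM a category question and checks whether your brand appears in the answer. It assumes the OpenAI Python SDK with an OPENAI_API_KEY environment variable set; the model name, prompts, and brand name are placeholders.
```python
from openai import OpenAI  # pip install openai; assumes OPENAI_API_KEY is set

client = OpenAI()

BRAND = "YourProduct"  # placeholder brand name
PROMPTS = [            # questions your target audience might actually ask
    "What's the best tool for Reddit marketing?",
    "What are alternatives to CompetitorX?",
]

mentions = 0
for prompt in PROMPTS:
    # Ask the model the same way a prospective customer would
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # example model name
        messages=[{"role": "user", "content": prompt}],
    )
    answer = resp.choices[0].message.content or ""
    mentioned = BRAND.lower() in answer.lower()
    mentions += int(mentioned)
    print(f"{prompt!r}: {'mentioned' if mentioned else 'not mentioned'}")

print(f"Mention rate: {mentions}/{len(PROMPTS)}")
```
Keep in mind that API responses can differ from what the consumer chat apps say, and answers vary from run to run, so treat results like this as a directional signal rather than a precise measurement.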

Connecting the Dots

Combine these methods for a complete picture:
Method | Measures | Reliability
GA4 AI Channel | Website visits from AI | Medium (some traffic hidden)
Onboarding Question | Signups attributed to AI | High (self-reported)
Customer Interviews | Qualitative insights | High (but small sample)
Direct AI Testing | Whether you’re recommended | High (but point-in-time)

Leading vs Lagging Indicators

Leading indicators (track weekly):
  • Number of Reddit posts responded to
  • Share of voice vs competitors (a simple calculation is sketched below)
  • Posting streak and consistency
Lagging indicators (track monthly):
  • AI traffic in GA4
  • “AI” responses in onboarding attribution
  • Mention rate in direct AI testing
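For the share-of-voice number, one simple approach is to count how often each brand is mentioned across your batch of test prompts and take your share of the total. A minimal sketch (brand names and counts are made up for illustration):
```python
# Mention counts across a batch of test prompts (illustrative numbers)
mentions = {"YourProduct": 6, "CompetitorA": 10, "CompetitorB": 4}

total = sum(mentions.values())
share_of_voice = mentions["YourProduct"] / total if total else 0.0
print(f"Share of voice: {share_of_voice:.0%}")  # 6 / 20 = 30%
```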

Setting Expectations

AI visibility builds over time. Expect:
  • Weeks 1-4: Building presence, minimal measurable impact
  • Months 2-3: Starting to appear in some AI recommendations
  • Month 4+: Consistent AI traffic and attribution
The key is consistency. Keep posting helpful responses, and the AI recommendations will follow.