Generative engines (ChatGPT-like assistants, AI answer boxes, and "AI search" experiences) don't behave like classic search. You're no longer just competing for a blue-link ranking; you're competing to be understood, trusted, and repeated accurately inside an answer.
That changes what "brand reputation tracking" means. It's no longer only reviews, PR, and SERP rankings; it's also how generative engines describe, recommend, and compare your brand inside their answers.
Below is a practical, SEO-manager-friendly framework to track reputation on generative engines—plus how Luciqo.ai fits as an end-to-end workflow to monitor, benchmark, and improve what these engines “believe” about your brand.
Key takeaways
Tracking reputation on generative engines is visibility + sentiment + accuracy + competitive context.
What counts as “brand reputation” in generative engines?
In traditional SEO, reputation tracking often focuses on reviews, press coverage, and branded SERP rankings.
In generative engines, the equivalent signals show up differently. A model may summarise dozens of sources and produce a single paragraph that becomes the buyer's "truth". So you're measuring whether you're mentioned at all, whether you're recommended, how you're characterised (sentiment and stance), whether the facts are accurate, and how you compare to competitors.
Step 1: Build a prompt library that mirrors real intent
If you only test one or two prompts, you’ll get misleading results. You need a structured set of prompts, grouped by intent, that you can run repeatedly.
A good starting library includes:
Category discovery (top-of-funnel)
Comparison (mid-funnel)
Risk & trust (high sensitivity)
Local / availability / suitability (conversion)
Important: Keep the prompt library stable over time so you can measure movement. Add new prompts, but don’t constantly replace the core set—or you’ll lose trend comparability.
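As a minimal sketch of what a stable, version-controlled core set could look like (every prompt wording and bracketed placeholder below is illustrative, not from the article), the library can live as plain data so each monthly run uses the identical prompts:

```python
# Illustrative prompt library, grouped by the funnel intents above.
# All prompt texts are placeholders -- swap in your own category, brand,
# and competitor terms before running.
PROMPT_LIBRARY = {
    "category_discovery": [       # top-of-funnel
        "What are the best tools for [category]?",
        "How do companies usually solve [problem]?",
    ],
    "comparison": [               # mid-funnel
        "Compare [Brand] vs [Competitor] for [use case].",
        "Which [category] vendor is best for a small team?",
    ],
    "risk_trust": [               # high sensitivity
        "Is [Brand] legitimate and safe to use?",
        "What are common complaints about [Brand]?",
    ],
    "local_suitability": [        # conversion
        "Does [Brand] serve customers in [region]?",
        "Is [Brand] suitable for [industry] compliance needs?",
    ],
}

def core_set():
    """Flatten the library into (intent, prompt) pairs for a monthly run."""
    return [(intent, p)
            for intent, prompts in PROMPT_LIBRARY.items()
            for p in prompts]
```

Keeping the library as data (rather than ad-hoc typing) is what makes the "don't replace the core set" rule enforceable: additions go in, the originals stay put.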
Step 2: Define the metrics you’ll track (make it measurable)
To track “reputation” properly, you need a few metrics that can be recorded consistently. Here’s a practical scorecard model:
A) Brand Mention Rate (BMR)
% of prompts where your brand is mentioned
If you’re not mentioned, reputation is irrelevant—you’re invisible.
B) Recommendation Rate (RR)
% of prompts where your brand is actively recommended
Mentions can be incidental; recommendations signal trust.
C) Sentiment / Stance (qualitative + quantitative)
Track the stance of each mention (positive / neutral / negative) and how the engine frames you: as a leader, as one option among many, or with caveats attached.
D) Accuracy & Consistency Log
Record factual errors (outdated services, wrong pricing or positioning) and inconsistencies between engines or between repeated runs of the same prompt.
E) Competitive Share of Voice (AI-SOV)
For each prompt set, measure which brands appear, how often each one is recommended, and your share of total brand mentions versus competitors.
F) Time-to-Resolve (operational KPI)
How long does it take you to detect a wrong or negative output, ship the content or listing fix that addresses it, and see the engine's answer change?
These metrics turn reputation tracking into a proper SEO process you can report on and improve.
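A hedged sketch of how the scorecard could be computed from logged run results. The record fields (`mentioned`, `recommended`, `brands_mentioned`) are assumed names for illustration, not a schema defined in this article:

```python
from collections import Counter

def brand_mention_rate(runs):
    """BMR: share of prompts where the brand is mentioned at all."""
    return sum(r["mentioned"] for r in runs) / len(runs)

def recommendation_rate(runs):
    """RR: share of prompts where the brand is actively recommended."""
    return sum(r["recommended"] for r in runs) / len(runs)

def ai_share_of_voice(runs, brand):
    """AI-SOV: the brand's share of all brand mentions across a prompt set."""
    mentions = Counter()
    for r in runs:
        mentions.update(r["brands_mentioned"])
    total = sum(mentions.values())
    return mentions[brand] / total if total else 0.0

# Hypothetical logged results for three prompts ("Acme" is a placeholder brand).
runs = [
    {"mentioned": True,  "recommended": True,  "brands_mentioned": ["Acme", "Rival"]},
    {"mentioned": True,  "recommended": False, "brands_mentioned": ["Acme", "Rival", "Other"]},
    {"mentioned": False, "recommended": False, "brands_mentioned": ["Rival"]},
]
print(brand_mention_rate(runs))         # mentioned in 2 of 3 prompts
print(ai_share_of_voice(runs, "Acme"))  # our slice of all brand mentions
```

The point of boiling the metrics down to ratios is comparability: the same numbers can be recorded every month and trended, regardless of how many prompts the library grows to.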
Step 3: Collect evidence in a way you can audit later
Generative outputs can change with time, context, and phrasing. So your tracking should produce an evidence trail: the exact prompt, the date, the engine (and model, where visible), and a verbatim copy or screenshot of the answer.
This matters for internal reporting (“why did we drop?”), and it helps you avoid fuzzy discussions like “I feel like AI is ignoring us.”
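One way to keep that evidence trail auditable (a sketch; the field names are illustrative choices, not a prescribed format) is an append-only JSONL file with one record per prompt run:

```python
import datetime
import json

def log_run(path, engine, prompt, answer, notes=""):
    """Append one evidence record: which engine was asked what, when,
    and the verbatim answer (or a pointer to a screenshot)."""
    record = {
        "date": datetime.date.today().isoformat(),
        "engine": engine,    # e.g. "chatgpt", "ai-overview" (illustrative labels)
        "prompt": prompt,
        "answer": answer,    # verbatim text, or a path to a saved screenshot
        "notes": notes,      # e.g. "wrong pricing", "competitor recommended"
    }
    with open(path, "a", encoding="utf-8") as f:
        f.write(json.dumps(record) + "\n")
```

Append-only logs are deliberately boring: nothing is overwritten, so when someone asks "why did we drop?" three months later, the answer at the time is still on disk.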
Step 4: Diagnose why the engine is portraying you that way
When outputs are wrong or negative, the root cause is usually one of these:
Inconsistent positioning: your brand is not consistently described (different taglines, shifting service descriptions, vague positioning).
Conflicting owned content: the homepage says one thing, service pages say another, LinkedIn says something else.
Stale third-party sources: old directory listings, outdated articles, forum threads, reviews, or low-quality citations can shape the model's summary.
Missing answers: you haven't published clear answers to the questions people actually ask (pricing approach, process, compliance stance, guarantees, exclusions).
The fixes follow directly: clearer FAQs, a stronger review footprint, more consistent descriptions, more structured content.
Tracking is only valuable if it leads to action: fix the narrative inputs rather than just complaining about the outputs.
Where Luciqo.ai fits as the practical solution
Doing all of the above manually becomes a spreadsheet-heavy routine: prompt tracking, evidence capture, sentiment notes, competitor benchmarking, monthly reporting, and follow-up tasks.
Luciqo.ai is positioned to make this operational for SEO managers by bringing reputation and generative visibility into one workflow. In practice, that means it can help you:
1) Standardise monitoring across prompts and personas
Instead of ad-hoc testing, you run a repeatable prompt set (your library) and track how your brand appears across engines, prompt categories, and buyer personas.
2) Track brand mentions, sentiment signals, and inconsistencies
You want to spot patterns like recurring factual errors, sentiment that differs by engine, or brand descriptions that drift between runs.
A tool-based workflow makes it easier to log these issues, trend them, and assign fixes.
3) Benchmark against competitors (AI-SOV)
GEO reputation is relative. If competitors are consistently recommended in the same prompts where you're absent, that's a clear strategic signal: the gap is in the sources the engines draw on, and it tells you exactly which prompts and content to prioritise.
4) Connect reputation signals to outcomes
Reputation tracking becomes far more valuable when you can connect it to real business impact (leads, conversions, pipeline quality). If your team already lives in analytics and CRM tools, it's useful to bring reputation monitoring closer to those reporting rhythms, so you can say, for example: "our recommendation rate on comparison prompts rose this quarter, and demo requests from that segment rose with it."
(You don’t need to overclaim causality; you’re looking for directional evidence.)
5) Produce reporting that stakeholders actually understand
Most stakeholders don't want a theory lecture about LLMs. They want clear answers to a few questions: are we visible, are we described accurately, are we recommended, and are we gaining or losing ground against competitors?
A dedicated workflow helps you generate clean monthly reporting without reinventing the wheel.
A simple monthly routine you can run (starting tomorrow)
In short: run your core prompt library across the engines that matter, log BMR, RR, sentiment, and AI-SOV, capture evidence for anything wrong or negative, diagnose the root cause, assign fixes, and report the trend.
If you want this to be scalable (and not a monthly headache), that's where Luciqo.ai earns its place: turning reputation tracking into an ongoing GEO operations loop rather than an occasional manual check.
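To make the monthly loop reportable, a minimal sketch (metric names and the significance threshold are assumptions for illustration) that compares this month's scorecard to last month's and flags meaningful movement:

```python
def month_over_month(previous, current, threshold=0.05):
    """Return only the metrics that moved by at least `threshold`
    since the previous period -- the movements worth reporting."""
    changes = {}
    for metric, now in current.items():
        before = previous.get(metric)
        if before is None:
            continue  # new metric this month; nothing to compare against
        delta = now - before
        if abs(delta) >= threshold:
            changes[metric] = round(delta, 3)
    return changes

# Hypothetical scorecards from two consecutive monthly runs.
last_month = {"BMR": 0.60, "RR": 0.25, "AI_SOV": 0.30}
this_month = {"BMR": 0.70, "RR": 0.24, "AI_SOV": 0.22}
print(month_over_month(last_month, this_month))  # BMR up, AI_SOV down; RR within noise
```

Filtering by a threshold keeps the stakeholder report focused on real movement rather than run-to-run noise, which matters because generative outputs naturally vary between runs.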
Bottom line
Tracking brand reputation on generative engines is not guesswork; it's a measurable discipline built from prompt-based monitoring, sentiment and accuracy logging, competitive benchmarking, and operational follow-through.
If you approach it like a proper SEO system, and use a tool like Luciqo.ai to keep it consistent, you’ll be able to do what most brands can’t yet do: prove how AI engines describe you, spot reputation risks early, and steadily improve the narrative buyers are reading.