u/EmilleIrmsch

People have a lot of questions around AI visibility at the moment, and a lot of misinformation is being spread by people who don't fully understand what they're talking about. Not trying to talk shit about them or anything, this is still an early space, so we're all still learning. I'm a technical person and I've been doing extensive research on AEO and AI visibility tracking, so I'll try to answer some of the most common/interesting questions I've seen around this topic in a way that's easy to understand and hopefully clears up some of the confusion.

1. Measurability

Q: Can I measure if LLMs mention my brand? (1)
A: Not the same way as in SEO. LLMs are probabilistic, so responses can vary wildly even for the same prompt. This makes measuring AI visibility fundamentally different from measuring SEO visibility. In short: for SEO visibility, we check once "are we in the Google results, and if yes, at what position?". For LLM visibility, we check repeatedly whether we are mentioned or not, across many variations of the same prompts, and take the average of those results as our current share of voice for the LLMs tracked.
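The "check repeatedly, then average" idea can be sketched in a few lines. This is a minimal illustration, not any specific tool's implementation; in practice the `samples` list would come from querying the same prompt repeatedly through an LLM API:

```python
import re

def is_mentioned(brand: str, response: str) -> bool:
    # Word-boundary match so "Wise" doesn't accidentally match "clockwise"
    return re.search(rf"\b{re.escape(brand)}\b", response, re.IGNORECASE) is not None

def share_of_voice(brand: str, responses: list[str]) -> float:
    # Fraction of sampled responses that mention the brand
    if not responses:
        return 0.0
    return sum(is_mentioned(brand, r) for r in responses) / len(responses)

# Hypothetical responses to the same prompt, sampled multiple times
samples = [
    "Top picks: Wise and Revolut.",
    "You could try Revolut or PayPal.",
    "Wise is popular for transfers.",
    "Clockwise scheduling isn't relevant here.",
]
print(share_of_voice("Wise", samples))  # 0.5
```

Real trackers do roughly this at scale: many prompt variants, many runs per prompt, per model, averaged over time.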

Q: Can I attribute website visits to LLM referrals? (2)
A: In theory, yes. There are many tools that let you track this for your website. In practice, though, this doesn't work well on its own. LLMs often don't link to websites directly (this depends a lot on the specific model), so users just google your brand and it shows up as a regular Google visit instead of an AI referral. That means you'll see far fewer LLM referrals than there really are. It can still be interesting to see whether people already find you through LLMs, but it's not reliable on its own. To make it more accurate, you could add a "Where did you find us?" screen with "AI" as an option (ideally with the option to select the specific AI provider).

Q: Can I track if LLMs use my resources in their answers? (3)
A: Yes, AI visibility tracking tools like Peec, Columbus, Profound and others allow you to track which sources the LLMs use. But as in question (1), keep in mind that LLMs are probabilistic, so not only will their answers differ every time, the sources they use can also vary a lot. That said, we usually see clear winning pages for specific prompts, and that's where you can optimize. Your goal should be to get your pages into the sources LLMs use regularly for specific prompts.
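Finding those "winning pages" is essentially counting how often each source shows up across repeated runs of the same prompt. A toy sketch with made-up URLs:

```python
from collections import Counter

# Each inner list = the sources an LLM cited in one run of the same prompt
# (hypothetical URLs for illustration)
runs = [
    ["example.com/best-tools", "blog.example.org/review"],
    ["example.com/best-tools", "news.example.net/post"],
    ["example.com/best-tools", "blog.example.org/review"],
]

citation_counts = Counter(url for run in runs for url in run)
for url, n in citation_counts.most_common():
    print(f"{url}: cited in {n}/{len(runs)} runs")
```

A page cited in 3/3 runs is a stable winner; one cited in 1/3 may just be noise from the model's variance.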

Q: Can I do A/B testing in AEO? (4)
A: Not on a granular level, because measuring AI visibility can't be done by checking once "am I mentioned?"; it's done with continuous testing across multiple prompts (see question (1)). You can only change general things about your strategy, like writing style or FAQ placement in blog pages, and watch how your AI visibility develops over time. This takes a while, and you may see decreased visibility for some time in the process.

2. Optimization

Q: Can I even optimize for AI visibility? (5)
A: Yes, but it's hard. Your main objective is to get your pages into the sources LLMs use when answering specific prompts. How? That's unclear and varies wildly by AI platform. You can't go wrong with just doing good SEO, and quality content especially helps a lot. Actually put effort into your posts; backing them with real data you collected yourself is great. LLMs rely heavily on semantic understanding of content, so they often evaluate depth and clarity differently from traditional ranking systems.

Q: Can I optimize for LLMs in general? (6)
A: No. The way different LLMs from different providers answer prompts and cite sources varies a lot. Your best bet is to identify where you already perform well, then double down on that one provider.

Q: How big is Reddit's role in AEO? (7)
A: Completely depends on your niche. There is no general answer, you need to track it for your own industry, niche and prompts. For example, based on our research with Columbus, prompts asking for fintech solutions like Wise or Remitly almost never result in Reddit or other UGC platforms being cited in any of the six LLMs we tracked, while Reddit and YouTube were the #1 and #2 most cited sources in prompts asking for SEO solutions.

3. Technical

Q: How do LLMs actually search? (8)
A: Depends on the LLM, but most of them use some kind of search tool. Behind the scenes they use something like the Google Search API or Bing Search. So yes, they essentially "just google" like many people in this space have already said. That layer is what you're already optimizing for with SEO. For AEO you need to focus on the query fan-out (see question 9), since those are queries humans usually don't search for, so you're probably not optimizing for them in your SEO yet.

Q: What is "Query Fan-Out"? (9)
A: When you ask an LLM something it can't answer properly from its training data alone, it will try to get more context by searching the internet. It uses some underlying API like the Google Search API or Bing Search to get results. But what does it search for? It compresses your prompt into one or multiple search queries that it hands to the search API to retrieve the best results for your question. These generated queries are what we call the query fan-out. They can be things no human ever searches for, so they won't show up as keywords to optimize for in SEO tools, but they are very important for AEO. Unfortunately, AI providers usually don't expose their query fan-out; only some platforms, like Claude and Perplexity, show it at the moment.
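The prompt-to-queries-to-results flow can be sketched like this. Everything here is a placeholder: `generate_queries` stands in for the model's internal query generation (the hard-coded queries are made up), and `fake_search` stands in for a real search API call:

```python
def generate_queries(prompt: str) -> list[str]:
    # In a real system the LLM generates these from your prompt;
    # hard-coded hypothetical examples here for illustration
    return [
        "best international money transfer fees comparison",
        "Wise vs Remitly exchange rate markup",
    ]

def answer_with_fanout(prompt, search):
    results = []
    for query in generate_queries(prompt):  # the "fan-out": several queries per prompt
        results.extend(search(query))       # e.g. a Google/Bing search API call
    # The model then writes its answer grounded in `results`,
    # citing some of them as sources
    return results

fake_search = lambda q: [f"result for: {q}"]
for r in answer_with_fanout("What's the cheapest way to send money abroad?", fake_search):
    print(r)
```

The AEO takeaway: your page needs to rank for those generated queries, not just for the human-phrased prompt.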

Q: Why can't we just track visibility in AI like we can in search engines? (10)
A: Because they are fundamentally different things. LLMs are highly probabilistic, while search engines are comparatively consistent in their outputs. Think of it like this: when a search engine receives a query, it always executes the same code to produce a set of results; the only variable part is the ranking algorithm that decides which page appears at which position. That's easy to track: check once and you immediately know your current performance, and it won't change significantly when you run the same query again. When an LLM receives a query, on the other hand, it doesn't just execute fixed code. It runs your prompt through billions of parameters and predicts the best possible answer token by token based on its training. The sheer amount of computation and possible answers creates a lot of room for variance in the response, which makes it hard to predict whether your brand will be mentioned in the next response for the same prompt based on a single previous one. See question (1) for how AI visibility should actually be tracked.
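You can see why a single check misleads with a tiny simulation. Assume (hypothetically) your brand has a 60% chance of being mentioned per response; individual checks will disagree with each other, while the average over many runs settles near the true rate:

```python
import random

random.seed(0)         # fixed seed so the simulation is repeatable
P_MENTION = 0.6        # hypothetical true chance the LLM mentions your brand

def sample_run() -> bool:
    # Stand-in for "ask the LLM once and check the response for a mention"
    return random.random() < P_MENTION

single_checks = [sample_run() for _ in range(5)]
print(single_checks)   # a mix of True and False: one check tells you little

estimate = sum(sample_run() for _ in range(200)) / 200
print(round(estimate, 2))  # the averaged estimate usually lands close to 0.6
```

This is exactly the search-engine contrast: rerunning a Google query gives you the same answer, rerunning an LLM prompt gives you a new coin flip.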

And finally:

Q: Is AEO replacing SEO? (11)
A: No. AEO is built on top of SEO. If you’re not visible in search, you’re unlikely to be visible in LLM retrieval either.
