Answer Engine Optimization · AI citation tracking

See if customers are finding you on AI search

AI engines answer your customers’ questions every day. AuditAE tells you who they link to — and whether your brand is even on the page.

Live demo

One prompt × 4 engines. Free — no credit card.

No signup needed for the demo
$5 free credit on signup · 100 free checks
Pay per check
Tracked across ChatGPT, Perplexity, Gemini, and Google AI Overviews
The unmeasured loss

AI is answering for your customers. You can’t see what it says.

ChatGPT hit 700M weekly active users as of February 2026. Perplexity has 30M+ monthly users. 60% of Google searches end without a click (77% on mobile). Each one is a moment where someone asked about your category, got an answer, and moved on. Your analytics never registered the question. Your CRM never logged the buyer.

Whether your brand was mentioned

The floor — does the engine name you at all when asked about your category?

Whether it was cited as a source

Different problem, bigger gap. Mentioned ≠ linked. Most brands fail here.

Which competitors got the link credit

The part that compounds. Engines learn from sources they already cite.

Old SEO measured clicks. AEO measures answers.

Stats current as of Q1 2026 — sources: OpenAI, Perplexity, SparkToro.

How it works

How AuditAE checks ChatGPT, Perplexity, Gemini, and Google AI Overviews

You give us a brand, a domain, and the prompts your customers actually type. We send each prompt to all four engines the way a real user would, capture every answer, extract every source URL, and tell you whether your brand got mentioned, cited, or quietly skipped.

01

Pick prompts

Start with 5–10 questions your buyers actually ask. We surface templates by category if you don't know where to start.

02

Run the audit

Every (prompt × engine) pair runs in parallel. Five prompts × four engines = 20 checks, ~3 minutes, $1.00.

03

Read the gap

Per-prompt grid, share-of-voice panel, and a competitor split that shows where the link credit actually went.
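The billing math in step 02 is simple enough to sketch. A minimal example, assuming the advertised $0.05-per-check rate (function and variable names here are illustrative, not AuditAE's API):

```python
# Sketch of the audit math: each (prompt x engine) pair is one billable check.
PRICE_PER_CHECK = 0.05  # advertised pay-per-check rate
ENGINES = ["ChatGPT", "Perplexity", "Gemini", "Google AI Overviews"]

def audit_cost(num_prompts: int, num_engines: int = len(ENGINES)) -> tuple[int, float]:
    """Return (number of checks, total cost in dollars) for one audit run."""
    checks = num_prompts * num_engines
    return checks, checks * PRICE_PER_CHECK

checks, cost = audit_cost(5)
print(f"{checks} checks, ${cost:.2f}")  # 5 prompts x 4 engines -> 20 checks, $1.00
```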

A real audit

Notion is mentioned in 17 of 20 AI answers.
Cited as a source: 0 times.

That’s the gap most teams don’t measure. ChatGPT, Perplexity, Gemini, and Google’s AI Overviews all describe Notion when asked about team note-taking, wikis, or PKM tools — but the link credit goes to Slite, third-party roundups, and Reddit threads. Brand visibility looks healthy. Brand attribution is zero. AuditAE shows you both.

Audit results · May 6, 2026

Notion

notion.so
Mention rate
85%
Prompts
5
Engines
4
Mentions returned
17

Mention rate by engine

ChatGPT · 80%
4 / 5 mentioned
Perplexity · 80%
4 / 5 mentioned
Gemini · 80%
4 / 5 mentioned
Google AI Overviews · 100%
5 / 5 mentioned

Share of voice

How often each domain was cited across all queried engines.

  • notion.so (Notion) · 0 cites · 0%
  • obsidian.md · 1 cite · 9%
  • evernote.com · 3 cites · 27%
  • atlassian.com · 1 cite · 9%
  • slite.com · 6 cites · 55%
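The share-of-voice percentages are just each domain's cites over the total. A quick sketch, with the domain counts copied from the sample audit above:

```python
# Cite counts per domain from the sample audit above.
cites = {
    "notion.so": 0,
    "obsidian.md": 1,
    "evernote.com": 3,
    "atlassian.com": 1,
    "slite.com": 6,
}

total = sum(cites.values())  # 11 citations across the whole audit
share = {domain: round(100 * n / total) for domain, n in cites.items()}
print(share["slite.com"], share["notion.so"])  # 55 0
```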

Per-prompt breakdown

Prompt · Engine · Mentioned · Pos · Sources · Sentiment

best note-taking app for teams in 2026
  • ChatGPT · mentioned · pos 2232 · neutral
    https://www.atlassian.com/software/confluence?utm_source=openai
    https://support.microsoft.com/en-us/office/add-a-onenote-notebook-in-microsoft-teams-0ec78cc3-ba3b-4279-a88e-aa40af9865c2?utm_source=openai
    https://help.coda.io/hc/en-us/articles/39555725230989-Billing-and-pricing-basics?utm_source=openai
    https://workspace.google.com/intl/en/pricing/?utm_source=openai
  • Perplexity · mentioned · pos 2 · positive
    https://zapier.com/blog/best-note-taking-apps/
    https://www.read.ai/articles/best-ai-note-taking-app-for-microsoft-teams
    https://www.obsibrain.com/blog/note-taker-app-the-7-best-ways-to-capture-ideas-in-2026
    https://zackproser.com/blog/best-meeting-notes-app-2026
    +4 more
  • Gemini · mentioned · pos 908 · positive
    https://pcmag.com
    https://obsibrain.com
    https://zapier.com
    https://guideflow.com
    +6 more
  • Google AI Overviews · mentioned · pos 18 · positive
    https://zapier.com/blog/best-note-taking-apps/
    https://www.pcmag.com/picks/the-best-note-taking-apps
    https://www.obsibrain.com/blog/note-taker-app-the-7-best-ways-to-capture-ideas-in-2026
    https://zackproser.com/blog/best-meeting-notes-app-2026
    +5 more

Notion vs Obsidian for personal knowledge management
  • ChatGPT · mentioned · pos 38 · positive · no source URLs returned
  • Perplexity · mentioned · pos 98 · neutral
    https://learn.g2.com/obsidian-vs-notion
    https://productive.io/blog/notion-vs-obsidian/
    https://www.eesel.ai/blog/notion-vs-obsidian
    https://www.youtube.com/watch?v=mpn5fze4RKo
    +3 more
  • Gemini · mentioned · pos 17 · neutral
    https://g2.com
    https://xp-pen.com
    https://youtube.com
    https://obsidian.md
    +9 more
  • Google AI Overviews · mentioned · pos 0 · positive
    https://www.reddit.com/r/productivity/comments/1fpgz7k/notion_or_obsidian_for_personal_knowledge_database/
    https://www.youtube.com/watch?t=421&v=Dw_XUTXgW94
    https://medium.com/@michaelswengel/would-i-switch-from-notion-to-obsidian-again-knowing-what-i-know-now-63e63b7c6527
    https://slite.com/learn/obsidian-vs-notion
    +9 more

how to build a company wiki without Confluence
  • ChatGPT · mentioned · pos 262 · positive · no source URLs returned
  • Perplexity · mentioned · pos 807 · neutral
    https://slite.com/learn/how-to-build-a-company-wiki
    https://www.docsie.io/blog/articles/establishing-an-effective-internal-wiki-for-your-organization/
    https://www.nitishmathew.com/post/how-to-build-internal-wikis
    https://xwiki.com/en/Blog/open-source-alternatives-to-Confluence/
    +2 more
  • Gemini · mentioned · pos 5366 · positive
    https://thedigitalprojectmanager.com
    https://teamwork.com
    https://siit.io
    https://glitter.io
    +13 more
  • Google AI Overviews · mentioned · pos 111 · positive
    https://www.youtube.com/watch?v=vPVqZ8dIo9U
    https://www.reddit.com/r/dotnet/comments/1gwdx7a/how_do_you_document_stuff_without_confluence/
    https://sites.google.com/view/minimalist-wiki-kwd/wiki
    https://massivegrid.com/blog/xwiki-vs-confluence-enterprise-comparison/
    +8 more

AI features in modern productivity tools
  • ChatGPT · not mentioned · neutral · no source URLs returned
  • Perplexity · mentioned · pos 550 · positive
    https://www.coursera.org/articles/ai-tools-for-work
    https://monday.com/blog/project-management/ai-productivity-tools/
    https://vibe.us/blog/ai-productivity-tools/
    https://www.nngroup.com/articles/ai-tools-productivity-gains/
    +5 more
  • Gemini · not mentioned · neutral
    https://calpcc.com
    https://microsoft.com
    https://gloat.com
    https://medium.com
    +5 more
  • Google AI Overviews · mentioned · pos 840 · positive
    https://zapier.com/blog/best-ai-productivity-tools/
    https://monday.com/blog/project-management/ai-productivity-tools/
    https://numerous.ai/blog/ai-productivity-tools
    https://blog.webex.com/innovation-ai/top-ai-productivity-tools/
    +10 more

what's the best tool for documenting engineering processes
  • ChatGPT · mentioned · pos 147 · positive · no source URLs returned
  • Perplexity · not mentioned · neutral
    https://www.colabsoftware.com/research/what-tools-do-engineering-teams-use-to-document-design-feedback
    https://resources.pcb.cadence.com/blog/best-engineering-document-management-software-top-features-cadence
    https://blog.hagerman.com/what-are-the-best-document-management-systems-for-engineers
    https://www.g2.com/categories/engineering-document-management
    +3 more
  • Gemini · mentioned · pos 973 · neutral
    https://technicalwriterhq.com
    https://thedigitalprojectmanager.com
    https://axerosolutions.com
    https://tango.ai
    +5 more
  • Google AI Overviews · mentioned · pos 155 · positive
    https://www.reddit.com/r/Engineers/comments/1gqonw5/what_is_the_best_software_for_organizing_and/
    https://usefluency.com/blog/top-ai-process-documentation-tools
    https://www.youtube.com/watch?t=363&v=DiAgPiW0hhY
    https://blog.hagerman.com/what-are-the-best-document-management-systems-for-engineers
    +10 more
Real audit · 5 prompts × 4 engines · run on auditae.app · View full audit

ChatGPT’s API doesn’t return source annotations on every response — what you see is exactly what the engine returned.

Founder take

AEO vs SEO: the part nobody is honest about

Most “AEO vs SEO” articles tell you they’re complementary. They are, sort of. Here’s what they leave out: traditional SEO measures rank order on a results page. AEO measures whether you exist at all in the answer. You can rank #1 on Google and have ChatGPT, Gemini, and Perplexity all describe your category without ever naming you.

The mechanic is different. SEO is a ranking problem on a list of links. AEO is a presence problem inside a synthesized paragraph — and that paragraph cites somebody. The third-party site reviewing you, usually. Or a Reddit thread. Or a competitor’s case study.

The miss SEO tools don’t catch: you can have great backlinks and zero AI citations because the LLM trained on a corpus that didn’t quote you. Domain authority doesn’t transfer.

What changes in practice — answer-first paragraphs, named expert quotes, structured data, freshness signals. Different optimization, different surface.

Stop calling them complementary. AEO is a different read on a different surface.

What to do with results

How to rank on Perplexity and get cited by ChatGPT

Four levers move citation rate. They show up consistently in citation studies from Frase, Conductor, AirOps, and Onely. Run them in this order.

01

Answer-first paragraphs

The first 60 words of every section should directly answer the query. AI engines tend to lift these almost verbatim — Frase has reported answer-first pages get cited at meaningfully higher rates than buried-lede content.

02

Question-shaped H2/H3s + FAQ schema

Google killed the FAQ rich result in 2023. AI engines still consume the schema. Walker Sands' 2025 audit found 71% of ChatGPT-cited pages used structured data.

03

Named expert quotes + outbound citations

Pages that quote a specific named source see citation lift in the +30–41% range across Frase, AirOps, and Conductor's published studies. The signal LLMs read is "this is verifiable."

04

Freshness

Citation studies consistently find pages updated in the last 12 months dominate AI-cited results on commercial queries. Add a date, then update it.
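Lever 02's structured data can be as small as a single FAQPage block. A minimal sketch that emits the schema.org JSON-LD payload (the question and answer text are placeholders, not recommended copy; embed the output in a `<script type="application/ld+json">` tag on the page):

```python
import json

# Minimal schema.org FAQPage payload with one Question/Answer pair.
faq_schema = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    "mainEntity": [
        {
            "@type": "Question",
            "name": "What is answer engine optimization?",
            "acceptedAnswer": {
                "@type": "Answer",
                # Answer-first copy goes here: the direct answer in the first sentence.
                "text": "Answer engine optimization (AEO) is the practice of earning citations in AI-generated answers.",
            },
        }
    ],
}

print(json.dumps(faq_schema, indent=2))
```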

Audit, fix the levers, re-audit in 30 days. That’s the loop.

Why pay-per-use

$0.05 per check. No subscription.

Otterly’s Lite tier starts at $29/month. Profound’s Lite plan is $499/month. Semrush’s AI Visibility Toolkit runs $99/month per domain as an add-on, or $199/month bundled into Semrush One. AuditAE charges $0.05 per (prompt × engine) cell, billed only when you run an audit. No floor, no cliff.

Tool · Entry price · Commitment
AuditAE · $0.05 / check · None — pay as you go
Otterly Lite · $29 / month · Monthly subscription
Semrush AI Visibility Toolkit · $99 / month per domain · Monthly subscription
Semrush One bundle · $199 / month · Monthly subscription
Profound Lite · $499 / month · Monthly subscription

This works for the SEO running monthly audits, the agency batching client checks, and the founder running this themselves. It doesn’t work for an enterprise tracking 50 brands across 200 prompts daily — that’s where Profound earns its keep. We’re the other thing.

Competitor pricing as of February 2026, from each vendor's published pricing page.

Pricing

Start with $5 free.

100 free checks. No card required. Top up when you want — unused balance never expires.

Agency calculator
Pay only for the checks you run. Example inputs: 200 checks per run.
Checks / run
200
Per run
$10.00
Monthly
$40.00
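The calculator reduces to checks × price × runs. A sketch assuming the $0.05-per-check rate; 200 checks per run at 4 runs per month reproduces the $10.00 and $40.00 figures (function name is illustrative):

```python
PRICE_PER_CHECK = 0.05  # advertised pay-per-check rate

def agency_cost(checks_per_run: int, runs_per_month: int) -> tuple[float, float]:
    """Return (cost per run, cost per month) in dollars."""
    per_run = checks_per_run * PRICE_PER_CHECK
    return per_run, per_run * runs_per_month

per_run, monthly = agency_cost(checks_per_run=200, runs_per_month=4)
print(f"${per_run:.2f} per run, ${monthly:.2f} / month")  # $10.00 per run, $40.00 / month
```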
Who it's for

Built for SEO teams losing traffic to AI

In-house SEO

Your CMO is asking “are we showing up in AI?” This is the answer in 3 minutes — with the share-of-voice screenshot to back it up.

Agency

Run audits across all your clients. Per-engine breakdowns are screenshot-ready for monthly reports. Per-check pricing means you charge what you charge — no SaaS markup squeeze.

Founder doing your own marketing

$5 free, no subscription, no sales call. Run it tonight, fix the obvious gaps, run it again in 30 days.

Built by an SEO who got tired of guessing

I’d been doing SEO for years when clients started saying “ChatGPT recommended us!” like it was good news, and “why isn’t ChatGPT recommending us?” the next month. I couldn’t tell either of them whether anything had actually changed. The tools that existed cost $99–$1,000/month and were built for enterprise teams running daily monitors. I just wanted to ask “does this prompt mention us, and who’s getting the link?” and get an answer for less than the cost of a coffee. AuditAE is that. Built it for myself, opened it up because I figured I’m not the only one.

Run your first audit
FAQ

Questions, answered

Which AI engines does AuditAE check?

ChatGPT (OpenAI Responses API), Perplexity (Sonar), Google Gemini, and Google's AI Overviews via SerpAPI. Same four every audit.

How does citation detection actually work?

For each prompt × engine cell we capture the full answer text and any source URLs the engine returns. Brand mention is detected with deterministic string matching against your brand name (case-insensitive, word-boundary-aware, handles prefix variants like “Notion” matching “Notion Labs”). A separate Anthropic Claude Haiku call extracts competitor names mentioned in the answer and infers sentiment. Source-domain matching is hostname-based — notion.so matches both notion.so/page and www.notion.so/foo.
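The two deterministic checks described above can be sketched roughly like this. A simplified illustration, not AuditAE's production code (the prefix-variant handling and the Claude sentiment call are omitted):

```python
import re
from urllib.parse import urlparse

def brand_mentioned(answer: str, brand: str) -> bool:
    """Case-insensitive, word-boundary-aware brand match in the answer text."""
    pattern = r"\b" + re.escape(brand) + r"\b"
    return re.search(pattern, answer, flags=re.IGNORECASE) is not None

def domain_cited(source_urls: list[str], domain: str) -> bool:
    """Hostname-based match: notion.so matches notion.so/page and www.notion.so/foo."""
    for url in source_urls:
        host = (urlparse(url).hostname or "").lower()
        if host == domain or host.endswith("." + domain):
            return True
    return False

print(brand_mentioned("Notion Labs ships a team wiki.", "Notion"))  # True
print(brand_mentioned("A notional example.", "Notion"))             # False
print(domain_cited(["https://www.notion.so/foo"], "notion.so"))     # True
```

The word boundary is what keeps "notional" from matching "Notion" while still letting "Notion Labs" count as a mention.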

What's the difference between “mentioned” and “cited”?

Mentioned = brand name appears in the answer text. Cited = answer includes a source URL pointing to your domain. Most AEO problems live in the gap between the two.

Do you store the prompts I run?

Yes. Audits, prompts, and results persist to your org's row in our database so you can compare runs over time. We don't share prompts across orgs and don't use them to train anything.

Can I delete an audit?

Not from the dashboard yet. Email support@auditae.app with the audit ID and we'll wipe it within 24 hours.

How accurate is the citation detection?

Brand mention uses deterministic string matching, so what you see is what we found — we always surface the excerpt and position so you can verify it's a real reference, not a coincidental word match. Edge case: brands sharing a name with a common word (e.g. “Notion” the noun vs. Notion the company) can register as cited when the engine wasn't actually talking about you. The excerpt makes those obvious.

Why doesn't ChatGPT always return source URLs?

OpenAI's Responses API returns url_citation annotations on some calls and not others — model-side behavior we can't control. We capture what they give us and flag cells where source data is missing.

What about refunds?

Failed checks (engine error, network issue) aren't billed. Top-ups themselves are non-refundable, but unused balance never expires. If something's wrong, email support@auditae.app and we'll make it right case by case.

Do you offer programmatic / API access?

Yes — there's an MCP server (Model Context Protocol) you can wire into Claude Code or any MCP client. Email support@auditae.app for the config; we'll surface this in-app once we have enough programmatic users to justify a docs page.

What happens to my data if I close my account?

Audits and credit history are retained for 30 days, then deleted. Email support@auditae.app to request immediate deletion.

Audit your AI citations for $0.05 a check.

No subscription. 3 minutes. $5 free credit on signup.

Run a free audit
$5 free credit · 100 free checks · no card required