
0 to 15K active users in 8 weeks. $0 on ads. Here's the exact AEO + SEO playbook I used with Claude.
I built a micro SaaS marketplace for AI agent skills. Think app store but for SKILL.md files that make Claude Code, Cursor, and Codex CLI better at specific tasks. I'm a non-technical solo founder. Built the whole thing with Lovable and Claude.
This post isn't about the product. It's about the distribution strategy that got it to 15K active monthly users with zero ad spend. Most of this is AEO stuff that I think this sub will find useful.
The numbers first
- 15,000+ active users in the last 30 days (219% growth over the previous period)
- 300K+ Google search impressions per month
- 4,000+ organic clicks per month
- 878+ page-1 Google rankings, up from about 15 at launch
- Cited as a source by ChatGPT, Gemini, Claude, Perplexity, Doubao, and Kagi
- 350+ AI-referred sessions per month from 9 different AI engines
- Total marketing spend: $0
The AEO strategy that drives most of my growth now
This is the part I want to focus on because I don't see many people talking about it with real numbers.
AI Overviews are eating Google clicks. I have 121 queries where I rank positions 1 through 3 on Google with zero clicks. The AI answers the question before anyone scrolls down. That insight changed my entire strategy. I stopped chasing rankings and started optimizing to become the source AI engines cite.
Here's what I did specifically:
Every article has a Quick Answer block at the top: 40 to 60 words that directly answer the main question. This is what AI Overviews and LLMs extract. If you bury the answer halfway down your page, the AI will pull from a site that doesn't.
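If you're on a React stack, the block itself is trivial. Here's a rough sketch of what it can look like as a component (component name and props are made up for illustration, not my actual markup):

```typescript
// QuickAnswer.tsx — illustrative only; render it directly under the page's H1.
export function QuickAnswer({ question, answer }: { question: string; answer: string }) {
  return (
    <section aria-label="Quick answer">
      <h2>{question}</h2>
      {/* Keep the answer to roughly 40-60 words so it extracts cleanly. */}
      <p>{answer}</p>
    </section>
  );
}
```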
All H2 headings are phrased as questions. Not "Claude Code Skill Locations" but "Where Does Claude Code Store Skills?" AI engines prefer extracting from question-format sections. I tested this across 96 articles and the ones with question headings get cited significantly more.
Every page has FAQ schema. Google's AI picks up our Q&As directly. This is one of the highest leverage things I did and it took maybe 2 hours total with Claude generating the schema.
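FAQ schema is just a JSON-LD script tag in the page. A minimal sketch of how that can be emitted from a React component (the component name and data shape are placeholders):

```typescript
// FaqSchema.tsx — minimal FAQPage JSON-LD emitter; props are illustrative.
type Faq = { question: string; answer: string };

export function FaqSchema({ faqs }: { faqs: Faq[] }) {
  const json = {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    mainEntity: faqs.map((f) => ({
      "@type": "Question",
      name: f.question,
      acceptedAnswer: { "@type": "Answer", text: f.answer },
    })),
  };
  // React needs dangerouslySetInnerHTML to put raw JSON inside a script tag.
  return (
    <script
      type="application/ld+json"
      dangerouslySetInnerHTML={{ __html: JSON.stringify(json) }}
    />
  );
}
```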
I built an entity anchor page (/about) with Organization, Person, and AboutPage schema. This tells AI engines who we are and establishes the entity relationship. Without this, you're just another anonymous source.
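Roughly, the JSON-LD on that page links the three entities together. This is a made-up example with placeholder names, URLs, and IDs, not my real data:

```typescript
// aboutSchema.ts — placeholder entity-anchor JSON-LD tying Organization, Person, and AboutPage together.
export const aboutPageSchema = {
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "Organization",
      "@id": "https://example.com/#org",
      name: "Example Marketplace",
      url: "https://example.com",
      founder: { "@id": "https://example.com/#founder" },
    },
    {
      "@type": "Person",
      "@id": "https://example.com/#founder",
      name: "Jane Founder",
      jobTitle: "Founder",
    },
    {
      "@type": "AboutPage",
      "@id": "https://example.com/about",
      about: { "@id": "https://example.com/#org" },
    },
  ],
};
```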
I created an llms.txt file that explicitly tells LLM crawlers what the site is and where to find key content. Also made sure robots.txt allows all AI crawlers. Most people block them. I want them to read everything.
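Both files are plain text, so they can be generated in a build step. A hedged sketch of what that script might look like (domain, bot names, and paths are examples only; check each engine's documented user agent before copying this):

```typescript
// scripts/write-crawler-files.ts — illustrative build step; URLs and crawler names are examples.
import { writeFileSync } from "node:fs";

const robots = [
  "User-agent: *",
  "Allow: /",
  "",
  // Explicitly allow a couple of well-known AI crawlers (examples only).
  "User-agent: GPTBot",
  "Allow: /",
  "",
  "User-agent: PerplexityBot",
  "Allow: /",
  "",
  "Sitemap: https://example.com/sitemap.xml",
].join("\n");

const llms = [
  "# Example Skill Marketplace",
  "> A marketplace of SKILL.md files that extend AI coding agents.",
  "",
  "## Key pages",
  "- https://example.com/skills — browse all skills",
  "- https://example.com/blog — guides and comparisons",
].join("\n");

writeFileSync("public/robots.txt", robots);
writeFileSync("public/llms.txt", llms);
```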
The result: when someone asks ChatGPT "where can I find SKILL.md skills" or asks Perplexity "what is the best skill marketplace for AI agents," they get pointed to us. That didn't happen by accident. Claude helped me engineer every piece of it.
The content engine behind it
I don't ask Claude to "write me a blog post about X." That produces generic stuff that Google doesn't rank and AI engines don't cite.
Instead I feed Claude my Google Search Console exports (queries, impressions, click-through rates, average positions) and ask it to find keyword gaps. Claude identifies queries where I have high impressions but zero clicks, finds topics where competitors have content but I don't, and spots cannibalization where multiple pages compete for the same query.
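The gap-finding part is simple enough that you could script it yourself. A rough TypeScript sketch, assuming the standard GSC "Queries" CSV export (thresholds and file names are arbitrary):

```typescript
// gsc-gaps.ts — naive sketch; real GSC exports quote some fields, so use a proper CSV parser in practice.
import { readFileSync } from "node:fs";

type Row = { query: string; clicks: number; impressions: number; position: number };

// Expected header: Top queries,Clicks,Impressions,CTR,Position
const rows: Row[] = readFileSync("Queries.csv", "utf8")
  .trim()
  .split("\n")
  .slice(1)
  .map((line) => {
    const [query, clicks, impressions, , position] = line.split(",");
    return { query, clicks: +clicks, impressions: +impressions, position: +position };
  });

// Keyword gaps: queries with plenty of impressions and zero clicks.
const gaps = rows
  .filter((r) => r.impressions > 200 && r.clicks === 0)
  .sort((a, b) => b.impressions - a.impressions);

for (const g of gaps) {
  console.log(`${g.query} — pos ${g.position.toFixed(1)}, ${g.impressions} impressions, 0 clicks`);
}
```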
Then we write articles together targeting those specific gaps. Every article has the Quick Answer block at the top, question-based H2s, comparison tables where relevant, and internal links to related articles.
96 articles later, we went from 5 clicks per week to 1,000+ clicks per week. 878+ page-1 rankings. All organic.
The technical SEO layer most people skip
I built the app with Lovable (React SPA). Out of the box it was an SEO disaster: Google's crawler saw an empty div and a JavaScript bundle. The JS bundle was 460KB. The logo was a 179KB PNG rendered at 112 pixels. LCP (Largest Contentful Paint) was 4+ seconds on mobile.
Claude diagnosed all of this. It wrote the SSR layer, found the bundle bloat, identified the oversized images, and rewrote the performance bottlenecks. Desktop LCP went from 2.5 seconds to 0.9 seconds. PageSpeed performance score went from 70 to 97. Logo went from 179KB to 7KB.
If you're building with Lovable, Bolt, or any React framework and expecting organic traffic, you need to fix this layer. Google will not reliably index a client-side rendered SPA.
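I won't paste my actual setup, but the core idea is server-side (or pre-) rendering so the crawler gets real HTML instead of an empty div. A bare-bones sketch using Express and react-dom/server, assuming a TSX build step (the App component and paths are stand-ins, not a production config):

```typescript
// server.tsx — bare-bones SSR illustration, not a production setup.
import express from "express";
import { renderToString } from "react-dom/server";

// Stand-in for the real app root.
function App() {
  return (
    <main>
      <h1>AI agent skills</h1>
    </main>
  );
}

const app = express();
app.use(express.static("dist/client")); // hashed JS/CSS assets

// Fallback handler: render real HTML for every page request.
app.use((_req, res) => {
  const html = renderToString(<App />);
  res.send(`<!doctype html>
<html>
  <head><title>AI agent skills</title></head>
  <body>
    <div id="root">${html}</div>
    <script type="module" src="/client.js"></script>
  </body>
</html>`);
});

app.listen(3000);
```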
The structured data architecture
Claude built the entire structured data layer:
- Homepage: Organization, WebSite with SearchAction, FAQPage with 15 Q&As
- Individual skill pages: SoftwareApplication with pricing, BreadcrumbList, conditional FAQPage
- Article pages: Article, FAQPage, HowTo, BreadcrumbList, Organization
- About page: Organization, AboutPage, Person schema for entity anchoring
PageSpeed Insights shows "Structured data is valid" on every page with a 100 SEO score. I didn't know any of this before Claude explained it. Now every page is machine-readable for both Google and AI engines.
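To give a sense of what a skill page emits, here's a trimmed-down, made-up example of the SoftwareApplication plus BreadcrumbList piece (field names follow schema.org, but the values and URLs are placeholders):

```typescript
// skillSchema.ts — illustrative JSON-LD for a single skill page; values are placeholders.
export function skillPageSchema(skill: { name: string; slug: string; price: number }) {
  return {
    "@context": "https://schema.org",
    "@graph": [
      {
        "@type": "SoftwareApplication",
        name: skill.name,
        applicationCategory: "DeveloperApplication",
        offers: { "@type": "Offer", price: skill.price, priceCurrency: "USD" },
      },
      {
        "@type": "BreadcrumbList",
        itemListElement: [
          { "@type": "ListItem", position: 1, name: "Skills", item: "https://example.com/skills" },
          { "@type": "ListItem", position: 2, name: skill.name, item: `https://example.com/skills/${skill.slug}` },
        ],
      },
    ],
  };
}
```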
The weekly growth loop
Every Monday I do the same thing:
- Export Google Search Console data (queries, pages, clicks, impressions, positions)
- Upload the CSVs to Claude and ask: find keyword gaps, cannibalization, and CTR problems
- Claude identifies 3 to 5 specific opportunities with exact numbers
- I write 2 to 3 articles targeting those gaps, rewrite titles on underperforming pages, and add internal links from high-authority pages to weak ones
- Ping IndexNow to get new pages crawled within 24 hours (see the sketch after this list)
- Manually request indexing on GSC for the new articles
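The IndexNow ping is a single HTTP request. Roughly like this (host, key, and URLs are placeholders; per the IndexNow protocol the key also has to be served as a text file on your domain):

```typescript
// indexnow-ping.ts — illustrative submission; replace host, key, and URLs with your own.
const res = await fetch("https://api.indexnow.org/indexnow", {
  method: "POST",
  headers: { "Content-Type": "application/json; charset=utf-8" },
  body: JSON.stringify({
    host: "example.com",
    key: "your-indexnow-key",
    keyLocation: "https://example.com/your-indexnow-key.txt",
    urlList: [
      "https://example.com/blog/new-article-1",
      "https://example.com/blog/new-article-2",
    ],
  }),
});

console.log("IndexNow status:", res.status); // 200 or 202 means the submission was accepted
```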
Takes about 2 to 3 hours per week. Highest ROI activity in the business.
What I'd tell someone starting AEO today
Set up Google Search Console before you launch. Even with zero traffic it starts collecting data on what queries your site appears for. That data becomes your content strategy within 2 to 3 weeks.
Write articles targeting specific questions, not broad topics. "How to install skills in Claude Code" beats "The Ultimate Guide to AI Agent Skills" every time.
Set up structured data from day one. Organization schema on your homepage, Product schema on your product pages, FAQ schema on your content pages. This is what AI engines read to decide whether to cite you.
Create an llms.txt file. It's like robots.txt but for AI crawlers. Takes 10 minutes and it tells every LLM exactly what your site offers.
Don't fight AI Overviews. Become the source they pull from.
Happy to answer questions about any of this. The site is agensi.io if you want to see how this looks in practice. If you want to support a bootstrapped one-person startup, making a free account genuinely helps more than you'd think 😄