u/Cherrypili

Need honest eyes on a landing page I made for Meta advertisers.

I’ve been helping a bunch of people on Reddit troubleshoot weird Meta behavior: unstable ROAS, cheap traffic with no intent, “healthy” campaigns that fall apart, and so on.

I built something to help interpret what’s actually happening under the hood. The landing page is live, but I need a quick sanity check.

u/Cherrypili — 6 days ago

I've been doing my own deep dive, and I think the biggest problem with Meta ads right now is that advertisers can’t tell whether the signal is trustworthy anymore. It seems to be the biggest issue in this subreddit too.

A campaign can look healthy on the surface:

  • good CTR
  • low CPM
  • cheap clicks

and still completely fail downstream.

Or the opposite:
a campaign looks shaky early but becomes profitable after enough conversion signal stabilizes.

It feels like more people are struggling to trust what they’re seeing than to get impressions in the first place.

That’s probably why so many advertisers are reacting emotionally by:

  • turning ads off too early
  • rebuilding campaigns constantly
  • changing targeting every 24 hours
  • confusing cheap traffic with buyer intent

I’ve been seeing the same patterns over and over lately while testing my own campaigns and building internal diagnosis tooling around this stuff.

Curious what everyone else is seeing right now:

  • unstable traffic quality?
  • delayed conversion behavior?
  • weird scaling volatility?
  • normal metrics but weak purchases?
u/Cherrypili — 6 days ago

Over the past two months I've been spending a lot of time in marketing subreddits, reading through posts, watching people troubleshoot campaigns, and noticing a pattern. A lot of advertisers know their ads aren't performing but don't know why, and even when they get some answers, they don't know what to do next. The data is there but it's hard to read, and Meta doesn't exactly make it easy to turn insights into action.

I kept seeing the same questions: why is my CTR dropping, should I kill this campaign or let it run, how do I know if my ad is actually working or just spending. So I decided to stop theorizing and test it myself. I built a tool to help solve exactly this, and I'm using my own waitlist campaign as the test subject — real data, real budget, real decisions. Here's what I'm seeing after the first 24 hours.

The setup:

  • Objective: Traffic → waitlist signup page
  • Budget: $10/day
  • Audience: US, 24-45, interests in Facebook Ads, online advertising, digital marketing, small business owners
  • Creative: Static image ad

First 24 hours of data:

  • Impressions: 439
  • Clicks: 13
  • CTR: 2.96%
  • Spend: $8.04
  • CPC: $0.62
  • Conversions: 0 (waitlist signups tracked separately via the form, not Meta conversion events)
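For anyone double-checking the numbers, the derived metrics above follow directly from the raw counts. A minimal Python sketch (figures taken from this post, rounded to two decimals the way Ads Manager displays them):

```python
# Raw numbers from the first 24 hours of the campaign.
impressions = 439
clicks = 13
spend = 8.04  # USD

# CTR: clicks as a percentage of impressions.
ctr = round(clicks / impressions * 100, 2)   # 2.96 (%)

# CPC: average cost per click.
cpc = round(spend / clicks, 2)               # 0.62 (USD)

# CPM: cost per 1,000 impressions.
cpm = round(spend / impressions * 1000, 2)   # 18.31 (USD)

print(ctr, cpc, cpm)
```

Note that the $18.31 CPM discussed in the score breakdown below is just this same spend and impression count expressed per thousand.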

One thing that surprised me: my platform was already surfacing this campaign data before Meta Ads Manager had even updated. If you're not seeing your numbers right away in Ads Manager, that doesn't mean nothing is happening.

What NVSN scored it:

NVSN gave this campaign a Campaign Score of 65/100 after the first 24 hours, with the verdict: Promising, but iterate before scaling.

Here's what the score breakdown said:

  • CTR signal: Near-perfect. Almost 3% on a cold audience means the creative is stopping the scroll and getting clicks. That part is working.
  • CPM at $18.31: Moderate. This is the weakest signal and where the score takes the biggest hit. For a marketing/business owner audience this is expected but worth improving.
  • Engagement and quality rankings: Too early to call. Meta's system needs more data before it can benchmark you against competitors. Both defaulted to neutral — these will shift as the campaign matures.

The recommendation: Don't scale yet. Don't kill it. Test a creative variation to bring CPM down while keeping CTR strong.
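NVSN's actual scoring model isn't public, so the following is purely an illustrative sketch of how one might blend these three signals into a single 0-100 score. The benchmarks (1% CTR on a cold audience as a baseline, $12 as a target CPM) and the weights are my assumptions, not NVSN's:

```python
# Hypothetical scoring heuristic -- NOT NVSN's real algorithm.
# Benchmarks and weights below are illustrative assumptions.

def campaign_score(ctr_pct, cpm, ranking_signal=50):
    """Blend three sub-scores (each 0-100) into one campaign score."""
    # CTR sub-score: 1% CTR on a cold audience scores 50; caps at 100.
    ctr_score = min(100.0, ctr_pct / 1.0 * 50)
    # CPM sub-score: $12 CPM scores 60; pricier impressions score lower.
    cpm_score = max(0.0, min(100.0, 12 / cpm * 60))
    # Quality rankings default to neutral (50) until Meta has enough data.
    weights = (0.45, 0.35, 0.20)
    subscores = (ctr_score, cpm_score, ranking_signal)
    return round(sum(w * s for w, s in zip(weights, subscores)))

# The near-perfect CTR maxes out its sub-score; the moderate CPM
# drags the total down, which matches the shape of the verdict above.
print(campaign_score(ctr_pct=2.96, cpm=18.31))
```

With these made-up benchmarks the heuristic lands in the same "promising but not scale-ready" band; the point is only that a strong CTR plus a weak CPM naturally produces a middling composite score.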

Are waitlist ads worth it?

IMO, it depends on what you're trying to validate. For me this campaign is doing two jobs at once: collecting emails AND testing whether my creative resonates with the right audience before I drive bigger traffic. At $10/day the feedback is cheap. I'm not optimizing for volume, I'm optimizing for signal.

If I bumped to $15-20/day, I'd get reliable data faster: more impressions, cleaner metrics, a clearer picture within 48 hours. At $10/day, I'm in directional territory but not decisive territory yet. The score reflects that honestly.

The sweet spot for a low-budget signal test like this is probably $15/day. Enough to get meaningful data without burning money before I know if the creative is working. A bigger budget doesn't make a bad creative good — it just speeds up how fast you find out.
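You can put rough numbers on that "speeds up how fast you find out" claim. At a fixed CPM, daily budget buys a fixed number of impressions per day, so the only thing budget changes is how many days it takes to reach a given sample size. A quick sketch using the $18.31 CPM observed above (the 2,000-impression "enough signal" threshold is my own assumed cutoff, not a Meta rule):

```python
import math

CPM = 18.31  # observed cost per 1,000 impressions in this test

def days_to_impressions(target, daily_budget, cpm=CPM):
    """Whole days of spend needed to buy `target` impressions at a CPM."""
    impressions_per_day = daily_budget / cpm * 1000
    return math.ceil(target / impressions_per_day)

# Assumed threshold of 2,000 impressions before judging the creative.
for budget in (10, 15, 20):
    print(f"${budget}/day -> {days_to_impressions(2000, budget)} days")
```

At this CPM, $10/day needs about 4 days to hit that threshold, $15/day about 3, and $20/day about 2. Same creative, same eventual verdict, just a faster answer.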

I'll post a 48-hour update with:

  • Whether the score improved or dropped
  • Whether CPM came down
  • How many waitlist signups actually came through
  • Whether I made any creative changes based on what the data showed

Happy to answer questions in the meantime. Anyone else running waitlist campaigns or testing ad creative at a low budget? What's your personal threshold for deciding a creative is working in the first couple of days?

