Rebuilt my lead research pipeline in Computer. Dropped the n8n stack entirely.
Most of the leads for the small thing I do on the side come from the same motion: pick a vertical, pull businesses off Google Maps, research each one enough to write a first message that isn't generic slop. Specific outreach converts, templated stuff doesn't, and the only way to get specific is to do the homework.
I used to run this through an n8n stack: Maps scraper into Notion, then deep research, scoring, draft generation. It worked, but the scraper broke every few weeks when selectors changed, and I was spending more time keeping the pipeline alive than using its output.
Rebuilt it as a single scheduled task. Once a week it takes a location and vertical from a config file, browses Google Maps through Comet for the top 40-50 listings (sometimes more), and for each one pulls the website, socials, recent news, and any ads running. Then it scores each business against a plain-English ICP description and writes a short brief for anything above the threshold. Briefs land in Notion with the business name, the specific hook I should lead with (a stale ad, a recent hire, complaints in reviews), and a one-paragraph first-message draft.
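To make the score-then-brief step concrete, here's a rough sketch of the shape of it. Everything here is illustrative: the field names, the threshold value, and the scoring function are stand-ins (the real scoring is a model call against the plain-English ICP description, not hand-written rules).

```python
# Hypothetical sketch of the weekly scoring step. Names, fields, and the
# rule-based scorer are illustrative stand-ins, not the actual task code.
import json

THRESHOLD = 6  # briefs are only written for scores above this

def score_against_icp(research: dict) -> int:
    """Stand-in for the LLM scoring call: rate fit against the ICP.
    The real pipeline sends the ICP description plus the per-business
    research dossier to a model; this fakes it with simple rules."""
    score = 0
    if research.get("runs_ads"):
        score += 4
    if research.get("review_complaints"):
        score += 3  # complaints double as outreach hooks
    if research.get("recent_hire"):
        score += 3
    return score

def build_brief(biz: dict) -> dict:
    """Shape of the brief that lands in Notion: name, hook, draft."""
    research = biz["research"]
    hook = (research.get("review_complaints")
            or research.get("recent_hire")
            or "stale ad creative")
    return {
        "business": biz["name"],
        "hook": hook,
        "draft": f"Noticed {hook} at {biz['name']}, worth a quick chat?",
    }

listings = [
    {"name": "Blue Wave Pools",
     "research": {"runs_ads": True,
                  "review_complaints": "two-plus-week quote turnaround"}},
    {"name": "Generic Lawn Co", "research": {}},
]

briefs = [build_brief(b) for b in listings
          if score_against_icp(b["research"]) > THRESHOLD]
print(json.dumps(briefs, indent=2))
```

Only the first listing clears the threshold here, so only it gets a brief; the second never reaches the drafting step, which is the point of scoring before writing.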
Comet replacing the Maps scraper was the bit I was most worried about. It's slower than a real scraper, maybe 10-15 minutes for 40-50 listings since it actually navigates and opens each listing card. But it doesn't break when the page layout shifts, which makes it far more reliable.
Recent example: a pool contractor whose reviews kept mentioning two-plus-week quote turnaround times. I led with that in the first message and got a reply the same day. The hit rate is way better than the old templated approach.