u/After-Condition4007

Cut my Monday trend research from 3 hours to 10 minutes after stitching 4 agents together, here is the build

I'm the head of marketing at a small outdoor gear brand. Every Monday morning used to start with 3 to 4 hours of manual trend research: TikTok hashtag pages, IG creator performance, 6 competitor blogs via RSS, all dumped into a doc the rest of the team would only half read because by the time it landed it was already Tuesday.

Spent 2 weekends last month rebuilding this and ended up with a 4-agent setup that runs at 6am every Monday so the team has something readable before standup at 9. Sharing because the build process took me through more tools than expected.

Make and n8n. Grouping these because I hit the same wall with both. Beautiful for stitching APIs I actually have, painful for "open this page in a browser, scroll, grab the top posts." The TikTok and Instagram parts were the problem: both platforms want clean API access, and the public data there is mostly behind login walls or rate limits. I know there are community nodes for some of this, but I didn't want to maintain someone else's code when it breaks. If your data sources are all API-friendly, either of these will serve you well. Mine weren't.

Bardeen. Browser-native, exactly what I needed for the scraping parts. Setup was genuinely easy, just point at what you want and go. But scheduled runs are limited on my plan, and I couldn't chain everything into a single output the way I wanted. The outputs also lived in different places, which meant I was still manually stitching things together. Closer, but not quite.

MuleRun. Set up a multi-agent workflow that runs every Monday at 6am. Agent A opens 5 hashtag pages on TikTok and pulls the top 50 posts. Agent B logs into my IG via a browser extension that reuses my existing session and grabs top-performing creator posts. Agent C pulls 8 competitor blog RSS feeds. Agent D compiles the digest into a shareable page my team opens at 9am. Before this I spent 3 to 4 hours every Monday morning and still missed niche trends; now it's 10 minutes to skim the auto digest. One drawback worth flagging: when an algorithm changes (e.g. TikTok hiding hashtag view counts for certain regions), I have to retune which signals matter or the digest gets noisy. That has happened once already and I lost half a morning fixing it.
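If you want to gut-check the RSS leg (agent C) before committing to any tool, the core step is a few lines of plain Python. This is a sketch, not MuleRun's internals: the feed XML is inlined with made-up example URLs so it runs standalone, where in practice you'd fetch each competitor feed over HTTP.

```python
# Sketch of the RSS step: parse a feed and keep title + link + date
# for the digest. Uses only the stdlib XML parser.
import xml.etree.ElementTree as ET

# Illustrative feed; real agent C would fetch 8 of these over HTTP.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Competitor Blog</title>
  <item><title>New ultralight tent drop</title>
        <link>https://example.com/tent</link>
        <pubDate>Mon, 02 Jun 2025 08:00:00 GMT</pubDate></item>
  <item><title>Trail running trend report</title>
        <link>https://example.com/trail</link>
        <pubDate>Sun, 01 Jun 2025 12:00:00 GMT</pubDate></item>
</channel></rss>"""

def parse_feed(xml_text):
    # Every RSS 2.0 entry lives in an <item> element.
    root = ET.fromstring(xml_text)
    return [
        {
            "title": item.findtext("title"),
            "link": item.findtext("link"),
            "published": item.findtext("pubDate"),
        }
        for item in root.iter("item")
    ]

entries = parse_feed(SAMPLE_FEED)
print(entries[0]["title"])  # → New ultralight tent drop
```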

Final stack is MuleRun for the full Monday digest, n8n for the longer-term campaign reporting that hits clean APIs, and a shared drive that both write into.

What actually mattered in the build, in case you're rebuilding something similar:

  1. Define the output before defining the agents. I wasted a week trying to design the perfect pipeline and only afterward realized my team needed bullet points by category, not a 6-tab spreadsheet. Output first, agents second.
  2. Login state matters more than scraping power. Half the data I needed required being logged in. Tools that drive your actual browser session avoid the whole captcha and bot-detection mess.
  3. Schedule + recovery + diff. Schedule is obvious. Recovery means deciding what the system does if one agent fails. Diff means showing what's new vs last week, not just what's there. Without diff it's just noise.
  4. One shareable page. The team isn't opening a Notion db. The team is opening 1 link. Make sure the final output lives at 1 URL.
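The diff idea from point 3 is the cheapest of these to prototype. A sketch, keyed on URL, with hypothetical item dicts (not any tool's actual API): persist last week's links, then only surface items that weren't there before.

```python
# Sketch of "what's new vs last week": compare this week's items
# against last week's, keyed on URL. Data shapes are illustrative.
def diff_items(this_week, last_week):
    seen = {item["link"] for item in last_week}
    return [item for item in this_week if item["link"] not in seen]

last_week = [{"title": "Tent drop", "link": "https://example.com/tent"}]
this_week = [
    {"title": "Tent drop", "link": "https://example.com/tent"},
    {"title": "Trail report", "link": "https://example.com/trail"},
]

new_items = diff_items(this_week, this_week[:1] and last_week)
print([i["title"] for i in new_items])  # → ['Trail report']
```

In a real run you'd load `last_week` from wherever the previous digest was stored (the shared drive, say) instead of hardcoding it.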

Considering adding a 5th agent for influencer outreach next, if a few of the trending creators line up with our brand. Haven't committed.
