Best patterns for handling 10k+ outgoing HTTP requests? (Hitting ECONNRESET and 403s)
Hey everyone,
I’m currently building a Node.js microservice (using standard fetch / Axios) that needs to pull daily pricing data from thousands of external retail URLs.
Initially, I made the rookie mistake of throwing them all into a single massive Promise.all(), which predictably spiked my memory usage and eventually took the process down — 10k sockets opening at once is not something the event loop can do anything useful with.
I’ve since refactored it to use p-limit (and also tried an async queue) to restrict concurrency to around 50 active requests at a time. The memory is much more stable now, but I'm running into two new issues:
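For context, the refactored version boils down to something like this hand-rolled limiter (p-limit does essentially the same thing; `fetchPrice` here is just a stand-in for my real fetch/Axios call):

```javascript
// Minimal stand-in for p-limit: N workers pull indices from a shared
// counter, so at most `limit` requests are in flight at any moment.
async function mapWithConcurrency(items, limit, fn) {
  const results = new Array(items.length);
  let next = 0;

  async function worker() {
    while (next < items.length) {
      const i = next++;          // claim the next item
      results[i] = await fn(items[i], i);
    }
  }

  const workers = Array.from(
    { length: Math.min(limit, items.length) },
    worker
  );
  await Promise.all(workers);
  return results;
}

// Usage sketch (fetchPrice is hypothetical):
// const prices = await mapWithConcurrency(urls, 50, fetchPrice);
```

With the cap at 50 the memory graph is flat now — but this does nothing about the resets and 403s below.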
- Getting a lot of ECONNRESET errors and socket hang-ups.
- Target servers start throwing 403 Forbidden or rate-limiting me after a few hundred requests.
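For the resets I've been band-aiding with a naive retry wrapper, roughly like the sketch below (the 403 handling is a guess on my part — I deliberately don't retry those, since they look like the server blocking me rather than a transient fault):

```javascript
// Naive retry with exponential backoff + jitter for transient socket
// errors. 403 is treated as a hard failure and never retried, to avoid
// hammering a server that is deliberately rejecting us.
async function fetchWithRetry(url, attempts = 3) {
  for (let i = 0; i < attempts; i++) {
    try {
      const res = await fetch(url);
      if (res.status === 403) {
        throw new Error('blocked');          // do not retry a block
      }
      return res;
    } catch (err) {
      const lastTry = i === attempts - 1;
      if (lastTry || err.message === 'blocked') throw err;
      // backoff: 500ms, 1s, 2s, ... plus up to 250ms of jitter
      const delay = 500 * 2 ** i + Math.random() * 250;
      await new Promise((r) => setTimeout(r, delay));
    }
  }
}
```

It papers over the occasional reset, but once the 403s start they don't stop, so I suspect the real fix is upstream of the retry logic.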
How do you guys architect large-scale outgoing fetch jobs in Node? Do you use a custom http.Agent with keepAlive? Or farm it out to worker threads/Redis queues?
Would love to hear how you handle the networking side of high-volume data extraction.