u/buddies2705

Looking for Feedback on our Trading MCP

Hello everyone,

This is Gaurav from the Bitquery team. We have launched a new Trading MCP.

This MCP includes all trades from the last 30 days across 8 blockchains, including Solana, Ethereum, BNB, Base, Tron, Optimism, Arbitrum, and Polygon.

It has a latency of less than 10 seconds.

It lets you run almost any kind of analysis, such as finding interesting tokens, spotting price and volume surges, building trading models, and finding and analyzing traders.

It is currently free: https://mcp.bitquery.io/

Looking for feedback

We plan to add token transfer data next from multiple chains.

Feel free to ask any questions.

reddit.com
u/buddies2705 — 17 hours ago

Looking for Feedback on our Trading MCP

We have launched a new MCP https://mcp.bitquery.io/

It has the last 30 days of all trades done on these chains: Solana, Ethereum, Tron, BNB Chain, Optimism, Arbitrum, Base.

It has a latency of less than 10 seconds, and you can do any type of analysis, such as finding interesting tokens to trade, profitable traders to copy, etc.

It's free to use, and we're looking for feedback.

reddit.com
u/buddies2705 — 18 hours ago
▲ 4 r/solana

How do you reconstruct a Solana wallet's actual cost basis when half their activity is LP deposits and yield farming?

I manage reporting for a small crypto fund. Our Solana wallet has hundreds of token movements per month — but a huge chunk aren't simple buys and sells. They're LP deposits into Meteora, yield farming rewards from Marinade, airdrop claims, token migrations. Each of these creates transfer events but they mean very different things for cost basis.

What I need as a starting point is just a clean per-token summary: total tokens received, total tokens sent, net change, across a specific reporting period. From there I can classify each flow. But even getting that first step — a per-currency inflow/outflow summary for one wallet over one quarter — is surprisingly hard without running my own indexer.
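That first-step summary is mostly just an aggregation once you have the raw transfer rows exported from whatever indexer or API you use. A minimal sketch, assuming hypothetical fields `token`, `sender`, `receiver`, `amount` per transfer:

```python
# Sketch only: assumes transfers for the reporting period are already
# exported into a list of dicts with hypothetical fields
# token, sender, receiver, amount. Sample rows are invented.
from collections import defaultdict

WALLET = "FundWallet1111111111111111111111111111111111"  # placeholder address

transfers = [
    {"token": "USDC", "sender": WALLET, "receiver": "x", "amount": 500.0},
    {"token": "USDC", "sender": "x", "receiver": WALLET, "amount": 1200.0},
    {"token": "mSOL", "sender": "marinade", "receiver": WALLET, "amount": 10.0},
]

summary = defaultdict(lambda: {"received": 0.0, "sent": 0.0})
for t in transfers:
    if t["receiver"] == WALLET:
        summary[t["token"]]["received"] += t["amount"]
    if t["sender"] == WALLET:
        summary[t["token"]]["sent"] += t["amount"]

for token, s in summary.items():
    s["net"] = s["received"] - s["sent"]  # net change for the period
```

From a per-token `received`/`sent`/`net` table like this, classifying each flow (LP deposit, reward, airdrop, migration) becomes a labeling pass rather than an indexing problem.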

How are crypto fund accountants handling this on Solana?

reddit.com
u/buddies2705 — 18 hours ago
▲ 4 r/solana

Before buying a Solana token — how do you check if distribution is healthy or if 5 wallets hold everything?

I want a simple check before aping into any SPL token: who holds what percentage, how many unique receivers have ever gotten this token, and what the top 10 concentration looks like. CoinGecko shows "holders" but the number is usually wrong for newer tokens, and it doesn't show distribution curves.

What I really want is something like "for token X, show me total received per wallet, sorted by balance, with a count of unique holders." Basically an aggregated view of all transfer history for one mint address. But computing this myself from raw transfer events for tokens with millions of transfers is painfully slow.
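The aggregation itself is simple once someone else has done the indexing; the pain is the millions of rows. As a sketch of the math (invented events, with a `"mint"` sender marking mint-to-wallet events):

```python
# Sketch, not a product: given all transfer events for one mint address
# (hypothetical fields sender, receiver, amount), derive net balances,
# unique holder count, and top-10 concentration.
from collections import defaultdict

events = [
    {"sender": "mint", "receiver": "A", "amount": 600.0},
    {"sender": "mint", "receiver": "B", "amount": 300.0},
    {"sender": "A", "receiver": "C", "amount": 100.0},
]

balances = defaultdict(float)
for e in events:
    if e["sender"] != "mint":       # mint events only credit the receiver
        balances[e["sender"]] -= e["amount"]
    balances[e["receiver"]] += e["amount"]

holders = sorted(((w, b) for w, b in balances.items() if b > 0),
                 key=lambda x: x[1], reverse=True)
supply = sum(b for _, b in holders)
top10_share = sum(b for _, b in holders[:10]) / supply
unique_holders = len(holders)
```

The distribution question then reduces to `top10_share` and the shape of the sorted `holders` list.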

What are you using for quick token distribution checks on Solana?

reddit.com
u/buddies2705 — 18 hours ago
▲ 5 r/Tronix

Building a USDT payment system on Tron — how are you monitoring incoming transfers in production?

Tron handles the most USDT volume of any chain, but the tooling is years behind Ethereum's.

I'm building a merchant payment system that needs to detect incoming TRC-20 USDT payments to hundreds of deposit addresses within a few seconds of confirmation.

TronGrid's event polling works but the rate limits mean I'm either paying a fortune or missing payments during busy periods.
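Whatever feed ends up underneath, the reliability side usually comes down to idempotent processing: overlap your polls, dedupe by txid, and persist the seen-set so restarts can't double-credit or drop a deposit. A rough pattern sketch with an injected fetcher (so the data source is swappable):

```python
# Pattern sketch: poll-with-dedup so overlapping pages or a restart never
# double-credit a deposit. fetch_events is whatever client you actually use
# (TronGrid, a vendor API, ...) and is injected here to keep it swappable.
def detect_deposits(fetch_events, seen_txids, credit):
    """fetch_events() -> iterable of {'txid', 'to', 'amount'} dicts."""
    for ev in fetch_events():
        if ev["txid"] in seen_txids:
            continue                  # already processed (overlap-safe)
        seen_txids.add(ev["txid"])    # persist this set in real usage
        credit(ev["to"], ev["amount"])

# Demo with a fake page containing a duplicate event:
credited, seen = [], set()
fake_page = [{"txid": "t1", "to": "addr1", "amount": 25.0},
             {"txid": "t1", "to": "addr1", "amount": 25.0},  # duplicate
             {"txid": "t2", "to": "addr2", "amount": 99.0}]
detect_deposits(lambda: fake_page, seen,
                lambda to, amt: credited.append((to, amt)))
```

With this shape you can afford to poll aggressively with overlapping windows, because duplicates are free.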

What's everyone using as the backbone for Tron payment detection in production?

Especially interested in how you handle the reliability side — can't afford to miss a customer deposit.

reddit.com
u/buddies2705 — 18 hours ago

Sourcing bulk historical Pump.fun trade data for analytics — what does your stack look like?

Working on a research project that needs the full trade history for graduated Pump.fun tokens. Buys/sells on the bonding curve, creator wallet activity, graduation timing, then post-migration AMM trades on PumpSwap.

Live data is solved — WebSocket subscriptions, Geyser, take your pick. The historical side is what's wrecking the budget.

What I've tried:

- Solana RPC — rate limits made it useless past a few hours of data

- Archive node providers — quotes started at $500/mo, hard to justify before I even know if the dataset is useful

- Dune on Solana — slow at this scale, and decoded data for the Pump.fun program was patchy when I last looked

- Parsing logs from my own node — works for a small window, doesn't scale

What I actually want is a parquet or CSV dump per month, hosted somewhere I can pay per GB or per query rather than a flat retainer.

For folks doing serious on-chain research on Solana — what does your historical-data layer actually look like? Self-hosted? Vendor? Some hybrid?

reddit.com
u/buddies2705 — 3 days ago
▲ 2 r/solana

Anyone else find that "smart money" wallet tracking is basically useless for actual copy-trading?

Spent the last few weekends trying to backtest a copy-trading strategy on Solana. The idea: find wallets with high win rates over the past 30 days, copy their trades with ~2s execution delay.

Problem: every "smart money" leaderboard I look at (Cielo, Nansen, the various Solana-specific ones) shows wallets with 60–80% win rates that look amazing on paper. When I actually replay them with realistic execution lag and fees, the alpha mostly disappears.

A few things I've figured out so far:

  • A lot of these "winners" got most of their PnL from buying tokens during pre-launch or in the first few blocks. By the time you can copy them with any realistic latency, you're buying at 3–5x their entry.
  • The shorter the analysis window (8h, 24h), the more contamination from this pre-window position-taking. 14–30 day windows are way cleaner, but most tools default to 24h.
  • Memecoin wallets are the worst — anyone trading low-liquidity stuff has zero tolerance for execution delay. Your 1.5s lag kills you before the trade is even confirmed.

The only category that actually held up under realistic conditions was wallets trading higher-liquidity stuff (LSTs, tokenized equities, blue chips on Solana). Lower headline returns, but the alpha survives being copied.
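To make the latency point concrete, here's the toy version of the replay I'm describing: the leader's return is computed from their entry, the copier's from the price after the execution lag. All numbers are invented:

```python
# Toy illustration of entry-lag erosion: replay a leader's trade against a
# price path, but fill the copier N seconds later. Prices are invented.
prices = {0: 1.00, 1: 1.80, 2: 2.60, 3: 3.00}  # second -> price

def copied_return(entry_t, exit_t, lag, prices):
    my_entry = prices[entry_t + lag]       # you fill after the lag
    return prices[exit_t] / my_entry - 1.0

leader = prices[3] / prices[0] - 1.0       # 200% on paper
copier = copied_return(0, 3, 2, prices)    # ~15% after a 2-second lag
```

On a fast-moving memecoin, a headline 200% becomes ~15% with a 2s fill, before fees and slippage. That's the whole "alpha disappears on replay" effect in one division.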

Curious if anyone has different experience. Do you actually run copy-trading in production, and if so, what data window are you using to qualify wallets? Or have you given up on this whole approach?

reddit.com
u/buddies2705 — 3 days ago
▲ 7 r/BASE

How are you pulling historical DEX trade data without burning $5k/month?

Building a tax/PnL tool for DeFi traders. Need every swap a wallet has done across Uniswap v2/v3, Sushi, Balancer, Curve, and a handful of L2 DEXs over the last 18 months.

What I've tried:

  1. The Graph hosted subgraphs — fine for one chain at a time, dies when you try to query a year of data with pagination. Half the subgraphs I need got deprecated when the hosted service was sunset.
  2. Self-hosted archive node — Geth full sync is fine, but archive mode is 18 TB and growing. We're a 3-person team. This is not a thing we want to babysit.
  3. Alchemy/Infura — rate limits make 18-month backfills take literal days, and the bill scales linearly per user. Not great.
  4. Dune — great for analysis, terrible as a backend. Query API is rate limited and not really meant for app-tier reads.

So what's everyone actually doing? Are people just eating the archive node cost? Paying $3–5k/month to one of the paid providers? I feel like I'm missing an obvious option.

Context: ~600 users in beta. Our infra cost per user is currently embarrassing. Need to fix this before we open paid signups.

reddit.com
u/buddies2705 — 3 days ago
▲ 6 r/solana

Spent 3 months building a Solana indexer and I'm still missing slots. What am I doing wrong?

Hey, looking for a sanity check from anyone who's been through this.

We started building our own Solana indexer back in January for a small DEX analytics dashboard. Validator + Geyser plugin writing to Postgres. Seemed manageable on paper.

Three months later: still falling behind by 200–400 slots during peak hours. Restart the plugin and it takes 6–8 hours to backfill before we're real-time again. AWS bill went from ~$400 to ~$2,100 because we kept scaling the box up trying to outrun the firehose.

Things I've tried:

  • Switched from Yellowstone to a custom Geyser fork
  • Filtering events at the plugin level (helped, but we still lose state we actually need)
  • Splitting writes into a Kafka topic + downstream consumers
  • Throwing a beefier machine at it (i3en.6xlarge → i3en.12xlarge, marginal improvement)

Honestly at this point I'm wondering if running our own infra is the right call at all. My CTO is convinced we'll save money long-term, but I'm not seeing the math anymore.

For anyone running a production Solana indexer:

  • How long did it take you to get stable?
  • Are you actually saving money vs paying a data provider?
  • Is there a setup that doesn't fall over every 10 days?

Considering throwing it all away and just paying someone. Talk me into or out of it.

reddit.com
u/buddies2705 — 3 days ago
▲ 7 r/solana

I'm a data scientist, not a blockchain engineer — easiest way to get Ethereum + Solana data into BigQuery?

For a research project I need fairly standard tabular data: transactions, transfers, DEX trades, balance updates for Ethereum and Solana, ideally with consistent column naming across chains and going back at least 2–3 years.

The team's existing approach involves running an Ethereum archive node and a custom Solana Geyser plugin, and frankly I don't want to spend my next quarter on infra I don't understand well.

Is there a sane way to just get Parquet files dropped into a GCS bucket so I can LOAD DATA into BigQuery and start working? I'd rather pay for the data than spend three months building the pipeline. What are people in this situation actually doing in 2026?

reddit.com
u/buddies2705 — 5 days ago
▲ 3 r/ethdev

Need 4+ years of historical DEX trades for backtesting — what does the loading pipeline actually look like?

For an ML feature engineering project I need every Uniswap, Curve, PancakeSwap, and Raydium trade from 2021 onwards loaded into Snowflake.

RPC backfill on a self-hosted Ethereum archive node is going to take weeks at this volume, the existing subgraphs are missing fields we need, and Dune is great for ad-hoc but I can't COPY INTO from a query result.

Has anyone done bulk historical loading of DEX trades into a warehouse cleanly?

Specifically curious about file format (Parquet vs JSONL), how people partition by block range, and whether anyone has found a vendor that just delivers this as columnar dumps to S3 instead of forcing us to build the extraction layer ourselves.
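For the partitioning question, the layout I keep converging on is one file per fixed-width block range, so backfills and re-requests map cleanly to file names. A sketch of that bucketing (JSONL for brevity; in practice you'd emit Parquet via pyarrow instead):

```python
# Layout sketch: bucket rows into fixed-width block ranges and write one
# file per bucket. JSONL here to keep the example dependency-free; the
# same partition_path scheme works for Parquet output.
import json
import os
import tempfile

BUCKET = 100_000  # blocks per partition file

def partition_path(root, block):
    lo = (block // BUCKET) * BUCKET
    return os.path.join(root, f"blocks_{lo}_{lo + BUCKET - 1}.jsonl")

root = tempfile.mkdtemp()
trades = [{"block": 17_000_123, "amount_usd": 10.0},   # invented rows
          {"block": 17_000_999, "amount_usd": 5.0},
          {"block": 18_250_001, "amount_usd": 7.0}]
for row in trades:
    with open(partition_path(root, row["block"]), "a") as f:
        f.write(json.dumps(row) + "\n")

files = sorted(os.listdir(root))
```

The nice property is that a missing block range is a missing file, so gap detection and re-delivery from a vendor become a directory diff.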

reddit.com
u/buddies2705 — 5 days ago
▲ 2 r/ethdev

We need a real-time pipeline: every DEX trade across major EVM chains and Solana lands in ClickHouse within seconds for live dashboards. I've seen teams do this with Kafka → ClickHouse Kafka engine, but I'm wondering if there are managed crypto-specific data feeds that already deliver in protobuf/Avro and integrate cleanly.

What's the architecture people are actually running in production? Specifically interested in how you handle backfill and gap recovery.
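On gap recovery specifically, the core of every setup I've seen is the same: track ingested block heights, diff them against the expected contiguous range, and re-request the missing spans from the source. A minimal sketch of that diff:

```python
# Gap-recovery sketch: compare ingested block heights against the expected
# contiguous range and emit (start, end) spans to re-request from the feed.
def find_gaps(ingested, lo, hi):
    have = set(ingested)
    gaps, start = [], None
    for b in range(lo, hi + 1):
        if b not in have:
            if start is None:
                start = b                  # open a new gap
        elif start is not None:
            gaps.append((start, b - 1))    # close the gap
            start = None
    if start is not None:
        gaps.append((start, hi))           # gap runs to the end of range
    return gaps

gaps = find_gaps([100, 101, 104, 105, 108], 100, 108)
```

Running this periodically against the warehouse, and feeding the spans back into a backfill queue, is what keeps the "within seconds" live path honest.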

reddit.com
u/buddies2705 — 13 days ago
▲ 4 r/ethdev

For a project I'm backtesting an LP strategy on Polygon and I need historical reserves for a specific pool every minute over the last 90 days. Pulling this from RPC means archive node queries which is expensive. Subgraphs are sometimes incomplete for older pools.

How are people getting historical pool state at this granularity? Especially across L2s.
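One thing that cuts the query cost a lot, whatever the source: you don't need a snapshot per minute, only a snapshot per reserve change (Sync/Swap events), forward-filled onto a minute grid afterwards. A sketch of that fill, with invented snapshots:

```python
# Sketch: store reserves only when they change (event-driven snapshots),
# then forward-fill onto a fixed 60-second grid for the backtest.
# snapshots: sorted list of (unix_ts, reserves) tuples, values invented.
def minute_grid(snapshots, start, end):
    out, i, last = [], 0, None
    for t in range(start, end + 1, 60):
        while i < len(snapshots) and snapshots[i][0] <= t:
            last = snapshots[i][1]   # most recent state at or before t
            i += 1
        out.append((t, last))        # carry the last known state forward
    return out

series = minute_grid([(0, (10, 20)), (130, (11, 19))], 0, 240)
```

For a quiet pool that means dozens of rows instead of 129,600 for the 90-day window, which makes archive queries (or a vendor's per-row pricing) far less painful.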

reddit.com
u/buddies2705 — 13 days ago
▲ 1 r/ethdev

For our quarterly DAO report I want to show how much ETH our multisig has spent on gas across all its operational transactions. Sounds simple, but distinguishing actual gas spend from refunds, separating it from value transfers, and getting USD values at the moment each transaction was executed — it's surprisingly tedious.
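The per-transaction math itself is settled on post-EIP-1559 Ethereum: actual gas spend is `gasUsed * effectiveGasPrice` from the receipt (the refund is already netted out of `gasUsed`), and the tx `value` field is a transfer, not gas. The tedious part is the historical USD lookup. A sketch, with the price lookup left as a hypothetical callable:

```python
# Sketch of the gas-only accounting: sum gasUsed * effectiveGasPrice
# (receipt fields, in wei), priced in USD at each execution timestamp.
# eth_usd_at is a hypothetical price-lookup function you supply.
def gas_spend_usd(receipts, eth_usd_at):
    """receipts: dicts with gas_used, effective_gas_price (wei), timestamp."""
    total = 0.0
    for r in receipts:
        eth = r["gas_used"] * r["effective_gas_price"] / 1e18  # wei -> ETH
        total += eth * eth_usd_at(r["timestamp"])
    return total

# Demo: one simple transfer at 50 gwei with ETH at a flat $2000.
total = gas_spend_usd(
    [{"gas_used": 21_000, "effective_gas_price": 50_000_000_000,
      "timestamp": 1_700_000_000}],
    lambda ts: 2000.0)
```

Run that over every outbound tx from the multisig for the quarter and the report column is done; the open question is only where the receipts and the timestamped prices come from.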

What's working for other DAOs?

reddit.com
u/buddies2705 — 13 days ago
▲ 7 r/solana

We need 2+ years of complete Solana SPL and SOL transfers loaded into our data warehouse for compliance analytics. GraphQL pagination is fine for sampling but burns through quotas at this scale. Running a Solana archive node and parsing transactions ourselves is overkill — we just need the transfer rows.

Is there a vendor that delivers historical Solana transfer data as Parquet files via S3, partitioned by date, that I can ingest into Snowflake/BigQuery directly?

reddit.com
u/buddies2705 — 13 days ago
▲ 2 r/Tronix

For a custody tool I need an alert the instant a specific cold wallet sends anything — any token, any amount. The wallet shouldn't normally have outbound activity so any send is suspicious. RPC subscriptions on Tron are limited and I don't want to run a full node just for one wallet.

Is there a hosted WebSocket where I subscribe with a filter like sender = <my_address> and I just receive an event whenever that address sends? Bonus if it includes the receiving address and USD value of what was sent.

reddit.com
u/buddies2705 — 13 days ago
▲ 2 r/Tronix

Building a whale alert bot for Tron USDT specifically. The volume is massive — hundreds of millions in transfers per day — and I only care about the ones above a threshold like $10K. Polling TronGrid every few seconds with a high amount filter is technically possible but the rate limits make it useless in practice, and most providers don't even let you filter server-side on amount.

Is there a real-time WebSocket I can subscribe to where I filter by USDT contract address AND minimum amount in the subscription itself, so I'm not pulling the full firehose and filtering client-side?
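Until a feed with server-side amount filters turns up, the fallback is dropping sub-threshold events at the very first hop on the client. Trivial, but worth isolating so it's swappable when server-side filtering becomes available (the USDT contract address below is the widely published Tron mainnet one; verify before relying on it):

```python
# Fallback sketch: client-side filter on contract + USD amount, applied as
# early as possible in the pipeline. Event field names are hypothetical.
USDT_TRC20 = "TR7NHqjeKQxGTCi8q8ZY4pL8otSzgjLj6t"  # Tron mainnet USDT (verify)

def whale_alerts(events, min_usd=10_000):
    for ev in events:
        if ev["contract"] == USDT_TRC20 and ev["amount_usd"] >= min_usd:
            yield ev

hits = list(whale_alerts([
    {"contract": USDT_TRC20, "amount_usd": 50_000},
    {"contract": USDT_TRC20, "amount_usd": 500},        # below threshold
    {"contract": "SomeOtherContract", "amount_usd": 99_999},
]))
```

The catch, as the post says, is that this still pulls the full USDT firehose over the wire, which is exactly the cost a server-side filter would remove.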

reddit.com
u/buddies2705 — 13 days ago
▲ 6 r/solana

A lot of Solana transfers happen inside CPI calls — a user calls program A, which internally calls Token Program to move funds. From the top-level transaction these are invisible unless you parse innerInstructions. Most explorers and APIs I've tried only show top-level transfers.
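If you're stuck doing it yourself, the parse isn't deep: with `jsonParsed` encoding, `getTransaction` returns `meta.innerInstructions`, and the spl-token transfers hide in there. A sketch against a minimal hand-built payload (real responses carry many more fields):

```python
# Sketch: walk meta.innerInstructions of a jsonParsed getTransaction
# response and pull out spl-token transfers that never show up as
# top-level instructions.
def inner_transfers(tx):
    out = []
    for group in tx["meta"].get("innerInstructions", []):
        for ix in group["instructions"]:
            parsed = ix.get("parsed")
            if (ix.get("program") == "spl-token" and isinstance(parsed, dict)
                    and parsed.get("type") in ("transfer", "transferChecked")):
                out.append(parsed["info"])
    return out

# Minimal hand-built payload in the jsonParsed shape:
sample = {"meta": {"innerInstructions": [{"index": 0, "instructions": [
    {"program": "spl-token",
     "parsed": {"type": "transfer",
                "info": {"source": "src", "destination": "dst",
                         "amount": "1000"}}}]}]}}
moves = inner_transfers(sample)
```

Doing this at scale per-transaction is exactly the indexing work the post is trying to avoid, which is why a pre-indexed inner-transfer API is the right ask.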

Is there a transfer API that indexes inner-instruction transfers separately?

reddit.com
u/buddies2705 — 14 days ago
▲ 3 r/solana

I want to pull every SPL transfer that was executed by a particular program — say a vesting contract or a custom escrow program — and ignore the rest. Each transfer has the program that triggered it but parsing this from raw ledger data means iterating through instructions and inner instructions per transaction.

Is there an indexed transfers API that lets you filter by program ID directly?

reddit.com
u/buddies2705 — 14 days ago
▲ 3 r/solana

For a treasury accounting report I need total inflow USD, total outflow USD, and net position change for ~50 wallets across an entire fiscal year. Solscan only goes back so far in the UI and exports are paginated to the point of being useless. Helius works but I'm doing chained RPC calls and stitching results together.

Is there an indexed API where I can just query "all transfers for wallet X between date A and date B" with USD values already attached?

reddit.com
u/buddies2705 — 14 days ago