PointFlow: open-source React library for live LiDAR streams in the browser (WebGPU-accelerated)
I built a browser rendering library specifically for live LiDAR and point-cloud streams, and published v0.1.0 this week.
It handles the problems that come up with live data: memory that would otherwise grow without bound (bounded ring buffer, configurable ceiling), main-thread stalls from parsing (dedicated Web Worker), and wasted render budget on off-screen or low-priority points (importance-weighted GPU sampling via WebGPU compute).
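To make the bounded-memory idea concrete, here's a minimal sketch of a fixed-capacity ring buffer for streamed XYZ points. This is my own illustration of the technique, not PointFlow's actual implementation or API; the class name and layout are assumptions.

```typescript
// Sketch: a bounded ring buffer for streamed points (not PointFlow's code).
// Capacity is fixed up front, so memory stays flat no matter how long the
// stream runs; once full, the oldest points are overwritten.
class PointRingBuffer {
  private data: Float32Array; // x, y, z per point, packed
  private head = 0;           // next write index, in points
  private count = 0;          // points currently stored

  constructor(private capacity: number) {
    this.data = new Float32Array(capacity * 3);
  }

  // Append a batch of packed XYZ triples (length must be a multiple of 3).
  push(points: Float32Array): void {
    for (let i = 0; i < points.length; i += 3) {
      const base = this.head * 3;
      this.data[base] = points[i];
      this.data[base + 1] = points[i + 1];
      this.data[base + 2] = points[i + 2];
      this.head = (this.head + 1) % this.capacity;
      if (this.count < this.capacity) this.count++;
    }
  }

  get length(): number {
    return this.count;
  }

  // Copy out the stored points (order is irrelevant for rendering).
  snapshot(): Float32Array {
    return this.data.slice(0, this.count * 3);
  }
}
```

The same shape works as a GPU-side vertex buffer: keep a write cursor and upload only the newly written slice each frame.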
Supports WebSocket, SSE, and ROS rosbridge as stream sources. Also loads COPC, LAS 1.0-1.4, LAZ, PLY, and XYZ files with progressive rendering.
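For a sense of what the worker-side parsing has to do, here's a sketch of decoding one binary frame of little-endian XYZ float32 triples, the kind of payload a WebSocket LiDAR stream might carry. The frame layout here is an assumption for illustration, not PointFlow's wire format.

```typescript
// Sketch: decode a binary frame of packed little-endian XYZ float32 triples.
// Frame layout is hypothetical, not PointFlow's actual wire format.
function decodeXyzFrame(buffer: ArrayBuffer): Float32Array {
  if (buffer.byteLength % 12 !== 0) {
    throw new Error("frame is not a whole number of XYZ float32 triples");
  }
  const view = new DataView(buffer);
  const out = new Float32Array(buffer.byteLength / 4);
  for (let i = 0; i < out.length; i++) {
    out[i] = view.getFloat32(i * 4, true); // explicit little-endian read
  }
  return out;
}
```

Running this in a Web Worker and transferring the resulting buffer back (rather than copying it) is what keeps parsing off the main thread.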
Demo (synthetic Lorenz attractor stream, not real LiDAR):
https://pointflow-demo.vercel.app
Docs:
https://pointflow-docs.vercel.app
GitHub:
https://github.com/Zleman/pointflow
This is a React library, so it won't replace desktop viewers. But if you're building a web dashboard that ingests a live sensor stream, it handles the rendering layer.
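For anyone wondering what "importance-weighted sampling" means concretely: the library runs it as a WebGPU compute pass, but the underlying idea can be shown on the CPU with weighted reservoir sampling (Efraimidis–Skoulida-style keys). This sketch is my illustration of the technique, not PointFlow's code; names and signature are assumptions.

```typescript
// Sketch: keep the k "most important" points out of n, where each point i
// has weight w[i] > 0. Assign each point the key u^(1/w) with u uniform in
// (0, 1); the top-k keys are a weighted sample without replacement.
function sampleByImportance(
  weights: number[],
  k: number,
  rand: () => number = Math.random
): number[] {
  const keyed = weights.map((w, i) => ({ i, key: Math.pow(rand(), 1 / w) }));
  keyed.sort((a, b) => b.key - a.key);          // largest keys win
  return keyed.slice(0, k).map((e) => e.i);     // indices of kept points
}
```

On the GPU the same thing becomes a key-per-point compute shader plus a top-k selection, so the render pass only ever touches the points worth drawing.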
I'm posting this because I want it scrutinised by people who actually work with this data. What I've built isn't perfect, and I'd genuinely value feedback from anyone with real LiDAR use cases more than anything else. If something doesn't map to how you work, or there's a gap in what the library handles, I want to know.