
Built an XR system that recognizes heart signals using just a camera. Would love feedback!
We started exploring a simple question:
> Can you read someone's heart rate using nothing but a camera?
That led us into remote photoplethysmography (rPPG): estimating heart rate from the subtle pixel variations in a person's face, captured through an ordinary webcam or phone camera.
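For anyone curious what the signal-processing core looks like: a minimal sketch of the idea (not our exact pipeline) is to average a color channel over a face ROI each frame, then pick the dominant frequency in the plausible heart-rate band. The function name and parameters here are just illustrative:

```python
import numpy as np

def estimate_bpm(trace, fps, lo=0.7, hi=4.0):
    """Estimate heart rate from a per-frame mean-channel trace of a face ROI.

    Finds the dominant frequency of the detrended trace inside the
    plausible heart-rate band (0.7-4 Hz, i.e. 42-240 BPM).
    """
    x = np.asarray(trace, dtype=float)
    x = x - x.mean()                              # remove DC offset
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)  # frequency bins in Hz
    power = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= lo) & (freqs <= hi)
    return 60.0 * freqs[band][np.argmax(power[band])]

# demo on a synthetic trace: 10 s at 30 fps with a 1.2 Hz (72 BPM) pulse
fps = 30
t = np.arange(300) / fps
trace = 0.5 + 0.01 * np.sin(2 * np.pi * 1.2 * t)
print(round(estimate_bpm(trace, fps)))  # → 72
```

Real traces need a face detector, detrending, and a band-pass filter before this step, but the FFT-peak idea is the core.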
From there, we started thinking:
What if physiology wasn’t just shown as numbers… but experienced spatially?
So we built CardioVerse, an XR concept where:
- heart rate drives a reactive environment
- emotions influence colors and atmosphere
- blood flow and hormones become visual elements
- AR overlays can show physiological state
- VR mode turns the body into a navigable “digital twin”
Some scenarios we explored:
- XR glasses in a meeting room showing calm/stress states (consent-based)
- immersive VR “inside the body” visualization
- phone-based AR self-analysis using camera input only
Current stack:
- rPPG (CHROM / POS methods)
- OpenCV + signal processing
- Three.js / WebXR
- real-time BPM streaming via WebSocket
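Since the stack mentions CHROM/POS: for readers unfamiliar with them, here is a rough sketch of the POS (plane-orthogonal-to-skin, Wang et al. 2017) projection over a window of mean RGB values. This is a simplified illustration, not our production code; the function name and window length are assumptions:

```python
import numpy as np

def pos_pulse(rgb, fps, win_sec=1.6):
    """POS-style pulse extraction from an (N, 3) array of per-frame
    mean R, G, B values of a face ROI. Returns a 1-D pulse signal.
    """
    n = rgb.shape[0]
    w = int(win_sec * fps)        # sliding-window length in frames
    h = np.zeros(n)
    for t in range(n - w + 1):
        block = rgb[t:t + w]
        # temporal normalization: divide each channel by its window mean
        cn = block / block.mean(axis=0)
        # project onto the two axes orthogonal to the skin-tone direction
        s1 = cn[:, 1] - cn[:, 2]                   # G - B
        s2 = -2 * cn[:, 0] + cn[:, 1] + cn[:, 2]   # -2R + G + B
        # alpha-tuned combination, overlap-added into the output
        p = s1 + (s1.std() / (s2.std() + 1e-9)) * s2
        h[t:t + w] += p - p.mean()
    return h
```

The resulting signal can be fed to a band-pass filter and peak/FFT stage to get BPM, which is then what gets streamed over the WebSocket.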
Would genuinely love feedback on:
- the product direction
- technical realism
- XR interaction ideas
- ethical/privacy considerations
- possible healthcare / wellness use cases
u/slakashkumar — 3 days ago