I built a SwiftUI app that uses AI to generate synchronized haptic feedback for any video [Open Source]
Hey everyone!
I've been working on a project called HapticVideoApp and I'm finally ready to share the source code.
The core idea is to make videos feel more immersive by generating haptic patterns from the audio track: on-device FFT analysis detects peaks and rhythms, which are then mapped to CoreHaptics events.
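To give a feel for the approach, here's a minimal sketch of the peak-to-haptic mapping. It's deliberately simplified: it thresholds the raw amplitude envelope instead of running the full FFT pipeline, and the function names are illustrative, not the app's actual API.

```swift
import Accelerate
import CoreHaptics

// Illustrative only: threshold the amplitude envelope of mono PCM samples
// and return timestamps of detected peaks. The real app uses FFT analysis,
// and a real detector would also de-bounce clusters of nearby peaks.
func peakTimes(in samples: [Float], sampleRate: Double, threshold: Float = 0.6) -> [Double] {
    var magnitudes = [Float](repeating: 0, count: samples.count)
    vDSP_vabs(samples, 1, &magnitudes, 1, vDSP_Length(samples.count))
    return magnitudes.enumerated()
        .filter { $0.element > threshold }
        .map { Double($0.offset) / sampleRate }
}

// Fire one sharp transient haptic at each detected peak.
func playHaptics(at times: [Double]) throws {
    let engine = try CHHapticEngine()
    try engine.start()

    let events = times.map { t in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.8)
            ],
            relativeTime: t
        )
    }
    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```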
Key Features:
- AI-Driven Haptics: Automatically generates `.ahap` patterns from audio during the upload process.
- Firebase Integration: Uses Firebase Auth for users and Cloud Storage for the heavy video assets.
- Deep Linking: Implemented a custom URL scheme (`hapticapp://`) for sharing specific videos (see the sketch after this list).
- Feedback System: Built a small bridge that lets users submit bugs directly to GitHub Issues from within the app (also sketched below).
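For the deep linking piece, a minimal SwiftUI handler for the custom scheme looks roughly like this. The `hapticapp://video/<id>` URL layout is my assumption for illustration, not necessarily the app's actual format.

```swift
import SwiftUI

// Minimal sketch of handling the hapticapp:// scheme in SwiftUI.
// The hapticapp://video/<id> layout is assumed for illustration.
@main
struct HapticVideoDemoApp: App {
    @State private var videoID: String?

    var body: some Scene {
        WindowGroup {
            Text(videoID.map { "Opening video \($0)" } ?? "Waiting for a deep link")
                .onOpenURL { url in
                    // e.g. hapticapp://video/abc123 -> scheme "hapticapp",
                    // host "video", lastPathComponent "abc123"
                    guard url.scheme == "hapticapp", url.host == "video" else { return }
                    videoID = url.lastPathComponent
                }
        }
    }
}
```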
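And for the feedback bridge, the core of it is a POST to GitHub's create-issue REST endpoint. This is a generic sketch rather than the app's actual implementation, and the token handling is hand-waved: a shipped app should route this through a backend instead of embedding credentials.

```swift
import Foundation

// Sketch of filing a GitHub issue from the app via the REST API.
// POST /repos/{owner}/{repo}/issues returns 201 Created on success.
func submitFeedback(title: String, body: String, token: String) async throws {
    let url = URL(string: "https://api.github.com/repos/banthia14aman/HapticVideoApp/issues")!
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("Bearer \(token)", forHTTPHeaderField: "Authorization")
    request.setValue("application/vnd.github+json", forHTTPHeaderField: "Accept")
    request.httpBody = try JSONSerialization.data(withJSONObject: ["title": title, "body": body])

    let (_, response) = try await URLSession.shared.data(for: request)
    guard (response as? HTTPURLResponse)?.statusCode == 201 else {
        throw URLError(.badServerResponse)
    }
}
```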
I'm still refining the FFT algorithm to make the haptics feel even "tighter" with the beats, so I'd love some feedback on the code or the approach!
GitHub Link: https://github.com/banthia14aman/HapticVideoApp
Tech Stack: SwiftUI, AVFoundation, CoreHaptics, Firebase (Auth, Firestore, Cloud Storage).
Let me know what you think!