Feel · Form · Frame
An iOS app that auto-generates haptics from a video using on-device AI — sound classification, optical-flow motion analysis, and spectral DSP all fused — then lets you fine-tune on a Premiere-style multi-lane timeline.
The editor
A familiar mobile-DAW chrome with a top bar, a video monitor with floating timecode, a transport row, a tool palette, and a multi-lane timeline that locks events to video frames.
The timeline
Video on top. Three haptic lanes underneath — one per event type. A sticky left column so you always know what you're editing, even when scrolled three zoom-levels deep.
The AI pipeline
A pure DSP analyzer is blind to what is happening — it fires on dialogue's bass, misses silent action, and can't tell a snare from a glass break. Phase 1 layers two on-device AI gates on top so the haptics actually understand the video.
Everything above runs on-device using frameworks shipped with iOS — zero API calls, zero cost, zero data leaving the phone.
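As a rough illustration of the classification gate, here is a minimal SoundAnalysis sketch, assuming the clip's audio has been exported to a local file. The ImpactGate type, label identifiers, and thresholds are illustrative rather than the app's actual code.

import Foundation
import CoreMedia
import SoundAnalysis

// Collects time ranges where the built-in classifier hears speech or impact-like
// sounds. Check SNClassifySoundRequest.knownClassifications for the real label set.
final class ImpactGate: NSObject, SNResultsObserving {
    private(set) var speechRanges: [CMTimeRange] = []   // suppress DSP hits here
    private(set) var impactRanges: [CMTimeRange] = []   // boost or add events here

    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        switch top.identifier {
        case "speech" where top.confidence > 0.8:
            speechRanges.append(result.timeRange)
        case "gunshot_gunfire", "explosion", "glass_breaking":
            if top.confidence > 0.6 { impactRanges.append(result.timeRange) }
        default:
            break
        }
    }
}

func classify(audioFileAt url: URL, into gate: ImpactGate) throws {
    let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
    let analyzer = try SNAudioFileAnalyzer(url: url)
    try analyzer.add(request, withObserver: gate)
    analyzer.analyze()   // blocking; results arrive on the observer as it runs
}

The collected time ranges would then mask or boost the DSP analyzer's candidate events before the pattern is built.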
Design philosophy
Why HapticVideoApp
A focused stack of features that get out of your way and let you feel the cut.
One lane per event type so markers never collide. Sticky left header column, adaptive ruler labels, pinch-to-zoom from 0.5× to 4×.
CHHapticAdvancedPatternPlayer is locked to AVPlayer state. Seeks, edits, and re-edits all re-sync the engine without dropped events (a sync sketch follows this feature list).
On-device sound classification suppresses dialogue and boosts gunshots, explosions, and breaking glass; optical flow adds haptics for silent action (motion-gate sketch after this list); classical FFT catches everything else.
Tap (transient), Beat (impact), Hold (continuous). Each one has intensity, sharpness, and — for holds — a duration slider.
Firebase Firestore + Storage means your videos and patterns live everywhere. Share a link, feel the haptics.
Every button, slider, and event press has its own UIKit haptic. Editing haptics with haptic feedback feels right.
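A minimal sketch of the playback lock, assuming a CHHapticPattern has already been assembled from the timeline; the HapticSync type and sync(to:) method are illustrative, not the app's API. In practice it would be driven from a periodic time observer and the transport controls.

import AVFoundation
import CoreHaptics

// Keeps a CHHapticAdvancedPatternPlayer aligned with an AVPlayer.
final class HapticSync {
    private let engine: CHHapticEngine
    private var player: CHHapticAdvancedPatternPlayer?
    private var hasStarted = false

    init() throws {
        engine = try CHHapticEngine()
        try engine.start()
    }

    func load(_ pattern: CHHapticPattern) throws {
        player = try engine.makeAdvancedPlayer(with: pattern)
        hasStarted = false
    }

    // Call whenever the video plays, pauses, or seeks.
    func sync(to videoPlayer: AVPlayer) throws {
        guard let player else { return }
        try player.seek(toOffset: videoPlayer.currentTime().seconds)
        if videoPlayer.timeControlStatus == .playing {
            if hasStarted {
                try player.resume(atTime: CHHapticTimeImmediate)
            } else {
                try player.start(atTime: CHHapticTimeImmediate)
                hasStarted = true
            }
        } else {
            try player.pause(atTime: CHHapticTimeImmediate)
        }
    }
}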
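And a sketch of the silent-action gate, assuming consecutive video frames are available as CVPixelBuffers and Vision's default two-component Float32 flow output; the helper name and the idea of thresholding the mean magnitude are assumptions for illustration.

import CoreVideo
import Vision

// Estimates how much motion happened between two frames. The caller would emit a
// haptic when the returned value crosses a tuned threshold.
func meanFlowMagnitude(previous: CVPixelBuffer, current: CVPixelBuffer) throws -> Float {
    let request = VNGenerateOpticalFlowRequest(targetedCVPixelBuffer: previous)
    let handler = VNImageRequestHandler(cvPixelBuffer: current)
    try handler.perform([request])
    guard let observation = request.results?.first as? VNPixelBufferObservation else { return 0 }

    let flow = observation.pixelBuffer
    CVPixelBufferLockBaseAddress(flow, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(flow, .readOnly) }

    let width = CVPixelBufferGetWidth(flow)
    let height = CVPixelBufferGetHeight(flow)
    let rowBytes = CVPixelBufferGetBytesPerRow(flow)
    guard let base = CVPixelBufferGetBaseAddress(flow) else { return 0 }

    var total: Float = 0
    for y in 0..<height {
        let row = base.advanced(by: y * rowBytes).assumingMemoryBound(to: Float.self)
        for x in 0..<width {
            let dx = row[2 * x], dy = row[2 * x + 1]   // per-pixel flow vector
            total += (dx * dx + dy * dy).squareRoot()
        }
    }
    return total / Float(width * height)
}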
How it works
Grab any clip from your library or record a new one — HapticVideoApp pulls thumbnails and duration on the fly.
FFT + spectral flux runs in parallel with Apple's sound classifier and Vision optical flow. All three signals are fused into a semantically aware pattern — dialogue suppressed, impacts boosted, silent action filled in (an onset-detection sketch follows these steps).
Drop, drag, tweak intensity. Preview a single event or scrub the whole clip. Pinch to zoom in on the cut you care about. (The event-to-haptic mapping is sketched after these steps.)
Your finished video and its haptic pattern upload to Firebase and become a shareable link for anyone on iOS to feel.
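For the classical DSP leg, here is a minimal spectral-flux onset sketch, assuming the clip's audio has been decoded to mono Float samples; frame size, hop, and threshold are illustrative values, not the app's tuning.

import Accelerate

// Spectral-flux onset detection: flag frames whose spectrum grows sharply
// relative to the previous frame.
func spectralFluxOnsets(samples: [Float],
                        frameSize: Int = 1024,
                        hop: Int = 512,
                        threshold: Float = 1.0) -> [Int] {
    let halfSize = frameSize / 2
    let log2n = vDSP_Length(frameSize.trailingZeroBitCount)
    guard let setup = vDSP_create_fftsetup(log2n, FFTRadix(kFFTRadix2)) else { return [] }
    defer { vDSP_destroy_fftsetup(setup) }

    var previous = [Float](repeating: 0, count: halfSize)
    var onsets: [Int] = []
    var start = 0

    while start + frameSize <= samples.count {
        var real = [Float](repeating: 0, count: halfSize)
        var imag = [Float](repeating: 0, count: halfSize)
        var magnitudes = [Float](repeating: 0, count: halfSize)

        real.withUnsafeMutableBufferPointer { realPtr in
            imag.withUnsafeMutableBufferPointer { imagPtr in
                var split = DSPSplitComplex(realp: realPtr.baseAddress!,
                                            imagp: imagPtr.baseAddress!)
                // Pack the real-valued frame into split-complex form, run the FFT,
                // then take per-bin magnitudes.
                samples.withUnsafeBufferPointer { buf in
                    buf.baseAddress!.advanced(by: start)
                        .withMemoryRebound(to: DSPComplex.self, capacity: halfSize) {
                            vDSP_ctoz($0, 2, &split, 1, vDSP_Length(halfSize))
                        }
                }
                vDSP_fft_zrip(setup, &split, 1, log2n, FFTDirection(kFFTDirection_Forward))
                vDSP_zvabs(&split, 1, &magnitudes, 1, vDSP_Length(halfSize))
            }
        }

        // Spectral flux: sum of positive per-bin magnitude increases since the last frame.
        var flux: Float = 0
        for bin in 0..<halfSize {
            flux += max(magnitudes[bin] - previous[bin], 0)
        }
        if flux > threshold { onsets.append(start) }   // candidate transient at this sample offset
        previous = magnitudes
        start += hop
    }
    return onsets
}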
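And a sketch of how the three event types could map onto Core Haptics, assuming a simple timeline model; the TimelineEvent enum and the beat boost factor are assumptions for illustration, not the app's data model.

import CoreHaptics

enum TimelineEvent {
    case tap(time: TimeInterval, intensity: Float, sharpness: Float)
    case beat(time: TimeInterval, intensity: Float, sharpness: Float)
    case hold(time: TimeInterval, intensity: Float, sharpness: Float, duration: TimeInterval)
}

func hapticEvent(for event: TimelineEvent) -> CHHapticEvent {
    func params(_ intensity: Float, _ sharpness: Float) -> [CHHapticEventParameter] {
        [CHHapticEventParameter(parameterID: .hapticIntensity, value: intensity),
         CHHapticEventParameter(parameterID: .hapticSharpness, value: sharpness)]
    }
    switch event {
    case let .tap(time, intensity, sharpness):
        // A light transient for small cues.
        return CHHapticEvent(eventType: .hapticTransient,
                             parameters: params(intensity, sharpness),
                             relativeTime: time)
    case let .beat(time, intensity, sharpness):
        // Impacts are still transients, just driven harder.
        return CHHapticEvent(eventType: .hapticTransient,
                             parameters: params(min(intensity * 1.25, 1.0), sharpness),
                             relativeTime: time)
    case let .hold(time, intensity, sharpness, duration):
        // Continuous events carry a duration for rumbles and sustained motion.
        return CHHapticEvent(eventType: .hapticContinuous,
                             parameters: params(intensity, sharpness),
                             relativeTime: time,
                             duration: duration)
    }
}

// Building the pattern from an edited timeline could then be as simple as:
// let pattern = try CHHapticPattern(events: timeline.map(hapticEvent), parameters: [])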
A video without haptics is half a story. Feel the other half.
Clone the repo, open it in Xcode 16+, drop in a Firebase config, and run. The full source for the editor, the timeline, and the audio analyzer is on GitHub.
git clone https://github.com/banthia14aman/HapticVideoApp.git