23 APR 2026

rahulmnavneeth

uify skill — build anything


This page is the single entry point for building with UIFY. It is self-contained for the common path — an agent or human can complete ~90% of typical tasks without leaving this document. Links go out only for deep material (EKF-on-manifold derivation, CLAP internals, per-platform camera backend quirks).

If something in this document contradicts a reference doc, the code wins; file an issue or fix the doc.

What UIFY is

A Rust engine that turns camera input into a time-indexed stream of geometric primitives, with optional real-time transport to other programs. Five built-in trackers, all implementing one trait:

| Tracker | Geometry G | Typical consumer |
|---|---|---|
| point (ball) | Vec2 / Vec3 | audio automation, cursor control |
| bounding box | (SE(2), w, h) | detection gating, VFX masks |
| plane | SE(3) + homography | AR, virtual camera body, compositing |
| face | landmarks + SE(3) pose | expression capture, retargeting |
| rotoscope | contour / mask | matte extraction, VFX |

Every tracker produces Sample<G, C> values — value, tangent-space covariance, confidence, host-monotonic timestamp.

60-second quick start

# Development shell (Nix). Installs Rust toolchain, pre-commit hooks.
nix develop

# Build the workspace.
make

# Run the full test suite (unit + property + synthetic GT).
make test

# Lint (clippy warnings = errors).
make lint

# Build the CLAP plugin bundle.
make clap

# Format everything.
treefmt

The one abstraction

Every tracker is this:

pub trait Tracker {
    type Geometry;
    type Covariance;
    type Measurement<'a>;

    fn step(&mut self, measurement: Self::Measurement<'_>)
        -> Result<Option<Sample<Self::Geometry, Self::Covariance>>, TrackerError>;

    fn reset(&mut self);
}

Different trackers differ only in Geometry and Covariance. The pipeline around them — detector, associator, filter, smoother, sink — is the same shape.

Minimal pipeline (point tracker)

use uify_core::Sample;
use uify_point::PointTracker;

// Construct, feed frames, get samples.
let mut t = PointTracker::new(/* config */);

loop {
    let frame = camera.next_frame()?;                // runtime::camera
    if let Some(sample) = t.step(&frame)? {          // Sample<Vec2, Mat2>
        sink.send(sample)?;                           // OSC / MIDI / shm / CLAP param
    }
}

DAW hand-control example (the flagship)

Camera → HandLandmarker → GestureFSM → ParameterMap → CLAP plugin params
                              │
                              └─ states: idle, grab, hold, release

The GestureFSM is a consumer of tracker samples, not part of the tracker. That keeps the same hand tracker reusable for non-DAW consumers. See examples/daw-hand-control/ for the full source.

Latency + accuracy rules of thumb

Glass-to-parameter target: ~15–25 ms baseline, ~8–12 ms with distilled models + zero-copy capture. Below ~8 ms requires a 240 Hz camera.
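The 240 Hz note follows from frame-period arithmetic alone: capture contributes up to a full frame period of latency before any processing starts, so the camera's frame rate puts a hard floor under glass-to-parameter time. A quick sanity check:

```rust
// Frame-period arithmetic behind the latency targets: worst-case capture
// wait is one full frame period before processing even begins.
fn frame_period_ms(fps: f64) -> f64 {
    1000.0 / fps
}

fn main() {
    // At 60 Hz, the frame period alone (~16.7 ms) already blows an 8 ms budget.
    assert!(frame_period_ms(60.0) > 8.0);
    // At 240 Hz, even a worst-case full-period wait (~4.2 ms) fits under 8 ms.
    assert!(frame_period_ms(240.0) < 8.0);
    println!("60 Hz period:  {:.1} ms", frame_period_ms(60.0));
    println!("240 Hz period: {:.1} ms", frame_period_ms(240.0));
}
```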

Gotchas

Extending

See architecture-extension for the full guide.