Halon Labs · v0.4 preview
A perception layer for ambient interfaces. Realtime mesh, edge inference, adaptive UI.
01 / Surface
Six primitives. One mesh. The interface gets out of the way.
A peer graph that converges in milliseconds. State stays on-device, intent stays in sight.
Models compile to WebGPU. The cloud is optional, the latency is not.
Layout responds to attention, not just viewport. Quiet by default.
Camera, motion, audio, location. Composable streams, deterministic timing.
Personal context never leaves the device. Auditable by default. No silent telemetry.
Typed interfaces for Web, iOS, Android. One mental model, three runtimes.
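The "composable streams, deterministic timing" and "typed interfaces" claims can be pictured with a minimal sketch. Everything below is illustrative: `Sample`, `Stream`, `map`, and `merge` are hypothetical names, not the Halon SDK's actual API.

```typescript
// Hypothetical sketch of typed, composable sensor streams.
// A stream is a timestamped sequence; combinators preserve timing.

type Sample<T> = { t: number; value: T }; // t = milliseconds since start
type Stream<T> = Sample<T>[];             // a recorded stream of samples

// map: transform values while preserving timestamps,
// so downstream timing stays deterministic.
function map<A, B>(s: Stream<A>, f: (a: A) => B): Stream<B> {
  return s.map(({ t, value }) => ({ t, value: f(value) }));
}

// merge: interleave two streams in timestamp order
// (Array.prototype.sort is stable, so ties keep their input order).
function merge<T>(a: Stream<T>, b: Stream<T>): Stream<T> {
  return [...a, ...b].sort((x, y) => x.t - y.t);
}

// Example: fuse scaled motion magnitudes with audio levels into one feed.
const motion: Stream<number> = [{ t: 0, value: 0.2 }, { t: 20, value: 0.9 }];
const audio: Stream<number> = [{ t: 10, value: 0.5 }];
const fused = merge(map(motion, (v) => v * 100), audio);
console.log(fused.map((s) => s.t)); // timestamps in order: 0, 10, 20
```

The same shape would port to Swift or Kotlin with the types intact, which is one way "one mental model, three runtimes" could cash out.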
02 / Principles
Four principles. Tested against every line we ship.
03 / Method
Instrument the mesh. Capture intent, latency, attention. No assumptions, only signal.
Weave inference, motion, surface. Each layer earns its place by removing one tap.
Subtract until the interface stops asking for attention. Then ship.
04 / Performance
Median end-to-end inference latency on commodity hardware.
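"Median" here is the p50 over recorded end-to-end samples. As a point of reference, a minimal sketch of how such a figure is computed; the sample values are made up, not measured Halon numbers.

```typescript
// Hypothetical: compute the median (p50) of latency samples in ms.
function medianMs(samples: number[]): number {
  const s = [...samples].sort((a, b) => a - b); // ascending copy
  const mid = Math.floor(s.length / 2);
  // odd count: middle element; even count: mean of the two middle elements
  return s.length % 2 ? s[mid] : (s[mid - 1] + s[mid]) / 2;
}

// Illustrative samples only: one slow outlier barely moves the median,
// which is why p50 (not mean) is the honest headline for latency.
console.log(medianMs([3.1, 2.4, 2.8, 9.6, 2.9])); // → 2.9
```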
A small, considered SDK. A short waitlist. Early access for builders shipping ambient products.