Halon Labs · v0.4 preview

Perception, rendered.

A perception layer for ambient interfaces. Realtime mesh, edge inference, adaptive UI.

Realtime Mesh Edge Inference Adaptive UI Sensor Fusion Privacy First Open SDK

01 / Surface

Built like instruments. Felt like ambient light.

Six primitives. One mesh. The interface gets out of the way.

Mesh

Realtime mesh

A peer graph that converges in milliseconds. State stays on-device, intent stays in sight.
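How a peer graph can converge regardless of message order is easiest to see with a last-writer-wins register, a minimal CRDT-style sketch. This is illustrative only, not Halon's actual mesh protocol; every identifier below is an assumption.

```typescript
// Minimal last-writer-wins register. Peers merge by logical timestamp,
// with peer id as a deterministic tie-breaker, so any merge order
// converges to the same state. Not the actual Halon mesh protocol.
interface LwwRegister<T> {
  value: T;
  timestamp: number; // logical clock
  peer: string;      // tie-breaker when clocks collide
}

function merge<T>(a: LwwRegister<T>, b: LwwRegister<T>): LwwRegister<T> {
  if (a.timestamp !== b.timestamp) {
    return a.timestamp > b.timestamp ? a : b;
  }
  // Equal timestamps: break the tie by peer id so all nodes agree.
  return a.peer > b.peer ? a : b;
}
```

Because `merge` is commutative, associative, and idempotent, peers can exchange state in any order and still agree, which is what lets convergence happen in a few round trips rather than requiring coordination.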

Inference

Edge inference

Models compile to WebGPU. The cloud is optional, the latency is not.

Adapt

Adaptive UI

Layout responds to attention, not just viewport. Quiet by default.

Fusion

Sensor fusion

Camera, motion, audio, location. Composable streams, deterministic timing.
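One way to picture composable streams with deterministic timing is a frame type that carries a monotonic timestamp through every transform, so merges are ordered by time rather than arrival. A minimal sketch, with all names assumed for illustration:

```typescript
// A timestamped frame: the timing metadata survives every transform.
interface Frame<T> {
  timestamp: number; // milliseconds, monotonic
  value: T;
}

// A composable stream: map and merge without losing timing.
class Stream<T> {
  constructor(private frames: Frame<T>[]) {}

  map<U>(fn: (v: T) => U): Stream<U> {
    return new Stream(
      this.frames.map((f) => ({ timestamp: f.timestamp, value: fn(f.value) }))
    );
  }

  // Deterministic merge: frames from both sources, ordered by timestamp,
  // so fusing camera + motion yields the same interleaving every run.
  static merge<A, B>(a: Stream<A>, b: Stream<B>): Stream<A | B> {
    const all = [...a.frames, ...b.frames].sort(
      (x, y) => x.timestamp - y.timestamp
    );
    return new Stream(all as Frame<A | B>[]);
  }

  toArray(): Frame<T>[] {
    return this.frames;
  }
}
```

Ordering by capture timestamp rather than arrival order is what makes fusion reproducible: replaying the same sensor frames always produces the same merged sequence.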

Trust

Privacy first

Personal context never leaves the device. Auditable by default. No silent telemetry.

SDK

Open SDK

Typed interfaces for Web, iOS, Android. One mental model, three runtimes.
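"One mental model, three runtimes" suggests a shared typed surface that each platform implements natively. A hypothetical sketch of what that surface might look like on the Web runtime; none of these identifiers come from the actual SDK:

```typescript
// Hypothetical shared SDK surface; identifiers are illustrative only.
interface InferenceRequest {
  model: string;
  input: Float32Array;
}

interface InferenceResult {
  output: Float32Array;
  latencyMs: number;
}

// The one mental model: each runtime (Web, iOS, Android) provides its
// own implementation behind the same typed interface.
interface HalonClient {
  infer(req: InferenceRequest): Promise<InferenceResult>;
}

// A trivial on-device stub that echoes its input, standing in for a
// WebGPU-backed implementation.
class StubClient implements HalonClient {
  async infer(req: InferenceRequest): Promise<InferenceResult> {
    return { output: req.input, latencyMs: 0 };
  }
}
```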

02 / Principles

Four principles. Tested against every line we ship.

03 / Method

A short loop. Repeated until the seams disappear.

  1. Listen

     Instrument the mesh. Capture intent, latency, attention. No assumptions, only signal.

  2. Compose

     Weave inference, motion, surface. Each layer earns its place by removing one tap.

  3. Quiet

     Subtract until the interface stops asking for attention. Then ship.

04 / Performance

0.0ms

Median end-to-end inference latency on commodity hardware.

Build with Halon.

A small, considered SDK. A short waitlist. Early access for builders shipping ambient products.