StrangeCompute

Gauge Intelligence for Language, Data, and Field Dynamics

StrangeCompute is the narrative shell around Holon inference, the Hamiltonian Factory, and edge-deployable language systems that learn to speak through geometric memory rather than through dependence on a single monolithic model.

Soul(512) -> RosettaStone -> Thought(4096)
token -> embeddings -> Compressor -> evolve_soul
single artifact | selected council | all models
Arrival-like operator shell for living inference systems.
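The pipeline above can be sketched in a few lines. This is a minimal illustration only, assuming the Soul(512) and Thought(4096) dimensions from the diagram; the RosettaStone, Compressor, and evolve_soul internals shown here (random linear projections, mean pooling, a tanh update) are stand-in assumptions, not the actual Holon implementation.

```python
import numpy as np

SOUL_DIM, THOUGHT_DIM = 512, 4096  # dimensions from the Soul(512) -> Thought(4096) diagram
rng = np.random.default_rng(0)

class Compressor:
    """Stand-in compressor: pools token embeddings down to a soul state."""
    def __init__(self, embed_dim=THOUGHT_DIM):
        self.down = rng.standard_normal((embed_dim, SOUL_DIM)) / np.sqrt(embed_dim)

    def compress(self, embeddings):
        # pool over tokens, then project into Soul(512)
        return embeddings.mean(axis=0) @ self.down

def evolve_soul(soul, step=0.1):
    """Stand-in soul update: nudge the state and renormalize it."""
    evolved = soul + step * np.tanh(soul)
    return evolved / np.linalg.norm(evolved)

class RosettaStone:
    """Stand-in bridge: expands a soul state back into thought space."""
    def __init__(self):
        self.up = rng.standard_normal((SOUL_DIM, THOUGHT_DIM)) / np.sqrt(SOUL_DIM)

    def expand(self, soul):
        return soul @ self.up  # Soul(512) -> Thought(4096)

# token -> embeddings -> Compressor -> evolve_soul -> RosettaStone -> Thought
embeddings = rng.standard_normal((8, THOUGHT_DIM))  # 8 stand-in token embeddings
soul = evolve_soul(Compressor().compress(embeddings))
thought = RosettaStone().expand(soul)
```

The point of the sketch is the shape of the round trip: language enters as token embeddings, collapses into a compact soul state, and is expanded back out for reasoning, rather than living inside one monolithic model.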

Scrollytelling

Four operating layers, one geometry-first stack

01

Holon Runtime

Gauge-equivariant artifacts are compressed into soul states, transported through structured memory, then expanded back into language for live reasoning and dialogue.

02

Hamiltonian Factory

Living data is reduced toward anyonic statistics and Hamiltonian structure, letting the same geometric machinery operate beyond pure language into field-like data spaces.

03

Edge LLM Compute

Inference can run per artifact, per selected council, or through the full ensemble, with lightweight head and embedding strategies designed for smaller footprints and local deployment.
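A toy dispatcher makes the three modes concrete. Everything here is an assumption for illustration: the artifact names, the tiny dimensions, and the idea that each artifact is a lightweight linear head over shared embeddings, which is one plausible reading of the "lightweight head" strategy, not the actual edge runtime.

```python
import numpy as np

rng = np.random.default_rng(1)
EMBED, VOCAB = 64, 100  # deliberately small, edge-sized dimensions

# Hypothetical artifacts: each is just a lightweight linear head over shared embeddings.
artifacts = {name: rng.standard_normal((EMBED, VOCAB)) / np.sqrt(EMBED)
             for name in ("alpha", "beta", "gamma")}

def infer(x, mode="single", council=None):
    """Run inference per artifact, per selected council, or through the full ensemble."""
    if mode == "single":
        heads = [artifacts[(council or ["alpha"])[0]]]   # one artifact at a time
    elif mode == "council":
        heads = [artifacts[n] for n in (council or artifacts)]  # a selected subset
    else:
        heads = list(artifacts.values())                 # "ensemble": every artifact votes
    logits = np.stack([x @ h for h in heads])
    return logits.mean(axis=0)  # simple mean mixing across the chosen heads

x = rng.standard_normal(EMBED)
single = infer(x, "single", ["beta"])
chorus = infer(x, "ensemble")
```

Mean mixing is the simplest possible combiner; the real stack's routing and mixing are richer, but the footprint story is the same: each head is a single small matrix, cheap enough to ship and run locally.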

04

Demonstrations

Operator-facing and user-facing experiences sit side by side: cinematic chat, artifact telemetry, training controls, and export paths into Rust and production runtime targets.

Holon

Inference that can run as a single mind or a council

Zeus routing, Bellerophon mixing, and the coherency filter all remain visible in the system architecture, so you can operate a single artifact, a selected combination, or the full chorus, depending on the task.
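One way to picture how those three stages compose, as a hedged sketch only: the function names below echo the components named above, but the internals (similarity-softmax routing, weighted blending, an agreement threshold) are illustrative assumptions, not the actual Zeus, Bellerophon, or coherency-filter algorithms.

```python
import numpy as np

rng = np.random.default_rng(2)
DIM, N_ARTIFACTS = 32, 4

# Hypothetical artifact outputs: one candidate vector per artifact for a query.
candidates = rng.standard_normal((N_ARTIFACTS, DIM))

def zeus_route(query, outputs):
    """Routing sketch: softmax weights from query/output similarity."""
    scores = outputs @ query / np.sqrt(DIM)
    w = np.exp(scores - scores.max())
    return w / w.sum()

def bellerophon_mix(outputs, weights):
    """Mixing sketch: weighted blend of the artifact outputs."""
    return weights @ outputs

def coherency_filter(mixed, outputs, floor=0.0):
    """Filter sketch: accept the blend only if it agrees with the chorus on average."""
    sims = outputs @ mixed / (np.linalg.norm(outputs, axis=1) * np.linalg.norm(mixed) + 1e-9)
    return float(sims.mean()) > floor

query = rng.standard_normal(DIM)
weights = zeus_route(query, candidates)      # who answers, and how loudly
mixed = bellerophon_mix(candidates, weights)  # blend the chosen voices
accepted = coherency_filter(mixed, candidates)  # keep only coherent blends
```

Restricting `candidates` to one row recovers single-artifact operation, and a subset of rows recovers a selected council, which is why the same three stages can serve all three operating modes.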

Factory

Data reduced into anyonic and Hamiltonian structure

The factory side is not a separate gimmick. It is the same gauge logic applied to a wider data space, letting artifact creation, reduction, and interpretation share a common language with the LLM-facing runtime.

Edge

Smaller footprints, local demonstrations, production migration

The roadmap runs from Python artifacts and training through Rust/Burn export, lightweight head training, edge inference, and eventually production-grade deployment surfaces.

Demonstration Paths

What visitors, operators, and collaborators can actually do

User Experience

Public chat, cinematic token traces, model selection, and live story-driven explanation of what the system is doing.

Operator Console

Training controls, artifacts, model routing, smoke tests, and export surfaces for the Holon pipeline.

Research Surface

Documentation, methods, and academic framing live in AnyonicLabs while the StrangeCompute shell stays experiential and forward-facing.