Project Dossier

The Synchronicity Engine

Status: Functional Prototype · Phase 0
Location: Abercrave, Upper Swansea Valley
Objective: Recursive video · autonomous time curation

1 — Architectural Logic

A tri-partite system for digesting time.

The Engine operates across three interlocking components.

Component 01

The Eye

High-resolution acquisition of the live environment. The input layer — what the machine sees in the room it inhabits.

Component 02

The Brain

Real-time frame analysis using Computer Vision. Assigns Interest Weights to temporal segments — deciding which moments carry density and which do not.

Component 03

The Loom

The storage and playback engine. Folds recent generations of footage into the current frame, driven by the Brain's weights. The site of the recursion.
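
The fold can be sketched as an opacity schedule over past generations. This is a toy model, assuming a simple exponential decay by layer age with normalisation — the Engine's actual weighting is driven by the Brain's Interest Weights, not a fixed curve.

```javascript
// Minimal sketch of the Loom's fold: each past generation is composited into
// the current frame at an opacity that decays with its age. The exponential
// decay and the normalisation step are illustrative assumptions.
function layerOpacities(nLayers, decay = 0.5) {
  // Generation 0 is "now"; older generations get geometrically less weight.
  const raw = Array.from({ length: nLayers }, (_, age) => decay ** age);
  const sum = raw.reduce((a, b) => a + b, 0);
  return raw.map((w) => w / sum); // normalised so the composite stays in range
}
```

With `decay = 0.5` and two layers, the present frame carries two thirds of the composite and the previous generation one third; deeper stacks push the past toward a faint residue rather than letting it overwhelm the frame.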

2 — The Temporal Zoom Heuristic

Time treated as elastic.

Unlike standard video — which treats every second as equal — the Engine assigns a value to time. Stasis compresses. Activity expands. The machine is not a neutral recorder; it is an editor that runs in real time.

Low Interest → compress

Stasis Compression

When the ML identifies an empty room, repetitive micro-movements, or darkness, it increases the read-head speed. Uneventful time is swallowed.

High Interest → dilate

Activity Dilation

Gestures of construction, human interaction, complex tool use — these slow the playback and increase the Opacity Weight of that layer. Labour becomes visually dense.

The result

A "Time-Condensed Fossil." Ten hours of real time might produce a twenty-minute video in which periods of intense human labour are visually thick and slow — while the intervening hours are mere ghost-frames.
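
The compression/dilation arithmetic can be sketched directly. This is a toy model assuming a linear map from interest weight to read-head speed; the speed bounds are invented for illustration, not taken from the prototype.

```javascript
// Sketch of the temporal-zoom heuristic: per-second interest weights (0.0–1.0)
// drive the read-head speed. Bounds are illustrative assumptions.
const MAX_SPEED = 32.0; // stasis: swallow 32 s of footage per output second
const MIN_SPEED = 0.5;  // high interest: dilate, play at half speed

function readHeadSpeed(interest) {
  // Linear interpolation: interest 0 → MAX_SPEED, interest 1 → MIN_SPEED.
  return MAX_SPEED + (MIN_SPEED - MAX_SPEED) * interest;
}

// Output duration (in seconds) of a recording, given one interest weight
// per recorded second: each second contributes 1/speed to the fossil.
function condensedDuration(weights) {
  return weights.reduce((t, w) => t + 1 / readHeadSpeed(w), 0);
}
```

Under these bounds, thirty-two seconds of dead air collapse into one output second, while a single high-interest second dilates to two — the same asymmetry that turns ten hours of build into a twenty-minute fossil.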

3 — The Modes of Playback

A director's vocabulary of time.

The Engine is not a passive loop. It is a multi-state machine. In performance the operator — or the heuristic layer — switches between specific temporal behaviours. Each mode is a different relationship between the machine and its own accumulated past.

Mode 01
Linear
The Baseline

High-fidelity, real-time passthrough. Used at the start of the build to ground the audience in the physical space before the temporal warping begins. Establishing the "now."

Mode 02
Elastic
The Pulse

Time stretches and contracts based on detected activity. Lingers on moments of high information density — fine-motor assembly, complex wiring — and snaps through stasis. Playback that feels alive.

Mode 03
Compressed
The Weight

Successive generations of the past are folded into the frame at high speed. The "Time Machine" mode — collapsing hours of construction into a dense visual fossil where the past layers over the present with increasing opacity.

Mode 04
Scattered
The Debris Field

Disparate frames pulled from the entire 8-week buffer simultaneously. A temporal montage where multiple points in the build history occupy the screen at once — a non-linear visual record of the machine's evolution.

Mode 05
Randomized
The Entropic State

The engine jumps between speeds, directions, and generations without linear sequence. Breaks the audience's sense of "when" they are. Emphasises abstraction once the recorded data reaches peak density.

Mode 06
Context-Specific
The Semantic Filter

Playback prioritised by what is happening — only frames tagged as "Interaction," or "Hand," or "Screen." The machine curates its own history to reveal a single narrative thread running through the build.
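
Reduced to a sketch, each mode is a frame-selection strategy over the accumulated buffer. The logic below is a toy stand-in: Elastic is omitted because it modulates speed rather than selection, and Context-Specific would query the semantic tag store described in Section 4. The offsets and indices are illustrative, not the Engine's.

```javascript
// Toy stand-ins for the mode vocabulary: each mode maps (buffer length,
// playhead index) to the frame indices drawn this tick.
function selectFrames(mode, bufLen, head) {
  switch (mode) {
    case 'linear':     // real-time passthrough: only the "now"
      return [head];
    case 'compressed': // fold past generations behind the present frame
      return [head, head - 60, head - 120].filter((i) => i >= 0);
    case 'scattered':  // pull from across the whole buffer at once
      return [0, Math.floor(bufLen / 2), bufLen - 1];
    case 'randomized': // entropic: any frame, any generation
      return [Math.floor(Math.random() * bufLen)];
    default:
      return [head];
  }
}
```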

For the record

"The Engine has a specific vocabulary of time. During the build, I can shift it from a simple Linear witness into an Elastic or Compressed state. It's an automated editor that can jump into Scattered or Context-Specific modes to reveal the history of the build in ways a standard camera never could."

4 — Research Avenues · Phase 1 Roadmap

From a night's work to a production machine.

Three research trajectories are required to move the functional prototype into a reliable performance system.

A

The Mode Controller — state management

Transition Architecture

Building a robust state-management system that can transition between modes — fading from Linear to Compressed, for instance — without breaking the frame-capture pipeline mid-performance.
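
One way to structure such a transition: a fixed-length crossfade ticked once per rendered frame, decoupled from capture so a mode switch can never stall the pipeline. A minimal sketch — the class and field names are hypothetical, not project API.

```javascript
// Sketch of the transition architecture: the renderer crossfades between an
// outgoing and an incoming mode over a fixed number of frames. Capture runs
// independently; this object only answers "what do I blend, and how much?".
class ModeController {
  constructor(initial) {
    this.current = initial;
    this.next = null;
    this.mix = 0;   // 0 = all `current`, 1 = all `next`
    this.step = 0;
  }
  transitionTo(mode, frames = 90) { // e.g. 90 frames ≈ 3 s at 30 fps
    this.next = mode;
    this.mix = 0;
    this.step = 1 / frames;
  }
  tick() { // called once per rendered frame
    if (this.next !== null) {
      this.mix = Math.min(1, this.mix + this.step);
      if (this.mix >= 1) {
        this.current = this.next;
        this.next = null;
        this.mix = 0;
      }
    }
    return { from: this.current, to: this.next, mix: this.mix };
  }
}
```

The renderer draws `from` at `1 - mix` opacity and `to` at `mix`; because `tick()` never blocks, the fade from Linear to Compressed rides on top of an uninterrupted capture loop.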

Performance Dashboard

Developing the operator interface for the live show. The dashboard needs to allow real-time mode switching, with enough visual feedback to make live decisions legible under performance conditions.

B

Semantic Tagging & Retrieval — the ML layer

Text Tag Pipeline

Researching how the ML layer passes semantic tags — "hand," "tool," "screen," "interaction" — to the storage database in real time, without creating a latency bottleneck in the capture pipeline.

Archive Querying

Ensuring Context-Specific mode can instantly query the 8-week archive to surface every frame where a specific action occurred. Speed of retrieval is a performance-critical requirement.
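
The retrieval requirement points toward an inverted index: a map from semantic tag to frame indices, built incrementally as the Brain emits tags, so a query never scans the archive. A minimal sketch — the structure and method names are assumptions.

```javascript
// Inverted index sketch: tag → ordered list of frame IDs. `add` runs as
// frames are tagged; `query` is a single map lookup, not an archive scan.
class TagIndex {
  constructor() {
    this.index = new Map();
  }
  add(frameId, tags) {
    for (const t of tags) {
      if (!this.index.has(t)) this.index.set(t, []);
      this.index.get(t).push(frameId);
    }
  }
  query(tag) {
    return this.index.get(tag) ?? [];
  }
}
```

Because frames are indexed on ingest, pulling every "hand" frame from an 8-week buffer costs one lookup regardless of archive size — the shape of guarantee a performance-critical retrieval path needs.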

C

Temporal Zoom Implementation — the elastic algorithm

Speed Slope Refinement

Refining the algorithm that decides the slope of the speed change in Elastic mode. How do we linger on interesting frames without the acceleration feeling jarring? The transition is as important as the destination.
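
One candidate answer to the slope question: ease between the current and target speeds along a smoothstep curve, which has zero slope at both ends, so acceleration begins and ends gently. Smoothstep is an assumption here — the research may settle on another curve.

```javascript
// Smoothstep easing: zero derivative at t = 0 and t = 1, so a speed change
// ramps in and out rather than snapping. Curve choice is illustrative.
function smoothstep(t) {
  t = Math.max(0, Math.min(1, t)); // clamp to [0, 1]
  return t * t * (3 - 2 * t);
}

// Read-head speed at progress t (0..1) of a transition from `from` to `to`.
function easedSpeed(from, to, t) {
  return from + (to - from) * smoothstep(t);
}
```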

Interest Weight Calibration

Researching lightweight pre-trained models — MediaPipe, TSN — that can map visual interest to a numerical value (0.0–1.0) and drive a playbackRate variable in real time without frame drops.
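
The drive loop might look like the sketch below: a per-frame interest score in 0.0–1.0 is smoothed with an exponential moving average before it touches playbackRate, so one noisy frame cannot yank the speed. The smoothing factor and rate bounds are assumptions for illustration.

```javascript
// Interest → playbackRate driver sketch. `alpha` controls smoothing:
// lower = steadier, higher = more reactive. All constants are assumptions.
function makeRateDriver({ min = 0.5, max = 16, alpha = 0.2 } = {}) {
  let ema = 0.5; // smoothed interest, seeded at neutral
  return function update(interest) {
    ema = alpha * interest + (1 - alpha) * ema;
    // High interest → slow playback (min); low interest → fast (max).
    return max + (min - max) * ema; // value to assign to video.playbackRate
  };
}
```

Calling `update` once per analysed frame keeps the rate continuous; the EMA is cheap enough that it adds no measurable latency to the capture pipeline.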

5 — Budgetary Justification

A technical feasibility study.

The £8,250 ask funds two things: the hardware that takes the system from a browser experiment to a persistent physical archive, and the specialist consultancy that bridges standard motion detection with sophisticated temporal tagging.

Hardware · £2,000
Moves the buffer from 600 frames (20 seconds) in browser RAM to a persistent TB-scale NVMe storage system. Required for any multi-hour performance scenario.

ML Consultancy · £600
Specifically to bridge the gap between standard motion detection and the nuanced temporal tagging the Brain requires. Saliency mapping at performance speed is a non-trivial problem.

Total ask · £8,250

6 — Conceptual Lineage

Not generative AI — interpretive AI.

The distinction matters. This machine does not create from nothing. It shapes what is already there — it digests duration. The lineage runs through artists who understood time as material.

Steina & Woody Vasulka
Pioneers of electronic signal manipulation. Video feedback as a generative system — the signal turning on itself.
Alvin Lucier
I Am Sitting in a Room. Acoustic recursion as a disclosure of space. Each generation degrades into the room's own resonance — the medium becomes the message.
Tehching Hsieh
Durational art that explores the physical weight of passing time. One year at a time. The work is inseparable from the duration required to make it.
Interpretive AI
The move from Generative AI (creating from nothing) to Interpretive AI (shaping what is already there). The Engine belongs to the second tradition.
The pitch in one breath

We are building a machine that doesn't just record the history of its own making — it digests it. It discards the empty space and crystallises the action, leaving behind a visual object that carries the physical weight of the time spent building it.