Creative Practice Development · Arts Council of Wales

The Synchronicity Engine gathers people by synchronicity. Those who enter its field become part of it — their presence folded into its memory, time growing elastic and non-linear around them.

A live machine that compresses time into a present moment crowded with its own recent past.

The Synchronicity Engine is a live performance machine. This application funds its technical heart — a recursive video compression pipeline that folds real-world duration into playback, generation after generation, until the frame accumulates visual density and begins to carry a sense of weight.

Phase · Development
Duration · 15 days over 8 weeks
Ask · £8,350
Deliverable · Working pipeline + demo

In Performance

A machine witnessing its own construction.

The Synchronicity Engine runs as a recursive video system. The machine displays its output live — and because the camera is also filming the machine itself, that output is continuously re-ingested. Each generation is screen-captured and fed back in, the image compounding its own recent past as the build progresses.

Time accumulates and distorts inside the image as successive generations compress more real-world duration into less playback time. A decision layer — reading motion, duration, and activity — shapes what is revealed, and when. Its choices are visible on screen: gestures persist and amplify, periods of stillness compress away.

This development phase builds that core behaviour: a system that captures, compresses, and replays its own recent past.

  1. Two performers begin assembling the machine on stage.
  2. A camera, already running, captures them at work.
  3. Early moments of the build reappear inside later frames — faster, layered.
  4. Cables, screens, gestures of construction compress and accumulate across generations.
  5. By the time the machine is complete, it has recorded — and is still replaying — its own coming-into-being.
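The five steps above can be sketched as a toy simulation. Nothing here is the project's actual code — frames are plain integers, and names like `next_generation` are illustrative — but it shows how each pass layers a sped-up replay of the previous generation underneath the live scene:

```python
from itertools import zip_longest

def compress(frames, factor=2):
    """Playing footage back at `factor`x speed is modelled as keeping
    every `factor`-th frame."""
    return frames[::factor]

def next_generation(live_frames, previous_gen, factor=2):
    """The camera films the live scene while the on-stage screen replays
    the previous generation at speed; each new frame therefore layers
    'now' on top of a compressed slice of 'then'."""
    replay = compress(previous_gen, factor)
    return [(now, then) for now, then in zip_longest(live_frames, replay)]

# Three generations of a toy eight-frame build.
scene = list(range(8))               # frame indices of the live build
gen0 = scene                         # raw camera footage
gen1 = next_generation(scene, gen0)  # build + sped-up replay of gen0
gen2 = next_generation(scene, gen1)  # build + sped-up replay of gen1
```

By `gen2`, a single frame already nests two layers of the build's own past — the accumulation the performance depends on.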

The Experience

For those who enter the Engine's field, time becomes elastic and non-linear. The machine does not merely record — it folds presence into memory. Moments of encounter persist and accumulate; the room carries its own recent past visibly alongside the present. Those gathered here were not invited by arrangement — they arrived by proximity, by chance, by the field the machine creates.

The full performance

The full performance — in which the machine watches itself being built — rests on this foundation: this phase funds the machine; the full work comes after.

The Idea

Duration becomes visible. Each generation folds more real time into the frame — the image gaining weight, strangeness, and density with every pass.

Footage accumulates density across generations, becoming stranger and more abstract as it deepens.

A decision-making layer determines which generations are shown, when, and at what speed. This is where the machine exhibits behaviour — selecting, foregrounding, and shaping the image in real time.

Heuristic approaches come first — motion detection, duration thresholds, activity-based foregrounding — then machine-learning image recognition, to test whether it offers meaningful improvements. Both approaches are tested, documented, and compared.
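A minimal sketch of the heuristic route — function names and the threshold value are illustrative, not from the application. Frame-to-frame pixel difference serves as a motion score, and low-motion runs are dropped so that stillness compresses away:

```python
def motion_score(prev_frame, frame):
    """Mean absolute per-pixel difference between consecutive frames:
    a crude but fast proxy for on-stage activity."""
    return sum(abs(a - b) for a, b in zip(prev_frame, frame)) / len(frame)

def foreground_segments(frames, threshold=10.0, min_len=2):
    """Keep runs of frames whose motion score clears the threshold;
    runs of stillness are discarded (compressed away)."""
    kept, run = [], []
    for prev, cur in zip(frames, frames[1:]):
        if motion_score(prev, cur) >= threshold:
            run.append(cur)
        else:
            if len(run) >= min_len:
                kept.append(run)
            run = []
    if len(run) >= min_len:
        kept.append(run)
    return kept

# Toy footage: three still frames, then a burst of movement.
frames = [[0, 0], [0, 0], [0, 0], [20, 20], [40, 40], [60, 60], [60, 60]]
segments = foreground_segments(frames)  # one active segment survives
```

An ML classifier would slot in as a drop-in replacement for `motion_score`, which is what makes the planned heuristic-versus-ML comparison straightforward to run.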

The deliverable is not a finished artwork. It is a proven technical foundation — a working pipeline that demonstrates the concept is buildable, captured in a short demonstration piece and documented for future development.

01 / CAPTURE

A recursive capture loop

Camera records a scene. Playback of that scene is screen-captured; that capture's playback is screen-captured in turn. Real time folds into shorter and shorter playback.

02 / COMPRESS

Information density rises

Each generation packs more source seconds into fewer output seconds. Visual artefacts accumulate. The image gains weight, strangeness, and abstraction.
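Assuming a fixed playback speedup per generation — the actual ratio is a design choice the application leaves open — the density arithmetic is a simple geometric series:

```python
def density(speedup, generation):
    """Source seconds represented by one playback second at a given
    generation. Generation 0 is real time; each pass multiplies the
    density by the speedup."""
    return speedup ** generation

def playback_length(source_s, speedup, generation):
    """How long an initially source_s-second take lasts after that
    many compression passes."""
    return source_s / speedup ** generation
```

At a 2× speedup, five generations take each playback second from carrying 1 source second to carrying 32, and a ten-minute take shrinks to under 19 seconds of playback.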

03 / CHOOSE

A decision layer with visible behaviour

The system foregrounds activity — human gestures persist and amplify across generations, while periods of stillness compress away. The output is authored: motion privileged over rest, density over duration.

Builds on traditions of video feedback and generative systems — focused specifically on temporal compression and recursive self-observation.

The Synchronicity Engine is the latest iteration of a long obsession with building a practical time machine — a machine that accumulates the recent past into the present, generation after generation, until the image can barely hold it.
— From the application

Development Structure

Fifteen working days, distributed part-time across eight weeks.

Rather than attempting the full ambition in one leap, this phase builds the technological heart first — a working pipeline, testable and documented, before any larger production ask is made.

Phase 01
Research
≈ 3 days
Survey existing work in recursive video feedback, live compression, and generative video systems. Identify libraries, frameworks, and precedents. Establish the technical shape of the build.
Phase 02
Prototype
≈ 6 days
Build the recursive screen capture loop. Test with source material. Document the visual and temporal behaviour that emerges across successive generations. Iterate.
Phase 03
Intelligence layer
≈ 4 days
Implement heuristic decision-making — motion, duration, activity-based foregrounding. Test machine-learning alternatives. Document findings and comparative behaviour between the two approaches.
Phase 04
Demonstration
≈ 2 days
Produce a short demonstration piece showing the system running on a real scene. Write up the technical documentation.

The Ask · £8,350

A responsible first scale — funding only what the first provable step requires.

Artist fee
£350/day × 15 days · part-time over 8 weeks
£5,250
Hardware — computer
Mid-to-high spec for real-time video processing.
The current functional prototype operates within browser RAM, limiting the buffer to approximately 600 frames — around 20 seconds of footage. A persistent NVMe storage system is required to maintain the multi-hour archive that any real performance scenario demands. This is not a general-purpose purchase; it is the specific infrastructure that takes the system from a browser experiment to a performance-viable machine.
£2,000
Specialist consultation
Machine-learning advice and review · 2 days
£600
Documentation
Artist-led video capture of the working system, distributed across the build phase · 1 day @ £350 day rate
£350
Storage
External SSD, 2TB
£150
Total £8,350
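The hardware line implies a frame rate of about 30 fps (600 frames ≈ 20 seconds). A rough sizing sketch for the multi-hour archive follows; the per-frame size is an assumption (~150 kB for a compressed 1080p frame), not a figure from the application:

```python
FPS = 600 / 20  # the prototype's 600-frame ≈ 20 s buffer implies 30 fps

def archive_bytes(hours, fps=FPS, bytes_per_frame=150_000):
    """Approximate archive size for an `hours`-long capture.
    bytes_per_frame depends on codec and resolution — assumed here."""
    return int(hours * 3600 * fps * bytes_per_frame)
```

On these assumptions a three-hour performance needs roughly 49 GB of persistent storage — far beyond a browser-RAM buffer, which is the gap the NVMe machine closes.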

About

A multidisciplinary artist, musician, and creative technologist in Abercrave, Upper Swansea Valley.

For over twenty years I've built and shipped browser-based interactive experiences, community filmmaking infrastructure, live social platforms, and original sound work — dozens of completed projects across those disciplines, delivered independently. Procrastinatrix (444+ tracks, 2019—) and ffilm.org (community filmmaking infrastructure, ongoing) are the most public evidence of that delivery habit.

I'm neurodivergent and currently pursuing formal assessment. Most of this work has happened alone, outside the scaffolding of collaborators, venues, or funding. This application seeks support for a focused, credible first step on the next thing.

This development phase represents a specific step change in practice — from independent screen-based and browser work into durational live performance with autonomous systems. A working Synchronicity Engine opens commissioning relationships, venue partnerships, and collaborative possibilities that do not currently exist. It is a new capacity, not an extension of an existing one.

Beneath all of it has run a single obsession: building a time machine. The Synchronicity Engine is the form that obsession has finally found.

MID 2000s
Triskabiblios
Alternate reality game covered by argn.com and discussed on Unfiction.
2008 —
Time Travellers Guild
A loose collective and philosophical container for the time-machine obsession — founded as a standing public frame for works exploring time, recursion, and temporal perception.
2019 —
Procrastinatrix
444+ tracks of original electronic music, ongoing.
ONGOING
ffilm.org
Development platform for Arianrhod, the Oliverse, and ongoing mythology work.

What this enables · Still to refine

A proven technical system makes the next ask grounded in reality, not hypothesis.

This phase turns a speculative idea into a demonstrable system — one that can be experienced, tested, and built upon. Risk reduced; ambition preserved.

What the funding supports

  • Fifteen days of focused technical research and prototyping, part-time across eight weeks.
  • Hardware capable of running real-time recursive video compression.
  • Specialist consultation on machine-learning approaches.
  • A documented, demonstrable pipeline as the foundation for future production.

Still to refine

  • Exact hardware specification — dependent on early prototyping decisions.
  • Specialist consultant sourced and confirmed.
  • Start date.

Impact on the field

A documented, open pipeline for recursive video compression and autonomous temporal curation has value beyond this project. The technical research — particularly the comparative findings on heuristic versus machine-learning approaches to interest weighting — will be published openly. Artists and practitioners working in live performance, installation, and durational work in Wales and beyond will be able to build on this foundation.

Supporting experiment

A rough webcam sketch exploring the same core idea.

Uses your camera to demonstrate temporal playback distortion in the browser. Not representative of the finished pipeline — but the underlying impulse is the same one.

Open webcam experiment →