
The Generative Threshold — An Overall Thesis

A philosophical synthesis across the full practice, with a future orientation.


The central claim

The practice is a sustained investigation of a single threshold: the moment at which aesthetic intention passes into a generative system and emerges as something that neither fully belongs to the artist nor fully belongs to the algorithm. What persists across that passage — what survives when composition gives way to initiation — is the underlying question.

The vocabulary the work itself has produced for this threshold: "less composed than initiated" (Senescenence press release); "tested, not executed" (Stained Unravel text); "permitted" (same). The artist authors the rules; the rules run; what appears was not designed but allowed. Authorship is reconstituted on the far side of autonomous process — not abandoned, but redistributed between the rule-giver and the rule-runner.

This is not an aesthetic position so much as a philosophical problem that the practice keeps restating in new formal terms, at different scales, with different materials.


The long arc: taste externalised

Read chronologically, the work traces a single trajectory: the progressive externalisation of taste from subjective preference into structural protocol.

Dissociations (2010–): taste as negative selection. The artist eliminates the incoherent item from each trio; the algorithm accumulates the trace of those eliminations. Taste here is not a positive program but an immune system — not "what I want" but "what doesn't cohere." The result is not a representation of taste but a record of its exercise, which is different.

Death Imitates Language (2016): taste as selection pressure. The artist decides which genetic organisms live, die, reproduce. Taste becomes a force operating on a population — evolutionary rather than intentional. The artist is not making forms but culling them; the forms that persist are those that survived the artist's taste, which is a different kind of authorship.

Hybrid Vigor (2017): taste distributed across a crowd. The selection pressure is still there, but it has been decentralised — now multiple agents apply it simultaneously. What persists is not any individual's taste but whatever survives the intersection of many tastes.

Nested Exchange (2017): taste formalised as maximum-difference. The "hipster algorithm" — each specimen succeeds by being maximally different from all others in the population. This is taste encoded as a structural principle, with no human judge required. The fitness function is a theory of what "better" means, extracted from subjective preference and built into the system's architecture.
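The maximum-difference principle can be sketched in a few lines. This is an illustration only, with invented names and a placeholder distance metric, not the work's actual code:

```python
# Illustrative sketch of a maximum-difference ("hipster") fitness criterion:
# each specimen is scored by its mean distance to every other specimen,
# so the fittest form is the one least like the rest of the population.

def distance(a, b):
    """Per-gene absolute difference; a stand-in for any feature metric."""
    return sum(abs(x - y) for x, y in zip(a, b))

def hipster_fitness(specimen, population):
    """Mean distance from the specimen to all other population members."""
    others = [p for p in population if p is not specimen]
    return sum(distance(specimen, o) for o in others) / len(others)

population = [(0, 0), (0, 1), (9, 9)]
scores = [hipster_fitness(s, population) for s in population]
# The outlier (9, 9) scores highest: it is maximally different.
```

The point of the sketch is the signature: no human judge appears anywhere; "better" is computed entirely from the population's internal geometry.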

Mutant Garden (2019–): taste replaced by formal complexity. The criterion is no longer aesthetic preference at all — it is the relationship between construction complexity and experiential complexity. Taste has become an objective formal measurement. The artist no longer decides; the algorithm decides by counting.

Quantizer (2025): the protocol is the aesthetic content. The work makes its underlying rules perceptible as the thing you look at and judge. Not "the system that produces the image" but "the system itself as image." The destination: desire for protocols, not just for their outputs.

Stained Unravel (2026): the process runs fully autonomously and makes its own history visible. The staining mechanism records how long each cell has held its form; color encodes duration. The system does not just evolve — it remembers its evolution, visibly, as the output you see. Taste has been left so far behind that the question of what the artist "prefers" is no longer coherent. What remains is the rule, running.

The trajectory: subjective elimination → evolutionary selection → distributed selection → structural criterion → formal measurement → protocol as form → autonomous process with memory. Each step pushes the composing hand further from the result while making the result more precisely a record of the process that produced it.


The compression/decompression axis

The practice has identified an inverse relationship between the complexity of the rule and the complexity of the output. The Our Inner Child exhibition text frames this art-historically:

Mondrian compressed: he looked at perceptual complexity (a tree, a landscape) and reduced it toward geometric essence. The De Stijl grid is maximum pattern, minimum randomness. Generative art runs the reverse. Input: a minimal rule. Output: extreme complexity at the pixel level — forms that could not be composed in advance, that exceed what the rule seems to entail when you read it.
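The asymmetry can be made concrete with a deliberately small example. An elementary cellular automaton is used here purely as an illustration (the practice's own systems are not specified at this level): the rule fits in a single byte, yet the output resists any comparable compression.

```python
# A one-byte rule decompressed into a complex field: elementary CA rule 30.

RULE = 30  # the rule number encodes all 8 neighborhood -> next-state mappings

def step(row):
    """One generation: each cell reads (left, center, right) as a 3-bit index."""
    n = len(row)
    return [
        (RULE >> (row[(i - 1) % n] * 4 + row[i] * 2 + row[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0] * 31
row[15] = 1                      # minimal seed: a single live cell
history = [row]
for _ in range(15):
    row = step(row)
    history.append(row)

# 16 rows of aperiodic texture from one byte of rule plus one seed bit.
```

Reading the rule (the integer 30) tells you almost nothing about what the field will look like; the output exceeds what the rule seems to entail, which is the essay's point.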

But the decompressed output is compressed again by the screen. Every digital display layer is a resolution decision, and the screen's resolution is always lower than the rule's potential. The Resolution Paintings make this explicit: large-format photographic prints on metallic paper are the only medium that carries the full decompressed output without reimposing compression. The algorithm generates more than any screen can show; only the print preserves it.

This is not just a technical observation but a philosophical one: the rule is richer than the display. The substrate is always a compression of what the process produced. The work we see is always a sampling of the work that ran. This connects to the archival argument: the on-chain work preserves the rule; the display is the interface that changes. The rule, not the image, is the thing.


Memory and the memoryless

The Markov process is memoryless: each state is determined by the immediately preceding state and nothing else. The recurring palette — blue, green, magenta, greyscale — is the direct counter-example: a 2004 choice that has persisted across twenty-two years and three subsequent works in completely different media and technical substrates.
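Memorylessness is visible in the shape of the step function itself. A minimal sketch (the two-state chain and its names are invented for illustration):

```python
# The memoryless property in miniature: a Markov step function whose
# signature admits only the current state; history cannot influence it.

import random

TRANSITIONS = {  # illustrative two-state chain
    "blue": [("green", 0.5), ("blue", 0.5)],
    "green": [("blue", 0.5), ("green", 0.5)],
}

def markov_step(state, rng):
    """Next state depends on `state` alone; no trace of earlier states."""
    choices, weights = zip(*TRANSITIONS[state])
    return rng.choices(choices, weights=weights)[0]

rng = random.Random(0)
state = "blue"
trajectory = [state := markov_step(state, rng) for _ in range(10)]
```

Nothing in `markov_step` can see the trajectory; a palette carried across twenty-two years is precisely what this function cannot express.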

The palette persists because humans, unlike Markov chains, carry their history forward. The palette is memory. But it is a particular kind of memory: not narrative but chromatic, not argued but embedded, not stated but present in the output regardless of what the algorithm does. It is below the level of intentional decision; it reappears because it was never consciously re-chosen.

The CMYK/RGB analysis extends this: the palette is not just memory but a digital declaration, asserting the medium against the other medium that preceded and surrounds it. Blue (test-screen), green (unreachable in print), magenta (native to CMYK — the exception), black-to-white gradient (trivial on screen, laborious in print). The choice was not theorised in 2004; the account arrived later. Memory and intention operate on different timescales: the choice was made faster than its justification.

The staining mechanism in Stained Unravel builds memory directly into the rule. Duration since last change → color. The cell's history is written on its surface. This is the palette's memory externalised into the system's architecture: instead of a palette that the artist carries across decades, a mechanism that the system uses to carry its own short-term history. Memory at the scale of the run, memory at the scale of the career — both operative simultaneously.
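The described mechanism (duration since last change mapped to color) reduces to an age counter per cell. All names here are assumed for illustration; this is a sketch of the principle, not the work's implementation:

```python
# Staining in miniature: each cell carries an age counter that resets
# when its state changes and grows while it holds; age becomes color.

def stained_step(states, ages, update):
    """Apply `update` to the state field, then stain by duration held."""
    new_states = update(states)
    new_ages = [
        0 if new != old else age + 1
        for new, old, age in zip(new_states, states, ages)
    ]
    return new_states, new_ages

def age_to_gray(age, cap=10):
    """Longer-held cells stain darker: duration encoded as a gray level."""
    return min(age, cap) / cap

# Example: a field where only the first cell ever flips.
flip_first = lambda s: [1 - s[0]] + s[1:]
states, ages = [0, 0, 0], [0, 0, 0]
for _ in range(5):
    states, ages = stained_step(states, ages, flip_first)
# The restless cell stays at age 0; the stable cells accumulate duration.
```

The system "remembers" only through `ages`: the output is simultaneously the current state and a visible record of how long each part of it has persisted.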


The grid and its autonomy

The loom genealogy (Senescenence press release) identifies a lineage running from Jacquard's punch cards through early pixel graphics to the contemporary GPU. At each threshold, the capacity for complexity expands beyond full human control. The binary structure is identical across the entire arc: at each cell/intersection, a local decision; accumulated across a field, a pattern that exceeds any single decision.

What changes is the degree to which the rule, once set, operates independently of the hand that formulated it.

This is the deepest version of "less composed than initiated." The loom analogy makes it historical rather than merely formal: this is not an avant-garde gesture but a continuation of a computational logic that is centuries old. The generative artist is not breaking with the tradition of the hand — they are following it to its logical terminus, the point where the rule runs the hand rather than the other way around.


What the practice has not yet fully addressed

Four territories are present in the work but underdeveloped, and they are the most philosophically consequential:

1. AI as the inversion of the generative threshold

AI (large generative models) represents a structurally different operation from the practice described above. Where the practice compresses a rule into a system and decompresses it into complexity, AI compresses a vast corpus of human output into statistical weights and decompresses on demand. The fitness function — which has been the central conceptual object of the practice — is absent from AI: the model was trained to match its training distribution, not to maximize any aesthetic criterion.

The question is not whether AI is good or bad. It is whether AI-generated work crosses the threshold that the practice has been examining: is it "tested, not executed"? The Stained Unravel text identifies the irreplaceable element as the criterion — the artist's judgment about what the system should optimise for. AI does not have this. It can be steered, but it cannot be made to hold a criterion the way a fitness function holds one.

This creates the specific opportunity: a work in which AI is not the output but the fitness function itself — where a model learned from a curated corpus (Dissociations, the vault) operates as the selection criterion for a generative system. The taste that the vault accumulated through twenty years of eliminations becomes the evaluator of the system's outputs. This would be the most radical version of the externalisation-of-taste trajectory: not just encoding taste as a structural algorithm but learning taste from the accumulated record and deploying the learned representation as a formal criterion.

2. Knowledge breaking down as aesthetic program

The Struggle for Pleasure press release contains the sharpest claim in the entire corpus: "Simplicity of operation can become a window into territories of complexity where knowledge itself breaks down." This is stronger than emergence or unpredictability — it is a claim about aesthetics as epistemological disruption. The work can produce an experience at the limit of comprehension, where what you encounter exceeds what you can process.

The sacred geometry framework from Senescenence gives this claim a pre-modern precedent: the mandala, the stained glass window, the heraldic grid are rule-based systems in which the accumulated output carries a quality described as transcendental. Not beautiful-but-graspable but incomprehensible-and-therefore-opening.

The practice has produced this effect but has not yet taken it as an explicit program. Future work could be calibrated specifically toward the cognitive breakdown point — systems whose parameters are set to produce outputs at the upper boundary of processability. This is not obscurantism; it is a specific formal target: the threshold where pattern continues but comprehension stops, where the eye keeps moving but the mind cannot follow.

3. Memory architectures at multiple temporal scales

The palette is a 22-year memory. The staining mechanism is a memory measured in CA generations (seconds). The loom genealogy is a 200-year memory. The CMYK/RGB divide is a 60-year memory of computing history. The Albers lineage is a century-old memory of embodied algorithm.

These operate simultaneously in the work but are not yet composed as a single system with explicit temporal layers. A work structured around multiple strata of memory — each scale producing a distinct formal layer, each layer legible as a distinct temporal register — would synthesise what the practice has developed separately. The palette would be one layer. The CA staining another. The art-historical citation another. The medium's relationship to print another. The layers would interact: what persists at the 22-year scale would constrain what appears at the scale of seconds. Memory as multi-scale structure rather than single-register continuity.

4. The squircle as explicit ideology, not just formal terrain

The squircle has been occupied (Markov's Dream, Mutant Garden) but its ideological content — the warmth-as-interpassivity argument, the inversion of Malevich's autonomy-claim — has not been made the subject of a work. What would it mean to make a work in which the fitness function explicitly inverts platform-optimisation? Where the criterion is: be as unlike engagement-maximisation as possible — the hipster algorithm applied not to specimen-difference but to platform-value-difference?

This would be the squircle argument made operational: not just formally inhabiting the corporate aesthetic vocabulary but using the logic of maximum-difference-from-the-dominant to drive a generative system. The Nested Exchange hipster algorithm generalised from specimen-level to protocol-level.


The future thesis

The work has been building, across two decades, toward a single philosophical position that it has not yet fully stated in a single work: that the rules which generate complexity are a more durable form of authorship than the forms they produce, that memory is what makes a process an identity rather than just a sequence, and that the moment when knowledge breaks down is not the limit of aesthetics but its threshold.

The future work that would develop this most fully:

A work in which multiple temporal scales of memory operate simultaneously as formal layers — where the 22-year palette is one layer, the CA staining-mechanism another, a learned fitness function (trained on the Dissociations corpus) a third, and each layer's constraints propagate into the others. The output would be a form in which the artist's entire accumulated taste history is structurally present — not as style but as criterion, not as image but as the rule that the rule runs on.

This would be the synthesis the practice has been building toward: not a work that has a style, but a work that is a history — that carries its own genealogy as its generative architecture, that makes the past visible not as quotation but as constraint, and that produces at the other end, if the calibration is right, outputs at the threshold where knowledge breaks down and something else opens.

