What is persistence?
The word "persistence" has a precise technical meaning in PT: it is the structured part of information, in bits, that resists mixing. Here is how it is defined and why it is conserved.
The question
When a signal goes through a noisy channel, part gets through and part is lost. When a physical system evolves, some structures persist, others dilute. When you shuffle a deck, the initial order fades, but our knowledge of it is more subtle.
In all three situations, we have an intuition of “what persists”. PT gives it an operational definition.
The definition
In plain words, persistence is the part of a system that remains readable after mixing, noise, or constraint. If everything becomes perfectly random, there is no persistence left. If a form remains recognizable, then part of the information has persisted.
A very ordinary image helps: a pebble. The sea does not impose its shape from the outside; it removes what does not hold. What remains is not just any residue, but the stable trace of a long filtering. In PT, the discrete layer often has this role: it marks the remarkable positions where a continuous mechanics under constraint becomes stable and readable.
The technical formula says the same thing with three terms:
- the total budget: the number of possible distinctions in the system;
- persistence: the structured part of that budget;
- entropy: the still dispersed or unpredictable part.
The persistence of a distribution p on N states is:

P(p) = log2(N) - H(p)

where H is Shannon entropy and log2(N) is the entropy of the uniform distribution on N states. Here, N is only the number of possibilities, and log2(N) counts the total budget in bits: in other words, how many binary distinctions it would take to identify a state. The more p deviates from chance, the larger P(p) becomes.
A uniform distribution has P = 0: no persistence, pure noise. A distribution concentrated on a single state has P = log2(N): the entire informational capacity is structured.
That is the Gap Fundamental Theorem (GFT):

P(p) + H(p) = log2(N)
This identity is exact, not approximate, and it holds for any distribution on any number of states. It is the fundamental principle of persistence: the total budget of distinctions is conserved, partitioned between persistence and entropy.
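The identity is easy to check numerically. A minimal sketch (the helper names shannon_entropy and persistence are mine, not taken from PT):

```python
import math

def shannon_entropy(p):
    """Shannon entropy in bits; zero-probability states contribute nothing."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def persistence(p):
    """P(p) = log2(N) - H(p): the structured part of the total budget."""
    return math.log2(len(p)) - shannon_entropy(p)

N = 8  # total budget: log2(8) = 3 bits
uniform = [1 / N] * N
delta = [1.0] + [0.0] * (N - 1)
arbitrary = [0.35, 0.2, 0.15, 0.1, 0.08, 0.06, 0.04, 0.02]

for p in (uniform, delta, arbitrary):
    # GFT: persistence + entropy always equals the total budget log2(N)
    assert abs(persistence(p) + shannon_entropy(p) - math.log2(N)) < 1e-9

print(persistence(uniform))  # 0.0 : pure noise, nothing persists
print(persistence(delta))    # 3.0 : the whole 3-bit budget is structured
```

The loop checks the partition for the two extremes and for an arbitrary skewed distribution: the budget never changes, only its split.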
Why it is central
The conservation is an algebraic identity, not a physical law. But it has a strong physical consequence: knowing two of the three quantities (P, H, log2 N) determines the third. No double counting is possible.
That is precisely what prevents cheating in PT by adding an extra parameter to compensate for an error. Any correction to H must be mirrored in P. Any redefinition of P must shift H by the same amount. The balance is exactly conserved.
In the language of binary codes: log2(N) is the cost of naively describing a state with a fixed-length code. P(p) is what we save on that description thanks to the structure of p. H(p) is what we still must spend, on average, because p is not concentrated.
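This coding view can be made concrete. A small sketch, on a skewed four-state distribution of my own choosing:

```python
import math

def shannon_entropy(p):
    return -sum(x * math.log2(x) for x in p if x > 0)

# A skewed four-state distribution: one state dominates.
p = [0.5, 0.25, 0.125, 0.125]

fixed_length = math.log2(len(p))       # naive fixed-length code: 2 bits per state
optimal_length = shannon_entropy(p)    # Shannon limit: 1.75 bits per state on average
saved = fixed_length - optimal_length  # the saving is exactly the persistence P(p)

print(fixed_length, optimal_length, saved)  # 2.0 1.75 0.25
```

Here the structure of p saves 0.25 bits per state off the naive 2-bit description; that saving is P(p), and the 1.75 bits still to be spent are H(p).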
Physical persistence
When p is identified with the distribution of gaps between consecutive primes, P becomes a physical quantity:
- it counts the bits by which the gap sequence deviates from chance;
- it is conserved along the cascade T0 → L0 → T6 (each step transfers its structure to the next);
- it is capped at 1 bit per CRT channel (P ≤ 1 bit, theorem T6).
This last bound is the Shannon cap of PT: no prime can carry more than one bit. Three active primes, three bits — exactly the informational content of a Standard Model particle with its gauge quantum numbers.
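The arithmetic behind such a cap can be illustrated (this is only an illustration of the 1-bit budget of a binary channel, not of theorem T6 itself): a two-state channel has a total budget of log2(2) = 1 bit, so its persistence can never exceed 1.

```python
import math

def persistence(p):
    H = -sum(x * math.log2(x) for x in p if x > 0)
    return math.log2(len(p)) - H

# A binary channel: total budget log2(2) = 1 bit.
# Whatever the bias q, persistence stays within [0, 1].
for q in [0.0, 0.1, 0.5, 0.9, 1.0]:
    assert 0.0 <= persistence([q, 1 - q]) <= 1.0

print(persistence([1.0, 0.0]))  # 1.0 : a fully determined channel saturates the cap
```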
An analogy
Take a text. Its bit length, log2(N), is the raw capacity. Its entropy H measures how unpredictable the words are. Its persistence P measures what is structured: grammar, redundancies, patterns.
A random text has H = log2(N), P = 0: useless and unreadable.
A hyper-structured text (“aaaaaa…”) has H = 0, P = log2(N): predictable, no new information.
An interesting text lives in between, with a partition between the two. PT says physics also lives in between, and that partition follows the arithmetic cascade of the sieve.
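The analogy can be made quantitative at the level of letter frequencies. A sketch; the function char_persistence and the choice of a uniform 26-letter alphabet as the reference are mine, for illustration only:

```python
import math
from collections import Counter

def char_persistence(text, alphabet_size=26):
    """Persistence of a text's letter frequencies, measured against
    a uniform alphabet (an illustrative reference choice)."""
    counts = Counter(c for c in text.lower() if c.isalpha())
    total = sum(counts.values())
    H = -sum((n / total) * math.log2(n / total) for n in counts.values())
    return math.log2(alphabet_size) - H

# "aaaa...": H = 0, so the whole budget log2(26) ~ 4.70 bits persists
print(char_persistence("aaaaaaaa"))

# An ordinary English sentence sits in between: some structure, some spread
print(char_persistence("the quick brown fox jumps over the lazy dog"))
```

The fully repetitive text saturates its budget; the ordinary sentence keeps a small positive persistence, strictly between the two extremes.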
And conservation?
One last way to put it: the GFT is the PT equivalent of energy conservation, but written directly in the language of persistence. Informational capacity is neither created nor destroyed — it is transformed. That is why the identity holds at every scale, for every distribution.
This is what makes persistence usable as a biomarker (medical imaging, IST/IEE), a linguistic measure (language evolvability), or an internal consistency check in a physical derivation. The same object, everywhere.