Predictability

Conditional entropy, memory depth, sample entropy
dynamical · information · 6 metrics

What It Measures

How much does knowing the past help predict the future?

Computes conditional entropy H(X_t | X_{t-1}, ..., X_{t-k}) at increasing depths k = 1, 2, 4, 8. If the conditional entropy drops as you add more history, the signal has memory — past values constrain the future. Also includes sample entropy (SampEn), a phase-space regularity measure.
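A minimal plug-in estimator makes the construction concrete. This is a sketch, not the atlas's documented implementation: it assumes the signal has already been quantized to a small discrete alphabet (the ~3.0-bit ceilings in the rankings suggest 8 symbols), and it uses the identity H(X_t | past_k) = H(block of k+1) - H(block of k).

```python
import numpy as np
from collections import Counter

def conditional_entropy(x, k, base=2):
    """Plug-in estimate of H(X_t | X_{t-1}, ..., X_{t-k}) in bits,
    via H(block of k+1 symbols) - H(block of k symbols)."""
    x = np.asarray(x)

    def block_entropy(m):
        # Entropy of the empirical distribution over length-m windows.
        counts = Counter(tuple(x[i:i + m]) for i in range(len(x) - m + 1))
        p = np.array(list(counts.values()), dtype=float)
        p /= p.sum()
        return float(-(p * np.log(p)).sum() / np.log(base))

    return block_entropy(k + 1) - block_entropy(k)

rng = np.random.default_rng(0)
x = rng.integers(0, 8, 200_000)              # white-noise stand-in, 8 symbols
print(round(conditional_entropy(x, 1), 3))   # ~3.0: the past tells you nothing
```

Note that at depth 8 an 8-symbol alphabet has 8^8 ≈ 1.7e7 contexts, so a naive plug-in estimate needs very long sequences (or bias correction) to avoid collapsing toward zero; how the atlas handles this is not specified here.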

Metrics

excess_predictability

Total information gain from knowing the past 8 values: H(X) - H(X | past_8). The de Bruijn sequence scores 3.0 (maximally predictable: it is constructed so that every 7-bit pattern appears exactly once, making the next bit deterministic given the last 7). Hilbert walk (2.90) and sawtooth (2.89) are nearly as predictable. White noise scores 0.0: the past tells you nothing.
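In terms of the estimator above, this is just the k = 0 case minus the k = 8 case; a sketch, reusing conditional_entropy and the same quantization assumptions:

```python
def excess_predictability(x, depth=8):
    # H(X) - H(X | past_depth): 0 bits for memoryless noise, up to the
    # marginal entropy (3 bits on an 8-symbol alphabet) when the past
    # fully determines the next symbol.
    return conditional_entropy(x, 0) - conditional_entropy(x, depth)

saw = np.arange(50_000) % 8                   # idealized 8-level sawtooth
print(round(excess_predictability(saw), 3))   # ~3.0: fully determined by the past
```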

sample_entropy

Regularity of the phase-space trajectory. Low = self-similar, predictable. High = complex, unpredictable. Pi digits (2.22) and BSL residues (2.21) are the most irregular signals in the atlas — close to the theoretical maximum for byte-valued data. Devil's staircase scores 0.019 (nearly zero — its long constant plateaus create trivially self-similar trajectories). Constants score exactly 0.0.
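A textbook SampEn(m, r) sketch: B counts pairs of length-m templates within Chebyshev tolerance r, A counts the same at length m+1, and SampEn = -ln(A/B). The embedding dimension m = 2 and tolerance r = 0.2·std below are conventional defaults, not the atlas's documented settings.

```python
def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) = -ln(A / B) over the raw (unquantized) series."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()                     # conventional tolerance

    def close_pairs(mm):
        # Count template pairs (i < j) within Chebyshev distance r,
        # using the same N - m templates at both lengths.
        emb = np.lib.stride_tricks.sliding_window_view(x, mm)[: len(x) - m]
        total = 0
        for i in range(len(emb) - 1):
            d = np.abs(emb[i + 1:] - emb[i]).max(axis=1)
            total += int((d <= r).sum())
        return total

    B, A = close_pairs(m), close_pairs(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else 0.0
```

The pair counting is quadratic in sequence length, which is fine for a sketch at modest N; production implementations typically use range trees or KD-trees.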

entropy_decay_rate

How fast does conditional entropy decrease with depth? Baker map has the steepest positive slope (0.26): it reveals more structure at each depth. De Bruijn has the steepest negative slope (-0.31): it becomes maximally predictable at depth 7 and the entropy collapses. Near zero means either unpredictable at all depths (noise) or already fully predicted at depth 1 (simple periodic).
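One plausible reading, reusing the estimator above: fit a line to conditional entropy over the depth grid and report the negated slope, so a positive rate means entropy keeps dropping as depth grows. Both the depth grid and the sign convention are assumptions of this sketch, not documented atlas choices.

```python
def entropy_decay_rate(x, depths=(1, 2, 4, 8)):
    # Negated least-squares slope of H(X_t | past_k) against k: positive
    # when conditional entropy keeps decaying with depth, near zero when
    # the curve is flat (noise, or simple periodicity solved at k = 1).
    h = [conditional_entropy(x, k) for k in depths]
    return -np.polyfit(depths, h, 1)[0]
```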

cond_entropy_k1

Conditional entropy at depth 1: H(X_t | X_{t-1}). White noise scores ~3.0 (knowing the previous value tells you nothing). Logistic period-3 scores 0.0 (the previous value fully determines the next). This is the simplest predictability measure — first-order Markov constraint.

cond_entropy_k8

Conditional entropy at depth 8: H(X_t | X_{t-1},...,X_{t-8}). The gap between k1 and k8 reveals hidden long-range dependencies — PRNG outputs score identically at k1 and k8 (memoryless), while the baker map drops from 2.8 to 2.3 (its 2D structure creates long-range predictability invisible at lag 1).
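The k1/k8 gap falls straight out of the same estimator (the depth-8 undersampling caveat noted above applies). A sketch:

```python
def memory_gap(x):
    # cond_entropy_k1 - cond_entropy_k8: ~0 for memoryless PRNG output,
    # clearly positive (~0.5 bits for the baker map, per the text) when
    # long-range structure is invisible at lag 1.
    return conditional_entropy(x, 1) - conditional_entropy(x, 8)
```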

transition_entropy_variance

Variance of per-context conditional entropy across all observed symbol contexts. Champernowne (0.125) scores highest — some contexts are highly predictable while others are maximally uncertain. Middle-Square (0.071) and Intermittent Silence (0.085) also score high. Constants and periodic orbits score 0.0 (all contexts equally predictable). This captures heterogeneity in predictability: a signal with low mean conditional entropy but high variance has pockets of both predictable and unpredictable structure.
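A sketch under one interpretation: build the empirical next-symbol distribution for every observed length-k context, take each context's entropy, and report the frequency-weighted variance of those entropies. Whether the atlas weights contexts by frequency or uniformly, and which context length it uses, are assumptions here.

```python
def transition_entropy_variance(x, k=1):
    x = np.asarray(x)
    successors = {}                        # context -> list of next symbols
    for i in range(len(x) - k):
        successors.setdefault(tuple(x[i:i + k]), []).append(x[i + k])

    ents, weights = [], []
    for nxt in successors.values():
        _, counts = np.unique(nxt, return_counts=True)
        p = counts / counts.sum()
        ents.append(float(-(p * np.log2(p)).sum()))   # per-context H in bits
        weights.append(len(nxt))

    ents = np.asarray(ents)
    w = np.asarray(weights, dtype=float) / sum(weights)
    mean = float((w * ents).sum())
    return float((w * (ents - mean) ** 2).sum())      # weighted variance
```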

Atlas Rankings

cond_entropy_k1

Source                              Domain          Value
XorShift32                          binary          2.9977
White Noise                         noise           2.9976
Wichmann-Hill                       binary          2.9976
···
Constant 0xFF                       noise           0.0000
Logistic r=3.83 (Period-3 Window)   chaos           0.0000
Collatz Gap Lengths                 number_theory   0.0000

cond_entropy_k8

Source                              Domain          Value
MINSTD (Park-Miller)                binary          2.9997
XorShift32                          binary          2.9997
AES Encrypted                       binary          2.9997
···
Constant 0xFF                       noise           0.0000
Logistic r=3.83 (Period-3 Window)   chaos           0.0000
Collatz Gap Lengths                 number_theory   0.0000

entropy_decay_rate

Source                              Domain          Value
Baker Map                           chaos           0.2637
Sunspot Number                      astro           0.2015
Network Packet Sizes                binary          0.1637
···
De Bruijn Sequence                  number_theory   -0.3130
Champernowne                        number_theory   -0.2149
Collatz Trajectory                  number_theory   -0.2071

excess_predictability

Source                              Domain          Value
De Bruijn Sequence                  number_theory   3.0000
Hilbert Walk                        exotic          2.8955
Sawtooth Wave                       waveform        2.8851
···
White Noise                         noise           0.0000
Constant 0xFF                       noise           0.0000
Pink Noise                          noise           0.0000

sample_entropy

Source                              Domain          Value
Pi Digits                           number_theory   2.2149
BSL Residues                        number_theory   2.2136
White Noise                         noise           2.2097
···
Constant 0xFF                       noise           0.0000
Devil's Staircase                   exotic          0.0191
Forest Fire                         exotic          0.0227

transition_entropy_variance

Source                              Domain          Value
Champernowne                        number_theory   0.1250
Intermittent Silence                exotic          0.0848
Middle-Square (von Neumann)         binary          0.0710
···
Constant 0xFF                       noise           0.0000
Logistic r=3.83 (Period-3 Window)   chaos           0.0000
Collatz Gap Lengths                 number_theory   0.0000

When It Lights Up

Predictability's sample_entropy and entropy_decay_rate were key discriminators in the negative re-evaluation study (2026-03-05). Standard map (44 significant metrics), Arnold cat (19), and GARCH (29) were reclassified from negative to positive detections largely on Predictability metrics. Sample entropy distinguishes deterministic chaos (moderate, ~1.5) from true noise (high, ~2.2) and periodicity (low, <0.5).
