Information Theory

Shannon entropy, complexity, redundancy
distributional · information · 6 metrics

What It Measures

How compressible is the signal, and where does the redundancy come from?

Shannon entropy at multiple block sizes, compression ratio via Lempel-Ziv (zlib), and mutual information at lags 1 and 8. Together these measure the signal's intrinsic complexity from three complementary angles: distributional (are byte patterns uniform?), algorithmic (can a compressor find structure?), and temporal (does the past predict the future?).

Metrics

compression_ratio

Lempel-Ziv compression ratio: compressed_size / original_size. 1.0 means incompressible (Wichmann-Hill, MINSTD, XorShift32 — good PRNGs are incompressible). 0.0 means trivially compressible (constants). Logistic period-2 scores 0.002 (alternating between two values compresses almost completely). This is a direct proxy for Kolmogorov complexity.
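A minimal sketch of this metric, assuming zlib's DEFLATE stands in for Lempel-Ziv (as the parenthetical in the text suggests); clamping away zlib's few bytes of header overhead is my own assumption, not necessarily the atlas's handling:

```python
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Lempel-Ziv proxy for Kolmogorov complexity: compressed_size / original_size.

    Raw zlib output carries a few bytes of header/checksum overhead, so a
    truly incompressible signal can score slightly above 1.0; we clamp to
    1.0 here (an assumption about how the atlas normalizes).
    """
    if not data:
        return 0.0
    compressed = zlib.compress(data, 9)
    return min(1.0, len(compressed) / len(data))

random_bytes = os.urandom(65536)        # stand-in for a good PRNG stream
constant = bytes(65536)                 # constant 0x00 signal
period2 = bytes([17, 203] * 32768)      # alternates between two values

print(compression_ratio(random_bytes))  # near 1.0: incompressible
print(compression_ratio(constant))      # near 0.0: trivially compressible
print(compression_ratio(period2))       # near 0.0: period-2 compresses away
```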

mutual_info_1

Mutual information between consecutive bytes: how much does byte t tell you about byte t+1? Logistic periodic orbits score 1.0 (each byte perfectly predicts the next). L-System Dragon Curve scores 0.0 (its symbolic dynamics are unpredictable one step ahead despite being deterministic). This separates "locally predictable" from "locally random" deterministic systems.

mutual_info_8

Mutual information at lag 8. Rule 30 scores 0.0 (no 8-step memory), while logistic period-2 still scores 1.0 (period divides 8). The comparison between lag-1 and lag-8 mutual information reveals the memory timescale: signals where MI_8 ≈ MI_1 have long memory; signals where MI_8 ≪ MI_1 have short memory.

excess_entropy

The total information shared between the signal's past and its future. Hilbert walk (0.95) and sawtooth (0.95) maximize this: their deterministic structure creates maximal past-future coupling. Random steps (0.94) score nearly as high, because random-walk integration creates long-range correlations even from IID increments.

block_entropy_2 / block_entropy_4

Shannon entropy of byte pairs and 4-grams, normalized. PRNGs and white noise score ~1.0 (all patterns equally likely); constants and periodic orbits score 0.0. The drop from block_entropy_2 to block_entropy_4 measures how much additional structure emerges at longer pattern lengths.
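A sketch of the block-entropy metric. Normalizing by min(8k, log2(n)) is my assumption, made so that short signals are not penalized merely for being unable to visit all 256^k blocks; the atlas's exact normalization and signal lengths are unknown, so values here will not reproduce the ranking numbers verbatim:

```python
import math
import os
from collections import Counter

def block_entropy(data: bytes, k: int) -> float:
    """Shannon entropy of overlapping k-byte blocks, normalized toward [0, 1].

    Normalizer min(8k, log2(n)) is an assumption: 8k bits is the
    theoretical maximum, log2(n) the most n samples can exhibit.
    """
    n = len(data) - k + 1
    counts = Counter(data[i:i + k] for i in range(n))
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    denom = min(8 * k, math.log2(n))
    return h / denom if denom > 0 else 0.0

noise = os.urandom(1 << 17)
constant = bytes(1 << 17)
print(block_entropy(noise, 2))     # near 1.0: all byte pairs ~equally likely
print(block_entropy(constant, 2))  # 0.0: a single repeated block
```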

Atlas Rankings

block_entropy_2

Source                              Domain          Value
Wichmann-Hill                       binary          0.9986
XorShift32                          binary          0.9986
White Noise                         noise           0.9986
···
Constant 0xFF                       noise           0.0000
Constant 0x00                       noise           0.0000
Collatz Gap Lengths                 number_theory   0.0000

block_entropy_4

Source                              Domain          Value
MINSTD (Park-Miller)                binary          0.8602
Pi Digits                           number_theory   0.8602
glibc LCG                           binary          0.8602
···
Constant 0xFF                       noise           0.0000
Constant 0x00                       noise           0.0000
Collatz Gap Lengths                 number_theory   0.0000

compression_ratio

Source                              Domain          Value
Wichmann-Hill                       binary          1.0000
MINSTD (Park-Miller)                binary          1.0000
XorShift32                          binary          1.0000
···
Constant 0xFF                       noise           0.0000
Constant 0x00                       noise           0.0000
Logistic r=3.2 (Period-2)           chaos           0.0018

excess_entropy

Source                              Domain          Value
Hilbert Walk                        exotic          0.9515
Sawtooth Wave                       waveform        0.9491
Random Steps                        exotic          0.9352
···
Constant 0xFF                       noise           0.0000
Constant 0x00                       noise           0.0000
Collatz Gap Lengths                 number_theory   0.0000

mutual_info_1

Source                              Domain          Value
Logistic r=3.83 (Period-3 Window)   chaos           1.0000
Logistic r=3.74 (Period-5 Window)   chaos           1.0000
Logistic r=3.2 (Period-2)           chaos           1.0000
···
Constant 0xFF                       noise           0.0000
Constant 0x00                       noise           0.0000
L-System (Dragon Curve)             exotic          0.0000

mutual_info_8

Source                              Domain          Value
Logistic r=3.2 (Period-2)           chaos           1.0000
Logistic r=3.5 (Period-4)           chaos           1.0000
Logistic r=3.83 (Period-3 Window)   chaos           1.0000
···
Constant 0xFF                       noise           0.0000
Constant 0x00                       noise           0.0000
Rule 30                             exotic          0.0000

When It Lights Up

Information Theory metrics are the framework's workhorse for separating noise from structure. Compression ratio alone separates PRNGs (incompressible) from all other sources. Mutual information at multiple lags provides the temporal skeleton that static entropy measures miss. In the atlas, Information Theory drives the distributional view's separation between C1 (compressible, high MI: oscillators and periodic chaos) and C5 (incompressible, zero MI: noise and PRNGs).
