Information Theory


What It Measures

Not included in the atlas: every metric in this family is redundant with an existing geometry at |r| > 0.9.

How compressible is the signal, and where does the redundancy come from?

Shannon entropy at multiple block sizes, compression ratio via Lempel-Ziv (zlib), and mutual information at lags 1 and 8. Together these measure the signal's intrinsic complexity from three complementary angles: distributional (are byte patterns uniform?), algorithmic (can a compressor find structure?), and temporal (does the past predict the future?).

Metrics

compression_ratio

Lempel-Ziv compression ratio: compressed_size / original_size. 1.0 means incompressible (Wichmann-Hill, MINSTD, XorShift32: good PRNGs are incompressible). 0.0 means trivially compressible (constants). Logistic period-2 scores 0.002 (alternating between two values compresses almost completely). This is a practical, computable proxy for Kolmogorov complexity, which cannot be computed directly.
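A minimal sketch of how such a ratio might be computed; the helper name and the choice of zlib level 9 are assumptions, not the framework's actual code:

```python
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Lempel-Ziv proxy: compressed size over original size."""
    return len(zlib.compress(data, 9)) / len(data)

# Good PRNG output is incompressible; zlib's stream overhead can even
# push the raw ratio slightly above 1.0 before any clamping.
print(compression_ratio(os.urandom(65536)))   # ~1.0
# A constant signal collapses to a few bytes of header: well under 0.01.
print(compression_ratio(bytes(65536)))
```

Note that a raw zlib ratio slightly exceeds 1.0 on truly incompressible input, so a production version would presumably clamp to [0, 1].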

mutual_info_1

Mutual information between consecutive bytes: how much does byte t tell you about byte t+1? Logistic periodic orbits score 1.0 (each byte perfectly predicts the next). L-System Dragon Curve scores 0.0 (its symbolic dynamics are unpredictable one step ahead despite being deterministic). This separates "locally predictable" from "locally random" deterministic systems.

mutual_info_8

Mutual information at lag 8. Rule 30 scores 0.0 (no 8-step memory), while logistic period-2 still scores 1.0 (period divides 8). The comparison between lag-1 and lag-8 mutual information reveals the memory timescale: signals where MI_8 ≈ MI_1 have long memory; signals where MI_8 ≪ MI_1 have short memory.
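A plug-in estimator for byte-level mutual information at an arbitrary lag can be sketched as follows. The function name is mine, and it reports raw bits rather than whatever normalization the framework applies; for a two-symbol signal the maximum is 1 bit, which lines up with the period-2 scores above:

```python
import math
from collections import Counter

def mutual_info(data: bytes, lag: int) -> float:
    """I(X_t; X_{t+lag}) in bits, from empirical byte-pair frequencies."""
    pairs = list(zip(data, data[lag:]))
    n = len(pairs)
    joint = Counter(pairs)
    p_x = Counter(x for x, _ in pairs)
    p_y = Counter(y for _, y in pairs)
    return sum(c / n * math.log2(c * n / (p_x[x] * p_y[y]))
               for (x, y), c in joint.items())

# A period-2 signal: each byte fully determines the byte one (or eight)
# steps ahead, giving the full 1 bit of its two-symbol alphabet.
period2 = bytes([7, 200] * 4096)
assert abs(mutual_info(period2, 1) - 1.0) < 1e-6
assert abs(mutual_info(period2, 8) - 1.0) < 1e-6
```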

excess_entropy

The total shared information between past and future. Hilbert walk (0.95) and sawtooth (0.95) maximize this: their deterministic structure creates maximum past-future coupling. Random steps (0.94) are high too — the random-walk integration creates long-range correlations even from IID increments.
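Excess entropy has no single canonical estimator, and the page does not say which one the framework uses. One standard finite-length surrogate is the mutual information between adjacent length-L blocks, 2·H(L) − H(2L), which lower-bounds the true excess entropy and converges as L grows; the sketch below (function names and L=4 are my choices) uses that:

```python
import math
from collections import Counter

def block_entropy(data: bytes, L: int) -> float:
    """Shannon entropy (bits) of overlapping L-byte blocks."""
    blocks = Counter(data[i:i + L] for i in range(len(data) - L + 1))
    n = sum(blocks.values())
    return -sum(c / n * math.log2(c / n) for c in blocks.values())

def excess_entropy_estimate(data: bytes, L: int = 4) -> float:
    """Finite-L surrogate: I(past L bytes; next L bytes) = 2*H(L) - H(2L)."""
    return 2 * block_entropy(data, L) - block_entropy(data, 2 * L)

# Period-2 signal: knowing the past pins down the phase, so past and
# future share exactly the 1 bit of phase information.
assert abs(excess_entropy_estimate(bytes([10, 90] * 2048)) - 1.0) < 1e-6
```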

block_entropy_2

Shannon entropy of byte pairs, normalized. PRNGs and white noise score ~1.0 (all 65,536 pairs equally likely). Constants and periodic orbits score 0.0.

block_entropy_4

Shannon entropy of 4-grams, normalized. The drop from block_entropy_2 to block_entropy_4 measures how much additional structure emerges at longer pattern lengths. Signals where block_entropy_4 ≈ block_entropy_2 have no long-range pattern structure; signals where it drops significantly have sequential dependencies beyond adjacent bytes.
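Both block entropies can be computed the same way. The page does not state the normalization, so the divisor below is an assumption: the smaller of 8L bits (the alphabet bound) and log2 of the sample count (the most distinct blocks a finite signal can exhibit):

```python
import math
import os
from collections import Counter

def block_entropy_norm(data: bytes, L: int) -> float:
    """Shannon entropy of overlapping L-byte blocks, normalized to [0, 1]."""
    blocks = Counter(data[i:i + L] for i in range(len(data) - L + 1))
    n = sum(blocks.values())
    h = -sum(c / n * math.log2(c / n) for c in blocks.values())
    # Assumed normalizer: entropy cannot exceed 8*L bits (alphabet size)
    # or log2(n) bits (distinct blocks observable in n samples).
    return h / min(8 * L, math.log2(n))

assert block_entropy_norm(os.urandom(65536), 2) > 0.9   # white noise: ~1.0
assert block_entropy_norm(bytes(65536), 4) == 0.0       # constant: 0.0
```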

When It Lights Up

Information Theory metrics are the framework's workhorse for separating noise from structure. Compression ratio alone separates PRNGs (incompressible) from all other sources. Mutual information at multiple lags provides the temporal skeleton that static entropy measures miss. In the atlas, Information Theory drives the distributional view's separation between C1 (compressible, high MI: oscillators and periodic chaos) and C5 (incompressible, zero MI: noise and PRNGs).
