The dimension and divergence rate of the signal's phase-space attractor.
Delay-embeds the time series (at dimensions 2 through 8 for correlation dimension, up to 10 for Lyapunov exponent) using the first zero-crossing of the autocorrelation as the lag. In this reconstructed space, applies Grassberger-Procaccia to estimate the correlation dimension D2 (how many dimensions the attractor fills) and Rosenstein's method to estimate the maximum Lyapunov exponent (how fast nearby trajectories diverge).
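The embedding step can be sketched in a few lines of NumPy. This is a minimal illustration of the standard delay-coordinate construction with the autocorrelation zero-crossing lag described above; the function names are illustrative, not the framework's API:

```python
import numpy as np

def acf_first_zero(x):
    """Lag at the first zero-crossing of the autocorrelation function."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]  # lags 0, 1, 2, ...
    acf = acf / acf[0]
    below = np.where(acf <= 0)[0]
    return int(below[0]) if below.size else 1

def delay_embed(x, dim, lag):
    """Stack lagged copies of x into an (n_points, dim) trajectory matrix."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * lag
    # row j is the delay vector (x[j], x[j+lag], ..., x[j+(dim-1)*lag])
    return np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])
```

For a sine of period 40 samples, the first autocorrelation zero-crossing lands near lag 10 (a quarter period), which is the usual heuristic for decorrelated delay coordinates.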
How many effective dimensions does the attractor fill? Collatz Stopping Times leads at 4.22: its complex branching dynamics fill a roughly 4D manifold. Neural Net Dense (4.02) and ECG Supraventricular (4.01) are similarly high-dimensional. The Lorenz attractor sits around 2.05 (textbook D2 for the Lorenz system). Constants and Fibonacci Word score 0.0 — degenerate point or 1D attractors.
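The D2 estimate itself comes from the Grassberger-Procaccia correlation sum: count the fraction of point pairs closer than r, then read the dimension off the slope of log C(r) against log r. A minimal sketch (illustrative names; a production estimator would also restrict the fit to the linear scaling region rather than using all radii):

```python
import numpy as np

def correlation_sum(traj, r):
    """C(r): fraction of distinct point pairs closer than r."""
    diff = traj[:, None, :] - traj[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    iu = np.triu_indices(len(traj), k=1)   # each pair counted once
    return np.mean(dist[iu] < r)

def correlation_dimension(traj, radii):
    """Slope of log C(r) vs log r -- the Grassberger-Procaccia D2 estimate."""
    radii = np.asarray(radii, dtype=float)
    c = np.array([correlation_sum(traj, r) for r in radii])
    mask = c > 0                            # log of empty bins is undefined
    return np.polyfit(np.log(radii[mask]), np.log(c[mask]), 1)[0]
```

On a genuinely d-dimensional point cloud the slope recovers d: points scattered along a line give a slope near 1, points filling a square give a slope near 2.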
Does the dimension estimate converge as you increase the embedding dimension? Champernowne (0.997) and Triangle Wave (0.997) saturate immediately — their low intrinsic dimension is captured at the lowest embedding. Collatz Parity scores 0.0 (dimension never converges, suggesting the signal doesn't live on a finite-dimensional manifold). High saturation means you can trust the D2 estimate; low saturation means the attractor is higher-dimensional than the embedding can capture.
What fraction of the embedding space does the trajectory actually visit? Dice Rolls (0.994) and XorShift32 (0.982) fill almost all of it — they're space-filling in delay coordinates. Logistic Period-2 scores 0.002 (the trajectory visits only two points in any embedding). This separates low-dimensional attractors from space-filling noise.
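One plausible way to compute such a filling ratio, assuming it means the fraction of coarse-grained cells the trajectory visits (the framework's exact definition may differ), is a simple box count over the embedding:

```python
import numpy as np

def filling_ratio(traj, bins=10):
    """Fraction of grid cells visited: coarse-grain the embedding space into
    bins**dim hypercubes and count the occupied ones.
    (One plausible reading of the metric; assumed, not the framework's code.)"""
    lo, hi = traj.min(axis=0), traj.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)          # guard constant axes
    cells = ((traj - lo) / span * bins).astype(int)
    cells = np.minimum(cells, bins - 1)             # points at the max edge
    occupied = {tuple(c) for c in cells}
    return len(occupied) / bins ** traj.shape[1]
```

A period-2 orbit occupies exactly two cells of the grid no matter how long the series runs, while uniform noise eventually touches nearly every cell.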
The maximum Lyapunov exponent: how fast do nearby trajectories diverge? Positive means chaos (exponential separation), zero means periodic or quasiperiodic motion, negative means contraction. Henon Near-Crisis leads at 0.106: sitting at the edge of a boundary crisis, where the attractor is about to be destroyed, it shows the fastest divergence. Financial returns (Nikkei -0.003, NYSE -0.0003) come out slightly negative, consistent with mean reversion on short timescales.
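Rosenstein's method tracks how initially-nearest trajectory pairs separate over time: for each point, find its nearest neighbor outside a temporal exclusion window, then average the log of the pair distances as both points evolve. The slope of that curve estimates the maximum Lyapunov exponent. A compact sketch (illustrative names; production code would fit only the initial linear region):

```python
import numpy as np

def rosenstein_divergence(traj, min_tsep=10, t_max=8):
    """Slope of the mean log-distance curve between initially-nearest
    trajectory pairs (a sketch of Rosenstein's method)."""
    n = len(traj)
    diff = traj[:, None, :] - traj[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    for i in range(n):  # exclude temporally close points (and the point itself)
        dist[i, max(0, i - min_tsep): i + min_tsep + 1] = np.inf
    nn = dist.argmin(axis=1)
    curve = []
    for k in range(t_max):
        d = [np.linalg.norm(traj[i + k] - traj[nn[i] + k])
             for i in range(n) if i + k < n and nn[i] + k < n]
        d = [v for v in d if v > 0]
        curve.append(np.mean(np.log(d)))
    return np.polyfit(np.arange(t_max), curve, 1)[0]
```

On the fully chaotic logistic map (r = 4, true exponent ln 2 per step) the slope comes out clearly positive; on a delay-embedded sine the pairs keep constant separation and the slope is near zero.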
| Source | Domain | D2 |
|---|---|---|
| Collatz Stopping Times | number_theory | 4.2236 |
| Neural Net (Dense) | binary | 4.0161 |
| ECG Supraventr. | medical | 4.0060 |
| ··· | | |
| Constant 0xFF | noise | 0.0000 |
| Constant 0x00 | noise | 0.0000 |
| Fibonacci Word | exotic | 0.0000 |
| Source | Domain | Saturation |
|---|---|---|
| Champernowne | number_theory | 0.9971 |
| Triangle Wave | waveform | 0.9969 |
| Devil's Staircase | exotic | 0.9909 |
| ··· | | |
| Constant 0xFF | noise | 0.0000 |
| Constant 0x00 | noise | 0.0000 |
| Collatz Parity | number_theory | 0.0000 |
| Source | Domain | Filling Ratio |
|---|---|---|
| Dice Rolls | exotic | 0.9939 |
| XorShift32 | binary | 0.9825 |
| RANDU | binary | 0.9821 |
| ··· | | |
| Constant 0xFF | noise | 0.0000 |
| Constant 0x00 | noise | 0.0000 |
| Logistic r=3.2 (Period-2) | chaos | 0.0020 |
| Source | Domain | Max Lyapunov |
|---|---|---|
| Henon Near-Crisis (a=1.2) | chaos | 0.1059 |
| Logistic r=3.68 (Banded Chaos) | chaos | 0.0959 |
| Henon Map | chaos | 0.0896 |
| ··· | | |
| Nikkei Returns | financial | -0.0032 |
| NYSE Returns | financial | -0.0003 |
| Circle Map Quasiperiodic | chaos | -0.0001 |
Attractor Reconstruction provides the classic chaos diagnostic: a positive Lyapunov exponent together with a finite correlation dimension indicates deterministic chaos. The framework uses it alongside Gottwald-Melbourne's 0-1 test (which needs no embedding) as a cross-check. In the atlas, correlation_dimension separates the dynamical view's low-dimensional chaos cluster (D2 = 2-4: Lorenz, Rossler, Henon) from noise (where D2 keeps rising with the embedding dimension instead of converging) and periodicity (D2 = 1). The filling_ratio metric complements this by detecting whether the trajectory is confined to a manifold or fills the embedding space uniformly.
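The Gottwald-Melbourne cross-check can be sketched in its basic correlation form: drive a 2D walk (p, q) with the signal and measure whether the walk's mean-square displacement grows linearly with time. K near 1 suggests chaos, K near 0 regular dynamics, with no delay embedding required. This is the textbook variant (median over random rotation angles c), not necessarily the framework's exact implementation:

```python
import numpy as np

def zero_one_test(x, n_c=10, seed=0):
    """Basic 0-1 test for chaos (Gottwald-Melbourne, correlation variant).
    Returns K: the median correlation of mean-square displacement with time."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    N = len(x)
    n_cut = N // 10                      # displacement horizon << series length
    j = np.arange(1, N + 1)
    ks = []
    for c in rng.uniform(np.pi / 5, 4 * np.pi / 5, n_c):
        p = np.cumsum(x * np.cos(j * c))  # signal-driven 2D walk
        q = np.cumsum(x * np.sin(j * c))
        M = np.array([np.mean((p[n:] - p[:-n]) ** 2 + (q[n:] - q[:-n]) ** 2)
                      for n in range(1, n_cut + 1)])
        ks.append(np.corrcoef(np.arange(1, n_cut + 1), M)[0, 1])
    return float(np.median(ks))
```

For a chaotic input the walk diffuses and M(n) grows linearly (K near 1); for a periodic input the walk stays bounded and M(n) merely oscillates (K near 0).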