Divergent predictive states: The statistical complexity dimension of stationary, ergodic hidden Markov processes

AM Jurgens, JP Crutchfield - Chaos: An Interdisciplinary Journal of …, 2021 - pubs.aip.org
Even simply defined, finite-state generators produce stochastic processes that require
tracking an uncountable infinity of probabilistic features for optimal prediction. For processes …

Informational and causal architecture of discrete-time renewal processes

SE Marzen, JP Crutchfield - Entropy, 2015 - mdpi.com
Renewal processes are broadly used to model stochastic behavior consisting of isolated
events separated by periods of quiescence, whose durations are specified by a given …

Infinite excess entropy processes with countable-state generators

NF Travers, JP Crutchfield - Entropy, 2014 - mdpi.com
We present two examples of finite-alphabet, infinite excess entropy processes generated by
stationary hidden Markov models (HMMs) with countable state sets. The first, simpler …

Statistical signatures of structural organization: The case of long memory in renewal processes

SE Marzen, JP Crutchfield - Physics Letters A, 2016 - Elsevier
Identifying and quantifying memory are often critical steps in developing a mechanistic
understanding of stochastic processes. These are particularly challenging and necessary …

Complexity-calibrated benchmarks for machine learning reveal when next-generation reservoir computer predictions succeed and mislead

SE Marzen, PM Riechers, JP Crutchfield - arXiv preprint arXiv:2303.14553, 2023 - arxiv.org
Recurrent neural networks are used to forecast time series in finance, climate, language,
and from many other domains. Reservoir computers are a particularly easily trainable form …

Signatures of infinity: Nonergodicity and resource scaling in prediction, complexity, and learning

JP Crutchfield, S Marzen - Physical Review E, 2015 - APS
We introduce a simple analysis of the structural complexity of infinite-memory processes
built from random samples of stationary, ergodic finite-memory component processes. Such …

Complexity-calibrated benchmarks for machine learning reveal when prediction algorithms succeed and mislead

SE Marzen, PM Riechers, JP Crutchfield - Scientific Reports, 2024 - nature.com
Recurrent neural networks are used to forecast time series in finance, climate, language,
and from many other domains. Reservoir computers are a particularly easily trainable form …

Maximal Repetitions in Written Texts: Finite Energy Hypothesis vs. Strong Hilberg Conjecture

Ł Dębowski - Entropy, 2015 - mdpi.com
The article discusses two mutually-incompatible hypotheses about the stochastic
mechanism of the generation of texts in natural language, which could be related to entropy …

The relaxed Hilberg conjecture: A review and new experimental support

Ł Dębowski - Journal of Quantitative Linguistics, 2015 - Taylor & Francis
The relaxed Hilberg conjecture states that the mutual information between two adjacent
blocks of text in natural language grows as a power of the block length. The present paper …

Approximating information measures for fields

Ł Dębowski - Entropy, 2020 - mdpi.com
We supply corrected proofs of the invariance of completion and the chain rule for the
Shannon information measures of arbitrary fields, as stated by Dębowski in 2009. Our …