James Crutchfield, Sarah Marzen

Paper #: 14-08-032

Renewal processes are broadly used to model stochastic behavior consisting of isolated events separated by periods of quiescence, whose durations are specified by a given probability law. Here, we identify the minimal sufficient statistic for their prediction (the set of causal states), calculate the historical memory capacity required to store those states (statistical complexity), delineate what information is predictable (excess entropy), and decompose the entropy of a single measurement into that shared with the past, future, or both. The causal state equivalence relation defines a new subclass of renewal processes with a finite number of causal states despite having an unbounded interevent count distribution. We apply our new formulae for information measures to analyze the output of the parametrized simple nonunifilar source, a simple two-state machine with an infinite-state ε-machine presentation. All in all, the results lay the groundwork for analyzing processes with divergent statistical complexity and divergent excess entropy.
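As a minimal illustration of the renewal-process setting (a sketch, not the paper's formulae), the following Python snippet estimates the entropy rate of a discrete-time renewal process from its interevent count distribution F, using the standard relation that the entropy rate equals the entropy of F divided by its mean. The geometric interevent distribution used for the check is an assumption chosen because it corresponds to the memoryless (Bernoulli) case, where the answer is known in closed form.

```python
import math

def renewal_entropy_rate(F, max_n=10000):
    """Estimate the entropy rate (bits/step) of a discrete-time renewal
    process whose interevent count n >= 1 has probability F(n).

    Uses h = H[F] / E[T]: interevent entropy over mean interevent time.
    The sum is truncated at max_n, so F must decay fast enough.
    """
    probs = [F(n) for n in range(1, max_n + 1)]
    mean = sum(n * q for n, q in enumerate(probs, start=1))
    H = -sum(q * math.log2(q) for q in probs if q > 0.0)
    return H / mean

# Sanity check: a geometric interevent distribution (event each step
# with probability p) is the memoryless case, so the entropy rate
# should match the binary entropy H(p) of a Bernoulli(p) process.
p = 0.3
geom = lambda n: (1 - p) ** (n - 1) * p
h = renewal_entropy_rate(geom)
h_bernoulli = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
print(abs(h - h_bernoulli) < 1e-9)  # close agreement expected
```

For non-geometric interevent distributions the process retains memory of the time since the last event, which is exactly the structure the causal-state analysis above quantifies.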
