Rényi entropy based on the characteristic function has been used as a measure of the information contained in wide-sense and real stationary vector autoregressive and moving average …
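For orientation only (this is the standard density-based definition, not necessarily the characteristic-function variant used in the cited work): the differential Rényi entropy of order \alpha \neq 1 of a random vector X with density f_X is

\[ H_\alpha(X) = \frac{1}{1-\alpha} \log \int_{\mathbb{R}^n} f_X(x)^{\alpha}\, dx , \]

which recovers the Shannon differential entropy as \alpha \to 1. For a Gaussian vector X \sim N(\mu, \Sigma) in n dimensions, the relevant case for Gaussian VARMA processes, it has the closed form

\[ H_\alpha(X) = \frac{1}{2}\log\big((2\pi)^{n}\det\Sigma\big) + \frac{n}{2}\,\frac{\log\alpha}{\alpha-1} . \]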
JE Contreras-Reyes, F Jeldes-Delgado… - Physica A: Statistical …, 2024 - Elsevier
Variance plays an important role in statistics and information theory, forming the basis for many well-known information measures. Based on Jensen's inequality and …
E Ugarte, PD Hastings - Development and Psychopathology, 2023 - cambridge.org
There has been significant interest and progress in understanding the role of caregiver unpredictability in brain maturation, cognitive and socioemotional development, and …
The Jensen-variance (JV) information, based on Jensen's inequality and variance, has been previously proposed to measure the distance between two random variables. Based on the …
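As a sketch of the kind of quantity involved, assuming a Jensen-Shannon-style construction with variance in place of Shannon entropy (the exact definition in the cited work may differ): for a mixing weight \pi \in (0,1) and random variables X_1 \sim F_1, X_2 \sim F_2 with means \mu_1, \mu_2, one can take

\[ \mathrm{JV}_{\pi}(X_1, X_2) = \operatorname{Var}_{\pi F_1 + (1-\pi) F_2}(X) - \big[\pi \operatorname{Var}(X_1) + (1-\pi)\operatorname{Var}(X_2)\big] , \]

where the first variance is computed under the mixture distribution. Jensen's inequality applied to the squared mixture mean makes this nonnegative, and the standard mixture-variance decomposition gives the closed form

\[ \mathrm{JV}_{\pi}(X_1, X_2) = \pi(1-\pi)\,(\mu_1 - \mu_2)^{2} , \]

i.e., under this construction the measure reduces to a weighted squared distance between the means.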
R Tamir - IEEE Transactions on Information Theory, 2022 - ieeexplore.ieee.org
This work contains two single-letter upper bounds on the entropy rate of an integer-valued stationary stochastic process, which depend only on second-order statistics and are …
Also available as an arXiv preprint: R Tamir - arXiv:2203.05237, 2022 - arxiv.org (near-identical abstract).
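To make the flavor of such a bound concrete, here is a minimal Python sketch of one standard single-letter bound that uses only the variance: for an integer-valued X, H(X) = h(X + U) with U ~ Unif(-1/2, 1/2) independent, and the Gaussian maximum-entropy inequality then gives H(X) <= (1/2) log2(2*pi*e*(Var(X) + 1/12)); by stationarity the entropy rate is at most H(X_1). This is not necessarily one of the bounds derived in the cited works, which may be tighter (for example, by using the full autocovariance), and the function name and test process below are purely illustrative.

import numpy as np

def entropy_rate_upper_bound_bits(samples):
    # Single-letter bound in bits/symbol: 0.5 * log2(2*pi*e*(Var(X) + 1/12)).
    # Valid for any integer-valued stationary process, via dithering with
    # Unif(-1/2, 1/2) noise and the Gaussian maximum-entropy inequality.
    var = np.var(np.asarray(samples, dtype=float))
    return 0.5 * np.log2(2.0 * np.pi * np.e * (var + 1.0 / 12.0))

# Illustrative integer-valued process: a rounded Gaussian AR(1) path.
rng = np.random.default_rng(0)
z = np.zeros(10000)
for t in range(1, z.size):
    z[t] = 0.8 * z[t - 1] + rng.normal()
x = np.round(z).astype(int)
print(entropy_rate_upper_bound_bits(x))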