Rényi entropy was originally introduced in the field of information theory as a parametric relaxation of Shannon (in physics, Boltzmann–Gibbs) entropy. This has also fuelled different …
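The snippet stops before the definition; for reference, the standard form of the Rényi entropy (with the Boltzmann constant set to one) is

\[
S_\alpha(p) \;=\; \frac{1}{1-\alpha}\,\ln \sum_{i=1}^{n} p_i^{\alpha}, \qquad \alpha > 0,\ \alpha \neq 1,
\]

which recovers the Shannon (Boltzmann–Gibbs) entropy \(S_1 = -\sum_i p_i \ln p_i\) in the limit \(\alpha \to 1\); this one-parameter family containing Shannon entropy as a limiting case is the sense in which it is a "parametric relaxation".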
The concept of entropy connects the number of possible configurations with the number of variables in large stochastic systems. Independent or weakly interacting variables render the …
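The excerpt is cut off, but the standard counting argument behind its first sentence is the following: for \(N\) independent (or weakly interacting) variables with \(w\) accessible states each, the number of configurations multiplies and the Boltzmann entropy becomes additive in \(N\),

\[
S \;=\; k_B \ln W, \qquad W = w^{N} \;\Rightarrow\; S = N\,k_B \ln w .
\]

Strong correlations break this multiplicative counting, which is the usual motivation for non-extensive generalizations.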
J Fuentes, JL López, O Obregón - Physical Review E, 2020 - APS
We derive generalized Fokker-Planck equations (FPEs) based on two nonextensive entropy measures S± that depend exclusively on the probability. These entropies have been …
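The excerpt does not reproduce the equations. As context only, the ordinary (Boltzmann–Gibbs) Fokker–Planck equation that such entropy-based constructions generalize reads

\[
\frac{\partial P(x,t)}{\partial t} \;=\; -\frac{\partial}{\partial x}\bigl[A(x)\,P\bigr] \;+\; \frac{\partial^{2}}{\partial x^{2}}\bigl[D(x)\,P\bigr],
\]

and generalized entropies typically make the diffusion term nonlinear in \(P\) (for Tsallis statistics, for example, a term of the form \(\partial_x^{2} P^{\,2-q}\)); the specific form that follows from the \(S_\pm\) measures is the subject of the cited paper and is not reproduced here.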
TN Bakiev, DV Nakashidze, AM Savchenko… - Moscow University …, 2023 - Springer
The statistical theory based on the two-parameter Sharma–Mittal functional is a generalization of Gibbs, Rényi, and Tsallis statistics. This study focuses on the formalism of …
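Parameter conventions vary between papers, but a commonly used form of the Sharma–Mittal functional (with \(k_B = 1\)) and its limiting cases is

\[
S_{q,r}(p) \;=\; \frac{1}{1-r}\left[\Bigl(\sum_i p_i^{\,q}\Bigr)^{\frac{1-r}{1-q}} - 1\right],
\qquad
\begin{aligned}
r \to 1 &: \ \text{R\'enyi},\\
r \to q &: \ \text{Tsallis},\\
q, r \to 1 &: \ \text{Gibbs--Shannon},
\end{aligned}
\]

which is how the two-parameter family contains the three statistics mentioned in the abstract.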
J Fuentes, O Obregón - International Journal of Information …, 2022 - inderscienceonline.com
As an application of non-extensive statistical mechanics, a possible path to a generalised information theory is discussed by introducing a family of non-extensive entropies …
It is discussed how the superstatistical formulation of effective Boltzmann factors can be related to the concept of Kolmogorov complexity, generating an infinite set of complexity …
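For context, the superstatistical effective Boltzmann factor referred to here follows the Beck–Cohen construction: the inverse temperature \(\beta\) fluctuates according to some distribution \(f(\beta)\), and the effective factor is its Laplace transform,

\[
B(E) \;=\; \int_0^{\infty} f(\beta)\, e^{-\beta E}\, d\beta ,
\]

so different choices of \(f(\beta)\) yield different generalized statistics (a \(\chi^2\) distribution, for instance, gives Tsallis-type power laws). How this relates to Kolmogorov complexity is the subject of the cited work and is not reproduced here.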
As an application of generalised statistical mechanics, a possible route toward a consistent generalised information theory is studied in terms of a family of non-extensive, non …
The generalized theory of fluctuations corresponding to probability-dependent generalized entropies in the canonical and grand canonical ensembles is formulated. The entropies …
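As a baseline for what the generalized theory modifies, the standard Gibbs-statistics fluctuation relations in the canonical and grand canonical ensembles are

\[
\langle (\Delta E)^2 \rangle \;=\; -\frac{\partial \langle E \rangle}{\partial \beta} \;=\; k_B T^2 C_V,
\qquad
\langle (\Delta N)^2 \rangle \;=\; k_B T \,\frac{\partial \langle N \rangle}{\partial \mu},
\]

and the probability-dependent entropies lead to corrections to these expressions; the corrected forms are given in the cited work.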
TN Bakiev, DV Nakashidze, AM Savchenko… - … Series 3. Physics …, 2023 - cyberleninka.ru
[Translated from Russian] The statistical theory built on the two-parameter Sharma–Mittal functional is a generalization of the Gibbs, Rényi, and Tsallis statistics. In …